A novel concatenated code based on the improved SCG-LDPC code for optical transmission systems
NASA Astrophysics Data System (ADS)
Yuan, Jian-guo; Xie, Ya; Wang, Lin; Huang, Sheng; Wang, Yong
2013-01-01
Based on the optimization and improvement for the construction method of systematically constructed Gallager (SCG) (4, k) code, a novel SCG low density parity check (SCG-LDPC)(3969, 3720) code to be suitable for optical transmission systems is constructed. The novel SCG-LDPC (6561,6240) code with code rate of 95.1% is constructed by increasing the length of SCG-LDPC (3969,3720) code, and in a way, the code rate of LDPC codes can better meet the high requirements of optical transmission systems. And then the novel concatenated code is constructed by concatenating SCG-LDPC(6561,6240) code and BCH(127,120) code with code rate of 94.5%. The simulation results and analyses show that the net coding gain (NCG) of BCH(127,120)+SCG-LDPC(6561,6240) concatenated code is respectively 2.28 dB and 0.48 dB more than those of the classic RS(255,239) code and SCG-LDPC(6561,6240) code at the bit error rate (BER) of 10-7.
NASA Astrophysics Data System (ADS)
Yuan, Jian-guo; Zhou, Guang-xiang; Gao, Wen-chun; Wang, Yong; Lin, Jin-zhao; Pang, Yu
2016-01-01
According to the requirements of the increasing development for optical transmission systems, a novel construction method of quasi-cyclic low-density parity-check (QC-LDPC) codes based on the subgroup of the finite field multiplicative group is proposed. Furthermore, this construction method can effectively avoid the girth-4 phenomena and has the advantages such as simpler construction, easier implementation, lower encoding/decoding complexity, better girth properties and more flexible adjustment for the code length and code rate. The simulation results show that the error correction performance of the QC-LDPC(3 780,3 540) code with the code rate of 93.7% constructed by this proposed method is excellent, its net coding gain is respectively 0.3 dB, 0.55 dB, 1.4 dB and 1.98 dB higher than those of the QC-LDPC(5 334,4 962) code constructed by the method based on the inverse element characteristics in the finite field multiplicative group, the SCG-LDPC(3 969,3 720) code constructed by the systematically constructed Gallager (SCG) random construction method, the LDPC(32 640,30 592) code in ITU-T G.975.1 and the classic RS(255,239) code which is widely used in optical transmission systems in ITU-T G.975 at the bit error rate ( BER) of 10-7. Therefore, the constructed QC-LDPC(3 780,3 540) code is more suitable for optical transmission systems.
The Modified Cognitive Constructions Coding System: Reliability and Validity Assessments
ERIC Educational Resources Information Center
Moran, Galia S.; Diamond, Gary M.
2006-01-01
The cognitive constructions coding system (CCCS) was designed for coding client's expressed problem constructions on four dimensions: intrapersonal-interpersonal, internal-external, responsible-not responsible, and linear-circular. This study introduces, and examines the reliability and validity of, a modified version of the CCCS--a version that…
NASA Astrophysics Data System (ADS)
Yuan, Jian-guo; Liang, Meng-qi; Wang, Yong; Lin, Jin-zhao; Pang, Yu
2016-03-01
A novel lower-complexity construction scheme of quasi-cyclic low-density parity-check (QC-LDPC) codes for optical transmission systems is proposed based on the structure of the parity-check matrix for the Richardson-Urbanke (RU) algorithm. Furthermore, a novel irregular QC-LDPC(4 288, 4 020) code with high code-rate of 0.937 is constructed by this novel construction scheme. The simulation analyses show that the net coding gain ( NCG) of the novel irregular QC-LDPC(4 288,4 020) code is respectively 2.08 dB, 1.25 dB and 0.29 dB more than those of the classic RS(255, 239) code, the LDPC(32 640, 30 592) code and the irregular QC-LDPC(3 843, 3 603) code at the bit error rate ( BER) of 10-6. The irregular QC-LDPC(4 288, 4 020) code has the lower encoding/decoding complexity compared with the LDPC(32 640, 30 592) code and the irregular QC-LDPC(3 843, 3 603) code. The proposed novel QC-LDPC(4 288, 4 020) code can be more suitable for the increasing development requirements of high-speed optical transmission systems.
Construction of a new regular LDPC code for optical transmission systems
NASA Astrophysics Data System (ADS)
Yuan, Jian-guo; Tong, Qing-zhen; Xu, Liang; Huang, Sheng
2013-05-01
A novel construction method of the check matrix for the regular low density parity check (LDPC) code is proposed. The novel regular systematically constructed Gallager (SCG)-LDPC(3969,3720) code with the code rate of 93.7% and the redundancy of 6.69% is constructed. The simulation results show that the net coding gain (NCG) and the distance from the Shannon limit of the novel SCG-LDPC(3969,3720) code can respectively be improved by about 1.93 dB and 0.98 dB at the bit error rate (BER) of 10-8, compared with those of the classic RS(255,239) code in ITU-T G.975 recommendation and the LDPC(32640,30592) code in ITU-T G.975.1 recommendation with the same code rate of 93.7% and the same redundancy of 6.69%. Therefore, the proposed novel regular SCG-LDPC(3969,3720) code has excellent performance, and is more suitable for high-speed long-haul optical transmission systems.
A novel construction method of QC-LDPC codes based on CRT for optical communications
NASA Astrophysics Data System (ADS)
Yuan, Jian-guo; Liang, Meng-qi; Wang, Yong; Lin, Jin-zhao; Pang, Yu
2016-05-01
A novel construction method of quasi-cyclic low-density parity-check (QC-LDPC) codes is proposed based on Chinese remainder theory (CRT). The method can not only increase the code length without reducing the girth, but also greatly enhance the code rate, so it is easy to construct a high-rate code. The simulation results show that at the bit error rate ( BER) of 10-7, the net coding gain ( NCG) of the regular QC-LDPC(4 851, 4 546) code is respectively 2.06 dB, 1.36 dB, 0.53 dB and 0.31 dB more than those of the classic RS(255, 239) code in ITU-T G.975, the LDPC(32 640, 30 592) code in ITU-T G.975.1, the QC-LDPC(3 664, 3 436) code constructed by the improved combining construction method based on CRT and the irregular QC-LDPC(3 843, 3 603) code constructed by the construction method based on the Galois field ( GF( q)) multiplicative group. Furthermore, all these five codes have the same code rate of 0.937. Therefore, the regular QC-LDPC(4 851, 4 546) code constructed by the proposed construction method has excellent error-correction performance, and can be more suitable for optical transmission systems.
NASA Technical Reports Server (NTRS)
Rajpal, Sandeep; Rhee, DoJun; Lin, Shu
1997-01-01
In this paper, we will use the construction technique proposed in to construct multidimensional trellis coded modulation (TCM) codes for both the additive white Gaussian noise (AWGN) and the fading channels. Analytical performance bounds and simulation results show that these codes perform very well and achieve significant coding gains over uncoded reference modulation systems. In addition, the proposed technique can be used to construct codes which have a performance/decoding complexity advantage over the codes listed in literature.
Forecasting of construction and demolition waste in Brazil.
Paz, Diogo Hf; Lafayette, Kalinny Pv
2016-08-01
The objective of this article is to develop a computerised tool (software) that facilitates the analysis of strategies for waste management on construction sites through the use of indicators of construction and demolition waste generation. The development involved the following steps: knowledge acquisition, structuring the system, coding and system evaluation. The step of knowledge acquisition aims to provide subsidies for the representation of them through models. In the step of structuring the system, it was presented the structuring and formalisation of knowledge for the development of the system, and has two stages: the construction of the conceptual model and the subsequent instantiation of the model. The coding system aims to implement (code) the conceptual model developed in a model played by computer (digital). The results showed that the system is very useful and applicable in construction sites, helping to improve the quality of waste management, and creating a database that will support new research. © The Author(s) 2016.
A novel QC-LDPC code based on the finite field multiplicative group for optical communications
NASA Astrophysics Data System (ADS)
Yuan, Jian-guo; Xu, Liang; Tong, Qing-zhen
2013-09-01
A novel construction method of quasi-cyclic low-density parity-check (QC-LDPC) code is proposed based on the finite field multiplicative group, which has easier construction, more flexible code-length code-rate adjustment and lower encoding/decoding complexity. Moreover, a regular QC-LDPC(5334,4962) code is constructed. The simulation results show that the constructed QC-LDPC(5334,4962) code can gain better error correction performance under the condition of the additive white Gaussian noise (AWGN) channel with iterative decoding sum-product algorithm (SPA). At the bit error rate (BER) of 10-6, the net coding gain (NCG) of the constructed QC-LDPC(5334,4962) code is 1.8 dB, 0.9 dB and 0.2 dB more than that of the classic RS(255,239) code in ITU-T G.975, the LDPC(32640,30592) code in ITU-T G.975.1 and the SCG-LDPC(3969,3720) code constructed by the random method, respectively. So it is more suitable for optical communication systems.
ERIC Educational Resources Information Center
Yamamoto, Kentaro; He, Qiwei; Shin, Hyo Jeong; von Davier, Mattias
2017-01-01
Approximately a third of the Programme for International Student Assessment (PISA) items in the core domains (math, reading, and science) are constructed-response items and require human coding (scoring). This process is time-consuming, expensive, and prone to error as often (a) humans code inconsistently, and (b) coding reliability in…
Constructions for finite-state codes
NASA Technical Reports Server (NTRS)
Pollara, F.; Mceliece, R. J.; Abdel-Ghaffar, K.
1987-01-01
A class of codes called finite-state (FS) codes is defined and investigated. These codes, which generalize both block and convolutional codes, are defined by their encoders, which are finite-state machines with parallel inputs and outputs. A family of upper bounds on the free distance of a given FS code is derived from known upper bounds on the minimum distance of block codes. A general construction for FS codes is then given, based on the idea of partitioning a given linear block into cosets of one of its subcodes, and it is shown that in many cases the FS codes constructed in this way have a d sub free which is as large as possible. These codes are found without the need for lengthy computer searches, and have potential applications for future deep-space coding systems. The issue of catastropic error propagation (CEP) for FS codes is also investigated.
New double-byte error-correcting codes for memory systems
NASA Technical Reports Server (NTRS)
Feng, Gui-Liang; Wu, Xinen; Rao, T. R. N.
1996-01-01
Error-correcting or error-detecting codes have been used in the computer industry to increase reliability, reduce service costs, and maintain data integrity. The single-byte error-correcting and double-byte error-detecting (SbEC-DbED) codes have been successfully used in computer memory subsystems. There are many methods to construct double-byte error-correcting (DBEC) codes. In the present paper we construct a class of double-byte error-correcting codes, which are more efficient than those known to be optimum, and a decoding procedure for our codes is also considered.
Kim, Myoung Soo
2012-08-01
The purpose of this cross-sectional study was to examine current status of IT-based medication error prevention system construction and the relationships among system construction, medication error management climate and perception for system use. The participants were 124 patient safety chief managers working for 124 hospitals with over 300 beds in Korea. The characteristics of the participants, construction status and perception of systems (electric pharmacopoeia, electric drug dosage calculation system, computer-based patient safety reporting and bar-code system) and medication error management climate were measured in this study. The data were collected between June and August 2011. Descriptive statistics, partial Pearson correlation and MANCOVA were used for data analysis. Electric pharmacopoeia were constructed in 67.7% of participating hospitals, computer-based patient safety reporting systems were constructed in 50.8%, electric drug dosage calculation systems were in use in 32.3%. Bar-code systems showed up the lowest construction rate at 16.1% of Korean hospitals. Higher rates of construction of IT-based medication error prevention systems resulted in greater safety and a more positive error management climate prevailed. The supportive strategies for improving perception for use of IT-based systems would add to system construction, and positive error management climate would be more easily promoted.
Design of ACM system based on non-greedy punctured LDPC codes
NASA Astrophysics Data System (ADS)
Lu, Zijun; Jiang, Zihong; Zhou, Lin; He, Yucheng
2017-08-01
In this paper, an adaptive coded modulation (ACM) scheme based on rate-compatible LDPC (RC-LDPC) codes was designed. The RC-LDPC codes were constructed by a non-greedy puncturing method which showed good performance in high code rate region. Moreover, the incremental redundancy scheme of LDPC-based ACM system over AWGN channel was proposed. By this scheme, code rates vary from 2/3 to 5/6 and the complication of the ACM system is lowered. Simulations show that more and more obvious coding gain can be obtained by the proposed ACM system with higher throughput.
Van Laere, Sven; Nyssen, Marc; Verbeke, Frank
2017-01-01
Clinical coding is a requirement to provide valuable data for billing, epidemiology and health care resource allocation. In sub-Saharan Africa, we observe a growing awareness of the need for coding of clinical data, not only in health insurances, but also in governments and the hospitals. Presently, coding systems in sub-Saharan Africa are often used for billing purposes. In this paper we consider the use of a nomenclature to also have a clinical impact. Often coding systems are assumed to be complex and too extensive to be used in daily practice. Here, we present a method for constructing a new nomenclature based on existing coding systems by considering a minimal subset in the sub-Saharan region. Evaluation of completeness will be done nationally using the requirements of national registries. The nomenclature requires an extension character for dealing with codes that have to be used for multiple registries. Hospitals will benefit most by using this extension character.
NASA Astrophysics Data System (ADS)
Huang, Sheng; Ao, Xiang; Li, Yuan-yuan; Zhang, Rui
2016-09-01
In order to meet the needs of high-speed development of optical communication system, a construction method of quasi-cyclic low-density parity-check (QC-LDPC) codes based on multiplicative group of finite field is proposed. The Tanner graph of parity check matrix of the code constructed by this method has no cycle of length 4, and it can make sure that the obtained code can get a good distance property. Simulation results show that when the bit error rate ( BER) is 10-6, in the same simulation environment, the net coding gain ( NCG) of the proposed QC-LDPC(3 780, 3 540) code with the code rate of 93.7% in this paper is improved by 2.18 dB and 1.6 dB respectively compared with those of the RS(255, 239) code in ITU-T G.975 and the LDPC(3 2640, 3 0592) code in ITU-T G.975.1. In addition, the NCG of the proposed QC-LDPC(3 780, 3 540) code is respectively 0.2 dB and 0.4 dB higher compared with those of the SG-QC-LDPC(3 780, 3 540) code based on the two different subgroups in finite field and the AS-QC-LDPC(3 780, 3 540) code based on the two arbitrary sets of a finite field. Thus, the proposed QC-LDPC(3 780, 3 540) code in this paper can be well applied in optical communication systems.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
This standard provides rules for the construction of Class 1 nuclear components, parts, and appurtenances for use at elevated temperatures. This standard is a complete set of requirements only when used in conjunction with Section III of the ASME Boiler and Pressure Vessel Code (ASME Code) and addenda, ASME Code Cases 1592, 1593, 1594, 1595, and 1596, and RDT E 15-2NB. Unmodified paragraphs of the referenced Code Cases are not repeated in this standard but are a part of the requirements of this standard.
A HO-IRT Based Diagnostic Assessment System with Constructed Response Items
ERIC Educational Resources Information Center
Yang, Chih-Wei; Kuo, Bor-Chen; Liao, Chen-Huei
2011-01-01
The aim of the present study was to develop an on-line assessment system with constructed response items in the context of elementary mathematics curriculum. The system recorded the problem solving process of constructed response items and transfered the process to response codes for further analyses. An inference mechanism based on artificial…
Users manual for coordinate generation code CRDSRA
NASA Technical Reports Server (NTRS)
Shamroth, S. J.
1985-01-01
Generation of a viable coordinate system represents an important component of an isolated airfoil Navier-Stokes calculation. The manual describes a computer code for generation of such a coordinate system. The coordinate system is a general nonorthogonal one in which high resolution normal to the airfoil is obtained in the vicinity of the airfoil surface, and high resolution along the airfoil surface is obtained in the vicinity of the airfoil leading edge. The method of generation is a constructive technique which leads to a C type coordinate grid. The method of construction as well as input and output definitions are contained herein. The computer code itself as well as a sample output is being submitted to COSMIC.
The design of the CMOS wireless bar code scanner applying optical system based on ZigBee
NASA Astrophysics Data System (ADS)
Chen, Yuelin; Peng, Jian
2008-03-01
The traditional bar code scanner is influenced by the length of data line, but the farthest distance of the wireless bar code scanner of wireless communication is generally between 30m and 100m on the market. By rebuilding the traditional CCD optical bar code scanner, a CMOS code scanner is designed based on the ZigBee to meet the demands of market. The scan system consists of the CMOS image sensor and embedded chip S3C2401X, when the two dimensional bar code is read, the results show the inaccurate and wrong code bar, resulted from image defile, disturber, reads image condition badness, signal interference, unstable system voltage. So we put forward the method which uses the matrix evaluation and Read-Solomon arithmetic to solve them. In order to construct the whole wireless optics of bar code system and to ensure its ability of transmitting bar code image signals digitally with long distances, ZigBee is used to transmit data to the base station, and this module is designed based on image acquisition system, and at last the wireless transmitting/receiving CC2430 module circuit linking chart is established. And by transplanting the embedded RTOS system LINUX to the MCU, an applying wireless CMOS optics bar code scanner and multi-task system is constructed. Finally, performance of communication is tested by evaluation software Smart RF. In broad space, every ZIGBEE node can realize 50m transmission with high reliability. When adding more ZigBee nodes, the transmission distance can be several thousands of meters long.
New quantum codes derived from a family of antiprimitive BCH codes
NASA Astrophysics Data System (ADS)
Liu, Yang; Li, Ruihu; Lü, Liangdong; Guo, Luobin
The Bose-Chaudhuri-Hocquenghem (BCH) codes have been studied for more than 57 years and have found wide application in classical communication system and quantum information theory. In this paper, we study the construction of quantum codes from a family of q2-ary BCH codes with length n=q2m+1 (also called antiprimitive BCH codes in the literature), where q≥4 is a power of 2 and m≥2. By a detailed analysis of some useful properties about q2-ary cyclotomic cosets modulo n, Hermitian dual-containing conditions for a family of non-narrow-sense antiprimitive BCH codes are presented, which are similar to those of q2-ary primitive BCH codes. Consequently, via Hermitian Construction, a family of new quantum codes can be derived from these dual-containing BCH codes. Some of these new antiprimitive quantum BCH codes are comparable with those derived from primitive BCH codes.
Generalized type II hybrid ARQ scheme using punctured convolutional coding
NASA Astrophysics Data System (ADS)
Kallel, Samir; Haccoun, David
1990-11-01
A method is presented to construct rate-compatible convolutional (RCC) codes from known high-rate punctured convolutional codes, obtained from best-rate 1/2 codes. The construction method is rather simple and straightforward, and still yields good codes. Moreover, low-rate codes can be obtained without any limit on the lowest achievable code rate. Based on the RCC codes, a generalized type-II hybrid ARQ scheme, which combines the benefits of the modified type-II hybrid ARQ strategy of Hagenauer (1988) with the code-combining ARQ strategy of Chase (1985), is proposed and analyzed. With the proposed generalized type-II hybrid ARQ strategy, the throughput increases as the starting coding rate increases, and as the channel degrades, it tends to merge with the throughput of rate 1/2 type-II hybrid ARQ schemes with code combining, thus allowing the system to be flexible and adaptive to channel conditions, even under wide noise variations and severe degradations.
Improving performance of DS-CDMA systems using chaotic complex Bernoulli spreading codes
NASA Astrophysics Data System (ADS)
Farzan Sabahi, Mohammad; Dehghanfard, Ali
2014-12-01
The most important goal of spreading spectrum communication system is to protect communication signals against interference and exploitation of information by unintended listeners. In fact, low probability of detection and low probability of intercept are two important parameters to increase the performance of the system. In Direct Sequence Code Division Multiple Access (DS-CDMA) systems, these properties are achieved by multiplying the data information in spreading sequences. Chaotic sequences, with their particular properties, have numerous applications in constructing spreading codes. Using one-dimensional Bernoulli chaotic sequence as spreading code is proposed in literature previously. The main feature of this sequence is its negative auto-correlation at lag of 1, which with proper design, leads to increase in efficiency of the communication system based on these codes. On the other hand, employing the complex chaotic sequences as spreading sequence also has been discussed in several papers. In this paper, use of two-dimensional Bernoulli chaotic sequences is proposed as spreading codes. The performance of a multi-user synchronous and asynchronous DS-CDMA system will be evaluated by applying these sequences under Additive White Gaussian Noise (AWGN) and fading channel. Simulation results indicate improvement of the performance in comparison with conventional spreading codes like Gold codes as well as similar complex chaotic spreading sequences. Similar to one-dimensional Bernoulli chaotic sequences, the proposed sequences also have negative auto-correlation. Besides, construction of complex sequences with lower average cross-correlation is possible with the proposed method.
NASA Astrophysics Data System (ADS)
Sikder, Somali; Ghosh, Shila
2018-02-01
This paper presents the construction of unipolar transposed modified Walsh code (TMWC) and analysis of its performance in optical code-division multiple-access (OCDMA) systems. Specifically, the signal-to-noise ratio, bit error rate (BER), cardinality, and spectral efficiency were investigated. The theoretical analysis demonstrated that the wavelength-hopping time-spreading system using TMWC was robust against multiple-access interference and more spectrally efficient than systems using other existing OCDMA codes. In particular, the spectral efficiency was calculated to be 1.0370 when TMWC of weight 3 was employed. The BER and eye pattern for the designed TMWC were also successfully obtained using OptiSystem simulation software. The results indicate that the proposed code design is promising for enhancing network capacity.
Wood construction codes issues in the United States
Douglas R. Rammer
2006-01-01
The current wood construction codes find their origin in the 1935 Wood Handbook: Wood as an Engineering Material published by the USDA Forest Service. Many of the current design recommendations can be traced back to statements from this book. Since this time a series of development both historical and recent has led to a multi-layered system for use of wood products in...
Rate-Compatible LDPC Codes with Linear Minimum Distance
NASA Technical Reports Server (NTRS)
Divsalar, Dariush; Jones, Christopher; Dolinar, Samuel
2009-01-01
A recently developed method of constructing protograph-based low-density parity-check (LDPC) codes provides for low iterative decoding thresholds and minimum distances proportional to block sizes, and can be used for various code rates. A code constructed by this method can have either fixed input block size or fixed output block size and, in either case, provides rate compatibility. The method comprises two submethods: one for fixed input block size and one for fixed output block size. The first mentioned submethod is useful for applications in which there are requirements for rate-compatible codes that have fixed input block sizes. These are codes in which only the numbers of parity bits are allowed to vary. The fixed-output-blocksize submethod is useful for applications in which framing constraints are imposed on the physical layers of affected communication systems. An example of such a system is one that conforms to one of many new wireless-communication standards that involve the use of orthogonal frequency-division modulation
Clinical code set engineering for reusing EHR data for research: A review.
Williams, Richard; Kontopantelis, Evangelos; Buchan, Iain; Peek, Niels
2017-06-01
The construction of reliable, reusable clinical code sets is essential when re-using Electronic Health Record (EHR) data for research. Yet code set definitions are rarely transparent and their sharing is almost non-existent. There is a lack of methodological standards for the management (construction, sharing, revision and reuse) of clinical code sets which needs to be addressed to ensure the reliability and credibility of studies which use code sets. To review methodological literature on the management of sets of clinical codes used in research on clinical databases and to provide a list of best practice recommendations for future studies and software tools. We performed an exhaustive search for methodological papers about clinical code set engineering for re-using EHR data in research. This was supplemented with papers identified by snowball sampling. In addition, a list of e-phenotyping systems was constructed by merging references from several systematic reviews on this topic, and the processes adopted by those systems for code set management was reviewed. Thirty methodological papers were reviewed. Common approaches included: creating an initial list of synonyms for the condition of interest (n=20); making use of the hierarchical nature of coding terminologies during searching (n=23); reviewing sets with clinician input (n=20); and reusing and updating an existing code set (n=20). Several open source software tools (n=3) were discovered. There is a need for software tools that enable users to easily and quickly create, revise, extend, review and share code sets and we provide a list of recommendations for their design and implementation. Research re-using EHR data could be improved through the further development, more widespread use and routine reporting of the methods by which clinical codes were selected. Copyright © 2017 The Author(s). Published by Elsevier Inc. All rights reserved.
Renovating and Reconstructing in Phases--Specifying Phased Construction.
ERIC Educational Resources Information Center
Bunzick, John
2002-01-01
Discusses planning for phased school construction projects, including effects on occupancy (for example, construction adjacent to occupied space, construction procedure safety zones near occupied areas, and code-complying means of egress), effects on building systems (such as heating and cooling equipment and power distribution), and contract…
NASA Astrophysics Data System (ADS)
Modegi, Toshio
Using our previously developed audio to MIDI code converter tool “Auto-F”, from given vocal acoustic signals we can create MIDI data, which enable to playback the voice-like signals with a standard MIDI synthesizer. Applying this tool, we are constructing a MIDI database, which consists of previously converted simple harmonic structured MIDI codes from a set of 71 Japanese male and female syllable recorded signals. And we are developing a novel voice synthesizing system based on harmonically synthesizing musical sounds, which can generate MIDI data and playback voice signals with a MIDI synthesizer by giving Japanese plain (kana) texts, referring to the syllable MIDI code database. In this paper, we propose an improved MIDI converter tool, which can produce temporally higher-resolution MIDI codes. Then we propose an algorithm separating a set of 20 consonant and vowel phoneme MIDI codes from 71 syllable MIDI converted codes in order to construct a voice synthesizing system. And, we present the evaluation results of voice synthesizing quality between these separated phoneme MIDI codes and their original syllable MIDI codes by our developed 4-syllable word listening tests.
Applications of Coding in Network Communications
ERIC Educational Resources Information Center
Chang, Christopher SungWook
2012-01-01
This thesis uses the tool of network coding to investigate fast peer-to-peer file distribution, anonymous communication, robust network construction under uncertainty, and prioritized transmission. In a peer-to-peer file distribution system, we use a linear optimization approach to show that the network coding framework significantly simplifies…
The Code Noir: Construction of Slavery in French Colonial Louisiana.
ERIC Educational Resources Information Center
Arlyck, Kevin
2003-01-01
Presents a lesson focusing on the history of slavery. Compares two systems of slavery in North America to teach students about slavery within and outside of the United States. States that the lesson uses the "Code Noir" to help students understand the similarities and differences between the systems. (CMK)
Computer algorithm for coding gain
NASA Technical Reports Server (NTRS)
Dodd, E. E.
1974-01-01
Development of a computer algorithm for coding gain for use in an automated communications link design system. Using an empirical formula which defines coding gain as used in space communications engineering, an algorithm is constructed on the basis of available performance data for nonsystematic convolutional encoding with soft-decision (eight-level) Viterbi decoding.
Extended Plate and Beam Wall System
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gunderson, Patti
Home Innovation Research Labs studied the extended plate and beam wall (EP&B) system during a two-year period from mid-2015 to mid-2017 to determine the wall’s structural performance, moisture durability, constructability, and costeffectiveness for use as a high-R enclosure system for energy code minimum and above-code performance in climate zones 4–8.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rosenberg, Michael; Jonlin, Duane; Nadel, Steven
Today’s building energy codes focus on prescriptive requirements for features of buildings that are directly controlled by the design and construction teams and verifiable by municipal inspectors. Although these code requirements have had a significant impact, they fail to influence a large slice of the building energy use pie – including not only miscellaneous plug loads, cooking equipment and commercial/industrial processes, but the maintenance and optimization of the code-mandated systems as well. Currently, code compliance is verified only through the end of construction, and there are no limits or consequences for the actual energy use in an occupied building. In the future, our suite of energy regulations will likely expand to include building efficiency, energy use or carbon emission budgets over their full life cycle. Intelligent building systems, extensive renewable energy, and a transition from fossil fuel to electric heating systems will likely be required to meet ultra-low-energy targets. This paper lays out the authors’ perspectives on how buildings may evolve over the course of the 21st century and the roles that codes and regulations will play in shaping those buildings of the future.
78 FR 69286 - Electric System Construction Policies and Procedures
Federal Register 2010, 2011, 2012, 2013, 2014
2013-11-19
... DEPARTMENT OF AGRICULTURE Rural Utilities Service 7 CFR Part 1726 Electric System Construction Policies and Procedures CFR Correction In Title 7 of the Code of Federal Regulations, Parts 1600 to 1759, revised as of January 1, 2013, on page 246, in Sec. 1726.14, the second definition of Minor modification...
2013-09-30
fire sprinkler system during the initial construction of the RSOI facilities. The construction contract to build the RSOI...International Building Code. Compliant manual and automatic fire alarm and notification systems, portable fire extinguishers, fire sprinkler systems...automatic fire sprinkler system that was not operational, a fire department connection that was obstructed, and a fire detection system
A family of chaotic pure analog coding schemes based on baker's map function
NASA Astrophysics Data System (ADS)
Liu, Yang; Li, Jing; Lu, Xuanxuan; Yuen, Chau; Wu, Jun
2015-12-01
This paper considers a family of pure analog coding schemes constructed from dynamic systems governed by chaotic functions: baker's map function and its variants. Various decoding methods, including maximum likelihood (ML), minimum mean square error (MMSE), and mixed ML-MMSE decoding algorithms, have been developed for these novel encoding schemes. The proposed mirrored baker's and single-input baker's analog codes offer balanced protection against fold error (large distortion) and weak distortion, and outperform the classical chaotic analog coding and analog joint source-channel coding schemes in the literature. Compared to a conventional digital communication system, where quantization and digital error correction codes are used, the proposed analog coding system has graceful performance evolution, low decoding latency, and no quantization noise. Numerical results show that, under the same bandwidth expansion, the proposed analog system outperforms the digital ones over a wide signal-to-noise ratio (SNR) range.
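As a toy illustration of the chaotic-map idea in the record above, the sketch below iterates the 1-D Bernoulli-shift component of the baker's map to expand one analog source sample into several channel samples, then decodes by brute-force ML search. The paper's mirrored and single-input baker's codes are more elaborate; this is only a hedged sketch of the general mechanism:

```python
def bernoulli_shift(x):
    """Chaotic 1-D component of the baker's map: x -> 2x mod 1."""
    return (2.0 * x) % 1.0

def analog_encode(source, n):
    """Expand one analog source sample in [0,1) into n correlated
    channel samples by iterating the chaotic map (bandwidth
    expansion by a factor of n).  Samples are centred to [-0.5, 0.5)."""
    out, x = [], source
    for _ in range(n):
        out.append(x - 0.5)
        x = bernoulli_shift(x)
    return out

def ml_decode(received, n, grid=4096):
    """Brute-force ML decoding: pick the candidate source value whose
    clean trajectory is closest (in squared error) to the received one."""
    best, best_err = None, float("inf")
    for k in range(grid):
        s = k / grid
        err = sum((r - c) ** 2 for r, c in zip(received, analog_encode(s, n)))
        if err < best_err:
            best, best_err = s, err
    return best

codeword = analog_encode(0.3, 4)
print(abs(ml_decode(codeword, 4) - 0.3) < 1e-3)   # prints: True
```

Because the map stretches nearby source values apart, small channel noise can flip a trajectory across a fold, producing the large "fold error" the paper's mirrored construction is designed to balance against.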
Data compression for satellite images
NASA Technical Reports Server (NTRS)
Chen, P. H.; Wintz, P. A.
1976-01-01
An efficient data compression system is presented for satellite pictures and two grey level pictures derived from satellite pictures. The compression techniques take advantage of the correlation between adjacent picture elements. Several source coding methods are investigated. Double delta coding is presented and shown to be the most efficient. Both the predictive differential quantizing technique and double delta coding can be significantly improved by applying a background skipping technique. An extension code is constructed. This code requires very little storage space and operates efficiently. Simulation results are presented for various coding schemes and source codes.
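Double delta coding as described in the record amounts to taking second differences of the pixel stream, which concentrate near zero for smoothly varying image rows and therefore entropy-code compactly. A minimal sketch, assuming integer grey levels and omitting the quantization and background-skipping steps the actual system includes:

```python
def delta(seq):
    """First-difference transform; keeps seq[0] as the anchor value."""
    return [seq[0]] + [b - a for a, b in zip(seq, seq[1:])]

def undelta(seq):
    """Invert delta() by cumulative summation."""
    out = [seq[0]]
    for d in seq[1:]:
        out.append(out[-1] + d)
    return out

def double_delta_encode(pixels):
    """Second differences of a pixel row (delta applied twice)."""
    return delta(delta(pixels))

def double_delta_decode(codes):
    """Invert by integrating twice."""
    return undelta(undelta(codes))

row = [100, 102, 105, 109, 114, 118, 121]
enc = double_delta_encode(row)
assert double_delta_decode(enc) == row
print(enc)   # prints: [100, -98, 1, 1, 1, -1, -1]
```

After the anchor terms, the encoded values are tiny, which is what makes the method efficient for correlated satellite imagery.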
One-way quantum repeaters with quantum Reed-Solomon codes
NASA Astrophysics Data System (ADS)
Muralidharan, Sreraman; Zou, Chang-Ling; Li, Linshu; Jiang, Liang
2018-05-01
We show that quantum Reed-Solomon codes constructed from classical Reed-Solomon codes can approach the capacity of the quantum erasure channel of d-level systems for large dimension d. We study the performance of one-way quantum repeaters with these codes and obtain a significant improvement in key generation rate compared to previously investigated encoding schemes with quantum parity codes and quantum polynomial codes. We also compare the three generations of quantum repeaters using quantum Reed-Solomon codes and identify parameter regimes where each generation performs best.
Energy Storage System Safety: Plan Review and Inspection Checklist
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cole, Pam C.; Conover, David R.
Codes, standards, and regulations (CSR) governing the design, construction, installation, commissioning, and operation of the built environment are intended to protect the public health, safety, and welfare. While these documents change over time to address new technology and new safety challenges, there is generally some lag time between the introduction of a technology into the market and the time it is specifically covered in model codes and standards developed in the voluntary sector. After their development, there is also a timeframe of at least a year or two until the codes and standards are adopted. Until existing model codes and standards are updated or new ones are developed and then adopted, one seeking to deploy energy storage technologies or needing to verify the safety of an installation may be challenged in trying to apply currently implemented CSRs to an energy storage system (ESS). The Energy Storage System Guide for Compliance with Safety Codes and Standards (CG), developed in June 2016, is intended to help address the acceptability of the design and construction of stationary ESSs, their component parts, and the siting, installation, commissioning, operations, maintenance, and repair/renovation of ESS within the built environment.
Constructing LDPC Codes from Loop-Free Encoding Modules
NASA Technical Reports Server (NTRS)
Divsalar, Dariush; Dolinar, Samuel; Jones, Christopher; Thorpe, Jeremy; Andrews, Kenneth
2009-01-01
A method of constructing certain low-density parity-check (LDPC) codes by use of relatively simple loop-free coding modules has been developed. The subclasses of LDPC codes to which the method applies include accumulate-repeat-accumulate (ARA) codes, accumulate-repeat-check-accumulate codes, and the codes described in Accumulate-Repeat-Accumulate-Accumulate Codes (NPO-41305), NASA Tech Briefs, Vol. 31, No. 9 (September 2007), page 90. All of the affected codes can be characterized as serial/parallel (hybrid) concatenations of such relatively simple modules as accumulators, repetition codes, differentiators, and punctured single-parity check codes. These are error-correcting codes suitable for use in a variety of wireless data-communication systems that include noisy channels. These codes can also be characterized as hybrid turbolike codes that have projected graph or protograph representations (for example see figure); these characteristics make it possible to design high-speed iterative decoders that utilize belief-propagation algorithms. The present method comprises two related submethods for constructing LDPC codes from simple loop-free modules with circulant permutations. The first submethod is an iterative encoding method based on the erasure-decoding algorithm. The computations required by this method are well organized because they involve a parity-check matrix having a block-circulant structure. The second submethod involves the use of block-circulant generator matrices. The encoders of this method are very similar to those of recursive convolutional codes. Some encoders according to this second submethod have been implemented in a small field-programmable gate array that operates at a speed of 100 megasymbols per second.
By use of density evolution (a computational-simulation technique for analyzing performances of LDPC codes), it has been shown through some examples that as the block size goes to infinity, low iterative decoding thresholds close to channel capacity limits can be achieved for the codes of the type in question having low maximum variable node degrees. The decoding thresholds in these examples are lower than those of the best-known unstructured irregular LDPC codes constrained to have the same maximum node degrees. Furthermore, the present method enables the construction of codes of any desired rate with thresholds that stay uniformly close to their respective channel capacity thresholds.
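The block-circulant parity-check structure mentioned in this record can be illustrated by lifting a small protograph: each entry of a base matrix is replaced by a circulant permutation matrix (a cyclic shift of the identity) or by an all-zero block. The base matrix below is a hypothetical example, not one of the ARA designs from the article:

```python
def circulant(z, shift):
    """z x z circulant permutation matrix (identity cyclically shifted).
    By convention here, shift == -1 denotes the all-zero block."""
    if shift < 0:
        return [[0] * z for _ in range(z)]
    return [[1 if (c - r) % z == shift else 0 for c in range(z)]
            for r in range(z)]

def lift(base, z):
    """Expand a protograph base matrix of circulant shifts into the
    full block-circulant parity-check matrix H."""
    H = []
    for brow in base:
        blocks = [circulant(z, s) for s in brow]
        for r in range(z):
            H.append([b[r][c] for b in blocks for c in range(z)])
    return H

# hypothetical 2x4 base matrix of shifts; -1 marks a zero block
base = [[0, 1, 2, -1],
        [3, -1, 0, 1]]
H = lift(base, 5)
print(len(H), len(H[0]))   # prints: 10 20
```

The resulting H is sparse and structured: each check-node row touches exactly one variable per nonzero block, which is what makes well-organized encoding and high-speed belief-propagation decoding practical.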
Building Interactive Simulations in Web Pages without Programming.
Mailen Kootsey, J; McAuley, Grant; Bernal, Julie
2005-01-01
A software system is described for building interactive simulations and other numerical calculations in Web pages. The system is based on a new Java-based software architecture named NumberLinX (NLX) that isolates each function required to build the simulation so that a library of reusable objects could be assembled. The NLX objects are integrated into a commercial Web design program for coding-free page construction. The model description is entered through a wizard-like utility program that also functions as a model editor. The complete system permits very rapid construction of interactive simulations without coding. A wide range of applications are possible with the system beyond interactive calculations, including remote data collection and processing and collaboration over a network.
Flexural anchorage performance at diagonal crack locations.
DOT National Transportation Integrated Search
2010-12-01
Large numbers of reinforced concrete deck girder bridges that were constructed during the interstate system expansion of the 1950s have developed diagonal cracking in the stems. Though compliant with design codes when constructed, many of these bridg...
Support for User Interfaces for Distributed Systems
NASA Technical Reports Server (NTRS)
Eychaner, Glenn; Niessner, Albert
2005-01-01
An extensible Java(TradeMark) software framework supports the construction and operation of graphical user interfaces (GUIs) for distributed computing systems typified by ground control systems that send commands to, and receive telemetric data from, spacecraft. Heretofore, such GUIs have been custom built for each new system at considerable expense. In contrast, the present framework affords generic capabilities that can be shared by different distributed systems. Dynamic class loading, reflection, and other run-time capabilities of the Java language and JavaBeans component architecture enable the creation of a GUI for each new distributed computing system with a minimum of custom effort. By use of this framework, GUI components in control panels and menus can send commands to a particular distributed system with a minimum of system-specific code. The framework receives, decodes, processes, and displays telemetry data; custom telemetry data handling can be added for a particular system. The framework supports saving and later restoration of users' configurations of control panels and telemetry displays with a minimum of effort in writing system-specific code. GUIs constructed within this framework can be deployed in any operating system with a Java run-time environment, without recompilation or code changes.
Flexural anchorage performance at diagonal crack locations : final report.
DOT National Transportation Integrated Search
2010-12-01
Large numbers of reinforced concrete deck girder bridges that were constructed during the interstate system expansion of the 1950s have developed diagonal cracking in the stems. Though compliant with design codes when constructed, many of these bridg...
BOCA BASIC BUILDING CODE. 4TH ED., 1965 AND 1967. BOCA BASIC BUILDING CODE ACCUMULATIVE SUPPLEMENT.
ERIC Educational Resources Information Center
Building Officials Conference of America, Inc., Chicago, IL.
Nationally recognized standards for the evaluation of minimum safe practice or for determining the performance of materials or systems of construction have been compiled as an aid to designers and local officials. The code presents regulations in terms of measured performance rather than in rigid specification of materials or methods. The areas…
ERIC Educational Resources Information Center
Cole, Charles; Mandelblatt, Bertie
2000-01-01
Uses Kintsch's proposition-based construction-integration theory of discourse comprehension to detail the user coding operations that occur in each of the three subsystems (Perception, Comprehension, Application) in which users process an information retrieval systems (IRS) message. Describes an IRS device made up of two separate parts that enable…
Constructing graph models for software system development and analysis
NASA Astrophysics Data System (ADS)
Pogrebnoy, Andrey V.
2017-01-01
We propose a concept for building tooling that supports the rationale behind functional and structural decisions during software system (SS) development. We propose to develop the SS simultaneously on two models: a functional model (FM) and a structural model (SM). The FM is the source code of the SS. An adequate representation of the FM in the form of a graph model (GM) is generated automatically and is called the SM. The problem of creating and visualizing the GM is considered from the standpoint of applying it as a uniform platform for adequately representing the SS source code. We propose three levels of GM detail: GM1 for visual analysis of the source code and SS version control, GM2 for resource optimization and analysis of connections between SS components, and GM3 for analysis of SS functioning in dynamics. The paper includes examples of constructing all levels of the GM.
10 CFR 50.55a - Codes and standards.
Code of Federal Regulations, 2010 CFR
2010-01-01
..., standard design approval, and standard design certification application under part 52 of this chapter is... section. (a)(1) Structures, systems, and components must be designed, fabricated, erected, constructed... Guide 1.84, Revision 34, “Design, Fabrication, and Materials Code Case Acceptability, ASME Section III...
Assuring Structural Integrity in Army Systems
1985-02-28
power plants are: 1. American Society of Mechanical Engineers (ASME) Boiler and Pressure Vessel Code, Section III - Rules for Construction of Nuclear Power Plant Components; 2. ASME Boiler and Pressure Vessel Code, Section XI, Rules for In-Service Inspection of Nuclear Power Plant Components; and 3
Long distance quantum communication with quantum Reed-Solomon codes
NASA Astrophysics Data System (ADS)
Muralidharan, Sreraman; Zou, Chang-Ling; Li, Linshu; Jiang, Liang; Jianggroup Team
We study the construction of quantum Reed-Solomon codes from classical Reed-Solomon codes and show that they achieve the capacity of the quantum erasure channel for multi-level quantum systems. We extend the application of quantum Reed-Solomon codes to long-distance quantum communication, investigate the local resource overhead needed for the functioning of one-way quantum repeaters with these codes, and numerically identify the parameter regime where these codes perform better than the known quantum polynomial codes and quantum parity codes. Finally, we discuss the implementation of these codes into time-bin photonic states of qubits and qudits, respectively, and optimize the performance for one-way quantum repeaters.
New quantum codes constructed from quaternary BCH codes
NASA Astrophysics Data System (ADS)
Xu, Gen; Li, Ruihu; Guo, Luobin; Ma, Yuena
2016-10-01
In this paper, we first study the construction of new quantum error-correcting codes (QECCs) from three classes of quaternary imprimitive BCH codes. As a result, the improved maximal designed distance of these narrow-sense imprimitive Hermitian dual-containing quaternary BCH codes is determined to be much larger than the result given by Aly et al. (IEEE Trans Inf Theory 53:1183-1188, 2007) for each code length. Thus, families of new QECCs are obtained, and the constructed QECCs have larger distance than those in the previous literature. Second, we apply a combinatorial construction to the imprimitive BCH codes and their corresponding primitive counterparts and construct many new linear quantum codes with good parameters, some of which have parameters exceeding the finite Gilbert-Varshamov bound for linear quantum codes.
Lee, Jin Hee; Hong, Ki Jeong; Kim, Do Kyun; Kwak, Young Ho; Jang, Hye Young; Kim, Hahn Bom; Noh, Hyun; Park, Jungho; Song, Bongkyu; Jung, Jae Yun
2013-12-01
A clinically sensible diagnosis grouping system (DGS) is needed for describing pediatric emergency diagnoses for research, medical resource preparedness, and making national policy for pediatric emergency medical care. The Pediatric Emergency Care Applied Research Network (PECARN) developed such a DGS successfully. We developed a modified PECARN DGS based on the different pediatric population of South Korea and validated the system to obtain accurate and comparable epidemiologic data on pediatric emergent conditions of the selected population. The data source used to develop and validate the modified PECARN DGS was the National Emergency Department Information System of South Korea, which is coded with the International Classification of Diseases, 10th Revision (ICD-10) code system. To develop the modified DGS based on ICD-10 codes, we matched the selected ICD-10 codes with those of the PECARN DGS using the General Equivalence Mappings (GEMs). After converting ICD-10 codes to ICD-9 codes by GEMs, we matched the ICD-9 codes into PECARN DGS categories using the matrix developed by the PECARN group. Lastly, we conducted an expert panel survey using the Delphi method for the remaining diagnosis codes that were not matched. A total of 1879 ICD-10 codes were used in development of the modified DGS. After 1078 (57.4%) of the 1879 ICD-10 codes were assigned to the modified DGS by the GEM and PECARN conversion tools, investigators assigned each of the remaining 801 codes (42.6%) to DGS subgroups through 2 rounds of electronic Delphi surveys, and the remaining 29 codes (4%) were assigned to the modified DGS at the second expert consensus meeting. The modified DGS accounts for 98.7% and 95.2% of diagnoses in the 2008 and 2009 National Emergency Department Information System data sets, respectively. The modified DGS also exhibited strong construct validity using the concepts of age, sex, site of care, and season, and reflected the 2009 outbreak of H1N1 influenza in Korea.
We developed and validated a clinically feasible and sensible DGS for describing pediatric emergent conditions in Korea. The modified PECARN DGS showed good comprehensiveness and demonstrated reliable construct validity. This modified DGS, based on the PECARN DGS framework, may be effectively implemented for research, reporting, and resource planning in the pediatric emergency system of South Korea.
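The two-step assignment described in this record (ICD-10 to ICD-9 via GEM-style conversion, then ICD-9 to a DGS subgroup via the PECARN-style matrix, with unmatched codes routed to expert review) can be sketched as follows. The tables here are tiny hypothetical stand-ins; the real GEM files cover the full ICD-10/ICD-9-CM code sets and the DGS matrix is PECARN's:

```python
# Hypothetical miniature mapping tables (illustrative only).
ICD10_TO_ICD9 = {"J45.909": "493.90", "S52.501A": "813.41"}
ICD9_TO_DGS   = {"493.90": "asthma", "813.41": "fracture of upper limb"}

def map_to_dgs(icd10_codes):
    """Two-step conversion: ICD-10 -> ICD-9 via a GEM-style table,
    then ICD-9 -> DGS subgroup via a PECARN-style matrix.  Codes that
    fail either step are collected for expert (Delphi) review."""
    assigned, unmatched = {}, []
    for code in icd10_codes:
        icd9 = ICD10_TO_ICD9.get(code)
        dgs = ICD9_TO_DGS.get(icd9) if icd9 else None
        if dgs:
            assigned[code] = dgs
        else:
            unmatched.append(code)   # -> Delphi survey / consensus meeting
    return assigned, unmatched

assigned, unmatched = map_to_dgs(["J45.909", "S52.501A", "Z99.999"])
print(assigned, unmatched)
```

In the study, this automated step resolved 57.4% of codes; the `unmatched` list is what went to the two Delphi rounds and the consensus meeting.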
NASA Technical Reports Server (NTRS)
Lin, Shu; Rhee, Dojun
1996-01-01
This paper is concerned with construction of multilevel concatenated block modulation codes using a multi-level concatenation scheme for the frequency non-selective Rayleigh fading channel. In the construction of multilevel concatenated modulation code, block modulation codes are used as the inner codes. Various types of codes (block or convolutional, binary or nonbinary) are being considered as the outer codes. In particular, we focus on the special case for which Reed-Solomon (RS) codes are used as the outer codes. For this special case, a systematic algebraic technique for constructing q-level concatenated block modulation codes is proposed. Codes have been constructed for certain specific values of q and compared with the single-level concatenated block modulation codes using the same inner codes. A multilevel closest coset decoding scheme for these codes is proposed.
Deductive Glue Code Synthesis for Embedded Software Systems Based on Code Patterns
NASA Technical Reports Server (NTRS)
Liu, Jian; Fu, Jicheng; Zhang, Yansheng; Bastani, Farokh; Yen, I-Ling; Tai, Ann; Chau, Savio N.
2006-01-01
Automated code synthesis is a constructive process that can be used to generate programs from specifications. It can, thus, greatly reduce software development cost and time. The use of a formal code synthesis approach for software generation further increases the dependability of the system. Though code synthesis has many potential benefits, the synthesis techniques are still limited. Meanwhile, components are widely used in embedded system development. Applying code synthesis to the component-based software development (CBSD) process can greatly enhance the capability of code synthesis while reducing component composition efforts. In this paper, we discuss the issues and techniques for applying deductive code synthesis techniques to CBSD. For deductive synthesis in CBSD, a rule base is the key for inferring appropriate component composition. We use code patterns to guide the development of rules. Code patterns have been proposed to capture the typical usages of the components. Several general composition operations have been identified to facilitate systematic composition. We present the technique for rule development and automated generation of new patterns from existing code patterns. A case study of using this method in building a real-time control system is also presented.
Some partial-unit-memory convolutional codes
NASA Technical Reports Server (NTRS)
Abdel-Ghaffar, K.; Mceliece, R. J.; Solomon, G.
1991-01-01
The results of a study on a class of error correcting codes called partial unit memory (PUM) codes are presented. This class of codes, though not entirely new, has until now remained relatively unexplored. The possibility of using the well developed theory of block codes to construct a large family of promising PUM codes is shown. The performance of several specific PUM codes are compared with that of the Voyager standard (2, 1, 6) convolutional code. It was found that these codes can outperform the Voyager code with little or no increase in decoder complexity. This suggests that there may very well be PUM codes that can be used for deep space telemetry that offer both increased performance and decreased implementational complexity over current coding systems.
Code of Federal Regulations, 2014 CFR
2014-10-01
..., the Self-Governance Tribe and the Secretary must agree upon and specify appropriate building codes and...-Governance Tribe in the preparation of its construction project proposal. If Tribal construction codes and standards (including national, regional, State, or Tribal building codes or construction industry standards...
Code of Federal Regulations, 2012 CFR
2012-10-01
..., the Self-Governance Tribe and the Secretary must agree upon and specify appropriate building codes and...-Governance Tribe in the preparation of its construction project proposal. If Tribal construction codes and standards (including national, regional, State, or Tribal building codes or construction industry standards...
Code of Federal Regulations, 2013 CFR
2013-10-01
..., the Self-Governance Tribe and the Secretary must agree upon and specify appropriate building codes and...-Governance Tribe in the preparation of its construction project proposal. If Tribal construction codes and standards (including national, regional, State, or Tribal building codes or construction industry standards...
Code of Federal Regulations, 2011 CFR
2011-10-01
..., the Self-Governance Tribe and the Secretary must agree upon and specify appropriate building codes and...-Governance Tribe in the preparation of its construction project proposal. If Tribal construction codes and standards (including national, regional, State, or Tribal building codes or construction industry standards...
Code of Federal Regulations, 2010 CFR
2010-10-01
..., the Self-Governance Tribe and the Secretary must agree upon and specify appropriate building codes and...-Governance Tribe in the preparation of its construction project proposal. If Tribal construction codes and standards (including national, regional, State, or Tribal building codes or construction industry standards...
Validation of a Communication Process Measure for Coding Control in Counseling.
ERIC Educational Resources Information Center
Heatherington, Laurie
The increasingly popular view of the counseling process from an interactional perspective necessitates the development of new measurement instruments which are suitable to the study of the reciprocal interaction between people. The validity of the Relational Communication Coding System, an instrument which operationalizes the constructs of…
1988-08-01
U.S. Army Construction Engineering Research Laboratory (CECER), P.O. Box 4005, Champaign, IL 61821.
Improvements to the construction of binary black hole initial data
NASA Astrophysics Data System (ADS)
Ossokine, Serguei; Foucart, Francois; Pfeiffer, Harald P.; Boyle, Michael; Szilágyi, Béla
2015-12-01
Construction of binary black hole initial data is a prerequisite for numerical evolutions of binary black holes. This paper reports improvements to the binary black hole initial data solver in the Spectral Einstein Code, to allow robust construction of initial data for mass ratios above 10:1 and for dimensionless black hole spins above 0.9, while improving efficiency for lower mass ratios and spins. We implement a more flexible domain decomposition, adaptive mesh refinement and an updated method for choosing free parameters. We also introduce a new method to control and eliminate residual linear momentum in initial data for precessing systems, and demonstrate that it eliminates gravitational mode mixing during the evolution. Finally, the new code is applied to construct initial data for hyperbolic scattering and for binaries with very small separation.
Advanced ballistic range technology
NASA Technical Reports Server (NTRS)
Yates, Leslie A.
1993-01-01
Optical images, such as experimental interferograms, schlieren, and shadowgraphs, are routinely used to identify and locate features in experimental flow fields and for validating computational fluid dynamics (CFD) codes. Interferograms can also be used for comparing experimental and computed integrated densities. By constructing these optical images from flow-field simulations, one-to-one comparisons of computation and experiment are possible. During the period from February 1, 1992, to November 30, 1992, work has continued on the development of CISS (Constructed Interferograms, Schlieren, and Shadowgraphs), a code that constructs images from ideal- and real-gas flow-field simulations. In addition, research connected with the automated film-reading system and the proposed reactivation of the radiation facility has continued.
Developing Information Power Grid Based Algorithms and Software
NASA Technical Reports Server (NTRS)
Dongarra, Jack
1998-01-01
This exploratory study initiated our effort to understand performance modeling on parallel systems. The basic goal of performance modeling is to understand and predict the performance of a computer program or set of programs on a computer system. Performance modeling has numerous applications, including evaluation of algorithms, optimization of code implementations, parallel library development, comparison of system architectures, parallel system design, and procurement of new systems. Our work lays the basis for the construction of parallel libraries that allow for the reconstruction of application codes on several distinct architectures so as to assure performance portability. Following our strategy, once the requirements of applications are well understood, one can then construct a library in a layered fashion. The top level of this library will consist of architecture-independent geometric, numerical, and symbolic algorithms that are needed by the sample of applications. These routines should be written in a language that is portable across the targeted architectures.
Construction of self-dual codes in the Rosenbloom-Tsfasman metric
NASA Astrophysics Data System (ADS)
Krisnawati, Vira Hari; Nisa, Anzi Lina Ukhtin
2017-12-01
Linear codes are among the most basic and useful codes in coding theory. Generally, a linear code is a code over a finite field equipped with the Hamming metric. Among the most interesting families of codes, the family of self-dual codes is particularly important, because it contains some of the best known error-correcting codes. The Hamming metric generalizes to the Rosenbloom-Tsfasman metric (RT-metric). The inner product in the RT-metric differs from the Euclidean inner product used to define duality in the Hamming metric, and most codes that are self-dual in the Hamming metric are not self-dual in the RT-metric. Moreover, the generator matrix is central to constructing a code, because it contains a basis of the code. Therefore, in this paper we give theorems and methods for constructing self-dual codes in the RT-metric by considering properties of the inner product and the generator matrix, and we illustrate each kind of construction with examples.
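A hedged sketch of checking RT self-duality follows, assuming the anti-diagonal inner product commonly used to define duality in the RT-metric for row codes (the paper's own definitions should be consulted for the general matrix case):

```python
def rt_inner(u, v, q):
    """Inner product often used for RT-metric duality on row codes:
    <u, v> = sum_i u_i * v_{n-1-i}  (mod q)."""
    n = len(u)
    return sum(u[i] * v[n - 1 - i] for i in range(n)) % q

def is_self_dual_rt(gen, n, q):
    """A linear code with k x n generator matrix `gen` is RT self-dual
    iff k == n/2 and all pairwise RT inner products of generator rows
    vanish: then C is contained in its RT dual and the dimensions match."""
    k = len(gen)
    if 2 * k != n:
        return False
    return all(rt_inner(u, v, q) == 0 for u in gen for v in gen)

# Over F_2, length 4: the code spanned by e0 and e1 is RT self-dual
# (every product pairs a leading coordinate with a trailing zero),
# although it is clearly not self-dual in the Hamming metric.
G = [[1, 0, 0, 0],
     [0, 1, 0, 0]]
print(is_self_dual_rt(G, 4, 2))   # prints: True
```

The example also illustrates the abstract's point: a code can be self-dual under the RT inner product while failing Euclidean self-duality, since e0 has odd Hamming weight.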
NASA Technical Reports Server (NTRS)
Rajpal, Sandeep; Rhee, Do Jun; Lin, Shu
1997-01-01
The first part of this paper presents a simple and systematic technique for constructing multidimensional M-ary phase shift keying (MPSK) trellis coded modulation (TCM) codes. The construction is based on a multilevel concatenation approach in which binary convolutional codes with good free branch distances are used as the outer codes and block MPSK modulation codes are used as the inner codes (or the signal spaces). Conditions on phase invariance of these codes are derived, and a multistage decoding scheme for these codes is proposed. The proposed technique can be used to construct good codes for both the additive white Gaussian noise (AWGN) and fading channels, as is shown in the second part of this paper.
Generalized Bezout's Theorem and its applications in coding theory
NASA Technical Reports Server (NTRS)
Berg, Gene A.; Feng, Gui-Liang; Rao, T. R. N.
1996-01-01
This paper presents a generalized Bezout theorem which can be used to determine a tighter lower bound on the number of distinct points of intersection of two or more curves for a large class of plane curves. A new approach to determining a lower bound on the minimum distance (and also the generalized Hamming weights) for algebraic-geometric codes defined from a class of plane curves is introduced, based on the generalized Bezout theorem. Examples of more efficient linear codes are constructed using the generalized Bezout theorem and the new approach. For d = 4, the linear codes obtained by the new construction are better than or equal to the known linear codes. For d greater than 5, these new codes are better than the known codes. The Klein code over GF(2^3) is also constructed.
Fernández-Lansac, Violeta; Crespo, María
2017-07-26
This study introduces a new coding system, the Coding and Assessment System for Narratives of Trauma (CASNOT), to analyse several language domains in narratives of autobiographical memories, especially in trauma narratives. The development of the coding system is described. It was applied to assess positive and traumatic/negative narratives in 50 battered women (trauma-exposed group) and 50 nontrauma-exposed women (control group). Three blind raters coded each narrative. Inter-rater reliability analyses were conducted for the CASNOT language categories (multirater Kfree coefficients) and dimensions (intraclass correlation coefficients). High levels of inter-rater agreement were found for most of the language domains. Categories that did not reach the expected reliability were mainly those related to cognitive processes, which reflects difficulties in operationalizing constructs such as lack of control or helplessness, control or planning, and rationalization or memory elaboration. Applications and limitations of the CASNOT are discussed to enhance narrative measures for autobiographical memories.
Lightweight composites for modular panelized construction
NASA Astrophysics Data System (ADS)
Vaidya, Amol S.
Rapid advances in construction materials technology have enabled civil engineers to achieve impressive gains in the safety, economy, and functionality of structures built to serve the common needs of society. Modular building systems are a fast-growing, modern form of construction gaining recognition for their increased efficiency and ability to apply modern technology to the needs of the marketplace. In the modular construction technique, a single structural panel can perform a number of functions such as providing thermal insulation, vibration damping, and structural strength. These multifunctional panels can be prefabricated in a manufacturing facility and then transferred to the construction site. A system that uses prefabricated panels for construction is called a "panelized construction system". This study focuses on the development of pre-cast, lightweight, multifunctional sandwich composite panels to be used for panelized construction. Two thermoplastic composite panels are proposed in this study, namely Composite Structural Insulated Panels (CSIPs) for exterior walls, floors and roofs, and Open Core Sandwich composite for multifunctional interior walls of a structure. Special manufacturing techniques are developed for manufacturing these panels. The structural behavior of these panels is analyzed based on various building design codes. Detailed descriptions of the design, cost analysis, manufacturing, finite element modeling and structural testing of these proposed panels are included in this study in the form of five peer-reviewed journal articles. The structural testing of the proposed panels included flexural testing, axial compression testing, and low and high velocity impact testing. Based on the current study, the proposed CSIP wall and floor panels were found satisfactory, based on building design codes ASCE-7-05 and ACI-318-05.
Joining techniques are proposed in this study for connecting the precast panels on the construction site. Keywords: Modular panelized construction, sandwich composites, composite structural insulated panels (CSIPs).
Construction of a menu-based system
NASA Technical Reports Server (NTRS)
Noonan, R. E.; Collins, W. R.
1985-01-01
The development of the user interface to a software code management system is discussed. The user interface was specified using a grammar and implemented using a LR parser generator. This was found to be an effective method for the rapid prototyping of a menu based system.
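The grammar-driven dispatch idea can be sketched in miniature (the command names and actions below are hypothetical; the original system specified its interface with a formal grammar fed to an LR parser generator):

```python
# Minimal sketch of a grammar-driven menu dispatcher. A table maps
# menu tokens to actions, playing the role the generated LR parser
# played in the original system.

MENU_GRAMMAR = {
    # nonterminal -> list of (token, action name)
    "main": [("checkout", "do_checkout"), ("history", "do_history")],
}

ACTIONS = {
    "do_checkout": lambda arg: f"checked out {arg}",
    "do_history": lambda arg: f"history of {arg}",
}

def run_command(line: str) -> str:
    """Parse a 'verb argument' line against the menu grammar and dispatch."""
    verb, _, arg = line.partition(" ")
    for token, action in MENU_GRAMMAR["main"]:
        if token == verb:
            return ACTIONS[action](arg)
    raise ValueError(f"unknown command: {verb!r}")
```

Because the interface is data-driven, a menu change is a table edit rather than a code rewrite, which is what made the grammar-based approach effective for rapid prototyping.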
1990-12-01
Naval Postgraduate School. ...system's individual components. Then one derives the overall system reliability from that information, using a simple mathematical model, to be...
How collaboration in therapy becomes therapeutic: the therapeutic collaboration coding system.
Ribeiro, Eugénia; Ribeiro, António P; Gonçalves, Miguel M; Horvath, Adam O; Stiles, William B
2013-09-01
The quality and strength of the therapeutic collaboration, the core of the alliance, is reliably associated with positive therapy outcomes. The urgent challenge for clinicians and researchers is constructing a conceptual framework to integrate the dialectical work that fosters collaboration, with a model of how clients make progress in therapy. We propose a conceptual account of how collaboration in therapy becomes therapeutic. In addition, we report on the construction of a coding system - the therapeutic collaboration coding system (TCCS) - designed to analyse and track on a moment-by-moment basis the interaction between therapist and client. Preliminary evidence is presented regarding the coding system's psychometric properties. The TCCS evaluates each speaking turn and assesses whether and how therapists are working within the client's therapeutic zone of proximal development, defined as the space between the client's actual therapeutic developmental level and their potential developmental level that can be reached in collaboration with the therapist. We applied the TCCS to five cases: a good and a poor outcome case of narrative therapy, a good and a poor outcome case of cognitive-behavioural therapy, and a dropout case of narrative therapy. The TCCS offers markers that may help researchers better understand the therapeutic collaboration on a moment-to-moment basis and may help therapists better regulate the relationship. © 2012 The British Psychological Society.
Evaluation of Passive Vents in New Construction Multifamily Buildings
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sean Maxwell; Berger, David; Zuluaga, Marc
Exhaust ventilation and corresponding outdoor air strategies are being implemented in high-performance, new-construction multifamily buildings to meet program or code requirements for improved indoor air quality. However, a lack of clear design guidance is resulting in poor performance of these systems, despite the best intentions of the programs or standards.
25 CFR 900.125 - What shall a construction contract proposal contain?
Code of Federal Regulations, 2012 CFR
2012-04-01
... tribal building codes and engineering standards; (4) Structural integrity; (5) Accountability of funds..., standards and methods (including national, regional, state, or tribal building codes or construction... methods (including national, regional, state, or tribal building codes or construction industry standards...
25 CFR 900.125 - What shall a construction contract proposal contain?
Code of Federal Regulations, 2014 CFR
2014-04-01
... tribal building codes and engineering standards; (4) Structural integrity; (5) Accountability of funds..., standards and methods (including national, regional, state, or tribal building codes or construction... methods (including national, regional, state, or tribal building codes or construction industry standards...
25 CFR 900.125 - What shall a construction contract proposal contain?
Code of Federal Regulations, 2013 CFR
2013-04-01
... tribal building codes and engineering standards; (4) Structural integrity; (5) Accountability of funds..., standards and methods (including national, regional, state, or tribal building codes or construction... methods (including national, regional, state, or tribal building codes or construction industry standards...
25 CFR 900.125 - What shall a construction contract proposal contain?
Code of Federal Regulations, 2011 CFR
2011-04-01
... tribal building codes and engineering standards; (4) Structural integrity; (5) Accountability of funds..., standards and methods (including national, regional, state, or tribal building codes or construction... methods (including national, regional, state, or tribal building codes or construction industry standards...
25 CFR 900.125 - What shall a construction contract proposal contain?
Code of Federal Regulations, 2010 CFR
2010-04-01
... tribal building codes and engineering standards; (4) Structural integrity; (5) Accountability of funds..., standards and methods (including national, regional, state, or tribal building codes or construction... methods (including national, regional, state, or tribal building codes or construction industry standards...
NASA Astrophysics Data System (ADS)
Ma, Fanghui; Gao, Jian; Fu, Fang-Wei
2018-06-01
Let R = F_q + vF_q + v^2 F_q be a finite non-chain ring, where q is an odd prime power and v^3 = v. In this paper, we propose two methods of constructing quantum codes from (α + βv + γv^2)-constacyclic codes over R. The first is obtained via the Gray map and the Calderbank-Shor-Steane construction from Euclidean dual-containing (α + βv + γv^2)-constacyclic codes over R. The second is obtained via the Gray map and the Hermitian construction from Hermitian dual-containing (α + βv + γv^2)-constacyclic codes over R. As an application, some new non-binary quantum codes are obtained.
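For context, the standard CSS facts this construction relies on can be stated in their usual form (textbook statements, not the paper's specific R-linear version):

```latex
% Calderbank-Shor-Steane (CSS) construction: from classical linear
% codes C_2 \subseteq C_1 \subseteq \mathbb{F}_q^n one obtains a
% quantum code
\[
  [[\,n,\; k_1 - k_2,\; d\,]]_q, \qquad
  d = \min\{\operatorname{wt}(c) :
      c \in (C_1 \setminus C_2)\cup(C_2^{\perp} \setminus C_1^{\perp})\}.
\]
% In particular, a Euclidean dual-containing code C
% (C^{\perp} \subseteq C) with parameters [n, k, d] yields
\[
  [[\,n,\; 2k - n,\; \ge d\,]]_q ,
\]
% which is why dual-containment of the constacyclic codes is the key
% hypothesis in both constructions above.
```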
24 CFR 941.203 - Design and construction standards.
Code of Federal Regulations, 2013 CFR
2013-04-01
... national building code, such as Uniform Building Code, Council of American Building Officials Code, or Building Officials Conference of America Code; (2) Applicable State and local laws, codes, ordinances, and... intended to serve. Building design and construction shall strive to encourage in residents a proprietary...
24 CFR 941.203 - Design and construction standards.
Code of Federal Regulations, 2012 CFR
2012-04-01
... national building code, such as Uniform Building Code, Council of American Building Officials Code, or Building Officials Conference of America Code; (2) Applicable State and local laws, codes, ordinances, and... intended to serve. Building design and construction shall strive to encourage in residents a proprietary...
38 CFR 39.63 - Architectural design standards.
Code of Federal Regulations, 2014 CFR
2014-07-01
... 5000, Building Construction and Safety Code, and the 2002 edition of the National Electrical Code, NFPA... 5000, Building Construction and Safety Code. Where the adopted codes state conflicting requirements... the standards set forth in this section, all applicable local and State building codes and regulations...
38 CFR 39.63 - Architectural design standards.
Code of Federal Regulations, 2012 CFR
2012-07-01
... 5000, Building Construction and Safety Code, and the 2002 edition of the National Electrical Code, NFPA... 5000, Building Construction and Safety Code. Where the adopted codes state conflicting requirements... the standards set forth in this section, all applicable local and State building codes and regulations...
38 CFR 39.63 - Architectural design standards.
Code of Federal Regulations, 2013 CFR
2013-07-01
... 5000, Building Construction and Safety Code, and the 2002 edition of the National Electrical Code, NFPA... 5000, Building Construction and Safety Code. Where the adopted codes state conflicting requirements... the standards set forth in this section, all applicable local and State building codes and regulations...
Applications of Derandomization Theory in Coding
NASA Astrophysics Data System (ADS)
Cheraghchi, Mahdi
2011-07-01
Randomized techniques play a fundamental role in theoretical computer science and discrete mathematics, in particular for the design of efficient algorithms and construction of combinatorial objects. The basic goal in derandomization theory is to eliminate or reduce the need for randomness in such randomized constructions. In this thesis, we explore some applications of the fundamental notions in derandomization theory to problems outside the core of theoretical computer science, and in particular, certain problems related to coding theory. First, we consider the wiretap channel problem, which involves a communication system in which an intruder can eavesdrop on a limited portion of the transmissions, and construct efficient and information-theoretically optimal communication protocols for this model. Then we consider the combinatorial group testing problem. In this classical problem, one aims to determine a set of defective items within a large population by asking a number of queries, where each query reveals whether a defective item is present within a specified group of items. We use randomness condensers to explicitly construct optimal, or nearly optimal, group testing schemes for a setting where the query outcomes can be highly unreliable, as well as the threshold model where a query returns positive if the number of defectives passes a certain threshold. Finally, we design ensembles of error-correcting codes that achieve the information-theoretic capacity of a large class of communication channels, and then use the obtained ensembles for construction of explicit capacity achieving codes. [This is a shortened version of the actual abstract in the thesis.]
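The group testing setting described above can be illustrated with the simple COMP decoder, a standard baseline scheme shown here only for illustration (the thesis constructs stronger schemes from randomness condensers):

```python
def run_tests(pools, defectives):
    """A pool tests positive iff it contains at least one defective."""
    return [bool(set(pool) & defectives) for pool in pools]

def comp_decode(pools, outcomes, n):
    """COMP decoding for nonadaptive group testing: any item appearing
    in a negative pool is ruled out; all survivors are declared
    defective (so the output is always a superset of the true set)."""
    ruled_out = set()
    for pool, positive in zip(pools, outcomes):
        if not positive:
            ruled_out.update(pool)
    return set(range(n)) - ruled_out
```

COMP never produces false negatives; false positives occur exactly for items that never appear in a negative pool, which is why good designs make every non-defective "hide" from the defectives in some pool.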
NASA Astrophysics Data System (ADS)
Walker, Ernest; Chen, Xinjia; Cooper, Reginald L.
2010-04-01
An arbitrarily accurate approach is used to determine the bit-error rate (BER) performance for generalized asynchronous DS-CDMA systems, in Gaussian noise with Rayleigh fading. In this paper, and the sequel, new theoretical work has been contributed which substantially enhances existing performance analysis formulations. Major contributions include: substantial computational complexity reduction, including a priori BER accuracy bounding; and an analytical approach that facilitates performance evaluation for systems with arbitrary spectral spreading distributions and non-uniform transmission delay distributions. Using prior results, augmented by these enhancements, a generalized DS-CDMA system model is constructed and used to evaluate the BER performance in a variety of scenarios. In this paper, the generalized system modeling was used to evaluate the performance of both Walsh-Hadamard (WH) and Walsh-Hadamard-seeded zero-correlation-zone (WH-ZCZ) coding. The selection of these codes was informed by the observation that WH codes contain N spectral spreading values (0 to N - 1), one for each code sequence, while WH-ZCZ codes contain only two spectral spreading values (N/2 - 1, N/2), where N is the sequence length in chips. Since these codes span the spectral spreading range for DS-CDMA coding, by invoking an induction argument, the generalization of the system model is sufficiently supported. The results in this paper, and the sequel, support the claim that an arbitrarily accurate performance analysis for DS-CDMA systems can be evaluated over the full range of binary coding, with minimal computational complexity.
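The Walsh-Hadamard sequences referred to above are conventionally generated by the Sylvester recursion; a minimal sketch (illustrative, not the authors' code):

```python
def hadamard(n: int):
    """Sylvester construction of an n x n Hadamard matrix (n a power
    of two). The rows, mapped {+1, -1} -> {0, 1}, are the length-n
    Walsh-Hadamard code sequences; distinct rows are orthogonal."""
    if n <= 0 or n & (n - 1):
        raise ValueError("n must be a positive power of two")
    H = [[1]]
    while len(H) < n:
        # H_{2m} = [[H_m, H_m], [H_m, -H_m]]
        H = ([row + row for row in H] +
             [row + [-x for x in row] for row in H])
    return H
```

Orthogonality of the rows is what gives WH codes their zero cross-correlation at perfect synchronization; the asynchronous analysis in the paper is needed precisely because this property degrades under relative delays.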
Ho, Christabel Man-Fong; Oladinrin, Olugbenga Timo
2018-01-30
Against the backdrop of economic globalization and its attendant business scandals, scholars and practitioners are increasingly engaged with the implementation of codes of ethics as a regulatory mechanism for stimulating ethical behaviour within an organization. The aim of this study is to examine various organizational practices regarding the effective implementation of codes of ethics within construction contracting companies. Views on ethics management in construction organizations, together with recommendations for improvement, were gleaned through 19 semi-structured interviews involving construction practitioners from various construction companies in Hong Kong. The findings suggested some practices for the effective implementation of codes of ethics in order to diffuse ethical behaviour in an organizational setting, which include: introduction of effective reward schemes, arrangement of ethics training for employees, and leadership responsiveness to reported wrongdoings. Since most construction companies in Hong Kong have codes of ethics, the emphasis is on the practical implementation of codes within the organizations. Hence, implications were drawn from the recommended measures to guide construction companies and policy makers.
NASA Technical Reports Server (NTRS)
Denney, Ewen W.; Fischer, Bernd
2009-01-01
Model-based development and automated code generation are increasingly used for production code in safety-critical applications, but since code generators are typically not qualified, the generated code must still be fully tested, reviewed, and certified. This is particularly arduous for mathematical and control engineering software which requires reviewers to trace subtle details of textbook formulas and algorithms to the code, and to match requirements (e.g., physical units or coordinate frames) not represented explicitly in models or code. Both tasks are complicated by the often opaque nature of auto-generated code. We address these problems by developing a verification-driven approach to traceability and documentation. We apply the AUTOCERT verification system to identify and then verify mathematical concepts in the code, based on a mathematical domain theory, and then use these verified traceability links between concepts, code, and verification conditions to construct a natural language report that provides a high-level structured argument explaining why and how the code uses the assumptions and complies with the requirements. We have applied our approach to generate review documents for several sub-systems of NASA s Project Constellation.
Meta-Modeling: A Knowledge-Based Approach to Facilitating Model Construction and Reuse
NASA Technical Reports Server (NTRS)
Keller, Richard M.; Dungan, Jennifer L.
1997-01-01
In this paper, we introduce a new modeling approach called meta-modeling and illustrate its practical applicability to the construction of physically-based ecosystem process models. As a critical adjunct to modeling codes, meta-modeling requires explicit specification of certain background information related to the construction and conceptual underpinnings of a model. This information formalizes the heretofore tacit relationship between the mathematical modeling code and the underlying real-world phenomena being investigated, and gives insight into the process by which the model was constructed. We show how the explicit availability of such information can make models more understandable and reusable and less subject to misinterpretation. In particular, background information enables potential users to better interpret an implemented ecosystem model without direct assistance from the model author. Additionally, we show how the discipline involved in specifying background information leads to improved management of model complexity and fewer implementation errors. We illustrate the meta-modeling approach in the context of the Scientists' Intelligent Graphical Modeling Assistant (SIGMA), a new model construction environment. As the user constructs a model using SIGMA, the system adds appropriate background information that ties the executable model to the underlying physical phenomena under investigation. Not only does this information improve the understandability of the final model, it also serves to reduce the overall time and programming expertise necessary to initially build and subsequently modify models. Furthermore, SIGMA's use of background knowledge helps eliminate coding errors resulting from scientific and dimensional inconsistencies that are otherwise difficult to avoid when building complex models. As a demonstration of SIGMA's utility, the system was used to reimplement and extend a well-known forest ecosystem dynamics model: Forest-BGC.
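SIGMA's actual consistency machinery is not shown in the abstract; the kind of dimensional bookkeeping that catches such coding errors can be sketched as follows (the types and unit names are illustrative assumptions, not SIGMA's implementation):

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Quantity:
    """A value tagged with base-dimension exponents (length, mass, time)."""
    value: float
    dims: tuple

    def __add__(self, other):
        # Addition is only meaningful between like dimensions.
        if self.dims != other.dims:
            raise ValueError(f"dimension mismatch: {self.dims} vs {other.dims}")
        return Quantity(self.value + other.value, self.dims)

    def __mul__(self, other):
        # Multiplication adds dimension exponents.
        return Quantity(self.value * other.value,
                        tuple(a + b for a, b in zip(self.dims, other.dims)))

METER = Quantity(1.0, (1, 0, 0))
SECOND = Quantity(1.0, (0, 0, 1))
```

With quantities carried this way, a dimensionally inconsistent expression fails at evaluation time instead of silently producing a wrong number.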
Low-Density Parity-Check (LDPC) Codes Constructed from Protographs
NASA Astrophysics Data System (ADS)
Thorpe, J.
2003-08-01
We introduce a new class of low-density parity-check (LDPC) codes constructed from a template called a protograph. The protograph serves as a blueprint for constructing LDPC codes of arbitrary size whose performance can be predicted by analyzing the protograph. We apply standard density evolution techniques to predict the performance of large protograph codes. Finally, we use a randomized search algorithm to find good protographs.
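The lifting step by which a protograph is expanded into a full parity-check matrix can be sketched as follows; the circulant shift assignment here is an arbitrary placeholder, whereas good codes come from searching over protographs and shifts as the abstract describes:

```python
def lift_protograph(base, N):
    """Expand a binary protograph base matrix into an N-fold lifted
    parity-check matrix, replacing each 1 with an N x N circulant
    permutation and each 0 with an N x N zero block. The shift
    (i + j) mod N is a hypothetical illustrative choice."""
    rows, cols = len(base), len(base[0])
    H = [[0] * (cols * N) for _ in range(rows * N)]
    for i in range(rows):
        for j in range(cols):
            if base[i][j]:
                s = (i + j) % N  # placeholder shift assignment
                for r in range(N):
                    H[i * N + r][j * N + (r + s) % N] = 1
    return H
```

The lifted matrix inherits the protograph's row and column degrees, which is why density evolution on the small protograph predicts the behavior of the large code.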
Accuracy comparison among different machine learning techniques for detecting malicious codes
NASA Astrophysics Data System (ADS)
Narang, Komal
2016-03-01
In this paper, a machine learning based model for malware detection is proposed. It can detect newly released malware, i.e., zero-day attacks, by analyzing operation codes on the Android operating system. The accuracy of Naïve Bayes, Support Vector Machine (SVM) and Neural Network classifiers for detecting malicious code has been compared for the proposed model. In the experiment, 400 benign files, 100 system files and 500 malicious files have been used to construct the model. The model yields its best accuracy, 88.9%, when a neural network is used as the classifier, and achieves 95% sensitivity and 82.8% specificity.
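The reported figures are mutually consistent with a confusion matrix of 475 true positives, 25 false negatives, 414 true negatives and 86 false positives over the 500 malicious and 500 non-malicious files; this reconstruction is an inference, not something the paper reports. The metric definitions, for reference:

```python
def classifier_metrics(tp, tn, fp, fn):
    """Accuracy, sensitivity (recall on the malicious class) and
    specificity (recall on the benign class) from confusion counts."""
    accuracy = (tp + tn) / (tp + tn + fp + fn)
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    return accuracy, sensitivity, specificity
```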
2015-06-01
cient parallel code for applying the operator. Our method constructs a polynomial preconditioner using a nonlinear least squares (NLLS) algorithm. We show...apply the underlying operator. Such a preconditioner can be very attractive in scenarios where one has a highly efficient parallel code for applying...repeatedly solve a large system of linear equations where one has an extremely fast parallel code for applying an underlying fixed linear operator
NASA Astrophysics Data System (ADS)
Nasaruddin; Tsujioka, Tetsuo
An optical CDMA (OCDMA) system is a flexible technology for future broadband multiple access networks. A secure OCDMA network in broadband optical access technologies is also becoming an issue of great importance. In this paper, we propose novel reconfigurable wavelength-time (W-T) optical codes that lead to secure transmission in OCDMA networks. The proposed W-T optical codes are constructed by using quasigroups (QGs) for wavelength hopping and one-dimensional optical orthogonal codes (OOCs) for time spreading; we call them QGs/OOCs. Both QGs and OOCs are randomly generated by a computer search to ensure that an eavesdropper could not improve its interception performance by making use of the coding structure. Then, the proposed reconfigurable QGs/OOCs can provide more codewords, and many different code set patterns, which differ in both wavelength and time positions for given code parameters. Moreover, the bit error probability of the proposed codes is analyzed numerically. To realize the proposed codes, a secure system is proposed by employing reconfigurable encoders/decoders based on array waveguide gratings (AWGs), which allow the users to change their codeword patterns to protect against eavesdropping. Finally, the probability of breaking a certain codeword in the proposed system is evaluated analytically. The results show that the proposed codes and system can provide a large codeword pattern, and decrease the probability of breaking a certain codeword, to enhance OCDMA network security.
49 CFR 178.345-10 - Pressure relief.
Code of Federal Regulations, 2010 CFR
2010-10-01
... applicable individual specification. The pressure and vacuum relief system must be designed to operate and... resulting from loading, unloading, or from heating and cooling of lading. Pressure relief systems are not required to conform to the ASME Code. (b) Type and construction of relief systems and devices. (1) Each...
Quantum error-correcting codes from algebraic geometry codes of Castle type
NASA Astrophysics Data System (ADS)
Munuera, Carlos; Tenório, Wanderson; Torres, Fernando
2016-10-01
We study algebraic geometry codes producing quantum error-correcting codes by the CSS construction. We pay particular attention to the family of Castle codes. We show that many of the examples known in the literature in fact belong to this family of codes. We systematize these constructions by showing the common theory that underlies all of them.
NASA Astrophysics Data System (ADS)
Kandouci, Chahinaz; Djebbari, Ali
2018-04-01
A new family of two-dimensional optical hybrid codes is used in this paper, employing zero cross-correlation (ZCC) codes, constructed from balanced incomplete block designs (BIBDs), as both the time-spreading and wavelength-hopping patterns. The obtained codes have off-peak autocorrelation and cross-correlation values equal to zero and unity, respectively. The work in this paper is a computer experiment, performed using the Optisystem 9.0 software program as a simulator, to determine the performance limitations of a wavelength hopping/time spreading (WH/TS) OCDMA system. The system parameters considered in this work are the optical fiber length (transmission distance), the bit rate, the chip spacing and the transmitted power. This paper shows for which parameter values the system sustains sufficient performance (BER ≤ 10^-9, Q ≥ 6).
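The quoted performance threshold follows from the standard Gaussian relation between the Q factor and the BER; a quick check (standard formula, not tied to this paper's simulator):

```python
import math

def q_to_ber(q: float) -> float:
    """Gaussian-noise relation between Q factor and bit-error rate:
    BER = 0.5 * erfc(Q / sqrt(2)). Q = 6 corresponds to BER on the
    order of 1e-9, matching the (BER <= 1e-9, Q >= 6) criterion."""
    return 0.5 * math.erfc(q / math.sqrt(2))
```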
Heinemann, Allen W; Miskovic, Ana; Semik, Patrick; Wong, Alex; Dashner, Jessica; Baum, Carolyn; Magasi, Susan; Hammel, Joy; Tulsky, David S; Garcia, Sofia F; Jerousek, Sara; Lai, Jin-Shei; Carlozzi, Noelle E; Gray, David B
2016-12-01
To describe the unique and overlapping content of the newly developed Environmental Factors Item Banks (EFIB) and 7 legacy environmental factor instruments, and to evaluate the EFIB's construct validity by examining associations with legacy instruments. Cross-sectional, observational cohort. Community. A sample of community-dwelling adults with stroke, spinal cord injury, and traumatic brain injury (N=568). None. EFIB covering domains of the built and natural environment; systems, services, and policies; social environment; and access to information and technology; the Craig Hospital Inventory of Environmental Factors (CHIEF) short form; the Facilitators and Barriers Survey/Mobility (FABS/M) short form; the Home and Community Environment Instrument (HACE); the Measure of the Quality of the Environment (MQE) short form; and 3 of the Patient Reported Outcomes Measurement Information System's (PROMIS) Quality of Social Support measures. The EFIB and legacy instruments assess most of the International Classification of Functioning, Disability and Health (ICF) environmental factors chapters, including chapter 1 (products and technology; 75 items corresponding to 11 codes), chapter 2 (natural environment and human-made changes; 31 items corresponding to 7 codes), chapter 3 (support and relationships; 74 items corresponding to 7 codes), chapter 4 (attitudes; 83 items corresponding to 8 codes), and chapter 5 (services, systems, and policies; 72 items corresponding to 16 codes). Construct validity is provided by moderate correlations between EFIB measures and the CHIEF, MQE barriers, HACE technology mobility, FABS/M community built features, and PROMIS item banks and by small correlations with other legacy instruments. Only 5 of the 66 legacy instrument correlation coefficients are moderate, suggesting they measure unique aspects of the environment, whereas all intra-EFIB correlations were at least moderate. 
The EFIB measures provide a brief and focused assessment of ICF environmental factor chapters. The pattern of correlations with legacy instruments provides initial evidence of construct validity. Copyright © 2016 American Congress of Rehabilitation Medicine. Published by Elsevier Inc. All rights reserved.
Towers of generalized divisible quantum codes
NASA Astrophysics Data System (ADS)
Haah, Jeongwan
2018-04-01
A divisible binary classical code is one in which every code word has weight divisible by a fixed integer. If the divisor is 2^ν for a positive integer ν, then one can construct a Calderbank-Shor-Steane (CSS) code, where the X-stabilizer space is the divisible classical code, that admits a transversal gate in the ν-th level of the Clifford hierarchy. We consider a generalization of the divisibility by allowing a coefficient vector of odd integers with which every code word has zero dot product modulo the divisor. In this generalized sense, we construct a CSS code with divisor 2^{ν+1} and code distance d from any CSS code of code distance d and divisor 2^ν where the transversal X is a nontrivial logical operator. The encoding rate of the new code is approximately d times smaller than that of the old code. In particular, for large d and ν ≥ 2, our construction yields a CSS code of parameters [[O(d^{ν-1}), Ω(d), d]] admitting a transversal gate at the ν-th level of the Clifford hierarchy. For our construction we introduce a conversion from magic state distillation protocols based on Clifford measurements to those based on codes with transversal T gates. Our tower contains, as a subclass, generalized triply even CSS codes that have appeared in so-called gauge fixing or code switching methods.
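The generalized divisibility condition described above, restated in symbols (a restatement of the abstract's definition, nothing more):

```latex
% Fix a coefficient vector (c_1, \dots, c_n) of odd integers and a
% divisor 2^{\nu}. A binary code C is generalized divisible if
\[
  \sum_{i=1}^{n} c_i x_i \;\equiv\; 0 \pmod{2^{\nu}}
  \qquad \text{for all } x \in C .
\]
% Taking c_i = 1 for all i recovers ordinary weight divisibility,
% \operatorname{wt}(x) \equiv 0 \pmod{2^{\nu}}.
```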
Simulations of pattern dynamics for reaction-diffusion systems via SIMULINK.
Wang, Kaier; Steyn-Ross, Moira L; Steyn-Ross, D Alistair; Wilson, Marcus T; Sleigh, Jamie W; Shiraishi, Yoichi
2014-04-11
Investigation of the nonlinear pattern dynamics of a reaction-diffusion system almost always requires numerical solution of the system's set of defining differential equations. Traditionally, this would be done by selecting an appropriate differential equation solver from a library of such solvers, then writing computer codes (in a programming language such as C or Matlab) to access the selected solver and display the integrated results as a function of space and time. This "code-based" approach is flexible and powerful, but requires a certain level of programming sophistication. A modern alternative is to use a graphical programming interface such as Simulink to construct a data-flow diagram by assembling and linking appropriate code blocks drawn from a library. The result is a visual representation of the inter-relationships between the state variables whose output can be made completely equivalent to the code-based solution. As a tutorial introduction, we first demonstrate application of the Simulink data-flow technique to the classical van der Pol nonlinear oscillator, and compare Matlab and Simulink coding approaches to solving the van der Pol ordinary differential equations. We then show how to introduce space (in one and two dimensions) by solving numerically the partial differential equations for two different reaction-diffusion systems: the well-known Brusselator chemical reactor, and a continuum model for a two-dimensional sheet of human cortex whose neurons are linked by both chemical and electrical (diffusive) synapses. We compare the relative performances of the Matlab and Simulink implementations. The pattern simulations by Simulink are in good agreement with theoretical predictions. Compared with traditional coding approaches, the Simulink block-diagram paradigm reduces the time and programming burden required to implement a solution for reaction-diffusion systems of equations. 
Construction of the block-diagram does not require high-level programming skills, and the graphical interface lends itself to easy modification and use by non-experts.
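For comparison with the code-based approach discussed above, here is a dependency-free sketch of integrating the van der Pol oscillator with classical fixed-step RK4 (parameter and step-size choices are illustrative, not the tutorial's Matlab settings):

```python
def van_der_pol_rk4(mu=1.0, x0=0.5, y0=0.0, dt=0.01, steps=5000):
    """Integrate x'' - mu*(1 - x^2)*x' + x = 0 as the first-order
    system (x' = y, y' = mu*(1 - x^2)*y - x) with fixed-step RK4.
    Returns the trajectory of x."""
    def f(x, y):
        return y, mu * (1.0 - x * x) * y - x
    xs = [x0]
    x, y = x0, y0
    for _ in range(steps):
        k1x, k1y = f(x, y)
        k2x, k2y = f(x + dt / 2 * k1x, y + dt / 2 * k1y)
        k3x, k3y = f(x + dt / 2 * k2x, y + dt / 2 * k2y)
        k4x, k4y = f(x + dt * k3x, y + dt * k3y)
        x += dt / 6 * (k1x + 2 * k2x + 2 * k3x + k4x)
        y += dt / 6 * (k1y + 2 * k2y + 2 * k3y + k4y)
        xs.append(x)
    return xs
```

For mu = 1 the trajectory converges to the well-known limit cycle with amplitude close to 2, which is the behavior a Simulink block-diagram model of the same equations must reproduce to be considered equivalent.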
Codes and standards for structural wood products and their use in the United States
David W. Green; Roland Hernandez
1998-01-01
The system of model building codes and voluntary product standards used in the United States for structural lumber and engineered wood products can appear complicated and confusing to those introduced to it for the first time. This paper is a discussion of the various types of structural wood products commonly used in U.S. residential and commercial construction and...
24 CFR 200.925b - Residential and institutional building code comparison items.
Code of Federal Regulations, 2010 CFR
2010-04-01
...., materials, allowable stresses, design; (6) Excavation; (e) Materials standards. (f) Construction components...) Plumbing fixtures; (7) Water supply and distribution; (8) Storm drain systems. (j) Electrical. (1) Wiring...
24 CFR 200.925b - Residential and institutional building code comparison items.
Code of Federal Regulations, 2011 CFR
2011-04-01
...., materials, allowable stresses, design; (6) Excavation; (e) Materials standards. (f) Construction components...) Plumbing fixtures; (7) Water supply and distribution; (8) Storm drain systems. (j) Electrical. (1) Wiring...
NASA Technical Reports Server (NTRS)
Sang, Janche
2003-01-01
Within NASA's Aviation Safety Program, NASA GRC participates in the Modeling and Simulation Project called ASMM. NASA GRC's focus is to characterize propulsion system performance from a fleet management and maintenance perspective by modeling, and through simulation to predict the characteristics of two classes of commercial engines (CFM56 and GE90). In prior years, the High Performance Computing and Communication (HPCC) program funded NASA Glenn to develop large-scale, detailed simulations for the analysis and design of aircraft engines, called the Numerical Propulsion System Simulation (NPSS). Three major aspects of this modeling (the integration of different engine components, the coupling of multiple disciplines, and engine component zooming at the appropriate level of fidelity) require relatively tight coupling of different analysis codes. Most of these codes in aerodynamics and solid mechanics are written in Fortran. Refitting these legacy Fortran codes with distributed objects can increase their reusability. Aviation Safety's use of modeling and simulation in characterizing fleet management has similar needs. The modeling and simulation of these propulsion systems use existing Fortran and C codes that are instrumental in determining the performance of the fleet. The research centers on building a CORBA-based development environment for programmers to easily wrap and couple legacy Fortran codes. This environment consists of a C++ wrapper library to hide the details of CORBA and an efficient remote variable scheme to facilitate data exchange between the client and the server model. Additionally, a Web Service model should also be constructed for evaluation of this technology's use over the next two to three years.
Automated Construction of Node Software Using Attributes in a Ubiquitous Sensor Network Environment
Lee, Woojin; Kim, Juil; Kang, JangMook
2010-01-01
In sensor networks, nodes must often operate in a demanding environment facing restrictions such as restricted computing resources, unreliable wireless communication and power shortages. Such factors make the development of ubiquitous sensor network (USN) applications challenging. To help developers construct a large amount of node software for sensor network applications easily and rapidly, this paper proposes an approach to the automated construction of node software for USN applications using attributes. In the proposed technique, application construction proceeds by first developing a model for the sensor network and then designing node software by setting the values of the predefined attributes. After that, the sensor network model and the design of node software are verified. The final source codes of the node software are automatically generated from the sensor network model. We illustrate the efficiency of the proposed technique by using a gas/light monitoring application through a case study of a Gas and Light Monitoring System based on the Nano-Qplus operating system. We evaluate the technique using a quantitative metric: the memory size of execution code for node software. Using the proposed approach, developers are able to easily construct sensor network applications and rapidly generate a large amount of node software at a time in a ubiquitous sensor network environment. PMID:22163678
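The attribute-driven generation step described above can be sketched as template substitution: the developer sets attribute values, and node source code is rendered from a template. Everything below (the attribute names, the C-like template, the helper names) is invented for illustration and is not the paper's actual Nano-Qplus tool chain.

```python
# Hypothetical sketch of attribute-driven node-software generation:
# design attributes are filled into a source-code template.
from string import Template

NODE_TEMPLATE = Template("""\
void sense_task(void) {
    int value = read_sensor($sensor_id);
    if (value > $threshold)
        send_alarm($sink_address, value);
    sleep_ms($period_ms);
}
""")

def generate_node_source(attributes: dict) -> str:
    """Render node source code from a dict of design attributes."""
    return NODE_TEMPLATE.substitute(attributes)

# Example: a gas-monitoring node described purely by attribute values.
gas_node = generate_node_source({
    "sensor_id": 3, "threshold": 250,
    "sink_address": "0x01", "period_ms": 1000,
})
print(gas_node)
```

The same attribute dictionary could drive verification (range checks on thresholds and periods) before any code is emitted, mirroring the model-then-verify-then-generate sequence in the abstract.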
Box codes of lengths 48 and 72
NASA Technical Reports Server (NTRS)
Solomon, G.; Jin, Y.
1993-01-01
A self-dual code of length 48 and dimension 24, with Hamming distance essentially equal to 12, is constructed here. There are only six code words of weight eight; all the other code words have weights that are multiples of four, with minimum weight 12. This code may be encoded systematically and arises from a strict binary representation of the (8,4;5) Reed-Solomon (RS) code over GF(64). The code may be considered as six interrelated (8,7;2) codes. The Mattson-Solomon representation of the cyclic decomposition of these codes and their parity sums are used to detect an odd number of errors in any of the six codes. These may then be used in a correction algorithm for hard or soft decision decoding. A (72,36;15) box code was constructed from a (63,35;8) cyclic code; the theoretical justification is presented herein. A second (72,36;15) code is constructed from an inner (63,27;16) Bose-Chaudhuri-Hocquenghem (BCH) code and expanded to length 72 using box code algorithms for extension. This code was simulated and verified to have a minimum distance of 15, with even-weight words congruent to zero modulo four. The decoding for hard and soft decision is still more complex than for the first code constructed above. Finally, an (8,4;5) RS code over GF(512) in the binary representation of the (72,36;15) box code gives rise to a (72,36;16*) code with nine words of weight eight; all the rest have weights greater than or equal to 16.
Heating, Ventilation, and Air Conditioning Design Strategy for a Hot-Humid Production Builder
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kerrigan, P.
2014-03-01
BSC worked directly with the David Weekley Homes - Houston division to redesign three floor plans in order to locate the HVAC system in conditioned space. The purpose of this project is to develop a cost effective design for moving the HVAC system into conditioned space. In addition, BSC conducted energy analysis to calculate the most economical strategy for increasing the energy performance of future production houses. This is in preparation for the upcoming code changes in 2015. The builder wishes to develop an upgrade package that will allow for a seamless transition to the new code mandate. The following research questions were addressed by this research project: 1. What is the most cost effective, best performing and most easily replicable method of locating ducts inside conditioned space for a hot-humid production home builder that constructs one and two story single family detached residences? 2. What is a cost effective and practical method of achieving 50% source energy savings vs. the 2006 International Energy Conservation Code for a hot-humid production builder? 3. How accurate are the pre-construction whole house cost estimates compared to confirmed post construction actual cost? BSC and the builder developed a duct design strategy that employs a system of dropped ceilings and attic coffers for moving the ductwork from the vented attic to conditioned space. The furnace has been moved to either a mechanical closet in the conditioned living space or a coffered space in the attic.
HVAC Design Strategy for a Hot-Humid Production Builder, Houston, Texas (Fact Sheet)
DOE Office of Scientific and Technical Information (OSTI.GOV)
None, None
BSC worked directly with the David Weekley Homes - Houston division to redesign three floor plans in order to locate the HVAC system in conditioned space. The purpose of this project is to develop a cost effective design for moving the HVAC system into conditioned space. In addition, BSC conducted energy analysis to calculate the most economical strategy for increasing the energy performance of future production houses. This is in preparation for the upcoming code changes in 2015. The builder wishes to develop an upgrade package that will allow for a seamless transition to the new code mandate. The following research questions were addressed by this research project: 1. What is the most cost effective, best performing and most easily replicable method of locating ducts inside conditioned space for a hot-humid production home builder that constructs one and two story single family detached residences? 2. What is a cost effective and practical method of achieving 50% source energy savings vs. the 2006 International Energy Conservation Code for a hot-humid production builder? 3. How accurate are the pre-construction whole house cost estimates compared to confirmed post construction actual cost? BSC and the builder developed a duct design strategy that employs a system of dropped ceilings and attic coffers for moving the ductwork from the vented attic to conditioned space. The furnace has been moved to either a mechanical closet in the conditioned living space or a coffered space in the attic.
Performance Analysis of New Binary User Codes for DS-CDMA Communication
NASA Astrophysics Data System (ADS)
Usha, Kamle; Jaya Sankar, Kottareddygari
2016-03-01
This paper analyzes new binary spreading codes through their correlation properties and also presents their performance over an additive white Gaussian noise (AWGN) channel. The proposed codes are constructed using Gray and inverse Gray codes: an n-bit Gray code is appended with its n-bit inverse Gray code to construct a 2n-length binary user code. Like Walsh codes, these binary user codes are available in sizes that are powers of two; additionally, code sets of length 6 and its even multiples are available. The simple construction technique and the generation of code sets of different sizes are the salient features of the proposed codes. Walsh codes and Gold codes are considered for comparison in this paper, as these are popularly used for synchronous and asynchronous multi-user communications, respectively. In the current work the auto- and cross-correlation properties of the proposed codes are compared with those of Walsh codes and Gold codes. Performance of the proposed binary user codes for both synchronous and asynchronous direct sequence CDMA communication over the AWGN channel is also discussed. The proposed binary user codes are found to be suitable for both synchronous and asynchronous DS-CDMA communication.
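The append-Gray-with-inverse-Gray construction can be sketched in a few lines. One assumption is made here: "inverse Gray code" is taken as the bitwise complement of the Gray codeword, which may differ from the authors' exact definition.

```python
# Sketch of the Gray / inverse-Gray user-code construction.
# Assumption: the "inverse Gray code" is modeled as the bitwise
# complement of the n-bit Gray codeword.

def gray(i: int, n: int) -> list:
    """n-bit binary-reflected Gray code of index i, as a bit list."""
    g = i ^ (i >> 1)
    return [(g >> (n - 1 - k)) & 1 for k in range(n)]

def user_code(i: int, n: int) -> list:
    """2n-chip user code: n-bit Gray code followed by its complement."""
    g = gray(i, n)
    return g + [1 - b for b in g]

def correlation(a, b) -> int:
    """Zero-lag correlation of two codes in antipodal (+1/-1) form."""
    to_pm = lambda bit: 1 - 2 * bit
    return sum(to_pm(x) * to_pm(y) for x, y in zip(a, b))

n = 3
codes = [user_code(i, n) for i in range(2 ** n)]
# Each codeword is balanced (n ones out of 2n chips), so its correlation
# with a constant carrier is zero; self-correlation equals the length 2n.
for c in codes:
    assert sum(c) == n
    assert correlation(c, c) == 2 * n
```

Under this assumption the code set size is 2^n for n-bit Gray codes, matching the powers-of-two availability noted in the abstract.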
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kamberg, L.D.
1998-02-23
This document serves as a notice of construction (NOC), pursuant to the requirements of Washington Administrative Code (WAC) 246-247-060, and as a request for approval to construct, pursuant to 40 Code of Federal Regulations (CFR) 61.07, for the Integrated Water Treatment System (IWTS) Filter Vessel Sparging Vent at 105-KW Basin. Additionally, the following description, and references are provided as the notices of startup, pursuant to 40 CFR 61.09(a)(1) and (2) in accordance with Title 40 Code of Federal Regulations, Part 61, National Emission Standards for Hazardous Air Pollutants. The 105-K West Reactor and its associated spent nuclear fuel (SNF) storage basin were constructed in the early 1950s and are located on the Hanford Site in the 100-K Area about 1,400 feet from the Columbia River. The 105-KW Basin contains 964 Metric Tons of SNF stored under water in approximately 3,800 closed canisters. This SNF has been stored for varying periods of time ranging from 8 to 17 years. The 105-KW Basin is constructed of concrete with an epoxy coating and contains approximately 1.3 million gallons of water with an asphaltic membrane beneath the pool. The IWTS, which has been described in the Radioactive Air Emissions NOC for Fuel Removal for 105-KW Basin (DOE/RL-97-28 and page changes per US Department of Energy, Richland Operations Office letter 97-EAP-814) will be used to remove radionuclides from the basin water during fuel removal operations. The purpose of the modification described herein is to provide operational flexibility for the IWTS at the 105-KW basin. The proposed modification is scheduled to begin in calendar year 1998.
Entangled cloning of stabilizer codes and free fermions
NASA Astrophysics Data System (ADS)
Hsieh, Timothy H.
2016-10-01
Though the no-cloning theorem [Wootters and Zurek, Nature (London) 299, 802 (1982), 10.1038/299802a0] prohibits exact replication of arbitrary quantum states, there are many instances in quantum information processing and entanglement measurement in which a weaker form of cloning may be useful. Here, I provide a construction for generating an "entangled clone" for a particular but rather expansive and rich class of states. Given a stabilizer code or free fermion Hamiltonian, this construction generates an exact entangled clone of the original ground state, in the sense that the entanglement between the original and the exact copy can be tuned to be arbitrarily small but finite, or large, and the relation between the original and the copy can also be modified to some extent. For example, this Rapid Communication focuses on generating time-reversed copies of stabilizer codes and particle-hole transformed ground states of free fermion systems, although untransformed clones can also be generated. The protocol leverages entanglement to simulate a transformed copy of the Hamiltonian without having to physically implement it and can potentially be realized in superconducting qubits or ultracold atomic systems.
Error control techniques for satellite and space communications
NASA Technical Reports Server (NTRS)
Costello, Daniel J., Jr.
1994-01-01
Brief summaries of research in the following areas are presented: (1) construction of optimum geometrically uniform trellis codes; (2) a statistical approach to constructing convolutional code generators; and (3) calculating the exact performance of a convolutional code.
Error control techniques for satellite and space communications
NASA Technical Reports Server (NTRS)
Costello, Daniel J., Jr.
1989-01-01
The performance of bandwidth efficient trellis codes on channels with phase jitter, or on channels disturbed by jamming and impulse noise, is analyzed. A heuristic algorithm for the construction of bandwidth efficient trellis codes with any constraint length up to about 30, any signal constellation, and any code rate was developed. Construction of good distance profile trellis codes for sequential decoding and a comparison of random coding bounds of trellis coded modulation schemes are also discussed.
Assessment of Spacecraft Systems Integration Using the Electric Propulsion Interactions Code (EPIC)
NASA Technical Reports Server (NTRS)
Mikellides, Ioannis G.; Kuharski, Robert A.; Mandell, Myron J.; Gardner, Barbara M.; Kauffman, William J. (Technical Monitor)
2002-01-01
SAIC is currently developing the Electric Propulsion Interactions Code 'EPIC', an interactive computer tool that allows the construction of a 3-D spacecraft model, and the assessment of interactions between its subsystems and the plume from an electric thruster. EPIC unites different computer tools to address the complexity associated with the interaction processes. This paper describes the overall architecture and capability of EPIC including the physics and algorithms that comprise its various components. Results from selected modeling efforts of different spacecraft-thruster systems are also presented.
The effect of multiple internal representations on context-rich instruction
NASA Astrophysics Data System (ADS)
Lasry, Nathaniel; Aulls, Mark W.
2007-11-01
We discuss n-coding, a theoretical model of multiple internal mental representations. The n-coding construct is developed from a review of cognitive and imaging data that demonstrates the independence of information processed along different modalities such as verbal, visual, kinesthetic, logico-mathematic, and social modalities. A study testing the effectiveness of the n-coding construct in classrooms is presented. Four sections differing in the level of n-coding opportunities were compared. Besides a traditional-instruction section used as a control group, each of the remaining three sections was given context-rich problems, which differed in the level of n-coding opportunities designed into their laboratory environment. To measure the effectiveness of the construct, problem-solving skills were assessed as conceptual learning using the force concept inventory. We also developed several new measures that take students' confidence in concepts into account. Our results show that the n-coding construct is useful in designing context-rich environments and can be used to increase learning gains in problem solving, conceptual knowledge, and concept confidence. Specifically, when using props in designing context-rich problems, we find n-coding to be a useful construct in guiding which additional dimensions need to be attended to.
Simulations of pattern dynamics for reaction-diffusion systems via SIMULINK
2014-01-01
Background Investigation of the nonlinear pattern dynamics of a reaction-diffusion system almost always requires numerical solution of the system’s set of defining differential equations. Traditionally, this would be done by selecting an appropriate differential equation solver from a library of such solvers, then writing computer codes (in a programming language such as C or Matlab) to access the selected solver and display the integrated results as a function of space and time. This “code-based” approach is flexible and powerful, but requires a certain level of programming sophistication. A modern alternative is to use a graphical programming interface such as Simulink to construct a data-flow diagram by assembling and linking appropriate code blocks drawn from a library. The result is a visual representation of the inter-relationships between the state variables whose output can be made completely equivalent to the code-based solution. Results As a tutorial introduction, we first demonstrate application of the Simulink data-flow technique to the classical van der Pol nonlinear oscillator, and compare Matlab and Simulink coding approaches to solving the van der Pol ordinary differential equations. We then show how to introduce space (in one and two dimensions) by solving numerically the partial differential equations for two different reaction-diffusion systems: the well-known Brusselator chemical reactor, and a continuum model for a two-dimensional sheet of human cortex whose neurons are linked by both chemical and electrical (diffusive) synapses. We compare the relative performances of the Matlab and Simulink implementations. Conclusions The pattern simulations by Simulink are in good agreement with theoretical predictions. Compared with traditional coding approaches, the Simulink block-diagram paradigm reduces the time and programming burden required to implement a solution for reaction-diffusion systems of equations. 
Construction of the block-diagram does not require high-level programming skills, and the graphical interface lends itself to easy modification and use by non-experts. PMID:24725437
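As a standard-library analogue of the "code-based" approach discussed above, the van der Pol equation x'' - mu(1 - x^2)x' + x = 0 can be integrated with a hand-written classical Runge-Kutta step. This is a sketch in plain Python, not the paper's Matlab or Simulink implementation.

```python
# Minimal code-based solution of the van der Pol oscillator using
# classical fourth-order Runge-Kutta (RK4), stdlib only.

def vdp(state, mu=1.0):
    """Right-hand side of the van der Pol system: x' = v, v' = mu(1-x^2)v - x."""
    x, v = state
    return (v, mu * (1.0 - x * x) * v - x)

def rk4_step(f, state, dt):
    """One classical RK4 step for a tuple-valued ODE system."""
    k1 = f(state)
    k2 = f(tuple(s + 0.5 * dt * k for s, k in zip(state, k1)))
    k3 = f(tuple(s + 0.5 * dt * k for s, k in zip(state, k2)))
    k4 = f(tuple(s + dt * k for s, k in zip(state, k3)))
    return tuple(s + dt / 6.0 * (a + 2 * b + 2 * c + d)
                 for s, a, b, c, d in zip(state, k1, k2, k3, k4))

state, dt = (0.5, 0.0), 0.01
trajectory = [state]
for _ in range(10000):          # integrate to t = 100
    state = rk4_step(vdp, state, dt)
    trajectory.append(state)

# For mu = 1 the trajectory settles onto a limit cycle of amplitude ~2.
amplitude = max(abs(x) for x, _ in trajectory[5000:])
```

A Simulink or Matlab `ode45` solution of the same system should agree with this sketch to within integration tolerance; the block-diagram approach trades this explicit stepping loop for wired-up integrator blocks.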
1987-08-01
Copies are available from the National Technical Information Service, Springfield, VA 22161. The first exploratory research step was to determine the breadth and depth of the construction schedule analysis domain. This step defined... Additional information regarding this research: O'Connor, Michael J., Jesus M. De La Garza, and C. William Ibbs, "An Expert System for Construction...
New nonbinary quantum codes with larger distance constructed from BCH codes over 𝔽q2
NASA Astrophysics Data System (ADS)
Xu, Gen; Li, Ruihu; Fu, Qiang; Ma, Yuena; Guo, Luobin
2017-03-01
This paper concentrates on the construction of new nonbinary quantum error-correcting codes (QECCs) from three classes of narrow-sense imprimitive BCH codes over the finite field 𝔽q2 (q ≥ 3 an odd prime power). By a careful analysis of the properties of cyclotomic cosets in the defining set T of these BCH codes, the improved maximal designed distance of these narrow-sense imprimitive Hermitian dual-containing BCH codes is determined to be much larger, for each code length, than the result of Aly et al. [S. A. Aly, A. Klappenecker and P. K. Sarvepalli, IEEE Trans. Inf. Theory 53, 1183 (2007)]. Thus families of new nonbinary QECCs are constructed, and the newly obtained QECCs have larger distance than those in the previous literature.
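The cyclotomic cosets underlying the defining-set analysis above are easy to enumerate: the q^2-cyclotomic coset of s modulo n is {s, s q^2, s q^4, ...} (mod n). The parameters below (q = 3, n = 40) are illustrative choices for brevity, not the paper's code lengths.

```python
# Enumerate q^2-cyclotomic cosets modulo n (stdlib only).

def cyclotomic_coset(s: int, n: int, q2: int) -> frozenset:
    """The q^2-cyclotomic coset of s modulo n."""
    coset, x = set(), s % n
    while x not in coset:
        coset.add(x)
        x = (x * q2) % n
    return frozenset(coset)

def all_cosets(n: int, q2: int):
    """Partition of {0, ..., n-1} into q^2-cyclotomic cosets."""
    seen, cosets = set(), []
    for s in range(n):
        if s not in seen:
            c = cyclotomic_coset(s, n, q2)
            cosets.append(sorted(c))
            seen |= c
    return cosets

q, n = 3, 40
cosets = all_cosets(n, q * q)   # cosets under multiplication by q^2 = 9
```

A defining set T for a BCH code is a union of such cosets; checking which cosets a candidate T touches is exactly the kind of bookkeeping the paper's designed-distance analysis rests on.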
Noussa-Yao, Joseph; Heudes, Didier; Escudie, Jean-Baptiste; Degoulet, Patrice
2016-01-01
Short-stay MSO (Medicine, Surgery, Obstetrics) hospitalization activities in public hospitals and in private hospitals providing public services are funded through charges for the services provided (T2A in French). Coding must be well matched to the severity of the patient's condition to ensure that appropriate funding is provided to the hospital. We propose the use of an autocompletion process and a multidimensional matrix to help physicians improve the expression of information and optimize clinical coding. With this approach, physicians without knowledge of the encoding rules begin from a rough concept, which is gradually refined through semantic proximity and uses information on the associated codes stemming from optimized knowledge bases of diagnosis codes.
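The autocompletion step can be sketched as a prefix search over a sorted list of diagnosis labels. The entries below are a toy sample (with their real ICD-10 codes); an actual system would query an optimized knowledge base rather than a hard-coded list.

```python
# Minimal prefix autocompletion over sorted diagnosis labels (stdlib only).
import bisect

CODES = sorted([
    ("diabetes insipidus", "E23.2"),
    ("diabetes mellitus type 1", "E10"),
    ("diabetes mellitus type 2", "E11"),
    ("hypertension, essential", "I10"),
])
LABELS = [label for label, _ in CODES]

def complete(prefix: str):
    """Return (label, code) pairs whose label starts with the prefix."""
    lo = bisect.bisect_left(LABELS, prefix)
    hi = bisect.bisect_right(LABELS, prefix + "\uffff")
    return CODES[lo:hi]

matches = complete("diabetes m")
```

The paper's refinement by semantic proximity would go further than plain prefix matching, but the interaction model is the same: a rough concept typed by the physician narrows to candidate codes.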
1991-12-01
database, the Real Time Operation Management Information System (ROMIS), and the Fitting Out Management Information System (FOMIS). These three configuration... Acronyms: ROMIS, Real Time Operation Management Information System; SCLSIS, Ship's Configuration and Logistics Information System; SCN, Shipbuilding and...
High-efficiency Gaussian key reconciliation in continuous variable quantum key distribution
NASA Astrophysics Data System (ADS)
Bai, ZengLiang; Wang, XuYang; Yang, ShenShen; Li, YongMin
2016-01-01
Efficient reconciliation is a crucial step in continuous variable quantum key distribution. The progressive-edge-growth (PEG) algorithm is an efficient method to construct relatively short block length low-density parity-check (LDPC) codes. The quasi-cyclic construction method can extend short block length codes and further eliminate the shortest cycle. In this paper, by combining the PEG algorithm and the quasi-cyclic construction method, we design long block length irregular LDPC codes with high error-correcting capacity. Based on these LDPC codes, we achieve high-efficiency Gaussian key reconciliation with slice reconciliation based on multilevel coding/multistage decoding, with an efficiency of 93.7%.
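"Eliminating the shortest cycle" for an LDPC code means ensuring girth greater than 4, which holds exactly when no two rows of the parity-check matrix H overlap in more than one column. The sketch below builds H from circulant permutation blocks, as quasi-cyclic constructions do, and checks that condition; the shift matrix is a toy example, not the PEG output of the paper.

```python
# Girth-4 check for a quasi-cyclic parity-check matrix (stdlib only).

def circulant_block_H(shifts, p):
    """Expand a shift matrix into binary H made of p x p circulant
    permutation blocks; a shift of -1 denotes an all-zero block."""
    rows, cols = len(shifts) * p, len(shifts[0]) * p
    H = [[0] * cols for _ in range(rows)]
    for bi, row in enumerate(shifts):
        for bj, s in enumerate(row):
            if s < 0:
                continue
            for r in range(p):
                H[bi * p + r][bj * p + (r + s) % p] = 1
    return H

def has_4_cycle(H):
    """True if some pair of rows shares two or more columns."""
    from itertools import combinations
    supports = [set(j for j, v in enumerate(row) if v) for row in H]
    return any(len(a & b) > 1 for a, b in combinations(supports, 2))

# Toy shift matrix: distinct shift differences per column avoid 4-cycles.
shifts = [[0, 0, 0],
          [0, 1, 2]]
H = circulant_block_H(shifts, p=5)
```

In this circulant setting a 4-cycle appears precisely when two columns of the shift matrix have equal shift differences modulo p, which is what PEG-style and algebraic constructions are designed to rule out.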
NASA Technical Reports Server (NTRS)
Benedetto, S.; Divsalar, D.; Montorsi, G.; Pollara, F.
1998-01-01
Soft-input soft-output building blocks (modules) are presented to construct code networks and to decode them iteratively in a distributed fashion. Code networks are a new concept that includes, and generalizes, various forms of concatenated coding schemes.
Cobweb/3: A portable implementation
NASA Technical Reports Server (NTRS)
Mckusick, Kathleen; Thompson, Kevin
1990-01-01
An algorithm is examined for data clustering and incremental concept formation. An overview is given of the Cobweb/3 system and the algorithm on which it is based, as well as the practical details of obtaining and running the system code. The implementation features a flexible user interface which includes a graphical display of the concept hierarchies that the system constructs.
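Cobweb-style systems decide where an instance belongs in the concept hierarchy by maximizing category utility. Below is a minimal stdlib version of that measure for nominal attributes; the data is a toy example invented for illustration, not Cobweb/3's actual implementation.

```python
# Category utility for a partition of instances with nominal attributes:
# CU = (1/K) * sum_k P(C_k) * sum_{i,j} [P(A_i=V_ij|C_k)^2 - P(A_i=V_ij)^2]
from collections import Counter

def category_utility(partition):
    """partition: list of clusters; each cluster is a list of
    attribute-value tuples (one tuple per instance)."""
    instances = [inst for cluster in partition for inst in cluster]
    n, n_attrs = len(instances), len(instances[0])
    base = sum((c / n) ** 2
               for a in range(n_attrs)
               for c in Counter(inst[a] for inst in instances).values())
    cu = 0.0
    for cluster in partition:
        p_k = len(cluster) / n
        within = sum((c / len(cluster)) ** 2
                     for a in range(n_attrs)
                     for c in Counter(inst[a] for inst in cluster).values())
        cu += p_k * (within - base)
    return cu / len(partition)

# Grouping identical instances together scores higher than mixing them.
good = [[("red", "round")] * 2, [("blue", "square")] * 2]
mixed = [[("red", "round"), ("blue", "square")]] * 2
assert category_utility(good) > category_utility(mixed)
```

Incremental concept formation then amounts to trying each placement of a new instance (add to a child, create a new child, merge, split) and keeping whichever maximizes this score.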
Heyman, Richard E.
2006-01-01
The purpose of this review is to provide a balanced examination of the published research involving the observation of couples, with special attention toward the use of observation for clinical assessment. All published articles that (a) used an observational coding system and (b) relate to the validity of the coding system are summarized in a table. The psychometric properties of observational systems and the use of observation in clinical practice are discussed. Although advances have been made in understanding couple conflict through the use of observation, the review concludes with an appeal to the field to develop constructs in a psychometrically and theoretically sound manner. PMID:11281039
Heating, Ventilation, and Air Conditioning Design Strategy for a Hot-Humid Production Builder
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kerrigan, P.
2014-03-01
Building Science Corporation (BSC) worked directly with the David Weekley Homes - Houston division to develop a cost-effective design for moving the HVAC system into conditioned space. In addition, BSC conducted energy analysis to calculate the most economical strategy for increasing the energy performance of future production houses in preparation for the upcoming code changes in 2015. This research project addressed the following questions: 1. What is the most cost effective, best performing and most easily replicable method of locating ducts inside conditioned space for a hot-humid production home builder that constructs one and two story single family detached residences? 2. What is a cost effective and practical method of achieving 50% source energy savings vs. the 2006 International Energy Conservation Code for a hot-humid production builder? 3. How accurate are the pre-construction whole house cost estimates compared to confirmed post construction actual cost?
Entanglement-assisted quantum quasicyclic low-density parity-check codes
NASA Astrophysics Data System (ADS)
Hsieh, Min-Hsiu; Brun, Todd A.; Devetak, Igor
2009-03-01
We investigate the construction of quantum low-density parity-check (LDPC) codes from classical quasicyclic (QC) LDPC codes with girth greater than or equal to 6. We have shown that the classical codes in the generalized Calderbank-Shor-Steane construction do not need to satisfy the dual-containing property as long as preshared entanglement is available to both sender and receiver. We can use this to avoid the many four cycles which typically arise in dual-containing LDPC codes. The advantage of such quantum codes comes from the use of efficient decoding algorithms such as the sum-product algorithm (SPA). It is well known that in the SPA, cycles of length 4 make successive decoding iterations highly correlated and hence limit the decoding performance. We show the principle of constructing quantum QC-LDPC codes which require only small amounts of initial shared entanglement.
Dynamic fisheye grids for binary black hole simulations
NASA Astrophysics Data System (ADS)
Zilhão, Miguel; Noble, Scott C.
2014-03-01
We present a new warped gridding scheme adapted to simulating gas dynamics in binary black hole spacetimes. The grid concentrates grid points in the vicinity of each black hole to resolve the smaller scale structures there, and rarefies grid points away from each black hole to keep the overall problem size at a practical level. In this respect, our system can be thought of as a ‘double’ version of the fisheye coordinate system, used before in numerical relativity codes for evolving binary black holes. The gridding scheme is constructed as a mapping between a uniform coordinate system—in which the equations of motion are solved—to the distorted system representing the spatial locations of our grid points. Since we are motivated to eventually use this system for circumbinary disc calculations, we demonstrate how the distorted system can be constructed to asymptote to the typical spherical polar coordinate system, amenable to efficiently simulating orbiting gas flows about central objects with little numerical diffusion. We discuss its implementation in the Harm3d code, tailored to evolve the magnetohydrodynamics equations in curved spacetimes. We evaluate the performance of the system’s implementation in Harm3d with a series of tests, such as the advected magnetic field loop test, magnetized Bondi accretion, and evolutions of hydrodynamic discs about a single black hole and about a binary black hole. As we have done with Harm3d, this gridding scheme can be implemented in other unigrid codes as a (possibly) simpler alternative to adaptive mesh refinement.
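The core of any fisheye-style scheme is a smooth, monotone map from a uniform computational coordinate to a physical coordinate that clusters points near a region of interest. The single-centre map below is an invented illustrative analogue of the 'double' fisheye of the paper; the functional form and parameters a, w are assumptions of this sketch, not Harm3d's actual mapping.

```python
# Illustrative single-centre fisheye map: dR/dr = a (< 1, fine resolution)
# at the centre and dR/dr -> 1 far away, so uniform grid points in r
# cluster near the centre in physical radius R.
import math

def fisheye(r: float, a: float = 0.2, w: float = 2.0) -> float:
    """Physical radius for uniform coordinate r (strictly monotone)."""
    return r - (1.0 - a) * w * math.tanh(r / w)

# A uniform grid in r yields physical points concentrated near r = 0.
uniform = [0.1 * i for i in range(100)]
physical = [fisheye(r) for r in uniform]

# Strict monotonicity guarantees the coordinate map is invertible,
# which is what lets the equations of motion be solved on the uniform grid.
assert all(p2 > p1 for p1, p2 in zip(physical, physical[1:]))
```

A 'double' version would superpose two such stretchings centred on the two black hole positions and feed the resulting Jacobian into the evolved equations.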
Bi-orthogonal Symbol Mapping and Detection in Optical CDMA Communication System
NASA Astrophysics Data System (ADS)
Liu, Maw-Yang
2017-12-01
In this paper, the bi-orthogonal symbol mapping and detection scheme is investigated in a time-spreading wavelength-hopping optical CDMA communication system. The carrier-hopping prime code is exploited as the signature sequence, whose out-of-phase autocorrelation is zero. Based on the orthogonality of the carrier-hopping prime code, an equal weight orthogonal signaling scheme can be constructed, and the proposed scheme using bi-orthogonal symbol mapping and detection can be developed. The transmitted binary data bits are mapped into corresponding bi-orthogonal symbols, where the orthogonal matrix code and its complement are utilized. In the receiver, the received bi-orthogonal data symbol is fed into the maximum likelihood decoder for detection. Under such symbol mapping and detection, the proposed scheme can greatly enlarge the Euclidean distance; hence, the system performance can be drastically improved.
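The orthogonal-matrix-plus-complement principle can be sketched with a Sylvester-type Hadamard matrix standing in for the carrier-hopping prime code matrix of the paper: the rows and their negations form a bi-orthogonal symbol set, and maximum-likelihood detection reduces to picking the row with the largest-magnitude correlation.

```python
# Bi-orthogonal mapping and ML detection sketch (stdlib only).
# A 4 x 4 Hadamard matrix is used here as the orthogonal code matrix;
# the paper's carrier-hopping prime codes are not modeled.

def hadamard(n):
    """Sylvester construction: n x n +/-1 Hadamard matrix, n a power of 2."""
    H = [[1]]
    while len(H) < n:
        H = [row + row for row in H] + [row + [-x for x in row] for row in H]
    return H

H4 = hadamard(4)
# 8 bi-orthogonal symbols (4 rows + their negations) carry 3 data bits.
SYMBOLS = H4 + [[-x for x in row] for row in H4]

def encode(message: int):
    """Map a 3-bit message (0..7) to its bi-orthogonal symbol."""
    return SYMBOLS[message]

def detect(received):
    """ML detection: correlate against the 4 orthogonal rows; the sign
    of the best correlation distinguishes a row from its complement."""
    corr = [sum(r * h for r, h in zip(received, row)) for row in H4]
    best = max(range(4), key=lambda i: abs(corr[i]))
    return best if corr[best] > 0 else best + 4

for m in range(8):
    assert detect(encode(m)) == m
```

The enlarged Euclidean distance comes from the complement pairs: antipodal symbols are at twice the distance of merely orthogonal ones.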
NASA Astrophysics Data System (ADS)
Couvreur, A.
2009-05-01
The theory of algebraic-geometric codes was developed in the early 1980s, following a paper of V.D. Goppa. Given a smooth projective algebraic curve X over a finite field, there are two different constructions of error-correcting codes. The first one, called "functional", uses some rational functions on X, and the second one, called "differential", involves some rational 1-forms on this curve. Hundreds of papers are devoted to the study of such codes. In addition, a generalization of the functional construction for algebraic varieties of arbitrary dimension was given by Y. Manin in an article of 1984. A few papers about such codes have been published, but nothing has been done concerning a generalization of the differential construction to the higher-dimensional case. In this thesis, we propose a differential construction of codes on algebraic surfaces. Afterwards, we study the properties of these codes and particularly their relations with functional codes. A rather surprising fact is that a major difference with the case of curves appears. Indeed, while in the case of curves a differential code is always the orthogonal of a functional one, this assertion generally fails for surfaces. This last observation motivates the study of codes which are the orthogonal of some functional code on a surface. We prove that, under some condition on the surface, these codes can be realized as sums of differential codes. Moreover, we show that some answers to some open problems "a la Bertini" could give very interesting information about the parameters of these codes.
Transformation of two and three-dimensional regions by elliptic systems
NASA Technical Reports Server (NTRS)
Mastin, C. Wayne
1993-01-01
During this contract period, our work has focused on improvements to elliptic grid generation methods. There are two principal objectives in this project. One objective is to make the elliptic methods more reliable and efficient; the other is to construct a modular code that can be incorporated into the National Grid Project (NGP) or any other grid generation code. Progress has been made in meeting both of these objectives. The two objectives are actually complementary: as the code development for the NGP progresses, we see many areas where improvements in algorithms can be made.
On codes with multi-level error-correction capabilities
NASA Technical Reports Server (NTRS)
Lin, Shu
1987-01-01
In conventional coding for error control, all the information symbols of a message are regarded as equally significant, and hence codes are devised to provide equal protection for each information symbol against channel errors. However, on some occasions, some information symbols in a message are more significant than the others. As a result, it is desirable to devise codes with multi-level error-correcting capabilities. Another situation where codes with multi-level error-correcting capabilities are desired is in broadcast communication systems. An m-user broadcast channel has one input and m outputs. The single input and each output form a component channel. The component channels may have different noise levels, and hence the messages transmitted over the component channels require different levels of protection against errors. Block codes with multi-level error-correcting capabilities are also known as unequal error protection (UEP) codes. Structural properties of these codes are derived. Based on these structural properties, two classes of UEP codes are constructed.
How families cope with diabetes in adolescence. An approach and case analyses.
Hauser, S T; Paul, E L; Jacobson, A M; Weiss-Perry, B; Vieyra, M A; Rufo, P; Spetter, L D; DiPlacido, J; Wertlieb, D; Wolfsdorf, J
1988-01-01
In this paper we describe our newly constructed Family Coping Coding System. This scheme was constructed to identify family coping strategies that involve appraisal, problem solving, and emotion management dimensions. We discuss the theoretical rationale, meanings and reliability of the coping codes, and illustrate them through excerpts drawn from family discussions of a recent stressful situation (the onset of a chronic or acute illness in an adolescent member). Finally, we consider the clinical research relevance of this new assessment technique, exemplifying this potential with respect to medical compliance. We present analyses of two families with diabetic adolescents who strikingly differ with respect to compliance, and explore which family coping strategies may be predictive of an adolescent's favorable or problematic compliance to diabetes management.
ERIC Educational Resources Information Center
American Inst. of Architects, Washington, DC.
A model building code for fallout shelters was drawn up for inclusion in four national model building codes. Discussion is given of fallout shelters with respect to: (1) nuclear radiation, (2) national policies, and (3) community planning. Fallout shelter requirements for shielding, space, ventilation, construction, and services such as electrical…
Python-Assisted MODFLOW Application and Code Development
NASA Astrophysics Data System (ADS)
Langevin, C.
2013-12-01
The U.S. Geological Survey (USGS) has a long history of developing and maintaining free, open-source software for hydrological investigations. The MODFLOW program is one of the most popular hydrologic simulation programs released by the USGS, and it is considered to be the most widely used groundwater flow simulation code. MODFLOW was written using a modular design and a procedural FORTRAN style, which resulted in code that could be understood, modified, and enhanced by many hydrologists. The code is fast, and because it uses standard FORTRAN it can be run on most operating systems. Most MODFLOW users rely on proprietary graphical user interfaces for constructing models and viewing model results. Some recent efforts, however, have focused on construction of MODFLOW models using open-source Python scripts. Customizable Python packages, such as FloPy (https://code.google.com/p/flopy), can be used to generate input files, read simulation results, and visualize results in two and three dimensions. Automating this sequence of steps leads to models that can be reproduced directly from original data and rediscretized in space and time. Python is also being used in the development and testing of new MODFLOW functionality. New packages and numerical formulations can be quickly prototyped and tested first with Python programs before implementation in MODFLOW. This is made possible by the flexible object-oriented design capabilities available in Python, the ability to call FORTRAN code from Python, and the ease with which linear systems of equations can be solved using SciPy, for example. Once new features are added to MODFLOW, Python can then be used to automate comprehensive regression testing and ensure reliability and accuracy of new versions prior to release.
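The automated regression-testing workflow described in the last sentences can be sketched as follows. This is an illustrative stand-in, not USGS code: the plain-text "one head value per line" format and the file names are hypothetical simplifications (real MODFLOW output would be read with a package such as FloPy).

```python
# Minimal sketch of Python-driven regression testing of model output:
# compare a new model version's simulated heads against an archived
# baseline, within a tolerance. The file format is a hypothetical
# simplification of real MODFLOW output.

def read_heads(path):
    """Read one simulated head value per line from a plain-text file."""
    with open(path) as f:
        return [float(line) for line in f if line.strip()]

def regression_check(baseline, candidate, rtol=1e-6):
    """Return the indices where the two head arrays disagree."""
    if len(baseline) != len(candidate):
        raise ValueError("output lengths differ")
    bad = []
    for i, (b, c) in enumerate(zip(baseline, candidate)):
        if abs(b - c) > rtol * max(1.0, abs(b)):
            bad.append(i)
    return bad

baseline = [10.0, 9.5, 9.1]
candidate = [10.0, 9.5, 9.1000001]
assert regression_check(baseline, candidate, rtol=1e-6) == []   # passes
assert regression_check(baseline, [10.0, 9.6, 9.1]) == [1]      # flags cell 1
```

Running such a check over every test model before a release is the automation step the abstract alludes to.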
An address geocoding method for improving rural spatial information infrastructure
NASA Astrophysics Data System (ADS)
Pan, Yuchun; Chen, Baisong; Lu, Zhou; Li, Shuhua; Zhang, Jingbo; Zhou, YanBing
2010-11-01
The transition of rural and agricultural management from a divisional to an integrated mode has highlighted the importance of data integration and sharing. Current data are mostly collected by individual departments to satisfy their own needs, with little consideration of wider potential uses. This leads to great differences in data format, semantics and precision even within the same area, which is a significant barrier to constructing an integrated rural spatial information system to support integrated management and decision-making. Considering the rural cadastral management system and postal zones, this paper designs a rural address geocoding method based on rural cadastral parcels. It puts forward a geocoding standard which consists of an absolute position code, a relative position code and an extended code. It designs a rural geocoding database model and an address collection and update model. Then, based on the rural address geocoding model, it proposes a data model for rural agricultural resources management. The results show that address coding based on the postal code is stable and easy to memorize, that two-dimensional coding based on direction and distance is easy to locate and memorize, and that the extended code enhances the extensibility and flexibility of the address geocoding.
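The three-part code structure can be sketched concretely; note that the field widths, the separator, and the example values below are illustrative assumptions, not the standard defined in the paper.

```python
# Hypothetical sketch of the three-part rural address code: an absolute
# position code (here a postal code), a relative position code
# (direction + distance within a cadastral parcel), and an optional
# extended code. Field widths and layout are illustrative assumptions.

DIRECTIONS = ["N", "NE", "E", "SE", "S", "SW", "W", "NW"]

def make_geocode(postal, direction, distance_m, extended=""):
    """Compose a code like '100094-NE-0350-A1'."""
    if direction not in DIRECTIONS:
        raise ValueError("unknown direction")
    parts = [postal, direction, f"{distance_m:04d}"]
    if extended:
        parts.append(extended)
    return "-".join(parts)

def parse_geocode(code):
    """Split a composed code back into its named fields."""
    postal, direction, dist, *ext = code.split("-")
    return {"postal": postal, "direction": direction,
            "distance_m": int(dist), "extended": ext[0] if ext else ""}

code = make_geocode("100094", "NE", 350, "A1")
assert code == "100094-NE-0350-A1"
assert parse_geocode(code)["distance_m"] == 350
```

The absolute part stays stable as parcels change, while the direction/distance part gives a human-memorable relative location, mirroring the properties claimed in the abstract.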
Reliability and coverage analysis of non-repairable fault-tolerant memory systems
NASA Technical Reports Server (NTRS)
Cox, G. W.; Carroll, B. D.
1976-01-01
A method was developed for the construction of probabilistic state-space models for nonrepairable systems. Models were developed for several systems which achieved reliability improvement by means of error-coding, modularized sparing, massive replication and other fault-tolerant techniques. From the models developed, sets of reliability and coverage equations for the systems were developed. Comparative analyses of the systems were performed using these equation sets. In addition, the effects of varying subunit reliabilities on system reliability and coverage were described. The results of these analyses indicated that a significant gain in system reliability may be achieved by use of combinations of modularized sparing, error coding, and software error control. For sufficiently reliable system subunits, this gain may far exceed the reliability gain achieved by use of massive replication techniques, yet result in a considerable saving in system cost.
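The abstract's conclusion, that modest redundancy can beat massive replication for reliable subunits, can be checked with a minimal closed-form comparison. The formulas below are standard textbook expressions (independent failures, perfect voter/switch), not the probabilistic state-space models developed in the report.

```python
# Closed-form comparison in the spirit of the analysis above: for
# sufficiently reliable subunits, one spare module (two units in
# parallel) matches or beats triple modular replication (three units
# plus a voter) at lower cost. Assumes independent failures and a
# perfect voter/switch.

def r_parallel(r):
    """One active unit plus one spare: fails only if both fail."""
    return 1 - (1 - r) ** 2

def r_tmr(r):
    """Triple modular redundancy: at least 2 of 3 units must work."""
    return 3 * r**2 - 2 * r**3

r = 0.99
assert r_parallel(r) > r_tmr(r) > r   # spare beats TMR beats one unit
```

For r = 0.99 the spare configuration reaches about 0.9999 versus about 0.9997 for TMR, with one fewer unit, the kind of cost saving the abstract reports.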
Protograph based LDPC codes with minimum distance linearly growing with block size
NASA Technical Reports Server (NTRS)
Divsalar, Dariush; Jones, Christopher; Dolinar, Sam; Thorpe, Jeremy
2005-01-01
We propose several LDPC code constructions that simultaneously achieve good threshold and error floor performance. Minimum distance is shown to grow linearly with block size (similar to regular codes of variable degree at least 3) by considering ensemble average weight enumerators. Our constructions are based on projected graph, or protograph, structures that support high-speed decoder implementations. As with irregular ensembles, our constructions are sensitive to the proportion of degree-2 variable nodes. A code with too few such nodes tends to have an iterative decoding threshold that is far from the capacity threshold; a code with too many tends not to exhibit a minimum distance that grows linearly in block length. In this paper we also show that precoding can be used to lower the threshold of regular LDPC codes. The decoding thresholds of the proposed codes, which have minimum distance increasing linearly in block size, outperform those of regular LDPC codes. Furthermore, a family of low- to high-rate codes, with thresholds that adhere closely to their respective channel capacity thresholds, is presented. Simulation results for a few example codes show that the proposed codes have low error floors as well as good threshold SNR performance.
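The basic protograph mechanism, expanding ("lifting") a small base graph into a full parity-check matrix by replacing each edge with a circulant permutation, can be sketched as follows. The base matrix and shift values are arbitrary illustrations, not the codes proposed in the paper.

```python
# Illustrative sketch of lifting a protograph: each 1 in a small base
# matrix becomes a Z x Z circulant permutation (with some shift), each
# 0 becomes a Z x Z zero block. Base matrix and shifts are arbitrary
# examples, not the paper's constructions.

def circulant(z, shift):
    """Z x Z identity matrix cyclically shifted right by `shift`."""
    return [[1 if c == (r + shift) % z else 0 for c in range(z)]
            for r in range(z)]

def zeros(z):
    return [[0] * z for _ in range(z)]

def lift(base, shifts, z):
    """Expand a proto (base) matrix into a full parity-check matrix H."""
    rows = []
    for i, brow in enumerate(base):
        blocks = [circulant(z, shifts[i][j]) if b else zeros(z)
                  for j, b in enumerate(brow)]
        for r in range(z):
            rows.append([blk[r][c] for blk in blocks for c in range(z)])
    return rows

base = [[1, 1, 1, 0],
        [0, 1, 1, 1]]
shifts = [[0, 1, 2, 0],
          [0, 0, 1, 2]]
H = lift(base, shifts, z=4)
assert len(H) == 8 and len(H[0]) == 16
# Every lifted check row keeps its base row's weight (here 3).
assert all(sum(row) == 3 for row in H)
```

Because every lifted row and column inherits its degree from the protograph, decoder hardware can be laid out once per base edge and reused Z times, which is the high-speed implementation advantage the abstract mentions.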
On decoding of multi-level MPSK modulation codes
NASA Technical Reports Server (NTRS)
Lin, Shu; Gupta, Alok Kumar
1990-01-01
The decoding problem of multi-level block modulation codes is investigated. The hardware design of a soft-decision Viterbi decoder for some short-length 8-PSK block modulation codes is presented. An effective way to reduce the hardware complexity of the decoder, by reducing the branch and path metrics using a non-uniform floating-point to integer mapping scheme, is proposed and discussed, and simulation results of the design are presented. The multi-stage decoding (MSD) of multi-level modulation codes is also investigated. The cases of soft-decision and hard-decision MSD are considered and their performance is evaluated for several codes of different lengths and different minimum squared Euclidean distances. It is shown that soft-decision MSD reduces the decoding complexity drastically while being suboptimum. Hard-decision MSD further simplifies the decoding while still maintaining a reasonable coding gain over the uncoded system, if the component codes are chosen properly. Finally, some basic 3-level 8-PSK modulation codes using BCH codes as component codes are constructed and their coding gains are found for hard-decision multi-stage decoding.
The Mystro system: A comprehensive translator toolkit
NASA Technical Reports Server (NTRS)
Collins, W. R.; Noonan, R. E.
1985-01-01
Mystro is a system that facilitates the construction of compilers, assemblers, code generators, query interpreters, and similar programs, and provides features to encourage the use of iterative enhancement. Mystro was developed in response to the needs of NASA Langley Research Center (LaRC) and enjoys a number of advantages over similar systems. Other available translator-building programs typically build parser tables and usually supply the source of a parser and parts of a lexical analyzer, but provide little or no aid for code generation; in general, only the front end of the compiler is addressed. Mystro, on the other hand, emphasizes tools for both ends of a compiler.
Federal Register 2010, 2011, 2012, 2013, 2014
2012-07-23
... Offshore Drilling Units AGENCY: Coast Guard, DHS. ACTION: Notice of availability. SUMMARY: The Coast Guard...), Code for the Construction and Equipment of Mobile Offshore Drilling Units, 2009 (2009 MODU Code). CG...: Background and Purpose Foreign documented MODUs engaged in any offshore activity associated with the...
24 CFR 905.312 - Design and construction.
Code of Federal Regulations, 2014 CFR
2014-04-01
... constructed in compliance with: (1) A national building code, such as those developed by the International Code Council or the National Fire Protection Association; and the IECC or ASHRAE 90.1-2010 (both... a successor energy code or standard that has been adopted by HUD pursuant to 42 U.S.C. 12709 or...
41 CFR 102-76.10 - What basic design and construction policy governs Federal agencies?
Code of Federal Regulations, 2014 CFR
2014-01-01
.... (c) Follow nationally recognized model building codes and other applicable nationally recognized codes that govern Federal construction to the maximum extent feasible and consider local building code requirements. (See 40 U.S.C. 3310 and 3312.) (d) Design Federal buildings to have a long life expectancy and...
Building codes : obstacle or opportunity?
Alberto Goetzl; David B. McKeever
1999-01-01
Building codes are critically important in the use of wood products for construction. The codes contain regulations that are prescriptive or performance related for various kinds of buildings and construction types. A prescriptive standard might dictate that a particular type of material be used in a given application. A performance standard requires that a particular...
Quantum error correcting codes and 4-dimensional arithmetic hyperbolic manifolds
DOE Office of Scientific and Technical Information (OSTI.GOV)
Guth, Larry, E-mail: lguth@math.mit.edu; Lubotzky, Alexander, E-mail: alex.lubotzky@mail.huji.ac.il
2014-08-15
Using 4-dimensional arithmetic hyperbolic manifolds, we construct some new homological quantum error correcting codes. They are low density parity check codes with linear rate and distance n^ε. Their rate is evaluated via Euler characteristic arguments and their distance using Z_2-systolic geometry. This construction answers a question of Zémor [“On Cayley graphs, surface codes, and the limits of homological coding for quantum error correction,” in Proceedings of Second International Workshop on Coding and Cryptology (IWCC), Lecture Notes in Computer Science Vol. 5557 (2009), pp. 259–273], who asked whether homological codes with such parameters could exist at all.
NASA Technical Reports Server (NTRS)
Solomon, G.
1993-01-01
A (72,36;15) box code is constructed as a 9 x 8 matrix whose columns add to form an extended BCH-Hamming (8,4;4) code and whose rows sum to odd or even parity. The newly constructed code, due to its matrix form, is easily decodable for all seven-error and many eight-error patterns. The code comes from a slight modification in the parity (eighth) dimension of the Reed-Solomon (8,4;5) code over GF(512). Error correction uses the row sum parity information to detect errors, which then become erasures in a Reed-Solomon correction algorithm.
Error control techniques for satellite and space communications
NASA Technical Reports Server (NTRS)
Costello, Daniel J., Jr.
1992-01-01
Work performed during the reporting period is summarized. The construction of robustly good trellis codes for use with sequential decoding was developed; these codes provide a much better trade-off between free distance and distance profile. The unequal error protection capabilities of convolutional codes were studied. The problem of finding good large-constraint-length, low-rate convolutional codes for deep space applications was investigated. A formula for computing the free distance of rate 1/n convolutional codes was discovered. Double memory (DM) codes, codes with two memory units per unit bit position, were studied, and a search for optimal DM codes is being conducted. An algorithm for constructing convolutional codes from a given quasi-cyclic code was developed. Papers based on the above work are included in the appendix.
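The free distance mentioned above can also be computed directly by shortest-path search over the code's trellis; the sketch below does this for the textbook (7,5) octal rate-1/2 code (the report's closed-form formula itself is not reproduced here).

```python
# Compute the free distance of a rate-1/2 convolutional code by
# Dijkstra search over its trellis. Example: the textbook (7,5) octal
# code with memory 2, whose free distance is 5.

import heapq

G1, G2, MEM = 0b111, 0b101, 2   # generators 7 and 5 (octal), memory 2

def step(state, u):
    """One trellis transition: returns (next_state, output_weight)."""
    reg = (u << MEM) | state          # register [u, s1, s2] as 3 bits
    out1 = bin(reg & G1).count("1") & 1   # parity taps of g1 = 111
    out2 = bin(reg & G2).count("1") & 1   # parity taps of g2 = 101
    return reg >> 1, out1 + out2

def free_distance():
    # Force a departure from the all-zero state with input 1, then
    # find the minimum-weight path that remerges with state 0.
    start, w0 = step(0, 1)
    heap, best = [(w0, start)], {}
    while heap:
        w, s = heapq.heappop(heap)
        if s == 0:
            return w                  # first remerge is minimal weight
        if best.get(s, 1 << 30) <= w:
            continue
        best[s] = w
        for u in (0, 1):
            ns, dw = step(s, u)
            heapq.heappush(heap, (w + dw, ns))

assert free_distance() == 5
```

The same search generalizes to any rate-1/n feedforward encoder by changing the generator taps and memory length.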
Error control techniques for satellite and space communications
NASA Technical Reports Server (NTRS)
Costello, Daniel J., Jr.
1993-01-01
The results included in the Ph.D. dissertation of Dr. Fu Quan Wang, who was supported by the grant as a Research Assistant from January 1989 through December 1992, are discussed. The sections contain a brief summary of the important aspects of this dissertation, which include: (1) erasure-free sequential decoding of trellis codes; (2) probabilistic construction of trellis codes; (3) construction of robustly good trellis codes; and (4) the separability of shaping and coding.
[Orthopedic and trauma surgery in the German DRG System 2007].
Franz, D; Kaufmann, M; Siebert, C H; Windolf, J; Roeder, N
2007-03-01
The German Diagnosis-Related Groups (DRG) system was further developed into its 2007 version. For orthopedic and trauma surgery, significant changes were made in the coding of diagnoses and medical procedures, as well as in the DRG structure itself. The German societies for trauma surgery and for orthopedics and orthopedic surgery (Deutsche Gesellschaft für Unfallchirurgie, DGU; and Deutsche Gesellschaft für Orthopädie und Orthopädische Chirurgie, DGOOC) once again cooperated constructively with the German DRG institute InEK. Among other innovations, new International Classification of Diseases (ICD) codes for second-degree burns were implemented; procedure codes for joint operations, endoprosthetic surgery and spine surgery were restructured; and a specific code for septic surgery was introduced in 2007. In addition, the DRG structure was improved: case allocation of patients with more than one significant operation was established, and further DRG subdivisions were established according to the patient's age and the Patient Clinical Complexity Level (PCCL). The DRG developments for 2007 have improved appropriate case allocation, but once again increased the system's complexity. Clinicians need an ever-growing amount of specific coding know-how, and further adjustments to the German DRG system are still required to allow for a correct allocation of cases and funds.
Response surface method in geotechnical/structural analysis, phase 1
NASA Astrophysics Data System (ADS)
Wong, F. S.
1981-02-01
In the response surface approach, an approximating function is fit to a long-running computer code based on a limited number of code calculations. The approximating function, called the response surface, is then used to replace the code in subsequent repetitive computations required in a statistical analysis. The procedure of response surface development and the feasibility of the method are shown using a sample problem in slope stability, which is based on data from centrifuge experiments on model soil slopes and involves five random soil parameters. It is shown that a response surface can be constructed from as few as four code calculations and that the response surface is computationally extremely efficient compared to the code calculation. Potential applications of this research include probabilistic analysis of dynamic, complex, nonlinear soil/structure systems such as slope stability, liquefaction, and nuclear reactor safety.
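The surrogate idea can be illustrated with a deliberately small sketch: fit a cheap linear surface to a handful of runs of an "expensive" code, then query the surface instead. The stand-in function and the one-parameter setting are assumptions for exposition; the paper's problem has five random parameters.

```python
# Toy response-surface sketch: fit y ~ a + b*x by least squares to a
# few runs of an "expensive" code, then evaluate the cheap surrogate.
# The expensive function is a stand-in, not a geotechnical model.

def expensive_code(x):
    return 2.0 + 0.5 * x          # pretend each call takes hours

def fit_linear(xs, ys):
    """Least-squares line through (xs, ys): returns (a, b)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return my - b * mx, b

# Four code calculations, echoing the abstract's "as few as four".
xs = [0.0, 1.0, 2.0, 3.0]
ys = [expensive_code(x) for x in xs]
a, b = fit_linear(xs, ys)

surrogate = lambda x: a + b * x   # essentially free to evaluate
assert abs(surrogate(10.0) - expensive_code(10.0)) < 1e-9
```

In a real probabilistic analysis the surrogate would be sampled thousands of times (e.g. in a Monte Carlo loop) at negligible cost, which is the efficiency claim of the abstract.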
Del Piccolo, Lidia; Putnam, Samuel M; Mazzi, Maria Angela; Zimmermann, Christa
2004-04-01
Factor analysis (FA) is a powerful method of testing the construct validity of coding systems of the medical interview. The study uses FA to test the underlying assumptions of the Verona Medical Interview Classification System (VR-MICS). The relationship between factor scores and patient characteristics was also examined. The VR-MICS coding categories consider the three domains of the biopsychosocial model and the main functions of the medical interview: data gathering, relationship building and patient education. FA was performed on the frequencies of the VR-MICS categories based on 238 medical interviews. Seven factors (62.5% of variance explained) distinguished different strategies patients and physicians use to exchange information, build a relationship and negotiate treatment within the domains of the biopsychosocial model. Three factors, Psychological, Social Inquiry and Management of Patient Agenda, were related to patient data: sociodemographic (female gender, age and employment), social (stressful events), clinical (GHQ-12 score), personality (chance external health locus of control) and clinical characteristics (psychiatric history, chronic illness, attributed presence of emotional distress).
New q-ary quantum MDS codes with distances bigger than q/2
NASA Astrophysics Data System (ADS)
He, Xianmang; Xu, Liqing; Chen, Hao
2016-07-01
The construction of quantum MDS codes has been studied by many authors. We refer to the table on page 1482 of (IEEE Trans Inf Theory 61(3):1474-1484, 2015) for known constructions. However, there have been constructed only a few q-ary quantum MDS [[n,n-2d+2,d
Code of Federal Regulations, 2010 CFR
2010-10-01
... compliance of health and safety codes during construction projects being performed by a Self-Governance Tribe... SERVICES TRIBAL SELF-GOVERNANCE Construction Roles of the Secretary in Establishing and Implementing Construction Project Agreements § 137.368 Is the Secretary responsible for oversight and compliance of health...
Combined group ECC protection and subgroup parity protection
Gara, Alan G.; Chen, Dong; Heidelberger, Philip; Ohmacht, Martin
2013-06-18
A method and system are disclosed for providing combined error code protection and subgroup parity protection for a given group of n bits. The method comprises the steps of identifying a number, m, of redundant bits for said error protection; and constructing a matrix P, wherein multiplying said given group of n bits with P produces m redundant error correction code (ECC) protection bits, and two columns of P provide parity protection for subgroups of said given group of n bits. In the preferred embodiment of the invention, the matrix P is constructed by generating permutations of m bit wide vectors with three or more, but an odd number of, elements with value one and the other elements with value zero; and assigning said vectors to rows of the matrix P.
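The row construction in the patent abstract, permutations of m-bit vectors with an odd number (three or more) of ones, can be sketched directly; the assignment of rows to P and the choice of the two parity columns are simplified away here.

```python
# Sketch of the candidate-row generation described above: all m-bit
# vectors with odd weight >= 3, as candidate rows of the matrix P.
# Row assignment and parity-column selection are not modeled.

from itertools import combinations

def odd_weight_vectors(m):
    """All m-bit vectors with odd weight >= 3, as bit tuples."""
    vecs = []
    for w in range(3, m + 1, 2):          # weights 3, 5, 7, ...
        for ones in combinations(range(m), w):
            vecs.append(tuple(1 if i in ones else 0 for i in range(m)))
    return vecs

rows = odd_weight_vectors(5)
# C(5,3) + C(5,5) = 10 + 1 candidate rows for m = 5.
assert len(rows) == 11
assert all(sum(r) % 2 == 1 and sum(r) >= 3 for r in rows)
```

Odd row weight is what lets designated columns of P double as parity checks over subgroups of the n data bits while the full matrix still yields the m ECC bits.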
In the Rearview Mirror: Social Skill Development in Deaf Youth, 1990-2015.
Cawthon, Stephanie W; Fink, Bentley; Schoffstall, Sarah; Wendel, Erica
2018-01-01
Social skills are a vehicle by which individuals negotiate important relationships. The present article presents historical data on how social skills in deaf students were conceptualized and studied empirically during the period 1990-2015. Using a structured literature review approach, the researchers coded 266 articles for theoretical frameworks used and constructs studied. The vast majority of articles did not explicitly align with a specific theoretical framework. Of the 37 that did, most focused on socioemotional and cognitive frameworks, while a minority drew from frameworks focusing on attitudes, developmental theories, or ecological systems theory. In addition, 315 social-skill constructs were coded across the data set; the majority focused on socioemotional functioning. Trends in findings across the past quarter century and implications for research and practice are examined.
STGT program: Ada coding and architecture lessons learned
NASA Technical Reports Server (NTRS)
Usavage, Paul; Nagurney, Don
1992-01-01
STGT (Second TDRSS Ground Terminal) is currently halfway through the System Integration Test phase (Level 4 Testing). To date, many software architecture and Ada language issues have been encountered and solved. This paper, which is the transcript of a presentation at the 3 Dec. meeting, attempts to define these lessons plus others learned regarding software project management and risk management issues, training, performance, reuse, and reliability. Observations are included regarding the use of particular Ada coding constructs, software architecture trade-offs during the prototyping, development and testing stages of the project, and dangers inherent in parallel or concurrent systems, software, hardware, and operations engineering.
On entanglement-assisted quantum codes achieving the entanglement-assisted Griesmer bound
NASA Astrophysics Data System (ADS)
Li, Ruihu; Li, Xueliang; Guo, Luobin
2015-12-01
The theory of entanglement-assisted quantum error-correcting codes (EAQECCs) is a generalization of the standard stabilizer formalism. Any quaternary (or binary) linear code can be used to construct EAQECCs under the entanglement-assisted (EA) formalism. We derive an EA-Griesmer bound for linear EAQECCs, which is a quantum analog of the Griesmer bound for classical codes. This EA-Griesmer bound is tighter than known bounds for EAQECCs in the literature. For a given quaternary linear code {C}, we show that the parameters of the EAQECC that is EA-stabilized by the dual of {C} can be determined by a zero radical quaternary code induced from {C}, and a necessary condition under which a linear EAQECC may achieve the EA-Griesmer bound is also presented. We construct four families of optimal EAQECCs and then show the necessary condition for existence of EAQECCs is also sufficient for some low-dimensional linear EAQECCs. The four families of optimal EAQECCs are degenerate codes and go beyond earlier constructions. What is more, except for four codes, our [[n,k,d_{ea};c
Seeing the Invisible: Embedding Tests in Code That Cannot be Modified
NASA Technical Reports Server (NTRS)
O'Malley, Owen; Mansouri-Samani, Masoud; Mehlitz, Peter; Penix, John
2005-01-01
Characterizing and observing valid software behavior during testing can be very difficult in flight systems. To address this issue, we evaluated several approaches to increasing test observability on the Shuttle Abort Flight Management (SAFM) system. To increase test observability, we added probes into the running system to evaluate the internal state and analyze test data. To minimize the impact of the instrumentation and reduce manual effort, we used Aspect-Oriented Programming (AOP) tools to instrument the source code. We developed and elicited a spectrum of properties, from generic to application-specific, to be monitored via the instrumentation. To evaluate additional approaches, SAFM was ported to Linux, enabling the use of gcov for measuring test coverage, Valgrind for detecting memory usage errors, and libraries for finding non-normal floating-point values. An in-house C++ source code scanning tool was also used to identify violations of SAFM coding standards and other potentially problematic C++ constructs. Using these approaches with the existing test data sets, we were able to verify several important properties, confirm several known problems, and identify some previously unidentified issues.
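A minimal Python analogue of the probe idea can show the mechanism: a decorator "weaves" monitoring around a function without touching its body, recording calls and flagging non-normal floating-point results (cf. the NaN/Inf checks the abstract mentions). This is an illustrative sketch, not the SAFM instrumentation or a real AOP framework.

```python
# Decorator-based probe: records every call and flags non-finite
# float results, leaving the probed function's source untouched.

import math
from functools import wraps

probe_log = []

def probed(fn):
    @wraps(fn)
    def wrapper(*args, **kwargs):
        result = fn(*args, **kwargs)
        bad = isinstance(result, float) and not math.isfinite(result)
        probe_log.append((fn.__name__, args, result, bad))
        return result
    return wrapper

@probed
def safe_divide(a, b):
    return a / b if b else float("nan")

safe_divide(6.0, 3.0)
safe_divide(1.0, 0.0)
assert len(probe_log) == 2
assert probe_log[0][3] is False        # 2.0 is a normal value
assert probe_log[1][3] is True         # NaN flagged by the probe
```

In a C++ flight system the same weaving is done at compile time by an AOP tool, so the instrumentation can be removed for the flight build without editing the code under test.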
Debunking the Fire Sprinkler Myth.
ERIC Educational Resources Information Center
O'Connell, Thomas
1996-01-01
Sprinklers can protect school buildings, save lives, and actually reduce construction costs. Sprinkler-system costs can be easily offset by insurance savings, as well as by specific alternatives or design options permitted by nationally recognized building codes in view of the superior protection that sprinklers provide. (MLF)
Efficient Polar Coding of Quantum Information
NASA Astrophysics Data System (ADS)
Renes, Joseph M.; Dupuis, Frédéric; Renner, Renato
2012-08-01
Polar coding, introduced in 2008 by Arıkan, is the first (very) efficiently encodable and decodable coding scheme whose information transmission rate provably achieves the Shannon bound for classical discrete memoryless channels in the asymptotic limit of large block sizes. Here, we study the use of polar codes for the transmission of quantum information. Focusing on the case of qubit Pauli channels and qubit erasure channels, we use classical polar codes to construct a coding scheme that asymptotically achieves a net transmission rate equal to the coherent information using efficient encoding and decoding operations and code construction. Our codes generally require preshared entanglement between sender and receiver, but for channels with a sufficiently low noise level we demonstrate that the rate of preshared entanglement required is zero.
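The mechanism underlying polar codes, channel polarization, is easy to demonstrate for the binary erasure channel: one polar transform step turns two copies of a BEC with erasure probability z into a "worse" channel (erasure probability 2z - z²) and a "better" one (z²). The sketch below iterates this standard recursion; it is illustrative background, not the quantum construction of the paper.

```python
# Channel polarization for the binary erasure channel: iterating the
# transform drives most synthesized channels toward erasure
# probability 0 (perfect) or 1 (useless).

def polarize(z_list):
    """One level of the BEC polarization recursion."""
    out = []
    for z in z_list:
        out += [2 * z - z * z, z * z]   # worse channel, better channel
    return out

z = [0.5]                      # start from a BEC(0.5)
for _ in range(3):
    z = polarize(z)

assert len(z) == 8
assert abs(sum(z) / len(z) - 0.5) < 1e-12   # mean capacity preserved
assert min(z) < 0.004 and max(z) > 0.996    # channels polarize
```

Information is then sent only over the nearly perfect channels, which is how the scheme approaches capacity with low-complexity encoding and decoding.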
Number theoretical foundations in cryptography
NASA Astrophysics Data System (ADS)
Atan, Kamel Ariffin Mohd
2017-08-01
In recent times the hazards in relationships among entities in different establishments worldwide have generated exciting developments in cryptography. Central to this is the theory of numbers, an area of mathematics that provides a very rich source of fundamental material for constructing secret codes. Some number-theoretical concepts that have been very actively used in designing cryptosystems are highlighted in this presentation. The paper begins with an introduction to basic number-theoretical concepts which for many years were thought to have no practical applications, including several theoretical assertions discovered much earlier in the historical development of number theory. This is followed by a discussion of the "hidden" properties of these assertions that were later exploited by designers of cryptosystems in their quest to develop secret codes. The paper also highlights some earlier and existing cryptosystems and the role played by number-theoretical concepts in their construction. The role played by cryptanalysts in detecting weaknesses in the systems developed by cryptographers concludes the presentation.
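A textbook-scale example of the kind of cryptosystem surveyed here is RSA, whose security rests on the number-theoretic concepts mentioned (primes, Euler's totient, modular inverses). The primes below are deliberately tiny and insecure, for exposition only.

```python
# Toy RSA with the classic textbook parameters (insecure, for
# exposition). Python's three-argument pow computes both modular
# exponentiation and, with exponent -1, modular inverses.

p, q = 61, 53
n = p * q                      # modulus: 3233
phi = (p - 1) * (q - 1)        # Euler's totient: 3120
e = 17                         # public exponent, coprime to phi
d = pow(e, -1, phi)            # private exponent: e*d = 1 (mod phi)

message = 65
cipher = pow(message, e, n)    # encryption: m^e mod n
plain = pow(cipher, d, n)      # decryption: c^d mod n

assert d == 2753
assert plain == message
```

The "hidden property" being exploited is that computing d from the public (n, e) requires factoring n, easy for 3233, believed hard for moduli of thousands of bits.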
Evolutionary Construction of Block-Based Neural Networks in Consideration of Failure
NASA Astrophysics Data System (ADS)
Takamori, Masahito; Koakutsu, Seiichi; Hamagami, Tomoki; Hirata, Hironori
In this paper we propose a modified gene coding and an evolutionary construction method that accounts for failure in the evolutionary construction of Block-Based Neural Networks (BBNNs). In the modified gene coding, the genes for the weights are arranged on a chromosome according to the positional relation between the weight genes and the structure genes. This increases the efficiency of the crossover search, and is therefore expected to improve the convergence rate of construction and to shorten construction time. In the failure-aware evolutionary construction, a structure adapted to a failure is built in the state where the failure has occurred, so that the BBNN can be reconstructed in a short time when a failure occurs. To evaluate the proposed method, we apply it to pattern classification and autonomous mobile robot control problems. The computational experiments indicate that the proposed method can improve the convergence rate of construction and shorten construction and reconstruction times.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yamaguchi, Nobuyoshi; Nakao, Masato; Murakami, Masahide
2008-07-08
For seismic design, ductility-related force modification factors are named the R factor in the U.S. Uniform Building Code, the q factor in Eurocode 8, and the Ds factor (the inverse of R) in the Japanese Building Code. These ductility-related force modification factors appear in those codes for each type of shear element. Some buildings use various types of shear walls with different ductility, especially after retrofit or re-strengthening, and in these cases engineers struggle to decide the force modification factor for the building. To solve this problem, a new method to calculate the lateral strengths of stories for simple shear wall systems is proposed, named the "Stiffness-Potential Energy Addition Method" in this paper. This method uses two design lateral strengths for each type of shear wall, one in the damage limit state and one in the safety limit state. The lateral strengths of stories in both limit states are calculated from these design strengths for each type of shear wall, and the calculated strengths have the same quality as values obtained by the strength addition method using many steps of load-deformation data of shear walls. A new method to calculate ductility factors, based on this method for calculating story strengths, is also proposed; it can solve the problem of obtaining ductility factors for stories with shear walls of different ductility.
NASA Astrophysics Data System (ADS)
Zhang, Miao; Tong, Xiaojun
2017-07-01
This paper proposes a joint image encryption and compression scheme based on a new hyperchaotic system and the curvelet transform. A new five-dimensional hyperchaotic system based on the Rabinovich system is presented, and from it a new pseudorandom key stream generator is constructed. The algorithm adopts a diffusion and confusion structure to perform encryption, based on the key stream generator and the proposed hyperchaotic system; the key sequence used for image encryption is related to the plaintext. By means of the second-generation curvelet transform, run-length coding, and Huffman coding, the image data are compressed, and compression and encryption are performed jointly in a single process. The security test results indicate that the proposed methods have high security and a good compression effect.
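The two lossless compression stages mentioned above can be sketched with the standard library alone; the curvelet transform and the chaotic cipher are omitted, and the input is a short string rather than image data.

```python
# Stdlib-only sketch of run-length coding followed by Huffman code
# construction, the two entropy-coding stages named in the abstract.

import heapq
from collections import Counter

def rle_encode(data):
    """[(symbol, run_length), ...] for consecutive repeats."""
    runs = []
    for s in data:
        if runs and runs[-1][0] == s:
            runs[-1][1] += 1
        else:
            runs.append([s, 1])
    return [tuple(r) for r in runs]

def huffman_codes(symbols):
    """Map each symbol to a prefix-free bitstring by tree merging."""
    heap = [[w, i, {s: ""}] for i, (s, w) in
            enumerate(Counter(symbols).items())]
    heapq.heapify(heap)
    i = len(heap)
    while len(heap) > 1:
        w1, _, c1 = heapq.heappop(heap)   # two lightest subtrees
        w2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + b for s, b in c1.items()}
        merged.update({s: "1" + b for s, b in c2.items()})
        heapq.heappush(heap, [w1 + w2, i, merged]); i += 1
    return heap[0][2]

data = "aaaabbbcca"
assert rle_encode(data) == [("a", 4), ("b", 3), ("c", 2), ("a", 1)]
codes = huffman_codes(data)
# The most frequent symbol gets the shortest code.
assert len(codes["a"]) <= len(codes["b"]) <= len(codes["c"])
```

In the paper these stages operate on curvelet coefficients of the image, where long zero runs make run-length coding effective before the Huffman stage.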
Qin, Heng; Zuo, Yong; Zhang, Dong; Li, Yinghui; Wu, Jian
2017-03-06
Through a slight modification of the typical photomultiplier tube (PMT) receiver output statistics, a generalized received response model considering both scattered propagation and random detection is presented to investigate the impact of inter-symbol interference (ISI) on the link data rate of short-range non-line-of-sight (NLOS) ultraviolet communication. Numerical simulation shows good agreement with the experimental results. Based on the received response characteristics, a heuristic check matrix construction algorithm for a low-density parity-check (LDPC) code is further proposed to approach the data rate bound derived for a delayed-sampling (DS) binary pulse position modulation (PPM) system. Compared to conventional LDPC coding methods, a better bit error ratio (BER), below 1E-05, is achieved for short-range NLOS UVC systems operating at a data rate of 2 Mbps.
New optimal asymmetric quantum codes constructed from constacyclic codes
NASA Astrophysics Data System (ADS)
Xu, Gen; Li, Ruihu; Guo, Luobin; Lü, Liangdong
2017-02-01
In this paper, we propose the construction of asymmetric quantum codes from two families of constacyclic codes over the finite field 𝔽q² of code length n, where for the first family, q is an odd prime power of the form 4t + 1 (t ≥ 1 an integer) or 4t - 1 (t ≥ 2 an integer) and n1 = (q² + 1)/2; for the second family, q is an odd prime power of the form 10t + 3 or 10t + 7 (t ≥ 0 an integer) and n2 = (q² + 1)/5. As a result, families of new asymmetric quantum codes [[n, k, d_z/d_x]] are obtained.
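As a quick sanity check on the two length formulas (illustrative arithmetic only, not part of the paper), the admissible values of q and the resulting code lengths can be computed directly:

```python
# family 1: q = 4t+1 (t >= 1) or 4t-1 (t >= 2), length n1 = (q^2 + 1)/2
# family 2: q = 10t+3 or 10t+7 (t >= 0),        length n2 = (q^2 + 1)/5

def n1(q):
    assert (q % 4 == 1 and q >= 5) or (q % 4 == 3 and q >= 7)
    assert (q * q + 1) % 2 == 0    # q odd, so q^2 + 1 is even
    return (q * q + 1) // 2

def n2(q):
    assert q % 10 in (3, 7)
    assert (q * q + 1) % 5 == 0    # q^2 ≡ 9 (mod 10), so q^2 + 1 ≡ 0 (mod 5)
    return (q * q + 1) // 5
```

For example, q = 7 satisfies both congruence conditions, giving n1 = 25 and n2 = 10.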
C Language Integrated Production System, Ada Version
NASA Technical Reports Server (NTRS)
Culbert, Chris; Riley, Gary; Savely, Robert T.; Melebeck, Clovis J.; White, Wesley A.; Mcgregor, Terry L.; Ferguson, Melisa; Razavipour, Reza
1992-01-01
CLIPS/Ada provides capabilities of CLIPS v4.3 but uses Ada as source language for CLIPS executable code. Implements forward-chaining rule-based language. Program contains inference engine and language syntax providing framework for construction of expert-system program. Also includes features for debugging application program. Based on Rete algorithm which provides efficient method for performing repeated matching of patterns. Written in Ada.
NASA Astrophysics Data System (ADS)
Arkadov, G. V.; Zhukavin, A. P.; Kroshilin, A. E.; Parshikov, I. A.; Solov'ev, S. L.; Shishov, A. V.
2014-10-01
The article describes the "Virtual Digital VVER-Based Nuclear Power Plant" computerized system comprising a totality of verified initial data (sets of input data for a model intended for describing the behavior of nuclear power plant (NPP) systems in design and emergency modes of their operation) and a unified system of new-generation computation codes intended for carrying out coordinated computation of the variety of physical processes in the reactor core and NPP equipment. Experiments with the demonstration version of the "Virtual Digital VVER-Based NPP" computerized system have shown that it is in principle possible to set up a unified system of computation codes in a common software environment for carrying out interconnected calculations of various physical phenomena at NPPs constructed according to the standard AES-2006 project. With the full-scale version of the "Virtual Digital VVER-Based NPP" computerized system put in operation, the concerned engineering, design, construction, and operating organizations will have access to all necessary information relating to the NPP power unit project throughout its entire lifecycle. The domestically developed commercial-grade software product, operating as an independent application within the project, will bring additional competitive advantages in the modern market of nuclear power technologies.
Transversal Clifford gates on folded surface codes
Moussa, Jonathan E.
2016-10-12
Surface and color codes are two forms of topological quantum error correction in two spatial dimensions with complementary properties. Surface codes have lower-depth error detection circuits and well-developed decoders to interpret and correct errors, while color codes have transversal Clifford gates and better code efficiency in the number of physical qubits needed to achieve a given code distance. A formal equivalence exists between color codes and folded surface codes, but it does not guarantee the transferability of any of these favorable properties. However, the equivalence does imply the existence of constant-depth circuit implementations of logical Clifford gates on folded surface codes. We achieve and improve this result by constructing two families of folded surface codes with transversal Clifford gates. This construction is presented generally for qudits of any dimension. Lastly, the specific application of these codes to universal quantum computation based on qubit fusion is also discussed.
ERIC Educational Resources Information Center
Valenzuela, Elena; Faure, Ana; Ramirez-Trujillo, Alma P.; Barski, Ewelina; Pangtay, Yolanda; Diez, Adriana
2012-01-01
The study examined heritage speaker grammars and to what extent they diverge with respect to grammatical gender from adult L2 learners. Results from a preference task involving code-mixed Determiner Phrases (DPs) and code-mixed copula constructions show a difference between these two types of operations. Heritage speakers patterned with the…
AutoBayes Program Synthesis System Users Manual
NASA Technical Reports Server (NTRS)
Schumann, Johann; Jafari, Hamed; Pressburger, Tom; Denney, Ewen; Buntine, Wray; Fischer, Bernd
2008-01-01
Program synthesis is the systematic, automatic construction of efficient executable code from high-level declarative specifications. AutoBayes is a fully automatic program synthesis system for the statistical data analysis domain; in particular, it solves parameter estimation problems. It has seen many successful applications at NASA and is currently being used, for example, to analyze simulation results for Orion. The input to AutoBayes is a concise description of a data analysis problem composed of a parameterized statistical model and a goal that is a probability term involving parameters and input data. The output is optimized and fully documented C/C++ code computing the values for those parameters that maximize the probability term. AutoBayes can solve many subproblems symbolically rather than having to rely on numeric approximation algorithms, thus yielding effective, efficient, and compact code. Statistical analysis is faster and more reliable, because effort can be focused on model development and validation rather than manual development of solution algorithms and code.
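The closed-form solutions AutoBayes derives symbolically can be illustrated with the simplest case: for data modeled as i.i.d. Gaussian, the parameters maximizing the likelihood are the sample mean and the biased sample variance. The sketch below is generic textbook math used as an illustration, not actual AutoBayes output:

```python
import math

def gaussian_mle(data):
    """Closed-form maximum-likelihood estimates for an i.i.d. Gaussian model."""
    n = len(data)
    mu = sum(data) / n                               # sample mean
    var = sum((x - mu) ** 2 for x in data) / n       # biased sample variance
    return mu, var

def log_likelihood(data, mu, var):
    """Gaussian log-likelihood, the probability term being maximized."""
    n = len(data)
    return (-0.5 * n * math.log(2 * math.pi * var)
            - sum((x - mu) ** 2 for x in data) / (2 * var))
```

A synthesis system's advantage is that it derives such closed forms symbolically where they exist, instead of falling back on iterative numeric optimization.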
A Java-Enabled Interactive Graphical Gas Turbine Propulsion System Simulator
NASA Technical Reports Server (NTRS)
Reed, John A.; Afjeh, Abdollah A.
1997-01-01
This paper describes a gas turbine simulation system which utilizes the newly developed Java language environment software system. The system provides an interactive graphical environment which allows the quick and efficient construction and analysis of arbitrary gas turbine propulsion systems. The simulation system couples a graphical user interface, developed using the Java Abstract Window Toolkit, and a transient, space-averaged, aero-thermodynamic gas turbine analysis method, both entirely coded in the Java language. The combined package provides analytical, graphical and data management tools which allow the user to construct and control engine simulations by manipulating graphical objects on the computer display screen. Distributed simulations, including parallel processing and distributed database access across the Internet and World-Wide Web (WWW), are made possible through services provided by the Java environment.
Adding Concrete Syntax to a Prolog-Based Program Synthesis System
NASA Technical Reports Server (NTRS)
Fischer, Bernd; Visser, Eelco
2003-01-01
Program generation and transformation systems manipulate large, parameterized object language fragments. Support for user-definable concrete syntax makes this easier but is typically restricted to certain object and meta languages. We show how Prolog can be retrofitted with concrete syntax and describe how a seamless interaction of concrete syntax fragments with an existing legacy meta-programming system based on abstract syntax is achieved. We apply the approach to gradually migrate the schemas of the AUTOBAYES program synthesis system to concrete syntax. First experiences show that this can result in a considerable reduction of the code size and improved readability of the code. In particular, abstracting out fresh-variable generation and second-order term construction allows the formulation of larger continuous fragments and improves the locality in the schemas.
NASA Astrophysics Data System (ADS)
Alipchenkov, V. M.; Anfimov, A. M.; Afremov, D. A.; Gorbunov, V. S.; Zeigarnik, Yu. A.; Kudryavtsev, A. V.; Osipov, S. L.; Mosunova, N. A.; Strizhov, V. F.; Usov, E. V.
2016-02-01
The conceptual fundamentals of the development of the new-generation system thermal-hydraulic computational HYDRA-IBRAE/LM code are presented. The code is intended to simulate the thermal-hydraulic processes that take place in the loops and the heat-exchange equipment of liquid-metal cooled fast reactor systems under normal operation and anticipated operational occurrences and during accidents. The paper provides a brief overview of Russian and foreign system thermal-hydraulic codes for modeling liquid-metal coolants and justifies the need for the development of a new-generation HYDRA-IBRAE/LM code. Considering the specific engineering features of the nuclear power plants (NPPs) equipped with the BN-1200 and the BREST-OD-300 reactors, the processes and phenomena that require a detailed analysis and the development of models to be correctly described by the system thermal-hydraulic code in question are singled out. Information on the functionality of the computational code is provided, viz., the thermal-hydraulic two-phase model, the properties of the sodium and the lead coolants, the closing equations for simulation of the heat-mass exchange processes, the models to describe the processes that take place during a steam-generator tube rupture, etc. The article gives a brief overview of the usability of the computational code, including a description of the support documentation and the supply package, as well as possibilities of taking advantage of modern computer technologies, such as parallel computations. The paper shows the current state of verification and validation of the computational code; it also presents information on the principles of constructing and populating the verification matrices for the BREST-OD-300 and the BN-1200 reactor systems. The prospects are outlined for further development of the HYDRA-IBRAE/LM code, introduction of new models into it, and enhancement of its usability.
It is shown that the program of development and practical application of the code will make it possible, in the near future, to carry out computations for analyzing the safety of prospective NPP projects at a qualitatively higher level.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vine, E.
1990-11-01
As part of Lawrence Berkeley Laboratory's (LBL) technical assistance to the Sustainable City Project, compliance and enforcement activities related to local and state building codes for existing and new construction were evaluated in two case studies. The analysis of the City of San Francisco's Residential Energy Conservation Ordinance (RECO) showed that a limited, prescriptive energy conservation ordinance for existing residential construction can be enforced relatively easily with little administrative cost, and that compliance with such ordinances can be quite high. Compliance with the code was facilitated by extensive publicity, an informed public concerned with the cost of energy and knowledgeable about energy efficiency, the threat of punishment (Order of Abatement), the use of private inspectors, and training workshops for City and private inspectors. The analysis of California's Title 24 Standards for new residential and commercial construction showed that enforcement of this type of code for many climate zones is more complex and requires extensive administrative support for education and training of inspectors, architects, engineers, and builders. Under this code, prescriptive and performance approaches for compliance are permitted, resulting in the demand for alternative methods of enforcement: technical assistance, plan review, field inspection, and computer analysis. In contrast to existing construction, building design and new materials and construction practices are of critical importance in new construction, creating a need for extensive technical assistance and extensive interaction between enforcement personnel and the building community. Compliance problems associated with building design and installation did occur in both residential and nonresidential buildings.
Because statewide codes are enforced by local officials, these problems may increase over time as energy standards change and become more complex and as other standards (e.g., health and safety codes) remain a higher priority. The California Energy Commission realizes that code enforcement by itself is insufficient and expects that additional educational and technical assistance efforts (e.g., manuals, training programs, and toll-free telephone lines) will ameliorate these problems.
Analysis of quantum error-correcting codes: Symplectic lattice codes and toric codes
NASA Astrophysics Data System (ADS)
Harrington, James William
Quantum information theory is concerned with identifying how quantum mechanical resources (such as entangled quantum states) can be utilized for a number of information processing tasks, including data storage, computation, communication, and cryptography. Efficient quantum algorithms and protocols have been developed for performing some tasks (e.g., factoring large numbers, securely communicating over a public channel, and simulating quantum mechanical systems) that appear to be very difficult with just classical resources. In addition to identifying the separation between classical and quantum computational power, much of the theoretical focus in this field over the last decade has been concerned with finding novel ways of encoding quantum information that are robust against errors, which is an important step toward building practical quantum information processing devices. In this thesis I present some results on the quantum error-correcting properties of oscillator codes (also described as symplectic lattice codes) and toric codes. Any harmonic oscillator system (such as a mode of light) can be encoded with quantum information via symplectic lattice codes that are robust against shifts in the system's continuous quantum variables. I show the existence of lattice codes whose achievable rates match the one-shot coherent information over the Gaussian quantum channel. Also, I construct a family of symplectic self-dual lattices and search for optimal encodings of quantum information distributed between several oscillators. Toric codes provide encodings of quantum information into two-dimensional spin lattices that are robust against local clusters of errors and which require only local quantum operations for error correction. Numerical simulations of this system under various error models provide a calculation of the accuracy threshold for quantum memory using toric codes, which can be related to phase transitions in certain condensed matter models.
I also present a local classical processing scheme for correcting errors on toric codes, which demonstrates that quantum information can be maintained in two dimensions by purely local (quantum and classical) resources.
Generating Customized Verifiers for Automatically Generated Code
NASA Technical Reports Server (NTRS)
Denney, Ewen; Fischer, Bernd
2008-01-01
Program verification using Hoare-style techniques requires many logical annotations. We have previously developed a generic annotation inference algorithm that weaves in all annotations required to certify safety properties for automatically generated code. It uses patterns to capture generator- and property-specific code idioms and property-specific meta-program fragments to construct the annotations. The algorithm is customized by specifying the code patterns and integrating them with the meta-program fragments for annotation construction. However, this is difficult since it involves tedious and error-prone low-level term manipulations. Here, we describe an annotation schema compiler that largely automates this customization task using generative techniques. It takes a collection of high-level declarative annotation schemas tailored towards a specific code generator and safety property, and generates all customized analysis functions and glue code required for interfacing with the generic algorithm core, thus effectively creating a customized annotation inference algorithm. The compiler raises the level of abstraction and simplifies schema development and maintenance. It also takes care of some more routine aspects of formulating patterns and schemas, in particular handling of irrelevant program fragments and irrelevant variance in the program structure, which reduces the size, complexity, and number of different patterns and annotation schemas that are required. The improvements described here make it easier and faster to customize the system to a new safety property or a new generator, and we demonstrate this by customizing it to certify frame safety of space flight navigation code that was automatically generated from Simulink models by MathWorks' Real-Time Workshop.
SPAMCART: a code for smoothed particle Monte Carlo radiative transfer
NASA Astrophysics Data System (ADS)
Lomax, O.; Whitworth, A. P.
2016-10-01
We present a code for generating synthetic spectral energy distributions and intensity maps from smoothed particle hydrodynamics simulation snapshots. The code is based on the Lucy Monte Carlo radiative transfer method, i.e. it follows discrete luminosity packets as they propagate through a density field, and then uses their trajectories to compute the radiative equilibrium temperature of the ambient dust. The sources can be extended and/or embedded, and discrete and/or diffuse. The density is not mapped on to a grid, and therefore the calculation is performed at exactly the same resolution as the hydrodynamics. We present two example calculations using this method. First, we demonstrate that the code strictly adheres to Kirchhoff's law of radiation. Secondly, we present synthetic intensity maps and spectra of an embedded protostellar multiple system. The algorithm uses data structures that are already constructed for other purposes in modern particle codes. It is therefore relatively simple to implement.
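The packet-propagation step of the Lucy method can be sketched in a few lines: the optical depth to the next event is sampled as τ = -ln(1 - ξ) for a uniform random ξ, and the corresponding path length follows from the opacity and density. The uniform medium below is a simplifying assumption; SPAMCART instead evaluates the density directly from the SPH particles along the trajectory:

```python
import math
import random

def sample_path_length(kappa, rho, rng=random.random):
    """Distance a luminosity packet travels before its next interaction
    in a uniform medium with opacity kappa and density rho."""
    tau = -math.log(1.0 - rng())   # sampled optical depth, exponential with mean 1
    return tau / (kappa * rho)     # convert optical depth to physical length
```

The mean of the sampled lengths is the mean free path 1/(κρ), which is the property the Monte Carlo estimate relies on.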
Low Density Parity Check Codes Based on Finite Geometries: A Rediscovery and More
NASA Technical Reports Server (NTRS)
Kou, Yu; Lin, Shu; Fossorier, Marc
1999-01-01
Low density parity check (LDPC) codes with iterative decoding based on belief propagation achieve astonishing error performance close to the Shannon limit. No algebraic or geometric method for constructing these codes has been reported, and they are largely generated by computer search. As a result, encoding of long LDPC codes is in general very complex. This paper presents two classes of high rate LDPC codes whose constructions are based on finite Euclidean and projective geometries, respectively. These classes of codes are cyclic and have good constraint parameters and minimum distances. The cyclic structure allows the use of linear feedback shift registers for encoding. These finite geometry LDPC codes achieve very good error performance with either soft-decision iterative decoding based on belief propagation or Gallager's hard-decision bit flipping algorithm. These codes can be punctured or extended to obtain other good LDPC codes. A generalization of these codes is also presented.
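Gallager's hard-decision bit-flipping algorithm mentioned above can be sketched as follows. The decoding loop is generic; the small Hamming-code parity-check matrix used in the usage example is an illustration, not one of the paper's finite-geometry codes:

```python
def bit_flip_decode(H, y, max_iters=50):
    """Hard-decision bit flipping: repeatedly flip the bits involved in
    the largest number of unsatisfied parity checks.
    H is a list of parity-check rows (0/1 lists); y is the received word."""
    r = list(y)
    n = len(r)
    for _ in range(max_iters):
        # evaluate every parity check on the current hard decisions
        syndrome = [sum(h[j] * r[j] for j in range(n)) % 2 for h in H]
        if not any(syndrome):
            return r                                   # all checks satisfied
        # count, per bit, the unsatisfied checks it participates in
        fails = [sum(h[j] for h, s in zip(H, syndrome) if s) for j in range(n)]
        worst = max(fails)
        r = [(bit + 1) % 2 if f == worst else bit      # flip most-suspect bits
             for bit, f in zip(r, fails)]
    return r
```

For example, with the (7,4) Hamming check matrix and an all-zero codeword corrupted in one position, the loop recovers the codeword in a few iterations.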
On the existence of binary simplex codes. [using combinatorial construction
NASA Technical Reports Server (NTRS)
Taylor, H.
1977-01-01
Using a simple combinatorial construction, the existence of a binary simplex code with m codewords is proved for all m ≥ 1. The problem of the shortest possible length is left open.
Chen, Wen; Zhang, Xuan; Li, Jing; Huang, Shulan; Xiang, Shuanglin; Hu, Xiang; Liu, Changning
2018-05-09
Zebrafish is a fully developed model system for studying development processes and human disease. Recent deep sequencing studies have discovered a large number of long non-coding RNAs (lncRNAs) in zebrafish. However, only a few of them have been functionally characterized. Therefore, how to take advantage of the mature zebrafish system to deeply investigate lncRNA function and conservation is really intriguing. We systematically collected and analyzed a series of zebrafish RNA-seq data, then combined them with resources from known databases and the literature. As a result, we obtained by far the most complete dataset of zebrafish lncRNAs, containing 13,604 lncRNA genes (21,128 transcripts) in total. Based on that, a co-expression network over zebrafish coding and lncRNA genes was constructed and analyzed, and used to predict the Gene Ontology (GO) and KEGG annotations of lncRNAs. Meanwhile, we performed a conservation analysis on zebrafish lncRNAs, identifying 1828 conserved zebrafish lncRNA genes (1890 transcripts) that have putative mammalian orthologs. We also found that zebrafish lncRNAs play important roles in regulation of the development and function of the nervous system; these conserved lncRNAs show significant sequence and functional conservation with their mammalian counterparts. By integrative data analysis and construction of a coding-lncRNA gene co-expression network, we gained the most comprehensive dataset of zebrafish lncRNAs to date, as well as their systematic annotations and comprehensive analyses of function and conservation. Our study provides a reliable zebrafish-based platform to deeply explore lncRNA function and mechanism, as well as lncRNA commonality between zebrafish and human.
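A minimal sketch of the co-expression step described above: link two genes when their expression profiles correlate strongly across samples. The Pearson measure and the 0.9 threshold are illustrative choices, not necessarily the authors' exact pipeline:

```python
import math

def pearson(a, b):
    """Pearson correlation between two equal-length expression profiles."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    va = math.sqrt(sum((x - ma) ** 2 for x in a))
    vb = math.sqrt(sum((y - mb) ** 2 for y in b))
    return cov / (va * vb)

def coexpression_edges(profiles, threshold=0.9):
    """Return gene pairs whose |correlation| meets the threshold.
    profiles: dict mapping gene name -> list of expression values."""
    genes = list(profiles)
    return [(g1, g2)
            for i, g1 in enumerate(genes)
            for g2 in genes[i + 1:]
            if abs(pearson(profiles[g1], profiles[g2])) >= threshold]
```

Functional annotations are then transferred along edges: an unannotated lncRNA inherits candidate GO/KEGG terms from the coding genes it clusters with ("guilt by association").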
DOE Office of Scientific and Technical Information (OSTI.GOV)
Femec, D.A.
This report describes two code-generating tools used to speed design and implementation of relational databases and user interfaces: CREATE-SCHEMA and BUILD-SCREEN. CREATE-SCHEMA produces the SQL commands that actually create and define the database. BUILD-SCREEN takes templates for data entry screens and generates the screen management system routine calls to display the desired screen. Both tools also generate the related FORTRAN declaration statements and precompiled SQL calls. Included with this report is the source code for a number of FORTRAN routines and functions used by the user interface. This code is broadly applicable to a number of different databases.
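A toy analogue of what CREATE-SCHEMA does on the SQL side might look as follows. The column-specification format here is invented for illustration, and the FORTRAN declarations and precompiled SQL calls the real tool also emits are omitted:

```python
def create_table_sql(table, columns):
    """Emit the SQL that creates and defines one table.
    columns: list of (name, sql_type) pairs (a hypothetical spec format)."""
    cols = ",\n  ".join(f"{name} {sqltype}" for name, sqltype in columns)
    return f"CREATE TABLE {table} (\n  {cols}\n);"
```

The value of such generators is consistency: the database definition, the host-language declarations, and the embedded SQL calls are all derived from one specification instead of being maintained by hand in parallel.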
ERIC Educational Resources Information Center
Prime, Heather; Perlman, Michal; Tackett, Jennifer L.; Jenkins, Jennifer M.
2014-01-01
Research Findings: The goal of this study was to develop a construct of sibling cognitive sensitivity, which describes the extent to which children take their siblings' knowledge and cognitive abilities into account when working toward a joint goal. In addition, the study compared 2 coding methodologies for measuring the construct: a thin…
I-Ching, dyadic groups of binary numbers and the geno-logic coding in living bodies.
Hu, Zhengbing; Petoukhov, Sergey V; Petukhova, Elena S
2017-12-01
The ancient Chinese book I-Ching was written a few thousand years ago. It introduces the system of symbols Yin and Yang (equivalents of 0 and 1). It had a powerful impact on the culture, medicine and science of ancient China and several other countries. From the modern standpoint, I-Ching declares the importance of dyadic groups of binary numbers for nature. The system of I-Ching is represented by tables with dyadic groups of 4 bigrams, 8 trigrams and 64 hexagrams, which were declared fundamental archetypes of nature. The ancient Chinese did not know about the genetic code of protein sequences of amino acids, but this code is organized in accordance with the I-Ching: in particular, the genetic code is constructed on DNA molecules using 4 nitrogenous bases, 16 doublets, and 64 triplets. The article also describes the usage of dyadic groups as a foundation of the bio-mathematical doctrine of the geno-logic code, which exists in parallel with the known genetic code of amino acids but serves a different goal: to code the inherited algorithmic processes using the logical holography and the spectral logic of systems of genetic Boolean functions. Some relations of this doctrine with the I-Ching are discussed. In addition, the ratios of musical harmony that can be revealed in the parameters of DNA structure are also represented in the I-Ching book. Copyright © 2017 Elsevier Ltd. All rights reserved.
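The dyadic-group counts cited above (4 bigrams, 8 trigrams and 64 hexagrams over the two-symbol Yin/Yang alphabet, matching the 16 doublets and 64 triplets over the four DNA bases) follow from direct enumeration:

```python
from itertools import product

def n_grams(alphabet, k):
    """Number of distinct length-k sequences over the given alphabet."""
    return len(list(product(alphabet, repeat=k)))

yin_yang = ("0", "1")          # Yin, Yang
bases = ("A", "C", "G", "T")   # DNA nitrogenous bases
```

Both hierarchies have size |alphabet|^k, which is why the 64 hexagrams and the 64 codons coincide numerically (2^6 = 4^3).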
Overcoming Codes and Standards Barriers to Innovations in Building Energy Efficiency
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cole, Pamala C.; Gilbride, Theresa L.
2015-02-15
In this journal article, the authors discuss approaches to overcoming building code barriers to energy-efficiency innovations in home construction. Building codes have been a highly motivational force for increasing the energy efficiency of new homes in the United States in recent years. But as quickly as the codes seem to be changing, new products are coming to the market at an even more rapid pace, sometimes offering approaches and construction techniques unthought of when the current code was first proposed, which might have been several years before its adoption by various jurisdictions. Due to this delay, the codes themselves can become barriers to innovations that might otherwise be helping to further increase the efficiency, comfort, health or durability of new homes. The U.S. Department of Energy’s Building America, a program dedicated to improving the energy efficiency of America’s housing stock through research and education, is working with the U.S. housing industry through its research teams to help builders identify and remove code barriers to innovation in the home construction industry. The article addresses several approaches that builders use to achieve approval for innovative building techniques when code barriers appear to exist.
Combined group ECC protection and subgroup parity protection
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gara, Alan; Cheng, Dong; Heidelberger, Philip
A method and system are disclosed for providing combined error code protection and subgroup parity protection for a given group of n bits. The method comprises the steps of identifying a number, m, of redundant bits for said error protection; and constructing a matrix P, wherein multiplying said given group of n bits with P produces m redundant error correction code (ECC) protection bits, and two columns of P provide parity protection for subgroups of said given group of n bits. In the preferred embodiment of the invention, the matrix P is constructed by generating permutations of m-bit-wide vectors with three or more, but an odd number of, elements with value one and the other elements with value zero; and assigning said vectors to rows of the matrix P.
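The family of candidate row vectors described above (m-bit vectors with an odd number, at least three, of ones) can be enumerated directly. The assignment of vectors to rows of P and the selection of the two parity columns follow the disclosed construction and are not reproduced here:

```python
from itertools import combinations

def odd_weight_vectors(m, min_ones=3):
    """All m-bit 0/1 vectors whose number of ones is odd and >= min_ones;
    these are the candidate rows for the matrix P."""
    vecs = []
    for w in range(min_ones, m + 1, 2):       # odd weights: 3, 5, 7, ...
        for ones in combinations(range(m), w):
            v = [0] * m
            for i in ones:
                v[i] = 1
            vecs.append(tuple(v))
    return vecs
```

The odd weight is what makes the scheme work: any single-bit error flips an odd number of the m check bits, so it is always detectable in the syndrome.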
The O*Net Jobs Classification System: A Primer for Family Researchers
ERIC Educational Resources Information Center
Crouter, Ann C.; Lanza, Stephanie T.; Pirretti, Amy; Goodman, W. Benjamin; Neebe, Eloise
2006-01-01
We introduce family researchers to the Occupational Information Network, or O*Net, an electronic database on the work characteristics of over 950 occupations. The paper here is a practical primer that covers data collection, selecting occupational characteristics, coding occupations, scale creation, and construct validity, with empirical…
Dual coding: a cognitive model for psychoanalytic research.
Bucci, W
1985-01-01
Four theories of mental representation derived from current experimental work in cognitive psychology have been discussed in relation to psychoanalytic theory. These are: verbal mediation theory, in which language determines or mediates thought; perceptual dominance theory, in which imagistic structures are dominant; common code or propositional models, in which all information, perceptual or linguistic, is represented in an abstract, amodal code; and dual coding, in which nonverbal and verbal information are each encoded, in symbolic form, in separate systems specialized for such representation, and connected by a complex system of referential relations. The weight of current empirical evidence supports the dual code theory. However, psychoanalysis has implicitly accepted a mixed model-perceptual dominance theory applying to unconscious representation, and verbal mediation characterizing mature conscious waking thought. The characterization of psychoanalysis, by Schafer, Spence, and others, as a domain in which reality is constructed rather than discovered, reflects the application of this incomplete mixed model. The representations of experience in the patient's mind are seen as without structure of their own, needing to be organized by words, thus vulnerable to distortion or dissolution by the language of the analyst or the patient himself. In these terms, hypothesis testing becomes a meaningless pursuit; the propositions of the theory are no longer falsifiable; the analyst is always more or less "right." This paper suggests that the integrated dual code formulation provides a more coherent theoretical framework for psychoanalysis than the mixed model, with important implications for theory and technique. In terms of dual coding, the problem is not that the nonverbal representations are vulnerable to distortion by words, but that the words that pass back and forth between analyst and patient will not affect the nonverbal schemata at all. 
Using the dual code formulation, and applying an investigative methodology derived from experimental cognitive psychology, a new approach to the verification of interpretations is possible. Some constructions of a patient's story may be seen as more accurate than others, by virtue of their linkage to stored perceptual representations in long-term memory. We can demonstrate that such linking has occurred in functional or operational terms--through evaluating the representation of imagistic content in the patient's speech.
Residential building codes, affordability, and health protection: a risk-tradeoff approach.
Hammitt, J K; Belsky, E S; Levy, J I; Graham, J D
1999-12-01
Residential building codes intended to promote health and safety may produce unintended countervailing risks by adding to the cost of construction. Higher construction costs increase the price of new homes and may increase health and safety risks through "income" and "stock" effects. The income effect arises because households that purchase a new home have less income remaining for spending on other goods that contribute to health and safety. The stock effect arises because suppression of new-home construction leads to slower replacement of less safe housing units. These countervailing risks are not presently considered in code debates. We demonstrate the feasibility of estimating the approximate magnitude of countervailing risks by combining the income effect with three relatively well understood and significant home-health risks. We estimate that a code change that increases the nationwide cost of constructing and maintaining homes by $150 (0.1% of the average cost to build a single-family home) would induce offsetting risks yielding between 2 and 60 premature fatalities or, including morbidity effects, between 20 and 800 lost quality-adjusted life years (both discounted at 3%) each year the code provision remains in effect. To provide a net health benefit, the code change would need to reduce risk by at least this amount. Future research should refine these estimates, incorporate quantitative uncertainty analysis, and apply a full risk-tradeoff approach to real-world case studies of proposed code changes.
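The income-effect arithmetic can be reproduced at back-of-envelope level. All parameter values below are illustrative assumptions, not the authors' calibrated inputs; the paper's own analysis yields 2 to 60 premature fatalities per year for the $150 cost increase:

```python
# Assumed inputs (hypothetical, for illustration only):
COST_PER_HOME = 150.0     # added construction/maintenance cost per home ($)
HOMES_AFFECTED = 1.2e6    # assumed homes affected per year
INCOME_PER_DEATH = 1.0e7  # assumed income loss ($) that induces one
                          # statistical death; the income-risk literature
                          # spans very roughly $5M-$50M per death

def induced_fatalities(cost_per_home, homes, income_per_death):
    """Income-effect estimate: spending diverted from other health- and
    safety-enhancing uses translates into statistical deaths."""
    total_income_loss = cost_per_home * homes
    return total_income_loss / income_per_death
```

Under these assumed values the estimate is 150 × 1.2e6 / 1e7 = 18 statistical deaths per year, which falls inside the paper's reported 2-60 range; varying the per-death income figure across its plausible span reproduces much of that spread.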
Evaluation of Passive Vents in New Construction Multifamily Buildings
DOE Office of Scientific and Technical Information (OSTI.GOV)
Maxwell, Sean; Berger, David; Zuluaga, Marc
Exhaust ventilation and corresponding outdoor air strategies are being implemented in high-performance new construction multifamily buildings to meet program or code requirements for improved indoor air quality, but a lack of clear design guidance is resulting in poor performance of these systems despite the best intentions of the programs or standards. CARB's 2014 'Evaluation of Ventilation Strategies in New Construction Multifamily Buildings' consistently demonstrated that commonly used outdoor air strategies are not performing as expected. Of the four strategies evaluated in 2014, the exhaust ventilation system that relied on outdoor air from a pressurized corridor was ruled out as a potential best practice due to its conflict with meeting requirements within most fire codes. Outdoor air that is ducted directly to the apartments was a strategy determined to have the highest likelihood of success, but with higher first costs and operating costs. Outdoor air through space conditioning systems was also determined to have good performance potential, with proper design and execution. The fourth strategy, passive systems, was identified as the least expensive option for providing outdoor air directly to apartments, with respect to both first costs and operating costs. However, little is known about how they actually perform in real-world conditions or how to implement them effectively. Based on the lack of data available on the performance of these low-cost systems and their frequent use in the high-performance building programs that require a provision for outdoor air, this research project sought to further evaluate the performance of passive vents.
Proceedings of Conference on Variable-Resolution Modeling, Washington, DC, 5-6 May 1992
1992-05-01
of powerful new computer architectures for supporting object-oriented computing. Objects, as self-contained data-code packages with orderly...another entity structure. For example, (copy-entstr e:system 'new-system) creates an entity structure named c:new-system that has the same structure...324 Parry, S-H. (1984): A Self-contained Hierarchical Model Construct. In: Systems Analysis and Modeling in Defense (R.K. Huber, Ed.), New York
Remote coding scheme based on waveguide Bragg grating in PLC splitter chip for PON monitoring.
Zhang, Xuan; Lu, Fengjun; Chen, Si; Zhao, Xingqun; Zhu, Min; Sun, Xiaohan
2016-03-07
A remote coding scheme based on distributed waveguide Bragg gratings (WBGs) arranged in a PLC splitter chip is proposed and analyzed for passive optical network (PON) monitoring, by which the management system can identify each drop fiber link through the same reflector in the terminal of each optical network unit, even when there are several equidistant users. The corresponding coding and capacity models are respectively established and investigated so that the minimum number of WBGs needed under the distributed structure can be obtained. A signal-to-noise ratio (SNR) model related to the number of equidistant users is also developed to extend the analysis to the overall performance of the system. Simulation results show that the proposed scheme is feasible and allows the monitoring of a 64-user PON with an SNR range of 7.5~10.6 dB. The scheme can resolve some of the difficulties of the construction site at a lower user cost for PON systems.
O'keefe, Matthew; Parr, Terence; Edgar, B. Kevin; ...
1995-01-01
Massively parallel processors (MPPs) hold the promise of extremely high performance that, if realized, could be used to study problems of unprecedented size and complexity. One of the primary stumbling blocks to this promise has been the lack of tools to translate application codes to MPP form. In this article we show how applications codes written in a subset of Fortran 77, called Fortran-P, can be translated to achieve good performance on several massively parallel machines. This subset can express codes that are self-similar, where the algorithm applied to the global data domain is also applied to each subdomain. We have found many codes that match the Fortran-P programming style and have converted them using our tools. We believe a self-similar coding style will accomplish what a vectorizable style has accomplished for vector machines by allowing the construction of robust, user-friendly, automatic translation systems that increase programmer productivity and generate fast, efficient code for MPPs.
The Architecture Design of Detection and Calibration System for High-voltage Electrical Equipment
NASA Astrophysics Data System (ADS)
Ma, Y.; Lin, Y.; Yang, Y.; Gu, Ch; Yang, F.; Zou, L. D.
2018-01-01
With the construction of the Material Quality Inspection Center of Shandong electric power company, the Electric Power Research Institute takes on more work in quality analysis and laboratory calibration for high-voltage electrical equipment, so informatization has become urgent. In this paper we design a consolidated system which implements electronic management and an online automated process for material sampling, test apparatus detection and field testing. In these three jobs we use QR code scanning, online Word editing and electronic signatures. These techniques simplify the complex processes of warehouse management and test report transfer, and largely reduce manual procedures. The construction of the standardized detection information platform realizes integrated management of high-voltage electrical equipment from networking and operation through periodic detection. According to the system operation evaluation, the speed of transferring reports is doubled, and querying data is also easier and faster.
DOE Office of Scientific and Technical Information (OSTI.GOV)
DEXTER, M.L.
1999-11-15
This document serves as a notice of construction (NOC) pursuant to the requirements of Washington Administrative Code (WAC) 246-247-060, and as a request for approval to modify pursuant to 40 Code of Federal Regulations (CFR) 61.07, for the installation and operation of one waste retrieval system in the 241-AP-102 Tank and one waste retrieval system in the 241-AP-104 Tank. Pursuant to 40 CFR 61.09(a)(1), this application is also intended to provide the anticipated initial start-up notification. It is requested that EPA approval of this application also constitute EPA acceptance of the initial start-up notification. Project W-211, Initial Tank Retrieval Systems (ITRS), is scoped to install a waste retrieval system in the following double-shell tanks between now and the year 2011: 241-AP-102, AP-104, AN-102, AN-103, AN-104, AN-105, AY-102, AZ-102 and SY-102. Because of the extended installation schedules and unknowns about specific activities/designs at each tank, it was decided to submit NOCs as that information became available. This NOC covers the installation and operation of a waste retrieval system in tanks 241-AP-102 and 241-AP-104. Generally, this includes removal of existing equipment, installation of new equipment, and construction of new ancillary equipment and buildings. Tanks 241-AP-102 and 241-AP-104 will provide waste feed for immobilization into a low-activity waste (LAW) product (i.e., glass logs). The total effective dose equivalent (TEDE) to the offsite maximally exposed individual (MEI) from the construction activities is 0.045 millirem per year. The unabated TEDE to the offsite MEI from operation of the mixer pumps is 0.042 millirem per year.
Transmission over UWB channels with OFDM system using LDPC coding
NASA Astrophysics Data System (ADS)
Dziwoki, Grzegorz; Kucharczyk, Marcin; Sulek, Wojciech
2009-06-01
A hostile wireless environment requires the use of sophisticated signal processing methods. The paper concerns Ultra Wideband (UWB) transmission over Personal Area Networks (PAN), including the MB-OFDM specification of the physical layer. In the presented work, the transmission system with OFDM modulation was connected with an LDPC encoder/decoder. Additionally, the frame and bit error rates (FER and BER) of the system were decreased by using results from the LDPC decoder in a kind of turbo equalization algorithm for better channel estimation. A computational block using an evolutionary strategy, from the genetic algorithms family, was also used in the presented system. It is placed after the SPA (Sum-Product Algorithm) decoder and is conditionally turned on in the decoding process. The result is increased effectiveness of the whole system, especially a lower FER. The system was tested with two types of LDPC codes, depending on the type of parity check matrix: randomly generated and constructed deterministically, optimized for a practical decoder architecture implemented in an FPGA device.
A translator writing system for microcomputer high-level languages and assemblers
NASA Technical Reports Server (NTRS)
Collins, W. R.; Knight, J. C.; Noonan, R. E.
1980-01-01
In order to implement high-level languages whenever possible, a translator writing system of advanced design was developed. It is intended for routine production use by many programmers working on different projects. As well as a fairly conventional parser generator, it includes a system for the rapid generation of table-driven code generators. The parser generator was developed from a prototype version. The translator writing system includes various tools for the management of the source text of a compiler under construction. In addition, it supplies various default source code sections so that its output is always compilable and executable. The system thereby encourages iterative enhancement as a development methodology by ensuring an executable program from the earliest stages of a compiler development project. The translator writing system includes a PASCAL/48 compiler, three assemblers, and two compilers for a subset of HAL/S.
Constructing binary black hole initial data with high mass ratios and spins
NASA Astrophysics Data System (ADS)
Ossokine, Serguei; Foucart, Francois; Pfeiffer, Harald; Szilagyi, Bela; Simulating Extreme Spacetimes Collaboration
2015-04-01
Binary black hole systems have now been successfully modelled in full numerical relativity by many groups. In order to explore high-mass-ratio (larger than 1:10), high-spin systems (above 0.9 of the maximal BH spin), we revisit the initial-data problem for binary black holes. The initial-data solver in the Spectral Einstein Code (SpEC) was not able to solve for such initial data reliably and robustly. I will present recent improvements to this solver, among them adaptive mesh refinement and control of motion of the center of mass of the binary, and will discuss the much larger region of parameter space this code can now address.
Epoxy resins in the construction industry.
Spee, Ton; Van Duivenbooden, Cor; Terwoert, Jeroen
2006-09-01
Epoxy resins are used as coatings, adhesives, and in wood and concrete repair. However, epoxy resins can be highly irritating to the skin and are strong sensitizers. Some hardeners are carcinogenic. Based on the results of earlier Dutch studies, an international project on "best practices" with epoxy products, the Epoxy Code, was started. Partners were from Denmark, Germany, the Netherlands, and the UK. The "Code" deals with substitution, safe working procedures, safer tools, and skin protection. The feasibility of an internationally agreed "ranking system" for the health risks of epoxy products was studied. Such a ranking system should inform the user of the harmfulness of different epoxies and stimulate research on less harmful products by product developers.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vine, E.
1990-11-01
As part of Lawrence Berkeley Laboratory's (LBL) technical assistance to the Sustainable City Project, compliance and enforcement activities related to local and state building codes for existing and new construction were evaluated in two case studies. The analysis of the City of San Francisco's Residential Energy Conservation Ordinance (RECO) showed that a limited, prescriptive energy conservation ordinance for existing residential construction can be enforced relatively easily with little administrative cost, and that compliance with such ordinances can be quite high. Compliance with the code was facilitated by extensive publicity, an informed public concerned with the cost of energy and knowledgeable about energy efficiency, the threat of punishment (Order of Abatement), the use of private inspectors, and training workshops for City and private inspectors. The analysis of California's Title 24 Standards for new residential and commercial construction showed that enforcement of this type of code for many climate zones is more complex and requires extensive administrative support for education and training of inspectors, architects, engineers, and builders. Under this code, prescriptive and performance approaches for compliance are permitted, resulting in the demand for alternative methods of enforcement: technical assistance, plan review, field inspection, and computer analysis. In contrast to existing construction, building design and new materials and construction practices are of critical importance in new construction, creating a need for extensive technical assistance and extensive interaction between enforcement personnel and the building community. Compliance problems associated with building design and installation did occur in both residential and nonresidential buildings. 12 refs., 5 tabs.
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
Building Science Corporation (BSC) worked directly with the David Weekley Homes - Houston division to develop a cost-effective design for moving the HVAC system into conditioned space. In addition, BSC conducted energy analysis to calculate the most economical strategy for increasing the energy performance of future production houses in preparation for the upcoming code changes in 2015. The following research questions were addressed by this research project: 1. What is the most cost effective, best performing and most easily replicable method of locating ducts inside conditioned space for a hot-humid production home builder that constructs one and two story single-family detached residences? 2. What is a cost effective and practical method of achieving 50% source energy savings vs. the 2006 International Energy Conservation Code for a hot-humid production builder? 3. How accurate are the pre-construction whole house cost estimates compared to confirmed post construction actual cost? BSC and the builder developed a duct design strategy that employs a system of dropped ceilings and attic coffers for moving the ductwork from the vented attic to conditioned space. The furnace has been moved to either a mechanical closet in the conditioned living space or a coffered space in the attic.
Adopting a code requiring radon-resistant new construction (RRNC) in Decatur, Alabama, took months of effort by four people. Their actions demonstrate the influence that passionate residents can have on reversing a city council’s direction.
RELAP5-3D Resolution of Known Restart/Backup Issues
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mesina, George L.; Anderson, Nolan A.
2014-12-01
The state-of-the-art nuclear reactor system safety analysis computer program developed at the Idaho National Laboratory (INL), RELAP5-3D, continues to adapt to changes in computer hardware and software and to develop to meet the ever-expanding needs of the nuclear industry. To continue at the forefront, code testing must evolve with both code and industry developments, and it must work correctly. To best ensure this, the processes of Software Verification and Validation (V&V) are applied. Verification compares coding against its documented algorithms and equations and compares its calculations against analytical solutions and the method of manufactured solutions. A form of this, sequential verification, checks code specifications against coding only when originally written, then applies regression testing, which compares code calculations between consecutive updates or versions on a set of test cases to check that the performance does not change. A sequential verification testing system was specially constructed for RELAP5-3D to both detect errors with extreme accuracy and cover all nuclear-plant-relevant code features. Detection is provided through a "verification file" that records double precision sums of key variables. Coverage is provided by a test suite of input decks that exercise code features and capabilities necessary to model a nuclear power plant. A matrix of test features and short-running cases that exercise them is presented. This testing system is used to test base cases (called null testing) as well as restart and backup cases. It can test RELAP5-3D performance in both standalone and coupled (through PVM to other codes) runs. Application of verification testing revealed numerous restart and backup issues in both standalone and coupled modes. This document reports the resolution of these issues.
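The "verification file" mechanism described above can be illustrated with a minimal sketch: record double-precision sums of key solution variables for a baseline run, then compare them against a later run during regression testing. Variable names and the zero tolerance here are hypothetical, not RELAP5-3D's actual format.

```python
def verification_record(state):
    """One 'verification file' entry: double-precision sums of key variables."""
    return {name: float(sum(values)) for name, values in state.items()}

def regression_check(baseline, current, tol=0.0):
    """Compare records between consecutive code versions; by default any drift,
    even in the last bit, flags a behavior change."""
    return all(abs(baseline[k] - current[k]) <= tol for k in baseline)

# Baseline and candidate runs over the same (toy) test case.
base_run = verification_record({"pressure": [1.0e5, 1.2e5], "temperature": [300.0, 305.0]})
new_run = verification_record({"pressure": [1.0e5, 1.2e5], "temperature": [300.0, 305.0]})
ok = regression_check(base_run, new_run)
```

Summing over the whole solution vector keeps the record tiny while still detecting almost any change in the computation, which is what makes this style of null/restart/backup testing cheap to run after every update.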
Federal Register 2010, 2011, 2012, 2013, 2014
2013-03-26
... for Residential Construction in High Wind Regions. ICC 700: National Green Building Standard The..., coordinated, and necessary to regulate the built environment. Federal agencies frequently use these codes and... International Codes and Standards consist of the following: ICC Codes International Building Code. International...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-04-16
... for Residential Construction in High Wind Areas. ICC 700: National Green Building Standard. The... Codes and Standards that are comprehensive, coordinated, and necessary to regulate the built environment... International Codes and Standards consist of the following: ICC Codes International Building Code. International...
User's manual for the Heat Pipe Space Radiator design and analysis Code (HEPSPARC)
NASA Technical Reports Server (NTRS)
Hainley, Donald C.
1991-01-01
A heat pipe space radiator code (HEPSPARC) was written for the NASA Lewis Research Center and is used for the design and analysis of a radiator that is constructed from a pumped fluid loop that transfers heat to the evaporative section of heat pipes. This manual is designed to familiarize the user with this new code and to serve as a reference for its use. This manual documents the completed work and is intended to be the first step towards verification of the HEPSPARC code. Details are furnished to provide a description of all the requirements and variables used in the design and analysis of a combined pumped loop/heat pipe radiator system. A description of the subroutines used in the program is furnished for those interested in understanding its detailed workings.
Interactive three-dimensional visualization and creation of geometries for Monte Carlo calculations
NASA Astrophysics Data System (ADS)
Theis, C.; Buchegger, K. H.; Brugger, M.; Forkel-Wirth, D.; Roesler, S.; Vincke, H.
2006-06-01
The implementation of three-dimensional geometries for the simulation of radiation transport problems is a very time-consuming task. Each particle transport code supplies its own scripting language and syntax for creating the geometries. All of them are based on the Constructive Solid Geometry scheme requiring textual description. This makes the creation a tedious and error-prone task, which is especially hard to master for novice users. The Monte Carlo code FLUKA comes with built-in support for creating two-dimensional cross-sections through the geometry and FLUKACAD, a custom-built converter to the commercial Computer Aided Design package AutoCAD, exists for 3D visualization. For other codes, like MCNPX, a couple of different tools are available, but they are often specifically tailored to the particle transport code and its approach used for implementing geometries. Complex constructive solid modeling usually requires very fast and expensive special purpose hardware, which is not widely available. In this paper SimpleGeo is presented, which is an implementation of a generic versatile interactive geometry modeler using off-the-shelf hardware. It is running on Windows, with a Linux version currently under preparation. This paper describes its functionality, which allows for rapid interactive visualization as well as generation of three-dimensional geometries, and also discusses critical issues regarding common CAD systems.
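The Constructive Solid Geometry scheme mentioned above can be sketched generically: primitives are point-membership predicates, and solids are built by boolean combination. This is an illustration of the CSG idea only, not code from FLUKA, MCNPX, or SimpleGeo.

```python
def sphere(cx, cy, cz, r):
    """A primitive solid, represented as a point-membership predicate."""
    return lambda p: (p[0] - cx) ** 2 + (p[1] - cy) ** 2 + (p[2] - cz) ** 2 <= r * r

# Boolean operators combine predicates into more complex solids.
def union(a, b):        return lambda p: a(p) or b(p)
def intersection(a, b): return lambda p: a(p) and b(p)
def difference(a, b):   return lambda p: a(p) and not b(p)

# A hollow shell: outer sphere of radius 2 minus inner sphere of radius 1.
shell = difference(sphere(0, 0, 0, 2.0), sphere(0, 0, 0, 1.0))
```

Particle transport codes express essentially this structure in their own textual input syntax; an interactive modeler such as the one described replaces the error-prone textual step with direct 3D manipulation of the same boolean tree.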
An Advanced N -body Model for Interacting Multiple Stellar Systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brož, Miroslav
We construct an advanced model for interacting multiple stellar systems in which we compute all trajectories with a numerical N -body integrator, namely the Bulirsch–Stoer from the SWIFT package. We can then derive various observables: astrometric positions, radial velocities, minima timings (TTVs), eclipse durations, interferometric visibilities, closure phases, synthetic spectra, spectral energy distribution, and even complete light curves. We use a modified version of the Wilson–Devinney code for the latter, in which the instantaneous true phase and inclination of the eclipsing binary are governed by the N -body integration. If all of these types of observations are at one's disposal, a joint χ² metric and an optimization algorithm (a simplex or simulated annealing) allow one to search for a global minimum and construct very robust models of stellar systems. At the same time, our N -body model is free from artifacts that may arise if mutual gravitational interactions among all components are not self-consistently accounted for. Finally, we present a number of examples showing dynamical effects that can be studied with our code and we discuss how systematic errors may affect the results (and how to prevent this from happening).
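The joint fitting step can be sketched as follows: each observation type contributes its own chi-squared term, all sharing one parameter vector, and an optimizer searches the combined surface. The toy models, data, and one-parameter grid search (standing in for the simplex or simulated annealing) are illustrative assumptions, not the paper's actual setup.

```python
def chi2(model, data, sigma):
    """Standard chi-squared of one observation type."""
    return sum(((m - d) / s) ** 2 for m, d, s in zip(model, data, sigma))

def joint_chi2(period):
    """Toy joint metric: two observables (radial velocities and minima timings)
    both depend on the same shared parameter, here a single 'period'."""
    rv_model = [period * t for t in (1.0, 2.0)]
    ttv_model = [period * 0.5 * t for t in (1.0, 2.0)]
    return (chi2(rv_model, [2.0, 4.0], [0.1, 0.1]) +
            chi2(ttv_model, [1.0, 2.0], [0.1, 0.1]))

# Coarse grid search standing in for the simplex / simulated-annealing step.
best_chi2, best_period = min((joint_chi2(p), p) for p in [1.8, 1.9, 2.0, 2.1, 2.2])
```

Because every data type constrains the same parameters, a solution that fits radial velocities but violates the eclipse timings is penalized, which is what makes the joint metric more robust than fitting each observable separately.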
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brown, S.A.
In a computing landscape which has a plethora of different hardware architectures and supporting software systems ranging from compilers to operating systems, there is an obvious and strong need for a philosophy of software development that lends itself to the design and construction of portable code systems. The current efforts to standardize software bear witness to this need. SABrE is an effort to implement a software development environment which is itself portable and promotes the design and construction of portable applications. SABrE does not include such important tools as editors and compilers. Well-built tools of that kind are readily available across virtually all computer platforms. The areas that SABrE addresses are at a higher level, involving issues such as data portability, portable inter-process communication, and graphics. These blocks of functionality have particular significance to the kind of code development done at LLNL. That is partly why the general computing community has not supplied us with these tools already. This is another key feature of the software development environments which we must recognize. The general computing community cannot and should not be expected to produce all of the tools which we require.
Type-Separated Bytecode - Its Construction and Evaluation
NASA Astrophysics Data System (ADS)
Adler, Philipp; Amme, Wolfram
A lot of constrained systems still use interpreters to run mobile applications written in Java. These interpreters demand only a few resources. On the other hand, it is difficult to apply optimizations during the runtime of the application. Annotations could be used to achieve a simpler and faster code analysis, which would allow optimizations even for interpreters on constrained devices. Unfortunately, there is no viable way of transporting annotations to, and verifying them at, the code consumer. In this paper we present type-separated bytecode as an intermediate representation which allows annotations to be safely transported as type-extensions. We have implemented several versions of this system and show that it is possible to obtain a performance comparable to Java Bytecode, even though we use a type-separated system with annotations.
NASA Technical Reports Server (NTRS)
Tighe, Michael F.
1986-01-01
Intermetrics' experience is that the Ada package construct, which allows separation of specification and implementation, allows specification of a CAIS that is transportable across varying hardware and software bases. Additionally, the CAIS is an excellent basis for providing operating system functionality to Ada applications. By allowing the Byron APSE to be moved easily from system to system without significant re-writes of the underlying code, Ada and the CAIS provide portability as well as transparency to change at the application operating system interface level.
Marking parts to aid robot vision
NASA Technical Reports Server (NTRS)
Bales, J. W.; Barker, L. K.
1981-01-01
The premarking of parts for subsequent identification by a robot vision system appears to be beneficial as an aid in the automation of certain tasks such as construction in space. A simple, color coded marking system is presented which allows a computer vision system to locate an object, calculate its orientation, and determine its identity. Such a system has the potential to operate accurately, and because the computer shape analysis problem has been simplified, it has the ability to operate in real time.
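A minimal sketch of the color-coded marking idea: an ordered pair of colored dots encodes the part's identity, and the vector between the dot centroids gives its in-plane orientation. The two-dot layout, color pairs, and part names here are hypothetical illustrations, not the paper's actual marking scheme.

```python
import math

# Hypothetical lookup table: ordered colour pair -> part identity.
MARK_CODES = {("red", "green"): "strut-A", ("red", "blue"): "strut-B"}

def read_marker(dot1, dot2):
    """Given two detected dots as (colour, x, y), return identity and orientation."""
    (c1, x1, y1), (c2, x2, y2) = dot1, dot2
    identity = MARK_CODES.get((c1, c2))                 # colour code -> identity
    angle = math.degrees(math.atan2(y2 - y1, x2 - x1))  # in-plane orientation, degrees
    return identity, angle

identity, angle = read_marker(("red", 0.0, 0.0), ("green", 1.0, 1.0))
```

Reducing the vision task to locating a few colored blobs is what lets such a system run in real time: the expensive general shape-analysis problem is replaced by table lookup and one arctangent.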
24 CFR 92.251 - Property standards.
Code of Federal Regulations, 2011 CFR
2011-04-01
..., as applicable, one of three model codes: Uniform Building Code (ICBO), National Building Code (BOCA), Standard (Southern) Building Code (SBCCI); or the Council of American Building Officials (CABO) one or two...) Housing that is constructed or rehabilitated with HOME funds must meet all applicable local codes...
24 CFR 92.251 - Property standards.
Code of Federal Regulations, 2013 CFR
2013-04-01
..., as applicable, one of three model codes: Uniform Building Code (ICBO), National Building Code (BOCA), Standard (Southern) Building Code (SBCCI); or the Council of American Building Officials (CABO) one or two...) Housing that is constructed or rehabilitated with HOME funds must meet all applicable local codes...
41 CFR 128-1.8005 - Seismic safety standards.
Code of Federal Regulations, 2013 CFR
2013-07-01
... the model building codes that the Interagency Committee on Seismic Safety in Construction (ICSSC...) Uniform Building Code (UBC); (2) The 1992 Supplement to the Building Officials and Code Administrators International (BOCA) National Building Code (NBC); and (3) The 1992 Amendments to the Southern Building Code...
41 CFR 128-1.8005 - Seismic safety standards.
Code of Federal Regulations, 2012 CFR
2012-01-01
... the model building codes that the Interagency Committee on Seismic Safety in Construction (ICSSC...) Uniform Building Code (UBC); (2) The 1992 Supplement to the Building Officials and Code Administrators International (BOCA) National Building Code (NBC); and (3) The 1992 Amendments to the Southern Building Code...
41 CFR 128-1.8005 - Seismic safety standards.
Code of Federal Regulations, 2014 CFR
2014-01-01
... the model building codes that the Interagency Committee on Seismic Safety in Construction (ICSSC...) Uniform Building Code (UBC); (2) The 1992 Supplement to the Building Officials and Code Administrators International (BOCA) National Building Code (NBC); and (3) The 1992 Amendments to the Southern Building Code...
41 CFR 128-1.8005 - Seismic safety standards.
Code of Federal Regulations, 2011 CFR
2011-01-01
... the model building codes that the Interagency Committee on Seismic Safety in Construction (ICSSC...) Uniform Building Code (UBC); (2) The 1992 Supplement to the Building Officials and Code Administrators International (BOCA) National Building Code (NBC); and (3) The 1992 Amendments to the Southern Building Code...
24 CFR 92.251 - Property standards.
Code of Federal Regulations, 2012 CFR
2012-04-01
..., as applicable, one of three model codes: Uniform Building Code (ICBO), National Building Code (BOCA), Standard (Southern) Building Code (SBCCI); or the Council of American Building Officials (CABO) one or two...) Housing that is constructed or rehabilitated with HOME funds must meet all applicable local codes...
24 CFR 92.251 - Property standards.
Code of Federal Regulations, 2010 CFR
2010-04-01
..., as applicable, one of three model codes: Uniform Building Code (ICBO), National Building Code (BOCA), Standard (Southern) Building Code (SBCCI); or the Council of American Building Officials (CABO) one or two...) Housing that is constructed or rehabilitated with HOME funds must meet all applicable local codes...
NASA Astrophysics Data System (ADS)
Wuttke, Manfred W.
2017-04-01
At LIAG, we use numerical models to develop and enhance understanding of coupled transport processes and to predict the dynamics of the system under consideration. Topics include geothermal heat utilization, subrosion processes, and spontaneous underground coal fires. Although the details make it inconvenient, if not impossible, to apply a single code implementation to all systems, their investigations follow similar paths: they all depend on the solution of coupled transport equations. We thus saw a need for a modular code system with open access for the various communities, to maximize the shared synergistic effects. To this purpose we develop the oops! (open object-oriented parallel solutions) toolkit, a C++ class library for the numerical solution of mathematical models of coupled thermal, hydraulic and chemical processes. This is used to develop problem-specific libraries like acme (amendable coal-fire modeling exercise), a class library for the numerical simulation of coal fires, and applications like kobra (Kohlebrand, German for coal fire), a numerical simulation code for standard coal-fire models. The basic principle of the oops! code system is the provision of data types for the description of space- and time-dependent data fields, the description of terms of partial differential equations (PDEs), and their discretisation and solving methods. Coupling of different processes, each described by its particular PDE, is modeled by an automatic timescale-ordered operator-splitting technique. acme is a derived coal-fire-specific application library depending on oops!. If specific functionalities of general interest are implemented and have been tested, they will be assimilated into the main oops! library. Interfaces to external pre- and post-processing tools are easily implemented. Thus a construction kit is formed which can be arbitrarily amended.
With the kobra application constructed with acme, we study the processes and propagation of shallow coal seam fires, in particular in Xinjiang, China, and analyze and interpret results from lab experiments.
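As a rough illustration of the timescale-ordered operator-splitting idea (this is not the actual oops!/acme API, whose types and names are not given in the abstract), the following Python sketch advances two coupled processes by sub-cycling the faster one inside each step of the slower:

```python
import numpy as np

# Illustrative sketch only: two coupled processes (1-D heat diffusion and a
# first-order sink term) advanced by sequential operator splitting, with the
# faster process sub-cycled inside one step of the slower one.

def diffuse(T, dt, alpha=1.0, dx=1.0):
    """One explicit step of 1-D heat diffusion (periodic boundaries)."""
    lap = np.roll(T, 1) - 2.0 * T + np.roll(T, -1)
    return T + dt * alpha * lap / dx**2

def react(T, dt, k=0.1):
    """One step of a first-order loss/reaction term."""
    return T - dt * k * T

def split_step(T, dt_slow, dt_fast):
    """Timescale-ordered splitting: sub-cycle the fast process, then react."""
    n_sub = max(1, int(round(dt_slow / dt_fast)))
    for _ in range(n_sub):
        T = diffuse(T, dt_slow / n_sub)
    return react(T, dt_slow)

T = np.zeros(50)
T[25] = 100.0          # initial hot spot
for _ in range(100):
    T = split_step(T, dt_slow=0.1, dt_fast=0.02)
```

The same pattern generalizes to any set of processes once each one exposes a "advance by dt" operation, which is essentially what a PDE-term class library has to provide.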
Analysis of Effectiveness of Phoenix Entry Reaction Control System
NASA Technical Reports Server (NTRS)
Dyakonov, Artem A.; Glass, Christopher E.; Desai, Prasun N.; VanNorman, John W.
2008-01-01
Interaction between the external flowfield and the reaction control system (RCS) thruster plumes of the Phoenix capsule during entry has been investigated. The analysis covered the rarefied, transitional, hypersonic and supersonic flight regimes. Performance of the pitch, yaw and roll control authority channels was evaluated, with specific emphasis on the yaw channel due to its low nominal yaw control authority. Because Phoenix had already been constructed and its RCS could not be modified before flight, an assessment of RCS efficacy along the trajectory was needed to determine possible issues and to make necessary software changes. Effectiveness of the system in the various regimes was evaluated using a hybrid DSMC-CFD technique based on the DSMC Analysis Code (DAC) and the General Aerodynamic Simulation Program (GASP), the LAURA (Langley Aerothermal Upwind Relaxation Algorithm) code, and the FUN3D (Fully Unstructured 3D) code. Results of the analysis at hypersonic and supersonic conditions suggest a significant aero-RCS interference which reduced the efficacy of the thrusters and could likely produce control reversal. Very little aero-RCS interference was predicted in the rarefied and transitional regimes. A recommendation was made to the project to widen the controller system deadbands to minimize (if not eliminate) the use of RCS thrusters through the hypersonic and supersonic flight regimes, where their performance would be uncertain.
Safety and health in the construction of fixed offshore installations in the petroleum industry
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1981-01-01
A meeting convened by the ILO (International Labor Office) on safety problems in the offshore petroleum industry recommended the preparation of a code of practice setting out standards for safety and health during the construction of fixed offshore installations. Such a code, to be prepared by the ILO in co-operation with other bodies, including the Inter-Governmental Maritime Consultative Organisation (IMCO), was to take into consideration existing standards applicable to offshore construction activities and to supplement the ILO codes of practice on safety and health in building and civil engineering work, shipbuilding and ship repairing. (Copyright (c) International Labour Organisation 1981.)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wise, B.K.; Hughes, K.R.; Danko, S.L.
1994-07-01
This report was prepared for the US Department of Energy (DOE) Office of Codes and Standards by the Pacific Northwest Laboratory (PNL) through its Building Energy Standards Program (BESP). The purpose of this task was to identify demand-side management (DSM) strategies for new construction that utilities have adopted or developed to promote energy-efficient design and construction. PNL conducted a survey of utilities and used the information gathered to extrapolate lessons learned and to identify evolving trends in utility new-construction DSM programs. The ultimate goal of the task is to identify opportunities where states might work collaboratively with utilities to promote the adoption, implementation, and enforcement of energy-efficient building energy codes.
Design of the Bus Interface Unit for the Distributed Processor/Memory System.
1976-12-01
microroutine flowchart developed. Once this had been done, a high-speed, flexible microprocessor that would be adaptable to a hardware...routine) was translated into microcode and provide the mnemonic code and flowchart, Chapter V summarizes and discusses actual system construction...Fig. 11. This diagram shows that the BIU is driven by interrupt stimuli which select the beginning address of the appropriate microroutine rather
Data-Driven Hint Generation in Vast Solution Spaces: A Self-Improving Python Programming Tutor
ERIC Educational Resources Information Center
Rivers, Kelly; Koedinger, Kenneth R.
2017-01-01
To provide personalized help to students who are working on code-writing problems, we introduce a data-driven tutoring system, ITAP (Intelligent Teaching Assistant for Programming). ITAP uses state abstraction, path construction, and state reification to automatically generate personalized hints for students, even when given states that have not…
Validation of NASA Thermal Ice Protection Computer Codes. Part 3; The Validation of Antice
NASA Technical Reports Server (NTRS)
Al-Khalil, Kamel M.; Horvath, Charles; Miller, Dean R.; Wright, William B.
2001-01-01
An experimental program was generated by the Icing Technology Branch at NASA Glenn Research Center to validate two ice protection simulation codes: (1) LEWICE/Thermal for transient electrothermal de-icing and anti-icing simulations, and (2) ANTICE for steady-state hot-gas and electrothermal anti-icing simulations. An electrothermal ice protection system was designed and constructed integral to a 36 inch chord NACA0012 airfoil. The model was fully instrumented with thermocouples, RTDs, and heat flux gages. Tests were conducted at several icing environmental conditions during a two-week period at the NASA Glenn Icing Research Tunnel. Experimental results of running-wet and evaporative cases were compared to the ANTICE computer code predictions and are presented in this paper.
Local non-Calderbank-Shor-Steane quantum error-correcting code on a three-dimensional lattice
NASA Astrophysics Data System (ADS)
Kim, Isaac H.
2011-05-01
We present a family of non-Calderbank-Shor-Steane quantum error-correcting codes consisting of geometrically local stabilizer generators on a 3D lattice. We study the Hamiltonian constructed from the ferromagnetic interaction of an overcomplete set of local stabilizer generators. The degenerate ground state of the system is characterized by a quantum error-correcting code whose number of encoded qubits is equal to the second Betti number of the manifold. These models (i) have solely local interactions; (ii) admit a strong-weak duality relation with an Ising model on a dual lattice; (iii) have topological order in the ground state, some of which survives at finite temperature; and (iv) behave as classical memory at finite temperature.
Design of agricultural product quality safety retrospective supervision system of Jiangsu province
NASA Astrophysics Data System (ADS)
Wang, Kun
2017-08-01
In stores and supermarkets, consumers can trace agricultural products through an electronic card, querying their origin, planting, processing, packaging, testing and other important information and uncovering any problems; when quality and safety issues are found, responsibility for the problem can be identified. This paper designs a retroactive supervision system for the quality and safety of agricultural products in Jiangsu Province. Based on an analysis of agricultural production and business processes, the goal of constructing the Jiangsu agricultural product quality and safety traceability system is established, and the specific functional and non-functional requirements of the traceability system are analyzed, setting the target for its concrete construction. The design of the quality and safety traceability system in Jiangsu Province comprises the overall design, the trace code design and the system function module design.
Binary neutron stars with arbitrary spins in numerical relativity
NASA Astrophysics Data System (ADS)
Tacik, Nick; Foucart, Francois; Pfeiffer, Harald P.; Haas, Roland; Ossokine, Serguei; Kaplan, Jeff; Muhlberger, Curran; Duez, Matt D.; Kidder, Lawrence E.; Scheel, Mark A.; Szilágyi, Béla
2015-12-01
We present a code to construct initial data for binary neutron star systems in which the stars are rotating. Our code, based on a formalism developed by Tichy, allows for arbitrary rotation axes of the neutron stars and is able to achieve rotation rates near rotational breakup. We compute the neutron star angular momentum through quasilocal angular momentum integrals. When constructing irrotational binary neutron stars, we find a very small residual dimensionless spin of ˜2 ×10-4 . Evolutions of rotating neutron star binaries show that the magnitude of the stars' angular momentum is conserved, and that the spin and orbit precession of the stars is well described by post-Newtonian approximation. We demonstrate that orbital eccentricity of the binary neutron stars can be controlled to ˜0.1 % . The neutron stars show quasinormal mode oscillations at an amplitude which increases with the rotation rate of the stars.
The digital code driven autonomous synthesis of ibuprofen automated in a 3D-printer-based robot.
Kitson, Philip J; Glatzel, Stefan; Cronin, Leroy
2016-01-01
An automated synthesis robot was constructed by modifying an open source 3D printing platform. The resulting automated system was used to 3D print reaction vessels (reactionware) of differing internal volumes using polypropylene feedstock via a fused deposition modeling 3D printing approach and subsequently make use of these fabricated vessels to synthesize the nonsteroidal anti-inflammatory drug ibuprofen via a consecutive one-pot three-step approach. The synthesis of ibuprofen could be achieved on different scales simply by adjusting the parameters in the robot control software. The software for controlling the synthesis robot was written in the python programming language and hard-coded for the synthesis of ibuprofen by the method described, opening possibilities for the sharing of validated synthetic 'programs' which can run on similar low cost, user-constructed robotic platforms towards an 'open-source' regime in the area of chemical synthesis.
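As a purely hypothetical sketch of how such a hard-coded, shareable synthesis "program" might be organized in Python (the step names, reagent labels, volumes and timings below are invented for illustration and are not the authors' actual route or control software):

```python
# Hypothetical sketch only: reagent names, volumes and timings are invented,
# and this is not the paper's published route or robot software.

def program(scale=1.0):
    """A one-pot, three-step sequence as an ordered list of robot actions;
    changing `scale` adjusts the synthesis scale, as the abstract describes."""
    return [
        ("dispense", {"reagent": "reagent_A", "ml": 2.0 * scale}),
        ("dispense", {"reagent": "reagent_B", "ml": 1.5 * scale}),
        ("wait",     {"minutes": 30}),   # step 1 of the three-step approach
        ("dispense", {"reagent": "reagent_C", "ml": 1.0 * scale}),
        ("wait",     {"minutes": 20}),   # step 2
        ("dispense", {"reagent": "reagent_D", "ml": 1.2 * scale}),
        ("wait",     {"minutes": 45}),   # step 3
    ]

def run(steps, execute=lambda cmd: None):
    """Dispatch each action to a robot-driver callback and keep a log."""
    log = []
    for action, params in steps:
        execute((action, params))
        log.append((action, params))
    return log

log = run(program(scale=2.0))
```

Structuring the program as data (a list of actions) rather than inline code is what makes such validated 'programs' easy to share and re-run on similar user-constructed platforms.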
Constructing Noise-Invariant Representations of Sound in the Auditory Pathway
Rabinowitz, Neil C.; Willmore, Ben D. B.; King, Andrew J.; Schnupp, Jan W. H.
2013-01-01
Identifying behaviorally relevant sounds in the presence of background noise is one of the most important and poorly understood challenges faced by the auditory system. An elegant solution to this problem would be for the auditory system to represent sounds in a noise-invariant fashion. Since a major effect of background noise is to alter the statistics of the sounds reaching the ear, noise-invariant representations could be promoted by neurons adapting to stimulus statistics. Here we investigated the extent of neuronal adaptation to the mean and contrast of auditory stimulation as one ascends the auditory pathway. We measured these forms of adaptation by presenting complex synthetic and natural sounds, recording neuronal responses in the inferior colliculus and primary fields of the auditory cortex of anaesthetized ferrets, and comparing these responses with a sophisticated model of the auditory nerve. We find that the strength of both forms of adaptation increases as one ascends the auditory pathway. To investigate whether this adaptation to stimulus statistics contributes to the construction of noise-invariant sound representations, we also presented complex, natural sounds embedded in stationary noise, and used a decoding approach to assess the noise tolerance of the neuronal population code. We find that the code for complex sounds in the periphery is affected more by the addition of noise than the cortical code. We also find that noise tolerance is correlated with adaptation to stimulus statistics, so that populations that show the strongest adaptation to stimulus statistics are also the most noise-tolerant. This suggests that the increase in adaptation to sound statistics from auditory nerve to midbrain to cortex is an important stage in the construction of noise-invariant sound representations in the higher auditory brain. PMID:24265596
Automatic Data Traffic Control on DSM Architecture
NASA Technical Reports Server (NTRS)
Frumkin, Michael; Jin, Hao-Qiang; Yan, Jerry; Kwak, Dochan (Technical Monitor)
2000-01-01
We study data traffic on distributed shared memory machines and conclude that data placement and grouping improve the performance of scientific codes. We present several methods which users can employ to improve data traffic in their codes. We report on the implementation of a tool which detects the code fragments causing data congestion and advises the user on improvements to data routing in these fragments. The capabilities of the tool include deduction of data alignment and affinity from the source code; detection of code constructs having abnormally high cache or TLB misses; and generation of data placement constructs. We demonstrate the capabilities of the tool in experiments with the NAS parallel benchmarks and with a simple computational fluid dynamics application, ARC3D.
Dynamic quality of service differentiation using fixed code weight in optical CDMA networks
NASA Astrophysics Data System (ADS)
Kakaee, Majid H.; Essa, Shawnim I.; Abd, Thanaa H.; Seyedzadeh, Saleh
2015-11-01
The emergence of network-driven applications, such as the internet, video conferencing, and online gaming, brings the need for network environments capable of providing diverse Quality of Service (QoS). In this paper, a new family of novel spreading sequences, called Multi-Service (MS) codes, has been constructed to support multiple services in an Optical Code-Division Multiple-Access (OCDMA) system. The proposed method uses a fixed weight for all services while reducing the interfering codewords for users requiring higher QoS. The performance of the proposed code is demonstrated using mathematical analysis. It is shown that the total number of served users with a satisfactory BER of 10-9 is 82 using NB=2, while it is only 36 and 10 when NB=3 and 4, respectively. The developed MS code is compared with variable-weight codes such as Variable-Weight Khazani-Syed (VW-KS) and Multi-Weight Random-Diagonal (MW-RD). Different numbers of basic users (NB) are used to support triple-play services (audio, data and video) with different QoS requirements. Furthermore, with reference to BERs of 10-12, 10-9, and 10-3 for video, data and audio, respectively, the system can support up to 45 total users. Hence, the results show that the technique provides relative QoS differentiation: a lower number of basic users can support a larger number of subscribers, as well as better performance in terms of an acceptable BER of 10-9 at fixed code weight.
Tail Biting Trellis Representation of Codes: Decoding and Construction
NASA Technical Reports Server (NTRS)
Shao, Rose Y.; Lin, Shu; Fossorier, Marc
1999-01-01
This paper presents two new iterative algorithms for decoding linear codes based on their tail biting trellises; one is unidirectional and the other is bidirectional. Both algorithms are computationally efficient and achieve virtually optimum error performance with a small number of decoding iterations. They outperform all previous suboptimal decoding algorithms. The bidirectional algorithm also reduces decoding delay. Also presented in the paper is a method for constructing tail biting trellises for linear block codes.
Construction of Protograph LDPC Codes with Linear Minimum Distance
NASA Technical Reports Server (NTRS)
Divsalar, Dariush; Dolinar, Sam; Jones, Christopher
2006-01-01
A construction method for protograph-based LDPC codes that simultaneously achieve low iterative decoding threshold and linear minimum distance is proposed. We start with a high-rate protograph LDPC code with variable node degrees of at least 3. Lower rate codes are obtained by splitting check nodes and connecting them by degree-2 nodes. This guarantees the linear minimum distance property for the lower-rate codes. Excluding checks connected to degree-1 nodes, we show that the number of degree-2 nodes should be at most one less than the number of checks for the protograph LDPC code to have linear minimum distance. Iterative decoding thresholds are obtained by using the reciprocal channel approximation. Thresholds are lowered by using either precoding or at least one very high-degree node in the base protograph. A family of high- to low-rate codes with minimum distance linearly increasing in block size and with capacity-approaching performance thresholds is presented. FPGA simulation results for a few example codes show that the proposed codes perform as predicted.
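The check-splitting step can be sketched concretely: a row of the protograph base matrix is split into two rows that partition its edges, and the two new checks are reconnected through a new degree-2 variable node. The base matrix below is made up for illustration and is not one of the paper's codes; entries count protograph edges.

```python
# Illustrative sketch of check-node splitting on a protograph base matrix.
# Splitting a check preserves every variable node's degree and adds one new
# degree-2 variable node joining the two halves, lowering the code rate.

def split_check(B, r):
    """Split check (row) r of base matrix B into two checks joined by a
    new degree-2 variable node; returns the new base matrix."""
    row = B[r]
    half = [e // 2 for e in row]                    # partition the edges
    rest = [e - h for e, h in zip(row, half)]       # ... of check r
    others = [b + [0] for i, b in enumerate(B) if i != r]
    # each new check gets one edge to the new degree-2 variable node
    return others + [half + [1], rest + [1]]

B = [[3, 3, 1],
     [1, 2, 3]]
B2 = split_check(B, 0)   # one more check, one more (degree-2) variable node
```

Repeating the split lowers the design rate step by step while, as the abstract argues, keeping the linear-minimum-distance property as long as the degree-2 nodes stay at most one fewer than the checks.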
DOE Office of Scientific and Technical Information (OSTI.GOV)
Prowant, Matthew S.; Denslow, Kayte M.; Moran, Traci L.
2016-09-21
The desire to use high-density polyethylene (HDPE) piping in buried Class 3 service and cooling water systems in nuclear power plants is primarily motivated by the material's high resistance to corrosion relative to that of steel and metal alloys. The rules for construction of Class 3 HDPE pressure piping systems were originally published in Code Case N-755 and were recently incorporated into the American Society of Mechanical Engineers Boiler and Pressure Vessel Code (ASME BPVC) Section III as Mandatory Appendix XXVI (2015 Edition). The requirements for HDPE examination are guided by criteria developed for metal pipe and are based on industry-led HDPE research or conservative calculations.
World's first SPB LNG carrier "POLAR EAGLE"
DOE Office of Scientific and Technical Information (OSTI.GOV)
Aoki, Eiji; Nakajima, Yoshiyuki; Yamada, Koichiro
1994-12-31
The world's first Self-supporting Prismatic-shape IMO type B (SPB) LNG carrier, named "POLAR EAGLE", was delivered to Phillips 66 Natural Gas Company and Marathon Oil Company in June 1993. The cargo containment system installed onboard the vessel, the SPB cargo containment system, was developed by Ishikawajima-Harima Heavy Industries Co., Ltd. (IHI) and fully complies with the IMO Gas Carrier Code for a type B independent tank. "POLAR EAGLE" was constructed in the authors' Aichi works and delivered 34 months after the contract of the vessel. Its performance was confirmed through various kinds of tests and inspections during construction of the vessel. Results of typical tests and inspections are introduced.
Photonic ququart logic assisted by the cavity-QED system.
Luo, Ming-Xing; Deng, Yun; Li, Hui-Ran; Ma, Song-Ya
2015-08-14
Universal quantum logic gates are important elements for a quantum computer. In contrast to previous constructions on qubit systems, we investigate the possibility of ququart systems (four-dimensional states) dependent on two degrees of freedom (DOFs) of photon systems. We propose some useful one-parameter four-dimensional quantum transformations for the construction of universal ququart logic gates. The interface between the spin of a photon and an electron spin confined in a quantum dot embedded in a microcavity is applied to build universal ququart logic gates on the photon system with two DOFs. Our elementary controlled-ququart gates cost no more than 8 CNOT gates in a qubit system, which is far less than the 104 CNOT gates required for a general four-qubit logic gate. The ququart logic is also used to generate useful hyperentanglements and hyperentanglement-assisted quantum error-correcting codes, which may be available in modern physical technology.
24 CFR 200.926c - Model code provisions for use in partially accepted code jurisdictions.
Code of Federal Regulations, 2010 CFR
2010-04-01
... Minimum Property Standards § 200.926c Model code provisions for use in partially accepted code... partially accepted, then the properties eligible for HUD benefits in that jurisdiction shall be constructed..., those portions of one of the model codes with which the property must comply. Schedule for Model Code...
24 CFR 200.926c - Model code provisions for use in partially accepted code jurisdictions.
Code of Federal Regulations, 2011 CFR
2011-04-01
... Minimum Property Standards § 200.926c Model code provisions for use in partially accepted code... partially accepted, then the properties eligible for HUD benefits in that jurisdiction shall be constructed..., those portions of one of the model codes with which the property must comply. Schedule for Model Code...
Understanding Mixed Code and Classroom Code-Switching: Myths and Realities
ERIC Educational Resources Information Center
Li, David C. S.
2008-01-01
Background: Cantonese-English mixed code is ubiquitous in Hong Kong society, and yet using mixed code is widely perceived as improper. This paper presents evidence of mixed code being socially constructed as bad language behavior. In the education domain, an EDB guideline bans mixed code in the classroom. Teachers are encouraged to stick to…
Surface Modeling and Grid Generation of Orbital Sciences X34 Vehicle. Phase 1
NASA Technical Reports Server (NTRS)
Alter, Stephen J.
1997-01-01
The surface modeling and grid generation requirements, motivations, and methods used to develop Computational Fluid Dynamic volume grids for the X34-Phase 1 are presented. The requirements set forth by the Aerothermodynamics Branch at the NASA Langley Research Center serve as the basis for the final techniques used in the construction of all volume grids, including grids for parametric studies of the X34. The Integrated Computer Engineering and Manufacturing code for Computational Fluid Dynamics (ICEM/CFD), the Grid Generation code (GRIDGEN), the Three-Dimensional Multi-block Advanced Grid Generation System (3DMAGGS) code, and Volume Grid Manipulator (VGM) code are used to enable the necessary surface modeling, surface grid generation, volume grid generation, and grid alterations, respectively. All volume grids generated for the X34, as outlined in this paper, were used for CFD simulations within the Aerothermodynamics Branch.
Sandia National Laboratories analysis code data base
NASA Astrophysics Data System (ADS)
Peterson, C. W.
1994-11-01
Sandia National Laboratories' mission is to solve important problems in the areas of national defense, energy security, environmental integrity, and industrial technology. The laboratories' strategy for accomplishing this mission is to conduct research to provide an understanding of the important physical phenomena underlying any problem, and then to construct validated computational models of the phenomena which can be used as tools to solve the problem. In the course of implementing this strategy, Sandia's technical staff has produced a wide variety of numerical problem-solving tools which they use regularly in the design, analysis, performance prediction, and optimization of Sandia components, systems, and manufacturing processes. This report provides the relevant technical and accessibility data on the numerical codes used at Sandia, including information on the technical competency or capability area that each code addresses, code 'ownership' and release status, and references describing the physical models and numerical implementation.
The VATES-Diamond as a Verifier's Best Friend
NASA Astrophysics Data System (ADS)
Glesner, Sabine; Bartels, Björn; Göthel, Thomas; Kleine, Moritz
Within a model-based software engineering process it needs to be ensured that properties of abstract specifications are preserved by transformations down to executable code. This is even more important in the area of safety-critical real-time systems where additionally non-functional properties are crucial. In the VATES project, we develop formal methods for the construction and verification of embedded systems. We follow a novel approach that allows us to formally relate abstract process algebraic specifications to their implementation in a compiler intermediate representation. The idea is to extract a low-level process algebraic description from the intermediate code and to formally relate it to previously developed abstract specifications. We apply this approach to a case study from the area of real-time operating systems and show that this approach has the potential to seamlessly integrate modeling, implementation, transformation and verification stages of embedded system development.
Achieving the Heisenberg limit in quantum metrology using quantum error correction.
Zhou, Sisi; Zhang, Mengzhen; Preskill, John; Jiang, Liang
2018-01-08
Quantum metrology has many important applications in science and technology, ranging from frequency spectroscopy to gravitational wave detection. Quantum mechanics imposes a fundamental limit on measurement precision, called the Heisenberg limit, which can be achieved for noiseless quantum systems, but is not achievable in general for systems subject to noise. Here we study how measurement precision can be enhanced through quantum error correction, a general method for protecting a quantum system from the damaging effects of noise. We find a necessary and sufficient condition for achieving the Heisenberg limit using quantum probes subject to Markovian noise, assuming that noiseless ancilla systems are available, and that fast, accurate quantum processing can be performed. When the sufficient condition is satisfied, a quantum error-correcting code can be constructed that suppresses the noise without obscuring the signal; the optimal code, achieving the best possible precision, can be found by solving a semidefinite program.
A Low Cost Remote Sensing System Using PC and Stereo Equipment
NASA Technical Reports Server (NTRS)
Campbell, Joel F.; Flood, Michael A.; Prasad, Narasimha S.; Hodson, Wade D.
2011-01-01
A system using a personal computer, speaker, and a microphone is used to detect objects, and make crude measurements using a carrier modulated by a pseudorandom noise (PN) code. This system can be constructed using a personal computer and audio equipment commonly found in the laboratory or at home, or more sophisticated equipment that can be purchased at reasonable cost. We demonstrate its value as an instructional tool for teaching concepts of remote sensing and digital signal processing.
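The measurement principle behind such a PC-and-audio setup can be sketched in a few lines: a bipolar pseudorandom noise (PN) sequence is "transmitted", and the echo delay is recovered from the peak of the cross-correlation. The sequence length, delay and noise level below are illustrative, not the paper's actual settings.

```python
import numpy as np

# Sketch of PN-code delay estimation by cross-correlation (illustrative
# parameters, not those of the paper's speaker/microphone system).

def m_sequence(n=31):
    """31-chip maximal-length sequence from a 5-bit LFSR (x^5 + x^3 + 1),
    mapped from {0,1} to bipolar {-1,+1} chips."""
    state = [1, 0, 0, 0, 0]
    out = []
    for _ in range(n):
        out.append(state[-1])
        fb = state[4] ^ state[2]
        state = [fb] + state[:-1]
    return np.array(out) * 2.0 - 1.0

rng = np.random.default_rng(0)
pn = m_sequence()
true_delay = 11
# simulated echo: delayed copy of the PN code plus additive noise
echo = np.roll(pn, true_delay) + 0.3 * rng.standard_normal(pn.size)

# circular cross-correlation; the peak index estimates the delay
corr = np.array([np.dot(echo, np.roll(pn, d)) for d in range(pn.size)])
est_delay = int(np.argmax(corr))
```

The sharp autocorrelation of a maximal-length sequence (peak N, off-peak -1) is what lets even a crude audio channel resolve the delay.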
FPGA based digital phase-coding quantum key distribution system
NASA Astrophysics Data System (ADS)
Lu, XiaoMing; Zhang, LiJun; Wang, YongGang; Chen, Wei; Huang, DaJun; Li, Deng; Wang, Shuang; He, DeYong; Yin, ZhenQiang; Zhou, Yu; Hui, Cong; Han, ZhengFu
2015-12-01
Quantum key distribution (QKD) is a technology with the potential capability to achieve information-theoretic security. Phase coding is an important approach to developing practical QKD systems in fiber channels. In order to improve the phase-coding modulation rate, we propose a new digital modulation method in this paper and construct a compact and robust prototype QKD system using currently available components in our lab to demonstrate the effectiveness of the method. The system was deployed in a laboratory environment over a 50 km fiber and continuously operated for 87 h without manual interaction. The quantum bit error rate (QBER) of the system was stable, with an average value of 3.22%, and the secure key generation rate was 8.91 kbps. Although the modulation rate of the photons in the demo system was only 200 MHz, limited by the Faraday-Michelson interferometer (FMI) structure, the proposed method and the field programmable gate array (FPGA) based electronics scheme have great potential for high-speed QKD systems with gigabit-per-second modulation rates.
Error control techniques for satellite and space communications
NASA Technical Reports Server (NTRS)
Costello, Daniel J., Jr.
1989-01-01
Two aspects of the work for NASA are examined: the construction of multi-dimensional phase modulation trellis codes and a performance analysis of these codes. A complete list of all the best trellis codes for use with phase modulation is given. LxMPSK signal constellations are included for M = 4, 8, and 16 and L = 1, 2, 3, and 4. Spectral efficiencies range from 1 bit/channel symbol (equivalent to rate 1/2 coded QPSK) to 3.75 bits/channel symbol (equivalent to rate 15/16 coded 16-PSK). The parity check polynomials, rotational invariance properties, free distance, path multiplicities, and coding gains are given for all codes. These codes are considered to be the best candidates for implementation in a high speed decoder for satellite transmission. The design of a hardware decoder for one of these codes, viz., the 16-state 3x8-PSK code with free distance 4.0 and coding gain 3.75 dB, is discussed. An exhaustive simulation study of the multi-dimensional phase modulation trellis codes is also presented. This study was motivated by the fact that the coding gains quoted for almost all codes found in the literature are in fact only asymptotic coding gains, i.e., the coding gain at very high signal-to-noise ratios (SNRs) or very low BER. These asymptotic coding gains can be obtained directly from a knowledge of the free distance of the code. On the other hand, real coding gains at BERs in the range of 10(exp -2) to 10(exp -6), where these codes are most likely to operate in a concatenated system, must be obtained by simulation.
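Since practical coding gains must come from simulation, the starting point of such a study is a Monte-Carlo estimate of the uncoded baseline against which gains are measured. A minimal sketch for uncoded QPSK (Gray-mapped, so each of the two rails behaves as BPSK; the Eb/N0 values and bit counts are illustrative, not those of the reported study):

```python
import numpy as np

# Monte-Carlo BER baseline for uncoded QPSK over an AWGN channel.
# Gray-mapped QPSK decomposes into two independent BPSK rails, so one
# rail suffices for the bit error rate.

rng = np.random.default_rng(1)

def qpsk_ber(ebn0_db, n_bits=200_000):
    """Estimate the uncoded QPSK bit error rate at a given Eb/N0 in dB."""
    ebn0 = 10.0 ** (ebn0_db / 10.0)
    bits = rng.integers(0, 2, n_bits)
    symbols = 2.0 * bits - 1.0                        # one BPSK rail
    noise = rng.standard_normal(n_bits) / np.sqrt(2.0 * ebn0)
    decided = (symbols + noise) > 0.0
    return np.mean(decided != bits)

ber = qpsk_ber(4.0)   # theory: Q(sqrt(2*Eb/N0)) ~ 1.25e-2 at 4 dB
```

Running the same channel through a trellis encoder/Viterbi decoder and comparing the Eb/N0 needed for a target BER against this baseline yields the real (not asymptotic) coding gain.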
Glynn, P.D.
1991-01-01
The computer code MBSSAS uses two-parameter Margules-type excess-free-energy-of-mixing equations to calculate thermodynamic equilibrium, pure-phase saturation, and stoichiometric saturation states in binary solid-solution aqueous-solution (SSAS) systems. Lippmann phase diagrams, Roozeboom diagrams, and distribution-coefficient diagrams can be constructed from the output data files, and can also be displayed by MBSSAS (on IBM-PC compatible computers). MBSSAS will also calculate accessory information, such as the location of miscibility gaps, spinodal gaps, critical-mixing points, alyotropic extrema, Henry's law solid-phase activity coefficients, and limiting distribution coefficients. Alternatively, MBSSAS can use such information (instead of the Margules, Guggenheim, or Thompson and Waldbaum excess-free-energy parameters) to calculate the appropriate excess-free-energy-of-mixing equation for any given SSAS system. © 1991.
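The standard two-parameter (Guggenheim/Redlich-Kister) excess-free-energy model underlying such calculations can be sketched as follows; the a0, a1 values below are arbitrary illustrative fit parameters, not constants for any real SSAS system, and this is not the MBSSAS code itself.

```python
import math

# Two-parameter excess-free-energy model for a binary solid solution:
#   G_excess/RT = x1*x2*(a0 + a1*(x1 - x2))
# from which the solid-phase activity coefficients follow.

def activity_coeffs(x1, a0, a1):
    """Activity coefficients (gamma1, gamma2) of the two endmembers
    at solid-phase mole fraction x1 of component 1."""
    x2 = 1.0 - x1
    ln_g1 = x2**2 * (a0 + a1 * (3.0 * x1 - x2))
    ln_g2 = x1**2 * (a0 - a1 * (3.0 * x2 - x1))
    return math.exp(ln_g1), math.exp(ln_g2)

# Henry's-law (infinite-dilution) limits, of the kind MBSSAS uses for
# limiting distribution coefficients:
g1_inf, _ = activity_coeffs(0.0, a0=2.0, a1=0.5)   # exp(a0 - a1)
_, g2_inf = activity_coeffs(1.0, a0=2.0, a1=0.5)   # exp(a0 + a1)
```

With a1 = 0 the model reduces to the symmetric one-parameter case, where a0 > 2 implies a miscibility gap, which is the kind of accessory information the abstract lists.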
27 CFR 53.97 - Constructive sale price; affiliated corporations.
Code of Federal Regulations, 2011 CFR
2011-04-01
...; affiliated corporations. 53.97 Section 53.97 Alcohol, Tobacco Products and Firearms ALCOHOL AND TOBACCO TAX...; affiliated corporations. (a) In general. Sections 4216(b) (3) and (4) of the Code establish procedures for determining a constructive sale price under section 4216(b)(1)(C) of the Code for sales between corporations...
78 FR 47677 - DOE Activities and Methodology for Assessing Compliance With Building Energy Codes
Federal Register 2010, 2011, 2012, 2013, 2014
2013-08-06
... construction. Post- construction evaluations were implemented in one study in an effort to reduce these costs... these pilot studies have led to a number of recommendations and potential changes to the DOE methodology... fundamental assumptions and approaches to measuring compliance with building energy codes. This notice...
Papaefstathiou, Giannis S; Friscić, Tomislav; MacGillivray, Leonard R
2005-10-19
A metal organic framework with two different nodes (circle and square) and a structure related to one of the 20 known 2-uniform nets has been constructed using an organic building unit that codes for multiply fused nodes.
Gildersleeve, Sara; Singer, Jefferson A; Skerrett, Karen; Wein, Shelter
2017-05-01
"We-ness," a couple's mutual investment in their relationship and in each other, has been found to be a potent dimension of couple resilience. This study examined the development of a method to capture We-ness in psychotherapy through the coding of relationship narratives co-constructed by couples ("We-Stories"). It used a coding system to identify the core thematic elements that make up these narratives. Couples that self-identified as "happy" (N = 53) generated We-Stories and completed measures of relationship satisfaction and mutuality. These stories were then coded using the We-Stories coding manual. Findings indicated that security, an element that involves aspects of safety, support, and commitment, was most common, appearing in 58.5% of all narratives. This element was followed by the elements of pleasure (49.1%) and shared meaning/vision (37.7%). The number of "We-ness" elements was also correlated with and predictive of discrepancy scores on measures of relationship mutuality, indicating the validity of the We-Stories coding manual. Limitations and future directions are discussed.
NASA Astrophysics Data System (ADS)
Gel, Aytekin; Hu, Jonathan; Ould-Ahmed-Vall, ElMoustapha; Kalinkin, Alexander A.
2017-02-01
Legacy codes remain a crucial element of today's simulation-based engineering ecosystem due to the extensive validation process and investment in such software. The rapid evolution of high-performance computing architectures necessitates the modernization of these codes. One approach to modernization is a complete overhaul of the code. However, this could require extensive investments, such as rewriting in modern languages, new data constructs, etc., which would necessitate systematic verification and validation to re-establish the credibility of the computational models. The current study advocates a more incremental approach and is a culmination of several modernization efforts on the legacy code MFIX, an open-source computational fluid dynamics code that has evolved over several decades, is widely used in multiphase flows, and is still being developed by the National Energy Technology Laboratory. Two different modernization approaches, 'bottom-up' and 'top-down', are illustrated. Preliminary results show up to an 8.5x improvement at the selected kernel level with the first approach, and up to a 50% improvement in total simulated time with the latter, for the demonstration cases and target HPC systems employed.
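How a kernel-level speedup translates into whole-application improvement is governed by Amdahl's law. The sketch below is not from the MFIX study; the accelerated runtime fraction in the usage example is an assumption chosen purely for illustration:

```python
def amdahl_overall_speedup(fraction, kernel_speedup):
    """Amdahl's law: overall speedup when `fraction` of the original
    runtime is accelerated by `kernel_speedup` and the rest is unchanged."""
    return 1.0 / ((1.0 - fraction) + fraction / kernel_speedup)
```

For instance, if the selected kernel accounted for half the runtime, an 8.5x kernel speedup would yield only about 1.79x overall, which is why incremental, bottom-up modernization is typically paired with top-down restructuring of the dominant code paths.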
Toward Intelligent Software Defect Detection
NASA Technical Reports Server (NTRS)
Benson, Markland J.
2011-01-01
Source code level software defect detection has gone from state of the art to a software engineering best practice. Automated code analysis tools streamline many of the aspects of formal code inspections but have the drawback of being difficult to construct and either prone to false positives or severely limited in the set of defects that can be detected. Machine learning technology provides the promise of learning software defects by example, easing construction of detectors and broadening the range of defects that can be found. Pinpointing software defects with the same level of granularity as prominent source code analysis tools distinguishes this research from past efforts, which focused on analyzing software engineering metrics data with granularity limited to that of a particular function rather than a line of code.
NASA Astrophysics Data System (ADS)
Setiawan, Jody; Nakazawa, Shoji
2017-10-01
This paper compares the seismic response behavior, seismic performance and seismic loss functions of a conventional special moment frame steel structure (SMF) and a special moment frame steel structure with base isolation (BI-SMF). The validation of the proposed simplified estimation method for the maximum deformation of the base isolation system, using the equivalent linearization method, and the validation of the design shear force of the superstructure are investigated from results of the nonlinear dynamic response analysis. In recent years, construction of steel office buildings with seismic isolation systems has been proceeding even in Indonesia, where the risk of earthquakes is high. Although a design code for seismic isolation structures has been proposed, there is no actual construction example of a special moment frame steel structure with base isolation. Therefore, in this research, the SMF and BI-SMF buildings are designed by the Indonesian Building Code and assumed to be built in Padang City, Indonesia. The material of the base isolation system is high-damping rubber bearing. Dynamic eigenvalue analysis and nonlinear dynamic response analysis are carried out to show the dynamic characteristics and seismic performance. In addition, the seismic loss function is obtained from damage state probability and repair cost. For the response analysis, simulated ground accelerations that have the phases of recorded seismic waves (El Centro NS, El Centro EW, Kobe NS and Kobe EW) and are adapted to the response spectrum prescribed by the Indonesian design code are used.
Huffhines, Lindsay; Tunno, Angela M; Cho, Bridget; Hambrick, Erin P; Campos, Ilse; Lichty, Brittany; Jackson, Yo
2016-08-01
State social service agency case files are a common mechanism for obtaining information about a child's maltreatment history, yet these documents are often challenging for researchers to access, and then to process in a manner consistent with the requirements of social science research designs. Specifically, accessing and navigating case files is an extensive undertaking, and a task that many researchers have had to maneuver with little guidance. Even after the files are in hand and the research questions and relevant variables have been clarified, case file information about a child's maltreatment exposure can be idiosyncratic, vague, inconsistent, and incomplete, making coding such information into useful variables for statistical analyses difficult. The Modified Maltreatment Classification System (MMCS) is a popular tool used to guide the process, and though comprehensive, this coding system cannot cover all idiosyncrasies found in case files. It is not clear from the literature how researchers implement this system while accounting for issues outside of the purview of the MMCS or that arise during MMCS use. Finally, a large yet reliable file coding team is essential to the process; however, the literature lacks training guidelines and methods for establishing reliability between coders. In an effort to move the field toward a common approach, the purpose of the present discussion is to detail the process used by one large-scale study of child maltreatment, the Studying Pathways to Adjustment and Resilience in Kids (SPARK) project, a longitudinal study of resilience in youth in foster care. The article addresses each phase of case file coding, from accessing case files, to identifying how to measure constructs of interest, to dealing with exceptions to the coding system, to coding variables reliably, to training large teams of coders and monitoring for fidelity. Implications for a comprehensive and efficient approach to case file coding are discussed.
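Establishing reliability between coders, as the abstract calls for, is conventionally quantified with Cohen's kappa. The SPARK project's actual reliability procedure is not described here; this is a minimal sketch of the standard two-coder statistic:

```python
from collections import Counter

def cohens_kappa(coder_a, coder_b):
    """Cohen's kappa: agreement between two coders, corrected for the
    agreement expected by chance given each coder's label frequencies."""
    assert len(coder_a) == len(coder_b) and coder_a
    n = len(coder_a)
    # observed proportion of items the two coders labeled identically
    p_obs = sum(a == b for a, b in zip(coder_a, coder_b)) / n
    # chance agreement from the marginal label distributions
    freq_a, freq_b = Counter(coder_a), Counter(coder_b)
    labels = set(freq_a) | set(freq_b)
    p_exp = sum((freq_a[lab] / n) * (freq_b[lab] / n) for lab in labels)
    return (p_obs - p_exp) / (1.0 - p_exp)
```

Values near 1 indicate strong chance-corrected agreement; training typically continues until coders exceed a preset kappa threshold on practice files.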
Nakamura, Mikiko; Suzuki, Ayako; Akada, Junko; Tomiyoshi, Keisuke; Hoshida, Hisashi; Akada, Rinji
2015-12-01
Mammalian gene expression constructs are generally prepared in a plasmid vector, in which a promoter and terminator are located upstream and downstream of a protein-coding sequence, respectively. In this study, we found that front terminator constructs (DNA constructs containing a terminator upstream of a promoter rather than downstream of a coding region) could sufficiently express proteins as a result of end joining of the introduced DNA fragment. By taking advantage of front terminator constructs, FLAG substitutions and deletions were generated using mutagenesis primers to identify the amino acids specifically recognized by commercial FLAG antibodies. A minimal epitope sequence for polyclonal FLAG antibody recognition was also identified. In addition, we analyzed the sequence of a C-terminal Ser-Lys-Leu peroxisome localization signal and identified the key residues necessary for peroxisome targeting. Moreover, front terminator constructs of the hepatitis B surface antigen were used for deletion analysis, leading to the identification of the regions required for particle formation. Collectively, these results indicate that front terminator constructs allow for easy manipulation of C-terminal protein-coding sequences, and suggest that direct gene expression with PCR-amplified DNA is useful for high-throughput protein analysis in mammalian cells.
Building a Better Campus: An Update on Building Codes.
ERIC Educational Resources Information Center
Madden, Michael J.
2002-01-01
Discusses the implications for higher education institutions in terms of facility planning, design, construction, and renovation of the move from regionally-developed model-building codes to two international sets of codes. Also addresses the new performance-based design option within the codes. (EV)
NASA Technical Reports Server (NTRS)
Haimes, Robert; Follen, Gregory J.
1998-01-01
CAPRI is a CAD-vendor neutral application programming interface designed for the construction of analysis and design systems. By allowing access to the geometry from within all modules (grid generators, solvers and post-processors) such tasks as meshing on the actual surfaces, node enrichment by solvers and defining which mesh faces are boundaries (for the solver and visualization system) become simpler. The overall reliance on file 'standards' is minimized. This 'Geometry Centric' approach makes multi-physics (multi-disciplinary) analysis codes much easier to build. By using the shared (coupled) surface as the foundation, CAPRI provides a single call to interpolate grid-node based data from the surface discretization in one volume to another. Finally, design systems are possible where the results can be brought back into the CAD system (and therefore manufactured) because all geometry construction and modification are performed using the CAD system's geometry kernel.
NASTRAN as a resource in code development
NASA Technical Reports Server (NTRS)
Stanton, E. L.; Crain, L. M.; Neu, T. F.
1975-01-01
A case history is presented in which the NASTRAN system provided both guidelines and working software for use in the development of a discrete element program, PATCHES-111. To avoid duplication and to take advantage of the widespread user familiarity with NASTRAN, the PATCHES-111 system uses NASTRAN bulk data syntax, NASTRAN matrix utilities, and the NASTRAN linkage editor. Problems in developing the program are discussed, along with details on the architecture of the PATCHES-111 parametric cubic modeling system. The system includes model construction procedures, checkpoint/restart strategies, and other features.
Wireless Headset Communication System
NASA Technical Reports Server (NTRS)
Lau, Wilfred K.; Swanson, Richard; Christensen, Kurt K.
1995-01-01
System combines features of pagers, walkie-talkies, and cordless telephones. Wireless headset communication system uses digital modulation on spread spectrum to avoid interference among units. Consists of base station, 4 radio/antenna modules, and as many as 16 remote units with headsets. Base station serves as network controller, audio-mixing network, and interface to such outside services as computers, telephone networks, and other base stations. Developed for use at Kennedy Space Center, system also useful in industrial maintenance, emergency operations, construction, and airport operations. Also, digital capabilities can be exploited by adding bar-code readers for use in taking inventories.
H-division quarterly report, October--December 1977. [Lawrence Livermore Laboratory]
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1978-02-10
The Theoretical EOS Group develops theoretical techniques for describing material properties under extreme conditions and constructs equation-of-state (EOS) tables for specific applications. Work this quarter concentrated on a Li equation of state, an equation of state for equilibrium plasma, improved ion corrections to the Thomas--Fermi--Kirzhnitz theory, and theoretical estimates of high-pressure melting in metals. The Experimental Physics Group investigates properties of materials at extreme conditions of pressure and temperature, and develops new experimental techniques. Effort this quarter concerned the following: parabolic projectile distortion in the two-stage light-gas gun, construction of a ballistic range for long-rod penetrators, thermodynamics and sound velocities in liquid metals, isobaric expansion measurements in Pt, and calculation of the velocity--mass profile of a jet produced by a shaped charge. Code development was concentrated on the PELE code, a multimaterial, multiphase, explicit finite-difference Eulerian code for pool suppression dynamics of a hypothetical loss-of-coolant accident in a nuclear reactor. Activities of the Fluid Dynamics Group were directed toward development of a code to compute the equations of state and transport properties of liquid metals (e.g. Li) and partially ionized dense plasmas, jet stability in the Li reactor system, and the study and problem application of fluid dynamic turbulence theory. 19 figures, 5 tables. (RWR)
Jiang, Jun; Zhou, Zongtan; Yin, Erwei; Yu, Yang; Liu, Yadong; Hu, Dewen
2015-11-01
Motor imagery (MI)-based brain-computer interfaces (BCIs) allow disabled individuals to control external devices voluntarily, helping us to restore lost motor functions. However, the number of control commands available in MI-based BCIs remains small, limiting the usability of BCI systems in control applications involving multiple degrees of freedom (DOF), such as control of a robot arm. To address this problem, we developed a novel Morse code-inspired method for MI-based BCI design to increase the number of output commands. Using this method, brain activities are modulated by sequences of MI (sMI) tasks, which are constructed by alternately imagining movements of the left or right hand or no motion. The code of each sMI task is detected from EEG signals and mapped to a specific command. According to permutation theory, an sMI task of length N allows 2 × (2^N - 1) possible commands with the left and right MI tasks under self-paced conditions. To verify its feasibility, the new method was used to construct a six-class BCI system to control the arm of a humanoid robot. Four subjects participated in our experiment, and the averaged accuracy of the six-class sMI tasks was 89.4%. The Cohen's kappa coefficient and the throughput of our BCI paradigm are 0.88 ± 0.060 and 23.5 bits per minute (bpm), respectively. Furthermore, all of the subjects could operate an actual three-joint robot arm to grasp an object in around 49.1 s using our approach. These promising results suggest that the Morse code-inspired method could be used in the design of BCIs for multi-DOF control.
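The 2 × (2^N - 1) command count can be verified by direct enumeration: it is the number of non-empty left/right sequences of length at most N. This enumerator is illustrative only and is not the paper's decoding code:

```python
from itertools import product

def smi_commands(max_len):
    """Enumerate all non-empty sequences over {left, right} motor-imagery
    tasks of length 1..max_len; the count equals 2 * (2**max_len - 1)."""
    cmds = []
    for k in range(1, max_len + 1):
        cmds.extend(''.join(seq) for seq in product('LR', repeat=k))
    return cmds
```

With max_len = 2 this gives six commands (L, R, LL, LR, RL, RR), matching the six-class BCI system described in the abstract.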
Multi-level bandwidth efficient block modulation codes
NASA Technical Reports Server (NTRS)
Lin, Shu
1989-01-01
The multilevel technique is investigated for combining block coding and modulation. There are four parts. In the first part, a formulation is presented for signal sets on which modulation codes are to be constructed. Distance measures on a signal set are defined and their properties are developed. In the second part, a general formulation is presented for multilevel modulation codes in terms of component codes with appropriate Euclidean distances. The distance properties, Euclidean weight distribution and linear structure of multilevel modulation codes are investigated. In the third part, several specific methods for constructing multilevel block modulation codes with interdependency among component codes are proposed. Given a multilevel block modulation code C with no interdependency among the binary component codes, the proposed methods give a multilevel block modulation code C' which has the same rate as C, a minimum squared Euclidean distance not less than that of C, a trellis diagram with the same number of states as that of C, and a smaller number of nearest-neighbor codewords than C. In the last part, the error performance of block modulation codes is analyzed for an AWGN channel based on soft-decision maximum likelihood decoding. Error probabilities of some specific codes are evaluated based on their Euclidean weight distributions and simulation results.
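The Euclidean distance measures on a signal set that the first part formalizes can be illustrated for M-PSK, the usual constellation in multilevel constructions. This brute-force sketch is not from the report; it simply computes the minimum squared Euclidean distance of a unit-energy M-PSK set, which matches the closed form 4·sin²(π/M):

```python
import math

def mpsk_min_sq_dist(M):
    """Minimum squared Euclidean distance between distinct points of a
    unit-energy M-PSK constellation, found by exhaustive pairwise search."""
    pts = [(math.cos(2 * math.pi * k / M), math.sin(2 * math.pi * k / M))
           for k in range(M)]
    return min((p[0] - q[0])**2 + (p[1] - q[1])**2
               for i, p in enumerate(pts) for q in pts[i + 1:])
```

For QPSK this gives 2.0 and for 8-PSK 2 - sqrt(2) ≈ 0.586; a multilevel code's minimum squared Euclidean distance is then built up from these per-level signal-set distances and the Hamming distances of the component codes.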
Lebeau, Jean-Pierre; Cadwallader, Jean-Sébastien; Vaillant-Roussel, Hélène; Pouchain, Denis; Yaouanc, Virginie; Aubin-Auger, Isabelle; Mercier, Alain; Rusch, Emmanuel; Remmen, Roy; Vermeire, Etienne; Hendrickx, Kristin
2016-01-01
Objective To construct a typology of general practitioners’ (GPs) responses regarding their justification of therapeutic inertia in cardiovascular primary prevention for high-risk patients with hypertension. Design Empirically grounded construction of typology. Types were defined by attributes derived from the qualitative analysis of GPs’ reported reasons for inaction. Participants 256 GPs randomised in the intervention group of a cluster randomised controlled trial. Setting GPs members of 23 French Regional Colleges of Teachers in General Practice, included in the EffectS of a multifaceted intervention on CArdiovascular risk factors in high-risk hyPErtensive patients (ESCAPE) trial. Data collection and analysis The database consisted of 2638 written responses given by the GPs to an open-ended question asking for the reasons why drug treatment was not changed as suggested by the national guidelines. All answers were coded using constant comparison analysis. A matrix analysis of codes per GP allowed the construction of a response typology, where types were defined by codes as attributes. Initial coding and definition of types were performed independently by two teams. Results Initial coding resulted in a list of 69 codes in the final codebook, representing 4764 coded references in the question responses. A typology including seven types was constructed. 100 GPs were allocated to one and only one of these types, while 25 GPs did not provide enough data to allow classification. Types (numbers of GPs allocated) were: ‘optimists’ (28), ‘negotiators’ (20), ‘checkers’ (15), ‘contextualisers’ (13), ‘cautious’ (11), ‘rounders’ (8) and ‘scientists’ (5). For the 36 GPs that provided 50 or more coded references, analysis of the code evolution over time and across patients showed a consistent belonging to the initial type for any given GP. 
Conclusion This typology could provide GPs with some insight into their general ways of considering changes in the treatment/management of cardiovascular risk factors and guide design of specific physician-centred interventions to reduce inappropriate inaction. Trial registration number NCT00348855. PMID:27178974
Jackson Park Hospital Green Building Medical Center
DOE Office of Scientific and Technical Information (OSTI.GOV)
William Dorsey; Nelson Vasquez
2010-03-31
Jackson Park Hospital completed the construction of a new Medical Office Building on its campus this spring. The new building construction has adopted the City of Chicago's recent focus on protecting the environment, and conserving energy and resources, with the introduction of green building codes. Located in a poor, inner city neighborhood on the South side of Chicago, Jackson Park Hospital has chosen green building strategies to help make the area a better place to live and work. The new green building houses the hospital's Family Medicine Residency Program and Specialty Medical Offices. The residency program has been vital in attracting new, young physicians to this medically underserved area. The new outpatient center will also help to allure needed medical providers to the community. The facility also has areas designated to women's health and community education. The Community Education Conference Room will provide learning opportunities to area residents. Emphasis will be placed on conserving resources and protecting our environment, as well as providing information on healthcare access and preventive medicine. The new Medical Office Building was constructed with numerous energy saving features. The exterior cladding of the building is an innovative, locally-manufactured precast concrete panel system with integral insulation that achieves an R-value in excess of building code requirements. The roof is a 'green roof' covered by native plantings, lessening the impact of solar heat gain on the building, and reducing air conditioning requirements. The windows are low-E, tinted, and insulated to reduce cooling requirements in summer and heating requirements in winter. The main entrance has an air lock to prevent unconditioned air from entering the building and impacting interior air temperatures.
Since much of the traffic in and out of the office building comes from the adjacent Jackson Park Hospital, a pedestrian bridge connects the two buildings, further decreasing the amount of unconditioned air that enters the office building. The HVAC system has an Energy Efficiency Rating 29% greater than required. No CFC-based refrigerants were used in the HVAC system, thus reducing the emission of compounds that contribute to ozone depletion and global warming. In addition, interior light fixtures employ the latest energy-efficient lamp and ballast technology. Interior lighting throughout the building is operated by sensors that will automatically turn off lights inside a room when the room is unoccupied. The electrical traction elevators use less energy than typical elevators, and they are made of 95% recycled material. Further, locally manufactured products were used throughout, minimizing the amount of energy required to construct this building. The primary objective was to construct a 30,000 square foot medical office building on the Jackson Park Hospital campus that would comply with newly adopted City of Chicago green building codes focusing on protecting the environment and conserving energy and resources. The energy saving systems demonstrate a state-of-the-art whole-building approach to energy efficient design and construction. The energy efficiency and green aspects of the building contribute to the community by emphasizing the environmental and economic benefits of conserving resources. The building highlights the integration of Chicago's new green building codes into a poor, inner city neighborhood project and it is designed to attract medical providers and physicians to a medically underserved area.
Non-coding variants contribute to the clinical heterogeneity of TTR amyloidosis.
Iorio, Andrea; De Lillo, Antonella; De Angelis, Flavio; Di Girolamo, Marco; Luigetti, Marco; Sabatelli, Mario; Pradotto, Luca; Mauro, Alessandro; Mazzeo, Anna; Stancanelli, Claudia; Perfetto, Federico; Frusconi, Sabrina; My, Filomena; Manfellotto, Dario; Fuciarelli, Maria; Polimanti, Renato
2017-09-01
Coding mutations in TTR gene cause a rare hereditary form of systemic amyloidosis, which has a complex genotype-phenotype correlation. We investigated the role of non-coding variants in regulating TTR gene expression and consequently amyloidosis symptoms. We evaluated the genotype-phenotype correlation considering the clinical information of 129 Italian patients with TTR amyloidosis. Then, we conducted a re-sequencing of TTR gene to investigate how non-coding variants affect TTR expression and, consequently, phenotypic presentation in carriers of amyloidogenic mutations. Polygenic scores for genetically determined TTR expression were constructed using data from our re-sequencing analysis and the GTEx (Genotype-Tissue Expression) project. We confirmed a strong phenotypic heterogeneity across coding mutations causing TTR amyloidosis. Considering the effects of non-coding variants on TTR expression, we identified three patient clusters with specific expression patterns associated with certain phenotypic presentations, including late onset, autonomic neurological involvement, and gastrointestinal symptoms. This study provides novel data regarding the role of non-coding variation and the gene expression profiles in patients affected by TTR amyloidosis, also putting forth an approach that could be used to investigate the mechanisms at the basis of the genotype-phenotype correlation of the disease.
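The polygenic scores for genetically determined TTR expression described above are, in their simplest form, weighted sums of variant dosages. The sketch below is a generic illustration, not the study's actual scoring pipeline; the variant names and weights in the test are invented for demonstration:

```python
def polygenic_score(dosages, weights):
    """Minimal polygenic score: sum of per-variant allele dosages (0, 1 or 2)
    multiplied by their estimated effect weights on gene expression."""
    assert len(dosages) == len(weights)
    return sum(d * w for d, w in zip(dosages, weights))
```

Scores computed this way across patients can then be clustered to look for expression patterns associated with phenotypes such as late onset or gastrointestinal involvement, as in the abstract.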
NASA Astrophysics Data System (ADS)
Bognot, J. R.; Candido, C. G.; Blanco, A. C.; Montelibano, J. R. Y.
2018-05-01
Monitoring the progress of a building's construction is critical in construction management. However, measuring construction progress is still manual, time consuming and error prone, and imposes a tedious process of analysis, leading to delays, additional costs and effort. The main goal of this research is to develop a methodology for building construction progress monitoring based on a 3D as-built model of the building from unmanned aerial system (UAS) images, a 4D as-planned model (with the construction schedule integrated) and GIS analysis. Monitoring was done by capturing videos of the building with a camera-equipped UAS. Still images were extracted, filtered and bundle-adjusted, and a 3D as-built model was generated using open-source photogrammetric software. The as-planned model was generated from digitized CAD drawings using GIS. The 3D as-built model was aligned with the 4D as-planned model of the building, formed from extrusion of building elements and integration of the construction's planned schedule. The construction progress is visualized via color-coding the building elements in the 3D model. The developed methodology was applied to data obtained from an actual construction site. Accuracy in detecting 'built' or 'not built' building elements ranges from 82-84%, with a precision of 50-72%. Quantified progress in terms of the number of building elements is 21.31% (November 2016), 26.84% (January 2017) and 44.19% (March 2017). The results can be used as an input for monitoring the progress performance of construction projects and improving the related decision-making process.
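The accuracy and precision figures quoted for 'built'/'not built' detection follow from a standard binary confusion matrix over building elements. This helper is an illustrative sketch, not the paper's evaluation code; the counts in the test are invented:

```python
def detection_scores(tp, fp, tn, fn):
    """Accuracy and precision for binary 'built' vs 'not built' detection,
    given true/false positives and true/false negatives over elements."""
    total = tp + fp + tn + fn
    accuracy = (tp + tn) / total
    precision = tp / (tp + fp) if (tp + fp) else 0.0
    return accuracy, precision
```

Quantified progress at each survey date is then simply the share of elements detected as built out of the total number of elements in the as-planned model.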
FRAGS: estimation of coding sequence substitution rates from fragmentary data
Swart, Estienne C; Hide, Winston A; Seoighe, Cathal
2004-01-01
Background Rates of substitution in protein-coding sequences can provide important insights into evolutionary processes that are of biomedical and theoretical interest. Increased availability of coding sequence data has enabled researchers to estimate more accurately the coding sequence divergence of pairs of organisms. However the use of different data sources, alignment protocols and methods to estimate substitution rates leads to widely varying estimates of key parameters that define the coding sequence divergence of orthologous genes. Although complete genome sequence data are not available for all organisms, fragmentary sequence data can provide accurate estimates of substitution rates provided that an appropriate and consistent methodology is used and that differences in the estimates obtainable from different data sources are taken into account. Results We have developed FRAGS, an application framework that uses existing, freely available software components to construct in-frame alignments and estimate coding substitution rates from fragmentary sequence data. Coding sequence substitution estimates for human and chimpanzee sequences, generated by FRAGS, reveal that methodological differences can give rise to significantly different estimates of important substitution parameters. The estimated substitution rates were also used to infer upper-bounds on the amount of sequencing error in the datasets that we have analysed. Conclusion We have developed a system that performs robust estimation of substitution rates for orthologous sequences from a pair of organisms. Our system can be used when fragmentary genomic or transcript data is available from one of the organisms and the other is a completely sequenced genome within the Ensembl database. As well as estimating substitution statistics our system enables the user to manage and query alignment and substitution data. PMID:15005802
Fire safety of wood construction
Robert H. White; Mark A. Dietenberger
2010-01-01
Fire safety is an important concern in all types of construction. The high level of national concern for fire safety is reflected in limitations and design requirements in building codes. These code requirements and related fire performance data are discussed in the context of fire safety design and evaluation in the initial section of this chapter. Because basic data...
A CellML simulation compiler and code generator using ODE solving schemes
2012-01-01
Models written in description languages such as CellML are becoming a popular solution to the handling of complex cellular physiological models in biological function simulations. However, in order to fully simulate a model, boundary conditions and ordinary differential equation (ODE) solving schemes have to be combined with it. Though boundary conditions can be described in CellML, it is difficult to explicitly specify ODE solving schemes using existing tools. In this study, we define an ODE solving scheme description language-based on XML and propose a code generation system for biological function simulations. In the proposed system, biological simulation programs using various ODE solving schemes can be easily generated. We designed a two-stage approach where the system generates the equation set associating the physiological model variable values at a certain time t with values at t + Δt in the first stage. The second stage generates the simulation code for the model. This approach enables the flexible construction of code generation modules that can support complex sets of formulas. We evaluate the relationship between models and their calculation accuracies by simulating complex biological models using various ODE solving schemes. Using the FHN model simulation, results showed good qualitative and quantitative correspondence with the theoretical predictions. Results for the Luo-Rudy 1991 model showed that only first order precision was achieved. In addition, running the generated code in parallel on a GPU made it possible to speed up the calculation time by a factor of 50. The CellML Compiler source code is available for download at http://sourceforge.net/projects/cellmlcompiler. PMID:23083065
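The two-stage approach described in the CellML abstract above (first derive the update equation relating the state at t to the state at t + Δt for a chosen ODE solving scheme, then emit the simulation code) can be illustrated with a deliberately tiny code generator. This is not the actual CellML Compiler interface; the scheme names, function names, and string-based right-hand side below are all invented for illustration.

```python
# Hypothetical miniature of a two-stage ODE-scheme code generator.
# Stage 1 derives the symbolic update rule y(t+dt) = F(y(t)) for the
# chosen scheme; stage 2 emits and compiles a simulation function.

def generate(rhs_expr, scheme="euler"):
    """Emit a simulate() function for dy/dt = rhs_expr (an expression in y)."""
    # Stage 1: build the update rule by naive textual substitution, which
    # is adequate only for this toy with a single state variable 'y'.
    if scheme == "euler":
        update = "y + dt * (%s)" % rhs_expr
    elif scheme == "heun":
        predictor = rhs_expr.replace("y", "(y + dt * (%s))" % rhs_expr)
        update = "y + 0.5 * dt * ((%s) + (%s))" % (rhs_expr, predictor)
    else:
        raise ValueError("unknown scheme: %s" % scheme)
    # Stage 2: generate the simulation loop and compile it.
    src = (
        "def simulate(y0, dt, steps):\n"
        "    y = y0\n"
        "    for _ in range(steps):\n"
        f"        y = {update}\n"
        "    return y\n"
    )
    namespace = {}
    exec(src, namespace)  # compile the generated source
    return namespace["simulate"], src
```

Because the update rule is derived before any code is emitted, swapping ODE schemes never touches the simulation-loop template, which is the kind of flexibility the two-stage design aims for.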
DOE Office of Scientific and Technical Information (OSTI.GOV)
Carbary, Lawrence D.; Perkins, Laura L.; Serino, Roland
The team led by Dow Corning collaborated to increase the thermal performance of exterior insulation and finishing systems (EIFS) to reach R-40 performance, meeting the needs for high-efficiency insulated walls. Additionally, the project helped remove barriers to using EIFS on retrofit commercial buildings desiring highly insulated walls. The three wall systems developed within the scope of this project provide the thermal performance of R-24 to R-40 by incorporating vacuum insulation panels (VIPs) into an expanded polystyrene (EPS) encapsulated vacuum insulated sandwich element (VISE). The VISE was incorporated into an EIFS as pre-engineered insulation boards. The VISE is installed using typical EIFS details and a network of trained installers. These three wall systems were tested and engineered to be fully code compliant as an EIFS and meet all of the International Building Code structural, durability and fire test requirements for a code-compliant exterior wall cladding system. This system is being commercialized under the trade name Dryvit® Outsulation® HE system. Full details, specifications, and application guidelines have been developed for the system. The system has been modeled both thermally and hygrothermally to predict condensation potential. Based on weather models for Baltimore, MD; Boston, MA; Miami, FL; Minneapolis, MN; Phoenix, AZ; and Seattle, WA, condensation and water build-up in the wall system is not a concern. Finally, the team conducted a field trial of the system on a building at the former Brunswick Naval Air Station, which is being redeveloped by the Midcoast Regional Redevelopment Authority (Brunswick, Maine). The field trial provided a retrofit R-30 wall onto a wood frame construction, slab-on-grade, 1800 ft2 building that was monitored over the course of a year. Simultaneous with the façade retrofit, the building's windows were upgraded at no charge to this program. The retrofit building used 49% less natural gas during the winter of 2012 compared to previous winters. This project achieved its goal of developing a system that is constructible, offers protection to the VIPs, and meets all performance targets established for the project.
42 CFR 52b.12 - What are the minimum requirements of construction and equipment?
Code of Federal Regulations, 2014 CFR
2014-10-01
...-8400). (3) ICBO “Uniform Building Code,” Volumes 1-3 (1997). International Conference of Building...-4406). (4) BOCA National Building Code (1996) 1998 Supplement, Building Officials and Code... Southern Building Code Congress (SBCC), 900 Montclair Road, Birmingham, AL 35213-1206 (telephone 205-591...
42 CFR 52b.12 - What are the minimum requirements of construction and equipment?
Code of Federal Regulations, 2012 CFR
2012-10-01
...-8400). (3) ICBO “Uniform Building Code,” Volumes 1-3 (1997). International Conference of Building...-4406). (4) BOCA National Building Code (1996) 1998 Supplement, Building Officials and Code... Southern Building Code Congress (SBCC), 900 Montclair Road, Birmingham, AL 35213-1206 (telephone 205-591...
42 CFR 52b.12 - What are the minimum requirements of construction and equipment?
Code of Federal Regulations, 2013 CFR
2013-10-01
...-8400). (3) ICBO “Uniform Building Code,” Volumes 1-3 (1997). International Conference of Building...-4406). (4) BOCA National Building Code (1996) 1998 Supplement, Building Officials and Code... Southern Building Code Congress (SBCC), 900 Montclair Road, Birmingham, AL 35213-1206 (telephone 205-591...
Chaotic coordinates for the Large Helical Device
NASA Astrophysics Data System (ADS)
Hudson, Stuart; Suzuki, Yasuhiro
2014-10-01
The study of dynamical systems is facilitated by a coordinate framework with coordinate surfaces that coincide with invariant structures of the dynamical flow. For axisymmetric systems, a continuous family of invariant surfaces is guaranteed and straight-fieldline coordinates may be constructed. For non-integrable systems, e.g. stellarators, perturbed tokamaks, this continuous family is broken. Nevertheless, coordinates can still be constructed that simplify the description of the dynamics. The Poincaré-Birkhoff theorem, the Aubry-Mather theorem, and the KAM theorem show that there are important structures that are invariant under the perturbed dynamics; namely the periodic orbits, the cantori, and the irrational flux surfaces. Coordinates adapted to these invariant sets, which we call chaotic coordinates, provide substantial advantages. The regular motion becomes straight, and the irregular motion is bounded by, and dissected by, coordinate surfaces that coincide with surfaces of locally-minimal magnetic-fieldline flux. The chaotic edge of the magnetic field, as calculated by the HINT2 code, in the Large Helical Device (LHD) is examined, and a coordinate system is constructed so that the flux surfaces are "straight" and the islands become "square."
A distributed programming environment for Ada
NASA Technical Reports Server (NTRS)
Brennan, Peter; Mcdonnell, Tom; Mcfarland, Gregory; Timmins, Lawrence J.; Litke, John D.
1986-01-01
Despite considerable commercial exploitation of fault tolerance systems, significant and difficult research problems remain in such areas as fault detection and correction. A research project is described which constructs a distributed computing test bed for loosely coupled computers. The project is constructing a tool kit to support research into distributed control algorithms, including a distributed Ada compiler, distributed debugger, test harnesses, and environment monitors. The Ada compiler is being written in Ada and will implement distributed computing at the subsystem level. The design goal is to provide a variety of control mechanisms for distributed programming while retaining total transparency at the code level.
Combinatorics associated with inflections and bitangents of plane quartics
NASA Astrophysics Data System (ADS)
Gizatullin, M. Kh
2013-08-01
After a preliminary survey and a description of some small Steiner systems from the standpoint of the theory of invariants of binary forms, we construct a binary Golay code (of length 24) using ideas from J. Grassmann's thesis of 1875. One of our tools is a pair of disjoint Fano planes. Another application of such pairs and properties of plane quartics is a construction of a new block design on 28 objects. This block design is a part of a dissection of the set of 288 Aronhold sevens. The dissection distributes the Aronhold sevens into 8 disjoint block designs of this type.
Computer-Aided System Engineering and Analysis (CASE/A) Programmer's Manual, Version 5.0
NASA Technical Reports Server (NTRS)
Knox, J. C.
1996-01-01
The Computer Aided System Engineering and Analysis (CASE/A) Version 5.0 Programmer's Manual provides the programmer and user with information regarding the internal structure of the CASE/A 5.0 software system. CASE/A 5.0 is a trade study tool that provides modeling/simulation capabilities for analyzing environmental control and life support systems and active thermal control systems. CASE/A has been successfully used in studies such as the evaluation of carbon dioxide removal in the space station. CASE/A modeling provides a graphical and command-driven interface for the user. This interface allows the user to construct a model by placing equipment components in a graphical layout of the system hardware, then connect the components via flow streams and define their operating parameters. Once the equipment is placed, the simulation time and other control parameters can be set to run the simulation based on the model constructed. After completion of the simulation, graphical plots or text files can be obtained for evaluation of the simulation results over time. Additionally, users have the capability to control the simulation and extract information at various times in the simulation (e.g., control equipment operating parameters over the simulation time or extract plot data) by using "User Operations (OPS) Code." This OPS code is written in FORTRAN with a canned set of utility subroutines for performing common tasks. CASE/A version 5.0 software runs under the VAX VMS(Trademark) environment. It utilizes the Tektronix 4014(Trademark) graphics display system and the VT100(Trademark) text manipulation/display system.
NASA Technical Reports Server (NTRS)
1990-01-01
Lunar base projects, including a reconfigurable lunar cargo launcher, a thermal and micrometeorite protection system, a versatile lifting machine with robotic capabilities, a cargo transport system, the design of a road construction system for a lunar base, and the design of a device for removing lunar dust from material surfaces, are discussed. The emphasis of the Gulf of Mexico project was on the development of a computer simulation model for predicting vessel station keeping requirements. An existing code, used in predicting station keeping requirements for oil drilling platforms operating in North Slope (Alaska) waters, was used as a basis for the computer simulation. Modifications were made to the existing code. The input into the model consists of satellite altimeter readings and water velocity readings from buoys stationed in the Gulf of Mexico. The satellite data consist of altimeter readings (wave height) taken during the spring of 1989. The simulation model predicts water velocity and direction, and wind velocity.
Initial and Long-Term Movement of Cladding Installed Over Exterior Rigid Insulation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Baker, Peter
Changes in the International Energy Conservation Code (IECC) from 2009 to 2012 have resulted in the use of exterior rigid insulation becoming part of the prescriptive code requirements. With more jurisdictions adopting the 2012 IECC, builders will be required to incorporate exterior insulation in the construction of their exterior wall assemblies. For thick layers of exterior insulation (levels greater than 1.5 inches), the use of wood furring strips attached through the insulation back to the structure has been used by many contractors and designers as a means to provide a convenient cladding attachment location. This research was an extension of previous research conducted by Building Science Corporation in 2011 and 2012. Each year, the understanding of the discrete load component interactions in the system, as well as of the impacts of environmental loading, has increased. The focus of the research was to examine more closely the impacts of screw fastener bending on the total system capacity and the effects of thermal expansion and contraction of materials on the compressive forces in the assembly, as well as to analyze a full year's worth of cladding movement data from assemblies constructed in an exposed outdoor environment.
CFD analysis of turbopump volutes
NASA Technical Reports Server (NTRS)
Ascoli, Edward P.; Chan, Daniel C.; Darian, Armen; Hsu, Wayne W.; Tran, Ken
1993-01-01
An effort is underway to develop a procedure for the regular use of CFD analysis in the design of turbopump volutes. Airflow data to be taken at NASA Marshall will be used to validate the CFD code and overall procedure. Initial focus has been on preprocessing (geometry creation, translation, and grid generation). Volute geometries have been acquired electronically and imported into the CATIA CAD system and RAGGS (Rockwell Automated Grid Generation System) via the IGES standard. An initial grid topology has been identified and grids have been constructed for turbine inlet and discharge volutes. For CFD analysis of volutes to be used regularly, a procedure must be defined to meet engineering design needs in a timely manner. Thus, a compromise must be established between making geometric approximations, the selection of grid topologies, and possible CFD code enhancements. While the initial grid developed approximated the volute tongue with a zero thickness, final computations should more accurately account for the geometry in this region. Additionally, grid topologies will be explored to minimize skewness and high aspect ratio cells that can affect solution accuracy and slow code convergence. Finally, as appropriate, code modifications will be made to allow for new grid topologies in an effort to expedite the overall CFD analysis process.
Cai, Yong; Li, Xiwen; Wang, Runmiao; Yang, Qing; Li, Peng; Hu, Hao
2016-01-01
Currently, chemical fingerprint comparison and analysis mainly rely on professional equipment and software, which are expensive and inconvenient. This study aims to integrate QR (Quick Response) codes with quality data and mobile intelligent technology to develop a convenient query terminal for tracing quality in the whole industrial chain of TCM (traditional Chinese medicine). Three herbal medicines were randomly selected and their two-dimensional (2D) chemical barcode fingerprints were constructed. A smartphone application (APP) based on the Android system was developed to read the initial data of the 2D chemical barcodes and to compare multiple fingerprints from different batches of the same species or from different species. It was demonstrated that there were no significant differences between original and scanned TCM chemical fingerprints. Meanwhile, different TCM chemical fingerprint QR codes could be rendered in the same coordinate system, showing the differences very intuitively. To distinguish variations between chemical fingerprints more directly, a linear interpolation angle cosine similarity algorithm (LIACSA) was proposed to obtain a similarity ratio. This study showed that QR codes can be used as an effective information carrier to transfer quality data. The smartphone application can rapidly read quality information in QR codes and convert the data into TCM chemical fingerprints.
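The abstract above does not give the LIACSA formula, so the following is only a plausible sketch of the comparison step it describes: two fingerprints sampled on (possibly different) retention axes are resampled onto a common grid by linear interpolation, and the cosine of the angle between the resampled intensity vectors serves as the similarity ratio. All names and the resampling details are assumptions, not the published algorithm.

```python
# Hedged sketch of interpolation-plus-cosine fingerprint comparison.
import math
from bisect import bisect_right

def interp(xs, ys, x):
    """Linearly interpolate the fingerprint (xs, ys) at position x.
    xs is assumed strictly increasing; values outside are clamped."""
    if x <= xs[0]:
        return ys[0]
    if x >= xs[-1]:
        return ys[-1]
    i = bisect_right(xs, x)
    t = (x - xs[i - 1]) / (xs[i] - xs[i - 1])
    return ys[i - 1] + t * (ys[i] - ys[i - 1])

def cosine_similarity(fp_a, fp_b, n=256):
    """fp_* are (axis, intensity) pairs; axes need not match."""
    (xa, ya), (xb, yb) = fp_a, fp_b
    lo, hi = max(xa[0], xb[0]), min(xa[-1], xb[-1])
    grid = [lo + (hi - lo) * k / (n - 1) for k in range(n)]
    u = [interp(xa, ya, x) for x in grid]
    v = [interp(xb, yb, x) for x in grid]
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)
```

Cosine similarity is invariant to overall intensity scaling, which is a natural property here: a fingerprint and a uniformly diluted copy of it score 1.0, while peak-shape differences lower the score.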
EvoluCode: Evolutionary Barcodes as a Unifying Framework for Multilevel Evolutionary Data.
Linard, Benjamin; Nguyen, Ngoc Hoan; Prosdocimi, Francisco; Poch, Olivier; Thompson, Julie D
2012-01-01
Evolutionary systems biology aims to uncover the general trends and principles governing the evolution of biological networks. An essential part of this process is the reconstruction and analysis of the evolutionary histories of these complex, dynamic networks. Unfortunately, the methodologies for representing and exploiting such complex evolutionary histories in large scale studies are currently limited. Here, we propose a new formalism, called EvoluCode (Evolutionary barCode), which allows the integration of different evolutionary parameters (eg, sequence conservation, orthology, synteny …) in a unifying format and facilitates the multilevel analysis and visualization of complex evolutionary histories at the genome scale. The advantages of the approach are demonstrated by constructing barcodes representing the evolution of the complete human proteome. Two large-scale studies are then described: (i) the mapping and visualization of the barcodes on the human chromosomes and (ii) automatic clustering of the barcodes to highlight protein subsets sharing similar evolutionary histories and their functional analysis. The methodologies developed here open the way to the efficient application of other data mining and knowledge extraction techniques in evolutionary systems biology studies. A database containing all EvoluCode data is available at: http://lbgi.igbmc.fr/barcodes.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Campbell, Michael T.; Safdari, Masoud; Kress, Jessica E.
The project described in this report constructed and exercised an innovative multiphysics coupling toolkit called the Illinois Rocstar MultiPhysics Application Coupling Toolkit (IMPACT). IMPACT is an open source, flexible, natively parallel infrastructure for coupling multiple uniphysics simulation codes into multiphysics computational systems. IMPACT works with codes written in several high-performance-computing (HPC) programming languages, and is designed from the beginning for HPC multiphysics code development. It is designed to be minimally invasive to the individual physics codes being integrated, and has few requirements on those physics codes for integration. The goal of IMPACT is to provide the support needed to enable coupling existing tools together in unique and innovative ways to produce powerful new multiphysics technologies without extensive modification and rewrite of the physics packages being integrated. There are three major outcomes from this project: 1) construction, testing, application, and open-source release of the IMPACT infrastructure, 2) production of example open-source multiphysics tools using IMPACT, and 3) identification and engagement of interested organizations in the tools and applications resulting from the project. This last outcome represents the incipient development of a user community and application ecosystem being built using IMPACT. Multiphysics coupling standardization can only come from organizations working together to define needs and processes that span the space of necessary multiphysics outcomes, which Illinois Rocstar plans to continue driving toward. The IMPACT system, including source code, documentation, and test problems, is now available through the public GitHub system to anyone interested in multiphysics code coupling. Many of the basic documents explaining use and architecture of IMPACT are also attached as appendices to this document.
Online HTML documentation is available through the GitHub site. There are over 100 unit tests provided that run through the Illinois Rocstar Application Development (IRAD) lightweight testing infrastructure that is also supplied along with IMPACT. The package as a whole provides an excellent base for developing high-quality multiphysics applications using modern software development practices. To facilitate understanding how to utilize IMPACT effectively, two multiphysics systems have been developed and are available open-source through GitHub. The simpler of the two systems, named ElmerFoamFSI in the repository, is a multiphysics, fluid-structure-interaction (FSI) coupling of the solid mechanics package Elmer with a fluid dynamics module from OpenFOAM. This coupling illustrates how to combine software packages that are unrelated by either author or architecture and combine them into a robust, parallel multiphysics system. A more complex multiphysics tool is the Illinois Rocstar Rocstar Multiphysics code that was rebuilt during the project around IMPACT. Rocstar Multiphysics was already an HPC multiphysics tool, but now that it has been rearchitected around IMPACT, it can be readily expanded to capture new and different physics in the future. In fact, during this project, the Elmer and OpenFOAM tools were also coupled into Rocstar Multiphysics and demonstrated. The full Rocstar Multiphysics codebase is also available on GitHub, and licensed for any organization to use as they wish. Finally, the new IMPACT product is already being used in several multiphysics code coupling projects for the Air Force, NASA and the Missile Defense Agency, and initial work on expansion of the IMPACT-enabled Rocstar Multiphysics has begun in support of a commercial company.
These initiatives promise to expand the interest and reach of IMPACT and Rocstar Multiphysics, ultimately leading to the envisioned standardization and consortium of users that was one of the goals of this project.
Henry, Stephen G.; Chen, Meng; Matthias, Marianne S.; Bell, Robert A.; Kravitz, Richard L.
2016-01-01
Objective. To describe the development and initial application of the Chronic Pain Coding System. Design. Secondary analysis of data from a randomized clinical trial. Setting. Six primary care clinics in northern California. Subjects. Forty-five primary care visits involving 33 clinicians and 45 patients on opioids for chronic noncancer pain. Methods. The authors developed a structured coding system to accurately and objectively characterize discussions about pain and opioids. Two coders applied the final system to visit transcripts. Intercoder agreement for major coding categories was moderate to substantial (kappa = 0.5–0.7). Mixed effects regression was used to test six hypotheses to assess preliminary construct validity. Results. Greater baseline pain interference was associated with longer pain discussions (P = 0.007) and more patient requests for clinician action (P = 0.02) but not more frequent negative patient evaluations of pain (P = 0.15). Greater clinician-reported visit difficulty was associated with more frequent disagreements with clinician recommendations (P = 0.003) and longer discussions of opioid risks (P = 0.049) but not more frequent requests for clinician action (P = 0.11). Rates of agreement versus disagreement with patient requests and clinician recommendations were similar for opioid-related and non-opioid–related utterances. Conclusions. This coding system appears to be a reliable and valid tool for characterizing patient-clinician communication about opioids and chronic pain during clinic visits. Objective data on how patients and clinicians discuss chronic pain and opioids are necessary to identify communication patterns and strategies for improving the quality and productivity of discussions about chronic pain that may lead to more effective pain management and reduce inappropriate opioid prescribing. PMID:26936453
Henry, Stephen G; Chen, Meng; Matthias, Marianne S; Bell, Robert A; Kravitz, Richard L
2016-10-01
To describe the development and initial application of the Chronic Pain Coding System. Secondary analysis of data from a randomized clinical trial. Six primary care clinics in northern California. Forty-five primary care visits involving 33 clinicians and 45 patients on opioids for chronic noncancer pain. The authors developed a structured coding system to accurately and objectively characterize discussions about pain and opioids. Two coders applied the final system to visit transcripts. Intercoder agreement for major coding categories was moderate to substantial (kappa = 0.5-0.7). Mixed effects regression was used to test six hypotheses to assess preliminary construct validity. Greater baseline pain interference was associated with longer pain discussions (P = 0.007) and more patient requests for clinician action (P = 0.02) but not more frequent negative patient evaluations of pain (P = 0.15). Greater clinician-reported visit difficulty was associated with more frequent disagreements with clinician recommendations (P = 0.003) and longer discussions of opioid risks (P = 0.049) but not more frequent requests for clinician action (P = 0.11). Rates of agreement versus disagreement with patient requests and clinician recommendations were similar for opioid-related and non-opioid-related utterances. This coding system appears to be a reliable and valid tool for characterizing patient-clinician communication about opioids and chronic pain during clinic visits. Objective data on how patients and clinicians discuss chronic pain and opioids are necessary to identify communication patterns and strategies for improving the quality and productivity of discussions about chronic pain that may lead to more effective pain management and reduce inappropriate opioid prescribing. © 2016 American Academy of Pain Medicine. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
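For readers unfamiliar with the intercoder agreement statistic quoted in the two records above (kappa = 0.5-0.7, "moderate to substantial"), Cohen's kappa compares observed agreement against the agreement expected by chance from each coder's category frequencies. A minimal computation is sketched below; the category labels and counts are invented for the example, not taken from the study.

```python
# Minimal Cohen's kappa between two coders' category assignments.
from collections import Counter

def cohens_kappa(coder1, coder2):
    """kappa = (p_observed - p_expected) / (1 - p_expected)."""
    assert len(coder1) == len(coder2)
    n = len(coder1)
    # Observed agreement: fraction of items coded identically.
    p_observed = sum(a == b for a, b in zip(coder1, coder2)) / n
    # Chance agreement: product of the coders' marginal frequencies.
    c1, c2 = Counter(coder1), Counter(coder2)
    categories = set(c1) | set(c2)
    p_expected = sum(c1[k] * c2[k] for k in categories) / (n * n)
    return (p_observed - p_expected) / (1 - p_expected)
```

With 70% raw agreement but 50% chance agreement (balanced two-category coding), kappa comes out to 0.4, which is why kappa rather than raw agreement is the conventional reliability figure.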
Evaluation of CFETR as a Fusion Nuclear Science Facility using multiple system codes
NASA Astrophysics Data System (ADS)
Chan, V. S.; Costley, A. E.; Wan, B. N.; Garofalo, A. M.; Leuer, J. A.
2015-02-01
This paper presents the results of a multi-system codes benchmarking study of the recently published China Fusion Engineering Test Reactor (CFETR) pre-conceptual design (Wan et al 2014 IEEE Trans. Plasma Sci. 42 495). Two system codes, General Atomics System Code (GASC) and Tokamak Energy System Code (TESC), using different methodologies to arrive at CFETR performance parameters under the same CFETR constraints show that the correlation between the physics performance and the fusion performance is consistent, and the computed parameters are in good agreement. Optimization of the first wall surface for tritium breeding and the minimization of the machine size are highly compatible. Variations of the plasma currents and profiles lead to changes in the required normalized physics performance; however, they do not significantly affect the optimized size of the machine. GASC and TESC have also been used to explore a lower aspect ratio, larger volume plasma taking advantage of the engineering flexibility in the CFETR design. Assuming the ITER steady-state scenario physics, the larger plasma together with a moderately higher B_T and I_p can result in a high-gain Q_fus ~ 12, P_fus ~ 1 GW machine approaching DEMO-like performance. It is concluded that the CFETR baseline mode can meet the minimum goal of the Fusion Nuclear Science Facility (FNSF) mission and advanced physics will enable it to address comprehensively the outstanding critical technology gaps on the path to a demonstration reactor (DEMO). Before proceeding with CFETR construction, steady-state operation has to be demonstrated, further development is needed to solve the divertor heat load issue, and blankets have to be designed with a tritium breeding ratio (TBR) >1 as a target.
Multi-level trellis coded modulation and multi-stage decoding
NASA Technical Reports Server (NTRS)
Costello, Daniel J., Jr.; Wu, Jiantian; Lin, Shu
1990-01-01
Several constructions for multi-level trellis codes are presented and many codes with better performance than previously known codes are found. These codes provide a flexible trade-off between coding gain, decoding complexity, and decoding delay. New multi-level trellis coded modulation schemes using generalized set partitioning methods are developed for Quadrature Amplitude Modulation (QAM) and Phase Shift Keying (PSK) signal sets. New rotationally invariant multi-level trellis codes which can be combined with differential encoding to resolve phase ambiguity are presented.
OCTGRAV: Sparse Octree Gravitational N-body Code on Graphics Processing Units
NASA Astrophysics Data System (ADS)
Gaburov, Evghenii; Bédorf, Jeroen; Portegies Zwart, Simon
2010-10-01
Octgrav is a very fast tree-code which runs on massively parallel Graphical Processing Units (GPU) with NVIDIA CUDA architecture. The algorithms are based on parallel-scan and sort methods. The tree construction and calculation of multipole moments is carried out on the host CPU, while the force calculation, which consists of tree walks and evaluation of interaction lists, is carried out on the GPU. In this way, a sustained performance of about 100 GFLOP/s and data transfer rates of about 50 GB/s are achieved. It takes about a second to compute forces on a million particles with an opening angle of θ ≈ 0.5. To test the performance and feasibility, we implemented the algorithms in CUDA in the form of a gravitational tree-code which completely runs on the GPU. The tree construction and traverse algorithms are portable to many-core devices which have support for CUDA or OpenCL programming languages. The gravitational tree-code outperforms tuned CPU code during the tree construction and shows a performance improvement of more than a factor 20 overall, resulting in a processing rate of more than 2.8 million particles per second. The code has a convenient user interface and is freely available for use.
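The opening angle mentioned above (θ ≈ 0.5) refers to the standard Barnes-Hut acceptance test used by gravitational tree-codes: a cell of size s at distance d is treated as a single point mass when s/d < θ. The following is a minimal CPU-only sketch of that criterion with monopole cells, not Octgrav's parallel-scan GPU implementation; setting θ = 0 forces the walk down to individual particles and recovers the direct sum.

```python
# Minimal Barnes-Hut octree sketch (G = 1, monopole cells only).
import math

class Node:
    def __init__(self, center, size, mass, com, children):
        self.center = center      # geometric center of the cubic cell
        self.size = size          # edge length of the cell
        self.mass = mass          # total mass inside the cell
        self.com = com            # center of mass of the cell
        self.children = children  # occupied child cells, or None for a leaf

def build(bodies, center, size):
    """Recursively build an octree over [(position, mass), ...]."""
    mass = sum(m for _, m in bodies)
    com = tuple(sum(p[k] * m for p, m in bodies) / mass for k in range(3))
    if len(bodies) == 1 or size < 1e-9:
        return Node(center, size, mass, com, None)
    children, q = [], size / 4.0
    for ix in (-1, 1):
        for iy in (-1, 1):
            for iz in (-1, 1):
                sub = [(p, m) for p, m in bodies
                       if (p[0] >= center[0]) == (ix > 0)
                       and (p[1] >= center[1]) == (iy > 0)
                       and (p[2] >= center[2]) == (iz > 0)]
                if sub:
                    c = (center[0] + ix * q, center[1] + iy * q, center[2] + iz * q)
                    children.append(build(sub, c, size / 2.0))
    return Node(center, size, mass, com, children)

def accel(node, pos, theta):
    """Acceleration at pos: open a cell only while size/distance >= theta."""
    dx = tuple(c - p for c, p in zip(node.com, pos))
    d = math.sqrt(sum(c * c for c in dx))
    if node.children is None or node.size / d < theta:
        return tuple(node.mass * c / d ** 3 for c in dx)  # point-mass cell
    ax = ay = az = 0.0
    for child in node.children:
        gx, gy, gz = accel(child, pos, theta)
        ax += gx; ay += gy; az += gz
    return (ax, ay, az)

def direct(bodies, pos):
    """Brute-force O(N) reference sum for validation."""
    acc = [0.0, 0.0, 0.0]
    for p, m in bodies:
        dx = [c - q for c, q in zip(p, pos)]
        d = math.sqrt(sum(c * c for c in dx))
        for k in range(3):
            acc[k] += m * dx[k] / d ** 3
    return tuple(acc)
```

Raising θ trades accuracy for speed by accepting larger cells earlier in the walk, which is the single knob behind the quoted million-particles-per-second rates.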
The WorkPlace distributed processing environment
NASA Technical Reports Server (NTRS)
Ames, Troy; Henderson, Scott
1993-01-01
Real time control problems require robust, high performance solutions. Distributed computing can offer high performance through parallelism and robustness through redundancy. Unfortunately, implementing distributed systems with these characteristics places a significant burden on the applications programmers. Goddard Code 522 has developed WorkPlace to alleviate this burden. WorkPlace is a small, portable, embeddable network interface which automates message routing, failure detection, and re-configuration in response to failures in distributed systems. This paper describes the design and use of WorkPlace, and its application in the construction of a distributed blackboard system.
ADAPTION OF NONSTANDARD PIPING COMPONENTS INTO PRESENT DAY SEISMIC CODES
DOE Office of Scientific and Technical Information (OSTI.GOV)
D. T. Clark; M. J. Russell; R. E. Spears
2009-07-01
With spiraling energy demand and flat energy supply, there is a need to extend the life of older nuclear reactors. This sometimes requires that existing systems be evaluated to present day seismic codes. Older reactors built in the 1960s and early 1970s often used fabricated piping components that were code compliant during their initial construction time period, but are outside the standard parameters of present-day piping codes. There are several approaches available to the analyst in evaluating these non-standard components to modern codes. The simplest approach is to use the flexibility factors and stress indices for similar standard components with the assumption that the non-standard component's flexibility factors and stress indices will be very similar. This approach can require significant engineering judgment. A more rational approach available in Section III of the ASME Boiler and Pressure Vessel Code, which is the subject of this paper, involves calculation of flexibility factors using finite element analysis of the non-standard component. Such analysis allows modeling of geometric and material nonlinearities. Flexibility factors based on these analyses are sensitive to the load magnitudes used in their calculation, load magnitudes that need to be consistent with those produced by the linear system analyses where the flexibility factors are applied. This can lead to iteration, since the magnitude of the loads produced by the linear system analysis depends on the magnitude of the flexibility factors. After the loading applied to the nonstandard component finite element model has been matched to loads produced by the associated linear system model, the component finite element model can then be used to evaluate the performance of the component under the loads with the nonlinear analysis provisions of the Code, should the load levels lead to calculated stresses in excess of Allowable stresses.
This paper details the application of component-level finite element modeling to account for geometric and material nonlinear component behavior in a linear elastic piping system model. Note that this technique can be applied to the analysis of B31 piping systems.
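The load/flexibility-factor iteration described above can be sketched as a simple fixed-point loop. The two functions below are illustrative stand-ins (assumptions, not the paper's models) for the nonlinear component finite element analysis and the linear piping system analysis; the point is only the structure of the iteration, in which each analysis's output feeds the other until the two are consistent.

```python
def component_flexibility(load):
    """Stand-in for a nonlinear FE analysis of the non-standard component:
    the flexibility factor softens as the load grows (illustrative only)."""
    return 1.0 + 0.5 * load / (1.0 + load)

def system_load(k):
    """Stand-in for the linear system analysis: a more flexible component
    attracts less load (illustrative only)."""
    return 10.0 / k

def iterate_flexibility(k=1.0, tol=1e-8, max_iter=100):
    """Iterate until the flexibility factor used by the system analysis
    matches the one the component analysis produces at the resulting load."""
    for _ in range(max_iter):
        load = system_load(k)
        k_new = component_flexibility(load)
        if abs(k_new - k) < tol:
            return k_new, load
        k = k_new
    raise RuntimeError("flexibility-factor iteration did not converge")
```

In practice each "function call" here is a full finite element or system analysis run, which is why keeping the iteration count low matters.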
Software Tools for Stochastic Simulations of Turbulence
2015-08-28
A client interface to FTI was developed. Specific client programs using this interface include the weather forecasting code WRF; the high energy physics code FLASH; and two locally constructed fluid codes.
Potential Job Creation in Rhode Island as a Result of Adopting New Residential Building Energy Codes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Scott, Michael J.; Niemeyer, Jackie M.
Are there advantages to states that adopt the most recent model building energy codes other than saving energy? For example, can the construction activity and energy savings associated with code-compliant housing units become significant sources of job creation for states if new building energy codes are adopted to cover residential construction? The U.S. Department of Energy (DOE) Building Energy Codes Program (BECP) asked Pacific Northwest National Laboratory (PNNL) to research and ascertain whether jobs would be created in individual states based on their adoption of model building energy codes. Each state in the country is dealing with high levels of unemployment, so job creation has become a top priority. Many programs have been created to combat unemployment with various degrees of failure and success. At the same time, many states still have not yet adopted the most current versions of the International Energy Conservation Code (IECC) model building energy code, when doing so could be a very effective tool in creating jobs to assist states in recovering from this economic downturn.
Potential Job Creation in Minnesota as a Result of Adopting New Residential Building Energy Codes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Scott, Michael J.; Niemeyer, Jackie M.
Are there advantages to states that adopt the most recent model building energy codes other than saving energy? For example, can the construction activity and energy savings associated with code-compliant housing units become significant sources of job creation for states if new building energy codes are adopted to cover residential construction? The U.S. Department of Energy (DOE) Building Energy Codes Program (BECP) asked Pacific Northwest National Laboratory (PNNL) to research and ascertain whether jobs would be created in individual states based on their adoption of model building energy codes. Each state in the country is dealing with high levels of unemployment, so job creation has become a top priority. Many programs have been created to combat unemployment with various degrees of failure and success. At the same time, many states still have not yet adopted the most current versions of the International Energy Conservation Code (IECC) model building energy code, when doing so could be a very effective tool in creating jobs to assist states in recovering from this economic downturn.
Potential Job Creation in Tennessee as a Result of Adopting New Residential Building Energy Codes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Scott, Michael J.; Niemeyer, Jackie M.
Are there advantages to states that adopt the most recent model building energy codes other than saving energy? For example, can the construction activity and energy savings associated with code-compliant housing units become significant sources of job creation for states if new building energy codes are adopted to cover residential construction? The U.S. Department of Energy (DOE) Building Energy Codes Program (BECP) asked Pacific Northwest National Laboratory (PNNL) to research and ascertain whether jobs would be created in individual states based on their adoption of model building energy codes. Each state in the country is dealing with high levels of unemployment, so job creation has become a top priority. Many programs have been created to combat unemployment with various degrees of failure and success. At the same time, many states still have not yet adopted the most current versions of the International Energy Conservation Code (IECC) model building energy code, when doing so could be a very effective tool in creating jobs to assist states in recovering from this economic downturn.
Potential Job Creation in Nevada as a Result of Adopting New Residential Building Energy Codes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Scott, Michael J.; Niemeyer, Jackie M.
Are there advantages to states that adopt the most recent model building energy codes other than saving energy? For example, can the construction activity and energy savings associated with code-compliant housing units become significant sources of job creation for states if new building energy codes are adopted to cover residential construction? The U.S. Department of Energy (DOE) Building Energy Codes Program (BECP) asked Pacific Northwest National Laboratory (PNNL) to research and ascertain whether jobs would be created in individual states based on their adoption of model building energy codes. Each state in the country is dealing with high levels of unemployment, so job creation has become a top priority. Many programs have been created to combat unemployment with various degrees of failure and success. At the same time, many states still have not yet adopted the most current versions of the International Energy Conservation Code (IECC) model building energy code, when doing so could be a very effective tool in creating jobs to assist states in recovering from this economic downturn.
Coded Cooperation for Multiway Relaying in Wireless Sensor Networks †
Si, Zhongwei; Ma, Junyang; Thobaben, Ragnar
2015-01-01
Wireless sensor networks have been considered as an enabling technology for constructing smart cities. One important feature of wireless sensor networks is that the sensor nodes collaborate in some manner for communications. In this manuscript, we focus on the model of multiway relaying with full data exchange where each user wants to transmit and receive data to and from all other users in the network. We derive the capacity region for this specific model and propose a coding strategy through coset encoding. To obtain good performance with practical codes, we choose spatially-coupled LDPC (SC-LDPC) codes for the coded cooperation. In particular, for the message broadcasting from the relay, we construct multi-edge-type (MET) SC-LDPC codes by repeatedly applying coset encoding. Due to the capacity-achieving property of the SC-LDPC codes, we prove that the capacity region can theoretically be achieved by the proposed MET SC-LDPC codes. Numerical results with finite node degrees are provided, which show that the achievable rates approach the boundary of the capacity region in both binary erasure channels and additive white Gaussian channels. PMID:26131675
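The full-data-exchange idea above can be illustrated with a binary XOR analogue of the coset encoding (a toy sketch, without the SC-LDPC channel coding): with K users, the relay broadcasts K-1 pairwise XORs of the uploaded messages, and each user peels off every other message using its own message as the starting point.

```python
def relay_broadcasts(msgs):
    """Relay forms K-1 pairwise XOR combinations of the K equal-length
    messages (a binary analogue of coset encoding)."""
    return [bytes(a ^ b for a, b in zip(msgs[i], msgs[i + 1]))
            for i in range(len(msgs) - 1)]

def recover(own_index, own_msg, broadcasts):
    """Each user recovers all other messages by chaining XORs outward
    from its own message."""
    k = len(broadcasts) + 1
    recovered = [None] * k
    recovered[own_index] = own_msg
    for i in range(own_index, k - 1):          # peel to the right
        recovered[i + 1] = bytes(x ^ y for x, y in zip(recovered[i], broadcasts[i]))
    for i in range(own_index, 0, -1):          # peel to the left
        recovered[i - 1] = bytes(x ^ y for x, y in zip(recovered[i], broadcasts[i - 1]))
    return recovered
```

Note the rate advantage that motivates the scheme: K-1 broadcast transmissions serve all K users simultaneously, instead of K*(K-1) pairwise exchanges.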
76 FR 70807 - Notice of Passenger Facility Charge (PFC) Approvals and Disapprovals
Federal Register 2010, 2011, 2012, 2013, 2014
2011-11-15
... (design and construction). Enplane road structural improvements (design and construction). Landside signage improvements (design and construction). Taxiway B-2 extension and taxiway B-1 rehabilitation (design and construction). Elevator and escalator safety code compliance improvements (design and...
Code of Federal Regulations, 2012 CFR
2012-10-01
....) NBIC. National Board Inspection Code published by the National Board of Boiler and Pressure Vessel.... American National Standards Institute. API. American Petroleum Institute. ASME. American Society of... separation into parts. Code of original construction. The manufacturer's or industry code in effect when the...
Code of Federal Regulations, 2014 CFR
2014-10-01
....) NBIC. National Board Inspection Code published by the National Board of Boiler and Pressure Vessel.... American National Standards Institute. API. American Petroleum Institute. ASME. American Society of... separation into parts. Code of original construction. The manufacturer's or industry code in effect when the...
Code of Federal Regulations, 2013 CFR
2013-10-01
....) NBIC. National Board Inspection Code published by the National Board of Boiler and Pressure Vessel.... American National Standards Institute. API. American Petroleum Institute. ASME. American Society of... separation into parts. Code of original construction. The manufacturer's or industry code in effect when the...
The Influence of Building Codes on Recreation Facility Design.
ERIC Educational Resources Information Center
Morrison, Thomas A.
1989-01-01
Implications of building codes upon design and construction of recreation facilities are investigated (national building codes, recreation facility standards, and misperceptions of design requirements). Recreation professionals can influence architectural designers to correct past deficiencies, but they must understand architectural and…
A test of the validity of the motivational interviewing treatment integrity code.
Forsberg, Lars; Berman, Anne H; Kallmén, Håkan; Hermansson, Ulric; Helgason, Asgeir R
2008-01-01
To evaluate the Swedish version of the Motivational Interviewing Treatment Code (MITI), MITI coding was applied to tape-recorded counseling sessions. Construct validity was assessed using factor analysis on 120 MITI-coded sessions. Discriminant validity was assessed by comparing MITI coding of motivational interviewing (MI) sessions with information- and advice-giving sessions as well as by comparing MI-trained practitioners with untrained practitioners. A principal-axis factoring analysis yielded some evidence for MITI construct validity. MITI differentiated between practitioners with different levels of MI training as well as between MI practitioners and advice-giving counselors, thus supporting discriminant validity. MITI may be used as a training tool together with supervision to confirm and enhance MI practice in clinical settings. MITI can also serve as a tool for evaluating MI integrity in clinical research.
A Boundary Condition for Simulation of Flow Over Porous Surfaces
NASA Technical Reports Server (NTRS)
Frink, Neal T.; Bonhaus, Daryl L.; Vatsa, Veer N.; Bauer, Steven X. S.; Tinetti, Ana F.
2001-01-01
A new boundary condition is presented for simulating the flow over passively porous surfaces. The model builds on the prior work of R.H. Bush to eliminate the need for constructing grid within an underlying plenum, thereby simplifying the numerical modeling of passively porous flow control systems and reducing computation cost. Code experts for two structured-grid flow solvers, TLNS3D and CFL3D, and one unstructured solver, USM3Dns, collaborated with an experimental porosity expert to develop the model and implement it into their respective codes. Results presented for the three codes on a slender forebody with circumferential porosity and a wing with leading-edge porosity demonstrate good agreement with experimental data and a remarkable ability to predict the aggregate aerodynamic effects of surface porosity with a simple boundary condition.
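A hedged sketch of the kind of plenum-free boundary condition described above (the linear transpiration law and porosity coefficient are assumptions for illustration, not the exact model of Bush or of the solvers named): the sealed plenum pressure floats to the value giving zero net mass flux through the porous patch, and each surface cell then transpires with a normal velocity proportional to its local pressure difference across the porous sheet.

```python
def porous_bc(p_surface, areas, sigma=0.1):
    """Toy passive-porosity boundary condition.

    p_surface: external static pressure per surface cell
    areas:     cell areas
    sigma:     assumed porosity coefficient (illustrative value)
    """
    # Zero net mass flux fixes the plenum pressure at the
    # area-weighted mean surface pressure.
    p_plenum = sum(p * a for p, a in zip(p_surface, areas)) / sum(areas)
    # Blowing where the surface pressure is below plenum pressure,
    # suction where it is above.
    v_normal = [sigma * (p_plenum - p) for p in p_surface]
    return p_plenum, v_normal
```

This captures why no plenum grid is needed: the plenum enters only through a single scalar pressure determined by a global mass balance.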
Sun, Xiaojing; Brown, Marilyn A.; Cox, Matt; ...
2015-03-11
This paper provides a global overview of the design, implementation, and evolution of building energy codes. Reflecting alternative policy goals, building energy codes differ significantly across the United States, the European Union, and China. This review uncovers numerous innovative practices including greenhouse gas emissions caps per square meter of building space, energy performance certificates with retrofit recommendations, and inclusion of renewable energy to achieve “nearly zero-energy buildings”. These innovations motivated an assessment of an aggressive commercial building code applied to all US states, requiring both new construction and buildings with major modifications to comply with the latest version of the ASHRAE 90.1 Standards. Using the National Energy Modeling System (NEMS), we estimate that by 2035, such building codes in the United States could reduce energy for space heating, cooling, water heating and lighting in commercial buildings by 16%, 15%, 20% and 5%, respectively. Impacts on different fuels and building types, energy rates and bills as well as pollution emission reductions are also examined.
Study of the OCDMA Transmission Characteristics in FSO-FTTH at Various Distances, Outdoor
NASA Astrophysics Data System (ADS)
Aldouri, Muthana Y.; Aljunid, S. A.; Fadhil, Hilal A.
2013-06-01
It is important to apply field-programmable gate array (FPGA) and optical switch technology as the encoder and decoder in a spectral amplitude coding optical code division multiple access (SAC-OCDMA) free space optics fiber-to-the-home (FSO-FTTH) transmitter and receiver system design. The encoder and decoder module uses the FPGA as a code generator and the optical switch to encode and decode the optical source. The module was tested using the Modified Double Weight (MDW) code, selected as an excellent candidate because it has shown superior performance, whereby the total noise is reduced. It is also easy to construct and can reduce the number of filters required at the receiver through a newly proposed detection scheme known as the AND subtraction technique. The MDW code is presented here to support fiber-to-the-home (FTTH) access networks in point-to-multipoint (P2MP) applications. The conversion used a Mach-Zehnder interferometer (MZI) wavelength converter. The performance is characterized through the bit error rate (BER) and bit rate (BR), as well as the received power at a variety of bit rates.
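The AND subtraction detection mentioned above can be illustrated with a toy pair of unipolar codes whose in-phase cross-correlation is 1 (the code words below are illustrative stand-ins, not the actual MDW matrix): correlating the received chips with the intended code and subtracting the correlation taken over the overlapping chips cancels the interfering user's contribution exactly.

```python
# Illustrative weight-3 unipolar codes with cross-correlation 1
# (stand-ins; not the MDW code matrix from the paper).
X = [1, 1, 0, 1, 0, 0]
Y = [0, 1, 1, 0, 1, 0]

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def and_subtraction_detect(r, own, other):
    """AND subtraction detection: the correlation over the chips shared
    by both codes (own AND other) carries all of the multiple-access
    interference when the cross-correlation is 1, so subtracting it
    leaves only the intended user's signal."""
    overlap = [a & b for a, b in zip(own, other)]
    return 1 if dot(r, own) - dot(r, overlap) > 0 else 0
```

With ideal (noise-free) chips, the decision statistic is the code weight minus the overlap (here 3 - 1 = 2) when the intended bit is 1, and exactly 0 otherwise, regardless of the interferer's bit.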
The NASA Neutron Star Grand Challenge: The coalescences of Neutron Star Binary System
NASA Astrophysics Data System (ADS)
Suen, Wai-Mo
1998-04-01
NASA funded a Grand Challenge Project (9/1996-1999) for the development of a multi-purpose numerical treatment for relativistic astrophysics and gravitational wave astronomy. The coalescence of binary neutron stars is chosen as the model problem for the code development. The institutions involved are Argonne National Laboratory, Lawrence Livermore National Laboratory, the Max Planck Institute at Potsdam, Stony Brook, the University of Illinois, and Washington University. We have recently succeeded in constructing a highly optimized parallel code which is capable of solving the full Einstein equations coupled with relativistic hydrodynamics, running at over 50 GFLOPS on a T3E (the second milestone point of the project). We are presently working on the head-on collisions of two neutron stars, and the inclusion of realistic equations of state into the code. The code will be released to the relativity and astrophysics community in April of 1998. With the full dynamics of the spacetime, relativistic hydro and microphysics all combined into a unified 3D code for the first time, many interesting large scale calculations in general relativistic astrophysics can now be carried out on massively parallel computers.
Self-consistent modeling of electron cyclotron resonance ion sources
NASA Astrophysics Data System (ADS)
Girard, A.; Hitz, D.; Melin, G.; Serebrennikov, K.; Lécot, C.
2004-05-01
In order to predict the performances of electron cyclotron resonance ion sources (ECRIS), it is necessary to accurately model the different parts of these sources: (i) magnetic configuration; (ii) plasma characteristics; (iii) extraction system. The magnetic configuration is easily calculated via commercial codes; different codes also simulate the ion extraction, either in two dimensions or even in three dimensions (to take into account the shape of the plasma at the extraction, influenced by the hexapole). However, the characteristics of the plasma are not always well understood. This article describes the self-consistent modeling of ECRIS: we have developed a code which takes into account the most important construction parameters: the size of the plasma (length, diameter), the mirror ratio and axial magnetic profile, and whether a biased probe is installed or not. These input parameters are used to feed a self-consistent code, which calculates the characteristics of the plasma: electron density and energy, charge state distribution, and plasma potential. The code is briefly described, and some of its most interesting results are presented. Comparisons are made between the calculations and the results obtained experimentally.
Trellises and Trellis-Based Decoding Algorithms for Linear Block Codes
NASA Technical Reports Server (NTRS)
Lin, Shu
1998-01-01
A code trellis is a graphical representation of a code, block or convolutional, in which every path represents a codeword (or a code sequence for a convolutional code). This representation makes it possible to implement Maximum Likelihood Decoding (MLD) of a code with reduced decoding complexity. The most well known trellis-based MLD algorithm is the Viterbi algorithm. The trellis representation was first introduced and used for convolutional codes [23]. This representation, together with the Viterbi decoding algorithm, has resulted in a wide range of applications of convolutional codes for error control in digital communications over the last two decades. For many years, however, little research was done on the trellis structure of linear block codes. There are two major reasons for this inactive period of research in this area. First, most coding theorists at that time believed that block codes did not have simple trellis structure like convolutional codes and maximum likelihood decoding of linear block codes using the Viterbi algorithm was practically impossible, except for very short block codes. Second, since almost all of the linear block codes are constructed algebraically or based on finite geometries, it was the belief of many coding theorists that algebraic decoding was the only way to decode these codes. These two reasons seriously hindered the development of efficient soft-decision decoding methods for linear block codes and their applications to error control in digital communications. This led to a general belief that block codes are inferior to convolutional codes and hence, that they were not useful. Chapter 2 gives a brief review of linear block codes. The goal is to provide the essential background material for the development of trellis structure and trellis-based decoding algorithms for linear block codes in the later chapters.
Chapters 3 through 6 present the fundamental concepts, finite-state machine model, state space formulation, basic structural properties, state labeling, construction procedures, complexity, minimality, and sectionalization of trellises. Chapter 7 discusses trellis decomposition and subtrellises for low-weight codewords. Chapter 8 first presents well-known methods for constructing long powerful codes from short component codes or component codes of smaller dimensions, and then provides methods for constructing their trellises, which include Shannon and Cartesian product techniques. Chapter 9 deals with convolutional codes, puncturing, zero-tail termination and tail-biting. Chapters 10 through 13 present various trellis-based decoding algorithms, old and new. Chapter 10 first discusses the application of the well known Viterbi decoding algorithm to linear block codes, optimum sectionalization of a code trellis to minimize computation complexity, and design issues for IC (integrated circuit) implementation of a Viterbi decoder. Then it presents a new decoding algorithm for convolutional codes, named the Differential Trellis Decoding (DTD) algorithm. Chapter 12 presents a suboptimum reliability-based iterative decoding algorithm with a low-weight trellis search for the most likely codeword. This decoding algorithm provides a good trade-off between error performance and decoding complexity. All the decoding algorithms presented in Chapters 10 through 12 are devised to minimize word error probability. Chapter 13 presents decoding algorithms that minimize bit error probability and provide the corresponding soft (reliability) information at the output of the decoder. Decoding algorithms presented are the MAP (maximum a posteriori probability) decoding algorithm and the Soft-Output Viterbi Algorithm (SOVA). Finally, the minimization of bit error probability in trellis-based MLD is discussed.
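As a concrete instance of trellis-based MLD for a block code, the sketch below builds the syndrome ("Wolf") trellis of the (7,4) Hamming code and runs hard-decision Viterbi decoding over it. This is the generic textbook construction, not code from the monograph: states at depth i are the partial syndromes, each branch adds a bit times the corresponding parity-check column, and a surviving path ending in the zero syndrome is a codeword at minimum Hamming distance from the received word.

```python
H = [  # parity-check matrix of the (7,4) Hamming code
    (1, 0, 1, 0, 1, 0, 1),
    (0, 1, 1, 0, 0, 1, 1),
    (0, 0, 0, 1, 1, 1, 1),
]

def columns(matrix):
    return [tuple(row[i] for row in matrix) for i in range(len(matrix[0]))]

def viterbi_decode(received):
    """Hard-decision Viterbi over the syndrome trellis: keep, per partial
    syndrome, the prefix with the smallest Hamming distance so far."""
    cols = columns(H)
    zero = (0,) * len(H)
    metric = {zero: (0, [])}            # state -> (distance, decoded prefix)
    for i in range(len(cols)):
        nxt = {}
        for state, (dist, path) in metric.items():
            for bit in (0, 1):
                new_state = tuple((s + bit * c) % 2 for s, c in zip(state, cols[i]))
                new_dist = dist + (bit != received[i])
                if new_state not in nxt or new_dist < nxt[new_state][0]:
                    nxt[new_state] = (new_dist, path + [bit])
        metric = nxt
    return metric[zero][1]              # survivor ending in the zero syndrome
```

Since the Hamming code has minimum distance 3, a single flipped bit is always corrected back to the transmitted codeword.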
Active plasma release experiments
NASA Technical Reports Server (NTRS)
1986-01-01
A pulse code modulator (PCM) encoder capable of storing data onboard into the mass memory in the encoder at up to 12 megabits per second was designed and constructed. This telemetry system was programmed for two successful flights. All parts of the electronic system functioned perfectly during both flights, and the detectors performed perfectly. However, in the first flight at Poker Flat, Alaska, an electron arm did not deploy for reasons as yet unknown. The ion arm deployed perfectly and good data were acquired.
NASA Technical Reports Server (NTRS)
Gelinas, R. J.; Doss, S. K.; Vajk, J. P.; Djomehri, J.; Miller, K.
1983-01-01
The mathematical background regarding the moving finite element (MFE) method of Miller and Miller (1981) is discussed, taking into account a general system of partial differential equations (PDE) and the amenability of the MFE method in two dimensions to code modularization and to semiautomatic user-construction of numerous PDE systems for both Dirichlet and zero-Neumann boundary conditions. A description of test problem results is presented, giving attention to aspects of single square wave propagation, and a solution of the heat equation.
NASA Astrophysics Data System (ADS)
Gao, Jian; Wang, Yongkang
2018-01-01
Structural properties of u-constacyclic codes over the ring F_p + uF_p are given, where p is an odd prime and u^2 = 1. Under a special Gray map from F_p + uF_p to F_p^2, some new non-binary quantum codes are obtained from this class of constacyclic codes.
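For intuition about this ring: since u^2 = 1 and p is odd, F_p + uF_p splits by the Chinese remainder theorem at u = 1 and u = -1, and phi(a + ub) = (a + b, a - b) is one standard Gray-style map realizing that splitting. Whether this matches the paper's "special Gray map" is an assumption; the sketch below just checks, at p = 5, that this phi is a bijection onto F_p^2 that turns ring multiplication into componentwise multiplication.

```python
p = 5  # illustrative odd prime

def mul(x, y):
    """Multiply a + ub and c + ud in F_p + uF_p, using u^2 = 1."""
    (a, b), (c, d) = x, y
    return ((a * c + b * d) % p, (a * d + b * c) % p)

def phi(x):
    """Gray-style map phi(a + ub) = (a + b, a - b) mod p."""
    a, b = x
    return ((a + b) % p, (a - b) % p)

elems = [(a, b) for a in range(p) for b in range(p)]

# phi is a bijection from the p^2 ring elements onto F_p x F_p ...
assert len({phi(x) for x in elems}) == p * p

# ... and a ring homomorphism: multiplication becomes componentwise.
for x in elems:
    for y in elems:
        fx, fy = phi(x), phi(y)
        assert phi(mul(x, y)) == ((fx[0] * fy[0]) % p, (fx[1] * fy[1]) % p)
```

This CRT picture is why structural results about constacyclic codes over F_p + uF_p transfer so cleanly to pairs of codes over F_p.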
Gel, Aytekin; Hu, Jonathan; Ould-Ahmed-Vall, ElMoustapha; ...
2017-03-20
Legacy codes remain a crucial element of today's simulation-based engineering ecosystem due to the extensive validation process and investment in such software. The rapid evolution of high-performance computing architectures necessitates the modernization of these codes. One approach to modernization is a complete overhaul of the code. However, this could require extensive investments, such as rewriting in modern languages, new data constructs, etc., which will necessitate systematic verification and validation to re-establish the credibility of the computational models. The current study advocates a more incremental approach and is a culmination of several modernization efforts of the legacy code MFIX, which is an open-source computational fluid dynamics code that has evolved over several decades, widely used in multiphase flows and still being developed by the National Energy Technology Laboratory. Two different modernization approaches, 'bottom-up' and 'top-down', are illustrated. Preliminary results show up to an 8.5x improvement at the selected kernel level with the first approach, and up to a 50% improvement in total simulated time with the latter, for the demonstration cases and target HPC systems employed.
Studies of Planet Formation Using a Hybrid N-Body + Planetesimal Code
NASA Technical Reports Server (NTRS)
Kenyon, Scott J.
2004-01-01
The goal of our proposal was to use a hybrid multi-annulus planetesimal/n-body code to examine the planetesimal theory, one of the two main theories of planet formation. We developed this code to follow the evolution of numerous 1 m to 1 km planetesimals as they collide, merge, and grow into full-fledged planets. Our goal was to apply the code to several well-posed, topical problems in planet formation and to derive observational consequences of the models. We planned to construct detailed models to address two fundamental issues: (1) icy planets: models for icy planet formation will demonstrate how the physical properties of debris disks - including the Kuiper Belt in our solar system - depend on initial conditions and input physics; and (2) terrestrial planets: calculations following the evolution of 1-10 km planetesimals into Earth-mass planets and rings of dust will provide a better understanding of how terrestrial planets form and interact with their environment.
Code of Federal Regulations, 2012 CFR
2012-04-01
... from social security taxes by certain tax-exempt organizations. 31.3121(k)-4 Section 31.3121(k)-4... Contributions Act (Chapter 21, Internal Revenue Code of 1954) General Provisions § 31.3121(k)-4 Constructive... organization did not file a valid waiver certificate under section 3121(k)(1) of the Internal Revenue Code of...
Code of Federal Regulations, 2011 CFR
2011-04-01
... from social security taxes by certain tax-exempt organizations. 31.3121(k)-4 Section 31.3121(k)-4... Contributions Act (Chapter 21, Internal Revenue Code of 1954) General Provisions § 31.3121(k)-4 Constructive... organization did not file a valid waiver certificate under section 3121(k)(1) of the Internal Revenue Code of...
Code of Federal Regulations, 2013 CFR
2013-04-01
... from social security taxes by certain tax-exempt organizations. 31.3121(k)-4 Section 31.3121(k)-4... Contributions Act (Chapter 21, Internal Revenue Code of 1954) General Provisions § 31.3121(k)-4 Constructive... organization did not file a valid waiver certificate under section 3121(k)(1) of the Internal Revenue Code of...
Code of Federal Regulations, 2014 CFR
2014-04-01
... from social security taxes by certain tax-exempt organizations. 31.3121(k)-4 Section 31.3121(k)-4... Contributions Act (Chapter 21, Internal Revenue Code of 1954) General Provisions § 31.3121(k)-4 Constructive... organization did not file a valid waiver certificate under section 3121(k)(1) of the Internal Revenue Code of...
Code of Federal Regulations, 2012 CFR
2012-04-01
... construction, a trained and qualified servicing housing office representative or building inspector will review the construction to ensure that it meets applicable minimum construction standards and building codes...
Code of Federal Regulations, 2013 CFR
2013-04-01
... construction, a trained and qualified servicing housing office representative or building inspector will review the construction to ensure that it meets applicable minimum construction standards and building codes...
Code of Federal Regulations, 2014 CFR
2014-04-01
... construction, a trained and qualified servicing housing office representative or building inspector will review the construction to ensure that it meets applicable minimum construction standards and building codes...
Code of Federal Regulations, 2011 CFR
2011-04-01
... construction, a trained and qualified servicing housing office representative or building inspector will review the construction to ensure that it meets applicable minimum construction standards and building codes...
Code of Federal Regulations, 2010 CFR
2010-04-01
... construction, a trained and qualified servicing housing office representative or building inspector will review the construction to ensure that it meets applicable minimum construction standards and building codes...
Protecting quantum memories using coherent parity check codes
NASA Astrophysics Data System (ADS)
Roffe, Joschka; Headley, David; Chancellor, Nicholas; Horsman, Dominic; Kendon, Viv
2018-07-01
Coherent parity check (CPC) codes are a new framework for the construction of quantum error correction codes that encode multiple qubits per logical block. CPC codes have a canonical structure involving successive rounds of bit and phase parity checks, supplemented by cross-checks to fix the code distance. In this paper, we provide a detailed introduction to CPC codes using conventional quantum circuit notation. We demonstrate the implementation of a CPC code on real hardware by designing a [[4, 2, 2]] code.
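The [[4,2,2]] code is the standard smallest example of a block encoding two logical qubits, with stabilizers XXXX and ZZZZ; treating it as the instance meant in the abstract is an assumption. The sketch below verifies classically, in symplectic (x|z) notation, that every single-qubit Pauli error anticommutes with at least one of the two parity checks and is therefore detected (two Paulis commute iff x1.z2 + z1.x2 = 0 mod 2).

```python
def commutes(p, q):
    """Symplectic commutation test for Paulis written as (x, z) bit tuples."""
    (x1, z1), (x2, z2) = p, q
    s = sum(a * b for a, b in zip(x1, z2)) + sum(a * b for a, b in zip(z1, x2))
    return s % 2 == 0

IDENT = (0, 0, 0, 0)
XXXX = ((1, 1, 1, 1), IDENT)   # bit-type parity check
ZZZZ = (IDENT, (1, 1, 1, 1))   # phase-type parity check

def detected(error):
    """An error is detected iff it anticommutes with some stabilizer."""
    return not commutes(error, XXXX) or not commutes(error, ZZZZ)

single_errors = []
for i in range(4):
    e = tuple(1 if j == i else 0 for j in range(4))
    single_errors += [(e, IDENT), (IDENT, e), (e, e)]  # X_i, Z_i, Y_i
```

All 12 single-qubit errors trip a check, while the identity gives the trivial syndrome, which is exactly the distance-2 detection property.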
Ross, Jaclyn M.; Girard, Jeffrey M.; Wright, Aidan G.C.; Beeney, Joseph E.; Scott, Lori N.; Hallquist, Michael N.; Lazarus, Sophie A.; Stepp, Stephanie D.; Pilkonis, Paul A.
2016-01-01
Relationships are among the most salient factors affecting happiness and wellbeing for individuals and families. Relationship science has identified the study of dyadic behavioral patterns between couple members during conflict as an important window into relational functioning, with both short-term and long-term consequences. Several methods have been developed for the momentary assessment of behavior during interpersonal transactions. Among these, the most popular is the Specific Affect Coding System (SPAFF), which organizes social behavior into a set of discrete behavioral constructs. This study examines the interpersonal meaning of the SPAFF codes through the lens of interpersonal theory, which uses the fundamental dimensions of Dominance and Affiliation to organize interpersonal behavior. A sample of 67 couples completed a conflict task, which was video recorded and coded using SPAFF and a method for rating momentary interpersonal behavior, the Continuous Assessment of Interpersonal Dynamics (CAID). Actor-partner interdependence models in a multilevel structural equation modeling framework were used to study the covariation of SPAFF codes and CAID ratings. Results showed that a number of SPAFF codes had clear interpersonal signatures, but many did not. Additionally, actor and partner effects for the same codes were strongly consistent with interpersonal theory's principle of complementarity. Thus, findings reveal points of convergence and divergence in the two systems and provide support for central tenets of interpersonal theory. Future directions based on these initial findings are discussed. PMID:27148786
DNA fingerprinting of Chinese melon provides evidentiary support of seed quality appraisal.
Gao, Peng; Ma, Hongyan; Luan, Feishi; Song, Haibin
2012-01-01
Melon, Cucumis melo L., is an important vegetable crop worldwide. At present, homonyms and synonyms are present in the melon seed markets of China, which can cause variety authenticity issues affecting melon breeding, production, marketing and other aspects. Molecular markers, especially microsatellites or simple sequence repeats (SSRs), are playing increasingly important roles in cultivar identification. The aim of this study was to construct a DNA fingerprinting database of major melon cultivars, which could enable the establishment of a technical standard system for purity and authenticity identification of melon seeds. In this study, to develop the core set of SSR markers, 470 polymorphic SSRs were selected as candidate markers from 1219 SSRs using 20 representative melon varieties (lines). Eighteen SSR markers, evenly distributed across the genome and with the highest polymorphism information content (PIC), were identified as the core marker set for melon DNA fingerprinting analysis. Fingerprint codes for 471 melon varieties (lines) were established. Fifty-one materials were classified into 17 groups based on sharing the same fingerprint code, and field trait surveys showed that the plants in each group were synonyms because of their identical or similar field characters. Furthermore, DNA fingerprinting quick response (QR) codes of the 471 melon varieties (lines) were constructed. Owing to their fast readability and large storage capacity, QR codes for melon DNA fingerprints favor reading convenience and commercial application.
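The synonym-detection step described above reduces to grouping varieties that share an identical fingerprint code. The following Python sketch uses made-up fingerprint strings (one symbol per marker, shortened from the study's 18 markers to 6) purely for illustration:

```python
from collections import defaultdict

# Hypothetical SSR fingerprint codes: variety name -> marker-allele string.
fingerprints = {
    "Variety-01": "A3B1C2D4E1F2",
    "Variety-02": "A3B1C2D4E1F2",   # same code -> candidate synonym of Variety-01
    "Variety-03": "A1B2C2D4E1F2",
    "Variety-04": "A1B2C2D4E1F2",   # candidate synonym of Variety-03
    "Variety-05": "A2B2C1D3E2F1",
}

# Group varieties by fingerprint; groups with >1 member are possible synonyms
# (to be confirmed by field trait surveys, as in the study).
groups = defaultdict(list)
for name, code in fingerprints.items():
    groups[code].append(name)

synonyms = [members for members in groups.values() if len(members) > 1]
print(synonyms)  # -> [['Variety-01', 'Variety-02'], ['Variety-03', 'Variety-04']]
```

A QR code for each variety would then simply encode its fingerprint string, which is what makes the database machine-readable in the field.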
Qing, Yu; Shuai, Han; Qiang, Wang; Jing-Bo, Xue
2017-06-08
To report the integration progress of the hydatid disease information management system, and to provide a reference for further system improvements through analysis of simulation test feedback. Institutional code matching was carried out by collecting fundamental and integrated information on the system in areas where hydatid disease is epidemic, and professional control agencies were selected to carry out the simulation test. The results of agency code matching at this stage indicated average completion rates of 94.30% for administrative agencies, 69.94% for registered professional agencies and 56.40% for professional institutions involved in hydatid disease prevention and control in seven provinces (autonomous regions) and the Xinjiang Production and Construction Corps. Meanwhile, the response rate for open-ended proposals was 93.33% across fifteen feedbacks; 21.43% of respondents felt the system ran slowly, 64.29% found data input inconvenient and 42.86% thought the statistical functions should be improved, of whom 27.78% were provincial users, 22.22% were city users and 50.00% were county users. The hydatid disease prevention information management system meets the fundamental needs of most agencies in areas where echinococcosis is hyperendemic; further testing with more participating agencies is needed after institutional code matching is completed and the system service is improved in the next stage.
Retrofitting the AutoBayes Program Synthesis System with Concrete Syntax
NASA Technical Reports Server (NTRS)
Fischer, Bernd; Visser, Eelco
2004-01-01
AutoBayes is a fully automatic, schema-based program synthesis system for statistical data analysis applications. Its core component is a schema library, i.e., a collection of generic code templates with associated applicability constraints which are instantiated in a problem-specific way during synthesis. Currently, AutoBayes is implemented in Prolog; the schemas thus use abstract syntax (i.e., Prolog terms) to formulate the templates. However, the conceptual distance between this abstract representation and the concrete syntax of the generated programs makes the schemas hard to create and maintain. In this paper we describe how AutoBayes is retrofitted with concrete syntax. We show how it is integrated into Prolog and describe how the seamless interaction of concrete syntax fragments with AutoBayes's remaining legacy meta-programming kernel based on abstract syntax is achieved. We apply the approach to gradually migrate individual schemas without forcing a disruptive migration of the entire system to a different representation. First experiences show that a smooth migration can be achieved. Moreover, it can result in a considerable reduction of the code size and improved readability of the code. In particular, abstracting out fresh-variable generation and second-order term construction allows the formulation of larger continuous fragments.
Lebeau, Jean-Pierre; Cadwallader, Jean-Sébastien; Vaillant-Roussel, Hélène; Pouchain, Denis; Yaouanc, Virginie; Aubin-Auger, Isabelle; Mercier, Alain; Rusch, Emmanuel; Remmen, Roy; Vermeire, Etienne; Hendrickx, Kristin
2016-05-13
To construct a typology of general practitioners' (GPs) responses regarding their justification of therapeutic inertia in cardiovascular primary prevention for high-risk patients with hypertension. Empirically grounded construction of typology. Types were defined by attributes derived from the qualitative analysis of GPs' reported reasons for inaction. 256 GPs randomised in the intervention group of a cluster randomised controlled trial. GPs members of 23 French Regional Colleges of Teachers in General Practice, included in the EffectS of a multifaceted intervention on CArdiovascular risk factors in high-risk hyPErtensive patients (ESCAPE) trial. The database consisted of 2638 written responses given by the GPs to an open-ended question asking for the reasons why drug treatment was not changed as suggested by the national guidelines. All answers were coded using constant comparison analysis. A matrix analysis of codes per GP allowed the construction of a response typology, where types were defined by codes as attributes. Initial coding and definition of types were performed independently by two teams. Initial coding resulted in a list of 69 codes in the final codebook, representing 4764 coded references in the question responses. A typology including seven types was constructed. 100 GPs were allocated to one and only one of these types, while 25 GPs did not provide enough data to allow classification. Types (numbers of GPs allocated) were: 'optimists' (28), 'negotiators' (20), 'checkers' (15), 'contextualisers' (13), 'cautious' (11), 'rounders' (8) and 'scientists' (5). For the 36 GPs that provided 50 or more coded references, analysis of the code evolution over time and across patients showed a consistent belonging to the initial type for any given GP. 
This typology could provide GPs with some insight into their general ways of considering changes in the treatment/management of cardiovascular risk factors and guide the design of specific physician-centred interventions to reduce inappropriate inaction. NCT00348855.
Design of a Double Anode Magnetron Injection Gun for Q-band Gyro-TWT Using Boundary Element Method
NASA Astrophysics Data System (ADS)
Li, Zhiliang; Feng, Jinjun; Liu, Bentian
2018-04-01
This paper presents a novel design code for double anode magnetron injection guns (MIGs) in gyro-devices based on the boundary element method (BEM). The physical and mathematical models were constructed, and a code using BEM for MIG calculation was developed. Using the code, a double anode MIG for a Q-band gyrotron traveling-wave tube (gyro-TWT) amplifier operating in the circular TE01 mode at the fundamental cyclotron harmonic was designed. To verify the reliability of this code, the velocity spread and guiding center radius of the MIG simulated by the BEM code were compared with those from the commonly used EGUN code, showing reasonable agreement. A Q-band gyro-TWT was then fabricated and tested. The testing results show that the device has achieved an average power of 5 kW and a peak power of at least 150 kW at a 3% duty cycle within a bandwidth of 2 GHz, and a maximum output peak power of 220 kW, with a corresponding saturated gain of 50.9 dB and efficiency of 39.8%. This paper demonstrates that the BEM code can be used as an effective approach for the analysis of electron optics systems in gyro-devices.
Yurko, Joseph P.; Buongiorno, Jacopo; Youngblood, Robert
2015-05-28
System codes for simulation of safety performance of nuclear plants may contain parameters whose values are not known very accurately. New information from tests or operating experience is incorporated into safety codes by a process known as calibration, which reduces uncertainty in the output of the code and thereby improves its support for decision-making. The work reported here implements several improvements on classic calibration techniques afforded by modern analysis techniques. The key innovation has come from development of code surrogate model (or code emulator) construction and prediction algorithms. Use of a fast emulator makes the calibration processes used here with Markov Chain Monte Carlo (MCMC) sampling feasible. This study uses Gaussian Process (GP) based emulators, which have been used previously to emulate computer codes in the nuclear field. The present work describes the formulation of an emulator that incorporates GPs into a factor analysis-type or pattern recognition-type model. This “function factorization” Gaussian Process (FFGP) model allows overcoming limitations present in standard GP emulators, thereby improving both accuracy and speed of the emulator-based calibration process. Calibration of a friction-factor example using a Method of Manufactured Solution is performed to illustrate key properties of the FFGP based process.
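The emulator-plus-MCMC calibration loop described above can be sketched compactly. The Python example below is a minimal illustration under stated assumptions: a toy stand-in "system code", an ordinary GP posterior-mean emulator (not the paper's FFGP model), a uniform prior, and a simple Metropolis sampler.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for an expensive system code: a smooth, monotonic 1-D model.
def code(theta):
    return theta + 0.3 * np.sin(theta)

# 1) Build a cheap GP emulator from a handful of code runs (RBF kernel).
X = np.linspace(0.0, 3.0, 8)
y = code(X)

def rbf(a, b, ell=0.7):
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ell**2)

K = rbf(X, X) + 1e-8 * np.eye(len(X))
alpha = np.linalg.solve(K, y)

def emulator(theta):
    return rbf(np.atleast_1d(theta), X) @ alpha   # GP posterior mean

# 2) Calibrate theta against one noisy observation by Metropolis MCMC,
#    using the fast emulator in place of the expensive code.
theta_true, sigma = 1.8, 0.05
data = code(theta_true) + rng.normal(0.0, sigma)

def log_post(theta):
    if not 0.0 <= theta <= 3.0:          # uniform prior on [0, 3]
        return -np.inf
    return -0.5 * ((data - emulator(theta)[0]) / sigma) ** 2

samples, cur = [], 1.0
for _ in range(4000):
    prop = cur + rng.normal(0.0, 0.2)    # random-walk proposal
    if np.log(rng.uniform()) < log_post(prop) - log_post(cur):
        cur = prop
    samples.append(cur)

post = np.array(samples[1000:])          # discard burn-in
print(round(post.mean(), 2))             # posterior mean near theta_true
```

Because every MCMC step queries the emulator rather than the code itself, thousands of posterior samples cost only a few real code runs, which is exactly the leverage the abstract describes.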
Improved wavelength coded optical time domain reflectometry based on the optical switch.
Zhu, Ninghua; Tong, Youwan; Chen, Wei; Wang, Sunlong; Sun, Wenhui; Liu, Jianguo
2014-06-16
This paper presents an improved wavelength-coded time-domain reflectometry based on a 2 × 1 optical switch. In this scheme, to improve the signal-to-noise ratio (SNR) of the beat signal, the improved system uses an optical switch to obtain wavelength-stable, low-noise and narrow optical pulses for the probe and reference. Experiments demonstrated a spatial resolution of 2.5 m within a range of 70 km and obtained a beat signal with a linewidth narrower than 15 MHz within a range of 50 km in fiber break detection. A system for wavelength-division-multiplexing passive optical network (WDM-PON) monitoring was also constructed to detect fiber breaks in different channels by tuning the current applied to the grating section of the distributed Bragg reflector (DBR) laser.
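For context, the two-point spatial resolution of a pulsed reflectometer follows from the round-trip travel time in the fiber, dz = c·tau / (2·n). A quick sketch (the group index is an assumed typical value for standard single-mode fiber, not a figure from the paper):

```python
# Two-point spatial resolution of a pulsed OTDR: dz = c * tau / (2 * n_group),
# because the probe pulse travels out to the fault and back.
C = 299_792_458.0      # vacuum speed of light, m/s
N_GROUP = 1.468        # assumed group index of standard single-mode fiber

def spatial_resolution(pulse_width_s):
    """Spatial resolution (m) achievable with a given probe pulse width (s)."""
    return C * pulse_width_s / (2 * N_GROUP)

def pulse_width_for(dz_m):
    """Probe pulse width (s) needed for a target spatial resolution (m)."""
    return 2 * N_GROUP * dz_m / C

# Pulse width implied by the 2.5 m resolution reported in the abstract.
print(round(pulse_width_for(2.5) * 1e9, 1))   # prints 24.5 (ns)
```

So the reported 2.5 m resolution over 70 km corresponds to roughly 25 ns probe pulses, which is consistent with the abstract's emphasis on generating narrow, low-noise pulses.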
Federal Register 2010, 2011, 2012, 2013, 2014
2013-05-13
... construction (NAICS code 236), e.g., commercial building construction, industrial building construction, commercial and institutional building construction, building finishing contractors, drywall and insulation... Lead; Renovation, Repair, and Painting Program for Public and Commercial Buildings; Notice of Public...
ERIC Educational Resources Information Center
Swank, Linda K.
1994-01-01
Relationships between phonological coding abilities and reading outcomes have implications for differential diagnosis of language-based reading problems. The theoretical construct of specific phonological coding ability is explained, including phonological encoding, phonological awareness and metaphonology, lexical access, working memory, and…
Hybridizing Gravitational Waveforms of Inspiralling Binary Neutron Star Systems
NASA Astrophysics Data System (ADS)
Cullen, Torrey; LIGO Collaboration
2016-03-01
Gravitational waves are ripples in space and time, predicted by Albert Einstein to be produced by astrophysical systems such as binary neutron stars. These are key targets for the Laser Interferometer Gravitational-Wave Observatory (LIGO), which uses template waveforms to find weak signals. The simplified template models are known to break down at high frequency, so I wrote code that constructs hybrid waveforms from numerical simulations to accurately cover a large range of frequencies. These hybrid waveforms use post-Newtonian template models at low frequencies and numerical data from simulations at high frequencies. They are constructed by reading in existing post-Newtonian models with the same masses as the simulated stars, reading in the numerical data from the simulations, and finding the ideal frequency and alignment to "stitch" these waveforms together.
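The stitching step described above amounts to blending two aligned waveforms over an overlap window. A minimal Python sketch follows, with toy signals rather than LIGO data; the time/phase alignment search is omitted, and the two inputs are identical by construction so the blend is exact.

```python
import numpy as np

t = np.linspace(0.0, 1.0, 4001)

# Stand-in "post-Newtonian" chirp, valid at low frequency (hypothetical signal).
def pn_strain(t):
    return np.sin(2 * np.pi * (20 * t + 15 * t**2))

# Stand-in "numerical-relativity" strain; in practice this is read from a
# simulation, here it is identical to the PN model by construction.
nr = pn_strain(t)

# Blend over an overlap window [t0, t1] with a smooth cosine ramp:
# pure PN before t0, pure NR after t1, gradual hand-off in between.
t0, t1 = 0.4, 0.6
w = np.clip((t - t0) / (t1 - t0), 0.0, 1.0)
ramp = 0.5 * (1 - np.cos(np.pi * w))      # 0 before t0, 1 after t1
hybrid = (1 - ramp) * pn_strain(t) + ramp * nr

# Outside the window the hybrid equals the corresponding input exactly.
assert np.allclose(hybrid[t < t0], pn_strain(t)[t < t0])
assert np.allclose(hybrid[t > t1], nr[t > t1])
```

In a real hybridization the NR segment would first be time- and phase-shifted to maximize its overlap with the PN model inside the window; the smooth ramp then avoids introducing spurious high-frequency content at the junction.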
Percolation bounds for decoding thresholds with correlated erasures in quantum LDPC codes
NASA Astrophysics Data System (ADS)
Hamilton, Kathleen; Pryadko, Leonid
Correlations between errors can dramatically affect decoding thresholds, in some cases eliminating the threshold altogether. We analyze the existence of a threshold for quantum low-density parity-check (LDPC) codes in the case of correlated erasures. When erasures are positively correlated, the corresponding multi-variate Bernoulli distribution can be modeled in terms of cluster errors, where qubits in clusters of various size can be marked all at once. In a code family with distance scaling as a power law of the code length, erasures can always be corrected below percolation on a qubit adjacency graph associated with the code. We bound this correlated percolation transition by weighted (uncorrelated) percolation on a specially constructed cluster connectivity graph, and apply our recent results to construct several bounds for the latter. This research was supported in part by the NSF Grant PHY-1416578 and by the ARO Grant W911NF-14-1-0272.
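The argument above reduces erasure correctability to a percolation question on a graph. As a self-contained illustration of the underlying phenomenon (standard site percolation on a square grid, not the authors' cluster connectivity graph), the union-find sweep below shows spanning clusters appearing as the marking probability crosses the threshold:

```python
import random

def percolates(L, p, rng):
    """Site percolation on an L x L grid: True if an open cluster spans top to bottom."""
    open_ = [[rng.random() < p for _ in range(L)] for _ in range(L)]
    parent = list(range(L * L + 2))          # plus two virtual nodes: top, bottom
    TOP, BOT = L * L, L * L + 1

    def find(a):                              # union-find with path halving
        while parent[a] != a:
            parent[a] = parent[parent[a]]
            a = parent[a]
        return a

    def union(a, b):
        parent[find(a)] = find(b)

    for r in range(L):
        for c in range(L):
            if not open_[r][c]:
                continue
            i = r * L + c
            if r == 0:
                union(i, TOP)
            if r == L - 1:
                union(i, BOT)
            for dr, dc in ((1, 0), (0, 1)):   # join open right/down neighbors
                rr, cc = r + dr, c + dc
                if rr < L and cc < L and open_[rr][cc]:
                    union(i, rr * L + cc)
    return find(TOP) == find(BOT)

rng = random.Random(1)
# Below/above the square-lattice site threshold (~0.593), spanning is rare/common.
low = sum(percolates(24, 0.45, rng) for _ in range(40))
high = sum(percolates(24, 0.75, rng) for _ in range(40))
assert low < high
```

In the correlated-erasure setting, the weights on the cluster connectivity graph play the role of the marking probability here, and the code remains correctable as long as the marked region stays below its percolation transition.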
2013-01-01
Background The harmonization of European health systems brings with it a need for tools to allow the standardized collection of information about medical care. A common coding system and standards for the description of services are needed to allow local data to be incorporated into evidence-informed policy, and to permit equity and mobility to be assessed. The aim of this project has been to design such a classification and a related tool for the coding of services for Long Term Care (DESDE-LTC), based on the European Service Mapping Schedule (ESMS). Methods The development of DESDE-LTC followed an iterative process using nominal groups in 6 European countries. 54 researchers and stakeholders in health and social services contributed to this process. In order to classify services, we use the minimal organization unit or “Basic Stable Input of Care” (BSIC), coded by its principal function or “Main Type of Care” (MTC). The evaluation of the tool included an analysis of feasibility, consistency, ontology, inter-rater reliability, Boolean Factor Analysis, and a preliminary impact analysis (screening, scoping and appraisal). Results DESDE-LTC includes an alpha-numerical coding system, a glossary and an assessment instrument for mapping and counting LTC. It shows high feasibility, consistency, inter-rater reliability and face, content and construct validity. DESDE-LTC is ontologically consistent. It is regarded by experts as useful and relevant for evidence-informed decision making. Conclusion DESDE-LTC contributes to establishing a common terminology, taxonomy and coding of LTC services in a European context, and a standard procedure for data collection and international comparison. PMID:23768163
Tsuchimoto, Masashi; Tanimura, Yoshitaka
2015-08-11
A system with many energy states coupled to a harmonic oscillator bath is considered. To study quantum non-Markovian system-bath dynamics numerically rigorously and nonperturbatively, we developed a computer code for the reduced hierarchy equations of motion (HEOM) for a graphics processing unit (GPU) that can treat systems as large as 4096 energy states. The code employs a Padé spectrum decomposition (PSD) for the construction of the HEOM and exponential integrators. Dynamics of a quantum spin glass system are studied by calculating the free induction decay signal for 3 × 2 to 3 × 4 triangular lattices with antiferromagnetic interactions. We found that spins relax faster at lower temperature due to transitions through a quantum coherent state, as represented by the off-diagonal elements of the reduced density matrix, whereas in the classical case spins are known to relax more slowly because thermal activation is suppressed. The decay of the spins is qualitatively similar regardless of the lattice size. The pathway of spin relaxation is analyzed under a sudden temperature drop condition. The Compute Unified Device Architecture (CUDA) based source code used in the present calculations is provided as Supporting Information.
Li, Wei; Ma, Le; Guo, Li-Ping; Wang, Xiao-Lei; Zhang, Jing-Wei; Bu, Zhi-Gao; Hua, Rong-Hong
2017-06-12
West Nile virus (WNV) is a neurotropic pathogen which causes zoonotic disease in humans. Recently, the number of infected cases has been increasing, and there are no clinically approved vaccines or effective drugs to treat WNV infections in humans. The purpose of this study was to facilitate vaccine and antiviral drug discovery by developing a packaging cell line-restricted WNV infectious replicon particle system. We constructed a DNA-based WNV replicon lacking the C-prM-E coding region and replaced it with a GFP coding sequence. To produce WNV replicon particles, cell lines stably expressing prM-E and C-prM-E were constructed. When the WNV replicon plasmid was co-transfected with a WNV C-expressing plasmid into the prM-E-expressing cell line, or directly transfected into the C-prM-E-expressing cell line, the replicon particle was able to replicate, form green fluorescence foci, and exhibit cytopathic plaques similar to those induced by the wild-type virus. The infectious capacity of the replicon particles was restricted to the packaging cell line, as the replicons demonstrated only one round of infection in other permissive cells. Thus, this system provides a safe and convenient reporter WNV manipulation tool which can be used to study WNV viral invasion mechanisms, neutralizing antibodies and antiviral efficacy.
Infant Mortality: Development of a Proposed Update to the Dollfus Classification of Infant Deaths
Dove, Melanie S.; Minnal, Archana; Damesyn, Mark; Curtis, Michael P.
2015-01-01
Objective Identifying infant deaths with common underlying causes and potential intervention points is critical to infant mortality surveillance and the development of prevention strategies. We constructed an International Classification of Diseases 10th Revision (ICD-10) parallel to the Dollfus cause-of-death classification scheme first published in 1990, which organized infant deaths by etiology and their amenability to prevention efforts. Methods Infant death records for 1996, dual-coded to both the ICD Ninth Revision (ICD-9) and ICD-10, were obtained from the CDC public-use multiple-cause-of-death file on comparability between ICD-9 and ICD-10. We used the underlying cause of death to group 27,821 infant deaths into the nine categories of the ICD-9-based update to Dollfus' original coding scheme, published by Sowards in 1999. Comparability ratios were computed to measure concordance between ICD versions. Results The Dollfus classification system updated with ICD-10 codes had limited agreement with the 1999 modified classification system. Although prematurity, congenital malformations, Sudden Infant Death Syndrome, and obstetric conditions were the first through fourth most common causes of infant death under both systems, most comparability ratios were significantly different from one system to the other. Conclusion The Dollfus classification system can be adapted for use with ICD-10 codes to create a comprehensive, etiology-based profile of infant deaths. The potential benefits of using Dollfus logic to guide perinatal mortality reduction strategies, particularly to maternal and child health programs and other initiatives focused on improving infant health, warrant further examination of this method's use in perinatal mortality surveillance. PMID:26556935
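A comparability ratio, as used above, is simply the number of deaths a category receives under ICD-10 divided by the number it receives under ICD-9 in the same dual-coded file. A short sketch with hypothetical counts (not the study's data):

```python
# Hypothetical dual-coded counts per cause-of-death category:
# category -> (deaths assigned under ICD-10, deaths assigned under ICD-9).
dual_coded = {
    "prematurity":              (6480, 6300),
    "congenital malformations": (6020, 6150),
    "SIDS":                     (2695, 2702),
}

# Ratios far from 1.0 signal that the two revisions classify the category
# differently, so mortality trends across the transition year are not
# directly comparable without adjustment.
ratios = {cat: n10 / n9 for cat, (n10, n9) in dual_coded.items()}
for cat, r in ratios.items():
    print(f"{cat}: {r:.3f}")
```

A ratio of 1.029 for a category, for instance, means ICD-10 assigns about 2.9% more deaths to it than ICD-9 did for the very same records.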
EAGLEView: A surface and grid generation program and its data management
NASA Technical Reports Server (NTRS)
Remotigue, M. G.; Hart, E. T.; Stokes, M. L.
1992-01-01
An old and proven grid generation code, the EAGLE grid generation package, is given the added dimension of a graphical interface and a real-time database manager. The Numerical Aerodynamic Simulation (NAS) Panel Library is used for the graphical user interface. Through the panels, EAGLEView constructs the EAGLE script command and sends it to EAGLE to be processed. After the object is created, the script is saved in a mini-buffer which can be edited and/or saved and reinterpreted. The graphical objects are set up in a linked list and can be selected or queried by pointing and clicking the mouse. The added graphical enhancement to the EAGLE system emphasizes the unique capability to construct field points around complex geometry and visualize the construction every step of the way.
Image processing for safety assessment in civil engineering.
Ferrer, Belen; Pomares, Juan C; Irles, Ramon; Espinosa, Julian; Mas, David
2013-06-20
Behavior analysis of construction safety systems is of fundamental importance to avoid accidental injuries. Traditionally, measurements of dynamic actions in civil engineering have been done through accelerometers, but high-speed cameras and image processing techniques can play an important role in this area. Here, we propose using morphological image filtering and Hough transform on high-speed video sequence as tools for dynamic measurements on that field. The presented method is applied to obtain the trajectory and acceleration of a cylindrical ballast falling from a building and trapped by a thread net. Results show that safety recommendations given in construction codes can be potentially dangerous for workers.
Error-Trellis Construction for Convolutional Codes Using Shifted Error/Syndrome-Subsequences
NASA Astrophysics Data System (ADS)
Tajima, Masato; Okino, Koji; Miyagoshi, Takashi
In this paper, we extend the conventional error-trellis construction for convolutional codes to the case where a given check matrix H(D) has a factor Dl in some column (row). In the first case, there is a possibility that the size of the state space can be reduced using shifted error-subsequences, whereas in the second case, the size of the state space can be reduced using shifted syndrome-subsequences. The construction presented in this paper is based on the adjoint-obvious realization of the corresponding syndrome former HT(D). In the case where all the columns and rows of H(D) are delay free, the proposed construction reduces to the conventional one of Schalkwijk et al. We also show that the proposed construction can equally realize the state-space reduction shown by Ariel et al. Moreover, we clarify the difference between their construction and ours using examples.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dustin, R.
Modernization and renovation of sports facilities challenge the design team to balance a number of requirements: spectator and owner expectations, existing building and site conditions, architectural layouts, code and legislation issues, time constraints and budget issues. System alternatives are evaluated and selected based on the relative priorities of these requirements. These priorities are unique to each project. At Alexander Memorial Coliseum, project schedules, construction funds and facility usage became the priorities. The ACC basketball schedule and arrival of the Centennial Olympics dictated the construction schedule. Initiation and success of the project depended on the commitment of the design team to meet coliseum funding levels established three years ago. Analysis of facility usage and system alternative capabilities drove the design team to select a system that met the project requirements and will maximize the benefits to the owner and spectators for many years to come.
An efficient transgenic system by TA cloning vectors and RNAi for C. elegans
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gengyo-Ando, Keiko; CREST, JST, 4-1-8 Hon-cho, Kawaguchi, Saitama 332-0012; Yoshina, Sawako
2006-11-03
In the nematode, transgenic analyses have been performed by microinjection of DNA from various sources into the syncytium gonad. To expedite these transgenic analyses, we solved two potential problems in this work. First, we constructed an efficient TA-cloning vector system which is useful for any promoter. By amplifying the genomic DNA fragments which contain regulatory sequences with or without the coding region, we could easily construct plasmids expressing fluorescent protein fusions without considering restriction sites. We could dissect motor neurons with three colors in a single animal. Second, we used feeding RNAi to isolate transgenic strains which express the lag-2::venus fusion gene. We found that the fusion protein is toxic when ectopically expressed in embryos but is functional to rescue a loss-of-function mutant in the lag-2 gene. Thus, the transgenic system described here should be useful to examine protein function in the nematode.
Simulation Enabled Safeguards Assessment Methodology
DOE Office of Scientific and Technical Information (OSTI.GOV)
Robert Bean; Trond Bjornard; Thomas Larson
2007-09-01
It is expected that nuclear energy will be a significant component of future supplies. New facilities, operating under a strengthened international nonproliferation regime, will be needed. There is good reason to believe virtual engineering applied to the facility design, as well as to the safeguards system design, will reduce total project cost and improve efficiency in the design cycle. Simulation Enabled Safeguards Assessment MEthodology (SESAME) has been developed as a software package to provide this capability for nuclear reprocessing facilities. The software architecture is specifically designed for distributed computing, collaborative design efforts, and modular construction to allow step improvements in functionality. Drag-and-drop wireframe construction allows the user to select the desired components from a component warehouse, render the system for 3D visualization, and, linked to a set of physics libraries and/or computational codes, conduct process evaluations of the system they have designed.
Space station integrated wall design and penetration damage control
NASA Technical Reports Server (NTRS)
Coronado, A. R.; Gibbins, M. N.; Wright, M. A.; Stern, P. H.
1987-01-01
The analysis code BUMPER executes a numerical solution to the problem of calculating the probability of no penetration (PNP) of a spacecraft subject to man-made orbital debris or meteoroid impact. The code was developed on a DEC VAX 11/780 computer running the Virtual Memory System (VMS) operating system and is written in FORTRAN 77 with no VAX extensions. To help illustrate the steps involved, a single sample analysis is performed. The example used is the space station reference configuration. The finite element model (FEM) of this configuration is relatively complex but demonstrates many BUMPER features. The computer tools and guidelines are described for constructing a FEM for the space station under consideration. The methods used to analyze the sensitivity of PNP to variations in design are described. Ways are suggested for developing contour plots of the sensitivity study data. Additional BUMPER analysis examples are provided, including FEMs, command inputs, and data outputs. The mathematical theory used as the basis for the code is described, illustrating the data flow within the analysis.
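Under the usual Poisson impact model, a PNP calculation of the kind described above takes the form PNP = exp(-Σ Nᵢ), where Nᵢ is the expected number of penetrating impacts on surface element i. The Python sketch below illustrates this calculation with hypothetical element data; it is a simplification, not the BUMPER algorithm itself.

```python
import math

# Each surface element sees an expected number of penetrating impacts
#   N_i = area * flux * exposure_time * P(penetration | impact),
# and independent elements under a Poisson model give PNP = exp(-sum N_i).
# All element data below are hypothetical illustration values.
elements = [
    # (exposed area m^2, impact flux per m^2 per yr, P(penetration | impact))
    (12.0, 4.0e-5, 0.10),
    (30.0, 2.5e-5, 0.02),
    (18.0, 6.0e-5, 0.05),
]
years = 10.0

expected_pens = sum(area * flux * years * p_pen
                    for area, flux, p_pen in elements)
pnp = math.exp(-expected_pens)
print(f"PNP over {years:g} yr: {pnp:.4f}")
```

The finite element model in a BUMPER-style analysis supplies the per-element areas and shielding-dependent penetration probabilities, which is why PNP sensitivity studies vary the design element by element.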
Barnett, Miya L.; Niec, Larissa N.; Acevedo-Polakovich, I. David
2013-01-01
This paper describes the initial evaluation of the Therapist-Parent Interaction Coding System (TPICS), a measure of in vivo therapist coaching for the evidence-based behavioral parent training intervention, parent-child interaction therapy (PCIT). Sixty-one video-recorded treatment sessions were coded with the TPICS to investigate (1) the variety of coaching techniques PCIT therapists use in the early stage of treatment, (2) whether parent skill-level guides a therapist’s coaching style and frequency, and (3) whether coaching mediates changes in parents’ skill levels from one session to the next. Results found that the TPICS captured a range of coaching techniques, and that parent skill-level prior to coaching did relate to therapists’ use of in vivo feedback. Therapists’ responsive coaching (e.g., praise to parents) was a partial mediator of change in parenting behavior from one session to the next for specific child-centered parenting skills; whereas directive coaching (e.g., modeling) did not relate to change. The TPICS demonstrates promise as a measure of coaching during PCIT with good reliability scores and initial evidence of construct validity. PMID:24839350
Eight Leadership Emergency Codes Worth Calling.
Freed, David H
Hospitals have a contemporary opportunity to change themselves before attempting to transform the larger US health care system. However, actually implementing change is much more easily described than accomplished in practice. This article calls out 8 dysfunctional behaviors that compromise professional standards at the ground level of the hospital. The construct of calling a code when one witnesses such behaviors is intended to make it safe for leaders to "See something, say something" and confront them in real time. The coordinated continuum of services that health care reform seeks to attain will not emerge until individual hospital organizations prepare themselves to operate better in their own spaces and the ones that immediately surround them.
Department of Energy Construction Safety Reference Guide
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1993-09-01
DOE has adopted the Occupational Safety and Health Administration (OSHA) regulations Title 29 Code of Federal Regulations (CFR) 1926, "Safety and Health Regulations for Construction," and related parts of 29 CFR 1910, "Occupational Safety and Health Standards." This nonmandatory reference guide is based on these OSHA regulations and, where appropriate, incorporates additional standards, codes, directives, and work practices that are recognized and accepted by DOE and the construction industry. It covers excavation, scaffolding, electricity, fire, signs/barricades, cranes/hoists/conveyors, hand and power tools, concrete/masonry, stairways/ladders, welding/cutting, motor vehicles/mechanical equipment, demolition, materials, blasting, steel erection, etc.
Topological quantum distillation.
Bombin, H; Martin-Delgado, M A
2006-11-03
We construct a class of topological quantum codes to perform quantum entanglement distillation. These codes implement the whole Clifford group of unitary operations in a fully topological manner and without selective addressing of qubits. This allows us to extend their application also to quantum teleportation, dense coding, and computation with magic states.
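The topological implementation of the Clifford group is beyond a short snippet, but the basic consistency requirement on any stabilizer code, that all generators commute, can be checked mechanically. The sketch below uses the 7-qubit Steane code, which is the smallest two-dimensional color code; the set representation and the helper names are invented for illustration and are not from the paper.

```python
from itertools import combinations

# Steane [[7,1,3]] code -- the smallest 2D color code -- as a hedged
# illustration of verifying that stabilizer generators commute.
# Supports are the rows of the [7,4] Hamming parity-check matrix.
supports = [{1, 3, 5, 7}, {2, 3, 6, 7}, {4, 5, 6, 7}]

# CSS construction: X-type and Z-type generators share the same supports.
x_gens = [("X", s) for s in supports]
z_gens = [("Z", s) for s in supports]

def commute(g, h):
    """Two Pauli-product stabilizers commute iff the number of qubits
    where one acts with X and the other with Z is even."""
    (tg, sg), (th, sh) = g, h
    if tg == th:              # X-X or Z-Z products always commute
        return True
    return len(sg & sh) % 2 == 0

assert all(commute(g, h) for g, h in combinations(x_gens + z_gens, 2))
print("All stabilizer generators commute: a valid stabilizer group.")
```

The even-overlap condition holds here because any two rows of the Hamming parity-check matrix intersect in exactly two positions, which is what makes the CSS construction work for this code.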
Evaluations of in-use emission factors from off-road construction equipment
NASA Astrophysics Data System (ADS)
Cao, Tanfeng; Durbin, Thomas D.; Russell, Robert L.; Cocker, David R.; Scora, George; Maldonado, Hector; Johnson, Kent C.
2016-12-01
Gaseous and particle emissions from construction engines contribute an important fraction of the total air pollutants released into the atmosphere and are gaining increasing regulatory attention. Robust quantification of nitrogen oxides (NOx) and particulate matter (PM) emissions is necessary to inventory the contribution of construction equipment to atmospheric loadings. These emission inventories require emission factors from construction equipment as a function of equipment type and mode of operation. While the development of portable emissions measurement systems (PEMS) has led to increased studies of construction equipment emissions, emissions data are still much more limited than for on-road vehicles. The goal of this research program was to obtain accurate in-use emissions data from a test fleet of newer construction equipment (model year 2002 or later) using a Code of Federal Regulations (CFR) compliant PEMS system. In-use emission measurements were made from twenty-seven pieces of construction equipment, which included four backhoes, six wheel loaders, four excavators, two scrapers (one with two engines), six bulldozers, and four graders. The engines ranged in model year from 2003 to 2012, in rated horsepower (hp) from 92 to 540 hp, and in hours of operation from 24 to 17,149 h. This is the largest study of off-road equipment emissions using 40 CFR Part 1065 compliant PEMS equipment for all regulated gaseous and particulate emissions.
Two-Dimensional Parson's Puzzles: The Concept, Tools, and First Observations
ERIC Educational Resources Information Center
Ihantola, Petri; Karavirta, Ville
2011-01-01
Parson's programming puzzles are a family of code construction assignments where lines of code are given, and the task is to form the solution by sorting and possibly selecting the correct code lines. We introduce a novel family of Parson's puzzles where the lines of code need to be sorted in two dimensions. The vertical dimension is used to order…
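The two-dimensional idea can be made concrete with a small sketch: the vertical dimension is line order and the horizontal dimension is indentation depth. The solution text and the grading scheme below are invented for illustration; the authors' actual tools work differently.

```python
# A hedged sketch of a two-dimensional Parson's puzzle: learners order
# code lines vertically AND choose each line's indentation level.
# Each entry is (line text, indentation level); names are illustrative.
solution = [
    ("def count_positive(xs):", 0),
    ("total = 0", 1),
    ("for x in xs:", 1),
    ("if x > 0:", 2),
    ("total += 1", 3),
    ("return total", 1),
]

def grade(candidate):
    """Return (lines in the right vertical place,
               lines right in BOTH dimensions)."""
    order_ok = sum(c[0] == s[0] for c, s in zip(candidate, solution))
    both_ok = sum(c == s for c, s in zip(candidate, solution))
    return order_ok, both_ok

# A learner who orders every line correctly but mis-indents one line:
attempt = list(solution)
attempt[3] = ("if x > 0:", 1)          # wrong indentation level
print(grade(attempt))                   # (6, 5)
```

Splitting the score this way is one simple choice of feedback; it separates ordering errors (the classic one-dimensional puzzle) from indentation errors (the new dimension).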
General Mission Analysis Tool (GMAT) Architectural Specification. Draft
NASA Technical Reports Server (NTRS)
Hughes, Steven P.; Conway, Darrel, J.
2007-01-01
Early in 2002, Goddard Space Flight Center (GSFC) began to identify requirements for the flight dynamics software needed to fly upcoming missions that use formations of spacecraft to collect data. These requirements ranged from low-level modeling features to large-scale interoperability requirements. In 2003 we began work on a system designed to meet these requirements; this system is GMAT. The General Mission Analysis Tool (GMAT) is a general purpose flight dynamics modeling tool built on open source principles. The GMAT code is written in C++, and uses modern C++ constructs extensively. GMAT can be run through either a fully functional Graphical User Interface (GUI) or as a command line program with minimal user feedback. The system is built and runs on Microsoft Windows, Linux, and Macintosh OS X platforms. The GMAT GUI is written using wxWidgets, a cross platform library of components that streamlines the development and extension of the user interface. Flight dynamics modeling is performed in GMAT by building components that represent the players in the analysis problem being modeled. These components interact through the sequential execution of instructions, embodied in the GMAT Mission Sequence. A typical Mission Sequence models the trajectories of a set of spacecraft evolving over time, calculates relevant parameters during this propagation, and maneuvers individual spacecraft to maintain a set of mission constraints as established by the mission analyst. All of the elements used in GMAT for mission analysis can be viewed in the GMAT GUI or through a custom scripting language. Analysis problems modeled in GMAT are saved as script files, and these files can be read into GMAT. When a script is read into the GMAT GUI, the corresponding user interface elements are constructed in the GMAT GUI. The GMAT system was developed from the ground up to run in a platform agnostic environment.
The source code compiles on numerous different platforms, and is regularly exercised running on Windows, Linux and Macintosh computers by the development and analysis teams working on the project. The system can be run using either a graphical user interface, written using the open source wxWidgets framework, or from a text console. The GMAT source code was written using open source tools. GSFC has released the code using the NASA open source license.
Sustainable regulation of construction.
2000-11-01
The seminar examined the role building codes and regulations can have in promoting a more sustainable approach to construction, particularly through their application to non-industrial building materials. A range of building materials such as straw, bamboo, rammed earth, adobe, and cob (a mixture of clay and chopped straw) were described and illustrated by slides to show their building potential. The current codes have as their prime concern protecting the health and safety of people from the built environment. They have been developed almost exclusively for mainstream industrial materials and methods of construction, which makes them difficult to use with alternative, indigenous, or non-industrial building materials, even though those materials may be considered more sustainable. The argument was put forward that, with only one-third of the world population living in modern industrial buildings today, it is not sustainable to re-house the remaining rapidly expanding population in high-technology dwellings. Many of the low-technology building materials and methods now used by the majority of people in the world need only incremental improvement to be equal or superior to many of their industrial replacements. Since these can be more sustainable methods of building, there needs to be an acceptance of the use of alternative materials, particularly in the developing parts of the world, where they are being rejected for less sustainable industrial methods. However, many codes make it difficult to use non-industrial materials; indeed, many industrial materials would not meet the demands now imposed on new materials if they were being introduced today. Consequently, there is a need to develop codes that facilitate the use of a wider range of materials than those in current use, and research is needed to assist this development.
Sustainable regulation should take into account the full range of real impacts that materials and systems have in areas such as resource use and depletion, toxicity of the processes that produce them, and their potential for re-use and recyclability.
Measuring homework completion in behavioral activation.
Busch, Andrew M; Uebelacker, Lisa A; Kalibatseva, Zornitsa; Miller, Ivan W
2010-07-01
The aim of this study was to develop and validate an observer-based coding system for the characterization and completion of homework assignments during Behavioral Activation (BA). Existing measures of homework completion are generally unsophisticated, and there is no current measure of homework completion designed to capture the particularities of BA. The tested scale sought to capture the type of assignment, realm of functioning targeted, extent of completion, and assignment difficulty. Homework assignments were drawn from 12 clients (mean age = 48; 83% female) in two trials of a 10-session BA manual targeting treatment-resistant depression in primary care. The two coders demonstrated acceptable or better reliability on most codes, and unreliable codes were dropped from the proposed scale. In addition, correlations between homework completion and outcome were strong, providing some support for construct validity. Ultimately, this line of research aims to develop a user-friendly, reliable measure of BA homework completion that can be completed by a therapist during session.
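Agreement between two coders on categorical codes like these is conventionally quantified with a chance-corrected statistic such as Cohen's kappa. The snippet below is a generic illustration with invented ratings; it is not the study's actual data or necessarily its exact reliability statistic.

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters assigning categorical codes:
    (observed agreement - chance agreement) / (1 - chance agreement).
    A common inter-rater reliability statistic; shown as a sketch."""
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    ca, cb = Counter(rater_a), Counter(rater_b)
    expected = sum(ca[k] * cb[k] for k in ca) / (n * n)
    return (observed - expected) / (1 - expected)

# Invented completion codes from two hypothetical coders:
a = ["complete", "partial", "complete", "none", "complete", "partial"]
b = ["complete", "partial", "partial", "none", "complete", "partial"]
print(f"kappa = {cohens_kappa(a, b):.3f}")  # kappa = 0.739
```

Kappa above roughly 0.6-0.7 is often read as acceptable agreement, which is the kind of threshold behind decisions to keep or drop codes from a scale.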
Parallel software for lattice N = 4 supersymmetric Yang-Mills theory
NASA Astrophysics Data System (ADS)
Schaich, David; DeGrand, Thomas
2015-05-01
We present new parallel software, SUSY LATTICE, for lattice studies of four-dimensional N = 4 supersymmetric Yang-Mills theory with gauge group SU(N). The lattice action is constructed to exactly preserve a single supersymmetry charge at non-zero lattice spacing, up to additional potential terms included to stabilize numerical simulations. The software evolved from the MILC code for lattice QCD, and retains a similar large-scale framework despite the different target theory. Many routines are adapted from an existing serial code (Catterall and Joseph, 2012), which SUSY LATTICE supersedes. This paper provides an overview of the new parallel software, summarizing the lattice system, describing the applications that are currently provided and explaining their basic workflow for non-experts in lattice gauge theory. We discuss the parallel performance of the code, and highlight some notable aspects of the documentation for those interested in contributing to its future development.
Structural Code Considerations for Solar Rooftop Installations.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dwyer, Stephen F.; Dwyer, Brian P.; Sanchez, Alfred
2014-12-01
Residential rooftop solar panel installations are limited in part by the high cost of structural related code requirements for field installation. Permitting solar installations is difficult because there is a belief among residential permitting authorities that typical residential rooftops may be structurally inadequate to support the additional load associated with a photovoltaic (PV) solar installation. Typical engineering methods utilized to calculate stresses on a roof structure involve simplifying assumptions that reduce a complex non-linear structure to a basic determinate beam. This method of analysis neglects the composite action of the entire roof structure, yielding a conservative analysis based on a rafter or top chord of a truss. Consequently, the analysis can result in an overly conservative structural assessment. A literature review was conducted to gain a better understanding of the conservative nature of the regulations and codes governing residential construction and the associated structural system calculations.
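The determinate-beam simplification described above reduces to textbook formulas: for a simply supported member under uniform load, the peak moment is wL²/8 and the bending stress is Mc/I. The sketch below applies these to a single rafter; all numbers are invented for illustration and are not from the report.

```python
# The 'determinate beam' simplification as a hedged sketch: a single
# rafter treated as a simply supported rectangular beam under uniform
# load, ignoring composite action of the roof. Illustrative values only.
def max_bending_stress(w, L, b, h):
    """Max bending stress (Pa) of a simply supported rectangular beam.
    w: uniform load (N/m), L: span (m), b, h: section width/depth (m)."""
    M = w * L ** 2 / 8          # maximum moment at midspan
    I = b * h ** 3 / 12         # second moment of area of rectangle
    c = h / 2                   # distance to extreme fiber
    return M * c / I

# A 38 mm x 184 mm rafter spanning 4 m, carrying 500 N/m total load:
sigma = max_bending_stress(500, 4.0, 0.038, 0.184)
print(f"max bending stress = {sigma / 1e6:.1f} MPa")  # 4.7 MPa
```

Because this treats one rafter in isolation, sheathing, blocking, and load sharing between members are all ignored, which is exactly why the report characterizes the conventional check as conservative.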
[New forms of medical profession--advertising].
Wolter, Udo
2005-04-01
Particularly in the last two years, the legislation of Part V of the German Social Code has challenged the time-honoured system of the physician's own medical practice as the panel doctor's registered office. New forms of outpatient care, for example the health centres, "Heilkunde-GmbHs", and the recently-developed medical care centres, are intended to impact on patients' ambulatory healthcare. Due to the partial employee status of practice owners, and thus the relinquishing of the independent entrepreneurial structuring of their own practices, the construct of the traditional professional code of conduct for physicians is beginning to totter. It remains to be seen whether liberalisation of the model code of conduct will provide a remedy. The principle should, however, be adhered to that advertising in the physician sector must not be unethical, if we understand this to mean not strident, not confusing, and not comparative.
Prediction of plant lncRNA by ensemble machine learning classifiers.
Simopoulos, Caitlin M A; Weretilnyk, Elizabeth A; Golding, G Brian
2018-05-02
In plants, long non-protein coding RNAs are believed to have essential roles in development and stress responses. However, relative to advances on discerning biological roles for long non-protein coding RNAs in animal systems, this RNA class in plants is largely understudied. With comparatively few validated plant long non-coding RNAs, research on this potentially critical class of RNA is hindered by a lack of appropriate prediction tools and databases. Supervised learning models trained on data sets of mostly non-validated, non-coding transcripts have been previously used to identify this enigmatic RNA class, with applications largely focused on animal systems. Our approach uses a training set composed only of empirically validated long non-protein coding RNAs from plant, animal, and viral sources to predict and rank candidate long non-protein coding gene products for future functional validation. Individual stochastic gradient boosting and random forest classifiers trained on only empirically validated long non-protein coding RNAs were constructed. In order to use the strengths of multiple classifiers, we combined multiple models into a single stacking meta-learner. This ensemble approach benefits from the diversity of several learners to effectively identify putative plant long non-coding RNAs from transcript sequence features. When the predicted genes identified by the ensemble classifier were compared to those listed in GreeNC, an established plant long non-coding RNA database, overlap for predicted genes from Arabidopsis thaliana, Oryza sativa and Eutrema salsugineum ranged from 51 to 83%, with the highest agreement in Eutrema salsugineum. Most of the highest ranking predictions from Arabidopsis thaliana were annotated as potential natural antisense genes, pseudogenes, transposable elements, or simply computationally predicted hypothetical proteins.
Due to the nature of this tool, the model can be updated as new long non-protein coding transcripts are identified and functionally verified. This ensemble classifier is an accurate tool that can be used to rank long non-protein coding RNA predictions for use in conjunction with gene expression studies. Selection of plant transcripts with a high potential for regulatory roles as long non-protein coding RNAs will advance research in the elucidation of long non-protein coding RNA function.
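The described architecture, gradient boosting and random forest base learners combined by a stacking meta-learner, can be sketched with scikit-learn. The synthetic features below stand in for the authors' transcript sequence features; this is an illustration of the technique, not their actual pipeline or hyperparameters.

```python
# Minimal sketch of a stacking ensemble over gradient boosting and
# random forest base learners, as described in the abstract.
from sklearn.datasets import make_classification
from sklearn.ensemble import (GradientBoostingClassifier,
                              RandomForestClassifier, StackingClassifier)
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Synthetic stand-in for transcript sequence features (assumption).
X, y = make_classification(n_samples=600, n_features=20, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

stack = StackingClassifier(
    estimators=[("gb", GradientBoostingClassifier(random_state=0)),
                ("rf", RandomForestClassifier(random_state=0))],
    final_estimator=LogisticRegression(),  # the stacking meta-learner
)
stack.fit(X_tr, y_tr)
print(f"held-out accuracy: {stack.score(X_te, y_te):.2f}")
```

`predict_proba` on the fitted stack yields a score per transcript, which is what allows candidate lncRNAs to be ranked for functional validation rather than just labeled.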
27 CFR 53.97 - Constructive sale price; affiliated corporations.
Code of Federal Regulations, 2010 CFR
2010-04-01
... 27 Alcohol, Tobacco Products and Firearms 2 2010-04-01 2010-04-01 false Constructive sale price... AND AMMUNITION Special Provisions Applicable to Manufacturers Taxes § 53.97 Constructive sale price... determining a constructive sale price under section 4216(b)(1)(C) of the Code for sales between corporations...
Construction Services at Northern Arizona University.
ERIC Educational Resources Information Center
Van Dyke, Gary
Construction Services is an innovative response to a chronic construction-remodeling problem at Northern Arizona State University. It is an in-house facilities maintenance department designed to address a variety of needs: prevention of construction or remodeling done by individual staff or faculty members without regard for applicable codes;…
Cai, Yong; Li, Xiwen; Wang, Runmiao; Yang, Qing; Li, Peng; Hu, Hao
2016-01-01
Currently, chemical fingerprint comparison and analysis is mainly based on professional equipment and software, which is expensive and inconvenient. This study aims to integrate QR (Quick Response) codes with quality data and mobile intelligent technology to develop a convenient query terminal for tracing quality through the whole industrial chain of TCM (traditional Chinese medicine). Three herbal medicines were randomly selected and their two-dimensional (2D) chemical barcode fingerprints were constructed. A smartphone application (APP) based on the Android system was developed to read the initial data of the 2D chemical barcodes and to compare multiple fingerprints from different batches of the same species or from different species. It was demonstrated that there were no significant differences between original and scanned TCM chemical fingerprints. Meanwhile, different TCM chemical fingerprint QR codes could be rendered in the same coordinate system, showing their differences very intuitively. To distinguish variations between chemical fingerprints more directly, a linear interpolation angle cosine similarity algorithm (LIACSA) was proposed to compute a similarity ratio. This study showed that QR codes can be used as an effective information carrier to transfer quality data. The smartphone application can rapidly read quality information in QR codes and convert the data into TCM chemical fingerprints. PMID:27780256
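The core of a "linear interpolation angle cosine similarity" comparison can be sketched in a few lines: resample two fingerprints onto a shared retention-time grid by linear interpolation, then take the cosine of the angle between them. The function name and the synthetic chromatograms below are invented; the authors' exact LIACSA formulation may differ.

```python
import numpy as np

def fingerprint_similarity(t1, y1, t2, y2, points=200):
    """Cosine similarity of two chromatographic fingerprints after
    linear interpolation onto a shared retention-time grid -- a hedged
    sketch of the LIACSA idea, not the paper's exact algorithm."""
    lo, hi = max(t1[0], t2[0]), min(t1[-1], t2[-1])
    grid = np.linspace(lo, hi, points)
    a = np.interp(grid, t1, y1)        # resample fingerprint 1
    b = np.interp(grid, t2, y2)        # resample fingerprint 2
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Two synthetic batches with slightly shifted, rescaled peaks:
t = np.linspace(0, 10, 50)
batch1 = np.exp(-(t - 4) ** 2) + 0.50 * np.exp(-(t - 7) ** 2)
batch2 = np.exp(-(t - 4.1) ** 2) + 0.45 * np.exp(-(t - 7) ** 2)
print(f"similarity = {fingerprint_similarity(t, batch1, t, batch2):.4f}")
```

A ratio near 1 indicates batches with nearly identical peak profiles; interpolating onto a common grid is what lets fingerprints sampled at different time points be compared at all.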
Bilingual Storytelling: Code Switching, Discourse Control, and Learning Opportunities.
ERIC Educational Resources Information Center
de Mejia, Anne-Marie
1998-01-01
Alternating between languages in the construction of stories offers students creative opportunities for bilingual learning. Describes how a storyteller can code switch to tell stories to children who are becoming bilingual and presents an example from early-immersion classrooms in Colombia, discussing code switching and discourse control, and…
Error control techniques for satellite and space communications
NASA Technical Reports Server (NTRS)
Costello, Daniel J., Jr.
1988-01-01
During the period December 1, 1987 through May 31, 1988, progress was made in the following areas: construction of Multi-Dimensional Bandwidth Efficient Trellis Codes with MPSK modulation; performance analysis of Bandwidth Efficient Trellis Coded Modulation schemes; and performance analysis of Bandwidth Efficient Trellis Codes on Fading Channels.
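Trellis-coded schemes are built on convolutional encoders; a minimal rate-1/2 encoder (constraint length 3, generators 7 and 5 octal) is sketched below for orientation. This is a textbook example, not one of the multi-dimensional MPSK codes the report constructs, which add a mapping of coded bits onto an expanded signal constellation.

```python
# Minimal rate-1/2 convolutional encoder, constraint length 3,
# generator polynomials 7 and 5 (octal) -- a standard textbook code.
def conv_encode(bits, g1=0b111, g2=0b101):
    state = 0
    out = []
    for b in bits:
        state = ((state << 1) | b) & 0b111   # shift in the new bit
        out.append(bin(state & g1).count("1") % 2)  # parity for g1
        out.append(bin(state & g2).count("1") % 2)  # parity for g2
    return out

print(conv_encode([1, 0, 1, 1]))  # [1, 1, 1, 0, 0, 0, 0, 1]
```

In trellis-coded modulation the two output bits per step would select a point in an MPSK constellation (set partitioning), trading bandwidth expansion for free Euclidean distance rather than free Hamming distance.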
38 CFR 39.63 - Architectural design standards.
Code of Federal Regulations, 2011 CFR
2011-07-01
... Association Life Safety Code and Errata (NFPA 101), the 2003 edition of the NFPA 5000, Building Construction... section, all applicable local and State building codes and regulations must be observed. In areas not subject to local or State building codes, the recommendations contained in the 2003 edition of the NFPA...
7 CFR 1792.103 - Seismic design and construction standards for new buildings.
Code of Federal Regulations, 2013 CFR
2013-01-01
... Seismic Regulation for New Buildings. (b) Each of the following model codes or standards provides a level...) 548-2723. Fax: (703) 295-6211. (3) 2003 International Code Council (ICC) International Building Code... buildings. 1792.103 Section 1792.103 Agriculture Regulations of the Department of Agriculture (Continued...
7 CFR 1792.103 - Seismic design and construction standards for new buildings.
Code of Federal Regulations, 2014 CFR
2014-01-01
... Seismic Regulation for New Buildings. (b) Each of the following model codes or standards provides a level...) 548-2723. Fax: (703) 295-6211. (3) 2003 International Code Council (ICC) International Building Code... buildings. 1792.103 Section 1792.103 Agriculture Regulations of the Department of Agriculture (Continued...
7 CFR 1792.103 - Seismic design and construction standards for new buildings.
Code of Federal Regulations, 2012 CFR
2012-01-01
... Seismic Regulation for New Buildings. (b) Each of the following model codes or standards provides a level...) 548-2723. Fax: (703) 295-6211. (3) 2003 International Code Council (ICC) International Building Code... buildings. 1792.103 Section 1792.103 Agriculture Regulations of the Department of Agriculture (Continued...
7 CFR 1792.103 - Seismic design and construction standards for new buildings.
Code of Federal Regulations, 2011 CFR
2011-01-01
... Seismic Regulation for New Buildings. (b) Each of the following model codes or standards provides a level...) 548-2723. Fax: (703) 295-6211. (3) 2003 International Code Council (ICC) International Building Code... buildings. 1792.103 Section 1792.103 Agriculture Regulations of the Department of Agriculture (Continued...
Graywater Use by the Army -- Is It Time Yet?
2011-05-11
[Presentation slide excerpts: graywater reuse must comply with applicable codes and regulations; where chlorine is used for disinfection, non-potable water shall contain not more than 4 mg/L of chloramines; treatment ranges from minimal treatment with underground irrigation to commercial package plants producing a filtered, disinfected product; other topics include identification (labeling and dyeing), distribution, permits to construct, and concerns such as fixture flushing and cooling.]
Tailoring a software production environment for a large project
NASA Technical Reports Server (NTRS)
Levine, D. R.
1984-01-01
A software production environment was constructed to meet the specific goals of a particular large programming project. These goals, the specific solutions as implemented, and the experiences on a project of over 100,000 lines of source code are discussed. The base development environment for this project was an ordinary PWB Unix (tm) system. Several important aspects of the development process required support not available in the existing tool set.
Air Leakage and Air Transfer Between Garage and Living Space
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rudd, Armin
2014-09-01
This research project focused on evaluation of air transfer between the garage and living space in a single-family detached home constructed by a production homebuilder in compliance with the 2009 International Residential Code and the 2009 International Energy Conservation Code. The project gathered important information about the performance of whole-building ventilation systems and garage ventilation systems as they relate to minimizing flow of contaminated air from garage to living space. A series of 25 multi-point fan pressurization tests and additional zone pressure diagnostic testing characterized the garage and house air leakage, the garage-to-house air leakage, and garage and house pressure relationships to each other and to outdoors using automated fan pressurization and pressure monitoring techniques. While the relative characteristics of this house may not represent the entire population of new construction configurations and air tightness levels (house and garage) throughout the country, the technical approach was conservative and should reasonably extend the usefulness of the results to a large spectrum of house configurations from this set of parametric tests in this one house. Based on the results of this testing, the two-step garage-to-house air leakage test protocol described above is recommended where whole-house exhaust ventilation is employed.
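Multi-point fan pressurization results like these are conventionally fit to the power law Q = C·ΔP^n, from which leakage at a reference pressure (typically 50 Pa) is reported. The two-point fit below uses invented test values, not the report's data; a real multi-point test would fit the same law by regression over many pressure stations.

```python
import math

# Fit the fan-pressurization power law Q = C * dP**n from two test
# points -- a hedged sketch with assumed values, not the report's data.
def fit_power_law(dp1, q1, dp2, q2):
    n = math.log(q2 / q1) / math.log(dp2 / dp1)  # flow exponent
    C = q1 / dp1 ** n                             # flow coefficient
    return C, n

# e.g. 600 cfm measured at 25 Pa and 950 cfm at 50 Pa:
C, n = fit_power_law(25, 600, 50, 950)
q50 = C * 50 ** n
print(f"n = {n:.2f}, flow at 50 Pa = {q50:.0f} cfm")
```

The exponent n typically falls between 0.5 (fully turbulent orifice flow) and 1.0 (laminar flow through fine cracks), so the fitted value itself says something about the character of the leakage paths.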
PipelineDog: a simple and flexible graphic pipeline construction and maintenance tool.
Zhou, Anbo; Zhang, Yeting; Sun, Yazhou; Xing, Jinchuan
2018-05-01
Analysis pipelines are an essential part of bioinformatics research, and ad hoc pipelines are frequently created by researchers for prototyping and proof-of-concept purposes. However, most existing pipeline management systems or workflow engines are too complex for rapid prototyping or for learning the pipeline concept. A lightweight, user-friendly and flexible solution is thus desirable. In this study, we developed a new pipeline construction and maintenance tool, PipelineDog. This is a web-based integrated development environment with a modern web graphical user interface. It offers cross-platform compatibility, project management capabilities, code formatting and error checking functions, and an online repository. It uses an easy-to-read/write script system that encourages code reuse. With the online repository, it also encourages sharing of pipelines, which enhances analysis reproducibility and accountability. For most users, PipelineDog requires no software installation. Overall, this web application provides a way to rapidly create and easily manage pipelines. The PipelineDog web app is freely available at http://web.pipeline.dog. The command line version is available at http://www.npmjs.com/package/pipelinedog and the online repository at http://repo.pipeline.dog. ysun@kean.edu or xing@biology.rutgers.edu or ysun@diagnoa.com. Supplementary data are available at Bioinformatics online.
Reformation of Regulatory Technical Standards for Nuclear Power Generation Equipments in Japan
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mikio Kurihara; Masahiro Aoki; Yu Maruyama
2006-07-01
Comprehensive reformation of the regulatory system has been introduced in Japan in order to apply recent technical progress in a timely manner. "The Technical Standards for Nuclear Power Generation Equipments," known as Ordinance No. 62 of the Ministry of International Trade and Industry, which is used for the detailed design, construction and operating stages of nuclear power plants, was modified to performance specifications, with consensus codes and standards used as prescriptive specifications, in order to facilitate prompt review of the Ordinance in response to technological innovation. The modification was performed by the Nuclear and Industrial Safety Agency (NISA), the regulatory body in Japan, with support from the Japan Nuclear Energy Safety Organization (JNES), a technical support organization. The revised Ordinance No. 62 was issued on July 1, 2005 and has been enforced from January 1, 2006. During the period from issuance to enforcement, JNES prepared an enforceable regulatory guide complying with each provision of Ordinance No. 62, and also made technical assessments to endorse the applicability of consensus codes and standards, in response to NISA's request. Some consensus codes and standards were re-assessed since they were already used in regulatory review of construction plans submitted by licensees. Other consensus codes and standards were newly assessed for endorsement. Where proper consensus codes or standards were not available, details of the regulatory requirements were described in the regulatory guide as immediate measures; at the same time, appropriate standards developing bodies were requested to prepare those consensus codes or standards. A supplementary note providing background information on the modification, applicable examples, etc. was prepared for the convenience of users of Ordinance No. 62.
This paper describes the modification activities and their results, following NISA's presentation at ICONE-13, which introduced the framework of the performance specifications and the modification process of Ordinance No. 62. (authors)
Development of a web service for analysis in a distributed network.
Jiang, Xiaoqian; Wu, Yuan; Marsolo, Keith; Ohno-Machado, Lucila
2014-01-01
We describe functional specifications and practicalities in the software development process for a web service that allows the construction of the multivariate logistic regression model, Grid Logistic Regression (GLORE), by aggregating partial estimates from distributed sites, with no exchange of patient-level data. We recently developed and published a web service for model construction and data analysis in a distributed environment. This recent paper provided an overview of the system that is useful for users, but included very few details that are relevant for biomedical informatics developers or network security personnel who may be interested in implementing this or similar systems. We focus here on how the system was conceived and implemented. We followed a two-stage development approach by first implementing the backbone system and incrementally improving the user experience through interactions with potential users during the development. Our system went through various stages such as proof of concept, algorithm validation, user interface development, and system testing. We used the Zoho Project management system to track tasks and milestones. We leveraged Google Code and Apache Subversion to share code among team members, and developed an applet-servlet architecture to support cross-platform deployment. During the development process, we encountered challenges such as Information Technology (IT) infrastructure gaps and limited team experience in user-interface design. We identified solutions as well as enabling factors to support the translation of an innovative privacy-preserving, distributed modeling technology into a working prototype. Using GLORE (a distributed model that we developed earlier) as a pilot example, we demonstrated the feasibility of building and integrating distributed modeling technology into a usable framework that can support privacy-preserving, distributed data analysis among researchers at geographically dispersed institutes.
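The aggregation idea behind GLORE can be sketched in a few lines: each site computes the gradient and Hessian of the logistic log-likelihood on its own data, and a coordinator sums these summary statistics for a Newton-Raphson update, so patient-level rows never leave a site. The code below is a toy illustration of that scheme with synthetic data; it is not the published implementation, and all names are invented.

```python
import numpy as np

rng = np.random.default_rng(0)

def site_terms(X, y, beta):
    """Each site returns only its gradient and Hessian of the logistic
    log-likelihood -- aggregate statistics, not patient-level data."""
    p = 1.0 / (1.0 + np.exp(-X @ beta))
    grad = X.T @ (y - p)
    hess = -(X.T * (p * (1 - p))) @ X
    return grad, hess

# Two sites holding private data drawn from the same true model:
true_beta = np.array([0.5, -1.0, 2.0])
sites = []
for _ in range(2):
    X = np.c_[np.ones(500), rng.normal(size=(500, 2))]
    y = (rng.random(500) < 1 / (1 + np.exp(-X @ true_beta))).astype(float)
    sites.append((X, y))

# Coordinator: sum the site summaries, take a Newton-Raphson step.
beta = np.zeros(3)
for _ in range(25):
    terms = [site_terms(X, y, beta) for X, y in sites]
    grad = sum(g for g, _ in terms)
    hess = sum(h for _, h in terms)
    beta = beta - np.linalg.solve(hess, grad)

print(np.round(beta, 2))  # close to [0.5, -1.0, 2.0]
```

Because the summed gradient and Hessian equal those of the pooled data, the fitted coefficients match a centralized logistic regression exactly, which is the key property that makes the distributed model statistically equivalent to the conventional one.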
Development of a Web Service for Analysis in a Distributed Network
Jiang, Xiaoqian; Wu, Yuan; Marsolo, Keith; Ohno-Machado, Lucila
2014-01-01
Objective: We describe functional specifications and practicalities in the software development process for a web service that allows the construction of the multivariate logistic regression model, Grid Logistic Regression (GLORE), by aggregating partial estimates from distributed sites, with no exchange of patient-level data. Background: We recently developed and published a web service for model construction and data analysis in a distributed environment. This recent paper provided an overview of the system that is useful for users, but included very few details that are relevant for biomedical informatics developers or network security personnel who may be interested in implementing this or similar systems. We focus here on how the system was conceived and implemented. Methods: We followed a two-stage development approach by first implementing the backbone system and incrementally improving the user experience through interactions with potential users during the development. Our system went through various stages such as proof of concept, algorithm validation, user interface development, and system testing. We used the Zoho Project management system to track tasks and milestones. We leveraged Google Code and Apache Subversion to share code among team members, and developed an applet-servlet architecture to support cross-platform deployment. Discussion: During the development process, we encountered challenges such as Information Technology (IT) infrastructure gaps and limited team experience in user-interface design. We identified solutions as well as enabling factors to support the translation of an innovative privacy-preserving, distributed modeling technology into a working prototype.
Conclusion: Using GLORE (a distributed model that we developed earlier) as a pilot example, we demonstrated the feasibility of building and integrating distributed modeling technology into a usable framework that can support privacy-preserving, distributed data analysis among researchers at geographically dispersed institutes. PMID:25848586
A beamline systems model for Accelerator-Driven Transmutation Technology (ADTT) facilities
DOE Office of Scientific and Technical Information (OSTI.GOV)
Todd, A.M.M.; Paulson, C.C.; Peacock, M.A.
1995-10-01
A beamline systems code that is being developed for Accelerator-Driven Transmutation Technology (ADTT) facility trade studies is described. The overall program is a joint Grumman, G. H. Gillespie Associates (GHGA) and Los Alamos National Laboratory effort. The GHGA Accelerator Systems Model (ASM) has been adopted as the framework on which this effort is based. Relevant accelerator and beam transport models from earlier Grumman systems codes are being adapted to this framework. Preliminary physics and engineering models for each ADTT beamline component have been constructed. Examples noted include a Bridge Coupled Drift Tube Linac (BCDTL) and the accelerator thermal system. A decision has been made to confine the ASM framework principally to beamline modeling, while detailed target/blanket, balance-of-plant and facility costing analysis will be performed externally. An interfacing external balance-of-plant and facility costing model, which will permit the performance of iterative facility trade studies, is under separate development. An ABC (Accelerator Based Conversion) example is used to highlight the present models and capabilities.
The Italian experience on T/H best estimate codes: Achievements and perspectives
DOE Office of Scientific and Technical Information (OSTI.GOV)
Alemberti, A.; D`Auria, F.; Fiorino, E.
1997-07-01
Thermalhydraulic system codes are complex tools developed to simulate power plant behavior during off-normal conditions. Among the objectives of the code calculations, the evaluation of safety margins, operator training, and the optimization of the plant design and of the emergency operating procedures are mostly considered in the field of nuclear safety. The first generation of codes was developed in the United States at the end of the 1960s. Since that time, different research groups all over the world have started the development of their own codes. At the beginning of the 1980s, the second-generation codes were proposed; these differ from the first-generation codes owing to the number of balance equations solved (six instead of three) and the sophistication of the constitutive models and of the adopted numerics. The capabilities of available computers have been fully exploited over the years. The authors then summarize some of the major steps in the process of developing, modifying, and advancing the capabilities of the codes. They touch on the fact that Italian, and for that matter non-American, researchers have not been intimately involved in much of this work. They then describe the application of these codes in Italy, even though there are no nuclear power plants operating or under construction there at this time. Much of this effort is directed at the general question of plant safety in the face of transient-type events.
Fault trees for decision making in systems analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lambert, Howard E.
1975-10-09
The application of fault tree analysis (FTA) to system safety and reliability is presented within the framework of system safety analysis. The concepts and techniques involved in manual and automated fault tree construction are described and their differences noted. The theory of mathematical reliability pertinent to FTA is presented with emphasis on engineering applications. An outline of the quantitative reliability techniques of the Reactor Safety Study is given. Concepts of probabilistic importance are presented within the fault tree framework and applied to the areas of system design, diagnosis and simulation. The computer code IMPORTANCE ranks basic events and cut sets according to a sensitivity analysis. A useful feature of the IMPORTANCE code is that it can accept relative failure data as input. The output of the IMPORTANCE code can assist an analyst in finding weaknesses in system design and operation, suggest the optimal course of system upgrade, and determine the optimal location of sensors within a system. A general simulation model of system failure in terms of fault tree logic is described. The model is intended for efficient diagnosis of the causes of system failure in the event of a system breakdown. It can also be used to assist an operator in making decisions under a time constraint regarding the future course of operations. The model is well suited for computer implementation. New results incorporated in the simulation model include an algorithm to generate repair checklists on the basis of fault tree logic and a one-step-ahead optimization procedure that minimizes the expected time to diagnose system failure.
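The cut-set importance ranking described above can be illustrated with a short sketch. This is not the IMPORTANCE code itself: the event names, failure probabilities, the rare-event approximation for the top-event probability, and the choice of Fussell-Vesely importance are illustrative assumptions.

```python
# Illustrative cut-set importance ranking (hypothetical events and probabilities).

def top_event_prob(cut_sets, p):
    """Rare-event approximation: sum over minimal cut sets of the product
    of their basic-event probabilities."""
    total = 0.0
    for cs in cut_sets:
        prod = 1.0
        for event in cs:
            prod *= p[event]
        total += prod
    return total

def fussell_vesely(cut_sets, p):
    """Fussell-Vesely importance: fraction of top-event probability
    contributed by cut sets containing each basic event."""
    q_top = top_event_prob(cut_sets, p)
    return {e: top_event_prob([cs for cs in cut_sets if e in cs], p) / q_top
            for e in p}

# Two minimal cut sets: {pump_fails, valve_stuck} and {power_loss}
cut_sets = [{"pump_fails", "valve_stuck"}, {"power_loss"}]
p = {"pump_fails": 1e-2, "valve_stuck": 1e-3, "power_loss": 1e-5}
ranking = sorted(fussell_vesely(cut_sets, p).items(), key=lambda kv: -kv[1])
```

The ranking highlights which basic events contribute most to system failure, mirroring how the IMPORTANCE output points an analyst at design weaknesses.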
Multimedia techniques for construction education and training : final report.
DOT National Transportation Integrated Search
2017-02-01
The current profession of civil engineering often focuses education and training on code compliance rather than constructability and construction techniques. Also, it is well accepted that it takes a decade or more for engineers to develop a high-lev...
NASA Astrophysics Data System (ADS)
Kirvelis, Dobilas; Beitas, Kastytis
2008-10-01
The aim of this work is to show that the essence of life and living systems is their organization as bioinformational technology based on informational anticipatory control. Principal paradigmatic and structural schemes of the functional organization of life (organisms and their systems) are constructed on the basis of systemic analysis and synthesis of the main phenomenological features of the living world. Life is based on functional elements that implement engineering procedures of closed-loop coding-decoding control (CL-CDC). The phenomenon of natural bioinformational control appeared and developed on the Earth 3-4 billion years ago, when life originated as a result of chemical and later biological evolution. The informatics paradigm considers the physical and chemical transformations of energy and matter in organized systems as flows that are controlled, and signals as the means for purposive informational control programs. Social and technical technological systems as informational control systems are a later phenomenon, engineered by humans. Information emerges in organized systems as a necessary component of control technology. Generalized schemes of functional organization at the levels of the cell, the organism and the brain neocortex, as the highest biosystem with CL-CDC, are presented. The CL-CDC concept expands the understanding of bioinformatics.
2006-10-01
The objective was to construct a bridge between existing and future microscopic simulation codes (kinetic Monte Carlo, kMC; molecular dynamics, MD; equilibrium Monte Carlo, MC; Brownian Dynamics, BD; Lattice-Boltzmann, LB; or general agent-based, AB) and traditional continuum simulators.
Analysis for Building Envelopes and Mechanical Systems Using 2012 CBECS Data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Winiarski, David W.; Halverson, Mark A.; Butzbaugh, Joshua B.
This report describes the aggregation and mapping of certain building characteristics data available in the most recent Commercial Building Energy Consumption Survey (CBECS) (DOE EIA 2012) to characterize the most typical building construction practices. It provides summary data for potential use in support of modifications to the Pacific Northwest National Laboratory's commercial building prototypes used for building energy code analysis. Specifically, it outlines findings and the most typical design choices for certain building envelope and heating, ventilating, and air-conditioning (HVAC) systems.
Research on performance of three-layer MG-OXC system based on MLAG and OCDM
NASA Astrophysics Data System (ADS)
Wang, Yubao; Ren, Yanfei; Meng, Ying; Bai, Jian
2017-10-01
At present, as the traffic volume conveyed by optical transport networks and the variety of traffic grooming methods increase rapidly, optical switching techniques face a series of issues, such as growing demand for wavelengths and complicated structure management and implementation. This work introduces optical code switching based on wavelength switching, constructs a three-layer multi-granularity optical cross connect (MG-OXC) system on the basis of optical code division multiplexing (OCDM), and presents a new traffic grooming algorithm. The proposed architecture can improve the flexibility of traffic grooming, reduce the number of wavelengths used and save consumed ports; hence, it can simplify the routing devices and significantly enhance the performance of the system. Through analysis of the network model of the switching structure on a multicast layered auxiliary graph (MLAG) and the establishment of traffic grooming links, together with simulations of blocking probability and throughput, this paper shows the excellent performance of the proposed architecture.
Kinetic turbulence simulations at extreme scale on leadership-class systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wang, Bei; Ethier, Stephane; Tang, William
2013-01-01
Reliable predictive simulation capability addressing confinement properties in magnetically confined fusion plasmas is critically important for ITER, a 20-billion-dollar international burning plasma device under construction in France. The complex study of kinetic turbulence, which can severely limit the energy confinement and impact the economic viability of fusion systems, requires simulations at extreme scale for such an unprecedented device size. Our newly optimized, global, ab initio particle-in-cell code solving the nonlinear equations underlying gyrokinetic theory achieves excellent performance with respect to "time to solution" at the full capacity of the IBM Blue Gene/Q, on the 786,432 cores of Mira at ALCF and recently on the 1,572,864 cores of Sequoia at LLNL. Recent multithreading and domain decomposition optimizations in the new GTC-P code represent critically important software advances for modern, low-memory-per-core systems by enabling routine simulations at unprecedented size (130 million grid points at ITER scale) and resolution (65 billion particles).
ENEL overall PWR plant models and neutronic integrated computing systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pedroni, G.; Pollachini, L.; Vimercati, G.
1987-01-01
To support the design activity of the Italian nuclear energy program for the construction of pressurized water reactors, the Italian Electricity Board (ENEL) needs to verify the design as a whole (that is, the nuclear steam supply system and balance of plant) both in steady-state operation and in transient. The ENEL has therefore developed two computer models to analyze both operational and incidental transients. The models, named STRIP and SFINCS, perform the analysis of the nuclear as well as the conventional part of the plant (the control system being properly taken into account). The STRIP model has been developed by means of the French (Electricité de France) modular code SICLE, while SFINCS is based on the Italian (ENEL) modular code LEGO. STRIP validation was performed against experimental data from the French Fessenheim power plant. Two significant transients were chosen: load step and total load rejection. SFINCS validation was performed against experimental data from the French Saint-Laurent power plant and also by comparing the SFINCS and STRIP responses.
[Representation of knowledge in respiratory medicine: ontology should help the coding process].
Blanc, F-X; Baneyx, A; Charlet, J; Housset, B
2010-09-01
Access to medical knowledge is a major issue for health professionals and requires the development of terminologies. The objective of the reported work was to construct an ontology of respiratory medicine, i.e. an organized and formalized terminology composed of domain-specific knowledge. The purpose is to support the medico-economic coding process and to represent the relevant knowledge about the patient. Our research covers the whole life cycle of an ontology, from the development of a methodology, to building the ontology from texts, to its use in an operational system. A computerized tool, based on the ontology, supports both medico-economic coding and graphical medical coding; the latter will be used to index hospital reports. Our ontology counts 1913 concepts and contains all the knowledge included in the PMSI part of the SPLF thesaurus. Our tool has been evaluated and showed a recall of 80% and an accuracy of 85% for the medico-economic coding. The work presented in this paper justifies the approach that has been used. It must be continued on a large scale to validate our coding principles and the possibility of making queries on patient reports for clinical research. Copyright © 2010. Published by Elsevier Masson SAS.
Towards self-correcting quantum memories
NASA Astrophysics Data System (ADS)
Michnicki, Kamil
This thesis presents a model of self-correcting quantum memories where quantum states are encoded using topological stabilizer codes and error correction is done using local measurements and local dynamics. Quantum noise poses a practical barrier to developing quantum memories. This thesis explores two types of models for suppressing noise. One model suppresses thermalizing noise energetically by engineering a Hamiltonian with a high energy barrier between code states. Thermalizing dynamics are modeled phenomenologically as a Markovian quantum master equation with only local generators. The second model suppresses stochastic noise with a cellular automaton that performs error correction using syndrome measurements and a local update rule. Several ways of visualizing and thinking about stabilizer codes are presented in order to design ones that have a high energy barrier: the non-local Ising model, the quasi-particle graph and the theory of welded stabilizer codes. I develop the theory of welded stabilizer codes and use it to construct a code with the highest known energy barrier in 3-d for spin Hamiltonians: the welded solid code. Although the welded solid code is not fully self-correcting, it has some self-correcting properties: its memory lifetime increases with system size up to a temperature-dependent maximum. One strategy for increasing the energy barrier is to mediate an interaction with an external system. I prove a no-go theorem for a class of Hamiltonians where the interaction terms are local, of bounded strength, and commute with the stabilizer group: under these conditions the energy barrier can only be increased by a multiplicative constant. I develop a cellular automaton to perform error correction on a state encoded using the toric code. The numerical evidence indicates that while there is no threshold, the model can extend the memory lifetime significantly.
While of less theoretical importance, this could be practical for real implementations of quantum memories. Numerical evidence also suggests that the cellular automaton could function as a decoder with a soft threshold.
A new brain-computer interface design using fuzzy ARTMAP.
Palaniappan, Ramaswamy; Paramesran, Raveendran; Nishida, Shogo; Saiwaki, Naoki
2002-09-01
This paper proposes a new brain-computer interface (BCI) design using the fuzzy ARTMAP (FA) neural network, as well as an application of the design. The objective of this BCI-FA design is to classify the best three of the five available mental tasks for each subject using power spectral density (PSD) values of electroencephalogram (EEG) signals. These PSD values are extracted using the Wiener-Khinchine and autoregressive methods. Ten experiments employing different triplets of mental tasks are studied for each subject. The findings show that the average BCI-FA outputs for four subjects gave less than 6% error using the best triplets of mental tasks identified from the classification performances of FA. This implies that the BCI-FA can be successfully used with a tri-state switching device. As an application, a proposed tri-state Morse code scheme could be utilized to translate the outputs of this BCI-FA design into English letters. In this scheme, the three BCI-FA outputs correspond to a dot and a dash, the two basic Morse code symbols, and a space to denote the end (or beginning) of a dot or a dash. The construction of English letters using this tri-state Morse code scheme is determined only by the sequence of mental tasks and is independent of the time duration of each mental task. This is especially useful for constructing letters that are represented as multiple dots or dashes. This combination of the BCI-FA design and the tri-state Morse code scheme could be developed as a communication system for paralyzed patients.
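The tri-state scheme can be sketched as a small decoder. This is a hypothetical reconstruction, not the authors' implementation: it assumes the space output terminates each dot/dash sequence, and the `decode` helper and its symbol representation are invented for illustration.

```python
# Hypothetical tri-state Morse decoder: '.', '-' and ' ' (space = letter boundary).

MORSE = {".-": "A", "-...": "B", "-.-.": "C", "-..": "D", ".": "E",
         "..-.": "F", "--.": "G", "....": "H", "..": "I", ".---": "J",
         "-.-": "K", ".-..": "L", "--": "M", "-.": "N", "---": "O",
         ".--.": "P", "--.-": "Q", ".-.": "R", "...": "S", "-": "T",
         "..-": "U", "...-": "V", ".--": "W", "-..-": "X", "-.--": "Y",
         "--..": "Z"}

def decode(symbols):
    """Translate a stream of tri-state outputs into letters."""
    letters, current = [], ""
    for s in symbols:
        if s == " ":                       # space output: letter is complete
            if current:
                letters.append(MORSE.get(current, "?"))
                current = ""
        else:                              # dot or dash output: extend the letter
            current += s
    if current:                            # flush a trailing letter
        letters.append(MORSE.get(current, "?"))
    return "".join(letters)

message = decode("... --- ...")  # -> "SOS"
```

The space symbol is what makes runs of identical symbols unambiguous (e.g. "H" = four dots), matching the abstract's point that letters depend only on the task sequence, not on timing.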
27 CFR 53.95 - Constructive sale price; basic rules.
Code of Federal Regulations, 2010 CFR
2010-04-01
... 27 Alcohol, Tobacco Products and Firearms 2 2010-04-01 2010-04-01 false Constructive sale price... AMMUNITION Special Provisions Applicable to Manufacturers Taxes § 53.95 Constructive sale price; basic rules... to construct a sale price on which to compute a tax imposed under chapter 32 of the Code on the price...
Automatic programming of simulation models
NASA Technical Reports Server (NTRS)
Schroer, Bernard J.; Tseng, Fan T.; Zhang, Shou X.; Dwan, Wen S.
1988-01-01
The objective of automatic programming is to improve the overall environment for describing the program. This improved environment is realized by a reduction in the amount of detail that the programmer needs to know and is exposed to. Furthermore, this improved environment is achieved by a specification language that is more natural to the user's problem domain and to the user's way of thinking about and looking at the problem. The goal of this research is to apply the concepts of automatic programming (AP) to modeling discrete event simulation systems. Specific emphasis is on the design and development of simulation tools that assist the modeler in defining or constructing a model of the system and then automatically write the corresponding simulation code in the target simulation language, GPSS/PC. A related goal is to evaluate the feasibility of various languages for constructing automatic programming simulation tools.
ERIC Educational Resources Information Center
Moradi, Hamzeh
2014-01-01
Depending on the demands of a particular communicative situation, bilingual or multilingual speakers ("bilingualism-multilingualism") will switch between language varieties. Code-switching is the practice of moving between variations of languages in different contexts. In an educational context, code-switching is defined as the practice…
Zhou, Hai; He, Ming; Li, Jing; Chen, Liang; Huang, Zhifeng; Zheng, Shaoyan; Zhu, Liya; Ni, Erdong; Jiang, Dagang; Zhao, Bingran; Zhuang, Chuxiong
2016-11-22
Hybrid rice breeding offers an important strategy to improve rice production, in which the cultivation of a male sterile line is the key to the success of cross-breeding. CRISPR/Cas9 systems have been widely used in target-site genome editing, whereas their application to crop genetic improvement has rarely been reported. Here, using the CRISPR/Cas9 system, we induced specific mutations in TMS5, which is the most widely applied thermo-sensitive genic male sterility (TGMS) gene in China, and developed new "transgene clean" TGMS lines. We designed 10 target sites in the coding region of TMS5 for targeted mutagenesis using the CRISPR/Cas9 system and assessed the potential rates of on- and off-target effects. Finally, we established the most efficient construct, the TMS5ab construct, for breeding potentially applicable "transgene clean" TGMS lines. We also discuss factors that affect the editing efficiency according to the characteristics of different target sequences. Notably, using the TMS5ab construct, we developed 11 new "transgene clean" TGMS lines with potential applications in hybrid breeding within only one year in both rice subspecies. The application of our system not only significantly accelerates the breeding of sterile lines but also facilitates the exploitation of heterosis.
Peripheral infrastructure vectors and an extended set of plant parts for the Modular Cloning system
Kretschmer, Carola; Gruetzner, Ramona; Löfke, Christian; Dagdas, Yasin; Bürstenbinder, Katharina; Marillonnet, Sylvestre
2018-01-01
Standardized DNA assembly strategies facilitate the generation of multigene constructs from collections of building blocks in plant synthetic biology. A common syntax for hierarchical DNA assembly following the Golden Gate principle employing Type IIs restriction endonucleases was recently developed, and underlies the Modular Cloning and GoldenBraid systems. In these systems, transcriptional units and/or multigene constructs are assembled from libraries of standardized building blocks, also referred to as phytobricks, in several hierarchical levels and by iterative Golden Gate reactions. Here, a toolkit containing further modules for the novel DNA assembly standards was developed. Intended for use with Modular Cloning, most modules are also compatible with GoldenBraid. Firstly, a collection of approximately 80 additional phytobricks is provided, comprising e.g. modules for inducible expression systems, promoters or epitope tags. Furthermore, DNA modules were developed for connecting Modular Cloning and Gateway cloning, either for toggling between systems or for standardized Gateway destination vector assembly. Finally, first instances of a “peripheral infrastructure” around Modular Cloning are presented: While available toolkits are designed for the assembly of plant transformation constructs, vectors were created to also use coding sequence-containing phytobricks directly in yeast two hybrid interaction or bacterial infection assays. The presented material will further enhance versatility of hierarchical DNA assembly strategies. PMID:29847550
Polar codes for achieving the classical capacity of a quantum channel
NASA Astrophysics Data System (ADS)
Guha, Saikat; Wilde, Mark
2012-02-01
We construct the first near-explicit, linear polar codes that achieve the capacity for classical communication over quantum channels. The codes exploit the channel polarization phenomenon observed by Arikan for classical channels. Channel polarization is an effect in which one can synthesize a set of channels, by "channel combining" and "channel splitting," such that a fraction of the synthesized channels is perfect for data transmission while the other fraction is completely useless for data transmission, with the good fraction equal to the capacity of the channel. Our main technical contributions are threefold. First, we demonstrate that the channel polarization effect occurs for channels with classical inputs and quantum outputs. We then construct linear polar codes based on this effect, and the encoding complexity is O(N log N), where N is the blocklength of the code. We also demonstrate that a quantum successive cancellation decoder works well, i.e., the word error rate decays exponentially with the blocklength of the code. For a quantum channel with binary pure-state outputs, such as a binary-phase-shift-keyed coherent-state optical communication alphabet, the symmetric Holevo information rate is in fact the ultimate channel capacity, which is achieved by our polar code.
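The polarization effect the codes build on can be sketched for the classical binary erasure channel, where the combining/splitting recursion has a closed form: one step turns erasure probability e into a worse channel 2e - e² and a better one e². This is an illustrative classical sketch, not the quantum construction of the abstract; the function name and the 0.01 "good channel" threshold are invented.

```python
# Classical channel polarization sketch for a binary erasure channel (BEC).

def polarize(e, levels):
    """Return the 2**levels synthesized erasure probabilities after
    `levels` rounds of channel combining and splitting."""
    chans = [e]
    for _ in range(levels):
        nxt = []
        for c in chans:
            nxt.append(2 * c - c * c)  # "bad" synthesized channel
            nxt.append(c * c)          # "good" synthesized channel
        chans = nxt
    return chans

chans = polarize(0.5, 10)                       # 1024 synthesized channels
good = sum(1 for c in chans if c < 0.01)        # nearly-perfect channels
# As the blocklength grows, the fraction of nearly-perfect channels
# approaches the BEC capacity 1 - e = 0.5.
```

Note that each step preserves the average erasure probability, so the good channels improve exactly at the expense of the bad ones; the quantum case replaces this scalar recursion with synthesized classical-quantum channels.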
NASA Astrophysics Data System (ADS)
Gigante-Barrera, Ángel; Dindar, Serdar; Kaewunruen, Sakdirat; Ruikar, Darshan
2017-10-01
Railway turnouts are complex systems designed using complex geometries and grades, which makes them difficult to manage in terms of risk prevention. This feature poses a substantial peril to rail users, as it is considered a cause of derailment. In addition, derailment leads to financial losses due to operational downtime and monetary compensation in cases of death or injury. These are fundamental drivers for mitigating risks arising from poor risk management during design. Prevention through Design (PtD) is a process that introduces tacit knowledge from industry professionals during the design process. There is evidence that Building Information Modelling (BIM) can help to mitigate risk from the inception of the project. BIM is considered an Information System (IS) where tacit knowledge can be stored and retrieved from a digital database, making it easy to take decisions promptly as information is ready to be analysed. BIM at the model element level entails working with 3D elements and embedded data, therefore adding a layer of complexity to the management of information along the different stages of the project and across different disciplines. To overcome this problem, the industry has created a framework for model progression specification named Level of Development (LOD). The paper presents an IDM-based framework for design risk mitigation through code validation using the LOD. This effort resulted in risk datasets which describe a rail turnout graphically and non-graphically as the model progresses, thus permitting their inclusion within risk information systems. The assignment of an LOD construct to a set of data requires specialised management and process-related expertise. Furthermore, the selection of a set of LOD constructs requires a purpose-based analysis. Therefore, a framework for LOD construct implementation within the IDM for code checking is required for the industry to progress in this particular field.
An Overview of Starfish: A Table-Centric Tool for Interactive Synthesis
NASA Technical Reports Server (NTRS)
Tsow, Alex
2008-01-01
Engineering is an interactive process that requires intelligent interaction at many levels. My thesis [1] advances an engineering discipline for high-level synthesis and architectural decomposition that integrates perspicuous representation, designer interaction, and mathematical rigor. Starfish, the software prototype for the design method, implements a table-centric transformation system for reorganizing control-dominated system expressions into high-level architectures. Based on the digital design derivation (DDD) system, a designer-guided synthesis technique that applies correctness-preserving transformations to synchronous data-flow specifications expressed as co-recursive stream equations, Starfish enhances user interaction and extends the reachable design space by incorporating four innovations: behavior tables, serialization tables, data refinement, and operator retiming. Behavior tables express systems of co-recursive stream equations as a table of guarded signal updates. Developers and users of the DDD system used manually constructed behavior tables to help them decide which transformations to apply and how to specify them. These design exercises produced several formally constructed hardware implementations: the FM9001 microprocessor, an SECD machine for evaluating LISP, and the SchemEngine, a garbage-collected machine for interpreting a byte-code representation of compiled Scheme programs. Bose and Tuna, two of DDD's developers, have subsequently commercialized the design derivation methodology at Derivation Systems, Inc. (DSI). DSI has formally derived and validated PCI bus interfaces and a Java byte-code processor; they further executed a contract to prototype SPIDER, NASA's ultra-reliable communications bus. To date, most derivations from DDD and DRS have targeted hardware due to its synchronous design paradigm.
However, Starfish expressions are independent of the synchronization mechanism; there is no commitment to hardware or globally broadcast clocks. Though software back-ends for design derivation are limited to the DDD stream-interpreter, targeting synchronous or real-time software is not substantively different from targeting hardware.
Design of efficient and simple interface testing equipment for opto-electric tracking system
NASA Astrophysics Data System (ADS)
Liu, Qiong; Deng, Chao; Tian, Jing; Mao, Yao
2016-10-01
Interface testing for an opto-electric tracking system is important work to assure system running performance; its aim is to verify, at different levels, whether the design of every electronic interface matches the communication protocols. Opto-electric tracking systems nowadays are complicated, composed of many functional units. Usually, interface testing is executed between completely manufactured units, depending heavily on unit design and manufacturing progress as well as on the people involved; as a result, it often takes days or weeks and is inefficient. To solve this problem, this paper proposes efficient and simple interface testing equipment for opto-electric tracking systems, consisting of optional interface circuit cards, a processor and a test program. The hardware cards provide matched hardware interfaces that are easily supplied by a hardware engineer. Automatic code generation is used to adapt to new communication protocols: automatic item acquisition, automatic construction of the code architecture and automatic encoding quickly form a new, adapted program. After a few simple steps, standard customized interface testing equipment with a matching test program and interfaces is ready for a system awaiting test within minutes. This equipment has been used to test all or part of the interfaces of many opto-electric tracking systems, reducing test time from days to hours and greatly improving test efficiency, with high software quality and stability and without manual coding. Used as a common tool, the interface testing equipment proposed in this paper has changed the traditional interface testing method and achieved much higher efficiency.
Entanglement-assisted quantum convolutional coding
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wilde, Mark M.; Brun, Todd A.
2010-04-15
We show how to protect a stream of quantum information from decoherence induced by a noisy quantum communication channel. We exploit preshared entanglement and a convolutional coding structure to develop a theory of entanglement-assisted quantum convolutional coding. Our construction produces a Calderbank-Shor-Steane (CSS) entanglement-assisted quantum convolutional code from two arbitrary classical binary convolutional codes. The rate and error-correcting properties of the classical convolutional codes directly determine the corresponding properties of the resulting entanglement-assisted quantum convolutional code. We explain how to encode our CSS entanglement-assisted quantum convolutional codes starting from a stream of information qubits, ancilla qubits, and shared entangled bits.
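The construction consumes two arbitrary classical binary convolutional codes. As a purely illustrative sketch (not the paper's encoder), here is a rate-1/2 feedforward convolutional encoder of the kind that could serve as one of the two classical ingredients, with the common (7, 5) octal generators assumed for concreteness:

```python
# Minimal rate-1/2 binary convolutional encoder (generators 7, 5 octal),
# the kind of classical code that feeds the CSS entanglement-assisted
# construction.  Illustrative sketch only; not the paper's encoder.

def conv_encode(bits, gens=(0b111, 0b101)):
    """Feedforward encoding: each input bit emits one output bit per generator."""
    K = max(g.bit_length() for g in gens)   # constraint length
    state = 0
    out = []
    for b in bits:
        state = ((state << 1) | b) & ((1 << K) - 1)
        for g in gens:
            out.append(bin(state & g).count("1") % 2)
    return out

print(conv_encode([1, 0, 1, 1]))   # -> [1, 1, 1, 0, 0, 0, 0, 1]
```

The rate and error-correcting properties of the two classical codes chosen at this stage carry over directly to the resulting quantum convolutional code.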
Quantum Kronecker sum-product low-density parity-check codes with finite rate
NASA Astrophysics Data System (ADS)
Kovalev, Alexey A.; Pryadko, Leonid P.
2013-07-01
We introduce an ansatz for quantum codes which gives the hypergraph-product (generalized toric) codes by Tillich and Zémor and generalized bicycle codes by MacKay as limiting cases. The construction allows for both the lower and the upper bounds on the minimum distance; they scale as a square root of the block length. Many thus defined codes have a finite rate and limited-weight stabilizer generators, an analog of classical low-density parity-check (LDPC) codes. Compared to the hypergraph-product codes, hyperbicycle codes generally have a wider range of parameters; in particular, they can have a higher rate while preserving the estimated error threshold.
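The hypergraph-product limiting case mentioned here is easy to reproduce numerically. The sketch below (using NumPy, with the length-3 repetition code standing in for both classical inputs) builds the two CSS stabilizer matrices of the Tillich-Zémor construction and checks that they commute mod 2:

```python
import numpy as np

# Hypergraph-product construction: from classical parity-check matrices
# H1 (r1 x n1) and H2 (r2 x n2), build CSS stabilizer matrices
#   HX = [H1 (x) I_n2 | I_r1 (x) H2^T],  HZ = [I_n1 (x) H2 | H1^T (x) I_r2],
# which commute automatically: HX @ HZ^T = 2 * (H1 (x) H2^T) = 0 mod 2.

def hypergraph_product(H1, H2):
    r1, n1 = H1.shape
    r2, n2 = H2.shape
    HX = np.hstack([np.kron(H1, np.eye(n2, dtype=int)),
                    np.kron(np.eye(r1, dtype=int), H2.T)]) % 2
    HZ = np.hstack([np.kron(np.eye(n1, dtype=int), H2),
                    np.kron(H1.T, np.eye(r2, dtype=int))]) % 2
    return HX, HZ

# Tiny example: parity checks of the length-3 repetition code as both inputs.
H = np.array([[1, 1, 0], [0, 1, 1]])
HX, HZ = hypergraph_product(H, H)
assert not (HX @ HZ.T % 2).any()   # CSS condition: X and Z checks commute
print(HX.shape, HZ.shape)          # (6, 13) (6, 13)
```

The stabilizer weights are bounded by the row and column weights of the classical inputs, which is what makes the resulting quantum codes LDPC.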
Low-rate image coding using vector quantization
DOE Office of Scientific and Technical Information (OSTI.GOV)
Makur, A.
1990-01-01
This thesis deals with the development and analysis of a computationally simple vector quantization image compression system for coding monochrome images at low bit rate. Vector quantization has been known to be an effective compression scheme when a low bit rate is desirable, but the intensive computation required in a vector quantization encoder has been a handicap in using it for low-rate image coding. The present work shows that, without substantially increasing the coder complexity, it is indeed possible to achieve acceptable picture quality while attaining a high compression ratio. Several modifications to the conventional vector quantization coder are proposed in the thesis. These modifications are shown to offer better subjective quality when compared to the basic coder. Distributed blocks are used instead of spatial blocks to construct the input vectors. A class of input-dependent weighted distortion functions is used to incorporate psychovisual characteristics in the distortion measure. Computationally simple filtering techniques are applied to further improve the decoded image quality. Finally, unique designs of the vector quantization coder using electronic neural networks are described, so that the coding delay is reduced considerably.
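The expensive encoder computation is the nearest-codeword search over every codebook entry per input block. A minimal sketch of a conventional (unmodified) VQ image coder, for orientation:

```python
import numpy as np

# Minimal vector-quantization image coder sketch (illustrative, not the
# thesis design): split an image into 4x4 blocks, map each block to the
# index of the nearest codeword, decode by table lookup.

def vq_encode(img, codebook, bs=4):
    h, w = img.shape
    idx = []
    for i in range(0, h, bs):
        for j in range(0, w, bs):
            v = img[i:i+bs, j:j+bs].reshape(-1)
            d = ((codebook - v) ** 2).sum(axis=1)  # squared-error distortion
            idx.append(int(d.argmin()))
    return idx

def vq_decode(idx, codebook, shape, bs=4):
    h, w = shape
    out = np.empty(shape)
    k = 0
    for i in range(0, h, bs):
        for j in range(0, w, bs):
            out[i:i+bs, j:j+bs] = codebook[idx[k]].reshape(bs, bs)
            k += 1
    return out

# Rate: log2(len(codebook)) / bs**2 bits per pixel, e.g. a 256-entry
# codebook on 4x4 blocks gives 0.5 bit/pixel.
```

The thesis's modifications (distributed blocks, input-dependent weighted distortion, neural-network coder designs) all target the cost or the perceptual quality of exactly this search loop.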
A denoising algorithm for CT image using low-rank sparse coding
NASA Astrophysics Data System (ADS)
Lei, Yang; Xu, Dong; Zhou, Zhengyang; Wang, Tonghe; Dong, Xue; Liu, Tian; Dhabaan, Anees; Curran, Walter J.; Yang, Xiaofeng
2018-03-01
We propose a denoising method for CT images based on low-rank sparse coding. The proposed method constructs an adaptive dictionary of image patches and estimates the sparse coding regularization parameters using a Bayesian interpretation. A low-rank approximation approach is used to simultaneously construct the dictionary and achieve sparse representation through clustering of similar image patches. A variable-splitting scheme and a quadratic optimization are used to reconstruct the CT image from the achieved sparse coefficients. We tested this denoising technology using phantom, brain and abdominal CT images. The experimental results showed that the proposed method delivers state-of-the-art denoising performance, both in terms of objective criteria and visual quality.
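The low-rank step at the heart of such patch-based methods can be sketched in a few lines. This illustrates the principle only; the paper's full method additionally learns the dictionary adaptively and estimates the regularization parameters Bayesianly:

```python
import numpy as np

# Core step of patch-based low-rank denoising (principle sketch): stack
# similar patches as rows, then hard-truncate the singular values.  Since
# similar patches are nearly identical, their stack is close to rank 1,
# and truncation averages out the independent noise.

def lowrank_denoise(patch_stack, rank):
    """patch_stack: (n_patches, patch_dim) matrix of similar patches."""
    U, s, Vt = np.linalg.svd(patch_stack, full_matrices=False)
    s[rank:] = 0.0                      # keep only the leading components
    return U @ np.diag(s) @ Vt
```

In a full pipeline this is applied per cluster of similar patches, and the denoised patches are aggregated back into the image.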
A Literature Review of Sealed and Insulated Attics—Thermal, Moisture and Energy Performance
DOE Office of Scientific and Technical Information (OSTI.GOV)
Less, Brennan; Walker, Iain; Levinson, Ronnen
In this literature review and analysis, we focus on the thermal, moisture and energy performance of sealed and insulated attics in California climates. Thermal. Sealed and insulated attics are expected to maintain attic air temperatures that are similar to those in the house, within +/- 10°F. Thermal stress on the assembly, namely high shingle and sheathing temperatures, is of minimal concern. In the past, many sealed and insulated attics were constructed with insufficient insulation levels (~R-20) and with too much air leakage to outside, leading to poor thermal performance. To ensure high performance, sealed and insulated attics in new California homes should be insulated at levels at least equivalent to the flat ceiling requirements in the code, and attic envelopes and ducts should be airtight. We expect that duct systems in well-constructed sealed and insulated attics should have less than 2% HVAC system leakage to outside. Moisture. Moisture risk in sealed and insulated California attics will increase with colder climate regions and more humid outside air in marine zones. Risk is considered low in the hot-dry, highly populated regions of the state, where most new home construction occurs. Indoor humidity levels should be controlled by following code requirements for continuous whole-house ventilation and local exhaust. Pending development of further guidance, we recommend that the air-impermeable insulation requirements of the International Residential Code (2012) be used, as they vary with IECC climate region and roof finish. Energy. Sealed and insulated attics provide energy benefits only if HVAC equipment is located in the attic volume, and the benefits depend strongly on the insulation and airtightness of the attic and ducts. Existing homes with leaky, uninsulated ducts in the attic should see major savings.
When compared with modern, airtight duct systems in a vented attic, sealed and insulated attics in California may still provide substantial benefit. Energy performance is expected to be roughly equivalent between sealed and insulated attics and prescriptive advanced roof/attic options in Title 24 2016. System performance can also be expected to improve, for example in pull-down time and performance at peak load. We expect benefits to be reduced for all advanced roof/attic approaches, relative to a traditional vented attic, as duct system leakage is reduced close to 0. The most recent assessments, comparing advanced roof/attic assemblies to code-compliant vented attics, suggest average 13% TDV energy savings, with substantial variation by climate zone (more savings in more extreme climates). Similar 6-11% reductions in seasonally adjusted HVAC duct thermal losses have been measured in a small subset of such California homes using the ducts-in-conditioned-space approach. Given the limited nature of energy and moisture monitoring in sealed and insulated attic homes, there is a crucial need for long-term data and advanced modeling of these approaches in the California new and existing home contexts.
Comparison of Analytical Predictions and Experimental Results for a Dual Brayton Power System
NASA Technical Reports Server (NTRS)
Johnson, Paul
2007-01-01
NASA Glenn Research Center (GRC) contracted Barber-Nichols, Arvada, CO, to construct a dual Brayton power conversion system for use as a hardware proof of concept and to validate results from a computational code known as the Closed Cycle System Simulation (CCSS). Initial checkout tests were performed at Barber-Nichols to ready the system for delivery to GRC. This presentation describes the system hardware components and lists the types of checkout tests performed, along with a couple of issues encountered while conducting the tests. A description of the CCSS model is also presented. The checkout tests did not focus on generating data; therefore, no test data or model analyses are presented.
Automatic mathematical modeling for real time simulation system
NASA Technical Reports Server (NTRS)
Wang, Caroline; Purinton, Steve
1988-01-01
A methodology for automatic mathematical modeling and generating simulation models is described. The models will be verified by running in a test environment using standard profiles, with the results compared against known results. The major objective is to create a user-friendly environment for engineers to design, maintain, and verify their models, and also to automatically convert the mathematical model into conventional code for conventional computation. A demonstration program was designed for modeling the Space Shuttle Main Engine Simulation. It is written in LISP and MACSYMA and runs on a Symbolics 3670 Lisp machine. The program provides a very friendly and well-organized environment for engineers to build a knowledge base of base equations and general information. It contains an initial set of component process elements for the Space Shuttle Main Engine Simulation and a questionnaire that allows the engineer to answer a set of questions to specify a particular model. The system is then able to automatically generate the model and FORTRAN code. A future goal, currently under development, is to download the FORTRAN code to a VAX/VMS system for conventional computation. The SSME mathematical model will be verified in a test environment and the solution compared with the real data profile. The use of artificial intelligence techniques has shown that the process of simulation modeling can be simplified.
Design and construction of functional AAV vectors.
Gray, John T; Zolotukhin, Serge
2011-01-01
Using the basic principles of molecular biology and laboratory techniques presented in this chapter, researchers should be able to create a wide variety of AAV vectors for both clinical and basic research applications. Basic vector design concepts are covered for both protein coding gene expression and small non-coding RNA gene expression cassettes. AAV plasmid vector backbones (available via AddGene) are described, along with critical sequence details for a variety of modular expression components that can be inserted as needed for specific applications. Protocols are provided for assembling the various DNA components into AAV vector plasmids in Escherichia coli, as well as for transferring these vector sequences into baculovirus genomes for large-scale production of AAV in the insect cell production system.
Design-Load Basis for LANL Structures, Systems, and Components
DOE Office of Scientific and Technical Information (OSTI.GOV)
I. Cuesta
2004-09-01
This document supports the recommendations in the Los Alamos National Laboratory (LANL) Engineering Standard Manual (ESM), Chapter 5 (Structural), providing the basis for the loads, analysis procedures, and codes to be used in the ESM. It also provides the justification for eliminating certain loads from design consideration, and evidence that the design basis loads are appropriate and consistent with the graded approach required by the Department of Energy (DOE) nuclear safety management regulation, 10 CFR Part 830. This document focuses on (1) the primary and secondary natural phenomena hazards listed in DOE-G-420.1-2, Appendix C, (2) additional loads not related to natural phenomena hazards, and (3) the design loads on structures during construction.
Solomon, Judith; Duschinsky, Robbie; Bakkum, Lianne; Schuengel, Carlo
2017-10-01
This article examines the construct of disorganized attachment originally proposed by Main and Solomon, developing some new conjectures based on inspiration from a largely unknown source: John Bowlby's unpublished texts, housed at the Wellcome Trust Library Archive in London (with permission from the Bowlby family). We explore Bowlby's discussions of disorganized attachment, which he understood from the perspective of ethological theories of conflict behavior. Bowlby's reflections regarding differences among the behaviors used to code disorganized attachment will be used to explore distinctions that may underlie the structure of the current coding system. The article closes with an emphasis on the importance Bowlby placed on Popper's distinction between the context of discovery and the context of justification in developmental science.
NASA Technical Reports Server (NTRS)
Sharma, Naveen
1992-01-01
In this paper we briefly describe a combined symbolic and numeric approach for solving mathematical models on parallel computers. An experimental software system, PIER, is being developed in Common Lisp to synthesize the computationally intensive and domain-formulation-dependent phases of finite element analysis (FEA) solution methods. Quantities for domain formulation, like shape functions, element stiffness matrices, etc., are derived automatically using symbolic mathematical computations. The problem-specific information and derived formulae are then used to generate (parallel) numerical code for the FEA solution steps. A constructive approach is taken to specifying the numerical program design. The code generator compiles application-oriented input specifications into (parallel) FORTRAN77 routines with the help of built-in knowledge of the particular problem, numerical solution methods and the target computer.
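The symbolic-derivation-then-code-generation pipeline can be illustrated in miniature. PIER itself is written in Common Lisp with MACSYMA-style symbolic computation; the sketch below uses Python/SymPy purely as an analogue, deriving the stiffness matrix of a 1D linear bar element symbolically and then emitting Fortran source for the numeric phase:

```python
import sympy as sp

# Analogue of the symbolic phase: derive a 1D linear bar element's
# stiffness matrix K_ij = integral(E*A*N_i'*N_j', x=0..L) from the shape
# functions, then emit Fortran for the numeric phase.

x, L, E, A = sp.symbols("x L E A", positive=True)
N = [1 - x / L, x / L]                     # linear shape functions
B = [sp.diff(n, x) for n in N]             # shape-function derivatives

K = sp.Matrix(2, 2, lambda i, j: sp.integrate(E * A * B[i] * B[j], (x, 0, L)))
print(K)                                   # (E*A/L) * [[1, -1], [-1, 1]]

# Code-generation step, as a generator like PIER's would perform:
print(sp.fcode(K[0, 0], assign_to="K11"))
```

The same two-phase pattern (derive formulae symbolically, compile them into target-language routines) scales to the multi-dimensional elements and parallel code described in the paper.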
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lstiburek, Joseph
2017-01-01
The measure guideline provides ventilation guidance for residential high-performance multifamily construction that incorporates the requirements of the ASHRAE 62.2 ventilation and indoor air quality standard. The measure guideline focuses on the decision criteria for weighing cost and performance of various ventilation systems. The measure guideline is intended for contractors, builders, developers, designers and building code officials. The guide may also be helpful to building owners wishing to learn more about ventilation strategies available for their buildings. The measure guideline includes specific design and installation instructions for the most cost-effective and performance-effective solutions for ventilation in multifamily units that satisfy the requirements of ASHRAE 62.2-2016.
In vitro cell irradiation systems based on 210Po alpha source: construction and characterisation
NASA Technical Reports Server (NTRS)
Szabo, J.; Feher, I.; Palfalvi, J.; Balashazy, I.; Dam, A. M.; Polonyi, I.; Bogdandi, E. N.
2002-01-01
One way of studying the risk to human health of low-level radiation exposure is to make biological experiments on living cell cultures. Two 210Po alpha-particle emitting devices, with 0.5 and 100 MBq activity, were designed and constructed to perform such experiments by irradiating monolayers of cells. Estimates of dose rate at the cell surface were obtained from measurements with a PIPS alpha-particle spectrometer and from calculations with the SRIM 2000 Monte Carlo charged-particle transport code. Particle fluence area distributions were measured by solid-state nuclear track detectors. The design and dosimetric characterisation of the devices are discussed.
NASA Astrophysics Data System (ADS)
Liu, G.; Wu, C.; Li, X.; Song, P.
2013-12-01
The 3D urban geological information system has been a major part of the national urban geological survey project of the China Geological Survey in recent years. Large amounts of multi-source and multi-subject data are to be stored in urban geological databases. Various models and vocabularies have been drafted and applied by industrial companies in urban geological data. Issues such as duplicate and ambiguous definitions of terms and differing coding structures increase the difficulty of information sharing and data integration. To solve this problem, we proposed a national-standard-driven information classification and coding method to effectively store and integrate urban geological data, and we applied data dictionary technology to achieve structured and standard data storage. The overall purpose of this work is to set up a common data platform that provides an information sharing service. Research progress is as follows: (1) A unified classification and coding method for multi-source data based on national standards. The underlying national standards include GB 9649-88 for geology and GB/T 13923-2006 for geography. Current industrial models are compared with the national standards to build a mapping table. The attributes of the various urban geological data entity models are reduced to several categories according to their application phases and domains. A logical data model is then set up as a standard format to design data file structures for a relational database. (2) A multi-level data dictionary for data standardization constraints. Three levels of data dictionary are designed: the model data dictionary manages system database files and enhances maintenance of the whole database system; the attribute dictionary organizes the fields used in database tables; the term and code dictionary provides a standard for the urban information system by adopting appropriate classification and coding methods; and a comprehensive data dictionary manages system operation and security.
(3) An extension to the system data management function based on the data dictionary. The data item input constraint function makes use of the standard term and code dictionary to obtain standard input results. The attribute dictionary organizes all the fields of an urban geological information database to ensure the consistency of term use for fields. The model dictionary is used to generate a database operation interface automatically, with standard semantic content supplied via the term and code dictionary. The above method and technology have been applied to the construction of the Fuzhou Urban Geological Information System, South-East China, with satisfactory results.
Dynamical generation of noiseless quantum subsystems
Viola; Knill; Lloyd
2000-10-16
We combine dynamical decoupling and universal control methods for open quantum systems with coding procedures. By exploiting a general algebraic approach, we show how appropriate encodings of quantum states result in obtaining universal control over dynamically generated noise-protected subsystems with limited control resources. In particular, we provide a constructive scheme based on two-body Hamiltonians for performing universal quantum computation over large noiseless spaces which can be engineered in the presence of arbitrary linear quantum noise.
Adaptive Control of Visually Guided Grasping in Neural Networks
1990-03-12
Listening to patients' voices: linguistic indicators related to diabetes self-management.
Connor, Ulla; Anton, Marta; Goering, Elizabeth; Lauten, Kathryn; Hayat, Amir; Roach, Paris; Balunda, Stephanie
2012-01-01
A great deal of research in health care has examined a wide range of variables to better understand the degree to which patients follow the advice of medical professionals in managing their health, known as adherence. This paper explains the development of the linguistic systems to describe and evaluate two psychosocial constructs (i.e. control orientation and agency) that have been found to be related to adherence in previous research for subjects with diabetes (Trento et al. 2007; Wangberg 2007; O'Hea et al. 2009). The present data came from 43 semi-structured in-depth interviews of subjects with Type 2 diabetes. One-on-one interviews with open-ended questions elicited subjects' 'stories' about living with diabetes, and the transcribed interviews were analyzed to develop the linguistic systems of control orientation and agency. The resultant systems were applied to the 43 interviews by raters with high inter-rater reliability. The results showed demarcations of clearly identified codings of patient types. The paper presents the linguistic coding systems developed in the study, the results of their application to the patient interview data, and recommendations for improved communication with patients.
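Inter-rater reliability of categorical coding like this is commonly quantified with Cohen's kappa. The paper does not state which statistic its raters used, so the following is a generic illustration with made-up codes:

```python
from collections import Counter

# Cohen's kappa: agreement between two raters assigning categorical
# codes, corrected for chance agreement.  Illustrative only; the codes
# below are invented, not from the study's data.

def cohens_kappa(r1, r2):
    n = len(r1)
    po = sum(a == b for a, b in zip(r1, r2)) / n          # observed agreement
    c1, c2 = Counter(r1), Counter(r2)
    pe = sum(c1[k] * c2[k] for k in c1) / n**2            # chance agreement
    return (po - pe) / (1 - pe)

rater1 = ["control", "agency", "agency", "control", "agency", "control"]
rater2 = ["control", "agency", "control", "control", "agency", "control"]
print(round(cohens_kappa(rater1, rater2), 3))   # -> 0.667
```

Values near 1 indicate near-perfect agreement; values near 0 indicate agreement no better than chance.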
Integrated nuclear data utilisation system for innovative reactors.
Yamano, N; Hasegawa, A; Kato, K; Igashira, M
2005-01-01
A five-year research and development project on an integrated nuclear data utilisation system was initiated in 2002, for developing innovative nuclear energy systems such as accelerator-driven systems. The integrated nuclear data utilisation system will be constructed as a modular code system, which consists of two sub-systems: the nuclear data search and plotting sub-system, and the nuclear data processing and utilisation sub-system. The system will be operated with a graphical user interface in order to enable easy utilisation through the Internet by both nuclear design engineers and nuclear data evaluators. This paper presents an overview of the integrated nuclear data utilisation system, describes the development of a prototype system to examine the operability of the user interface and discusses specifications of the two sub-systems.
On complexity of trellis structure of linear block codes
NASA Technical Reports Server (NTRS)
Lin, Shu
1990-01-01
The trellis structure of linear block codes (LBCs) is discussed. The state and branch complexities of a trellis diagram (TD) for an LBC are investigated. The TD with the minimum number of states is said to be minimal. The branch complexity of a minimal TD for an LBC is expressed in terms of the dimensions of specific subcodes of the given code. Then upper and lower bounds are derived on the number of states of a minimal TD for an LBC, and it is shown that a cyclic (or shortened cyclic) code is the worst in terms of state complexity among the LBCs of the same length and dimension. Furthermore, it is shown that the structural complexity of a minimal TD for an LBC depends on the order of its bit positions. This fact suggests that an appropriate permutation of the bit positions of a code may result in an equivalent code with a much simpler minimal TD. Boolean polynomial representation of the codewords of an LBC is also considered. This representation helps in the study of the trellis structure of the code. Boolean polynomial representation of a code is applied to construct its minimal TD. In particular, the construction of minimal trellises for Reed-Muller codes and for the extended and permuted binary primitive BCH codes which contain Reed-Muller codes as subcodes is emphasized. Finally, the structural complexity of minimal trellises for the extended and permuted double-error-correcting BCH codes is analyzed and presented. It is shown that these codes have relatively simple trellis structure and hence can be decoded with the Viterbi decoding algorithm.
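The state complexity in question can be computed directly from a parity-check matrix: in a syndrome-type (Wolf) trellis, the states at depth i are the distinct partial syndromes of codeword prefixes. A small sketch for the (7,4) Hamming code, chosen here purely for illustration, which also makes the bit-ordering dependence noted above easy to experiment with:

```python
from itertools import product
import numpy as np

# State profile of the syndrome (Wolf-type) trellis of a linear block
# code: at depth i the states are the distinct values of H[:, :i] @ c[:i]
# over all codewords c.  Example code: (7,4) Hamming.

H = np.array([[1, 0, 1, 0, 1, 0, 1],
              [0, 1, 1, 0, 0, 1, 1],
              [0, 0, 0, 1, 1, 1, 1]])

def codewords(H):
    n = H.shape[1]
    return [c for c in product((0, 1), repeat=n)
            if not (H @ np.array(c) % 2).any()]

def state_profile(H):
    cws = codewords(H)
    return [len({tuple(H[:, :i] @ np.array(c[:i]) % 2) for c in cws})
            for i in range(H.shape[1] + 1)]

print(state_profile(H))            # profile for the given bit order
print(state_profile(H[:, ::-1]))   # same code, permuted bit order
```

Comparing profiles across column permutations of H shows how the bit ordering changes the state complexity while leaving the code (up to equivalence) unchanged.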
Kratochwil, Claudius F; Sefton, Maggie M; Liang, Yipeng; Meyer, Axel
2017-11-23
The Midas cichlid species complex (Amphilophus spp.) is widely known among evolutionary biologists as a model system for sympatric speciation and adaptive phenotypic divergence within extremely short periods of time (a few hundred generations). The repeated parallel evolution of adaptive phenotypes in this radiation, combined with their near genetic identity, makes them an excellent model for studying phenotypic diversification. While many ecological and evolutionary studies have been performed on Midas cichlids, the molecular basis of specific phenotypes, particularly adaptations, and their underlying coding and cis-regulatory changes have not yet been studied thoroughly. For the first time in any New World cichlid, we use Tol2 transposon-mediated transgenesis in the Midas cichlid (Amphilophus citrinellus). By adapting existing microinjection protocols, we established an effective protocol for transgenesis in Midas cichlids. Embryos were injected with a Tol2 plasmid construct that drives enhanced green fluorescent protein (eGFP) expression under the control of the ubiquitin promoter. The transgene was successfully integrated into the germline, driving strong ubiquitous expression of eGFP in the first transgenic Midas cichlid line. Additionally, we show transient expression of two further transgenic constructs, ubiquitin::tdTomato and mitfa::eGFP. Transgenesis in Midas cichlids will facilitate further investigation of the genetic basis of species-specific traits, many of which are adaptations. Transgenesis is a versatile tool not only for studying regulatory elements such as promoters and enhancers, but also for testing gene function through overexpression of allelic gene variants. As such, it is an important first step in establishing the Midas cichlid as a powerful model for studying adaptive coding and non-coding changes in an ecological and evolutionary context.
40 CFR 62.15265 - How do I monitor the load of my municipal waste combustion unit?
Code of Federal Regulations, 2013 CFR
2013-07-01
... Mechanical Engineers (ASME PTC 4.1—1964): Test Code for Steam Generating Units, Power Test Code 4.1-1964... of Mechanical Engineers, Service Center, 22 Law Drive, Post Office Box 2900, Fairfield, NJ 07007. You....archives.gov/federal_register/code_of_federal_regulations/ibr_locations.html. (4) Design, construct...
40 CFR 62.15265 - How do I monitor the load of my municipal waste combustion unit?
Code of Federal Regulations, 2011 CFR
2011-07-01
... Mechanical Engineers (ASME PTC 4.1—1964): Test Code for Steam Generating Units, Power Test Code 4.1-1964... of Mechanical Engineers, Service Center, 22 Law Drive, Post Office Box 2900, Fairfield, NJ 07007. You....archives.gov/federal_register/code_of_federal_regulations/ibr_locations.html. (4) Design, construct...
40 CFR 62.15265 - How do I monitor the load of my municipal waste combustion unit?
Code of Federal Regulations, 2010 CFR
2010-07-01
... Mechanical Engineers (ASME PTC 4.1—1964): Test Code for Steam Generating Units, Power Test Code 4.1-1964... of Mechanical Engineers, Service Center, 22 Law Drive, Post Office Box 2900, Fairfield, NJ 07007. You....archives.gov/federal_register/code_of_federal_regulations/ibr_locations.html. (4) Design, construct...
40 CFR 62.15265 - How do I monitor the load of my municipal waste combustion unit?
Code of Federal Regulations, 2012 CFR
2012-07-01
... Mechanical Engineers (ASME PTC 4.1—1964): Test Code for Steam Generating Units, Power Test Code 4.1-1964... of Mechanical Engineers, Service Center, 22 Law Drive, Post Office Box 2900, Fairfield, NJ 07007. You....archives.gov/federal_register/code_of_federal_regulations/ibr_locations.html. (4) Design, construct...
40 CFR 62.15265 - How do I monitor the load of my municipal waste combustion unit?
Code of Federal Regulations, 2014 CFR
2014-07-01
... Mechanical Engineers (ASME PTC 4.1—1964): Test Code for Steam Generating Units, Power Test Code 4.1-1964... of Mechanical Engineers, Service Center, 22 Law Drive, Post Office Box 2900, Fairfield, NJ 07007. You....archives.gov/federal_register/code_of_federal_regulations/ibr_locations.html. (4) Design, construct...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-03-10
.... The IgCC is intended to provide green model building code provisions for new and existing commercial... DEPARTMENT OF ENERGY 10 CFR Part 430 [Docket No. EERE-2011-BT-BC-0009] Building Energy Codes Program: Presenting and Receiving Comments to DOE Proposed Changes to the International Green Construction...
Entanglement entropy from tensor network states for stabilizer codes
NASA Astrophysics Data System (ADS)
He, Huan; Zheng, Yunqin; Bernevig, B. Andrei; Regnault, Nicolas
2018-03-01
In this paper, we present the construction of tensor network states (TNS) for some of the degenerate ground states of three-dimensional (3D) stabilizer codes. We then use the TNS formalism to obtain the entanglement spectrum and entropy of these ground states for some special cuts. In particular, we work out examples of the 3D toric code, the X-cube model, and the Haah code. The latter two models belong to the category of "fracton" models proposed recently, while the first one belongs to the conventional topological phases. We mention the cases for which the entanglement entropy and spectrum can be calculated exactly: For these, the constructed TNS is a singular value decomposition (SVD) of the ground states with respect to particular entanglement cuts. Apart from the area law, the entanglement entropies also have constant and linear corrections for the fracton models, while the entanglement entropies for the toric code models only have constant corrections. For the cuts we consider, the entanglement spectra of these three models are completely flat. We also conjecture that the negative linear correction to the area law is a signature of extensive ground-state degeneracy. Moreover, the transfer matrices of these TNSs can be constructed. We show that the transfer matrices are projectors whose eigenvalues are either 1 or 0. The number of nonzero eigenvalues is tightly related to the ground-state degeneracy.
Potts glass reflection of the decoding threshold for qudit quantum error correcting codes
NASA Astrophysics Data System (ADS)
Jiang, Yi; Kovalev, Alexey A.; Pryadko, Leonid P.
We map the maximum likelihood decoding threshold for qudit quantum error correcting codes to the multicritical point in generalized Potts gauge glass models, extending the map constructed previously for qubit codes. An n-qudit quantum LDPC code, where a qudit can be involved in up to m stabilizer generators, corresponds to a ℤd Potts model with n interaction terms which can couple up to m spins each. We analyze general properties of the phase diagram of the constructed model, give several bounds on the location of the transitions, bounds on the energy density of extended defects (non-local analogs of domain walls), and discuss the correlation functions which can be used to distinguish different phases in the original and the dual models. This research was supported in part by the Grants: NSF PHY-1415600 (AAK), NSF PHY-1416578 (LPP), and ARO W911NF-14-1-0272 (LPP).
DOE Office of Scientific and Technical Information (OSTI.GOV)
Alfonsi, Andrea; Rabiti, Cristian; Mandelli, Diego
2016-06-01
RAVEN is a software framework able to perform parametric and stochastic analysis based on the response of complex system codes. The initial development was aimed at providing dynamic risk analysis capabilities to the thermohydraulic code RELAP-7, currently under development at Idaho National Laboratory (INL). Although the initial goal has been fully accomplished, RAVEN is now a multi-purpose stochastic and uncertainty quantification platform, capable of communicating with any system code. In fact, the provided Application Programming Interfaces (APIs) allow RAVEN to interact with any code as long as all the parameters that need to be perturbed are accessible by input files or via Python interfaces. RAVEN is capable of investigating system response and exploring the input space using various sampling schemes such as Monte Carlo, grid, or Latin hypercube. However, RAVEN's strength lies in its system feature discovery capabilities, such as constructing limit surfaces, separating regions of the input space leading to system failure, and using dynamic supervised learning techniques. The development of RAVEN started in 2012 when, within the Nuclear Energy Advanced Modeling and Simulation (NEAMS) program, the need to provide a modern risk evaluation framework arose. RAVEN's principal assignment is to provide the necessary software and algorithms in order to employ the concepts developed by the Risk Informed Safety Margin Characterization (RISMC) program. RISMC is one of the pathways defined within the Light Water Reactor Sustainability (LWRS) program. In the RISMC approach, the goal is not just to identify the frequency of an event potentially leading to a system failure, but the proximity (or lack thereof) to key safety-related events. Hence, the approach is interested in identifying and increasing the safety margins related to those events. A safety margin is a numerical value quantifying the probability that a safety metric (e.g.
peak pressure in a pipe) is exceeded under certain conditions. Most of the capabilities, implemented having RELAP-7 as a principal focus, are easily deployable to other system codes. For this reason, several side activates have been employed (e.g. RELAP5-3D, any MOOSE-based App, etc.) or are currently ongoing for coupling RAVEN with several different software. The aim of this document is to provide a set of commented examples that can help the user to become familiar with the RAVEN code usage.« less
Development and Implementation of Kumamoto Technopolis Regional Database T-KIND
NASA Astrophysics Data System (ADS)
Onoue, Noriaki
T-KIND (Techno-Kumamoto Information Network for Data-Base) is a system for effectively searching information on the technology, human resources and industries necessary to realize Kumamoto Technopolis. It is composed of a coded database, an image database and a LAN inside the techno-research park which is the center of R & D in the Technopolis. It constructs an on-line system by networking general-purpose computers, minicomputers, optical disk file systems and so on, and provides the service through the public telephone line. Two databases are now available, on enterprise information and human resource information. The former covers about 4,000 enterprises, and the latter about 2,000 persons.
Second NASA Workshop on Wiring for Space Applications
NASA Technical Reports Server (NTRS)
1994-01-01
This document contains the proceedings of the Second NASA Workshop on Wiring for Space Applications held at NASA LeRC in Cleveland, OH, 6-7 Oct. 1993. The workshop was sponsored by NASA Headquarters Code QW Office of Safety and Mission Quality, Technical Standards Division and hosted by NASA LeRC, Power Technology Division, Electrical Components and Systems Branch. The workshop addressed key technology issues in the field of electrical power wiring for space applications. Speakers from government, industry, and academia presented and discussed topics on arc tracking phenomena, wiring system design, insulation constructions, and system protection. Presentation materials provided by the various speakers are included in this document.
Rational's experience using Ada for very large systems
NASA Technical Reports Server (NTRS)
Archer, James E., Jr.; Devlin, Michael T.
1986-01-01
The experience of using the Rational Environment has confirmed the advantages foreseen when the project was started. Interactive syntactic and semantic information makes a tremendous difference in the ease of constructing programs and making changes to them. The ability to follow semantic references makes it easier to understand existing programs and the impact of changes. The integrated debugger makes it much easier to find bugs and test fixes quickly. Taken together, these facilities have helped greatly in reducing the impact of ongoing maintenance on the ability to produce new code. Similar improvements are anticipated as the same level of integration and interactivity is achieved for configuration management and version control. The environment has also proven useful in introducing personnel to the project and existing personnel to new parts of the system. Personnel benefit from the assistance with syntax and semantics; everyone benefits from the ability to traverse and understand the structure of unfamiliar software. It is often possible for someone completely unfamiliar with a body of code to use these facilities to understand it well enough to successfully diagnose and fix bugs in a matter of minutes.
Lafuente, M J; Petit, T; Gancedo, C
1997-12-22
We have constructed a series of plasmids to facilitate the fusion of promoters, with or without coding regions of genes of Schizosaccharomyces pombe, to the lacZ gene of Escherichia coli. These vectors carry a multiple cloning region in which fission yeast DNA may be inserted in three different reading frames with respect to the coding region of lacZ. The plasmids were constructed with the ura4+ or the his3+ marker of S. pombe. Functionality of the plasmids was tested by measuring in parallel the expression of fructose 1,6-bisphosphatase and beta-galactosidase under the control of the fbp1+ promoter in different conditions.
Soft-decision decoding techniques for linear block codes and their error performance analysis
NASA Technical Reports Server (NTRS)
Lin, Shu
1996-01-01
The first paper presents a new minimum-weight trellis-based soft-decision iterative decoding algorithm for binary linear block codes. The second paper derives an upper bound on the probability of block error for multilevel concatenated codes (MLCC); the bound evaluates the difference in performance for different decompositions of some codes. The third paper investigates the bit error probability for maximum likelihood decoding of binary linear codes. The fourth and final paper included in this report concerns the construction of multilevel concatenated block modulation codes using a multilevel concatenation scheme for the frequency non-selective Rayleigh fading channel.
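The decoders in these papers exploit trellis structure; as a generic baseline, soft-decision maximum-likelihood decoding can be sketched by brute force: correlate the received soft values against the BPSK image of every codeword and keep the best match. The (7,4) Hamming generator matrix below is a standard illustrative choice, not one of the codes studied in the report.

```python
# Brute-force soft-decision ML decoding of a binary linear block code.
# Illustrative only: a (7,4) Hamming code with BPSK mapping 0 -> +1, 1 -> -1.
from itertools import product

# One common systematic generator matrix of the (7,4) Hamming code.
G = [
    [1, 0, 0, 0, 1, 1, 0],
    [0, 1, 0, 0, 1, 0, 1],
    [0, 0, 1, 0, 0, 1, 1],
    [0, 0, 0, 1, 1, 1, 1],
]

def encode(msg):
    """Encode a 4-bit message with G over GF(2)."""
    return [sum(m * g for m, g in zip(msg, col)) % 2 for col in zip(*G)]

# Enumerate all 2^4 = 16 codewords once.
CODEBOOK = [encode(list(m)) for m in product([0, 1], repeat=4)]

def ml_decode(received):
    """Return the codeword whose BPSK image best correlates with the
    received soft values (maximum-likelihood on an AWGN channel)."""
    def corr(cw):
        return sum(r * (1 - 2 * b) for r, b in zip(received, cw))
    return max(CODEBOOK, key=corr)

# A noisy observation of the all-zero codeword.
r = [0.9, 1.1, -0.2, 0.8, 1.0, 0.7, 0.9]
print(ml_decode(r))  # -> [0, 0, 0, 0, 0, 0, 0]
```

The exhaustive search is exponential in the code dimension; the trellis-based algorithms of the report compute the same decision far more efficiently.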
Multiloop Integral System Test (MIST): MIST Facility Functional Specification
DOE Office of Scientific and Technical Information (OSTI.GOV)
Habib, T F; Koksal, C G; Moskal, T E
1991-04-01
The Multiloop Integral System Test (MIST) is part of a multiphase program started in 1983 to address small-break loss-of-coolant accidents (SBLOCAs) specific to Babcock and Wilcox designed plants. MIST is sponsored by the US Nuclear Regulatory Commission, the Babcock and Wilcox Owners Group, the Electric Power Research Institute, and Babcock and Wilcox. The unique features of the Babcock and Wilcox design, specifically the hot leg U-bends and steam generators, prevented the use of existing integral system data or existing integral facilities to address the thermal-hydraulic SBLOCA questions. MIST was specifically designed and constructed for this program, and an existing facility -- the Once Through Integral System (OTIS) -- was also used. Data from MIST and OTIS are used to benchmark the adequacy of system codes, such as RELAP5 and TRAC, for predicting abnormal plant transients. The MIST Functional Specification documents as-built design features, dimensions, instrumentation, and test approach. It also presents the scaling basis for the facility and serves to define the scope of work for the facility design and construction. 13 refs., 112 figs., 38 tabs.
Beyond Molecular Codes: Simple Rules to Wire Complex Brains
Hassan, Bassem A.; Hiesinger, P. Robin
2015-01-01
Molecular codes, like postal zip codes, are generally considered a robust way to ensure the specificity of neuronal target selection. However, a code capable of unambiguously generating complex neural circuits is difficult to conceive. Here, we re-examine the notion of molecular codes in the light of developmental algorithms. We explore how molecules and mechanisms that have been considered part of a code may alternatively implement simple pattern formation rules sufficient to ensure wiring specificity in neural circuits. This analysis delineates a pattern-based framework for circuit construction that may contribute to our understanding of brain wiring. PMID:26451480
Separable concatenated codes with iterative map decoding for Rician fading channels
NASA Technical Reports Server (NTRS)
Lodge, J. H.; Young, R. J.
1993-01-01
Very efficient signalling in radio channels requires the design of very powerful codes having special structure suitable for practical decoding schemes. In this paper, powerful codes are obtained by combining comparatively simple convolutional codes to form multi-tiered 'separable' convolutional codes. The decoding of these codes, using separable symbol-by-symbol maximum a posteriori (MAP) 'filters', is described. It is known that this approach yields impressive results in non-fading additive white Gaussian noise channels. Interleaving is an inherent part of the code construction, and consequently, these codes are well suited for fading channel communications. Here, simulation results for communications over Rician fading channels are presented to support this claim.
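Since interleaving is described as an inherent part of the code construction, a minimal block-interleaver sketch may help illustrate why such codes suit fading channels: writing symbols row-wise and reading them column-wise disperses a burst of channel errors across many component codewords. The dimensions below are illustrative, not taken from the paper.

```python
# Minimal block interleaver: write row-wise into an R x C array,
# read column-wise. A burst of C consecutive channel errors then hits
# at most one symbol per row, which component decoders can correct.
def interleave(seq, rows, cols):
    assert len(seq) == rows * cols
    grid = [seq[r * cols:(r + 1) * cols] for r in range(rows)]
    return [grid[r][c] for c in range(cols) for r in range(rows)]

def deinterleave(seq, rows, cols):
    # Reading column-wise from an R x C write is the same as writing
    # row-wise into a C x R array and reading column-wise again.
    return interleave(seq, cols, rows)

data = list(range(12))
tx = interleave(data, 3, 4)
print(tx)  # -> [0, 4, 8, 1, 5, 9, 2, 6, 10, 3, 7, 11]
assert deinterleave(tx, 3, 4) == data
```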
NASA Technical Reports Server (NTRS)
Johnson, Paul K.
2007-01-01
NASA Glenn Research Center (GRC) contracted Barber-Nichols, Arvada, CO, to construct a dual Brayton power conversion system for use as a hardware proof of concept and to validate results from a computational code known as the Closed Cycle System Simulation (CCSS). Initial checkout tests were performed at Barber-Nichols to ready the system for delivery to GRC. This presentation describes the system hardware components and lists the types of checkout tests performed, along with a couple of issues encountered while conducting the tests. A description of the CCSS model is also presented. The checkout tests did not focus on generating data; therefore, no test data or model analyses are presented.
MeshVoro: A Three-Dimensional Voronoi Mesh Building Tool for the TOUGH Family of Codes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Freeman, C. M.; Boyle, K. L.; Reagan, M.
2013-09-30
Few tools exist for creating and visualizing complex three-dimensional simulation meshes, and these have limitations that restrict their application to particular geometries and circumstances. Mesh generation needs to trend toward ever more general applications. To that end, we have developed MeshVoro, a tool that is based on the Voro (Rycroft 2009) library and is capable of generating complex three-dimensional Voronoi tessellation-based (unstructured) meshes for the solution of problems of flow and transport in subsurface geologic media that are addressed by the TOUGH (Pruess et al. 1999) family of codes. MeshVoro, which includes built-in data visualization routines, is a particularly useful tool because it extends the applicability of the TOUGH family of codes by enabling the scientifically robust and relatively easy discretization of systems with challenging 3D geometries. We describe several applications of MeshVoro. We illustrate the ability of the tool to straightforwardly transform a complex geological grid into a simulation mesh that conforms to the specifications of the TOUGH family of codes. We demonstrate how MeshVoro can describe complex system geometries with a relatively small number of grid blocks, and we construct meshes for geometries that would have been practically intractable with a standard Cartesian grid approach. We also discuss the limitations and appropriate applications of this new technology.
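The core idea of a Voronoi tessellation can be illustrated with a toy sketch (not the MeshVoro/Voro algorithm, which computes exact 3-D cell geometry and connectivity): every sample point of the domain is labeled with its nearest seed, and each set of equal labels is one Voronoi cell. All names and dimensions below are illustrative.

```python
# Discrete 2-D Voronoi tessellation: sample the domain on a grid and
# label each sample with the index of its nearest seed point.
import math

def discrete_voronoi(seeds, nx, ny, width=1.0, height=1.0):
    """Return an ny-by-nx grid of nearest-seed labels (cell indices)."""
    labels = []
    for j in range(ny):
        y = (j + 0.5) * height / ny   # cell-center y coordinate
        row = []
        for i in range(nx):
            x = (i + 0.5) * width / nx  # cell-center x coordinate
            k = min(range(len(seeds)),
                    key=lambda s: math.dist((x, y), seeds[s]))
            row.append(k)
        labels.append(row)
    return labels

# Two seeds on the horizontal midline split the unit square in half.
seeds = [(0.25, 0.5), (0.75, 0.5)]
grid = discrete_voronoi(seeds, nx=8, ny=4)
print(grid[0])  # -> [0, 0, 0, 0, 1, 1, 1, 1]
```

A true Voronoi mesh builder additionally records each cell's volume, face areas, and neighbor connectivity, which is the information the TOUGH codes consume.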
49 CFR 41.115 - New buildings to be leased for DOT occupancy.
Code of Federal Regulations, 2011 CFR
2011-10-01
... compliance may include the engineer's and architect's authenticated verifications of seismic design codes... design and construction of new buildings to be leased for DOT occupancy or use will ensure that each building is designed and constructed in accord with the seismic design and construction standards set out...
49 CFR 41.115 - New buildings to be leased for DOT occupancy.
Code of Federal Regulations, 2010 CFR
2010-10-01
... compliance may include the engineer's and architect's authenticated verifications of seismic design codes... design and construction of new buildings to be leased for DOT occupancy or use will ensure that each building is designed and constructed in accord with the seismic design and construction standards set out...
How to reduce your fire insurance rates
NASA Technical Reports Server (NTRS)
Dubain, M.
1971-01-01
Construction procedures and utilization of materials to reduce the cost of insuring large buildings against losses from fire are discussed. Examples of good and bad techniques in building construction and fire safety management are provided. The inadequacies of building codes and the hazards resulting from improper construction are examined.
FY06 L2C2 HE program report Zaug et al.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zaug, J M; Crowhurst, J C; Howard, W M
2008-08-01
The purpose of this project is to advance the improvement of LLNL thermochemical computational models that form the underlying basis or input for laboratory hydrodynamic simulations. Our general work approach utilizes, by design, tight experimental-theoretical research interactions that allow us to improve LLNL computational results not empirically but scientifically. The ultimate goal is to confidently predict, through computer models, the performance and safety parameters of currently maintained, modified, and newly designed stockpile systems. To attain our goal we make relevant experimental measurements on candidate detonation products constrained under static high-pressure and high-temperature conditions. The reduced information from these measurements is then used to construct analytical forms that describe the potential surface (repulsive energy as a function of interatomic separation distance) of single and mixed fluid or detonation product species. These potential surface shapes are also constructed using input from well-trusted shock wave physics and assorted thermodynamic data available in the open literature. Our potential surfaces permit one to determine the equations of state (P, V, T), the equilibrium chemistry, phase, and chemical interactions of detonation products under a very wide range of extreme pressure-temperature conditions. Using our foundation of experimentally refined potential surfaces, we are in a position to calculate, with confidence, the energetic output and chemical speciation occurring in a specific combustion and/or detonation reaction. The thermochemical model we developed and use for calculating the equilibrium chemistry, kinetics, and energy from ultrafast processes is named 'Cheetah'. Computational results from our Cheetah code are coupled to laboratory ALE3D hydrodynamic simulation codes, where the complete response behavior of an existing or proposed system is ultimately predicted.
The Cheetah thermochemical code is also used by well over 500 U.S. government DoD and DOE community users who calculate the chemical properties of detonated high explosives, propellants, and pyrotechnics. To satisfy the growing needs of LLNL and the general user community we continue to improve the robustness of our Cheetah code. The P-T range of current speed-of-sound experiments will soon be extended by a factor of four, and our recently developed technological advancements permit us, for the first time, to study any chemical species or fluid mixture. New experiments will focus on determining the miscibility or coexistence curves of detonation product mixtures. Our newly constructed ultrafast laser diagnostics will permit us to determine what chemical species exist under conditions approaching Chapman-Jouguet (CJ) detonation states. Furthermore, we will measure the time evolution of candidate species and use our chemical kinetics data to develop new rate laws and validate existing ones employed in future versions of our Cheetah thermochemical code.
A generic minimization random allocation and blinding system on web.
Cai, Hongwei; Xia, Jielai; Xu, Dezhong; Gao, Donghuai; Yan, Yongping
2006-12-01
Minimization is a dynamic randomization method for clinical trials. Although recommended by many researchers, the use of minimization has seldom been reported in randomized trials, mainly because of the controversy surrounding the validity of conventional analyses and its complexity in implementation. However, both the statistical and clinical validity of minimization were demonstrated in recent studies. A minimization random allocation system integrated with a blinding function that could facilitate the implementation of this method in general clinical trials has not been reported. SYSTEM OVERVIEW: The system is a web-based random allocation system using the Pocock and Simon minimization method. It also supports multiple treatment arms within a trial, multiple simultaneous trials, and blinding without further programming. The system was constructed with a generic database schema design method, the Pocock and Simon minimization method and a blinding method. It was coded in the Microsoft Visual Basic and Active Server Pages (ASP) programming languages, and all datasets were managed with a Microsoft SQL Server database. Some critical programming code is also provided. SIMULATIONS AND RESULTS: Two clinical trials were simulated simultaneously to test the system's applicability. Not only balanced groups but also blinded allocation results were achieved in both trials. Practical considerations for the minimization method, and the benefits, general applicability and drawbacks of the technique implemented in this system, are discussed. Promising features of the proposed system are also summarized.
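A minimal, deterministic sketch of the Pocock and Simon minimization method the system implements: for each new subject, every candidate arm is scored by the total marginal imbalance (summed over prognostic factors) that assigning the subject to that arm would create, and the lowest-scoring arm wins. Production systems add a biased-coin random element on top of this; all factor and arm names below are illustrative, not taken from the paper.

```python
# Deterministic Pocock-Simon minimization sketch (no biased coin).
def minimize_assign(tallies, subject, arms):
    """tallies[(arm, factor, level)] -> running count.
    Assigns and returns the arm minimizing total marginal imbalance."""
    def score(candidate):
        total = 0
        for factor, level in subject.items():
            counts = [tallies.get((a, factor, level), 0)
                      + (1 if a == candidate else 0)
                      for a in arms]
            total += max(counts) - min(counts)  # range as imbalance measure
        return total

    best = min(arms, key=score)  # ties go to the first-listed arm
    for factor, level in subject.items():
        key = (best, factor, level)
        tallies[key] = tallies.get(key, 0) + 1
    return best

arms = ["A", "B"]
tallies = {}
subjects = [{"sex": s, "site": c}
            for s in ("M", "F") for c in ("1", "2")] * 5  # 20 subjects
allocated = [minimize_assign(tallies, subj, arms) for subj in subjects]
print(allocated.count("A"), allocated.count("B"))  # -> 10 10
```

Unlike simple randomization, this keeps the arms balanced within each sex and site level, not just overall.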
Experimental study on lateral strength of wall-slab joint subjected to lateral cyclic load
NASA Astrophysics Data System (ADS)
Masrom, Mohd Asha'ari; Mohamad, Mohd Elfie; Hamid, Nor Hayati Abdul; Yusuff, Amer
2017-10-01
Tunnel form building has been utilised in building construction in Malaysia since 1960. This method of construction has been applied extensively in the construction of high-rise residential buildings (multistorey buildings) such as condominiums and apartments. Most tunnel form buildings have been designed according to the British Standard (BS), which has no provision for seismic loading, so high-rise tunnel form buildings are vulnerable to seismic loading. The connections between the slab and the shear walls in a tunnel-form building constitute an essential link in the lateral load resisting mechanism. Malaysia is shifting from the BS code to the Eurocode (EC) for building construction, since the country has recognised the safety threat of earthquakes. Hence, this study compares the performance of the interior wall-slab joint of a tunnel form structure designed based on the Eurocode and the British Standard. The experiment included a full-scale test of wall-slab joint sub-assemblages under reversible lateral cyclic loading. Two sub-assemblage specimens of the wall-slab joint were designed and constructed based on the two codes. Each specimen was tested using lateral displacement control (drift control). The specimen designed using the Eurocode was found to survive up to 3.0% drift, while the BS specimen lasted to 1.5% drift. The analysis results indicated that the BS specimen was governed by brittle failure modes with Ductility Class Low (DCL), while the EC specimen behaved in a ductile manner with Ductility Class Medium (DCM). The low ductility recorded in the BS specimen resulted from the insufficient reinforcement provided in that specimen. Consequently, the BS specimen could not absorb energy efficiently (low energy dissipation) or further sustain inelastic deformation.
Energy and Environment Guide to Action - Chapter 4.3: Building Codes for Energy Efficiency
Provides guidance and recommendations for establishing, implementing, and evaluating state building codes for energy efficiency, which improve energy efficiency in new construction and major renovations. State success stories are included for reference.
Constructing the spectral web of rotating plasmas
NASA Astrophysics Data System (ADS)
Goedbloed, Hans
2012-10-01
Rotating plasmas are ubiquitous in nature. The theory of MHD stability of such plasmas, initiated a long time ago, has suffered severely from the widespread misunderstanding that it necessarily involves non-self-adjoint operators. It has been shown (J.P. Goedbloed, PPCF 16, 074001, 2011; Goedbloed, Keppens and Poedts, Advanced Magnetohydrodynamics, Cambridge, 2010) that, on the contrary, the spectral theory of moving plasmas can be constructed entirely on the basis of energy conservation and the self-adjointness of the occurring operators. The spectral web is a further development along this line. It involves the construction of a network of curves in the complex omega-plane associated with the complex complementary energy, which is the energy needed to maintain harmonic time dependence in an open system. Vanishing of that energy, at the intersections of the mentioned curves, yields the eigenvalues of the closed system. This makes it possible to consider the enormous diversity of MHD instabilities of rotating tokamaks, accretion disks around compact objects, and jets emitted from those objects from a single viewpoint. This will be illustrated with results obtained with a new spectral code (ROC).
Cre recombinase-mediated site-specific recombination between plant chromosomes.
Qin, M; Bayley, C; Stockton, T; Ow, D W
1994-01-01
We report the use of the bacteriophage P1 Cre-lox system for generating conservative site-specific recombination between tobacco chromosomes. Two constructs, one containing a promoterless hygromycin-resistance gene preceded by a lox site (lox-hpt) and the other containing a cauliflower mosaic virus 35S promoter linked to a lox sequence and the cre coding region (35S-lox-cre), were introduced separately into tobacco plants. Crosses between plants harboring either construct produced plants with the two constructs situated on different chromosomes. Plants with recombination events were identified by selecting for hygromycin resistance, a phenotype expressed upon recombination. Molecular analysis showed that these recombination events occurred specifically at the lox sites and resulted in the reciprocal exchange of flanking host DNA. Progenies of these plants showed 67-100% cotransmission of the new transgenes, 35S-lox-hpt and lox-cre, consistent with the preferential cosegregation of translocated chromosomes. These results illustrate that site-specific recombination systems can be useful tools for the large-scale manipulation of eukaryotic chromosomes in vivo. PMID:8127869
Shielding and activation calculations around the reactor core for the MYRRHA ADS design
NASA Astrophysics Data System (ADS)
Ferrari, Anna; Mueller, Stefan; Konheiser, J.; Castelliti, D.; Sarotto, M.; Stankovskiy, A.
2017-09-01
In the frame of the FP7 European project MAXSIMA, an extensive simulation study has been performed to assess the main shielding problems in view of the construction of the MYRRHA accelerator-driven system at SCK·CEN in Mol (Belgium). An innovative method based on the combined use of the two state-of-the-art Monte Carlo codes MCNPX and FLUKA has been used, with the goal of characterizing the complex, realistic neutron fields around the core barrel, to be used as source terms in detailed analyses of the radiation fields due to the system in operation and of the coupled residual radiation. The main results of the shielding analysis are presented, as well as the construction of an activation database of all the key structural materials. The results demonstrate a powerful way to analyse shielding and activation problems, with direct and clear implications for the design solutions.
NASA Astrophysics Data System (ADS)
Mattie, P. D.; Knowlton, R. G.; Arnold, B. W.; Tien, N.; Kuo, M.
2006-12-01
Sandia National Laboratories (Sandia), a U.S. Department of Energy National Laboratory, has over 30 years of experience in radioactive waste disposal and is providing assistance internationally in a number of areas relevant to the safety assessment of radioactive waste disposal systems. International technology transfer efforts are often hampered by small budgets, time schedule constraints, and a lack of experienced personnel in countries with small radioactive waste disposal programs. In an effort to surmount these difficulties, Sandia has developed a system that utilizes a combination of commercially available codes and existing legacy codes for probabilistic safety assessment modeling, which facilitates the technology transfer and maximizes limited available funding. Numerous codes developed and endorsed by the United States Nuclear Regulatory Commission, and codes developed and maintained by the United States Department of Energy, are generally available to foreign countries after addressing import/export control and copyright requirements. From a programmatic view, it is easier to utilize existing codes than to develop new codes. From an economic perspective, it is not possible for most countries with small radioactive waste disposal programs to maintain complex software that meets the rigors of both domestic regulatory requirements and international peer review. Therefore, re-vitalization of deterministic legacy codes, as well as an adaptation of contemporary deterministic codes, provides a credible and solid computational platform for constructing probabilistic safety assessment models. External model linkage capabilities in Goldsim and the techniques applied to facilitate this process will be presented using example applications, including Breach, Leach, and Transport-Multiple Species (BLT-MS), a U.S. NRC sponsored code simulating release and transport of contaminants from a subsurface low-level waste disposal facility, used in a cooperative technology transfer project between Sandia National Laboratories and Taiwan's Institute of Nuclear Energy Research (INER) for the preliminary assessment of several candidate low-level waste repository sites. Sandia National Laboratories is a multiprogram laboratory operated by Sandia Corporation, a Lockheed Martin Company, for the United States Department of Energy under Contract DE-AC04-94AL85000.
INL Results for Phases I and III of the OECD/NEA MHTGR-350 Benchmark
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gerhard Strydom; Javier Ortensi; Sonat Sen
2013-09-01
The Idaho National Laboratory (INL) Very High Temperature Reactor (VHTR) Technology Development Office (TDO) Methods Core Simulation group led the construction of the Organization for Economic Cooperation and Development (OECD) Modular High Temperature Reactor (MHTGR) 350 MW benchmark for comparing and evaluating prismatic VHTR analysis codes. The benchmark is sponsored by the OECD's Nuclear Energy Agency (NEA), and the project will yield a set of reference steady-state, transient, and lattice depletion problems that can be used by the Department of Energy (DOE), the Nuclear Regulatory Commission (NRC), and vendors to assess their code suites. The Methods group is responsible for defining the benchmark specifications, leading the data collection and comparison activities, and chairing the annual technical workshops. This report summarizes the latest INL results for Phase I (steady state) and Phase III (lattice depletion) of the benchmark. The INSTANT, Pronghorn and RattleSnake codes were used for the standalone core neutronics modeling of Exercise 1, and the results obtained from these codes are compared in Section 4. Exercise 2 of Phase I requires the standalone steady-state thermal fluids modeling of the MHTGR-350 design, and the results for the systems code RELAP5-3D are discussed in Section 5. The coupled neutronics and thermal fluids steady-state solution for Exercise 3 is reported in Section 6, utilizing the newly developed Parallel and Highly Innovative Simulation for INL Code System (PHISICS)/RELAP5-3D code suite. Finally, the lattice depletion models and results obtained for Phase III are compared in Section 7. The MHTGR-350 benchmark proved to be a challenging set of problems to model accurately, and even with the simplifications introduced in the benchmark specification this activity is an important step in the code-to-code verification of modern prismatic VHTR codes.
A final OECD/NEA comparison report will compare the Phase I and III results of all other international participants in 2014, while the remaining Phase II transient case results will be reported in 2015.
Free-form surface design method for a collimator TIR lens.
Tsai, Chung-Yu
2016-04-01
A free-form (FF) surface design method is proposed for a general axial-symmetrical collimator system consisting of a light source and a total internal reflection lens with two coupled FF boundary surfaces. The profiles of the boundary surfaces are designed using a FF surface construction method such that each incident ray is directed (refracted and reflected) in such a way as to form a specified image pattern on the target plane. The light ray paths within the system are analyzed using an exact analytical model and a skew-ray tracing approach. In addition, the validity of the proposed FF design method is demonstrated by means of ZEMAX simulations. It is shown that the illumination distribution formed on the target plane is in good agreement with that specified by the user. The proposed surface construction method is mathematically straightforward and easily implemented in computer code. As such, it provides a useful tool for the design and analysis of general axial-symmetrical optical systems.
Distributed polar-coded OFDM based on Plotkin's construction for half duplex wireless communication
NASA Astrophysics Data System (ADS)
Umar, Rahim; Yang, Fengfan; Mughal, Shoaib; Xu, HongJun
2018-07-01
A Plotkin-based polar-coded orthogonal frequency division multiplexing (P-PC-OFDM) scheme is proposed, and its bit error rate (BER) performance over additive white Gaussian noise (AWGN), frequency-selective Rayleigh, Rician and Nakagami-m fading channels has been evaluated. The considered Plotkin construction possesses a parallel split in its structure, which motivated us to extend the proposed P-PC-OFDM scheme to a coded cooperative scenario. As the relay's effective collaboration has always been pivotal in the design of cooperative communication, an efficient selection criterion for choosing the information bits has been incorporated at the relay node. To assess the BER performance of the proposed cooperative scheme, we have also upgraded the conventional polar-coded cooperative scheme in the context of OFDM as an appropriate benchmark. The Monte Carlo simulation results reveal that the proposed Plotkin-based polar-coded cooperative OFDM scheme convincingly outperforms the conventional polar-coded cooperative OFDM scheme by 0.5 to 0.6 dB over the AWGN channel. This prominent gain in BER performance is made possible by the bit-selection criterion and the joint successive cancellation decoding adopted at the relay and the destination nodes, respectively. Furthermore, the proposed coded cooperative schemes outperform their corresponding non-cooperative schemes by a gain of 1 dB under identical conditions.
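Plotkin's |u|u+v| construction combines two length-n codes into one length-2n code whose minimum distance is min(2·d1, d2); the parallel split between the u and v branches is the structural feature the cooperative scheme exploits. A sketch with toy component codes (chosen for illustration, not the paper's polar components):

```python
# Plotkin |u|u+v| construction over GF(2).
from itertools import product

def xor(u, v):
    return [a ^ b for a, b in zip(u, v)]

def plotkin(code1, code2):
    """All words u | (u + v) for u in code1, v in code2."""
    return [u + xor(u, v) for u in code1 for v in code2]

# Toy length-4 components: the even-weight code (d1 = 2) and the
# repetition code (d2 = 4).
code1 = [list(w) for w in product([0, 1], repeat=4) if sum(w) % 2 == 0]
code2 = [[0, 0, 0, 0], [1, 1, 1, 1]]

combined = plotkin(code1, code2)

def min_distance(code):
    return min(sum(a != b for a, b in zip(x, y))
               for i, x in enumerate(code) for y in code[i + 1:])

# 8 * 2 = 16 words of length 8; d = min(2*2, 4) = 4.
print(len(combined), min_distance(combined))  # -> 16 4
```

Polar codes arise from recursively applying exactly this combining step, which is why the construction splits so naturally across a source node and a relay.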
Construction of type-II QC-LDPC codes with fast encoding based on perfect cyclic difference sets
NASA Astrophysics Data System (ADS)
Li, Ling-xiang; Li, Hai-bing; Li, Ji-bi; Jiang, Hua
2017-09-01
In view of the problems that the encoding complexity of quasi-cyclic low-density parity-check (QC-LDPC) codes is high and the minimum distance is not large enough, which leads to the degradation of the error-correction performance, new irregular type-II QC-LDPC codes based on perfect cyclic difference sets (CDSs) are constructed. The parity check matrices of these type-II QC-LDPC codes consist of zero matrices with weight 0, circulant permutation matrices (CPMs) with weight 1 and circulant matrices with weight 2 (W2CMs). The introduction of W2CMs in the parity check matrices makes it possible to achieve a larger minimum distance, which can improve the error-correction performance of the codes. The Tanner graphs of these codes have no cycles of length 4, thus they have excellent decoding convergence characteristics. In addition, because the parity check matrices have a quasi-dual-diagonal structure, the fast encoding algorithm can reduce the encoding complexity effectively. Simulation results show that the new type-II QC-LDPC codes can achieve a more excellent error-correction performance and have no error floor phenomenon over the additive white Gaussian noise (AWGN) channel with sum-product algorithm (SPA) iterative decoding.
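The block structure described above can be sketched directly: a circulant is fixed by the 1-positions of its first row, so a CPM is a weight-1 circulant and a W2CM is a weight-2 circulant. The lift size and shift values below are illustrative, not the CDS-derived exponents of the paper.

```python
# Assembling a toy type-II QC-LDPC parity-check block row from
# circulant blocks: zero matrix (weight 0), CPM (weight 1), W2CM (weight 2).
def circulant(p, shifts):
    """p x p circulant whose first row has 1s at the given column shifts;
    each subsequent row is the previous row cyclically shifted right."""
    first = [1 if j in shifts else 0 for j in range(p)]
    return [[first[(j - i) % p] for j in range(p)] for i in range(p)]

p = 5                           # illustrative lift size
zero = [[0] * p for _ in range(p)]
cpm = circulant(p, {2})         # circulant permutation matrix, row weight 1
w2cm = circulant(p, {0, 3})     # weight-2 circulant (W2CM)

# One block row of a toy parity-check matrix H: [ CPM | W2CM | 0 ]
H_row = [cpm[i] + w2cm[i] + zero[i] for i in range(p)]
print([sum(r) for r in H_row])  # -> [3, 3, 3, 3, 3]
```

A full type-II matrix is a grid of such blocks; choosing the shifts from a perfect cyclic difference set is what guarantees the absence of length-4 cycles.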
The exploration of the exhibition informatization
NASA Astrophysics Data System (ADS)
Zhang, Jiankang
2017-06-01
The construction and management of exhibition informatization is the main task, and the chief bottleneck, in the transformation and upgrading of the Chinese exhibition industry. Three key points are expected to yield a breakthrough in the construction of Chinese exhibition informatization: adopting service outsourcing to construct and maintain the database, adopting advanced chest-card technology to collect various kinds of information, and applying statistical analysis to maintain good customer relations. The success of Chinese exhibition informatization mainly calls for mature suppliers who can provide construction and maintenance of the database, proven technology, data security, advanced chest-card technology, data mining and analysis capability, and the ability to improve exhibition services based on the commercial information obtained from data analysis. Several data security measures should be applied during system development, covering terminal data security, internet data security, media data security, storage data security and application data security. The informatization of this process is based on the chest-card design. At present there are several types of chest-card technology: bar-code chest cards, two-dimensional-code cards, magnetic-stripe chest cards and smart-chip chest cards. The information obtained from exhibition data will help organizers make relevant service strategies, quantify accumulated customer indexes, and improve customer satisfaction and loyalty; moreover, it can also support additional services such as commercial trips and VIP ceremonial reception.
Code of Federal Regulations, 2014 CFR
2014-01-01
... maximum extent feasible, comply with one of the nationally recognized model building codes and with other nationally-recognized codes in their construction or alteration of each building in accordance with 40 U.S.C. 3312; and (f) Use the applicable national codes and standards as a guide for their building operations...
Code of Federal Regulations, 2013 CFR
2013-07-01
... maximum extent feasible, comply with one of the nationally recognized model building codes and with other nationally-recognized codes in their construction or alteration of each building in accordance with 40 U.S.C. 3312; and (f) Use the applicable national codes and standards as a guide for their building operations...
Code of Federal Regulations, 2012 CFR
2012-01-01
... maximum extent feasible, comply with one of the nationally recognized model building codes and with other nationally-recognized codes in their construction or alteration of each building in accordance with 40 U.S.C. 3312; and (f) Use the applicable national codes and standards as a guide for their building operations...
Code of Federal Regulations, 2011 CFR
2011-01-01
... maximum extent feasible, comply with one of the nationally recognized model building codes and with other nationally-recognized codes in their construction or alteration of each building in accordance with 40 U.S.C. 3312; and (f) Use the applicable national codes and standards as a guide for their building operations...
Multiplexed Detection of Cytokines Based on Dual Bar-Code Strategy and Single-Molecule Counting.
Li, Wei; Jiang, Wei; Dai, Shuang; Wang, Lei
2016-02-02
Cytokines play important roles in the immune system and have been regarded as biomarkers. Because a single cytokine is not specific and accurate enough for strict diagnosis in practice, in this work we constructed a multiplexed detection method for cytokines based on a dual bar-code strategy and single-molecule counting. Taking interferon-γ (IFN-γ) and tumor necrosis factor-α (TNF-α) as model analytes, first, magnetic nanobeads were functionalized with the second antibody and primary bar-code strands, forming a magnetic nanoprobe. Then, through the specific reaction of the second antibody with the antigen captured by the primary antibody, a sandwich-type immunocomplex was formed on the substrate. Next, the primary bar-code strands, acting as amplification units, triggered a multibranched hybridization chain reaction (mHCR), producing nicked double-stranded polymers with multiple branched arms, which served as secondary bar-code strands. Finally, the secondary bar-code strands hybridized with multimolecule-labeled fluorescence probes, generating enhanced fluorescence signals. The fluorescence dots were counted one by one for quantification with an epifluorescence microscope. By integrating the primary and secondary bar-code-based amplification strategy with the multimolecule-labeled fluorescence probes, this method displayed excellent sensitivity, with a detection limit of 5 fM for both targets. Unlike the typical bar-code assay, in which the bar-code strands must be released and identified on a microarray, this method is more direct. Moreover, because of the selective immune reaction and the dual bar-code mechanism, the resulting method could detect the two targets simultaneously. Analysis in human serum was also performed, suggesting that our strategy is reliable and has great potential for application in early clinical diagnosis.
NASA Astrophysics Data System (ADS)
Moussa, Jonathan; Ryan-Anderson, Ciaran
The canonical modern plan for universal quantum computation is a Clifford+T gate set implemented in a topological error-correcting code. This plan has the basic disparity that logical Clifford gates are natural for codes in two spatial dimensions while logical T gates are natural in three. Recent progress has reduced this disparity by proposing logical T gates in two dimensions with doubled, stacked, or gauge color codes, but these proposals lack an error threshold. An alternative universal gate set is Clifford+F, where a fusion (F) gate converts two logical qubits into a logical qudit. We show that logical F gates can be constructed by identifying compatible pairs of qubit and qudit codes that stabilize the same logical subspace, much like the original Bravyi-Kitaev construction of magic state distillation. The simplest example of high-distance compatible codes results in a proposal that is very similar to the stacked color code with the key improvement of retaining an error threshold. Sandia National Labs is a multi-program laboratory managed and operated by Sandia Corp, a wholly owned subsidiary of Lockheed Martin Corp, for the U.S. Department of Energy's National Nuclear Security Administration under contract DE-AC04-94AL85000.
Rahpeyma, Mehdi; Fotouhi, Fatemeh; Makvandi, Manouchehr; Ghadiri, Ata; Samarbaf-Zadeh, Alireza
2015-11-01
Crimean-Congo hemorrhagic fever virus (CCHFV) is a member of the genus Nairovirus in the family Bunyaviridae and causes a life-threatening disease in humans. Currently there is no vaccine against CCHFV, and a detailed structural analysis of CCHFV proteins remains unavailable. The CCHFV M RNA segment encodes two viral surface glycoproteins known as Gn and Gc. Viral glycoproteins can be considered key targets for vaccine development. The current study aimed to investigate the structural bioinformatics of the CCHFV Gn protein and to design a construct for a recombinant bacmid to be expressed by the baculovirus system, so that the Gn protein expressed in insect cells can be used as an antigen in animal-model vaccine studies. Bioinformatic analysis of the CCHFV Gn protein was performed, a construct was designed and cloned into the pFastBacHTb vector, and a recombinant Gn-bacmid was generated by the Bac-to-Bac system. The primary, secondary and 3D structures of CCHFV Gn were obtained, and PCR with M13 forward and reverse primers confirmed the generation of recombinant bacmid DNA harboring the Gn coding region under the polyhedrin promoter. Characterization of the detailed structure of CCHFV Gn by bioinformatics software provides the basis for new experiments, and the construction of a recombinant bacmid harboring CCHFV Gn is valuable for designing a recombinant vaccine against deadly pathogens like CCHFV.
The Impact of Political Violence on Marketing Development in South Vietnam; 1955 through 1972
1976-06-01
…this series, which was designed to measure the impact of political violence on employment indicators of marketing development, was constructed… [garbled regression-table residue: coded event variables with coefficients and T-ratios, including Demonstrations, Sanctions and Violence (Codes 501, 502, 504, 506, 519)] …National Defense University, Industrial College of the Armed Forces, Washington, D.C. 20319. The Impact of Political Violence on Marketing…
Etomica: an object-oriented framework for molecular simulation.
Schultz, Andrew J; Kofke, David A
2015-03-30
We describe the design of an object-oriented library of software components that are suitable for constructing simulations of systems of interacting particles. The emphasis of the discussion is on the general design of the components and how they interact, and less on details of the programming interface or its implementation. Example code is provided as an aid to understanding object-oriented programming structures and to demonstrate how the framework is applied. © 2015 Wiley Periodicals, Inc.
1985-11-01
…User Interface that consists of a set of callable execution-time routines available to an application program for form processing. IISS Function Screen… provisions for test consist of the normal testing techniques that are accomplished during the construction process. They consist of design and code… the application presents a form to the user which must be filled in with information for processing by that application. The application then…
The Berkeley UNIX Consultant Project
1987-08-01
…of the National Conference on Artificial Intelligence. Pittsburgh, PA. (2) Chin, D. N. 1986. User modeling in UC, the UNIX consultant. In Proceedings of… 1. Introduction. Several years ago, we began a project called UC (UNIX Consultant). UC was to function as an intelligent… English. We sometimes refer to UC as "an intelligent 'help' facility" to emphasize our intention to construct a consultation system, rather than a…
Approximate equiangular tight frames for compressed sensing and CDMA applications
NASA Astrophysics Data System (ADS)
Tsiligianni, Evaggelia; Kondi, Lisimachos P.; Katsaggelos, Aggelos K.
2017-12-01
Performance guarantees for recovery algorithms employed in sparse representations and compressed sensing highlight the importance of incoherence. Optimal incoherence bounds are attained by equiangular unit-norm tight frames (ETFs). Although ETFs are important in many applications, they do not exist for all dimensions, and their construction has proven extremely difficult. In this paper, we construct frames that are close to ETFs. According to results from frame and graph theory, the existence of an ETF depends on the existence of its signature matrix, that is, a symmetric matrix with a certain structure and a spectrum consisting of two distinct eigenvalues. We view the construction of a signature matrix as an inverse eigenvalue problem and propose a method that produces frames of arbitrary dimensions that are close to ETFs. Owing to the achieved equiangularity, the obtained frames can be employed as spreading sequences in synchronous code-division multiple access (s-CDMA) systems, besides compressed sensing.
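The incoherence property that ETFs optimize can be made concrete with a small check of mutual coherence against the Welch bound, which an ETF attains with equality. This is a generic illustration, not the paper's construction method; the three-vector example is the standard simplex ETF in the plane.

```python
# Sketch: mutual coherence of a unit-norm frame vs. the Welch lower
# bound sqrt((N - m) / (m * (N - 1))) for N vectors in dimension m.
import numpy as np

def coherence(F: np.ndarray) -> float:
    """Largest |inner product| between distinct unit-norm columns of F."""
    F = F / np.linalg.norm(F, axis=0)   # normalize columns
    G = np.abs(F.T @ F)                 # Gram matrix magnitudes
    np.fill_diagonal(G, 0.0)
    return float(G.max())

def welch_bound(m: int, N: int) -> float:
    return float(np.sqrt((N - m) / (m * (N - 1))))

# Three unit vectors at 120 degrees in R^2: a simplex ETF that meets
# the Welch bound exactly (both equal 1/2).
angles = 2 * np.pi * np.arange(3) / 3
F = np.vstack([np.cos(angles), np.sin(angles)])   # 2 x 3 frame matrix
```

A frame whose coherence sits close to `welch_bound(m, N)` is "close to an ETF" in the sense relevant for s-CDMA spreading sequences.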
NASA Astrophysics Data System (ADS)
Hyater-Adams, Simone; Fracchiolla, Claudia; Finkelstein, Noah; Hinko, Kathleen
2018-06-01
Studies on physics identity are appearing more frequently, often in response to increased awareness of the underrepresentation of students of color in physics. In our broader research, we focus our efforts on understanding how racial identity and physics identity are negotiated throughout the experiences of Black physicists. In this paper, we present a Critical Physics Identity framework that can be used to examine racialized physics identity and demonstrate its utility by analyzing interviews with four physicists. Our framework draws on prior constructs of physics identity and racialized identity and provides operational definitions of six interacting dimensions. We present the operationalized constructs, demonstrate how we use them to code narrative data, and outline three methods of analysis that may be applied to study systems and structures and their influences on the experiences of Black students.
Initial and Long-Term Movement of Cladding Installed Over Exterior Rigid Insulation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Baker, P.
Changes in the International Energy Conservation Code (IECC) from 2009 to 2012 have made the use of exterior rigid insulation part of the prescriptive code requirements. With more jurisdictions adopting the 2012 IECC, builders are going to find themselves required to incorporate exterior insulation in the construction of their exterior wall assemblies. For thick layers of exterior insulation (greater than 1.5 inches), wood furring strips attached through the insulation back to the structure have been used by many contractors and designers as a convenient cladding attachment location. However, there has been significant resistance to widespread implementation due to a lack of research on, and understanding of, the mechanisms involved and the potential creep effects of the assembly under the sustained dead load of a cladding. This research extends previous research conducted by BSC in 2011 and 2012; each year the understanding of the discrete load-component interactions of the system, as well as of the impacts of environmental loading, has increased. The focus of the research was to examine more closely the impact of screw-fastener bending on total system capacity and the effects of thermal expansion and contraction of materials on the compressive forces in the assembly, and to analyze a full year's worth of cladding movement data from assemblies constructed in an exposed outdoor environment.
A Hyperbolic Solver for Black Hole Initial Data in Numerical Relativity
NASA Astrophysics Data System (ADS)
Babiuc, Maria
2016-03-01
Numerical relativity is essential to the efforts to detect gravitational waves emitted at the inspiral and merger of binary black holes. The first requirement for the generation of reliable gravitational wave templates is an accurate method of constructing initial data (ID). The standard approach is to solve the constraint equations of general relativity by formulating them as an elliptic system. A shortcoming of ID constructed this way is an initial burst of spurious unphysical radiation (junk radiation). Recently, Racz and Winicour formulated the constraints as a hyperbolic problem, requiring boundary conditions only on a large sphere surrounding the system, where the physical behavior of the gravitational field is well understood. We investigate the applicability of this new approach by developing a new 4th-order numerical code that implements the fully nonlinear constraint equations on a two-dimensional stereographic foliation and evolves them radially inward using a Runge-Kutta integrator. The tensorial quantities are written as spin-weighted fields, and the angular derivatives are replaced with "eth" operators. We present here results for the simulation of nonlinear perturbations of Schwarzschild ID in Kerr-Schild coordinates. The code shows stability and convergence at both large and small radii. Our long-term goal is to develop this new approach into a numerical scheme for generating ID for binary black holes and to analyze its performance in eliminating the junk radiation.
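The radial marching step mentioned above rests on a standard Runge-Kutta integrator. Here is a generic classical 4th-order Runge-Kutta step for reference; the actual evolved quantities in the code are spin-weighted tensorial fields, far more involved than this scalar sketch.

```python
# Generic classical RK4 step for dy/dr = f(r, y): the style of
# integrator used to march a system in the radial coordinate.
# Illustrative only; not the paper's constraint equations.
def rk4_step(f, r, y, h):
    """Advance y(r) to y(r + h); h may be negative for inward marching."""
    k1 = f(r, y)
    k2 = f(r + h / 2, y + h / 2 * k1)
    k3 = f(r + h / 2, y + h / 2 * k2)
    k4 = f(r + h, y + h * k3)
    return y + h / 6 * (k1 + 2 * k2 + 2 * k3 + k4)
```

Because the step accepts a negative h, the same routine marches either outward or radially inward from the outer boundary sphere.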
A visual programming environment for the Navier-Stokes computer
NASA Technical Reports Server (NTRS)
Tomboulian, Sherryl; Crockett, Thomas W.; Middleton, David
1988-01-01
The Navier-Stokes computer is a high-performance, reconfigurable, pipelined machine designed to solve large computational fluid dynamics problems. Due to the complexity of the architecture, development of effective, high-level language compilers for the system appears to be a very difficult task. Consequently, a visual programming methodology has been developed which allows users to program the system at an architectural level by constructing diagrams of the pipeline configuration. These schematic program representations can then be checked for validity and automatically translated into machine code. The visual environment is illustrated by using a prototype graphical editor to program an example problem.
NASA Astrophysics Data System (ADS)
Kosmidis, Kosmas; Kalampokis, Alkiviadis; Argyrakis, Panos
2006-10-01
We use the detrended fluctuation analysis (DFA) and Grassberger-Procaccia (GP) methods to study language characteristics. Although we construct our signals using only word lengths or word frequencies, thereby excluding a huge amount of linguistic information, the GP analysis indicates that linguistic signals may be considered the manifestation of a complex system of high dimensionality, different from random signals or from low-dimensional systems such as the Earth's climate. The DFA method is additionally able to distinguish a natural-language signal from a computer-code signal. This last result may be useful in the field of cryptography.
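A minimal DFA sketch shows how a scaling exponent could be estimated from a word-length signal. This assumes first-order (linear) detrending and an invented choice of window scales; it is not the authors' exact procedure.

```python
# Minimal first-order DFA sketch: integrate the signal, detrend it
# linearly in windows of increasing size s, and fit the slope of
# log F(s) vs log s. White noise gives an exponent near 0.5.
import numpy as np

def dfa_exponent(x, scales=(4, 8, 16, 32, 64)):
    y = np.cumsum(x - np.mean(x))           # integrated profile
    flucts = []
    for s in scales:
        n_win = len(y) // s
        segs = y[:n_win * s].reshape(n_win, s)
        t = np.arange(s)
        sq = []
        for seg in segs:
            coef = np.polyfit(t, seg, 1)    # local linear trend
            sq.append(np.mean((seg - np.polyval(coef, t)) ** 2))
        flucts.append(np.sqrt(np.mean(sq))) # RMS fluctuation F(s)
    slope, _ = np.polyfit(np.log(scales), np.log(flucts), 1)
    return float(slope)

rng = np.random.default_rng(0)
alpha_white = dfa_exponent(rng.standard_normal(4096))  # near 0.5
```

Feeding in a sequence of word lengths instead of the synthetic noise gives the language exponent the abstract refers to; deviations from 0.5 indicate long-range correlations.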
Composing Data Parallel Code for a SPARQL Graph Engine
DOE Office of Scientific and Technical Information (OSTI.GOV)
Castellana, Vito G.; Tumeo, Antonino; Villa, Oreste
Big data analytics processes large amounts of data to extract knowledge from them. Semantic databases are big data applications that adopt the Resource Description Framework (RDF) to structure metadata through a graph-based representation. The graph-based representation provides several benefits, such as the possibility of performing in-memory processing with large amounts of parallelism. SPARQL is a language used to perform queries on RDF-structured data through graph matching. In this paper we present a tool that automatically translates SPARQL queries into parallel graph-crawling and graph-matching operations. The tool also supports complex SPARQL constructs, which require more than basic graph matching for their implementation. The tool generates parallel code annotated with OpenMP pragmas for x86 shared-memory multiprocessors (SMPs). With respect to commercial database systems such as Virtuoso, our approach reduces the memory occupation due to join operations and provides higher performance. We show the scaling of the automatically generated graph-matching code on a 48-core SMP.
The MCUCN simulation code for ultracold neutron physics
NASA Astrophysics Data System (ADS)
Zsigmond, G.
2018-02-01
Ultracold neutrons (UCN) have very low kinetic energies (0-300 neV) and can therefore be stored in material or magnetic confinements for many hundreds of seconds. This makes them a very useful tool for probing fundamental symmetries of nature (for instance, charge-parity violation, by neutron electric dipole moment experiments) and for contributing important parameters to Big Bang nucleosynthesis (neutron lifetime measurements). Improved precision experiments are under construction at new and planned UCN sources around the world. MC simulations play an important role in the optimization of such systems with a large number of parameters, but also in the estimation of systematic effects, in the benchmarking of analysis codes, and as part of the analysis. The MCUCN code written at PSI has been used extensively for the optimization of the UCN source optics and in the optimization and analysis of (test) experiments within the nEDM project based at PSI. In this paper we present the main features of MCUCN and interesting benchmark and application examples.
NASA Technical Reports Server (NTRS)
Myhill, Elizabeth A.; Boss, Alan P.
1993-01-01
In Boss & Myhill (1992) we described the derivation and testing of a spherical coordinate-based scheme for solving the hydrodynamic equations governing the gravitational collapse of nonisothermal, nonmagnetic, inviscid, radiative, three-dimensional protostellar clouds. Here we discuss a Cartesian coordinate-based scheme based on the same set of hydrodynamic equations. As with the spherical coordinate-based code, the Cartesian coordinate-based scheme employs explicit Eulerian methods which are both spatially and temporally second-order accurate. We begin by describing the hydrodynamic equations in Cartesian coordinates and the numerical methods used in this particular code. Following Finn & Hawley (1989), we pay special attention to the proper implementation of high-order-accuracy, finite difference methods. We evaluate the ability of the Cartesian scheme to handle shock propagation problems, and through convergence testing we show that the code is indeed second-order accurate. To compare the Cartesian scheme discussed here with the spherical coordinate-based scheme discussed in Boss & Myhill (1992), the two codes are used to calculate the standard isothermal collapse test case described by Bodenheimer & Boss (1981). We find that with the improved codes, the intermediate bar configuration found previously disappears, and the cloud fragments directly into a binary protostellar system. Finally, we present the results from both codes of a new test for nonisothermal protostellar collapse.
Liu, Zhongyang; Guo, Feifei; Gu, Jiangyong; Wang, Yong; Li, Yang; Wang, Dan; Lu, Liang; Li, Dong; He, Fuchu
2015-06-01
The Anatomical Therapeutic Chemical (ATC) classification system, applied in almost all drug utilization studies, is currently the most widely recognized classification system for drugs. At present, new drug entries are added to the system only on users' requests, which leads to seriously incomplete drug coverage, and bioinformatics prediction is helpful in this process. Here we propose a novel prediction model of drug-ATC code associations, using logistic regression to integrate multiple heterogeneous data sources, including chemical structures, target proteins, gene expression, side effects and chemical-chemical associations. The model performs well not only on ATC codes of unclassified drugs but also on new ATC codes of classified drugs, as assessed by cross-validation and independent test sets, and its efficacy exceeds that of previous methods. To facilitate use, the model has been developed into a user-friendly web service, SPACE (Similarity-based Predictor of ATC CodE), which, for each submitted compound, gives candidate ATC codes (ranked by the decreasing probability score predicted by the model) together with the corresponding supporting evidence. This work not only contributes to knowledge of drugs' therapeutic, pharmacological and chemical properties but also provides clues for drug repositioning and side-effect discovery. In addition, the construction of the prediction model provides a general framework for similarity-based data integration that is suitable for other drug-related studies such as target and side-effect prediction. The web service SPACE is available at http://www.bprc.ac.cn/space. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
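The general idea of combining heterogeneous similarity scores through logistic regression can be sketched with a toy model. This is not the authors' trained predictor: the two features, the toy data and the plain gradient-descent fit are all invented for illustration.

```python
# Hedged sketch: logistic regression fusing several similarity scores
# (e.g. chemical-structure similarity, target-protein similarity)
# into one association probability. Toy data only.
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def fit_logistic(X, y, lr=0.5, steps=2000):
    """Plain batch gradient descent; returns (weights, bias)."""
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(steps):
        p = sigmoid(X @ w + b)
        w -= lr * X.T @ (p - y) / len(y)
        b -= lr * np.mean(p - y)
    return w, b

# Invented training set: positive drug-ATC pairs score high on both
# similarity features, negatives score low.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0.8, 0.1, size=(50, 2)),
               rng.normal(0.2, 0.1, size=(50, 2))])
y = np.concatenate([np.ones(50), np.zeros(50)])
w, b = fit_logistic(X, y)
preds = sigmoid(X @ w + b) > 0.5
```

Ranking candidate ATC codes by the fitted `sigmoid(X @ w + b)` score mirrors the probability-ranked output the web service returns.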
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gonzalez Gonzalez, R.; Petruzzi, A.; D'Auria, F.
2012-07-01
Atucha-2 is a Siemens-designed PHWR reactor under construction in the Republic of Argentina. Its geometrical complexity and its features (e.g., oblique control rods, a positive void coefficient) required a developed and validated complex three-dimensional (3D) neutron kinetics (NK) coupled thermal-hydraulic (TH) model. Reactor shutdown is obtained by the oblique CRs and, during accidental conditions, by an emergency shutdown system (JDJ) injecting a highly concentrated boron solution (boron clouds) into the moderator tank; the boron cloud reconstruction is obtained using a CFD (CFX) code calculation. A complete LBLOCA calculation implies the application of the RELAP5-3D© system code. Within the framework of the third Agreement 'NA-SA - Univ. of Pisa', a new RELAP5-3D control system for the boron injection system was developed and implemented in the validated coupled RELAP5-3D/NESTLE model of the Atucha-2 NPP. The aim of this activity is to find the limiting case (maximum break area size) for the peak cladding temperature for LOCAs under fixed boundary conditions. (authors)
A microcontroller-based telemetry system for sympathetic nerve activity and ECG measurement.
Harada, E; Yonezawa, Y; Caldwell, W M; Hahn, A W
1999-01-01
A telemetry system employing a low-power 8-bit microcontroller has been developed for chronic unanesthetized small-animal studies. The two-channel system is designed for use with animals in shielded cages. Analog signals from implantable ECG and nerve electrodes are converted to an 8-bit serial digital format. This is accomplished by individual 8-bit A/D converters included in the microcontroller, which also has a serial I/O port. The converted serial binary code is applied directly to an antenna wire; therefore, the system does not need a separate transmitter, such as an FM or infrared optical telemeter. The system is used in a shielded animal cage to reduce interference from external radio signals and 60 Hz power-line fields. The code is received by a high-input-impedance amplifier in the cage and is then demodulated. The telemeter is powered by a small 3 V lithium battery, which provides 100 hours of continuous operation. The circuit is constructed on two 25 x 25 mm printed circuit boards and encapsulated in epoxy, yielding a total volume of 6.25 cc. The weight is 15 g.
Study of Globus-M Tokamak Poloidal System and Plasma Position Control
NASA Astrophysics Data System (ADS)
Dokuka, V. N.; Korenev, P. S.; Mitrishkin, Yu. V.; Pavlova, E. A.; Patrov, M. I.; Khayrutdinov, R. R.
2017-12-01
In order to provide efficient performance of tokamaks with a vertically elongated plasma, control systems for limited and diverted plasma configurations are required. The accuracy, stability, speed of response and reliability of plasma position control, as well as of plasma shape and current control, depend on the performance of the control system; the development of such systems is therefore an important task for modern tokamaks. In this study, the measured signals from the magnetic loops and Rogowski coils are used to reconstruct the plasma equilibrium, for which linear models in small deviations are constructed. We apply methods of H∞-optimization theory to synthesize control systems for the vertical and horizontal plasma position that are capable of working with structural uncertainty in the models of the plant. These systems are applied to the plasma-physics DINA code, configured for the Globus-M tokamak plasma. Testing of the developed systems applied to the DINA code with Heaviside step inputs revealed the complex dynamics of plasma magnetic configurations: being close to a bifurcation point in the parameter space of the unstable plasma made it possible to detect an abrupt change of the X-point position from top to bottom and vice versa. The development of methods for reconstructing plasma magnetic configurations, together with experience in designing feedback plasma control systems for tokamaks, provided an opportunity to synthesize new digital controllers for plasma vertical and horizontal position stabilization and to test them in the closed control loop with the DINA code as a nonlinear model of the plasma.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kunsman, David Marvin; Aldemir, Tunc; Rutt, Benjamin
2008-05-01
This LDRD project has produced a tool that makes probabilistic risk assessments (PRAs) of nuclear reactors, analyses which are very resource-intensive, more efficient. PRAs of nuclear reactors are increasingly relied on by the United States Nuclear Regulatory Commission (U.S. NRC) for licensing decisions for current and advanced reactors, yet PRAs are produced much as they were 20 years ago. The work here applied a modern systems-analysis technique, a system-independent multi-task computer driver routine, to the accident progression analysis portion of the PRA. Initially, the objective of the work was to fuse the accident progression event tree (APET) portion of a PRA to the dynamic system doctor (DSD) created by Ohio State University. Instead, during the initial efforts, it was found that the DSD could be linked directly to a detailed accident-progression phenomenological simulation code (the type on which APET construction and analysis relies, albeit indirectly) and thereby directly create and analyze the APET. The expanded DSD computational architecture and infrastructure created during this effort is called ADAPT (Analysis of Dynamic Accident Progression Trees). ADAPT is a software infrastructure that supports the execution and analysis of multiple dynamic event-tree simulations on distributed environments. A simulator abstraction layer was developed, and a generic driver was implemented for executing simulators on a distributed environment. As a demonstration of the use of this methodological tool, ADAPT was applied to quantify the likelihood of competing accident-progression pathways for a particular accident scenario in a particular reactor type using MELCOR, an integrated severe-accident analysis code developed at Sandia. (ADAPT was intentionally created with flexibility, however, and is not limited to interacting with only one code; with minor coding changes to input files, ADAPT can be linked to other such codes.) The results of this demonstration indicate that the approach can significantly reduce the resources required for Level 2 PRAs. From the phenomenological viewpoint, ADAPT can also treat the associated epistemic and aleatory uncertainties. This methodology can also be used for analyses of other complex systems: any complex system can be analyzed using ADAPT if its workings can be displayed as an event tree, there is a computer code that simulates how those events could progress, and that simulator code has switches to turn system events, phenomena, etc. on and off. Applying ADAPT to particular problems is not human-independent: while the human resources for creating and analyzing the accident progression are significantly decreased, knowledgeable analysts are still necessary to apply ADAPT successfully. This research and development effort has met its original goals and then exceeded them.
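The dynamic event-tree expansion that ADAPT automates can be sketched in heavily simplified form. The branch points, outcome names and probabilities below are invented; a real driver would fork actual simulator runs at each branching condition rather than enumerate tuples.

```python
# Toy sketch of dynamic event-tree expansion: each ordered branching
# condition splits every current scenario into weighted children, and
# the leaf probabilities must sum to 1.
def expand_tree(branch_points):
    """branch_points: list of [(outcome_name, probability), ...] per
    branching condition. Returns (event_sequence, probability) leaves."""
    leaves = [([], 1.0)]
    for options in branch_points:
        leaves = [(events + [name], p * q)
                  for events, p in leaves
                  for name, q in options]
    return leaves

# Invented example: two branch points in an accident sequence.
branches = [
    [("valve_opens", 0.9), ("valve_stuck", 0.1)],
    [("pump_on", 0.7), ("pump_fail", 0.3)],
]
leaves = expand_tree(branches)
```

The "switches" mentioned in the abstract correspond to the simulator flags a driver would toggle to realize each branch outcome.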
Zhou, Hai; He, Ming; Li, Jing; Chen, Liang; Huang, Zhifeng; Zheng, Shaoyan; Zhu, Liya; Ni, Erdong; Jiang, Dagang; Zhao, Bingran; Zhuang, Chuxiong
2016-01-01
Hybrid rice breeding offers an important strategy to improve rice production, in which the cultivation of a male sterile line is the key to the success of cross-breeding. CRISPR/Cas9 systems have been widely used in target-site genome editing, whereas their application for crop genetic improvement has been rarely reported. Here, using the CRISPR/Cas9 system, we induced specific mutations in TMS5, which is the most widely applied thermo-sensitive genic male sterility (TGMS) gene in China, and developed new “transgene clean” TGMS lines. We designed 10 target sites in the coding region of TMS5 for targeted mutagenesis using the CRISPR/Cas9 system and assessed the potential rates of on- and off-target effects. Finally, we established the most efficient construct, the TMS5ab construct, for breeding potentially applicable “transgene clean” TGMS lines. We also discussed factors that affect the editing efficiency according to the characteristics of different target sequences. Notably, using the TMS5ab construct, we developed 11 new “transgene clean” TGMS lines with potential applications in hybrid breeding within only one year in both rice subspecies. The application of our system not only significantly accelerates the breeding of sterile lines but also facilitates the exploitation of heterosis. PMID:27874087
Solomon, Judith; Duschinsky, Robbie; Bakkum, Lianne; Schuengel, Carlo
2017-01-01
This article examines the construct of disorganized attachment originally proposed by Main and Solomon, developing some new conjectures based on inspiration from a largely unknown source: John Bowlby’s unpublished texts, housed at the Wellcome Trust Library Archive in London (with permission from the Bowlby family). We explore Bowlby’s discussions of disorganized attachment, which he understood from the perspective of ethological theories of conflict behavior. Bowlby’s reflections regarding differences among the behaviors used to code disorganized attachment will be used to explore distinctions that may underlie the structure of the current coding system. The article closes with an emphasis on the importance Bowlby placed on Popper’s distinction between the context of discovery and the context of justification in developmental science. PMID:28791871
A proposal for self-correcting stabilizer quantum memories in 3 dimensions (or slightly less)
NASA Astrophysics Data System (ADS)
Brell, Courtney G.
2016-01-01
We propose a family of local CSS stabilizer codes as possible candidates for self-correcting quantum memories in 3D. The construction is inspired by the classical Ising model on a Sierpinski carpet fractal, which acts as a classical self-correcting memory. Our models are naturally defined on fractal subsets of a 4D hypercubic lattice with Hausdorff dimension less than 3. Though this does not imply that these models can be realized with local interactions in R^3, we also discuss this possibility. The X and Z sectors of the code are dual to one another, and we show that there exists a finite-temperature phase transition associated with each of these sectors, providing evidence that the system may robustly store quantum information at finite temperature.
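The Sierpinski carpet that inspires the construction is easy to generate recursively: each iteration keeps 8 of 9 sub-blocks, giving Hausdorff dimension log 8 / log 3 ≈ 1.89. The sketch below only builds the occupancy grid of the fractal, not the stabilizer code defined on it.

```python
# Sketch: occupancy grid of the Sierpinski carpet at a given
# iteration depth. Level k gives a 3^k x 3^k grid with 8^k
# occupied sites (the classical-memory substrate of the proposal).
import numpy as np

def sierpinski_carpet(level: int) -> np.ndarray:
    """0/1 grid: 1 marks a site of the carpet, 0 a removed block."""
    grid = np.ones((1, 1), dtype=int)
    for _ in range(level):
        grid = np.block([
            [grid, grid, grid],
            [grid, 0 * grid, grid],   # center block removed each step
            [grid, grid, grid],
        ])
    return grid

carpet = sierpinski_carpet(3)   # 27 x 27 grid, 8**3 occupied sites
```

Counting occupied sites confirms the scaling 8^k that fixes the fractal (Hausdorff) dimension below 3.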