DOE Office of Scientific and Technical Information (OSTI.GOV)
Little, M.P.; Muirhead, C.R.; Goossens, L.H.J.
1997-12-01
The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequences of accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables. A formal expert judgment elicitation and evaluation process was identified as the best technology available for developing a library of uncertainty distributions for these consequence parameters. This report focuses on the results of the study to develop distributions for variables related to the MACCS and COSYMA late health effects models. This volume contains appendices that include (1) a summary of the MACCS and COSYMA consequence codes, (2) the elicitation questionnaires and case structures, (3) the rationales and results for the expert panel on late health effects, (4) short biographies of the experts, and (5) the aggregated results of their responses.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Goossens, L.H.J.; Kraan, B.C.P.; Cooke, R.M.
1998-04-01
The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequences of accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables. A formal expert judgment elicitation and evaluation process was identified as the best technology available for developing a library of uncertainty distributions for these consequence parameters. This report focuses on the results of the study to develop distributions for variables related to the MACCS and COSYMA internal dosimetry models. This volume contains appendices that include (1) a summary of the MACCS and COSYMA consequence codes, (2) the elicitation questionnaires and case structures, (3) the rationales and results for the panel on internal dosimetry, (4) short biographies of the experts, and (5) the aggregated results of their responses.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Goossens, L.H.J.; Kraan, B.C.P.; Cooke, R.M.
1997-12-01
The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequences of accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables. A formal expert judgment elicitation and evaluation process was identified as the best technology available for developing a library of uncertainty distributions for these consequence parameters. This report focuses on the results of the study to develop distributions for variables related to the MACCS and COSYMA deposited material and external dose models. This volume contains appendices that include (1) a summary of the MACCS and COSYMA consequence codes, (2) the elicitation questionnaires and case structures, (3) the rationales and results for the panel on deposited material and external doses, (4) short biographies of the experts, and (5) the aggregated results of their responses.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Harper, F.T.; Young, M.L.; Miller, L.A.
MACCS and COSYMA, two new probabilistic accident consequence codes completed in 1990, estimate the risks presented by nuclear installations based on postulated frequencies and magnitudes of potential accidents. In 1991, the US Nuclear Regulatory Commission (NRC) and the Commission of the European Communities (CEC) began a joint uncertainty analysis of the two codes. The objective was to develop credible and traceable uncertainty distributions for the input variables of the codes. Expert elicitation, developed independently, was identified as the best technology available for developing a library of uncertainty distributions for the selected consequence parameters. The study was formulated jointly and was limited to the current code models and to physical quantities that could be measured in experiments. To validate the distributions generated for the wet deposition input variables, samples were taken from these distributions and propagated through the wet deposition code model along with the Gaussian plume model (GPM) implemented in the MACCS and COSYMA codes. The resulting distributions closely replicated the aggregated elicited wet deposition distributions. Project teams from the NRC and CEC cooperated successfully to develop and implement a unified process for the elaboration of uncertainty distributions on consequence code input parameters. Formal expert judgment elicitation proved valuable for synthesizing the best available information. Distributions on measurable atmospheric dispersion and deposition parameters were successfully elicited from experts involved in the many phenomenological areas of consequence analysis. This volume is the second of a three-volume document describing the project and contains two appendices describing the rationales for the dispersion and deposition data along with short biographies of the 16 experts who participated in the project.
Analysis of radiation safety for Small Modular Reactor (SMR) on PWR-100 MWe type
NASA Astrophysics Data System (ADS)
Udiyani, P. M.; Husnayani, I.; Deswandri; Sunaryo, G. R.
2018-02-01
Indonesia, an archipelago of large, medium, and small islands, is well suited to the construction of small modular reactors (SMRs). A preliminary technology assessment of various SMRs has been started; by technology, SMRs are grouped into light water reactors, gas-cooled reactors, and solid-cooled reactors, and by siting into land-based and water-based reactors. The Fukushima accident raised public doubts about the safety of nuclear power plants (NPPs). This paper describes an assessment of the safety and on-site radiation consequences of normal operation and a postulated design basis accident for an SMR based on a PWR-100 MWe at Bangka Island. The radiological consequences of normal operation were simulated for three SMR units. The source term was generated from an inventory calculated with the ORIGEN-2 software; the consequences of routine releases were calculated with PC-CREAM and those of accidents with PC Cosyma. The methodology adopted was based on site-specific meteorological and spatial data. According to the PC-CREAM 08 calculation, the highest individual dose in the site area for adults is 5.34E-02 mSv/y in the ESE direction at a distance of 1 km from the stack; doses to the public during normal operation are thus below 1 mSv/y. According to the PC Cosyma calculation, the highest individual dose is 1.92E+00 mSv in the ESE direction at 1 km from the stack. The total collective dose (all pathways) is 3.39E-01 man·Sv, with the dominant contribution from the cloud pathway. The results show that no evacuation countermeasures would need to be taken under the emergency regulations.
Sohrabi, M; Ghasemi, M; Amrollahi, R; Khamooshi, C; Parsouzi, Z
2013-05-01
Unit-1 of the Bushehr nuclear power plant (BNPP-1) is a VVER-type reactor with 1,000-MWe power constructed near Bushehr city at the coast of the Persian Gulf, Iran. The reactor has been recently operational to near its full power. The radiological impact of nuclear power plant (NPP) accidents is of public concern, and the assessment of radiological consequences of any hypothetical nuclear accident on public exposure is vital. The hypothetical accident scenario considered in this paper is a design-basis accident, that is, a primary coolant leakage to the secondary circuit. This scenario was selected in order to compare and verify the results obtained in the present paper with those reported in the Final Safety Analysis Report (FSAR 2007) of the BNPP-1 and to develop a well-proven methodology that can be used to study other and more severe hypothetical accident scenarios for this reactor. In the present study, the version 2.01 of the PC COSYMA code was applied. In the early phase of the accidental releases, effective doses (from external and internal exposures) as well as individual and collective doses (due to the late phase of accidental releases) were evaluated. The surrounding area of the BNPP-1 within a radius of 80 km was subdivided into seven concentric rings and 16 sectors, and distribution of population and agricultural products was calculated for this grid. The results show that during the first year following the modeled hypothetical accident, the effective doses do not exceed the limit of 5 mSv, for the considered distances from the BNPP-1. The results obtained in this study are in good agreement with those in the FSAR-2007 report. 
This agreement was obtained despite the many inherent uncertainties and variables in the two modeling procedures and shows that the methodology applied here can also be used to model other severe hypothetical accident scenarios for the BNPP-1, such as small and large breaks in the reactor coolant system as well as beyond-design-basis accidents. Such scenarios are planned for study in the near future for this reactor.
A Systematic Method for Verification and Validation of Gyrokinetic Microstability Codes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bravenec, Ronald
My original proposal for the period Feb. 15, 2014 through Feb. 14, 2017 called for an integrated validation and verification effort carried out by myself with collaborators. The validation component would require experimental profile and power-balance analysis. In addition, it would require running the gyrokinetic codes varying the input profiles within experimental uncertainties to seek agreement with experiment before discounting a code as invalidated. Therefore, validation would require a major increase of effort over my previous grant periods, which covered only code verification (code benchmarking). Consequently, I had requested full-time funding. Instead, I am being funded at somewhat less than half time (5 calendar months per year). As a consequence, I decided to forego the validation component and to only continue the verification efforts.
The kinetics of aerosol particle formation and removal in NPP severe accidents
NASA Astrophysics Data System (ADS)
Zatevakhin, Mikhail A.; Arefiev, Valentin K.; Semashko, Sergey E.; Dolganov, Rostislav A.
2016-06-01
Severe Nuclear Power Plant (NPP) accidents are accompanied by release of a massive amount of energy, radioactive products and hydrogen into the atmosphere of the NPP containment. A valid estimation of consequences of such accidents can only be carried out through the use of the integrated codes comprising a description of the basic processes which determine the consequences. A brief description of a coupled aerosol and thermal-hydraulic code to be used for the calculation of the aerosol kinetics within the NPP containment in case of a severe accident is given. The code comprises a KIN aerosol unit integrated into the KUPOL-M thermal-hydraulic code. Some features of aerosol behavior in severe NPP accidents are briefly described.
The kinetics of aerosol particle formation and removal in NPP severe accidents
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zatevakhin, Mikhail A.; Arefiev, Valentin K.; Semashko, Sergey E.
2016-06-08
Severe Nuclear Power Plant (NPP) accidents are accompanied by release of a massive amount of energy, radioactive products and hydrogen into the atmosphere of the NPP containment. A valid estimation of consequences of such accidents can only be carried out through the use of the integrated codes comprising a description of the basic processes which determine the consequences. A brief description of a coupled aerosol and thermal-hydraulic code to be used for the calculation of the aerosol kinetics within the NPP containment in case of a severe accident is given. The code comprises a KIN aerosol unit integrated into the KUPOL-M thermal-hydraulic code. Some features of aerosol behavior in severe NPP accidents are briefly described.
Unfiltered Talk--A Challenge to Categories.
ERIC Educational Resources Information Center
McCormick, Kay
A study investigated how and why code switching and mixing occurs between English and Afrikaans in a region of South Africa. In District Six, non-standard Afrikaans seems to be a mixed code, and it is unclear whether non-standard English is a mixed code. Consequently, it is unclear when codes are being switched or mixed. The analysis looks at…
Methods for nuclear air-cleaning-system accident-consequence assessment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Andrae, R.W.; Bolstad, J.W.; Gregory, W.S.
1982-01-01
This paper describes a multilaboratory research program that is directed toward addressing many questions that analysts face when performing air cleaning accident consequence assessments. The program involves developing analytical tools and supportive experimental data that will be useful in making more realistic assessments of accident source terms within and up to the atmospheric boundaries of nuclear fuel cycle facilities. The types of accidents considered in this study include fires, explosions, spills, tornadoes, criticalities, and equipment failures. The main focus of the program is developing an accident analysis handbook (AAH). We describe the contents of the AAH, which include descriptions of selected nuclear fuel cycle facilities, process unit operations, source-term development, and accident consequence analyses. Three computer codes designed to predict gas and material propagation through facility air cleaning systems are described. These computer codes address accidents involving fires (FIRAC), explosions (EXPAC), and tornadoes (TORAC). The handbook relies on many illustrative examples to show the analyst how to approach accident consequence assessments. We use the FIRAC code and a hypothetical fire scenario to illustrate the accident analysis capability.
34 CFR 600.10 - Date, extent, duration, and consequence of eligibility.
Code of Federal Regulations, 2011 CFR
2011-07-01
... Classification of Instructional Programs (CIP) code under the taxonomy of instructional program classifications... same CIP code as another program offered by the institution but leads to a different degree or...
34 CFR 600.10 - Date, extent, duration, and consequence of eligibility.
Code of Federal Regulations, 2014 CFR
2014-07-01
... Classification of Instructional Programs (CIP) code under the taxonomy of instructional program classifications... same CIP code as another program offered by the institution but leads to a different degree or...
34 CFR 600.10 - Date, extent, duration, and consequence of eligibility.
Code of Federal Regulations, 2013 CFR
2013-07-01
... Classification of Instructional Programs (CIP) code under the taxonomy of instructional program classifications... same CIP code as another program offered by the institution but leads to a different degree or...
34 CFR 600.10 - Date, extent, duration, and consequence of eligibility.
Code of Federal Regulations, 2012 CFR
2012-07-01
... Classification of Instructional Programs (CIP) code under the taxonomy of instructional program classifications... same CIP code as another program offered by the institution but leads to a different degree or...
ERIC Educational Resources Information Center
Emmorey, Karen; Petrich, Jennifer A. F.; Gollan, Tamar H.
2012-01-01
Bilinguals who are fluent in American Sign Language (ASL) and English often produce "code-blends"--simultaneously articulating a sign and a word while conversing with other ASL-English bilinguals. To investigate the cognitive mechanisms underlying code-blend processing, we compared picture-naming times (Experiment 1) and semantic categorization…
Non-coding variants contribute to the clinical heterogeneity of TTR amyloidosis.
Iorio, Andrea; De Lillo, Antonella; De Angelis, Flavio; Di Girolamo, Marco; Luigetti, Marco; Sabatelli, Mario; Pradotto, Luca; Mauro, Alessandro; Mazzeo, Anna; Stancanelli, Claudia; Perfetto, Federico; Frusconi, Sabrina; My, Filomena; Manfellotto, Dario; Fuciarelli, Maria; Polimanti, Renato
2017-09-01
Coding mutations in TTR gene cause a rare hereditary form of systemic amyloidosis, which has a complex genotype-phenotype correlation. We investigated the role of non-coding variants in regulating TTR gene expression and consequently amyloidosis symptoms. We evaluated the genotype-phenotype correlation considering the clinical information of 129 Italian patients with TTR amyloidosis. Then, we conducted a re-sequencing of TTR gene to investigate how non-coding variants affect TTR expression and, consequently, phenotypic presentation in carriers of amyloidogenic mutations. Polygenic scores for genetically determined TTR expression were constructed using data from our re-sequencing analysis and the GTEx (Genotype-Tissue Expression) project. We confirmed a strong phenotypic heterogeneity across coding mutations causing TTR amyloidosis. Considering the effects of non-coding variants on TTR expression, we identified three patient clusters with specific expression patterns associated with certain phenotypic presentations, including late onset, autonomic neurological involvement, and gastrointestinal symptoms. This study provides novel data regarding the role of non-coding variation and the gene expression profiles in patients affected by TTR amyloidosis, also putting forth an approach that could be used to investigate the mechanisms at the basis of the genotype-phenotype correlation of the disease.
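Polygenic scores of the kind used in the study above are, in their standard formulation, weighted sums of allele dosages. A minimal sketch of that form follows; the variant weights and genotype dosages are hypothetical illustrations, not values from the study:

```python
def polygenic_score(dosages, weights):
    """Standard polygenic score: sum of allele dosages (0, 1, or 2)
    weighted by per-variant effect sizes (e.g. on gene expression)."""
    if len(dosages) != len(weights):
        raise ValueError("one dosage per variant weight required")
    return sum(d * w for d, w in zip(dosages, weights))

# Hypothetical example: three regulatory variants with illustrative
# effect sizes on expression, for one carrier's genotype.
weights = [0.42, -0.13, 0.08]
carrier = [2, 0, 1]          # allele counts for one individual
score = polygenic_score(carrier, weights)
```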
Coding Manual for Continuous Observation of Interactions by Single Subjects in an Academic Setting.
ERIC Educational Resources Information Center
Cobb, Joseph A.; Hops, Hyman
The manual, designed particularly for work with acting-out or behavior problem students, describes coding procedures used in the observation of continuous classroom interactions between the student and his peers and teacher. Peer and/or teacher behaviors antecedent and consequent to the subject's behavior are identified in the coding process,…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sprung, J.L.; Jow, H-N; Rollstin, J.A.
1990-12-01
Estimation of offsite accident consequences is the customary final step in a probabilistic assessment of the risks of severe nuclear reactor accidents. Recently, the Nuclear Regulatory Commission reassessed the risks of severe accidents at five US power reactors (NUREG-1150). Offsite accident consequences for NUREG-1150 source terms were estimated using the MELCOR Accident Consequence Code System (MACCS). Before these calculations were performed, most MACCS input parameters were reviewed, and for each parameter reviewed, a best-estimate value was recommended. This report presents the results of these reviews. Specifically, recommended values and the basis for their selection are presented for MACCS atmospheric and biospheric transport, emergency response, food pathway, and economic input parameters. Dose conversion factors and health effect parameters are not reviewed in this report. 134 refs., 15 figs., 110 tabs.
Finding Resolution for the Responsible Transparency of Economic Models in Health and Medicine.
Padula, William V; McQueen, Robert Brett; Pronovost, Peter J
2017-11-01
The Second Panel on Cost-Effectiveness in Health and Medicine's recommendations for the conduct, methodological practices, and reporting of cost-effectiveness analyses leave a number of questions unanswered with respect to the implementation of a transparent, open-source-code interface for economic models. Making economic model source code openly available could be positive and progressive for the field; however, several unintended consequences of this system should first be considered before its complete implementation. First, there is the concern regarding the intellectual property rights that modelers have to their analyses. Second, open source code could make analyses more accessible to inexperienced modelers, leading to inaccurate or misinterpreted results. We propose several resolutions to these concerns. The field should establish a licensing system for open source code such that the model originators maintain control of the code's use and grant permissions to other investigators who wish to use it. The field should also be more forthcoming in the teaching of cost-effectiveness analysis in medical and health services education, so that providers and other professionals are familiar with economic modeling and able to conduct analyses with open source code. These types of unintended consequences need to be fully considered before the field moves forward into an era of model transparency with open source code.
Institutional Controls and Educational Research.
ERIC Educational Resources Information Center
Homan, Roger
1990-01-01
Recognizing tendencies toward contract research and possible consequences, advocates creating a conduct code to regulate educational research and protect its integrity. Reports survey responses from 48 British institutions, showing no systematic code. States confidence in supervisory discretion currently guides research. Proposes a specific code…
Separable concatenated codes with iterative map decoding for Rician fading channels
NASA Technical Reports Server (NTRS)
Lodge, J. H.; Young, R. J.
1993-01-01
Very efficient signalling in radio channels requires the design of very powerful codes having special structure suitable for practical decoding schemes. In this paper, powerful codes are obtained by combining comparatively simple convolutional codes to form multi-tiered 'separable' convolutional codes. The decoding of these codes, using separable symbol-by-symbol maximum a posteriori (MAP) 'filters', is described. It is known that this approach yields impressive results in non-fading additive white Gaussian noise channels. Interleaving is an inherent part of the code construction, and consequently, these codes are well suited for fading channel communications. Here, simulation results for communications over Rician fading channels are presented to support this claim.
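The role interleaving plays in fading channels, spreading a burst of channel errors across many codewords so that each decoder sees only isolated errors, can be sketched with a simple block interleaver. This is a generic illustration of the principle, not the specific interleaver structure of the paper:

```python
def block_interleave(symbols, rows, cols):
    """Write symbols row by row into a rows x cols array,
    then read them out column by column."""
    assert len(symbols) == rows * cols
    return [symbols[r * cols + c] for c in range(cols) for r in range(rows)]

def block_deinterleave(symbols, rows, cols):
    # De-interleaving is interleaving with the dimensions swapped.
    return block_interleave(symbols, cols, rows)

# A fade that corrupts several consecutive transmitted symbols hits
# positions that, after de-interleaving, sit in different codewords,
# so each codeword sees at most one error.
data = list(range(12))
tx = block_interleave(data, rows=4, cols=3)
recovered = block_deinterleave(tx, rows=4, cols=3)
```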
DOT National Transportation Integrated Search
1996-01-01
The RISKIND computer program was developed for the analysis of radiological consequences and health risks to individuals and the collective population from exposures associated with the transportation of spent nuclear fuel (SNF) or other radioactive ...
Sanctions Connected to Dress Code Violations in Secondary School Handbooks
ERIC Educational Resources Information Center
Workman, Jane E.; Freeburg, Elizabeth W.; Lentz-Hees, Elizabeth S.
2004-01-01
This study identifies and evaluates sanctions for dress code violations in secondary school handbooks. Sanctions, or consequences for breaking rules, vary along seven interrelated dimensions: source, formality, retribution, obtrusiveness, magnitude, severity, and pervasiveness. A content analysis of handbooks from 155 public secondary schools…
Natural Language Interface for Safety Certification of Safety-Critical Software
NASA Technical Reports Server (NTRS)
Denney, Ewen; Fischer, Bernd
2011-01-01
Model-based design and automated code generation are being used increasingly at NASA. The trend is to move beyond simulation and prototyping to actual flight code, particularly in the guidance, navigation, and control domain. However, there are substantial obstacles to more widespread adoption of code generators in such safety-critical domains. Since code generators are typically not qualified, there is no guarantee that their output is correct, and consequently the generated code still needs to be fully tested and certified. The AutoCert generator plug-in supports the certification of automatically generated code by formally verifying that the generated code is free of different safety violations, by constructing an independently verifiable certificate, and by explaining its analysis in a textual form suitable for code reviews.
New quantum codes derived from a family of antiprimitive BCH codes
NASA Astrophysics Data System (ADS)
Liu, Yang; Li, Ruihu; Lü, Liangdong; Guo, Luobin
The Bose-Chaudhuri-Hocquenghem (BCH) codes have been studied for more than 57 years and have found wide application in classical communication systems and quantum information theory. In this paper, we study the construction of quantum codes from a family of q²-ary BCH codes with length n = q^(2m) + 1 (also called antiprimitive BCH codes in the literature), where q ≥ 4 is a power of 2 and m ≥ 2. By a detailed analysis of some useful properties of q²-ary cyclotomic cosets modulo n, Hermitian dual-containing conditions for a family of non-narrow-sense antiprimitive BCH codes are presented, which are similar to those of q²-ary primitive BCH codes. Consequently, via the Hermitian construction, a family of new quantum codes can be derived from these dual-containing BCH codes. Some of these new antiprimitive quantum BCH codes are comparable with those derived from primitive BCH codes.
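The cyclotomic cosets underlying such constructions are straightforward to compute directly. A minimal sketch follows, using the smallest parameters of the family above (q = 4, m = 2, so cosets modulo n = q^(2m) + 1 = 257 under multiplication by q² = 16); the routine itself is the textbook definition, not code from the paper:

```python
def cyclotomic_coset(x, n, base):
    """Cyclotomic coset of x modulo n under multiplication by `base`:
    {x, x*base, x*base^2, ...} reduced mod n, returned sorted."""
    coset, cur = [], x % n
    while cur not in coset:
        coset.append(cur)
        cur = (cur * base) % n
    return sorted(coset)

# q = 4, m = 2: q^2-ary cosets modulo n = q^(2m) + 1 = 257.
q = 4
n = q ** 4 + 1
coset_of_one = cyclotomic_coset(1, n, q * q)   # [1, 16, 241, 256]
```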
Evaluation of radiological dispersion/consequence codes supporting DOE nuclear facility SARs
DOE Office of Scientific and Technical Information (OSTI.GOV)
O'Kula, K.R.; Paik, I.K.; Chung, D.Y.
1996-12-31
Since the early 1990s, the authorization basis documentation of many U.S. Department of Energy (DOE) nuclear facilities has been upgraded to comply with DOE orders and standards. In this process, many safety analyses have been revised. Unfortunately, there has been nonuniform application of software, and the most appropriate computer and engineering methodologies often are not applied. A DOE Accident Phenomenology and Consequence (APAC) Methodology Evaluation Program was originated at the request of DOE Defense Programs to evaluate the safety analysis methodologies used in nuclear facility authorization basis documentation and to define future cost-effective support and development initiatives. Six areas, including source term development (fire, spills, and explosion analysis), in-facility transport, and dispersion/consequence analysis (chemical and radiological), are contained in the APAC program. The evaluation process, codes considered, key results, and recommendations for future model and software development of the Radiological Dispersion/Consequence Working Group are summarized in this paper.
The Therapy Process Observational Coding System for Child Psychotherapy Strategies Scale
ERIC Educational Resources Information Center
McLeod, Bryce D.; Weisz, John R.
2010-01-01
Most everyday child and adolescent psychotherapy does not follow manuals that document the procedures. Consequently, usual clinical care has remained poorly understood and rarely studied. The Therapy Process Observational Coding System for Child Psychotherapy-Strategies scale (TPOCS-S) is an observational measure of youth psychotherapy procedures…
Perceptual consequences of disrupted auditory nerve activity.
Zeng, Fan-Gang; Kong, Ying-Yee; Michalewski, Henry J; Starr, Arnold
2005-06-01
Perceptual consequences of disrupted auditory nerve activity were systematically studied in 21 subjects who had been clinically diagnosed with auditory neuropathy (AN), a recently defined disorder characterized by normal outer hair cell function but disrupted auditory nerve function. Neurological and electrophysiological evidence suggests that disrupted auditory nerve activity is due to desynchronized or reduced neural activity or both. Psychophysical measures showed that the disrupted neural activity has minimal effects on intensity-related perception, such as loudness discrimination, pitch discrimination at high frequencies, and sound localization using interaural level differences. In contrast, the disrupted neural activity significantly impairs timing-related perception, such as pitch discrimination at low frequencies, temporal integration, gap detection, temporal modulation detection, backward and forward masking, signal detection in noise, binaural beats, and sound localization using interaural time differences. These perceptual consequences are the opposite of what is typically observed in cochlear-impaired subjects, who have impaired intensity perception but relatively normal temporal processing after their impaired intensity perception is taken into account. These differences in perceptual consequences between auditory neuropathy and cochlear damage suggest the use of different neural codes in auditory perception: a suboptimal spike count code for intensity processing, a synchronized spike code for temporal processing, and a duplex code for frequency processing. We also propose two underlying physiological models based on desynchronized and reduced discharge in the auditory nerve to account for the observed neurological and behavioral data. The present methods and measures cannot differentiate between these two AN models, but future studies using electric stimulation of the auditory nerve via a cochlear implant might.
These results not only show the unique contribution of neural synchrony to sensory perception but also provide guidance for translational research in terms of better diagnosis and management of human communication disorders.
Generating Code Review Documentation for Auto-Generated Mission-Critical Software
NASA Technical Reports Server (NTRS)
Denney, Ewen; Fischer, Bernd
2009-01-01
Model-based design and automated code generation are increasingly used at NASA to produce actual flight code, particularly in the Guidance, Navigation, and Control domain. However, since code generators are typically not qualified, there is no guarantee that their output is correct, and consequently auto-generated code still needs to be fully tested and certified. We have thus developed AUTOCERT, a generator-independent plug-in that supports the certification of auto-generated code. AUTOCERT takes a set of mission safety requirements, and formally verifies that the autogenerated code satisfies these requirements. It generates a natural language report that explains why and how the code complies with the specified requirements. The report is hyper-linked to both the program and the verification conditions and thus provides a high-level structured argument containing tracing information for use in code reviews.
25 CFR 11.1212 - Consequences of disobedience or interference.
Code of Federal Regulations, 2010 CFR
2010-04-01
... 25 Indians 1 2010-04-01 2010-04-01 false Consequences of disobedience or interference. 11.1212 Section 11.1212 Indians BUREAU OF INDIAN AFFAIRS, DEPARTMENT OF THE INTERIOR LAW AND ORDER COURTS OF INDIAN OFFENSES AND LAW AND ORDER CODE Child Protection and Domestic Violence Procedures § 11.1212...
25 CFR 11.1212 - Consequences of disobedience or interference.
Code of Federal Regulations, 2011 CFR
2011-04-01
... 25 Indians 1 2011-04-01 2011-04-01 false Consequences of disobedience or interference. 11.1212 Section 11.1212 Indians BUREAU OF INDIAN AFFAIRS, DEPARTMENT OF THE INTERIOR LAW AND ORDER COURTS OF INDIAN OFFENSES AND LAW AND ORDER CODE Child Protection and Domestic Violence Procedures § 11.1212...
Automated Source-Code-Based Testing of Object-Oriented Software
NASA Astrophysics Data System (ADS)
Gerlich, Ralf; Gerlich, Rainer; Dietrich, Carsten
2014-08-01
With the advent of languages such as C++ and Java in mission- and safety-critical space on-board software, new challenges for testing and specifically automated testing arise. In this paper we discuss some of these challenges, consequences and solutions based on an experiment in automated source-code-based testing for C++.
Summary of evidence for an anticodonic basis for the origin of the genetic code
NASA Technical Reports Server (NTRS)
Lacey, J. C., Jr.; Mullins, D. W., Jr.
1981-01-01
This article summarizes data supporting the hypothesis that the genetic code origin was based on relationships (probably affinities) between amino acids and their anticodon nucleotides. Selective activation seems to follow from selective affinity and consequently, incorporation of amino acids into peptides can also be selective. It is suggested that these selectivities in affinity and activation, coupled with the base pairing specificities, allowed the origin of the code and the process of translation.
NASA Technical Reports Server (NTRS)
Lin, Shu; Rhee, Dojun; Rajpal, Sandeep
1993-01-01
This report presents a low-complexity and high-performance concatenated coding scheme for high-speed satellite communications. In this proposed scheme, the NASA Standard Reed-Solomon (RS) code over GF(2^8) is used as the outer code and the second-order Reed-Muller (RM) code of Hamming distance 8 is used as the inner code. The RM inner code has a very simple trellis structure and is decoded with the soft-decision Viterbi decoding algorithm. It is shown that the proposed concatenated coding scheme achieves an error performance which is comparable to that of the NASA TDRS concatenated coding scheme, in which the NASA Standard rate-1/2 convolutional code of constraint length 7 and free distance 10 is used as the inner code. However, the proposed RM inner code has much smaller decoding complexity, less decoding delay, and much higher decoding speed. Consequently, the proposed concatenated coding scheme is suitable for reliable high-speed satellite communications, and it may be considered as an alternate coding scheme for the NASA TDRS system.
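The concatenation principle described above, an inner code cleaning up channel errors so that the outer code sees a nearly clean symbol stream, can be illustrated with a deliberately tiny sketch. The repetition inner code and parity outer check below are hypothetical stand-ins for illustration only, not the RM and RS codes of the proposed scheme:

```python
# Toy concatenated coding pipeline (NOT the NASA RS/RM scheme):
# inner code = 3x bit repetition decoded by majority vote,
# outer layer = a single even-parity bit used for error detection.
def inner_encode(bits):
    return [b for b in bits for _ in range(3)]

def inner_decode(chips):
    # majority vote over each group of three channel bits
    return [int(sum(chips[i:i + 3]) >= 2) for i in range(0, len(chips), 3)]

def outer_encode(bits):
    return bits + [sum(bits) % 2]          # append even-parity bit

def outer_check(bits):
    return sum(bits) % 2 == 0

msg = [1, 0, 1, 1, 0, 0, 1, 0]
tx = inner_encode(outer_encode(msg))
tx[4] ^= 1                                  # a single channel bit flip
rx = inner_decode(tx)                       # majority vote corrects the flip
```

After decoding, `rx[:-1]` recovers the original message and the parity check passes, showing how the inner decoder hides the channel error from the outer layer.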
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ritchie, L.T.; Johnson, J.D.; Blond, R.M.
The CRAC2 computer code is a revision of the Calculation of Reactor Accident Consequences computer code, CRAC, developed for the Reactor Safety Study. The CRAC2 computer code incorporates significant modeling improvements in the areas of weather sequence sampling and emergency response, and refinements to the plume rise, atmospheric dispersion, and wet deposition models. New output capabilities have also been added. This guide is intended to facilitate the informed and intelligent use of CRAC2. It includes descriptions of the input data, the output results, the file structures, control information, and five sample problems.
Mechanisms and consequences of alternative polyadenylation
Di Giammartino, Dafne Campigli; Nishida, Kensei; Manley, James L.
2011-01-01
Alternative polyadenylation (APA) is emerging as a widespread mechanism used to control gene expression. Like alternative splicing, usage of alternative poly(A) sites allows a single gene to encode multiple mRNA transcripts. In some cases, this changes the mRNA coding potential; in other cases, the code remains unchanged but the 3'UTR length is altered, influencing the fate of mRNAs in several ways, for example, by altering the availability of RNA binding protein sites and microRNA binding sites. The mechanisms governing both global and gene-specific APA are only starting to be deciphered. Here we review what is known about these mechanisms and the functional consequences of alternative polyadenylation. PMID:21925375
Ancient DNA sequence revealed by error-correcting codes.
Brandão, Marcelo M; Spoladore, Larissa; Faria, Luzinete C B; Rocha, Andréa S L; Silva-Filho, Marcio C; Palazzo, Reginaldo
2015-07-10
A previously described DNA sequence generator algorithm (DNA-SGA) using error-correcting codes has been employed as a computational tool to address the evolutionary pathway of the genetic code. The code-generated sequence alignment demonstrated that a residue mutation revealed by the code can be found in the same position in sequences of distantly related taxa. Furthermore, the code-generated sequences do not promote amino acid changes in the deviant genomes through codon reassignment. A Bayesian evolutionary analysis of both code-generated and homologous sequences of the Arabidopsis thaliana malate dehydrogenase gene indicates an approximately 1 MYA divergence time from the MDH code-generated sequence node to its paralogous sequences. The DNA-SGA helps to determine the plesiomorphic state of DNA sequences because a single nucleotide alteration often occurs in distantly related taxa and can be found in the alternative codon patterns of noncanonical genetic codes. As a consequence, the algorithm may reveal an earlier stage of the evolution of the standard code.
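As a hedged illustration of the error-correcting-code machinery that such work draws on (this is not the published DNA-SGA algorithm), the classic Hamming(7,4) code corrects any single symbol flip, loosely analogous to recovering a single nucleotide alteration:

```python
# Classic Hamming(7,4): 4 data bits protected by 3 parity bits,
# able to locate and correct any single bit flip in the codeword.
def hamming74_encode(d):                    # d = 4 data bits
    p1 = d[0] ^ d[1] ^ d[3]
    p2 = d[0] ^ d[2] ^ d[3]
    p3 = d[1] ^ d[2] ^ d[3]
    return [p1, p2, d[0], p3, d[1], d[2], d[3]]

def hamming74_decode(c):                    # returns the corrected 4 data bits
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
    syndrome = s1 + 2 * s2 + 4 * s3         # 1-based position of the error
    if syndrome:
        c = c.copy()
        c[syndrome - 1] ^= 1                # flip the erroneous bit back
    return [c[2], c[4], c[5], c[6]]
```

Flipping any one of the seven codeword bits leaves the syndrome pointing at the flipped position, so the original data are recovered exactly.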
Ancient DNA sequence revealed by error-correcting codes
Brandão, Marcelo M.; Spoladore, Larissa; Faria, Luzinete C. B.; Rocha, Andréa S. L.; Silva-Filho, Marcio C.; Palazzo, Reginaldo
2015-01-01
A previously described DNA sequence generator algorithm (DNA-SGA) using error-correcting codes has been employed as a computational tool to address the evolutionary pathway of the genetic code. The code-generated sequence alignment demonstrated that a residue mutation revealed by the code can be found in the same position in sequences of distantly related taxa. Furthermore, the code-generated sequences do not promote amino acid changes in the deviant genomes through codon reassignment. A Bayesian evolutionary analysis of both code-generated and homologous sequences of the Arabidopsis thaliana malate dehydrogenase gene indicates an approximately 1 MYA divergence time from the MDH code-generated sequence node to its paralogous sequences. The DNA-SGA helps to determine the plesiomorphic state of DNA sequences because a single nucleotide alteration often occurs in distantly related taxa and can be found in the alternative codon patterns of noncanonical genetic codes. As a consequence, the algorithm may reveal an earlier stage of the evolution of the standard code. PMID:26159228
Violence and its injury consequences in American movies
McArthur, David L; Peek-Asa, Corinne; Webb, Theresa; Fisher, Kevin; Cook, Bernard; Browne, Nick; Kraus, Jess
2000-01-01
Objectives To evaluate the seriousness and frequency of violence and the degree of associated injury depicted in the 100 top-grossing American films of 1994. Methods Each scene in each film was examined for the presentation of violent actions on persons and coded by a systematic context-sensitive analytic scheme. Specific degrees of violence and indices of injury severity were abstracted. Only actually depicted, not implied, actions were coded, although both explicit and implied consequences were examined. Results The median number of violent actions per film was 16 (range, 0-110). Intentional violence outnumbered unintentional violence by a factor of 10. Almost 90% of violent actions showed no consequences to the recipient's body, although more than 80% of the violent actions were executed with lethal or moderate force. Fewer than 1% of violent actions were accompanied by injuries that were then medically attended. Conclusions Violent force in American films of 1994 was overwhelmingly intentional and in 4 of 5 cases was executed at levels likely to cause significant bodily injury. Not only action films but movies of all genres contained scenes in which the intensity of the action was not matched by correspondingly severe injury consequences. Many American films, regardless of genre, tend to minimize the consequences of violence to human beings. PMID:10986175
Violence and its injury consequences in American movies: a public health perspective.
McArthur, D L; Peek-Asa, C; Webb, T; Fisher, K; Cook, B; Browne, N; Kraus, J
2000-09-01
To evaluate the seriousness and frequency of violence and the degree of associated injury depicted in the 100 top-grossing American films of 1994. Each scene in each film was examined for the presentation of violent actions on persons and coded by a systematic context-sensitive analytic scheme. Specific degrees of violence and indices of injury severity were abstracted. Only actually depicted, not implied, actions were coded, although both explicit and implied consequences were examined. The median number of violent actions per film was 16 (range, 0-110). Intentional violence outnumbered unintentional violence by a factor of 10. Almost 90% of violent actions showed no consequences to the recipient's body, although more than 80% of the violent actions were executed with lethal or moderate force. Fewer than 1% of violent actions were accompanied by injuries that were then medically attended. Violent force in American films of 1994 was overwhelmingly intentional and in 4 of 5 cases was executed at levels likely to cause significant bodily injury. Not only action films but movies of all genres contained scenes in which the intensity of the action was not matched by correspondingly severe injury consequences. Many American films, regardless of genre, tend to minimize the consequences of violence to human beings.
ERIC Educational Resources Information Center
Alty, James L.
Dual Coding Theory has quite specific predictions about how information in different media is stored, manipulated and recalled. Different combinations of media are expected to have significant effects upon the recall and retention of information. This obviously may have important consequences in the design of computer-based programs. The paper…
ERIC Educational Resources Information Center
Banzato, Monica; Tosato, Paolo
2017-01-01
In Italy, teaching coding at primary and secondary levels is emerging as a major educational issue, particularly in light of the recent reforms now being implemented. Consequently, there has been increased research on how to introduce information technology in lower secondary schools. This paper presents an exploratory survey, carried out through…
Weighted SAW reflector gratings for orthogonal frequency coded SAW tags and sensors
NASA Technical Reports Server (NTRS)
Puccio, Derek (Inventor); Malocha, Donald (Inventor)
2011-01-01
Weighted surface acoustic wave reflector gratings for coding identification tags and sensors to enable unique sensor operation and identification for a multi-sensor environment. In an embodiment, the weighted reflectors are variable while in another embodiment the reflector gratings are apodized. The weighting technique allows the designer to decrease reflectivity and allows for more chips to be implemented in a device and, consequently, more coding diversity. As a result, more tags and sensors can be implemented using a given bandwidth when compared with uniform reflectors. Use of weighted reflector gratings with OFC makes various phase shifting schemes possible, such as in-phase and quadrature implementations of coded waveforms resulting in reduced device size and increased coding.
2003-02-01
service warfighters (Training devices and protocols, Onboard equipment, Cognitive and sensorimotor aids, Visual and auditory symbology, Peripheral visual...vestibular stimulation causing a decrease in cerebral blood pressure with the consequent reduction in G-tolerance and increased likelihood of ALOC or GLOC...tactile stimulators (e.g. one providing a sensation of movement) or of displays with a more complex coding (e.g. by increase in the number of tactors, or
Mechanism on brain information processing: Energy coding
NASA Astrophysics Data System (ADS)
Wang, Rubin; Zhang, Zhikang; Jiao, Xianfa
2006-09-01
According to the experimental result that signal transmission and neuronal energetic demands are tightly coupled to information coding in the cerebral cortex, the authors present a new scientific theory that offers a unique mechanism for brain information processing. They demonstrate that the neural coding produced by the activity of the brain is well described by the theory of energy coding. Because the energy coding model can reveal mechanisms of brain information processing based upon known biophysical properties, they can not only reproduce various experimental results of neuroelectrophysiology but also quantitatively explain the recent experimental results from neuroscientists at Yale University by means of the principle of energy coding. Because the theory of energy coding bridges the gap between functional connections within a biological neural network and energetic consumption, they estimate that the theory has very important consequences for quantitative research of cognitive function.
Energy coding in biological neural networks
Zhang, Zhikang
2007-01-01
According to the experimental result that signal transmission and neuronal energetic demands are tightly coupled to information coding in the cerebral cortex, we present a new scientific theory that offers a unique mechanism for brain information processing. We demonstrate that the neural coding produced by the activity of the brain is well described by our theory of energy coding. Because the energy coding model can reveal mechanisms of brain information processing based upon known biophysical properties, we can not only reproduce various experimental results of neuro-electrophysiology, but also quantitatively explain the recent experimental results from neuroscientists at Yale University by means of the principle of energy coding. Because the theory of energy coding bridges the gap between functional connections within a biological neural network and energetic consumption, we estimate that the theory has very important consequences for quantitative research of cognitive function. PMID:19003513
Bachman, Peter; Reichenberg, Abraham; Rice, Patrick; Woolsey, Mary; Chaves, Olga; Martinez, David; Maples, Natalie; Velligan, Dawn I; Glahn, David C
2010-05-01
Cognitive processing inefficiency, often measured using digit symbol coding tasks, is a putative vulnerability marker for schizophrenia and a reliable indicator of illness severity and functional outcome. Indeed, performance on the digit symbol coding task may be the most severe neuropsychological deficit patients with schizophrenia display at the group level. Yet, little is known about the contributions of simpler cognitive processes to coding performance in schizophrenia (e.g. decision making, visual scanning, relational memory, motor ability). We developed an experimental behavioral task, based on a computerized digit symbol coding task, which allows the manipulation of demands placed on visual scanning efficiency and relational memory while holding decisional and motor requirements constant. Although patients (n=85) were impaired on all aspects of the task when compared to demographically matched healthy comparison subjects (n=30), they showed a particularly striking failure to benefit from the presence of predictable target information. These findings are consistent with predicted impairments in cognitive processing speed due to schizophrenia patients' well-known memory impairment, suggesting that this mnemonic deficit may have consequences for critical aspects of information processing that are traditionally considered quite separate from the memory domain. Future investigation into the mechanisms underlying the wide-ranging consequences of mnemonic deficits in schizophrenia should provide additional insight. Copyright (c) 2010 Elsevier B.V. All rights reserved.
2014-10-01
offer a practical solution to calculating the grain-scale heterogeneity present in the deformation field. Consequently, crystal plasticity models...process/performance simulation codes (e.g., crystal plasticity finite element method). 15. SUBJECT TERMS ICME; microstructure informatics; higher...(iii) protocols for direct and efficient linking of materials models/databases into process/performance simulation codes (e.g., crystal plasticity
Lattice surgery on the Raussendorf lattice
NASA Astrophysics Data System (ADS)
Herr, Daniel; Paler, Alexandru; Devitt, Simon J.; Nori, Franco
2018-07-01
Lattice surgery is a method to perform quantum computation fault-tolerantly by using operations on boundary qubits between different patches of the planar code. This technique allows for universal planar code computation without eliminating the intrinsic two-dimensional nearest-neighbor properties of the surface code that eases physical hardware implementations. Lattice surgery approaches to algorithmic compilation and optimization have been demonstrated to be more resource efficient for resource-intensive components of a fault-tolerant algorithm, and consequently may be preferable over braid-based logic. Lattice surgery can be extended to the Raussendorf lattice, providing a measurement-based approach to the surface code. In this paper we describe how lattice surgery can be performed on the Raussendorf lattice and therefore give a viable alternative to computation using braiding in measurement-based implementations of topological codes.
Emmorey, Karen; Petrich, Jennifer; Gollan, Tamar H.
2012-01-01
Bilinguals who are fluent in American Sign Language (ASL) and English often produce code-blends - simultaneously articulating a sign and a word while conversing with other ASL-English bilinguals. To investigate the cognitive mechanisms underlying code-blend processing, we compared picture-naming times (Experiment 1) and semantic categorization times (Experiment 2) for code-blends versus ASL signs and English words produced alone. In production, code-blending did not slow lexical retrieval for ASL and actually facilitated access to low-frequency signs. However, code-blending delayed speech production because bimodal bilinguals synchronized English and ASL lexical onsets. In comprehension, code-blending speeded access to both languages. Bimodal bilinguals’ ability to produce code-blends without any cost to ASL implies that the language system either has (or can develop) a mechanism for switching off competition to allow simultaneous production of close competitors. Code-blend facilitation effects during comprehension likely reflect cross-linguistic (and cross-modal) integration at the phonological and/or semantic levels. The absence of any consistent processing costs for code-blending illustrates a surprising limitation on dual-task costs and may explain why bimodal bilinguals code-blend more often than they code-switch. PMID:22773886
Violence and its injury consequences in American movies: a public health perspective
McArthur, D.; Peek-Asa, C.; Webb, T.; Fisher, K.; Cook, B.; Browne, N.; Kraus, J.
2000-01-01
Objectives—The purpose of this study was to evaluate the seriousness and frequency of violence and the degree of associated injury depicted in the 100 top grossing American films of 1994. Methods—Each scene in each film was examined for the presentation of violent actions upon persons and coded by means of a systematic context sensitive analytic scheme. Specific degrees of violence and indices of injury severity were abstracted. Only actually depicted, not implied, actions were coded, although both explicit and implied consequences were examined. Results—The median number of violent actions per film was 16, with a range from 1 to 110. Intentional violence outnumbered unintentional violence by a factor of 10. Almost 90% of violent actions showed no consequences to the recipient's body, although more than 80% of the violent actions were executed with lethal or moderate force. Fewer than 1% of violent actions were accompanied by injuries that were then medically attended. Conclusions—Violent force in American films of 1994 was overwhelmingly intentional and in four of five cases was executed at levels likely to cause significant bodily injury. Not only action films but movies of all genres contained scenes in which the intensity of the action was not matched by correspondingly severe injury consequences. Many American films, regardless of genre, tend to minimize the consequences of violence to human beings. PMID:10875668
Violence and its injury consequences in American movies: a public health perspective.
McArthur, D; Peek-Asa, C; Webb, T; Fisher, K; Cook, B; Browne, N; Kraus, J
2000-06-01
The purpose of this study was to evaluate the seriousness and frequency of violence and the degree of associated injury depicted in the 100 top grossing American films of 1994. Each scene in each film was examined for the presentation of violent actions upon persons and coded by means of a systematic context sensitive analytic scheme. Specific degrees of violence and indices of injury severity were abstracted. Only actually depicted, not implied, actions were coded, although both explicit and implied consequences were examined. The median number of violent actions per film was 16, with a range from 1 to 110. Intentional violence outnumbered unintentional violence by a factor of 10. Almost 90% of violent actions showed no consequences to the recipient's body, although more than 80% of the violent actions were executed with lethal or moderate force. Fewer than 1% of violent actions were accompanied by injuries that were then medically attended. Violent force in American films of 1994 was overwhelmingly intentional and in four of five cases was executed at levels likely to cause significant bodily injury. Not only action films but movies of all genres contained scenes in which the intensity of the action was not matched by correspondingly severe injury consequences. Many American films, regardless of genre, tend to minimize the consequences of violence to human beings.
WEC3: Wave Energy Converter Code Comparison Project: Preprint
DOE Office of Scientific and Technical Information (OSTI.GOV)
Combourieu, Adrien; Lawson, Michael; Babarit, Aurelien
This paper describes the recently launched Wave Energy Converter Code Comparison (WEC3) project and presents preliminary results from this effort. The objectives of WEC3 are to verify and validate numerical modelling tools that have been developed specifically to simulate wave energy conversion devices and to inform the upcoming IEA OES Annex VI Ocean Energy Modelling Verification and Validation project. WEC3 is divided into two phases. Phase I consists of a code-to-code verification and Phase II entails code-to-experiment validation. WEC3 focuses on mid-fidelity codes that simulate WECs using time-domain multibody dynamics methods to model device motions and hydrodynamic coefficients to model hydrodynamic forces. Consequently, high-fidelity numerical modelling tools, such as Navier-Stokes computational fluid dynamics simulation, and simple frequency domain modelling tools were not included in the WEC3 project.
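A minimal sketch of the mid-fidelity, time-domain style of model that WEC3 targets, reduced to a single heave degree of freedom with assumed, purely illustrative parameters (this is not any of the WEC3 participant codes):

```python
# Hedged sketch: a single-DOF heaving WEC modeled as a mass-spring-damper
# driven by a sinusoidal wave excitation force, integrated in the time
# domain with semi-implicit Euler. All parameter values are assumptions
# chosen for illustration, not data from any real device.
import math

m, b, k = 1.0e5, 2.0e4, 3.0e5     # mass [kg], damping [N s/m], hydrostatic stiffness [N/m]
F0, omega = 5.0e4, 1.1            # excitation amplitude [N], wave frequency [rad/s]
dt, steps = 0.01, 10000           # 100 s of simulated time

z, v = 0.0, 0.0                   # heave position [m] and velocity [m/s]
for n in range(steps):
    F = F0 * math.sin(omega * n * dt)
    a = (F - b * v - k * z) / m   # Newton's second law for the float
    v += a * dt                   # update velocity first (semi-implicit Euler)
    z += v * dt

# linear theory: steady-state amplitude F0 / sqrt((k - m w^2)^2 + (b w)^2)
pred = F0 / math.sqrt((k - m * omega**2)**2 + (b * omega)**2)
```

After the transients decay, the simulated heave stays within the amplitude predicted by the frequency-domain formula, which is the kind of code-to-code consistency check a comparison project like WEC3 performs at much higher fidelity.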
Prompt Radiation Protection Factors
2018-02-01
radiation was performed using the three-dimensional Monte-Carlo radiation transport code MCNP (Monte Carlo N-Particle) and the evaluation of the protection factors (ratio of dose in the open to...by detonation of a nuclear device have placed renewed emphasis on evaluation of the consequences in case of such an event. The Defense Threat
PFLOTRAN-RepoTREND Source Term Comparison Summary.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Frederick, Jennifer M.
Code inter-comparison studies are useful exercises to verify and benchmark independently developed software to ensure proper function, especially when the software is used to model high-consequence systems which cannot be physically tested in a fully representative environment. This summary describes the results of the first portion of the code inter-comparison between PFLOTRAN and RepoTREND, which compares the radionuclide source term used in a typical performance assessment.
Rady, Mohamed Y; Verheijde, Joseph L
2014-06-02
End-of-life organ donation is controversial in Islam. The controversy stems from: (1) scientifically flawed medical criteria of death determination; (2) invasive perimortem procedures for preserving transplantable organs; and (3) incomplete disclosure of information to consenting donors and families. Data from a survey of Muslims residing in Western countries have shown that the interpretation of religious scriptures and advice of faith leaders were major barriers to willingness for organ donation. Transplant advocates have proposed corrective interventions: (1) reinterpreting religious scriptures, (2) reeducating faith leaders, and (3) utilizing media campaigns to overcome religious barriers in Muslim communities. This proposal disregards the intensifying scientific, legal, and ethical controversies in Western societies about the medical criteria of death determination in donors. It would also violate the dignity and inviolability of human life which are pertinent values incorporated in the Islamic moral code. Reinterpreting religious scriptures to serve the utilitarian objectives of a controversial end-of-life practice, perceived to be socially desirable, transgresses the Islamic moral code. It may also have deleterious practical consequences, as donors can suffer harm before death. The negative normative consequences of utilitarian secular moral reasoning reset the Islamic moral code upholding the sanctity and dignity of human life.
2014-01-01
End-of-life organ donation is controversial in Islam. The controversy stems from: (1) scientifically flawed medical criteria of death determination; (2) invasive perimortem procedures for preserving transplantable organs; and (3) incomplete disclosure of information to consenting donors and families. Data from a survey of Muslims residing in Western countries have shown that the interpretation of religious scriptures and advice of faith leaders were major barriers to willingness for organ donation. Transplant advocates have proposed corrective interventions: (1) reinterpreting religious scriptures, (2) reeducating faith leaders, and (3) utilizing media campaigns to overcome religious barriers in Muslim communities. This proposal disregards the intensifying scientific, legal, and ethical controversies in Western societies about the medical criteria of death determination in donors. It would also violate the dignity and inviolability of human life which are pertinent values incorporated in the Islamic moral code. Reinterpreting religious scriptures to serve the utilitarian objectives of a controversial end-of-life practice, perceived to be socially desirable, transgresses the Islamic moral code. It may also have deleterious practical consequences, as donors can suffer harm before death. The negative normative consequences of utilitarian secular moral reasoning reset the Islamic moral code upholding the sanctity and dignity of human life. PMID:24888748
Certifying Auto-Generated Flight Code
NASA Technical Reports Server (NTRS)
Denney, Ewen
2008-01-01
Model-based design and automated code generation are being used increasingly at NASA. Many NASA projects now use MathWorks Simulink and Real-Time Workshop for at least some of their modeling and code development. However, there are substantial obstacles to more widespread adoption of code generators in safety-critical domains. Since code generators are typically not qualified, there is no guarantee that their output is correct, and consequently the generated code still needs to be fully tested and certified. Moreover, the regeneration of code can require complete recertification, which offsets many of the advantages of using a generator. Indeed, manual review of autocode can be more challenging than for hand-written code. Since the direct V&V of code generators is too laborious and complicated due to their complex (and often proprietary) nature, we have developed a generator plug-in to support the certification of the auto-generated code. Specifically, the AutoCert tool supports certification by formally verifying that the generated code is free of different safety violations, by constructing an independently verifiable certificate, and by explaining its analysis in a textual form suitable for code reviews. The generated documentation also contains substantial tracing information, allowing users to trace between model, code, documentation, and V&V artifacts. This enables missions to obtain assurance about the safety and reliability of the code without excessive manual V&V effort and, as a consequence, eases the acceptance of code generators in safety-critical contexts. The generation of explicit certificates and textual reports is particularly well-suited to supporting independent V&V. The primary contribution of this approach is the combination of human-friendly documentation with formal analysis. The key technical idea is to exploit the idiomatic nature of auto-generated code in order to automatically infer logical annotations. 
The annotation inference algorithm itself is generic, and parametrized with respect to a library of coding patterns that depend on the safety policies and the code generator. The patterns characterize the notions of definitions and uses that are specific to the given safety property. For example, for initialization safety, definitions correspond to variable initializations while uses are statements which read a variable, whereas for array bounds safety, definitions are the array declarations, while uses are statements which access an array variable. The inferred annotations are thus highly dependent on the actual program and the properties being proven. The annotations, themselves, need not be trusted, but are crucial to obtain the automatic formal verification of the safety properties without requiring access to the internals of the code generator. The approach has been applied to both in-house and commercial code generators, but is independent of the particular generator used. It is currently being adapted to flight code generated using MathWorks Real-Time Workshop, an automatic code generator that translates from Simulink/Stateflow models into embedded C code.
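The definitions-and-uses idea behind the initialization-safety pattern can be sketched as a toy checker. The Python AST version below is purely illustrative: AutoCert itself performs formal verification of auto-generated C/C++ with inferred logical annotations, not ad hoc scanning like this.

```python
# Hedged toy sketch of "definitions vs. uses" for initialization safety:
# a definition is an assignment target, a use is a name read in an
# expression; a use with no preceding definition is flagged.
# Handles only flat, top-level assignments (no branches, tuples, or scopes).
import ast

def uninitialized_uses(source):
    """Return names read before any assignment, in program order."""
    tree = ast.parse(source)
    defined, flagged = set(), []
    for node in ast.walk(tree):
        if isinstance(node, ast.Assign):
            # uses: names read on the right-hand side
            for sub in ast.walk(node.value):
                if isinstance(sub, ast.Name) and sub.id not in defined:
                    flagged.append(sub.id)
            # definitions: simple assignment targets
            for tgt in node.targets:
                if isinstance(tgt, ast.Name):
                    defined.add(tgt.id)
    return flagged

snippet = "x = 1\ny = x + z\nz = 2\n"
report = uninitialized_uses(snippet)       # 'z' is read before it is set
```

In the snippet, `x` is defined before use but `z` is read on line 2 and only assigned on line 3, so the checker reports `["z"]`.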
NASA Astrophysics Data System (ADS)
Schneider, Barry I.; Segura, Javier; Gil, Amparo; Guan, Xiaoxu; Bartschat, Klaus
2018-04-01
This is a revised and updated version of a modern Fortran 90 code to compute the regular Plm(x) and irregular Qlm(x) associated Legendre functions for all x ∈ (-1, +1) (on the cut) and |x| > 1 and integer degree (l) and order (m). The necessity to revise the code comes as a consequence of comments by Prof. James Bremer of the UC Davis Mathematics Department, who discovered that there were errors in the code for large integer degree and order for the normalized regular Legendre functions on the cut.
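The standard upward recurrence used to compute P_l^m(x) on the cut can be sketched as follows. This toy version omits the normalization and scaling needed at large degree and order, which is precisely the regime where the production code required revision:

```python
# Sketch of the classic upward recurrence for the associated Legendre
# function P_l^m(x) on the cut (|x| <= 1), Condon-Shortley phase included.
# Illustrative only: no overflow protection or normalization for large l, m.
import math

def plm(l, m, x):
    # seed: P_m^m(x) = (-1)^m (2m-1)!! (1 - x^2)^(m/2)
    pmm = (-1) ** m * math.prod(range(1, 2 * m, 2)) * (1 - x * x) ** (m / 2)
    if l == m:
        return pmm
    pm1 = x * (2 * m + 1) * pmm              # P_{m+1}^m(x)
    if l == m + 1:
        return pm1
    # (l - m) P_l^m = x (2l - 1) P_{l-1}^m - (l + m - 1) P_{l-2}^m
    for ll in range(m + 2, l + 1):
        pmm, pm1 = pm1, (x * (2 * ll - 1) * pm1 - (ll + m - 1) * pmm) / (ll - m)
    return pm1
```

As a spot check, P_2^0(x) = (3x^2 - 1)/2, so plm(2, 0, 0.5) should equal -0.125, and with the Condon-Shortley phase P_1^1(0) = -1.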
[How do first codes of medical ethics inspire contemporary physicians?].
Paprocka-Lipińska, Anna; Basińska, Krystyna
2014-02-01
The first codes of medical ethics appeared between the 18th and 19th centuries. Their formation was inspired by changes in medicine that were positive in general but came with some negative setbacks. Those negative consequences revealed the need to codify the ethical duties which were formerly passed from generation to generation by word of mouth and by the individual example of master physicians. 210 years have passed since the publication of "Medical Ethics" by Thomas Percival, yet the essential ethical guidelines remain the same. Similarly, the ethical codes published in Poland in the 19th century can still be an inspiration to modern physicians.
Illustration of Some Consequences of the Indistinguishability of Electrons
ERIC Educational Resources Information Center
Moore, John W.; Davies, William G.
1976-01-01
Discusses how color-coded overhead transparencies of computer-generated dot-density diagrams can be used to illustrate hybrid orbitals and the principle of the indistinguishability of electrons. (MLH)
New French Regulation for NPPs and Code Consequences
DOE Office of Scientific and Technical Information (OSTI.GOV)
Faidy, Claude
2006-07-01
In December 2005, the French regulator issued a new regulation for French nuclear power plants, in particular for pressure equipment (PE). This regulation must first agree with the non-nuclear PE regulation, and it adds some specific requirements, in particular radiation protection requirements. The proposal has several advantages: it is more qualitatively risk-oriented, and it provides an important link with non-nuclear industry; only a few components are nuclear-specific. However, the general philosophy of the existing codes (RCC-M [15], KTA [16], or ASME [17]) has to be improved. For foreign codes, it is planned to define the differences in the user specifications. In parallel, a new safety classification has been developed by the French utility. The consequence is the need to cross-check all these specifications to define a minimum quality level for each component or system. At the same time, a new concept has been developed to replace the well-known 'Leak Before Break' methodology: the 'Break Exclusion' methodology. This paper summarizes the key aspects of these different topics. (authors)
Hauser-Feshbach calculations in deformed nuclei
Grimes, S. M.
2013-08-22
Hauser-Feshbach calculations for deformed nuclei are typically done with level densities appropriate for deformed nuclei, but with Hauser-Feshbach codes that enforce spherical symmetry by not including K as a parameter in the decay sums. A code has been written that does allow the full K dependence to be included. Calculations with this code have been compared with those from a conventional Hauser-Feshbach code. The evaporation (continuum) portion is only slightly affected by this change, but the cross sections to individual (resolved) levels change substantially. It is found that cross sections to neighboring levels with the same J but differing K are not the same. The predicted consequences of K mixing are also discussed.
High-Content Optical Codes for Protecting Rapid Diagnostic Tests from Counterfeiting.
Gökçe, Onur; Mercandetti, Cristina; Delamarche, Emmanuel
2018-06-19
Warnings and reports on counterfeit diagnostic devices are released several times a year by regulators and public health agencies. Unfortunately, mishandling, altering, and counterfeiting point-of-care diagnostics (POCDs) and rapid diagnostic tests (RDTs) is lucrative and relatively simple, and can lead to devastating consequences. Here, we demonstrate how to implement optical security codes in silicon- and nitrocellulose-based flow paths for device authentication using a smartphone. The codes are created by inkjet spotting inks directly on nitrocellulose or on micropillars. Codes containing up to 32 elements per mm² and 8 colors can encode as many as 10^45 combinations. Codes on silicon micropillars can be erased by setting a continuous flow path across the entire array of code elements or, for nitrocellulose, by simply wicking a liquid across the code. Static or labile code elements can further be formed on nitrocellulose to create a hidden code using poly(ethylene glycol) (PEG) or glycerol additives to the inks. More advanced codes having a specific deletion sequence can also be created in silicon microfluidic devices using an array of passive routing nodes, which activate in a particular, programmable sequence. Such codes are simple to fabricate, easy to view, and efficient in coding information; they can be ideally used in combination with information on a package to protect diagnostic devices from counterfeiting.
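The quoted coding capacity follows from simple counting: n independently colored elements chosen from c colors give c^n distinct codes. A short sketch (the element count of 50 is an assumption for illustration, consistent with the stated density of 32 elements per mm²):

```python
from math import log10

def code_combinations(elements, colors):
    """Number of distinct codes for independently colored elements."""
    return colors ** elements

# With 8 colors, about 50 elements already exceed 10^45 combinations
# (roughly 1.6 mm^2 at the stated density of 32 elements per mm^2).
n = 50
combos = code_combinations(n, 8)
print(f"8^{n} ~ 10^{log10(combos):.1f}")  # prints 8^50 ~ 10^45.2
```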
Blast and the Consequences on Traumatic Brain Injury-Multiscale Mechanical Modeling of Brain
2011-02-17
A finite-element formulation is implemented to model the air-blast simulation; LS-DYNA, an explicit FE code, has been employed to simulate this multi-material fluid-structure interaction problem for the 3-D head. Report topics include a biomechanics study of influencing parameters for the brain under impact and the impact of cerebrospinal fluid.
Modeling emission lag after photoexcitation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jensen, Kevin L.; Petillo, John J.; Ovtchinnikov, Serguei
2017-10-28
A theoretical model of delayed emission following photoexcitation from metals and semiconductors is given. Its numerical implementation is designed for beam optics codes used to model photocathodes in rf photoinjectors. The model extends the Moments approach for predicting photocurrent and mean transverse energy as moments of an emitted electron distribution by incorporating time of flight and scattering events that result in emission delay on a sub-picosecond level. The model accounts for a dynamic surface extraction field and changes in the energy distribution and time of emission as a consequence of the laser penetration depth and multiple scattering events during transport. Usage in the Particle-in-Cell code MICHELLE to predict the bunch shape and duration with or without laser jitter is given. The consequences of delayed emission effects for ultra-short pulses are discussed.
Proceedings of the 21st DOE/NRC Nuclear Air Cleaning Conference; Sessions 1--8
DOE Office of Scientific and Technical Information (OSTI.GOV)
First, M.W.
1991-02-01
Separate abstracts have been prepared for the papers presented at the meeting on nuclear facility air cleaning technology in the following specific areas of interest: air cleaning technologies for the management and disposal of radioactive wastes; Canadian waste management program; radiological health effects models for nuclear power plant accident consequence analysis; filter testing; US standard codes on nuclear air and gas treatment; European community nuclear codes and standards; chemical processing off-gas cleaning; incineration and vitrification; adsorbents; nuclear codes and standards; mathematical modeling techniques; filter technology; safety; containment system venting; and nuclear air cleaning programs around the world. (MB)
Towards a complete map of the human long non-coding RNA transcriptome.
Uszczynska-Ratajczak, Barbara; Lagarde, Julien; Frankish, Adam; Guigó, Roderic; Johnson, Rory
2018-05-23
Gene maps, or annotations, enable us to navigate the functional landscape of our genome. They are a resource upon which virtually all studies depend, from single-gene to genome-wide scales and from basic molecular biology to medical genetics. Yet present-day annotations suffer from trade-offs between quality and size, with serious but often unappreciated consequences for downstream studies. This is particularly true for long non-coding RNAs (lncRNAs), which are poorly characterized compared to protein-coding genes. Long-read sequencing technologies promise to improve current annotations, paving the way towards a complete annotation of lncRNAs expressed throughout a human lifetime.
Code development for ships -- A demonstration
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ayyub, B.; Mansour, A.E.; White, G.
1996-12-31
A demonstration summary of a reliability-based structural design code for ships is presented for two ship types, a cruiser and a tanker. For both ship types, code requirements cover four failure modes: hull girder buckling, unstiffened plate yielding and buckling, stiffened plate buckling, and fatigue of critical details. Both serviceability and ultimate limit states are considered. Because of length limitations, only hull girder modes are presented in this paper; code requirements for the other modes will be presented in a future publication. A specific provision of the code is a safety check expression. The design variables are to be taken at their nominal values, typically values on the safe side of the respective distributions. Other safety check expressions for hull girder failure that include load combination factors, as well as consequence-of-failure factors, are considered. This paper provides a summary of safety check expressions for the hull girder modes.
Cracking the code: the accuracy of coding shoulder procedures and the repercussions.
Clement, N D; Murray, I R; Nie, Y X; McBirnie, J M
2013-05-01
Coding of patients' diagnoses and surgical procedures is subject to error levels of up to 40%, with consequences for the distribution of resources and financial recompense. Our aim was to explore and address the reasons behind coding errors of shoulder diagnoses and surgical procedures and to evaluate a potential solution. A retrospective review of 100 patients who had undergone surgery was carried out. Coding errors were identified and the reasons explored. A coding proforma was designed to address these errors and was prospectively evaluated for 100 patients. The financial implications were also considered. Retrospective analysis revealed that an entirely correct primary diagnosis was assigned in 54 patients (54%), and only 7 patients (7%) had a correct procedure code assigned. Coders identified indistinct clinical notes and poor clarity of procedure codes as reasons for errors. The proforma was significantly more likely to assign the correct diagnosis (odds ratio 18.2, p < 0.0001) and the correct procedure code (odds ratio 310.0, p < 0.0001). Using the proforma resulted in a £28,562 increase in revenue for the 100 patients evaluated relative to the income generated from the coding department. High error levels for coding are due to misinterpretation of notes and ambiguity of procedure codes. This can be addressed by allowing surgeons to assign the diagnosis and procedure using a simplified list that is passed directly to coding.
Examining the relationship between comprehension and production processes in code-switched language
Guzzardo Tamargo, Rosa E.; Valdés Kroff, Jorge R.; Dussias, Paola E.
2016-08-01
We employ code-switching (the alternation of two languages in bilingual communication) to test the hypothesis, derived from experience-based models of processing (e.g., Boland, Tanenhaus, Carlson, & Garnsey, 1989; Gennari & MacDonald, 2009), that bilinguals are sensitive to the combinatorial distributional patterns derived from production and that they use this information to guide processing during the comprehension of code-switched sentences. An analysis of spontaneous bilingual speech confirmed the existence of production asymmetries involving two auxiliary + participle phrases in Spanish–English code-switches. A subsequent eye-tracking study with two groups of bilingual code-switchers examined the consequences of the differences in distributional patterns found in the corpus study for comprehension. Participants' comprehension costs mirrored the production patterns found in the corpus study. Findings are discussed in terms of the constraints that may be responsible for the distributional patterns in code-switching production and are situated within recent proposals of the links between production and comprehension. PMID:28670049
Michael Frei, Dominik; Hodneland, Erlend; Rios-Mondragon, Ivan; Burtey, Anne; Neumann, Beate; Bulkescher, Jutta; Schölermann, Julia; Pepperkok, Rainer; Gerdes, Hans-Hermann; Kögel, Tanja
2015-01-01
Contact-dependent intercellular transfer (codeIT) of cellular constituents can have functional consequences for recipient cells, such as enhanced survival and drug resistance. Pathogenic viruses, prions, and bacteria can also utilize this mechanism to spread to adjacent cells and potentially evade immune detection. However, little is known about the molecular mechanism underlying this intercellular transfer process. Here, we present a novel microscopy-based screening method to identify regulators and cargo of codeIT. Single donor cells, carrying fluorescently labelled endocytic organelles or proteins, are co-cultured with excess acceptor cells. CodeIT is quantified by confocal microscopy and image analysis in 3D, preserving spatial information. An siRNA-based screen using this method revealed the involvement of several myosins and small GTPases as codeIT regulators. Our data indicate that cellular protrusions and tubular recycling endosomes are important for codeIT. We automated image acquisition and analysis to facilitate large-scale chemical and genetic screening efforts to identify key regulators of codeIT. PMID:26271723
FIR Filter of DS-CDMA UWB Modem Transmitter
NASA Astrophysics Data System (ADS)
Kang, Kyu-Min; Cho, Sang-In; Won, Hui-Chul; Choi, Sang-Sung
This letter presents low-complexity digital pulse-shaping filter structures for a direct-sequence code-division multiple-access (DS-CDMA) ultra-wideband (UWB) modem transmitter with a ternary spreading code. The proposed finite impulse response (FIR) filter structures, which use a look-up table (LUT), reduce memory requirements by about 50% to 80% compared with conventional FIR filter structures, and are consequently suitable for high-speed parallel data processing.
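The LUT idea can be sketched as follows: because the spreading code is ternary, a window of T input symbols takes only 3^T values, so every possible dot product with the taps can be precomputed and the running filter reduced to table lookups. This is a minimal Python illustration of the principle, not the letter's hardware structure; the tap values are hypothetical:

```python
from itertools import product

def build_lut(taps):
    """Precompute the filter output for every ternary input window.
    With T taps there are 3**T entries, each a full dot product, so
    the running filter is reduced to a single table lookup per sample."""
    symbols = (-1, 0, 1)
    return {w: sum(h * s for h, s in zip(taps, w))
            for w in product(symbols, repeat=len(taps))}

def lut_fir(taps, data):
    """Filter a ternary sequence via the lookup table (y[n] = sum_k h[k] x[n-k])."""
    lut = build_lut(taps)
    t = len(taps)
    padded = (0,) * (t - 1) + tuple(data)
    # window ordering matches zip(taps, window): newest sample first
    return [lut[tuple(reversed(padded[i:i + t]))] for i in range(len(data))]
```

In hardware, partitioning the window across several smaller LUTs whose outputs are summed is what yields the reported memory savings relative to one monolithic table.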
Kisely, Steve; Crowe, Elizabeth; Lawrence, David; White, Angela; Connor, Jason
2013-08-01
In response to concerns about the health consequences of high-risk drinking by young people, the Australian Government increased the tax on pre-mixed alcoholic beverages ('alcopops') favoured by this demographic. We measured changes in admissions for alcohol-related harm to health throughout Queensland, before and after the tax increase in April 2008. We used data from the Queensland Trauma Register, Hospitals Admitted Patients Data Collection, and the Emergency Department Information System to calculate alcohol-related admission rates per 100,000 people, for 15 - 29 year-olds. We analysed data over 3 years (April 2006 - April 2009), using interrupted time-series analyses. This covered 2 years before, and 1 year after, the tax increase. We investigated both mental and behavioural consequences (via F10 codes), and intentional/unintentional injuries (S and T codes). We fitted an auto-regressive integrated moving average (ARIMA) model, to test for any changes following the increased tax. There was no decrease in alcohol-related admissions in 15 - 29 year-olds. We found similar results for males and females, as well as definitions of alcohol-related harms that were narrow (F10 codes only) and broad (F10, S and T codes). The increased tax on 'alcopops' was not associated with any reduction in hospital admissions for alcohol-related harms in Queensland 15 - 29 year-olds.
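The interrupted time-series design can be illustrated with a simplified segmented regression on synthetic monthly rates (the study itself fitted an ARIMA model; all numbers below are invented for illustration). A step term at the intervention date captures any level change in admissions:

```python
import numpy as np

# Segmented ("interrupted") regression: level + trend + post-intervention step.
# A simplified stand-in for the ARIMA model used in the study; data are synthetic.
rng = np.random.default_rng(0)
n, break_at = 36, 24          # 36 monthly rates, intervention at month 24
t = np.arange(n)
step = (t >= break_at).astype(float)
true_level, true_trend, true_step = 100.0, 0.5, -8.0
y = true_level + true_trend * t + true_step * step + rng.normal(0, 0.5, n)

X = np.column_stack([np.ones(n), t, step])
level, trend, jump = np.linalg.lstsq(X, y, rcond=None)[0]
print(f"estimated step change: {jump:.1f}")  # close to the true value of -8
```

A null result, as reported in the study, corresponds to a step estimate indistinguishable from zero after accounting for trend and autocorrelation.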
Pattani, Reena; Marquez, Christine; Dinyarian, Camellia; Sharma, Malika; Bain, Julie; Moore, Julia E; Straus, Sharon E
2018-04-10
Despite the gender parity existing in medical schools for over three decades, women remain underrepresented in academic medical centers, particularly in senior ranks and in leadership roles. This has consequences for patient care, education, research, and workplace culture within healthcare organizations. This study was undertaken to explore the perspectives of faculty members at a single department of medicine on the impact of the existing gender gap on organizational effectiveness and workplace culture, and to identify systems-based strategies to mitigate the gap. The study took place at a large university department of medicine in Toronto, Canada, with six affiliated hospitals. In this qualitative study, semi-structured individual interviews were conducted between May and September 2016 with full-time faculty members who held clinical and university-based appointments. Transcripts of the interviews were analyzed using thematic analysis. Three authors independently reviewed the transcripts to determine a preliminary list of codes and establish a coding framework. A modified audit consensus coding approach was applied; a single analyst reviewed all the transcripts and a second analyst audited 20% of the transcripts in each round of coding. Following each round, inter-rater reliability was determined, discrepancies were resolved through discussion, and modifications were made as needed to the coding framework. The analysis revealed faculty members' perceptions of the gender gap, potential contributing factors, organizational impacts, and possible solutions to bridge the gap. Of the 43 full-time faculty members who participated in the survey (29 of whom self-identified as female), most participants were aware of the existing gender gap within academic medicine. Participants described social exclusion, reinforced stereotypes, and unprofessional behaviors as consequences of the gap on organizational effectiveness and culture. 
They suggested improvements in (1) the processes for recruitment, hiring, and promotion; (2) inclusiveness of the work environment; (3) structures for mentorship; and (4) ongoing monitoring of the gap. The existing gender gap in academic medicine may have negative consequences for organizational effectiveness and workplace culture but many systems-based strategies to mitigate the gap exist. Although these solutions warrant rigorous evaluation, they are feasible to institute within most healthcare organizations immediately.
Software Model Checking of ARINC-653 Flight Code with MCP
NASA Technical Reports Server (NTRS)
Thompson, Sarah J.; Brat, Guillaume; Venet, Arnaud
2010-01-01
The ARINC-653 standard defines a common interface for Integrated Modular Avionics (IMA) code. In particular, ARINC-653 Part 1 specifies a process- and partition-management API that is analogous to POSIX threads, but with certain extensions and restrictions intended to support the implementation of high-reliability flight code. MCP is a software model checker, developed at NASA Ames, that provides capabilities for model checking C and C++ source code. In this paper, we present recent work aimed at implementing extensions to MCP that support ARINC-653, and we discuss the challenges and opportunities that arise as a consequence. Providing support for ARINC-653's time and space partitioning is nontrivial, though the API's strict interprocess communication policy makes implicit benefits for partial-order reduction possible.
Schnabel, M; Mann, D; Efe, T; Schrappe, M; V Garrel, T; Gotzen, L; Schaeg, M
2004-10-01
The introduction of the German Diagnostic Related Groups (D-DRG) system requires redesigning administrative patient management strategies. Wrong coding leads to inaccurate grouping and endangers the reimbursement of treatment costs. This situation emphasizes the roles of documentation and coding as factors of economic success. The aims of this study were to assess the quantity and quality of initial documentation and coding (ICD-10 and OPS-301) and to find operative strategies to improve efficiency and strategic means to ensure optimal documentation and coding quality. In a prospective study, documentation and coding quality were evaluated in a standardized way by weekly assessment. Clinical data from 1385 inpatients were processed for initial correctness and quality of documentation and coding. Principal diagnoses were found to be accurate in 82.7% of cases, inexact in 7.1%, and wrong in 10.1%. Effects on financial returns occurred in 16%. Based on these findings, an optimized, interdisciplinary, and multiprofessional workflow for medical documentation, coding, and data control was developed. A workflow incorporating regular assessment of documentation and coding quality is required by the DRG system to ensure efficient accounting of hospital services. Interdisciplinary and multiprofessional cooperation is recognized to be an important factor in establishing an efficient workflow in medical documentation and coding.
Arbitrariness is not enough: towards a functional approach to the genetic code.
Lacková, Ľudmila; Matlach, Vladimír; Faltýnek, Dan
2017-12-01
Arbitrariness in the genetic code is one of the main reasons for a linguistic approach to molecular biology: the genetic code is usually understood as an arbitrary relation between amino acids and nucleobases. However, from a semiotic point of view, arbitrariness should not be the only condition for the definition of a code; consequently, it is not completely correct to talk about a "code" in this case. Yet we suppose that there exists a code in the process of protein synthesis, but on a higher level than the nucleic base chains. Semiotically, a code should always be associated with a function, and we propose to define the genetic code not only relationally (on the basis of the relation between nucleobases and amino acids) but also in terms of function (the function of a protein as the meaning of the code). Even if the functional definition of meaning in the genetic code has been discussed in the field of biosemiotics, its further implications have not been considered. In fact, if the function of a protein represents the meaning of the genetic code (the sign's object), then it is crucial to reconsider the notion of its expression (the sign) as well. In our contribution, we show that the actual model of the genetic code is not the only one possible, and we propose a more appropriate model from a semiotic point of view.
Dynamic event tree analysis with the SAS4A/SASSYS-1 safety analysis code
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jankovsky, Zachary K.; Denman, Matthew R.; Aldemir, Tunc
2018-02-02
The consequences of a transient in an advanced sodium-cooled fast reactor are difficult to capture with the traditional approach to probabilistic risk assessment (PRA). Numerous safety-relevant systems are passive and may have operational states that cannot be represented by binary success or failure. In addition, the specific order and timing of events may be crucial, which necessitates the use of dynamic PRA tools such as ADAPT. The modifications to the SAS4A/SASSYS-1 sodium-cooled fast reactor safety analysis code for linking it to ADAPT to perform a dynamic PRA are described. A test case is used to demonstrate the linking process and to illustrate the type of insights that may be gained with this process. Finally, newly-developed dynamic importance measures are used to assess the significance of reactor parameters/constituents on calculated consequences of initiating events.
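As a toy illustration of dynamic event-tree branching (not the ADAPT/SAS4A/SASSYS-1 coupling itself; the components and probabilities are invented), the following sketch branches on component outcomes at scheduled times, preserves event order and timing in each scenario history, and checks that the probabilities over all sequences sum to one:

```python
def expand_tree(state, branch_points):
    """Recursively expand a dynamic event tree.

    state: (probability, history) of the scenario so far.
    branch_points: list of (time, outcomes), where outcomes maps an
    outcome label to its conditional probability at that time.
    Yields (probability, history) for every complete event sequence.
    """
    prob, history = state
    if not branch_points:
        yield prob, history
        return
    (t, outcomes), rest = branch_points[0], branch_points[1:]
    for label, p in outcomes.items():
        # order and timing of events are preserved in the history
        yield from expand_tree((prob * p, history + [(t, label)]), rest)

# Two hypothetical branchings: pump state at t=10 s, valve state at t=50 s.
branch_points = [
    (10.0, {"pump runs": 0.9, "pump degraded": 0.08, "pump fails": 0.02}),
    (50.0, {"valve opens": 0.95, "valve stuck": 0.05}),
]
sequences = list(expand_tree((1.0, []), branch_points))
assert abs(sum(p for p, _ in sequences) - 1.0) < 1e-12
```

In a real dynamic PRA the branch times and probabilities are not fixed in advance but generated by the physics code as the simulation advances, which is what the ADAPT linkage provides.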
A predictive coding account of MMN reduction in schizophrenia.
Wacongne, Catherine
2016-04-01
The mismatch negativity (MMN) is thought to be an index of the automatic activation of a specialized network for active prediction and deviance detection in the auditory cortex. It is consistently reduced in schizophrenic patients and has received a lot of interest as a clinical and translational tool. The main neuronal hypothesis regarding the mechanisms leading to a reduced MMN in schizophrenic patients is a dysfunction of NMDA receptors (NMDA-R). However, this hypothesis has never been implemented in a neuronal model. In this paper, I examine the consequences of NMDA-R dysfunction in a neuronal model of MMN based on predictive coding principles. I also investigate how predictive processes may interact with synaptic adaptation in MMN generation and examine the consequences of this interaction for the use of MMN paradigms in schizophrenia research.
Efficient preparation of large-block-code ancilla states for fault-tolerant quantum computation
NASA Astrophysics Data System (ADS)
Zheng, Yi-Cong; Lai, Ching-Yi; Brun, Todd A.
2018-03-01
Fault-tolerant quantum computation (FTQC) schemes that use multiqubit large block codes can potentially reduce the resource overhead to a great extent. A major obstacle is the requirement for a large number of clean ancilla states of different types without correlated errors inside each block. These ancilla states are usually logical stabilizer states of the data-code blocks, which are generally difficult to prepare if the code size is large. Previously, we have proposed an ancilla distillation protocol for Calderbank-Shor-Steane (CSS) codes by classical error-correcting codes. It was assumed that the quantum gates in the distillation circuit were perfect; however, in reality, noisy quantum gates may introduce correlated errors that are not treatable by the protocol. In this paper, we show that additional postselection by another classical error-detecting code can be applied to remove almost all correlated errors. Consequently, the revised protocol is fully fault tolerant and capable of preparing a large set of stabilizer states sufficient for FTQC using large block codes. At the same time, the yield rate can be boosted from O(t⁻²) to O(1) in practice for an [[n, k, d = 2t+1]] code.
Consequence analysis in LPG installation using an integrated computer package.
Ditali, S; Colombi, M; Moreschini, G; Senni, S
2000-01-07
This paper presents the prototype of the computer code Atlantide, developed to assess the consequences associated with accidental events that can occur in an LPG storage plant. Atlantide is designed to be simple yet adequate for the consequence analysis required by Italian legislation implementing the Seveso Directive. The application of Atlantide is appropriate for LPG storage/transfer installations. The models and correlations implemented in the code cover flashing liquid releases, heavy gas dispersion, and other typical phenomena such as BLEVE/fireball. On the basis of the operating/design characteristics, the code allows the study of the relevant accidental events, from the evaluation of the release rate (liquid, gaseous, and two-phase) in the unit involved, through the analysis of the subsequent evaporation and dispersion, up to the assessment of the final phenomena of fire and explosion. This is done taking as reference simplified event trees that describe the evolution of accidental scenarios, taking into account the most likely meteorological conditions, the different release situations, and other features typical of an LPG installation. The limited input data required and the automatic linking between the single models, which are activated in a defined sequence depending on the accidental event selected, minimize both the time required for the risk analysis and the possibility of errors.
The results have been compared with those of other internationally recognized codes and with the criteria adopted by the Italian authorities to verify the safety reports for LPG installations. A brief account of the theoretical basis of each model implemented in Atlantide and an example of application are included in the paper.
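As an illustration of the kind of model such a package chains together (not Atlantide's actual correlations), the release-rate step for a subcooled liquid through a sharp-edged orifice is often estimated with a Bernoulli discharge formula; flashing two-phase releases require more elaborate models. The hole size and fluid properties below are hypothetical:

```python
from math import pi, sqrt

def liquid_release_rate(d_hole_m, p_storage_pa, p_ambient_pa, rho_kg_m3, cd=0.61):
    """Mass flow (kg/s) of a subcooled liquid through a sharp-edged orifice:
    m_dot = Cd * A * sqrt(2 * rho * (P_storage - P_ambient))   (Bernoulli)."""
    area = pi * (d_hole_m / 2.0) ** 2
    return cd * area * sqrt(2.0 * rho_kg_m3 * (p_storage_pa - p_ambient_pa))

# Hypothetical example: 25 mm hole, storage at 8 bar gauge,
# liquid propane density ~500 kg/m^3
m_dot = liquid_release_rate(0.025, 9.0e5, 1.0e5, 500.0)
print(f"{m_dot:.1f} kg/s")  # prints 8.5 kg/s
```

A consequence package then feeds an estimate like this into the evaporation, dispersion, and fire/explosion models downstream, which is why automatic linking of the single models matters.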
Imitation learning based on an intrinsic motivation mechanism for efficient coding
Triesch, Jochen
2013-01-01
A hypothesis regarding the development of imitation learning is presented that is rooted in intrinsic motivations. It is derived from a recently proposed form of intrinsically motivated learning (IML) for efficient coding in active perception, wherein an agent learns to perform actions with its sense organs to facilitate efficient encoding of the sensory data. To this end, actions of the sense organs that improve the encoding of the sensory data trigger an internally generated reinforcement signal. Here it is argued that the same IML mechanism might also support the development of imitation when general actions beyond those of the sense organs are considered: The learner first observes a tutor performing a behavior and learns a model of the behavior's sensory consequences. The learner then acts itself and receives an internally generated reinforcement signal reflecting how well the sensory consequences of its own behavior are encoded by the sensory model. Actions that are more similar to those of the tutor will lead to sensory signals that are easier to encode and produce a higher reinforcement signal. Through this, the learner's behavior is progressively tuned to make the sensory consequences of its actions match the learned sensory model. I discuss this mechanism in the context of human language acquisition and bird song learning, where similar ideas have been proposed. The suggested mechanism also offers an account for the development of mirror neurons and makes a number of predictions. Overall, it establishes a connection between principles of efficient coding, intrinsic motivations, and imitation. PMID:24204350
mRNA changes in nucleus accumbens related to methamphetamine addiction in mice
NASA Astrophysics Data System (ADS)
Zhu, Li; Li, Jiaqi; Dong, Nan; Guan, Fanglin; Liu, Yufeng; Ma, Dongliang; Goh, Eyleen L. K.; Chen, Teng
2016-11-01
Methamphetamine (METH) is a highly addictive psychostimulant that elicits aberrant changes in the expression of microRNAs (miRNAs) and long non-coding RNAs (lncRNAs) in the nucleus accumbens of mice, indicating a potential role of METH in post-transcriptional regulation. To decipher the potential consequences of these post-transcriptional regulations in response to METH, we performed strand-specific RNA sequencing (ssRNA-Seq) to identify alterations in mRNA expression and alternative splicing in the nucleus accumbens of mice following exposure to METH. METH-mediated changes in mRNAs were analyzed and correlated with previously reported changes in non-coding RNAs (miRNAs and lncRNAs) to determine the potential functions of the mRNA changes observed here and how non-coding RNAs are involved. A total of 2171 mRNAs were differentially expressed in response to METH, with functions involved in synaptic plasticity, mitochondrial energy metabolism and immune response. Of these mRNAs, 309 and 589 are potential targets of miRNAs and lncRNAs, respectively. In addition, METH treatment decreases mRNA alternative splicing, and there are 818 METH-specific events not observed in saline-treated mice. Our results suggest that METH-mediated addiction could be attributed to changes in miRNAs and lncRNAs and, consequently, changes in mRNA alternative splicing and expression. In conclusion, our study reported a methamphetamine-modified nucleus accumbens transcriptome and provided non-coding RNA-mRNA interaction networks possibly involved in METH addiction.
Ultra Safe And Secure Blasting System
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hart, M M
2009-07-27
The Ultra is a blasting system that is designed for special applications where the risk and consequences of unauthorized demolition or blasting are so great that the use of an extraordinarily safe and secure blasting system is justified. Such a blasting system would be connected and logically welded together through digital code-linking as part of the blasting system set-up and initialization process. The Ultra's security is so robust that it will defeat even the people who designed and built the components in any attempt at unauthorized detonation. Anyone attempting to gain unauthorized control of the system by substituting components or tapping into communications lines will be thwarted by their inability to provide encrypted authentication. Authentication occurs through the use of codes that are generated by the system during initialization code-linking, and the codes remain unknown to anyone, including the authorized operator. Once code-linked, a closed system has been created. The system requires that all components be connected as they were during initialization, as well as a unique code entered by the operator, for function and blasting.
Electromagnetic code for naval applications
NASA Astrophysics Data System (ADS)
Crescimbeni, F.; Bessi, F.; Chiti, S.
1988-12-01
An increasing number of electronic systems has become vital to meeting the high performance required for military Navy applications, and the number of antennas mounted on shipboard has consequently grown. As a result of the high antenna density, the complexity of the shipboard environment, and the powers used for communication and radar systems, the EMC (Electro-Magnetic Compatibility) problem is playing a leading role in the design of the topside of a ship. The Italian Navy has acquired a numerical code for antenna siting and design. This code, together with experimental data measured at the Italian Navy test range facility, allows for the evaluation of optimal sitings for antenna systems on shipboard, and the prediction of their performance in the actual environment. The structure of this code, named Programma Elettromagnetico per Applicazioni Navali (Electromagnetic Code for Naval Applications), is discussed, together with its capabilities and applications. The results obtained in some examples are also presented and compared with measurements.
The Development of Bimodal Bilingualism: Implications for Linguistic Theory.
Lillo-Martin, Diane; de Quadros, Ronice Müller; Pichler, Deborah Chen
2016-01-01
A wide range of linguistic phenomena contribute to our understanding of the architecture of the human linguistic system. In this paper we present a proposal dubbed Language Synthesis to capture bilingual phenomena including code-switching and 'transfer' as automatic consequences of the addition of a second language, using basic concepts of Minimalism and Distributed Morphology. Bimodal bilinguals, who use a sign language and a spoken language, provide a new type of evidence regarding possible bilingual phenomena, namely code-blending, the simultaneous production of (aspects of) a message in both speech and sign. We argue that code-blending also follows naturally once a second articulatory interface is added to the model. Several different types of code-blending are discussed in connection to the predictions of the Synthesis model. Our primary data come from children developing as bimodal bilinguals, but our proposal is intended to capture a wide range of bilingual effects across any language pair.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ritchie, L.T.; Alpert, D.J.; Burke, R.P.
1984-03-01
The CRAC2 computer code is a revised version of CRAC (Calculation of Reactor Accident Consequences) which was developed for the Reactor Safety Study. This document provides an overview of the CRAC2 code and a description of each of the models used. Significant improvements incorporated into CRAC2 include an improved weather sequence sampling technique, a new evacuation model, and new output capabilities. In addition, refinements have been made to the atmospheric transport and deposition model. Details of the modeling differences between CRAC2 and CRAC are emphasized in the model descriptions.
Mathematical fundamentals for the noise immunity of the genetic code.
Fimmel, Elena; Strüngmann, Lutz
2018-02-01
Symmetry is one of the essential and most visible patterns that can be seen in nature. Starting from the left-right symmetry of the human body, all types of symmetry can be found in crystals, plants, animals and nature as a whole. Similarly, principles of symmetry are also some of the fundamental and most useful tools in modern mathematical natural science that play a major role in theory and applications. As a consequence, it is not surprising that the desire to understand the origin of life, based on the genetic code, forces us to involve symmetry as a mathematical concept. The genetic code can be seen as a key to biological self-organisation. All living organisms have the same molecular bases - an alphabet consisting of four letters (nitrogenous bases): adenine, cytosine, guanine, and thymine. Linearly ordered sequences of these bases contain the genetic information for synthesis of proteins in all forms of life. Thus, one of the most fascinating riddles of nature is to explain why the genetic code is as it is. Genetic coding possesses noise immunity, which is the fundamental feature that allows genetic information to be passed on from parents to their descendants. Hence, since the time of the discovery of the genetic code, scientists have tried to explain the noise immunity of the genetic information. In this chapter we will discuss recent results in mathematical modelling of the genetic code with respect to noise immunity, in particular error-detection and error-correction. We will focus on two central properties: degeneracy and frameshift correction. Different amino acids are encoded by different numbers of codons, and a connection between this degeneracy and the noise immunity of genetic information is a long-standing hypothesis. Biological implications of the degeneracy have been intensively studied, and whether the natural code is a frozen accident or a highly optimised product of evolution is still controversially discussed.
Symmetries in the structure of degeneracy of the genetic code are essential and give evidence of substantial advantages of the natural code over other possible ones. In the present chapter we will present a recent approach to explain the degeneracy of the genetic code by algorithmic methods from bioinformatics, and discuss its biological consequences. Biologists recognised the frameshift problem immediately after the detection of the non-overlapping structure of the genetic code, i.e., coding sequences are to be read in a unique way determined by their reading frame. But how does the reading head of the ribosome recognise an error in the grouping of codons, caused by, e.g., the insertion or deletion of a base, which can be fatal during the translation process and may result in nonfunctional proteins? In this chapter we will discuss possible solutions to the frameshift problem with a focus on the theory of so-called circular codes that were discovered in large gene populations of prokaryotes and eukaryotes in the early 90s. Circular codes allow the detection of a frameshift of one or two positions, and recently a beautiful theory of such codes has been developed using statistics, group theory and graph theory. Copyright © 2017 Elsevier B.V. All rights reserved.
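The frameshift-detection property of circular codes can be checked algorithmically. The sketch below uses the graph criterion of Fimmel, Michel and Strüngmann (a trinucleotide code is circular exactly when its associated directed graph is acyclic); the implementation and the tiny example codes are illustrative, not taken from the chapter:

```python
def graph_edges(code):
    """Associated graph G(X): each trinucleotide b1b2b3 in X contributes
    the edges b1 -> b2b3 and b1b2 -> b3 (nodes are mono- and dinucleotides)."""
    edges = set()
    for w in code:
        edges.add((w[0], w[1:]))
        edges.add((w[:2], w[2]))
    return edges

def has_cycle(edges):
    # Iterative-free DFS cycle detection on a directed graph.
    adj = {}
    for u, v in edges:
        adj.setdefault(u, []).append(v)
    WHITE, GREY, BLACK = 0, 1, 2
    color = {}
    def dfs(u):
        color[u] = GREY
        for v in adj.get(u, []):
            c = color.get(v, WHITE)
            if c == GREY or (c == WHITE and dfs(v)):
                return True
        color[u] = BLACK
        return False
    return any(color.get(u, WHITE) == WHITE and dfs(u) for u in adj)

def is_circular(code):
    # Graph criterion: X is circular iff G(X) is acyclic.
    return not has_cycle(graph_edges(code))

print(is_circular({"ACG"}))         # True: a single non-periodic word
print(is_circular({"AAA"}))         # False: ...AAAAAA... has no unique frame
print(is_circular({"ACG", "CGA"}))  # False: the two words are circular shifts
```

The failing cases show why circularity matters for reading-frame retrieval: a periodic word, or a pair of words that are circular shifts of each other, makes the decomposition of a circularly written sequence ambiguous.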
Development of high-fidelity multiphysics system for light water reactor analysis
NASA Astrophysics Data System (ADS)
Magedanz, Jeffrey W.
There has been a tendency in recent years toward greater heterogeneity in reactor cores, due to the use of mixed-oxide (MOX) fuel, burnable absorbers, and longer cycles with consequently higher fuel burnup. The resulting asymmetry of the neutron flux and energy spectrum between regions with different compositions causes a need to account for the directional dependence of the neutron flux, instead of the traditional diffusion approximation. Furthermore, the presence of both MOX and high-burnup fuel in the core increases the complexity of the heat conduction. The heat transfer properties of the fuel pellet change with irradiation, and the thermal and mechanical expansion of the pellet and cladding strongly affect the size of the gap between them, and its consequent thermal resistance. These operational tendencies require higher fidelity multi-physics modeling capabilities, and this need is addressed by the developments performed within this PhD research. The dissertation describes the development of a High-Fidelity Multi-Physics System for Light Water Reactor Analysis. It consists of three coupled codes -- CTF for Thermal Hydraulics, TORT-TD for Neutron Kinetics, and FRAPTRAN for Fuel Performance. It is meant to address these modeling challenges in three ways: (1) by resolving the state of the system at the level of each fuel pin, rather than homogenizing entire fuel assemblies, (2) by using the multi-group Discrete Ordinates method to account for the directional dependence of the neutron flux, and (3) by using a fuel-performance code, rather than a Thermal Hydraulics code's simplified fuel model, to account for the material behavior of the fuel and its feedback to the hydraulic and neutronic behavior of the system. While the first two are improvements, the third, the use of a fuel-performance code for feedback, constitutes an innovation in this PhD project. Also important to this work is the manner in which such coupling is written. 
While coupling involves combining codes into a single executable, they are usually still developed and maintained separately. It should thus be a design objective to minimize the changes to those codes, and keep the changes to each code free of dependence on the details of the other codes. This will ease the incorporation of new versions of the code into the coupling, as well as re-use of parts of the coupling to couple with different codes. In order to fulfill this objective, an interface for each code was created in the form of an object-oriented abstract data type. Object-oriented programming is an effective method for enforcing a separation between different parts of a program, and clarifying the communication between them. The interfaces enable the main program to control the codes in terms of high-level functionality. This differs from the established practice of a master/slave relationship, in which the slave code is incorporated into the master code as a set of subroutines. While this PhD research continues previous work with a coupling between CTF and TORT-TD, it makes two major original contributions: (1) using a fuel-performance code, instead of a thermal-hydraulics code's simplified built-in models, to model the feedback from the fuel rods, and (2) the design of an object-oriented interface as an innovative method to interact with a coupled code in a high-level, easily-understandable manner. The resulting code system will serve as a tool to study the question of under what conditions, and to what extent, these higher-fidelity methods will provide benefits to reactor core analysis. (Abstract shortened by UMI.)
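The object-oriented interface idea can be sketched as follows. This is a hypothetical Python illustration of the design pattern only: the class and method names, the toy physics, and the numbers are invented, and are not the actual CTF, TORT-TD, or FRAPTRAN interfaces:

```python
from abc import ABC, abstractmethod

class PhysicsCode(ABC):
    """Abstract data type wrapping one coupled code: the driver sees only
    high-level operations, never the code's internals."""
    @abstractmethod
    def initialize(self, deck: str) -> None: ...
    @abstractmethod
    def advance(self, dt: float) -> None: ...
    @abstractmethod
    def get_field(self, name: str) -> list: ...
    @abstractmethod
    def set_field(self, name: str, values: list) -> None: ...

class ToyThermalHydraulics(PhysicsCode):
    def initialize(self, deck): self.t_fuel = [300.0, 300.0]
    def advance(self, dt): self.t_fuel = [t + 5.0 * dt for t in self.t_fuel]
    def get_field(self, name): return self.t_fuel
    def set_field(self, name, values): self.t_fuel = values

class ToyNeutronics(PhysicsCode):
    def initialize(self, deck): self.power = [1.0, 1.0]
    def advance(self, dt): pass  # power held fixed in this toy
    def get_field(self, name): return self.power
    def set_field(self, name, values):
        # Doppler-like feedback: power drops as fuel temperature rises.
        self.power = [1.0 / (1.0 + 1e-3 * (t - 300.0)) for t in values]

# The driver controls both codes purely through the shared interface,
# so either toy could be swapped for a different code without changes here.
th, nk = ToyThermalHydraulics(), ToyNeutronics()
th.initialize("th.inp"); nk.initialize("nk.inp")
for _ in range(10):
    th.advance(1.0)
    nk.set_field("fuel_temperature", th.get_field("fuel_temperature"))

print(th.get_field("fuel_temperature"))  # [350.0, 350.0]
```

This is the contrast with a master/slave coupling: neither toy code is compiled into the other as subroutines; each stands behind its own interface, so new versions can be dropped in without touching the driver.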
Identification of a Novel GJA8 (Cx50) Point Mutation Causes Human Dominant Congenital Cataracts
NASA Astrophysics Data System (ADS)
Ge, Xiang-Lian; Zhang, Yilan; Wu, Yaming; Lv, Jineng; Zhang, Wei; Jin, Zi-Bing; Qu, Jia; Gu, Feng
2014-02-01
Hereditary cataracts are clinically and genetically heterogeneous lens diseases that cause a significant proportion of visual impairment and blindness in children. Human cataracts have been linked with mutations in two genes, GJA3 and GJA8. To identify the causative mutation in a family with hereditary cataracts, family members were screened by PCR for mutations in both genes. Sequencing the coding regions of GJA8, coding for connexin 50, revealed a C > A transversion at nucleotide 264, which causes the p.P88T mutation. To dissect the molecular consequences of this mutation, plasmids carrying wild-type and mutant mouse ORFs of Gja8 were generated and ectopically expressed in HEK293 cells and human lens epithelial cells, respectively. The recombinant proteins were assessed by confocal microscopy and Western blotting. The results demonstrate that the molecular consequences of the p.P88T mutation in GJA8 include changes in connexin 50 protein localization patterns, accumulation of mutant protein, and increased cell growth.
Tindall, B J; Sutton, G; Garrity, G M
2017-02-01
Enterobacter aerogenes Hormaeche and Edwards 1960 (Approved Lists 1980) and Klebsiella mobilis Bascomb et al. 1971 (Approved Lists 1980) were placed on the Approved Lists of Bacterial Names and were based on the same nomenclatural type, ATCC 13048. Consequently they are to be treated as homotypic synonyms. However, the names of homotypic synonyms at the rank of species normally are based on the same epithet. Examination of the Rules of the International Code of Nomenclature of Bacteria in force at the time indicates that the epithet mobilis in Klebsiella mobilis Bascomb et al. 1971 (Approved Lists 1980) was illegitimate at the time the Approved Lists were published and according to the Rules of the current International Code of Nomenclature of Prokaryotes continues to be illegitimate.
Extremely accurate sequential verification of RELAP5-3D
Mesina, George L.; Aumiller, David L.; Buschman, Francis X.
2015-11-19
Large computer programs like RELAP5-3D solve complex systems of governing, closure and special process equations to model the underlying physics of nuclear power plants. Further, these programs incorporate many other features for physics, input, output, data management, user interaction, and post-processing. For software quality assurance, the code must be verified and validated before being released to users. For RELAP5-3D, verification and validation are restricted to nuclear power plant applications. Verification means ensuring that the program is built right by checking that it meets its design specifications, comparing coding to algorithms and equations, and comparing calculations against analytical solutions and the method of manufactured solutions. Sequential verification performs these comparisons initially, but thereafter only compares code calculations between consecutive code versions to demonstrate that no unintended changes have been introduced. Recently, an automated, highly accurate sequential verification method has been developed for RELAP5-3D. The method also tests that no unintended consequences result from code development in the following code capabilities: repeating a timestep advancement, continuing a run from a restart file, multiple cases in a single code execution, and modes of coupled/uncoupled operation. In conclusion, mathematical analyses of the adequacy of the checks used in the comparisons are provided.
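The core comparison step of sequential verification can be sketched simply. The function below is a minimal, hypothetical illustration (not RELAP5-3D's actual mechanism, record format, or tolerances): calculations from consecutive code versions are compared record by record, and any difference beyond round-off flags an unintended change:

```python
def sequential_verify(baseline, candidate, tol=1e-12):
    """Compare calculation records from version N (baseline) against
    version N+1 (candidate); return the records that differ beyond tol."""
    if len(baseline) != len(candidate):
        raise ValueError("record counts differ between versions")
    return [(i, a, b)
            for i, (a, b) in enumerate(zip(baseline, candidate))
            if abs(a - b) > tol]

v1 = [1.0, 2.5, 3.75]   # illustrative results from version N
v2 = [1.0, 2.5, 3.75]   # version N+1, no physics change
v3 = [1.0, 2.6, 3.75]   # version N+1 with an unintended change

print(sequential_verify(v1, v2))  # [] -> no unintended changes detected
print(sequential_verify(v1, v3))  # [(1, 2.5, 2.6)] -> flagged for review
```

The same comparison would be repeated for each tested mode (restart, repeated timestep, multi-case runs), so an unintended consequence in any capability shows up as a non-empty difference list.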
Młynarski, Wiktor
2014-01-01
To date a number of studies have shown that receptive field shapes of early sensory neurons can be reproduced by optimizing coding efficiency of natural stimulus ensembles. A still unresolved question is whether the efficient coding hypothesis explains the formation of neurons which explicitly represent environmental features of different functional importance. This paper proposes that the spatial selectivity of higher auditory neurons emerges as a direct consequence of learning efficient codes for natural binaural sounds. Firstly, it is demonstrated that a linear efficient coding transform, Independent Component Analysis (ICA), trained on spectrograms of naturalistic simulated binaural sounds extracts spatial information present in the signal. A simple hierarchical ICA extension allowing for decoding of sound position is proposed. Furthermore, it is shown that units revealing spatial selectivity can be learned from a binaural recording of a natural auditory scene. In both cases a relatively small subpopulation of learned spectrogram features suffices to perform accurate sound localization. Representation of the auditory space is therefore learned in a purely unsupervised way by maximizing the coding efficiency and without any task-specific constraints. These results imply that efficient coding is a useful strategy for learning structures which allow for making behaviorally vital inferences about the environment. PMID:24639644
NASA Astrophysics Data System (ADS)
Silva, K.; Lawawirojwong, S.; Promping, J.
2017-06-01
Consequence assessment of a hypothetical severe accident is one of the important elements of the risk assessment of a nuclear power plant. It is widely known that meteorological conditions can significantly influence the outcomes of such an assessment, since they determine the results of the calculation of radionuclide environmental transport. This study aims to assess the impacts of meteorological conditions on the results of the consequence assessment. The consequence assessment code OSCAAR of the Japan Atomic Energy Agency (JAEA) is used for the assessment. The results of the consequence assessment using Thai meteorological data are compared with those using Japanese meteorological data. The Thai case has the following characteristics. Low wind speed caused the radionuclides to concentrate near the release point compared to the Japanese case. Squalls induced peaks in the ground concentration distribution. The evacuated area is larger than in the Japanese case, though the relocated area is smaller, which is attributed to the concentration of the radionuclides near the release point.
How To Keep Your Schools Safe and Secure.
ERIC Educational Resources Information Center
Gilbert, Christopher B.
1996-01-01
Discusses unforeseen costs (including potential litigation expenses), benefits, and consequences of adopting security measures (such as metal detectors, drug dogs, security cameras, campus police, dress codes, crime watch programs, and communication devices) to counter on-campus violence and gang activity. High-tech gadgetry alone is insufficient.…
The Revised 2010 Ethical Standards for School Counselors
ERIC Educational Resources Information Center
Huey, Wayne C.
2011-01-01
The American School Counselor Association (ASCA) recently revised its ethical code for professional school counselors, the "Ethical Standards for School Counselors," in 2010. Professional school counselors have a unique challenge in counseling minors in that they provide services in an educational setting. Consequently, school counselors not only…
Thermodynamic consequences of hydrogen combustion within a containment of pressurized water reactor
NASA Astrophysics Data System (ADS)
Bury, Tomasz
2011-12-01
Gaseous hydrogen may be generated in a nuclear reactor system as an effect of core overheating. This creates a risk of its uncontrolled combustion, which may have destructive consequences, as observed during the Fukushima nuclear power plant accident. Favorable conditions for hydrogen production occur during heavy loss-of-coolant accidents. The author used his own computer code, called HEPCAL, of the lumped-parameter type to perform a set of simulations of large-scale loss-of-coolant accident scenarios within the containment of a second-generation pressurized water reactor. Some simulations resulted in high pressure peaks that seemed irrational. A more detailed analysis and comparison with the consequences of the Three Mile Island and Fukushima accidents allowed interesting conclusions to be drawn.
A novel quantum LSB-based steganography method using the Gray code for colored quantum images
NASA Astrophysics Data System (ADS)
Heidari, Shahrokh; Farzadnia, Ehsan
2017-10-01
As one of the prevalent data-hiding techniques, steganography is defined as the act of concealing secret information imperceptibly in a cover multimedia object encompassing text, image, video or audio, in order to enable interaction between the sender and the receiver in which nobody except the receiver can figure out the secret data. In this approach, a quantum LSB-based steganography method utilizing the Gray code for quantum RGB images is investigated. This method uses the Gray code to accommodate two secret qubits in the 3 LSBs of each pixel simultaneously, according to reference tables. Experimental results, analyzed in the MATLAB environment, show that the present scheme performs well and is more secure and applicable than the previous one found in the literature.
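The Gray-code LSB idea can be illustrated classically. The sketch below is a plain Python analogue of the general principle only, not the paper's quantum scheme or its reference tables; the pixel values and function names are illustrative. Secret bits are taken two at a time, Gray-coded, embedded in pixel LSBs, and the coding is inverted on extraction:

```python
def to_gray(n):
    """Binary -> reflected Gray code."""
    return n ^ (n >> 1)

def from_gray(g):
    """Reflected Gray code -> binary."""
    n = 0
    while g:
        n ^= g
        g >>= 1
    return n

def embed(pixels, secret_bits):
    """Gray-code the secret two bits at a time, then hide one coded bit
    in the least significant bit of each cover pixel."""
    coded = []
    for i in range(0, len(secret_bits), 2):
        pair = (secret_bits[i] << 1) | secret_bits[i + 1]
        g = to_gray(pair)
        coded += [(g >> 1) & 1, g & 1]
    return [(p & ~1) | b for p, b in zip(pixels, coded)]

def extract(stego):
    """Read the LSBs back and undo the Gray coding."""
    bits = [p & 1 for p in stego]
    out = []
    for i in range(0, len(bits), 2):
        pair = from_gray((bits[i] << 1) | bits[i + 1])
        out += [(pair >> 1) & 1, pair & 1]
    return out

cover = [200, 13, 57, 90]     # illustrative pixel intensities
secret = [1, 0, 1, 1]
stego = embed(cover, secret)
print(extract(stego) == secret)  # True: embedding round-trips
```

Because consecutive Gray codes differ in a single bit, small changes to the recovered bits perturb the decoded secret only slightly, which is the usual motivation for Gray coding in LSB schemes.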
NASA Technical Reports Server (NTRS)
Ghosh, Amrit Raj
1996-01-01
The viscous Navier-Stokes solver for turbomachinery applications, MSUTC, has been modified to include the rotating frame formulation. The three-dimensional thin-layer Navier-Stokes equations have been cast in a rotating Cartesian frame, enabling the freezing of grid motion. This also allows the flow field associated with an isolated rotor to be viewed as a steady-state problem. Consequently, local time stepping can be used to accelerate convergence. The formulation is validated by running NASA's Rotor 67 as the test case. Results are compared between the rotating frame code and the absolute frame code. The use of the rotating frame approach greatly enhances the performance of the code with respect to savings in computing time, without degradation of the solution.
Michel, Christian J
2017-04-18
In 1996, a set X of 20 trinucleotides was identified in genes of both prokaryotes and eukaryotes which has on average the highest occurrence in reading frame compared to its two shifted frames. Furthermore, this set X has an interesting mathematical property, as X is a maximal C3 self-complementary trinucleotide circular code. In 2015, by quantifying the inspection approach used in 1996, the circular code X was confirmed in the genes of bacteria and eukaryotes and was also identified in the genes of plasmids and viruses. The method was based on the preferential occurrence of trinucleotides among the three frames at the gene population level. We extend here this definition to the gene level. This new statistical approach considers all the genes, i.e., of large and small lengths, with the same weight for searching the circular code X. As a consequence, the concept of circular code, in particular the reading frame retrieval, is directly associated with each gene. At the gene level, the circular code X is strengthened in the genes of bacteria, eukaryotes, plasmids, and viruses, and is now also identified in the genes of archaea. The genes of mitochondria and chloroplasts contain a subset of the circular code X. Finally, by studying viral genes, the circular code X was found in DNA genomes, RNA genomes, double-stranded genomes, and single-stranded genomes.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sullivan, Edmund J.; Anderson, Michael T.
In May 2010, the NRC issued a proposed notice of rulemaking that includes a provision to add a new section to its rules to require licensees to implement ASME Code Case N-770, ‘‘Alternative Examination Requirements and Acceptance Standards for Class 1 PWR Piping and Vessel Nozzle Butt Welds Fabricated with UNS N06082 or UNS W86182 Weld Filler Material With or Without the Application of Listed Mitigation Activities, Section XI, Division 1,’’ with 15 conditions. Code Case N-770 contains baseline and inservice inspection (ISI) requirements for unmitigated butt welds fabricated with Alloy 82/182 material and preservice and ISI requirements for mitigated butt welds. The NRC stated that application of ASME Code Case N-770 is necessary because the inspections currently required by the ASME Code, Section XI, were not written to address stress corrosion cracking of Alloy 82/182 butt welds, and the safety consequences of inadequate inspections can be significant. The NRC expects to issue the final rule incorporating this code case into its regulations in the spring 2011 time frame. This paper discusses the new examination requirements, the conditions that the NRC is imposing, and the major concerns with implementation of the new Code Case.
Auto-Regulatory RNA Editing Fine-Tunes mRNA Re-Coding and Complex Behaviour in Drosophila
Savva, Yiannis A.; Jepson, James E.C; Sahin, Asli; Sugden, Arthur U.; Dorsky, Jacquelyn S.; Alpert, Lauren; Lawrence, Charles; Reenan, Robert A.
2014-01-01
Auto-regulatory feedback loops are a common molecular strategy used to optimize protein function. In Drosophila many mRNAs involved in neuro-transmission are re-coded at the RNA level by the RNA editing enzyme dADAR, leading to the incorporation of amino acids that are not directly encoded by the genome. dADAR also re-codes its own transcript, but the consequences of this auto-regulation in vivo are unclear. Here we show that hard-wiring or abolishing endogenous dADAR auto-regulation dramatically remodels the landscape of re-coding events in a site-specific manner. These molecular phenotypes correlate with altered localization of dADAR within the nuclear compartment. Furthermore, auto-editing exhibits sexually dimorphic patterns of spatial regulation and can be modified by abiotic environmental factors. Finally, we demonstrate that modifying dAdar auto-editing affects adaptive complex behaviors. Our results reveal the in vivo relevance of auto-regulatory control over post-transcriptional mRNA re-coding events in fine-tuning brain function and organismal behavior. PMID:22531175
The dependence of frequency distributions on multiple meanings of words, codes and signs
NASA Astrophysics Data System (ADS)
Yan, Xiaoyong; Minnhagen, Petter
2018-01-01
The dependence of the frequency distributions due to multiple meanings of words in a text is investigated by deleting letters. By coding the words with fewer letters, the number of meanings per coded word increases. This increase is measured and used as an input in a predictive theory. For a text written in English, the word-frequency distribution is broad and fat-tailed, whereas if the words are only represented by their first letter the distribution becomes exponential. Both distributions are well predicted by the theory, as is the whole sequence obtained by consecutively representing the words by their first L = 6, 5, 4, 3, 2, 1 letters. Comparisons of texts written in Chinese characters and the same texts written in letter-codes are made, and the similarity of the corresponding frequency distributions is interpreted as a consequence of the multiple meanings of Chinese characters. This further implies that the difference in shape between the word-frequency distributions of an English text written in letters and a Chinese text written in Chinese characters is due to the coding and not to the language per se.
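The deletion experiment described above can be sketched in a few lines. In the hedged Python illustration below (the sample sentence is a stand-in, not the paper's corpus), each word is coded by its first L letters; smaller L merges more words into one code, i.e., more meanings per coded word, which reshapes the frequency distribution:

```python
from collections import Counter

# Illustrative mini-text; the paper used full English and Chinese texts.
text = ("the quick brown fox jumps over the lazy dog while the dog "
        "barks at the quick fox").split()

def coded_frequencies(words, L):
    """Frequency distribution when each word is represented by its
    first L letters (L=1 keeps only the first letter)."""
    return Counter(w[:L] for w in words)

for L in (6, 3, 1):
    freqs = coded_frequencies(text, L)
    print(L, len(freqs), freqs.most_common(2))
```

Even on this toy text the vocabulary shrinks as L decreases (11 distinct codes at L = 6, 10 at L = 1, since "brown" and "barks" collapse to "b"), which is exactly the increase in meanings per coded word that feeds the predictive theory.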
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rosenberg, Michael; Jonlin, Duane; Nadel, Steven
Today’s building energy codes focus on prescriptive requirements for features of buildings that are directly controlled by the design and construction teams and verifiable by municipal inspectors. Although these code requirements have had a significant impact, they fail to influence a large slice of the building energy use pie – including not only miscellaneous plug loads, cooking equipment and commercial/industrial processes, but the maintenance and optimization of the code-mandated systems as well. Currently, code compliance is verified only through the end of construction, and there are no limits or consequences for the actual energy use in an occupied building. In the future, our suite of energy regulations will likely expand to include building efficiency, energy use or carbon emission budgets over their full life cycle. Intelligent building systems, extensive renewable energy, and a transition from fossil fuel to electric heating systems will likely be required to meet ultra-low-energy targets. This paper lays out the authors’ perspectives on how buildings may evolve over the course of the 21st century and the roles that codes and regulations will play in shaping those buildings of the future.
Program MAMO: Models for avian management optimization-user guide
Guillaumet, Alban; Paxton, Eben H.
2017-01-01
The following chapters describe the structure and code of MAMO, and walk the reader through running the different components of the program with sample data. This manual should be used alongside a computer running R, so that the reader can copy and paste code into R, observe the output, and follow along interactively. Taken together, chapters 2–4 will allow the user to replicate a simulation study investigating the consequences of climate change and two potential management actions on the population dynamics of a vulnerable and iconic Hawaiian forest bird, the ‘I‘iwi (Drepanis coccinea; hereafter IIWI).
Torres-Montúfar, Alejandro; Borsch, Thomas; Ochoterena, Helga
2018-05-01
The conceptualization and coding of characters is a difficult issue in phylogenetic systematics, no matter which inference method is used when reconstructing phylogenetic trees or whether the characters are simply mapped onto a specific tree. Complex characters are groups of features that can be divided into simpler hierarchical characters (reductive coding), although the implied hierarchical relational information may change depending on the type of coding (composite vs. reductive). To date, there is no consensus on whether characters should be coded as complex or simple. Phylogeneticists have discussed which coding method is best but have not incorporated the heuristic process of reciprocal illumination to evaluate the coding. Composite coding allows testing whether (1) several characters were linked, resulting in a structure described as a complex character or trait, or (2) independently evolving characters resulted in a configuration incorrectly interpreted as a complex character. We propose that complex characters or character states should be decomposed iteratively into simpler characters when the original homology hypothesis is not corroborated by a phylogenetic analysis and the character or character state is retrieved as homoplastic. We tested this approach using the case of fruit types within subfamily Cinchonoideae (Rubiaceae). The iterative reductive coding of characters associated with drupes allowed us to untangle fruit evolution within Cinchonoideae. Our results show that drupes and berries are not homologous. As a consequence, a more precise ontology for the Cinchonoideae drupes is required.
Communication Civility Codes: Positive Communication through the Students' Eyes
ERIC Educational Resources Information Center
Pawlowski, Donna R.
2017-01-01
Courses: Presentational courses such as Public Speaking, Interviewing, Business and Professional, Persuasion, Interpersonal; any course where civility may be promoted in the classroom. Objectives: At the end of this single-class activity, students will have an understanding of civility in order to: (1) identify civility and consequences of…
The evolution of transcriptional regulation in eukaryotes
NASA Technical Reports Server (NTRS)
Wray, Gregory A.; Hahn, Matthew W.; Abouheif, Ehab; Balhoff, James P.; Pizer, Margaret; Rockman, Matthew V.; Romano, Laura A.
2003-01-01
Gene expression is central to the genotype-phenotype relationship in all organisms, and it is an important component of the genetic basis for evolutionary change in diverse aspects of phenotype. However, the evolution of transcriptional regulation remains understudied and poorly understood. Here we review the evolutionary dynamics of promoter, or cis-regulatory, sequences and the evolutionary mechanisms that shape them. Existing evidence indicates that populations harbor extensive genetic variation in promoter sequences, that a substantial fraction of this variation has consequences for both biochemical and organismal phenotype, and that some of this functional variation is sorted by selection. As with protein-coding sequences, rates and patterns of promoter sequence evolution differ considerably among loci and among clades for reasons that are not well understood. Studying the evolution of transcriptional regulation poses empirical and conceptual challenges beyond those typically encountered in analyses of coding sequence evolution: promoter organization is much less regular than that of coding sequences, and sequences required for the transcription of each locus reside at multiple other loci in the genome. Because of the strong context-dependence of transcriptional regulation, sequence inspection alone provides limited information about promoter function. Understanding the functional consequences of sequence differences among promoters generally requires biochemical and in vivo functional assays. Despite these challenges, important insights have already been gained into the evolution of transcriptional regulation, and the pace of discovery is accelerating.
NASA Technical Reports Server (NTRS)
Jones, Jeremy; Grosvenor, Sandy; Wolf, Karl; Li, Connie; Koratkar, Anuradha; Powers, Edward I. (Technical Monitor)
2001-01-01
In the Virtual Observatory (VO), software tools will perform the functions that have traditionally been performed by physical observatories and their instruments. These tools will not be adjuncts to VO functionality but will make up the very core of the VO. Consequently, the tradition of observatory- and system-independent tools serving a small user base is not valid for the VO. For the VO to succeed, we must improve software collaboration and code sharing between projects and groups. A significant goal of the Scientist's Expert Assistant (SEA) project has been promoting effective collaboration and code sharing between groups. During the past three years, the SEA project has been developing prototypes for new observation planning software tools and strategies. Initially funded by the Next Generation Space Telescope, parts of the SEA code have since been adopted by the Space Telescope Science Institute. SEA has also supplied code for SOFIA, the SIRTF planning tools, and the JSky Open Source Java library. The potential benefits of sharing code are clear. The recipient gains functionality for considerably less cost. The provider gains additional developers working with their code. If enough user groups adopt a set of common code and tools, de facto standards can emerge (as demonstrated by the success of the FITS standard). Code sharing also raises a number of challenges related to the management of the code. In this talk, we will review our experiences with SEA, both successes and failures, and offer some lessons learned that may promote further successes in collaboration and re-use.
NASA Technical Reports Server (NTRS)
Korathkar, Anuradha; Grosvenor, Sandy; Jones, Jeremy; Li, Connie; Mackey, Jennifer; Neher, Ken; Obenschain, Arthur F. (Technical Monitor)
2001-01-01
In the Virtual Observatory (VO), software tools will perform the functions that have traditionally been performed by physical observatories and their instruments. These tools will not be adjuncts to VO functionality but will make up the very core of the VO. Consequently, the tradition of observatory- and system-independent tools serving a small user base is not valid for the VO. For the VO to succeed, we must improve software collaboration and code sharing between projects and groups. A significant goal of the Scientist's Expert Assistant (SEA) project has been promoting effective collaboration and code sharing among groups. During the past three years, the SEA project has been developing prototypes for new observation planning software tools and strategies. Initially funded by the Next Generation Space Telescope, parts of the SEA code have since been adopted by the Space Telescope Science Institute. SEA has also supplied code for the SIRTF (Space Infrared Telescope Facility) planning tools, and the JSky Open Source Java library. The potential benefits of sharing code are clear. The recipient gains functionality for considerably less cost. The provider gains additional developers working with their code. If enough user groups adopt a set of common code and tools, de facto standards can emerge (as demonstrated by the success of the FITS standard). Code sharing also raises a number of challenges related to the management of the code. In this talk, we will review our experiences with SEA, both successes and failures, and offer some lessons learned that might promote further successes in collaboration and re-use.
Critical evaluation of reverse engineering tool Imagix 4D!
Yadav, Rashmi; Patel, Ravindra; Kothari, Abhay
2016-01-01
Legacy code is difficult to comprehend. Various commercial reengineering tools are available, each with its own working style and its own inherent capabilities and shortcomings. The focus of the available tools is on visualizing static behavior, not dynamic behavior. This makes the work of people engaged in software product maintenance, code understanding, and reengineering/reverse engineering difficult. Consequently, the need for a comprehensive reengineering/reverse engineering tool arises. We found Imagix 4D useful, as it generates the most pictorial representations in the form of flow charts, flow graphs, class diagrams, metrics and, to a partial extent, dynamic visualizations. We evaluated Imagix 4D with the help of a case study involving several samples of source code. The behavior of the tool was analyzed on multiple small codes and one large code, the gcc C parser. The large-code evaluation was performed to uncover dead code, unstructured code, and the effect of not including required files at the preprocessing level. The decision-density and complexity metrics that Imagix 4D prepares for a large code proved useful in estimating how much reengineering is required. At the outset, Imagix 4D showed limitations in dynamic visualization, flow-chart separation (for large code) and the parsing of loops. The outcome of the evaluation will eventually help in upgrading Imagix 4D and points to the need for full-featured tools in the area of software reengineering/reverse engineering. It will also help the research community, especially those interested in building software reengineering tools.
Recurrent and functional regulatory mutations in breast cancer.
Rheinbay, Esther; Parasuraman, Prasanna; Grimsby, Jonna; Tiao, Grace; Engreitz, Jesse M; Kim, Jaegil; Lawrence, Michael S; Taylor-Weiner, Amaro; Rodriguez-Cuevas, Sergio; Rosenberg, Mara; Hess, Julian; Stewart, Chip; Maruvka, Yosef E; Stojanov, Petar; Cortes, Maria L; Seepo, Sara; Cibulskis, Carrie; Tracy, Adam; Pugh, Trevor J; Lee, Jesse; Zheng, Zongli; Ellisen, Leif W; Iafrate, A John; Boehm, Jesse S; Gabriel, Stacey B; Meyerson, Matthew; Golub, Todd R; Baselga, Jose; Hidalgo-Miranda, Alfredo; Shioda, Toshi; Bernards, Andre; Lander, Eric S; Getz, Gad
2017-07-06
Genomic analysis of tumours has led to the identification of hundreds of cancer genes on the basis of the presence of mutations in protein-coding regions. By contrast, much less is known about cancer-causing mutations in non-coding regions. Here we perform deep sequencing in 360 primary breast cancers and develop computational methods to identify significantly mutated promoters. Clear signals are found in the promoters of three genes. FOXA1, a known driver of hormone-receptor positive breast cancer, harbours a mutational hotspot in its promoter leading to overexpression through increased E2F binding. RMRP and NEAT1, two non-coding RNA genes, carry mutations that affect protein binding to their promoters and alter expression levels. Our study shows that promoter regions harbour recurrent mutations in cancer with functional consequences and that the mutations occur at similar frequencies as in coding regions. Power analyses indicate that more such regions remain to be discovered through deep sequencing of adequately sized cohorts of patients.
NASA Technical Reports Server (NTRS)
Truong, T. K.; Hsu, I. S.; Eastman, W. L.; Reed, I. S.
1987-01-01
It is well known that the Euclidean algorithm or its equivalent, continued fractions, can be used to find the error locator polynomial and the error evaluator polynomial in Berlekamp's key equation needed to decode a Reed-Solomon (RS) code. A simplified procedure is developed and proved to correct erasures as well as errors by replacing the initial condition of the Euclidean algorithm by the erasure locator polynomial and the Forney syndrome polynomial. By this means, the errata locator polynomial and the errata evaluator polynomial can be obtained, simultaneously and simply, by the Euclidean algorithm only. With this improved technique the complexity of time domain RS decoders for correcting both errors and erasures is reduced substantially from previous approaches. As a consequence, decoders for correcting both errors and erasures of RS codes can be made more modular, regular, simple, and naturally suitable for both VLSI and software implementation. An example illustrating this modified decoding procedure is given for a (15, 9) RS code.
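The halting Euclidean division at the heart of this procedure can be sketched in a few lines. The following toy implementation is illustrative only: it works over the prime field GF(101) rather than the GF(2^m) fields used by practical RS decoders, and all function names are invented. It solves Berlekamp's key equation Λ(x)S(x) ≡ Ω(x) (mod x^{2t}) by running the Euclidean remainder sequence on (x^{2t}, S(x)) and stopping as soon as the remainder degree drops below t:

```python
P = 101  # toy prime field GF(101); real RS decoders use GF(2^m)

def trim(a):
    # drop leading zero coefficients (lists are low-degree first)
    while a and a[-1] % P == 0:
        a = a[:-1]
    return a

def deg(a):
    return len(a) - 1  # deg([]) == -1 for the zero polynomial

def add(a, b):
    n = max(len(a), len(b))
    a = a + [0] * (n - len(a)); b = b + [0] * (n - len(b))
    return trim([(x + y) % P for x, y in zip(a, b)])

def mul(a, b):
    if not a or not b:
        return []
    out = [0] * (len(a) + len(b) - 1)
    for i, x in enumerate(a):
        for j, y in enumerate(b):
            out[i + j] = (out[i + j] + x * y) % P
    return trim(out)

def divmod_poly(a, b):
    # polynomial long division over GF(P): returns quotient, remainder
    a = a[:]
    q = [0] * max(1, len(a) - len(b) + 1)
    inv = pow(b[-1], P - 2, P)  # inverse of leading coeff (Fermat)
    while deg(a) >= deg(b):
        d = deg(a) - deg(b)
        c = a[-1] * inv % P
        q[d] = c
        for i, y in enumerate(b):
            a[i + d] = (a[i + d] - c * y) % P
        a = trim(a)
    return trim(q), a

def solve_key_equation(S, t):
    # Euclid on (x^{2t}, S(x)); halt when deg(remainder) < t.
    # The cofactor of S is the locator, the remainder the evaluator.
    r0 = [0] * (2 * t) + [1]   # x^{2t}
    r1 = trim(S[:])
    t0, t1 = [], [1]
    while deg(r1) >= t:
        q, r = divmod_poly(r0, r1)
        r0, r1 = r1, r
        t0, t1 = t1, add(t0, [(-c) % P for c in mul(q, t1)])
    return t1, r1  # (locator, evaluator), up to a common scalar
```

By the Bezout identity r = s·x^{2t} + t1·S maintained by the loop, the returned pair automatically satisfies the key equation, which is what a decoder then feeds into root finding and Forney's formula.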
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jankovsky, Zachary Kyle; Denman, Matthew R.
It is difficult to assess the consequences of a transient in a sodium-cooled fast reactor (SFR) using traditional probabilistic risk assessment (PRA) methods, as numerous safety-related systems have passive characteristics. Often there is significant dependence on the value of continuous stochastic parameters rather than binary success/failure determinations. One form of dynamic PRA uses a system simulator to represent the progression of a transient, tracking events through time in a discrete dynamic event tree (DDET). In order to function in a DDET environment, a simulator must have characteristics that make it amenable to changing physical parameters midway through the analysis. The SAS4A SFR system analysis code did not have these characteristics as received. This report describes the code modifications made to allow dynamic operation as well as the linking to a Sandia DDET driver code. A test case is briefly described to demonstrate the utility of the changes.
NASA Astrophysics Data System (ADS)
Zhang, Baocheng; Teunissen, Peter J. G.; Yuan, Yunbin; Zhang, Xiao; Li, Min
2018-03-01
Sensing the ionosphere with the global positioning system involves two sequential tasks, namely the ionospheric observable retrieval and the ionospheric parameter estimation. A prominent source of error has long been identified as short-term variability in receiver differential code bias (rDCB). We modify the carrier-to-code leveling (CCL), a method commonly used to accomplish the first task, by assuming rDCB to be unlinked in time. Aside from the ionospheric observables, which are affected by, among other things, the rDCB at one reference epoch, the Modified CCL (MCCL) can also provide the rDCB offsets with respect to the reference epoch as by-products. Two consequences arise. First, MCCL is capable of excluding the effects of time-varying rDCB from the ionospheric observables, which, in turn, improves the quality of the ionospheric parameters of interest. Second, MCCL has significant potential as a means to detect between-epoch fluctuations experienced by the rDCB of a single receiver.
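The leveling step that MCCL modifies can be illustrated with a minimal synthetic sketch (all numbers and names below are invented for illustration; real CCL operates on dual-frequency GNSS observables). Classic CCL shifts the precise but ambiguity-biased geometry-free phase by the arc-averaged phase-minus-code offset, so a constant receiver DCB in the code observable is inherited by the leveled ionospheric observable, and a time-varying rDCB would leak into it:

```python
def level_ccl(phase_gf, code_gf):
    """Classic carrier-to-code leveling: remove the phase ambiguity by
    subtracting the arc-averaged phase-minus-code offset."""
    offset = sum(p - c for p, c in zip(phase_gf, code_gf)) / len(phase_gf)
    return [p - offset for p in phase_gf]

# Synthetic arc (illustrative numbers, not real GNSS data)
iono = [5.0 + 0.01 * k for k in range(100)]  # slowly varying slant delay (m)
amb  = 12.3    # constant geometry-free phase ambiguity bias
rdcb = -1.7    # receiver DCB folded into the code observable
phase = [i + amb for i in iono]              # precise, biased by amb
code  = [i + rdcb for i in iono]             # noise omitted for clarity

leveled = level_ccl(phase, code)
# The leveled observable recovers iono + rDCB, not iono alone:
# the (constant) rDCB survives leveling, which is why its short-term
# variability corrupts the retrieved ionospheric observables.
```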
Studies of Planet Formation Using a Hybrid N-Body + Planetesimal Code
NASA Technical Reports Server (NTRS)
Kenyon, Scott J.
2004-01-01
The goal of our proposal was to use a hybrid multi-annulus planetesimal/n-body code to examine the planetesimal theory, one of the two main theories of planet formation. We developed this code to follow the evolution of numerous 1 m to 1 km planetesimals as they collide, merge, and grow into full-fledged planets. Our goal was to apply the code to several well-posed, topical problems in planet formation and to derive observational consequences of the models. We planned to construct detailed models to address two fundamental issues: (1) icy planets: models for icy planet formation will demonstrate how the physical properties of debris disks - including the Kuiper Belt in our solar system - depend on initial conditions and input physics; and (2) terrestrial planets: calculations following the evolution of 1-10 km planetesimals into Earth-mass planets and rings of dust will provide a better understanding of how terrestrial planets form and interact with their environment.
2014-01-01
Background The genome is pervasively transcribed but most transcripts do not code for proteins, constituting non-protein-coding RNAs. Despite increasing numbers of functional reports of individual long non-coding RNAs (lncRNAs), assessing the extent of functionality among the non-coding transcriptional output of mammalian cells remains intricate. In the protein-coding world, transcripts differentially expressed in the context of processes essential for the survival of multicellular organisms have been instrumental in the discovery of functionally relevant proteins and their deregulation is frequently associated with diseases. We therefore systematically identified lncRNAs expressed differentially in response to oncologically relevant processes and cell-cycle, p53 and STAT3 pathways, using tiling arrays. Results We found that up to 80% of the pathway-triggered transcriptional responses are non-coding. Among these we identified very large macroRNAs with pathway-specific expression patterns and demonstrated that these are likely continuous transcripts. MacroRNAs contain elements conserved in mammals and sauropsids, which in part exhibit conserved RNA secondary structure. Comparing evolutionary rates of a macroRNA to adjacent protein-coding genes suggests a local action of the transcript. Finally, in different grades of astrocytoma, a tumor disease unrelated to the initially used cell lines, macroRNAs are differentially expressed. Conclusions It has been shown previously that the majority of expressed non-ribosomal transcripts are non-coding. We now conclude that differential expression triggered by signaling pathways gives rise to a similar abundance of non-coding content. It is thus unlikely that the prevalence of non-coding transcripts in the cell is a trivial consequence of leaky or random transcription events. PMID:24594072
Work-family balance by women GP specialist trainees in Slovenia: a qualitative study.
Petek, Davorina; Gajsek, Tadeja; Petek Ster, Marija
2016-01-28
Women physicians face many challenges while balancing their many roles: doctor, specialist trainee, mother and partner. The most opportune biological time for a woman to start a family coincides with a great deal of demands and requirements at work. In this study we explored the options and capabilities of women GP specialist trainees in coordinating their family and career. This is a phenomenological qualitative study. Ten GP specialist trainees from urban and rural areas were chosen by purposive sampling, and semi-structured in-depth interviews were conducted, recorded, transcribed and analysed using a thematic analysis process. Open coding was performed and a codebook was developed. Finally, codes were reduced by identifying themes, which were compared, interpreted and organised into the highest analytical units, categories. One hundred and fifty-five codes were identified in the analysis and grouped into eleven themes: types, causes and consequences of burdens; work as pleasure and positive attitude toward self; priorities; planning and help; understanding of superiors; and disburdening and change during specialisation. The themes were grouped into four large categories: burdens, empowerment, coordination and needs for improvement. Women specialist trainees encounter intense burdens at work and at home due to the numerous demands and requirements of their specialisation training, compounded by the work-family conflict. Burden and strain have many consequences, of which burnout stands out the most. Nevertheless, work and family life can be reconciled successfully; the key element is the empowerment of women doctors.
The most needed systemic solution is the reinforcement of general practitioners in primary health care and an understanding of the specialisation training scheme that allows more flexible time adaptations of specialist training.
Effect of magnetic island geometry on ECRH/ECCD and consequences to the NTM stabilization dynamics
NASA Astrophysics Data System (ADS)
Chatziantonaki, I.; Tsironis, C.; Isliker, H.; Vlahos, L.
2012-09-01
In the majority of codes that model ECCD-based NTM stabilization, the analysis of the EC propagation and absorption is performed in terms of the axisymmetric magnetic field, ignoring effects due to the island topology. In this paper, we analyze the wave propagation, absorption and current drive in the presence of NTMs, as well as the ECCD-driven island growth, focusing on the effect of the island geometry on the wave deposition. A primary evaluation of the consequences of these effects on the NTM evolution is also made in terms of the modified Rutherford equation.
NASA Astrophysics Data System (ADS)
Caminata, A.; Agostini, M.; Altenmüller, K.; Appel, S.; Bellini, G.; Benziger, J.; Berton, N.; Bick, D.; Bonfini, G.; Bravo, D.; Caccianiga, B.; Calaprice, F.; Cavalcante, P.; Chepurnov, A.; Choi, K.; Cribier, M.; D'Angelo, D.; Davini, S.; Derbin, A.; Di Noto, L.; Drachnev, I.; Durero, M.; Empl, A.; Etenko, A.; Farinon, S.; Fischer, V.; Fomenko, K.; Franco, D.; Gabriele, F.; Gaffiot, J.; Galbiati, C.; Ghiano, C.; Giammarchi, M.; Goeger-Neff, M.; Goretti, A.; Gromov, M.; Hagner, C.; Houdy, T.; Hungerford, E.; Ianni, Aldo; Ianni, Andrea; Jonquères, N.; Jedrzejczak, K.; Kaiser, M.; Kobychev, V.; Korablev, D.; Korga, G.; Kornoukhov, V.; Kryn, D.; Lachenmaier, T.; Lasserre, T.; Laubenstein, M.; Lehnert, B.; Link, J.; Litvinovich, E.; Lombardi, F.; Lombardi, P.; Ludhova, L.; Lukyanchenko, G.; Machulin, I.; Manecki, S.; Maneschg, W.; Marcocci, S.; Maricic, J.; Mention, G.; Meroni, E.; Meyer, M.; Miramonti, L.; Misiaszek, M.; Montuschi, M.; Mosteiro, P.; Muratova, V.; Musenich, R.; Neumair, B.; Oberauer, L.; Obolensky, M.; Ortica, F.; Pallavicini, M.; Papp, L.; Perasso, L.; Pocar, A.; Ranucci, G.; Razeto, A.; Re, A.; Romani, A.; Roncin, R.; Rossi, N.; Schönert, S.; Scola, L.; Semenov, D.; Simgen, H.; Skorokhvatov, M.; Smirnov, O.; Sotnikov, A.; Sukhotin, S.; Suvorov, Y.; Tartaglia, R.; Testera, G.; Thurn, J.; Toropova, M.; Unzhakov, E.; Veyssiere, C.; Vishneva, A.; Vivier, M.; Vogelaar, R. B.; von Feilitzsch, F.; Wang, H.; Weinz, S.; Winter, J.; Wojcik, M.; Wurm, M.; Yokley, Z.; Zaimidoroga, O.; Zavatarelli, S.; Zuber, K.; Zuzel, G.
2016-02-01
Borexino is an unsegmented neutrino detector operating at LNGS in central Italy. The experiment has demonstrated its performance through unprecedented accomplishments in solar neutrino and geoneutrino detection. This performance makes it an ideal tool for a state-of-the-art experiment able to test the existence of sterile neutrinos (the SOX experiment). For both the solar and the SOX analyses, a good understanding of the detector response is fundamental. Consequently, calibration campaigns with radioactive sources have been performed over the years. The calibration data are extremely important for developing an accurate Monte Carlo code, which is used in all the neutrino analyses. The Borexino-SOX calibration techniques and program, and the advances in the detector simulation code in view of the start of SOX data taking, are presented.
Michel, Christian J.
2017-01-01
In 1996, a set X of 20 trinucleotides was identified in genes of both prokaryotes and eukaryotes which has on average the highest occurrence in the reading frame compared to its two shifted frames. Furthermore, this set X has an interesting mathematical property, as X is a maximal C3 self-complementary trinucleotide circular code. In 2015, by quantifying the inspection approach used in 1996, the circular code X was confirmed in the genes of bacteria and eukaryotes and was also identified in the genes of plasmids and viruses. The method was based on the preferential occurrence of trinucleotides among the three frames at the gene population level. We extend this definition here to the gene level. This new statistical approach considers all the genes, i.e., those of large and small lengths, with the same weight when searching for the circular code X. As a consequence, the concept of a circular code, in particular the reading frame retrieval, is directly associated with each gene. At the gene level, the circular code X is strengthened in the genes of bacteria, eukaryotes, plasmids, and viruses, and is now also identified in the genes of archaea. The genes of mitochondria and chloroplasts contain a subset of the circular code X. Finally, by studying viral genes, the circular code X was found in DNA genomes, RNA genomes, double-stranded genomes, and single-stranded genomes. PMID:28420220
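The two structural properties named above, self-complementarity and the C3 (frame-shifted) organization, can be checked mechanically. A small sketch, assuming the 20-trinucleotide set X as reported by Arquès and Michel (1996):

```python
# The 20 trinucleotides of the circular code X (Arquès and Michel, 1996)
X = {"AAC", "AAT", "ACC", "ATC", "ATT", "CAG", "CTC", "CTG", "GAA", "GAC",
     "GAG", "GAT", "GCC", "GGC", "GGT", "GTA", "GTC", "GTT", "TAC", "TTC"}

COMP = str.maketrans("ACGT", "TGCA")

def revcomp(codon):
    """Reverse complement of a trinucleotide (DNA alphabet)."""
    return codon.translate(COMP)[::-1]

# Self-complementary: X is closed under reverse complementation
assert all(revcomp(c) in X for c in X)

# C3 organization: the two circular permutations of X
X1 = {c[1:] + c[:1] for c in X}   # shift reading frame by one base
X2 = {c[2:] + c[:2] for c in X}   # shift reading frame by two bases
# X, X1 and X2 are pairwise disjoint and together cover the 60
# non-periodic trinucleotides (all codons except AAA, CCC, GGG, TTT)
assert len(X | X1 | X2) == 60
```

The circular-code property itself (unique retrieval of the reading frame from any window of concatenated codons) requires a separate combinatorial check and is not verified by this sketch.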
Deforestation and Carbon Loss in Southwest Amazonia: Impact of Brazil's Revised Forest Code
NASA Astrophysics Data System (ADS)
Roriz, Pedro Augusto Costa; Yanai, Aurora Miho; Fearnside, Philip Martin
2017-09-01
In 2012 Brazil's National Congress altered the country's Forest Code, decreasing various environmental protections in the set of regulations governing forests. The changes imply increased deforestation and greenhouse-gas emissions and decreased protection of fragile ecosystems. To ascertain the effects, a simulation was run to the year 2025 for the municipality (county) of Boca do Acre, Amazonas state, Brazil. A baseline scenario considered historical behavior (which did not respect the Forest Code), while two scenarios considered full compliance with the old Forest Code (Law 4771/1965) and the current Code (Law 12,651/2012) regarding the protection of "areas of permanent preservation" (APPs) along the edges of watercourses. The models were parameterized from satellite imagery and simulated using Dinamica-EGO software. Deforestation actors and processes in the municipality were observed in loco in 2012. Carbon emissions and loss of forest by 2025 were computed in the three simulation scenarios. There was a 10% difference in the loss of carbon stock and of forest between the scenarios with the two versions of the Forest Code. The baseline scenario showed the highest loss of carbon stocks and the highest increase in annual emissions. The greatest damage was caused by not protecting wetlands and riparian zones.
Validation of a multi-layer Green's function code for ion beam transport
NASA Astrophysics Data System (ADS)
Walker, Steven; Tweed, John; Tripathi, Ram; Badavi, Francis F.; Miller, Jack; Zeitlin, Cary; Heilbronn, Lawrence
To meet the challenge of future deep space programs, an accurate and efficient engineering code for analyzing shielding requirements against high-energy galactic heavy-ion radiation is needed. Consequently, a new version of the HZETRN code capable of simulating high charge and energy (HZE) ions with either laboratory or space boundary conditions is currently under development. The new code, GRNTRN, is based on a Green's function approach to the solution of Boltzmann's transport equation and, like its predecessor, is deterministic in nature. The computational model consists of the lowest order asymptotic approximation followed by a Neumann series expansion with non-perturbative corrections. The physical description includes energy loss with straggling, nuclear attenuation, and nuclear fragmentation with energy dispersion and downshift. Code validation in the laboratory environment is addressed by showing that GRNTRN accurately predicts energy loss spectra as measured by solid-state detectors in ion beam experiments with multi-layer targets. In order to validate the code with space boundary conditions, measured particle fluences are propagated through several thicknesses of shielding using both GRNTRN and the current version of HZETRN. The excellent agreement obtained indicates that GRNTRN accurately models the propagation of HZE ions in the space environment as well as in laboratory settings and also provides verification of the HZETRN propagator.
Edge-diffraction effects in RCS predictions and their importance in systems analysis
NASA Astrophysics Data System (ADS)
Friess, W. F.; Klement, D.; Ruppel, M.; Stein, Volker
1996-06-01
In developing RCS prediction codes, a variety of physical effects, such as edge diffraction, have to be considered, with the consequence that the computational effort increases considerably. This fact limits the field of application of such codes, especially if the RCS data serve as input parameters for system simulators, which very often need these data for a large number of observation angles and/or frequencies. Conversely, the results of a system analysis can be used to estimate the relevance of physical effects from a system viewpoint and to rank them according to their magnitude. This paper evaluates the importance for systems analysis of RCS predictions that include an edge-diffracted field. A double dihedral with strongly depolarizing behavior and a generic airplane design containing many arbitrarily oriented edges are used as test structures. Data for the scattered field are generated by the RCS computer code SIGMA with and without edge diffraction effects. These data are submitted to the code DORA to determine radar range and radar detectability, and to a SAR simulator code to generate SAR imagery. In both cases special scenarios are assumed. The essential features of the computer codes in their current state are described, and the results are presented and discussed from a systems viewpoint.
The Initial Atmospheric Transport (IAT) Code: Description and Validation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Morrow, Charles W.; Bartel, Timothy James
The Initial Atmospheric Transport (IAT) computer code was developed at Sandia National Laboratories as part of their nuclear launch accident consequences analysis suite of computer codes. The purpose of IAT is to predict the initial puff/plume rise resulting from either a solid rocket propellant or liquid rocket fuel fire. The code generates initial conditions for subsequent atmospheric transport calculations. The IAT code has been compared to two data sets that are appropriate to the design space of space launch accident analyses. The primary model uncertainties are the entrainment coefficients for the extended Taylor model. The Titan 34D accident (1986) was used to calibrate these entrainment settings for a prototypic liquid propellant accident, while the recent Johns Hopkins University Applied Physics Laboratory (JHU/APL, or simply APL) large propellant block tests (2012) were used to calibrate the entrainment settings for prototypic solid propellant accidents. North American Mesoscale (NAM)-formatted weather data profiles are used by IAT to determine the local buoyancy force balance. The IAT comparisons for the APL solid propellant tests illustrate the sensitivity of the plume elevation to the weather profiles; that is, the weather profile is a dominant factor in determining the plume elevation. The IAT code performed remarkably well and is considered validated for neutral weather conditions.
Changes in mitochondrial genetic codes as phylogenetic characters: Two examples from the flatworms
Telford, Maximilian J.; Herniou, Elisabeth A.; Russell, Robert B.; Littlewood, D. Timothy J.
2000-01-01
Shared molecular genetic characteristics other than DNA and protein sequences can provide excellent sources of phylogenetic information, particularly if they are complex and rare and are consequently unlikely to have arisen by chance convergence. We have used two such characters, arising from changes in mitochondrial genetic code, to define a clade within the Platyhelminthes (flatworms), the Rhabditophora. We have sampled 10 distinct classes within the Rhabditophora and find that all have the codon AAA coding for the amino acid Asn rather than the usual Lys and AUA for Ile rather than the usual Met. We find no evidence to support claims that the codon UAA codes for Tyr in the Platyhelminthes rather than the standard stop codon. The Rhabditophora are a very diverse group comprising the majority of the free-living turbellarian taxa and the parasitic Neodermata. In contrast, three other classes of turbellarian flatworm, the Acoela, Nemertodermatida, and Catenulida, have the standard invertebrate assignments for these codons and so are convincingly excluded from the rhabditophoran clade. We have developed a rapid computerized method for analyzing genetic codes and demonstrate the wide phylogenetic distribution of the standard invertebrate code as well as confirming already known metazoan deviations from it (ascidian, vertebrate, echinoderm/hemichordate). PMID:11027335
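The codon reassignments described above (AAA read as Asn instead of the usual Lys, AUA as Ile instead of the usual Met) can be illustrated with a minimal translation sketch; the partial codon tables and the example codon list below are illustrative assumptions, not data from the study.

```python
# Partial codon tables: the standard invertebrate mitochondrial
# assignments versus the rhabditophoran variant described above.
# Only the three codons discussed in the abstract are included.
STANDARD_INVERT_MT = {"AAA": "Lys", "AUA": "Met", "UAA": "Stop"}
RHABDITOPHORAN_MT = {"AAA": "Asn", "AUA": "Ile", "UAA": "Stop"}

def translate(codons, table):
    """Translate a list of RNA codons, halting at a stop codon."""
    peptide = []
    for codon in codons:
        aa = table.get(codon)
        if aa == "Stop":
            break
        peptide.append(aa)
    return peptide

codons = ["AUA", "AAA", "UAA"]
print(translate(codons, STANDARD_INVERT_MT))  # ['Met', 'Lys']
print(translate(codons, RHABDITOPHORAN_MT))   # ['Ile', 'Asn']
```

The same codon string thus yields different peptides under the two codes, which is what makes such rare reassignments useful as shared derived phylogenetic characters.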
Women Faculty Distressed: Descriptions and Consequences of Academic Contrapower Harassment
ERIC Educational Resources Information Center
Lampman, Claudia; Crew, Earl C.; Lowery, Shea D.; Tompkins, Kelley
2016-01-01
Academic contrapower harassment (ACPH) occurs when someone with seemingly less power in an educational setting (e.g., a student) harasses someone more powerful (e.g., a professor). A representative sample of 289 professors from U.S. institutions of higher education described their worst incident with ACPH. Open-ended responses were coded using a…
An Examination of Differences in Consequences of Punishment among PK-12 School Administrators
ERIC Educational Resources Information Center
Randle, Dawn DuBose
2010-01-01
The purpose of this study was to examine the differences in the administering of punishment procedures for violations of a school district's Code of Student Conduct among school-based administrators. Specifically, this study was concerned with the impact of the socio-demographic variables of: gender, years of administrative experience,…
Whose Code Are You Teaching? A Popular Australian Coursebook Unravelled
ERIC Educational Resources Information Center
Ritchie, Annabelle
2005-01-01
The study of curriculum materials is of interest to social researchers seeking to understand the social constructions of reality. All texts embody a number of purposeful choices about how reality is to be represented, and these choices have consequences for what is "foregrounded, backgrounded, placed in the margins, distorted, short-cut,…
Developmental Dyslexia and Explicit Long-Term Memory
ERIC Educational Resources Information Center
Menghini, Deny; Carlesimo, Giovanni Augusto; Marotta, Luigi; Finzi, Alessandra; Vicari, Stefano
2010-01-01
The reduced verbal long-term memory capacities often reported in dyslexics are generally interpreted as a consequence of their deficit in phonological coding. The present study was aimed at evaluating whether the learning deficit exhibited by dyslexics was restricted only to the verbal component of the long-term memory abilities or also involved…
Preparing to "Not" Be a Footballer: Higher Education and Professional Sport
ERIC Educational Resources Information Center
Hickey, Christopher; Kelly, Peter
2008-01-01
In the commercialised and professionalised world of elite sport, issues associated with career pathways and post sporting career options have a particular resonance. In various football codes, an unexpected knock, twist, bend or break can profoundly impact a player's career. In this high risk and high consequence environment, a number of sports…
Smith, David Roy; Hua, Jimeng; Archibald, John M.; Lee, Robert W.
2013-01-01
Organelle DNA is no stranger to palindromic repeats. But never has a mitochondrial or plastid genome been described in which every coding region is part of a distinct palindromic unit. While sequencing the mitochondrial DNA of the nonphotosynthetic green alga Polytomella magna, we uncovered precisely this type of genic arrangement. The P. magna mitochondrial genome is linear and made up entirely of palindromes, each containing 1–7 unique coding regions. Consequently, every gene in the genome is duplicated and in an inverted orientation relative to its partner. And when these palindromic genes are folded into putative stem-loops, their predicted translational start sites are often positioned in the apex of the loop. Gel electrophoresis results support the linear, 28-kb monomeric conformation of the P. magna mitochondrial genome. Analyses of other Polytomella taxa suggest that palindromic mitochondrial genes were present in the ancestor of the Polytomella lineage and lost or retained to various degrees in extant species. The possible origins and consequences of this bizarre genomic architecture are discussed. PMID:23940100
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gauntt, Randall O.; Mattie, Patrick D.; Bixler, Nathan E.
2014-02-01
This paper describes the knowledge advancements from the uncertainty analysis for the State-of-the-Art Reactor Consequence Analyses (SOARCA) unmitigated long-term station blackout accident scenario at the Peach Bottom Atomic Power Station. This work assessed key MELCOR and MELCOR Accident Consequence Code System, Version 2 (MACCS2) modeling uncertainties in an integrated fashion to quantify the relative importance of each uncertain input on potential accident progression, radiological releases, and off-site consequences. This quantitative uncertainty analysis provides measures of the effects on consequences of each of the selected uncertain parameters, both individually and in interaction with other parameters. The results measure the model response (e.g., variance in the output) to uncertainty in the selected input. Investigation into the important uncertain parameters in turn yields insights into important phenomena for accident progression and off-site consequences. This uncertainty analysis confirmed the known importance of some parameters, such as the failure rate of the safety relief valve in accident progression modeling and the dry deposition velocity in off-site consequence modeling. The analysis also revealed some new insights, such as the dependent effect of the cesium chemical form for different accident progressions.
Jelsma, Judith G M; Mertens, Vera-Christina; Forsberg, Lisa; Forsberg, Lars
2015-07-01
Many randomized controlled trials in which motivational interviewing (MI) is a key intervention make no provision for the assessment of treatment fidelity. This methodological shortcoming makes it impossible to distinguish between high- and low-quality MI interventions, and, consequently, to know whether MI provision has contributed to any intervention effects. This article makes some practical recommendations for the collection, selection, coding and reporting of MI fidelity data, as measured using the Motivational Interviewing Treatment Integrity Code. We hope that researchers will consider these recommendations and include MI fidelity measures in future studies. Copyright © 2015 Elsevier Inc. All rights reserved.
Crescenzo-Chaigne, Bernadette; Barbezange, Cyril; Frigard, Vianney; Poulain, Damien; van der Werf, Sylvie
2014-01-01
Exchange of the non-coding regions of the NP segment between type A and type C influenza viruses was used to demonstrate the importance not only of the proximal panhandle, but also of the initial distal panhandle strength in type specificity. Both elements were found to be compulsory to rescue infectious virus by reverse genetics systems. Interestingly, in the type A influenza virus infectious context, the length of the NP segment 5′ NC region, once transcribed into mRNA, was found to impact its translation, and the level of produced NP protein consequently affected the level of viral genome replication. PMID:25268971
Probabilistic evaluation of fuselage-type composite structures
NASA Technical Reports Server (NTRS)
Shiao, Michael C.; Chamis, Christos C.
1992-01-01
A methodology is developed to computationally simulate the uncertain behavior of composite structures. The uncertain behavior includes buckling loads, natural frequencies, displacements, stresses/strains, etc., which are the consequences of the random variation (scatter) of the primitive (independent random) variables at the constituent, ply, laminate, and structural levels. This methodology is implemented in the IPACS (Integrated Probabilistic Assessment of Composite Structures) computer code. A fuselage-type composite structure is analyzed to demonstrate the code's capability. The probability distribution functions of the buckling loads, natural frequencies, displacements, strains, and stresses are computed. The sensitivity of each primitive (independent random) variable to a given structural response is also identified from the analyses.
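The probabilistic approach described above (propagating scatter in primitive random variables into a distribution of a structural response) can be sketched generically with Monte Carlo sampling. The response function, the choice of variables, and the scatter levels below are hypothetical placeholders for illustration, not the IPACS formulation.

```python
import random

# Generic Monte Carlo sketch: scatter in primitive (independent random)
# variables is propagated to a distribution of a structural response.
# The response function is a hypothetical stand-in, not the IPACS model.

def buckling_load(E, t):
    """Hypothetical response: proportional to stiffness times thickness cubed."""
    return E * t**3

random.seed(0)
samples = []
for _ in range(10_000):
    E = random.gauss(70e9, 3.5e9)   # modulus: mean 70 GPa, 5% scatter (assumed)
    t = random.gauss(2e-3, 1e-4)    # ply thickness: mean 2 mm, 5% scatter (assumed)
    samples.append(buckling_load(E, t))

samples.sort()
median = samples[len(samples) // 2]
p01 = samples[len(samples) // 100]  # approximate 1st percentile of the response
print(f"median response: {median:.3e}, 1st percentile: {p01:.3e}")
```

The sorted samples approximate the response's cumulative distribution function; low percentiles of this kind are what a probabilistic code reports as design allowables, and the sensitivity of each input can be probed by perturbing one scatter level at a time.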
Scaling features of noncoding DNA
NASA Technical Reports Server (NTRS)
Stanley, H. E.; Buldyrev, S. V.; Goldberger, A. L.; Havlin, S.; Peng, C. K.; Simons, M.
1999-01-01
We review evidence supporting the idea that the DNA sequence in genes containing noncoding regions is correlated, and that the correlation is remarkably long range--indeed, bases thousands of base pairs apart are correlated. We do not find such long-range correlation in the coding regions of the gene, and we utilize this fact to build a Coding Sequence Finder Algorithm, which uses statistical ideas to locate the coding regions of an unknown DNA sequence. Finally, we briefly describe some recent work adapting to DNA the Zipf approach to analyzing linguistic texts, and the Shannon approach to quantifying the "redundancy" of a linguistic text in terms of a measurable entropy function, reporting that noncoding regions in eukaryotes display a larger redundancy than coding regions. Specifically, we consider the possibility that this result is solely a consequence of nucleotide concentration differences, as first noted by Bonhoeffer and his collaborators. We find that cytosine-guanine (CG) concentration does have a strong "background" effect on redundancy. However, we find that for the purine-pyrimidine binary mapping rule, which is not affected by the difference in CG concentration, the Shannon redundancy for the set of analyzed sequences is larger for noncoding regions compared to coding regions.
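The purine-pyrimidine redundancy measure mentioned above can be sketched as follows. This minimal version computes only the first-order (single-symbol) Shannon redundancy, whereas the cited work also examined higher-order statistics; the input sequences are illustrative, not from the analyzed set.

```python
from math import log2

# Sketch of Shannon redundancy under the purine-pyrimidine binary
# mapping (A,G -> R; C,T -> Y), which is insensitive to CG-concentration
# differences. First-order entropy only; sequences are illustrative.

def shannon_redundancy(seq):
    """Return 1 - H/H_max for the binary purine/pyrimidine mapping of seq."""
    mapped = ["R" if base in "AG" else "Y" for base in seq]
    n = len(mapped)
    h = 0.0
    for sym in ("R", "Y"):
        p = mapped.count(sym) / n
        if p > 0:
            h -= p * log2(p)
    return 1.0 - h  # H_max = 1 bit for a binary alphabet

print(shannon_redundancy("ACGTACGTACGT"))  # 0.0: balanced composition
print(shannon_redundancy("AAAAAAAAGGCT"))  # > 0: biased composition
```

A redundancy of zero means the mapped sequence is maximally unpredictable symbol by symbol; the finding summarized above is that noncoding regions score higher than coding regions even under this CG-insensitive mapping.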
Wallis, Katy L; Malic, Claudia C; Littlewood, Sonia L; Judkins, Keith; Phipps, Alan R
2009-03-01
Coding inpatient episodes plays an important role in determining the financial remuneration of a clinical service, and insufficient or incomplete data may have very significant consequences for its viability. We created a document that improves the coding process in our Burns Centre. At the Yorkshire Regional Burns Centre, an inpatient summary sheet was designed to prospectively record and present essential information on a daily basis for use in the coding process. The level of care was also recorded. A 3-month audit was conducted to assess the efficacy of the new forms. Forty-nine patients were admitted to the Burns Centre, with a mean age of 27.6 years and TBSA ranging from 0.5% to 65%. The total stay in the Burns Centre was 758 days, of which 22% were at level B3-B5 and 39% at level B2. The use of the new discharge document identified potential income of about £500,000 at our local daily tariffs for high dependency and intensive care. The new form is able to ensure a high quality of coding, with a possible direct impact on the financial resources accrued for burn care.
Weatherson, Katie A; McKay, Rhyann; Gainforth, Heather L; Jung, Mary E
2017-10-23
In British Columbia, Canada, a Daily Physical Activity (DPA) policy was mandated that requires elementary school teachers to provide students with opportunities to achieve 30 min of physical activity during the school day. However, the implementation of school-based physical activity policies is influenced by many factors. A theoretical examination of the factors that impede and enhance teachers' implementation of physical activity policies is necessary in order to develop strategies to improve policy practice and achieve desired outcomes. This study used the Theoretical Domains Framework (TDF) to understand teachers' barriers and facilitators to the implementation of the DPA policy in one school district. Additionally, barriers and facilitators were examined and compared according to how the teacher implemented the DPA policy during the instructional school day. Interviews were conducted with thirteen teachers and transcribed verbatim. One researcher performed barrier and facilitator extraction, with double extraction occurring across a third of the interview transcripts by a second researcher. A deductive and inductive analytical approach in a two-stage process was employed whereby barriers and facilitators were deductively coded using TDF domains (content analysis) and analyzed for sub-themes within each domain. Two researchers performed coding. A total of 832 items were extracted from the interview transcripts. Some items were coded into multiple TDF domains, resulting in a total of 1422 observations. The most commonly coded TDF domains, accounting for 75% of the total, were Environmental context and resources (ECR; n = 250), Beliefs about consequences (n = 225), Social influences (n = 193), Knowledge (n = 100), and Intentions (n = 88). Teachers who implemented DPA during instructional time differed from those who relied on non-instructional time in relation to the Goals, Behavioural regulation, Social/professional role and identity, and Beliefs about consequences domains.
Forty-one qualitative sub-themes were identified across the fourteen domains and exemplary quotes were highlighted. Teachers identified barriers and facilitators relating to all TDF domains, with ECR, Beliefs about consequences, Social influences, Knowledge and Intentions being the most often discussed influencers of DPA policy implementation. Use of the TDF to understand the implementation factors can assist with the systematic development of future interventions to improve implementation.
Meydan, Chanan; Bekenstein, Uriya; Soreq, Hermona
2018-01-01
Sepsis and metabolic syndrome (MetS) are both inflammation-related entities with high impact for human health and the consequences of concussions. Both represent imbalanced parasympathetic/cholinergic response to insulting triggers and variably uncontrolled inflammation that indicates shared upstream regulators, including short microRNAs (miRs) and long non-coding RNAs (lncRNAs). These may cross talk across multiple systems, leading to complex molecular and clinical outcomes. Notably, biomedical and RNA-sequencing based analyses both highlight new links between the acquired and inherited pathogenic, cardiac and inflammatory traits of sepsis/MetS. Those include the HOTAIR and MIAT lncRNAs and their targets, such as miR-122, -150, -155, -182, -197, -375, -608 and HLA-DRA. Implicating non-coding RNA regulators in sepsis and MetS may delineate novel high-value biomarkers and targets for intervention.
Computing element evolution towards Exascale and its impact on legacy simulation codes
NASA Astrophysics Data System (ADS)
Colin de Verdière, Guillaume J. L.
2015-12-01
In the light of the current race towards the Exascale, this article highlights the main features of the forthcoming computing elements that will be at the core of next generations of supercomputers. The market analysis, underlying this work, shows that computers are facing a major evolution in terms of architecture. As a consequence, it is important to understand the impacts of those evolutions on legacy codes or programming methods. The problems of dissipated power and memory access are discussed and will lead to a vision of what should be an exascale system. To survive, programming languages had to respond to the hardware evolutions either by evolving or with the creation of new ones. From the previous elements, we elaborate why vectorization, multithreading, data locality awareness and hybrid programming will be the key to reach the exascale, implying that it is time to start rewriting codes.
Raczek, Ewa
2009-01-01
On June 13, 2009, the new Family and Guardianship Code came into effect. Many important modifications were introduced into Chapter I, "Origin of a child", an issue of special importance in the work of a forensic geneticist. Those changes relate not only to contesting fatherhood of both types--the type judged in a lawsuit for denial of fatherhood and the type in which the ineffectiveness of an acknowledgment of paternity is established--but, for the first time, they also provide for maternity testing. The Code defines who, according to Polish law, is a mother to a child and establishes motherhood on this basis. In consequence, the main legal maxim Mater semper certa est, which has existed since ancient Roman times, is now annulled. The paper presents some remarks of an expert witness on the introduced changes.
LDPC product coding scheme with extrinsic information for bit patterned media recording
NASA Astrophysics Data System (ADS)
Jeong, Seongkwon; Lee, Jaejin
2017-05-01
Since the density limit of the current perpendicular magnetic storage system will soon be reached, bit patterned media recording (BPMR) is a promising candidate for the next-generation storage system to achieve an areal density beyond 1 Tb/in². In BPMR, each recording bit is stored in a fabricated magnetic island, and the space between the magnetic islands is nonmagnetic. To approach recording densities of 1 Tb/in², the spacing of the magnetic islands must be less than 25 nm. Consequently, severe inter-symbol interference (ISI) and inter-track interference (ITI) occur, degrading the performance of BPMR. In this paper, we propose a low-density parity check (LDPC) product coding scheme that exploits extrinsic information for BPMR. This scheme shows improved bit error rate performance compared to a scheme in which a single LDPC code is used.
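As a toy illustration of the product-code idea (not the LDPC construction proposed in the paper), the sketch below uses single-parity rows and columns over a small data block: the intersection of a failing row check and a failing column check locates a single bit error.

```python
# Toy product code: a 3x3 data block is extended with row and column
# parity bits. A single bit error is located at the intersection of the
# failing row and column checks. This single-parity sketch only
# illustrates the product structure, not the paper's LDPC scheme.

def encode(block):
    rows = [r + [sum(r) % 2] for r in block]           # append row parities
    cols = [sum(rows[i][j] for i in range(len(rows))) % 2
            for j in range(len(rows[0]))]              # column parities
    return rows + [cols]

def correct_single_error(code):
    bad_rows = [i for i, r in enumerate(code[:-1]) if sum(r) % 2]
    bad_cols = [j for j in range(len(code[0]))
                if sum(code[i][j] for i in range(len(code))) % 2]
    if bad_rows and bad_cols:
        i, j = bad_rows[0], bad_cols[0]
        code[i][j] ^= 1  # flip the bit at the intersection
    return code

data = [[1, 0, 1], [0, 1, 1], [1, 1, 0]]
code = encode(data)
code[1][2] ^= 1                      # inject one bit error
fixed = correct_single_error(code)
print(all(fixed[i][:3] == data[i] for i in range(3)))  # True
```

In the paper's setting the component codes are LDPC codes and the decoders exchange extrinsic information iteratively; this sketch only shows why a two-dimensional code structure helps localize errors under combined ISI and ITI.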
Emergence of Coding and its Specificity as a Physico-Informatic Problem
NASA Astrophysics Data System (ADS)
Wills, Peter R.; Nieselt, Kay; McCaskill, John S.
2015-06-01
We explore the origin-of-life consequences of the view that biological systems are demarcated from inanimate matter by their possession of referential information, which is processed computationally to control choices of specific physico-chemical events. Cells are cybernetic: they use genetic information in processes of communication and control, subjecting physical events to a system of integrated governance. The genetic code is the most obvious example of how cells use information computationally, but the historical origin of the usefulness of molecular information is not well understood. Genetic coding made information useful because it imposed a modular metric on the evolutionary search and thereby offered a general solution to the problem of finding catalysts of any specificity. We use the term "quasispecies symmetry breaking" to describe the iterated process of self-organisation whereby the alphabets of distinguishable codons and amino acids increased, step by step.
Structural Code Considerations for Solar Rooftop Installations.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dwyer, Stephen F.; Dwyer, Brian P.; Sanchez, Alfred
2014-12-01
Residential rooftop solar panel installations are limited in part by the high cost of structural-related code requirements for field installation. Permitting solar installations is difficult because there is a belief among residential permitting authorities that typical residential rooftops may be structurally inadequate to support the additional load associated with a photovoltaic (PV) solar installation. Typical engineering methods used to calculate stresses on a roof structure involve simplifying assumptions that reduce a complex non-linear structure to a basic determinate beam. This method of analysis neglects the composite action of the entire roof structure, yielding a conservative analysis based on a rafter or top chord of a truss. Consequently, the result can be an overly conservative structural assessment. A literature review was conducted to gain a better understanding of the conservative nature of the regulations and codes governing residential construction and the associated structural system calculations.
The Role of Hierarchy in Response Surface Modeling of Wind Tunnel Data
NASA Technical Reports Server (NTRS)
DeLoach, Richard
2010-01-01
This paper is intended as a tutorial introduction to certain aspects of response surface modeling, for the experimentalist who has started to explore these methods as a means of improving productivity and quality in wind tunnel testing and other aerospace applications. A brief review of the productivity advantages of response surface modeling in aerospace research is followed by a description of the advantages of a common coding scheme that scales and centers independent variables. The benefits of model term reduction are reviewed. A constraint on model term reduction with coded factors is described in some detail, which requires such models to be well-formulated, or hierarchical. Examples illustrate the consequences of ignoring this constraint. The implication for automated regression model reduction procedures is discussed, and some opinions formed from the author's experience are offered on coding, model reduction, and hierarchy.
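The coding scheme mentioned above maps each independent variable's low and high levels to -1 and +1 by centering and scaling; a minimal sketch follows, with a made-up angle-of-attack range as the example (not a value from the paper).

```python
# Common coding scheme for response surface modeling: center and scale
# each natural variable so its tested range maps onto [-1, +1].

def code_factor(x, low, high):
    """Map a natural variable on [low, high] to the coded interval [-1, 1]."""
    center = (high + low) / 2.0
    half_range = (high - low) / 2.0
    return (x - center) / half_range

def decode_factor(xc, low, high):
    """Inverse transform: coded value back to natural units."""
    return xc * (high - low) / 2.0 + (high + low) / 2.0

# e.g. a hypothetical angle-of-attack range of -4 to +12 degrees
print(code_factor(-4.0, -4.0, 12.0))   # -1.0 (low level)
print(code_factor(4.0, -4.0, 12.0))    # 0.0 (center of the range)
print(decode_factor(1.0, -4.0, 12.0))  # 12.0 (high level)
```

Coding of this kind puts all regressors on a common scale, which is what makes the magnitudes of fitted coefficients comparable and makes the hierarchy constraint on term reduction meaningful.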
DOE Office of Scientific and Technical Information (OSTI.GOV)
Deslippe, Jack; da Jornada, Felipe H.; Vigil-Fowler, Derek
2016-10-06
We profile and optimize calculations performed with the BerkeleyGW code on the Xeon Phi architecture. BerkeleyGW depends both on hand-tuned critical kernels and on BLAS and FFT libraries. We describe the optimization process and the performance improvements achieved. We discuss a layered parallelization strategy to take advantage of vector-, thread-, and node-level parallelism. We discuss locality changes (including the consequence of the lack of an L3 cache) and effective use of the on-package high-bandwidth memory. We show preliminary results on Knights Landing, including a roofline study of code performance before and after a number of optimizations. We find that the GW method is particularly well-suited for many-core architectures due to the ability to exploit a large amount of parallelism over plane-wave components, band pairs, and frequencies.
The State of Software for Evolutionary Biology.
Darriba, Diego; Flouri, Tomáš; Stamatakis, Alexandros
2018-05-01
With Next Generation Sequencing data being routinely used, evolutionary biology is transforming into a computational science. Thus, researchers have to rely on a growing number of increasingly complex software tools. All widely used core tools in the field have grown considerably, in terms of the number of features as well as lines of code, and consequently also with respect to software complexity. A topic that has received little attention is the software engineering quality of widely used core analysis tools. Software developers appear to rarely assess the quality of their code, and this can have potential negative consequences for end-users. To this end, we assessed the code quality of 16 highly cited and compute-intensive tools mainly written in C/C++ (e.g., MrBayes, MAFFT, SweepFinder, etc.) and Java (BEAST) from the broader area of evolutionary biology that are being routinely used in current data analysis pipelines. Because the software engineering quality of the tools we analyzed is rather unsatisfactory, we provide a list of best practices for improving the quality of existing tools and list techniques that can be deployed for developing reliable, high-quality scientific software from scratch. Finally, we also discuss journal as well as science policy and, more importantly, funding issues that need to be addressed for improving software engineering quality as well as ensuring support for developing new and maintaining existing software. Our intention is to raise the awareness of the community regarding software engineering quality issues and to emphasize the substantial lack of funding for scientific software development.
ERIC Educational Resources Information Center
Trivette, Carol M.; Dunst, Carl J.; Hamby, Deborah W.; O'Herin, Chainey E.
2009-01-01
The effectiveness of four adult learning methods (accelerated learning, coaching, guided design, and just-in-time training) constituted the focus of this research synthesis. Findings reported in "How People Learn" (Bransford et al., 2000) were used to operationally define six adult learning method characteristics, and to code and analyze…
Hirschsprung’s disease (HSCR), a birth defect characterized by variable aganglionosis of the gut, affects about 1 in 5000 births, and is a consequence of abnormal development of neural crest cells, from which enteric ganglia derive. In the companion article in this issue (Shen et...
A Qualitative Study of Immigration Policy and Practice Dilemmas for Social Work Students
ERIC Educational Resources Information Center
Furman, Rich; Langer, Carol L.; Sanchez, Thomas Wayne; Negi, Nalini Junko
2007-01-01
Social policy shapes the infrastructure wherein social work is practiced. However, what happens when a particular social policy is seemingly incongruent with the social work code of ethics? How do social work students conceive and resolve potential practice dilemmas that may arise as a consequence? In this study, the authors explored potential…
Students Behaving Badly: Policies on Weapons Violations in Florida Schools
ERIC Educational Resources Information Center
Dickinson, Wendy B.; Hall, Bruce W.
2003-01-01
This study looks at existing aspects of written school violence policies (Codes of Student Conduct) across large, mid-size, and small school districts in Florida. The aim was to provide a clearer picture of how weapons are defined, and the consequences of their possession, use, or display. Two research areas were addressed: (1) What constitutes a…
NASA Technical Reports Server (NTRS)
Spirkovska, Lilly; Reid, Max B.
1993-01-01
A higher-order neural network (HONN) can be designed to be invariant to changes in scale, translation, and in-plane rotation. Invariances are built directly into the architecture of a HONN and do not need to be learned. Consequently, fewer training passes and a smaller training set are required to learn to distinguish between objects. The size of the input field is limited, however, because of the memory required for the large number of interconnections in a fully connected HONN. By coarse coding the input image, the input field size can be increased to allow the larger input scenes required for practical object recognition problems. We describe a coarse coding technique and present simulation results illustrating its usefulness and its limitations. Our simulations show that a third-order neural network can be trained to distinguish between two objects in a 4096 x 4096 pixel input field independent of transformations in translation, in-plane rotation, and scale in less than ten passes through the training set. Furthermore, we empirically determine the limits of the coarse coding technique in the object recognition domain.
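Coarse coding can be sketched in one dimension: a fine-grained position is represented by the set of wide, overlapping fields that contain it, so far fewer units cover a large input field. The field width, stride, and count below are illustrative assumptions, not the parameters of the HONN study.

```python
# One-dimensional coarse coding sketch: a position x is represented by
# the indices of the overlapping coarse fields that contain it.
# Field geometry here is made up for illustration.

def coarse_code(x, field_size, stride, n_fields):
    """Indices of fields [i*stride, i*stride + field_size) that contain x."""
    return [i for i in range(n_fields)
            if i * stride <= x < i * stride + field_size]

# 10 coarse fields of width 16, offset by 8, cover positions 0..87
print(coarse_code(20, field_size=16, stride=8, n_fields=10))  # [1, 2]
print(coarse_code(21, field_size=16, stride=8, n_fields=10))  # [1, 2]
print(coarse_code(5, field_size=16, stride=8, n_fields=10))   # [0]
```

Because each field is wide, individual fields are cheap to represent, yet the pattern of overlaps still localizes the input; the trade-off, as the abstract notes, is a loss of resolution that sets the limits of the technique.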
Quantum computation with realistic magic-state factories
NASA Astrophysics Data System (ADS)
O'Gorman, Joe; Campbell, Earl T.
2017-03-01
Leading approaches to fault-tolerant quantum computation dedicate a significant portion of the hardware to computational factories that churn out high-fidelity ancillas called magic states. Consequently, efficient and realistic factory design is of paramount importance. Here we present the most detailed resource assessment to date of magic-state factories within a surface code quantum computer, along the way introducing a number of techniques. We show that the block codes of Bravyi and Haah [Phys. Rev. A 86, 052329 (2012), 10.1103/PhysRevA.86.052329] have been systematically undervalued; we track correlated errors both numerically and analytically, providing fidelity estimates without appeal to the union bound. We also introduce a subsystem code realization of these protocols with constant time and low ancilla cost. Additionally, we confirm that magic-state factories have space-time costs that scale as a constant factor of surface code costs. We find that the magic-state factory required for postclassical factoring can be as small as 6.3 million data qubits, ignoring ancilla qubits, assuming 10⁻⁴ error gates and the availability of long-range interactions.
Sources of financial pressure and up coding behavior in French public hospitals.
Georgescu, Irène; Hartmann, Frank G H
2013-05-01
Drawing upon role theory and the literature concerning unintended consequences of financial pressure, this study investigates the effects of health care decision pressure from the hospital's administration and from the professional peer group on physicians' inclination to engage in up coding. We explore two kinds of up coding, information-related and action-related, and develop hypotheses that connect these kinds of data manipulation to the sources of pressure via the intermediate effect of role conflict. Qualitative data from initial interviews with physicians and subsequent questionnaire evidence from 578 physicians in 14 French hospitals suggest that the source of pressure is a relevant predictor of physicians' inclination to engage in data manipulation. We further find that this effect is partly explained by the extent to which these pressures create role conflict. Given the concern about up coding in treatment-based reimbursement systems worldwide, our analysis adds to understanding how the design of the hospital's management control system may enhance this undesired type of behavior. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
Joint sparse coding based spatial pyramid matching for classification of color medical image.
Shi, Jun; Li, Yi; Zhu, Jie; Sun, Haojie; Cai, Yin
2015-04-01
Although color medical images are important in clinical practice, they are usually converted to grayscale for further processing in pattern recognition, resulting in loss of rich color information. The sparse coding based linear spatial pyramid matching (ScSPM) and its variants are popular for grayscale image classification, but cannot extract color information. In this paper, we propose a joint sparse coding based SPM (JScSPM) method for the classification of color medical images. A joint dictionary can represent both the color information in each color channel and the correlation between channels. Consequently, the joint sparse codes calculated from a joint dictionary can carry color information, and therefore this method can easily transform a feature descriptor originally designed for grayscale images to a color descriptor. A color hepatocellular carcinoma histological image dataset was used to evaluate the performance of the proposed JScSPM algorithm. Experimental results show that JScSPM provides significant improvements as compared with the majority voting based ScSPM and the original ScSPM for color medical image classification. Copyright © 2014 Elsevier Ltd. All rights reserved.
CARES/LIFE Software Commercialization
NASA Technical Reports Server (NTRS)
1995-01-01
The NASA Lewis Research Center has entered into a letter agreement with BIOSYM Technologies Inc. (now merged with Molecular Simulations Inc. (MSI)). Under this agreement, NASA will provide a developmental copy of the CARES/LIFE computer program to BIOSYM for evaluation. This computer code predicts the time-dependent reliability of a thermomechanically loaded component. BIOSYM will become familiar with CARES/LIFE, provide results of computations useful in validating the code, evaluate it for potential commercialization, and submit suggestions for improvements or extensions to the code or its documentation. If BIOSYM/Molecular Simulations reaches a favorable evaluation of CARES/LIFE, NASA will enter into negotiations for a cooperative agreement with BIOSYM/Molecular Simulations to further develop the code--adding features such as a user-friendly interface and other improvements. This agreement would give BIOSYM intellectual property rights in the modified codes, which they could protect and then commercialize. NASA would provide BIOSYM with the NASA-developed source codes and would agree to cooperate with BIOSYM in further developing the code. In return, NASA would receive certain use rights in the modified CARES/LIFE program. Presently BIOSYM Technologies Inc. has been involved with integration issues concerning its merger with Molecular Simulations Inc., since both companies used to compete in the computational chemistry market, and to some degree, in the materials market. Consequently, evaluation of the CARES/LIFE software is on hold for a month or two while the merger is finalized. Their interest in CARES continues, however, and they expect to get back to the evaluation by early November 1995.
It's time to make management a true profession.
Khurana, Rakesh; Nohria, Nitin
2008-10-01
In the face of the recent institutional breakdown of trust in business, managers are losing legitimacy. To regain public trust, management needs to become a true profession in much the way medicine and law have, argue Khurana and Nohria of Harvard Business School. True professions have codes, and the meaning and consequences of those codes are taught as part of the formal education required of their members. Through these codes, professional institutions forge an implicit social contract with society: Trust us to control and exercise jurisdiction over an important occupational category, and, in return, we will ensure that the members of our profession are worthy of your trust--that they will not only be competent to perform the tasks entrusted to them, but that they will also conduct themselves with high standards and great integrity. The authors believe that enforcing educational standards and a code of ethics is unlikely to choke entrepreneurial creativity. Indeed, if the field of medicine is any indication, a code may even stimulate creativity. The main challenge in writing a code lies in reaching a broad consensus on the aims and social purpose of management. There are two deeply divided schools of thought. One school argues that management's aim should simply be to maximize shareholder wealth; the other argues that management's purpose is to balance the claims of all the firm's stakeholders. Any code will have to steer a middle course in order to accommodate both the value-creating impetus of the shareholder value concept and the accountability inherent in the stakeholder approach.
The feasibility of QR-code prescription in Taiwan.
Lin, C-H; Tsai, F-Y; Tsai, W-L; Wen, H-W; Hu, M-L
2012-12-01
An ideal health care service is a service system that focuses on patients. Patients in Taiwan have the freedom to fill their prescriptions at any pharmacy contracted with National Health Insurance. Each of these pharmacies uses its own computer system; so far, there are at least ten different systems on the market in Taiwan. Transmitting prescription information from the hospital to the pharmacy accurately and efficiently therefore presents a great challenge. This study used two-dimensional QR codes to capture patients' identification and prescription information from the hospitals, and a webcam to read the QR code and transfer all data to the pharmacy computer system. Two hospitals and 85 community pharmacies participated in the study. During the trial, all participating pharmacies praised the accurate transmission of the prescription information. The contents of QR-code prescriptions from the Taipei area were read efficiently and accurately in pharmacies in the Taichung area (central Taiwan), without software-system or geographic limitations. The QR-code device received a patent (No. M376844, March 2010) from the Intellectual Property Office, Ministry of Economic Affairs, Republic of China. Our trial has proven that QR-code prescriptions can provide community pharmacists with an efficient, accurate and inexpensive means of digitalizing prescription contents. Consequently, pharmacists can offer a better quality of pharmacy service to patients. © 2012 Blackwell Publishing Ltd.
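The core of such a scheme is packing prescription fields into a single machine-readable payload and parsing it back at the pharmacy. A minimal sketch follows; the field names and the `|`/`;` delimiter format are invented for illustration (they are not the format used in the Taiwan trial), and rendering the string as an actual QR symbol is left to a dedicated barcode library.

```python
# Hypothetical payload format: patient_id | hospital | drug;drug;...
# where each drug is name,dose,frequency. Purely illustrative.

def encode_prescription(patient_id, hospital, drugs):
    """drugs: list of (name, dose, frequency) tuples."""
    drug_part = ";".join(f"{n},{d},{f}" for n, d, f in drugs)
    return f"{patient_id}|{hospital}|{drug_part}"

def decode_prescription(payload):
    """Invert encode_prescription at the pharmacy side."""
    patient_id, hospital, drug_part = payload.split("|")
    drugs = [tuple(item.split(",")) for item in drug_part.split(";")]
    return {"patient_id": patient_id, "hospital": hospital, "drugs": drugs}

payload = encode_prescription("A123456789", "TPE-GH",
                              [("amoxicillin", "500mg", "tid")])
record = decode_prescription(payload)
```

Because the payload round-trips losslessly, any of the ten-plus pharmacy systems only needs the decoder, not the hospital's software.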
The "Wow! signal" of the terrestrial genetic code
NASA Astrophysics Data System (ADS)
shCherbak, Vladimir I.; Makukov, Maxim A.
2013-05-01
It has been repeatedly proposed to expand the scope for SETI, and one of the suggested alternatives to radio is the biological medium. Genomic DNA is already used on Earth to store non-biological information. Though smaller in capacity, the genetic code is stronger in noise immunity. The code is a flexible mapping between codons and amino acids, and this flexibility allows modifying the code artificially. But once fixed, the code might stay unchanged over cosmological timescales; in fact, it is the most durable construct known. Therefore it represents an exceptionally reliable storage for an intelligent signature, if that conforms to biological and thermodynamic requirements. As the actual scenario for the origin of terrestrial life is far from being settled, the proposal that it might have been seeded intentionally cannot be ruled out. A statistically strong intelligent-like "signal" in the genetic code is then a testable consequence of such a scenario. Here we show that the terrestrial code displays a thorough precision-type orderliness matching the criteria to be considered an informational signal. Simple arrangements of the code reveal an ensemble of arithmetical and ideographical patterns of the same symbolic language. Accurate and systematic, these underlying patterns appear as a product of precision logic and nontrivial computing rather than of stochastic processes (the null hypothesis that they are due to chance coupled with presumable evolutionary pathways is rejected with P-value < 10^-13). The patterns are profound to the extent that the code mapping itself is uniquely deduced from their algebraic representation. The signal displays readily recognizable hallmarks of artificiality, among which are the symbol of zero, the privileged decimal syntax and semantic symmetries. Moreover, extraction of the signal involves logically straightforward but abstract operations, making the patterns essentially irreducible to any natural origin.
Plausible ways of embedding the signal into the code and possible interpretation of its content are discussed. Overall, while the code is nearly optimized biologically, its limited capacity is used extremely efficiently to pass non-biological information.
Oeschger, Franziska M; Jenal, Ursula
2018-01-01
Codes of conduct have received wide attention as a bottom-up approach to foster responsibility for dual use aspects of life science research within the scientific community. In Switzerland, a series of discussion sessions led by the Swiss Academy of Sciences with over 40 representatives of most Swiss academic life science research institutions has revealed that while a formal code of conduct was considered too restrictive, a bottom-up approach toward awareness raising and education and demonstrating scientists' responsibility toward society was highly welcomed. Consequently, an informational brochure on "Misuse potential and biosecurity in life sciences research" was developed to provide material for further discussions and education.
Surveying multidisciplinary aspects in real-time distributed coding for Wireless Sensor Networks.
Braccini, Carlo; Davoli, Franco; Marchese, Mario; Mongelli, Maurizio
2015-01-27
Wireless Sensor Networks (WSNs), where a multiplicity of sensors observe a physical phenomenon and transmit their measurements to one or more sinks, pertain to the class of multi-terminal source and channel coding problems of Information Theory. In this category, "real-time" coding is often encountered for WSNs, referring to the problem of finding the minimum distortion (according to a given measure), under transmission power constraints, attainable by encoding and decoding functions, with stringent limits on delay and complexity. On the other hand, the Decision Theory approach seeks to determine the optimal coding/decoding strategies or some of their structural properties. Since encoder(s) and decoder(s) possess different information, though sharing a common goal, the setting here is that of Team Decision Theory. A more pragmatic vision rooted in Signal Processing consists of fixing the form of the coding strategies (e.g., to linear functions) and, consequently, finding the corresponding optimal decoding strategies and the achievable distortion, generally by applying parametric optimization techniques. All approaches have a long history of past investigations and recent results. The goal of the present paper is to provide the taxonomy of the various formulations, a survey of the vast related literature, examples from the authors' own research, and some highlights on the inter-play of the different theories.
Genomic mutation consequence calculator.
Major, John E
2007-11-15
The genomic mutation consequence calculator (GMCC) is a tool that will reliably and quickly calculate the consequence of arbitrary genomic mutations. GMCC also reports supporting annotations for the specified genomic region. The particular strength of the GMCC is that it works in genomic space, not simply in spliced transcript space as some similar tools do. Within gene features, GMCC can report on the effects on splice sites, UTRs and coding regions in all isoforms affected by the mutation. A considerable number of genomic annotations are also reported, including genomic conservation score, known SNPs, COSMIC mutations, disease associations and others. The manual interface also offers link-outs to various external databases and resources. In batch mode, GMCC returns a CSV file that can easily be parsed by the end user. GMCC is intended to support the many tumor resequencing efforts, but can be useful to any study investigating genomic mutations.
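The codon-level part of a consequence call can be illustrated with a toy classifier. The sketch below is an assumption-laden simplification, not GMCC's method: it handles only a substitution that falls inside a coding sequence, uses a small subset of the standard codon table, and ignores genomic coordinates, splice sites, UTRs and isoforms, which GMCC handles.

```python
# Small subset of the standard genetic code; '*' denotes a stop codon.
CODON_TABLE = {
    "ATG": "M", "GAA": "E", "GTA": "V", "GGT": "G", "GGC": "G", "TAA": "*",
}

def coding_consequence(cds, pos, alt):
    """Classify a single-base substitution at 0-based CDS position `pos`."""
    mutated = cds[:pos] + alt + cds[pos + 1:]
    codon_idx = pos // 3                     # which codon the base falls in
    ref_aa = CODON_TABLE[cds[codon_idx * 3: codon_idx * 3 + 3]]
    alt_aa = CODON_TABLE[mutated[codon_idx * 3: codon_idx * 3 + 3]]
    if alt_aa == ref_aa:
        return "synonymous"
    if alt_aa == "*":
        return "nonsense"
    return "missense"

# CDS "ATGGAAGGT" = Met-Glu-Gly
assert coding_consequence("ATGGAAGGT", 3, "T") == "nonsense"    # GAA -> TAA
assert coding_consequence("ATGGAAGGT", 4, "T") == "missense"    # GAA -> GTA
assert coding_consequence("ATGGAAGGT", 8, "C") == "synonymous"  # GGT -> GGC
```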
ERIC Educational Resources Information Center
Lyken-Segosebe, Dawn; Min, Yunkyung; Braxton, John M.
2012-01-01
Four-year colleges and universities that espouse teaching as their primary mission bear a responsibility to safeguard the welfare of their students as clients of teaching. This responsibility takes the form of a moral imperative. Faculty members hold considerable autonomy in the professional choices they make in their teaching. As a consequence,…
ERIC Educational Resources Information Center
Powell, Sarah R.; Nurnberger-Haag, Julie
2015-01-01
Research Findings: Teachers and parents often use trade books to introduce or reinforce mathematics concepts. To date, an analysis of the early numeracy content of trade books has not been conducted. Consequently, this study evaluated the properties of numbers and counting within trade books. We coded 160 trade books targeted at establishing early…
Dynamics on Networks of Manifolds
NASA Astrophysics Data System (ADS)
DeVille, Lee; Lerman, Eugene
2015-03-01
We propose a precise definition of a continuous-time dynamical system made up of interacting open subsystems. The interconnections of subsystems are coded by directed graphs. We prove that the appropriate maps of graphs, called graph fibrations, give rise to maps of dynamical systems. Consequently, surjective graph fibrations give rise to invariant subsystems, and injective graph fibrations give rise to projections of dynamical systems.
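A minimal numerical illustration of the invariant-subsystem statement (not the authors' formalism): a two-node graph with symmetric coupling fibers over a one-node graph with a self-loop, so the diagonal {x1 = x2} is an invariant subsystem. The node dynamics `f` below is an arbitrary smooth function chosen for the sketch.

```python
# Hypothetical node dynamics: each node relaxes toward half its input.
def f(x, inp):
    return -x + 0.5 * inp

def step(state, dt=0.01):
    """One forward-Euler step of the two-node coupled system."""
    x1, x2 = state
    return (x1 + dt * f(x1, x2), x2 + dt * f(x2, x1))

state = (1.0, 1.0)          # start on the diagonal x1 == x2
for _ in range(1000):
    state = step(state)
# The diagonal is invariant: the trajectory never leaves x1 == x2,
# exactly as a surjective graph fibration predicts.
```

Starting off the diagonal would generally not stay synchronized; the invariance is a property of the symmetric (fibration-respecting) wiring, not of the particular `f`.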
DOE Office of Scientific and Technical Information (OSTI.GOV)
White, J.E.; Roussin, R.W.; Gilpin, H.
A version of the CRAC2 computer code applicable for use in analyses of consequences and risks of reactor accidents in case work for environmental statements has been implemented for use on the Nuclear Regulatory Commission Data General MV/8000 computer system. Input preparation is facilitated through the use of an interactive computer program which operates on an IBM personal computer. The resulting CRAC2 input deck is transmitted to the MV/8000 by using an error-free file transfer mechanism. To facilitate the use of CRAC2 at NRC, relevant background material on input requirements and model descriptions has been extracted from four reports: "Calculations of Reactor Accident Consequences, Version 2," NUREG/CR-2326 (SAND81-1994); "CRAC2 Model Descriptions," NUREG/CR-2552 (SAND82-0342); "CRAC Calculations for Accident Sections of Environmental Statements," NUREG/CR-2901 (SAND82-1693); and "Sensitivity and Uncertainty Studies of the CRAC2 Computer Code," NUREG/CR-4038 (ORNL-6114). When this background information is combined with instructions on the input processor, this report provides a self-contained guide for preparing CRAC2 input data with a specific orientation toward applications on the MV/8000. 8 refs., 11 figs., 10 tabs.
[Which legal consequences for those who provoke pain to infants?].
Bellieni, C V; Gabbrielli, M; Tataranno, M L; Perrone, S; Buonocore, G
2012-02-01
The advances in perinatal care have led to a significant increase in neonatal survival rate but also to a rise in the number of invasive procedures. Several scientific studies show that newborns are able to feel pain more intensely than adults. Despite this evidence, neonatal pain and the right to appropriate analgesia are systematically underestimated, ignoring the ethical and moral principles of beneficence and non-maleficence. Infants are more susceptible to pain, and prolonged exposure to painful sensations can alter neural development and the response to pain, causing hyperalgesia. Anyone who, through negligence or incompetence, causes pain without using any analgesic procedure should be severely punished. The right to analgesia, a fundamental principle, is fully incorporated in the Italian code of Medical deontology (article 3). The doctor who does not use analgesia in the treatment of newborns can be indicted under the Italian penal code (art. 582 and 583), aggravated by the fact that the victim is an infant, unable to defend himself. To avoid penal consequences, careful education and attention are needed: "pediatric analgesia" should become a basic teaching in universities and in specialization schools; analgesic treatments should be mandatory and annotated in the patient's file even for minor potentially painful procedures.
Protein functional features are reflected in the patterns of mRNA translation speed.
López, Daniel; Pazos, Florencio
2015-07-09
The degeneracy of the genetic code makes it possible for the same amino acid string to be coded by different messenger RNA (mRNA) sequences. These "synonymous mRNAs" may differ largely in a number of aspects related to their overall translational efficiency, such as secondary structure content and availability of the encoded transfer RNAs (tRNAs). Consequently, they may render different yields of the translated polypeptides. These mRNA features related to translation efficiency also play a role locally, resulting in a non-uniform translation speed along the mRNA, which has previously been related to some protein structural features and also used to explain some dramatic effects of "silent" single-nucleotide polymorphisms (SNPs). In this work we perform the first large-scale analysis of the relationship between three experimental proxies of mRNA local translation efficiency and the local features of the corresponding encoded proteins. We found that a number of protein functional and structural features are reflected in the patterns of ribosome occupancy, secondary structure and tRNA availability along the mRNA. One or more of these proxies of translation speed have distinctive patterns around the mRNA regions coding for certain protein local features. In some cases the three patterns follow a similar trend. We also show specific examples where these patterns of translation speed point to the protein's important structural and functional features. This supports the idea that the genome not only codes the protein functional features as sequences of amino acids, but also as subtle patterns of mRNA properties which, probably through local effects on the translation speed, have some consequence on the final polypeptide. These results open the possibility of predicting a protein's functional regions based on a single genomic sequence, and have implications for heterologous protein expression and fine-tuning of protein function.
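One of the local proxies described above can be sketched as a per-codon "speed" profile: assign each codon a weight standing in for tRNA availability and smooth it with a sliding window. The codon weights below are invented for illustration; real analyses derive them from tRNA gene copy numbers or ribosome profiling.

```python
# Made-up codon weights as stand-ins for tRNA availability (0..1).
CODON_WEIGHT = {"ATG": 0.9, "GAA": 0.8, "GGT": 0.4, "CCG": 0.3}

def speed_profile(cds, window=2):
    """Sliding-window mean of codon weights along a coding sequence."""
    codons = [cds[i:i + 3] for i in range(0, len(cds), 3)]
    w = [CODON_WEIGHT[c] for c in codons]
    return [sum(w[i:i + window]) / window for i in range(len(w) - window + 1)]

profile = speed_profile("ATGGAAGGTCCG")
# A declining profile marks a local slowdown, the kind of pattern the
# study relates to protein structural and functional features.
```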
The World Anti-Doping Code: can you have asthma and still be an elite athlete?
2016-01-01
Key points The World Anti-Doping Code (the Code) does place some restrictions on prescribing inhaled β2-agonists, but these can be overcome without jeopardising the treatment of elite athletes with asthma. While the Code permits the use of inhaled glucocorticoids without restriction, oral and intravenous glucocorticoids are prohibited, although a mechanism exists that allows them to be administered for acute severe asthma. Although asthmatic athletes achieved outstanding sporting success during the 1950s and 1960s before any anti-doping rules existed, since the introduction of the Code’s policies on some drugs used to manage asthma, results at the Olympic Games have revealed that athletes with confirmed asthma/airway hyperresponsiveness (AHR) have outperformed their non-asthmatic rivals. It appears that years of intensive endurance training can provoke airway injury, AHR and asthma in athletes without any past history of asthma. Although further research is needed, it appears that these consequences of airway injury may abate in some athletes after they have ceased intensive training. The World Anti-Doping Code (the Code) has not prevented asthmatic individuals from becoming elite athletes. This review examines those sections of the Code that are relevant to respiratory physicians who manage elite and sub-elite athletes with asthma. The restrictions that the Code places or may place on the prescription of drugs to prevent and treat asthma in athletes are discussed. In addition, the means by which respiratory physicians are able to treat their elite asthmatic athlete patients with drugs that are prohibited in sport are outlined, along with some of the pitfalls in such management and how best to prevent or minimise them. PMID:27408633
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ditmars, J.D.; Walbridge, E.W.; Rote, D.M.
1983-10-01
Repository performance assessment is analysis that identifies events and processes that might affect a repository system for isolation of radioactive waste, examines their effects on barriers to waste migration, and estimates the probabilities of their occurrence and their consequences. In 1983 Battelle Memorial Institute's Office of Nuclear Waste Isolation (ONWI) prepared two plans - one for performance assessment for a waste repository in salt and one for verification and validation of performance assessment technology. At the request of the US Department of Energy's Salt Repository Project Office (SRPO), Argonne National Laboratory reviewed those plans and prepared this report to advise SRPO of specific areas where ONWI's plans for performance assessment might be improved. This report presents a framework for repository performance assessment that clearly identifies the relationships among the disposal problems, the processes underlying the problems, the tools for assessment (computer codes), and the data. In particular, the relationships among important processes and 26 model codes available to ONWI are indicated. A common suggestion for computer code verification and validation is the need for specific and unambiguous documentation of the results of performance assessment activities. A major portion of this report consists of status summaries of 27 model codes indicated as potentially useful by ONWI. The code summaries focus on three main areas: (1) the code's purpose, capabilities, and limitations; (2) status of the elements of documentation and review essential for code verification and validation; and (3) proposed application of the code for performance assessment of salt repository systems. 15 references, 6 figures, 4 tables.
Sukanya, Chongthawonsatid
2017-10-01
This study examined the validity of the principal diagnoses on discharge summaries and coding assessments. Data were collected from the National Health Security Office (NHSO) of Thailand in 2015. In total, 118,971 medical records were audited. The sample was drawn from government hospitals and private hospitals covered by the Universal Coverage Scheme in Thailand. Hospitals and cases were selected using NHSO criteria. The validity of the principal diagnoses listed in the "Summary and Coding Assessment" forms was established by comparing data from the discharge summaries with data obtained from medical record reviews, and additionally, by comparing data from the coding assessments with data in the computerized ICD (the database used for reimbursement purposes). The summary assessments had low sensitivities (7.3%-37.9%), high specificities (97.2%-99.8%), low positive predictive values (9.2%-60.7%), and high negative predictive values (95.9%-99.3%). The coding assessments had low sensitivities (31.1%-69.4%), high specificities (99.0%-99.9%), moderate positive predictive values (43.8%-89.0%), and high negative predictive values (97.3%-99.5%). The discharge summaries and coding often contained mistakes, particularly in the categories "Endocrine, nutritional, and metabolic diseases", "Symptoms, signs, and abnormal clinical and laboratory findings not elsewhere classified", "Factors influencing health status and contact with health services", and "Injury, poisoning, and certain other consequences of external causes". The validity of the principal diagnoses on the summary and coding assessment forms was found to be low. The training of physicians and coders must be strengthened to improve the validity of discharge summaries and coding.
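The four validity measures reported above come directly from a 2x2 confusion matrix. The sketch below computes them; the counts are made up for illustration and chosen only to reproduce the qualitative pattern the audit found (low sensitivity and PPV, high specificity and NPV).

```python
def validity(tp, fp, fn, tn):
    """Sensitivity, specificity, PPV and NPV from confusion-matrix counts."""
    return {
        "sensitivity": tp / (tp + fn),  # true positives among actual positives
        "specificity": tn / (tn + fp),  # true negatives among actual negatives
        "ppv": tp / (tp + fp),          # positive predictive value
        "npv": tn / (tn + fn),          # negative predictive value
    }

# Invented counts: 100 records truly carry the diagnosis, 900 do not.
m = validity(tp=30, fp=20, fn=70, tn=880)
```

With these counts, sensitivity is 0.30 and PPV 0.60, while specificity and NPV are both above 0.9, mirroring the summary-assessment pattern.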
The State of Software for Evolutionary Biology
Darriba, Diego; Flouri, Tomáš; Stamatakis, Alexandros
2018-01-01
Abstract With Next Generation Sequencing data being routinely used, evolutionary biology is transforming into a computational science. Thus, researchers have to rely on a growing number of increasingly complex software packages. All widely used core tools in the field have grown considerably, in terms of the number of features as well as lines of code, and consequently also with respect to software complexity. A topic that has received little attention is the software engineering quality of widely used core analysis tools. Software developers appear to rarely assess the quality of their code, and this can have potential negative consequences for end-users. To this end, we assessed the code quality of 16 highly cited and compute-intensive tools mainly written in C/C++ (e.g., MrBayes, MAFFT, SweepFinder, etc.) and Java (BEAST) from the broader area of evolutionary biology that are being routinely used in current data analysis pipelines. Because the software engineering quality of the tools we analyzed is rather unsatisfying, we provide a list of best practices for improving the quality of existing tools and list techniques that can be deployed for developing reliable, high-quality scientific software from scratch. Finally, we also discuss journal as well as science policy and, more importantly, funding issues that need to be addressed for improving software engineering quality as well as ensuring support for developing new and maintaining existing software. Our intention is to raise the awareness of the community regarding software engineering quality issues and to emphasize the substantial lack of funding for scientific software development. PMID:29385525
[Seasonal distribution of clinical case codes (DOC study)].
von Dercks, N; Melz, R; Hepp, P; Theopold, J; Marquass, B; Josten, C
2017-02-01
The German diagnosis-related groups remuneration system (G-DRG) was implemented in 2004; patient-related diagnoses and procedures lead to allocation to specific DRGs. This system includes several codes, such as case mix (CM), case mix index (CMI) and number of cases. The seasonal distribution of these codes, as well as the distribution of diagnoses and DRGs, may have logistical consequences for clinical management. From 2004 to 2013 all the main diagnoses and DRGs for inpatients were recorded. Monthly and seasonal distributions were analyzed using ANOVA. The average monthly number of cases was 265 ± 25, the average CM was 388.50 ± 51.75 and the average CMI was 1.46 ± 0.15, with no significant seasonal differences (p > 0.1). Concussion was the most frequently occurring main diagnosis (3739 cases), followed by fractures of the humeral head (699). Significant distribution differences could be shown for humeral head fractures in monthly (p = 0.018) and seasonal comparisons (p = 0.006), with a maximum in winter. Radius (p = 0.01) and ankle fractures (p ≤ 0.001) also occurred most frequently in winter. Non-bony lesions of the shoulder were significantly less frequent in spring (p = 0.04). The DRGs showed no evidence of monthly or seasonal clustering (p > 0.1). The significant clustering of injuries in specific months and seasons should have logistical consequences (e.g. operating room slots, availability of nursing and anesthesia staff). For a needs assessment, the analysis of main diagnoses is more appropriate than that of DRGs.
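The monthly and seasonal comparisons above rest on the one-way ANOVA F statistic, the ratio of between-group to within-group variance. A minimal sketch follows; the per-season case counts are invented for illustration, and a real analysis would convert F to a p-value via the F distribution (e.g. scipy.stats.f_oneway).

```python
def one_way_anova_f(groups):
    """F statistic: (between-group SS / df) over (within-group SS / df)."""
    all_vals = [v for g in groups for v in g]
    grand = sum(all_vals) / len(all_vals)
    ssb = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
    ssw = sum((v - sum(g) / len(g)) ** 2 for g in groups for v in g)
    df_b = len(groups) - 1
    df_w = len(all_vals) - len(groups)
    return (ssb / df_b) / (ssw / df_w)

# Invented monthly humeral head fracture counts per season.
winter = [12, 15, 14]
summer = [6, 7, 5]
f_stat = one_way_anova_f([winter, summer])
# A large F indicates the seasonal means differ far more than the
# month-to-month noise within each season.
```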
Anticipatory anxiety disrupts neural valuation during risky choice.
Engelmann, Jan B; Meyer, Friederike; Fehr, Ernst; Ruff, Christian C
2015-02-18
Incidental negative emotions unrelated to the current task, such as background anxiety, can strongly influence decisions. This is most evident in psychiatric disorders associated with generalized emotional disturbances. However, the neural mechanisms by which incidental emotions may affect choices remain poorly understood. Here we study the effects of incidental anxiety on human risky decision making, focusing on both behavioral preferences and their underlying neural processes. Although observable choices remained stable across affective contexts with high and low incidental anxiety, we found a clear change in neural valuation signals: during high incidental anxiety, activity in ventromedial prefrontal cortex and ventral striatum showed a marked reduction in (1) neural coding of the expected subjective value (ESV) of risky options, (2) prediction of observed choices, (3) functional coupling with other areas of the valuation system, and (4) baseline activity. At the same time, activity in the anterior insula showed an increase in coding the negative ESV of risky lotteries, and this neural activity predicted whether the risky lotteries would be rejected. This pattern of results suggests that incidental anxiety can shift the focus of neural valuation from possible positive consequences to anticipated negative consequences of choice options. Moreover, our findings show that these changes in neural value coding can occur in the absence of changes in overt behavior. This suggests a possible pathway by which background anxiety may lead to the development of chronic reward desensitization and a maladaptive focus on negative cognitions, as prevalent in affective and anxiety disorders. Copyright © 2015 the authors 0270-6474/15/353085-15$15.00/0.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Weber, Scott; Bixler, Nathan E.; McFadden, Katherine Letizia
In 1973 the U.S. Environmental Protection Agency (EPA) developed SecPop to calculate population estimates to support a study on air quality. The Nuclear Regulatory Commission (NRC) adopted this program to support siting reviews for nuclear power plant construction and license applications. Currently SecPop is used to prepare site data input files for offsite consequence calculations with the MELCOR Accident Consequence Code System (MACCS). SecPop enables the use of site-specific population, land use, and economic data for a polar grid defined by the user. Updated versions of SecPop have been released to use U.S. decennial census population data. SECPOP90 was released in 1997 to use 1990 population and economic data. SECPOP2000 was released in 2003 to use 2000 population data and 1997 economic data. This report describes the current code version, SecPop version 4.3, which uses 2010 population data and both 2007 and 2012 economic data. It is also compatible with 2000 census and 2002 economic data. At the time of this writing, the current version of SecPop is 4.3.0, and that version is described herein. This report contains guidance for the installation and use of the code as well as a description of the theory, models, and algorithms involved. This report contains appendices which describe the development of the 2010 census file, 2007 county file, and 2012 county file. Finally, an appendix is included that describes the validation assessments performed.
Sensemaking, stakeholder discord, and long-term risk communication at a US Superfund site.
Hoover, Anna Goodman
2017-03-01
Risk communication can help reduce exposures to environmental contaminants, mitigate negative health outcomes, and inform community-based decisions about hazardous waste sites. While communication best practices have long guided such efforts, little research has examined unintended consequences arising from such guidelines. As rhetoric informs stakeholder sensemaking, the language used in and reinforced by these guidelines can challenge relationships and exacerbate stakeholder tensions. This study evaluates risk communication at a U.S. Superfund site to identify unintended consequences arising from current risk communication practices. This qualitative case study crystallizes data spanning 6 years from three sources: 1) local newspaper coverage of site-related topics; 2) focus-group transcripts from a multi-year project designed to support future visioning of site use; and 3) published blog entries authored by a local environmental activist. Constant comparative analysis provides the study's analytic foundation, with qualitative data analysis software QSR NVivo 8 supporting a three-step process: 1) provisional coding to identify broad topic categories within datasets, 2) coding occurrences of sensemaking constructs and emergent intra-dataset patterns, and 3) grouping related codes across datasets to examine the relationships among them. Existing risk communication practices at this Superfund site contribute to a dichotomous conceptualization of multiple and diverse stakeholders as members of one of only two categories: the government or the public. This conceptualization minimizes perceptions of capacity, encourages public commitment to stances aligned with a preferred group, and contributes to negative expectations that can become self-fulfilling prophecies. Findings indicate a need to re-examine and adapt risk communication guidelines to encourage more pluralistic understanding of the stakeholder landscape.
Taylor, Jennifer A; Gerwin, Daniel; Morlock, Laura; Miller, Marlene R
2011-12-01
To evaluate the need for triangulating case-finding tools in patient safety surveillance. This study applied four case-finding tools to error-associated patient safety events to identify and characterise the spectrum of events captured by these tools, using puncture or laceration as an example for in-depth analysis. Retrospective hospital discharge data were collected for calendar year 2005 (n=48,418) from a large, urban medical centre in the USA. The study design was cross-sectional and used data linkage to identify the cases captured by each of four case-finding tools. Three case-finding tools (International Classification of Diseases external (E) and nature (N) of injury codes, Patient Safety Indicators (PSI)) were applied to the administrative discharge data to identify potential patient safety events. The fourth tool was Patient Safety Net, a web-based voluntary patient safety event reporting system. The degree of mutual exclusion among detection methods was substantial. For example, when linking puncture or laceration on unique identifiers, out of 447 potential events, 118 were identical between PSI and E-codes, 152 were identical between N-codes and E-codes and 188 were identical between PSI and N-codes. Only 100 events that were identified by PSI, E-codes and N-codes were identical. Triangulation of multiple tools through data linkage captures potential patient safety events most comprehensively. Existing detection tools target patient safety domains differently, and consequently capture different occurrences, necessitating the integration of data from a combination of tools to fully estimate the total burden.
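The triangulation described above is, at its core, linkage of the record sets flagged by each tool on unique identifiers, followed by set algebra on the results. A minimal sketch follows; the record IDs and set memberships are invented for illustration and do not reproduce the study's counts.

```python
# Each case-finding tool flags a set of discharge-record identifiers.
psi     = {"a1", "a2", "a3", "a4"}   # Patient Safety Indicators
e_codes = {"a2", "a3", "b1"}         # ICD external-cause (E) codes
n_codes = {"a3", "a4", "b1", "b2"}   # ICD nature-of-injury (N) codes

union_all = psi | e_codes | n_codes  # events captured by any tool
core      = psi & e_codes & n_codes  # identical across all three tools
psi_only  = psi - e_codes - n_codes  # captured by PSI alone
```

The substantial differences between `union_all` and `core` are exactly why single-tool surveillance undercounts: each tool targets a different slice of the total burden.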
Verification of Gyrokinetic codes: theoretical background and applications
NASA Astrophysics Data System (ADS)
Tronko, Natalia
2016-10-01
In fusion plasmas the strong magnetic field allows the fast gyromotion to be systematically removed from the description of the dynamics, resulting in a considerable model simplification and a gain of computational time. Nowadays, gyrokinetic (GK) codes play a major role in the understanding of the development and saturation of turbulence and in the prediction of the consequent transport. We present a new and generic theoretical framework and specific numerical applications to test the validity and the domain of applicability of existing GK codes. For a sound verification process, the underlying theoretical GK model and the numerical scheme must be considered at the same time, which makes this approach pioneering. At the analytical level, the main novelty consists in using advanced mathematical tools such as the variational formulation of dynamics to systematize the basic equations of GK codes and access the limits of their applicability. Indirect verification of the numerical scheme is proposed via the benchmark process. In this work, specific examples of code verification are presented for two GK codes: the multi-species electromagnetic ORB5 (PIC), and the radially global version of GENE (Eulerian). The proposed methodology can be applied to any existing GK code. We establish a hierarchy of reduced GK Vlasov-Maxwell equations using the generic variational formulation. Then, we derive and include the models implemented in ORB5 and GENE inside this hierarchy. At the computational level, detailed verification of global electromagnetic test cases based on the CYCLONE benchmark is considered, including a parametric β-scan covering the transition from ITG to KBM and the spectral properties at the nominal β value.
Quality of head injury coding from autopsy reports with AIS © 2005 update 2008.
Schick, Sylvia; Humrich, Anton; Graw, Matthias
2018-02-28
Objective: Coding injuries from autopsy reports of traffic accident victims according to the Abbreviated Injury Scale AIS © 2005 update 2008 [1] is quite time consuming. The suspicion arose that many issues leading to discussion between coder and control reader stemmed from information required by the AIS that was not documented in the autopsy reports. To quantify this suspicion, we introduced an AIS-detail-indicator (AIS-DI). Each injury in the AIS Codebook was assigned one letter from A to N indicating its level of detail, and rules were formulated to ensure repeatable assignments. This scheme was applied to a selection of 149 multiply injured traffic fatalities. The frequencies of "not A" codes were calculated for each body region, and we analysed why the most detailed level A had not been coded. As a first finding, the results for the head region are presented. 747 AIS head injury codes were found in 137 traffic fatalities, and 60% of these injuries were coded with an AIS-DI of level A. There are three different explanations for codes of AIS-DI "not A": Group 1, "Missing information in autopsy report" (5%); Group 2, "Clinical data required by AIS" (20%); and Group 3, "AIS system determined" (15%). Groups 1 and 2 affect the ISS in 25 cases. Other body regions might perform differently. The AIS-DI can indicate the quality of the underlying data basis and, depending on the aims of different AIS users, can be a helpful tool for quality checks.
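The frequency analysis described above reduces to tallying AIS-DI letters per body region. A minimal sketch, with invented letter assignments (the real AIS-DI assigns one letter A to N per AIS Codebook entry):

```python
# Hedged sketch of the AIS-DI frequency calculation; the input letters are
# made up for illustration, not taken from the study's 747 head codes.
from collections import Counter

def ais_di_summary(di_letters):
    """Fraction of codes at the most detailed level A, plus the 'not A' count."""
    tally = Counter(di_letters)
    total = sum(tally.values())
    return {"frac_A": tally["A"] / total, "not_A": total - tally["A"]}

summary = ais_di_summary(["A", "A", "B", "A", "C", "N"])
```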
3D Equilibrium Effects Due to RMP Application on DIII-D
DOE Office of Scientific and Technical Information (OSTI.GOV)
S. Lazerson, E. Lazarus, S. Hudson, N. Pablant and D. Gates
2012-06-20
The mitigation and suppression of edge localized modes (ELMs) through application of resonant magnetic perturbations (RMPs) in Tokamak plasmas is a well documented phenomenon [1]. Vacuum calculations suggest the formation of edge islands and stochastic regions when RMPs are applied to the axisymmetric equilibria. Self-consistent calculations of the plasma equilibrium with the VMEC [2] and SPEC [3] codes have been performed for an up-down symmetric shot (142603) in DIII-D. In these codes, a self-consistent calculation of the plasma response due to the RMP coils is calculated. The VMEC code globally enforces the constraints of ideal MHD; consequently, a continuously nested family of flux surfaces is enforced throughout the plasma domain. This approach necessarily precludes the observation of islands or field-line chaos. The SPEC code relaxes the constraints of ideal MHD locally, and allows for islands and field line chaos at or near the rational surfaces. Equilibria with finite pressure gradients are approximated by a set of discrete "ideal-interfaces" at the most irrational flux surfaces and where the strongest pressure gradients are observed. Both the VMEC and SPEC calculations are initialized from EFIT reconstructions of the plasma that are consistent with the experimental pressure and current profiles. A 3D reconstruction using the STELLOPT code, which fits VMEC equilibria to experimental measurements, has also been performed. Comparisons between the equilibria generated by the 3D codes and between STELLOPT and EFIT are presented.
Ross, Jaclyn M.; Girard, Jeffrey M.; Wright, Aidan G.C.; Beeney, Joseph E.; Scott, Lori N.; Hallquist, Michael N.; Lazarus, Sophie A.; Stepp, Stephanie D.; Pilkonis, Paul A.
2016-01-01
Relationships are among the most salient factors affecting happiness and wellbeing for individuals and families. Relationship science has identified the study of dyadic behavioral patterns between couple members during conflict as an important window into relational functioning, with both short-term and long-term consequences. Several methods have been developed for the momentary assessment of behavior during interpersonal transactions. Among these, the most popular is the Specific Affect Coding System (SPAFF), which organizes social behavior into a set of discrete behavioral constructs. This study examines the interpersonal meaning of the SPAFF codes through the lens of interpersonal theory, which uses the fundamental dimensions of Dominance and Affiliation to organize interpersonal behavior. A sample of 67 couples completed a conflict task, which was video recorded and coded using SPAFF and a method for rating momentary interpersonal behavior, the Continuous Assessment of Interpersonal Dynamics (CAID). Actor-partner interdependence models in a multilevel structural equation modeling framework were used to study the covariation of SPAFF codes and CAID ratings. Results showed that a number of SPAFF codes had clear interpersonal signatures, but many did not. Additionally, actor and partner effects for the same codes were strongly consistent with interpersonal theory's principle of complementarity. Thus, findings reveal points of convergence and divergence in the two systems and provide support for central tenets of interpersonal theory. Future directions based on these initial findings are discussed. PMID:27148786
Gyrofluid Modeling of Turbulent, Kinetic Physics
NASA Astrophysics Data System (ADS)
Despain, Kate Marie
2011-12-01
Gyrofluid models of plasma turbulence combine the advantages of fluid models, such as lower dimensionality and well-developed intuition, with those of gyrokinetic models, such as finite Larmor radius (FLR) effects. This allows gyrofluid models to be more tractable computationally while still capturing much of the physics related to the FLR of the particles. We present a gyrofluid model derived to capture the behavior of slow solar wind turbulence and describe the computer code developed to implement the model. In addition, we describe the modifications we made to a gyrofluid model and code that simulate plasma turbulence in tokamak geometries. Specifically, we describe a nonlinear phase-mixing phenomenon, part of the E × B term, that was previously missing from the model. An inherently FLR effect, it plays an important role in predicting turbulent heat flux and diffusivity levels for the plasma. We demonstrate this importance by comparing results from the updated code to previous gyrofluid and gyrokinetic studies. We further explain what would be necessary to couple the updated gyrofluid code, gryffin, to a turbulent transport code, thus allowing gryffin to play a role in predicting profiles for fusion devices such as ITER and in exploring novel fusion configurations. Such a coupling would require the use of Graphical Processing Units (GPUs) to make the modeling process fast enough to be viable. Consequently, we also describe our experience with GPU computing and demonstrate that we are poised to complete a gryffin port to this innovative architecture.
NASA Astrophysics Data System (ADS)
Hamdi, Mazda; Kenari, Masoumeh Nasiri
2013-06-01
We consider a time-hopping based multiple access scheme introduced in [1] for communication over dispersive infrared links and evaluate its performance for correlator and matched filter receivers. In the investigated time-hopping code division multiple access (TH-CDMA) method, the transmitter employs a low-rate convolutional encoder. The bit interval is divided into Nc chips, and the output of the encoder, along with a PN sequence assigned to the user, determines the position of the chip in which the optical pulse is transmitted. We evaluate the multiple access performance of the system for the correlation receiver considering background noise, which is modeled as white Gaussian noise due to its large intensity. For the correlation receiver, the results show that for a fixed processing gain, at high transmit power, where multiple access interference is the dominant effect, performance improves with the coding gain. At low transmit power, however, increasing the coding gain shortens the chip time and consequently increases corruption due to channel dispersion, so there exists an optimum value for the coding gain. For the matched filter, the performance always improves with the coding gain. The results show that the matched filter receiver outperforms the correlation receiver in the considered cases. Our results also show that, for the same bandwidth and bit rate, the proposed system outperforms other multiple access techniques, such as conventional CDMA and time hopping schemes.
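The chip-selection rule described above can be illustrated schematically. The modular combining of encoder output and PN value below is our assumption for illustration, not necessarily the paper's exact mapping.

```python
# Illustrative sketch of time hopping: the bit interval has n_chips slots,
# and the transmitted pulse position is set jointly by the convolutional
# encoder's output symbol and the user's PN sequence value.

def pulse_position(code_symbol, pn_value, n_chips):
    """Chip index in [0, n_chips) selected for the optical pulse."""
    return (code_symbol + pn_value) % n_chips

# One user's hop pattern over four bit intervals (hypothetical values):
positions = [pulse_position(c, p, 8) for c, p in zip([0, 3, 1, 2], [5, 6, 7, 1])]
```

Two users with different PN sequences hop over different chip patterns, which is what keeps their pulses from colliding most of the time.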
Attacks on quantum key distribution protocols that employ non-ITS authentication
NASA Astrophysics Data System (ADS)
Pacher, C.; Abidin, A.; Lorünser, T.; Peev, M.; Ursin, R.; Zeilinger, A.; Larsson, J.-Å.
2016-01-01
We demonstrate how adversaries with large computing resources can break quantum key distribution (QKD) protocols that employ a particular message authentication code suggested previously. This authentication code, featuring low key consumption, is not information-theoretically secure (ITS), since for each message the eavesdropper has intercepted she is able to send a different message from a set of messages that she can calculate by finding collisions of a cryptographic hash function. However, when this authentication code was introduced, it was shown to prevent straightforward man-in-the-middle (MITM) attacks against QKD protocols. In this paper, we prove that the set of messages that collide with any given message under this authentication code contains, with high probability, a message that has small Hamming distance to any other given message. Based on this fact, we present extended MITM attacks against different versions of BB84 QKD protocols using the addressed authentication code; for three protocols, we describe every single action taken by the adversary. For all protocols, the adversary can obtain complete knowledge of the key, and for most protocols her success probability in doing so approaches unity. Since the attacks work against all authentication methods that allow colliding messages to be calculated, the underlying building blocks of the presented attacks expose the potential pitfalls arising as a consequence of non-ITS authentication in QKD post-processing. We propose countermeasures that increase the eavesdropper's demand for computational power, and we also prove necessary and sufficient conditions for upgrading the discussed authentication code to the ITS level.
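The collision-finding step at the heart of such attacks can be illustrated with a toy weak tag: if the tag depends on the message only through a short public digest, any digest collision yields a valid forgery regardless of the key. The truncated hash below is a stand-in, not the authentication code analysed in the paper, and the message strings are invented.

```python
# Toy illustration of why non-ITS authentication invites substitution:
# an adversary with enough computing power brute-forces a different
# message whose (artificially short) digest matches the original's.
import hashlib

def public_digest(message, n_hex=3):
    """Truncated public hash standing in for a weak (non-ITS) tag basis."""
    return hashlib.sha256(message).hexdigest()[:n_hex]

def find_colliding_message(original, n_hex=3):
    """Brute-force a different message colliding under the short digest."""
    target = public_digest(original, n_hex)
    i = 0
    while True:
        candidate = b"forged-%d" % i
        if candidate != original and public_digest(candidate, n_hex) == target:
            return candidate
        i += 1

forged = find_colliding_message(b"abort key sifting")
```

With a 3-hex-digit digest the search space is only 4096 values, so a collision appears almost immediately; an ITS authentication code admits no such precomputable collision set.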
Krystal, John H; Anticevic, Alan; Yang, Genevieve J; Dragoi, George; Driesen, Naomi R; Wang, Xiao-Jing; Murray, John D
2017-05-15
The functional optimization of neural ensembles is central to human higher cognitive functions. When the functions through which neural activity is tuned fail to develop or break down, symptoms and cognitive impairments arise. This review considers ways in which disturbances in the balance of excitation and inhibition might develop and be expressed in cortical networks in association with schizophrenia. This presentation is framed within a developmental perspective that begins with disturbances in glutamate synaptic development in utero. It considers developmental correlates and consequences, including compensatory mechanisms that increase intrinsic excitability or reduce inhibitory tone. It also considers the possibility that these homeostatic increases in excitability have potential negative functional and structural consequences. These negative functional consequences of disinhibition may include reduced working memory-related cortical activity associated with the downslope of the "inverted-U" input-output curve, impaired spatial tuning of neural activity and impaired sparse coding of information, and deficits in the temporal tuning of neural activity and its implication for neural codes. The review concludes by considering the functional significance of noisy activity for neural network function. The presentation draws on computational neuroscience and pharmacologic and genetic studies in animals and humans, particularly those involving N-methyl-D-aspartate glutamate receptor antagonists, to illustrate principles of network regulation that give rise to features of neural dysfunction associated with schizophrenia. While this presentation focuses on schizophrenia, the general principles outlined in the review may have broad implications for considering disturbances in the regulation of neural ensembles in psychiatric disorders. Published by Elsevier Inc.
Effect of Isotope Mass in Simulations of JET H-mode Discharges
NASA Astrophysics Data System (ADS)
Snyder, S. E.; Onjun, T.; Kritz, A. H.; Bateman, G.; Parail, V.
2004-11-01
In JET type-I ELMy H-mode discharges, it is found that the height of the pressure pedestal increases and the frequency of the ELMs decreases with increasing isotope mass. These experimentally observed trends are obtained in these simulations only if the pedestal width increases with isotope mass. Simulations are carried out using the JETTO integrated modeling code with a dynamic model for the H-mode pedestal and the ELMs.(T. Onjun et al, Phys. Plasmas 11 (2004) 1469 and 3006.) The HELENA and MISHKA stability codes are applied to calibrate the stability criteria used to trigger ELM crashes in the JETTO code and to explore possible access to second stability in the pedestal. In the simulations, transport in the pedestal is given by the ion thermal neoclassical diffusivity, which increases with isotope mass. Consequently, as the isotope mass is increased, the pressure gradient and the bootstrap current in the pedestal rebuild more slowly after each ELM crash. Several models are explored in which the pedestal width increases with isotope mass.
Facility Targeting, Protection and Mission Decision Making Using the VISAC Code
NASA Technical Reports Server (NTRS)
Morris, Robert H.; Sulfredge, C. David
2011-01-01
The Visual Interactive Site Analysis Code (VISAC) has been used by DTRA and several other agencies to aid in targeting facilities and to predict the associated collateral effects for the go/no-go mission decision-making process. VISAC integrates the three concepts of target geometric modeling, damage assessment capabilities, and an event/fault tree methodology for evaluating accident/incident consequences. It can analyze a variety of accidents/incidents at nuclear or industrial facilities, ranging from simple component sabotage to an attack with military or terrorist weapons. For nuclear facilities, VISAC predicts the facility damage, estimated downtime, and the amount and timing of any radionuclides released. Used in conjunction with DTRA's HPAC code, VISAC can also analyze the transport and dispersion of the radionuclides, levels of contamination of the surrounding area, and the population at risk. VISAC has also been used by the NRC to aid in the development of protective measures for nuclear facilities that may be subjected to attacks by car/truck bombs.
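The event/fault-tree methodology that VISAC builds on can be sketched as basic probability propagation through AND and OR gates. The gate structure and probabilities below are hypothetical and assume independent basic events; they are not taken from VISAC.

```python
# Minimal fault-tree sketch: an AND gate fails only if all inputs fail;
# an OR gate fails if any input fails (independent basic events assumed).

def and_gate(probs):
    """Probability that all input events occur."""
    p = 1.0
    for x in probs:
        p *= x
    return p

def or_gate(probs):
    """Probability that at least one input event occurs."""
    p_none = 1.0
    for x in probs:
        p_none *= 1.0 - x
    return 1.0 - p_none

# Hypothetical top event: (pump A fails AND pump B fails) OR control failure
p_top = or_gate([and_gate([0.1, 0.2]), 0.05])
```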
Methylation of miRNA genes and oncogenesis.
Loginov, V I; Rykov, S V; Fridman, M V; Braga, E A
2015-02-01
Interaction between microRNA (miRNA) and messenger RNA of target genes at the posttranscriptional level provides fine-tuned dynamic regulation of cell signaling pathways. Each miRNA can be involved in regulating hundreds of protein-coding genes, and, conversely, a number of different miRNAs usually target a structural gene. Epigenetic gene inactivation associated with methylation of promoter CpG-islands is common to both protein-coding genes and miRNA genes. Here, data on functions of miRNAs in development of tumor-cell phenotype are reviewed. Genomic organization of promoter CpG-islands of the miRNA genes located in inter- and intragenic areas is discussed. The literature and our own results on frequency of CpG-island methylation in miRNA genes from tumors are summarized, and data regarding a link between such modification and changed activity of miRNA genes and, consequently, protein-coding target genes are presented. Moreover, the impact of miRNA gene methylation on key oncogenetic processes as well as affected signaling pathways is discussed.
Circular non-coding RNA ANRIL modulates ribosomal RNA maturation and atherosclerosis in humans
Holdt, Lesca M.; Stahringer, Anika; Sass, Kristina; Pichler, Garwin; Kulak, Nils A.; Wilfert, Wolfgang; Kohlmaier, Alexander; Herbst, Andreas; Northoff, Bernd H.; Nicolaou, Alexandros; Gäbel, Gabor; Beutner, Frank; Scholz, Markus; Thiery, Joachim; Musunuru, Kiran; Krohn, Knut; Mann, Matthias; Teupser, Daniel
2016-01-01
Circular RNAs (circRNAs) are broadly expressed in eukaryotic cells, but their molecular mechanism in human disease remains obscure. Here we show that circular antisense non-coding RNA in the INK4 locus (circANRIL), which is transcribed at a locus of atherosclerotic cardiovascular disease on chromosome 9p21, confers atheroprotection by controlling ribosomal RNA (rRNA) maturation and modulating pathways of atherogenesis. CircANRIL binds to pescadillo homologue 1 (PES1), an essential 60S-preribosomal assembly factor, thereby impairing exonuclease-mediated pre-rRNA processing and ribosome biogenesis in vascular smooth muscle cells and macrophages. As a consequence, circANRIL induces nucleolar stress and p53 activation, resulting in the induction of apoptosis and inhibition of proliferation, which are key cell functions in atherosclerosis. Collectively, these findings identify circANRIL as a prototype of a circRNA regulating ribosome biogenesis and conferring atheroprotection, thereby showing that circularization of long non-coding RNAs may alter RNA function and protect from human disease. PMID:27539542
Nonlinear ship waves and computational fluid dynamics
MIYATA, Hideaki; ORIHARA, Hideo; SATO, Yohei
2014-01-01
Research works undertaken in the first author’s laboratory at the University of Tokyo over the past 30 years are highlighted. Finding of the occurrence of nonlinear waves (named Free-Surface Shock Waves) in the vicinity of a ship advancing at constant speed provided the start-line for the progress of innovative technologies in the ship hull-form design. Based on these findings, a multitude of the Computational Fluid Dynamic (CFD) techniques have been developed over this period, and are highlighted in this paper. The TUMMAC code has been developed for wave problems, based on a rectangular grid system, while the WISDAM code treats both wave and viscous flow problems in the framework of a boundary-fitted grid system. These two techniques are able to cope with almost all fluid dynamical problems relating to ships, including the resistance, ship’s motion and ride-comfort issues. Consequently, the two codes have contributed significantly to the progress in the technology of ship design, and now form an integral part of the ship-designing process. PMID:25311139
NASA Astrophysics Data System (ADS)
Araya, Mussie K.; Brownell, William E.
2015-12-01
Hearing requires precise detection and coding of acoustic signals by the inner ear and equally precise communication of the information through the auditory brainstem. A membrane based motor in the outer hair cell lateral wall contributes to the transformation of sound into a precise neural code. Structural, molecular and energetic similarities between the outer hair cell and auditory brainstem neurons suggest that a similar membrane based motor may contribute to signal processing in the auditory CNS. Cooperative activation of voltage gated ion channels enhances neuronal temporal processing and increases the upper frequency limit for phase locking. We explore the possibility that membrane mechanics contribute to ion channel cooperativity as a consequence of the nearly instantaneous speed of electromechanical signaling and the fact that membrane composition and mechanics modulate ion channel function.
Nummer, Brian A
2013-11-01
Kombucha is a fermented beverage made from brewed tea and sugar. The taste is slightly sweet and acidic and it may have residual carbon dioxide. Kombucha is consumed in many countries as a health beverage and it is gaining in popularity in the U.S. Consequently, many retailers and food service operators are seeking to brew this beverage on site. As a fermented beverage, kombucha would be categorized in the Food and Drug Administration model Food Code as a specialized process and would require a variance with submission of a food safety plan. This special report was created to assist both operators and regulators in preparing or reviewing a kombucha food safety plan.
Ethics in Science: The Unique Consequences of Chemistry.
Kovac, Jeffrey
2015-01-01
This article discusses the ethical issues unique to the science and practice of chemistry. These issues arise from chemistry's position in the middle between the theoretical and the practical, a science concerned with molecules that are of the right size to directly affect human life. Many of the issues are raised by the central activity of chemistry--synthesis. Chemists make thousands of new substances each year. Many are beneficial, but others are threats. Since the development of the chemical industry in the nineteenth century, chemistry has contributed to the deterioration of the environment but has also helped to reduce pollution. Finally, we discuss the role of codes of ethics and whether the current codes of conduct for chemists are adequate for the challenges of today's world.
CoreTSAR: Core Task-Size Adapting Runtime
Scogland, Thomas R. W.; Feng, Wu-chun; Rountree, Barry; ...
2014-10-27
Heterogeneity continues to increase at all levels of computing, with the rise of accelerators such as GPUs, FPGAs, and other co-processors into everything from desktops to supercomputers. As a consequence, efficiently managing such disparate resources has become increasingly complex. CoreTSAR seeks to reduce this complexity by adaptively worksharing parallel-loop regions across compute resources without requiring any transformation of the code within the loop. Our results show performance improvements of up to three-fold over a current state-of-the-art heterogeneous task scheduler as well as linear performance scaling from a single GPU to four GPUs for many codes. In addition, CoreTSAR demonstrates a robust ability to adapt to both a variety of workloads and underlying system configurations.
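A minimal sketch of the adaptive worksharing idea, assuming loop iterations are split in proportion to each device's measured throughput; the device names and rates are invented, and CoreTSAR's actual scheduler is more sophisticated than this.

```python
# Hedged sketch: divide the next parallel loop's iterations across devices
# proportionally to observed throughput (iterations per second), giving any
# integer-rounding remainder to the fastest device.

def split_iterations(total, throughputs):
    """Return per-device iteration counts proportional to throughput."""
    rate_sum = sum(throughputs.values())
    shares = {dev: int(total * rate / rate_sum) for dev, rate in throughputs.items()}
    leftover = total - sum(shares.values())
    shares[max(throughputs, key=throughputs.get)] += leftover
    return shares

# Hypothetical measured rates after a warm-up pass:
shares = split_iterations(1000, {"cpu": 1.0, "gpu0": 4.0, "gpu1": 5.0})
```

Re-measuring throughput each pass and re-splitting is what lets such a scheme track both workload changes and system configuration differences.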
Coulomb effects in low-energy nuclear fragmentation
NASA Technical Reports Server (NTRS)
Wilson, John W.; Chun, Sang Y.; Badavi, Francis F.; John, Sarah
1993-01-01
Early versions of the Langley nuclear fragmentation code NUCFRAG (and a publicly released version called HZEFRG1) assumed straight-line trajectories throughout the interaction. As a consequence, NUCFRAG and HZEFRG1 give unrealistic cross sections for large mass removal from the projectile and target at low energies. A correction for the distortion of the trajectory by the nuclear Coulomb fields is used to derive fragmentation cross sections. A simple energy-loss term is applied to estimate the energy downshifts that greatly alter the Coulomb trajectory at low energy. The results, which are far more realistic than prior versions of the code, should provide the data base for future transport calculations. The systematic behavior of charge-removal cross sections compares favorably with results from low-energy experiments.
Progress in theoretical and numerical modeling of RF/MHD coupling using NIMROD
NASA Astrophysics Data System (ADS)
Jenkins, Thomas G.; Schnack, Dalton D.; Hegna, Chris C.; Callen, James D.; Sovinec, Carl R.; Held, Eric D.; Ji, Jeong-Young; Kruger, Scott E.
2007-11-01
Preliminary work relevant to the development of a general framework for the self-consistent inclusion of RF effects in fluid codes is presented; specifically, the stabilization of neoclassical and conventional tearing modes by electron cyclotron current drive is considered. For this particular problem, the effects of the RF drive can be formally captured by a quasilinear diffusion operator which enters the fluid equations on the same footing as the collision operator. Furthermore, a Chapman-Enskog-like method can be used to determine the consequent effects of the RF drive on the fluid closures for the parallel heat flow and stress. We summarize our recent research along these lines and discuss issues relevant to its implementation in the NIMROD code.
Martín, Javier; Samoilovich, Elena; Dunn, Glynis; Lackenby, Angie; Feldman, Esphir; Heath, Alan; Svirchevskaya, Ekaterina; Cooper, Gill; Yermalovich, Marina; Minor, Philip D
2002-11-01
The isolation of a capsid intertypic poliovirus recombinant from a child with vaccine-associated paralytic poliomyelitis is described. Virus 31043 had a Sabin-derived type 3-type 2-type 1 recombinant genome with a 5'-end crossover point within the capsid coding region. The result was a poliovirus chimera containing the entire coding sequence for antigenic site 3a derived from the Sabin type 2 strain. The recombinant virus showed altered antigenic properties but did not acquire type 2 antigenic characteristics. The significance of the presence in nature of such poliovirus chimeras and the consequences for the current efforts to detect potentially dangerous vaccine-derived poliovirus strains are discussed in the context of the global polio eradication initiative.
Non-coding RNA networks in cancer.
Anastasiadou, Eleni; Jacob, Leni S; Slack, Frank J
2018-01-01
Thousands of unique non-coding RNA (ncRNA) sequences exist within cells. Work from the past decade has altered our perception of ncRNAs from 'junk' transcriptional products to functional regulatory molecules that mediate cellular processes including chromatin remodelling, transcription, post-transcriptional modifications and signal transduction. The networks in which ncRNAs engage can influence numerous molecular targets to drive specific cell biological responses and fates. Consequently, ncRNAs act as key regulators of physiological programmes in developmental and disease contexts. Particularly relevant in cancer, ncRNAs have been identified as oncogenic drivers and tumour suppressors in every major cancer type. Thus, a deeper understanding of the complex networks of interactions that ncRNAs coordinate would provide a unique opportunity to design better therapeutic interventions.
Studies of Planet Formation using a Hybrid N-body + Planetesimal Code
NASA Technical Reports Server (NTRS)
Kenyon, Scott J.; Bromley, Benjamin C.; Salamon, Michael (Technical Monitor)
2005-01-01
The goal of our proposal was to use a hybrid multi-annulus planetesimal/n-body code to examine the planetesimal theory, one of the two main theories of planet formation. We developed this code to follow the evolution of numerous 1 m to 1 km planetesimals as they collide, merge, and grow into full-fledged planets. Our goal was to apply the code to several well-posed, topical problems in planet formation and to derive observational consequences of the models. We planned to construct detailed models to address two fundamental issues: 1) icy planets - models for icy planet formation will demonstrate how the physical properties of debris disks, including the Kuiper Belt in our solar system, depend on initial conditions and input physics; and 2) terrestrial planets - calculations following the evolution of 1-10 km planetesimals into Earth-mass planets and rings of dust will provide a better understanding of how terrestrial planets form and interact with their environment. During the past year, we made progress on each issue. Papers published in 2004 are summarized. Summaries of work to be completed during the first half of 2005 and work planned for the second half of 2005 are included.
On Applicability of Network Coding Technique for 6LoWPAN-based Sensor Networks.
Amanowicz, Marek; Krygier, Jaroslaw
2018-05-26
In this paper, the applicability of the network coding technique to 6LoWPAN-based multihop sensor networks is examined. 6LoWPAN is one of the standards proposed for the Internet of Things architecture, so we can expect significant growth of traffic in such networks, which can lead to overload and a decrease in sensor network lifetime. The authors propose an inter-session network coding mechanism that can be implemented in resource-limited sensor motes. The solution reduces the overall traffic in the network and, in consequence, decreases energy consumption. The procedures used take into account the deep header compression of native 6LoWPAN packets and the hop-by-hop changes of the header structure. The applied simplifications reduce the signaling traffic that typically occurs in network coding deployments, preserving the solution's usefulness for wireless sensor networks with limited resources. The authors validate the proposed procedures in terms of end-to-end packet delay, packet loss ratio, traffic in the air, total energy consumption, and network lifetime. The solution has been tested in a real wireless sensor network. The results confirm the efficiency of the proposed technique, mostly in delay-tolerant sensor networks.
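The traffic reduction rests on the classic inter-session XOR idea, which the paper adapts to compressed 6LoWPAN headers. A minimal sketch with hypothetical packet contents:

```python
# Classic inter-session network coding at a relay: instead of forwarding
# packets A and B separately, broadcast A XOR B once; each neighbour
# recovers the packet it is missing by XORing with the one it already holds.

def xor_packets(a, b):
    """Bytewise XOR of two equal-length packets."""
    return bytes(x ^ y for x, y in zip(a, b))

pkt_a, pkt_b = b"\x01\x02\x03", b"\x0a\x0b\x0c"
coded = xor_packets(pkt_a, pkt_b)        # one broadcast instead of two unicasts
recovered_b = xor_packets(coded, pkt_a)  # node holding A recovers B
recovered_a = xor_packets(coded, pkt_b)  # node holding B recovers A
```

Saving one transmission per coded pair is the mechanism by which overall traffic, and hence energy consumption, drops.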
The Analysis of Design of Robust Nonlinear Estimators and Robust Signal Coding Schemes.
1982-09-16
Environmental Compliance Assessment Army Reserve (ECAAR)
1993-09-01
…and water; spent mixed acid; spent caustic; spent sulfuric acid. Potential Consequences: heat generation, violent reaction. Group 2-A / Group 2-B: Aluminum, Any…methane reforming furnaces, pulping liquor recovery furnaces, combustion devices used in the recovery of sulfur values from spent sulfuric acid…Industry and USEPA Hazardous Waste, Hazard No., Hazardous Waste Code: Generic F001, the spent halogenated solvents used in degreasing: Trichloroethylene, (t
DoD Resource Augmentation for Civilian Consequence Management (DRACCM) Tool
2015-07-01
staffing availabilities for the nine regions. Finally, we added options to view IDAC data that had included school closings, vaccinations, antivirals…there is enough critical medical resource at that hospital for a given day. A hospital icon coded yellow means that at least one critical medical…tularemia, Q fever, SEB, anthrax, plague (with contagion), VEE, botulism, brucellosis, glanders, smallpox (with contagion), influenza, cesium, sarin, VX
Geoethics: what can we learn from existing bio-, ecological, and engineering ethics codes?
NASA Astrophysics Data System (ADS)
Kieffer, Susan W.; Palka, John
2014-05-01
Many scientific disciplines are concerned about ethics, and codes of ethics for these professions exist, generally through the professional scientific societies such as the American Geophysical Union (AGU), American Geological Institute (AGI), American Association of Petroleum Engineers (AAPE), National Society of Professional Engineers (NSPE), Ecological Society of America (ESA), and many others worldwide. These vary considerably in depth and specificity. In this poster, we review existing codes with the goal of extracting fundamentals that should/can be broadly applied to all geo-disciplines. Most of these codes elucidate a set of principles that cover practical issues such as avoiding conflict of interest, avoiding plagiarism, not permitting illegitimate use of intellectual products, enhancing the prestige of the profession, acknowledging an obligation to perform services only in areas of competence, issuing public statements only in an objective manner, holding paramount the welfare of the public, and in general conducting oneself honorably, responsibly, and lawfully. It is striking that, given that the work of these societies and their members is relevant to the future of the earth, few discuss in any detail ethical obligations regarding our relation to the planet itself. The AGU code, for example, only states that "Members have an ethical obligation to weigh the societal benefits of their research against the costs and risks to human and animal welfare and impacts on the environment and society." The NSPE and AGI codes go somewhat further: "Engineers are encouraged to adhere to the principles of sustainable development in order to protect the environment for future generations," and "Geoscientists should strive to protect our natural environment. They should understand and anticipate the environmental consequences of their work and should disclose the consequences of recommended actions. 
They should acknowledge that resource extraction and use are necessary to the existence of our society and that such should be undertaken in an environmentally and economically responsible manner." However, statements such as these still focus primarily on the value of the earth to generations of humans, rather than on the earth itself. They remain far from addressing our obligation to the land as summarized, for example, by Aldo Leopold, widely regarded as the principal founder of the American conservation movement: "The individual is a member of a community of interdependent parts. The land ethic simply enlarges the boundaries of the community to include soils, waters, plants and animals, or collectively the land." In this poster, we compare and contrast the various existing codes and suggest ways in which ethical obligations to the community itself, as defined by Leopold, could be more clearly incorporated.
Code manual for CONTAIN 2.0: A computer code for nuclear reactor containment analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Murata, K.K.; Williams, D.C.; Griffith, R.O.
1997-12-01
The CONTAIN 2.0 computer code is an integrated analysis tool used for predicting the physical conditions, chemical compositions, and distributions of radiological materials inside a containment building following the release of material from the primary system in a light-water reactor accident. It can also predict the source term to the environment. CONTAIN 2.0 is intended to replace the earlier CONTAIN 1.12, which was released in 1991. The purpose of this Code Manual is to provide full documentation of the features and models in CONTAIN 2.0. Besides complete descriptions of the models, this Code Manual provides a complete description of the input and output from the code. CONTAIN 2.0 is a highly flexible and modular code that can run problems that are either quite simple or highly complex. An important aspect of CONTAIN is that the interactions among thermal-hydraulic phenomena, aerosol behavior, and fission product behavior are taken into account. The code includes atmospheric models for steam/air thermodynamics, intercell flows, condensation/evaporation on structures and aerosols, aerosol behavior, and gas combustion. It also includes models for reactor cavity phenomena such as core-concrete interactions and coolant pool boiling. Heat conduction in structures, fission product decay and transport, radioactive decay heating, and the thermal-hydraulic and fission product decontamination effects of engineered safety features are also modeled. To the extent possible, the best available models for severe accident phenomena have been incorporated into CONTAIN, but it is intrinsic to the nature of accident analysis that significant uncertainty exists regarding numerous phenomena. In those cases, sensitivity studies can be performed with CONTAIN by means of user-specified input parameters. Thus, the code can be viewed as a tool designed to assist the knowledgeable reactor safety analyst in evaluating the consequences of specific modeling assumptions.
FILM-30: A Heat Transfer Properties Code for Water Coolant
DOE Office of Scientific and Technical Information (OSTI.GOV)
MARSHALL, THERON D.
2001-02-01
A FORTRAN computer code has been written to calculate the heat transfer properties at the wetted perimeter of a coolant channel when provided the bulk water conditions. This computer code is titled FILM-30 and the code calculates its heat transfer properties by using the following correlations: (1) Sieder-Tate: forced convection, (2) Bergles-Rohsenow: onset to nucleate boiling, (3) Bergles-Rohsenow: partially developed nucleate boiling, (4) Araki: fully developed nucleate boiling, (5) Tong-75: critical heat flux (CHF), and (6) Marshall-98: transition boiling. FILM-30 produces output files that provide the heat flux and heat transfer coefficient at the wetted perimeter as a function of temperature. To validate FILM-30, the calculated heat transfer properties were used in finite element analyses to predict internal temperatures for a water-cooled copper mockup under one-sided heating from a rastered electron beam. These predicted temperatures were compared with the measured temperatures from the author's 1994 and 1998 heat transfer experiments. There was excellent agreement between the predicted and experimentally measured temperatures, which confirmed the accuracy of FILM-30 within the experimental range of the tests. FILM-30 can accurately predict the CHF and transition boiling regimes, which is an important advantage over current heat transfer codes. Consequently, FILM-30 is ideal for predicting heat transfer properties for applications that feature high heat fluxes produced by one-sided heating.
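Of the correlations listed, Sieder-Tate forced convection has a compact closed form, Nu = 0.027 Re^0.8 Pr^(1/3) (mu_bulk/mu_wall)^0.14 with h = Nu k / D. A sketch of that single correlation follows (the function name and the example property values are illustrative assumptions, not values taken from FILM-30):

```python
# Hedged sketch of the Sieder-Tate forced-convection correlation.
# Inputs: Reynolds number, Prandtl number, fluid conductivity k [W/m-K],
# hydraulic diameter d [m], and dynamic viscosities at bulk and wall
# temperature [Pa-s]. Returns the heat transfer coefficient h [W/m^2-K].

def sieder_tate_h(re: float, pr: float, k: float, d: float,
                  mu_bulk: float, mu_wall: float) -> float:
    nu = 0.027 * re**0.8 * pr**(1.0 / 3.0) * (mu_bulk / mu_wall) ** 0.14
    return nu * k / d

# Illustrative water conditions: turbulent flow in a 10 mm channel with
# the wall hotter than the bulk (values assumed, not from the experiments).
h = sieder_tate_h(re=5.0e4, pr=3.0, k=0.65, d=0.01,
                  mu_bulk=4.7e-4, mu_wall=3.9e-4)
```

The viscosity-ratio factor is what distinguishes Sieder-Tate from the plain Dittus-Boelter form: it corrects for the viscosity gradient across the thermal boundary layer under strong one-sided heating.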
NASA Astrophysics Data System (ADS)
Wang, Yayong
2010-06-01
A large number of buildings were seriously damaged or collapsed in the “5.12” Wenchuan earthquake. Based on field surveys and studies of damage to different types of buildings, seismic design codes have been updated. This paper briefly summarizes some of the major revisions that have been incorporated into the “Standard for classification of seismic protection of building constructions GB50223-2008” and “Code for Seismic Design of Buildings GB50011-2001.” The definition of seismic fortification class for buildings has been revisited, and as a result, the seismic classifications for schools, hospitals and other buildings that hold large populations such as evacuation shelters and information centers have been upgraded in the GB50223-2008 Code. The main aspects of the revised GB50011-2001 code include: (a) modification of the seismic intensity specified for the Provinces of Sichuan, Shanxi and Gansu; (b) basic conceptual design for retaining walls and building foundations in mountainous areas; (c) regularity of building configuration; (d) integration of masonry structures and pre-cast RC floors; (e) requirements for calculating and detailing stair shafts; and (f) limiting the use of single-bay RC frame structures. Some significant examples of damage in the epicenter areas are provided as a reference in the discussion on the consequences of collapse, the importance of duplicate structural systems, and the integration of RC and masonry structures.
Kneipp, Shawn M.
2017-01-01
Criminal convictions are often associated with collateral consequences that limit access to the forms of employment and social services on which disadvantaged women most frequently rely – regardless of the severity of the offense. These consequences may play an important role in perpetuating health disparities by socioeconomic status and gender. We examined the extent to which research studies to date have assessed whether a criminal conviction might influence women’s health by limiting access to Temporary Assistance for Needy Families (TANF) and employment, as a secondary, or “collateral” criminal conviction-related consequence. We reviewed 434 peer-reviewed journal articles retrieved from three electronic article databases and 197 research reports from three research organizations. Two reviewers independently extracted data from each eligible article or report using a standardized coding scheme. Of the sixteen eligible studies included in the review, most were descriptive. None explored whether receiving TANF modified health outcomes, despite its potential to do so. Researchers to date have not fully examined the causal pathways that could link employment, receiving TANF, and health, especially for disadvantaged women. Future research is needed to address this gap and to understand better the potential consequences of the criminal justice system involvement on the health of this vulnerable population. PMID:25905904
Changes in the prevalence of alcohol in rap music lyrics 1979-2009.
Herd, Denise
2014-02-01
This study examines the prevalence and context of alcohol references in rap music lyrics from 1979 through 2009. Four hundred nine top-ranked rap songs released during this period were sampled from Billboard magazine rating charts. Songs were analyzed using systematic content analysis and were coded for alcohol beverage types and brand names, drinking behaviors, drinking contexts, attitudes towards alcohol, and consequences of drinking. Trends were analyzed using regression analyses. The results of the study reveal significant increases in the presence of alcohol in rap songs; a decline in negative attitudes towards alcohol; decreases in consequences attributed to alcohol; increases in the association of alcohol with glamour and wealth, drugs, and nightclubs; and increases in references to liquor and champagne.
Deep-Earth reactor: nuclear fission, helium, and the geomagnetic field.
Hollenbach, D F; Herndon, J M
2001-09-25
Geomagnetic field reversals and changes in intensity are understandable from an energy standpoint as natural consequences of intermittent and/or variable nuclear fission chain reactions deep within the Earth. Moreover, deep-Earth production of helium, having (3)He/(4)He ratios within the range observed from deep-mantle sources, is demonstrated to be a consequence of nuclear fission. Numerical simulations of a planetary-scale geo-reactor were made by using the SCALE sequence of codes. The results clearly demonstrate that such a geo-reactor (i) would function as a fast-neutron fuel breeder reactor; (ii) could, under appropriate conditions, operate over the entire period of geologic time; and (iii) would function in such a manner as to yield variable and/or intermittent output power.
India's homosexual discrimination and health consequences.
Agoramoorthy, Govindasamy; Minna, J Hsu
2007-08-01
A large number of countries worldwide have legalized homosexual rights. But for 147 years, dating from when India was a British colony, Section 377 of the Indian Penal Code has defined homosexuality as a crime, punishable by imprisonment. This outdated law violates the fundamental rights of homosexuals in India. Despite the fact that literature drawn from Hindu, Buddhist, Muslim, and modern fiction testifies to the presence of same-sex love in various forms, homosexuality is still considered a taboo subject in India, by both society and the government. In the present article, the continuation of the outdated colonial-era homosexuality law and its impact on the underprivileged homosexual community in India is discussed, as well as the consequences for this group's health in relation to HIV infection.
Littoral Combat Ship Manpower, an Overview of Officer Characteristics and Placement
2013-03-01
…maritime force: 1) Networks should be the central organizing principle of the fleet, and its sensing and fighting power should be distributed across…"assured access" force; and 4) Numbers of hulls count (quantity had its own quality) and consequently the fleet's combat power should be
In Search of the Good War: Just War and Realpolitik in Our Time
2012-10-01
1914, few formal treaties governed armed conflict. Early efforts included the American Lieber Code in 1863, the first Geneva Convention of 1864…making interstate war a rare phenomenon. The trials at Nuremberg and Tokyo following the war established the precedent that war crimes carried…consequences. Nuremberg seemed an ideal marriage of law and morality, and later treaties banned genocide and created the International Criminal Court
ERIC Educational Resources Information Center
Park, Insu
2010-01-01
The purpose of this study is to explore systems users' behavior on IS under the various circumstances (e.g., email usage and malware threats, online communication at the individual level, and IS usage in organizations). Specifically, the first essay develops a method for analyzing and predicting the impact category of malicious code, particularly…
"Drunk in Love": The Portrayal of Risk Behavior in Music Lyrics.
Holody, Kyle J; Anderson, Christina; Craig, Clay; Flynn, Mark
2016-10-01
The current study investigated the prevalence of multiple risk behaviors in popular music lyrics as well as the contexts within which they occur. We conducted a content analysis of the top 20 Billboard songs from 2009 to 2013 in the genres of rap, country, adult contemporary, rock, R&B/hip-hop, and pop, coding for the presence of alcohol, marijuana, nonmarijuana drugs, and sex as well as the contexts intoxication, binging/addiction, partying/socializing, disregard for consequences, and emotional states. The contexts relationship status and degradation were also coded for when sex was present. Of the 600 songs, 212 mentioned sexual behaviors, which were most frequent in rap and R&B/hip-hop. Alcohol was the next most frequent risk behavior, again with greatest mention in rap and R&B/hip-hop. Alcohol, marijuana, and nonmarijuana drugs were most often associated with positive emotions, and sex was most often described within the context of casual relationships. Alcohol and sex were associated with disregard for consequences most often in 2011, when the "you only live once" motto was most popular. These findings are concerning because exposure to popular music is associated with increased risk behaviors for adolescents and young adults, who are the greatest consumers of music.
Latulipe, Celine; Melius, Kathryn A; Quandt, Sara A; Arcury, Thomas A
2016-01-01
Background The United States government is encouraging physicians to adopt patient portals—secure websites that allow patients to access their health information. For patient portals to realize their full potential and improve patient care, health care providers’ acceptance and encouragement of their use will be essential. However, little is known about provider concerns or views of patient portals. Objective We conducted this qualitative study to determine how administrators, clinic staff, and health care providers at practices serving a lower income adult population viewed patient portals in terms of their potential benefit, areas of concern, and hopes for the future. Methods We performed in-depth interviews between October 2013 and June 2014 with 20 clinic personnel recruited from health centers in four North Carolina counties. Trained study personnel conducted individual interviews following an interviewer guide to elicit perceptions of the benefits and disadvantages of patient portals. Interviews were recorded and transcribed. Research team members reviewed transcribed interviews for major themes to construct a coding dictionary. Two researchers then coded each transcript with any coding discrepancies resolved through discussion. Results The interviews revealed that clinic personnel viewed patient portals as a mandated product that had potential to improve communication and enhance information sharing. However, they expressed many concerns including portals’ potential to generate more work, confuse patients, alienate non-users, and increase health disparities. Clinic personnel expected few older and disadvantaged patients to use a portal. Conclusions Given that clinic personnel have significant concerns about portals’ unintended consequences, their uptake and impact on care may be limited. Future studies should examine ways portals can be implemented in practices to address providers’ concerns and meet the needs of vulnerable populations. PMID:26772771
Sensemaking, Stakeholder Discord, and Long-Term Risk Communication at a U.S. Superfund Site
Hoover, Anna Goodman
2018-01-01
Introduction Risk communication can help reduce exposures to environmental contaminants, mitigate negative health outcomes, and inform community-based decisions about hazardous waste sites. While communication best practices have long guided such efforts, little research has examined unintended consequences arising from such guidelines. As rhetoric informs stakeholder sensemaking, the language used in and reinforced by these guidelines can challenge relationships and exacerbate stakeholder tensions. Objectives This study evaluates risk communication at a U.S. Superfund site to identify unintended consequences arising from current risk communication practices. Methods This qualitative case study crystallizes data spanning 6 years from three sources: 1) local newspaper coverage of site-related topics; 2) focus-group transcripts from a multi-year project designed to support future visioning of site use; and 3) published blog entries authored by a local environmental activist. Constant comparative analysis provides the study’s analytic foundation, with qualitative data analysis software QSR NVivo 8 supporting a three-step process: 1) provisional coding to identify broad topic categories within datasets, 2) coding occurrences of sensemaking constructs and emergent intra-dataset patterns, and 3) grouping related codes across datasets to examine the relationships among them. Results Existing risk communication practices at this Superfund site contribute to a dichotomous conceptualization of multiple and diverse stakeholders as members of one of only two categories: the government or the public. This conceptualization minimizes perceptions of capacity, encourages public commitment to stances aligned with a preferred group, and contributes to negative expectations that can become self-fulfilling prophecies. Conclusion Findings indicate a need to re-examine and adapt risk communication guidelines to encourage more pluralistic understanding of the stakeholder landscape. 
PMID:28282297
Ethical education in software engineering: responsibility in the production of complex systems.
Génova, Gonzalo; González, M Rosario; Fraga, Anabel
2007-12-01
Among the various contemporary schools of moral thinking, consequence-based ethics, as opposed to rule-based, seems to have a good acceptance among professionals such as software engineers. But naïve consequentialism is intellectually too weak to serve as a practical guide in the profession. Besides, the complexity of software systems makes it very hard to know in advance the consequences that will derive from professional activities in the production of software. Therefore, following the spirit of well-known codes of ethics such as the ACM/IEEE's, we advocate for a more solid position in the ethical education of software engineers, which we call 'moderate deontologism', that takes into account both rules and consequences to assess the goodness of actions, and at the same time pays an adequate consideration to the absolute values of human dignity. In order to educate responsible professionals, however, this position should be complemented with a pedagogical approach to virtue ethics.
Hartmann, Christine W.; Hoff, Timothy; Palmer, Jennifer A.; Wroe, Peter; Dutta-Linn, M. Maya; Lee, Grace
2014-01-01
In 2008, the Centers for Medicare & Medicaid Services introduced a new policy to adjust payment to hospitals for health care-associated infections (HAIs) not present on admission. Interviews with 36 hospital infection preventionists across the United States explored the perspectives of these key stakeholders on the potential unintended consequences of the current policy. Responses were analyzed using an iterative coding process where themes were developed from the data. Participants’ descriptions of unintended impacts of the policy centered around three themes. Results suggest the policy has focused more attention on targeted HAIs and has affected hospital staff; relatively fewer systems changes have ensued. Some consequences of the policy, such as infection preventionists having less time to devote to HAIs other than those in the policy or having less time to implement prevention activities, may have undesirable effects on HAI rates if hospitals do not recognize and react to potential time and resource gaps. PMID:21810797
Use of zerotree coding in a high-speed pyramid image multiresolution decomposition
NASA Astrophysics Data System (ADS)
Vega-Pineda, Javier; Cabrera, Sergio D.; Lucero, Aldo
1995-03-01
A Zerotree (ZT) coding scheme is applied as a post-processing stage to avoid transmitting zero data in the High-Speed Pyramid (HSP) image compression algorithm. This algorithm has features that increase the capability of the ZT coding to give very high compression rates. In this paper the impact of the ZT coding scheme is analyzed and quantified. The HSP algorithm creates a discrete-time multiresolution analysis based on a hierarchical decomposition technique that is a subsampling pyramid. The filters used to create the image residues and expansions can be related to wavelet representations. According to the pixel coordinates and the level in the pyramid, N2 different wavelet basis functions of various sizes and rotations are linearly combined. The HSP algorithm is computationally efficient because of the simplicity of the required operations, and as a consequence, it can be very easily implemented with VLSI hardware. This is the HSP's principal advantage over other compression schemes. The ZT coding technique transforms the different quantized image residual levels created by the HSP algorithm into a bit stream. The use of ZTs further compresses the already compressed image, taking advantage of parent-child relationships (trees) between the pixels of the residue images at different levels of the pyramid. Zerotree coding uses the links between zeros along the hierarchical structure of the pyramid to avoid transmission of those that form branches of all zeros. Compression performance and algorithm complexity of the combined HSP-ZT method are compared with those of the JPEG standard technique.
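The zerotree test itself, checking whether a residue coefficient and all of its descendants down the pyramid are zero so the entire branch can be omitted from the bit stream, can be sketched as follows (a minimal illustration under an assumed quadtree pyramid layout; it is not the HSP-ZT implementation):

```python
# Minimal sketch of the zerotree test: a coefficient is a zerotree root
# if it is zero and every descendant at all finer pyramid levels is zero
# too, so the whole branch can be skipped during transmission.
# Assumed layout: the pyramid is a list of square levels, each level
# twice the side length of the previous; a parent at (i, j) owns the
# 2x2 child block at (2i..2i+1, 2j..2j+1) on the next finer level.

def is_zerotree(pyramid, level, i, j):
    """True if coefficient (i, j) at `level` roots an all-zero branch."""
    if pyramid[level][i][j] != 0:
        return False
    if level + 1 == len(pyramid):        # finest level: no children
        return True
    return all(is_zerotree(pyramid, level + 1, 2 * i + di, 2 * j + dj)
               for di in (0, 1) for dj in (0, 1))

coarse = [[0, 3],
          [0, 0]]
fine = [[0, 0, 1, 0],
        [0, 0, 0, 2],
        [0, 0, 0, 0],
        [0, 0, 0, 0]]
pyramid = [coarse, fine]
assert is_zerotree(pyramid, 0, 1, 0)      # all-zero branch: prune it
assert not is_zerotree(pyramid, 0, 0, 1)  # root itself is nonzero
```

An encoder emits a single zerotree symbol for a root like (1, 0) above instead of coding its four zero descendants individually, which is where the additional compression comes from.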
A conflict-based model of color categorical perception: evidence from a priming study.
Hu, Zhonghua; Hanley, J Richard; Zhang, Ruiling; Liu, Qiang; Roberson, Debi
2014-10-01
Categorical perception (CP) of color manifests as faster or more accurate discrimination of two shades of color that straddle a category boundary (e.g., one blue and one green) than of two shades from within the same category (e.g., two different shades of green), even when the differences between the pairs of colors are equated according to some objective metric. The results of two experiments provide new evidence for a conflict-based account of this effect, in which CP is caused by competition between visual and verbal/categorical codes on within-category trials. According to this view, conflict arises because the verbal code indicates that the two colors are the same, whereas the visual code indicates that they are different. In Experiment 1, two shades from the same color category were discriminated significantly faster when the previous trial also comprised a pair of within-category colors than when the previous trial comprised a pair from two different color categories. Under the former circumstances, the CP effect disappeared. According to the conflict-based model, response conflict between visual and categorical codes during discrimination of within-category pairs produced an adjustment of cognitive control that reduced the weight given to the categorical code relative to the visual code on the subsequent trial. Consequently, responses on within-category trials were facilitated, and CP effects were reduced. The effectiveness of this conflict-based account was evaluated in comparison with an alternative view that CP reflects temporary warping of perceptual space at the boundaries between color categories.
[Medico-legal autopsy--selected legal issues: the autopsy protocol].
Gaszczyk-Ozarowski, Zbigniew; Chowaniec, Czesław
2010-01-01
The majority of experts in the field of forensic medicine maintain that the minutes of the medicolegal autopsy should be taken by the forensic pathologist. The authors argue that it is the public prosecutor who is obliged to draw up the minutes, whereas the forensic pathologist issues the expert opinion. To support their stance, the authors make frequent references to several provisions of the Criminal Procedure Code of 1997. The authors also imply that due to organizational reasons and the ratio legis of the aforementioned code, the forensic pathologist should not be assigned the role of the minutes-taker, despite the lack of a specific exclusion rule governing such a case. Possible consequences caused by the lack of the properly drawn up minutes are briefly discussed as well.
A comparison of VLSI architectures for time and transform domain decoding of Reed-Solomon codes
NASA Technical Reports Server (NTRS)
Hsu, I. S.; Truong, T. K.; Deutsch, L. J.; Satorius, E. H.; Reed, I. S.
1988-01-01
It is well known that the Euclidean algorithm or its equivalent, continued fractions, can be used to find the error locator polynomial needed to decode a Reed-Solomon (RS) code. It is shown that this algorithm can be used for both time and transform domain decoding by replacing its initial conditions with the Forney syndromes and the erasure locator polynomial. By this means both the errata locator polynomial and the errata evaluator polynomial can be obtained with the Euclidean algorithm. With these ideas, both time and transform domain Reed-Solomon decoders for correcting errors and erasures are simplified and compared. As a consequence, the architectures of Reed-Solomon decoders for correcting both errors and erasures can be made more modular, regular, simple, and naturally suitable for VLSI implementation.
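The degree-stopping Euclidean iteration the abstract builds on can be sketched generically over a small prime field (an illustration only: a real RS decoder works in GF(2^m), with inputs x^(2t) and the Forney syndrome polynomial, whose stopping remainder and multiplier give the errata evaluator and errata locator; GF(7) and all names here are assumptions):

```python
# Extended Euclidean algorithm on polynomials over GF(P), halted as soon
# as the remainder degree drops below a bound t. Polynomials are
# coefficient lists, lowest degree first. P = 7 is an arbitrary
# illustrative prime, not a field used by practical RS decoders.

P = 7

def trim(a):
    """Drop trailing zero coefficients (keep at least [0])."""
    while len(a) > 1 and a[-1] == 0:
        a = a[:-1]
    return a

def deg(a):
    a = trim(a)
    return -1 if a == [0] else len(a) - 1

def add(a, b):
    n = max(len(a), len(b))
    return trim([((a[i] if i < len(a) else 0) +
                  (b[i] if i < len(b) else 0)) % P for i in range(n)])

def scale_shift(a, c, k):
    """Return c * x^k * a(x) mod P."""
    return trim([0] * k + [(c * x) % P for x in a])

def mul(a, b):
    out = [0] * (len(a) + len(b) - 1)
    for i, x in enumerate(a):
        for j, y in enumerate(b):
            out[i + j] = (out[i + j] + x * y) % P
    return trim(out)

def divmod_poly(a, b):
    """Quotient and remainder of a / b over GF(P); b must be nonzero."""
    a, b = trim(list(a)), trim(b)
    q = [0] * max(1, deg(a) - deg(b) + 1)
    inv = pow(b[-1], P - 2, P)          # inverse of leading coefficient
    while deg(a) >= deg(b):
        k = deg(a) - deg(b)
        c = (a[-1] * inv) % P
        q[k] = c
        a = add(a, scale_shift(b, (-c) % P, k))   # cancel leading term
    return trim(q), a

def euclid_stop(a, b, t):
    """Extended Euclid on (a, b), halted when deg(remainder) < t.
    Returns (r, u) satisfying r = u*b (mod a)."""
    r0, r1 = trim(a), trim(b)
    u0, u1 = [0], [1]
    while deg(r1) >= t:
        q, r = divmod_poly(r0, r1)
        qu = [0]                        # qu = q * u1
        for k, c in enumerate(q):
            qu = add(qu, scale_shift(u1, c, k))
        u_next = add(u0, [(-c) % P for c in qu])
        r0, r1, u0, u1 = r1, r, u1, u_next
    return r1, u1

# Toy run: a = x^4, b = x^3 + 4x^2 + x + 3, stop when deg < 2.
a = [0, 0, 0, 0, 1]
b = [3, 1, 4, 1]
r, u = euclid_stop(a, b, t=2)
assert deg(r) < 2
_, rem = divmod_poly(mul(u, b), a)      # verify r = u*b (mod a)
assert rem == trim(r)
```

The decoding use of this iteration is exactly the paper's point about initial conditions: substituting the Forney syndromes and erasure locator for the standard starting pair makes the same loop deliver both errata polynomials at once.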
Ethics in Science: The Unique Consequences of Chemistry
Kovac, Jeffrey
2015-01-01
This article discusses the ethical issues unique to the science and practice of chemistry. These issues arise from chemistry’s position in the middle between the theoretical and the practical, a science concerned with molecules that are of the right size to directly affect human life. Many of the issues are raised by the central activity of chemistry: synthesis. Chemists make thousands of new substances each year. Many are beneficial, but others are threats. Since the development of the chemical industry in the nineteenth century, chemistry has contributed to the deterioration of the environment but has also helped to reduce pollution. Finally, we discuss the role of codes of ethics and whether the current codes of conduct for chemists are adequate for the challenges of today’s world. PMID:26155729
[Patients' rights: what the new Swiss civil code will change in the health care of children].
Laufer, Daniel; Genaine, Patrick; Simon, Jeanne-Pascale P
2013-02-20
Recent progress in medicine makes it possible to treat, cure, or extend the lifespan of people who would not have survived before. Doctors and healthcare providers have become indispensable actors in Western societies. This is particularly true for children's health issues. With the new information technologies, knowledge is now available to everyone, which enables patients to dialog on an equal footing with the physician. Nowadays, therapeutic choices are discussed and negotiated. The new tensions caused by this relationship between therapist and patient have created the need for new regulations. The Swiss Confederation has modified its Civil Code with the objective of better protecting vulnerable individuals. This article summarizes the consequences of the new regulations with regard to the care and treatment provided to children.
The molecular basis for attractive salt-taste coding in Drosophila.
Zhang, Yali V; Ni, Jinfei; Montell, Craig
2013-06-14
Below a certain level, table salt (NaCl) is beneficial for animals, whereas excessive salt is harmful. However, it remains unclear how low- and high-salt taste perceptions are differentially encoded. We identified a salt-taste coding mechanism in Drosophila melanogaster. Flies use distinct types of gustatory receptor neurons (GRNs) to respond to different concentrations of salt. We demonstrated that a member of the newly discovered ionotropic glutamate receptor (IR) family, IR76b, functioned in the detection of low salt and was a Na(+) channel. The loss of IR76b selectively impaired the attractive pathway, leaving salt-aversive GRNs unaffected. Consequently, low salt became aversive. Our work demonstrated that the opposing behavioral responses to low and high salt were determined largely by an elegant bimodal switch system operating in GRNs.
Martín, Javier; Samoilovich, Elena; Dunn, Glynis; Lackenby, Angie; Feldman, Esphir; Heath, Alan; Svirchevskaya, Ekaterina; Cooper, Gill; Yermalovich, Marina; Minor, Philip D.
2002-01-01
The isolation of a capsid intertypic poliovirus recombinant from a child with vaccine-associated paralytic poliomyelitis is described. Virus 31043 had a Sabin-derived type 3-type 2-type 1 recombinant genome with a 5′-end crossover point within the capsid coding region. The result was a poliovirus chimera containing the entire coding sequence for antigenic site 3a derived from the Sabin type 2 strain. The recombinant virus showed altered antigenic properties but did not acquire type 2 antigenic characteristics. The significance of the presence in nature of such poliovirus chimeras and the consequences for the current efforts to detect potentially dangerous vaccine-derived poliovirus strains are discussed in the context of the global polio eradication initiative. PMID:12368335
A comparison of cigarette- and hookah-related videos on YouTube.
Carroll, Mary V; Shensa, Ariel; Primack, Brian A
2013-09-01
YouTube is now the second most visited site on the internet. The authors aimed to compare characteristics of and messages conveyed by cigarette- and hookah-related videos on YouTube. Systematic search procedures yielded 66 cigarette-related and 61 hookah-related videos. After three trained qualitative researchers used an iterative approach to develop and refine definitions for the coding of variables, two of them independently coded each video for content including positive and negative associations with smoking and major content type. Median view counts were 606,884 for cigarette-related videos and 102,307 for hookah-related videos (p<0.001). However, the number of comments per 1000 views was significantly lower for cigarette-related videos than for hookah-related videos (1.6 vs 2.5, p=0.003). There was no significant difference in the number of 'like' designations per 100 reactions (91 vs 87, p=0.39). Cigarette-related videos were less likely than hookah-related videos to portray tobacco use in a positive light (24% vs 92%, p<0.001). In addition, cigarette-related videos were more likely to be of high production quality (42% vs 5%, p<0.001), to mention short-term consequences (50% vs 18%, p<0.001) and long-term consequences (44% vs 2%, p<0.001) of tobacco use, to contain explicit antismoking messages (39% vs 0%, p<0.001) and to provide specific information on how to quit tobacco use (21% vs 0%, p<0.001). Although internet user-generated videos related to cigarette smoking often acknowledge harmful consequences and provide explicit antismoking messages, hookah-related videos do not. It may be valuable for public health programmes to correct common misconceptions regarding hookah use.
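The comparison above rests on two researchers independently coding each video, so its reliability hinges on inter-coder agreement. A common summary statistic for such designs is Cohen's kappa; the abstract does not state which agreement measure was used, and the counts below are hypothetical.

```python
def cohens_kappa(table):
    """Cohen's kappa for a KxK contingency table of coder-1 (rows)
    vs coder-2 (columns) judgments: chance-corrected agreement."""
    n = sum(sum(row) for row in table)
    k = len(table)
    p_obs = sum(table[i][i] for i in range(k)) / n          # raw agreement
    row = [sum(table[i]) for i in range(k)]
    col = [sum(table[i][j] for i in range(k)) for j in range(k)]
    p_exp = sum(row[i] * col[i] for i in range(k)) / n**2   # chance agreement
    return (p_obs - p_exp) / (1 - p_exp)

# Hypothetical 2x2 table: did each coder label a video "positive portrayal"?
kappa = cohens_kappa([[40, 5], [10, 45]])
```

Values above roughly 0.6 are conventionally read as substantial agreement.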
'Zero is not good for me': implications of infertility in Ghana.
Fledderjohann, J J
2012-05-01
Given the high value placed on children in sub-Saharan Africa, previous research suggests that infertility increases the risk of psychological distress and marital conflict, encourages risky sexual behavior and deprives infertile individuals and couples of an important source of economic and social capital. This paper explores the implications of infertility for women in Ghana, West Africa. Semi-structured interview data collected from 107 women (aged 21-48 years, mean 33 years) seeking treatment in gynecological and obstetric clinics in Accra, Ghana, are analyzed. Based on iterative open coding of the interviews, the focus of the analysis is on mental health, marital instability, social interaction and gendered experiences. Infertile women report facing severe social stigma, marital strain and a range of mental health difficulties. Many women feel that they shoulder a disproportionate share of the blame for infertility and, by extension, face greater social consequences than male partners for difficulties conceiving. Women who do not self-identify as infertile corroborate these findings, asserting that the social consequences of infertility are severe, particularly for women. Infertility in Ghana has important consequences for social interactions, marital stability and mental health. These consequences are not perceived to be shared equally by Ghanaian men.
T cells are influenced by a long non-coding RNA in the autoimmune associated PTPN2 locus.
Houtman, Miranda; Shchetynsky, Klementy; Chemin, Karine; Hensvold, Aase Haj; Ramsköld, Daniel; Tandre, Karolina; Eloranta, Maija-Leena; Rönnblom, Lars; Uebe, Steffen; Catrina, Anca Irinel; Malmström, Vivianne; Padyukov, Leonid
2018-06-01
Non-coding SNPs in the protein tyrosine phosphatase non-receptor type 2 (PTPN2) locus have been linked with several autoimmune diseases, including rheumatoid arthritis, type I diabetes, and inflammatory bowel disease. However, the functional consequences of these SNPs are poorly characterized. Herein, we show in blood cells that SNPs in the PTPN2 locus are highly correlated with DNA methylation levels at four CpG sites downstream of PTPN2 and expression levels of the long non-coding RNA (lncRNA) LINC01882 downstream of these CpG sites. We observed that LINC01882 is mainly expressed in T cells and that anti-CD3/CD28 activated naïve CD4+ T cells downregulate the expression of LINC01882. RNA sequencing analysis of LINC01882 knockdown in Jurkat T cells, using a combination of antisense oligonucleotides and RNA interference, revealed the upregulation of the transcription factor ZEB1 and kinase MAP2K4, both involved in IL-2 regulation. Overall, our data suggest the involvement of LINC01882 in T cell activation and hint towards an auxiliary role of these non-coding SNPs in autoimmunity associated with the PTPN2 locus. Copyright © 2018 The Authors. Published by Elsevier Ltd. All rights reserved.
Zhang, Yuqin; Lin, Fanbo; Zhang, Youyu; Li, Haitao; Zeng, Yue; Tang, Hao; Yao, Shouzhuo
2011-01-01
A new method for the detection of point mutations in DNA, based on monobase-coded cadmium telluride (CdTe) nanoprobes and the quartz crystal microbalance (QCM) technique, is reported. A QCM sensor for point mutations in DNA (single-base adenine, thymine, cytosine or guanine mutations, namely A, T, C or G) was fabricated by immobilizing single-base-mutation DNA-modified magnetic beads onto the electrode surface with an external magnetic field near the electrode. The DNA-modified magnetic beads were obtained from the biotin-avidin affinity reaction of biotinylated DNA and streptavidin-functionalized core/shell Fe(3)O(4)/Au magnetic nanoparticles, followed by a DNA hybridization reaction. Single-base-coded CdTe nanoprobes (A-CdTe, T-CdTe, C-CdTe and G-CdTe, respectively) were used as the detection probes. The mutation site in DNA was identified by detecting the decrease in the resonance frequency of the piezoelectric quartz crystal when the coded nanoprobe was added to the test system. The proposed detection strategy for point mutations in DNA proved to be sensitive, simple, repeatable and low-cost; consequently, it has great potential for single nucleotide polymorphism (SNP) detection. 2011 © The Japan Society for Analytical Chemistry
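The frequency-decrease readout above relies on the standard QCM mass-loading relation, the Sauerbrey equation, which converts a rigidly bound mass into a resonance-frequency shift. The abstract reports no numerical values, so the crystal parameters below are illustrative assumptions only.

```python
import math

def sauerbrey_df(delta_m_g, f0_hz=9.0e6, area_cm2=0.2):
    """Frequency shift (Hz) of an AT-cut quartz crystal for an added
    rigid mass delta_m_g (grams), via the Sauerbrey equation:
        df = -2 * f0^2 * dm / (A * sqrt(rho_q * mu_q))
    rho_q and mu_q are the density and shear modulus of quartz."""
    rho_q = 2.648         # g/cm^3
    mu_q = 2.947e11       # g/(cm*s^2)
    return -2.0 * f0_hz**2 * delta_m_g / (area_cm2 * math.sqrt(rho_q * mu_q))

# A 1 ng rigid load on this assumed 9 MHz crystal shifts f by roughly -0.9 Hz,
# which is why nanoprobe binding at a mutation site is detectable.
df = sauerbrey_df(1e-9)
```

The negative sign is the point of the detection scheme: added mass always lowers the resonance frequency.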
A Survey of Visualization Tools Assessed for Anomaly-Based Intrusion Detection Analysis
2014-04-01
…objective? • What vulnerabilities exist in the target system? • What damage or other consequences are likely? • What exploit scripts or other attack… languages C, R, and Python; no response capabilities. JUNG (https://blogs.reucon.com/asterisk-java/tag/visualization/): create custom layouts and can annotate graphs, links, and nodes with any Java data type; must be familiar with coding in Java to call the routines; no monitoring or response…
Manpower Staffing, Emergency Department Access and Consequences on Patient Outcomes
2007-06-01
…distance to the nearest hospital have higher death rates than those zip codes which experience a change. However, we hesitate to conclude that this may… 1. Trend Analysis of Mortality Rates by Distance Categories: 1990-2004. Figure 6 presents heart-related death rates for the State of California from 1990-2004. The graph shows a distinct layering of heart-related death rates across the three distance categories. The population which experiences…
SORL1 variants across Alzheimer's disease European American cohorts.
Fernández, Maria Victoria; Black, Kathleen; Carrell, David; Saef, Ben; Budde, John; Deming, Yuetiva; Howells, Bill; Del-Aguila, Jorge L; Ma, Shengmei; Bi, Catherine; Norton, Joanne; Chasse, Rachel; Morris, John; Goate, Alison; Cruchaga, Carlos
2016-12-01
The accumulation of the toxic Aβ peptide in Alzheimer's disease (AD) largely relies upon an efficient recycling of amyloid precursor protein (APP). Recent genetic association studies have described rare variants in SORL1 with putative pathogenic consequences in the recycling of APP. In this work, we examine the presence of rare coding variants in SORL1 in three different European American cohorts: early-onset, late-onset AD (LOAD) and familial LOAD.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Werley, Kenneth Alan; Mccown, Andrew William
The EPREP code is designed to evaluate the effects of an electromagnetic pulse (EMP) on the electric power transmission system. The EPREP code embodies an umbrella framework that allows a user to set up analysis conditions and to examine analysis results. The code links to three major physics/engineering modules. The first module describes the EM wave in space and time. The second module evaluates the damage caused by the wave on specific electric power (EP) transmission system components. The third module evaluates the consequence of the damaged network on its (reduced) ability to provide electric power to meet demand. This third module is the focus of the present paper. The EMPACT code serves as the third module; the EMPACT name denotes EMP effects on Alternating Current Transmission systems. The EMPACT algorithms compute electric power transmission network flow solutions under severely damaged network conditions. Initial solutions are often characterized by unacceptable network conditions, including line overloads and bad voltages. The EMPACT code contains algorithms to optimally adjust network parameters to eliminate network problems while minimizing outages. System adjustments include automatically adjusting control equipment (generator V control, variable transformers, and variable shunts), as well as non-automatic control of generator power settings and minimal load shedding. The goal is to evaluate the minimal loss of customer load under equilibrium (steady-state) conditions during peak demand.
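The network-flow evaluation in the third module is specific to EMPACT, but the base computation such tools optimize around, a flow solution on a transmission network, can be sketched with a linearized ("DC") power flow. The 3-bus system, line susceptances, and injections below are entirely hypothetical and not taken from the EPREP/EMPACT codes.

```python
import numpy as np

# Minimal DC power-flow sketch on a 3-bus network: bus 0 is the slack,
# with lines (0,1), (1,2), (0,2) of per-unit susceptance b. We solve
# B' * theta = P for the bus voltage angles, then recover line flows.
lines = [(0, 1, 10.0), (1, 2, 8.0), (0, 2, 5.0)]
P = np.array([0.0, 1.5, -2.5])   # injections (gen +, load -); slack balances

n = 3
B = np.zeros((n, n))             # network susceptance matrix
for i, j, b in lines:
    B[i, i] += b; B[j, j] += b
    B[i, j] -= b; B[j, i] -= b

theta = np.zeros(n)
theta[1:] = np.linalg.solve(B[1:, 1:], P[1:])   # slack angle held at 0

# Per-line MW flows; a damage study would now check these against ratings.
flows = {(i, j): b * (theta[i] - theta[j]) for i, j, b in lines}
```

Simulating damage amounts to deleting lines from `lines` and re-solving; load shedding then reduces entries of `P` until no flow exceeds its rating, which is the optimization EMPACT automates.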
Lorkowski, Jacek; Mrzygłód, Mirosław; Kotela, Ireneusz; Kiełbasiewicz-Lorkowska, Ewa; Teul, Iwona
2013-01-01
According to a 2005 verdict of the Supreme Court, an employer may dismiss an employee if their conduct (including dress) exposes the employer to losses or threatens the employer's interests. The aim of the study was a holistic assessment of the pleiotropic effects of pointed high-heeled shoes on the health of the feet of women who wear them at work in accordance with the existing rules of the "business dress code". A holistic multidisciplinary analysis was performed, taking into account: 1) women employed by banks and other large corporations (82 persons); 2) a 2D FEM computer model, developed by the authors, of a foot deformed by pointed high-heeled shoes; and 3) web sites found by searching for the phrase "business dress code". Over 60% of the women wore high-heeled shoes in the office. Among those walking to work in high heels, the following was found: 1) a reduction in quality of life in about 70% of cases, through periodic pain and reduced functional capacity of the feet; 2) at least a doubling of the pressure on the plantar side of the forefoot; and 3) continued action of the forces deforming the forefoot. The conclusions are: 1) an evolutionary change of "dress code" shoes is necessary in order to reduce the non-physiological overload of the feet and the disability that follows from it; 2) these changes are particularly urgent in patients with a so-called "sensitive foot".
Athermalization of infrared dual field optical system based on wavefront coding
NASA Astrophysics Data System (ADS)
Jiang, Kai; Jiang, Bo; Liu, Kai; Yan, Peipei; Duan, Jing; Shan, Qiu-sha
2017-02-01
Wavefront coding is a technology that combines optical design and digital image processing. By inserting a phase mask close to the pupil plane of an optical system, the wavefront of the system is re-modulated and the depth of focus is consequently extended. In essence, the idea is the same as the athermalization of an infrared optical system. In this paper, an uncooled infrared dual-field optical system with effective focal lengths of 38 mm/19 mm, an F-number of 1.2 at both focal lengths, and an operating waveband of 8 μm to 12 μm was designed. A cubic phase mask was placed at the pupil plane to re-modulate the wavefront. The performance of the infrared system was then simulated in CODE V as the environment temperature varied from -40 °C to 60 °C, and the MTF curves of the optical system with the phase mask were compared with those obtained before the phase mask was introduced. The results show that wavefront coding technology can make the system insensitive to thermal defocus and thus realize an athermal design of the infrared optical system.
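As a rough illustration of the principle in the abstract above (not a reproduction of the authors' CODE V design), the following numpy sketch compares how much the diffraction MTF changes under roughly one wave of defocus, with and without a cubic phase mask. The pupil sampling, defocus amplitude, and mask strength are all assumed values.

```python
import numpy as np

# Sketch: why a cubic phase mask desensitizes the MTF to (thermal) defocus.
N = 128
x = np.linspace(-1.0, 1.0, N)
X, Y = np.meshgrid(x, x)
pupil = (X**2 + Y**2 <= 1.0).astype(float)   # circular clear aperture

def mtf(phase):
    """Diffraction MTF from a pupil phase map (radians)."""
    field = pupil * np.exp(1j * phase)
    psf = np.abs(np.fft.fft2(field))**2      # incoherent PSF
    otf = np.abs(np.fft.fft2(psf))
    return otf / otf[0, 0]                   # normalize to unity at DC

defocus = 6.0 * (X**2 + Y**2)   # ~1 wave of defocus, standing in for thermal drift
cubic = 40.0 * (X**3 + Y**3)    # cubic phase mask, alpha = 40 rad (hypothetical)

# Peak MTF change caused by defocus, without and with the mask.
plain_change = np.abs(mtf(defocus) - mtf(0.0 * X)).max()
coded_change = np.abs(mtf(cubic + defocus) - mtf(cubic)).max()
```

The coded MTF is lower but nearly defocus-invariant; a fixed digital deconvolution filter then restores contrast over the whole temperature range, which is the athermalization mechanism the paper exploits.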
Mironov, Vladimir; Moskovsky, Alexander; D’Mello, Michael; ...
2017-10-04
The Hartree-Fock (HF) method in the quantum chemistry package GAMESS represents one of the most irregular algorithms in computation today. Major steps in the calculation are the irregular computation of electron repulsion integrals (ERIs) and the building of the Fock matrix. These are the central components of the main Self-Consistent Field (SCF) loop, the key hotspot in electronic structure (ES) codes. By threading the MPI ranks in the official release of the GAMESS code, we not only speed up the main SCF loop (4x to 6x for large systems), but also achieve a significant (>2x) reduction in the overall memory footprint. These improvements are a direct consequence of memory access optimizations within the MPI ranks. We benchmark our implementation against the official release of the GAMESS code on the Intel® Xeon Phi™ supercomputer. Here, scaling numbers are reported on up to 7,680 cores on Intel Xeon Phi coprocessors.
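The "irregular computation" of ERIs mentioned above is driven largely by the 8-fold permutational symmetry of the integrals (ij|kl): only canonical index quadruplets are computed, which makes the work distribution uneven across threads and ranks. The sketch below (illustrative Python, not GAMESS code) enumerates that canonical set; a threaded Fock build has to partition exactly this kind of loop nest.

```python
def unique_eri_quadruplets(n):
    """Enumerate canonical (i, j, k, l) index quadruplets of the
    two-electron integrals (ij|kl) for n basis functions, exploiting
    the 8-fold permutational symmetry: i >= j, k >= l, and the
    compound ij-pair index >= the kl-pair index."""
    quads = []
    for i in range(n):
        for j in range(i + 1):
            ij = i * (i + 1) // 2 + j          # compound pair index
            for k in range(i + 1):
                for l in range(k + 1):
                    kl = k * (k + 1) // 2 + l
                    if ij >= kl:
                        quads.append((i, j, k, l))
    return quads

# With M = n(n+1)/2 pairs there are M(M+1)/2 unique integrals, so the
# count grows as ~n^4/8 and the trip counts of the inner loops vary
# with i and k, which is the source of the load imbalance.
count = len(unique_eri_quadruplets(4))
```

Each computed integral contributes to several Fock-matrix elements at once, so concurrent threads must either privatize partial Fock matrices or synchronize updates, which is where the memory-footprint savings reported above come from.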
From chemical metabolism to life: the origin of the genetic coding process
2017-01-01
Looking for origins is so much rooted in ideology that most studies reflect opinions that fail to explore the first realistic scenarios. To be sure, trying to understand the origins of life should be based on what we know of current chemistry in the solar system and beyond. There, amino acids and very small compounds such as carbon dioxide, dihydrogen or dinitrogen and their immediate derivatives are ubiquitous. Surface-based chemical metabolism using these basic chemicals is the most likely beginning, in which amino acids, coenzymes and phosphate-based small carbon molecules were built up. Nucleotides, and of course RNAs, must have come into being much later. As a consequence, the key question in accounting for life is to understand how a chemical metabolism that began with amino acids progressively shaped into a coding process involving RNAs. Here I explore the role of building up complementarity rules as the first information-based process that allowed the genetic code to emerge, after RNAs were substituted for surfaces to carry over the basic metabolic pathways that drive the pursuit of life. PMID:28684991
Confinement properties of tokamak plasmas with extended regions of low magnetic shear
NASA Astrophysics Data System (ADS)
Graves, J. P.; Cooper, W. A.; Kleiner, A.; Raghunathan, M.; Neto, E.; Nicolas, T.; Lanthaler, S.; Patten, H.; Pfefferle, D.; Brunetti, D.; Lutjens, H.
2017-10-01
Extended regions of low magnetic shear can be advantageous to tokamak plasmas, but the core and edge can be susceptible to non-resonant ideal fluctuations due to the weakened restoring force associated with magnetic field line bending. This contribution shows how saturated non-linear phenomenology, such as 1/1 Long-Lived Modes, and Edge Harmonic Oscillations associated with QH-modes, can be modelled accurately using the non-linear stability code XTOR, the free-boundary 3D equilibrium code VMEC, and non-linear analytic theory. That the equilibrium approach is valid is particularly valuable because it enables advanced particle confinement studies to be undertaken in the ordinarily difficult environment of strongly 3D magnetic fields. The VENUS-LEVIS code exploits the Fourier description of the VMEC equilibrium fields, such that full Lorentzian and guiding-centre approximated differential operators in curvilinear angular coordinates can be evaluated analytically. Consequently, the confinement properties of minority ions such as energetic particles and high-Z impurities can be calculated accurately over slowing-down timescales in experimentally relevant 3D plasmas.
Qualitatively different coding of symbolic and nonsymbolic numbers in the human brain.
Lyons, Ian M; Ansari, Daniel; Beilock, Sian L
2015-02-01
Are symbolic and nonsymbolic numbers coded differently in the brain? Neuronal data indicate that overlap in numerical tuning curves is a hallmark of the approximate, analogue nature of nonsymbolic number representation. Consequently, patterns of fMRI activity should be more correlated when the representational overlap between two numbers is relatively high. In bilateral intraparietal sulci (IPS), for nonsymbolic numbers, the pattern of voxelwise correlations between pairs of numbers mirrored the amount of overlap in their tuning curves under the assumption of approximate, analogue coding. In contrast, symbolic numbers showed a flat field of modest correlations more consistent with discrete, categorical representation (no systematic overlap between numbers). Directly correlating activity patterns for a given number across formats (e.g., the numeral "6" with six dots) showed no evidence of shared symbolic and nonsymbolic number-specific representations. Overall (univariate) activity in bilateral IPS was well fit by the log of the number being processed for both nonsymbolic and symbolic numbers. IPS activity is thus sensitive to numerosity regardless of format; however, the nature in which symbolic and nonsymbolic numbers are encoded is fundamentally different. © 2014 Wiley Periodicals, Inc.
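The representational-overlap logic in the abstract above can be sketched numerically: Gaussian tuning curves on a log scale (the approximate, analogue code assumed for nonsymbolic number) make population patterns for nearby numerosities more correlated than for distant ones. The neuron count and tuning width below are hypothetical and not taken from the study.

```python
import numpy as np

# Population patterns from log-Gaussian tuning curves: the fMRI-pattern
# correlation between two numerosities should track tuning-curve overlap.
rng = np.random.default_rng(0)
numbers = np.arange(1, 9)                               # numerosities 1..8
prefs = np.exp(rng.uniform(0.0, np.log(8.0), 200))      # preferred numerosities
sigma = 0.35                                            # log-scale tuning width

# Response of each "neuron" to each numerosity (rows = numerosities).
patterns = np.exp(-(np.log(numbers)[:, None] - np.log(prefs)[None, :])**2
                  / (2.0 * sigma**2))

corr = np.corrcoef(patterns)    # 8 x 8 pattern-similarity matrix
```

Under this analogue code, `corr` falls off smoothly with numerical distance; the study's finding for symbolic numbers corresponds instead to a flat, distance-independent correlation field.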
Scheckel, Claudia; Drapeau, Elodie; Frias, Maria A; Park, Christopher Y; Fak, John; Zucker-Scharff, Ilana; Kou, Yan; Haroutunian, Vahram; Ma'ayan, Avi
2016-01-01
Neuronal ELAV-like (nELAVL) RNA binding proteins have been linked to numerous neurological disorders. We performed crosslinking-immunoprecipitation and RNAseq on human brain, and identified nELAVL binding sites on 8681 transcripts. Using knockout mice and RNAi in human neuroblastoma cells, we showed that nELAVL intronic and 3' UTR binding regulates human RNA splicing and abundance. We validated hundreds of nELAVL targets among which were important neuronal and disease-associated transcripts, including Alzheimer's disease (AD) transcripts. We therefore investigated RNA regulation in AD brain, and observed differential splicing of 150 transcripts, which in some cases correlated with differential nELAVL binding. Unexpectedly, the most significant change of nELAVL binding was evident on non-coding Y RNAs. nELAVL/Y RNA complexes were specifically remodeled in AD and after acute UV stress in neuroblastoma cells. We propose that the increased nELAVL/Y RNA association during stress may lead to nELAVL sequestration, redistribution of nELAVL target binding, and altered neuronal RNA splicing. DOI: http://dx.doi.org/10.7554/eLife.10421.001 PMID:26894958
Optimizing legacy molecular dynamics software with directive-based offload
NASA Astrophysics Data System (ADS)
Michael Brown, W.; Carrillo, Jan-Michael Y.; Gavhane, Nitin; Thakkar, Foram M.; Plimpton, Steven J.
2015-10-01
Directive-based programming models are one solution for exploiting many-core coprocessors to increase simulation rates in molecular dynamics. They offer the potential to reduce code complexity with offload models that can selectively target computations to run on the CPU, the coprocessor, or both. In this paper, we describe modifications to the LAMMPS molecular dynamics code to enable concurrent calculations on a CPU and coprocessor. We demonstrate that standard molecular dynamics algorithms can run efficiently on both the CPU and an x86-based coprocessor using the same subroutines. As a consequence, we demonstrate that code optimizations for the coprocessor also result in speedups on the CPU; in extreme cases up to 4.7X. We provide results for LAMMPS benchmarks and for production molecular dynamics simulations using the Stampede hybrid supercomputer with both Intel® Xeon Phi™ coprocessors and NVIDIA GPUs. The optimizations presented have increased simulation rates by over 2X for organic molecules and over 7X for liquid crystals on Stampede. The optimizations are available as part of the "Intel package" supplied with LAMMPS.
A generic framework for individual-based modelling and physical-biological interaction
2018-01-01
The increased availability of high-resolution ocean data globally has enabled more detailed analyses of physical-biological interactions and their consequences to the ecosystem. We present IBMlib, which is a versatile, portable and computationally effective framework for conducting Lagrangian simulations in the marine environment. The purpose of the framework is to handle complex individual-level biological models of organisms, combined with realistic 3D oceanographic models of physics and biogeochemistry describing the environment of the organisms, without assumptions about spatial or temporal scales. The open-source framework features a minimal robust interface to facilitate the coupling between individual-level biological models and oceanographic models, and we provide application examples including forward/backward simulations, habitat connectivity calculations, assessing ocean conditions, comparison of physical circulation models, model ensemble runs and, recently, posterior Eulerian simulations using the IBMlib framework. We present the code design ideas behind the longevity of the code, our implementation experiences, as well as code performance benchmarking. The framework may contribute substantially to progress in representing, understanding, predicting and eventually managing marine ecosystems. PMID:29351280
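The Lagrangian core that such a framework wraps can be sketched as a fixed-step Runge-Kutta particle integrator in a prescribed velocity field. Here a steady solid-body rotation stands in for the interpolated ocean-model currents, chosen because the exact trajectory (a circle) is known; IBMlib's actual interfaces and fields are not reproduced.

```python
import numpy as np

def velocity(p, t):
    """Steady solid-body rotation about the origin (a stand-in for an
    interpolated ocean-model velocity field)."""
    x, y = p
    return np.array([-y, x])

def rk4_step(p, t, dt, vel):
    """One classical 4th-order Runge-Kutta step of dp/dt = vel(p, t)."""
    k1 = vel(p, t)
    k2 = vel(p + 0.5 * dt * k1, t + 0.5 * dt)
    k3 = vel(p + 0.5 * dt * k2, t + 0.5 * dt)
    k4 = vel(p + dt * k3, t + dt)
    return p + dt * (k1 + 2 * k2 + 2 * k3 + k4) / 6.0

# Advect one particle for (almost) one full revolution.
p = np.array([1.0, 0.0])
dt = 0.01
for n in range(int(2 * np.pi / dt)):
    p = rk4_step(p, n * dt, dt, velocity)
```

In an individual-based model, the biological state (growth, mortality, behaviour) is updated alongside each advection step; the framework's job is to keep that coupling independent of the particular circulation model supplying `velocity`.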
Sexual involvement with patients.
Kirstein, L
1978-04-01
Three cases of sexual activity between patients and staff members are presented, and the determinants and consequences of this type of acting-out behavior are discussed. The patients' sexual behavior was in part motivated by a need to avoid feelings of loneliness and anxiety, and one consequence of the sexual behavior was the recurrence of symptoms and behaviors noted on admission. Staff members were observed to become more self-preoccupied and less involved with both staff and patients following the sexual behavior. The role of the ward psychiatrist in preventing such patient-staff interactions includes taking responsibility for the educational and supervisory needs of the staff, being involved in the creation and maintenance of the ward's moral code, and being aware of group and organizational factors that may impede open staff communication.
Deep-Earth reactor: Nuclear fission, helium, and the geomagnetic field
Hollenbach, D. F.; Herndon, J. M.
2001-01-01
Geomagnetic field reversals and changes in intensity are understandable from an energy standpoint as natural consequences of intermittent and/or variable nuclear fission chain reactions deep within the Earth. Moreover, deep-Earth production of helium, having 3He/4He ratios within the range observed from deep-mantle sources, is demonstrated to be a consequence of nuclear fission. Numerical simulations of a planetary-scale geo-reactor were made by using the SCALE sequence of codes. The results clearly demonstrate that such a geo-reactor (i) would function as a fast-neutron fuel breeder reactor; (ii) could, under appropriate conditions, operate over the entire period of geologic time; and (iii) would function in such a manner as to yield variable and/or intermittent output power. PMID:11562483
Moral disengagement in the corporate world.
White, Jenny; Bandura, Albert; Bero, Lisa A
2009-01-01
We analyze mechanisms of moral disengagement used to eliminate moral consequences by industries whose products or production practices are harmful to human health. Moral disengagement removes the restraint of self-censure from harmful practices. Moral self-sanctions can be selectively disengaged from harmful activities by investing them with socially worthy purposes, sanitizing and exonerating them, displacing and diffusing responsibility, minimizing or disputing harmful consequences, making advantageous comparisons, and disparaging and blaming critics and victims. Internal industry documents and public statements related to the research activities of these industries were coded for modes of moral disengagement by the tobacco, lead, vinyl chloride (VC), and silicosis-producing industries. All but one of the modes of moral disengagement were used by each of these industries. We present possible safeguards designed to protect the integrity of research.
Impacts of phylogenetic nomenclature on the efficacy of the U.S. Endangered Species Act.
Leslie, Matthew S
2015-02-01
Cataloging biodiversity is critical to conservation efforts because accurate taxonomy is often a precondition for protection under laws designed for species conservation, such as the U.S. Endangered Species Act (ESA). Traditional nomenclatural codes governing the taxonomic process have recently come under scrutiny because taxon names are more closely linked to hierarchical ranks than to the taxa themselves. A new approach to naming biological groups, called phylogenetic nomenclature (PN), explicitly names taxa by defining their names in terms of ancestry and descent. PN has the potential to increase nomenclatural stability and decrease confusion induced by the rank-based codes. But proponents of PN have struggled with whether species and infraspecific taxa should be governed by the same rules as other taxa or should have special rules. Some proponents advocate the wholesale abandonment of rank labels (including species); this could have consequences for the implementation of taxon-based conservation legislation. I examined the principles of PN as embodied in the PhyloCode (an alternative to traditional rank-based nomenclature that names biological groups based on the results of phylogenetic analyses and does not associate taxa with ranks) and assessed how this novel approach to naming taxa might affect the implementation of species-based legislation by providing a case study of the ESA. The latest version of the PhyloCode relies on the traditional rank-based codes to name species and infraspecific taxa; thus, little will change regarding the main targets of the ESA because they will retain rank labels. For this reason, and because knowledge of evolutionary relationships is of greater importance than nomenclatural procedures for initial protection of endangered taxa under the ESA, I conclude that PN under the PhyloCode will have little impact on implementation of the ESA. © 2014 Society for Conservation Biology.
Syed, Ahsan A; Almas, Aysha; Naeem, Quratulain; Malik, Umer F; Muhammad, Tariq
2017-02-01
In Asian societies, including Pakistan, a complex background of illiteracy, differing familial dynamics, lack of patient autonomy, religious beliefs, and financial constraints gives new dimensions to the code status discussion. The barriers faced by physicians during code status discussions in these societies are largely unknown. This questionnaire-based cross-sectional study, conducted in the Department of Medicine of The Aga Khan University Hospital, Karachi, Pakistan, aimed to determine the barriers and perceptions in the discussion of code status by physicians. A total of 134 physicians who had discussed at least five code statuses in their lifetime were included, of whom 77 (57.4%) responded. Family-related barriers were found to be the most common, including family denial (74.0%), the level of education of the family (66.2%), and conflict between individual family members (66.2%). Among personal barriers, lack of knowledge regarding prognosis (44.1%), personal discomfort in discussing death (29.8%), and fear of legal consequences (28.5%) ranked highest. Among hospital-related barriers, time constraints (57.1%), lack of hospital administration support (48.0%), and suboptimal nursing care after do-not-resuscitate orders (48.0%) were the most frequent. The opinions of trainees differed significantly from those of attending physicians. Family-related barriers are the most frequent roadblocks in end-of-life care discussions for physicians in Pakistan. Strengthening physicians' communication skills and family education are potential strategies to improve end-of-life care. Large multi-center studies are needed to better understand the barriers to code status discussion in developing countries.
NASA Astrophysics Data System (ADS)
Zamani, K.; Bombardelli, F. A.
2014-12-01
Verification of geophysics codes is imperative to avoid serious academic as well as practical consequences. When access to a given source code is not possible, the Method of Manufactured Solutions (MMS) cannot be employed in code verification. In contrast, employing the Method of Exact Solutions (MES) has several practical advantages. In this research, we first provide four new one-dimensional analytical solutions designed for code verification; these solutions are able to uncover particular imperfections in solvers of the advection-diffusion-reaction (ADR) equation, such as nonlinear advection, diffusion or source terms, as well as non-constant coefficients. After that, we provide a solution of Burgers' equation in a novel setup. The proposed solutions satisfy the continuity of mass for the ambient flow, which is a crucial factor for coupled hydrodynamics-transport solvers. Then, we use the derived analytical solutions for code verification. To clarify gray-literature issues in the verification of transport codes, we designed a comprehensive test suite to uncover any imperfection in transport solvers via a hierarchical increase in the level of the tests' complexity. The test suite includes hundreds of unit tests and system tests that check individual portions of the code. The tests start with a simple case of unidirectional advection, proceed through bidirectional advection and tidal flow, and build up to nonlinear cases. We design tests to check nonlinearity in velocity, dispersivity and reactions. The concealing effect of scales (Péclet and Damköhler numbers) on the mesh-convergence study and appropriate remedies are also discussed. For cases in which appropriate benchmarks for a mesh-convergence study are not available, we utilize symmetry. Auxiliary subroutines for automation of the test suite and report generation are designed.
All in all, the test package is not only a robust tool for code verification, but it also provides comprehensive insight into the capabilities of ADR solvers. Such information is essential for any rigorous computational modeling of the ADR equation for surface/subsurface pollution transport. We also convey our experiences in finding several errors that were not detectable with routine verification techniques.
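As a minimal illustration of MES-style verification with a mesh-convergence check, the sketch below solves the pure-advection limit of the ADR equation with a first-order upwind scheme, compares it against the exact translated pulse, and estimates the observed order of accuracy from two grids. The scheme, pulse shape and resolutions are illustrative assumptions, not the test suite described above:

```python
import numpy as np

def solve_advection(nx, c=1.0, t_end=0.5, cfl=0.5):
    """First-order upwind solver for u_t + c u_x = 0 on [0,1], periodic BCs.
    Returns the L2 error against the exact (translated) solution."""
    dx = 1.0 / nx
    dt = cfl * dx / c
    x = np.arange(nx) * dx
    u = np.exp(-200 * (x - 0.25) ** 2)        # initial Gaussian pulse
    t = 0.0
    while t < t_end - 1e-12:
        dt_step = min(dt, t_end - t)
        u = u - c * dt_step / dx * (u - np.roll(u, 1))   # upwind difference
        t += dt_step
    exact = np.exp(-200 * ((x - c * t_end) % 1.0 - 0.25) ** 2)
    return np.sqrt(dx * np.sum((u - exact) ** 2))

e_coarse = solve_advection(200)
e_fine = solve_advection(400)
order = np.log2(e_coarse / e_fine)   # expected to approach 1 on refinement
print(f"observed order of accuracy: {order:.2f}")
```

Halving the grid spacing should roughly halve the error for a first-order scheme; a significantly lower observed order would flag an implementation defect, which is exactly the kind of imperfection such a test is designed to expose.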
Błażej, Paweł; Wnȩtrzak, Małgorzata; Mackiewicz, Paweł
2016-12-01
One of the theories explaining the present structure of the canonical genetic code assumes that it was optimized to minimize the harmful effects of amino acid replacements resulting from nucleotide substitutions and translational errors. A way to test this concept is to find the optimal code under given criteria and compare it with the canonical genetic code. Unfortunately, the huge number of possible alternatives makes it impossible to find the optimal code using exhaustive methods in a sensible time. Therefore, heuristic methods should be applied to search the space of possible solutions. Evolutionary algorithms (EAs) seem to be one such promising approach. This class of methods relies on both mutation and crossover operators, which are responsible for creating and maintaining the diversity of candidate solutions. These operators possess dissimilar characteristics and consequently play different roles in the process of finding the best solutions under given criteria. Therefore, the search for potential solutions can be made more effective by applying both of them, especially when these operators are devised specifically for a given problem. To study this subject, we analyze the effectiveness of the algorithms for various combinations of mutation and crossover probabilities under three models of the genetic code assuming different restrictions on its structure. To achieve that, we adapt the position-based crossover operator for the most restricted model and develop a new type of crossover operator for the more general models. The applied fitness function describes the cost of amino acid replacements with respect to their polarity. Our results indicate that the usage of crossover operators can significantly improve the quality of the solutions. Moreover, simulations with the crossover operator optimize the fitness function in a smaller number of generations than simulations without it.
The optimal genetic codes without restrictions on their structure minimize the costs about 2.7 times better than the canonical genetic code. Interestingly, the optimal codes are dominated by amino acids whose polarity is close to the average value over all amino acids. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
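The interplay of mutation and crossover described above can be illustrated with a toy evolutionary algorithm over permutations, including a position-based crossover. The encoding, cost function and hyperparameters below are simplified assumptions for illustration, not the operators or genetic-code models used in the study:

```python
import random

def cost(perm, values):
    """Toy replacement cost: squared 'polarity' change between neighbours."""
    return sum((values[perm[i]] - values[perm[i + 1]]) ** 2
               for i in range(len(perm) - 1))

def swap_mutation(perm):
    a, b = random.sample(range(len(perm)), 2)
    child = perm[:]
    child[a], child[b] = child[b], child[a]
    return child

def position_based_crossover(p1, p2):
    """Keep a random subset of positions from p1, fill the rest in p2's order."""
    n = len(p1)
    keep = set(random.sample(range(n), n // 2))
    kept_vals = {p1[i] for i in keep}
    fill = iter([v for v in p2 if v not in kept_vals])
    return [p1[i] if i in keep else next(fill) for i in range(n)]

def evolve(values, pop_size=60, generations=300, p_cross=0.8):
    n = len(values)
    pop = [random.sample(range(n), n) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda p: cost(p, values))
        elite = pop[: pop_size // 3]          # elitism: keep the best third
        children = []
        while len(children) < pop_size - len(elite):
            p1, p2 = random.sample(elite, 2)
            child = (position_based_crossover(p1, p2)
                     if random.random() < p_cross else p1[:])
            children.append(swap_mutation(child))
        pop = elite + children
    return min(pop, key=lambda p: cost(p, values))

random.seed(1)
polarity = [random.uniform(0, 10) for _ in range(12)]
best = evolve(polarity)
print(cost(best, polarity))
```

With crossover enabled (`p_cross=0.8`) the population converges in markedly fewer generations than with mutation alone, which is the qualitative effect the study quantifies for real genetic-code models.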
Nonlinear Modeling of Radial Stellar Pulsations
NASA Astrophysics Data System (ADS)
Smolec, R.
2009-09-01
In this thesis, I present the results of my work concerning the nonlinear modeling of radial stellar pulsations. I focus on classical Cepheids, particularly on the double-mode phenomenon. The history of nonlinear modeling of radial stellar pulsations begins in the 1960s. At the beginning, convection was disregarded in the model equations. Qualitatively, almost all features of radial pulsators were successfully modeled with purely radiative hydrocodes. Among the problems that remained, the most disturbing was the modeling of the double-mode phenomenon. This long-standing problem seemed to be finally solved with the inclusion of turbulent convection in the model equations (Kollath et al. 1998, Feuchtinger 1998). Although the dynamical aspects of double-mode behaviour were extensively studied, its origin, and particularly the specific role played by convection, remained obscure. To study this and other problems of radial stellar pulsations, I implemented convection into pulsation hydrocodes. The codes adopt the Kuhfuss (1986) convection model. In other codes, particularly in the Florida-Budapest hydrocode (e.g. Kollath et al. 2002), used in the computation of most of the published double-mode models, different approximations concerning e.g. eddy-viscous terms or the treatment of convectively stable regions are adopted. In particular, the neglect of negative buoyancy effects in the Florida-Budapest code, and its consequences, were never discussed in the literature. These consequences are severe. For single-mode pulsators, the neglect of negative buoyancy leads to smaller pulsation amplitudes in comparison to amplitudes computed with a code including these effects; it reduces the amplitude of the fundamental mode especially strongly. This property of the Florida-Budapest models is crucial in bringing about the stable non-resonant double-mode Cepheid pulsation involving the fundamental and first overtone modes (F/1O).
Such pulsation is not observed in models computed including negative buoyancy. As the neglect of negative buoyancy is physically incorrect, so are the double-mode Cepheid models computed with the Florida-Budapest hydrocode. An extensive search for F/1O double-mode Cepheid pulsation with codes including negative buoyancy effects yielded a null result. Some resonant double-mode F/1O Cepheid models were found, but their occurrence was restricted to a very narrow domain of the Hertzsprung-Russell diagram. Model computations intended to reproduce the double-overtone (1O/2O) Cepheids of the Large Magellanic Cloud also revealed some stable double-mode pulsations, however restricted to a narrow period range. Resonances are most likely conducive to bringing about the double-mode behaviour observed in these models. However, the majority of the double-overtone LMC Cepheids cannot be reproduced with our codes. Hence, the modeling of double-overtone Cepheids with convective hydrocodes is not satisfactory either. Double-mode pulsation still lacks a satisfactory explanation, and the problem of its modeling remains open.
A simple program to measure and analyse tree rings using Excel, R and SigmaScan
Hietz, Peter
2011-01-01
I present new software that links a program for image analysis (SigmaScan), one for spreadsheets (Excel) and one for statistical analysis (R) for applications of tree-ring analysis. The first macro measures ring widths marked by the user on scanned images, stores raw and detrended data in Excel and calculates the distance to the pith and inter-series correlations. A second macro measures darkness along a defined path to identify the latewood–earlywood transition in conifers, and a third shows the potential for automatic detection of boundaries. Written in Visual Basic for Applications, the code makes use of the advantages of existing programs and is consequently very economical and relatively simple to adjust to the requirements of specific projects, or to expand by making use of already available code. PMID:26109835
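The detrending and inter-series-correlation steps mentioned above can be sketched as follows. The moving-average detrending, window size and synthetic ring-width series are illustrative assumptions, not the macros' actual algorithms (which are written in Visual Basic for Applications):

```python
import numpy as np

def detrend(series, window=5):
    """Divide ring widths by a centred moving average to remove the
    slow age/growth trend, leaving the year-to-year signal."""
    kernel = np.ones(window) / window
    trend = np.convolve(series, kernel, mode="same")
    return series / trend

def interseries_correlation(series_a, series_b):
    """Pearson correlation between two detrended ring-width series."""
    a, b = detrend(series_a), detrend(series_b)
    return np.corrcoef(a, b)[0, 1]

rng = np.random.default_rng(0)
climate = rng.normal(1.0, 0.2, 80)            # shared year-to-year signal
tree1 = climate * np.linspace(2.0, 0.5, 80)   # declining growth trend
tree2 = climate * np.linspace(1.5, 0.8, 80)   # different trend, same climate
r = interseries_correlation(tree1, tree2)
print(f"inter-series correlation: {r:.2f}")
```

Because the two synthetic trees share the climate signal but have different growth trends, the correlation is high only after detrending, which is why inter-series correlation is computed on detrended indices.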
O’Doherty, John P.
2015-01-01
Neural correlates of value have been extensively reported in a diverse set of brain regions. However, in many cases it is difficult to determine whether a particular neural response pattern corresponds to a value signal per se, as opposed to an array of alternative non-value related processes, such as outcome-identity coding, informational coding, or the encoding of autonomic and skeletomotor consequences, alongside previously described “salience” or “attentional” effects. Here, I review a number of experimental manipulations that can be used to test for value, and I identify the challenges in ascertaining whether a particular neural response is or is not a value signal. Finally, I emphasize that some non-value related signals may be especially informative as a means of providing insight into the nature of the decision-making related computations being implemented in a particular brain region. PMID:24726573
Darwinism and ethology. The role of natural selection in animals and humans.
Gervet, J; Soleilhavoup, M
1997-11-01
The role of behaviour in biological evolution is examined within the context of Darwinism. All Darwinian models are based on the distinction between two mechanisms: one that permits the faithful transmission of a feature from one generation to another, and another that differentially regulates the degree of this transmission. Behaviour plays a minimal role as an agent of transmission in the greater part of the animal kingdom; by contrast, the forms it may assume strongly influence the mechanisms of selection regulating the different rates of transmission. We consider the decisive feature of the human species to be the existence of a phenotypical system of cultural coding characterized by the precision and reliability that are the distinctive features of genetic coding in animals. We examine the consequences for the application of the Darwinian model to human history.
SKIRT: Hybrid parallelization of radiative transfer simulations
NASA Astrophysics Data System (ADS)
Verstocken, S.; Van De Putte, D.; Camps, P.; Baes, M.
2017-07-01
We describe the design, implementation and performance of the new hybrid parallelization scheme in our Monte Carlo radiative transfer code SKIRT, which has been used extensively for modelling the continuum radiation of dusty astrophysical systems including late-type galaxies and dusty tori. The hybrid scheme combines distributed memory parallelization, using the standard Message Passing Interface (MPI) to communicate between processes, and shared memory parallelization, providing multiple execution threads within each process to avoid duplication of data structures. The synchronization between multiple threads is accomplished through atomic operations without high-level locking (also called lock-free programming). This improves the scaling behaviour of the code and substantially simplifies the implementation of the hybrid scheme. The result is an extremely flexible solution that adjusts to the number of available nodes, processors and memory, and consequently performs well on a wide variety of computing architectures.
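The distribute-then-reduce pattern underlying such a Monte Carlo transfer code can be sketched in miniature. This is a loose illustration using Python worker processes in place of MPI ranks, with hypothetical packet physics; it is not SKIRT's actual C++/MPI implementation, and Python offers no direct analogue of its lock-free atomic updates:

```python
import multiprocessing as mp
import random

def trace_packets(args):
    """Worker: propagate a chunk of photon packets and tally interaction
    events. Each process owns an independent RNG stream (its seed)."""
    n_packets, seed = args
    rng = random.Random(seed)
    events = 0
    for _ in range(n_packets):
        weight = 1.0
        while weight > 1e-3:                 # follow packet until negligible
            weight *= rng.uniform(0.3, 0.9)  # toy scattering/absorption loss
            events += 1
    return events

if __name__ == "__main__":
    chunks = [(10_000, seed) for seed in range(4)]   # one chunk per process
    with mp.Pool(4) as pool:
        partials = pool.map(trace_packets, chunks)   # distributed-memory step
    print(sum(partials))                             # reduction, as in MPI_Reduce
```

In the real hybrid scheme, each such process would additionally spawn multiple threads sharing the dust-grid data structures, so that large read-mostly data is stored once per node rather than once per execution thread.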
Information theory of adaptation in neurons, behavior, and mood.
Sharpee, Tatyana O; Calhoun, Adam J; Chalasani, Sreekanth H
2014-04-01
The ability to make accurate predictions of future stimuli and of the consequences of one's actions is crucial for survival and appropriate decision-making. These predictions are constantly being made at different levels of the nervous system. This is evidenced by adaptation to stimulus parameters in sensory coding, and by the learning of an up-to-date model of the environment at the behavioral level. This review discusses recent findings showing that the actions of neurons and animals are selected based on detailed stimulus history in such a way as to maximize information for achieving the task at hand. Information maximization dictates not only how sensory coding should adapt to various statistical aspects of stimuli, but also that the reward function should adapt to match the predictive information from past to future. Copyright © 2013 Elsevier Ltd. All rights reserved.
Hamonet, Claude
2007-01-01
The reparation of corporeal damage resulting from intentional or unintentional violence is one measure of stability and progress in human societies concerned with a dignified life for victims. Initiated by the Code of Hammurabi and continued by the Jews in the Bible, the reference was (and still is) the amputated or impaired part of the body (hand, arm, leg, eye...). For every body part, a tariff in money or a rate in percentage was indicated. The Brethren of the Coast translated these tariffs into ecus or slaves. This code reveals the originality of a society founded on violence, robbery and murder, yet incorporating cooperative, if not democratic, modes of functioning. The role of Bertrand d'Ogeron, governor of Tortuga (Turtle Island), was very beneficial.
Hybrid finite element/waveguide mode analysis of passive RF devices
NASA Astrophysics Data System (ADS)
McGrath, Daniel T.
1993-07-01
A numerical solution for time-harmonic electromagnetic fields in two-port passive radio frequency (RF) devices has been developed, implemented in a computer code, and validated. Vector finite elements are used to represent the fields in the device interior, and field continuity across waveguide apertures is enforced by matching the interior solution to a sum of waveguide modes. Consequently, the mesh may end at the aperture instead of extending into the waveguide. The report discusses the variational formulation and its reduction to a linear system using Galerkin's method. It describes the computer code, including its interface to commercial CAD software used for geometry generation. It presents validation results for waveguide discontinuities, coaxial transitions, and microstrip circuits. They demonstrate that the method is an effective and versatile tool for predicting the performance of passive RF devices.
Force-free electrodynamics in dynamical curved spacetimes
NASA Astrophysics Data System (ADS)
McWilliams, Sean
2015-04-01
We present results from our study of force-free electrodynamics in curved spacetimes. Specifically, we present several improvements to what has become the established set of evolution equations, and we apply these to study the nonlinear stability of analytically known force-free solutions for the first time. We implement our method in a new pseudo-spectral code built on top of the SpEC code for evolving dynamical spacetimes. We then revisit these known solutions and attempt to clarify some interesting properties that render them analytically tractable. Finally, we preview some new work that similarly revisits the established approach to solving another problem in numerical relativity: the post-merger recoil from asymmetric gravitational-wave emission. These new results may have significant implications for the parameter dependence of recoils, and consequently for the statistical expectations for recoil velocities of merged systems.
2007 Report to Congress of the U.S.- China Economic and Security Review Commission
2007-11-01
the $3 billion stake it took in the New York-based private equity firm The Blackstone Group. Some worry that the new fund may be used to capture more...pollution of surrounding riverbanks and other ecological harm. Pollution from Coal Mining Air pollution is not the only environmental consequence of Chi...technology firms and human rights organizations was formed to discuss the establishment of an international code of ethics on issues related to
An Improved Maintenance Model for the Simulation of Strategic Airlift Capability.
1982-03-01
developed using SLAM as the primary simulation language. Maintenance manning is modeled at the Air Force Specialty Code level, to allow the possibility of...Atlantic Treaty Organization (NATO) allies is one of our primary national objectives, but recent increases in Soviet ground and air forces (Ref 5:100) have...arrive from the United States. Consequently, the primary objective of the United States Air Force mobility program is to be able, by 1982, to double the
1983-04-01
SIGNIFICANCE AND EXPLANATION Many different codes for the simulation of semiconductor devices such as transistors, diodes, thyristors are already circulated... partially take into account the consequences introduced by degenerate semiconductors (e.g. invalidity of Boltzmann's statistics, bandgap narrowing). These... n·p = n_ie^2 · exp((phi_p − phi_n)/U_t) (2.10), n = n_ie · exp((psi − phi_n)/U_t), p = n_ie · exp((phi_p − psi)/U_t) (2.11). (2.10) can be physically interpreted as the application of Boltzmann statistics. However, (2.10) ...
Spatial coding of ordinal information in short- and long-term memory.
Ginsburg, Véronique; Gevers, Wim
2015-01-01
The processing of numerical information induces a spatial response bias: faster responses to small numbers with the left hand and faster responses to large numbers with the right hand. Most theories agree that long-term representations underlie this so-called SNARC effect (Spatial Numerical Association of Response Codes; Dehaene et al., 1993). However, a spatial response bias was also observed with the activation of temporary position-space associations in working memory (the ordinal position effect; van Dijck and Fias, 2011): items belonging to the beginning of a memorized sequence are responded to faster with the left hand, while items at the end of the sequence are responded to faster with the right hand. The theoretical possibility was put forward that the SNARC effect is an instance of the ordinal position effect, with the empirical consequence that the SNARC effect and the ordinal position effect cannot be observed simultaneously. In two experiments we falsify this claim by demonstrating that the SNARC effect and the ordinal position effect are not mutually exclusive. Consequently, this suggests that the two effects result from the activation of different representations. We conclude that spatial response biases can result from the activation of both pre-existing positions in long-term memory and temporary space associations in working memory at the same time.
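The standard way to quantify such spatial response biases is to regress the right-minus-left response-time difference (dRT) on number magnitude; a negative slope indicates a SNARC effect. The dRT values below are hypothetical and serve only to show the analysis:

```python
import numpy as np

# Hypothetical mean dRT (right hand minus left hand, in ms) for digits 1-9
# excluding 5; a SNARC pattern gives positive dRT for small digits (left-hand
# advantage) and negative dRT for large digits (right-hand advantage).
digits = np.array([1, 2, 3, 4, 6, 7, 8, 9])
drt = np.array([35, 28, 15, 8, -5, -14, -25, -33])

slope, intercept = np.polyfit(digits, drt, 1)   # least-squares regression line
print(f"SNARC slope: {slope:.1f} ms per unit magnitude")
```

The same regression logic applies to the ordinal position effect, with serial position in the memorized sequence replacing digit magnitude as the predictor.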
COOL: A code for Dynamic Monte Carlo Simulation of molecular dynamics
NASA Astrophysics Data System (ADS)
Barletta, Paolo
2012-02-01
Cool is a program to simulate evaporative and sympathetic cooling for a mixture of two gases co-trapped in a harmonic potential. The collisions involved are assumed to be exclusively elastic, and losses are due to evaporation from the trap. Each particle is followed individually in its trajectory; consequently, properties such as spatial densities or energy distributions can be readily evaluated. The code can be used sequentially, by employing one output as input for another run. The code can be easily generalised to describe more complicated processes, such as the inclusion of inelastic collisions, or the possible presence of more than two species in the trap. New version program summary: Program title: COOL Catalogue identifier: AEHJ_v2_0 Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEHJ_v2_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html No. of lines in distributed program, including test data, etc.: 1 097 733 No. of bytes in distributed program, including test data, etc.: 18 425 722 Distribution format: tar.gz Programming language: C++ Computer: Desktop Operating system: Linux RAM: 500 Mbytes Classification: 16.7, 23 Catalogue identifier of previous version: AEHJ_v1_0 Journal reference of previous version: Comput. Phys. Comm. 182 (2011) 388 Does the new version supersede the previous version?: Yes Nature of problem: Simulation of the sympathetic cooling process occurring for two molecular gases co-trapped in a deep optical trap. Solution method: The Direct Simulation Monte Carlo method exploits the decoupling, over a short time period, of the inter-particle interaction from the trapping potential. The particle dynamics is thus exclusively driven by the external optical field.
The rare inter-particle collisions are treated with an acceptance/rejection mechanism, that is, by comparing a random number to the collisional probability defined in terms of the inter-particle cross section and centre-of-mass energy. All particles in the trap are individually simulated, so that at each time step a number of useful quantities, such as the spatial densities or the energy distributions, can be readily evaluated. Reasons for new version: A number of issues made the old version very difficult to port to different architectures, and impossible to compile on Windows. Furthermore, the test-run results could be replicated only poorly, as a consequence of the simulations being very sensitive to the machine background noise. In practice, as the particles are simulated for billions of steps, a small difference in the initial conditions due to the finiteness of double-precision reals can have macroscopic effects on the output. This is not a problem in its own right, but a feature of such simulations. However, for the sake of completeness we have introduced a quadruple-precision version of the code, which yields the same results independently of the software used to compile it or the hardware architecture on which the code is run. Summary of revisions: A number of bugs in the dynamic memory allocation have been detected and removed, mostly in the cool.cpp file. All files have been renamed with a .cpp ending, rather than .c++, to make them compatible with Windows. The random number generator routine, which is the computational core of the algorithm, has been re-written in C++, so there is no longer any need for cross FORTRAN-C++ compilation. A quadruple-precision version of the code is provided alongside the original double-precision one. The makefile allows the user to choose which one to compile by setting the switch PRECISION to either double or quad.
The source code and header files have been organised into directories to make the code's file system neater. Restrictions: The in-trap motion of the particles is treated classically. Running time: The running time is relatively short, 1-2 hours. However, it is convenient to replicate each simulation several times with different initialisations of the random sequence.
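The acceptance/rejection step described under "Solution method" can be sketched as follows. The constant cross section, the Gaussian relative-speed distribution and the normalising bound `sigma_vrel_max` are illustrative assumptions, not COOL's actual physics:

```python
import random

def collision_accepted(v_rel, sigma, sigma_vrel_max, rng):
    """DSMC acceptance/rejection: accept a candidate pair with probability
    proportional to sigma * v_rel, normalised by an assumed upper bound."""
    return rng.random() < (sigma * v_rel) / sigma_vrel_max

rng = random.Random(42)
sigma = 1.0              # toy constant cross section
sigma_vrel_max = 3.0     # assumed upper bound on sigma * v_rel
trials = 100_000
accepted = 0
for _ in range(trials):
    v_rel = abs(rng.gauss(0.0, 1.0))   # toy relative-speed distribution
    accepted += collision_accepted(v_rel, sigma, sigma_vrel_max, rng)
acceptance_rate = accepted / trials
print(f"{acceptance_rate:.3f}")
```

Because only the accepted pairs actually exchange momentum, the loop over candidate pairs stays cheap even when physical collisions are rare, which is the point of the acceptance/rejection scheme.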
Conceptual-driven classification for coding advice in health insurance reimbursement.
Li, Sheng-Tun; Chen, Chih-Chuan; Huang, Fernando
2011-01-01
With the non-stop increases in medical treatment fees, the economic survival of a hospital in Taiwan relies on the reimbursements received from the Bureau of National Health Insurance, which in turn depend on the accuracy and completeness of the content of the discharge summaries as well as the correctness of their International Classification of Diseases (ICD) codes. The purpose of this research is to reinforce the entire disease classification framework by supporting disease classification specialists in the coding process. This study developed an ICD code advisory system (ICD-AS) that performs knowledge discovery from discharge summaries and suggests ICD codes. Natural language processing and information retrieval techniques based on Zipf's law were applied to process the content of discharge summaries, and fuzzy formal concept analysis was used to analyze and represent the relationships between the medical terms identified by MeSH. In addition, a certainty factor, used as a reference during the coding process, was calculated to account for uncertainty and strengthen the credibility of the outcome. Two sets of 360 and 2579 textual discharge summaries of patients suffering from cerebrovascular disease were processed to build up ICD-AS and to evaluate its prediction performance. A number of experiments were conducted to investigate the impact of system parameters on accuracy and to compare the proposed model to traditional classification techniques, including linear-kernel support vector machines. The comparison results showed that the proposed system achieves better overall performance in terms of several measures. In addition, some useful implication rules were obtained, which improve comprehension of the field of cerebrovascular disease and give insights into the relationships between relevant medical terms.
Our system contributes valuable guidance to disease classification specialists in the process of coding discharge summaries, which consequently brings benefits to patients, hospitals, and the healthcare system. Copyright © 2010 Elsevier B.V. All rights reserved.
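A stripped-down version of the retrieval idea, weighting terms by their rarity in the spirit of Zipf's law, might look like the sketch below. The corpus, terms and ICD-9 codes are hypothetical, and the real system additionally uses fuzzy formal concept analysis, MeSH term identification and a certainty factor:

```python
import math
from collections import Counter

# Hypothetical training corpus: discharge-summary terms observed per ICD code.
corpus = {
    "433.1": ["carotid", "stenosis", "occlusion", "infarction", "carotid"],
    "434.9": ["occlusion", "cerebral", "artery", "infarction", "acute"],
    "430":   ["subarachnoid", "hemorrhage", "aneurysm", "acute"],
}

def idf(term):
    """Inverse document frequency: rare terms (Zipf's long tail) weigh more."""
    df = sum(term in terms for terms in corpus.values())
    return math.log(len(corpus) / df) if df else 0.0

def score(summary_terms, code):
    """Rank a candidate code by rarity-weighted overlap with the summary."""
    counts = Counter(corpus[code])
    return sum(counts[t] * idf(t) for t in summary_terms)

summary = ["carotid", "stenosis", "acute"]
ranked = sorted(corpus, key=lambda c: score(summary, c), reverse=True)
print(ranked[0])   # prints: 433.1
```

Discriminative terms such as "carotid" dominate the ranking, while terms shared across many codes (such as "acute") contribute little, mirroring why frequency-based weighting helps an advisory system.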
Optimizing the use of a sensor resource for opponent polarization coding
Heras, Francisco J.H.
2017-01-01
Flies use the specialized photoreceptors R7 and R8 in the dorsal rim area (DRA) to detect skylight polarization. R7 and R8 form a tiered waveguide (the central rhabdomere pair, CRP) with R7 on top, filtering the light delivered to R8. We examine how the division of a given resource, CRP length, between R7 and R8 affects their ability to code polarization angle. We model optical absorption to show how the length fractions allotted to R7 and R8 determine the rates at which they transduce photons, and correct these rates for transduction unit saturation. The rates give the polarization signal and photon noise in R7 and in R8. Their signals are combined in an opponent unit, intrinsic noise is added, and the unit's output is analysed to extract two measures of coding ability: the number of discriminable polarization angles and the mutual information. A very long R7 maximizes the opponent signal amplitude, but codes inefficiently due to photon noise in the very short R8. Discriminability and mutual information are optimized by maximizing the signal-to-noise ratio, SNR. At lower light levels, approximately equal lengths of R7 and R8 are optimal because photon noise dominates. At higher light levels, intrinsic noise comes to dominate and a shorter R8 is optimal; the optimum R8 length fraction falls to one third. This intensity-dependent range of optimal length fractions corresponds to the range observed in different fly species and is not affected by transduction unit saturation. We conclude that a limited resource, rhabdom length, can be divided between two polarization sensors, R7 and R8, to optimize opponent coding. We also find that coding ability increases sub-linearly with total rhabdom length, according to the law of diminishing returns. Consequently, the specialized shorter central rhabdom in the DRA codes polarization twice as efficiently with respect to rhabdom length as the longer rhabdom used in the rest of the eye. PMID:28316880
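A crude version of the length-allocation trade-off can be sketched numerically. The exponential absorption model, the `min()` proxy for opponent signal and all parameter values below are simplifying assumptions; the sketch only captures the photon-noise-limited balancing of the two tiered channels, not the paper's full opponent-unit model:

```python
import math

def rates(f, k=3.0, photons=1000.0):
    """Photon transduction rates in tiered R7 (top, length fraction f) and R8.
    R7 absorbs first; R8 only receives what R7 transmits."""
    n7 = photons * (1 - math.exp(-k * f))
    n8 = photons * math.exp(-k * f) * (1 - math.exp(-k * (1 - f)))
    return n7, n8

def opponent_snr(f):
    n7, n8 = rates(f)
    signal = min(n7, n8)          # opponent output limited by weaker channel
    noise = math.sqrt(n7 + n8)    # photon (Poisson) noise only
    return signal / noise

best_f = max((f / 100 for f in range(1, 100)), key=opponent_snr)
print(f"R7 length fraction maximizing the toy SNR: {best_f:.2f}")
```

In this toy model the total photon rate n7 + n8 is independent of f, so the optimum simply equalizes the two channels; because R7 sees unfiltered light, that balance point puts R7 well below half the total length. Reproducing the intensity-dependent shift toward a shorter R8 reported above would require adding the intrinsic-noise term of the full model.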
The Purine Bias of Coding Sequences is Determined by Physicochemical Constraints on Proteins.
Ponce de Leon, Miguel; de Miranda, Antonio Basilio; Alvarez-Valin, Fernando; Carels, Nicolas
2014-01-01
For this report, we analyzed protein secondary structures in relation to the statistics of three nucleotide codon positions. The purpose of this investigation was to find which properties of the ribosome, tRNA or protein level, could explain the purine bias (Rrr) as it is observed in coding DNA. We found that the Rrr pattern is the consequence of a regularity (the codon structure) resulting from physicochemical constraints on proteins and thermodynamic constraints on ribosomal machinery. The physicochemical constraints on proteins mainly come from the hydropathy and molecular weight (MW) of secondary structures as well as the energy cost of amino acid synthesis. These constraints appear through a network of statistical correlations, such as (i) the cost of amino acid synthesis, which is in favor of a higher level of guanine in the first codon position, (ii) the constructive contribution of hydropathy alternation in proteins, (iii) the spatial organization of secondary structure in proteins according to solvent accessibility, (iv) the spatial organization of secondary structure according to amino acid hydropathy, (v) the statistical correlation of MW with protein secondary structures and their overall hydropathy, (vi) the statistical correlation of thymine in the second codon position with hydropathy and the energy cost of amino acid synthesis, and (vii) the statistical correlation of adenine in the second codon position with amino acid complexity and the MW of secondary protein structures. Amino acid physicochemical properties and functional constraints on proteins constitute a code that is translated into a purine bias within the coding DNA via tRNAs. In that sense, the Rrr pattern within coding DNA is the effect of information transfer on nucleotide composition from protein to DNA by selection according to the codon positions. 
Thus, coding DNA structure and ribosomal machinery co-evolved to minimize the energy cost of protein coding given the functional constraints on proteins.
Moral competence among nurses in Malawi: A concept analysis approach.
Maluwa, Veronica Mary; Gwaza, Elizabeth; Sakala, Betty; Kapito, Esnath; Mwale, Ruth; Haruzivishe, Clara; Chirwa, Ellen
2018-01-01
Nurses are expected to provide comprehensive, holistic and ethically accepted care according to their code of ethics and practice. However, in Malawi, this is not always the case. This article analyses the concept of moral competence using Walker and Avant's strategy of concept analysis. The aim is to analyse moral competence in relation to nursing practice and to determine the defining attributes, antecedents and consequences of moral competence in nursing practice. Deductive analysis was used to find the defining attributes of moral competence, which were kindness, compassion, caring, critical thinking, ethical decision-making ability, problem solving, responsibility, discipline, accountability, communication, solidarity, honesty, and respect for human values, dignity and rights. The identified antecedents were personal, cultural and religious values; nursing ethics training; environment; and guidance. The consequences of moral competence are team-work spirit, effective communication, improved performance and positive attitudes in providing nursing care. Moral competence can therefore be used as a tool to improve care in nursing practice, to meet patients' problems and needs, and consequently to increase the public's satisfaction in Malawi.
Palmer, Cameron S; Franklyn, Melanie
2011-01-07
Trauma systems should consistently monitor a given trauma population over a period of time. The Abbreviated Injury Scale (AIS) and derived scores such as the Injury Severity Score (ISS) are commonly used to quantify injury severities in trauma registries. To reflect contemporary trauma management and treatment, the most recent version of the AIS (AIS08) contains many codes which differ in severity from their equivalents in the earlier 1998 version (AIS98). Consequently, the adoption of AIS08 may impede comparisons between data coded using different AIS versions. It may also affect the number of patients classified as major trauma. The entire AIS98-coded injury dataset of a large population-based trauma registry was retrieved and mapped to AIS08 using the currently available AIS98-AIS08 dictionary map. The percentage of codes which had increased or decreased in severity, or could not be mapped, was examined in conjunction with the effect of these changes on the calculated ISS. The potential for free-text information accompanying AIS coding to improve the quality of AIS mapping was explored. A total of 128280 AIS98-coded injuries were evaluated in 32134 patients, 15471 of whom were classified as major trauma. Although only 4.5% of dictionary codes decreased in severity from AIS98 to AIS08, this represented almost 13% of injuries in the registry. In 4.9% of patients, no injuries could be mapped. ISS was potentially unreliable in one-third of patients, as they had at least one AIS98 code which could not be mapped. Using AIS08, the number of patients classified as major trauma decreased by between 17.3% and 30.3%. Evaluation of free-text descriptions for some injuries demonstrated the potential to improve mapping between AIS versions. Converting AIS98-coded data to AIS08 results in a significant decrease in the number of patients classified as major trauma.
Many AIS98 codes are missing from the existing AIS map, and across a trauma population the AIS08 dataset estimates it produces are of insufficient quality to be used in practice. However, it may be possible to improve AIS98 to AIS08 mapping to the point where it is useful to established registries.
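For context, the ISS discussed above is conventionally computed from AIS severity scores as the sum of squares of the highest severities in the three most severely injured body regions. The sketch below follows this standard convention with invented injuries; the region names and the AIS-6 rule reflect common registry practice, and the ISS > 15 major-trauma threshold used in the comment is one widely used convention, not a claim from this study.

```python
# Standard ISS calculation from (body region, AIS severity) pairs.
# If any AIS98 code cannot be mapped to AIS08, its severity is unknown
# and the resulting ISS is unreliable -- the problem described above.

def iss(injuries: list[tuple[str, int]]) -> int:
    """injuries: (body_region, AIS severity 1-6) pairs for one patient."""
    if any(sev == 6 for _, sev in injuries):
        return 75  # by convention, any AIS-6 injury sets ISS to the maximum
    worst: dict[str, int] = {}
    for region, sev in injuries:
        worst[region] = max(worst.get(region, 0), sev)
    top3 = sorted(worst.values(), reverse=True)[:3]
    return sum(s * s for s in top3)

patient = [("head", 4), ("head", 3), ("chest", 3), ("abdomen", 2)]
print(iss(patient))  # → 29 (4^2 + 3^2 + 2^2); major trauma if threshold is ISS > 15
```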
A rocket-borne pulse-height analyzer for energetic particle measurements
NASA Technical Reports Server (NTRS)
Leung, W.; Smith, L. G.; Voss, H. D.
1979-01-01
The pulse-height analyzer (PHA) is essentially a time-sharing, multiplexing data-acquisition system which acquires analog data (from energetic particle spectrometers) and converts them into digital code. The PHA simultaneously acquires pulse-height information from the analog signals of the four input channels and sequentially multiplexes the digitized data to a microprocessor. The PHA together with the microprocessor form an on-board real-time data-manipulation system. The system processes data obtained during the rocket flight and reduces the amount of data to be sent back to the ground station. Consequently the data-reduction process for the rocket experiments is sped up. By using a time-sharing technique, the throughput rate of the microprocessor is increased. Moreover, data from several particle spectrometers are multiplexed onto one information channel; consequently, the TM capacity is increased.
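The digitization-and-multiplexing scheme described above can be sketched in Python. This is an illustrative model only, not the flight hardware: the bin count, channel count and ADC full-scale voltage are assumptions for the example.

```python
# Illustrative model of a pulse-height analyzer: quantize analog pulse
# heights into digital codes (bins) and accumulate per-channel spectra,
# multiplexing four detector channels onto one accumulator.

N_BINS = 64        # assumed number of pulse-height channels
N_INPUTS = 4       # four spectrometer inputs, as described above
FULL_SCALE = 5.0   # assumed ADC full-scale voltage

def bin_index(pulse_height: float) -> int:
    """Quantize an analog pulse height (volts) into one of N_BINS codes."""
    idx = int(pulse_height / FULL_SCALE * N_BINS)
    return min(max(idx, 0), N_BINS - 1)

spectra = [[0] * N_BINS for _ in range(N_INPUTS)]  # one histogram per input

def acquire(channel: int, pulse_height: float) -> None:
    """Accumulate one multiplexed event into its channel's spectrum."""
    spectra[channel][bin_index(pulse_height)] += 1

# Multiplexed event stream: (input channel, pulse height in volts).
for ch, v in [(0, 1.2), (0, 1.3), (1, 4.9), (3, 0.1)]:
    acquire(ch, v)

print(sum(map(sum, spectra)))  # → 4 events accumulated across all channels
```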
Fernandez-Mercado, Marta; Manterola, Lorea; Larrea, Erika; Goicoechea, Ibai; Arestin, María; Armesto, María; Otaegui, David; Lawrie, Charles H
2015-01-01
The gold standard for cancer diagnosis remains the histological examination of affected tissue, obtained either by surgical excision or radiologically guided biopsy. Such procedures, however, are expensive, not without risk to the patient, and require consistent evaluation by expert pathologists. Consequently, the search for non-invasive tools for the diagnosis and management of cancer has led to great interest in the field of circulating nucleic acids in plasma and serum. An additional benefit of blood-based testing is the ability to carry out screening and repeat sampling on patients undergoing therapy, or monitoring disease progression, allowing for the development of a personalized approach to cancer patient management. Despite having been discovered over 60 years ago, the clear clinical potential of circulating nucleic acids, with the notable exception of prenatal diagnostic testing, has yet to translate into the clinic. The recent discovery of non-coding (nc) RNA (in particular micro(mi)RNAs) in the blood has provided fresh impetus for the field. In this review, we discuss the potential of the circulating transcriptome (coding and ncRNA) as novel cancer biomarkers, the controversy surrounding their origin and biology, and most importantly the hurdles that remain to be overcome if they are really to become part of future clinical practice. PMID:26119132
Study of steam condensation at sub-atmospheric pressure: setting a basic research using MELCOR code
NASA Astrophysics Data System (ADS)
Manfredini, A.; Mazzini, M.
2017-11-01
One of the most serious accidents that can occur in the experimental nuclear fusion reactor ITER is the break of one of the headers of the refrigeration system of the first wall of the Tokamak. This results in the discharge of a water-steam mixture into the vacuum vessel (VV), with consequent pressurization of the vessel. To prevent the pressure in the VV from exceeding 150 kPa absolute, a system discharges the steam into a suppression pool at an absolute pressure of 4.2 kPa. The computer codes used to analyze such incidents (e.g. RELAP5 or MELCOR) are not validated experimentally for these conditions. Therefore, we planned a basic research programme in order to obtain experimental data useful for validating the heat transfer correlations used in these codes. After a thorough literature search on this topic, ACTA, in collaboration with the staff of ITER, defined the experimental matrix and performed the design of the experimental apparatus. For the thermal-hydraulic design of the experiments, we executed a series of calculations with MELCOR. This code, however, was used in an unconventional mode, with the development of models suited respectively to low and high steam flow-rate tests. The article concludes with a discussion of the placement of the experimental data within the map featuring the phenomenon characteristics, showing the importance of the new knowledge acquired, particularly in the case of chugging.
Two-Fluid Extensions to the M3D CDX-U Validation Study
NASA Astrophysics Data System (ADS)
Breslau, J.; Strauss, H.; Sugiyama, L.
2005-10-01
As part of a cross-code verification and validation effort, both the M3D code [1] and the NIMROD code [2] have qualitatively reproduced the nonlinear behavior of a complete sawtooth cycle in the CDX-U tokamak, chosen for the study because its low temperature and small size put it in a parameter regime easily accessible to both codes. Initial M3D studies on this problem used a resistive MHD model with a large, empirical perpendicular heat transport value and with modest toroidal resolution (24 toroidal planes). The success of this study prompted the pursuit of more quantitatively accurate predictions by the application of more sophisticated physical models and higher numerical resolution. The results of two consequent follow-up studies are presented here. In the first, the toroidal resolution of the original run is doubled to 48 planes. The behavior of the sawtooth in this case is essentially the same as in the lower-resolution study. The sawtooth study has also been repeated using a two-fluid plasma model, with the effects of the ω*i term emphasized. The resulting mode rotation, as well as the effects on the reconnection rate (sawtooth crash time), sawtooth period, and overall stability are presented. [1] W. Park, et al., Phys. Plasmas 6, 1796 (1999). [2] C. Sovinec, et al., J. Comp. Phys. 195, 355 (2004).
Al-Hablani, Bader
2017-01-01
The objective of this study is to discuss and analyze the use of automated SNOMED CT clinical coding in clinical decision support systems (CDSSs) for preventive care. The central question that this study seeks to answer is whether the utilization of SNOMED CT in CDSSs can improve preventive care. PubMed, Google Scholar, and Cochrane Library were searched for articles published in English between 2001 and 2012 on SNOMED CT, CDSS, and preventive care. Outcome measures were the sensitivity or specificity of SNOMED CT-coded data and the positive predictive value or negative predictive value of SNOMED CT-coded data. Additionally, we documented the publication year, research question, study design, results, and conclusions of these studies. The reviewed studies suggested that SNOMED CT successfully represents clinical terms and negated clinical terms. The use of SNOMED CT in CDSSs can be considered an answer to the problem of medical errors, as well as a support for preventive care in general. Enhancement of the modifiers and synonyms found in SNOMED CT will be necessary to improve the expected outcome of the integration of SNOMED CT with CDSS. Moreover, the application of the tree-augmented naïve (TAN) Bayesian network method can be considered the best technique to search SNOMED CT data and, consequently, to help improve preventive health services.
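The outcome measures named above are the standard confusion-matrix statistics; the following sketch computes them from hypothetical counts (all numbers are illustrative, not results from the reviewed studies).

```python
# Sensitivity, specificity, PPV and NPV from a 2x2 confusion matrix,
# as would be reported for SNOMED CT concept-detection accuracy.

def metrics(tp: int, fp: int, fn: int, tn: int) -> dict[str, float]:
    """Compute the four outcome measures from true/false positives/negatives."""
    return {
        "sensitivity": tp / (tp + fn),  # coded concepts correctly detected
        "specificity": tn / (tn + fp),  # absent concepts correctly rejected
        "ppv": tp / (tp + fp),          # positive predictive value
        "npv": tn / (tn + fn),          # negative predictive value
    }

# Hypothetical evaluation counts for one coded concept.
print(metrics(tp=90, fp=10, fn=5, tn=95))
```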
Energy Cost Impact of Non-Residential Energy Code Requirements
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhang, Jian; Hart, Philip R.; Rosenberg, Michael I.
2016-08-22
The 2012 International Energy Conservation Code contains 396 separate requirements applicable to non-residential buildings; however, there is no systematic analysis of the energy cost impact of each requirement. Consequently, limited code department budgets for plan review, inspection, and training cannot be focused on the most impactful items. An inventory and ranking of code requirements based on their potential energy cost impact is under development. The initial phase focuses on office buildings with simple HVAC systems in climate zone 4C. Prototype building simulations were used to estimate the energy cost impact of varying levels of non-compliance. A preliminary estimate of the probability of occurrence of each level of non-compliance was combined with the estimated lost savings for each level to rank the requirements according to expected savings impact. The methodology for developing and refining further energy cost impact estimates, specific to building type, system type, and climate location, is demonstrated. As results are developed, an innovative alternative method for compliance verification can focus efforts so that only the most impactful requirements from an energy cost perspective are verified for every building, while a subset of the less impactful requirements is verified on a random basis across a building population. The results can be further applied in prioritizing training material development and specific areas of building official training.
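The ranking step described above, combining the probability of each non-compliance level with its lost savings, amounts to an expected-value calculation. A minimal sketch follows; the requirement names and all numbers are invented for illustration and are not from the study.

```python
# Rank code requirements by expected energy-cost impact:
# expected impact = sum over non-compliance levels of
#                   P(level occurs) * lost annual savings at that level.

def expected_impact(levels: list[tuple[float, float]]) -> float:
    """levels: (probability of non-compliance level, lost annual savings $)."""
    return sum(p * lost for p, lost in levels)

# Hypothetical requirements, each with one or more non-compliance levels.
requirements = {
    "lighting controls": [(0.2, 500.0), (0.05, 1500.0)],
    "duct sealing":      [(0.4, 120.0), (0.10, 300.0)],
    "economizer":        [(0.1, 900.0)],
}

ranked = sorted(requirements,
                key=lambda r: expected_impact(requirements[r]),
                reverse=True)
print(ranked)  # highest expected impact first
```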
Coding and non-coding gene regulatory networks underlie the immune response in liver cirrhosis.
Gao, Bo; Zhang, Xueming; Huang, Yongming; Yang, Zhengpeng; Zhang, Yuguo; Zhang, Weihui; Gao, Zu-Hua; Xue, Dongbo
2017-01-01
Liver cirrhosis is recognized as being the consequence of immune-mediated hepatocyte damage and repair processes. However, the regulation of these immune responses underlying liver cirrhosis has not been elucidated. In this study, we used GEO datasets and bioinformatics methods to establish coding and non-coding gene regulatory networks, including transcription factor-/lncRNA-microRNA-mRNA and competing endogenous RNA interaction networks. Our results identified 2224 mRNAs, 70 lncRNAs and 46 microRNAs that were differentially expressed in liver cirrhosis. The transcription factor-/lncRNA-microRNA-mRNA network we uncovered underlying immune-mediated liver cirrhosis comprises 5 core microRNAs (e.g., miR-203; miR-219-5p), 3 transcription factors (i.e., FOXP3, ETS1 and FOS) and 7 lncRNAs (e.g., ENST00000671336, ENST00000575137). The competing endogenous RNA interaction network we identified includes a complex immune response regulatory subnetwork that controls the entire liver cirrhosis network. Additionally, we found 10 overlapping GO terms shared by both liver cirrhosis and hepatocellular carcinoma, including "immune response". Interestingly, the overlapping differentially expressed genes in liver cirrhosis and hepatocellular carcinoma were enriched in immune response-related functional terms. In summary, a complex gene regulatory network underlying immune response processes may play an important role in the development and progression of liver cirrhosis, and its development into hepatocellular carcinoma.
Analysis of the influence of the heat transfer phenomena on the late phase of the ThAI Iod-12 test
NASA Astrophysics Data System (ADS)
Gonfiotti, B.; Paci, S.
2014-11-01
Iodine is one of the major contributors to the source term during a severe accident in a Nuclear Power Plant, owing to its volatility and high radiological consequences. Therefore, large efforts have been made to describe iodine behaviour during an accident, especially in the containment system. Due to the lack of experimental data, many attempts have been carried out in recent years to fill the gaps in the knowledge of iodine behaviour. In this framework, two tests (ThAI Iod-11 and Iod-12) were carried out inside a multi-compartment steel vessel. A quite complex transient characterizes these two tests; therefore they are also suitable for thermal-hydraulic benchmarks. The two tests were originally released for a benchmark exercise during the SARNET2 EU Project. At the end of this benchmark a report covering the main findings was issued, stating that the common codes employed in severe accident studies were able to simulate the tests, but with large discrepancies. The present work applies the new versions of the ASTEC and MELCOR codes to carry out a new code-to-code comparison against the ThAI Iod-12 experimental data, focusing on the influence of the heat exchanges with the outer environment, which seems to be one of the most challenging issues to cope with.
Minozzi, Silvia; Armaroli, Paola; Espina, Carolina; Villain, Patricia; Wiseman, Martin; Schüz, Joachim; Segnan, Nereo
2015-12-01
The European Code Against Cancer is a set of recommendations to give advice on cancer prevention. Its 4th edition is an update of the 3rd edition, from 2003. Working Groups of independent experts from different fields of cancer prevention were appointed to review the recommendations, supported by a Literature Group to provide scientific and technical support in the assessment of the scientific evidence, through systematic reviews of the literature. Common procedures were developed to guide the experts in identifying, retrieving, assessing, interpreting and summarizing the scientific evidence in order to revise the recommendations. The Code strictly followed the concept of providing advice to European Union citizens based on the current best available science. The advice, if followed, would be expected to reduce cancer risk, referring both to avoiding or reducing exposure to carcinogenic agents or changing behaviour related to cancer risk and to participating in medical interventions able to avert specific cancers or their consequences. The information sources and procedures for the review of the scientific evidence are described here in detail. The 12 recommendations of the 4th edition of the European Code Against Cancer were ultimately approved by a Scientific Committee of leading European cancer and public health experts. Copyright © 2015 International Agency for Research on Cancer. Published by Elsevier Ltd. All rights reserved.
Kaltner, H; Gabius, H-J
2012-04-01
Lectin histochemistry has revealed cell-type-selective glycosylation. It is under dynamic and spatially controlled regulation. Since their chemical properties allow carbohydrates to reach unsurpassed structural diversity in oligomers, they are ideal for high density information coding. Consequently, the concept of the sugar code assigns a functional dimension to the glycans of cellular glycoconjugates. Indeed, multifarious cell processes depend on specific recognition of glycans by their receptors (lectins), which translate the sugar-encoded information into effects. Duplication of ancestral genes and the following divergence of sequences account for the evolutionary dynamics in lectin families. Differences in gene number can even appear among closely related species. The adhesion/growth-regulatory galectins are selected as an instructive example to trace the phylogenetic diversification in several animals, most of them popular models in developmental and tumor biology. Chicken galectins are identified as a low-level-complexity set, thus singled out for further detailed analysis. The various operative means for establishing protein diversity among the chicken galectins are delineated, and individual characteristics in expression profiles discerned. To apply this galectin-fingerprinting approach in histopathology has potential for refining differential diagnosis and for obtaining prognostic assessments. On the grounds of in vitro work with tumor cells a strategically orchestrated co-regulation of galectin expression with presentation of cognate glycans is detected. This coordination epitomizes the far-reaching physiological significance of sugar coding.
DNA methylation of miRNA coding sequences putatively associated with childhood obesity.
Mansego, M L; Garcia-Lacarte, M; Milagro, F I; Marti, A; Martinez, J A
2017-02-01
Epigenetic mechanisms may be involved in obesity onset and its consequences. The aim of the present study was to evaluate whether DNA methylation status in microRNA (miRNA) coding regions is associated with childhood obesity. DNA isolated from white blood cells of 24 children (identification sample: 12 obese and 12 non-obese) from the Grupo Navarro de Obesidad Infantil study was hybridized in a 450K methylation microarray. Several CpGs whose DNA methylation levels were statistically different between obese and non-obese children were validated by MassArray® in 95 children (validation sample) from the same study. Microarray analysis identified 16 differentially methylated CpGs between both groups (6 hypermethylated and 10 hypomethylated). DNA methylation levels in miR-1203, miR-412 and miR-216A coding regions significantly correlated with body mass index standard deviation score (BMI-SDS) and explained up to 40% of the variation of BMI-SDS. The network analysis identified 19 well-defined obesity-relevant biological pathways from the KEGG database. MassArray® validation identified three regions located in or near miR-1203, miR-412 and miR-216A coding regions differentially methylated between obese and non-obese children. The current work identified three CpG sites located in coding regions of three miRNAs (miR-1203, miR-412 and miR-216A) that were differentially methylated between obese and non-obese children, suggesting a role of miRNA epigenetic regulation in childhood obesity. © 2016 World Obesity Federation.
AKM in Open Source Communities
NASA Astrophysics Data System (ADS)
Stamelos, Ioannis; Kakarontzas, George
Previous chapters in this book have dealt with Architecture Knowledge Management in traditional Closed Source Software (CSS) projects. This chapter will attempt to examine the ways that knowledge is shared among participants in Free Libre Open Source Software (FLOSS) projects and how architectural knowledge is managed with respect to CSS. FLOSS projects are organized and developed in a fundamentally different way than CSS projects; they simply do not develop code as CSS projects do. As a consequence, their knowledge-management mechanisms are also based on different concepts and tools.
Performance characteristics of the Cooper PC-9 centrifugal compressor
DOE Office of Scientific and Technical Information (OSTI.GOV)
Foster, R.E.; Neely, R.F.
1988-06-30
Mathematical performance modeling of the PC-9 centrifugal compressor has been completed. Performance characteristic curves have never been obtained for this compressor in test loops with the same degree of accuracy as for the uprated axial compressors; consequently, computer modeling of the top cascade and purge cascades has been very difficult and of limited value. This compressor modeling work has been carried out in an attempt to generate data which would more accurately define the compressor's performance and would permit more accurate cascade modeling. A computer code, COMPAL, was used to mathematically model the PC-9 performance with variations in gas composition, flow ratios, pressure ratios, speed and temperature. The results of this effort, in the form of graphs, with information about the compressor and the code, are the subject of this report. Compressor characteristic curves are featured. 13 figs.
Development of Northeast Asia Nuclear Power Plant Accident Simulator.
Kim, Juyub; Kim, Juyoul; Po, Li-Chi Cliff
2017-06-15
A conclusion from the lessons learned after the March 2011 Fukushima Daiichi accident was that Korea needs a tool to estimate the consequences of a major accident occurring at a nuclear power plant located in a neighboring country. This paper describes a suite of computer-based codes to be used by Korea's nuclear emergency response staff for training, and potentially for operational support, in Korea's national emergency preparedness and response program. The system of codes, the Northeast Asia Nuclear Accident Simulator (NANAS), consists of three modules: source-term estimation, atmospheric dispersion prediction and dose assessment. To quickly assess potential doses to the public in Korea, NANAS includes specific reactor data from the nuclear power plants in China, Japan and Taiwan. The completed simulator is demonstrated using data for a hypothetical release. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
A Neural Code That Is Isometric to Vocal Output and Correlates with Its Sensory Consequences
Vyssotski, Alexei L.; Stepien, Anna E.; Keller, Georg B.; Hahnloser, Richard H. R.
2016-01-01
What cortical inputs are provided to motor control areas while they drive complex learned behaviors? We study this question in the nucleus interface of the nidopallium (NIf), which is required for normal birdsong production and provides the main source of auditory input to HVC, the driver of adult song. In juvenile and adult zebra finches, we find that spikes in NIf projection neurons precede vocalizations by several tens of milliseconds and are insensitive to distortions of auditory feedback. We identify a local isometry between NIf output and vocalizations: quasi-identical notes produced in different syllables are preceded by highly similar NIf spike patterns. NIf multiunit firing during song precedes responses in auditory cortical neurons by about 50 ms, revealing delayed congruence between NIf spiking and a neural representation of auditory feedback. Our findings suggest that NIf codes for imminent acoustic events within vocal performance. PMID:27723764
Microprocessor mediates transcriptional termination in long noncoding microRNA genes
Dhir, Ashish; Dhir, Somdutta; Proudfoot, Nick J.; Jopling, Catherine L.
2015-01-01
MicroRNA (miRNA) play a major role in the post-transcriptional regulation of gene expression. Mammalian miRNA biogenesis begins with co-transcriptional cleavage of RNA polymerase II (Pol II) transcripts by the Microprocessor complex. While most miRNA are located within introns of protein-coding genes, a substantial minority of miRNA originate from long non-coding (lnc) RNA where transcript processing is largely uncharacterized. We show, by detailed characterization of liver-specific lnc-pri-miR-122 and genome-wide analysis in human cell lines, that most lnc-pri-miRNA do not use the canonical cleavage and polyadenylation (CPA) pathway, but instead use Microprocessor cleavage to terminate transcription. Microprocessor inactivation leads to extensive transcriptional readthrough of lnc-pri-miRNA and transcriptional interference with downstream genes. Consequently, we define a novel RNase III-mediated, polyadenylation-independent mechanism of Pol II transcription termination in mammalian cells. PMID:25730776
Clinical potential of oligonucleotide-based therapeutics in the respiratory system.
Moschos, Sterghios A; Usher, Louise; Lindsay, Mark A
2017-01-01
The discovery of an ever-expanding plethora of coding and non-coding RNAs with nodal and causal roles in the regulation of lung physiology and disease is reinvigorating interest in the clinical utility of the oligonucleotide therapeutic class. This is strongly supported through recent advances in nucleic acids chemistry, synthetic oligonucleotide delivery and viral gene therapy that have succeeded in bringing to market at least three nucleic acid-based drugs. As a consequence, multiple new candidates such as RNA interference modulators, antisense, and splice switching compounds are now progressing through clinical evaluation. Here, manipulation of RNA for the treatment of lung disease is explored, with emphasis on robust pharmacological evidence aligned to the five pillars of drug development: exposure to the appropriate tissue, binding to the desired molecular target, evidence of the expected mode of action, activity in the relevant patient population and commercially viable value proposition. Copyright © 2016 Elsevier Inc. All rights reserved.
High-Frequency Network Oscillations in Cerebellar Cortex
Middleton, Steven J.; Racca, Claudia; Cunningham, Mark O.; Traub, Roger D.; Monyer, Hannah; Knöpfel, Thomas; Schofield, Ian S.; Jenkins, Alistair; Whittington, Miles A.
2016-01-01
Both cerebellum and neocortex receive input from the somatosensory system. Interaction between these regions has been proposed to underpin the correct selection and execution of motor commands, but it is not clear how such interactions occur. In neocortex, inputs give rise to population rhythms, providing a spatiotemporal coding strategy for inputs and consequent outputs. Here, we show that similar patterns of rhythm generation occur in cerebellum during nicotinic receptor subtype activation. Both gamma oscillations (30–80 Hz) and very fast oscillations (VFOs, 80–160 Hz) were generated by intrinsic cerebellar cortical circuitry in the absence of functional glutamatergic connections. As in neocortex, gamma rhythms were dependent on GABAA receptor-mediated inhibition, whereas VFOs required only nonsynaptically connected intercellular networks. The ability of cerebellar cortex to generate population rhythms within the same frequency bands as neocortex suggests that they act as a common spatiotemporal code within which corticocerebellar dialog may occur. PMID:18549787
RNA editing in Drosophila melanogaster: new targets and functionalconsequences
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stapleton, Mark; Carlson, Joseph W.; Celniker, Susan E.
2006-09-05
Adenosine deaminases that act on RNA (ADARs) catalyze the site-specific conversion of adenosine to inosine in primary mRNA transcripts. These re-coding events affect coding potential, splice-sites, and stability of mature mRNAs. ADAR is an essential gene and studies in mouse, C. elegans, and Drosophila suggest its primary function is to modify adult behavior by altering signaling components in the nervous system. By comparing the sequence of isogenic cDNAs to genomic DNA, we have identified and experimentally verified 27 new targets of Drosophila ADAR. Our analyses lead us to identify new classes of genes whose transcripts are targets of ADAR, including components of the actin cytoskeleton, and genes involved in ion homeostasis and signal transduction. Our results indicate that editing in Drosophila increases the diversity of the proteome, and does so in a manner that has direct functional consequences on protein function.
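The cDNA-versus-genomic comparison described above can be sketched as follows: inosine is read as guanosine during reverse transcription and sequencing, so candidate A-to-I editing sites appear as A in genomic DNA but G in the matched cDNA. The sequences below are toy data, and a gap-free alignment is assumed.

```python
# Find candidate A-to-I editing sites by comparing aligned genomic DNA
# and cDNA sequences: genomic A read as cDNA G marks a candidate site.

def candidate_editing_sites(genomic: str, cdna: str) -> list[int]:
    """Return 0-based positions where genomic A reads as cDNA G."""
    assert len(genomic) == len(cdna), "sequences must be aligned, gap-free"
    return [i for i, (g, c) in enumerate(zip(genomic.upper(), cdna.upper()))
            if g == "A" and c == "G"]

genomic = "TTACAGGATA"
cdna    = "TTGCAGGGTA"
print(candidate_editing_sites(genomic, cdna))  # → [2, 7]
```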
The Karen Quinlan case: problems and proposals.
Kennedy, I. M.
1976-01-01
Karen Quinlan, a young American girl, has lain in hospital since 15 April 1975 without any prospect of recovering consciousness. Her breathing is assisted by means of a respirator and she is fed through a tube inserted in her stomach. Her adoptive parents applied to the courts for permission for the respirator to be switched off. The judge refused permission. Using the Quinlan case as an exemplar, Mr. Kennedy analyses the medical points one by one against the legal background. He would like to see established a code of practice to assist doctors in such cases who at present have no legal guidance. A set of rules arising as a consequence of a series of court decisions would be undesirable; rather a code should be drawn up as the result of discussion between the many people concerned and the consensus so arrived at. PMID:957371
An active inference theory of allostasis and interoception in depression
Quigley, Karen S.; Hamilton, Paul
2016-01-01
In this paper, we integrate recent theoretical and empirical developments in predictive coding and active inference accounts of interoception (including the Embodied Predictive Interoception Coding model) with working hypotheses from the theory of constructed emotion to propose a biologically plausible unified theory of the mind that places metabolism and energy regulation (i.e. allostasis), as well as the sensory consequences of that regulation (i.e. interoception), at its core. We then consider the implications of this approach for understanding depression. We speculate that depression is a disorder of allostasis, whose myriad symptoms result from a ‘locked in’ brain that is relatively insensitive to its sensory context. We conclude with a brief discussion of the ways our approach might reveal new insights for the treatment of depression. This article is part of the themed issue ‘Interoception beyond homeostasis: affect, cognition and mental health’. PMID:28080969
Gene and translation initiation site prediction in metagenomic sequences
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hyatt, Philip Douglas; LoCascio, Philip F; Hauser, Loren John
2012-01-01
Gene prediction in metagenomic sequences remains a difficult problem. Current sequencing technologies do not achieve sufficient coverage to assemble the individual genomes in a typical sample; consequently, sequencing runs produce a large number of short sequences whose exact origin is unknown. Since these sequences are usually smaller than the average length of a gene, algorithms must make predictions based on very little data. We present MetaProdigal, a metagenomic version of the gene prediction program Prodigal, which can identify genes in short, anonymous coding sequences with a high degree of accuracy. The novel value of the method consists of enhanced translation initiation site identification, the ability to identify sequences that use alternate genetic codes, and confidence values for each gene call. We compare the results of MetaProdigal with other methods and conclude with a discussion of future improvements.
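The abstract's point about alternate genetic codes can be made concrete with a small sketch (illustrative Python, not MetaProdigal's actual implementation; the sequence and the mini codon table are hypothetical): under NCBI translation table 4, TGA encodes tryptophan rather than a stop, so a gene caller assuming the standard code would truncate such a gene at the first TGA.

```python
# Illustrative sketch: why the choice of genetic code matters for gene
# calling. Under NCBI translation table 4 (e.g., Mycoplasma), TGA encodes
# tryptophan instead of acting as a stop codon.

STANDARD = {"TGA": "*"}   # standard code: TGA is a stop
TABLE_4 = {"TGA": "W"}    # table 4: TGA is tryptophan

# Tiny hypothetical codon table, just enough for the example sequence.
BASE_CODE = {"ATG": "M", "AAA": "K", "GGC": "G", "TGG": "W", "TAA": "*"}

def translate(seq, override):
    """Translate codons left to right, stopping at the first stop codon."""
    code = {**BASE_CODE, **override}
    protein = []
    for i in range(0, len(seq) - 2, 3):
        aa = code.get(seq[i:i + 3], "X")
        if aa == "*":
            break
        protein.append(aa)
    return "".join(protein)

orf = "ATGAAATGAGGCTAA"  # hypothetical fragment: ATG AAA TGA GGC TAA

print(translate(orf, STANDARD))  # 'MK'   -- truncated at the internal TGA
print(translate(orf, TABLE_4))   # 'MKWG' -- TGA read through as Trp
```

A caller that scores both readings and reports confidence values, as the abstract describes, would flag the longer table-4 product as evidence of an alternate code.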
High-beta extended MHD simulations of stellarators
NASA Astrophysics Data System (ADS)
Bechtel, T. A.; Hegna, C. C.; Sovinec, C. R.; Roberds, N. A.
2016-10-01
The high-beta properties of stellarator plasmas are studied using the nonlinear, extended MHD code NIMROD. In this work, we describe recent developments to the semi-implicit operator which allow the code to model 3D plasma evolution with better accuracy and efficiency. The configurations under investigation are an l=2, M=5 torsatron with geometry modeled after the Compact Toroidal Hybrid (CTH) experiment and an l=2, M=10 torsatron capable of having vacuum rotational transform profiles near unity. High-beta plasmas are created using a volumetric heating source and temperature-dependent anisotropic thermal conduction and resistivity. To reduce computational expense, simulations are initialized from stellarator-symmetric pseudo-equilibria by turning on symmetry-breaking modes at finite beta. The onset of MHD instabilities and their nonlinear consequences, as well as the fragility of the magnetic surfaces, are monitored as a function of beta. Research supported by US DOE under Grant No. DE-FG02-99ER54546.
Miller, Andrew D
2015-02-01
A sense peptide can be defined as a peptide whose sequence is coded by the nucleotide sequence (read 5' → 3') of the sense (positive) strand of DNA. Conversely, an antisense (complementary) peptide is coded by the corresponding nucleotide sequence (read 5' → 3') of the antisense (negative) strand of DNA. Research has been accumulating steadily to suggest that sense peptides are capable of specific interactions with their corresponding antisense peptides. Unfortunately, although more and more examples of specific sense-antisense peptide interactions are emerging, the very idea of such interactions does not conform to standard biological dogma, and so there remains a sizeable challenge to lift this concept from being perceived as a peripheral phenomenon, if not worse, into part of the scientific mainstream. Specific interactions have now been exploited for the inhibition of a number of widely different protein-protein and protein-receptor interactions in vitro and in vivo. Further, antisense peptides have also been used to induce the production of antibodies targeted to specific receptors, or else the production of anti-idiotypic antibodies targeted against auto-antibodies. Such illustrations of utility suggest that observed sense-antisense peptide interactions are not just the consequence of a sequence of coincidental 'lucky hits'. Indeed, at the very least, one might conclude that sense-antisense peptide interactions represent a potentially new and different source of leads for drug discovery. But could there be more to come from studies in this area? Studies on the potential mechanism of sense-antisense peptide interactions suggest that they may be driven by amino acid residue interactions specified by the genetic code.
If so, such specified amino acid residue interactions could form the basis for an even wider amino acid residue interaction code (proteomic code) that links gene sequences to actual protein structure and function, and even entire genomes to entire proteomes. The possibility that such a proteomic code exists is discussed, as are its potential implications for biology and pharmaceutical science.
2012-01-01
Background: No validated model exists to explain the learning effects of assessment, a problem when designing and researching assessment for learning. We recently developed a model explaining the pre-assessment learning effects of summative assessment in a theory teaching context. The challenge now is to validate this model. The purpose of this study was to explore whether the model was operational in a clinical context as a first step in this process. Methods: Given the complexity of the model, we adopted a qualitative approach. Data from in-depth interviews with eighteen medical students were subjected to content analysis. We utilised a code book developed previously using grounded theory. During analysis, we remained alert to data that might not conform to the coding framework and open to the possibility of deploying inductive coding. Ethical clearance and informed consent were obtained. Results: The three components of the model, i.e., assessment factors, mechanism factors and learning effects, were all evident in the clinical context. Associations between these components could all be explained by the model. Interaction with preceptors was identified as a new subcomponent of assessment factors. The model could explain the interrelationships of the three facets of this subcomponent, i.e., regular accountability, personal consequences and emotional valence of the learning environment, with previously described components of the model. Conclusions: The model could be utilized to analyse and explain observations in an assessment context different from that from which it was derived. In the clinical setting, the (negative) influence of preceptors on student learning was particularly prominent. In this setting, learning effects resulted not only from the high-stakes nature of summative assessment but also from personal stakes, e.g. for esteem and agency. The results suggest that to influence student learning, assessment should have consequences that are immediate, concrete and substantial. The model could have utility as a planning or diagnostic tool in practice and research settings. PMID:22420839
Does location uncertainty in letter position coding emerge because of literacy training?
Perea, Manuel; Jiménez, María; Gomez, Pablo
2016-06-01
In the quest to unveil the nature of the orthographic code, a useful strategy is to examine the transposed-letter effect (e.g., JUGDE is more confusable with its base word, JUDGE, than the replacement-letter nonword JUPTE is). A leading explanation of this phenomenon, which is in line with models of visual attention, is that there is perceptual uncertainty in assigning letters ("objects") to positions. This mechanism would be at work not only in skilled readers but also in preliterate children. An alternative explanation is that the transposed-letter effect emerges at an orthographic level of processing as a direct consequence of literacy training. To test these accounts, we conducted a same-different matching experiment with preliterate 4-year-old children using same versus different trials (created by letter transposition or replacement). Results showed a significantly larger number of false positives (i.e., "same" responses) to transposed-letter strings than to replacement-letter strings. Therefore, the present data favor the view that the visual processing of location information is inherently noisy and rule out an interpretation of confusability in letter position coding as emerging from literacy training. (PsycINFO Database Record (c) 2016 APA, all rights reserved).
Computations of Axisymmetric Flows in Hypersonic Shock Tubes
NASA Technical Reports Server (NTRS)
Sharma, Surendra P.; Wilson, Gregory J.
1995-01-01
A time-accurate two-dimensional fluid code is used to compute test times in shock tubes operated at supersonic speeds. Unlike previous studies, this investigation resolves the finer temporal details of the shock-tube flow by making use of modern supercomputers and state-of-the-art computational fluid dynamic solution techniques. The code, besides solving the time-dependent fluid equations, also accounts for the finite-rate chemistry in the hypersonic environment. The flowfield solutions are used to estimate relevant shock-tube parameters for laminar flow, such as test times, and to predict density and velocity profiles. Boundary-layer parameters such as δ̄_u, δ̄*, and τ̄_w, and test-time parameters such as τ̄ and the particle time of flight t_f, are computed and compared with those evaluated using Mirels' correlations. This article then discusses in detail the effects of flow nonuniformities on the particle time of flight behind the normal shock and, consequently, on the interpretation of shock-tube data. It concludes that for accurate interpretation of shock-tube data, a detailed analysis of flowfield parameters, using a computer code such as the one used in this study, must be performed.
The home care teaching and learning process in undergraduate health care degree courses.
Hermann, Ana Paula; Lacerda, Maria Ribeiro; Maftum, Mariluci Alves; Bernardino, Elizabeth; Mello, Ana Lúcia Schaefer Ferreira de
2017-07-01
Home care, one of the services provided by the health system, requires health practitioners who are capable of understanding its specificities. This study aimed to build a substantive theory that describes experiences of home care teaching and learning during undergraduate degree courses in nursing, pharmacy, medicine, nutrition, dentistry and occupational therapy. A qualitative analysis was performed using the grounded theory approach, based on the results of 63 semistructured interviews conducted with final-year students, professors who taught subjects related to home care, and recent graduates working with home care, all participants in the above courses. The data were analyzed in three stages - open coding, axial coding and selective coding - resulting in the phenomenon Experiences of home care teaching and learning during the undergraduate health care degree courses. Its causes were described in the category Articulating knowledge of home care, strategies in the category Experiencing the unique nature of home care, intervening conditions in the category Understanding the multidimensional characteristics of home care, consequences in the category Changing thinking about home care training, and context in the category Understanding home care in the health system. Home care contributes towards the decentralization of hospital care.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wampler, William R.; Brooks, J. N.; Elder, J. D.
2015-03-29
We analyze a DIII-D tokamak experiment in which two tungsten spots on the removable DiMES divertor probe were exposed to 12 s of attached plasma conditions, with moderate strike-point temperature and density (~20 eV, ~4.5 × 10^19 m^-3) and 3% carbon impurity content. Both very small (1 mm diameter) and small (1 cm diameter) deposited samples were used to assess gross and net tungsten sputtering erosion. The analysis uses a 3-D erosion/redeposition code package (REDEP/WBC), with input from a diagnostic-calibrated near-surface plasma code (OEDGE), and focuses on the charge-state-resolved impinging carbon ion flux and energy. The tungsten surfaces are primarily sputtered by the carbon, in charge states +1 to +4. We predict high redeposition (~75%) of sputtered tungsten on the 1 cm spot, with consequently reduced net erosion, and this agrees well with post-exposure DiMES probe RBS analysis data. This study and recent related work are thus encouraging for the erosion lifetime and non-contamination performance of tokamak reactor high-Z plasma-facing components.
[Learning and Repetitive Reproduction of Memorized Sequences by the Right and the Left Hand].
Bobrova, E V; Lyakhovetskii, V A; Bogacheva, I N
2015-01-01
An important stage of learning a new skill is the repetitive reproduction of one and the same sequence of movements, which plays a significant role in forming movement stereotypes. Two groups of right-handers repeatedly memorized (6-10 repetitions) sequences in which the experimenter moved their hand through 6 positions, first with the right hand (RH) and then with the left hand (LH), or vice versa. Random sequences previously unknown to the volunteers were reproduced in the 1st series. Modified sequences were tested in the 2nd and 3rd series, in which the same element positions were presented in a different order. The processes of repetitive sequence reproduction were similar for the RH and LH. However, the learning of the modified sequences differed: information about element positions, disregarding the reproduction order, was used only when the LH initiated task performance. This information was not used when the LH followed the RH, nor when the RH performed the task. Consequently, the type of information coding activated by the LH facilitated learning of the positions of sequence elements, whereas the type activated by the RH hindered it. This is presumably connected with the predominant role of the right hemisphere in positional coding and motor learning.
Optimizing legacy molecular dynamics software with directive-based offload
Michael Brown, W.; Carrillo, Jan-Michael Y.; Gavhane, Nitin; ...
2015-05-14
Directive-based programming models are one solution for exploiting many-core coprocessors to increase simulation rates in molecular dynamics. They offer the potential to reduce code complexity with offload models that can selectively target computations to run on the CPU, the coprocessor, or both. In this paper, we describe modifications to the LAMMPS molecular dynamics code to enable concurrent calculations on a CPU and coprocessor. We also demonstrate that standard molecular dynamics algorithms can run efficiently on both the CPU and an x86-based coprocessor using the same subroutines. As a consequence, we demonstrate that code optimizations for the coprocessor also result in speedups on the CPU, in extreme cases up to 4.7X. We provide results for LAMMPS benchmarks and for production molecular dynamics simulations using the Stampede hybrid supercomputer with both Intel(R) Xeon Phi(TM) coprocessors and NVIDIA GPUs. The optimizations presented have increased simulation rates by over 2X for organic molecules and over 7X for liquid crystals on Stampede. The optimizations are available as part of the "Intel package" supplied with LAMMPS. (C) 2015 Elsevier B.V. All rights reserved.
Implementation of Online Veterinary Hospital on Cloud Platform.
Chen, Tzer-Shyong; Chen, Tzer-Long; Chung, Yu-Fang; Huang, Yao-Min; Chen, Tao-Chieh; Wang, Huihui; Wei, Wei
2016-06-01
Pet markets hold great commercial possibilities, which boost the thriving development of veterinary hospital businesses. The sector faces intensive competition and a diversified channel environment. Information technology is integrated to develop the veterinary hospital cloud service platform. The platform contains not only pet medical services but also veterinary hospital management and services. In this study, QR Code and cloud technology are applied to establish the veterinary hospital cloud service platform for pet search, by labeling a pet's identification with a QR Code. This technology can break the restriction on veterinary hospital inspection in different areas and allows veterinary hospitals to receive medical records and information through the exclusive QR Code for more effective inspection. As an interactive platform, the veterinary hospital cloud service platform allows pet owners to gain knowledge of pet diseases and healthcare. Moreover, pet owners can enquire of and communicate with veterinarians through the platform. Veterinary hospitals can also periodically send reminders of relevant points and introduce exclusive marketing information through the platform, promoting service items and establishing individualized marketing. Consequently, veterinary hospitals can increase profits through information sharing and create the best solution in such a competitive veterinary market with industry alliances.
Drouin, Simon; Caron, Maxime; St-Onge, Pascal; Gioia, Romain; Richer, Chantal; Oualkacha, Karim; Droit, Arnaud; Sinnett, Daniel
2017-01-01
Pre-B cell childhood acute lymphoblastic leukemia (pre-B cALL) is a heterogeneous disease involving many subtypes typically stratified using a combination of cytogenetic and molecular-based assays. These methods, although widely used, rely on the presence of known chromosomal translocations, which is a limiting factor. There is therefore a need for robust, sensitive, and specific molecular biomarkers unaffected by such limitations that would allow better risk stratification and consequently better clinical outcomes. In this study we performed a transcriptome analysis of 56 pre-B cALL patients to identify expression signatures in different subtypes. In both protein-coding and long non-coding RNAs (lncRNAs), we identified subtype-specific gene signatures distinguishing pre-B cALL subtypes, particularly in t(12;21) and hyperdiploid cases. The genes up-regulated in pre-B cALL subtypes were enriched in bivalent chromatin marks in their promoters. LncRNAs are a new and under-studied class of transcripts. The subtype-specific nature of lncRNAs suggests they may be suitable clinical biomarkers to guide risk stratification and targeted therapies in pre-B cALL patients. PMID:28346506
Villanueva, Pía; Nudel, Ron; Hoischen, Alexander; Fernández, María Angélica; Simpson, Nuala H; Gilissen, Christian; Reader, Rose H; Jara, Lillian; Echeverry, María Magdalena; Echeverry, Maria Magdalena; Francks, Clyde; Baird, Gillian; Conti-Ramsden, Gina; O'Hare, Anne; Bolton, Patrick F; Hennessy, Elizabeth R; Palomino, Hernán; Carvajal-Carmona, Luis; Veltman, Joris A; Cazier, Jean-Baptiste; De Barbieri, Zulema; Fisher, Simon E; Newbury, Dianne F
2015-03-01
Children affected by Specific Language Impairment (SLI) fail to acquire age-appropriate language skills despite adequate intelligence and opportunity. SLI is highly heritable, but the understanding of underlying genetic mechanisms has proved challenging. In this study, we use molecular genetic techniques to investigate an admixed isolated founder population from the Robinson Crusoe Island (Chile), which is affected by a high incidence of SLI, increasing the power to discover contributory genetic factors. We utilize exome sequencing in selected individuals from this population to identify eight coding variants of putative significance. We then apply association analyses across the wider population to highlight a single rare coding variant (rs144169475, minor allele frequency of 4.1% in admixed South American populations) in the NFXL1 gene that confers a nonsynonymous change (N150K) and is significantly associated with language impairment in the Robinson Crusoe population (p = 2.04 × 10^-4, 8 variants tested). Subsequent sequencing of NFXL1 in 117 UK SLI cases identified four individuals with heterozygous variants predicted to be of functional consequence. We conclude that coding variants within NFXL1 confer an increased risk of SLI within a complex genetic model.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bixler, Nathan E.; Osborn, Douglas M.; Sallaberry, Cedric Jean-Marie
2014-02-01
This paper describes the convergence of MELCOR Accident Consequence Code System, Version 2 (MACCS2) probabilistic results of offsite consequences for the uncertainty analysis of the State-of-the-Art Reactor Consequence Analyses (SOARCA) unmitigated long-term station blackout scenario at the Peach Bottom Atomic Power Station. The consequence metrics evaluated are individual latent-cancer fatality (LCF) risk and individual early fatality risk. Consequence results are presented as conditional risk (i.e., assuming the accident occurs, risk per event) to individuals of the public as a result of the accident. In order to verify convergence for this uncertainty analysis, as recommended by the Nuclear Regulatory Commission's Advisory Committee on Reactor Safeguards, a 'high' source term from the original population of Monte Carlo runs has been selected to be used for: (1) a study of the distribution of consequence results stemming solely from epistemic uncertainty in the MACCS2 parameters (i.e., separating the effect from the source term uncertainty), and (2) a comparison between Simple Random Sampling (SRS) and Latin Hypercube Sampling (LHS) in order to validate the original results obtained with LHS. Three replicates (each using a different random seed) of size 1,000 each using LHS and another set of three replicates of size 1,000 using SRS are analyzed. The results show that the LCF risk results are well converged with either LHS or SRS sampling. The early fatality risk results are less well converged at radial distances beyond 2 miles, which is expected due to the sparse data (predominance of "zero" results).
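The LHS-versus-SRS comparison above can be illustrated with a minimal one-dimensional sketch (toy integrand and parameters are assumptions, not the MACCS2 consequence model): LHS places exactly one sample in each of n equal-probability strata, which typically shrinks the replicate-to-replicate spread of a sample-mean estimator relative to SRS.

```python
import numpy as np

# Minimal sketch contrasting Simple Random Sampling (SRS) with
# one-dimensional Latin Hypercube Sampling (LHS) for estimating
# E[x^2] = 1/3 with x ~ Uniform(0, 1).

rng = np.random.default_rng(42)

def srs(n):
    # Plain i.i.d. uniform draws.
    return rng.uniform(0.0, 1.0, n)

def lhs(n):
    # One stratified draw per equal-probability stratum, in shuffled order.
    return (rng.permutation(n) + rng.uniform(0.0, 1.0, n)) / n

def estimator_sd(sampler, n=100, replicates=500):
    # Spread of the sample-mean estimator of E[x^2] across replicates.
    means = [np.mean(sampler(n) ** 2) for _ in range(replicates)]
    return float(np.std(means))

sd_srs = estimator_sd(srs)
sd_lhs = estimator_sd(lhs)
print(f"SRS estimator sd: {sd_srs:.5f}")
print(f"LHS estimator sd: {sd_lhs:.5f}")  # markedly smaller than SRS
```

The stratification is why LHS results are often well converged at sample sizes where SRS still shows noticeable replicate-to-replicate scatter, which is the point of the three-replicate validation described in the abstract.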
Practical Problems with Medication Use that Older People Experience: A Qualitative Study
Notenboom, Kim; Beers, Erna; van Riet-Nales, Diana A; Egberts, Toine C G; Leufkens, Hubert G M; Jansen, Paul A F; Bouvy, Marcel L
2014-01-01
Objectives: To identify the practical problems that older people experience with the daily use of their medicines and their management strategies to address these problems, and to determine the potential clinical relevance thereof. Design: Qualitative study with semistructured face-to-face interviews. Setting: A community pharmacy and a geriatric outpatient ward. Participants: Community-dwelling people aged 70 and older (N = 59). Measurements: Participants were interviewed at home. Two researchers coded the reported problems and management strategies independently according to a coding scheme. An expert panel classified the potential clinical relevance of every identified practical problem and associated management strategy using a 3-point scale. Results: Two hundred eleven practical problems and 184 management strategies were identified. Ninety-five percent of the participants experienced one or more practical problems with the use of their medicines: problems reading and understanding the instructions for use, handling the outer packaging, handling the immediate packaging, completing preparation before use, and taking the medicine. For 10 participants, at least one of their problems, in combination with the applied management strategy, had potential clinical consequences, and 11 cases (5% of the problems) had the potential to cause moderate or severe clinical deterioration. Conclusion: Older people experience a number of practical problems using their medicines, and their strategies to manage these problems are sometimes suboptimal. These problems can lead to incorrect medication use with clinically relevant consequences. The findings pose a challenge for healthcare professionals, drug developers, and regulators to diminish these problems. PMID:25516030
The difference of being human: Morality
Ayala, Francisco J.
2010-01-01
In The Descent of Man, and Selection in Relation to Sex, published in 1871, Charles Darwin wrote: “I fully … subscribe to the judgment of those writers who maintain that of all the differences between man and the lower animals the moral sense or conscience is by far the most important.” I raise the question of whether morality is biologically or culturally determined. The question of whether the moral sense is biologically determined may refer either to the capacity for ethics (i.e., the proclivity to judge human actions as either right or wrong), or to the moral norms accepted by human beings for guiding their actions. I propose that the capacity for ethics is a necessary attribute of human nature, whereas moral codes are products of cultural evolution. Humans have a moral sense because their biological makeup determines the presence of three necessary conditions for ethical behavior: (i) the ability to anticipate the consequences of one's own actions; (ii) the ability to make value judgments; and (iii) the ability to choose between alternative courses of action. Ethical behavior came about in evolution not because it is adaptive in itself but as a necessary consequence of man's eminent intellectual abilities, which are an attribute directly promoted by natural selection. That is, morality evolved as an exaptation, not as an adaptation. Moral codes, however, are outcomes of cultural evolution, which accounts for the diversity of cultural norms among populations and for their evolution through time. PMID:20445091
Westinghouse ICF power plant study
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sucov, E. W.
1980-10-01
In this study, two different electric power plants producing about 1000 MWe, one based on a CO2 laser driver and one on a heavy-ion driver, were developed and analyzed. The purposes of this study were: (1) to examine in a self-consistent way the technological and institutional problems that need to be confronted and solved in order to produce commercially competitive electricity in the 2020 time frame from an inertial fusion reactor, and (2) to compare, on a common basis, the consequences of using two different drivers to initiate the DT fuel pellet explosions. Analytic descriptions of size/performance/cost relationships for each of the subsystems comprising the power plant have been combined into an overall computer code which models the entire plant. This overall model has been used to conduct trade studies which examine the consequences of varying critical design values around the reference point.
Tethers as Debris: Hydrocode Simulation of Impacts of Polymer Tether Fragments on Aluminum Plates
NASA Technical Reports Server (NTRS)
Evans, Steven W.
2003-01-01
Tethers promise to find use in a variety of space applications. Despite being narrow objects, their great lengths give them large total areas. Consequently, tethers are very susceptible to being severed by orbital debris. Extensive work has been done on designing tethers that resist severs by small debris objects, in order to lengthen their working lives. It is from this perspective that most recent work has considered the tether-debris question. The potential of intact tethers, or severed tether fragments, to pose a significant collision risk to other spacecraft as debris has been less well studied. Understanding the consequences of such collisions is important in assessing the risks tethers pose to other spacecraft. This paper discusses the damage that polymer tethers may produce on aluminum plates, as revealed by hypervelocity impact simulations using the SPHC hydrodynamic code.
A tutorial on count regression and zero-altered count models for longitudinal substance use data
Atkins, David C.; Baldwin, Scott A.; Zheng, Cheng; Gallop, Robert J.; Neighbors, Clayton
2012-01-01
Critical research questions in the study of addictive behaviors concern how these behaviors change over time - either as the result of intervention or in naturalistic settings. The combination of count outcomes that are often strongly skewed with many zeroes (e.g., days using, number of total drinks, number of drinking consequences) with repeated assessments (e.g., longitudinal follow-up after intervention or daily diary data) present challenges for data analyses. The current article provides a tutorial on methods for analyzing longitudinal substance use data, focusing on Poisson, zero-inflated, and hurdle mixed models, which are types of hierarchical or multilevel models. Two example datasets are used throughout, focusing on drinking-related consequences following an intervention and daily drinking over the past 30 days, respectively. Both datasets as well as R, SAS, Mplus, Stata, and SPSS code showing how to fit the models are available on a supplemental website. PMID:22905895
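The excess-zero problem that motivates the zero-inflated models above can be shown with a short simulation (the rate and zero-inflation parameters are assumptions for illustration, not the tutorial's drinking datasets): a zero-inflated Poisson mixes structural zeros with Poisson counts, so the observed proportion of zeros exceeds what a plain Poisson with the same rate predicts.

```python
import numpy as np
from math import exp

# Simulate zero-inflated Poisson (ZIP) counts and compare the zero
# proportion against a plain Poisson with the same rate.

rng = np.random.default_rng(0)
n, lam, pi_zero = 100_000, 2.0, 0.4   # sample size, rate, structural-zero prob.

structural_zero = rng.uniform(size=n) < pi_zero
counts = np.where(structural_zero, 0, rng.poisson(lam, size=n))

observed_p0 = float(np.mean(counts == 0))
poisson_p0 = exp(-lam)                              # P(Y=0) under plain Poisson
expected_p0 = pi_zero + (1 - pi_zero) * poisson_p0  # P(Y=0) under the ZIP model

print(f"observed P(0):  {observed_p0:.3f}")
print(f"Poisson P(0):   {poisson_p0:.3f}")   # much smaller than observed
print(f"ZIP model P(0): {expected_p0:.3f}")  # matches the observed proportion
```

Fitting a plain Poisson to such data underestimates the zero probability and distorts the rest of the distribution, which is why the tutorial recommends zero-inflated or hurdle mixed models for outcomes like drinking consequences.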
Histone modification: cause or cog?
Henikoff, Steven; Shilatifard, Ali
2011-10-01
Histone modifications are key components of chromatin packaging but whether they constitute a 'code' has been contested. We believe that the central issue is causality: are histone modifications responsible for differences between chromatin states, or are differences in modifications mostly consequences of dynamic processes, such as transcription and nucleosome remodeling? We find that inferences of causality are often based on correlation and that patterns of some key histone modifications are more easily explained as consequences of nucleosome disruption in the presence of histone modifying enzymes. We suggest that the 35-year-old DNA accessibility paradigm provides a mechanistically sound basis for understanding the role of nucleosomes in gene regulation and epigenetic inheritance. Based on this view, histone modifications and variants contribute to diversification of a chromatin landscape shaped by dynamic processes that are driven primarily by transcription and nucleosome remodeling. Copyright © 2011 Elsevier Ltd. All rights reserved.
[Thrombosis and post-thrombotic syndrome as a consequence of an accident].
Wahl, U; Hirsch, T
2015-10-01
Phlebothromboses represent alarming complications in accident victims since they can cause fatal pulmonary embolisms. More than half of those affected also develop post-thrombotic syndrome in the course of the illness. In addition to making clinical assessments, the traumatologist should have fundamental knowledge of diagnostic methods and be familiar with interpreting internal findings. Colour-coded duplex sonography plays a central role in diagnosing thrombosis and in assessing functional limitations. Further information can be gathered from various phlebological procedures. The expert evaluation of the immediate as well as the long-term consequences of an accident frequently requires leg swelling to be classified. It is not uncommon for post-thrombotic syndrome to be diagnosed for the first time during this process. An additional vascular appraisal is often required. An appreciation of social-medical and insurance-related aspects means that a high degree of responsibility is placed on the expert.
Huang, Yi-Fei; Gulko, Brad; Siepel, Adam
2017-04-01
Many genetic variants that influence phenotypes of interest are located outside of protein-coding genes, yet existing methods for identifying such variants have poor predictive power. Here we introduce a new computational method, called LINSIGHT, that substantially improves the prediction of noncoding nucleotide sites at which mutations are likely to have deleterious fitness consequences, and which, therefore, are likely to be phenotypically important. LINSIGHT combines a generalized linear model for functional genomic data with a probabilistic model of molecular evolution. The method is fast and highly scalable, enabling it to exploit the 'big data' available in modern genomics. We show that LINSIGHT outperforms the best available methods in identifying human noncoding variants associated with inherited diseases. In addition, we apply LINSIGHT to an atlas of human enhancers and show that the fitness consequences at enhancers depend on cell type, tissue specificity, and constraints at associated promoters.
Non-coding landscapes of colorectal cancer
Ragusa, Marco; Barbagallo, Cristina; Statello, Luisa; Condorelli, Angelo Giuseppe; Battaglia, Rosalia; Tamburello, Lucia; Barbagallo, Davide; Di Pietro, Cinzia; Purrello, Michele
2015-01-01
For two decades Vogelstein’s model has been the paradigm for describing the sequence of molecular changes within protein-coding genes that would lead to overt colorectal cancer (CRC). This model is now too simplistic in the light of recent studies, which have shown that our genome is pervasively transcribed in RNAs other than mRNAs, denominated non-coding RNAs (ncRNAs). The discovery that mutations in genes encoding these RNAs [i.e., microRNAs (miRNAs), long non-coding RNAs, and circular RNAs] are causally involved in cancer phenotypes has profoundly modified our vision of tumour molecular genetics and pathobiology. By exploiting a wide range of different mechanisms, ncRNAs control fundamental cellular processes, such as proliferation, differentiation, migration, angiogenesis and apoptosis: these data have also confirmed their role as oncogenes or tumor suppressors in cancer development and progression. The existence of a sophisticated RNA-based regulatory system, which dictates the correct functioning of protein-coding networks, has relevant biological and biomedical consequences. Different miRNAs involved in neoplastic and degenerative diseases exhibit potential predictive and prognostic properties. Furthermore, the key roles of ncRNAs make them very attractive targets for innovative therapeutic approaches. Several recent reports have shown that ncRNAs can be secreted by cells into the extracellular environment (i.e., blood and other body fluids): this suggests the existence of extracellular signalling mechanisms, which may be exploited by cells in physiology and pathology. In this review, we will summarize the most relevant issues on the involvement of cellular and extracellular ncRNAs in disease. We will then specifically describe their involvement in CRC pathobiology and their translational applications to CRC diagnosis, prognosis and therapy. PMID:26556998
NASA Astrophysics Data System (ADS)
Holmes, Mary Anne; Marin-Spiotta, Erika; Schneider, Blair
2017-04-01
Harassment, sexual and otherwise, including bullying and discrimination, remains an ongoing problem in the science workforce. In response to monthly revelations of harassment in academic science in the U.S. in 2016, the American Geophysical Union (AGU) convened a workshop to discuss strategies for professional societies to address this pernicious practice. Participants included researchers on this topic and members from professional science societies, academia, and U.S. federal government agencies. We agreed on the following principles:
- Harassment, discrimination and bullying most often occur between a superior (e.g., an advisor, professor, supervisor) and a student or early career professional, representing a power difference that disadvantages the less-powerful scientist.
- Harassment drives excellent potential as well as current scientists from the field who would otherwise contribute to the advancement of science, engineering and technology.
- Harassment, therefore, represents a form of scientific misconduct, and should be treated as plagiarism, falsification, and other forms of scientific misconduct are treated, with meaningful consequences.
To address harassment and to change the culture of science, professional societies can and should: ensure that their Code of Ethics and/or Code of Conduct addresses harassment with clear definitions of what constitutes this behavior, including in academic, professional, conference and field settings; provide a clear and well-disseminated mechanism for reporting violations to the society; have a response person or team in the society that can assist those who feel affected by harassment; and provide a mechanism to revisit and update Codes on a regular basis. The Code should be disseminated widely to members and apply to all members and staff. A revised Code of Ethics is now being constructed by AGU, and will be ready for adoption in 2017. See http://harassment.agu.org/ for information updates.
Semantic and phonological coding in poor and normal readers.
Vellutino, F R; Scanlon, D M; Spearing, D
1995-02-01
Three studies were conducted evaluating semantic and phonological coding deficits as alternative explanations of reading disability. In the first study, poor and normal readers in second and sixth grade were compared on various tests evaluating semantic development as well as on tests evaluating rapid naming and pseudoword decoding as independent measures of phonological coding ability. In a second study, the same subjects were given verbal memory and visual-verbal learning tasks using high and low meaning words as verbal stimuli and Chinese ideographs as visual stimuli. On the semantic tasks, poor readers performed below the level of the normal readers only at the sixth grade level, but, on the rapid naming and pseudoword learning tasks, they performed below the normal readers at the second as well as at the sixth grade level. On both the verbal memory and visual-verbal learning tasks, performance in poor readers approximated that of normal readers when the word stimuli were high in meaning but not when they were low in meaning. These patterns were essentially replicated in a third study that used some of the same semantic and phonological measures used in the first experiment, and verbal memory and visual-verbal learning tasks that employed word lists and visual stimuli (novel alphabetic characters) that more closely approximated those used in learning to read. It was concluded that semantic coding deficits are an unlikely cause of reading difficulties in most poor readers at the beginning stages of reading skills acquisition, but accrue as a consequence of prolonged reading difficulties in older readers. It was also concluded that phonological coding deficits are a probable cause of reading difficulties in most poor readers.
Ciliates learn to diagnose and correct classical error syndromes in mating strategies
Clark, Kevin B.
2013-01-01
Preconjugal ciliates learn classical repetition error-correction codes to safeguard mating messages and replies from corruption by “rivals” and local ambient noise. Because individual cells behave as memory channels with Szilárd engine attributes, these coding schemes also might be used to limit, diagnose, and correct mating-signal errors due to noisy intracellular information processing. The present study, therefore, assessed whether heterotrich ciliates effect fault-tolerant signal planning and execution by modifying engine performance, and consequently entropy content of codes, during mock cell–cell communication. Socially meaningful serial vibrations emitted from an ambiguous artificial source initiated ciliate behavioral signaling performances known to advertise mating fitness with varying courtship strategies. Microbes, employing calcium-dependent Hebbian-like decision making, learned to diagnose then correct error syndromes by recursively matching Boltzmann entropies between signal planning and execution stages via “power” or “refrigeration” cycles. All eight serial contraction and reversal strategies incurred errors in entropy magnitude by the execution stage of processing. Absolute errors, however, subtended expected threshold values for single bit-flip errors in three-bit replies, indicating coding schemes protected information content throughout signal production. Ciliate preparedness for vibrations selectively and significantly affected the magnitude and valence of Szilárd engine performance during modal and non-modal strategy corrective cycles. But entropy fidelity for all replies mainly improved across learning trials as refinements in engine efficiency. Fidelity neared maximum levels for only modal signals coded in resilient three-bit repetition error-correction sequences. Together, these findings demonstrate microbes can elevate survival/reproductive success by learning to implement classical fault-tolerant information processing in social contexts. 
PMID:23966987
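The three-bit repetition error-correction code the abstract refers to is a classical scheme. A minimal sketch (not the authors' model of ciliate signaling) encodes each message bit in triplicate and decodes by majority vote, which corrects any single bit flip per triplet:

```python
def encode(bits):
    # Rate-1/3 repetition code: each message bit is sent three times.
    return [b for bit in bits for b in (bit, bit, bit)]

def decode(coded):
    # Majority vote over each triplet corrects any single bit flip.
    out = []
    for i in range(0, len(coded), 3):
        triplet = coded[i:i + 3]
        out.append(1 if sum(triplet) >= 2 else 0)
    return out

msg = [1, 0, 1]                # a three-bit reply, as in the study
sent = encode(msg)
corrupted = sent[:]
corrupted[4] ^= 1              # a single bit-flip error in one triplet
assert decode(corrupted) == msg
```

This is exactly the "single bit-flip error in three-bit replies" threshold mentioned in the abstract: one flip per triplet is always corrected, while two flips in the same triplet defeat the vote.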
A source-channel coding approach to digital image protection and self-recovery.
Sarreshtedari, Saeed; Akhaee, Mohammad Ali
2015-07-01
Watermarking algorithms have recently been widely applied in the field of image forensics. One such forensic application is the protection of images against tampering. For this purpose, we need to design a watermarking algorithm fulfilling two purposes in case of image tampering: 1) detecting the tampered area of the received image and 2) recovering the lost information in the tampered zones. State-of-the-art techniques accomplish these tasks using watermarks consisting of check bits and reference bits. Check bits are used for tampering detection, whereas reference bits carry information about the whole image. The problem of recovering the lost reference bits still stands. This paper aims to show that, with the tampering location known, image tampering can be modeled and dealt with as an erasure error. Therefore, an appropriate design of channel code can protect the reference bits against tampering. In the proposed method, the total watermark bit-budget is divided among three groups: 1) source encoder output bits; 2) channel code parity bits; and 3) check bits. In the watermark embedding phase, the original image is source coded and the output bit stream is protected using an appropriate channel encoder. For image recovery, erasure locations detected by the check bits help the channel erasure decoder to retrieve the original source-encoded image. Experimental results show that our proposed scheme significantly outperforms recent techniques in terms of image quality for both the watermarked and the recovered image. The watermarked image quality gain is achieved by spending less bit-budget on the watermark, while image recovery quality is considerably improved as a consequence of the consistent performance of the designed source and channel codes.
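The erasure-channel idea, that lost symbols can be rebuilt when their locations are known, can be illustrated with the simplest possible code: a single XOR parity symbol. The paper's actual scheme uses a proper channel code with many parity bits; this sketch only demonstrates the principle that known-location losses are far easier to repair than unknown-location errors:

```python
def add_parity(data):
    # Append one XOR parity byte. Any ONE erased byte can then be
    # rebuilt, provided its position is known -- the erasure model
    # that tampering-with-known-location reduces to.
    parity = 0
    for b in data:
        parity ^= b
    return data + [parity]

def recover(received, erased_index):
    # XOR of all surviving bytes reproduces the erased byte, because
    # x ^ x = 0 cancels every intact symbol out of the parity.
    value = 0
    for i, b in enumerate(received):
        if i != erased_index:
            value ^= b
    restored = received[:]
    restored[erased_index] = value
    return restored

block = add_parity([0x41, 0x42, 0x43])
damaged = block[:]
damaged[1] = 0                 # tampered byte; location known from check bits
assert recover(damaged, 1)[1] == 0x42
```

Real erasure codes (e.g., Reed-Solomon) generalize this to recovering many erased symbols from many parity symbols, which is why dedicating part of the watermark budget to parity bits pays off.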
Galián, José A; Rosato, Marcela; Rosselló, Josep A
2014-03-01
Multigene families have provided opportunities for evolutionary biologists to assess molecular evolution processes and phylogenetic reconstructions at deep and shallow systematic levels. However, the use of these markers is not free of technical and analytical challenges. Many evolutionary studies that used the nuclear 5S rDNA gene family rarely used contiguous 5S coding sequences due to the routine use of head-to-tail polymerase chain reaction primers that are anchored in the coding region. Moreover, the 5S coding sequences have been concatenated with independent, adjacent gene units in many studies, creating simulated chimeric genes as the raw data for evolutionary analysis. This practice is based on the tacitly assumed, but rarely tested, hypothesis that strict intra-locus concerted evolution processes are operating in 5S rDNA genes, without any empirical evidence as to whether it holds for the recovered data. The potential pitfalls of analysing the patterns of molecular evolution and reconstructing phylogenies based on these chimeric genes have not been assessed to date. Here, we compared the sequence integrity and phylogenetic behavior of entire versus concatenated 5S coding regions from a real data set obtained from closely related plant species (Medicago, Fabaceae). Our results suggest that within-array sequence homogenization is only partially operating in the 5S coding region, which is traditionally assumed to be highly conserved. Consequently, concatenating 5S genes increases haplotype diversity, generating novel chimeric genotypes that most likely do not exist within the genome. In addition, the patterns of gene evolution are distorted, leading to incorrect haplotype relationships in some evolutionary reconstructions.
Application of CFX-10 to the Investigation of RPV Coolant Mixing in VVER Reactors
DOE Office of Scientific and Technical Information (OSTI.GOV)
Moretti, Fabio; Melideo, Daniele; Terzuoli, Fulvio
2006-07-01
Coolant mixing phenomena occurring in the pressure vessel of a nuclear reactor constitute one of the main objectives of investigation by researchers concerned with nuclear reactor safety. For instance, mixing plays a relevant role in reactivity-induced accidents initiated by de-boration or boron dilution events, followed by transport of a de-borated slug into the vessel of a pressurized water reactor. Another example is temperature mixing, which may strongly affect the consequences of a pressurized thermal shock scenario. Predictive analysis of mixing phenomena is strongly improved by the availability of computational tools able to cope with the inherent three-dimensionality of such problems, like system codes with three-dimensional capabilities and Computational Fluid Dynamics (CFD) codes. The present paper deals with numerical analyses of coolant mixing in the reactor pressure vessel of a VVER-1000 reactor, performed with the ANSYS CFX-10 CFD code. In particular, the 'swirl' effect that has been observed to take place in the downcomer of this kind of reactor has been addressed, with the aim of assessing the capability of the code to predict that effect and of understanding the reasons for its occurrence. Results have been compared against experimental data from the V1000CT-2 Benchmark. Moreover, a boron mixing problem has been investigated, under the hypothesis that a de-borated slug, transported by natural circulation, enters the vessel. Sensitivity analyses have been conducted on some geometrical features, model parameters and boundary conditions. (authors)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dokhane, A.; Canepa, S.; Ferroukhi, H.
For stability analyses of the Swiss operating Boiling Water Reactors (BWRs), the methodology employed and validated so far at the Paul Scherrer Institute (PSI) was based on the RAMONA-3 code with a hybrid upstream static lattice/core analysis approach using CASMO-4 and PRESTO-2. More recently, steps were undertaken towards a new methodology based on the SIMULATE-3K (S3K) code for the dynamical analyses, combined with the CMSYS system relying on the CASMO/SIMULATE-3 suite of codes, which was established at PSI to serve as a framework for the development and validation of reference core models of all the Swiss reactors and operated cycles. This paper presents a first validation of the new methodology on the basis of a benchmark recently organised by a Swiss utility, including the participation of several international organisations with various codes/methods. In parallel, a transition from CASMO-4E (C4E) to CASMO-5M (C5M) as the basis for the CMSYS core models was also recently initiated at PSI. Consequently, it was considered adequate to address the impact of this transition both for the steady-state core analyses and for the stability calculations, and thereby to achieve an integral approach for the validation of the new S3K methodology. Therefore, a comparative assessment of C4E versus C5M is also presented in this paper, with particular emphasis on the void coefficients and their impact on the downstream stability analysis results. (authors)
Improving the safety of street-vended food.
Moy, G; Hazzard, A; Käferstein, F
1997-01-01
An integrated plan of action for improving street food involving health and other regulatory authorities, vendors and consumers should address not only food safety, but also environmental health management, including consideration of inadequate sanitation and waste management, possible environmental pollution, congestion and disturbances to traffic. However, WHO cautions that, in view of their importance in the diets of urban populations, particularly the socially disadvantaged, every effort should be made to preserve the benefits provided by varied, inexpensive and often nutritious street food. Therefore, authorities concerned with street food management must balance efforts aimed at reducing the negative aspects on the environment with the benefits of street food and its important role in the community. Health authorities charged with responsibility for food safety control should match risk management action to the level of assessed risk. The rigorous application of codes and enforcement of regulations more suited to larger and permanent food service establishments is unlikely to be justifiable. Such rigorous application of codes and regulations may result in disappearance of the trade with consequent aggravation of hunger and malnutrition. Moreover, most codes and regulations have not been based on any systematic identification and assessment of health hazards associated with different types of foods and operations as embodied in the HACCP approach which has been recognized by Codex as the most cost-effective means for promoting food safety. WHO encourages the development of regulations that empower vendors to take greater responsibility for the preparation of safe food, and of codes of practice based on the HACCP system.
The Use of a Pseudo Noise Code for DIAL Lidar
NASA Technical Reports Server (NTRS)
Burris, John F.
2010-01-01
Retrievals of CO2 profiles within the planetary boundary layer (PBL) are required to understand CO2 transport over regional scales and to validate future space-borne CO2 remote sensing instruments, such as the CO2 Laser Sounder for the ASCENDS mission. We report the use of a return-to-zero (RZ) pseudo noise (PN) code modulation technique for making range-resolved measurements of CO2 within the PBL using commercial, off-the-shelf components. Conventional range-resolved measurements require laser pulse widths that are shorter than the desired spatial resolution and pulse spacings such that returns from only a single pulse are observed by the receiver at one time (for the PBL, pulse separations must be greater than approximately 2000 m). This imposes a serious limitation when using available fiber lasers because of the resulting low duty cycle (less than 0.001) and consequent low average laser output power. RZ PN code modulation enables a fiber laser to operate at much higher duty cycles (approaching 0.1), thereby more effectively utilizing the amplifier's output. This results in an increase in received counts by approximately two orders of magnitude. The approach involves employing two back-to-back CW fiber amplifiers seeded at the appropriate on- and offline CO2 wavelengths (approximately 1572 nm) using distributed feedback diode lasers modulated by a PN code at rates significantly above 1 megahertz. An assessment of the technique, discussions of measurement precision and error sources, as well as preliminary data will be presented.
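The range-resolving property of a PN code comes from its sharp cyclic autocorrelation peak: the receiver correlates the return against the known code, and each lag of the peak corresponds to a range bin. A minimal sketch, assuming a generic 4-bit maximal-length LFSR rather than the instrument's actual code or its RZ modulation details:

```python
def lfsr_pn(taps, length, state=1):
    # Fibonacci LFSR emitting a +/-1 pseudo noise sequence.
    # taps=[3, 0] with a 4-bit register realizes x^4 + x + 1 and yields
    # a 15-chip maximal-length sequence; this is an illustrative choice,
    # not the lidar's actual code.
    seq = []
    for _ in range(length):
        bit = state & 1
        seq.append(1 if bit else -1)
        fb = 0
        for t in taps:
            fb ^= (state >> t) & 1
        state = (state >> 1) | (fb << max(taps))
    return seq

def cyclic_autocorr(seq, lag):
    # Correlating the code against a delayed copy of itself: the peak
    # at lag 0 is what localizes each return in range.
    n = len(seq)
    return sum(seq[i] * seq[(i + lag) % n] for i in range(n))

code = lfsr_pn([3, 0], 15)
# Peak of 15 at zero lag; a flat floor of -1 at every other lag, so
# returns from different ranges do not masquerade as one another.
peak = cyclic_autocorr(code, 0)
floor = cyclic_autocorr(code, 7)
```

Because every chip of the code carries signal, the laser can transmit nearly continuously (high duty cycle) while the correlation still separates ranges, which is the gain over the widely spaced short pulses described above.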
Unconventional Gas and Oil Drilling Is Associated with Increased Hospital Utilization Rates.
Jemielita, Thomas; Gerton, George L; Neidell, Matthew; Chillrud, Steven; Yan, Beizhan; Stute, Martin; Howarth, Marilyn; Saberi, Pouné; Fausti, Nicholas; Penning, Trevor M; Roy, Jason; Propert, Kathleen J; Panettieri, Reynold A
2015-01-01
Over the past ten years, unconventional gas and oil drilling (UGOD) has markedly expanded in the United States. Despite substantial increases in well drilling, the health consequences of UGOD toxicant exposure remain unclear. This study examines the association between wells and healthcare use by zip code from 2007 to 2011 in Pennsylvania. Inpatient discharge databases from the Pennsylvania Healthcare Cost Containment Council were correlated with active wells by zip code in three counties in Pennsylvania. For overall inpatient prevalence rates and 25 specific medical categories, the association of inpatient prevalence rates with the number of wells per zip code and, separately, with wells per km2 (separated into quantiles and defined as well density) was estimated using fixed-effects Poisson models. To account for multiple comparisons, a Bonferroni correction was applied, and associations with p<0.00096 were considered statistically significant. Cardiology inpatient prevalence rates were significantly associated with the number of wells per zip code (p<0.00096) and wells per km2 (p<0.00096), while neurology inpatient prevalence rates were significantly associated with wells per km2 (p<0.00096). Furthermore, the evidence also supported an association between well density and inpatient prevalence rates for the medical categories of dermatology, neurology, oncology, and urology. These data suggest that UGOD wells, which dramatically increased in number over the past decade, were associated with increased inpatient prevalence rates within specific medical categories in Pennsylvania. Further studies are necessary to address the healthcare costs of UGOD and to determine whether specific toxicants or combinations are associated with organ-specific responses.
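The Bonferroni step above is a one-line calculation: the family-wise significance threshold is the nominal alpha divided by the number of comparisons. The divisor of 52 below is our assumption, chosen because it reproduces the reported 0.00096 cutoff for alpha = 0.05; the abstract does not state the exact comparison count:

```python
def bonferroni_threshold(alpha, n_tests):
    # Each of n_tests comparisons must individually clear alpha/n_tests
    # for the family-wise error rate to stay at alpha.
    return alpha / n_tests

# Hypothetical comparison count of 52; 0.05 / 52 rounds to the paper's
# 0.00096 cutoff.
cutoff = round(bonferroni_threshold(0.05, 52), 5)
```

The trade-off is conservatism: with dozens of medical categories tested, only very small p-values survive, which is why only cardiology and neurology reached significance here.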
Numerical prediction of turbulent oscillating flow and associated heat transfer
NASA Technical Reports Server (NTRS)
Koehler, W. J.; Patankar, S. V.; Ibele, W. E.
1991-01-01
A crucial point for the further development of Stirling engines is the optimization of their heat exchangers, which operate under oscillatory flow conditions. It has been found that the most important thermodynamic uncertainties in Stirling engine designs for space power lie in the heat transfer between gas and metal in all engine components and in the pressure drop across the heat exchanger components. So far, performance codes cannot predict the power output of a Stirling engine accurately enough when applied to a wide variety of engines. Thus, there is a strong need for better performance codes. However, a performance code is not concerned with the details of the flow; this information must be provided externally. While analytical relationships exist for laminar oscillating flow, there has been hardly any information about transitional and turbulent oscillating flow that could be introduced into the performance codes. In 1986, a survey by Seume and Simon revealed that most Stirling engine heat exchangers operate in the transitional and turbulent regime. Consequently, research has since focused on the unresolved issue of transitional and turbulent oscillating flow and heat transfer. Since 1988, the University of Minnesota oscillating flow facility has obtained experimental data on transitional and turbulent oscillating flow. However, since experiments in this field are extremely difficult, lengthy, and expensive, it is advantageous to numerically simulate the flow and heat transfer accurately from first principles. Work done at the University of Minnesota on the development of such a numerical simulation is summarized.
Duellman, Tyler; Warren, Christopher; Yang, Jay
2014-01-01
Microribonucleic acids (miRNAs) work with exquisite specificity and are able to distinguish a target from a non-target based on a single nucleotide mismatch in the core nucleotide domain. We asked whether miRNA regulation of gene expression could occur in a single nucleotide polymorphism (SNP)-specific manner, manifesting as post-transcriptional control of the expression of genetic polymorphisms. In our recent study of the functional consequences of matrix metalloproteinase (MMP)-9 SNPs, we discovered that expression of a coding exon SNP in the pro-domain of the protein resulted in a profound decrease in the secreted protein. This missense SNP results in the N38S amino acid change and the loss of an N-glycosylation site. A systematic study demonstrated that the loss of secreted protein was due not to the loss of the N-glycosylation site, but rather to SNP-specific targeting by miR-671-3p and miR-657. Bioinformatics analysis identified 41 SNP-specific miRNAs targeting MMP-9 SNPs, mostly in the coding exons; an extension of the analysis to chromosome 20, where the MMP-9 gene is located, suggests that SNP-specific miRNAs targeting coding exons are prevalent. This selective post-transcriptional regulation of a target messenger RNA harboring genetic polymorphisms by miRNAs offers an SNP-dependent post-transcriptional regulatory mechanism, allowing for polymorphism-specific differential gene regulation. PMID:24627221
Coding and non-coding gene regulatory networks underlie the immune response in liver cirrhosis
Zhang, Xueming; Huang, Yongming; Yang, Zhengpeng; Zhang, Yuguo; Zhang, Weihui; Gao, Zu-hua; Xue, Dongbo
2017-01-01
Liver cirrhosis is recognized as being the consequence of immune-mediated hepatocyte damage and repair processes. However, the regulation of the immune responses underlying liver cirrhosis has not been elucidated. In this study, we used GEO datasets and bioinformatics methods to establish coding and non-coding gene regulatory networks, including transcription factor-/lncRNA-microRNA-mRNA and competing endogenous RNA interaction networks. Our results identified 2224 mRNAs, 70 lncRNAs and 46 microRNAs that were differentially expressed in liver cirrhosis. The transcription factor-/lncRNA-microRNA-mRNA network underlying immune-mediated liver cirrhosis comprises 5 core microRNAs (e.g., miR-203 and miR-219-5p), 3 transcription factors (i.e., FOXP3, ETS1 and FOS) and 7 lncRNAs (e.g., ENST00000671336 and ENST00000575137). The competing endogenous RNA interaction network we identified includes a complex immune response regulatory subnetwork that controls the entire liver cirrhosis network. Additionally, we found 10 overlapping GO terms shared by both liver cirrhosis and hepatocellular carcinoma, including "immune response". Interestingly, the genes differentially expressed in both liver cirrhosis and hepatocellular carcinoma were enriched in immune response-related functional terms. In summary, a complex gene regulatory network underlying immune response processes may play an important role in the development and progression of liver cirrhosis, and in its development into hepatocellular carcinoma. PMID:28355233
Parallel evolution of chordate cis-regulatory code for development.
Doglio, Laura; Goode, Debbie K; Pelleri, Maria C; Pauls, Stefan; Frabetti, Flavia; Shimeld, Sebastian M; Vavouri, Tanya; Elgar, Greg
2013-11-01
Urochordates are the closest relatives of vertebrates and at the larval stage, possess a characteristic bilateral chordate body plan. In vertebrates, the genes that orchestrate embryonic patterning are in part regulated by highly conserved non-coding elements (CNEs), yet these elements have not been identified in urochordate genomes. Consequently the evolution of the cis-regulatory code for urochordate development remains largely uncharacterised. Here, we use genome-wide comparisons between C. intestinalis and C. savignyi to identify putative urochordate cis-regulatory sequences. Ciona conserved non-coding elements (ciCNEs) are associated with largely the same key regulatory genes as vertebrate CNEs. Furthermore, some of the tested ciCNEs are able to activate reporter gene expression in both zebrafish and Ciona embryos, in a pattern that at least partially overlaps that of the gene they associate with, despite the absence of sequence identity. We also show that the ability of a ciCNE to up-regulate gene expression in vertebrate embryos can in some cases be localised to short sub-sequences, suggesting that functional cross-talk may be defined by small regions of ancestral regulatory logic, although functional sub-sequences may also be dispersed across the whole element. We conclude that the structure and organisation of cis-regulatory modules is very different between vertebrates and urochordates, reflecting their separate evolutionary histories. However, functional cross-talk still exists because the same repertoire of transcription factors has likely guided their parallel evolution, exploiting similar sets of binding sites but in different combinations.
Building code challenging the ethics behind adobe architecture in North Cyprus.
Hurol, Yonca; Yüceer, Hülya; Şahali, Öznem
2015-04-01
Adobe masonry is part of the vernacular architecture of Cyprus. Thus, it is possible to use this technology in a meaningful way on the island. On the other hand, although adobe architecture is more sustainable in comparison to other building technologies, the use of it is diminishing in North Cyprus. The application of Turkish building code in the north of the island has created complications in respect of the use of adobe masonry, because this building code demands that reinforced concrete vertical tie-beams are used together with adobe masonry. The use of reinforced concrete elements together with adobe masonry causes problems in relation to the climatic response of the building as well as causing other technical and aesthetic problems. This situation makes the design of adobe masonry complicated and various types of ethical problems also emerge. The objective of this article is to analyse the ethical problems which arise as a consequence of the restrictive character of the building code, by analysing two case studies and conducting an interview with an architect who was involved with the use of adobe masonry in North Cyprus. According to the results of this article there are ethical problems at various levels in the design of both case studies. These problems are connected to the responsibilities of architects in respect of the social benefit, material production, aesthetics and affordability of the architecture as well as presenting distrustful behaviour where the obligations of architects to their clients is concerned.
Abstract feature codes: The building blocks of the implicit learning system.
Eberhardt, Katharina; Esser, Sarah; Haider, Hilde
2017-07-01
According to the Theory of Event Coding (TEC; Hommel, Müsseler, Aschersleben, & Prinz, 2001), action and perception are represented in a shared format in the cognitive system by means of feature codes. In implicit sequence learning research, it is still common to make a conceptual difference between independent motor and perceptual sequences. This supposedly independent learning takes place in encapsulated modules (Keele, Ivry, Mayr, Hazeltine, & Heuer 2003) that process information along single dimensions. These dimensions have remained underspecified so far. It is especially not clear whether stimulus and response characteristics are processed in separate modules. Here, we suggest that feature dimensions as they are described in the TEC should be viewed as the basic content of modules of implicit learning. This means that the modules process all stimulus and response information related to certain feature dimensions of the perceptual environment. In 3 experiments, we investigated by means of a serial reaction time task the nature of the basic units of implicit learning. As a test case, we used stimulus location sequence learning. The results show that a stimulus location sequence and a response location sequence cannot be learned without interference (Experiment 2) unless one of the sequences can be coded via an alternative, nonspatial dimension (Experiment 3). These results support the notion that spatial location is one module of the implicit learning system and, consequently, that there are no separate processing units for stimulus versus response locations.
Schneider, Adam D.; Jamali, Mohsen; Carriot, Jerome; Chacron, Maurice J.
2015-01-01
Efficient processing of incoming sensory input is essential for an organism's survival. A growing body of evidence suggests that sensory systems have developed coding strategies that are constrained by the statistics of the natural environment. Consequently, it is necessary to first characterize neural responses to natural stimuli to uncover the coding strategies used by a given sensory system. Here we report for the first time the statistics of vestibular rotational and translational stimuli experienced by rhesus monkeys during natural (e.g., walking, grooming) behaviors. We find that these stimuli can reach intensities as high as 1500 deg/s and 8 G. Recordings from afferents during naturalistic rotational and linear motion further revealed strongly nonlinear responses in the form of rectification and saturation, which could not be accurately predicted by traditional linear models of vestibular processing. Accordingly, we used linear–nonlinear cascade models and found that these could accurately predict responses to naturalistic stimuli. Finally, we tested whether the statistics of natural vestibular signals constrain the neural coding strategies used by peripheral afferents. We found that both irregular otolith and semicircular canal afferents, because of their higher sensitivities, were more optimized for processing natural vestibular stimuli as compared with their regular counterparts. Our results therefore provide the first evidence supporting the hypothesis that the neural coding strategies used by the vestibular system are matched to the statistics of natural stimuli. PMID:25855169
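The linear-nonlinear cascade the authors fit can be sketched generically: a linear temporal filter of the stimulus followed by a static nonlinearity that rectifies and saturates, the two response features reported for naturalistic vestibular stimuli. The kernel, gain, threshold, and ceiling below are placeholder values, not fitted afferent parameters:

```python
import numpy as np

def ln_model(stimulus, kernel, gain=1.0, threshold=0.0, ceiling=200.0):
    # Linear stage: convolve the stimulus (e.g., head velocity) with the
    # neuron's temporal filter.
    drive = np.convolve(stimulus, kernel, mode="same")
    # Nonlinear stage: half-wave rectification (firing rates cannot go
    # below zero) and saturation at a maximum firing rate.
    rate = np.clip(gain * (drive - threshold), 0.0, ceiling)
    return rate

# Strong negative drive is rectified to zero; extreme positive drive
# saturates at the ceiling -- exactly the departures from linearity that
# a purely linear model cannot capture.
rates = ln_model(np.array([-5.0, 0.5, 300.0]), np.array([1.0]))
```

A purely linear model would predict negative and unbounded rates for intense natural stimuli (up to 1500 deg/s here), which is why the cascade form predicts these responses so much better.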
Non-coding RNA sequence variations in human chronic lymphocytic leukemia and colorectal cancer.
Wojcik, Sylwia E; Rossi, Simona; Shimizu, Masayoshi; Nicoloso, Milena S; Cimmino, Amelia; Alder, Hansjuerg; Herlea, Vlad; Rassenti, Laura Z; Rai, Kanti R; Kipps, Thomas J; Keating, Michael J; Croce, Carlo M; Calin, George A
2010-02-01
Cancer is a genetic disease in which the interplay between alterations in protein-coding genes and non-coding RNAs (ncRNAs) plays a fundamental role. In recent years, the full coding component of the human genome was sequenced in various cancers, whereas such attempts related to ncRNAs are still fragmentary. We screened genomic DNAs for sequence variations in 148 microRNAs (miRNAs) and ultraconserved regions (UCRs) loci in patients with chronic lymphocytic leukemia (CLL) or colorectal cancer (CRC) by Sanger technique and further tried to elucidate the functional consequences of some of these variations. We found sequence variations in miRNAs in both sporadic and familial CLL cases, mutations of UCRs in CLLs and CRCs and, in certain instances, detected functional effects of these variations. Furthermore, by integrating our data with previously published data on miRNA sequence variations, we have created a catalog of DNA sequence variations in miRNAs/ultraconserved genes in human cancers. These findings argue that ncRNAs are targeted by both germ line and somatic mutations as well as by single-nucleotide polymorphisms with functional significance for human tumorigenesis. Sequence variations in ncRNA loci are frequent and some have functional and biological significance. Such information can be exploited to further investigate on a genome-wide scale the frequency of genetic variations in ncRNAs and their functional meaning, as well as for the development of new diagnostic and prognostic markers for leukemias and carcinomas.
Jeong, Dahn; Presseau, Justin; ElChamaa, Rima; Naumann, Danielle N; Mascaro, Colin; Luconi, Francesca; Smith, Karen M; Kitto, Simon
2018-04-10
This scoping review explored the barriers and facilitators that influence engagement in and implementation of self-directed learning (SDL) in continuing professional development (CPD) for physicians in Canada. This review followed the six-stage scoping review framework of Arksey and O'Malley and of Daudt et al. In 2015, the authors searched eight online databases for English-language Canadian articles published January 2005-December 2015. To chart and analyze the data from the 17 included studies, they employed a two-step analysis process of conventional content analysis followed by directed coding guided by the Theoretical Domains Framework (TDF). Conventional content analysis generated five categories of barriers and facilitators: individual, program, technological, environmental, and workplace/organizational. Directed coding guided by the TDF allowed analysis of barriers and facilitators to behavior change according to two key groups: physicians engaging in SDL and SDL developers designing and implementing SDL programs. Of the 318 total barriers and facilitators coded, 290 (91.2%) were coded for physicians and 28 (8.8%) for SDL developers. The majority (209; 65.7%) were coded in four key TDF domains: environmental context and resources, social influences, beliefs about consequences, and behavioral regulation. This scoping review identified five categories of barriers and facilitators in the literature and four key TDF domains where most factors related to behavior change of physicians and SDL developers regarding SDL programs in CPD were coded. There was a significant gap in the literature about factors that may contribute to SDL developers' capacity to design and implement SDL programs in CPD.
Cyclic coding for Brillouin optical time-domain analyzers using probe dithering.
Iribas, Haritz; Loayssa, Alayn; Sauser, Florian; Llera, Miguel; Le Floch, Sébastien
2017-04-17
We study the performance limits of mono-color cyclic coding applied to Brillouin optical time-domain analysis (BOTDA) sensors that use probe wave dithering. BOTDA analyzers with dithering of the probe use a dual-probe-sideband setup in which an optical frequency modulation of the probe waves along the fiber is introduced. This avoids non-local effects while keeping the Brillouin threshold at its highest level, thus preventing the spontaneous Brillouin scattering from generating noise in the deployed sensing fiber. In these conditions, it is possible to introduce an unprecedentedly high probe power into the sensing fiber, which leads to an enhancement of the signal-to-noise ratio (SNR) and consequently to a performance improvement of the analyzer. The addition of cyclic coding in these setups can further increase the SNR and accordingly enhance the performance. However, the unprecedented probe power levels that can be employed result in the appearance of detrimental effects in the measurement that had not previously been observed in other BOTDA setups. In this work, we analyze the distortion in the decoding process and the measurement errors that this distortion causes, due to three factors: the power difference of the successive pulses of a code sequence, the appearance of first-order non-local effects, and the non-linear amplification of the probe wave that results when using mono-color cyclic coding of the pump pulses. We apply the results of this study to demonstrate the performance enhancement that can be achieved in a long-range dithered dual-probe BOTDA. A 164-km fiber loop is measured with 1-m spatial resolution, obtaining 3-MHz Brillouin frequency shift measurement precision at the worst-contrast location. To the best of our knowledge, this is the longest sensing distance achieved with a BOTDA sensor using mono-color cyclic coding.
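Mono-color cyclic coding of the kind referred to above is typically built from cyclic shifts of a simplex (m-sequence) code word, and decoding amounts to inverting the resulting circulant matrix. A minimal sketch, assuming ideal linear superposition of pulse responses and a 7-bit code (the trace values are invented for illustration, not measurement data):

```python
import numpy as np

# 7-bit cyclic simplex code built from one period of an m-sequence
seq = np.array([1, 1, 1, 0, 1, 0, 0])
L = len(seq)

# Each row of the coding matrix is a cyclic shift of the base sequence
S = np.array([np.roll(seq, k) for k in range(L)])

# Hypothetical single-pulse BOTDA trace (what decoding should recover)
single_pulse = np.array([0.0, 0.2, 1.0, 0.5, 0.1, 0.0, 0.0])

# Measured coded traces: each coded acquisition sums the responses of
# the pulses present in that code word (linear superposition assumed)
coded = S @ single_pulse

# Decoding: invert the circulant matrix to recover the single-pulse trace
decoded = np.linalg.solve(S, coded)
assert np.allclose(decoded, single_pulse)
```

The SNR gain of length-L cyclic coding grows with L (roughly (L+1)/(2√L) for simplex codes), which motivates long sequences; the distortions analyzed in the paper arise precisely when the linear-superposition assumption above breaks down at high probe power.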
Comparison of ORSAT and SCARAB Reentry Analysis Tools for a Generic Satellite Test Case
NASA Technical Reports Server (NTRS)
Kelley, Robert L.; Hill, Nicole M.; Rochelle, W. C.; Johnson, Nicholas L.; Lips, T.
2010-01-01
Reentry analysis is essential to understanding the consequences of the full life cycle of a spacecraft. Since reentry is a key factor in spacecraft development, NASA and ESA have separately developed tools to assess the survivability of objects during reentry. Criteria such as debris casualty area and impact energy are particularly important to understanding the risks posed to people on Earth. Therefore, NASA and ESA have undertaken a series of comparison studies of their respective reentry codes for verification and improvements in accuracy. The NASA Object Reentry Survival Analysis Tool (ORSAT) and the ESA Spacecraft Atmospheric Reentry and Aerothermal Breakup (SCARAB) reentry analysis tools serve as standard codes for reentry survivability assessment of satellites. These programs predict whether an object will demise during reentry and calculate the debris casualty area of objects determined to survive, establishing the reentry risk posed to the Earth's population by surviving debris. A series of test cases have been studied for comparison; the most recent uses "Testsat," a conceptual satellite composed of generic parts, defined with numerous simple shapes and various materials for a better comparison of the predictions of the two codes. This study improves on the others in the series through increased consistency in modeling techniques and variables. The overall comparison demonstrated that the two codes arrive at similar results: most objects modeled showed close agreement between the two codes, and where the difference was significant, the variance could be explained as a matter of semantics in the model definitions. This paper presents the main results of ORSAT and SCARAB for the Testsat case and discusses the sources of any discovered differences. The results of previous comparisons are also discussed to summarize the differences between the codes and the lessons learned from this series of tests.
Challenges in Coding Adverse Events in Clinical Trials: A Systematic Review
Schroll, Jeppe Bennekou; Maund, Emma; Gøtzsche, Peter C.
2012-01-01
Background: Misclassification of adverse events in clinical trials can sometimes have serious consequences. Therefore, each of the many steps involved, from a patient's adverse experience to presentation in tables in publications, should be as standardised as possible, minimising the scope for interpretation. Adverse events are categorised by a predefined dictionary, e.g. MedDRA, which is updated biannually with many new categories. The objective of this paper is to study interobserver variation and other challenges of coding. Methods: Systematic review using PRISMA. We searched PubMed, EMBASE and The Cochrane Library. All studies were screened for eligibility by two authors. Results: Our search returned 520 unique studies of which 12 were included. Only one study investigated interobserver variation. It reported that 12% of the codes were evaluated differently by two coders. Independent physicians found that 8% of all the codes deviated from the original description. Other studies found that product summaries could be greatly affected by the choice of dictionary. With the introduction of MedDRA, it seems to have become harder to identify adverse events statistically because each code is divided in subgroups. To account for this, lumping techniques have been developed but are rarely used, and guidance on when to use them is vague. An additional challenge is that adverse events are censored if they already occurred in the run-in period of a trial. As there are more than 26 ways of determining whether an event has already occurred, this can lead to bias, particularly because data analysis is rarely performed blindly. Conclusion: There is a lack of evidence that coding of adverse events is a reliable, unbiased and reproducible process. The increase in categories has made detecting adverse events harder, potentially compromising safety. It is crucial that readers of medical publications are aware of these challenges. Comprehensive interobserver studies are needed.
PMID:22911755
Oxidative stress damages rRNA inside the ribosome and differentially affects the catalytic center
Willi, Jessica; Küpfer, Pascal; Evéquoz, Damien; Fernandez, Guillermo; Polacek, Norbert
2018-01-01
Intracellular levels of reactive oxygen species (ROS) increase as a consequence of oxidative stress and represent a major source of damage to biomolecules. Due to its high cellular abundance, RNA is more frequently the target for oxidative damage than DNA. Nevertheless, the functional consequences of damage to stable RNA are poorly understood. Using a genome-wide approach based on 8-oxo-guanosine immunoprecipitation, we present evidence that the most abundant non-coding RNA in a cell, the ribosomal RNA (rRNA), is a target for oxidative nucleobase damage by ROS. Subjecting ribosomes to oxidative stress, we demonstrate that oxidized 23S rRNA inhibits the ribosome during protein biosynthesis. Placing single oxidized nucleobases at specific positions within the ribosome's catalytic center by atomic mutagenesis resulted in markedly different functional outcomes. While some active site nucleobases tolerated oxidative damage well, oxidation at others had detrimental effects on protein synthesis by inhibiting different sub-steps of the ribosomal elongation cycle. Our data provide molecular insight into the biological consequences of RNA oxidation in one of the most central cellular enzymes and reveal mechanistic insight into the role of individual active site nucleobases during translation. PMID:29309687
Whole-genome sequencing identifies EN1 as a determinant of bone density and fracture
Zheng, Hou-Feng; Forgetta, Vincenzo; Hsu, Yi-Hsiang; Estrada, Karol; Rosello-Diez, Alberto; Leo, Paul J; Dahia, Chitra L; Park-Min, Kyung Hyun; Tobias, Jonathan H; Kooperberg, Charles; Kleinman, Aaron; Styrkarsdottir, Unnur; Liu, Ching-Ti; Uggla, Charlotta; Evans, Daniel S; Nielson, Carrie M; Walter, Klaudia; Pettersson-Kymmer, Ulrika; McCarthy, Shane; Eriksson, Joel; Kwan, Tony; Jhamai, Mila; Trajanoska, Katerina; Memari, Yasin; Min, Josine; Huang, Jie; Danecek, Petr; Wilmot, Beth; Li, Rui; Chou, Wen-Chi; Mokry, Lauren E; Moayyeri, Alireza; Claussnitzer, Melina; Cheng, Chia-Ho; Cheung, Warren; Medina-Gómez, Carolina; Ge, Bing; Chen, Shu-Huang; Choi, Kwangbom; Oei, Ling; Fraser, James; Kraaij, Robert; Hibbs, Matthew A; Gregson, Celia L; Paquette, Denis; Hofman, Albert; Wibom, Carl; Tranah, Gregory J; Marshall, Mhairi; Gardiner, Brooke B; Cremin, Katie; Auer, Paul; Hsu, Li; Ring, Sue; Tung, Joyce Y; Thorleifsson, Gudmar; Enneman, Anke W; van Schoor, Natasja M; de Groot, Lisette C.P.G.M.; van der Velde, Nathalie; Melin, Beatrice; Kemp, John P; Christiansen, Claus; Sayers, Adrian; Zhou, Yanhua; Calderari, Sophie; van Rooij, Jeroen; Carlson, Chris; Peters, Ulrike; Berlivet, Soizik; Dostie, Josée; Uitterlinden, Andre G; Williams, Stephen R.; Farber, Charles; Grinberg, Daniel; LaCroix, Andrea Z; Haessler, Jeff; Chasman, Daniel I; Giulianini, Franco; Rose, Lynda M; Ridker, Paul M; Eisman, John A; Nguyen, Tuan V; Center, Jacqueline R; Nogues, Xavier; Garcia-Giralt, Natalia; Launer, Lenore L; Gudnason, Vilmunder; Mellström, Dan; Vandenput, Liesbeth; Karlsson, Magnus K; Ljunggren, Östen; Svensson, Olle; Hallmans, Göran; Rousseau, François; Giroux, Sylvie; Bussière, Johanne; Arp, Pascal P; Koromani, Fjorda; Prince, Richard L; Lewis, Joshua R; Langdahl, Bente L; Hermann, A Pernille; Jensen, Jens-Erik B; Kaptoge, Stephen; Khaw, Kay-Tee; Reeve, Jonathan; Formosa, Melissa M; Xuereb-Anastasi, Angela; Åkesson, Kristina; McGuigan, Fiona E; Garg, Gaurav; Olmos, Jose M; Zarrabeitia, 
Maria T; Riancho, Jose A; Ralston, Stuart H; Alonso, Nerea; Jiang, Xi; Goltzman, David; Pastinen, Tomi; Grundberg, Elin; Gauguier, Dominique; Orwoll, Eric S; Karasik, David; Davey-Smith, George; Smith, Albert V; Siggeirsdottir, Kristin; Harris, Tamara B; Zillikens, M Carola; van Meurs, Joyce BJ; Thorsteinsdottir, Unnur; Maurano, Matthew T; Timpson, Nicholas J; Soranzo, Nicole; Durbin, Richard; Wilson, Scott G; Ntzani, Evangelia E; Brown, Matthew A; Stefansson, Kari; Hinds, David A; Spector, Tim; Cupples, L Adrienne; Ohlsson, Claes; Greenwood, Celia MT; Jackson, Rebecca D; Rowe, David W; Loomis, Cynthia A; Evans, David M; Ackert-Bicknell, Cheryl L; Joyner, Alexandra L; Duncan, Emma L; Kiel, Douglas P; Rivadeneira, Fernando; Richards, J Brent
2016-01-01
The extent to which low-frequency (minor allele frequency [MAF] between 1–5%) and rare (MAF ≤ 1%) variants contribute to complex traits and disease in the general population is largely unknown. Bone mineral density (BMD) is highly heritable, is a major predictor of osteoporotic fractures and has been previously associated with common genetic variants [1–8] and rare, population-specific, coding variants [9]. Here we identify novel non-coding genetic variants with large effects on BMD (n_total = 53,236) and fracture (n_total = 508,253) in individuals of European ancestry from the general population. Associations for BMD were derived from whole-genome sequencing (n = 2,882 from UK10K), whole-exome sequencing (n = 3,549), deep imputation of genotyped samples using a combined UK10K/1000 Genomes reference panel (n = 26,534), and de novo replication genotyping (n = 20,271). We identified a low-frequency non-coding variant near a novel locus, EN1, with an effect size 4-fold larger than the mean of previously reported common variants for lumbar spine BMD [8] (rs11692564[T], MAF = 1.7%, replication effect size = +0.20 standard deviations [SD], P_meta = 2×10−14), which was also associated with a decreased risk of fracture (OR = 0.85; P = 2×10−11; n_cases = 98,742 and n_controls = 409,511). Using an En1(Cre/flox) mouse model, we observed that conditional loss of En1 results in low bone mass, likely as a consequence of high bone turnover. We also identified a novel low-frequency non-coding variant with large effects on BMD near WNT16 (rs148771817[T], MAF = 1.1%, replication effect size = +0.39 SD, P_meta = 1×10−11). In general, there was an excess of association signals arising from deleterious coding and conserved non-coding variants.
These findings provide evidence that low-frequency non-coding variants have large effects on BMD and fracture, thereby providing rationale for whole-genome sequencing and improved imputation reference panels to study the genetic architecture of complex traits and disease in the general population. PMID:26367794
Recent Developments in Three Dimensional Radiation Transport Using the Green's Function Technique
NASA Technical Reports Server (NTRS)
Rockell, Candice; Tweed, John; Blattnig, Steve R.; Mertens, Christopher J.
2010-01-01
In the future, astronauts will be sent into space for longer durations than on previous missions. The increased risk of exposure to dangerous radiation, such as Galactic Cosmic Rays and Solar Particle Events, is of great concern. Consequently, steps must be taken to ensure astronaut safety by providing adequate shielding. In order to better determine and verify shielding requirements, an accurate and efficient radiation transport code based on a fully three-dimensional radiation transport model using the Green's function technique is being developed.
Transonic flow theory of airfoils and wings
NASA Technical Reports Server (NTRS)
Garabedian, P. R.
1976-01-01
There are plans to use the supercritical wing on the next generation of commercial aircraft so as to economize on fuel consumption by reducing drag. Computer codes have served well in meeting the consequent demand for new wing sections. The possibility of replacing wind tunnel tests by computational fluid dynamics is discussed. Another approach to the supercritical wing is through shockless airfoils. A novel boundary value problem in the hodograph plane is studied that enables one to design a shockless airfoil so that its pressure distribution very nearly takes on data that are prescribed.
NASA Technical Reports Server (NTRS)
Poltev, V. I.; Bruskov, V. I.; Shuliupina, N. V.; Rein, R.; Shibata, M.; Ornstein, R.; Miller, J.
1993-01-01
A review is presented of experimental and computational data on the influence of genotoxic modification of bases (deamination, alkylation, oxidation) on the structure and biological functioning of nucleic acids. Pathways are discussed for the influence of modification on the coding properties of bases, on possible errors of nucleic acid biosynthesis, and on the configurations of nucleotide mispairs. The atomic structure of nucleic acid fragments with modified bases and the role of base damage in mutagenesis and carcinogenesis are considered.
Neutron production by cosmic-ray muons in various materials
NASA Astrophysics Data System (ADS)
Manukovsky, K. V.; Ryazhskaya, O. G.; Sobolevsky, N. M.; Yudin, A. V.
2016-07-01
The results obtained by studying the background of neutrons produced by cosmic-ray muons in underground experimental facilities intended for rare-event searches and in surrounding rock are presented. The types of this rock may include granite, sedimentary rock, gypsum, and rock salt. Neutron production and transfer were simulated using the Geant4 and SHIELD transport codes. These codes were tuned via a comparison of the results of calculations with experimental data—in particular, with data of the Artemovsk research station of the Institute for Nuclear Research (INR, Moscow, Russia)—as well as via an intercomparison of results of calculations with the Geant4 and SHIELD codes. It turns out that the atomic-number dependence of the production and yield of neutrons has an irregular character and does not allow a description in terms of a universal function of the atomic number. The parameters of this dependence are different for two groups of nuclei—nuclei consisting of alpha particles and all of the remaining nuclei. Moreover, there are manifest exceptions from a power-law dependence—for example, argon. This may entail important consequences both for the existing underground experimental facilities and for those under construction. Investigation of cosmic-ray-induced neutron production in various materials is of paramount importance for the interpretation of experiments conducted at large depths under the Earth's surface.
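The power-law dependence discussed above is conventionally tested by fitting Y = c·A^β, which becomes a straight-line fit in log-log coordinates. A minimal sketch with synthetic yields (the mass values and coefficients are invented for illustration, not results from the Geant4/SHIELD calculations):

```python
import numpy as np

# Synthetic neutron-yield data following Y = c * A**beta with small scatter
A = np.array([12.0, 16.0, 28.0, 40.0, 56.0, 207.0])  # target masses (illustrative)
c_true, beta_true = 1.5e-4, 0.95
rng = np.random.default_rng(1)
Y = c_true * A**beta_true * np.exp(0.02 * rng.standard_normal(A.size))

# A power law is a straight line in log-log space:
#   log Y = log c + beta * log A
beta_fit, logc_fit = np.polyfit(np.log(A), np.log(Y), 1)

# The fitted exponent should recover the true one within the scatter
assert abs(beta_fit - beta_true) < 0.1
```

Outliers such as argon show up in this representation as points falling visibly off the fitted line, which is how deviations from a universal power law are usually identified.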
Villanueva, Pía; Nudel, Ron; Hoischen, Alexander; Fernández, María Angélica; Simpson, Nuala H.; Gilissen, Christian; Reader, Rose H.; Jara, Lillian; Echeverry, Maria Magdalena; Francks, Clyde; Baird, Gillian; Conti-Ramsden, Gina; O’Hare, Anne; Bolton, Patrick F.; Hennessy, Elizabeth R.; Palomino, Hernán; Carvajal-Carmona, Luis; Veltman, Joris A.; Cazier, Jean-Baptiste; De Barbieri, Zulema
2015-01-01
Children affected by Specific Language Impairment (SLI) fail to acquire age-appropriate language skills despite adequate intelligence and opportunity. SLI is highly heritable, but the understanding of underlying genetic mechanisms has proved challenging. In this study, we use molecular genetic techniques to investigate an admixed isolated founder population from the Robinson Crusoe Island (Chile), who are affected by a high incidence of SLI, increasing the power to discover contributory genetic factors. We utilize exome sequencing in selected individuals from this population to identify eight coding variants that are of putative significance. We then apply association analyses across the wider population to highlight a single rare coding variant (rs144169475, Minor Allele Frequency of 4.1% in admixed South American populations) in the NFXL1 gene that confers a nonsynonymous change (N150K) and is significantly associated with language impairment in the Robinson Crusoe population (p = 2.04 × 10−4, 8 variants tested). Subsequent sequencing of NFXL1 in 117 UK SLI cases identified four individuals with heterozygous variants predicted to be of functional consequence. We conclude that coding variants within NFXL1 confer an increased risk of SLI within a complex genetic model. PMID:25781923
Sclafani, F; Starace, A
1978-01-01
The Republic of San Marino adopted a new Penal Code which came into force on 1st January 1975; it replaced the former one of 15th September 1865. After stating the typical aspects of the Penal Procedure System enforceable therein, the Authors examine the rules concerning criminal responsibility and the danger of committing new crimes. They point out and criticize the relevant contradictions. In explaining the measures regarding punishment and educational rehabilitation provided for by San Marino's legal system, the Authors then consider them from a juridical and criminological viewpoint. While some reforms merit approval (for example: biopsychical inquiry on the charged person, probation, week-end imprisonments, fines according to the incomes of the condemned, etc.), the Authors stress that some legal provisions may appear useless and unrealistic when one considers the environmental conditions of the little Republic. The Authors conclude that Penal Procedure Law is not in accordance with Penal Law and, consequently, they hope that a new reform will be grounded on the needs arising from the crimes perpetrated in loco. It will, however, be necessary to plan a co-ordination between the two Codes within a framework of de-criminalization of many acts which are now punishable as crimes.
Resurrection of DNA Function In Vivo from an Extinct Genome
Pask, Andrew J.; Behringer, Richard R.; Renfree, Marilyn B.
2008-01-01
There is a burgeoning repository of information available from ancient DNA that can be used to understand how genomes have evolved and to determine the genetic features that defined a particular species. To assess the functional consequences of changes to a genome, a variety of methods are needed to examine extinct DNA function. We isolated a transcriptional enhancer element from the genome of an extinct marsupial, the Tasmanian tiger (Thylacinus cynocephalus or thylacine), obtained from 100 year-old ethanol-fixed tissues from museum collections. We then examined the function of the enhancer in vivo. Using a transgenic approach, it was possible to resurrect DNA function in transgenic mice. The results demonstrate that the thylacine Col2A1 enhancer directed chondrocyte-specific expression in this extinct mammalian species in the same way as its orthologue does in mice. While other studies have examined extinct coding DNA function in vitro, this is the first example of the restoration of extinct non-coding DNA and examination of its function in vivo. Our method using transgenesis can be used to explore the function of regulatory and protein-coding sequences obtained from any extinct species in an in vivo model system, providing important insights into gene evolution and diversity. PMID:18493600
Zendedel, Rena; Schouten, Barbara C; van Weert, Julia C M; van den Putte, Bas
2018-06-01
The aim of this observational study was twofold. First, we examined how often and which roles informal interpreters performed during consultations between Turkish-Dutch migrant patients and general practitioners (GPs). Second, relations between these roles and patients' and GPs' perceived control, trust in informal interpreters and satisfaction with the consultation were assessed. A coding instrument was developed to quantitatively code informal interpreters' roles from transcripts of 84 audio-recorded interpreter-mediated consultations in general practice. Patients' and GPs' perceived control, trust and satisfaction were assessed in a post consultation questionnaire. Informal interpreters most often performed the conduit role (almost 25% of all coded utterances), and also frequently acted as replacers and excluders of patients and GPs by asking and answering questions on their own behalf, and by ignoring and omitting patients' and GPs' utterances. The role of information source was negatively related to patients' trust and the role of GP excluder was negatively related to patients' perceived control. Patients and GPs are possibly insufficiently aware of the performed roles of informal interpreters, as these were barely related to patients' and GPs' perceived trust, control and satisfaction. Patients and GPs should be educated about the possible negative consequences of informal interpreting. Copyright © 2018 Elsevier B.V. All rights reserved.
Liu, Baodong; Liu, Xiaoling; Lai, Weiyi; Wang, Hailin
2017-06-06
DNA N6-methyl-2'-deoxyadenosine (6mdA) is an epigenetic modification in both eukaryotes and bacteria. Here we exploited the stable isotope-labeled deoxynucleoside [15N5]-2'-deoxyadenosine ([15N5]-dA) as an initiation tracer and for the first time developed a metabolically differential tracing code for monitoring DNA 6mdA in human cells. We demonstrate that the initiation tracer [15N5]-dA undergoes a specific and efficient adenine deamination reaction leading to the loss of the exocyclic amine 15N, and further utilizes the purine salvage pathway to generate mainly both [15N4]-dA and [15N4]-2'-deoxyguanosine ([15N4]-dG) in mammalian genomes. However, [15N5]-dA is largely retained in the genomes of mycoplasmas, which are often found in cultured cells and experimental animals. Consequently, the methylation of dA generates 6mdA with a consistent coding pattern, with a predominance of [15N4]-6mdA. Therefore, mammalian DNA 6mdA can potentially be discriminated from that generated by infecting mycoplasmas. Collectively, we show a promising approach for the identification of authentic DNA 6mdA in human cells and for determining whether the human cells are contaminated with mycoplasmas.
Modification of orthogonal tRNAs: unexpected consequences for sense codon reassignment.
Biddle, Wil; Schmitt, Margaret A; Fisk, John D
2016-12-01
Breaking the degeneracy of the genetic code via sense codon reassignment has emerged as a way to incorporate multiple copies of multiple non-canonical amino acids into a protein of interest. Here, we report the modification of a normally orthogonal tRNA by a host enzyme and show that this adventitious modification has a direct impact on the activity of the orthogonal tRNA in translation. We observed nearly equal decoding of both histidine codons, CAU and CAC, by an engineered orthogonal M. jannaschii tRNA with an AUG anticodon, tRNA-Opt(AUG). We suspected a modification of the tRNA-Opt(AUG) anticodon was responsible for the anomalous lack of codon discrimination and demonstrate that adenosine 34 of tRNA-Opt(AUG) is converted to inosine. We identified tRNA-Opt(AUG) anticodon loop variants that increase reassignment of the histidine CAU codon, decrease incorporation in response to the histidine CAC codon, and improve cell health and growth profiles. Recognizing tRNA modification as both a potential pitfall and an avenue of directed alteration will be important as the field of genetic code engineering continues to infiltrate the genetic codes of diverse organisms. © The Author(s) 2016. Published by Oxford University Press on behalf of Nucleic Acids Research.
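The loss of codon discrimination described above follows directly from the classic wobble-pairing rules once A34 is deaminated to inosine. A small sketch (the simplified pairing table is a textbook assumption, not data from the paper):

```python
# Which codons can an anticodon read when position 34 (the wobble base)
# is converted from adenosine (A) to inosine (I)?
# Simplified classic wobble rules for the anticodon wobble position:
WOBBLE_PAIRS = {
    "A": {"U"},
    "C": {"G"},
    "G": {"C", "U"},
    "U": {"A", "G"},
    "I": {"A", "C", "U"},  # inosine pairs promiscuously
}
WATSON_CRICK = {"A": "U", "U": "A", "G": "C", "C": "G"}

def decoded_codons(anticodon):
    # The anticodon is written 5'->3'; it pairs antiparallel with the codon,
    # so codon position 3 pairs with anticodon position 1 (the wobble base).
    wobble, mid, last = anticodon[0], anticodon[1], anticodon[2]
    first_two = WATSON_CRICK[last] + WATSON_CRICK[mid]
    return {first_two + third for third in WOBBLE_PAIRS[wobble]}

# An unmodified AUG anticodon reads only CAU; after A34 -> I conversion it
# also reads CAC (and CAA), explaining the loss of codon discrimination.
assert decoded_codons("AUG") == {"CAU"}
assert decoded_codons("IUG") == {"CAU", "CAC", "CAA"}
```

This is why the inosine modification matters for sense codon reassignment: a tRNA engineered to capture only CAU silently gains the ability to compete for CAC as well.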
Living Organisms Author Their Read-Write Genomes in Evolution.
Shapiro, James A
2017-12-06
Evolutionary variations generating phenotypic adaptations and novel taxa resulted from complex cellular activities altering genome content and expression: (i) Symbiogenetic cell mergers producing the mitochondrion-bearing ancestor of eukaryotes and chloroplast-bearing ancestors of photosynthetic eukaryotes; (ii) interspecific hybridizations and genome doublings generating new species and adaptive radiations of higher plants and animals; and, (iii) interspecific horizontal DNA transfer encoding virtually all of the cellular functions between organisms and their viruses in all domains of life. Consequently, assuming that evolutionary processes occur in isolated genomes of individual species has become an unrealistic abstraction. Adaptive variations also involved natural genetic engineering of mobile DNA elements to rewire regulatory networks. In the most highly evolved organisms, biological complexity scales with "non-coding" DNA content more closely than with protein-coding capacity. Coincidentally, we have learned how so-called "non-coding" RNAs that are rich in repetitive mobile DNA sequences are key regulators of complex phenotypes. Both biotic and abiotic ecological challenges serve as triggers for episodes of elevated genome change. The intersections of cell activities, biosphere interactions, horizontal DNA transfers, and non-random Read-Write genome modifications by natural genetic engineering provide a rich molecular and biological foundation for understanding how ecological disruptions can stimulate productive, often abrupt, evolutionary transformations.
Shoura, Massa J; Gabdank, Idan; Hansen, Loren; Merker, Jason; Gotlib, Jason; Levene, Stephen D; Fire, Andrew Z
2017-10-05
Investigations aimed at defining the 3D configuration of eukaryotic chromosomes have consistently encountered an endogenous population of chromosome-derived circular genomic DNA, referred to as extrachromosomal circular DNA (eccDNA). While the production, distribution, and activities of eccDNAs remain understudied, eccDNA formation from specific regions of the linear genome has profound consequences on the regulatory and coding capabilities for these regions. Here, we define eccDNA distributions in Caenorhabditis elegans and in three human cell types, utilizing a set of DNA topology-dependent approaches for enrichment and characterization. The use of parallel biophysical, enzymatic, and informatic approaches provides a comprehensive profiling of eccDNA robust to isolation and analysis methodology. Results in human and nematode systems provide quantitative analysis of the eccDNA loci at both unique and repetitive regions. Our studies converge on and support a consistent picture, in which endogenous genomic DNA circles are present in normal physiological states, and in which the circles come from both coding and noncoding genomic regions. Prominent among the coding regions generating DNA circles are several genes known to produce a diversity of protein isoforms, with mucin proteins and titin as specific examples. Copyright © 2017 Shoura et al.
The Vestibular System Implements a Linear–Nonlinear Transformation In Order to Encode Self-Motion
Massot, Corentin; Schneider, Adam D.; Chacron, Maurice J.; Cullen, Kathleen E.
2012-01-01
Although it is well established that the neural code representing the world changes at each stage of a sensory pathway, the transformations that mediate these changes are not well understood. Here we show that self-motion (i.e. vestibular) sensory information encoded by VIIIth nerve afferents is integrated nonlinearly by post-synaptic central vestibular neurons. This response nonlinearity was characterized by a strong (∼50%) attenuation in neuronal sensitivity to low frequency stimuli when presented concurrently with high frequency stimuli. Using computational methods, we further demonstrate that a static boosting nonlinearity in the input-output relationship of central vestibular neurons accounts for this unexpected result. Specifically, when low and high frequency stimuli are presented concurrently, this boosting nonlinearity causes an intensity-dependent bias in the output firing rate, thereby attenuating neuronal sensitivities. We suggest that nonlinear integration of afferent input extends the coding range of central vestibular neurons and enables them to better extract the high frequency features of self-motion when embedded with low frequency motion during natural movements. These findings challenge the traditional notion that the vestibular system uses a linear rate code to transmit information and have important consequences for understanding how the representation of sensory information changes across sensory pathways. PMID:22911113
[Direct genetic manipulation and criminal code in Venezuela: absolute criminal law void?].
Cermeño Zambrano, Fernando G De J
2002-01-01
The legal regulation of genetic biotechnology applied to the human genome is currently of great relevance in Venezuela due to the drafting of an innovative bioethics law in the country's parliament. This article highlights the constitutional norms of Venezuela's 1999 Constitution regarding this subject, as they establish the framework from which this matter will be legally regulated. The article approaches genetic biotechnology applied to the human genome from the standpoint of Venezuelan penal law, highlighting those harmful genetic manipulations that have criminal relevance. Genetic biotechnology applied to the human genome has acquired further relevance as a consequence of the reformulation of the Venezuelan Penal Code being discussed by the country's National Assembly. Therefore, a concise study of the country's penal code is made in this article to better understand which legally protected interests have been safeguarded by Venezuelan penal legislation. This last step enables us to identify the penal tools Venezuela can count on to confront direct genetic manipulations. We equally indicate the existing punitive loophole, which should be closed by the penal legislator. In conclusion, this essay concerns criminal policy regarding direct genetic manipulations of the human genome that have not been criminalized in Venezuelan law, thus revealing a genetic biotechnology paradise.
Mild KCC2 Hypofunction Causes Inconspicuous Chloride Dysregulation that Degrades Neural Coding
Doyon, Nicolas; Prescott, Steven A.; De Koninck, Yves
2016-01-01
Disinhibition caused by Cl− dysregulation is implicated in several neurological disorders. This form of disinhibition, which stems primarily from impaired Cl− extrusion through the co-transporter KCC2, is typically identified by a depolarizing shift in GABA reversal potential (EGABA). Here we show, using computer simulations, that intracellular [Cl−] exhibits exaggerated fluctuations during transient Cl− loads and recovers more slowly to baseline when KCC2 level is even modestly reduced. Using information theory and signal detection theory, we show that increased Cl− lability and settling time degrade neural coding. Importantly, these deleterious effects manifest after less KCC2 reduction than needed to produce the gross changes in EGABA required for detection by most experiments, which assess KCC2 function under weak Cl− load conditions. By demonstrating the existence and functional consequences of “occult” Cl− dysregulation, these results suggest that modest KCC2 hypofunction plays a greater role in neurological disorders than previously believed. PMID:26858607
Advances in pleural disease management including updated procedural coding.
Haas, Andrew R; Sterman, Daniel H
2014-08-01
Over 1.5 million pleural effusions occur in the United States every year as a consequence of a variety of inflammatory, infectious, and malignant conditions. Although rarely fatal in isolation, pleural effusions are often a marker of a serious underlying medical condition and contribute to significant patient morbidity, quality-of-life reduction, and mortality. Pleural effusion management centers on pleural fluid drainage to relieve symptoms and to investigate pleural fluid accumulation etiology. Many recent studies have demonstrated important advances in pleural disease management approaches for a variety of pleural fluid etiologies, including malignant pleural effusion, complicated parapneumonic effusion and empyema, and chest tube size. The last decade has seen greater implementation of real-time imaging assistance for pleural effusion management and increasing use of smaller bore percutaneous chest tubes. This article will briefly review recent pleural effusion management literature and update the latest changes in common procedural terminology billing codes as reflected in the changing landscape of imaging use and percutaneous approaches to pleural disease management.
Remote state preparation through hyperentangled atomic states
NASA Astrophysics Data System (ADS)
Nawaz, Mehwish; ul-Islam, Rameez-; Ikram, Manzoor
2018-04-01
Hyperentangled states have enhanced channel capacity in quantum processing and have yielded evident increases in communication speed in quantum informatics as a consequence of the excessively high information content coded over each quantum entity. In the present article, we intend to demonstrate this fact by utilizing atomic states simultaneously entangled both in internal as well as external degrees of freedom, i.e. the de Broglie motion, for remote state preparation (RSP). The results clearly demonstrate that we can efficiently communicate two-bit information while manipulating only a single quantum subsystem. The states are prepared and manipulated using atomic Bragg diffraction as well as Ramsey interferometry, both of which are now considered standard, state-of-the-art tools based on cavity quantum electrodynamics. Since atomic Bragg diffraction is a large interaction time regime and produces spatially well separated, decoherence-resistant outputs, the schematics presented here for the RSP offer important perspectives on efficient detection as well as unambiguous information coding and readout. The article summarizes the experimental feasibility of the proposal, culminating with a brief discussion.
Crystal growth and furnace analysis
NASA Technical Reports Server (NTRS)
Dakhoul, Youssef M.
1986-01-01
A thermal analysis of Hg/Cd/Te solidification in a Bridgman cell is made using Continuum's VAST code. The energy equation is solved in an axisymmetric, quasi-steady domain for both the molten and solid alloy regions. Alloy composition is calculated by a simplified one-dimensional model to estimate its effect on melt thermal conductivity and, consequently, on the temperature field within the cell. Solidification is assumed to occur at a fixed temperature of 979 K. Simplified boundary conditions are included to model both the radiant and conductive heat exchange between the furnace walls and the alloy. Calculations are performed to show how the steady-state isotherms are affected by: the hot and cold furnace temperatures, boundary condition parameters, and the growth rate which affects the calculated alloy's composition. The Advanced Automatic Directional Solidification Furnace (AADSF), developed by NASA, is also thermally analyzed using the CINDA code. The objective is to determine the performance and the overall power requirements for different furnace designs.
Damage-plasticity model of the host rock in a nuclear waste repository
DOE Office of Scientific and Technical Information (OSTI.GOV)
Koudelka, Tomáš; Kruis, Jaroslav, E-mail: kruis@fsv.cvut.cz
The paper describes a damage-plasticity model for the modelling of the host rock environment of a nuclear waste repository. The Radioactive Waste Repository Authority in the Czech Republic assumes the repository will be sited in a granite rock mass which exhibits anisotropic behaviour, where the strength in tension is lower than in compression. In order to describe this phenomenon, the damage-plasticity model is formulated with the help of the Drucker-Prager yield criterion, which can be set to capture the compression behaviour, while the tensile stress states are described with the help of a scalar isotropic damage model. The damage-plasticity model was implemented in the SIFEL finite element code and consequently, the code was used for the simulation of the Äspö Pillar Stability Experiment (APSE), which was performed in order to determine yielding strength under various conditions in granite rocks similar to those in the Czech Republic. The results from the performed analysis are presented and discussed in the paper.
The Altered Hepatic Tubulin Code in Alcoholic Liver Disease.
Groebner, Jennifer L; Tuma, Pamela L
2015-09-18
The molecular mechanisms that lead to the progression of alcoholic liver disease have been actively examined for decades. Because the hepatic microtubule cytoskeleton supports innumerable cellular processes, it has been the focus of many such mechanistic studies. It has long been appreciated that α-tubulin is a major target for modification by highly reactive ethanol metabolites and reactive oxygen species. It is also now apparent that alcohol exposure induces post-translational modifications that are part of the natural repertoire, mainly acetylation. In this review, the modifications comprising the "tubulin code" are described, as well as the adducts formed by ethanol metabolites. The potential cellular consequences of microtubule modification are described with a focus on alcohol-induced defects in protein trafficking and enhanced steatosis. Possible mechanisms that can explain hepatic dysfunction are described, and how this relates to the onset of liver injury is discussed. Finally, we propose that agents that alter the cellular acetylation state may represent a novel therapeutic strategy for treating liver disease.
Scaled experiments of explosions in cavities
Grun, J.; Cranch, G. A.; Lunsford, R.; ...
2016-05-11
Consequences of an explosion inside an air-filled cavity under the earth's surface are partly duplicated in a laboratory experiment on spatial scales 1000 times smaller. The experiment measures shock pressures coupled into a block of material by an explosion inside a gas-filled cavity therein. The explosion is generated by suddenly heating a thin foil located near the cavity center with a short laser pulse, which turns the foil into expanding plasma, most of whose energy drives a blast wave in the cavity gas. Variables in the experiment are the cavity radius and explosion energy. Measurements and GEODYN code simulations show that shock pressures measured in the block exhibit a weak dependence on scaled cavity radius up to ~25 m/kt^(1/3), above which they decrease rapidly. Possible mechanisms giving rise to this behavior are described. As a result, the applicability of this work to validating codes used to simulate full-scale cavity explosions is discussed.
Porter, Joseph J; Mehl, Ryan A
2018-01-01
Posttranslational modifications resulting from oxidation of proteins (Ox-PTMs) are present intracellularly under conditions of oxidative stress as well as basal conditions. In the past, these modifications were thought to be generic protein damage, but it has become increasingly clear that Ox-PTMs can have specific physiological effects. It is an arduous task to distinguish between the two cases, as multiple Ox-PTMs occur simultaneously on the same protein, convoluting analysis. Genetic code expansion (GCE) has emerged as a powerful tool to overcome this challenge as it allows for the site-specific incorporation of an Ox-PTM into translated protein. The resulting homogeneously modified protein products can then be rigorously characterized for the effects of individual Ox-PTMs. We outline the strengths and weaknesses of GCE as they relate to the field of oxidative stress and Ox-PTMs. An overview of the Ox-PTMs that have been genetically encoded and applications of GCE to the study of Ox-PTMs, including antibody validation and therapeutic development, is described.
Molecular Pathways: microRNAs, Cancer Cells, and Microenvironment
Berindan-Neagoe, Ioana; Calin, George A.
2015-01-01
One of the most unexpected discoveries in molecular oncology over the last decade is the interplay between abnormalities in protein-coding genes and short non-coding microRNAs (miRNAs) that are causally involved in cancer initiation, progression, and dissemination. This phenomenon was initially defined in malignant cells; however, in recent years, more data have accumulated describing the participation of miRNAs produced by microenvironment cells. Like hormones, miRNAs are released by a donor cell in various forms of vesicles or as 'free' molecules secreted by active mechanisms. These miRNAs spread as signaling molecules that are taken up either as exosomes or as 'free' RNAs by cells located in other parts of the organism. Here, we discuss the communication between cancer cells and the microenvironment through miRNAs. We further expand this in the context of translational consequences and present miRNAs as predictors of therapeutic response and as targeted therapeutics and therapeutic targets in either malignant cells or microenvironment cells. PMID:25512634
Amaro, Christina M.; Devine, Katie A.; Psihogios, Alexandra M.; Murphy, Lexa K.; Holmbeck, Grayson N.
2015-01-01
Objective To examine observed autonomy-promoting and -inhibiting parenting behaviors during preadolescence as predictors of adjustment outcomes in emerging adults with and without spina bifida (SB). Methods Demographic and videotaped interaction data were collected from families with 8/9-year-old children with SB (n = 68) and a matched group of typically developing youth (n = 68). Observed interaction data were coded with macro- and micro-coding schemes. Measures of emerging adulthood adjustment were collected 10 years later (ages 18/19 years; n = 50 and n = 60 for SB and comparison groups, respectively). Results Autonomy-promoting (behavioral control, autonomy-relatedness) and -inhibiting (psychological control) observed preadolescent parenting behaviors prospectively predicted emerging adulthood adjustment, particularly within educational, social, and emotional domains. Interestingly, high parent undermining of relatedness predicted better educational and social adjustment in the SB sample. Conclusions Parenting behaviors related to autonomy have long-term consequences for adjustment in emerging adults with and without SB. PMID:24864277
[Quality assurance using routine data. Is outcome quality now measurable?].
Kostuj, T; Smektala, R
2010-12-01
Health service quality in Germany can be shown by the data from the external quality assurance program (BQS), but as these records are limited to the period of in-hospital stay, no information about outcome after discharge from hospital can be obtained. Secondary routine administrative data contain information about long-term outcome, such as mortality, subsequent revision, and the need for care following surgical treatment due to a hip fracture. Experience in the use of secondary data dealing with the treatment of hip fractures from the BQS is available in our department. In addition, we analyzed routine administrative data from the health insurance companies Knappschaft Bahn-See and AOK in a cooperative study with the WidO (scientific institute of the AOK). These routine data clearly show a bias because of poor quality in coding as well as the broad interpretation possibilities of some of the ICD-10 codes used. Consequently, quality assurance using routine data is less valid than register-based conclusions. Nevertheless, medical expertise is necessary to avoid misinterpretation of routine administrative data.
NASA Astrophysics Data System (ADS)
Ezzedine, S. M.; Dearborn, D. S.; Miller, P. L.
2015-12-01
The annual probability of an asteroid impact is low, but over time, such catastrophic events are inevitable. Interest in assessing the impact consequences has led us to develop a physics-based framework to seamlessly simulate the event from entry to impact, including air and water shock propagation and wave generation. The non-linear effects are simulated using the hydrodynamics code GEODYN. As effects propagate outward, they become a wave source for the linear-elastic-wave propagation code, WPP/WWP. The GEODYN-WPP/WWP coupling is based on the structured adaptive-mesh-refinement infrastructure, SAMRAI, and has been used in FEMA table-top exercises conducted in 2013 and 2014, and more recently, the 2015 Planetary Defense Conference exercise. Results from these simulations provide an estimate of onshore effects and can inform more sophisticated inundation models. The capabilities of this methodology are illustrated by providing results for different impact locations, and an exploration of the effect of asteroid size on the waves arriving at the shoreline of area cities. We constructed the maximum and minimum envelopes of water-wave heights given the size of the asteroid and the location of the impact along the risk corridor. Such profiles can inform emergency response and disaster-mitigation efforts, and may be used for the design of maritime protection or assessment of risk to shoreline structures of interest. This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344. LLNL-ABS-675390-DRAFT.
Value and probability coding in a feedback-based learning task utilizing food rewards.
Tricomi, Elizabeth; Lempert, Karolina M
2015-01-01
For the consequences of our actions to guide behavior, the brain must represent different types of outcome-related information. For example, an outcome can be construed as negative because an expected reward was not delivered or because an outcome of low value was delivered. Thus behavioral consequences can differ in terms of the information they provide about outcome probability and value. We investigated the role of the striatum in processing probability-based and value-based negative feedback by training participants to associate cues with food rewards and then employing a selective satiety procedure to devalue one food outcome. Using functional magnetic resonance imaging, we examined brain activity related to receipt of expected rewards, receipt of devalued outcomes, omission of expected rewards, omission of devalued outcomes, and expected omissions of an outcome. Nucleus accumbens activation was greater for rewarding outcomes than devalued outcomes, but activity in this region did not correlate with the probability of reward receipt. Activation of the right caudate and putamen, however, was largest in response to rewarding outcomes relative to expected omissions of reward. The dorsal striatum (caudate and putamen) at the time of feedback also showed a parametric increase correlating with the trialwise probability of reward receipt. Our results suggest that the ventral striatum is sensitive to the motivational relevance, or subjective value, of the outcome, while the dorsal striatum codes for a more complex signal that incorporates reward probability. Value and probability information may be integrated in the dorsal striatum, to facilitate action planning and allocation of effort. Copyright © 2015 the American Physiological Society.
Kunkeaw, Nawapol; Jeon, Sung Ho; Lee, Kwanbok; Johnson, Betty H.; Tanasanvimon, Suebpong; Javle, Milind; Pairojkul, Chawalit; Chamgramol, Yaovalux; Wongfieng, Wipaporn; Gong, Bin; Leelayuwat, Chanvit; Lee, Yong Sun
2013-01-01
We have recently identified nc886 (pre-miR-886 or vtRNA2-1) as a novel type of non-coding RNA that inhibits activation of PKR (Protein Kinase RNA-activated). PKR's pro-apoptotic role through eIF2α phosphorylation is well established in the host defense against viral infection. Paradoxically, some cancer patients have elevated PKR activity; however, its cause and consequence are not understood. Initially we evaluated the expression of nc886, PKR and eIF2α in non-malignant cholangiocyte and cholangiocarcinoma (CCA) cells. nc886 is repressed in CCA cells and this repression is the cause of PKR's activation therein. nc886 alone is necessary and sufficient for suppression of PKR via direct physical interaction. Consistently, artificial suppression of nc886 in cholangiocyte cells activates the canonical PKR/eIF2α cell death pathway, suggesting a potential significance of the nc886 suppression and the consequent PKR activation in eliminating pre-malignant cells during tumorigenesis. In comparison, active PKR in CCA cells induces neither phospho-eIF2α nor apoptosis, but promotes the pro-survival NF-κB pathway. Thus, PKR plays a dual life-or-death role during tumorigenesis. Similarly to the CCA cell lines, nc886 tends to be decreased but PKR tends to be activated in our clinical samples from CCA patients. Collectively from our data, we propose a tumor surveillance model for nc886's role in the PKR pathway during tumorigenesis. PMID:22926522
Stereotypes Associated With Age-related Conditions and Assistive Device Use in Canadian Media.
Fraser, Sarah Anne; Kenyon, Virginia; Lagacé, Martine; Wittich, Walter; Southall, Kenneth Edmund
2016-12-01
Newspapers are an important source of information. The discourses within the media can influence public attitudes and support or discourage stereotypical portrayals of older individuals. This study critically examined discourses within a Canadian newspaper in terms of stereotypical depictions of age-related health conditions and assistive technology devices (ATDs). Four years (2009-2013) of Globe and Mail articles were searched for terms relevant to the research question. A total of 65 articles were retained, and a critical discourse analysis (CDA) of the texts was conducted. The articles were coded for stereotypes associated with age-related health conditions and ATDs, consequences of the stereotyping, and context (overall setting or background) of the discourse. The primary code list included 4 contexts, 13 stereotypes, and 9 consequences of stereotyping. CDA revealed discourses relating to (a) maintaining autonomy in a stereotypical world, (b) ATDs as obstacles in employment, (c) barriers to help seeking for age-related conditions, and (d) people in power setting the stage for discrimination. Our findings indicate that discourses in the Canadian media include stereotypes associated with age-related health conditions. Further, depictions of health conditions and ATDs may exacerbate existing stereotypes about older individuals, limit the options available to them, lead to a reduction in help seeking, and lower ATD use. Education about the realities of age-related health changes and ATDs is needed in order to diminish stereotypes and encourage ATD uptake and use. © The Author 2015. Published by Oxford University Press on behalf of The Gerontological Society of America. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
Simulation of a beam rotation system for a spallation source
NASA Astrophysics Data System (ADS)
Reiss, Tibor; Reggiani, Davide; Seidel, Mike; Talanov, Vadim; Wohlmuther, Michael
2015-04-01
With a nominal beam power of nearly 1 MW on target, the Swiss Spallation Neutron Source (SINQ) ranks among the world's most powerful spallation neutron sources. The proton beam transport to the SINQ target is carried out exclusively by means of linear magnetic elements. In the transport line to SINQ the beam is scattered in two meson production targets and as a consequence, at the SINQ target entrance the beam shape can be described by Gaussian distributions in the transverse x and y directions with tails cut short by collimators. This leads to a highly nonuniform power distribution inside the SINQ target, giving rise to thermal and mechanical stresses. In view of a future proton beam intensity upgrade, the possibility of homogenizing the beam distribution by means of a fast beam rotation system is currently under investigation. Important aspects which need to be studied are the impact of a rotating proton beam on the resulting neutron spectra and spatial flux distributions, as well as additional, previously absent, proton losses causing unwanted activation of accelerator components. Hence a new source description method was developed for the radiation transport code MCNPX. This new feature makes direct use of the results from the proton beam optics code TURTLE. Its advantage over existing MCNPX source options is that all phase space information and correlations of each primary beam particle computed with TURTLE are preserved and transferred to MCNPX. Simulations of the different beam distributions together with their consequences in terms of neutron production are presented in this publication. Additionally, a detailed description of the coupling method between TURTLE and MCNPX is provided.
NASA Astrophysics Data System (ADS)
Emeriau-Viard, Constance; Brun, Allan Sacha
2017-10-01
During the PMS, the structure and rotation rate of stars evolve significantly. We wish to assess the consequences of these drastic changes on the stellar dynamo, internal magnetic field topology and activity level by means of HPC simulations with the ASH code. To answer this question, we develop 3D MHD simulations that represent specific stages of stellar evolution along the PMS. We choose five different models characterized by the radius of their radiative zone following an evolutionary track, from 1 Myr to 50 Myr, computed by a 1D stellar evolution code. We introduce a seed magnetic field in the youngest model and then we spread it through all simulations. First of all, we study the consequences that the increase of rotation rate and the change of geometry of the convective zone have on the dynamo field that exists in the convective envelope. The magnetic energy increases, the topology of the magnetic field becomes more complex and the axisymmetric magnetic field becomes less predominant as the star ages. The computation of the fully convective MHD model shows that a strong dynamo develops, with a ratio of magnetic to kinetic energy reaching equipartition and even super-equipartition states in the faster-rotating cases. Magnetic fields resulting from our MHD simulations possess a mixed poloidal-toroidal topology with no obvious dominant component. We also study the relaxation of the vestige dynamo magnetic field within the radiative core and find that it satisfies stability criteria. Hence it does not experience a global reconfiguration and instead slowly relaxes by retaining its mixed poloidal-toroidal topology.
Exploring the patient perspective of fatigue in adults with visual impairment: a qualitative study
Bode, Christina; van der Aa, Hilde P A; Hulshof, Carel T J; Bosmans, Judith E; van Rens, Gerardus H M B; van Nispen, Ruth M A
2017-01-01
Objectives Fatigue is an often mentioned symptom by patients with irreversible visual impairment. This study explored the patient perspective of fatigue in visually impaired adults with a focus on symptoms of fatigue, causes, consequences and coping strategies. Setting Two large Dutch low vision multidisciplinary rehabilitation organisations. Participants 16 visually impaired adults with severe symptoms of fatigue selected by purposive sampling. Methods A qualitative study involving semistructured interviews. A total of four first-level codes were top–down predetermined in correspondence with the topics of the research question. Verbatim transcribed interviews were analysed with a combination of a deductive and inductive approach using open and axial coding. Results Participants often described the symptoms of fatigue as a mental, daily and physical experience. The most often mentioned causes of fatigue were a high cognitive load, the intensity and amount of activities, the high effort necessary to establish visual perception, difficulty with light intensity and negative cognitions. Fatigue had the greatest impact on the ability to carry out social roles and participation, emotional functioning and cognitive functioning. The most common coping strategies were relaxation, external support, socialising and physical exercise and the acceptance of fatigue. Conclusions Our results indicate that low vision-related fatigue is mainly caused by population specific determinants that seem different from the fatigue experience described in studies with other patient populations. Fatigue may be central to the way patients react, adapt and compensate to the consequences of vision loss. These findings indicate a need for future research aimed at interventions specifically tailored to the unique aspects of fatigue related to vision loss. PMID:28775181
The economic consequences of irritable bowel syndrome: a US employer perspective.
Leong, Stephanie A; Barghout, Victoria; Birnbaum, Howard G; Thibeault, Crystal E; Ben-Hamadi, Rym; Frech, Feride; Ofman, Joshua J
2003-04-28
The objective of this study was to measure the direct costs of treating irritable bowel syndrome (IBS) and the indirect costs in the workplace. This was accomplished through retrospective analysis of administrative claims data from a national Fortune 100 manufacturer, which includes all medical, pharmaceutical, and disability claims for the company's employees, spouses/dependents, and retirees. Patients with IBS were identified as individuals, aged 18 to 64 years, who received a primary code for IBS or a secondary code for IBS and a primary code for constipation or abdominal pain between January 1, 1996, and December 31, 1998. Of these patients with IBS, 93.7% were matched based on age, sex, employment status, and ZIP code to a control population of beneficiaries. Direct and indirect costs for patients with IBS were compared with those of matched controls. The average total cost (direct plus indirect) per patient with IBS was $4527 in 1998 compared with $3276 for a control beneficiary (P<.001). The average physician visit costs were $524 and $345 for patients with IBS and controls, respectively (P<.001). The average outpatient care costs to the employer were $1258 and $742 for patients with IBS and controls, respectively (P<.001). Medically related work absenteeism cost the employer $901 on average per employee treated for IBS compared with $528 on average per employee without IBS (P<.001). Irritable bowel syndrome is a significant financial burden on the employer that arises from an increase in direct and indirect costs compared with the control group.
Nund, Rebecca L; Scarinci, Nerina A; Cartmill, Bena; Ward, Elizabeth C; Kuipers, Pim; Porceddu, Sandro V
2014-12-01
The International Classification of Functioning, Disability, and Health (ICF) is an internationally recognized framework which allows its user to describe the consequences of a health condition on an individual in the context of their environment. With growing recognition that dysphagia can have broad ranging physical and psychosocial impacts, the aim of this paper was to identify the ICF domains and categories that describe the full functional impact of dysphagia following non-surgical head and neck cancer (HNC) management, from the perspective of the person with dysphagia. A secondary analysis was conducted on previously published qualitative study data which explored the lived experiences of dysphagia of 24 individuals with self-reported swallowing difficulties following HNC management. Categories and sub-categories identified by the qualitative analysis were subsequently mapped to the ICF using the established linking rules to develop a set of ICF codes relevant to the impact of dysphagia following HNC management. The 69 categories and sub-categories that had emerged from the qualitative analysis were successfully linked to 52 ICF codes. The distribution of these codes across the ICF framework revealed that the components of Body Functions, Activities and Participation, and Environmental Factors were almost equally represented. The findings confirm that the ICF is a valuable framework for representing the complexity and multifaceted impact of dysphagia following HNC. This list of ICF codes, which reflect the diverse impact of dysphagia associated with HNC on the individual, can be used to guide more holistic assessment and management for this population.
Transformation of the neural code for tactile detection from thalamus to cortex.
Vázquez, Yuriria; Salinas, Emilio; Romo, Ranulfo
2013-07-09
To understand how sensory-driven neural activity gives rise to perception, it is essential to characterize how various relay stations in the brain encode stimulus presence. Neurons in the ventral posterior lateral (VPL) nucleus of the somatosensory thalamus and in primary somatosensory cortex (S1) respond to vibrotactile stimulation with relatively slow modulations (∼100 ms) of their firing rate. In addition, faster modulations (∼10 ms) time-locked to the stimulus waveform are observed in both areas, but their contribution to stimulus detection is unknown. Furthermore, it is unclear whether VPL and S1 neurons encode stimulus presence with similar accuracy and via the same response features. To address these questions, we recorded single neurons while trained monkeys judged the presence or absence of a vibrotactile stimulus of variable amplitude, and their activity was analyzed with a unique decoding method that is sensitive to the time scale of the firing rate fluctuations. We found that the maximum detection accuracy of single neurons is similar in VPL and S1. However, VPL relies more heavily on fast rate modulations than S1, and as a consequence, the neural code in S1 is more tolerant: its performance degrades less when the readout method or the time scale of integration is suboptimal. Therefore, S1 neurons implement a more robust code, one less sensitive to the temporal integration window used to infer stimulus presence downstream. The differences between VPL and S1 responses signaling the appearance of a stimulus suggest a transformation of the neural code from thalamus to cortex.
Decoding the complex genetic causes of heart diseases using systems biology.
Djordjevic, Djordje; Deshpande, Vinita; Szczesnik, Tomasz; Yang, Andrian; Humphreys, David T; Giannoulatou, Eleni; Ho, Joshua W K
2015-03-01
The pace of disease gene discovery is still much slower than expected, even with the use of cost-effective DNA sequencing and genotyping technologies. It is increasingly clear that many inherited heart diseases have a more complex polygenic aetiology than previously thought. Understanding the role of gene-gene interactions, epigenetics, and non-coding regulatory regions is becoming increasingly critical in predicting the functional consequences of genetic mutations identified by genome-wide association studies and whole-genome or exome sequencing. A systems biology approach is now being widely employed to systematically discover genes that are involved in heart diseases in humans or relevant animal models through bioinformatics. The overarching premise is that the integration of high-quality causal gene regulatory networks (GRNs), genomics, epigenomics, transcriptomics and other genome-wide data will greatly accelerate the discovery of the complex genetic causes of congenital and complex heart diseases. This review summarises state-of-the-art genomic and bioinformatics techniques that are used in accelerating the pace of disease gene discovery in heart diseases. Accompanying this review, we provide an interactive web-resource for systems biology analysis of mammalian heart development and diseases, CardiacCode ( http://CardiacCode.victorchang.edu.au/ ). CardiacCode features a dataset of over 700 pieces of manually curated genetic or molecular perturbation data, which enables the inference of a cardiac-specific GRN of 280 regulatory relationships between 33 regulator genes and 129 target genes. We believe this growing resource will fill an urgent unmet need to fully realise the true potential of predictive and personalised genomic medicine in tackling human heart disease.
Unconventional Gas and Oil Drilling Is Associated with Increased Hospital Utilization Rates
Neidell, Matthew; Chillrud, Steven; Yan, Beizhan; Stute, Martin; Howarth, Marilyn; Saberi, Pouné; Fausti, Nicholas; Penning, Trevor M.; Roy, Jason; Propert, Kathleen J.; Panettieri, Reynold A.
2015-01-01
Over the past ten years, unconventional gas and oil drilling (UGOD) has markedly expanded in the United States. Despite substantial increases in well drilling, the health consequences of UGOD toxicant exposure remain unclear. This study examines the association between wells and healthcare use by zip code from 2007 to 2011 in Pennsylvania. Inpatient discharge databases from the Pennsylvania Healthcare Cost Containment Council were correlated with active wells by zip code in three counties in Pennsylvania. For overall inpatient prevalence rates and 25 specific medical categories, the associations of inpatient prevalence rates with the number of wells per zip code and, separately, with wells per km2 (divided into quantiles and defined as well density) were estimated using fixed-effects Poisson models. To account for multiple comparisons, a Bonferroni correction was applied, and associations with p<0.00096 were considered statistically significant. Cardiology inpatient prevalence rates were significantly associated with the number of wells per zip code (p<0.00096) and wells per km2 (p<0.00096), while neurology inpatient prevalence rates were significantly associated with wells per km2 (p<0.00096). Furthermore, the evidence also supported an association between well density and inpatient prevalence rates for the medical categories of dermatology, neurology, oncology, and urology. These data suggest that UGOD wells, which dramatically increased in number over the past decade, were associated with increased inpatient prevalence rates within specific medical categories in Pennsylvania. Further studies are necessary to address the healthcare costs of UGOD and to determine whether specific toxicants or combinations of toxicants are associated with organ-specific responses. PMID:26176544
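The p<0.00096 cutoff quoted above is what a standard Bonferroni correction yields; the sketch below shows the arithmetic under the assumption (ours, not stated explicitly in the abstract) that 26 outcome rates were each tested against two exposure measures:

```python
# Bonferroni-corrected significance threshold, hypothetical decomposition:
# 26 outcome rates (overall + 25 specific medical categories) x 2 exposure
# measures (wells per zip code, wells per km^2) = 52 tests.
alpha = 0.05
n_tests = 26 * 2
threshold = alpha / n_tests
print(round(threshold, 5))  # 0.00096, matching the study's cutoff
```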
Wanje, George; Masese, Linnet; Avuvika, Ethel; Baghazal, Anisa; Omoni, Grace; Scott McClelland, R
2017-08-14
To successfully develop and implement school-based sexual health interventions for adolescent girls, such as screening for Chlamydia trachomatis, Neisseria gonorrhoeae, and Trichomonas vaginalis, it is important to understand parents' and teachers' attitudes towards sexual health education and acceptability of sexually transmitted infection (STI) screening interventions. In this qualitative study, we approached parents and teachers from three high schools to participate in in-depth interviews (IDIs) and focus-group discussions (FGDs). Parents and teachers were asked about their general knowledge of STIs and sexual health education. In addition, they were asked whether they would support utilizing outreach to schools to facilitate provision of sexual health education and screening for STIs in adolescent girls. Data were audio-recorded, transcribed, and translated into English. An initial coding matrix was developed and refined throughout the coding process. Transcripts were coded by two researchers and analyzed using the content analysis approach. We conducted 10 IDIs (5 parents and 5 teachers) and 4 FGDs (2 with parents, 2 with teachers, total of 26 participants). Most parents reported few or no discussions regarding STIs with their adolescent girls. Parents were more comfortable discussing consequences of sexual activity including loss of virginity and the potential for pregnancy. Parents tended to place responsibility for sexual health education with teachers. The teachers, in turn, provided basic sexual and reproductive health education including puberty, abstinence, and overview of STIs. Both parents and teachers found the idea of screening for STIs in adolescent girls to be acceptable, and were comfortable with research staff contacting girls through informational meetings at schools. Parents felt that adolescents' STI screening results should be shared with their parents. 
In this African setting, parents and teachers provide limited sexual health education, with a focus on negative consequences including loss of virginity, pregnancy, and risk for STIs. Nonetheless, both parents and teachers were supportive of STI screening for adolescent girls, beginning with school-based informational meetings for the girls. Research and programs that aim to provide STI screening in this setting must offer treatment and address the issue of whether results will be disclosed to parents.
Li, Linxin
2016-01-01
Objectives To determine the accuracy of coding of admissions for stroke on weekdays versus weekends and any impact on apparent outcome. Design Prospective population based stroke incidence study and a scoping review of previous studies of weekend effects in stroke. Setting Primary and secondary care of all individuals registered with nine general practices in Oxfordshire, United Kingdom (OXVASC, the Oxford Vascular Study). Participants All patients with clinically confirmed acute stroke in OXVASC identified with multiple overlapping methods of ascertainment in 2002-14 versus all acute stroke admissions identified by hospital diagnostic and mortality coding alone during the same period. Main outcome measures Accuracy of administrative coding data for all patients with confirmed stroke admitted to hospital in OXVASC. Difference between rates of “false positive” or “false negative” coding for weekday and weekend admissions. Impact of inaccurate coding on apparent case fatality at 30 days in weekday versus weekend admissions. Weekend effects on outcomes in patients with confirmed stroke admitted to hospital in OXVASC and impacts of other potential biases compared with those in the scoping review. Results Among the study population of 92 728, 2373 episodes of acute stroke were ascertained in OXVASC, of which 826 (34.8%) mainly minor events were managed without hospital admission, 60 (2.5%) occurred out of the area or abroad, and 195 (8.2%) occurred in hospital during an admission for a different reason. Of 1292 local hospital admissions for acute stroke, 973 (75.3%) were correctly identified by administrative coding. There was no bias in the distribution of weekend versus weekday admissions among the 319 strokes missed by coding. Of 1693 admissions for stroke identified by coding, 1055 (62.3%) were confirmed to be acute strokes after case adjudication. 
Among the 638 false positive coded cases, patients were more likely to be admitted on weekdays than at weekends (536 (41.0%) v 102 (26.5%); P<0.001), partly because of weekday elective admissions after previous stroke being miscoded as new stroke episodes (267 (49.8%) v 26 (25.5%); P<0.001). The 30 day case fatality after these elective admissions was lower than after confirmed acute stroke admissions (11 (3.8%) v 233 (22.1%); P<0.001). Consequently, relative 30 day case fatality for weekend versus weekday admissions differed (P<0.001) between correctly coded acute stroke admissions and false positive coding cases. Results were consistent when only the 1327 emergency cases identified by “admission method” from coding were included, with more false positive cases with low case fatality (35 (14.7%)) being included for weekday versus weekend admissions (190 (19.5%) v 48 (13.7%), P<0.02). Among all acute stroke admissions in OXVASC, there was no imbalance in baseline stroke severity for weekends versus weekdays and no difference in case fatality at 30 days (adjusted odds ratio 0.85, 95% confidence interval 0.63 to 1.15; P=0.30) or any adverse “weekend effect” on modified Rankin score at 30 days (0.78, 0.61 to 0.99; P=0.04) or one year (0.76, 0.59 to 0.98; P=0.03) among incident strokes. Conclusion Retrospective studies of UK administrative hospital coding data to determine “weekend effects” on outcome in acute medical conditions, such as stroke, can be undermined by inaccurate coding, which can introduce biases that cannot be reliably dealt with by adjustment for case mix. PMID:27185754
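Recast in screening terms, the counts above give the sensitivity and positive predictive value of the administrative stroke codes; this framing and the quick recomputation below are ours, not the authors':

```python
# Recompute the coding-accuracy figures from the counts in the abstract.
confirmed_admissions = 1292   # adjudicated acute stroke admissions
found_by_coding = 973         # of these, identified by administrative coding
coded_admissions = 1693       # admissions carrying a stroke code
true_among_coded = 1055       # of these, confirmed as acute stroke

sensitivity = 100 * found_by_coding / confirmed_admissions
ppv = 100 * true_among_coded / coded_admissions
print(round(sensitivity, 1), round(ppv, 1))  # 75.3 62.3
```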
Gene and genon concept: coding versus regulation
2007-01-01
We analyse here the definition of the gene in order to distinguish, on the basis of modern insight in molecular biology, what the gene is coding for, namely a specific polypeptide, and how its expression is realized and controlled. Before the coding role of the DNA was discovered, a gene was identified with a specific phenotypic trait, from Mendel through Morgan up to Benzer. Subsequently, however, molecular biologists ventured to define a gene at the level of the DNA sequence in terms of coding. As is becoming ever more evident, the relations between information stored at DNA level and functional products are very intricate, and the regulatory aspects are as important and essential as the information coding for products. This approach led, thus, to a conceptual hybrid that confused coding, regulation and functional aspects. In this essay, we develop a definition of the gene that once again starts from the functional aspect. A cellular function can be represented by a polypeptide or an RNA. In the case of the polypeptide, its biochemical identity is determined by the mRNA prior to translation, and that is where we locate the gene. The steps from specific, but possibly separated sequence fragments at DNA level to that final mRNA then can be analysed in terms of regulation. For that purpose, we coin the new term “genon”. In that manner, we can clearly separate product and regulative information while keeping the fundamental relation between coding and function without the need to introduce a conceptual hybrid. In mRNA, the program regulating the expression of a gene is superimposed onto and added to the coding sequence in cis - we call it the genon. The complementary external control of a given mRNA by trans-acting factors is incorporated in its transgenon. A consequence of this definition is that, in eukaryotes, the gene is, in most cases, not yet present at DNA level. 
Rather, it is assembled by RNA processing, including differential splicing, from various pieces, as steered by the genon. It emerges finally as an uninterrupted nucleic acid sequence at mRNA level just prior to translation, in faithful correspondence with the amino acid sequence to be produced as a polypeptide. After translation, the genon has fulfilled its role and expires. The distinction between the protein coding information as materialised in the final polypeptide and the processing information represented by the genon allows us to set up a new information theoretic scheme. The standard sequence information determined by the genetic code expresses the relation between coding sequence and product. Backward analysis asks from which coding region in the DNA a given polypeptide originates. The (more interesting) forward analysis asks in how many polypeptides of how many different types a given DNA segment is expressed. This concerns the control of the expression process for which we have introduced the genon concept. Thus, the information theoretic analysis can capture the complementary aspects of coding and regulation, of gene and genon. PMID:18087760
DOE Office of Scientific and Technical Information (OSTI.GOV)
Glodzik, Dominik; Morganella, Sandro; Davies, Helen
Somatic rearrangements contribute to the mutagenized landscape of cancer genomes. Here, we systematically interrogated rearrangements in 560 breast cancers by using a piecewise constant fitting approach. We identified 33 hotspots of large (>100 kb) tandem duplications, a mutational signature associated with homologous-recombination-repair deficiency. Notably, these tandem-duplication hotspots were enriched in breast cancer germline susceptibility loci (odds ratio (OR) = 4.28) and breast-specific 'super-enhancer' regulatory elements (OR = 3.54). These hotspots may be sites of selective susceptibility to double-strand-break damage due to high transcriptional activity or, through incrementally increasing copy number, may be sites of secondary selective pressure. Furthermore, the transcriptomic consequences ranged from strong individual oncogene effects to weak but quantifiable multigene expression effects. We thus present a somatic-rearrangement mutational process affecting coding sequences and noncoding regulatory elements and contributing a continuum of driver consequences, from modest to strong effects, thereby supporting a polygenic model of cancer development.
2017-01-23
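The hotspot detection described above rests on segmenting rearrangement-breakpoint density along the genome. The toy sketch below is not the authors' method: the sliding density-window heuristic and every parameter in it are invented stand-ins for the piecewise constant fitting of inter-rearrangement distances that the abstract names.

```python
# Hypothetical sketch: flag "hotspots" as runs of breakpoints whose local
# density greatly exceeds the genome-wide background density.
def hotspot_segments(positions, window=5, fold=10.0):
    positions = sorted(positions)
    span = positions[-1] - positions[0]
    background = len(positions) / span          # breakpoints per bp overall
    hot = []
    for i in range(len(positions) - window + 1):
        width = positions[i + window - 1] - positions[i]
        # A window of `window` breakpoints packed into a small width has
        # high local density; compare it against fold * background.
        if width > 0 and (window / width) > fold * background:
            hot.append((positions[i], positions[i + window - 1]))
    return hot

# Toy example: a dense cluster near position 1e6 on a sparse background.
pts = [100_000 * k for k in range(1, 50)] + [1_000_000 + 200 * k for k in range(10)]
print(len(hotspot_segments(pts)) > 0)  # True: the cluster is flagged
```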
Case, Kathleen; Crook, Brittani; Lazard, Allison; Mackert, Michael
2016-01-01
Objective This formative study examined perceptions of e-cigarettes in college students with the goal of informing future health communication campaigns. Differences between e-cigarette users and nonusers were also examined. Participants: Thirty undergraduate students were recruited from a large southwestern public university (15 users, 15 nonusers). Methods Structured interviews were conducted and transcripts were coded for themes. Results Although users had more favorable attitudes toward e-cigarettes, both users and nonusers believed that e-cigarettes produce water vapor and reported that e-cigarettes were less harmful than conventional cigarettes. Potential health consequences and addiction concerns were the most common perceived threats for both users and nonusers. Both nonusers and users cited social stigma as a perceived disadvantage of e-cigarette use. Conclusions Ultimately, themes with particular relevance to future health communication campaigns included negative perceptions of e-cigarette users and social stigma, as well as harm perceptions and potential health consequences associated with e-cigarette use. PMID:26979833
Konopka, Tomasz; Skupień, Elzbieta
2008-01-01
In the opinion of some forensic medicine experts, assessment of potential consequences in keeping with Article 160 of the Polish Penal Code, which refers to the crime of "exposure to direct danger of death or severe health damage", lies within the competence of medicolegal specialists. This view is accepted by courts and prosecution offices. However, the knowledge of physicians in the field of predicting consequences which did not occur is only somewhat better than that of lawyers. In simple cases, e.g. in trauma involving a sensitive area of the body, passing an opinion confirming a serious danger is not associated with any major problems. Similarly, no problems arise when passing an opinion on the lack of such a danger, e.g. in the case of trauma without any injuries. In complex cases, however, which include the majority of medical error cases, passing an opinion on exposure to direct danger of death or severe health damage may not be feasible.
Survival and quality of life in burns.
Königová, R
2010-07-01
Advances in medical technology and practice have been associated with increasing medical specialization, but they have come at a price. This price includes not only enormous financial costs but also dehumanized patient care, diminished confidence in medical staff and, consequently, human suffering. Burn injuries are catastrophic in scope and require specialized, intensive and prolonged treatment, from which ensue ethical and psychological problems, often complicated by many individual factors. Some of these arise from the Code of Patients' Rights, not only in the Czech Republic, and contribute to DNR (do-not-resuscitate) decisions. Not only "quantity" of life but also "quality" of life should be considered, particularly in burns. A critical factor is age: in elderly patients, more sophisticated medical knowledge and practice may actually contribute to suffering. At any age, scarring represents a special type of disfigurement. The "burn image" is more likely to evoke public avoidance than sympathy. The non-handicapped, by their negative attitudes, help create and perpetuate the handicap and the consequent burden of suffering in burn patients.
Case, Kathleen; Crook, Brittani; Lazard, Allison; Mackert, Michael
2016-07-01
This formative study examined perceptions of e-cigarettes in college students with the goal of informing future health communication campaigns. Differences between e-cigarette users and nonusers were also examined. Thirty undergraduate students were recruited from a large southwestern public university (15 users, 15 nonusers). Structured interviews were conducted and transcripts were coded for themes. Although users had more favorable attitudes toward e-cigarettes, both users and nonusers believed that e-cigarettes produce water vapor and reported that e-cigarettes were less harmful than conventional cigarettes. Potential health consequences and addiction concerns were the most common perceived threats for both users and nonusers. Both nonusers and users cited social stigma as a perceived disadvantage of e-cigarette use. Ultimately, themes with particular relevance to future health communication campaigns included negative perceptions of e-cigarette users and social stigma, as well as harm perceptions and potential health consequences associated with e-cigarette use.
Improved Design of Beam Tunnel for 42 GHz Gyrotron
NASA Astrophysics Data System (ADS)
Singh, Udaybir; Kumar, Nitin; Purohit, L. P.; Sinha, A. K.
2011-04-01
In a gyrotron, there is a chance of the generation and excitation of unwanted RF modes (parasitic oscillations). These modes may interact with the electron beam and consequently degrade the beam quality. This paper presents an improved design of the beam tunnel to reduce parasitic oscillations, together with the effect of the beam tunnel geometry on the electron beam parameters. The design optimization of the beam tunnel has been carried out with the 3-D simulation software CST Microwave Studio, and the effect of the beam tunnel geometry on the electron beam parameters has been analyzed with the EGUN code.
Tanaka, Kenichi; Sakurai, Yoshinori; Endo, Satoru; Takada, Jun
2014-06-01
In order to measure the spatial distributions of neutrons and gamma rays separately using the imaging plate, the requirements for a converter to enhance a specific component were investigated with the PHITS code. Enhancing fast neutrons using recoil protons from epoxy resin proved ineffective owing to the imaging plate's high sensitivity to gamma rays. However, a converter of epoxy resin doped with (10)B was found to have potential for thermal and epithermal neutrons, and graphite for gamma rays.
NASA Astrophysics Data System (ADS)
Tuninetti, V.; Yuan, S.; Gilles, G.; Guzmán, C. F.; Habraken, A. M.; Duchêne, L.
2016-08-01
This paper presents different extensions of the classical GTN damage model implemented in a finite element code. The goal of this study is to assess these extensions for the numerical prediction of failure of a DC01 steel sheet during a single point incremental forming process, after a proper identification of the material parameters. It is shown that failure is predicted too early compared with experimental results. However, the use of the Thomason criterion made it possible to delay the onset of coalescence and, consequently, the final failure.
Use of bar coding technology to flag ER patients on metformin-containing drugs.
Lipcamon, James D; Miller, Pam; Kaiser, Tom; Campbell, Bonnie; Freemen, Amanda
2009-01-01
Sixty percent of Jennie Edmundson Hospital's inpatients are admitted through the emergency room. Type II diabetes accounts for 90-95% of all diagnosed cases of diabetes, and there were about 1.6 million new cases of diabetes diagnosed in people 20 years or older in 2007. Consequently, we should expect to see an increase in the number of Americans on metformin-containing drugs in the future. Jennie Edmundson Hospital's goal was to develop a hardwired process to identify patients who were taking metformin, had a CT scan with contrast in the ER, and were then admitted as inpatients.
A visual programming environment for the Navier-Stokes computer
NASA Technical Reports Server (NTRS)
Tomboulian, Sherryl; Crockett, Thomas W.; Middleton, David
1988-01-01
The Navier-Stokes computer is a high-performance, reconfigurable, pipelined machine designed to solve large computational fluid dynamics problems. Due to the complexity of the architecture, development of effective, high-level language compilers for the system appears to be a very difficult task. Consequently, a visual programming methodology has been developed which allows users to program the system at an architectural level by constructing diagrams of the pipeline configuration. These schematic program representations can then be checked for validity and automatically translated into machine code. The visual environment is illustrated by using a prototype graphical editor to program an example problem.
Substance abuse among nurses--defining the issue.
Dunn, Debra
2005-10-01
The prevalence of substance abuse in the nurse population is believed to parallel that in the general population (i.e. approximately 10%). Nurses with substance abuse problems need help. They are in danger of harming patients, the facility's reputation, the nursing profession, and themselves. The consequences of not reporting concerns can be far worse than those of reporting the issue. Part one of this two-part series discusses how society views addiction and the nursing profession, signs and symptoms of substance abuse, reasons nurses should report an impaired colleague, the code of silence that exists among nurses, and board of nursing jurisdiction.
Windeler, Jürgen; Lange, Stefan
2015-03-01
The term benefit describes the (positive) causal, patient-relevant consequences of medical interventions, whether diagnostic or therapeutic. Benefit assessments form the basis of rational decision-making within a health care system. They are based on clinical trials that are able to provide valid answers to the question regarding the relevant benefit or harm that can be caused by an intervention. In Germany, evidence-based benefit assessments are fixed by law, i.e., the Social Code Book V. The application and the practical impact of these assessments could be improved.
Benefits of emotional integration and costs of emotional distancing.
Roth, Guy; Shahar, Bat-Hen; Zohar-Shefer, Yael; Benita, Moti; Moed, Anat; Bibi, Uri; Kanat-Maymon, Yaniv; Ryan, Richard M
2017-12-09
Three studies explored the consequences of the self-determination theory conception of integrative emotion regulation (IER; Ryan & Deci, 2017), which involves an interested stance toward emotions. Emotional, physiological, and cognitive consequences of IER were compared to the consequences of emotional distancing (ED), in relation to a fear-eliciting film. In Study 1, we manipulated emotion regulation by prompting students' (N = 90) IER and ED and also included a control group. Then we tested groups' defensive versus nondefensive emotional processing, coded from post-film written texts. Study 2 (N = 90) and Study 3 (N = 135) used the same emotion regulation manipulations but exposed participants to the fear-eliciting film twice, 72 hr apart, to examine each style's protection from adverse emotional, physiological, and cognitive costs at second exposure. Participants who had been prompted to practice IER were expected to benefit more than participants in the ED and control groups at second exposure, as manifested in lower arousal and better cognitive capacity. Overall, results supported our hypotheses. The current studies provide some support for the assumption that in comparison to ED, taking interest in and accepting one's negative emotions are linked with less defensive processing of negative experiences and with better functioning.
A Sequential Fluid-mechanic Chemical-kinetic Model of Propane HCCI Combustion
DOE Office of Scientific and Technical Information (OSTI.GOV)
Aceves, S M; Flowers, D L; Martinez-Frias, J
2000-11-29
We have developed a methodology for predicting combustion and emissions in a Homogeneous Charge Compression Ignition (HCCI) Engine. This methodology combines a detailed fluid mechanics code with a detailed chemical kinetics code. Instead of directly linking the two codes, which would require an extremely long computational time, the methodology consists of first running the fluid mechanics code to obtain temperature profiles as a function of time. These temperature profiles are then used as input to a multi-zone chemical kinetics code. The advantage of this procedure is that a small number of zones (10) is enough to obtain accurate results. This procedure achieves the benefits of linking the fluid mechanics and the chemical kinetics codes with a great reduction in the computational effort, to a level that can be handled with current computers. The success of this procedure is in large part a consequence of the fact that for much of the compression stroke the chemistry is inactive and thus has little influence on fluid mechanics and heat transfer. Then, when chemistry is active, combustion is rather sudden, leaving little time for interaction between chemistry and fluid mixing and heat transfer. This sequential methodology has been capable of explaining the main characteristics of HCCI combustion that have been observed in experiments. In this paper, we use our model to explore an HCCI engine running on propane. The paper compares experimental and numerical pressure traces, heat release rates, and hydrocarbon and carbon monoxide emissions. The results show an excellent agreement, even in parameters that are difficult to predict, such as chemical heat release rates. Carbon monoxide emissions are reasonably well predicted, even though it is intrinsically difficult to make good predictions of CO emissions in HCCI engines. The paper includes a sensitivity study on the effect of the heat transfer correlation on the results of the analysis. 
Importantly, the paper also presents a numerical study of how parameters such as swirl rate, crevices and ceramic walls could help reduce HC and CO emissions from HCCI engines.
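The zoning step of the sequential methodology can be illustrated with a short sketch: cells of a CFD temperature field are binned into a handful of temperature zones, and each zone (with its mass and mass-averaged temperature) would then be advanced by the chemical kinetics code. This is an illustrative stand-in, not the authors' code; the function name and the simple equal-width binning are assumptions.

```python
def make_zones(cell_temps, cell_masses, n_zones=10):
    """Group CFD cells into n_zones equal-width temperature bins;
    return (mass, mass-averaged temperature) per non-empty zone."""
    t_min, t_max = min(cell_temps), max(cell_temps)
    width = (t_max - t_min) / n_zones or 1.0  # guard: all cells equal
    zones = [{"mass": 0.0, "t_sum": 0.0} for _ in range(n_zones)]
    for T, m in zip(cell_temps, cell_masses):
        i = min(int((T - t_min) / width), n_zones - 1)
        zones[i]["mass"] += m
        zones[i]["t_sum"] += m * T
    return [(z["mass"], z["t_sum"] / z["mass"])
            for z in zones if z["mass"] > 0]
```

Each returned tuple would seed one zone of the multi-zone kinetics calculation; total mass is conserved by construction.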
2013-01-01
Background Estimates of the prevalence of irritable bowel syndrome (IBS) vary widely, and a large proportion of patients report having consulted their general practitioner (GP). In patients with new onset gastrointestinal symptoms in primary care it might be possible to predict those at risk of persistent symptoms. However, one of the difficulties is identifying patients within primary care. GPs use a variety of Read Codes to describe patients presenting with IBS. Furthermore, in a qualitative study, exploring GPs’ attitudes and approaches to defining patients with IBS, GPs appeared reluctant to add the IBS Read Code to the patient record until more serious conditions were ruled out. Consequently, symptom codes such as 'abdominal pain’, 'diarrhoea’ or 'constipation’ are used. The aim of the current study was to investigate the prevalence of recorded consultations for IBS and to explore the symptom profile of patients with IBS using data from the Salford Integrated Record (SIR). Methods This was a database study using the SIR, a local patient sharing record system integrating primary, community and secondary care information. Records were obtained for a cohort of patients with gastrointestinal disorders from January 2002 to December 2011. Prevalence rates, symptom recording, medication prescribing and referral patterns were compared for three patient groups (IBS, abdominal pain (AP) and Inflammatory Bowel Disease (IBD)). Results The prevalence of IBS (age standardised rate: 616 per year per 100,000 population) was much lower than expected compared with that reported in the literature. The majority of patients (69%) had no gastrointestinal symptoms recorded in the year prior to their IBS. However a proportion of these (22%) were likely to have been prescribed NICE guideline recommended medications for IBS in that year. The findings for AP and IBD were similar. 
Conclusions Using Read Codes to identify patients with IBS may lead to a large underestimate of the community prevalence. The IBS diagnostic Read Code was rarely applied in practice. There are similarities with many other medically unexplained symptoms which are typically difficult to diagnose in clinical practice. PMID:24295337
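The headline figure above is an age-standardised rate (616 per 100,000 per year). For readers unfamiliar with the calculation, direct standardisation weights each age group's crude rate by a reference population's age distribution; the sketch below uses hypothetical counts and weights, not the SIR data.

```python
def age_standardised_rate(cases, person_years, std_weights):
    """Directly standardised rate per 100,000 person-years:
    weighted sum of age-specific rates, weights from a reference
    population (must sum to 1)."""
    assert abs(sum(std_weights) - 1.0) < 1e-9
    rate = 0.0
    for c, py, w in zip(cases, person_years, std_weights):
        rate += w * (c / py)  # age-specific crude rate, weighted
    return rate * 100_000
```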
Harkness, Elaine F; Grant, Laura; O'Brien, Sarah J; Chew-Graham, Carolyn A; Thompson, David G
2013-12-02
Estimates of the prevalence of irritable bowel syndrome (IBS) vary widely, and a large proportion of patients report having consulted their general practitioner (GP). In patients with new onset gastrointestinal symptoms in primary care it might be possible to predict those at risk of persistent symptoms. However, one of the difficulties is identifying patients within primary care. GPs use a variety of Read Codes to describe patients presenting with IBS. Furthermore, in a qualitative study, exploring GPs' attitudes and approaches to defining patients with IBS, GPs appeared reluctant to add the IBS Read Code to the patient record until more serious conditions were ruled out. Consequently, symptom codes such as 'abdominal pain', 'diarrhoea' or 'constipation' are used. The aim of the current study was to investigate the prevalence of recorded consultations for IBS and to explore the symptom profile of patients with IBS using data from the Salford Integrated Record (SIR). This was a database study using the SIR, a local patient sharing record system integrating primary, community and secondary care information. Records were obtained for a cohort of patients with gastrointestinal disorders from January 2002 to December 2011. Prevalence rates, symptom recording, medication prescribing and referral patterns were compared for three patient groups (IBS, abdominal pain (AP) and Inflammatory Bowel Disease (IBD)). The prevalence of IBS (age standardised rate: 616 per year per 100,000 population) was much lower than expected compared with that reported in the literature. The majority of patients (69%) had no gastrointestinal symptoms recorded in the year prior to their IBS. However a proportion of these (22%) were likely to have been prescribed NICE guideline recommended medications for IBS in that year. The findings for AP and IBD were similar. Using Read Codes to identify patients with IBS may lead to a large underestimate of the community prevalence. 
The IBS diagnostic Read Code was rarely applied in practice. There are similarities with many other medically unexplained symptoms which are typically difficult to diagnose in clinical practice.
Exposure of a liquefied gas container to an external fire.
Raj, Phani K
2005-06-30
In liquefied-gas bulk-storage facilities and plants, the separation distances between storage tanks, and between a tank and a line of adjoining property that can be built upon, are governed by local regulations and/or codes (e.g. National Fire Protection Association (NFPA) 58, 2004). Separation distance requirements have been in the NFPA 58 Code for over 60 years; however, no scientific foundations (either theoretical or experimental) are available for the specified distances. Even though the liquefied petroleum gas (LPG) industry has operated safely over the years, there is a question as to whether the code-specified distances provide sufficient safety to LPG-storage tanks when they are exposed to large external fires. A radiation heat-transfer-based model is presented in this paper. The temporal variation of the vapor-wetted tank-wall temperature is calculated when the tank is exposed to thermal radiation from an external, non-impinging, large, 30.5 m (100 ft) diameter, highly radiative, hydrocarbon fuel (pool) fire located at a specified distance. The structural steel wall of a pressurized liquefied gas container (such as an ASME LP-Gas tank) begins to lose its strength when the wall temperature approaches a critical temperature, 810 K (1000 degrees F). LP-Gas tank walls approaching this temperature are a cause for major concern because of the increased potential for tank failure, which could result in catastrophic consequences. Results from the model for exposure of different-size ASME (LP-Gas) containers to a hydrocarbon pool fire of 30.5 m (100 ft) in diameter, located with its base edge at the separation distances specified by NFPA 58 [NFPA 58, Liquefied Petroleum Gas Code, Table 6.3.1, 2004 ed., National Fire Protection Association, Quincy, MA, 2004], indicate that the vapor-wetted wall temperature of the containers never reaches the critical temperature under common wind conditions (0, 5 and 10 m/s), with the flame tilting towards the tank.
This indicates that the separation distances specified in the code are adequate for non-impinging fires. The model can be used to test the efficacy of other similar codes and regulations for other materials.
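The wall-temperature calculation described above can be caricatured with a lumped-capacitance energy balance: incident fire radiation is absorbed by the wall, which loses heat by reradiation and convection. All parameter values below (flux, emissivity, convection coefficient, steel properties) are illustrative assumptions, not the paper's or NFPA's numbers.

```python
import math

SIGMA = 5.67e-8  # Stefan-Boltzmann constant, W/(m^2 K^4)

def wall_temperature(q_fire, t_end, dt=1.0, T0=300.0,
                     eps=0.9, h=10.0, T_amb=300.0,
                     rho=7850.0, cp=470.0, thickness=0.02):
    """Euler-integrate dT/dt = (absorbed - reradiated - convected)
    / (rho * cp * thickness) for a thin vapor-wetted wall element.
    q_fire is the incident radiative flux in W/m^2."""
    T = T0
    for _ in range(int(t_end / dt)):
        q_net = eps * q_fire - eps * SIGMA * T**4 - h * (T - T_amb)
        T += dt * q_net / (rho * cp * thickness)
    return T
```

With a moderate incident flux the wall climbs toward an equilibrium well below its starting rate of rise, which is the qualitative behaviour the paper's far more detailed model quantifies against the 810 K criterion.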
Setiawan, B B
2002-01-01
The settlement along the bank of the Code River in Yogyakarta, Indonesia provides housing for a large mass of the city's poor. Its strategic location and the fact that most urban poor do not have access to land, attracts people to "illegally" settle along the bank of the river. This brings negative consequences for the environment, particularly the increasing domestic waste along the river and the annual flooding in the rainy season. While the public controversies regarding the existence of the settlement along the Code River were still not resolved, at the end of the 1980s, a group of architects, academics and community members proposed the idea of constructing a dike along the River as part of a broader settlement improvement program. From 1991 to 1998, thousands of local people mobilized their resources and were able to construct 6,000 metres of riverside dike along the Code River. The construction of the riverside dike along the River has become an important "stimulant" that generated not only settlement improvement, but also a better treatment of river water. As all housing units located along the River are now facing the River, the River itself is considered the "front-yard". Before the dike was constructed, the inhabitants used to treat the River as the "backyard" and therefore just throw waste into the River. They now really want to have a cleaner river, since the River is an important part of their settlement. The settlement along the Code River presents a complex range of persistent problems with informal settlements in Indonesia; such problems are related to the issues of how to provide more affordable and adequate housing for the poor, while at the same time, to improve the water quality of the river. The project represents a good case, which shows that through a mutual partnership among stakeholders, it is possible to integrate environmental goals into urban redevelopment schemes.
De Vito, David; Al-Aidroos, Naseem; Fenske, Mark J
2017-05-01
Stimuli appearing as visual distractors subsequently receive more negative affective evaluations than novel items or prior targets of attention. Leading accounts question whether this distractor devaluation effect occurs through evaluative codes that become associated with distractors as a mere artefact of attention-task instructions, or through affective consequences of attentional inhibition when applied to prevent distractor interference. Here we test opposing predictions arising from the evaluative-coding and devaluation-by-inhibition hypotheses using an electrophysiological marker of attentional inhibition in a task that requires participants to avoid interference from abstract-shape distractors presented while maintaining a uniquely-colored stimulus in memory. Consistent with prior research, distractors that matched the colour of the stimulus being held in memory elicited a Pd component of the event-related potential waveform, indicating that their processing was being actively suppressed. Subsequent affective evaluations revealed that memory-matching distractors also received more negative ratings than non-matching distractors or previously-unseen shapes. Moreover, Pd magnitude was greater on trials in which the memory-matching distractors were later rated negatively than on trials preceding positive ratings. These results support the devaluation-by-inhibition hypothesis and strongly suggest that fluctuations in stimulus inhibition are closely associated with subsequent affective evaluations. In contrast, none of the evaluative-coding based predictions were confirmed. Copyright © 2017 Elsevier Ltd. All rights reserved.
A Parallel Numerical Micromagnetic Code Using FEniCS
NASA Astrophysics Data System (ADS)
Nagy, L.; Williams, W.; Mitchell, L.
2013-12-01
Many problems in the geosciences depend on understanding the ability of magnetic minerals to provide stable paleomagnetic recordings. Numerical micromagnetic modelling allows us to calculate the domain structures found in naturally occurring magnetic materials. However, the computational cost rises steeply with the size and complexity of the geometries that we wish to model. This problem is compounded by the fact that modern processor design no longer focuses on the speed at which calculations are performed, but rather on the number of computational units amongst which we may distribute our calculations. Consequently, to better exploit modern computational resources, our micromagnetic simulations must "go parallel". We present a parallel and scalable micromagnetics code written using FEniCS. FEniCS is a multinational collaboration involving several institutions (University of Cambridge, University of Chicago, The Simula Research Laboratory, etc.) that aims to provide a set of tools for writing scientific software, in particular software that employs the finite element method. The advantages of this approach are the leveraging of pre-existing projects from the world of scientific computing (PETSc, Trilinos, Metis/Parmetis, etc.) and exposing these so that researchers may pose problems in a manner closer to the mathematical language of their domain. Our code provides a scriptable interface (in Python) that allows users not only to run micromagnetic models in parallel, but also to perform pre/post processing of data.
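At the heart of any micromagnetic solver is minimisation of a magnetic energy functional. As a much-simplified stand-in for the finite-element code described above, the sketch below does gradient descent on the classic single-domain (Stoner-Wohlfarth-style) energy E(θ) = K sin²θ − μ₀MₛH cos(θ − θ_H); the function name and parameter values are assumptions for illustration.

```python
import math

def equilibrium_angle(K=1.0, mu0MsH=0.2, theta_H=math.pi, theta0=0.1,
                      lr=0.01, steps=20000):
    """Gradient descent on the single-domain energy
    E(theta) = K sin^2(theta) - mu0MsH cos(theta - theta_H);
    returns the equilibrium magnetisation angle."""
    theta = theta0
    for _ in range(steps):
        dE = (2 * K * math.sin(theta) * math.cos(theta)
              + mu0MsH * math.sin(theta - theta_H))
        theta -= lr * dE  # descend the energy gradient
    return theta
```

A weak antiparallel field leaves the moment in its metastable easy-axis state (θ ≈ 0), while a field above the switching threshold drives it to θ ≈ π; full micromagnetics repeats this kind of minimisation over millions of coupled finite elements, which is why parallelism matters.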
NASA Technical Reports Server (NTRS)
Plante, Ianik; Cucinotta, Francis A.
2011-01-01
Radiolytic species are formed approximately 1 ps after the passage of ionizing radiation through matter. After their formation, they diffuse and chemically react with other radiolytic species and neighboring biological molecules, leading to various kinds of oxidative damage. Therefore, the simulation of radiation chemistry is of considerable importance to understand how radiolytic species damage biological molecules [1]. The step-by-step simulation of chemical reactions is difficult, because the radiolytic species are distributed non-homogeneously in the medium. Consequently, computational approaches based on Green functions for diffusion-influenced reactions should be used [2]. Recently, Green functions for more complex types of reactions have been published [3-4]. We have developed exact random variate generators for these Green functions [5], which will allow us to use them in radiation chemistry codes. Moreover, simulating chemistry using the Green functions is computationally very demanding, because the probabilities of reactions between each pair of particles must be evaluated at each timestep [2]. This kind of problem is well suited to General-Purpose Graphics Processing Units (GPGPUs), which can handle a large number of similar calculations simultaneously. These new developments will allow us to include more complex reactions in chemistry codes and to reduce the calculation time. This code should be important for linking radiation track-structure simulations and DNA damage models.
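The simplest Green-function building block of such codes is the survival/reaction probability of an isolated diffusing pair with a fully absorbing (Smoluchowski) boundary, which can be written in closed form. The sketch below is that textbook expression, offered as an illustration of what the cited generators sample from; the function name is an assumption.

```python
import math

def reaction_probability(r0, R, D, t):
    """Probability that a pair at initial separation r0 has reacted
    by time t, for encounter radius R and relative diffusion
    coefficient D (irreversible, fully absorbing boundary):
        P = (R / r0) * erfc((r0 - R) / sqrt(4 D t))."""
    if r0 <= R:
        return 1.0  # already in contact
    return (R / r0) * math.erfc((r0 - R) / math.sqrt(4.0 * D * t))
```

As t grows the probability approaches the geometric limit R/r0, the fraction of trajectories that ever reach the encounter radius; evaluating such expressions for every particle pair at every timestep is the per-pair cost that motivates the GPGPU implementation.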
Mensi, Skander; Hagens, Olivier; Gerstner, Wulfram; Pozzorini, Christian
2016-02-01
The way in which single neurons transform input into output spike trains has fundamental consequences for network coding. Theories and modeling studies based on standard Integrate-and-Fire models implicitly assume that, in response to increasingly strong inputs, neurons modify their coding strategy by progressively reducing their selective sensitivity to rapid input fluctuations. Combining mathematical modeling with in vitro experiments, we demonstrate that, in L5 pyramidal neurons, the firing threshold dynamics adaptively adjust the effective timescale of somatic integration in order to preserve sensitivity to rapid signals over a broad range of input statistics. To capture this behavior, a new Generalized Integrate-and-Fire model featuring nonlinear firing-threshold dynamics and conductance-based adaptation is introduced that outperforms state-of-the-art neuron models in predicting the spiking activity of neurons responding to a variety of in vivo-like fluctuating currents. Our model allows for efficient parameter extraction and can be analytically mapped to a Generalized Linear Model in which both the input filter (describing somatic integration) and the spike-history filter (accounting for spike-frequency adaptation) dynamically adapt to the input statistics, as experimentally observed. Overall, our results provide new insights into the computational role of different biophysical processes known to underlie adaptive coding in single neurons and support previous theoretical findings indicating that the nonlinear dynamics of the firing threshold due to Na+-channel inactivation regulate the sensitivity to rapid input fluctuations.
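The core mechanism, a firing threshold that jumps at each spike and relaxes between spikes, is easy to sketch. The toy model below is a minimal leaky integrate-and-fire neuron with a moving threshold, not the fitted GIF model of the paper; all parameter values and the simple Euler scheme are assumptions.

```python
def simulate(I, dt=0.1, tau_m=20.0, tau_th=50.0,
             v_rest=0.0, v_reset=0.0, th0=1.0, th_jump=0.5):
    """Euler-integrate a leaky membrane and an adaptive firing
    threshold; I is the input at each timestep. Returns spike times."""
    v, th = v_rest, th0
    spikes = []
    for k, i_in in enumerate(I):
        v += dt * (-(v - v_rest) + i_in) / tau_m
        th += dt * (th0 - th) / tau_th      # threshold relaxes to th0
        if v >= th:
            spikes.append(k * dt)
            v = v_reset
            th += th_jump                   # each spike raises the threshold
    return spikes
```

Under constant drive the inter-spike intervals lengthen from spike to spike, the spike-frequency adaptation that the paper's spike-history filter captures in a principled, fitted form.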
Neutron production by cosmic-ray muons in various materials
DOE Office of Scientific and Technical Information (OSTI.GOV)
Manukovsky, K. V.; Ryazhskaya, O. G.; Sobolevsky, N. M.
The results obtained by studying the background of neutrons produced by cosmic-ray muons in underground experimental facilities intended for rare-event searches, and in the surrounding rock, are presented. Rock types considered include granite, sedimentary rock, gypsum, and rock salt. Neutron production and transfer were simulated using the Geant4 and SHIELD transport codes. These codes were tuned via a comparison of the results of calculations with experimental data, in particular with data from the Artemovsk research station of the Institute for Nuclear Research (INR, Moscow, Russia), as well as via an intercomparison of results of calculations with the Geant4 and SHIELD codes. It turns out that the atomic-number dependence of the production and yield of neutrons has an irregular character and does not allow a description in terms of a universal function of the atomic number. The parameters of this dependence are different for two groups of nuclei: nuclei consisting of alpha particles and all of the remaining nuclei. Moreover, there are clear exceptions to a power-law dependence, for example argon. This may entail important consequences both for the existing underground experimental facilities and for those under construction. Investigation of cosmic-ray-induced neutron production in various materials is of paramount importance for the interpretation of experiments conducted at large depths under the Earth's surface.
NASA Astrophysics Data System (ADS)
Abdellah, Skoudarli; Mokhtar, Nibouche; Amina, Serir
2015-11-01
The H.264/AVC video coding standard is used in a wide range of applications, from video conferencing to high-definition television, owing to its high compression efficiency. This efficiency is mainly acquired from the newly allowed prediction schemes, including variable block modes. However, these schemes require high complexity to select the optimal mode. Consequently, complexity reduction in the H.264/AVC encoder has recently become a very challenging task in the video compression domain, especially when implementing the encoder in real-time applications. Fast mode decision algorithms play an important role in reducing the overall complexity of the encoder. In this paper, we propose an adaptive fast inter-mode algorithm based on motion activity, temporal stationarity, and spatial homogeneity. This algorithm predicts the motion activity of the current macroblock from its neighboring blocks and identifies temporally stationary regions and spatially homogeneous regions using adaptive threshold values based on video content features. Extensive experimental work has been done with the High profile, and results show that the proposed algorithm effectively reduces the computational complexity by 53.18% on average compared with the reference software encoder, while maintaining the high coding efficiency of H.264/AVC at a cost of only a 0.097 dB loss in total peak signal-to-noise ratio and a 0.228% increase in total bit rate.
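The decision logic of such a fast inter-mode algorithm can be sketched as a few threshold tests that prune the candidate mode list before rate-distortion optimization. The thresholds, feature names, and mode groupings below are hypothetical illustrations, not the paper's adaptive values or the reference-software logic.

```python
def choose_mode(mv_neighbors, sad_temporal, var_spatial,
                th_stationary=2.0, th_homog=8.0):
    """Return a reduced candidate-mode set for the current macroblock.
    mv_neighbors: motion vectors (dx, dy) of neighboring blocks;
    sad_temporal: difference vs. the co-located block;
    var_spatial: spatial variance of the macroblock."""
    motion_activity = sum(abs(dx) + abs(dy) for dx, dy in mv_neighbors)
    if sad_temporal < th_stationary:
        return ["SKIP", "16x16"]            # temporally stationary region
    if var_spatial < th_homog and motion_activity <= 4:
        return ["16x16", "16x8", "8x16"]    # homogeneous, low motion
    # otherwise fall back to the full variable-block-mode search
    return ["16x16", "16x8", "8x16", "8x8", "8x4", "4x8", "4x4"]
```

The complexity saving comes from how often the first two branches fire: every pruned mode is one fewer motion search and RD cost evaluation.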
The sugar code: Why glycans are so important.
Gabius, Hans-Joachim
2018-02-01
The cell surface is the platform for presentation of biochemical signals that are required for intercellular communication. Their profile necessarily needs to be responsive to internal and external factors in a highly dynamic manner. The structural features of the signals must meet the criterion of high-density information coding in a minimum of space. Thus, only biomolecules that can generate many different oligomers ('words') from few building blocks ('letters') qualify to meet this challenge. A comparative examination of the properties of the common biocompounds that form natural oligo- and polymers, starting with nucleotides and amino acids (the first and second alphabets of life), identifies sugars as the clear frontrunner. The enzymatic machinery for the biosynthesis of sugar chains can indeed link monosaccharides, the letters of the third alphabet of life, in a manner that reaches an unsurpassed number of oligomers (complex carbohydrates or glycans). Fittingly, the resulting glycome of a cell can be likened to a fingerprint. Conjugates of glycans with proteins and sphingolipids (glycoproteins and glycolipids) are ubiquitous in Nature. This implies a broad (patho)physiologic significance. By looking at the signals, at the writers and the erasers of this information as well as its readers and ensuing consequences, this review intends to introduce a broad readership to the principles of the concept of the sugar code. Copyright © 2017 Elsevier B.V. All rights reserved.
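The frontrunner claim is at heart a counting argument: linear words from an alphabet grow as letters^length, but each glycosidic bond additionally varies in anomeric configuration and linkage position. The sketch below makes that argument explicit with deliberately crude counts (branching ignored, simplified per-bond factors); the exact published isomer counts are far larger and are not reproduced here.

```python
def linear_words(letters, length):
    """Distinct linear oligomers from an alphabet of `letters`."""
    return letters ** length

def naive_glycan_count(letters, length, anomers=2, linkage_sites=4):
    """Crude count for *linear* glycans only: each of the
    (length - 1) bonds can additionally differ in anomeric
    configuration and linkage position (branching ignored)."""
    return letters ** length * (anomers * linkage_sites) ** (length - 1)
```

Even with these simplifications, a trisaccharide alphabet of 20 monosaccharides outnumbers the 8,000 tripeptides from 20 amino acids by the per-bond linkage factor, which is the combinatorial basis of the "third alphabet" argument.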
Sajjad, Muhammad; Mehmood, Irfan; Baik, Sung Wook
2015-01-01
Image super-resolution (SR) plays a vital role in medical imaging, allowing a more efficient and effective diagnosis process. Diagnosis from low-resolution (LR), noisy images is usually difficult and inaccurate. Resolution enhancement through conventional interpolation methods strongly affects the precision of subsequent processing steps, such as segmentation and registration. Therefore, we propose an efficient sparse-coded image SR reconstruction technique using a trained dictionary. We apply a simple and efficient regularized version of orthogonal matching pursuit (ROMP) to seek the coefficients of the sparse representation. ROMP has the transparency and greediness of OMP and the robustness of L1-minimization, which enhances the dictionary-learning process to capture feature descriptors such as oriented edges and contours from complex images like brain MRIs. The sparse coding part of the K-SVD dictionary training procedure is modified by substituting OMP with ROMP. The dictionary update stage allows simultaneously updating an arbitrary number of atoms and vectors of sparse coefficients. In SR reconstruction, ROMP is used to determine the vector of sparse coefficients for the underlying patch. The recovered representations are then applied to the trained dictionary, and finally, an optimization leads to a high-quality, high-resolution output. Experimental results demonstrate that the super-resolution reconstruction quality of the proposed scheme is comparatively better than other state-of-the-art schemes.
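The sparse-coding step can be illustrated in miniature. With an orthonormal dictionary, greedy pursuit collapses to keeping the k largest-magnitude correlations, which the toy functions below implement; this is a much-simplified stand-in for ROMP over a learned (non-orthogonal) dictionary, and all names are assumptions.

```python
def sparse_code(signal, dictionary, k):
    """Return {atom_index: coefficient} for the k strongest atoms.
    `dictionary` is a list of orthonormal atoms (lists of floats),
    so the optimal coefficient is just the inner product."""
    corr = []
    for j, atom in enumerate(dictionary):
        c = sum(s * a for s, a in zip(signal, atom))
        corr.append((abs(c), j, c))
    corr.sort(reverse=True)                 # greedy: largest magnitudes first
    return {j: c for _, j, c in corr[:k]}

def reconstruct(coeffs, dictionary, n):
    """Synthesize the signal from the selected atoms."""
    out = [0.0] * n
    for j, c in coeffs.items():
        for i in range(n):
            out[i] += c * dictionary[j][i]
    return out
```

Real ROMP differs in that it selects groups of coordinates with comparable magnitudes and re-solves a least-squares problem at each step, but the select-then-reconstruct structure is the same one used for each image patch in the SR pipeline.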
Zou, Ling; Zhao, Haihua; Zhang, Hongbin
2016-08-24
This study presents a numerical investigation on using the Jacobian-free Newton-Krylov (JFNK) method to solve the two-phase flow four-equation drift flux model with realistic constitutive correlations ('closure models'). The drift flux model is based on Ishii and his collaborators' work. Additional constitutive correlations for vertical channel flow, such as two-phase flow pressure drop, flow regime map, wall boiling and interfacial heat transfer models, were taken from the RELAP5-3D Code Manual and included to complete the model. The staggered-grid finite volume method and the fully implicit backward Euler method were used for the spatial discretization and time integration, respectively. The Jacobian-free Newton-Krylov method shows no difficulty in solving the two-phase flow drift flux model with a discrete flow regime map. In addition to the Jacobian-free approach, the preconditioning matrix is obtained by using the default finite differencing method provided in the PETSc package, and consequently the labor-intensive implementation of a complex analytical Jacobian matrix is avoided. Extensive and successful numerical verification and validation have been performed to prove the correct implementation of the models and methods. Code-to-code comparison with RELAP5-3D has further demonstrated the successful implementation of the drift flux model.
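The defining trick of JFNK is that the Krylov solver never needs the Jacobian matrix itself, only Jacobian-vector products, and those can be approximated by a finite difference of the nonlinear residual. The sketch below shows that core approximation in isolation (the function name is an assumption; production codes like the one above delegate this to PETSc).

```python
def jacobian_free_product(F, u, v, eps=1e-7):
    """Approximate J(u) @ v without forming J:
        J(u) @ v  ~=  (F(u + eps*v) - F(u)) / eps
    F maps a list of floats to a list of floats (the residual)."""
    Fu = F(u)
    Fp = F([ui + eps * vi for ui, vi in zip(u, v)])
    return [(a - b) / eps for a, b in zip(Fp, Fu)]
```

Each Krylov iteration then costs one extra residual evaluation instead of a Jacobian assembly, which is what makes discrete closures such as a flow-regime map tractable: only F itself must be evaluated, never differentiated analytically.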
Development of Switchable Polarity Solvent Draw Solutes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wilson, Aaron D.
Results of a computational fluid dynamic (CFD) study of flow and heat transfer in a printed circuit heat exchanger (PCHE) geometry are presented. CFD results obtained from a two-plate model are compared to corresponding experimental results for the validation. This process provides the basis for further application of the CFD code to PCHE design and performance analysis in a variety of internal flow geometries. As a part of the code verification and validation (V&V) process, CFD simulation of a single semicircular straight channel under laminar isothermal conditions was also performed and compared to theoretical results. This comparison yielded excellent agreement with the theoretical values. The two-plate CFD model based on the experimental PCHE design overestimated the effectiveness and underestimated the pressure drop. However, it is found that the discrepancy between the CFD result and experimental data was mainly caused by the uncertainty in the geometry of heat exchanger during the fabrication. The CFD results obtained using a slightly smaller channel diameter yielded good agreement with the experimental data. A separate investigation revealed that the average channel diameter of the OSU PCHE after the diffusion-bonding was 1.93 mm on the cold fluid side and 1.90 mm on the hot fluid side which are both smaller than the nominal design value. Consequently, the CFD code was shown to have sufficient capability to evaluate the heat exchanger thermal-hydraulic performance.
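The laminar single-channel V&V step compares CFD output against standard duct-flow theory. The sketch below shows the two hand calculations involved: the hydraulic diameter of a semicircular channel and the fully developed laminar pressure drop, using the tabulated friction-factor product for semicircular ducts (fRe ≈ 15.78, Fanning convention); the function names and the Darcy-Weisbach form are illustrative assumptions.

```python
import math

F_RE_SEMICIRCULAR = 15.78  # approx. laminar (f*Re), Fanning, semicircular duct

def hydraulic_diameter(d):
    """Semicircular channel of diameter d: D_h = 4A/P = pi*d/(pi + 2)."""
    area = math.pi * d**2 / 8.0
    perimeter = math.pi * d / 2.0 + d
    return 4.0 * area / perimeter

def laminar_pressure_drop(d, length, velocity, rho, mu):
    """Darcy-Weisbach pressure drop with f_Darcy = 4*(fRe)/Re."""
    dh = hydraulic_diameter(d)
    re = rho * velocity * dh / mu
    f_darcy = 4.0 * F_RE_SEMICIRCULAR / re
    return f_darcy * (length / dh) * 0.5 * rho * velocity**2
```

A useful sanity check, visible in the test, is that laminar pressure drop scales linearly with velocity (f ∝ 1/Re cancels one power of v); it also illustrates why the 1.93 mm vs. nominal diameter discrepancy matters, since the drop scales strongly with D_h.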
Exploring inattention and distraction in the SafetyNet Accident Causation Database.
Talbot, Rachel; Fagerlind, Helen; Morris, Andrew
2013-11-01
Distraction and inattention are considered to be very important and prevalent factors in the causation of road accidents. There have been many recent research studies which have attempted to understand the circumstances under which a driver becomes distracted or inattentive and how distraction/inattention can be prevented. Both factors are thought to have become more important in recent times, partly due to the evolution of in-vehicle information and communication technology. This study describes a methodology that was developed to understand when factors such as distraction and inattention may have been contributors to crashes, and also describes some of the consequences of distraction and inattention in terms of subsequent driver actions. The study uses data relating to distraction and inattention from the SafetyNet Accident Causation Database. This database was formulated as part of the SafetyNet project to address the lack of representative in-depth accident causation data within the European Union. Data were collected in 6 European countries using 'on-scene' and 'nearly on-scene' crash investigation methodologies. 32% of crashes recorded in the database involved at least one driver, rider or pedestrian who was determined to be 'Inattentive' or 'Distracted'. 212 of the drivers were assigned 'Distraction' and 140 drivers were given the code 'Inattention'. It was found that both distraction and inattention often led to missed observations within the driving task, and consequently 'Timing' or 'Direction' became critical events in the aetiology of crashes. In addition, the crash types and outcomes may differ according to the type and nature of the distraction and inattention as determined by the in-depth investigations. The development of the accident coding methodology is described in this study, as is its evolution into the Driver Reliability and Error Analysis Model (DREAM) version 3.0. Copyright © 2012 Elsevier Ltd. All rights reserved.
A content analysis of displayed alcohol references on a social networking web site.
Moreno, Megan A; Briner, Leslie R; Williams, Amanda; Brockman, Libby; Walker, Leslie; Christakis, Dimitri A
2010-08-01
Exposure to alcohol use in media is associated with adolescent alcohol use. Adolescents frequently display alcohol references on Internet media, such as social networking web sites. The purpose of this study was to conduct a theoretically based content analysis of older adolescents' displayed alcohol references on a social networking web site. We evaluated 400 randomly selected public MySpace profiles of self-reported 17- to 20-year-olds from zip codes, representing urban, suburban, and rural communities in one Washington county. Content was evaluated for alcohol references, suggesting: (1) explicit versus figurative alcohol use, (2) alcohol-related motivations, associations, and consequences, including references that met CRAFFT problem drinking criteria. We compared profiles from four target zip codes for prevalence and frequency of alcohol display. Of 400 profiles, 225 (56.3%) contained 341 references to alcohol. Profile owners who displayed alcohol references were mostly male (54.2%) and white (70.7%). The most frequent reference category was explicit use (49.3%); the most commonly displayed alcohol use motivation was peer pressure (4.7%). Few references met CRAFFT problem drinking criteria (3.2%). There were no differences in prevalence or frequency of alcohol display among the four sociodemographic communities. Despite alcohol use being illegal and potentially stigmatizing in this population, explicit alcohol use is frequently referenced on adolescents' MySpace profiles across several sociodemographic communities. Motivations, associations, and consequences regarding alcohol use referenced on MySpace appear consistent with previous studies of adolescent alcohol use. These references may be a potent source of influence on adolescents, particularly given that they are created and displayed by peers. (c) 2010 Society for Adolescent Health and Medicine. Published by Elsevier Inc. All rights reserved.
CIRMIS Data system. Volume 2. Program listings
DOE Office of Scientific and Technical Information (OSTI.GOV)
Friedrichs, D.R.
1980-01-01
The Assessment of Effectiveness of Geologic Isolation Systems (AEGIS) Program is developing and applying the methodology for assessing the far-field, long-term post-closure safety of deep geologic nuclear waste repositories. AEGIS is being performed by Pacific Northwest Laboratory (PNL) under contract with the Office of Nuclear Waste Isolation (ONWI) for the Department of Energy (DOE). One task within AEGIS is the development of methodology for analysis of the consequences (water pathway) from loss of repository containment as defined by various release scenarios. Analysis of the long-term, far-field consequences of release scenarios requires the application of numerical codes which simulate the hydrologic systems, model the transport of released radionuclides through the hydrologic systems to the biosphere, and, where applicable, assess the radiological dose to humans. The various input parameters required in the analysis are compiled in data systems. The data are organized and prepared by various input subroutines for use by the hydrologic and transport codes. The hydrologic models simulate the groundwater flow systems and provide water flow directions, rates, and velocities as inputs to the transport models. Outputs from the transport models are basically graphs of radionuclide concentration in the groundwater plotted against time. After dilution in the receiving surface-water body (e.g., lake, river, bay), these data are the input source terms for the dose models, if dose assessments are required. The dose models calculate radiation dose to individuals and populations. CIRMIS (Comprehensive Information Retrieval and Model Input Sequence) Data System is a storage and retrieval system for model input and output data, including graphical interpretation and display. This is the second of four volumes of the description of the CIRMIS Data System.
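The transport-model output described above (radionuclide concentration in groundwater versus time) has a classic closed-form archetype: the Ogata-Banks solution of the one-dimensional advection-dispersion equation for a continuous source. The sketch below is that standard solution, offered only as an illustration of what such codes compute far more elaborately; the parameter names are assumptions.

```python
import math

def ogata_banks(x, t, v, D, c0=1.0):
    """Relative concentration c(x, t) for 1-D advection-dispersion
    with a continuous source of strength c0 at x = 0:
        c/c0 = 0.5 * [erfc((x - v t)/(2 sqrt(D t)))
                      + exp(v x / D) * erfc((x + v t)/(2 sqrt(D t)))]
    v: pore velocity, D: dispersion coefficient."""
    if t <= 0:
        return 0.0
    s = 2.0 * math.sqrt(D * t)
    a = math.erfc((x - v * t) / s)
    b = math.exp(v * x / D) * math.erfc((x + v * t) / s)
    return 0.5 * c0 * (a + b)
```

Evaluated over a range of times at a fixed downstream distance, this traces exactly the kind of breakthrough curve that, after dilution in the receiving surface water, becomes the source term for the dose models.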
DOE Office of Scientific and Technical Information (OSTI.GOV)
Friedrichs, D.R.
1980-01-01
The Assessment of Effectiveness of Geologic Isolation Systems (AEGIS) Program is developing and applying the methodology for assessing the far-field, long-term post-closure safety of deep geologic nuclear waste repositories. AEGIS is being performed by Pacific Northwest Laboratory (PNL) under contract with the Office of Nuclear Waste Isolation (ONWI) for the Department of Energy (DOE). One task within AEGIS is the development of methodology for analysis of the consequences (water pathway) from loss of repository containment as defined by various release scenarios. Analysis of the long-term, far-field consequences of release scenarios requires the application of numerical codes which simulate the hydrologic systems, model the transport of released radionuclides through the hydrologic systems to the biosphere, and, where applicable, assess the radiological dose to humans. The various input parameters required in the analysis are compiled in data systems. The data are organized and prepared by various input subroutines for use by the hydrologic and transport codes. The hydrologic models simulate the groundwater flow systems and provide water flow directions, rates, and velocities as inputs to the transport models. Outputs from the transport models are basically graphs of radionuclide concentration in the groundwater plotted against time. After dilution in the receiving surface-water body (e.g., lake, river, bay), these data are the input source terms for the dose models, if dose assessments are required. The dose models calculate radiation dose to individuals and populations. CIRMIS (Comprehensive Information Retrieval and Model Input Sequence) Data System is a storage and retrieval system for model input and output data, including graphical interpretation and display. This is the fourth of four volumes of the description of the CIRMIS Data System.
A Content Analysis of Displayed Alcohol References on a Social Networking Web Site
Moreno, Megan A; Briner, Leslie R; Williams, Amanda; Brockman, Libby; Walker, Leslie; Christakis, Dimitri A
2010-01-01
Purpose: Exposure to alcohol use in media is associated with adolescent alcohol use. Adolescents frequently display alcohol references on Internet media such as social networking websites (SNSs). The purpose of this study was to conduct a theoretically based content analysis of older adolescents’ displayed alcohol references on an SNS. Methods: We evaluated 400 randomly selected public MySpace profiles of self-reported 17- to 20-year-olds from zip codes representing urban, suburban and rural communities in one Washington county. Content was evaluated for alcohol references suggesting: 1) explicit versus figurative alcohol use, and 2) alcohol-related motivations, associations and consequences, including references that met CRAFFT problem drinking criteria. We compared profiles from four target zip codes for prevalence and frequency of alcohol display. Results: Of 400 profiles, 225 (56.3%) contained a total of 341 references to alcohol. Profile owners who displayed alcohol references were mostly male (54.2%) and White (70.7%). The most frequent reference category was explicit use (49.3%); the most commonly displayed alcohol use motivation was peer pressure (4.7%). Few references met CRAFFT problem drinking criteria (3.2%). There were no differences in prevalence or frequency of alcohol display among the four sociodemographic communities. Conclusions: Despite alcohol use being illegal and potentially stigmatizing in this population, explicit alcohol use is frequently referenced on adolescents’ MySpace profiles across several sociodemographic communities. The motivations, associations and consequences regarding alcohol use referenced on MySpace appear consistent with previous studies of adolescent alcohol use. These references may be a potent source of influence on adolescents, particularly given that they are created and displayed by peers. PMID:20638009
Alcohol brand appearances in US popular music.
Primack, Brian A; Nuzzo, Erin; Rice, Kristen R; Sargent, James D
2012-03-01
The average US adolescent is exposed to 34 references to alcohol in popular music daily. Although brand recognition is an independent, potent risk factor for alcohol outcomes among adolescents, alcohol brand appearances in popular music have not been assessed systematically. We aimed to determine the prevalence of and contextual elements associated with alcohol brand appearances in US popular music. Design: qualitative content analysis. We used Billboard Magazine to identify songs to which US adolescents were most exposed in 2005-07. For each of the 793 songs, two trained coders independently analyzed the lyrics for references to alcohol and alcohol brand appearances. Subsequent in-depth assessments utilized Atlas.ti to determine contextual factors associated with each of the alcohol brand appearances. Our final code book contained 27 relevant codes representing six categories: alcohol types, consequences, emotional states, activities, status and objects. Average inter-rater reliability was high (κ = 0.80), and all differences were easily adjudicated. Of the 793 songs in our sample, 169 (21.3%) referred explicitly to alcohol, and of those, 41 (24.3%) contained an alcohol brand appearance. Consequences associated with alcohol were more often positive than negative (41.5% versus 17.1%, P < 0.001). Alcohol brand appearances were commonly associated with wealth (63.4%), sex (58.5%), luxury objects (51.2%), partying (48.8%), other drugs (43.9%) and vehicles (39.0%). One in five songs sampled from US popular music had explicit references to alcohol, and one-quarter of these mentioned a specific alcohol brand. These alcohol brand appearances are commonly associated with a luxury lifestyle characterized by wealth, sex, partying and other drugs. © 2011 The Authors, Addiction © 2011 Society for the Study of Addiction.
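The inter-rater reliability reported above (κ = 0.80) is Cohen's kappa, which measures agreement between the two coders after discounting the agreement expected by chance. A minimal sketch of the computation; the label set and data below are invented for illustration, not the study's actual codes:

```python
from collections import Counter

def cohens_kappa(coder_a, coder_b):
    """Cohen's kappa for two coders' nominal labels on the same items."""
    assert len(coder_a) == len(coder_b)
    n = len(coder_a)
    # Observed agreement: fraction of items the coders labeled identically.
    p_o = sum(a == b for a, b in zip(coder_a, coder_b)) / n
    # Chance agreement: from each coder's marginal label frequencies.
    freq_a, freq_b = Counter(coder_a), Counter(coder_b)
    p_e = sum(freq_a[lbl] * freq_b.get(lbl, 0) for lbl in freq_a) / n**2
    return (p_o - p_e) / (1 - p_e)

a = ["alcohol", "none", "alcohol", "none", "alcohol", "none"]
b = ["alcohol", "none", "alcohol", "alcohol", "alcohol", "none"]
print(round(cohens_kappa(a, b), 3))  # 0.667 for these toy labels
```

Values near 0.8, as in the study, are conventionally read as strong agreement beyond chance.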
Monitoring the evolutionary aspect of the Gene Ontology to enhance predictability and usability.
Park, Jong C; Kim, Tak-eun; Park, Jinah
2008-04-11
Much effort is currently being made to develop the Gene Ontology (GO). Due to the dynamic nature of the information it addresses, GO undergoes constant updates whose results are released at regular intervals as separate versions. Although there are a large number of computational tools to aid the development of GO, they operate on a particular version of GO, making it difficult for GO curators to anticipate the full impact of particular changes along the time axis on a larger scale. We present a method for tapping into this evolutionary aspect of GO, by making it possible to keep track of important temporal changes to any of the terms and relations of GO and, consequently, to recognize associated trends. We have developed visualization methods for viewing the changes between two different versions of GO by constructing a colour-coded layered graph. The graph shows both versions of GO, highlighting those GO terms that are added, removed or modified between the two versions. Focusing on a specific GO term or terms of interest over a period, we demonstrate the utility of our system, which can be used to form hypotheses about the causes of the evolution and to provide new insights into more complex changes. GO undergoes fast evolutionary changes. A snapshot of GO, as presented by each version of GO alone, overlooks such evolutionary aspects and consequently limits the utility of GO. A method that highlights, with colour-coding, the differences between consecutive versions or two arbitrary versions of an evolving ontology enhances the utility of GO for users as well as for developers. To the best of our knowledge, this is the first proposal to visualize the evolutionary aspect of GO.
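The core of the version-to-version comparison described above reduces to three set operations over term identifiers. A toy sketch, assuming each GO release has been parsed into a dict from term ID to its record (the IDs and definitions below are invented; real GO releases are OBO/OWL files with richer structure):

```python
def diff_versions(old, new):
    """Classify ontology terms between two versions.

    Each version is a dict mapping term ID -> term record (e.g. a
    name/definition string) -- a simplified stand-in for parsed releases.
    """
    old_ids, new_ids = set(old), set(new)
    return {
        "added":    sorted(new_ids - old_ids),
        "removed":  sorted(old_ids - new_ids),
        # Present in both versions but with a changed record.
        "modified": sorted(t for t in old_ids & new_ids if old[t] != new[t]),
    }

v1 = {"GO:0001": "cell growth", "GO:0002": "apoptosis", "GO:0003": "transport"}
v2 = {"GO:0001": "cell growth", "GO:0002": "programmed cell death", "GO:0004": "signaling"}
print(diff_versions(v1, v2))
# {'added': ['GO:0004'], 'removed': ['GO:0003'], 'modified': ['GO:0002']}
```

The three resulting classes map directly onto the colour-coding of the layered graph the authors describe.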
Azzopardi, Roberta Vella; Vermeiren, Sofie; Gorus, Ellen; Habbig, Ann-Katrin; Petrovic, Mirko; Van Den Noortgate, Nele; De Vriendt, Patricia; Bautmans, Ivan; Beyer, Ingo
2016-11-01
To date, the major dilemma concerning frailty is the lack of a standardized language for its operationalization. Considering the demographic challenge that the world is facing, standardization of frailty identification is indeed the first step in tackling the burdensome consequences of frailty. To demonstrate this diversity in frailty assessment, the available frailty instruments have been linked to the International Classification of Functioning, Disability, and Health (ICF): a standardized and hierarchically coded language developed by the World Health Organization for describing health conditions and their positive (functioning) and negative (disability) consequences. A systematic review of frailty instruments was carried out in PubMed, Web of Knowledge, and PsycINFO. The items of the identified frailty instruments were then linked to the ICF codes. 79 original or adapted frailty instruments were identified and categorized into single-domain (n = 25) and multidomain (n = 54) groups. Only 5 frailty instruments (indexes) were linked to all 5 ICF components. Whereas the ICF components Body Functions and Activities and Participation were frequently linked to the frailty instruments, Body Structures and Environmental and Personal factors were only sparsely represented, mainly in the multidomain frailty instruments. This review highlights the heterogeneity in frailty operationalization. Environmental and personal factors should be given more thought in future frailty assessments. Being unambiguous, structured, and neutral, the ICF language allows comparing observations made with different frailty instruments. In conclusion, this systematic overview and ICF translation can be a cornerstone for future standardization of frailty assessment. Copyright © 2016 AMDA – The Society for Post-Acute and Long-Term Care Medicine. Published by Elsevier Inc. All rights reserved.
Alcohol Brand Appearances in U.S. Popular Music
Primack, Brian A.; Nuzzo, Erin; Rice, Kristen R.; Sargent, James D.
2011-01-01
Aims: The average US adolescent is exposed to 34 references to alcohol in popular music daily. Although brand recognition is an independent, potent risk factor for alcohol outcomes among adolescents, alcohol brand appearances in popular music have not been systematically assessed. We aimed to determine the prevalence of and contextual elements associated with alcohol brand appearances in U.S. popular music. Design: Qualitative content analysis. Setting: We used Billboard Magazine to identify songs to which US adolescents were most exposed in 2005-2007. For each of the 793 songs, two trained coders independently analyzed the lyrics of each song for references to alcohol and alcohol brand appearances. Subsequent in-depth assessments utilised Atlas.ti to determine contextual factors associated with each of the alcohol brand appearances. Measurements: Our final code book contained 27 relevant codes representing 6 categories: alcohol types, consequences, emotional states, activities, status, and objects. Findings: Average inter-rater reliability was high (κ=0.80), and all differences were easily adjudicated. Of the 793 songs in our sample, 169 (21.3%) explicitly referred to alcohol, and of those, 41 (24.3%) contained an alcohol brand appearance. Consequences associated with alcohol were more often positive than negative (41.5% vs. 17.1%, P<.001). Alcohol brand appearances were commonly associated with wealth (63.4%), sex (58.5%), luxury objects (51.2%), partying (48.8%), other drugs (43.9%), and vehicles (39.0%). Conclusions: One in five songs sampled from U.S. popular music had explicit references to alcohol, and one quarter of these mentioned a specific alcohol brand. These alcohol brand appearances are commonly associated with a luxury lifestyle characterised by wealth, sex, partying, and other drugs. PMID:22011113
Exploring the patient perspective of fatigue in adults with visual impairment: a qualitative study.
Schakel, Wouter; Bode, Christina; van der Aa, Hilde P A; Hulshof, Carel T J; Bosmans, Judith E; van Rens, Gerardus H M B; van Nispen, Ruth M A
2017-08-03
Fatigue is a symptom often mentioned by patients with irreversible visual impairment. This study explored the patient perspective of fatigue in visually impaired adults, with a focus on symptoms of fatigue, causes, consequences and coping strategies. Setting: Two large Dutch low-vision multidisciplinary rehabilitation organisations. Participants: 16 visually impaired adults with severe symptoms of fatigue, selected by purposive sampling. Design: A qualitative study involving semistructured interviews. Four first-level codes were predetermined top-down, corresponding to the topics of the research question. Verbatim transcribed interviews were analysed with a combination of a deductive and an inductive approach using open and axial coding. Participants often described the symptoms of fatigue as a mental, daily and physical experience. The most often mentioned causes of fatigue were a high cognitive load, the intensity and amount of activities, the high effort necessary to establish visual perception, difficulty with light intensity and negative cognitions. Fatigue had the greatest impact on the ability to carry out social roles and participation, emotional functioning and cognitive functioning. The most common coping strategies were relaxation, external support, socialising and physical exercise, and the acceptance of fatigue. Our results indicate that low vision-related fatigue is mainly caused by population-specific determinants that seem different from the fatigue experience described in studies of other patient populations. Fatigue may be central to the way patients react to, adapt to and compensate for the consequences of vision loss. These findings indicate a need for future research aimed at interventions specifically tailored to the unique aspects of fatigue related to vision loss. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
The radiation fields around a proton therapy facility: A comparison of Monte Carlo simulations
NASA Astrophysics Data System (ADS)
Ottaviano, G.; Picardi, L.; Pillon, M.; Ronsivalle, C.; Sandri, S.
2014-02-01
A proton therapy test facility with an average beam current below 10 nA and an energy of up to 150 MeV is planned to be sited at the Frascati ENEA Research Center, in Italy. The accelerator is composed of a sequence of linear sections. The first is a commercial 7 MeV proton linac, from which the beam is injected into an SCDTL (Side Coupled Drift Tube Linac) structure, reaching an energy of 52 MeV. A conventional CCL (Coupled Cavity Linac) with side coupling cavities then completes the accelerator. The linear structure has the important advantage that the main radiation losses during the acceleration process occur for protons with energies below 20 MeV, with a consequently low production of neutrons and secondary radiation. From the radiation protection point of view, the radiation source for this facility is therefore almost completely located at the final target. Physical and geometrical models of the device have been developed and implemented in radiation transport computer codes based on the Monte Carlo method. The aim is the assessment of the radiation field around the main source in support of the safety analysis. For the assessment, independent researchers used two different Monte Carlo codes, FLUKA (FLUktuierende KAskade) and MCNPX (Monte Carlo N-Particle eXtended). Both are general-purpose tools for calculations of particle transport and interactions with matter, covering an extended range of applications including proton beam analysis. Nevertheless, each utilizes its own nuclear cross-section libraries and specific physics models for particle types and energies. The models implemented in the codes are described and the results are presented. The differences between the two calculations are reported and discussed, pointing out the disadvantages and advantages of each code in this specific application.
Spatiotopic coding during dynamic head tilt
Turi, Marco; Burr, David C.
2016-01-01
Humans maintain a stable representation of the visual world effortlessly, despite constant movements of the eyes, head, and body, across multiple planes. Whereas visual stability in the face of saccadic eye movements has been intensely researched, fewer studies have investigated retinal image transformations induced by head movements, especially in the frontal plane. Unlike head rotations in the horizontal and sagittal planes, tilting the head in the frontal plane is only partially counteracted by torsional eye movements and consequently induces a distortion of the retinal image to which we seem to be completely oblivious. One possible mechanism aiding perceptual stability is an active reconstruction of a spatiotopic map of the visual world, anchored in allocentric coordinates. To explore this possibility, we measured the positional motion aftereffect (PMAE; the apparent change in position after adaptation to motion) with head tilts of ∼42° between adaptation and test (to dissociate retinal from allocentric coordinates). The aftereffect was shown to have both a retinotopic and spatiotopic component. When tested with unpatterned Gaussian blobs rather than sinusoidal grating stimuli, the retinotopic component was greatly reduced, whereas the spatiotopic component remained. The results suggest that perceptual stability may be maintained at least partially through mechanisms involving spatiotopic coding. NEW & NOTEWORTHY Given that spatiotopic coding could play a key role in maintaining visual stability, we look for evidence of spatiotopic coding after retinal image transformations caused by head tilt. To this end, we measure the strength of the positional motion aftereffect (PMAE; previously shown to be largely spatiotopic after saccades) after large head tilts. We find that, as with eye movements, the spatial selectivity of the PMAE has a large spatiotopic component after head rotation. PMID:27903636
Experimental study on lateral strength of wall-slab joint subjected to lateral cyclic load
NASA Astrophysics Data System (ADS)
Masrom, Mohd Asha'ari; Mohamad, Mohd Elfie; Hamid, Nor Hayati Abdul; Yusuff, Amer
2017-10-01
Tunnel-form building has been utilised in building construction in Malaysia since 1960. This method of construction has been applied extensively in the construction of high-rise residential buildings such as condominiums and apartments. Most tunnel-form buildings have been designed according to the British Standard (BS), which has no provision for seismic loading; high-rise tunnel-form buildings are therefore vulnerable to seismic loading. The connections between slabs and shear walls in a tunnel-form building constitute an essential link in the lateral load resisting mechanism. Malaysia is shifting from the BS code to the Eurocode (EC) for building construction, as the country has recognised the safety threat posed by earthquakes. Hence, this study is intended to compare the performance of the interior wall-slab joint of a tunnel-form structure designed to the European and British codes. The experiment comprised full-scale tests of wall-slab joint sub-assemblages under reversible lateral cyclic loading. Two sub-assemblage specimens of the wall-slab joint were designed and constructed based on the two codes. Each specimen was tested using lateral displacement (drift) control. The specimen designed to the Eurocode was found to survive up to 3.0% drift, while the BS specimen lasted to 1.5% drift. The analysis results indicated that the BS specimen was governed by brittle failure modes with Ductility Class Low (DCL), while the EC specimen behaved in a ductile manner with Ductility Class Medium (DCM). The low ductility recorded in the BS specimen resulted from the insufficient reinforcement provided in that specimen. Consequently, the BS specimen could not absorb energy efficiently (low energy dissipation) or sustain further inelastic deformation.
NASA Astrophysics Data System (ADS)
Steefel, C. I.
2015-12-01
Over the last 20 years, we have seen the evolution of multicomponent reactive transport modeling and the expanding range and increasing complexity of subsurface environmental applications it is being used to address. Reactive transport modeling is being asked to provide accurate assessments of engineering performance and risk for important issues with far-reaching consequences. As a result, the complexity and detail of subsurface processes, properties, and conditions that can be simulated have significantly expanded. Closed-form solutions are necessary and useful, but limited to situations that are far simpler than typical applications that combine many physical and chemical processes, in many cases in coupled form. In the absence of closed-form and yet realistic solutions for complex applications, numerical benchmark problems with an accepted set of results will be indispensable to qualifying codes for various environmental applications. The intent of this benchmarking exercise, now underway for more than five years, is to develop and publish a set of well-described benchmark problems that can be used to demonstrate simulator conformance with norms established by the subsurface science and engineering community. The objective is not to verify this or that specific code (the reactive transport codes play a supporting role in this regard) but rather to use the codes to verify that a common solution of the problem can be achieved. Thus, the objective of each of the manuscripts is to present an environmentally relevant benchmark problem that tests the conceptual model capabilities, numerical implementation, process coupling, and accuracy. The benchmark problems developed to date include 1) microbially mediated reactions, 2) isotopes, 3) multi-component diffusion, 4) uranium fate and transport, 5) metal mobility in mining-affected systems, and 6) waste repositories and related aspects.
NASA Astrophysics Data System (ADS)
Ludwig, J.; Lindhorst, S.; Betzler, C.; Bierstedt, S. E.; Borówka, R. K.
2017-08-01
It is shown that coastal dunes bear an as-yet-unread archive of annual wind intensity. Active dunes at the Polish coast near Łeba consist of two genetic units: primary dunes with up to 18 m high eastward-dipping foresets, temporarily superimposed by smaller secondary dunes. Ground-penetrating radar (GPR) data reveal that the foresets of the primary dunes are bundled into alternating packages imaged as either low- or high-amplitude reflections. High-amplitude packages are composed of quartz sand with intercalated heavy-mineral layers. Low-amplitude packages lack these heavy-mineral concentrations. Dune net-progradation is towards the east, reflecting the prevalence of westerly winds. Winds blowing parallel to the dune crest winnow the lee slope, leaving layers enriched in heavy minerals. Sediment transport to the slip face of the dunes is enhanced during the winter months, whereas winnowing predominantly takes place during the spring to autumn months, when the wind field is bi-directional. As a consequence of this seasonal shift, the sedimentary record of one year comprises one low- and one high-amplitude GPR reflection interval. This sedimentary pattern is a persistent feature of the Łeba dunes and is recognized to resemble a sedimentary "bar code". To overcome hiatuses in the bar code of individual dunes and dune-to-dune variations in bar-code quality, dendrochronological methods were adopted to compile a composite bar code from several dunes. The resulting data series shows annual variations in west-wind intensity at the southern Baltic coast for the period 1987 to 2012. Proxy-based wind data are validated against instrument-based weather observations.
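The dendrochronological cross-dating step, matching one dune's annual "bar code" against a composite, amounts to sliding one series along the other and keeping the offset with the highest correlation. A simplified sketch on synthetic data; the series lengths, lag range, and minimum-overlap threshold are illustrative assumptions, not values from the study:

```python
import numpy as np

def best_lag(reference, sample, max_lag=5):
    """Find the offset that best aligns `sample` with `reference` by
    maximizing Pearson correlation over candidate lags -- the
    cross-dating idea borrowed from dendrochronology."""
    best = (None, -2.0)  # (lag, correlation); any real r beats -2
    for lag in range(-max_lag, max_lag + 1):
        lo = max(0, lag)
        hi = min(len(reference), lag + len(sample))
        if hi - lo < 3:                    # require a minimal overlap
            continue
        r = np.corrcoef(reference[lo:hi], sample[lo - lag:hi - lag])[0, 1]
        if r > best[1]:
            best = (lag, r)
    return best

rng = np.random.default_rng(0)
ref = rng.normal(size=26)                          # stand-in 26-year master record
smp = ref[4:20] + rng.normal(scale=0.1, size=16)   # a dune record starting 4 years in
lag, r = best_lag(ref, smp)
print(lag, round(r, 2))
```

In the study's setting the same idea lets records with local hiatuses be stacked into one composite series once each is shifted to its best-fitting position.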
PFLOTRAN Verification: Development of a Testing Suite to Ensure Software Quality
NASA Astrophysics Data System (ADS)
Hammond, G. E.; Frederick, J. M.
2016-12-01
In scientific computing, code verification ensures the reliability and numerical accuracy of a model simulation by comparing the simulation results to experimental data or known analytical solutions. The model is typically defined by a set of partial differential equations with initial and boundary conditions, and verification ensures whether the mathematical model is solved correctly by the software. Code verification is especially important if the software is used to model high-consequence systems which cannot be physically tested in a fully representative environment [Oberkampf and Trucano (2007)]. Justified confidence in a particular computational tool requires clarity in the exercised physics and transparency in its verification process with proper documentation. We present a quality assurance (QA) testing suite developed by Sandia National Laboratories that performs code verification for PFLOTRAN, an open source, massively-parallel subsurface simulator. PFLOTRAN solves systems of generally nonlinear partial differential equations describing multiphase, multicomponent and multiscale reactive flow and transport processes in porous media. PFLOTRAN's QA test suite compares the numerical solutions of benchmark problems in heat and mass transport against known, closed-form, analytical solutions, including documentation of the exercised physical process models implemented in each PFLOTRAN benchmark simulation. The QA test suite development strives to follow the recommendations given by Oberkampf and Trucano (2007), which describes four essential elements in high-quality verification benchmark construction: (1) conceptual description, (2) mathematical description, (3) accuracy assessment, and (4) additional documentation and user information. 
Several QA tests within the suite will be presented, including details of the benchmark problems and their closed-form analytical solutions, implementation of benchmark problems in PFLOTRAN simulations, and the criteria used to assess PFLOTRAN's performance in the code verification procedure. References Oberkampf, W. L., and T. G. Trucano (2007), Verification and Validation Benchmarks, SAND2007-0853, 67 pgs., Sandia National Laboratories, Albuquerque, NM.
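The verification pattern the suite follows, a numerical solution checked against a known closed-form solution, can be illustrated with a toy benchmark. This is not PFLOTRAN code; the solver, grid sizes, and tolerance are invented for illustration. An explicit finite-difference solution of the 1D diffusion equation is compared with the exact decaying-sine solution u(x,t) = exp(-π²Dt)·sin(πx):

```python
import math

def diffusion_max_error(nx=51, nt=2000, D=1.0, t_end=0.1):
    """FTCS finite-difference solution of u_t = D u_xx on [0,1] with
    u(0)=u(1)=0 and u(x,0)=sin(pi x), compared against the exact
    solution u(x,t) = exp(-pi^2 D t) sin(pi x). Returns the max error."""
    dx = 1.0 / (nx - 1)
    dt = t_end / nt
    r = D * dt / dx**2
    assert r <= 0.5, "explicit scheme stability limit violated"
    u = [math.sin(math.pi * i * dx) for i in range(nx)]
    for _ in range(nt):
        u = [0.0] + [u[i] + r * (u[i+1] - 2*u[i] + u[i-1])
                     for i in range(1, nx - 1)] + [0.0]
    exact = [math.exp(-math.pi**2 * D * t_end) * math.sin(math.pi * i * dx)
             for i in range(nx)]
    return max(abs(a - b) for a, b in zip(u, exact))

print(diffusion_max_error())
```

A QA harness in the style described would assert that this error stays below a documented tolerance, and would typically also check that it shrinks at the expected rate as the grid is refined.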
NASA Astrophysics Data System (ADS)
Pantale, O.; Caperaa, S.; Rakotomalala, R.
2004-07-01
During the last 50 years, the development of better numerical methods and more powerful computers has been a major enterprise for the scientific community. At the same time, the finite element method has become a widely used tool for researchers and engineers. Recent advances in computational software have made it possible to solve more physical and complex problems such as coupled problems, nonlinearities, and high-strain and high-strain-rate problems. In this field, an accurate analysis of the large-deformation inelastic problems occurring in metal-forming or impact simulations is extremely important as a consequence of the large amount of plastic flow. In this presentation, the object-oriented implementation, using the C++ language, of an explicit finite element code called DynELA is presented. Object-oriented programming (OOP) leads to better-structured codes for the finite element method and facilitates their development, maintainability and expandability. The most significant advantage of OOP is in the modeling of complex physical systems such as deformation processing, where the overall complex problem is partitioned into individual sub-problems based on physical, mathematical or geometric reasoning. We first focus on the advantages of OOP for the development of scientific programs. Specific aspects of OOP, such as the inheritance mechanism, operator overloading and the use of template classes, are detailed. Then we present the approach used for the development of our finite element code through the presentation of the kinematics, conservative and constitutive laws and their respective implementation in C++. Finally, the efficiency and accuracy of our finite element program are investigated using a number of benchmark tests relative to metal forming and impact simulations.
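The partitioning described, where each physical sub-problem (here, a material law) becomes one class in an inheritance hierarchy, can be sketched as follows. DynELA itself is written in C++; Python is used here only to keep the illustration short, and the particular laws and material values are invented:

```python
from abc import ABC, abstractmethod

class ConstitutiveLaw(ABC):
    """Base class: each material law is an interchangeable sub-problem,
    in the spirit of the OOP partitioning described for DynELA."""
    @abstractmethod
    def stress(self, strain: float) -> float: ...

class LinearElastic(ConstitutiveLaw):
    def __init__(self, youngs_modulus: float):
        self.E = youngs_modulus
    def stress(self, strain):
        return self.E * strain

class ElasticPerfectlyPlastic(LinearElastic):
    """Inheritance reuses the elastic branch; only yielding is added."""
    def __init__(self, youngs_modulus: float, yield_stress: float):
        super().__init__(youngs_modulus)
        self.sigma_y = yield_stress
    def stress(self, strain):
        # Cap the elastic stress at the yield stress (perfect plasticity).
        return min(super().stress(strain), self.sigma_y)

# A solver loop can iterate over laws without knowing their concrete types.
laws = [LinearElastic(200e9), ElasticPerfectlyPlastic(200e9, 250e6)]
print([law.stress(0.002) for law in laws])
```

The solver code depends only on the abstract interface, so adding a new material model means adding a subclass, not modifying the solver, which is the maintainability and expandability benefit the abstract claims for OOP.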
NASA Astrophysics Data System (ADS)
Zeitler, T.; Kirchner, T. B.; Hammond, G. E.; Park, H.
2014-12-01
The Waste Isolation Pilot Plant (WIPP) has been developed by the U.S. Department of Energy (DOE) for the geologic (deep underground) disposal of transuranic (TRU) waste. Containment of TRU waste at the WIPP is regulated by the U.S. Environmental Protection Agency (EPA). The DOE demonstrates compliance with the containment requirements by means of performance assessment (PA) calculations. WIPP PA calculations estimate the probability and consequence of potential radionuclide releases from the repository to the accessible environment for a regulatory period of 10,000 years after facility closure. The long-term performance of the repository is assessed using a suite of sophisticated computational codes. In a broad modernization effort, the DOE has overseen the transfer of these codes to modern hardware and software platforms. Additionally, there is a current effort to establish new performance assessment capabilities through the further development of the PFLOTRAN software, a state-of-the-art massively parallel subsurface flow and reactive transport code. Improvements to the current computational environment will result in greater detail in the final models due to the parallelization afforded by the modern code. Parallelization will allow for relatively faster calculations, as well as a move from a two-dimensional calculation grid to a three-dimensional grid. The result of the modernization effort will be a state-of-the-art subsurface flow and transport capability that will serve WIPP PA into the future. Sandia National Laboratories is a multi-program laboratory managed and operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U.S. Department of Energy's National Nuclear Security Administration under contract DE-AC04-94AL85000. This research is funded by WIPP programs administered by the Office of Environmental Management (EM) of the U.S Department of Energy.
Pletti, Carolina; Sarlo, Michela; Palomba, Daniela; Rumiati, Rino; Lotto, Lorella
2015-03-01
In any modern society killing is regarded as a severe violation of the legal codes, subject to penal judgment. Therefore, it is likely that people take legal consequences into account when deciding about the hypothetical killing of one person in classic moral dilemmas, with legal concerns contributing to decision-making. In particular, by differing in the degree of intentionality and emotional salience, Footbridge- and Trolley-type dilemmas might promote differential assignment of blame and punishment while implicating the same severity of harm. The present study was aimed at comparing the neural activity, subjective emotional reactions, and behavioral choices in two groups of participants who either took (Legal group) or did not take (No Legal group) legal consequences into account when deciding on Footbridge-type and Trolley-type moral dilemmas. Stimulus- and response-locked ERPs were measured to investigate the neural activity underlying two separate phases of the decision process. No difference in behavioral choices was found between groups. However, the No Legal group reported greater overall emotional impact, associated with lower preparation for action, suggesting greater conflict between the alternative motor responses representing the different decision choices. In contrast, the Legal group showed an overall dampened affective experience during decision-making, associated with greater overall action readiness and intention to act, reflecting lower conflict in responding. On this basis, we suggest that in moral dilemmas the legal consequences of actions provide a sort of reference point on which people can rely to support a decision, independent of dilemma type. Copyright © 2015 Elsevier Inc. All rights reserved.
Pullen, Samuel J; Petruzzi, Liana; Lange, Brittany Cl; Parnarouskis, Lindsey; Dominguez, Silvia; Harris, Benjamin; Quiterio, Nicole; Durham, Michelle P; Lekpeh, Gondah; Manobah, Burgess; Slopadoe, Siede P; Diandy, Veronique C; Payne, Arthur J; Henderson, David C; Borba, Christina Pc
2016-02-01
Substance use is a significant and common problem among school-aged youths throughout Africa. Like other countries on this continent, the West African nation of Liberia is recovering from civil war. A well-educated population of young people is critical to the recovery efforts and long-term success of Liberia. Substance use by school-aged youths has important public health consequences that could undermine Liberia's post-conflict recovery efforts. We wanted to better understand the culturally significant themes and subthemes related to substance use among youths attending public schools in Monrovia, Liberia. A qualitative research design was used to collect data from 72 students attending public school in Monrovia, Liberia. Nine focus groups of 6-8 students from three public schools were facilitated using a semi-structured format to guide discussions on substance use. Student narratives were translated, and recurring themes and subthemes were coded and analyzed. The four emergent themes described in this study were: (1) behaviors associated with substance use; (2) consequences associated with individual use; (3) consequences of substance use that affected the school milieu; and (4) school-related factors that were protective from substance use. Subthemes associated with substance use included concealment of substances, intoxication and disruption of the classroom environment, expulsion from school, school drop-out, and school as protective against substance use. Liberian school-aged youths described important themes and subthemes associated with substance use occurring within the school milieu. These data have germane public health ramifications and could help inform larger epidemiologic study methods and public health interventions for Liberia and countries with similar profiles.
Visual adaptation and face perception
Webster, Michael A.; MacLeod, Donald I. A.
2011-01-01
The appearance of faces can be strongly affected by the characteristics of faces viewed previously. These perceptual after-effects reflect processes of sensory adaptation that are found throughout the visual system, but which have been considered only relatively recently in the context of higher level perceptual judgements. In this review, we explore the consequences of adaptation for human face perception, and the implications of adaptation for understanding the neural-coding schemes underlying the visual representation of faces. The properties of face after-effects suggest that they, in part, reflect response changes at high and possibly face-specific levels of visual processing. Yet, the form of the after-effects and the norm-based codes that they point to show many parallels with the adaptations and functional organization that are thought to underlie the encoding of perceptual attributes like colour. The nature and basis for human colour vision have been studied extensively, and we draw on ideas and principles that have been developed to account for norms and normalization in colour vision to consider potential similarities and differences in the representation and adaptation of faces. PMID:21536555
DOE Office of Scientific and Technical Information (OSTI.GOV)
Di Piazza, Ivan; Buehler, Leo
2000-09-15
The buoyancy-driven magnetoconvection in the cross section of an infinitely long vertical square duct is investigated numerically using the CFX code package. The implementation of a magnetohydrodynamic (MHD) problem in CFX is discussed, with particular reference to the Lorentz forces and the electric potential boundary conditions for arbitrary electrical conductivity of the walls. The method proposed is general and applies to arbitrary geometries with an arbitrary orientation of the magnetic field. Results for fully developed flow under various thermal boundary conditions are compared with asymptotic analytical solutions. The comparison shows that the asymptotic analysis is confirmed for highly conducting walls, as high-velocity jets occur at the side walls. For weakly conducting walls, the side layers become more conducting than the side walls, and strong electric currents flow within these layers parallel to the magnetic field. As a consequence, the velocity jets are suppressed, and the core solution is only corrected by the viscous forces near the wall. These results demonstrate a working implementation of MHD in CFX.
Numerical simulation of turbulent gas flames in tubes.
Salzano, E; Marra, F S; Russo, G; Lee, J H S
2002-12-02
Computational fluid dynamics (CFD) is an emerging technique for predicting the possible consequences of gas explosions, and it is often considered a powerful and accurate tool for obtaining detailed results. However, systematic analyses of the reliability of this approach for real-scale industrial configurations are still needed. Furthermore, few experimental data are available for comparison and validation. In this work, a set of well-documented experimental data on flame acceleration in obstacle-filled tubes containing flammable gas-air mixtures has been simulated. In these experiments, terminal steady flame speeds corresponding to different propagation regimes were observed, allowing a clear and prompt characterisation of the numerical results with respect to numerical parameters (such as grid resolution), geometrical parameters (such as blockage ratio), and mixture parameters (such as mixture reactivity). The CFD code AutoReaGas was used for the simulations. Numerical predictions were compared with the available experimental data, giving some insight into the code's accuracy. Computational results are satisfactory for the relatively slow turbulent deflagration regimes and only fair when the choking regime is observed, whereas transition to quasi-detonation or Chapman-Jouguet (CJ) detonation was never predicted.
Electron acceleration in the Solar corona - 3D PiC code simulations of guide field reconnection
NASA Astrophysics Data System (ADS)
Alejandro Munoz Sepulveda, Patricio
2017-04-01
The efficient electron acceleration in the solar corona detected by means of hard X-ray emission is still not well understood. Magnetic reconnection through current sheets is one of the proposed production mechanisms of non-thermal electrons in solar flares. Previous works in this direction were based mostly on test-particle calculations or 2D fully kinetic PiC simulations. We have now studied the consequences of self-generated current-aligned instabilities on the electron acceleration mechanisms in 3D magnetic reconnection. To this end, we carried out 3D Particle-in-Cell (PiC) numerical simulations of force-free reconnecting current sheets, appropriate for the description of solar coronal plasmas. We find efficient electron energization, evidenced by the formation of a non-thermal power-law tail with a hard spectral index smaller than -2 in the electron energy distribution function. We discuss and compare the influence of the parallel electric field versus the curvature and gradient drifts in the guiding-center approximation on the overall acceleration, and their dependence on different plasma parameters.
High-speed 3D surface measurement with a fringe projection based optical sensor
NASA Astrophysics Data System (ADS)
Bräuer-Burchardt, Christian; Heist, Stefan; Kühmstedt, Peter; Notni, Gunther
2014-05-01
A new optical sensor based on the fringe projection technique for accurate and fast measurement of object surfaces, mainly for industrial inspection tasks, is introduced. High-speed fringe projection and image recording at 180 Hz allow 3D rates up to 60 Hz. The high measurement velocity was achieved by systematic fringe code reduction and parallel data processing. The image sequence length was reduced by omitting the Gray-code sequence, exploiting the geometric restrictions of the measurement objects. The sensor realizes three different measurement fields between 20 x 20 mm2 and 40 x 40 mm2, with lateral spatial resolutions between 10 μm and 20 μm at the same working distance. The measurable object height range is between +/- 0.5 mm and +/- 2 mm. Height resolution between 1 μm and 5 μm can be achieved, depending on the properties of the measurement objects. The sensor may be used, e.g., for real-time industrial quality inspection of conductor boards or plugs.
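Fringe projection sensors of this kind recover depth from the phase of projected sinusoidal patterns. The textbook four-step phase-shift evaluation below is an illustrative sketch only, not the sensor's actual (reduced) fringe code; the pixel values are synthetic:

```python
import math

def four_step_phase(i0, i1, i2, i3):
    """Wrapped fringe phase from four intensity samples taken at
    projector phase shifts of 0, 90, 180 and 270 degrees."""
    return math.atan2(i3 - i1, i0 - i2)

# Synthetic pixel: background 100, fringe modulation 50, true phase 0.7 rad
true_phase = 0.7
samples = [100 + 50 * math.cos(true_phase + k * math.pi / 2) for k in range(4)]
recovered = four_step_phase(*samples)  # recovers 0.7 rad
```

The wrapped phase still needs unwrapping; omitting the Gray-code sequence, as described above, means the unwrapping must instead rely on geometric constraints of the measured object.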
[Increasingly appropriate depiction of rheumatology for G-DRG reimbursement 2006].
Lakomek, H J; Fiori, W; Buscham, K; Hülsemann, J; Köneke, N; Liman, W; Märker-Hermann, E; Roeder, N
2006-02-01
Starting with the second year of the so-called "convergence period", specialized rheumatological treatment is now represented by a specific DRG (197Z) in the German G-DRG system. The definition of this DRG is based on the procedure codes for the complex and multimodal treatment of rheumatological inpatients (OPS 8-983 and 8-986). This will result in a more appropriate reimbursement of rheumatological treatment. The implementation of specialized rheumatological treatment can be regarded as exemplary for the incorporation of medical specializations into DRG systems. The first step is the definition of the characteristics by procedure codes, which can subsequently be utilized within the grouping algorithm. Once an inadequate representation of a medical specialization within the DRG system has been demonstrated, a new DRG can be established. As no cost data were available, the calculation of a cost weight for the new G-DRG 197Z is not yet possible for 2006. Hence, reimbursement has to be negotiated between the individual hospital and the budget commission of the health insurers. In this context, the use of clinical pathways is considered helpful.
Thick Galactic Cosmic Radiation Shielding Using Atmospheric Data
NASA Technical Reports Server (NTRS)
Youngquist, Robert C.; Nurge, Mark A.; Starr, Stanley O.; Koontz, Steven L.
2013-01-01
NASA is concerned with protecting astronauts from the effects of galactic cosmic radiation and has expended substantial effort in the development of computer models to predict the shielding obtained from various materials. However, these models were only developed for shields up to about 120 g/cm2 in thickness and have predicted that shields of this thickness are insufficient to provide adequate protection for extended deep space flights. Consequently, effort is underway to extend the range of these models to thicker shields, and experimental data are required to help confirm the resulting code. In this paper, empirically obtained effective dose measurements from aircraft flights in the atmosphere are used to obtain the radiation shielding function of the earth's atmosphere, a very thick shield. Obtaining this result required solving an inverse problem, and the method for solving it is presented. The results are shown to be in agreement with the current code in the ranges where they overlap. These results are then checked and used to predict the radiation dosage under thick shields such as planetary regolith and the atmosphere of Venus.
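The paper's inverse-problem method is more involved, but the forward picture — dose under a thick shield decaying roughly exponentially with areal density — can be sketched with a log-linear least-squares fit. The attenuation length and dose values below are synthetic illustrations, not the paper's data:

```python
import math

def fit_attenuation(depths, doses):
    """Fit dose = D0 * exp(-x / L) by least squares on log(dose),
    returning (D0, L); x is areal density in g/cm^2."""
    n = len(depths)
    ys = [math.log(d) for d in doses]
    xbar, ybar = sum(depths) / n, sum(ys) / n
    sxx = sum((x - xbar) ** 2 for x in depths)
    sxy = sum((x - xbar) * (y - ybar) for x, y in zip(depths, ys))
    slope = sxy / sxx
    return math.exp(ybar - slope * xbar), -1.0 / slope

# Synthetic dose rates under increasing atmospheric depth (g/cm^2)
depths = [200.0, 400.0, 600.0, 800.0, 1000.0]
doses = [5.0 * math.exp(-x / 250.0) for x in depths]
D0, L = fit_attenuation(depths, doses)  # recovers D0 = 5.0, L = 250.0
```

A single-exponential model is only a first approximation; the atmospheric shielding function derived in the paper need not be exponential, which is precisely why empirical data matter.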
Computation of turbulent boundary layers employing the defect wall-function method. M.S. Thesis
NASA Technical Reports Server (NTRS)
Brown, Douglas L.
1994-01-01
In order to decrease the overall computational time requirements of a spatially marching parabolized Navier-Stokes finite-difference computer code when applied to turbulent fluid flow, a wall-function methodology, originally proposed by R. Barnwell, was implemented. This numerical effort increases computational speed and calculates reasonably accurate wall shear stress spatial distributions and boundary-layer profiles. Since the wall shear stress is determined analytically from the wall-function model, the computational grid near the wall is not required to spatially resolve the laminar viscous sublayer. Consequently, a substantially increased computational integration step size is achieved, resulting in a considerable decrease in net computational time. This wall-function technique is demonstrated for adiabatic flat plate test cases from Mach 2 to Mach 8. These test cases are verified analytically employing: (1) Eckert reference method solutions, (2) the experimental turbulent boundary-layer data of Mabey, and (3) finite-difference computational code solutions with fully resolved laminar viscous sublayers. Additionally, results have been obtained for two pressure-gradient cases: (1) an adiabatic expansion corner and (2) an adiabatic compression corner.
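Barnwell's defect wall-function formulation involves more machinery than can be shown here, but the core trick — recovering wall shear stress analytically instead of resolving the sublayer — can be illustrated with a Newton iteration on the incompressible log law. The constants and flow state below are illustrative, not taken from the thesis:

```python
import math

KAPPA, B = 0.41, 5.0  # standard log-law constants

def friction_velocity(u, y, nu, u_tau=0.05, tol=1e-12):
    """Solve the log law u/u_tau = ln(y*u_tau/nu)/kappa + B for u_tau
    by Newton iteration, given the velocity u at wall distance y."""
    for _ in range(100):
        f = u / u_tau - math.log(y * u_tau / nu) / KAPPA - B
        dfdu = -u / u_tau**2 - 1.0 / (KAPPA * u_tau)
        step = f / dfdu
        u_tau -= step
        if abs(step) < tol:
            break
    return u_tau

# Wall shear stress from the first grid point, assumed to lie in the log layer
rho, nu = 1.2, 1.5e-5              # air: density [kg/m^3], kinematic viscosity [m^2/s]
u_tau = friction_velocity(u=10.0, y=1.0e-3, nu=nu)
tau_w = rho * u_tau**2             # wall shear stress [Pa]
```

Because tau_w comes from this algebraic solve rather than from a resolved velocity gradient at the wall, the first grid point can sit in the log layer, which is what permits the larger marching step sizes described above.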
Discrete space charge affected field emission: Flat and hemisphere emitters
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jensen, Kevin L., E-mail: kevin.jensen@nrl.navy.mil; Shiffler, Donald A.; Tang, Wilkin
Models of space-charge-affected thermal-field emission from protrusions, able to incorporate the effects of both surface roughness and elongated field emitter structures in beam optics codes, are desirable but difficult. The models proposed here treat the meso-scale diode region separately from the micro-scale regions characteristic of the emission sites. The consequences of discrete emission events are given for both one-dimensional (sheets of charge) and three-dimensional (rings of charge) models: in the former, results converge to the steady-state conditions found by theory (e.g., Rokhlenko et al. [J. Appl. Phys. 107, 014904 (2010)]) but show oscillatory structure as they do so. Surface roughness or geometric features are handled using a ring-of-charge model, from which the image charges are found and used to modify the apex field and emitted current. The roughness model is shown to have additional constraints related to the discrete nature of electron charge. The ability of a unit cell model to treat field emitter structures and incorporate surface roughness effects inside a beam optics code is assessed.
Study of the solar flares effect on VLF radio signal propagating along NRK-ALG path using LWPC code
NASA Astrophysics Data System (ADS)
Bouderba, Y.; NaitAmor, S.; Tribeche, M.
2016-07-01
The X-ray emissions of solar flares penetrate down into the D region of the ionosphere (60-90 km altitude) and affect propagating very low frequency (VLF) radio signals. In this paper, we present the effect of solar flares on the signal mode composition of the NRK-ALG path during the period from 2007 to 2013. In the theory underlying the Long Wavelength Propagation Capability (LWPC) code, the VLF signal is a sum of discrete modes that propagate to the receiver with different attenuation coefficients. Particular attention is therefore paid to the behavior of these coefficients under solar flares. The simulations clarify the role of the signal mode composition in the fading displacement, since the latter is a consequence of destructive mode interference. The sign (positive or negative) of the perturbed signal parameters (amplitude and phase) is found to depend on the distance between the transmitter and the receiver. Finally, we give the Wait parameters and the electron density variations as a function of solar flares.
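In this waveguide-mode picture, the received field is a superposition of modes, each with its own attenuation rate and phase constant, and fading minima arise where modes interfere destructively. A toy sketch follows; the mode parameters are invented for illustration and this is not LWPC output:

```python
import cmath

def vlf_field(distance_km, modes):
    """Superpose waveguide modes at a given great-circle distance.
    Each mode is (amplitude, attenuation in dB per 1000 km,
    phase constant in rad/km); returns the field magnitude."""
    total = 0j
    for amp, atten_db_per_mm, k in modes:
        a = 10 ** (-atten_db_per_mm * distance_km / 1000.0 / 20.0)
        total += amp * a * cmath.exp(1j * k * distance_km)
    return abs(total)

# One dominant mode plus a weaker second mode: interference vs distance
modes = [(1.0, 2.0, 0.021), (0.4, 4.0, 0.023)]
signal = [vlf_field(d, modes) for d in range(1000, 5001, 500)]
```

Shifting a mode's attenuation or phase constant (as a flare does, by lowering the effective reflection height) moves the interference minima along the path, which is why the sign of the observed amplitude perturbation depends on the transmitter-receiver distance.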
A scalable population code for time in the striatum.
Mello, Gustavo B M; Soares, Sofia; Paton, Joseph J
2015-05-04
To guide behavior and learn from its consequences, the brain must represent time over many scales. Yet, the neural signals used to encode time in the seconds-to-minute range are not known. The striatum is a major input area of the basal ganglia associated with learning and motor function. Previous studies have also shown that the striatum is necessary for normal timing behavior. To address how striatal signals might be involved in timing, we recorded from striatal neurons in rats performing an interval timing task. We found that neurons fired at delays spanning tens of seconds and that this pattern of responding reflected the interaction between time and the animals' ongoing sensorimotor state. Surprisingly, cells rescaled responses in time when intervals changed, indicating that striatal populations encoded relative time. Moreover, time estimates decoded from activity predicted timing behavior as animals adjusted to new intervals, and disrupting striatal function led to a decrease in timing performance. These results suggest that striatal activity forms a scalable population code for time, providing timing signals that animals use to guide their actions.
2018-01-01
Posttranslational modifications resulting from oxidation of proteins (Ox-PTMs) are present intracellularly under conditions of oxidative stress as well as basal conditions. In the past, these modifications were thought to be generic protein damage, but it has become increasingly clear that Ox-PTMs can have specific physiological effects. It is an arduous task to distinguish between the two cases, as multiple Ox-PTMs occur simultaneously on the same protein, convoluting analysis. Genetic code expansion (GCE) has emerged as a powerful tool to overcome this challenge as it allows for the site-specific incorporation of an Ox-PTM into translated protein. The resulting homogeneously modified protein products can then be rigorously characterized for the effects of individual Ox-PTMs. We outline the strengths and weaknesses of GCE as they relate to the field of oxidative stress and Ox-PTMs. An overview of the Ox-PTMs that have been genetically encoded and applications of GCE to the study of Ox-PTMs, including antibody validation and therapeutic development, is described. PMID:29849913
Selecting a proper design period for heliostat field layout optimization using Campo code
NASA Astrophysics Data System (ADS)
Saghafifar, Mohammad; Gadalla, Mohamed
2016-09-01
In this paper, different approaches to calculating the cosine factor, which is utilized in the Campo code to expand the heliostat field layout and maximize its annual thermal output, are considered. Three heliostat fields containing different numbers of mirrors are taken into consideration. The cosine factor is determined using instantaneous and time-averaged approaches. For the instantaneous method, different design days and design hours are selected. For the time-averaged method, daily, monthly, seasonal, and yearly averaging periods are considered. Results indicate that instantaneous methods are more appropriate for small-scale heliostat field optimization. Consequently, it is proposed to treat the design period as a second design variable to ensure the best outcome. For medium- and large-scale heliostat fields, selecting an appropriate design period is more important, and it is therefore more reliable to select one of the recommended time-averaged methods to optimize the field layout. Optimum annual weighted efficiencies for the small, medium, and large heliostat fields, containing 350, 1460, and 3450 mirrors, are 66.14%, 60.87%, and 54.04%, respectively.
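The cosine factor being averaged is the cosine of the half-angle between the sun vector and the heliostat-to-receiver vector, since the mirror normal must bisect the two. A minimal sketch of the instantaneous versus time-averaged evaluation follows; the vectors and averaging set are illustrative, and this is not the Campo code itself:

```python
import math

def cosine_factor(sun, helio_to_recv):
    """Instantaneous cosine factor: cosine of the half-angle between the
    unit sun vector and the unit heliostat-to-receiver vector."""
    dot = sum(s * t for s, t in zip(sun, helio_to_recv))
    return math.cos(0.5 * math.acos(max(-1.0, min(1.0, dot))))

def time_averaged_cosine(sun_vectors, helio_to_recv):
    """Time-averaged cosine factor over a set of sun positions
    (e.g. hourly samples over a design day, month, season or year)."""
    return sum(cosine_factor(s, helio_to_recv) for s in sun_vectors) / len(sun_vectors)

# Sun at zenith, receiver straight overhead: no cosine loss
overhead = cosine_factor((0, 0, 1), (0, 0, 1))   # 1.0
# Sun on the horizon, receiver overhead: mirror tilted 45 degrees
grazing = cosine_factor((1, 0, 0), (0, 0, 1))    # cos(45 deg), about 0.707
```

Choosing between an instantaneous and a time-averaged factor amounts to choosing which set of sun vectors is fed to the second function, which is exactly the design-period question the paper investigates.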
NASCAP simulation of PIX 2 experiments
NASA Technical Reports Server (NTRS)
Roche, J. C.; Mandell, M. J.
1985-01-01
The latest version of the NASCAP/LEO digital computer code used to simulate the PIX 2 experiment is discussed. NASCAP is a finite-element code, and previous versions were restricted to a single fixed mesh size. As a consequence, the resolution was dictated by the largest physical dimension to be modeled. The latest version of NASCAP/LEO can subdivide selected regions. This permitted modeling the overall Delta launch vehicle in the primary computational grid at coarse resolution, with subdivided regions at finer resolution used to capture the details of the experiment module configuration. Langmuir probe data from the flight were used to estimate the space plasma density and temperature and the Delta ground potential relative to the space plasma. This information is needed as input to NASCAP. Because of the uncertainty and variability in the values of these parameters, it was necessary to explore a range around the nominal values in order to determine the variation in current collection. The flight data from PIX 2 were also compared with the results of the NASCAP simulation.
A Neutronic Program for Critical and Nonequilibrium Study of Mobile Fuel Reactors: The Cinsf1D Code
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lecarpentier, David; Carpentier, Vincent
2003-01-15
Molten salt reactors (MSRs) have the distinction of having a liquid fuel that is also the coolant. The transport of delayed-neutron precursors by the fuel modifies the precursors' equation. As a consequence, it is necessary to adapt the methods currently used for solid-fuel reactors in order to perform critical or kinetic calculations for an MSR. A program is presented for which this adaptation has been carried out within the framework of two-energy-group diffusion theory in one spatial dimension. This program has been called Cinsf1D (Cinetique pour reacteur a sels fondus 1D).
Optogenetics in animal model of alcohol addiction
NASA Astrophysics Data System (ADS)
Nalberczak, Maria; Radwanska, Kasia
2014-11-01
Our understanding of the neuronal and molecular basis of alcohol addiction is still not satisfactory. As a consequence, we still lack a successful therapy for alcoholism. One of the reasons for this state of affairs is the lack of appropriate animal models that would allow in-depth analysis of the biological basis of addiction. Here we present our efforts to create an animal model of alcohol addiction in an automated learning device, the IntelliCage setup. Applying this model to optogenetically modified mice, with remotely controlled, light-based regulation of selected neuronal populations, may lead to very precise identification of the neuronal circuits involved in coding addiction-related behaviors.
Draft genome sequence of Enterococcus faecium strain LMG 8148.
Michiels, Joran E; Van den Bergh, Bram; Fauvart, Maarten; Michiels, Jan
2016-01-01
Enterococcus faecium, traditionally considered a harmless gut commensal, is emerging as an important nosocomial pathogen showing increasing rates of multidrug resistance. We report the draft genome sequence of E. faecium strain LMG 8148, isolated in 1968 from a human in Gothenburg, Sweden. The draft genome has a total length of 2,697,490 bp, a GC-content of 38.3 %, and 2,402 predicted protein-coding sequences. The isolation of this strain predates the emergence of E. faecium as a nosocomial pathogen. Consequently, its genome can be useful in comparative genomic studies investigating the evolution of E. faecium as a pathogen.
The role of hot electrons in the dynamics of a laser-driven strong converging shock
Llor Aisa, E.; Ribeyre, X.; Duchateau, G.; ...
2017-11-30
Experiments on strong shock excitation in spherical plastic targets conducted at the Omega Laser Facility are interpreted with the radiation–hydrodynamics code CHIC to account for parametric instabilities excitation and hot-electron generation. The effects of hot electrons on the shock-pressure amplification and upstream preheat are analyzed. In this study, it is demonstrated that both effects contribute to an increase in shock velocity. Comparison of the measured laser reflectivity and shock flash time with numerical simulations make it possible to reconstitute the time history of the ablation and shock pressures. Finally, consequences of this analysis for the shock-ignition target design are discussed.
High-Fidelity Micromechanics Model Developed for the Response of Multiphase Materials
NASA Technical Reports Server (NTRS)
Aboudi, Jacob; Pindera, Marek-Jerzy; Arnold, Steven M.
2002-01-01
A new high-fidelity micromechanics model has been developed under funding from the NASA Glenn Research Center for predicting the response of multiphase materials with arbitrary periodic microstructures. The model's analytical framework is based on the homogenization technique, but the method of solution for the local displacement and stress fields borrows concepts previously employed in constructing the higher order theory for functionally graded materials. The resulting closed-form macroscopic and microscopic constitutive equations, valid for both uniaxial and multiaxial loading of periodic materials with elastic and inelastic constitutive phases, can be incorporated into a structural analysis computer code. Consequently, this model now provides an alternative, accurate method.
Signatures for Black Hole Production from Hadronic Observables at the Large Hadron Collider
NASA Astrophysics Data System (ADS)
Humanic, Thomas J.; Koch, Benjamin; Stöcker, Horst
The concept of Large Extra Dimensions (LED) provides a way of solving the Hierarchy Problem, which concerns the weakness of gravity compared with the strong and electroweak forces. A consequence of LED is that miniature black holes (mini-BHs) may be produced at the Large Hadron Collider in p + p collisions. The present work uses the CHARYBDIS mini-BH generator code to simulate the hadronic signal that might be expected in a mid-rapidity particle tracking detector from the decay of these exotic objects, if indeed they are produced. An estimate is also given for Pb + Pb collisions.
Brouillet, Denis; Milhau, Audrey; Brouillet, Thibaut
2015-01-01
Since the work of Casasanto (2009), it is now well established that valence and laterality are associated. Participants tend to prefer objects presented on their dominant side over items presented on their non-dominant side, and to place good items on their dominant side and bad items on the other side. Several studies highlight that those associations of valence and laterality are accounted for by the greater motor fluency of the dominant hand and various studies noted that these associations could be reversed depending on the way people interact with their environment. Consistently with the Theory of Event Coding, the aim of this work is to show that the consequences of motor actions could also reverse the associations between valence and laterality. Thus, if participants had to place two animals (one good, one bad) on two supports, one stable (no risk of falling), one unstable (risk of falling), we hypothesized that the good item would be placed on the stable support, regardless of the side where it would be put (i.e., on the dominant or non-dominant side). We expected the opposite for the bad item. The results of two experiments are consistent with this prediction and support the claim that the consequences of motor action bias the hedonic connotation of our dominant side. PMID:25798122
Li, Linxin; Rothwell, Peter M
2016-05-16
Objective: To determine the accuracy of coding of admissions for stroke on weekdays versus weekends and any impact on apparent outcome. Design: Prospective population based stroke incidence study and a scoping review of previous studies of weekend effects in stroke. Setting: Primary and secondary care of all individuals registered with nine general practices in Oxfordshire, United Kingdom (OXVASC, the Oxford Vascular Study). Participants: All patients with clinically confirmed acute stroke in OXVASC identified with multiple overlapping methods of ascertainment in 2002-14 versus all acute stroke admissions identified by hospital diagnostic and mortality coding alone during the same period. Main outcome measures: Accuracy of administrative coding data for all patients with confirmed stroke admitted to hospital in OXVASC; difference between rates of "false positive" or "false negative" coding for weekday and weekend admissions; impact of inaccurate coding on apparent case fatality at 30 days in weekday versus weekend admissions; weekend effects on outcomes in patients with confirmed stroke admitted to hospital in OXVASC and impacts of other potential biases compared with those in the scoping review. Results: Among the study population of 92 728, 2373 episodes of acute stroke were ascertained in OXVASC, of which 826 (34.8%) mainly minor events were managed without hospital admission, 60 (2.5%) occurred out of the area or abroad, and 195 (8.2%) occurred in hospital during an admission for a different reason. Of 1292 local hospital admissions for acute stroke, 973 (75.3%) were correctly identified by administrative coding. There was no bias in distribution of weekend versus weekday admission of the 319 strokes missed by coding. Of 1693 admissions for stroke identified by coding, 1055 (62.3%) were confirmed to be acute strokes after case adjudication.
Among the 638 false positive coded cases, patients were more likely to be admitted on weekdays than at weekends (536 (41.0%) v 102 (26.5%); P<0.001), partly because of weekday elective admissions after previous stroke being miscoded as new stroke episodes (267 (49.8%) v 26 (25.5%); P<0.001). The 30 day case fatality after these elective admissions was lower than after confirmed acute stroke admissions (11 (3.8%) v 233 (22.1%); P<0.001). Consequently, relative 30 day case fatality for weekend versus weekday admissions differed (P<0.001) between correctly coded acute stroke admissions and false positive coding cases. Results were consistent when only the 1327 emergency cases identified by "admission method" from coding were included, with more false positive cases with low case fatality (35 (14.7%)) being included for weekday versus weekend admissions (190 (19.5%) v 48 (13.7%), P<0.02). Among all acute stroke admissions in OXVASC, there was no imbalance in baseline stroke severity for weekends versus weekdays and no difference in case fatality at 30 days (adjusted odds ratio 0.85, 95% confidence interval 0.63 to 1.15; P=0.30) or any adverse "weekend effect" on modified Rankin score at 30 days (0.78, 0.61 to 0.99; P=0.04) or one year (0.76, 0.59 to 0.98; P=0.03) among incident strokes. Retrospective studies of UK administrative hospital coding data to determine "weekend effects" on outcome in acute medical conditions, such as stroke, can be undermined by inaccurate coding, which can introduce biases that cannot be reliably dealt with by adjustment for case mix.
Rasal, Kiran D; Shah, Tejas M; Vaidya, Megha; Jakhesara, Subhash J; Joshi, Chaitanya G
2015-06-01
Recent advances in high-throughput sequencing technology are accelerating the study of genome-wide variation in several organisms and its associated consequences. In the present study, mutations in TGFBR3 showing significant association with the FCR trait in chicken during exome sequencing were further analyzed. Out of four SNPs, one nsSNP, p.Val451Leu, was found in the coding region of TGFBR3. The in silico tools SnpSift and PANTHER predicted it to be deleterious (0.04) and tolerated, respectively, while I-Mutant predicted decreased protein stability. The TGFBR3 I-TASSER model has a C-score of 0.85 and was validated using PROCHECK. Based on MD simulation, the mutant protein structure deviated from the native with an RMSD of 0.08 Å, owing to changes in the H-bonding distances of the mutant residue. Docking of TGFBR3 with its interaction partner TGFBR2 indicated that the mutant required more global energy. The present study therefore provides useful information about functional SNPs that have an impact on FCR traits.
Technology and medication errors: impact in nursing homes.
Baril, Chantal; Gascon, Viviane; St-Pierre, Liette; Lagacé, Denis
2014-01-01
The purpose of this paper is to study a medication distribution technology's (MDT) impact on medication errors reported in public nursing homes in Québec Province. The work was carried out in six nursing homes (800 patients). Medication error data were collected from nursing staff through a voluntary reporting process before and after MDT was implemented. The errors were analysed by total errors, medication error type, severity, and patient consequences. A statistical analysis verified whether there was a significant difference between the variables before and after introducing MDT. The results show that the MDT detected medication errors. The authors' analysis also indicates that errors are detected more rapidly, resulting in less severe consequences for patients. MDT is a step towards safer and more efficient medication processes. Our findings should convince healthcare administrators to implement technologies such as electronic prescribing or bar code medication administration systems to improve medication processes and to provide better healthcare to patients. Few studies have been carried out in long-term healthcare facilities such as nursing homes. The authors' study extends what is known about MDT's impact on medication errors in nursing homes.
Thermal-hydraulic modeling needs for passive reactors
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kelly, J.M.
1997-07-01
The U.S. Nuclear Regulatory Commission has received an application for design certification from the Westinghouse Electric Corporation for an Advanced Light Water Reactor design known as the AP600. As part of the design certification process, the USNRC uses its thermal-hydraulic system analysis codes to independently audit the vendor calculations. The focus of this effort has been the small break LOCA transients that rely upon the passive safety features of the design to depressurize the primary system sufficiently so that gravity driven injection can provide a stable source for long term cooling. Of course, large break LOCAs have also been considered, but as the involved phenomena do not appear to be appreciably different from those of current plants, they are not discussed in this paper. Although the SBLOCA scenario does not appear to threaten core coolability - indeed, heatup is not even expected to occur - there have been concerns as to the performance of the passive safety systems. For example, the passive systems drive flows with small heads, consequently requiring more precision in the analysis methods for passive plants as compared to current plants with active systems. For the analysis of SBLOCAs and operating transients, the USNRC uses the RELAP5 thermal-hydraulic system analysis code. To assure the applicability of RELAP5 to the analysis of these transients for the AP600 design, a four year long program of code development and assessment has been undertaken.
NASA Technical Reports Server (NTRS)
Plante, Ianik; Ponomarev, Artem L.; Wu, Honglu; Blattnig, Steve; George, Kerry
2014-01-01
The formation of DNA double-strand breaks (DSBs) and chromosome aberrations is an important consequence of ionizing radiation. To simulate DSBs and the formation of chromosome aberrations, we have recently merged the codes RITRACKS (Relativistic Ion Tracks) and NASARTI (NASA Radiation Track Image). RITRACKS is a stochastic code developed to simulate detailed event-by-event radiation track structure. It is used to calculate the dose in voxels of 20 nm in a volume containing simulated chromosomes. The number of tracks in the volume is calculated for each simulation by sampling a Poisson distribution, with the distribution parameter obtained from the irradiation dose, ion type and energy. NASARTI generates the chromosomes present in a cell nucleus by random walks with 20 nm steps, corresponding to the size of the dose voxels. The generated chromosomes are located within domains which may intertwine, and each segment of the random walks corresponds to approximately 2,000 DNA base pairs. NASARTI uses the pre-calculated dose at each voxel to calculate the probability of DNA damage at each random-walk segment. Using the locations of double-strand breaks, possible rejoining between damaged segments is evaluated. This yields various types of chromosome aberrations, including deletions, inversions and exchanges. By performing the calculations for various types of radiation, it will be possible to obtain relative biological effectiveness (RBE) values for several types of chromosome aberrations.
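The Poisson sampling step used to draw the number of tracks per simulation can be sketched as follows; the dose-to-mean-track-count constant is an illustrative assumption, not the RITRACKS value (which would depend on ion type, energy/LET and target cross-section):

```python
import numpy as np

rng = np.random.default_rng(42)

def sample_track_count(dose_gy, mean_tracks_per_gy=150.0):
    """Sample the number of ion tracks crossing the target volume.
    The mean is assumed proportional to dose; the proportionality
    constant here is hypothetical, chosen only for illustration."""
    lam = dose_gy * mean_tracks_per_gy
    return rng.poisson(lam)

# For a fixed dose the track count fluctuates from simulation to simulation,
# but its sample mean approaches the Poisson parameter lambda.
counts = [sample_track_count(1.0) for _ in range(10000)]
mean_count = float(np.mean(counts))
print(round(mean_count))
```

Sampling the track number rather than fixing it is what makes each simulated irradiation a distinct stochastic realization.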
NASA Technical Reports Server (NTRS)
Burris, John
2011-01-01
We report the use of a return-to-zero pseudo noise (RZPN) modulation technique for making range resolved measurements of CO2 within the planetary boundary layer (PBL) using commercial, off-the-shelf components. Conventional range resolved DIAL measurements require laser pulse widths that are significantly shorter than the desired spatial resolution and necessitate using pulses whose temporal spacing is such that scattered returns from only a single pulse are observed by the receiver at any one time (for the PBL, pulse separations must be greater than approximately 20 microseconds). This imposes significant operational limitations when using currently available fiber lasers because of the resulting low duty cycle (less than approximately 0.0005) and consequent low average laser output power. The RZPN modulation technique enables a fiber laser to operate at much higher duty cycles (approaching 0.04), thereby more effectively utilizing the amplifier's output. This increases the counts received by approximately two orders of magnitude. Our approach involves employing two distributed feedback (DFB) lasers, each modulated by a different RZPN code, whose outputs are then amplified by a CW fiber amplifier. One laser is tuned to a CO2 absorption line; the other operates offline, thereby permitting the simultaneous acquisition of both on- and offline signals using independent RZPN codes. This minimizes the impact of atmospheric turbulence on the measurement. The on- and offline signals are retrieved by deconvolving the return signal using the appropriate kernels.
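The kernel-based retrieval described above can be illustrated with a toy pseudo-noise correlation; the 7-chip m-sequence, the single-scatterer range profile, and the noise-free channel are simplifying assumptions, not the instrument's actual codes or data:

```python
import numpy as np

def m_sequence(taps=(2, 0), length=7, state=(1, 0, 0)):
    """7-chip maximal-length (m-)sequence from a 3-stage LFSR (x^3 + x + 1)."""
    state = list(state)
    seq = []
    for _ in range(length):
        seq.append(state[-1])
        fb = state[taps[0]] ^ state[taps[1]]
        state = [fb] + state[:-1]
    return np.array(seq)

code = m_sequence()                        # on/off (return-to-zero) chips
profile = np.zeros(7)
profile[2] = 1.0                           # toy range profile: one scatterer at bin 2
# Received signal: circular convolution of the transmitted code with the profile
received = np.real(np.fft.ifft(np.fft.fft(code) * np.fft.fft(profile)))
# Retrieval kernel: bipolar replica of the code (standard PN-correlation trick)
kernel = 2 * code - 1
recovered = np.real(np.fft.ifft(np.fft.fft(received) * np.conj(np.fft.fft(kernel))))
print(int(np.argmax(recovered)))           # scatterer recovered at range bin 2
```

With an m-sequence, the circular correlation of the on/off code against its bipolar replica is two-valued, so the scatterer's range bin is recovered exactly here; real retrievals deconvolve noisy photon-count returns with the appropriate on- and offline kernels.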
Unified transform architecture for AVC, AVS, VC-1 and HEVC high-performance codecs
NASA Astrophysics Data System (ADS)
Dias, Tiago; Roma, Nuno; Sousa, Leonel
2014-12-01
A unified architecture for fast and efficient computation of the set of two-dimensional (2-D) transforms adopted by the most recent state-of-the-art digital video standards is presented in this paper. In contrast to other designs with similar functionality, the presented architecture is supported on a scalable, modular and completely configurable processing structure. This flexible structure not only allows the architecture to be easily reconfigured to support different transform kernels, but also permits its resizing to efficiently support transforms of different orders (e.g. order-4, order-8, order-16 and order-32). Consequently, it is not only highly suitable for realizing high-performance multi-standard transform cores, but it also offers highly efficient implementations of specialized processing structures addressing only the reduced subset of transforms used by a specific video standard. The experimental results obtained by prototyping several configurations of this processing structure in a Xilinx Virtex-7 FPGA show the superior performance and hardware efficiency levels provided by the proposed unified architecture for the implementation of transform cores for the Advanced Video Coding (AVC), Audio Video coding Standard (AVS), VC-1 and High Efficiency Video Coding (HEVC) standards. In addition, these results demonstrate the ability of this processing structure to realize multi-standard transform cores supporting all the standards mentioned above, capable of processing the 8k Ultra High Definition Television (UHDTV) video format (7,680 × 4,320 at 30 fps) in real time.
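As one example of the kernels such a unified core must support, the order-4 integer transform of H.264/AVC is the separable 2-D operation Y = C·X·Cᵀ. A minimal software reference model (a sketch of the arithmetic only, not of the paper's hardware architecture):

```python
import numpy as np

# Forward 4x4 integer core transform matrix of H.264/AVC
C = np.array([[1,  1,  1,  1],
              [2,  1, -1, -2],
              [1, -1, -1,  1],
              [1, -2,  2, -1]])

def forward_transform(block):
    """Separable 2-D transform: rows then columns, integer arithmetic only."""
    return C @ block @ C.T

# A constant block concentrates all energy in the DC coefficient.
flat = np.ones((4, 4), dtype=int)
Y = forward_transform(flat)
print(Y[0, 0], np.count_nonzero(Y))  # 16 1
```

Swapping the matrix C for another standard's kernel (e.g. an HEVC order-8 matrix) leaves the separable row/column structure unchanged, which is the property a configurable multi-standard datapath exploits.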
Mensi, Skander; Hagens, Olivier; Gerstner, Wulfram; Pozzorini, Christian
2016-01-01
The way in which single neurons transform input into output spike trains has fundamental consequences for network coding. Theories and modeling studies based on standard Integrate-and-Fire models implicitly assume that, in response to increasingly strong inputs, neurons modify their coding strategy by progressively reducing their selective sensitivity to rapid input fluctuations. Combining mathematical modeling with in vitro experiments, we demonstrate that, in L5 pyramidal neurons, the firing threshold dynamics adaptively adjust the effective timescale of somatic integration in order to preserve sensitivity to rapid signals over a broad range of input statistics. For that, a new Generalized Integrate-and-Fire model featuring nonlinear firing threshold dynamics and conductance-based adaptation is introduced that outperforms state-of-the-art neuron models in predicting the spiking activity of neurons responding to a variety of in vivo-like fluctuating currents. Our model allows for efficient parameter extraction and can be analytically mapped to a Generalized Linear Model in which both the input filter—describing somatic integration—and the spike-history filter—accounting for spike-frequency adaptation—dynamically adapt to the input statistics, as experimentally observed. Overall, our results provide new insights on the computational role of different biophysical processes known to underlie adaptive coding in single neurons and support previous theoretical findings indicating that the nonlinear dynamics of the firing threshold due to Na+-channel inactivation regulate the sensitivity to rapid input fluctuations. PMID:26907675
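The firing-threshold dynamics central to this result can be caricatured with a leaky integrate-and-fire neuron whose threshold jumps at each spike and relaxes back toward rest; all parameter values below are illustrative assumptions, not the fitted Generalized Integrate-and-Fire parameters:

```python
import numpy as np

def simulate_adaptive_lif(I, dt=0.1, tau_m=20.0, v_rest=-70.0, v_reset=-70.0,
                          th0=-50.0, tau_th=50.0, dth=5.0):
    """Leaky integrate-and-fire neuron with a spike-triggered moving threshold
    (a toy stand-in for nonlinear threshold dynamics): each spike raises the
    threshold by dth; it decays back to th0 with time constant tau_th (ms)."""
    v, th = v_rest, th0
    spikes = []
    for i, inp in enumerate(I):
        v += dt * (-(v - v_rest) + inp) / tau_m   # membrane integration
        th += dt * (th0 - th) / tau_th            # threshold relaxation
        if v >= th:
            spikes.append(i * dt)
            v = v_reset
            th += dth                             # spike-triggered adaptation
    return spikes

# Constant suprathreshold drive: the accumulating threshold lengthens
# successive inter-spike intervals (spike-frequency adaptation).
spikes = simulate_adaptive_lif(np.full(20000, 25.0))
isis = np.diff(spikes)
```

Under constant input the later intervals are longer than the first, the signature of the adaptive coding the paper analyzes with far richer, conductance-based dynamics.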
Pullen, Samuel J; Petruzzi, Liana; Lange, Brittany CL; Parnarouskis, Lindsey; Dominguez, Silvia; Harris, Benjamin; Quiterio, Nicole; Durham, Michelle P; Lekpeh, Gondah; Manobah, Burgess; Slopadoe, Siede P; Diandy, Veronique C; Payne, Arthur J; Henderson, David C; Borba, Christina PC
2016-01-01
Objective Substance use is a significant and common problem among school-aged youths throughout Africa. Like other countries on this continent, the West-African nation of Liberia is recovering from civil war. A well-educated population of young people is critical to the recovery efforts and long-term success of Liberia. Substance use by school-aged youths has important public health consequences that could undermine Liberia’s post-conflict recovery efforts. We wanted to better understand the culturally significant themes and subthemes related to substance use among youths attending public schools in Monrovia, Liberia. Methods A qualitative research design was used to collect data from 72 students attending public school in Monrovia, Liberia. Nine focus groups of 6–8 students from three public schools were facilitated using a semi-structured format to guide discussions on substance use. Student narratives were translated and recurring themes and subthemes were coded and analyzed. Results Four emergent themes described in this study were: (1) behaviors associated with substance use; (2) consequences associated with individual use; (3) consequences of substance use that affected the school milieu; and (4) school-related factors that were protective from substance use. Subthemes associated with substance use included concealment of substances, intoxication and disruption of the classroom environment, expulsion from school, school drop-out, and school as protective against substance use. Conclusion Liberian school-aged youths described important themes and subthemes associated with substance use occurring within the school milieu. These data have germane public health ramifications, and could help inform larger epidemiologic study methods and public health interventions for Liberia and countries with similar profiles. PMID:27158680
Content Analysis of Tobacco, Alcohol, and Other Drugs in Popular Music
Primack, Brian A.; Dalton, Madeline A.; Carroll, Mary V.; Agarwal, Aaron A.; Fine, Michael J.
2010-01-01
Objective To perform a comprehensive content analysis of substance use in contemporary popular music. Design We analyzed the 279 most popular songs of 2005 according to Billboard magazine. Two coders working independently used a standardized data collection instrument to code portrayals of substance use. Outcome Measures Presence and explicit use of substances and motivations for, associations with, and consequences of substance use. Results Of the 279 songs, 93 (33.3%) portrayed substance use, with an average of 35.2 substance references per song-hour. Portrayal of substance use varied significantly (P<.001) by genre, with 1 or more references in 3 of 35 pop songs (9%), 9 of 66 rock songs (14%), 11 of 55 R&B/hip-hop songs (20%), 22 of 61 country songs (36%), and 48 of 62 rap songs (77%). While only 2.9% of the 279 songs portrayed tobacco use, 23.7% depicted alcohol use, 13.6% depicted marijuana use, and 11.5% depicted other or unspecified substance use. In the 93 songs with substance use, it was most often motivated by peer/social pressure (45 [48%]) or sex (28 [30%]); use was commonly associated with partying (50 [54%]), sex (43 [46%]), violence (27 [29%]), and/or humor (22 [24%]). Only 4 songs (4%) contained explicit antiuse messages, and none portrayed substance refusal. Most songs with substance use (63 [68%]) portrayed more positive than negative consequences; these positive consequences were most commonly social, sexual, financial, or emotional. Conclusions The average adolescent is exposed to approximately 84 references to explicit substance use daily in popular songs, and this exposure varies widely by musical genre. The substance use depicted in popular music is frequently motivated by peer acceptance and sex, and it has highly positive associations and consequences. PMID:18250243
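The exposure estimate can be sanity-checked from the reported figures: at 35.2 references per song-hour, roughly 84 daily references corresponds to about 2.4 hours of daily listening (the listening time is inferred here as an illustration, not stated in the abstract):

```python
# Back-of-envelope check of the reported exposure figure.
refs_per_song_hour = 35.2   # reported reference density
daily_refs = 84             # reported average daily exposure
implied_listening_hours = daily_refs / refs_per_song_hour
print(round(implied_listening_hours, 1))  # 2.4
```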
Content analysis of tobacco, alcohol, and other drugs in popular music.
Primack, Brian A; Dalton, Madeline A; Carroll, Mary V; Agarwal, Aaron A; Fine, Michael J
2008-02-01
To perform a comprehensive content analysis of substance use in contemporary popular music. We analyzed the 279 most popular songs of 2005 according to Billboard magazine. Two coders working independently used a standardized data collection instrument to code portrayals of substance use. Presence and explicit use of substances and motivations for, associations with, and consequences of substance use. Of the 279 songs, 93 (33.3%) portrayed substance use, with an average of 35.2 substance references per song-hour. Portrayal of substance use varied significantly (P < .001) by genre, with 1 or more references in 3 of 35 pop songs (9%), 9 of 66 rock songs (14%), 11 of 55 R & B/hip-hop songs (20%), 22 of 61 country songs (36%), and 48 of 62 rap songs (77%). While only 2.9% of the 279 songs portrayed tobacco use, 23.7% depicted alcohol use, 13.6% depicted marijuana use, and 11.5% depicted other or unspecified substance use. In the 93 songs with substance use, it was most often motivated by peer/social pressure (45 [48%]) or sex (28 [30%]); use was commonly associated with partying (50 [54%]), sex (43 [46%]), violence (27 [29%]), and/or humor (22 [24%]). Only 4 songs (4%) contained explicit antiuse messages, and none portrayed substance refusal. Most songs with substance use (63 [68%]) portrayed more positive than negative consequences; these positive consequences were most commonly social, sexual, financial, or emotional. The average adolescent is exposed to approximately 84 references to explicit substance use daily in popular songs, and this exposure varies widely by musical genre. The substance use depicted in popular music is frequently motivated by peer acceptance and sex, and it has highly positive associations and consequences.
Automatic Generation of Algorithms for the Statistical Analysis of Planetary Nebulae Images
NASA Technical Reports Server (NTRS)
Fischer, Bernd
2004-01-01
Analyzing data sets collected in experiments or by observations is a core scientific activity. Typically, experimental and observational data are fraught with uncertainty, and the analysis is based on a statistical model of the conjectured underlying processes. The large data volumes collected by modern instruments make computer support indispensable for this. Consequently, scientists spend significant amounts of their time on the development and refinement of data analysis programs. AutoBayes [GF+02, FS03] is a fully automatic synthesis system for generating statistical data analysis programs. Externally, it looks like a compiler: it takes an abstract problem specification and translates it into executable code. Its input is a concise description of a data analysis problem in the form of a statistical model as shown in Figure 1; its output is optimized and fully documented C/C++ code which can be linked dynamically into the Matlab and Octave environments. Internally, however, it is quite different: AutoBayes derives a customized algorithm implementing the given model using a schema-based process, and then further refines and optimizes the algorithm into code. A schema is a parameterized code template with associated semantic constraints which define and restrict the template's applicability. The schema parameters are instantiated in a problem-specific way during synthesis as AutoBayes checks the constraints against the original model or, recursively, against emerging sub-problems. The AutoBayes schema library contains problem decomposition operators (which are justified by theorems in a formal logic in the domain of Bayesian networks) as well as machine learning algorithms (e.g., EM, k-Means) and numeric optimization methods (e.g., Nelder-Mead simplex, conjugate gradient). AutoBayes augments this schema-based approach by symbolic computation to derive closed-form solutions whenever possible.
This is a major advantage over other statistical data analysis systems which use numerical approximations even in cases where closed-form solutions exist. AutoBayes is implemented in Prolog and comprises approximately 75,000 lines of code. In this paper, we take one typical scientific data analysis problem - analyzing planetary nebulae images taken by the Hubble Space Telescope - and show how AutoBayes can be used to automate the implementation of the necessary analysis programs. We initially follow the analysis described by Knuth and Hajian [KH02] and use AutoBayes to derive code for the published models. We show the details of the code derivation process, including the symbolic computations and automatic integration of library procedures, and compare the results of the automatically generated and manually implemented code. We then go beyond the original analysis and use AutoBayes to derive code for a simple image segmentation procedure based on a mixture model which can be used to automate a manual preprocessing step. Finally, we combine the original approach with the simple segmentation, which yields a more detailed analysis. This also demonstrates that AutoBayes makes it easy to combine different aspects of data analysis.
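The mixture-model segmentation mentioned above is the kind of problem for which AutoBayes instantiates an EM schema. A hand-written sketch of the structure of such generated code, on toy 1-D "pixel intensities" rather than the nebula images:

```python
import numpy as np

rng = np.random.default_rng(0)
# Toy data: background vs object intensities as a two-component Gaussian mixture
data = np.concatenate([rng.normal(10.0, 1.0, 600), rng.normal(30.0, 2.0, 400)])

# EM for a 1-D Gaussian mixture -- the algorithm an EM schema would instantiate.
mu = np.array([5.0, 40.0])          # deliberately poor initial means
sigma = np.array([5.0, 5.0])
w = np.array([0.5, 0.5])
for _ in range(50):
    # E-step: posterior responsibility of each component for each data point
    pdf = w * np.exp(-0.5 * ((data[:, None] - mu) / sigma) ** 2) / sigma
    r = pdf / pdf.sum(axis=1, keepdims=True)
    # M-step: closed-form updates of weights, means and standard deviations
    n = r.sum(axis=0)
    w = n / len(data)
    mu = (r * data[:, None]).sum(axis=0) / n
    sigma = np.sqrt((r * (data[:, None] - mu) ** 2).sum(axis=0) / n)

print(np.round(mu))  # component means recovered near 10 and 30
```

The closed-form M-step updates here illustrate the point made in the text: where such solutions exist, symbolic derivation can produce them directly instead of resorting to numerical approximation.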
Boutin-Foster, Carla; Milan, Maria; Kanna, Balavenkatesh
2014-01-01
Background. The South Bronx, a largely Latino community, has become an epicenter of the diabetes epidemic in New York City. In this community, nondiabetic first-degree relatives of people with diabetes are prime targets for intervention. Therefore, the objective of this study was to explore the knowledge of diabetes and attitudes toward health behavior modification of Latino adults who are first-degree relatives of people with diabetes. Methods. Participants were recruited from three settings in the South Bronx (a community-based organization, a faith-based organization, and a taxi station). The Common Sense Model was used to develop focus-group items. This model provides a framework for exploring illness representations along five domains: identity, cause, consequences, timeline, and perceptions of curability. Responses were transcribed verbatim, and data analysis proceeded in the following order: data immersion, assignment of codes, grouping of key concepts to form categories, and construction of higher-order themes. Results. Of the 115 potential participants identified, 53 were found to be eligible, and 23 of these participated in the focus group. Of these, 20 were Dominicans, 2 were Puerto Ricans, and 1 was Salvadorian. The mean age was 46.39 years, 35% were women, 61% were married, and 26% had less than a high school education. Qualitative analyses resulted in 547 codes that were grouped into 52 concepts, from which 9 categories and 4 overarching themes emerged. The dominant themes were 1) family, genetics, and culture play a major role in the etiology of diabetes; 2) being Latino and having a first-degree relative with diabetes makes getting diabetes inevitable, and, like a time bomb exploding, it is destined to happen; 3) once one develops diabetes, the physical and emotional consequences are devastating and destructive; and 4) diabetes can be “cured” through healthy eating and with insulin.
Conclusions. In this study, first-degree relatives of patients with diabetes were knowledgeable about the risks and consequences of diabetes. However, some participants felt that being Latino and having a first-degree relative with diabetes made one destined to have diabetes. Addressing this misperception through culturally tailored interventions has implications for diabetes prevention and may help to stem the diabetes epidemic in Latino communities. PMID:26246756
Castro-Rivas, Erida; Boutin-Foster, Carla; Milan, Maria; Kanna, Balavenkatesh
2014-02-01
Background. The South Bronx, a largely Latino community, has become an epicenter of the diabetes epidemic in New York City. In this community, nondiabetic first-degree relatives of people with diabetes are prime targets for intervention. Therefore, the objective of this study was to explore the knowledge of diabetes and attitudes toward health behavior modification of Latino adults who are first-degree relatives of people with diabetes. Methods. Participants were recruited from three settings in the South Bronx (a community-based organization, a faith-based organization, and a taxi station). The Common Sense Model was used to develop focus-group items. This model provides a framework for exploring illness representations along five domains: identity, cause, consequences, timeline, and perceptions of curability. Responses were transcribed verbatim, and data analysis proceeded in the following order: data immersion, assignment of codes, grouping of key concepts to form categories, and construction of higher-order themes. Results. Of the 115 potential participants identified, 53 were found to be eligible, and 23 of these participated in the focus group. Of these, 20 were Dominicans, 2 were Puerto Ricans, and 1 was Salvadorian. The mean age was 46.39 years, 35% were women, 61% were married, and 26% had less than a high school education. Qualitative analyses resulted in 547 codes that were grouped into 52 concepts, from which 9 categories and 4 overarching themes emerged. The dominant themes were 1) family, genetics, and culture play a major role in the etiology of diabetes; 2) being Latino and having a first-degree relative with diabetes makes getting diabetes inevitable, and, like a time bomb exploding, it is destined to happen; 3) once one develops diabetes, the physical and emotional consequences are devastating and destructive; and 4) diabetes can be "cured" through healthy eating and with insulin. 
Conclusions. In this study, first-degree relatives of patients with diabetes were knowledgeable about the risks and consequences of diabetes. However, some participants felt that being Latino and having a first-degree relative with diabetes made one destined to have diabetes. Addressing this misperception through culturally tailored interventions has implications for diabetes prevention and may help to stem the diabetes epidemic in Latino communities.
Nystedt, Astrid; Hildingsson, Ingegerd
2014-07-16
Prolonged labour very often causes suffering and difficulties that may have lifelong implications. This study aimed to explore the prevalence and treatment of prolonged labour and to compare birth outcome and women's experiences of prolonged and normal labour. Women with spontaneous onset of labour, living in a Swedish county, were recruited two months after birth to a cross-sectional study. Women (n = 829) completed a questionnaire that investigated socio-demographic and obstetric background, birth outcome and women's feelings and experiences of birth. The prevalence of prolonged labour, as defined by a documented ICD code and inspection of the partogram, was calculated. Four groups were identified: women with prolonged labour as identified by documented ICD codes or by partogram inspection but no ICD code, and women with normal labour augmented with oxytocin or not. Every fifth woman experienced a prolonged labour. The prevalence with a documented ICD code was 13% and without an ICD code but with a positive partogram was 8%. Seven percent of women with prolonged labour were not treated with oxytocin. Approximately one in three women (28%) received oxytocin augmentation despite having no evidence of prolonged labour. The length of labour differed between the four groups of women, from 7 to 23 hours. Women with a prolonged labour more often had a negative birth experience (13%) than did women who had a normal labour (3%) (P < 0.001). The factors that contributed most strongly to a negative birth experience in women with prolonged labour were emergency Caesarean section (OR 9.0, 95% CI 1.2-3.0) and strong agreement with the statement 'My birth experience made me decide not to have any more children' (OR 41.3, 95% CI 4.9-349.6). The factor that contributed most strongly to a negative birth experience in women with normal labour was less agreement with the statement 'It was exciting to give birth' (OR 0.13, 95% CI 0.34-0.5).
There is a need for increased clinical skill in the identification and classification of prolonged labour, in order to improve care and birth experiences for all women, regardless of whether their labour is prolonged or not.
NASA Astrophysics Data System (ADS)
Ditommaso, Rocco; Carlo Ponzo, Felice; Auletta, Gianluca; Iacovino, Chiara; Nigro, Antonella
2015-04-01
The aim of this study is a comparison among the fundamental periods of reinforced concrete buildings evaluated using the simplified approach proposed by the Italian seismic code (NTC 2008), numerical models, and experimental values retrieved from a campaign performed on several buildings located in the Basilicata region (Italy). With the intention of proposing simplified relationships to evaluate the fundamental period of reinforced concrete buildings, scientists and engineers have performed several numerical and experimental campaigns on different structures all around the world to calibrate different kinds of formulas. Most of the formulas retrieved from both numerical and experimental analyses provide vibration periods smaller than those suggested by the Italian seismic code. However, it is well known that the fundamental period of a structure plays a key role in the correct evaluation of the spectral acceleration for seismic static analyses. Generally, simplified approaches impose the use of safety factors greater than those related to in-depth nonlinear analyses, with the aim of covering possible unexpected uncertainties. Using the simplified formula proposed by the Italian seismic code, the fundamental period is considerably higher than the fundamental periods experimentally evaluated on real structures, with the consequence that the spectral acceleration adopted in seismic static analysis may differ significantly from the real spectral acceleration. This approach could produce a decrease in the safety factors obtained using linear and nonlinear seismic static analyses. Finally, the authors suggest a possible update of the Italian seismic code formula for the simplified estimation of the fundamental period of vibration of existing RC buildings, taking into account both elastic and inelastic structural behaviour and the interaction between structural and non-structural elements.
Acknowledgements This study was partially funded by the Italian Civil Protection Department within the project DPC-RELUIS 2014 - RS4 ''Seismic observatory of structures and health monitoring''. References R. Ditommaso, M. Vona, M. R. Gallipoli and M. Mucciarelli (2013). Evaluation and considerations about fundamental periods of damaged reinforced concrete buildings. Nat. Hazards Earth Syst. Sci., 13, 1903-1912, 2013. www.nat-hazards-earth-syst-sci.net/13/1903/2013. doi:10.5194/nhess-13-1903-2013
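For reference, the simplified NTC 2008 approach estimates the fundamental period as T1 = C1·H^(3/4), with C1 = 0.075 for reinforced concrete buildings. A one-line sketch of the estimate (the 15 m example height is an assumption, not a building from the study):

```python
def fundamental_period_ntc2008(height_m, c1=0.075):
    """Simplified fundamental period T1 = C1 * H^(3/4) (NTC 2008),
    with C1 = 0.075 for reinforced concrete frame buildings."""
    return c1 * height_m ** 0.75

# A roughly 5-storey RC building of ~15 m total height
t1 = fundamental_period_ntc2008(15.0)
print(round(t1, 3))  # 0.572 s
```

If, as the experimental campaign suggests, measured periods are shorter than this estimate, the design spectral acceleration read from the spectrum at T1 can differ appreciably from the one the real structure experiences.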
NASA Astrophysics Data System (ADS)
Gassmöller, Rene; Bangerth, Wolfgang
2016-04-01
Particle-in-cell methods have a long history and many applications in geodynamic modelling of mantle convection, lithospheric deformation and crustal dynamics. They are primarily used to track material information, the strain a material has undergone, the pressure-temperature history a certain material region has experienced, or the amount of volatiles or partial melt present in a region. However, their efficient parallel implementation - in particular combined with adaptive finite-element meshes - is complicated due to the complex communication patterns and frequent reassignment of particles to cells. Consequently, many current scientific software packages accomplish this efficient implementation by specifically designing particle methods for a single purpose, like the advection of scalar material properties that do not evolve over time (e.g., for chemical heterogeneities). Design choices for particle integration, data storage, and parallel communication are then optimized for this single purpose, making the code relatively rigid to changing requirements. Here, we present the implementation of a flexible, scalable and efficient particle-in-cell method for massively parallel finite-element codes with adaptively changing meshes. Using a modular plugin structure, we allow maximum flexibility of the generation of particles, the carried tracer properties, the advection and output algorithms, and the projection of properties to the finite-element mesh. We present scaling tests ranging up to tens of thousands of cores and tens of billions of particles. Additionally, we discuss efficient load-balancing strategies for particles in adaptive meshes with their strengths and weaknesses, local particle-transfer between parallel subdomains utilizing existing communication patterns from the finite element mesh, and the use of established parallel output algorithms like the HDF5 library. 
Finally, we show some relevant particle application cases, compare our implementation to a modern advection-field approach, and demonstrate under which conditions each method is more efficient. We implemented the presented methods in ASPECT (aspect.dealii.org), a freely available open-source community code for geodynamic simulations. The particle code is highly modular and segregated from the PDE solver, and can thus be easily transferred to other programs or adapted for various application cases.
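The core particle-in-cell cycle described above (advect particles, reassign them to cells, project carried properties onto the mesh) can be sketched in one dimension; the uniform mesh, constant velocity and binary composition field below are illustrative assumptions, far simpler than ASPECT's adaptive, parallel 3-D implementation:

```python
import numpy as np

rng = np.random.default_rng(1)

n_cells, L, dt = 10, 1.0, 0.01
x = rng.uniform(0, L, 1000)            # particle positions
prop = (x < 0.5).astype(float)         # carried property (e.g. composition)

def advect(x, velocity=1.0):
    """Advect particles and wrap around a periodic domain."""
    return (x + velocity * dt) % L

def cell_of(x):
    """Reassign particles to cells after advection (the step whose frequent
    reshuffling and communication the paper's implementation optimizes)."""
    return np.minimum((x / (L / n_cells)).astype(int), n_cells - 1)

for _ in range(50):                    # 50 steps -> total displacement 0.5
    x = advect(x)

# Project the carried property onto the mesh by per-cell averaging
cells = cell_of(x)
mesh = (np.bincount(cells, weights=prop, minlength=n_cells)
        / np.bincount(cells, minlength=n_cells))
```

After advecting by half the domain, the material initially in the left half occupies the right half, and the cell averages track it without numerical diffusion, which is the main attraction of particles over advected compositional fields.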
Development of Pflotran Code for Waste Isolation Pilot Plant Performance Assessment
NASA Astrophysics Data System (ADS)
Zeitler, T.; Day, B. A.; Frederick, J.; Hammond, G. E.; Kim, S.; Sarathi, R.; Stein, E.
2017-12-01
The Waste Isolation Pilot Plant (WIPP) has been developed by the U.S. Department of Energy (DOE) for the geologic (deep underground) disposal of transuranic (TRU) waste. Containment of TRU waste at the WIPP is regulated by the U.S. Environmental Protection Agency (EPA). The DOE demonstrates compliance with the containment requirements by means of performance assessment (PA) calculations. WIPP PA calculations estimate the probability and consequence of potential radionuclide releases from the repository to the accessible environment for a regulatory period of 10,000 years after facility closure. The long-term performance of the repository is assessed using a suite of sophisticated computational codes. There is a current effort to enhance WIPP PA capabilities through the further development of the PFLOTRAN software, a state-of-the-art massively parallel subsurface flow and reactive transport code. Benchmark testing of the individual WIPP-specific process models implemented in PFLOTRAN (e.g., gas generation, chemistry, creep closure, actinide transport, and waste form) has been performed, including results comparisons for PFLOTRAN and existing WIPP PA codes. Additionally, enhancements to the subsurface hydrologic flow model have been made. Repository-scale testing has also been performed for the modified PFLOTRAN code and detailed results will be presented. Ultimately, improvements to the current computational environment will result in greater detail and flexibility in the repository model due to a move from a two-dimensional calculation grid to a three-dimensional representation. The result of the effort will be a state-of-the-art subsurface flow and transport capability that will serve WIPP PA into the future for use in compliance recertification applications (CRAs) submitted to the EPA.
Sandia National Laboratories is a multi-mission laboratory managed and operated by National Technology and Engineering Solutions of Sandia, LLC., a wholly owned subsidiary of Honeywell International, Inc., for the U.S. Department of Energy's National Nuclear Security Administration under contract DE-NA0003525. This research is funded by WIPP programs administered by the Office of Environmental Management (EM) of the U.S. Department of Energy.SAND2017-8198A.
Southan, Christopher; Williams, Antony J; Ekins, Sean
2013-01-01
There is an expanding amount of interest directed at the repurposing and repositioning of drugs, as well as how in silico methods can assist these endeavors. Recent repurposing project tendering calls by the National Center for Advancing Translational Sciences (USA) and the Medical Research Council (UK) have included compound information and pharmacological data. However, none of the internal company development code names were assigned to chemical structures in the official documentation. This not only precludes in silico analysis in support of repurposing but also necessitates data gathering and curation to assign structures. Here, we describe the approaches, results and major challenges associated with this effort. Copyright © 2012 Elsevier Ltd. All rights reserved.
Psychological impact of acne on 21st-century adolescents: decoding for better care.
Revol, O; Milliez, N; Gerard, D
2015-07-01
The psychological consequences of acne have been the subject of many studies. As a particularly visible skin disorder, acne complicates the daily lives of adolescents who are undergoing multiple transformations: physical, intellectual and emotional. While it is well established that acne can be responsible for depression and low self-esteem, it is likely that this impact is aggravated by the sociological evolution of adolescents in the 21st century. Understanding the codes of adolescents today (who can be characterized as being more concerned by their appearance than previous generations at the same age) allows us to optimize our medical approach to acne and facilitates treatment compliance and adherence. © 2015 British Association of Dermatologists.
Paying a price: culture, trust, and negotiation consequences.
Gunia, Brian C; Brett, Jeanne M; Nandkeolyar, Amit K; Kamdar, Dishan
2011-07-01
Three studies contrasting Indian and American negotiators tested hypotheses derived from theory proposing why there are cultural differences in trust and how cultural differences in trust influence negotiation strategy. Study 1 (a survey) documented that Indian negotiators trust their counterparts less than American negotiators. Study 2 (a negotiation simulation) linked American and Indian negotiators' self-reported trust and strategy to their insight and joint gains. Study 3 replicated and extended Study 2 using independently coded negotiation strategy data, allowing for stronger causal inference. Overall, the strategy associated with Indian negotiators' reluctance to extend interpersonal (as opposed to institutional) trust produced relatively poor outcomes. Our data support an expanded theoretical model of negotiation, linking culture to trust, strategies, and outcomes.
Methods of parallel computation applied on granular simulations
NASA Astrophysics Data System (ADS)
Martins, Gustavo H. B.; Atman, Allbens P. F.
2017-06-01
Every year, parallel computing becomes cheaper and more accessible. As a consequence, applications have spread across all research areas. Granular materials are a promising area for parallel computing. To demonstrate this, we study the impact of parallel computing in simulations of the BNE (Brazil Nut Effect). This effect is the remarkable rise of an intruder confined in a granular medium when it is vertically shaken against gravity. By means of DEM (Discrete Element Method) simulations, we test the code's performance with different methods of reducing wall-clock time. A comparison between serial and parallel algorithms using OpenMP® is also shown. The best improvement was obtained by optimizing the function that finds contacts using Verlet cells.
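The contact-finding optimization the abstract credits with the best speedup is a cell-list (Verlet cell) search: only particles in the same or neighboring cells are tested for overlap, reducing the naive O(N²) pair check to roughly O(N). The serial 2D sketch below is our illustration under simplifying assumptions (monodisperse particles, rectangular box), not the authors' code:

```python
import numpy as np

def cell_contacts(pos, radius, box):
    """Find contacting particle pairs with a linked-cell (Verlet cell) search.

    pos    : (N, 2) array of particle centers
    radius : common particle radius (monodisperse, for simplicity)
    box    : (Lx, Ly) domain size
    """
    cell = 2.0 * radius                      # cell edge >= contact distance
    nx, ny = int(box[0] // cell), int(box[1] // cell)
    grid = {}                                # cell index -> particle indices
    for i, p in enumerate(pos):
        key = (min(int(p[0] // cell), nx - 1), min(int(p[1] // cell), ny - 1))
        grid.setdefault(key, []).append(i)

    pairs = []
    for (cx, cy), members in grid.items():
        for dx in (-1, 0, 1):                # scan the 3x3 neighborhood
            for dy in (-1, 0, 1):
                neigh = grid.get((cx + dx, cy + dy), [])
                for i in members:
                    for j in neigh:
                        if i < j and np.linalg.norm(pos[i] - pos[j]) < 2 * radius:
                            pairs.append((i, j))
    return pairs
```

Because each particle is only compared against the few particles in adjacent cells, this is the step that parallelizes well with OpenMP-style loop decomposition in a DEM code.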
Rönnberg, J; Borg, E
2001-01-01
This paper reviews research on deaf-blind individuals, primarily from behavioral and communicative points of view. Inclusion in the population of deaf-blind is qualified by describing a variety of subgroups and genetically based syndromes associated with deaf-blindness. Sensory assessment procedures--based primarily on residual capacities--are appraised. Consequences for everyday life are described briefly. Non-sensory, alternative classificatory schemes and procedures are presented and the results from behavior modification procedures used for correcting maladaptive behaviors are summarized. Methods for communicating tactilely are described and evaluated. Attention is also drawn to some suggestions regarding learning of alphabetic codes and sign acquisition. Finally, suggestions for future research are proposed.
Eigenmode multiplexing with SLM for volume holographic data storage
NASA Astrophysics Data System (ADS)
Chen, Guanghao; Miller, Bo E.; Takashima, Yuzuru
2017-08-01
The cavity supports orthogonal reference beam families as its eigenmodes while enhancing the reference beam power. Such orthogonal eigenmodes are used as an additional degree of freedom to multiplex data pages, consequently increasing storage densities for volume Holographic Data Storage Systems (HDSS) when the maximum number of multiplexed data pages is limited by geometrical factors. Image-bearing holograms are multiplexed by orthogonal phase code multiplexing via Hermite-Gaussian eigenmodes in a Fe:LiNbO3 medium with a 532 nm laser at multiple Bragg angles, using Liquid Crystal on Silicon (LCOS) spatial light modulators (SLMs) in the reference arms. A total of nine holograms are recorded, with three angular positions and three eigenmodes.
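The multiplexing scheme relies on distinct Hermite-Gaussian modes being mutually orthogonal, so that each eigenmode addresses its own data page. That orthogonality can be checked numerically; the 1D NumPy sketch below (grid, waist `w0`, and function names are our assumptions, not taken from the paper) normalizes two modes on a grid and verifies that their overlap vanishes:

```python
import numpy as np
from numpy.polynomial.hermite import hermval

def hg_mode(n, x, w0=1.0):
    """1D Hermite-Gaussian mode of order n, numerically normalized on grid x."""
    c = np.zeros(n + 1)
    c[n] = 1.0                               # select H_n (physicists' convention)
    u = hermval(np.sqrt(2.0) * x / w0, c) * np.exp(-(x / w0) ** 2)
    dx = x[1] - x[0]
    return u / np.sqrt((u * u).sum() * dx)   # unit L2 norm on the grid

x = np.linspace(-8.0, 8.0, 2001)
dx = x[1] - x[0]
m0, m1 = hg_mode(0, x), hg_mode(1, x)
overlap = (m0 * m1).sum() * dx               # ~0: distinct modes are orthogonal
```

In an HDSS reference arm, an SLM would display the phase profile of one such mode; the near-zero cross-overlap is what suppresses inter-page crosstalk on readout.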
Tipton, Julia A
2014-01-01
The purpose of this study was to explore caregivers' beliefs and perceptions regarding serving sugar-sweetened beverages (SSBs) to non-Hispanic black preschoolers. The Theory of Planned Behavior (TpB) was used as the framework for conducting elicitation interviews among a sample of caregivers (n = 19). Thematic coding of interview transcripts revealed that the decision to serve SSBs to preschoolers is driven by numerous individual, familial, cultural, and environmental factors. Salient factors associated with serving SSBs included convenience, cost, taste, potential health consequences, availability, and pressure from other parents. Population-specific interventions aimed at reducing SSB intake among non-Hispanic black preschoolers are discussed. © 2013.
Finite element analysis of two disk rotor system
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dixit, Harsh Kumar
A finite element model of a simple horizontal rotor system is developed for evaluating its dynamic behaviour. The model is based on the Timoshenko beam element and accounts for the effect of the gyroscopic couple and other rotational forces. The present rotor system consists of a single shaft supported by bearings at both ends, with two disks mounted at different locations. The natural frequencies, mode shapes and orbits of the rotating system over a specific range of rotation speeds are obtained by developing a MATLAB code to solve the finite element equations of the rotary system. Consequently, a Campbell diagram is plotted to find the relationship between the natural whirl frequencies and the rotation speed of the rotor.
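Once the finite element matrices are assembled, the whirl frequencies behind a Campbell diagram come from a gyroscopic eigenproblem M q̈ + ΩG q̇ + K q = 0, solved at each spin speed Ω via the standard first-order (state-space) linearization. The NumPy sketch below is a generic illustration with a 2-DOF toy system, not the paper's Timoshenko rotor model:

```python
import numpy as np

def whirl_frequencies(M, G, K, omega):
    """Natural whirl frequencies (rad/s) of M q'' + omega*G q' + K q = 0
    at spin speed omega, via state-space linearization of the
    quadratic eigenproblem."""
    n = M.shape[0]
    A = np.block([[np.zeros((n, n)), np.eye(n)],
                  [-np.linalg.solve(M, K), -np.linalg.solve(M, omega * G)]])
    lam = np.linalg.eigvals(A)
    return np.sort(np.abs(lam.imag))         # whirl frequencies, ascending

# Campbell diagram data: whirl frequencies vs. rotor spin speed
# (these 2-DOF matrices are illustrative placeholders)
M = np.eye(2)
G = np.array([[0.0, 1.0], [-1.0, 0.0]])      # skew-symmetric gyroscopic matrix
K = np.diag([100.0, 100.0])
speeds = np.linspace(0.0, 50.0, 6)
campbell = [whirl_frequencies(M, G, K, w) for w in speeds]
```

At Ω = 0 the two lateral modes coincide; for Ω > 0 the gyroscopic term splits them into forward and backward whirl branches, which is exactly the fan of curves plotted in a Campbell diagram.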
Correia, J R C C C; Martins, C J A P
2017-10-01
Topological defects unavoidably form at symmetry-breaking phase transitions in the early universe. To probe the parameter space of theoretical models and set tighter experimental constraints (exploiting the recent advances in astrophysical observations), one requires more and more demanding simulations, and therefore more hardware resources and computation time. Improving the speed and efficiency of existing codes is essential. Here we present a general-purpose graphics-processing-unit implementation of the canonical Press-Ryden-Spergel algorithm for the evolution of cosmological domain wall networks. This is ported to the Open Computing Language standard, and as a consequence significant speedups are achieved in both two-dimensional (2D) and three-dimensional (3D) simulations.
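At its core, the Press-Ryden-Spergel (PRS) scheme is a damped leapfrog update of a scalar field in a double-well potential on a periodic grid. The serial 2D NumPy step below is a toy sketch of that update (a uniform damping coefficient stands in for the Hubble friction term; the GPU/OpenCL port described in the abstract is far more elaborate):

```python
import numpy as np

def prs_step(phi, phidot, dx, dt, damping, lam=1.0, eta0=1.0):
    """One damped-leapfrog update of a 2D scalar field with the quartic
    double-well potential V = (lam/4)*(phi^2 - eta0^2)^2, periodic boundaries."""
    # 5-point Laplacian with periodic wrap-around
    lap = (np.roll(phi, 1, 0) + np.roll(phi, -1, 0) +
           np.roll(phi, 1, 1) + np.roll(phi, -1, 1) - 4.0 * phi) / dx ** 2
    dVdphi = lam * phi * (phi ** 2 - eta0 ** 2)
    accel = lap - dVdphi
    # semi-implicit treatment of the damping term
    phidot = ((1.0 - 0.5 * damping * dt) * phidot + dt * accel) \
             / (1.0 + 0.5 * damping * dt)
    return phi + dt * phidot, phidot
```

Each grid point depends only on its four neighbors, which is why the algorithm maps so well onto GPUs: every site can be updated by an independent OpenCL work-item.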
Treatment Recommendation Actions, Contingencies, and Responses: An Introduction.
Stivers, Tanya; Barnes, Rebecca K
2017-08-21
In the era of patient participation in health care decision making, we know surprisingly little about the ways in which treatment recommendations are made, the contexts that shape their formulation, and the consequences of these formulations. In this article, we introduce a systematic collective investigation of how recommendations for medications are responded to and made in primary versus secondary care, in the US versus the UK, and in contexts where the medication was over the counter versus by prescription. This article provides an overview of the coding system that was used in this project including describing what constitutes a recommendation, the primary action types clinicians use for recommendations, and the types of responses provided by patients to recommendations.
Information technology and medication safety: what is the benefit?
Kaushal, R; Bates, D
2002-01-01
Medication errors occur frequently and have significant clinical and financial consequences. Several types of information technologies can be used to decrease rates of medication errors. Computerized physician order entry with decision support significantly reduces serious inpatient medication error rates in adults. Other available information technologies that may prove effective for inpatients include computerized medication administration records, robots, automated pharmacy systems, bar coding, "smart" intravenous devices, and computerized discharge prescriptions and instructions. In outpatients, computerization of prescribing and patient oriented approaches such as personalized web pages and delivery of web based information may be important. Public and private mandates for information technology interventions are growing, but further development, application, evaluation, and dissemination are required. PMID:12486992
Butler, G S; Overall, C M
2007-01-01
We illustrate the use of quantitative proteomics, namely isotope-coded affinity tag labelling and tandem mass spectrometry, to assess the targets and effects of the blockade of matrix metalloproteinases (MMPs) by an inhibitor drug in a breast cancer cell culture system. Treatment of MT1-MMP-transfected MDA-MB-231 cells with AG3340 (Prinomastat) directly affected the processing of a multitude of matrix metalloproteinase substrates, and indirectly altered the expression of an array of other proteins with diverse functions. Therefore, broad-spectrum blockade of MMPs has wide-ranging biological consequences. In this human breast cancer cell line, secreted substrates accumulated uncleaved in the conditioned medium, and plasma membrane protein substrates were retained on the cell surface, due to reduced processing and shedding of these proteins (cell surface receptors, growth factors and bioactive molecules) to the medium in the presence of the matrix metalloproteinase inhibitor. Hence, proteomic investigation of drug-perturbed cellular proteomes can identify new protease substrates and at the same time provide valuable information for target validation, drug efficacy and potential side effects prior to commitment to clinical trials.
Conserved Non-Coding Regulatory Signatures in Arabidopsis Co-Expressed Gene Modules
Spangler, Jacob B.; Ficklin, Stephen P.; Luo, Feng; Freeling, Michael; Feltus, F. Alex
2012-01-01
Complex traits and other polygenic processes require coordinated gene expression. Co-expression networks model mRNA co-expression: the product of gene regulatory networks. To identify regulatory mechanisms underlying coordinated gene expression in a tissue-enriched context, ten Arabidopsis thaliana co-expression networks were constructed after manually sorting 4,566 RNA profiling datasets into aerial, flower, leaf, root, rosette, seedling, seed, shoot, whole plant, and global (all samples combined) groups. Collectively, the ten networks contained 30% of the measurable genes of Arabidopsis and were circumscribed into 5,491 modules. Modules were scrutinized for cis regulatory mechanisms putatively encoded in conserved non-coding sequences (CNSs) previously identified as remnants of a whole genome duplication event. We determined the non-random association of 1,361 unique CNSs to 1,904 co-expression network gene modules. Furthermore, the CNS elements were placed in the context of known gene regulatory networks (GRNs) by connecting 250 CNS motifs with known GRN cis elements. Our results provide support for a regulatory role of some CNS elements and suggest the functional consequences of CNS activation of co-expression in specific gene sets dispersed throughout the genome. PMID:23024789
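The reported "non-random association" of CNSs with co-expression modules is the kind of result commonly scored with a hypergeometric enrichment test: given how many genes genome-wide carry a CNS motif, how surprising is the count observed inside one module? The abstract does not specify the exact statistic used, so the stdlib sketch below is a generic illustration, not the paper's method:

```python
from math import comb

def hypergeom_enrich_p(N, K, n, k):
    """P(X >= k): probability of seeing at least k motif-carrying genes in a
    module of n genes drawn from a genome of N genes, K of which carry the
    motif. Small p suggests a non-random CNS-module association."""
    total = comb(N, n)
    return sum(comb(K, i) * comb(N - K, n - i)
               for i in range(k, min(K, n) + 1)) / total
```

In practice each of the 1,361 CNSs would be tested against each module, with a multiple-testing correction applied across the thousands of CNS-module pairs.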
Work-related ladder fall fractures: identification and diagnosis validation using narrative text.
Smith, Gordon S; Timmons, Robert A; Lombardi, David A; Mamidi, Dheeresh K; Matz, Simon; Courtney, Theodore K; Perry, Melissa J
2006-09-01
To identify ladder-related fracture injuries and determine how ladder fall fractures differ from other ladder-related injuries. Ladder-related fracture cases were identified using narrative text and coded data from workers' compensation claims. Potential cases were identified by text searches and verified with claim records. Injury characteristics were compared using proportionate injury ratios. Of 9826 ladder-related injuries, 7% resulted in fractures. Falls caused 89% of fractures and resulted in more medical costs and disability days than other injuries. Frequent mechanisms were ladder instability (22%) and lost footing (22%). Narrative text searches identified 17% more fractures than injury codes alone. Males were more likely to sustain a fall fracture than other injury types; construction workers were the most likely, and retail workers the least likely, to sustain fractures. Fractures are an important consequence of ladder falls, resulting in more serious outcomes than other ladder-related injuries. Text analysis can improve the quality and utility of workers' compensation data by identifying and clarifying injury causes. Proportionate injury ratios are also useful for making cross-group comparisons of injury experience when denominator data are not available. Greater attention to risk factors for ladder falls is needed to target interventions.
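Narrative-text case finding of the kind described typically amounts to keyword screening of free-text claim descriptions, with hits verified manually. A minimal sketch in Python (the patterns and function name are our illustrative assumptions; real studies tune keyword lists against coded data):

```python
import re

# hypothetical screening patterns; a production keyword list would be
# validated against claims already coded as ladder fractures
LADDER = re.compile(r"\bladder\b", re.IGNORECASE)
FRACTURE = re.compile(r"\bfractur|\bbroke", re.IGNORECASE)

def flag_ladder_fracture(narrative):
    """Screen a free-text injury narrative for a possible ladder fall fracture."""
    return bool(LADDER.search(narrative)) and bool(FRACTURE.search(narrative))
```

Screening like this is deliberately high-recall: flagged narratives are then checked against the claim records, which is how the study found 17% more fractures than injury codes alone.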