2015-11-01
induced residual stresses and distortions from weld simulations in the SYSWELD software code in structural Finite Element Analysis (FEA) simulations...performed in the Abaqus FEA code is presented. The translation of these results is accomplished using a newly developed Python script. Full details of...Local Weld Model in Structural FEA...
BCM-2.0 - The new version of computer code "Basic Channeling with Mathematica©"
NASA Astrophysics Data System (ADS)
Abdrashitov, S. V.; Bogdanov, O. V.; Korotchenko, K. B.; Pivovarov, Yu. L.; Rozhkova, E. I.; Tukhfatullin, T. A.; Eikhorn, Yu. L.
2017-07-01
A new symbolic-numerical code devoted to the investigation of channeling phenomena in the periodic potential of a crystal has been developed. The code is written in the Wolfram Language, taking advantage of an analytical programming approach. The newly developed packages were successfully applied to simulate scattering, radiation, electron-positron pair production, and other effects connected with the channeling of relativistic particles in aligned crystals. The results of the simulation have been validated against data from channeling experiments carried out at SAGA LS.
Solution of the lossy nonlinear Tricomi equation with application to sonic boom focusing
NASA Astrophysics Data System (ADS)
Salamone, Joseph A., III
Sonic boom focusing theory has been augmented with new terms that account for mean flow effects in the direction of propagation and for atmospheric absorption/dispersion arising from molecular relaxation of oxygen and nitrogen. The newly derived model equation was implemented numerically in a computer code. The code was numerically validated against a spectral solution for nonlinear propagation of a sinusoid through a lossy homogeneous medium, and an additional numerical check was performed to verify the linear diffraction component of the calculations. The code was experimentally validated using measured sonic boom focusing data from the NASA-sponsored Superboom Caustic Analysis and Measurement Program (SCAMP) flight test. The code results were in good agreement with both the numerical and experimental validation cases. The newly developed code was applied to examine the focusing of a NASA low-boom demonstration vehicle concept, and the resulting pressure field was calculated for several supersonic climb profiles. The shaping efforts designed into the signatures were still somewhat evident despite the effects of sonic boom focusing.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Uchibori, Akihiro; Kurihara, Akikazu; Ohshima, Hiroyuki
A multiphysics analysis system for sodium-water reaction phenomena in a steam generator of sodium-cooled fast reactors was newly developed. The analysis system consists of the mechanistic numerical analysis codes, SERAPHIM, TACT, and RELAP5. The SERAPHIM code calculates the multicomponent multiphase flow and sodium-water chemical reaction caused by discharging of pressurized water vapor. Applicability of the SERAPHIM code was confirmed through the analyses of the experiment on water vapor discharging in liquid sodium. The TACT code was developed to calculate heat transfer from the reacting jet to the adjacent tube and to predict the tube failure occurrence. The numerical models integrated into the TACT code were verified through some related experiments. The RELAP5 code evaluates thermal hydraulic behavior of water inside the tube. The original heat transfer correlations were corrected for the tube rapidly heated by the reacting jet. The developed system enables evaluation of the wastage environment and the possibility of the failure propagation.
[Trial of eye drops recognizer for visually disabled persons].
Okamoto, Norio; Suzuki, Katsuhiko; Mimura, Osamu
2009-01-01
We developed a device to enable visually disabled persons to differentiate eye drops and their doses. The new instrument is composed of a voice generator and a two-dimensional bar-code reader (LS9208). We designed voice outputs that state when (number of times per day) and where (right, left, or both eyes) to administer the eye drops. We then determined the minimum bar-code size that can be recognized. After attaching bar-codes of the appropriate size to the lateral or bottom surface of the eye drop containers, the readability of the bar-codes was compared. The minimum discriminable bar-code size was 6 mm high x 8.5 mm long. Bar-codes on the bottom surface could be recognized more easily than bar-codes on the side. Our newly developed device using bar-codes enables visually disabled persons to differentiate eye drops and their doses.
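For illustration, a minimal Python sketch of the kind of mapping such a device performs, from a scanned two-dimensional bar-code payload to the sentence sent to the voice generator; the payload format and field names here are invented for this example and are not the format used in the study.

# Hypothetical payload format: "<drug name>|<eye>|<times per day>", e.g. "timolol|B|2"
EYE_TEXT = {"R": "the right eye", "L": "the left eye", "B": "both eyes"}

def barcode_to_speech(payload):
    """Turn a scanned bar-code payload into the instruction read out by the voice generator."""
    drug, eye, times = payload.split("|")
    return "%s: instill in %s, %s times a day" % (drug, EYE_TEXT[eye], times)

print(barcode_to_speech("timolol|B|2"))   # "timolol: instill in both eyes, 2 times a day"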
Nonlinear Transient Problems Using Structure Compatible Heat Transfer Code
NASA Technical Reports Server (NTRS)
Hou, Gene
2000-01-01
The report documents the recent effort to enhance a transient linear heat transfer code so as to solve nonlinear problems. The linear heat transfer code was originally developed by Dr. Kim Bey of NASA Langley and is called the Structure-Compatible Heat Transfer (SCHT) code. The report includes four parts. The first part outlines the formulation of the heat transfer problem of concern. The second and third parts give detailed procedures to construct the nonlinear finite element equations and the Jacobian matrices required by the Newton-Raphson nonlinear iterative method. The final part summarizes the results of the numerical experiments on the newly enhanced SCHT code.
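Since the report centers on the Newton-Raphson treatment of the nonlinear finite element equations, a minimal generic sketch of that iteration is given below; the residual and Jacobian used here are toy placeholders (a single node with temperature-dependent conductivity), not the SCHT formulation itself.

import numpy as np

def newton_raphson(residual, jacobian, T0, tol=1e-8, max_iter=50):
    """Solve residual(T) = 0 for the nodal unknowns T by Newton-Raphson iteration."""
    T = np.array(T0, dtype=float)
    for _ in range(max_iter):
        R = residual(T)
        if np.linalg.norm(R) < tol:
            break
        J = jacobian(T)               # tangent (Jacobian) matrix dR/dT
        T -= np.linalg.solve(J, R)    # Newton update
    return T

# Toy example: one node, temperature-dependent conductivity k(T) = 1 + 0.1*T, load 5
res = lambda T: np.array([(1.0 + 0.1 * T[0]) * T[0] - 5.0])
jac = lambda T: np.array([[1.0 + 0.2 * T[0]]])
print(newton_raphson(res, jac, [1.0]))   # converges to about 3.66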
NASA Astrophysics Data System (ADS)
Lezberg, Erwin A.; Mularz, Edward J.; Liou, Meng-Sing
1991-03-01
The objectives and accomplishments of research in chemically reacting flows, covering both experimental and computational work, are described. The experimental research emphasizes the acquisition of reliable reacting-flow data for code validation, the development of chemical kinetics mechanisms, and the understanding of two-phase flow dynamics. Typical results from two nonreacting spray studies are presented. The computational fluid dynamics (CFD) research emphasizes the development of efficient and accurate algorithms and codes, as well as validation of methods and modeling (turbulence and kinetics) for reacting flows. Major developments of the RPLUS code and its application to mixing concepts, the General Electric combustor, and the Government baseline engine for the National Aerospace Plane are detailed. Finally, the turbulence research in the newly established Center for Modeling of Turbulence and Transition (CMOTT) is described.
NASA Technical Reports Server (NTRS)
2002-01-01
Goddard Space Flight Center and Triangle Research & Development Corporation collaborated to create "Smart Eyes," a charge coupled device camera that, for the first time, could read and measure bar codes without the use of lasers. The camera operated in conjunction with software and algorithms created by Goddard and Triangle R&D that could track bar code position and direction with speed and precision, as well as with software that could control robotic actions based on vision system input. This accomplishment was intended for robotic assembly of the International Space Station, helping NASA to increase production while using less manpower. After successfully completing the two-phase SBIR project with Goddard, Triangle R&D was awarded a separate contract from the U.S. Department of Transportation (DOT), which was interested in using the newly developed NASA camera technology to heighten automotive safety standards. In 1990, Triangle R&D and the DOT developed a mask made from a synthetic, plastic skin covering to measure facial lacerations resulting from automobile accidents. By pairing NASA's camera technology with Triangle R&D's and the DOT's newly developed mask, a system that could provide repeatable, computerized evaluations of laceration injury was born.
Professional socialization: the key to survival as a newly qualified nurse.
Mooney, Mary
2007-04-01
The impact and prevalence of professional socialization in nursing have been written about extensively. Despite the many positive developments that have taken place in nursing within the past decade, the role of professional socialization remains heavily weighted and is of particular significance to those nurses who are newly qualified. The account given by newly registered nurses in this study demonstrates that their ability and willingness to become professionally socialized determines their ease of survival at clinical level. Twelve newly qualified Irish nurses, from two separate cohorts, were interviewed to ascertain their perceptions of becoming newly qualified nurses. A grounded theory approach was used and data were analysed using thematic analysis. A category that emerged was linked very strongly with professional socialization. The respondents did not refer to professional socialization per se, but through the coding process this emerged as the linchpin of the discussion.
Propagation Effects of Wind and Temperature on Acoustic Ground Contour Levels
NASA Technical Reports Server (NTRS)
Heath, Stephanie L.; McAninch, Gerry L.
2006-01-01
Propagation characteristics for varying wind and temperature atmospheric conditions are identified using physically-limiting propagation angles to define shadow boundary regions. These angles are graphically illustrated for various wind and temperature cases using a newly developed ray-tracing propagation code.
Airborne antenna radiation pattern code user's manual
NASA Technical Reports Server (NTRS)
Burnside, Walter D.; Kim, Jacob J.; Grandchamp, Brett; Rojas, Roberto G.; Law, Philip
1985-01-01
The use of a newly developed computer code to analyze the radiation patterns of antennas mounted on an ellipsoid and in the presence of a set of finite flat plates is described. It is shown how the code allows the user to simulate a wide variety of complex electromagnetic radiation problems using the ellipsoid/plates model. The code can calculate radiation patterns around an arbitrary conical cut specified by the user. The organization of the code, definitions of input and output data, and numerous practical examples are also presented. The analysis is based on the Uniform Geometrical Theory of Diffraction (UTD), and most of the computed patterns are compared with experimental results to show the accuracy of the solution.
Re-evaluation and updating of the seismic hazard of Lebanon
NASA Astrophysics Data System (ADS)
Huijer, Carla; Harajli, Mohamed; Sadek, Salah
2016-01-01
This paper presents the results of a study undertaken to evaluate the implications of the newly mapped offshore Mount Lebanon Thrust (MLT) fault system for the seismic hazard of Lebanon and for the current seismic zoning and design parameters used by the local engineering community. This re-evaluation is critical, given that the MLT is located in close proximity to the major cities and economic centers of the country. The updated seismic hazard was assessed using probabilistic methods of analysis. The potential sources of seismic activity that affect Lebanon, along with their newly established characteristics, were integrated into an updated database which includes the newly mapped fault system. The earthquake recurrence relationships of these sources were developed from instrumental seismology data, historical records, and earlier studies undertaken to evaluate the seismic hazard of neighboring countries. Maps of peak ground acceleration contours, based on 10% probability of exceedance in 50 years (as per Uniform Building Code (UBC) 1997), as well as 0.2 and 1 s peak spectral acceleration contours, based on 2% probability of exceedance in 50 years (as per International Building Code (IBC) 2012), were also developed. Finally, spectral charts for the main coastal cities of Beirut, Tripoli, Jounieh, Byblos, Saida, and Tyre are provided for use by designers.
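For orientation, the two hazard levels quoted above correspond to the usual Poissonian return periods; the conversion below is the standard formula and is not taken from the paper.

import math

def return_period(p_exceedance, t_years):
    """Mean return period implied by a probability of exceedance p in t years (Poisson assumption)."""
    return -t_years / math.log(1.0 - p_exceedance)

print(round(return_period(0.10, 50)))   # ~475 years (UBC 1997 level)
print(round(return_period(0.02, 50)))   # ~2475 years (IBC 2012 level)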
Development of a SCALE Tool for Continuous-Energy Eigenvalue Sensitivity Coefficient Calculations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Perfetti, Christopher M; Rearden, Bradley T
2013-01-01
Two methods for calculating eigenvalue sensitivity coefficients in continuous-energy Monte Carlo applications were implemented in the KENO code within the SCALE code package. The methods were used to calculate sensitivity coefficients for several criticality safety problems and produced sensitivity coefficients that agreed well with both reference sensitivities and multigroup TSUNAMI-3D sensitivity coefficients. The newly developed CLUTCH method was observed to produce sensitivity coefficients with high figures of merit and low memory requirements, and both continuous-energy sensitivity methods met or exceeded the accuracy of the multigroup TSUNAMI-3D calculations.
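For context, the eigenvalue sensitivity coefficients these methods compute are conventionally defined as the fractional change in k-effective per fractional change in a cross section sigma (standard definition, not specific to the CLUTCH method):

S_{k,\sigma} \;=\; \frac{\partial k / k}{\partial \sigma / \sigma} \;=\; \frac{\sigma}{k}\,\frac{\partial k}{\partial \sigma}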
NASA Technical Reports Server (NTRS)
Habbal, Shadia Rifai
2005-01-01
Investigations of the physical processes responsible for coronal heating and the acceleration of the solar wind were pursued with the use of our recently developed 2D MHD solar wind code and our 1D multifluid code. In particular, we explored: (1) the role of proton temperature anisotropy in the expansion of the solar wind; (2) the role of plasma parameters at the coronal base in the formation of high-speed solar wind streams at mid-latitudes; (3) a three-fluid model of the slow solar wind; (4) the heating of coronal loops; and (5) a newly developed hybrid code for the study of ion cyclotron resonance in the solar wind.
Newly-Developed 3D GRMHD Code and its Application to Jet Formation
NASA Technical Reports Server (NTRS)
Mizuno, Y.; Nishikawa, K.-I.; Koide, S.; Hardee, P.; Fishman, G. J.
2006-01-01
We have developed a new three-dimensional general relativistic magnetohydrodynamic code by using a conservative, high-resolution shock-capturing scheme. The numerical fluxes are calculated using the HLL approximate Riemann solver scheme. The flux-interpolated constrained transport scheme is used to maintain a divergence-free magnetic field. We have performed various one-dimensional test problems in both special and general relativity by using several reconstruction methods and found that the new 3D GRMHD code shows substantial improvements over our previous model. Preliminary results show jet formation from a geometrically thin accretion disk near non-rotating and rotating black holes. We will discuss how the jet properties depend on the rotation of the black hole and on the magnetic field strength.
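A minimal sketch of the HLL approximate Riemann flux referred to above, written generically for any vector of conserved variables; the wave-speed estimates S_L and S_R are assumed to be supplied by the calling scheme, and relativistic details are omitted.

import numpy as np

def hll_flux(U_L, U_R, F_L, F_R, S_L, S_R):
    """Harten-Lax-van Leer flux from left/right conserved states U and physical fluxes F."""
    if S_L >= 0.0:
        return F_L   # all waves move right: upwind with the left flux
    if S_R <= 0.0:
        return F_R   # all waves move left: upwind with the right flux
    # otherwise use the single intermediate state bounded by the fastest left/right waves
    return (S_R * F_L - S_L * F_R + S_L * S_R * (U_R - U_L)) / (S_R - S_L)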
Coupled-cluster based R-matrix codes (CCRM): Recent developments
NASA Astrophysics Data System (ADS)
Sur, Chiranjib; Pradhan, Anil K.
2008-05-01
We report the ongoing development of the new coupled-cluster R-matrix codes (CCRM) for treating electron-ion scattering and radiative processes within the framework of the relativistic coupled-cluster method (RCC), interfaced with the standard R-matrix methodology. The RCC method is size consistent and in principle equivalent to an all-order many-body perturbation theory. The RCC method is one of the most accurate many-body theories, and has been applied to several systems. This project should enable the study of electron interactions with heavy atoms/ions, utilizing not only high speed computing platforms but also an improved theoretical description of the relativistic and correlation effects for the target atoms/ions as treated extensively within the RCC method. Here we present a comprehensive outline of the newly developed theoretical method and a schematic representation of the new suite of CCRM codes. We begin with the flowchart and description of the various stages involved in this development. We retain the notations and nomenclature of the different stages as analogous to the standard R-matrix codes.
ERIC Educational Resources Information Center
Hupp, Stephen D. A.; Reitman, David; Forde, Debra A.; Shriver, Mark D.; Kelley, Mary Lou
2008-01-01
This study investigates the validity of the Parent Instruction-Giving Game with Youngsters (PIGGY), a newly developed direct-observation system. The PIGGY is a derivative of the Dyadic Parent-Child Interaction Coding System II [DPICS-II; Eyberg, S. M., Bessmer, J., Newcomb, K., Edwards, D., Robinson, E. (1994). Manual for the Dyadic Parent-Child…
Linking Family Hardship to Children's Lives.
ERIC Educational Resources Information Center
Elder, Gen H., Jr.; And Others
1985-01-01
Used newly developed codes for parenting behavior during the Great Depression reported in the Oakland Growth Study. Results indicated that economic hardship adversely influenced the psychosocial well-being of adolescent girls, but not boys, by increasing the rejecting behavior of fathers. This effect was particularly strong for unattractive girls.…
Translating an AI application from Lisp to Ada: A case study
NASA Technical Reports Server (NTRS)
Davis, Gloria J.
1991-01-01
A set of benchmarks was developed to test the performance of a newly designed computer executing both Lisp and Ada. Among these was AutoClassII -- a large Artificial Intelligence (AI) application written in Common Lisp. The extraction of a representative subset of this complex application was aided by a Lisp Code Analyzer (LCA). The LCA enabled rapid analysis of the code, putting it in a concise and functionally readable form. An equivalent benchmark was created in Ada through manual translation of the Lisp version. A comparison of the execution results of both programs across a variety of compiler-machine combinations indicates that line-by-line translation coupled with analysis of the initial code can produce relatively efficient and reusable target code.
NASA Astrophysics Data System (ADS)
Jubran, Mohammad K.; Bansal, Manu; Kondi, Lisimachos P.
2006-01-01
In this paper, we consider the problem of optimal bit allocation for wireless video transmission over fading channels. We use a newly developed hybrid scalable/multiple-description codec that combines the functionality of both scalable and multiple-description codecs. It produces a base layer and multiple-description enhancement layers. Any of the enhancement layers can be decoded (in a non-hierarchical manner) with the base layer to improve the reconstructed video quality. Two different channel coding schemes (Rate-Compatible Punctured Convolutional (RCPC)/Cyclic Redundancy Check (CRC) coding, and product-code Reed-Solomon (RS)+RCPC/CRC coding) are used for unequal error protection of the layered bitstream. Optimal allocation of the bitrate between source and channel coding is performed over discrete sets of source coding rates and channel coding rates. Experimental results are presented for a wide range of channel conditions. Comparisons with classical scalable coding show the effectiveness of using hybrid scalable/multiple-description coding for wireless transmission.
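The joint source/channel allocation described above can be pictured as an exhaustive search over the discrete rate sets; the rate values and the distortion model in this sketch are placeholders, not the codec's actual rate-distortion behaviour.

import itertools

source_rates  = [0.25, 0.5, 1.0, 2.0]   # hypothetical source coding rates (bits/pixel)
channel_rates = [1/3, 1/2, 2/3, 3/4]    # hypothetical RCPC channel code rates
R_total = 2.0                           # total transmission budget (bits/pixel)

def expected_distortion(rs, rc):
    # placeholder model: distortion falls with source rate and rises with residual channel errors
    residual_error = 0.1 * rc           # stand-in for the post-decoding error probability
    return 1.0 / rs + 50.0 * residual_error

best = min((expected_distortion(rs, rc), rs, rc)
           for rs, rc in itertools.product(source_rates, channel_rates)
           if rs / rc <= R_total)       # transmitted rate = source rate / channel code rate
print(best)                             # lowest expected distortion and the chosen rate pair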
DNA rearrangements directed by non-coding RNAs in ciliates
Mochizuki, Kazufumi
2013-01-01
Extensive programmed rearrangement of DNA, including DNA elimination, chromosome fragmentation, and DNA descrambling, takes place in the newly developed macronucleus during the sexual reproduction of ciliated protozoa. Recent studies have revealed that two distant classes of ciliates use distinct types of non-coding RNAs to regulate such DNA rearrangement events. DNA elimination in Tetrahymena is regulated by small non-coding RNAs that are produced and utilized in an RNAi-related process. It has been proposed that the small RNAs produced from the micronuclear genome are used to identify eliminated DNA sequences by whole-genome comparison between the parental macronucleus and the micronucleus. In contrast, DNA descrambling in Oxytricha is guided by long non-coding RNAs that are produced from the parental macronuclear genome. These long RNAs are proposed to act as templates for the direct descrambling events that occur in the developing macronucleus. Both cases provide useful examples to study epigenetic chromatin regulation by non-coding RNAs. PMID:21956937
NASA Astrophysics Data System (ADS)
Aikawa, Satoru; Nakamura, Yasuhisa; Takanashi, Hitoshi
1994-02-01
This paper describes the performance of an outage-free SDH (Synchronous Digital Hierarchy) interface 256 QAM modem. An outage-free DMR (Digital Microwave Radio) is achieved by a high coding gain trellis-coded SPORT QAM and Super Multicarrier modem. A new frame format and its associated circuits connect the outage-free modem to the SDH interface. The newly designed VLSIs are key devices for developing the modem. As overall modem performance, BER (bit error rate) characteristics and equipment signatures are presented. A coding gain of 4.7 dB (at a BER of 10(exp -4)) is obtained using SPORT 256 QAM and Viterbi decoding. This coding gain is realized by trellis coding as well as by increasing the transmission rate. The roll-off factor is decreased to maintain the same frequency occupation and modulation level as an ordinary SDH 256 QAM modem.
The Association between Observed Parental Emotion Socialization and Adolescent Self-Medication
ERIC Educational Resources Information Center
Hersh, Matthew A.; Hussong, Andrea M.
2009-01-01
The current study examined the moderating influence of observed parental emotion socialization (PES) on self-medication in adolescents. Strengths of the study include the use of a newly developed observational coding system further extending the study of PES to adolescence, the use of an experience sampling method to assess the daily covariation…
NASA Astrophysics Data System (ADS)
Sandalski, Stou
Smooth particle hydrodynamics is an efficient method for modeling the dynamics of fluids. It is commonly used to simulate astrophysical processes such as binary mergers. We present a newly developed GPU-accelerated smooth particle hydrodynamics code for astrophysical simulations. The code is named...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Binotti, M.; Zhu, G.; Gray, A.
An analytical approach, as an extension of one newly developed method -- First-principle OPTical Intercept Calculation (FirstOPTIC) -- is proposed to treat the geometrical impact of three-dimensional (3-D) effects on parabolic trough optical performance. The mathematical steps of this analytical approach are presented and implemented numerically as part of the suite of FirstOPTIC code. In addition, the new code has been carefully validated against ray-tracing simulation results and available numerical solutions. This new analytical approach to treating 3-D effects will facilitate further understanding and analysis of the optical performance of trough collectors as a function of incidence angle.
NASA Technical Reports Server (NTRS)
Bidwell, Colin S.; Pinella, David; Garrison, Peter
1999-01-01
Collection efficiency and ice accretion calculations were made for a commercial transport using the NASA Lewis LEWICE3D ice accretion code, the ICEGRID3D grid code and the CMARC panel code. All of the calculations were made on a Windows 95 based personal computer. The ice accretion calculations were made for the nose, wing, horizontal tail and vertical tail surfaces. Ice shapes typifying those of a 30 minute hold were generated. Collection efficiencies were also generated for the entire aircraft using the newly developed unstructured collection efficiency method. The calculations highlight the flexibility and cost effectiveness of the LEWICE3D, ICEGRID3D, CMARC combination.
Syazwan, AI; Rafee, B Mohd; Hafizan, Juahir; Azman, AZF; Nizar, AM; Izwyn, Z; Muhaimin, AA; Yunos, MA Syafiq; Anita, AR; Hanafiah, J Muhamad; Shaharuddin, MS; Ibthisham, A Mohd; Ismail, Mohd Hasmadi; Azhar, MN Mohamad; Azizan, HS; Zulfadhli, I; Othman, J
2012-01-01
Background: To meet the current diversified health needs in workplaces, especially in nonindustrial workplaces in developing countries, an indoor air quality (IAQ) component should be included in a participatory occupational safety and health survey. Objectives: The purpose of this study was to evaluate and suggest a multidisciplinary, integrated IAQ checklist for evaluating the health risk of building occupants. This IAQ checklist is proposed to support employers, workers, and assessors in understanding a wide range of important elements in the indoor air environment and to promote awareness in nonindustrial workplaces. Methods: The general structure of and specific items in the IAQ checklist were discussed in a focus group meeting with IAQ assessors, based upon the results of a literature review, a previous industrial code of practice, and previous interviews with company employers and workers. Results: For practicality and validity, several sessions were held to elicit the opinions of company members, and, as a result, modifications were made. The newly developed IAQ checklist was finally formulated, consisting of seven core areas, nine technical areas, and 71 essential items. Each item was linked to a suitable section in the Industry Code of Practice on Indoor Air Quality published by the Department of Occupational Safety and Health. Conclusion: Combined usage of the IAQ checklist with the information from the Industry Code of Practice on Indoor Air Quality would provide easily comprehensible information and practical support. Intervention and evaluation studies using this newly developed IAQ checklist will clarify the effectiveness of this new approach in evaluating the risk of indoor air pollutants in the workplace. PMID:22570579
Differences between Children and Adults in the Recognition of Enjoyment Smiles
ERIC Educational Resources Information Center
Del Giudice, Marco; Colle, Livia
2007-01-01
The authors investigated the differences between 8-year-olds (n = 80) and adults (n = 80) in recognition of felt versus faked enjoyment smiles by using a newly developed picture set that is based on the Facial Action Coding System. The authors tested the effect of different facial action units (AUs) on judgments of smile authenticity. Multiple…
ERIC Educational Resources Information Center
DaCosta, Kneia
2006-01-01
This qualitative investigation explores the responses of 22 U.S. urban public high school students when confronted with their newly imposed school uniform policy. Specifically, the study assessed students' appraisals of the policy along with compliance and academic performance. Guided by ecological human development perspectives and grounded in…
Hypersonic simulations using open-source CFD and DSMC solvers
NASA Astrophysics Data System (ADS)
Casseau, V.; Scanlon, T. J.; John, B.; Emerson, D. R.; Brown, R. E.
2016-11-01
Hypersonic hybrid hydrodynamic-molecular gas flow solvers are required to satisfy the two essential requirements of any high-speed reacting code, these being physical accuracy and computational efficiency. The James Weir Fluids Laboratory at the University of Strathclyde is currently developing an open-source hybrid code which will eventually reconcile the direct simulation Monte-Carlo method, making use of the OpenFOAM application called dsmcFoam, and the newly coded open-source two-temperature computational fluid dynamics solver named hy2Foam. In conjunction with employing the CVDV chemistry-vibration model in hy2Foam, novel use is made of the QK rates in a CFD solver. In this paper, further testing is performed, in particular with the CFD solver, to ensure its efficacy before considering more advanced test cases. The hy2Foam and dsmcFoam codes have been shown to compare reasonably well, thus providing a useful basis for other codes to compare against.
MODELLING OF FUEL BEHAVIOUR DURING LOSS-OF-COOLANT ACCIDENTS USING THE BISON CODE
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pastore, G.; Novascone, S. R.; Williamson, R. L.
2015-09-01
This work presents recent developments to extend the BISON code to enable fuel performance analysis during LOCAs. This newly developed capability accounts for the main physical phenomena involved, as well as the interactions among them and with the global fuel rod thermo-mechanical analysis. Specifically, new multiphysics models are incorporated in the code to describe (1) transient fission gas behaviour, (2) rapid steam-cladding oxidation, (3) Zircaloy solid-solid phase transition, (4) hydrogen generation and transport through the cladding, and (5) Zircaloy high-temperature non-linear mechanical behaviour and failure. Basic model characteristics are described, and a demonstration BISON analysis of an LWR fuel rod undergoing a LOCA is presented. Also, as a first step of validation, the code with the new capability is applied to the simulation of experiments investigating cladding behaviour under LOCA conditions. A comparison of the results with the available experimental data on cladding failure due to burst is presented.
Continuous-energy eigenvalue sensitivity coefficient calculations in TSUNAMI-3D
DOE Office of Scientific and Technical Information (OSTI.GOV)
Perfetti, C. M.; Rearden, B. T.
2013-07-01
Two methods for calculating eigenvalue sensitivity coefficients in continuous-energy Monte Carlo applications were implemented in the KENO code within the SCALE code package. The methods were used to calculate sensitivity coefficients for several test problems and produced sensitivity coefficients that agreed well with both reference sensitivities and multigroup TSUNAMI-3D sensitivity coefficients. The newly developed CLUTCH method was observed to produce sensitivity coefficients with high figures of merit and a low memory footprint, and both continuous-energy sensitivity methods met or exceeded the accuracy of the multigroup TSUNAMI-3D calculations. (authors)
Two-fluid 2.5D code for simulations of small scale magnetic fields in the lower solar atmosphere
NASA Astrophysics Data System (ADS)
Piantschitsch, Isabell; Amerstorfer, Ute; Thalmann, Julia Katharina; Hanslmeier, Arnold; Lemmerer, Birgit
2015-08-01
Our aim is to investigate magnetic reconnection as a result of the time evolution of magnetic flux tubes in the solar chromosphere. A new numerical two-fluid code was developed, which will perform a 2.5D simulation of the dynamics from the upper convection zone up to the transition region. The code is based on the Total Variation Diminishing Lax-Friedrichs method and includes the effects of ion-neutral collisions, ionisation/recombination, thermal/resistive diffusivity as well as collisional/resistive heating. What is innovative about our newly developed code is the inclusion of a two-fluid model in combination with the use of analytically constructed vertically open magnetic flux tubes, which are used as initial conditions for our simulation. First magnetohydrodynamic (MHD) tests have already shown good agreement with known results of numerical MHD test problems such as the Orszag-Tang vortex test, the Current Sheet test and the Spherical Blast Wave test. Furthermore, the single-fluid approach will also be applied to the initial conditions, in order to compare the different rates of magnetic reconnection in both codes, the two-fluid code and the single-fluid one.
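A minimal single-fluid sketch of the local Lax-Friedrichs flux underlying the TVD Lax-Friedrichs scheme mentioned above, for a 1D scalar conservation law; the two-fluid coupling, source terms and flux limiting of the actual code are omitted.

import numpy as np

def llf_step(u, dx, dt, flux, wave_speed):
    """One explicit update of du/dt + df(u)/dx = 0 using the local Lax-Friedrichs interface flux."""
    uL, uR = u[:-1], u[1:]                     # left/right states at each cell interface
    a = np.maximum(wave_speed(uL), wave_speed(uR))
    F = 0.5 * (flux(uL) + flux(uR)) - 0.5 * a * (uR - uL)
    unew = u.copy()
    unew[1:-1] -= dt / dx * (F[1:] - F[:-1])   # interior cells only; boundaries left untouched
    return unew

# Example: Burgers' equation, f(u) = u^2/2 and |f'(u)| = |u|
x = np.linspace(0.0, 1.0, 101)
u = np.sin(2.0 * np.pi * x)
u = llf_step(u, x[1] - x[0], 0.002, lambda v: 0.5 * v**2, np.abs)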
The FLUKA code for space applications: recent developments
NASA Technical Reports Server (NTRS)
Andersen, V.; Ballarini, F.; Battistoni, G.; Campanella, M.; Carboni, M.; Cerutti, F.; Empl, A.; Fasso, A.; Ferrari, A.; Gadioli, E.;
2004-01-01
The FLUKA Monte Carlo transport code is widely used for fundamental research, radioprotection and dosimetry, hybrid nuclear energy system and cosmic ray calculations. The validity of its physical models has been benchmarked against a variety of experimental data over a wide range of energies, ranging from accelerator data to cosmic ray showers in the earth atmosphere. The code is presently undergoing several developments in order to better fit the needs of space applications. The generation of particle spectra according to up-to-date cosmic ray data as well as the effect of the solar and geomagnetic modulation have been implemented and already successfully applied to a variety of problems. The implementation of suitable models for heavy ion nuclear interactions has reached an operational stage. At medium/high energy FLUKA is using the DPMJET model. The major task of incorporating heavy ion interactions from a few GeV/n down to the threshold for inelastic collisions is also progressing and promising results have been obtained using a modified version of the RQMD-2.4 code. This interim solution is now fully operational, while waiting for the development of new models based on the FLUKA hadron-nucleus interaction code, a newly developed QMD code, and the implementation of the Boltzmann master equation theory for low energy ion interactions. © 2004 COSPAR. Published by Elsevier Ltd. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Draeger, Erik W.
This report documents the fact that the work in creating a strategic plan and beginning customer engagements has been completed. The description of the milestone is: The newly formed advanced architecture and portability specialists (AAPS) team will develop a strategic plan to meet the goals of 1) sharing knowledge and experience with code teams to ensure that ASC codes run well on new architectures, and 2) supplying skilled computational scientists to put the strategy into practice. The plan will be delivered to ASC management in the first quarter. By the fourth quarter, the team will identify their first customers within PEM and IC, perform an initial assessment of scalability and performance bottlenecks for next-generation architectures, and embed AAPS team members with customer code teams to assist with initial portability development within standalone kernels or proxy applications.
ERIC Educational Resources Information Center
Greene, Michele G.; And Others
1987-01-01
Using a newly developed coding method, the Geriatric Interaction Analysis system, the interactions of doctors with a matched sample of older and younger patients were audiotaped and scored. Patients and doctors raised fewer psychosocial issues in interviews with older patients than with younger patients. Doctors also responded less well to these…
Numerical investigations of low-density nozzle flow by solving the Boltzmann equation
NASA Technical Reports Server (NTRS)
Deng, Zheng-Tao; Liaw, Goang-Shin; Chou, Lynn Chen
1995-01-01
A two-dimensional finite-difference code to solve the BGK-Boltzmann equation has been developed. The solution procedure consists of three steps: (1) transforming the BGK-Boltzmann equation into two simultaneous partial differential equations by taking moments of the distribution function with respect to the molecular velocity u(sub z), with weighting factors 1 and u(sub z)(sup 2); (2) solving the transformed equations in physical space for each discrete ordinate, based on the time-marching technique and four-stage Runge-Kutta time integration, with Roe's second-order upwind difference scheme used to discretize the convective terms and the collision terms treated as source terms; and (3) using the newly calculated distribution functions at each point in physical space to calculate the macroscopic flow parameters by a modified Gaussian quadrature formula. Steps 2 and 3 are repeated until the convergence criterion is reached. A low-density nozzle flow field has been calculated with this newly developed code. The BGK-Boltzmann solution and experimental data show excellent agreement, demonstrating that numerical solutions of the BGK-Boltzmann equation are ready to be experimentally validated.
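Step (3) above, recovering macroscopic flow parameters from the discrete distribution functions, amounts to velocity-space moment integrals. The sketch below uses plain trapezoidal integration in one velocity dimension purely to show the moment relations; the paper itself uses a modified Gaussian quadrature formula.

import numpy as np

def macroscopic_moments(c, f):
    """Density, bulk velocity and 1D 'temperature' as moments of a distribution f(c) (k_B/m = 1)."""
    rho = np.trapz(f, c)
    u   = np.trapz(c * f, c) / rho
    T   = np.trapz((c - u) ** 2 * f, c) / rho
    return rho, u, T

# Example: a drifting Maxwellian sampled on a discrete velocity grid
c = np.linspace(-6.0, 6.0, 201)
f = np.exp(-(c - 0.3) ** 2) / np.sqrt(np.pi)
print(macroscopic_moments(c, f))   # approximately (1.0, 0.3, 0.5)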
Tsuru, Satoko; Okamine, Eiko; Takada, Aya; Watanabe, Chitose; Uchiyama, Makiko; Dannoue, Hideo; Aoyagi, Hisae; Endo, Akira
2009-01-01
Nursing Action Master and Nursing Observation Master were released from 2002 to 2008. Two file formats, an Excel format and a CSV format, are prepared for maintaining them. The following were decided as the basic maintenance rules: new additions, revisions, deletions, management numbering, and a coding rule. The masters were developed based on these rules, and we perform quality assurance for the masters using them.
Application of grammar-based codes for lossless compression of digital mammograms
NASA Astrophysics Data System (ADS)
Li, Xiaoli; Krishnan, Srithar; Ma, Ngok-Wah
2006-01-01
A newly developed grammar-based lossless source coding theory and its implementation were proposed in 1999 and 2000, respectively, by Yang and Kieffer. The code first transforms the original data sequence into an irreducible context-free grammar, which is then compressed using arithmetic coding. In the study of grammar-based coding for mammography applications, we encountered two issues: processing time and the limited number of single-character grammar variables (G variables). For the first issue, we discovered a feature that can simplify the matching-subsequence search in the irreducible grammar transform process. Using this discovery, an extended grammar code technique is proposed and the processing time of the grammar code can be significantly reduced. For the second issue, we propose to use double-character symbols to increase the number of grammar variables. Under the condition that all the G variables have the same probability of being used, our analysis shows that the double- and single-character approaches have the same compression rates. Using the proposed methods, we show that the grammar code can outperform three other schemes (Lempel-Ziv-Welch (LZW), arithmetic, and Huffman coding) in compression ratio, and has error-tolerance capabilities similar to LZW coding under similar circumstances.
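The grammar-transform stage can be illustrated with a simple most-frequent-digram substitution in the spirit of (but much cruder than) the Yang-Kieffer irreducible grammar transform; the arithmetic coding stage that follows it is omitted here.

from collections import Counter

def simple_grammar_transform(seq, max_rules=10):
    """Repeatedly replace the most frequent adjacent pair with a new grammar variable (Re-Pair-style sketch)."""
    seq = list(seq)
    rules = {}
    for rule_id in range(max_rules):
        pairs = Counter(zip(seq, seq[1:]))
        if not pairs:
            break
        pair, count = pairs.most_common(1)[0]
        if count < 2:
            break                      # no repeated pair left, so stop growing the grammar
        var = "G%d" % rule_id
        rules[var] = pair
        out, i = [], 0
        while i < len(seq):            # greedy left-to-right, non-overlapping replacement
            if i + 1 < len(seq) and (seq[i], seq[i + 1]) == pair:
                out.append(var)
                i += 2
            else:
                out.append(seq[i])
                i += 1
        seq = out
    return seq, rules

print(simple_grammar_transform("abababcababc"))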
A Comprehensive High Performance Predictive Tool for Fusion Liquid Metal Hydromagnetics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Huang, Peter; Chhabra, Rupanshi; Munipalli, Ramakanth
In the Phase I SBIR project, HyPerComp and Texcel initiated the development of two induction-based MHD codes as a predictive tool for fusion hydro-magnetics. The newly developed codes overcome the deficiency of other MHD codes based on the quasi-static approximation by defining a more general mathematical model that utilizes the induced magnetic field rather than the electric potential as the main electromagnetic variable. The UCLA code is a finite-difference staggered-mesh code that serves as a supplementary tool to the massively parallel finite-volume code developed by HyPerComp. As there is no suitable experimental data under blanket-relevant conditions for code validation, code-to-code comparisons and comparisons against analytical solutions were successfully performed for three selected test cases: (1) lid-driven MHD flow, (2) flow in a rectangular duct in a transverse magnetic field, and (3) unsteady finite magnetic Reynolds number flow in a rectangular enclosure. The performed tests suggest that the developed codes are accurate and robust. Further work will focus on enhancing the code capabilities towards higher flow parameters and faster computations. At the conclusion of the current Phase-II project we have completed the preliminary validation efforts in performing unsteady mixed-convection MHD flows (against the limited data currently available in the literature), and demonstrated flow behavior in large 3D channels including important geometrical features. Code enhancements such as periodic boundary conditions and unmatched mesh structures are also ready. As proposed, we have built upon these strengths and explored a much increased range of Grashof numbers and Hartmann numbers under various flow conditions, ranging from flows in a rectangular duct to prototypic blanket modules and liquid metal PFCs. Parametric studies, numerical and physical model improvements to expand the scope of simulations, code demonstration, and continued validation activities have also been completed.
Boundary modelling of the stellarator Wendelstein 7-X
NASA Astrophysics Data System (ADS)
Renner, H.; Strumberger, E.; Kisslinger, J.; Nührenberg, J.; Wobig, H.
1997-02-01
To justify the design of the divertor plates in W7-X, the magnetic fields of finite-β HELIAS equilibria for the so-called high-mirror case have been computed for various average β-values up to <β> = 0.04 with the NEMEC free-boundary equilibrium code [S.P. Hirshman, W.I. van Rij and W.I. Merkel, Comput. Phys. Commun. 43 (1986) 143] in combination with the newly developed MFBE (magnetic field solver for finite-beta equilibria) code. In a second study, the unloading of the target plates by radiation was investigated. The B2 code [B.J. Braams, Ph.D. Thesis, Rijksuniversiteit Utrecht (1986)] was applied for the first time to stellarators to provide a self-consistent modelling of the SOL including the effects of neutrals and impurities.
NASA Astrophysics Data System (ADS)
Russell, John L.; Campbell, John L.; Boyd, Nicholas I.; Dias, Johnny F.
2018-02-01
The newly developed GUMAP software creates element maps from OMDAQ list mode files, displays these maps individually or collectively, and facilitates on-screen definitions of specified regions from which a PIXE spectrum can be built. These include a free-hand region defined by moving the cursor. The regional charge is entered automatically into the spectrum file in a new GUPIXWIN-compatible format, enabling a GUPIXWIN analysis of the spectrum. The code defaults to the OMDAQ dead time treatment but also facilitates two other methods for dead time correction in sample regions with count rates different from the average.
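For orientation only, a per-region dead-time correction can be written with the standard non-paralyzable counting model shown below; this generic formula is not necessarily either of the two alternative correction methods that GUMAP provides.

def dead_time_correct(measured_counts, measured_rate, tau):
    """Non-paralyzable model: true counts = measured counts / (1 - measured_rate * tau)."""
    live_fraction = 1.0 - measured_rate * tau
    return measured_counts / live_fraction

print(dead_time_correct(10000, 5.0e3, 10.0e-6))   # 5 kHz with 10 us dead time -> ~5% correction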
DOE Office of Scientific and Technical Information (OSTI.GOV)
A. Alfonsi; C. Rabiti; D. Mandelli
The Reactor Analysis and Virtual control ENvironment (RAVEN) code is a software tool that acts as the control logic driver and post-processing engine for the newly developed thermal-hydraulic code RELAP-7. RAVEN is now a multi-purpose Probabilistic Risk Assessment (PRA) software framework that provides several functionalities: (1) deriving and actuating the control logic required to simulate the plant control system and operator actions (guided procedures), allowing on-line monitoring/controlling in the phase space; (2) performing both Monte-Carlo sampling of randomly distributed events and Dynamic Event Tree based analysis; and (3) facilitating input/output handling through a Graphical User Interface (GUI) and a post-processing data mining module.
Superintendents' Perceptions of the Effectiveness of Newly Hired Principals
ERIC Educational Resources Information Center
Lehman, Lynn E.; Boyland, Lori G.; Sriver, Shawn K.
2014-01-01
This study investigates the frequency of research-based leadership strategies utilized by newly hired school principals in the workplace. Public school superintendents in Indiana were asked to respond to two open-ended research questions. Through the use of content analysis, their comments were coded for the occurrence of effective leadership…
Assessment and Application of the ROSE Code for Reactor Outage Thermal-Hydraulic and Safety Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Liang, Thomas K.S.; Ko, F.-K.; Dai, L.-C
The currently available tools, such as RELAP5, RETRAN, and others, cannot easily and correctly perform the task of analyzing system behavior during plant outages. Therefore, a medium-sized program aimed at reactor outage simulation and evaluation, such as midloop operation (MLO) with loss of residual heat removal (RHR), has been developed. Important thermal-hydraulic processes involved during MLO with loss of RHR can be properly simulated by the newly developed reactor outage simulation and evaluation (ROSE) code. The two-region approach with a modified two-fluid model has been adopted as the theoretical basis of the ROSE code. To verify the analytical model, as a first step, posttest calculations against integral midloop experiments with loss of RHR have been performed. The excellent simulation capability of the ROSE code against the Institute of Nuclear Energy Research Integral System Test Facility test data is demonstrated. To further mature the ROSE code in simulating a full-sized pressurized water reactor, assessment against the WGOTHIC code and the Maanshan momentary-loss-of-RHR event has been undertaken. The successfully assessed ROSE code is then applied to evaluate the abnormal operation procedure (AOP) for loss of RHR during MLO (AOP 537.4) for the Maanshan plant. The ROSE code has also been successfully transplanted into the Maanshan training simulator to support operator training. How the simulator was upgraded with the ROSE code for MLO will be presented in the future.
Recent and Planned Developments in the CARI Program
2013-04-01
...software are available from the Radiobiology Research Team Website. The source code is available upon request. CARI-6 is based on the last major...Research Team at its newly founded Civil Aeromedical Research Institute (now called the Civil Aerospace Medical Institute, i.e., CAMI) to investigate...Administration, Office of Aerospace Medicine, Report DOT/FAA/AM-11/09, 2011. Online at: www.faa.gov/data_research/research/med_humanfacs/oamtechreports
Scalar collapse in AdS with an OpenCL open source code
NASA Astrophysics Data System (ADS)
Liebling, Steven L.; Khanna, Gaurav
2017-10-01
We study the spherically symmetric collapse of a scalar field in anti-de Sitter spacetime using a newly constructed, open-source code which parallelizes over heterogeneous architectures using the open standard OpenCL. An open question for this scenario concerns how to tell, a priori, whether some form of initial data will be stable or will instead develop under the turbulent instability into a black hole in the limit of vanishing amplitude. Previous work suggested the existence of islands of stability around quasi-periodic solutions, and we use this new code to examine the stability properties of approximately quasi-periodic solutions which balance energy transfer to higher modes with energy transfer to lower modes. The evolutions provide some evidence, though not conclusively, for stability of initial data sufficiently close to quasiperiodic solutions.
Evaluation of Thoracic Injury in Swine Model with a Noise Immune Stethoscope
2011-04-01
USAARL Report No. 2011-16 (Final, 22-04-2011): Evaluation of Thoracic Injury in Swine Model with a Noise Immune Stethoscope, by Alaistair Bushby, Eric J. Ansorge, Keith...and to provide life-saving interventions. This study evaluated the feasibility and sensitivity of a newly developed electronic stethoscope concept...
Self-Cohering Airborne Distributed Array
1988-06-01
...algorithms under consideration (including the newly developed algorithms). The algorithms are classified both according to the type of processing and...4.1 RADIO CAMERA DATA FORMAT AND PROCEDURES (FROM C-23): The range trace delivered by each antenna element is stored as a row of complex numbers...
NASA Technical Reports Server (NTRS)
Solomon, G.
1993-01-01
A (72,36;15) box code is constructed as a 9 x 8 matrix whose columns add to form an extended BCH-Hamming (8,4;4) code and whose rows sum to odd or even parity. The newly constructed code, due to its matrix form, is easily decodable for all seven-error and many eight-error patterns. The code comes from a slight modification in the parity (eighth) dimension of the Reed-Solomon (8,4;5) code over GF(512). Error correction uses the row sum parity information to detect errors, which then become erasures in a Reed-Solomon correction algorithm.
Dust-Particle Transport in Tokamak Edge Plasmas
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pigarov, A Y; Krasheninnikov, S I; Soboleva, T K
2005-09-12
Dust particulates in the size range of 10nm-100{micro}m are found in all fusion devices. Such dust can be generated during tokamak operation due to strong plasma/material-surface interactions. Some recent experiments and theoretical estimates indicate that dust particles can provide an important source of impurities in the tokamak plasma. Moreover, dust can be a serious threat to the safety of next-step fusion devices. In this paper, recent experimental observations on dust in fusion devices are reviewed. A physical model for dust transport simulation, and a newly developed code DUSTT, are discussed. The DUSTT code incorporates both dust dynamics due to comprehensivemore » dust-plasma interactions as well as the effects of dust heating, charging, and evaporation. The code tracks test dust particles in realistic plasma backgrounds as provided by edge-plasma transport codes. Results are presented for dust transport in current and next-step tokamaks. The effect of dust on divertor plasma profiles and core plasma contamination is examined.« less
The equation of state package FEOS for high energy density matter
NASA Astrophysics Data System (ADS)
Faik, Steffen; Tauschwitz, Anna; Iosilevskiy, Igor
2018-06-01
Adequate equation of state (EOS) data is of high interest in the growing field of high energy density physics and is especially essential for hydrodynamic simulation codes. The semi-analytical method used in the newly developed Frankfurt equation of state (FEOS) package provides easy and fast access to the EOS of, in principle, arbitrary materials. The code is based on the well-known QEOS model (More et al., 1988; Young and Corey, 1995) and is a further development of the MPQeos code (Kemp and Meyer-ter Vehn, 1988; Kemp and Meyer-ter Vehn, 1998) from the Max-Planck-Institut für Quantenoptik (MPQ) in Garching, Germany. The list of features includes the calculation of homogeneous mixtures of chemical elements and the description of the liquid-vapor two-phase region with or without a Maxwell construction. Full flexibility of the package is assured by its structure: a program library provides the EOS through an interface designed for Fortran or C/C++ codes. Two additional software tools allow for the generation of EOS tables in different file output formats and for the calculation and visualization of isolines and Hugoniot shock adiabats. As an example, the EOS of fused silica (SiO2) is calculated and compared to experimental data and other EOS codes.
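Hydrodynamic codes typically consume such EOS packages either through a library call or through tabulated data. Below is a generic bilinear table-lookup sketch in (log rho, log T); the grid, units and placeholder values are invented for illustration and do not correspond to the FEOS interface or output formats.

import numpy as np

rho_grid = np.logspace(-3, 1, 5)        # hypothetical density grid (g/cm^3)
T_grid   = np.logspace(-2, 2, 5)        # hypothetical temperature grid (eV)
P_table  = np.outer(rho_grid, T_grid)   # ideal-gas-like placeholder pressure values

def pressure(rho, T):
    """Bilinear interpolation of a tabulated EOS in (log rho, log T)."""
    i = np.searchsorted(rho_grid, rho) - 1
    j = np.searchsorted(T_grid, T) - 1
    x = (np.log(rho) - np.log(rho_grid[i])) / (np.log(rho_grid[i + 1]) - np.log(rho_grid[i]))
    y = (np.log(T) - np.log(T_grid[j])) / (np.log(T_grid[j + 1]) - np.log(T_grid[j]))
    return ((1 - x) * (1 - y) * P_table[i, j] + x * (1 - y) * P_table[i + 1, j]
            + (1 - x) * y * P_table[i, j + 1] + x * y * P_table[i + 1, j + 1])

print(pressure(0.05, 1.3))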
A deep learning method for lincRNA detection using auto-encoder algorithm.
Yu, Ning; Yu, Zeng; Pan, Yi
2017-12-06
RNA sequencing (RNA-seq) enables scientists to develop novel data-driven methods for discovering more unidentified lincRNAs. Meanwhile, knowledge-based technologies are experiencing a potential revolution ignited by new deep learning methods. By scanning newly found RNA-seq data sets, scientists have found that: (1) the expression of lincRNAs appears to be regulated, that is, relevance exists along the DNA sequences; (2) lincRNAs contain some conserved patterns/motifs tethered together by non-conserved regions. These two observations provide the reasoning for adopting knowledge-based deep learning methods in lincRNA detection. Similar to coding-region transcription, non-coding regions are split at transcriptional sites; however, regulatory RNAs rather than messenger RNAs are generated. That is, the transcribed RNAs participate in the biological process as regulatory units instead of generating proteins. Identifying these transcriptional regions within non-coding regions is the first step towards lincRNA recognition. The auto-encoder method achieves 100% and 92.4% prediction accuracy on transcription sites over the putative data sets. The experimental results also show the excellent performance of the predictive deep neural network on the lincRNA data sets compared with a support vector machine and a traditional neural network. In addition, the approach is validated on the newly discovered lincRNA data set, and one unreported transcription site is found by feeding the whole annotated sequences through the deep learning machine, which indicates that the deep learning method has extensive ability for lincRNA prediction. The transcriptional sequences of lincRNAs are collected from the annotated human DNA genome data. Subsequently, a two-layer deep neural network is developed for lincRNA detection, which adopts the auto-encoder algorithm and utilizes different encoding schemes to obtain the best performance over intergenic DNA sequence data. Driven by the newly annotated lincRNA data, deep learning methods based on the auto-encoder algorithm can exert their capability in knowledge learning to capture useful features and the information correlation along DNA genome sequences for lincRNA detection. To our knowledge, this is the first application of deep learning techniques to identifying lincRNA transcription sequences.
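A minimal numpy-only sketch of a one-hidden-layer auto-encoder of the general kind described, trained on toy one-hot-encoded 4-mers; the study's actual architecture, encoding schemes and data sets are not reproduced here.

import numpy as np

rng = np.random.default_rng(0)
kmers = rng.integers(0, 4, size=(200, 4))        # 200 random 4-mers over {A, C, G, T}
X = np.eye(4)[kmers].reshape(200, 16)            # one-hot encode and flatten to 16-dim vectors

n_in, n_hidden = X.shape[1], 8
W1 = rng.normal(0, 0.1, (n_in, n_hidden)); b1 = np.zeros(n_hidden)
W2 = rng.normal(0, 0.1, (n_hidden, n_in)); b2 = np.zeros(n_in)
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

lr = 0.5
for epoch in range(500):
    H = sigmoid(X @ W1 + b1)                     # encoder: compressed representation
    Y = sigmoid(H @ W2 + b2)                     # decoder: reconstruction of the input
    dY = (Y - X) * Y * (1 - Y)                   # gradient of the squared reconstruction error
    dH = (dY @ W2.T) * H * (1 - H)
    W2 -= lr * H.T @ dY / len(X); b2 -= lr * dY.mean(0)
    W1 -= lr * X.T @ dH / len(X); b1 -= lr * dH.mean(0)

print("reconstruction MSE:", float(np.mean((Y - X) ** 2)))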
QUANTUM ESPRESSO: a modular and open-source software project for quantum simulations of materials.
Giannozzi, Paolo; Baroni, Stefano; Bonini, Nicola; Calandra, Matteo; Car, Roberto; Cavazzoni, Carlo; Ceresoli, Davide; Chiarotti, Guido L; Cococcioni, Matteo; Dabo, Ismaila; Dal Corso, Andrea; de Gironcoli, Stefano; Fabris, Stefano; Fratesi, Guido; Gebauer, Ralph; Gerstmann, Uwe; Gougoussis, Christos; Kokalj, Anton; Lazzeri, Michele; Martin-Samos, Layla; Marzari, Nicola; Mauri, Francesco; Mazzarello, Riccardo; Paolini, Stefano; Pasquarello, Alfredo; Paulatto, Lorenzo; Sbraccia, Carlo; Scandolo, Sandro; Sclauzero, Gabriele; Seitsonen, Ari P; Smogunov, Alexander; Umari, Paolo; Wentzcovitch, Renata M
2009-09-30
QUANTUM ESPRESSO is an integrated suite of computer codes for electronic-structure calculations and materials modeling, based on density-functional theory, plane waves, and pseudopotentials (norm-conserving, ultrasoft, and projector-augmented wave). The acronym ESPRESSO stands for opEn Source Package for Research in Electronic Structure, Simulation, and Optimization. It is freely available to researchers around the world under the terms of the GNU General Public License. QUANTUM ESPRESSO builds upon newly-restructured electronic-structure codes that have been developed and tested by some of the original authors of novel electronic-structure algorithms and applied in the last twenty years by some of the leading materials modeling groups worldwide. Innovation and efficiency are still its main focus, with special attention paid to massively parallel architectures, and a great effort being devoted to user friendliness. QUANTUM ESPRESSO is evolving towards a distribution of independent and interoperable codes in the spirit of an open-source project, where researchers active in the field of electronic-structure calculations are encouraged to participate in the project by contributing their own codes or by implementing their own ideas into existing codes.
Soapy: an adaptive optics simulation written purely in Python for rapid concept development
NASA Astrophysics Data System (ADS)
Reeves, Andrew
2016-07-01
Soapy is a newly developed Adaptive Optics (AO) simulation which aims to be a flexible and fast-to-use toolkit for many applications in the field of AO. It is written purely in the Python language, adding to and taking advantage of the already rich ecosystem of scientific libraries and programs. The simulation has been designed to be extremely modular, such that each component can be used stand-alone for projects which do not require a full end-to-end simulation. Ease of use, modularity and code clarity have been prioritised at the expense of computational performance. Though this means the code is not yet suitable for large studies of Extremely Large Telescope AO systems, it is well suited to education, exploration of new AO concepts and investigations of current generation telescopes.
Enhancement/upgrade of Engine Structures Technology Best Estimator (EST/BEST) Software System
NASA Technical Reports Server (NTRS)
Shah, Ashwin
2003-01-01
This report describes the work performed during the contract period and the capabilities included in the EST/BEST software system. The developed EST/BEST software system includes the integrated NESSUS, IPACS, COBSTRAN, and ALCCA computer codes required to perform the engine cycle mission and component structural analysis. Also, interactive input generators for the NESSUS, IPACS, and COBSTRAN computer codes have been developed and integrated with the EST/BEST software system. The input generator allows the user to create input from scratch as well as edit existing input files interactively. Since it has been integrated with the EST/BEST software system, it enables the user to modify EST/BEST generated files and perform the analysis to evaluate the benefits. Appendix A gives details of how to use the newly added features in the EST/BEST software system.
Structural response of existing spatial truss roof construction based on Cosserat rod theory
NASA Astrophysics Data System (ADS)
Miśkiewicz, Mikołaj
2018-04-01
This paper presents the application of the Cosserat rod theory and a newly developed associated finite element code as tools that support expert engineering design practice. Mechanical principles of 3D spatially curved rods, the laws of dynamics (statics), and the principle of virtual work are discussed. The corresponding FEM approach, with interpolation and accumulation techniques for the state variables, is shown to enable the formulation of C0 Lagrangian rod elements with 6 degrees of freedom per node. Two test examples are presented, proving the correctness and suitability of the proposed formulation. Next, the developed FEM code is applied to assess the structural response of the spatial truss roof of the "Olivia" Sports Arena in Gdansk, Poland. The numerical results are compared with load test results. It is shown that the proposed FEM approach yields correct results.
USDA-ARS?s Scientific Manuscript database
A newly expanded digital resource exists for tracking decisions on all nomenclature proposals potentially contributing to Appendices II-VIII of the International Code of Nomenclature for algae, fungi, and plants. This resource originated with the Smithsonian Institution's Proposals and Disposals web...
A Developmental Approach to the Teaching of Ethical Decision Making.
ERIC Educational Resources Information Center
Neukrug, Edward S.
1996-01-01
Examines the newly adopted code of ethics, reviews some ethical decision-making models, and hypothesizes how the maturity of a student might mediate the effective use of codes and of decision-making models. Provides a model for human service educators that integrates ethical guidelines and ethical decision-making models. (RJM)
Code of Ethics for Electrical Engineers
NASA Astrophysics Data System (ADS)
Matsuki, Junya
The Institute of Electrical Engineers of Japan (IEEJ) has recently established rules of practice for its members, based on its code of ethics enacted in 1998. In this paper, first, the characteristics of the 1998 IEEJ ethical code are explained in detail and compared with the ethical codes of other fields of engineering. Secondly, the contents that should be included in a modern code of ethics for electrical engineers are discussed. Thirdly, the newly established rules of practice and the modified code of ethics are presented. Finally, results of questionnaires on the new ethical code and rules, answered on May 23, 2007, by 51 electrical and electronic engineering students of the University of Fukui, are shown.
Energy Savings Analysis of the Proposed Revision of the Washington D.C. Non-Residential Energy Code
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rosenberg, Michael I.; Athalye, Rahul A.; Hart, Philip R.
This report presents the results of an assessment of savings for the proposed Washington D.C. energy code relative to ASHRAE Standard 90.1-2010. It includes annual and life cycle savings for site energy, source energy, energy cost, and carbon dioxide emissions that would result from adoption and enforcement of the proposed code for newly constructed buildings in Washington D.C. over a five year period.
Signature of chaos in the 4 f -core-excited states for highly-charged tungsten ions
NASA Astrophysics Data System (ADS)
Safronova, Ulyana; Safronova, Alla
2014-05-01
We evaluate radiative and autoionizing transition rates in highly charged W ions in search of the signature of chaos. In particular, previously published results for Ag-like W27+, Tm-like W5+, and Yb-like W4+ ions, as well as newly obtained results for I-like W21+, Xe-like W20+, Cs-like W19+, and La-like W17+ ions (with ground configurations [Kr]4d10 4f^k, k = 7, 8, 9, and 11, respectively), are considered; these were calculated using the multiconfiguration relativistic Hebrew University Lawrence Livermore Atomic Code (HULLAC code) and the Hartree-Fock-Relativistic method (COWAN code). The main emphasis was on verification of Gaussian statistics of the rates as a function of transition energy. There was no evidence of such statistics for the above-mentioned previously published results, nor for the transitions between the excited and autoionizing states in the newly calculated results. However, we did find a Gaussian profile for transitions between excited states, such as the [Kr]4d10 4f^k - [Kr]4d10 4f^(k-1)5d transitions, for the newly calculated W ions. This work is supported in part by DOE under NNSA Cooperative Agreement DE-NA0001984.
A fast code for channel limb radiances with gas absorption and scattering in a spherical atmosphere
NASA Astrophysics Data System (ADS)
Eluszkiewicz, Janusz; Uymin, Gennady; Flittner, David; Cady-Pereira, Karen; Mlawer, Eli; Henderson, John; Moncet, Jean-Luc; Nehrkorn, Thomas; Wolff, Michael
2017-05-01
We present a radiative transfer code capable of accurately and rapidly computing channel limb radiances in the presence of gaseous absorption and scattering in a spherical atmosphere. The code has been prototyped for the Mars Climate Sounder measuring limb radiances in the thermal part of the spectrum (200-900 cm-1) where absorption by carbon dioxide and water vapor and absorption and scattering by dust and water ice particles are important. The code relies on three main components: 1) The Gauss Seidel Spherical Radiative Transfer Model (GSSRTM) for scattering, 2) The Planetary Line-By-Line Radiative Transfer Model (P-LBLRTM) for gas opacity, and 3) The Optimal Spectral Sampling (OSS) for selecting a limited number of spectral points to simulate channel radiances and thus achieving a substantial increase in speed. The accuracy of the code has been evaluated against brute-force line-by-line calculations performed on the NASA Pleiades supercomputer, with satisfactory results. Additional improvements in both accuracy and speed are attainable through incremental changes to the basic approach presented in this paper, which would further support the use of this code for real-time retrievals and data assimilation. Both newly developed codes, GSSRTM/OSS for MCS and P-LBLRTM, are available for additional testing and user feedback.
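For readers unfamiliar with OSS, its central approximation can be stated compactly (general form only; the node selection and the weights are trained against line-by-line reference calculations, and the details are specific to the implementation). In LaTeX notation:

\bar{R}_{\mathrm{chan}} = \int \phi(\nu)\, R(\nu)\, d\nu \;\approx\; \sum_{k=1}^{N} w_k\, R(\nu_k), \qquad N \ll N_{\mathrm{LBL}},

where \phi(\nu) is the normalized channel response function, R(\nu) is the monochromatic limb radiance, and the few nodes \nu_k with weights w_k reproduce the full channel integral to within a prescribed tolerance.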
Development of a New System for Transport Simulation and Analysis at General Atomics
NASA Astrophysics Data System (ADS)
St. John, H. E.; Peng, Q.; Freeman, J.; Crotinger, J.
1997-11-01
General Atomics has begun a long-term program to improve all aspects of experimental data analysis related to DIII-D. The objective is to make local and visiting physicists as productive as possible, with only a small investment in training, by developing intuitive, sophisticated interfaces to existing and newly created computer programs. Here we describe our initial work and the results of a pilot project in this program. The pilot project is a collaborative effort between LLNL and GA which will ultimately result in the merger of Corsica and ONETWO (and selected modules from other codes) into a new advanced transport code system. The initial goal is to produce a graphical user interface to the transport code ONETWO which will couple to a programmable (steerable) front end designed for the transport system. This will be an object-oriented scheme written primarily in Python. The programmable application will integrate existing C, C++, and Fortran methods in a single computational paradigm. Its most important feature is the use of plug-in physics modules which will allow a high degree of customization.
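A minimal sketch of the kind of steerable Python front end described above, calling a compiled physics routine through ctypes; the shared-library name, routine name and signature here are hypothetical placeholders and are not the actual ONETWO or Corsica interfaces.

import ctypes
import numpy as np

# Load a hypothetical compiled physics module (e.g. Fortran/C built as a shared library).
lib = ctypes.CDLL("./libtransport_module.so")   # placeholder library name

# Declare the assumed signature: void advance_profiles(double *te, int n, double dt)
lib.advance_profiles.argtypes = [
    np.ctypeslib.ndpointer(dtype=np.float64, flags="C_CONTIGUOUS"),
    ctypes.c_int,
    ctypes.c_double,
]
lib.advance_profiles.restype = None

def advance(te_profile, dt):
    # Thin Python wrapper; the steerable layer can script many such calls.
    te = np.ascontiguousarray(te_profile, dtype=np.float64)
    lib.advance_profiles(te, te.size, dt)
    return te

# A scripted "experiment": evolve an electron-temperature profile for 100 steps.
te = np.linspace(5.0, 0.1, 51)     # keV, core to edge (illustrative values)
for _ in range(100):
    te = advance(te, dt=1.0e-3)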
Recent progress in the analysis of iced airfoils and wings
NASA Technical Reports Server (NTRS)
Cebeci, Tuncer; Chen, Hsun H.; Kaups, Kalle; Schimke, Sue
1992-01-01
Recent work on the analysis of iced airfoils and wings is described. Ice shapes for multielement airfoils and wings are computed using an extension of the LEWICE code that was developed for single airfoils. The aerodynamic properties of the iced wing are determined with an interactive scheme in which the solutions of the inviscid flow equations are obtained from a panel method and the solutions of the viscous flow equations are obtained from an inverse three-dimensional finite-difference boundary-layer method. A new interaction law is used to couple the inviscid and viscous flow solutions. The newly developed LEWICE multielement code is applied to a high-lift configuration to calculate the ice shapes on the slat and the main airfoil, and to a four-element airfoil. The application of the LEWICE wing code to the calculation of ice shapes on an MS-317 swept wing shows good agreement with measurements. The interactive boundary-layer method is applied to a tapered iced wing in order to study the effect of icing on the aerodynamic properties of the wing at several angles of attack.
Thermodynamic Analysis of Coherently Grown GaAsN/Ge: Effects of Different Gaseous Sources
NASA Astrophysics Data System (ADS)
Kawano, Jun; Kangawa, Yoshihiro; Yayama, Tomoe; Kakimoto, Koichi; Koukitu, Akinori
2013-04-01
Thermodynamic analysis of coherently grown GaAs1-xNx on Ge with low N content was performed to determine the relationship between solid composition and growth conditions. In this study, a new algorithm for the simulation code, which is applicable to wider combinations of gaseous sources than the traditional algorithm, was developed to determine the influence of different gaseous sources on N incorporation. Using this code, here we successfully compared two cases: one is a system using trimethylgallium (TMG), AsH3, and NH3, and the other uses dimethylhydrazine (DMHy) instead of NH3. It was found that the optimal N/As ratio of input gas in the system using DMHy was much lower than that using NH3. This shows that the newly developed algorithm could be a useful tool for analyzing the N incorporation during the vapor growth of GaAs1-xNx.
New quantum codes constructed from quaternary BCH codes
NASA Astrophysics Data System (ADS)
Xu, Gen; Li, Ruihu; Guo, Luobin; Ma, Yuena
2016-10-01
In this paper, we first study the construction of new quantum error-correcting codes (QECCs) from three classes of quaternary imprimitive BCH codes. As a result, the improved maximal designed distance of these narrow-sense imprimitive Hermitian dual-containing quaternary BCH codes is determined to be much larger than the result given by Aly et al. (IEEE Trans Inf Theory 53:1183-1188, 2007) for each different code length. Thus, families of new QECCs are obtained, and the constructed QECCs have larger distance than those in the previous literature. Secondly, we apply a combinatorial construction to the imprimitive BCH codes with their corresponding primitive counterparts and construct many new linear quantum codes with good parameters, some of which have parameters exceeding the finite Gilbert-Varshamov bound for linear quantum codes.
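For context, the standard Hermitian construction underlying results of this kind is the following well-known statement (quoted as general background, not as the authors' specific parameters): if C is an [n, k, d] linear code over F_{q^2} containing its Hermitian dual, C^{\perp_H} \subseteq C, then there exists a q-ary quantum code with parameters

[[\,n,\; 2k - n,\; \geq d\,]]_q .

Enlarging the designed distance of a Hermitian dual-containing BCH code therefore directly enlarges the guaranteed minimum distance of the derived QECC.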
2005-05-01
[Fragmented report record; recoverable details only: research chemist (from September 1987) at Hangzhou First Pharmaceutical Company, Hangzhou, P. R. China, working on purification of intermediates for antibiotic medicines; preliminary results obtained with a newly installed two-dimensional electrophoresis setup using a BioRad PROTEAN® second-dimension system and a VersaDoc™ Imaging System with PDQUEST software.]
Studying Turbulence Using Numerical Simulation Databases, 2. Proceedings of the 1988 Summer Program
NASA Technical Reports Server (NTRS)
1988-01-01
The focus of the program was on the use of direct numerical simulations of turbulent flow for study of turbulence physics and modeling. A special interest was placed on turbulent mixing layers. The required data for these investigations were generated from four newly developed codes for simulation of time and spatially developing incompressible and compressible mixing layers. Also of interest were the structure of wall bounded turbulent and transitional flows, evaluation of diagnostic techniques for detection of organized motions, energy transfer in isotropic turbulence, optical propagation through turbulent media, and detailed analysis of the interaction of vortical structures.
The Pluralisation of Family Life: Implications for Preschool Education
ERIC Educational Resources Information Center
Šebart, Mojca Kovac; Kuhar, Roman
2017-01-01
The article takes as its starting point the public debate about the newly proposed Family Code in Slovenia in 2009. Inter alia, the Code introduced a new, inclusive definition of the family in accordance with the contemporary pluralisation of family life. This raised a number of questions about how--if at all--various families are addressed in the…
NASA Astrophysics Data System (ADS)
Vesselinov, V. V.; Harp, D.
2010-12-01
The process of decision making to protect groundwater resources requires a detailed estimation of uncertainties in model predictions. Various uncertainties associated with modeling a natural system, such as: (1) measurement and computational errors; (2) uncertainties in the conceptual model and model-parameter estimates; (3) simplifications in model setup and numerical representation of governing processes, contribute to the uncertainties in the model predictions. Due to this combination of factors, the sources of predictive uncertainties are generally difficult to quantify individually. Decision support related to optimal design of monitoring networks requires (1) detailed analyses of existing uncertainties related to model predictions of groundwater flow and contaminant transport, (2) optimization of the proposed monitoring network locations in terms of their efficiency to detect contaminants and provide early warning. We apply existing and newly-proposed methods to quantify predictive uncertainties and to optimize well locations. An important aspect of the analysis is the application of newly-developed optimization technique based on coupling of Particle Swarm and Levenberg-Marquardt optimization methods which proved to be robust and computationally efficient. These techniques and algorithms are bundled in a software package called MADS. MADS (Model Analyses for Decision Support) is an object-oriented code that is capable of performing various types of model analyses and supporting model-based decision making. The code can be executed under different computational modes, which include (1) sensitivity analyses (global and local), (2) Monte Carlo analysis, (3) model calibration, (4) parameter estimation, (5) uncertainty quantification, and (6) model selection. The code can be externally coupled with any existing model simulator through integrated modules that read/write input and output files using a set of template and instruction files (consistent with the PEST I/O protocol). MADS can also be internally coupled with a series of built-in analytical simulators. MADS provides functionality to work directly with existing control files developed for the code PEST (Doherty 2009). To perform the computational modes mentioned above, the code utilizes (1) advanced Latin-Hypercube sampling techniques (including Improved Distributed Sampling), (2) various gradient-based Levenberg-Marquardt optimization methods, (3) advanced global optimization methods (including Particle Swarm Optimization), and (4) a selection of alternative objective functions. The code has been successfully applied to perform various model analyses related to environmental management of real contamination sites. Examples include source identification problems, quantification of uncertainty, model calibration, and optimization of monitoring networks. The methodology and software codes are demonstrated using synthetic and real case studies where monitoring networks are optimized taking into account the uncertainty in model predictions of contaminant transport.
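The hybrid global-local idea mentioned above (Particle Swarm exploration followed by Levenberg-Marquardt refinement) can be sketched as follows; this is an illustrative SciPy-based sketch and not the MADS implementation, and the residual function, parameter bounds and swarm settings are placeholders.

import numpy as np
from scipy.optimize import least_squares

rng = np.random.default_rng(1)

def residuals(p):
    # Placeholder calibration residuals (model minus synthetic observations).
    x = np.linspace(0.0, 1.0, 20)
    obs = 2.0 * np.exp(-3.0 * x) + 0.5
    return p[0] * np.exp(-p[1] * x) + p[2] - obs

def objective(p):
    return 0.5 * np.sum(residuals(p) ** 2)

# Global stage: a basic particle swarm over the parameter box.
lo, hi = np.array([0.0, 0.0, -1.0]), np.array([5.0, 10.0, 1.0])
n_particles, n_iter = 30, 50
pos = rng.uniform(lo, hi, size=(n_particles, 3))
vel = np.zeros_like(pos)
pbest = pos.copy()
pbest_f = np.array([objective(p) for p in pos])
gbest = pbest[np.argmin(pbest_f)].copy()

for _ in range(n_iter):
    r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos = np.clip(pos + vel, lo, hi)
    f = np.array([objective(p) for p in pos])
    improved = f < pbest_f
    pbest[improved], pbest_f[improved] = pos[improved], f[improved]
    gbest = pbest[np.argmin(pbest_f)].copy()

# Local stage: Levenberg-Marquardt polish of the swarm's best point.
fit = least_squares(residuals, gbest, method="lm")
print("PSO best:", gbest, "-> LM refined:", fit.x)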
Comparison of measured and computed phase functions of individual tropospheric ice crystals
NASA Astrophysics Data System (ADS)
Stegmann, Patrick G.; Tropea, Cameron; Järvinen, Emma; Schnaiter, Martin
2016-07-01
Airplanes passing through the incus (Latin: anvil) regions of tropical cumulonimbus clouds are at risk of suffering an engine power-loss event and engine damage due to ice ingestion (Mason et al., 2006 [1]). Research in this field relies on optical measurement methods to characterize ice crystals; however, the design and implementation of such methods presently suffer from the lack of reliable and efficient means of predicting the light scattering from ice crystals. The nascent discipline of direct measurement of phase functions of ice crystals, in conjunction with particle imaging and forward modelling through geometrical-optics-derivative and transition-matrix codes, for the first time allows us to obtain a deeper understanding of the optical properties of real tropospheric ice crystals. In this manuscript, a sample phase function obtained via the Particle Habit Imaging and Polar Scattering (PHIPS) probe during a measurement campaign in flight over Brazil will be compared to three different light scattering codes. These include a newly developed first-order geometrical optics code taking into account the influence of the Gaussian beam illumination used in the PHIPS device, as well as the reference ray tracing code of Macke and the T-matrix code of Kahnert.
Modelling Dynamic Behaviour and Spall Failure of Aluminium Alloy AA7010
NASA Astrophysics Data System (ADS)
Ma'at, N.; Nor, M. K. Mohd; Ismail, A. E.; Kamarudin, K. A.; Jamian, S.; Ibrahim, M. N.; Awang, M. K.
2017-10-01
A finite strain constitutive model to predict the dynamic deformation behaviour of Aluminium Alloy 7010, including shockwaves and spall failure, is developed in this work. The important feature of this new hyperelastic-plastic constitutive formulation is a new Mandel stress tensor formulated using a new generalized orthotropic pressure. This tensor is combined with a shock equation of state (EOS) and the Grady spall failure model. Hill's yield criterion is adopted to characterize plastic orthotropy by means of evolving structural tensors defined in the isoclinic configuration. The material model is decomposed into elastic and plastic parts. The elastic anisotropy is taken into account through the new stress tensor decomposition based on a generalized orthotropic pressure. Plastic anisotropy is considered through the yield surface and an isotropic hardening defined in a unique alignment of the deviatoric plane within the stress space. To test its ability to describe shockwave propagation and spall failure, the new material model was implemented into UTHM's version of the LLNL-DYNA3D code. The capability of this new constitutive model was assessed against published experimental data from plate impact tests at 234 m/s, 450 m/s and 895 m/s impact velocities. Good agreement is obtained between experiment and simulation in each test.
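For reference, the Hill (1948) quadratic yield criterion adopted above has the familiar form (standard background; the anisotropy coefficients F, G, H, L, M, N are calibrated to the material, and in the model the associated structural tensors evolve in the isoclinic configuration). In LaTeX notation:

F(\sigma_{22}-\sigma_{33})^2 + G(\sigma_{33}-\sigma_{11})^2 + H(\sigma_{11}-\sigma_{22})^2 + 2L\,\sigma_{23}^2 + 2M\,\sigma_{31}^2 + 2N\,\sigma_{12}^2 = 1 .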
A computer code for multiphase all-speed transient flows in complex geometries. MAST version 1.0
NASA Technical Reports Server (NTRS)
Chen, C. P.; Jiang, Y.; Kim, Y. M.; Shang, H. M.
1991-01-01
The operation of the MAST code, which computes transient solutions to the multiphase flow equations applicable to all-speed flows, is described. Two-phase flows are formulated based on the Eulerian-Lagrangian scheme, in which the continuous phase is described by the Navier-Stokes equations (or Reynolds equations for turbulent flows). The dispersed phase is formulated by a Lagrangian tracking scheme. The numerical solution algorithm utilized for fluid flows is a newly developed pressure-implicit algorithm based on the operator-splitting technique in generalized nonorthogonal coordinates. This operator splitting allows separate operation on each of the variable fields to handle pressure-velocity coupling. The resulting pressure correction equation is hyperbolic in nature and is effective for Mach numbers ranging from the incompressible limit to supersonic flow regimes. The present code adopts a nonstaggered grid arrangement; thus, the velocity components and other dependent variables are collocated at the same grid points. A sequence of benchmark-quality problems, including incompressible, subsonic, transonic, supersonic, and gas-droplet two-phase flows, as well as spray-combustion problems, was performed to demonstrate the robustness and accuracy of the present code.
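As a rough sketch of the Lagrangian tracking idea used for the dispersed phase (not the MAST implementation; the gas velocity field, drag correlation and droplet properties below are illustrative assumptions):

import numpy as np

def gas_velocity(x):
    # Placeholder continuous-phase velocity field sampled at the droplet position.
    return np.array([1.0, 0.1 * np.sin(2.0 * np.pi * x[0]), 0.0])

def track_droplet(x0, v0, d, rho_p, mu_g, rho_g, dt, n_steps):
    # Integrate one droplet with a Stokes-like drag response time tau.
    tau = rho_p * d**2 / (18.0 * mu_g)                   # particle response time
    x, v = np.array(x0, float), np.array(v0, float)
    path = [x.copy()]
    for _ in range(n_steps):
        u = gas_velocity(x)
        rep = rho_g * d * np.linalg.norm(u - v) / mu_g   # slip Reynolds number
        drag = 1.0 + 0.15 * rep**0.687 if rep > 0 else 1.0   # Schiller-Naumann correction
        v += dt * drag * (u - v) / tau
        x += dt * v
        path.append(x.copy())
    return np.array(path)

path = track_droplet(x0=[0, 0, 0], v0=[0, 0, 0], d=50e-6,
                     rho_p=800.0, mu_g=1.8e-5, rho_g=1.2, dt=1e-4, n_steps=200)
print(path[-1])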
Supernova Light Curves and Spectra from Two Different Codes: Supernu and Phoenix
NASA Astrophysics Data System (ADS)
Van Rossum, Daniel R; Wollaeger, Ryan T
2014-08-01
The observed similarities between light curve shapes from Type Ia supernovae, and in particular the correlation of light curve shape and brightness, have been actively studied for more than two decades. In recent years, hydrodynamic simulations of white dwarf explosions have advanced greatly, and multiple mechanisms that could potentially produce Type Ia supernovae have been explored in detail. The question of which of the proposed mechanisms is (or are) possibly realized in nature remains challenging to answer, but detailed synthetic light curves and spectra from explosion simulations are very helpful and important guidelines towards answering it. We present results from a newly developed radiation transport code, Supernu. Supernu solves the supernova radiation transfer problem using a novel technique based on a hybrid between Implicit Monte Carlo and Discrete Diffusion Monte Carlo. This technique enhances the efficiency with respect to traditional Implicit Monte Carlo codes and thus lends itself well to multi-dimensional simulations. We show direct comparisons of light curves and spectra from Type Ia simulations with Supernu versus the legacy Phoenix code.
NASA Astrophysics Data System (ADS)
Nagakura, Hiroki; Iwakami, Wakana; Furusawa, Shun; Sumiyoshi, Kohsuke; Yamada, Shoichi; Matsufuru, Hideo; Imakura, Akira
2017-04-01
We present a newly developed moving-mesh technique for the multi-dimensional Boltzmann-Hydro code for the simulation of core-collapse supernovae (CCSNe). What makes this technique different from others is the fact that it treats not only hydrodynamics but also neutrino transfer in the language of the 3 + 1 formalism of general relativity (GR), making use of the shift vector to specify the time evolution of the coordinate system. This means that the transport part of our code is essentially general relativistic, although in this paper it is applied only to moving curvilinear coordinates in flat Minkowski spacetime, since the gravity part is still Newtonian. The numerical aspects of the implementation are also described in detail. Employing the axisymmetric two-dimensional version of the code, we conduct two test computations: oscillations and runaways of a proto-neutron star (PNS). We show that our new method works well, tracking the motions of the PNS correctly. We believe that this is a major advancement toward the realistic simulation of CCSNe.
Recognition of Protein-coding Genes Based on Z-curve Algorithms
-Biao Guo, Feng; Lin, Yan; -Ling Chen, Ling
2014-01-01
Recognition of protein-coding genes, a classical bioinformatics issue, is an absolutely needed step for annotating newly sequenced genomes. The Z-curve algorithm, as one of the most effective methods on this issue, has been successfully applied in annotating or re-annotating many genomes, including those of bacteria, archaea and viruses. Two Z-curve based ab initio gene-finding programs have been developed: ZCURVE (for bacteria and archaea) and ZCURVE_V (for viruses and phages). ZCURVE_C (for 57 bacteria) and Zfisher (for any bacterium) are web servers for re-annotation of bacterial and archaeal genomes. The above four tools can be used for genome annotation or re-annotation, either independently or combined with the other gene-finding programs. In addition to recognizing protein-coding genes and exons, Z-curve algorithms are also effective in recognizing promoters and translation start sites. Here, we summarize the applications of Z-curve algorithms in gene finding and genome annotation. PMID:24822027
Radiation and polarization signatures of the 3D multizone time-dependent hadronic blazar model
Zhang, Haocheng; Diltz, Chris; Bottcher, Markus
2016-09-23
We present a newly developed time-dependent three-dimensional multizone hadronic blazar emission model. By coupling a Fokker–Planck-based lepto-hadronic particle evolution code, 3DHad, with a polarization-dependent radiation transfer code, 3DPol, we are able to study the time-dependent radiation and polarization signatures of a hadronic blazar model for the first time. Our current code is limited to parameter regimes in which the hadronic γ-ray output is dominated by proton synchrotron emission, neglecting pion production. Our results demonstrate that the time-dependent flux and polarization signatures are generally dominated by the relation between the synchrotron cooling and the light-crossing timescale, which is largely independent of the exact model parameters. We find that unlike the low-energy polarization signatures, which can vary rapidly in time, the high-energy polarization signatures appear stable. Lastly, future high-energy polarimeters may be able to distinguish such signatures from the lower and more rapidly variable polarization signatures expected in leptonic models.
NASA Astrophysics Data System (ADS)
Liu, Tianyu; Du, Xining; Ji, Wei; Xu, X. George; Brown, Forrest B.
2014-06-01
For nuclear reactor analysis such as the neutron eigenvalue calculations, the time consuming Monte Carlo (MC) simulations can be accelerated by using graphics processing units (GPUs). However, traditional MC methods are often history-based, and their performance on GPUs is affected significantly by the thread divergence problem. In this paper we describe the development of a newly designed event-based vectorized MC algorithm for solving the neutron eigenvalue problem. The code was implemented using NVIDIA's Compute Unified Device Architecture (CUDA), and tested on a NVIDIA Tesla M2090 GPU card. We found that although the vectorized MC algorithm greatly reduces the occurrence of thread divergence thus enhancing the warp execution efficiency, the overall simulation speed is roughly ten times slower than the history-based MC code on GPUs. Profiling results suggest that the slow speed is probably due to the memory access latency caused by the large amount of global memory transactions. Possible solutions to improve the code efficiency are discussed.
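To illustrate the event-based idea in the simplest possible terms, the NumPy sketch below batches all live particles through the same event (free flight, then collision classification) instead of following one history at a time; the one-group cross sections are arbitrary toy values, and this is not the CUDA implementation described above.

import numpy as np

rng = np.random.default_rng(2)

# One-group, infinite-medium toy problem: each collision either scatters or absorbs.
sigma_t, sigma_s = 1.0, 0.7          # total and scattering macroscopic cross sections (1/cm)
n = 100_000
alive = np.ones(n, dtype=bool)
path_length = np.zeros(n)

while alive.any():
    idx = np.flatnonzero(alive)
    # Event 1 (batched): sample a flight distance for every live particle at once.
    path_length[idx] += -np.log(rng.random(idx.size)) / sigma_t
    # Event 2 (batched): classify the collision outcome for the whole batch.
    scattered = rng.random(idx.size) < sigma_s / sigma_t
    alive[idx[~scattered]] = False   # absorbed particles terminate
    # Scattered particles would be assigned new directions here; in this
    # isotropic one-group toy problem that step changes nothing.

print("mean path length to absorption:", path_length.mean())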
New nonbinary quantum codes with larger distance constructed from BCH codes over 𝔽q2
NASA Astrophysics Data System (ADS)
Xu, Gen; Li, Ruihu; Fu, Qiang; Ma, Yuena; Guo, Luobin
2017-03-01
This paper concentrates on construction of new nonbinary quantum error-correcting codes (QECCs) from three classes of narrow-sense imprimitive BCH codes over finite field 𝔽q2 (q ≥ 3 is an odd prime power). By a careful analysis on properties of cyclotomic cosets in defining set T of these BCH codes, the improved maximal designed distance of these narrow-sense imprimitive Hermitian dual-containing BCH codes is determined to be much larger than the result given according to Aly et al. [S. A. Aly, A. Klappenecker and P. K. Sarvepalli, IEEE Trans. Inf. Theory 53, 1183 (2007)] for each different code length. Thus families of new nonbinary QECCs are constructed, and the newly obtained QECCs have larger distance than those in previous literature.
P80 SRM low torque flex-seal development - thermal and chemical modeling of molding process
NASA Astrophysics Data System (ADS)
Descamps, C.; Gautronneau, E.; Rousseau, G.; Daurat, M.
2009-09-01
The development of the flex-seal component of the P80 nozzle gave the opportunity to set up new design and manufacturing process methods. Due to the short development lead time required by the VEGA program, the usual iterative manufacturing-test workflow, which is time consuming, had to be enhanced with a more predictive approach. A newly refined rubber vulcanization description was built up and identified on laboratory samples. This chemical model was implemented in a thermal analysis code. The complete model successfully supports the manufacturing processes. These activities were conducted with the support of ESA/CNES Research & Technologies and DGA (General Delegation for Armament).
INL Results for Phases I and III of the OECD/NEA MHTGR-350 Benchmark
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gerhard Strydom; Javier Ortensi; Sonat Sen
2013-09-01
The Idaho National Laboratory (INL) Very High Temperature Reactor (VHTR) Technology Development Office (TDO) Methods Core Simulation group led the construction of the Organization for Economic Cooperation and Development (OECD) Modular High Temperature Reactor (MHTGR) 350 MW benchmark for comparing and evaluating prismatic VHTR analysis codes. The benchmark is sponsored by the OECD's Nuclear Energy Agency (NEA), and the project will yield a set of reference steady-state, transient, and lattice depletion problems that can be used by the Department of Energy (DOE), the Nuclear Regulatory Commission (NRC), and vendors to assess their code suites. The Methods group is responsible for defining the benchmark specifications, leading the data collection and comparison activities, and chairing the annual technical workshops. This report summarizes the latest INL results for Phase I (steady state) and Phase III (lattice depletion) of the benchmark. The INSTANT, Pronghorn and RattleSnake codes were used for the standalone core neutronics modeling of Exercise 1, and the results obtained from these codes are compared in Section 4. Exercise 2 of Phase I requires the standalone steady-state thermal fluids modeling of the MHTGR-350 design, and the results for the systems code RELAP5-3D are discussed in Section 5. The coupled neutronics and thermal fluids steady-state solution for Exercise 3 is reported in Section 6, utilizing the newly developed Parallel and Highly Innovative Simulation for INL Code System (PHISICS)/RELAP5-3D code suite. Finally, the lattice depletion models and results obtained for Phase III are compared in Section 7. The MHTGR-350 benchmark proved to be a challenging set of simulation problems to model accurately, and even with the simplifications introduced in the benchmark specification this activity is an important step in the code-to-code verification of modern prismatic VHTR codes. A final OECD/NEA comparison report will compare the Phase I and III results of all other international participants in 2014, while the remaining Phase II transient case results will be reported in 2015.
NASA Technical Reports Server (NTRS)
Kojima, Jun; Nguyen, Quang-Viet
2007-01-01
In support of NASA ARMD's code validation project, we have made significant progress by providing the first quantitative single-shot multi-scalar data from a turbulent elevated-pressure (5 atm), swirl-stabilized, lean direct injection (LDI) type research burner operating on CH4-air, using a spatially resolved pulsed-laser spontaneous Raman diagnostic technique. The Raman diagnostics apparatus and data analysis that we present here were developed over the past 6 years at Glenn Research Center. From the Raman scattering data, we produce spatially mapped probability density functions (PDFs) of the instantaneous temperature, determined using a newly developed low-resolution effective rotational bandwidth (ERB) technique. The measured 3-scalar (triplet) correlations between temperature, CH4, and O2 concentrations, as well as their PDFs, also provide a high level of detail on the nature and extent of the turbulent mixing process and its impact on chemical reactions in a realistic gas turbine injector flame at elevated pressures. The multi-scalar triplet data set presented here provides a good validation case for CFD combustion codes to simulate, by providing both average and statistical values for the 3 measured scalars.
Application of the MCNPX-McStas interface for shielding calculations and guide design at ESS
NASA Astrophysics Data System (ADS)
Klinkby, E. B.; Knudsen, E. B.; Willendrup, P. K.; Lauritzen, B.; Nonbøl, E.; Bentley, P.; Filges, U.
2014-07-01
Recently, an interface between the Monte Carlo code MCNPX and the neutron ray-tracing code McStas was developed [1, 2]. Based on the expected neutronic performance and guide geometries relevant for the ESS, the combined MCNPX-McStas code is used to calculate dose rates along neutron beam guides. The generation and moderation of neutrons is simulated using a full-scale MCNPX model of the ESS target monolith. Upon entering the neutron beam extraction region, the individual neutron states are handed to McStas via the MCNPX-McStas interface. McStas transports the neutrons through the beam guide, and by using the newly developed event-logging capability, the neutron state parameters corresponding to un-reflected neutrons are recorded at each scattering. This information is handed back to MCNPX, where it serves as the neutron source input for a second MCNPX simulation. This simulation enables calculation of dose rates in the vicinity of the guide. In addition, the logging mechanism is employed to record the scatterings along the guides, which is exploited to simulate the supermirror quality requirements (i.e. m-values) needed at different positions along the beam guide to transport neutrons in the same guide/source setup.
An Object-Oriented Serial DSMC Simulation Package
NASA Astrophysics Data System (ADS)
Liu, Hongli; Cai, Chunpei
2011-05-01
A newly developed three-dimensional direct simulation Monte Carlo (DSMC) simulation package, named GRASP ("Generalized Rarefied gAs Simulation Package"), is reported in this paper. This package utilizes the concept of a simulation engine, many C++ features, and software design patterns. The package has an open architecture which can benefit further development and maintenance of the code. In order to reduce the engineering time for three-dimensional models, a hybrid grid scheme, combined with a flexible data structure implemented in C++, is employed in this package. This scheme utilizes a local data structure based on the computational cell to achieve high performance on workstation processors. This data structure allows the DSMC algorithm to be very efficiently parallelized with domain decomposition, and it provides much flexibility in terms of grid types. This package can utilize traditional structured, unstructured, or hybrid grids within the framework of a single code to model arbitrarily complex geometries and to simulate rarefied gas flows. Benchmark test cases indicate that this package has satisfactory accuracy for complex rarefied gas flows.
MicroRNAs and intellectual disability (ID) in Down syndrome, X-linked ID, and Fragile X syndrome
Siew, Wei-Hong; Tan, Kai-Leng; Babaei, Maryam Abbaspour; Cheah, Pike-See; Ling, King-Hwa
2013-01-01
Intellectual disability (ID) is one of the many features manifested in various genetic syndromes, leading to deficits in cognitive function among affected individuals. ID is a feature affected by polygenes and multiple environmental factors. It leads to a broad spectrum of affected clinical and behavioral characteristics among patients. Until now, the causative mechanism of ID has remained unknown and the progression of the condition is poorly understood. Advancements in technology and research have identified various genetic abnormalities and defects as potential causes of ID. However, the link between these abnormalities and ID remains inconclusive, and the roles of many newly discovered genetic components, such as non-coding RNAs, have not been thoroughly investigated. In this review, we aim to consolidate and assimilate the latest developments and findings on the involvement of a class of small non-coding RNAs known as microRNAs (miRNAs) in ID development and progression, with special focus on Down syndrome (DS) and X-linked ID (XLID) [including Fragile X syndrome (FXS)]. PMID:23596395
DOE Office of Scientific and Technical Information (OSTI.GOV)
Neely, J. R.; Hornung, R.; Black, A.
This document serves as a detailed companion to the PowerPoint slides presented as part of the ASC L2 milestone review for Integrated Codes milestone #4782, titled “Assess Newly Emerging Programming and Memory Models for Advanced Architectures on Integrated Codes”, due on 9/30/2014 and presented for formal program review on 9/12/2014. The program review committee is represented by Mike Zika (A Program Project Lead for Kull), Brian Pudliner (B Program Project Lead for Ares), Scott Futral (DEG Group Lead in LC), and Mike Glass (Sierra Project Lead at Sandia). This document, along with the presentation materials and a letter of completion signed by the review committee, will act as proof of completion for this milestone.
A Granular Self-Organizing Map for Clustering and Gene Selection in Microarray Data.
Ray, Shubhra Sankar; Ganivada, Avatharam; Pal, Sankar K
2016-09-01
A new granular self-organizing map (GSOM) is developed by integrating the concept of a fuzzy rough set with the SOM. While training the GSOM, the weights of a winning neuron and the neighborhood neurons are updated through a modified learning procedure. The neighborhood is newly defined using the fuzzy rough sets. The clusters (granules) evolved by the GSOM are presented to a decision table as its decision classes. Based on the decision table, a method of gene selection is developed. The effectiveness of the GSOM is shown in both clustering samples and developing an unsupervised fuzzy rough feature selection (UFRFS) method for gene selection in microarray data. While the superior results of the GSOM, as compared with the related clustering methods, are provided in terms of β -index, DB-index, Dunn-index, and fuzzy rough entropy, the genes selected by the UFRFS are not only better in terms of classification accuracy and a feature evaluation index, but also statistically more significant than the related unsupervised methods. The C-codes of the GSOM and UFRFS are available online at http://avatharamg.webs.com/software-code.
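For background, the plain SOM weight update that the GSOM builds on is sketched below; the Gaussian neighborhood used here stands in for the fuzzy-rough neighborhood that is the paper's actual contribution, and the grid size, decay schedules and random data are illustrative.

import numpy as np

rng = np.random.default_rng(3)

def train_som(data, grid=(6, 6), n_iter=2000, lr0=0.5, sigma0=2.0):
    # Train a small self-organizing map on the row vectors in `data`.
    rows, cols = grid
    w = rng.normal(size=(rows * cols, data.shape[1]))
    # Grid coordinates of every neuron, used by the neighborhood function.
    coords = np.array([(r, c) for r in range(rows) for c in range(cols)], float)
    for t in range(n_iter):
        x = data[rng.integers(len(data))]
        winner = np.argmin(np.linalg.norm(w - x, axis=1))
        lr = lr0 * np.exp(-t / n_iter)
        sigma = sigma0 * np.exp(-t / n_iter)
        # Neighborhood strength of every neuron with respect to the winner.
        d2 = np.sum((coords - coords[winner]) ** 2, axis=1)
        h = np.exp(-d2 / (2.0 * sigma**2))
        w += lr * h[:, None] * (x - w)
    return w

# Illustrative use on random "expression profiles" (placeholder data).
data = rng.normal(size=(300, 20))
weights = train_som(data)
print(weights.shape)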
Scientific Software - the role of best practices and recommendations
NASA Astrophysics Data System (ADS)
Fritzsch, Bernadette; Bernstein, Erik; Castell, Wolfgang zu; Diesmann, Markus; Haas, Holger; Hammitzsch, Martin; Konrad, Uwe; Lähnemann, David; McHardy, Alice; Pampel, Heinz; Scheliga, Kaja; Schreiber, Andreas; Steglich, Dirk
2017-04-01
In Geosciences - like in most other communities - scientific work strongly depends on software. For big data analysis, existing (closed or open source) program packages are often mixed with newly developed codes. Different versions of software components and varying configurations can influence the result of data analysis. This often makes reproducibility of results and reuse of codes very difficult. Policies for publication and documentation of used and newly developed software, along with best practices, can help tackle this problem. Within the Helmholtz Association a Task Group "Access to and Re-use of scientific software" was implemented by the Open Science Working Group in 2016. The aim of the Task Group is to foster the discussion about scientific software in the Open Science context and to formulate recommendations for the production and publication of scientific software, ensuring open access to it. As a first step, a workshop gathered interested scientists from institutions across Germany. The workshop brought together various existing initiatives from different scientific communities to analyse current problems, share established best practices and come up with possible solutions. The subjects in the working groups covered a broad range of themes, including technical infrastructures, standards and quality assurance, citation of software and reproducibility. Initial recommendations are presented and discussed in the talk. They are the foundation for further discussions in the Helmholtz Association and the Priority Initiative "Digital Information" of the Alliance of Science Organisations in Germany. The talk aims to inform about the activities and to link with other initiatives on the national or international level.
Simulations of Turbulent Momentum and Scalar Transport in Confined Swirling Coaxial Jets
NASA Technical Reports Server (NTRS)
Shih, Tsan-Hsing; Liu, Nan-Suey; Moder, Jeffrey P.
2015-01-01
This paper presents the numerical simulations of confined three-dimensional coaxial water jets. The objectives are to validate the newly proposed nonlinear turbulence models of momentum and scalar transport, and to evaluate the newly introduced scalar APDF and DWFDF equation along with its Eulerian implementation in the National Combustion Code (NCC). Simulations conducted include the steady RANS, the unsteady RANS (URANS), and the time-filtered Navier-Stokes (TFNS), both without and with invoking the APDF or DWFDF equation.
Fluid Aspects of Solar Wind Disturbances Driven by Coronal Mass Ejections. Appendix 3
NASA Technical Reports Server (NTRS)
Gosling, J. T.; Riley, Pete
2001-01-01
Transient disturbances in the solar wind initiated by coronal eruptions have been modeled for many years, beginning with the self-similar analytical models of Parker and Simon and Axford. The first numerical computer code (one-dimensional, gas dynamic) to study disturbance propagation in the solar wind was developed in the late 1960s, and a variety of other codes ranging from simple one-dimensional gas dynamic codes through three-dimensional gas dynamic and magnetohydrodynamic codes have been developed in subsequent years. For the most part, these codes have been applied to the problem of disturbances driven by fast CMEs propagating into a structureless solar wind. Pizzo provided an excellent summary of the level of understanding achieved from such simulation studies through about 1984, and other reviews have subsequently become available. More recently, some attention has been focused on disturbances generated by slow CMEs, on disturbances driven by CMEs having high internal pressures, and disturbance propagation effects associated with a structured ambient solar wind. Our purpose here is to provide a brief tutorial on fluid aspects of solar wind disturbances derived from numerical gas dynamic simulations. For the most part we illustrate disturbance evolution by propagating idealized perturbations, mimicking different types of CMEs, into a structureless solar wind using a simple one-dimensional, adiabatic (except at shocks), gas dynamic code. The simulations begin outside the critical point where the solar wind becomes supersonic and thus do not address questions of how the CMEs themselves are initiated. Limited to one dimension (the radial direction), the simulation code predicts too strong an interaction between newly ejected solar material and the ambient wind because it neglects azimuthal and meridional motions of the plasma that help relieve pressure stresses. Moreover, the code ignores magnetic forces and thus also underestimates the speed with which pressure disturbances propagate in the wind.
Vasculogenesis and Angiogenesis: Molecular and Cellular Controls
Kubis, N.; Levy, B.I.
2003-01-01
Summary Angiogenesis characterizes embryonic development, but also occurs in adulthood in physiological situations such as adaptation to muscle exercise, and in pathological conditions like cancer. Major advances have been made in understanding the molecular mechanisms responsible for vasculogenesis and angiogenesis, largely due to the use of “knock-out mice”, i.e. mice in which the gene coding for the protein under investigation has been inactivated. Interestingly, the same growth factors and their receptors are equally involved in the different aspects of vasculogenesis and angiogenesis during development and in adulthood. This review aims to describe in detail their respective roles and how interactions between them lead to a newly formed vessel. PMID:20591248
Decoding sORF translation - from small proteins to gene regulation.
Cabrera-Quio, Luis Enrique; Herberg, Sarah; Pauli, Andrea
2016-11-01
Translation is best known as the fundamental mechanism by which the ribosome converts a sequence of nucleotides into a string of amino acids. Extensive research over many years has elucidated the key principles of translation, and the majority of translated regions were thought to be known. The recent discovery of wide-spread translation outside of annotated protein-coding open reading frames (ORFs) came therefore as a surprise, raising the intriguing possibility that these newly discovered translated regions might have unrecognized protein-coding or gene-regulatory functions. Here, we highlight recent findings that provide evidence that some of these newly discovered translated short ORFs (sORFs) encode functional, previously missed small proteins, while others have regulatory roles. Based on known examples we will also speculate about putative additional roles and the potentially much wider impact that these translated regions might have on cellular homeostasis and gene regulation.
Studies of dynamic processes related to active experiments in space plasmas
NASA Technical Reports Server (NTRS)
Banks, Peter M.; Neubert, Torsten
1992-01-01
This is the final report for grant NAGw-2055, 'Studies of Dynamic Processes Related to Active Experiments in Space Plasmas', covering research performed at the University of Michigan. The grant was awarded to study: (1) theoretical and data analysis of data from the CHARGE-2 rocket experiment (1keV; 1-46 mA electron beam ejections) and the Spacelab-2 shuttle experiment (1keV; 100 mA); (2) studies of the interaction of an electron beam, emitted from an ionospheric platform, with the ambient neutral atmosphere and plasma by means of a newly developed computer simulation model, relating model predictions with CHARGE-2 observations of return currents observed during electron beam emissions; and (3) development of a self-consistent model for the charge distribution on a moving conducting tether in a magnetized plasma and for the potential structure in the plasma surrounding the tether. Our main results include: (1) the computer code developed for the interaction of electrons beams with the neutral atmosphere and plasma is able to model observed return fluxes to the CHARGE-2 sounding rocket payload; and (2) a 3-D electromagnetic and relativistic particle simulation code was developed.
Modelling of the Gadolinium Fuel Test IFA-681 using the BISON Code
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pastore, Giovanni; Hales, Jason Dean; Novascone, Stephen Rhead
2016-05-01
In this work, the application of Idaho National Laboratory's fuel performance code BISON to the modelling of fuel rods from the Halden IFA-681 gadolinium fuel test is presented. First, an overview is given of the BISON models, focusing on UO2/UO2-Gd2O3 fuel and Zircaloy cladding. Then, BISON analyses of selected fuel rods from the IFA-681 test are performed. For the first time in a BISON application to integral fuel rod simulations, the analysis is informed by detailed neutronics calculations in order to accurately capture the radial power profile throughout the fuel, which is strongly affected by the complex evolution of absorber Gd isotopes. In particular, radial power profiles calculated at the IFE-Halden Reactor Project with the HELIOS code are used. The work has been carried out in the frame of the collaboration between Idaho National Laboratory and the Halden Reactor Project. Some slides have been added as an Appendix to present the newly developed PolyPole-1 algorithm for modelling of intra-granular fission gas release.
NASA Technical Reports Server (NTRS)
Walitt, L.
1982-01-01
The VANS successive approximation numerical method was extended to the computation of three-dimensional, viscous, transonic flows in turbomachines. A cross-sectional computer code, which conserves mass flux at each point of the cross-sectional surface of computation, was developed. In the VANS numerical method, the cross-sectional computation follows a blade-to-blade calculation. Numerical calculations were made for an axial annular turbine cascade and a transonic, centrifugal impeller with splitter vanes. The subsonic turbine cascade computation was generated on a blade-to-blade surface to evaluate the accuracy of the blade-to-blade mode of marching. Calculated blade pressures at the hub, mid, and tip radii of the cascade agreed with corresponding measurements. The transonic impeller computation was conducted to test the newly developed, locally mass flux conservative cross-sectional computer code. Both blade-to-blade and cross-sectional modes of calculation were implemented for this problem. A triplet point shock structure was computed in the inducer region of the impeller. In addition, time-averaged shroud static pressures generally agreed with measured shroud pressures. It is concluded that the blade-to-blade computation produces a useful engineering flow field in regions of subsonic relative flow, and that the cross-sectional computation, with a locally mass flux conservative continuity equation, is required to compute the shock waves in regions of supersonic relative flow.
IgSimulator: a versatile immunosequencing simulator.
Safonova, Yana; Lapidus, Alla; Lill, Jennie
2015-10-01
The recent introduction of next-generation sequencing technologies to antibody studies has resulted in a growing number of immunoinformatics tools for antibody repertoire analysis. However, benchmarking these newly emerging tools remains problematic, since the gold-standard datasets needed to validate them are typically not available. Since simulating antibody repertoires is often the only feasible way to benchmark new immunoinformatics tools, we developed the IgSimulator tool, which addresses various complications in generating realistic antibody repertoires. IgSimulator's code has a modular structure and can be easily adapted to new simulation requirements. IgSimulator is open source and freely available as a C++ and Python program running on all Unix-compatible platforms. The source code is available from yana-safonova.github.io/ig_simulator. safonova.yana@gmail.com. Supplementary data are available at Bioinformatics online. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
Analysis of dose-LET distribution in the human body irradiated by high energy hadrons.
Sato, T; Tsuda, S; Sakamoto, Y; Yamaguchi, Y; Niita, K
2003-01-01
For the purposes of radiological protection, it is important to analyse the profiles of the particle field inside a human body irradiated by high energy hadrons, since such hadrons can produce a variety of secondary particles which play an important role in the energy deposition process, and to characterise their radiation qualities. Therefore, Monte Carlo calculations were performed to evaluate dose distributions in terms of the linear energy transfer of ionising particles (dose-LET distribution) using a newly developed particle transport code (Particle and Heavy Ion Transport code System, PHITS) for incident neutrons, protons and pions with energies from 100 MeV to 200 GeV. Based on these calculations, it was found that more than 80% and 90% of the total deposited energy is attributable to ionisation by particles with LET below 10 keV μm⁻¹ for the irradiations by neutrons and by the charged particles, respectively.
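In this notation the dose-LET distribution d(L) is the absorbed dose delivered per unit LET, so the quoted fractions correspond to (a standard definition, written here in LaTeX notation for clarity):

D = \int_0^{\infty} d(L)\,dL, \qquad f(L < 10\ \mathrm{keV}\,\mu\mathrm{m}^{-1}) = \frac{1}{D}\int_0^{10} d(L)\,dL,

with f exceeding 0.8 for the neutron irradiations and 0.9 for the incident charged particles.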
Dynamic event tree analysis with the SAS4A/SASSYS-1 safety analysis code
Jankovsky, Zachary K.; Denman, Matthew R.; Aldemir, Tunc
2018-02-02
The consequences of a transient in an advanced sodium-cooled fast reactor are difficult to capture with the traditional approach to probabilistic risk assessment (PRA). Numerous safety-relevant systems are passive and may have operational states that cannot be represented by binary success or failure. In addition, the specific order and timing of events may be crucial, which necessitates the use of dynamic PRA tools such as ADAPT. The modifications to the SAS4A/SASSYS-1 sodium-cooled fast reactor safety analysis code for linking it to ADAPT to perform a dynamic PRA are described. A test case is used to demonstrate the linking process and to illustrate the type of insights that may be gained with this process. Finally, newly-developed dynamic importance measures are used to assess the significance of reactor parameters/constituents on calculated consequences of initiating events.
NASA Astrophysics Data System (ADS)
Birdsell, D.; Karra, S.; Rajaram, H.
2016-12-01
The governing equations for subsurface flow codes in deformable porous media are derived from the fluid mass balance equation. One class of these codes, which we call general subsurface flow (GSF) codes, does not explicitly track the motion of the solid porous media but does accept general constitutive relations for porosity, density, and fluid flux. Examples of GSF codes include PFLOTRAN, FEHM, STOMP, and TOUGH2. Meanwhile, analytical and numerical solutions based on the groundwater flow equation have assumed forms for porosity, density, and fluid flux. We review the derivation of the groundwater flow equation, which uses the form of Darcy's equation that accounts for the velocity of fluids with respect to solids and defines the soil matrix compressibility accordingly. We then show how GSF codes have a different governing equation if they use the form of Darcy's equation that is written only in terms of fluid velocity. The difference is seen in the porosity change, which is part of the specific storage term in the groundwater flow equation. We propose an alternative definition of soil matrix compressibility to correct for the untracked solid velocity. Simulation results show significantly less error for our new compressibility definition than the traditional compressibility when compared to analytical solutions from the groundwater literature. For example, the error in one calculation for a pumped sandstone aquifer goes from 940 to <70 Pa when the new compressibility is used. Code users and developers need to be aware of assumptions in the governing equations and constitutive relations in subsurface flow codes, and our newly-proposed compressibility function should be incorporated into GSF codes.
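For readers less familiar with the terminology, the distinction drawn above can be anchored to the standard groundwater flow equation and its specific storage term; the sketch below reproduces only the textbook forms and indicates the paper's corrected matrix compressibility schematically rather than reproducing its exact expression.

```latex
% Standard groundwater flow equation for a confined aquifer
\begin{align}
  S_s \,\frac{\partial h}{\partial t} &= \nabla \cdot \left( K \nabla h \right),\\
  S_s &= \rho g \left( \alpha + n \beta \right),
\end{align}
% h: hydraulic head, K: hydraulic conductivity, rho: fluid density, g: gravity,
% n: porosity, beta: fluid compressibility, alpha: soil matrix compressibility.
% GSF codes that write Darcy's law in terms of the fluid velocity alone change
% the porosity-change contribution hidden inside S_s; the correction proposed
% above amounts to redefining alpha to compensate for the untracked solid
% velocity (the exact corrected form is given in the paper, not here).
```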
Democratisation of AAC Symbol Choices Using Technology.
Draffan, E A; Wald, Mike; Zeinoun, Nadine; Banes, David
2017-01-01
An online voting system has been developed to enable democratic choices of newly designed symbols to support speech, language and literacy skills in a localisation setting. The system serves those using and supporting Augmentative and Alternative Communication (AAC) symbols on electronic systems through simplified acceptance scales and adapted grids. The methodology and results highlighted the importance of user participation from the outset and provided concrete examples of symbol adaptations that were found necessary to ensure higher levels of user satisfaction. Design changes included appropriate local dress codes, linguistic nuances, social settings, the built environment and religious sensitivities.
Resonant scattering experiments with radioactive nuclear beams - Recent results and future plans
DOE Office of Scientific and Technical Information (OSTI.GOV)
Teranishi, T.; Sakaguchi, S.; Uesaka, T.
2013-04-19
Resonant scattering with low-energy radioactive nuclear beams of E < 5 MeV/u has been studied at CRIB of CNS and at RIPS of RIKEN. As an extension of the present experimental technique, we will install an advanced polarized proton target for resonant scattering experiments. A Monte Carlo simulation was performed to study the feasibility of future experiments with the polarized target. In the simulation, excitation functions and analyzing powers were calculated using a newly developed R-matrix calculation code. A project for a small-scale radioactive beam facility at Kyushu University is also briefly described.
RELAP5-3D Results for Phase I (Exercise 2) of the OECD/NEA MHTGR-350 MW Benchmark
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gerhard Strydom
2012-06-01
The coupling of the PHISICS code suite to the thermal hydraulics system code RELAP5-3D has recently been initiated at the Idaho National Laboratory (INL) to provide a fully coupled prismatic Very High Temperature Reactor (VHTR) system modeling capability as part of the NGNP methods development program. The PHISICS code consists of three modules: INSTANT (performing 3D nodal transport core calculations), MRTAU (depletion and decay heat generation) and a perturbation/mixer module. As part of the verification and validation activities, steady state results have been obtained for Exercise 2 of Phase I of the newly-defined OECD/NEA MHTGR-350 MW Benchmark. This exercise requires participants to calculate a steady-state solution for an End of Equilibrium Cycle 350 MW Modular High Temperature Reactor (MHTGR), using the provided geometry, material, and coolant bypass flow description. The paper provides an overview of the MHTGR Benchmark and presents typical steady state results (e.g. solid and gas temperatures, thermal conductivities) for Phase I Exercise 2. Preliminary results are also provided for the early test phase of Exercise 3 using a two-group cross-section library and the RELAP5-3D model developed for Exercise 2.
RELAP5-3D results for phase I (Exercise 2) of the OECD/NEA MHTGR-350 MW benchmark
DOE Office of Scientific and Technical Information (OSTI.GOV)
Strydom, G.; Epiney, A. S.
2012-07-01
The coupling of the PHISICS code suite to the thermal hydraulics system code RELAP5-3D has recently been initiated at the Idaho National Laboratory (INL) to provide a fully coupled prismatic Very High Temperature Reactor (VHTR) system modeling capability as part of the NGNP methods development program. The PHISICS code consists of three modules: INSTANT (performing 3D nodal transport core calculations), MRTAU (depletion and decay heat generation) and a perturbation/mixer module. As part of the verification and validation activities, steady state results have been obtained for Exercise 2 of Phase I of the newly-defined OECD/NEA MHTGR-350 MW Benchmark. This exercise requires participants to calculate a steady-state solution for an End of Equilibrium Cycle 350 MW Modular High Temperature Reactor (MHTGR), using the provided geometry, material, and coolant bypass flow description. The paper provides an overview of the MHTGR Benchmark and presents typical steady state results (e.g. solid and gas temperatures, thermal conductivities) for Phase I Exercise 2. Preliminary results are also provided for the early test phase of Exercise 3 using a two-group cross-section library and the RELAP5-3D model developed for Exercise 2.
Comparative genomics of biotechnologically important yeasts
USDA-ARS?s Scientific Manuscript database
Ascomycete yeasts are metabolically diverse, with great potential for biotechnology. Here, we report the comparative genome analysis of 29 taxonomically and biotechnologically important yeasts, including 16 newly sequenced. We identify a genetic code change, CUG-Ala, in Pachysolen tannophilus in the...
FY06 L2C2 HE program report Zaug et al.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zaug, J M; Crowhurst, J C; Howard, W M
2008-08-01
The purpose of this project is to advance the improvement of LLNL thermochemical computational models that form the underlying basis or input for laboratory hydrodynamic simulations. Our general work approach utilizes, by design, tight experimental-theoretical research interactions that allow us to improve LLNL computational results scientifically rather than empirically. The ultimate goal is to confidently predict, through computer models, the performance and safety parameters of currently maintained, modified, and newly designed stockpile systems. To attain this goal we make relevant experimental measurements on candidate detonation products constrained under static high-pressure and high-temperature conditions. The reduced information from these measurements is then used to construct analytical forms that describe the potential surface (repulsive energy as a function of interatomic separation distance) of single and mixed fluid or detonation product species. These potential surface shapes are also constructed using input from well-trusted shock wave physics and assorted thermodynamic data available in the open literature. Our potential surfaces permit one to determine the equations of state (P,V,T), the equilibrium chemistry, phase, and chemical interactions of detonation products under a very wide range of extreme pressure-temperature conditions. Using our foundation of experimentally refined potential surfaces, we are in a position to calculate, with confidence, the energetic output and chemical speciation occurring in a specific combustion and/or detonation reaction. The thermochemical model we developed and use for calculating the equilibrium chemistry, kinetics, and energy from ultrafast processes is named 'Cheetah'. Computational results from our Cheetah code are coupled to laboratory ALE3D hydrodynamic simulation codes, where the complete response behavior of an existing or proposed system is ultimately predicted. The Cheetah thermochemical code is also used by well over 500 U.S. government DoD and DOE community users who calculate the chemical properties of detonated high explosives, propellants, and pyrotechnics. To satisfy the growing needs of LLNL and the general user community we continue to improve the robustness of our Cheetah code. The P-T range of current speed-of-sound experiments will soon be extended by a factor of four, and our recently developed technological advancements permit us, for the first time, to study any chemical species or fluid mixture. New experiments will focus on determining the miscibility or coexistence curves of detonation product mixtures. Our newly constructed ultrafast laser diagnostics will permit us to determine what chemical species exist under conditions approaching Chapman-Jouguet (CJ) detonation states. Furthermore, we will measure the time evolution of candidate species and use our chemical kinetics data to develop new rate laws and validate existing ones for use in future versions of our Cheetah thermochemical code.
Feasibility of a computer-assisted feedback system between dispatch centre and ambulances.
Lindström, Veronica; Karlsten, Rolf; Falk, Ann-Charlotte; Castrèn, Maaret
2011-06-01
The aim of the study was to evaluate the feasibility of a newly developed computer-assisted feedback system between the dispatch centre and ambulances in Stockholm, Sweden. A computer-assisted feedback system based on a Finnish model was designed to fit the Swedish emergency medical system. Feedback codes were identified and divided into three categories: assessment of the patient's primary condition when the ambulance arrives at the scene, no transport by the ambulance, and level of priority. Two ambulances and one emergency medical communication centre (EMCC) in Stockholm participated in the study. A sample of 530 feedback codes sent through the computer-assisted feedback system was reviewed. The information in the ambulance medical records was compared with the feedback codes used, and 240 assignments were further analyzed. The feedback codes sent from ambulance to EMCC were correct in 92% of the assignments. The most commonly used feedback code sent to the emergency medical dispatchers was 'agree with the dispatchers' assessment'. In addition, in 160 assignments there was a mismatch between the emergency medical dispatchers' and ambulance nurses' assessments. Our results have shown a high agreement between medical dispatcher and ambulance nurse assessments. The feasibility of the feedback codes seems to be acceptable based on the small margin of error. The computer-assisted feedback system may, when used on a daily basis, make it possible for the medical dispatchers to receive feedback in a structured way. The EMCC organization can directly evaluate any changes in the assessment protocol through structured feedback sent from the ambulance.
Potential Energy Cost Savings from Increased Commercial Energy Code Compliance
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rosenberg, Michael I.; Hart, Philip R.; Athalye, Rahul A.
2016-08-22
An important question for commercial energy code compliance is: “How much energy cost savings can better compliance achieve?” This question is in sharp contrast to prior efforts that used a checklist of code requirements, each of which was graded pass or fail. Percent compliance for any given building was simply the percent of individual requirements that passed. A field investigation method is being developed that goes beyond the binary approach to determine how much energy cost savings is not realized. Prototype building simulations were used to estimate the energy cost impact of varying levels of non-compliance for newly constructed office buildings in climate zone 4C. Field data collected from actual buildings on specific conditions relative to code requirements was then applied to the simulation results to find the potential lost energy savings for a single building or for a sample of buildings. This new methodology was tested on nine office buildings in climate zone 4C. The amount of additional energy cost savings they could have achieved had they complied fully with the 2012 International Energy Conservation Code is determined. This paper will present the results of the test, lessons learned, describe follow-on research that is needed to verify that the methodology is both accurate and practical, and discuss the benefits that might accrue if the method were widely adopted.
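A minimal sketch of the estimation idea described above, with hypothetical requirement names and numbers: per-requirement cost impacts from the prototype simulations are weighted by the observed degree of field compliance to estimate the savings a building failed to realize. The linear weighting is an assumption made for illustration.

```python
# Hypothetical per-requirement annual energy-cost impacts ($/yr) of full
# non-compliance, taken from prototype-building simulations, and observed
# field compliance fractions (1.0 = fully compliant). Names and numbers
# are illustrative only, not values from the study.
simulated_impact = {"lighting_power": 1200.0, "envelope_insulation": 800.0,
                    "hvac_controls": 1500.0}
observed_compliance = {"lighting_power": 0.9, "envelope_insulation": 0.6,
                       "hvac_controls": 1.0}

def lost_savings(simulated_impact, observed_compliance):
    """Estimate annual energy-cost savings not realized due to partial
    compliance, assuming impacts scale linearly with non-compliance."""
    return sum(impact * (1.0 - observed_compliance[req])
               for req, impact in simulated_impact.items())

total = lost_savings(simulated_impact, observed_compliance)
print(f"estimated unrealized savings: ${total:.0f}/yr")
```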
Taylor, Barry J; Garstang, Joanna; Engelberts, Adele; Obonai, Toshimasa; Cote, Aurore; Freemantle, Jane; Vennemann, Mechtild; Healey, Matt; Sidebotham, Peter; Mitchell, Edwin A; Moon, Rachel Y
2015-11-01
Comparing rates of sudden unexpected death in infancy (SUDI) in different countries and over time is difficult, as these deaths are certified differently in different countries and, even within the same jurisdiction, the death certification process has changed over time. The aims were to identify whether International Classification of Diseases-10 (ICD-10) codes are being applied differently in different countries, and to develop a more robust tool for international comparison of these types of deaths. Usage of six ICD-10 codes, which cover the majority of SUDI, was compared for the years 2002-2010 in eight high-income countries. There was great variability in how each country codes SUDI. For example, the proportion of SUDI coded as sudden infant death syndrome (R95) ranged from 32.6% in Japan to 72.5% in Germany. The proportion of deaths coded as accidental suffocation and strangulation in bed (W75) ranged from 1.1% in Germany to 31.7% in New Zealand. Japan was the only country to consistently use the R96 code, with 44.8% of SUDI attributed to that code. The lowest overall SUDI rate was seen in the Netherlands (0.19/1000 live births (LB)), and the highest in New Zealand (1.00/1000 LB). SUDI accounted for one-third to half of postneonatal mortality in 2002-2010 for all of the countries except the Netherlands. The proposed set of ICD-10 codes encompasses the codes used in different countries for most SUDI cases. Use of these codes will allow for better international comparisons and tracking of trends over time.
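The country comparison described above reduces to counting deaths assigned to the proposed ICD-10 code set and normalizing by live births. The sketch below shows that bookkeeping with placeholder counts; R95, R96 and W75 are named in the abstract, while the remaining codes and all numbers are illustrative assumptions.

```python
# Proposed SUDI code set: R95, R96 and W75 are named in the abstract;
# the remaining entries and all counts below are placeholders.
sudi_codes = ["R95", "R96", "W75", "R98", "R99", "W84"]
deaths_by_code = {"R95": 180, "R96": 20, "W75": 40, "R98": 5, "R99": 30, "W84": 10}
live_births = 600_000

sudi_total = sum(deaths_by_code.get(code, 0) for code in sudi_codes)
rate_per_1000_lb = 1000.0 * sudi_total / live_births
share_r95 = 100.0 * deaths_by_code["R95"] / sudi_total

print(f"SUDI rate: {rate_per_1000_lb:.2f} per 1000 live births")
print(f"share coded as SIDS (R95): {share_r95:.1f}%")
```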
Thickness Map of Buried Carbon-Dioxide Deposit
2011-04-21
This NASA Mars Reconnaissance Orbiter map color-codes thickness estimates for a newly found, buried deposit of frozen carbon dioxide (dry ice) near the south pole of Mars. The deposit contains ~30 times more carbon dioxide than previously estimated to be frozen near the pole.
Functional annotation of the vlinc class of non-coding RNAs using systems biology approach
Laurent, Georges St.; Vyatkin, Yuri; Antonets, Denis; Ri, Maxim; Qi, Yao; Saik, Olga; Shtokalo, Dmitry; de Hoon, Michiel J.L.; Kawaji, Hideya; Itoh, Masayoshi; Lassmann, Timo; Arner, Erik; Forrest, Alistair R.R.; Nicolas, Estelle; McCaffrey, Timothy A.; Carninci, Piero; Hayashizaki, Yoshihide; Wahlestedt, Claes; Kapranov, Philipp
2016-01-01
Functionality of the non-coding transcripts encoded by the human genome is the coveted goal of modern genomics research. While the field has commonly relied on the classical methods of forward genetics, integration of different genomics datasets in a global Systems Biology fashion presents a more productive avenue for achieving this very complex aim. Here we report the application of a Systems Biology-based approach to dissect the functionality of a newly identified vast class of very long intergenic non-coding (vlinc) RNAs. Using the highly quantitative FANTOM5 CAGE dataset, we show that these RNAs can be grouped into 1542 novel human genes based on analysis of insulators, which we show here indeed function as genomic barrier elements. We show that vlincRNA genes likely function in cis to activate nearby genes. This effect, while most pronounced in closely spaced vlincRNA–gene pairs, can be detected over relatively large genomic distances. Furthermore, we identified 101 vlincRNA genes likely involved in early embryogenesis based on patterns of their expression and regulation. We also found another 109 such genes potentially involved in cellular functions that also occur at early stages of development, such as proliferation, migration and apoptosis. Overall, we show that Systems Biology-based methods have great promise for functional annotation of non-coding RNAs. PMID:27001520
DOE Office of Scientific and Technical Information (OSTI.GOV)
David Andrs; Ray Berry; Derek Gaston
The document contains the simulation results of a steady-state model PWR problem with the RELAP-7 code. The RELAP-7 code is the next-generation nuclear reactor system safety analysis code being developed at Idaho National Laboratory (INL). The code is based on INL's modern scientific software development framework, MOOSE (Multiphysics Object-Oriented Simulation Environment). This report summarizes the initial results of simulating a model steady-state single-phase PWR problem using the current version of the RELAP-7 code. The major purpose of this demonstration simulation is to show that the RELAP-7 code can be rapidly developed to simulate single-phase reactor problems. RELAP-7 is a new project started on October 1st, 2011. It will become the main reactor systems simulation toolkit for RISMC (Risk Informed Safety Margin Characterization) and the next-generation tool in the RELAP reactor safety/systems analysis application series (the replacement for RELAP5). The key to the success of RELAP-7 is the simultaneous advancement of physical models, numerical methods, and software design while maintaining a solid user perspective. Physical models include both PDEs (Partial Differential Equations) and ODEs (Ordinary Differential Equations) and experimentally based closure models. RELAP-7 will eventually utilize well-posed governing equations for multiphase flow, which can be strictly verified. Closure models used in RELAP5 and newly developed models will be reviewed and selected to reflect the progress made during the past three decades. RELAP-7 uses modern numerical methods, which allow implicit time integration, higher-order schemes in both time and space, and strongly coupled multi-physics simulations. RELAP-7 is written in the object-oriented programming language C++, and its development follows modern software design paradigms. The code is easy to read, develop, maintain, and couple with other codes. Most importantly, the modern software design allows the RELAP-7 code to evolve with time. RELAP-7 is a MOOSE-based application. MOOSE is a framework for solving computational engineering problems in a well-planned, managed, and coordinated way. By leveraging millions of lines of open-source software packages, such as PETSc (a nonlinear solver developed at Argonne National Laboratory) and libMesh (a finite element analysis package developed at the University of Texas), MOOSE significantly reduces the expense and time required to develop new applications. Numerical integration methods and mesh management for parallel computation are provided by MOOSE; therefore, RELAP-7 code developers only need to focus on physics and user experience. By using the MOOSE development environment, the RELAP-7 code is developed by following the same modern software design paradigms used for other MOOSE development efforts. There are currently over 20 different MOOSE-based applications, ranging from 3-D transient neutron transport and detailed 3-D transient fuel performance analysis to long-term material aging. Multi-physics and multi-dimensional analysis capabilities can be obtained by coupling RELAP-7 with other MOOSE-based applications and by leveraging capabilities developed by other DOE programs. This allows the focus of RELAP-7 to be restricted to systems analysis-type simulations and gives priority to retaining and significantly extending RELAP5's capabilities.
Mocking the weak lensing universe: The LensTools Python computing package
NASA Astrophysics Data System (ADS)
Petri, A.
2016-10-01
We present a newly developed software package which implements a wide range of routines frequently used in Weak Gravitational Lensing (WL). With the continuously increasing size of the WL scientific community, we feel that easy-to-use Application Program Interfaces (APIs) for common calculations are a necessity to ensure efficiency and coordination across different working groups. Coupled with existing open-source codes, such as CAMB (Lewis et al., 2000) and Gadget2 (Springel, 2005), LensTools brings together a cosmic shear simulation pipeline which, complemented with a variety of WL feature measurement tools and parameter sampling routines, provides easy access to the numerics for theoretical studies of WL as well as for experiment forecasts. Being implemented in Python (Rossum, 1995), LensTools takes full advantage of a range of state-of-the-art techniques developed by the large and growing open-source software community (Jones et al., 2001; McKinney, 2010; Astropy Collaboration, 2013; Pedregosa et al., 2011; Foreman-Mackey et al., 2013). We have made the LensTools code available on the Python Package Index and published its documentation at http://lenstools.readthedocs.io.
MLP: A Parallel Programming Alternative to MPI for New Shared Memory Parallel Systems
NASA Technical Reports Server (NTRS)
Taft, James R.
1999-01-01
Recent developments at the NASA Ames Research Center's NAS Division have demonstrated that the new generation of NUMA-based Symmetric Multi-Processing systems (SMPs), such as the Silicon Graphics Origin 2000, can successfully execute legacy vector-oriented CFD production codes at sustained rates far exceeding the processing rates possible on dedicated 16-CPU Cray C90 systems. This high level of performance is achieved via shared-memory-based Multi-Level Parallelism (MLP). This programming approach, developed at NAS and outlined below, is distinct from the message passing paradigm of MPI. It offers parallelism at both the fine- and coarse-grained levels, with communication latencies that are approximately 50-100 times lower than typical MPI implementations on the same platform. Such latency reductions offer the promise of performance scaling to very large CPU counts. The method draws on, but is also distinct from, the newly defined OpenMP specification, which uses compiler directives to support a limited subset of multi-level parallel operations. The NAS MLP method is general, and applicable to a large class of NASA CFD codes.
Accuracy comparison among different machine learning techniques for detecting malicious codes
NASA Astrophysics Data System (ADS)
Narang, Komal
2016-03-01
In this paper, a machine learning based model for malware detection is proposed. It can detect newly released malware, i.e. zero-day attacks, by analyzing operation codes (opcodes) on the Android operating system. The accuracy of Naïve Bayes, Support Vector Machine (SVM) and Neural Network classifiers for detecting malicious code has been compared for the proposed model. In the experiment, 400 benign files, 100 system files and 500 malicious files were used to construct the model. The model yields its best accuracy, 88.9%, when a neural network is used as the classifier, and achieved 95% sensitivity and 82.8% specificity.
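A hedged sketch of the comparison described above, using scikit-learn: opcode sequences are turned into n-gram count features and three classifiers are scored on a held-out split. Real input would be opcodes extracted from Android binaries; random token strings stand in here so the example runs end to end, so the printed accuracies are meaningless.

```python
import random
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import MultinomialNB
from sklearn.svm import SVC
from sklearn.neural_network import MLPClassifier
from sklearn.metrics import accuracy_score

random.seed(0)
opcodes = ["move", "invoke", "const", "goto", "if-eq", "return"]

def fake_sample(malicious):
    # Synthetic stand-in for an opcode sequence extracted from a binary.
    weights = [1, 4, 1, 1, 2, 1] if malicious else [3, 1, 2, 1, 1, 2]
    return " ".join(random.choices(opcodes, weights=weights, k=200))

docs = [fake_sample(False) for _ in range(500)] + [fake_sample(True) for _ in range(500)]
labels = [0] * 500 + [1] * 500

X = CountVectorizer(ngram_range=(1, 2)).fit_transform(docs)   # opcode n-gram counts
X_tr, X_te, y_tr, y_te = train_test_split(X, labels, test_size=0.3, random_state=0)

for name, clf in [("Naive Bayes", MultinomialNB()),
                  ("SVM", SVC(kernel="linear")),
                  ("Neural network", MLPClassifier(hidden_layer_sizes=(32,), max_iter=500))]:
    clf.fit(X_tr, y_tr)
    print(name, accuracy_score(y_te, clf.predict(X_te)))
```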
2011-01-01
Introduction: Many trauma registries have used the Abbreviated Injury Scale 1990 Revision Update 98 (AIS98) to classify injuries. In the current AIS version (Abbreviated Injury Scale 2005 Update 2008 - AIS08), injury classification and specificity differ substantially from AIS98, and the mapping tools provided in the AIS08 dictionary are incomplete. As a result, data from different AIS versions cannot currently be compared. The aim of this study was to develop an additional AIS98 to AIS08 mapping tool to complement the current AIS dictionary map, and then to evaluate the completed map (produced by combining these two maps) using double-coded data. The value of additional information provided by free text descriptions accompanying assigned codes was also assessed. Methods: Using a modified Delphi process, a panel of expert AIS coders established plausible AIS08 equivalents for the 153 AIS98 codes which currently have no AIS08 map. A series of major trauma patients whose injuries had been double-coded in AIS98 and AIS08 was used to assess the maps; both of the AIS datasets had already been mapped to another AIS version using the AIS dictionary maps. Following application of the completed (enhanced) map with or without free text evaluation, up to six AIS codes were available for each injury. Datasets were assessed for agreement in injury severity measures, and the relative performances of the maps in accurately describing the trauma population were evaluated. Results: The double-coded injuries sustained by 109 patients were used to assess the maps. For data conversion from AIS98, both the enhanced map and the enhanced map with free text description resulted in higher levels of accuracy and agreement with directly coded AIS08 data than the currently available dictionary map. Paired comparisons demonstrated significant differences between direct coding and the dictionary maps, but not with either of the enhanced maps. Conclusions: The newly-developed AIS98 to AIS08 complementary map enabled transformation of the trauma population description given by AIS98 into an AIS08 estimate which was statistically indistinguishable from directly coded AIS08 data. It is recommended that the enhanced map should be adopted for dataset conversion, using free text descriptions if available. PMID:21548991
Palmer, Cameron S; Franklyn, Melanie; Read-Allsopp, Christine; McLellan, Susan; Niggemeyer, Louise E
2011-05-08
Many trauma registries have used the Abbreviated Injury Scale 1990 Revision Update 98 (AIS98) to classify injuries. In the current AIS version (Abbreviated Injury Scale 2005 Update 2008 - AIS08), injury classification and specificity differ substantially from AIS98, and the mapping tools provided in the AIS08 dictionary are incomplete. As a result, data from different AIS versions cannot currently be compared. The aim of this study was to develop an additional AIS98 to AIS08 mapping tool to complement the current AIS dictionary map, and then to evaluate the completed map (produced by combining these two maps) using double-coded data. The value of additional information provided by free text descriptions accompanying assigned codes was also assessed. Using a modified Delphi process, a panel of expert AIS coders established plausible AIS08 equivalents for the 153 AIS98 codes which currently have no AIS08 map. A series of major trauma patients whose injuries had been double-coded in AIS98 and AIS08 was used to assess the maps; both of the AIS datasets had already been mapped to another AIS version using the AIS dictionary maps. Following application of the completed (enhanced) map with or without free text evaluation, up to six AIS codes were available for each injury. Datasets were assessed for agreement in injury severity measures, and the relative performances of the maps in accurately describing the trauma population were evaluated. The double-coded injuries sustained by 109 patients were used to assess the maps. For data conversion from AIS98, both the enhanced map and the enhanced map with free text description resulted in higher levels of accuracy and agreement with directly coded AIS08 data than the currently available dictionary map. Paired comparisons demonstrated significant differences between direct coding and the dictionary maps, but not with either of the enhanced maps. The newly-developed AIS98 to AIS08 complementary map enabled transformation of the trauma population description given by AIS98 into an AIS08 estimate which was statistically indistinguishable from directly coded AIS08 data. It is recommended that the enhanced map should be adopted for dataset conversion, using free text descriptions if available.
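A minimal sketch of how such a combined map might be applied in practice, assuming the official dictionary map is preferred and the complementary (enhanced) map is used as a fall-back for the codes the dictionary leaves unmapped. All code values are hypothetical placeholders, not real AIS entries.

```python
# Sketch of applying the AIS98 -> AIS08 dictionary map with fall-back to the
# complementary "enhanced" map for the codes the dictionary leaves unmapped.
# All code values here are hypothetical placeholders, not real AIS entries.
dictionary_map = {"450210.3": "450203.3", "854451.2": "854441.2"}
enhanced_map = {"854471.3": "854461.3"}   # expert-panel equivalents

def map_ais98_to_ais08(ais98_code):
    """Return the AIS08 equivalent, preferring the official dictionary map
    and falling back to the enhanced map; None if no mapping exists."""
    return dictionary_map.get(ais98_code) or enhanced_map.get(ais98_code)

injuries = ["450210.3", "854471.3", "999999.1"]
for code in injuries:
    print(code, "->", map_ais98_to_ais08(code))
```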
AACSD: An atomistic analyzer for crystal structure and defects
NASA Astrophysics Data System (ADS)
Liu, Z. R.; Zhang, R. F.
2018-01-01
We have developed an efficient command-line program named AACSD (Atomistic Analyzer for Crystal Structure and Defects) for the post-analysis of atomic configurations generated by various atomistic simulation codes. The program implements not only the traditional filter methods such as the excess potential energy (EPE), the centrosymmetry parameter (CSP), the common neighbor analysis (CNA), the common neighborhood parameter (CNP), the bond angle analysis (BAA), and the neighbor distance analysis (NDA), but also newly developed ones, including the modified centrosymmetry parameter (m-CSP), the orientation imaging map (OIM) and the local crystallographic orientation (LCO). The newly proposed OIM and LCO methods have been extended to all three crystal structures: face centered cubic, body centered cubic and hexagonal close packed. More specifically, AACSD can be easily used for the atomistic analysis of metallic nanocomposites, with each phase analyzed independently, which provides a unique pathway to capture the dynamic evolution of various defects on the fly. In this paper, we provide not only a thorough overview of the various theoretical methods and their implementation in the AACSD program, but also some critical evaluations, specific tests and applications demonstrating the capability of the program for each functionality.
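As an illustration of one of the filters listed above, the sketch below computes the classical centrosymmetry parameter for an atom from its nearest-neighbor bond vectors (the sum of the N/2 smallest |r_i + r_j|^2 pair sums). This is a generic reimplementation of the standard CSP filter, not AACSD code.

```python
import numpy as np
from itertools import combinations

def centrosymmetry(neighbor_vectors, n_pairs=6):
    """Centrosymmetry parameter: sum of the n_pairs smallest |r_i + r_j|^2
    over all neighbor-vector pairs (n_pairs = N/2, i.e. 6 for FCC with 12
    nearest neighbors). Near zero in a perfect centrosymmetric lattice,
    large near defects, surfaces and stacking faults."""
    sums = [np.dot(ri + rj, ri + rj)
            for ri, rj in combinations(neighbor_vectors, 2)]
    return float(np.sum(np.sort(sums)[:n_pairs]))

# 12 nearest-neighbor vectors of a perfect FCC lattice (lattice constant a)
a = 4.05
fcc = 0.5 * a * np.array([[1, 1, 0], [-1, -1, 0], [1, -1, 0], [-1, 1, 0],
                          [1, 0, 1], [-1, 0, -1], [1, 0, -1], [-1, 0, 1],
                          [0, 1, 1], [0, -1, -1], [0, 1, -1], [0, -1, 1]], float)

print(centrosymmetry(fcc))   # ~0 for the perfect lattice
noisy = fcc + 0.1 * np.random.default_rng(0).normal(size=fcc.shape)
print(centrosymmetry(noisy)) # grows once the local environment is distorted
```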
Validation of CFD/Heat Transfer Software for Turbine Blade Analysis
NASA Technical Reports Server (NTRS)
Kiefer, Walter D.
2004-01-01
I am an intern in the Turbine Branch of the Turbomachinery and Propulsion Systems Division. The division is primarily concerned with experimental and computational methods of calculating heat transfer effects on turbine blades during operation in jet engines and land-based power systems. These include modeling flow in internal cooling passages and film cooling, as well as calculating heat flux and peak temperatures to ensure safe and efficient operation. The branch is research-oriented, emphasizing the development of tools that may be used by gas turbine designers in industry. The branch has been developing a computational fluid dynamics (CFD) and heat transfer code called GlennHT to achieve the computational end of this analysis. The code was originally written in FORTRAN 77 and run on Silicon Graphics machines. However, the code has been rewritten and compiled in FORTRAN 90 to take advantage of more modern computer memory systems. In addition, the branch has made a switch in system architectures from SGIs to Linux PCs. The newly modified code therefore needs to be tested and validated. This is the primary goal of my internship. To validate the GlennHT code, it must be run using benchmark fluid mechanics and heat transfer test cases, for which there are either analytical solutions or widely accepted experimental data. From the solutions generated by the code, comparisons can be made with the correct solutions to establish the accuracy of the code. Designing and creating these test cases involves many steps and programs. Before a test case can be run, pre-processing steps must be accomplished. These include generating a grid to describe the geometry, using a software package called GridPro. Various files required by the GlennHT code must also be created, including a boundary condition file, a file for multi-processor computing, and a file describing problem and algorithm parameters. A good deal of this internship will be spent becoming familiar with these programs and the structure of the GlennHT code. Additional information is included in the original extended abstract.
Miadlikowska, Jolanta; Kauff, Frank; Högnabba, Filip; Oliver, Jeffrey C.; Molnár, Katalin; Fraker, Emily; Gaya, Ester; Hafellner, Josef; Hofstetter, Valérie; Gueidan, Cécile; Otálora, Mónica A.G.; Hodkinson, Brendan; Kukwa, Martin; Lücking, Robert; Björk, Curtis; Sipman, Harrie J.M.; Burgaz, Ana Rosa; Thell, Arne; Passo, Alfredo; Myllys, Leena; Goward, Trevor; Fernández-Brime, Samantha; Hestmark, Geir; Lendemer, James; Lumbsch, H. Thorsten; Schmull, Michaela; Schoch, Conrad; Sérusiaux, Emmanuël; Maddison, David R.; Arnold, A. Elizabeth; Lutzoni, François; Stenroos, Soili
2014-01-01
The Lecanoromycetes is the largest class of lichenized Fungi, and one of the most species-rich classes in the kingdom. Here we provide a multigene phylogenetic synthesis (using three ribosomal RNA-coding and two protein-coding genes) of the Lecanoromycetes based on 642 newly generated and 3329 publicly available sequences representing 1139 taxa, 317 genera, 66 families, 17 orders and five subclasses (four currently recognized: Acarosporomycetidae, Lecanoromycetidae, Ostropomycetidae, Umbilicariomycetidae; and one provisionally recognized, ‘Candelariomycetidae’). Maximum likelihood phylogenetic analyses on four multigene datasets assembled using a cumulative supermatrix approach with a progressively higher number of species and missing data (5-gene, 5+4-gene, 5+4+3-gene and 5+4+3+2-gene datasets) show that the current classification includes non-monophyletic taxa at various ranks, which need to be recircumscribed and require revisionary treatments based on denser taxon sampling and more loci. Two newly circumscribed orders (Arctomiales and Hymeneliales in the Ostropomycetidae) and three families (Ramboldiaceae and Psilolechiaceae in the Lecanorales, and Strangosporaceae in the Lecanoromycetes inc. sed.) are introduced. The potential resurrection of the families Eigleraceae and Lopadiaceae is considered here to alleviate phylogenetic and classification disparities. An overview of the photobionts associated with the main fungal lineages in the Lecanoromycetes based on available published records is provided. A revised schematic classification at the family level in the phylogenetic context of widely accepted and newly revealed relationships across Lecanoromycetes is included. The cumulative addition of taxa with an increasing amount of missing data (i.e., a cumulative supermatrix approach, starting with taxa for which sequences were available for all five targeted genes and ending with the addition of taxa for which only two genes have been sequenced) revealed relatively stable relationships for many families and orders. However, the increasing number of taxa without the addition of more loci also resulted in an expected substantial loss of phylogenetic resolving power and support (especially for deep phylogenetic relationships), potentially including the misplacements of several taxa. Future phylogenetic analyses should include additional single copy protein-coding markers in order to improve the tree of the Lecanoromycetes. As part of this study, a new module (“Hypha”) of the freely available Mesquite software was developed to compare and display the internodal support values derived from this cumulative supermatrix approach. PMID:24747130
Cloudy - simulating the non-equilibrium microphysics of gas and dust, and its observed spectrum
NASA Astrophysics Data System (ADS)
Ferland, Gary J.
2014-01-01
Cloudy is an open-source plasma/spectral simulation code, last described in the open-access journal Revista Mexicana (Ferland et al. 2013, 2013RMxAA..49..137F). The project goal is a complete simulation of the microphysics of gas and dust over the full range of density, temperature, and ionization that we encounter in astrophysics, together with a prediction of the observed spectrum. Cloudy is one of the more widely used theory codes in astrophysics with roughly 200 papers citing its documentation each year. It is developed by graduate students, postdocs, and an international network of collaborators. Cloudy is freely available on the web at trac.nublado.org, the user community can post questions at http://groups.yahoo.com/neo/groups/cloudy_simulations/info, and summer schools are organized to learn more about Cloudy and its use (http://cloud9.pa.uky.edu/~gary/cloudy/CloudySummerSchool/). The code’s widespread use is possible because of extensive automatic testing. It is exercised over its full range of applicability whenever the source is changed. Changes in predicted quantities are automatically detected along with any newly introduced problems. The code is designed to be autonomous and self-aware. It generates a report at the end of a calculation that summarizes any problems encountered along with suggestions of potentially incorrect boundary conditions. This self-monitoring is a core feature since the code is now often used to generate large MPI grids of simulations, making it impossible for a user to verify each calculation by hand. I will describe some challenges in developing a large physics code, with its many interconnected physical processes, many at the frontier of research in atomic or molecular physics, all in an open environment.
RETRACTED — PMD mitigation through interleaving LDPC codes with polarization scramblers
NASA Astrophysics Data System (ADS)
Han, Dahai; Chen, Haoran; Xi, Lixia
2012-11-01
The combination of forward error correction (FEC) and distributed fast polarization scramblers (D-FPSs) is approved as an effective method to mitigate polarization mode dispersion (PMD) in high-speed optical fiber communication systems. The low-density parity-check (LDPC) codes are newly introduced into the PMD mitigation scheme with D-FPSs in this paper as one of the promising FEC codes to achieve better performance. The scrambling speed of the FPS for an LDPC (2040, 1903) coded system is discussed, and a reasonable speed of 10 MHz is obtained from the simulation results. For easy application in a practical large-scale integrated (LSI) circuit, the number of iterations in decoding the LDPC codes is also investigated. The PMD tolerance and cut-off optical signal-to-noise ratio (OSNR) of LDPC codes are compared with Reed-Solomon (RS) codes under different conditions. In the simulation, the interleaving LDPC codes bring an incremental improvement in error correction, and the PMD tolerance is 10 ps at OSNR = 11.4 dB. The results show that LDPC codes are a substitute for traditional RS codes with D-FPSs, and all of the executable code files are open to researchers who have a practical LSI platform for PMD mitigation.
PMD mitigation through interleaving LDPC codes with polarization scramblers
NASA Astrophysics Data System (ADS)
Han, Dahai; Chen, Haoran; Xi, Lixia
2013-09-01
The combination of forward error correction (FEC) and distributed fast polarization scramblers (D-FPSs) is approved as an effective method to mitigate polarization mode dispersion (PMD) in high-speed optical fiber communication systems. The low-density parity-check (LDPC) codes are newly introduced into the PMD mitigation scheme with D-FPSs in this article as one of the promising FEC codes to achieve better performance. The scrambling speed of the FPS for an LDPC (2040, 1903) coded system is discussed, and a reasonable speed of 10 MHz is obtained from the simulation results. For easy application in a practical large-scale integrated (LSI) circuit, the number of iterations in decoding the LDPC codes is also investigated. The PMD tolerance and cut-off optical signal-to-noise ratio (OSNR) of LDPC codes are compared with Reed-Solomon (RS) codes under different conditions. In the simulation, the interleaving LDPC codes bring an incremental improvement in error correction, and the PMD tolerance is 10 ps at OSNR = 11.4 dB. The results show that LDPC codes are a substitute for traditional RS codes with D-FPSs, and all of the executable code files are open to researchers who have a practical LSI platform for PMD mitigation.
NASA Astrophysics Data System (ADS)
Horiuchi, Toshiyuki; Watanabe, Jun; Suzuki, Yuta; Iwasaki, Jun-ya
2017-05-01
Two-dimensional code marks are often used for production management. In particular, in production lines for liquid-crystal-display panels and other devices, data on fabrication processes such as the production number and process conditions are written on each substrate or device in detail and used for quality management. For this reason, lithography systems specialized for code-mark printing have been developed. However, conventional systems using lamp projection exposure or laser scan exposure are very expensive. Therefore, the development of a low-cost exposure system using light-emitting diodes (LEDs) and optical fibers with squared ends arrayed in a matrix is strongly expected. In past research, the feasibility of such a new exposure system was demonstrated using a handmade system equipped with 100 LEDs with a central wavelength of 405 nm, a 10×10 matrix of optical fibers with 1 mm square ends, and a 10X projection lens. Building on this progress, a new method for fabricating large-scale arrays of finer fibers with squared ends was developed in this paper. Up to 40 plastic optical fibers were arranged in a linear gap of an arraying instrument and simultaneously squared by heating them on a hotplate at 120°C for 7 min. Fiber sizes were homogeneous within 496 ± 4 μm. In addition, the average light leak was improved from 34.4 to 21.3% by adopting the new method in place of the conventional one-by-one squaring method. Square matrix arrays suitable for printing code marks will be obtained by stacking the newly fabricated linear arrays.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Walczak, Przemysław; Fontes, Christopher John; Colgan, James Patrick
Here, our goal is to test the newly developed OPLIB opacity tables from Los Alamos National Laboratory and check their influence on the pulsation properties of B-type stars. We calculated models using MESA and Dziembowski codes for stellar evolution and linear, nonadiabatic pulsations, respectively. We derived the instability domains of β Cephei and SPB-types for different opacity tables OPLIB, OP, and OPAL. As a result, the new OPLIB opacities have the highest Rosseland mean opacity coefficient near the so-called Z-bump. Therefore, the OPLIB instability domains are wider than in the case of OP and OPAL data.
Local-area simulations of rotating compressible convection and associated mean flows
NASA Technical Reports Server (NTRS)
Hurlburt, Neal E.; Brummell, N. H.; Toomre, Juri
1995-01-01
The dynamics of compressible convection within a curved local segment of a rotating spherical shell are considered in relation to the turbulent redistribution of angular momentum within the solar convection zone. Current supercomputers permit fully turbulent flows to be considered within the restricted geometry of local-area models. By considering motions in a curvilinear geometry in which the Coriolis parameter varies with latitude, Rossby waves that couple with the turbulent convection are thought to be possible. Simulations of rotating convection are presented in such a curved local segment of a spherical shell using a newly developed, sixth-order accurate code based on compact finite differences.
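The sketch below shows the kind of sixth-order compact finite-difference derivative such a code relies on, using the standard tridiagonal interior scheme (alpha = 1/3, a = 14/9, b = 1/9) on a periodic grid. It illustrates the numerical building block only and is not the simulation code itself.

```python
import numpy as np

def compact_d1_periodic(f, h):
    """Sixth-order compact first derivative on a periodic grid:
    (1/3) f'_{i-1} + f'_i + (1/3) f'_{i+1}
        = (14/9)(f_{i+1}-f_{i-1})/(2h) + (1/9)(f_{i+2}-f_{i-2})/(4h)."""
    n = len(f)
    alpha, a, b = 1.0 / 3.0, 14.0 / 9.0, 1.0 / 9.0
    # Left-hand tridiagonal (circulant) matrix
    A = np.eye(n) + alpha * (np.eye(n, k=1) + np.eye(n, k=-1))
    A[0, -1] = A[-1, 0] = alpha
    # Right-hand side built from shifted copies of f
    rhs = (a * (np.roll(f, -1) - np.roll(f, 1)) / (2 * h)
           + b * (np.roll(f, -2) - np.roll(f, 2)) / (4 * h))
    return np.linalg.solve(A, rhs)

n = 64
x = np.linspace(0.0, 2.0 * np.pi, n, endpoint=False)
err = np.max(np.abs(compact_d1_periodic(np.sin(x), x[1] - x[0]) - np.cos(x)))
print(f"max error: {err:.2e}")   # very small for a smooth periodic function
```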
Federal Register 2010, 2011, 2012, 2013, 2014
2012-11-20
... for the year to which they apply, of rents for existing or newly constructed rental dwelling units, as... Census geography. Furthermore, The Census Bureau will not continue to support both ZIP code and ZCTA...
Preparing Protocols for Institutional Review Boards.
ERIC Educational Resources Information Center
Lyons, Charles M.
1983-01-01
Introduces the process by which Institutional Review Boards (IRBs) review proposals for research involving human subjects. Describes the composition of IRBs. Presents the Nuremberg code, the elements of informed consent, the judging criteria for proposals, and a sample protocol format. References newly published regulations governing research with…
Ray-tracing critical-angle transmission gratings for the X-ray Surveyor and Explorer-size missions
NASA Astrophysics Data System (ADS)
Günther, Hans M.; Bautz, Marshall W.; Heilmann, Ralf K.; Huenemoerder, David P.; Marshall, Herman L.; Nowak, Michael A.; Schulz, Norbert S.
2016-07-01
We study a critical-angle transmission (CAT) grating spectrograph that delivers a spectral resolution significantly above that of any X-ray spectrograph ever flown. This new technology will allow us to resolve kinematic components in absorption and emission lines of galactic and extragalactic matter down to unprecedented dispersion levels. We perform ray-trace simulations to characterize the performance of the spectrograph in the context of an X-ray Surveyor or Arcus-like layout (two mission concepts currently under study). Our newly developed ray-trace code is a tool suite to simulate the performance of X-ray observatories. The simulator code is written in Python, because the use of a high-level scripting language allows modifications of the simulated instrument design in very few lines of code. This is especially important in the early phase of mission development, when the performances of different configurations are contrasted. To reduce the run time and allow for simulations of a few million photons in a few minutes on a desktop computer, the simulator code uses tabulated input (from theoretical models or laboratory measurements of samples) for grating efficiencies and mirror reflectivities. We find that the grating facet alignment tolerances required to maintain at least 90% of the resolving power that the spectrometer has with perfect alignment are (i) translation parallel to the optical axis below 0.5 mm, (ii) rotation around the optical axis or the groove direction below a few arcminutes, and (iii) constancy of the grating period to 1:10^5. Translations along and rotations around the remaining axes can be significantly larger than this without impacting the performance.
Development of slow control system for the Belle II ARICH counter
NASA Astrophysics Data System (ADS)
Yonenaga, M.; Adachi, I.; Dolenec, R.; Hataya, K.; Iori, S.; Iwata, S.; Kakuno, H.; Kataura, R.; Kawai, H.; Kindo, H.; Kobayashi, T.; Korpar, S.; Križan, P.; Kumita, T.; Mrvar, M.; Nishida, S.; Ogawa, K.; Ogawa, S.; Pestotnik, R.; Šantelj, L.; Sumiyoshi, T.; Tabata, M.; Yusa, Y.
2017-12-01
A slow control system (SCS) for the Aerogel Ring Imaging Cherenkov (ARICH) counter in the Belle II experiment was newly developed and coded within the development frameworks of the Belle II DAQ software. The ARICH is based on 420 Hybrid Avalanche Photo-Detectors (HAPDs). Each HAPD has 144 pixels to be read out and requires 6 power supply (PS) channels; therefore, a total of 2520 PS channels and 60,480 pixels have to be configured and controlled. Graphical User Interfaces (GUIs) with detector-oriented and device-oriented views were also implemented to ease detector operation. The ARICH SCS is in operation for detector construction and cosmic-ray tests. The paper describes the detailed features of the SCS and preliminary results from operation of a reduced set of hardware, which confirm the scalability to the full detector.
Oh, Ahyuda; Thurman, David J; Kim, Hyunmi
2017-10-01
Neurobehavioral comorbidities can be related to the underlying etiology of epilepsy, epilepsy itself, and adverse effects of antiepileptic drugs. We examined the relationship between neurobehavioral comorbidities and putative risk factors for epilepsy in children with newly diagnosed epilepsy. We conducted a retrospective analysis of children aged ≤18 years in 50 states and the District of Columbia, using the Truven Health MarketScan® commercial claims and encounters database from January 1, 2009 to December 31, 2013. The eligible study cohort was continuously enrolled throughout 2013 as well as enrolled for any days during a baseline period of at least the prior 2 years. Newly diagnosed cases of epilepsy were defined by International Classification of Diseases, Ninth Revision, Clinical Modification-coded diagnoses of epilepsy or recurrent seizures and evidence of prescribed antiepileptic drugs during 2013, when neither seizure codes nor seizure medication claims were recorded during baseline periods. Twelve neurobehavioral comorbidities and eleven putative risk factors for epilepsy were measured. More than 6 million children were analyzed (male, 51%; mean age, 8.8 years). A total of 7654 children were identified as having newly diagnosed epilepsy (125 per 100,000, 99% CI=122-129). Neurobehavioral comorbidities were more prevalent in children with epilepsy than in children without epilepsy (60%, 99% CI=58.1-61.0 vs. 23%, CI=23.1-23.2). Children with epilepsy were far more likely to have multiple comorbidities (36%, 99% CI=34.3-37.1) than those without epilepsy (8%, 99% CI=7.45-7.51, P<0.001). Preexisting putative risk factors for epilepsy were detected in 28% (99% CI=26.9-29.6) of children with epilepsy. After controlling for demographics, neurobehavioral comorbidities, family history of epilepsy, and risk factors other than the one of primary interest, neonatal seizures had the strongest independent association with the development of epilepsy (OR=29.8, 99% CI=23.7-37.3, P<0.001). Compared with children with risk factors but no epilepsy, those with both epilepsy and risk factors were more likely to have intellectual disabilities (OR=13.4, 99% CI=11.9-15.0, P<0.001). Epilepsy and intellectual disabilities could share a common pathophysiology in the neuronal network.
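The odds ratios quoted above are regression-adjusted, so they cannot be reproduced from simple counts; still, the basic unadjusted calculation with a 99% confidence interval is shown below on a hypothetical 2x2 table, purely to make the reported quantities concrete.

```python
import math

# Unadjusted odds ratio with a 99% CI from a hypothetical 2x2 table
# (exposed/unexposed vs epilepsy yes/no); counts are placeholders, and the
# ORs quoted in the abstract are regression-adjusted, not computed this way.
a, b = 120, 30      # exposed:   with epilepsy, without epilepsy
c, d = 500, 4000    # unexposed: with epilepsy, without epilepsy

odds_ratio = (a * d) / (b * c)
se_log_or = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
z99 = 2.576         # normal quantile for a two-sided 99% interval
lo = math.exp(math.log(odds_ratio) - z99 * se_log_or)
hi = math.exp(math.log(odds_ratio) + z99 * se_log_or)
print(f"OR = {odds_ratio:.1f}, 99% CI {lo:.1f}-{hi:.1f}")
```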
DOT National Transportation Integrated Search
1980-01-01
Senate Bill 85, an action of the 1978 General Assembly, amended the Code of Virginia to provide, in part, that the Division of Highway Safety be succeeded by the newly created Department of Transportation Safety effective July 1, 1978. In its Declara...
BEARKIMPE-2: A VBA Excel program for characterizing granular iron in treatability studies
NASA Astrophysics Data System (ADS)
Firdous, R.; Devlin, J. F.
2014-02-01
The selection of a suitable kinetic model to investigate the reaction rate of a contaminant with granular iron (GI) is essential to optimize permeable reactive barrier (PRB) performance in terms of its reactivity. The newly developed Kinetic Iron Model (KIM) determines the surface rate constant (k) and sorption parameters (Cmax and J), which could not previously be uniquely identified. The code, written in Visual Basic for Applications (VBA) within Microsoft Excel, was adapted from earlier command-line FORTRAN codes, BEARPE and KIMPE. The program is organized with several user-interface screens (UserForms) that guide the user step by step through the analysis. BEARKIMPE-2 uses a non-linear optimization algorithm to calculate transport and chemical kinetic parameters. Both reactive and non-reactive sites are considered. A demonstration of the functionality of BEARKIMPE-2 with three nitroaromatic compounds showed that the differences in reaction rates for these compounds could be attributed to differences in their sorption behavior rather than their propensities to accept electrons in the reduction process.
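As a generic illustration of the non-linear optimization step mentioned above (and explicitly not the KIM equations), the sketch below fits a first-order surface rate constant to synthetic batch concentration data with scipy.optimize.curve_fit.

```python
import numpy as np
from scipy.optimize import curve_fit

# Generic illustration of non-linear parameter estimation: fit a first-order
# surface rate constant k to batch concentration-vs-time data. This is NOT
# the KIM model; it only shows the optimization machinery.
def first_order(t, c0, k):
    return c0 * np.exp(-k * t)

rng = np.random.default_rng(1)
t = np.linspace(0.0, 48.0, 13)                                       # hours
c_obs = first_order(t, 50.0, 0.08) + rng.normal(0.0, 0.8, t.size)    # synthetic data

(c0_fit, k_fit), cov = curve_fit(first_order, t, c_obs, p0=[40.0, 0.05])
print(f"fitted C0 = {c0_fit:.1f} mg/L, k = {k_fit:.3f} 1/h")
```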
DOE Office of Scientific and Technical Information (OSTI.GOV)
Albert, Christopher G.; Heyn, Martin F.; Kapper, Gernot
Toroidal torque generated by neoclassical viscosity caused by external non-resonant, non-axisymmetric perturbations has a significant influence on toroidal plasma rotation in tokamaks. In this article, a derivation of the expressions for toroidal torque and radial transport in resonant regimes is provided within quasilinear theory in canonical action-angle variables. The proposed approach treats all low-collisional quasilinear resonant neoclassical toroidal viscosity regimes, including superbanana-plateau and drift-orbit resonances, in a unified way and allows for magnetic drift in all regimes. It is valid for perturbations on toroidally symmetric flux surfaces of the unperturbed equilibrium without specific assumptions on geometry or aspect ratio. The resulting expressions are shown to match the existing analytical results in the large aspect ratio limit. Numerical results from the newly developed code NEO-RT are compared to calculations by the quasilinear version of the code NEO-2 at low collisionalities. The importance of the magnetic shear term in the magnetic drift frequency and a significant effect of the magnetic drift on drift-orbit resonances are demonstrated.
Numerical studies of the fluid and optical fields associated with complex cavity flows
NASA Technical Reports Server (NTRS)
Atwood, Christopher A.
1992-01-01
Numerical solutions for the flow field about several cavity configurations have been computed using the Reynolds-averaged Navier-Stokes equations. Comparisons between numerical and experimental results are made in two dimensions for free shear layers and a rectangular cavity, and in three dimensions for the transonic aero-window problem of the Stratospheric Observatory for Infrared Astronomy (SOFIA). Results show that the dominant acoustic frequencies and magnitudes of the self-excited resonant cavity flows compare well with experiment. In addition, solution sensitivity to artificial dissipation and grid resolution levels is determined. Optical path distortion due to the flow field is modelled geometrically and is found to match the experiment. The flow field was computed using a diagonalized scheme within an overset mesh framework. An existing code, OVERFLOW, was utilized with the addition of characteristic boundary condition and output routines required for reduction of the unsteady data. The newly developed code is directly applicable to a generalized three-dimensional structured grid zone. Details are provided in a paper included in Appendix A.
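The geometric optical-path modelling referred to above amounts to integrating the refractive index along rays through the computed density field; a common choice is the Gladstone-Dale relation n - 1 = K_GD * rho. The sketch below applies it to a hypothetical density field, so the printed number is illustrative only.

```python
import numpy as np

# Optical path difference (OPD) across a shear layer, integrating the
# Gladstone-Dale relation n - 1 = K_GD * rho along straight rays. The density
# field below is a hypothetical placeholder, not a computed cavity solution.
K_GD = 2.27e-4      # m^3/kg, approximate Gladstone-Dale constant for air
rho_inf = 0.4       # kg/m^3, freestream density (placeholder)

z = np.linspace(0.0, 0.3, 200)     # integration path through the flow, m
x = np.linspace(-0.1, 0.1, 50)     # aperture coordinate, m
rho = rho_inf * (1.0 - 0.15 * np.exp(-(x[:, None] - 0.02) ** 2 / 0.001)
                 * np.sin(40.0 * z[None, :]) ** 2)        # toy rho(x, z)

n_field = 1.0 + K_GD * rho
# Trapezoidal integration of n along z for each ray (each aperture point x)
opl = np.sum(0.5 * (n_field[:, 1:] + n_field[:, :-1]) * np.diff(z), axis=1)
opd = opl - opl.mean()             # distortion relative to the mean path length
print(f"peak-to-peak OPD: {np.ptp(opd) * 1e9:.1f} nm")
```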
Numerical modelling of the Madison Dynamo Experiment.
NASA Astrophysics Data System (ADS)
Bayliss, R. A.; Wright, J. C.; Forest, C. B.; O'Connell, R.; Truitt, J. L.
2000-10-01
Growth, saturation, and turbulent evolution of the Madison dynamo experiment are investigated numerically using a newly developed 3-D pseudo-spectral simulation of the MHD equations; results of the simulations will be compared to the experimental results obtained from the experiment. The code, Dynamo, is written in Fortran90 and allows for full evolution of the magnetic and velocity fields. The induction equation governing B and the Navier-Stokes equation governing V are solved. The code uses a spectral representation via spherical harmonic basis functions of the vector fields in longitude and latitude, and finite differences in the radial direction. The magnetic field evolution has been benchmarked against the laminar kinematic dynamo predicted by M.L. Dudley and R.W. James (M.L. Dudley and R.W. James, Time-dependent kinematic dynamos with stationary flows, Proc. R. Soc. Lond. A 425, p. 407 (1989)). Initial results on magnetic field saturation, generated by the simultaneous evolution of magnetic and velocity fields, will be presented using a variety of mechanical forcing terms.
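The discretization strategy described above (spherical-harmonic expansion in angle, finite differences in radius) can be illustrated with a minimal sketch; this is not the Fortran90 Dynamo code, and the grid sizes and boundary handling below are placeholder assumptions.

```python
# Minimal sketch (not the Dynamo code): for each spherical-harmonic mode
# (l, m), the radial part of a field lives on a 1-D grid, so the radial
# part of the Laplacian reduces to a simple finite-difference operator.
import numpy as np

nr, r_in, r_out = 64, 0.1, 1.0
r = np.linspace(r_in, r_out, nr)
dr = r[1] - r[0]

def radial_laplacian(f_lm, l):
    """Centered FD of f'' + (2/r) f' - l(l+1) f / r^2 for one (l, m) mode."""
    lap = np.zeros_like(f_lm)
    lap[1:-1] = ((f_lm[2:] - 2 * f_lm[1:-1] + f_lm[:-2]) / dr**2
                 + (f_lm[2:] - f_lm[:-2]) / (dr * r[1:-1])
                 - l * (l + 1) * f_lm[1:-1] / r[1:-1]**2)
    return lap  # boundary rows left zero; real codes impose boundary conditions

# example: apply the operator to one (l, m) radial profile
f = np.sin(np.pi * (r - r_in) / (r_out - r_in))
print(radial_laplacian(f, l=2)[:5])
```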
Functional annotation of the vlinc class of non-coding RNAs using systems biology approach.
St Laurent, Georges; Vyatkin, Yuri; Antonets, Denis; Ri, Maxim; Qi, Yao; Saik, Olga; Shtokalo, Dmitry; de Hoon, Michiel J L; Kawaji, Hideya; Itoh, Masayoshi; Lassmann, Timo; Arner, Erik; Forrest, Alistair R R; Nicolas, Estelle; McCaffrey, Timothy A; Carninci, Piero; Hayashizaki, Yoshihide; Wahlestedt, Claes; Kapranov, Philipp
2016-04-20
Functionality of the non-coding transcripts encoded by the human genome is a coveted goal of modern genomics research. While the field has commonly relied on classical methods of forward genetics, integration of different genomics datasets in a global Systems Biology fashion presents a more productive avenue for achieving this very complex aim. Here we report application of a Systems Biology-based approach to dissect functionality of a newly identified vast class of very long intergenic non-coding (vlinc) RNAs. Using the highly quantitative FANTOM5 CAGE dataset, we show that these RNAs could be grouped into 1542 novel human genes based on analysis of insulators, which we show here indeed function as genomic barrier elements. We show that vlinc RNA genes likely function in cis to activate nearby genes. This effect, while most pronounced in closely spaced vlinc RNA-gene pairs, can be detected over relatively large genomic distances. Furthermore, we identified 101 vlinc RNA genes likely involved in early embryogenesis based on patterns of their expression and regulation. We also found another 109 such genes potentially involved in cellular functions that also occur at early stages of development, such as proliferation, migration and apoptosis. Overall, we show that Systems Biology-based methods have great promise for functional annotation of non-coding RNAs. © The Author(s) 2016. Published by Oxford University Press on behalf of Nucleic Acids Research.
A Java-Enabled Interactive Graphical Gas Turbine Propulsion System Simulator
NASA Technical Reports Server (NTRS)
Reed, John A.; Afjeh, Abdollah A.
1997-01-01
This paper describes a gas turbine simulation system which utilizes the newly developed Java language environment software system. The system provides an interactive graphical environment which allows the quick and efficient construction and analysis of arbitrary gas turbine propulsion systems. The simulation system couples a graphical user interface, developed using the Java Abstract Window Toolkit, and a transient, space-averaged, aero-thermodynamic gas turbine analysis method, both entirely coded in the Java language. The combined package provides analytical, graphical and data management tools which allow the user to construct and control engine simulations by manipulating graphical objects on the computer display screen. Distributed simulations, including parallel processing and distributed database access across the Internet and World-Wide Web (WWW), are made possible through services provided by the Java environment.
NASA Astrophysics Data System (ADS)
Saitou, Y.
2018-01-01
An SPH (Smoothed Particle Hydrodynamics) simulation code is developed to reproduce our findings on the behavior of dust particles, which were obtained in our previous experiments (Phys. Plasmas, 23, 013709 (2016) and Abst. 18th Intern. Cong. Plasma Phys. (Kaohsiung, 2016)). Usually, in an SPH simulation, a smoothed particle is interpreted as a discretized fluid element. Here we regard the particles as dust particles because it is known that the behavior of dust particles in complex plasmas can be described using fluid dynamics equations in many cases. In the newly developed simulation, various rotation velocities that are difficult to achieve in the experiment are imposed on particles at the boundaries, and the resulting motion of the particles is investigated. Preliminary results obtained by the simulation are shown.
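As a rough illustration of the SPH building blocks such a code rests on, the sketch below shows a standard cubic-spline kernel and the summation-density estimate; it is not the authors' dust-particle code, and all parameters and sizes are illustrative.

```python
# Toy sketch of standard SPH ingredients (cubic-spline kernel and summation
# density); illustrative only, not the code described in the abstract.
import numpy as np

def cubic_spline_W(r, h):
    """Standard 2-D cubic-spline smoothing kernel."""
    q = r / h
    sigma = 10.0 / (7.0 * np.pi * h**2)
    w = np.where(q < 1.0, 1 - 1.5 * q**2 + 0.75 * q**3,
                 np.where(q < 2.0, 0.25 * (2 - q)**3, 0.0))
    return sigma * w

def sph_density(positions, masses, h):
    """Summation density rho_i = sum_j m_j W(|r_i - r_j|, h)."""
    diff = positions[:, None, :] - positions[None, :, :]
    r = np.linalg.norm(diff, axis=-1)
    return (masses[None, :] * cubic_spline_W(r, h)).sum(axis=1)

pos = np.random.rand(200, 2)                      # 200 "dust" particles in a unit box
rho = sph_density(pos, np.full(200, 1.0 / 200), h=0.1)
print(rho.mean())
```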
A computer code for calculations in the algebraic collective model of the atomic nucleus
NASA Astrophysics Data System (ADS)
Welsh, T. A.; Rowe, D. J.
2016-03-01
A Maple code is presented for algebraic collective model (ACM) calculations. The ACM is an algebraic version of the Bohr model of the atomic nucleus, in which all required matrix elements are derived by exploiting the model's SU(1,1) × SO(5) dynamical group. This paper reviews the mathematical formulation of the ACM, and serves as a manual for the code. The code enables a wide range of model Hamiltonians to be analysed. This range includes essentially all Hamiltonians that are rational functions of the model's quadrupole moments q̂_M and are at most quadratic in the corresponding conjugate momenta π̂_N (-2 ≤ M, N ≤ 2). The code makes use of expressions for matrix elements derived elsewhere and newly derived matrix elements of the operators [π̂ ⊗ q̂ ⊗ π̂]_0 and [π̂ ⊗ π̂]_{LM}. The code is made efficient by use of an analytical expression for the needed SO(5)-reduced matrix elements, and use of SO(5) ⊃ SO(3) Clebsch-Gordan coefficients obtained from precomputed data files provided with the code.
Deep electrode insertion and sound coding in cochlear implants.
Hochmair, Ingeborg; Hochmair, Erwin; Nopp, Peter; Waller, Melissa; Jolly, Claude
2015-04-01
Present-day cochlear implants demonstrate remarkable speech understanding performance despite the use of non-optimized coding strategies concerning the transmission of tonal information. Most systems rely on place pitch information despite possibly large deviations from correct tonotopic placement of stimulation sites. Low frequency information is limited as well because of the constant pulse rate stimulation generally used and, being even more restrictive, of the limited insertion depth of the electrodes. This results in a compromised perception of music and tonal languages. Newly available flexible long straight electrodes permit deep insertion reaching the apical region with little or no insertion trauma. This article discusses the potential benefits of deep insertion which are obtained using pitch-locked temporal stimulation patterns. Besides the access to low frequency information, further advantages of deeply inserted long electrodes are the possibility to better approximate the correct tonotopic location of contacts, the coverage of a wider range of cochlear locations, and the somewhat reduced channel interaction due to the wider contact separation for a given number of channels. A newly developed set of strategies has been shown to improve speech understanding in noise and to enhance sound quality by providing a more "natural" impression, which especially becomes obvious when listening to music. The benefits of deep insertion should not, however, be compromised by structural damage during insertion. The small cross section and the high flexibility of the new electrodes can help to ensure less traumatic insertions as demonstrated by patients' hearing preservation rate. This article is part of a Special Issue entitled
Hiding in Plain Sight: Rediscovering the Importance of Noncoding RNA in Human Malignancy.
Feeley, Kyle P; Edmonds, Mick D
2018-05-01
At the time of its construction in the 1950s, the central dogma of molecular biology was a useful model that represented the current state of knowledge for the flow of genetic information after a period of prolific scientific discovery. Unknowingly, it also biased many of our assumptions going forward. Whether intentional or not, genomic elements not fitting into this paradigm were deemed unimportant and emphasis on the study of protein-coding genes prevailed for decades. The phrase "Junk DNA," first popularized in the 1960s, is still used with alarming frequency to describe the entirety of noncoding DNA. It has since become apparent that RNA molecules not coding for protein are vitally important in both normal development and human malignancy. Cancer researchers have been pioneers in determining noncoding RNA function and developing new technologies to study these molecules. In this review, we will discuss well known and newly emerging species of noncoding RNAs, their functions in cancer, and new technologies being utilized to understand their mechanisms of action in cancer. Cancer Res; 78(9); 2149-58. ©2018 AACR . ©2018 American Association for Cancer Research.
Merlaen, Britt; De Keyser, Ellen; Van Labeke, Marie-Christine
2018-01-01
The newly identified aquaporin coding sequences presented here pave the way for further insights into the plant-water relations in the commercial strawberry (Fragaria x ananassa). Aquaporins are water channel proteins that allow water to cross (intra)cellular membranes. In Fragaria x ananassa, few of them have been identified hitherto, hampering the exploration of the water transport regulation at cellular level. Here, we present new aquaporin coding sequences belonging to different subclasses: plasma membrane intrinsic proteins subtype 1 and subtype 2 (PIP1 and PIP2) and tonoplast intrinsic proteins (TIP). The classification is based on phylogenetic analysis and is confirmed by the presence of conserved residues. Substrate-specific signature sequences (SSSSs) and specificity-determining positions (SDPs) predict the substrate specificity of each new aquaporin. Expression profiling in leaves, petioles and developing fruits reveals distinct patterns, even within the same (sub)class. Expression profiles range from leaf-specific expression over constitutive expression to fruit-specific expression. Both upregulation and downregulation during fruit ripening occur. Substrate specificity and expression profiles suggest that functional specialization exists among aquaporins belonging to a different but also to the same (sub)class.
Seismic Analysis Code (SAC): Development, porting, and maintenance within a legacy code base
NASA Astrophysics Data System (ADS)
Savage, B.; Snoke, J. A.
2017-12-01
The Seismic Analysis Code (SAC) is the result of the toil of many developers over an almost 40-year history. Initially a Fortran-based code, it has undergone major transitions in underlying bit size from 16 to 32, in the 1980s, and 32 to 64 in 2009, as well as a change in language from Fortran to C in the late 1990s. Maintenance of SAC, the program and its associated libraries, has tracked changes in hardware and operating systems including the advent of Linux in the early 1990s, the emergence and demise of Sun/Solaris, variants of OSX processors (PowerPC and x86), and Windows (Cygwin). Traces of these systems are still visible in source code and associated comments. A major concern while improving and maintaining a routinely used, legacy code is a fear of introducing bugs or inadvertently removing favorite features of long-time users. Prior to 2004, SAC was maintained and distributed by LLNL (Lawrence Livermore National Lab). In that year, the license was transferred from LLNL to IRIS (Incorporated Research Institutions for Seismology), but the license is not open source. However, there have been thousands of downloads a year of the package, either source code or binaries for specific systems. Starting in 2004, the co-authors have maintained the SAC package for IRIS. In our updates, we fixed bugs, incorporated newly introduced seismic analysis procedures (such as EVALRESP), added new, accessible features (plotting and parsing), and improved the documentation (now in HTML and PDF formats). Moreover, we have added modern software engineering practices to the development of SAC including use of recent source control systems, high-level tests, and scripted, virtualized environments for rapid testing and building. Finally, a "sac-help" listserv (administered by IRIS) was set up for SAC-related issues and is the primary avenue for users seeking advice and reporting bugs. Attempts are always made to respond to issues and bugs in a timely fashion. For the past thirty-plus years, SAC files have contained a fixed-length header. Time- and distance-related values are stored in single precision, which has become a problem with the increase in desired precision for data compared to thirty years ago. A future goal is to address this precision problem, but in a backward-compatible manner. We would also like to transition SAC to a more open source license.
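The header-precision issue mentioned above is easy to demonstrate: a single-precision float carries roughly seven significant decimal digits, so large time or distance values lose sub-sample resolution. The snippet below only illustrates that effect; it is not SAC code.

```python
# Quick illustration of the single-precision header limitation: a float32
# value keeps ~7 significant decimal digits, so a time offset near 1e6 s
# can no longer resolve millisecond-level differences.
import numpy as np

t64 = 1_234_567.8912          # seconds, double precision
t32 = np.float32(t64)         # what a single-precision header field keeps
print(f"float64: {t64:.4f}")
print(f"float32: {float(t32):.4f}")
print(f"lost precision: {abs(t64 - float(t32)):.4f} s")
```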
Stormwater Management Plan for the Arden Hills Army Training Site, Arden Hills, Minnesota
DOE Office of Scientific and Technical Information (OSTI.GOV)
Carr, Adrianne E.; Wuthrich, Kelsey K.; Ziech, Angela M.
2013-03-01
This stormwater management plan focuses on the cantonment and training areas of the Arden Hills Army Training Site (AHATS). The plan relates the site stormwater to the regulatory framework, and it summarizes best management practices to aid site managers in promoting clean site runoff. It includes documentation for a newly developed, detailed model of stormwater flow retention for the entire AHATS property and adjacent upgradient areas. The model relies on established modeling codes integrated in a U.S. Department of Defense-sponsored software tool, the Watershed Modeling System (WMS), and it can be updated with data on changes in land use or with monitoring data.
NASA Astrophysics Data System (ADS)
Yan, Rui; Cao, Shihui; Wan, Zhenhua; Hu, Guangyue; Zheng, Jian; Hao, Liang; Liu, Wenda; Ren, Chuang
2017-10-01
We push our FLAME project forward with a newly developed code, FLAME-MD (Multi-Dimensional), based on the fluid model presented in Ref. Simulations are performed to study the two-plasmon decay (TPD) instability and stimulated Raman scattering (SRS) in three dimensions (3D) with parameters relevant to ICF. 3D effects on the growth of TPD and SRS, including laser polarizations and multi-beam configurations, are studied. This material is based upon work supported by the National Natural Science Foundation of China (NSFC) under Grant Nos. 11642020 and 11621202; by the Science Challenge Project (No. JCKY2016212A505); and by DOE Office of Fusion Energy Sciences Grant DE-SC0014318.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bakosi, Jozsef; Christon, Mark A.; Francois, Marianne M.
Progress is reported on computational capabilities for the grid-to-rod-fretting (GTRF) problem of pressurized water reactors. Numeca's Hexpress/Hybrid mesh generator is demonstrated as an excellent alternative to generating computational meshes for complex flow geometries, such as in GTRF. Mesh assessment is carried out using standard industrial computational fluid dynamics practices. Hydra-TH, a simulation code developed at LANL for reactor thermal-hydraulics, is demonstrated on hybrid meshes containing different element types. A series of new Hydra-TH calculations has been carried out collecting turbulence statistics. Preliminary results on the newly generated meshes are discussed; full analysis will be documented in the L3 milestone, THM.CFD.P5.05, Sept. 2012.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Liang, Thomas K.S.; Ko, F.-K
Although only a few percent of residual power remains during plant outages, the associated risk of core uncovery and corresponding fuel overheating has been identified to be relatively high, particularly under midloop operation (MLO) in pressurized water reactors. However, to analyze the system behavior during outages, the tools currently available, such as RELAP5, RETRAN, etc., cannot easily perform the task. Therefore, a medium-sized program aiming at reactor outage simulation and evaluation, such as MLO with the loss of residual heat removal (RHR), was developed. All important thermal-hydraulic processes involved during MLO with the loss of RHR will be properly simulated by the newly developed reactor outage simulation and evaluation (ROSE) code. Important processes during MLO with loss of RHR involve a pressurizer insurge caused by the hot-leg flooding, reflux condensation, liquid holdup inside the steam generator, loop-seal clearance, core-level depression, etc. Since the accuracy of the pressure distribution from the classical nodal momentum approach will be degraded when the system is stratified and under atmospheric pressure, the two-region approach with a modified two-fluid model will be the theoretical basis of the new program to analyze the nuclear steam supply system during plant outages. To verify the analytical model in the first step, posttest calculations against the closed integral midloop experiments with loss of RHR were performed. The excellent simulation capacity of the ROSE code against the Institute of Nuclear Energy Research Integral System Test Facility (IIST) test data is demonstrated.
A New Way to Confirm Planet Candidates
NASA Astrophysics Data System (ADS)
Kohler, Susanna
2016-05-01
What was the big deal behind the Kepler news conference yesterday? It's not just that the number of confirmed planets found by Kepler has more than doubled (though that's certainly exciting news!). What's especially interesting is the way in which these new planets were confirmed. [Figure: Number of planet discoveries by year since 1995, including previous non-Kepler discoveries (blue), previous Kepler discoveries (light blue) and the newly validated Kepler planets (orange). NASA Ames/W. Stenzel; Princeton University/T. Morton] No Need for Follow-Up. Before Kepler, the way we confirmed planet candidates was with follow-up observations. The candidate could be validated either by direct imaging (which is rare) or by obtaining a large number of radial-velocity measurements of the wobble of the planet's host star due to the planet's orbit. But once Kepler started producing planet candidates, these approaches to validation became less feasible. A lot of Kepler candidates are small and orbit faint stars, making follow-up observations difficult or impossible. This problem is what inspired the development of what's known as probabilistic validation, an analysis technique that involves assessing the likelihood that the candidate's signal is caused by various false-positive scenarios. Using this technique allows astronomers to estimate the likelihood of a candidate signal being a true planet detection; if that likelihood is high enough, the planet candidate can be confirmed without the need for follow-up observations. [Figure: A breakdown of the catalog of Kepler Objects of Interest. Just over half had previously been identified as false positives or confirmed as candidates; 1284 are newly validated, and another 455 have FPP of 10-90%. Morton et al. 2016] Probabilistic validation has been used in the past to confirm individual planet candidates in Kepler data, but now Timothy Morton (Princeton University) and collaborators have taken this to a new level: they developed the first code that is designed to do fully automated batch processing of a large number of candidates. In a recently published study, the results of which were announced yesterday, the team applied their code to the entire catalog of 7,470 Kepler objects of interest. New Planets and False Positives. The team's code was able to successfully evaluate the total false-positive probability (FPP) for 7,056 of the objects of interest. Of these, 428 objects previously identified as candidates were found to have an FPP of more than 90%, suggesting that they are most likely false positives. [Figure: Periods and radii of candidate and confirmed planets in the Kepler Objects of Interest catalog. Blue circles have previously been identified as confirmed planets. Candidates (orange) are shaded by false-positive probability; more transparent means more likely to be a false positive. Morton et al. 2016] In contrast, 1,935 candidates were found to have an FPP of less than 1%, and were therefore declared validated planets. Of these confirmations, 1,284 were previously unconfirmed, more than doubling Kepler's previous catalog of 1,041 confirmed planets. Morton and collaborators believe that 9 of these newly confirmed planets may fall within the habitable zone of their host stars. While the announcement of 1,284 newly confirmed planets is huge, the analysis presented in this study is the real news. The code used is publicly available and can be applied to any transiting exoplanet candidate.
This means that this analysis technique can be used to find batches of exoplanets in data from the extended Kepler mission (K2) or from the future TESS and PLATO transit missions. Citation: Timothy D. Morton et al 2016 ApJ 822 86. doi:10.3847/0004-637X/822/2/86
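The decision rule described in the article (validate below 1% FPP, flag above 90% as a likely false positive) can be sketched as follows; the FPP values are invented, since computing real ones requires the full probabilistic-validation machinery.

```python
# Sketch of the classification thresholds described above. FPP values are
# made up; real FPPs come from modeling all false-positive scenarios.
def classify(fpp):
    if fpp < 0.01:
        return "validated planet"
    if fpp > 0.90:
        return "likely false positive"
    return "remains a candidate"

candidates = {"KOI-0001.01": 0.002, "KOI-0002.01": 0.97, "KOI-0003.01": 0.40}
for koi, fpp in candidates.items():
    print(koi, "->", classify(fpp))
```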
An Extended Proof-Carrying Code Framework for Security Enforcement
NASA Astrophysics Data System (ADS)
Pirzadeh, Heidar; Dubé, Danny; Hamou-Lhadj, Abdelwahab
The rapid growth of the Internet has resulted in increased attention to security to protect users from being victims of security threats. In this paper, we focus on security mechanisms that are based on Proof-Carrying Code (PCC) techniques. In a PCC system, a code producer sends code along with its safety proof to the consumer. The consumer executes the code only if the proof is valid. Although PCC has been shown to be a useful security framework, it suffers from the sheer size of typical proofs: proofs of even small programs can be considerably large. In this paper, we propose an extended PCC framework (EPCC) in which, instead of the proof, a proof generator for the program in question is transmitted. This framework enables the execution of the proof generator and the recovery of the proof on the consumer's side in a secure manner, using a newly created virtual machine called the VEP (Virtual Machine for Extended PCC).
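A highly simplified sketch of the EPCC workflow described above: the consumer runs the received proof generator in a restricted environment, recovers the proof, checks it, and only then executes the code. All components below are toy stand-ins, not the VEP.

```python
# Abstract sketch of the EPCC idea (not the VEP): run the proof generator,
# recover the proof on the consumer side, validate it, then execute.
def consumer_execute(code, proof_generator, check_proof, sandbox_run):
    proof = sandbox_run(proof_generator)   # recover the proof locally
    if not check_proof(code, proof):       # validity check as in classic PCC
        raise RuntimeError("proof invalid: refusing to execute untrusted code")
    return code()                          # proof accepted: safe to run

# toy stand-ins for the real components
code = lambda: "payload executed"
proof_generator = lambda: {"claims": ["memory-safe"]}
check_proof = lambda c, p: "memory-safe" in p["claims"]
sandbox_run = lambda gen: gen()            # a real system would isolate this call
print(consumer_execute(code, proof_generator, check_proof, sandbox_run))
```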
NASA Technical Reports Server (NTRS)
Rajagopal, Kadambi R.; DebChaudhury, Amitabha; Orient, George
2000-01-01
This report describes a probabilistic structural analysis performed to determine the probabilistic structural response under fluctuating random pressure loads for the Space Shuttle Main Engine (SSME) turnaround vane. It uses a newly developed frequency- and distance-dependent correlation model that has features to model the decay phenomena along the flow and across the flow, with the capability to introduce a phase delay. The analytical results are compared using two computer codes, SAFER (Spectral Analysis of Finite Element Responses) and NESSUS (Numerical Evaluation of Stochastic Structures Under Stress), and with experimentally observed strain gage data. The computer code NESSUS, with an interface to a subset of the Composite Load Spectra (CLS) code, is used for the probabilistic analysis. A fatigue code was used to calculate fatigue damage due to the random pressure excitation. The random variables modeled include engine system primitive variables that influence the operating conditions, convection velocity coefficient, stress concentration factor, structural damping, and thickness of the inner and outer vanes. The need for an appropriate correlation model in addition to the magnitude of the PSD is emphasized. The study demonstrates that correlation characteristics even under random pressure loads are capable of causing resonance-like effects for some modes. The study identifies the important variables that contribute to the structural alternate stress response and drive the fatigue damage for the new design. Since the alternate stress for the new redesign is less than the endurance limit for the material, the damage due to high-cycle fatigue is negligible.
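As an illustration of what a frequency- and distance-dependent correlation model with a phase delay can look like, the sketch below uses an exponentially decaying coherence with a convective phase term; the decay coefficient and convection velocity are placeholders, not the SSME values used in the report.

```python
# Hedged sketch of a generic frequency- and distance-dependent correlation
# model: coherence decays with separation and frequency, and a phase delay
# is set by the convection velocity. All numbers are placeholders.
import numpy as np

def cross_spectrum(f, dx, Uc=300.0, alpha=0.1):
    """Complex coherence between two points separated by dx along the flow."""
    decay = np.exp(-alpha * 2 * np.pi * f * abs(dx) / Uc)   # amplitude decay
    phase = np.exp(-1j * 2 * np.pi * f * dx / Uc)           # convective delay
    return decay * phase

f = np.linspace(1.0, 2000.0, 5)     # Hz
print(np.abs(cross_spectrum(f, dx=0.05)))
```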
DOT National Transportation Integrated Search
1981-01-01
Senate Bill 85, an action of the 1978 General Assembly, amended the Code of Virginia to provide, in part, that the Division of Highway Safety be succeeded by the newly created Department of Transportation Safety effective July 1, 1978. In its Declara...
NASA Astrophysics Data System (ADS)
Papior, Nick; Lorente, Nicolás; Frederiksen, Thomas; García, Alberto; Brandbyge, Mads
2017-03-01
We present novel methods implemented within the non-equilibrium Green function (NEGF) code TRANSIESTA based on density functional theory (DFT). Our flexible, next-generation DFT-NEGF code handles devices with one or multiple electrodes (Ne ≥ 1) with individual chemical potentials and electronic temperatures. We describe its novel methods for electrostatic gating, contour optimizations, and assertion of charge conservation, as well as the newly implemented algorithms for optimized and scalable matrix inversion, performance-critical pivoting, and hybrid parallelization. Additionally, a generic NEGF "post-processing" code (TBTRANS/PHTRANS) for electron and phonon transport is presented with several novelties such as Hamiltonian interpolations, Ne ≥ 1 electrode capability, bond-currents, a generalized interface for user-defined tight-binding transport, transmission projection using eigenstates of a projected Hamiltonian, and fast inversion algorithms for large-scale simulations easily exceeding 10^6 atoms on workstation computers. The new features of both codes are demonstrated and benchmarked for relevant test systems.
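Post-processing codes of this kind ultimately evaluate the standard NEGF transmission formula T(E) = Tr[Γ_L G^r Γ_R G^a]. The sketch below evaluates that formula for a toy two-site chain with crude wide-band self-energies; it illustrates the formula, not the TBTRANS implementation.

```python
# Minimal numerical sketch of the NEGF transmission formula
# T(E) = Tr[Gamma_L G^r Gamma_R G^a]; Hamiltonian and self-energies are toys.
import numpy as np

def transmission(E, H, sigma_L, sigma_R, eta=1e-6):
    G = np.linalg.inv((E + 1j * eta) * np.eye(len(H)) - H - sigma_L - sigma_R)
    gamma_L = 1j * (sigma_L - sigma_L.conj().T)
    gamma_R = 1j * (sigma_R - sigma_R.conj().T)
    return np.trace(gamma_L @ G @ gamma_R @ G.conj().T).real

H = np.diag([0.0, 0.0]) + np.diag([-1.0], 1) + np.diag([-1.0], -1)  # 2-site chain
sigma_L = np.diag([-0.05j, 0.0])    # crude wide-band-limit self-energies
sigma_R = np.diag([0.0, -0.05j])
print(transmission(0.0, H, sigma_L, sigma_R))
```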
Clinical results of HIS, RIS, PACS integration using data integration CASE tools
NASA Astrophysics Data System (ADS)
Taira, Ricky K.; Chan, Hing-Ming; Breant, Claudine M.; Huang, Lu J.; Valentino, Daniel J.
1995-05-01
Current infrastructure research in PACS is dominated by the development of communication networks (local area networks, teleradiology, ATM networks, etc.), multimedia display workstations, and hierarchical image storage architectures. However, limited work has been performed on developing flexible, expansible, and intelligent information processing architectures for the vast decentralized image and text data repositories prevalent in healthcare environments. Patient information is often distributed among multiple data management systems. Current large-scale efforts to integrate medical information and knowledge sources have been costly, with limited retrieval functionality. Software integration strategies to unify distributed data and knowledge sources are still lacking commercially. Systems heterogeneity (i.e., differences in hardware platforms, communication protocols, database management software, nomenclature, etc.) is at the heart of the problem and is unlikely to be standardized in the near future. In this paper, we demonstrate the use of newly available CASE (computer-aided software engineering) tools to rapidly integrate HIS, RIS, and PACS information systems. The advantages of these tools include fast development time (low-level code is generated from graphical specifications) and easy system maintenance (excellent documentation, easy to perform changes, and a centralized code repository in an object-oriented database). The CASE tools are used to develop and manage the 'middleware' in our client-mediator-server architecture for systems integration. Our architecture is scalable and can accommodate heterogeneous database and communication protocols.
NASA Astrophysics Data System (ADS)
Takamatsu, Kuniyoshi; Nakagawa, Shigeaki; Takeda, Tetsuaki
Safety demonstration tests using the High Temperature Engineering Test Reactor (HTTR) are in progress to verify its inherent safety features and improve the safety technology and design methodology for High-temperature Gas-cooled Reactors (HTGRs). The reactivity insertion test is one of the safety demonstration tests for the HTTR. This test simulates the rapid increase in the reactor power by withdrawing the control rod without operating the reactor power control system. In addition, loss of coolant flow tests have been conducted to simulate the rapid decrease in the reactor power by tripping one, two or all three gas circulators. The experimental results have revealed the inherent safety features of HTGRs, such as the negative reactivity feedback effect. The numerical analysis code, named ACCORD, was developed to analyze the reactor dynamics including the flow behavior in the HTTR core. We have modified this code to use a model with four parallel channels and twenty temperature coefficients. Furthermore, we added another analytical model of the core for calculating the heat conduction between the fuel channels and the core in the case of the loss of coolant flow tests. This paper describes the validation results for the newly developed code using the experimental results. Moreover, the effect of the model is formulated quantitatively with our proposed equation. Finally, the pre-analytical result of the loss of coolant flow test by tripping all gas circulators is also discussed.
Optimization of computations for adjoint field and Jacobian needed in 3D CSEM inversion
NASA Astrophysics Data System (ADS)
Dehiya, Rahul; Singh, Arun; Gupta, Pravin K.; Israil, M.
2017-01-01
We present the features and results of a newly developed code, based on the Gauss-Newton optimization technique, for solving the three-dimensional Controlled-Source Electromagnetic inverse problem. In this code a special emphasis has been put on representing the operations by block matrices for conjugate gradient iteration. We show how, in the computation of the Jacobian, the matrix formed by differentiation of the system matrix can be made independent of frequency to optimize the operations at the conjugate gradient step. Coarse-level parallel computing, using the OpenMP framework, is used primarily due to its simplicity in implementation and the accessibility of shared-memory multi-core computing machines to almost anyone. We demonstrate how the coarseness of the modeling grid in comparison to the source (computational receivers) spacing can be exploited for efficient computing, without compromising the quality of the inverted model, by reducing the number of adjoint calls. It is also demonstrated that the adjoint field can even be computed on a grid coarser than the modeling grid without affecting the inversion outcome. These observations were reconfirmed using an experiment design where the deviation of the source from a straight tow line is considered. Finally, a real field data inversion experiment is presented to demonstrate the robustness of the code.
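The core operation described above, a Gauss-Newton model update solved with conjugate gradients, can be sketched as follows; in the real code the Jacobian is never formed densely but applied through forward and adjoint calls on block matrices, so the dense toy matrices here are only for illustration.

```python
# Toy sketch of one damped Gauss-Newton update solved by conjugate gradients;
# J and the residual are random stand-ins for the CSEM Jacobian and data misfit.
import numpy as np
from scipy.sparse.linalg import cg, LinearOperator

def gauss_newton_step(J, residual, lam=1e-2):
    n = J.shape[1]
    normal_op = LinearOperator(
        (n, n), matvec=lambda v: J.T @ (J @ v) + lam * v)   # (J^T J + lam I) v
    rhs = J.T @ residual
    dm, info = cg(normal_op, rhs, maxiter=200)
    return dm                                               # model update

J = np.random.rand(50, 20)          # stand-in Jacobian (data x model)
residual = np.random.rand(50)       # observed minus predicted data
print(gauss_newton_step(J, residual)[:5])
```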
A Hybrid Optimization Framework with POD-based Order Reduction and Design-Space Evolution Scheme
NASA Astrophysics Data System (ADS)
Ghoman, Satyajit S.
The main objective of this research is to develop an innovative multi-fidelity multi-disciplinary design, analysis and optimization suite that integrates certain solution generation codes and newly developed innovative tools to improve the overall optimization process. The research performed herein is divided into two parts: (1) the development of an MDAO framework by integration of variable fidelity physics-based computational codes, and (2) enhancements to such a framework by incorporating innovative features extending its robustness. The first part of this dissertation describes the development of a conceptual Multi-Fidelity Multi-Strategy and Multi-Disciplinary Design Optimization Environment (M3 DOE), in the context of aircraft wing optimization. M3 DOE provides the user a capability to optimize configurations with a choice of (i) the level of fidelity desired, (ii) the use of a single-step or multi-step optimization strategy, and (iii) a combination of a series of structural and aerodynamic analyses. The modularity of M3 DOE allows it to be a part of other inclusive optimization frameworks. M3 DOE is demonstrated within the context of shape and sizing optimization of the wing of a Generic Business Jet aircraft. Two different optimization objectives, viz. dry weight minimization and cruise range maximization, are studied by conducting one low-fidelity and two high-fidelity optimization runs to demonstrate the application scope of M3 DOE. The second part of this dissertation describes the development of an innovative hybrid optimization framework that extends the robustness of M3 DOE by employing a proper orthogonal decomposition-based design-space order reduction scheme combined with the evolutionary algorithm technique. The POD method of extracting dominant modes from an ensemble of candidate configurations is used for the design-space order reduction. The snapshot of the candidate population is updated iteratively using the evolutionary algorithm technique of fitness-driven retention. This strategy capitalizes on the advantages of evolutionary algorithms as well as POD-based reduced order modeling, while overcoming the shortcomings inherent with these techniques. When linked with M3 DOE, this strategy offers a computationally efficient methodology for problems with a high level of complexity and a challenging design-space. This newly developed framework is demonstrated for its robustness on a nonconventional supersonic tailless air vehicle wing shape optimization problem.
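A minimal sketch of the POD step described above: the snapshot matrix of candidate designs is decomposed by SVD and only the dominant modes are retained. The matrix sizes and the 99% energy threshold are arbitrary assumptions, not values from the dissertation.

```python
# Illustrative POD reduction of a snapshot ensemble of candidate designs:
# decompose by SVD, keep the modes that capture most of the variance, and
# represent each candidate by a small number of modal coefficients.
import numpy as np

snapshots = np.random.rand(500, 40)       # 40 candidate designs, 500 DOFs each
mean = snapshots.mean(axis=1, keepdims=True)
U, s, Vt = np.linalg.svd(snapshots - mean, full_matrices=False)

energy = np.cumsum(s**2) / np.sum(s**2)
n_modes = int(np.searchsorted(energy, 0.99)) + 1   # keep 99% of the variance
basis = U[:, :n_modes]                             # reduced design-space basis
print(f"retained {n_modes} POD modes out of {len(s)}")

# any candidate is now described by n_modes coefficients in the reduced basis
coeffs = basis.T @ (snapshots[:, 0:1] - mean)
```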
NASA Astrophysics Data System (ADS)
Barranco, Joseph
2006-03-01
We have developed a three-dimensional (3D) spectral hydrodynamic code to study vortex dynamics in rotating, shearing, stratified systems (e.g., the atmosphere of gas giant planets, protoplanetary disks around newly forming protostars). The time-independent background state is stably stratified in the vertical direction and has a unidirectional linear shear flow aligned with one horizontal axis. Superposed on this background state is an unsteady, subsonic flow that is evolved with the Euler equations subject to the anelastic approximation to filter acoustic phenomena. A Fourier-Fourier basis in a set of quasi-Lagrangian coordinates that advect with the background shear is used for spectral expansions in the two horizontal directions. For the vertical direction, two different sets of basis functions have been implemented: (1) Chebyshev polynomials on a truncated, finite domain, and (2) rational Chebyshev functions on an infinite domain. Use of this latter set is equivalent to transforming the infinite domain to a finite one with a cotangent mapping, and using cosine and sine expansions in the mapped coordinate. The nonlinear advection terms are time integrated explicitly, whereas the Coriolis force, buoyancy terms, and pressure/enthalpy gradient are integrated semi-implicitly. We show that internal gravity waves can be damped by adding new terms to the Euler equations. The code exhibits excellent parallel performance with the Message Passing Interface (MPI). As a demonstration of the code, we simulate vortex dynamics in protoplanetary disks and the Kelvin-Helmholtz instability in the dusty midplanes of protoplanetary disks.
A 3D spectral anelastic hydrodynamic code for shearing, stratified flows
NASA Astrophysics Data System (ADS)
Barranco, Joseph A.; Marcus, Philip S.
2006-11-01
We have developed a three-dimensional (3D) spectral hydrodynamic code to study vortex dynamics in rotating, shearing, stratified systems (e.g., the atmosphere of gas giant planets, protoplanetary disks around newly forming protostars). The time-independent background state is stably stratified in the vertical direction and has a unidirectional linear shear flow aligned with one horizontal axis. Superposed on this background state is an unsteady, subsonic flow that is evolved with the Euler equations subject to the anelastic approximation to filter acoustic phenomena. A Fourier-Fourier basis in a set of quasi-Lagrangian coordinates that advect with the background shear is used for spectral expansions in the two horizontal directions. For the vertical direction, two different sets of basis functions have been implemented: (1) Chebyshev polynomials on a truncated, finite domain, and (2) rational Chebyshev functions on an infinite domain. Use of this latter set is equivalent to transforming the infinite domain to a finite one with a cotangent mapping, and using cosine and sine expansions in the mapped coordinate. The nonlinear advection terms are time-integrated explicitly, the pressure/enthalpy terms are integrated semi-implicitly, and the Coriolis force and buoyancy terms are treated semi-analytically. We show that internal gravity waves can be damped by adding new terms to the Euler equations. The code exhibits excellent parallel performance with the message passing interface (MPI). As a demonstration of the code, we simulate the merger of two 3D vortices in the midplane of a protoplanetary disk.
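The cotangent mapping mentioned above can be sketched in a few lines: the infinite vertical coordinate is mapped onto a finite angular interval, on which cosine and sine expansions apply. The mapping scale L below is a free parameter chosen for illustration.

```python
# Small sketch of the cotangent mapping: the infinite vertical coordinate z
# is mapped to a finite angle t, so rational Chebyshev functions in z become
# cosine/sine series in t. L sets the stretching of the map.
import numpy as np

L = 1.0                                    # mapping length scale (arbitrary)
t = np.linspace(1e-3, np.pi - 1e-3, 9)     # mapped coordinate in (0, pi)
z = L / np.tan(t)                          # z = L * cot(t) covers (-inf, +inf)

# a smooth function of z evaluated on the mapped grid
f = np.exp(-z**2)
for ti, zi, fi in zip(t, z, f):
    print(f"t = {ti:5.3f}  ->  z = {zi:8.3f}   f(z) = {fi:.3e}")
```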
42 CFR 414.906 - Competitive acquisition program as the basis for payment.
Code of Federal Regulations, 2010 CFR
2010-10-01
... section, payment for CAP drugs is based on bids submitted as a result of the bidding process as described... established. (d) Adjustments. There is an established process for adjustments to payments to account for drugs... of— (A) One or more newly issued HCPCS codes; or (B) One of the following single indication orphan...
Identification and validation of Asteraceae miRNAs by the expressed sequence tag analysis.
Monavar Feshani, Aboozar; Mohammadi, Saeed; Frazier, Taylor P; Abbasi, Abbas; Abedini, Raha; Karimi Farsad, Laleh; Ehya, Farveh; Salekdeh, Ghasem Hosseini; Mardi, Mohsen
2012-02-10
MicroRNAs (miRNAs) are small non-coding RNA molecules that play a vital role in the regulation of gene expression. Despite their identification in hundreds of plant species, few miRNAs have been identified in the Asteraceae, a large family that comprises approximately one tenth of all flowering plants. In this study, we used expressed sequence tag (EST) analysis to identify potential conserved miRNAs and their putative target genes in the Asteraceae. We applied quantitative real-time PCR (qRT-PCR) to confirm the expression of eight potential miRNAs in Carthamus tinctorius and Helianthus annuus. We also performed qRT-PCR analysis to investigate the differential expression pattern of five newly identified miRNAs during five different cotyledon growth stages in safflower. Using these methods, we successfully identified and characterized 151 potentially conserved miRNAs, belonging to 26 miRNA families, in 11 genera of the Asteraceae. EST analysis predicted that the newly identified conserved Asteraceae miRNAs target 130 total protein-coding ESTs in sunflower and safflower, as well as 433 additional target genes in other plant species. We experimentally confirmed the existence of seven predicted miRNAs (miR156, miR159, miR160, miR162, miR166, miR396, and miR398) in safflower and sunflower seedlings. We also observed that five out of eight miRNAs are differentially expressed during cotyledon development. Our results indicate that miRNAs may be involved in the regulation of gene expression during seed germination and the formation of the cotyledons in the Asteraceae. The findings of this study might ultimately help in the understanding of miRNA-mediated gene regulation in important crop species. Copyright © 2011 Elsevier B.V. All rights reserved.
[Changes for rheumatology in the G-DRG system 2005].
Fiori, W; Roeder, N; Lakomek, H-J; Liman, W; Köneke, N; Hülsemann, J L; Lehmann, H; Wenke, A
2005-02-01
The German prospective payment system G-DRG has recently been adapted and recalculated. Apart from the adjustments to the G-DRG classification system itself, changes in the legal framework, such as the extension of the "convergence period" or the limitation of budget losses due to DRG introduction, have to be considered. Especially the introduction of new procedure codes (OPS) describing the specialized and complex rheumatologic treatment of inpatients might be of significant importance. Even though these procedures will not yet influence the grouping process in 2005, they will enable a more accurate description of the efforts of acute-rheumatologic treatment, which can be used for further adaptations of the DRG algorithm. Numerous newly introduced additive payment components (ZE) result in a more adequate description of the "DRG products". Although not increasing the individual hospital budget, these additive payments contribute to more transparency of high-cost services and can be addressed separately from the DRG budget. Furthermore, a number of other relevant changes to the G-DRG catalogue, the classification systems ICD-10-GM and OPS-301, and the German Coding Standards (DKR) are presented.
NASA Technical Reports Server (NTRS)
Kondoz, A. M.; Evans, B. G.
1993-01-01
In the last decade, low bit rate speech coding research has received much attention, resulting in newly developed, good-quality speech coders operating at rates as low as 4.8 Kb/s. Although speech quality at around 8 Kb/s is acceptable for a wide variety of applications, at 4.8 Kb/s further improvements in quality are necessary to make it acceptable to the majority of applications and users. In addition to the required low bit rate with acceptable speech quality, other facilities such as integrated digital echo cancellation and voice activity detection are now becoming necessary to provide a cost-effective and compact solution. In this paper we describe a CELP speech coder with an integrated echo canceller and a voice activity detector, all of which have been implemented on a single DSP32C with 32 KBytes of SRAM. The quality of CELP-coded speech has been improved significantly by a new codebook implementation which also simplifies the encoder/decoder complexity, making room for the integration of a 64-tap echo canceller together with a voice activity detector.
Visser, Marco D.; McMahon, Sean M.; Merow, Cory; Dixon, Philip M.; Record, Sydne; Jongejans, Eelke
2015-01-01
Computation has become a critical component of research in biology. A risk has emerged that computational and programming challenges may limit research scope, depth, and quality. We review various solutions to common computational efficiency problems in ecological and evolutionary research. Our review pulls together material that is currently scattered across many sources and emphasizes those techniques that are especially effective for typical ecological and environmental problems. We demonstrate how straightforward it can be to write efficient code and implement techniques such as profiling or parallel computing. We supply a newly developed R package (aprof) that helps to identify computational bottlenecks in R code and determine whether optimization can be effective. Our review is complemented by a practical set of examples and detailed Supporting Information material (S1–S3 Texts) that demonstrate large improvements in computational speed (ranging from 10.5 times to 14,000 times faster). By improving computational efficiency, biologists can feasibly solve more complex tasks, ask more ambitious questions, and include more sophisticated analyses in their research. PMID:25811842
Dislocation mechanisms in stressed crystals with surface effects
NASA Astrophysics Data System (ADS)
Wu, Chi-Chin; Crone, Joshua; Munday, Lynn; Discrete Dislocation Dynamics Team
2014-03-01
Understanding dislocation properties in stressed crystals is the key to important processes in materials science, including the strengthening of metals and the stress relaxation during the growth of hetero-epitaxial structures. Despite existing experimental approaches and theories, many dislocation mechanisms with surface effects still remain elusive in experiments. Even though discrete dislocation dynamics (DDD) simulations are commonly employed to study dislocations, few demonstrate sufficient computational capabilities for massive numbers of dislocations with the combined effects of surfaces and stresses. Utilizing the Army's newly developed FED3 code, a DDD computation code coupled with finite elements, this work presents several dislocation mechanisms near different types of surfaces in finite domains. Our simulation models include dislocations in a bent metallic cantilever beam, near voids in stressed metals, as well as threading and misfit dislocations in as-grown semiconductor epitaxial layers and their quantitative inter-correlations with stress relaxation and surface instability. Our studies provide not only detailed physics of individual dislocation mechanisms, but also important collective dislocation properties such as dislocation densities and stress-strain profiles and their interactions with surfaces.
CFD analysis on control of secondary losses in STME LOX turbines with endwall fences
NASA Technical Reports Server (NTRS)
Chyu, Mingking K.
1992-01-01
The rotor blade in the newly designed LOX turbine for the future Space Transportation Main Engine (STME) has a severe flow turning angle, nearly 160 degrees. The estimated secondary loss in the rotor alone accounts for nearly 50 percent of the total loss over the entire stage. To reduce such a loss, one potential method is to use fences attached to the turbine endwall (hub). As a prelude to examining the effects of endwall fences with the actual STME turbine configuration, the present study focuses on similar issues with a different, but more generic, geometry: a rectangular duct with a 160-degree bend. The duct cross-section has a 2-to-1 aspect ratio and the radii of curvature for the inner and outer walls are 0.25 and 1.25 times the duct width, respectively. The present emphasis lies in examining the effects of various fence lengths extending along the streamwise direction. The flowfield is numerically simulated using the FDNS code developed earlier by Wang and Chen. The FDNS code is a pressure-based, finite-difference, Navier-Stokes equations solver.
Simulation of Rotary-Wing Near-Wake Vortex Structures Using Navier-Stokes CFD Methods
NASA Technical Reports Server (NTRS)
Kenwright, David; Strawn, Roger; Ahmad, Jasim; Duque, Earl; Warmbrodt, William (Technical Monitor)
1997-01-01
This paper will use high-resolution Navier-Stokes computational fluid dynamics (CFD) simulations to model the near-wake vortex roll-up behind rotor blades. The locations and strengths of the trailing vortices will be determined from newly-developed visualization and analysis software tools applied to the CFD solutions. Computational results for rotor nearwake vortices will be used to study the near-wake vortex roll up for highly-twisted tiltrotor blades. These rotor blades typically have combinations of positive and negative spanwise loading and complex vortex wake interactions. Results of the computational studies will be compared to vortex-lattice wake models that are frequently used in rotorcraft comprehensive codes. Information from these comparisons will be used to improve the rotor wake models in the Tilt-Rotor Acoustic Code (TRAC) portion of NASA's Short Haul Civil Transport program (SHCT). Accurate modeling of the rotor wake is an important part of this program and crucial to the successful design of future civil tiltrotor aircraft. The rotor wake system plays an important role in blade-vortex interaction noise, a major problem for all rotorcraft including tiltrotors.
The effect of pressure anisotropy on ballooning modes in tokamak plasmas
NASA Astrophysics Data System (ADS)
Johnston, A.; Hole, M. J.; Qu, Z. S.; Hezaveh, H.
2018-06-01
Edge Localised Modes are thought to be caused by a spectrum of magnetohydrodynamic instabilities, including the ballooning mode. While ballooning modes have been studied extensively both theoretically and experimentally, the focus of the vast majority of this research has been on isotropic plasmas. The prevalence of pressure anisotropy in modern tokamaks thus motivates further study of these modes. This paper presents a numerical analysis of ballooning modes in anisotropic equilibria. The investigation was conducted using the newly-developed codes HELENA+ATF and MISHKA-A, which add anisotropic physics to equilibrium and stability analysis. We have examined the impact of anisotropy on the stability of an n = 30 ballooning mode, confirming that results conform to previous calculations in the isotropic limit. Growth rates of ballooning modes in equilibria with different levels of anisotropy were then calculated using the stability code MISHKA-A. The key finding was that the level of anisotropy had a significant impact on ballooning mode growth rates. For T⊥ > T∥, typical of ICRH heating, the growth rate increases, while for T⊥ < T∥, typical of neutral beam heating, the growth rate decreases.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kuba, J; Slaughter, D R; Fittinghoff, D N
We present a detailed comparison of the measured characteristics of Thomson backscattered x-rays produced at the PLEIADES (Picosecond Laser-Electron Interaction for the Dynamic Evaluation of Structures) facility at Lawrence Livermore National Laboratory to predicted results from a newly developed, fully three-dimensional time and frequency-domain code. Based on the relativistic differential cross section, this code has the capability to calculate time and space dependent spectra of the x-ray photons produced from linear Thomson scattering for both bandwidth-limited and chirped incident laser pulses. Spectral broadening of the scattered x-ray pulse resulting from the incident laser bandwidth, perpendicular wave vector components in the laser focus, and the transverse and longitudinal phase space of the electron beam are included. Electron beam energy, energy spread, and transverse phase space measurements of the electron beam at the interaction point are presented, and the corresponding predicted x-ray characteristics are determined. In addition, time-integrated measurements of the x-rays produced from the interaction are presented, and shown to agree well with the simulations.
Collapse of magnetized hypermassive neutron stars in general relativity.
Duez, Matthew D; Liu, Yuk Tung; Shapiro, Stuart L; Shibata, Masaru; Stephens, Branson C
2006-01-27
Hypermassive neutron stars (HMNSs)--equilibrium configurations supported against collapse by rapid differential rotation--are possible transient remnants of binary neutron-star mergers. Using newly developed codes for magnetohydrodynamic simulations in dynamical spacetimes, we are able to track the evolution of a magnetized HMNS in full general relativity for the first time. We find that secular angular momentum transport due to magnetic braking and the magnetorotational instability results in the collapse of an HMNS to a rotating black hole, accompanied by a gravitational wave burst. The nascent black hole is surrounded by a hot, massive torus undergoing quasistationary accretion and a collimated magnetic field. This scenario suggests that HMNS collapse is a possible candidate for the central engine of short gamma-ray bursts.
A comprehensive experimental characterization of the iPIX gamma imager
NASA Astrophysics Data System (ADS)
Amgarou, K.; Paradiso, V.; Patoz, A.; Bonnet, F.; Handley, J.; Couturier, P.; Becker, F.; Menaa, N.
2016-08-01
The results of more than 280 different experiments aimed at exploring the main features and performances of a newly developed gamma imager, called iPIX, are summarized in this paper. iPIX is designed to quickly localize radioactive sources while estimating the ambient dose equivalent rate at the measurement point. It integrates a 1 mm thick CdTe detector directly bump-bonded to a Timepix chip, a tungsten coded-aperture mask, and a mini RGB camera. It also represents a major technological breakthrough in terms of lightness, compactness, usability, response sensitivity, and angular resolution. As an example of its key strengths, an 241Am source with a dose rate of only few nSv/h can be localized in less than one minute.
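Coded-aperture imagers of this general type are commonly decoded by correlating the recorded shadowgram with a decoding array derived from the mask pattern. The sketch below illustrates that generic technique with a random stand-in mask; it is not the iPIX reconstruction algorithm, and all sizes are arbitrary.

```python
# Generic coded-aperture decoding sketch: the shadowgram is circularly
# cross-correlated with a +-1 decoding array derived from the mask pattern;
# for a well-behaved mask the reconstruction peaks at the source pixel.
import numpy as np

rng = np.random.default_rng(0)
n = 16
mask = rng.integers(0, 2, size=(n, n)).astype(float)   # stand-in aperture pattern
decoder = 2 * mask - 1                                  # balanced decoding array

source = np.zeros((n, n)); source[3, 7] = 1.0           # single point source

# forward model: detector shadowgram = circular convolution of source and mask
shadow = np.real(np.fft.ifft2(np.fft.fft2(source) * np.fft.fft2(mask)))

# decoding: circular cross-correlation of the shadowgram with the decoder
recon = np.real(np.fft.ifft2(np.fft.fft2(shadow) * np.conj(np.fft.fft2(decoder))))
print("reconstructed peak at", np.unravel_index(recon.argmax(), recon.shape))
```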
Lee, Bumshik; Kim, Munchurl
2016-08-01
In this paper, a low-complexity coding unit (CU)-level rate and distortion estimation scheme is proposed for hardware-friendly implementation of High Efficiency Video Coding (HEVC), where a Walsh-Hadamard transform (WHT)-based low-complexity integer discrete cosine transform (DCT) is employed for distortion estimation. Since HEVC adopts quadtree structures of coding blocks with hierarchical coding depths, it becomes more difficult to estimate accurate rate and distortion values without actually performing transform, quantization, inverse transform, de-quantization, and entropy coding. Furthermore, DCT for rate-distortion optimization (RDO) is computationally expensive, because it requires a number of multiplication and addition operations for the various transform block sizes of order 4, 8, 16, and 32, and requires recursive computations to decide the optimal depths of a CU or transform unit. Therefore, full RDO-based encoding is highly complex, especially for low-power implementation of HEVC encoders. In this paper, a rate and distortion estimation scheme is proposed at CU level based on a low-complexity integer DCT that can be computed in terms of the WHT, whose coefficients are produced in the prediction stages. For rate and distortion estimation at CU level, two newly designed orthogonal matrices of order 4×4 and 8×8 are applied to the WHT in a butterfly structure with only addition and shift operations. By applying the integer DCT based on the WHT and the newly designed transforms in each CU block, the texture rate can be precisely estimated after quantization using the number of non-zero quantized coefficients, and the distortion can also be precisely estimated in the transform domain without requiring de-quantization and inverse transform. In addition, a non-texture rate estimation is proposed by using a pseudo-entropy code to obtain accurate total rate estimates. The proposed rate and distortion estimation scheme can effectively be used for hardware-friendly implementation of HEVC encoders with a 9.8% loss over HEVC full RDO, which is much less than the 20.3% and 30.2% losses of a conventional approach and a Hadamard-only scheme, respectively.
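A minimal example of the kind of multiply-free butterfly transform the scheme builds on is a plain 4×4 Walsh-Hadamard transform computed with only additions and a final shift; the paper's specially designed 4×4 and 8×8 orthogonal matrices are not reproduced here, and the normalization below is only illustrative.

```python
# Multiply-free 4x4 Walsh-Hadamard transform using add/subtract butterflies
# and a coarse normalization by shift; a sketch of the general technique,
# not the matrices designed in the paper.
import numpy as np

def wht4(block):
    """Separable 4x4 WHT via a two-stage butterfly applied to columns, then rows."""
    def stage(v):
        a, b, c, d = v
        s0, s1 = a + d, b + c          # first butterfly (sums)
        d0, d1 = a - d, b - c          # first butterfly (differences)
        return np.array([s0 + s1, d0 + d1, s0 - s1, d0 - d1])
    tmp = np.apply_along_axis(stage, 0, block)    # transform columns
    out = np.apply_along_axis(stage, 1, tmp)      # transform rows
    return out >> 2 if out.dtype.kind == 'i' else out / 4   # illustrative scaling

residual = np.arange(16, dtype=np.int64).reshape(4, 4)      # toy prediction residual
print(wht4(residual))
```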
Development of a set of SNP markers present in expressed genes of the apple.
Chagné, David; Gasic, Ksenija; Crowhurst, Ross N; Han, Yuepeng; Bassett, Heather C; Bowatte, Deepa R; Lawrence, Timothy J; Rikkerink, Erik H A; Gardiner, Susan E; Korban, Schuyler S
2008-11-01
Molecular markers associated with gene coding regions are useful tools for bridging functional and structural genomics. Due to their high abundance in plant genomes, single nucleotide polymorphisms (SNPs) are present within virtually all genomic regions, including most coding sequences. The objective of this study was to develop a set of SNPs for the apple by taking advantage of the wealth of genomics resources available for the apple, including a large collection of expressed sequence tags (ESTs). Using bioinformatics tools, a search for SNPs within an EST database of approximately 350,000 sequences developed from a variety of apple accessions was conducted. This resulted in the identification of a total of 71,482 putative SNPs. As the apple genome is reported to be an ancient polyploid, attempts were made to verify whether those SNPs detected in silico were attributable either to allelic polymorphisms or to gene duplication or paralogous or homeologous sequence variations. To this end, a set of 464 PCR primer pairs was designed, PCR amplification was performed using two subsets of plants, and the PCR products were sequenced. The SNPs retrieved from these sequences were then mapped onto apple genetic maps, including a newly constructed map of a Royal Gala x A689-24 cross and a Malling 9 x Robusta 5 map, using a bin mapping strategy. The SNP genotyping was performed using the high-resolution melting (HRM) technique. A total of 93 new markers containing 210 coding SNPs were successfully mapped. This new set of SNP markers for the apple offers new opportunities for understanding the genetic control of important horticultural traits using quantitative trait loci (QTL) or linkage disequilibrium analysis. These also serve as useful markers for aligning physical and genetic maps, and as potential transferable markers across the Rosaceae family.
The Nuremberg Code-A critique.
Ghooi, Ravindra B
2011-04-01
The Nuremberg Code, drafted at the end of the Doctors' Trial in Nuremberg in 1947, has been hailed as a landmark document in medical and research ethics. Close examination of this code reveals that it was based on the Guidelines for Human Experimentation of 1931. The resemblance between these documents is uncanny. It is unfortunate that the authors of the Nuremberg Code passed it off as their original work. There is evidence that the defendants at the trial did request that their actions be judged on the basis of the 1931 Guidelines, in force in Germany. The prosecutors, however, ignored the request and tried the defendants for crimes against humanity, and the judges included the Nuremberg Code as a part of the judgment. Six of the ten principles in the Nuremberg Code are derived from the 1931 Guidelines, and two of four newly inserted principles are open to misinterpretation. There is little doubt that the Code was prepared after studying the Guidelines, but no reference was made to the Guidelines, for reasons that are not known. Using the Guidelines as a base document without giving due credit is plagiarism; as per our understanding of ethics today, this would be considered unethical. The Nuremberg Code has fallen by the wayside since, unlike the Declaration of Helsinki, it is not regularly reviewed and updated. The regular updating of some ethics codes is evidence of the evolving nature of human ethics.
Study on an azimuthal line cusp ion source for the KSTAR neutral beam injector.
Jeong, Seung Ho; Chang, Doo-Hee; In, Sang Ryul; Lee, Kwang Won; Oh, Byung-Hoon; Yoon, Byung-Joo; Song, Woo Sob; Kim, Jinchoon; Kim, Tae Seong
2008-02-01
In this study, it is found that the cusp magnetic field configuration of an anode bucket influences the primary electron behavior. An electron orbit code (ELEORBIT) showed that an azimuthal line cusp (cusp lines running azimuthally with respect to the beam extraction direction) provides a longer primary electron confinement time than an axial line cusp configuration. Experimentally, higher plasma densities were obtained at the same arc power when the azimuthal cusp chamber was used. The newly designed azimuthal cusp bucket has been investigated in an effort to increase the plasma density in the plasma generator per unit arc power.
Implementation of non-axisymmetric mesh system in the gyrokinetic PIC code (XGC) for Stellarators
NASA Astrophysics Data System (ADS)
Moritaka, Toseo; Hager, Robert; Cole, Micheal; Chang, Choong-Seock; Lazerson, Samuel; Ku, Seung-Hoe; Ishiguro, Seiji
2017-10-01
Gyrokinetic simulation is a powerful tool to investigate turbulent and neoclassical transport based on the first principles of plasma kinetics. The gyrokinetic PIC code XGC has been developed for integrated simulations that cover the entire region of Tokamaks. Complicated field line and boundary structures should be taken into account to demonstrate edge plasma dynamics under the influence of the X-point and vessel components. XGC employs a gyrokinetic Poisson solver on an unstructured triangle mesh to deal with this difficulty. We introduce numerical schemes newly developed for XGC simulation in non-axisymmetric Stellarator geometry. Triangle meshes in each poloidal plane are defined by the PEST poloidal angle in the VMEC equilibrium so that they have the same regular structure in the straight-field-line coordinate. The electric charge of each marker particle is distributed to the triangles specified by the field-following projection onto the neighboring poloidal planes. 3D spline interpolation in a cylindrical mesh is also used to obtain the equilibrium magnetic field at the particle position. These schemes capture the anisotropic plasma dynamics and the resulting potential structure with high accuracy. The triangle meshes can smoothly connect to unstructured meshes in the edge region. We will present validation tests in the core region of the Large Helical Device and discuss future challenges toward edge simulations.
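The field-following deposition described above can be pictured with a minimal sketch: a marker particle's charge is split between the two poloidal planes that bracket its toroidal angle, with linear weights. In XGC the projection follows the magnetic field and the charge then lands on triangles of the unstructured mesh; the uniform plane spacing and purely toroidal weighting below are simplifying assumptions for illustration only.

```python
import numpy as np

def split_charge_between_planes(phi, n_planes, charge):
    """Split a marker particle's charge between the two poloidal planes
    bracketing its toroidal angle phi (periodic in 2*pi), with linear weights.
    Assumes uniformly spaced planes; triangle lookup on each plane is omitted."""
    phi = phi % (2.0 * np.pi)
    dphi = 2.0 * np.pi / n_planes          # assumed uniform plane spacing
    i = int(phi // dphi) % n_planes        # plane "behind" the particle
    j = (i + 1) % n_planes                 # plane "ahead" of the particle
    w_ahead = (phi - i * dphi) / dphi      # fraction of the gap traversed
    return i, j, charge * (1.0 - w_ahead), charge * w_ahead

# A particle at phi = 0.3 rad with 8 planes deposits ~62% behind, ~38% ahead.
print(split_charge_between_planes(0.3, n_planes=8, charge=1.0))
```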
PENTACLE: Parallelized particle-particle particle-tree code for planet formation
NASA Astrophysics Data System (ADS)
Iwasawa, Masaki; Oshino, Shoichi; Fujii, Michiko S.; Hori, Yasunori
2017-10-01
We have newly developed a parallelized particle-particle particle-tree code for planet formation, PENTACLE, which is a parallelized hybrid N-body integrator executed on a CPU-based (super)computer. PENTACLE uses a fourth-order Hermite algorithm to calculate gravitational interactions between particles within a cut-off radius and a Barnes-Hut tree method for gravity from particles beyond. It also implements an open-source library designed for full automatic parallelization of particle simulations, FDPS (Framework for Developing Particle Simulator), to parallelize a Barnes-Hut tree algorithm for a memory-distributed supercomputer. These allow us to handle 1-10 million particles in a high-resolution N-body simulation on CPU clusters for collisional dynamics, including physical collisions in a planetesimal disc. In this paper, we show the performance and the accuracy of PENTACLE in terms of the cut-off radius R̃_cut and the time-step Δt. It turns out that the accuracy of a hybrid N-body simulation is controlled through Δt/R̃_cut, and Δt/R̃_cut ≈ 0.1 is necessary to accurately simulate the accretion process of a planet for ≥ 10^6 yr. For all those interested in large-scale particle simulations, PENTACLE, customized for planet formation, will be freely available from https://github.com/PENTACLE-Team/PENTACLE under the MIT licence.
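The hybrid split at the heart of a particle-particle particle-tree (P3T) scheme can be sketched as simple bookkeeping: pairs closer than the cut-off radius are integrated directly (with the fourth-order Hermite scheme in PENTACLE), while more distant interactions go to the tree. The sketch below only classifies pairs; it does not implement the Hermite integrator or the Barnes-Hut tree, and the cut-off value is arbitrary.

```python
import numpy as np

def split_interactions(pos, r_cut):
    """Classify particle pairs as 'near' (direct summation, high-order
    integration) or 'far' (tree/long-range force) based on a cut-off radius."""
    n = len(pos)
    near, far = [], []
    for i in range(n):
        for j in range(i + 1, n):
            r = np.linalg.norm(pos[i] - pos[j])
            (near if r < r_cut else far).append((i, j))
    return near, far

pos = np.array([[0.0, 0.0, 0.0], [0.05, 0.0, 0.0], [1.0, 0.0, 0.0]])
near, far = split_interactions(pos, r_cut=0.1)
print(near)  # [(0, 1)]          -> handled by the Hermite part
print(far)   # [(0, 2), (1, 2)]  -> handled by the tree part
```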
Second Generation Integrated Composite Analyzer (ICAN) Computer Code
NASA Technical Reports Server (NTRS)
Murthy, Pappu L. N.; Ginty, Carol A.; Sanfeliz, Jose G.
1993-01-01
This manual updates the original 1986 NASA TP-2515, Integrated Composite Analyzer (ICAN) Users and Programmers Manual. The various enhancements and newly added features are described to enable the user to prepare the appropriate input data to run this updated version of the ICAN code. For reference, the micromechanics equations are provided in an appendix and should be compared to those in the original manual for modifications. A complete output for a sample case is also provided in a separate appendix. The input to the code includes constituent material properties, factors reflecting the fabrication process, and laminate configuration. The code performs micromechanics, macromechanics, and laminate analyses, including the hygrothermal response of polymer-matrix-based fiber composites. The output includes the various ply and composite properties, the composite structural response, and the composite stress analysis results with details on failure. The code is written in FORTRAN 77 and can be used efficiently as a self-contained package (or as a module) in complex structural analysis programs. The input-output format has changed considerably from the original version of ICAN and is described extensively through the use of a sample problem.
Study of the OCDMA Transmission Characteristics in FSO-FTTH at Various Distances, Outdoor
NASA Astrophysics Data System (ADS)
Aldouri, Muthana Y.; Aljunid, S. A.; Fadhil, Hilal A.
2013-06-01
This work applies field-programmable gate array (FPGA) and optical switch technology as the encoder and decoder in a spectral amplitude coding optical code division multiple access (SAC-OCDMA) free-space-optics fiber-to-the-home (FSO-FTTH) transmitter and receiver system design. The encoder/decoder module uses the FPGA as a code generator and the optical switch to encode and decode the optical source. The module was tested with the modified double weight (MDW) code, which was selected as an excellent candidate because it has shown superior performance in reducing the total noise. It is also easy to construct and can reduce the number of filters required at the receiver through a newly proposed detection scheme known as the AND subtraction technique. The MDW code is presented here to support fiber-to-the-home (FTTH) access networks in point-to-multipoint (P2MP) applications. The conversion used a Mach-Zehnder interferometer (MZI) wavelength converter. The performance is characterized through the bit error rate (BER) and bit rate (BR), as well as the received power at a variety of bit rates.
Højland, Dorte H.; Jensen, Karl-Martin Vagn; Kristensen, Michael
2014-01-01
Background The housefly, Musca domestica, has developed resistance to most insecticides applied for its control. Expression of genes coding for detoxification enzymes plays a role in the response of the housefly when it encounters a xenobiotic. The highest level of constitutive gene expression of nine P450 genes was previously found in a newly-collected susceptible field population in comparison to three insecticide-resistant laboratory strains and a laboratory reference strain. Results We compared gene expression of five P450s by qPCR as well as global gene expression by RNAseq in the newly-acquired field population (845b) in generations F1, F13 and F29 to test how gene expression changes following laboratory adaption. Four (CYP6A1, CYP6A36, CYP6D3, CYP6G4) of five investigated P450 genes adapted to breeding by decreasing expression. CYP6D1 showed higher female expression in F29 than in F1. For males, about half of the genes assessed in the global gene expression analysis were up-regulated in F13 and F29 in comparison with the F1 population. In females, 60% of the genes were up-regulated in F13 in comparison with F1, while 33% were up-regulated in F29. Forty potential P450 genes were identified. In most cases, P450 gene expression was decreased in F13 flies in comparison with F1. Gene expression then increased from F13 to F29 in males and decreased further in females. Conclusion The global gene expression changes massively during adaptation to laboratory breeding. In general, global expression decreased as a result of laboratory adaption in males, while female expression was not unidirectional. Expression of P450 genes was in general down-regulated as a result of laboratory adaption. Expression of hexamerin, coding for a storage protein, was increased, while expression of genes coding for amylases decreased. This suggests a major impact of the surrounding environment on gene response to xenobiotics and the genetic composition of housefly strains. PMID:24489682
Advanced Aerodynamic Design of Passive Porosity Control Effectors
NASA Technical Reports Server (NTRS)
Hunter, Craig A.; Viken, Sally A.; Wood, Richard M.; Bauer, Steven X. S.
2001-01-01
This paper describes aerodynamic design work aimed at developing a passive porosity control effector system for a generic tailless fighter aircraft. As part of this work, a computational design tool was developed and used to lay out passive porosity effector systems for longitudinal and lateral-directional control at a low-speed, high angle of attack condition. Aerodynamic analysis was conducted using the NASA Langley computational fluid dynamics code USM3D, in conjunction with a newly formulated surface boundary condition for passive porosity. Results indicate that passive porosity effectors can provide maneuver control increments that equal and exceed those of conventional aerodynamic effectors for low-speed, high-alpha flight, with control levels that are a linear function of porous area. This work demonstrates the tremendous potential of passive porosity to yield simple control effector systems that have no external moving parts and will preserve an aircraft's fixed outer mold line.
Battle against cancer: an everlasting saga of p53.
Hao, Qian; Cho, William C
2014-12-01
Cancer is one of the most life-threatening diseases characterized by uncontrolled growth and spread of malignant cells. The tumor suppressor p53 is the master regulator of tumor cell growth and proliferation. In response to various stress signals, p53 can be activated and transcriptionally induces a myriad of target genes, including both protein-encoding and non-coding genes, controlling cell cycle progression, DNA repair, senescence, apoptosis, autophagy and metabolism of tumor cells. However, around 50% of human cancers harbor mutant p53 and, in the majority of the remaining cancers, p53 is inactivated through multiple mechanisms. Herein, we review the recent progress in understanding the molecular basis of p53 signaling, particularly the newly identified ribosomal stress-p53 pathway, and the development of chemotherapeutics via activating wild-type p53 or restoring mutant p53 functions in cancer. A full understanding of p53 regulation will aid the development of effective cancer treatments.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Greve, L., E-mail: lars.greve@volkswagen.de; Medricky, M., E-mail: miloslav.medricky@volkswagen.de; Andres, M., E-mail: miloslav.medricky@volkswagen.de
A comprehensive strain hardening and fracture characterization of different grades of boron steel blanks has been performed, providing the foundation for the implementation into the modular material model (MMM) framework developed by Volkswagen Group Research for an explicit crash code. Due to the introduction of hardness-based interpolation rules for the characterized main grades, the hardening and fracture behavior is solely described by the underlying Vickers hardness. In other words, knowledge of the hardness distribution within a hot-formed component is enough to set up the newly developed computational model. The hardness distribution can be easily introduced via an experimentally measured hardness curve or via hardness mapping from a corresponding hot-forming simulation. For industrial application using rather coarse and computationally inexpensive shell element meshes, the user material model has been extended by a necking/post-necking model with reduced mesh-dependency as an additional failure mode. The present paper mainly addresses the necking/post-necking model.
Analysis of monochromatic and quasi-monochromatic X-ray sources in imaging and therapy
NASA Astrophysics Data System (ADS)
Westphal, Maximillian; Lim, Sara; Nahar, Sultana; Orban, Christopher; Pradhan, Anil
2017-04-01
We studied biomedical imaging and therapeutic applications of recently developed quasi-monochromatic and monochromatic X-ray sources. Using the Monte Carlo code GEANT4, we found that the quasi-monochromatic 65 keV Gaussian X-ray spectrum created by inverse Compton scattering with relativistic electron beams was capable of producing better image contrast with less radiation compared to conventional 120 kV broadband CT scans. We also explored possible experimental detection of theoretically predicted Kα resonance fluorescence in high-Z elements using the European Synchrotron Radiation Facility with a tungsten (Z = 74) target. In addition, we studied a newly developed quasi-monochromatic source generated by converting broadband X-rays to monochromatic Kα and Kβ X-rays with a zirconium target (Z = 40). We will further study how these Kα- and Kβ-dominated spectra can be implemented in conjunction with nanoparticles for targeted therapy. Acknowledgement: Ohio Supercomputer Center, Columbus, OH.
The Oceanographic Multipurpose Software Environment (OMUSE v1.0)
NASA Astrophysics Data System (ADS)
Pelupessy, Inti; van Werkhoven, Ben; van Elteren, Arjen; Viebahn, Jan; Candy, Adam; Portegies Zwart, Simon; Dijkstra, Henk
2017-08-01
In this paper we present the Oceanographic Multipurpose Software Environment (OMUSE). OMUSE aims to provide a homogeneous environment for existing or newly developed numerical ocean simulation codes, simplifying their use and deployment. In this way, numerical experiments that combine ocean models representing different physics or spanning different ranges of physical scales can be easily designed. Rapid development of simulation models is made possible through the creation of simple high-level scripts. The low-level core of the abstraction in OMUSE is designed to deploy these simulations efficiently on heterogeneous high-performance computing resources. Cross-verification of simulation models with different codes and numerical methods is facilitated by the unified interface that OMUSE provides. Reproducibility in numerical experiments is fostered by allowing complex numerical experiments to be expressed in portable scripts that conform to a common OMUSE interface. Here, we present the design of OMUSE as well as the modules and model components currently included, which range from a simple conceptual quasi-geostrophic solver to the global circulation model POP (Parallel Ocean Program). The uniform access to the codes' simulation state and the extensive automation of data transfer and conversion operations aids the implementation of model couplings. We discuss the types of couplings that can be implemented using OMUSE. We also present example applications that demonstrate the straightforward model initialization and the concurrent use of data analysis tools on a running model. We give examples of multiscale and multiphysics simulations by embedding a regional ocean model into a global ocean model and by coupling a surface wave propagation model with a coastal circulation model.
Functional Interplay between Small Non-Coding RNAs and RNA Modification in the Brain.
Leighton, Laura J; Bredy, Timothy W
2018-06-07
Small non-coding RNAs are essential for transcription, translation and gene regulation in all cell types, but are particularly important in neurons, with known roles in neurodevelopment, neuroplasticity and neurological disease. Many small non-coding RNAs are directly involved in the post-transcriptional modification of other RNA species, while others are themselves substrates for modification, or are functionally modulated by modification of their target RNAs. In this review, we explore the known and potential functions of several distinct classes of small non-coding RNAs in the mammalian brain, focusing on the newly recognised interplay between the epitranscriptome and the activity of small RNAs. We discuss the potential for this relationship to influence the spatial and temporal dynamics of gene activation in the brain, and predict that further research in the field of epitranscriptomics will identify interactions between small RNAs and RNA modifications which are essential for higher order brain functions such as learning and memory.
NASA Astrophysics Data System (ADS)
Liu, Hongdong; Zhang, Lian; Chen, Zhi; Liu, Xinguo; Dai, Zhongying; Li, Qiang; Xu, Xie George
2017-09-01
In medical physics it is desirable to have a Monte Carlo code that is less complex, reliable, yet flexible for dose verification, optimization, and component design. TOPAS is a newly developed Monte Carlo simulation tool which combines the extensive radiation physics libraries available in the Geant4 code with easy-to-use geometry and support for visualization. Although TOPAS has been widely tested and verified in simulations of proton therapy, there has been no reported application for carbon ion therapy. To evaluate the feasibility and accuracy of TOPAS simulations for carbon ion therapy, a licensed TOPAS code (version 3_0_p1) was used to carry out a dosimetric study of therapeutic carbon ions. Results for the depth dose profile based on different physics models were obtained and compared with measurements. It is found that the G4QMD model is at least as accurate as the TOPAS default BIC physics model for carbon ions, but when the energy is increased to relatively high levels such as 400 MeV/u, the G4QMD model shows preferable performance. Also, simulations of special components used in the treatment head at the Institute of Modern Physics facility were conducted to investigate the spread-out Bragg peak (SOBP) dose distribution in water. The physical dose in water of the SOBP was found to be consistent with the design aim of the 6 cm ridge filter.
Tree-based solvers for adaptive mesh refinement code FLASH - I: gravity and optical depths
NASA Astrophysics Data System (ADS)
Wünsch, R.; Walch, S.; Dinnbier, F.; Whitworth, A.
2018-04-01
We describe an OctTree algorithm for the MPI parallel, adaptive mesh refinement code FLASH, which can be used to calculate the gas self-gravity, and also the angle-averaged local optical depth, for treating ambient diffuse radiation. The algorithm communicates to the different processors only those parts of the tree that are needed to perform the tree-walk locally. The advantage of this approach is a relatively low memory requirement, important in particular for the optical depth calculation, which needs to process information from many different directions. This feature also enables a general tree-based radiation transport algorithm that will be described in a subsequent paper, and delivers excellent scaling up to at least 1500 cores. Boundary conditions for gravity can be either isolated or periodic, and they can be specified in each direction independently, using a newly developed generalization of the Ewald method. The gravity calculation can be accelerated with the adaptive block update technique by partially re-using the solution from the previous time-step. Comparison with the FLASH internal multigrid gravity solver shows that tree-based methods provide a competitive alternative, particularly for problems with isolated or mixed boundary conditions. We evaluate several multipole acceptance criteria (MACs) and identify a relatively simple approximate partial error MAC which provides high accuracy at low computational cost. The optical depth estimates are found to agree very well with those of the RADMC-3D radiation transport code, with the tree-solver being much faster. Our algorithm is available in the standard release of the FLASH code in version 4.0 and later.
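For context, the simplest multipole acceptance criterion (MAC) used in such tree-walks is the textbook Barnes-Hut opening-angle test sketched below; it is shown only as an illustration of what a MAC decides and is not the approximate partial error MAC evaluated in the paper.

```python
def node_is_acceptable(node_size, distance, theta=0.5):
    """Geometric Barnes-Hut MAC: accept a tree node as a single multipole if
    it subtends an angle smaller than theta, i.e. node_size / distance < theta.
    theta = 0.5 is a conventional illustrative value."""
    return node_size < theta * distance

# A 2-unit-wide node seen from 10 units away subtends 0.2 < 0.5 -> accepted.
print(node_is_acceptable(node_size=2.0, distance=10.0))  # True
print(node_is_acceptable(node_size=8.0, distance=10.0))  # False
```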
Southern, Danielle A; Burnand, Bernard; Droesler, Saskia E; Flemons, Ward; Forster, Alan J; Gurevich, Yana; Harrison, James; Quan, Hude; Pincus, Harold A; Romano, Patrick S; Sundararajan, Vijaya; Kostanjsek, Nenad; Ghali, William A
2017-03-01
Existing administrative data patient safety indicators (PSIs) have been limited by uncertainty around the timing of onset of included diagnoses. We undertook de novo PSI development through a data-driven approach that drew upon "diagnosis timing" information available in some countries' administrative hospital data. The study combined administrative database analysis with a modified Delphi rating process, covering all hospitalized adults in Canada in 2009. We queried all hospitalizations for ICD-10-CA diagnosis codes arising during the hospital stay. We then undertook a modified Delphi panel process to rate the extent to which each of the identified diagnoses has a potential link to suboptimal quality of care. We grouped the identified quality/safety-related diagnoses into relevant clinical categories. Lastly, we queried Alberta hospital discharge data to assess the frequency of the newly defined PSI events. Among 2,416,413 national hospitalizations, we found 2590 unique ICD-10-CA codes flagged as having arisen after admission. Seven panelists evaluated these in a 2-round review process, and identified a listing of 640 ICD-10-CA diagnosis codes judged to be linked to suboptimal quality of care and thus appropriate for inclusion in PSIs. These were then grouped by patient safety experts into 18 clinically relevant PSI categories. We then analyzed data on 2,381,652 Alberta hospital discharges from 2005 through 2012, and found that 134,299 (5.2%) hospitalizations had at least 1 PSI diagnosis. The resulting work creates a foundation for a new set of PSIs for routine large-scale surveillance of hospital and health system performance.
Overview of NASA Lewis Research Center free-piston Stirling engine activities
NASA Technical Reports Server (NTRS)
Slaby, J. G.
1984-01-01
A generic free-piston Stirling technology project is being conducted to develop technologies generic to both space power and terrestrial heat pump applications in a cooperative, cost-shared effort. The generic technology effort includes extensive parametric testing of a 1 kW free-piston Stirling engine (RE-1000), development of a free-piston Stirling performance computer code, design and fabrication under contract of a hydraulic output modification for RE-1000 engine tests, and a 1000-hour endurance test, under contract, of a 3 kWe free-piston Stirling/alternator engine. A newly initiated space power technology feasibility demonstration effort addresses the capability of scaling a free-piston Stirling/alternator system to about 25 kWe; developing thermodynamic cycle efficiency greater than or equal to 70 percent of Carnot at temperature ratios on the order of 1.5 to 2.0; achieving a power conversion unit specific weight of 6 kg/kWe; operating with noncontacting gas bearings; and dynamically balancing the system. Planned engine and component design and test efforts are described.
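The efficiency goal quoted above can be made concrete with a line of arithmetic: for a hot-to-cold temperature ratio r, the Carnot efficiency is 1 - 1/r, so 70 percent of Carnot at r = 1.5 to 2.0 corresponds to overall cycle efficiencies of roughly 23 to 35 percent.

```python
def target_efficiency(temp_ratio, fraction_of_carnot=0.70):
    """Carnot efficiency for a hot/cold temperature ratio r is 1 - 1/r;
    the stated goal corresponds to 70 percent of that value."""
    carnot = 1.0 - 1.0 / temp_ratio
    return fraction_of_carnot * carnot

for r in (1.5, 2.0):
    print(r, round(target_efficiency(r), 3))  # 1.5 -> 0.233, 2.0 -> 0.35
```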
Reformation of Regulatory Technical Standards for Nuclear Power Generation Equipments in Japan
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mikio Kurihara; Masahiro Aoki; Yu Maruyama
2006-07-01
Comprehensive reformation of the regulatory system has been introduced in Japan in order to apply recent technical progress in a timely manner. 'The Technical Standards for Nuclear Power Generation Equipments', known as Ordinance No. 62 of the Ministry of International Trade and Industry, which is used at the detailed design, construction and operating stages of nuclear power plants, was modified to performance specifications, with the consensus codes and standards being used as prescriptive specifications, in order to facilitate prompt review of the Ordinance in response to technological innovation. The modification was performed by the Nuclear and Industrial Safety Agency (NISA), the regulatory body in Japan, with support of the Japan Nuclear Energy Safety Organization (JNES), a technical support organization. The revised Ordinance No. 62 was issued on July 1, 2005 and has been enforced since January 1, 2006. During the period from issuance to enforcement, JNES prepared an enforceable regulatory guide complying with each provision of Ordinance No. 62, and also made technical assessments to endorse the applicability of consensus codes and standards, in response to NISA's request. Some consensus codes and standards were re-assessed since they were already used in regulatory review of the construction plans submitted by licensees. Other consensus codes and standards were newly assessed for endorsement. In cases where proper consensus codes or standards were not available, details of the regulatory requirements were described in the regulatory guide as immediate measures. At the same time, the appropriate standards-developing bodies were requested to prepare those consensus codes or standards. A supplementary note providing background information on the modification, applicable examples, etc., was prepared for the convenience of users of Ordinance No. 62. This paper presents the modification activities and their results, following NISA's presentation at ICONE-13 that introduced the framework of the performance specifications and the modification process of Ordinance No. 62. (authors)
Theoretical study of a dual harmonic system and its application to the CSNS/RCS
NASA Astrophysics Data System (ADS)
Yuan, Yao-Shuo; Wang, Na; Xu, Shou-Yan; Yuan, Yue; Wang, Sheng
2015-12-01
Dual harmonic systems have been widely used in high intensity proton synchrotrons to suppress the space charge effect and reduce beam loss. To investigate the longitudinal beam dynamics in a dual rf system, the potential well, the sub-buckets in the bunch and the multiple solutions of the phase equation are studied theoretically in this paper. Based on these theoretical studies, optimization of the bunching factor and the rf voltage waveform is performed for the dual harmonic rf system in the upgrade phase of the China Spallation Neutron Source Rapid Cycling Synchrotron (CSNS/RCS). In the optimization process, the simulation including the space charge effect is carried out using a newly developed code, C-SCSIM. Supported by National Natural Science Foundation of China (11175193)
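For orientation, a common parameterization of a dual-harmonic rf waveform and the usual definition of the bunching factor (average over peak line density) are sketched below; the amplitudes, harmonic ratio, and bunch profiles are illustrative assumptions, not CSNS/RCS parameters or the C-SCSIM implementation.

```python
import numpy as np

def dual_harmonic_voltage(phi, v1=1.0, ratio=0.5, h2=2, phi2=np.pi):
    """Common dual-harmonic rf waveform: fundamental plus a second harmonic
    of relative amplitude `ratio` and phase `phi2` (illustrative values)."""
    return v1 * (np.sin(phi) + ratio * np.sin(h2 * phi + phi2))

def bunching_factor(line_density):
    """Bunching factor = average line density / peak line density."""
    return np.mean(line_density) / np.max(line_density)

phi = np.linspace(-np.pi, np.pi, 1000)
print(round(dual_harmonic_voltage(np.pi / 3), 3))  # sample of the waveform

# A flattened bunch (the aim of the second harmonic) has a larger bunching
# factor than a simple cos^2-like bunch, which lowers the peak space charge.
single_bunch = np.cos(phi / 2) ** 2
flat_bunch = np.clip(single_bunch, 0.0, 0.6)
print(round(bunching_factor(single_bunch), 2))  # ~0.5
print(round(bunching_factor(flat_bunch), 2))    # ~0.65
```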
A novel ECG data compression method based on adaptive Fourier decomposition
NASA Astrophysics Data System (ADS)
Tan, Chunyu; Zhang, Liming
2017-12-01
This paper presents a novel electrocardiogram (ECG) compression method based on adaptive Fourier decomposition (AFD). AFD is a newly developed signal decomposition approach, which can decompose a signal with fast convergence, and hence reconstruct ECG signals with high fidelity. Unlike most of the high performance algorithms, our method does not make use of any preprocessing operation before compression. Huffman coding is employed for further compression. Validated with 48 ECG recordings of MIT-BIH arrhythmia database, the proposed method achieves the compression ratio (CR) of 35.53 and the percentage root mean square difference (PRD) of 1.47% on average with N = 8 decomposition times and a robust PRD-CR relationship. The results demonstrate that the proposed method has a good performance compared with the state-of-the-art ECG compressors.
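The two figures of merit quoted above have standard definitions, sketched here with placeholder numbers: the compression ratio is the size of the original record over the size of the compressed record, and PRD is the percentage root-mean-square difference between the original and reconstructed signals (one common form, without mean removal).

```python
import numpy as np

def compression_ratio(original_bits, compressed_bits):
    """CR = size of the original record / size of the compressed record."""
    return original_bits / compressed_bits

def prd(x, x_rec):
    """Percentage root-mean-square difference between signal x and its
    reconstruction x_rec (common definition, in percent)."""
    x, x_rec = np.asarray(x, float), np.asarray(x_rec, float)
    return 100.0 * np.sqrt(np.sum((x - x_rec) ** 2) / np.sum(x ** 2))

x = np.sin(np.linspace(0, 2 * np.pi, 100))
x_rec = x + 0.01 * np.random.randn(100)       # stand-in for a reconstruction
print(prd(x, x_rec))                          # small distortion -> low PRD
print(compression_ratio(1_000_000, 28_000))   # placeholder bit counts -> ~35.7
```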
Neutrino Heating Drives a Supernova (Silent Animation)
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
When a neutron star forms, compression creates heat that generates neutrinos. When the star’s core collapses, a shock wave propagates around the star but stalls. The neutrinos reenergize a stalled shock wave, and the convection created leads to an asymmetric explosion that shoots elements into the cosmos. The heat content, or entropy, is shown, with greater entropy represented by “warmer” hues. At center is a volume rendering of the developing explosion above the newly formed neutron star (based on a simulation with the CHIMERA code); side images of orthogonal slices through the star reveal additional detail. The movie starts 100 milliseconds after the formation of the neutron star, depicts the shockwave’s bounce and follows astrophysical events up to 432 milliseconds after the bounce.
Does littoral sand bypass the head of Mugu Submarine Canyon? - a modeling study
Xu, Jingping; Elias, Edwin; Kinsman, Nicole; Wang, Ping; Rosati, Julie D.; Roberts, Tiffany M.
2011-01-01
A newly developed sand-tracer code for the process-based model Delft3D (Deltares, The Netherlands) was used to simulate the littoral transport near the head of the Mugu Submarine Canyon in California, USA. For westerly swells, which account for more than 90% of the wave conditions in the region, the sand tracers in the downcoast littoral drift were unable to bypass the canyon head. A flow convergence near the upcoast rim of the canyon intercepts the tracers and moves them either offshore onto the shelf just west of the canyon rim (low wave height conditions) or into the canyon head (storm wave conditions). This finding supports the notion that Mugu Canyon is the true terminus of the Santa Barbara Littoral Cell.
"The only feasible means." The Pentagon's ambivalent relationship with the Nuremberg Code.
Moreno, J D
1996-01-01
Convinced that armed conflict with the Soviet Union was all but inevitable, that such conflict would involve unconventional atomic, biological, and chemical warfare, and that research with human subjects was essential to respond to the threat, in the early 1950s the U.S. Department of Defense promulgated a policy governing human experimentation based on the Nuremberg Code. Yet the policymaking process focused on the abstract issue of whether human experiments should go forward at all, ignoring the reality of human subjects research already under way and leaving unanswered ethical questions about how to conduct such research. Documents newly released to the Advisory Committee on Human Radiation Experiments tell the story of the Pentagon policy.
Incidence and clinical characteristics of interstitial cystitis in the community.
Patel, Ronak; Calhoun, Elizabeth A; Meenan, Richard T; O'Keeffe Rosetti, Maureen C; Kimes, Terry; Clemens, J Quentin
2008-08-01
We utilized physician-coded diagnoses and chart reviews to estimate the incidence of interstitial cystitis (IC) in women. A computer search of the Kaiser Permanente database was performed to identify newly coded diagnoses of IC (ICD-9 code 595.1) between May 2002 and May 2005. Chart reviews were performed and patient demographics, diagnosing physicians, and symptom characteristics were recorded. The IC incidence rate was 15 per 100,000 women per year. The mean age of the patients was 51 years (range 31-81 years). The most common presenting symptoms were frequency (70%), dysuria (52%), urgency (50%), suprapubic pain (50%), nocturia (35%), and dyspareunia (13%). Cases diagnosed by primary care physicians had a shorter median symptom duration (9 months) compared with those diagnosed by urologists (1 year) and gynecologists (3 years). IC is an uncommon diagnosis in the community setting, with an incidence rate of 15 per 100,000 women per year.
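The rate reported above is a standard incidence calculation: newly diagnosed cases divided by person-years at risk, scaled to 100,000. The counts below are placeholders chosen only to reproduce a rate of 15 per 100,000 per year, not the Kaiser Permanente data.

```python
def incidence_per_100k(new_cases, person_years):
    """Incidence rate = new cases / person-years at risk, per 100,000."""
    return 100_000 * new_cases / person_years

# Illustrative only: 45 new diagnoses over 3 years in a population of
# 100,000 women gives 15 per 100,000 women per year.
print(incidence_per_100k(new_cases=45, person_years=3 * 100_000))  # 15.0
```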
1992-07-23
This Law, reformulating entirely Book II of the French Penal Code, newly criminalizes the following acts: a) sexual harassment; b) subjecting a person to work conditions or lodging contrary to human dignity because that person is in a situation of vulnerability or dependence; c) incitement of minors to engage in dangerous or illegal behavior such as excessive drinking, use of narcotics, or begging; and d) using the pictures of minors for pornographic purposes. Sexual harassment is defined as the use of orders, threats, or force to gain sexual favors by a person whose responsibilities place him in a position of authority over another person. In addition, provisions relating to the punishment of procuring have been strengthened in the new Code. Acts noted in b) above were criminalized in order to combat more forcefully the use of clandestine workers.
Daures, J; Gouriou, J; Bordy, J M
2011-03-01
This work has been performed within the frame of the European Union ORAMED project (Optimisation of RAdiation protection for MEDical staff). The main goal of the project is to improve standards of protection for medical staff in procedures resulting in potentially high exposures and to develop methodologies for better assessing and reducing exposures to medical staff. Work Package WP2 is involved in the development of practical eye-lens dosimetry in interventional radiology. This study is complementary to the part of the ENEA report concerning the MCNP-4C calculations of the conversion factors related to the operational quantity H(p)(3). In this study, a set of energy- and angular-dependent conversion coefficients (H(p)(3)/K(a)), in the newly proposed square cylindrical phantom made of ICRU tissue, has been calculated with the Monte Carlo codes PENELOPE and MCNP5. The H(p)(3) values have been determined in terms of absorbed dose, according to the definition of this quantity, and also with the kerma approximation as formerly reported in ICRU reports. At low photon energies (up to 1 MeV), the results obtained with the two methods are consistent. Nevertheless, large differences appear at higher energies. This is mainly due to the lack of electronic equilibrium, especially for small-angle incidence. The values of the conversion coefficients obtained with the MCNP-4C code published by ENEA agree well with the kerma approximation calculations obtained with PENELOPE. We also performed the same calculations with the MCNP5 code using two types of tallies: F6 for the kerma approximation and *F8 for estimating the absorbed dose, which is deposited by secondary electrons. The PENELOPE and MCNP5 results agree for the kerma approximation and for the absorbed dose calculation of H(p)(3) and prove that, for photon energies larger than 1 MeV, the transport of the secondary electrons has to be taken into account.
Jiang, Chuang-Dao; Wang, Xin; Gao, Hui-Yuan; Shi, Lei; Chow, Wah Soon
2011-03-01
Leaf anatomy of C3 plants is mainly regulated by a systemic irradiance signal. Since the anatomical features of C4 plants are different from that of C3 plants, we investigated whether the systemic irradiance signal regulates leaf anatomical structure and photosynthetic performance in sorghum (Sorghum bicolor), a C4 plant. Compared with growth under ambient conditions (A), no significant changes in anatomical structure were observed in newly developed leaves by shading young leaves alone (YS). Shading mature leaves (MS) or whole plants (S), on the other hand, caused shade-leaf anatomy in newly developed leaves. By contrast, chloroplast ultrastructure in developing leaves depended only on their local light conditions. Functionally, shading young leaves alone had little effect on their net photosynthetic capacity and stomatal conductance, but shading mature leaves or whole plants significantly decreased these two parameters in newly developed leaves. Specifically, the net photosynthetic rate in newly developed leaves exhibited a positive linear correlation with that of mature leaves, as did stomatal conductance. In MS and S treatments, newly developed leaves exhibited severe photoinhibition under high light. By contrast, newly developed leaves in A and YS treatments were more resistant to high light relative to those in MS- and S-treated seedlings. We suggest that (1) leaf anatomical structure, photosynthetic capacity, and high-light tolerance in newly developed sorghum leaves were regulated by a systemic irradiance signal from mature leaves; and (2) chloroplast ultrastructure only weakly influenced the development of photosynthetic capacity and high-light tolerance. The potential significance of the regulation by a systemic irradiance signal is discussed.
Peng, Hui; Lan, Chaowang; Liu, Yuansheng; Liu, Tao; Blumenstein, Michael; Li, Jinyan
2017-10-03
Disease-related protein-coding genes have been widely studied, but disease-related non-coding genes remain largely unknown. This work introduces a new vector to represent diseases, and applies the newly vectorized data for a positive-unlabeled learning algorithm to predict and rank disease-related long non-coding RNA (lncRNA) genes. This novel vector representation for diseases consists of two sub-vectors, one is composed of 45 elements, characterizing the information entropies of the disease genes distribution over 45 chromosome substructures. This idea is supported by our observation that some substructures (e.g., the chromosome 6 p-arm) are highly preferred by disease-related protein coding genes, while some (e.g., the 21 p-arm) are not favored at all. The second sub-vector is 30-dimensional, characterizing the distribution of disease gene enriched KEGG pathways in comparison with our manually created pathway groups. The second sub-vector complements with the first one to differentiate between various diseases. Our prediction method outperforms the state-of-the-art methods on benchmark datasets for prioritizing disease related lncRNA genes. The method also works well when only the sequence information of an lncRNA gene is known, or even when a given disease has no currently recognized long non-coding genes.
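The entropy sub-vector idea can be sketched as follows, with the caveat that the exact construction of the 45 elements is not spelled out in the abstract; the per-substructure information terms below, and the three placeholder substructures, are assumptions for illustration only.

```python
import numpy as np

def entropy_terms(gene_counts):
    """Per-substructure information terms -p_i * log2(p_i) of a disease's
    gene distribution over chromosome substructures (assumed construction)."""
    counts = np.asarray(gene_counts, dtype=float)
    p = counts / counts.sum()
    terms = np.zeros_like(p)
    nonzero = p > 0
    terms[nonzero] = -p[nonzero] * np.log2(p[nonzero])
    return terms

# Three placeholder substructures instead of 45; genes clustered on the first
# substructure produce an uneven profile of information terms.
print(entropy_terms([8, 1, 1]).round(3))  # [0.258 0.332 0.332]
```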
Zetterberg, Veera; Ustina, Valentina; Liitsola, Kirsi; Zilmer, Kai; Kalikova, Nelli; Sevastianova, Ksenia; Brummer-Korvenkontio, Henrikki; Leinikki, Pauli; Salminen, Mika O
2004-11-01
HIV-1 infection has been rare in Estonia. In 2000, an explosive epidemic among injecting drug users was detected in the Eastern border region, resulting in 3603 newly reported cases by the end of 2003. The molecular epidemiology of the outbreak was studied to establish whether the Estonian epidemic is linked to the epidemics in Eastern Europe. Over 200 newly infected individuals were prospectively sampled from June 2000 to March 2002 in a geographically representative way, with known dates of diagnosis and information on the probable route of transmission. Two viral gene coding regions were directly sequenced from plasma viral RNA and phylogenetically analyzed. In addition, a larger region coding for the entire env gene was sequenced from one sample and studied for indications of possible recombinant structure. The Estonian HIV outbreak was found to be caused by the simultaneous introduction of two strains: a minor subtype A strain very similar to the Eastern European subtype A strain (approximately 8% of cases), and a second major strain (77%) found to be most closely related to the CRF06-cpx strain, previously described only from African countries. The variability in the two clusters was very low, suggesting point source introductions. Ten percent of cases seemed to be newly generated recombinants of the A and CRF06-cpx strains. Analysis of viral diversification over time revealed a rate of change within the V3 region of 0.83%/year for the CRF06-cpx strain, consistent with findings from other subtypes. Due to the relatively frequently found novel recombinant forms, the Estonian HIV-1 epidemic may allow detailed studies of coinfection and intersubtype recombination.
Valenzuela-Miranda, Diego; Gallardo-Escárate, Cristian
2016-12-01
Despite the high prevalence of the intracellular bacterium Piscirickettsia salmonis and its impact on Chilean salmon aquaculture, the molecular underpinnings of host-pathogen interactions remain unclear. Herein, the interplay of coding and non-coding transcripts has been proposed as a key mechanism involved in the immune response. Therefore, the aim of this study was to evidence how coding and non-coding transcripts are modulated during the infection process of Atlantic salmon with P. salmonis. For this, RNA-seq was conducted on brain, spleen, and head kidney samples, revealing different transcriptional profiles according to bacterial load. Additionally, while most of the regulated genes were annotated to diverse biological processes during infection, a common response associated with clathrin-mediated endocytosis and iron homeostasis was present in all tissues. Interestingly, while endocytosis-promoting factors and clathrin inductions were upregulated, endocytic receptors were mainly downregulated. Furthermore, the regulation of genes related to iron homeostasis suggested an intracellular accumulation of iron, a process in which heme biosynthesis/degradation pathways might play an important role. Regarding the non-coding response, 918 putative long non-coding RNAs were identified, of which 425 were newly characterized for S. salar. Finally, co-localization and co-expression analyses revealed a strong correlation between the modulation of long non-coding RNAs and genes associated with endocytosis and iron homeostasis. These results represent the first comprehensive study of putative interplaying mechanisms of coding and non-coding RNAs during bacterial infection in salmonids. Copyright © 2016 Elsevier Ltd. All rights reserved.
Study of optical design of Blu-ray pickup head system with a liquid crystal element.
Fang, Yi-Chin; Yen, Chih-Ta; Hsu, Jui-Hsin
2014-10-10
This paper proposes a newly developed optical design and an active compensation method for a Blu-ray pickup head system with a liquid crystal (LC) element. Different from a traditional pickup lens design, this new optical design delivers performance as good as the conventional one but has more room for tolerance control, which plays a role in antishaking devices such as portable Blu-ray players. A hole-pattern electrode and LC optics with an external voltage input were employed to generate a symmetric nonuniform electric field in the LC layer that directs the LC molecules into the appropriate gradient refractive index distribution, resulting in the convergence or divergence of specific light beams. LC optics deliver fast and, most importantly, active compensation through optical design when errors occur. Simulations and tolerance analyses were conducted using Code V software, including defocus, tilt, and decenter, and their related compensations. Two distinct Blu-ray pickup head system designs were examined in this study. In traditional Blu-ray pickup head system designs, the aperture stop is always set on the objective lens. In this study, the aperture stop is instead placed on the newly developed LC lens. The results revealed that an optical design with the aperture stop set on the LC lens as an active compensation device successfully eliminated up to 57% of the coma aberration compared with traditional optical designs, so that this pickup head lens design has more room for tolerance control.
Simulations of High Current NuMI Magnetic Horn Striplines at FNAL
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sipahi, Taylan; Biedron, Sandra; Hylen, James
2016-06-01
Both the NuMI (Neutrinos and the Main Injector) beam line, that has been providing intense neutrino beams for several Fermilab experiments (MINOS, MINERVA, NOVA), and the newly proposed LBNF (Long Baseline Neutrino Facility) beam line which plans to produce the highest power neutrino beam in the world for DUNE (the Deep Underground Neutrino Experiment) need pulsed magnetic horns to focus the mesons which decay to produce the neutrinos. The high-current horn and stripline design has been evolving as NuMI reconfigures for higher beam power and to meet the needs of the LBNF design. The CSU particle accelerator group has aided the neutrino physics experiments at Fermilab by producing EM simulations of magnetic horns and the required high-current striplines. In this paper, we present calculations, using the Poisson and ANSYS Maxwell 3D codes, of the EM interaction of the stripline plates of the NuMI horns at critical stress points. In addition, we give the electrical simulation results using the ANSYS Electric code. These results are being used to support the development of evolving horn stripline designs to handle increased electrical current and higher beam power for NuMI upgrades and for LBNF.
QR encoded smart oral dosage forms by inkjet printing.
Edinger, Magnus; Bar-Shalom, Daniel; Sandler, Niklas; Rantanen, Jukka; Genina, Natalja
2018-01-30
The use of inkjet printing (IJP) technology enables the flexible manufacturing of personalized medicine with the doses tailored for each patient. In this study we demonstrate, for the first time, the applicability of IJP in the production of edible dosage forms in the pattern of a quick response (QR) code. This printed pattern contains the drug itself and encoded information relevant to the patient and/or healthcare professionals. IJP of the active pharmaceutical ingredient (API)-containing ink in the pattern of QR code was performed onto a newly developed porous and flexible, but mechanically stable substrate with a good absorption capacity. The printing did not affect the mechanical properties of the substrate. The actual drug content of the printed dosage forms was in accordance with the encoded drug content. The QR encoded dosage forms had a good print definition without significant edge bleeding. They were readable by a smartphone even after storage in harsh conditions. This approach of efficient data incorporation and data storage combined with the use of smart devices can lead to safer and more patient-friendly drug products in the future. Copyright © 2017 Elsevier B.V. All rights reserved.
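The data-encoding side of such a QR-patterned dosage form can be illustrated with the open-source `qrcode` Python package; the payload fields below are placeholders, not the study's schema, and the actual deposition of drug-containing ink onto the substrate is of course not represented here.

```python
import json
import qrcode  # open-source package: pip install qrcode[pil]

# Illustrative payload only; field names are hypothetical.
payload = json.dumps({
    "patient_id": "ANON-0001",
    "api": "example-drug",
    "dose_mg": 2.5,
    "batch": "2018-01-30-A",
})

img = qrcode.make(payload)   # build the QR symbol from the payload text
img.save("dose_qr.png")      # in the study, the pattern is printed with drug-loaded ink
```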
Kinetic Modeling of Ultraintense X-ray Laser-Matter Interactions
NASA Astrophysics Data System (ADS)
Royle, Ryan; Sentoku, Yasuhiko; Mancini, Roberto
2016-10-01
Hard x-ray free-electron lasers (XFELs) have had a profound impact on the physical, chemical, and biological sciences. They can produce millijoule x-ray laser pulses just tens of femtoseconds in duration with more than 10^12 photons each, making them the brightest laboratory x-ray sources ever produced by several orders of magnitude. An XFEL pulse can be intensified to 10^20 W/cm^2 when focused to submicron spot sizes, making it possible to isochorically heat solid matter well beyond 100 eV. These characteristics enable XFELs to create and probe well-characterized warm and hot dense plasmas of relevance to HED science, planetary science, laboratory astrophysics, relativistic laser plasmas, and fusion research. Several newly developed atomic physics models including photoionization, Auger ionization, and continuum-lowering have been implemented in a particle-in-cell code, PICLS, which self-consistently solves the x-ray transport, to enable the simulation of the non-LTE plasmas created by ultraintense x-ray laser interactions with solid density matter. The code is validated against the results of several recent experiments and is used to simulate the maximum-intensity x-ray heating of solid iron targets. This work was supported by DOE/OFES under Contract No. DE-SC0008827.
Hadwiger, Lee A; Polashock, James
2013-01-01
Previous reports on the model nonhost resistance interaction between Fusarium solani f. sp. phaseoli and pea endocarp tissue have described the disease resistance-signaling role of a fungal DNase1-like protein. The response resulted in no further growth beyond spore germination. This F. solani f. sp. phaseoli DNase gene, constructed with a pathogenesis-related (PR) gene promoter, when transferred to tobacco, generated resistance against Pseudomonas syringae pv. tabaci. The current analytical/theoretical article proposes similar roles for the additional nuclear and mitochondrial nucleases, the coding regions for which are identified in newly available fungal genome sequences. The amino acid sequence homologies within functional domains are conserved within a wide array of fungi. The potato pathogen Verticillium dahliae nuclease was divergent from that of the saprophyte, yeast; however, the purified DNase from yeast also elicited nonhost defense responses in pea, including pisatin accumulation, PR gene induction, and resistance against a true pea pathogen. The yeast mitochondrial DNase gene (open reading frame) predictably codes for a signal peptide providing the mechanism for secretion. Mitochondrial DNase genes appear to provide an unlimited source of components for developing transgenic resistance in all transformable plants.
Patrick, Donald L; Burke, Laurie B; Gwaltney, Chad J; Leidy, Nancy Kline; Martin, Mona L; Molsen, Elizabeth; Ring, Lena
2011-12-01
The importance of content validity in developing patient reported outcomes (PRO) instruments is stressed by both the US Food and Drug Administration and the European Medicines Agency. Content validity is the extent to which an instrument measures the important aspects of concepts that developers or users purport it to assess. A PRO instrument measures the concepts most significant and relevant to a patient's condition and its treatment. For PRO instruments, items and domains as reflected in the scores of an instrument should be important to the target population and comprehensive with respect to patient concerns. Documentation of target population input in item generation, as well as evaluation of patient understanding through cognitive interviewing, can provide the evidence for content validity. Developing content for, and assessing respondent understanding of, newly developed PRO instruments for medical product evaluation will be discussed in this two-part ISPOR PRO Good Research Practices Task Force Report. Topics include the methods for generating items, documenting item development, coding of qualitative data from item generation, cognitive interviewing, and tracking item development through the various stages of research and preparing this tracking for submission to regulatory agencies. Part 1 covers elicitation of key concepts using qualitative focus groups and/or interviews to inform content and structure of a new PRO instrument. Part 2 covers the instrument development process, the assessment of patient understanding of the draft instrument using cognitive interviews and steps for instrument revision. The two parts are meant to be read together. They are intended to offer suggestions for good practices in planning, executing, and documenting qualitative studies that are used to support the content validity of PRO instruments to be used in medical product evaluation. Copyright © 2011 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.
Image dependency in the recognition of newly learnt faces.
Longmore, Christopher A; Santos, Isabel M; Silva, Carlos F; Hall, Abi; Faloyin, Dipo; Little, Emily
2017-05-01
Research investigating the effect of lighting and viewpoint changes on unfamiliar and newly learnt faces has revealed that such recognition is highly image dependent and that changes in either of these leads to poor recognition accuracy. Three experiments are reported to extend these findings by examining the effect of apparent age on the recognition of newly learnt faces. Experiment 1 investigated the ability to generalize to novel ages of a face after learning a single image. It was found that recognition was best for the learnt image with performance falling the greater the dissimilarity between the study and test images. Experiments 2 and 3 examined whether learning two images aids subsequent recognition of a novel image. The results indicated that interpolation between two studied images (Experiment 2) provided some additional benefit over learning a single view, but that this did not extend to extrapolation (Experiment 3). The results from all studies suggest that recognition was driven primarily by pictorial codes and that the recognition of faces learnt from a limited number of sources operates on stored images of faces as opposed to more abstract, structural, representations.
Rep. Coble, Howard [R-NC-6
2009-09-16
House - 10/19/2009: Referred to the Subcommittee on Immigration, Citizenship, Refugees, Border Security, and International Law. Notes: For further action, see S.1599, which became Public Law 111-113 on 12/14/2009. Status: Introduced.
Deborah Lucy, S.; Bisbee, Leslie; Conti-Becker, Angela
2009-01-01
Purpose: To understand the professional socialization of physical therapy (PT) students. Method: Forty-two students enrolled in our newly developed master's degree programme wrote three-page reflective journals on a critical learning incident after each of three selected clinical experiences. The journals were coded and analyzed, and major themes were identified and described. A separate cohort of 44 students participated in focus groups after the same three clinical experiences to check the trustworthiness of the results. Results: Following the first placement, the main themes coded were emotions, self-confidence, professionalism in the real world, communication, and learning by doing. After the intermediate placement, major themes were idealism versus realism, depth of communication with clients, and breadth of communication with family members and colleagues. Aspects of clinical learning were variable, and self-confidence remained an issue. After the final placement, most students were deeply engaged with their clients and self-confidence had developed to the point of self-efficacy. Tensions increased between the concept of ideal practice and the pragmatics of actual practice, and the concept of self as protégé (rather than as object of the supervisor's evaluation) emerged. The themes were subsequently assembled in a booklet with representative quotations. Conclusion: These results contribute to foundational knowledge required by PT educators, including clinical instructors, by explicitly describing the professional socialization of PT students. PMID:20145748
Zhang, Xiaomin; Xie, Xiangdong; Cheng, Jie; Ning, Jing; Yuan, Yong; Pan, Jie; Yang, Guoshan
2012-01-01
A set of conversion coefficients from kerma free-in-air to the organ absorbed dose for external photon beams from 10 keV to 10 MeV are presented based on a newly developed voxel mouse model, for the purpose of radiation effect evaluation. The voxel mouse model was developed from colour images of successive cryosections of a normal nude male mouse, in which 14 organs or tissues were segmented manually and filled with different colours, while each colour was tagged by a specific ID number for implementation of the mouse model in the Monte Carlo N-Particle code (MCNP). Monte Carlo simulation with MCNP was carried out to obtain organ dose conversion coefficients for 22 external monoenergetic photon beams between 10 keV and 10 MeV under five different irradiation geometries (left lateral, right lateral, dorsal-ventral, ventral-dorsal, and isotropic). Organ dose conversion coefficients were presented in tables and compared with the published data based on a rat model to investigate the effect of body size and weight on the organ dose. The calculation and comparison results show that the organ dose conversion coefficients exhibit a similar trend with varying photon energy for most organs except the bone and skin, and that the organ dose is sensitive to body size and weight at photon energies below approximately 0.1 MeV.
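As an illustration of how such tabulated coefficients are typically applied (not code from the paper), the sketch below interpolates a hypothetical conversion-coefficient table in log-log space to convert a measured kerma free-in-air into an organ absorbed dose; the energies and coefficient values are placeholders.

```python
import numpy as np

# Hypothetical table: photon energy (MeV) vs. conversion coefficient
# (organ absorbed dose per unit kerma free-in-air, Gy/Gy) for one organ
# and one irradiation geometry. Values are placeholders, NOT the paper's data.
energies_mev = np.array([0.01, 0.03, 0.1, 0.3, 1.0, 3.0, 10.0])
coefficients = np.array([0.05, 0.30, 0.70, 0.85, 0.95, 1.00, 1.02])

def organ_dose(kerma_gy, photon_energy_mev):
    """Estimate organ absorbed dose by log-log interpolation of the table."""
    log_c = np.interp(np.log(photon_energy_mev),
                      np.log(energies_mev), np.log(coefficients))
    return kerma_gy * np.exp(log_c)

print(organ_dose(kerma_gy=2.0e-3, photon_energy_mev=0.662))  # e.g. a Cs-137 line
```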
Verification of the ideal magnetohydrodynamic response at rational surfaces in the VMEC code
Lazerson, Samuel A.; Loizu, Joaquim; Hirshman, Steven; ...
2016-01-13
The VMEC nonlinear ideal MHD equilibrium code [S. P. Hirshman and J. C. Whitson, Phys. Fluids 26, 3553 (1983)] is compared against analytic linear ideal MHD theory in a screw-pinch-like configuration. The focus of such analysis is to verify the ideal MHD response at magnetic surfaces which possess a magnetic transform (ι) that is resonant with spectral values of the perturbed boundary harmonics. A large aspect ratio circular cross section zero-beta equilibrium is considered. This equilibrium possesses a rational surface with safety factor q = 2 at a normalized flux value of 0.5. A small resonant boundary perturbation is introduced, exciting a response at the resonant rational surface. The code is found to capture the plasma response as predicted by a newly developed analytic theory that ensures the existence of nested flux surfaces by allowing for a jump in rotational transform (ι=1/q). The VMEC code satisfactorily reproduces these theoretical results without the necessity of an explicit transform discontinuity (Δι) at the rational surface. It is found that the response across the rational surfaces depends upon both radial grid resolution and local shear (dι/dΦ, where ι is the rotational transform and Φ the enclosed toroidal flux). Calculations of an implicit Δι suggest that it does not arise due to numerical artifacts (attributed to radial finite differences in VMEC) or existence conditions for flux surfaces as predicted by linear theory (minimum values of Δι). Scans of the rotational transform profile indicate that for experimentally relevant levels of transform shear the response becomes increasingly localised. Furthermore, careful examination of a large experimental tokamak equilibrium, with applied resonant fields, indicates that this shielding response is present, suggesting the phenomenon is not limited to this verification exercise.
BayeSED: A General Approach to Fitting the Spectral Energy Distribution of Galaxies
NASA Astrophysics Data System (ADS)
Han, Yunkun; Han, Zhanwen
2014-11-01
We present a newly developed version of BayeSED, a general Bayesian approach to the spectral energy distribution (SED) fitting of galaxies. The new BayeSED code has been systematically tested on a mock sample of galaxies. The comparison between the estimated and input values of the parameters shows that BayeSED can recover the physical parameters of galaxies reasonably well. We then applied BayeSED to interpret the SEDs of a large Ks -selected sample of galaxies in the COSMOS/UltraVISTA field with stellar population synthesis models. Using the new BayeSED code, a Bayesian model comparison of stellar population synthesis models has been performed for the first time. We found that the 2003 model by Bruzual & Charlot, statistically speaking, has greater Bayesian evidence than the 2005 model by Maraston for the Ks -selected sample. In addition, while setting the stellar metallicity as a free parameter obviously increases the Bayesian evidence of both models, varying the initial mass function has a notable effect only on the Maraston model. Meanwhile, the physical parameters estimated with BayeSED are found to be generally consistent with those obtained using the popular grid-based FAST code, while the former parameters exhibit more natural distributions. Based on the estimated physical parameters of the galaxies in the sample, we qualitatively classified the galaxies in the sample into five populations that may represent galaxies at different evolution stages or in different environments. We conclude that BayeSED could be a reliable and powerful tool for investigating the formation and evolution of galaxies from the rich multi-wavelength observations currently available. A binary version of the BayeSED code parallelized with Message Passing Interface is publicly available at https://bitbucket.org/hanyk/bayesed.
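A Bayesian model comparison of the kind described here reduces to comparing the (log) evidences returned for each model. The sketch below, with purely hypothetical log-evidence values rather than numbers from the paper, shows how a Bayes factor between two stellar population synthesis models would be formed and read.

```python
import numpy as np

# Hypothetical (natural) log-evidences returned by a fitting run for two
# stellar population synthesis models; values are illustrative only.
ln_Z_bc03 = -1523.4   # e.g. Bruzual & Charlot (2003)
ln_Z_m05  = -1531.9   # e.g. Maraston (2005)

ln_bayes_factor = ln_Z_bc03 - ln_Z_m05
bayes_factor = np.exp(ln_bayes_factor)

print(f"ln(B) = {ln_bayes_factor:.1f}, B = {bayes_factor:.3g}")
# On the usual Jeffreys-type scale, ln(B) greater than about 5 is read as
# strong evidence in favour of the first model.
```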
SOCR Analyses - an Instructional Java Web-based Statistical Analysis Toolkit.
Chu, Annie; Cui, Jenny; Dinov, Ivo D
2009-03-01
The Statistical Online Computational Resource (SOCR) designs web-based tools for educational use in a variety of undergraduate courses (Dinov 2006). Several studies have demonstrated that these resources significantly improve students' motivation and learning experiences (Dinov et al. 2008). SOCR Analyses is a new component that concentrates on data modeling and analysis using parametric and non-parametric techniques supported with graphical model diagnostics. Currently implemented analyses include commonly used models in undergraduate statistics courses like linear models (Simple Linear Regression, Multiple Linear Regression, One-Way and Two-Way ANOVA). In addition, we implemented tests for sample comparisons, such as the t-test in the parametric category, and the Wilcoxon rank sum test, Kruskal-Wallis test, and Friedman's test in the non-parametric category. SOCR Analyses also includes several hypothesis test models, such as contingency tables, Friedman's test and Fisher's exact test. The code itself is open source (http://socr.googlecode.com/), hoping to contribute to the efforts of the statistical computing community. The code includes functionality for each specific analysis model and it has general utilities that can be applied in various statistical computing tasks. For example, concrete methods with API (Application Programming Interface) have been implemented in statistical summary, least square solutions of general linear models, rank calculations, etc. HTML interfaces, tutorials, source code, activities, and data are freely available via the web (www.SOCR.ucla.edu). Code examples for developers and demos for educators are provided on the SOCR Wiki website. In this article, the pedagogical utilization of the SOCR Analyses is discussed, as well as the underlying design framework. As the SOCR project is on-going and more functions and tools are being added to it, these resources are constantly improved. The reader is strongly encouraged to check the SOCR site for most updated information and newly added models.
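SOCR Analyses itself is a Java toolkit; as a language-agnostic illustration of the parametric and non-parametric comparisons it lists, the Python sketch below runs the same kinds of tests with SciPy on small hypothetical samples.

```python
from scipy import stats

# Two small illustrative samples (hypothetical exam scores).
group_a = [72, 85, 90, 66, 78, 88, 95]
group_b = [60, 74, 69, 81, 70, 65, 73]
group_c = [55, 62, 71, 58, 64, 60, 67]

# Parametric two-sample comparison.
t_stat, t_p = stats.ttest_ind(group_a, group_b)

# Non-parametric analogues of the tests offered by SOCR Analyses.
w_stat, w_p = stats.ranksums(group_a, group_b)          # Wilcoxon rank sum
h_stat, h_p = stats.kruskal(group_a, group_b, group_c)  # Kruskal-Wallis

print(f"t-test p={t_p:.3f}, rank-sum p={w_p:.3f}, Kruskal-Wallis p={h_p:.3f}")
```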
Creep force modelling for rail traction vehicles based on the Fastsim algorithm
NASA Astrophysics Data System (ADS)
Spiryagin, Maksym; Polach, Oldrich; Cole, Colin
2013-11-01
The evaluation of creep forces is a complex task and their calculation is a time-consuming process for multibody simulation (MBS). A methodology of creep forces modelling at large traction creepages has been proposed by Polach [Creep forces in simulations of traction vehicles running on adhesion limit. Wear. 2005;258:992-1000; Influence of locomotive tractive effort on the forces between wheel and rail. Veh Syst Dyn. 2001(Suppl);35:7-22] adapting his previously published algorithm [Polach O. A fast wheel-rail forces calculation computer code. Veh Syst Dyn. 1999(Suppl);33:728-739]. The most common method for creep force modelling used by software packages for MBS of running dynamics is the Fastsim algorithm by Kalker [A fast algorithm for the simplified theory of rolling contact. Veh Syst Dyn. 1982;11:1-13]. However, the Fastsim code has some limitations which do not allow modelling the creep force - creep characteristic in agreement with measurements for locomotives and other high-power traction vehicles, mainly for large traction creep at low-adhesion conditions. This paper describes a newly developed methodology based on a variable contact flexibility increasing with the ratio of the slip area to the area of adhesion. This variable contact flexibility is introduced in a modification of Kalker's code Fastsim by replacing the constant Kalker's reduction factor, widely used in MBS, by a variable reduction factor together with a slip-velocity-dependent friction coefficient decreasing with increasing global creepage. The proposed methodology is presented in this work and compared with measurements for different locomotives. The modification allows use of the well recognised Fastsim code for simulation of creep forces at large creepages in agreement with measurements without modifying the proven modelling methodology at small creepages.
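For orientation, the sketch below implements the commonly cited slip-velocity-dependent friction law from Polach's work, mu = mu0*((1 - A)*exp(-B*w) + A), together with a purely illustrative blend for the variable reduction factor; the constants and the blending rule are assumptions for demonstration, not the paper's exact formulation.

```python
import math

def friction_coefficient(slip_velocity, mu0=0.4, A=0.4, B=0.6):
    """Slip-velocity-dependent friction, mu = mu0*((1-A)*exp(-B*w) + A).

    mu0 : static friction coefficient
    A   : ratio of the limit friction (at large slip) to mu0
    B   : exponential decay coefficient (s/m)
    """
    return mu0 * ((1.0 - A) * math.exp(-B * abs(slip_velocity)) + A)

def reduced_kalker_factor(slip_area_ratio, k_adhesion=1.0, k_slip=0.3):
    """Illustrative variable reduction factor: blend between a factor for the
    adhesion area and one for the slip area as the slip area grows.
    (Hypothetical blend, not the published formulation.)"""
    s = min(max(slip_area_ratio, 0.0), 1.0)
    return (1.0 - s) * k_adhesion + s * k_slip

for w in (0.0, 0.5, 2.0):
    print(w, round(friction_coefficient(w), 3))
```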
Accelerated GPU based SPECT Monte Carlo simulations.
Garcia, Marie-Paule; Bert, Julien; Benoit, Didier; Bardiès, Manuel; Visvikis, Dimitris
2016-06-07
Monte Carlo (MC) modelling is widely used in the field of single photon emission computed tomography (SPECT) as it is a reliable technique to simulate very high quality scans. This technique provides very accurate modelling of the radiation transport and particle interactions in a heterogeneous medium. Various MC codes exist for nuclear medicine imaging simulations. Recently, new strategies exploiting the computing capabilities of graphical processing units (GPU) have been proposed. This work aims at evaluating the accuracy of such GPU implementation strategies in comparison to standard MC codes in the context of SPECT imaging. GATE was considered the reference MC toolkit and used to evaluate the performance of newly developed GPU Geant4-based Monte Carlo simulation (GGEMS) modules for SPECT imaging. Radioisotopes with different photon energies were used with these various CPU and GPU Geant4-based MC codes in order to assess the best strategy for each configuration. Three different isotopes were considered: (99m) Tc, (111)In and (131)I, using a low energy high resolution (LEHR) collimator, a medium energy general purpose (MEGP) collimator and a high energy general purpose (HEGP) collimator respectively. Point source, uniform source, cylindrical phantom and anthropomorphic phantom acquisitions were simulated using a model of the GE infinia II 3/8" gamma camera. Both simulation platforms yielded a similar system sensitivity and image statistical quality for the various combinations. The overall acceleration factor between GATE and GGEMS platform derived from the same cylindrical phantom acquisition was between 18 and 27 for the different radioisotopes. Besides, a full MC simulation using an anthropomorphic phantom showed the full potential of the GGEMS platform, with a resulting acceleration factor up to 71. The good agreement with reference codes and the acceleration factors obtained support the use of GPU implementation strategies for improving computational efficiency of SPECT imaging simulations.
BayeSED: A GENERAL APPROACH TO FITTING THE SPECTRAL ENERGY DISTRIBUTION OF GALAXIES
DOE Office of Scientific and Technical Information (OSTI.GOV)
Han, Yunkun; Han, Zhanwen, E-mail: hanyk@ynao.ac.cn, E-mail: zhanwenhan@ynao.ac.cn
2014-11-01
We present a newly developed version of BayeSED, a general Bayesian approach to the spectral energy distribution (SED) fitting of galaxies. The new BayeSED code has been systematically tested on a mock sample of galaxies. The comparison between the estimated and input values of the parameters shows that BayeSED can recover the physical parameters of galaxies reasonably well. We then applied BayeSED to interpret the SEDs of a large Ks-selected sample of galaxies in the COSMOS/UltraVISTA field with stellar population synthesis models. Using the new BayeSED code, a Bayesian model comparison of stellar population synthesis models has been performed for the first time. We found that the 2003 model by Bruzual and Charlot, statistically speaking, has greater Bayesian evidence than the 2005 model by Maraston for the Ks-selected sample. In addition, while setting the stellar metallicity as a free parameter obviously increases the Bayesian evidence of both models, varying the initial mass function has a notable effect only on the Maraston model. Meanwhile, the physical parameters estimated with BayeSED are found to be generally consistent with those obtained using the popular grid-based FAST code, while the former parameters exhibit more natural distributions. Based on the estimated physical parameters of the galaxies in the sample, we qualitatively classified the galaxies in the sample into five populations that may represent galaxies at different evolution stages or in different environments. We conclude that BayeSED could be a reliable and powerful tool for investigating the formation and evolution of galaxies from the rich multi-wavelength observations currently available. A binary version of the BayeSED code parallelized with Message Passing Interface is publicly available at https://bitbucket.org/hanyk/bayesed.
The role of water vapor in the ITCZ response to hemispherically asymmetric forcings
NASA Astrophysics Data System (ADS)
Clark, S.; Ming, Y.; Held, I.
2016-12-01
Studies using both comprehensive and simplified models have shown that changes to the inter-hemispheric energy budget can lead to changes in the position of the ITCZ. In these studies, the mean position of the ITCZ tends to shift toward the hemisphere receiving more energy. While included in many studies using comprehensive models, the role of the water vapor-radiation feedback in influencing ITCZ shifts has not been focused on in isolation in an idealized setting. Here we use an aquaplanet idealized moist general circulation model initially developed by Dargan Frierson, without clouds, newly coupled to a full radiative transfer code to investigate the role of water vapor in the ITCZ response to hemispherically asymmetric forcings. We induce a southward ITCZ shift by reducing the incoming solar radiation in the northern hemisphere. To isolate the radiative impact of water vapor, we run simulations where the radiation code sees the prognostic water vapor field, which responds dynamically to temperature, parameterized convection, and the circulation and also run simulations where the radiation code sees a prescribed static climatological water vapor field. We find that under Earth-like climate conditions, a shifting water vapor distribution's interaction with longwave radiation amplifies the latitudinal displacement of the ITCZ in response to a given hemispherically asymmetric forcing roughly by a factor of two; this effect appears robust to the convection scheme used. We argue that this amplifying effect can be explained using the energy flux equator theory for the position of the ITCZ.
Exploring the Perceptions of Newly Credentialed Athletic Trainers as They Transition to Practice.
Walker, Stacy E; Thrasher, Ashley B; Mazerolle, Stephanie M
2016-08-01
Research is limited on the transition to practice of newly credentialed athletic trainers (ATs). Understanding this transition could provide insight to assist employers and professional programs in developing initiatives to enhance the transition. To explore newly credentialed ATs' experiences and feelings during their transition from student to autonomous practitioner. Qualitative study. Individual phone interviews. Thirty-four ATs certified between January and September 2013 participated in this study (18 women, 16 men; age = 23.8 ± 2.1 years; work settings were collegiate, secondary school, clinic, and other). Data saturation guided the number of participants. Participants were interviewed via phone using a semistructured interview guide. All interviews were recorded and transcribed verbatim. Data were analyzed through phenomenologic reduction, with data coded for common themes and subthemes. Credibility was established via member checks, peer review, and intercoder reliability. The 3 themes that emerged from the data were (1) transition to practice preparation, (2) orientation, and (3) mentoring. Transition to practice was rarely discussed during professional preparation, but information on the organization and administration or capstone course (eg, insurance, documentation) assisted participants in their transition. Participants felt that preceptors influenced their transition by providing or hindering the number and quality of patient encounters. Participants from larger collegiate settings reported more formal orientation methods (eg, review policies, procedures manual), whereas those in secondary school, clinic/hospital, and smaller collegiate settings reported informal orientation methods (eg, independent review of policies and procedures, tours). Some participants were assigned a formal mentor, and others engaged in peer mentoring. Employers could enhance the transition to practice by providing formal orientation and mentorship. Professional programs could prepare students for the transition by discussing how to find support and mentoring and by involving preceptors who provide students with opportunities to give patient care.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Seitz, R.R.; Rittmann, P.D.; Wood, M.I.
The US Department of Energy Headquarters established a performance assessment task team (PATT) to integrate the activities of DOE sites that are preparing performance assessments for the disposal of newly generated low-level waste. The PATT chartered a subteam with the task of comparing computer codes and exposure scenarios used for dose calculations in performance assessments. This report documents the efforts of the subteam. Computer codes considered in the comparison include GENII, PATHRAE-EPA, MICROSHIELD, and ISOSHLD. Calculations were also conducted using spreadsheets to provide a comparison at the most fundamental level. Calculations and modeling approaches are compared for unit radionuclide concentrations in water and soil for the ingestion, inhalation, and external dose pathways. Over 30 tables comparing inputs and results are provided.
How families cope with diabetes in adolescence. An approach and case analyses.
Hauser, S T; Paul, E L; Jacobson, A M; Weiss-Perry, B; Vieyra, M A; Rufo, P; Spetter, L D; DiPlacido, J; Wertlieb, D; Wolfsdorf, J
1988-01-01
In this paper we describe our newly constructed Family Coping Coding System. This scheme was constructed to identify family coping strategies that involve appraisal, problem solving, and emotion management dimensions. We discuss the theoretical rationale, meanings and reliability of the coping codes, and illustrate them through excerpts drawn from family discussions of a recent stressful situation (the onset of a chronic or acute illness in an adolescent member). Finally, we consider the clinical research relevance of this new assessment technique, exemplifying this potential with respect to medical compliance. We present analyses of two families with diabetic adolescents who strikingly differ with respect to compliance, and explore which family coping strategies may be predictive of an adolescent's favorable or problematic compliance to diabetes management.
A Rewritable, Random-Access DNA-Based Storage System.
Yazdi, S M Hossein Tabatabaei; Yuan, Yongbo; Ma, Jian; Zhao, Huimin; Milenkovic, Olgica
2015-09-18
We describe the first DNA-based storage architecture that enables random access to data blocks and rewriting of information stored at arbitrary locations within the blocks. The newly developed architecture overcomes drawbacks of existing read-only methods that require decoding the whole file in order to read one data fragment. Our system is based on new constrained coding techniques and accompanying DNA editing methods that ensure data reliability, specificity and sensitivity of access, and at the same time provide exceptionally high data storage capacity. As a proof of concept, we encoded parts of the Wikipedia pages of six universities in the USA, and selected and edited parts of the text written in DNA corresponding to three of these schools. The results suggest that DNA is a versatile medium suitable for both ultrahigh density archival and rewritable storage applications.
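As one minimal example of the "constrained coding" idea (a well-known rotating-code construction shown for illustration only, not necessarily the scheme used in this work), the sketch below maps base-3 digits to DNA bases so that no base ever repeats, which eliminates homopolymer runs.

```python
BASES = "ACGT"

def encode_trits(trits, prev="A"):
    """Map base-3 digits to DNA so that no two adjacent bases are equal."""
    out = []
    for t in trits:
        # choose among the three bases different from the previous one
        choices = [b for b in BASES if b != prev]
        prev = choices[t]
        out.append(prev)
    return "".join(out)

def decode_trits(dna, prev="A"):
    """Invert encode_trits."""
    trits = []
    for b in dna:
        choices = [c for c in BASES if c != prev]
        trits.append(choices.index(b))
        prev = b
    return trits

msg = [2, 0, 1, 1, 2, 0, 0, 1]        # data already converted to base 3
strand = encode_trits(msg)
assert decode_trits(strand) == msg
print(strand)                          # no homopolymer runs by construction
```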
Evaluation of MARC for the analysis of rotating composite blades
NASA Technical Reports Server (NTRS)
Bartos, Karen F.; Ernst, Michael A.
1993-01-01
The suitability of the MARC code for the analysis of rotating composite blades was evaluated using a four-task process. A nonlinear displacement analysis and subsequent eigenvalue analysis were performed on a rotating spring mass system to ensure that displacement-dependent centrifugal forces were accounted for in the eigenvalue analysis. Normal modes analyses were conducted on isotropic plates with various degrees of twist to evaluate MARC's ability to handle blade twist. Normal modes analyses were conducted on flat composite plates to validate the newly developed coupled COBSTRAN-MARC methodology. Finally, normal modes analyses were conducted on four composite propfan blades that were designed, analyzed, and fabricated at NASA Lewis Research Center. Results were compared with experimental data. The research documented herein presents MARC as a viable tool for the analysis of rotating composite blades.
Evaluation of MARC for the analysis of rotating composite blades
NASA Astrophysics Data System (ADS)
Bartos, Karen F.; Ernst, Michael A.
1993-03-01
The suitability of the MARC code for the analysis of rotating composite blades was evaluated using a four-task process. A nonlinear displacement analysis and subsequent eigenvalue analysis were performed on a rotating spring mass system to ensure that displacement-dependent centrifugal forces were accounted for in the eigenvalue analysis. Normal modes analyses were conducted on isotropic plates with various degrees of twist to evaluate MARC's ability to handle blade twist. Normal modes analyses were conducted on flat composite plates to validate the newly developed coupled COBSTRAN-MARC methodology. Finally, normal modes analyses were conducted on four composite propfan blades that were designed, analyzed, and fabricated at NASA Lewis Research Center. Results were compared with experimental data. The research documented herein presents MARC as a viable tool for the analysis of rotating composite blades.
A Rewritable, Random-Access DNA-Based Storage System
NASA Astrophysics Data System (ADS)
Tabatabaei Yazdi, S. M. Hossein; Yuan, Yongbo; Ma, Jian; Zhao, Huimin; Milenkovic, Olgica
2015-09-01
We describe the first DNA-based storage architecture that enables random access to data blocks and rewriting of information stored at arbitrary locations within the blocks. The newly developed architecture overcomes drawbacks of existing read-only methods that require decoding the whole file in order to read one data fragment. Our system is based on new constrained coding techniques and accompanying DNA editing methods that ensure data reliability, specificity and sensitivity of access, and at the same time provide exceptionally high data storage capacity. As a proof of concept, we encoded parts of the Wikipedia pages of six universities in the USA, and selected and edited parts of the text written in DNA corresponding to three of these schools. The results suggest that DNA is a versatile medium suitable for both ultrahigh density archival and rewritable storage applications.
Alternate operating scenarios for NDCX-II
NASA Astrophysics Data System (ADS)
Sharp, W. M.; Friedman, A.; Grote, D. P.; Cohen, R. H.; Lund, S. M.; Vay, J.-L.; Waldron, W. L.
2014-01-01
NDCX-II is a newly completed accelerator facility at LBNL, built to study ion-heated warm dense matter, as well as aspects of ion-driven targets and intense-beam dynamics for inertial-fusion energy. The baseline design calls for using 12 induction cells to accelerate 30-50 nC of Li+ ions to 1.2 MeV. During commissioning, though, we plan to extend the source lifetime by extracting less total charge. Over time, we expect that NDCX-II will be upgraded to substantially higher energies, necessitating the use of heavier ions to keep a suitable deposition range in targets. For operational flexibility, the option of using a helium plasma source is also being investigated. Each of these options requires development of an alternate acceleration schedule. The schedules here are worked out with a fast-running 1-D particle-in-cell code ASP.
NASA Technical Reports Server (NTRS)
Balakrishnan, L.; Abdol-Hamid, Khaled S.
1992-01-01
Compressible jet plumes were studied using a two-equation turbulence model. A space marching procedure based on an upwind numerical scheme was used to solve the governing equations and turbulence transport equations. The computed results indicate that extending the space marching procedure for solving supersonic/subsonic mixing problems can be stable, efficient and accurate. Moreover, a newly developed correction for compressible dissipation has been verified in fully expanded and underexpanded jet plumes. For a sonic jet plume, no improvement in results over the standard two-equation model was seen. However for a supersonic jet plume, the correction due to compressible dissipation successfully predicted the reduced spreading rate of the jet compared to the sonic case. The computed results were generally in good agreement with the experimental data.
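One widely cited form of a compressible-dissipation correction (a Sarkar-type model, shown here only to illustrate the idea since the abstract does not spell out the exact formulation used) augments the solenoidal dissipation rate by a term proportional to the squared turbulent Mach number:

```python
def corrected_dissipation(eps_solenoidal, k, sound_speed, alpha1=1.0):
    """Augment the solenoidal dissipation rate with a compressible
    (dilatational) contribution, eps_total = eps_s * (1 + alpha1 * Mt**2),
    where Mt**2 = 2*k / a**2 is the squared turbulent Mach number.
    A Sarkar-type form; alpha1 is a model constant (illustrative value)."""
    mt2 = 2.0 * k / sound_speed**2
    return eps_solenoidal * (1.0 + alpha1 * mt2)

# The added term grows with turbulent Mach number, which is what reduces
# the predicted spreading rate of the supersonic jet relative to the sonic case.
print(corrected_dissipation(eps_solenoidal=500.0, k=300.0, sound_speed=340.0))
```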
Heinemann, Allen W; Miskovic, Ana; Semik, Patrick; Wong, Alex; Dashner, Jessica; Baum, Carolyn; Magasi, Susan; Hammel, Joy; Tulsky, David S; Garcia, Sofia F; Jerousek, Sara; Lai, Jin-Shei; Carlozzi, Noelle E; Gray, David B
2016-12-01
To describe the unique and overlapping content of the newly developed Environmental Factors Item Banks (EFIB) and 7 legacy environmental factor instruments, and to evaluate the EFIB's construct validity by examining associations with legacy instruments. Cross-sectional, observational cohort. Community. A sample of community-dwelling adults with stroke, spinal cord injury, and traumatic brain injury (N=568). None. EFIB covering domains of the built and natural environment; systems, services, and policies; social environment; and access to information and technology; the Craig Hospital Inventory of Environmental Factors (CHIEF) short form; the Facilitators and Barriers Survey/Mobility (FABS/M) short form; the Home and Community Environment Instrument (HACE); the Measure of the Quality of the Environment (MQE) short form; and 3 of the Patient Reported Outcomes Measurement Information System's (PROMIS) Quality of Social Support measures. The EFIB and legacy instruments assess most of the International Classification of Functioning, Disability and Health (ICF) environmental factors chapters, including chapter 1 (products and technology; 75 items corresponding to 11 codes), chapter 2 (natural environment and human-made changes; 31 items corresponding to 7 codes), chapter 3 (support and relationships; 74 items corresponding to 7 codes), chapter 4 (attitudes; 83 items corresponding to 8 codes), and chapter 5 (services, systems, and policies; 72 items corresponding to 16 codes). Construct validity is provided by moderate correlations between EFIB measures and the CHIEF, MQE barriers, HACE technology mobility, FABS/M community built features, and PROMIS item banks and by small correlations with other legacy instruments. Only 5 of the 66 legacy instrument correlation coefficients are moderate, suggesting they measure unique aspects of the environment, whereas all intra-EFIB correlations were at least moderate. The EFIB measures provide a brief and focused assessment of ICF environmental factor chapters. The pattern of correlations with legacy instruments provides initial evidence of construct validity. Copyright © 2016 American Congress of Rehabilitation Medicine. Published by Elsevier Inc. All rights reserved.
2013-01-01
Background Significant efforts have been made to address the problem of identifying short genes in prokaryotic genomes. However, most known methods are not effective in detecting short genes. Because of the limited information contained in short DNA sequences, it is very difficult to accurately distinguish between protein coding and non-coding sequences in prokaryotic genomes. We have developed a new Iteratively Adaptive Sparse Partial Least Squares (IASPLS) algorithm as the classifier to improve the accuracy of the identification process. Results For testing, we chose the short coding and non-coding sequences from seven prokaryotic organisms. We used seven feature sets (including GC content, Z-curve, etc.) of short genes. In comparison with GeneMarkS, Metagene, Orphelia, and Heuristic Approach methods, our model achieved the best prediction performance in identification of short prokaryotic genes. Even when we focused on the very short length group ([60–100 nt)), our model provided sensitivity as high as 83.44% and specificity as high as 92.8%. These values are two or three times higher than those of three of the other methods, while Metagene fails to recognize genes in this length range. The experiments also proved that the IASPLS can improve the identification accuracy in comparison with other widely used classifiers, i.e. Logistic, Random Forest (RF) and K nearest neighbors (KNN). The accuracy in using IASPLS was improved by 5.90% or more in comparison with the other methods. In addition to the improvements in accuracy, IASPLS required ten times less computer time than using KNN or RF. Conclusions It is conclusive that our method is preferable for application as an automated method of short gene classification. Its linearity and easily optimized parameters make it practicable for predicting short genes of newly-sequenced or under-studied species. Reviewers This article was reviewed by Alexey Kondrashov, Rajeev Azad (nominated by Dr J.Peter Gogarten) and Yuriy Fofanov (nominated by Dr Janet Siefert). PMID:24067167
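Two of the feature sets named above are simple enough to compute directly; the sketch below (a generic illustration, not the authors' feature-extraction code) computes the GC content and the cumulative Z-curve coordinates of a toy DNA fragment.

```python
def gc_content(seq):
    """Fraction of G and C bases in a DNA sequence."""
    seq = seq.upper()
    return (seq.count("G") + seq.count("C")) / len(seq)

def z_curve(seq):
    """Cumulative Z-curve coordinates (x, y, z) of a DNA sequence:
    x: purine vs. pyrimidine, y: amino vs. keto, z: weak vs. strong H-bond."""
    x = y = z = 0
    coords = []
    for b in seq.upper():
        x += 1 if b in "AG" else -1
        y += 1 if b in "AC" else -1
        z += 1 if b in "AT" else -1
        coords.append((x, y, z))
    return coords

seq = "ATGGCGTTTAACGGC"        # a toy fragment, not a real gene
print(gc_content(seq), z_curve(seq)[-1])
```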
Reactor Pressure Vessel Fracture Analysis Capabilities in Grizzly
DOE Office of Scientific and Technical Information (OSTI.GOV)
Spencer, Benjamin; Backman, Marie; Chakraborty, Pritam
2015-03-01
Efforts have been underway to develop fracture mechanics capabilities in the Grizzly code to enable it to be used to perform deterministic fracture assessments of degraded reactor pressure vessels (RPVs). Development in prior years has resulted in a capability to calculate J-integrals. For this application, these are used to calculate stress intensity factors for cracks to be used in deterministic linear elastic fracture mechanics (LEFM) assessments of fracture in degraded RPVs. The J-integral can only be used to evaluate stress intensity factors for axis-aligned flaws because it can only be used to obtain the stress intensity factor for pure Mode I loading. Off-axis flaws will be subjected to mixed-mode loading. For this reason, work has continued to expand the set of fracture mechanics capabilities to permit it to evaluate off-axis flaws. This report documents the following work to enhance Grizzly’s engineering fracture mechanics capabilities for RPVs: • Interaction integral and T-stress: To obtain mixed-mode stress intensity factors, a capability to evaluate interaction integrals for 2D or 3D flaws has been developed. A T-stress evaluation capability has been developed to evaluate the constraint at crack tips in 2D or 3D. Initial verification testing of these capabilities is documented here. • Benchmarking for axis-aligned flaws: Grizzly’s capabilities to evaluate stress intensity factors for axis-aligned flaws have been benchmarked against calculations for the same conditions in FAVOR. • Off-axis flaw demonstration: The newly-developed interaction integral capabilities are demonstrated in an application to calculate the mixed-mode stress intensity factors for off-axis flaws. • Other code enhancements: Other enhancements to the thermomechanics capabilities that relate to the solution of the engineering RPV fracture problem are documented here.
Simulations of Turbulent Momentum and Scalar Transport in Confined Swirling Coaxial Jets
NASA Technical Reports Server (NTRS)
Shih, Tsan-Hsing; Liu, Nan-Suey
2014-01-01
This paper presents the numerical simulations of confined three dimensional coaxial water jets. The objectives are to validate the newly proposed nonlinear turbulence models of momentum and scalar transport, and to evaluate the newly introduced scalar APDF and DWFDF equation along with its Eulerian implementation in the National Combustion Code (NCC). Simulations conducted include the steady RANS, the unsteady RANS (URANS), and the time-filtered Navier-Stokes (TFNS) with and without invoking the APDF or DWFDF equation. When the APDF or DWFDF equation is invoked, the simulations are of a hybrid nature, i.e., the transport equations of energy and species are replaced by the APDF or DWFDF equation. Results of simulations are compared with the available experimental data. Some positive impacts of the nonlinear turbulence models and the Eulerian scalar APDF and DWFDF approach are observed.
ITG-TEM turbulence simulation with bounce-averaged kinetic electrons in tokamak geometry
NASA Astrophysics Data System (ADS)
Kwon, Jae-Min; Qi, Lei; Yi, S.; Hahm, T. S.
2017-06-01
We develop a novel numerical scheme to simulate electrostatic turbulence with kinetic electron responses in magnetically confined toroidal plasmas. Focusing on ion gyro-radius scale turbulences with slower frequencies than the time scales for electron parallel motions, we employ and adapt the bounce-averaged kinetic equation to model trapped electrons for nonlinear turbulence simulation with Coulomb collisions. Ions are modeled by employing the gyrokinetic equation. The newly developed scheme is implemented on a global δf particle in cell code gKPSP. By performing linear and nonlinear simulations, it is demonstrated that the new scheme can reproduce key physical properties of Ion Temperature Gradient (ITG) and Trapped Electron Mode (TEM) instabilities, and resulting turbulent transport. The overall computational cost of kinetic electrons using this novel scheme is limited to 200%-300% of the cost for simulations with adiabatic electrons. Therefore the new scheme allows us to perform kinetic simulations with trapped electrons very efficiently in magnetized plasmas.
NASA Technical Reports Server (NTRS)
Oaks, O. J.; Reid, Wilson; Wright, James; Duffey, Christopher; Williams, Charles; Warren, Hugh; Zeh, Tom; Buisson, James
1996-01-01
The Naval Research Laboratory (NRL), in the development of timing systems for remote locations, had a technical requirement for a Y code (SA/AS) Global Positioning System (GPS) precise time transfer receiver (TTR) which could be used in either a stationary or a mobile mode. A contract was awarded to the Stanford Telecommunication Corporation (STEL) to build such a device. The Eastern Range (ER) also had a requirement for such a receiver and entered into the contract with NRL for the procurement of additional receivers. The Moving Vehicle Experiment (MVE) described in this paper is the first in situ test of the STEL Model 5401C Time Transfer System in both stationary and mobile operations. The primary objective of the MVE was to test the timing accuracy of the newly developed GPS TTR aboard a moving vessel. To accomplish this objective, a joint experiment was performed with personnel from NRL and the ER at the Atlantic Undersea Test and Evaluation Center (AUTEC) test range at Andros Island. Results and discussion of the test are presented in this paper.
Atkinson, Mark J; Lohs, Jan; Kuhagen, Ilka; Kaufman, Julie; Bhaidani, Shamsu
2006-01-01
Objectives This proof of concept (POC) study was designed to evaluate the use of an Internet-based bulletin board technology to aid parallel cross-cultural development of thematic content for a new set of patient-reported outcome measures (PROs). Methods The POC study, conducted in Germany and the United States, utilized Internet Focus Groups (IFGs) to assure the validity of new PRO items across the two cultures – all items were designed to assess the impact of excess facial oil on individuals' lives. The on-line IFG activities were modeled after traditional face-to-face focus groups and organized by a common 'Topic' Guide designed with input from thought leaders in dermatology and health outcomes research. The two sets of IFGs were professionally moderated in the native language of each country. IFG moderators coded the thematic content of transcripts, and a frequency analysis of code endorsement was used to identify areas of content similarity and difference between the two countries. Based on this information, draft PRO items were designed and a majority (80%) of the original participants returned to rate the relative importance of the newly designed questions. Findings The use of parallel cross-cultural content analysis of IFG transcripts permitted identification of the major content themes in each country as well as exploration of the possible reasons for any observed differences between the countries. Results from coded frequency counts and transcript reviews informed the design and wording of the test questions for the future PRO instrument(s). Subsequent ratings of item importance also deepened our understanding of potential areas of cross-cultural difference, differences that would be explored over the course of future validation studies involving these PROs. Conclusion The use of IFGs for cross-cultural content development received positive reviews from participants and was found to be both cost and time effective. The novel thematic coding methodology provided an empirical platform on which to develop culturally sensitive questionnaire content using the natural language of participants. Overall, the IFG responses and thematic analyses provided a thorough evaluation of similarities and differences in cross-cultural themes, which in turn acted as a sound base for the development of new PRO questionnaires. PMID:16995935
Atkinson, Mark J; Lohs, Jan; Kuhagen, Ilka; Kaufman, Julie; Bhaidani, Shamsu
2006-09-22
This proof of concept (POC) study was designed to evaluate the use of an Internet-based bulletin board technology to aid parallel cross-cultural development of thematic content for a new set of patient-reported outcome measures (PROs). The POC study, conducted in Germany and the United States, utilized Internet Focus Groups (IFGs) to assure the validity of new PRO items across the two cultures--all items were designed to assess the impact of excess facial oil on individuals' lives. The on-line IFG activities were modeled after traditional face-to-face focus groups and organized by a common 'Topic' Guide designed with input from thought leaders in dermatology and health outcomes research. The two sets of IFGs were professionally moderated in the native language of each country. IFG moderators coded the thematic content of transcripts, and a frequency analysis of code endorsement was used to identify areas of content similarity and difference between the two countries. Based on this information, draft PRO items were designed and a majority (80%) of the original participants returned to rate the relative importance of the newly designed questions. The use of parallel cross-cultural content analysis of IFG transcripts permitted identification of the major content themes in each country as well as exploration of the possible reasons for any observed differences between the countries. Results from coded frequency counts and transcript reviews informed the design and wording of the test questions for the future PRO instrument(s). Subsequent ratings of item importance also deepened our understanding of potential areas of cross-cultural difference, differences that would be explored over the course of future validation studies involving these PROs. The use of IFGs for cross-cultural content development received positive reviews from participants and was found to be both cost and time effective. The novel thematic coding methodology provided an empirical platform on which to develop culturally sensitive questionnaire content using the natural language of participants. Overall, the IFG responses and thematic analyses provided a thorough evaluation of similarities and differences in cross-cultural themes, which in turn acted as a sound base for the development of new PRO questionnaires.
Adaptive evolution of the Hox gene family for development in bats and dolphins.
Liang, Lu; Shen, Yong-Yi; Pan, Xiao-Wei; Zhou, Tai-Cheng; Yang, Chao; Irwin, David M; Zhang, Ya-Ping
2013-01-01
Bats and cetaceans (i.e., whales, dolphins, porpoises) are two kinds of mammals with unique locomotive styles and occupy novel niches. Bats are the only mammals capable of sustained flight in the sky, while cetaceans have returned to the aquatic environment and are specialized for swimming. Associated with these novel adaptations to their environment, various development changes have occurred to their body plans and associated structures. Given the importance of Hox genes in many aspects of embryonic development, we conducted an analysis of the coding regions of all Hox gene family members from bats (represented by Pteropus vampyrus, Pteropus alecto, Myotis lucifugus and Myotis davidii) and cetaceans (represented by Tursiops truncatus) for adaptive evolution using the available draft genome sequences. Differences in the selective pressures acting on many Hox genes in bats and cetaceans were found compared to other mammals. Positive selection, however, was not found to act on any of the Hox genes in the common ancestor of bats and only upon Hoxb9 in cetaceans. PCR amplification data from additional bat and cetacean species, and application of the branch-site test 2, showed that the Hoxb2 gene within bats had significant evidence of positive selection. Thus, our study, with genomic and newly sequenced Hox genes, identifies two candidate Hox genes that may be closely linked with developmental changes in bats and cetaceans, such as those associated with the pancreatic, neuronal, thymus shape and forelimb. In addition, the difference in our results from the genome-wide scan and newly sequenced data reveals that great care must be taken in interpreting results from draft genome data from a limited number of species, and deep genetic sampling of a particular clade is a powerful tool for generating complementary data to address this limitation.
Adaptive Evolution of the Hox Gene Family for Development in Bats and Dolphins
Pan, Xiao-Wei; Zhou, Tai-Cheng; Yang, Chao; Irwin, David M.; Zhang, Ya-Ping
2013-01-01
Bats and cetaceans (i.e., whales, dolphins, porpoises) are two kinds of mammals with unique locomotive styles and occupy novel niches. Bats are the only mammals capable of sustained flight in the sky, while cetaceans have returned to the aquatic environment and are specialized for swimming. Associated with these novel adaptations to their environment, various development changes have occurred to their body plans and associated structures. Given the importance of Hox genes in many aspects of embryonic development, we conducted an analysis of the coding regions of all Hox gene family members from bats (represented by Pteropus vampyrus, Pteropus alecto, Myotis lucifugus and Myotis davidii) and cetaceans (represented by Tursiops truncatus) for adaptive evolution using the available draft genome sequences. Differences in the selective pressures acting on many Hox genes in bats and cetaceans were found compared to other mammals. Positive selection, however, was not found to act on any of the Hox genes in the common ancestor of bats and only upon Hoxb9 in cetaceans. PCR amplification data from additional bat and cetacean species, and application of the branch-site test 2, showed that the Hoxb2 gene within bats had significant evidence of positive selection. Thus, our study, with genomic and newly sequenced Hox genes, identifies two candidate Hox genes that may be closely linked with developmental changes in bats and cetaceans, such as those associated with the pancreatic, neuronal, thymus shape and forelimb. In addition, the difference in our results from the genome-wide scan and newly sequenced data reveals that great care must be taken in interpreting results from draft genome data from a limited number of species, and deep genetic sampling of a particular clade is a powerful tool for generating complementary data to address this limitation. PMID:23825528
The Chandra Source Catalog: Algorithms
NASA Astrophysics Data System (ADS)
McDowell, Jonathan; Evans, I. N.; Primini, F. A.; Glotfelty, K. J.; McCollough, M. L.; Houck, J. C.; Nowak, M. A.; Karovska, M.; Davis, J. E.; Rots, A. H.; Siemiginowska, A. L.; Hain, R.; Evans, J. D.; Anderson, C. S.; Bonaventura, N. R.; Chen, J. C.; Doe, S. M.; Fabbiano, G.; Galle, E. C.; Gibbs, D. G., II; Grier, J. D.; Hall, D. M.; Harbo, P. N.; He, X.; Lauer, J.; Miller, J. B.; Mitschang, A. W.; Morgan, D. L.; Nichols, J. S.; Plummer, D. A.; Refsdal, B. L.; Sundheim, B. A.; Tibbetts, M. S.; van Stone, D. W.; Winkelman, S. L.; Zografou, P.
2009-09-01
Creation of the Chandra Source Catalog (CSC) required adjustment of existing pipeline processing, adaptation of existing interactive analysis software for automated use, and development of entirely new algorithms. Data calibration was based on the existing pipeline, but more rigorous data cleaning was applied and the latest calibration data products were used. For source detection, a local background map was created including the effects of ACIS source readout streaks. The existing wavelet source detection algorithm was modified and a set of post-processing scripts used to correct the results. To analyse the source properties we ran the SAOTrace ray-trace code for each source to generate a model point spread function, allowing us to find encircled energy correction factors and estimate source extent. Further algorithms were developed to characterize the spectral, spatial and temporal properties of the sources and to estimate the confidence intervals on count rates and fluxes. Finally, sources detected in multiple observations were matched, and best estimates of their merged properties derived. In this paper we present an overview of the algorithms used, with more detailed treatment of some of the newly developed algorithms presented in companion papers.
OpenMP-accelerated SWAT simulation using Intel C and FORTRAN compilers: Development and benchmark
NASA Astrophysics Data System (ADS)
Ki, Seo Jin; Sugimura, Tak; Kim, Albert S.
2015-02-01
We developed a practical method to accelerate execution of Soil and Water Assessment Tool (SWAT) using open (free) computational resources. The SWAT source code (rev 622) was recompiled using a non-commercial Intel FORTRAN compiler in Ubuntu 12.04 LTS Linux platform, and newly named iOMP-SWAT in this study. GNU utilities of make, gprof, and diff were used to develop the iOMP-SWAT package, profile memory usage, and check identicalness of parallel and serial simulations. Among 302 SWAT subroutines, the slowest routines were identified using GNU gprof, and later modified using the Open Multi-Processing (OpenMP) library in an 8-core shared memory system. In addition, a C wrapping function was used to rapidly set large arrays to zero by cross compiling with the original SWAT FORTRAN package. A universal speedup ratio of 2.3 was achieved using input data sets of a large number of hydrological response units. As we specifically focus on acceleration of a single SWAT run, the use of iOMP-SWAT for parameter calibrations will significantly improve the performance of SWAT optimization.
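As a rough back-of-envelope check (our illustration, not an analysis from the paper), Amdahl's law can be inverted to ask what effective parallel fraction of the run time is implied by a speedup of about 2.3 on 8 cores:

```python
def amdahl_speedup(parallel_fraction, n_cores):
    """Ideal speedup predicted by Amdahl's law."""
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / n_cores)

def parallel_fraction_from_speedup(speedup, n_cores):
    """Invert Amdahl's law to estimate the parallelized fraction of run time."""
    return (1.0 - 1.0 / speedup) / (1.0 - 1.0 / n_cores)

p = parallel_fraction_from_speedup(speedup=2.3, n_cores=8)
print(f"implied parallel fraction ~ {p:.2f}")   # roughly 0.65
print(amdahl_speedup(p, 8))                     # recovers ~2.3
```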
MicroRNA-mediated regulatory circuits: outlook and perspectives
NASA Astrophysics Data System (ADS)
Cora', Davide; Re, Angela; Caselle, Michele; Bussolino, Federico
2017-08-01
MicroRNAs have been found to be necessary for regulating genes implicated in almost all signaling pathways, and consequently their dysfunction influences many diseases, including cancer. Understanding of the complexity of the microRNA-mediated regulatory network has grown in terms of size, connectivity and dynamics with the development of computational and, more recently, experimental high-throughput approaches for microRNA target identification. Newly developed studies on recurrent microRNA-mediated circuits in regulatory networks, also known as network motifs, have substantially contributed to addressing this complexity, and therefore to helping understand the ways by which microRNAs achieve their regulatory role. This review provides a summarizing view of the state-of-the-art, and perspectives of research efforts on microRNA-mediated regulatory motifs. In this review, we discuss the topological properties characterizing different types of circuits, and the regulatory features theoretically enabled by such properties, with a special emphasis on examples of circuits typifying their biological significance in experimentally validated contexts. Finally, we will consider possible future developments, in particular regarding microRNA-mediated circuits involving long non-coding RNAs and epigenetic regulators.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rodriguez, Sal
2017-08-24
The code (a computer program written as a Matlab script) uses a unique set of n independent equations to solve for n turbulence variables. The code requires the input of a characteristic dimension, a characteristic fluid velocity, the fluid dynamic viscosity, and the fluid density. Most importantly, the code estimates the size of three key turbulent eddies: Kolmogorov, Taylor, and integral. Based on the eddy sizes, dimple dimensions are prescribed such that the key eddies (principally Taylor, and sometimes Kolmogorov) can be generated by the dimple rim and flow unimpeded through the dimple’s concave cavity. It is hypothesized that turbulent eddies are generated by the dimple rim at the dimple-surface interface. The newly-generated eddies in turn entrain the movement of surrounding regions of fluid, creating more mixing. The eddies also generate lift near the wall surrounding the dimple, as they accelerate and reduce pressure in the regions near and at the dimple cavity, thereby minimizing the fluid drag.
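The original code is a Matlab script; the Python sketch below uses standard isotropic-turbulence engineering estimates (assumed relations and constants, not necessarily those in the Sandia script) to size the integral, Taylor, and Kolmogorov scales from the same four inputs.

```python
import math

def eddy_scales(U, L, mu, rho, turb_intensity=0.05, c_mu=0.09):
    """Estimate integral, Taylor, and Kolmogorov length scales from a
    characteristic velocity U (m/s), dimension L (m), dynamic viscosity
    mu (Pa*s), and density rho (kg/m^3).

    Standard engineering estimates (assumptions, not the Sandia script):
      k     = 1.5*(I*U)**2                  turbulent kinetic energy
      l_int = 0.07*L                        integral length scale
      eps   = c_mu**0.75 * k**1.5 / l_int   dissipation rate
      lam   = sqrt(10*nu*k/eps)             Taylor microscale
      eta   = (nu**3/eps)**0.25             Kolmogorov scale
    """
    nu = mu / rho                      # kinematic viscosity (m^2/s)
    k = 1.5 * (turb_intensity * U) ** 2
    l_int = 0.07 * L
    eps = c_mu ** 0.75 * k ** 1.5 / l_int
    lam = math.sqrt(10.0 * nu * k / eps)
    eta = (nu ** 3 / eps) ** 0.25
    return l_int, lam, eta

# Example: water (mu ~ 1e-3 Pa*s, rho ~ 1000 kg/m^3) at 2 m/s over a 5 cm dimpled surface.
print(eddy_scales(U=2.0, L=0.05, mu=1.0e-3, rho=1000.0))
```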
MELCOR/CONTAIN LMR Implementation Report - FY16 Progress.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Louie, David; Humphries, Larry L.
2016-11-01
This report describes the progress of the CONTAIN-LMR sodium physics and chemistry models to be implemented in MELCOR 2.1. In the past three years, the implementation included the addition of sodium equations of state and sodium properties from two different sources. The first source is based on the previous work done by Idaho National Laboratory by modifying MELCOR to include liquid lithium equation of state as a working fluid to model the nuclear fusion safety research. The second source uses properties generated for the SIMMER code. The implemented modeling has been tested and results are reported in this document. In addition, the CONTAIN-LMR code was derived from an early version of the CONTAIN code, and many physical models that were developed since this early version of CONTAIN are not available in this early code version. Therefore, CONTAIN 2 has been updated with the sodium models in CONTAIN-LMR as CONTAIN2-LMR, which may be used to provide code-to-code comparison with CONTAIN-LMR and MELCOR when the sodium chemistry models from CONTAIN-LMR have been completed. Both the spray fire and pool fire chemistry routines from CONTAIN-LMR have been integrated into MELCOR 2.1, and debugging and testing are in progress. Because MELCOR only models the equation of state for liquid and gas phases of the coolant, a modeling gap still exists when dealing with experiments or accident conditions that take place when the ambient temperature is below the freezing point of sodium. An alternative method is under investigation to overcome this gap. We are no longer working on the separate branch from the main branch of MELCOR 2.1 since the major modeling of MELCOR 2.1 has been completed. At the current stage, the newly implemented sodium chemistry models will be a part of the main MELCOR release version (MELCOR 2.2). This report will discuss the accomplishments and issues relating to the implementation. Also, we will report on the planned completion of all remaining tasks in the upcoming FY2017, including the atmospheric chemistry model and sodium-concrete interaction model implementation.
An Application of Holland’s Occupational Codes to Air Force Officer Career Fields.
1985-09-01
allows men to intelligently counteract the arbitrary decisions of their corporations, to address themselves as private persons, and to restore the balance...57) (23:114). Gottfredson, an associate of Holland, in a study of newly hired bank tellers, found that congruence of VPI score and occupation type...Campbell, David P. and Jo-Ida C. Hansen. Manual for the SVIB-SCII. Stanford: Stanford University Press, 1981. 4. Gottfredson, Gary D. and others. Dictionary
Design and Implementation of Secure and Reliable Communication using Optical Wireless Communication
NASA Astrophysics Data System (ADS)
Saadi, Muhammad; Bajpai, Ambar; Zhao, Yan; Sangwongngam, Paramin; Wuttisittikulkij, Lunchakorn
2014-11-01
Wireless networking increases flexibility in the home and office environment by connecting to the internet without wires, but at the cost of risks associated with data theft or the threat of loading malicious code with the intention of harming the network. In this paper, we propose a novel method of establishing a secure and reliable communication link using optical wireless communication (OWC). For security, spatial-diversity-based transmission using two optical transmitters is used, and reliability in the link is achieved by a newly proposed method for the construction of a structured parity-check matrix for binary Low Density Parity Check (LDPC) codes. Experimental results show that a successful secure and reliable link between the transmitter and the receiver can be achieved by using the proposed novel technique.
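For context, structured (e.g., quasi-cyclic) parity-check matrices are typically assembled from shifted identity blocks. The sketch below shows one generic construction of that kind; the base matrix of shift values and the expansion factor are arbitrary illustrations, not the construction proposed in the paper.

```python
import numpy as np

def circulant_perm(size, shift):
    """size x size identity matrix cyclically shifted by `shift` columns;
    shift < 0 denotes an all-zero block."""
    if shift < 0:
        return np.zeros((size, size), dtype=int)
    return np.roll(np.eye(size, dtype=int), shift, axis=1)

def qc_ldpc_H(base_matrix, block_size):
    """Expand a base matrix of shift values into a binary parity-check matrix."""
    rows = [np.hstack([circulant_perm(block_size, s) for s in row])
            for row in base_matrix]
    return np.vstack(rows)

# Arbitrary 2x4 base matrix of shifts (-1 = zero block), expansion factor 5.
base = [[0, 1, -1, 2],
        [3, -1, 0, 1]]
H = qc_ldpc_H(base, block_size=5)
print(H.shape)            # (10, 20): a structured H for a rate-1/2-style code
```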
The molecular basis for attractive salt-taste coding in Drosophila.
Zhang, Yali V; Ni, Jinfei; Montell, Craig
2013-06-14
Below a certain level, table salt (NaCl) is beneficial for animals, whereas excessive salt is harmful. However, it remains unclear how low- and high-salt taste perceptions are differentially encoded. We identified a salt-taste coding mechanism in Drosophila melanogaster. Flies use distinct types of gustatory receptor neurons (GRNs) to respond to different concentrations of salt. We demonstrated that a member of the newly discovered ionotropic glutamate receptor (IR) family, IR76b, functioned in the detection of low salt and was a Na(+) channel. The loss of IR76b selectively impaired the attractive pathway, leaving salt-aversive GRNs unaffected. Consequently, low salt became aversive. Our work demonstrated that the opposing behavioral responses to low and high salt were determined largely by an elegant bimodal switch system operating in GRNs.
Functional expression of amine oxidase from Aspergillus niger (AO-I) in Saccharomyces cerevisiae.
Kolaríková, Katerina; Galuszka, Petr; Sedlárová, Iva; Sebela, Marek; Frébort, Ivo
2009-01-01
The aim of this work was to prepare recombinant amine oxidase from Aspergillus niger after overexpressing it in yeast. The yeast expression vector pDR197 that includes a constitutive PMA1 promoter was used for the expression in Saccharomyces cerevisiae. Recombinant amine oxidase was extracted from the growth medium of the yeast, purified to homogeneity and identified by activity assay and MALDI-TOF peptide mass fingerprinting. A similarity search in the newly published A. niger genome identified six genes coding for copper amine oxidase, two of them corresponding to the previously described enzymes AO-I and a methylamine oxidase, and three other genes coding for FAD amine oxidases. Thus, A. niger possesses an enormous metabolic gear to grow on amine compounds and thus support its saprophytic lifestyle.
NASA Astrophysics Data System (ADS)
Elkhateeb, M. M.; Nouh, M. I.; Nelson, R. H.
2015-02-01
A first photometric study of the newly discovered systems USNO-B1.0 1091-0130715 and GSC-03449-0680 was carried out by means of a recent Windows-interface version of the Wilson-Devinney code based on the model atmospheres of Kurucz (1993). The accepted models yield absolute parameters for both systems, which are used to derive the spectral types of the system components and their evolutionary status. Distances and physical properties were estimated for each system. Comparisons of the computed physical parameters with stellar models are discussed. The components of the system USNO-B1.0 1091-0130715 and the primary of the system GSC-03449-0680 are found to be on or near the ZAMS track, while the secondary of the GSC-03449-0680 system is found to be severely underluminous and too cool for its ZAMS mass.
Kmieciak, Wioletta; Szewczyk, Eligia M; Ciszewski, Marcin
2016-07-01
The paper presents an analysis of 51 Staphylococcus pseudintermedius strains clinically isolated from humans and from animals. The strains' ability to produce β-haemolysin was evaluated with phenotypic methods (hot-cold effect, reverse CAMP test). To determine the presence of the hlb gene (coding for β-haemolysin) in genomic DNA, PCR reactions were conducted with two different pairs of primers: one described in the literature for Staphylococcus aureus and recommended for analysing SIG-group staphylococci, and a newly designed pair created in CLC Main Workbench software. Only reactions with the newly designed primers resulted in product amplification, and the amplification results were fully compatible with the results of the phenotypic β-haemolysin test. Negative results for the S. aureus and S. intermedius reference ATCC strains suggest that, after further analysis, the fragment of the hlb gene amplified with the primers described in this study might be included in the identification of S. pseudintermedius strains.
NASA Technical Reports Server (NTRS)
Shih, Tsan-Hsing; Liu, Nan-Suey; Moder, Jeffrey P.
2015-01-01
This paper presents the numerical simulations of confined three-dimensional coaxial water jets. The objectives are to validate the newly proposed nonlinear turbulence models of momentum and scalar transport, and to evaluate the newly introduced scalar APDF and DWFDF equation along with its Eulerian implementation in the National Combustion Code (NCC). Simulations conducted include the steady RANS, the unsteady RANS (URANS), and the time-filtered Navier-Stokes (TFNS); both without and with invoking the APDF or DWFDF equation. When the APDF (ensemble averaged probability density function) or DWFDF (density weighted filtered density function) equation is invoked, the simulations are of a hybrid nature, i.e., the transport equations of energy and species are replaced by the APDF or DWFDF equation. Results of simulations are compared with the available experimental data. Some positive impacts of the nonlinear turbulence models and the Eulerian scalar APDF and DWFDF approach are observed.
Barta, Andrea; Kalyna, Maria; Reddy, Anireddy S N
2010-09-01
Growing interest in alternative splicing in plants and the extensive sequencing of new plant genomes necessitate more precise definition and classification of genes coding for splicing factors. SR proteins are a family of RNA binding proteins, which function as essential factors for constitutive and alternative splicing. We propose a unified nomenclature for plant SR proteins, taking into account the newly revised nomenclature of the mammalian SR proteins and a number of plant-specific properties of the plant proteins. We identify six subfamilies of SR proteins in Arabidopsis thaliana and rice (Oryza sativa), three of which are plant specific. The proposed subdivision of plant SR proteins into different subfamilies will allow grouping of paralogous proteins and simple assignment of newly discovered SR orthologs from other plant species and will promote functional comparisons in diverse plant species.
VVER-440 and VVER-1000 reactor dosimetry benchmark - BUGLE-96 versus ALPAN VII.0
DOE Office of Scientific and Technical Information (OSTI.GOV)
Duo, J. I.
2011-07-01
Document available in abstract form only; full text of document follows: Analytical results of the vodo-vodyanoi energetichesky reactor (VVER)-440 and VVER-1000 reactor dosimetry benchmarks developed from engineering mockups at the Nuclear Research Inst. Rez LR-0 reactor are discussed. These benchmarks provide accurate determination of radiation field parameters in the vicinity and over the thickness of the reactor pressure vessel. Measurements are compared to results calculated with two sets of tools: the TORT discrete ordinates code and BUGLE-96 cross-section library versus the newly developed Westinghouse RAPTOR-M3G code and ALPAN VII.0 library. The parallel code RAPTOR-M3G enables detailed neutron distributions in energy and space to be computed in reduced computational time. The ALPAN VII.0 cross-section library is based on ENDF/B-VII.0 and is designed for reactor dosimetry applications. It uses a unique broad group structure to enhance resolution in the thermal-neutron-energy range compared to other analogous libraries. The comparison of fast neutron (E > 0.5 MeV) results shows good agreement (within 10%) between the BUGLE-96 and ALPAN VII.0 libraries. Furthermore, the results compare well with analogous results of participants of the REDOS program (2005). Finally, the analytical results for fast neutrons agree within 15% with the measurements for most locations in all three mockups. In general, however, the analytical results underestimate the attenuation through the reactor pressure vessel thickness compared to the measurements. (authors)
Peterson, M.D.; Mueller, C.S.
2011-01-01
The USGS National Seismic Hazard Maps are updated about every six years by incorporating newly vetted science on earthquakes and ground motions. The 2008 hazard maps for the central and eastern United States region (CEUS) were updated by using revised New Madrid and Charleston source models, an updated seismicity catalog and an estimate of magnitude uncertainties, a distribution of maximum magnitudes, and several new ground-motion prediction equations. The new models resulted in significant ground-motion changes at 5 Hz and 1 Hz spectral acceleration with 5% damping compared to the 2002 version of the hazard maps. The 2008 maps have now been incorporated into the 2009 NEHRP Recommended Provisions, the 2010 ASCE-7 Standard, and the 2012 International Building Code. The USGS is now planning the next update of the seismic hazard maps, which will be provided to the code committees in December 2013. Science issues that will be considered for introduction into the CEUS maps include: 1) updated recurrence models for New Madrid sources, including new geodetic models and magnitude estimates; 2) new earthquake sources and techniques considered in the 2010 model developed by the nuclear industry; 3) new NGA-East ground-motion models (currently under development); and 4) updated earthquake catalogs. We will hold a regional workshop in late 2011 or early 2012 to discuss these and other issues that will affect the seismic hazard evaluation in the CEUS.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Andrews, Nathan; Faucett, Christopher; Haskin, Troy Christopher
Following the conclusion of the first phase of the crosswalk analysis, one of the key unanswered questions was whether or not the deviations found would persist during a partially recovered accident scenario, similar to the one that occurred at TMI-2. In particular, this analysis aims to compare the impact of core degradation morphology on the quenching models inherent within the two codes and on the coolability of debris during partially recovered accidents. A primary motivation for this study is the development of insights into how uncertainties in core damage progression models impact the ability to assess the potential for recovery of a degraded core. These quench and core recovery models are of the most interest when there is a significant amount of core damage but intact and degraded fuel still remain in the core region or the lower plenum. Accordingly, this analysis presents a spectrum of partially recovered accident scenarios by varying both water injection timing and rate to highlight the impact of core degradation phenomena on recovered accident scenarios. This analysis uses the newly released MELCOR 2.2 rev. 9665 and MAAP5, Version 5.04; these code versions incorporate a significant number of modifications that have been driven by analyses and forensic evidence obtained from the Fukushima-Daiichi reactor site.
Observed child and parent toothbrushing behaviors and child oral health
COLLETT, BRENT R.; HUEBNER, COLLEEN E.; SEMINARIO, ANA LUCIA; WALLACE, ERIN; GRAY, KRISTEN E.; SPELTZ, MATTHEW L.
2018-01-01
Background Parent-led toothbrushing effectively reduces early childhood caries. Research on the strategies that parents use to promote this behavior is, however, lacking. Aim To examine associations between parent–child toothbrushing interactions and child oral health using a newly developed measure, the Toothbrushing Observation System (TBOS). Design One hundred children ages 18–60 months and their parents were video-recorded during toothbrushing interactions. Using these recordings, six raters coded parent and child behaviors and the duration of toothbrushing. We examined the reliability of the coding system and associations between observed parent and child behaviors and three indices of oral health: caries, gingival health, and history of dental procedures requiring general anesthesia. Results Reliabilities were moderate to strong for TBOS child and parent scores. Parent TBOS scores and longer duration of parent-led toothbrushing were associated with fewer decayed, missing or filled tooth surfaces and lower incidence of gingivitis and procedures requiring general anesthesia. Associations between child TBOS scores and dental outcomes were modest, suggesting the relative importance of parent versus child behaviors at this early age. Conclusions Parents’ child behavior management skills and the duration of parent-led toothbrushing were associated with better child oral health. These findings suggest that parenting skills are an important target for future behavioral oral health interventions. PMID:26148197
Spirometry Use among Older Adults with Chronic Obstructive Pulmonary Disease: 1999–2008
Wang, Yue; Kuo, Yong-Fang; Goodwin, James S.; Sharma, Gulshan
2013-01-01
Rationale: Clinical practice guidelines recommend spirometry to diagnose chronic obstructive pulmonary disease (COPD) and facilitate management. National trends in spirometry use in older adults with newly diagnosed COPD are not known. Objectives: To examine the rate and beneficiary characteristics associated with spirometry use in subjects with newly diagnosed COPD between 1999 and 2008. Methods: We examined newly diagnosed beneficiaries with COPD using a 5% Medicare population from 1999 to 2008. A new COPD diagnosis required two outpatient visits or one hospitalization with primary International Classification of Diseases, 9th edition code 491.xx, 492.xx, or 496 occurring at least 30 days apart with none in the prior 12 months. The primary measurement was spirometry performed within 365 days (±) of the first claim with a COPD diagnosis. Measurements and Main Results: Between 1999 and 2008, 64,985 subjects were newly diagnosed with COPD. Of these, 35,739 (55%) had spirometry performed within 1 year before or after the initial diagnosis of COPD. Spirometry use increased from 51.3% in 1999 to 58.3% in 2008 (P < 0.001). Subjects with younger age, men, whites, those with higher socioeconomic status, and those with a greater number of comorbidities were more likely to have spirometry. In a multivariable analysis, compared with 1999, subjects diagnosed in 2008 had 10% higher odds (odds ratio, 1.10; 95% confidence interval, 1.06–1.13) of having spirometry performed. Conclusions: Despite an increase in the use of spirometry over time in newly diagnosed older adults with COPD, spirometry use remains low. Clinical practice guidelines and educational efforts should focus on increasing the use of spirometry to diagnose and manage COPD. PMID:24053440
A Novel Collection of snRNA-Like Promoters with Tissue-Specific Transcription Properties
Garritano, Sonia; Gigoni, Arianna; Costa, Delfina; Malatesta, Paolo; Florio, Tullio; Cancedda, Ranieri; Pagano, Aldo
2012-01-01
We recently identified a novel dataset of snRNA-like transcriptional units in the human genome. The investigation of a subset of these elements showed that they play relevant roles in physiology and/or pathology. In this work we expand our collection of small RNAs, taking advantage of a newly developed algorithm able to identify genome sequence stretches with RNA polymerase (pol) III type 3 promoter features, thus constituting putative pol III binding sites. The bioinformatic analysis of a subset of these elements that map in introns of protein-coding genes in antisense configuration suggests their association with alternative splicing, similarly to other recently characterized small RNAs. Interestingly, the analysis of the transcriptional activity of these novel promoters shows that they are active in a cell-type-specific manner, in accordance with the emerging body of evidence of a tissue/cell-specific activity of pol III. PMID:23109855
ePMV embeds molecular modeling into professional animation software environments.
Johnson, Graham T; Autin, Ludovic; Goodsell, David S; Sanner, Michel F; Olson, Arthur J
2011-03-09
Increasingly complex research has made it more difficult to prepare data for publication, education, and outreach. Many scientists must also wade through black-box code to interface computational algorithms from diverse sources to supplement their bench work. To reduce these barriers, we have developed an open-source plug-in, embedded Python Molecular Viewer (ePMV), that runs molecular modeling software directly inside of professional 3D animation applications (hosts) to provide simultaneous access to the capabilities of these newly connected systems. Uniting host and scientific algorithms into a single interface allows users from varied backgrounds to assemble professional quality visuals and to perform computational experiments with relative ease. By enabling easy exchange of algorithms, ePMV can facilitate interdisciplinary research, smooth communication between broadly diverse specialties, and provide a common platform to frame and visualize the increasingly detailed intersection(s) of cellular and molecular biology.
Vohra, R; Cantrell, F L; Williams, S R
2008-02-01
Rattlesnake envenomation occasionally results in repetitive small-muscle fasciculations known as myokymia. We report the results of a retrospective inquiry into this phenomenon from a statewide poison center's database. Data were obtained from the poison system database for the years 2000-2003, inclusive, for rattlesnake envenomation exposures coded as having fasciculations. A total of 47 cases were identified, and nine other cases were found in previously published literature. Monthly analyses showed no consistent temporal pattern in the incidence or proportion of reported snakebites with myokymia. All four of the reviewed cases with myokymia of the shoulders were intubated, and none without it were intubated. A review of four consecutive years of data revealed no pattern correlating the incidence of fasciculations with the month. The development of respiratory failure associated with myokymia, sometimes despite antivenom, is a newly reported occurrence. Clinicians are reminded to closely monitor the airway and inspiratory capacity in patients with severe myokymia.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nandipati, Giridhar; Setyawan, Wahyu; Heinisch, Howard L.
2015-07-01
The results of object kinetic Monte Carlo (OKMC) simulations of the annealing of primary cascade damage in bulk tungsten using a comprehensive database of cascades obtained from molecular dynamics (Setyawan et al.) are described as a function of primary knock-on atom (PKA) energy at temperatures of 300, 1025 and 2050 K. An increase in SIA clustering coupled with a decrease in vacancy clustering with increasing temperature, in addition to the disparate mobilities of SIAs versus vacancies, causes an interesting effect of temperature on cascade annealing. The annealing efficiency (the ratio of the number of defects after and before annealing) exhibits an inverse U-shape curve as a function of temperature. The capabilities of the newly developed OKMC code KSOME (kinetic simulations of microstructure evolution) used to carry out these simulations are described.
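As a rough illustration of how an object KMC code of this kind advances the defect population, the sketch below implements a single residence-time (BKL) step with Arrhenius hop rates. The attempt frequencies and migration energies are placeholders for illustration, not the tungsten parameters used with KSOME.

```python
# Minimal residence-time (BKL) kinetic Monte Carlo step, sketching how an
# OKMC code selects thermally activated defect events; parameters below are
# placeholders, not tungsten data.
import math, random

K_B = 8.617e-5  # Boltzmann constant, eV/K

def hop_rate(prefactor_hz, migration_ev, temperature_k):
    """Arrhenius rate for a single defect-hop event."""
    return prefactor_hz * math.exp(-migration_ev / (K_B * temperature_k))

def kmc_step(events, temperature_k, rng=random):
    """Pick one event with probability proportional to its rate and advance time."""
    rates = [hop_rate(p, em, temperature_k) for (_, p, em) in events]
    total = sum(rates)
    r = rng.random() * total
    acc = 0.0
    for (name, _, _), rate in zip(events, rates):
        acc += rate
        if r <= acc:
            dt = -math.log(rng.random()) / total  # residence time for this step
            return name, dt

# Hypothetical event list: (label, attempt frequency [Hz], migration energy [eV])
events = [("SIA_cluster_hop", 6e12, 0.013),
          ("vacancy_hop",     6e12, 1.66)]
print(kmc_step(events, temperature_k=1025))
```

The strongly different migration energies of SIAs and vacancies are what make the selected events, and hence the annealing behaviour, so sensitive to temperature.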
Analysis of neutron spectrum effects on primary damage in tritium breeding blankets
NASA Astrophysics Data System (ADS)
Choi, Yong Hee; Joo, Han Gyu
2012-07-01
The effect of neutron spectrum on primary damage in a structural material of a tritium breeding blanket is investigated with a newly established recoil spectrum estimation system. First, a recoil spectrum generation code is developed to obtain the energy spectrum of primary knock-on atoms (PKAs) for a given neutron spectrum utilizing the latest ENDF/B data. Secondly, a method for approximating the high-energy tail of the recoil spectrum is introduced to avoid expensive molecular dynamics calculations for high-energy PKAs, using the concept of the recoil energy of the secondary knock-on atoms originated by the INtegration of CAScades (INCAS) model. Thirdly, the modified spectrum is combined with a set of molecular dynamics calculation results to estimate primary damage parameters such as the number of surviving point defects. Finally, the neutron spectrum is varied by changing the material of the spectral shifter and the resulting change in primary damage parameters is examined.
An Improved Neutron Transport Algorithm for HZETRN
NASA Technical Reports Server (NTRS)
Slaba, Tony C.; Blattnig, Steve R.; Clowdsley, Martha S.; Walker, Steven A.; Badavi, Francis F.
2010-01-01
Long term human presence in space requires the inclusion of radiation constraints in mission planning and the design of shielding materials, structures, and vehicles. In this paper, the numerical error associated with energy discretization in HZETRN is addressed. An inadequate numerical integration scheme in the transport algorithm is shown to produce large errors in the low energy portion of the neutron and light ion fluence spectra. It is further shown that the errors result from the narrow energy domain of the neutron elastic cross section spectral distributions, and that an extremely fine energy grid is required to resolve the problem under the current formulation. Two numerical methods are developed to provide adequate resolution in the energy domain and more accurately resolve the neutron elastic interactions. Convergence testing is completed by running the code for various environments and shielding materials with various energy grids to ensure stability of the newly implemented method.
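The convergence testing described above amounts to refining the energy grid until the quantities of interest stop changing. A minimal sketch of such a check, applied to a made-up sharply peaked model spectrum rather than HZETRN output, might look like this:

```python
# Generic grid-convergence check: refine the energy grid and stop when an
# integral quantity (here, the integral of a made-up peaked spectrum) changes
# by less than a tolerance. Purely illustrative; this does not reproduce the
# HZETRN transport algorithm.
import numpy as np

def integrated_fluence(n_points):
    """Integrate a sharply peaked model spectrum on a log-spaced energy grid."""
    energy = np.logspace(-2, 3, n_points)                       # MeV, hypothetical range
    spectrum = np.exp(-((np.log10(energy) + 1.0) ** 2) / 0.01)  # narrow peak
    return np.trapz(spectrum, energy)

previous = integrated_fluence(50)
for n in (100, 200, 400, 800, 1600):
    current = integrated_fluence(n)
    rel_change = abs(current - previous) / abs(current)
    print(f"{n:5d} points: integral = {current:.6e}, rel. change = {rel_change:.2e}")
    if rel_change < 1e-4:
        break
    previous = current
```

A narrow spectral feature like the one above is exactly the situation where a coarse grid under-resolves the integrand, mirroring the low-energy neutron error discussed in the paper.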
Computational Study of the Richtmyer-Meshkov Instability with a Complex Initial Condition
NASA Astrophysics Data System (ADS)
McFarland, Jacob; Reilly, David; Greenough, Jeffrey; Ranjan, Devesh
2014-11-01
Results are presented for a computational study of the Richtmyer-Meshkov instability with a complex initial condition. This study covers experiments which will be conducted at the newly built inclined shock tube facility at the Georgia Institute of Technology. The complex initial condition employed consists of an underlying inclined interface perturbation with a broadband spectrum of modes superimposed. A three-dimensional staggered-mesh arbitrary Lagrangian-Eulerian (ALE) hydrodynamics code developed at Lawrence Livermore National Laboratory called ARES was used to obtain both qualitative and quantitative results. Qualitative results are discussed using time series of density plots from which the mixing width may be extracted. Quantitative results are also discussed using vorticity fields, circulation components, and energy spectra. The inclined interface case is compared to the complex interface case in order to study the effect of initial conditions on shocked, variable-density flows.
NASA Astrophysics Data System (ADS)
Erich, M.; Kokkoris, M.; Fazinić, S.; Petrović, S.
2018-02-01
This work reports on diamond crystal amorphization induced by 4 MeV carbon ions implanted in the 〈1 0 0〉 oriented crystal, and on its determination by application of the RBS/C and EBS/C techniques. Spectra from the implanted samples were recorded for 1.2, 1.5, 1.75 and 1.9 MeV protons. For the latter two, the strong resonance of the 12C(p,p0)12C nuclear elastic scattering at 1.737 MeV was exploited. The backscattering channeling spectra were successfully fitted and the ion-beam-induced crystal amorphization depth profile was determined using a phenomenological approach, which is based on properly defined Gompertz-type dechanneling functions for protons in the 〈1 0 0〉 diamond crystal channels and on the introduction of the concept of ion beam amorphization, implemented through our newly developed computer code CSIM.
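A Gompertz-type function of the general form A·exp(−exp(−k·(x − x0))) rises smoothly from near zero to a saturation value, which is why it is a convenient phenomenological shape for a dechanneled fraction versus depth. The sketch below fits such a curve to synthetic data; the parameter values and the fitting setup are illustrative and are not taken from the CSIM code.

```python
# Sketch of fitting a Gompertz-type function of the general form used for
# dechanneling profiles; the synthetic data and parameters are hypothetical.
import numpy as np
from scipy.optimize import curve_fit

def gompertz(depth_um, amplitude, rate, offset_um):
    """Dechanneled fraction rising from ~0 toward `amplitude` with depth."""
    return amplitude * np.exp(-np.exp(-rate * (depth_um - offset_um)))

# Synthetic "measured" dechanneled fraction vs. depth (invented numbers).
depth = np.linspace(0.0, 5.0, 40)                    # micrometres
true = gompertz(depth, 0.9, 2.0, 1.5)
data = true + np.random.default_rng(0).normal(0, 0.02, depth.size)

params, _ = curve_fit(gompertz, depth, data, p0=(1.0, 1.0, 1.0))
print("fitted amplitude, rate, offset:", params)
```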
DOE Office of Scientific and Technical Information (OSTI.GOV)
Raskin, Cody; Owen, J. Michael, E-mail: raskin1@llnl.gov, E-mail: mikeowen@llnl.gov
2016-11-01
We discuss a generalization of the classic Keplerian disk test problem allowing for both pressure and rotational support, as a method of testing astrophysical codes incorporating both gravitation and hydrodynamics. We argue for the inclusion of pressure in rotating disk simulations on the grounds that realistic, astrophysical disks exhibit non-negligible pressure support. We then apply this test problem to examine the performance of various smoothed particle hydrodynamics (SPH) methods incorporating a number of improvements proposed over the years to address problems noted in modeling the classical gravitation-only Keplerian disk. We also apply this test to a newly developed extension of SPH based on reproducing kernels called CRKSPH. Counterintuitively, we find that pressure support worsens the performance of traditional SPH on this problem, causing unphysical collapse away from the steady-state disk solution even more rapidly than the purely gravitational problem, whereas CRKSPH greatly reduces this error.
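The force balance behind a pressure-supported rotating disk is that the centripetal term equals gravity plus the pressure-gradient term, v_phi^2/r = GM/r^2 + (1/rho) dP/dr, so the rotation is sub-Keplerian wherever pressure falls outward. A small sketch of that balance follows, with hypothetical profiles rather than the authors' specific initial condition.

```python
# Rough sketch of the radial force balance for a rotating disk with pressure:
# v^2/r = GM/r^2 + (1/rho) dP/dr, so v_phi departs from the Keplerian value
# wherever the radial pressure gradient is non-zero. Profiles are made up.
import numpy as np

G, M = 1.0, 1.0                       # code units
r = np.linspace(0.5, 3.0, 300)
rho = np.exp(-(r - 1.5) ** 2 / 0.5)   # hypothetical density profile
cs2 = 0.05                            # isothermal sound speed squared
P = cs2 * rho

dP_dr = np.gradient(P, r)
v_phi_sq = G * M / r + (r / rho) * dP_dr   # from the force balance above
v_kepler = np.sqrt(G * M / r)
v_phi = np.sqrt(np.clip(v_phi_sq, 0.0, None))

print("max sub-Keplerian deficit:", np.max(v_kepler - v_phi))
```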
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sengupta, M.; Ganesh, R.
The dynamics of cylindrically trapped electron plasma has been investigated using a newly developed 2D Electrostatic PIC code that uses unapproximated, mass-included equations of motion for simulation. Exhaustive simulations, covering the entire range of Brillouin ratio, were performed for uniformly filled circular profiles in rigid rotor equilibrium. The same profiles were then loaded away from equilibrium with an initial value of rigid rotation frequency different from that required for radial force balance. Both these sets of simulations were performed for an initial zero-temperature or cold load of the plasma with no spread in either angular velocity or radial velocity. The evolution of the off-equilibrium initial conditions to a steady state involves radial breathing of the profile that scales in amplitude and algebraic growth with Brillouin fraction. For higher Brillouin fractions, the growth of the breathing mode is followed by complex dynamics of spontaneous hollow density structures and excitation of poloidal modes, leading to a monotonically falling density profile.
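For a uniformly filled electron column in rigid rotation, radial force balance requires ω(Ω_c − ω) = ω_p²/2, which has real solutions only up to the Brillouin limit 2ω_p²/Ω_c² = 1; loading the profile with a rotation frequency away from these roots is what drives the breathing described above. A small sketch of that balance follows, with arbitrary numbers rather than the simulation parameters.

```python
# Sketch of the rigid-rotor radial force balance for a uniformly filled
# cylindrical electron column: omega * (Omega_c - omega) = omega_p**2 / 2,
# whose two roots are the slow and fast rotation frequencies. The numbers
# below are arbitrary illustrations, not the paper's parameters.
import math

def rigid_rotor_frequencies(omega_c, omega_p):
    """Return (omega_minus, omega_plus) if confinement is possible, else None."""
    brillouin_ratio = 2.0 * omega_p**2 / omega_c**2   # equals 1 at the Brillouin limit
    if brillouin_ratio > 1.0:
        return None                                   # beyond the Brillouin limit
    root = math.sqrt(1.0 - brillouin_ratio)
    return 0.5 * omega_c * (1.0 - root), 0.5 * omega_c * (1.0 + root)

omega_c = 1.0e9          # rad/s, hypothetical cyclotron frequency
for fraction in (0.1, 0.5, 0.9, 1.0):
    omega_p = math.sqrt(fraction * omega_c**2 / 2.0)  # set the Brillouin ratio
    print(fraction, rigid_rotor_frequencies(omega_c, omega_p))
```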
Field Performance of a Newly Developed Upflow Filtration Device
The objective of this research is to examine the removal capacities of a newly developed upflow filtration device for treatment of stormwater. The device was developed by engineers at the University of Alabama through a Small Business Innovation Research (SBIR) grant from the U....
2012-01-01
Background Inter-rater agreement in the interpretation of chest X-ray (CXR) films is crucial for clinical and epidemiological studies of tuberculosis. We compared the readings of CXR films used for a survey of tuberculosis between raters from two Asian countries. Methods Of the 11,624 people enrolled in a prevalence survey in Hanoi, Viet Nam, in 2003, we studied 258 individuals whose CXR films did not exclude the possibility of active tuberculosis. Follow-up films obtained from accessible individuals in 2006 were also analyzed. Two Japanese and two Vietnamese raters read the CXR films based on a coding system proposed by Den Boon et al. and another system newly developed in this study. Inter-rater agreement was evaluated by kappa statistics. Marginal homogeneity was evaluated by the generalized estimating equation (GEE). Results CXR findings suspected of tuberculosis differed between the four raters. The frequencies of infiltrates and fibrosis/scarring detected on the films significantly differed between the raters from the two countries (P < 0.0001 and P = 0.0082, respectively, by GEE). The definition of findings such as primary cavity, used in the coding systems also affected the degree of agreement. Conclusions CXR findings were inconsistent between the raters with different backgrounds. High inter-rater agreement is a component necessary for an optimal CXR coding system, particularly in international studies. An analysis of reading results and a thorough discussion to achieve a consensus would be necessary to achieve further consistency and high quality of reading. PMID:22296612
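For reference, Cohen's kappa for two raters is the observed agreement corrected for the agreement expected by chance from each rater's marginal label frequencies. A minimal sketch with invented CXR finding labels (not the study's data):

```python
# Minimal Cohen's kappa for two raters; the example labels are invented.
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Chance-corrected agreement between two equal-length label sequences."""
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    expected = sum(freq_a[c] * freq_b[c] for c in set(rater_a) | set(rater_b)) / n**2
    return (observed - expected) / (1.0 - expected)

a = ["infiltrate", "normal", "fibrosis", "normal", "infiltrate", "normal"]
b = ["infiltrate", "normal", "normal",   "normal", "fibrosis",   "normal"]
print(round(cohens_kappa(a, b), 3))
```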
A theoretical and simulation study of the contact discontinuities based on a Vlasov simulation code
NASA Astrophysics Data System (ADS)
Tsai, T. C.; Lyu, L. H.; Chao, J. K.; Chen, M. Q.; Tsai, W. H.
2009-12-01
Contact discontinuity (CD) is the simplest solution that can be obtained from the magnetohydrodynamics (MHD) Rankine-Hugoniot jump conditions. Due to the limitations of the previous kinetic simulation models, the stability of the CD has become a controversial issue in the past 10 years. The stability of the CD is reexamined analytically and numerically. Our theoretical analysis shows that the electron temperature profile and the ion temperature profile must be out of phase across the CD if the CD structure is to be stable in the electron time scale and with zero electron heat flux on either side of the CD. Both a newly developed fourth-order implicit electrostatic Vlasov simulation code and an electromagnetic finite-size particle code are used to examine the stability and the electrostatic nature of the CD structure. Our theoretical prediction is verified by both simulations. Our results of Vlasov simulation also indicate that a simulation with initial electron temperature profile and ion temperature profile varying in phase across the CD will undergo very transient changes in the electron time scale but will relax into a quasi-steady CD structure within a few ion plasma oscillation periods if a real ion-electron mass ratio is used in the simulation and if the boundary conditions allow nonzero heat flux to be presented at the boundaries of the simulation box. The simulation results of this study indicate that the Vlasov simulation is a powerful tool to study nonlinear phenomena with nonperiodic boundary conditions and with nonzero heat flux at the boundaries of the simulation box.
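The contact-discontinuity solution of the MHD jump conditions referred to above carries no mass flux across the layer: bulk velocity, magnetic field, and total pressure are continuous, while density and temperature jump in opposite senses so that their product stays fixed. A toy profile illustrating that balance is sketched below; the shapes and amplitudes are arbitrary.

```python
# Sketch of the basic contact-discontinuity balance: total pressure (n * T)
# is uniform across the layer while density and temperature jump in opposite
# senses. Profile shapes are illustrative only.
import numpy as np

x = np.linspace(-5.0, 5.0, 201)
ramp = 0.5 * (1.0 + np.tanh(x))          # smooth 0 -> 1 transition across the CD

density = 1.0 + 1.0 * ramp               # density doubles across the layer
pressure_total = 2.0                     # uniform total pressure
temperature = pressure_total / density   # temperature drops where density rises

print("pressure spread:", np.ptp(density * temperature))   # ~0 by construction
print("density jump   :", density[-1] / density[0])
```

The kinetic question examined in the paper is how the electron and ion temperature profiles individually arrange themselves within this overall pressure balance.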
ERIC Educational Resources Information Center
Jang, Syh-Jong
2011-01-01
Ongoing professional development for college teachers has been much emphasized. However, previous research on learning environments has seldom addressed college students' perceptions of teachers' PCK. This study aimed to evaluate college students' perceptions of a physics teacher's PCK development using a newly developed instrument and workshop…
Preparing School Leaders: The Professional Development Needs of Newly Appointed Principals
ERIC Educational Resources Information Center
Ng, Shun-wing; Szeto, Sing-ying Elson
2016-01-01
In Hong Kong, there is an acute need to provide newly appointed principals with opportunities for continuous professional development so that they could face the impact of reforms and globalization on school development. The Education Bureau has commissioned the tertiary institutions to provide structured professional development courses to cater…
Travnik, Jaden B; Pilarski, Patrick M
2017-07-01
Prosthetic devices have advanced in their capabilities and in the number and type of sensors included in their design. As the space of sensorimotor data available to a conventional or machine learning prosthetic control system increases in dimensionality and complexity, it becomes increasingly important that this data be represented in a useful and computationally efficient way. Well-structured sensory data allows prosthetic control systems to make informed, appropriate control decisions. In this study, we explore the impact that increased sensorimotor information has on current machine learning prosthetic control approaches. Specifically, we examine the effect that high-dimensional sensory data has on the computation time and prediction performance of a true-online temporal-difference learning prediction method as embedded within a resource-limited upper-limb prosthesis control system. We present results comparing tile coding, the dominant linear representation for real-time prosthetic machine learning, with a newly proposed modification to Kanerva coding that we call selective Kanerva coding. In addition to showing promising results for selective Kanerva coding, our results confirm potential limitations to tile coding as the number of sensory input dimensions increases. To our knowledge, this study is the first to explicitly examine representations for real-time machine learning prosthetic devices in general terms. This work therefore provides an important step towards forming an efficient prosthesis-eye view of the world, wherein prompt and accurate representations of high-dimensional data may be provided to machine learning control systems within artificial limbs and other assistive rehabilitation technologies.
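Kanerva-style coders map an observation onto a sparse binary feature vector by activating the prototypes closest to it in the normalized sensor space, which keeps the number of learned weights independent of the input dimensionality. The sketch below is a generic illustration of that idea, not necessarily the exact selective Kanerva coding formulation of the paper; the prototype counts and dimensions are arbitrary.

```python
# Illustrative Kanerva-style coding: random prototypes in normalized sensor
# space, with features switched on for the prototypes closest to the current
# observation. Generic sketch, not the paper's exact formulation.
import numpy as np

rng = np.random.default_rng(0)

class KanervaCoder:
    def __init__(self, n_prototypes, n_dims, n_active):
        self.prototypes = rng.uniform(0.0, 1.0, size=(n_prototypes, n_dims))
        self.n_active = n_active          # how many prototypes switch on

    def encode(self, observation):
        """Return a sparse binary feature vector for one observation."""
        distances = np.linalg.norm(self.prototypes - observation, axis=1)
        active = np.argsort(distances)[: self.n_active]
        features = np.zeros(len(self.prototypes))
        features[active] = 1.0
        return features

coder = KanervaCoder(n_prototypes=512, n_dims=8, n_active=16)
obs = rng.uniform(0.0, 1.0, size=8)       # hypothetical normalized sensor frame
phi = coder.encode(obs)
print(int(phi.sum()), "active features out of", phi.size)
```

The resulting sparse binary vector can then drive a linear true-online TD learner in the usual way, with per-step cost governed by the number of prototypes rather than by the raw sensor dimensionality.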
Brewer, Michael S; Swafford, Lynn; Spruill, Chad L; Bond, Jason E
2013-01-01
Arthropods are the most diverse group of eukaryotic organisms, but their phylogenetic relationships are poorly understood. Herein, we describe three mitochondrial genomes representing orders of millipedes for which complete genomes had not been characterized. Newly sequenced genomes are combined with existing data to characterize the protein coding regions of myriapods and to attempt to reconstruct the evolutionary relationships within the Myriapoda and Arthropoda. The newly sequenced genomes are similar to previously characterized millipede sequences in terms of synteny and length. Unique translocations occurred within the newly sequenced taxa, including one half of the Appalachioria falcifera genome, which is inverted with respect to other millipede genomes. Across myriapods, amino acid conservation levels are highly dependent on the gene region. Additionally, individual loci varied in the level of amino acid conservation. Overall, most gene regions showed low levels of conservation at many sites. Attempts to reconstruct the evolutionary relationships suffered from questionable relationships and low support values. Analyses of phylogenetic informativeness show the lack of signal deep in the trees (i.e., genes evolve too quickly). As a result, the myriapod tree resembles previously published results but lacks convincing support, and, within the arthropod tree, well established groups were recovered as polyphyletic. The novel genome sequences described herein provide useful genomic information concerning millipede groups that had not been investigated. Taken together with existing sequences, the variety of compositions and evolution of myriapod mitochondrial genomes are shown to be more complex than previously thought. Unfortunately, the use of mitochondrial protein-coding regions in deep arthropod phylogenetics appears problematic, a result consistent with previously published studies. Lack of phylogenetic signal renders the resulting tree topologies as suspect. As such, these data are likely inappropriate for investigating such ancient relationships.
D'Ettorre, P; Mondy, N; Lenoir, A; Errard, C
2002-01-01
Social parasites are able to exploit their host's communication code and achieve social integration. For colony foundation, a newly mated slave-making ant queen must usurp a host colony. The parasite's brood is cared for by the hosts and newly eclosed slave-making workers integrate to form a mixed ant colony. To elucidate the social integration strategy of the slave-making workers, Polyergus rufescens, behavioural and chemical analyses were carried out. Cocoons of P. rufescens were introduced into subcolonies of four potential host species: Formica subgenus Serviformica (Formica cunicularia and F. rufibarbis, usual host species; F. gagates, rare host; F. selysi, non-natural host). Slave-making broods were cared for and newly emerged workers showed several social interactions with adult Formica. We recorded the occurrence of abdominal trophallaxis, in which P. rufescens, the parasite, was the donor. Social integration of P. rufescens workers into host colonies appears to rely on the ability of the parasite to modify its cuticular hydrocarbon profile to match that of the rearing species. To study the specific P. rufescens chemical profile, newly emerged callows were reared in isolation from the mother colony (without any contact with adult ants). The isolated P. rufescens workers exhibited a chemical profile closely matching that of the primary host species, indicating the occurrence of local host adaptation in the slave-maker population. However, the high flexibility in the ontogeny of the parasite's chemical signature could allow for host switching. PMID:12350253
Interactive Exploration for Continuously Expanding Neuron Databases.
Li, Zhongyu; Metaxas, Dimitris N; Lu, Aidong; Zhang, Shaoting
2017-02-15
This paper proposes a novel framework to help biologists explore and analyze neurons based on retrieval of data from neuron morphological databases. In recent years, continuously expanding neuron databases have provided a rich source of information to associate neuronal morphologies with their functional properties. We design a coarse-to-fine framework for efficient and effective data retrieval from large-scale neuron databases. At the coarse level, for efficiency at large scale, we employ a binary coding method to compress morphological features into binary codes of tens of bits. Short binary codes allow for real-time similarity searching in Hamming space. Because the neuron databases are continuously expanding, it is inefficient to re-train the binary coding model from scratch when adding new neurons. To solve this problem, we extend binary coding with online updating schemes, which consider only the newly added neurons and update the model on the fly, without accessing the whole neuron databases. At the fine-grained level, we introduce domain experts/users into the framework, who can give relevance feedback on the binary-coding-based retrieval results. This interactive strategy can improve retrieval performance through re-ranking the above coarse results, where we design a new similarity measure and take the feedback into account. Our framework is validated on more than 17,000 neuron cells, showing promising retrieval accuracy and efficiency. Moreover, we demonstrate its use case in assisting biologists to identify and explore unknown neurons.
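The coarse retrieval stage described above can be pictured with any binary coding scheme that supports Hamming-distance ranking; the sketch below uses simple random-hyperplane hashing purely as a stand-in for the paper's learned coding model and its online updates, with hypothetical 64-dimensional morphology descriptors.

```python
# Generic sketch of coarse retrieval: compress feature vectors to short binary
# codes (here via random-hyperplane hashing, not the paper's specific model)
# and rank stored neurons by Hamming distance to the query code.
import numpy as np

rng = np.random.default_rng(0)

def train_hasher(n_features, n_bits):
    """Random projection directions; one sign bit per direction."""
    return rng.normal(size=(n_features, n_bits))

def to_binary_codes(features, projections):
    return (features @ projections > 0).astype(np.uint8)

def hamming_search(query_code, database_codes, top_k=5):
    distances = np.count_nonzero(database_codes != query_code, axis=1)
    return np.argsort(distances)[:top_k], np.sort(distances)[:top_k]

# Hypothetical 64-D morphology descriptors for 10,000 stored neurons.
database = rng.normal(size=(10_000, 64))
projections = train_hasher(64, 32)                  # 32-bit codes
codes = to_binary_codes(database, projections)

query = database[123] + 0.05 * rng.normal(size=64)  # a slightly perturbed neuron
idx, dist = hamming_search(to_binary_codes(query[None, :], projections)[0], codes)
print(idx, dist)
```

Because Hamming distances over tens of bits are cheap to compute and update, the coarse stage stays real-time even as the database grows.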
Progress Report on the DSSTox Database Network: Newly Launched Website, Applications, Future Plans
Progress will be reported on development of the Distributed Structure-Searchable Toxicity (DSSTox) Database Network and the newly launched public website that coordinates and...
WorldWide Telescope: A Newly Open Source Astronomy Visualization System
NASA Astrophysics Data System (ADS)
Fay, Jonathan; Roberts, Douglas A.
2016-01-01
After eight years of development by Microsoft Research, WorldWide Telescope (WWT) was made an open source project at the end of June 2015. WWT was motivated by the desire to put new surveys of objects, such as the Sloan Digital Sky Survey, in the context of the night sky. The development of WWT under Microsoft started with the creation of a Windows desktop client that is widely used in various education, outreach and research projects. Using this, users can explore the data built into WWT as well as data that is loaded in. Beyond exploration, WWT can be used to create tours that present various datasets in a narrative format. In the past two years, the team developed a collection of web controls, including an HTML5 web client, which contains much of the functionality of the Windows desktop client. The project under Microsoft has deep connections with several user communities, such as education through the WWT Ambassadors program, http://wwtambassadors.org/, and with planetariums and museums such as the Adler Planetarium. WWT can also support research, including using WWT to visualize the Bones of the Milky Way and rich connections between WWT and the Astrophysics Data System (ADS, http://labs.adsabs.harvard.edu/adsabs/). One important new research connection is the use of WWT to create dynamic and potentially interactive supplements to journal articles, the first of which were created in 2015. Now WWT is an open source, community-led project. The source code is available on GitHub (https://github.com/WorldWideTelescope). There is significant developer documentation on the website (http://worldwidetelescope.org/Developers/), and an extensive developer workshop (http://wwtworkshops.org/?tribe_events=wwt-developer-workshop) took place in the fall of 2015. Now that WWT is open source, anyone with an interest in the project can be a contributor. As important as help with coding, the project needs people interested in documentation, testing, training and other roles.
A Multi-domain Spectral Method for Supersonic Reactive Flows
NASA Technical Reports Server (NTRS)
Don, Wai-Sun; Gottlieb, David; Jung, Jae-Hun; Bushnell, Dennis M. (Technical Monitor)
2002-01-01
This paper has a dual purpose: it presents a multidomain Chebyshev method for the solution of the two-dimensional reactive compressible Navier-Stokes equations, and it reports the results of the application of this code to numerical simulations of high Mach number reactive flows in a recessed cavity. The computational method utilizes newly derived interface boundary conditions as well as an adaptive filtering technique to stabilize the computations. The results of the simulations are relevant to recessed cavity flameholders.
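Each subdomain in a multidomain Chebyshev method carries a collocation differentiation matrix on Gauss-Lobatto nodes, with the subdomains coupled through interface conditions such as those derived in the paper. The sketch below shows only the standard single-domain building block (the classic "cheb" construction), not the authors' full scheme.

```python
# Chebyshev collocation differentiation matrix on Gauss-Lobatto points
# (standard construction); a single-domain building block of the kind a
# multidomain Chebyshev solver assembles and couples at interfaces.
import numpy as np

def cheb(n):
    """Return (D, x): differentiation matrix and Chebyshev-Lobatto nodes on [-1, 1]."""
    if n == 0:
        return np.zeros((1, 1)), np.array([1.0])
    x = np.cos(np.pi * np.arange(n + 1) / n)
    c = np.hstack([2.0, np.ones(n - 1), 2.0]) * (-1.0) ** np.arange(n + 1)
    X = np.tile(x, (n + 1, 1)).T
    dX = X - X.T
    D = np.outer(c, 1.0 / c) / (dX + np.eye(n + 1))
    D -= np.diag(D.sum(axis=1))          # negative row sums go on the diagonal
    return D, x

D, x = cheb(16)
u = np.sin(np.pi * x)
error = np.max(np.abs(D @ u - np.pi * np.cos(np.pi * x)))
print(f"max derivative error on 17 nodes: {error:.2e}")   # spectrally small
```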
Mapping subsurface in proximity to newly-developed sinkhole along roadway.
DOT National Transportation Integrated Search
2013-02-01
MS&T acquired electrical resistivity tomography profiles in immediate proximity to a newly developed sinkhole in Nixa, Missouri. The sinkhole has closed a well-traveled municipal roadway and threatens proximal infrastructure. The intent of this inves...
GIS embedded hydrological modeling: the SID&GRID project
NASA Astrophysics Data System (ADS)
Borsi, I.; Rossetto, R.; Schifani, C.
2012-04-01
The SID&GRID research project, started in April 2010 and funded by Regione Toscana (Italy) under the POR FSE 2007-2013, aims to develop a Decision Support System (DSS) for water resource management and planning based on open source and public domain solutions. In order to quantitatively assess water availability in space and time and to support the planning decision processes, the SID&GRID solution consists of hydrological models (coupling existing and newly developed 3D surface-water, groundwater and unsaturated-zone modeling codes) embedded in a GIS interface, applications and library, where all the input and output data are managed by means of a DataBase Management System (DBMS). A graphical user interface (GUI) to manage, analyze and run the SID&GRID hydrological models, based on the open source gvSIG GIS framework (Asociación gvSIG, 2011), and a Spatial Data Infrastructure to share and interoperate with distributed geographical data are being developed. Such a GUI is conceived as a "master control panel" able to guide the user through pre-processing spatial and temporal data, running the hydrological models, and analyzing the outputs. To achieve the above-mentioned goals, the following codes have been selected and are being integrated: 1. Postgresql/PostGIS (PostGIS, 2011) for the geodatabase management system; 2. gvSIG with Sextante (Olaya, 2011) geo-algorithm library capabilities and Grass tools (GRASS Development Team, 2011) for the desktop GIS; 3. Geoserver and Geonetwork to share and discover spatial data on the web according to Open Geospatial Consortium standards; 4. new tools based on the Sextante GeoAlgorithm framework; 5. the MODFLOW-2005 (Harbaugh, 2005) groundwater modeling code; 6. MODFLOW-LGR (Mehl and Hill, 2005) for local grid refinement; 7. VSF (Thoms et al., 2006) for the variably saturated flow component; 8. newly developed routines for overland flow; 9. new algorithms in Jython integrated in gvSIG to compute the net rainfall rate reaching the soil surface, as input for the unsaturated/saturated flow model. At this stage of the research (which will end in April 2013), two primary components of the master control panel are being developed: i. a SID&GRID toolbar integrated into the gvSIG map context; ii. a new set of Sextante geo-algorithms to pre- and post-process the spatial data to run the hydrological models. The groundwater part of the code has been fully integrated and tested, and 3D visualization tools are being developed. The LGR capability has been extended to the 3D solution of the Richards' equation in order to solve in detail the unsaturated zone where required. To be updated about the project, please follow us at the website: http://ut11.isti.cnr.it/SIDGRID/
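As a toy illustration of the kind of net-rainfall pre-processing step mentioned in item 9 above (gross rainfall reduced by canopy interception and evaporation before it reaches the soil), here is a simple bucket sketch; the storage capacity and evaporation rate are invented, and this is not the SID&GRID algorithm itself.

```python
# Toy net-rainfall calculation: gross rainfall minus canopy interception
# storage and evaporation, per time step. Simple bucket sketch for
# illustration only; parameters are invented.
def net_rainfall_series(rain_mm, canopy_capacity_mm=1.5, evap_mm_per_step=0.2):
    """Return net rainfall reaching the soil for each time step."""
    storage = 0.0
    net = []
    for r in rain_mm:
        storage += r                                    # rain fills the canopy store
        throughfall = max(0.0, storage - canopy_capacity_mm)
        storage = min(storage, canopy_capacity_mm)
        storage = max(0.0, storage - evap_mm_per_step)  # canopy evaporation
        net.append(throughfall)
    return net

print(net_rainfall_series([0.0, 3.0, 0.5, 10.0, 0.0]))
```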
Evaluation of new binders using newly developed fracture energy test.
DOT National Transportation Integrated Search
2013-07-01
This study evaluated a total of seven asphalt binders with various additives using the newly developed binder fracture energy test. The researchers prepared and tested PAV-aged and RTFO-plus-PAV-aged specimens. This study confirmed previous res...
13 CFR 120.812 - Probationary period for newly certified CDCs.
Code of Federal Regulations, 2013 CFR
2013-01-01
... certified CDCs. 120.812 Section 120.812 Business Credit and Assistance SMALL BUSINESS ADMINISTRATION BUSINESS LOANS Development Company Loan Program (504) Certification Procedures to Become A Cdc § 120.812 Probationary period for newly certified CDCs. (a) Newly certified CDCs will be on probation for a period of two...
13 CFR 120.812 - Probationary period for newly certified CDCs.
Code of Federal Regulations, 2014 CFR
2014-01-01
... certified CDCs. 120.812 Section 120.812 Business Credit and Assistance SMALL BUSINESS ADMINISTRATION BUSINESS LOANS Development Company Loan Program (504) Certification Procedures to Become A Cdc § 120.812 Probationary period for newly certified CDCs. (a) Newly certified CDCs will be on probation for a period of two...
13 CFR 120.812 - Probationary period for newly certified CDCs.
Code of Federal Regulations, 2012 CFR
2012-01-01
... certified CDCs. 120.812 Section 120.812 Business Credit and Assistance SMALL BUSINESS ADMINISTRATION BUSINESS LOANS Development Company Loan Program (504) Certification Procedures to Become A Cdc § 120.812 Probationary period for newly certified CDCs. (a) Newly certified CDCs will be on probation for a period of two...
13 CFR 120.812 - Probationary period for newly certified CDCs.
Code of Federal Regulations, 2011 CFR
2011-01-01
... certified CDCs. 120.812 Section 120.812 Business Credit and Assistance SMALL BUSINESS ADMINISTRATION BUSINESS LOANS Development Company Loan Program (504) Certification Procedures to Become A Cdc § 120.812 Probationary period for newly certified CDCs. (a) Newly certified CDCs will be on probation for a period of two...
Ravinetto, Raffaella; Tinto, Halidou; Diro, Ermias; Okebe, Joseph; Mahendradhata, Yodi; Rijal, Suman; Gotuzzo, Eduardo; Lutumba, Pascal; Nahum, Alain; De Nys, Katelijne; Casteels, Minne; Boelaert, Marleen
2016-01-01
The Good Clinical Practices (GCP) codes of the WHO and the International Conference on Harmonisation set international standards for clinical research. But critics argue that they were written without consideration for the challenges faced in low- and middle-income countries (LMICs). Based on our field experience in LMICs, we developed a non-exhaustive set of recommendations for the improvement of GCP. These cover 3 domains: ethical, legal and operational, and 8 specific issues: the double ethical review of 'externally sponsored' trials; the informed consent procedure in minors and in illiterate people; post-trial access to newly developed products for the trial communities; the role of communities as key research actors; the definition of sponsor; and the guidance for contractual agreements, laboratory quality management systems, and quality assurance of investigational medicinal products. Issues not covered in our analysis include, among others, biobanking, standard of care, and study designs. The international GCP codes de facto guide national legislators and funding agencies, so the current shortcomings may weaken the regulatory oversight of international research. In addition, activities neglected by GCP are less likely to be implemented or funded. If GCP are meant to serve the interests of global society, a comprehensive revision is needed. The revised guidelines should be strongly rooted in ethics, sensitive to different sociocultural perspectives, and allow consideration for trial-specific and context-specific challenges. This can only be achieved if all stakeholders, including researchers, sponsors, regulators, ethical reviewers and patients' representatives from LMICs, as well as non-commercial researchers and sponsors from affluent countries, are transparently involved in the revision process. We hope that our limited analysis will foster advocacy for a broad and inclusive revision of the international GCP codes, to make them at the same time 'global', 'context centred' and 'patient centred'.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Depriest, Kendall
Unsuccessful attempts by members of the radiation effects community to independently derive the Norgett-Robinson-Torrens (NRT) damage energy factors for silicon in ASTM standard E722-14 led to an investigation of the software coding and data that produced those damage energy factors. The ad hoc collaboration to discover the reason for the lack of agreement revealed a coding error and resulted in a report documenting the methodology to produce the response function for the standard. The recommended changes in the NRT damage energy factors for silicon are shown to have a significant impact for a narrow energy region of the 1-MeV(Si) equivalent fluence response function. However, when evaluating integral metrics over all neutron energies in various spectra important to the SNL electronics testing community, the change in the response results in a small decrease in the total 1-MeV(Si) equivalent fluence of ~0.6% compared to the E722-14 response. Response functions based on the newly recommended NRT damage energy factors have been produced and are available for users of both the NuGET and MCNP codes.
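For context, the NRT model converts a PKA's Lindhard-partitioned damage energy T_dam into a displacement count: zero below the displacement threshold E_d, one up to 2E_d/0.8, and 0.8·T_dam/(2E_d) above that. The sketch below uses a placeholder threshold, not the parameter set of ASTM E722.

```python
# The standard textbook NRT recipe: displacements produced by a PKA with
# damage (Lindhard-partitioned) energy T_dam. The threshold value below is a
# placeholder, not the E722 parameter set.
def nrt_displacements(damage_energy_ev, threshold_ev=25.0):
    if damage_energy_ev < threshold_ev:
        return 0.0
    if damage_energy_ev < 2.0 * threshold_ev / 0.8:
        return 1.0
    return 0.8 * damage_energy_ev / (2.0 * threshold_ev)

for t_dam in (10.0, 40.0, 1.0e3, 1.0e5):   # eV, illustrative damage energies
    print(t_dam, nrt_displacements(t_dam))
```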
Turcot, Valérie; Lu, Yingchang; Highland, Heather M; Schurmann, Claudia; Justice, Anne E; Fine, Rebecca S; Bradfield, Jonathan P; Esko, Tõnu; Giri, Ayush; Graff, Mariaelisa; Guo, Xiuqing; Hendricks, Audrey E; Karaderi, Tugce; Lempradl, Adelheid; Locke, Adam E; Mahajan, Anubha; Marouli, Eirini; Sivapalaratnam, Suthesh; Young, Kristin L; Alfred, Tamuno; Feitosa, Mary F; Masca, Nicholas GD; Manning, Alisa K; Medina-Gomez, Carolina; Mudgal, Poorva; Ng, Maggie CY; Reiner, Alex P; Vedantam, Sailaja; Willems, Sara M; Winkler, Thomas W; Abecasis, Goncalo; Aben, Katja K; Alam, Dewan S; Alharthi, Sameer E; Allison, Matthew; Amouyel, Philippe; Asselbergs, Folkert W; Auer, Paul L; Balkau, Beverley; Bang, Lia E; Barroso, Inês; Bastarache, Lisa; Benn, Marianne; Bergmann, Sven; Bielak, Lawrence F; Blüher, Matthias; Boehnke, Michael; Boeing, Heiner; Boerwinkle, Eric; Böger, Carsten A; Bork-Jensen, Jette; Bots, Michiel L; Bottinger, Erwin P; Bowden, Donald W; Brandslund, Ivan; Breen, Gerome; Brilliant, Murray H; Broer, Linda; Brumat, Marco; Burt, Amber A; Butterworth, Adam S; Campbell, Peter T; Cappellani, Stefania; Carey, David J; Catamo, Eulalia; Caulfield, Mark J; Chambers, John C; Chasman, Daniel I; Chen, Yii-Der Ida; Chowdhury, Rajiv; Christensen, Cramer; Chu, Audrey Y; Cocca, Massimiliano; Collins, Francis S; Cook, James P; Corley, Janie; Galbany, Jordi Corominas; Cox, Amanda J; Crosslin, David S; Cuellar-Partida, Gabriel; D'Eustacchio, Angela; Danesh, John; Davies, Gail; de Bakker, Paul IW; de Groot, Mark CH; de Mutsert, Renée; Deary, Ian J; Dedoussis, George; Demerath, Ellen W; den Heijer, Martin; den Hollander, Anneke I; den Ruijter, Hester M; Dennis, Joe G; Denny, Josh C; Di Angelantonio, Emanuele; Drenos, Fotios; Du, Mengmeng; Dubé, Marie-Pierre; Dunning, Alison M; Easton, Douglas F; Edwards, Todd L; Ellinghaus, David; Ellinor, Patrick T; Elliott, Paul; Evangelou, Evangelos; Farmaki, Aliki-Eleni; Farooqi, I. 
Sadaf; Faul, Jessica D; Fauser, Sascha; Feng, Shuang; Ferrannini, Ele; Ferrieres, Jean; Florez, Jose C; Ford, Ian; Fornage, Myriam; Franco, Oscar H; Franke, Andre; Franks, Paul W; Friedrich, Nele; Frikke-Schmidt, Ruth; Galesloot, Tessel E.; Gan, Wei; Gandin, Ilaria; Gasparini, Paolo; Gibson, Jane; Giedraitis, Vilmantas; Gjesing, Anette P; Gordon-Larsen, Penny; Gorski, Mathias; Grabe, Hans-Jörgen; Grant, Struan FA; Grarup, Niels; Griffiths, Helen L; Grove, Megan L; Gudnason, Vilmundur; Gustafsson, Stefan; Haessler, Jeff; Hakonarson, Hakon; Hammerschlag, Anke R; Hansen, Torben; Harris, Kathleen Mullan; Harris, Tamara B; Hattersley, Andrew T; Have, Christian T; Hayward, Caroline; He, Liang; Heard-Costa, Nancy L; Heath, Andrew C; Heid, Iris M; Helgeland, Øyvind; Hernesniemi, Jussi; Hewitt, Alex W; Holmen, Oddgeir L; Hovingh, G Kees; Howson, Joanna MM; Hu, Yao; Huang, Paul L; Huffman, Jennifer E; Ikram, M Arfan; Ingelsson, Erik; Jackson, Anne U; Jansson, Jan-Håkan; Jarvik, Gail P; Jensen, Gorm B; Jia, Yucheng; Johansson, Stefan; Jørgensen, Marit E; Jørgensen, Torben; Jukema, J Wouter; Kahali, Bratati; Kahn, René S; Kähönen, Mika; Kamstrup, Pia R; Kanoni, Stavroula; Kaprio, Jaakko; Karaleftheri, Maria; Kardia, Sharon LR; Karpe, Fredrik; Kathiresan, Sekar; Kee, Frank; Kiemeney, Lambertus A; Kim, Eric; Kitajima, Hidetoshi; Komulainen, Pirjo; Kooner, Jaspal S; Kooperberg, Charles; Korhonen, Tellervo; Kovacs, Peter; Kuivaniemi, Helena; Kutalik, Zoltán; Kuulasmaa, Kari; Kuusisto, Johanna; Laakso, Markku; Lakka, Timo A; Lamparter, David; Lange, Ethan M; Lange, Leslie A; Langenberg, Claudia; Larson, Eric B; Lee, Nanette R; Lehtimäki, Terho; Lewis, Cora E; Li, Huaixing; Li, Jin; Li-Gao, Ruifang; Lin, Honghuang; Lin, Keng-Hung; Lin, Li-An; Lin, Xu; Lind, Lars; Lindström, Jaana; Linneberg, Allan; Liu, Ching-Ti; Liu, Dajiang J; Liu, Yongmei; Lo, Ken Sin; Lophatananon, Artitaya; Lotery, Andrew J; Loukola, Anu; Luan, Jian'an; Lubitz, Steven A; Lyytikäinen, Leo-Pekka; Männistö, Satu; Marenne, Gaëlle; Mazul, Angela L; McCarthy, Mark I; McKean-Cowdin, Roberta; Medland, Sarah E; Meidtner, Karina; Milani, Lili; Mistry, Vanisha; Mitchell, Paul; Mohlke, Karen L; Moilanen, Leena; Moitry, Marie; Montgomery, Grant W; Mook-Kanamori, Dennis O; Moore, Carmel; Mori, Trevor A; Morris, Andrew D; Morris, Andrew P; Müller-Nurasyid, Martina; Munroe, Patricia B; Nalls, Mike A; Narisu, Narisu; Nelson, Christopher P; Neville, Matt; Nielsen, Sune F; Nikus, Kjell; Njølstad, Pål R; Nordestgaard, Børge G; Nyholt, Dale R; O'Connel, Jeffrey R; O’Donoghue, Michelle L.; Olde Loohuis, Loes M; Ophoff, Roel A; Owen, Katharine R; Packard, Chris J; Padmanabhan, Sandosh; Palmer, Colin NA; Palmer, Nicholette D; Pasterkamp, Gerard; Patel, Aniruddh P; Pattie, Alison; Pedersen, Oluf; Peissig, Peggy L; Peloso, Gina M; Pennell, Craig E; Perola, Markus; Perry, James A; Perry, John RB; Pers, Tune H; Person, Thomas N; Peters, Annette; Petersen, Eva RB; Peyser, Patricia A; Pirie, Ailith; Polasek, Ozren; Polderman, Tinca J; Puolijoki, Hannu; Raitakari, Olli T; Rasheed, Asif; Rauramaa, Rainer; Reilly, Dermot F; Renström, Frida; Rheinberger, Myriam; Ridker, Paul M; Rioux, John D; Rivas, Manuel A; Roberts, David J; Robertson, Neil R; Robino, Antonietta; Rolandsson, Olov; Rudan, Igor; Ruth, Katherine S; Saleheen, Danish; Salomaa, Veikko; Samani, Nilesh J; Sapkota, Yadav; Sattar, Naveed; Schoen, Robert E; Schreiner, Pamela J; Schulze, Matthias B; Scott, Robert A; Segura-Lepe, Marcelo P; Shah, Svati H; Sheu, Wayne H-H; Sim, Xueling; Slater, Andrew J; Small, 
Kerrin S; Smith, Albert Vernon; Southam, Lorraine; Spector, Timothy D; Speliotes, Elizabeth K; Starr, John M; Stefansson, Kari; Steinthorsdottir, Valgerdur; Stirrups, Kathleen E; Strauch, Konstantin; Stringham, Heather M; Stumvoll, Michael; Sun, Liang; Surendran, Praveen; Swift, Amy J; Tada, Hayato; Tansey, Katherine E; Tardif, Jean-Claude; Taylor, Kent D; Teumer, Alexander; Thompson, Deborah J; Thorleifsson, Gudmar; Thorsteinsdottir, Unnur; Thuesen, Betina H; Tönjes, Anke; Tromp, Gerard; Trompet, Stella; Tsafantakis, Emmanouil; Tuomilehto, Jaakko; Tybjaerg-Hansen, Anne; Tyrer, Jonathan P; Uher, Rudolf; Uitterlinden, André G; Uusitupa, Matti; van der Laan, Sander W; van Duijn, Cornelia M; van Leeuwen, Nienke; van Setten, Jessica; Vanhala, Mauno; Varbo, Anette; Varga, Tibor V; Varma, Rohit; Velez Edwards, Digna R; Vermeulen, Sita H; Veronesi, Giovanni; Vestergaard, Henrik; Vitart, Veronique; Vogt, Thomas F; Völker, Uwe; Vuckovic, Dragana; Wagenknecht, Lynne E; Walker, Mark; Wallentin, Lars; Wang, Feijie; Wang, Carol A; Wang, Shuai; Wang, Yiqin; Ware, Erin B; Wareham, Nicholas J; Warren, Helen R; Waterworth, Dawn M; Wessel, Jennifer; White, Harvey D; Willer, Cristen J; Wilson, James G; Witte, Daniel R; Wood, Andrew R; Wu, Ying; Yaghootkar, Hanieh; Yao, Jie; Yao, Pang; Yerges-Armstrong, Laura M; Young, Robin; Zeggini, Eleftheria; Zhan, Xiaowei; Zhang, Weihua; Zhao, Jing Hua; Zhao, Wei; Zhao, Wei; Zhou, Wei; Zondervan, Krina T; Rotter, Jerome I; Pospisilik, John A; Rivadeneira, Fernando; Borecki, Ingrid B; Deloukas, Panos; Frayling, Timothy M; Lettre, Guillaume; North, Kari E; Lindgren, Cecilia M; Hirschhorn, Joel N; Loos, Ruth JF
2018-01-01
Genome-wide association studies (GWAS) have identified >250 loci for body mass index (BMI), implicating pathways related to neuronal biology. Most GWAS loci represent clusters of common, non-coding variants from which pinpointing causal genes remains challenging. Here, we combined data from 718,734 individuals to discover rare and low-frequency (MAF<5%) coding variants associated with BMI. We identified 14 coding variants in 13 genes, of which eight are in genes (ZBTB7B, ACHE, RAPGEF3, RAB21, ZFHX3, ENTPD6, ZFR2, ZNF169) newly implicated in human obesity, two (MC4R, KSR2) were previously observed in extreme obesity, and two are variants in GIPR. Effect sizes of rare variants are ~10 times larger than those of common variants, with the largest effect observed in carriers of an MC4R stop codon (p.Tyr35Ter, MAF=0.01%), who weigh ~7 kg more than non-carriers. Pathway analyses confirmed enrichment of neuronal genes and provide new evidence for adipocyte and energy expenditure biology, widening the potential of genetically-supported therapeutic targets to treat obesity. PMID:29273807
High Resolution Aerospace Applications using the NASA Columbia Supercomputer
NASA Technical Reports Server (NTRS)
Mavriplis, Dimitri J.; Aftosmis, Michael J.; Berger, Marsha
2005-01-01
This paper focuses on the parallel performance of two high-performance aerodynamic simulation packages on the newly installed NASA Columbia supercomputer. These packages include both a high-fidelity, unstructured, Reynolds-averaged Navier-Stokes solver, and a fully-automated inviscid flow package for cut-cell Cartesian grids. The complementary combination of these two simulation codes enables high-fidelity characterization of aerospace vehicle design performance over the entire flight envelope through extensive parametric analysis and detailed simulation of critical regions of the flight envelope. Both packages are industrial-level codes designed for complex geometry and incorporate customized multigrid solution algorithms. The performance of these codes on Columbia is examined using both MPI and OpenMP and using both the NUMAlink and InfiniBand interconnect fabrics. Numerical results demonstrate good scalability on up to 2016 CPUs using the NUMAlink4 interconnect, with measured computational rates in the vicinity of 3 TFLOP/s, while InfiniBand showed some performance degradation at high CPU counts, particularly with multigrid. Nonetheless, the results are encouraging enough to indicate that larger test cases using combined MPI/OpenMP communication should scale well on even more processors.
Advice about Work-Related Issues to Peers and Employers from Head and Neck Cancer Survivors
Dewa, Carolyn S.; Trojanowski, Lucy; Tamminga, Sietske J.; Ringash, Jolie; McQuestion, Maurene; Hoch, Jeffrey S.
2016-01-01
Purpose The purpose of this exploratory and descriptive study is to contribute to the sparse return-to-work literature on head and neck cancer (HNC) survivors. Interview participants were asked to reflect upon their work-related experience with cancer by answering two specific questions: (1) What advice would you give someone who has been newly diagnosed with head and neck cancer? (2) What advice would you give to employers of these people? Methods Data were gathered through 10 individual semi-structured in-depth interviews with HNC clinic patients at a regional cancer center’s head and neck clinic in Ontario, Canada. A constant comparative method of theme development was used. Codes identified in and derived from the data were discussed by research team members until consensus was reached. Codes with similar characteristics were grouped together and used to develop overarching themes. Results Work-related advice for peers focused on personal self-care and interactions within workplaces. Work-related advice to employers focused on demonstrating basic human values as well as the importance of communication. Discussion The study results suggest HNC clinic patients should be proactive with employers and help to set reasonable expectations and provide a realistic plan for work to be successfully completed. HNC clinic patients should develop communication skills to effectively disclose their cancer and treatment to employers. Conclusions In this exploratory study, HNC clinic patients’ advice was solution-focused, underscoring the importance of self-care and proactive communication and planning with employers. Employers were advised to demonstrate core human values throughout all phases of the work disability episode beginning at diagnosis. PMID:27070654
Evaluation of Advanced Thermal Protection Techniques for Future Reusable Launch Vehicles
NASA Technical Reports Server (NTRS)
Olds, John R.; Cowart, Kris
2001-01-01
A method for integrating aeroheating analysis into conceptual reusable launch vehicle (RLV) design is presented in this thesis. This process allows for faster turn-around time to converge an RLV design through the design of an optimized thermal protection system (TPS). It consists of the coupling and automation of four computer software packages: MINIVER, TPSX, TCAT and ADS. MINIVER is an aeroheating code that produces centerline radiation equilibrium temperatures, convective heating rates, and heat loads over simplified vehicle geometries. These include flat plates and swept cylinders that model wings and leading edges, respectively. TPSX is a NASA Ames material properties database that is available on the World Wide Web. The newly developed Thermal Calculation Analysis Tool (TCAT) uses finite difference methods to carry out a transient in-depth 1-D conduction analysis over the center mold line of the vehicle. This is used along with the Automated Design Synthesis (ADS) code to correctly size the vehicle's thermal protection system (TPS). The numerical optimizer ADS uses algorithms that solve constrained and unconstrained design problems. The resulting outputs for this process are TPS material types, unit thicknesses, and acreage percentages. TCAT was developed for several purposes. First, it provides a means to calculate the transient in-depth conduction seen by the surface of the TPS material that protects a vehicle during ascent and reentry. Along with the in-depth conduction, radiation from the surface of the material is calculated along with the temperatures at the backface and interior parts of the TPS material. Secondly, TCAT contributes added speed and automation to the overall design process. Another motivation in the development of TCAT is optimization.
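The transient in-depth 1-D conduction step that TCAT performs can be illustrated with a very small explicit finite-difference model. The sketch below is not TCAT itself: it assumes a constant applied heating rate, surface reradiation, an adiabatic back face, and placeholder material properties chosen only for the example.

```python
import numpy as np

def conduct_1d(q_applied, t_end, L=0.05, nx=51, k=0.06, rho=250.0, cp=1200.0,
               T_init=300.0, eps=0.85, sigma=5.670e-8):
    """Explicit 1-D transient conduction through a TPS slab of thickness L [m].

    q_applied is a constant applied heating rate [W/m^2]; the surface also
    reradiates, and the back face is adiabatic. All property values are
    placeholders, not those of any real TPS material.
    """
    dx = L / (nx - 1)
    alpha = k / (rho * cp)
    # time step limited by the interior diffusion stability bound and a small
    # cap that keeps the explicit surface radiation term stable
    dt = min(0.4 * dx**2 / alpha, 0.05)
    T = np.full(nx, T_init)
    for _ in range(int(t_end / dt)):
        q_net = q_applied - eps * sigma * T[0]**4        # heating minus reradiation
        Tn = T.copy()
        Tn[1:-1] = T[1:-1] + alpha * dt / dx**2 * (T[2:] - 2.0 * T[1:-1] + T[:-2])
        # half-cell energy balances at the heated surface and the adiabatic back face
        Tn[0] = T[0] + dt / (rho * cp * 0.5 * dx) * (q_net - k * (T[0] - T[1]) / dx)
        Tn[-1] = T[-1] + dt / (rho * cp * 0.5 * dx) * (k * (T[-2] - T[-1]) / dx)
        T = Tn
    return T

profile = conduct_1d(q_applied=2.0e5, t_end=120.0)
print(f"surface {profile[0]:.0f} K, back face {profile[-1]:.0f} K")
```

A TPS sizing loop of the kind described above would call such a model repeatedly while an optimizer adjusts thickness and material choice until back-face temperature constraints are met.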
Lee, Wan-Ju Annabelle; Cheng, Ching-Lan; Lee, Cheng-Han; Kao Yang, Yea-Huei; Lin, Swu-Jane; Hsieh, Cheng-Yang
2017-10-01
Age-related macular degeneration (AMD) is an eye disease causing blindness in the elderly. It shares many common possible pathogenic mechanisms with cardiovascular diseases. Many studies have discussed the association between AMD and stroke, but the results were inconsistent. Our aim was to determine the associations between neovascular AMD (nAMD) and the risk of stroke in the Taiwanese population. This is a retrospective cohort study. We used claims data from the National Health Insurance Research Database. Patients aged more than 45 years without stroke, myocardial infarction, or any AMD were selected from 2001 to 2008 and followed until 2010. The index date was defined as the date of nAMD diagnosis (ICD-9 code, 362.52). The comparison group comprised patients without an nAMD diagnosis who were age- and sex-matched to nAMD subjects at a ratio of up to 10 to 1. Kaplan-Meier survival analysis and Cox regression analysis were used. The incidence of stroke events (ICD-9 codes, 430-434) and their subtypes (hemorrhagic and ischemic) were primary outcomes. Secondary outcomes included acute myocardial infarction (AMI), composite AMI/stroke, and all-cause mortality. Patients with nAMD had a higher risk of developing stroke, with an adjusted HR of 1.30 (95% CI, 1.01-1.68). A higher risk for hemorrhagic stroke (HR, 1.70, 95% CI, 1.03-2.83) was also found. No significant differences were observed in ischemic stroke, the composite of AMI/stroke, and all-cause mortality. Patients with nAMD had a significantly higher risk of developing stroke, which was driven mainly by the increased risk of developing the hemorrhagic subtype. Copyright © 2017 John Wiley & Sons, Ltd.
NASA Astrophysics Data System (ADS)
Bogusz, Michael
1993-01-01
The need for a systematic methodology for the analysis of aircraft electromagnetic compatibility (EMC) problems is examined. The available computer aids used in aircraft EMC analysis are assessed and a theoretical basis is established for the complex algorithms which identify and quantify electromagnetic interactions. An overview is presented of one particularly well established aircraft antenna to antenna EMC analysis code, the Aircraft Inter-Antenna Propagation with Graphics (AAPG) Version 07 software. The specific new algorithms created to compute cone geodesics and their associated path losses and to graph the physical coupling path are discussed. These algorithms are validated against basic principles. Loss computations apply the uniform geometrical theory of diffraction and are subsequently compared to measurement data. The increased modelling and analysis capabilities of the newly developed AAPG Version 09 are compared to those of Version 07. Several models of real aircraft, namely the Electronic Systems Trainer Challenger, are generated and provided as a basis for this preliminary comparative assessment. Issues such as software reliability, algorithm stability, and quality of hardcopy output are also discussed.
LabVIEW control software for scanning micro-beam X-ray fluorescence spectrometer.
Wrobel, Pawel; Czyzycki, Mateusz; Furman, Leszek; Kolasinski, Krzysztof; Lankosz, Marek; Mrenca, Alina; Samek, Lucyna; Wegrzynek, Dariusz
2012-05-15
A confocal micro-beam X-ray fluorescence microscope was constructed. The system was assembled from commercially available components - a low power X-ray tube source, polycapillary X-ray optics and silicon drift detector - controlled by in-house-developed LabVIEW software. A video camera coupled to an optical microscope was utilized to display the area excited by the X-ray beam. The camera image calibration and scan area definition software were also based entirely on LabVIEW code. Presently, the main area of application of the newly constructed spectrometer is 2-dimensional mapping of element distribution in environmental, biological and geological samples with micrometer spatial resolution. The hardware and the developed software can already handle volumetric 3-D confocal scans. In this work, a front panel graphical user interface as well as communication protocols between hardware components were described. Two applications of the spectrometer, to homogeneity testing of titanium layers and to imaging of various types of grains in air particulate matter collected on membrane filters, were presented. Copyright © 2012 Elsevier B.V. All rights reserved.
[Observation of Attachment Disorder Symptoms in Middle Childhood].
Iwanski, Alexandra; Zimmermann, Peter
2018-05-01
Attachment in childhood is mainly assessed by observation. In contrast, assessment of attachment disorder symptoms (RAD) is mainly based on caregiver reports. The present study uses a newly developed observation tool (Coding of Attachment Disorder Behavior in Children; Iwanski & Zimmermann, 2013) to assess attachment disorder symptoms in a group of school-aged children from a risk group for the development of attachment disorder symptoms and non-clinical controls. In addition, caregiver reports on RAD symptoms are also assessed (Relationship Problems Questionnaire; Minnis, Rabe-Hesketh, Wolkind, 2002; Disturbances of Attachment Interview; Smyke & Zeanah, 1999). Moreover, associations with children's self-concept (Harter, 2012) were studied. Results reveal that children at risk showed more inhibited and disinhibited attachment disorder symptoms and a more negative self-concept compared to non-clinical controls. RAD symptoms are shown in interaction with both the caregiver and a stranger. The use of a reliable and valid observation tool for the diagnosis of attachment disorder symptoms, in addition to caregiver ratings, is recommended for clinical practice and research.
Tamošiūnas, Paulius Lukas; Petraitytė-Burneikienė, Rasa; Lasickienė, Rita; Sereika, Vilimas; Lelešius, Raimundas; Žvirblienė, Aurelija; Sasnauskas, Kęstutis
2014-01-01
Porcine parvovirus (PPV) is a widespread infectious virus that causes serious reproductive diseases of swine and death of piglets. The gene coding for the major capsid protein VP2 of PPV was amplified using viral nucleic acid extract from swine serum and inserted into yeast Saccharomyces cerevisiae expression plasmid. Recombinant PPV VP2 protein was efficiently expressed in yeast and purified using density gradient centrifugation. Electron microscopy analysis of purified PPV VP2 protein revealed the self-assembly of virus-like particles (VLPs). Nine monoclonal antibodies (MAbs) against the recombinant PPV VP2 protein were generated. The specificity of the newly generated MAbs was proven by immunofluorescence analysis of PPV-infected cells. Indirect IgG ELISA based on the recombinant VLPs for detection of PPV-specific antibodies in swine sera was developed and evaluated. The sensitivity and specificity of the new assay were found to be 93.4% and 97.4%, respectively. In conclusion, yeast S. cerevisiae represents a promising expression system for generating recombinant PPV VP2 protein VLPs of diagnostic relevance. PMID:25045718
QUANTIFICATION OF HEAT FLUX FROM A REACTING THERMITE SPRAY
DOE Office of Scientific and Technical Information (OSTI.GOV)
Eric Nixon; Michelle Pantoya
2009-07-01
Characterizing the combustion behaviors of energetic materials requires diagnostic tools that are often not readily or commercially available. For example, a jet of thermite spray provides a high temperature and pressure reaction that can also be highly corrosive and promote undesirable conditions for the survivability of any sensor. Developing a diagnostic to quantify heat flux from a thermite spray is the objective of this study. Quick response sensors such as thin film heat flux sensors can not survive the harsh conditions of the spray, but more rugged sensors lack the response time for the resolution desired. A sensor that will allow for adequate response time while surviving the entire test duration was constructed. The sensor outputs interior temperatures of the probes at known locations and utilizes an inverse heat conduction code to calculate heat flux values. The details of this device are discussed and illustrated. Temperature and heat flux measurements of various thermite spray conditions are reported. Results indicate that this newly developed energetic material heat flux sensor provides quantitative data with good repeatability.
Modeling NIF experimental designs with adaptive mesh refinement and Lagrangian hydrodynamics
NASA Astrophysics Data System (ADS)
Koniges, A. E.; Anderson, R. W.; Wang, P.; Gunney, B. T. N.; Becker, R.; Eder, D. C.; MacGowan, B. J.; Schneider, M. B.
2006-06-01
Incorporation of adaptive mesh refinement (AMR) into Lagrangian hydrodynamics algorithms allows for the creation of a highly powerful simulation tool effective for complex target designs with three-dimensional structure. We are developing an advanced modeling tool that includes AMR and traditional arbitrary Lagrangian-Eulerian (ALE) techniques. Our goal is the accurate prediction of vaporization, disintegration and fragmentation in National Ignition Facility (NIF) experimental target elements. Although our focus is on minimizing the generation of shrapnel in target designs and protecting the optics, the general techniques are applicable to modern advanced targets that include three-dimensional effects such as those associated with capsule fill tubes. Several essential computations in ordinary radiation hydrodynamics need to be redesigned in order to allow for AMR to work well with ALE, including algorithms associated with radiation transport. Additionally, for our goal of predicting fragmentation, we include elastic/plastic flow into our computations. We discuss the integration of these effects into a new ALE-AMR simulation code. Applications of this newly developed modeling tool as well as traditional ALE simulations in two and three dimensions are applied to NIF early-light target designs.
Wu, Yushan; Gong, Wanzhuo; Wang, Yangmei; Yong, Taiwen; Yang, Feng; Liu, Weigui; Wu, Xiaoling; Du, Junbo; Shu, Kai; Liu, Jiang; Liu, Chunyan; Yang, Wenyu
2018-03-29
Leaf anatomy and the stomatal development of developing leaves of plants have been shown to be regulated by the same light environment as that of mature leaves, but no report has yet been written on whether such a long-distance signal from mature leaves regulates the total leaf area of newly emerged leaves. To explore this question, we conducted an investigation in which we collected data on the leaf area, leaf mass per area (LMA), leaf anatomy, cell size, cell number, gas exchange and soluble sugar content of leaves from three soybean varieties grown under full sunlight (NS), with mature leaves shaded (MS), or with whole plants grown in shade (WS). Our results show that both MS and WS cause a marked decline in both leaf area and LMA in newly developing leaves. Leaf anatomy also showed characteristics of shade leaves with decreased leaf thickness, palisade tissue thickness, sponge tissue thickness, cell size and cell numbers. In addition, in the MS and WS treatments, newly developed leaves exhibited lower net photosynthetic rate (Pn), stomatal conductance (Gs) and transpiration rate (E), but higher carbon dioxide (CO2) concentration in the intercellular space (Ci) than plants grown in full sunlight. Moreover, soluble sugar content was significantly decreased in newly developed leaves in MS and WS treatments. These results clearly indicate that (1) leaf area, leaf anatomical structure, and photosynthetic function of newly developing leaves are regulated by a systemic irradiance signal from mature leaves; (2) decreased cell size and cell number are the major cause of smaller and thinner leaves in shade; and (3) sugars could possibly act as candidate signal substances to regulate leaf area systemically.
A Simple Test of Class-Level Genetic Association Can Reveal Novel Cardiometabolic Trait Loci.
Qian, Jing; Nunez, Sara; Reed, Eric; Reilly, Muredach P; Foulkes, Andrea S
2016-01-01
Characterizing the genetic determinants of complex diseases can be further augmented by incorporating knowledge of underlying structure or classifications of the genome, such as newly developed mappings of protein-coding genes, epigenetic marks, enhancer elements and non-coding RNAs. We apply a simple class-level testing framework, termed Genetic Class Association Testing (GenCAT), to identify protein-coding gene association with 14 cardiometabolic (CMD) related traits across 6 publicly available genome-wide association (GWA) meta-analysis data resources. GenCAT uses SNP-level meta-analysis test statistics across all SNPs within a class of elements, as well as the size of the class and its unique correlation structure, to determine if the class is statistically meaningful. The novelty of findings is evaluated through investigation of regional signals. A subset of findings is validated using recently updated, larger meta-analysis resources. A simulation study is presented to characterize overall performance with respect to power, control of family-wise error and computational efficiency. All analysis is performed using the GenCAT package, R version 3.2.1. We demonstrate that class-level testing complements the common first stage minP approach that involves individual SNP-level testing followed by post-hoc ascribing of statistically significant SNPs to genes and loci. GenCAT suggests 54 protein-coding genes at 41 distinct loci for the 13 CMD traits investigated in the discovery analysis that are beyond the discoveries of minP alone. An additional application to biological pathways demonstrates flexibility in defining genetic classes. We conclude that it would be prudent to include class-level testing as standard practice in GWA analysis. GenCAT, for example, can be used as a simple, complementary and efficient strategy for class-level testing that leverages existing data resources, requires only summary level data in the form of test statistics, and adds significant value with respect to its potential for identifying multiple novel and clinically relevant trait associations.
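The core idea of class-level testing, combining SNP-level summary statistics over all SNPs in a class while accounting for their correlation, can be sketched with a standard quadratic-form test. This is not the GenCAT statistic itself, only an illustration of the principle; the z-scores and the AR(1)-style LD matrix below are invented, and in practice the correlation structure would be estimated from a reference panel.

```python
import numpy as np
from scipy.linalg import solve, toeplitz
from scipy.stats import chi2

def class_level_test(z, R):
    """Quadratic-form class test: under H0, z ~ N(0, R), so z' R^{-1} z ~ chi2(k)."""
    q = z @ solve(R, z, assume_a="pos")
    return q, chi2.sf(q, df=len(z))

# toy example: 8 SNPs in one gene with AR(1)-like LD (rho = 0.6, assumed)
rho, k = 0.6, 8
R = toeplitz(rho ** np.arange(k))
z = np.array([0.5, 1.9, 2.4, 2.1, 0.3, -0.4, 1.2, 2.8])   # hypothetical SNP z-scores
q, p = class_level_test(z, R)
print(f"class statistic Q = {q:.2f}, p = {p:.3g}")
```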
2014-01-01
Background Down-regulation or silencing of transgene expression can be a major hurdle to both molecular studies and biotechnology applications in many plant species. Sugarcane is particularly effective at silencing introduced transgenes, including reporter genes such as the firefly luciferase gene. Synthesizing transgene coding sequences optimized for usage in the host plant is one method of enhancing transgene expression and stability. Using specified design rules we have synthesised new coding sequences for both the firefly luciferase and Renilla luciferase reporter genes. We have tested these optimized versions for enhanced levels of luciferase activity and for increased steady state luciferase mRNA levels in sugarcane. Results The synthetic firefly luciferase (luc*) and Renilla luciferase (Renluc*) coding sequences have elevated G + C contents in line with sugarcane codon usage, but maintain 75% identity to the native firefly or Renilla luciferase nucleotide sequences and 100% identity to the protein coding sequences. Under the control of the maize pUbi promoter, the synthetic luc* and Renluc* genes yielded 60x and 15x higher luciferase activity respectively, over the native firefly and Renilla luciferase genes in transient assays on sugarcane suspension cell cultures. Using a novel transient assay in sugarcane suspension cells combining co-bombardment and qRT-PCR, we showed that synthetic luc* and Renluc* genes generate increased transcript levels compared to the native firefly and Renilla luciferase genes. In stable transgenic lines, the luc* transgene generated significantly higher levels of expression than the native firefly luciferase transgene. The fold difference in expression was highest in the youngest tissues. Conclusions We developed synthetic versions of both the firefly and Renilla luciferase reporter genes that resist transgene silencing in sugarcane. These transgenes will be particularly useful for evaluating the expression patterns conferred by existing and newly isolated promoters in sugarcane tissues. The strategies used to design the synthetic luciferase transgenes could be applied to other transgenes that are aggressively silenced in sugarcane. PMID:24708613
Chou, Ting-Chun; Moyle, Richard L
2014-04-08
Down-regulation or silencing of transgene expression can be a major hurdle to both molecular studies and biotechnology applications in many plant species. Sugarcane is particularly effective at silencing introduced transgenes, including reporter genes such as the firefly luciferase gene. Synthesizing transgene coding sequences optimized for usage in the host plant is one method of enhancing transgene expression and stability. Using specified design rules we have synthesised new coding sequences for both the firefly luciferase and Renilla luciferase reporter genes. We have tested these optimized versions for enhanced levels of luciferase activity and for increased steady state luciferase mRNA levels in sugarcane. The synthetic firefly luciferase (luc*) and Renilla luciferase (Renluc*) coding sequences have elevated G + C contents in line with sugarcane codon usage, but maintain 75% identity to the native firefly or Renilla luciferase nucleotide sequences and 100% identity to the protein coding sequences. Under the control of the maize pUbi promoter, the synthetic luc* and Renluc* genes yielded 60x and 15x higher luciferase activity respectively, over the native firefly and Renilla luciferase genes in transient assays on sugarcane suspension cell cultures. Using a novel transient assay in sugarcane suspension cells combining co-bombardment and qRT-PCR, we showed that synthetic luc* and Renluc* genes generate increased transcript levels compared to the native firefly and Renilla luciferase genes. In stable transgenic lines, the luc* transgene generated significantly higher levels of expression than the native firefly luciferase transgene. The fold difference in expression was highest in the youngest tissues. We developed synthetic versions of both the firefly and Renilla luciferase reporter genes that resist transgene silencing in sugarcane. These transgenes will be particularly useful for evaluating the expression patterns conferred by existing and newly isolated promoters in sugarcane tissues. The strategies used to design the synthetic luciferase transgenes could be applied to other transgenes that are aggressively silenced in sugarcane.
Solar Wind Acceleration: Modeling Effects of Turbulent Heating in Open Flux Tubes
NASA Astrophysics Data System (ADS)
Woolsey, Lauren N.; Cranmer, Steven R.
2014-06-01
We present two self-consistent coronal heating models that determine the properties of the solar wind generated and accelerated in magnetic field geometries that are open to the heliosphere. These models require only the radial magnetic field profile as input. The first code, ZEPHYR (Cranmer et al. 2007), is a 1D MHD code that includes the effects of turbulent heating created by counter-propagating Alfven waves rather than relying on empirical heating functions. We present the analysis of a large grid of modeled flux tubes (> 400) and the resulting solar wind properties. From the models and results, we recreate the observed anti-correlation between wind speed at 1 AU and the so-called expansion factor, a parameterization of the magnetic field profile. We also find that our models follow the same observationally-derived relation between temperature at 1 AU and wind speed at 1 AU. We continue our analysis with a newly-developed code written in Python called TEMPEST (The Efficient Modified-Parker-Equation-Solving Tool) that runs an order of magnitude faster than ZEPHYR due to a set of simplifying relations between the input magnetic field profile and the temperature and wave reflection coefficient profiles. We present these simplifying relations as a useful result in themselves as well as the anti-correlation between wind speed and expansion factor also found with TEMPEST. Due to the nature of the algorithm TEMPEST utilizes to find solar wind solutions, we can effectively separate the two primary ways in which Alfven waves contribute to solar wind acceleration: 1) heating the surrounding gas through a turbulent cascade and 2) providing a separate source of wave pressure. We intend to make TEMPEST easily available to the public and suggest that TEMPEST can be used as a valuable tool in the forecasting of space weather, either as a stand-alone code or within an existing modeling framework.
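For readers unfamiliar with Parker-equation solvers, the classic isothermal limit already reduces to a pointwise root-finding problem, which conveys the spirit of this kind of solver even though TEMPEST itself treats a more general problem with turbulent heating and wave pressure. The coronal temperature and mean molecular weight below are assumptions chosen only for illustration.

```python
import numpy as np
from scipy.optimize import brentq

G, M_sun, R_sun = 6.674e-11, 1.989e30, 6.957e8     # SI units
k_B, m_p = 1.381e-23, 1.673e-27

def parker_wind(radii, T=1.5e6, mu=0.5):
    """Isothermal Parker wind speed [m/s] on the transonic branch.

    Solves w - ln(w) = 4 ln(r/r_c) + 4 r_c/r - 3 with w = (v/c_s)^2, where c_s is
    the isothermal sound speed and r_c the critical radius.
    """
    c_s = np.sqrt(k_B * T / (mu * m_p))
    r_c = G * M_sun / (2.0 * c_s**2)
    speeds = []
    for r in radii:
        rhs = 4.0 * np.log(r / r_c) + 4.0 * r_c / r - 3.0
        f = lambda w: w - np.log(w) - rhs
        # subsonic root below the critical radius, supersonic root above it
        w = brentq(f, 1e-8, 1.0) if r < r_c else brentq(f, 1.0, 1e3)
        speeds.append(c_s * np.sqrt(w))
    return np.array(speeds)

radii = np.array([1.1, 3.0, 10.0, 50.0, 215.0]) * R_sun    # 215 R_sun is roughly 1 AU
for r, v in zip(radii, parker_wind(radii)):
    print(f"r = {r / R_sun:6.1f} R_sun   v = {v / 1e3:6.1f} km/s")
```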
SOCR Analyses – an Instructional Java Web-based Statistical Analysis Toolkit
Chu, Annie; Cui, Jenny; Dinov, Ivo D.
2011-01-01
The Statistical Online Computational Resource (SOCR) designs web-based tools for educational use in a variety of undergraduate courses (Dinov 2006). Several studies have demonstrated that these resources significantly improve students' motivation and learning experiences (Dinov et al. 2008). SOCR Analyses is a new component that concentrates on data modeling and analysis using parametric and non-parametric techniques supported with graphical model diagnostics. Currently implemented analyses include commonly used models in undergraduate statistics courses like linear models (Simple Linear Regression, Multiple Linear Regression, One-Way and Two-Way ANOVA). In addition, we implemented tests for sample comparisons, such as the t-test in the parametric category, and the Wilcoxon rank sum test, Kruskal-Wallis test, and Friedman's test in the non-parametric category. SOCR Analyses also include several hypothesis test models, such as Contingency tables, Friedman's test and Fisher's exact test. The code itself is open source (http://socr.googlecode.com/), with the aim of contributing to the efforts of the statistical computing community. The code includes functionality for each specific analysis model and it has general utilities that can be applied in various statistical computing tasks. For example, concrete methods with API (Application Programming Interface) have been implemented in statistical summary, least square solutions of general linear models, rank calculations, etc. HTML interfaces, tutorials, source code, activities, and data are freely available via the web (www.SOCR.ucla.edu). Code examples for developers and demos for educators are provided on the SOCR Wiki website. In this article, the pedagogical utilization of the SOCR Analyses is discussed, as well as the underlying design framework. As the SOCR project is ongoing and more functions and tools are being added to it, these resources are constantly improved. The reader is strongly encouraged to check the SOCR site for the most updated information and newly added models. PMID:21546994
CHORUS code for solar and planetary convection
NASA Astrophysics Data System (ADS)
Wang, Junfeng
Turbulent, density stratified convection is ubiquitous in stars and planets. Numerical simulation has become an indispensable tool for understanding it. A primary contribution of this dissertation work is the creation of the Compressible High-ORder Unstructured Spectral-difference (CHORUS) code for simulating the convection and related fluid dynamics in the interiors of stars and planets. In this work, the CHORUS code is verified by using two newly defined benchmark cases and demonstrates excellent parallel performance. It has unique potential to simulate challenging physical phenomena such as multi-scale solar convection, core convection, and convection in oblate, rapidly-rotating stars. In order to exploit its unique capabilities, the CHORUS code has been extended to perform the first 3D simulations of convection in oblate, rapidly rotating solar-type stars. New insights are obtained with respect to the influence of oblateness on the convective structure and heat flux transport. With the presence of oblateness resulting from the centrifugal force effect, the convective structure in the polar regions decouples from the main convective modes in the equatorial regions. Our convection simulations predict that heat flux peaks in both the polar and equatorial regions, contrary to previous theoretical results that predict darker equators. High latitudinal zonal jets are also observed in the simulations.
Chadwick, Georgina; Varagunam, Mira; Brand, Christian; Riley, Stuart A; Maynard, Nick; Crosby, Tom; Michalowski, Julie; Cromwell, David A
2017-06-09
The International Classification of Diseases 10th Revision (ICD-10) system used in the English hospital administrative database (Hospital Episode Statistics (HES)) does not contain a specific code for oesophageal high-grade dysplasia (HGD). The aim of this paper was to examine how patients with HGD were coded in HES and whether it was done consistently. National population-based cohort study of patients newly diagnosed with HGD in England. The study used data collected prospectively as part of the National Oesophago-Gastric Cancer Audit (NOGCA). These records were linked to HES to investigate the pattern of ICD-10 codes recorded for these patients at the time of diagnosis. Participants were all patients with a new diagnosis of HGD between 1 April 2013 and 31 March 2014 in England who had data submitted to the NOGCA. The main outcome assessed was the pattern of primary and secondary ICD-10 diagnostic codes recorded in the HES records at endoscopy at the time of diagnosis of HGD. Among 452 patients with a new diagnosis of HGD between 1 April 2013 and 31 March 2014, Barrett's oesophagus was the only condition coded in 200 (44.2%) HES records. Records for 59 patients (13.1%) contained no oesophageal conditions. The remaining 193 patients had various diagnostic codes recorded: 93 included a diagnosis of Barrett's oesophagus and 57 included a diagnosis of oesophageal/gastric cardia cancer. HES is not suitable to support national studies looking at the management of HGD. This is one reason for the UK to adopt an extended ICD system (akin to ICD-10-CM). © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
Development of thermal protection system of the MUSES-C/DASH reentry capsule
NASA Astrophysics Data System (ADS)
Yamada, Tetsuya; Inatani, Yoshifumi; Honda, Masahisa; Hirai, Ken'ich
2002-07-01
In the final phase of the MUSES-C mission, a small capsule carrying the asteroid sample conducts a reentry flight directly from the interplanetary transfer orbit at a velocity of over 12 km/s. The severe heat flux, the complicated functional requirements, and the small weight budget impose several engineering challenges on the design of the thermal protection system of the capsule. The heat shield is required to function not only as an ablator but also as a structural component. The cloth-layered carbon-phenolic ablator, which has a higher allowable stress, is developed using a newly devised fabric method to avoid delamination due to the high aerodynamic heating. The ablation analysis code, which takes into account the effect of pyrolysis gas on the surface recession rate, has been developed and verified in arc-heating tests over a broad range of facility enthalpy levels. The capsule was designed to be ventilated during the reentry flight, up to about atmospheric pressure by the time of parachute deployment, by being sealed with a porous flow-restricting material. The design of the thermal protection system, the hardware specifications, and the ground-based test programs of both the MUSES-C and DASH capsules are summarized and discussed in this paper.
Rattay, Stephanie; Trilling, Mirko; Megger, Dominik A; Sitek, Barbara; Meyer, Helmut E; Hengel, Hartmut; Le-Trilling, Vu Thuy Khanh
2015-08-01
Transcription of mouse cytomegalovirus (MCMV) immediate early ie1 and ie3 is controlled by the major immediate early promoter/enhancer (MIEP) and requires differential splicing. Based on complete loss of genome replication of an MCMV mutant carrying a deletion of the ie3-specific exon 5, the multifunctional IE3 protein (611 amino acids; pIE611) is considered essential for viral replication. Our analysis of ie3 transcription resulted in the identification of novel ie3 isoforms derived from alternatively spliced ie3 transcripts. Construction of an IE3-hemagglutinin (IE3-HA) virus by insertion of an in-frame HA epitope sequence allowed detection of the IE3 isoforms in infected cells, verifying that the newly identified transcripts code for proteins. This prompted the construction of an MCMV mutant lacking ie611 but retaining the coding capacity for the newly identified isoforms ie453 and ie310. Using Δie611 MCMV, we demonstrated the dispensability of the canonical ie3 gene product pIE611 for viral replication. To determine the role of pIE611 for viral gene expression during MCMV infection in an unbiased global approach, we used label-free quantitative mass spectrometry to delineate pIE611-dependent changes of the MCMV proteome. Interestingly, further analysis revealed transcriptional as well as posttranscriptional regulation of MCMV gene products by pIE611. Cytomegaloviruses are pathogenic betaherpesviruses persisting in a lifelong latency from which reactivation can occur under conditions of immunosuppression, immunoimmaturity, or inflammation. The switch from latency to reactivation requires expression of immediate early genes. Therefore, understanding of immediate early gene regulation might add insights into viral pathogenesis. The mouse cytomegalovirus (MCMV) immediate early 3 protein (611 amino acids; pIE611) is considered essential for viral replication. The identification of novel protein isoforms derived from alternatively spliced ie3 transcripts prompted the construction of an MCMV mutant lacking ie611 but retaining the coding capacity for the newly identified isoforms ie453 and ie310. Using Δie611 MCMV, we demonstrated the dispensability of the canonical ie3 gene product pIE611 for viral replication and delineated pIE611-dependent changes of the MCMV proteome. Our findings have fundamental implications for the interpretation of earlier studies on pIE3 functions and highlight the complex orchestration of MCMV gene regulation. Copyright © 2015, American Society for Microbiology. All Rights Reserved.
Exploring the Perceptions of Newly Credentialed Athletic Trainers as They Transition to Practice
Walker, Stacy E.; Thrasher, Ashley B.; Mazerolle, Stephanie M.
2016-01-01
Context: Research is limited on the transition to practice of newly credentialed athletic trainers (ATs). Understanding this transition could provide insight to assist employers and professional programs in developing initiatives to enhance the transition. Objective: To explore newly credentialed ATs' experiences and feelings during their transition from student to autonomous practitioner. Design: Qualitative study. Setting: Individual phone interviews. Patients or Other Participants: Thirty-four ATs certified between January and September 2013 participated in this study (18 women, 16 men; age = 23.8 ± 2.1 years; work settings were collegiate, secondary school, clinic, and other). Data saturation guided the number of participants. Data Collection and Analysis: Participants were interviewed via phone using a semistructured interview guide. All interviews were recorded and transcribed verbatim. Data were analyzed through phenomenologic reduction, with data coded for common themes and subthemes. Credibility was established via member checks, peer review, and intercoder reliability. Results: The 3 themes that emerged from the data were (1) transition to practice preparation, (2) orientation, and (3) mentoring. Transition to practice was rarely discussed during professional preparation, but information on the organization and administration or capstone course (eg, insurance, documentation) assisted participants in their transition. Participants felt that preceptors influenced their transition by providing or hindering the number and quality of patient encounters. Participants from larger collegiate settings reported more formal orientation methods (eg, review policies, procedures manual), whereas those in secondary school, clinic/hospital, and smaller collegiate settings reported informal orientation methods (eg, independent review of policies and procedures, tours). Some participants were assigned a formal mentor, and others engaged in peer mentoring. Conclusions: Employers could enhance the transition to practice by providing formal orientation and mentorship. Professional programs could prepare students for the transition by discussing how to find support and mentoring and by involving preceptors who provide students with opportunities to give patient care. PMID:27710092
Wang, Han-Cheng; Lau, Chi-Ieong; Lin, Che-Chen; Chang, Anna; Kao, Chia-Hung
2016-07-01
This study evaluated the association between group A streptococcal (GAS) infections and the risks of developing tic disorders, obsessive-compulsive disorder (OCD), and attention-deficit/hyperactivity disorder (ADHD). We conducted a follow-up cohort study in 2014 using Taiwan's National Health Insurance Research Database. The study cohort consisted of patients younger than 18 years with newly diagnosed GAS infection (ICD-9-CM codes 034 [streptococcal sore throat and scarlet fever] and 482.31 [pneumonia due to Streptococcus, group A]) from 2001 to 2010. All patients having GAS infection codes between 1996 and 2000 were excluded. We assessed the patients' risks of developing tic disorders, OCD, and ADHD (ICD-9-CM codes 300.3 [obsessive-compulsive disorders], 301.4 [obsessive-compulsive personality disorder], 307.2 [tic disorder, unspecified], and 314 [attention deficit disorder]) and compared these risks with those of a control cohort. The primary outcomes of this study were the overall neuropsychiatric disorder occurrence and the occurrence of separate subtypes. We examined 2,596 patients and 25,960 controls. The incidence of neuropsychiatric disorders in the GAS infection cohort (60.42 per 10,000 person-years) was significantly higher than that in the comparison cohort (49.32 per 10,000 person-years) (hazard ratio [HR] = 1.22; 95% CI, 1.00-1.49). The largest increased risk was for tic disorders (HR = 1.63; 95% CI, 1.02-2.62). Patients hospitalized for GAS infection had a 1.96-fold higher risk of neuropsychiatric disorders than did people without GAS infection (HR = 1.96; 95% CI, 1.23-3.12), and there was no difference in risk between outpatients with GAS infection and people without GAS infection (HR = 1.14; 95% CI, 0.92-1.41). Patients with moderate or high frequencies of GAS infection-related clinic visits had much higher risks of developing a neuropsychiatric disorder and, specifically, tic disorders and ADHD (all P values for trend < .05). These risks were not increased in patients with a low frequency of clinic visits. Our results confirmed an association between previous group A streptococcal infection and neuropsychiatric disorders. © Copyright 2016 Physicians Postgraduate Press, Inc.
Ohno, S
1984-01-01
Three outstanding properties uniquely qualify repeats of base oligomers as the primordial coding sequences of all polypeptide chains. First, when compared with randomly generated base sequences in general, they are more likely to have long open reading frames. Second, periodical polypeptide chains specified by such repeats are more likely to assume either alpha-helical or beta-sheet secondary structures than are polypeptide chains of random sequence. Third, provided that the number of bases in the oligomeric unit is not a multiple of 3, these internally repetitious coding sequences are impervious to randomly sustained base substitutions, deletions, and insertions. This is because the recurring periodicity of their polypeptide chains is given by three consecutive copies of the oligomeric unit translated in three different reading frames. Accordingly, when one reading frame is open, the other two are automatically open as well, all three being capable of coding for polypeptide chains of identical periodicity. Under this circumstance, a frame shift due to the deletion or insertion of a number of bases that is not a multiple of 3 fails to alter the down-stream amino acid sequence, and even a base change causing premature chain-termination can silence only one of the three potential coding units. Newly arisen coding sequences in modern organisms are oligomeric repeats, and most of the older genes retain various vestiges of their original internal repetitions. Some of the genes (e.g., oncogenes) have even inherited the property of being impervious to randomly sustained base changes.
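The reading-frame argument is easy to verify computationally. The short sketch below uses an arbitrary 4-base unit (4 not being a multiple of 3), extracts the codons of the repeat in all three frames, and confirms that every frame is periodic with the same codon period lcm(L, 3)/3 and that, for this particular unit, none of the three frames contains a stop codon.

```python
from math import lcm   # Python 3.9+

unit = "GCTG"                       # arbitrary 4-base unit; 4 is not a multiple of 3
seq = unit * 30
period = lcm(len(unit), 3) // 3     # codon period shared by all three reading frames
stops = {"TAA", "TAG", "TGA"}

for frame in range(3):
    codons = [seq[i:i + 3] for i in range(frame, len(seq) - 2, 3)]
    periodic = all(codons[i] == codons[i + period] for i in range(len(codons) - period))
    open_frame = not stops.intersection(codons)
    print(f"frame {frame}: repeating codons {codons[:period]}, "
          f"periodic={periodic}, open={open_frame}")
```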
Yang, Bo Ram; Kang, Young Ae; Heo, Eun Young; Koo, Bo Kyung; Choi, Nam-Kyong; Hwang, Seung-Sik; Lee, Chang-Hoon
2018-04-01
There are regional differences in the burden of tuberculosis (TB). Although these differences might be explained by regional differences in the risk factors of TB, whether such risk factors are actually associated with the regional differences in the TB burden remains unclear. This study aimed to investigate the relationship between the risk factors of and regional differences in TB incidence. A cohort study applying the nationwide claims database of the Republic of Korea included patients newly diagnosed with type 2 diabetes mellitus (DM) in 2009. The main outcome was the incidence of TB defined based on the diagnostic codes combined with anti-tuberculosis treatment repeated within 90 days. Sixteen regions were categorized into 3 groups according to the age- and sex-standardized TB incidence rates. Multivariate logistic regression analysis adjusted for risk factors was performed to identify the determinants of the regional differences in TB incidence. Among 331 601 participants newly diagnosed with type 2 DM and with no history of previous TB, 1216 TB cases were observed. The regional TB incidence rates ranged between 2.3 and 5.9/1000 patients. Multivariate analyses did not identify any determinants of regional differences in the TB incidence among the various risk factors, including age, sex, health care utilization, co-morbidities, medication and treatment and complications of DM. Similarly, temperature, humidity and latent TB infection rate also did not affect the results. Although substantial regional differences in the TB incidence rate were observed among patients with newly diagnosed DM, no determinants of regional differences were identified among the risk factors. © 2017 John Wiley & Sons Ltd.
Tóth, Lola; Fábos, Beáta; Farkas, Katalin; Sulák, Adrienn; Tripolszki, Kornélia; Széll, Márta; Nagy, Nikoletta
2017-03-15
Oculocutaneous albinism (OCA) is a clinically and genetically heterogeneous group of pigmentation abnormalities. OCA type IV (OCA4, OMIM 606574) develops due to homozygous or compound heterozygous mutations in the solute carrier family 45, member 2 (SLC45A2) gene. This gene encodes a membrane-associated transport protein, which regulates tyrosinase activity and, thus, melanin content by changing melanosomal pH and disrupting the incorporation of copper into tyrosinase. Here we report two Hungarian siblings affected by an unusual OCA4 phenotype. After genomic DNA was isolated from peripheral blood of the patients, the coding regions of the SLC45A2 gene were sequenced. In silico tools were applied to identify the functional impact of the newly detected mutations. Direct sequencing of the SLC45A2 gene revealed two novel, heterozygous mutations, one missense (c.1226G > A, p.Gly409Asp) and one nonsense (c.1459C > T, p.Gln437*), which were present in both patients, suggesting the mutations were compound heterozygous. In silico tools suggest that these variations are disease-causing mutations. The newly identified mutations may affect the transmembrane domains of the protein, and could impair transport function, resulting in decreases in both melanosomal pH and tyrosinase activity. Our study expands the known mutation spectrum of the SLC45A2 gene and the genetic background of OCA4.
Xu, Stanley; Newcomer, Sophia; Nelson, Jennifer; Qian, Lei; McClure, David; Pan, Yi; Zeng, Chan; Glanz, Jason
2014-05-01
The Vaccine Safety Datalink project captures electronic health record data including vaccinations and medically attended adverse events on 8.8 million enrollees annually from participating managed care organizations in the United States. While the automated vaccination data are generally of high quality, a presumptive adverse event based on diagnosis codes in automated health care data may not be true (misclassification). Consequently, analyses using automated health care data can generate false positive results, where an association between the vaccine and outcome is incorrectly identified, as well as false negative findings, where a true association or signal is missed. We developed novel conditional Poisson regression models and fixed effects models that accommodate misclassification of the adverse event outcome for the self-controlled case series design. We conducted simulation studies to evaluate their performance in signal detection in vaccine safety hypothesis-generating (screening) studies. We also reanalyzed four previously identified signals in a recent vaccine safety study using the newly proposed models. Our simulation studies demonstrated that (i) outcome misclassification resulted in both false positive and false negative signals in screening studies; (ii) the newly proposed models reduced both the rates of false positive and false negative signals. In reanalyses of four previously identified signals using the novel statistical models, the incidence rate ratio estimates and statistical significances were similar to those using conventional models and including only medical record review confirmed cases. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
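To see why outcome misclassification pushes a self-controlled case series (SCCS) estimate toward the null, consider the simplest single-risk-window SCCS, in which the incidence rate ratio can be estimated from the fraction of events falling inside the risk window. The sketch below is only a toy simulation with invented numbers, not the conditional Poisson or fixed-effects models proposed in the study: false-positive 'events' scattered uniformly over observation time dilute the estimated rate ratio.

```python
import numpy as np

rng = np.random.default_rng(0)

true_irr = 2.0                      # true incidence rate ratio inside the risk window
tau_risk, tau_ctrl = 42.0, 323.0    # risk-window vs control person-time in days (assumed)
n_true, n_false = 800, 200          # 20% of presumptive cases are misclassified

def estimate_irr(in_risk):
    """Single-window SCCS estimator: within-person odds of the event falling in the
    risk window, rescaled by the ratio of control to risk person-time."""
    k, n = in_risk.sum(), in_risk.size
    return (k / (n - k)) * (tau_ctrl / tau_risk)

p_true = true_irr * tau_risk / (true_irr * tau_risk + tau_ctrl)
p_false = tau_risk / (tau_risk + tau_ctrl)    # false positives carry no vaccine effect

true_cases = rng.random(n_true) < p_true
all_cases = np.concatenate([true_cases, rng.random(n_false) < p_false])

print(f"IRR from true cases only        : {estimate_irr(true_cases):.2f}")
print(f"IRR with misclassified outcomes : {estimate_irr(all_cases):.2f} (true = 2.0)")
```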
Coleman, Craig I; Vaitsiakhovich, Tatsiana; Nguyen, Elaine; Weeda, Erin R; Sood, Nitesh A; Bunz, Thomas J; Schaefer, Bernhard; Meinecke, Anna-Katharina; Eriksson, Daniel
2018-01-01
Schemas to identify bleeding-related hospitalizations in claims data differ in billing codes used and coding positions allowed. We assessed agreement across bleeding-related hospitalization coding schemas for claims analyses of nonvalvular atrial fibrillation (NVAF) patients on oral anticoagulation (OAC). We hypothesized that prior coding schemas used to identify bleeding-related hospitalizations in claim database studies would provide varying levels of agreement in incidence rates. Within MarketScan data, we identified adults newly started on OAC for NVAF from January 2012 to June 2015. Billing code schemas developed by Cunningham et al., the US Food and Drug Administration (FDA) Mini-Sentinel program, and Yao et al. were used to identify bleeding-related hospitalizations as a surrogate for major bleeding. Bleeds were subcategorized as intracranial hemorrhage (ICH), gastrointestinal (GI), or other. Schema agreement was assessed by comparing incidence, rates of events/100 person-years (PYs), and Cohen's kappa statistic. We identified 151 738 new users of OAC with NVAF (CHA2DS2-VASc score = 3 [interquartile range = 2-4] and median HAS-BLED score = 3 [interquartile range = 2-3]). The Cunningham, FDA Mini-Sentinel, and Yao schemas identified any bleeding-related hospitalizations in 1.87% (95% confidence interval [CI]: 1.81-1.94), 2.65% (95% CI: 2.57-2.74), and 4.66% (95% CI: 4.55-4.76) of patients (corresponding rates = 3.45, 4.90, and 8.65 events/100 PYs). Kappa agreement across schemas was weak-to-moderate (κ = 0.47-0.66) for any bleeding hospitalization. Near-perfect agreement (κ = 0.99) was observed with the FDA Mini-Sentinel and Yao schemas for ICH-related hospitalizations, but agreement was weak when comparing Cunningham to FDA Mini-Sentinel or Yao (κ = 0.52-0.53). FDA Mini-Sentinel and Yao agreement was moderate (κ = 0.62) for GI bleeding, but agreement was weak when comparing Cunningham to FDA Mini-Sentinel or Yao (κ = 0.44-0.56). For other bleeds, agreement across schemas was minimal (κ = 0.14-0.38). We observed varying levels of agreement among 3 bleeding-related hospitalization schemas in NVAF patients. © 2018 Wiley Periodicals, Inc.
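For readers unfamiliar with the agreement metric used above, Cohen's kappa compares observed agreement between two binary classifications with the agreement expected by chance. The toy sketch below uses simulated flags, with a narrower schema nested inside a broader one at prevalences of roughly the same magnitude as those reported above (all numbers invented), and shows how a broader schema can yield only moderate kappa even when it captures every event the narrower schema does.

```python
import numpy as np

rng = np.random.default_rng(1)

def cohens_kappa(a, b):
    """Cohen's kappa for two binary classifications of the same subjects."""
    a, b = np.asarray(a, bool), np.asarray(b, bool)
    po = np.mean(a == b)                                          # observed agreement
    pe = a.mean() * b.mean() + (1 - a.mean()) * (1 - b.mean())    # chance agreement
    return (po - pe) / (1 - pe)

n = 150_000
narrow = rng.random(n) < 0.02                  # narrow schema flags ~2% of patients
broad = narrow | (rng.random(n) < 0.03)        # broader schema adds ~3% more patients
print(f"kappa between the two schemas: {cohens_kappa(narrow, broad):.2f}")
```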
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lazerson, Samuel A.; Loizu, Joaquim; Hirshman, Steven
The VMEC nonlinear ideal MHD equilibrium code [S. P. Hirshman and J. C. Whitson, Phys. Fluids 26, 3553 (1983)] is compared against analytic linear ideal MHD theory in a screw-pinch-like configuration. The focus of such analysis is to verify the ideal MHD response at magnetic surfaces which possess magnetic transform (ι) which is resonant with spectral values of the perturbed boundary harmonics. A large aspect ratio circular cross section zero-beta equilibrium is considered. This equilibrium possesses a rational surface with safety factor q = 2 at a normalized flux value of 0.5. A small resonant boundary perturbation is introduced, exciting a response at the resonant rational surface. The code is found to capture the plasma response as predicted by a newly developed analytic theory that ensures the existence of nested flux surfaces by allowing for a jump in rotational transform (ι=1/q). The VMEC code satisfactorily reproduces these theoretical results without the necessity of an explicit transform discontinuity (Δι) at the rational surface. It is found that the response across the rational surfaces depends upon both radial grid resolution and local shear (dι/dΦ, where ι is the rotational transform and Φ the enclosed toroidal flux). Calculations of an implicit Δι suggest that it does not arise due to numerical artifacts (attributed to radial finite differences in VMEC) or existence conditions for flux surfaces as predicted by linear theory (minimum values of Δι). Scans of the rotational transform profile indicate that for experimentally relevant levels of transform shear the response becomes increasingly localised. Furthermore, careful examination of a large experimental tokamak equilibrium, with applied resonant fields, indicates that this shielding response is present, suggesting the phenomenon is not limited to this verification exercise.
Inflammatory bowel disease and risk of Parkinson's disease in Medicare beneficiaries.
Camacho-Soto, Alejandra; Gross, Anat; Searles Nielsen, Susan; Dey, Neelendu; Racette, Brad A
2018-05-01
Gastrointestinal (GI) dysfunction precedes the motor symptoms of Parkinson's disease (PD) by several years. PD patients have abnormal aggregation of intestinal α-synuclein, the accumulation of which may be promoted by inflammation. The relationship between intestinal α-synuclein aggregates and central nervous system neuropathology is unknown. Recently, we observed a possible inverse association between inflammatory bowel disease (IBD) and PD as part of a predictive model of PD. Therefore, the objective of this study was to examine the relationship between PD risk and IBD and IBD-associated conditions and treatment. Using a case-control design, we identified 89,790 newly diagnosed PD cases and 118,095 population-based controls >65 years of age using comprehensive Medicare data from 2004-2009 including detailed claims data. We classified IBD using International Classification of Diseases version 9 (ICD-9) diagnosis codes. We used logistic regression to calculate odds ratios (ORs) and 95% confidence intervals (CIs) to evaluate the association between PD and IBD. Covariates included age, sex, race/ethnicity, smoking, Elixhauser comorbidities, and health care use. PD was inversely associated with IBD overall (OR = 0.85, 95% CI 0.80-0.91) and with both Crohn's disease (OR = 0.83, 95% CI 0.74-0.93) and ulcerative colitis (OR = 0.88, 95% CI 0.82-0.96). Among beneficiaries with ≥2 ICD-9 codes for IBD, there was an inverse dose-response association between number of IBD ICD-9 codes, as a potential proxy for IBD severity, and PD (p-for-trend = 0.006). IBD is associated with a lower risk of developing PD. Copyright © 2018 Elsevier Ltd. All rights reserved.
Lineage-Specific Biology Revealed by a Finished Genome Assembly of the Mouse
Hillier, LaDeana W.; Zody, Michael C.; Goldstein, Steve; She, Xinwe; Bult, Carol J.; Agarwala, Richa; Cherry, Joshua L.; DiCuccio, Michael; Hlavina, Wratko; Kapustin, Yuri; Meric, Peter; Maglott, Donna; Birtle, Zoë; Marques, Ana C.; Graves, Tina; Zhou, Shiguo; Teague, Brian; Potamousis, Konstantinos; Churas, Christopher; Place, Michael; Herschleb, Jill; Runnheim, Ron; Forrest, Daniel; Amos-Landgraf, James; Schwartz, David C.; Cheng, Ze; Lindblad-Toh, Kerstin; Eichler, Evan E.; Ponting, Chris P.
2009-01-01
The mouse (Mus musculus) is the premier animal model for understanding human disease and development. Here we show that a comprehensive understanding of mouse biology is only possible with the availability of a finished, high-quality genome assembly. The finished clone-based assembly of the mouse strain C57BL/6J reported here has over 175,000 fewer gaps and over 139 Mb more of novel sequence, compared with the earlier MGSCv3 draft genome assembly. In a comprehensive analysis of this revised genome sequence, we are now able to define 20,210 protein-coding genes, over a thousand more than predicted in the human genome (19,042 genes). In addition, we identified 439 long, non–protein-coding RNAs with evidence for transcribed orthologs in human. We analyzed the complex and repetitive landscape of 267 Mb of sequence that was missing or misassembled in the previously published assembly, and we provide insights into the reasons for its resistance to sequencing and assembly by whole-genome shotgun approaches. Duplicated regions within newly assembled sequence tend to be of more recent ancestry than duplicates in the published draft, correcting our initial understanding of recent evolution on the mouse lineage. These duplicates appear to be largely composed of sequence regions containing transposable elements and duplicated protein-coding genes; of these, some may be fixed in the mouse population, but at least 40% of segmentally duplicated sequences are copy number variable even among laboratory mouse strains. Mouse lineage-specific regions contain 3,767 genes drawn mainly from rapidly-changing gene families associated with reproductive functions. The finished mouse genome assembly, therefore, greatly improves our understanding of rodent-specific biology and allows the delineation of ancestral biological functions that are shared with human from derived functions that are not. PMID:19468303
NASA Astrophysics Data System (ADS)
Fable, E.; Angioni, C.; Ivanov, A. A.; Lackner, K.; Maj, O.; Medvedev, S. Yu; Pautasso, G.; Pereverzev, G. V.; Treutterer, W.; the ASDEX Upgrade Team
2013-07-01
The modelling of tokamak scenarios requires the simultaneous solution of both the time evolution of the plasma kinetic profiles and of the magnetic equilibrium. Their dynamical coupling involves additional complications, which are not present when the two physical problems are solved separately. Difficulties arise in maintaining consistency in the time evolution among quantities which appear in both the transport and the Grad-Shafranov equations, specifically the poloidal and toroidal magnetic fluxes as a function of each other and of the geometry. The required consistency can be obtained by means of iteration cycles, which are performed outside the equilibrium code and which can have different convergence properties depending on the chosen numerical scheme. When these external iterations are performed, the stability of the coupled system becomes a concern. In contrast, if these iterations are not performed, the coupled system is numerically stable, but can become physically inconsistent. By employing a novel scheme (Fable E et al 2012 Nucl. Fusion submitted), which ensures stability and physical consistency among the same quantities that appear in both the transport and magnetic equilibrium equations, a newly developed version of the ASTRA transport code (Pereverzev G V et al 1991 IPP Report 5/42), which is coupled to the SPIDER equilibrium code (Ivanov A A et al 2005 32nd EPS Conf. on Plasma Physics (Tarragona, 27 June-1 July) vol 29C (ECA) P-5.063), in both prescribed- and free-boundary modes is presented here for the first time. The ASTRA-SPIDER coupled system is then applied to the specific study of the modelling of controlled current ramp-up in ASDEX Upgrade discharges.
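The consistency problem described above can be viewed as a fixed-point iteration between a transport step and an equilibrium step. The toy loop below illustrates such an external iteration cycle with under-relaxation and a convergence check; the update functions and the relaxation factor are invented for illustration and do not represent the ASTRA-SPIDER scheme.

    import numpy as np

    def transport_step(geometry_factor):
        # Toy "transport solver": returns a pressure-like profile that depends on
        # the current geometry (stand-in for the kinetic profile evolution).
        rho = np.linspace(0.0, 1.0, 51)
        return (1.0 - rho**2) * geometry_factor

    def equilibrium_step(profile):
        # Toy "equilibrium solver": returns a scalar geometry factor that depends
        # on the volume-averaged profile (stand-in for the Grad-Shafranov solve).
        return 1.0 + 0.2 * profile.mean()

    geometry = 1.0
    for it in range(100):
        profile = transport_step(geometry)
        new_geometry = equilibrium_step(profile)
        relaxed = 0.5 * geometry + 0.5 * new_geometry   # under-relax the external iteration
        if abs(relaxed - geometry) < 1e-10:
            print(f"converged after {it + 1} iterations, geometry = {relaxed:.6f}")
            break
        geometry = relaxed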
GeneBuilder: interactive in silico prediction of gene structure.
Milanesi, L; D'Angelo, D; Rogozin, I B
1999-01-01
Prediction of gene structure in newly sequenced DNA becomes very important in large genome sequencing projects. This problem is complicated due to the exon-intron structure of eukaryotic genes and because gene expression is regulated by many different short nucleotide domains. In order to be able to analyse the full gene structure in different organisms, it is necessary to combine information about potential functional signals (promoter region, splice sites, start and stop codons, 3' untranslated region) together with the statistical properties of coding sequences (coding potential), information about homologous proteins, ESTs and repeated elements. We have developed the GeneBuilder system, which is based on prediction of functional signals and coding regions by different approaches in combination with similarity searches in protein and EST databases. The potential gene structure models are obtained by using a dynamic programming method. The program permits the use of several parameters for gene structure prediction and refinement. During gene model construction, selecting different exon homology levels with a protein sequence chosen from a list of homologous proteins can improve the accuracy of the gene structure prediction. In the case of low homology, GeneBuilder is still able to predict the gene structure. The GeneBuilder system has been tested using the standard set (Burset and Guigo, Genomics, 34, 353-367, 1996) and the performances are: 0.89 sensitivity and 0.91 specificity at the nucleotide level. The total correlation coefficient is 0.88. The GeneBuilder system is implemented as part of WebGene, at the URL http://www.itba.mi.cnr.it/webgene, and of the TRADAT (TRAnscription Database and Analysis Tools) launcher, at the URL http://www.itba.mi.cnr.it/tradat.
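The dynamic programming step used to assemble gene models from scored candidates can be illustrated with a generic weighted-interval chaining problem: choose a set of non-overlapping candidate exons with maximal total score. The sketch below only illustrates that idea, with invented coordinates and scores; it is not GeneBuilder's actual algorithm.

    import bisect

    # Chain non-overlapping candidate exons by maximum total score -- a generic
    # weighted-interval dynamic programme illustrating gene-model assembly.
    candidates = [  # (start, end, score) -- hypothetical exon candidates
        (10, 50, 3.2), (40, 90, 2.1), (60, 120, 4.0), (130, 180, 1.5), (150, 200, 2.7),
    ]
    candidates.sort(key=lambda exon: exon[1])      # sort by end coordinate
    ends = [exon[1] for exon in candidates]

    best = [0.0] * (len(candidates) + 1)           # best[i]: best score using the first i candidates
    prev = [None] * (len(candidates) + 1)          # prev[i]: chain link if candidate i is taken

    for i, (start, end, score) in enumerate(candidates, 1):
        j = bisect.bisect_left(ends, start)        # candidates ending strictly before this start
        if best[j] + score > best[i - 1]:
            best[i] = best[j] + score
            prev[i] = j
        else:
            best[i] = best[i - 1]

    chain, i = [], len(candidates)
    while i > 0:                                   # trace back the selected exon chain
        if prev[i] is not None:
            chain.append(candidates[i - 1])
            i = prev[i]
        else:
            i -= 1
    chain.reverse()
    print("best total score:", best[-1])
    print("selected exon chain:", chain)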
Ning, Shangwei; Yue, Ming; Wang, Peng; Liu, Yue; Zhi, Hui; Zhang, Yan; Zhang, Jizhou; Gao, Yue; Guo, Maoni; Zhou, Dianshuang; Li, Xin; Li, Xia
2017-01-04
We describe LincSNP 2.0 (http://bioinfo.hrbmu.edu.cn/LincSNP), an updated database that is used specifically to store and annotate disease-associated single nucleotide polymorphisms (SNPs) in human long non-coding RNAs (lncRNAs) and their transcription factor binding sites (TFBSs). In LincSNP 2.0, we have updated the database with more data and several new features, including (i) expanding disease-associated SNPs in human lncRNAs; (ii) identifying disease-associated SNPs in lncRNA TFBSs; (iii) updating LD-SNPs from the 1000 Genomes Project; and (iv) collecting more experimentally supported SNP-lncRNA-disease associations. Furthermore, we developed three flexible online tools to retrieve and analyze the data. Linc-Mart is a convenient way for users to customize their own data. Linc-Browse is a tool for all data visualization. Linc-Score predicts the associations between lncRNA and disease. In addition, we provided users a newly designed, user-friendly interface to search and download all the data in LincSNP 2.0 and we also provided an interface to submit novel data into the database. LincSNP 2.0 is a continually updated database and will serve as an important resource for investigating the functions and mechanisms of lncRNAs in human diseases. © The Author(s) 2016. Published by Oxford University Press on behalf of Nucleic Acids Research.
NASA Astrophysics Data System (ADS)
Pei, C.; Bieber, J. W.; Burger, R. A.; Clem, J.
2010-12-01
We present a detailed description of our newly developed stochastic approach for solving Parker's transport equation, which we believe is the first attempt to solve it with time dependence in 3-D, evolving from our 3-D steady state stochastic approach. Our formulation of this method is general and is valid for any type of heliospheric magnetic field, although we choose the standard Parker field as an example to illustrate the steps to calculate the transport of galactic cosmic rays. Our 3-D stochastic method is different from other stochastic approaches in the literature in several ways. For example, we employ spherical coordinates to integrate directly, which makes the code much more efficient by reducing coordinate transformations. What is more, the equivalence between our stochastic differential equations and Parker's transport equation is guaranteed by Ito's theorem in contrast to some other approaches. We generalize the technique for calculating particle flux based on the pseudoparticle trajectories for steady state solutions and for time-dependent solutions in 3-D. To validate our code, first we show that good agreement exists between solutions obtained by our steady state stochastic method and a traditional finite difference method. Then we show that good agreement also exists for our time-dependent method for an idealized and simplified heliosphere which has a Parker magnetic field and a simple initial condition for two different inner boundary conditions.
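As a schematic of the stochastic approach, the fragment below integrates radial pseudo-particles with the Euler-Maruyama rule for a one-dimensional convection-diffusion process in spherical geometry; the drift term 2*kappa/r comes from the spherical divergence of a constant diffusion coefficient. It omits adiabatic deceleration, drifts, and the full 3-D field, and all coefficients are illustrative, so it is only a toy analogue of the method described above.

    import numpy as np

    # Euler-Maruyama integration of radial pseudo-particles obeying
    #   dr = (V + 2*kappa/r) dt + sqrt(2*kappa) dW
    # V: convection speed, kappa: constant diffusion coefficient (arbitrary units).
    rng = np.random.default_rng(1)
    V, kappa = 1.0, 0.5
    dt, n_steps, n_particles = 1.0e-3, 5000, 10_000

    r = np.full(n_particles, 1.0)            # launch all pseudo-particles at r = 1
    for _ in range(n_steps):
        drift = V + 2.0 * kappa / r
        r += drift * dt + np.sqrt(2.0 * kappa * dt) * rng.standard_normal(n_particles)
        r = np.maximum(r, 1.0)               # crude reflecting inner boundary

    print("mean radius:", r.mean(), "  spread:", r.std())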
The Hospital Community Benefit Program: Implications for Food and Nutrition Professionals.
Fleischhacker, Sheila; Ramachandran, Gowri
2016-01-01
This article briefly explains the food and nutrition implications of the new standards, tax penalties and reporting requirements for non-profit hospitals and healthcare systems to maintain a tax-exempt or charitable status under section 501(c)(3) of the Federal Internal Revenue Code set forth in The Patient Protection and Affordable Care Act (P.L. 111-148, Sec. 9007). The newly created section 501(r) of the Internal Revenue Code requires, beginning with the first tax year on or after March 23, 2012, that such hospitals demonstrate community benefit by conducting a community health needs assessment (CHNA) at least once every three years and annually file information by means of a Schedule H (Form 990) regarding progress towards addressing identified needs. As hospitals conduct their CHNA and work further and collaboratively with community stakeholders on developing and monitoring their proposed action plans, the breadth and depth of food and nutrition activities occurring as a result of the Affordable Care Act Hospital Community Benefit Program will likely increase. The CHNA requirement, along with other emerging initiatives focused on improving the food environments and nutrition-related activities of hospitals and healthcare systems, offers fruitful opportunities for food and nutrition professionals to partner on innovative ways to leverage hospital infrastructure and capacity to influence those residing on, working at, or visiting the hospital campus, as well as the surrounding community.
miRSponge: a manually curated database for experimentally supported miRNA sponges and ceRNAs.
Wang, Peng; Zhi, Hui; Zhang, Yunpeng; Liu, Yue; Zhang, Jizhou; Gao, Yue; Guo, Maoni; Ning, Shangwei; Li, Xia
2015-01-01
In this study, we describe miRSponge, a manually curated database, which aims at providing an experimentally supported resource for microRNA (miRNA) sponges. Recent evidence suggests that miRNAs are themselves regulated by competing endogenous RNAs (ceRNAs) or 'miRNA sponges' that contain miRNA binding sites. These competitive molecules can sequester miRNAs to prevent them from interacting with their natural targets, and they play critical roles in various biological and pathological processes. It has become increasingly important to develop a high-quality database to record and store ceRNA data to support future studies. To this end, we have established the experimentally supported miRSponge database that contains data on 599 miRNA-sponge interactions and 463 ceRNA relationships from 11 species following manual curation of nearly 1200 published articles. Database classes include endogenously generated molecules including coding genes, pseudogenes, long non-coding RNAs and circular RNAs, along with exogenously introduced molecules including viral RNAs and artificially engineered sponges. Approximately 70% of the interactions were identified experimentally in disease states. miRSponge provides a user-friendly interface for convenient browsing, retrieval and downloading of datasets. A submission page is also included to allow researchers to submit newly validated miRNA sponge data. Database URL: http://www.bio-bigdata.net/miRSponge. © The Author(s) 2015. Published by Oxford University Press.
Observed child and parent toothbrushing behaviors and child oral health.
Collett, Brent R; Huebner, Colleen E; Seminario, Ana Lucia; Wallace, Erin; Gray, Kristen E; Speltz, Matthew L
2016-05-01
Parent-led toothbrushing effectively reduces early childhood caries. Research on the strategies that parents use to promote this behavior is, however, lacking. To examine associations between parent-child toothbrushing interactions and child oral health using a newly developed measure, the Toothbrushing Observation System (TBOS). One hundred children ages 18-60 months and their parents were video-recorded during toothbrushing interactions. Using these recordings, six raters coded parent and child behaviors and the duration of toothbrushing. We examined the reliability of the coding system and associations between observed parent and child behaviors and three indices of oral health: caries, gingival health, and history of dental procedures requiring general anesthesia. Reliabilities were moderate to strong for TBOS child and parent scores. Parent TBOS scores and longer duration of parent-led toothbrushing were associated with fewer decayed, missing or filled tooth surfaces and lower incidence of gingivitis and procedures requiring general anesthesia. Associations between child TBOS scores and dental outcomes were modest, suggesting the relative importance of parent versus child behaviors at this early age. Parents' child behavior management skills and the duration of parent-led toothbrushing were associated with better child oral health. These findings suggest that parenting skills are an important target for future behavioral oral health interventions. © 2015 BSPD, IAPD and John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
ERIC Educational Resources Information Center
Ruohotie-Lyhty, Maria
2011-01-01
This paper explores the professional development of 11 newly qualified foreign language teachers. It draws on a qualitative longitudinal study conducted at the University of Jyvaskyla, Finland between 2002 and 2009. The paper concentrates on the personal side of teacher development by analysing participants' discourses concerning language…
NASA Astrophysics Data System (ADS)
Zappacosta, Diego C.; Ochogavía, Ana C.; Rodrigo, Juan M.; Romero, José R.; Meier, Mauro S.; Garbus, Ingrid; Pessino, Silvina C.; Echenique, Viviana C.
2014-04-01
Eragrostis curvula includes biotypes reproducing through obligate and facultative apomixis or, rarely, full sexuality. We previously generated a "tetraploid-dihaploid-tetraploid" series of plants consisting of a tetraploid apomictic plant (T), a sexual dihaploid plant (D) and a tetraploid artificial colchiploid (C). Initially, plant C was nearly 100% sexual. However, its capacity to form non-reduced embryo sacs dramatically increased over a four-year period (2003-2007) to reach levels of 85-90%. Here, we confirmed high rates of apomixis in plant C, and used AFLPs and MSAPs to characterize the genetic and epigenetic variation observed in this plant in 2007 as compared to 2003. Of the polymorphic sequences, some had no coding potential whereas others were homologous to retrotransposons and/or protein-coding-like sequences. Our results suggest that in this particular plant system increased apomixis expression is concurrent with genetic and epigenetic modifications, possibly involving transposable elements.
Parton distribution functions with QED corrections in the valon model
NASA Astrophysics Data System (ADS)
Mottaghizadeh, Marzieh; Taghavi Shahri, Fatemeh; Eslami, Parvin
2017-10-01
The parton distribution functions (PDFs) with QED corrections are obtained by solving the QCD ⊗ QED DGLAP evolution equations in the framework of the "valon" model at the next-to-leading-order QCD and the leading-order QED approximations. Our results for the PDFs with QED corrections in this phenomenological model are in good agreement with the newly released CT14QED global fit code [Phys. Rev. D 93, 114015 (2016), 10.1103/PhysRevD.93.114015] and the APFEL (NNPDF2.3QED) program [Comput. Phys. Commun. 185, 1647 (2014), 10.1016/j.cpc.2014.03.007] over a wide range of x = [10^-5, 1] and Q^2 = [0.283, 10^8] GeV^2. The model calculations agree rather well with those codes. We also propose a new method for studying the symmetry breaking of the sea quark distribution functions inside the proton.
Nazione, Samantha; Silk, Kami J.; Robinson, Jeffrey
2017-01-01
This study reports an analysis of verbal social support strategies directed by surgeons and patients’ companions to breast cancer patients using the social support behavior code (SSBC). Additionally, the influence of companions on the provision of social support is examined. Forty-six videotapes of appointments where treatment regimens were being decided were analyzed. Results demonstrated that the majority of units spoken by surgeons were coded as verbal social support, primarily in the form of informational social support. Companions’ social support was lower (relative to surgeons) in nearly every category of social support assessed. Patients who brought companions were found to receive more network social support from surgeons. Overall, these results point to low emotional support from surgeons and companions for patients during these appointments, which indicates a need for modifications in empathy training for medical providers. PMID:29081835
Alexandrowicz, Rainer W; Friedrich, Fabian; Jahn, Rebecca; Soulier, Nathalie
2015-01-01
The present study compares the 30-, 20-, and 12-item versions of the General Health Questionnaire (GHQ) in the original coding and four different recoding schemes (Bimodal, Chronic, Modified Likert and a newly proposed Modified Chronic) with respect to their psychometric qualities. The dichotomized versions (i.e. Bimodal, Chronic and Modified Chronic) were evaluated with the Rasch model, and the polytomous original version and the Modified Likert version were evaluated with the Partial Credit Model. In general, the versions under consideration showed agreement with the model assumptions. However, the recoded versions exhibited some deficits with respect to the Outfit index. Because of these item deficits and for theoretical reasons, we argue in favor of using any of the three length versions with the original four-categorical coding scheme. Nevertheless, any of the versions appears apt for clinical use from a psychometric perspective.
Wang, Shuo; Gao, Li-Zhi
2016-11-01
The complete chloroplast genome sequence of foxtail millet (Setaria italica), an important food and fodder crop in the family Poaceae, is first reported in this study. The genome consists of 135 516 bp containing a pair of inverted repeats (IRs) of 21 804 bp separated by a large single-copy (LSC) region and a small single-copy (SSC) region of 79 896 bp and 12 012 bp, respectively. Coding sequences constitute 58.8% of the genome harboring 111 unique genes, 71 of which are protein-coding genes, 4 are rRNA genes, and 36 are tRNA genes. Phylogenetic analysis indicated foxtail millet clustered with Panicum virgatum and Echinochloa crus-galli belonging to the tribe Paniceae of the subfamily Panicoideae. This newly determined chloroplast genome will provide valuable information for the future breeding programs of valuable cereal crops in the family Poaceae.
Tork, Sanaa E; Aly, Magda M; Alakilli, Saleha Y; Al-Seeni, Madeha N
2015-03-01
γ-poly glutamic acid (γ-PGA) has received considerable attention for pharmaceutical and biomedical applications. γ-PGA from the newly isolated Bacillus licheniformis NRC20 was purified and characterized using the diffusion distance agar plate method, mass spectrometry and thin layer chromatography. All analyses indicated that γ-PGA is a homopolymer composed of glutamic acid. Its molecular weight was determined to be 1266 kDa. It was composed of L- and D-glutamic acid residues. An amplicon of 3050 bp representing the γ-PGA-coding genes was obtained, sequenced and submitted to the GenBank database. Its amino acid sequence showed high similarity with those obtained from B. licheniformis strains. The bacterium NRC20 grew independently of L-glutamic acid, but polymer production was enhanced when it was cultivated in medium containing L-glutamic acid as the sole nitrogen source. We conclude that γ-PGA production from B. licheniformis NRC20 has many promising applications in medicine, industry and nanotechnology. Copyright © 2014 Elsevier B.V. All rights reserved.
2012-01-01
On the 4th September 2012 the International Commission on Zoological Nomenclature announced an amendment to the International Code of Zoological Nomenclature allowing for electronic publication of the scientific names of animals. In this interview Frank-T. Krell discusses the implications of this amendment for authors wishing to publish descriptions of newly identified animal species in online and open access journals, and for the future of taxonomic science. PMID:22978411
SCaN Network Ground Station Receiver Performance for Future Service Support
NASA Technical Reports Server (NTRS)
Estabrook, Polly; Lee, Dennis; Cheng, Michael; Lau, Chi-Wung
2012-01-01
Objectives: Examine the impact of providing the newly standardized CCSDS Low Density Parity Check (LDPC) codes in the SCaN return data service on the SCaN SN and DSN ground station receivers: the SN current receiver, the Integrated Receiver (IR); the DSN current receiver, the Downlink Telemetry and Tracking (DTT) receiver; and an early Commercial-Off-The-Shelf (COTS) prototype of the SN User Service Subsystem Component Replacement (USS CR) Narrow Band Receiver. A further objective is to motivate discussion of general issues in ground station hardware design that would enable simple and inexpensive modifications to support future services.
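For context on the decoding task such receivers must support, hard-decision bit-flipping over a parity-check matrix — the simplest member of the LDPC decoding family — can be sketched as follows. The small (7,4) Hamming matrix is only a stand-in for a real, much larger and sparser CCSDS LDPC matrix.

    import numpy as np

    # Parity-check matrix of the (7,4) Hamming code (stand-in for a sparse LDPC matrix).
    H = np.array([[1, 0, 1, 0, 1, 0, 1],
                  [0, 1, 1, 0, 0, 1, 1],
                  [0, 0, 0, 1, 1, 1, 1]])

    codeword = np.array([1, 1, 1, 0, 0, 0, 0])        # satisfies H @ c mod 2 == 0
    received = codeword.copy()
    received[4] ^= 1                                   # flip one bit to simulate a channel error

    for _ in range(10):                                # bit-flipping iterations
        syndrome = H @ received % 2
        if not syndrome.any():
            break
        votes = syndrome @ H                           # unsatisfied checks each bit participates in
        received[np.argmax(votes)] ^= 1                # flip the most-suspect bit

    print("decoded:", received, "matches transmitted:", np.array_equal(received, codeword))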
Liebers, Falk; Brandstädt, Felix; Schust, Marianne; Serafin, Patrick; Schäfer, Andreas; Gebhardt, Hansjürgen; Hartmann, Bernd; Steinberg, Ulf
2017-01-01
Introduction: The impact of work-related musculoskeletal disorders is considerable. The assessment of work tasks with physical workloads is crucial to estimate the work-related health risks of exposed employees. Three key indicator methods (KIMs) are available for risk assessment regarding manual lifting, holding and carrying of loads; manual pulling and pushing of loads; and manual handling operations. Three further KIMs for risk assessment regarding whole-body forces, awkward body postures and body movement have been developed de novo. In addition, the development of a newly drafted combined method for mixed exposures is planned. All methods will be validated regarding face validity, reliability, convergent validity, criterion validity and further aspects of utility under practical conditions. Methods and analysis: As part of the joint project MEGAPHYS (multilevel risk assessment of physical workloads), a mixed-methods study is being designed for the validation of KIMs and conducted in companies of different sizes and branches in Germany. Workplaces are documented and analysed by observations, applying KIMs, interviews and assessment of environmental conditions. Furthermore, a survey among the employees at the respective workplaces takes place with standardised questionnaires, interviews and physical examinations. It is intended to include 1200 employees at 120 different workplaces. For analysis of the quality criteria, recommendations of the COSMIN checklist (COnsensus-based Standards for the selection of health Measurement INstruments) will be taken into account. Ethics and dissemination: The study was planned and conducted in accordance with the German Medical Professional Code and the Declaration of Helsinki as well as the German Federal Data Protection Act. The design of the study was approved by ethics committees. We intend to publish the validated KIMs in 2018. Results will be published in peer-reviewed journals, presented at international meetings and disseminated to actual users for practical application. PMID:28827239
Klussmann, Andre; Liebers, Falk; Brandstädt, Felix; Schust, Marianne; Serafin, Patrick; Schäfer, Andreas; Gebhardt, Hansjürgen; Hartmann, Bernd; Steinberg, Ulf
2017-08-21
The impact of work-related musculoskeletal disorders is considerable. The assessment of work tasks with physical workloads is crucial to estimate the work-related health risks of exposed employees. Three key indicator methods (KIMs) are available for risk assessment regarding manual lifting, holding and carrying of loads; manual pulling and pushing of loads; and manual handling operations. Three further KIMs for risk assessment regarding whole-body forces, awkward body postures and body movement have been developed de novo. In addition, the development of a newly drafted combined method for mixed exposures is planned. All methods will be validated regarding face validity, reliability, convergent validity, criterion validity and further aspects of utility under practical conditions. As part of the joint project MEGAPHYS (multilevel risk assessment of physical workloads), a mixed-methods study is being designed for the validation of KIMs and conducted in companies of different sizes and branches in Germany. Workplaces are documented and analysed by observations, applying KIMs, interviews and assessment of environmental conditions. Furthermore, a survey among the employees at the respective workplaces takes place with standardised questionnaires, interviews and physical examinations. It is intended to include 1200 employees at 120 different workplaces. For analysis of the quality criteria, recommendations of the COSMIN checklist (COnsensus-based Standards for the selection of health Measurement INstruments) will be taken into account. The study was planned and conducted in accordance with the German Medical Professional Code and the Declaration of Helsinki as well as the German Federal Data Protection Act. The design of the study was approved by ethics committees. We intend to publish the validated KIMs in 2018. Results will be published in peer-reviewed journals, presented at international meetings and disseminated to actual users for practical application. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
NASA Astrophysics Data System (ADS)
Hirano, Ryoichi; Iida, Susumu; Amano, Tsuyoshi; Watanabe, Hidehiro; Hatakeyama, Masahiro; Murakami, Takeshi; Suematsu, Kenichi; Terao, Kenji
2016-03-01
Novel projection electron microscope optics have been developed and integrated into a new inspection system named EBEYE-V30 ("Model EBEYE" is an EBARA model code), and the resulting system shows promise for application to half-pitch (hp) 16-nm node extreme ultraviolet lithography (EUVL) patterned mask inspection. To improve the system's inspection throughput for 11-nm hp generation defect detection, a new electron-sensitive area image sensor with a high-speed data processing unit, a bright and stable electron source, and an image capture area deflector that operates simultaneously with the mask scanning motion have been developed. A learning system has been used for the mask inspection tool to meet the requirements of hp 11-nm node EUV patterned mask inspection. Defects are identified by the projection electron microscope system using the "defectivity" derived from the characteristics of the acquired image. The learning system has been developed to reduce the labor and costs associated with adjustment of the detection capability to cope with newly-defined mask defects. We describe the integration of the developed elements into the inspection tool and the verification of the designed specification. We have also verified the effectiveness of the learning system, which shows enhanced detection capability for the hp 11-nm node.
76 FR 33419 - Nationally Recognized Statistical Rating Organizations
Federal Register 2010, 2011, 2012, 2013, 2014
2011-06-08
... documentation of the internal control structure) or should the factors focus on the design (i.e., establishment... related to implementing them. a. Controls reasonably designed to ensure that a newly developed methodology... U.S.C. 78o-7(r)(1)(A). b. Controls reasonably designed to ensure that a newly developed methodology...
The Spelling Project. Technical Report 1992-2.
ERIC Educational Resources Information Center
Green, Kathy E.; Schroeder, David H.
Results of an analysis of a newly developed spelling test and several related measures are reported. Information about the reliability of a newly developed spelling test; its distribution of scores; its relationship with the standard battery of aptitude tests of the Johnson O'Connor Research Foundation; and its relationships with sex, age,…
ERIC Educational Resources Information Center
McNeely, Clea A.; Morland, Lyn; Doty, S. Benjamin; Meschke, Laurie L.; Awad, Summer; Husain, Altaf; Nashwan, Ayat
2017-01-01
Background: The US education system must find creative and effective ways to foster the healthy development of the approximately 2 million newly arrived immigrant and refugee adolescents, many of whom contend with language barriers, limited prior education, trauma, and discrimination. We identify research priorities for promoting the school…
Methodology, status and plans for development and assessment of Cathare code
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bestion, D.; Barre, F.; Faydide, B.
1997-07-01
This paper presents the methodology, status and plans for the development, assessment and uncertainty evaluation of the Cathare code. Cathare is a thermalhydraulic code developed by CEA (DRN), IPSN, EDF and FRAMATOME for PWR safety analysis. First, the status of the code development and assessment is presented, together with the general strategy used for the development and assessment of the code. Analytical experiments with separate effect tests, and component tests, are used for the development and validation of closure laws. Successive Revisions of constitutive laws are implemented in successive Versions of the code and assessed. System tests or integral tests are used to validate the general consistency of the Revision. Each delivery of a code Version + Revision is fully assessed and documented. A methodology is being developed to determine the uncertainty on all constitutive laws of the code using calculations of many analytical tests and applying the Discrete Adjoint Sensitivity Method (DASM). Finally, the plans for future developments of the code are presented. They concern the optimization of the code performance through parallel computing (the code will be used for real-time full-scope plant simulators), the coupling with many other codes (neutronic codes, severe accident codes), and the application of the code to containment thermalhydraulics. Also, physical improvements are required in the field of low-pressure transients and in the modeling for the 3-D model.
NASA Astrophysics Data System (ADS)
Navy, S. L.; Luft, J. A.; Toerien, R.; Hewson, P. W.
2018-05-01
In many parts of the world, newly hired science teachers' practices are developing in a complex policy environment. However, little is known about how newly hired science teachers' practices are enacted throughout a cycle of instruction and how these practices can be influenced by macro-, meso-, and micro-policies. Knowing how policies impact practice can result in better policies or better support for certain policies in order to enhance the instruction of newly hired teachers. This comparative study investigated how 12 newly hired science teachers at sites in South Africa (SA) and the United States (US) progressed through an instructional cycle of planning, teaching, and reflection. The qualitative data were analysed through beginning teacher competency frameworks, the cycle of instruction, and institutional theory. Data analysis revealed prevailing areas of practice and connections to levels of policy within the instructional cycle phases. There were some differences between the SA and US teachers and among first-, second-, and third-year teachers. More importantly, this study indicates that newly hired teachers are susceptible to micro-policies and are progressively developing their practice. It also shows the importance of meso-level connectors. It suggests that teacher educators and policy makers must consider how to prepare and support newly hired science teachers to achieve the shared global visions of science teaching.
Construction of a lactose-assimilating strain of baker's yeast.
Adam, A C; Prieto, J A; Rubio-Texeira, M; Polaina, J
1999-09-30
A recombinant strain of baker's yeast has been constructed which can assimilate lactose efficiently. This strain has been designed to allow its propagation in whey, the byproduct resulting from cheese-making. The ability to metabolize lactose is conferred by the functional expression of two genes from Kluyveromyces lactis, LAC12 and LAC4, which encode a lactose permease and a beta-galactosidase, respectively. To make the recombinant strain more acceptable for its use in bread-making, the genetic transformation of the host baker's yeast was carried out with linear fragments of DNA of defined sequence, carrying as the only heterologous material the coding regions of the two K. lactis genes. Growth of the new strain on cheese whey affected neither the quality of bread nor the yeast gassing power. The significance of the newly developed strain is two-fold: it affords a cheap alternative to the procedure generally used for the propagation of baker's yeast, and it offers a profitable use for cheese whey. Copyright 1999 John Wiley & Sons, Ltd.
Lateral Violence in Nursing: Implications and Strategies for Nurse Educators.
Sanner-Stiehr, Ericka; Ward-Smith, Peggy
Lateral violence among nurses persists as a prevalent problem, contributing to psychological distress, staff turnover, and attrition. Newly graduated nurses are at particular risk of being targets of lateral violence and experiencing its negative sequelae. Preparing student nurses to respond to lateral violence before they enter the nursing profession may alter this scenario. A review of the literature was conducted to determine the potential for nursing faculty to change the cycle of lateral violence. Based on this review, we recommend 3 main strategies, specifically for nursing faculty, aimed at reducing incidences of lateral violence and preparing students to manage this phenomenon. First, curricular content can integrate lateral violence content into simulation experiences and facilitate the transfer of this knowledge into clinical experiences. Second, codes of conduct should guide behaviors for both students and faculty. Finally, as role models, faculty should be aware of their own behaviors, model respectful communication, facilitate a courteous academic environment, and develop nurses capable of identifying and appropriately responding to lateral violence. Copyright © 2016 Elsevier Inc. All rights reserved.
Simulation of Plasma Jet Merger and Liner Formation within the PLX-α Project
NASA Astrophysics Data System (ADS)
Samulyak, Roman; Chen, Hsin-Chiang; Shih, Wen; Hsu, Scott
2015-11-01
Detailed numerical studies of the propagation and merger of high Mach number argon plasma jets and the formation of plasma liners have been performed using the newly developed method of Lagrangian particles (LP). The LP method significantly improves accuracy and mathematical rigor of common particle-based numerical methods such as smooth particle hydrodynamics while preserving their main advantages compared to grid-based methods. A brief overview of the LP method will be presented. The Lagrangian particle code implements main relevant physics models such as an equation of state for argon undergoing atomic physics transformation, radiation losses in thin optical limit, and heat conduction. Simulations of the merger of two plasma jets are compared with experimental data from past PLX experiments. Simulations quantify the effect of oblique shock waves, ionization, and radiation processes on the jet merger process. Results of preliminary simulations of future PLX-α experiments involving the ~π/2-solid-angle plasma-liner configuration with 9 guns will also be presented. Partially supported by ARPA-E's ALPHA program.
NASA Astrophysics Data System (ADS)
Piantschitsch, Isabell; Vršnak, Bojan; Hanslmeier, Arnold; Lemmerer, Birgit; Veronig, Astrid; Hernandez-Perez, Aaron; Čalogović, Jaša
2018-06-01
We performed 2.5D magnetohydrodynamic (MHD) simulations showing the propagation of fast-mode MHD waves of different initial amplitudes and their interaction with a coronal hole (CH), using our newly developed numerical code. We find that this interaction results in, first, the formation of reflected, traversing, and transmitted waves (collectively, secondary waves) and, second, in the appearance of stationary features at the CH boundary. Moreover, we observe a density depletion that is moving in the opposite direction of the incoming wave. We find a correlation between the initial amplitude of the incoming wave and the amplitudes of the secondary waves as well as the peak values of the stationary features. Additionally, we compare the phase speed of the secondary waves and the lifetime of the stationary features to observations. Both effects obtained in the simulation, the evolution of secondary waves, as well as the formation of stationary fronts at the CH boundary, strongly support the theory that coronal waves are fast-mode MHD waves.
Efficient Parallelization of a Dynamic Unstructured Application on the Tera MTA
NASA Technical Reports Server (NTRS)
Oliker, Leonid; Biswas, Rupak
1999-01-01
The success of parallel computing in solving real-life computationally-intensive problems relies on their efficient mapping and execution on large-scale multiprocessor architectures. Many important applications are both unstructured and dynamic in nature, making their efficient parallel implementation a daunting task. This paper presents the parallelization of a dynamic unstructured mesh adaptation algorithm using three popular programming paradigms on three leading supercomputers. We examine an MPI message-passing implementation on the Cray T3E and the SGI Origin2000, a shared-memory implementation using cache coherent nonuniform memory access (CC-NUMA) of the Origin2000, and a multi-threaded version on the newly-released Tera Multi-threaded Architecture (MTA). We compare several critical factors of this parallel code development, including runtime, scalability, programmability, and memory overhead. Our overall results demonstrate that multi-threaded systems offer tremendous potential for quickly and efficiently solving some of the most challenging real-life problems on parallel computers.
Alternate Operating Modes For NDCX-II
NASA Astrophysics Data System (ADS)
Sharp, W. M.; Friedman, A.; Grote, D. P.; Cohen, R. H.; Lund, S. M.; Vay, J.-L.; Waldron, W. L.
2012-10-01
NDCX-II is a newly completed accelerator facility at LBNL, built to study ion-heated warm dense matter and aspects of ion-driven targets for inertial-fusion energy. The baseline design calls for using twelve induction cells to accelerate 40 nC of Li+ ions to 1.2 MeV. During commissioning, though, we plan to extend the source lifetime by extracting less total charge. For operational flexibility, the option of using a helium plasma source is also being investigated. Over time, we expect that NDCX-II will be upgraded to substantially higher energies, necessitating the use of heavier ions to keep a suitable deposition range in targets. Each of these options requires development of an alternate acceleration schedule and the associated transverse focusing. The schedules here are first worked out with a fast-running 1-D particle-in-cell code ASP, then 2-D and 3-D Warp simulations are used to verify the 1-D results and to design transverse focusing.
Raskin, Cody; Owen, J. Michael
2016-10-24
Here, we discuss a generalization of the classic Keplerian disk test problem allowing for both pressure and rotational support, as a method of testing astrophysical codes incorporating both gravitation and hydrodynamics. We argue for the inclusion of pressure in rotating disk simulations on the grounds that realistic, astrophysical disks exhibit non-negligible pressure support. We then apply this test problem to examine the performance of various smoothed particle hydrodynamics (SPH) methods incorporating a number of improvements proposed over the years to address problems noted in modeling the classical gravitation-only Keplerian disk. We also apply this test to a newly developed extension of SPH based on reproducing kernels called CRKSPH. Counterintuitively, we find that pressure support worsens the performance of traditional SPH on this problem, causing unphysical collapse away from the steady-state disk solution even more rapidly than the purely gravitational problem, whereas CRKSPH greatly reduces this error.
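The rotation curve of a pressure-supported disk follows from the radial force balance v_phi^2/r = G*M/r^2 + (1/rho) dP/dr. The short fragment below evaluates this balance for an assumed power-law density and isothermal-like pressure; the profiles and constants are illustrative choices, not the setup used in the paper.

    import numpy as np

    # Radial force balance for a rotating, pressure-supported disk:
    #   v_phi^2 / r = G*M / r^2 + (1/rho) dP/dr
    G, M = 1.0, 1.0                      # code units (illustrative)
    r = np.linspace(0.5, 2.0, 200)
    rho = r**-1.5                        # assumed power-law density profile
    cs2 = 0.05                           # assumed constant sound speed squared
    P = cs2 * rho                        # isothermal-like equation of state

    dP_dr = np.gradient(P, r)
    v_phi_sq = G * M / r + (r / rho) * dP_dr   # pressure gradient (dP/dr < 0) reduces v_phi
    v_phi = np.sqrt(np.maximum(v_phi_sq, 0.0))
    v_kepler = np.sqrt(G * M / r)

    print("max fractional deviation from Keplerian:",
          np.max(np.abs(v_phi - v_kepler) / v_kepler))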
Console, Rodolfo; Nardi, Anna; Carluccio, Roberto; Murru, Maura; Falcone, Giuseppe; Parsons, Thomas E.
2017-01-01
The use of a newly developed earthquake simulator has allowed the production of catalogs lasting 100 kyr and containing more than 100,000 events of magnitudes ≥4.5. The model of the fault system upon which we applied the simulator code was obtained from the DISS 3.2.0 database, selecting all the faults that are recognized in the Calabria region, for a total of 22 fault segments. The application of our simulation algorithm provides typical features in the time, space and magnitude behavior of the seismicity, which can be compared with those of real observations. The results of the physics-based simulator algorithm were compared with those obtained by an alternative method using a slip-rate-balanced technique. Finally, as an example of a possible use of synthetic catalogs, an attenuation law has been applied to all the events reported in the synthetic catalog for the production of maps showing the exceedance probability of given values of PGA over the territory under investigation.
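As an illustration of that last step, the fragment below applies a generic attenuation relation of the form PGA = a*exp(b*M)/(R + c)^d to a toy event list and converts the resulting annual exceedance rate into an exceedance probability over an exposure time, assuming Poissonian occurrence. The coefficients, threshold and synthetic catalog are invented and do not correspond to the attenuation law used by the authors.

    import numpy as np

    rng = np.random.default_rng(2)
    catalog_years = 100_000.0
    n_events = 100_000
    mags = 4.5 + rng.exponential(0.5, n_events)       # toy magnitudes >= 4.5
    dists = rng.uniform(5.0, 150.0, n_events)         # epicentral distances to the site [km]

    # Generic attenuation relation (hypothetical coefficients):
    a, b, c, d = 0.01, 1.2, 10.0, 1.5
    pga = a * np.exp(b * mags) / (dists + c)**d       # peak ground acceleration [g]

    threshold = 0.1                                   # PGA level of interest [g]
    annual_rate = np.sum(pga > threshold) / catalog_years
    t_exposure = 50.0                                 # exposure time [years]
    p_exceed = 1.0 - np.exp(-annual_rate * t_exposure)
    print(f"P(PGA > {threshold} g in {t_exposure:.0f} yr) = {p_exceed:.3f}")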
tRNAscan-SE On-line: integrating search and context for analysis of transfer RNA genes.
Lowe, Todd M; Chan, Patricia P
2016-07-08
High-throughput genome sequencing continues to grow the need for rapid, accurate genome annotation and tRNA genes constitute the largest family of essential, ever-present non-coding RNA genes. Newly developed tRNAscan-SE 2.0 has advanced the state-of-the-art methodology in tRNA gene detection and functional prediction, captured by rich new content of the companion Genomic tRNA Database. Previously, web-server tRNA detection was isolated from knowledge of existing tRNAs and their annotation. In this update of the tRNAscan-SE On-line resource, we tie together improvements in tRNA classification with greatly enhanced biological context via dynamically generated links between web server search results, the most relevant genes in the GtRNAdb and interactive, rich genome context provided by UCSC genome browsers. The tRNAscan-SE On-line web server can be accessed at http://trna.ucsc.edu/tRNAscan-SE/. © The Author(s) 2016. Published by Oxford University Press on behalf of Nucleic Acids Research.
Fan Noise Prediction with Applications to Aircraft System Noise Assessment
NASA Technical Reports Server (NTRS)
Nark, Douglas M.; Envia, Edmane; Burley, Casey L.
2009-01-01
This paper describes an assessment of current fan noise prediction tools by comparing measured and predicted sideline acoustic levels from a benchmark fan noise wind tunnel test. Specifically, an empirical method and newly developed coupled computational approach are utilized to predict aft fan noise for a benchmark test configuration. Comparisons with sideline noise measurements are performed to assess the relative merits of the two approaches. The study identifies issues entailed in coupling the source and propagation codes, as well as provides insight into the capabilities of the tools in predicting the fan noise source and subsequent propagation and radiation. In contrast to the empirical method, the new coupled computational approach provides the ability to investigate acoustic near-field effects. The potential benefits/costs of these new methods are also compared with the existing capabilities in a current aircraft noise system prediction tool. The knowledge gained in this work provides a basis for improved fan source specification in overall aircraft system noise studies.
Codes That Support Smart Growth Development
Provides examples of local zoning codes that support smart growth development, categorized by: unified development code, form-based code, transit-oriented development, design guidelines, street design standards, and zoning overlay.
ERIC Educational Resources Information Center
Navy, S. L.; Luft, J. A.; Toerien, R.; Hewson, P. W.
2018-01-01
In many parts of the world, newly hired science teachers' practices are developing in a complex policy environment. However, little is known about how newly hired science teachers' practices are enacted throughout a cycle of instruction and how these practices can be influenced by macro-, meso-, and micro-policies. Knowing how policies impact…
New variable stars discovered in the fields of three Galactic open clusters using the VVV survey
NASA Astrophysics Data System (ADS)
Palma, T.; Minniti, D.; Dékány, I.; Clariá, J. J.; Alonso-García, J.; Gramajo, L. V.; Ramírez Alegría, S.; Bonatto, C.
2016-11-01
This project is a massive near-infrared (NIR) search for variable stars in highly reddened and obscured open cluster (OC) fields projected on regions of the Galactic bulge and disk. The search is performed using photometric NIR data in the J-, H- and Ks-bands obtained from the Vista Variables in the Vía Láctea (VVV) Survey. We performed a variability search in each cluster field using Stetson's variability statistics to select the variable candidates. Later, those candidates were subjected to a frequency analysis using the Generalized Lomb-Scargle and the Phase Dispersion Minimization algorithms. The number of independent observations ranges between 63 and 73. The newly discovered variables in this study, 157 in total in three different known OCs, are classified based on their light curve shapes, periods, amplitudes and their location in the corresponding color-magnitude (J - Ks, Ks) and color-color (H - Ks, J - H) diagrams. We found 5 possible Cepheid stars which, based on the period-luminosity relation, are very likely type II Cepheids located behind the bulge. Among the newly discovered variables, there are eclipsing binaries and δ Scuti stars, as well as background RR Lyrae stars. Using the new version of the Wilson & Devinney code as well as the "Physics Of Eclipsing Binaries" (PHOEBE) code, we analyzed some of the best eclipsing binaries we discovered. Our results show that these studied systems range from detached to double-contact binaries, with low eccentricities and high inclinations of approximately 80°.
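A minimal example of the period search mentioned above, using the Lomb-Scargle implementation available in astropy on a simulated, irregularly sampled light curve; the data are synthetic and only illustrate the procedure, and period recovery in practice depends on the actual sampling and noise.

    import numpy as np
    from astropy.timeseries import LombScargle

    rng = np.random.default_rng(3)
    # ~70 irregularly spaced epochs, comparable to the 63-73 observations per field
    t = np.sort(rng.uniform(0.0, 400.0, 70))          # days
    true_period = 0.62                                # days (an RR Lyrae-like signal, invented)
    mag = 15.0 + 0.3 * np.sin(2 * np.pi * t / true_period) + rng.normal(0, 0.02, 70)

    frequency, power = LombScargle(t, mag).autopower(minimum_frequency=0.05,
                                                     maximum_frequency=10.0)
    best_period = 1.0 / frequency[np.argmax(power)]
    print(f"recovered period: {best_period:.3f} d (true {true_period} d)")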
An Evaluation of New York City's 2015 Birth Certificate Gender Marker Regulation.
Lee, Erica J; Gurr, Danielle; Van Wye, Gretchen
2017-10-01
In 1971, the New York City (NYC) Department of Health and Mental Hygiene amended Section 207.05 of the NYC Health Code to allow individuals who had undergone "convertive surgery" (interpreted by the code to mean genital surgery) to amend the gender on their birth certificates. This surgery requirement was removed in 2015. In a survey evaluating the regulation change, we sought to characterize the transgender population newly eligible to obtain a gender-congruent NYC birth certificate by comparing respondents with and without genital surgery. We mailed a 42-question survey with each newly issued birth certificate. We compared respondents across current gender identity, race, Hispanic ethnicity, age, insurance status, income, current general health status, other transition-related care obtained, and healthcare access, stigma, and discrimination. Of 642 applicants, 219 responded and were thus enrolled in our 5-year study (34.1%). Most (n = 158 out of 203 who answered, 77.8%) had not received genital surgery. Compared to respondents with genital surgery, respondents without surgery were significantly more likely to be transgender men (50.0% vs. 20.0%); younger (median age 32 vs. 56.5); on Medicaid (31.6% vs. 11.1%); identify as Hispanic (28.5% vs. 8.9%); and live in households making <$20,000 annually (35.3% vs. 12.8%). Removing a genital surgery requirement more equitably enables transgender men and those with limited resources to obtain a gender-congruent birth certificate. Jurisdictions with such requirements should consider similar regulation changes to address the inequities that this requirement likely imposes in accessing birth certificates.
Screening for Behavioral Health Issues in Children Enrolled in Massachusetts Medicaid
Penfold, Robert; Arsenault, Lisa; Zhang, Fang; Murphy, Michael; Wissow, Larry
2014-01-01
OBJECTIVES: To understand mandated behavioral health (BH) screening in Massachusetts Medicaid including characteristics of screened children, predictors of positive screens, and whether screening identifies children without a previous BH history. METHODS: Massachusetts mandated BH screening in 2008. Providers used a billing code and modifier to indicate a completed screen and whether a BH need was identified. Using MassHealth claims data, children with ≥300 days of eligibility in fiscal year (FY) 2009 were identified and categorized into groups based on first use of the modifier, screening code, or claim. Bivariate analyses were conducted to determine differences among groups. BH history was examined by limiting the sample to those continuously enrolled in FY 2008 and 2009. Multivariate logistic regression was used to determine predictors of positive screens. RESULTS: Of 355 490 eligible children, 46% had evidence of screening. Of those with modifiers, 12% were positive. Among continuously enrolled children (FY 2008 and FY 2009) with evidence of screening, 43% with positive modifiers had no BH history. This "newly identified" group was more likely to be female, younger, minority, and from rural residences (P < .0001). Among children with modifiers, gender (male), age (5–7), being in foster care, recent BH history, and Hispanic ethnicity predicted having a positive modifier. CONCLUSIONS: The high rate of newly identified Medicaid children with a BH need suggests that screening is performing well, particularly among underidentified groups. To better assess screening value, future work on cost-effectiveness and the impact on subsequent mental health treatment is needed. PMID:24298005
Analysis of transient fission gas behaviour in oxide fuel using BISON and TRANSURANUS
NASA Astrophysics Data System (ADS)
Barani, T.; Bruschi, E.; Pizzocri, D.; Pastore, G.; Van Uffelen, P.; Williamson, R. L.; Luzzi, L.
2017-04-01
The modelling of fission gas behaviour is a crucial aspect of nuclear fuel performance analysis in view of the related effects on the thermo-mechanical performance of the fuel rod, which can be particularly significant during transients. In particular, experimental observations indicate that substantial fission gas release (FGR) can occur on a small time scale during transients (burst release). To accurately reproduce the rapid kinetics of the burst release process in fuel performance calculations, a model that accounts for non-diffusional mechanisms such as fuel micro-cracking is needed. In this work, we present and assess a model for transient fission gas behaviour in oxide fuel, which is applied as an extension of conventional diffusion-based models to introduce the burst release effect. The concept and governing equations of the model are presented, and the sensitivity of results to the newly introduced parameters is evaluated through an analytic sensitivity analysis. The model is assessed for application to integral fuel rod analysis by implementation in two structurally different fuel performance codes: BISON (multi-dimensional finite element code) and TRANSURANUS (1.5D code). Model assessment is based on the analysis of 19 light water reactor fuel rod irradiation experiments from the OECD/NEA IFPE (International Fuel Performance Experiments) database, all of which are simulated with both codes. The results point out an improvement in both the quantitative predictions of integral fuel rod FGR and the qualitative representation of the FGR kinetics with the transient model relative to the canonical, purely diffusion-based models of the codes. The overall quantitative improvement of the integral FGR predictions in the two codes is comparable. Moreover, calculated radial profiles of xenon concentration after irradiation are investigated and compared to experimental data, illustrating the underlying representation of the physical mechanisms of burst release.
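To make the combination of diffusional and burst release concrete, the sketch below couples the classical series solution for diffusional release from an equivalent sphere with an ad hoc burst term that releases a fraction of the retained gas whenever the temperature jump across a step exceeds a threshold. It is a deliberately simplified illustration, not the model implemented in BISON or TRANSURANUS, and every parameter value is invented.

    import numpy as np

    def diffusional_release(tau, n_terms=2000):
        # Fractional release of an initially uniform gas inventory from a sphere:
        #   f(tau) = 1 - (6/pi^2) * sum_n exp(-n^2 pi^2 tau) / n^2,   tau = D*t/a^2
        n = np.arange(1, n_terms + 1)
        return 1.0 - (6.0 / np.pi**2) * np.sum(np.exp(-(n * np.pi)**2 * tau) / n**2)

    D_over_a2 = 1.0e-9        # 1/s, invented effective diffusion parameter D/a^2
    burst_fraction = 0.3      # fraction of retained gas released per burst event (invented)
    dT_threshold = 150.0      # K, temperature jump assumed to trigger micro-cracking (invented)
    dt = 3600.0               # s, time step
    temperatures = np.concatenate([np.full(50, 1200.0), np.full(50, 1600.0)])  # step change at mid-history

    retained, f_prev = 1.0, 0.0
    for i, T in enumerate(temperatures):
        f_now = diffusional_release(D_over_a2 * (i + 1) * dt)
        step_release = (f_now - f_prev) / (1.0 - f_prev)     # incremental diffusional fraction
        retained *= 1.0 - step_release
        f_prev = f_now
        if i > 0 and T - temperatures[i - 1] > dT_threshold:
            retained *= 1.0 - burst_fraction                 # burst releases part of the retained gas
    print(f"fractional fission gas release: {1.0 - retained:.1%}")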
DOE Office of Scientific and Technical Information (OSTI.GOV)
Barani, T.; Bruschi, E.; Pizzocri, D.
The modelling of fission gas behaviour is a crucial aspect of nuclear fuel analysis in view of the related effects on the thermo-mechanical performance of the fuel rod, which can be particularly significant during transients. Experimental observations indicate that substantial fission gas release (FGR) can occur on a small time scale during transients (burst release). To accurately reproduce the rapid kinetics of burst release in fuel performance calculations, a model that accounts for non-diffusional mechanisms such as fuel micro-cracking is needed. In this work, we present and assess a model for transient fission gas behaviour in oxide fuel, which is applied as an extension of diffusion-based models to allow for the burst release effect. The concept and governing equations of the model are presented, and the effect of the newly introduced parameters is evaluated through an analytic sensitivity analysis. Then, the model is assessed for application to integral fuel rod analysis. The approach that we take for model assessment involves implementation in two structurally different fuel performance codes, namely, BISON (multi-dimensional finite element code) and TRANSURANUS (1.5D semi-analytic code). The model is validated against 19 Light Water Reactor fuel rod irradiation experiments from the OECD/NEA IFPE (International Fuel Performance Experiments) database, all of which are simulated with both codes. The results point out an improvement in both the qualitative representation of the FGR kinetics and the quantitative predictions of integral fuel rod FGR, relative to the canonical, purely diffusion-based models, with both codes. The overall quantitative improvement of the FGR predictions in the two codes is comparable. Furthermore, calculated radial profiles of xenon concentration are investigated and compared to experimental data, demonstrating the representation of the underlying mechanisms of burst release by the new model.
Jung, Bo Kyeung; Kim, Jeeyong; Cho, Chi Hyun; Kim, Ju Yeon; Nam, Myung Hyun; Shin, Bong Kyung; Rho, Eun Youn; Kim, Sollip; Sung, Heungsup; Kim, Shinyoung; Ki, Chang Seok; Park, Min Jung; Lee, Kap No; Yoon, Soo Young
2017-04-01
The National Health Information Standards Committee was established in 2004 in Korea. The practical subcommittee for laboratory test terminology was placed in charge of standardizing laboratory medicine terminology in Korean. We aimed to establish a standardized Korean laboratory terminology database, Korea-Logical Observation Identifier Names and Codes (K-LOINC), based on former products sponsored by this committee. The primary product was revised based on the opinions of specialists. Next, we mapped the electronic data interchange (EDI) codes that were revised in 2014 to the corresponding K-LOINC. We established a database of synonyms, including the laboratory codes of three reference laboratories and four tertiary hospitals in Korea. Furthermore, we supplemented the clinical microbiology section of K-LOINC using an alternative mapping strategy. We also examined other systems that utilize laboratory codes in order to assess the compatibility of K-LOINC with statistical standards for a number of tests. A total of 48,990 laboratory codes were adopted (21,539 new and 16,330 revised). All of the LOINC synonyms were translated into Korean, and 39,347 Korean synonyms were added. Moreover, 21,773 synonyms were added from reference laboratories and tertiary hospitals. Alternative strategies were established for mapping within the microbiology domain. When we applied these to a smaller hospital, the mapping rate was successfully increased. Finally, we confirmed K-LOINC compatibility with other statistical standards, including a newly proposed EDI code system. This project successfully established an up-to-date standardized Korean laboratory terminology database, as well as an updated EDI mapping to facilitate the introduction of standard terminology into institutions. © 2017 The Korean Academy of Medical Sciences.
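The mapping workflow described above is essentially normalization plus synonym lookup; the toy fragment below sketches it with a tiny synonym table. The local test names are invented, the two LOINC codes shown are common examples, and the real K-LOINC synonym database and EDI code lists are far larger.

    # Toy local-code -> LOINC mapping via a synonym table (illustrative entries only).
    synonym_to_loinc = {
        "hemoglobin": "718-7",
        "hgb": "718-7",
        "glucose, fasting": "1558-6",
        "fbs": "1558-6",
    }

    def normalize(name: str) -> str:
        # Lowercase, drop hyphens, collapse whitespace before lookup
        return " ".join(name.lower().replace("-", " ").split())

    local_tests = ["HGB", "Fasting  glucose", "FBS", "Platelet count"]
    mapped, unmapped = {}, []
    for test in local_tests:
        key = normalize(test)
        if key in synonym_to_loinc:
            mapped[test] = synonym_to_loinc[key]
        else:
            unmapped.append(test)

    print("mapped:", mapped)
    print("unmapped (need manual review):", unmapped)
    print("mapping rate: {:.0%}".format(len(mapped) / len(local_tests)))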
Development of WRF-CO2 4DVAR Data Assimilation System
NASA Astrophysics Data System (ADS)
Zheng, T.; French, N. H. F.
2016-12-01
Four-dimensional variational (4DVar) assimilation systems have been widely used for CO2 inverse modeling at the global scale. At the regional scale, however, 4DVar assimilation systems have been lacking. At present, most regional CO2 inverse models use Lagrangian particle backward trajectory tools to compute the influence function in an analytical/synthesis framework. To provide a 4DVar-based alternative, we developed WRF-CO2 4DVAR based on the Weather Research and Forecasting model (WRF), its chemistry extension (WRF-Chem), and its data assimilation system (WRFDA/WRFPLUS). Unlike WRFDA, WRF-CO2 4DVAR does not optimize the meteorological initial condition; instead, it solves for optimized CO2 surface fluxes (sources/sinks) constrained by atmospheric CO2 observations. Based on WRFPLUS, we developed tangent linear and adjoint code for CO2 emission, advection, vertical mixing in the boundary layer, and convective transport. Furthermore, we implemented an incremental algorithm to solve for optimized CO2 emission scaling factors by iteratively minimizing the cost function in a Bayesian framework. The model sensitivity (of atmospheric CO2 with respect to the emission scaling factor) calculated by the tangent linear and adjoint model agrees well with that calculated by finite difference, indicating the validity of the newly developed code. The effectiveness of WRF-CO2 4DVar for inverse modeling is tested using forward-model-generated pseudo-observation data in two experiments: the first-guess CO2 fluxes have a 50% overestimation in the first case and a 50% underestimation in the second. In both cases, WRF-CO2 4DVar reduces the cost function to less than 10^-4 of its initial value in fewer than 20 iterations and successfully recovers the true values of the emission scaling factors. We expect future applications of WRF-CO2 4DVar with satellite observations will provide insights for CO2 regional inverse modeling, including the impacts of model transport error in vertical mixing.
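The gradient check and the incremental minimisation described above can be illustrated with a toy quadratic 4DVar problem. The dimensions, covariances and the linear operator H below are invented for illustration and bear no relation to the actual WRF-CO2 configuration.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
n_scale, n_obs = 10, 50                     # hypothetical problem sizes
H = rng.normal(size=(n_obs, n_scale))       # stands in for the tangent-linear transport model
lam_true = rng.uniform(0.5, 1.5, n_scale)   # "true" emission scaling factors
y = H @ lam_true                            # pseudo-observations (noise-free)
lam_b = np.ones(n_scale)                    # first guess
B_inv = np.eye(n_scale) / 0.5**2            # inverse background (prior) error covariance
R_inv = np.eye(n_obs) / 0.1**2              # inverse observation error covariance

def cost_and_grad(lam):
    db, dy = lam - lam_b, H @ lam - y
    J = 0.5 * db @ B_inv @ db + 0.5 * dy @ R_inv @ dy
    return J, B_inv @ db + H.T @ R_inv @ dy  # gradient via the adjoint (H.T)

# Finite-difference check of the adjoint-based gradient, analogous to the TL/AD validation
J0, g0 = cost_and_grad(lam_b)
eps = 1e-6
g_fd = np.array([(cost_and_grad(lam_b + eps * e)[0] - J0) / eps for e in np.eye(n_scale)])
print("max |adjoint - finite difference| gradient difference:", np.max(np.abs(g0 - g_fd)))

# An off-the-shelf iterative minimiser stands in for the incremental 4DVar algorithm
res = minimize(cost_and_grad, lam_b, jac=True, method="L-BFGS-B")
print("recovered scaling factors:", np.round(res.x, 3))
```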
Dehury, Budheswar; Panda, Debashis; Sahu, Jagajjit; Sahu, Mousumi; Sarma, Kishore; Barooah, Madhumita; Sen, Priyabrata; Modi, Mahendra Kumar
2013-01-01
The endogenous small non-coding micro RNAs (miRNAs), which are typically ~21–24 nucleotides (nt) long, play a crucial role in regulating the intrinsic normal growth of cells and the development of plants, as well as in maintaining the integrity of genomes. These small non-coding RNAs function as the universal specificity factors in post-transcriptional gene silencing. Discovering miRNAs, identifying their targets, and further inferring miRNA functions is a routine process to understand the normal biological processes of miRNAs and their roles in the development of plants. A comparative genomics-based approach using expressed sequence tags (ESTs) and genome survey sequences (GSS) offers a cost-effective platform for the identification and characterization of miRNAs and their target genes in plants. Despite the fact that sweet potato (Ipomoea batatas L.) is an important staple food source for poor small farmers throughout the world, the role of miRNA in various developmental processes remains largely unknown. In this paper, we report the computational identification of miRNAs and their target genes in sweet potato from its ESTs. Using a comparative genomics-based approach, 8 potential miRNA candidates belonging to the miR168, miR2911, and miR156 families were identified from 23,406 ESTs in sweet potato. A total of 42 target genes were predicted and their probable functions were illustrated. Most of the newly identified miRNAs target transcription factors as well as genes involved in plant growth and development, signal transduction, metabolism, defense, and stress response. The identification of miRNAs and their targets is expected to accelerate the pace of miRNA discovery, leading to an improved understanding of the role of miRNA in the development and physiology of sweet potato, as well as in stress response. PMID:24067297
Schroeder, H; Hoeltken, A M; Fladung, M
2012-03-01
Within the genus Populus several species belonging to different sections are cross-compatible. Hence, high numbers of interspecies hybrids occur naturally and, additionally, have been artificially produced in huge breeding programmes during the last 100 years. Therefore, determination of a single poplar species, used for the production of 'multi-species hybrids' is often difficult, and represents a great challenge for the use of molecular markers in species identification. Within this study, over 20 chloroplast regions, both intergenic spacers and coding regions, have been tested for their ability to differentiate different poplar species using 23 already published barcoding primer combinations and 17 newly designed primer combinations. About half of the published barcoding primers yielded amplification products, whereas the new primers designed on the basis of the total sequenced cpDNA genome of Populus trichocarpa Torr. & Gray yielded much higher amplification success. Intergenic spacers were found to be more variable than coding regions within the genus Populus. The highest discrimination power of Populus species was found in the combination of two intergenic spacers (trnG-psbK, psbK-psbl) and the coding region rpoC. In barcoding projects, the coding regions matK and rbcL are often recommended, but within the genus Populus they only show moderate variability and are not efficient in species discrimination. © 2011 German Botanical Society and The Royal Botanical Society of the Netherlands.
Tang, Clara S; Zhang, He; Cheung, Chloe Y Y; Xu, Ming; Ho, Jenny C Y; Zhou, Wei; Cherny, Stacey S; Zhang, Yan; Holmen, Oddgeir; Au, Ka-Wing; Yu, Haiyi; Xu, Lin; Jia, Jia; Porsch, Robert M; Sun, Lijie; Xu, Weixian; Zheng, Huiping; Wong, Lai-Yung; Mu, Yiming; Dou, Jingtao; Fong, Carol H Y; Wang, Shuyu; Hong, Xueyu; Dong, Liguang; Liao, Yanhua; Wang, Jiansong; Lam, Levina S M; Su, Xi; Yan, Hua; Yang, Min-Lee; Chen, Jin; Siu, Chung-Wah; Xie, Gaoqiang; Woo, Yu-Cho; Wu, Yangfeng; Tan, Kathryn C B; Hveem, Kristian; Cheung, Bernard M Y; Zöllner, Sebastian; Xu, Aimin; Eugene Chen, Y; Jiang, Chao Qiang; Zhang, Youyi; Lam, Tai-Hing; Ganesh, Santhi K; Huo, Yong; Sham, Pak C; Lam, Karen S L; Willer, Cristen J; Tse, Hung-Fat; Gao, Wei
2015-12-22
Blood lipids are important risk factors for coronary artery disease (CAD). Here we perform an exome-wide association study by genotyping 12,685 Chinese, using a custom Illumina HumanExome BeadChip, to identify additional loci influencing lipid levels. Single-variant association analysis on 65,671 single nucleotide polymorphisms reveals 19 loci associated with lipids at exome-wide significance (P < 2.69 × 10^-7), including three Asian-specific coding variants in known genes (CETP p.Asp459Gly, PCSK9 p.Arg93Cys and LDLR p.Arg257Trp). Furthermore, missense variants at two novel loci, PNPLA3 p.Ile148Met and PKD1L3 p.Thr429Ser, also influence levels of triglycerides and low-density lipoprotein cholesterol, respectively. Another novel gene, TEAD2, is found to be associated with high-density lipoprotein cholesterol through gene-based association analysis. Most of these newly identified coding variants show suggestive association (P < 0.05) with CAD. These findings demonstrate that exome-wide genotyping on samples of non-European ancestry can identify additional population-specific possible causal variants, shedding light on novel lipid biology and CAD.
Optimized bit extraction using distortion modeling in the scalable extension of H.264/AVC.
Maani, Ehsan; Katsaggelos, Aggelos K
2009-09-01
The newly adopted scalable extension of H.264/AVC video coding standard (SVC) demonstrates significant improvements in coding efficiency in addition to an increased degree of supported scalability relative to the scalable profiles of prior video coding standards. Due to the complicated hierarchical prediction structure of the SVC and the concept of key pictures, content-aware rate adaptation of SVC bit streams to intermediate bit rates is a nontrivial task. The concept of quality layers has been introduced in the design of the SVC to allow for fast content-aware prioritized rate adaptation. However, existing quality layer assignment methods are suboptimal and do not consider all network abstraction layer (NAL) units from different layers for the optimization. In this paper, we first propose a technique to accurately and efficiently estimate the quality degradation resulting from discarding an arbitrary number of NAL units from multiple layers of a bitstream by properly taking drift into account. Then, we utilize this distortion estimation technique to assign quality layers to NAL units for a more efficient extraction. Experimental results show that a significant gain can be achieved by the proposed scheme.
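As a rough illustration of distortion-aware prioritisation (not the paper's algorithm, which additionally models drift and inter-layer dependencies), the sketch below ranks NAL units by estimated distortion impact per bit, maps the ranking to quality layers, and extracts units under a bit budget. All field names are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class NalUnit:
    nal_id: int
    size_bits: int
    delta_distortion: float  # estimated quality loss if this NAL unit is discarded

def assign_quality_layers(nal_units, num_layers=8):
    """Toy prioritisation: rank NAL units by distortion impact per bit and
    map the ranking onto a fixed number of quality layers."""
    ranked = sorted(nal_units, key=lambda u: u.delta_distortion / u.size_bits, reverse=True)
    layers = {}
    per_layer = max(1, len(ranked) // num_layers)
    for rank, unit in enumerate(ranked):
        layers[unit.nal_id] = min(rank // per_layer, num_layers - 1)
    return layers

def extract(nal_units, layers, bit_budget):
    """Keep NAL units in order of increasing quality-layer index until the budget is spent."""
    kept, spent = [], 0
    for unit in sorted(nal_units, key=lambda u: layers[u.nal_id]):
        if spent + unit.size_bits <= bit_budget:
            kept.append(unit.nal_id)
            spent += unit.size_bits
    return kept
```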
SU-D-BRD-03: A Gateway for GPU Computing in Cancer Radiotherapy Research
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jia, X; Folkerts, M; Shi, F
Purpose: Graphics Processing Unit (GPU) computing has become increasingly important in radiotherapy. However, it is still difficult for general clinical researchers to access GPU codes developed by other researchers, and for developers to objectively benchmark their codes. Moreover, it is quite common to see repeated effort spent on developing low-quality GPU codes. The goal of this project is to establish an infrastructure for testing GPU codes, cross-comparing them, and facilitating code distribution in the radiotherapy community. Methods: We developed a system called Gateway for GPU Computing in Cancer Radiotherapy Research (GCR2). A number of GPU codes developed by our group and other developers can be accessed via a web interface. To use the services, researchers first upload their test data or use the standard data provided by our system. Then they can select the GPU device on which the code will be executed. Our system offers all mainstream GPU hardware for code benchmarking purposes. After the run is complete, the system automatically summarizes and displays the computing results. We also released an SDK to allow developers to build their own algorithm implementations and submit their binary codes to the system. The submitted code is then systematically benchmarked using a variety of GPU hardware and representative data provided by our system. The developers can also compare their codes with others and generate benchmarking reports. Results: The developed system is fully functioning. Through a user-friendly web interface, researchers are able to test various GPU codes. Developers also benefit from this platform by comprehensively benchmarking their codes on various GPU platforms and representative clinical data sets. Conclusion: We have developed an open platform allowing clinical researchers and developers to access GPUs and GPU codes. This development will facilitate the utilization of GPUs in the radiation therapy field.
REM sleep selectively prunes and maintains new synapses in development and learning.
Li, Wei; Ma, Lei; Yang, Guang; Gan, Wen-Biao
2017-03-01
The functions and underlying mechanisms of rapid eye movement (REM) sleep remain unclear. Here we show that REM sleep prunes newly formed postsynaptic dendritic spines of layer 5 pyramidal neurons in the mouse motor cortex during development and motor learning. This REM sleep-dependent elimination of new spines facilitates subsequent spine formation during development and when a new motor task is learned, indicating a role for REM sleep in pruning to balance the number of new spines formed over time. Moreover, REM sleep also strengthens and maintains newly formed spines, which are critical for neuronal circuit development and behavioral improvement after learning. We further show that dendritic calcium spikes arising during REM sleep are important for pruning and strengthening new spines. Together, these findings indicate that REM sleep has multifaceted functions in brain development, learning and memory consolidation by selectively eliminating and maintaining newly formed synapses via dendritic calcium spike-dependent mechanisms.
Development and application of the GIM code for the Cyber 203 computer
NASA Technical Reports Server (NTRS)
Stainaker, J. F.; Robinson, M. A.; Rawlinson, E. G.; Anderson, P. G.; Mayne, A. W.; Spradley, L. W.
1982-01-01
The GIM computer code for fluid dynamics research was developed. Enhancement of the computer code, implicit algorithm development, turbulence model implementation, chemistry model development, interactive input module coding and wing/body flowfield computation are described. The GIM quasi-parabolic code development was completed, and the code used to compute a number of example cases. Turbulence models, algebraic and differential equations, were added to the basic viscous code. An equilibrium reacting chemistry model and implicit finite difference scheme were also added. Development was completed on the interactive module for generating the input data for GIM. Solutions for inviscid hypersonic flow over a wing/body configuration are also presented.
Song, Yuhyun; Leman, Scotland; Monteil, Caroline L.; Heath, Lenwood S.; Vinatzer, Boris A.
2014-01-01
A broadly accepted and stable biological classification system is a prerequisite for biological sciences. It provides the means to describe and communicate about life without ambiguity. Current biological classification and nomenclature use the species as the basic unit and require lengthy and laborious species descriptions before newly discovered organisms can be assigned to a species and be named. The current system is thus inadequate to classify and name the immense genetic diversity within species that is now being revealed by genome sequencing on a daily basis. To address this lack of a general intra-species classification and naming system adequate for today’s speed of discovery of new diversity, we propose a classification and naming system that is exclusively based on genome similarity and that is suitable for automatic assignment of codes to any genome-sequenced organism without requiring any phenotypic or phylogenetic analysis. We provide examples demonstrating that genome similarity-based codes largely align with current taxonomic groups at many different levels in bacteria, animals, humans, plants, and viruses. Importantly, the proposed approach is only slightly affected by the order of code assignment and can thus provide codes that reflect similarity between organisms and that do not need to be revised upon discovery of new diversity. We envision genome similarity-based codes to complement current biological nomenclature and to provide a universal means to communicate unambiguously about any genome-sequenced organism in fields as diverse as biodiversity research, infectious disease control, human and microbial forensics, animal breed and plant cultivar certification, and human ancestry research. PMID:24586551
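A deliberately simplified, order-dependent sketch of such genome-similarity-based code assignment is given below; the similarity thresholds and the greedy nearest-genome rule are assumptions for illustration, not the authors' published procedure.

```python
def assign_code(new_sim, assigned, thresholds=(0.95, 0.999, 0.99999)):
    """Greedy sketch of similarity-based code assignment.

    new_sim    -- dict mapping already-coded genome IDs to similarity with the new genome
    assigned   -- dict mapping genome IDs to their code tuples, e.g. (1, 3, 2)
    thresholds -- similarity required to share the code prefix at each level
    """
    best_id = max(new_sim, key=new_sim.get, default=None)   # closest already-coded genome
    code = []
    if best_id is not None:
        best_code, sim = assigned[best_id], new_sim[best_id]
        for level, t in enumerate(thresholds):
            if sim >= t:
                code.append(best_code[level])   # same group as the closest genome at this level
            else:
                break
    # Open a new group at the first level where no existing genome is similar enough
    while len(code) < len(thresholds):
        level = len(code)
        used = {c[level] for c in assigned.values() if tuple(c[:level]) == tuple(code)}
        code.append(max(used, default=0) + 1)
    return tuple(code)

# Example: a genome 99.95% similar to genome "gA" (code (1, 2, 1)) shares two code levels
print(assign_code({"gA": 0.9995}, {"gA": (1, 2, 1)}))  # -> (1, 2, 2)
```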
Olson, Nicole A; Davidow, Amy L; Winston, Carla A; Chen, Michael P; Gazmararian, Julie A; Katz, Dolores J
2012-05-18
Tuberculosis (TB) in developed countries has historically been associated with poverty and low socioeconomic status (SES). In the past quarter century, TB in the United States has changed from primarily a disease of native-born to primarily a disease of foreign-born persons, who accounted for more than 60% of newly-diagnosed TB cases in 2010. The purpose of this study was to assess the association of SES with rates of TB in U.S.-born and foreign-born persons in the United States, overall and for the five most common foreign countries of origin. National TB surveillance data for 1996-2005 was linked with ZIP Code-level measures of SES (crowding, unemployment, education, and income) from U.S. Census 2000. ZIP Codes were grouped into quartiles from low SES to high SES and TB rates were calculated for foreign-born and U.S.-born populations in each quartile. TB rates were highest in the quartiles with low SES for both U.S.-born and foreign-born populations. However, while TB rates increased five-fold or more from the two highest to the two lowest SES quartiles among the U.S.-born, they increased only by a factor of 1.3 among the foreign-born. Low SES is only weakly associated with TB among foreign-born persons in the United States. The traditional associations of TB with poverty are not sufficient to explain the epidemiology of TB among foreign-born persons in this country and perhaps in other developed countries. TB outreach and research efforts that focus only on low SES will miss an important segment of the foreign-born population.
Genomic Sequencing and Characterization of Cynomolgus Macaque Cytomegalovirus
Marsh, Angie K.; Willer, David O.; Ambagala, Aruna P. N.; Dzamba, Misko; Chan, Jacqueline K.; Pilon, Richard; Fournier, Jocelyn; Sandstrom, Paul; Brudno, Michael; MacDonald, Kelly S.
2011-01-01
Cytomegalovirus (CMV) infection is the most common opportunistic infection in immunosuppressed individuals, such as transplant recipients or people living with HIV/AIDS, and congenital CMV is the leading viral cause of developmental disabilities in infants. Due to the highly species-specific nature of CMV, animal models that closely recapitulate human CMV (HCMV) are of growing importance for vaccine development. Here we present the genomic sequence of a novel nonhuman primate CMV from cynomolgus macaques (Macaca fascicularis; CyCMV). CyCMV (Ottawa strain) was isolated from the urine of a healthy, captive-bred, 4-year-old cynomolgus macaque of Philippine origin, and the viral genome was sequenced using next-generation Illumina sequencing to an average of 516-fold coverage. The CyCMV genome is 218,041 bp in length, with 49.5% G+C content and 84% protein-coding density. We have identified 262 putative open reading frames (ORFs) with an average coding length of 789 bp. The genomic organization of CyCMV is largely colinear with that of rhesus macaque CMV (RhCMV). Of the 262 CyCMV ORFs, 137 are homologous to HCMV genes, 243 are homologous to RhCMV 68.1, and 200 are homologous to RhCMV 180.92. CyCMV encodes four ORFs that are not present in RhCMV strain 68.1 or 180.92 but have homologies with HCMV (UL30, UL74A, UL126, and UL146). Similar to HCMV, CyCMV does not produce the RhCMV-specific viral homologue of cyclooxygenase-2. This newly characterized CMV may provide a novel model in which to study CMV biology and HCMV vaccine development. PMID:21994460
Design search and optimization in aerospace engineering.
Keane, A J; Scanlan, J P
2007-10-15
In this paper, we take a design-led perspective on the use of computational tools in the aerospace sector. We briefly review the current state-of-the-art in design search and optimization (DSO) as applied to problems from aerospace engineering, focusing on those problems that make heavy use of computational fluid dynamics (CFD). This ranges over issues of representation, optimization problem formulation and computational modelling. We then follow this with a multi-objective, multi-disciplinary example of DSO applied to civil aircraft wing design, an area where this kind of approach is becoming essential for companies to maintain their competitive edge. Our example considers the structure and weight of a transonic civil transport wing, its aerodynamic performance at cruise speed and its manufacturing costs. The goals are low drag and cost while holding weight and structural performance at acceptable levels. The constraints and performance metrics are modelled by a linked series of analysis codes, the most expensive of which is a CFD analysis of the aerodynamics using an Euler code with coupled boundary layer model. Structural strength and weight are assessed using semi-empirical schemes based on typical airframe company practice. Costing is carried out using a newly developed generative approach based on a hierarchical decomposition of the key structural elements of a typical machined and bolted wing-box assembly. To carry out the DSO process in the face of multiple competing goals, a recently developed multi-objective probability of improvement formulation is invoked along with stochastic process response surface models (Krigs). This approach both mitigates the significant run times involved in CFD computation and also provides an elegant way of balancing competing goals while still allowing the deployment of the whole range of single objective optimizers commonly available to design teams.
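For the surrogate-based search step, a minimal single-objective probability-of-improvement criterion can be written as below, assuming a Kriging (Gaussian process) model that returns a predictive mean and standard deviation; the paper's multi-objective formulation generalises this idea, and the candidate values shown are hypothetical.

```python
import numpy as np
from scipy.stats import norm

def probability_of_improvement(mu, sigma, f_best, xi=0.0):
    """P[f(x) < f_best - xi] under a Gaussian surrogate prediction N(mu, sigma^2).

    mu, sigma -- Kriging predictive mean and standard deviation at candidate designs
    f_best    -- best objective value observed so far (minimisation)
    xi        -- optional improvement margin
    """
    sigma = np.maximum(sigma, 1e-12)        # guard against zero predictive variance
    return norm.cdf((f_best - xi - mu) / sigma)

# Example: pick the next design from a set of candidates scored by the surrogate
mu = np.array([1.2, 0.9, 1.0])
sigma = np.array([0.05, 0.30, 0.01])
next_design = int(np.argmax(probability_of_improvement(mu, sigma, f_best=1.0)))
print(next_design)  # the uncertain, promising candidate (index 1) is selected
```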
6 CFR 7.26 - Derivative classification.
Code of Federal Regulations, 2011 CFR
2011-01-01
... already classified, and marking the newly developed material consistent with the classification markings... classification decisions and carry forward to any newly created documents the pertinent classification markings. (d) Information classified derivatively from other classified information shall be classified and...
Tetteh, Kevin K. A.; Loukas, Alex; Tripp, Cindy; Maizels, Rick M.
1999-01-01
Larvae of Toxocara canis, a nematode parasite of dogs, infect humans, causing visceral and ocular larva migrans. In noncanid hosts, larvae neither grow nor differentiate but endure in a state of arrested development. Reasoning that parasite protein production is orientated to immune evasion, we undertook a random sequencing project from a larval cDNA library to characterize the most highly expressed transcripts. In all, 266 clones were sequenced, most from both 3′ and 5′ ends, and similarity searches against GenBank protein and dbEST nucleotide databases were conducted. Cluster analyses showed that 128 distinct gene products had been found, all but 3 of which represented newly identified genes. Ninety-five genes were represented by a single clone, but seven transcripts were present at high frequencies, each composing >2% of all clones sequenced. These high-abundance transcripts include a mucin and a C-type lectin, which are both major excretory-secretory antigens released by parasites. Four highly expressed novel gene transcripts, termed ant (abundant novel transcript) genes, were found. Together, these four genes comprised 18% of all cDNA clones isolated, but no similar sequences occur in the Caenorhabditis elegans genome. While the coding regions of the four genes are dissimilar, their 3′ untranslated tracts have significant homology in nucleotide sequence. The discovery of these abundant, parasite-specific genes of newly identified lectins and mucins, as well as a range of conserved and novel proteins, provides defined candidates for future analysis of the molecular basis of immune evasion by T. canis. PMID:10456930
Huang, Lin; Lange, Miles D.; Zhang, Zhixin
2014-01-01
VH replacement occurs through RAG-mediated secondary recombination between a rearranged VH gene and an upstream unrearranged VH gene. Due to the location of the cryptic recombination signal sequence (cRSS, TACTGTG) at the 3′ end of the VH gene coding region, a short stretch of nucleotides from the previously rearranged VH gene can be retained in the newly formed VH–DH junction as a “footprint” of VH replacement. Such footprints can be used as markers to identify Ig heavy chain (IgH) genes potentially generated through VH replacement. To explore the contribution of VH replacement products to the antibody repertoire, we developed a Java-based computer program, VH replacement footprint analyzer-I (VHRFA-I), to analyze published or newly obtained IgH genes from human or mouse. The VHRFA-I program has multiple functional modules: it first uses the service provided by the IMGT/V-QUEST program to assign potential VH, DH, and JH germline genes; then, it searches for VH replacement footprint motifs within the VH–DH junction (N1) regions of IgH gene sequences to identify potential VH replacement products; it can also analyze the frequencies of VH replacement products in correlation with publications, keywords, or VH, DH, and JH gene usages, and mutation status; it can further analyze the amino acid usages encoded by the identified VH replacement footprints. In summary, this program provides a useful computational tool for exploring the biological significance of VH replacement products in human and mouse. PMID:24575092
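The footprint search itself amounts to motif matching within the N1 junction sequence. The Python sketch below illustrates the idea; only TACTGTG is taken from the abstract, the other motifs are hypothetical placeholders, and this is not the program's actual (Java-based) footprint library.

```python
import re

# TACTGTG is the cRSS heptamer mentioned above; the additional motifs are
# hypothetical placeholders standing in for the published VH 3'-end footprint set.
FOOTPRINT_MOTIFS = ["TACTGTG", "TATTGTG", "TGTGCGA"]

def find_footprints(n1_sequence):
    """Return (motif, position) pairs for candidate VH replacement footprints
    found within a VH-DH junction (N1) sequence."""
    hits = []
    for motif in FOOTPRINT_MOTIFS:
        for m in re.finditer(motif, n1_sequence.upper()):
            hits.append((motif, m.start()))
    return hits

print(find_footprints("ccTACTGTGgactacgg"))  # -> [('TACTGTG', 2)]
```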
Orthopoxvirus Genome Evolution: The Role of Gene Loss
Hendrickson, Robert Curtis; Wang, Chunlin; Hatcher, Eneida L.; Lefkowitz, Elliot J.
2010-01-01
Poxviruses are highly successful pathogens, known to infect a variety of hosts. The family Poxviridae includes Variola virus, the causative agent of smallpox, which has been eradicated as a public health threat but could potentially reemerge as a bioterrorist threat. The risk scenario includes other animal poxviruses and genetically engineered manipulations of poxviruses. Studies of orthologous gene sets have established the evolutionary relationships of members within the Poxviridae family. It is not clear, however, how variations between family members arose in the past, an important issue in understanding how these viruses may vary and possibly produce future threats. Using a newly developed poxvirus-specific tool, we predicted accurate gene sets for viruses with completely sequenced genomes in the genus Orthopoxvirus. Employing sensitive sequence comparison techniques together with comparison of syntenic gene maps, we established the relationships between all viral gene sets. These techniques allowed us to unambiguously identify the gene loss/gain events that have occurred over the course of orthopoxvirus evolution. It is clear that no existing Orthopoxvirus species has acquired protein-coding genes unique to that species. All existing species contain genes that are all present in members of the species Cowpox virus, and cowpox virus strains contain every gene present in any other orthopoxvirus strain. These results support a theory of reductive evolution in which the reduction in size of the core gene set of a putative ancestral virus played a critical role in speciation and in confining any newly emerging virus species to a particular environmental (host or tissue) niche. PMID:21994715
Sato, Tatsuhiko; Masunaga, Shin-Ichiro; Kumada, Hiroaki; Hamada, Nobuyuki
2018-01-17
We here propose a new model for estimating the biological effectiveness of boron neutron capture therapy (BNCT), considering intra- and intercellular heterogeneity in 10B distribution. The new model was developed from our previously established stochastic microdosimetric kinetic model, which determines the surviving fraction of cells irradiated with any radiation. In the model, the probability density of the absorbed doses at microscopic scales is the fundamental physical index for characterizing the radiation fields. A new computational method was established to determine this probability density for application to BNCT, using the Particle and Heavy Ion Transport code System PHITS. The parameters used in the model were determined from the measured surviving fraction of tumor cells administered with two kinds of 10B compounds. The model quantitatively highlighted the indispensable need to consider the synergistic effect and the dose dependence of the biological effectiveness in estimating the therapeutic effect of BNCT. The model can predict the biological effectiveness of newly developed 10B compounds based on their intra- and intercellular distributions, and thus it can play important roles not only in treatment planning but also in drug discovery research for future BNCT.
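For orientation, the conventional (non-stochastic) microdosimetric kinetic model from which such approaches derive relates the surviving fraction to the dose-mean specific energy per event. The form below uses common MKM notation and is not taken from the paper; the stochastic extension instead averages survival over the full probability density of microscopic doses, which is what the PHITS-based calculation supplies.

```latex
% Conventional microdosimetric kinetic model (illustrative; standard notation)
-\ln S(D) \;=\; \left(\alpha_0 + \beta\,\bar{z}_{1D}\right) D \;+\; \beta D^{2}
% D: macroscopic absorbed dose; \bar{z}_{1D}: dose-mean specific energy per event
% in the sensitive domain; \alpha_0, \beta: cell-line-specific parameters.
```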
A new method for the measurement of anteversion of the acetabular cup after total hip arthroplasty.
Aydogan, Mehmet; Burç, Halil; Saka, Gursel
2014-08-01
Many methods of determining the anteversion of the acetabular cup have been described in the literature. The advantages and disadvantages of each of these methods are discussed in this paper. We present a new method of measuring acetabular anteversion on anteroposterior hip radiographs. The formula designed by the authors was: anteversion angle (α) = arcsin(|PK| / √(|AK| × |BK|)). The formula was tested using the AutoCAD software, and an experimental study was conducted to evaluate its accuracy. Three groups were created, and 16 X-ray images were taken and coded. Ten orthopaedic surgeons measured the acetabular anteversion from these X-rays using our formula. The results in Group 1 were closer to the actual value; in contrast, the results in Group 2 differed from the actual values. The results in Group 3 were as close to the actual anteversion values as were those in Group 1. Developments in technology often bring an increase in complications. Despite newly developed surgical methods and technology, the position of the acetabular cup is still used to determine the results of a total hip arthroplasty. Our method is simple, cost-effective and achieves almost 100% accuracy.
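A small Python helper implementing this formula (with hypothetical example distances in millimetres) might look as follows.

```python
import math

def anteversion_deg(PK, AK, BK):
    """Acetabular cup anteversion from the AP radiograph distances used above:
    alpha = arcsin( |PK| / sqrt(|AK| * |BK|) ), returned in degrees."""
    ratio = PK / math.sqrt(AK * BK)
    ratio = min(1.0, max(-1.0, ratio))      # clamp to guard against measurement noise
    return math.degrees(math.asin(ratio))

print(anteversion_deg(PK=12.0, AK=40.0, BK=38.0))  # roughly 18 degrees for these example distances
```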
A Structured Grid Based Solution-Adaptive Technique for Complex Separated Flows
NASA Technical Reports Server (NTRS)
Thornburg, Hugh; Soni, Bharat K.; Kishore, Boyalakuntla; Yu, Robert
1996-01-01
The objective of this work was to enhance the predictive capability of widely used computational fluid dynamics (CFD) codes through the use of solution-adaptive gridding. Most problems of engineering interest involve multi-block grids and widely disparate length scales. Hence, it is desirable that the adaptive grid feature detection algorithm be developed to recognize flow structures of different types as well as differing intensity, and adequately address scaling and normalization across blocks. In order to study the accuracy and efficiency improvements due to the grid adaptation, it is necessary to quantify grid size and distribution requirements as well as computational times of non-adapted solutions. Flow fields about launch vehicles of practical interest often involve supersonic freestream conditions at angle of attack, exhibiting large-scale separated vortical flow, vortex-vortex and vortex-surface interactions, separated shear layers and multiple shocks of different intensity. In this work, a weight function and an associated mesh redistribution procedure are presented which detect and resolve these features without user intervention. Particular emphasis has been placed upon accurate resolution of expansion regions and boundary layers. Flow past a wedge at Mach = 2.0 is used to illustrate the enhanced detection capabilities of this newly developed weight function.
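A one-dimensional caricature of such a feature-detection weight function and equidistribution-based node redistribution is sketched below; the blend of normalised first and second derivatives and the coefficients are illustrative assumptions, not the paper's multi-block formulation.

```python
import numpy as np

def weight_function(q, x, alpha=1.0, beta=0.5):
    """Illustrative 1-D adaptation weight: w = 1 + alpha*|dq/dx|_n + beta*|d2q/dx2|_n,
    with each term normalised by its maximum so that regions with different
    scales contribute comparably (the cross-block scaling issue noted above)."""
    g1 = np.abs(np.gradient(q, x))
    g2 = np.abs(np.gradient(np.gradient(q, x), x))
    return 1.0 + alpha * g1 / g1.max() + beta * g2 / g2.max()

def redistribute(x, w, n_new=None):
    """Equidistribute nodes so each interval carries roughly equal integrated weight."""
    n_new = n_new or len(x)
    cdf = np.concatenate(([0.0], np.cumsum(0.5 * (w[1:] + w[:-1]) * np.diff(x))))
    targets = np.linspace(0.0, cdf[-1], n_new)
    return np.interp(targets, cdf, x)

x = np.linspace(0.0, 1.0, 101)
q = np.tanh((x - 0.5) / 0.02)                    # a shock-like profile
x_new = redistribute(x, weight_function(q, x))   # nodes cluster near x = 0.5
```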
Bravender, Terrill; Tulsky, James A.; Farrell, David; Alexander, Stewart C.; Østbye, Truls; Lyna, Pauline; Dolor, Rowena J.; Coffman, Cynthia J.; Bilheimer, Alicia; Lin, Pao-Hwa; Pollak, Kathryn I.
2013-01-01
Objective: To describe the theoretical basis, use, and satisfaction with Teen CHAT, an online educational intervention designed to improve physician-adolescent communication about healthy weight. Methods: Routine health maintenance encounters between pediatricians and family practitioners and their overweight adolescent patients were audio recorded, and content was coded to summarize adherence with motivational interviewing techniques. An online educational intervention was developed using constructs from social cognitive theory and using personalized audio recordings. Physicians were randomized to the online intervention or not, and completed post-intervention surveys. Results: Forty-six physicians were recruited, and 22 physicians were randomized to view the intervention website. The educational intervention took an average of 54 minutes to complete, and most physicians thought it was useful, that they would use newly acquired skills with their patients, and would recommend it to others. Fewer physicians thought it helped them address confidentiality issues with their adolescent patients. Conclusion: The Teen CHAT online intervention shows potential for enhancing physician motivational interviewing skills in an acceptable and time-efficient manner. Practice Implications: If found to be effective in enhancing motivational interviewing skills and changing adolescent weight-related behaviors, wide dissemination will be feasible and indicated. PMID:24021419
A Noninvasive In Vitro Monitoring System Reporting Skeletal Muscle Differentiation.
Öztürk-Kaloglu, Deniz; Hercher, David; Heher, Philipp; Posa-Markaryan, Katja; Sperger, Simon; Zimmermann, Alice; Wolbank, Susanne; Redl, Heinz; Hacobian, Ara
2017-01-01
Monitoring of cell differentiation is a crucial aspect of cell-based therapeutic strategies depending on tissue maturation. In this study, we have developed a noninvasive reporter system to trace murine skeletal muscle differentiation. Either a secreted bioluminescent reporter (Metridia luciferase) or a fluorescent reporter (green fluorescent protein [GFP]) was placed under the control of the truncated muscle creatine kinase (MCK) basal promoter enhanced by variable numbers of upstream MCK E-boxes. The engineered pE3MCK vector, coding a triple tandem of E-Boxes and the truncated MCK promoter, showed twentyfold higher levels of luciferase activation compared with a Cytomegalovirus (CMV) promoter. This newly developed reporter system allowed noninvasive monitoring of myogenic differentiation in a straining bioreactor. Additionally, binding sequences of endogenous microRNAs (miRNAs; seed sequences) that are known to be downregulated in myogenesis were ligated as complementary seed sequences into the reporter vector to reduce nonspecific signal background. The insertion of seed sequences improved the signal-to-noise ratio up to 25% compared with pE3MCK. Due to the highly specific, fast, and convenient expression analysis for cells undergoing myogenic differentiation, this reporter system provides a powerful tool for application in skeletal muscle tissue engineering.
NASA Astrophysics Data System (ADS)
Yakut, Kadri
2015-08-01
We present a detailed study of KIC 2306740, an eccentric double-lined eclipsing binary system with a pulsating component. Archive Kepler satellite data were combined with newly obtained spectroscopic data from the 4.2 m William Herschel Telescope (WHT). This allowed us to determine rather precise orbital and physical parameters of this long-period, slightly eccentric, pulsating binary system. Duplicity effects are extracted from the light curve in order to estimate pulsation frequencies from the residuals. We modelled the detached binary system assuming non-conservative evolution models with the Cambridge STARS (TWIN) code.
NASA Astrophysics Data System (ADS)
Goswami, Debarghya; Sinha, Debashis; Mandal, Pradip Kumar
2018-05-01
One newly synthesized fluorinated ferroelectric liquid crystal, (S)-(+)-4′-[(3-undecafluorohexanoyloxy)prop-1-oxy]biphenyl-4-yl 4-(1-methylheptyloxy)-benzoate (code name 5F3R), has been characterized by dielectric and electro-optic investigations. The sample exhibits only the SmC* phase over a considerable range of temperature. Only the Goldstone mode of relaxation has been observed in the dielectric study. Spontaneous polarization, response time, optical tilt angle, and rotational viscosity have also been measured. The values of the observed physical parameters and their temperature dependence have been compared with those of other samples of the same homologous series.
Quantifying Therapeutic and Diagnostic Efficacy in 2D Microvascular Images
NASA Technical Reports Server (NTRS)
Parsons-Wingerter, Patricia; Vickerman, Mary B.; Keith, Patricia A.
2009-01-01
VESGEN is a newly automated, user-interactive program that maps and quantifies the effects of vascular therapeutics and regulators on microvascular form and function. VESGEN analyzes two-dimensional, black and white vascular images by measuring important vessel morphology parameters. This software guides the user through each required step of the analysis process via a concise graphical user interface (GUI). Primary applications of the VESGEN code are 2D vascular images acquired as clinical diagnostic images of the human retina and as experimental studies of the effects of vascular regulators and therapeutics on vessel remodeling.
Modify Federal Tax Code to Create Incentives for Individuals to Obtain Coverage.
McGlynn, Elizabeth A
2011-01-01
This article explores how a refundable tax credit to offset the cost of health insurance premiums would affect health system performance along nine dimensions. A refundable tax credit would produce a slight gain in health as measured by life expectancy; 2.3 to 10 million people would become newly insured under this policy change. It is uncertain how the policy would affect waste or patient experience. Refundable tax credits would have no discernable effect on total health care spending, overall consumer financial risk, reliability of care, or health system capacity. Implementing refundable tax credits would be relatively easy.
Automated Coding Software: Development and Use to Enhance Anti-Fraud Activities
Garvin, Jennifer H.; Watzlaf, Valerie; Moeini, Sohrab
2006-01-01
This descriptive research project identified characteristics of automated coding systems that have the potential to detect improper coding and to minimize improper or fraudulent coding practices in the setting of automated coding used with the electronic health record (EHR). Recommendations were also developed for software developers and users of coding products to maximize anti-fraud practices. PMID:17238546
Johnson, Martin; Magnusson, Carin; Allan, Helen; Evans, Karen; Ball, Elaine; Horton, Khim; Curtis, Kathy; Westwood, Sue
2015-02-01
The role of the acute hospital nurse has moved away from the direct delivery of patient care and more towards the management of the delivery of bedside care by healthcare assistants. How newly qualified nurses delegate to and supervise healthcare assistants is important as failures can lead to care being missed, duplicated and/or incorrectly performed. The data described here form part of a wider study which explored how newly qualified nurses recontextualise knowledge into practice, and develop and apply effective delegation and supervision skills. This article analyses team working between newly qualified nurses and healthcare assistants, and nurses' balancing of administrative tasks with bedside care. Ethnographic case studies were undertaken in three hospital sites in England, using a mixed methods approach involving: participant observations; interviews with 33 newly qualified nurses, 10 healthcare assistants and 12 ward managers. Data were analysed using thematic analysis, aided by the qualitative software NVivo. Multiple demands upon the newly qualified nurses' time, particularly the pressures to maintain records, can influence how effectively they delegate to, and supervise, healthcare assistants. While some nurses and healthcare assistants work successfully together, others work 'in parallel' rather than as an efficient team. While some ward cultures and individual working styles promote effective team working, others lead to less efficient collaboration between newly qualified nurses and healthcare assistants. In particular the need for qualified nurses to maintain records can create a gap between them, and between nurses and patients. Newly qualified nurses require more assistance in managing their own time and developing successful working relationships with healthcare assistants. Copyright © 2014 Elsevier Ltd. All rights reserved.
DEVELOPMENT OF AG-1 SECTION FI ON METAL MEDIA FILTERS - 9061
DOE Office of Scientific and Technical Information (OSTI.GOV)
Adamson, D; Charles A. Waggoner, C
Development of a metal media standard (FI) for ASME AG-1 (Code on Nuclear Air and Gas Treatment) has been under way for almost ten years. This paper will provide a brief history of the development process of this section and a detailed overview of its current content/status. There have been at least two points when dramatic changes have been made in the scope of the document due to feedback from the full Committee on Nuclear Air and Gas Treatment (CONAGT). Development of the proposed section has required resolving several difficult issues associated with scope; namely, filtering efficiency, operating conditions (media velocity, pressure drop, etc.), qualification testing, and quality control/acceptance testing. A proposed version of Section FI is currently undergoing final revisions prior to being submitted for balloting. The section covers metal media filters of filtering efficiencies ranging from medium (less than 99.97%) to high (99.97% and greater). Two different types of high efficiency filters are addressed: those units intended to be a direct replacement of Section FC fibrous glass HEPA filters and those that will be placed into newly designed systems capable of supporting greater static pressures and differential pressures across the filter elements. Direct replacements of FC HEPA filters in existing systems will be required to meet equivalent qualification and testing requirements to those contained in Section FC. A series of qualification and quality assurance test methods have been identified for the range of filtering efficiencies covered by this proposed standard. Performance characteristics of sintered metal powder vs. sintered metal fiber media are dramatically different with respect to parameters like differential pressures and rigidity of the media. Wide latitude will be allowed for owner specification of performance criteria for filtration units that will be placed into newly designed systems. Such allowances will permit use of the most appropriate metal media for a system as specified by the owner with respect to material of manufacture, media velocity, system maximum static pressure, maximum differential pressure across the filter, and similar parameters.
CFD Code Development for Combustor Flows
NASA Technical Reports Server (NTRS)
Norris, Andrew
2003-01-01
During the lifetime of this grant, work has been performed in the areas of model development, code development, code validation and code application. For model development, this has included the PDF combustion module, chemical kinetics based on thermodynamics, neural network storage of chemical kinetics, ILDM chemical kinetics and assumed PDF work. Many of these models were then implemented in the code, and in addition many improvements were made to the code, including the addition of new chemistry integrators, property evaluation schemes, new chemistry models and turbulence-chemistry interaction methodology. Validation of all new models and code improvements were also performed, while application of the code to the ZCET program and also the NPSS GEW combustor program were also performed. Several important items remain under development, including the NOx post processing, assumed PDF model development and chemical kinetic development. It is expected that this work will continue under the new grant.
Computer codes developed and under development at Lewis
NASA Technical Reports Server (NTRS)
Chamis, Christos C.
1992-01-01
The objective of this summary is to provide a brief description of: (1) codes developed or under development at LeRC; and (2) the development status of IPACS with some typical early results. The computer codes that have been developed and/or are under development at LeRC are listed in the accompanying charts. This list includes: (1) the code acronym; (2) select physics descriptors; (3) current enhancements; and (4) present (9/91) code status with respect to its availability and documentation. The computer codes list is grouped by related functions such as: (1) composite mechanics; (2) composite structures; (3) integrated and 3-D analysis; (4) structural tailoring; and (5) probabilistic structural analysis. These codes provide a broad computational simulation infrastructure (technology base-readiness) for assessing the structural integrity/durability/reliability of propulsion systems. These codes serve two other very important functions: they provide an effective means of technology transfer; and they constitute a depository of corporate memory.
Novel mutations of TCOF1 gene in European patients with Treacher Collins syndrome.
Conte, Chiara; D'Apice, Maria Rosaria; Rinaldi, Fabrizio; Gambardella, Stefano; Sangiuolo, Federica; Novelli, Giuseppe
2011-09-27
Treacher Collins syndrome (TCS) is one of the most severe autosomal dominant congenital disorders of craniofacial development and shows variable phenotypic expression. TCS is extremely rare, occurring with an incidence of 1 in 50,000 live births. The TCS distinguishing characteristics are represented by down-slanting palpebral fissures, coloboma of the eyelid, micrognathia, microtia and other deformities of the ears, hypoplastic zygomatic arches, and macrostomia. Conductive hearing loss and cleft palate are often present. TCS results from mutations in the TCOF1 gene located on chromosome 5, which encodes a serine/alanine-rich nucleolar phospho-protein called Treacle. However, alterations in the TCOF1 gene have been implicated in only 81-93% of TCS cases. In this study, the entire coding regions of the TCOF1 gene, including the newly described exons 6A and 16A, were sequenced in 46 unrelated subjects with a clinical suspicion of TCS. Fifteen mutations were reported, including twelve novel and three already described, in 14 sporadic patients and in 3 familial cases. Moreover, seven novel polymorphisms were also described. Most of the mutations characterised were microdeletions spanning one or more nucleotides, in addition to an insertion of one nucleotide in exon 18 and a stop mutation. The deletions and the insertion described cause a premature termination of translation, resulting in a truncated protein. This study confirms that almost all the TCOF1 pathogenic mutations fall in the coding region and lead to an aberrant protein.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gowardhan, Akshay; Neuscamman, Stephanie; Donetti, John
Aeolus is an efficient three-dimensional computational fluid dynamics code based on the finite volume method, developed for predicting the transport and dispersion of contaminants in a complex urban area. It solves the time-dependent incompressible Navier-Stokes equations on a regular Cartesian staggered grid using a fractional step method. It also solves a scalar transport equation for temperature, using the Boussinesq approximation for buoyancy. The model also includes a Lagrangian dispersion model for predicting the transport and dispersion of atmospheric contaminants. The model can be run in an efficient Reynolds-Averaged Navier-Stokes (RANS) mode with a run time of several minutes, or a more detailed Large Eddy Simulation (LES) mode with a run time of hours for a typical simulation. This report describes the model components, including details on the physics models used in the code, as well as several model validation efforts. Aeolus wind and dispersion predictions are compared to field data from the Joint Urban Field Trials 2003 conducted in Oklahoma City (Allwine et al 2004), including both continuous and instantaneous releases. Newly implemented Aeolus capabilities include a decay chain model and an explosive Radiological Dispersal Device (RDD) source term; these capabilities are described. Aeolus predictions using the buoyant explosive RDD source are validated against two experimental data sets: the Green Field explosive cloud rise experiments conducted in Israel (Sharon et al 2012) and the Full-Scale RDD Field Trials conducted in Canada (Green et al 2016).
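The decay chain capability mentioned above can be illustrated with a standard Bateman-type calculation; the three-member chain and decay constants below are hypothetical, and the matrix-exponential solution is a generic approach rather than necessarily how Aeolus implements it.

```python
import numpy as np
from scipy.linalg import expm

# Hypothetical three-member chain A -> B -> C (C stable); decay constants in 1/s
lam = np.array([1e-3, 5e-4, 0.0])
A = np.array([[-lam[0],     0.0,      0.0],
              [ lam[0], -lam[1],      0.0],
              [    0.0,  lam[1], -lam[2]]])

N0 = np.array([1.0e12, 0.0, 0.0])   # initial inventories (atoms)
t = 3600.0                          # elapsed time: one hour
N_t = expm(A * t) @ N0              # Bateman solution via the matrix exponential
print(N_t)                          # inventories of A, B and C after one hour
```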
Nantón, Ana; Ruiz-Ruano, Francisco J.; Camacho, Juan Pedro M.; Méndez, Josefina
2017-01-01
Background: Four species of the genus Donax (D. semistriatus, D. trunculus, D. variegatus and D. vittatus) are common on Iberian Peninsula coasts. Nevertheless, despite their economic importance and overexploitation, scarce genetic resources are available. In this work, we newly determined the complete mitochondrial genomes of these four representatives of the family Donacidae, with the aim of contributing to unveil phylogenetic relationships within the Veneroida order, and of developing genetic markers useful in wedge clam identification and authentication, and aquaculture stock management. Principal findings: The complete female mitochondrial genomes of the four species vary in size from 17,044 to 17,365 bp, and encode 13 protein-coding genes (including the atp8 gene), 2 rRNAs and 22 tRNAs, all located on the same strand. A long non-coding region was identified in each of the four Donax species between the cob and cox2 genes, presumably corresponding to the Control Region. The Bayesian and Maximum Likelihood phylogenetic analyses of the Veneroida order indicate that all four species of Donax form a single clade as a sister group of other bivalves within the Tellinoidea superfamily. However, although Tellinoidea is actually monophyletic, none of its families are monophyletic. Conclusions: Sequencing of complete mitochondrial genomes provides highly valuable information to establish the phylogenetic relationships within the Veneroida order. Furthermore, we provide here significant genetic resources for further research and conservation of this commercially important fishing resource. PMID:28886105
Suzuki, Y; Kambara, H; Kadota, K; Tamaki, S; Yamazato, A; Nohara, R; Osakada, G; Kawai, C
1985-08-01
To evaluate the noninvasive detection of shunt flow using a newly developed real-time 2-dimensional color-coded Doppler flow imaging system (D-2DE), 20 patients were examined, including 10 with secundum atrial septal defect (ASD) and 10 control subjects. These results were compared with contrast 2-dimensional echocardiography (C-2DE). Doppler 2DE displayed the blood flow toward the transducer as red and the blood flow away from the transducer as blue in 8 shades, each shade adding green according to the degree of variance in Doppler frequency. In the patients with ASD, D-2DE clearly visualized left-to-right shunt flow in 7 of 10 patients. In 5 of these 7 patients, C-2DE showed a negative contrast effect in the same area of the right atrium. Thus, D-2DE increased the sensitivity over C-2DE for detecting left-to-right shunt flow (from 50% to 70%). However, the specificity was slightly less in D-2DE (90%) than C-2DE (100%). Doppler 2DE could not visualize right-to-left shunt flow in all patients with ASD, though C-2DE showed a positive contrast effect in the left-sided heart in 9 of 10 patients with ASD. Thus, D-2DE is clinically useful for detecting left-to-right shunt flow in patients with ASD.
How Dental Team Members describe Adverse Events
Maramaldi, Peter; Walji, Muhammad F.; White, Joel; Etoulu, Jini; Kahn, Maria; Vaderhobli, Ram; Kwatra, Japneet; Delattre, Veronique F.; Hebballi, Nutan B.; Stewart, Denice; Kent, Karla; Yansane, Alfa; Ramoni, Rachel B.; Kalenderian, Elsbeth
2016-01-01
Background: There is increased recognition that patients suffer adverse events (AEs) or harm caused by treatments in dentistry, and little is known about how dental providers describe these events. Understanding how providers view AEs is essential to building a safer environment in dental practice. Methods: Dental providers and domain experts were interviewed through focus groups and in-depth interviews and asked to identify the types of AEs that may occur in dental settings. Results: The first-order listing of the interview and focus group findings yielded 1,514 items that included both causes and AEs. 632 causes were coded into one of the eight categories of the Eindhoven classification. 882 AEs were coded into 12 categories of a newly developed dental AE classification. Inter-rater reliability was moderate among coders. The list was reanalyzed and duplicate items were removed, leaving a total of 747 unique AEs and 540 causes. The most frequently identified AE types were “Aspiration/ingestion” at 14% (n=142), “Wrong-site, wrong-procedure, wrong-patient errors” at 13%, “Hard tissue damage” at 13%, and “Soft tissue damage” at 12%. Conclusions: Dental providers identified a large and diverse list of AEs. These events ranged from “death due to cardiac arrest” to “jaw fatigue from lengthy procedures”. Practical Implications: Identifying threats to patient safety is a key element of improving dental patient safety. An inventory of dental AEs underpins efforts to track, prevent, and mitigate these events. PMID:27269376
Jet-torus connection in radio galaxies. Relativistic hydrodynamics and synthetic emission
NASA Astrophysics Data System (ADS)
Fromm, C. M.; Perucho, M.; Porth, O.; Younsi, Z.; Ros, E.; Mizuno, Y.; Zensus, J. A.; Rezzolla, L.
2018-01-01
Context. High resolution very long baseline interferometry observations of active galactic nuclei have revealed asymmetric structures in the jets of radio galaxies. These asymmetric structures may be due to internal asymmetries in the jets or they may be induced by the different conditions in the surrounding ambient medium, including the obscuring torus, or a combination of the two. Aims: In this paper we investigate the influence of the ambient medium, including the obscuring torus, on the observed properties of jets from radio galaxies. Methods: We performed special-relativistic hydrodynamic (SRHD) simulations of over-pressured and pressure-matched jets using the special-relativistic hydrodynamics code Ratpenat, which is based on a second-order accurate finite-volume method and an approximate Riemann solver. Using a newly developed radiative transfer code to compute the electromagnetic radiation, we modelled several jets embedded in various ambient medium and torus configurations and subsequently computed the non-thermal emission produced by the jet and thermal absorption from the torus. To better compare the emission simulations with observations we produced synthetic radio maps, taking into account the properties of the observatory. Results: The detailed analysis of our simulations shows that the observed properties such as core shift could be used to distinguish between over-pressured and pressure matched jets. In addition to the properties of the jets, insights into the extent and density of the obscuring torus can be obtained from analyses of the single-dish spectrum and spectral index maps.
A Standard System to Study Vertebrate Embryos
Werneburg, Ingmar
2009-01-01
Staged embryonic series are important as reference for different kinds of biological studies. I summarise problems that occur when using ‘staging tables’ of ‘model organisms’. Investigations of developmental processes in a broad scope of taxa are becoming commonplace. Beginning in the 1990s, methods were developed to quantify and analyse developmental events in a phylogenetic framework. The algorithms associated with these methods are still under development, mainly due to difficulties of using non-independent characters. Nevertheless, the principle of comparing clearly defined newly occurring morphological features in development (events) in quantifying analyses was a key innovation for comparative embryonic research. Up to date no standard was set for how to define such events in a comparative approach. As a case study I compared the external development of 23 land vertebrate species with a focus on turtles, mainly based on reference staging tables. I excluded all the characters that are only identical for a particular species or general features that were only analysed in a few species. Based on these comparisons I defined 104 developmental characters that are common either for all vertebrates (61 characters), gnathostomes (26), tetrapods (3), amniotes (7), or only for sauropsids (7). Characters concern the neural tube, somite, ear, eye, limb, maxillary and mandibular process, pharyngeal arch, eyelid or carapace development. I present an illustrated guide listing all the defined events. This guide can be used for describing developmental series of any vertebrate species or for documenting specimen variability of a particular species. The guide incorporates drawings and photographs as well as consideration of species identifying developmental features such as colouration. The simple character-code of the guide is extendable to further characters pertaining to external and internal morphological, physiological, genetic or molecular development, and also for other vertebrate groups not examined here, such as Chondrichthyes or Actinopterygii. An online database to type in developmental events for different stages and species could be a basis for further studies in comparative embryology. By documenting developmental events with the standard code, sequence heterochrony studies (i.e. Parsimov) and studies on variability can use this broad comparative data set. PMID:19521537
Utilization of recently developed codes for high power Brayton and Rankine cycle power systems
NASA Technical Reports Server (NTRS)
Doherty, Michael P.
1993-01-01
Two recently developed FORTRAN computer codes for high power Brayton and Rankine thermodynamic cycle analysis for space power applications are presented. The codes were written in support of an effort to develop a series of subsystem models for multimegawatt Nuclear Electric Propulsion, but their use is not limited just to nuclear heat sources or to electric propulsion. Code development background, a description of the codes, some sample input/output from one of the codes, and future plans/implications for the use of these codes by NASA's Lewis Research Center are provided.
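As a reminder of the thermodynamics such cycle codes encapsulate, the ideal air-standard Brayton efficiency depends only on the compressor pressure ratio; the following is a minimal sketch of that textbook relation, not a piece of the NASA codes themselves:

```python
def brayton_ideal_efficiency(pressure_ratio: float, gamma: float = 1.4) -> float:
    """Thermal efficiency of an ideal (air-standard) Brayton cycle:
    eta = 1 - r_p**(-(gamma - 1)/gamma)."""
    return 1.0 - pressure_ratio ** (-(gamma - 1.0) / gamma)

# Illustrative compressor pressure ratios for a closed-cycle space power turbine
for r_p in (2.0, 4.0, 8.0):
    print(f"r_p = {r_p:>4}: eta = {brayton_ideal_efficiency(r_p):.3f}")
```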
NASA Astrophysics Data System (ADS)
Yuan, F.; Wang, G.; Painter, S. L.; Tang, G.; Xu, X.; Kumar, J.; Bisht, G.; Hammond, G. E.; Mills, R. T.; Thornton, P. E.; Wullschleger, S. D.
2017-12-01
In the Arctic tundra ecosystem, soil freezing-thawing is one of the dominant physical processes through which biogeochemical (e.g., carbon and nitrogen) cycles are tightly coupled. Besides hydraulic transport, freezing-thawing can cause pore water movement and aqueous species gradients, which are additional mechanisms for soil nitrogen (N) reactive transport in the tundra ecosystem. In this study, we have fully coupled the aboveground processes of the Land Model (ALM) of an in-development ESM (i.e., the Advanced Climate Model for Energy, ACME) with a state-of-the-art massively parallel 3-D subsurface thermal-hydrology and reactive transport code, PFLOTRAN. The resulting coupled ALM-PFLOTRAN model is a Land Surface Model (LSM) capable of resolving 3-D soil thermal-hydrological-biogeochemical cycles. This specific version of PFLOTRAN incorporates the CLM-CN Converging Trophic Cascade (CTC) model and a simple but robust representation of the full soil N cycle, including NH4+ sorption-desorption and gas dissolution-degassing processes. It also implements thermal-hydrology mode codes with three newly modified freezing-thawing algorithms that greatly improve computing performance with regard to numerical stiffness near the freezing point. Here we tested the model in fully 3-D coupled mode at the Next Generation Ecosystem Experiment-Arctic (NGEE-Arctic) intensive field study site at the Barrow Environmental Observatory (BEO), AK. The simulations show that: (1) synchronous coupling of soil thermal-hydrology and biogeochemistry in 3-D can greatly impact ecosystem dynamics across the polygonal tundra landscape; and (2) freezing-thawing cycles can add more complexity to the system, resulting in greater mobility of soil N vertically and laterally, depending upon local micro-topography. As a preliminary experiment, the model was also implemented for the Pan-Arctic region in 1-D column mode (i.e. no lateral connection), showing significant differences compared to stand-alone ALM. The developed ALM-PFLOTRAN coupling codes embedded within the ESM will be used for Pan-Arctic regional evaluation of climate-change-induced ecosystem responses and their feedbacks to the climate system at various scales.
Development of Web Interfaces for Analysis Codes
NASA Astrophysics Data System (ADS)
Emoto, M.; Watanabe, T.; Funaba, H.; Murakami, S.; Nagayama, Y.; Kawahata, K.
Several codes have been developed to analyze plasma physics. However, most of them are developed to run on supercomputers. Therefore, users who typically use personal computers (PCs) find it difficult to use these codes. In order to facilitate the widespread use of these codes, a user-friendly interface is required. The authors propose Web interfaces for these codes. To demonstrate the usefulness of this approach, the authors developed Web interfaces for two analysis codes. One of them is for FIT developed by Murakami. This code is used to analyze the NBI heat deposition, etc. Because it requires electron density profiles, electron temperatures, and ion temperatures as polynomial expressions, those unfamiliar with the experiments find it difficult to use this code, especially visitors from other institutes. The second one is for visualizing the lines of force in the LHD (large helical device) developed by Watanabe. This code is used to analyze the interference caused by the lines of force resulting from the various structures installed in the vacuum vessel of the LHD. This code runs on PCs; however, it requires that the necessary parameters be edited manually. Using these Web interfaces, users can execute these codes interactively.
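The kind of thin web front end described here can be sketched as a small service that collects user parameters and launches the legacy analysis code; this is a generic illustration only, assuming a hypothetical command-line executable (fit_analysis) and parameter names, not the actual FIT or LHD interface:

```python
# Minimal sketch of a web front end for a command-line analysis code.
import subprocess
from flask import Flask, request, jsonify

app = Flask(__name__)

@app.route("/run", methods=["POST"])
def run_analysis():
    params = request.get_json()
    shot = str(params.get("shot_number", 0))
    # Launch the legacy analysis code with the user-supplied parameters.
    result = subprocess.run(
        ["fit_analysis", "--shot", shot],
        capture_output=True, text=True, timeout=300,
    )
    return jsonify({"stdout": result.stdout, "returncode": result.returncode})

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=8080)
```

A browser form posting JSON to /run would then let visitors execute the code interactively without editing input files by hand.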
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yang, X. H.; Fu, J. N.; Zha, Q., E-mail: jnfu@bnu.edu.cn
Time-series photometric observations were made for the SX Phoenicis star XX Cyg between 2007 and 2011 at the Xinglong Station of National Astronomical Observatories of China. With the light curves derived from the new observations, we do not detect any secondary maximum in the descending portion of the light curves of XX Cyg, as reported in some previous work. Frequency analysis of the light curves confirms a fundamental frequency f0 = 7.4148 cycles day^-1 and up to 19 harmonics, 11 of which are newly detected. However, no secondary mode of pulsation is detected from the light curves. The O-C diagram, produced from 46 newly determined times of maximum light combined with those derived from the literature, reveals a continuous period increase with the rate of (1/P)(dP/dt) = 1.19(13) × 10^-8 yr^-1. Theoretical rates of period change due to the stellar evolution were calculated with a modeling code. The result shows that the observed rate of period change is fully consistent with period change caused by evolutionary behavior predicted by standard theoretical models.
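As an illustration of how such a period-change rate follows from an O-C diagram, the sketch below fits a parabola to synthetic times of maximum light generated with a small, assumed period increase; the ephemeris and the input rate are placeholders, not the XX Cyg measurements:

```python
import numpy as np

# Hypothetical ephemeris: times of maximum generated with a small period
# increase, then fitted back, purely to illustrate the arithmetic.
P0 = 1.0 / 7.4148                      # fundamental period in days (from f0)
T0 = 2454000.0                         # reference time of maximum (BJD)
epochs = np.array([0, 5000, 12000, 20000, 30000, 41000], dtype=float)
dP_dE_true = 6.0e-13                   # assumed period change, days per cycle
t_max = T0 + P0 * epochs + 0.5 * dP_dE_true * epochs**2

oc = t_max - (T0 + P0 * epochs)        # observed minus calculated (days)
# T_max(E) = T0 + P*E + 0.5*(dP/dE)*E^2, so the quadratic coefficient
# of the O-C fit equals 0.5*dP/dE.
c2, c1, c0 = np.polyfit(epochs, oc, 2)
rate = (2.0 * c2 / P0**2) * 365.25     # (1/P)(dP/dt) in yr^-1
print(f"(1/P) dP/dt ~ {rate:.2e} per year")
```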
Kahl, Fabian; Frewer, Andreas
2017-04-01
Background: In 2015, the number of refugees seeking asylum in Germany increased dramatically. As a result, the medical care for these refugees faces huge challenges. The treatment of mental illness among refugees is a particularly difficult topic. The objective of this study is to record the outpatient drug prescriptions for newly arrived refugees in Erlangen, with a focus on psychotropic drugs. Methods: Evaluation of all outpatient drugs (n=1 137) prescribed between 10/01/2014 and 09/30/2015 for asylum seekers living in the refugee center in Erlangen, a branch of the "Central Admission Institution" ("ZAE") Zirndorf. The funding organization for this treatment is the City of Erlangen. Settlement documents of the City of Erlangen were used for the analysis. Results: The prescribed drugs cover the spectrum of acute primary care. A large share of the prescriptions consists of anti-infectives (ATC code: J), medication for the respiratory system (ATC: R), and non-steroidal anti-inflammatory drugs (NSAIDs: ibuprofen, paracetamol, metamizole). Psychotropic drugs are comparatively underrepresented among the prescriptions. © Georg Thieme Verlag KG Stuttgart · New York.
Kinetic turbulence simulations at extreme scale on leadership-class systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wang, Bei; Ethier, Stephane; Tang, William
2013-01-01
Reliable predictive simulation capability addressing confinement properties in magnetically confined fusion plasmas is critically-important for ITER, a 20 billion dollar international burning plasma device under construction in France. The complex study of kinetic turbulence, which can severely limit the energy confinement and impact the economic viability of fusion systems, requires simulations at extreme scale for such an unprecedented device size. Our newly optimized, global, ab initio particle-in-cell code solving the nonlinear equations underlying gyrokinetic theory achieves excellent performance with respect to "time to solution" at the full capacity of the IBM Blue Gene/Q on 786,432 cores of Mira at ALCF and recently of the 1,572,864 cores of Sequoia at LLNL. Recent multithreading and domain decomposition optimizations in the new GTC-P code represent critically important software advances for modern, low memory per core systems by enabling routine simulations at unprecedented size (130 million grid points, ITER-scale) and resolution (65 billion particles).
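For flavour, the toy below performs the 1-D cloud-in-cell charge deposition that sits at the heart of particle-in-cell methods; it is an illustrative kernel only and bears no relation to the actual GTC-P implementation, its gyrokinetic field solve, or its multithreaded domain decomposition:

```python
import numpy as np

def deposit_charge(x, q, n_grid, dx):
    """1-D cloud-in-cell (CIC) deposition of particle charges onto a periodic grid."""
    rho = np.zeros(n_grid)
    cell = np.floor(x / dx).astype(int) % n_grid      # left grid point of each particle
    frac = x / dx - np.floor(x / dx)                  # fractional distance into the cell
    np.add.at(rho, cell, q * (1.0 - frac))            # weight to the left node
    np.add.at(rho, (cell + 1) % n_grid, q * frac)     # weight to the right node
    return rho / dx                                   # convert charge to charge density

n_grid, length = 64, 1.0
dx = length / n_grid
rng = np.random.default_rng(0)
x = rng.uniform(0.0, length, 100_000)                 # particle positions
q = np.full(x.size, 1.0 / x.size)                     # equal charges summing to 1
rho = deposit_charge(x, q, n_grid, dx)
print("mean density:", rho.mean())                    # ~1 for a uniform particle load
```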
NASA Technical Reports Server (NTRS)
Miki, Kenji; Moder, Jeff; Liou, Meng-Sing
2016-01-01
In this paper, we present the recent enhancement of the Open National Combustion Code (OpenNCC) and apply it to model a realistic combustor configuration (Energy Efficient Engine (E3)). First, we perform a series of validation tests for the newly-implemented advection upstream splitting method (AUSM) and the extended version of the AUSM-family schemes (AUSM+-up). The results show good agreement with the analytical/experimental data for these validation tests. In steady-state E3 cold-flow results using the Reynolds-averaged Navier-Stokes (RANS) equations, we find a noticeable difference in the flow fields calculated by the two different numerical schemes, the standard Jameson-Schmidt-Turkel (JST) scheme and the AUSM scheme. The main differences are that the AUSM scheme is less numerically dissipative and predicts much stronger reverse flow in the recirculation zone. This study indicates that the two schemes could yield different flame-holding predictions and overall flame structures.
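For reference, the sketch below implements the original Liou-Steffen AUSM flux for the 1-D Euler equations; the AUSM+-up variant used in OpenNCC adds low-Mach-number pressure and velocity diffusion terms omitted here, so this is a simplified illustration rather than the OpenNCC implementation:

```python
import numpy as np

GAMMA = 1.4

def ausm_flux(left, right):
    """First-order AUSM flux for the 1-D Euler equations.
    'left'/'right' are (rho, u, p) primitive states at an interface."""
    def split(rho, u, p):
        a = np.sqrt(GAMMA * p / rho)
        H = a * a / (GAMMA - 1.0) + 0.5 * u * u      # specific total enthalpy
        M = u / a
        if abs(M) <= 1.0:                            # subsonic polynomial splitting
            M_plus = 0.25 * (M + 1.0) ** 2
            M_minus = -0.25 * (M - 1.0) ** 2
            p_plus = 0.25 * p * (M + 1.0) ** 2 * (2.0 - M)
            p_minus = 0.25 * p * (M - 1.0) ** 2 * (2.0 + M)
        else:                                        # supersonic one-sided splitting
            M_plus = 0.5 * (M + abs(M))
            M_minus = 0.5 * (M - abs(M))
            p_plus = 0.5 * p * (M + abs(M)) / M
            p_minus = 0.5 * p * (M - abs(M)) / M
        phi = np.array([rho * a, rho * a * u, rho * a * H])
        return M_plus, M_minus, p_plus, p_minus, phi

    MpL, _, ppL, _, phiL = split(*left)
    _, MmR, _, pmR, phiR = split(*right)
    m_half = MpL + MmR                               # interface Mach number
    p_half = ppL + pmR                               # interface pressure
    upwind = phiL if m_half >= 0.0 else phiR         # upwind the convected quantities
    return m_half * upwind + np.array([0.0, p_half, 0.0])

# Sod-like interface: left and right (rho, u, p) states
print(ausm_flux((1.0, 0.0, 1.0), (0.125, 0.0, 0.1)))
```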
Wang, Shuo; Gao, Li-Zhi
2016-09-01
The complete chloroplast genome of green foxtail (Setaria viridis), a promising model system for C4 photosynthesis, is reported for the first time in this study. The genome harbors a large single copy (LSC) region of 81 016 bp and a small single copy (SSC) region of 12 456 bp separated by a pair of inverted repeat (IRa and IRb) regions of 22 315 bp. GC content is 38.92%. The proportion of coding sequence is 57.97%, comprising 111 unique genes (19 duplicated in the IR regions), of which 71 are protein-coding genes, four are rRNA genes, and 36 are tRNA genes. Phylogenetic analysis indicated that S. viridis clustered with its cultivated relative S. italica in the tribe Paniceae of the family Poaceae. This newly determined chloroplast genome will provide valuable genetic resources to assist future studies on C4 photosynthesis in grasses.
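The summary statistics quoted above (GC content, coding proportion) are straightforward to reproduce for any assembled plastome; the toy below uses a placeholder sequence and hypothetical CDS coordinates rather than the S. viridis data:

```python
# Toy calculation of GC content and coding proportion for a plastome.
genome = "ATGGCGTACGATCGGATCCTAGGCATGCGTAA" * 100   # placeholder sequence
coding_intervals = [(0, 960), (1200, 2100)]          # hypothetical CDS ranges (bp)

gc = (genome.count("G") + genome.count("C")) / len(genome) * 100
coding_bp = sum(end - start for start, end in coding_intervals)
print(f"GC content: {gc:.2f}%")
print(f"coding proportion: {coding_bp / len(genome) * 100:.2f}%")
```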
Complete mitochondrial genome of yellow meal worm (Tenebrio molitor)
LIU, Li-Na; WANG, Cheng-Ye
2014-01-01
The yellow meal worm (Tenebrio molitor L.) is an important resource insect typically used as an animal feed additive. It is also widely used for biological research. The complete mitochondrial genome of T. molitor was determined for the first time by long PCR and conserved primer walking approaches. The results showed that the entire mitogenome of T. molitor was 15 785 bp long, with 72.35% A+T content [deposited in GenBank with accession number KF418153]. The gene order and orientation were the same as the most common type suggested as ancestral for insects. Two protein-coding genes used atypical start codons (CTA in ND2 and AAT in COX1), and the remaining 11 protein-coding genes started with a typical insect initiation codon ATN. All tRNAs showed the standard clover-leaf structure, except for tRNA-Ser(AGN), which lacked a dihydrouridine (DHU) arm. The newly added T. molitor mitogenome could provide information for future studies on the yellow meal worm. PMID:25465087
Complete mitochondrial genome of yellow meal worm (Tenebrio molitor).
Liu, Li-Na; Wang, Cheng-Ye
2014-11-18
The yellow meal worm (Tenebrio molitor L.) is an important resource insect typically used as an animal feed additive. It is also widely used for biological research. The complete mitochondrial genome of T. molitor was determined for the first time by long PCR and conserved primer walking approaches. The results showed that the entire mitogenome of T. molitor was 15 785 bp long, with 72.35% A+T content [deposited in GenBank with accession number KF418153]. The gene order and orientation were the same as the most common type suggested as ancestral for insects. Two protein-coding genes used atypical start codons (CTA in ND2 and AAT in COX1), and the remaining 11 protein-coding genes started with a typical insect initiation codon ATN. All tRNAs showed the standard clover-leaf structure, except for tRNA(Ser) (AGN), which lacked a dihydrouridine (DHU) arm. The newly added T. molitor mitogenome could provide information for future studies on the yellow meal worm.
The Proteus Navier-Stokes code
NASA Technical Reports Server (NTRS)
Towne, Charles E.; Bui, Trong T.; Cavicchi, Richard H.; Conley, Julianne M.; Molls, Frank B.; Schwab, John R.
1992-01-01
An effort is currently underway at NASA Lewis to develop two- and three-dimensional Navier-Stokes codes, called Proteus, for aerospace propulsion applications. The emphasis in the development of Proteus is not algorithm development or research on numerical methods, but rather the development of the code itself. The objective is to develop codes that are user-oriented, easily-modified, and well-documented. Well-proven, state-of-the-art solution algorithms are being used. Code readability, documentation (both internal and external), and validation are being emphasized. This paper is a status report on the Proteus development effort. The analysis and solution procedure are described briefly, and the various features in the code are summarized. The results from some of the validation cases that have been run are presented for both the two- and three-dimensional codes.
Developing and Modifying Behavioral Coding Schemes in Pediatric Psychology: A Practical Guide
McMurtry, C. Meghan; Chambers, Christine T.; Bakeman, Roger
2015-01-01
Objectives To provide a concise and practical guide to the development, modification, and use of behavioral coding schemes for observational data in pediatric psychology. Methods This article provides a review of relevant literature and experience in developing and refining behavioral coding schemes. Results A step-by-step guide to developing and/or modifying behavioral coding schemes is provided. Major steps include refining a research question, developing or refining the coding manual, piloting and refining the coding manual, and implementing the coding scheme. Major tasks within each step are discussed, and pediatric psychology examples are provided throughout. Conclusions Behavioral coding can be a complex and time-intensive process, but the approach is invaluable in allowing researchers to address clinically relevant research questions in ways that would not otherwise be possible. PMID:25416837
REM sleep selectively prunes and maintains new synapses in development and learning
Li, Wei; Ma, Lei; Yang, Guang; Gan, Wenbiao
2017-01-01
The functions and underlying mechanisms of rapid eye movement (REM) sleep remain unclear. Here we show that REM sleep prunes newly-formed postsynaptic dendritic spines of layer 5 pyramidal neurons in the mouse motor cortex during development and motor learning. This REM sleep-dependent elimination of new spines facilitates subsequent spine formation in development and when a new motor task is learned, indicating a role of REM sleep in pruning to balance the number of new spines formed over time. In addition, REM sleep also strengthens and maintains some newly-formed spines that are critical for neuronal circuit development and behavioral improvement after learning. We further show that dendritic calcium spikes arising during REM sleep are important for pruning and strengthening of new spines. Together, these findings indicate that REM sleep has multifaceted functions in brain development, learning, and memory consolidation by selectively eliminating and maintaining newly-formed synapses via dendritic calcium spike-dependent mechanisms. PMID:28092659
Comparative Genetic Analyses of Human Rhinovirus C (HRV-C) Complete Genome from Malaysia.
Khaw, Yam Sim; Chan, Yoke Fun; Jafar, Faizatul Lela; Othman, Norlijah; Chee, Hui Yee
2016-01-01
Human rhinovirus-C (HRV-C) has been implicated in more severe illnesses than HRV-A and HRV-B; however, the limited number of HRV-C complete genomes (complete 5' and 3' non-coding region and open reading frame sequences) has hindered the in-depth genetic study of this virus. This study aimed to sequence seven complete HRV-C genomes from Malaysia and compare their genetic characteristics with the 18 published HRV-Cs. Seven Malaysian HRV-C complete genomes were obtained with newly redesigned primers. The seven genomes were classified as HRV-C6, C12, C22, C23, C26, C42, and pat16 based on the VP4/VP2 and VP1 pairwise distance threshold classification. Five of the seven Malaysian isolates, namely, 3430-MY-10/C22, 8713-MY-10/C23, 8097-MY-11/C26, 1570-MY-10/C42, and 7383-MY-10/pat16, are the first newly sequenced complete HRV-C genomes. All seven Malaysian isolate genomes displayed nucleotide similarity of 63-81% among themselves and 63-96% with other HRV-Cs. Malaysian HRV-Cs had similar putative immunogenic sites, putative receptor utilization and potential antiviral sites as other HRV-Cs. The genomic features of the Malaysian isolates were similar to those of other HRV-Cs. Negative selection was frequently detected in HRV-C complete coding sequences, indicating that these sequences were under functional constraint. The present study showed that HRV-Cs from Malaysia have diverse genetic sequences but share conserved genomic features with other HRV-Cs. This genetic information could provide further aid in the understanding of HRV-C infection.
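The VP4/VP2 and VP1 typing step relies on pairwise distance thresholds; a minimal sketch of that comparison is shown below, with placeholder sequences and an illustrative cutoff rather than the study's alignments or the published threshold values:

```python
def p_distance(seq1: str, seq2: str) -> float:
    """Uncorrected p-distance over an aligned region, ignoring gap positions."""
    pairs = [(a, b) for a, b in zip(seq1, seq2) if a != "-" and b != "-"]
    diffs = sum(1 for a, b in pairs if a != b)
    return diffs / len(pairs)

# Placeholder aligned fragments (e.g. VP4/VP2 region) and an illustrative cutoff
query = "ATGGGTGCACAAGTATCAAGACAG"
reference_c6 = "ATGGGTGCTCAAGTTTCAAGACAG"
threshold = 0.13                      # hypothetical type-assignment cutoff

d = p_distance(query, reference_c6)
print(f"p-distance = {d:.3f}",
      "-> same type" if d < threshold else "-> distinct type")
```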
Ayieko, James; Ti, Angeline; Hagey, Jill; Akama, Eliud; Bukusi, Elizabeth A; Cohen, Craig R; Patel, Rena C
2017-08-08
Factors influencing fertility desires among HIV-infected individuals remain poorly understood. With new recommendations for universal HIV treatment and increasing antiretroviral therapy (ART) access, we sought to evaluate how access to early ART influences fertility desires among HIV-infected ART-naïve women. Semi-structured in-depth interviews were conducted with a select subgroup of 20 HIV-infected ART-naïve women attending one of 13 HIV facilities in western Kenya between July and August 2014 who would soon become newly eligible to initiate ART based on the latest national policy recommendations. The interviews covered four major themes: 1) definitions of family and children's role in community; 2) personal, interpersonal, institutional, and societal factors influencing fertility desires; 3) influence of HIV-positive status on fertility desires; and 4) influence of future ART initiation on fertility desires. An iterative process of reading transcripts, applying inductive codes, and comparing and contrasting codes was used to identify convergent and divergent themes. The women indicated that their HIV-positive status did influence their fertility desires, largely negatively. Furthermore, initiating ART and anticipating improved health status did not necessarily translate to increased fertility desires. Instead, individual factors, such as age, parity, current health status, financial resources and number of surviving or HIV-infected children, played a crucial role in decisions about future fertility. In addition, societal influences, such as community norms and health providers' expectations of their fertility desires, played an equally important role in determining fertility desires. Initiating ART may not be the leading factor influencing fertility desires among previously ART-naïve HIV-infected women. Instead, individual and societal factors appear to be the major determinants of fertility desires among these women.
Bays, Alison M.; Engelberg, Ruth A.; Back, Anthony L.; Ford, Dee W.; Downey, Lois; Shannon, Sarah E.; Doorenbos, Ardith Z.; Edlund, Barbara; Christianson, Phyllis; Arnold, Richard W.; O'Connor, Kim; Kross, Erin K.; Reinke, Lynn F.; Cecere Feemster, Laura; Fryer-Edwards, Kelly; Alexander, Stewart C.; Tulsky, James A.
2014-01-01
Abstract Background: Communication with patients and families is an essential component of high-quality care in serious illness. Small-group skills training can result in new communication behaviors, but past studies have used facilitators with extensive experience, raising concerns this is not scalable. Objective: The objective was to investigate the effect of an experiential communication skills building workshop (Codetalk), led by newly trained facilitators, on internal medicine trainees' and nurse practitioner students' ability to communicate bad news and express empathy. Design: Trainees participated in Codetalk; skill improvement was evaluated through pre- and post-intervention standardized patient (SP) encounters. Setting and subjects: The subjects were internal medicine residents and nurse practitioner students at two universities. Intervention and measurements: The study was carried out in five to eight half-day sessions over a month. The first and last sessions included audiotaped trainee SP encounters coded for effective communication behaviors. The primary outcome was change in communication scores from pre-intervention to post-intervention. We also measured trainee characteristics to identify predictors of performance and change in performance over time. Results: We enrolled 145 trainees who completed pre- and post-intervention SP interviews, with participation rates of 52% for physicians and 14% for nurse practitioners. Trainees' scores improved in 8 of 11 coded behaviors (p<0.05). The only significant predictors of performance were having participated in the intervention (p<0.001) and study site (p<0.003). The only predictor of improvement in performance over time was participating in the intervention (p<0.001). Conclusions: A communication skills intervention using newly trained facilitators was associated with improvement in trainees' skills in giving bad news and expressing empathy. Improvement in communication skills did not vary by trainee characteristics. PMID:24180700
Phylogeny of Anophelinae using mitochondrial protein coding genes
de Oliveira, Tatiane Marques Porangaba; Bergo, Eduardo S.; Conn, Jan E.; Sant’Ana, Denise Cristina; Nagaki, Sandra Sayuri; Nihei, Silvio; Lamas, Carlos Einicker; González, Christian; Moreira, Caio Cesar; Sallum, Maria Anice Mureb
2017-01-01
Malaria is a vector-borne disease that is a great burden on the poorest and most marginalized communities of the tropical and subtropical world. Approximately 41 species of Anopheline mosquitoes can effectively spread species of Plasmodium parasites that cause human malaria. Proposing a natural classification for the subfamily Anophelinae has been a continuous effort, addressed using both morphology and DNA sequence data. The monophyly of the genus Anopheles, and phylogenetic placement of the genus Bironella, subgenera Kerteszia, Lophopodomyia and Stethomyia within the subfamily Anophelinae, remain in question. To understand the classification of Anophelinae, we inferred the phylogeny of all three genera (Anopheles, Bironella, Chagasia) and major subgenera by analysing the amino acid sequences of the 13 protein coding genes of 150 newly sequenced mitochondrial genomes of Anophelinae and 18 newly sequenced Culex species as outgroup taxa, supplemented with 23 mitogenomes from GenBank. Our analyses generally place genus Bironella within the genus Anopheles, which implies that the latter as it is currently defined is not monophyletic. With some inconsistencies, Bironella was placed within the major clade that includes Anopheles, Cellia, Kerteszia, Lophopodomyia, Nyssorhynchus and Stethomyia, which were found to be monophyletic groups within Anophelinae. Our findings provided robust evidence for elevating the monophyletic groupings Kerteszia, Lophopodomyia, Nyssorhynchus and Stethomyia to genus level; genus Anopheles to include subgenera Anopheles, Baimaia, Cellia and Christya; Anopheles parvus to be placed into a new genus; Nyssorhynchus to be elevated to genus level; the genus Nyssorhynchus to include subgenera Myzorhynchella and Nyssorhynchus; Anopheles atacamensis and Anopheles pictipennis to be transferred from subgenus Nyssorhynchus to subgenus Myzorhynchella; and subgenus Nyssorhynchus to encompass the remaining species of Argyritarsis and Albimanus Sections. PMID:29291068
Yao, Zhicheng; Xiong, Zhiyong; Li, Ruixi; Liang, Hao; Jia, Changchang; Deng, Meihai
2018-05-14
Dysregulation of long non-coding RNAs is a newly identified mechanism for tumour progression. Previous studies have suggested that the nuclear factor of activated T cells (NFAT) gene plays a very important role in cancer growth and metastasis. However, lncNRON is a newly identified repressor of NFAT, and its function is largely unknown, especially in hepatocellular carcinoma (HCC). Therefore, the expression levels of lncNRON in 215 pairs of HCC tissue were evaluated by qRT-PCR, and its relationship to clinicopathological parameters, recurrence, and survival was analysed. Furthermore, cell lines stably overexpressing lncNRON were constructed and their phenotypes evaluated. Finally, we detected epithelial-to-mesenchymal transition (EMT) proteins to determine the underlying mechanism involved in lncNRON function. We observed that lncNRON was downregulated in HCC tumour tissues; low lncNRON expression was associated with poor tumour differentiation and the presence of vascular tumour thrombus, which tended to result in poor clinical outcomes, as demonstrated by the recurrence rate and survival curves. Functional analysis showed that lncNRON overexpression impaired colony formation and cell viability and inhibited cell migration and invasion. A study using tumour-bearing mice showed that lncNRON markedly limited tumour growth and lung metastasis in vivo. Importantly, western blot analysis revealed that the expression of the EMT-related epithelial marker, E-cadherin, increased, whereas the expression of the mesenchymal markers N-cadherin, snail, and vimentin was attenuated by lncNRON overexpression in HCC cells. Therefore, lower lncNRON expression indicates a poorer clinical outcome in HCC. LncNRON overexpression can suppress HCC growth and metastasis by inhibiting the EMT, and lncNRON may function as a new HCC prognostic marker. Copyright © 2018 Elsevier Masson SAS. All rights reserved.
Bays, Alison M; Engelberg, Ruth A; Back, Anthony L; Ford, Dee W; Downey, Lois; Shannon, Sarah E; Doorenbos, Ardith Z; Edlund, Barbara; Christianson, Phyllis; Arnold, Richard W; O'Connor, Kim; Kross, Erin K; Reinke, Lynn F; Cecere Feemster, Laura; Fryer-Edwards, Kelly; Alexander, Stewart C; Tulsky, James A; Curtis, J Randall
2014-02-01
Communication with patients and families is an essential component of high-quality care in serious illness. Small-group skills training can result in new communication behaviors, but past studies have used facilitators with extensive experience, raising concerns this is not scalable. The objective was to investigate the effect of an experiential communication skills building workshop (Codetalk), led by newly trained facilitators, on internal medicine trainees' and nurse practitioner students' ability to communicate bad news and express empathy. Trainees participated in Codetalk; skill improvement was evaluated through pre- and post-intervention standardized patient (SP) encounters. The subjects were internal medicine residents and nurse practitioner students at two universities. The study was carried out in five to eight half-day sessions over a month. The first and last sessions included audiotaped trainee SP encounters coded for effective communication behaviors. The primary outcome was change in communication scores from pre-intervention to post-intervention. We also measured trainee characteristics to identify predictors of performance and change in performance over time. We enrolled 145 trainees who completed pre- and post-intervention SP interviews, with participation rates of 52% for physicians and 14% for nurse practitioners. Trainees' scores improved in 8 of 11 coded behaviors (p<0.05). The only significant predictors of performance were having participated in the intervention (p<0.001) and study site (p<0.003). The only predictor of improvement in performance over time was participating in the intervention (p<0.001). A communication skills intervention using newly trained facilitators was associated with improvement in trainees' skills in giving bad news and expressing empathy. Improvement in communication skills did not vary by trainee characteristics.
Comparative Genetic Analyses of Human Rhinovirus C (HRV-C) Complete Genome from Malaysia
Khaw, Yam Sim; Chan, Yoke Fun; Jafar, Faizatul Lela; Othman, Norlijah; Chee, Hui Yee
2016-01-01
Human rhinovirus-C (HRV-C) has been implicated in more severe illnesses than HRV-A and HRV-B; however, the limited number of HRV-C complete genomes (complete 5′ and 3′ non-coding region and open reading frame sequences) has hindered the in-depth genetic study of this virus. This study aimed to sequence seven complete HRV-C genomes from Malaysia and compare their genetic characteristics with the 18 published HRV-Cs. Seven Malaysian HRV-C complete genomes were obtained with newly redesigned primers. The seven genomes were classified as HRV-C6, C12, C22, C23, C26, C42, and pat16 based on the VP4/VP2 and VP1 pairwise distance threshold classification. Five of the seven Malaysian isolates, namely, 3430-MY-10/C22, 8713-MY-10/C23, 8097-MY-11/C26, 1570-MY-10/C42, and 7383-MY-10/pat16, are the first newly sequenced complete HRV-C genomes. All seven Malaysian isolate genomes displayed nucleotide similarity of 63–81% among themselves and 63–96% with other HRV-Cs. Malaysian HRV-Cs had similar putative immunogenic sites, putative receptor utilization and potential antiviral sites as other HRV-Cs. The genomic features of the Malaysian isolates were similar to those of other HRV-Cs. Negative selection was frequently detected in HRV-C complete coding sequences, indicating that these sequences were under functional constraint. The present study showed that HRV-Cs from Malaysia have diverse genetic sequences but share conserved genomic features with other HRV-Cs. This genetic information could provide further aid in the understanding of HRV-C infection. PMID:27199901
Comparison of Comorbidity Collection Methods
Kallogjeri, Dorina; Gaynor, Sheila M; Piccirillo, Marilyn L; Jean, Raymond A; Spitznagel, Edward L; Piccirillo, Jay F
2014-01-01
Background Multiple valid comorbidity indices exist to quantify the presence and role of comorbidities in cancer patient survival. Our goal was to compare the chart-based Adult Comorbidity Evaluation-27 index (ACE-27) and the claims-based Charlson Comorbidity Index (CCI) as methods of identifying comorbid ailments, and their prognostic ability. Study Design Prospective cohort study of 6138 newly-diagnosed cancer patients at 12 different institutions. Participating registrars were trained to collect comorbidities from the abstracted chart using the ACE-27 method. The ACE-27 assessment was compared with comorbidities captured through hospital discharge face-sheets using ICD coding. The prognostic performance of each comorbidity method was examined using follow-up data assessed at 24 months after data abstraction. Results Distribution of the ACE-27 scores was: “None” for 1453 (24%) of the patients; “Mild” for 2388 (39%); “Moderate” for 1344 (22%) and “Severe” for 950 (15%) of the patients. Deyo’s adaptation of the Charlson Comorbidity Index (CCI) identified 4265 (69%) patients with a CCI score of 0, and the remaining 31% had CCI scores of 1 (n=1341, 22%), 2 (n=365, 6%), or 3 or more (n=167, 3%). Of the 4265 patients with a CCI score of 0, 394 (9%) were coded with severe comorbidities based on the ACE-27 method. A higher comorbidity score was significantly associated with a higher risk of death for both comorbidity indices. The multivariable Cox model including both comorbidity indices had the best performance (Nagelkerke’s R-square=0.37) and the best discrimination (c-index=0.827). Conclusion The number, type, and overall severity of comorbid ailments identified by chart- and claims-based approaches in newly-diagnosed cancer patients were notably different. Both indices were prognostically significant and able to provide unique prognostic information. PMID:24933715
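The claims-based CCI is computed by mapping diagnoses to weighted conditions and summing the weights; the sketch below shows that arithmetic with a small, partial weight table (the full Deyo mapping from ICD codes to conditions, which is where chart- and claims-based collection diverge, is not reproduced here):

```python
# Partial, illustrative Charlson weight table; the full index covers more
# conditions and age adjustments than shown in this subset.
CCI_WEIGHTS = {
    "myocardial_infarction": 1,
    "congestive_heart_failure": 1,
    "diabetes_with_complications": 2,
    "moderate_severe_renal_disease": 2,
    "metastatic_solid_tumor": 6,
}

def charlson_score(conditions):
    """Sum the weights of the comorbid conditions present for one patient."""
    return sum(CCI_WEIGHTS.get(c, 0) for c in conditions)

patient = ["congestive_heart_failure", "diabetes_with_complications"]
print("CCI score:", charlson_score(patient))   # -> 3
```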
Behavioral Health Services Following Implementation of Screening in Massachusetts Medicaid Children
Penfold, Robert B.; Arsenault, Lisa N.; Zhang, Fang; Murphy, Michael; Wissow, Lawrence S.
2014-01-01
OBJECTIVES: To determine the relationship of child behavioral health (BH) screening results to receipt of BH services in Massachusetts Medicaid (MassHealth) children. METHODS: After a court decision, Massachusetts primary care providers were mandated to conduct BH screening at well-child visits and use a Current Procedural Terminology code along with a modifier indicating whether a BH need was identified. Using MassHealth claims data, a cohort of continuously enrolled (July 2007–June 2010) children was constructed. The salient visit (first use of the modifier, screening code, or claim in fiscal year 2009) was considered a reference point to examine BH history and postscreening BH services. Bivariate and multivariate logistic regression analyses were performed to determine predictors of postscreening BH services. RESULTS: Of 261 160 children in the cohort, 45% (118 464) were screened and 37% had modifiers. Fifty-seven percent of children screening positive received postscreening BH services compared with 22% of children screening negative. However, only 30% of newly identified children received BH services. The strongest predictors of postscreening BH services for children without a BH history were being in foster care (odds ratio, 10.38; 95% confidence interval, 9.22–11.68) and having a positive modifier (odds ratio, 3.79; 95% confidence interval, 3.53–4.06). CONCLUSIONS: Previous BH history, a positive modifier, and foster care predicted postscreening BH services. Only one-third of newly identified children received services. Thus although screening is associated with an increase in BH recognition, it may be insufficient to improve care. Additional strategies may be needed to enhance engagement in BH services. PMID:25225135
X-ray investigation of cross-breed silk in cocoon, yarn and fabric forms
DOE Office of Scientific and Technical Information (OSTI.GOV)
Radhalakshmi, Y. C.; Kariappa,; Siddaraju, G. N.
2012-06-05
Recently, the Central Sericulture Research and Training Institute, Mysore, developed many improved cross breeds and bivoltine hybrids. The newly developed cross breeds recorded fibre characteristics that are significantly superior to those of existing control hybrids. This aspect has been investigated using the X-ray diffraction technique. We have employed line profile analysis to compute microstructural parameters. These parameters are compared with the physical parameters of the newly developed cross breed silk fibres for a better understanding of the structure-property relation in these samples.
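Line profile analysis is more involved than a single-peak estimate, but the simplest crystallite-size calculation from an X-ray peak width, the Scherrer equation, illustrates the kind of microstructural parameter being extracted; the peak position and width below are hypothetical, not measured silk data:

```python
import numpy as np

def scherrer_crystallite_size(fwhm_deg, two_theta_deg, wavelength_nm=0.15406, K=0.9):
    """Scherrer estimate D = K*lambda / (beta*cos(theta)), with beta the peak
    FWHM in radians; the default wavelength is Cu K-alpha."""
    beta = np.radians(fwhm_deg)
    theta = np.radians(two_theta_deg / 2.0)
    return K * wavelength_nm / (beta * np.cos(theta))

# Hypothetical silk-fibroin peak: 2-theta = 20.5 deg, FWHM = 1.8 deg
print(f"crystallite size ~ {scherrer_crystallite_size(1.8, 20.5):.1f} nm")
```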
NASA Astrophysics Data System (ADS)
McNamara, Louis Edward, III
The development of new materials capable of efficient charge transfer and energy storage has become increasingly important in many areas of modern chemical research. This is especially true for the development of emissive optoelectronic devices and in the field of solar-to-electric energy conversion. The characterization of the photophysical properties of new molecular systems for these applications has become critical in the design and development of these materials. Many molecular building blocks have been developed, and understanding the properties of these molecules at a fundamental level is essential for their successful implementation and future engineering. This dissertation focuses on the characterization of some of these newly-developed molecular systems. The spectroscopic studies focus on the characterization of newly-developed molecules based on perylene and indolizine derivatives for solar-to-electric energy conversion, thienopyrazine derivatives for near infrared (NIR) emissive applications, an SCS pincer complex for blue emissive materials and a fluorescent probe for medical applications. The effects of noncovalent interactions on these systems and on a benchmark biological molecule, trimethylamine N-oxide (TMAO), are also investigated.
Hu, Yunzi; Daoud, Walid A.; Cheuk, Kevin Ka Leung; Lin, Carol Sze Ki
2016-01-01
Polycondensation and ring-opening polymerization are two important polymer synthesis methods. Poly(lactic acid), the most typical biodegradable polymer, has been researched extensively since the 1900s. It is therefore of significant importance to have an up-to-date review of recent improvements in techniques for biodegradable polymers. This review takes poly(lactic acid) as an example to present newly developed polymer synthesis techniques for polycondensation and ring-opening polymerization reported in the recent decade (2005–2015), on the basis of industrial technique modifications and advanced laboratory research. Different polymerization methods, including various solvents, heating programs, reaction apparatus and catalyst systems, are summarized and compared with the current industrial production situation. Newly developed modification techniques for improving polymer properties are also discussed based on the case of poly(lactic acid). PMID:28773260
Failure prediction of thin beryllium sheets used in spacecraft structures
NASA Technical Reports Server (NTRS)
Roschke, Paul N.; Mascorro, Edward; Papados, Photios; Serna, Oscar R.
1991-01-01
The primary objective of this study is to develop a method for prediction of failure of thin beryllium sheets that undergo complex states of stress. Major components of the research include experimental evaluation of strength parameters for cross-rolled beryllium sheet, application of the Tsai-Wu failure criterion to plate bending problems, development of a higher-order failure criterion, application of the new criterion to a variety of structures, and incorporation of both failure criteria into a finite element code. A Tsai-Wu failure model for SR-200 sheet material is developed from available tensile data, experiments carried out by NASA on two circular plates, and compression and off-axis experiments performed in this study. The failure surface obtained from the resulting criterion forms an ellipsoid. By supplementing the experimental data used in the two-dimensional criterion and modifying previously suggested failure criteria, a multi-dimensional failure surface is proposed for thin beryllium structures. The new criterion for orthotropic material is represented by a failure surface in six-dimensional stress space. In order to determine the coefficients of the governing equation, a number of uniaxial, biaxial, and triaxial experiments are required. Details of these experiments and a complementary ultrasonic investigation are described in detail. Finally, the validity of the criterion and of the newly determined mechanical properties is established through experiments on structures composed of SR-200 sheet material. These experiments include a plate-plug arrangement under a complex state of stress and a series of plates with an out-of-plane central point load. Both criteria have been incorporated into a general purpose finite element analysis code. The numerical simulation incrementally applies loads to the structural component being designed and checks each nodal point in the model for exceedance of a failure criterion. If stresses at all locations do not exceed the failure criterion, the load is increased and the process is repeated. Failure results for the plate-plug and clamped plate tests are accurate to within 2 percent.
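For plane-stress states, the Tsai-Wu criterion reduces to a quadratic polynomial in the in-plane stresses; the sketch below evaluates that standard form with the common F12 approximation and placeholder strength values, not the measured SR-200 parameters:

```python
import numpy as np

def tsai_wu_index(s1, s2, t12, Xt, Xc, Yt, Yc, S):
    """Plane-stress Tsai-Wu failure index; failure is predicted when the index
    reaches 1. Compressive strengths Xc, Yc are entered as positive magnitudes,
    and the interaction term uses the common choice F12 = -0.5*sqrt(F11*F22)."""
    F1, F2 = 1/Xt - 1/Xc, 1/Yt - 1/Yc
    F11, F22, F66 = 1/(Xt*Xc), 1/(Yt*Yc), 1/S**2
    F12 = -0.5 * np.sqrt(F11 * F22)
    return (F1*s1 + F2*s2 + F11*s1**2 + F22*s2**2
            + F66*t12**2 + 2*F12*s1*s2)

# Hypothetical strengths (MPa) for a cross-rolled sheet and a trial biaxial state
index = tsai_wu_index(s1=180.0, s2=60.0, t12=25.0,
                      Xt=345.0, Xc=300.0, Yt=345.0, Yc=300.0, S=130.0)
print(f"Tsai-Wu index = {index:.2f}", "(failure)" if index >= 1 else "(safe)")
```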
CellAnimation: an open source MATLAB framework for microscopy assays.
Georgescu, Walter; Wikswo, John P; Quaranta, Vito
2012-01-01
Advances in microscopy technology have led to the creation of high-throughput microscopes that are capable of generating several hundred gigabytes of images in a few days. Analyzing such wealth of data manually is nearly impossible and requires an automated approach. There are at present a number of open-source and commercial software packages that allow the user to apply algorithms of different degrees of sophistication to the images and extract desired metrics. However, the types of metrics that can be extracted are severely limited by the specific image processing algorithms that the application implements, and by the expertise of the user. In most commercial software, code unavailability prevents implementation by the end user of newly developed algorithms better suited for a particular type of imaging assay. While it is possible to implement new algorithms in open-source software, rewiring an image processing application requires a high degree of expertise. To obviate these limitations, we have developed an open-source high-throughput application that allows implementation of different biological assays such as cell tracking or ancestry recording, through the use of small, relatively simple image processing modules connected into sophisticated imaging pipelines. By connecting modules, non-expert users can apply the particular combination of well-established and novel algorithms developed by us and others that are best suited for each individual assay type. In addition, our data exploration and visualization modules make it easy to discover or select specific cell phenotypes from a heterogeneous population. CellAnimation is distributed under the Creative Commons Attribution-NonCommercial 3.0 Unported license (http://creativecommons.org/licenses/by-nc/3.0/). CellAnimation source code and documentation may be downloaded from www.vanderbilt.edu/viibre/software/documents/CellAnimation.zip. Sample data are available at www.vanderbilt.edu/viibre/software/documents/movies.zip. walter.georgescu@vanderbilt.edu Supplementary data available at Bioinformatics online.
Impact of Different Correlations on TRACEv4.160 Predicted Critical Heat Flux
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jasiulevicius, A.; Macian-Juan, R.
2006-07-01
This paper presents an independent assessment of the Critical Heat Flux (CHF) models implemented in TRACEv4.160 with data from the experiments carried out at the Royal Institute of Technology (RIT) in Stockholm, Sweden, with single vertical uniformly heated 7.0 m long tubes. In previous CHF assessment studies with TRACE, it was noted that, although the overall code predictions in long single tubes with inner diameters of 1.0 to 2.49 cm agreed rather well with the results of experiments (with the r.m.s. error being 25.6%), several regions of pressure and coolant mass flux could be identified in which the code strongly under-predicts or over-predicts the CHF. In order to evaluate the possibility of improving the code performance, some of the most widely used and assessed CHF correlations were additionally implemented in TRACEv4.160, namely Bowring, Levitan-Lantsman, and Tong-W3. The results obtained for the CHF predictions in single tubes with uniform axial heat flux by using these correlations were compared to the results produced with the standard TRACE correlations (Biasi and CISE-GE) and with the experimental data from RIT, which covered a broad range of pressures (3-20 MPa) and coolant mass fluxes (500-3000 kg/m^2 s). Several hundred experimental points were calculated to cover the parameter range mentioned above for the evaluation of the newly implemented correlations in the TRACEv4.160 code. (author)
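Assessments like this are typically summarised by the r.m.s. and bias of the relative prediction error over the experimental database; a minimal sketch of that bookkeeping, with placeholder CHF values rather than the RIT data, is:

```python
import numpy as np

# Illustrative measured and predicted CHF values (MW/m^2) for one correlation
chf_measured = np.array([2.10, 1.75, 1.40, 1.05, 0.80])
chf_predicted = np.array([2.35, 1.60, 1.52, 0.98, 0.71])

rel_err = (chf_predicted - chf_measured) / chf_measured
rms = np.sqrt(np.mean(rel_err**2)) * 100      # r.m.s. relative error in percent
bias = np.mean(rel_err) * 100                 # mean bias in percent
print(f"r.m.s. error = {rms:.1f}%, mean bias = {bias:+.1f}%")
```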
Exploring newly qualified doctors' workplace stressors: an interview study from Australia
Tallentire, Victoria R; Smith, Samantha E; Facey, Adam D; Rotstein, Laila
2017-01-01
Purpose Postgraduate year 1 (PGY1) doctors suffer from high levels of psychological distress, yet the contributory factors are poorly understood. This study used an existing model of workplace stress to explore the elements most pertinent to PGY1 doctors. In turn, the data were used to amend and refine the conceptual model to better reflect the unique experiences of PGY1 doctors. Method Focus groups were undertaken with PGY1 doctors working at four different health services in Victoria, Australia. Transcripts were coded using Michie's model of workplace stress as the initial coding template. Remaining text was coded inductively and the supplementary codes were used to modify and amplify Michie's framework. Results There were 37 participants in total. Key themes included stressors intrinsic to the job, such as work overload and long hours, as well as those related to the context of work such as lack of role clarity and relationships with colleagues. The main modification to Michie's framework was the addition of the theme of uncertainty. This concept related to most of the pre-existing themes in complex ways, culminating in an overall sense of anxiety. Conclusions Michie's model of workplace stress can be effectively used to explore the stressors experienced by PGY1 doctors. Pervasive uncertainty may help to explain the high levels of psychological morbidity in this group. While some uncertainty will always remain, the medical education community must seek ways to improve role clarity and promote mutual respect. PMID:28801411
Preliminary Assessment of Turbomachinery Codes
NASA Technical Reports Server (NTRS)
Mazumder, Quamrul H.
2007-01-01
This report assesses different CFD codes developed and currently being used at Glenn Research Center to predict turbomachinery fluid flow and heat transfer behavior. This report will consider the following codes: APNASA, TURBO, GlennHT, H3D, and SWIFT. Each code will be described separately in the following section with its current modeling capabilities, level of validation, pre/post processing, and future development and validation requirements. This report addresses only previously published validations of the codes. However, the codes have been further developed to extend their capabilities.
The automotive application of discontinuously reinforced TiB-Ti composites
NASA Astrophysics Data System (ADS)
Saito, Takashi
2004-05-01
In 1998, Toyota Motor Corporation adopted intake valves and exhaust valves made of titanium-based alloys for the engine of its Altezza. Both valves were manufactured via a newly developed cost-effective powder metallurgy process. The exhaust valve is made of a newly developed titanium metal-matrix composite (MMC). The valve has achieved sufficient durability and reliability with a manufacturing cost acceptable for the mass-produced automobile engine components.
Seekoe, Eunice
2014-04-24
South Africa transformed higher education through the enactment of the Higher Education Act (No. 101 of 1997). The researcher identified the need to develop a model for the mentoring of newly-appointed nurse educators in nursing education institutions in South Africa. To develop and describe the model for mentoring newly-appointed nurse educators in nursing education institutions in South Africa. A qualitative and theory-generating design was used (following empirical findings regarding needs analysis) in order to develop the model. The conceptualisation of the framework focused on the context, content, process and the theoretical domains that influenced the model. Ideas from different theories were borrowed from and integrated with the literature and deductive and inductive strategies were applied. The structure of the model is multidimensional and complex in nature (macro, meso and micro) based on the philosophy of reflective practice, competency-based practice and critical learning theories. The assumptions are in relation to stakeholders, context, mentoring, outcome, process and dynamic. The stakeholders are the mentor and mentee within an interactive participatory relationship. The mentoring takes place within the process with a sequence of activities such as relationship building, development, engagement, reflective process and assessment. Capacity building and empowerment are outcomes of mentoring driven by motivation. The implication for nurse managers is that the model can be used to develop mentoring programmes for newly-appointed nurse educators.
1994-12-01
three times with newly emerged adults and three times with adults at least a month old. Treatments were (a) four hydrilla sprigs, (b) four hydrilla...Methods conducted to determine if feeding on nonhost plants stimulated the development of flight muscles. Newly emerged females were held in a plastic...Fecundity and adult longevity A fecundity and adult longevity test was conducted with newly emerged adults. Adults were held initially with moist paper
Developing and modifying behavioral coding schemes in pediatric psychology: a practical guide.
Chorney, Jill MacLaren; McMurtry, C Meghan; Chambers, Christine T; Bakeman, Roger
2015-01-01
To provide a concise and practical guide to the development, modification, and use of behavioral coding schemes for observational data in pediatric psychology. This article provides a review of relevant literature and experience in developing and refining behavioral coding schemes. A step-by-step guide to developing and/or modifying behavioral coding schemes is provided. Major steps include refining a research question, developing or refining the coding manual, piloting and refining the coding manual, and implementing the coding scheme. Major tasks within each step are discussed, and pediatric psychology examples are provided throughout. Behavioral coding can be a complex and time-intensive process, but the approach is invaluable in allowing researchers to address clinically relevant research questions in ways that would not otherwise be possible. © The Author 2014. Published by Oxford University Press on behalf of the Society of Pediatric Psychology. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
Deductive Glue Code Synthesis for Embedded Software Systems Based on Code Patterns
NASA Technical Reports Server (NTRS)
Liu, Jian; Fu, Jicheng; Zhang, Yansheng; Bastani, Farokh; Yen, I-Ling; Tai, Ann; Chau, Savio N.
2006-01-01
Automated code synthesis is a constructive process that can be used to generate programs from specifications. It can, thus, greatly reduce the software development cost and time. The use of formal code synthesis approach for software generation further increases the dependability of the system. Though code synthesis has many potential benefits, the synthesis techniques are still limited. Meanwhile, components are widely used in embedded system development. Applying code synthesis to component based software development (CBSD) process can greatly enhance the capability of code synthesis while reducing the component composition efforts. In this paper, we discuss the issues and techniques for applying deductive code synthesis techniques to CBSD. For deductive synthesis in CBSD, a rule base is the key for inferring appropriate component composition. We use the code patterns to guide the development of rules. Code patterns have been proposed to capture the typical usages of the components. Several general composition operations have been identified to facilitate systematic composition. We present the technique for rule development and automated generation of new patterns from existing code patterns. A case study of using this method in building a real-time control system is also presented.
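A toy illustration of the pattern-guided composition idea, in which a sequential-composition "code pattern" generates glue code only when component interface types match, is sketched below; the components, type names, and rule are hypothetical, not drawn from the paper:

```python
from typing import Callable, Dict, Tuple

# Component registry: name -> (input_type, output_type, implementation).
# Both components and their interface types are illustrative placeholders.
registry: Dict[str, Tuple[str, str, Callable]] = {
    "sensor_reader": ("RawFrame", "Measurement", lambda raw: {"value": raw // 10}),
    "controller":    ("Measurement", "Command", lambda m: {"thrust": 2 * m["value"]}),
}

def synthesize_pipeline(first: str, second: str) -> Callable:
    """Apply the sequential-composition pattern if the interface types match."""
    _, out_a, f_a = registry[first]
    in_b, _, f_b = registry[second]
    if out_a != in_b:
        raise TypeError(f"cannot glue {first} -> {second}: {out_a} != {in_b}")
    return lambda x: f_b(f_a(x))           # the generated glue code

pipeline = synthesize_pipeline("sensor_reader", "controller")
print(pipeline(42))                        # {'thrust': 8}
```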
Li, Yongping; Wei, Wei; Feng, Jia; Luo, Huifeng; Pi, Mengting; Liu, Zhongchi; Kang, Chunying
2018-01-01
Abstract The genome of the wild diploid strawberry species Fragaria vesca, an ideal model system for cultivated strawberry (Fragaria × ananassa, octoploid) and other Rosaceae family crops, was first published in 2011 and followed by a new assembly (Fvb). However, the annotation for Fvb mainly relied on ab initio predictions and included only predicted coding sequences; therefore, an improved annotation is highly desirable. Here, a new annotation version named v2.0.a2 was created for the Fvb genome by a pipeline utilizing one PacBio library, 90 Illumina RNA-seq libraries, and 9 small RNA-seq libraries. Altogether, 18,641 genes (55.6% out of 33,538 genes) were augmented with information on the 5′ and/or 3′ UTRs, 13,168 (39.3%) protein-coding genes were modified or newly identified, and 7,370 genes were found to possess alternative isoforms. In addition, 1,938 long non-coding RNAs, 171 miRNAs, and 51,714 small RNA clusters were integrated into the annotation. This new annotation of F. vesca is substantially improved in both the accuracy and integrity of gene predictions, beneficial to gene functional studies in strawberry and to comparative genomic analyses of other horticultural crops in the Rosaceae family. PMID:29036429
Codes of medical ethics: traditional foundations and contemporary practice.
Sohl, P; Bassford, H A
1986-01-01
The Hippocratic Corpus recognized the interaction of 'business' and patient-health moral considerations, and urged that the former be subordinated to the latter. During the 1800s, with the growth of complexity in both scientific knowledge and the organization of health services, the medical ethical codes addressed themselves to elaborate rules of conduct to be followed by the members of the newly emerging national medical associations. After World War II the World Medical Association was established as an international forum where national medical associations could debate the ethical problems presented by modern medicine. The International Code of Medical Ethics and the Declaration of Geneva were written as 20th century restatements of the medical profession's commitment to the sovereignty of the patient-care norm. Many ethical statements have been issued by the World Medical Association in the past 35 years; they show the variety and difficulties of contemporary medical practice. The newest revisions were approved by the General Assembly of the World Medical Association in Venice, Italy, in October 1983. Their content is examined and concern is voiced about the danger of falling into cultural relativism when questions about the methods of financing medical services are the subject of an ethical declaration which is arrived at by consensus in the W.M.A.
RNA-Seq Based Transcriptional Map of Bovine Respiratory Disease Pathogen “Histophilus somni 2336”
Kumar, Ranjit; Lawrence, Mark L.; Watt, James; Cooksey, Amanda M.; Burgess, Shane C.; Nanduri, Bindu
2012-01-01
Genome structural annotation, i.e., identification and demarcation of the boundaries for all the functional elements in a genome (e.g., genes, non-coding RNAs, proteins and regulatory elements), is a prerequisite for systems level analysis. Current genome annotation programs do not identify all of the functional elements of the genome, especially small non-coding RNAs (sRNAs). Whole genome transcriptome analysis is a complementary method to identify “novel” genes, small RNAs, regulatory regions, and operon structures, thus improving the structural annotation in bacteria. In particular, the identification of non-coding RNAs has revealed their widespread occurrence and functional importance in gene regulation, stress and virulence. However, very little is known about non-coding transcripts in Histophilus somni, one of the causative agents of Bovine Respiratory Disease (BRD) as well as bovine infertility, abortion, septicemia, arthritis, myocarditis, and thrombotic meningoencephalitis. In this study, we report a single nucleotide resolution transcriptome map of H. somni strain 2336 using RNA-Seq method. The RNA-Seq based transcriptome map identified 94 sRNAs in the H. somni genome of which 82 sRNAs were never predicted or reported in earlier studies. We also identified 38 novel potential protein coding open reading frames that were absent in the current genome annotation. The transcriptome map allowed the identification of 278 operon (total 730 genes) structures in the genome. When compared with the genome sequence of a non-virulent strain 129Pt, a disproportionate number of sRNAs (∼30%) were located in genomic region unique to strain 2336 (∼18% of the total genome). This observation suggests that a number of the newly identified sRNAs in strain 2336 may be involved in strain-specific adaptations. PMID:22276113
RNA-seq based transcriptional map of bovine respiratory disease pathogen "Histophilus somni 2336".
Kumar, Ranjit; Lawrence, Mark L; Watt, James; Cooksey, Amanda M; Burgess, Shane C; Nanduri, Bindu
2012-01-01
Genome structural annotation, i.e., identification and demarcation of the boundaries for all the functional elements in a genome (e.g., genes, non-coding RNAs, proteins and regulatory elements), is a prerequisite for systems level analysis. Current genome annotation programs do not identify all of the functional elements of the genome, especially small non-coding RNAs (sRNAs). Whole genome transcriptome analysis is a complementary method to identify "novel" genes, small RNAs, regulatory regions, and operon structures, thus improving the structural annotation in bacteria. In particular, the identification of non-coding RNAs has revealed their widespread occurrence and functional importance in gene regulation, stress and virulence. However, very little is known about non-coding transcripts in Histophilus somni, one of the causative agents of Bovine Respiratory Disease (BRD) as well as bovine infertility, abortion, septicemia, arthritis, myocarditis, and thrombotic meningoencephalitis. In this study, we report a single nucleotide resolution transcriptome map of H. somni strain 2336 using RNA-Seq method. The RNA-Seq based transcriptome map identified 94 sRNAs in the H. somni genome of which 82 sRNAs were never predicted or reported in earlier studies. We also identified 38 novel potential protein coding open reading frames that were absent in the current genome annotation. The transcriptome map allowed the identification of 278 operon (total 730 genes) structures in the genome. When compared with the genome sequence of a non-virulent strain 129Pt, a disproportionate number of sRNAs (∼30%) were located in genomic region unique to strain 2336 (∼18% of the total genome). This observation suggests that a number of the newly identified sRNAs in strain 2336 may be involved in strain-specific adaptations.
The AGORA High-resolution Galaxy Simulations Comparison Project II: Isolated disk test
Kim, Ji-hoon; Agertz, Oscar; Teyssier, Romain; ...
2016-12-20
Using an isolated Milky Way-mass galaxy simulation, we compare results from 9 state-of-the-art gravito-hydrodynamics codes widely used in the numerical community. We utilize the infrastructure we have built for the AGORA High-resolution Galaxy Simulations Comparison Project. This includes the common disk initial conditions, common physics models (e.g., radiative cooling and UV background by the standardized package Grackle) and common analysis toolkit yt, all of which are publicly available. Subgrid physics models such as Jeans pressure floor, star formation, supernova feedback energy, and metal production are carefully constrained across code platforms. With numerical accuracy that resolves the disk scale height, we find that the codes overall agree well with one another in many dimensions including: gas and stellar surface densities, rotation curves, velocity dispersions, density and temperature distribution functions, disk vertical heights, stellar clumps, star formation rates, and Kennicutt-Schmidt relations. Quantities such as velocity dispersions are very robust (agreement within a few tens of percent at all radii) while measures like newly-formed stellar clump mass functions show more significant variation (difference by up to a factor of ~3). Systematic differences exist, for example, between mesh-based and particle-based codes in the low density region, and between more diffusive and less diffusive schemes in the high density tail of the density distribution. Yet intrinsic code differences are generally small compared to the variations in numerical implementations of the common subgrid physics such as supernova feedback. Lastly, our experiment reassures that, if adequately designed in accordance with our proposed common parameters, results of a modern high-resolution galaxy formation simulation are more sensitive to input physics than to intrinsic differences in numerical schemes.
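The abstract names yt as the project's common analysis toolkit; a minimal, hypothetical yt snippet for one of the quantities compared across codes (projected gas surface density) might look as follows. The dataset path is invented, and the exact fields and units used by the project are assumptions.

    # Hypothetical yt sketch: project gas density along the disk axis.
    import yt

    ds = yt.load("output_00042/info_00042.txt")            # invented snapshot path
    proj = yt.ProjectionPlot(ds, "z", ("gas", "density"))   # surface density along z
    proj.set_unit(("gas", "density"), "Msun/pc**2")
    proj.save("gas_surface_density.png")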
Kassa, Jiri; Musilek, Kamil; Koomlova, Marketa; Bajgar, Jiri
2012-04-01
The ability of three newly developed reversible inhibitors of acetylcholinesterase (AChE) (K298, K344 and K474) and currently available carbamate pyridostigmine to increase the resistance of mice against soman and the efficacy of antidotal treatment of soman-poisoned mice was compared. Neither pyridostigmine nor new reversible inhibitors of AChE were able to increase the LD(50) value of soman. Thus, the pharmacological pre-treatment with pyridostigmine or newly synthesized inhibitors of AChE was not able to protect mice against soman-induced lethal acute toxicity. The pharmacological pre-treatment with pyridostigmine alone or with K474 was able to slightly increase the efficacy of antidotal treatment (the oxime HI-6 in combination with atropine) of soman-poisoned mice, but the increase in the efficacy of antidotal treatment was not significant. The other newly developed reversible inhibitors of AChE (K298, K344) were completely ineffective. These findings demonstrate that pharmacological pre-treatment of soman-poisoned mice with tested reversible inhibitors of AChE is not promising. © 2011 The Authors. Basic & Clinical Pharmacology & Toxicology © 2011 Nordic Pharmacological Society.
Using Coding Apps to Support Literacy Instruction and Develop Coding Literacy
ERIC Educational Resources Information Center
Hutchison, Amy; Nadolny, Larysa; Estapa, Anne
2016-01-01
In this article the authors present the concept of Coding Literacy and describe the ways in which coding apps can support the development of Coding Literacy and disciplinary and digital literacy skills. Through detailed examples, we describe how coding apps can be integrated into literacy instruction to support learning of the Common Core English…
DOT National Transportation Integrated Search
2013-06-01
The Indiana Department of Transportation (INDOT) is currently utilizing a profilograph and the profile index for measuring smoothness : assurance for newly constructed pavements. However, there are benefits to implementing a new IRI based smoothness ...
Is phonology bypassed in normal or dyslexic development?
Pennington, B F; Lefly, D L; Van Orden, G C; Bookman, M O; Smith, S D
1987-01-01
A pervasive assumption in most accounts of normal reading and spelling development is that phonological coding is important early in development but is subsequently superseded by faster, orthographic coding which bypasses phonology. We call this assumption, which derives from dual process theory, the developmental bypass hypothesis. The present study tests four specific predictions of the developmental bypass hypothesis by comparing dyslexics and nondyslexics from the same families in a cross-sectional design. The four predictions are: 1) that phonological coding skill develops early in normal readers and soon reaches asymptote, whereas orthographic coding skill has a protracted course of development; 2) that the correlation of adult reading or spelling performance with phonological coding skill is considerably less than the correlation with orthographic coding skill; 3) that dyslexics who are mainly deficient in phonological coding skill should be able to bypass this deficit and eventually close the gap in reading and spelling performance; and 4) that the greatest differences between dyslexics and developmental controls on measures of phonological coding skill should be observed early rather than late in development. None of the four predictions of the developmental bypass hypothesis were upheld. Phonological coding skill continued to develop in nondyslexics until adulthood. It accounted for a substantial (32-53 percent) portion of the variance in reading and spelling performance in adult nondyslexics, whereas orthographic coding skill did not account for a statistically reliable portion of this variance. The dyslexics differed little across age in phonological coding skill, but made linear progress in orthographic coding skill, surpassing spelling-age (SA) controls by adulthood. Nonetheless, they did not close the gap in reading and spelling performance. Finally, dyslexics were significantly worse than SA (and Reading Age [RA]) controls in phonological coding skill only in adulthood.
Maletzki, Claudia; Beyrich, Franziska; Hühns, Maja; Klar, Ernst; Linnebacher, Michael
2016-08-16
Mouse lines homozygous negative for one of the four DNA mismatch repair (MMR) genes (MLH1, MSH2, PMS2, MSH6) were generated as models for MMR deficient (MMR-D) diseases. Clinically, hereditary forms of MMR-D include Lynch syndrome (characterized by a germline MMR gene defect) and constitutional MMR-D, the biallelic form. MMR-D knockout mice may be representative for both diseases. Here, we aimed at characterizing the MLH1-/- model focusing on tumor-immune microenvironment and identification of coding microsatellite mutations in lymphomas and gastrointestinal tumors (GIT). All tumors showed microsatellite instability (MSI) in non-coding mononucleotide markers. Mutational profiling of 26 coding loci in MSI+ GIT and lymphomas revealed instability in half of the microsatellites, two of them (Rfc3 and Rasal2) shared between both entities. MLH1-/- tumors of both entities displayed a similar phenotype (high CD71, FasL, PD-L1 and CTLA-4 expression). Additional immunofluorescence verified the tumors' natural immunosuppressive character (marked CD11b/CD200R infiltration). Vice versa, CD3+ T cells as well as immune checkpoint molecules were detectable, indicative of an active immune microenvironment. For functional analysis, a permanent cell line from an MLH1-/- GIT was established. The newly developed MLH1-/- A7450 cells exhibit stable in vitro growth, strong invasive potential and heterogeneous drug response. Moreover, four additional MSI target genes (Nktr1, C8a, Taf1b, and Lig4) not recognized in the primary tumor were identified in this cell line. Summing up, molecular and immunological mechanisms of MLH1-/- driven carcinogenesis correlate well with clinical features of MMR-D. MLH1-/- knockout mice combine characteristics of Lynch syndrome and constitutional MMR-D, making them suitable models for preclinical research aiming at MMR-D related diseases.
Nutt, S L; Morrison, A M; Dörfler, P; Rolink, A; Busslinger, M
1998-01-01
The Pax-5 gene codes for the transcription factor BSAP which is essential for the progression of adult B lymphopoiesis beyond an early progenitor (pre-BI) cell stage. Although several genes have been proposed to be regulated by BSAP, CD19 is to date the only target gene which has been genetically confirmed to depend on this transcription factor for its expression. We have now taken advantage of cultured pre-BI cells of wild-type and Pax-5 mutant bone marrow to screen a large panel of B lymphoid genes for additional BSAP target genes. Four differentially expressed genes were shown to be under the direct control of BSAP, as their expression was rapidly regulated in Pax-5-deficient pre-BI cells by a hormone-inducible BSAP-estrogen receptor fusion protein. The genes coding for the B-cell receptor component Ig-alpha (mb-1) and the transcription factors N-myc and LEF-1 are positively regulated by BSAP, while the gene coding for the cell surface protein PD-1 is efficiently repressed. Distinct regulatory mechanisms of BSAP were revealed by reconstituting Pax-5-deficient pre-BI cells with full-length BSAP or a truncated form containing only the paired domain. IL-7 signalling was able to efficiently induce the N-myc gene only in the presence of full-length BSAP, while complete restoration of CD19 synthesis was critically dependent on the BSAP protein concentration. In contrast, the expression of the mb-1 and LEF-1 genes was already reconstituted by the paired domain polypeptide lacking any transactivation function, suggesting that the DNA-binding domain of BSAP is sufficient to recruit other transcription factors to the regulatory regions of these two genes. In conclusion, these loss- and gain-of-function experiments demonstrate that BSAP regulates four newly identified target genes as a transcriptional activator, repressor or docking protein depending on the specific regulatory sequence context. PMID:9545244
Winkler, Isaac S; Blaschke, Jeremy D; Davis, Daniel J; Stireman, John O; O'Hara, James E; Cerretti, Pierfilippo; Moulton, John K
2015-07-01
Molecular phylogenetic studies at all taxonomic levels often infer rapid radiation events based on short, poorly resolved internodes. While such rapid episodes of diversification are an important and widespread evolutionary phenomenon, much of this poor phylogenetic resolution may be attributed to the continuing widespread use of "traditional" markers (mitochondrial, ribosomal, and some nuclear protein-coding genes) that are often poorly suited to resolve difficult, higher-level phylogenetic problems. Here we reconstruct phylogenetic relationships among a representative set of taxa of the parasitoid fly family Tachinidae and related outgroups of the superfamily Oestroidea. The Tachinidae are one of the most species rich, yet evolutionarily recent families of Diptera, providing an ideal case study for examining the differential performance of loci in resolving phylogenetic relationships and the benefits of adding more loci to phylogenetic analyses. We assess the phylogenetic utility of nine genes including both traditional genes (e.g., CO1 mtDNA, 28S rDNA) and nuclear protein-coding genes newly developed for phylogenetic analysis. Our phylogenetic findings, based on a limited set of taxa, include: a close relationship between Tachinidae and the calliphorid subfamily Polleninae, monophyly of Tachinidae and the subfamilies Exoristinae and Dexiinae, subfamily groupings of Dexiinae+Phasiinae and Tachininae+Exoristinae, and robust phylogenetic placement of the somewhat enigmatic genera Strongygaster, Euthera, and Ceracia. In contrast to poor resolution and phylogenetic incongruence of "traditional genes," we find that a more selective set of highly informative genes is able to more precisely identify regions of the phylogeny that experienced rapid radiation of lineages, while more accurately depicting their phylogenetic context. Although much expanded taxon sampling is necessary to effectively assess the monophyly of and relationships among major tachinid lineages and their relatives, we show that a small number of well-chosen nuclear protein-coding genes can successfully resolve even difficult phylogenetic problems. Copyright © 2015 Elsevier Inc. All rights reserved.
Santos, Leonardo N; Silva, Eduardo S; Santos, André S; De Sá, Pablo H; Ramos, Rommel T; Silva, Artur; Cooper, Philip J; Barreto, Maurício L; Loureiro, Sebastião; Pinheiro, Carina S; Alcantara-Neves, Neuza M; Pacheco, Luis G C
2016-07-01
Infection with helminthic parasites, including the soil-transmitted helminth Trichuris trichiura (human whipworm), has been shown to modulate host immune responses and, consequently, to have an impact on the development and manifestation of chronic human inflammatory diseases. De novo derivation of helminth proteomes from sequencing of transcriptomes will provide valuable data to aid identification of parasite proteins that could be evaluated as potential immunotherapeutic molecules in the near future. Herein, we characterized the transcriptome of the adult stage of the human whipworm T. trichiura, using next-generation sequencing technology and a de novo assembly strategy. Nearly 17.6 million high-quality clean reads were assembled into 6414 contiguous sequences, with an N50 of 1606 bp. In total, 5673 protein-encoding sequences were confidently identified in the T. trichiura adult worm transcriptome; of these, 1013 sequences represent potential newly discovered proteins for the species, most of which present orthologs already annotated in the related species T. suis. A number of transcripts representing probable novel non-coding transcripts for the species T. trichiura were also identified. Among the most abundant transcripts, we found sequences that code for proteins involved in lipid transport, such as vitellogenins, and several chitin-binding proteins. Through a cross-species expression analysis of gene orthologs shared by T. trichiura and the closely related parasites T. suis and T. muris, it was possible to find twenty-six protein-encoding genes that are consistently highly expressed in the adult stages of the three helminth species. Additionally, twenty transcripts could be identified that code for proteins previously detected by mass spectrometry analysis of protein fractions of the whipworm somatic extract that present immunomodulatory activities. Five of these transcripts were amongst the most highly expressed protein-encoding sequences in the T. trichiura adult worm. Besides, orthologs of proteins demonstrated to have potent immunomodulatory properties in related parasitic helminths were also predicted from the T. trichiura de novo assembled transcriptome. Copyright © 2016. Published by Elsevier B.V.
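The N50 value quoted above (1606 bp) is conventionally the contig length at which the cumulative length of the longest contigs first reaches half of the total assembly size; a minimal sketch with toy contig lengths, not the authors' code:

    def n50(lengths):
        # Length L such that contigs of length >= L cover at least half the assembly.
        total = sum(lengths)
        running = 0
        for length in sorted(lengths, reverse=True):
            running += length
            if running >= total / 2:
                return length

    print(n50([5000, 3000, 1600, 1200, 800, 400]))  # -> 3000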
Williams, Brent A; Evans, Michael A; Honushefsky, Ashley M; Berger, Peter B
2017-10-12
Though warfarin has historically been the primary oral anticoagulant for stroke prevention in newly diagnosed atrial fibrillation (AF), several new direct oral anticoagulants may be preferred when anticoagulation control with warfarin is expected to be poor. This study developed a prediction model for time in therapeutic range (TTR) among newly diagnosed AF patients on newly initiated warfarin as a tool to assist decision making between warfarin and direct oral anticoagulants. This electronic medical record-based, retrospective study included newly diagnosed, nonvalvular AF patients with no recent warfarin exposure receiving primary care services through a large healthcare system in rural Pennsylvania. TTR was estimated as the percentage of time international normalized ratio measurements were between 2.0 and 3.0 during the first year following warfarin initiation. Candidate predictors of TTR were chosen from data elements collected during usual clinical care. A TTR prediction model was developed and temporally validated and its predictive performance was compared with the SAMe-TT₂R₂ score (sex, age, medical history, treatment, tobacco, race) using R² and c-statistics. A total of 7877 newly diagnosed AF patients met study inclusion criteria. Median (interquartile range) TTR within the first year of starting warfarin was 51% (32, 67). Of 85 candidate predictors evaluated, 15 were included in the final validated model with an R² of 15.4%. The proposed model showed better predictive performance than the SAMe-TT₂R₂ score (R² = 3.0%). The proposed prediction model may assist decision making on the proper mode of oral anticoagulant among newly diagnosed AF patients. However, predicting TTR on warfarin remains challenging. © 2017 The Authors. Published on behalf of the American Heart Association, Inc., by Wiley.
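TTR is defined here as the percentage of time the INR stayed between 2.0 and 3.0; a common way to estimate it is Rosendaal-style linear interpolation between successive INR measurements. The sketch below assumes that approach and uses invented visit data; it is not the study's implementation.

    # Rosendaal-style TTR estimate with linear interpolation between INR visits.
    def ttr(days, inrs, low=2.0, high=3.0, step=0.1):
        in_range = total = 0.0
        for k in range(len(days) - 1):
            d0, d1, i0, i1 = days[k], days[k + 1], inrs[k], inrs[k + 1]
            n = max(1, int((d1 - d0) / step))
            for s in range(n):
                frac = (s + 0.5) / n
                if low <= i0 + frac * (i1 - i0) <= high:   # interpolated INR in range
                    in_range += (d1 - d0) / n
            total += d1 - d0
        return 100.0 * in_range / total

    print(round(ttr([0, 14, 28, 56], [1.8, 2.4, 3.4, 2.6]), 1))  # invented visits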
Jason M. Forthofer; Bret W. Butler; Natalie S. Wagenbrenner
2014-01-01
For this study three types of wind models have been defined for simulating surface wind flow in support of wildland fire management: (1) a uniform wind field (typically acquired from coarse-resolution (∼4 km) weather service forecast models); (2) a newly developed mass-conserving model and (3) a newly developed mass- and momentum-conserving model (referred to as the...
Psychological characteristics of patients with newly developed psychogenic seizures
van Merode, T; Twellaar, M; Kotsopoulos, I; Kessels, A; Merckelbach, H; de Krom, M C T F M; Knottnerus, J
2004-01-01
Methods: Using validated scales, 178 patients from the general population diagnosed with newly developed seizures were assessed, at a point in time when the nature of their seizures was yet unknown to either doctors or patients. After standardised neurological examination, 138 patients were diagnosed with non-psychogenic seizures (NPS), while 40 patients were found to have psychogenic seizures (PS). To evaluate possible differences between the genders and the diagnostic groups, univariate analyses of variance were done. Results: PS patients reported significantly more comorbid psychopathological complaints, dissociative experiences, anxiety, and self-reported childhood trauma than NPS patients. In addition, PS patients had lower quality of life ratings than NPS patients. These effects were not modulated by gender. Conclusions: The results of the present study indicate that patients with newly developed PS constitute a group with complex psychopathological features that warrant early detection and treatment. PMID:15258225
Period variations of Algol-type eclipsing binaries AD And, TW Cas and IV Cas
NASA Astrophysics Data System (ADS)
Parimucha, Štefan; Gajdoš, Pavol; Kudak, Viktor; Fedurco, Miroslav; Vaňko, Martin
2018-04-01
We present new analyses of variations in O – C diagrams of three Algol-type eclipsing binary stars: AD And, TW Cas and IV Cas. We have used all published minima times (including visual and photographic) as well as newly determined ones from our and SuperWasp observations. We determined orbital parameters of 3rd bodies in the systems with statistically significant errors, using our code based on genetic algorithms and Markov chain Monte Carlo simulations. We confirmed the multiple nature of AD And and the triple-star model of TW Cas, and we proposed a quadruple-star model of IV Cas.
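The genetic-algorithm/MCMC orbit fit used in the paper is beyond a short example, but as a hedged simplification, a distant companion on a roughly circular orbit produces an approximately sinusoidal light-time signal in the O-C residuals, which can be fitted with scipy; all numbers below are invented.

    # Simplified stand-in for a light-time-effect fit: sinusoidal O-C model.
    import numpy as np
    from scipy.optimize import curve_fit

    def oc_model(epoch, amp, period, phase, offset):
        return amp * np.sin(2 * np.pi * epoch / period + phase) + offset

    epochs = np.linspace(0, 20000, 80)                         # cycle numbers (invented)
    oc = 0.01 * np.sin(2 * np.pi * epochs / 9000 + 0.3) \
         + np.random.normal(0, 0.002, epochs.size)             # O-C in days (synthetic)
    popt, _ = curve_fit(oc_model, epochs, oc, p0=[0.01, 8000, 0.0, 0.0])
    print(f"amplitude ~ {popt[0]:.4f} d, period ~ {popt[1]:.0f} cycles")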
Blood Glucose Meters That Are Accessible to Blind and Visually Impaired Persons
Uslan, Mark M.; Burton, Darren M.; Clements, Charles W.
2008-01-01
Blood glucose meters (BGMs) that can be used nonvisually or with a visual limitation were introduced in the mid-1990s, but it was not until 2006 and 2007 that a new set of meters with accessibility features were introduced: Prodigy, Prodigy Autocode, and Prodigy Voice (Diagnostic Devices, Charlotte, NC), and Advocate and Advocate Redi-Code (TaiDoc, Taiwan). Accessibility attributes of the newly introduced BGMs were tabulated, and product design features were examined and documented. The Prodigy Voice was found to be the only one of these new BGMs that is fully usable by blind and visually impaired persons. PMID:19885356
Modify Federal Tax Code to Create Incentives for Individuals to Obtain Coverage
McGlynn, Elizabeth A.
2011-01-01
Abstract This article explores how a refundable tax credit to offset the cost of health insurance premiums would affect health system performance along nine dimensions. A refundable tax credit would produce a slight gain in health as measured by life expectancy; 2.3 to 10 million people would become newly insured under this policy change. It is uncertain how the policy would affect waste or patient experience. Refundable tax credits would have no discernable effect on total health care spending, overall consumer financial risk, reliability of care, or health system capacity. Implementing refundable tax credits would be relatively easy. PMID:28083204
The rectangular array of magnetic probes on J-TEXT tokamak.
Chen, Zhipeng; Li, Fuming; Zhuang, Ge; Jian, Xiang; Zhu, Lizhi
2016-11-01
The rectangular array of magnetic probes system was newly designed and installed in the torus on J-TEXT tokamak to measure the local magnetic fields outside the last closed flux surface at a single toroidal angle. In the implementation, the experimental results agree well with the theoretical results based on the Spool model and three-dimensional numerical finite element model when the vertical field was applied. Furthermore, the measurements were successfully used as the input of EFIT code to conduct the plasma equilibrium reconstruction. The calculated Faraday rotation angle using the EFIT output is in agreement with the measured one from the three-wave polarimeter-interferometer system.
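The consistency check between the EFIT-based calculation and the polarimeter-interferometer measurement rests on the line-integral dependence of the Faraday rotation angle on density and parallel field; the commonly quoted form (not necessarily the exact expression used on J-TEXT) is

    \alpha \approx 2.62\times10^{-13}\,\lambda^{2}\int n_e\,B_{\parallel}\,\mathrm{d}l

with α in radians, λ in m, n_e in m^-3, B_parallel in T and the path length in m.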
Wang, Yadong; Li, Xiangrui; Yuan, Yiwen; Patel, Mahomed S
2014-01-01
To describe an innovative approach for developing and implementing an in-service curriculum in China for staff of the newly established health emergency response offices (HEROs), an approach that is generalisable to other settings. The multi-method training needs assessment included reviews of the competency domains needed to implement the International Health Regulations (2005) as well as China's policies and emergency regulations. The review, iterative interviews and workshops with experts in government, academia, the military, and with HERO staff were assessed critically by an expert technical advisory panel. Over 1600 participants contributed to curriculum development. Of the 18 competency domains identified as essential for HERO staff, nine were developed into priority in-service training modules to be conducted over 2.5 weeks. Experts from academia and experienced practitioners prepared and delivered each module through lectures followed by interactive problem-solving exercises and desktop simulations to help trainees apply, experiment with, and consolidate newly acquired knowledge and skills. This study adds to the emerging literature on China's enduring efforts to strengthen its emergency response capabilities since the outbreak of SARS in 2003. The multi-method approach to curriculum development in partnership with senior policy-makers, researchers, and experienced practitioners can be applied in other settings to ensure training is responsive and customized to local needs, resources and priorities. Ongoing curriculum development should reflect international standards and be coupled with the development of appropriate performance support systems at the workplace for motivating staff to apply their newly acquired knowledge and skills effectively and creatively.
Analysis of transient fission gas behaviour in oxide fuel using BISON and TRANSURANUS
Barani, T.; Bruschi, E.; Pizzocri, D.; ...
2017-01-03
The modelling of fission gas behaviour is a crucial aspect of nuclear fuel analysis in view of the related effects on the thermo-mechanical performance of the fuel rod, which can be particularly significant during transients. Experimental observations indicate that substantial fission gas release (FGR) can occur on a small time scale during transients (burst release). To accurately reproduce the rapid kinetics of burst release in fuel performance calculations, a model that accounts for non-diffusional mechanisms such as fuel micro-cracking is needed. In this work, we present and assess a model for transient fission gas behaviour in oxide fuel, which is applied as an extension of diffusion-based models to allow for the burst release effect. The concept and governing equations of the model are presented, and the effect of the newly introduced parameters is evaluated through an analytic sensitivity analysis. Then, the model is assessed for application to integral fuel rod analysis. The approach that we take for model assessment involves implementation in two structurally different fuel performance codes, namely, BISON (multi-dimensional finite element code) and TRANSURANUS (1.5D semi-analytic code). The model is validated against 19 Light Water Reactor fuel rod irradiation experiments from the OECD/NEA IFPE (International Fuel Performance Experiments) database, all of which are simulated with both codes. The results point out an improvement in both the qualitative representation of the FGR kinetics and the quantitative predictions of integral fuel rod FGR, relative to the canonical, purely diffusion-based models, with both codes. The overall quantitative improvement of the FGR predictions in the two codes is comparable. Furthermore, calculated radial profiles of xenon concentration are investigated and compared to experimental data, demonstrating the representation of the underlying mechanisms of burst release by the new model.
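For context, the "canonical, purely diffusion-based" baseline that the burst-release extension builds on is usually the single-atom intragranular diffusion equation in an idealized spherical grain (Booth-type); it is quoted here only as a hedged reference, since the burst term adds a separate micro-cracking release channel not shown:

    \frac{\partial c}{\partial t} = D\,\frac{1}{r^{2}}\frac{\partial}{\partial r}\!\left(r^{2}\frac{\partial c}{\partial r}\right) + \beta

where c is the intragranular gas concentration, D the single-atom diffusion coefficient and β the gas production rate.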
NASA Astrophysics Data System (ADS)
Ferres, Lynn; Stahl, Wolfgang; Nguyen, Ha Vinh Lam
2018-03-01
The microwave spectrum of m-methylanisole (also known as 3-methylanisole, or 3-methoxytoluene) was measured using a pulsed molecular jet Fourier transform microwave spectrometer operating in the frequency range of 2-26.5 GHz. Quantum chemical calculations predicted two conformers with the methoxy group in trans or cis position related to the ring methyl group, both of which were assigned in the experimental spectrum. Due to the internal rotation of the ring methyl group, all rotational transitions introduced large A-E splittings up to several GHz, which were analyzed with a newly developed program, called aixPAM, working in the principal axis system. There are significant differences in the V3 potential barriers of 55.7693(90) cm-1 and 36.6342(84) cm-1 determined by fitting 223 and 320 torsional components of the cis and the trans conformer, respectively. These values were compared with those found in other m-substituted toluenes as well as in o- and p-methylanisole. A comparison between the aixPAM and the XIAM code (using a combined axis system) was also performed.
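The quoted V3 values are the leading coefficient of the standard threefold potential hindering methyl internal rotation; in the usual truncated form (higher-order terms such as V6 neglected, quoted here as a hedged reference)

    V(\alpha) = \frac{V_{3}}{2}\,\bigl(1 - \cos 3\alpha\bigr)

where α is the internal rotation angle of the methyl top.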
KIRMES: kernel-based identification of regulatory modules in euchromatic sequences.
Schultheiss, Sebastian J; Busch, Wolfgang; Lohmann, Jan U; Kohlbacher, Oliver; Rätsch, Gunnar
2009-08-15
Understanding transcriptional regulation is one of the main challenges in computational biology. An important problem is the identification of transcription factor (TF) binding sites in promoter regions of potential TF target genes. It is typically approached by position weight matrix-based motif identification algorithms using Gibbs sampling, or heuristics to extend seed oligos. Such algorithms succeed in identifying single, relatively well-conserved binding sites, but tend to fail when it comes to the identification of combinations of several degenerate binding sites, as those often found in cis-regulatory modules. We propose a new algorithm that combines the benefits of existing motif finding with the ones of support vector machines (SVMs) to find degenerate motifs in order to improve the modeling of regulatory modules. In experiments on microarray data from Arabidopsis thaliana, we were able to show that the newly developed strategy significantly improves the recognition of TF targets. The python source code (open source-licensed under GPL), the data for the experiments and a Galaxy-based web service are available at http://www.fml.mpg.de/raetsch/suppl/kirmes/.
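The general idea of coupling sequence features to an SVM can be illustrated with a toy k-mer-count classifier; this is a hedged sketch with invented sequences and labels, not the KIRMES kernel or code.

    # Toy k-mer-count features + linear SVM (illustrative only).
    from itertools import product
    from sklearn.svm import SVC

    K = 3
    KMERS = ["".join(p) for p in product("ACGT", repeat=K)]

    def kmer_counts(seq):
        return [sum(1 for i in range(len(seq) - K + 1) if seq[i:i + K] == km)
                for km in KMERS]

    seqs = ["ACGTACGTAC", "ACGTTTGTAC", "GGGCCCGGGC", "GGCCGGCCGG"]   # invented
    labels = [1, 1, 0, 0]                                             # 1 = putative target
    clf = SVC(kernel="linear").fit([kmer_counts(s) for s in seqs], labels)
    print(clf.predict([kmer_counts("ACGTACGTTT")]))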
Han, Heeyoung; Papireddy, Muralidhar Reddy; Hingle, Susan T; Ferguson, Jacqueline Anne; Koschmann, Timothy; Sandstrom, Steve
2018-07-01
Individualized structured feedback is an integral part of a resident's learning in communication skills. However, it is not clear what feedback residents receive for their communication skills development in real patient care. We will identify the most common feedback topics given to residents regarding communication skills during Internal Medicine residency training. We analyzed Resident Audio-recording Project feedback data from 2008 to 2013 by using a content analysis approach. Using open coding and an iterative categorization process, we identified 15 emerging themes for both positive and negative feedback. The most recurrent feedback topics were Patient education, Thoroughness, Organization, Questioning strategy, and Management. The residents were guided to improve their communication skills regarding Patient education, Thoroughness, Management, and Holistic exploration of patient's problem. Thoroughness and Communication intelligibility were newly identified themes that were rarely discussed in existing frameworks. Assessment rubrics serve as a lens through which we assess the adequacy of the residents' communication skills. Rather than sticking to a specific rubric, we chose to let the rubric evolve through our experience.
NASA Astrophysics Data System (ADS)
Baykiev, Eldar; Ebbing, Jörg; Brönner, Marco; Fabian, Karl
2016-11-01
A newly developed software package to calculate the magnetic field in a spherical coordinate system near the Earth's surface and on satellite height is shown to produce reliable modeling results for global and regional applications. The discretization cells of the model are uniformly magnetized spherical prisms, so called tesseroids. The presented algorithm extends an existing code for gravity calculations by applying Poisson's relation to identify the magnetic potential with the sum over pseudogravity fields of tesseroids. By testing different lithosphere discretization grids it is possible to determine the optimal size of tesseroids for field calculations on satellite altitude within realistic measurement error bounds. Also the influence of the Earth's ellipticity upon the modeling result is estimated and global examples are studied. The new software calculates induced and remanent magnetic fields for models at global and regional scale. For regional models far-field effects are evaluated and discussed. This provides bounds for the minimal size of a regional model that is necessary to predict meaningful satellite total field anomalies over the corresponding area.
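As a much cruder stand-in for the tesseroid/pseudogravity forward model described above, each magnetized cell can be approximated by a point dipole and the fields summed at the observation point; positions and moments below are invented, and this is not the published algorithm.

    # Point-dipole approximation of magnetized cells (crude stand-in, invented values).
    import numpy as np

    MU0_4PI = 1e-7  # mu0 / (4*pi) in T*m/A

    def dipole_field(r_obs, r_dip, m):
        r = np.asarray(r_obs, float) - np.asarray(r_dip, float)
        d = np.linalg.norm(r)
        rhat = r / d
        return MU0_4PI * (3 * rhat * np.dot(m, rhat) - np.asarray(m, float)) / d**3

    cells = [((0.0, 0.0, -10e3), (0.0, 0.0, 1e12)),    # (position [m], moment [A*m^2])
             ((5e3, 0.0, -12e3), (0.0, 0.0, 8e11))]
    obs = (0.0, 0.0, 400e3)                            # roughly satellite altitude
    B = sum(dipole_field(obs, pos, mom) for pos, mom in cells)
    print(B * 1e9, "nT")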
Alternative Splicing as a Target for Cancer Treatment.
Martinez-Montiel, Nancy; Rosas-Murrieta, Nora Hilda; Anaya Ruiz, Maricruz; Monjaraz-Guzman, Eduardo; Martinez-Contreras, Rebeca
2018-02-11
Alternative splicing is a key mechanism determinant for gene expression in metazoan. During alternative splicing, non-coding sequences are removed to generate different mature messenger RNAs due to a combination of sequence elements and cellular factors that contribute to splicing regulation. A different combination of splicing sites, exonic or intronic sequences, mutually exclusive exons or retained introns could be selected during alternative splicing to generate different mature mRNAs that could in turn produce distinct protein products. Alternative splicing is the main source of protein diversity responsible for 90% of human gene expression, and it has recently become a hallmark for cancer with a full potential as a prognostic and therapeutic tool. Currently, more than 15,000 alternative splicing events have been associated to different aspects of cancer biology, including cell proliferation and invasion, apoptosis resistance and susceptibility to different chemotherapeutic drugs. Here, we present well established and newly discovered splicing events that occur in different cancer-related genes, their modification by several approaches and the current status of key tools developed to target alternative splicing with diagnostic and therapeutic purposes.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yu, Kuang; Libisch, Florian; Carter, Emily A., E-mail: eac@princeton.edu
We report a new implementation of the density functional embedding theory (DFET) in the VASP code, using the projector-augmented-wave (PAW) formalism. Newly developed algorithms allow us to efficiently perform optimized effective potential optimizations within PAW. The new algorithm generates robust and physically correct embedding potentials, as we verified using several test systems including a covalently bound molecule, a metal surface, and bulk semiconductors. We show that with the resulting embedding potential, embedded cluster models can reproduce the electronic structure of point defects in bulk semiconductors, thereby demonstrating the validity of DFET in semiconductors for the first time. Compared to our previous version, the new implementation of DFET within VASP affords use of all features of VASP (e.g., a systematic PAW library, a wide selection of functionals, a more flexible choice of U correction formalisms, and faster computational speed) with DFET. Furthermore, our results are fairly robust with respect to both plane-wave and Gaussian type orbital basis sets in the embedded cluster calculations. This suggests that the density functional embedding method is potentially an accurate and efficient way to study properties of isolated defects in semiconductors.
Status and plans for the future of the Vienna VLBI Software
NASA Astrophysics Data System (ADS)
Madzak, Matthias; Böhm, Johannes; Böhm, Sigrid; Girdiuk, Anastasiia; Hellerschmied, Andreas; Hofmeister, Armin; Krasna, Hana; Kwak, Younghee; Landskron, Daniel; Mayer, David; McCallum, Jamie; Plank, Lucia; Schönberger, Caroline; Shabala, Stanislav; Sun, Jing; Teke, Kamil
2016-04-01
The Vienna VLBI Software (VieVS) is a VLBI analysis software developed and maintained at Technische Universität Wien (TU Wien) since 2008 with contributions from groups all over the world. It is used for both academic purposes in university courses as well as for providing VLBI analysis results to the geodetic community. Written in a modular structure in Matlab, VieVS offers easy access to the source code and the possibility to adapt the programs for particular purposes. The new version 2.3, released in December 2015, includes several new parameters to be estimated in the global solution, such as tidal ERP variation coefficients. The graphical user interface was slightly modified for an improved user functionality and, e.g., the possibility of deriving baseline length repeatabilities. The scheduling of satellite observations was refined, the simulator newly includes the effect of source structure which can also be corrected for in the analysis. This poster gives an overview of all VLBI-related activities in Vienna and provides an outlook to future plans concerning the Vienna VLBI Software.
A diagnostic for quantifying heat flux from a thermite spray
DOE Office of Scientific and Technical Information (OSTI.GOV)
E. P. Nixon; M. L. Pantoya; D. J. Prentice
2010-02-01
Characterizing the combustion behaviors of energetic materials requires diagnostic tools that are often not readily or commercially available. For example, a jet of thermite spray provides a high temperature and pressure reaction that can also be highly corrosive and promote undesirable conditions for the survivability of any sensor. Developing a diagnostic to quantify heat flux from a thermite spray is the objective of this study. Quick response sensors such as thin film heat flux sensors cannot survive the harsh conditions of the spray, but more rugged sensors lack the response time for the resolution desired. A sensor that will allow for adequate response time while surviving the entire test duration was constructed. The sensor outputs interior temperatures of the probes at known locations and utilizes an inverse heat conduction code to calculate heat flux values. The details of this device are discussed and illustrated. Temperature and heat flux measurements of various thermite sprays are reported. Results indicate that this newly designed heat flux sensor provides quantitative data with good repeatability suitable for characterizing energetic material combustion.
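The inverse heat conduction code itself is not described here; as a hedged, quasi-steady simplification, the heat flux through the probe can be estimated from two interior temperatures at known depths via Fourier's law (all values below are invented).

    # Quasi-steady simplification, not the inverse-conduction code: Fourier's law
    # between two interior thermocouples a known distance apart.
    def heat_flux(k, t_shallow, t_deep, dx):
        """k in W/(m*K), temperatures in consistent units, dx in m -> q in W/m^2."""
        return k * (t_shallow - t_deep) / dx

    q = heat_flux(k=16.0, t_shallow=870.0, t_deep=520.0, dx=0.003)  # invented values
    print(f"q ~ {q / 1e6:.2f} MW/m^2")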
An ethnographic study of differentiated practice in an operating room.
Graff, C; Roberts, K; Thornton, K
1999-01-01
An ethnographic study was conducted to investigate implementation of the clinical nurse III or team leader (TL) role as part of a newly executed nursing differentiated practice model. The six TLs studied were employed in the operating room (OR). Through participant observation, interviews, and document analysis, the TL role--as well as perceptions of the role by the TLs and OR staff--were studied. Problems related to performance of the role and its evolutionary process were delineated. Data analysis involved identifying categories and subcategories of data and developing a coding system to identify themes. Salient themes were related to the culture of the OR. Because of the OR's highly technical environment, the TLs defined their roles in relation to the organizational and technical needs of their surgical service. Refinement of surgeon "preference cards" and "instrument count sheets" was considered the initial priority for the TLs. Various controllable and uncontrollable factors were identified that affected implementation of the new TL role. Findings suggest that introduction of the role requires insight into setting and an emphasis on staging and orientation of employees to the new role.
Reproducible Computing: a new Technology for Statistics Education and Educational Research
NASA Astrophysics Data System (ADS)
Wessa, Patrick
2009-05-01
This paper explains how the R Framework (http://www.wessa.net) and a newly developed Compendium Platform (http://www.freestatistics.org) allow us to create, use, and maintain documents that contain empirical research results which can be recomputed and reused in derived work. It is illustrated that this technological innovation can be used to create educational applications that can be shown to support effective learning of statistics and associated analytical skills. It is explained how a Compendium can be created by anyone, without the need to understand the technicalities of scientific word processing (LaTeX) or statistical computing (R code). The proposed Reproducible Computing system allows educational researchers to objectively measure key aspects of the actual learning process based on individual and constructivist activities such as: peer review, collaboration in research, computational experimentation, etc. The system was implemented and tested in three statistics courses in which Compendia were used to create an interactive e-learning environment that simulated the real-world process of empirical scientific research.
Mi-DISCOVERER: A bioinformatics tool for the detection of mi-RNA in human genome.
Arshad, Saadia; Mumtaz, Asia; Ahmad, Freed; Liaquat, Sadia; Nadeem, Shahid; Mehboob, Shahid; Afzal, Muhammad
2010-11-27
MicroRNAs (miRNAs) are 22-nucleotide non-coding RNAs that play pivotal regulatory roles in diverse organisms, including humans, and are difficult to identify due to the lack of either sequence features or robust algorithms for efficient identification. Therefore, we developed a tool, Mi-Discoverer, for the detection of miRNAs in the human genome. The tools used for the development of the software were Microsoft Office Access 2003, the JDK version 1.6.0, BioJava version 1.0, and the NetBeans IDE version 6.0. All previously available miRNA software tools were web based, so the advantage of our project was to offer the user a desktop facility for sequence alignment searches against already identified miRNAs of the human genome present in the database. The user can also insert and update newly discovered human miRNAs in the database. Mi-Discoverer, a bioinformatics tool, successfully identifies human miRNAs based on multiple sequence alignment searches. It is a non-redundant database containing a large collection of publicly available human miRNAs.
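The kind of lookup the tool performs, matching a query against a local collection of known human miRNA sequences, can be sketched as below; the database entries are invented placeholders, and the sketch is written in Python rather than the tool's Java/BioJava stack.

    # Toy similarity search against a local miRNA collection (invented entries).
    from difflib import SequenceMatcher

    known_mirnas = {
        "hsa-miR-example-1": "UGAGGUAGUAGGUUGUACAGUU",
        "hsa-miR-example-2": "UAGCUUAUCAGACUGAUGUUGA",
    }

    def best_match(query, database, min_ratio=0.85):
        hits = [(name, SequenceMatcher(None, query, seq).ratio())
                for name, seq in database.items()]
        hits = [(name, ratio) for name, ratio in hits if ratio >= min_ratio]
        return max(hits, key=lambda h: h[1]) if hits else None

    print(best_match("UGAGGUAGUAGGUUGUACAGUA", known_mirnas))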
NASA Astrophysics Data System (ADS)
Cao, Duc; Moses, Gregory; Delettrez, Jacques
2015-08-01
An implicit, non-local thermal conduction algorithm based on the algorithm developed by Schurtz, Nicolai, and Busquet (SNB) [Schurtz et al., Phys. Plasmas 7, 4238 (2000)] for non-local electron transport is presented and has been implemented in the radiation-hydrodynamics code DRACO. To study the model's effect on DRACO's predictive capability, simulations of shot 60 303 from OMEGA are completed using the iSNB model, and the computed shock speed vs. time is compared to experiment. Temperature outputs from the iSNB model are compared with the non-local transport model of Goncharov et al. [Phys. Plasmas 13, 012702 (2006)]. Effects on adiabat are also examined in a polar drive surrogate simulation. Results show that the iSNB model is not only capable of flux-limitation but also preheat prediction while remaining numerically robust and sacrificing little computational speed. Additionally, the results provide strong incentive to further modify key parameters within the SNB theory, namely, the newly introduced non-local mean free path. This research was supported by the Laboratory for Laser Energetics of the University of Rochester.
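The flux-limitation capability mentioned above is conventionally summarized by capping the Spitzer-Härm heat flux at a fraction f of the free-streaming value; the iSNB model itself is a non-local convolution rather than this sharp cutoff, so the following is quoted only as the reference form:

    q = \min\bigl(|q_{\mathrm{SH}}|,\; f\,n_e k_B T_e v_{te}\bigr), \qquad q_{\mathrm{SH}} = -\kappa_{\mathrm{SH}}\nabla T_e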
Enhancement and Validation of an Arab Surname Database
Schwartz, Kendra; Beebani, Ganj; Sedki, Mai; Tahhan, Mamon; Ruterbusch, Julie J.
2015-01-01
Objectives Arab Americans constitute a large, heterogeneous, and quickly growing subpopulation in the United States. Health statistics for this group are difficult to find because US governmental offices do not recognize Arab as separate from white. The development and validation of an Arab- and Chaldean-American name database will enhance research efforts in this population subgroup. Methods A previously validated name database was supplemented with newly identified names gathered primarily from vital statistic records and then evaluated using a multistep process. This process included 1) review by 4 Arabic- and Chaldean-speaking reviewers, 2) ethnicity assessment by social media searches, and 3) self-report of ancestry obtained from a telephone survey. Results Our Arab- and Chaldean-American name algorithm has a positive predictive value of 91% and a negative predictive value of 100%. Conclusions This enhanced name database and algorithm can be used to identify Arab Americans in health statistics data, such as cancer and hospital registries, where they are often coded as white, to determine the extent of health disparities in this population. PMID:24625771
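The reported predictive values follow the usual confusion-matrix definitions; as a worked illustration with invented counts:

    \mathrm{PPV} = \frac{TP}{TP + FP}, \qquad \mathrm{NPV} = \frac{TN}{TN + FN};
    \quad \text{e.g. } TP = 91,\ FP = 9 \;\Rightarrow\; \mathrm{PPV} = \frac{91}{100} = 91\%.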
Comparison of Models for Ball Bearing Dynamic Capacity and Life
NASA Technical Reports Server (NTRS)
Gupta, Pradeep K.; Oswald, Fred B.; Zaretsky, Erwin V.
2015-01-01
Generalized formulations for dynamic capacity and life of ball bearings, based on the models introduced by Lundberg and Palmgren and Zaretsky, have been developed and implemented in the bearing dynamics computer code, ADORE. Unlike the original Lundberg-Palmgren dynamic capacity equation, where the elastic properties are part of the life constant, the generalized formulations permit variation of elastic properties of the interacting materials. The newly updated Lundberg-Palmgren model allows prediction of life as a function of elastic properties. For elastic properties similar to those of AISI 52100 bearing steel, both the original and updated Lundberg-Palmgren models provide identical results. A comparison between the Lundberg-Palmgren and the Zaretsky models shows that at relatively light loads the Zaretsky model predicts a much higher life than the Lundberg-Palmgren model. As the load increases, the Zaretsky model provides a much faster drop off in life. This is because the Zaretsky model is much more sensitive to load than the Lundberg-Palmgren model. The generalized implementation where all model parameters can be varied provides an effective tool for future model validation and enhancement in bearing life prediction capabilities.
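The load sensitivity contrast described above traces back to the basic load-life relation; in the classical Lundberg-Palmgren form for ball bearings (quoted as a hedged reference)

    L_{10} = \left(\frac{C_D}{P_{\mathrm{eq}}}\right)^{p}, \qquad p = 3 \ \text{for ball bearings},

where C_D is the dynamic capacity and P_eq the equivalent load; Zaretsky-type models effectively apply a larger load-life exponent, consistent with the faster drop-off in life at heavy load noted above.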
Development of probabilistic multimedia multipathway computer codes.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yu, C.; LePoire, D.; Gnanapragasam, E.
2002-01-01
The deterministic multimedia dose/risk assessment codes RESRAD and RESRAD-BUILD have been widely used for many years for evaluation of sites contaminated with residual radioactive materials. The RESRAD code applies to the cleanup of sites (soils) and the RESRAD-BUILD code applies to the cleanup of buildings and structures. This work describes the procedure used to enhance the deterministic RESRAD and RESRAD-BUILD codes for probabilistic dose analysis. A six-step procedure was used in developing default parameter distributions and the probabilistic analysis modules. These six steps include (1) listing and categorizing parameters; (2) ranking parameters; (3) developing parameter distributions; (4) testing parameter distributions for probabilistic analysis; (5) developing probabilistic software modules; and (6) testing probabilistic modules and integrated codes. The procedures used can be applied to the development of other multimedia probabilistic codes. The probabilistic versions of RESRAD and RESRAD-BUILD codes provide tools for studying the uncertainty in dose assessment caused by uncertain input parameters. The parameter distribution data collected in this work can also be applied to other multimedia assessment tasks and multimedia computer codes.
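Steps (3)-(6) amount to wrapping the deterministic calculation in repeated sampling of the input parameter distributions; the placeholder sketch below illustrates only that pattern; the dose function and distributions are invented and are not RESRAD's.

    # Generic Monte Carlo wrapper around a deterministic model (placeholder only).
    import random
    import statistics

    def dose_model(soil_conc, ingestion_rate, dilution):
        # Stand-in deterministic calculation, NOT the RESRAD equations.
        return soil_conc * ingestion_rate / dilution

    def sample_inputs():
        return dict(soil_conc=random.lognormvariate(0.0, 0.5),
                    ingestion_rate=random.uniform(0.5, 1.5),
                    dilution=random.triangular(5.0, 20.0, 10.0))

    doses = [dose_model(**sample_inputs()) for _ in range(10000)]
    print("mean:", statistics.mean(doses),
          "95th percentile:", statistics.quantiles(doses, n=20)[-1])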
Costi, Cintia; Grandi, Tarciana; Halon, Maria Laura; Silva, Márcia Susana Nunes; Silva, Cláudia Maria Dornelles da; Gregianini, Tatiana Schäffer; Possuelo, Lia Gonçalves; Jarczewski, Carla Adriane; Niel, Christian; Rossetti, Maria Lucia Rosa
2017-04-01
Porto Alegre is the Brazilian state capital with second highest incidence of tuberculosis (TB) and the highest proportion of people infected with human immunodeficiency virus (HIV) among patients with TB. Hepatitis C virus (HCV) infection increases the risk of anti-TB drug-induced hepatotoxicity, which may result in discontinuation of the therapy. The aim of this study was (i) to estimate prevalence of HCV and HIV in a group of patients newly diagnosed with active TB in a public reference hospital in Porto Alegre and (ii) to compare demographic, behavioural, and clinical characteristics of patients in relation to their HCV infection status. One hundred and thirty-eight patients with TB were tested for anti-HCV antibody, HCV RNA, and anti-HIV1/2 antibody markers. HCV RNA from real-time polymerase chain reaction (PCR)-positive samples was submitted to reverse transcription and PCR amplification. The 5' non-coding region of the HCV genome was sequenced, and genotypes of HCV isolates were determined. Anti-HCV antibody, HCV RNA, and anti-HIV antibodies were detected in 27 [20%; 95% confidence interval (CI), 13-26%], 17 (12%; 95% CI, 7-18%), and 34 (25%; 95% CI, 17-32%) patients, respectively. HCV isolates belonged to genotypes 1 (n = 12) and 3 (n = 4). Some characteristics were significantly more frequent in patients infected with HCV. Among them, non-white individuals, alcoholics, users of illicit drugs, imprisoned individuals, and those with history of previous TB episode were more commonly infected with HCV (p < 0.05). HCV screening, including detection of anti-HCV antibody and HCV RNA, will be important to improving the management of co-infected patients, given their increased risk of developing TB treatment-related hepatotoxicity.
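The quoted confidence intervals are consistent with the usual normal-approximation interval for a proportion; as a worked check for the anti-HCV prevalence of 27/138:

    \hat p = \frac{27}{138} \approx 0.196, \qquad
    \hat p \pm 1.96\sqrt{\frac{\hat p(1-\hat p)}{n}}
      = 0.196 \pm 1.96\sqrt{\frac{0.196 \times 0.804}{138}}
      \approx 0.196 \pm 0.066 \;\Rightarrow\; (13\%,\ 26\%).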
How Can Online Discussion Support and Develop Newly Qualified Teachers? Research Briefing No. 51
ERIC Educational Resources Information Center
Unwin, Adam
2013-01-01
This research investigated newly qualified teachers' (NQTs') experiences of participating in online discussions (ODs) that were part of their Master of Teaching (MTeach) course. [The project was partially funded by the Excellence in Work-Based Learning for Education Professionals (WLE) Centre in 2009.]
Federal Register 2010, 2011, 2012, 2013, 2014
2013-04-19
...: Notice. SUMMARY: The DOE participates in the code development process of the International Code Council... notice outlines the process by which DOE produces code change proposals, and participates in the ICC code development process. FOR FURTHER INFORMATION CONTACT: Jeremiah Williams, U.S. Department of Energy, Office of...
Modulation and coding for satellite and space communications
NASA Technical Reports Server (NTRS)
Yuen, Joseph H.; Simon, Marvin K.; Pollara, Fabrizio; Divsalar, Dariush; Miller, Warner H.; Morakis, James C.; Ryan, Carl R.
1990-01-01
Several modulation and coding advances supported by NASA are summarized. A VLSI maximum-likelihood decoder utilizing parallel processing techniques, being developed to decode long-constraint-length convolutional codes with constraint length 15 and code rates as low as 1/6, is discussed. A VLSI high-speed 8-bit Reed-Solomon decoder being developed for advanced tracking and data relay satellite (ATDRS) applications is discussed. A 300-Mb/s modem with continuous phase modulation (CPM) and coding being developed for ATDRS is discussed. Trellis-coded modulation (TCM) techniques are discussed for satellite-based mobile communication applications.
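For readers unfamiliar with convolutional coding, the toy encoder below shows the shift-register structure that a maximum-likelihood (Viterbi) decoder operates on. It uses a rate-1/2, constraint-length-3 code with the standard textbook generator polynomials (7, 5) octal, far smaller than the constraint-length-15, rate-1/6 code targeted by the VLSI decoder described above.

```python
# Toy rate-1/2, constraint-length-3 convolutional encoder (illustrative only).

def conv_encode(bits, generators=(0b111, 0b101), constraint_length=3):
    state = 0  # shift register holding the most recent input bits
    out = []
    for b in bits:
        state = ((state << 1) | b) & ((1 << constraint_length) - 1)
        for g in generators:
            out.append(bin(state & g).count("1") % 2)  # parity of tapped bits
    return out

message = [1, 0, 1, 1, 0, 0, 1]
print(conv_encode(message))  # two coded bits per message bit (rate 1/2)
```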
Computing border bases using mutant strategies
NASA Astrophysics Data System (ADS)
Ullah, E.; Abbas Khan, S.
2014-01-01
Border bases, a generalization of Gröbner bases, have been actively studied in recent years due to their applicability to industrial problems. In cryptography and coding theory a useful application of border bases is to solve zero-dimensional systems of polynomial equations over finite fields, which motivates the development of optimizations of the algorithms that compute border bases. In 2006, Kehrein and Kreuzer formulated the Border Basis Algorithm (BBA), an algorithm that computes border bases relative to a degree-compatible term ordering. In 2007, J. Ding et al. introduced mutant strategies based on finding special lower-degree polynomials in the ideal. The mutant strategies aim to distinguish these special lower-degree polynomials (mutants) from the other polynomials and give them priority in the process of generating new polynomials in the ideal. In this paper we develop hybrid algorithms that use the ideas of J. Ding et al. involving the concept of mutants to optimize the Border Basis Algorithm for solving systems of polynomial equations over finite fields. In particular, we recall a version of the Border Basis Algorithm known as the Improved Border Basis Algorithm and propose two hybrid algorithms, called MBBA and IMBBA. The new mutant variants provide space efficiency as well as time efficiency. The efficiency of these newly developed hybrid algorithms is discussed using standard cryptographic examples.
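The motivating application, solving a zero-dimensional polynomial system over a finite field, can be sanity-checked for tiny instances by exhaustive search, as in the sketch below. The system over GF(2) is an arbitrary toy example, not one of the cryptographic benchmarks used in the paper, and no border or Gröbner basis computation is performed here.

```python
# Brute-force solution of a tiny polynomial system over GF(2); useful only as
# a correctness check against an algebraic (border/Groebner basis) solver.
from itertools import product

p = 2  # field GF(2)
polys = [
    lambda x, y, z: (x * y + z + 1) % p,
    lambda x, y, z: (x + y * z) % p,
    lambda x, y, z: (x * z + y + x) % p,
]

solutions = [v for v in product(range(p), repeat=3)
             if all(f(*v) == 0 for f in polys)]
print(solutions)  # every common root of the system in GF(2)^3
```

Exhaustive search scales as p^n and is hopeless for realistic cryptographic systems, which is exactly why algebraic methods such as the BBA variants above matter.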
An Overview of Research and Design Activities at CTFusion
NASA Astrophysics Data System (ADS)
Sutherland, D. A.; Jarboe, T. R.; Hossack, A. C.
2016-10-01
CTFusion, a newly formed company dedicated to the development of compact toroidal fusion energy, is a spin-off from the University of Washington that will build upon the successes of the HIT-SI research program. The mission of the company is to develop net-gain fusion power cores that will serve as the heart of economical fusion power plants or radioactive-waste-destroying burner reactors. The overarching vision and development plan of the company will be presented, along with a detailed justification and design for our next device, the HIT-TD (Technology Demonstration) prototype. By externally driving the edge current and imposing non-axisymmetric magnetic perturbations, HIT-TD should demonstrate the sustainment of stable spheromak configurations with Imposed-Dynamo Current Drive (IDCD), as was accomplished in the HIT-SI device, with higher current gains and temperatures than previously possible. HIT-TD, if successful, will be an instrumental step along the path to economical fusion energy and will serve as the stepping stone to our Proof-of-Principle device (HIT-PoP). Beyond the implications of higher-performance, sustained spheromaks for fusion applications, the HIT-TD platform will provide a unique system for observing plasma self-organization phenomena of interest for other fusion devices and for astrophysical systems. Lastly, preliminary nuclear engineering design simulations with the MCNP6 code of the HIT-FNSF (Fusion Nuclear Science Facility) device will be presented.
Douzery, Emmanuel J P; Scornavacca, Celine; Romiguier, Jonathan; Belkhir, Khalid; Galtier, Nicolas; Delsuc, Frédéric; Ranwez, Vincent
2014-07-01
Comparative genomic studies extensively rely on alignments of orthologous sequences. Yet selecting, gathering, and aligning orthologous exons and protein-coding sequences (CDS) that are relevant for a given evolutionary analysis can be a difficult and time-consuming task. In this context, we developed OrthoMaM, a database of ORTHOlogous MAmmalian Markers describing the evolutionary dynamics of orthologous genes in mammalian genomes using a phylogenetic framework. Since its first release in 2007, OrthoMaM has regularly evolved, not only to include newly available genomes but also to incorporate up-to-date software in its analytic pipeline. This eighth release integrates the 40 complete mammalian genomes available in Ensembl v73 and provides alignments, phylogenies, evolutionary descriptor information, and functional annotations for 13,404 single-copy orthologous CDS and 6,953 long exons. The graphical interface allows users to easily explore OrthoMaM and identify markers with specific characteristics (e.g., taxa availability, alignment size, %G+C, evolutionary rate, chromosome location). It hence provides an efficient solution for sampling preprocessed markers adapted to user-specific needs. OrthoMaM has proven to be a valuable resource for researchers interested in mammalian phylogenomics and evolutionary genomics, and it has served as a source of benchmark empirical data sets in several methodological studies. OrthoMaM is available for browsing, query, and complete or filtered downloads at http://www.orthomam.univ-montp2.fr/.
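One of the filter criteria mentioned above, %G+C, can also be recomputed locally from a downloaded alignment, as in the sketch below. The file name and plain-FASTA layout are assumptions for illustration, not a documented OrthoMaM interface.

```python
# Hedged sketch: compute %G+C per taxon from a hypothetical downloaded alignment.

def gc_percent(seq):
    seq = seq.upper().replace("-", "")          # ignore alignment gaps
    if not seq:
        return 0.0
    return 100.0 * sum(seq.count(b) for b in "GC") / len(seq)

def read_fasta(path):
    records, name, chunks = {}, None, []
    with open(path) as fh:
        for line in fh:
            line = line.strip()
            if line.startswith(">"):
                if name:
                    records[name] = "".join(chunks)
                name, chunks = line[1:], []
            else:
                chunks.append(line)
        if name:
            records[name] = "".join(chunks)
    return records

alignment = read_fasta("ENSG_example_cds.fasta")   # hypothetical downloaded file
for taxon, seq in alignment.items():
    print(taxon, f"{gc_percent(seq):.1f}% G+C")
```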
Ophthalmologist-patient communication, self-efficacy, and glaucoma medication adherence
Sleath, Betsy; Blalock, Susan J.; Carpenter, Delesha M.; Sayner, Robyn; Muir, Kelly W.; Slota, Catherine; Lawrence, Scott D.; Giangiacomo, Annette L.; Hartnett, Mary Elizabeth; Tudor, Gail; Goldsmith, Jason A.; Robin, Alan L.
2015-01-01
Objective: To examine the association between provider-patient communication, glaucoma medication adherence self-efficacy, outcome expectations, and glaucoma medication adherence. Design: Prospective observational cohort study. Participants: 279 patients with glaucoma who were newly prescribed or on glaucoma medications, recruited at six ophthalmology clinics. Methods: Patients' visits were video recorded, and communication variables were coded using a detailed coding tool developed by the authors. Adherence was measured using Medication Event Monitoring Systems for 60 days after the visits. Main outcome measures: The following adherence variables were measured for the 60-day period after the visits: whether the patient took 80% or more of the prescribed doses, the percentage of correct prescribed doses taken each day, and the percentage of prescribed doses taken on time. Results: Higher glaucoma medication adherence self-efficacy was positively associated with better adherence on all three measures. African American race was negatively associated with the percentage of correct doses taken each day (beta = -0.16, p < 0.05) and with whether the patient took 80% or more of the prescribed doses (odds ratio = 0.37, 95% confidence interval 0.16, 0.86). Physician education about how to administer drops was positively associated with the percentage of correct doses taken each day (beta = 0.18, p < 0.01) and the percentage of prescribed doses taken on time (beta = 0.15, p < 0.05). Conclusions: These findings indicate that provider education about how to administer glaucoma drops and patient glaucoma medication adherence self-efficacy are positively associated with adherence. PMID:25542521
Dweep, Harsh; Sticht, Carsten; Pandey, Priyanka; Gretz, Norbert
2011-10-01
MicroRNAs are small, non-coding RNA molecules that can bind complementarily to the mRNA 3'-UTR region and regulate gene expression by transcriptional repression or induction of mRNA degradation. Increasing evidence suggests a new mechanism by which miRNAs may regulate target gene expression by binding in promoter and amino acid coding regions. Most of the existing databases on miRNAs are restricted to the mRNA 3'-UTR region. To address this issue, we present miRWalk, a comprehensive database on miRNAs that hosts predicted as well as validated miRNA binding sites and information on all known genes of human, mouse, and rat. All mRNAs, mitochondrial genes, and 10 kb upstream flanking regions of all known genes of human, mouse, and rat were analyzed using a newly developed algorithm named 'miRWalk' as well as eight already established programs for putative miRNA binding sites. An automated and extensive text-mining search was performed on the PubMed database to extract validated information on miRNAs. The combined information was put into a MySQL database. miRWalk presents predicted and validated information on miRNA-target interactions. Such a resource enables researchers to validate new targets of miRNAs not only in the 3'-UTR but also in the other regions of all known genes. The 'Validated Target module' is updated every month and the 'Predicted Target module' is updated every 6 months. miRWalk is freely available at http://mirwalk.uni-hd.de/.
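A much-simplified illustration of the core idea behind miRNA target-site prediction: search a candidate region for the reverse complement of the miRNA seed (nucleotides 2-8). Real prediction programs, including miRWalk's own algorithm and the eight established programs it aggregates, use far richer models; the sequences below are examples made up for this sketch.

```python
# Toy seed-match scan: find reverse-complement matches of an miRNA seed in DNA.

def revcomp(seq):
    """Reverse complement of an RNA seed, returned as DNA."""
    return seq.translate(str.maketrans("ACGU", "UGCA"))[::-1].replace("U", "T")

def seed_sites(mirna, target_dna):
    seed = mirna[1:8]                      # positions 2-8 of the miRNA (5'->3')
    motif = revcomp(seed)                  # what the target DNA must contain
    return [i for i in range(len(target_dna) - len(motif) + 1)
            if target_dna[i:i + len(motif)] == motif]

mirna_seq = "UAGCUUAUCAGACUGAUGUUGA"       # hsa-miR-21-5p-like sequence (example)
utr_seq   = "GGCATAAGCTACCCTTTAAGCTATTT"   # made-up 3'-UTR fragment
print(seed_sites(mirna_seq, utr_seq))      # start positions of candidate sites
```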
Lee, Ya-Ling; Hu, Hsiao-Yun; Huang, Li-Ying; Chou, Pesus; Chu, Dachen
2017-09-01
OBJECTIVES: To determine the magnitude and temporal aspect of the effect of poor dental health and periodontal disease (PD) on dementia. DESIGN: Retrospective cohort study. SETTING: Taiwan National Health Insurance Research Database. PARTICIPANTS: Individuals with newly diagnosed PD (N = 182,747). MEASUREMENTS: Participants were followed from January 1, 2000, to December 31, 2010. Participants were assigned to dental prophylaxis, intensive periodontal treatment, tooth extraction, or no treatment, according to International Classification of Diseases codes and PD treatment codes. The incidence rate of dementia in the groups was compared. The association between PD and dementia was analyzed using Cox regression, with adjustments for age, sex, monthly income, residential urbanicity, and comorbidities. RESULTS: The incidence of dementia was significantly higher in the group with PD that did not receive treatment (0.76% per year) and in the group that had teeth extracted (0.57% per year) than in the group that underwent intensive PD treatment (0.35% per year) and the group that received dental prophylaxis (0.39% per year) (P < .001). After adjusting for confounders, the Cox proportional hazards model revealed a higher risk of dementia in the group with PD that did not undergo treatment (hazard ratio (HR) = 1.14, 95% confidence interval (CI) = 1.04-1.24) and the group that had teeth extracted (HR = 1.10, 95% CI = 1.04-1.16) than in the group that received dental prophylaxis. CONCLUSION: Subjects who had more severe PD or did not receive periodontal treatment were at greater risk of developing dementia.
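For readers who want to experiment with the type of analysis described, the sketch below fits a Cox proportional-hazards model to synthetic data using the lifelines Python package. The covariates, effect sizes, and follow-up times are all invented for illustration and bear no relation to the Taiwan NHI claims data used in the study.

```python
# Hedged sketch of a Cox proportional-hazards fit on synthetic data.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(0)
n = 2000
df = pd.DataFrame({
    "age": rng.normal(65, 8, n),
    "male": rng.integers(0, 2, n),
    "untreated_pd": rng.integers(0, 2, n),        # 1 = PD with no periodontal treatment
})
# Synthetic follow-up: hazard loosely increases with age and untreated PD.
baseline = rng.exponential(10, n)
years = baseline / np.exp(0.03 * (df["age"] - 65) + 0.15 * df["untreated_pd"])
df["years"] = np.clip(years, 0.1, 11.0)           # censor follow-up at 11 years
df["dementia"] = (years < 11.0).astype(int)       # event indicator

cph = CoxPHFitter()
cph.fit(df, duration_col="years", event_col="dementia")
cph.print_summary()   # hazard ratios with 95% confidence intervals
```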
Software engineering and automatic continuous verification of scientific software
NASA Astrophysics Data System (ADS)
Piggott, M. D.; Hill, J.; Farrell, P. E.; Kramer, S. C.; Wilson, C. R.; Ham, D.; Gorman, G. J.; Bond, T.
2011-12-01
Software engineering of scientific code is challenging for a number of reasons, including pressure to publish and a lack of awareness among scientists of the pitfalls of software engineering. The Applied Modelling and Computation Group at Imperial College is a diverse group of researchers that employs best-practice software engineering methods whilst developing open source scientific software. Our main code is Fluidity - a multi-purpose computational fluid dynamics (CFD) code that can be used for a wide range of scientific applications, from earth-scale mantle convection, through basin-scale ocean dynamics, to laboratory-scale classic CFD problems, and that is coupled to a number of other codes including nuclear radiation and solid modelling. Our software development infrastructure consists of a number of free tools that could be employed by any group that develops scientific code and has been developed over a number of years with many lessons learnt. A single code base is developed by over 30 people, for which we use Bazaar for revision control, making good use of its strong branching and merging capabilities. Features of Canonical's Launchpad platform, such as code review, blueprints for designing features, and bug reporting, give the group, its partners, and other Fluidity users an easy-to-use platform on which to collaborate, and allow new members of the group to be inducted into an environment where software development forms a central part of their work. The code repository is coupled to an automated test and verification system that performs over 20,000 tests, including unit tests, short regression tests, code verification, and large parallel tests. Included in these tests are build tests on HPC systems, including local and UK National HPC services. Testing code in this manner leads to a continuous verification process, not a discrete event performed once development has ceased. Much of the code verification is done via the "gold standard" of comparison to analytical solutions via the method of manufactured solutions. By developing and verifying code in tandem we avoid a number of pitfalls in scientific software development, and we advocate similar procedures for other scientific code applications.
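A minimal, self-contained example of verification by the method of manufactured solutions, applied to a toy 1-D Poisson solver rather than to Fluidity itself: pick an exact solution, derive the forcing term analytically, solve numerically, and confirm that the error shrinks at the expected rate under mesh refinement.

```python
# Manufactured-solution verification of a toy 1-D Poisson solver (not Fluidity).
# Manufactured solution u(x) = sin(pi x) implies forcing f = pi^2 sin(pi x)
# for -u'' = f with u(0) = u(1) = 0; a second-order scheme should show
# observed convergence order ~2 as the mesh is refined.
import numpy as np

def solve_poisson(n):
    x = np.linspace(0.0, 1.0, n + 1)
    h = 1.0 / n
    f = np.pi**2 * np.sin(np.pi * x[1:-1])            # manufactured forcing term
    # Tridiagonal matrix for -u'' with homogeneous Dirichlet boundary conditions.
    A = (np.diag(np.full(n - 1, 2.0)) +
         np.diag(np.full(n - 2, -1.0), 1) +
         np.diag(np.full(n - 2, -1.0), -1)) / h**2
    u = np.zeros(n + 1)
    u[1:-1] = np.linalg.solve(A, f)
    return np.max(np.abs(u - np.sin(np.pi * x)))      # error against exact solution

errors = list({n: solve_poisson(n) for n in (16, 32, 64, 128)}.items())
for (n1, e1), (n2, e2) in zip(errors, errors[1:]):
    print(f"n={n2:4d}  error={e2:.3e}  observed order={np.log2(e1 / e2):.2f}")
```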
Malik, Azhar H; Shimazoe, Kenji; Takahashi, Hiroyuki
2013-01-01
In order to obtain the plasma time activity curve (PTAC), the input function for almost all quantitative PET studies, patient blood is sampled manually from an artery or vein, which has various drawbacks. Recently, a novel compact Time-over-Threshold (ToT) based Pr:LuAG-APD animal PET tomograph was developed in our laboratory; it has 10% energy resolution, 4.2 ns time resolution, and 1.76 mm spatial resolution. The measured spatial resolution shows much promise for imaging blood vessels, i.e., an artery of diameter 2.3-2.4 mm, and hence for measuring the PTAC for quantitative PET studies. To find the measurement time required to obtain reasonable counts for image reconstruction, the most important parameter is the sensitivity of the system. Small animal PET systems are usually characterized using a point source in air. We used the Electron Gamma Shower 5 (EGS5) code to simulate a point source at different positions inside the sensitive volume of the tomograph, and the axial and radial variations in sensitivity were studied in air and in a phantom-equivalent water cylinder. An average sensitivity difference of 34% in the axial direction and 24.6% in the radial direction is observed when the point source is placed inside the water cylinder instead of air.
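A back-of-the-envelope estimate of why sensitivity drops in water: each 511 keV annihilation photon is attenuated along its path to the detectors. The attenuation coefficient below is an approximate textbook value for water, and the straight-radius geometry is a crude stand-in for the EGS5 phantom model.

```python
# Rough attenuation estimate for back-to-back 511 keV photons in a water cylinder.
import numpy as np

MU_WATER_511KEV = 0.096   # cm^-1, approximate linear attenuation coefficient

def pair_survival_fraction(path1_cm, path2_cm, mu=MU_WATER_511KEV):
    """Probability that BOTH annihilation photons escape without interacting."""
    return np.exp(-mu * (path1_cm + path2_cm))

# Source on the axis of a water cylinder: each photon traverses roughly the radius.
for radius_cm in (1.0, 2.5, 5.0):
    loss = 1.0 - pair_survival_fraction(radius_cm, radius_cm)
    print(f"cylinder radius {radius_cm:4.1f} cm -> coincidence loss ~ {100*loss:4.1f}%")
```

Even this crude estimate gives losses of the same order as the 24.6-34% sensitivity differences reported above; the full EGS5 simulation additionally accounts for scatter, geometry, and detector response.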
Lyons-Ruth, Karlen; Bureau, Jean-François; Riley, Caitlin D; Atlas-Corbett, Alisha F
2009-01-01
Socially indiscriminate attachment behavior has been repeatedly observed among institutionally reared children. Socially indiscriminate behavior has also been associated with aggression and hyperactivity. However, available data rely heavily on caregiver report of indiscriminate behavior. In addition, few studies have been conducted with samples of home-reared infants exposed to inadequate care. The current study aimed to develop a reliable laboratory measure of socially indiscriminate forms of attachment behavior based on direct observation and to validate the measure against assessments of early care and later behavior problems among home-reared infants. Strange Situation episodes of 75 socially at-risk mother-infant dyads were coded for infant indiscriminate attachment behavior on the newly developed Rating for Infant-Stranger Engagement. After controlling for infant insecure-organized and disorganized behavior in all analyses, extent of infant-stranger engagement at 18 months was significantly related to serious caregiving risk (maltreatment or maternal psychiatric hospitalization), observed quality of disrupted maternal affective communication, and aggressive and hyperactive behavior problems at age 5. Results are discussed in relation to the convergent and discriminant validity of the new measure and to the potential utility of a standardized observational measure of indiscriminate attachment behavior. Further validation is needed in relation to caregiver report measures of indiscriminate behavior.