DOE Office of Scientific and Technical Information (OSTI.GOV)
Curry, J J; Gallagher, D W; Modarres, M
Appendices are presented concerning isolation condenser makeup; vapor suppression system; station air system; reactor building closed cooling water system; turbine building secondary closed water system; service water system; emergency service water system; fire protection system; emergency ac power; dc power system; event probability estimation; methodology of accident sequence quantification; and assignment of dominant sequences to release categories.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mays, S.E.; Poloski, J.P.; Sullivan, W.H.
1982-07-01
This report describes a risk study of the Browns Ferry, Unit 1, nuclear plant. The study is one of four such studies sponsored by the NRC Office of Research, Division of Risk Assessment, as part of its Interim Reliability Evaluation Program (IREP), Phase II. This report is contained in four volumes: a main report and three appendixes. Appendix C generally describes the methods used to estimate accident sequence frequency values. Information is presented concerning the approach, example collection, failure data, candidate dominant sequences, uncertainty analysis, and sensitivity analysis.
NASA Astrophysics Data System (ADS)
Ha, Taesung
A probabilistic risk assessment (PRA) was conducted for a loss-of-coolant accident (LOCA) in the McMaster Nuclear Reactor (MNR). A level 1 PRA was completed including event sequence modeling, system modeling, and quantification. To support the quantification of the accident sequences identified, data analysis using the Bayesian method and human reliability analysis (HRA) using the accident sequence evaluation procedure (ASEP) approach were performed. Since human performance in research reactors is significantly different from that in power reactors, a time-oriented HRA model (reliability physics model) was applied for the human error probability (HEP) estimation of the core relocation. This model is based on two competing random variables: phenomenological time and performance time. The response surface and direct Monte Carlo simulation with Latin hypercube sampling were applied for estimating the phenomenological time, whereas the performance time was obtained from interviews with operators. An appropriate probability distribution for the phenomenological time was assigned by statistical goodness-of-fit tests. The HEP for the core relocation was estimated from these two competing quantities: phenomenological time and operators' performance time. The sensitivity of each probability distribution in human reliability estimation was investigated. In order to quantify the uncertainty in the predicted HEPs, a Bayesian approach was selected due to its capability of incorporating uncertainties in the model itself and in its parameters. The HEP from the current time-oriented model was compared with that from the ASEP approach. Both results were used to evaluate the sensitivity of alternative human reliability modeling for the manual core relocation in the LOCA risk model. This exercise demonstrated the applicability of a reliability physics model supplemented with a Bayesian approach for modeling human reliability, and its potential usefulness for quantifying model uncertainty as sensitivity analysis in the PRA model.
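The competing-variables HEP model described above reduces to the probability that the operators' performance time exceeds the available phenomenological time. A minimal Monte Carlo illustration in Python, using made-up lognormal parameters rather than the study's fitted distributions:

```python
import random

def estimate_hep(phen_sampler, perf_sampler, n=100_000, seed=42):
    """Monte Carlo estimate of HEP = P(performance time > phenomenological time)."""
    rng = random.Random(seed)
    failures = sum(1 for _ in range(n) if perf_sampler(rng) > phen_sampler(rng))
    return failures / n

# Illustrative (assumed) lognormal distributions, in minutes:
phen = lambda rng: rng.lognormvariate(3.4, 0.3)   # time until core damage would occur
perf = lambda rng: rng.lognormvariate(2.9, 0.5)   # operator manual-action time

hep = estimate_hep(phen, perf)
```

In a full analysis the phenomenological-time distribution would come from thermal-hydraulic simulation (response surface with Latin hypercube sampling, as in the study), with a Bayesian layer propagating parameter uncertainty into the HEP.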
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ravindra, M.K.; Banon, H.
1992-07-01
In this report, the scoping quantification procedures for external events in probabilistic risk assessments of nuclear power plants are described. External event analysis in a PRA has three important goals: (1) the analysis should be complete in that all events are considered; (2) by following some selected screening criteria, the more significant events are identified for detailed analysis; (3) the selected events are analyzed in depth by taking into account the unique features of the events: hazard, fragility of structures and equipment, external-event-initiated accident sequences, etc. Based on the above goals, external event analysis may be considered a three-stage process: Stage I: Identification and Initial Screening of External Events; Stage II: Bounding Analysis; Stage III: Detailed Risk Analysis. In the present report, first, a review of published PRAs is given to focus on the significance and treatment of external events in full-scope PRAs. Except for seismic, flooding, fire, and extreme wind events, the contributions of other external events to plant risk have been found to be negligible. Second, scoping methods for external events not covered in detail in the NRC's PRA Procedures Guide are provided. For this purpose, bounding analyses for transportation accidents, extreme winds and tornadoes, aircraft impacts, turbine missiles, and chemical releases are described.
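The Stage I/II screening logic can be sketched as a simple filter: events whose bounding risk estimate falls below a screening criterion are dropped before detailed (Stage III) analysis. In the sketch below, both the criterion and the bounding frequencies are assumed for illustration, not taken from the report:

```python
# Assumed screening criterion for bounding core-damage frequency, per reactor-year
SCREENING_CRITERION = 1.0e-6

bounding_cdf = {                 # illustrative bounding estimates, per year
    "seismic": 3.0e-5,
    "external flooding": 8.0e-6,
    "chemical release": 4.0e-7,
    "aircraft impact": 2.0e-8,
    "turbine missile": 5.0e-9,
}

# Stage II: retain only events whose bound exceeds the criterion (Stage III candidates)
retained = {event: f for event, f in bounding_cdf.items() if f >= SCREENING_CRITERION}
screened_out = sorted(set(bounding_cdf) - set(retained))
```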
Domino effect in chemical accidents: main features and accident sequences.
Darbra, R M; Palacios, Adriana; Casal, Joaquim
2010-11-15
The main features of domino accidents in process/storage plants and in the transportation of hazardous materials were studied through an analysis of 225 accidents involving this effect. Data on these accidents, which occurred after 1961, were taken from several sources. Aspects analyzed included the accident scenario, the type of accident, the materials involved, the causes and consequences and the most common accident sequences. The analysis showed that the most frequent causes are external events (31%) and mechanical failure (29%). Storage areas (35%) and process plants (28%) are by far the most common settings for domino accidents. Eighty-nine per cent of the accidents involved flammable materials, the most frequent of which was LPG. The domino effect sequences were analyzed using relative probability event trees. The most frequent sequences were explosion→fire (27.6%), fire→explosion (27.5%) and fire→fire (17.8%). Copyright © 2010 Elsevier B.V. All rights reserved.
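The relative-probability event tree used above reduces, in its simplest form, to sequence counts normalised by the total number of accidents. A small Python sketch with counts chosen to approximate the quoted shares (illustrative, not the original dataset):

```python
from collections import Counter

# Hypothetical sequence records chosen to approximate the quoted shares
sequences = (["explosion->fire"] * 62 + ["fire->explosion"] * 62
             + ["fire->fire"] * 40 + ["other"] * 61)

counts = Counter(sequences)
total = sum(counts.values())
# Relative probability of each first-order domino sequence, in percent
relative = {seq: round(100 * n / total, 1) for seq, n in counts.items()}
```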
Categorizing accident sequences in the external radiotherapy for risk analysis
2013-01-01
Purpose: This study identifies accident sequences from past accidents in order to help apply risk analysis to external radiotherapy. Materials and Methods: This study reviews 59 accidental cases in two retrospective safety analyses that collected incidents in external radiotherapy extensively. Two accident analysis reports that accumulated past incidents are investigated to identify accident sequences, including initiating events, failure of safety measures, and consequences. This study classifies the accidents by treatment stages and sources of errors for initiating events, types of failures in the safety measures, and types of undesirable consequences and the number of affected patients. The accident sequences are then grouped into several categories on the basis of similarity of progression; as a result, the cases can be categorized into 14 groups of accident sequences. Results: The result indicates that risk analysis needs to pay attention not only to the planning stage, but also to the calibration stage that is carried out prior to the main treatment process. It also shows that human error is the largest contributor to initiating events as well as to the failure of safety measures. This study also illustrates an event tree analysis for an accident sequence initiated in calibration. Conclusion: This study is expected to provide insights into the accident sequences for prospective risk analysis through the review of experiences. PMID:23865005
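An event tree of the kind illustrated in the study multiplies an initiating-event frequency by the failure probabilities of the safety measures along a branch. A minimal sketch with hypothetical numbers (not taken from the paper):

```python
# Hypothetical event-tree branch for an error introduced during calibration
init_freq = 0.1        # calibration errors per year (assumed)
p_qa_fails = 0.05      # independent QA check misses the error (assumed)
p_review_fails = 0.2   # pre-treatment chart review misses it (assumed)

# Frequency of the sequence in which both safety measures fail
seq_freq = init_freq * p_qa_fails * p_review_fails
```

Each branch of the tree is quantified the same way, and the consequence category (e.g. number of affected patients) is attached to the branch endpoint.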
An expert system for the quantification of fault rates in construction fall accidents.
Talat Birgonul, M; Dikmen, Irem; Budayan, Cenk; Demirel, Tuncay
2016-01-01
Expert witness reports, prepared with the aim of quantifying fault rates among parties, play an important role in a court's final decision. However, conflicting fault rates assigned by different expert witness boards lead to iterative objections raised by the related parties. This unfavorable situation originates mainly from the subjectivity of expert judgments and the unavailability of objective information about the causes of accidents. As a solution to this shortcoming, a rule-based expert system, DsSafe, was developed for the quantification of fault rates in construction fall accidents; the aim of developing DsSafe is to decrease the subjectivity inherent in expert witness reports. Eighty-four inspection reports prepared by official and authorized inspectors were examined, and the root causes of construction fall accidents in Turkey were identified. Using this information, an evaluation form was designed and submitted to experts, who were asked to evaluate the importance level of the factors that govern fall accidents and to determine the fault rates under different scenarios. Based on the expert judgments, a rule-based expert system was developed. The accuracy and reliability of DsSafe were tested with real data obtained from finalized court cases. DsSafe gives satisfactory results.
Road profiling of traffic accidents in Jos, Nigeria, 1995-1999.
Bombom, Leonard S; Edino, Marcus O
2009-09-01
Road traffic accident data in Nigeria generally lack exact coordinate information. Accident analysis is, therefore, restricted to aggregate data on trends, magnitude, and temporal dimensions. This article addresses the road accident problem in Jos between 1995 and 1999 through a road profiling approach. Results show that four gateway routes, seven multi-lane roadways (including two gateway routes), and seven road intersections accounted for 84% of all traffic accidents, 84% of injured casualties, and 88% of fatalities. This approach allows quantification of the impact of accident countermeasures by deliberately profiling roads for close monitoring and policing. For example, reducing accident counts and fatalities by 50% each on the gateway routes would amount to approximately 35% and 40% reductions in overall accident and fatality counts, respectively. Countermeasures must consider these roadways and intersections as important inputs in accident and casualty reduction targets.
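The quoted overall reductions follow from simple share arithmetic: halving accidents on roads that carry a given share of the city-wide total halves that share's contribution. A back-of-envelope Python check under assumed shares (the gateway-route shares are not stated explicitly above):

```python
# Assumed shares of city-wide totals attributable to the gateway routes
accident_share = 0.70   # assumed
fatality_share = 0.80   # assumed

overall_accident_reduction = 0.5 * accident_share   # fraction of all accidents
overall_fatality_reduction = 0.5 * fatality_share   # fraction of all fatalities
```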
Analysis of loss of decay-heat-removal sequences at Browns Ferry Unit One
DOE Office of Scientific and Technical Information (OSTI.GOV)
Harrington, R.M.
1983-01-01
This paper summarizes the Oak Ridge National Laboratory (ORNL) report Loss of DHR Sequences at Browns Ferry Unit One - Accident Sequence Analysis (NUREG/CR-2973). The Loss of DHR investigation is the third in a series of accident studies concerning the BWR 4 - MK I containment plant design. These studies, sponsored by the Nuclear Regulatory Commission Severe Accident Sequence Analysis (SASA) program, have been conducted at ORNL with the full cooperation of the Tennessee Valley Authority (TVA). The purpose of the SASA studies is to predetermine the probable course of postulated severe accidents so as to establish the timing and the sequence of events. The SASA studies also produce recommendations concerning the implementation of better system design and better emergency operating instructions and operator training. The ORNL studies also include a detailed, best-estimate calculation of the release and transport of radioactive fission products following postulated severe accidents.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sprung, J.L.; Jow, H-N; Rollstin, J.A.
1990-12-01
Estimation of offsite accident consequences is the customary final step in a probabilistic assessment of the risks of severe nuclear reactor accidents. Recently, the Nuclear Regulatory Commission reassessed the risks of severe accidents at five US power reactors (NUREG-1150). Offsite accident consequences for NUREG-1150 source terms were estimated using the MELCOR Accident Consequence Code System (MACCS). Before these calculations were performed, most MACCS input parameters were reviewed, and for each parameter reviewed, a best-estimate value was recommended. This report presents the results of these reviews. Specifically, recommended values and the basis for their selection are presented for MACCS atmospheric and biospheric transport, emergency response, food pathway, and economic input parameters. Dose conversion factors and health effect parameters are not reviewed in this report. 134 refs., 15 figs., 110 tabs.
Rapid and Easy Protocol for Quantification of Next-Generation Sequencing Libraries.
Hawkins, Steve F C; Guest, Paul C
2018-01-01
The emergence of next-generation sequencing (NGS) over the last 10 years has increased the efficiency of DNA sequencing in terms of speed, ease, and price. However, exact quantification of an NGS library is crucial in order to obtain good data on sequencing platforms developed by the current market leader, Illumina. Different approaches for DNA quantification are currently available, and the most commonly used are based on analysis of the physical properties of the DNA through spectrophotometric or fluorometric methods. Although these methods are technically simple, they do not allow quantification as exact as can be achieved using a real-time quantitative PCR (qPCR) approach. A qPCR protocol for DNA quantification with applications in NGS library preparation studies is presented here. This can be applied in various fields of study, such as medical disorders resulting from nutritional programming disturbances.
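qPCR-based library quantification typically reads the unknown off a standard curve: the quantification cycle (Cq) is linear in log10 concentration, so a least-squares fit over known standards can be inverted for the unknown sample. A self-contained sketch with invented standards (not the protocol's actual values):

```python
import math

# Hypothetical dilution series: (known concentration in pM, measured Cq)
standards = [(20.0, 10.1), (2.0, 13.4), (0.2, 16.7), (0.02, 20.0)]

# Least-squares fit of Cq = slope * log10(conc) + intercept
xs = [math.log10(c) for c, _ in standards]
ys = [cq for _, cq in standards]
n = len(xs)
xbar, ybar = sum(xs) / n, sum(ys) / n
slope = (sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
         / sum((x - xbar) ** 2 for x in xs))
intercept = ybar - slope * xbar

def concentration(cq):
    """Invert the standard curve to estimate library concentration (pM)."""
    return 10 ** ((cq - intercept) / slope)

# Amplification efficiency implied by the slope (1.0 means 100%)
efficiency = 10 ** (-1 / slope) - 1
```

A slope near -3.3 corresponds to efficiency near 100%, the usual sanity check on such a curve.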
1994 Accident sequence precursor program results
DOE Office of Scientific and Technical Information (OSTI.GOV)
Belles, R.J.; Cletcher, J.W.; Copinger, D.A.
1996-01-01
The Accident Sequence Precursor (ASP) Program involves the systematic review and evaluation of operational events that have occurred at light-water reactors to identify and categorize precursors to potential severe core damage accident sequences. The results of the ASP Program are published in an annual report. The most recent report, which contains the analyses of the precursors for 1994, is NUREG/CR-4674, Vols. 21 and 22, Precursors to Potential Severe Core Damage Accidents: 1994, A Status Report, published in December 1995. This article provides an overview of the ASP review and evaluation process and a summary of the results for 1994. 12 refs., 2 figs., 4 tabs.
Ju, Yong Han; Sohn, So Young
2011-01-01
Injury analysis following a vehicle crash is one of the most important research areas. However, most injury analyses have focused on a single one-dimensional injury variable at a time, such as the AIS (Abbreviated Injury Scale) or the IIS (Injury Impairment Scale), in relation to various traffic accident factors; such studies cannot reflect the various injury phenomena that appear simultaneously. In this paper, we apply quantification method II to the NASS (National Automotive Sampling System) CDS (Crashworthiness Data System) to find the relationship between categorical injury phenomena, such as the injury scale, injury position, and injury type, and various traffic accident condition factors, such as speed, collision direction, vehicle type, and seat position. Our empirical analysis indicated the importance of safety devices, such as restraint equipment and airbags. In addition, we found that narrow impact, ejection, air bag deployment, and higher speed are associated with more severe than minor injury to the thigh, ankle, and leg in terms of dislocation, abrasion, or laceration. Copyright © 2010 Elsevier Ltd. All rights reserved.
The impact of climate change on winter road maintenance and traffic accidents in West Midlands, UK.
Andersson, Anna K; Chapman, Lee
2011-01-01
Winter weather can be a significant cause of road traffic accidents. This paper uses UKCIP climate change scenarios and a temporal analogue to investigate the relationship between temperature and severe road accidents in the West Midlands, UK. This approach also allows quantification of the changes in the severity of the winter season over the next century in the region. It is demonstrated that the predicted reduction in the number of frost days should in turn reduce the number of road accidents caused by slipperiness by approximately 50%. However, the paper concludes by warning against complacency in winter maintenance regimes: a warmer climate may result in budget cuts for highway maintenance, which in turn may well reverse declining accident trends. Copyright © 2010 Elsevier Ltd. All rights reserved.
Laurie, Matthew T; Bertout, Jessica A; Taylor, Sean D; Burton, Joshua N; Shendure, Jay A; Bielas, Jason H
2013-08-01
Due to the high cost of failed runs and suboptimal data yields, quantification and determination of fragment size range are crucial steps in the library preparation process for massively parallel sequencing (or next-generation sequencing). Current library quality control methods commonly involve quantification using real-time quantitative PCR and size determination using gel or capillary electrophoresis. These methods are laborious and subject to a number of significant limitations that can make library calibration unreliable. Herein, we propose and test an alternative method for quality control of sequencing libraries using droplet digital PCR (ddPCR). By exploiting a correlation we have discovered between droplet fluorescence and amplicon size, we achieve the joint quantification and size determination of target DNA with a single ddPCR assay. We demonstrate the accuracy and precision of applying this method to the preparation of sequencing libraries.
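Standard ddPCR quantification is Poisson-based: the fraction of negative droplets gives the mean number of target copies per droplet. The paper's addition is a size estimate from droplet fluorescence; the linear calibration in the sketch below is purely an assumed stand-in for that measured correlation:

```python
import math

def ddpcr_concentration(negative, total, droplet_volume_nl=0.85):
    """Copies per microliter from the negative-droplet fraction (Poisson)."""
    lam = -math.log(negative / total)        # mean copies per droplet
    return lam / (droplet_volume_nl * 1e-3)  # convert nl to ul

def amplicon_size(mean_fluorescence, slope=-2.0, intercept=12000.0):
    """Hypothetical linear calibration: fluorescence decreasing with amplicon size."""
    return (mean_fluorescence - intercept) / slope

conc = ddpcr_concentration(negative=12000, total=15000)   # copies/ul
size = amplicon_size(mean_fluorescence=11200.0)           # base pairs
```

In practice both calibration parameters would be fitted against libraries of known fragment size before being used for QC.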
Delvosalle, Christian; Fievez, Cécile; Pipart, Aurore; Debray, Bruno
2006-03-31
In the frame of the Accidental Risk Assessment Methodology for Industries (ARAMIS) project, this paper presents the work carried out in the part of the project devoted to the definition of accident scenarios. This topic is a key point in risk assessment and serves as the basis for the whole risk quantification. The first result of the work is the building of a methodology for the identification of major accident hazards (MIMAH), which is carried out with the development of generic fault and event trees based on a typology of equipment and substances. The term "major accidents" must be understood as the worst accidents likely to occur on the equipment, assuming that no safety systems are installed. A second methodology, called the methodology for the identification of reference accident scenarios (MIRAS), takes into account the influence of safety systems on both the frequencies and possible consequences of accidents. This methodology leads to the identification of more realistic accident scenarios. The reference accident scenarios are chosen with the help of a tool called the "risk matrix", which crosses the frequency and the consequences of accidents. This paper presents both methodologies and an application to an ethylene oxide storage facility.
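The risk-matrix step can be sketched as a lookup that crosses a frequency class with a consequence class and retains scenarios above a threshold. The class labels, their ordering, and the threshold below are all assumed for illustration, not taken from ARAMIS:

```python
FREQ_CLASSES = ["F1", "F2", "F3", "F4"]   # assumed: rare ... frequent
CONS_CLASSES = ["C1", "C2", "C3", "C4"]   # assumed: minor ... catastrophic

def retained_as_reference(freq_cls, cons_cls, threshold=3):
    """Retain a scenario whose combined class rank meets an assumed threshold."""
    score = FREQ_CLASSES.index(freq_cls) + CONS_CLASSES.index(cons_cls)
    return score >= threshold

keep = retained_as_reference("F2", "C3")   # mid-frequency, serious outcome
drop = retained_as_reference("F1", "C2")   # rare, minor outcome
```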
Fuzzy approach for reducing subjectivity in estimating occupational accident severity.
Pinto, Abel; Ribeiro, Rita A; Nunes, Isabel L
2012-03-01
Quantifying or, more generally, estimating the severity of the possible consequences of occupational accidents is a decisive step in any occupational risk assessment process. Because of the lack of historic information (accident data collection and recording are incipient and insufficient, particularly in construction) and the lack of practical tools in the construction industry, the estimation/quantification of occupational accident severity is a notably arbitrary process rather than a systematic and rigorous assessment. This work proposes several severity functions (based on a safety risk assessment) to represent biomechanical knowledge with the aim of determining the severity level of occupational accidents in the construction industry and, consequently, improving occupational risk assessment quality. We follow a fuzzy approach because it makes it possible to capture and represent imprecise knowledge in a simple and understandable way for users and specialists. Copyright © 2011 Elsevier Ltd. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sattison, M.B.; Blackman, H.S.; Novack, S.D.
The Office for Analysis and Evaluation of Operational Data (AEOD) has sought the assistance of the Idaho National Engineering Laboratory (INEL) to make some significant enhancements to the SAPHIRE-based Accident Sequence Precursor (ASP) models recently developed by the INEL. The challenge of this project is to provide the features of a full-scale PRA within the framework of the simplified ASP models. Some of these features include: (1) uncertainty analysis addressing the standard PRA uncertainties and the uncertainties unique to the ASP models and methods, (2) incorporation and proper quantification of individual human actions and the interaction among human actions, (3) enhanced treatment of common cause failures, and (4) extension of the ASP models to more closely mimic full-scale PRAs (inclusion of more initiators, explicitly modeling support system failures, etc.). This paper provides an overview of the methods being used to make the above improvements.
10 CFR 70.62 - Safety program and integrated safety analysis.
Code of Federal Regulations, 2013 CFR
2013-01-01
...; (iv) Potential accident sequences caused by process deviations or other events internal to the... of occurrence of each potential accident sequence identified pursuant to paragraph (c)(1)(iv) of this... have experience in nuclear criticality safety, radiation safety, fire safety, and chemical process...
10 CFR 70.62 - Safety program and integrated safety analysis.
Code of Federal Regulations, 2014 CFR
2014-01-01
...; (iv) Potential accident sequences caused by process deviations or other events internal to the... of occurrence of each potential accident sequence identified pursuant to paragraph (c)(1)(iv) of this... have experience in nuclear criticality safety, radiation safety, fire safety, and chemical process...
10 CFR 70.62 - Safety program and integrated safety analysis.
Code of Federal Regulations, 2012 CFR
2012-01-01
...; (iv) Potential accident sequences caused by process deviations or other events internal to the... of occurrence of each potential accident sequence identified pursuant to paragraph (c)(1)(iv) of this... have experience in nuclear criticality safety, radiation safety, fire safety, and chemical process...
Loss of control air at Browns Ferry Unit One: accident sequence analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Harrington, R.M.; Hodge, S.A.
1986-04-01
This study describes the predicted response of the Browns Ferry Nuclear Plant to a postulated complete failure of plant control air. The failure of plant control air cascades to include the loss of drywell control air at Units 1 and 2. Nevertheless, this is a benign accident unless compounded by simultaneous failures in the turbine-driven high pressure injection systems. Accident sequence calculations are presented for Loss of Control Air sequences with assumed failure upon demand of the Reactor Core Isolation Cooling (RCIC) and the High Pressure Coolant Injection (HPCI) at Unit 1. Sequences with and without operator action are considered. Results show that the operators can prevent core uncovery if they take action to utilize the Control Rod Drive Hydraulic System as a backup high pressure injection system.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mueller, C.; Nabelssi, B.; Roglans-Ribas, J.
1995-04-01
This report contains the Appendices for the Analysis of Accident Sequences and Source Terms at Waste Treatment and Storage Facilities for Waste Generated by the U.S. Department of Energy Waste Management Operations. The main report documents the methodology, computational framework, and results of facility accident analyses performed as a part of the U.S. Department of Energy (DOE) Waste Management Programmatic Environmental Impact Statement (WM PEIS). The accident sequences potentially important to human health risk are specified, their frequencies are assessed, and the resultant radiological and chemical source terms are evaluated. A personal computer-based computational framework and database have been developed that provide these results as input to the WM PEIS for calculation of human health risk impacts. This report summarizes the accident analyses and aggregates the key results for each of the waste streams. Source terms are estimated and results are presented for each of the major DOE sites and facilities by WM PEIS alternative for each waste stream. The appendices identify the potential atmospheric release of each toxic chemical or radionuclide for each accident scenario studied. They also provide discussion of specific accident analysis data and guidance used or consulted in this report.
Droplet Digital™ PCR Next-Generation Sequencing Library QC Assay.
Heredia, Nicholas J
2018-01-01
Digital PCR is a valuable tool to quantify next-generation sequencing (NGS) libraries precisely and accurately. Accurately quantifying NGS libraries enables accurate loading of the libraries onto the sequencer and thus improves sequencing performance by reducing underloading and overloading errors. Accurate quantification also benefits users by enabling uniform loading of indexed/barcoded libraries, which in turn greatly improves sequencing uniformity of the indexed/barcoded samples. The advantages gained by employing the Droplet Digital PCR (ddPCR™) library QC assay include precise and accurate quantification in addition to size quality assessment, enabling users to QC their sequencing libraries with confidence.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Muhlheim, M.D.; Belles, R.J.; Cletcher, J.W.
The Accident Sequence Precursor (ASP) Program involves the systematic review and evaluation of operational events that have occurred at light-water reactors to identify and categorize precursors to potential severe core damage accident sequences. The results of the ASP Program are published in an annual report. The most recent report, which contains the precursors for 1995, is NUREG/CR-4674, Volume 23, Precursors to Potential Severe Core Damage Accidents: 1995, A Status Report, published in April 1997. This article provides an overview of the ASP review and evaluation process and a summary of the results for 1995.
Shinozuka, Hiroshi; Forster, John W
2016-01-01
Background. Multiplexed sequencing is commonly performed on massively parallel short-read sequencing platforms such as Illumina, and the efficiency of library normalisation can affect the quality of the output dataset. Although several library normalisation approaches have been established, none are ideal for highly multiplexed sequencing due to issues of cost and/or processing time. Methods. An inexpensive and high-throughput library quantification method has been developed, based on an adaptation of the melting curve assay. Sequencing libraries were subjected to the assay using the Bio-Rad Laboratories CFX Connect™ Real-Time PCR Detection System. The library quantity was calculated through summation of the reduction of relative fluorescence units between 86 and 95 °C. Results. PCR-enriched sequencing libraries are suitable for this quantification without pre-purification of DNA. Short DNA molecules, which ideally should be eliminated from the library before subsequent processing, were differentiated from the target DNA in a mixture on the basis of differences in melting temperature. Quantification results for long target sequences from the melting curve assay were correlated with those from existing methods (R² > 0.77) and with those observed from MiSeq sequencing (R² = 0.82). Discussion. The results of multiplexed sequencing suggested that the normalisation performance of the described method is equivalent to that of another recently reported high-throughput bead-based method, BeNUS. However, costs for the melting curve assay are considerably lower and processing times shorter than those of other existing methods, suggesting greater suitability for highly multiplexed sequencing applications.
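The quantification rule stated in the Methods (summing the loss of relative fluorescence units across the 86-95 °C window) is straightforward to express in code. A sketch over an invented melt curve:

```python
def library_signal(melt_curve, t_low=86.0, t_high=95.0):
    """Sum the RFU reduction across the melt window.

    melt_curve: list of (temperature_C, rfu) pairs sorted by temperature.
    """
    window = [rfu for t, rfu in melt_curve if t_low <= t <= t_high]
    return sum(max(a - b, 0.0) for a, b in zip(window, window[1:]))

# Invented melt curve for a PCR-enriched library
curve = [(84.0, 5000), (86.0, 4900), (88.0, 4300), (90.0, 3200),
         (92.0, 2100), (94.0, 1500), (96.0, 1400)]

signal = library_signal(curve)
```

The per-library signal would then be compared across a plate to normalise pooling volumes before sequencing.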
SBLOCA outside containment at Browns Ferry Unit One: accident sequence analysis. [Small break]
DOE Office of Scientific and Technical Information (OSTI.GOV)
Condon, W.A.; Harrington, R.M.; Greene, S.R.
1982-11-01
This study describes the predicted response of Unit 1 at the Browns Ferry Nuclear Plant to a postulated small-break loss-of-coolant accident outside of the primary containment. The break has been assumed to occur in the scram discharge volume piping immediately following a reactor scram that cannot be reset. The events before core uncovering are discussed for both the worst-case accident sequence without operator action and for the more likely sequences with operator action. Without operator action, the events after core uncovering would include core meltdown and subsequent containment failure, and this event sequence has been determined through use of the MARCH code. An estimate of the magnitude and timing of the concomitant release of the noble gas, cesium, and iodine-based fission products to the environment is provided in Volume 2 of this report.
NASA Technical Reports Server (NTRS)
1972-01-01
The Accident Model Document is one of three documents of the Preliminary Safety Analysis Report (PSAR) - Reactor System as applied to a Space Base Program. Potential terrestrial nuclear hazards involving the zirconium hydride reactor-Brayton power module are identified for all phases of the Space Base program. The accidents/events that give rise to the hazards are defined, and abort sequence trees are developed to determine the sequence of events leading to the hazard and the associated probabilities of occurrence. Source terms are calculated to determine the magnitude of the hazards. The above data are used in the mission accident analysis to determine the most probable and significant accidents/events in each mission phase. The only significant hazards during the prelaunch and launch ascent phases of the mission are those which arise from criticality accidents. Fission product inventories during this time period were found to be very low due to very limited low-power acceptance testing.
ATWS at Browns Ferry Unit One - accident sequence analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Harrington, R.M.; Hodge, S.A.
1984-07-01
This study describes the predicted response of Unit One at the Browns Ferry Nuclear Plant to a postulated complete failure to scram following a transient occurrence that has caused closure of all Main Steam Isolation Valves (MSIVs). This hypothetical event constitutes the most severe example of the type of accident classified as Anticipated Transient Without Scram (ATWS). Without the automatic control rod insertion provided by scram, the void coefficient of reactivity and the mechanisms by which voids are formed in the moderator/coolant play a dominant role in the progression of the accident. Actions taken by the operator greatly influence the quantity of voids in the coolant, and this effect is analyzed in this report. The progression of the accident sequence under existing and under recommended procedures is discussed. For the extremely unlikely cases in which equipment failure and wrongful operator actions might lead to severe core damage, the sequence of emergency action levels and the associated timing of events are presented.
[Clinical and analytical toxicology of opiate, cocaine and amphetamine].
Feliu, Catherine; Fouley, Aurélie; Millart, Hervé; Gozalo, Claire; Marty, Hélène; Djerada, Zoubir
2015-01-01
In several circumstances, the determination and quantification of illicit drugs in biological fluids is decisive. The contexts are varied: driving under the influence, traffic accidents, clinical and forensic toxicology, doping analysis, and drug-facilitated crime (chemical submission). Whole blood is the favoured matrix for the quantification of illicit drugs. Gas chromatography coupled with mass spectrometry (GC-MS) is the gold standard for these analyses, and all newly developed methods must be at least equivalent to it. Nowadays, new technologies are available to biologists and clinicians: liquid chromatography coupled with mass spectrometry (LC/MS) or with a tandem mass spectrometer (LC/MS/MS). The aim of this paper is to describe the state of the art regarding techniques of confirmation by mass spectrometry used for the quantification of conventional drugs, except cannabis.
Accident sequence precursor events with age-related contributors
DOE Office of Scientific and Technical Information (OSTI.GOV)
Murphy, G.A.; Kohn, W.E.
1995-12-31
The Accident Sequence Precursor (ASP) Program at ORNL analyzed about 14,000 Licensee Event Reports (LERs) filed by US nuclear power plants from 1987 through 1993. There were 193 events identified as precursors to potential severe core accident sequences; these are reported in NUREG/CR-4674, Volumes 7 through 20. Under the NRC Nuclear Plant Aging Research program, the authors evaluated these events to determine the extent to which component aging played a role. Events were selected that involved age-related equipment degradation that initiated an event or contributed to an event sequence. For the 7-year period, ORNL identified 36 events that involved aging degradation as a contributor to an ASP event. Except for 1992, the percentage of age-related events within the total number of ASP events over the 7-year period (approximately 19%) appears fairly consistent up to 1991. No correlation between plant age and number of precursor events was found. A summary list of the age-related events is presented in the report.
TRAP: automated classification, quantification and annotation of tandemly repeated sequences.
Sobreira, Tiago José P; Durham, Alan M; Gruber, Arthur
2006-02-01
TRAP, the Tandem Repeats Analysis Program, is a Perl program that provides a unified set of analyses for the selection, classification, quantification and automated annotation of tandemly repeated sequences. TRAP uses the results of the Tandem Repeats Finder program to perform a global analysis of the satellite content of DNA sequences, permitting researchers to easily assess the tandem repeat content for both individual sequences and whole genomes. The results can be generated in convenient formats such as HTML and comma-separated values. TRAP can also be used to automatically generate annotation data in the format of feature table and GFF files.
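The kind of global repeat-content summary TRAP produces can be illustrated with a short sketch. This is not TRAP itself (which is a Perl program); the record layout and names below are assumptions for illustration only:

```python
# Hypothetical sketch (NOT the TRAP program): summarize tandem-repeat content
# per sequence from Tandem Repeats Finder-style records.
# Assumed record layout: (seq_id, start, end, period, copies).

def repeat_content(records, seq_lengths):
    """Return the fraction of each sequence covered by tandem-repeat tracts."""
    covered = {}
    for seq_id, start, end, period, copies in records:
        covered[seq_id] = covered.get(seq_id, 0) + (end - start + 1)
    return {sid: covered.get(sid, 0) / length
            for sid, length in seq_lengths.items()}

# Two repeat tracts on chr1 covering 60 + 20 bases of a 1000-base sequence.
records = [("chr1", 101, 160, 6, 10.0), ("chr1", 501, 520, 2, 10.0)]
print(repeat_content(records, {"chr1": 1000, "chr2": 500}))
# {'chr1': 0.08, 'chr2': 0.0}
```

A real pipeline would parse the Tandem Repeats Finder data table instead of hard-coded tuples, but the per-sequence and whole-genome tallies follow the same arithmetic.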
Shimizu, Eri; Kato, Hisashi; Nakagawa, Yuki; Kodama, Takashi; Futo, Satoshi; Minegishi, Yasutaka; Watanabe, Takahiro; Akiyama, Hiroshi; Teshima, Reiko; Furui, Satoshi; Hino, Akihiro; Kitta, Kazumi
2008-07-23
A novel type of quantitative competitive polymerase chain reaction (QC-PCR) system for the detection and quantification of the Roundup Ready soybean (RRS) was developed. This system was designed to take advantage of a fully validated real-time PCR method used for the quantification of RRS in Japan. A plasmid was constructed as a competitor for the detection and quantification of the genetically modified soy, RRS. The plasmid contained the construct-specific sequence of RRS and the taxon-specific sequence of lectin1 (Le1), both carrying a 21 bp oligonucleotide insertion. The plasmid DNA was used as a reference molecule instead of ground seeds, which enabled us to precisely and stably adjust the copy number of targets. The present study demonstrated that the novel plasmid-based QC-PCR method could be a simple and feasible alternative to the real-time PCR method used for the quantification of genetically modified organism contents.
Arkas: Rapid reproducible RNAseq analysis
Colombo, Anthony R.; J. Triche Jr, Timothy; Ramsingh, Giridharan
2017-01-01
The recently introduced Kallisto pseudoaligner has radically simplified the quantification of transcripts in RNA-sequencing experiments. We offer two cloud-scale RNAseq pipelines, Arkas-Quantification and Arkas-Analysis, available within Illumina's BaseSpace cloud application platform, which expedite Kallisto preparatory routines, reliably calculate differential expression, and perform gene-set enrichment of REACTOME pathways. Given the inherent inefficiencies of scale, Illumina's BaseSpace computing platform offers a massively parallel distributed environment improving data management services and data importing. Arkas-Quantification deploys Kallisto for parallel cloud computations and is conveniently integrated downstream from the BaseSpace Sequence Read Archive (SRA) import/conversion application titled SRA Import. Arkas-Analysis annotates the Kallisto results by extracting structured information directly from source FASTA files with per-contig metadata, then calculates differential expression and gene-set enrichment on both coding genes and transcripts. The Arkas cloud pipeline supports ENSEMBL transcriptomes and can be used downstream from SRA Import, facilitating the raw sequence import, SRA FASTQ conversion, RNA quantification, and analysis steps. PMID:28868134
NASA Technical Reports Server (NTRS)
Belcastro, Christine M.
2011-01-01
Loss of control remains one of the largest contributors to fatal aircraft accidents worldwide. Aircraft loss-of-control accidents are complex, resulting from numerous causal and contributing factors acting alone or more often in combination. Hence, there is no single intervention strategy to prevent these accidents. This paper summarizes recent analysis results in identifying worst-case combinations of loss-of-control accident precursors and their time sequences, a holistic approach to preventing loss-of-control accidents in the future, and key requirements for validating the associated technologies.
DOE Office of Scientific and Technical Information (OSTI.GOV)
De Rosa, Felice
2006-07-01
In the ambit of the Severe Accident Network of Excellence Project (SARNET), funded by the European Union's 6th FISA (Fission Safety) Programme, one of the main tasks is the development and validation of the European Accident Source Term Evaluation Code (ASTEC). One of the reference codes used to check ASTEC results from experimental and reactor plant applications is MELCOR. ENEA is a SARNET member and also an ASTEC and MELCOR user. During the first 18 months of this project, we performed a series of MELCOR and ASTEC calculations for a French PWR 900 MWe and the accident sequence 'Loss of Steam Generator (SG) Feedwater' (known as the H2 sequence in the French classification). H2 is an accident sequence substantially equivalent to a station blackout scenario, like a TMLB accident, with the only difference that in the H2 sequence the scram is forced to occur with a delay of 28 seconds. The main events during the accident sequence are a loss of normal and auxiliary SG feedwater (0 s), followed by a scram when the water level in the SG is equal to or less than 0.7 m (after 28 seconds). There is also a main coolant pump trip when ΔTsat < 10 °C, a total opening of the three relief valves when Tric (maximum core outlet temperature) is above 603 K (330 °C), and accumulator isolation when the primary pressure falls below 1.5 MPa (15 bar). Among many other points, it is worth noting that this was the first time a MELCOR 1.8.5 input deck was available for a French PWR 900. The main ENEA effort in this period was devoted to preparing the MELCOR input deck using code version v.1.8.5 (build QZ Oct 2000 with the latest patch 185003 Oct 2001). The input deck, completely new, was prepared taking into account the structure, data, and conditions found in the ASTEC input decks.
The main goal of the work presented in this paper is to show where and when MELCOR provides sufficiently good results and why, in some cases (mainly involving its specific models: candling, corium pool behaviour, etc.), the results were less good. Future work will include preparing an input deck for the new MELCOR 1.8.6 and performing a code-to-code comparison with ASTEC v1.2 rev. 1. (author)
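The H2-sequence setpoints quoted in the abstract can be collected into a simple trip-check sketch. The function and variable names are invented for illustration and belong to neither MELCOR nor ASTEC; only the threshold values come from the text:

```python
# Hypothetical sketch of the H2-sequence setpoints described above.
# Names are invented; thresholds (0.7 m, 10 degC, 603 K, 1.5 MPa) are from
# the abstract.

def h2_trip_status(sg_level_m, delta_t_sat_c, core_outlet_temp_k,
                   primary_pressure_mpa):
    return {
        "scram": sg_level_m <= 0.7,                        # SG level <= 0.7 m
        "pump_trip": delta_t_sat_c < 10.0,                 # dTsat < 10 degC
        "relief_valves_open": core_outlet_temp_k > 603.0,  # Tric > 603 K
        "accumulators_isolated": primary_pressure_mpa < 1.5,  # < 1.5 MPa
    }

# Early in the transient: SG dried out, coolant still subcooled,
# primary system still at pressure.
print(h2_trip_status(0.5, 15.0, 560.0, 15.5))
# {'scram': True, 'pump_trip': False, 'relief_valves_open': False,
#  'accumulators_isolated': False}
```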
Fission product transport analysis in a loss of decay heat removal accident at Browns Ferry
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wichner, R.P.; Weber, C.F.; Hodge, S.A.
1984-01-01
This paper summarizes an analysis of the movement of noble gases, iodine, and cesium fission products within the Mark-I containment BWR reactor system represented by Browns Ferry Unit 1 during a postulated accident sequence initiated by a loss of decay heat removal (DHR) capability following a scram. The event analysis showed that this accident could be brought under control by various means, but the sequence with no operator action ultimately leads to containment (drywell) failure followed by loss of water from the reactor vessel, core degradation due to overheating, and reactor vessel failure with attendant movement of core debris onto the drywell floor.
Event-specific real-time detection and quantification of genetically modified Roundup Ready soybean.
Huang, Chia-Chia; Pan, Tzu-Ming
2005-05-18
The event-specific real-time detection and quantification of Roundup Ready soybean (RRS) using an ABI PRISM 7700 sequence detection system with light upon extension (LUX) primer was developed in this study. The event-specific primers were designed, targeting the junction of the RRS 5' integration site and the endogenous gene lectin1. Then, a standard reference plasmid was constructed that carried both of the targeted sequences for quantitative analysis. The detection limit of the LUX real-time PCR system was 0.05 ng of 100% RRS genomic DNA, which was equal to 20.5 copies. The range of quantification was from 0.1 to 100%. The sensitivity and range of quantification successfully met the requirement of the labeling rules in the European Union and Taiwan.
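The quantitative step behind assays like this one can be illustrated generically: Ct values are converted to copy numbers via a standard curve, and GMO content is expressed as event-specific copies per taxon-specific (lectin) copies. This is a sketch of the general method, not the published LUX assay; the slope and intercept are invented placeholders:

```python
# Generic real-time PCR relative quantification sketch, NOT the published
# LUX assay. Standard-curve parameters are invented placeholders.

def copies_from_ct(ct, slope=-3.32, intercept=38.0):
    """Invert the standard curve: Ct = slope * log10(copies) + intercept."""
    return 10.0 ** ((ct - intercept) / slope)

def gmo_percent(ct_event, ct_taxon):
    """GMO content: event-specific copies per taxon-specific copies, as %."""
    return 100.0 * copies_from_ct(ct_event) / copies_from_ct(ct_taxon)

# With 100% PCR efficiency, a 10-fold copy difference is ~3.32 cycles:
print(round(gmo_percent(33.32, 30.0), 1))  # 10.0
```

A slope near -3.32 corresponds to ~100% amplification efficiency; real assays fit slope and intercept from a dilution series of a reference plasmid such as the one described above.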
Fingerprinting and quantification of GMOs in the agro-food sector.
Taverniers, I; Van Bockstaele, E; De Loose, M
2003-01-01
Most strategies for analyzing GMOs in plants and derived food and feed products are based on the polymerase chain reaction (PCR) technique. In conventional PCR methods, a 'known' sequence between two specific primers is amplified. By contrast, with the 'anchor PCR' technique, unknown sequences adjacent to a known sequence can be amplified. Because T-DNA/plant border sequences are amplified, anchor PCR is the perfect tool for unique identification of transgenes, including non-authorized GMOs. In this work, anchor PCR was applied to characterize the 'transgene locus' and to clarify the complete molecular structure of at least six different commercial transgenic plants. Based on sequences of T-DNA/plant border junctions obtained by anchor PCR, event-specific primers were developed. The junction fragments, together with endogenous reference gene targets, were cloned in plasmids. The latter were then used as event-specific calibrators in real-time PCR, a new technique for the accurate relative quantification of GMOs. We demonstrate here the importance of anchor PCR for identification and the usefulness of plasmid DNA calibrators in quantification strategies for GMOs throughout the agro-food sector.
The effectiveness of using pictures in teaching young children about burn injury accidents.
Liu, Hsueh-Fen; Lin, Fang-Suey; Chang, Chien-Ju
2015-11-01
This study utilized the "story grammar" approach (Stein and Glenn, 1979) to analyze the within-corpus differences in the recounting of sixty 6- and 7-year-old children, specifically whether illustrations (a 5-factor accident sequence) were used to assist their narration of a home accident in which a child received a burn injury from hot soup. Our investigation revealed that the message presentation strategy "combining oral and pictures" better helped young children memorize the story content (the sequence of events leading to the burn injury) than "oral only". Specifically, the content of "the dangerous objects that caused the injury", "the unsafe actions that the people involved took", and "how the people involved felt about the severity of the accident" differed significantly between the two groups. Copyright © 2015 Elsevier Ltd and The Ergonomics Society. All rights reserved.
Manzanares-Palenzuela, C Lorena; de-Los-Santos-Álvarez, Noemí; Lobo-Castañón, María Jesús; López-Ruiz, Beatriz
2015-06-15
Current EU regulations on the mandatory labeling of genetically modified organisms (GMOs) with a minimum content of 0.9% would benefit from the availability of reliable and rapid methods to detect and quantify DNA sequences specific for GMOs. Different genosensors have been developed to this aim, mainly intended for GMO screening. A remaining challenge, however, is the development of genosensing platforms for GMO quantification, which should be expressed as the number of event-specific DNA sequences per taxon-specific sequences. Here we report a simple and sensitive multiplexed electrochemical approach for the quantification of Roundup-Ready Soybean (RRS). Two DNA sequences, taxon (lectin) and event-specific (RR), are targeted via hybridization onto magnetic beads. Both sequences are simultaneously detected by performing the immobilization, hybridization and labeling steps in a single tube and parallel electrochemical readout. Hybridization is performed in a sandwich format using signaling probes labeled with fluorescein isothiocyanate (FITC) or digoxigenin (Dig), followed by dual enzymatic labeling using Fab fragments of anti-Dig and anti-FITC conjugated to peroxidase or alkaline phosphatase, respectively. Electrochemical measurement of the enzyme activity is finally performed on screen-printed carbon electrodes. The assay gave a linear range of 2-250 pM for both targets, with LOD values of 650 fM (160 amol) and 190 fM (50 amol) for the event-specific and the taxon-specific targets, respectively. Results indicate that the method could be applied for GMO quantification below the European labeling threshold level (0.9%), offering a general approach for the rapid quantification of specific GMO events in foods. Copyright © 2015 Elsevier B.V. All rights reserved.
Interface requirements to couple thermal hydraulics codes to severe accident codes: ICARE/CATHARE
DOE Office of Scientific and Technical Information (OSTI.GOV)
Camous, F.; Jacq, F.; Chatelard, P.
1997-07-01
In order to describe with the same code the whole sequence of severe LWR accidents, up to vessel failure, the Institute of Protection and Nuclear Safety has coupled the severe accident code ICARE2 to the thermal-hydraulics code CATHARE2. The resulting code, ICARE/CATHARE, is designed to be as pertinent as possible in all phases of the accident. This paper is mainly devoted to the description of the ICARE2-CATHARE2 coupling.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wichner, R.P.; Hodge, S.A.; Weber, C.F.
1984-08-01
This report presents an analysis of the movement of noble gas, iodine, and cesium fission products within the Mark-I containment BWR reactor system represented by Browns Ferry Unit 1 during a postulated accident sequence initiated by a loss of decay heat removal capability following a scram. The event analysis showed that this accident could be brought under control by various means, but the sequence with no operator action ultimately leads to containment (drywell) failure followed by loss of water from the reactor vessel, core degradation due to overheating, and reactor vessel failure with attendant movement of core debris onto the drywell floor. The analysis of fission product transport presented in this report is based on the no-operator-action sequence and provides an estimate of fission product inventories, as a function of time, within 14 control volumes outside the core, with the atmosphere considered the final control volume in the transport sequence. As in the case of accident sequences previously studied, we find little barrier to the release of noble gases to air, these gases being effectively purged from the drywell and reactor building by steam and concrete degradation gases. However, significant decay of krypton isotopes occurs during the long delay times involved in this sequence. In contrast, large degrees of holdup are projected for iodine and cesium due to the chemical reactivity of these elements. Only about 2 × 10⁻⁴% of the initial iodine and cesium activity is predicted to be released to the atmosphere. The principal barriers to release are deposition on the reactor vessel and containment walls. A significant amount of iodine is captured in the water pool formed in the reactor building basement after actuation of the fire protection system.
Aircraft Loss-of-Control Accident Analysis
NASA Technical Reports Server (NTRS)
Belcastro, Christine M.; Foster, John V.
2010-01-01
Loss of control remains one of the largest contributors to fatal aircraft accidents worldwide. Aircraft loss-of-control accidents are complex in that they can result from numerous causal and contributing factors acting alone or (more often) in combination. Hence, there is no single intervention strategy to prevent these accidents. To gain a better understanding into aircraft loss-of-control events and possible intervention strategies, this paper presents a detailed analysis of loss-of-control accident data (predominantly from Part 121), including worst case combinations of causal and contributing factors and their sequencing. Future potential risks are also considered.
Chaouachi, Maher; El Malki, Redouane; Berard, Aurélie; Romaniuk, Marcel; Laval, Valérie; Brunel, Dominique; Bertheau, Yves
2008-03-26
The labeling of products containing genetically modified organisms (GMO) is linked to their quantification, since a threshold for the fortuitous presence of GMOs in food has been established. This threshold is calculated from a combination of two absolute quantification values: one for the specific GMO target and the second for an endogenous reference gene specific to the taxon. Thus, the development of reliable methods to quantify GMOs using endogenous reference genes in complex matrixes such as food and feed is needed. Plant identification can be difficult in the case of closely related taxa, which moreover are subject to introgression events. Based on the homology of beta-fructosidase sequences obtained from public databases, two pairs of consensus primers were designed for the detection, quantification, and differentiation of four Solanaceae: potato (Solanum tuberosum), tomato (Solanum lycopersicum), pepper (Capsicum annuum), and eggplant (Solanum melongena). Sequence variability was studied first using lines and cultivars (intraspecies sequence variability), then using taxa involved in gene introgressions, and finally using taxonomically close taxa (interspecies sequence variability). This study allowed us to design four highly specific TaqMan-MGB probes. A duplex real-time PCR assay was developed for the simultaneous quantification of tomato and potato. For eggplant and pepper, only simplex real-time PCR tests were developed. The results demonstrated the high specificity and sensitivity of the assays. We therefore conclude that beta-fructosidase can be used as an endogenous reference gene for GMO analysis.
Schouten, Jan P.; McElgunn, Cathal J.; Waaijer, Raymond; Zwijnenburg, Danny; Diepvens, Filip; Pals, Gerard
2002-01-01
We describe a new method for relative quantification of 40 different DNA sequences in an easy to perform reaction requiring only 20 ng of human DNA. Applications shown of this multiplex ligation-dependent probe amplification (MLPA) technique include the detection of exon deletions and duplications in the human BRCA1, MSH2 and MLH1 genes, detection of trisomies such as Down’s syndrome, characterisation of chromosomal aberrations in cell lines and tumour samples and SNP/mutation detection. Relative quantification of mRNAs by MLPA will be described elsewhere. In MLPA, not sample nucleic acids but probes added to the samples are amplified and quantified. Amplification of probes by PCR depends on the presence of probe target sequences in the sample. Each probe consists of two oligonucleotides, one synthetic and one M13 derived, that hybridise to adjacent sites of the target sequence. Such hybridised probe oligonucleotides are ligated, permitting subsequent amplification. All ligated probes have identical end sequences, permitting simultaneous PCR amplification using only one primer pair. Each probe gives rise to an amplification product of unique size between 130 and 480 bp. Probe target sequences are small (50–70 nt). The prerequisite of a ligation reaction provides the opportunity to discriminate single nucleotide differences. PMID:12060695
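The relative quantification step in MLPA-style analyses is commonly expressed as a dosage quotient: each probe's peak area is normalized to reference probes, then the sample ratio is divided by the control ratio. The sketch below illustrates that common calculation; probe names and peak areas are invented, and this is not code from the paper:

```python
# Hypothetical MLPA-style dosage-quotient sketch. A quotient near 0.5
# suggests a heterozygous exon deletion, near 1.0 normal dosage, and near
# 1.5 a duplication. All probe names and peak areas are invented.

def dosage_quotients(sample, control, ref_probes):
    """Dosage quotient per test probe, normalized to reference probes."""
    s_ref = sum(sample[p] for p in ref_probes) / len(ref_probes)
    c_ref = sum(control[p] for p in ref_probes) / len(ref_probes)
    return {p: (sample[p] / s_ref) / (control[p] / c_ref)
            for p in sample if p not in ref_probes}

control = {"ref1": 1000, "ref2": 1000, "BRCA1_ex13": 1000}
sample = {"ref1": 1000, "ref2": 1000, "BRCA1_ex13": 500}
print(dosage_quotients(sample, control, ["ref1", "ref2"]))
# {'BRCA1_ex13': 0.5}
```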
Motor vehicle seat belt restraint system analysis during rollover.
Meyer, Steven E; Hock, Davis; Forrest, Stephen; Herbst, Brian; Sances, Anthony; Kumaresan, Srirangam
2003-01-01
The multi-planar, multiple-impact, long-duration accident sequence of a real-world rollover results in multidirectional vehicle acceleration pulses and multiplanar occupant motions not typically seen in a planar crash sequence. Various researchers have documented that, while contemporary production emergency locking seatbelt retractors (ELRs) are extremely effective in the planar crashes in which they are extensively evaluated, their response may differ from expectations when subjected to multi-planar acceleration environments. Specifically, accelerations in the vertical plane have been shown to substantially affect the timeliness with which the retractor's inertial sensor moves out of its neutral position and locks the seat belt. An analysis of vehicle occupant motions relative to the acceleration pulses sensed at the retractor location indicates a time phase shift that, under certain circumstances, can result in unexpected seat belt spool-out and occupant excursions in these multi-planar, multiple-impact crash sequences. This paper reviews previous studies of retractor response to these multidirectional (including vertical) acceleration environments, and reviews statistical studies based upon U.S. government-collected data indicating a significant difference in belt usage rates in rollover accidents as compared to all other planar accident modes. A number of real-world accident case studies are reviewed in which ELR-equipped seatbelt systems spooled out. Finally, the typical occupant injuries and the associated mechanisms due to belt spool-out in real-world accidents are delineated.
Nakamura, Asako J.; Suzuki, Masatoshi; Redon, Christophe E.; Kuwahara, Yoshikazu; Yamashiro, Hideaki; Abe, Yasuyuki; Takahashi, Shintaro; Fukuda, Tomokazu; Isogai, Emiko; Bonner, William M.; Fukumoto, Manabu
2017-01-01
The Fukushima Daiichi Nuclear Power Plant (FNPP) accident, the largest nuclear incident since the 1986 Chernobyl disaster, occurred when the plant was hit by a tsunami triggered by the Great East Japan Earthquake on March 11, 2011. The subsequent uncontrolled release of radioactive substances resulted in massive evacuations in a 20-km zone. To better understand the biological consequences of the FNPP accident, we have been measuring DNA damage levels in cattle in the evacuation zone. DNA damage was evaluated by assessing the levels of DNA double-strand breaks in peripheral blood lymphocytes by immunocyto-fluorescence-based quantification of γ-H2AX foci. A greater than two-fold increase in the fraction of damaged lymphocytes was observed in all animal cohorts within the evacuation zone, and the levels of DNA damage decreased slightly over the 700-day sample collection period. While the extent of damage appeared to be independent of the distance from the accident site and the estimated radiation dose from radiocesium, we observed age-dependent accumulation of DNA damage. Thus, this study, which was the first to evaluate the biological impact of the FNPP accident utilizing the γ-H2AX assays, indicated the causal relation between high levels of DNA damage in animals living in the evacuation zone and the FNPP accident. PMID:28240558
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kastenberg, W.E.; Apostolakis, G.; Dhir, V.K.
Severe accident management can be defined as the use of existing and/or alternative resources, systems, and actions to prevent or mitigate a core-melt accident. For each accident sequence and each combination of severe accident management strategies, there may be several options available to the operator, and each involves phenomenological and operational considerations regarding uncertainty. Operational uncertainties include operator, system, and instrumentation behavior during an accident. A framework based on decision trees and influence diagrams has been developed which incorporates criteria such as feasibility, effectiveness, and adverse effects for evaluating potential severe accident management strategies. The framework is also capable of propagating both data and model uncertainty. It is applied to several potential strategies including PWR cavity flooding, BWR drywell flooding, PWR depressurization, and PWR feed and bleed.
Zhao, Shanrong; Zhang, Ying; Gamini, Ramya; Zhang, Baohong; von Schack, David
2018-03-19
To allow efficient transcript/gene detection, highly abundant ribosomal RNAs (rRNA) are generally removed from total RNA either by positive polyA+ selection or by rRNA depletion (negative selection) before sequencing. Comparisons between the two methods have been carried out by various groups, but the assessments have relied largely on non-clinical samples. In this study, we evaluated these two RNA sequencing approaches using human blood and colon tissue samples. Our analyses showed that rRNA depletion captured more unique transcriptome features, whereas polyA+ selection outperformed rRNA depletion with higher exonic coverage and better accuracy of gene quantification. For blood- and colon-derived RNAs, we found that 220% and 50% more reads, respectively, would have to be sequenced to achieve the same level of exonic coverage in the rRNA depletion method compared with the polyA+ selection method. Therefore, in most cases we strongly recommend polyA+ selection over rRNA depletion for gene quantification in clinical RNA sequencing. Our evaluation revealed that a small number of lncRNAs and small RNAs made up a large fraction of the reads in the rRNA depletion RNA sequencing data. Thus, we recommend that these RNAs are specifically depleted to improve the sequencing depth of the remaining RNAs.
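The "X% more reads" figures above follow from a simple proportionality: if a library places a smaller fraction of its reads on exons, proportionally more total reads must be sequenced to reach the same exonic coverage. The exonic fractions in this sketch are invented; only the arithmetic is illustrated:

```python
# Illustrative arithmetic only: the exonic fractions below are invented,
# not the measured values from the study.

def extra_reads_percent(exonic_frac_polya, exonic_frac_depletion):
    """Percent more total reads the rRNA-depletion library must sequence
    to match the polyA+ library's exonic coverage."""
    return 100.0 * (exonic_frac_polya / exonic_frac_depletion - 1.0)

# If polyA+ places 80% of reads on exons but rRNA depletion only 25%,
# the depletion library needs 220% more reads for equal exonic coverage.
print(round(extra_reads_percent(0.80, 0.25)))  # 220
```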
Sanosyan, Armen; Fayd'herbe de Maudave, Alexis; Bollore, Karine; Zimmermann, Valérie; Foulongne, Vincent; Van de Perre, Philippe; Tuaillon, Edouard
2017-01-01
Viral load monitoring and early Epstein-Barr virus (EBV) DNA detection are essential in routine laboratory testing, especially in the preemptive management of post-transplant lymphoproliferative disorder. Targeting the repetitive BamHI-W sequence was shown to increase the sensitivity of EBV DNA quantification, but the variability of BamHI-W reiterations was suggested to be a source of quantification bias. We aimed to assess the extent of variability associated with BamHI-W PCR and its impact on the sensitivity of EBV DNA quantification using the 1st WHO international standard, EBV strains, and clinical samples. The repetitive BamHI-W and single-copy LMP2 sequences were amplified by in-house qPCRs and the BXLF-1 sequence by a commercial assay (EBV R-gene™, BioMerieux). Linearity and limits of detection of the in-house methods were assessed. The impact of repeated versus single target sequences on EBV DNA quantification precision was tested on B95.8 and Raji cell lines, possessing 11 and 7 copies of the BamHI-W sequence, respectively, and on clinical samples. BamHI-W qPCR demonstrated a lower limit of detection compared to LMP2 qPCR (2.33 log10 versus 3.08 log10 IU/mL; P = 0.0002). BamHI-W qPCR underestimated the EBV DNA load on the Raji strain, which contains fewer BamHI-W copies than the WHO standard derived from the B95.8 EBV strain (mean bias: -0.21 log10; 95% CI, -0.54 to 0.12). Comparison of BamHI-W qPCR versus the LMP2 and BXLF-1 qPCRs showed acceptable variability between EBV DNA levels in clinical samples, with the mean bias being within 0.5 log10 IU/mL EBV DNA, whereas a better quantitative concordance was observed between the LMP2 and BXLF-1 assays. Targeting BamHI-W resulted in higher sensitivity compared to LMP2, but the variable reiteration of the BamHI-W segment is associated with higher quantification variability. BamHI-W can be considered for clinical and therapeutic monitoring to detect EBV DNA early and to follow dynamic changes in viral load.
PMID:28850597
Quantification of HCV RNA in Clinical Specimens by Branched DNA (bDNA) Technology.
Wilber, J C; Urdea, M S
1999-01-01
The diagnosis and monitoring of hepatitis C virus (HCV) infection have been aided by the development of HCV RNA quantification assays. As a direct measure of viral load, HCV RNA quantification provides information on viral kinetics and unique insight into the disease process. Branched DNA (bDNA) signal amplification technology provides a novel approach for the direct quantification of HCV RNA in patient specimens. The bDNA assay measures HCV RNA at physiological levels by boosting the reporter signal rather than by replicating target sequences, and thus avoids the errors inherent in the extraction and amplification of target sequences. Inherently quantitative and nonradioactive, the bDNA assay is amenable to routine use in a clinical research setting and has been used by several groups to explore the natural history, pathogenesis, and treatment of HCV infection.
Demeke, Tigst; Eng, Monika
2018-05-01
Droplet digital PCR (ddPCR) has been used for absolute quantification of genetically engineered (GE) events. Absolute quantification of GE events by duplex ddPCR requires appropriate primers and probes for the target and reference gene sequences in order to accurately determine the amount of GE material. Single-copy reference genes are generally preferred for absolute quantification of GE events by ddPCR, but no study has compared reference genes for absolute quantification of GE canola events by this method. The suitability of four endogenous reference sequences (HMG-I/Y, FatA(A), CruA and Ccf) for absolute quantification of GE canola events by ddPCR was investigated, along with the effect of DNA extraction methods and DNA quality on the assessment of reference gene copy numbers. ddPCR results were affected by the use of single- versus two-copy reference genes. The single-copy FatA(A) reference gene was found to be stable and suitable for absolute quantification of GE canola events by ddPCR. For the copy numbers measured, the HMG-I/Y reference gene was less consistent than FatA(A). The expected ddPCR values were underestimated when CruA and Ccf (two-copy endogenous cruciferin sequences) were used, because of their higher copy number. If two-copy reference genes are used for ddPCR, an adjustment must be made in order to obtain accurate results. Real-time quantitative PCR results, in contrast, were not affected by the use of single- versus two-copy reference genes.
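The adjustment for two-copy reference genes amounts to dividing the reference copy concentration by its per-genome copy number before forming the GE ratio. A minimal sketch with hypothetical droplet-count values (function name and numbers are illustrative, not from the study):

```python
def ge_content_percent(target_copies_per_ul, ref_copies_per_ul, ref_gene_copy_number=1):
    """GE content as a percentage of haploid genome equivalents.
    Two-copy references (e.g. CruA, Ccf) overcount genomes unless
    divided by their per-genome copy number."""
    genome_equivalents = ref_copies_per_ul / ref_gene_copy_number
    return 100.0 * target_copies_per_ul / genome_equivalents

# Illustrative concentrations: an unadjusted two-copy reference would
# halve the apparent GE percentage; the adjustment restores it.
print(ge_content_percent(50, 1000, ref_gene_copy_number=1))  # 5.0
print(ge_content_percent(50, 2000, ref_gene_copy_number=2))  # 5.0
```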
Hoshino, Tatsuhiko; Inagaki, Fumio
2017-01-01
Next-generation sequencing (NGS) is a powerful tool for analyzing environmental DNA and provides a comprehensive molecular view of microbial communities. To obtain the copy number of particular sequences in an NGS library, however, additional quantitative analysis such as quantitative PCR (qPCR) or digital PCR (dPCR) is required. Furthermore, the number of sequences in a library does not always reflect the original copy number of a target gene, because of biases introduced by PCR amplification; this makes it difficult to convert the proportion of particular sequences in the NGS library to a copy number using the mass of input DNA. To address this issue, we applied a stochastic labeling approach with random-tag sequences and developed an NGS-based quantification protocol that enables simultaneous sequencing and quantification of the targeted DNA. This quantitative sequencing (qSeq) is initiated by single-primer extension (SPE) using a primer with a random tag adjacent to the 5' end of the target-specific sequence. During SPE, each DNA molecule is stochastically labeled with the random tag. Subsequently, first-round PCR is conducted, specifically targeting the SPE product, followed by second-round PCR to index the library for NGS. The number of random tags is determined only during the SPE step and is therefore not affected by the two rounds of PCR, which may introduce amplification biases. In the case of 16S rRNA genes, after sequencing and taxonomic classification, the absolute copy number of each target phylotype's 16S rRNA gene can be estimated by Poisson statistics from the random tags incorporated at the end of each sequence. To test the feasibility of this approach, the 16S rRNA gene of Sulfolobus tokodaii was subjected to qSeq, which resulted in accurate quantification of 5.0 × 10³ to 5.0 × 10⁴ copies of the 16S rRNA gene.
Furthermore, qSeq was applied to mock microbial communities and environmental samples, and the results were comparable to those obtained using digital PCR and to relative abundances based on a standard sequence library. We demonstrated that the qSeq protocol proposed here provides less-biased absolute copy numbers of each target DNA simultaneously with NGS sequencing. With this new experimental scheme, microbial community compositions can be explored in a more quantitative manner, thus expanding our knowledge of microbial ecosystems in natural environments.
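The Poisson estimation step can be sketched as follows: with a pool of N possible random tags and m input molecules labeled uniformly at random, the expected number of distinct tags observed is N(1 - e^(-m/N)), which can be inverted to estimate m. This is the generic stochastic-labeling estimator, not necessarily the authors' exact implementation:

```python
import math

def molecules_from_tags(unique_tags_observed, tag_pool_size):
    """Poisson estimator for stochastic labeling: invert the expected
    number of distinct tags to recover the input molecule count."""
    k, n = unique_tags_observed, tag_pool_size
    if k >= n:
        raise ValueError("tag pool saturated; estimate undefined")
    return -n * math.log(1.0 - k / n)

# e.g. an 8-nt random tag gives 4**8 = 65536 possible labels;
# observing 4000 distinct tags implies slightly more input molecules,
# since some molecules collide on the same tag.
print(round(molecules_from_tags(4000, 4**8)))  # 4127
```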
Mapping DNA polymerase errors by single-molecule sequencing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lee, David F.; Lu, Jenny; Chang, Seungwoo
Genomic integrity is compromised by DNA polymerase replication errors, which occur in a sequence-dependent manner across the genome. Accurate and complete quantification of a DNA polymerase's error spectrum is challenging because errors are rare and difficult to detect. We report a high-throughput sequencing assay to map in vitro DNA replication errors at the single-molecule level. Unlike previous methods, our assay is able to rapidly detect a large number of polymerase errors at base resolution over any template substrate without quantification bias. To overcome the high error rate of high-throughput sequencing, our assay uses a barcoding strategy in which each replication product is tagged with a unique nucleotide sequence before amplification. This allows multiple sequencing reads of the same product to be compared so that sequencing errors can be found and removed. We demonstrate the ability of our assay to characterize the average error rate, error hotspots and lesion bypass fidelity of several DNA polymerases.
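The barcode-based error removal can be illustrated as a per-position majority vote across reads that share a tag. A simplified sketch of the idea, not the authors' pipeline:

```python
from collections import Counter

def consensus(reads):
    """Collapse multiple reads of the same barcoded product into one
    consensus sequence; positions without a clear majority become 'N'."""
    out = []
    for column in zip(*reads):
        (base, count), = Counter(column).most_common(1)
        out.append(base if count > len(reads) / 2 else "N")
    return "".join(out)

# Three reads of one product: the lone 'G' at position 2 is a sequencing
# error and is outvoted; a true replication error would appear in all reads.
print(consensus(["ACTT", "AGTT", "ACTT"]))  # ACTT
```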
Leung, Ross Ka-Kit; Dong, Zhi Qiang; Sa, Fei; Chong, Cheong Meng; Lei, Si Wan; Tsui, Stephen Kwok-Wing; Lee, Simon Ming-Yuen
2014-02-01
Minor variants have significant implications in quasispecies evolution, early cancer detection and non-invasive fetal genotyping, but their accurate detection by next-generation sequencing (NGS) is hampered by sequencing errors. We generated sequencing data from mixtures at predetermined ratios in order to provide insight into sequencing errors and variations that can arise for which simulation cannot be performed. The information also enables better parameterization of depth of coverage, read quality and heterogeneity, library preparation techniques, and technical repeatability for mathematical modeling, theory development and simulation experimental design. We devised minor variant authentication rules that achieved 100% accuracy in both testing and validation experiments. The rules are free from tedious inspection of alignment accuracy, sequencing read quality or errors introduced by homopolymers. The authentication process only requires minor variants to: (1) have a minimum depth of coverage larger than 30; and (2) be reported by (a) four or more variant callers, or (b) DiBayes or LoFreq, plus SNVer (or BWA when no results are returned by SNVer), with an interassay coefficient of variation (CV) no larger than 0.1. Quantification accuracy undermined by sequencing errors could be overcome neither by ultra-deep sequencing nor by recruiting more variant callers to reach a consensus, such that consistent underestimation and overestimation (i.e. low CV) were observed. To accommodate stochastic error and adjust the observed ratio within a specified accuracy, we presented a proof of concept for the use of a double calibration curve for quantification, which provides an important reference towards potential industrial-scale fabrication of calibrants for NGS.
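The authentication rules read directly as a predicate. The sketch below is a literal transcription of the rules as stated in the abstract, under the assumption that the CV threshold applies to all authenticated variants:

```python
def authenticate_variant(depth, callers, interassay_cv):
    """Minor variant authentication: depth > 30, CV <= 0.1, and either
    four or more callers agree, or DiBayes/LoFreq plus SNVer (with BWA
    standing in when SNVer returns no result)."""
    if depth <= 30 or interassay_cv > 0.1:
        return False
    if len(callers) >= 4:
        return True
    has_key_caller = bool({"DiBayes", "LoFreq"} & callers)
    has_confirmer = bool({"SNVer", "BWA"} & callers)
    return has_key_caller and has_confirmer

print(authenticate_variant(120, {"LoFreq", "SNVer"}, 0.05))  # True
print(authenticate_variant(25, {"DiBayes", "SNVer"}, 0.05))  # False (depth too low)
```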
GMO quantification: valuable experience and insights for the future.
Milavec, Mojca; Dobnik, David; Yang, Litao; Zhang, Dabing; Gruden, Kristina; Zel, Jana
2014-10-01
Cultivation and marketing of genetically modified organisms (GMOs) have been unevenly adopted worldwide. To facilitate international trade and to provide information to consumers, labelling requirements have been set up in many countries. Quantitative real-time polymerase chain reaction (qPCR) is currently the method of choice for detection, identification and quantification of GMOs. The method has been critically assessed and requirements for its performance have been set. Nevertheless, there are challenges that should still be highlighted, such as measuring the quantity and quality of DNA, and determining the qPCR efficiency, possible sequence mismatches, characteristics of taxon-specific genes and appropriate units of measurement, as these remain potential sources of measurement uncertainty. To overcome these problems and to cope with the continuous increase in the number and variety of GMOs, new approaches are needed. Statistical strategies of quantification have already been proposed and expanded with the development of digital PCR. The first attempts have been made to use next-generation sequencing also for quantitative purposes, although accurate quantification of the contents of GMOs using this technology is still a challenge for the future, especially for mixed samples. New approaches are also needed for the quantification of stacks, and for potential quantification of organisms produced by new plant breeding techniques.
Reiman, Mario; Laan, Maris; Rull, Kristiina; Sõber, Siim
2017-08-01
RNA degradation is a ubiquitous process that occurs in living and dead cells, as well as during handling and storage of extracted RNA. Reduced RNA quality caused by degradation is an established source of uncertainty for all RNA-based gene expression quantification techniques. RNA sequencing is an increasingly preferred method for transcriptome analyses, and dependence of its results on input RNA integrity is of significant practical importance. This study aimed to characterize the effects of varying input RNA integrity [estimated as RNA integrity number (RIN)] on transcript level estimates and delineate the characteristic differences between transcripts that differ in degradation rate. The study used ribodepleted total RNA sequencing data from a real-life clinically collected set ( n = 32) of human solid tissue (placenta) samples. RIN-dependent alterations in gene expression profiles were quantified by using DESeq2 software. Our results indicate that small differences in RNA integrity affect gene expression quantification by introducing a moderate and pervasive bias in expression level estimates that significantly affected 8.1% of studied genes. The rapidly degrading transcript pool was enriched in pseudogenes, short noncoding RNAs, and transcripts with extended 3' untranslated regions. Typical slowly degrading transcripts (median length, 2389 nt) represented protein coding genes with 4-10 exons and high guanine-cytosine content.-Reiman, M., Laan, M., Rull, K., Sõber, S. Effects of RNA integrity on transcript quantification by total RNA sequencing of clinically collected human placental samples. © FASEB.
Piazza, Rocco; Magistroni, Vera; Pirola, Alessandra; Redaelli, Sara; Spinelli, Roberta; Redaelli, Serena; Galbiati, Marta; Valletta, Simona; Giudici, Giovanni; Cazzaniga, Giovanni; Gambacorti-Passerini, Carlo
2013-01-01
Copy number alterations (CNAs) are common events in leukaemias and solid tumors. Comparative Genome Hybridization (CGH) is currently the gold-standard technique for analyzing CNAs; however, CGH analysis requires dedicated instruments and can perform only low-resolution Loss of Heterozygosity (LOH) analyses. Here we present CEQer (Comparative Exome Quantification analyzer), a new graphical, event-driven tool for coupled CNA/allelic-imbalance (AI) analysis of exome sequencing data. Using case-control matched exome data, CEQer performs a comparative digital exonic quantification to generate CNA data and couples this information with exome-wide LOH and allelic imbalance detection. These data are used to build mixed statistical/heuristic models allowing the identification of CNA/AI events. To test our tool, we initially used in silico-generated data; we then performed whole-exome sequencing on 20 leukemic specimens and corresponding matched controls and analyzed the results using CEQer. Taken globally, these analyses showed that the combined use of comparative digital exon quantification and LOH/AI detection generates very accurate CNA data. We therefore propose CEQer as an efficient, robust and user-friendly graphical tool for the identification of CNA/AI in whole-exome sequencing data.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Klinestiver, L.R.
Psychophysiological factors are not uncommon terms in the aviation incident/accident investigation sequence where human error is involved. It is strongly suspected that the same psychophysiological factors also exist in the industrial arena where operator personnel function, but there is little evidence in the literature indicating how management and subordinates cope with these factors to prevent or reduce accidents. Human factors psychophysiological training is well established in the aviation industry. However, while the industrial arena appears to analyze psychophysiological factors in accident investigations, there is little evidence that established training programs exist for supervisors and operator personnel.
A Qualitative Study on Organizational Factors Affecting Occupational Accidents
ESKANDARI, Davood; JAFARI, Mohammad Javad; MEHRABI, Yadollah; KIAN, Mostafa Pouya; CHARKHAND, Hossein; MIRGHOTBI, Mostafa
2017-01-01
Background: Technical, human, operational and organizational factors have been influencing the sequence of occupational accidents. Among them, organizational factors play a major role in causing occupational accidents. The aim of this research was to understand the Iranian safety experts’ experiences and perception of organizational factors. Methods: This qualitative study was conducted in 2015 by using the content analysis technique. Data were collected through semi-structured interviews with 17 safety experts working in Iranian universities and industries and analyzed with a conventional qualitative content analysis method using the MAXQDA software. Results: Eleven organizational factors’ sub-themes were identified: management commitment, management participation, employee involvement, communication, blame culture, education and training, job satisfaction, interpersonal relationship, supervision, continuous improvement, and reward system. The participants considered these factors as effective on occupational accidents. Conclusion: The mentioned 11 organizational factors are probably involved in occupational accidents in Iran. Naturally, improving organizational factors can increase the safety performance and reduce occupational accidents. PMID:28435824
USDA-ARS?s Scientific Manuscript database
The pathogen causing corky root on lettuce, Sphingobium suberifaciens, is recalcitrant to standard epidemiological methods. Primers were selected from 16S rDNA sequences useful for the specific detection and quantification of S. suberifaciens. Conventional (PCR) and quantitative (qPCR) PCR protocols...
NASA Astrophysics Data System (ADS)
Martel, Dimitri; Tse Ve Koon, K.; Le Fur, Yann; Ratiney, Hélène
2015-11-01
Two-dimensional spectroscopy offers the possibility to unambiguously distinguish metabolites by spreading out the multiplet structure of J-coupled spin systems into a second dimension. Quantification methods that perform parametric fitting of the 2D MRS signal have recently been proposed for J-resolved PRESS (JPRESS) but not explicitly for Localized Correlation Spectroscopy (LCOSY). Here, through a whole-metabolite quantification approach, correlation spectroscopy quantification performance is studied. The ability to quantify metabolite relaxation time constants is studied for three localized 2D MRS sequences (LCOSY, LCTCOSY and JPRESS) in vitro on preclinical MR systems. The issues encountered during implementation and the quantification strategies are discussed with the help of the Fisher matrix formalism. The described parameterized models enable computation of the lower bound on error variance (generally known as the Cramér-Rao bounds (CRBs), a standard of precision) for the parameters estimated from these 2D MRS signal fittings. LCOSY has a theoretical net signal loss of a factor of two per unit of acquisition time compared to JPRESS. A quick analysis might suggest that the relative CRBs of LCOSY compared to JPRESS (expressed as a percentage of the concentration values) should be doubled, but we show that this is not necessarily true. Finally, the LCOSY quantification procedure has been applied to data acquired in vivo on a mouse brain.
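The Cramér-Rao bound computation follows the standard recipe: build the Fisher information matrix from the model Jacobian and the noise variance, invert it, and take square roots of the diagonal. A generic sketch for a least-squares fit with white Gaussian noise (the toy exponential decay model is illustrative, not one of the 2D MRS models):

```python
import numpy as np

def cramer_rao_bounds(jacobian, noise_sd):
    """Lower bounds on parameter standard deviations from the Fisher
    information matrix of a model fit under white Gaussian noise."""
    fisher = jacobian.T @ jacobian / noise_sd**2
    return np.sqrt(np.diag(np.linalg.inv(fisher)))

# Toy model: amplitude * exp(-t/T2), with the Jacobian evaluated
# analytically at assumed true parameter values.
t = np.linspace(0.01, 0.2, 50)
amp, t2 = 1.0, 0.08
d_amp = np.exp(-t / t2)                   # d(model)/d(amplitude)
d_t2 = amp * t / t2**2 * np.exp(-t / t2)  # d(model)/d(T2)
crb = cramer_rao_bounds(np.column_stack([d_amp, d_t2]), noise_sd=0.01)
print(crb.shape)  # (2,)
```

Expressing each bound as a percentage of the corresponding parameter value gives the relative CRBs compared in the abstract.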
NASA Technical Reports Server (NTRS)
Belcastro, Christine M.; Groff, Loren; Newman, Richard L.; Foster, John V.; Crider, Dennis H.; Klyde, David H.; Huston, A. McCall
2014-01-01
Aircraft loss of control (LOC) is a leading cause of fatal accidents across all transport airplane and operational classes, and can result from a wide spectrum of hazards, often occurring in combination. Technologies developed for LOC prevention and recovery must therefore be effective under a wide variety of conditions and uncertainties, including multiple hazards, and their validation must provide a means of assessing system effectiveness and coverage of these hazards. This requires the definition of a comprehensive set of LOC test scenarios based on accident and incident data as well as future risks. This paper defines a comprehensive set of accidents and incidents over a recent 15-year period, and presents preliminary analysis results to identify worst-case combinations of causal and contributing factors (i.e., accident precursors) and how they sequence in time. Such analyses can provide insight into developing effective solutions for LOC, and form the basis for developing test scenarios that can be used in evaluating them. Preliminary findings based on the results of this paper indicate that system failures or malfunctions, crew actions or inactions, vehicle impairment conditions, and vehicle upsets contributed the most to accidents and fatalities, followed by inclement weather or atmospheric disturbances and poor visibility. Follow-on research will include finalizing the analysis through a team consensus process, defining future risks, and developing a comprehensive set of test scenarios with correlation to the accidents, incidents, and future risks. Since enhanced engineering simulations are required for batch and piloted evaluations under realistic LOC precursor conditions, these test scenarios can also serve as a high-level requirement for defining the engineering simulation enhancements needed for generating them.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dallman, R J; Gottula, R C; Holcomb, E E
1987-05-01
An analysis of five anticipated transients without scram (ATWS) was conducted at the Idaho National Engineering Laboratory (INEL). The five detailed deterministic simulations of postulated ATWS sequences were initiated from a main steamline isolation valve (MSIV) closure. The subject of the analysis was the Browns Ferry Nuclear Plant Unit 1, a boiling water reactor (BWR) of the BWR/4 product line with a Mark I containment. The simulations yielded insights into the possible consequences resulting from an MSIV closure ATWS. An evaluation of the effects of plant safety systems and operator actions on accident progression and mitigation is presented.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Behling, H.; Behling, K.; Amarasooriya, H.
1995-02-01
A generic difficulty encountered in cost-benefit analyses is the quantification of the major elements that define the costs and the benefits in commensurate units. In this study, the costs of making KI available for public use and the avoided thyroidal health effects predicted to result from that availability (i.e., the benefits) are expressed in the commensurate units of dollars.
2017-01-01
Unique Molecular Identifiers (UMIs) are random oligonucleotide barcodes that are increasingly used in high-throughput sequencing experiments. Through a UMI, identical copies arising from distinct molecules can be distinguished from those arising through PCR amplification of the same molecule. However, bioinformatic methods to leverage the information from UMIs have yet to be formalized. In particular, sequencing errors in the UMI sequence are often ignored or else resolved in an ad hoc manner. We show that errors in the UMI sequence are common and introduce network-based methods to account for these errors when identifying PCR duplicates. Using these methods, we demonstrate improved quantification accuracy both under simulated conditions and real iCLIP and single-cell RNA-seq data sets. Reproducibility between iCLIP replicates and single-cell RNA-seq clustering are both improved using our proposed network-based method, demonstrating the value of properly accounting for errors in UMIs. These methods are implemented in the open source UMI-tools software package. PMID:28100584
Robustness of Fat Quantification using Chemical Shift Imaging
Hansen, Katie H; Schroeder, Michael E; Hamilton, Gavin; Sirlin, Claude B; Bydder, Mark
2011-01-01
The purpose of this study was to investigate the effect of parameter changes that can potentially lead to unreliable measurements in fat quantification. Chemical shift imaging was performed using spoiled gradient echo sequences with systematic variations in the following: 2D/3D sequence, number of echoes, delta echo time, fractional echo factor, slice thickness, repetition time, flip angle, bandwidth, matrix size, flow compensation and field strength. Results indicated no significant (or significant but small) changes in fat fraction with any parameter. The significant changes can be attributed to known effects of T1 bias and the two forms of noise bias. PMID:22055856
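The metric underlying such studies is the signal fat fraction computed from separated fat and water signals. A minimal sketch (the abstract's point is that T1 bias and noise bias perturb this estimate unless accounted for):

```python
def fat_fraction(fat_signal, water_signal):
    """Signal fat fraction from separated fat/water magnitude images;
    for non-negative signals the result lies in [0, 1]."""
    return fat_signal / (fat_signal + water_signal)

# A voxel whose signal splits 20/80 between fat and water:
print(fat_fraction(20.0, 80.0))  # 0.2
```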
A Cyber-Attack Detection Model Based on Multivariate Analyses
NASA Astrophysics Data System (ADS)
Sakai, Yuto; Rinsaka, Koichiro; Dohi, Tadashi
In the present paper, we propose a novel cyber-attack detection model that applies two multivariate-analysis methods to the audit data observed on a host machine. The statistical techniques used here are the well-known Hayashi's quantification method IV and the cluster analysis method. We quantify the observed qualitative audit event sequence via quantification method IV and collect similar audit event sequences into the same groups based on the cluster analysis. Simulation experiments show that our model can improve cyber-attack detection accuracy in some realistic cases where both normal and attack activities are intermingled.
Offsite radiological consequence analysis for the bounding flammable gas accident
DOE Office of Scientific and Technical Information (OSTI.GOV)
CARRO, C.A.
2003-03-19
The purpose of this analysis is to calculate the offsite radiological consequence of the bounding flammable gas accident. DOE-STD-3009-94, ''Preparation Guide for U.S. Department of Energy Nonreactor Nuclear Facility Documented Safety Analyses'', requires the formal quantification of a limited subset of accidents representing a complete set of bounding conditions. The results of these analyses are then evaluated to determine if they challenge the DOE-STD-3009-94, Appendix A, ''Evaluation Guideline,'' of 25 rem total effective dose equivalent in order to identify and evaluate safety class structures, systems, and components. The bounding flammable gas accident is a detonation in a single-shell tank (SST). A detonation versus a deflagration was selected for analysis because the faster flame speed of a detonation can potentially result in a larger release of respirable material. As will be shown, the consequences of a detonation in either an SST or a double-shell tank (DST) are approximately equal. A detonation in an SST was selected as the bounding condition because the estimated respirable release masses are the same and because the doses per unit quantity of waste inhaled are generally greater for SSTs than for DSTs. Appendix A contains a DST analysis for comparison purposes.
Li, Wei; Abram, François; Pelletier, Jean-Pierre; Raynauld, Jean-Pierre; Dorais, Marc; d'Anjou, Marc-André; Martel-Pelletier, Johanne
2010-01-01
Joint effusion is frequently associated with osteoarthritis (OA) flare-up and is an important marker of therapeutic response. This study aimed at developing and validating a fully automated system based on magnetic resonance imaging (MRI) for the quantification of joint effusion volume in knee OA patients. MRI examinations consisted of two axial sequences: a T2-weighted true fast imaging with steady-state precession and a T1-weighted gradient echo. An automated joint effusion volume quantification system using MRI was developed and validated (a) with calibrated phantoms (cylinder and sphere) and effusion from knee OA patients; (b) with assessment by manual quantification; and (c) by direct aspiration. Twenty-five knee OA patients with joint effusion were included in the study. The automated joint effusion volume quantification was developed as a four-stage sequential process: bone segmentation, filtering of unrelated structures, segmentation of joint effusion, and subvoxel volume calculation. Validation experiments revealed excellent coefficients of variation with the calibrated cylinder (1.4%) and sphere (0.8%) phantoms. Comparison of the OA knee joint effusion volume assessed by the developed automated system and by manual quantification was also excellent (r = 0.98; P < 0.0001), as was the comparison with direct aspiration (r = 0.88; P = 0.0008). The newly developed fully automated MRI-based system provided precise quantification of OA knee joint effusion volume with excellent correlation with data from phantoms, a manual system, and joint aspiration. Such an automated system will be instrumental in improving the reproducibility/reliability of the evaluation of this marker in clinical application.
Novel Primer Sets for Next Generation Sequencing-Based Analyses of Water Quality
Lee, Elvina; Khurana, Maninder S.; Whiteley, Andrew S.; Monis, Paul T.; Bath, Andrew; Gordon, Cameron; Ryan, Una M.; Paparini, Andrea
2017-01-01
Next generation sequencing (NGS) has rapidly become an invaluable tool for the detection, identification and relative quantification of environmental microorganisms. Here, we demonstrate two new 16S rDNA primer sets, which are compatible with NGS approaches and are primarily for use in water quality studies. Compared to 16S rRNA gene-based universal primers, in silico and experimental analyses demonstrated that the new primers showed increased specificity for the Cyanobacteria and Proteobacteria phyla, allowing increased sensitivity for the detection, identification and relative quantification of toxic bloom-forming microalgae, microbial water quality bioindicators and common pathogens. Significantly, Cyanobacterial and Proteobacterial sequences accounted for ca. 95% of all sequences obtained within NGS runs (when compared to ca. 50% with standard universal NGS primers), providing higher sensitivity and greater phylogenetic resolution of key water quality microbial groups. The increased selectivity of the new primers allows the parallel sequencing of more samples through reduced sequence retrieval levels required to detect target groups, potentially reducing NGS costs by 50% but still guaranteeing optimal coverage and species discrimination. PMID:28118368
Meng, Yanan; Liu, Xin; Wang, Shu; Zhang, Dabing; Yang, Litao
2012-01-11
To enforce labeling regulations for genetically modified organisms (GMOs), the application of DNA plasmids as calibrants is becoming essential for the practical quantification of GMOs. This study reports the construction of plasmid pTC1507 for a quantification assay of genetically modified (GM) maize TC1507 and a collaborative ring trial for international validation of its applicability as a plasmid calibrant. pTC1507 includes one event-specific sequence of TC1507 maize and one unique sequence of the maize endogenous gene zSSIIb. A total of eight GMO detection laboratories worldwide were invited to join the validation process, and test results were returned from all eight participants. Statistical analysis of the returned results showed that real-time PCR assays using pTC1507 as a calibrant in both GM event-specific and endogenous gene quantifications had high PCR efficiency (ranging from 0.80 to 1.15) and good linearity (ranging from 0.9921 to 0.9998). In a quantification assay of five blind samples, the bias between the test values and the true values ranged from 2.6 to 24.9%. All results indicated that the developed pTC1507 plasmid is applicable for the quantitative analysis of TC1507 maize and can be used as a suitable substitute for dried powder certified reference materials (CRMs).
Loss of DHR sequences at Browns Ferry Unit One - accident-sequence analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cook, D.H.; Grene, S.R.; Harrington, R.M.
1983-05-01
This study describes the predicted response of Unit One at the Browns Ferry Nuclear Plant to a postulated loss of decay heat removal (DHR) capability following scram from full power with the power conversion system unavailable. In accident sequences without DHR capability, the residual heat removal (RHR) system functions of pressure suppression pool cooling and reactor vessel shutdown cooling are unavailable. Consequently, all decay heat energy is stored in the pressure suppression pool with a concomitant increase in pool temperature and primary containment pressure. With the assumption that DHR capability is not regained during the lengthy course of this accident sequence, the containment ultimately fails by overpressurization. Although unlikely, this catastrophic failure might lead to loss of the ability to inject cooling water into the reactor vessel, causing subsequent core uncovery and meltdown. The timing of these events and the effective mitigating actions that might be taken by the operator are discussed in this report.
RAMONA-3B application to Browns Ferry ATWS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Slovik, G.C.; Neymotin, L.Y.; Saha, P.
1985-01-01
The Anticipated Transient Without Scram (ATWS) is known to be a dominant accident sequence for possible core melt in a Boiling Water Reactor (BWR). A recent Probabilistic Risk Assessment (PRA) analysis for the Browns Ferry nuclear power plant indicates that ATWS is the second most dominant transient for core melt in a BWR/4 with Mark I containment; the most dominant is failure of the long-term decay heat removal function of the Residual Heat Removal (RHR) system. Of the various ATWS scenarios, the Main Steam Isolation Valve (MSIV) closure ATWS sequence was chosen for the present analysis because of its relatively high frequency of occurrence and its challenge to the residual heat removal system and containment integrity. The objective of this paper is to discuss four MSIV closure ATWS calculations using the RAMONA-3B code. The paper is a summary of a report being prepared for the USNRC Severe Accident Sequence Analysis (SASA) program, which should be referred to for details. 10 refs., 20 figs., 3 tabs.
Inverse modelling of radionuclide release rates using gamma dose rate observations
NASA Astrophysics Data System (ADS)
Hamburger, Thomas; Evangeliou, Nikolaos; Stohl, Andreas; von Haustein, Christoph; Thummerer, Severin; Wallner, Christian
2015-04-01
Severe accidents in nuclear power plants, such as the historical accident in Chernobyl in 1986 or the more recent disaster at the Fukushima Dai-ichi nuclear power plant in 2011, have drastic impacts on the population and environment. Observations and dispersion modelling of the released radionuclides help to assess the regional impact of such nuclear accidents. Modelling the increase in regional radionuclide activity concentrations that results from nuclear accidents is subject to a multiplicity of uncertainties. One of the most significant is the estimation of the source term, that is, the time-dependent quantification of the spectrum of radionuclides released during the course of the nuclear accident. The source term may either remain uncertain (e.g. Chernobyl, Devell et al., 1995) or rely on estimates given by the operators of the nuclear power plant. Precise measurements are mostly missing due to practical limitations during the accident. The release rates of radionuclides at the accident site can be estimated using inverse modelling (Davoine and Bocquet, 2007). The accuracy of the method depends, among other factors, on the availability, reliability and spatio-temporal resolution of the observations used. Radionuclide activity concentrations are observed on a relatively sparse grid, and the temporal resolution of the available data may be low, on the order of hours or a day. Gamma dose rates, on the other hand, are observed routinely on a much denser grid and at higher temporal resolution, and therefore provide a broader basis for inverse modelling (Saunier et al., 2013). We present a new inversion approach, which combines an atmospheric dispersion model with observations of radionuclide activity concentrations and gamma dose rates to obtain the source term of radionuclides. We use the Lagrangian particle dispersion model FLEXPART (Stohl et al., 1998; Stohl et al., 2005) to model the atmospheric transport of the released radionuclides. 
The inversion method uses a Bayesian formulation considering uncertainties for the a priori source term and the observations (Eckhardt et al., 2008, Stohl et al., 2012). The a priori information on the source term is a first guess. The gamma dose rate observations are used to improve the first guess and to retrieve a reliable source term. The details of this method will be presented at the conference. This work is funded by the Bundesamt für Strahlenschutz BfS, Forschungsvorhaben 3612S60026. References Davoine, X. and Bocquet, M., Atmos. Chem. Phys., 7, 1549-1564, 2007. Devell, L., et al., OCDE/GD(96)12, 1995. Eckhardt, S., et al., Atmos. Chem. Phys., 8, 3881-3897, 2008. Saunier, O., et al., Atmos. Chem. Phys., 13, 11403-11421, 2013. Stohl, A., et al., Atmos. Environ., 32, 4245-4264, 1998. Stohl, A., et al., Atmos. Chem. Phys., 5, 2461-2474, 2005. Stohl, A., et al., Atmos. Chem. Phys., 12, 2313-2343, 2012.
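The Bayesian cost function minimised in this family of inversions (in the spirit of Eckhardt et al., 2008; the abstract does not give the exact formulation) balances misfit to the observations against deviation from the a priori source term. A minimal single-parameter sketch in Python, with all values illustrative:

```python
def invert_source(m, y, x_a, sigma_o, sigma_a):
    """Minimise J(x) = sum_i ((m_i*x - y_i)/sigma_o)**2 + ((x - x_a)/sigma_a)**2
    for a single scalar release rate x, given model sensitivities m_i
    (predicted gamma dose rate per unit release) and observed dose rates y_i.
    Setting dJ/dx = 0 gives this closed-form minimiser."""
    num = sum(mi * yi for mi, yi in zip(m, y)) / sigma_o**2 + x_a / sigma_a**2
    den = sum(mi * mi for mi in m) / sigma_o**2 + 1.0 / sigma_a**2
    return num / den

# Low-noise observations consistent with a true release rate of 5.0
# pull the estimate away from the first guess x_a = 1.0:
x_hat = invert_source(m=[0.2, 0.5, 1.0], y=[1.0, 2.5, 5.0],
                      x_a=1.0, sigma_o=0.01, sigma_a=10.0)
```

Tight observation uncertainty (small sigma_o) lets the data dominate; a tight prior (small sigma_a) keeps the estimate near the first guess, mirroring the role of the a priori source term described above.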
Chemistry, the Central Science? The History of the High School Science Sequence
ERIC Educational Resources Information Center
Sheppard, Keith; Robbins, Dennis M.
2005-01-01
Chemistry became the ''central science'' in US high schools not by design but by accident. Three important factors that influenced the high school science sequence are traced, and their impact on the development of US science education is discussed.
Expert systems for fault diagnosis in nuclear reactor control
NASA Astrophysics Data System (ADS)
Jalel, N. A.; Nicholson, H.
1990-11-01
An expert system for accident analysis and fault diagnosis of the Loss Of Fluid Test (LOFT) reactor, a small-scale pressurized water reactor, was developed for a personal computer. The system's knowledge is represented using a production-rule approach with a backward-chaining inference engine. Its database includes simulated dependent state variables of the LOFT reactor model. A second system is designed to assist the operator in choosing the appropriate cooling mode and to diagnose faults in the selected cooling system. Its knowledge base is built on the response tree, which links a list of very specific accident sequences to a set of generic emergency procedures, helping the operator monitor system status, differentiate between accident sequences, and select the correct procedures. Both systems are written in the TURBO PROLOG language and can be run on an IBM PC compatible with 640k RAM, a 40 Mbyte hard disk, and color graphics.
Quantitative proteome analysis using isobaric peptide termini labeling (IPTL).
Arntzen, Magnus O; Koehler, Christian J; Treumann, Achim; Thiede, Bernd
2011-01-01
The quantitative comparison of proteome level changes across biological samples has become an essential feature in proteomics that remains challenging. We have recently introduced isobaric peptide termini labeling (IPTL), a novel strategy for isobaric quantification based on the derivatization of peptide termini with complementary isotopically labeled reagents. Unlike non-isobaric quantification methods, sample complexity at the MS level is not increased, providing improved sensitivity and protein coverage. The distinguishing feature of IPTL when comparing it to more established isobaric labeling methods (iTRAQ and TMT) is the presence of quantification signatures in all sequence-determining ions in MS/MS spectra, not only in the low mass reporter ion region. This makes IPTL a quantification method that is accessible to mass spectrometers with limited capabilities in the low mass range. Also, the presence of several quantification points in each MS/MS spectrum increases the robustness of the quantification procedure.
A smart phone-based pocket fall accident detection, positioning, and rescue system.
Kau, Lih-Jen; Chen, Chih-Sheng
2015-01-01
We propose in this paper a novel algorithm and architecture for fall accident detection and a corresponding wide-area rescue system based on a smart phone and the third generation (3G) networks. To realize the fall detection algorithm, the angles acquired by the electronic compass (e-compass) and the waveform sequence of the triaxial accelerometer on the smart phone are used as the system inputs. The acquired signals are used to generate an ordered feature sequence, which is then examined in a sequential manner by the proposed cascade classifier for recognition. Once the corresponding feature is verified by the classifier at the current state, the system can proceed to the next state; otherwise, it resets to the initial state and waits for the appearance of another feature sequence. Once a fall accident event is detected, the user's position can be acquired by the global positioning system (GPS) or the assisted GPS, and sent to the rescue center via the 3G communication network so that the user can get medical help immediately. With the proposed cascaded classification architecture, the computational burden and power consumption on the smart phone can be alleviated. Moreover, in experiments in which a set of 450 test actions in nine different kinds of activities was evaluated with the proposed cascaded classifier, a distinguished fall accident detection accuracy of up to 92% sensitivity and 99.75% specificity was obtained, which justifies the superiority of the proposed algorithm.
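The verify-or-reset behaviour of such a cascade classifier can be sketched generically; the stages and thresholds below are illustrative assumptions, not the paper's trained features or parameters.

```python
def cascade_classify(feature_seq, checks):
    """Sequential cascade: each incoming feature must pass the check for
    the current stage to advance; any failure resets to the initial state
    so the system waits for a new feature sequence. Returns True only if
    every stage is verified in order."""
    state = 0
    for f in feature_seq:
        if checks[state](f):
            state += 1
            if state == len(checks):
                return True  # all stages verified -> fall detected
        else:
            state = 0  # reset and wait for another feature sequence
    return False

# Hypothetical stages on accelerometer magnitude (in g):
checks = [
    lambda a: a < 0.5,   # near-weightlessness during the fall
    lambda a: a > 2.5,   # impact spike
    lambda a: a < 1.2,   # lying still afterwards
]
detected = cascade_classify([1.0, 0.3, 3.0, 1.0], checks)  # True
```

Because later, more expensive checks run only after earlier ones pass, most non-fall activity is rejected cheaply, which is the source of the computational and power savings claimed for the cascaded architecture.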
Short RNA indicator sequences are not completely degraded by autoclaving
Unnithan, Veena V.; Unc, Adrian; Joe, Valerisa; Smith, Geoffrey B.
2014-01-01
Short indicator RNA sequences (<100 bp) persist after autoclaving and are recovered intact by molecular amplification. Primers targeting longer sequences are most likely to produce false positives due to amplification errors, which are easily verified by melting curve analyses. If short indicator RNA sequences are used for virus identification and quantification, then post-autoclave RNA degradation methodology should be employed, which may include further autoclaving. PMID:24518856
Scoping Study Investigating PWR Instrumentation during a Severe Accident Scenario
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rempe, J. L.; Knudson, D. L.; Lutz, R. J.
The accidents at the Three Mile Island Unit 2 (TMI-2) and Fukushima Daiichi Units 1, 2, and 3 nuclear power plants demonstrate the critical importance of accurate, relevant, and timely information on the status of reactor systems during a severe accident. These events also highlight the critical importance of understanding and focusing on the key elements of system status information in an environment where operators may be overwhelmed with superfluous and sometimes conflicting data. While progress in these areas has been made since TMI-2, the events at Fukushima suggest that there may still be a potential need to ensure that critical plant information is available to plant operators. Recognizing the significant technical and economic challenges associated with plant modifications, it is important to focus on instrumentation that can address these critical information needs. As part of a program initiated by the Department of Energy, Office of Nuclear Energy (DOE-NE), a scoping effort was initiated to assess the critical information needs identified for severe accident management and mitigation in commercial Light Water Reactors (LWRs), to quantify the environments that instruments monitoring this data would have to survive, and to identify gaps where predicted environments exceed instrumentation qualification envelope (QE) limits. Results from the Pressurized Water Reactor (PWR) scoping evaluations are documented in this report. The PWR evaluations were limited in this scoping evaluation to quantifying the environmental conditions for an unmitigated Short-Term Station BlackOut (STSBO) sequence in one unit at the Surry nuclear power station. Results were obtained using the MELCOR models developed for the US Nuclear Regulatory Commission (NRC)-sponsored State of the Art Consequence Assessment (SOARCA) program. 
Results from this scoping evaluation indicate that some instrumentation identified to provide critical information would be exposed to conditions that significantly exceed QE limits for extended time periods for the low-frequency STSBO sequence evaluated in this study. It is recognized that the core damage frequency (CDF) of the sequence evaluated in this scoping effort would be considerably lower if evaluations considered the new FLEX equipment being installed by industry. Nevertheless, because of uncertainties in instrumentation response when exposed to conditions beyond QE limits, and alternate challenges associated with different sequences that may impact sensor performance, it is recommended that additional evaluations of instrumentation performance be completed to provide confidence that operators have access to accurate, relevant, and timely information on the status of reactor systems for a broad range of challenges associated with risk-important severe accident sequences.
Quan, Phenix-Lan; Sauzade, Martin
2018-01-01
Digital Polymerase Chain Reaction (dPCR) is a novel method for the absolute quantification of target nucleic acids. Quantification by dPCR hinges on the fact that the random distribution of molecules in many partitions follows a Poisson distribution. Each partition acts as an individual PCR microreactor and partitions containing amplified target sequences are detected by fluorescence. The proportion of PCR-positive partitions suffices to determine the concentration of the target sequence without a need for calibration. Advances in microfluidics enabled the current revolution of digital quantification by providing efficient partitioning methods. In this review, we compare the fundamental concepts behind the quantification of nucleic acids by dPCR and quantitative real-time PCR (qPCR). We detail the underlying statistics of dPCR and explain how it defines its precision and performance metrics. We review the different microfluidic digital PCR formats, present their underlying physical principles, and analyze the technological evolution of dPCR platforms. We present the novel multiplexing strategies enabled by dPCR and examine how isothermal amplification could be an alternative to PCR in digital assays. Finally, we determine whether the theoretical advantages of dPCR over qPCR hold true by perusing studies that directly compare assays implemented with both methods. PMID:29677144
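The Poisson relationship at the heart of dPCR quantification can be made concrete with a short sketch (an illustration of the standard formula, not code from the review): with molecules distributed randomly across partitions, the mean number of copies per partition λ is recovered from the observed fraction p of PCR-positive partitions as λ = -ln(1 - p), and dividing by the partition volume yields absolute concentration with no calibration curve.

```python
import math

def dpcr_copies_per_partition(n_positive, n_total):
    """Estimate mean target copies per partition (lambda) from the
    fraction of PCR-positive partitions, assuming Poisson loading."""
    p = n_positive / n_total
    if p >= 1.0:
        raise ValueError("all partitions positive: too concentrated to quantify")
    return -math.log(1.0 - p)

def dpcr_concentration(n_positive, n_total, partition_volume_ul):
    """Absolute target concentration in copies/uL."""
    lam = dpcr_copies_per_partition(n_positive, n_total)
    return lam / partition_volume_ul

# Example: 4000 of 20000 droplets positive in hypothetical ~0.85 nL droplets
lam = dpcr_copies_per_partition(4000, 20000)  # ~0.223 copies/partition
conc = dpcr_concentration(4000, 20000, 0.00085)
```

Note that p counts positives only; the correction -ln(1 - p) accounts for partitions that received more than one molecule, which is why dPCR remains accurate well above one copy per partition.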
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mays, S.E.; Poloski, J.P.; Sullivan, W.H.
1982-07-01
A probabilistic risk assessment (PRA) was made of the Browns Ferry, Unit 1, nuclear plant as part of the Nuclear Regulatory Commission's Interim Reliability Evaluation Program (IREP). Specific goals of the study were to identify the dominant contributors to core melt, develop a foundation for more extensive use of PRA methods, expand the cadre of experienced PRA practitioners, and apply procedures for extension of IREP analyses to other domestic light water reactors. Event tree and fault tree analyses were used to estimate the frequency of accident sequences initiated by transients and loss of coolant accidents. External events such as floods, fires, earthquakes, and sabotage were beyond the scope of this study and were, therefore, excluded. From these sequences, the dominant contributors to probable core melt frequency were chosen. Uncertainty and sensitivity analyses were performed on these sequences to better understand the limitations associated with the estimated sequence frequencies. Dominant sequences were grouped according to common containment failure modes and corresponding release categories on the basis of comparison with analyses of similar designs rather than on the basis of detailed plant-specific calculations.
Streaming fragment assignment for real-time analysis of sequencing experiments
Roberts, Adam; Pachter, Lior
2013-01-01
We present eXpress, a software package for highly efficient probabilistic assignment of ambiguously mapping sequenced fragments. eXpress uses a streaming algorithm with linear run time and constant memory use. It can determine abundances of sequenced molecules in real time, and can be applied to ChIP-seq, metagenomics and other large-scale sequencing data. We demonstrate its use on RNA-seq data, showing greater efficiency than other quantification methods. PMID:23160280
Padmanaban, Jeya; Shields, Leland E; Scheibe, Robert R; Eyges, Vitaly E
2008-10-01
This study investigated 478 police accident reports from 9 states to examine and characterize rollover crashes involving ESC-equipped vehicles. The focus was on the sequence of critical events leading to loss of control and rollover, and the interactions between the accident, driver, and environment. Results show that, while ESC is effective in reducing loss of control leading to certain rollover crashes, its effectiveness is diminished in others, particularly when the vehicle departs the roadway or when environmental factors such as slick road conditions or driver factors such as speeding, distraction, fatigue, impairment, or overcorrection are present.
Code of Federal Regulations, 2010 CFR
2010-01-01
... licensed before 1997, or use simplified, inherent, passive, or other innovative means to accomplish their... sequences, including equilibrium core conditions; or (2) There has been acceptable testing of a prototype... accident sequences, including equilibrium core conditions. If a prototype plant is used to comply with the...
2013-01-01
The formalin-fixed, paraffin-embedded (FFPE) biopsy is a challenging sample for molecular assays such as targeted next-generation sequencing (NGS). We compared three methods for FFPE DNA quantification, including a novel PCR assay (‘QFI-PCR’) that measures the absolute copy number of amplifiable DNA, across 165 residual clinical specimens. The results reveal the limitations of commonly used approaches, and demonstrate the value of an integrated workflow using QFI-PCR to improve the accuracy of NGS mutation detection and guide changes in input that can rescue low quality FFPE DNA. These findings address a growing need for improved quality measures in NGS-based patient testing. PMID:24001039
Development of economic consequence methodology for process risk analysis.
Zadakbar, Omid; Khan, Faisal; Imtiaz, Syed
2015-04-01
A comprehensive methodology for economic consequence analysis with appropriate models for risk analysis of process systems is proposed. This methodology uses loss functions to relate process deviations in a given scenario to economic losses. It consists of four steps: definition of a scenario, identification of losses, quantification of losses, and integration of losses. In this methodology, the process deviations that contribute to a given accident scenario are identified and mapped to assess potential consequences. Losses are assessed with an appropriate loss function (revised Taguchi, modified inverted normal) for each type of loss. The total loss is quantified by integrating different loss functions. The proposed methodology has been examined on two industrial case studies. Implementation of this new economic consequence methodology in quantitative risk assessment will provide better understanding and quantification of risk. This will improve design, decision making, and risk management strategies. © 2014 Society for Risk Analysis.
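The two loss-function families named above can be sketched briefly; the parameter names and values are illustrative assumptions, not the authors' calibrations. The revised Taguchi form grows quadratically with deviation from the process target, while the modified inverted normal form saturates at a maximum loss for large deviations.

```python
import math

def taguchi_loss(y, target, k):
    """Revised Taguchi (quadratic) loss: grows without bound as the
    process variable y deviates from its target; k scales cost."""
    return k * (y - target) ** 2

def inverted_normal_loss(y, target, max_loss, scale):
    """Modified inverted normal loss: zero at the target, saturating at
    max_loss for large deviations (scale sets how quickly it saturates)."""
    return max_loss * (1.0 - math.exp(-((y - target) ** 2) / (2.0 * scale ** 2)))
```

The saturating form is the natural choice when a deviation beyond some point causes a bounded worst-case loss (e.g. total loss of a batch), whereas the quadratic form suits losses that keep growing with deviation; integrating such per-deviation losses over an accident scenario yields the total economic consequence described above.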
DOE Office of Scientific and Technical Information (OSTI.GOV)
Oldenburg, Curtis M.; Budnitz, Robert J.
If Carbon dioxide Capture and Storage (CCS) is to be effective in mitigating climate change, it will need to be carried out on a very large scale. This will involve many thousands of miles of dedicated high-pressure pipelines in order to transport many millions of tonnes of CO 2 annually, with the CO 2 delivered to many thousands of wells that will inject the CO 2 underground. The new CCS infrastructure could rival in size the current U.S. upstream natural gas pipeline and well infrastructure. This new infrastructure entails hazards for life, health, animals, the environment, and natural resources. Pipelines are known to rupture due to corrosion, from external forces such as impacts by vehicles or digging equipment, by defects in construction, or from the failure of valves and seals. Similarly, wells are vulnerable to catastrophic failure due to corrosion, cement degradation, or operational mistakes. While most accidents involving pipelines and wells will be minor, there is the inevitable possibility of accidents with very high consequences, especially to public health. The most important consequence of concern is CO 2 release to the environment in concentrations sufficient to cause death by asphyxiation to nearby populations. Such accidents are thought to be very unlikely, but of course they cannot be excluded, even if major engineering effort is devoted (as it will be) to keeping their probability low and their consequences minimized. This project has developed a methodology for analyzing the risks of these rare but high-consequence accidents, using a step-by-step probabilistic methodology. A key difference between risks for pipelines and wells is that the former are spatially distributed along the pipe whereas the latter are confined to the vicinity of the well. 
Otherwise, the methodology we develop for risk assessment of pipeline and well failures is similar and provides an analysis both of the annual probabilities of accident sequences of concern and of their consequences, and crucially the methodology provides insights into what measures might be taken to mitigate those accident sequences identified as of concern. Mitigating strategies could address reducing the likelihood of an accident sequence of concern, or reducing the consequences, or some combination. The methodology elucidates both local and integrated risks along the pipeline or at the well, providing information useful to decision makers at various levels including local (e.g., property owners and town councils), regional (e.g., county and state representatives), and national levels (federal regulators and corporate proponents).
Deng, Ning; Li, Zhenye; Pan, Chao; Duan, Huilong
2015-01-01
The study of complex proteomes places greater demands on quantification methods using mass spectrometry technology. In this paper, we present a label-free mass spectrometry quantification tool for complex proteomes, called freeQuant, which effectively integrates quantification with functional analysis. freeQuant consists of two well-integrated modules: label-free quantification and functional analysis with biomedical knowledge. freeQuant supports label-free quantitative analysis which makes full use of tandem mass spectrometry (MS/MS) spectral counts, protein sequence length, shared peptides, and ion intensity. It adopts spectral counting for quantitative analysis and introduces a new method for shared peptides to accurately evaluate the abundance of isoforms. For proteins with low abundance, the MS/MS total ion count coupled with the spectral count is included to ensure accurate protein quantification. Furthermore, freeQuant supports large-scale functional annotation for complex proteomes. Mitochondrial proteomes from the mouse heart, the mouse liver, and the human heart were used to evaluate the usability and performance of freeQuant. The evaluation showed that the quantitative algorithms implemented in freeQuant improve the accuracy of quantification with a better dynamic range.
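As a minimal illustration of count-and-length-based label-free quantification (the standard NSAF normalization, which uses the same spectral-count and sequence-length inputs the abstract describes, but is not freeQuant's exact algorithm):

```python
def nsaf(spectral_counts, lengths):
    """Normalized spectral abundance factor: each protein's spectral count
    is divided by its sequence length (longer proteins yield more peptides,
    hence more spectra), then normalized across the proteome so the
    abundances sum to 1."""
    saf = [c / l for c, l in zip(spectral_counts, lengths)]
    total = sum(saf)
    return [s / total for s in saf]

# A short protein with 10 spectra outranks a 4x-longer one with 20:
abundances = nsaf(spectral_counts=[10, 20], lengths=[100, 400])
```

The length correction is what makes raw counts comparable between proteins; schemes like freeQuant then add ion-intensity and shared-peptide handling on top of this kind of baseline.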
Lights, camera, action: high-throughput plant phenotyping is ready for a close-up
USDA-ARS?s Scientific Manuscript database
Modern techniques for crop improvement rely on both DNA sequencing and accurate quantification of plant traits to identify genes and germplasm of interest. With rapid advances in DNA sequencing technologies, plant phenotyping is now a bottleneck in advancing crop yields [1,2]. Furthermore, the envir...
Application of forensic image analysis in accident investigations.
Verolme, Ellen; Mieremet, Arjan
2017-09-01
Forensic investigations are primarily meant to obtain objective answers that can be used for criminal prosecution. Accident analyses are usually performed to learn from incidents and to prevent similar events from occurring in the future. Although the primary goal may be different, the steps in which information is gathered, interpreted and weighed are similar in both types of investigations, implying that forensic techniques can be of use in accident investigations as well. The use in accident investigations usually means that more information can be obtained from the available information than when used in criminal investigations, since the latter require a higher evidence level. In this paper, we demonstrate the applicability of forensic techniques for accident investigations by presenting a number of cases from one specific field of expertise: image analysis. With the rapid spread of digital devices and new media, a wealth of image material and other digital information has become available for accident investigators. We show that much information can be distilled from footage by using forensic image analysis techniques. These applications show that image analysis provides information that is crucial for obtaining the sequence of events and the two- and three-dimensional geometry of an accident. Since accident investigation focuses primarily on learning from accidents and prevention of future accidents, and less on the blame that is crucial for criminal investigations, the field of application of these forensic tools may be broader than would be the case in purely legal sense. This is an important notion for future accident investigations. Copyright © 2017 Elsevier B.V. All rights reserved.
Locomotive crashworthiness research
DOT National Transportation Integrated Search
2015-04-01
conducts research on locomotive crashworthiness. The research approach includes four phases: 1. Accident investigations to assemble sequences of events leading to injury and fatality. 2. Locomotive performance is analyzed, and potential improveme...
Cho, Jin-Young; Lee, Hyoung-Joo; Jeong, Seul-Ki; Paik, Young-Ki
2017-12-01
Mass spectrometry (MS) is a widely used proteome analysis tool for biomedical science. In an MS-based bottom-up proteomic approach to protein identification, sequence database (DB) searching has been routinely used because of its simplicity and convenience. However, searching a sequence DB with multiple variable modification options can increase processing time and false-positive errors in large and complicated MS data sets. Spectral library searching is an alternative solution that avoids the limitations of sequence DB searching and allows the detection of more peptides with high sensitivity. Unfortunately, this technique has less proteome coverage, limiting the detection of novel and complete peptide sequences in biological samples. To solve these problems, we previously developed the "Combo-Spec Search" method, which manually combines multiple reference and simulated spectral library searches to analyze whole proteomes in a biological sample. In this study, we have developed a new analytical interface tool called "Epsilon-Q" to enhance the functions of both the Combo-Spec Search method and label-free protein quantification. Epsilon-Q automatically performs multiple spectral library searches, class-specific false-discovery rate control, and result integration. It has a user-friendly graphical interface and demonstrates good performance in identifying and quantifying proteins by supporting standard MS data formats and spectrum-to-spectrum matching powered by SpectraST. Furthermore, when the Epsilon-Q interface is combined with the Combo-Spec Search method, called the Epsilon-Q system, it shows a synergistic effect, outperforming other sequence DB search engines in identifying and quantifying low-abundance proteins in biological samples. The Epsilon-Q system can be a versatile tool for comparative proteome analysis based on multiple spectral libraries and label-free quantification.
Quantitative interaction proteomics using mass spectrometry.
Wepf, Alexander; Glatter, Timo; Schmidt, Alexander; Aebersold, Ruedi; Gstaiger, Matthias
2009-03-01
We present a mass spectrometry-based strategy for the absolute quantification of protein complex components isolated through affinity purification. We quantified bait proteins via isotope-labeled reference peptides corresponding to an affinity tag sequence and prey proteins by label-free correlational quantification using the precursor ion signal intensities of proteotypic peptides generated in reciprocal purifications. We used this method to quantitatively analyze interaction stoichiometries in the human protein phosphatase 2A network.
Systematic Errors in Peptide and Protein Identification and Quantification by Modified Peptides*
Bogdanow, Boris; Zauber, Henrik; Selbach, Matthias
2016-01-01
The principle of shotgun proteomics is to use peptide mass spectra in order to identify corresponding sequences in a protein database. The quality of peptide and protein identification and quantification critically depends on the sensitivity and specificity of this assignment process. Many peptides in proteomic samples carry biochemical modifications, and a large fraction of unassigned spectra arise from modified peptides. Spectra derived from modified peptides can erroneously be assigned to wrong amino acid sequences. However, the impact of this problem on proteomic data has not yet been investigated systematically. Here we use combinations of different database searches to show that modified peptides can be responsible for 20–50% of false positive identifications in deep proteomic data sets. These false positive hits are particularly problematic as they have significantly higher scores and higher intensities than other false positive matches. Furthermore, these wrong peptide assignments lead to hundreds of false protein identifications and systematic biases in protein quantification. We devise a “cleaned search” strategy to address this problem and show that this considerably improves the sensitivity and specificity of proteomic data. In summary, we show that modified peptides cause systematic errors in peptide and protein identification and quantification and should therefore be considered to further improve the quality of proteomic data annotation. PMID:27215553
Jiang, Lingxi; Yang, Litao; Rao, Jun; Guo, Jinchao; Wang, Shu; Liu, Jia; Lee, Seonghun; Zhang, Dabing
2010-02-01
To implement genetically modified organism (GMO) labeling regulations, event-specific analysis based on the junction sequence between the exogenous integration and host genomic DNA has become the preferred approach for GMO identification and quantification. In this study, specific primers and TaqMan probes based on the revealed 5'-end junction sequence of GM cotton MON15985 were designed, and qualitative and quantitative polymerase chain reaction (PCR) assays were established employing the designed primers and probes. In the qualitative PCR assay, the limit of detection (LOD) was 0.5 g kg(-1) in 100 ng of total cotton genomic DNA, corresponding to about 17 copies of haploid cotton genomic DNA; the LOD and limit of quantification (LOQ) for the quantitative PCR assay were 10 and 17 copies of haploid cotton genomic DNA, respectively. Furthermore, the developed quantitative PCR assays were validated in-house by five different researchers. Five practical samples with known GM contents were also quantified using the developed PCR assay in the in-house validation, and the bias between the true and quantified values ranged from 2.06% to 12.59%. This study shows that the developed qualitative and quantitative PCR methods are applicable for the identification and quantification of GM cotton MON15985 and its derivatives.
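The reported LOD (0.5 g/kg of GM material in 100 ng total DNA, about 17 haploid genome copies) can be sanity-checked with standard DNA-mass arithmetic. In this sketch the cotton genome size and the per-base-pair molar mass are generic textbook assumptions, not values stated in the abstract:

```python
# Sanity check of the reported LOD: 0.5 g/kg GM content in 100 ng of
# total cotton DNA, expressed as haploid genome copies. The genome size
# and base-pair molar mass are generic assumptions, not abstract values.
AVOGADRO = 6.022e23
BP_MASS_G_PER_MOL = 650.0          # average molar mass of one base pair
cotton_genome_bp = 2.4e9           # assumed haploid cotton genome size (bp)

gm_fraction = 0.5e-3               # 0.5 g kg^-1
total_dna_g = 100e-9               # 100 ng of genomic DNA

genome_mass_g = cotton_genome_bp * BP_MASS_G_PER_MOL / AVOGADRO
gm_copies = gm_fraction * total_dna_g / genome_mass_g   # on the order of 17
```

With these assumptions the estimate lands near 19 copies, consistent with the ~17 copies the authors report.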
Horn, T; Chang, C A; Urdea, M S
1997-12-01
The divergent synthesis of bDNA structures is described. This new type of branched DNA contains one unique oligonucleotide, the primary sequence, covalently attached through a comb-like branching network to many identical copies of a different oligonucleotide, the secondary sequence. The bDNA comb molecules were assembled on a solid support using parameters optimized for bDNA synthesis. The chemistry was used to synthesize bDNA comb molecules containing 15 secondary sequences. The bDNA comb molecules were elaborated by enzymatic ligation into branched amplification multimers, large bDNA molecules (a total of 1068 nt) containing an average of 36 repeated DNA oligomer sequences, each capable of hybridizing specifically to an alkaline phosphatase-labeled oligonucleotide. The bDNA comb molecules were characterized by electrophoretic methods and by controlled cleavage at periodate-cleavable moieties incorporated during synthesis. The branched amplification multimers have been used as signal amplifiers in nucleic acid quantification assays for detection of viral infection. It is possible to detect as few as 50 molecules with bDNA technology.
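The amplification arithmetic behind the multimer is straightforward: each captured target hybridizes one multimer carrying an average of 36 repeated probe-binding sites. A minimal sketch, with the one-label-per-repeat assumption made explicit:

```python
# Signal-amplification arithmetic for the branched-DNA multimer described
# above: each captured target hybridizes one multimer carrying an average
# of 36 repeated probe-binding sites, and each site is assumed to bind
# one alkaline-phosphatase-labeled probe (a simplifying assumption).
repeats_per_multimer = 36

def amplified_labels(n_targets, repeats=repeats_per_multimer):
    """Enzyme labels deposited for n captured target molecules."""
    return n_targets * repeats

# At the reported 50-molecule detection limit:
labels_at_lod = amplified_labels(50)
```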
Otte, Dietmar; Jänsch, Michael; Haasper, Carl
2012-01-01
Within a study of accident data from GIDAS (German In-Depth Accident Study), vulnerable road users are investigated regarding injury risk in traffic accidents. GIDAS is the largest in-depth accident study in Germany. Due to a well-defined sampling plan, representativeness with respect to the federal statistics is also guaranteed. A hierarchical system, ACASS (Accident Causation Analysis with Seven Steps), was developed in GIDAS to describe the human causation factors in a chronological sequence. The classified causation factors - derived from the systematic analysis of human accident causes ("7 steps") - can be used to describe the influence of accident causes on the injury outcome. The study is based on accident documentation over the ten years from 1999 to 2008, covering 8204 vulnerable road users (VRU), from which three groups were selected - pedestrians (n=2041), motorcyclists (n=2199) and bicyclists (n=3964) - and analyzed for collisions with cars and trucks as well as for single-VRU accidents. The paper describes the injury patterns and injury mechanisms of these accidents. Injury frequencies and severities are presented for the different types of VRU, accounting for protective measures such as helmets and protective clothing. The impact points on the car are demonstrated, leading to conclusions on protective measures for the vehicle. Existing standards for protection devices, as well as interdisciplinary research including accident and injury statistics, are described. The paper thus summarizes and discusses the existing possibilities for protective measures for pedestrians, bicyclists and motorcyclists by comparing all three groups of vulnerable road users. The relevance of particular impact situations and of the accident causes mainly responsible for severe injuries is also pointed out, giving a new orientation to research on the avoidance and mitigation of these accident patterns. 2010 Elsevier Ltd. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1988-12-15
This section of the Accident Model Document (AMD) presents the appendices which describe the various analyses that have been conducted for use in the Galileo Final Safety Analysis Report II, Volume II. Included in these appendices are the approaches, techniques, conditions and assumptions used in the development of the analytical models plus the detailed results of the analyses. Also included in these appendices are summaries of the accidents and their associated probabilities and environment models taken from the Shuttle Data Book (NSTS-08116), plus summaries of the several segments of the recent GPHS safety test program. The information presented in these appendices is used in Section 3.0 of the AMD to develop the Failure/Abort Sequence Trees (FASTs) and to determine the fuel releases (source terms) resulting from the potential Space Shuttle/IUS accidents throughout the missions.
Upon the reconstruction of accidents triggered by tire explosion. Analytical model and case study
NASA Astrophysics Data System (ADS)
Gaiginschi, L.; Agape, I.; Talif, S.
2017-10-01
Accident reconstruction is important in the general context of increasing road traffic safety. In the casuistry of traffic accidents, those caused by tire explosions are critical in the severity of their consequences, because they usually happen at high speeds. Consequently, knowledge of the running speed of the vehicle involved at the time of the tire explosion is essential to elucidate the circumstances of the accident. The paper presents an analytical model for the kinematics of a vehicle which, after the explosion of one of its tires, begins to skid, overturns and rolls. The model consists of two concurrent approaches built as applications of the momentum conservation and energy conservation principles, and allows determination of the initial speed of the vehicle involved by running the sequence of the road event backwards. The authors also aim to validate the two distinct analytical approaches by calibrating the calculation algorithms on a case study.
2011-01-01
Purpose: Eddy current induced velocity offsets are of concern for accuracy in cardiovascular magnetic resonance (CMR) volume flow quantification. However, currently known theoretical aspects of eddy current behavior have not led to effective guidelines for the optimization of flow quantification sequences. This study is aimed at identifying correlations between protocol parameters and the resulting velocity error in clinical CMR flow measurements in a multi-vendor study. Methods: Nine 1.5T scanners of three different types/vendors were studied. Measurements were performed on a large stationary phantom. Starting from a clinical breath-hold flow protocol, several protocol parameters were varied. Acquisitions were made in three clinically relevant orientations. Additionally, a time delay between the bipolar gradient and read-out, asymmetric versus symmetric velocity encoding, and gradient amplitude and slew rate were studied in adapted sequences as exploratory measurements beyond the protocol. Image analysis determined the worst-case offset for a typical great-vessel flow measurement. Results: The results showed a great variation in offset behavior among scanners (standard deviation among samples of 0.3, 0.4, and 0.9 cm/s for the three different scanner types), even for small changes in the protocol. Considering the absolute values, none of the tested protocol settings consistently reduced the velocity offsets below the critical level of 0.6 cm/s, for either all three orientations or all three scanner types. Using multilevel linear model analysis, oblique aortic and pulmonary slices showed systematically higher offsets than the transverse aortic slices (oblique aortic 0.6 cm/s, and pulmonary 1.8 cm/s higher than transverse aortic).
The exploratory measurements beyond the protocol yielded some new leads for further sequence development towards reduction of velocity offsets; however, those protocols were not always compatible with the time constraints of breath-hold imaging and with flow-related artefacts. Conclusions: This study showed that with current systems there was no generic protocol which resulted in acceptable flow offset values. Protocol optimization would have to be performed on a per-scanner and per-protocol basis. Proper optimization might make accurate (transverse) aortic flow quantification possible for most scanners. Pulmonary flow quantification would still need further (offline) correction. PMID:21388521
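The offline correction mentioned above is commonly done with the stationary-phantom approach used in this study's measurements: the velocity measured in a static phantom is pure eddy-current offset and can be subtracted pixel-wise from the in vivo velocity map. A toy sketch with invented numbers, not study data:

```python
import numpy as np

# Stationary-phantom offset correction sketch: in a static phantom any
# nonzero velocity is eddy-current offset, which can be subtracted
# pixel-wise from the in vivo velocity map. All values are synthetic.
true_velocity = np.full((8, 8), 10.0)    # cm/s, idealized uniform vessel
offset_map = np.full((8, 8), 0.9)        # worst-case offset ~0.9 cm/s

measured_in_vivo = true_velocity + offset_map
measured_phantom = offset_map.copy()     # phantom is static: offset only

corrected = measured_in_vivo - measured_phantom

error_before = abs(measured_in_vivo.mean() - true_velocity.mean())
error_after = abs(corrected.mean() - true_velocity.mean())
```

In this idealized case the residual error drops to zero; in practice the phantom scan must replicate the in vivo protocol and slice geometry for the subtraction to be valid.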
Investigating accident causation through information network modelling.
Griffin, T G C; Young, M S; Stanton, N A
2010-02-01
Management of risk in complex domains such as aviation relies heavily on post-event investigations, requiring complex approaches to fully understand the integration of multi-causal, multi-agent and multi-linear accident sequences. The Event Analysis of Systemic Teamwork methodology (EAST; Stanton et al. 2008) offers such an approach based on network models. In this paper, we apply EAST to a well-known aviation accident case study, highlighting communication between agents as a central theme and investigating the potential for finding agents who were key to the accident. Ultimately, this work aims to develop a new model based on distributed situation awareness (DSA) to demonstrate that the risk inherent in a complex system is dependent on the information flowing within it. By identifying key agents and information elements, we can propose proactive design strategies to optimize the flow of information and help work towards avoiding aviation accidents. Statement of Relevance: This paper introduces a novel application of a holistic methodology for understanding aviation accidents. Furthermore, it introduces an ongoing project developing a nonlinear and prospective method that centralises distributed situation awareness and communication as themes. The relevance of the findings is discussed in the context of current ergonomic and aviation issues of design, training and human-system interaction.
BNL severe-accident sequence experiments and analysis program. [PWR; BWR
DOE Office of Scientific and Technical Information (OSTI.GOV)
Greene, G.A.; Ginsberg, T.; Tutu, N.K.
1983-01-01
In the analysis of degraded core accidents, the two major sources of pressure loading on light water reactor containments are: steam generation from core debris-water thermal interactions; and molten core-concrete interactions. Experiments are in progress at BNL in support of analytical model development related to aspects of the above containment loading mechanisms. The work supports development and evaluation of the CORCON (Muir, 1981) and MARCH (Wooton, 1980) computer codes. Progress in the two programs is described.
Nie, Hui; Evans, Alison A.; London, W. Thomas; Block, Timothy M.; Ren, Xiangdong David
2011-01-01
Hepatitis B virus (HBV) carrying the A1762T/G1764A double mutation in the basal core promoter (BCP) region is associated with HBe antigen seroconversion and increased risk of liver cirrhosis and hepatocellular carcinoma (HCC). Quantification of the mutant viruses may help in predicting the risk of HCC. However, the viral genome tends to have nucleotide polymorphism, which makes it difficult to design hybridization-based assays including real-time PCR. Ultrasensitive quantification of the mutant viruses at the early developmental stage is even more challenging, as the mutant is masked by excessive amounts of the wild-type (WT) viruses. In this study, we developed a selective inhibitory PCR (siPCR) using a locked nucleic acid-based PCR blocker to selectively inhibit the amplification of the WT viral DNA but not the mutant DNA. At the end of siPCR, the proportion of the mutant could be increased by about 10,000-fold, making the mutant more readily detectable by downstream applications such as real-time PCR and DNA sequencing. We also describe a primer-probe partial overlap approach which significantly simplified the melting curve patterns and minimized the influence of viral genome polymorphism on assay accuracy. Analysis of 62 patient samples showed a complete match of the melting curve patterns with the sequencing results. More than 97% of HBV BCP sequences in the GenBank database can be correctly identified by the melting curve analysis. The combination of siPCR and the SimpleProbe real-time PCR enabled mutant quantification in the presence of a 100,000-fold excess of the WT DNA. PMID:21562108
Quantification Bias Caused by Plasmid DNA Conformation in Quantitative Real-Time PCR Assay
Lin, Chih-Hui; Chen, Yu-Chieh; Pan, Tzu-Ming
2011-01-01
Quantitative real-time PCR (qPCR) is the gold standard for the quantification of specific nucleic acid sequences. However, a serious concern has been revealed in a recent report: supercoiled plasmid standards cause significant over-estimation in qPCR quantification. In this study, we investigated the effect of plasmid DNA conformation on the quantification of DNA and the efficiency of qPCR. Our results suggest that plasmid DNA conformation has significant impact on the accuracy of absolute quantification by qPCR. DNA standard curves shifted significantly among plasmid standards with different DNA conformations. Moreover, the choice of DNA measurement method and plasmid DNA conformation may also contribute to the measurement error of DNA standard curves. Due to the multiple effects of plasmid DNA conformation on the accuracy of qPCR, efforts should be made to assure the highest consistency of plasmid standards for qPCR. Thus, we suggest that the conformation, preparation, quantification, purification, handling, and storage of standard plasmid DNA should be described and defined in the Minimum Information for Publication of Quantitative Real-Time PCR Experiments (MIQE) to assure the reproducibility and accuracy of qPCR absolute quantification. PMID:22194997
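For context, absolute qPCR quantification works by regressing Ct against the log copy number of the plasmid standards; any conformation-dependent shift of this curve propagates directly into the estimated copy numbers. A minimal sketch with invented standard-curve values:

```python
import numpy as np

# Hypothetical dilution series of a plasmid standard (values invented,
# not from the study): log10 copy number vs. measured Ct.
log10_copies = np.array([3.0, 4.0, 5.0, 6.0, 7.0])
ct_values = np.array([30.1, 26.8, 23.4, 20.1, 16.7])

# Standard curve: Ct = slope * log10(N0) + intercept
slope, intercept = np.polyfit(log10_copies, ct_values, 1)

# Amplification efficiency E = 10^(-1/slope) - 1 (1.0 means perfect doubling)
efficiency = 10.0 ** (-1.0 / slope) - 1.0

def copies_from_ct(ct):
    """Absolute copy number implied by the standard curve for a given Ct."""
    return 10.0 ** ((ct - intercept) / slope)
```

A supercoiled standard that amplifies with a different apparent efficiency shifts `slope` and `intercept`, which is exactly the over-estimation mechanism the abstract warns about.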
Krafft, Axel J; Loeffler, Ralf B; Song, Ruitian; Tipirneni-Sajja, Aaryani; McCarville, M Beth; Robson, Matthew D; Hankins, Jane S; Hillenbrand, Claudia M
2017-11-01
Hepatic iron content (HIC) quantification via transverse relaxation rate (R2*)-MRI using multi-gradient echo (mGRE) imaging is compromised toward high HIC or at higher fields due to the rapid signal decay. Our study aims at presenting an optimized 2D ultrashort echo time (UTE) sequence for R2* quantification to overcome these limitations. Two-dimensional UTE imaging was realized via half-pulse excitation and radial center-out sampling. The sequence includes chemically selective saturation pulses to reduce streaking artifacts from subcutaneous fat, and spatial saturation (sSAT) bands to suppress out-of-slice signals. The sequence employs interleaved multi-echo readout trains to achieve dense temporal sampling of rapid signal decays. Evaluation was done at 1.5 Tesla (T) and 3T in phantoms, and clinical applicability was demonstrated in five patients with biopsy-confirmed massively high HIC levels (>25 mg Fe/g dry weight liver tissue). In phantoms, the sSAT pulses were found to remove out-of-slice contamination, and R2* results were in excellent agreement to reference mGRE R2* results (slope of linear regression: 1.02/1.00 for 1.5/3T). UTE-based R2* quantification in patients with massive iron overload proved successful at both field strengths and was consistent with biopsy HIC values. The UTE sequence provides a means to measure R2* in patients with massive iron overload, both at 1.5T and 3T. Magn Reson Med 78:1839-1851, 2017. © 2017 Wiley Periodicals, Inc. © 2017 International Society for Magnetic Resonance in Medicine.
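R2* quantification from either mGRE or UTE echo trains amounts to fitting a mono-exponential decay to the echo magnitudes; the rapid decay at high iron load is why the very short first echoes of UTE matter. A minimal noiseless sketch (echo times and R2* chosen for illustration, not taken from the study):

```python
import numpy as np

# Mono-exponential R2* fit sketch: S(TE) = S0 * exp(-R2* * TE), estimated
# here by log-linear least squares over the echo train. Echo times and
# the R2* value are illustrative, not taken from the study.
r2star_true = 500.0                                    # 1/s, heavy-overload regime
te = np.array([0.1, 0.3, 0.5, 0.8, 1.2, 1.6]) * 1e-3   # echo times in seconds
signal = 1000.0 * np.exp(-r2star_true * te)            # noiseless magnitudes

slope, log_s0 = np.polyfit(te, np.log(signal), 1)      # ln S = ln S0 - R2* * TE
r2star_est = -slope
s0_est = np.exp(log_s0)
```

With real noisy data a nonlinear fit with a noise-floor term is preferred, but the log-linear form shows why densely sampled early echoes are essential when the signal has decayed away by 1 ms.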
Quantification of differential gene expression by multiplexed targeted resequencing of cDNA
Arts, Peer; van der Raadt, Jori; van Gestel, Sebastianus H.C.; Steehouwer, Marloes; Shendure, Jay; Hoischen, Alexander; Albers, Cornelis A.
2017-01-01
Whole-transcriptome or RNA sequencing (RNA-Seq) is a powerful and versatile tool for functional analysis of different types of RNA molecules, but sample, reagent, and sequencing costs can be prohibitive for hypothesis-driven studies where the aim is to quantify differential expression of a limited number of genes. Here we present an approach for quantification of differential mRNA expression by targeted resequencing of complementary DNA using single-molecule molecular inversion probes (cDNA-smMIPs) that enable highly multiplexed resequencing of cDNA target regions of ∼100 nucleotides and counting of individual molecules. We show that accurate estimates of differential expression can be obtained from molecule counts for hundreds of smMIPs per reaction and that smMIPs are also suitable for quantification of relative gene expression and allele-specific expression. Compared with low-coverage RNA-Seq and a hybridization-based targeted RNA-Seq method, cDNA-smMIPs are a cost-effective high-throughput tool for hypothesis-driven expression analysis in large numbers of genes (10 to 500) and samples (hundreds to thousands). PMID:28474677
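The single-molecule counting idea behind smMIPs can be sketched by collapsing reads on their random molecular tags so PCR duplicates count only once; the genes, tags, and read lists below are invented for illustration:

```python
from collections import Counter
from math import log2

# Single-molecule counting sketch: reads sharing a gene and a random
# molecular tag are collapsed so PCR duplicates count once. Genes, tags,
# and read lists are invented for illustration.
reads_condition_a = [("GENE1", "AACG"), ("GENE1", "AACG"), ("GENE1", "TTGC"),
                     ("GENE2", "GGAT")]
reads_condition_b = [("GENE1", "CCTA"), ("GENE2", "ATAT"), ("GENE2", "CGCG"),
                     ("GENE2", "TAGG"), ("GENE2", "TAGG")]

def molecule_counts(reads):
    """Count distinct (gene, tag) pairs per gene."""
    return Counter(gene for gene, _ in set(reads))

counts_a = molecule_counts(reads_condition_a)
counts_b = molecule_counts(reads_condition_b)

# Unnormalized log2 fold change for one gene between conditions
lfc_gene2 = log2(counts_b["GENE2"] / counts_a["GENE2"])
```

Real differential-expression analysis would normalize for library size and model count noise, but the duplicate-collapsing step is the core of molecule counting.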
10 CFR 963.13 - Preclosure suitability evaluation method.
Code of Federal Regulations, 2011 CFR
2011-01-01
... to evaluate whether the geologic repository is likely to comply with the applicable radiation... prevent or mitigate the effects of postulated Category 1 and 2 event sequences. The preclosure safety... prevent accidents. ...
Wienkoop, Stefanie; Larrainzar, Estíbaliz; Glinski, Mirko; González, Esther M.; Arrese-Igor, Cesar; Weckwerth, Wolfram
2008-01-01
Mass spectrometry (MS) has become increasingly important for tissue specific protein quantification at the isoform level, as well as for the analysis of protein post-translational regulation mechanisms and turnover rates. Thanks to the development of high accuracy mass spectrometers, peptide sequencing without prior knowledge of the amino acid sequence—de novo sequencing—can be performed. In this work, absolute quantification of a set of key enzymes involved in carbon and nitrogen metabolism in Medicago truncatula ‘Jemalong A17’ root nodules is presented. Among them, sucrose synthase (SuSy; EC 2.4.1.13), one of the central enzymes in sucrose cleavage in root nodules, has been further characterized and the relative phosphorylation state of the three most abundant isoforms has been quantified. De novo sequencing provided sequence information of a so far unidentified peptide, most probably belonging to SuSy2, the second most abundant isoform in M. truncatula root nodules. TiO2-phosphopeptide enrichment led to the identification of not only a phosphorylation site at Ser11 in SuSy1, but also of several novel phosphorylation sites present in other root nodule proteins such as alkaline invertase (AI; EC 3.2.1.26) and an RNA-binding protein. PMID:18772307
Glutamate quantification by PRESS or MEGA-PRESS: Validation, repeatability, and concordance.
van Veenendaal, Tamar M; Backes, Walter H; van Bussel, Frank C G; Edden, Richard A E; Puts, Nicolaas A J; Aldenkamp, Albert P; Jansen, Jacobus F A
2018-05-01
While PRESS is often employed to measure glutamate concentrations, MEGA-PRESS enables simultaneous Glx (glutamate and glutamine) and GABA measurements. This study aimed to compare the validation, repeatability, and concordance of different approaches for glutamate quantification at 3T to aid future studies in selecting the appropriate sequence and quantification method. Nine phantoms with different glutamate and glutamine concentrations and five healthy participants were each scanned twice to assess, respectively, the validation and the repeatability of measurements with PRESS and MEGA-PRESS. To assess concordance between the different methods, results from 95 human participants were compared. PRESS, MEGA-PRESS (i.e. difference), and the MEGA-PRESS OFF spectra were analyzed with both LCModel and Gannet. In vitro, excellent agreement was shown between actual and measured glutamate concentrations for all measurements (r>0.98). In vivo CVs were better for PRESS (2.9%) than MEGA-PRESS (4.9%) and MEGA-PRESS OFF (4.2%). However, the concordance between the sequences was low (PRESS and MEGA-PRESS OFF, r=0.3) to modest (MEGA-PRESS versus MEGA-PRESS OFF, r=0.8). Both PRESS and MEGA-PRESS can be employed to measure in vivo glutamate concentrations, although PRESS shows better repeatability. Comparisons between in vivo glutamate measures from different sequences, however, need to be interpreted cautiously. Copyright © 2018 Elsevier Inc. All rights reserved.
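The repeatability comparison above rests on the within-subject coefficient of variation computed from paired scan-rescan measurements. A toy sketch with invented glutamate values:

```python
import numpy as np

# Scan-rescan repeatability sketch: within-subject coefficient of
# variation (CV) from paired measurements, as used to compare sequences.
# The paired glutamate concentrations below are invented for illustration.
scan1 = np.array([8.1, 7.6, 8.4, 7.9, 8.2])   # institutional units
scan2 = np.array([8.3, 7.4, 8.2, 8.1, 8.0])

pair_means = (scan1 + scan2) / 2.0
pair_sds = np.abs(scan1 - scan2) / np.sqrt(2.0)   # SD of each pair
cv_percent = 100.0 * float(np.mean(pair_sds / pair_means))
```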
A variational technique for smoothing flight-test and accident data
NASA Technical Reports Server (NTRS)
Bach, R. E., Jr.
1980-01-01
The problem of determining aircraft motions along a trajectory is solved using a variational algorithm that generates unmeasured states and forcing functions, and estimates instrument bias and scale-factor errors. The problem is formulated as a nonlinear fixed-interval smoothing problem, and is solved as a sequence of linear two-point boundary value problems, using a sweep method. The algorithm has been implemented for use in flight-test and accident analysis. Aircraft motions are assumed to be governed by a six-degree-of-freedom kinematic model; forcing functions consist of body accelerations and winds, and the measurement model includes aerodynamic and radar data. Examples of the determination of aircraft motions from typical flight-test and accident data are presented.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Raimondo, E.; Capman, J.L.; Herovard, M.
1985-05-01
Requirements for qualification of electrical equipment used in French-built nuclear power plants are stated in a national code, the RCC-E, or Regles de Construction et de Conception des Materiels Electriques. Under the RCC-E, safety related equipment is assigned to one of three different categories, according to location in the plant and anticipated normal, accident and post-accident behavior. Qualification tests differ for each category and procedures range in scope from the standard seismic test to the highly stringent VISA program, which specifies a predetermined sequence of aging, radiation, seismic and simulated accident testing. A network of official French test facilities was developed specifically to meet RCC-E requirements.
Assessment of an explosive LPG release accident: a case study.
Bubbico, Roberto; Marchini, Mauro
2008-07-15
In the present paper, an accident that occurred during a liquefied petroleum gas (LPG) tank filling operation is examined. During the transfer of LPG from the source road tank car to the receiving fixed storage vessel, an accidental release of LPG gave rise to a range of final consequences, from a pool fire to a fireball and to the catastrophic rupture of the tank with subsequent explosion of its contents. The sequence of events has been investigated using some of the consequence calculation models most commonly adopted in risk analysis and accident investigation. On one hand, this allows a better understanding of the links between the various events of the accident. On the other hand, a comparison between the results of the calculations and the damage actually observed after the accident allows the accuracy of the prediction models to be checked and their validity to be critically assessed. In particular, it was shown that the largest uncertainty is associated with the calculation of the energy involved in the physical expansion of the fluid (both liquid and vapor) after the catastrophic rupture of the tank.
Schoone, G J; Oskam, L; Kroon, N C; Schallig, H D; Omar, S A
2000-11-01
A quantitative nucleic acid sequence-based amplification (QT-NASBA) assay for the detection of Plasmodium parasites has been developed. Primers and probes were selected on the basis of the sequence of the small-subunit rRNA gene. Quantification was achieved by coamplification of the RNA in the sample with one modified in vitro RNA as a competitor in a single-tube NASBA reaction. Parasite densities ranging from 10 to 10^8 Plasmodium falciparum parasites per ml could be demonstrated and quantified in whole blood. This is approximately 1,000 times more sensitive than conventional microscopy analysis of thick blood smears. Comparison of the parasite densities obtained by microscopy and QT-NASBA for 120 blood samples from Kenyan patients with clinical malaria revealed that results for 112 of 120 (93%) of the samples were within a 1-log difference. QT-NASBA may be especially useful for the detection of low parasite levels in patients with early-stage malaria and for monitoring the efficacy of drug treatment.
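Competitive quantification as used in QT-NASBA infers target copies from the signal ratio between the target and a known competitor spike-in, assuming equal amplification efficiency for both templates. The signal values and volumes below are hypothetical, not from the assay:

```python
# Competitive quantification sketch: target RNA is co-amplified with a
# known number of competitor copies; assuming equal amplification
# efficiency, target copies follow from the signal ratio. All signal
# values and volumes are hypothetical, not from the assay.
competitor_copies = 1.0e4        # known competitor spike-in per reaction
signal_target = 3.2e5            # measured target-probe signal (a.u.)
signal_competitor = 1.6e5        # measured competitor-probe signal (a.u.)

target_copies = competitor_copies * (signal_target / signal_competitor)

blood_volume_ml = 0.05           # assumed blood input per reaction
parasites_per_ml = target_copies / blood_volume_ml
```

Because the competitor experiences the same reaction conditions as the target, the ratio cancels run-to-run amplification variability, which is what makes single-tube competitive quantification robust.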
Summerskill, Stephen; Marshall, Russell; Cook, Sharon; Lenard, James; Richardson, John
2016-03-01
The aim of the study is to understand the nature of blind spots in the vision of drivers of Large Goods Vehicles caused by vehicle design variables such as the driver eye height, and mirror designs. The study was informed by the processing of UK national accident data using cluster analysis to establish if vehicle blind spots contribute to accidents. In order to establish the cause and nature of blind spots six top selling trucks in the UK, with a range of sizes were digitized and imported into the SAMMIE Digital Human Modelling (DHM) system. A novel CAD based vision projection technique, which has been validated in a laboratory study, allowed multiple mirror and window aperture projections to be created, resulting in the identification and quantification of a key blind spot. The identified blind spot was demonstrated to have the potential to be associated with the scenarios that were identified in the accident data. The project led to the revision of UNECE Regulation 46 that defines mirror coverage in the European Union, with new vehicle registrations in Europe being required to meet the amended standard after June of 2015. Crown Copyright © 2015. Published by Elsevier Ltd. All rights reserved.
WHEN MODEL MEETS REALITY – A REVIEW OF SPAR LEVEL 2 MODEL AGAINST FUKUSHIMA ACCIDENT
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhegang Ma
The Standardized Plant Analysis Risk (SPAR) models are a set of probabilistic risk assessment (PRA) models used by the Nuclear Regulatory Commission (NRC) to evaluate the risk of operations at U.S. nuclear power plants and to provide inputs to the risk-informed regulatory process. A small number of SPAR Level 2 models have been developed, mostly for feasibility-study purposes. They extend the Level 1 models to include containment systems, group plant damage states, and model containment phenomenology and accident progression in containment event trees. A severe earthquake and tsunami hit the eastern coast of Japan in March 2011 and caused significant damage to the reactors at the Fukushima Daiichi site. Station blackout (SBO), core damage, containment damage, hydrogen explosion, and intensive radioactivity release, which had previously been analyzed and assumed as postulated accident progressions in PRA models, occurred to varying degrees across the multiple units at the Fukushima Daiichi site. This paper reviews and compares a typical BWR SPAR Level 2 model with the “real” accident progressions and sequences that occurred in Fukushima Daiichi Units 1, 2, and 3. It shows that the SPAR Level 2 model is a robust PRA model that reasonably describes the accident progression of a real and complicated nuclear accident. On the other hand, the comparison shows that the SPAR model could be enhanced by incorporating some accident characteristics for a better representation of severe accident progression.
Quantification of color vision using a tablet display.
Chacon, Alicia; Rabin, Jeff; Yu, Dennis; Johnston, Shawn; Bradshaw, Timothy
2015-01-01
Accurate color vision is essential for optimal performance in aviation and space environments using nonredundant color coding to convey critical information. Most color tests detect color vision deficiency (CVD) but fail to diagnose type or severity of CVD, which are important to link performance to occupational demands. The computer-based Cone Contrast Test (CCT) diagnoses type and severity of CVD. It is displayed on a netbook computer for clinical application, but a more portable version may prove useful for deployments, space and aviation cockpits, as well as accident and sports medicine settings. Our purpose was to determine if the CCT can be conducted on a tablet display (Windows 8, Microsoft, Seattle, WA) using touch-screen response input. The CCT presents colored letters visible only to red (R), green (G), and blue (B) sensitive retinal cones to determine the lowest R, G, and B cone contrast visible to the observer. The CCT was measured in 16 color vision normals (CVN) and 16 CVDs using the standard netbook computer and a Windows 8 tablet display calibrated to produce equal color contrasts. Both displays showed 100% specificity for confirming CVN and 100% sensitivity for detecting CVD. In CVNs there was no difference between scores on netbook vs. tablet displays. G cone CVDs showed slightly lower G cone CCT scores on the tablet. CVD can be diagnosed with a tablet display. Ease-of-use, portability, and complete computer capabilities make tablets ideal for multiple settings, including aviation, space, military deployments, accidents and rescue missions, and sports vision. Chacon A, Rabin J, Yu D, Johnston S, Bradshaw T. Quantification of color vision using a tablet display.
Leveraging transcript quantification for fast computation of alternative splicing profiles.
Alamancos, Gael P; Pagès, Amadís; Trincado, Juan L; Bellora, Nicolás; Eyras, Eduardo
2015-09-01
Alternative splicing plays an essential role in many cellular processes and bears major relevance to the understanding of multiple diseases, including cancer. High-throughput RNA sequencing allows genome-wide analyses of splicing across multiple conditions. However, the increasing number of available data sets represents a major challenge in terms of computation time and storage requirements. We describe SUPPA, a computational tool that exploits fast transcript quantification to calculate relative inclusion values of alternative splicing events. On simulated as well as real RNA-sequencing data, SUPPA's accuracy is comparable, and sometimes superior, to that of standard methods when benchmarked against experimentally validated events. We assess the variability in terms of the choice of annotation and provide evidence that using complete transcripts rather than more transcripts per gene provides better estimates. Moreover, SUPPA coupled with de novo transcript reconstruction methods does not achieve accuracies as high as when quantifying known transcripts, but remains comparable to existing methods. Finally, we show that SUPPA is more than 1000 times faster than standard methods. Coupled with fast transcript quantification, SUPPA provides inclusion values at a much higher speed than existing methods without compromising accuracy, thereby facilitating the systematic splicing analysis of large data sets with limited computational resources. The software is implemented in Python 2.7 and is available under the MIT license at https://bitbucket.org/regulatorygenomicsupf/suppa. © 2015 Alamancos et al.; Published by Cold Spring Harbor Laboratory Press for the RNA Society.
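The relative inclusion value computed per event is, in essence, the expression of the event-including transcripts divided by the expression of all transcripts defining the event. A hedged sketch of that calculation from transcript abundances (the function name and TPM inputs are illustrative; this is not SUPPA's own code):

```python
def psi(inclusion_tpm, all_tpm):
    """Percent spliced-in (PSI) for one event: the fraction of the
    event's total transcript expression contributed by transcripts
    that include the event. Inputs are transcript abundances (TPM)."""
    inc = sum(inclusion_tpm)
    tot = sum(all_tpm)
    return inc / tot if tot > 0 else float("nan")

# Two inclusion isoforms at 30 and 10 TPM, one skipping isoform at 60 TPM
print(psi([30, 10], [30, 10, 60]))  # -> 0.4
```

Because the transcript abundances come from fast quantifiers, the per-event arithmetic itself is trivial, which is what makes the approach so much faster than read-level methods.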
Kolacsek, Orsolya; Pergel, Enikő; Varga, Nóra; Apáti, Ágota; Orbán, Tamás I
2017-01-20
There are numerous applications of quantitative PCR in both diagnostics and basic research. As in many other techniques, the basis of quantification is that comparisons are made between different (unknown and known, or reference) specimens of the same entity. When the aim is to compare the real quantities of different species in samples, each species must be quantified separately, precisely, and absolutely. We have established a simple and reliable method for this purpose (the Ct shift method), which combines the absolute and the relative approach. It requires a plasmid standard containing both sequences of the amplicons to be compared (e.g. the target of interest and the endogenous control), which can serve as a reference sample with equal template copies for both targets. Using the ΔΔCt formula, we can quantify the exact ratio of the two templates in each unknown sample. The Ct shift method has been successfully applied to transposon gene copy measurements, as well as to comparisons of different mRNAs in cDNA samples. This study provides the proof of concept and introduces some potential applications of the method; the absolute nature of the results, even without the need for real reference samples, can contribute to the universality of the method and the comparability of different studies. Copyright © 2016 Elsevier B.V. All rights reserved.
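Because the plasmid standard carries one copy of each amplicon, the Ct difference measured on the standard calibrates out assay-specific differences; applying the ΔΔCt formula to an unknown sample then yields an absolute target/reference copy ratio. A minimal sketch under the common simplifying assumption of ~100% amplification efficiency (doubling per cycle); the names are illustrative:

```python
def copy_ratio(ct_target_sample, ct_ref_sample, ct_target_std, ct_ref_std, efficiency=2.0):
    """Target/reference copy ratio via the ddCt formula, calibrated
    against a plasmid standard carrying equal copies of both amplicons."""
    ddct = (ct_target_sample - ct_ref_sample) - (ct_target_std - ct_ref_std)
    return efficiency ** (-ddct)

# Target crosses threshold 2 cycles after the reference in the sample,
# while both come up together on the equimolar plasmid standard:
print(copy_ratio(24.0, 22.0, 20.0, 20.0))  # -> 0.25 (4-fold fewer target copies)
```

The `efficiency` parameter is there only to show where a measured per-assay efficiency would enter; the method itself determines the calibration empirically from the standard.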
Watzinger, Franz; Hörth, Elfriede; Lion, Thomas
2001-01-01
Despite the recent introduction of real-time PCR methods, competitive PCR techniques continue to play an important role in nucleic acid quantification because of the significantly lower cost of equipment and consumables. Here we describe a shifted restriction-site competitive PCR (SRS-cPCR) assay based on a modified type of competitor. The competitor fragments are designed to contain a recognition site for a restriction endonuclease that is also present in the target sequence to be quantified, but in a different position. Upon completion of the PCR, the amplicons are digested in the same tube with a single restriction enzyme, without the need to purify PCR products. The generated competitor- and target-specific restriction fragments display different sizes, and can be readily separated by electrophoresis and quantified by image analysis. Suboptimal digestion affects competitor- and target-derived amplicons to the same extent, thus eliminating the problem of incorrect quantification as a result of incomplete digestion of PCR products. We have established optimized conditions for a panel of 20 common restriction endonucleases permitting efficient digestion in PCR buffer. It is possible, therefore, to find a suitable restriction site for competitive PCR in virtually any sequence of interest. The assay presented is inexpensive, widely applicable, and permits reliable and accurate quantification of nucleic acid targets. PMID:11376164
Bokulich, Nicholas A.
2013-01-01
Ultra-high-throughput sequencing (HTS) of fungal communities has been restricted by short read lengths and primer amplification bias, slowing the adoption of newer sequencing technologies for fungal community profiling. To address these issues, we evaluated the performance of several common internal transcribed spacer (ITS) primers and designed a novel primer set and work flow for simultaneous quantification and species-level interrogation of fungal consortia. Primer comparison and validation were performed in silico and by sequencing a “mock community” of mixed yeast species to explore the challenges of amplicon length and amplification bias for reconstructing defined yeast community structures. The amplicon size and distribution of this primer set are smaller than for all preexisting ITS primer sets, maximizing sequencing coverage of hypervariable ITS domains by very-short-amplicon, high-throughput sequencing platforms. This feature also enables the optional integration of quantitative PCR (qPCR) directly into the HTS preparatory work flow by substituting qPCR with these primers for standard PCR, yielding quantification of individual community members. The complete work flow described here, utilizing any of the qualified primer sets evaluated, can rapidly profile mixed fungal communities and capably reconstruct well-characterized beer and wine fermentation fungal communities. PMID:23377949
Synthetic spike-in standards for high-throughput 16S rRNA gene amplicon sequencing
Tourlousse, Dieter M.; Yoshiike, Satowa; Ohashi, Akiko; Matsukura, Satoko; Noda, Naohiro
2017-01-01
High-throughput sequencing of 16S rRNA gene amplicons (16S-seq) has become a widely deployed method for profiling complex microbial communities but technical pitfalls related to data reliability and quantification remain to be fully addressed. In this work, we have developed and implemented a set of synthetic 16S rRNA genes to serve as universal spike-in standards for 16S-seq experiments. The spike-ins represent full-length 16S rRNA genes containing artificial variable regions with negligible identity to known nucleotide sequences, permitting unambiguous identification of spike-in sequences in 16S-seq read data from any microbiome sample. Using defined mock communities and environmental microbiota, we characterized the performance of the spike-in standards and demonstrated their utility for evaluating data quality on a per-sample basis. Further, we showed that staggered spike-in mixtures added at the point of DNA extraction enable concurrent estimation of absolute microbial abundances suitable for comparative analysis. Results also underscored that template-specific Illumina sequencing artifacts may lead to biases in the perceived abundance of certain taxa. Taken together, the spike-in standards represent a novel bioanalytical tool that can substantially improve 16S-seq-based microbiome studies by enabling comprehensive quality control along with absolute quantification. PMID:27980100
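Spike-ins added at DNA extraction make a simple read-count rescaling possible: recovering a known number of spike-in copies converts a taxon's read count into an absolute copy estimate. A sketch of that rescaling, assuming spike-in and sample templates are extracted, amplified, and sequenced with comparable efficiency (function and variable names are illustrative):

```python
def absolute_abundance(taxon_reads, spikein_reads, spikein_copies_added):
    """Estimate absolute copies of a taxon in the original sample by
    scaling its read count by the recovery rate of spike-in standards
    that were added in a known amount at DNA extraction."""
    reads_per_copy = spikein_reads / spikein_copies_added
    return taxon_reads / reads_per_copy

# 5000 taxon reads, 1000 spike-in reads, 1e6 spike-in copies added
print(absolute_abundance(5000, 1000, 1e6))  # -> 5000000.0 copies
```

A staggered mixture of several spike-ins at different concentrations, as described above, additionally lets one check that this linear relationship actually holds across the measured abundance range.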
Preliminary calculations related to the accident at Three Mile Island
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kirchner, W.L.; Stevenson, M.G.
This report discusses preliminary studies of the Three Mile Island Unit 2 (TMI-2) accident based on available methods and data. The work reported includes: (1) a TRAC base case calculation out to 3 hours into the accident sequence; (2) TRAC parametric calculations, identical to the base case except for a single hypothetical change in system conditions, such as assuming the high-pressure injection (HPI) system operated as designed rather than as in the accident; (3) estimates of fuel rod cladding failure, cladding oxidation due to zirconium metal-steam reactions, hydrogen release due to cladding oxidation, cladding ballooning, cladding embrittlement, and subsequent cladding breakup, based on TRAC-calculated cladding temperatures and system pressures. Some conclusions of this work are: the TRAC base case accident calculation agrees very well with known system conditions to nearly 3 hours into the accident; the parametric calculations indicate that loss of core cooling was most influenced by the throttling of High-Pressure Injection (HPI) flows, given the accident-initiating events and the failure of the pressurizer electromagnetic-operated valve (EMOV) to close as designed; failure of nearly all the rods and gaseous fission product release from the failed rods is predicted to have occurred at about 2 hours and 30 minutes; and cladding oxidation (zirconium-steam reaction) up to 3 hours resulted in the production of approximately 40 kilograms of hydrogen.
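The reported hydrogen mass can be tied back to the amount of cladding oxidized through the zirconium-steam reaction stoichiometry, Zr + 2 H2O → ZrO2 + 2 H2. A back-of-the-envelope check of that conversion (a generic stoichiometric calculation, not part of the original TRAC analysis):

```python
M_H2 = 2.016   # molar mass of H2, g/mol
M_ZR = 91.224  # molar mass of Zr, g/mol

def zr_oxidized_kg(h2_kg):
    """Zr + 2 H2O -> ZrO2 + 2 H2: each mole of zirconium oxidized
    releases two moles of hydrogen, so back out the Zr mass from H2."""
    mol_h2 = h2_kg * 1000.0 / M_H2
    mol_zr = mol_h2 / 2.0
    return mol_zr * M_ZR / 1000.0

print(round(zr_oxidized_kg(40.0)))  # -> 905 kg of Zr oxidized
```

So the calculated 40 kg of hydrogen corresponds to roughly 900 kg of oxidized zirconium cladding.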
DOE Office of Scientific and Technical Information (OSTI.GOV)
Farmer, M. T.; Corradini, M.; Rempe, J.
The U.S. Department of Energy (DOE) has played a major role in the U.S. response to the events at Fukushima Daiichi. During the first several weeks following the accident, U.S. assistance efforts were guided by results from a significant and diverse set of analyses. In the months that followed, a coordinated analysis activity aimed at gaining a more thorough understanding of the accident sequence was completed using laboratory-developed, system-level best-estimate accident analysis codes, while a parallel analysis was conducted by U.S. industry. A comparison of predictions for Unit 1 from these two studies indicated significant differences between MAAP and MELCOR results for key plant parameters, such as in-core hydrogen production. On that basis, a crosswalk was completed to determine the key modeling variations that led to these differences. In parallel with these activities, it became clear that there was a need to perform a technology gap evaluation on accident-tolerant components and severe accident analysis methodologies with the goal of identifying any data and/or knowledge gaps that may exist given the current state of light water reactor (LWR) severe accident research and augmented by insights from Fukushima. In addition, there is growing international recognition that data from Fukushima could significantly reduce uncertainties related to severe accident progression, particularly for boiling water reactors. On these bases, a group of U.S. experts in LWR safety and plant operations was convened by the DOE Office of Nuclear Energy (DOE-NE) to complete technology gap analysis and Fukushima forensics data needs identification activities. The results from these activities were used as the basis for refining DOE-NE's severe accident research and development (R&D) plan. 
Finally, this paper provides a high-level review of DOE-sponsored R&D efforts in these areas, including planned activities on accident-tolerant components and accident analysis methods.
2016-11-02
A Multiplex PCR assay to differentiate between dog and red fox.
Weissenberger, M; Reichert, W; Mattern, R
2011-11-01
Foxes are frequently the cause of car accidents in Baden-Württemberg (BW, Germany). The domestic dog (Canis familiaris) is closely related to the red fox (Vulpes vulpes) and to the silver fox, a coat colour variant of the red fox. As insurance claims involving accidents with animals require authentication, we analyzed the frequency distribution and allele sizes of two canine microsatellite loci in 26 dogs (of different breeds) and 19 red foxes from the region of BW, Germany. In addition, sequencing analysis was performed. Red foxes exhibited only 1 allele at each microsatellite locus, whereas in dogs 7 alleles at the CPH4 locus and 6 alleles at the CPH12 locus were detected. Sequences of PCR products from the two species revealed several differences between dogs and foxes. We established a sequenced allelic ladder and give population data for dogs and red foxes from the region of BW, Germany. Using microsatellite polymorphisms is efficient for differentiating between dogs and foxes in forensic casework. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.
Evaluation of the reliability of maize reference assays for GMO quantification.
Papazova, Nina; Zhang, David; Gruden, Kristina; Vojvoda, Jana; Yang, Litao; Buh Gasparic, Meti; Blejec, Andrej; Fouilloux, Stephane; De Loose, Marc; Taverniers, Isabel
2010-03-01
A reliable PCR reference assay for relative genetically modified organism (GMO) quantification must be specific for the target taxon and amplify uniformly across the commercialised varieties within the considered taxon. Different reference assays for maize (Zea mays L.) are used in official methods for GMO quantification. In this study, we evaluated the reliability of eight existing maize reference assays, four of which are used in combination with an event-specific polymerase chain reaction (PCR) assay validated and published by the Community Reference Laboratory (CRL). We analysed the nucleotide sequence variation in the target genomic regions in a broad range of transgenic and conventional varieties and lines: MON 810 varieties cultivated in Spain and conventional varieties from various geographical origins and breeding histories. In addition, the reliability of the assays was evaluated based on their PCR amplification performance. A single base pair substitution, corresponding to a single nucleotide polymorphism (SNP) reported in an earlier study, was observed in the forward primer of one of the studied alcohol dehydrogenase 1 (Adh1) (70) assays in a large number of varieties. The presence of the SNP is consistent with the poor PCR performance observed for this assay across the tested varieties. The obtained data show that the Adh1 (70) assay used in the official CRL NK603 assay is unreliable. Based on our results from both the nucleotide stability study and the PCR performance test, we can conclude that the Adh1 (136) reference assay (T25 and Bt11 assays) as well as the tested high mobility group protein gene assay, which also form parts of CRL methods for quantification, are highly reliable. Despite the observed uniformity in the nucleotide sequence of the invertase gene assay, the PCR performance test reveals that this target sequence might occur in more than one copy. 
Finally, although currently not forming a part of official quantification methods, zein and SSIIb assays are found to be highly reliable in terms of nucleotide stability and PCR performance and are proposed as good alternative targets for a reference assay for maize.
Asara, John M; Zhang, Xiang; Zheng, Bin; Christofk, Heather H; Wu, Ning; Cantley, Lewis C
2006-01-01
Most proteomics approaches for relative quantification of protein expression use a combination of stable-isotope labeling and mass spectrometry. Traditionally, researchers have used difference gel electrophoresis (DIGE) from stained 1D and 2D gels for relative quantification. While differences in protein staining intensity can often be visualized, abundant proteins can obscure less abundant proteins, and quantification of post-translational modifications is difficult. A method is presented for quantifying changes in the abundance of a specific protein or changes in specific modifications of a protein using In-gel Stable-Isotope Labeling (ISIL). Proteins extracted from any source (tissue, cell line, immunoprecipitate, etc.), treated under two experimental conditions, are resolved in separate lanes by gel electrophoresis. The regions of interest (visualized by staining) are reacted separately with light versus heavy isotope-labeled reagents, and the gel slices are then mixed and digested with proteases. The resulting peptides are then analyzed by LC-MS to determine relative abundance of light/heavy isotope pairs and analyzed by LC-MS/MS for identification of sequence and modifications. The strategy compares well with other relative quantification strategies, and in silico calculations reveal its effectiveness as a global relative quantification strategy. An advantage of ISIL is that visualization of gel differences can be used as a first quantification step followed by accurate and sensitive protein level stable-isotope labeling and mass spectrometry-based relative quantification.
Noor, M Omair; Tavares, Anthony J; Krull, Ulrich J
2013-07-25
A microfluidic based solid-phase assay for the multiplexed detection of nucleic acid hybridization using quantum dot (QD) mediated fluorescence resonance energy transfer (FRET) is described herein. The glass surface of hybrid glass-polydimethylsiloxane (PDMS) microfluidic channels was chemically modified to assemble the biorecognition interface. Multiplexing was demonstrated using a detection system that was comprised of two colors of immobilized semiconductor QDs and two different oligonucleotide probe sequences. Green-emitting and red-emitting QDs were paired with Cy3 and Alexa Fluor 647 (A647) labeled oligonucleotides, respectively. The QDs served as energy donors for the transduction of dye labeled oligonucleotide targets. The in-channel assembly of the biorecognition interface and the subsequent introduction of oligonucleotide targets was accomplished within minutes using a combination of electroosmotic flow and electrophoretic force. The concurrent quantification of femtomole quantities of two target sequences was possible by measuring the spatial coverage of FRET sensitized emission along the length of the channel. In previous reports, multiplexed QD-FRET hybridization assays that employed a ratiometric method for quantification had challenges associated with lower analytical sensitivity arising from both donor and acceptor dilution that resulted in reduced energy transfer pathways as compared to single-color hybridization assays. Herein, a spatial method for quantification that is based on in-channel QD-FRET profiles provided higher analytical sensitivity in the multiplexed assay format as compared to single-color hybridization assays. The selectivity of the multiplexed hybridization assays was demonstrated by discrimination between a fully complementary sequence and a 3 base pair mismatched sequence at a contrast ratio of 8 to 1. Copyright © 2013 Elsevier B.V. All rights reserved.
Recurrence plots and recurrence quantification analysis of human motion data
NASA Astrophysics Data System (ADS)
Josiński, Henryk; Michalczuk, Agnieszka; Świtoński, Adam; Szczesna, Agnieszka; Wojciechowski, Konrad
2016-06-01
The authors present exemplary application of recurrence plots, cross recurrence plots and recurrence quantification analysis for the purpose of exploration of experimental time series describing selected aspects of human motion. Time series were extracted from treadmill gait sequences which were recorded in the Human Motion Laboratory (HML) of the Polish-Japanese Academy of Information Technology in Bytom, Poland by means of the Vicon system. Analysis was focused on the time series representing movements of hip, knee, ankle and wrist joints in the sagittal plane.
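A recurrence plot is built from a thresholded pairwise distance matrix of the observed states, and the simplest recurrence quantification measure, the recurrence rate, is the fraction of recurrent points. A minimal sketch with an illustrative scalar series (the cited study used joint-angle time series from the Vicon system; the threshold value here is arbitrary):

```python
import numpy as np

def recurrence_matrix(x, eps):
    """Binary recurrence matrix: R[i, j] = 1 when states i and j lie
    within eps of each other (Euclidean distance)."""
    x = np.asarray(x, dtype=float)
    if x.ndim == 1:
        d = np.abs(x[:, None] - x[None, :])
    else:  # multivariate states, one row per time step
        d = np.linalg.norm(x[:, None] - x[None, :], axis=-1)
    return (d <= eps).astype(int)

def recurrence_rate(r):
    """Fraction of recurrent points, a basic RQA measure."""
    return r.mean()

x = [0.0, 0.1, 1.0, 0.05]
r = recurrence_matrix(x, eps=0.2)
print(recurrence_rate(r))  # -> 0.625
```

Further RQA measures (determinism, laminarity) are derived from the diagonal and vertical line structures of the same matrix; a cross recurrence plot replaces the single series with distances between two different series.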
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sattison, M.B.; Schroeder, J.A.; Russell, K.D.
The Idaho National Engineering Laboratory (INEL) over the past year has created 75 plant-specific Accident Sequence Precursor (ASP) models using the SAPHIRE suite of PRA codes. Along with the new models, the INEL has also developed a new module for SAPHIRE which is tailored specifically to the unique needs of ASP evaluations. These models and software will be the next generation of risk tools for the evaluation of accident precursors by both NRR and AEOD. This paper presents an overview of the models and software. Key characteristics include: (1) classification of the plant models according to plant response with a unique set of event trees for each plant class, (2) plant-specific fault trees using supercomponents, (3) generation and retention of all system and sequence cutsets, (4) full flexibility in modifying logic, regenerating cutsets, and requantifying results, and (5) user interface for streamlined evaluation of ASP events.
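Requantifying a sequence from retained cutsets reduces, under the usual independence assumption, to combining the products of basic-event probabilities with the minimal-cut-set upper bound. A sketch of that arithmetic (the event names and probabilities are invented for illustration, not taken from the ASP models):

```python
def cutset_probability(cutset, basic_events):
    """Probability of one minimal cut set: the product of its
    basic-event probabilities, assuming independent events."""
    p = 1.0
    for event in cutset:
        p *= basic_events[event]
    return p

def sequence_frequency(cutsets, basic_events):
    """Min-cut-set upper bound, 1 - prod(1 - P(cut set)); close to
    the rare-event sum when probabilities are small."""
    q = 1.0
    for cs in cutsets:
        q *= 1.0 - cutset_probability(cs, basic_events)
    return 1.0 - q

events = {"DG-A": 1e-2, "DG-B": 1e-2, "BATT": 1e-3}
cutsets = [("DG-A", "DG-B"), ("BATT",)]
print(sequence_frequency(cutsets, events))  # -> ~1.1e-3 (rare-event sum: 1e-4 + 1e-3)
```

Retaining all cutsets, as the SAPHIRE module does, is what makes this requantification cheap after a logic change: only the numeric pass is rerun, not the cutset generation.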
To Err is Human: Case Reports of Two Military Aircraft Accidents
Dikshit, Mohan B
2010-01-01
It has been postulated that pilot error or in-flight incapacitation may be the main contributory factor in 70–80% of aircraft accidents. Two fatal aircraft accidents are presented in which either of these possibilities may have played a role. The first case report describes a fighter pilot's erroneous decision to use a seat position adjustment of the ejection seat, which led to fatal injuries when he had to eject from his aircraft. Injuries to the pilot's body and observations on the state of his flying clothing and the ejection seat were used to postulate the mechanism of fatal injury and establish the cause of the accident. The second case report describes the sequence of events that culminated in the incapacitation of a fighter pilot while executing a routine manoeuvre, resulting in a fatal air crash. Possible contributions of environmental factors that may have caused his physiological mechanisms to fail are discussed. PMID:21509093
Deep Borehole Emplacement Mode Hazard Analysis Revision 0
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sevougian, S. David
This letter report outlines a methodology and provides resource information for the Deep Borehole Emplacement Mode Hazard Analysis (DBEMHA). The main purposes are to identify the accident hazards and accident event sequences associated with the two emplacement mode options (wireline or drill string), to outline a methodology for computing accident probabilities and frequencies, and to point to available databases on the nature and frequency of accidents typically associated with standard borehole drilling and nuclear handling operations. Risk mitigation and prevention measures, which have been incorporated into the two emplacement designs (see Cochran and Hardin 2015), are also discussed. A key intent of this report is to provide background information to brief subject matter experts involved in the Emplacement Mode Design Study. [Note: Revision 0 of this report concentrates more on the wireline emplacement mode. It is expected that Revision 1 will contain further development of the preliminary fault and event trees for the drill string emplacement mode.]
The Viareggio LPG railway accident: event reconstruction and modeling.
Brambilla, Sara; Manca, Davide
2010-10-15
This manuscript describes in detail the LPG accident that occurred in Viareggio in June 2009 and its modeling. The accident investigation highlighted the uncertainty and complexity of assessing and modeling what happened in the congested environment close to the Viareggio railway station. Nonetheless, the analysis allowed the sequence of events, the way they influenced each other, and the different possible paths/evolutions to be comprehended. The paper describes suitable models for the quantitative assessment of the consequences of the most probable accidental dynamics and its outcomes. The main finding is that about 80 s after the beginning of the release, the dense-gas cloud reached the surrounding houses, which were subsequently destroyed by internal explosions. This fact has two main implications. First, it shows that the adopted modeling framework can give a correct picture of what happened in Viareggio. Second, it confirms the need to develop effective mitigation measures because, in accidents of this kind, there is no time to apply any protective emergency plans/actions. Copyright © 2010 Elsevier B.V. All rights reserved.
Pulmonary fat embolism after pelvic and long bone fractures in a trauma patient.
Huang, Brady K; Monu, Johnny U V; Wandtke, John
2009-09-01
Fat embolism is a common complication of pelvic and long bone fractures. Macroscopic fat emboli in the pulmonary arteries on computed tomography have been reported postoperatively after fixation of long bone fractures for trauma; however, quantification of the attenuation values of fat emboli has been infrequently reported in the literature. We present a case of pulmonary fat embolism in a 52-year-old female after acute bony trauma sustained in a motor vehicle accident. To the authors' knowledge, however, pulmonary fat embolism has not previously been described on the initial trauma CT scan.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bley, D.C.; Cooper, S.E.; Forester, J.A.
ATHEANA, a second-generation Human Reliability Analysis (HRA) method, integrates advances in psychology with engineering, human factors, and Probabilistic Risk Analysis (PRA) disciplines to provide an HRA quantification process and PRA modeling interface that can accommodate and represent human performance in real nuclear power plant events. The method uses the characteristics of serious accidents identified through retrospective analysis of serious operational events to set priorities in a search process for significant human failure events, unsafe acts, and error-forcing context (unfavorable plant conditions combined with negative performance-shaping factors). ATHEANA has been tested in a demonstration project at an operating pressurized water reactor.
Salazar, Oscar; Valverde, Aranzazu; Genilloud, Olga
2006-01-01
Real-time PCR (RT-PCR) technology was used for the specific detection and quantification of members of the family Geodermatophilaceae in stone samples. Differences in the nucleotide sequences of the 16S rRNA gene region were used to design a pair of family-specific primers, which were used to detect and quantify, by RT-PCR, DNA from members of this family in stone samples from different geographical origins in Spain. These primers were later applied to identify, by PCR-specific amplification, new members of the family Geodermatophilaceae isolated from the same stone samples. The diversity and taxonomic position of the wild-type strains identified from ribosomal sequence analysis suggest the presence of a new lineage within the genus Blastococcus. PMID:16391063
Wood, David L. A.; Nones, Katia; Steptoe, Anita; Christ, Angelika; Harliwong, Ivon; Newell, Felicity; Bruxner, Timothy J. C.; Miller, David; Cloonan, Nicole; Grimmond, Sean M.
2015-01-01
Genetic variation modulates gene expression transcriptionally or post-transcriptionally, and can profoundly alter an individual’s phenotype. Measuring allelic differential expression at heterozygous loci within an individual, a phenomenon called allele-specific expression (ASE), can assist in identifying such factors. Massively parallel DNA and RNA sequencing and advances in bioinformatic methodologies provide an outstanding opportunity to measure ASE genome-wide. In this study, matched DNA and RNA sequencing, genotyping arrays and computationally phased haplotypes were integrated to comprehensively and conservatively quantify ASE in a single human brain and liver tissue sample. We describe a methodological evaluation and assessment of common bioinformatic steps for ASE quantification, and recommend a robust approach to accurately measure SNP, gene and isoform ASE through the use of personalized haplotype genome alignment, strict alignment quality control and intragenic SNP aggregation. Our results indicate that accurate ASE quantification requires careful bioinformatic analyses and is adversely affected by sample-specific alignment confounders and random sampling even at moderate sequence depths. We identified multiple known and several novel ASE genes in liver, including WDR72, DSP and UBD, as well as genes that contained ASE SNPs with imbalance direction discordant with haplotype phase, explainable by annotated transcript structure, suggesting isoform-derived ASE. The methods evaluated in this study will be of use to researchers performing highly conservative quantification of ASE, and the genes and isoforms identified as exhibiting ASE will be of interest to researchers studying those loci. PMID:25965996
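As an illustration of the kind of per-gene test implied by intragenic SNP aggregation, the sketch below pools phased read counts across a gene's heterozygous SNPs and applies an exact binomial test for allelic imbalance. This is a hypothetical reconstruction for illustration, not the authors' pipeline; the function names are invented:

```python
from math import comb

def binom_two_sided_p(k1, k2):
    """Exact two-sided binomial test: under the null of no allelic
    imbalance, k1 ~ Binomial(k1 + k2, 0.5)."""
    n = k1 + k2
    p_obs = comb(n, k1) * 0.5 ** n
    # Sum the probabilities of all outcomes at least as extreme as observed.
    total = sum(comb(n, k) * 0.5 ** n for k in range(n + 1)
                if comb(n, k) * 0.5 ** n <= p_obs + 1e-12)
    return min(total, 1.0)

def gene_ase(snp_counts):
    """Aggregate phased SNP read counts within a gene, then test.

    snp_counts: list of (haplotype1_reads, haplotype2_reads), one per SNP.
    Returns (haplotype-1 expression ratio, two-sided p-value).
    """
    h1 = sum(a for a, _ in snp_counts)
    h2 = sum(b for _, b in snp_counts)
    return h1 / (h1 + h2), binom_two_sided_p(h1, h2)
```

Aggregating counts before testing, as recommended above, gives more power per gene than testing each SNP alone, at the cost of assuming the phasing is correct.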
Antibiotic Resistome: Improving Detection and Quantification Accuracy for Comparative Metagenomics.
Elbehery, Ali H A; Aziz, Ramy K; Siam, Rania
2016-04-01
The unprecedented rise of life-threatening antibiotic resistance (AR), combined with the unparalleled advances in DNA sequencing of genomes and metagenomes, has pushed the need for in silico detection of the resistance potential of clinical and environmental metagenomic samples through the quantification of AR genes (i.e., genes conferring antibiotic resistance). Therefore, determining an optimal methodology to quantitatively and accurately assess AR genes in a given environment is pivotal. Here, we optimized and improved existing AR detection methodologies from metagenomic datasets to properly consider AR-generating mutations in antibiotic target genes. Through comparative metagenomic analysis of previously published AR gene abundance in three publicly available metagenomes, we illustrate how mutation-generated resistance genes are either falsely assigned or neglected, which alters the detection and quantitation of the antibiotic resistome. In addition, we inspected factors influencing the outcome of AR gene quantification using metagenome simulation experiments, and identified that genome size, AR gene length, the total number of metagenomic reads, and the selected sequencing platform had pronounced effects on the level of detected AR. In conclusion, our proposed improvements in the current methodologies for accurate AR detection and resistome assessment show reliable results when tested on real and simulated metagenomic datasets.
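The gene-length and read-depth effects noted above are why AR gene counts are usually normalized before comparison across metagenomes. Below is a minimal sketch of one common normalization, expressing AR-gene abundance as copies per single-copy marker gene; this is illustrative only, not the authors' exact method:

```python
def ar_gene_abundance(ar_reads, ar_len_bp, marker_reads, marker_len_bp):
    """Length-normalized AR-gene abundance.

    Expressed as AR-gene copies per copy of a universal single-copy
    marker gene, i.e. roughly copies per genome. Dividing each read
    count by its target length corrects for the fact that longer
    genes recruit proportionally more reads at the same copy number.
    """
    return (ar_reads / ar_len_bp) / (marker_reads / marker_len_bp)
```

With this normalization, an abundance of 0.5 means roughly one AR-gene copy per two genomes, regardless of sequencing depth.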
Station Blackout at Browns Ferry Unit One - accident sequence analysis. Volume 1
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cook, D.H.; Harrington, R.M.; Greene, S.R.
1981-11-01
This study describes the predicted response of Unit 1 at the Browns Ferry Nuclear Plant to Station Blackout, defined as a loss of offsite power combined with failure of all onsite emergency diesel-generators to start and load. Every effort has been made to employ the most realistic assumptions in defining the sequence of events for this hypothetical accident. DC power is assumed to remain available from the unit batteries during the initial phase, and the operator actions and corresponding events during this period are described using results provided by an analysis code developed specifically for this purpose. The Station Blackout is assumed to persist beyond the point of battery exhaustion, and the events during this second phase of the accident, in which dc power would be unavailable, were determined through use of the MARCH code. Without dc power, cooling water could no longer be injected into the reactor vessel, and the events of the second phase include core meltdown and subsequent containment failure. An estimate of the magnitude and timing of the concomitant release of the noble gas, cesium, and iodine-based fission products to the environment is provided in Volume 2 of this report. 58 refs., 75 figs., 8 tabs.
Pi, Liqun; Li, Xiang; Cao, Yiwei; Wang, Canhua; Pan, Liangwen; Yang, Litao
2015-04-01
Reference materials are important for accurate analysis of genetically modified organism (GMO) contents in food/feeds, and the development of novel reference plasmids is a new trend in research on GMO reference materials. Herein, we constructed a novel multi-targeting plasmid, pSOY, which contained seven event-specific sequences of five GM soybeans (MON89788-5', A2704-12-3', A5547-127-3', DP356043-5', DP305423-3', A2704-12-5', and A5547-127-5') and the sequence of the soybean endogenous reference gene Lectin. We evaluated the specificity, limits of detection and quantification, and applicability of pSOY in both qualitative and quantitative PCR analyses. The limit of detection (LOD) was as low as 20 copies in qualitative PCR, and the limit of quantification (LOQ) in quantitative PCR was 10 copies. In quantitative real-time PCR analysis, the PCR efficiencies of all event-specific and Lectin assays were higher than 90%, and the squared regression coefficients (R(2)) were more than 0.999. The quantification bias varied from 0.21% to 19.29%, and the relative standard deviations were from 1.08% to 9.84% in simulated sample analysis. All the results demonstrated that the developed multi-targeting plasmid, pSOY, was a credible substitute for matrix reference materials and could be used as a reliable reference calibrator in the identification and quantification of multiple GM soybean events.
Ballari, Rajashekhar V; Martin, Asha
2013-12-01
DNA quality is an important parameter for the detection and quantification of genetically modified organisms (GMOs) using the polymerase chain reaction (PCR). Food processing leads to degradation of DNA, which may impair GMO detection and quantification. This study evaluated the effect of various processing treatments such as heating, baking, microwaving, autoclaving and ultraviolet (UV) irradiation on the relative transgenic content of MON 810 maize using pRSETMON-02, a dual-target plasmid, as a model system. Amongst all the processing treatments examined, autoclaving and UV irradiation resulted in the lowest recovery of the transgenic (CaMV 35S promoter) and taxon-specific (zein) target DNA sequences. Although a profound impact on DNA degradation was seen during processing, DNA could still be reliably quantified by real-time PCR. The measured mean DNA copy number ratios of the processed samples were in agreement with the expected values. Our study confirms the premise that the final analytical value assigned to a particular sample is independent of the degree of DNA degradation, since transgenic and taxon-specific target sequences of approximately similar lengths degrade in parallel. The results of our study demonstrate that food processing does not alter the relative quantification of the transgenic content, provided the quantitative assays target shorter amplicons and the difference in amplicon size between the transgenic and taxon-specific genes is minimal. Copyright © 2013 Elsevier Ltd. All rights reserved.
Synthetic spike-in standards for high-throughput 16S rRNA gene amplicon sequencing.
Tourlousse, Dieter M; Yoshiike, Satowa; Ohashi, Akiko; Matsukura, Satoko; Noda, Naohiro; Sekiguchi, Yuji
2017-02-28
High-throughput sequencing of 16S rRNA gene amplicons (16S-seq) has become a widely deployed method for profiling complex microbial communities but technical pitfalls related to data reliability and quantification remain to be fully addressed. In this work, we have developed and implemented a set of synthetic 16S rRNA genes to serve as universal spike-in standards for 16S-seq experiments. The spike-ins represent full-length 16S rRNA genes containing artificial variable regions with negligible identity to known nucleotide sequences, permitting unambiguous identification of spike-in sequences in 16S-seq read data from any microbiome sample. Using defined mock communities and environmental microbiota, we characterized the performance of the spike-in standards and demonstrated their utility for evaluating data quality on a per-sample basis. Further, we showed that staggered spike-in mixtures added at the point of DNA extraction enable concurrent estimation of absolute microbial abundances suitable for comparative analysis. Results also underscored that template-specific Illumina sequencing artifacts may lead to biases in the perceived abundance of certain taxa. Taken together, the spike-in standards represent a novel bioanalytical tool that can substantially improve 16S-seq-based microbiome studies by enabling comprehensive quality control along with absolute quantification. © The Author(s) 2016. Published by Oxford University Press on behalf of Nucleic Acids Research.
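The absolute-abundance estimation enabled by the staggered spike-ins reduces, in the simplest case, to a reads-to-copies scaling factor. A minimal sketch under the stated assumption that spike-in and sample templates are recovered and sequenced with comparable efficiency (illustrative only, not the authors' implementation):

```python
def absolute_copies(taxon_reads, spike_reads, spike_copies_added):
    """Scale a taxon's read count to an absolute 16S copy number using
    a spike-in standard of known copy number added at DNA extraction.

    Assumes spike-in and sample templates are extracted, amplified,
    and sequenced with comparable efficiency, so reads are roughly
    proportional to template copies.
    """
    return taxon_reads * (spike_copies_added / spike_reads)
```

Because the scaling factor is computed per sample, this also serves as the per-sample quality check described above: a sample whose spike-in recovery deviates strongly from the expected staggered ratios can be flagged.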
Tentacle: distributed quantification of genes in metagenomes.
Boulund, Fredrik; Sjögren, Anders; Kristiansson, Erik
2015-01-01
In metagenomics, microbial communities are sequenced at increasingly high resolution, generating datasets with billions of DNA fragments. Novel methods that can efficiently process the growing volumes of sequence data are necessary for the accurate analysis and interpretation of existing and upcoming metagenomes. Here we present Tentacle, a novel framework that uses distributed computational resources for gene quantification in metagenomes. Tentacle is implemented using a dynamic master-worker approach in which DNA fragments are streamed via a network and processed in parallel on worker nodes. Tentacle is modular, extensible, and comes with support for six commonly used sequence aligners. It is easy to adapt Tentacle to different applications in metagenomics and easy to integrate into existing workflows. Evaluations show that Tentacle scales very well with increasing computing resources. We illustrate the versatility of Tentacle on three different use cases. Tentacle is written for Linux in Python 2.7 and is published as open source under the GNU General Public License (v3). Documentation, tutorials, installation instructions, and the source code are freely available online at: http://bioinformatics.math.chalmers.se/tentacle.
Unifying cancer and normal RNA sequencing data from different sources
Wang, Qingguo; Armenia, Joshua; Zhang, Chao; Penson, Alexander V.; Reznik, Ed; Zhang, Liguo; Minet, Thais; Ochoa, Angelica; Gross, Benjamin E.; Iacobuzio-Donahue, Christine A.; Betel, Doron; Taylor, Barry S.; Gao, Jianjiong; Schultz, Nikolaus
2018-01-01
Driven by the recent advances of next generation sequencing (NGS) technologies and an urgent need to decode complex human diseases, a multitude of large-scale studies were conducted recently that have resulted in an unprecedented volume of whole transcriptome sequencing (RNA-seq) data, such as the Genotype Tissue Expression project (GTEx) and The Cancer Genome Atlas (TCGA). While these data offer new opportunities to identify the mechanisms underlying disease, the comparison of data from different sources remains challenging, due to differences in sample and data processing. Here, we developed a pipeline that processes and unifies RNA-seq data from different studies, which includes uniform realignment, gene expression quantification, and batch effect removal. We find that uniform alignment and quantification is not sufficient when combining RNA-seq data from different sources and that the removal of other batch effects is essential to facilitate data comparison. We have processed data from GTEx and TCGA and successfully corrected for study-specific biases, enabling comparative analysis between TCGA and GTEx. The normalized datasets are available for download on figshare. PMID:29664468
Specific and quantitative detection of human polyomaviruses BKV, JCV, and SV40 by real time PCR.
McNees, Adrienne L; White, Zoe S; Zanwar, Preeti; Vilchez, Regis A; Butel, Janet S
2005-09-01
The polyomaviruses that infect humans, BK virus (BKV), JC virus (JCV), and simian virus 40 (SV40), typically establish subclinical persistent infections. However, reactivation of these viruses in immunocompromised hosts is associated with renal nephropathy and hemorrhagic cystitis (HC) caused by BKV and with progressive multifocal leukoencephalopathy (PML) caused by JCV. Additionally, SV40 is associated with several types of human cancers, including primary brain and bone cancers, mesotheliomas, and non-Hodgkin's lymphoma. Advancements in detection of these viruses may contribute to improved diagnosis and treatment of affected patients. Our objective was to develop sensitive and specific real-time quantitative polymerase chain reaction (RQ-PCR) assays for the detection of T-antigen DNA sequences of the human polyomaviruses BKV, JCV, and SV40 using the ABI Prism 7000 Sequence Detection System. Assays for absolute quantification of the viral T-ag sequences were designed, and their sensitivity and specificity were evaluated. A quantitative assay measuring the single-copy human RNase P gene was also developed and evaluated in order to normalize viral gene copy numbers to cell numbers. Quantification of the target genes is sensitive and specific over a 7-log dynamic range. Ten copies each of the viral and cellular genes are reproducibly and accurately detected. The sensitivity of detection of the RQ-PCR assays is increased 10- to 100-fold compared to conventional PCR and agarose gel protocols. The primers and probes used to detect the viral genes are specific for each virus, and there is no cross-reactivity within the dynamic range of the standard dilutions. The sensitivity of detection for these assays is not reduced in human cellular extracts; however, different DNA extraction protocols may affect quantification. These assays provide a technique for rapid and specific quantification of polyomavirus genomes per cell in human samples.
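Normalizing viral load to cell number with a single-copy gene such as RNase P reduces to a simple ratio; a minimal sketch of the arithmetic (illustrative, not the published assay's software):

```python
def viral_copies_per_cell(viral_copies, rnasep_copies):
    """Normalize viral genome copies to cell number.

    RNase P is a single-copy gene, so each diploid human cell carries
    two copies; cell count = RNase P copies / 2.
    """
    return viral_copies / (rnasep_copies / 2.0)
```

For example, 1000 viral copies measured alongside 2000 RNase P copies corresponds to one viral genome per cell.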
Neuberger, Ulf; Kickingereder, Philipp; Helluy, Xavier; Fischer, Manuel; Bendszus, Martin; Heiland, Sabine
2017-12-01
Non-invasive detection of 2-hydroxyglutarate (2HG) by magnetic resonance spectroscopy is attractive since it is related to tumor metabolism. Here, we compare the detection accuracy of 2HG in a controlled phantom setting via widely used localized spectroscopy sequences quantified by linear combination of metabolite signals vs. a more complex approach applying a J-difference editing technique at 9.4 T. Different phantoms, composed of a concentration series of 2HG and overlapping brain metabolites, were measured with an optimized point-resolved spectroscopy (PRESS) sequence and an in-house developed J-difference editing sequence. The acquired spectra were post-processed with LCModel and a simulated metabolite set (PRESS) or with a quantification formula for J-difference editing. Linear regression analysis demonstrated a high correlation of the true 2HG values with those measured by the PRESS method (adjusted R-squared: 0.700, p<0.001) as well as with those measured by the J-difference editing method (adjusted R-squared: 0.908, p<0.001). The regression model for the J-difference editing method, however, had a significantly higher explanatory value than the regression model for the PRESS method (p<0.0001). Moreover, with J-difference editing 2HG was discernible down to 1 mM, whereas with the PRESS method 2HG values were not discernible below 2 mM and showed higher systematic errors, particularly in phantoms with high concentrations of N-acetyl-aspartate (NAA) and glutamate (Glu). In summary, quantification of 2HG by linear combination of metabolite signals shows high systematic errors, particularly at low 2HG concentrations and high concentrations of confounding metabolites such as NAA and Glu. In contrast, J-difference editing offers a more accurate quantification even at low 2HG concentrations, which outweighs the downsides of longer measurement time and more complex post-processing. Copyright © 2017. Published by Elsevier GmbH.
Weighardt, Florian; Barbati, Cristina; Paoletti, Claudia; Querci, Maddalena; Kay, Simon; De Beuckeleer, Marc; Van den Eede, Guy
2004-01-01
In Europe, a growing interest for reliable techniques for the quantification of genetically modified component(s) of food matrixes is arising from the need to comply with the European legislative framework on novel food products. Real-time polymerase chain reaction (PCR) is currently the most powerful technique for the quantification of specific nucleic acid sequences. Several real-time PCR methodologies based on different molecular principles have been developed for this purpose. The most frequently used approach in the field of genetically modified organism (GMO) quantification in food or feed samples is based on the 5'-3'-exonuclease activity of Taq DNA polymerase on specific degradation probes (TaqMan principle). A novel approach was developed for the establishment of a TaqMan quantification system assessing GMO contents around the 1% threshold stipulated under European Union (EU) legislation for the labeling of food products. The Zea mays T25 elite event was chosen as a model for the development of the novel GMO quantification approach. The most innovative aspect of the system is represented by the use of sequences cloned in plasmids as reference standards. In the field of GMO quantification, plasmids are an easy to use, cheap, and reliable alternative to Certified Reference Materials (CRMs), which are only available for a few of the GMOs authorized in Europe, have a relatively high production cost, and require further processing to be suitable for analysis. Strengths and weaknesses of the use of novel plasmid-based standards are addressed in detail. In addition, the quantification system was designed to avoid the use of a reference gene (e.g., a single copy, species-specific gene) as normalizer, i.e., to perform a GMO quantification based on an absolute instead of a relative measurement. 
Indeed, experimental evidence shows that the use of reference genes adds variability to the measurement system, because a second, independent real-time PCR-based measurement must be performed. Moreover, for some reference genes insufficient information on copy number within and among genomes of different lines is available, making adequate quantification difficult. Once developed, the method was validated according to IUPAC and ISO 5725 guidelines. Thirteen laboratories from 8 EU countries participated in the trial. Eleven laboratories provided results complying with the predefined study requirements. Repeatability (RSDr) values ranged from 8.7 to 15.9%, with a mean value of 12%. Reproducibility (RSDR) values ranged from 16.3 to 25.5%, with a mean value of 21%. Following Codex Alimentarius Committee guidelines, both the limit of detection and the limit of quantitation were determined to be <0.1%.
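Absolute quantification against plasmid standards of the kind described above rests on a standard curve of Ct versus log10 copy number, fitted from the dilution series and inverted for unknown samples. A minimal sketch of the fitting and back-calculation (illustrative, not the validated protocol):

```python
def fit_standard_curve(log10_copies, ct_values):
    """Least-squares fit of Ct = slope * log10(copies) + intercept
    from a plasmid dilution series. Returns (slope, intercept)."""
    n = len(ct_values)
    mx = sum(log10_copies) / n
    my = sum(ct_values) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(log10_copies, ct_values))
    sxx = sum((x - mx) ** 2 for x in log10_copies)
    slope = sxy / sxx
    return slope, my - slope * mx

def copies_from_ct(ct, slope, intercept):
    """Invert the standard curve to estimate an absolute copy number."""
    return 10 ** ((ct - intercept) / slope)
```

A slope near -3.32 corresponds to roughly 100% amplification efficiency (efficiency = 10^(-1/slope) - 1); the example slope and intercept used here are hypothetical values, not figures from the study.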
NASA Astrophysics Data System (ADS)
Suryono, T. J.; Gofuku, A.
2018-02-01
One of the important aspects of mitigating nuclear power plant accidents is time management. Accidents should be resolved as soon as possible in order to prevent core melting and the release of radioactive material to the environment. To this end, operators should follow the emergency operating procedure related to the accident, step by step and within the allowable time. Nowadays, advanced main control rooms are equipped with computer-based procedures (CBPs), which make it easier for operators to perform their tasks of monitoring and controlling the reactor. However, most CBPs do not include a time-remaining display feature that informs operators of the time available to execute procedure steps and warns them as they approach the time limit. Such a feature would also increase operators' awareness of their current situation in the procedure. This paper investigates this issue. A simplified emergency operating procedure (EOP) for a steam generator tube rupture (SGTR) accident in a PWR plant is used as a case study. In addition, the sequence of actions in each step of the procedure is modelled using multilevel flow modelling (MFM) and influence propagation rules. The action time for each step is predicted from similar accident cases using Support Vector Regression, and the derived time is then processed and displayed on a CBP user interface.
PCR detection of uncultured rumen bacteria.
Rosero, Jaime A; Strosová, Lenka; Mrázek, Jakub; Fliegerová, Kateřina; Kopečný, Jan
2012-07-01
16S rRNA sequences of ruminal uncultured bacterial clones from public databases were phylogenetically examined. The sequences were found to form two unique clusters not affiliated with any known bacterial species: a cluster of unidentified sequences of free-floating rumen fluid uncultured bacteria (FUB) and a cluster of unidentified sequences of bacteria associated with the rumen epithelium (AUB). A set of PCR primers targeting the 16S rRNA of ruminal free uncultured bacteria and rumen epithelium-adhering uncultured bacteria was designed based on these sequences. FUB primers were used for relative quantification of uncultured bacteria in ovine rumen samples. The effort to increase the population size of the FUB group was successful in sulfate-reducing broth and in culture media supplemented with cellulose.
Sequencing small genomic targets with high efficiency and extreme accuracy
Schmitt, Michael W.; Fox, Edward J.; Prindle, Marc J.; Reid-Bayliss, Kate S.; True, Lawrence D.; Radich, Jerald P.; Loeb, Lawrence A.
2015-01-01
The detection of minority variants in mixed samples demands methods for enrichment and accurate sequencing of small genomic intervals. We describe an efficient approach based on sequential rounds of hybridization with biotinylated oligonucleotides, enabling more than one-million-fold enrichment of genomic regions of interest. In conjunction with error-correcting double-stranded molecular tags, our approach enables the quantification of mutations in individual DNA molecules. PMID:25849638
Li, P; Jia, J W; Jiang, L X; Zhu, H; Bai, L; Wang, J B; Tang, X M; Pan, A H
2012-04-27
To ensure the implementation of genetically modified organism (GMO)-labeling regulations, an event-specific detection method was developed based on the junction sequence of an exogenous integrant in the transgenic carnation variety Moonlite. The 5'-transgene integration sequence was isolated by thermal asymmetric interlaced PCR. Based upon the 5'-transgene integration sequence, event-specific primers and a TaqMan probe were designed to amplify fragments spanning the exogenous DNA and the carnation genomic DNA. Qualitative and quantitative PCR assays were developed employing the designed primers and probe. The detection limit of the qualitative PCR assay was 0.05% for Moonlite in 100 ng total carnation genomic DNA, corresponding to about 79 copies of the carnation haploid genome; the limits of detection and quantification of the quantitative PCR assay were estimated to be 38 and 190 copies of haploid carnation genomic DNA, respectively. Carnation samples with different contents of genetically modified components were quantified, and the bias between the observed and true values of three samples was lower than the acceptance criterion (<25%) of the GMO detection method. These results indicated that these event-specific methods would be useful for the identification and quantification of the GMO carnation Moonlite.
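Copy-number figures of the kind quoted above (e.g., 0.05% of 100 ng of carnation DNA corresponding to roughly 79 haploid genomes) follow from the standard mass-to-copies conversion. A minimal sketch, assuming a carnation haploid genome size of about 613 Mbp (an assumed value for illustration, not one stated in the abstract):

```python
AVOGADRO = 6.022e23          # molecules per mole
BP_MASS_G_PER_MOL = 660.0    # average mass of one DNA base pair, g/mol

def haploid_genome_copies(dna_ng, genome_size_bp):
    """Number of haploid genome copies in a given mass of genomic DNA."""
    genome_mass_g = genome_size_bp * BP_MASS_G_PER_MOL / AVOGADRO
    return (dna_ng * 1e-9) / genome_mass_g
```

With the assumed genome size, 0.05 ng of carnation DNA works out to roughly 70-80 haploid genome copies, consistent with the figure reported in the abstract.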
Liu, Jia; Guo, Jinchao; Zhang, Haibo; Li, Ning; Yang, Litao; Zhang, Dabing
2009-11-25
Various polymerase chain reaction (PCR) methods were developed for the execution of genetically modified organism (GMO) labeling policies, of which event-specific PCR detection based on the flanking sequence of the exogenous integration is the primary trend in GMO detection due to its high specificity. In this study, the 5' and 3' flanking sequences of the exogenous integration of MON89788 soybean were revealed by thermal asymmetric interlaced PCR. Event-specific PCR primers and a TaqMan probe were designed based upon the revealed 5' flanking sequence, and qualitative and quantitative PCR assays were established employing these designed primers and probes. In qualitative PCR, the limit of detection (LOD) was about 0.01 ng of genomic DNA, corresponding to 10 copies of haploid soybean genomic DNA. In the quantitative PCR assay, the LOD was as low as two haploid genome copies, and the limit of quantification was five haploid genome copies. Furthermore, the developed PCR methods were validated in-house by five researchers, and the results indicated that the developed event-specific PCR methods can be used for identification and quantification of MON89788 soybean and its derivatives.
Li, Peng; Jia, Junwei; Bai, Lan; Pan, Aihu; Tang, Xueming
2013-07-01
Genetically modified carnation (Dianthus caryophyllus L.) Moonshade has been approved for planting and commercialization in several countries since 2004. Developing methods for analyzing Moonshade is necessary for implementing genetically modified organism labeling regulations. In this study, the 5'-transgene integration sequence was isolated using thermal asymmetric interlaced (TAIL)-PCR. Based upon the 5'-transgene integration sequence, conventional and TaqMan real-time PCR assays were established. The relative limit of detection for the conventional PCR assay was 0.05% for Moonshade using 100 ng total carnation genomic DNA, corresponding to approximately 79 copies of the carnation haploid genome, and the limits of detection and quantification of the TaqMan real-time PCR assay were estimated to be 51 and 254 copies of haploid carnation genomic DNA, respectively. These results are useful for identifying and quantifying Moonshade and its derivatives.
A Quantitative PCR-Electrochemical Genosensor Test for the Screening of Biotech Crops
Moura-Melo, Suely; Miranda-Castro, Rebeca; de-los-Santos-Álvarez, Noemí; Miranda-Ordieres, Arturo J.; dos Santos Junior, José Ribeiro; da Silva Fonseca, Rosana A.; Lobo-Castañón, María Jesús
2017-01-01
The design of screening methods for the detection of genetically modified organisms (GMOs) in food would improve the efficiency of their control. We report here a PCR amplification method combined with a sequence-specific electrochemical genosensor for the quantification of a DNA sequence characteristic of the 35S promoter derived from the cauliflower mosaic virus (CaMV). Specifically, we employ a genosensor constructed by chemisorption of a thiolated capture probe and p-aminothiophenol on gold surfaces to entrap on the sensing layer the unpurified PCR amplicons, together with a signaling probe labeled with fluorescein. The proposed test allows for the determination of transgene copy number in both hemizygous (maize MON810 trait) and homozygous (soybean GTS40-3-2) transformed plants, and exhibits a limit of quantification of at least 0.25% for both kinds of GMO lines. PMID:28420193
Deng, Yue; Bao, Feng; Yang, Yang; Ji, Xiangyang; Du, Mulong; Zhang, Zhengdong
2017-01-01
Abstract The automated transcript discovery and quantification of high-throughput RNA sequencing (RNA-seq) data are important tasks of next-generation sequencing (NGS) research. However, these tasks are challenging due to the uncertainties that arise in the inference of complete splicing isoform variants from partially observed short reads. Here, we address this problem by explicitly reducing the inherent uncertainties in a biological system caused by missing information. In our approach, the RNA-seq procedure for transforming transcripts into short reads is considered an information transmission process. Consequently, the data uncertainties are substantially reduced by exploiting the information transduction capacity of information theory. The experimental results obtained from the analyses of simulated datasets and RNA-seq datasets from cell lines and tissues demonstrate the advantages of our method over state-of-the-art competitors. Our algorithm, MaxInfo, is released as an open-source implementation. PMID:28911101
Examination of Icing Induced Loss of Control and Its Mitigations
NASA Technical Reports Server (NTRS)
Reehorst, Andrew L.; Addy, Harold E., Jr.; Colantonio, Renato O.
2010-01-01
Factors external to the aircraft are often a significant causal factor in loss of control (LOC) accidents. In today's aviation world, very few accidents stem from a single cause; most involve a number of causal factors that culminate in a LOC accident. Very often the "trigger" that initiates an accident sequence is an external environmental factor. In a recent NASA statistical analysis of LOC accidents, aircraft icing was shown to be the most common external environmental LOC causal factor for scheduled operations. When investigating LOC accidents or incidents, aircraft icing causal factors can be categorized into three groups: 1) in-flight encounter with supercooled liquid water clouds, 2) take-off with ice contamination, or 3) in-flight encounter with high concentrations of ice crystals. As with other flight hazards, icing-induced LOC accidents can be prevented through avoidance, detection, and recovery mitigations. For icing hazards, avoidance can take the form of avoiding flight into icing conditions or of making the aircraft tolerant to icing conditions. Icing detection mitigations can take the form of detecting icing conditions or detecting early performance degradation caused by icing. Recovery from icing-induced LOC requires a flight crew or automated systems capable of accounting for reduced aircraft performance and degraded control authority during the recovery maneuvers. In this report we review the icing-induced LOC accident mitigations defined in a recent LOC study and, for each mitigation, describe a research topic required to enable or strengthen it. Many of these research topics are already included in ongoing or planned NASA icing research activities or are being addressed by members of the icing research community. These research activities are described, and the status of the ongoing or planned research to address the technology needs is discussed.
NASA Astrophysics Data System (ADS)
Cudalbu, C.; Mlynárik, V.; Xin, L.; Gruetter, Rolf
2009-10-01
Reliable quantification of the macromolecule signals in short echo-time 1H MRS spectra is particularly important at high magnetic fields for an accurate quantification of metabolite concentrations (the neurochemical profile), due to the effectively increased spectral resolution of the macromolecule components. The purpose of the present study was to assess two approaches to quantification that take the contribution of macromolecules into account in the quantification step. 1H spectra were acquired on a 14.1 T/26 cm horizontal scanner on five rats using the ultra-short echo-time SPECIAL (spin echo full intensity acquired localization) spectroscopy sequence. Metabolite concentrations were estimated using LCModel, combined with a simulated basis set of metabolites using published spectral parameters and either the spectrum of macromolecules measured in vivo using an inversion recovery technique, or a baseline simulated by the built-in spline function. The fitted spline function resulted in a smooth approximation of the in vivo macromolecules but, in accordance with previous studies using Subtract-QUEST, could not completely reproduce all features of the in vivo macromolecule spectrum at 14.1 T. As a consequence, the measured macromolecular 'baseline' led to a more accurate and reliable quantification at higher field strengths.
Recent plant studies using Victoria 2.0
DOE Office of Scientific and Technical Information (OSTI.GOV)
BIXLER,NATHAN E.; GASSER,RONALD D.
2000-03-08
VICTORIA 2.0 is a mechanistic computer code designed to analyze fission product behavior within the reactor coolant system (RCS) during a severe nuclear reactor accident. It provides detailed predictions of the release of radioactive and nonradioactive materials from the reactor core and the transport and deposition of these materials within the RCS and secondary circuits. These predictions account for the chemical and aerosol processes that affect radionuclide behavior. VICTORIA 2.0 was released in early 1999; a new version, VICTORIA 2.1, is now under development. The largest improvements in VICTORIA 2.1 are connected with the thermochemical database, which is being revised and expanded following the recommendations of a peer review. Three risk-significant severe accident sequences have recently been investigated using the VICTORIA 2.0 code. The focus here is on how various chemistry options affect the predictions. Additionally, the VICTORIA predictions are compared with ones made using the MELCOR code. The three sequences are a station blackout in a GE BWR and steam generator tube rupture (SGTR) and pump-seal LOCA sequences in a 3-loop Westinghouse PWR. These sequences cover a range of system pressures, from fully depressurized to full system pressure. The chief results of this study are the fission product fractions that are retained in the core, RCS, secondary, and containment and the fractions that are released into the environment.
Jacchia, Sara; Nardini, Elena; Savini, Christian; Petrillo, Mauro; Angers-Loustau, Alexandre; Shim, Jung-Hyun; Trijatmiko, Kurniawan; Kreysa, Joachim; Mazzara, Marco
2015-02-18
In this study, we developed, optimized, and in-house validated a real-time PCR method for the event-specific detection and quantification of Golden Rice 2, a genetically modified rice with provitamin A in the grain. We optimized and evaluated the performance of the taxon (targeting rice Phospholipase D α2 gene)- and event (targeting the 3' insert-to-plant DNA junction)-specific assays that compose the method as independent modules, using haploid genome equivalents as unit of measurement. We verified the specificity of the two real-time PCR assays and determined their dynamic range, limit of quantification, limit of detection, and robustness. We also confirmed that the taxon-specific DNA sequence is present in single copy in the rice genome and verified its stability of amplification across 132 rice varieties. A relative quantification experiment evidenced the correct performance of the two assays when used in combination.
Nuclear Power Plant Cyber Security Discrete Dynamic Event Tree Analysis (LDRD 17-0958) FY17 Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wheeler, Timothy A.; Denman, Matthew R.; Williams, R. A.
Instrumentation and control of nuclear power is transforming from analog to modern digital assets. These control systems perform key safety and security functions. This transformation is occurring in new plant designs as well as in the existing fleet of plants as the operation of those plants is extended to 60 years. This transformation introduces new and unknown issues involving both digital asset induced safety issues and security issues. Traditional nuclear power risk assessment tools and cyber security assessment methods have not been modified or developed to address the unique nature of cyber failure modes and of cyber security threat vulnerabilities. This Lab-Directed Research and Development project has developed a dynamic cyber-risk informed tool to facilitate the analysis of unique cyber failure modes and the time sequencing of cyber faults, both malicious and non-malicious, and to impose those cyber exploits and cyber faults onto a nuclear power plant accident sequence simulator code to assess how cyber exploits and cyber faults could interact with a plant's digital instrumentation and control (DI&C) system and defeat or circumvent a plant's cyber security controls. This was achieved by coupling an existing Sandia National Laboratories nuclear accident dynamic simulator code with a cyber emulytics code to demonstrate real-time simulation of cyber exploits and their impact on automatic DI&C responses. Studying such potential time-sequenced cyber-attacks and their risks (i.e., the associated impact and the associated degree of difficulty to achieve the attack vector) on accident management establishes a technical risk informed framework for developing effective cyber security controls for nuclear power.
Cycling transport safety quantification
NASA Astrophysics Data System (ADS)
Drbohlav, Jiri; Kocourek, Josef
2018-05-01
Growing interest in cycling transport brings the necessity to design safe cycling infrastructure. In the last few years, several standards with safety elements have been designed and suggested for cycling infrastructure, but these have not been fully examined. The main parameter of a suitable and fully functional transport infrastructure is the evaluation of its safety. Common evaluation of transport infrastructure safety is based on accident statistics. These statistics are suitable for motor vehicle transport but unsuitable for cycling transport. For evaluating the safety of cycling infrastructure, traffic conflict monitoring is more suitable. The results of this method are fast, based on real traffic situations, and can be applied to any traffic situation.
Analysis of unmitigated large break loss of coolant accidents using MELCOR code
NASA Astrophysics Data System (ADS)
Pescarini, M.; Mascari, F.; Mostacci, D.; De Rosa, F.; Lombardo, C.; Giannetti, F.
2017-11-01
In the framework of the severe accident research activity developed by ENEA, a MELCOR nodalization of a generic Pressurized Water Reactor of 900 MWe has been developed. The aim of this paper is to present the analysis of MELCOR code calculations concerning two independent unmitigated large break loss of coolant accident transients occurring in the cited type of reactor. In particular, the analysis and comparison of the transients initiated by an unmitigated double-ended cold leg rupture and an unmitigated double-ended hot leg rupture in loop 1 of the primary cooling system is presented herein. This activity has been performed focusing specifically on the in-vessel phenomenology that characterizes this kind of accident. The analysis of the thermal-hydraulic transient phenomena and the core degradation phenomena is therefore presented here. The analysis of the calculated data shows the capability of the code to reproduce the phenomena typical of these transients and permits their phenomenological study. A first sequence of main events is presented and shows that the cold leg break transient evolves faster than the hot leg break transient because of the position of the break. Further analyses are in progress to quantitatively assess the results of the code nodalization for accident management strategy definition and fission product source term evaluation.
Nuclear power and probabilistic safety assessment (PSA): past through future applications
NASA Astrophysics Data System (ADS)
Stamatelatos, M. G.; Moieni, P.; Everline, C. J.
1995-03-01
Nuclear power reactor safety in the United States is about to enter a new era -- an era of risk-based management and risk-based regulation. First, there was the age of 'prescribed safety assessment,' during which a series of design-basis accidents in eight categories of severity, or classes, were postulated and analyzed. Toward the end of that era, it was recognized that 'Class 9,' or 'beyond design basis,' accidents would need special attention because of the potentially severe health and financial consequences of these accidents. The accident at Three Mile Island showed that sequences of low-consequence, high-frequency events and human errors can be much more risk dominant than the Class 9 accidents. A different form of safety assessment, PSA, emerged and began to gain ground against the deterministic safety establishment. Eventually, this led to the current regulatory requirements for individual plant examinations (IPEs). The IPEs can serve as a basis for risk-based regulation and management, a concept that may ultimately transform the U.S. regulatory process from its traditional deterministic foundations to a process predicated upon PSA. Beyond the possibility of a regulatory environment predicated upon PSA lies the possibility of using PSA as the foundation for managing daily nuclear power plant operations.
Cheng, Wei; Cai, Shu; Sun, Jia-yu; Xia, Chun-chao; Li, Zhen-lin; Chen, Yu-cheng; Zhong, Yao-zu
2015-05-01
To compare two sequences [single-shot true-FISP PSIR (single shot-PSIR) and segmented turbo-FLASH PSIR (segmented-PSIR)] for quantification of myocardial infarct size at 3.0 tesla MRI. 38 patients with clinically confirmed myocardial infarction underwent comprehensive gadolinium cardiac MRI on a 3.0 tesla MRI system (Trio, Siemens). Myocardial delayed enhancement (MDE) imaging was performed with the single shot-PSIR and segmented-PSIR sequences separately 12-20 min after gadopentetate dimeglumine injection (0.15 mmol/kg). The quality of MDE images was analysed by experienced physicians. Signal-to-noise ratio (SNR) and contrast-to-noise ratio (CNR) were compared between the two techniques. Myocardial infarct size was quantified automatically by dedicated software (Q-mass, Medis). All subjects were scanned successfully on the 3.0 T MR system. No significant difference was found between the two sequences in SNR or CNR of image quality (P>0.05), nor in total myocardial volume (P>0.05). Furthermore, there was no difference in infarct size [single shot-PSIR (30.87 ± 15.72) mL, segmented-PSIR (29.26 ± 14.07) mL] or infarct ratio [single shot-PSIR (22.94% ± 10.94%), segmented-PSIR (20.75% ± 8.78%)] between the two sequences (P>0.05). However, the average acquisition time of single shot-PSIR (21.4 s) was much less than that of segmented-PSIR (380 s). Single shot-PSIR is equal to segmented-PSIR in detecting myocardial infarct size with less acquisition time, which is valuable for clinical application and further research.
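The SNR and CNR comparisons above reduce to simple ratios; a minimal sketch with hypothetical ROI values (noise taken as the standard deviation in an artifact-free background region):

```python
def snr(roi_mean, noise_sd):
    """Signal-to-noise ratio: mean ROI signal over background noise SD."""
    return roi_mean / noise_sd

def cnr(infarct_mean, remote_mean, noise_sd):
    """Contrast-to-noise ratio between enhanced infarct and remote myocardium."""
    return (infarct_mean - remote_mean) / noise_sd

s = snr(420.0, 21.0)         # hypothetical values -> 20.0
c = cnr(420.0, 105.0, 21.0)  # -> 15.0
```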
Use of multiple competitors for quantification of human immunodeficiency virus type 1 RNA in plasma.
Vener, T; Nygren, M; Andersson, A; Uhlén, M; Albert, J; Lundeberg, J
1998-07-01
Quantification of human immunodeficiency virus type 1 (HIV-1) RNA in plasma has rapidly become an important tool in basic HIV research and in the clinical care of infected individuals. Here, a quantitative HIV assay based on competitive reverse transcription-PCR with multiple competitors was developed. Four RNA competitors containing the same PCR primer binding sequences as the viral HIV-1 RNA target were constructed. One of the PCR primers was fluorescently labeled, which facilitated discrimination between the viral RNA and competitor amplicons by fragment analysis with conventional automated sequencers. The coamplification of known amounts of the RNA competitors provided the means to establish internal calibration curves for the individual reactions, resulting in exclusion of tube-to-tube variations. Calibration curves were created from the peak areas, which were proportional to the starting amount of each competitor. The fluorescence detection format was expanded to provide a dynamic range of more than 5 log units. This quantitative assay allowed for reproducible analysis of samples containing as few as 40 viral copies of HIV-1 RNA per reaction. The within- and between-run coefficients of variation were <24% (range, 10 to 24) and <36% (range, 27 to 36), respectively. The high reproducibility (standard deviation, <0.13 log) of the overall procedure for quantification of HIV-1 RNA in plasma, including sample preparation, amplification, and detection variations, allowed reliable detection of a 0.5-log change in RNA viral load. The assay could be a useful tool for monitoring HIV-1 disease progression and antiviral treatment and can easily be adapted to the quantification of other pathogens.
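The internal calibration idea above — reading the viral copy number off a per-reaction curve built from the four co-amplified competitors — can be sketched as follows, with hypothetical peak areas and a log-log linear fit:

```python
import numpy as np

# Four competitors of known input copies are co-amplified with the sample;
# their peak areas define a calibration curve for that single reaction tube.
# All numbers are invented for illustration.
competitor_copies = np.array([40.0, 400.0, 4000.0, 40000.0])
competitor_areas = np.array([120.0, 1150.0, 11800.0, 119000.0])

slope, intercept = np.polyfit(np.log10(competitor_areas),
                              np.log10(competitor_copies), 1)

def viral_copies(target_peak_area):
    """Read the viral RNA input off this reaction's own calibration curve,
    cancelling tube-to-tube amplification differences."""
    return 10 ** (slope * np.log10(target_peak_area) + intercept)
```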
Conceptual design study of Fusion Experimental Reactor (FY86 FER): Safety
NASA Astrophysics Data System (ADS)
Seki, Yasushi; Iida, Hiromasa; Honda, Tsutomu
1987-08-01
This report describes the study on safety for FER (Fusion Experimental Reactor), which has been designed as a next-step machine to the JT-60. Though the final purpose of this study is to form an image of the design basis accident and maximum credible accident and to assess their risk or probability, etc., for the FER plant system, the emphasis of this year's study is placed on the fuel-gas circulation system, where the tritium inventory is largest. The report consists of two chapters. The first chapter summarizes the FER system and describes FMEA (Failure Mode and Effect Analysis) and the related accident progression sequences for the FER plant system as a whole. The second chapter of this report is focused on the fuel-gas circulation system, including purification, isotope separation and storage. Risk probability is assessed by the probabilistic risk analysis (PRA) procedure based on FMEA, ETA and FTA.
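FTA quantification of the kind applied to the fuel-gas circulation system combines independent basic-event probabilities through AND/OR gates; a minimal sketch (the event structure and probabilities are invented for illustration):

```python
def p_or(*ps):
    """Probability that at least one independent basic event occurs."""
    out = 1.0
    for p in ps:
        out *= (1.0 - p)
    return 1.0 - out

def p_and(*ps):
    """Probability that all independent basic events occur."""
    out = 1.0
    for p in ps:
        out *= p
    return out

# Hypothetical top event: release requires a pump failure AND a leak
# through either of two valves.
p_top = p_and(1e-3, p_or(1e-2, 2e-2))  # -> ~2.98e-5
```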
Reference tissue quantification of DCE-MRI data without a contrast agent calibration
NASA Astrophysics Data System (ADS)
Walker-Samuel, Simon; Leach, Martin O.; Collins, David J.
2007-02-01
The quantification of dynamic contrast-enhanced (DCE) MRI data conventionally requires a conversion from signal intensity to contrast agent concentration by measuring a change in the tissue longitudinal relaxation rate, R1. In this paper, it is shown that the use of a spoiled gradient-echo acquisition sequence (optimized so that signal intensity scales linearly with contrast agent concentration) in conjunction with a reference tissue-derived vascular input function (VIF), avoids the need for the conversion to Gd-DTPA concentration. This study evaluates how to optimize such sequences and which dynamic time-series parameters are most suitable for this type of analysis. It is shown that signal difference and relative enhancement provide useful alternatives when full contrast agent quantification cannot be achieved, but that pharmacokinetic parameters derived from both contain sources of error (such as those caused by differences between reference tissue and region of interest proton density and native T1 values). It is shown in a rectal cancer study that these sources of uncertainty are smaller when using signal difference, compared with relative enhancement (15 ± 4% compared with 33 ± 4%). Both of these uncertainties are of the order of those associated with the conversion to Gd-DTPA concentration, according to literature estimates.
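The two dynamic time-series parameters evaluated above can be sketched directly; this is an illustrative implementation of the standard definitions, not the authors' code, with S0 taken as the mean pre-contrast signal:

```python
import numpy as np

def signal_difference(s, baseline_pts=5):
    """Signal difference time series: S(t) - S0."""
    s0 = s[:baseline_pts].mean()
    return s - s0

def relative_enhancement(s, baseline_pts=5):
    """Relative enhancement time series: (S(t) - S0) / S0."""
    s0 = s[:baseline_pts].mean()
    return (s - s0) / s0

# Synthetic dynamic series: 5 baseline points, then contrast arrival.
s = np.array([100.0] * 5 + [150.0, 180.0, 160.0])
sd = signal_difference(s)       # peak value 80.0
re = relative_enhancement(s)    # peak value 0.8
```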
Meinhardt, Kelley A; Bertagnolli, Anthony; Pannu, Manmeet W; Strand, Stuart E; Brown, Sally L; Stahl, David A
2015-04-01
Ammonia-oxidizing archaea (AOA) and bacteria (AOB) fill key roles in the nitrogen cycle. Thus, well-vetted methods for characterizing their distribution are essential for framing studies of their significance in natural and managed systems. Quantification of the gene coding for one subunit of the ammonia monooxygenase (amoA) by polymerase chain reaction is frequently employed to enumerate the two groups. However, variable amplification of sequence variants comprising this conserved genetic marker for ammonia oxidizers potentially compromises within- and between-system comparisons. We compared the performance of newly designed non-degenerate quantitative polymerase chain reaction primer sets to existing primer sets commonly used to quantify the amoA of AOA and AOB using a collection of plasmids and soil DNA samples. The new AOA primer set provided improved quantification of model mixtures of different amoA sequence variants and increased detection of amoA in DNA recovered from soils. Although both primer sets for the AOB provided similar results for many comparisons, the new primers demonstrated increased detection in environmental application. Thus, the new primer sets should provide a useful complement to primers now commonly used to characterize the environmental distribution of AOA and AOB. © 2014 Society for Applied Microbiology and John Wiley & Sons Ltd.
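Quantification of amoA copies by qPCR of the kind described typically runs unknowns against a dilution-series standard curve; a sketch with hypothetical Cq values for a 10-fold dilution series:

```python
import numpy as np

# Hypothetical amoA standard curve (perfectly linear for illustration).
log10_copies = np.array([7, 6, 5, 4, 3, 2], dtype=float)
cq = np.array([12.1, 15.5, 18.9, 22.3, 25.7, 29.1])

slope, intercept = np.polyfit(log10_copies, cq, 1)
# Amplification efficiency from the slope; 1.0 would mean perfect doubling.
efficiency = 10 ** (-1.0 / slope) - 1

def copies_from_cq(sample_cq):
    """Interpolate an unknown sample's copy number from the standard curve."""
    return 10 ** ((sample_cq - intercept) / slope)
```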
Liu, Ruolin; Dickerson, Julie
2017-11-01
We propose a novel method and software tool, Strawberry, for transcript reconstruction and quantification from RNA-Seq data under the guidance of genome alignment and independent of gene annotation. Strawberry consists of two modules: assembly and quantification. The novelty of Strawberry is that the two modules use different optimization frameworks but utilize the same data graph structure, which allows a highly efficient, expandable and accurate algorithm for dealing with large data. The assembly module parses aligned reads into splicing graphs, and uses network flow algorithms to select the most likely transcripts. The quantification module uses a latent class model to assign read counts from the nodes of splicing graphs to transcripts. Strawberry simultaneously estimates the transcript abundances and corrects for sequencing bias through an EM algorithm. Based on simulations, Strawberry outperforms Cufflinks and StringTie in terms of both assembly and quantification accuracy. In an evaluation on a real data set, the transcript expression estimated by Strawberry has the highest correlation with NanoString probe counts, an independent experimental measure of transcript expression. Strawberry is written in C++14, and is available as open source software at https://github.com/ruolin/strawberry under the MIT license.
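The EM idea behind such quantification modules — fractionally assigning ambiguous reads to transcripts, then re-estimating abundances — can be illustrated in a heavily simplified form (no splicing graphs, length normalization, or bias correction; the compatibility matrix is invented):

```python
import numpy as np

# compat[i, j] = 1 if read i is compatible with transcript j.
compat = np.array([[1, 1, 0],
                   [1, 0, 0],
                   [0, 1, 1],
                   [0, 0, 1],
                   [1, 1, 0]], dtype=float)

theta = np.full(3, 1.0 / 3.0)  # initial transcript abundance estimates
for _ in range(200):
    # E-step: split each read across its compatible transcripts
    # in proportion to the current abundance estimates.
    w = compat * theta
    w /= w.sum(axis=1, keepdims=True)
    # M-step: abundances become the normalized expected read counts.
    theta = w.sum(axis=0) / w.sum()
```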
Targeted RNA-Sequencing with Competitive Multiplex-PCR Amplicon Libraries
Blomquist, Thomas M.; Crawford, Erin L.; Lovett, Jennie L.; Yeo, Jiyoun; Stanoszek, Lauren M.; Levin, Albert; Li, Jia; Lu, Mei; Shi, Leming; Muldrew, Kenneth; Willey, James C.
2013-01-01
Whole transcriptome RNA-sequencing is a powerful tool, but is costly and yields complex data sets that limit its utility in molecular diagnostic testing. A targeted quantitative RNA-sequencing method that is reproducible and reduces the number of sequencing reads required to measure transcripts over the full range of expression would be better suited to diagnostic testing. Toward this goal, we developed a competitive multiplex PCR-based amplicon sequencing library preparation method that a) targets only the sequences of interest and b) controls for inter-target variation in PCR amplification during library preparation by measuring each transcript native template relative to a known number of synthetic competitive template internal standard copies. To determine the utility of this method, we intentionally selected PCR conditions that would cause transcript amplification products (amplicons) to converge toward equimolar concentrations (normalization) during library preparation. We then tested whether this approach would enable accurate and reproducible quantification of each transcript across multiple library preparations, and at the same time reduce (through normalization) total sequencing reads required for quantification of transcript targets across a large range of expression. We demonstrate excellent reproducibility (R2 = 0.997) with 97% accuracy to detect 2-fold change using External RNA Controls Consortium (ERCC) reference materials; high inter-day, inter-site and inter-library concordance (R2 = 0.97-0.99) using FDA Sequencing Quality Control (SEQC) reference materials; and cross-platform concordance with both TaqMan qPCR (R2 = 0.96) and whole transcriptome RNA-sequencing following "traditional" library preparation using Illumina NGS kits (R2 = 0.94). Using this method, the number of sequencing reads required to accurately quantify more than 100 targeted transcripts expressed over a 10^7-fold range was reduced more than 10,000-fold, from 2.3×10^9 to 1.4×10^5 sequencing reads.
These studies demonstrate that the competitive multiplex-PCR amplicon library preparation method presented here provides the quality control, reproducibility, and reduced sequencing reads necessary for development and implementation of targeted quantitative RNA-sequencing biomarkers in molecular diagnostic testing. PMID:24236095
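The competitive internal-standard normalization reduces to a single ratio: because native template and internal standard compete under identical amplification conditions, their read ratio preserves their input ratio regardless of how strongly normalization compresses absolute read counts. A hypothetical sketch:

```python
def native_copies(native_reads, is_reads, is_copies_added):
    """Back-calculate native template abundance from the read ratio of
    native template to its competitive internal standard (IS).
    Hypothetical helper illustrating the normalization principle."""
    return is_copies_added * native_reads / is_reads

# e.g. 8,000 native reads vs 2,000 IS reads with 1,000 IS copies spiked in:
est = native_copies(8000, 2000, 1000)  # -> 4000.0 copies
```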
Fukushima Accident: Sequence of Events and Lessons Learned
NASA Astrophysics Data System (ADS)
Morse, Edward C.
2011-10-01
The Fukushima Dai-Ichi nuclear power station suffered a devastating magnitude 9.0 earthquake followed by a 14.0 m tsunami on 11 March 2011. The subsequent loss of power for emergency core cooling systems resulted in damage to the fuel in the cores of three reactors. The relief of pressure from the containment in these three reactors led to sufficient hydrogen gas release to cause explosions in the buildings housing the reactors. There was probably subsequent damage to a spent fuel pool of a fourth reactor caused by debris from one of these explosions. Resultant releases of fission product isotopes in air were significant and have been estimated to be in the 3.7-6.3 ×10^17 Bq range (~10 MCi) for 131I and 137Cs combined, or approximately one tenth that of the Chernobyl accident. A synopsis of the sequence of events leading up to this large release of radioactivity will be presented, along with likely scenarios for stabilization and site cleanup in the future. Some aspects of the isotope monitoring programs, both locally and at large, will also be discussed. An assessment of radiological health risk for the plant workers as well as the general public will also be presented. Finally, the impact of this accident on design and deployment of nuclear generating stations in the future will be discussed.
RNA-Seq for Bacterial Gene Expression.
Poulsen, Line Dahl; Vinther, Jeppe
2018-06-01
RNA sequencing (RNA-seq) has become the preferred method for global quantification of bacterial gene expression. With the continued improvements in sequencing technology and data analysis tools, the most labor-intensive and expensive part of an RNA-seq experiment is the preparation of sequencing libraries, which is also essential for the quality of the data obtained. Here, we present a straightforward and inexpensive basic protocol for preparation of strand-specific RNA-seq libraries from bacterial RNA as well as a computational pipeline for the data analysis of sequencing reads. The protocol is based on the Illumina platform and allows easy multiplexing of samples and the removal of sequencing reads that are PCR duplicates. © 2018 John Wiley & Sons, Inc.
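The duplicate-removal step mentioned above can be illustrated with a toy deduplication keyed on mapping position and unique molecular identifier (a simplification; real pipelines operate on aligned BAM records):

```python
def dedup_reads(reads):
    """Collapse PCR duplicates: keep the first read seen for each
    (mapping position, UMI) pair."""
    seen = set()
    unique = []
    for pos, umi, seq in reads:
        key = (pos, umi)
        if key not in seen:
            seen.add(key)
            unique.append((pos, umi, seq))
    return unique

# Invented example reads: (position, UMI, sequence)
reads = [(100, "AAT", "ACGTACGT"),
         (100, "AAT", "ACGTACGG"),   # PCR duplicate of the first read
         (100, "CCA", "ACGTACGT"),   # same position, different molecule
         (205, "AAT", "TTGACCAA")]
unique = dedup_reads(reads)          # 3 reads survive
```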
NASA Astrophysics Data System (ADS)
Boss, Andreas; Martirosian, Petros; Artunc, Ferruh; Risler, Teut; Claussen, Claus D.; Schlemmer, Heinz-Peter; Schick, Fritz
2007-03-01
Purpose: As the MR contrast medium gadobutrol is completely eliminated via glomerular filtration, the glomerular filtration rate (GFR) can be quantified after bolus injection of gadobutrol and complete mixing in the extracellular fluid volume (ECFV) by measuring the signal decrease within the liver parenchyma. Two different navigator-gated single-shot saturation-recovery sequences were tested for suitability for GFR quantification: a TurboFLASH and a TrueFISP readout technique. Materials and Methods: Ten healthy volunteers (mean age 26.1+/-3.6) were divided equally into two subgroups. After bolus injection of 0.05 mmol/kg gadobutrol, coronal single-slice images of the liver were recorded every 4-5 seconds during free breathing using either the TurboFLASH or the TrueFISP technique. Time-intensity curves were determined from manually drawn regions of interest over the liver parenchyma. Both sequences were subsequently evaluated regarding signal-to-noise ratio (SNR) and the behaviour of the signal intensity curves. The calculated GFR values were compared to an iopromide clearance gold standard. Results: The TrueFISP sequence exhibited a 3.4-fold higher SNR compared to the TurboFLASH sequence and markedly lower variability of the recorded time-intensity curves. The calculated mean GFR values were 107.0+/-16.1 ml/min/1.73m2 (iopromide: 92.1+/-14.5 ml/min/1.73m2) for the TrueFISP technique and 125.6+/-24.1 ml/min/1.73m2 (iopromide: 97.7+/-6.3 ml/min/1.73m2) for the TurboFLASH approach. The mean paired difference with TrueFISP (15.0 ml/min/1.73m2) was lower than with the TurboFLASH method (27.9 ml/min/1.73m2). Conclusion: The global GFR can be quantified via measurement of gadobutrol clearance from the ECFV. A saturation-recovery TrueFISP sequence allows for more reliable GFR quantification than a saturation-recovery TurboFLASH technique.
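The clearance principle described — GFR inferred from the rate of tracer disappearance out of the ECFV — can be sketched with a mono-exponential decay model and synthetic data (the model form, ECFV value, and numbers are assumptions for illustration, not the authors' processing):

```python
import numpy as np

# After complete mixing, tracer concentration in the ECFV is assumed to
# decay as exp(-(GFR/ECFV) * t), so GFR follows from the fitted decay
# constant times an ECFV estimate.
t_min = np.array([0.0, 10.0, 20.0, 30.0, 40.0, 50.0])
ecfv_ml = 15000.0                       # assumed extracellular fluid volume
conc = 1.0 * np.exp(-0.008 * t_min)     # synthetic liver time-intensity data

k = -np.polyfit(t_min, np.log(conc), 1)[0]  # decay constant per minute
gfr_ml_min = k * ecfv_ml                    # -> 120.0 mL/min
```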
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chu, T.L.; Musicki, Z.; Kohut, P.
1994-06-01
During 1989, the Nuclear Regulatory Commission (NRC) initiated an extensive program to carefully examine the potential risks during low power and shutdown operations. The program includes two parallel projects being performed by Brookhaven National Laboratory (BNL) and Sandia National Laboratories (SNL). Two plants, Surry (pressurized water reactor) and Grand Gulf (boiling water reactor), were selected as the plants to be studied. The objectives of the program are to assess the risks of severe accidents initiated during plant operational states other than full power operation and to compare the estimated core damage frequencies, important accident sequences and other qualitative and quantitative results with those of accidents initiated during full power operation as assessed in NUREG-1150. The objective of this report is to document the approach utilized in the Surry plant and discuss the results obtained. A parallel report for the Grand Gulf plant is prepared by SNL. This study shows that the core-damage frequency during mid-loop operation at the Surry plant is comparable to that of power operation. The authors recognize that there is very large uncertainty in the human error probabilities in this study. This study identified that only a few procedures are available for mitigating accidents that may occur during shutdown. Procedures written specifically for shutdown accidents would be useful.
Inverse modelling of radionuclide release rates using gamma dose rate observations
NASA Astrophysics Data System (ADS)
Hamburger, Thomas; Stohl, Andreas; von Haustein, Christoph; Thummerer, Severin; Wallner, Christian
2014-05-01
Severe accidents in nuclear power plants, such as the historical accident in Chernobyl in 1986 or the more recent disaster in the Fukushima Dai-ichi nuclear power plant in 2011, have drastic impacts on the population and environment. The hazardous consequences reach out on a national and continental scale. Environmental measurements and methods to model the transport and dispersion of the released radionuclides serve as a platform to assess the regional impact of nuclear accidents - both for research purposes and, more importantly, to determine the immediate threat to the population. However, the assessments of the regional radionuclide activity concentrations and the individual exposure to radiation dose are subject to several uncertainties, for example the accurate model representation of wet and dry deposition. One of the most significant uncertainties, however, results from the estimation of the source term, that is, the time-dependent quantification of the released spectrum of radionuclides during the course of the nuclear accident. The quantification of the source terms of severe nuclear accidents may either remain uncertain (e.g. Chernobyl; Devell et al., 1995) or rely on rather rough estimates of released key radionuclides given by the operators. Precise measurements are mostly missing due to practical limitations during the accident. Inverse modelling can be used to realise a feasible estimation of the source term (Davoine and Bocquet, 2007). Existing point measurements of radionuclide activity concentrations are therefore combined with atmospheric transport models. The release rates of radionuclides at the accident site are then obtained by improving the agreement between the modelled and observed concentrations (Stohl et al., 2012). The accuracy of the method, and hence of the resulting source term, depends amongst others on the availability, reliability and the resolution in time and space of the observations.
Radionuclide activity concentrations are observed on a relatively sparse grid, and the temporal resolution of available data may be low, on the order of hours or a day. Gamma dose rates, on the other hand, are observed routinely on a much denser grid and with higher temporal resolution. Gamma dose rate measurements contain no explicit information on the observed spectrum of radionuclides and have to be interpreted carefully. Nevertheless, they provide valuable information for the inverse evaluation of the source term due to their availability (Saunier et al., 2013). We present a new inversion approach combining an atmospheric dispersion model and observations of radionuclide activity concentrations and gamma dose rates to obtain the source term of radionuclides. We use the Lagrangian particle dispersion model FLEXPART (Stohl et al., 1998; Stohl et al., 2005) to model the atmospheric transport of the released radionuclides. The gamma dose rates are calculated from the modelled activity concentrations. The inversion method uses a Bayesian formulation considering uncertainties for the a priori source term and the observations (Eckhardt et al., 2008). The a priori information on the source term is a first guess. The gamma dose rate observations will be used with inverse modelling to improve this first guess and to retrieve a reliable source term. The details of this method will be presented at the conference. This work is funded by the Bundesamt für Strahlenschutz BfS, Forschungsvorhaben 3612S60026. References: Davoine, X. and Bocquet, M., Atmos. Chem. Phys., 7, 1549-1564, 2007. Devell, L., et al., OCDE/GD(96)12, 1995. Eckhardt, S., et al., Atmos. Chem. Phys., 8, 3881-3897, 2008. Saunier, O., et al., Atmos. Chem. Phys., 13, 11403-11421, 2013. Stohl, A., et al., Atmos. Environ., 32, 4245-4264, 1998. Stohl, A., et al., Atmos. Chem. Phys., 5, 2461-2474, 2005. Stohl, A., et al., Atmos. Chem. Phys., 12, 2313-2343, 2012.
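In its simplest linear-Gaussian form, a Bayesian inversion of this kind (cf. Eckhardt et al., 2008) amounts to regularized least squares: the posterior source term balances agreement with the observations against departure from the a priori first guess. A toy sketch with an invented sensitivity matrix and uncertainties:

```python
import numpy as np

# Toy source-reconstruction problem: y = H x + noise, where H holds
# source-receptor sensitivities from a dispersion model, y are the
# observations, and x_a is the a priori source term. All numbers invented.
rng = np.random.default_rng(0)
n_obs, n_src = 40, 10
H = rng.random((n_obs, n_src))
x_true = np.linspace(1.0, 5.0, n_src)
y = H @ x_true + rng.normal(0.0, 0.05, n_obs)

x_a = np.full(n_src, 3.0)        # first-guess source term
sigma_obs, sigma_b = 0.05, 2.0   # observation / a priori uncertainties

# Minimise ||Hx - y||^2 / sigma_obs^2 + ||x - x_a||^2 / sigma_b^2
# via the normal equations of the regularized least-squares problem.
A = H.T @ H / sigma_obs**2 + np.eye(n_src) / sigma_b**2
b = H.T @ y / sigma_obs**2 + x_a / sigma_b**2
x_post = np.linalg.solve(A, b)   # posterior (retrieved) source term
```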
ERIC Educational Resources Information Center
Thumm, Walter
1975-01-01
Relates the story of Wilhelm Conrad Röntgen and presents one view of the extent to which the discovery of the x-ray was an accident. Reconstructs the sequence of events that led to the discovery and includes photographs of the lab where he worked and replicas of apparatus used. (GS)
Droplet digital PCR technology promises new applications and research areas.
Manoj, P
2016-01-01
Digital polymerase chain reaction (dPCR) is used to quantify nucleic acids; its applications are in the detection and precise quantification of low-level pathogens, rare genetic sequences, copy number variants and rare mutations, and in relative gene expression. Here the PCR is performed in a large number of reaction chambers or partitions, and the reaction is carried out in each partition individually. This separation allows a more reliable collection and sensitive measurement of nucleic acid. Results are calculated by counting partitions with an amplified target sequence (positive droplets) and partitions in which there is no amplification (negative droplets). The mean number of target sequences is calculated by a Poisson algorithm; the Poisson correction compensates for the presence of more than one copy of the target gene in any droplet. The method provides accurate and precise information, is highly reproducible, and is less susceptible to inhibitors than qPCR. It has been demonstrated in studying variations in gene sequences, such as copy number variants and point mutations, in distinguishing differences between expression of nearly identical alleles, and in assessment of clinically relevant genetic variations, and it is routinely used for clonal amplification of samples for NGS methods. dPCR enables more reliable prediction of tumor status and patient prognosis by absolute quantitation using reference normalizations. Rare mitochondrial DNA deletions associated with a range of diseases and disorders as well as aging can be accurately detected with droplet digital PCR.
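The Poisson correction described above can be sketched in a few lines (the droplet volume is an assumed typical value, not from the text):

```python
import math

def dpcr_quantify(positive, negative, partition_volume_ul=0.00085):
    """Estimate target abundance from digital PCR partition counts.

    The fraction of negative partitions gives the Poisson mean number of
    template copies per partition, lam = -ln(p_negative), which corrects
    for partitions that received more than one copy.
    partition_volume_ul is an assumed droplet volume (~0.85 nL)."""
    total = positive + negative
    p_neg = negative / total
    lam = -math.log(p_neg)                 # mean copies per partition
    copies_per_ul = lam / partition_volume_ul
    return lam, copies_per_ul

lam, conc = dpcr_quantify(positive=5000, negative=15000)
```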
Dikow, Nicola; Nygren, Anders Oh; Schouten, Jan P; Hartmann, Carolin; Krämer, Nikola; Janssen, Bart; Zschocke, Johannes
2007-06-01
Standard methods used for genomic methylation analysis allow the detection of complete absence of either methylated or non-methylated alleles but are usually unable to detect changes in the proportion of methylated and unmethylated alleles. We compare two methods for quantitative methylation analysis, using the chromosome 15q11-q13 imprinted region as a model. Absence of the non-methylated paternal allele in this region leads to Prader-Willi syndrome (PWS), whilst absence of the methylated maternal allele results in Angelman syndrome (AS). A proportion of AS cases is caused by mosaic imprinting defects, which may be missed with standard methods and require quantitative analysis for their detection. Sequence-based quantitative methylation analysis (SeQMA) involves quantitative comparison of peaks generated through sequencing reactions after bisulfite treatment. It is simple, cost-effective, and can be easily established for a large number of genes. However, our results support previous suggestions that methods based on bisulfite treatment may be problematic for exact quantification of methylation status. Methylation-specific multiplex ligation-dependent probe amplification (MS-MLPA) avoids bisulfite treatment. It detects changes in both CpG methylation and copy number of up to 40 chromosomal sequences in one simple reaction. Once established in a laboratory setting, the method is more accurate, reliable, and less time-consuming.
Applications of Single-Cell Sequencing for Multiomics.
Xu, Yungang; Zhou, Xiaobo
2018-01-01
Single-cell sequencing interrogates the sequence or chromatin information of individual cells with advanced next-generation sequencing technologies. It provides a higher resolution of cellular differences and a better understanding of the underlying genetic and epigenetic mechanisms of an individual cell in the context of its survival and adaptation to its microenvironment. However, single-cell sequencing and its downstream data analysis are more challenging to perform, owing to the minimal amount of starting material, sample loss, and contamination. In addition, because only picogram quantities of nucleic acids are available, heavy amplification is often needed during sample preparation, resulting in uneven coverage, noise, and inaccurate quantification of sequencing data. All these unique properties raise challenges for, and thus high demands on, computational methods that specifically fit single-cell sequencing data. We here comprehensively survey the current strategies and challenges for multiple types of single-cell sequencing, including single-cell transcriptome, genome, and epigenome sequencing, beginning with a brief introduction to the sequencing techniques for single cells.
Russell, Jason D.; Scalf, Mark; Book, Adam J.; Ladror, Daniel T.; Vierstra, Richard D.; Smith, Lloyd M.; Coon, Joshua J.
2013-01-01
Quantification of gas-phase intact protein ions by mass spectrometry (MS) is impeded by highly variable ionization, ion transmission, and ion detection efficiencies. Therefore, quantification of proteins using MS-associated techniques is almost exclusively done after proteolysis, where peptides serve as proxies for estimating protein abundance. Advances in instrumentation, protein separations, and informatics have made large-scale sequencing of intact proteins using top-down proteomics accessible to the proteomics community; yet quantification of proteins using a top-down workflow has largely been unaddressed. Here we describe a label-free approach to determine the abundance of intact proteins separated by nanoflow liquid chromatography prior to MS analysis by using solution-phase measurements of ultraviolet light-induced intrinsic fluorescence (UV-IF). UV-IF is measured directly at the electrospray interface just prior to the capillary exit, where proteins containing at least one tryptophan residue are readily detected. UV-IF quantification was demonstrated using commercially available protein standards and provided more accurate and precise protein quantification than MS ion current. We evaluated the parallel use of UV-IF and top-down tandem MS for quantification and identification of protein subunits and associated proteins from an affinity-purified 26S proteasome sample from Arabidopsis thaliana. We identified 26 unique proteins and quantified 13 tryptophan-containing species. Our analyses discovered previously unidentified N-terminal processing of the β6 (PBF1) and β7 (PBG1) subunits; such processing of PBG1 may generate a heretofore unknown additional protease active site upon cleavage. In addition, our approach permitted the unambiguous identification and quantification of both isoforms of the proteasome-associated protein DSS1. PMID:23536786
Benz, Matthias R; Bongartz, Georg; Froehlich, Johannes M; Winkel, David; Boll, Daniel T; Heye, Tobias
2018-07-01
The aim was to investigate the variation of the arterial input function (AIF) within and between various DCE MRI sequences. A dynamic flow phantom and a steady signal reference were scanned on a 3T MRI using fast low angle shot (FLASH) 2d, FLASH3d (parallel imaging factor (P) = P0, P2, P4), volumetric interpolated breath-hold examination (VIBE) (P = P0, P3, P2 × 2, P2 × 3, P3 × 2), golden-angle radial sparse parallel imaging (GRASP), and time-resolved imaging with stochastic trajectories (TWIST). Signal-over-time curves were normalized and quantitatively analyzed by full width half maximum (FWHM) measurements to assess variation within and between sequences. The coefficient of variation (CV) for the steady signal reference ranged from 0.07-0.8%. The non-accelerated gradient echo FLASH2d, FLASH3d, and VIBE sequences showed low within-sequence variation, with CVs of 2.1%, 1.0%, and 1.6%, respectively. The maximum FWHM CV was 3.2% for parallel imaging acceleration (VIBE P2 × 3), 2.7% for GRASP, and 9.1% for TWIST. The FWHM CV between sequences ranged from 8.5-14.4% for most non-accelerated/accelerated gradient echo sequences, except 6.2% for FLASH3d P0 and 0.3% for FLASH3d P2; the GRASP FWHM CV was 9.9% versus 28% for TWIST. MRI acceleration techniques vary in reproducibility and quantification of the AIF. Incomplete coverage of k-space with TWIST, as a representative of view-sharing techniques, showed the highest variation within sequences and might be less suited for reproducible quantification of the AIF. Copyright © 2018 Elsevier B.V. All rights reserved.
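The within-sequence variation reported above is a coefficient of variation computed over repeated FWHM measurements of the normalized signal-over-time curves. A minimal sketch of both quantities, assuming a single well-sampled peak on a regular time grid (function names and the example data are illustrative, not from the study):

```python
import statistics

def fwhm(times, signal):
    """Full width at half maximum of a sampled peak.

    Assumes one dominant peak well above baseline; returns the time span
    between the first and last samples at or above half the maximum."""
    half = max(signal) / 2.0
    above = [t for t, s in zip(times, signal) if s >= half]
    return above[-1] - above[0]

def coefficient_of_variation(values):
    """CV in percent: sample standard deviation over mean."""
    return statistics.stdev(values) / statistics.mean(values) * 100.0

# toy triangular bolus curve sampled at 1 s intervals
times = list(range(11))
signal = [0, 1, 2, 3, 4, 5, 4, 3, 2, 1, 0]
width = fwhm(times, signal)  # samples >= 2.5 span t = 3 .. 7
```

Repeating the scan, collecting one FWHM per repetition, and feeding the list to `coefficient_of_variation` reproduces the kind of within-sequence CV the abstract tabulates.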
Germer, Jeffrey J.; Ankoudinova, Irina; Belousov, Yevgeniy S.; Mahoney, Walt; Dong, Chen; Meng, Jihong; Mandrekar, Jayawant N.
2017-01-01
Hepatitis E virus (HEV) has emerged as a cause of chronic hepatitis among immunocompromised patients. Molecular assays have become important tools for the diagnosis and management of these chronically infected patients. A real-time reverse transcription-quantitative PCR (RT-qPCR) assay utilizing Pleiades probe chemistry and an RNA internal control for the simultaneous detection and quantification of HEV RNA in human serum was developed based on an adaptation of a previously described and broadly reactive primer set targeting the overlapping open reading frame 2/3 (ORF2/3) nucleotide sequence of HEV. A chimeric bovine viral diarrhea virus construct containing an HEV RNA insert (SynTura HEV) was developed, value assigned with the first World Health Organization (WHO) international standard for HEV RNA (code 6329/10), and used to prepare working assay calibrators and controls, which supported an assay quantification range of 100 to 5,000,000 IU/ml. The analytical sensitivity (95% detection rate) of this assay was 25.2 IU/ml (95% confidence interval [CI], 19.2 to 44.1 IU/ml). The assay successfully amplified 16 different HEV sequences with significant nucleotide mismatching in primer/probe binding regions, while evaluation of a WHO international reference panel for HEV genotypes (code 8578/13) showed viral load results falling within the result ranges generated by WHO collaborative study participants for all panel members (genotypes 1 to 4). Broadly reactive RT-qPCR primers targeting HEV ORF2/3 were successfully adapted for use in an assay based on Pleiades probe chemistry. The availability of secondary standards calibrated to the WHO HEV international standard can improve the standardization and performance of assays for the detection and quantification of HEV RNA. PMID:28228493
Use of Multiple Competitors for Quantification of Human Immunodeficiency Virus Type 1 RNA in Plasma
Vener, Tanya; Nygren, Malin; Andersson, AnnaLena; Uhlén, Mathias; Albert, Jan; Lundeberg, Joakim
1998-01-01
Quantification of human immunodeficiency virus type 1 (HIV-1) RNA in plasma has rapidly become an important tool in basic HIV research and in the clinical care of infected individuals. Here, a quantitative HIV assay based on competitive reverse transcription-PCR with multiple competitors was developed. Four RNA competitors containing identical PCR primer binding sequences as the viral HIV-1 RNA target were constructed. One of the PCR primers was fluorescently labeled, which facilitated discrimination between the viral RNA and competitor amplicons by fragment analysis with conventional automated sequencers. The coamplification of known amounts of the RNA competitors provided the means to establish internal calibration curves for the individual reactions resulting in exclusion of tube-to-tube variations. Calibration curves were created from the peak areas, which were proportional to the starting amount of each competitor. The fluorescence detection format was expanded to provide a dynamic range of more than 5 log units. This quantitative assay allowed for reproducible analysis of samples containing as few as 40 viral copies of HIV-1 RNA per reaction. The within- and between-run coefficients of variation were <24% (range, 10 to 24) and <36% (range, 27 to 36), respectively. The high reproducibility (standard deviation, <0.13 log) of the overall procedure for quantification of HIV-1 RNA in plasma, including sample preparation, amplification, and detection variations, allowed reliable detection of a 0.5-log change in RNA viral load. The assay could be a useful tool for monitoring HIV-1 disease progression and antiviral treatment and can easily be adapted to the quantification of other pathogens. PMID:9650926
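The internal calibration described above can be sketched as a log-log fit across the co-amplified competitors, inverted to read off the target's starting copy number from its peak area. A minimal illustration (a least-squares line through competitor peak areas; all names and numbers are hypothetical, not from the assay):

```python
import math

def estimate_target_copies(competitor_inputs, competitor_areas, target_area):
    """Per-reaction internal calibration for competitive RT-PCR.

    Fits log10(peak area) vs log10(input copies) over the co-amplified
    competitors by least squares, then inverts the line to estimate the
    target's starting copy number from its own peak area."""
    xs = [math.log10(c) for c in competitor_inputs]
    ys = [math.log10(a) for a in competitor_areas]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    intercept = my - slope * mx
    return 10 ** ((math.log10(target_area) - intercept) / slope)

# hypothetical reaction: three competitors spanning 100..10,000 copies
copies = estimate_target_copies([100, 1000, 10000], [10, 100, 1000], 50)
```

Because the curve is built from the same tube as the target, tube-to-tube amplification differences cancel, which is the point of the multiple-competitor design.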
PCR technology for screening and quantification of genetically modified organisms (GMOs).
Holst-Jensen, Arne; Rønning, Sissel B; Løvseth, Astrid; Berdal, Knut G
2003-04-01
Although PCR technology has obvious limitations, its potentially high degree of sensitivity and specificity explains why it has been the first choice of most analytical laboratories interested in the detection of genetically modified (GM) organisms (GMOs) and derived materials. Because the products that laboratories receive for analysis are often processed and refined, the quality and quantity of the target analyte (e.g. protein or DNA) frequently challenge the sensitivity of any detection method. Among the currently available methods, PCR methods are generally accepted as the most sensitive and reliable for detection of GM-derived material in routine applications. The choice of target sequence motif is the single most important factor controlling the specificity of the PCR method. The target sequence is normally a part of the modified gene construct, for example a promoter, a terminator, a gene, or a junction between two of these elements. However, the elements may originate from wild-type organisms, they may be present in more than one GMO, and their copy number may vary from one GMO to another. They may even be combined in a similar way in more than one GMO. Thus, the choice of method should fit the purpose. Recent developments include event-specific methods, particularly useful for identification and quantification of GM content. Thresholds for labelling are now in place in many countries, including those in the European Union. The success of the labelling schemes depends upon the efficiency with which GM-derived material can be detected. We present an overview of currently available PCR methods for screening and quantification of GM-derived DNA and discuss their applicability and limitations. In addition, we discuss some of the major challenges related to determination of the limits of detection (LOD) and quantification (LOQ), and to validation of methods.
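Quantitative GMO results of the kind discussed above are commonly expressed as the ratio of event-specific target copies to taxon-specific reference-gene copies, compared against a labelling threshold. A hedged sketch (the 0.9% default reflects the EU labelling threshold the abstract alludes to; function names are illustrative):

```python
def gm_content_percent(event_copies, reference_copies):
    """Relative GM content as event-specific copies over taxon-specific
    reference-gene copies, in percent (copy-number based expression)."""
    if reference_copies <= 0:
        raise ValueError("reference gene not detected; cannot quantify")
    return 100.0 * event_copies / reference_copies

def exceeds_labelling_threshold(event_copies, reference_copies,
                                threshold_percent=0.9):
    """True if the sample is above the labelling threshold (strictly)."""
    return gm_content_percent(event_copies, reference_copies) > threshold_percent

# hypothetical result: 9 event copies against 1000 reference copies = 0.9%
pct = gm_content_percent(9, 1000)
```

Whether such a ratio is meaningful near the LOD/LOQ is precisely the validation question the abstract raises.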
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chu, T.L.; Musicki, Z.; Kohut, P.
1994-06-01
During 1989, the Nuclear Regulatory Commission (NRC) initiated an extensive program to carefully examine the potential risks during low power and shutdown operations. The program includes two parallel projects being performed by Brookhaven National Laboratory (BNL) and Sandia National Laboratories (SNL). Two plants, Surry (pressurized water reactor) and Grand Gulf (boiling water reactor), were selected as the plants to be studied. The objectives of the program are to assess the risks of severe accidents initiated during plant operational states other than full power operation and to compare the estimated core damage frequencies, important accident sequences, and other qualitative and quantitative results with those of accidents initiated during full power operation as assessed in NUREG-1150. The objective of this report is to document the approach utilized in the Surry plant study and to discuss the results obtained. A parallel report for the Grand Gulf plant was prepared by SNL. This study shows that the core-damage frequency during mid-loop operation at the Surry plant is comparable to that of power operation. We recognize that there is very large uncertainty in the human error probabilities in this study. The study identified that only a few procedures are available for mitigating accidents that may occur during shutdown; procedures written specifically for shutdown accidents would be useful. This document, Volume 2, Part 2, provides appendices A through D of this report.
NASA Astrophysics Data System (ADS)
Bau, Haim; Liu, Changchun; Killawala, Chitvan; Sadik, Mohamed; Mauk, Michael
2014-11-01
Real-time amplification and quantification of specific nucleic acid sequences plays a major role in many medical and biotechnological applications. In the case of infectious diseases, quantification of the pathogen load in patient specimens is critical to assessing disease progression, effectiveness of drug therapy, and emergence of drug resistance. Typically, nucleic acid quantification requires sophisticated and expensive instruments, such as real-time PCR machines, which are not appropriate for on-site use and for low-resource settings. We describe a simple, low-cost, reaction-diffusion based method for end-point quantification of target nucleic acids undergoing enzymatic amplification. The number of target molecules is inferred from the position of the reaction-diffusion front, analogous to reading temperature in a mercury thermometer. We model the process with the Fisher-Kolmogoroff-Petrovskii-Piscounoff (FKPP) equation and compare theoretical predictions with experimental observations. The proposed method is suitable for nucleic acid quantification at the point of care, is compatible with multiplexing and high-throughput processing, and can function instrument-free. C.L. was supported by NIH/NIAID K25AI099160; M.S. was supported by the Pennsylvania Ben Franklin Technology Development Authority; C.K. and H.B. were funded, in part, by NIH/NIAID 1R41AI104418-01A1.
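The thermometer analogy above rests on a standard FKPP result: the reaction front advances at an asymptotic speed v = 2·sqrt(D·r), where D is the diffusivity and r the local growth (amplification) rate, so a larger starting copy number shortens the amplification delay and the front sits farther along at a fixed end-point read time. A minimal sketch of that relation (the linear delay model and all parameter values are assumptions for illustration, not the authors' model):

```python
import math

def fkpp_front_position(diffusivity, growth_rate, read_time, delay=0.0):
    """Asymptotic FKPP front position x(t) ~ v * (t - delay), v = 2*sqrt(D*r).

    `delay` stands in for the lag before the front is established; in the
    end-point scheme described above, more initial target molecules mean a
    shorter delay and hence a front farther from the inlet at read time."""
    v = 2.0 * math.sqrt(diffusivity * growth_rate)
    return max(0.0, v * (read_time - delay))

# illustrative units: D = 1, r = 1, read at t = 3 with no lag
x = fkpp_front_position(1.0, 1.0, 3.0)
```

Inverting this position-to-delay mapping is what turns an end-point front measurement into a copy-number estimate.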
Collewet, Guylaine; Moussaoui, Saïd; Deligny, Cécile; Lucas, Tiphaine; Idier, Jérôme
2018-06-01
Multi-tissue partial volume estimation in MRI images is investigated from a viewpoint related to spectral unmixing as used in hyperspectral imaging. The main contribution of this paper is twofold. It first proposes a theoretical analysis of the statistical optimality conditions of the proportion estimation problem, which in the context of multi-contrast MRI data acquisition allows the imaging sequence parameters to be set appropriately. Second, an efficient proportion quantification algorithm based on the minimisation of a penalised least-squares criterion incorporating a regularity constraint on the spatial distribution of the proportions is proposed. Furthermore, the resulting developments are discussed using empirical simulations. The practical usefulness of the spectral unmixing approach for partial volume quantification in MRI is illustrated through an application to food analysis on the proving of a Danish pastry. Copyright © 2018 Elsevier Inc. All rights reserved.
Yang, Litao; Liang, Wanqi; Jiang, Lingxi; Li, Wenquan; Cao, Wei; Wilson, Zoe A; Zhang, Dabing
2008-06-04
Real-time PCR techniques are widely used for nucleic acid analysis, but one limitation of current frequently employed real-time PCR is the high cost of the labeled probe for each target molecule. We describe a real-time PCR technique employing attached universal duplex probes (AUDP), which has the advantage over current real-time PCR methods of generating fluorescence by probe hydrolysis and strand displacement. AUDP involves one set of universal duplex probes in which the 5' end of the fluorescent probe (FP) and a complementary quenching probe (QP) lie in close proximity, so that fluorescence can be quenched. The PCR primers carry an attached universal template (UT), and the FP is identical to the UT sequence. We have shown that the AUDP technique can be used for detecting multiple target DNA sequences in both simplex and duplex real-time PCR assays for gene expression analysis, genotype identification, and genetically modified organism (GMO) quantification, with sensitivity, reproducibility, and repeatability comparable to those of other real-time PCR methods. The results from gene expression analysis, genotype identification, and GMO quantification using AUDP real-time PCR assays indicate that the technique has been successfully applied to nucleic acid analysis and offers an alternative approach with high efficiency, reliability, and flexibility at low cost.
Station blackout calculations for Browns Ferry
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ott, L.J.; Weber, C.F.; Hyman, C.R.
1985-01-01
This paper presents the results of calculations performed with the ORNL SASA code suite for the Station Blackout Severe Accident Sequence at Browns Ferry. The accident is initiated by a loss of offsite power combined with failure of all onsite emergency diesel generators to start and load. The Station Blackout is assumed to persist beyond the point of battery exhaustion (at six hours); without DC power, cooling water could no longer be injected into the reactor vessel. Calculations are continued through the period of core degradation and melting, reactor vessel failure, and the subsequent containment failure. An estimate of the magnitude and timing of the concomitant fission product releases is also provided.
Shiroguchi, Katsuyuki; Jia, Tony Z.; Sims, Peter A.; Xie, X. Sunney
2012-01-01
RNA sequencing (RNA-Seq) is a powerful tool for transcriptome profiling, but is hampered by sequence-dependent bias and inaccuracy at low copy numbers intrinsic to exponential PCR amplification. We developed a simple strategy for mitigating these complications, allowing truly digital RNA-Seq. Following reverse transcription, a large set of barcode sequences is added in excess, and nearly every cDNA molecule is uniquely labeled by random attachment of barcode sequences to both ends. After PCR, we applied paired-end deep sequencing to read the two barcodes and cDNA sequences. Rather than counting the number of reads, RNA abundance is measured based on the number of unique barcode sequences observed for a given cDNA sequence. We optimized the barcodes to be unambiguously identifiable, even in the presence of multiple sequencing errors. This method allows counting with single-copy resolution despite sequence-dependent bias and PCR-amplification noise, and is analogous to digital PCR but amendable to quantifying a whole transcriptome. We demonstrated transcriptome profiling of Escherichia coli with more accurate and reproducible quantification than conventional RNA-Seq. PMID:22232676
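Counting unique barcode pairs rather than reads, as described above, can be sketched in a few lines (toy sequences; real barcodes are optimized for error tolerance, which this sketch omits):

```python
from collections import defaultdict

def count_unique_barcodes(reads):
    """Digital RNA-Seq counting: abundance per cDNA is the number of
    distinct (5' barcode, 3' barcode) pairs observed, not the read count,
    so PCR duplicates of one labeled molecule are counted once."""
    barcodes = defaultdict(set)
    for cdna, barcode5, barcode3 in reads:
        barcodes[cdna].add((barcode5, barcode3))
    return {cdna: len(pairs) for cdna, pairs in barcodes.items()}

# toy paired-end reads: (cDNA sequence, 5' barcode, 3' barcode)
reads = [
    ("GENEA", "AAT", "GGC"),
    ("GENEA", "AAT", "GGC"),  # PCR duplicate: same barcode pair, counted once
    ("GENEA", "TTG", "CCA"),
    ("GENEB", "ACA", "TGT"),
]
counts = count_unique_barcodes(reads)  # GENEA has 4 reads but 2 molecules
```

This is why the method is insensitive to amplification noise: however unevenly PCR amplifies each molecule, its barcode pair still contributes exactly one count.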
NASA Technical Reports Server (NTRS)
1972-01-01
The detailed abort sequence trees for the reference zirconium hydride (ZrH) reactor power module that have been generated for each phase of the reference Space Base program mission are presented. The trees are graphical representations of causal sequences. Each tree begins with the phase identification and the dichotomy between success and failure. The success branch shows the mission phase objective as being achieved. The failure branch is subdivided, as conditions require, into various primary initiating abort conditions.
2014-01-01
Affinity capture of DNA methylation combined with high-throughput sequencing strikes a good balance between the high cost of whole genome bisulfite sequencing and the low coverage of methylation arrays. We present BayMeth, an empirical Bayes approach that uses a fully methylated control sample to transform observed read counts into regional methylation levels. In our model, inefficient capture can readily be distinguished from low methylation levels. BayMeth improves on existing methods, allows explicit modeling of copy number variation, and offers computationally efficient analytical mean and variance estimators. BayMeth is available in the Repitools Bioconductor package. PMID:24517713
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sattison, M.B.
The Idaho National Engineering Laboratory (INEL) over the past three years has created 75 plant-specific Accident Sequence Precursor (ASP) models using the SAPHIRE suite of PRA codes. Along with the new models, the INEL has also developed a new module for SAPHIRE which is tailored specifically to the unique needs of ASP evaluations. These models and software will be the next generation of risk tools for the evaluation of accident precursors by both the U.S. Nuclear Regulatory Commission's (NRC's) Office of Nuclear Reactor Regulation (NRR) and the Office for Analysis and Evaluation of Operational Data (AEOD). This paper presents an overview of the models and software. Key characteristics include: (1) classification of the plant models according to plant response, with a unique set of event trees for each plant class; (2) plant-specific fault trees using supercomponents; (3) generation and retention of all system and sequence cutsets; (4) full flexibility in modifying logic, regenerating cutsets, and requantifying results; and (5) a user interface for streamlined evaluation of ASP events. Future plans for the ASP models are also presented.
How life changes itself: the Read-Write (RW) genome.
Shapiro, James A
2013-09-01
The genome has traditionally been treated as a Read-Only Memory (ROM) subject to change by copying errors and accidents. In this review, I propose that we need to change that perspective and understand the genome as an intricately formatted Read-Write (RW) data storage system constantly subject to cellular modifications and inscriptions. Cells operate under changing conditions and are continually modifying themselves by genome inscriptions. These inscriptions occur over three distinct time-scales (cell reproduction, multicellular development and evolutionary change) and involve a variety of different processes at each time scale (forming nucleoprotein complexes, epigenetic formatting and changes in DNA sequence structure). Research dating back to the 1930s has shown that genetic change is the result of cell-mediated processes, not simply accidents or damage to the DNA. This cell-active view of genome change applies to all scales of DNA sequence variation, from point mutations to large-scale genome rearrangements and whole genome duplications (WGDs). This conceptual change to active cell inscriptions controlling RW genome functions has profound implications for all areas of the life sciences. © 2013 Elsevier B.V. All rights reserved.
In-vessel coolability and retention of a core melt. Volume 1
DOE Office of Scientific and Technical Information (OSTI.GOV)
Theofanous, T.G.; Liu, C.; Additon, S.
1996-10-01
The efficacy of external flooding of a reactor vessel as a severe accident management strategy is assessed for an AP600-like reactor design. The overall approach is based on the Risk Oriented Accident Analysis Methodology (ROAAM), and the assessment includes consideration of bounding scenarios and sensitivity studies, as well as arbitrary parametric evaluations that allow the delineation of the failure boundaries. Quantification of the input parameters is carried out for an AP600-like design, and the results of the assessment demonstrate that lower head failure is physically unreasonable. Use of this conclusion for any specific application is subject to verifying the required reliability of the depressurization and cavity-flooding systems, and to showing the appropriateness (in relation to the database presented here, or by further testing as necessary) of the thermal insulation design and of the external surface properties of the lower head, including any applicable coatings. The AP600 is particularly favorable to in-vessel retention. Some ideas to enhance the assessment basis, as well as performance in this respect, for applications to larger and/or higher power density reactors are also provided.
In-vessel coolability and retention of a core melt. Volume 2
DOE Office of Scientific and Technical Information (OSTI.GOV)
Theofanous, T.G.; Liu, C.; Additon, S.
1996-10-01
The efficacy of external flooding of a reactor vessel as a severe accident management strategy is assessed for an AP600-like reactor design. The overall approach is based on the Risk Oriented Accident Analysis Methodology (ROAAM), and the assessment includes consideration of bounding scenarios and sensitivity studies, as well as arbitrary parametric evaluations that allow the delineation of the failure boundaries. Quantification of the input parameters is carried out for an AP600-like design, and the results of the assessment demonstrate that lower head failure is physically unreasonable. Use of this conclusion for any specific application is subject to verifying the required reliability of the depressurization and cavity-flooding systems, and to showing the appropriateness (in relation to the database presented here, or by further testing as necessary) of the thermal insulation design and of the external surface properties of the lower head, including any applicable coatings. The AP600 is particularly favorable to in-vessel retention. Some ideas to enhance the assessment basis, as well as performance in this respect, for applications to larger and/or higher power density reactors are also provided.
In vivo MR detection of fluorine-labeled human MSC using the bSSFP sequence
Ribot, Emeline J; Gaudet, Jeffrey M; Chen, Yuhua; Gilbert, Kyle M; Foster, Paula J
2014-01-01
Mesenchymal stem cells (MSC) are used to restore deteriorated cell environments. There is a need to specifically track these cells following transplantation in order to evaluate different methods of implantation, to follow their migration within the body, and to quantify their accumulation at the target. Cellular magnetic resonance imaging (MRI) using fluorine-based nanoemulsions is a great means to detect these transplanted cells in vivo because of the high specificity for fluorine detection and the capability for precise quantification. This technique, however, has low sensitivity, necessitating improvement in MR sequences. To counteract this issue, the balanced steady-state free precession (bSSFP) imaging sequence can be of great interest due to the high signal-to-noise ratio (SNR). Furthermore, it can be applied to obtain 3D images within short acquisition times. In this paper, bSSFP provided accurate quantification of samples of the perfluorocarbon Cell Sense-labeled cells in vitro. Cell Sense was internalized by human MSC (hMSC) without adverse alterations in cell viability or differentiation into adipocytes/osteocytes. The bSSFP sequence was applied in vivo to track and quantify the signals from both Cell Sense-labeled and iron-labeled hMSC after intramuscular implantation. The fluorine signal was observed to decrease faster and more significantly than the volume of iron-associated voids, which points to the advantage of quantifying the fluorine signal and the complexity of quantifying signal loss due to iron. PMID:24748787
Confetti: A Multiprotease Map of the HeLa Proteome for Comprehensive Proteomics*
Guo, Xiaofeng; Trudgian, David C.; Lemoff, Andrew; Yadavalli, Sivaramakrishna; Mirzaei, Hamid
2014-01-01
Bottom-up proteomics largely relies on tryptic peptides for protein identification and quantification. Tryptic digestion often provides limited coverage of protein sequence because of issues such as peptide length, ionization efficiency, and post-translational modification colocalization. Unfortunately, a region of interest in a protein, for example, because of proximity to an active site or the presence of important post-translational modifications, may not be covered by tryptic peptides. Detection limits, quantification accuracy, and isoform differentiation can also be improved with greater sequence coverage. Selected reaction monitoring (SRM) would also greatly benefit from being able to identify additional targetable sequences. In an attempt to improve protein sequence coverage and to target regions of proteins that do not generate useful tryptic peptides, we deployed a multiprotease strategy on the HeLa proteome. First, we used seven commercially available enzymes in single, double, and triple enzyme combinations. A total of 48 digests were performed. 5223 proteins were detected by analyzing the unfractionated cell lysate digest directly, with 42% mean sequence coverage. Additional strong-anion exchange fractionation of the most complementary digests permitted identification of over 3000 more proteins, with improved mean sequence coverage. We then constructed a web application (https://proteomics.swmed.edu/confetti) that allows the community to examine a target protein or protein isoform in order to discover the enzyme or combination of enzymes that would yield peptides spanning a certain region of interest in the sequence. Finally, we examined the use of nontryptic digests for SRM. From our strong-anion exchange fractionation data, we were able to identify three or more proteotypic SRM candidates within a single digest for 6056 genes. 
Surprisingly, in 25% of these cases the digest producing the most observable proteotypic peptides was neither trypsin nor Lys-C. SRM analysis of Asp-N versus tryptic peptides for eight proteins determined that Asp-N yielded higher signal in five of eight cases. PMID:24696503
Turner, Clare E; Russell, Bruce R; Gant, Nicholas
2015-11-01
Magnetic resonance spectroscopy (MRS) is an analytical procedure that can be used to non-invasively measure the concentration of a range of neural metabolites. Creatine is an important neurometabolite, and dietary supplementation offers therapeutic potential for neurological disorders with dysfunctional energetic processes. Neural creatine concentrations can be probed using proton MRS and quantified using a range of software packages based on different analytical methods. This experiment examines the differences in quantification performance of two commonly used analysis packages following a creatine supplementation strategy with potential therapeutic application. Human participants followed a seven-day dietary supplementation regime in a placebo-controlled, cross-over design interspersed with a five-week wash-out period. Spectroscopy data were acquired the day immediately following supplementation and analyzed with two commonly used software packages that employ vastly different quantification methods. Results demonstrate that neural creatine concentration was augmented following creatine supplementation when analyzed using the peak-fitting method of quantification (105.9 ± 10.1%). In contrast, no change in neural creatine levels was detected with supplementation when analysis was conducted using the basis-spectrum method of quantification (102.6 ± 8.6%). Results suggest that software packages that employ the peak-fitting procedure for spectral quantification are possibly more sensitive to subtle changes in neural creatine concentrations. The relative simplicity of the spectroscopy sequence and the data analysis procedure suggests that peak-fitting procedures may be the most effective means of metabolite quantification when detection of subtle alterations in neural metabolites is necessary. The straightforward technique can be used on a clinical magnetic resonance imaging system. Copyright © 2015 Elsevier Inc. All rights reserved.
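As a rough illustration of the peak-fitting approach discussed above (not the actual software packages compared in the study), a single resonance can be modeled as a Lorentzian line and its analytic area used as the concentration estimate. The synthetic creatine peak near 3.03 ppm and all parameter values below are invented.

```python
import numpy as np
from scipy.optimize import curve_fit

def lorentzian(ppm, amp, center, width):
    """Lorentzian line shape; its analytic integral is pi * amp * width."""
    return amp * width**2 / ((ppm - center)**2 + width**2)

# Invented, noiseless synthetic creatine peak near 3.03 ppm.
ppm = np.linspace(2.8, 3.3, 500)
spectrum = lorentzian(ppm, amp=10.0, center=3.03, width=0.02)

# Fit the single peak and report its area as the concentration surrogate.
popt, _ = curve_fit(lorentzian, ppm, spectrum, p0=[8.0, 3.02, 0.03])
area = np.pi * popt[0] * popt[2]
```

A basis-spectrum method would instead fit a weighted sum of full metabolite model spectra; this sketch shows only the peak-fitting idea.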
DOE Office of Scientific and Technical Information (OSTI.GOV)
Daling, P.M.; Marler, J.E.; Vo, T.V.
This study evaluates the values (benefits) and impacts (costs) associated with potential resolutions to Generic Issue 143, "Availability of HVAC and Chilled Water Systems." The study identifies vulnerabilities related to failures of HVAC, chilled water, and room cooling systems; develops estimates of room heatup rates and safety-related equipment vulnerabilities following losses of HVAC/room cooler systems; develops estimates of the core damage frequencies and public risks associated with failures of these systems; develops three proposed resolution strategies to this generic issue; and performs a value/impact analysis of the proposed resolutions. Existing probabilistic risk assessments for four representative plants, including one plant from each vendor, form the basis for the core damage frequency and public risk calculations. Both internal and external events were considered. It was concluded that all three proposed resolution strategies exceed the $1,000/person-rem cost-effectiveness ratio. Additional evaluations were performed to develop "generic" insights on potential design-related and configuration-related vulnerabilities and potential high-frequency (~1E-04/RY) accident sequences that involve failures of HVAC/room cooling functions. It was concluded that, although high-frequency accident sequences may exist at some plants, these high-frequency sequences are plant-specific in nature or have been resolved through hardware and/or operational changes. The plant-specific Individual Plant Examinations are an effective vehicle for identification and resolution of these plant-specific anomalies and hardware configurations.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schultz, R.R.
1982-01-01
Operating plant transients are of great interest for many reasons, not the least of which is the potential for a mild transient to degenerate into a severe transient yielding core damage. Using the Browns Ferry (BF) Unit-1 plant as a basis of study, the station blackout sequence was investigated by the Severe Accident Sequence Analysis (SASA) Program in support of the Nuclear Regulatory Commission's Unresolved Safety Issue A-44: Station Blackout. A station blackout transient occurs when the plant's AC power from a commercial power grid is lost and cannot be restored by the diesel generators. Under normal operating conditions, if a loss of offsite power (LOSP) occurs (i.e., a complete severance of the BF plants from the Tennessee Valley Authority (TVA) power grid), the eight diesel generators at the three BF units would quickly start and power the emergency AC buses. Of the eight diesel generators, only six are needed to safely shut down all three units. Examination of BF-specific data shows that LOSP frequency is low at Unit 1. The station blackout frequency is even lower (5.7 x 10^-4 events per year) and hinges on whether the diesel generators start. The frequency of diesel generator failure is dictated in large measure by the emergency equipment cooling water (EECW) system that cools the diesel generators.
Improving risk management: from lame excuses to principled practice.
Paté-Cornell, Elisabeth; Cox, Louis Anthony
2014-07-01
The three classic pillars of risk analysis are risk assessment (how big is the risk and how sure can we be?), risk management (what shall we do about it?), and risk communication (what shall we say about it, to whom, when, and how?). We propose two complements as important parts of these three bases: risk attribution (who or what addressable conditions actually caused an accident or loss?) and learning from experience about risk reduction (what works, and how well?). Failures in complex systems usually evoke blame, often with insufficient attention to root causes of failure, including some aspects of the situation, design decisions, or social norms and culture. Focusing on blame, however, can inhibit effective learning, instead eliciting excuses to deflect attention and perceived culpability. Productive understanding of what went wrong, and how to do better, thus requires moving past recrimination and excuses. This article identifies common blame-shifting "lame excuses" for poor risk management. These generally contribute little to effective improvements and may leave real risks and preventable causes unaddressed. We propose principles from risk and decision sciences and organizational design to improve results. These start with organizational leadership. More specifically, they include: deliberate testing and learning-especially from near-misses and accident precursors; careful causal analysis of accidents; risk quantification; candid expression of uncertainties about costs and benefits of risk-reduction options; optimization of tradeoffs between gathering additional information and immediate action; promotion of safety culture; and mindful allocation of people, responsibilities, and resources to reduce risks. We propose that these principles provide sound foundations for improving successful risk management. © 2014 Society for Risk Analysis.
The Lagerlunda collision and the introduction of color vision testing.
Mollon, J D; Cavonius, L R
2012-01-01
In histories of vision testing, the origins of occupational screening for color blindness are often traced to a fatal railroad accident that occurred in Sweden on the night of 14-15 November 1875. The scene of the accident was the estate of Baron Lagerfelt in Östergötland, but the critical events were played out at Linköping (the normal passing place for the northbound and southbound expresses) and at Bankeberg (a small station to which the passing place was reassigned at a few minutes' notice). First to arrive at Bankeberg, the northbound express slowed almost to a halt, but then inexplicably accelerated forwards towards the Lagerlunda estate, despite a sequence of signals from the stationmaster, Uno Björkelund, and a lineman, Oskar Johansson. Soon after the accident, the ophthalmologist Frithiof Holmgren suggested that the engineer of the northbound express, Andersson, or his oiler, Larsson, had been color blind. Neither survived to be tested. Using the records of the subsequent trial and other archival materials, we have re-examined the role of color blindness in the Lagerlunda incident and conclude that the accident cannot be attributed to color blindness alone. Yet the accident undoubtedly had a central role in the introduction of color vision testing by European and North American railroads. To persuade the railroad management to introduce universal screening of employees for color blindness, Holmgren used a dramatic coup de theatre and some unashamed subterfuge. Copyright © 2012 Elsevier Inc. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Paladino, D.; Guentay, S.; Andreani, M.
2012-07-01
During a postulated severe accident with core degradation, hydrogen would form in the reactor pressure vessel, mainly through the high-temperature zirconium-steam reaction, and would flow together with steam into the containment, where it would mix with the containment atmosphere (steam-air). Hydrogen transport into the containment is a safety concern because it can lead to explosive mixtures through the associated phenomena of condensation, mixing, and stratification. The ERCOSAM and SAMARA projects, co-financed by the European Union and Russia, include various experiments addressing accident scenarios scaled down from existing plant calculations to different thermal-hydraulics facilities (TOSQAN, MISTRA, PANDA, SPOT). The test sequences aim to investigate hydrogen concentration build-up and stratification during a postulated accident and the effect of the activation of Severe Accident Management systems (SAMs), e.g., sprays, coolers, and Passive Auto-catalytic Recombiners (PARs). Analytical activities, performed by the project participants, are an essential component of the projects, as they aim to improve and validate various computational methods. They accompany the projects in their various phases: plant calculations; scaling to a generic containment and to the different facilities; and planning, pre-test, and post-test simulations. Code benchmark activities on the basis of the conceptual near-full-scale HYMIX facility will finally provide a further opportunity to evaluate the applicability of the various methods to the study of scaling issues.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kohout, E.F.; Folga, S.; Mueller, C.
1996-03-01
This paper describes the Waste Management Facility Accident Analysis (WASTE_ACC) software, which was developed at Argonne National Laboratory (ANL) to support the US Department of Energy's (DOE's) Waste Management (WM) Programmatic Environmental Impact Statement (PEIS). WASTE_ACC is a decision support and database system that is compatible with Microsoft® Windows™. It assesses potential atmospheric releases from accidents at waste management facilities. The software provides the user with an easy-to-use tool to determine the risk-dominant accident sequences for the many possible combinations of process technologies, waste and facility types, and alternative cases described in the WM PEIS. In addition, its structure will allow additional alternative cases and assumptions to be tested as part of the future DOE programmatic decision-making process. The WASTE_ACC system demonstrates one approach to performing a generic, systemwide evaluation of accident risks at waste management facilities. The advantages of WASTE_ACC are threefold. First, the software gets waste volume and radiological profile data that were used to perform other WM PEIS-related analyses directly from the WASTE_MGMT system. Second, the system allows for a consistent analysis across all sites and waste streams, which enables decision makers to understand more fully the trade-offs among various policy options and scenarios. Third, the system is easy to operate; even complex scenario runs are completed within minutes.
Kislinger, Thomas; Humeny, Andreas; Peich, Carlo C; Zhang, Xiaohong; Niwa, Toshimitsu; Pischetsrieder, Monika; Becker, Cord-Michael
2003-01-01
The nonenzymatic glycation of proteins by reducing sugars, also known as the Maillard reaction, has received increasing recognition from nutritional science and medical research. In this study, we applied matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-TOF-MS) to perform relative and simultaneous quantification of the Amadori product, which is an early glycation product, and of N(epsilon)-(carboxymethyl)lysine and imidazolone A, two important advanced glycation end products. To this end, native lysozyme was incubated with d-glucose for increasing periods of time (1, 4, 8, and 16 weeks) in phosphate-buffered saline pH 7.8 at 50 degrees C. After enzymatic digestion with endoproteinase Glu-C, the N-terminal peptide fragment (m/z 838; amino acid sequence KVFGRCE) and the C-terminal peptide fragment (m/z 1202; amino acid sequence VQAWIRGCRL) were used for relative quantification of the three Maillard products. Amadori product, N(epsilon)-(carboxymethyl)lysine, and imidazolone A were the main glycation products formed under these conditions. Their formation was dependent on glucose concentration and reaction time. The kinetics were similar to those obtained by competitive ELISA, an established method for quantification of N(epsilon)-(carboxymethyl)lysine and imidazolone A. Inhibition experiments showed that coincubation with N(alpha)-acetylarginine suppressed formation of imidazolone A but not of the Amadori product or N(epsilon)-(carboxymethyl)lysine. The presence of N(alpha)-acetyllysine resulted in the inhibition of lysine modifications but in higher concentrations of imidazolone A. o-Phenylenediamine decreased the yield of the Amadori product and completely inhibited the formation of N(epsilon)-(carboxymethyl)lysine and imidazolone A. MALDI-TOF-MS proved to be a new analytical tool for the simultaneous, relative quantification of specific products of the Maillard reaction. 
For the first time, kinetic data of defined products at specific sites of a glycated protein could be measured. This characterizes MALDI-TOF-MS as a valuable method for monitoring the Maillard reaction in the course of food processing.
Potential effects of the fire protection system sprays at Browns Ferry on fission product transport
DOE Office of Scientific and Technical Information (OSTI.GOV)
Niemczyk, S.J.
1983-01-01
The fire protection system (FPS) sprays within any nuclear plant are not intended to mitigate radioactive releases to the environment resulting from severe core-damage accidents. However, it has been shown here that during certain postulated severe accident scenarios at the Browns Ferry Nuclear Plant, the functioning of FPS sprays could have a significant impact on the radioactive releases. Thus the effects of those sprays need to be taken into account for realistic estimation of source terms for some accident scenarios. The effects would include direct ones, such as cooling of the reactor building atmosphere and scrubbing of radioactivity from it, as well as indirect effects, such as an altered likelihood of hydrogen burning and flooding of various safety-related pumps in the reactor building basement. Thus some of the impacts of the sprays would be beneficial with respect to mitigating releases to the environment, but some others might not be. The effects of the FPS would be very scenario dependent, with a wide range of potential effects often existing for a given accident sequence. Any generalization of the specific results presented here for Browns Ferry to other nuclear plants must be done cautiously, as it appears from a preliminary investigation that the relevant physical and operational characteristics of FPS spray systems differ widely among even otherwise apparently similar plants. Likewise the standby gas treatment systems, which substantially impact the effects of the FPS, differ significantly among plants. More work for both Mark I plants and other plants, BWRs and PWRs alike, is indicated so that the potential effects of FPS spray systems during severe accidents can be at least roughly estimated for more realistic accident analyses.
Flow cytometry for enrichment and titration in massively parallel DNA sequencing
Sandberg, Julia; Ståhl, Patrik L.; Ahmadian, Afshin; Bjursell, Magnus K.; Lundeberg, Joakim
2009-01-01
Massively parallel DNA sequencing is revolutionizing genomics research throughout the life sciences. However, the reagent costs and labor requirements in current sequencing protocols are still substantial, although improvements are continuously being made. Here, we demonstrate an effective alternative to existing sample titration protocols for the Roche/454 system using Fluorescence Activated Cell Sorting (FACS) technology to determine the optimal DNA-to-bead ratio prior to large-scale sequencing. Our method, which eliminates the need for the costly pilot sequencing of samples during titration, is capable of rapidly providing accurate DNA-to-bead ratios that are not biased by the quantification and sedimentation steps included in current protocols. Moreover, we demonstrate that FACS sorting can be readily used to highly enrich fractions of beads carrying template DNA, with near total elimination of empty beads and no downstream sacrifice of DNA sequencing quality. Automated enrichment by FACS is a simple approach to obtain pure samples for bead-based sequencing systems, and offers an efficient, low-cost alternative to current enrichment protocols. PMID:19304748
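A common way to reason about DNA-to-bead titration (offered here as an illustrative assumption, not the protocol described in the paper) is a Poisson loading model: if templates attach to beads at random with mean lambda per bead, the fraction of template-positive beads measured by FACS is p = 1 - exp(-lambda), which can be inverted to recover lambda and rescale the DNA input.

```python
import math

def dna_to_bead_ratio(fraction_positive):
    """Invert the Poisson loading model p_pos = 1 - exp(-lam) to recover the
    mean number of templates per bead from a measured positive fraction."""
    return -math.log(1.0 - fraction_positive)

def input_scaling(fraction_positive, target_ratio):
    """Factor by which to scale the DNA input to reach a target DNA-to-bead
    ratio, assuming the ratio is proportional to input DNA mass."""
    return target_ratio / dna_to_bead_ratio(fraction_positive)
```

For example, a measured positive fraction of about 39% corresponds to roughly 0.5 templates per bead under this model.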
Non-ECG-gated unenhanced MRA of the carotids: optimization and clinical feasibility.
Raoult, H; Gauvrit, J Y; Schmitt, P; Le Couls, V; Bannier, E
2013-11-01
To optimise and assess the clinical feasibility of a carotid non-ECG-gated unenhanced MRA sequence. Sixteen healthy volunteers and 11 patients presenting with internal carotid artery (ICA) disease underwent large field-of-view balanced steady-state free precession (bSSFP) unenhanced MRA at 3T. Sampling schemes acquiring the k-space centre either early (kCE) or late (kCL) in the acquisition window were evaluated. Signal and image quality was scored in comparison to ECG-gated kCE unenhanced MRA and TOF. For patients, computed tomography angiography was used as the reference. In volunteers, kCE sampling yielded higher image quality than kCL and TOF, with fewer flow artefacts and improved signal homogeneity. kCE unenhanced MRA image quality was higher without ECG-gating. Arterial signal and artery/vein contrast were higher with both bSSFP sampling schemes than with TOF. The kCE sequence allowed correct quantification of ten significant stenoses, and it facilitated the identification of an infrapetrous dysplasia, which was outside of the TOF imaging coverage. Non-ECG-gated bSSFP carotid imaging offers high-quality images and is a promising sequence for carotid disease diagnosis in a short acquisition time with high spatial resolution and a large field of view. • Non-ECG-gated unenhanced bSSFP MRA offers high-quality imaging of the carotid arteries. • Sequences using early acquisition of the k-space centre achieve higher image quality. • Non-ECG-gated unenhanced bSSFP MRA allows quantification of significant carotid stenosis. • Short MR acquisition times and ungated sequences are helpful in clinical practice. • High 3D spatial resolution and a large field of view improve diagnostic performance.
Yao, Shaolun; Jiang, Chuan; Huang, Ziyue; Torres-Jerez, Ivone; Chang, Junil; Zhang, Heng; Udvardi, Michael; Liu, Renyi; Verdier, Jerome
2016-10-01
Legume research and cultivar development are important for sustainable food production, especially of high-protein seed. Thanks to the development of deep-sequencing technologies, crop species have been brought to the forefront, even without completed genome sequences. Black-eyed pea (Vigna unguiculata) is a legume species widely grown in semi-arid regions, which has high potential to provide stable seed protein production in a broad range of environments, including drought conditions. The black-eyed pea reference genotype has been used to generate a gene expression atlas of the major plant tissues (i.e. leaf, root, stem, flower, pod and seed), with a developmental time series for pods and seeds. From these various organs, 27 cDNA libraries were generated and sequenced, resulting in more than one billion reads. Following filtering, these reads were de novo assembled into 36 529 transcript sequences that were annotated and quantified across the different tissues. A set of 24 866 unique transcript sequences, called Unigenes, was identified. All the information related to transcript identification, annotation and quantification was stored in a gene expression atlas webserver (http://vugea.noble.org), providing a user-friendly interface and the necessary tools to analyse transcript expression in black-eyed pea organs and to compare data with other legume species. Using this gene expression atlas, we inferred details of molecular processes that are active during seed development, and identified key putative regulators of seed maturation. Additionally, we found evidence for conservation of regulatory mechanisms involving miRNA in plant tissues subjected to drought and in seeds undergoing desiccation. © 2016 The Authors. The Plant Journal published by Society for Experimental Biology and John Wiley & Sons Ltd.
Paté-Cornell, M E; Lakats, L M; Murphy, D M; Gaba, D M
1997-08-01
The risk of death or brain damage to anesthesia patients is relatively low, particularly for healthy patients in modern hospitals. When an accident does occur, its cause is usually an error made by the anesthesiologist, either by triggering the accident sequence or by failing to take timely corrective measures. This paper presents a pilot study which explores the feasibility of extending probabilistic risk analysis (PRA) of anesthesia accidents to assess the effects of human and management components on patient risk. We first develop a classic PRA model for the patient risk per operation. We then link the probabilities of the different accident types to their root causes using a probabilistic analysis of the performance shaping factors. These factors are described here as the "state of the anesthesiologist," characterized both in terms of alertness and competence. We then analyze the effects of different management factors that affect the state of the anesthesiologist and compute the risk reduction benefits of several risk management policies. Our data sources include the published version of the Australian Incident Monitoring Study as well as expert opinions. We conclude that patient risk could be reduced substantially by closer supervision of residents, the use of anesthesia simulators both in training and for periodic recertification, and regular medical examinations for all anesthesiologists.
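The linkage described above, from performance shaping factors to accident probability, can be sketched as a state-weighted sum over conditional probabilities. All state labels and probability values below are invented for illustration and are not the paper's estimates.

```python
def accident_probability(state_probs, cond_probs):
    """Per-operation accident probability, marginalized over the
    anesthesiologist's state: P(acc) = sum_s P(state=s) * P(acc | state=s)."""
    assert abs(sum(state_probs.values()) - 1.0) < 1e-9  # states must partition
    return sum(state_probs[s] * cond_probs[s] for s in state_probs)

# Invented numbers: a management policy would shift state_probs toward "alert".
p = accident_probability(
    state_probs={"alert": 0.90, "fatigued": 0.09, "impaired": 0.01},
    cond_probs={"alert": 1e-6, "fatigued": 1e-5, "impaired": 1e-4},
)
```

Comparing `p` before and after a policy-induced shift in `state_probs` is the kind of risk-reduction calculation the abstract describes.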
[Analysis of free foetal DNA in maternal plasma using STR loci].
Vodicka, R; Vrtel, R; Procházka, M; Santavá, A; Dusek, L; Vrbická, D; Singh, R; Krejciríková, E; Schneiderová, E; Santavý, J
2006-01-01
Differentiating maternal and foetal genotypes in the plasma of pregnant women is generally accomplished with real-time PCR systems, in which specific probes are used to distinguish the particular genotypes; most often, gonosomal sequences are used to recognise a male foetus. This work describes possibilities for free foetal DNA detection and quantification using STR loci. Artificial genotype mixtures ranging from 0.2% to 100%, simulating maternal and paternal genotypes, and 27 DNA samples from pregnant women at different stages of pregnancy were used for DNA quantification and detection. The foetal genotype was confirmed by genotyping the biological father. Detection was performed on STR loci from the Down syndrome (DS)-critical region of chromosome 21 by innovated (I) QF PCR, which allows even very rare DNA mosaics to be revealed and quantified. STR quantification was assessed in the artificial genotype mixtures, and the discriminability of particular genotypes was on the level of a few percent. Foetal DNA was detected in 74% of the tested samples. The application of IQF PCR to quantification and differentiation between maternal and foetal genotypes at STR loci could be important in non-invasive prenatal diagnostics as another possible marker for DS risk assessment.
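At an informative STR locus, where the foetus carries a paternal allele absent from the mother, the foetal DNA fraction is often approximated by doubling the relative area of the unique paternal peak (doubled because the foetus also contributes a maternally matching allele that is hidden under the maternal peaks). A hedged sketch with invented peak areas, not data from this study:

```python
def foetal_fraction(paternal_peak_area, total_peak_area):
    """Approximate foetal DNA fraction at an informative STR locus.
    The foetus-specific paternal allele represents only half of the
    foetal contribution, hence the factor of two."""
    return 2.0 * paternal_peak_area / total_peak_area

# Invented areas: a 50-unit paternal peak out of 2000 total -> 5% foetal DNA.
ff = foetal_fraction(paternal_peak_area=50.0, total_peak_area=2000.0)
```

Averaging this estimate over several informative loci would reduce the influence of stutter and amplification bias at any single STR.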
Phylogenetic Quantification of Intra-tumour Heterogeneity
Schwarz, Roland F.; Trinh, Anne; Sipos, Botond; Brenton, James D.; Goldman, Nick; Markowetz, Florian
2014-01-01
Intra-tumour genetic heterogeneity is the result of ongoing evolutionary change within each cancer. The expansion of genetically distinct sub-clonal populations may explain the emergence of drug resistance, and if so, would have prognostic and predictive utility. However, methods for objectively quantifying tumour heterogeneity have been missing and are particularly difficult to establish in cancers where predominant copy number variation prevents accurate phylogenetic reconstruction owing to horizontal dependencies caused by long and cascading genomic rearrangements. To address these challenges, we present MEDICC, a method for phylogenetic reconstruction and heterogeneity quantification based on a Minimum Event Distance for Intra-tumour Copy-number Comparisons. Using a transducer-based pairwise comparison function, we determine optimal phasing of major and minor alleles, as well as evolutionary distances between samples, and are able to reconstruct ancestral genomes. Rigorous simulations and an extensive clinical study show the power of our method, which outperforms state-of-the-art competitors in reconstruction accuracy, and additionally allows unbiased numerical quantification of tumour heterogeneity. Accurate quantification and evolutionary inference are essential to understand the functional consequences of tumour heterogeneity. The MEDICC algorithms are independent of the experimental techniques used and are applicable to both next-generation sequencing and array CGH data. PMID:24743184
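The minimum-event idea can be illustrated with a deliberately simplified distance: counting the fewest segmental +1/-1 copy-number events needed to turn one integer profile into another, ignoring the zero-copy (loss) constraints, allele phasing, and tree reconstruction that the full MEDICC model handles. Under those simplifying assumptions the distance reduces to half the total variation of the zero-padded difference profile.

```python
def min_event_distance(profile_a, profile_b):
    """Fewest segmental +1/-1 copy-number events turning profile_a into
    profile_b, where each event amplifies or deletes one contiguous run
    of segments by one copy (simplified: no zero-copy constraint)."""
    d = [b - a for a, b in zip(profile_a, profile_b)]
    padded = [0] + d + [0]
    total_variation = sum(abs(padded[i + 1] - padded[i])
                          for i in range(len(padded) - 1))
    return total_variation // 2

# One +1 event over the first two segments, one -1 event on the third.
events = min_event_distance([2, 2, 2, 2], [3, 3, 1, 2])
```

Such pairwise distances between tumour samples are the raw material for both phylogenetic reconstruction and numerical heterogeneity scores.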
Quantification of DNA using the luminescent oxygen channeling assay.
Patel, R; Pollner, R; de Keczer, S; Pease, J; Pirio, M; DeChene, N; Dafforn, A; Rose, S
2000-09-01
Simplified and cost-effective methods for the detection and quantification of nucleic acid targets are still a challenge in molecular diagnostics. Luminescent oxygen channeling assay (LOCI(TM)) latex particles can be conjugated to synthetic oligodeoxynucleotides and hybridized, via linking probes, to different DNA targets. These oligomer-conjugated LOCI particles survive thermocycling in a PCR reaction and allow quantified detection of DNA targets in both real-time and endpoint formats. The endpoint DNA quantification format utilized two sensitizer bead types that are sensitive to separate illumination wavelengths. These two bead types were uniquely annealed to target or control amplicons, and separate illuminations generated time-resolved chemiluminescence, which distinguished the two amplicon types. In the endpoint method, ratios of the two signals allowed determination of the target DNA concentration over a three-log range. The real-time format allowed quantification of the DNA target over a six-log range with a linear relationship between threshold cycle and log of the number of DNA targets. This is the first report of the use of an oligomer-labeled latex particle assay capable of producing DNA quantification and sequence-specific chemiluminescent signals in a homogeneous format. It is also the first report of the generation of two signals from a LOCI assay. The methods described here have been shown to be easily adaptable to new DNA targets because of the generic nature of the oligomer-labeled LOCI particles.
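The reported linear relationship between threshold cycle and the log of the target copy number is the basis of standard-curve quantification in any real-time format. A generic sketch (the dilution series and slope below are invented, not the paper's data):

```python
import numpy as np

# Invented dilution series: Ct drops ~3.3 cycles per decade of target copies,
# the slope expected for ~100% amplification efficiency.
copies = np.array([1e2, 1e3, 1e4, 1e5, 1e6, 1e7])
ct = np.array([33.2, 29.9, 26.6, 23.3, 20.0, 16.7])

# Fit Ct = slope * log10(copies) + intercept.
slope, intercept = np.polyfit(np.log10(copies), ct, 1)

def quantify(ct_unknown):
    """Read an unknown's copy number off the fitted standard curve."""
    return 10 ** ((ct_unknown - intercept) / slope)
```

An unknown sample crossing threshold at cycle 26.6 would be read back as roughly 1e4 copies on this invented curve.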
Wang, Xiaomin; Zhang, Xiaojing; Ma, Lin; Li, Shengli
2018-06-20
Quantification of hepatic fat and iron content is important for early detection and monitoring of nonalcoholic fatty liver disease (NAFLD) patients. This study evaluated the quantification efficiency of the hepatic proton density fat fraction (PDFF) by MRI using NAFLD rabbits. R2* was also measured to investigate whether it correlates with fat levels in NAFLD. A NAFLD rabbit model was successfully established with a high-fat, high-cholesterol diet. Rabbits underwent MRI examination for fat and iron analyses, which were compared with liver histological findings. MR examinations were performed on a 3.0T MR system using a multi-echo 3D gradient recalled echo (GRE) sequence. MRI-PDFF showed significant differences between different steatosis grades, with medians of 3.72% (normal), 5.43% (mild), 9.11% (moderate) and 11.17% (severe), whereas this was not observed for R2*. A close correlation between MRI-PDFF and histological steatosis was observed (r=0.78, P=0.000). Hepatic iron deposits were not found in any rabbit. There was no correlation between R2* and either liver MRI-PDFF or histological steatosis. Measuring MRI-PDFF and R2* simultaneously by MR provides promising quantification of steatosis and iron. The rabbit NAFLD model confirmed the accuracy of MRI-PDFF for liver fat quantification. R2* measurement and the relationship between fat and iron in the NAFLD liver need further experimental investigation.
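For reference, PDFF is conventionally the fat signal expressed as a fraction of the total fat-plus-water signal, and R2* is obtained from a mono-exponential fit to the multi-echo GRE magnitudes. A generic sketch of both computations (not the vendor implementation used in the study; echo times and signals would come from the multi-echo acquisition):

```python
import numpy as np

def pdff_percent(fat_signal, water_signal):
    """Proton density fat fraction: fat as a percentage of fat + water signal."""
    return 100.0 * fat_signal / (fat_signal + water_signal)

def r2star(te_ms, magnitudes):
    """R2* (1/ms) from multi-echo GRE magnitudes via a log-linear
    least-squares fit to S(TE) = S0 * exp(-R2* * TE)."""
    slope, _intercept = np.polyfit(np.asarray(te_ms), np.log(magnitudes), 1)
    return -slope
```

In practice both quantities are computed per voxel (with fat-water separation handling the multi-peak fat spectrum), producing the PDFF and R2* maps the abstract refers to.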
Kellman, Peter; Hansen, Michael S; Nielles-Vallespin, Sonia; Nickander, Jannike; Themudo, Raquel; Ugander, Martin; Xue, Hui
2017-04-07
Quantification of myocardial blood flow requires knowledge of the amount of contrast agent in the myocardial tissue and the arterial input function (AIF) driving the delivery of this contrast agent. Accurate quantification is challenged by the lack of linearity between the measured signal and contrast agent concentration. This work characterizes sources of non-linearity and presents a systematic approach to accurate measurements of contrast agent concentration in both blood and myocardium. A dual sequence approach with separate pulse sequences for AIF and myocardial tissue allowed separate optimization of parameters for blood and myocardium. A systems approach to the overall design was taken to achieve linearity between signal and contrast agent concentration. Conversion of signal intensity values to contrast agent concentration was achieved through a combination of surface coil sensitivity correction, Bloch simulation based look-up table correction, and in the case of the AIF measurement, correction of T2* losses. Validation of signal correction was performed in phantoms, and values for peak AIF concentration and myocardial flow are provided for 29 normal subjects for rest and adenosine stress. For phantoms, the measured fits were within 5% for both AIF and myocardium. In healthy volunteers the peak [Gd] was 3.5 ± 1.2 for stress and 4.4 ± 1.2 mmol/L for rest. The T2* in the left ventricle blood pool at peak AIF was approximately 10 ms. The peak-to-valley ratio was 5.6 for the raw signal intensities without correction, and was 8.3 for the look-up-table (LUT) corrected AIF which represents approximately 48% correction. Without T2* correction the myocardial blood flow estimates are overestimated by approximately 10%. The signal-to-noise ratio of the myocardial signal at peak enhancement (1.5 T) was 17.7 ± 6.6 at stress and the peak [Gd] was 0.49 ± 0.15 mmol/L. 
The estimated perfusion flow was 3.9 ± 0.38 and 1.03 ± 0.19 ml/min/g using the BTEX model, and 3.4 ± 0.39 and 0.95 ± 0.16 ml/min/g using a Fermi model, for stress and rest, respectively. A dual sequence for myocardial perfusion cardiovascular magnetic resonance and AIF measurement has been optimized for quantification of myocardial blood flow. A validation in phantoms was performed to confirm that the signal conversion to gadolinium concentration was linear. The proposed sequence was integrated with a fully automatic in-line solution for pixel-wise mapping of myocardial blood flow and evaluated in adenosine stress and rest studies on N = 29 normal healthy subjects. Reliable perfusion mapping was demonstrated, with low variability in the resulting estimates.
PuLSE: Quality control and quantification of peptide sequences explored by phage display libraries.
Shave, Steven; Mann, Stefan; Koszela, Joanna; Kerr, Alastair; Auer, Manfred
2018-01-01
The design of highly diverse phage display libraries is based on the assumption that DNA bases are incorporated at similar rates within the randomized sequence. As library complexity increases and the expected copy numbers of unique sequences decrease, the exploration of library space becomes sparser and the presence of truly random sequences becomes critical. We present the program PuLSE (Phage Library Sequence Evaluation) as a tool for assessing the randomness, and therefore the diversity, of phage display libraries. PuLSE runs on a collection of sequence reads in the fastq file format and generates tables profiling the library in terms of unique DNA sequence counts and positions, translated peptide sequences, and normalized 'expected' occurrences derived from base and residue codon frequencies. The output allows at-a-glance quantitative quality control of a phage library in terms of sequence coverage at both the DNA base and translated protein residue level, which has been missing from existing toolsets and the literature. The open source program PuLSE is available in two formats: a C++ source code package for compilation and integration into existing bioinformatics pipelines, and precompiled binaries for ease of use.
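The notion of 'expected' occurrences can be illustrated with a minimal sketch (not the actual PuLSE implementation; function names are hypothetical): from the per-position base frequencies observed in the randomized region, the expected count of any sequence under the independence assumption is the product of its per-position frequencies times the number of reads.

```python
from collections import Counter

def base_frequencies(reads):
    """Per-position base frequencies across the randomized region."""
    counts = [Counter() for _ in range(len(reads[0]))]
    for r in reads:
        for i, b in enumerate(r):
            counts[i][b] += 1
    n = len(reads)
    return [{b: c[b] / n for b in "ACGT"} for c in counts]

def expected_count(seq, freqs, n_reads):
    """Expected occurrences of `seq` assuming independent positions,
    for comparison against the observed unique-sequence count."""
    p = 1.0
    for b, f in zip(seq, freqs):
        p *= f[b]
    return p * n_reads
```

Large deviations between observed counts and such expectations flag positions where base incorporation was biased.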
Lowering the quantification limit of the QubitTM RNA HS assay using RNA spike-in.
Li, Xin; Ben-Dov, Iddo Z; Mauro, Maurizio; Williams, Zev
2015-05-06
RNA quantification is often a prerequisite for most RNA analyses such as RNA sequencing. However, the relatively low sensitivity and large sample consumption of traditional RNA quantification methods such as UV spectrophotometry, and even of the much more sensitive fluorescence-based RNA quantification assays such as the Qubit™ RNA HS Assay, are often inadequate for measuring minute levels of RNA isolated from limited cell and tissue samples and biofluids. Thus, there is a pressing need for a more sensitive method to reliably and robustly detect trace levels of RNA without interference from DNA. To improve the quantification limit of the Qubit™ RNA HS Assay, we spiked in a known quantity of RNA to achieve the minimum reading required by the assay. Samples containing trace amounts of RNA were then added to the spike-in and measured as a reading increase over the RNA spike-in baseline. We determined the accuracy and precision of reading increases between 1 and 20 pg/μL, as well as RNA specificity in this range, and compared them to those of RiboGreen®, another sensitive fluorescence-based RNA quantification assay. We then applied the Qubit™ assay with RNA spike-in to quantify plasma RNA samples. The RNA spike-in improved the quantification limit of the Qubit™ RNA HS Assay 5-fold, from 25 pg/μL down to 5 pg/μL, while maintaining high specificity to RNA. This enabled quantification of RNA with an original concentration as low as 55.6 pg/μL, compared to 250 pg/μL for the standard assay, and decreased sample consumption from 5 to 1 ng. Plasma RNA samples that were not measurable by the Qubit™ RNA HS Assay were measurable by our modified method. The Qubit™ RNA HS Assay with RNA spike-in is able to quantify RNA with high specificity at a 5-fold lower concentration, and uses 5-fold less sample, than the standard Qubit™ assay.
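The spike-in arithmetic reduces to subtracting the baseline reading and correcting for the sample's dilution into the assay tube. A minimal sketch (hypothetical function name; readings in pg/μL):

```python
def sample_concentration(assay_reading, baseline_reading,
                         sample_vol_ul, assay_vol_ul):
    """Trace-sample concentration (same units as the readings) inferred
    from the reading increase over the RNA spike-in baseline, corrected
    for the dilution of the sample into the assay volume."""
    increase = assay_reading - baseline_reading
    if increase < 0:
        raise ValueError("reading fell below the spike-in baseline")
    return increase * assay_vol_ul / sample_vol_ul
```

For example, a 5 pg/μL increase measured after adding 1 μL of sample to a 10 μL assay implies an original concentration of 50 pg/μL.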
Real-time quantitative PCR for retrovirus-like particle quantification in CHO cell culture.
de Wit, C; Fautz, C; Xu, Y
2000-09-01
Chinese hamster ovary (CHO) cells have been widely used to manufacture recombinant proteins intended for human therapeutic uses. Retrovirus-like particles, which are apparently defective and non-infectious, have been detected in all CHO cells by electron microscopy (EM). To assure the viral safety of CHO cell-derived biologicals, quantification of retrovirus-like particles in production cell culture and demonstration of sufficient elimination of such particles by the downstream purification process are required for product market registration worldwide. EM, with a detection limit of 1×10^6 particles/ml, is the standard retrovirus-like particle quantification method. The whole process, which requires a large amount of sample (3-6 litres), is labour intensive, time consuming, expensive, and subject to significant assay variability. In this paper, a novel real-time quantitative PCR assay (TaqMan assay) has been developed for the quantification of retrovirus-like particles. Each retrovirus particle contains two copies of the viral genomic particle RNA (pRNA) molecule. Therefore, quantification of retrovirus particles can be achieved by quantifying the pRNA copy number, i.e. every two copies of retroviral pRNA are equivalent to one retrovirus-like particle. The TaqMan assay takes advantage of the 5'→3' exonuclease activity of Taq DNA polymerase and utilizes the PRISM 7700 Sequence Detection System of PE Applied Biosystems (Foster City, CA, U.S.A.) for automated pRNA quantification through a dual-labelled fluorogenic probe. The TaqMan quantification technique is highly comparable to EM analysis. In addition, it offers significant advantages over EM analysis, such as a higher sensitivity of less than 600 particles/ml, greater accuracy and reliability, higher sample throughput, more flexibility and lower cost. Therefore, the TaqMan assay should be used as a substitute for EM analysis for retrovirus-like particle quantification in CHO cell-based production systems. Copyright 2000 The International Association for Biologicals.
Yang, Litao; Liang, Wanqi; Jiang, Lingxi; Li, Wenquan; Cao, Wei; Wilson, Zoe A; Zhang, Dabing
2008-01-01
Background Real-time PCR techniques are widely used for nucleic acid analysis, but one limitation of current real-time PCR is the high cost of the labeled probe for each target molecule. Results We describe a real-time PCR technique employing attached universal duplex probes (AUDP), which has the advantage over current real-time PCR methods of generating fluorescence by probe hydrolysis and strand displacement. AUDP involves one set of universal duplex probes in which the 5' end of the fluorescent probe (FP) and a complementary quenching probe (QP) lie in close proximity so that fluorescence can be quenched. The PCR primer pair with attached universal template (UT) and the FP are identical to the UT sequence. We have shown that the AUDP technique can be used for detecting multiple target DNA sequences in both simplex and duplex real-time PCR assays for gene expression analysis, genotype identification, and genetically modified organism (GMO) quantification, with sensitivity, reproducibility, and repeatability comparable to other real-time PCR methods. Conclusion The results from gene expression analysis, genotype identification, and GMO quantification using AUDP real-time PCR assays indicate that the AUDP real-time PCR technique can be successfully applied to nucleic acid analysis, and that it offers an alternative for nucleic acid analysis with high efficiency, reliability, and flexibility at low cost. PMID:18522756
Study of dynamics of two-phase flow through a minichannel by means of recurrences
NASA Astrophysics Data System (ADS)
Litak, Grzegorz; Górski, Grzegorz; Mosdorf, Romuald; Rysak, Andrzej
2017-05-01
By changing the air and water flow rates in two-phase (air-water) flow through a minichannel, we observed the evolution of air bubble and slug patterns. This spatiotemporal behaviour was identified qualitatively using a digital camera. Simultaneously, we provided a detailed analysis of these phenomena using the corresponding sequences of light transmission time series recorded with a laser-phototransistor sensor. To distinguish particular patterns, we used recurrence plots and recurrence quantification analysis. Finally, we showed that the maxima of various recurrence quantification measures obtained from the laser time series can follow the bubble and slug patterns in the studied ranges of air and water flow.
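Recurrence quantification analysis starts from a binary recurrence matrix built by thresholding pairwise distances between points of the time series. A minimal sketch of two common measures, recurrence rate and determinism, is given below (determinism is computed here over all diagonals, including the main one, for simplicity; this is an illustration of the technique, not the paper's exact measure set):

```python
import numpy as np

def recurrence_matrix(x, eps):
    """Binary recurrence matrix: R[i, j] = 1 when |x_i - x_j| < eps."""
    x = np.asarray(x, dtype=float)
    d = np.abs(x[:, None] - x[None, :])
    return (d < eps).astype(int)

def recurrence_rate(R):
    """Fraction of recurrent points (the simplest RQA measure)."""
    return R.sum() / R.size

def determinism(R, lmin=2):
    """Fraction of recurrent points lying on diagonal lines of length
    >= lmin; high values indicate deterministic (patterned) dynamics."""
    n = R.shape[0]
    in_lines = 0
    for k in range(-(n - 1), n):
        run = 0
        for v in list(np.diagonal(R, k)) + [0]:  # sentinel closes last run
            if v:
                run += 1
            else:
                if run >= lmin:
                    in_lines += run
                run = 0
    total = R.sum()
    return in_lines / total if total else 0.0
```

Applied to a windowed light-transmission signal, tracking these measures over time distinguishes bubbly from slug flow regimes.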
Dobnik, David; Spilsberg, Bjørn; Bogožalec Košir, Alexandra; Holst-Jensen, Arne; Žel, Jana
2015-08-18
Presence of genetically modified organisms (GMOs) in food and feed products is regulated in many countries. The European Union (EU) has implemented a threshold for labeling of products containing more than 0.9% of authorized GMOs per ingredient. As the number of GMOs has increased over time, standard-curve-based simplex quantitative polymerase chain reaction (qPCR) analyses are no longer sufficiently cost-effective, despite widespread use of initial PCR-based screenings. Newly developed GMO detection methods, including multiplex methods, are mostly focused on screening and detection but not quantification. On the basis of droplet digital PCR (ddPCR) technology, multiplex assays were developed for quantification of all 12 EU-authorized GM maize lines (as of April 1, 2015). Because of the high sequence similarity of some of the 12 GM targets, two separate multiplex assays were needed. In both assays (4-plex and 10-plex), the transgenes were labeled with one fluorescence reporter and the endogene with another (GMO concentration = transgene/endogene ratio). It was shown that both multiplex assays produce specific results and that performance parameters such as limit of quantification, repeatability, and trueness comply with international recommendations for GMO quantification methods. Moreover, for samples containing GMOs, the throughput and cost-effectiveness are significantly improved compared to qPCR. Thus, it was concluded that the multiplex ddPCR assays could be applied for routine quantification of the 12 EU-authorized GM maize lines. In case of new authorizations, the events can easily be added to the existing multiplex assays. The presented principle of quantitative multiplexing can be applied to any other domain.
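ddPCR converts counts of positive droplets into copy numbers through a Poisson correction, and the GMO content then follows as the transgene/endogene copy ratio, as in the assays described above. A minimal sketch (hypothetical function names; the droplet volume cancels out of the ratio):

```python
import math

def copies_per_droplet(n_positive, n_total):
    """Poisson correction: mean target copies per droplet recovered
    from the fraction of positive droplets."""
    if n_positive >= n_total:
        raise ValueError("all droplets positive: sample too concentrated")
    return -math.log(1.0 - n_positive / n_total)

def gm_content_percent(transgene_pos, endogene_pos, n_total):
    """GM content as the transgene/endogene copy-number ratio, in %."""
    return 100.0 * (copies_per_droplet(transgene_pos, n_total)
                    / copies_per_droplet(endogene_pos, n_total))
```

The logarithmic correction matters because a droplet holding several copies still registers as a single positive event.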
The promise and challenge of high-throughput sequencing of the antibody repertoire
Georgiou, George; Ippolito, Gregory C; Beausang, John; Busse, Christian E; Wardemann, Hedda; Quake, Stephen R
2014-01-01
Efforts to determine the antibody repertoire encoded by B cells in the blood or lymphoid organs using high-throughput DNA sequencing technologies have been advancing at an extremely rapid pace and are transforming our understanding of humoral immune responses. Information gained from high-throughput DNA sequencing of immunoglobulin genes (Ig-seq) can be applied to detect B-cell malignancies with high sensitivity, to discover antibodies specific for antigens of interest, to guide vaccine development and to understand autoimmunity. Rapid progress in the development of experimental protocols and informatics analysis tools is helping to reduce sequencing artifacts, to achieve more precise quantification of clonal diversity and to extract the most pertinent biological information. That said, broader application of Ig-seq, especially in clinical settings, will require the development of a standardized experimental design framework that will enable the sharing and meta-analysis of sequencing data generated by different laboratories. PMID:24441474
Zhang, Chi; Fang, Xin; Qiu, Haopu; Li, Ning
2015-01-01
Real-time PCR amplification of mitochondrial genes cannot be used for DNA quantification, and amplification of single-copy DNA does not provide ideal sensitivity. Moreover, cross-reactions among similar species were commonly observed in published methods amplifying repetitive sequences, which hindered their further application. The purpose of this study was to establish a short interspersed nuclear element (SINE)-based real-time PCR approach with high species specificity that could be used for DNA quantification. After massive screening of candidate Sus scrofa SINEs, one optimal combination of primers and probe was selected, which had no cross-reaction with other common meat species. The LOD of the method was 44 fg DNA/reaction. Further, quantification tests showed this approach was practical for DNA estimation without variance among tissues. Thus, this study provides a new tool for qualitative detection of porcine components, which could be promising in the QC of meat products.
Bach, H-J; Jessen, I; Schloter, M; Munch, J C
2003-01-01
Real-time TaqMan-PCR assays were developed for detection, differentiation and absolute quantification of the pathogenic subspecies of Clavibacter michiganensis (Cm) in a single PCR run. The designed primer pair, targeting intergenic sequences of the rRNA operon (ITS) common to all subspecies, was suitable for the amplification of the expected 223-nt DNA fragments of all subspecies. Closely related bacteria were completely discriminated, except for Rathayibacter iranicus, from which weak PCR product bands appeared on agarose gel after 35 PCR cycles. Sufficient specificity of PCR detection was reached by introduction of the additional subspecies-specific probes used in TaqMan-PCR. Only Cm species were detected, and there was clear differentiation among the subspecies C. michiganensis sepedonicus (Cms), C. michiganensis michiganensis (Cmm), C. michiganensis nebraskensis (Cmn), C. michiganensis insidiosus (Cmi) and C. michiganensis tessellarius (Cmt). The TaqMan assays were optimized to enable simultaneous quantification of each subspecies. Validity was shown by comparison with cell counts.
Li, Xiang; Pan, Liangwen; Li, Junyi; Zhang, Qigang; Zhang, Shuya; Lv, Rong; Yang, Litao
2011-12-28
For implementation of the issued regulations and labeling policies for genetically modified organism (GMO) supervision, the polymerase chain reaction (PCR) method has been widely used because of its high specificity and sensitivity. In particular, use of the event-specific PCR method based on the flanking sequence of transgenes has become the primary trend. In this study, both qualitative and quantitative PCR methods were established on the basis of the 5' flanking sequence of transgenic soybean A2704-12 and the 3' flanking sequence of transgenic soybean A5547-127, respectively. In qualitative PCR assays, the limits of detection (LODs) were 10 copies of haploid soybean genomic DNA for both A2704-12 and A5547-127. In quantitative real-time PCR assays, the LODs were 5 copies of haploid soybean genomic DNA for both A2704-12 and A5547-127, and the limits of quantification (LOQs) were 10 copies for both. Low bias and acceptable SD and RSD values were also achieved in quantification of four blind samples using the developed real-time PCR assays. In addition, the developed PCR assays for the two transgenic soybean events were used for routine analysis of soybean samples imported to Shanghai in a 6 month period from October 2010 to March 2011. A total of 27 lots of soybean from the United States and Argentina were analyzed: 8 lots from the United States were found to contain the GM soybean A2704-12 event, with GM contents <1.5% in all eight analyzed lots. In contrast, no GM soybean A5547-127 content was found in any of the eight lots. These results demonstrate that the established event-specific qualitative and quantitative PCR methods can be used effectively in routine identification and quantification of GM soybeans A2704-12 and A5547-127 and their derived products.
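Standard-curve-based quantitative real-time PCR of this kind maps a threshold cycle (Ct) to a copy number through a linear fit of Ct against log10(copies). A minimal sketch (hypothetical function names; the numbers in the usage note are illustrative, not the paper's data):

```python
import numpy as np

def fit_standard_curve(log10_copies, ct):
    """Least-squares fit Ct = slope * log10(copies) + intercept.
    Returns slope, intercept, and PCR efficiency
    E = 10**(-1/slope) - 1 (E = 1.0 means perfect doubling)."""
    slope, intercept = np.polyfit(log10_copies, ct, 1)
    return slope, intercept, 10 ** (-1.0 / slope) - 1.0

def copies_from_ct(ct, slope, intercept):
    """Copy number of an unknown interpolated from its threshold cycle."""
    return 10 ** ((ct - intercept) / slope)
```

With perfect doubling the slope is about -3.32 cycles per decade, so an unknown whose Ct sits one decade above a 1000-copy standard contains roughly 100 copies.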
RNA-Skim: a rapid method for RNA-Seq quantification at transcript level
Zhang, Zhaojun; Wang, Wei
2014-01-01
Motivation: RNA-Seq technique has been demonstrated as a revolutionary means for exploring transcriptome because it provides deep coverage and base pair-level resolution. RNA-Seq quantification is proven to be an efficient alternative to Microarray technique in gene expression study, and it is a critical component in RNA-Seq differential expression analysis. Most existing RNA-Seq quantification tools require the alignments of fragments to either a genome or a transcriptome, entailing a time-consuming and intricate alignment step. To improve the performance of RNA-Seq quantification, an alignment-free method, Sailfish, has been recently proposed to quantify transcript abundances using all k-mers in the transcriptome, demonstrating the feasibility of designing an efficient alignment-free method for transcriptome quantification. Even though Sailfish is substantially faster than alternative alignment-dependent methods such as Cufflinks, using all k-mers in the transcriptome quantification impedes the scalability of the method. Results: We propose a novel RNA-Seq quantification method, RNA-Skim, which partitions the transcriptome into disjoint transcript clusters based on sequence similarity, and introduces the notion of sig-mers, which are a special type of k-mers uniquely associated with each cluster. We demonstrate that the sig-mer counts within a cluster are sufficient for estimating transcript abundances with accuracy comparable with any state-of-the-art method. This enables RNA-Skim to perform transcript quantification on each cluster independently, reducing a complex optimization problem into smaller optimization tasks that can be run in parallel. As a result, RNA-Skim uses <4% of the k-mers and <10% of the CPU time required by Sailfish. 
It is able to finish transcriptome quantification in <10 min per sample using just a single thread on a commodity computer, which represents a >100× speedup over the state-of-the-art alignment-based methods, while delivering comparable or higher accuracy. Availability and implementation: The software is available at http://www.csbio.unc.edu/rs. Contact: weiwang@cs.ucla.edu Supplementary information: Supplementary data are available at Bioinformatics online. PMID:24931995
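The core idea of sig-mers, k-mers unique to one transcript cluster, can be sketched in a few lines (an illustration of the concept, not RNA-Skim's optimized implementation; function names are hypothetical):

```python
from collections import defaultdict

def kmers(seq, k):
    """All k-mers of a sequence, as a set."""
    return {seq[i:i + k] for i in range(len(seq) - k + 1)}

def find_sig_mers(clusters, k):
    """Sig-mers: k-mers occurring in exactly one transcript cluster.
    `clusters` maps a cluster id to its list of transcript sequences."""
    owner = {}                         # k-mer -> cluster id, or None if shared
    for cid, seqs in clusters.items():
        for s in seqs:
            for km in kmers(s, k):
                if km not in owner or owner[km] == cid:
                    owner[km] = cid
                else:
                    owner[km] = None   # seen in a second cluster: not a sig-mer
    sig = defaultdict(set)
    for km, cid in owner.items():
        if cid is not None:
            sig[cid].add(km)
    return dict(sig)
```

Because sig-mers never cross cluster boundaries, counting them lets each cluster's abundance estimation run as an independent, parallelizable subproblem.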
Jung, SeungWoo; Bohan, Amy
2018-02-01
OBJECTIVE To characterize expression profiles of circulating microRNAs via genome-wide sequencing for dogs with congestive heart failure (CHF) secondary to myxomatous mitral valve degeneration (MMVD). ANIMALS 9 healthy client-owned dogs and 8 age-matched client-owned dogs with CHF secondary to MMVD. PROCEDURES Blood samples were collected before administering cardiac medications for the management of CHF. Isolated microRNAs from plasma were classified into microRNA libraries and subjected to next-generation sequencing (NGS) for genome-wide sequencing analysis and quantification of circulating microRNAs. Quantitative reverse transcription PCR (qRT-PCR) assays were used to validate expression profiles of differentially expressed circulating microRNAs identified from NGS analysis of dogs with CHF. RESULTS 326 microRNAs were identified with NGS analysis. Hierarchical analysis revealed distinct expression patterns of circulating microRNAs between healthy dogs and dogs with CHF. Results of qRT-PCR assays confirmed upregulation of 4 microRNAs (miR-133, miR-1, miR-let-7e, and miR-125) and downregulation of 4 selected microRNAs (miR-30c, miR-128, miR-142, and miR-423). Results of qRT-PCR assays were highly correlated with NGS data and supported the specificity of circulating microRNA expression profiles in dogs with CHF secondary to MMVD. CONCLUSIONS AND CLINICAL RELEVANCE These results suggested that circulating microRNA expression patterns were unique and could serve as molecular biomarkers of CHF in dogs with MMVD.
Detection and Analysis of Circular RNAs by RT-PCR.
Panda, Amaresh C; Gorospe, Myriam
2018-03-20
Gene expression in eukaryotic cells is tightly regulated at the transcriptional and posttranscriptional levels. Posttranscriptional processes, including pre-mRNA splicing, mRNA export, mRNA turnover, and mRNA translation, are controlled by RNA-binding proteins (RBPs) and noncoding (nc)RNAs. The vast family of ncRNAs comprises diverse regulatory RNAs, such as microRNAs and long noncoding (lnc)RNAs, but also the poorly explored class of circular (circ)RNAs. Although first discovered more than three decades ago by electron microscopy, only the advent of high-throughput RNA-sequencing (RNA-seq) and the development of innovative bioinformatic pipelines have begun to allow the systematic identification of circRNAs (Szabo and Salzman, 2016; Panda et al., 2017b; Panda et al., 2017c). However, the validation of true circRNAs identified by RNA sequencing requires other molecular biology techniques, including reverse transcription (RT) followed by conventional or quantitative (q) polymerase chain reaction (PCR), and Northern blot analysis (Jeck and Sharpless, 2014). RT-qPCR analysis of circular RNAs using divergent primers has been widely used for the detection, validation, and sometimes quantification of circRNAs (Abdelmohsen et al., 2015 and 2017; Panda et al., 2017b). As detailed here, divergent primers designed to span the circRNA backsplice junction sequence can specifically amplify the circRNAs and not the counterpart linear RNA. In sum, RT-PCR analysis using divergent primers allows direct detection and quantification of circRNAs.
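The divergent-primer logic rests on the backsplice junction: joining the 3' end of the circularized exon back to its 5' end creates a sequence present only in the circRNA. A minimal sketch (hypothetical helper names; real primer design must also consider melting temperature, amplicon length, and genome-wide specificity):

```python
def backsplice_junction(exon_seq, flank=10):
    """Head-to-tail (backsplice) junction of a circRNA formed from
    `exon_seq`: the last `flank` bases joined to the first `flank`
    bases. Divergent primers are designed on either side of it."""
    return exon_seq[-flank:] + exon_seq[:flank]

def junction_is_circle_specific(junction, linear_seq):
    """A junction-spanning amplicon detects the circRNA specifically
    when the junction sequence is absent from the linear transcript."""
    return junction not in linear_seq
```

On a linear template, divergent primers point away from each other and yield no product; only the circular template brings their binding sites into an amplifiable orientation across this junction.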
Hong, Jungeui; Gresham, David
2017-11-01
Quantitative analysis of next-generation sequencing (NGS) data requires discriminating duplicate reads generated by PCR from identical molecules that are of unique origin. Typically, PCR duplicates are identified as sequence reads that align to the same genomic coordinates using reference-based alignment. However, identical molecules can be independently generated during library preparation. Misidentification of these molecules as PCR duplicates can introduce unforeseen biases during analyses. Here, we developed a cost-effective sequencing adapter design by modifying Illumina TruSeq adapters to incorporate a unique molecular identifier (UMI) while maintaining the capacity to undertake multiplexed, single-index sequencing. Incorporation of UMIs into TruSeq adapters (TrUMIseq adapters) enables identification of bona fide PCR duplicates as identically mapped reads with identical UMIs. Using TrUMIseq adapters, we show that accurate removal of PCR duplicates results in improved accuracy of both allele frequency (AF) estimation in heterogeneous populations using DNA sequencing and gene expression quantification using RNA-Seq.
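With UMIs in place, duplicate removal reduces to keeping one read per (mapping coordinates, UMI) key, rather than one read per coordinate alone. A minimal sketch (hypothetical function name; production pipelines additionally tolerate sequencing errors within UMIs, which this ignores):

```python
def remove_pcr_duplicates(reads):
    """Collapse PCR duplicates: reads with identical mapping coordinates
    AND identical UMIs. Reads at the same coordinates with different
    UMIs are retained as independent molecules.
    `reads`: iterable of (chrom, pos, strand, umi, read_id) tuples."""
    seen, kept = set(), []
    for chrom, pos, strand, umi, read_id in reads:
        key = (chrom, pos, strand, umi)
        if key not in seen:
            seen.add(key)
            kept.append(read_id)
    return kept
```

Coordinate-only deduplication would discard the third read below as a duplicate, biasing allele frequency and expression estimates; the UMI key keeps it.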
An Exercise Health Simulation Method Based on Integrated Human Thermophysiological Model
Chen, Xiaohui; Yu, Liang; Yang, Kaixing
2017-01-01
Research on healthy exercise has garnered keen interest in the past few years. It is known that participation in a regular exercise program can help improve various aspects of cardiovascular function and reduce the risk of illness. However, exercise accidents such as dehydration, exertional heatstroke, and even sudden death demand attention. If these exercise accidents can be analyzed and predicted before they happen, it will be beneficial for alleviating or avoiding disease and mortality. To achieve this objective, an exercise health simulation approach is proposed, in which an integrated human thermophysiological model consisting of a human thermal regulation model and a nonlinear heart rate regulation model is reported. The human thermoregulatory mechanism as well as the heart rate response mechanism during exercise can be simulated. On the basis of the simulated physiological indicators, a fuzzy finite state machine is constructed to obtain the possible health transition sequence and predict the exercise health status. The experimental results show that our integrated exercise thermophysiological model can numerically simulate the thermal and physiological processes of the human body during exercise, and the predicted exercise health transition sequence from the finite state machine can be used in healthcare. PMID:28702074
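The finite-state-machine idea can be sketched with crisp thresholds standing in for fuzzy membership functions (all states, events, and threshold values below are illustrative assumptions, not the paper's calibrated model):

```python
# Hypothetical three-state health FSM driven by simulated indicators.
TRANSITIONS = {
    ("normal", "heat_strain"): "at_risk",
    ("at_risk", "heat_strain"): "danger",
    ("at_risk", "recovered"): "normal",
    ("danger", "recovered"): "at_risk",
}

def classify(core_temp_c, heart_rate):
    """Map simulated indicators to an event (crisp thresholds here for
    illustration; a fuzzy FSM would use membership degrees instead)."""
    if core_temp_c > 38.5 or heart_rate > 180:
        return "heat_strain"
    return "recovered"

def health_sequence(samples, state="normal"):
    """Run simulated (core temperature degC, heart rate bpm) samples
    through the state machine and return the visited health states."""
    path = [state]
    for temp, hr in samples:
        event = classify(temp, hr)
        state = TRANSITIONS.get((state, event), state)
        path.append(state)
    return path
```

Feeding the thermophysiological model's simulated time series through such a machine yields the health transition sequence used for prediction.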
Germer, Jeffrey J; Ankoudinova, Irina; Belousov, Yevgeniy S; Mahoney, Walt; Dong, Chen; Meng, Jihong; Mandrekar, Jayawant N; Yao, Joseph D
2017-05-01
Hepatitis E virus (HEV) has emerged as a cause of chronic hepatitis among immunocompromised patients. Molecular assays have become important tools for the diagnosis and management of these chronically infected patients. A real-time reverse transcription-quantitative PCR (RT-qPCR) assay utilizing Pleiades probe chemistry and an RNA internal control for the simultaneous detection and quantification of HEV RNA in human serum was developed based on an adaptation of a previously described and broadly reactive primer set targeting the overlapping open reading frame 2/3 (ORF2/3) nucleotide sequence of HEV. A chimeric bovine viral diarrhea virus construct containing an HEV RNA insert (SynTura HEV) was developed, value assigned with the first World Health Organization (WHO) international standard for HEV RNA (code 6329/10), and used to prepare working assay calibrators and controls, which supported an assay quantification range of 100 to 5,000,000 IU/ml. The analytical sensitivity (95% detection rate) of this assay was 25.2 IU/ml (95% confidence interval [CI], 19.2 to 44.1 IU/ml). The assay successfully amplified 16 different HEV sequences with significant nucleotide mismatching in primer/probe binding regions, while evaluation of a WHO international reference panel for HEV genotypes (code 8578/13) showed viral load results falling within the result ranges generated by WHO collaborative study participants for all panel members (genotypes 1 to 4). Broadly reactive RT-qPCR primers targeting HEV ORF2/3 were successfully adapted for use in an assay based on Pleiades probe chemistry. The availability of secondary standards calibrated to the WHO HEV international standard can improve the standardization and performance of assays for the detection and quantification of HEV RNA. Copyright © 2017 American Society for Microbiology.
Bergallo, M; Costa, C; Tarallo, S; Daniele, R; Merlino, C; Segoloni, G P; Negro Ponzi, A; Cavallo, R
2006-06-01
The human cytomegalovirus (HCMV) is an important pathogen in immunocompromised patients, such as transplant recipients. The use of sensitive and rapid diagnostic assays can have a great impact on antiviral prophylaxis, therapy monitoring, and the diagnosis of active disease. Quantification of HCMV DNA may additionally have prognostic value and guide routine management. The aim of this study was to develop a reliable internally-controlled quantitative-competitive PCR (QC-PCR) for the detection and quantification of HCMV DNA viral load in peripheral blood and compare it with other methods: the HCMV pp65 antigenaemia assay in the leukocyte fraction and the HCMV viraemia assay, both routinely employed in our laboratory, and nucleic acid sequence-based amplification (NASBA) for detection of HCMV pp67 mRNA. Quantitative-competitive PCR is a procedure for nucleic acid quantification based on co-amplification of competitive templates: the target DNA and a competitor functioning as an internal standard. In particular, a standard curve is generated by amplifying 10^2 to 10^5 copies of target pCMV-435 plasmid with 10^4 copies of competitor pCMV-C plasmid. Clinical samples derived from 40 kidney transplant patients were tested by spiking 10^4 copies of pCMV-C into the PCR mix as an internal control, and comparing the results with the standard curve. Of the 40 patients studied, 39 (97.5%) were positive for HCMV DNA by QC-PCR. While the correlations between the number of pp65-positive cells and the number of HCMV DNA genome copies/mL, and between the former and pp67 mRNA positivity, were statistically significant, there was no significant correlation between HCMV DNA viral load assayed by QC-PCR and HCMV viraemia. The QC-PCR assay could detect from 10^2 to over 10^7 copies of HCMV DNA, with a range of linearity between 10^2 and 10^5 genomes.
Li, Zhuqing; Li, Xiang; Wang, Canhua; Song, Guiwen; Pi, Liqun; Zheng, Lan; Zhang, Dabing; Yang, Litao
2017-09-27
Multiple-target plasmid DNA reference materials have been generated and utilized as good substitutes for matrix-based reference materials in the analysis of genetically modified organisms (GMOs). Herein, we report the construction of one multiple-target plasmid reference molecule, pCAN, which harbors eight GM canola event-specific sequences (RF1, RF2, MS1, MS8, Topas 19/2, Oxy235, RT73, and T45) and a partial sequence of the canola endogenous reference gene PEP. The applicability of this plasmid reference material in qualitative and quantitative PCR assays of the eight GM canola events was evaluated, including analysis of specificity, limit of detection (LOD), limit of quantification (LOQ), and the performance of pCAN in the analysis of various canola samples. The LODs are 15 copies for the RF2, MS1, and RT73 assays using pCAN as the calibrator and 10 genome copies for the other events. The LOQ in each event-specific real-time PCR assay is 20 copies. In quantitative real-time PCR analysis, the PCR efficiencies of all event-specific and PEP assays are between 91% and 97%, and the squared regression coefficients (R²) are all higher than 0.99. The quantification bias values varied from 0.47% to 20.68%, with relative standard deviations (RSD) from 1.06% to 24.61%, in the quantification of simulated samples. Furthermore, 10 practical canola samples taken from imported shipments in the port of Shanghai, China, were analyzed employing pCAN as the calibrator, and the results were comparable with those of assays using commercial certified reference materials as the calibrator. From these results, we believe that this newly developed pCAN plasmid is a good candidate plasmid DNA reference material for the detection and quantification of the eight GM canola events in routine analysis.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kalimullah
1994-03-01
Some special purpose heavy-water reactors are made of assemblies consisting of a number of coaxial aluminum-clad U-Al alloy fuel tubes and an outer Al sleeve surrounding the fuel tubes. The heavy-water coolant flows in the annular gaps between the circular tubes. Analysis of severe accidents in such reactors requires a model for predicting the behavior of the fuel tubes as they melt and disrupt. This paper describes a detailed, mechanistic model for fuel tube heatup, melting, freezing, and molten material relocation, called MARTINS (Melting And Relocation of Tubes In Nuclear Subassembly). The paper presents the modeling of the phenomena in MARTINS and an application of the model to the analysis of a reactivity insertion accident. Some models are being developed to compute the gradual downward relocation of molten material at decay-heat power levels via candling along intact tubes, neglecting coolant vapor hydrodynamic forces on the molten material. Such models are inadequate for high-power accident sequences involving significant hydrodynamic forces. These forces are included in MARTINS.
Horn, T; Chang, C A; Urdea, M S
1997-12-01
The divergent synthesis of branched DNA (bDNA) comb structures is described. This new type of bDNA contains one unique oligonucleotide, the primary sequence, covalently attached through a comb-like branch network to many identical copies of a different oligonucleotide, the secondary sequence. The bDNA comb structures were assembled on a solid support and several synthesis parameters were investigated and optimized. The bDNA comb molecules were characterized by polyacrylamide gel electrophoretic methods and by controlled cleavage at periodate-cleavable moieties incorporated during synthesis. The developed chemistry allows synthesis of bDNA comb molecules containing multiple secondary sequences. In the accompanying article we describe the synthesis and characterization of large bDNA combs containing all four deoxynucleotides for use as signal amplifiers in nucleic acid quantification assays.
Sun, Bing; Shen, Feng; McCalla, Stephanie E; Kreutz, Jason E; Karymov, Mikhail A; Ismagilov, Rustem F
2013-02-05
Here we used a SlipChip microfluidic device to evaluate the performance of digital reverse transcription-loop-mediated isothermal amplification (dRT-LAMP) for quantification of HIV viral RNA. Tests are needed for monitoring HIV viral load to control the emergence of drug resistance and to diagnose acute HIV infections. In resource-limited settings, in vitro measurement of HIV viral load in a simple format is especially needed, and single-molecule counting using a digital format could provide a potential solution. We showed here that when one-step dRT-LAMP is used for quantification of HIV RNA, the digital count is lower than expected and is limited by the yield of desired cDNA. We were able to overcome the limitations by developing a microfluidic protocol to manipulate many single molecules in parallel through a two-step digital process. In the first step we compartmentalize the individual RNA molecules (based on Poisson statistics) and perform reverse transcription on each RNA molecule independently to produce DNA. In the second step, we perform the LAMP amplification on all individual DNA molecules in parallel. Using this new protocol, we increased the absolute efficiency (the ratio between the concentration calculated from the actual count and the expected concentration) of dRT-LAMP 10-fold, from ∼2% to ∼23%, by (i) using a more efficient reverse transcriptase, (ii) introducing RNase H to break up the DNA:RNA hybrid, and (iii) adding only the BIP primer during the RT step. We also used this two-step method to quantify HIV RNA purified from four patient samples and found that in some cases, the quantification results were highly sensitive to the sequence of the patient's HIV RNA. We learned the following three lessons from this work: (i) digital amplification technologies, including dLAMP and dPCR, may give adequate dilution curves and yet have low efficiency, thereby providing quantification values that underestimate the true concentration. 
Careful validation is essential before a method is considered to provide absolute quantification; (ii) the sensitivity of dLAMP to the sequence of the target nucleic acid necessitates additional validation with patient samples carrying the full spectrum of mutations; (iii) for multistep digital amplification chemistries, such as a combination of reverse transcription with amplification, microfluidic devices may be used to decouple these steps from one another and to perform them under different, individually optimized conditions for improved efficiency.
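The single-molecule counting described above rests on Poisson statistics: if a fraction p of partitions is positive, the mean number of template copies per partition is λ = -ln(1 - p). A minimal sketch of that correction and of the "absolute efficiency" ratio defined in the abstract; the droplet counts, well volume, and loaded concentration are hypothetical:

```python
import math

def poisson_concentration(positives, partitions, partition_volume_nl):
    """Poisson-corrected concentration from digital partition counts:
    lambda = -ln(1 - p) copies per partition, converted to copies/uL."""
    p = positives / partitions
    lam = -math.log(1.0 - p)
    return lam / (partition_volume_nl * 1e-3)  # 1 nL = 1e-3 uL

# Hypothetical run: 230 of 1280 wells positive, 2.6 nL per well.
measured = poisson_concentration(230, 1280, 2.6)
expected = 331.0  # copies/uL actually loaded (assumed known input)
efficiency = measured / expected  # "absolute efficiency" as defined above

print(round(measured, 1))    # -> 76.2 copies/uL counted
print(round(efficiency, 2))  # -> 0.23, i.e. the ~23% reached in the paper
```

The point of the abstract's first lesson is visible here: the dilution curve can look perfectly linear even when `efficiency` is far below 1, so the raw digital count underestimates the true concentration.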
Gaseous iodine monitoring in Europe after the Fukushima accident
NASA Astrophysics Data System (ADS)
Masson, Olivier; de Vismes-Ott, Anne; Manificat, Guillaume; Gurriaran, Rodolfo; Debayle, Christophe
2014-05-01
After the Fukushima accident and the worldwide dispersion of contaminated air masses, many monitoring networks reported airborne levels of the emitted radionuclides, mainly cesium isotopes and iodine-131. Most of the reported values concerned the particulate fraction (i.e. radionuclide-labeled aerosols) and were dedicated to cesium-137, cesium-134 and iodine-131. Iodine-131 was also found in gaseous form, which accounted for most of the total (gaseous + particulate) I-131 throughout the world. This gaseous predominance was also noticed after the Chernobyl accident, despite the difference in the type of accident. It is due to the high volatility of iodine and to a rather low transfer from the gaseous to the particulate form by adsorption onto ambient airborne particles. Paradoxically, the number of gaseous determinations was rather low (around 10 percent) compared to the volume of data on the particulate form. Routine monitoring of airborne radionuclides has been based on aerosol sampling for decades, as this allows the long-term characterization of trace levels of remnant anthropogenic radionuclides. Moreover, the capability of gaseous samplers equipped with activated charcoal to quantify gaseous 131I at trace levels is limited by the contact time required for the sorption of iodine on the sorbent, and thus by the low acceptable flow rate (usually between 3 and 5 m3/h, exceptionally 12 m3/h). In this context, and despite the fact that airborne levels outside Japan were of no concern for public health, this contributed to the lack of information on the actual levels of gaseous iodine. Other incidents involving iodine determination in the air were reported in Europe in 2011 and 2012 without any relation to the Fukushima accident. For the same reason, mainly, if not only, the particulate form was reported, whereas the predominant form was presumably gaseous. To cope with these limitations, several improvements can be made: 1) increasing the number of iodine samplers, as undertaken by IRSN; 2) operating a number of gaseous surveillance stations on a routine basis; 3) lowering the detection limit for gaseous iodine.
ERIC Educational Resources Information Center
Atar, Cihat; Seedhouse, Paul
2018-01-01
This study analyses teacher-led clarification sequences in a university second language classroom setting from a conversation-analytic perspective. In the literature, there are many studies of clarification requests, but the focus is on individual categories and quantification. No previous study has examined clarification, as reconceptualised in…
How Efficient Is My (Medicinal) Chemistry?
Vanden Eynde, Jean Jacques
2016-01-01
“Greening” a chemical transformation is not only about changing the nature of a solvent or decreasing the reaction temperature. There are metrics enabling a critical quantification of the efficiency of an experimental protocol. Some of them are applied to different sequences for the preparation of paracetamol in order to understand their performance parameters and elucidate pathways for improvement. PMID:27196914
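One of the standard metrics alluded to above is atom economy: the molecular weight of the desired product divided by the summed molecular weights of all stoichiometric reactants. A minimal sketch using the classic one-step paracetamol synthesis (4-aminophenol + acetic anhydride); the specific metric choices made in the paper are not reproduced here:

```python
# Atomic masses (g/mol), rounded
MASS = {"C": 12.011, "H": 1.008, "N": 14.007, "O": 15.999}

def mw(formula):
    """Molecular weight from a dict of element counts."""
    return sum(MASS[el] * n for el, n in formula.items())

def atom_economy(product, reactants):
    """Atom economy (%) = MW(product) / sum of MW(reactants) * 100."""
    return 100.0 * mw(product) / sum(mw(r) for r in reactants)

# 4-aminophenol + acetic anhydride -> paracetamol + acetic acid
p_aminophenol = {"C": 6, "H": 7, "N": 1, "O": 1}
acetic_anhydride = {"C": 4, "H": 6, "O": 3}
paracetamol = {"C": 8, "H": 9, "N": 1, "O": 2}

print(round(atom_economy(paracetamol, [p_aminophenol, acetic_anhydride]), 1))
# -> 71.6 (the acetic acid by-product accounts for the missing mass)
```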
Parallel sequencing lives, or what makes large sequencing projects successful
Cuartero, Yasmina; Stadhouders, Ralph; Graf, Thomas; Marti-Renom, Marc A; Beato, Miguel
2017-01-01
Abstract T47D_rep2 and b1913e6c1_51720e9cf were 2 Hi-C samples. They were born and processed at the same time, yet their fates were very different. The life of b1913e6c1_51720e9cf was simple and fruitful, while that of T47D_rep2 was full of accidents and sorrow. At the heart of these differences lies the fact that b1913e6c1_51720e9cf was born under a lab culture of Documentation, Automation, Traceability, and Autonomy and compliance with the FAIR Principles. Their lives are a lesson for those who wish to embark on the journey of managing high-throughput sequencing data. PMID:29048533
QUESP and QUEST revisited - fast and accurate quantitative CEST experiments.
Zaiss, Moritz; Angelovski, Goran; Demetriou, Eleni; McMahon, Michael T; Golay, Xavier; Scheffler, Klaus
2018-03-01
Chemical exchange saturation transfer (CEST) NMR or MRI experiments allow detection of low-concentration molecules with enhanced sensitivity via their proton exchange with the abundant water pool. Be it endogenous metabolites or exogenous contrast agents, an exact quantification of the actual exchange rate is required to design optimal pulse sequences and/or specific sensitive agents. Refined analytical expressions allow deeper insight and improved accuracy for common quantification techniques. The accuracy of standard quantification methodologies, such as quantification of the exchange rate using varying saturation power or varying saturation time, is improved especially for the case of nonequilibrium initial conditions and weak labeling conditions, meaning the saturation amplitude is smaller than the exchange rate (γB1 < k). The improved analytical 'quantification of exchange rate using varying saturation power/time' (QUESP/QUEST) equations allow for more accurate exchange rate determination, and provide clear insights on the general principles to execute the experiments and to perform numerical evaluation. The proposed methodology was evaluated on the large-shift regime of paramagnetic chemical-exchange-saturation-transfer agents using simulated data and data of the paramagnetic Eu(III) complex of DOTA-tetraglycineamide. The refined formulas yield improved exchange rate estimation. General convergence intervals of the methods that would apply for smaller-shift agents are also discussed. Magn Reson Med 79:1708-1721, 2018. © 2017 International Society for Magnetic Resonance in Medicine.
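The weak-labeling condition (γB1 < k) refers to the textbook CEST labeling efficiency α = ω1²/(ω1² + k²) with ω1 = γB1. A minimal sketch of that term only; the refined QUESP/QUEST corrections for finite saturation time and non-equilibrium initial conditions derived in the paper are not reproduced here, and the exchange rate used is illustrative:

```python
GAMMA = 267.522e6  # rad/(s*T), proton gyromagnetic ratio

def labeling_efficiency(b1_uT, k_ex):
    """Textbook CEST labeling efficiency alpha = w1^2 / (w1^2 + k^2),
    with w1 = gamma * B1 in rad/s and k_ex the exchange rate in 1/s."""
    w1 = GAMMA * b1_uT * 1e-6
    return w1 ** 2 / (w1 ** 2 + k_ex ** 2)

# In the weak-labeling regime (gamma*B1 < k), alpha stays well below 1.
for b1 in (1.0, 5.0, 20.0):  # saturation amplitudes in uT
    print(b1, round(labeling_efficiency(b1, k_ex=5000.0), 3))
```

For a fast-exchanging site (k = 5000 s⁻¹ here), even 20 µT of saturation only labels about half the exchanging protons, which is why quantification in this regime needs the refined expressions.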
Liu, Dian; Steingoetter, Andreas; Curcic, Jelena; Kozerke, Sebastian
2018-01-01
To investigate and exploit the effect of intravoxel off-resonance compartments in the triple-echo steady-state (TESS) sequence without fat suppression for T2 mapping, and to leverage the results for fat fraction quantification. In multicompartment tissue, where at least one compartment is excited off-resonance, the total signal exhibits periodic modulations as a function of echo time (TE). Simulated multicompartment TESS signals were synthesized at various TEs. Fat emulsion phantoms were prepared and scanned at the same TE combinations using TESS. In vivo knee data were obtained with TESS to validate the simulations. The multicompartment effect was exploited for fat fraction quantification in the stomach by acquiring TESS signals at two TE combinations. Simulated and measured multicompartment signal intensities were in good agreement. Multicompartment effects caused erroneous T2 offsets, even at low water-fat ratios. The choice of TE caused T2 variations of as much as 28% in cartilage. The feasibility of fat fraction quantification to monitor the decrease of fat content in the stomach during digestion is demonstrated. Intravoxel off-resonance compartments are a confounding factor for T2 quantification using TESS, causing errors that are dependent on the TE. At the same time, off-resonance effects may allow for efficient fat fraction mapping using steady-state imaging. Magn Reson Med 79:423-429, 2018. © 2017 International Society for Magnetic Resonance in Medicine.
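The periodic TE modulation described above can be illustrated with a two-compartment (water + fat) voxel whose fat signal accrues phase 2πΔf·TE relative to water. The compartment fractions and the Δf of -434 Hz (single fat peak at 3 T) below are illustrative assumptions, not values from the paper:

```python
import cmath
import math

def two_compartment_signal(te_ms, water=0.8, fat=0.2, delta_f_hz=-434.0):
    """Magnitude of a simple water+fat voxel signal at echo time TE (ms).
    The fat compartment accrues phase 2*pi*delta_f*TE relative to water,
    so the total magnitude oscillates periodically with TE -- the
    multicompartment effect the abstract describes."""
    phase = 2.0 * math.pi * delta_f_hz * te_ms * 1e-3
    return abs(water + fat * cmath.exp(1j * phase))

# In-phase and opposed-phase TEs differ by 1/(2*|delta_f|) ~ 1.15 ms here.
print(round(two_compartment_signal(2.30), 2))  # near in-phase:      -> 1.0
print(round(two_compartment_signal(3.45), 2))  # near opposed-phase: -> 0.6
```

Sampling the signal at two TE combinations, as the paper does for the stomach, separates the oscillating fat contribution from the water baseline and yields a fat fraction estimate.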
Damond, F; Benard, A; Balotta, Claudia; Böni, Jürg; Cotten, Matthew; Duque, Vitor; Ferns, Bridget; Garson, Jeremy; Gomes, Perpetua; Gonçalves, Fátima; Gottlieb, Geoffrey; Kupfer, Bernd; Ruelle, Jean; Rodes, Berta; Soriano, Vicente; Wainberg, Mark; Taieb, Audrey; Matheron, Sophie; Chene, Genevieve; Brun-Vezinet, Francoise
2011-10-01
Accurate HIV-2 plasma viral load quantification is crucial for adequate HIV-2 patient management and for the proper conduct of clinical trials and international cohort collaborations. This study compared the homogeneity of HIV-2 RNA quantification when using HIV-2 assays from ACHI(E)V(2E) study sites and either in-house PCR calibration standards or common viral load standards supplied to all collaborators. Each of the 12 participating laboratories quantified blinded HIV-2 samples, using its own HIV-2 viral load assay and standard as well as centrally validated and distributed common HIV-2 group A and B standards (http://www.hiv.lanl.gov/content/sequence/HelpDocs/subtypes-more.html). Aliquots of HIV-2 group A and B strains, each at 2 theoretical concentrations (2.7 and 3.7 log10 copies/ml), were tested. Intralaboratory, interlaboratory, and overall variances of quantification results obtained with both standards were compared using F tests. For HIV-2 group A quantifications, overall and interlaboratory and/or intralaboratory variances were significantly lower when using the common standard than when using in-house standards at the concentration levels of 2.7 log10 copies/ml and 3.7 log10 copies/ml, respectively. For HIV-2 group B, high heterogeneity was observed and the variances did not differ according to the type of standard used. In this international collaboration, the use of a common standard improved the homogeneity of HIV-2 group A RNA quantification only. The diversity of HIV-2 group B, particularly in PCR primer-binding regions, may explain the heterogeneity in quantification of this strain. Development of a validated HIV-2 viral load assay that accurately quantifies distinct circulating strains is needed.
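The variance comparison above uses an F test, i.e. the ratio of the two sample variances. A minimal sketch; the per-laboratory values below are invented for illustration, not data from the study:

```python
import statistics

def f_statistic(sample_a, sample_b):
    """F statistic for comparing two variances (larger variance on top)."""
    va, vb = statistics.variance(sample_a), statistics.variance(sample_b)
    return max(va, vb) / min(va, vb)

# Hypothetical log10 copies/ml reported by labs for one blinded sample,
# once calibrated with in-house standards, once with the common standard.
in_house = [2.4, 2.9, 2.6, 3.1, 2.5, 2.8]
common = [2.7, 2.8, 2.6, 2.7, 2.8, 2.7]
print(round(f_statistic(in_house, common), 1))  # -> 12.3
```

A large F value (compared against the F distribution's critical value for the given degrees of freedom) indicates that one calibration scheme produces significantly more scatter than the other.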
RNAbrowse: RNA-Seq de novo assembly results browser.
Mariette, Jérôme; Noirot, Céline; Nabihoudine, Ibounyamine; Bardou, Philippe; Hoede, Claire; Djari, Anis; Cabau, Cédric; Klopp, Christophe
2014-01-01
Transcriptome analysis based on a de novo assembly of next-generation RNA sequences is now performed routinely in many laboratories. The generated results, including contig sequences, quantification figures, functional annotations, and variation discovery outputs, are usually bulky and quite diverse. This article presents a user-oriented storage and visualisation environment that permits exploring the data in a top-down manner, going from general graphical views to all possible details. The software package is based on BioMart and is easy to install and populate with local data. The software package is available under the GNU General Public License (GPL) at http://bioinfo.genotoul.fr/RNAbrowse.
Lin, Huimin; Fu, Caixia; Kannengiesser, Stephan; Cheng, Shu; Shen, Jun; Dong, Haipeng; Yan, Fuhua
2018-03-07
The coexistence of hepatic iron and fat is common in patients with hyperferritinemia and plays an interactive and aggravating role in the progression of disease (fibrosis, cirrhosis, and hepatocellular carcinoma). To evaluate a modified high-speed T2-corrected multi-echo, single-voxel spectroscopy sequence (HISTOV) for liver iron concentration (LIC) quantification in patients with hyperferritinemia, with simultaneous fat fraction (FF) estimation. Retrospective cohort study. Thirty-eight patients with hyperferritinemia were enrolled. HISTOV, a fat-saturated multi-echo gradient echo (GRE) sequence, and a spin echo sequence (FerriScan) were performed at 1.5T. R2 of the water signal and FF were calculated with HISTOV, and R2* values were derived from the GRE sequence, with R2 and LIC from FerriScan serving as the references. Linear regression, correlation analyses, receiver operating characteristic analyses, and Bland-Altman analyses were conducted. Abnormal hepatic iron load was detected in 32/38 patients, of whom 10/32 had coexisting steatosis. A strong correlation was found between R2* and FerriScan-LIC (R^2 = 0.861), and between HISTOV-R2_water and FerriScan-R2 (R^2 = 0.889). Furthermore, HISTOV-R2_water was not correlated with HISTOV-FF. The area under the curve (AUC) for HISTOV-R2_water was 0.974, 0.971, and 1, corresponding to clinical FerriScan-LIC thresholds of 1.8, 3.2, and 7.0 mg/g dw, respectively. No significant difference in AUC was found between HISTOV-R2_water and R2* at any of the LIC thresholds, with P-values of 0.42, 0.37, and 1, respectively. HISTOV-LIC showed excellent agreement with FerriScan-LIC, with a mean bias of 0.00 ± 1.18 mg/g dw, whereas the mean bias between GRE-LIC and FerriScan-LIC was 0.53 ± 1.49 mg/g dw. HISTOV is useful for the quantification and grading of liver iron overload in patients with hyperferritinemia, particularly in cases with coexisting steatosis. HISTOV-LIC showed no systematic bias compared with FerriScan-LIC, making it a promising alternative for iron quantification. Level of Evidence: 3. Technical Efficacy: Stage 2. J. Magn. Reson. Imaging 2018. © 2018 International Society for Magnetic Resonance in Medicine.
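The "mean bias ± SD" agreement figures above come from a Bland-Altman analysis of paired measurements. A minimal sketch of that computation; the paired LIC values are hypothetical, not data from the study:

```python
import statistics

def bland_altman(a, b):
    """Mean bias and 95% limits of agreement between two paired methods."""
    diffs = [x - y for x, y in zip(a, b)]
    bias = statistics.mean(diffs)
    sd = statistics.stdev(diffs)
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

# Hypothetical paired LIC estimates (mg/g dw) from two methods.
method_a = [1.5, 3.0, 4.2, 6.8, 9.1]
method_b = [1.4, 3.2, 4.0, 7.0, 9.3]
bias, (lo, hi) = bland_altman(method_a, method_b)
print(round(bias, 2))  # mean bias, close to zero -> good agreement
```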
Corruption of genomic databases with anomalous sequence.
Lamperti, E D; Kittelberger, J M; Smith, T F; Villa-Komaroff, L
1992-06-11
We describe evidence that DNA sequences from vectors used for cloning and sequencing have been incorporated accidentally into eukaryotic entries in the GenBank database. These incorporations were not restricted to one type of vector or to a single mechanism. Many minor instances may have been the result of simple editing errors, but some entries contained large blocks of vector sequence that had been incorporated by contamination or other accidents during cloning. Some cases involved unusual rearrangements and areas of vector distant from the normal insertion sites. Matches to vector were found in 0.23% of 20,000 sequences analyzed in GenBank Release 63. Although the possibility of anomalous sequence incorporation has been recognized since the inception of GenBank and should be easy to avoid, recent evidence suggests that this problem is increasing more quickly than the database itself. The presence of anomalous sequence may have serious consequences for the interpretation and use of database entries, and will have an impact on issues of database management. The incorporated vector fragments described here may also be useful for a crude estimate of the fidelity of sequence information in the database. In alignments with well-defined ends, the matching sequences showed 96.8% identity to vector; when poorer matches with arbitrary limits were included, the aggregate identity to vector sequence was 94.8%.
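The screening that produced the 0.23% figure above amounts to searching database entries for high-identity matches to known vector sequence. A toy version of the idea, using a naive sliding-window identity scan rather than the proper local alignment a real screen (e.g. a VecScreen-style tool) would use; the sequences and threshold are invented for illustration:

```python
def percent_identity(a, b):
    """Identity between two equal-length sequences."""
    matches = sum(x == y for x, y in zip(a, b))
    return 100.0 * matches / len(a)

def scan_for_vector(entry, vector, window=20, threshold=95.0):
    """Naive screen: slide a window of vector sequence along the entry
    and report (position, identity) hits above the threshold."""
    hits = []
    probe = vector[:window]
    for i in range(len(entry) - window + 1):
        ident = percent_identity(entry[i:i + window], probe)
        if ident >= threshold:
            hits.append((i, ident))
    return hits

vector = "GAATTCGAGCTCGGTACCCGGGGATCC"        # pUC-like polylinker fragment
entry = "ATGGCCATT" + vector[:20] + "TTGGCATGC"  # "contaminated" entry
print(scan_for_vector(entry, vector))  # -> [(9, 100.0)]
```

The paper's observation that matches show ~95-97% identity to vector maps onto the `threshold` parameter: a screen too strict misses slightly mutated or edited vector fragments.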
Couillerot, O; Poirier, M-A; Prigent-Combaret, C; Mavingui, P; Caballero-Mellado, J; Moënne-Loccoz, Y
2010-08-01
To assess the applicability of sequence-characterized amplified region (SCAR) markers obtained from BOX, ERIC and RAPD fragments to design primers for real-time PCR quantification of the phytostimulatory maize inoculants Azospirillum brasilense UAP-154 and CFN-535 in the rhizosphere. Primers were designed based on strain-specific SCAR markers and were screened for successful amplification of the target strain and absence of cross-reaction with other Azospirillum strains. The specificity of the primers thus selected was verified under real-time PCR conditions using genomic DNA from a strain collection and DNA from rhizosphere samples. The detection limit was 60 fg DNA with pure cultures and 4 × 10^3 (for UAP-154) and 4 × 10^4 CFU g^-1 (for CFN-535) in the maize rhizosphere. Inoculant quantification was effective from 10^4 to 10^8 CFU g^-1 soil. BOX-based SCAR markers were useful to find primers for strain-specific real-time PCR quantification of each A. brasilense inoculant in the maize rhizosphere. Effective root colonization is a prerequisite for successful Azospirillum phytostimulation, but cultivation-independent monitoring methods were lacking. The real-time PCR methods developed here will help in understanding the effect of environmental conditions on root colonization and phytostimulation by A. brasilense UAP-154 and CFN-535.
Ghedira, Rim; Papazova, Nina; Vuylsteke, Marnik; Ruttink, Tom; Taverniers, Isabel; De Loose, Marc
2009-10-28
GMO quantification, based on real-time PCR, relies on the amplification of an event-specific transgene assay and a species-specific reference assay. The uniformity of the nucleotide sequences targeted by both assays across various transgenic varieties is an important prerequisite for correct quantification. Single nucleotide polymorphisms (SNPs) frequently occur in the maize genome and might lead to nucleotide variation in regions used to design primers and probes for reference assays. Further, they may affect the annealing of the primer to the template and reduce the efficiency of DNA amplification. We assessed the effect of a minor DNA template modification, such as a single base pair mismatch in the primer attachment site, on real-time PCR quantification. A model system was used based on the introduction of artificial mismatches between the forward primer and the DNA template in the reference assay targeting the maize starch synthase (SSIIb) gene. The results show that the presence of a mismatch between the primer and the DNA template causes partial to complete failure of the amplification of the initial DNA template depending on the type and location of the nucleotide mismatch. With this study, we show that the presence of a primer/template mismatch affects the estimated total DNA quantity to a varying degree.
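The bias mechanism described above can be made concrete with the standard exponential amplification model N = N0·(1+E)^n: if a primer/template mismatch lowers the per-cycle efficiency E for the reference assay, the same input DNA appears to contain far less template. The efficiency values below are hypothetical, not measurements from the study:

```python
def apparent_fold_underestimate(n_cycles, e_ref=1.0, e_mm=0.85):
    """Fold underestimation of input DNA when a primer/template mismatch
    lowers per-cycle efficiency from e_ref to e_mm (values illustrative).
    Amplified quantity scales as (1+E)^n, so the same input looks
    ((1+e_ref)/(1+e_mm))^n fold smaller with the mismatched primer."""
    return ((1.0 + e_ref) / (1.0 + e_mm)) ** n_cycles

print(round(apparent_fold_underestimate(30), 1))  # -> 10.4
```

Even a modest efficiency drop (100% to 85% per cycle) compounds over 30 cycles into an order-of-magnitude error, which is why SNPs in the reference-assay primer sites distort the estimated total DNA quantity, and hence the GM percentage.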
Li, Xiang; Wang, Xiuxiu; Yang, Jielin; Liu, Yueming; He, Yuping; Pan, Liangwen
2014-05-16
To date, over 150 genetically modified (GM) crops are widely cultivated. To comply with regulations developed for genetically modified organisms (GMOs), including labeling policies, many detection methods for GMO identification and quantification have been developed. To detect unauthorized GM crop events entering or leaving China, we developed a novel quadruplex real-time PCR method for simultaneous detection and quantification of the GM cotton events GHB119 and T304-40 in cotton-derived products (based on the 5'-flanking sequence) and the insect-resistance gene Cry2Ae. The limit of detection was 10 copies for GHB119 and Cry2Ae and 25 copies for T304-40. The limit of quantification was 25 copies for GHB119 and Cry2Ae and 50 copies for T304-40. Moreover, low bias and acceptable standard deviation and relative standard deviation values were obtained in the quantification analysis of six blind samples containing different GHB119 and T304-40 ingredients. The developed quadruplex quantitative method could be used for quantitative detection of the two GM cotton events (GHB119 and T304-40) and the Cry2Ae gene ingredient in cotton-derived products.
Salvi, Sergio; D'Orso, Fabio; Morelli, Giorgio
2008-06-25
Many countries have introduced mandatory labeling requirements on foods derived from genetically modified organisms (GMOs). Real-time quantitative polymerase chain reaction (PCR) based upon the TaqMan probe chemistry has become the method mostly used to support these regulations; moreover, event-specific PCR is the preferred method in GMO detection because of its high specificity based on the flanking sequence of the exogenous integrant. The aim of this study was to evaluate the use of very short (eight-nucleotide long), locked nucleic acid (LNA) TaqMan probes in 5'-nuclease PCR assays for the detection and quantification of GMOs. Classic TaqMan and LNA TaqMan probes were compared for the analysis of the maize MON810 transgene. The performance of the two types of probes was tested on the maize endogenous reference gene hmga, the CaMV 35S promoter, and the hsp70/cryIA(b) construct as well as for the event-specific 5'-integration junction of MON810, using plasmids as standard reference molecules. The results of our study demonstrate that the LNA 5'-nuclease PCR assays represent a valid and reliable analytical system for the detection and quantification of transgenes. Application of very short LNA TaqMan probes to GMO quantification can simplify the design of 5'-nuclease assays.
Tugnoli, Alessandro; Khan, Faisal; Amyotte, Paul; Cozzani, Valerio
2008-12-15
The design of layout plans requires adequate assessment tools for the quantification of safety performance. The general focus of the present work is to introduce an inherent safety perspective at different points of the layout design process. In particular, index approaches for safety assessment and decision-making in the early stages of layout design are developed and discussed in this two-part contribution. Part 1 (accompanying paper) of the current work presents an integrated index approach for safety assessment of early plant layout. In the present paper (Part 2), an index for evaluation of the hazard related to the potential of domino effects is developed. The index considers the actual consequences of possible escalation scenarios and scores or ranks the subsequent accident propagation potential. The effects of inherent and passive protection measures are also assessed. The result is a rapid quantification of domino hazard potential that can provide substantial support for choices in the early stages of layout design. Additionally, a case study concerning selection among various layout options is presented and analyzed. The case study demonstrates the use and applicability of the indices developed in both parts of the current work and highlights the value of introducing inherent safety features early in layout design.
Zoutman, Willem H; Nell, Rogier J; Versluis, Mieke; van Steenderen, Debby; Lalai, Rajshri N; Out-Luiting, Jacoba J; de Lange, Mark J; Vermeer, Maarten H; Langerak, Anton W; van der Velden, Pieter A
2017-03-01
Quantifying T cells accurately in a variety of tissues of benign, inflammatory, or malignant origin can be of great importance in a variety of clinical applications. Flow cytometry and immunohistochemistry are considered to be the gold-standard methods for T-cell quantification. However, these methods require fresh, frozen, or fixated cells and tissue of a certain quality. In addition, conventional and droplet digital PCR (ddPCR), optionally followed by deep sequencing techniques, have been used to elucidate T-cell content by focusing on rearranged T-cell receptor (TCR) genes. These approaches typically target the whole TCR repertoire, thereby supplying additional information about TCR use. We alternatively developed and validated two novel generic single duplex ddPCR assays to quantify T cells accurately by measuring the loss of specific germline TCR loci, and compared them with flow cytometry-based quantification. These assays target sequences between the Dδ2 and Dδ3 genes (TRD locus) and the Dβ1 and Jβ1.1 genes (TRB locus) that become systematically deleted early during lymphoid differentiation. Because these ddPCR assays require small amounts of DNA instead of freshly isolated, frozen, or fixated material, initially unanalyzable (scarce) specimens can be assayed from now on, supplying valuable information about T-cell content. Our ddPCR method provides a novel and sensitive way of quantifying T cells that is relatively fast, accurate, and independent of the cellular context. Copyright © 2017 American Society for Investigative Pathology and the Association for Molecular Pathology. Published by Elsevier Inc. All rights reserved.
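The germline-loss logic above can be sketched numerically: each assay's copy number is Poisson-estimated from droplet counts, and the T-cell fraction follows from how far the germline TCR signal falls below a reference locus present in all cells. The droplet counts are hypothetical, and the sketch ignores mono- versus bi-allelic rearrangement corrections that a real assay would need:

```python
import math

def copies_per_droplet(positives, total):
    """Poisson-corrected mean copies per droplet: lambda = -ln(1 - p)."""
    return -math.log(1.0 - positives / total)

def t_cell_fraction(germline_pos, reference_pos, total_droplets):
    """Estimated T-cell fraction from loss of a germline TCR locus:
    cells that have rearranged the locus no longer contribute germline
    copies, so the germline/reference ratio drops below 1."""
    germline = copies_per_droplet(germline_pos, total_droplets)
    reference = copies_per_droplet(reference_pos, total_droplets)
    return 1.0 - germline / reference

# Hypothetical droplet counts from one duplex ddPCR well.
print(round(t_cell_fraction(3000, 4000, 15000), 2))  # -> 0.28
```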
Deciphering the Epigenetic Code: An Overview of DNA Methylation Analysis Methods
Umer, Muhammad
2013-01-01
Abstract Significance: Methylation of cytosine in DNA is linked with gene regulation, and this has profound implications in development, normal biology, and disease conditions in many eukaryotic organisms. A wide range of methods and approaches exist for its identification, quantification, and mapping within the genome. While the earliest approaches were nonspecific and were at best useful for quantification of total methylated cytosines in the chunk of DNA, this field has seen considerable progress and development over the past decades. Recent Advances: Methods for DNA methylation analysis differ in their coverage and sensitivity, and the method of choice depends on the intended application and desired level of information. Potential results include global methyl cytosine content, degree of methylation at specific loci, or genome-wide methylation maps. Introduction of more advanced approaches to DNA methylation analysis, such as microarray platforms and massively parallel sequencing, has brought us closer to unveiling the whole methylome. Critical Issues: Sensitive quantification of DNA methylation from degraded and minute quantities of DNA and high-throughput DNA methylation mapping of single cells still remain a challenge. Future Directions: Developments in DNA sequencing technologies as well as the methods for identification and mapping of 5-hydroxymethylcytosine are expected to augment our current understanding of epigenomics. Here we present an overview of methodologies available for DNA methylation analysis with special focus on recent developments in genome-wide and high-throughput methods. While the application focus relates to cancer research, the methods are equally relevant to broader issues of epigenetics and redox science in this special forum. Antioxid. Redox Signal. 18, 1972–1986. PMID:23121567
Relative quantification in seed GMO analysis: state of art and bottlenecks.
Chaouachi, Maher; Bérard, Aurélie; Saïd, Khaled
2013-06-01
Reliable quantitative methods are needed to comply with current EU regulations on the mandatory labeling of genetically modified organisms (GMOs) and GMO-derived food and feed products with a GMO content exceeding 0.9 %. EU Commission Recommendation 2004/787/EC on technical guidance for sampling and detection, meant as a helpful tool for the practical implementation of EC Regulation 1830/2003, states that "the results of quantitative analysis should be expressed as the number of target DNA sequences per target taxon specific sequences calculated in terms of haploid genomes". This has led to an intense debate on the type of calibrator best suited for GMO quantification. The main question addressed in this review is whether reference materials and calibrators should be matrix based or whether pure DNA analytes should be used for relative quantification in GMO analysis. The state of the art, including the advantages and drawbacks of using plasmid DNA (compared with genomic DNA reference materials) as a calibrator, is described in detail. In addition, the influence of the genetic structure of seeds on quantitative real-time PCR results obtained for seed lots is discussed. The specific composition of a seed kernel, the mode of inheritance, and the ploidy level mean that there is discordance between a GMO % expressed as haploid genome equivalents and a GMO % based on numbers of seeds. Consequently, a threshold fixed as a percentage of seeds cannot be used as such for real-time PCR. All critical points that affect the expression of the GMO content in seeds are discussed in this paper.
Genesis Failure Investigation Report
NASA Technical Reports Server (NTRS)
Klein, John
2004-01-01
The Genesis mission to collect solar-wind samples and return them to Earth for detailed analysis proceeded successfully for 3.5 years. During reentry on September 8, 2004, a failure in the entry, descent, and landing sequence resulted in a crash landing of the Genesis sample return capsule. This document describes the findings of the avionics sub-team that supported the accident investigation of the JPL Failure Review Board.
[Quantitative PCR in the diagnosis of Leishmania].
Mortarino, M; Franceschi, A; Mancianti, F; Bazzocchi, C; Genchi, C; Bandi, C
2004-06-01
Polymerase chain reaction (PCR) is a sensitive and rapid method for the diagnosis of canine Leishmania infection and can be performed on a variety of biological samples, including peripheral blood, lymph node, bone marrow and skin. Standard PCR requires electrophoretic analysis of the amplification products and is usually not suitable for quantification of the template DNA (unless competitor-based or other methods are developed), making it of reduced usefulness when accurate monitoring of target DNA is required. Quantitative real-time PCR allows the continuous monitoring of the accumulation of PCR products during the amplification reaction. This allows the identification of the cycle of near-logarithmic PCR product generation (threshold cycle) and, by inference, the relative quantification of the template DNA present at the start of the reaction. Since the amplification products are monitored in "real time" as they form, cycle by cycle, no post-amplification handling is required. Absolute quantification is performed with reference either to an internal standard co-amplified with the sample DNA, or to an external standard curve obtained by parallel amplification of serial known concentrations of a reference DNA sequence. From the quantification of the template DNA, an estimate of the relative load of parasites in the different samples can be obtained. The advantages compared with standard and semi-quantitative PCR techniques are reduced assay time and contamination risk, and improved sensitivity. As for standard PCR, the minimal components of the quantitative PCR reaction mixture are the DNA target of the amplification, an oligonucleotide primer pair flanking the target sequence, a suitable DNA polymerase, deoxynucleotides, buffer and salts. Different technologies have been developed for the monitoring of amplification products, generally based on the use of fluorescent probes.
For instance, SYBR Green technology is a non-specific detection system based on a fluorescent dsDNA intercalator and is applicable to all potential targets. TaqMan technology is more specific, since it directly assesses the amount of amplified DNA using a fluorescent probe specific for the target sequence flanked by the primer pair. This probe is an oligonucleotide labelled with a reporter dye (fluorescent) and a quencher (which absorbs the fluorescent signal generated by the reporter). The thermal protocol of amplification allows the fluorescent probe to bind the target sequence before the primers bind and polymerization by Taq polymerase begins. During polymerization, the 5'-3' exonuclease activity of Taq polymerase digests the probe, releasing the reporter dye so that a fluorescent signal is detected. The signal accumulates at the end of each cycle and is related to the amount of the amplification product. In recent years, quantitative PCR methods based on either SYBR Green or TaqMan technology have been set up for the quantification of Leishmania in mouse liver, mouse skin and human peripheral blood, targeting either single-copy chromosomal or multi-copy minicircle sequences with high sensitivity and reproducibility. In particular, real-time PCR seems to be a reliable, rapid and noninvasive method for the diagnosis and follow-up of visceral leishmaniasis in humans. At present, the application of real-time PCR to research and clinical diagnosis of Leishmania infection in dogs is still on the horizon. As for standard PCR, the high sensitivity of real-time PCR could allow the use of blood sampling, which is less invasive and easily performed, for monitoring the status of the dogs.
The development of a real-time PCR assay for Leishmania infantum infection in dogs could complement the standard and optimized serological and PCR methods currently in use for the diagnosis and follow-up of canine leishmaniasis, and perhaps allow prediction of recurrences associated with tissue loads of residual pathogens after treatment. In this regard, a TaqMan real-time PCR method developed for the quantification of Leishmania infantum minicircle DNA in peripheral blood of naturally infected dogs, sampled before and at different time points after the beginning of a standard antileishmanial therapy, will be illustrated.
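The standard-curve quantification described in this record can be sketched in a few lines. All copy numbers and Ct values below are assumed for illustration, not data from the assay discussed:

```python
import numpy as np

# Assumed standard curve: Ct values for serial dilutions of a reference
# DNA target (copy numbers and Cts are invented for illustration)
std_copies = np.array([1e2, 1e3, 1e4, 1e5, 1e6])
std_ct = np.array([31.1, 27.8, 24.4, 21.1, 17.7])

# Fit the standard curve: Ct = slope * log10(copies) + intercept
slope, intercept = np.polyfit(np.log10(std_copies), std_ct, 1)

# Amplification efficiency implied by the slope (100% corresponds to -3.32)
efficiency = 10.0 ** (-1.0 / slope) - 1.0

def copies_from_ct(ct):
    """Invert the standard curve to estimate starting template copies."""
    return 10.0 ** ((ct - intercept) / slope)

print(slope, efficiency, copies_from_ct(25.0))
```

A sample crossing threshold at cycle 25 then maps back to a starting copy number, from which a relative parasite load can be reported.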
Sources of PCR-induced distortions in high-throughput sequencing data sets
Kebschull, Justus M.; Zador, Anthony M.
2015-01-01
PCR permits the exponential and sequence-specific amplification of DNA, even from minute starting quantities. PCR is a fundamental step in preparing DNA samples for high-throughput sequencing. However, there are errors associated with PCR-mediated amplification. Here we examine the effects of four important sources of error—bias, stochasticity, template switches and polymerase errors—on sequence representation in low-input next-generation sequencing libraries. We designed a pool of diverse PCR amplicons with a defined structure, and then used Illumina sequencing to search for signatures of each process. We further developed quantitative models for each process, and compared predictions of these models to our experimental data. We find that PCR stochasticity is the major force skewing sequence representation after amplification of a pool of unique DNA amplicons. Polymerase errors become very common in later cycles of PCR but have little impact on the overall sequence distribution as they are confined to small copy numbers. PCR template switches are rare and confined to low copy numbers. Our results provide a theoretical basis for removing distortions from high-throughput sequencing data. In addition, our findings on PCR stochasticity will have particular relevance to quantification of results from single cell sequencing, in which sequences are represented by only one or a few molecules. PMID:26187991
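The central finding, that stochastic duplication in early cycles skews final copy numbers, can be illustrated with a toy Galton-Watson simulation (the efficiency, cycle count, and pool size are assumed values, not parameters from the study):

```python
import random

def pcr_copies(cycles, efficiency, rng):
    """Amplify one starting molecule: in each cycle every copy duplicates
    independently with probability `efficiency` (a Galton-Watson branching
    process, so early-cycle luck is inherited by all later cycles)."""
    n = 1
    for _ in range(cycles):
        n += sum(1 for _ in range(n) if rng.random() < efficiency)
    return n

rng = random.Random(0)
# 200 identical single molecules, 15 cycles at an assumed 80% efficiency
yields = sorted(pcr_copies(15, 0.8, rng) for _ in range(200))
spread = yields[-1] / yields[0]   # fold-difference caused purely by stochasticity
print(spread)
```

Even though every molecule starts identical, the final representation varies several-fold, mirroring the skew the paper attributes to PCR stochasticity in low-input libraries.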
Centrifuge: rapid and sensitive classification of metagenomic sequences
Song, Li; Breitwieser, Florian P.
2016-01-01
Centrifuge is a novel microbial classification engine that enables rapid, accurate, and sensitive labeling of reads and quantification of species on desktop computers. The system uses an indexing scheme based on the Burrows-Wheeler transform (BWT) and the Ferragina-Manzini (FM) index, optimized specifically for the metagenomic classification problem. Centrifuge requires a relatively small index (4.2 GB for 4078 bacterial and 200 archaeal genomes) and classifies sequences at very high speed, allowing it to process the millions of reads from a typical high-throughput DNA sequencing run within a few minutes. Together, these advances enable timely and accurate analysis of large metagenomics data sets on conventional desktop computers. Because of its space-optimized indexing schemes, Centrifuge also makes it possible to index the entire NCBI nonredundant nucleotide sequence database (a total of 109 billion bases) with an index size of 69 GB, in contrast to k-mer-based indexing schemes, which require far more extensive space. PMID:27852649
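The BWT underlying Centrifuge's FM-index can be illustrated with a deliberately naive sketch (quadratic and for exposition only; production indexers build the BWT from suffix arrays):

```python
def bwt(text):
    """Burrows-Wheeler transform via sorted rotations: append a sentinel,
    sort all cyclic rotations, and read off the last column."""
    s = text + "$"                                   # sentinel, sorts first
    rotations = sorted(s[i:] + s[:i] for i in range(len(s)))
    return "".join(rot[-1] for rot in rotations)

# The transform groups identical characters into runs, which is what makes
# the FM-index both highly compressible and searchable via LF-mapping.
print(bwt("banana"))   # -> annb$aa
```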
Mapping RNA-seq Reads with STAR
Dobin, Alexander; Gingeras, Thomas R.
2015-01-01
Mapping of large sets of high-throughput sequencing reads to a reference genome is one of the foundational steps in RNA-seq data analysis. The STAR software package performs this task with high levels of accuracy and speed. In addition to detecting annotated and novel splice junctions, STAR is capable of discovering more complex RNA sequence arrangements, such as chimeric and circular RNA. STAR can align spliced sequences of any length with moderate error rates providing scalability for emerging sequencing technologies. STAR generates output files that can be used for many downstream analyses such as transcript/gene expression quantification, differential gene expression, novel isoform reconstruction, signal visualization, and so forth. In this unit we describe computational protocols that produce various output files, use different RNA-seq datatypes, and utilize different mapping strategies. STAR is Open Source software that can be run on Unix, Linux or Mac OS X systems. PMID:26334920
Mapping RNA-seq Reads with STAR.
Dobin, Alexander; Gingeras, Thomas R
2015-09-03
Mapping of large sets of high-throughput sequencing reads to a reference genome is one of the foundational steps in RNA-seq data analysis. The STAR software package performs this task with high levels of accuracy and speed. In addition to detecting annotated and novel splice junctions, STAR is capable of discovering more complex RNA sequence arrangements, such as chimeric and circular RNA. STAR can align spliced sequences of any length with moderate error rates, providing scalability for emerging sequencing technologies. STAR generates output files that can be used for many downstream analyses such as transcript/gene expression quantification, differential gene expression, novel isoform reconstruction, and signal visualization. In this unit, we describe computational protocols that produce various output files, use different RNA-seq datatypes, and utilize different mapping strategies. STAR is open source software that can be run on Unix, Linux, or Mac OS X systems. Copyright © 2015 John Wiley & Sons, Inc.
Accounting for uncertainty in DNA sequencing data.
O'Rawe, Jason A; Ferson, Scott; Lyon, Gholson J
2015-02-01
Science is defined in part by an honest exposition of the uncertainties that arise in measurements and propagate through calculations and inferences, so that the reliabilities of its conclusions are made apparent. The recent rapid development of high-throughput DNA sequencing technologies has dramatically increased the number of measurements made at the biochemical and molecular level. These data come from many different DNA-sequencing technologies, each with their own platform-specific errors and biases, which vary widely. Several statistical studies have tried to measure error rates for basic determinations, but there are no general schemes to project these uncertainties so as to assess the surety of the conclusions drawn about genetic, epigenetic, and more general biological questions. We review here the state of uncertainty quantification in DNA sequencing applications, describe sources of error, and propose methods that can be used for accounting and propagating these errors and their uncertainties through subsequent calculations. Copyright © 2014 Elsevier Ltd. All rights reserved.
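As a minimal example of the per-base uncertainties discussed here, Phred quality scores map directly to error probabilities, and summing them gives the expected number of miscalls in a read (the read length and scores below are hypothetical):

```python
def phred_to_error_prob(q):
    """Probability that a base call with Phred score q is wrong: 10^(-q/10)."""
    return 10 ** (-q / 10)

def expected_errors(quality_scores):
    """Expected number of miscalled bases in a read: the sum of per-base
    error probabilities (linearity of expectation)."""
    return sum(phred_to_error_prob(q) for q in quality_scores)

# Hypothetical 10-base read with quality decaying toward the 3' end
read_quals = [40, 40, 38, 35, 33, 30, 28, 25, 22, 20]
print(expected_errors(read_quals))
```

Propagating such per-base probabilities into downstream calls, rather than discarding them after filtering, is one concrete form of the uncertainty accounting the review advocates.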
Yoshimura, Tomoaki; Kuribara, Hideo; Matsuoka, Takeshi; Kodama, Takashi; Iida, Mayu; Watanabe, Takahiro; Akiyama, Hiroshi; Maitani, Tamio; Furui, Satoshi; Hino, Akihiro
2005-03-23
The applicability of quantifying genetically modified (GM) maize and soy in processed foods was investigated using heat treatment processing models. The detection methods were based on real-time quantitative polymerase chain reaction (PCR) analysis. Ground seeds of insect-resistant GM maize (MON810) and glyphosate-tolerant Roundup Ready (RR) soy were dissolved in water and heat treated by autoclaving for various time intervals. The calculated copy numbers of the recombinant and taxon-specific deoxyribonucleic acid (DNA) sequences in the extracted DNA solution were found to decrease with time. This decrease was influenced by the size of the PCR product. The conversion factor (Cf), which is the ratio of the recombinant DNA sequence to the taxon-specific DNA sequence and is used as a constant for calculating the GM% of each event, tended to be stable when the PCR products of the two DNA sequences were nearly equal in size. These results suggested that the size of the PCR product plays a key role in the quantification of GM organisms in processed foods. The Cf of the endosperm (3n) is believed to be influenced by whether the GM trait originated from a paternal or maternal source. The embryos and endosperms were separated from the F1 generation seeds of five GM maize events, covering both paternally and maternally derived traits, and their Cf values were measured. In the paternally derived events, the endosperm Cf was lower than that of the embryo, whereas in the maternally derived events, the embryo Cf was lower than that of the endosperm. These results demonstrate the difficulties encountered in the determination of GM% in maize grains (F2 generation) and in processed foods made from maize and soy.
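The Cf arithmetic described above reduces to two ratios. The copy numbers below are invented for illustration only; real Cf values are event-specific and must be measured experimentally, which is precisely the difficulty the study highlights:

```python
def conversion_factor(recombinant_copies, taxon_copies):
    """Cf: ratio of recombinant to taxon-specific target copies, measured
    on a reference material for each GM event."""
    return recombinant_copies / taxon_copies

def gm_percent(sample_recombinant, sample_taxon, cf_reference):
    """GM% on a haploid-genome basis: the sample's copy-number ratio
    normalised by the event's reference Cf."""
    return 100.0 * (sample_recombinant / sample_taxon) / cf_reference

cf = conversion_factor(4.6e4, 9.2e4)    # assumed reference run, Cf = 0.5
print(gm_percent(2.3e2, 9.2e4, cf))     # approx. 0.5% haploid genome equivalents
```

If heat treatment degrades the two amplicons at different rates (as the study observed for unequal product sizes), the measured ratio, and hence the reported GM%, drifts even though the true content is unchanged.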
Optimisation of DNA extraction from the crustacean Daphnia
Athanasio, Camila Gonçalves; Chipman, James K.; Viant, Mark R.
2016-01-01
Daphnia are key model organisms for mechanistic studies of phenotypic plasticity, adaptation and microevolution, which have led to an increasing demand for genomics resources. A key step in any genomics analysis, such as high-throughput sequencing, is the availability of sufficient high-quality DNA. Although commercial kits exist to extract genomic DNA from several species, preparation of high-quality DNA from Daphnia spp. and other chitinous species can be challenging. Here, we optimise methods for tissue homogenisation, DNA extraction and quantification customised for different downstream analyses (e.g., LC-MS/MS, HiSeq, mate pair sequencing or Nanopore). We demonstrate that if Daphnia magna are homogenised as whole animals (including the carapace), absorbance-based DNA quantification methods significantly over-estimate the amount of DNA, resulting in insufficient starting material being used for experiments such as preparation of sequencing libraries. This is attributed to the high refractive index of chitin in Daphnia's carapace at 260 nm. Therefore, unless the carapace is removed by overnight proteinase digestion, the extracted DNA should be quantified with fluorescence-based methods. However, overnight proteinase digestion results in partial fragmentation of DNA; the prepared DNA is therefore not suitable for downstream methods that require high molecular weight DNA, such as PacBio, mate pair sequencing and Nanopore. In conclusion, we found that the MasterPure DNA purification kit, coupled with grinding of frozen tissue, is the best method for extraction of high molecular weight DNA, as long as the extracted DNA is quantified with fluorescence-based methods. This method generated high yields of high molecular weight DNA (3.10 ± 0.63 ng/µg dry mass, fragments >60 kb), free of organic contaminants (phenol, chloroform), and is suitable for a large number of downstream analyses. PMID:27190714
Fuller, Nicholas J.; Wilson, William H.; Joint, Ian R.; Mann, Nicholas H.
1998-01-01
Viruses are ubiquitous components of marine ecosystems and are known to infect unicellular phycoerythrin-containing cyanobacteria belonging to the genus Synechococcus. A conserved region from the cyanophage genome was identified in three genetically distinct cyanomyoviruses, and a sequence analysis revealed that this region exhibited significant similarity to a gene encoding a capsid assembly protein (gp20) from the enteric coliphage T4. The results of a comparison of gene 20 sequences from three cyanomyoviruses and T4 allowed us to design two degenerate PCR primers, CPS1 and CPS2, which specifically amplified a 165-bp region from the majority of cyanomyoviruses tested. A competitive PCR (cPCR) analysis revealed that cyanomyovirus strains could be accurately enumerated, and it was demonstrated that quantification was log-linear over ca. 3 orders of magnitude. Different calibration curves were obtained for each of the three cyanomyovirus strains tested; consequently, cPCR performed with primers CPS1 and CPS2 could lead to substantial inaccuracies in estimates of phage abundance in natural assemblages. Further sequence analysis of cyanomyovirus gene 20 homologs would be necessary in order to design primers which do not exhibit phage-to-phage variability in priming efficiency. It was demonstrated that PCR products of the correct size could be amplified from seawater samples following 100× concentration and even directly without any prior concentration. Hence, the use of degenerate primers in PCR analyses of cyanophage populations should provide valuable data on the diversity of cyanophages in natural assemblages. Further optimization of procedures may ultimately lead to a sensitive assay which can be used to analyze natural cyanophage populations both quantitatively (by cPCR) and qualitatively following phylogenetic analysis of amplified products. PMID:9603813
NASA Astrophysics Data System (ADS)
Sheynkman, Gloria M.; Shortreed, Michael R.; Cesnik, Anthony J.; Smith, Lloyd M.
2016-06-01
Mass spectrometry-based proteomics has emerged as the leading method for detection, quantification, and characterization of proteins. Nearly all proteomic workflows rely on proteomic databases to identify peptides and proteins, but these databases typically contain a generic set of proteins that lack variations unique to a given sample, precluding their detection. Fortunately, proteogenomics enables the detection of such proteomic variations and can be defined, broadly, as the use of nucleotide sequences to generate candidate protein sequences for mass spectrometry database searching. Proteogenomics is experiencing heightened significance due to two developments: (a) advances in DNA sequencing technologies that have made complete sequencing of human genomes and transcriptomes routine, and (b) the unveiling of the tremendous complexity of the human proteome as expressed at the levels of genes, cells, tissues, individuals, and populations. We review here the field of human proteogenomics, with an emphasis on its history, current implementations, the types of proteomic variations it reveals, and several important applications.
Dasa, Siva Sai Krishna; Kelly, Kimberly A.
2016-01-01
Next-generation sequencing has enhanced the phage display process, allowing for the quantification of millions of sequences resulting from the biopanning process. In response, many valuable analysis programs focused on specificity and finding targeted motifs or consensus sequences were developed. For targeted drug delivery and molecular imaging, it is also necessary to find peptides that are selective—targeting only the cell type or tissue of interest. We present a new analysis strategy and accompanying software, PHage Analysis for Selective Targeted PEPtides (PHASTpep), which identifies highly specific and selective peptides. Using this process, we discovered and validated, both in vitro and in vivo in mice, two sequences (HTTIPKV and APPIMSV) targeted to pancreatic cancer-associated fibroblasts that escaped identification using previously existing software. Our selectivity analysis makes it possible to discover peptides that target a specific cell type and avoid other cell types, enhancing clinical translatability by circumventing complications with systemic use. PMID:27186887
Sheynkman, Gloria M.; Shortreed, Michael R.; Cesnik, Anthony J.; Smith, Lloyd M.
2016-01-01
Mass spectrometry–based proteomics has emerged as the leading method for detection, quantification, and characterization of proteins. Nearly all proteomic workflows rely on proteomic databases to identify peptides and proteins, but these databases typically contain a generic set of proteins that lack variations unique to a given sample, precluding their detection. Fortunately, proteogenomics enables the detection of such proteomic variations and can be defined, broadly, as the use of nucleotide sequences to generate candidate protein sequences for mass spectrometry database searching. Proteogenomics is experiencing heightened significance due to two developments: (a) advances in DNA sequencing technologies that have made complete sequencing of human genomes and transcriptomes routine, and (b) the unveiling of the tremendous complexity of the human proteome as expressed at the levels of genes, cells, tissues, individuals, and populations. We review here the field of human proteogenomics, with an emphasis on its history, current implementations, the types of proteomic variations it reveals, and several important applications. PMID:27049631
Pulsed arterial spin labeling using TurboFLASH with suppression of intravascular signal.
Pell, Gaby S; Lewis, David P; Branch, Craig A
2003-02-01
Accurate quantification of perfusion with ASL techniques requires suppression of the majority of the intravascular signal. This is normally achieved with the use of diffusion gradients. The TurboFLASH sequence, with its ultrashort repetition times, is not readily amenable to this scheme. This report demonstrates the implementation of a modified TurboFLASH sequence for FAIR imaging. Intravascular suppression is achieved with a modified preparation period that includes a driven equilibrium Fourier transform (DEFT) combination of 90°-180°-90° hard RF pulses subsequent to the inversion delay. These pulses rotate the perfusion-prepared magnetization into the transverse plane, where it can experience the suitably placed diffusion gradients before being returned to the longitudinal direction by the second 90° pulse. A value of b = 20-30 s/mm² was thereby found to suppress the majority of the intravascular signal. For single-slice perfusion imaging, quantification is only slightly modified. The technique can be readily extended to multislice acquisition if the evolving flow signal after the DEFT preparation is taken into account. An advantage of the modified preparation scheme is evident in the multislice FAIR images in the preservation of the sign of the magnetization difference. Copyright 2003 Wiley-Liss, Inc.
Li, Ningzhi; An, Li; Johnson, Christopher; Shen, Jun
2017-01-01
Due to imperfect slice profiles, unwanted signals from outside the selected voxel may significantly contaminate metabolite signals acquired using in vivo magnetic resonance spectroscopy (MRS). The use of outer volume suppression may exceed the SAR threshold, especially at high field. We propose using phase-encoding gradients after radiofrequency (RF) excitation to spatially encode unwanted signals originating from outside of the selected single voxel. Phase-encoding gradients were added to a standard single-voxel point-resolved spectroscopy (PRESS) sequence selecting a 2 × 2 × 2 cm³ voxel. A subsequent spatial Fourier transform was used to encode outer volume signals. Phantom and in vivo experiments were performed using both phase-encoded PRESS and standard PRESS at 7 Tesla. Quantification was performed using fitting software developed in-house. Both phantom and in vivo studies showed that spectra from the phase-encoded PRESS sequence were relatively immune to contamination by oil signals and gave more accurate quantification results than standard PRESS spectra of the same voxel. The proposed phase-encoded single-voxel PRESS method can significantly suppress outer volume signals that may appear in the spectra of standard PRESS, without increasing RF power deposition.
DIFFUSION-WEIGHTED IMAGING OF THE LIVER: TECHNIQUES AND APPLICATIONS
Lewis, Sara; Dyvorne, Hadrien; Cui, Yong; Taouli, Bachir
2014-01-01
Diffusion weighted MRI (DWI) is a technique that assesses the cellularity, tortuosity of the extracellular/extravascular space, and cell membrane density based upon differences in water proton mobility in tissues. The strength of the diffusion weighting is reflected by the b-value. DWI using several b-values enables quantification of the apparent diffusion coefficient (ADC). DWI is increasingly employed in liver imaging for multiple reasons: it can add useful qualitative and quantitative information to conventional imaging sequences, it is acquired relatively quickly, it is easily incorporated into existing clinical protocols, and it is a non-contrast technique. DWI is useful for focal liver lesion detection and characterization, for the assessment of post-treatment tumor response, and for evaluation of diffuse liver disease. ADC quantification can be used to characterize lesions as cystic/necrotic or solid and to predict tumor response to therapy. Advanced diffusion methods such as intravoxel incoherent motion (IVIM) may have potential for detection, staging, and evaluation of the progression of liver fibrosis and for liver lesion characterization. The lack of standardization of DWI technique, including the choice of b-values and sequence parameters, has somewhat limited its widespread adoption. PMID:25086935
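ADC quantification from multi-b-value DWI amounts to fitting a mono-exponential decay. A sketch with synthetic signals follows; the b-values and ADC are plausible orders of magnitude assumed for illustration, not values from the cited work:

```python
import numpy as np

# Synthetic mono-exponential DWI decay, S(b) = S0 * exp(-b * ADC).
# b-values in s/mm^2; the ADC is a typical soft-tissue order of magnitude.
b_values = np.array([0.0, 100.0, 400.0, 800.0])
true_adc = 1.2e-3                      # mm^2/s
s0 = 1000.0
signal = s0 * np.exp(-b_values * true_adc)

# The ADC is recovered as minus the slope of ln(S) against b
slope, log_s0 = np.polyfit(b_values, np.log(signal), 1)
adc = -slope
print(adc)
```

With noisy clinical data the same log-linear fit is applied voxel-wise; IVIM instead fits a bi-exponential model to separate perfusion from true diffusion.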
Yang, Jian-Yi; Peng, Zhen-Ling; Yu, Zu-Guo; Zhang, Rui-Jie; Anh, Vo; Wang, Desheng
2009-04-21
In this paper, we predict protein structural classes (alpha, beta, alpha+beta, or alpha/beta) for low-homology data sets. Two widely used data sets were studied: 1189 (containing 1092 proteins) and 25PDB (containing 1673 proteins), with sequence homology of 40% and 25%, respectively. We propose to decompose the chaos game representation of proteins into two kinds of time series. Then a powerful nonlinear analysis technique, recurrence quantification analysis (RQA), is applied to these time series. For a given protein sequence, a total of 16 characteristic parameters can be calculated with RQA; these are treated as a feature representation of the protein sequence. Based on this feature representation, the structural class of each protein is predicted with Fisher's linear discriminant algorithm. The jackknife test is used to compare our method with other existing methods. The overall accuracies with the step-by-step procedure are 65.8% and 64.2% for the 1189 and 25PDB data sets, respectively. With the widely used one-against-others procedure, we compare our method with five existing methods; the overall accuracies of our method are 6.3% and 4.1% higher for the two data sets, respectively. Furthermore, only 16 parameters are used in our method, fewer than in other methods. This suggests that the current method may play a complementary role to existing methods and is promising for the prediction of protein structural classes.
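The simplest RQA parameter, the recurrence rate, can be computed directly from a recurrence matrix. The toy series below merely stands in for the CGR-derived series used in the paper:

```python
import numpy as np

def recurrence_matrix(series, radius):
    """Binary recurrence matrix: R[i, j] = 1 when points i and j of the
    series lie within `radius` of each other."""
    x = np.asarray(series, dtype=float)
    return (np.abs(x[:, None] - x[None, :]) <= radius).astype(int)

def recurrence_rate(R):
    """RR, the simplest RQA parameter: the density of recurrent points,
    excluding the trivially recurrent main diagonal."""
    n = R.shape[0]
    return (R.sum() - n) / (n * n - n)

# Toy series standing in for one of the CGR-derived series used in the paper
series = [0.1, 0.9, 0.2, 0.8, 0.15, 0.85, 0.2]
R = recurrence_matrix(series, radius=0.1)
print(recurrence_rate(R))
```

The other RQA parameters (determinism, laminarity, entropy, and so on) are statistics of the diagonal and vertical line structures in this same matrix.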
Gillot, Guillaume; Jany, Jean-Luc; Dominguez-Santos, Rebeca; Poirier, Elisabeth; Debaets, Stella; Hidalgo, Pedro I; Ullán, Ricardo V; Coton, Emmanuel; Coton, Monika
2017-04-01
Mycophenolic acid (MPA) is a secondary metabolite produced by various Penicillium species including Penicillium roqueforti. The MPA biosynthetic pathway was recently described in Penicillium brevicompactum. In this study, an in silico analysis of the P. roqueforti FM164 genome sequence localized a 23.5-kb putative MPA gene cluster. The cluster contains seven genes putatively encoding seven proteins (MpaA, MpaB, MpaC, MpaDE, MpaF, MpaG, MpaH) and is highly similar (i.e., gene synteny, sequence homology) to the P. brevicompactum cluster. To confirm the involvement of this gene cluster in MPA biosynthesis, gene silencing using RNA interference targeting mpaC, encoding a putative polyketide synthase, was performed in a high MPA-producing P. roqueforti strain (F43-1). In the obtained transformants, decreased MPA production (measured by LC-Q-TOF/MS) was correlated with reduced mpaC gene expression by Q-RT-PCR. In parallel, mycotoxin quantification in multiple P. roqueforti strains suggested strain-dependent MPA production. Thus, the entire MPA cluster was sequenced for P. roqueforti strains with contrasting MPA production, and a 174-bp deletion in mpaC was observed in low MPA producers. PCRs directed towards the deleted region among 55 strains showed an excellent correlation with MPA quantification. Our results indicate the clear involvement of the mpaC gene, as well as the surrounding cluster, in P. roqueforti MPA biosynthesis. Copyright © 2016 Elsevier Ltd. All rights reserved.
Holzhauser, Thomas; Kleiner, Kornelia; Janise, Annabella; Röder, Martin
2014-11-15
A novel method to quantify species or DNA on the basis of a competitive quantitative real-time polymerase chain reaction (cqPCR) was developed, with potentially allergenic peanut in food serving as one example. Based on an internal competitive DNA sequence for normalisation of DNA extraction and amplification, the cqPCR was threshold-calibrated against 100 mg/kg incurred peanut in milk chocolate. No external standards were necessary. The competitive molecule successfully served as a calibrator for quantification, matrix normalisation, and inhibition control. Although designed for verification of a virtual threshold of 100 mg/kg, the method allowed quantification of 10-1,000 mg/kg peanut incurred in various food matrices without further matrix adaptation: on the basis of four PCR replicates per sample, the mean recovery of 10-1,000 mg/kg peanut in chocolate, vanilla ice cream, cookie dough, cookie, and muesli was 87% (range: 39-147%), in comparison to 199% (range: 114-237%) for three commercial ELISA kits. Copyright © 2014 Elsevier Ltd. All rights reserved.
Emwas, Abdul-Hamid; Roy, Raja; McKay, Ryan T; Ryan, Danielle; Brennan, Lorraine; Tenori, Leonardo; Luchinat, Claudio; Gao, Xin; Zeri, Ana Carolina; Gowda, G A Nagana; Raftery, Daniel; Steinbeck, Christoph; Salek, Reza M; Wishart, David S
2016-02-05
NMR-based metabolomics has shown considerable promise in disease diagnosis and biomarker discovery because it allows one to nondestructively identify and quantify large numbers of novel metabolite biomarkers in both biofluids and tissues. Precise metabolite quantification is a prerequisite to move any chemical biomarker or biomarker panel from the lab to the clinic. Among the biofluids commonly used for disease diagnosis and prognosis, urine has several advantages. It is abundant, sterile, and easily obtained, needs little sample preparation, and does not require invasive medical procedures for collection. Furthermore, urine captures and concentrates many "unwanted" or "undesirable" compounds throughout the body, providing a rich source of potentially useful disease biomarkers; however, incredible variation in urine chemical concentrations makes analysis of urine and identification of useful urinary biomarkers by NMR challenging. We discuss a number of the most significant issues regarding NMR-based urinary metabolomics with specific emphasis on metabolite quantification for disease biomarker applications and propose data collection and instrumental recommendations regarding NMR pulse sequences, acceptable acquisition parameter ranges, relaxation effects on quantitation, proper handling of instrumental differences, sample preparation, and biomarker assessment.
Integrative analysis with ChIP-seq advances the limits of transcript quantification from RNA-seq
Liu, Peng; Sanalkumar, Rajendran; Bresnick, Emery H.; Keleş, Sündüz; Dewey, Colin N.
2016-01-01
RNA-seq is currently the technology of choice for global measurement of transcript abundances in cells. Despite its successes, isoform-level quantification remains difficult because short RNA-seq reads are often compatible with multiple alternatively spliced isoforms. Existing methods rely heavily on uniquely mapping reads, which are not available for numerous isoforms that lack regions of unique sequence. To improve quantification accuracy in such difficult cases, we developed a novel computational method, prior-enhanced RSEM (pRSEM), which uses a complementary data type in addition to RNA-seq data. We found that ChIP-seq data of RNA polymerase II and histone modifications were particularly informative in this approach. In qRT-PCR validations, pRSEM was shown to be superior to competing methods in estimating relative isoform abundances within or across conditions. Data-driven simulations suggested that pRSEM has a greatly decreased false-positive rate at the expense of a small increase in false-negative rate. In aggregate, our study demonstrates that pRSEM transforms existing capacity to precisely estimate transcript abundances, especially at the isoform level. PMID:27405803
Nuriel, Tal; Deeb, Ruba S.; Hajjar, David P.; Gross, Steven S.
2008-01-01
Nitration of tyrosine residues by nitric oxide (NO)-derived species results in the accumulation of 3-nitrotyrosine in proteins, a hallmark of nitrosative stress in cells and tissues. Tyrosine nitration is recognized as one of the multiple signaling modalities used by NO-derived species for the regulation of protein structure and function in health and disease. Various methods have been described for the quantification of protein 3-nitrotyrosine residues, and several strategies have been presented toward the goal of proteome-wide identification of protein tyrosine modification sites. This chapter details a useful protocol for the quantification of 3-nitrotyrosine in cells and tissues using high-pressure liquid chromatography with electrochemical detection. Additionally, this chapter describes a novel biotin-tagging strategy for specific enrichment of 3-nitrotyrosine-containing peptides. Application of this strategy, in conjunction with high-throughput MS/MS-based peptide sequencing, is anticipated to fuel efforts in developing comprehensive inventories of nitrosative stress-induced protein-tyrosine modification sites in cells and tissues. PMID:18554526
Gautam, Aarti; Kumar, Raina; Dimitrov, George; Hoke, Allison; Hammamieh, Rasha; Jett, Marti
2016-10-01
miRNAs act as important regulators of gene expression by promoting mRNA degradation or by attenuating protein translation. Since miRNAs are stably expressed in bodily fluids, there is growing interest in profiling them, as serum is a minimally invasive and cost-effective diagnostic matrix. A technical hurdle in studying miRNA dynamics is reliably extracting miRNA, as small sample volumes and low RNA abundance create challenges for extraction and downstream applications. The purpose of this study was to develop a pipeline for the recovery of miRNA from small volumes of archived serum samples. RNA was extracted using several widely utilized RNA isolation kits/methods, with and without the addition of a carrier. Small RNA library preparation was carried out using the Illumina TruSeq small RNA kit, and sequencing was carried out on the Illumina platform. Five microliters of total RNA was used for library preparation, as quantification was below the detection limit. We were able to profile serum miRNA levels with all the methods tested. We found that the addition of nucleic acid-based carrier molecules yielded higher numbers of processed reads but did not enhance mapping to miRBase-annotated sequences. However, some extraction procedures offer certain advantages: RNA extracted with TRIzol aligned to miRBase best, and extractions using TRIzol with carrier yielded higher miRNA-to-small RNA ratios. Nuclease-free glycogen can be the carrier of choice for miRNA sequencing. Our findings illustrate that miRNA extraction and quantification are influenced by the choice of methodology. The addition of nucleic acid-based carrier molecules during extraction is not a good choice when assaying miRNA by sequencing. Careful selection of an extraction method permits archived serum samples to become valuable resources for high-throughput applications.
Hennebique, Aurélie; Bidart, Marie; Jarraud, Sophie; Beraud, Laëtitia; Schwebel, Carole; Maurin, Max; Boisset, Sandrine
2017-09-01
The emergence of fluoroquinolone (FQ)-resistant mutants of Legionella pneumophila in infected humans was previously reported using a next-generation DNA sequencing (NGS) approach. This finding could explain part of the therapeutic failures observed in legionellosis patients treated with these antibiotics. The aim of this study was to develop digital PCR (dPCR) assays allowing rapid and accurate detection and quantification of these resistant mutants in respiratory samples, especially when the proportion of mutants in a wild-type background is low. We designed three dPCRgyrA assays to detect and differentiate the wild type and one of the three gyrA mutations previously described as associated with FQ resistance in L. pneumophila: at positions 248C→T (T83I), 259G→A (D87N), and 259G→C (D87H). To assess the performance of these assays, mixtures of FQ-resistant and -susceptible strains of L. pneumophila were analyzed, and the results were compared with those obtained with Sanger DNA sequencing and real-time quantitative PCR (qPCR). The dPCRgyrA assays were able to detect mutated gyrA sequences in the presence of wild-type sequences at resistant/susceptible allele ratios as low as 1:1,000. By comparison, Sanger DNA sequencing and qPCR were less sensitive, allowing the detection of gyrA mutants only down to 1:1 and 1:10 ratios, respectively. When testing 38 respiratory samples from 23 legionellosis patients (69.6% treated with an FQ), dPCRgyrA detected small amounts of gyrA mutants in four (10.5%) samples from three (13.0%) patients. These results demonstrate that dPCR is a highly sensitive alternative for quantifying FQ resistance in L. pneumophila and could be used in clinical practice to identify patients who may be at higher risk of therapeutic failure. Copyright © 2017 American Society for Microbiology.
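The minority-allele arithmetic behind such dPCR assays can be sketched with the standard Poisson correction: from the fraction of positive partitions one recovers the mean copies per partition, and the mutant fraction follows from the mutant and wild-type estimates. The partition counts below are invented for illustration and are not from the study.

```python
import math

def dpcr_copies(positives, partitions):
    """Poisson-corrected copy estimate for one dPCR target.

    lambda = -ln(1 - k/n) is the mean copies per partition implied by
    k positive partitions out of n; total copies ~ lambda * n.
    """
    if positives >= partitions:
        raise ValueError("saturated assay: every partition is positive")
    lam = -math.log(1.0 - positives / partitions)
    return lam * partitions

# Hypothetical run: 20,000 partitions, mutant-specific and wild-type probes.
mut = dpcr_copies(positives=30, partitions=20000)
wt = dpcr_copies(positives=19000, partitions=20000)
print(f"resistant-allele fraction: {mut / (mut + wt):.4%}")
```

Note how the Poisson correction matters for the nearly saturated wild-type channel (19,000/20,000 positive implies about 3 copies per partition, not 0.95), while the rare mutant channel is essentially linear.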
Taverniers, Isabel; Van Bockstaele, Erik; De Loose, Marc
2004-03-01
Analytical real-time PCR technology is a powerful tool for implementation of the GMO labeling regulations enforced in the EU. The quality of analytical measurement data obtained by quantitative real-time PCR depends on the correct use of calibrator and reference materials (RMs). For GMO methods of analysis, the choice of appropriate RMs is currently under debate. So far, genomic DNA solutions from certified reference materials (CRMs) are most often used as calibrators for GMO quantification by means of real-time PCR. However, due to some intrinsic features of these CRMs, errors may be expected in the estimation of DNA sequence quantities. In this paper, two new real-time PCR methods are presented for Roundup Ready soybean, in which two types of plasmid DNA fragments are used as calibrators. Single-target plasmids (STPs) diluted in a background of genomic DNA were used in the first method. Multiple-target plasmids (MTPs) containing both sequences in one molecule were used as calibrators for the second method. Both methods simultaneously detect a promoter 35S sequence as the GMO-specific target and a lectin gene sequence as the endogenous reference target in a duplex PCR. For the estimation of relative GMO percentages, both "delta C(T)" and "standard curve" approaches are tested. Delta C(T) methods are based on direct comparison of the measured C(T) values of the GMO-specific target and the endogenous target. Standard curve methods measure absolute amounts of target copies or haploid genome equivalents. A duplex delta C(T) method with STP calibrators performed at least as well as a similar method with genomic DNA calibrators from commercial CRMs. In addition, high-quality results were obtained with a standard curve method using MTP calibrators. This paper demonstrates that plasmid DNA molecules containing either one or multiple target sequences are excellent alternative calibrators for GMO quantification and are especially suitable for duplex PCR reactions.
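The delta-C(T) comparison can be sketched as follows, assuming equal amplification efficiencies for the 35S and lectin targets and single-copy targets per haploid genome; the function name and the example C(T) values are illustrative, not taken from the paper.

```python
def gmo_percent(ct_gmo, ct_ref, efficiency=2.0):
    """Relative GMO content from a duplex delta-C(T) measurement.

    With equal efficiencies E, each extra cycle between the GMO-specific
    and reference targets corresponds to an E-fold lower relative amount:
    GMO% = 100 * E^-(C(T)_GMO - C(T)_ref). Illustrative model only.
    """
    return 100.0 * efficiency ** -(ct_gmo - ct_ref)

# A 35S target crossing threshold ~6.64 cycles after lectin corresponds
# to roughly 1% GMO, since 2**-6.64 ≈ 0.01.
print(round(gmo_percent(ct_gmo=30.64, ct_ref=24.0), 2))  # → 1.0
```

A standard-curve approach would instead convert each C(T) to absolute copy numbers via a dilution series of the calibrator and take the ratio of the two copy numbers.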
R/S analysis of reaction time in Neuron Type Test for human activity in civil aviation
NASA Astrophysics Data System (ADS)
Zhang, Hong-Yan; Kang, Ming-Cui; Li, Jing-Qiang; Liu, Hai-Tao
2017-03-01
Human factors have become the most serious cause of accidents in civil aviation, which motivates the design and analysis of the Neuron Type Test (NTT) system to explore the intrinsic properties and patterns behind the behaviors of professionals and students in civil aviation. In the experiment, normal practitioners' reaction time sequences collected from the NTT approximately follow a log-normal distribution. We apply the χ2 test to assess goodness-of-fit after transforming the time sequences with the Box-Cox transformation, in order to cluster practitioners. The long-term correlation of each individual practitioner's time sequence is characterized by the Hurst exponent obtained via Rescaled Range Analysis, also known as Range/Standard deviation (R/S) analysis. Differences in Hurst exponents suggest the existence of different collective behaviors and different intrinsic patterns of human factors in civil aviation.
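The rescaled-range procedure behind the Hurst exponent can be sketched as below. This is the textbook R/S estimator with dyadic window sizes; the window sizes and the white-noise test series are chosen for illustration and are not drawn from the NTT data.

```python
import numpy as np

def rs_hurst(x, min_chunk=8):
    """Estimate the Hurst exponent by rescaled-range (R/S) analysis."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    sizes, rs_vals = [], []
    size = min_chunk
    while size <= n // 2:
        rs = []
        for start in range(0, n - size + 1, size):
            chunk = x[start:start + size]
            dev = np.cumsum(chunk - chunk.mean())  # cumulative deviation
            r = dev.max() - dev.min()              # range of the deviation
            s = chunk.std()                        # standard deviation
            if s > 0:
                rs.append(r / s)
        if rs:
            sizes.append(size)
            rs_vals.append(np.mean(rs))
        size *= 2
    # The slope of log(R/S) against log(window size) is the Hurst exponent.
    h, _ = np.polyfit(np.log(sizes), np.log(rs_vals), 1)
    return h

rng = np.random.default_rng(0)
print(rs_hurst(rng.standard_normal(4096)))  # white noise gives H near 0.5
```

An exponent well above 0.5 would indicate persistent, long-range-correlated reaction times; near 0.5, uncorrelated behavior.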
Parallel sequencing lives, or what makes large sequencing projects successful.
Quilez, Javier; Vidal, Enrique; Dily, François Le; Serra, François; Cuartero, Yasmina; Stadhouders, Ralph; Graf, Thomas; Marti-Renom, Marc A; Beato, Miguel; Filion, Guillaume
2017-11-01
T47D_rep2 and b1913e6c1_51720e9cf were 2 Hi-C samples. They were born and processed at the same time, yet their fates were very different. The life of b1913e6c1_51720e9cf was simple and fruitful, while that of T47D_rep2 was full of accidents and sorrow. At the heart of these differences lies the fact that b1913e6c1_51720e9cf was born under a lab culture of Documentation, Automation, Traceability, and Autonomy and compliance with the FAIR Principles. Their lives are a lesson for those who wish to embark on the journey of managing high-throughput sequencing data. © The Author 2017. Published by Oxford University Press.
On-line fission products measurements during a PWR severe accident: the French DECA-PF project
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ducros, G.; Allinei, P.G.; Roure, C.
Following the Fukushima accident, many recommendations were drawn up by international organizations (IAEA, OECD, the NUGENIA network...) in order to improve safety in such accident conditions and mitigate their consequences. One of these recommendations was to improve the robustness of the instrumentation, which was dramatically lacking at Fukushima, as well as to better determine the source term involved in a nuclear accident. The DECA-PF project (Diagnosis of a degraded reactor core through Fission Product measurements) was elaborated in this context and selected as one of 21 collaborative R and D projects in the field of nuclear safety and radioprotection funded in May 2013 by the French National Research Agency. Over the months following the Fukushima accident, a CEA crisis team was convened to analyze the situation on-line, taking into account the data delivered by TEPCO and other organizations. Despite the difficulties encountered concerning the reliability of these data, the work performed showed the high capacity of fission product (FP) measurements to provide a diagnosis of the status of the reactors and the spent fuel pools (SFP). Based on these FP measurements, it was possible to conclude that the main origin of the releases was the cores and not the SFPs, in particular SFP-4, which was of high concern, and that the degradation level of the reactors was very large, probably including extensive core melting. To improve the reliability of this kind of diagnosis, the necessity of obtaining such measurements as soon as possible after the accident and as near as possible to the reactor was stressed. To this end, the present DECA-PF project intends to develop new and innovative instrumentation taking into account the design of the French nuclear power plants, on which sand bed filters have been implemented for severe accident management.
Three complementary techniques, devoted to measuring the FP release on-line, are being studied: gamma spectrometry, with an industrial objective of building a prototype aimed at improving the capacity of the present radiation monitoring system; gas chromatography, for the quantification of the fission gases (Xe, Kr) as well as potential carbon oxides produced in case of molten corium-concrete interaction; and optical absorption spectroscopy, the objective of this most innovative technique being to quantify ruthenium tetroxide, which could be produced in case of lower head failure, and the gaseous forms of iodine (molecular and organic) released to the environment. A global description and the present status of this project are presented, focusing on the source term establishment at the outlet stack of the sand bed filters and on the perspectives for implementation of the on-line gamma spectrometry equipment. (authors)
Statistical modeling of isoform splicing dynamics from RNA-seq time series data.
Huang, Yuanhua; Sanguinetti, Guido
2016-10-01
Isoform quantification is an important goal of RNA-seq experiments, yet it remains problematic for genes with low expression or several isoforms. These difficulties may in principle be ameliorated by exploiting correlated experimental designs, such as time series or dosage-response experiments. Time series RNA-seq experiments, in particular, are becoming increasingly popular, yet there are no methods that explicitly leverage the experimental design to improve isoform quantification. Here, we present DICEseq, the first isoform quantification method tailored to correlated RNA-seq experiments. DICEseq explicitly models the correlations between different RNA-seq experiments to aid the quantification of isoforms across experiments. Numerical experiments on simulated datasets show that DICEseq yields more accurate results than state-of-the-art methods, an advantage that can become considerable at low coverage levels. On real datasets, our results show that DICEseq provides substantially more reproducible and robust quantifications, increasing the correlation of estimates from replicate datasets by up to 10% on genes with low or moderate expression levels (bottom third of all genes). Furthermore, DICEseq permits quantification of the trade-off between temporal sampling of RNA and depth of sequencing, frequently an important choice when planning experiments. Our results have strong implications for the design of RNA-seq experiments and offer a novel tool for improved analysis of such datasets. Python code is freely available at http://diceseq.sf.net. Contact: G.Sanguinetti@ed.ac.uk. Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
Unlocking the Mystery of Columbia's Tragic Accident Through Materials Characterization
NASA Technical Reports Server (NTRS)
Shah, Sandeep; Jerman, Gregory; Coston, James
2003-01-01
The wing and underbelly reconstruction of Space Shuttle Columbia took place at the Shuttle Landing Facility Hangar after the accident which destroyed STS-107. Fragments were placed on a grid according to their original location on the orbiter. Some Reinforced Carbon-Carbon (RCC) panels of the left wing leading edge and other parts from both leading edges were recovered and incorporated into the reconstruction. The recovered parts were tracked in a database according to a number and also tracked on a map of the orbiter. This viewgraph presentation describes the process of failure analysis undertaken by the Materials and Processes (M&P) Problem Resolution Team. The team started with factual observations about the accident and identified the highest-level questions it needed to answer in order to understand where on the orbiter failure occurred, what component(s) failed, and what the sequence of events was. The recovery of Columbia's MADS/OEX data recorder shifted the focus of the team's analysis to the left wing leading edge damage. The team paid particular attention to slag deposits on some of the RCC panels. The presentation lists analysis techniques and lower-level questions for the team to answer.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kolaczkowski, A.M.; Lambright, J.A.; Ferrell, W.L.
This document contains the internal event initiated accident sequence analyses for Peach Bottom, Unit 2, one of the reference plants being examined as part of the NUREG-1150 effort by the Nuclear Regulatory Commission. NUREG-1150 will document the risk of a selected group of nuclear power plants. As part of that work, this report contains the overall core damage frequency estimate for Peach Bottom, Unit 2, and the accompanying plant damage state frequencies. Sensitivity and uncertainty analyses provided additional insights regarding the dominant contributors to the Peach Bottom core damage frequency estimate. The mean core damage frequency at Peach Bottom was calculated to be 8.2E-6. Station blackout type accidents (loss of all ac power) were found to dominate the overall results. Anticipated Transient Without Scram accidents were also found to be non-negligible contributors. The numerical results are largely driven by common mode failure probability estimates and, to some extent, human error. Because of significant data and analysis uncertainties in these two areas (important, for instance, to the most dominant scenario in this study), it is recommended that the results of the uncertainty and sensitivity analyses be considered before any actions are taken based on this analysis.
Mayayo, Emilio; Stchigel, Alberto M; Cano, José F; Bernal-Escoté, Xana; Guarro, Josep
2013-01-03
Cutaneous mucormycosis (zygomycosis), with subcutaneous spreading and dissemination, in immunocompetent patients is an uncommon disease caused by species belonging to the fungal genera Apophysomyces, Rhizopus and Saksenaea, among others. A case of necrotising fasciitis caused by Saksenaea vasiformis in an immunocompetent woman is described. The infection was acquired through a car accident resulting in multiple injuries affecting mainly her right arm. After the surgical reduction of fractures, skin lesions worsened and led to necrosis. The patient quickly developed a severe necrotising fasciitis, with negative cultures at first. Despite extensive surgical debridement and aggressive antifungal treatment, the patient died. The histopathological study showed a fungal infection due to a fungus belonging to the order Mucorales, which was confirmed by culturing the clinical sample on Sabouraud agar and identifying the species by cultures on Czapek-Dox agar and sequencing of the ITS region of the ribosomal DNA. This case confirms the presence of this fungus in Spain, the value of histopathology for the diagnosis of mucormycosis, and the need to perform special cultures to facilitate isolation and identification to the species level through the combined use of Czapek-Dox agar and sequencing of the ITS region. Copyright © 2012 Revista Iberoamericana de Micología. Published by Elsevier España, S.L. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gauntt, Randall O.; Goldmann, Andrew; Kalinich, Donald A.
2016-12-01
In this study, risk-significant pressurized-water reactor severe accident sequences are examined using MELCOR 1.8.5 to explore the range of fission product releases to the reactor containment building. Advances in the understanding of fission product release and transport behavior and severe accident progression are used to render best-estimate analyses of selected accident sequences. Particular emphasis is placed on estimating the effects of high fuel burnup, in contrast with low burnup, on fission product releases to the containment. Supporting this emphasis, recent data on fission product release from high-burnup (HBU) fuel from the French VERCOR project are used in this study. The results of these analyses are treated as samples from a population of accident sequences in order to employ an approximate order-statistics characterization of the results. These trends and tendencies are then compared to the NUREG-1465 alternative source term prescription used today for regulatory applications. In general, greater differences are observed between the state-of-the-art calculations for either HBU or low-burnup (LBU) fuel and the NUREG-1465 containment release fractions than exist between HBU and LBU release fractions. Current analyses suggest that retention of fission products within the vessel and the reactor coolant system (RCS) is greater than contemplated in the NUREG-1465 prescription, and that, overall, release fractions to the containment are therefore lower across the board in the present analyses than suggested in NUREG-1465. The decreased volatility of Cs2MoO4 compared to CsI or CsOH increases the predicted RCS retention of cesium, and as a result, cesium and iodine do not follow identical behaviors with respect to distribution among vessel, RCS, and containment.
With respect to the regulatory alternative source term, greater differences are observed between the NUREG-1465 prescription and both HBU and LBU predictions than exist between HBU and LBU analyses. Additionally, current analyses suggest that the NUREG-1465 release fractions are conservative by about a factor of 2 and that release durations for the in-vessel and late in-vessel release periods are in fact longer than the NUREG-1465 durations. It is currently planned that a subsequent report will further characterize these results using more refined statistical methods, permitting a more precise reformulation of the NUREG-1465 alternative source term for both LBU and HBU fuels; the most important finding is that the NUREG-1465 formula appears to embody significant conservatism compared to current best-estimate analyses. ACKNOWLEDGEMENTS This work was supported by the United States Nuclear Regulatory Commission, Office of Nuclear Regulatory Research. The authors would like to thank Dr. Ian Gauld and Dr. Germina Ilas, of Oak Ridge National Laboratory, for their contributions to this work. In addition to developing core fission product inventory and decay heat information for use in MELCOR models, their insights related to fuel management practices and the resulting effects on the spatial distribution of fission products in the core were instrumental in the completion of our work.
RNAbrowse: RNA-Seq De Novo Assembly Results Browser
Mariette, Jérôme; Noirot, Céline; Nabihoudine, Ibounyamine; Bardou, Philippe; Hoede, Claire; Djari, Anis; Cabau, Cédric; Klopp, Christophe
2014-01-01
Transcriptome analysis based on a de novo assembly of next-generation RNA sequences is now performed routinely in many laboratories. The generated results, including contig sequences, quantification figures, functional annotations, and variation discovery outputs, are usually bulky and quite diverse. This article presents a user-oriented storage and visualisation environment that permits exploring the data in a top-down manner, going from general graphical views to all possible details. The software package is based on BioMart and is easy to install and populate with local data. The software package is available under the GNU General Public License (GPL) at http://bioinfo.genotoul.fr/RNAbrowse. PMID:24823498
PARRoT- a homology-based strategy to quantify and compare RNA-sequencing from non-model organisms.
Gan, Ruei-Chi; Chen, Ting-Wen; Wu, Timothy H; Huang, Po-Jung; Lee, Chi-Ching; Yeh, Yuan-Ming; Chiu, Cheng-Hsun; Huang, Hsien-Da; Tang, Petrus
2016-12-22
Next-generation sequencing promises de novo genomic and transcriptomic analysis of samples of interest. However, only a few organisms have reference genomic sequences, and even fewer have well-defined or curated annotations. For transcriptome studies focusing on organisms lacking proper reference genomes, the common strategy is de novo assembly followed by functional annotation. However, things become even more complicated when multiple transcriptomes are compared. Here, we propose a new analysis strategy and quantification methods for expression levels that not only generate a virtual reference from sequencing data but also provide comparisons between transcriptomes. First, all reads from the transcriptome datasets are pooled together for de novo assembly. The assembled contigs are searched against the NCBI NR database to find potential homolog sequences. Based on the search results, a set of virtual transcripts is generated and serves as a reference transcriptome. By using the same reference, normalized quantification values including RC (read counts), eRPKM (estimated RPKM), and eTPM (estimated TPM) can be obtained that are comparable across transcriptome datasets. To demonstrate the feasibility of our strategy, we implemented it in the web service PARRoT, which stands for Pipeline for Analyzing RNA Reads of Transcriptomes. It analyzes gene expression profiles for two transcriptome sequencing datasets. For better understanding of the biological meaning of comparisons among transcriptomes, PARRoT further links these virtual transcripts to their potential functions by showing best hits in SwissProt and the NR database and by assigning GO terms. Our demo datasets showed that PARRoT can analyze two paired-end transcriptomic datasets of approximately 100 million reads within just three hours.
In this study, we proposed and implemented a strategy to analyze transcriptomes from non-reference organisms, offering the opportunity to quantify and compare transcriptome profiles through a homolog-based virtual transcriptome reference. By using the homolog-based reference, our strategy effectively avoids problems that may arise from inconsistencies among transcriptomes. This strategy will shed light on the field of comparative genomics for non-model organisms. We have implemented PARRoT as a web service, freely available at http://parrot.cgu.edu.tw.
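Assuming eRPKM and eTPM follow the usual RPKM/TPM definitions applied to the virtual reference, the two normalizations can be sketched as below (toy read counts and contig lengths; function names are illustrative, not PARRoT's API).

```python
def rpkm(counts, lengths, total_reads):
    """Reads per kilobase of transcript per million mapped reads."""
    return [c / (l / 1e3) / (total_reads / 1e6)
            for c, l in zip(counts, lengths)]

def tpm(counts, lengths):
    """Transcripts per million: length-normalise first, then rescale
    so the values sum to one million, making samples comparable."""
    rate = [c / l for c, l in zip(counts, lengths)]
    total = sum(rate)
    return [r / total * 1e6 for r in rate]

# Toy virtual reference: three contigs with counts and lengths in bp.
counts, lengths = [100, 300, 600], [1000, 2000, 3000]
print(tpm(counts, lengths))   # always sums to one million
print(rpkm(counts, lengths, total_reads=1000))
```

The key difference is that TPM values sum to a constant within each sample, whereas RPKM values do not, which is why TPM-style measures are generally preferred for cross-dataset comparison.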
Sung, Hye-Jin; Jeon, Seon-Ae; Ahn, Jung-Mo; Seul, Kyung-Jo; Kim, Jin Young; Lee, Ju Yeon; Yoo, Jong Shin; Lee, Soo-Youn; Kim, Hojoong; Cho, Je-Yoel
2012-04-03
Quantification is an essential step in biomarker development. Multiple reaction monitoring (MRM) is a modified mass spectrometry-based quantification technology that does not require antibody development. Serum amyloid A (SAA) is a positive acute-phase protein identified as a lung cancer biomarker in our previous study. Acute SAA exists in two isoforms with highly similar (92%) amino acid sequences. Until now, studies of SAA have been unable to distinguish between SAA1 and SAA2. To overcome the unavailability of a SAA2-specific antibody, we developed MRM methodology for the verification of SAA1 and SAA2 in clinical crude serum samples from 99 healthy controls and 100 lung adenocarcinoma patients. Differential measurement of SAA1 and SAA2 was made possible for the first time with the developed isotype-specific MRM method. Most healthy control samples had small or no MS/MS peaks for the targeted peptides, whereas peak areas 10- to 34-fold higher than in controls were detected in lung cancer samples. In addition, our SAA1 MRM data demonstrated good agreement with SAA1 enzyme-linked immunosorbent assay (ELISA) data. Finally, the successful quantification of SAA2 in crude serum by MRM, for the first time, shows that SAA2 can be a good biomarker for the detection of lung cancers. Copyright © 2012 Elsevier B.V. All rights reserved.
miR-MaGiC improves quantification accuracy for small RNA-seq.
Russell, Pamela H; Vestal, Brian; Shi, Wen; Rudra, Pratyaydipta D; Dowell, Robin; Radcliffe, Richard; Saba, Laura; Kechris, Katerina
2018-05-15
Many tools have been developed to profile microRNA (miRNA) expression from small RNA-seq data. These tools must contend with several issues: the small size of miRNAs, the small number of unique miRNAs, the fact that similar miRNAs can be transcribed from multiple loci, and the presence of miRNA isoforms known as isomiRs. Methods failing to address these issues can return misleading information. We propose a novel quantification method designed to address these concerns. We present miR-MaGiC, a novel miRNA quantification method, implemented as a cross-platform tool in Java. miR-MaGiC performs stringent mapping to a core region of each miRNA and defines a meaningful set of target miRNA sequences by collapsing the miRNA space to "functional groups". We hypothesize that these two features, mapping stringency and collapsing, provide more accurate quantification at a more meaningful unit (i.e., the miRNA family). We test miR-MaGiC and several published methods on 210 small RNA-seq libraries, evaluating each method's ability to accurately reflect global miRNA expression profiles. We define accuracy as total counts close to the total number of input reads originating from miRNAs. We find that miR-MaGiC, which incorporates both stringency and collapsing, provides the most accurate counts.
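A toy sketch of the collapsing step: summing per-miRNA counts into family-level groups. The regex-based grouping rule below (strip the arm suffix and a trailing paralog letter, e.g. mir-29a-5p → mir-29) is an illustrative stand-in for miR-MaGiC's functional groups, which the tool defines via an explicit mapping rather than a naming heuristic.

```python
import re
from collections import defaultdict

def collapse_to_families(counts):
    """Sum per-miRNA counts into family-level groups.

    Hypothetical grouping rule: drop an optional paralog letter and an
    optional -3p/-5p arm suffix from the end of each miRNA name.
    """
    families = defaultdict(int)
    for name, n in counts.items():
        family = re.sub(r"[a-z]?(-[35]p)?$", "", name, count=1)
        families[family] += n
    return dict(families)

counts = {"mir-29a-3p": 120, "mir-29b-3p": 80, "mir-21-5p": 500}
print(collapse_to_families(counts))  # → {'mir-29': 200, 'mir-21': 500}
```

Counting at this collapsed level sidesteps the multi-mapping ambiguity between near-identical family members that inflates or deflates per-miRNA counts.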
Moreano, Francisco; Busch, Ulrich; Engel, Karl-Heinz
2005-12-28
Milling fractions from conventional and transgenic corn were prepared at laboratory scale and used to study the influence of sample composition and heat-induced DNA degradation on the relative quantification of genetically modified organisms (GMO) in food products. Particle size distributions of the obtained fractions (coarse grits, regular grits, meal, and flour) were characterized using a laser diffraction system. The application of two DNA isolation protocols revealed a strong correlation between the degree of comminution of the milling fractions and the DNA yield in the extracts. Mixtures of milling fractions from conventional and transgenic material (1%) were prepared and analyzed via real-time polymerase chain reaction. Accurate quantification of the adjusted GMO content was only possible in mixtures containing conventional and transgenic material in the form of analogous milling fractions, whereas mixtures of fractions exhibiting different particle size distributions delivered significantly over- and underestimated GMO contents depending on their compositions. The process of heat-induced nucleic acid degradation was followed by applying two established quantitative assays that differ in the lengths of the recombinant and reference target sequences (A, Δl(A) = -25 bp; B, Δl(B) = +16 bp; values relative to the amplicon length of the reference gene). Data obtained by the application of method A resulted in underestimated recoveries of GMO contents in the samples of heat-treated products, reflecting the favored degradation of the longer target sequence used for the detection of the transgene. In contrast, data yielded by the application of method B resulted in increasingly overestimated recoveries of GMO contents. The results show how commonly used food technological processes may lead to distortions in the results of quantitative GMO analyses.
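For background, relative GMO quantification by real-time PCR boils down to a copy ratio between the transgene and a taxon-specific reference gene. A minimal ΔCt sketch, assuming equal amplification efficiencies for both assays; this shared-efficiency assumption is exactly what the study shows breaking down when amplicons of different lengths degrade at different rates:

```python
def gmo_content_percent(ct_transgene, ct_reference, efficiency=2.0):
    """Relative GMO content (%) from real-time PCR Ct values.

    Assumes both assays amplify with the same per-cycle efficiency
    (2.0 = perfect doubling per cycle).
    """
    delta_ct = ct_transgene - ct_reference
    return efficiency ** (-delta_ct) * 100.0

# A transgene detected ~6.64 cycles after the reference corresponds to
# roughly 1% GMO, since 2 ** -6.64 is about 0.01:
print(round(gmo_content_percent(30.64, 24.0), 1))  # -> 1.0
```

Degradation that shifts one assay's Ct but not the other's moves delta_ct directly, which is why the over- and underestimates described above track the amplicon length difference.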
Guo, Jinchao; Yang, Litao; Liu, Xin; Guan, Xiaoyan; Jiang, Lingxi; Zhang, Dabing
2009-08-26
Genetically modified (GM) papaya (Carica papaya L.), Huanong No. 1, was approved for commercialization in Guangdong province, China in 2006, and the development of a Huanong No. 1 papaya detection method is necessary for implementing genetically modified organism (GMO) labeling regulations. In this study, we characterized the exogenous integration of GM Huanong No. 1 papaya by means of conventional polymerase chain reaction (PCR) and thermal asymmetric interlaced (TAIL)-PCR strategies. The results suggested that one intact copy of the initial construct was integrated into the papaya genome, probably resulting in a deletion (38 bp in size) of the host genomic DNA. Also, one unintended insertion of a 92 bp truncated NptII fragment was observed at the 5' end of the exogenous insert. Furthermore, we determined the 5' and 3' flanking sequences between the insert DNA and the papaya genomic DNA, and developed event-specific qualitative and quantitative PCR assays for GM Huanong No. 1 papaya based on the 5' integration flanking sequence. The relative limit of detection (LOD) of the qualitative PCR assay was about 0.01% in 100 ng of total papaya genomic DNA, corresponding to about 25 copies of the papaya haploid genome. In the quantitative PCR, the limits of detection and quantification (LOD and LOQ) were as low as 12.5 and 25 copies of the papaya haploid genome, respectively. In practical sample quantification, the bias between measured and true values for three samples ranged from 0.44% to 4.41%. Collectively, these results are useful for the identification and quantification of Huanong No. 1 papaya and its derivatives.
Dong, Lianhua; Meng, Ying; Wang, Jing; Liu, Yingying
2014-02-01
DNA reference materials of certified value have a critical function in many analytical processes of DNA measurement. Quantification of amoA genes in ammonia oxidizing bacteria (AOB) and archaea (AOA), and of nirS and nosZ genes in the denitrifiers, is very important for determining their distribution and abundance in the natural environment. A plasmid reference material containing nirS, nosZ, amoA-AOB, and amoA-AOA was developed to provide a DNA standard with a copy number concentration for ensuring comparability and reliability of quantification of these genes. Droplet digital PCR (ddPCR) was evaluated for characterization of the plasmid reference material. The results revealed that restriction endonuclease digestion of plasmids can improve amplification efficiency and minimize the measurement bias of ddPCR. In contrast to plasmid conformation, the size of the DNA fragment containing the target sequence and the location of the restriction site relative to the target sequence are not significant factors affecting plasmid quantification by ddPCR. Liquid chromatography-isotope dilution mass spectrometry (LC-IDMS) was used to provide independent data for quantifying the plasmid reference material. The copy number concentration of the digested plasmid determined by ddPCR agreed well with that determined by LC-IDMS, improving both the accuracy and reliability of the plasmid reference material. The reference value, with its expanded uncertainty (k = 2), of the plasmid reference material was determined to be (5.19 ± 0.41) × 10^9 copies μL^-1 by averaging the results of two independent measurements. Consideration of the factors revealed in this study can improve the reliability and accuracy of ddPCR; thus, this method has the potential to accurately quantify DNA reference materials.
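The core of ddPCR quantification is a Poisson correction: from the fraction of positive droplets one infers the mean number of copies per droplet, and hence the copy concentration. A minimal sketch, assuming a typical 0.85 nL droplet volume (the abstract does not state the instrument's value):

```python
import math

def ddpcr_copies_per_ul(positive_droplets, total_droplets, droplet_volume_nl=0.85):
    """Poisson-corrected copy concentration (copies/uL) from droplet counts.

    lam = -ln(1 - p) is the mean number of copies per droplet, where p is
    the fraction of positive droplets; dividing by the droplet volume
    gives the concentration. The 0.85 nL volume is an assumed typical
    value, not one taken from the abstract.
    """
    p = positive_droplets / total_droplets
    lam = -math.log(1.0 - p)                 # mean copies per droplet
    return lam / (droplet_volume_nl * 1e-3)  # nL -> uL

# 4,000 positive droplets out of 15,000 implies roughly 365 copies/uL:
conc = ddpcr_copies_per_ul(4000, 15000)
```

The correction matters because a droplet with two or more copies still reads as a single positive; incomplete digestion that links copies together biases p, which is one reason the digestion step above improves accuracy.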
Aoki, Takatoshi; Yamaguchi, Shinpei; Kinoshita, Shunsuke; Hayashida, Yoshiko; Korogi, Yukunori
2016-09-01
To determine the reproducibility of the quantitative chemical shift-based water-fat separation method with a multiecho gradient echo sequence [iterative decomposition of water and fat with echo asymmetry and least-squares estimation quantitation sequence (IDEAL-IQ)] for assessing bone marrow fat fraction (FF); to evaluate variation of FF at different bone sites; and to investigate its association with age and menopause. 31 consecutive females who underwent pelvic IDEAL-IQ at 3-T MRI were included in this study. Quantitative FFs from IDEAL-IQ were analyzed at four bone sites. To assess reproducibility, measurements at each site were repeated 10 times and the coefficient of variation (CV) was calculated. Correlations between FF and age were evaluated at each site, and FFs were compared between pre- and post-menopausal groups. The CV in the quantification of marrow FF ranged from 0.69% to 1.70%. A statistically significant correlation was established between FF and age in the lumbar vertebral body, ilium and intertrochanteric region of the femur (p < 0.001). The average FF of post-menopausal females was significantly higher than that of pre-menopausal females in these sites (p < 0.05). In the greater trochanter of the femur, there was no significant correlation between FF and age. In vivo IDEAL-IQ would provide reliable quantification of bone marrow fat. IDEAL-IQ is simple to perform in a short time and may be practical for providing information on bone quality in clinical settings.
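The reproducibility metric used here, the coefficient of variation, is simply the standard deviation of repeated readings relative to their mean. A minimal sketch with illustrative fat-fraction values (not data from the study):

```python
import statistics

def coefficient_of_variation(measurements):
    """CV (%) = sample standard deviation / mean * 100 -- the
    reproducibility metric applied to the repeated fat-fraction readings."""
    return statistics.stdev(measurements) / statistics.mean(measurements) * 100.0

# Ten repeated fat-fraction readings (%) for one region of interest
# (illustrative values, not data from the study):
ff = [52.1, 51.8, 52.4, 52.0, 51.9, 52.3, 52.2, 51.7, 52.0, 52.1]
print(round(coefficient_of_variation(ff), 2))  # -> 0.42
```

A CV in the study's reported 0.69-1.70% range means repeated scans of the same site scatter by well under 2% of the measured fat fraction.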
Prinsen, Hetty; de Graaf, Robin A; Mason, Graeme F; Pelletier, Daniel; Juchem, Christoph
2017-01-01
To determine the reproducibility of a comprehensive single-session measurement of glutathione (GSH), γ-aminobutyric acid (GABA), glutamate, and other biochemicals implicated in the pathophysiology of multiple sclerosis (MS) in the human brain with 1H magnetic resonance spectroscopy (MRS). Five healthy subjects were studied twice in separate 1-hour sessions at 7T. One MS patient was also scanned once. GSH and GABA were measured with J-difference editing using a semilocalized by adiabatic selective refocusing sequence (semi-LASER, TE = 72 msec). A stimulated echo acquisition mode sequence (STEAM, TE = 10 msec) was used to detect glutamate along with the overall biochemical profile. Spectra were quantified with LCModel. Quantification accuracy was assessed through Cramer-Rao lower bounds (CRLB). Reproducibility of the metabolite quantification was tested using coefficients of variation (CoV). CRLB were ≤7% for GSH, GABA, and glutamate, and average CoV of 7.8 ± 3.2%, 9.5 ± 7.0%, and 3.2 ± 1.7% were achieved, respectively. At this measurement reproducibility and quantification accuracy, the average test/retest concentration differences for GABA and glutamate were smaller than the intersubject variations in metabolite content, with CoV ratios of 0.6 and 0.8, respectively. As proof of principle, GSH, GABA, and glutamate were also detected in an MS patient. GSH, GABA, glutamate, and other metabolites relevant in MS can be quantified at 7T with high accuracy and reproducibility in a single 1-hour session. This methodology might serve as a clinical research tool to investigate biochemical markers associated with MS. Level of Evidence: 2. J. Magn. Reson. Imaging 2017;45:187-198. © 2016 International Society for Magnetic Resonance in Medicine.
cFinder: definition and quantification of multiple haplotypes in a mixed sample.
Niklas, Norbert; Hafenscher, Julia; Barna, Agnes; Wiesinger, Karin; Pröll, Johannes; Dreiseitl, Stephan; Preuner-Stix, Sandra; Valent, Peter; Lion, Thomas; Gabriel, Christian
2015-09-07
Next-generation sequencing allows for determining the genetic composition of a mixed sample. For instance, when performing resistance testing for BCR-ABL1 it is necessary to identify clones and define compound mutations; together with an exact quantification, this may complement diagnosis and therapy decisions with additional information. This applies not only to oncological questions but also to the determination of viral, bacterial or fungal infections. Retrieving multiple haplotypes (more than two) and proportion information from such data with conventional software is difficult, cumbersome and demands multiple manual steps. Therefore, we developed a tool called cFinder that is capable of automatic detection of haplotypes and their accurate quantification within one sample. BCR-ABL1 samples containing multiple clones were used for testing, and our cFinder could identify all previously found clones together with their abundance and even refine some results. Additionally, reads with multiple haplotypes were simulated using GemSIM; the detection was very close to linear (R² = 0.96). Our aim is not to deduce haploblocks through statistics, but to characterize one sample's composition precisely. As a result, cFinder reports the connections of variants (haplotypes) with their read count and relative occurrence (percentage). Download is available at http://sourceforge.net/projects/cfinder/. Our cFinder is implemented as an efficient algorithm that can be run on a low-performance desktop computer. Furthermore, it considers paired-end information (if available) and is generally open to any current next-generation sequencing technology and alignment strategy. To our knowledge, this is the first software that enables researchers without extensive bioinformatic support to identify multiple haplotypes and how they contribute to a sample's composition.
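The underlying idea, grouping reads by the combination of alleles they carry at the variant positions of interest and reporting each combination's relative occurrence, can be sketched as follows. This toy version assumes error-free, ungapped reads given as (start, sequence) pairs; it is not cFinder's actual implementation.

```python
from collections import Counter

def haplotype_frequencies(reads, variant_positions):
    """Group aligned reads by the alleles they carry at the variant
    positions and report each combination's relative occurrence (%).

    Only reads spanning every variant position are informative; the rest
    are skipped.
    """
    counts = Counter()
    for start, seq in reads:
        end = start + len(seq)
        if all(start <= p < end for p in variant_positions):
            haplotype = tuple(seq[p - start] for p in variant_positions)
            counts[haplotype] += 1
    total = sum(counts.values())
    return {h: 100.0 * n / total for h, n in counts.items()}

# Two variant positions (1 and 6); the last read does not span position 1
# and is ignored. Two of the three informative reads share one haplotype:
reads = [(0, "ACGTACGT"), (0, "AGGTACCT"), (0, "AGGTACCT"), (2, "GTACGT")]
print(haplotype_frequencies(reads, variant_positions=[1, 6]))
```

Connecting variants through the same physical read (or read pair) is what distinguishes haplotype calling from simply reporting per-position allele frequencies.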
Dementia resulting from traumatic brain injury
Ramalho, Joana; Castillo, Mauricio
2015-01-01
Traumatic brain injury (TBI) represents a significant public health problem in modern societies. It is primarily a consequence of traffic-related accidents and falls. Other recently recognized causes include sports injuries and indirect forces such as shock waves from battlefield explosions. TBI is an important cause of death and lifelong disability and represents the most well-established environmental risk factor for dementia. With the growing recognition that even mild head injury can lead to neurocognitive deficits, imaging of brain injury has assumed greater importance. However, there is no single imaging modality capable of characterizing TBI. Current advances, particularly in MR imaging, enable visualization and quantification of structural and functional brain changes not hitherto possible. In this review, we summarize data linking TBI with dementia, emphasizing the imaging techniques currently available in clinical practice along with some advances in medical knowledge. PMID:29213985
A study on MFL based wire rope damage detection
NASA Astrophysics Data System (ADS)
Park, J.; Kim, J.-W.; Kim, J.; Park, S.
2017-04-01
Non-destructive testing of wire rope is in great demand to prevent safety accidents at sites where much rope-based heavy equipment is installed. In this paper, research on the quantification of magnetic flux leakage (MFL) signals was carried out to detect damage in wire rope. First, a simulation study was performed with a steel rod model using a finite element analysis (FEA) program. The leakage signals from the simulation study were obtained and compared with respect to a single parameter: defect depth. Then, an experiment under the same conditions was conducted to verify the simulation results. From these results, the MFL signal was quantified and wire rope damage detection was confirmed to be feasible. In further study, it is expected that damage characterization of an entire specimen will be visualized as well.
Wang, Jianxiu; Yi, Xinyao; Tang, Hailin; Han, Hongxing; Wu, Minghua; Zhou, Feimeng
2012-01-01
MicroRNAs (miRNAs), acting as oncogenes or tumor suppressors in humans, play a key role in regulating gene expression and are believed to be important for developing novel therapeutic treatments and clinical prognoses. Due to their short lengths (17–25 nucleotides) and extremely low concentrations (typically < pM) in biological samples, quantification of miRNAs has been challenging for conventional biochemical methods, such as Northern blotting, microarray, and quantitative polymerase chain reaction (qPCR). In this work, a biotinylated miRNA (biotin-miRNA) whose sequence is the same as that of a miRNA target is introduced into samples of interest and allowed to compete with the miRNA target for the oligonucleotide (ODN) probe preimmobilized onto an electrode. Voltammetric quantification of the miRNA target was accomplished after complexation of the biotin-miRNA with ferrocene (Fc)-capped gold nanoparticle/streptavidin conjugates. The Fc oxidation current was found to be inversely proportional to the concentration of target miRNA between 10 fM and 2.0 pM. The method is highly reproducible (RSD < 5%), regenerable (at least 8 regeneration/assay cycles without discernible signal decrease), and selective (with sequence specificity down to a single nucleotide mismatch). The low detection levels (10 fM, or 0.1 attomoles of miRNA in a 10-μL solution) allow the direct quantification of miRNA-182, a marker correlated to the progression of glioma in patients, to be performed in serum samples without sample pretreatment or RNA extraction and enrichment. The concentration of miRNA-182 in glioma patients was found to be 3.1 times as high as that in healthy persons, a conclusion in excellent agreement with a separate qPCR measurement of the expression level. Obviating the internal reference required by qPCR, together with its simplicity and cost-effectiveness, gives this method additional advantages for the detection of nucleic acids in clinical samples. PMID:22788545
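The quantification step reduces to a linear calibration: the Fc oxidation current varies linearly (and inversely) with target concentration over the 10 fM to 2.0 pM range, so an unknown is read off an inverted least-squares fit. A sketch with made-up calibration points (the numbers are illustrative, not data from the study):

```python
def fit_line(x, y):
    """Least-squares slope and intercept for a calibration curve."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
             / sum((xi - mx) ** 2 for xi in x))
    return slope, my - slope * mx

def quantify(current_na, slope, intercept):
    """Invert the calibration: concentration giving the observed current."""
    return (current_na - intercept) / slope

# Hypothetical calibration points: the Fc oxidation current (nA) falls
# linearly as target concentration (fM) rises, mimicking the inverse
# relationship of the competitive assay.
conc_fm = [10, 100, 500, 1000, 2000]
current_na = [100 - 0.045 * c for c in conc_fm]  # perfectly linear here
s, b = fit_line(conc_fm, current_na)
print(round(quantify(77.5, s, b)))  # a 77.5 nA reading -> 500 (fM)
```

The negative slope reflects the competition scheme: more target miRNA displaces more biotin-miRNA from the probe, leaving fewer Fc labels on the electrode.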
NASA Astrophysics Data System (ADS)
Geiger, E.; Le Gall, C.; Gallais-During, A.; Pontillon, Y.; Lamontagne, J.; Hanus, E.; Ducros, G.
2017-11-01
Within the framework of the International Source Term Programme (ISTP), the VERDON programme aims at quantifying the source term of radioactive materials in case of a hypothetical severe accident in a light water reactor (LWR). Tests were performed in a new experimental laboratory (VERDON) built in the LECA-STAR facility (CEA Cadarache). The VERDON-1 test was devoted to the study of a high burn-up UO2 fuel and fission product (FP) releases at very high temperature (≈2873 K) in a reducing atmosphere. Post-test qualitative and quantitative characterisations of the VERDON-1 sample led to the proposal of a scenario explaining the phenomena occurring during the experimental sequence. Hence, the fuel and the cladding may have interacted, leading to the melting of a UO2-ZrO2 alloy. Although no relocation was observed during the test, it may have been imminent.
Radiant Heat Testing of the H1224A Shipping/Storage Container
1994-05-01
re-entry vehicles caused by credible accidents during air and ground transportation. Radiant heat testing of the H1224A storage/shipping container is... inner container, and re-entry vehicle (RV) temperatures during radiant heat testing. Computer modelling can be used to predict weapon response throughout... Nomenclature: RV Re-entry Vehicle midsection mass mock-up; WR War Reserve; STS Stockpile-to-Target Sequence; NAWC Simulated H1224A container by Naval Air
Ridoux, Olivier; Foucault, Cédric; Drancourt, Michel
1998-01-01
Encephalitozoon species are strict intracellular microsporidia. Cocultures with eukaryotic cell lines can become accidentally contaminated by mycoplasmas. We propose a decontamination protocol based on differential cell targeting after intraperitoneal inoculation in mice. Mycoplasma-free microsporidia were isolated from the brains and spleens of inoculated mice 24 h postinoculation by using the centrifugation shell vial system. Identification was confirmed by direct sequencing of PCR-amplified 16S rRNA. PMID:9666031
Pereira, Rui P A; Peplies, Jörg; Brettar, Ingrid; Höfle, Manfred G
2017-03-31
Next Generation Sequencing (NGS) has revolutionized the analysis of natural and man-made microbial communities by using universal primers for bacteria in a PCR-based approach targeting the 16S rRNA gene. In our study we narrowed primer specificity to a single, monophyletic genus, because for many questions in microbiology only a specific part of the whole microbiome is of interest. We chose the genus Legionella, comprising more than 20 pathogenic species, due to its high relevance for water-based respiratory infections. A new NGS-based approach was designed by sequencing 16S rRNA gene amplicons specific for the genus Legionella using the Illumina MiSeq technology. This approach was validated and applied to a set of representative freshwater samples. Our results revealed that the generated libraries presented a low average raw error rate per base (<0.5%), substantiating the use of high-fidelity enzymes, such as KAPA HiFi, for increased sequence accuracy and quality. The approach also showed high in situ specificity (>95%) and very good repeatability. Only in samples in which the gammaproteobacterial clade SAR86 was present were more than 1% non-Legionella sequences observed. Next-generation sequencing read counts did not reveal considerable amplification/sequencing biases and showed a sensitive as well as precise quantification of L. pneumophila along a dilution range using a spiked-in, certified genome standard. The genome standard and a mock community consisting of six different Legionella species demonstrated that the developed NGS approach was quantitative and specific at the level of individual species, including L. pneumophila. The sensitivity of our genus-specific approach was at least one order of magnitude higher than that of the universal NGS approach. Comparison with quantification by real-time PCR showed consistency with the NGS data. Overall, our NGS approach can determine the quantitative abundances of Legionella species, i.e.,
the complete Legionella microbiome, without the need for species-specific primers. The developed NGS approach provides a new molecular surveillance tool to monitor all Legionella species in qualitative and quantitative terms if a spiked-in genome standard is used to calibrate the method. Overall, the genus-specific NGS approach opens up a new avenue to massive parallel diagnostics in a quantitative, specific and sensitive way.
Wang, Charlie Y; Liu, Yuchi; Huang, Shuying; Griswold, Mark A; Seiberlich, Nicole; Yu, Xin
2017-12-01
The purpose of this work was to develop a 31P spectroscopic magnetic resonance fingerprinting (MRF) method for fast quantification of the chemical exchange rate between phosphocreatine (PCr) and adenosine triphosphate (ATP) via creatine kinase (CK). A 31P MRF sequence (CK-MRF) was developed to quantify the forward rate constant of ATP synthesis via CK (kfCK), the T1 relaxation time of PCr (T1PCr), and the PCr-to-ATP concentration ratio (MRPCr). The CK-MRF sequence used a balanced steady-state free precession (bSSFP)-type excitation with ramped flip angles and a unique saturation scheme sensitive to the exchange between PCr and γATP. Parameter estimation was accomplished by matching the acquired signals to a dictionary generated using the Bloch-McConnell equation. Simulation studies were performed to examine the susceptibility of the CK-MRF method to several potential error sources. The accuracy of nonlocalized CK-MRF measurements before and after an ischemia-reperfusion (IR) protocol was compared with the magnetization transfer (MT-MRS) method in rat hindlimb at 9.4 T (n = 14). The reproducibility of CK-MRF was also assessed by comparing CK-MRF measurements with both MT-MRS (n = 17) and four-angle saturation transfer (FAST) (n = 7). Simulation results showed that CK-MRF quantification of kfCK was robust, with less than 5% error in the presence of model inaccuracies including dictionary resolution, metabolite T2 values, inorganic phosphate metabolism, and B1 miscalibration. Estimation of kfCK by CK-MRF (0.38 ± 0.02 s-1 at baseline and 0.42 ± 0.03 s-1 post-IR) showed strong agreement with MT-MRS (0.39 ± 0.03 s-1 at baseline and 0.44 ± 0.04 s-1 post-IR). kfCK estimation was also similar between CK-MRF and FAST (0.38 ± 0.02 s-1 for CK-MRF and 0.38 ± 0.11 s-1 for FAST). The coefficient of variation from a 20 s CK-MRF quantification of kfCK was 42% of that of a 150 s MT-MRS acquisition and 12% of that of a 20 s FAST acquisition.
This study demonstrates the potential of a 31P spectroscopic MRF framework for rapid, accurate and reproducible quantification of the chemical exchange rate of CK in vivo. Copyright © 2017 John Wiley & Sons, Ltd.
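The dictionary-matching step common to MRF methods can be sketched compactly: the acquired signal evolution is compared with each simulated evolution by normalized inner product, and the parameters of the best-scoring entry are reported. The dictionary entries below are illustrative placeholders, not Bloch-McConnell simulations:

```python
import math

def best_match(signal, dictionary):
    """Return the parameter tuple whose simulated signal evolution has the
    highest normalized inner product with the acquired signal -- the
    standard MRF template-matching step."""
    def unit(v):
        m = math.sqrt(sum(x * x for x in v))
        return [x / m for x in v]
    s = unit(signal)
    best_params, best_score = None, -1.0
    for params, evolution in dictionary.items():
        score = sum(a * b for a, b in zip(s, unit(evolution)))
        if score > best_score:
            best_params, best_score = params, score
    return best_params

# Toy dictionary keyed by (kfCK in 1/s, T1 in s). The evolutions are
# made-up placeholders, not Bloch-McConnell output.
dictionary = {
    (0.30, 3.0): [1.0, 0.7, 0.5, 0.4],
    (0.38, 3.2): [1.0, 0.6, 0.35, 0.2],
    (0.45, 3.4): [1.0, 0.5, 0.25, 0.1],
}
acquired = [1.02, 0.59, 0.36, 0.19]  # noisy copy of the middle entry
print(best_match(acquired, dictionary))  # -> (0.38, 3.2)
```

Because matching is done on signal shape rather than absolute amplitude, the approach tolerates the kinds of model inaccuracies the simulations above probe, as long as the dictionary grid is fine enough.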
Auffret, Marc; Pilote, Alexandre; Proulx, Emilie; Proulx, Daniel; Vandenberg, Grant; Villemur, Richard
2011-12-15
Geosmin and 2-methylisoborneol (MIB) have been associated with off-flavour problems in fish and seafood products, generating a strong negative impact for aquaculture industries. Although most of the producers of geosmin and MIB have been identified as Streptomyces species or cyanobacteria, Streptomyces spp. are thought to be responsible for the synthesis of these compounds in indoor recirculating aquaculture systems (RAS). The detection of genes involved in the synthesis of geosmin and MIB can be a relevant indicator of the beginning of off-flavour events in RAS. Here, we report a real-time polymerase chain reaction (qPCR) protocol targeting geoA sequences that encode a germacradienol synthase involved in geosmin synthesis. New geoA-related sequences were retrieved from eleven geosmin-producing Actinomycete strains, among them two Streptomyces strains isolated from two RAS. Combined with geoA-related sequences available in gene databases, we designed primers and standards suitable for qPCR assays targeting mainly Streptomyces geoA. Using our qPCR protocol, we succeeded in measuring the level of geoA copies in sand filter and biofilters in two RAS. This study is the first to apply qPCR assays to detect and quantify the geosmin synthesis gene (geoA) in RAS. Quantification of geoA in RAS could permit the monitoring of the level of geosmin producers prior to the occurrence of geosmin production. This information will be most valuable for fish producers to manage further development of off-flavour events. Copyright © 2011 Elsevier Ltd. All rights reserved.
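Absolute quantification by qPCR of the kind described rests on a standard curve: Cq is regressed against log10 copy number for a dilution series of the standard, and unknowns are back-calculated from the fit. A sketch with a hypothetical ten-fold geoA dilution series (the numbers are illustrative; a slope near -3.32 corresponds to ~100% amplification efficiency):

```python
import math

def standard_curve(copies, cq):
    """Fit Cq = slope * log10(copies) + intercept from a dilution series."""
    x = [math.log10(c) for c in copies]
    n = len(x)
    mx, my = sum(x) / n, sum(cq) / n
    slope = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, cq))
             / sum((xi - mx) ** 2 for xi in x))
    return slope, my - slope * mx

def copies_from_cq(cq_value, slope, intercept):
    """Back-calculate copy number for an unknown sample's Cq."""
    return 10 ** ((cq_value - intercept) / slope)

# Hypothetical ten-fold dilution series of a geoA plasmid standard:
standards = [1e6, 1e5, 1e4, 1e3]
cqs = [15.0, 18.32, 21.64, 24.96]
s, b = standard_curve(standards, cqs)
unknown = copies_from_cq(20.0, s, b)  # about 3.1e4 copies
```

Tracking geoA copies per volume of biofilter material over time is then a matter of running sample Cq values through the same inverted curve.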
Digital gene expression for non-model organisms
Hong, Lewis Z.; Li, Jun; Schmidt-Küntzel, Anne; Warren, Wesley C.; Barsh, Gregory S.
2011-01-01
Next-generation sequencing technologies offer new approaches for global measurements of gene expression but are mostly limited to organisms for which a high-quality assembled reference genome sequence is available. We present a method for gene expression profiling called EDGE, or EcoP15I-tagged Digital Gene Expression, based on ultra-high-throughput sequencing of 27-bp cDNA fragments that uniquely tag the corresponding gene, thereby allowing direct quantification of transcript abundance. We show that EDGE is capable of assaying for expression in >99% of genes in the genome and achieves saturation after 6–8 million reads. EDGE exhibits very little technical noise, reveals a large (10^6) dynamic range of gene expression, and is particularly suited for quantification of transcript abundance in non-model organisms where a high-quality annotated genome is not available. In a direct comparison with RNA-seq, both methods provide similar assessments of relative transcript abundance, but EDGE does better at detecting gene expression differences for poorly expressed genes and does not exhibit transcript length bias. Applying EDGE to laboratory mice, we show that a loss-of-function mutation in the melanocortin 1 receptor (Mc1r), recognized as a Mendelian determinant of yellow hair color in many different mammals, also causes reduced expression of genes involved in the interferon response. To illustrate the application of EDGE to a non-model organism, we examine skin biopsy samples from a cheetah (Acinonyx jubatus) and identify genes likely to control differences in the color of spotted versus non-spotted regions. PMID:21844123
MR fingerprinting for rapid quantification of myocardial T1, T2, and proton spin density.
Hamilton, Jesse I; Jiang, Yun; Chen, Yong; Ma, Dan; Lo, Wei-Ching; Griswold, Mark; Seiberlich, Nicole
2017-04-01
To introduce a two-dimensional MR fingerprinting (MRF) technique for quantification of T1, T2, and M0 in myocardium. An electrocardiograph-triggered MRF method is introduced for mapping myocardial T1, T2, and M0 during a single breath-hold in as few as four heartbeats. The pulse sequence uses variable flip angles, repetition times, inversion recovery times, and T2 preparation dephasing times. A dictionary of possible signal evolutions is simulated for each scan that incorporates the subject's unique variations in heart rate. Aspects of the sequence design were explored in simulations, and the accuracy and precision of cardiac MRF were assessed in a phantom study. In vivo imaging was performed at 3 Tesla in 11 volunteers to generate native parametric maps. T1 and T2 measurements from the proposed cardiac MRF sequence correlated well with standard spin echo measurements in the phantom study (R² > 0.99). A Bland-Altman analysis revealed good agreement for myocardial T1 measurements between MRF and MOLLI (bias 1 ms; 95% limits of agreement -72 to 72 ms) and T2 measurements between MRF and T2-prepared balanced steady-state free precession (bias -2.6 ms; 95% limits of agreement -8.5 to 3.3 ms). MRF can provide quantitative single-slice T1, T2, and M0 maps in the heart within a single breath-hold. Magn Reson Med 77:1446-1458, 2017. © 2016 International Society for Magnetic Resonance in Medicine.
NASA Astrophysics Data System (ADS)
Carvalho, Diego D. B.; Akkus, Zeynettin; Bosch, Johan G.; van den Oord, Stijn C. H.; Niessen, Wiro J.; Klein, Stefan
2014-03-01
In this work, we investigate nonrigid motion compensation in simultaneously acquired (side-by-side) B-mode ultrasound (BMUS) and contrast-enhanced ultrasound (CEUS) image sequences of the carotid artery. These images are acquired to study the presence of intraplaque neovascularization (IPN), which is a marker of plaque vulnerability. IPN is visualized for quantification by computing the maximum intensity projection (MIP) of the CEUS image sequence over time. As carotid images contain considerable motion, accurate global nonrigid motion compensation (GNMC) is required prior to the MIP. Moreover, we demonstrate that improved lumen and plaque differentiation can be obtained by averaging the motion-compensated BMUS images over time. We propose to use a previously published 2D+t nonrigid registration method, which is based on minimization of pixel intensity variance over time, using a spatially and temporally smooth B-spline deformation model. The validation compares displacements of plaque points with manual trackings by 3 experts in 11 carotids. The average (± standard deviation) root-mean-square error (RMSE) was 99 ± 74 μm for longitudinal and 47 ± 18 μm for radial displacements. These results were comparable with the interobserver variability and with the results of a local rigid registration technique based on speckle tracking, which estimates motion at a single point, whereas our approach applies motion compensation to the entire image. In conclusion, the GNMC technique produces reliable results. Since this technique tracks global deformations, it can aid in the quantification of IPN and the delineation of lumen and plaque contours.
Centrifuge: rapid and sensitive classification of metagenomic sequences.
Kim, Daehwan; Song, Li; Breitwieser, Florian P; Salzberg, Steven L
2016-12-01
Centrifuge is a novel microbial classification engine that enables rapid, accurate, and sensitive labeling of reads and quantification of species on desktop computers. The system uses an indexing scheme based on the Burrows-Wheeler transform (BWT) and the Ferragina-Manzini (FM) index, optimized specifically for the metagenomic classification problem. Centrifuge requires a relatively small index (4.2 GB for 4078 bacterial and 200 archaeal genomes) and classifies sequences at very high speed, allowing it to process the millions of reads from a typical high-throughput DNA sequencing run within a few minutes. Together, these advances enable timely and accurate analysis of large metagenomics data sets on conventional desktop computers. Because of its space-optimized indexing schemes, Centrifuge also makes it possible to index the entire NCBI nonredundant nucleotide sequence database (a total of 109 billion bases) with an index size of 69 GB, in contrast to k-mer-based indexing schemes, which require far more extensive space. © 2016 Kim et al.; Published by Cold Spring Harbor Laboratory Press.
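The BWT/FM-index machinery Centrifuge builds on can be illustrated in miniature: construct the transform from sorted rotations, then count pattern occurrences by backward search. The naive rank scans below are for clarity only; production tools use compressed, sampled rank structures to achieve the speed and index sizes quoted above.

```python
def bwt(text):
    """Burrows-Wheeler transform via sorted rotations ('$' terminator)."""
    text += "$"
    rotations = sorted(text[i:] + text[:i] for i in range(len(text)))
    return "".join(r[-1] for r in rotations)

def fm_count(bw, pattern):
    """Count pattern occurrences by FM-index backward search.

    Rank queries are naive linear scans here for clarity; real
    implementations use sampled rank structures for speed."""
    first_col = sorted(bw)
    def c(ch):              # row of the first suffix starting with ch
        return first_col.index(ch)
    def rank(ch, i):        # occurrences of ch in bw[:i]
        return bw[:i].count(ch)
    lo, hi = 0, len(bw)
    for ch in reversed(pattern):
        lo, hi = c(ch) + rank(ch, lo), c(ch) + rank(ch, hi)
        if lo >= hi:
            return 0
    return hi - lo

genome = "GATTACATTAC"
print(fm_count(bwt(genome), "TTAC"))  # -> 2 (occurrences in the text)
```

Because the backward search narrows an interval of sorted suffixes one character at a time, query time depends on pattern length rather than genome size, which is what makes indexing 109 billion bases in 69 GB practical.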
Niland, Courtney N.; Jankowsky, Eckhard; Harris, Michael E.
2016-01-01
Quantification of the specificity of RNA binding proteins and RNA processing enzymes is essential to understanding their fundamental roles in biological processes. High Throughput Sequencing Kinetics (HTS-Kin) uses high throughput sequencing and internal competition kinetics to simultaneously monitor the processing rate constants of thousands of substrates by RNA processing enzymes. This technique has provided unprecedented insight into the substrate specificity of the tRNA processing endonuclease ribonuclease P. Here, we investigate the accuracy and robustness of measurements associated with each step of the HTS-Kin procedure. We examine the effect of substrate concentration on the observed rate constant, determine the optimal kinetic parameters, and provide guidelines for reducing error in amplification of the substrate population. Importantly, we find that high-throughput sequencing and experimental reproducibility contribute their own sources of error, and these are the main sources of imprecision in the quantified results when otherwise-optimized guidelines are followed. PMID:27296633
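Internal competition kinetics has a compact mathematical core: substrates processed in the same reaction each decay first-order, so the ratio of log fractional survivals gives the relative rate constant directly. A minimal sketch (the read-count normalization that yields the fractions is omitted):

```python
import math

def relative_rate_constant(frac_remaining_sub, frac_remaining_ref):
    """k_sub / k_ref by internal competition.

    Substrates reacting in the same tube each follow first-order decay,
    so ln(f_sub) / ln(f_ref) equals the ratio of their rate constants.
    In HTS-Kin the fractions f would come from normalized read counts
    of each substrate at time t versus time zero.
    """
    return math.log(frac_remaining_sub) / math.log(frac_remaining_ref)

# If the reference is 50% consumed while a variant is 75% consumed,
# the variant is processed twice as fast:
print(relative_rate_constant(0.25, 0.50))  # -> 2.0
```

Because every substrate shares the same enzyme, time points, and sequencing run, systematic errors largely cancel in the ratio, which is why the residual imprecision traced above comes mainly from sequencing and experimental reproducibility.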
The genome of melon (Cucumis melo L.)
Garcia-Mas, Jordi; Benjak, Andrej; Sanseverino, Walter; Bourgeois, Michael; Mir, Gisela; González, Víctor M.; Hénaff, Elizabeth; Câmara, Francisco; Cozzuto, Luca; Lowy, Ernesto; Alioto, Tyler; Capella-Gutiérrez, Salvador; Blanca, Jose; Cañizares, Joaquín; Ziarsolo, Pello; Gonzalez-Ibeas, Daniel; Rodríguez-Moreno, Luis; Droege, Marcus; Du, Lei; Alvarez-Tejado, Miguel; Lorente-Galdos, Belen; Melé, Marta; Yang, Luming; Weng, Yiqun; Navarro, Arcadi; Marques-Bonet, Tomas; Aranda, Miguel A.; Nuez, Fernando; Picó, Belén; Gabaldón, Toni; Roma, Guglielmo; Guigó, Roderic; Casacuberta, Josep M.; Arús, Pere; Puigdomènech, Pere
2012-01-01
We report the genome sequence of melon, an important horticultural crop worldwide. We assembled 375 Mb of the double-haploid line DHL92, representing 83.3% of the estimated melon genome. We predicted 27,427 protein-coding genes, which we analyzed by reconstructing 22,218 phylogenetic trees, allowing mapping of the orthology and paralogy relationships of sequenced plant genomes. We observed the absence of recent whole-genome duplications in the melon lineage since the ancient eudicot triplication, and our data suggest that transposon amplification may in part explain the increased size of the melon genome compared with the close relative cucumber. A low number of nucleotide-binding site–leucine-rich repeat disease resistance genes were annotated, suggesting the existence of specific defense mechanisms in this species. The DHL92 genome was compared with that of its parental lines allowing the quantification of sequence variability in the species. The use of the genome sequence in future investigations will facilitate the understanding of evolution of cucurbits and the improvement of breeding strategies. PMID:22753475
Martins, Ademir Jesus; Lins, Rachel Mazzei Moura de Andrade; Linss, Jutta Gerlinde Birgitt; Peixoto, Alexandre Afranio; Valle, Denise
2009-07-01
The nature of pyrethroid resistance in Aedes aegypti Brazilian populations was investigated. Quantification of enzymes related to metabolic resistance in two distinct populations, located in the Northeast and Southeast regions, revealed increases in Glutathione-S-transferase (GST) and Esterase levels. Additionally, polymorphism was found in the IIS6 region of the Ae. aegypti voltage-gated sodium channel (AaNaV), the pyrethroid target site. Sequences were classified in two haplotype groups, A and B, according to the size of the intron in that region. Rockefeller, a susceptible control lineage, contains only B sequences. In field populations, some A sequences present a substitution at the 1011 site (Ile/Met). When resistant and susceptible individuals were compared, the frequencies of both A (with the Met mutation) and B sequences were slightly increased in resistant specimens. The involvement of the AaNaV polymorphism in pyrethroid resistance and the metabolic mechanisms that lead to potential cross-resistance between organophosphates and pyrethroids are discussed.
Seashols-Williams, Sarah; Green, Raquel; Wohlfahrt, Denise; Brand, Angela; Tan-Torres, Antonio Limjuco; Nogales, Francy; Brooks, J Paul; Singh, Baneshwar
2018-05-17
Sequencing and classification of microbial taxa within forensically relevant biological fluids has the potential for applications in the forensic science and biomedical fields. The quantity of bacterial DNA from human samples is currently estimated based on quantity of total DNA isolated. This method can miscalculate bacterial DNA quantity due to the mixed nature of the sample, and consequently library preparation is often unreliable. We developed an assay that can accurately and specifically quantify bacterial DNA within a mixed sample for reliable 16S ribosomal DNA (16S rDNA) library preparation and high throughput sequencing (HTS). A qPCR method was optimized using universal 16S rDNA primers, and a commercially available bacterial community DNA standard was used to develop a precise standard curve. Following qPCR optimization, 16S rDNA libraries from saliva, vaginal and menstrual secretions, urine, and fecal matter were amplified and evaluated at various DNA concentrations; successful HTS data were generated with as low as 20 pg of bacterial DNA. Changes in bacterial DNA quantity did not impact observed relative abundances of major bacterial taxa, but relative abundance changes of minor taxa were observed. Accurate quantification of microbial DNA resulted in consistent, successful library preparations for HTS analysis. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
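The qPCR quantification step described above rests on a standard curve: Cq values from a dilution series of the community DNA standard are regressed against log10 input quantity, and the fit is inverted to estimate bacterial DNA in an unknown. A hedged sketch, with all numbers illustrative rather than taken from the study:

```python
import math

def fit_standard_curve(quantities_pg, cq_values):
    """Least-squares fit of Cq = slope * log10(quantity) + intercept."""
    xs = [math.log10(q) for q in quantities_pg]
    n = len(xs)
    mx, my = sum(xs) / n, sum(cq_values) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, cq_values))
             / sum((x - mx) ** 2 for x in xs))
    intercept = my - slope * mx
    # Amplification efficiency implied by the slope: E = 10**(-1/slope) - 1
    efficiency = 10 ** (-1 / slope) - 1
    return slope, intercept, efficiency

def quantify(cq, slope, intercept):
    """Invert the curve: estimated input bacterial DNA in pg."""
    return 10 ** ((cq - intercept) / slope)

# Ten-fold dilution series of a community standard (made-up Cq values)
slope, intercept, eff = fit_standard_curve(
    [20000, 2000, 200, 20], [18.1, 21.4, 24.8, 28.2])
est = quantify(24.8, slope, intercept)
print(f"slope={slope:.2f}, efficiency={eff:.0%}, estimate ≈ {est:.0f} pg")
```

A slope near -3.32 corresponds to ~100% amplification efficiency, which is one of the optimization criteria such assays are judged against.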
Integrative analysis with ChIP-seq advances the limits of transcript quantification from RNA-seq.
Liu, Peng; Sanalkumar, Rajendran; Bresnick, Emery H; Keleş, Sündüz; Dewey, Colin N
2016-08-01
RNA-seq is currently the technology of choice for global measurement of transcript abundances in cells. Despite its successes, isoform-level quantification remains difficult because short RNA-seq reads are often compatible with multiple alternatively spliced isoforms. Existing methods rely heavily on uniquely mapping reads, which are not available for numerous isoforms that lack regions of unique sequence. To improve quantification accuracy in such difficult cases, we developed a novel computational method, prior-enhanced RSEM (pRSEM), which uses a complementary data type in addition to RNA-seq data. We found that ChIP-seq data of RNA polymerase II and histone modifications were particularly informative in this approach. In qRT-PCR validations, pRSEM was shown to be superior to competing methods in estimating relative isoform abundances within or across conditions. Data-driven simulations suggested that pRSEM has a greatly decreased false-positive rate at the expense of a small increase in false-negative rate. In aggregate, our study demonstrates that pRSEM transforms existing capacity to precisely estimate transcript abundances, especially at the isoform level. © 2016 Liu et al.; Published by Cold Spring Harbor Laboratory Press.
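The multi-mapping-read problem the abstract describes is what expectation-maximization methods in the RSEM family address; pRSEM additionally weights the prior with ChIP-seq evidence. As a hedged illustration of the core idea only (not pRSEM's actual model), here is a bare-bones EM that fractionally assigns ambiguous reads between isoforms:

```python
def em_isoform_abundance(read_compat, n_iter=100):
    """read_compat: list of sets; each set holds the isoform ids one read
    is compatible with. Returns estimated relative abundances."""
    isoforms = sorted(set.union(*read_compat))
    theta = {i: 1.0 / len(isoforms) for i in isoforms}  # uniform start
    for _ in range(n_iter):
        counts = {i: 0.0 for i in isoforms}
        for compat in read_compat:
            z = sum(theta[i] for i in compat)
            for i in compat:              # E-step: fractional assignment
                counts[i] += theta[i] / z
        total = sum(counts.values())
        theta = {i: c / total for i, c in counts.items()}  # M-step
    return theta

# 4 reads unique to isoform A, 2 unique to B, 6 compatible with both
reads = [{"A"}] * 4 + [{"B"}] * 2 + [{"A", "B"}] * 6
theta = em_isoform_abundance(reads)
print(theta["A"] > theta["B"])  # True: ambiguous reads resolved toward A
```

At the fixed point the ambiguous reads are split in proportion to the abundances implied by the unique reads (here 2:1), which is exactly the information that vanishes when an isoform has no unique region at all, motivating pRSEM's external prior.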
2016-01-01
NMR-based metabolomics has shown considerable promise in disease diagnosis and biomarker discovery because it allows one to nondestructively identify and quantify large numbers of novel metabolite biomarkers in both biofluids and tissues. Precise metabolite quantification is a prerequisite to move any chemical biomarker or biomarker panel from the lab to the clinic. Among the biofluids commonly used for disease diagnosis and prognosis, urine has several advantages. It is abundant, sterile, and easily obtained, needs little sample preparation, and does not require invasive medical procedures for collection. Furthermore, urine captures and concentrates many “unwanted” or “undesirable” compounds throughout the body, providing a rich source of potentially useful disease biomarkers; however, incredible variation in urine chemical concentrations makes analysis of urine and identification of useful urinary biomarkers by NMR challenging. We discuss a number of the most significant issues regarding NMR-based urinary metabolomics with specific emphasis on metabolite quantification for disease biomarker applications and propose data collection and instrumental recommendations regarding NMR pulse sequences, acceptable acquisition parameter ranges, relaxation effects on quantitation, proper handling of instrumental differences, sample preparation, and biomarker assessment. PMID:26745651
Huang, Huali; Cheng, Fang; Wang, Ruoan; Zhang, Dabing; Yang, Litao
2013-01-01
Proper selection of endogenous reference genes and their real-time PCR assays is quite important in genetically modified organisms (GMOs) detection. To find a suitable endogenous reference gene and its real-time PCR assay for common wheat (Triticum aestivum L.) DNA content or copy number quantification, four previously reported wheat endogenous reference genes and their real-time PCR assays were comprehensively evaluated for the target gene sequence variation and their real-time PCR performance among 37 common wheat lines. Three SNPs were observed in the PKABA1 and ALMT1 genes, and these SNPs significantly decreased the efficiency of real-time PCR amplification. GeNorm analysis of the real-time PCR performance of each gene among common wheat lines showed that the Waxy-D1 assay had the lowest M values with the best stability among all tested lines. All results indicated that the Waxy-D1 gene and its real-time PCR assay were most suitable to be used as an endogenous reference gene for common wheat DNA content quantification. The validated Waxy-D1 gene assay will be useful in establishing accurate and creditable qualitative and quantitative PCR analysis of GM wheat.
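The geNorm M value used above to rank candidate reference genes is, per Vandesompele et al., the mean standard deviation of log2 expression ratios between a gene and every other candidate; a stable gene keeps a constant ratio to the others across lines. A compact sketch with invented toy data (not the study's measurements):

```python
import math
from statistics import stdev

def genorm_m(expression):
    """expression: {gene: [level per line]} -> {gene: M value}."""
    genes = list(expression)
    m_values = {}
    for g in genes:
        vs = []
        for h in genes:
            if h == g:
                continue
            ratios = [math.log2(a / b)
                      for a, b in zip(expression[g], expression[h])]
            vs.append(stdev(ratios))  # pairwise variation V(g, h)
        m_values[g] = sum(vs) / len(vs)
    return m_values

# Toy data: two genes co-vary across lines; one (think of a gene whose
# assay is hit by SNPs in some lines) jumps in the last line.
data = {
    "Waxy-D1": [1.0, 1.1, 0.9, 1.0],
    "geneB":   [2.0, 2.3, 1.8, 2.0],
    "geneC":   [0.5, 0.4, 0.6, 2.0],
}
m = genorm_m(data)
print(min(m, key=m.get))  # → Waxy-D1 (lowest M = most stable here)
```

Lower M means more stable expression relative to the rest of the panel, which is why the Waxy-D1 assay's lowest M across the 37 lines supports its choice as the reference.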
Daruwalla, Nayreen; Belur, Jyoti; Kumar, Meena; Tiwari, Vinay; Sarabahi, Sujata; Tilley, Nick; Osrin, David
2014-11-30
Most burns happen in low- and middle-income countries. In India, deaths related to burns are more common in women than in men and occur against a complex background in which the cause - accidental or non-accidental, suicidal or homicidal - is often unclear. Our study aimed to understand the antecedents to burns and the problem of ascribing cause, the sequence of medicolegal events after a woman was admitted to hospital, and potential opportunities for improvement. We conducted semi-structured interviews with 33 women admitted to two major burns units, their families, and 26 key informant doctors, nurses, and police officers. We used framework analysis to examine the context in which burns occurred and the sequence of medicolegal action after admission to hospital. Interviewees described accidents, attempted suicide, and attempted homicide. Distinguishing between these was difficult because the underlying combination of poverty and cultural precedent was common to all and action was contingent on potentially conflicting narratives. Space constraints, problems with cooking equipment, and inflammable clothing increased the risk of accidental burns, but coexisted with household conflict, gender-based violence, and alcohol use. Most burns were initially ascribed to accidents. Clinicians adhered to medicolegal procedures and the police carried out their investigative requirements relatively rapidly, but both groups felt vulnerable in the face of the legal process. Women's understandable reluctance to describe burns as non-accidental, the contested nature of statements, their perceived history of changeability, the limited quality and validity of forensic evidence, and the requirement for resilience on the part of clients underlay a general pessimism. The similarities between accident and intention cluster so tightly as to make them challenging to distinguish.
The contested status of forensic evidence and a reliance on testimony means that only a minority of cases lead to conviction. The emphasis should be on improving documentation, communication between service providers, and public understanding of the risks of burns.
Farmer, M. T.; Gerardi, C.; Bremer, N.; ...
2016-10-31
The reactor accidents at Fukushima Dai-ichi have rekindled interest in late-phase severe accident behavior involving reactor pressure vessel breach and discharge of molten core melt into the containment. Two technical issues of interest in this area include core-concrete interaction and the extent to which the core debris may be quenched and rendered coolable by top flooding. The OECD-sponsored Melt Coolability and Concrete Interaction (MCCI) programs at Argonne National Laboratory included the conduct of large-scale reactor material experiments and associated analysis, with the objectives of resolving the ex-vessel debris coolability issue and addressing remaining uncertainties related to long-term two-dimensional molten core-concrete interactions under both wet and dry cavity conditions. These tests provided a broad database to support accident management planning, as well as the development and validation of models and codes that can be used to extrapolate the experiment results to plant conditions. This paper provides a high-level overview of the key experiment results obtained during the program. Finally, a discussion is also provided that describes technical gaps that remain in this area, several of which have arisen based on the sequence of events and operator actions during Fukushima.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Benet, L.V.; Caroli, C.; Cornet, P.
1995-09-01
This paper reports part of a study of possible severe pressurized water reactor (PWR) accidents. The need for containment modeling, and in particular for a hydrogen risk study, was reinforced in France after 1990, with the requirement that severe accidents must be taken into account in the design of future plants. This new need of assessing the transient local hydrogen concentration led to the development, in the Mechanical Engineering and Technology Department of the French Atomic Energy Commission (CEA/DMT), of the multidimensional code GEYSER/TONUS for containment analysis. A detailed example of the use of this code is presented. The mixture consisted of noncondensable gases (air or air plus hydrogen), water vapor, and liquid water. This is described by a compressible homogeneous two-phase flow model, and wall condensation is based on the Chilton-Colburn formula and the analogy between heat and mass transfer. Results are given for a transient two-dimensional axially symmetric computation for the first hour of a simplified accident sequence. In this there was an initial injection of a large amount of water vapor, followed by a smaller amount and by hydrogen injection.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Farmer, M. T.
The overall objective of the current work is to carry out a scoping analysis to determine the impact of ATF on late-phase accident progression; in particular, the molten core-concrete interaction portion of the sequence that occurs after the core debris fails the reactor vessel and relocates into containment. This additional study augments previous work by including kinetic effects that govern chemical reaction rates during core-concrete interaction. The specific ATF considered as part of this study is SiC-clad UO2.
Interim reliability evaluation program, Browns Ferry 1
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mays, S.E.; Poloski, J.P.; Sullivan, W.H.
1981-01-01
Probabilistic risk analysis techniques, i.e., event tree and fault tree analysis, were utilized to provide a risk assessment of the Browns Ferry Nuclear Plant Unit 1. Browns Ferry 1 is a General Electric boiling water reactor of the BWR 4 product line with a Mark 1 (drywell and torus) containment. Within the guidelines of the IREP Procedure and Schedule Guide, dominant accident sequences that contribute to public health and safety risks were identified and grouped according to release categories.
Remily-Wood, Elizabeth R; Benson, Kaaron; Baz, Rachid C; Chen, Y Ann; Hussein, Mohamad; Hartley-Brown, Monique A; Sprung, Robert W; Perez, Brianna; Liu, Richard Z; Yoder, Sean J; Teer, Jamie K; Eschrich, Steven A; Koomen, John M
2014-10-01
Quantitative MS assays for Igs are compared with existing clinical methods in samples from patients with plasma cell dyscrasias, for example, multiple myeloma (MM). Using LC-MS/MS data, Ig constant region peptides, and transitions were selected for LC-MRM MS. Quantitative assays were used to assess Igs in serum from 83 patients. RNA sequencing and peptide-based LC-MRM are used to define peptides for quantification of the disease-specific Ig. LC-MRM assays quantify serum levels of Igs and their isoforms (IgG1-4, IgA1-2, IgM, IgD, and IgE, as well as kappa (κ) and lambda (λ) light chains). LC-MRM quantification has been applied to single samples from a patient cohort and a longitudinal study of an IgE patient undergoing treatment, to enable comparison with existing clinical methods. Proof-of-concept data for defining and monitoring variable region peptides are provided using the H929 MM cell line and two MM patients. LC-MRM assays targeting constant region peptides determine the type and isoform of the involved Ig and quantify its expression; the LC-MRM approach has improved sensitivity compared with the current clinical method, but slightly higher inter-assay variability. Detection of variable region peptides is a promising way to improve Ig quantification, which could produce a dramatic increase in sensitivity over existing methods, and could further complement current clinical techniques. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Smol, Thomas; Nibourel, Olivier; Marceau-Renaut, Alice; Celli-Lebras, Karine; Berthon, Céline; Quesnel, Bruno; Boissel, Nicolas; Terré, Christine; Thomas, Xavier; Castaigne, Sylvie; Dombret, Hervé; Preudhomme, Claude; Renneville, Aline
2015-12-01
EVI1 overexpression confers poor prognosis in acute myeloid leukemia (AML). Quantification of EVI1 expression has been mainly assessed by real-time quantitative PCR (RT-qPCR) based on relative quantification of the EVI1-1D splice variant. In this study, we developed an RT-qPCR assay to perform quantification of EVI1 expression covering the different splice variants. A sequence localized in EVI1 exons 14 and 15 was cloned into plasmids that were used to establish RT-qPCR standard curves. Threshold values to define EVI1 overexpression were determined using 17 bone marrow (BM) and 31 peripheral blood (PB) control samples and were set at 1% in BM and 0.5% in PB. Samples from 64 AML patients overexpressing EVI1 included in the ALFA-0701 or -0702 trials were collected at diagnosis and during follow-up (n=152). Median EVI1 expression at AML diagnosis was 23.3% in BM and 3.6% in PB. EVI1 expression levels significantly decreased between diagnostic and post-induction samples, with an average variation from 21.6% to 3.56% in BM and from 4.0% to 0.22% in PB, but did not exceed a 1-log10 reduction. Our study demonstrates that the magnitude of reduction in EVI1 expression levels between AML diagnosis and follow-up is not sufficient to allow sensitive detection of minimal residual disease. Copyright © 2015 Elsevier Ltd. All rights reserved.
NASA Technical Reports Server (NTRS)
Cassell, Rick; Smith, Alex; Connors, Mary; Wojciech, Jack; Rosekind, Mark R. (Technical Monitor)
1996-01-01
As new technologies and procedures are introduced into the National Airspace System, whether they are intended to improve efficiency, capacity, or safety level, the quantification of potential changes in safety levels is of vital concern. Applications of technology can improve safety levels and allow the reduction of separation standards. An excellent example is the Precision Runway Monitor (PRM). By taking advantage of the surveillance and display advances of PRM, airports can run instrument parallel approaches to runways separated by 3400 feet with the same level of safety as parallel approaches to runways separated by 4300 feet using the standard technology. Despite a wealth of information from flight operations and testing programs, there is no readily quantifiable relationship between numerical safety levels and the separation standards that apply to aircraft on final approach. This paper presents a modeling approach to quantify the risk associated with reducing separation on final approach. Reducing aircraft separation, both laterally and longitudinally, has been the goal of several aviation R&D programs over the past several years. Many of these programs have focused on technological solutions to improve navigation accuracy, surveillance accuracy, aircraft situational awareness, controller situational awareness, and other technical and operational factors that are vital to maintaining flight safety. The risk assessment model relates different types of potential aircraft accidents and incidents and their contribution to overall accident risk. The framework links accident risks to a hierarchy of failsafe mechanisms characterized by procedures and interventions. The model will be used to assess the overall level of safety associated with reducing separation standards and the introduction of new technology and procedures, as envisaged under the Free Flight concept. 
The model framework can be applied to various aircraft scenarios, including parallel and in-trail approaches. This research was performed under contract to NASA and in cooperation with the FAA's Safety Division (ASY).
Beelaert, G; Van Heddegem, L; Van Frankenhuijsen, M; Vandewalle, G; Compernolle, V; Florence, E; Fransen, K
2016-08-01
Oral fluid has many advantages over blood-based techniques: it is less invasive, eliminates the occupational risk associated with needle-stick accidents, and collection can be self-administered. Each individual test is packaged with a corresponding collection device. This study tested the suitability of the Intercept Oral Specimen Collection Device for different HIV diagnostic tests: three different rapid HIV tests and two adapted ELISAs, which were evaluated and compared with a gold standard on blood. In addition, a total IgG quantification was performed to demonstrate the quality of the specimen. HIV antibodies were detected with a sensitivity of 100%, 99.3%, 98.6%, 100% and 95.7% for DPP, OraQuick, Aware, Genscreen and Vironostika, respectively, using the Intercept Collection Device. Respective specificities were 100%, 100%, 99.3%, 97.3% and 100%. Copyright © 2016 Elsevier B.V. All rights reserved.
Quantification of pilot workload via instrument scan
NASA Technical Reports Server (NTRS)
Tole, J. R.; Stephens, A. T.; Harris, R. L., Sr.; Ephrath, A.
1982-01-01
The use of visual scanning behavior as an indicator of pilot workload is described. The relationship between the level of performance on a constant piloting task under simulated IFR conditions, the skill of the pilot, the level of mental workload induced by an additional verbal task imposed on the basic control task, and visual scanning behavior is investigated. An increase in fixation dwell times, especially on the primary instrument, is indicated with increased mental loading. Skilled subjects 'stared' less under increased loading than did novice pilots. Sequences of instrument fixations were also examined. The percentage occurrence of the subjects' most used sequences decreased with increased task difficulty for novice subjects but not for highly skilled subjects. Entropy rate (bits/sec) of the sequence of fixations was also used to quantify the scan pattern. It consistently decreased for most subjects as loading increased across the four levels used.
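The entropy-rate measure above can be sketched as Shannon entropy over the fixated instruments (bits per fixation) scaled by the fixation rate (fixations per second). The paper's exact formulation may differ (e.g. it may use transition entropy over fixation sequences); the instrument labels and timings below are illustrative only.

```python
import math
from collections import Counter

def entropy_rate(fixations, total_time_s):
    """fixations: sequence of instrument labels; returns bits per second."""
    counts = Counter(fixations)
    n = len(fixations)
    # Shannon entropy of the fixation distribution, in bits per fixation
    h = -sum((c / n) * math.log2(c / n) for c in counts.values())
    return h * (n / total_time_s)  # bits/fixation * fixations/sec

# A rigid two-instrument scan vs. a varied scan over the same 4 seconds
rigid  = ["ADI", "ALT", "ADI", "ALT", "ADI", "ALT", "ADI", "ALT"]
varied = ["ADI", "ALT", "ASI", "VSI", "ADI", "HSI", "ALT", "ASI"]
print(entropy_rate(rigid, 4.0) < entropy_rate(varied, 4.0))  # True
```

A falling entropy rate under load matches the "staring" behavior reported: dwell times lengthen and the scan concentrates on fewer instruments, so both terms of the product shrink.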
Quantifying Genome Editing Outcomes at Endogenous Loci using SMRT Sequencing
Clark, Joseph; Punjya, Niraj; Sebastiano, Vittorio; Bao, Gang; Porteus, Matthew H
2014-01-01
Targeted genome editing with engineered nucleases has transformed the ability to introduce precise sequence modifications at almost any site within the genome. A major obstacle to probing the efficiency and consequences of genome editing is that no existing method enables the frequency of different editing events to be simultaneously measured across a cell population at any endogenous genomic locus. We have developed a novel method for quantifying individual genome editing outcomes at any site of interest using single molecule real time (SMRT) DNA sequencing. We show that this approach can be applied at various loci, using multiple engineered nuclease platforms including TALENs, RNA-guided endonucleases (CRISPR/Cas9), and ZFNs, and in different cell lines to identify conditions and strategies in which the desired engineering outcome has occurred. This approach facilitates the evaluation of new gene editing technologies and permits sensitive quantification of editing outcomes in almost every experimental system used. PMID:24685129
Evaluation of a compact tinnitus therapy by electrophysiological tinnitus decompensation measures.
Low, Yin Fen; Argstatter, Heike; Bolay, Hans Volker; Strauss, Daniel J
2008-01-01
Large-scale neural correlates of the tinnitus decompensation have been identified by using wavelet phase stability criteria of single-sweep sequences of auditory late responses (ALRs). Our previous work showed that the synchronization stability in ALR sequences might be used for objective quantification of the tinnitus decompensation and attention, which links to the Jastreboff tinnitus model. In this study, we intend to provide an objective evaluation for quantifying the effect of music therapy in tinnitus patients. We examined neural correlates of the attentional mechanism in single-sweep sequences of ALRs in chronic tinnitus patients who underwent a compact therapy course, using the maximum entropy auditory paradigm. Results by our measure showed that the extent of differentiation between attended and unattended conditions improved significantly after the therapy. It is concluded that the wavelet phase synchronization stability of ALR single sweeps can be used for the objective evaluation of tinnitus therapies, in this case the compact tinnitus music therapy.
Picelli, Carina G; Borges, Rafael J; Fernandes, Carlos A H; Matioli, Fabio M; Fernandes, Carla F C; Sobrinho, Juliana C; Holanda, Rudson J; Ozaki, Luiz S; Kayano, Anderson M; Calderon, Leonardo A; Fontes, Marcos R M; Stábeli, Rodrigo G; Soares, Andreimar M
2017-10-01
Phospholipase A2 (PLA2) inhibitors (PLIs) produced by venomous and non-venomous snakes play an essential role in resistance to venom PLA2s. These endogenous inhibitors may be classified by their fold into PLIα, PLIβ and PLIγ. PLA2s cause myonecrosis in snake envenomation, a consequence that is not efficiently neutralized by antivenom treatment. This work aimed to identify and characterize two PLIs from the Amazonian snake species Bothrops atrox and Micrurus lemniscatus. Liver tissue RNA of specimens from each species was isolated and amplified by RT-PCR using PCR primers based on known PLIγ gene sequences, followed by cloning and sequencing of amplified fragments. Sequence similarity studies showed elevated identity with PLIγ inhibitor gene sequences from other snake species. Molecular models of the translated inhibitor gene sequences resemble the canonical three-finger fold of PLIγ and support the hypothesis that the decapeptide (residues 107-116) may be responsible for PLA2 inhibition. Structural studies and the action mechanism of these PLIs may provide the information necessary to evaluate their potential as antivenom or as a complement to current ophidian accident treatment. Copyright © 2017 Elsevier B.V. All rights reserved.
Stets, Maria Isabel; Alqueres, Sylvia Maria Campbell; Souza, Emanuel Maltempi; Pedrosa, Fábio de Oliveira; Schmid, Michael; Hartmann, Anton; Cruz, Leonardo Magalhães
2015-10-01
Azospirillum is a rhizobacterial genus containing plant growth-promoting species associated with different crops worldwide. Azospirillum brasilense strains exhibit a growth-promoting effect by means of phytohormone production and possibly by N2 fixation. However, one of the most important factors for achieving an increase in crop yield by plant growth-promoting rhizobacteria is the survival of the inoculant in the rhizosphere, which is not always achieved. The objective of this study was to develop quantitative PCR protocols for the strain-specific quantification of A. brasilense FP2. A novel approach was applied to identify strain-specific DNA sequences based on a comparison of the genomic sequences within the same species. The draft genome sequences of A. brasilense FP2 and Sp245 were aligned, and FP2-specific regions were filtered and checked for other possible matches in public databases. Strain-specific regions were then selected to design and evaluate strain-specific primer pairs. The primer pairs AzoR2.1, AzoR2.2, AzoR5.1, AzoR5.2, and AzoR5.3 were specific for the A. brasilense FP2 strain. These primer pairs were used to monitor quantitatively the population of A. brasilense in wheat roots under sterile and nonsterile growth conditions. In addition, coinoculations with other plant growth-promoting bacteria in wheat were performed under nonsterile conditions. The results showed that A. brasilense FP2 inoculated into wheat roots is highly competitive and achieves high cell numbers (∼10⁷ CFU/g [fresh weight] of root) in the rhizosphere even under nonsterile conditions and when coinoculated with other rhizobacteria, maintaining the population at rather stable levels for at least up to 13 days after inoculation. The strategy used here can be applied to other organisms whose genome sequences are available. Copyright © 2015, American Society for Microbiology. All Rights Reserved.
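The strain-specific-region search described above (align FP2 against Sp245, keep what is unique, then screen against public databases) can be caricatured as a k-mer set difference. The genome strings below are toy stand-ins, not real A. brasilense sequence, and the real workflow operates on whole-genome alignments rather than exact k-mers.

```python
def kmers(seq, k):
    """All length-k substrings of a sequence."""
    return {seq[i:i + k] for i in range(len(seq) - k + 1)}

def strain_specific_kmers(target, reference, k):
    """k-mers present in `target` but absent from `reference`:
    candidate regions for strain-specific primer design."""
    return kmers(target, k) - kmers(reference, k)

fp2_like   = "ATGGCCTTAGGATCCGTTAACG"  # hypothetical FP2 fragment
sp245_like = "ATGGCCTTAGCACGGTTAACG"   # hypothetical Sp245 fragment
candidates = strain_specific_kmers(fp2_like, sp245_like, k=8)
# Every candidate occurs in the FP2-like sequence but not the Sp245-like one
print(all(c in fp2_like and c not in sp245_like for c in candidates))  # True
```

In practice each surviving region would still be checked against a public database (the study's second filter) before primers such as the AzoR series are designed within it.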
Morsbach, F; Gordic, S; Gruner, C; Niemann, M; Goetti, R; Gotschy, A; Kozerke, S; Alkadhi, H; Manka, R
2016-08-15
This study aims to determine whether the quantification of myocardial fibrosis in patients with Fabry disease (FD) and hypertrophic cardiomyopathy (HCM) using a late gadolinium enhancement (LGE) single-breath-hold three-dimensional (3D) inversion recovery magnetic resonance (MR) imaging sequence is comparable with a clinically established two-dimensional (2D) multi-breath-hold sequence. In this retrospective, IRB-approved study, 40 consecutive patients (18 male; mean age 50 ± 17 years) with Fabry disease (n=18) and HCM (n=22) underwent MR imaging at 1.5 T. Spatial resolution was the same for 3D and 2D images (field of view, 350 × 350 mm²; in-plane resolution, 1.2 × 1.2 mm²; section thickness, 8 mm). Datasets were analyzed for subjective image quality; myocardial and fibrotic mass and total fibrotic tissue percentage were quantified. There was no significant difference in subjective image quality between 3D and 2D acquisitions (P=0.1 and P=0.3) for either disease. In patients with Fabry disease there were no significant differences between 3D and 2D acquisitions for myocardial mass (P=0.55), fibrous tissue mass (P=0.89), and total fibrous percentage (P=0.67), with good agreement between acquisitions according to Bland-Altman analyses. In patients with HCM there were also no significant differences between acquisitions for myocardial mass (P=0.48), fibrous tissue mass (P=0.56), and total fibrous percentage (P=0.67), with good agreement according to Bland-Altman analyses. Acquisition time was significantly shorter for the 3D sequence (25 ± 5 s) as compared to the 2D sequence (349 ± 62 s, P<0.001). In patients with Fabry disease and HCM, 3D LGE imaging provides diagnostic information equivalent to a standard 2D sequence with regard to quantification of myocardial fibrosis, but at superior acquisition speed. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
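The Bland-Altman agreement analysis used in such comparisons reduces to a bias (mean paired difference) and 95% limits of agreement. A minimal sketch with invented fibrous-mass values, not the study's data:

```python
import statistics

def bland_altman(a, b):
    """Return the bias and 95% limits of agreement for paired measurements."""
    diffs = [x - y for x, y in zip(a, b)]
    bias = statistics.mean(diffs)
    sd = statistics.stdev(diffs)
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

# illustrative fibrous tissue mass (g) measured by two sequences
mass_3d = [12.1, 8.4, 15.0, 9.7, 11.2]
mass_2d = [11.8, 8.9, 14.6, 10.1, 11.0]
bias, (lo, hi) = bland_altman(mass_3d, mass_2d)
```

Good agreement in this framework means a bias near zero and limits of agreement narrow enough to be clinically acceptable.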
Montella, E; Schiavone, D; Apicella, L; Di Silverio, P; Gaudiosi, M; Ambrosone, E; Moscaritolo, E; Triassi, M
2014-01-01
Occupational exposure to biological risk is a frequent event that affects millions of workers in the health sector. Health care workers come into accidental contact with blood and other potentially infectious biological materials more frequently than the general population (occupational exposure). The pathogens most frequently implicated are the human immunodeficiency virus (HIV) and the hepatitis C (HCV) and hepatitis B (HBV) viruses. The World Health Organization estimates that each year more than 3 million health workers injure themselves with a sharp object contaminated with at least one of these pathogens: HIV (about 170,000 exposures), hepatitis B (approximately 2,000,000 exposures) and hepatitis C (approximately 900,000 exposures). In Italy, approximately 100,000 percutaneous exposures are estimated to take place each year. Needlestick injuries in health care workers are, in large part, preventable by measures such as the use of needlestick prevention devices (NPDs). The adoption of NPDs is extremely effective in reducing occupational exposure to biological risk (reductions of 63% to 100%). The aim of this study was to evaluate whether the adoption of NPDs for insulin therapy is cost-effective in terms of preventing biohazard accidents, compared with administration of insulin by traditional methods (syringe + vial). The estimation was carried out in the light of current legislation (European Directive 2010/32/EU and Italian Law 81/08), epidemiological data, the cost of accidents (by frequency), and alternative interventions. The cost-effectiveness evaluation included the construction of an economic model to weight the costs of accidents that can occur following the administration of insulin therapy with traditional methods.
The economic model was developed taking into account the international literature on accidental needlestick punctures and allowed the financial quantification of the event. We then calculated the cost of insulin therapy using the traditional method and compared it with the cost of insulin therapy administered by NPDs. The period of the study was the year 2010. The data thus obtained were used to evaluate the benefits of implementing NPDs for insulin therapy, in terms not only of economic advantage but also of preventive efficacy and accident-related costs.
Shimizu, Kie; Namimoto, Tomohiro; Nakagawa, Masataka; Morita, Kosuke; Oda, Seitaro; Nakaura, Takeshi; Utsunomiya, Daisuke; Yamashita, Yasuyuki
To compare automated six-point Dixon (6-p-Dixon) MRI with dual-echo chemical-shift imaging (CSI) and CT for hepatic fat fraction quantification in phantoms and in a clinical study. Phantoms and fifty-nine patients underwent both MRI and CT for quantitative fat measurements. In the phantom study, linear regression between fat concentration and 6-p-Dixon showed good agreement. In the clinical study, linear regression between 6-p-Dixon and dual-echo CSI showed good agreement. CT attenuation value was strongly correlated with 6-p-Dixon (R² = 0.852; P<0.001) and dual-echo CSI (R² = 0.812; P<0.001). Automated 6-p-Dixon and dual-echo CSI correlated accurately with the CT attenuation value of liver parenchyma. 6-p-Dixon has the potential for automated hepatic fat quantification. Copyright © 2017 Elsevier Inc. All rights reserved.
Yu, Ting Yue; Syeda, Fahima; Holmes, Andrew P; Osborne, Benjamin; Dehghani, Hamid; Brain, Keith L; Kirchhof, Paulus; Fabritz, Larissa
2014-08-01
We developed and validated a new optical mapping system for quantification of electrical activation and repolarisation in murine atria. The system makes use of a novel 2nd generation complementary metal-oxide-semiconductor (CMOS) camera with deliberate oversampling to allow both assessment of electrical activation with high spatial and temporal resolution (128 × 2048 pixels) and reliable assessment of atrial murine repolarisation using post-processing of signals. Optical recordings were taken from isolated, superfused and electrically stimulated murine left atria. The system reliably describes activation sequences, identifies areas of functional block, and allows quantification of conduction velocities and vectors. Furthermore, the system records murine atrial action potentials with comparable duration to both monophasic and transmembrane action potentials in murine atria. Copyright © 2014 The Authors. Published by Elsevier Ltd.. All rights reserved.
Sonnante, Gabriella; Montemurro, Cinzia; Morgese, Anita; Sabetta, Wilma; Blanco, Antonio; Pasqualone, Antonella
2009-11-11
Italian industrial pasta and durum wheat typical breads must be prepared using exclusively durum wheat semolina. Previously, a microsatellite sequence specific to the wheat D-genome had been chosen for traceability of soft wheat in semolina and bread samples, using qualitative and quantitative SYBR Green-based real-time PCR experiments. In this work, we describe an improved method based on the same soft wheat genomic region by means of a quantitative real-time PCR using a dual-labeled probe. Standard curves based on dilutions of 100% soft wheat flour, pasta, or bread were constructed. Durum wheat semolina, pasta, and bread samples were prepared with increasing amounts of soft wheat to verify the accuracy of the method. Results show that reliable quantifications were obtained especially for the samples containing a lower amount of soft wheat DNA, fulfilling the need to verify labeling of pasta and typical durum wheat breads.
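The standard-curve step behind such quantitative real-time PCR assays can be sketched as follows: Ct values from serial dilutions of the 100% soft wheat standard are fitted against log concentration, and the fitted line is inverted to quantify unknowns. The Ct values below are idealized (slope −3.3, i.e. ~100% amplification efficiency), not data from the study.

```python
def fit_standard_curve(log10_conc, ct):
    """Least-squares fit of Ct = slope * log10(conc) + intercept."""
    n = len(ct)
    mx = sum(log10_conc) / n
    my = sum(ct) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(log10_conc, ct))
    sxx = sum((x - mx) ** 2 for x in log10_conc)
    slope = sxy / sxx
    return slope, my - slope * mx

def quantify(ct, slope, intercept):
    """Invert the standard curve to estimate concentration from a sample Ct."""
    return 10 ** ((ct - intercept) / slope)

# serial 10-fold dilutions of the soft wheat standard (percent soft wheat DNA)
log_conc = [2.0, 1.0, 0.0, -1.0]     # log10 of 100%, 10%, 1%, 0.1%
cts = [18.0, 21.3, 24.6, 27.9]       # idealized Ct values
slope, intercept = fit_standard_curve(log_conc, cts)
```

An unknown sample's Ct is then converted to a percentage of soft wheat via `quantify`, which is the quantification logic such assays rely on.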
NASA Technical Reports Server (NTRS)
Korram, S.
1977-01-01
The design of general remote sensing-aided methodologies was studied to provide estimates of several important inputs to water yield forecast models. These input parameters are snow area extent, snow water content, and evapotranspiration. The study area is the Feather River Watershed (780,000 hectares) in Northern California. The general approach involved a stepwise sequence of identification of the required information, sample design, measurement/estimation, and evaluation of results. All the relevant and available information types needed in the estimation process were defined. These include Landsat, meteorological satellite, and aircraft imagery; topographic and geologic data; ground truth data; and climatic data from ground stations. A cost-effective multistage sampling approach was employed in quantification of all the required parameters. Physical and statistical models for both snow quantification and evapotranspiration estimation were developed. These models use information obtained from aerial and ground data through an appropriate statistical sampling design.
The Chernobyl accident — an epidemiological perspective
Cardis, E.; Hatch, M.
2011-01-01
Twenty-five years have passed since radioactive releases from the Chernobyl nuclear accident led to exposure of millions of people in Europe. Studies of affected populations have provided important new data on the links between radiation and cancer – particularly the risk of thyroid tumours from exposure to iodine isotopes - that are important not only for a fuller scientific understanding of radiation effects, but also for radiation protection. It is now well-documented that children and adolescents exposed to radioiodines from Chernobyl fallout have a sizeable dose-related increase in thyroid cancer, with risk greatest in those youngest at exposure and with a suggestion that deficiency in stable iodine may increase the risk. Data on thyroid cancer risks to other age groups are somewhat less definitive. In addition, there have been reported increases in incidence and mortality from non-thyroid cancers and non-cancer endpoints. Although some studies are difficult to interpret because of methodological limitations, recent investigations of Chernobyl clean-up workers (“liquidators”) have provided evidence of increased risks of leukaemia and other hematological malignancies and of cataracts, and suggestions of an increase in risk of cardiovascular diseases, following low doses and low dose rates of radiation. Further careful follow-up of these populations, including establishment and long-term support of life-span study cohorts, could provide additional important information for the quantification of radiation risks and the protection of persons exposed to low doses of radiation. PMID:21396807
Smith, Jim T
2007-01-01
Background Following a nuclear incident, the communication and perception of radiation risk becomes a (perhaps the) major public health issue. In response to such incidents it is therefore crucial to communicate radiation health risks in the context of other more common environmental and lifestyle risk factors. This study compares the risk of mortality from past radiation exposures (to people who survived the Hiroshima and Nagasaki atomic bombs and those exposed after the Chernobyl accident) with risks arising from air pollution, obesity and passive and active smoking. Methods A comparative assessment of mortality risks from ionising radiation was carried out by estimating radiation risks for realistic exposure scenarios and assessing those risks in comparison with risks from air pollution, obesity and passive and active smoking. Results The mortality risk to populations exposed to radiation from the Chernobyl accident may be no higher than that for other more common risk factors such as air pollution or passive smoking. Radiation exposures experienced by the most exposed group of survivors of Hiroshima and Nagasaki led to an average loss of life expectancy significantly lower than that caused by severe obesity or active smoking. Conclusion Population-averaged risks from exposures following major radiation incidents are clearly significant, but may be no greater than those from other much more common environmental and lifestyle factors. This comparative analysis, whilst highlighting inevitable uncertainties in risk quantification and comparison, helps place the potential consequences of radiation exposures in the context of other public health risks. PMID:17407581
The Chernobyl accident--an epidemiological perspective.
Cardis, E; Hatch, M
2011-05-01
Twenty-five years have passed since radioactive releases from the Chernobyl nuclear accident led to the exposure of millions of people in Europe. Studies of affected populations have provided important new data on the links between radiation and cancer-particularly the risk of thyroid tumours from exposure to iodine isotopes-that are important not only for a fuller scientific understanding of radiation effects, but also for radiation protection. It is now well documented that children and adolescents exposed to radioiodines from Chernobyl fallout have a sizeable dose-related increase in thyroid cancer, with the risk greatest in those youngest at exposure and with a suggestion that deficiency in stable iodine may increase the risk. Data on thyroid cancer risks to other age groups are somewhat less definitive. In addition, there have been reported increases in incidence and mortality from non-thyroid cancers and non-cancer end points. Although some studies are difficult to interpret because of methodological limitations, recent investigations of Chernobyl clean-up workers ('liquidators') have provided evidence of increased risks of leukaemia and other haematological malignancies and of cataracts, and suggestions of an increase in the risk of cardiovascular diseases, following low doses and low dose rates of radiation. Further careful follow-up of these populations, including the establishment and long-term support of life-span study cohorts, could provide additional important information for the quantification of radiation risks and the protection of persons exposed to low doses of radiation. Copyright © 2011 The Royal College of Radiologists. Published by Elsevier Ltd. All rights reserved.
Uncertainty quantification for accident management using ACE surrogates
DOE Office of Scientific and Technical Information (OSTI.GOV)
Varuttamaseni, A.; Lee, J. C.; Youngblood, R. W.
The alternating conditional expectation (ACE) regression method is used to generate RELAP5 surrogates, which are then used to determine the distribution of the peak clad temperature (PCT) during a loss-of-feedwater accident coupled with a subsequent initiation of the feed and bleed (F and B) operation in the Zion-1 nuclear power plant. The construction of the surrogates assumes conditional independence relations among key reactor parameters. The choice of parameters to model is based on the macroscopic balance statements governing the behavior of the reactor. The peak clad temperature is calculated based on the independent variables that are known to be important in determining the success of the F and B operation. The relationship between these independent variables and plant parameters such as coolant pressure and temperature is represented by surrogates constructed from 45 RELAP5 cases. The time-dependent PCT for different values of the F and B parameters is calculated by sampling the independent variables from their probability distributions and propagating the information through two layers of surrogates. The results of our analysis show that the ACE surrogates satisfactorily reproduce the behavior of the plant parameters even though a quasi-static assumption is primarily used in their construction. The PCT is found to be lower in cases where the F and B operation is initiated, compared to the case without F and B, regardless of the F and B parameters used. (authors)
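The propagation step can be sketched generically: sample the uncertain inputs and push each sample through a cheap surrogate of the expensive simulation to build an output distribution. The toy surrogate below (a linear response in an actuation delay and a bleed-valve area) and its parameter ranges are invented stand-ins for the paper's fitted ACE surrogates of RELAP5, and plain random sampling stands in for Latin Hypercube sampling.

```python
import random

def surrogate_pct(delay_s, bleed_area):
    """Toy monotone surrogate for peak clad temperature (K); not RELAP5."""
    return 600.0 + 0.5 * delay_s - 200.0 * bleed_area

random.seed(0)  # reproducible sampling
samples = [
    surrogate_pct(random.uniform(0.0, 600.0),   # operator delay (s), assumed range
                  random.uniform(0.2, 1.0))     # normalized bleed area, assumed range
    for _ in range(10_000)
]
mean_pct = sum(samples) / len(samples)
p95 = sorted(samples)[int(0.95 * len(samples))]  # 95th-percentile PCT
```

The resulting sample set plays the role of the PCT distribution; in the study this propagation runs through two layers of surrogates rather than one.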
On "black swans" and "perfect storms": risk analysis and management when statistics are not enough.
Paté-Cornell, Elisabeth
2012-11-01
Two images, "black swans" and "perfect storms," have struck the public's imagination and are used--at times indiscriminately--to describe the unthinkable or the extremely unlikely. These metaphors have been used as excuses to wait for an accident to happen before taking risk management measures, both in industry and government. These two images represent two distinct types of uncertainties (epistemic and aleatory). Existing statistics are often insufficient to support risk management because the sample may be too small and the system may have changed. Rationality as defined by the von Neumann axioms leads to a combination of both types of uncertainties into a single probability measure--Bayesian probability--and accounts only for risk aversion. Yet, the decisionmaker may also want to be ambiguity averse. This article presents an engineering risk analysis perspective on the problem, using all available information in support of proactive risk management decisions and considering both types of uncertainty. These measures involve monitoring of signals, precursors, and near-misses, as well as reinforcement of the system and a thoughtful response strategy. It also involves careful examination of organizational factors such as the incentive system, which shape human performance and affect the risk of errors. In all cases, including rare events, risk quantification does not allow "prediction" of accidents and catastrophes. Instead, it is meant to support effective risk management rather than simply reacting to the latest events and headlines. © 2012 Society for Risk Analysis.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ritchie, L.T.; Johnson, J.D.; Blond, R.M.
The CRAC2 computer code is a revision of the Calculation of Reactor Accident Consequences computer code, CRAC, developed for the Reactor Safety Study. The CRAC2 computer code incorporates significant modeling improvements in the areas of weather sequence sampling and emergency response, and refinements to the plume rise, atmospheric dispersion, and wet deposition models. New output capabilities have also been added. This guide is intended to facilitate the informed and intelligent use of CRAC2. It includes descriptions of the input data, the output results, the file structures, control information, and five sample problems.
Aiyetan, Paul; Zhang, Bai; Zhang, Zhen; Zhang, Hui
2014-01-01
Mass spectrometry based glycoproteomics has become a major means of identifying and characterizing previously N-linked glycan attached loci (glycosites). In the bottom-up approach, several factors which include but not limited to sample preparation, mass spectrometry analyses, and protein sequence database searches result in previously N-linked peptide spectrum matches (PSMs) of varying lengths. Given that multiple PSM scan map to a glycosite, we reason that identified PSMs are varying length peptide species of a unique set of glycosites. Because associated spectra of these PSMs are typically summed separately, true glycosite associated spectra counts are lost or complicated. Also, these varying length peptide species complicate protein inference as smaller sized peptide sequences are more likely to map to more proteins than larger sized peptides or actual glycosite sequences. Here, we present XGlycScan. XGlycScan maps varying length peptide species to glycosites to facilitate an accurate quantification of glycosite associated spectra counts. We observed that this reduced the variability in reported identifications of mass spectrometry technical replicates of our sample dataset. We also observed that mapping identified peptides to glycosites provided an assessment of search-engine identification. Inherently, XGlycScan reported glycosites reduce the complexity in protein inference. We implemented XGlycScan in the platform independent Java programing language and have made it available as open source. XGlycScan's source code is freely available at https://bitbucket.org/paiyetan/xglycscan/src and its compiled binaries and documentation can be freely downloaded at https://bitbucket.org/paiyetan/xglycscan/downloads. The graphical user interface version can also be found at https://bitbucket.org/paiyetan/xglycscangui/src and https://bitbucket.org/paiyetan/xglycscangui/downloads respectively.
Universal multiplex PCR and CE for quantification of SMN1/SMN2 genes in spinal muscular atrophy.
Wang, Chun-Chi; Chang, Jan-Gowth; Jong, Yuh-Jyh; Wu, Shou-Mei
2009-04-01
We established a universal multiplex PCR and CE method to calculate the copy number of the survival motor neuron (SMN1 and SMN2) genes for clinical screening of spinal muscular atrophy (SMA). In this study, one universal fluorescent primer was designed and applied for multiplex PCR of SMN1, SMN2 and two internal standards (CYBB and KRIT1). These amplicons were separated by conformation-sensitive CE. A mixture of hydroxyethyl cellulose and hydroxypropyl cellulose was used in this CE system. Our method was able to separate two 390-bp PCR products that differ in a single nucleotide. Differentiation and quantification of SMN1 and SMN2 are essential for clinical screening of SMA patients and carriers. The DNA samples included 22 SMA patients, 45 parents of SMA patients (obligatory carriers) and 217 controls. To evaluate accuracy, these 284 samples were blind-analyzed by this method and by denaturing high-performance liquid chromatography (DHPLC). Eight of the samples showed different results. Among them, two samples were diagnosed as having only the SMN2 gene by DHPLC; however, they contained both SMN1 and SMN2 by our method. They were further confirmed by DNA sequencing. Our method showed good agreement with the DNA sequencing. Multiplex ligation-dependent probe amplification (MLPA) was used to confirm the other five samples and showed the same results as our CE method. For only one sample did our CE method show results differing from MLPA and DNA sequencing. One out of 284 samples (0.35%) was mismatched. Our method provides a more accurate and convenient approach for clinical genotyping of SMA.
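The copy-number call behind such dosage assays can be sketched as a dosage quotient: the target peak area is normalized to the co-amplified internal standard and compared against the ratio observed for a known two-copy reference. The peak areas and the reference ratio below are invented; the study's actual calibration may differ.

```python
def copy_number(target_area, internal_area, two_copy_ratio):
    """Round the normalized dosage quotient to an integer gene copy number.

    two_copy_ratio: target/internal peak-area ratio observed for a known
    two-copy control sample (an assumed calibration input).
    """
    dq = (target_area / internal_area) / two_copy_ratio
    return round(2 * dq)

# an SMA carrier typically shows SMN1 signal at about half the two-copy level
carrier_copies = copy_number(target_area=510.0, internal_area=1000.0, two_copy_ratio=1.0)
control_copies = copy_number(target_area=980.0, internal_area=1000.0, two_copy_ratio=1.0)
```

Under this scheme a carrier is called at one SMN1 copy and a normal control at two, which is the distinction the assay needs for carrier screening.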
Comprehensive Analysis of Protein Modifications by Top-down Mass Spectrometry
Zhang, Han; Ge, Ying
2012-01-01
Mass spectrometry (MS)-based proteomics is playing an increasingly important role in cardiovascular research. Proteomics includes not only identification and quantification of proteins, but also the characterization of protein modifications such as post-translational modifications and sequence variants. The conventional bottom-up approach, involving proteolytic digestion of proteins into small peptides prior to MS analysis, is routinely used for protein identification and quantification with high throughput and automation. Nevertheless, it has limitations in the analysis of protein modifications mainly due to the partial sequence coverage and loss of connections among modifications on disparate portions of a protein. An alternative approach, top-down MS, has emerged as a powerful tool for the analysis of protein modifications. The top-down approach analyzes whole proteins directly, providing a “bird’s eye” view of all existing modifications. Subsequently, each modified protein form can be isolated and fragmented in the mass spectrometer to locate the modification site. The incorporation of the non-ergodic dissociation methods such as electron capture dissociation (ECD) greatly enhances the top-down capabilities. ECD is especially useful for mapping labile post-translational modifications which are well-preserved during the ECD fragmentation process. Top-down MS with ECD has been successfully applied to cardiovascular research with the unique advantages in unraveling the molecular complexity, quantifying modified protein forms, complete mapping of modifications with full sequence coverage, discovering unexpected modifications, and identifying and quantifying positional isomers and determining the order of multiple modifications. Nevertheless, top-down MS still needs to overcome some technical challenges to realize its full potential. Herein, we reviewed the advantages and challenges of top-down methodology with a focus on its application in cardiovascular research. 
PMID:22187450
Frequent Detection and Genetic Diversity of Human Bocavirus in Urban Sewage Samples.
Iaconelli, M; Divizia, M; Della Libera, S; Di Bonito, P; La Rosa, Giuseppina
2016-12-01
The prevalence and genetic diversity of human bocaviruses (HBoVs) in sewage water samples are largely unknown. In this study, 134 raw sewage samples from 25 wastewater treatment plants (WTPs) in Italy were analyzed by nested PCR and sequencing using species-specific primer pairs and broad-range primer pairs targeting the capsid proteins VP1/VP2. A large number of samples (106, 79.1%) were positive for HBoV. Of these, 49 were classified as HBoV species 2, and 27 as species 3. For the remaining 30 samples, sequencing results showed mixed electropherograms. By cloning PCR amplicons and sequencing, we confirmed the copresence of species 2 and 3 in 29 samples and of species 2 and 4 in only one sample. A real-time PCR assay was also performed, using a newly designed TaqMan assay, for quantification of HBoVs in sewage water samples. Viral load ranged from 5.51E+03 to 1.84E+05 GC/L (mean value 4.70E+04 GC/L) for bocavirus 2 and from 1.89E+03 to 1.02E+05 GC/L (mean value 2.27E+04 GC/L) for bocavirus 3. The wide distribution of HBoV in sewage suggests that this virus is common in the population and that species 2 and 3 are the most prevalent. HBoV-4 was also found, representing the first detection of this species in Italy. Although there is no indication of waterborne transmission for HBoV, its significant presence in sewage waters suggests that HBoV may spread to other water environments, and therefore a potential role of water in HBoV transmission should not be neglected.
Stöcher, Markus; Leb, Victoria; Hölzl, Gabriele; Berg, Jörg
2002-12-01
Real-time PCR technology allows convenient detection and quantification of virus-derived DNA and is used in many PCR-based assays in clinical laboratories. Detection and quantification of virus-derived DNA are usually performed against external controls or external standards; thus, amplification adequacy within a clinical sample is not monitored. This can be achieved using internal controls that are co-amplified with the specific target within the same reaction vessel. We describe a convenient way to prepare heterologous internal controls as competitors for real-time PCR based assays. The internal controls were devised as competitors in real-time PCR, e.g. LightCycler PCR. The bacterial neomycin phosphotransferase gene (neo) was used as the source of heterologous DNA. Within the neo gene a box was chosen containing sequences for four differently spaced forward primers, one reverse primer, and a pair of neo-specific hybridization probes. Pairs of primers were constructed composed of virus-specific primer sequences and neo box-specific primer sequences. Using those composite primers in conventional preparative PCR, four types of internal controls were amplified from the neo box and subsequently cloned. A panel of the four differently sized internal controls was generated and tested by LightCycler PCR using their virus-specific primers. All four different PCR products were detected with the single pair of neo-specific FRET hybridization probes. The presented approach to generating competitive internal controls for use in LightCycler PCR assays proved convenient and rapid. The obtained internal controls match most PCR product sizes used in clinical routine molecular assays and will help discriminate true from false negative results.
The phonetics of talk in interaction--introduction to the special issue.
Ogden, Richard
2012-03-01
This overview paper provides an introduction to work on naturally-occurring speech data, combining techniques of conversation analysis with techniques and methods from phonetics. The paper describes the development of the field, highlighting current challenges and progress in interdisciplinary work. It considers the role of quantification and its relationship to a qualitative methodology. It presents the conversation analytic notion of sequence as a version of context, and argues that sequences of talk constrain relevant phonetic design, and so provide one account for variability in naturally occurring speech. The paper also describes the manipulation of speech and language on many levels simultaneously. All of these themes occur and are explored in more detail in the papers contained in this special issue.
Henninger, B; Zoller, H; Rauch, S; Schocke, M; Kannengiesser, S; Zhong, X; Reiter, G; Jaschke, W; Kremser, C
2015-05-01
To evaluate the automated two-point Dixon screening sequence for the detection and estimated quantification of hepatic iron and fat, compared with standard sequences as a reference. One hundred and two patients with suspected diffuse liver disease were included in this prospective study. The following MRI protocol was used: 3D T1-weighted opposed- and in-phase gradient echo with two-point Dixon reconstruction and dual-ratio signal discrimination algorithm ("screening" sequence); fat-saturated multi-gradient-echo sequence with 12 echoes; and gradient-echo T1 FLASH opposed- and in-phase. Bland-Altman plots were generated and correlation coefficients were calculated to compare the sequences. The screening sequence diagnosed fat in 33 patients, iron in 35, and a combination of both in 4. Correlation between R2* values of the screening sequence and standard relaxometry was excellent (r = 0.988). A slightly lower correlation (r = 0.978) was found between the fat fraction of the screening sequence and that of the standard sequence. Bland-Altman analysis revealed systematically lower R2* values obtained from the screening sequence and higher fat fraction values obtained with the standard sequence, with rather high variability in agreement. The screening sequence is a promising method for fast diagnosis of the predominant liver disease. It is capable of estimating the amounts of hepatic fat and iron comparably to standard methods. • MRI plays a major role in the clarification of diffuse liver disease. • The screening sequence was introduced for the assessment of diffuse liver disease. • It is a fast and automated algorithm for the evaluation of hepatic iron and fat. • It is capable of estimating the amount of hepatic fat and iron.
Liu, Rui; Zhang, Shixi; Wei, Chao; Xing, Zhi; Zhang, Sichun; Zhang, Xinrong
2016-05-17
The unambiguous quantification of biomolecules is of great significance in fundamental biological research as well as practical clinical diagnosis. Due to the lack of a detectable moiety, the direct and highly sensitive quantification of biomolecules is often a "mission impossible". Consequently, tagging strategies to introduce detectable moieties for labeling target biomolecules were invented, which had a long and significant impact on studies of biomolecules in the past decades. For instance, immunoassays have been developed with radioisotope tagging by Yalow and Berson in the late 1950s. The later languishment of this technology can be almost exclusively ascribed to the use of radioactive isotopes, which led to the development of nonradioactive tagging strategy-based assays such as enzyme-linked immunosorbent assay, fluorescent immunoassay, and chemiluminescent and electrochemiluminescent immunoassay. Despite great success, these strategies suffered from drawbacks such as limited spectral window capacity for multiplex detection and inability to provide absolute quantification of biomolecules. After recalling the sequences of tagging strategies, an apparent question is why not use stable isotopes from the start? A reasonable explanation is the lack of reliable means for accurate and precise quantification of stable isotopes at that time. The situation has changed greatly at present, since several atomic mass spectrometric measures for metal stable isotopes have been developed. Among the newly developed techniques, inductively coupled plasma mass spectrometry is an ideal technique to determine metal stable isotope-tagged biomolecules, for its high sensitivity, wide dynamic linear range, and more importantly multiplex and absolute quantification ability. Since the first published report by our group, metal stable isotope tagging has become a revolutionary technique and gained great success in biomolecule quantification. 
An exciting research highlight in this area is the development and application of the mass cytometer, which fully exploits the multiplexing potential of metal stable isotope tagging. It enables the simultaneous detection of dozens of parameters in single cells and accurate immunophenotyping of cell populations, through modeling of intracellular signaling networks and unambiguous discrimination of the function and connections of cell subsets. Metal stable isotope tagging has opened a post-fluorescence era of cytometry, with great potential in research related to hematopoiesis, immunology, stem cells, cancer, and drug screening. Herein, we review the development of biomolecule quantification using metal stable isotope tagging. In particular, the power of multiplex and absolute quantification is demonstrated. We address the advantages, applicable situations, and limitations of metal stable isotope tagging strategies and propose suggestions for future developments. The transfer from enzymatic or fluorescent tagging to metal stable isotope tagging may occur in many aspects of biological and clinical practice in the near future, just as the revolution from radioactive isotope tagging to fluorescent tagging happened in the past.
Application of Faecalibacterium 16S rDNA genetic marker for accurate identification of duck faeces.
Sun, Da; Duan, Chuanren; Shang, Yaning; Ma, Yunxia; Tan, Lili; Zhai, Jun; Gao, Xu; Guo, Jingsong; Wang, Guixue
2016-04-01
The aim of this study was to support the assignment of legal liability for pollution by assessing a duck faeces-specific marker that can exclude interference from residual bacteria left by earlier contamination events. Using gene sequencing technology and bioinformatics methods, we completed a comparative analysis of Faecalibacterium sequences associated with ducks and other animal species, and identified sequences unique to duck faeces. Polymerase chain reaction (PCR) and agarose gel electrophoresis were used to verify the reliability of both the human and the duck faeces-specific primers. The duck faeces-specific primers generated an amplicon of 141 bp from 43.3% of duck faecal samples, 0% of control samples, and 100% of sewage wastewater samples that contained duck faeces. We present here the first evidence of the applicability of a Faecalibacterium-based human faeces-specific marker in China. This study also represents the first report of a Faecalibacterium marker for duck faeces and suggests an independent or supplementary microbial source tracking (MST) tool for environmental biotechnology.
Baigent, Susan J.; Nair, Venugopal K.; Le Galludec, Hervé
2016-01-01
CVI988/Rispens vaccine, the ‘gold standard’ vaccine against Marek’s disease in poultry, is not easily distinguishable from virulent strains of Marek’s disease herpesvirus (MDV). Accurate differential measurement of CVI988 and virulent MDV is commercially important to confirm successful vaccination, to diagnose Marek’s disease, and to investigate causes of vaccine failure. A real-time quantitative PCR assay to distinguish CVI988 from virulent MDV, based on a consistent single nucleotide polymorphism in the pp38 gene, was developed, optimised and validated; it uses common primers to amplify both viruses but achieves differential detection of the PCR products with two short probes specific for either CVI988 or virulent MDV. Both probes showed perfect specificity for three commercial preparations of CVI988 and 12 virulent MDV strains. Validation against BAC-sequence-specific and US2-sequence-specific q-PCR, on spleen samples from experimental chickens co-infected with BAC-cloned pCVI988 and wild-type virulent MDV, demonstrated that CVI988 and virulent MDV could be quantified very accurately. The assay was then used to follow the kinetics of replication of commercial CVI988 and virulent MDV in feather tips and blood of vaccinated and challenged experimental chickens. The assay is a substantial improvement, enabling accurate differential quantification of CVI988 and virulent MDV over a biologically relevant range of virus levels. PMID:26973285
[Classical and molecular methods for identification and quantification of domestic moulds].
Fréalle, E; Bex, V; Reboux, G; Roussel, S; Bretagne, S
2017-12-01
To study the impact of the constant and inevitable inhalation of moulds, it is necessary to sample, identify and count the spores. Environmental sampling methods can be separated into three categories: surface sampling, which is easy to perform but non-quantitative; air sampling, which is easy to calibrate but provides time-limited information; and dust sampling, which is more representative of long-term exposure to moulds. The sampling strategy depends on the objectives (evaluation of the risk of exposure for individuals; quantification of household contamination; evaluation of the efficacy of remediation). The mould colonies obtained in culture are identified using microscopy, MALDI-TOF, and/or DNA sequencing. Electrostatic dust collectors are an alternative to older methods for identifying and quantifying household mould spores; they are easy to use and relatively cheap. Colony counting should progressively be replaced by quantitative real-time PCR, which is already validated, while awaiting more standardised high-throughput sequencing methods for assessing mould contamination without technical bias. Despite some technical recommendations for obtaining reliable and comparable results, the huge diversity of environmental moulds, the variable quantity of spores inhaled and the association with other allergens (mites, plants) make the evaluation of their impact on human health difficult. Hence there is a need for reliable and generally applicable quantitative methods. Copyright © 2017 SPLF. Published by Elsevier Masson SAS. All rights reserved.
Functional assessment of human enhancer activities using whole-genome STARR-sequencing.
Liu, Yuwen; Yu, Shan; Dhiman, Vineet K; Brunetti, Tonya; Eckart, Heather; White, Kevin P
2017-11-20
Genome-wide quantification of enhancer activity in the human genome has proven to be a challenging problem. Recent efforts have led to the development of powerful tools for enhancer quantification. However, because of genome size and complexity, these tools have yet to be applied to the whole human genome. In the current study, we use a human prostate cancer cell line, LNCaP, as a model to perform whole human genome STARR-seq (WHG-STARR-seq) to reliably obtain an assessment of enhancer activity. This approach builds upon the previously developed STARR-seq technique in the fly genome and CapSTARR-seq in targeted human genomic regions. With an improved library preparation strategy, our approach greatly increases the library complexity per unit of starting material, which makes it feasible and cost-effective to explore the landscape of regulatory activity in the much larger human genome. In addition to our ability to identify active, accessible enhancers located in open chromatin regions, we can also detect sequences with the potential for enhancer activity that are located in inaccessible, closed chromatin regions. When treated with the histone deacetylase inhibitor Trichostatin A, genes near this latter class of enhancers are up-regulated, demonstrating the potential for endogenous functionality of these regulatory elements. WHG-STARR-seq improves upon current pipelines for the analysis of high-complexity genomes and affords a better understanding of the intricacies of transcriptional regulation.
You, Leiming; Wu, Jiexin; Feng, Yuchao; Fu, Yonggui; Guo, Yanan; Long, Liyuan; Zhang, Hui; Luan, Yijie; Tian, Peng; Chen, Liangfu; Huang, Guangrui; Huang, Shengfeng; Li, Yuxin; Li, Jie; Chen, Chengyong; Zhang, Yaqing; Chen, Shangwu; Xu, Anlong
2015-01-01
Increasing numbers of genes have been shown to utilize alternative polyadenylation (APA) 3′-processing sites depending on the cell and tissue type and/or physiological and pathological conditions at the time of processing, and the construction of a genome-wide APA database is urgently needed for a better understanding of poly(A) site selection and APA-directed regulation of gene expression in a given biological context. Here we present a web-accessible database, named APASdb (http://mosas.sysu.edu.cn/utr), which can visualize the precise map and usage quantification of the different APA isoforms of every gene. The datasets are deeply profiled by the sequencing of alternative polyadenylation sites (SAPAS) method, which is capable of high-throughput sequencing of the 3′-ends of polyadenylated transcripts. Thus, APASdb details all the heterogeneous cleavage sites downstream of poly(A) signals, and maintains near-complete coverage of APA sites, much better than previous databases built with conventional methods. Furthermore, APASdb quantifies a given APA variant among transcripts with different APA sites by computing the corresponding normalized read counts, making the database more useful. In addition, APASdb supports URL-based retrieval, browsing and display of exon-intron structure, poly(A) signals, poly(A) site locations and usage reads, and 3′-untranslated regions (3′-UTRs). Currently, APASdb covers APA in various biological processes and diseases in human, mouse and zebrafish. PMID:25378337
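The usage quantification described above reduces, at its core, to normalizing the 3′-end read counts of each poly(A) site against the gene's total. A minimal sketch with hypothetical read counts (the site names and numbers are illustrative, not taken from APASdb):

```python
def apa_usage(read_counts):
    """Convert raw 3'-end read counts per poly(A) site into usage fractions."""
    total = sum(read_counts.values())
    return {site: n / total for site, n in read_counts.items()}

# Hypothetical counts for three APA sites of one gene
counts = {"proximal": 600, "middle": 150, "distal": 250}
usage = apa_usage(counts)  # proximal site used by 60% of transcripts
```

In a real pipeline the counts would additionally be normalized for library size before comparing usage across samples.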
Low, Yin Fen; Trenado, Carlos; Delb, Wolfgang; Corona-Strauss, Farah I; Strauss, Daniel J
2007-01-01
Large-scale neural correlates of tinnitus decompensation have been identified by applying wavelet phase stability criteria to single-sweep sequences of auditory late responses (ALRs). The suggested measure provided an objective quantification of tinnitus decompensation and allowed for reliable discrimination between groups of compensated and decompensated tinnitus patients. Interpreting our results with an oscillatory tinnitus model, the synchronization stability measure of ALRs can be linked to the focus of attention on the tinnitus signal. In the present study, we examined in detail the correlates of this attentional mechanism in healthy subjects. The results support our previous finding that the phase synchronization stability measure reflects neural correlates of the fixation of attention on the tinnitus signal, in this case enabling differentiation between the attended and unattended conditions. It is concluded that the wavelet phase synchronization stability of ALR single sweeps can be used as an objective measure of tinnitus decompensation and can be interpreted within the framework of the Jastreboff tinnitus model and adaptive resonance theory. Our studies confirm that the synchronization stability in ALR sequences is linked to attention. This measure can not only serve as an objective quantification of tinnitus decompensation, but can also be applied in any online, real-time neurofeedback therapeutic approach in which direct stimulus-locked attention monitoring is required, since it is based on single-sweep processing.
Investigation of the influence of sampling schemes on quantitative dynamic fluorescence imaging
Dai, Yunpeng; Chen, Xueli; Yin, Jipeng; Wang, Guodong; Wang, Bo; Zhan, Yonghua; Nie, Yongzhan; Wu, Kaichun; Liang, Jimin
2018-01-01
Dynamic optical data from a series of sampling intervals can be used for quantitative analysis to obtain meaningful kinetic parameters of a probe in vivo. The sampling scheme may affect the quantification results of dynamic fluorescence imaging. Here, we investigate the influence of different sampling schemes on the quantification of binding potential (BP) with theoretically simulated and experimentally measured data. Three groups of sampling schemes are investigated: the sampling starting point, sampling sparsity, and sampling uniformity. In investigating the influence of the sampling starting point, we further distinguish two cases according to whether the timing sequence between probe injection and the start of sampling is retained or discarded. Results show that the mean value of BP exhibits an obvious growth trend with increasing delay of the sampling starting point, and has a strong correlation with the sampling sparsity; the growth trend is much more pronounced when the missing timing sequence is discarded. The standard deviation of BP is inversely related to the sampling sparsity, and independent of the sampling uniformity and of the delay of the sampling starting time. Moreover, the mean value of BP obtained by uniform sampling is significantly higher than that obtained by non-uniform sampling. Our results collectively suggest that a suitable sampling scheme can help compartmental modeling of dynamic fluorescence imaging provide more accurate results with simpler operation. PMID:29675325
NASA Astrophysics Data System (ADS)
Lu, Aiming; Atkinson, Ian C.; Vaughn, J. Thomas; Thulborn, Keith R.
2011-12-01
The rapid biexponential transverse relaxation of the sodium MR signal from brain tissue requires efficient k-space sampling for quantitative imaging in a time that is acceptable for human subjects. The flexible twisted projection imaging (flexTPI) sequence has been shown to be suitable for quantitative sodium imaging with an ultra-short echo time to minimize signal loss. The fidelity of the k-space center location is affected by the readout gradient timing errors on the three physical axes, which is known to cause image distortion for projection-based acquisitions. This study investigated the impact of these timing errors on the voxel-wise accuracy of the tissue sodium concentration (TSC) bioscale measured with the flexTPI sequence. Our simulations show greater than 20% spatially varying quantification errors when the gradient timing errors are larger than 10 μs on all three axes. The quantification is more tolerant of gradient timing errors on the Z-axis. An existing method was used to measure the gradient timing errors with <1 μs error. The gradient timing error measurement is shown to be RF coil dependent, and timing error differences of up to ~16 μs have been observed between different RF coils used on the same scanner. The measured timing errors can be corrected prospectively or retrospectively to obtain accurate TSC values.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1978-05-01
The Transient Reactor Analysis Code (TRAC) is being developed at the Los Alamos Scientific Laboratory (LASL) to provide an advanced "best estimate" predictive capability for the analysis of postulated accidents in light water reactors (LWRs). TRAC-P1 provides this analysis capability for pressurized water reactors (PWRs) and for a wide variety of thermal-hydraulic experimental facilities. It features a three-dimensional treatment of the pressure vessel and associated internals; two-phase nonequilibrium hydrodynamics models; flow-regime-dependent constitutive equation treatment; reflood tracking capability for both bottom flood and falling film quench fronts; and consistent treatment of entire accident sequences including the generation of consistent initial conditions. The TRAC-P1 User's Manual is composed of two separate volumes. Volume I gives a description of the thermal-hydraulic models and numerical solution methods used in the code. Detailed programming and user information is also provided. Volume II presents the results of the developmental verification calculations.
Pretest aerosol code comparisons for LWR aerosol containment tests LA1 and LA2
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wright, A.L.; Wilson, J.H.; Arwood, P.C.
The Light-Water-Reactor (LWR) Aerosol Containment Experiments (LACE) are being performed in Richland, Washington, at the Hanford Engineering Development Laboratory (HEDL) under the leadership of an international project board and the Electric Power Research Institute. These tests have two objectives: (1) to investigate, at large scale, the inherent aerosol retention behavior in LWR containments under simulated severe accident conditions, and (2) to provide an experimental data base for validating aerosol behavior and thermal-hydraulic computer codes. Aerosol computer-code comparison activities are being coordinated at the Oak Ridge National Laboratory. For each of the six LACE tests, "pretest" calculations (for code-to-code comparisons) and "posttest" calculations (for code-to-test data comparisons) are being performed. The overall goals of the comparison effort are (1) to provide code users with experience in applying their codes to LWR accident-sequence conditions and (2) to evaluate and improve the code models.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kymaelaeinen, O.; Tuomisto, H.; Theofanous, T.G.
1997-02-01
The concept of lower head coolability and in-vessel retention of corium has been approved as a basic element of the severe accident management strategy for IVO's Loviisa Plant (VVER-440) in Finland. The selected approach takes advantage of unique features of the plant such as its low power density, a reactor pressure vessel without penetrations at the bottom, and an ice-condenser containment that ensures a flooded cavity in all risk-significant sequences. The thermal analyses, which are supported by an experimental program, demonstrate that in Loviisa the molten corium on the lower head of the reactor vessel is coolable externally with wide margins. This paper summarizes the approach and the plant modifications being implemented. During the approval process some technical concerns were raised, particularly with regard to thermal loadings caused by contact of cool cavity water and hot corium with the reactor vessel. Resolution of these concerns is also discussed.
Automated quantification of lumbar vertebral kinematics from dynamic fluoroscopic sequences
NASA Astrophysics Data System (ADS)
Camp, Jon; Zhao, Kristin; Morel, Etienne; White, Dan; Magnuson, Dixon; Gay, Ralph; An, Kai-Nan; Robb, Richard
2009-02-01
We hypothesize that the vertebra-to-vertebra patterns of spinal flexion and extension motion of persons with lower back pain will differ from those of persons who are pain-free. Thus, it is our goal to measure the motion of individual lumbar vertebrae noninvasively from dynamic fluoroscopic sequences. Two-dimensional normalized mutual information-based image registration was used to track frame-to-frame motion. Software was developed that required the operator to identify each vertebra on the first frame of the sequence using a four-point "caliper" placed at the posterior and anterior edges of the inferior and superior end plates of the target vertebrae. The program then resolved the individual motions of each vertebra independently throughout the entire sequence. To validate the technique, 6 cadaveric lumbar spine specimens were potted in polymethylmethacrylate and instrumented with optoelectric sensors. The specimens were then placed in a custom dynamic spine simulator and moved through flexion-extension cycles while kinematic data and fluoroscopic sequences were simultaneously acquired. We found strong correlation between the absolute flexion-extension range of motion of each vertebra as recorded by the optoelectric system and as determined from the fluoroscopic sequence via registration. We conclude that this method is a viable way of noninvasively assessing two-dimensional vertebral motion.
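The normalized mutual information score used for frame-to-frame tracking can be computed from a joint intensity histogram. The sketch below is a generic illustration of the similarity measure, not the authors' implementation; image sizes and the bin count are arbitrary choices:

```python
import numpy as np

def normalized_mutual_information(a, b, bins=32):
    """NMI = (H(A) + H(B)) / H(A,B), from a joint intensity histogram.
    Ranges from 1 (independent images) to 2 (identical images)."""
    joint, _, _ = np.histogram2d(a.ravel(), b.ravel(), bins=bins)
    pxy = joint / joint.sum()                 # joint probability
    px = pxy.sum(axis=1)                      # marginal of A
    py = pxy.sum(axis=0)                      # marginal of B
    hx = -np.sum(px[px > 0] * np.log(px[px > 0]))
    hy = -np.sum(py[py > 0] * np.log(py[py > 0]))
    hxy = -np.sum(pxy[pxy > 0] * np.log(pxy[pxy > 0]))
    return (hx + hy) / hxy

rng = np.random.default_rng(0)
img = rng.random((64, 64))
self_score = normalized_mutual_information(img, img)    # identical: NMI = 2
noise = rng.random((64, 64))
cross_score = normalized_mutual_information(img, noise)  # unrelated: near 1
```

A registration loop would search over candidate 2D transforms of one frame and keep the transform maximizing this score against the reference frame.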
Petersson, Sven; Dyverfeldt, Petter; Sigfridsson, Andreas; Lantz, Jonas; Carlhäll, Carl-Johan; Ebbers, Tino
2016-03-01
To evaluate spiral three-dimensional (3D) phase contrast MRI for the assessment of turbulence and velocity in stenotic flow, a stack-of-spirals 3D phase contrast MRI sequence was evaluated in vitro against a conventional Cartesian sequence. Measurements were made in a flow phantom with a 75% stenosis. Both spiral and Cartesian imaging were performed using different scan orientations and flow rates. Volume flow rate, maximum velocity and turbulent kinetic energy (TKE) were computed for both methods. Moreover, the estimated TKE was compared with computational fluid dynamics (CFD) data. There was good agreement between the turbulent kinetic energy from the spiral, Cartesian and CFD data. Flow rate and maximum velocity from the spiral data agreed well with Cartesian data. As expected, the short echo time of the spiral sequence resulted in less prominent displacement artifacts compared with the Cartesian sequence. However, both spiral and Cartesian flow rate estimates were sensitive to displacement when the flow was oblique to the encoding directions. Spiral 3D phase contrast MRI appears favorable for the assessment of stenotic flow. The spiral sequence was more than three times faster and less sensitive to displacement artifacts when compared with a conventional Cartesian sequence. © 2015 Wiley Periodicals, Inc.
Fernández-Friera, Leticia; García-Ruiz, José Manuel; García-Álvarez, Ana; Fernández-Jiménez, Rodrigo; Sánchez-González, Javier; Rossello, Xavier; Gómez-Talavera, Sandra; López-Martín, Gonzalo J; Pizarro, Gonzalo; Fuster, Valentín; Ibáñez, Borja
2017-05-01
Area at risk (AAR) quantification is important to evaluate the efficacy of cardioprotective therapies. However, postinfarction AAR assessment could be influenced by the infarcted coronary territory. Our aim was to determine the accuracy of T 2 -weighted short tau triple-inversion recovery (T 2 W-STIR) cardiac magnetic resonance (CMR) imaging for accurate AAR quantification in anterior, lateral, and inferior myocardial infarctions. Acute reperfused myocardial infarction was experimentally induced in 12 pigs, with 40-minute occlusion of the left anterior descending (n = 4), left circumflex (n = 4), and right coronary arteries (n = 4). Perfusion CMR was performed during selective intracoronary gadolinium injection at the coronary occlusion site (in vivo criterion standard) and, additionally, a 7-day CMR, including T 2 W-STIR sequences, was performed. Finally, all animals were sacrificed and underwent postmortem Evans blue staining (classic criterion standard). The concordance between the CMR-based criterion standard and T 2 W-STIR to quantify AAR was high for anterior and inferior infarctions (r = 0.73; P = .001; mean error = 0.50%; limits = -12.68%-13.68% and r = 0.87; P = .001; mean error = -1.5%; limits = -8.0%-5.8%, respectively). Conversely, the correlation for the circumflex territories was poor (r = 0.21, P = .37), showing a higher mean error and wider limits of agreement. A strong correlation between pathology and the CMR-based criterion standard was observed (r = 0.84, P < .001; mean error = 0.91%; limits = -7.55%-9.37%). T 2 W-STIR CMR sequences are accurate to determine the AAR for anterior and inferior infarctions; however, their accuracy for lateral infarctions is poor. These findings may have important implications for the design and interpretation of clinical trials evaluating the effectiveness of cardioprotective therapies. Copyright © 2016 Sociedad Española de Cardiología. Published by Elsevier España, S.L.U. All rights reserved.
Jiang, Wenting; Liu, Liang; Chen, Yun
2018-03-06
Abnormal expression of C-terminal p53 isoforms α, β, and γ can cause the development of cancers including breast cancer. To date, much evidence has demonstrated that these isoforms can differentially regulate target genes and modulate their expression. Thus, quantification of the individual isoforms may help to link clinical outcome to p53 status and to improve cancer patient treatment. However, there are few studies on the accurate determination of p53 isoforms, probably due to the sequence homology of these isoforms and their low abundance. In this study, a targeted proteomics assay combining molecularly imprinted polymers (MIPs) and liquid chromatography-tandem mass spectrometry (LC-MS/MS) was developed for the simultaneous quantification of C-terminal p53 isoforms. Isoform-specific surrogate peptides (i.e., KPLDGEYFTLQIR (peptide-α) for isoform α, KPLDGEYFTLQDQTSFQK (peptide-β) for isoform β, and KPLDGEYFTLQMLLDLR (peptide-γ) for isoform γ) were first selected and used in both MIPs enrichment and mass spectrometric detection. The common sequence KPLDGEYFTLQ of these three surrogate peptides was used as the single template in the MIPs. In addition to optimization of the imprinting conditions and characterization of the prepared MIPs, the binding affinity and cross-reactivity of the MIPs for each surrogate peptide were evaluated. As a result, an LOQ of 5 nM was achieved, which was >15-fold more sensitive than that without MIPs. Finally, the assay was validated and applied to the simultaneous quantitative analysis of C-terminal p53 isoforms α, β, and γ in several human breast cell lines (i.e., MCF-10A normal cells, MCF-7 and MDA-MB-231 cancer cells, and drug-resistant MCF-7/ADR cancer cells). This study is among the first to employ single-template MIPs and the cross-reactivity phenomenon to select isoform-specific surrogate peptides and enable simultaneous quantification of protein isoforms in LC-MS/MS-based targeted proteomics.
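The choice of KPLDGEYFTLQ as the single template follows directly from the three surrogate peptides: it is their longest shared N-terminal stretch. A small sketch (the helper function is illustrative; the peptide sequences are those given in the abstract):

```python
def common_prefix(seqs):
    """Longest shared N-terminal stretch of a list of peptide sequences."""
    prefix = seqs[0]
    for s in seqs[1:]:
        while not s.startswith(prefix):
            prefix = prefix[:-1]  # shrink until it matches this sequence too
    return prefix

surrogates = [
    "KPLDGEYFTLQIR",       # peptide-alpha, for p53 isoform alpha
    "KPLDGEYFTLQDQTSFQK",  # peptide-beta, for p53 isoform beta
    "KPLDGEYFTLQMLLDLR",   # peptide-gamma, for p53 isoform gamma
]
template = common_prefix(surrogates)  # "KPLDGEYFTLQ"
```

Imprinting against this shared stretch is what lets one MIP cross-react with, and thus enrich, all three isoform-specific peptides at once.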
Fast multiclonal clusterization of V(D)J recombinations from high-throughput sequencing.
Giraud, Mathieu; Salson, Mikaël; Duez, Marc; Villenet, Céline; Quief, Sabine; Caillault, Aurélie; Grardel, Nathalie; Roumier, Christophe; Preudhomme, Claude; Figeac, Martin
2014-05-28
V(D)J recombinations in lymphocytes are essential for immunological diversity. They are also useful markers of pathologies. In leukemia, they are used to quantify the minimal residual disease during patient follow-up. However, the full breadth of lymphocyte diversity is not fully understood. We propose new algorithms that process high-throughput sequencing (HTS) data to extract unnamed V(D)J junctions and gather them into clones for quantification. This analysis is based on a seed heuristic and is fast and scalable because in the first phase, no alignment is performed with germline database sequences. The algorithms were applied to TR γ HTS data from a patient with acute lymphoblastic leukemia, and also on data simulating hypermutations. Our methods identified the main clone, as well as additional clones that were not identified with standard protocols. The proposed algorithms provide new insight into the analysis of high-throughput sequencing data for leukemia, and also to the quantitative assessment of any immunological profile. The methods described here are implemented in a C++ open-source program called Vidjil.
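The alignment-free first phase can be caricatured as grouping reads that share an exact sequence window. The sketch below is deliberately simplified: the window is simply taken from the centre of each read, whereas Vidjil's seed heuristic locates it around the V(D)J junction, and the reads here are synthetic.

```python
from collections import Counter

def cluster_by_window(reads, w=10):
    """Cluster reads on an exact w-character sequence window, without any
    alignment to germline database sequences in this first phase.
    Simplifying assumption: the window sits at the centre of each read."""
    clones = Counter()
    for read in reads:
        mid = len(read) // 2
        clones[read[mid - w // 2 : mid + w // 2]] += 1
    return clones

# Hypothetical reads: three copies of one junction, one copy of another
reads = ["AAAACGTACGTTTT"] * 3 + ["AAAATGCATGCTTT"]
clones = cluster_by_window(reads)  # two clones, sized 3 and 1
```

Ranking clusters by count then gives the clone abundances used for minimal residual disease follow-up; germline alignment is only needed afterwards, on one representative per clone.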
Enzyme-free detection and quantification of double-stranded nucleic acids.
Feuillie, Cécile; Merheb, Maxime Mohamad; Gillet, Benjamin; Montagnac, Gilles; Hänni, Catherine; Daniel, Isabelle
2012-08-01
We have developed a fully enzyme-free SERRS hybridization assay for specific detection of double-stranded DNA sequences. Although all DNA detection methods ranging from PCR to high-throughput sequencing rely on enzymes, this method is unique in being totally non-enzymatic. The efficiency of enzymatic processes is affected by alterations, modifications, and/or quality of DNA. For instance, a limitation of most DNA polymerases is their inability to process DNA damaged by blocking lesions. As a result, enzymatic amplification and sequencing of degraded DNA often fail. In this study we succeeded in detecting and quantifying, within a mixture, relative amounts of closely related double-stranded DNA sequences from Rupicapra rupicapra (chamois) and Capra hircus (goat). The non-enzymatic SERRS assay presented here is the cornerstone of a promising approach to overcome the failure of DNA polymerase when DNA is too degraded or when the concentration of polymerase inhibitors is too high. This is the first time double-stranded DNA has been detected with a truly non-enzymatic SERRS-based method. This non-enzymatic, inexpensive, rapid assay is therefore a breakthrough in nucleic acid detection.
Kim, Jaai; Lim, Juntaek; Lee, Changsoo
2013-12-01
Quantitative real-time PCR (qPCR) has been widely used in recent environmental microbial ecology studies as a tool for detecting and quantifying microorganisms of interest, which aids in a better understanding of the complexity of wastewater microbial communities. Although qPCR can provide more specific and accurate quantification than other molecular techniques, it does have limitations that must be considered when applying it in practice. This article reviews the principle of qPCR quantification and its applications to microbial ecology studies in various wastewater treatment environments. Here we also address several limitations of qPCR-based approaches that can affect the validity of quantification data: template nucleic acid quality, nucleic acid extraction efficiency, specificity of group-specific primers and probes, amplification of nonviable DNA, gene copy number variation, and the limited number of sequences in the database. Even with such limitations, qPCR is reportedly among the best methods for quantitatively investigating environmental microbial communities. The application of qPCR is and will continue to be increasingly common in studies of wastewater treatment systems. To obtain reliable analyses, however, the limitations that have often been overlooked must be carefully considered when interpreting the results. Copyright © 2013 Elsevier Inc. All rights reserved.
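The quantification principle reviewed here is the standard-curve approach: the quantification cycle (Cq) is linear in log10 of the starting copy number, and amplification efficiency follows from the slope. A minimal sketch with a hypothetical dilution series (values chosen to give the ideal slope of about -3.32 cycles per decade, i.e. ~100% efficiency):

```python
def fit_standard_curve(log10_copies, cq):
    """Least-squares line Cq = slope * log10(copies) + intercept.
    Amplification efficiency E = 10**(-1/slope) - 1 (100% at slope ~ -3.32)."""
    n = len(cq)
    mx = sum(log10_copies) / n
    my = sum(cq) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(log10_copies, cq))
             / sum((x - mx) ** 2 for x in log10_copies))
    intercept = my - slope * mx
    efficiency = 10 ** (-1 / slope) - 1
    return slope, intercept, efficiency

def copies_from_cq(cq, slope, intercept):
    """Invert the standard curve for an unknown sample."""
    return 10 ** ((cq - intercept) / slope)

# Hypothetical standards: 10^7 .. 10^3 copies, perfect doubling per cycle
standards = [7, 6, 5, 4, 3]
cqs = [10.0, 13.32, 16.64, 19.96, 23.28]
slope, intercept, eff = fit_standard_curve(standards, cqs)
```

Several of the limitations listed in the review (extraction efficiency, copy number variation) act on exactly this inversion step, biasing the inferred copy numbers even when the curve fit itself is excellent.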
Development of a method for detection and quantification of B. brongniartii and B. bassiana in soil
NASA Astrophysics Data System (ADS)
Canfora, L.; Malusà, E.; Tkaczuk, C.; Tartanus, M.; Łabanowska, B. H.; Pinzari, F.
2016-03-01
A culture-independent method based on qPCR was developed for the detection and quantification of two fungal inoculants in soil. The aim was to adapt a genotyping approach based on SSR (Simple Sequence Repeat) markers to the discriminating tracing of two different species of bioinoculants in soil after their in-field release. Two entomopathogenic fungi, Beauveria bassiana and B. brongniartii, were traced and quantified in soil samples obtained from field trials. These two fungal species were used as biological agents in Poland to control Melolontha melolontha (European cockchafer), whose soil-dwelling larvae threaten horticultural crops. The specificity of the SSR markers was verified using controls consisting of: i) soil samples containing fungal spores of B. bassiana and B. brongniartii in known dilutions; ii) the DNA of the fungal microorganisms; iii) soil samples singly inoculated with each fungal species. An initial evaluation of the protocol was performed with analyses of soil DNA and mycelial DNA. Further, the simultaneous detection and quantification of B. bassiana and B. brongniartii in soil was achieved in field samples after application of the bio-inoculants. The protocol can be considered a relatively low-cost solution for the detection, identification and traceability of fungal bio-inoculants in soil.
Reproducibility study of whole-brain 1H spectroscopic imaging with automated quantification.
Gu, Meng; Kim, Dong-Hyun; Mayer, Dirk; Sullivan, Edith V; Pfefferbaum, Adolf; Spielman, Daniel M
2008-09-01
A reproducibility study of proton MR spectroscopic imaging (1H-MRSI) of the human brain was conducted to evaluate the reliability of an automated 3D in vivo spectroscopic imaging acquisition and associated quantification algorithm. A PRESS-based pulse sequence was implemented using dualband spectral-spatial RF pulses designed to fully excite the singlet resonances of choline (Cho), creatine (Cre), and N-acetyl aspartate (NAA) while simultaneously suppressing water and lipids; 1% of the water signal was left to be used as a reference signal for robust data processing, and additional lipid suppression was obtained using adiabatic inversion recovery. Spiral k-space trajectories were used for fast spectral and spatial encoding, yielding high-quality spectra from 1 cc voxels throughout the brain with a 13-min acquisition time. Data were acquired with an 8-channel phased-array coil, and optimal signal-to-noise ratio (SNR) for the combined signals was achieved using a weighting based on the residual water signal. Automated quantification of the spectrum of each voxel was performed using LCModel. The complete study consisted of eight healthy adult subjects to assess intersubject variations and two subjects scanned six times each to assess intrasubject variations. The results demonstrate that reproducible whole-brain 1H-MRSI data can be robustly obtained with the proposed methods.
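The SNR-oriented combination of phased-array signals weighted by a per-channel water reference can be sketched as a conjugate-weighted sum. This is a generic matched-filter-style combination assuming equal, uncorrelated channel noise, not the authors' exact processing chain:

```python
import numpy as np

def combine_channels(spectra, water_refs):
    """Combine phased-array spectra using each channel's residual-water
    reference: conjugating the reference corrects the channel phase, and
    its magnitude weights channels by sensitivity (matched-filter sum,
    assuming equal uncorrelated noise across channels)."""
    refs = np.asarray(water_refs, dtype=complex)
    weights = np.conj(refs) / np.sum(np.abs(refs) ** 2)
    return np.tensordot(weights, np.asarray(spectra, dtype=complex), axes=(0, 0))

# Synthetic check: two channels see the same spectrum s through complex
# coil sensitivities; combining with the reference weights recovers s.
s = np.array([1.0, 2.0, 3.0], dtype=complex)
sens = np.array([2.0, 1j])              # hypothetical coil sensitivities
spectra = [c * s for c in sens]         # per-channel observed spectra
combined = combine_channels(spectra, sens)
```

In practice the reference amplitudes would be estimated voxel-by-voxel from the 1% residual water signal, so the weights vary across the brain with the coil sensitivity profiles.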
Petr, Jan; Schramm, Georg; Hofheinz, Frank; Langner, Jens; van den Hoff, Jörg
2014-10-01
The aim was to estimate the relaxation time changes during Q2TIPS bolus saturation caused by magnetization transfer effects, and to propose and evaluate an extended perfusion quantification model that takes this into account. Three multi-inversion-time pulsed arterial spin labeling sequences with different bolus saturation durations were acquired for five healthy volunteers. Magnetization transfer exchange rates in tissue and blood were obtained from control-image saturation recovery. Cerebral blood flow (CBF) values obtained using the extended model and the standard model were compared. A 6% (10%) decrease in the obtained CBF was observed in grey matter when the duration of bolus saturation increased from 600 to 900 ms (1200 ms). This decrease was reduced to 1.6% (2.8%) when the extended quantification model was used. Compared with the extended model, the standard model underestimated CBF in grey matter by 9.7%, 15.0%, and 18.7% for saturation durations of 600, 900, and 1200 ms, respectively. Results for simulated single inversion-time data showed 5-16% CBF underestimation depending on blood arrival time and bolus saturation duration. Magnetization transfer effects caused by bolus saturation pulses should not be ignored during quantification, as they can cause appreciable underestimation of CBF. Copyright © 2013 Wiley Periodicals, Inc.
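For context, the standard single-compartment PASL (QUIPSS II / Q2TIPS) quantification that the extended model builds on can be sketched as below. This is a minimal sketch of the conventional formula only; the magnetization transfer correction described in the abstract is not included, and the parameter values are illustrative assumptions rather than the study's settings.

```python
import math

def pasl_cbf(delta_m, m0, ti, ti1, t1_blood=1.65, alpha=0.98, lam=0.9):
    """Standard single-compartment PASL (QUIPSS II) quantification.

    delta_m  : control - label signal difference
    m0       : equilibrium (proton-density) reference signal
    ti       : inversion time of the imaging readout (s)
    ti1      : bolus duration imposed by the Q2TIPS saturation (s)
    t1_blood : longitudinal relaxation time of arterial blood (s)
    alpha    : labeling efficiency; lam: blood-tissue partition coefficient
    Returns CBF in ml/100 g/min.
    """
    return 6000.0 * lam * delta_m * math.exp(ti / t1_blood) / (2.0 * alpha * ti1 * m0)

# Illustrative grey-matter-like numbers (assumed, not from the paper)
cbf = pasl_cbf(delta_m=0.01, m0=1.0, ti=1.8, ti1=0.7)
```

The extended model of the paper additionally accounts for the relaxation time changes induced by the repeated Q2TIPS saturation pulses, which this standard formula ignores.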
Buh Gasparic, Meti; Tengs, Torstein; La Paz, Jose Luis; Holst-Jensen, Arne; Pla, Maria; Esteve, Teresa; Zel, Jana; Gruden, Kristina
2010-03-01
Several techniques have been developed for the detection and quantification of genetically modified organisms, but quantitative real-time PCR is by far the most popular approach. Among the most commonly used real-time PCR chemistries are TaqMan probes and SYBR green, but many other detection chemistries have also been developed. Because their performance has never been compared systematically, here we present an extensive evaluation of some promising chemistries: sequence-unspecific DNA labeling dyes (SYBR green), primer-based technologies (AmpliFluor, Plexor, Lux primers), and techniques involving double-labeled probes, comprising hybridization (molecular beacon) and hydrolysis (TaqMan, CPT, LNA, and MGB) probes, based on recently published experimental data. For each detection chemistry, assays targeting selected loci were included. The real-time PCR chemistries were then compared for their PCR amplification efficiency and their limits of detection and quantification. The overall applicability of the chemistries was evaluated, adding practicability and cost issues to the performance characteristics. No chemistry proved significantly better than the others, but certain features favor LNA and MGB technology as good alternatives to TaqMan in quantification assays. SYBR green and molecular beacon assays can perform equally well but may need more optimization prior to use.
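Whatever the chemistry, amplification efficiency is derived the same way: fit Ct against log10 template amount over a dilution series and convert the slope (an ideal assay has a slope of about -3.32, i.e. 100% efficiency). A minimal sketch, with dilution-series numbers made up for illustration:

```python
import numpy as np

def pcr_efficiency(log10_copies, ct_values):
    """Estimate amplification efficiency from a standard-curve dilution series.

    Fits Ct = slope * log10(copies) + intercept;
    efficiency = 10^(-1/slope) - 1, so slope = -3.32 gives ~100%.
    """
    slope, intercept = np.polyfit(log10_copies, ct_values, 1)
    efficiency = 10.0 ** (-1.0 / slope) - 1.0
    return slope, intercept, efficiency

# Ideal 10-fold dilution series: Ct rises ~3.32 cycles per decade of dilution
log10_copies = np.array([6.0, 5.0, 4.0, 3.0, 2.0])
ct = np.array([15.0, 18.32, 21.64, 24.96, 28.28])
slope, intercept, eff = pcr_efficiency(log10_copies, ct)
```

Efficiencies well below 1.0, or differing between chemistries on the same locus, are one of the performance characteristics the comparison above examines.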
Instantaneous Wavenumber Estimation for Damage Quantification in Layered Plate Structures
NASA Technical Reports Server (NTRS)
Mesnil, Olivier; Leckey, Cara A. C.; Ruzzene, Massimo
2014-01-01
This paper illustrates the application of instantaneous and local wavenumber damage quantification techniques for high frequency guided wave interrogation. The proposed methodologies can be considered as first steps towards a hybrid structural health monitoring/nondestructive evaluation (SHM/NDE) approach for damage assessment in composites. The challenges and opportunities related to the considered type of interrogation and signal processing are explored through the analysis of numerical data obtained via EFIT simulations of damage in CFRP plates. Realistic damage configurations are modeled from x-ray CT scan data of plates subjected to actual impacts, in order to accurately predict wave-damage interactions in terms of scattering and mode conversions. Simulation data are utilized to enhance the information provided by instantaneous and local wavenumbers and to mitigate the complexity related to the multi-modal content of the plate response. Signal processing strategies considered for this purpose include modal decoupling through filtering in the frequency/wavenumber domain, the combination of displacement components, and the exploitation of polarization information for the various modes as evaluated through the dispersion analysis of the considered laminate lay-up sequence. The results presented assess the effectiveness of the proposed wavefield processing techniques as a hybrid SHM/NDE technique for damage detection and quantification in composite, plate-like structures.
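The core of the instantaneous wavenumber idea can be sketched in one spatial dimension: form the analytic signal of a wavefield snapshot along space and differentiate its unwrapped phase, so a local thickness or stiffness change (damage) appears as a local wavenumber shift. The synthetic wavenumbers below are illustrative assumptions, not values from the paper:

```python
import numpy as np
from scipy.signal import hilbert

def instantaneous_wavenumber(wavefield_line, dx):
    """Local wavenumber along one spatial line of a wavefield snapshot:
    analytic signal along x, unwrapped phase, then d(phase)/dx."""
    analytic = hilbert(wavefield_line)
    phase = np.unwrap(np.angle(analytic))
    return np.gradient(phase, dx)

# Synthetic check: wavenumber jumps from k1 to k2 mid-line, mimicking
# a local thickness reduction in a plate.
dx = 1e-3
x = np.arange(0.0, 0.4, dx)
k1, k2 = 200.0, 300.0  # rad/m
phase_true = np.where(x < 0.2, k1 * x, k1 * 0.2 + k2 * (x - 0.2))
signal = np.cos(phase_true)
k_est = instantaneous_wavenumber(signal, dx)
```

In practice this is applied to 2D scanning laser vibrometer (or simulated) wavefields after the frequency/wavenumber filtering described above, which removes the other guided wave modes.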
Development of a method for detection and quantification of B. brongniartii and B. bassiana in soil
Canfora, L.; Malusà, E.; Tkaczuk, C.; Tartanus, M.; Łabanowska, B.H.; Pinzari, F.
2016-01-01
A culture independent method based on qPCR was developed for the detection and quantification of two fungal inoculants in soil. The aim was to adapt a genotyping approach based on SSR (Simple Sequence Repeat) marker to a discriminating tracing of two different species of bioinoculants in soil, after their in-field release. Two entomopathogenic fungi, Beauveria bassiana and B. brongniartii, were traced and quantified in soil samples obtained from field trials. These two fungal species were used as biological agents in Poland to control Melolontha melolontha (European cockchafer), whose larvae live in soil menacing horticultural crops. Specificity of SSR markers was verified using controls consisting of: i) soil samples containing fungal spores of B. bassiana and B. brongniartii in known dilutions; ii) the DNA of the fungal microorganisms; iii) soil samples singly inoculated with each fungus species. An initial evaluation of the protocol was performed with analyses of soil DNA and mycelial DNA. Further, the simultaneous detection and quantification of B. bassiana and B. brongniartii in soil was achieved in field samples after application of the bio-inoculants. The protocol can be considered as a relatively low cost solution for the detection, identification and traceability of fungal bio-inoculants in soil. PMID:26975931
Salisu, Ibrahim B.; Shahid, Ahmad A.; Yaqoob, Amina; Ali, Qurban; Bajwa, Kamran S.; Rao, Abdul Q.; Husnain, Tayyab
2017-01-01
As genetically modified crops gain attention globally, their proper approval and commercialization require accurate and reliable diagnostic methods for transgenic content. These diagnostic techniques fall into two major groups: identification of (1) transgenic DNA and (2) transgenic proteins from GMOs and their products. Conventional methods such as PCR (polymerase chain reaction) and the enzyme-linked immunosorbent assay (ELISA) have routinely been employed for DNA- and protein-based quantification, respectively. Although these techniques (PCR and ELISA) are convenient and productive, more advanced technologies allowing high-throughput detection and quantification of GM events are needed, as production of increasingly complex GMOs grows day by day. Recent approaches such as microarrays, capillary gel electrophoresis, digital PCR, and next-generation sequencing are therefore promising due to their accuracy and precise detection of transgenic content. The present article is a brief comparative study of all such detection techniques on the basis of their advent, feasibility, accuracy, and cost effectiveness. Nevertheless, these emerging technologies still face critical issues to be addressed in the future: detection of a specific event, contamination by different events, and determination of fusion as well as stacked-gene proteins. PMID:29085378
Gatos, Ilias; Tsantis, Stavros; Spiliopoulos, Stavros; Skouroliakou, Aikaterini; Theotokas, Ioannis; Zoumpoulis, Pavlos; Hazle, John D; Kagadis, George C
2015-07-01
The aim was to detect and classify focal liver lesions (FLLs) from contrast-enhanced ultrasound (CEUS) imaging by means of an automated quantification algorithm. The proposed algorithm employs a sophisticated segmentation method to detect and contour focal lesions from 52 CEUS video sequences (30 benign and 22 malignant). Lesion detection uses wavelet transform zero crossings as an initialization step for the Markov random field model that extracts the lesion contour. After FLL detection across frames, a time-intensity curve (TIC) is computed, which captures the contrast agent's behavior at all vascular phases with respect to adjacent parenchyma for each patient. From each TIC, eight features were automatically calculated and fed into a support vector machine (SVM) classification algorithm in the design of the image analysis model. With regard to FLL detection accuracy, the detected lesions had an average overlap value of 0.89 ± 0.16 with manual segmentations for all CEUS frame subsets included in the study. The highest classification accuracy from the SVM model was 90.3%, misdiagnosing three benign and two malignant FLLs, with sensitivity and specificity values of 93.1% and 86.9%, respectively. The proposed quantification system, which combines FLL detection and classification algorithms, may be of value to physicians as a second-opinion tool for avoiding unnecessary invasive procedures.
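The abstract does not enumerate the eight TIC features, but typical perfusion descriptors extracted from a time-intensity curve can be sketched as below. The feature subset and the synthetic gamma-variate-like curve are illustrative assumptions, not the paper's exact feature set:

```python
import numpy as np

def tic_features(t, intensity):
    """A few common time-intensity-curve descriptors: peak enhancement,
    time to peak, area under the curve, and wash-in rate."""
    baseline = intensity[0]
    peak = float(intensity.max())
    t_peak = float(t[np.argmax(intensity)])
    # Trapezoidal AUC computed explicitly (portable across NumPy versions)
    auc = float(np.sum(0.5 * (intensity[1:] + intensity[:-1]) * np.diff(t)))
    wash_in = (peak - baseline) / t_peak if t_peak > 0 else 0.0
    return {"peak": peak, "time_to_peak": t_peak, "auc": auc,
            "wash_in_rate": wash_in}

# Synthetic gamma-variate-like TIC peaking at t = 10 s with amplitude 100
t = np.linspace(0.0, 60.0, 121)
tic = 100.0 * (t / 10.0) * np.exp(1.0 - t / 10.0)
feats = tic_features(t, tic)
```

Features like these, computed separately for the lesion and adjacent parenchyma, are what an SVM can then separate into benign and malignant classes.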
Grimm, Alexandra; Meyer, Heiko; Nickel, Marcel D; Nittka, Mathias; Raithel, Esther; Chaudry, Oliver; Friedberger, Andreas; Uder, Michael; Kemmler, Wolfgang; Quick, Harald H; Engelke, Klaus
2018-06-01
The purpose of this study is to evaluate and compare 2-point (2pt), 3-point (3pt), and 6-point (6pt) Dixon magnetic resonance imaging (MRI) sequences with flexible echo times (TE) to measure proton density fat fraction (PDFF) within muscles. Two subject groups were recruited (G1: 23 young and healthy men, 31 ± 6 years; G2: 50 elderly, sarcopenic men, 77 ± 5 years). A 3-T MRI system was used to perform Dixon imaging on the left thigh. PDFF was measured with six Dixon prototype sequences: 2pt, 3pt, and 6pt sequences, once with optimal TEs (in- and opposed-phase echo times), lower resolution, and higher bandwidth (optTE sequences) and once with higher image resolution (highRes sequences) and the shortest possible TE, respectively. Intra-fascia PDFF content was determined. To evaluate the comparability among the sequences, Bland-Altman analysis was performed. The highRes 6pt Dixon sequence served as reference, as a high correlation of this sequence with magnetic resonance spectroscopy has been shown before. The PDFF difference between the highRes 6pt Dixon sequence and the optTE 6pt, both 3pt, and the optTE 2pt sequences was low (between 2.2% and 4.4%); however, the difference to the highRes 2pt Dixon sequence was large (33%). For the optTE sequences, the difference decreased with the number of echoes used. In conclusion, for Dixon sequences with more than two echoes, the fat fraction measurement was reliable with arbitrary echo times, while for 2pt Dixon sequences it was reliable only with dedicated in- and opposed-phase echo timing. Copyright © 2018 Elsevier B.V. All rights reserved.
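The 2pt Dixon water-fat separation underlying these measurements can be sketched as follows. This magnitude-based toy version assumes water dominates in every voxel (W ≥ F) and ignores the T1, T2*, and field-map corrections that multi-echo sequences use to make PDFF quantitative, which is precisely why the 2pt sequence needs exact in- and opposed-phase timing:

```python
import numpy as np

def pdff_2pt(in_phase, opposed_phase):
    """2-point Dixon: with water W and fat F, IP = W + F and OP = W - F
    (magnitude images, assuming W >= F), so
    PDFF = F / (W + F) = (IP - OP) / (2 * IP)."""
    water = 0.5 * (in_phase + opposed_phase)
    fat = 0.5 * (in_phase - opposed_phase)
    return fat / (water + fat)

# Two toy voxels: pure water, and 20% fat fraction
ip = np.array([100.0, 100.0])
op = np.array([100.0, 60.0])
ff = pdff_2pt(ip, op)
```

With three or more echoes at arbitrary TEs, water and fat are instead obtained by fitting a signal model across echoes, which is why those sequences tolerate flexible echo times.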
Risk-Based Fire Safety Experiment Definition for Manned Spacecraft
NASA Technical Reports Server (NTRS)
Apostolakis, G. E.; Ho, V. S.; Marcus, E.; Perry, A. T.; Thompson, S. L.
1989-01-01
Risk methodology is used to define experiments to be conducted in space which will help to construct and test the models required for accident sequence identification. The development of accident scenarios is based on the realization that whether damage occurs depends on the time competition between two processes: the ignition and creation of an adverse environment, and the detection and suppression activities. If the fire grows and causes damage faster than it is detected and suppressed, an accident occurs. The proposed integrated experiments will provide information on individual models that apply to each of the above processes, as well as on previously unidentified interactions and processes, if any. Initially, models that are used in terrestrial fire risk assessments are considered. These include heat and smoke release models, detection and suppression models, as well as damage models. In cases where the absence of gravity substantially invalidates a model, alternate models will be developed. Models that depend on buoyancy effects, such as the multizone compartment fire models, are included in these cases. The experiments will be performed in a variety of geometries simulating habitable areas, racks, and other spaces. These simulations will necessitate theoretical studies of scaling effects. Sensitivity studies will also be carried out, including the effects of varying oxygen concentrations, pressures, fuel orientation and geometry, and air flow rates. The experimental apparatus described herein includes three major modules: the combustion, the fluids, and the command and power modules.
NASA Astrophysics Data System (ADS)
Huh, Chih-An; Hsu, Shih-Chieh; Lin, Chuan-Yao
2012-02-01
The 2011 Fukushima nuclear accident in Japan was the worst nuclear disaster since the 1986 Chernobyl accident. Fission products (nuclides) released from the Fukushima plant site since March 12, 2011 were detected across the northern hemisphere within about two weeks, and in the southern hemisphere about one month later. We report here detailed time series of radioiodine and radiocesium isotopes monitored in a regional network around Taiwan, including one high-mountain and three ground-level sites. Our results show several pulses of emission from a sequence of accidents at the Fukushima facility, with the more volatile 131I released preferentially over 134Cs and 137Cs at the beginning. In the middle of the time series, there was a pronounced peak of radiocesium observed in northern Taiwan, with activity concentrations of 134Cs and 137Cs far exceeding that of 131I during that episode. From the first arrival time of these fission nuclides and their spatial and temporal variations at our sampling sites and elsewhere, we suggest that Fukushima-derived radioactive nuclides were transported to Taiwan and its vicinity via two pathways at different altitudes. One was transport in the free troposphere by the prevailing westerly winds around the globe; the other was transport in the planetary boundary layer by the northeast monsoon wind directly toward Taiwan.
Zou, Wei; Marcil, Anne; Paquet, Eric; Gadoury, Christine; Jaentschke, Bozena; Li, Xuguang; Petiot, Emma; Durocher, Yves; Baardsnes, Jason; Rosa-Calatrava, Manuel; Ansorge, Sven; Kamen, Amine A.
2017-01-01
Vaccination is the most effective course of action to prevent influenza. About 150 million doses of influenza vaccines were distributed for the 2015–2016 season in the USA alone, according to the Centers for Disease Control and Prevention. Vaccine dosage is calculated based on the concentration of hemagglutinin (HA), the main surface glycoprotein expressed by influenza, which varies from strain to strain. Therefore, yearly-updated strain-specific antibodies and calibrating antigens are required. Preparing these quantification reagents can take up to three months and significantly slows down the release of new vaccine lots. Therefore, to circumvent the need for strain-specific sera, two anti-HA monoclonal antibodies (mAbs) against a highly conserved sequence have been produced by immunizing mice with a novel peptide-conjugate. Immunoblots demonstrate that 40 strains of influenza encompassing HA subtypes H1 to H13, as well as B strains from the Yamagata and Victoria lineages, were detected when the two mAbs were combined to form a pan-HA mAb cocktail. Quantification using this pan-HA mAb cocktail was achieved in a dot blot assay, and results correlated with concentrations measured in a hemagglutination assay with a coefficient of correlation of 0.80. A competitive ELISA was also optimised with purified virus-like particles. Regardless of the quantification method used, pan-HA antibodies can be employed to accelerate process development when strain-specific antibodies are not available, and represent a valuable tool in case of pandemics. These antibodies were also expressed in CHO cells to facilitate large-scale production using bioreactor technologies, which might be required to meet industrial needs for quantification reagents.
Finally, a simulation model was created to predict the binding affinity of the two anti-HA antibodies to the amino acids composing the highly conserved epitope; different probabilities of interaction between a given amino acid and the antibodies might explain the affinity of each antibody against different influenza strains. PMID:28662134
In situ spectroradiometric quantification of ERTS data
NASA Technical Reports Server (NTRS)
Yost, E. (Principal Investigator)
1972-01-01
The author has identified the following significant results. Additive color photographic analysis of ERTS-1 multispectral imagery indicates that the presence of soil moisture in playas (desert dry lakes) can be readily detected from space. Time-sequence additive color presentations, in which 600-700 nm bands from three successive 18-day cycles are combined, show that changes in playa soil moisture over time appear as unique color signatures and can probably be measured quantitatively from photographic images of multispectral scanner data.
Diagnostics of Tree Diseases Caused by Phytophthora austrocedri Species.
Mulholland, Vincent; Elliot, Matthew; Green, Sarah
2015-01-01
We present methods for the detection and quantification of four Phytophthora species that are pathogenic on trees: Phytophthora ramorum, Phytophthora kernoviae, Phytophthora lateralis, and Phytophthora austrocedri. Nucleic acid extraction methods are presented for phloem tissue from trees, for soil, and for pure cultures on agar plates. Real-time PCR methods are presented, including primer and probe sets for each species and general advice on real-time PCR setup and data analysis. A method for sequence-based identification, useful for pure cultures, is also included.
Two phase flow bifurcation due to turbulence: transition from slugs to bubbles
NASA Astrophysics Data System (ADS)
Górski, Grzegorz; Litak, Grzegorz; Mosdorf, Romuald; Rysak, Andrzej
2015-09-01
The bifurcation of slugs to bubbles within two-phase flow patterns in a minichannel is analyzed. The two-phase (water-air) flow occurring in a circular horizontal minichannel with a diameter of 1 mm is examined. The sequences of light transmission time series recorded by a laser-phototransistor sensor are analyzed using recurrence plots and recurrence quantification analysis. The recurrence parameters allow the two-phase flow patterns to be identified. On changing the water flow rate, we identified partitioning of slugs or aggregation of bubbles.
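A recurrence plot and its simplest quantifier, the recurrence rate, can be sketched for a scalar time series as below; real recurrence quantification analysis typically also embeds the series in a delay-coordinate phase space and computes further measures (determinism, laminarity), which this minimal sketch omits:

```python
import numpy as np

def recurrence_matrix(x, eps):
    """Binary recurrence plot of a scalar series:
    R[i, j] = 1 when |x_i - x_j| < eps."""
    d = np.abs(x[:, None] - x[None, :])
    return (d < eps).astype(int)

def recurrence_rate(R):
    """Fraction of recurrent point pairs, excluding the trivial
    main diagonal."""
    n = R.shape[0]
    return (R.sum() - n) / (n * (n - 1))

# A periodic signal (stand-in for a slug-flow light transmission trace)
x = np.sin(np.linspace(0.0, 8.0 * np.pi, 200))
R = recurrence_matrix(x, eps=0.1)
rr = recurrence_rate(R)
```

Changes in such recurrence measures as the water flow rate varies are what reveal the slug-to-bubble transition described above.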
Automatic right ventricle (RV) segmentation by propagating a basal spatio-temporal characterization
NASA Astrophysics Data System (ADS)
Atehortúa, Angélica; Zuluaga, María. A.; Martínez, Fabio; Romero, Eduardo
2015-12-01
An accurate right ventricular (RV) function quantification is important to support the evaluation, diagnosis and prognosis of several cardiac pathologies and to complement the left ventricular function assessment. However, expert RV delineation is a time-consuming task with high inter- and intra-observer variability. In this paper we present an automatic segmentation method for the RV in cardiac MR sequences. Unlike atlas or multi-atlas methods, this approach estimates the RV using exclusively information from the sequence itself. To do so, a spatio-temporal analysis segments the heart at the basal slice, and this segmentation is then propagated to the apex using a non-rigid registration strategy. The proposed approach achieves an average Dice score of 0.79 evaluated on a set of 48 patients.
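The Dice score used for that evaluation is a standard overlap measure between an automatic and a manual segmentation mask; the tiny masks below are made up for illustration:

```python
import numpy as np

def dice_score(seg_a, seg_b):
    """Dice similarity coefficient between two binary masks:
    2|A ∩ B| / (|A| + |B|). Returns 1.0 for two empty masks."""
    a = seg_a.astype(bool)
    b = seg_b.astype(bool)
    denom = a.sum() + b.sum()
    return 2.0 * np.logical_and(a, b).sum() / denom if denom else 1.0

# Two shifted 4x4 squares on an 8x8 grid: 16 px each, 9 px overlap
auto = np.zeros((8, 8), dtype=int)
auto[2:6, 2:6] = 1
manual = np.zeros((8, 8), dtype=int)
manual[3:7, 3:7] = 1
d = dice_score(auto, manual)  # 2*9 / (16+16) = 0.5625
```

A Dice score of 0.79 thus means the automatic RV contour and the expert contour share roughly 79% of their combined area on average.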
Ahdesmäki, Miika J; Gray, Simon R; Johnson, Justin H; Lai, Zhongwu
2016-01-01
Grafting of cell lines and primary tumours is a crucial step in the drug development process between cell line studies and clinical trials. Disambiguate is a program for computationally separating the sequencing reads of the two species present in grafted samples. Disambiguate operates on DNA or RNA-seq alignments against the two species and separates the components at very high sensitivity and specificity, as illustrated on artificially mixed human-mouse samples. This allows for maximum recovery of data from target tumours for more accurate variant calling and gene expression quantification. Because no general-purpose open source algorithm accessible to the bioinformatics community existed for separating the data of the two species, Disambiguate presents a novel approach and improvement to sequence analysis of grafted samples. Both Python and C++ implementations are available and are integrated into several open and closed source pipelines. Disambiguate is open source and freely available at https://github.com/AstraZeneca-NGS/disambiguate.
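The underlying decision logic can be sketched as follows: each read is aligned to both genomes and assigned to the species where it aligns better, with near-ties flagged as ambiguous. Disambiguate itself works on BAM alignments using aligner-specific score tags; the toy function and its parameter names below are invented for illustration only:

```python
def assign_read(score_human, score_mouse, margin=0):
    """Assign a read to a species by comparing its best alignment score
    against each genome. None means the read did not align to that genome."""
    if score_human is None and score_mouse is None:
        return "unmapped"
    if score_mouse is None or (score_human is not None
                               and score_human > score_mouse + margin):
        return "human"
    if score_human is None or score_mouse > score_human + margin:
        return "mouse"
    return "ambiguous"
```

Reads binned as "mouse" (stroma) are then excluded before variant calling or expression quantification on the target tumour.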
Ruggeri, Marco; de Freitas, Carolina; Williams, Siobhan; Hernandez, Victor M.; Cabot, Florence; Yesilirmak, Nilufer; Alawa, Karam; Chang, Yu-Cherng; Yoo, Sonia H.; Gregori, Giovanni; Parel, Jean-Marie; Manns, Fabrice
2016-01-01
Two SD-OCT systems and a dual channel accommodation target were combined and precisely synchronized to simultaneously image the anterior segment and the ciliary muscle during dynamic accommodation. The imaging system simultaneously generates two synchronized OCT image sequences of the anterior segment and ciliary muscle with an imaging speed of 13 frames per second. The system was used to acquire OCT image sequences of a non-presbyopic and a pre-presbyopic subject accommodating in response to step changes in vergence. The image sequences were processed to extract dynamic morphological data from the crystalline lens and the ciliary muscle. The synchronization between the OCT systems allowed the precise correlation of anatomical changes occurring in the crystalline lens and ciliary muscle at identical time points during accommodation. To describe the dynamic interaction between the crystalline lens and ciliary muscle, we introduce accommodation state diagrams that display the relation between anatomical changes occurring in the accommodating crystalline lens and ciliary muscle. PMID:27446660
Pope, Welkin H; Bowman, Charles A; Russell, Daniel A; Jacobs-Sera, Deborah; Asai, David J; Cresawn, Steven G; Jacobs, William R; Hendrix, Roger W; Lawrence, Jeffrey G; Hatfull, Graham F; Abbazia, Patrick; Ababio, Amma; Adam, Naazneen
2015-01-01
The bacteriophage population is large, dynamic, ancient, and genetically diverse. Limited genomic information shows that phage genomes are mosaic, and the genetic architecture of phage populations remains ill-defined. To understand the population structure of phages infecting a single host strain, we isolated, sequenced, and compared 627 phages of Mycobacterium smegmatis. Their genetic diversity is considerable, and there are 28 distinct genomic types (clusters) with related nucleotide sequences. However, amino acid sequence comparisons show pervasive genomic mosaicism, and quantification of inter-cluster and intra-cluster relatedness reveals a continuum of genetic diversity, albeit with uneven representation of different phages. Furthermore, rarefaction analysis shows that the mycobacteriophage population is not closed, and there is a constant influx of genes from other sources. Phage isolation and analysis was performed by a large consortium of academic institutions, illustrating the substantial benefits of a disseminated, structured program involving large numbers of freshman undergraduates in scientific discovery. DOI: http://dx.doi.org/10.7554/eLife.06416.001 PMID:25919952
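The rarefaction analysis mentioned above asks how many distinct genomic clusters are observed, on average, in random subsamples of increasing size; a curve that keeps rising without plateauing indicates an open population. A minimal sketch with an invented toy cluster distribution:

```python
import random

def rarefaction_curve(cluster_labels, n_draws=200, seed=0):
    """Mean number of distinct clusters seen in random subsamples of each
    size from 1 to the full collection, averaged over n_draws resamples."""
    rng = random.Random(seed)
    n = len(cluster_labels)
    curve = []
    for size in range(1, n + 1):
        total = 0
        for _ in range(n_draws):
            total += len(set(rng.sample(cluster_labels, size)))
        curve.append(total / n_draws)
    return curve

# Toy collection: 18 phages in 4 clusters with very uneven representation
labels = ["A"] * 10 + ["B"] * 5 + ["C"] * 2 + ["D"] * 1
curve = rarefaction_curve(labels)
```

Applied to the 627 sequenced mycobacteriophages and their 28 clusters, the same calculation shows the curve still climbing, i.e. further isolation would keep finding new cluster types.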
Dynamic ASXL1 Exon Skipping and Alternative Circular Splicing in Single Human Cells
Natarajan, Sivaraman; Carter, Robert; Brown, Patrick O.
2016-01-01
Circular RNAs comprise a poorly understood new class of noncoding RNA. In this study, we used a combination of targeted deletion, high-resolution splicing detection, and single-cell sequencing to deeply probe ASXL1 circular splicing. We found that efficient circular splicing required the canonical transcriptional start site and inverted AluSx elements. Sequencing-based interrogation of isoforms after ASXL1 overexpression identified promiscuous linear splicing between all exons, with the two most abundant non-canonical linear products skipping the exons that produced the circular isoforms. Single-cell sequencing revealed a strong preference for either the linear or circular ASXL1 isoforms in each cell, and found the predominant exon skipping product is frequently co-expressed with its reciprocal circular isoform. Finally, absolute quantification of ASXL1 isoforms confirmed our findings and suggests that standard methods overestimate circRNA abundance. Taken together, these data reveal a dynamic new view of circRNA genesis, providing additional framework for studying their roles in cellular biology. PMID:27736885
Chance, necessity and the origins of life: a physical sciences perspective
NASA Astrophysics Data System (ADS)
Hazen, Robert M.
2017-11-01
Earth's 4.5-billion-year history has witnessed a complex sequence of high-probability chemical and physical processes, as well as 'frozen accidents'. Most models of life's origins similarly invoke a sequence of chemical reactions and molecular self-assemblies in which both necessity and chance play important roles. Recent research adds two important insights into this discussion. First, in the context of chemical reactions, chance versus necessity is an inherently false dichotomy: a range of probabilities exists for many natural events. Second, given the combinatorial richness of early Earth's chemical and physical environments, events in molecular evolution that are unlikely at limited laboratory scales of space and time may, nevertheless, be inevitable on an Earth-like planet at time scales of a billion years. This article is part of the themed issue 'Reconceptualizing the origins of life'.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ritchie, L.T.; Alpert, D.J.; Burke, R.P.
1984-03-01
The CRAC2 computer code is a revised version of CRAC (Calculation of Reactor Accident Consequences) which was developed for the Reactor Safety Study. This document provides an overview of the CRAC2 code and a description of each of the models used. Significant improvements incorporated into CRAC2 include an improved weather sequence sampling technique, a new evacuation model, and new output capabilities. In addition, refinements have been made to the atmospheric transport and deposition model. Details of the modeling differences between CRAC2 and CRAC are emphasized in the model descriptions.
Role of susceptibility-weighted imaging in demonstration of cerebral fat embolism
Yeap, Pheyming; Kanodia, Avinash Kumar; Main, Gavin; Yong, Aiwain
2015-01-01
Cerebral fat embolism (CFE) is a rare but potentially lethal complication of long bone fractures. Many cases of CFE occur as subclinical events and remain undiagnosed. We report a case of a 22-year-old man, with multiple long bone fractures from a road traffic accident, who subsequently developed hypoxia, neurological abnormality and petechial rash. CT of the head was normal. MRI of the head confirmed the diagnosis with lesions markedly conspicuous and most widespread on susceptibility-weighted imaging as compared to all other sequences including diffusion-weighted imaging. PMID:25572601
Mariappan, Yogesh K.; Dzyubak, Bogdan; Glaser, Kevin J.; Venkatesh, Sudhakar K.; Sirlin, Claude B.; Hooker, Jonathan; McGee, Kiaran P.
2017-01-01
Purpose: To (a) evaluate modified spin-echo (SE) magnetic resonance (MR) elastographic sequences for acquiring MR images with improved signal-to-noise ratio (SNR) in patients in whom the standard gradient-echo (GRE) MR elastographic sequence yields low hepatic signal intensity and (b) compare the stiffness values obtained with these sequences with those obtained with the conventional GRE sequence. Materials and Methods: This HIPAA-compliant retrospective study was approved by the institutional review board; the requirement to obtain informed consent was waived. Data obtained with modified SE and SE echo-planar imaging (EPI) MR elastographic pulse sequences with short echo times were compared with those obtained with the conventional GRE MR elastographic sequence in two patient cohorts, one that exhibited adequate liver signal intensity and one that exhibited low liver signal intensity. Shear stiffness values obtained with the three sequences in 130 patients with successful GRE-based examinations were retrospectively tested for statistical equivalence by using a 5% margin. In 47 patients in whom GRE examinations were considered to have failed because of low SNR, the SNR and confidence level with the SE-based sequences were compared with those with the GRE sequence. Results: The results of this study helped confirm the equivalence of SE MR elastography and SE-EPI MR elastography to GRE MR elastography (P = .0212 and P = .0001, respectively). The SE and SE-EPI MR elastographic sequences provided substantially improved SNR and stiffness inversion confidence level in 47 patients in whom GRE MR elastography had failed. Conclusion: Modified SE-based MR elastographic sequences provide higher SNR MR elastographic data and reliable stiffness measurements; thus, they enable quantification of stiffness in patients in whom the conventional GRE MR elastographic sequence failed owing to low signal intensity.
The equivalence of the three sequences indicates that the current diagnostic thresholds are applicable to SE MR elastographic sequences for assessing liver fibrosis. © RSNA, 2016 PMID:27509543
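The 5% equivalence margin used in the stiffness comparison above maps naturally onto a two one-sided tests (TOST) procedure. Below is a minimal Python sketch with simulated paired stiffness values; the function name, data, and margin convention are illustrative assumptions, not the authors' actual statistical analysis.

```python
import numpy as np
from scipy import stats

def tost_equivalence(a, b, margin_frac=0.05):
    """Two one-sided tests (TOST) for equivalence of paired measurements.

    a, b: paired stiffness values (e.g. GRE vs SE MR elastography, kPa).
    margin_frac: equivalence margin as a fraction of the mean of `a`
                 (mirroring the 5% margin described in the abstract).
    Returns the larger of the two one-sided p-values; equivalence is
    supported when it falls below the chosen alpha.
    """
    a, b = np.asarray(a, float), np.asarray(b, float)
    diff = b - a
    margin = margin_frac * a.mean()
    n = diff.size
    se = diff.std(ddof=1) / np.sqrt(n)
    # H0: mean diff <= -margin   vs   H1: mean diff > -margin
    t_lower = (diff.mean() + margin) / se
    # H0: mean diff >= +margin   vs   H1: mean diff < +margin
    t_upper = (diff.mean() - margin) / se
    p_lower = stats.t.sf(t_lower, n - 1)
    p_upper = stats.t.cdf(t_upper, n - 1)
    return max(p_lower, p_upper)

rng = np.random.default_rng(0)
gre = rng.normal(3.0, 0.5, 130)           # simulated GRE stiffness, kPa
se_mre = gre + rng.normal(0, 0.05, 130)   # SE values close to GRE
p = tost_equivalence(gre, se_mre)
print(p < 0.05)  # a small p supports equivalence within the margin
```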
2009-01-01
Background The oomycete Aphanomyces astaci is regarded as the causative agent of crayfish plague and represents an evident hazard for European crayfish species. Native crayfish populations infected with this pathogen suffer up to 100% mortality. The existence of multiple transmission paths necessitates the development of a reliable, robust and efficient test to detect the pathogen. Currently, A. astaci is diagnosed by a PCR-based assay that suffers from cross-reactivity to other species. We developed an alternative closed-tube assay for A. astaci, which achieves robustness through simultaneous amplification of multiple functionally constrained genes. Results Two novel constitutively expressed members of the glycosyl hydrolase (GH18) gene family of chitinases were isolated from the A. astaci strain Gb04. The primary amino acid sequence of these chitinase genes, termed CHI2 and CHI3, is composed of an N-terminal signal peptide directing the post-translational transport of the protein into the extracellular space, the catalytic GH18 domain, a proline-, serine-, and threonine-rich domain and a C-terminal cysteine-rich putative chitin-binding site. The A. astaci mycelium grown in a peptone-glucose medium showed significant temporal changes in steady-state CHI2 and CHI3 mRNA amounts, indicating functional constraint. Their different temporal occurrence with maxima at 48 and 24 hours of incubation for CHI2 and CHI3, respectively, is in accordance with the multifunctionality of GH18 family members. To identify A. astaci-specific primer target sites in these novel genes, we determined the partial sequence homologs in the related oomycetes A. frigidophilus, A. invadans, A. helicoides, A. laevis, A. repetans, Achlya racemosa, Leptolegnia caudata, and Saprolegnia parasitica, as well as in the relevant fungi Fusarium solani and Trichosporon cutaneum. An A.
astaci-specific primer pair targeting the novel genes CHI2 and CHI3 as well as CHI1 - a third GH18 family member - was multiplexed with primers targeting the 5.8S rRNA used as an endogenous control. A species was typed unambiguously as A. astaci if two peaks were concomitantly detected by melting curve analysis (MCA). For sensitive detection of the pathogen, but also for quantification of agent levels in susceptible crayfish and carrier crayfish, a TaqMan probe-based real-time PCR (qPCR) assay was developed. It targets the same chitinase genes and allows quantification down to 25 target sequences. Conclusion The simultaneous qualitative detection of multiple sequences by qPCR/MCA represents a promising approach to detect species with elevated levels of genetic variation and/or limited available sequence information. The homogeneous closed-tube format, reduced detection time, higher specificity, and the considerably reduced chance of false-negative detection achieved by targeting multiple genes (CHI1, CHI2, CHI3, and the endogenous control), at least two of which are subject to high functional constraint, are the major advantages of this multiplex assay compared to other diagnostic methods. Sensitive quantification achieved with TaqMan qPCR facilitates monitoring of infection status and pathogen distribution in different tissues and can help prevent disease transmission. PMID:19719847
The costs of traumatic brain injury due to motorcycle accidents in Hanoi, Vietnam
Hoang, Hanh TM; Pham, Tran L; Vo, Thuy TN; Nguyen, Phuong K; Doran, Christopher M; Hill, Peter S
2008-01-01
Background Road traffic accidents are the leading cause of fatal and non-fatal injuries in Vietnam. The purpose of this study is to estimate the costs, in the first year post-injury, of non-fatal traumatic brain injury (TBI) in motorcycle users not wearing helmets in Hanoi, Vietnam. The costs are calculated from the perspective of the injured patients and their families, and include quantification of direct, indirect and intangible costs, using years lost due to disability as a proxy. Methods The study was a retrospective cross-sectional study. Data on treatment and rehabilitation costs, employment and support were obtained from patients and their families using a structured questionnaire and The European Quality of Life instrument (EQ6D). Results Thirty-five patients and their families were interviewed. On average, patients with severe, moderate and minor TBI incurred direct costs at USD 2,365, USD 1,390 and USD 849, with time lost for normal activities averaging 54 weeks, 26 weeks and 17 weeks and years lived with disability (YLD) of 0.46, 0.25 and 0.15 year, respectively. Conclusion All three component costs of TBI were high; the direct cost accounted for the largest proportion, with costs rising with the severity of TBI. The results suggest that the burden of TBI can be catastrophic for families because of high direct costs, significant time off work for patients and caregivers, and impact on health-related quality of life. Further research is warranted to explore the actual social and economic benefits of mandatory helmet use. PMID:18718026
Fan, Jing; Yang, Haowen; Liu, Ming; Wu, Dan; Jiang, Hongrong; Zeng, Xin; Elingarami, Sauli; Li, Zhiyang; Li, Song; Liu, Hongna; He, Nongyue
2015-02-01
In this research, a novel method for relative fluorescent quantification of DNA based on Fe3O4@SiO2@Au gold-coated magnetic nanocomposites (GMNPs) and multiplex ligation-dependent probe amplification (MLPA) has been developed. Core-shell Fe3O4@SiO2@Au GMNPs were synthesized by self-assembly, seed-mediated growth, and chemical reduction. By modifying the GMNP surface with streptavidin, we obtained a bead chip that can capture biotinylated probes. We then designed MLPA probes tagged with biotin or Cy3, together with target DNA, on the basis of the human APP gene sequence. The products of the thermostable DNA ligase ligation reactions and PCR amplifications were incubated with SA-GMNPs. After washing, magnetic separation, and spotting, fluorescence scanning showed that the method can be used for relative quantitative analysis of the target DNA in the concentration range of 03004~0.5 µM.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Whiteaker, Jeffrey R.; Halusa, Goran; Hoofnagle, Andrew N.
2016-02-12
The Clinical Proteomic Tumor Analysis Consortium (CPTAC) of the National Cancer Institute (NCI) has launched an Assay Portal (http://assays.cancer.gov) to serve as an open-source repository of well-characterized targeted proteomic assays. The portal is designed to curate and disseminate highly characterized, targeted mass spectrometry (MS)-based assays by providing detailed assay performance characterization data, standard operating procedures, and access to reagents. Assay content is accessed via the portal through queries to find assays targeting proteins associated with specific cellular pathways, protein complexes, or specific chromosomal regions. The positions of the peptide analytes for which there are available assays are mapped relative to other features of interest in the protein, such as sequence domains, isoforms, single nucleotide polymorphisms, and post-translational modifications. The overarching goals are to enable robust quantification of all human proteins and to standardize the quantification of targeted MS-based assays to ultimately enable harmonization of results over time and across laboratories.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Proudnikov, D.; Kirillov, E.; Chumakov, K.
2000-01-01
This paper describes use of a new technology of hybridization with a micro-array of immobilized oligonucleotides for detection and quantification of neurovirulent mutants in Oral Poliovirus Vaccine (OPV). We used a micro-array consisting of three-dimensional gel elements containing all possible hexamers (a total of 4096 probes). Hybridization of fluorescently labelled viral cDNA samples with such microchips resulted in a pattern of spots that was registered and quantified by a computer-linked CCD camera, so that the sequence of the original cDNA could be deduced. The method could reliably identify single point mutations, since each of them affected the fluorescence intensity of 12 micro-array elements. Micro-array hybridization of DNA mixtures with varying contents of point mutants demonstrated that the method can detect as little as 10% of revertants in a population of vaccine virus. This new technology should be useful for quality control of live viral vaccines, as well as for other applications requiring identification and quantification of point mutations.
Properties of targeted preamplification in DNA and cDNA quantification.
Andersson, Daniel; Akrap, Nina; Svec, David; Godfrey, Tony E; Kubista, Mikael; Landberg, Göran; Ståhlberg, Anders
2015-01-01
Quantification of small molecule numbers often requires preamplification to generate enough copies for accurate downstream enumerations. Here, we studied experimental parameters in targeted preamplification and their effects on downstream quantitative real-time PCR (qPCR). To evaluate different strategies, we monitored the preamplification reaction in real-time using SYBR Green detection chemistry followed by melting curve analysis. Furthermore, individual targets were evaluated by qPCR. The preamplification reaction performed best when a large number of primer pairs was included in the primer pool. In addition, preamplification efficiency, reproducibility and specificity were found to depend on the number of template molecules present, primer concentration, annealing time and annealing temperature. The amount of nonspecific PCR products could also be reduced about 1000-fold using bovine serum albumin, glycerol and formamide in the preamplification. On the basis of our findings, we provide recommendations how to perform robust and highly accurate targeted preamplification in combination with qPCR or next-generation sequencing.
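The preamplification-plus-qPCR workflow above is commonly evaluated through a delta-delta-Cq argument: with ideal efficiency, n cycles of preamplification should lower the downstream Cq by exactly n, and any deviation measures target-specific bias. A hedged sketch follows; the function name, cycle counts, and Cq values are invented for illustration, not taken from the study.

```python
import math

def preamp_bias(cq_without, cq_with, preamp_cycles, efficiency=1.0):
    """Estimate targeted-preamplification bias from paired Cq values.

    With perfectly efficient PCR, `preamp_cycles` cycles of
    preamplification shift the downstream qPCR Cq by
    preamp_cycles * log2(1 + efficiency) cycles.  The deviation
    (delta-delta-Cq) measures target-specific amplification bias.
    """
    expected_shift = preamp_cycles * math.log2(1 + efficiency)
    observed_shift = cq_without - cq_with
    ddcq = observed_shift - expected_shift
    fold_bias = 2.0 ** ddcq   # >1: over-amplified, <1: under-amplified
    return ddcq, fold_bias

ddcq, bias = preamp_bias(cq_without=28.0, cq_with=14.2, preamp_cycles=14)
print(round(ddcq, 2), round(bias, 3))  # -0.2 cycles, i.e. ~0.87-fold
```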
Whiteaker, Jeffrey R; Halusa, Goran N; Hoofnagle, Andrew N; Sharma, Vagisha; MacLean, Brendan; Yan, Ping; Wrobel, John A; Kennedy, Jacob; Mani, D R; Zimmerman, Lisa J; Meyer, Matthew R; Mesri, Mehdi; Boja, Emily; Carr, Steven A; Chan, Daniel W; Chen, Xian; Chen, Jing; Davies, Sherri R; Ellis, Matthew J C; Fenyö, David; Hiltke, Tara; Ketchum, Karen A; Kinsinger, Chris; Kuhn, Eric; Liebler, Daniel C; Liu, Tao; Loss, Michael; MacCoss, Michael J; Qian, Wei-Jun; Rivers, Robert; Rodland, Karin D; Ruggles, Kelly V; Scott, Mitchell G; Smith, Richard D; Thomas, Stefani; Townsend, R Reid; Whiteley, Gordon; Wu, Chaochao; Zhang, Hui; Zhang, Zhen; Rodriguez, Henry; Paulovich, Amanda G
2016-01-01
The Clinical Proteomic Tumor Analysis Consortium (CPTAC) of the National Cancer Institute (NCI) has launched an Assay Portal (http://assays.cancer.gov) to serve as an open-source repository of well-characterized targeted proteomic assays. The portal is designed to curate and disseminate highly characterized, targeted mass spectrometry (MS)-based assays by providing detailed assay performance characterization data, standard operating procedures, and access to reagents. Assay content is accessed via the portal through queries to find assays targeting proteins associated with specific cellular pathways, protein complexes, or specific chromosomal regions. The positions of the peptide analytes for which there are available assays are mapped relative to other features of interest in the protein, such as sequence domains, isoforms, single nucleotide polymorphisms, and posttranslational modifications. The overarching goals are to enable robust quantification of all human proteins and to standardize the quantification of targeted MS-based assays to ultimately enable harmonization of results over time and across laboratories.
Koh, Hong; Kim, Seung; Kim, Myung-Joon; Kim, Hyun Gi; Shin, Hyun Joo; Lee, Mi-Jung
2015-09-07
To evaluate the possibility of treatment effect monitoring using hepatic fat quantification magnetic resonance (MR) in pediatric nonalcoholic steatohepatitis (NASH). We retrospectively reviewed the medical records of patients who received educational recommendations and vitamin E for NASH and underwent hepatic fat quantification MR from 2011 to 2013. Hepatic fat fraction (%) was measured using dual- and triple-echo gradient-recalled-echo sequences at 3T. The compliant and non-compliant groups were compared clinically, biochemically, and radiologically. Twenty-seven patients (M:F = 24:3; mean age: 12 ± 2.3 years) were included (compliant group = 22, non-compliant = 5). None of the baseline findings differed between the two groups, except for triglyceride level (compliant vs non-compliant, 167.7 mg/dL vs 74.2 mg/dL, P = 0.001). In the compliant group, high-density lipoprotein increased and all other parameters decreased after 1-year follow-up; in the non-compliant group, the changes were variable. Dual-echo fat fraction (-19.2% vs 4.6%, P < 0.001), triple-echo fat fraction (-13.4% vs 3.5%, P < 0.001), alanine aminotransferase (-110.7 IU/L vs -10.6 IU/L, P = 0.047), total cholesterol (-18.1 mg/dL vs 3.8 mg/dL, P = 0.016), and triglyceride levels (-61.3 mg/dL vs 11.2 mg/dL, P = 0.013) decreased significantly only in the compliant group. The change in body mass index and dual-echo fat fraction showed a positive correlation (ρ = 0.418, P = 0.030). Hepatic fat quantification MR can be a non-invasive, quantitative, and useful tool for monitoring treatment effects in pediatric NASH.
Rieger, Benedikt; Zimmer, Fabian; Zapp, Jascha; Weingärtner, Sebastian; Schad, Lothar R
2017-11-01
To develop an implementation of the magnetic resonance fingerprinting (MRF) paradigm for quantitative imaging using echo-planar imaging (EPI) for simultaneous assessment of T1 and T2*. The proposed MRF method (MRF-EPI) is based on the acquisition of 160 gradient-spoiled EPI images with rapid, parallel-imaging accelerated, Cartesian readout and a measurement time of 10 s per slice. Contrast variation is induced using an initial inversion pulse and by varying the flip angles, echo times, and repetition times throughout the sequence. Joint quantification of T1 and T2* is performed using dictionary matching with integrated B1+ correction. The quantification accuracy of the method was validated in phantom scans and in vivo in 6 healthy subjects. Joint T1 and T2* parameter maps acquired with MRF-EPI in phantoms are in good agreement with reference measurements, showing deviations under 5% and 4% for T1 and T2*, respectively. In vivo baseline images were visually free of artifacts. In vivo relaxation times are in good agreement with gold-standard techniques (deviation T1: 4 ± 2%, T2*: 4 ± 5%). The visual quality was comparable to the in vivo gold standard, despite substantially shortened scan times. The proposed MRF-EPI method provides fast and accurate T1 and T2* quantification. This approach offers a rapid supplement to the non-Cartesian MRF portfolio, with potentially increased usability and robustness. Magn Reson Med 78:1724-1733, 2017. © 2016 International Society for Magnetic Resonance in Medicine.
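The dictionary matching step named in the abstract above can be illustrated with a toy model: simulate one signal evolution per (T1, T2*) pair and match a measured signal to the entry with the highest normalized inner product. The mono-exponential toy signal below stands in for a full Bloch simulation of the MRF-EPI sequence; all grids and timings are invented.

```python
import numpy as np

def build_dictionary(t1s, t2stars, TI, TEs):
    """Toy MRF dictionary: one normalized signal evolution per (T1, T2*).

    The signal model (inversion recovery scaled by T2* decay) is a
    deliberate simplification of a real MRF-EPI Bloch simulation.
    """
    entries, params = [], []
    for t1 in t1s:
        for t2s in t2stars:
            sig = (1 - 2 * np.exp(-(TI + np.arange(len(TEs)) * 10) / t1)) * np.exp(-np.asarray(TEs) / t2s)
            entries.append(sig / np.linalg.norm(sig))
            params.append((t1, t2s))
    return np.array(entries), params

def match(signal, dictionary, params):
    """Return the (T1, T2*) pair with maximal normalized inner product."""
    signal = signal / np.linalg.norm(signal)
    idx = int(np.argmax(np.abs(dictionary @ signal)))
    return params[idx]

t1s = [800, 1000, 1200, 1400]        # ms, toy grid
t2stars = [30, 50, 70]               # ms, toy grid
TEs = np.linspace(15, 60, 160)       # 160 images, as in MRF-EPI
D, P = build_dictionary(t1s, t2stars, TI=20, TEs=TEs)

truth = (1200, 50)
sig = D[P.index(truth)]              # noiseless "measured" voxel signal
print(match(sig, D, P))  # → (1200, 50)
```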
Study of mitochondria D-loop gene to detect the heterogeneity of gemak in Turnicidae family
NASA Astrophysics Data System (ADS)
Setiati, N.; Partaya
2018-03-01
As a part of biodiversity, birds of the family Turnicidae should be preserved from extinction and from decline in their heterogeneity. One way to provide a strategic basis for germplasm conservation is through studies of genetic heterogeneity. The aim of this research is to analyze the D-loop gene of the mitochondrial DNA of gemak (buttonquail) of the family Turnicidae at the molecular level, so that the genetic heterogeneity of these birds can be inferred from the D-loop sequence. Of the two types of gemak studied, gemak loreng (Turnix sylvatica) is still easy to collect, since it can be found in ricefield areas after harvest, whereas gemak tegalan (Turnix suscitator) is becoming difficult to find. Judged against the standard for DNA quantification, most of the gemak blood samples in this research were classified as pure (ratios ranging from 1.63 to 1.90) and were suitable for PCR analysis. The sequencing analysis has not yet determined the complete nucleotide sequence; however, it indicates polymorphism in the bases making up the D-loop gene. The D-loop gene can thus identify genetic heterogeneity in gemak of the family Turnicidae, but further sequencing analysis with the PCR-RFLP technique is necessary; the complete nucleotide sequence is easier to obtain and examine after digestion with a restriction enzyme.
Zhang, Huimin; He, Hongkui; Yu, Xiujuan; Xu, Zhaohui; Zhang, Zhizhou
2016-11-01
It remains an unsolved problem to quantify a natural microbial community by rapidly and conveniently measuring multiple species with functional significance. Most widely used high throughput next-generation sequencing methods can only generate information mainly for genus-level taxonomic identification and quantification, and detection of multiple species in a complex microbial community is still heavily dependent on approaches based on near full-length ribosome RNA gene or genome sequence information. In this study, we used near full-length rRNA gene library sequencing plus Primer-Blast to design species-specific primers based on whole microbial genome sequences. The primers were intended to be specific at the species level within relevant microbial communities, i.e., a defined genomics background. The primers were tested with samples collected from the Daqu (also called fermentation starters) and pit mud of a traditional Chinese liquor production plant. Sixteen pairs of primers were found to be suitable for identification of individual species. Among them, seven pairs were chosen to measure the abundance of microbial species through quantitative PCR. The combination of near full-length ribosome RNA gene library sequencing and Primer-Blast may represent a broadly useful protocol to quantify multiple species in complex microbial population samples with species-specific primers.
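The species-specificity screen described above can be caricatured in a few lines: a candidate primer is kept only if it matches the target genome and none of the off-target genomes. This exact-substring criterion is a deliberate simplification of Primer-Blast, which also scores mismatches and 3'-end stability; all sequences and names below are invented.

```python
def is_species_specific(primer, target, off_targets):
    """Toy specificity filter: the primer must occur in the target
    sequence and in none of the off-target sequences (exact match only;
    real Primer-Blast tolerates and scores mismatches)."""
    return primer in target and all(primer not in g for g in off_targets)

# Invented sequences: one "target species" fragment and two relatives
# differing by a few bases, mimicking a defined genomic background.
target = "ATGGCGTACCTTGAGCACTTCGGA"
others = ["ATGGCGTACCTAGAGCACTACGGA", "TTGGCGAACCTTGAGCTCTTCGGA"]

print(is_species_specific("CCTTGAGCACTTC", target, others))  # True
print(is_species_specific("ATGGCG", target, others))         # False
```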
Windowed R-PDLF recoupling: a flexible and reliable tool to characterize molecular dynamics.
Gansmüller, Axel; Simorre, Jean-Pierre; Hediger, Sabine
2013-09-01
This work focuses on the improvement of the R-PDLF heteronuclear recoupling scheme, a method that allows quantification of molecular dynamics up to the microsecond timescale in heterogeneous materials. We show how the stability of the sequence towards rf-imperfections, one of the main sources of error of this technique, can be improved by the insertion of windows without irradiation into the basic elements of the symmetry-based recoupling sequence. The impact of this modification on the overall performance of the sequence in terms of scaling factor and homonuclear decoupling efficiency is evaluated. This study indicates the experimental conditions for which precise and reliable measurement of dipolar couplings can be obtained using the popular R18(1)(7) recoupling sequence, as well as alternative symmetry-based R sequences suited for fast MAS conditions. An analytical expression for the recoupled dipolar modulation has been derived that applies to a whole class of sequences with similar recoupling properties as R18(1)(7). This analytical expression provides an efficient and precise way to extract dipolar couplings from the experimental dipolar modulation curves. We hereby provide helpful tools and information for tailoring R-PDLF recoupling schemes to specific sample properties and hardware capabilities. This approach is particularly well suited for the study of materials with strong and heterogeneous molecular dynamics where a precise measurement of dipolar couplings is crucial. Copyright © 2013 Elsevier Inc. All rights reserved.
van der Klift, Heleen M; Tops, Carli M J; Bik, Elsa C; Boogaard, Merel W; Borgstein, Anne-Marijke; Hansson, Kerstin B M; Ausems, Margreet G E M; Gomez Garcia, Encarna; Green, Andrew; Hes, Frederik J; Izatt, Louise; van Hest, Liselotte P; Alonso, Angel M; Vriends, Annette H J T; Wagner, Anja; van Zelst-Stams, Wendy A G; Vasen, Hans F A; Morreau, Hans; Devilee, Peter; Wijnen, Juul T
2010-05-01
Heterozygous mutations in PMS2 are involved in Lynch syndrome, whereas biallelic mutations are found in Constitutional mismatch repair-deficiency syndrome patients. Mutation detection is complicated by the occurrence of sequence exchange events between the duplicated regions of PMS2 and PMS2CL. We investigated the frequency of such events with a nonspecific polymerase chain reaction (PCR) strategy, co-amplifying both PMS2 and PMS2CL sequences. This allowed us to score ratios between gene and pseudogene-specific nucleotides at 29 PSV sites from exon 11 to the end of the gene. We found sequence transfer at all investigated PSVs from intron 12 to the 3' end of the gene in 4 to 52% of DNA samples. Overall, sequence exchange between PMS2 and PMS2CL was observed in 69% (83/120) of individuals. We demonstrate that mutation scanning with PMS2-specific PCR primers and MLPA probes, designed on PSVs, in the 3' duplicated region is unreliable, and present an RNA-based mutation detection strategy to improve reliability. Using this strategy, we found 19 different putative pathogenic PMS2 mutations. Four of these (21%) are lying in the region with frequent sequence transfer and are missed or called incorrectly as homozygous with several PSV-based mutation detection methods. (c) 2010 Wiley-Liss, Inc.
Increased complexity of circRNA expression during species evolution.
Dong, Rui; Ma, Xu-Kai; Chen, Ling-Ling; Yang, Li
2017-08-03
Circular RNAs (circRNAs) are broadly identified from precursor mRNA (pre-mRNA) back-splicing across various species. Recent studies have suggested a cell-/tissue-specific manner of circRNA expression. However, the distinct expression pattern of circRNAs among species and its underlying mechanism remain to be explored. Here, we systematically compared circRNA expression from human and mouse, and found that only a small portion of human circRNAs could be detected in parallel mouse samples. The conserved circRNA expression between human and mouse is correlated with the existence of orientation-opposite complementary sequences in introns that flank back-spliced exons in both species, but not with the circRNA sequences themselves. Quantification of the RNA pairing capacity of orientation-opposite complementary sequences across circRNA-flanking introns by a Complementary Sequence Index (CSI) identifies that, among all types of complementary sequences, SINEs, especially Alu elements in human, contribute the most to circRNA formation, and that their diverse distribution across species leads to the increased complexity of circRNA expression during species evolution. Together, our integrated and comparative reference catalog of circRNAs in different species reveals a species-specific pattern of circRNA expression and suggests a previously under-appreciated impact of fast-evolved SINEs on the regulation of (circRNA) gene expression.
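The abstract does not give the formula for the Complementary Sequence Index, but the underlying idea, scoring reverse-complementary pairing capacity between the introns flanking a back-spliced exon, can be sketched as shared k-mer counting. The scoring below is an illustrative stand-in for the published metric, and the "Alu-like" sequences are invented.

```python
def revcomp(seq):
    """Reverse complement of a DNA string."""
    return seq.translate(str.maketrans("ACGT", "TGCA"))[::-1]

def kmers(seq, k):
    return {seq[i:i + k] for i in range(len(seq) - k + 1)}

def pairing_index(upstream_intron, downstream_intron, k=8):
    """Toy CSI: fraction of k-mers shared between the upstream intron
    and the reverse complement of the downstream intron.  High values
    indicate inverted repeats (e.g. inverted Alu pairs) that can base-
    pair across the circRNA-forming exon."""
    up = kmers(upstream_intron, k)
    down_rc = kmers(revcomp(downstream_intron), k)
    denom = min(len(up), len(down_rc)) or 1
    return len(up & down_rc) / denom

alu_like = "GGCCGGGCGCGGTGGCTCACGCCTGTAATCCCAGCA"   # invented fragment
up = "ATTTTTGGA" + alu_like + "CCATGGTA"            # sense-strand copy
down = "TTGACCAT" + revcomp(alu_like) + "GGAAATTT"  # inverted copy
print(pairing_index(up, down) > 0.5)  # inverted repeats pair strongly
```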
DOE Office of Scientific and Technical Information (OSTI.GOV)
Breitkreutz, D.; Fallone, B. G.; Yahya, A.
2014-06-15
Purpose: To improve proton magnetic resonance spectroscopy (MRS) transverse relaxation (T2) determination and quantification of lipid methylene chain (1.3 ppm) protons by rewinding their J-coupling evolution. Methods: MRS experiments were performed on four lipid phantoms, namely, almond, corn, sunflower and oleic acid, using a 3 T Philips MRI scanner with a transmit/receive birdcage head coil. Two PRESS (Point RESolved Spectroscopy) pulse sequences were used. The first PRESS sequence employed standard-bandwidth (BW) (∼550 Hz) RF (radiofrequency) refocussing pulses, while the second used refocussing pulses of narrow BW (∼50 Hz) designed to rewind J-coupling evolution of the methylene protons in the voxel of interest. Signal was acquired with each sequence from a 5×5×5 mm³ voxel, with a repetition time (TR) of 3000 ms, and with echo times (TE) of 100 to 200 ms in steps of 20 ms. 2048 sample points were measured with a 2000 Hz sampling bandwidth. Additionally, 30 mm outer volume suppression slabs were used to suppress signal outside the voxel of interest. The frequency of the RF pulses was set to that of the methylene resonance. Methylene peak areas were calculated and fitted in MATLAB to a monoexponentially decaying function of the form M0exp(-TE/T2), where M0 is the extrapolated area at TE = 0 ms and yields a measure of concentration. Results: The determined values of M0 and T2 increased for all fatty acids when using the PRESS sequence with narrow BW refocussing pulses. M0 and T2 values increased by an average amount (over all the phantoms) of 31% and 14%, respectively. Conclusion: This investigation has demonstrated that J-coupling interactions of lipid methylene protons cause non-negligible signal losses which, if not accounted for, result in underestimations of their levels and T2 values when performing MRS measurements.
Funded by the Natural Sciences and Engineering Research Council of Canada and the Canadian Breast Cancer Foundation - Prairies/NWT.
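The mono-exponential fit described in the record above, M(TE) = M0·exp(-TE/T2), can be reproduced outside MATLAB in a few lines. The TE values follow the protocol in the abstract (100 to 200 ms in 20 ms steps); the M0 and T2 values are made up for illustration.

```python
import numpy as np
from scipy.optimize import curve_fit

def decay(te, m0, t2):
    """Mono-exponential decay M0 * exp(-TE / T2) of the methylene peak area."""
    return m0 * np.exp(-te / t2)

te = np.arange(100.0, 201.0, 20.0)        # echo times, ms (100..200)
true_m0, true_t2 = 1000.0, 80.0           # illustrative values only
areas = decay(te, true_m0, true_t2)       # noiseless peak areas

# Fit from a deliberately wrong starting guess; the noiseless fit
# recovers the simulated parameters.
(m0_fit, t2_fit), _ = curve_fit(decay, te, areas, p0=(500.0, 50.0))
print(f"{m0_fit:.0f} {t2_fit:.0f}")  # → 1000 80
```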
Kumar, Dushyant; Hariharan, Hari; Faizy, Tobias D; Borchert, Patrick; Siemonsen, Susanne; Fiehler, Jens; Reddy, Ravinder; Sedlacik, Jan
2018-05-12
We present a computationally feasible and iterative multi-voxel spatially regularized algorithm for myelin water fraction (MWF) reconstruction. This method utilizes 3D spatial correlations present in anatomical/pathological tissues and the underlying B1+ (flip angle) inhomogeneity to enhance the noise robustness of the reconstruction while intrinsically accounting for stimulated echo contributions using T2-distribution data alone. Simulated data and in vivo data acquired using 3D non-selective multi-echo spin echo (3DNS-MESE) were used to compare the reconstruction quality of the proposed approach against those of the popular algorithm (the method by Prasloski et al.) and our previously proposed 2D multi-slice spatial regularization approach. We also investigated whether the inter-sequence correlations and agreements improved as a result of the proposed approach. MWF quantifications from two sequences, 3DNS-MESE vs 3DNS-gradient and spin echo (3DNS-GRASE), were compared for both reconstruction approaches to assess correlations and agreements between inter-sequence MWF-value pairs. MWF values from whole-brain data of six volunteers and two multiple sclerosis patients are reported as well. In comparison with competing approaches such as Prasloski's method or our previously proposed 2D multi-slice spatial regularization method, the proposed method showed better agreement with simulated truths in regression and Bland-Altman analyses. For 3DNS-MESE data, MWF maps reconstructed using the proposed algorithm provided better depictions of white matter structures in subcortical areas adjoining gray matter, which agreed more closely with corresponding contrasts on T2-weighted images than MWF maps reconstructed with the method by Prasloski et al. We also achieved a higher level of correlations and agreements between inter-sequence (3DNS-MESE vs 3DNS-GRASE) MWF-value pairs.
The proposed algorithm provides more noise-robust fits to T2-decay data and improves MWF-quantifications in white matter structures especially in the sub-cortical white matter and major white matter tract regions. Copyright © 2018 Elsevier Inc. All rights reserved.
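A stripped-down, single-voxel version of the T2-distribution analysis behind MWF can be sketched with plain non-negative least squares (NNLS). This omits the spatial regularization, stimulated-echo modeling, and B1 correction that are the paper's actual contributions; the acquisition parameters and pool fractions below are invented.

```python
import numpy as np
from scipy.optimize import nnls

# Fit a multi-echo spin-echo decay as a non-negative sum of
# exponentials, then report the fraction of the T2 distribution with
# short T2 (conventionally < 40 ms) as the myelin water fraction.
echoes = np.arange(10.0, 330.0, 10.0)                    # 32 echoes, ms
t2_grid = np.logspace(np.log10(10), np.log10(2000), 60)  # T2 basis, ms
A = np.exp(-echoes[:, None] / t2_grid[None, :])          # design matrix

# Synthetic voxel: 15% myelin water (T2 ~ 20 ms) plus 85%
# intra/extra-cellular water (T2 ~ 80 ms), noiseless.
signal = 0.15 * np.exp(-echoes / 20.0) + 0.85 * np.exp(-echoes / 80.0)

weights, _ = nnls(A, signal)                 # non-negative T2 spectrum
mwf = weights[t2_grid < 40].sum() / weights.sum()
print(round(mwf, 2))  # close to the simulated 0.15
```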
Bushell, Claire A.; Grant, Paul R.; Cowen, Simon; Gutierrez-Aguirre, Ion; O'Sullivan, Denise M.; Žel, Jana; Milavec, Mojca; Foy, Carole A.; Nastouli, Eleni; Garson, Jeremy A.; Huggett, Jim F.
2015-01-01
Digital PCR (dPCR) is being increasingly used for the quantification of sequence variations, including single nucleotide polymorphisms (SNPs), due to its high accuracy and precision in comparison with techniques such as quantitative PCR (qPCR) and melt curve analysis. To develop and evaluate dPCR for SNP detection using DNA, RNA, and clinical samples, an influenza virus model of resistance to oseltamivir (Tamiflu) was used. First, this study was able to recognize and reduce off-target amplification in dPCR quantification, thereby enabling technical sensitivities down to 0.1% SNP abundance at a range of template concentrations, a 50-fold improvement on the qPCR assay used routinely in the clinic. Second, a method was developed for determining the false-positive rate (background) signal. Finally, comparison of dPCR with qPCR results on clinical samples demonstrated the potential impact dPCR could have on clinical research and patient management by earlier (trace) detection of rare drug-resistant sequence variants. Ultimately this could reduce the quantity of ineffective drugs taken and facilitate early switching to alternative medication when available. In the short term such methods could advance our understanding of microbial dynamics and therapeutic responses in a range of infectious diseases such as HIV, viral hepatitis, and tuberculosis. Furthermore, the findings presented here are directly relevant to other diagnostic areas, such as the detection of rare SNPs in malignancy, monitoring of graft rejection, and fetal screening. PMID:26659206
Whale, Alexandra S; Bushell, Claire A; Grant, Paul R; Cowen, Simon; Gutierrez-Aguirre, Ion; O'Sullivan, Denise M; Žel, Jana; Milavec, Mojca; Foy, Carole A; Nastouli, Eleni; Garson, Jeremy A; Huggett, Jim F
2016-02-01
Digital PCR (dPCR) is being increasingly used for the quantification of sequence variations, including single nucleotide polymorphisms (SNPs), due to its high accuracy and precision in comparison with techniques such as quantitative PCR (qPCR) and melt curve analysis. To develop and evaluate dPCR for SNP detection using DNA, RNA, and clinical samples, an influenza virus model of resistance to oseltamivir (Tamiflu) was used. First, this study was able to recognize and reduce off-target amplification in dPCR quantification, thereby enabling technical sensitivities down to 0.1% SNP abundance at a range of template concentrations, a 50-fold improvement on the qPCR assay used routinely in the clinic. Second, a method was developed for determining the false-positive rate (background) signal. Finally, comparison of dPCR with qPCR results on clinical samples demonstrated the potential impact dPCR could have on clinical research and patient management by earlier (trace) detection of rare drug-resistant sequence variants. Ultimately this could reduce the quantity of ineffective drugs taken and facilitate early switching to alternative medication when available. In the short term such methods could advance our understanding of microbial dynamics and therapeutic responses in a range of infectious diseases such as HIV, viral hepatitis, and tuberculosis. Furthermore, the findings presented here are directly relevant to other diagnostic areas, such as the detection of rare SNPs in malignancy, monitoring of graft rejection, and fetal screening. Copyright © 2016 Whale et al.
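Underlying any dPCR quantification, including the rare-variant measurements above, is the Poisson correction from the fraction of positive partitions to the mean number of template copies per partition. A minimal sketch follows; the partition count and volume are illustrative, not from the study.

```python
import math

def dpcr_copies(positive, total, partition_volume_nl=0.85):
    """Poisson quantification for digital PCR.

    positive/total: count of positive partitions out of all partitions.
    partition_volume_nl: droplet/partition volume in nanolitres
                         (an assumed, platform-dependent value).
    Returns (mean copies per partition, copies per microlitre).
    """
    p = positive / total
    lam = -math.log(1.0 - p)            # mean copies per partition
    per_ul = lam / (partition_volume_nl * 1e-3)
    return lam, per_ul

lam, per_ul = dpcr_copies(positive=5000, total=20000)
print(round(lam, 4))  # 0.2877 copies per partition
```

The logarithmic correction matters because a positive partition may hold more than one copy; simply counting positives would underestimate the concentration.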
Ning, Jia; Sun, Yongliang; Xie, Sheng; Zhang, Bida; Huang, Feng; Koken, Peter; Smink, Jouke; Yuan, Chun; Chen, Huijun
2018-05-01
To propose a simultaneous acquisition sequence for improved hepatic pharmacokinetics quantification accuracy (SAHA) in liver dynamic contrast-enhanced MRI. The proposed SAHA simultaneously acquired high-temporal-resolution 2D images for vascular input function extraction using Cartesian sampling and 3D large-coverage, high-spatial-resolution liver dynamic contrast-enhanced images using a golden-angle stack-of-stars acquisition in an interleaved way. Simulations were conducted to investigate the accuracy of SAHA in pharmacokinetic analysis. A healthy volunteer and three patients with cirrhosis or hepatocellular carcinoma were included in the study to investigate the feasibility of SAHA in vivo. Simulation studies showed that SAHA provides results closer to the true values and lower root mean square errors of the estimated pharmacokinetic parameters in all of the tested scenarios. The in vivo scans provided fair image quality for both the 2D arterial and portal venous input-function images and the 3D whole-liver images. The in vivo fitting results showed that the perfusion parameters of healthy liver were significantly different from those of cirrhotic liver and HCC. The proposed SAHA provides improved accuracy in pharmacokinetic modeling and is feasible in human liver dynamic contrast-enhanced MRI, suggesting that SAHA is a potential tool for liver dynamic contrast-enhanced MRI. Magn Reson Med 79:2629-2641, 2018. © 2017 International Society for Magnetic Resonance in Medicine.
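Liver DCE-MRI pharmacokinetic modeling of the kind quantified above typically uses a dual-input model, since the liver receives both arterial and portal venous blood. The discrete-convolution sketch below is a generic single-compartment illustration; the paper's exact model and all parameter values here are assumptions, not taken from the abstract.

```python
import numpy as np

def liver_tissue_curve(t, aif, pvif, ka, kp, k2):
    """Dual-input single-compartment model in discrete form:
    C_t(t) = (ka*AIF + kp*PVIF) convolved with exp(-k2*t).

    ka, kp: arterial and portal venous inflow rate constants (1/s).
    k2: outflow rate constant (1/s).  All values illustrative.
    """
    dt = t[1] - t[0]
    inflow = ka * aif + kp * pvif
    irf = np.exp(-k2 * t)                       # impulse response
    return dt * np.convolve(inflow, irf)[: len(t)]

t = np.arange(0, 120, 1.0)                      # time, s
aif = np.exp(-((t - 15) / 5.0) ** 2)            # toy arterial bolus
pvif = np.exp(-((t - 30) / 10.0) ** 2)          # toy, delayed portal input
ct = liver_tissue_curve(t, aif, pvif, ka=0.05, kp=0.15, k2=0.02)
print(ct.shape == t.shape)  # True
```

In practice such a forward model is fitted to the measured tissue curves, which is why SAHA's high-temporal-resolution input-function images matter: errors in the AIF/PVIF propagate directly into the estimated rate constants.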
Ye, Hongping; Hill, John; Kauffman, John; Gryniewicz, Connie; Han, Xianlin
2008-08-15
Isobaric tags for relative and absolute quantification (iTRAQ) reagent coupled with matrix-assisted laser desorption/ionization tandem time-of-flight (MALDI-TOF/TOF) mass spectrometric analysis has been evaluated as both a qualitative and quantitative method for the detection of modifications to active pharmaceutical ingredients derived from recombinant DNA technologies and as a method to detect counterfeit drug products. Five types of insulin (human, bovine, porcine, Lispro, and Lantus) were used as model products in the study because of their minor variations in amino acid sequence. Several experiments were conducted in which each insulin variant was separately digested with Glu-C, and the digestate was labeled with one of four different iTRAQ reagents. All digestates were then combined for desalting and MALDI-TOF/TOF mass spectrometric analysis. When the digestion procedure was optimized, the insulin sequence coverage was 100%. Five different types of insulin were readily differentiated, including human insulin (P28K29) and Lispro insulin (K28P29), which differ only by the interchange of two contiguous residues. Moreover, quantitative analyses show that the results obtained from the iTRAQ method agree well with those determined by other conventional methods. Collectively, the iTRAQ method can be used as a qualitative and quantitative technique for the detection of protein modification and counterfeiting.
Three-dimensional transgenic cell model to quantify genotoxic effects of space environment
NASA Astrophysics Data System (ADS)
Gonda, S. R.; Wu, H.; Pingerelli, P. L.; Glickman, B. W.
In this paper we describe a three-dimensional, multicellular tissue-equivalent model, produced in NASA-designed rotating-wall bioreactors using mammalian cells engineered for genomic containment of multiple copies of defined target genes for genotoxic assessment. Rat 2λ fibroblasts, genetically engineered to contain high-density target genes for mutagenesis (Stratagene, Inc., Austin, TX), were cocultured with human epithelial cells on Cytodex beads in the High Aspect Ratio Bioreactor (Synthecon, Inc., Houston, TX). Multi-bead aggregates were formed by day 5, following the complete covering of the beads by fibroblasts. Cellular retraction occurred 8-14 days after coculture initiation, culminating in spheroids retaining few or no beads. Analysis of the resulting tissue assemblies revealed that the spheroids were multicellular, that fibroblasts synthesized collagen, and that cell viability was retained for the 30-day test period after removal from the bioreactor. Quantification of mutation at the LacI gene in Rat 2λ fibroblasts in spheroids exposed to 0-2 Gy of neon using the Big Blue color assay (Stratagene, Inc.) revealed a linear dose-response for mutation induction. Limited sequencing analysis of mutant clones from 0.25 or 1 Gy exposures revealed a higher frequency of deletions and multiple-base sequence changes with increasing dose. These results suggest that the three-dimensional, multicellular tissue assembly model produced in NASA bioreactors is applicable to a wide variety of studies involving the quantification and identification of genotoxicity, including measurement of the inherent damage incurred in space.
Lee, Ra Mi; Ryu, Rae Hyung; Jeong, Seong Won; Oh, Soo Jin; Huang, Hue; Han, Jin Soo; Lee, Chi Ho; Lee, C. Justin; Jan, Lily Yeh
2011-01-01
To clone the first anion channel from Xenopus laevis (X. laevis), we isolated a calcium-activated chloride channel (CLCA)-like membrane protein 6 gene (CMP6) in X. laevis. As a first step in gene isolation, an expressed sequence tag database was screened to find the partial cDNA fragment. A putative partial cDNA sequence was obtained by comparison with rat CLCAs identified in our laboratory. First-strand cDNA was synthesized by reverse transcription polymerase chain reaction (RT-PCR) using a specific primer designed for the target cDNA. By repeating 5' and 3' rapid amplification of cDNA ends (RACE), full-length cDNA was constructed from the cDNA pool. The full-length CMP6 cDNA completed via 5'- and 3'-RACE was 2,940 bp long and had an open reading frame (ORF) of 940 amino acids. The predicted 940-residue polypeptide has four major transmembrane domains and showed about 50% identity with rat brain CLCAs in our previously published data. Semi-quantitative analysis revealed that CMP6 was most abundantly expressed in small intestine, colon, and liver, whereas expression in all other tissues was undetectable. This result was corroborated by real-time PCR quantification of the target gene. Given that CLCA studies to date have focused on human or murine channels, this finding identifies a hypothetical protein as a putative ion channel, an X. laevis CLCA. PMID:21826170
Design and application of a data-independent precursor and product ion repository.
Thalassinos, Konstantinos; Vissers, Johannes P C; Tenzer, Stefan; Levin, Yishai; Thompson, J Will; Daniel, David; Mann, Darrin; DeLong, Mark R; Moseley, M Arthur; America, Antoine H; Ottens, Andrew K; Cavey, Greg S; Efstathiou, Georgios; Scrivens, James H; Langridge, James I; Geromanos, Scott J
2012-10-01
The functional design and application of a data-independent LC-MS precursor and product ion repository for protein identification, quantification, and validation is conceptually described. The ion repository was constructed from the sequence search results of a broad range of discovery experiments investigating various tissue types of two closely related mammalian species. The relatively high degree of similarity in protein complement, ion detection, and peptide and protein identification allows for the analysis of normalized precursor and product ion intensity values, as well as standardized retention times, creating a multidimensional, orthogonal space that is queryable both qualitatively and quantitatively. Peptide ion map selection for identification and quantification is primarily based on replication and limited variation. The information is stored in a relational database and is used to create peptide- and protein-specific fragment ion maps that can be queried in a targeted fashion against the raw or time-aligned ion detections. These queries can be conducted either individually or as groups, where the latter affords pathway and molecular machinery analysis of the protein complement. The presented results also suggest that peptide ionization and fragmentation efficiencies are highly conserved between experiments and practically independent of the analyzed biological sample when similar instrumentation is used. Moreover, the data illustrate only minor variation in ionization efficiency with amino acid sequence substitutions occurring between species. Finally, the data and the presented results illustrate how LC-MS performance metrics can be extracted and utilized to ensure optimal performance of the employed analytical workflows.
Comparison of alternative approaches for analysing multi-level RNA-seq data
Mohorianu, Irina; Bretman, Amanda; Smith, Damian T.; Fowler, Emily K.; Dalmay, Tamas
2017-01-01
RNA sequencing (RNA-seq) is widely used for RNA quantification in the environmental, biological and medical sciences. It enables the description of genome-wide patterns of expression and the identification of regulatory interactions and networks. The aim of RNA-seq data analyses is to achieve rigorous quantification of genes/transcripts to allow a reliable prediction of differential expression (DE), despite variation in levels of noise and inherent biases in sequencing data. This can be especially challenging for datasets in which gene expression differences are subtle, as in the behavioural transcriptomics test dataset from D. melanogaster that we used here. We investigated the power of existing approaches for quality checking mRNA-seq data and explored additional, quantitative quality checks. To accommodate nested, multi-level experimental designs, we incorporated sample layout into our analyses. We employed a subsampling without replacement-based normalization and an identification of DE that accounted for the hierarchy and amplitude of effect sizes within samples, then evaluated the resulting differential expression calls in comparison to existing approaches. In a final step to test for broader applicability, we applied our approaches to a published set of H. sapiens mRNA-seq samples. The dataset-tailored methods improved sample comparability and delivered a robust prediction of subtle gene expression changes. The proposed approaches have the potential to improve key steps in the analysis of RNA-seq data by incorporating the structure and characteristics of biological experiments. PMID:28792517
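As a hedged illustration of the normalization step described above (a generic sketch, not the authors' exact implementation), subsampling without replacement can be pictured as drawing a fixed number of reads from each sample's observed read pool:

```python
import numpy as np

def subsample_counts(counts, depth, rng):
    """Subsample a gene-count vector to a fixed total depth WITHOUT replacement,
    i.e. draw `depth` reads from the pool of observed reads (equivalent to
    multivariate hypergeometric sampling)."""
    counts = np.asarray(counts)
    reads = np.repeat(np.arange(counts.size), counts)  # one entry per observed read
    keep = rng.choice(reads.size, size=depth, replace=False)
    return np.bincount(reads[keep], minlength=counts.size)

def normalize_samples(count_matrix, rng):
    """Bring every sample (row) down to the depth of the shallowest sample."""
    depth = min(row.sum() for row in count_matrix)
    return np.vstack([subsample_counts(row, depth, rng) for row in count_matrix])

# Two toy samples with unequal depth (150 vs 60 reads); after normalization
# both rows sum to the same depth.
demo = normalize_samples(np.array([[100, 50, 0], [10, 20, 30]]),
                         np.random.default_rng(0))
```

Unlike scaling by library-size factors, this keeps the normalized values as genuine integer counts, at the cost of discarding reads from the deeper samples.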
Butler, Georgina S; Dean, Richard A; Morrison, Charlotte J; Overall, Christopher M
2010-01-01
Identification of protease substrates is essential to understand the functional consequences of normal proteolytic processing and dysregulated proteolysis in disease. Quantitative proteomics and mass spectrometry can be used to identify protease substrates in the cellular context. Here we describe the use of two protein labeling techniques, Isotope-Coded Affinity Tags (ICAT) and Isobaric Tags for Relative and Absolute Quantification (iTRAQ), which we have used successfully to identify novel matrix metalloproteinase (MMP) substrates in cell culture systems (1-4). ICAT and iTRAQ can label proteins and protease cleavage products of secreted proteins, protein domains shed from the cell membrane or pericellular matrix of protease-transfected cells that have accumulated in conditioned medium, or cell surface proteins in membrane preparations; isotopically distinct labels are used for control cells. Tryptic digestion and tandem mass spectrometry of the generated fragments enable sequencing of differentially labeled but otherwise identical pooled peptides. The isotopic tag, which is unique for each label, identifies the peptides originating from each sample, for instance, protease-transfected or control cells, and comparison of the peak areas enables relative quantification of the peptide in each sample. Thus, proteins present in altered amounts between protease-expressing and null cells are implicated as protease substrates and can be further validated as such.
Addressing Phase Errors in Fat-Water Imaging Using a Mixed Magnitude/Complex Fitting Method
Hernando, D.; Hines, C. D. G.; Yu, H.; Reeder, S.B.
2012-01-01
Accurate, noninvasive measurements of liver fat content are needed for the early diagnosis and quantitative staging of nonalcoholic fatty liver disease. Chemical shift-based fat quantification methods acquire images at multiple echo times using a multiecho spoiled gradient echo sequence, and provide fat fraction measurements through postprocessing. However, phase errors, such as those caused by eddy currents, can adversely affect fat quantification. These phase errors are typically most significant at the first echo of the echo train, and introduce bias in complex-based fat quantification techniques. These errors can be overcome using a magnitude-based technique (where the phase of all echoes is discarded), but at the cost of significantly degraded signal-to-noise ratio, particularly for certain choices of echo time combinations. In this work, we develop a reconstruction method that overcomes these phase errors without the signal-to-noise ratio penalty incurred by magnitude fitting. This method discards the phase of the first echo (which is often corrupted) while maintaining the phase of the remaining echoes (where phase is unaltered). We test the proposed method on 104 patient liver datasets (from 52 patients, each scanned twice), where the fat fraction measurements are compared to coregistered spectroscopy measurements. We demonstrate that mixed fitting is able to provide accurate fat fraction measurements with high signal-to-noise ratio and low bias over a wide choice of echo combinations. PMID:21713978
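The mixed magnitude/complex idea can be sketched with a toy single-peak fat model: fit the magnitude of the first echo (discarding its corrupted phase) and the complex values of the remaining echoes. This is an illustrative simplification, not the paper's method: the multipeak fat spectrum and exact parameterization are not reproduced, and the fat frequency below is an assumed single-peak value at 1.5 T.

```python
import numpy as np
from scipy.optimize import least_squares

FAT_FREQ_HZ = -434.0  # approximate water-fat chemical shift at 1.5 T (assumption)

def signal(params, te):
    """Single-peak water/fat signal with common R2* decay, field map, and phase."""
    w, f, r2star, fieldmap_hz, phi0 = params
    rho = w + f * np.exp(2j * np.pi * FAT_FREQ_HZ * te)
    return rho * np.exp(1j * (phi0 + 2 * np.pi * fieldmap_hz * te)) * np.exp(-r2star * te)

def mixed_residuals(params, te, data):
    """Magnitude residual at the first echo (its phase is discarded),
    complex (real & imaginary) residuals at the remaining echoes."""
    model = signal(params, te)
    res_mag = np.abs(model[0]) - np.abs(data[0])
    res_cplx = model[1:] - data[1:]
    return np.concatenate([[res_mag], res_cplx.real, res_cplx.imag])

# Simulated 6-echo acquisition with a phase error on echo 1 only
te = np.arange(6) * 1.6e-3 + 1.2e-3
truth = (0.7, 0.3, 40.0, 15.0, 0.1)
data = signal(np.array(truth), te)
data[0] *= np.exp(1j * 0.5)  # eddy-current-like phase error corrupts echo 1 only
fit = least_squares(mixed_residuals, x0=(0.5, 0.5, 30.0, 0.0, 0.0), args=(te, data))
w, f = fit.x[:2]
fat_fraction = f / (w + f)
```

Because only the magnitude of the corrupted echo enters the cost function, the phase error leaves the residual at the true parameters exactly zero, which is the property that complex-only fitting lacks.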
Carraro, R; Dalla Rovere, G; Ferraresso, S; Carraro, L; Franch, R; Toffan, A; Pascoli, F; Patarnello, T; Bargelloni, L
2018-02-01
The availability of a rapid and accurate method for the diagnosis of Photobacterium damselae subsp. piscicida (Phdp), able to discriminate it from the closely related subsp. damselae (Phdd), formerly known as Vibrio damsela, is essential for managing fish pasteurellosis outbreaks in farmed fish. A single-step, high-sensitivity real-time PCR assay for the simultaneous detection and quantification of P. damselae was designed targeting a partial sequence of the bamB gene and tested for specificity and sensitivity on laboratory-generated samples as well as on experimentally infected seabream tissue samples. With a limit of detection (LOD) of one copy in pure bacterial DNA, the sensitivity was higher than that of all previously reported methods. Validation on target and non-target bacterial species proved the assay able to discriminate the Phdd and Phdp subspecies from diverse hosts and geographical origins and to exclude non-target species. In addition, two SNPs in the target amplicon region produce two distinctive qPCR dissociation curves that distinguish Phdp from Phdd. This is the first time that a molecular method for P. damselae diagnosis combines detection, quantification and subspecies identification in one step. The assay holds the potential to improve knowledge of infection dynamics and the development of better strategies to control an important fish disease. © 2017 John Wiley & Sons Ltd.
In vivo quantification of brain metabolites by 1H-MRS using water as an internal standard.
Christiansen, P; Henriksen, O; Stubgaard, M; Gideon, P; Larsson, H B
1993-01-01
The reliability of absolute quantification of average metabolite concentrations in the human brain in vivo by 1H-MRS, using the fully relaxed water signal as an internal standard, was tested in a number of in vitro as well as in vivo measurements. The experiments were carried out on a SIEMENS HELICON SP 63/84 whole-body MR scanner operating at 1.5 T using a STEAM sequence. In vitro studies indicate a very high correlation between metabolite signals (area under peaks) and concentration, R = 0.99, as well as between metabolite signals and the volume of the selected voxel, R = 1.00. The error in quantification of N-acetyl aspartate (NAA) concentration was about 1-2 mM (6-12%). A good linearity between the water signal and the selected voxel size was also seen in vivo. The same was true for the studied metabolites: N-acetyl aspartate (NAA), creatine/phosphocreatine (Cr/PCr), and choline (Cho). Calculated average concentrations of NAA, Cr/PCr, and Cho in the occipital lobe of the brain in five healthy volunteers were (mean +/- 1 SD) 11.6 +/- 1.3 mM, 7.6 +/- 1.4 mM, and 1.7 +/- 0.5 mM. The results indicate that the method presented offers reasonable estimation of metabolite concentrations in the brain in vivo and therefore is useful in clinical research.
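The water-referencing arithmetic behind such quantification is standard: scale the metabolite-to-water peak-area ratio by the proton counts of the two resonances and an assumed brain water concentration. A hedged sketch (the water content value and the signal areas are illustrative assumptions, not values from the paper, and relaxation corrections are omitted):

```python
def metabolite_conc_mm(s_met, s_water, protons_met,
                       water_conc_mm=44400.0,  # ~80% water content (assumption)
                       protons_water=2):
    """Metabolite concentration from peak areas using the fully relaxed water
    signal as an internal standard:
        [met] = (S_met / S_water) * (n_water / n_met) * [H2O]
    where n_* are the numbers of protons contributing to each peak."""
    return (s_met / s_water) * (protons_water / protons_met) * water_conc_mm

# NAA quantified via its 3-proton CH3 singlet; the area ratio here is chosen
# so the result lands near the reported occipital-lobe NAA of ~11.6 mM.
naa_mm = metabolite_conc_mm(s_met=1.0, s_water=2550.0, protons_met=3)
```

The 2:3 proton-count factor (water CH2 versus the NAA methyl group) is what makes the ratio of raw areas comparable between resonances.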
Investigation of Containment Flooding Strategy for Mark-III Nuclear Power Plant with MAAP4
DOE Office of Scientific and Technical Information (OSTI.GOV)
Su Weinian; Wang, S.-J.; Chiang, S.-C
2005-06-15
Containment flooding is an important strategy for severe accident management of a conventional boiling water reactor (BWR) system. The purpose of this work is to investigate the containment flooding strategy of the Mark-III system after a reactor pressure vessel (RPV) breach. The Kuosheng Power Plant is a typical BWR-6 nuclear power plant (NPP) with Mark-III containment. The Severe Accident Management Guideline (SAMG) of the Kuosheng NPP has been developed based on the BWR Owners Group (BWROG) Emergency Procedure and Severe Accident Guidelines, Rev. 2. Therefore, the Kuosheng NPP is selected as the plant for study, and the MAAP4 code is chosen as the tool for analysis. A postulated specific station blackout sequence for the Kuosheng NPP is cited as a reference case for this analysis. Because of the design features of Mark-III containment, the debris in the reactor cavity may not be submerged after an RPV breach when one follows the containment flooding strategy as suggested in the BWROG generic guideline, and the containment integrity could be challenged eventually. A more specific containment flooding strategy with drywell venting after an RPV breach is investigated, and a more stable plant condition is achieved with this strategy. Accordingly, the containment flooding strategy after an RPV breach will be modified for the Kuosheng SAMG, and these results are applicable to typical Mark-III plants with drywell vent path.
Lepage, Hugo; Evrard, Olivier; Onda, Yuichi; Patin, Jeremy; Chartin, Caroline; Lefèvre, Irène; Bonté, Philippe; Ayrault, Sophie
2014-04-01
Silver-110 metastable ((110m)Ag) has been far less investigated than other anthropogenic radionuclides, although it has the potential to accumulate in plants and animal tissues. It is continuously produced by nuclear power plants in normal conditions, but emitted in much larger quantities in accidental conditions facilitating its detection, which allows the investigation of its behaviour in the environment. We analysed (110m)Ag in soil and river drape sediment (i.e., mud drapes deposited on channel-bed sand) collected within contaminated coastal catchments of Fukushima Prefecture (Japan) after the Fukushima Dai-ichi Nuclear Power Plant (FDNPP) accident that occurred on 11 March 2011. Several field experiments were conducted to document radiosilver behaviour in the terrestrial environment, with a systematic comparison to the more documented radiocesium behaviour. Results show a similar and low mobility for both elements in soils and a strong affinity with the clay fraction. Measurements conducted on sediment sequences accumulated in reservoirs tend to confirm a comparable deposition of those radionuclides even after their redistribution due to erosion and deposition processes. Therefore, as the (110m)Ag:(137)Cs initial activity ratio varied in soils across the area, we justified the relevance of using this tool to track the dispersion of contaminated sediment from the main inland radioactive pollution plume generated by the FDNPP accident. Copyright © 2014 Elsevier Ltd. All rights reserved.
The role of PRA in the safety assessment of VVER Nuclear Power Plants in Ukraine.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kot, C.
1999-05-10
Ukraine operates thirteen (13) Soviet-designed pressurized water reactors, VVERs. All Ukrainian plants are currently operating with annually renewable permits until they update their safety analysis reports (SARs), in accordance with new SAR content requirements issued in September 1995, by the Nuclear Regulatory Authority and the Government Nuclear Power Coordinating Committee of Ukraine. The requirements are in three major areas: design basis accident (DBA) analysis, probabilistic risk assessment (PRA), and beyond design-basis accident (BDBA) analysis. The last two requirements, on PRA and BDBA, are new, and the DBA requirements are an expanded version of the older SAR requirements. The US Department of Energy (USDOE), as part of its Soviet-Designed Reactor Safety activities, is providing assistance and technology transfer to Ukraine to support their nuclear power plants (NPPs) in developing a Western-type technical basis for the new SARs. USDOE sponsored In-Depth Safety Assessments (ISAs) are in progress at three pilot nuclear reactor units in Ukraine, South Ukraine Unit 1, Zaporizhzhya Unit 5, and Rivne Unit 1, and a follow-on study has been initiated at Khmenytskyy Unit 1. The ISA projects encompass most areas of plant safety evaluation, but the initial emphasis is on performing a detailed, plant-specific Level 1 Internal Events PRA. This allows the early definition of the plant risk profile, the identification of risk significant accident sequences and plant vulnerabilities and provides guidance for the remainder of the safety assessments.
Structure-related statistical singularities along protein sequences: a correlation study.
Colafranceschi, Mauro; Colosimo, Alfredo; Zbilut, Joseph P; Uversky, Vladimir N; Giuliani, Alessandro
2005-01-01
A data set composed of 1141 proteins representative of all eukaryotic protein sequences in the Swiss-Prot Protein Knowledgebase was coded by seven physicochemical properties of amino acid residues. The resulting numerical profiles were submitted to correlation analysis after the application of a linear (simple mean) and a nonlinear (Recurrence Quantification Analysis, RQA) filter. The main RQA variables, Recurrence and Determinism, were subsequently analyzed by Principal Component Analysis. The RQA descriptors showed that (i) specific information is embedded within protein sequences that is present neither in the property codes nor in the amino acid composition, and (ii) the most sensitive code for detecting ordered recurrent (deterministic) patterns of residues in protein sequences is the Miyazawa-Jernigan hydrophobicity scale. The most deterministic proteins in terms of autocorrelation properties of primary structures were found (i) to be involved in protein-protein and protein-DNA interactions and (ii) to display a significantly higher proportion of structural disorder with respect to the average data set. A study of the scaling behavior of the average determinism with the setting parameters of RQA (embedding dimension and radius) allows for the identification of patterns of minimal length (six residues) as possible markers of zones specifically prone to inter- and intramolecular interactions.
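The two RQA descriptors named above, Recurrence and Determinism, can be computed from a numeric property profile with a simplified textbook implementation (delay-1 embedding, Euclidean norm; the parameter values are illustrative, not the paper's settings):

```python
import numpy as np

def rqa(series, dim=3, radius=0.5, lmin=2):
    """Recurrence Rate (RR) and Determinism (DET) of a numeric profile.
    RR  = fraction of off-diagonal point pairs closer than `radius`.
    DET = fraction of those recurrent points lying on diagonal line
          segments of length >= lmin (i.e. recurrent *runs* of states)."""
    x = np.asarray(series, dtype=float)
    n = x.size - dim + 1
    emb = np.stack([x[i:i + n] for i in range(dim)], axis=1)  # delay-1 embedding
    dist = np.linalg.norm(emb[:, None, :] - emb[None, :, :], axis=2)
    rec = (dist <= radius) & ~np.eye(n, dtype=bool)  # exclude the main diagonal
    rr = rec.sum() / (n * n - n)
    diag_points = 0
    for k in range(1, n):  # upper-triangle diagonals; the matrix is symmetric
        run = 0
        for v in list(np.diagonal(rec, offset=k)) + [False]:  # sentinel flushes run
            if v:
                run += 1
            else:
                if run >= lmin:
                    diag_points += run
                run = 0
    det = 2 * diag_points / rec.sum() if rec.sum() else 0.0
    return rr, det
```

A strongly periodic profile (e.g. a sampled sine wave) yields a determinism near 1, whereas uncorrelated noise scatters its recurrences into isolated points and drives DET down; this is the contrast the paper exploits on hydrophobicity-coded sequences.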
Barreda-García, Susana; González-Álvarez, María José; de-Los-Santos-Álvarez, Noemí; Palacios-Gutiérrez, Juan José; Miranda-Ordieres, Arturo J; Lobo-Castañón, María Jesús
2015-06-15
A highly sensitive and robust method for the quantification of specific DNA sequences, based on coupling asymmetric helicase-dependent DNA amplification to electrochemical detection, is described. This method relies on the entrapment of the amplified ssDNA sequences on magnetic beads, followed by a post-amplification hybridization assay to provide an added degree of specificity. As a proof of concept, an 84-base sequence specific to Mycobacterium tuberculosis was amplified at 65°C, providing 3×10⁶-fold amplification after 90 min. Using this system, 0.5 aM, corresponding to 15 copies of the target gene in 50 µL of sample, can be successfully detected and reliably quantified under isothermal conditions in less than 4 h. The assay has been applied to the detection of M. tuberculosis in sputum, pleural fluid and urine samples. Beyond this application, the proposed assay is a powerful and general tool for molecular diagnostics that can be applied to the detection of other specific DNA sequences, taking full advantage of the plethora of genomic information now available. Copyright © 2014 Elsevier B.V. All rights reserved.
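The stated limit of detection can be checked with Avogadro's number: absolute copies equal molar concentration times sample volume times N_A. A one-function sketch:

```python
AVOGADRO = 6.022e23  # molecules per mole

def copies_in_sample(conc_molar, volume_liters):
    """Absolute copy number of a target sequence in a given sample volume."""
    return conc_molar * volume_liters * AVOGADRO

# The reported limit: 0.5 aM (0.5e-18 mol/L) in 50 uL of sample
copies = copies_in_sample(0.5e-18, 50e-6)  # ~15 copies, matching the abstract
```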
Spliced synthetic genes as internal controls in RNA sequencing experiments.
Hardwick, Simon A; Chen, Wendy Y; Wong, Ted; Deveson, Ira W; Blackburn, James; Andersen, Stacey B; Nielsen, Lars K; Mattick, John S; Mercer, Tim R
2016-09-01
RNA sequencing (RNA-seq) can be used to assemble spliced isoforms, quantify expressed genes and provide a global profile of the transcriptome. However, the size and diversity of the transcriptome, the wide dynamic range in gene expression and inherent technical biases confound RNA-seq analysis. We have developed a set of spike-in RNA standards, termed 'sequins' (sequencing spike-ins), that represent full-length spliced mRNA isoforms. Sequins have an entirely artificial sequence with no homology to natural reference genomes, but they align to gene loci encoded on an artificial in silico chromosome. The combination of multiple sequins across a range of concentrations emulates alternative splicing and differential gene expression, and it provides scaling factors for normalization between samples. We demonstrate the use of sequins in RNA-seq experiments to measure sample-specific biases and determine the limits of reliable transcript assembly and quantification in accompanying human RNA samples. In addition, we have designed a complementary set of sequins that represent fusion genes arising from rearrangements of the in silico chromosome to aid in cancer diagnosis. RNA sequins provide a qualitative and quantitative reference with which to navigate the complexity of the human transcriptome.
Bauer, Bianca S.; Forsyth, George W.; Sandmeyer, Lynne S.; Grahn, Bruce H.
2011-01-01
Mitochondrial transcription factor A (Tfam) has been implicated in the pathogenesis of retinal dysplasia in miniature schnauzer dogs and it has been proposed that affected dogs have altered mitochondrial numbers, size, and morphology. To test these hypotheses the Tfam gene of affected and normal miniature schnauzer dogs with retinal dysplasia was sequenced and lymphocyte mitochondria were quantified, measured, and the morphology was compared in normal and affected dogs using transmission electron microscopy. For Tfam sequencing, retina, retinal pigment epithelium (RPE), and whole blood samples were collected. Total RNA was isolated from the retina and RPE and reverse transcribed to make cDNA. Genomic DNA was extracted from white blood cell pellets obtained from the whole blood samples. The Tfam coding sequence, 5′ promoter region, intron 1, and the 3′ non-coding sequence of normal and affected dogs were amplified using polymerase chain reaction (PCR), cloned and sequenced. For electron microscopy, lymphocytes from affected and normal dogs were photographed and the mitochondria within each cross-section were identified, quantified, and the mitochondrial area (μm²) per lymphocyte cross-section was calculated. Lastly, using a masked technique, mitochondrial morphology was compared between the 2 groups. Sequencing of the miniature schnauzer Tfam gene revealed no functional sequence variation between affected and normal dogs. Lymphocyte and mitochondrial area, mitochondrial quantification, and morphology assessment also revealed no significant difference between the 2 groups. Further investigation into other candidate genes or factors causing retinal dysplasia in the miniature schnauzer is warranted. PMID:21731185
Accident diagnosis system based on real-time decision tree expert system
NASA Astrophysics Data System (ADS)
Nicolau, Andressa dos S.; Augusto, João P. da S. C.; Schirru, Roberto
2017-06-01
Safety is one of the most studied topics when referring to power stations. For that reason, sensors and alarms play an important role in environmental and human protection. When an abnormal event happens, it triggers a chain of alarms that must be, somehow, checked by the control room operators. In this case, a diagnosis support system can help operators accurately identify the possible root cause of the problem in a short time. In this article, we present a computational model of a generic diagnosis support system based on artificial intelligence, which was applied to the datasets of two real power stations: the Angra 1 Nuclear Power Plant and the Santo Antônio Hydroelectric Plant. The proposed system processes all the information logged in the sequence of events before a shutdown signal, using the expert knowledge input into an expert system, to indicate the chain of events from the shutdown signal to its root cause. The results of both applications showed that the support system is a potential tool to help control room operators identify abnormal events, such as accidents, and consequently increase safety.
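A rule-based diagnosis layer of the kind described, mapping patterns in the pre-shutdown alarm sequence to candidate root causes, might be sketched as follows. The alarm tags and rules are invented for illustration and are not taken from the Angra 1 or Santo Antônio rule bases:

```python
# Each rule maps an ordered alarm pattern (a subsequence expected in the logged
# event chain before the shutdown signal) to a hypothetical candidate root cause.
RULES = [
    (("LOW_FEEDWATER_FLOW", "LOW_STEAM_DRUM_LEVEL"), "loss of feedwater"),
    (("HIGH_BEARING_TEMP", "TURBINE_VIBRATION"), "turbine bearing degradation"),
    (("GRID_FREQ_DEVIATION",), "external grid disturbance"),
]

def diagnose(event_sequence):
    """Return candidate root causes whose alarm pattern appears, in order,
    as a subsequence of the logged sequence of events."""
    causes = []
    for pattern, cause in RULES:
        it = iter(event_sequence)
        # `any(...)` advances the shared iterator, so the pattern must match in order
        if all(any(a == p for a in it) for p in pattern):
            causes.append(cause)
    return causes

log = ["GRID_FREQ_DEVIATION", "LOW_FEEDWATER_FLOW", "LOW_STEAM_DRUM_LEVEL", "SCRAM"]
```

In a real expert system each rule would carry the expert-elicited reasoning chain from shutdown signal back to root cause; this sketch only shows the pattern-matching core.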
Mullaji, Arun; Sharma, Amit; Marawar, Satyajit; Kanna, Raj
2009-08-01
A novel sequence of posteromedial release consistent with surgical technique of total knee arthroplasty was performed in 15 cadaveric knees. Medial and lateral flexion and extension gaps were measured after each step of the release using a computed tomography-free computer navigation system. A spring-loaded distractor and a manual distractor were used to distract the joint. Posterior cruciate ligament release increased flexion more than extension gap; deep medial collateral ligament release had a negligible effect; semimembranosus release increased the flexion gap medially; reduction osteotomy increased medial flexion and extension gaps; superficial medial collateral ligament release increased medial joint gap more in flexion and caused severe instability. This sequence of release led to incremental and differential effects on flexion-extension gaps and has implications in correcting varus deformity.
iSeq: Web-Based RNA-seq Data Analysis and Visualization.
Zhang, Chao; Fan, Caoqi; Gan, Jingbo; Zhu, Ping; Kong, Lei; Li, Cheng
2018-01-01
Transcriptome sequencing (RNA-seq) is becoming a standard experimental methodology for genome-wide characterization and quantification of transcripts at single base-pair resolution. However, downstream analysis of massive amounts of sequencing data can be prohibitively technical for wet-lab researchers. A functionally integrated and user-friendly platform is required to meet this demand. Here, we present iSeq, an R-based Web server, for RNA-seq data analysis and visualization. iSeq is a streamlined Web-based R application under the Shiny framework, featuring a simple user interface and multiple data analysis modules. Users without programming and statistical skills can analyze their RNA-seq data and construct publication-level graphs through a standardized yet customizable analytical pipeline. iSeq is accessible via Web browsers on any operating system at http://iseq.cbi.pku.edu.cn .
Chance, necessity and the origins of life: a physical sciences perspective.
Hazen, Robert M
2017-12-28
Earth's 4.5-billion-year history has witnessed a complex sequence of high-probability chemical and physical processes, as well as 'frozen accidents'. Most models of life's origins similarly invoke a sequence of chemical reactions and molecular self-assemblies in which both necessity and chance play important roles. Recent research adds two important insights into this discussion. First, in the context of chemical reactions, chance versus necessity is an inherently false dichotomy: a range of probabilities exists for many natural events. Second, given the combinatorial richness of early Earth's chemical and physical environments, events in molecular evolution that are unlikely at limited laboratory scales of space and time may, nevertheless, be inevitable on an Earth-like planet at time scales of a billion years. This article is part of the themed issue 'Reconceptualizing the origins of life'. © 2017 The Author(s).
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ott, Larry J.; Howell, Michael; Robb, Kevin R.
Iron-chromium-aluminum (FeCrAl) alloys are being considered as advanced fuel cladding concepts with enhanced accident tolerance. At high temperatures, FeCrAl alloys have slower oxidation kinetics and higher strength compared with zirconium-based alloys. FeCrAl could be used for fuel cladding and spacer or mixing vane grids in light water reactors and/or as channel box material in boiling water reactors (BWRs). There is a need to assess the potential gains afforded by the FeCrAl accident-tolerant-fuel (ATF) concept over the existing zirconium-based materials employed today. To accurately assess the response of FeCrAl alloys under severe accident conditions, a number of FeCrAl properties and characteristics are required. These include thermophysical properties as well as burst characteristics, oxidation kinetics, possible eutectic interactions, and failure temperatures. These properties can vary among different FeCrAl alloys. Oak Ridge National Laboratory has pursued refined values for the oxidation kinetics of the B136Y FeCrAl alloy (Fe-13Cr-6Al wt %). This investigation included oxidation tests with varying heating rates and end-point temperatures in a steam environment. The rate constant for the low-temperature oxidation kinetics was found to be higher than that for the commercial APMT FeCrAl alloy (Fe-21Cr-5Al-3Mo wt %). Compared with APMT, a 5 times higher rate constant best predicted the entire dataset (root mean square deviation). Based on tests following heating rates comparable with those the cladding would experience during a station blackout, the transition to higher oxidation kinetics occurs at approximately 1,500°C. A parametric study varying the low-temperature FeCrAl oxidation kinetics was conducted for a BWR plant using FeCrAl fuel cladding and channel boxes using the MELCOR code. A range of station blackout severe accident scenarios were simulated for a BWR/4 reactor with Mark I containment.
Increasing the FeCrAl low-temperature oxidation rate constant (3 times and 10 times that of the rate constant for APMT) had a negligible impact on the early stages of the accident and minor impacts on the accident progression after the first relocation of the fuel. At temperatures below 1,500°C, increasing the rate constant for APMT by a factor of 10 still resulted in only minor FeCrAl oxidation. In general, the gains afforded by the FeCrAl enhanced ATF concept with respect to accident sequence timing and combustible gas generation are consistent with previous efforts. Compared with the traditional Zircaloy-based cladding and channel box system, the FeCrAl concept could provide a few extra hours of time for operators to take mitigating actions and/or for evacuations to take place. A coolable core geometry is retained longer, enhancing the ability to stabilize an accident. For example, a station blackout was simulated in which cooling water injection was lost 36 hours after shutdown. The timing to first fuel relocation was delayed by approximately 5 h for the FeCrAl ATF concept compared with that of the traditional Zircaloy-based cladding and channel box system.
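The oxidation behavior described above is commonly modeled with a parabolic rate law, in which squared mass gain per unit area grows linearly with time. As a minimal sketch (the rate constants below are illustrative placeholders, not the measured B136Y or APMT values), this shows why a 5x higher rate constant translates into only a sqrt(5)x higher mass gain at a given time:

```python
import math

def oxide_mass_gain(kp, t_s):
    """Parabolic oxidation law: (dm/A)^2 = kp * t, so dm/A = sqrt(kp * t).

    kp: parabolic rate constant (e.g. mg^2 cm^-4 s^-1), t_s: time in
    seconds. Both arguments here are illustrative, not values from the
    study.
    """
    return math.sqrt(kp * t_s)

# Scaling kp by 5x (as reported for B136Y relative to APMT) scales
# mass gain at any fixed time by sqrt(5) ~ 2.24, a sub-linear effect
# consistent with the minor impact seen in the MELCOR parametric study.
ratio = oxide_mass_gain(5.0, 3600.0) / oxide_mass_gain(1.0, 3600.0)
```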
Bollache, Emilie; van Ooij, Pim; Powell, Alex; Carr, James; Markl, Michael; Barker, Alex J.
2016-01-01
The purpose of this study was to compare aortic flow and velocity quantification using 4D flow MRI and 2D CINE phase-contrast (PC)-MRI with either one-directional (2D-1dir) or three-directional (2D-3dir) velocity encoding. 15 healthy volunteers (51 ± 19 years) underwent MRI including (1) breath-holding 2D-1dir and (2) free breathing 2D-3dir PC-MRI in planes orthogonal to the ascending (AA) and descending (DA) aorta, as well as (3) free breathing 4D flow MRI with full thoracic aorta coverage. Flow quantification included the co-registration of the 2D PC acquisition planes with 4D flow MRI data, AA and DA segmentation, and calculation of AA and DA peak systolic velocity, peak flow and net flow volume for all sequences. Additionally, the 2D-3dir velocity taking into account the through-plane component only was used to obtain results analogous to a free breathing 2D-1dir acquisition. Good agreement was found between 4D flow and 2D-3dir peak velocity (differences = −3 to 6 %), peak flow (−7 %) and net volume (−14 to −9 %). In contrast, breath-holding 2D-1dir measurements exhibited indices significantly lower than free breathing 2D-3dir and 2D-1dir (differences = −35 to −7 %, p < 0.05). Finally, high correlations (r ≥ 0.97) were obtained for indices estimated with or without eddy current correction, with the lowest correlation observed for net volume. 4D flow and 2D-3dir aortic hemodynamic indices were in concordance. However, differences between respiration state and 2D-1dir and 2D-3dir measurements indicate that reference values should be established according to the PC-MRI sequence, especially for the widely used net flow (e.g. stroke volume in the AA). PMID:27435230
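Net flow volume in PC-MRI studies such as this one is typically obtained by integrating the mean through-plane velocity over the segmented lumen across the cardiac cycle. A minimal sketch of that quantification, with entirely hypothetical velocity, area, and timing values:

```python
import numpy as np

def net_flow_volume(mean_velocity_cm_s, lumen_area_cm2, dt_s):
    """Net flow volume (mL) over one cardiac cycle.

    mean_velocity_cm_s: per-phase mean through-plane velocity in the
    segmented lumen (cm/s); lumen_area_cm2: lumen cross-sectional area
    (cm^2); dt_s: temporal resolution (s). 1 cm^3 equals 1 mL.
    """
    v = np.asarray(mean_velocity_cm_s, dtype=float)
    return float(np.sum(v * lumen_area_cm2 * dt_s))

# Hypothetical 20-phase, systole-dominant velocity waveform.
velocities = 60.0 * np.clip(np.sin(np.linspace(0.0, np.pi, 20)), 0.0, None)
volume_ml = net_flow_volume(velocities, lumen_area_cm2=7.0, dt_s=0.04)
```

Peak velocity and peak flow follow from the same per-phase quantities via a maximum rather than a sum.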
Li, Kai; Chen, Bei; Zhou, Yuxun; Huang, Rui; Liang, Yinming; Wang, Qinxi; Xiao, Zhenxian; Xiao, Junhua
2009-03-01
A new method, based on ligase detection reaction (LDR), was developed for quantitative detection of multiplex PCR amplicons of 16S rRNA genes present in complex mixtures (specifically feces). LDR has been widely used in single nucleotide polymorphism (SNP) assays but never applied for quantification of multiplex PCR products. This method employs one pair of DNA probes, one of which is labeled with fluorescence for signal capture, complementary to the target sequence. For multiple target sequence analysis, probes were modified with different lengths of polyT at the 5' end and 3' end. Using a DNA sequencer, these ligated probes were separated and identified by size and dye color. Then, the relative abundance of target DNA was normalized and quantified based on the fluorescence intensities and exterior size standards. The 16S rRNA genes of three preponderant bacterial groups in human feces, Clostridium coccoides, Bacteroides and related genera, and the Clostridium leptum group, were amplified and cloned into plasmid DNA so as to make standard curves. After PCR-LDR analysis, a strong linear relationship was found between the fluorescence intensity and the diluted plasmid DNA concentrations. Furthermore, based on this method, 100 human fecal samples were quantified for the relative abundance of the three bacterial groups. Relative abundance of C. coccoides was significantly higher in elderly people in comparison with young adults, without gender differences. Relative abundances of Bacteroides and related genera and the C. leptum group were significantly higher in young and middle-aged adults than in the elderly. Across the whole set of samples, C. coccoides showed the highest relative abundance, followed, in decreasing order, by Bacteroides and related genera and the C. leptum group. These results imply that PCR-LDR can be feasibly and flexibly applied to large-scale epidemiological studies.
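The standard-curve step described above, fitting fluorescence intensity against log-diluted plasmid concentration, then inverting the fit and normalizing across groups, can be sketched as follows. The curve coefficients and intensities are hypothetical, stand-ins for per-group calibration data:

```python
import numpy as np

def fit_standard_curve(log10_conc, intensity):
    # Linear fit intensity = a * log10(conc) + b, matching the reported
    # linear relationship between fluorescence and plasmid dilution.
    a, b = np.polyfit(log10_conc, intensity, 1)
    return a, b

def estimate_relative_abundance(intensities, curves):
    # Invert each group's standard curve to an estimated concentration,
    # then normalize to relative abundance across the groups.
    conc = {g: 10.0 ** ((i - curves[g][1]) / curves[g][0])
            for g, i in intensities.items()}
    total = sum(conc.values())
    return {g: c / total for g, c in conc.items()}

# Hypothetical calibration: three dilutions spanning two decades.
curve = fit_standard_curve([0.0, 1.0, 2.0], [0.0, 100.0, 200.0])
```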
Li, Wenli; Turner, Amy; Aggarwal, Praful; Matter, Andrea; Storvick, Erin; Arnett, Donna K; Broeckel, Ulrich
2015-12-16
Whole transcriptome sequencing (RNA-seq) represents a powerful approach for whole transcriptome gene expression analysis. However, RNA-seq carries a few limitations, e.g., the requirement of a significant amount of input RNA and complications caused by non-specific mapping of short reads. The Ion AmpliSeq Transcriptome Human Gene Expression Kit (AmpliSeq) was recently introduced by Life Technologies as a whole-transcriptome, targeted gene quantification kit to overcome these limitations of RNA-seq. To assess the performance of this new methodology, we performed a comprehensive comparison of AmpliSeq with RNA-seq using two well-established next-generation sequencing platforms (Illumina HiSeq and Ion Torrent Proton). We analyzed standard reference RNA samples and RNA samples obtained from human induced pluripotent stem cell derived cardiomyocytes (hiPSC-CMs). Using published data from two standard RNA reference samples, we observed a strong concordance of log2 fold change for all genes when comparing AmpliSeq to Illumina HiSeq (Pearson's r = 0.92) and Ion Torrent Proton (Pearson's r = 0.92). We used ROC, the Matthews correlation coefficient and RMSD to determine the overall performance characteristics. All three statistical methods demonstrate AmpliSeq as a highly accurate method for differential gene expression analysis. Additionally, for genes with high abundance, AmpliSeq outperforms the two RNA-seq methods. When analyzing four closely related hiPSC-CM lines, we show that both AmpliSeq and RNA-seq capture similar global gene expression patterns consistent with known sources of variation. Our study indicates that AmpliSeq excels in the limiting areas of RNA-seq for gene expression quantification analysis. Thus, AmpliSeq stands as a very sensitive and cost-effective approach for very large-scale gene expression analysis and mRNA marker screening with high accuracy.
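The concordance metric used above, Pearson's r between per-gene log2 fold changes measured on two platforms, can be sketched in a few lines. The pseudocount and the input arrays here are illustrative assumptions, not data from the study:

```python
import numpy as np

def log2fc_concordance(counts_a, counts_b, pseudocount=1.0):
    """Pearson's r between per-gene log2 fold changes on two platforms.

    counts_a, counts_b: (genes, 2) arrays of normalized expression for
    the two reference samples on each platform. The pseudocount guards
    against log2(0) and is an assumption, not the paper's choice.
    """
    fc_a = np.log2((counts_a[:, 0] + pseudocount) / (counts_a[:, 1] + pseudocount))
    fc_b = np.log2((counts_b[:, 0] + pseudocount) / (counts_b[:, 1] + pseudocount))
    return float(np.corrcoef(fc_a, fc_b)[0, 1])
```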
Wang, Hao; Straubinger, Robert M; Aletta, John M; Cao, Jin; Duan, Xiaotao; Yu, Haoying; Qu, Jun
2009-03-01
Protein arginine (Arg) methylation serves an important functional role in eukaryotic cells, and typically occurs in domains consisting of multiple Arg in close proximity. Localization of methylarginine (MA) within Arg-rich domains poses a challenge for mass spectrometry (MS)-based methods; the peptides are highly charged under electrospray ionization (ESI), which limits the number of sequence-informative products produced by collision induced dissociation (CID), and loss of the labile methylation moieties during CID precludes effective fragmentation of the peptide backbone. Here the fragmentation behavior of Arg-rich peptides was investigated comprehensively using electron-transfer dissociation (ETD) and CID for both methylated and unmodified glycine-/Arg-rich peptides (GAR), derived from residues 679-695 of human nucleolin, which contains methylation motifs that are widely-represented in biological systems. ETD produced abundant information for sequencing and MA localization, whereas CID failed to provide credible identification for any available charge state (z = 2-4). Nevertheless, CID produced characteristic neutral losses that can be employed to distinguish among different types of MA, as suggested by previous works and confirmed here with product ion scans of high accuracy/resolution by an LTQ/Orbitrap. To analyze MA-peptides in relatively complex mixtures, a method was developed that employs nano-LC coupled to alternating CID/ETD for peptide sequencing and MA localization/characterization, and an Orbitrap for accurate precursor measurement and relative quantification of MA-peptide stoichiometries. As proof of concept, GAR-peptides methylated in vitro by protein arginine N-methyltransferases PRMT1 and PRMT7 were analyzed. It was observed that PRMT1 generated a number of monomethylated (MMA) and asymmetric-dimethylated peptides, while PRMT7 produced predominantly MMA peptides and some symmetric-dimethylated peptides.
This approach and the results may advance understanding of the actions of PRMTs and the functional significance of Arg methylation patterns.
Li, Meirong; Jin, Wei; Li, Yuanfei; Zhao, Lingling; Cheng, Yanfen; Zhu, Weiyun
2016-06-01
The quantification and community of bacteria in the gastrointestinal (GI) tract (stomach, jejunum, ileum, cecum, colon and rectum) of red kangaroos (Macropus rufus) were examined by using real-time PCR and paired-end Illumina sequencing. The quantification of bacteria showed that the number of bacteria in the jejunum and rectum was significantly lower than that in the colon and cecum (P < 0.05). A total of 1,872,590 sequences remained after quality filtering and 50,948 OTUs were identified at the 97 % similarity level. The dominant phyla in the GI tract of red kangaroos were identified as Actinobacteria, Bacteroidetes and Firmicutes. At the genus level, the samples from different parts of the GI tract clustered into three groups: stomach, small intestine (jejunum and ileum) and large intestine (cecum and rectum). Prevotella (29.81 %) was the most dominant genus in the stomach and significantly (P < 0.05) higher than in other parts of the GI tract. In the small intestine, Bifidobacterium (33.04, 12.14 %) and Streptococcus (22.90, 19.16 %) were the dominant genera. Unclassified Ruminococcaceae was the most dominant family in the large intestine, and the total relative abundance of unclassified bacteria was above 50 %. Among the identified genera, Dorea was the most important variable for discriminating the large intestine, and it was significantly higher in the cecum than in the stomach, small intestine and colon (P < 0.05). Bifidobacterium (21.89 %) was the only dominant genus in the colon. Future work on in vitro culture and genome sequencing of those unidentified bacteria might give us insight into the function of these microorganisms in the GI tract. In addition, comparison of the bacterial community in the foregut of kangaroos with that of other herbivores and the rumen might give us insight into the mechanism of fiber degradation and help us exploit approaches to improve feed efficiency and, subsequently, reduce methane emission from herbivores.
Devonshire, Alison S; O'Sullivan, Denise M; Honeyborne, Isobella; Jones, Gerwyn; Karczmarczyk, Maria; Pavšič, Jernej; Gutteridge, Alice; Milavec, Mojca; Mendoza, Pablo; Schimmel, Heinz; Van Heuverswyn, Fran; Gorton, Rebecca; Cirillo, Daniela Maria; Borroni, Emanuele; Harris, Kathryn; Barnard, Marinus; Heydenrych, Anthenette; Ndusilo, Norah; Wallis, Carole L; Pillay, Keshree; Barry, Thomas; Reddington, Kate; Richter, Elvira; Mozioğlu, Erkan; Akyürek, Sema; Yalçınkaya, Burhanettin; Akgoz, Muslum; Žel, Jana; Foy, Carole A; McHugh, Timothy D; Huggett, Jim F
2016-08-03
Real-time PCR (qPCR)-based methods, such as the Xpert MTB/RIF, are increasingly being used to diagnose tuberculosis (TB). While qualitative methods are adequate for diagnosis, the therapeutic monitoring of TB patients requires quantitative methods currently performed using smear microscopy. The potential use of quantitative molecular measurements for therapeutic monitoring has been investigated but findings have been variable and inconclusive. The lack of an adequate reference method and reference materials is a barrier to understanding the source of such disagreement. Digital PCR (dPCR) offers the potential for an accurate method for quantification of specific DNA sequences in reference materials which can be used to evaluate quantitative molecular methods for TB treatment monitoring. To assess a novel approach for the development of quality assurance materials we used dPCR to quantify specific DNA sequences in a range of prototype reference materials and evaluated accuracy between different laboratories and instruments. The materials were then also used to evaluate the quantitative performance of qPCR and Xpert MTB/RIF in eight clinical testing laboratories. dPCR was found to provide results in good agreement with the other methods tested and to be highly reproducible between laboratories without calibration even when using different instruments. When the reference materials were analysed with qPCR and Xpert MTB/RIF by clinical laboratories, all laboratories were able to correctly rank the reference materials according to concentration; however, there was a marked difference in the measured magnitude. TB is a disease where the quantification of the pathogen could lead to better patient management and qPCR methods offer the potential to rapidly perform such analysis.
However, our findings suggest that when precisely characterised materials are used to evaluate qPCR methods, the measurement result variation is too high to determine whether molecular quantification of Mycobacterium tuberculosis would provide a clinically useful readout. The methods described in this study provide a means by which the technical performance of quantitative molecular methods can be evaluated independently of clinical variability to improve accuracy of measurement results. These will assist in ultimately increasing the likelihood that such approaches could be used to improve patient management of TB.
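dPCR quantification of the kind used for these reference materials rests on Poisson statistics: the fraction of negative partitions gives the mean copies per partition, which converts to an absolute concentration without calibration. A minimal sketch (the 0.85 nL default mirrors a common droplet volume and is an assumption, not a value from this study):

```python
import math

def dpcr_copies_per_ul(negative_partitions, total_partitions,
                       partition_volume_nl=0.85, dilution_factor=1.0):
    """Poisson estimate of target concentration from a dPCR run.

    lambda = -ln(fraction of negative partitions) is the mean copies
    per partition; dividing by partition volume and converting nL to uL
    yields copies/uL of the loaded reaction.
    """
    p_neg = negative_partitions / total_partitions
    lam = -math.log(p_neg)                  # mean copies per partition
    copies_per_nl = lam / partition_volume_nl
    return copies_per_nl * 1000.0 * dilution_factor  # nL -> uL
```

This calibration-free property is what makes dPCR attractive as a reference method for value-assigning materials that qPCR and Xpert MTB/RIF are then judged against.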
NASA Astrophysics Data System (ADS)
Salas-Ramirez, Maikol; Tran-Gia, Johannes; Kesenheimer, Christian; Weng, Andreas Max; Kosmala, Aleksander; Heidemeier, Anke; Köstler, Herbert; Lassmann, Michael
2018-01-01
Absorbed dose to active bone marrow is a predictor of hematological toxicity in molecular radiotherapy. Due to the complex composition of bone marrow tissue, the need to improve personalized dosimetry has led to the application of non-conventional imaging methods in nuclear medicine. The aim of this study is to apply magnetic resonance imaging (MRI) for quantification of the fat fraction in lumbar vertebrae and to analyze its implications for bone marrow dosimetry. First, a highly accelerated two-point Dixon MRI sequence for fat-water separation was validated in a 3T system against the magnetic resonance spectroscopy (MRS) gold standard. The validation was performed in a fat-water phantom composed of 11 vials with different fat fractions between 0% and 100%, and subsequently repeated in the lumbar vertebrae of three healthy volunteers. Finally, a retrospective study was performed by analyzing the fat fraction in five lumbar vertebrae of 44 patients scanned with the two-point Dixon sequence. The two-point Dixon phantom acquisition showed a good agreement (maximum difference = 2.9%) between the nominal fat fraction and MRS. In the volunteers, a statistical analysis showed a non-significant difference (p = 0.19) between MRI and MRS. In the patients, gender-specific linear fits for female and male data indicated that the age-dependent marrow conversion (red → yellow marrow) is slower in males (0.3% per year) than in females (0.5% per year). Lastly, the fat fraction values showed a considerable variability in patients of similar ages and the same gender. Two-point Dixon MRI enables a non-invasive and spatially resolved quantification of the fat fraction in bone marrow. Our study provides important evidence on the differences in marrow conversion between females and males.
In addition, differences were observed in the cellularity values of the International Commission on Radiological Protection (ICRP) reference man (0.7) and the median values obtained in our patient group. These observations lead to the conclusion that the fat fraction in bone marrow should be considered as a patient-specific variable in clinical dosimetry procedures.
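The two-point Dixon reconstruction underlying this fat-fraction measurement combines in-phase (water + fat) and opposed-phase (water − fat) images. A simplified sketch, assuming water-dominant voxels and omitting the B0/T2* corrections that vendor implementations apply:

```python
import numpy as np

def dixon_fat_fraction(in_phase, opposed_phase):
    """Two-point Dixon fat fraction from magnitude images.

    W = (IP + OP) / 2, F = (IP - OP) / 2, FF = F / (W + F).
    Assumes water-dominant voxels; no field-inhomogeneity or T2*
    correction (a simplification relative to clinical sequences).
    """
    ip = np.asarray(in_phase, dtype=float)
    op = np.asarray(opposed_phase, dtype=float)
    water = (ip + op) / 2.0
    fat = (ip - op) / 2.0
    with np.errstate(divide="ignore", invalid="ignore"):
        return np.where(water + fat > 0, fat / (water + fat), 0.0)
```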
Geometric and Road Environmental Effects against Total Number of Traffic Accidents in Kendari
NASA Astrophysics Data System (ADS)
Kurdin, M. Akbar; Welendo, La; Annisa, Nur
2017-05-01
Among the large number of traffic accidents that occur, Kendari is the biggest contributor in Southeast Sulawesi. The number of accidents in Kendari attributed to road geometry was 18 in 2011, 13 in 2012, and 6 in 2013; over these three consecutive years, the biggest contributor to geometry-related accidents was Abeli district. This study aimed to identify the accident-prone points (black spots) on roads in Kecamatan Abeli, an accident-prone area of Kendari, to analyze the influence of geometry and the road environment on accidents, and to propose treatments based on the causes of accidents at the black-spot locations in order to reduce the traffic accident rate. From a study of six accident-prone curves, curves I, II, and VI were found to be black spots influenced by traffic volume and conditions, while at curve II the accidents were also caused by unsafe geometry, where the alignment type should be changed from Spiral-Spiral to Spiral-Circle-Spiral. This indicates that road geometry affects the number of accidents.
Altimari, Annalisa; de Biase, Dario; De Maglio, Giovanna; Gruppioni, Elisa; Capizzi, Elisa; Degiovanni, Alessio; D’Errico, Antonia; Pession, Annalisa; Pizzolitto, Stefano; Fiorentino, Michelangelo; Tallini, Giovanni
2013-01-01
Detection of KRAS mutations in archival pathology samples is critical for the therapeutic appropriateness of anti-EGFR monoclonal antibodies in colorectal cancer. We compared the sensitivity, specificity, and accuracy of Sanger sequencing, ARMS-Scorpion (TheraScreen®) real-time polymerase chain reaction (PCR), pyrosequencing, chip array hybridization, and 454 next-generation sequencing to assess KRAS codon 12 and 13 mutations in 60 nonconsecutive selected cases of colorectal cancer. Twenty of the 60 cases were detected as wild-type KRAS by all methods with 100% specificity. Among the 40 mutated cases, 13 were discrepant with at least one method. The sensitivity was 85%, 90%, 93%, and 92%, and the accuracy was 90%, 93%, 95%, and 95% for Sanger sequencing, TheraScreen real-time PCR, pyrosequencing, and chip array hybridization, respectively. The main limitation of Sanger sequencing was its low analytical sensitivity, whereas TheraScreen real-time PCR, pyrosequencing, and chip array hybridization showed higher sensitivity but suffered from the limitations of predesigned assays. Concordance between the methods was k = 0.79 for Sanger sequencing and k > 0.85 for the other techniques. Tumor cell enrichment correlated significantly with the abundance of KRAS-mutated deoxyribonucleic acid (DNA), evaluated as ΔCt for TheraScreen real-time PCR (P = 0.03), percentage of mutation for pyrosequencing (P = 0.001), ratio for chip array hybridization (P = 0.003), and percentage of mutation for 454 next-generation sequencing (P = 0.004). Also, 454 next-generation sequencing showed the best cross-correlation for quantification of mutation abundance compared with all the other methods (P < 0.001). Our comparison showed the superiority of next-generation sequencing over the other techniques in terms of sensitivity and specificity. Next-generation sequencing will replace Sanger sequencing as the reference technique for diagnostic detection of KRAS mutation in archival tumor tissues.
PMID:23950653
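The performance figures compared above follow from a standard 2x2 confusion table. As a sketch, with Sanger sequencing's reported numbers reproduced for illustration (40 mutated and 20 wild-type cases, 85% sensitivity implying 34 true positives):

```python
def diagnostic_metrics(tp, fp, tn, fn):
    """Sensitivity, specificity and accuracy from a 2x2 confusion table,
    as used to compare the KRAS genotyping methods."""
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    accuracy = (tp + tn) / (tp + fp + tn + fn)
    return sensitivity, specificity, accuracy

# 34/40 mutated cases detected, all 20 wild-type cases correct:
# sensitivity 0.85, specificity 1.0, accuracy 0.90, matching the
# values reported for Sanger sequencing.
sens, spec, acc = diagnostic_metrics(tp=34, fp=0, tn=20, fn=6)
```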
Role of susceptibility-weighted imaging in demonstration of cerebral fat embolism.
Yeap, Pheyming; Kanodia, Avinash Kumar; Main, Gavin; Yong, Aiwain
2015-01-08
Cerebral fat embolism (CFE) is a rare but potentially lethal complication of long bone fractures. Many cases of CFE occur as subclinical events and remain undiagnosed. We report a case of a 22-year-old man, with multiple long bone fractures from a road traffic accident, who subsequently developed hypoxia, neurological abnormality and petechial rash. CT of the head was normal. MRI of the head confirmed the diagnosis with lesions markedly conspicuous and most widespread on susceptibility-weighted imaging as compared to all other sequences including diffusion-weighted imaging. 2015 BMJ Publishing Group Ltd.
Origins of Protein Functions in Cells
NASA Technical Reports Server (NTRS)
Seelig, Burchard; Pohorille, Andrzej
2011-01-01
In modern organisms proteins perform a majority of cellular functions, such as chemical catalysis, energy transduction and transport of material across cell walls. Although great strides have been made towards understanding protein evolution, a meaningful extrapolation from contemporary proteins to their earliest ancestors is virtually impossible. In an alternative approach, the origin of water-soluble proteins was probed through the synthesis and in vitro evolution of very large libraries of random amino acid sequences. In combination with computer modeling and simulations, these experiments allow us to address a number of fundamental questions about the origins of proteins. Can functionality emerge from random sequences of proteins? How did the initial repertoire of functional proteins diversify to facilitate new functions? Did this diversification proceed primarily through drawing novel functionalities from random sequences or through evolution of already existing proto-enzymes? Did protein evolution start from a pool of proteins defined by a frozen accident, or could other collections of proteins have started different evolutionary pathways? Although we do not have definitive answers to these questions yet, important clues have been uncovered. In one example (Keefe and Szostak, 2001), novel ATP binding proteins were identified that appear to be unrelated in both sequence and structure to any known ATP binding proteins. One of these proteins was subsequently redesigned computationally to bind GTP by introducing several mutations that produce targeted structural changes to the protein, improve its binding to guanine and prevent water from accessing the active center. This study facilitates further investigations of individual evolutionary steps that lead to a change of function in primordial proteins. In a second study (Seelig and Szostak, 2007), novel enzymes were generated that can join two pieces of RNA in a reaction for which no natural enzymes are known.
Recently it was found that, as in the previous case, the proteins have a structure unknown among modern enzymes. In this case, in vitro evolution started from a small, non-enzymatic protein. A similar selection process initiated from a library of random polypeptides is in progress. These results not only allow for estimating the occurrence of function in random protein assemblies but also provide evidence for the possibility of alternative protein worlds. Extant proteins might simply represent a frozen accident in the world of possible proteins. Alternative collections of proteins, even with similar functions, could originate alternative evolutionary paths.
NASA Astrophysics Data System (ADS)
Heo, S.; Lee, W. K.; Jong-Ryeul, S.; Kim, M. I.
2016-12-01
The use of chemical compounds are keep increasing because of their use in manufacturing industry. Chemical accident is growing as the consequence of the chemical use increment. Devastating damages from chemical accidents are far enough to aware people's cautious about the risk of the chemical accident. In South Korea, Gumi Hydrofluoric acid leaking accident triggered the importance of risk management and emphasized the preventing the accident over the damage reducing process after the accident occurs. Gumi accident encouraged the government data base construction relate to the chemical accident. As the result of this effort Chemical Safety-Clearing-house (CSC) have started to record the chemical accident information and damages according to the Harmful Chemical Substance Control Act (HCSC). CSC provide details information about the chemical accidents from 2002 to present. The detail informations are including title of company, address, business type, accident dates, accident types, accident chemical compounds, human damages inside of the chemical industry facilities, human damage outside of the chemical industry facilities, financial damages inside of the chemical industry facilities, and financial damages outside of the chemical industry facilities, environmental damages and response to the chemical accident. Collected the chemical accident history of South Korea from 2002 to 2015 and provide the spatial information to the each accident records based on their address. With the spatial information, compute the data on ArcGIS for the spatial-temporal analysis. The spatial-temporal information of chemical accident is organized by the chemical accident types, damages, and damages on environment and conduct the spatial proximity with local community and environmental receptors. 
We identify the areas of South Korea vulnerable to chemical accidents from 2002 to 2015 and aggregate the vulnerable areas over the whole period to delineate the historically vulnerable areas for chemical accidents in South Korea.
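The proximity step described above can be sketched in pure Python as a minimal stand-in for the ArcGIS proximity analysis, assuming accident and receptor locations are given as (lat, lon) pairs; the function names and the radius used in the example are illustrative, not from the study:

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometres between two (lat, lon) points."""
    dlat = radians(lat2 - lat1)
    dlon = radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * 6371.0 * asin(sqrt(a))  # mean Earth radius ~6371 km

def within_radius(accidents, receptors, km):
    """All (accident, receptor) pairs closer than `km` kilometres --
    a minimal stand-in for a GIS buffer/near proximity analysis."""
    return [(a, r) for a in accidents for r in receptors
            if haversine_km(a[0], a[1], r[0], r[1]) <= km]
```

A GIS package would add projected coordinate systems and polygon buffers; the distance test above is the core of the operation.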
2014-01-01
Background RNA sequencing (RNA-seq) is emerging as a critical approach in biological research. However, its high-throughput advantage is significantly limited by the capacity of bioinformatics tools. The research community urgently needs user-friendly tools to efficiently analyze the complicated data generated by high throughput sequencers. Results We developed a standalone tool with graphic user interface (GUI)-based analytic modules, known as eRNA. The capacity of performing parallel processing and sample management facilitates large data analyses by maximizing hardware usage and freeing users from tediously handling sequencing data. The module “miRNA identification” includes GUIs for raw data reading, adapter removal, sequence alignment, and read counting. The module “mRNA identification” includes GUIs for reference sequences, genome mapping, transcript assembling, and differential expression. The module “Target screening” provides expression profiling analyses and graphic visualization. The module “Self-testing” offers the directory setups, sample management, and a check for third-party package dependency. Integration of other GUIs including Bowtie, miRDeep2, and miRspring extends the program’s functionality. Conclusions eRNA focuses on the common tools required for the mapping and quantification analysis of miRNA-seq and mRNA-seq data. The software package provides an additional choice for scientists who require a user-friendly computing environment and high-throughput capacity for large data analysis. eRNA is available for free download at https://sourceforge.net/projects/erna/?source=directory. PMID:24593312
Yuan, Tiezheng; Huang, Xiaoyi; Dittmar, Rachel L; Du, Meijun; Kohli, Manish; Boardman, Lisa; Thibodeau, Stephen N; Wang, Liang
2014-03-05
RNA sequencing (RNA-seq) is emerging as a critical approach in biological research. However, its high-throughput advantage is significantly limited by the capacity of bioinformatics tools. The research community urgently needs user-friendly tools to efficiently analyze the complicated data generated by high throughput sequencers. We developed a standalone tool with graphic user interface (GUI)-based analytic modules, known as eRNA. The capacity of performing parallel processing and sample management facilitates large data analyses by maximizing hardware usage and freeing users from tediously handling sequencing data. The module "miRNA identification" includes GUIs for raw data reading, adapter removal, sequence alignment, and read counting. The module "mRNA identification" includes GUIs for reference sequences, genome mapping, transcript assembling, and differential expression. The module "Target screening" provides expression profiling analyses and graphic visualization. The module "Self-testing" offers the directory setups, sample management, and a check for third-party package dependency. Integration of other GUIs including Bowtie, miRDeep2, and miRspring extends the program's functionality. eRNA focuses on the common tools required for the mapping and quantification analysis of miRNA-seq and mRNA-seq data. The software package provides an additional choice for scientists who require a user-friendly computing environment and high-throughput capacity for large data analysis. eRNA is available for free download at https://sourceforge.net/projects/erna/?source=directory.
How to design a single-cell RNA-sequencing experiment: pitfalls, challenges and perspectives.
Dal Molin, Alessandra; Di Camillo, Barbara
2018-01-31
The sequencing of the transcriptome of single cells, or single-cell RNA-sequencing, has now become the dominant technology for the identification of novel cell types in heterogeneous cell populations or for the study of stochastic gene expression. In recent years, various experimental methods and computational tools for analysing single-cell RNA-sequencing data have been proposed. However, most of them are tailored to different experimental designs or biological questions, and in many cases, their performance has not been benchmarked yet, thus increasing the difficulty for a researcher to choose the optimal single-cell transcriptome sequencing (scRNA-seq) experiment and analysis workflow. In this review, we aim to provide an overview of the currently available experimental and computational methods developed to handle single-cell RNA-sequencing data and, based on their peculiarities, we suggest possible analysis frameworks depending on specific experimental designs. We also evaluate the challenges, open questions, and future perspectives in the field. In particular, we go through the different steps of scRNA-seq experimental protocols such as cell isolation, messenger RNA capture, reverse transcription, amplification and use of quantitative standards such as spike-ins and Unique Molecular Identifiers (UMIs). We then analyse the current methodological challenges related to preprocessing, alignment, quantification, normalization, batch effect correction and methods to control for confounding effects. © The Author(s) 2018. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
Khakzad, Nima; Khan, Faisal; Amyotte, Paul
2015-07-01
Compared to the remarkable progress in risk analysis of normal accidents, the risk analysis of major accidents has not been so well-established, partly due to the complexity of such accidents and partly due to the low probabilities involved. The issue of low probabilities normally arises from the scarcity of relevant data, since major accidents are few and far between. In this work, knowing that major accidents are frequently preceded by accident precursors, a novel precursor-based methodology has been developed for likelihood modeling of major accidents in critical infrastructures based on a unique combination of accident precursor data, information theory, and approximate reasoning. For this purpose, we have introduced an innovative application of information analysis to identify the most informative near accident of a major accident. The observed data of the near accident were then used to establish predictive scenarios to foresee the occurrence of the major accident. We verified the methodology using offshore blowouts in the Gulf of Mexico, and then demonstrated its application to dam breaches in the United States. © 2015 Society for Risk Analysis.
NASA Astrophysics Data System (ADS)
Wu, Kaizhi; Zhang, Xuming; Chen, Guangxie; Weng, Fei; Ding, Mingyue
2013-10-01
Images acquired in free breathing using contrast-enhanced ultrasound exhibit a periodic motion that needs to be compensated for if accurate quantification of hepatic perfusion is to be performed. In this work, we present an algorithm to compensate for the respiratory motion by effectively combining PCA (Principal Component Analysis) and block matching. The respiratory kinetics of the ultrasound hepatic perfusion image sequences was first extracted using the PCA method. Then, the optimal phase of the obtained respiratory kinetics was detected after normalizing the motion amplitude, and the image subsequences of the original image sequences were determined. The image subsequences were registered by the block matching method using cross-correlation as the similarity measure. Finally, the motion-compensated contrast images were acquired by position mapping, and the algorithm was evaluated by comparing the TICs (time-intensity curves) extracted from the original image sequences and the compensated image subsequences. Quantitative comparisons demonstrated that the average fitting error estimated over ROIs (regions of interest) was reduced from 10.9278 ± 6.2756 to 5.1644 ± 3.3431 after compensation.
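The two stages described, extracting a respiratory signal by PCA and registering frames by cross-correlation block matching, can be sketched as follows. This is a minimal illustration of the general technique, not the authors' implementation; the array shapes, search radius, and function names are assumptions:

```python
import numpy as np

def respiratory_kinetics(frames):
    """Estimate a 1-D respiratory signal from an image sequence via PCA.

    frames: array of shape (T, H, W). Returns the first principal
    component score per frame; its oscillation tracks the dominant
    (here, respiratory) motion in the sequence.
    """
    T = frames.shape[0]
    X = frames.reshape(T, -1).astype(float)
    X -= X.mean(axis=0)                      # centre each pixel over time
    U, S, Vt = np.linalg.svd(X, full_matrices=False)
    return U[:, 0] * S[0]                    # first PC score per frame

def block_match(ref, img, search=5):
    """Integer shift (dy, dx) of img that best aligns it to ref,
    scored by normalized cross-correlation over a small search window."""
    best, best_shift = -np.inf, (0, 0)
    a = ref - ref.mean()
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            shifted = np.roll(np.roll(img, dy, axis=0), dx, axis=1)
            b = shifted - shifted.mean()
            ncc = (a * b).sum() / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12)
            if ncc > best:
                best, best_shift = ncc, (dy, dx)
    return best_shift
```

In practice the matching would be done per block rather than on whole frames, and only on the subsequence of frames selected at the optimal respiratory phase.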
Liu, Yi-Ke; Li, He-Ping; Huang, Tao; Cheng, Wei; Gao, Chun-Sheng; Zuo, Dong-Yun; Zhao, Zheng-Xi; Liao, Yu-Cai
2014-10-29
Wheat-specific ribosomal protein L21 (RPL21) is an endogenous reference gene suitable for genetically modified (GM) wheat identification. This taxon-specific RPL21 sequence displayed high homogeneity in different wheat varieties. Southern blots revealed 1 or 3 copies, and sequence analyses showed one amplicon in common wheat. Combined analyses with sequences from common wheat (AABBDD) and three diploid ancestral species, Triticum urartu (AA), Aegilops speltoides (BB), and Aegilops tauschii (DD), demonstrated the presence of this amplicon in the AA genome. Using conventional qualitative polymerase chain reaction (PCR), the limit of detection was 2 copies of wheat haploid genome per reaction. In the quantitative real-time PCR assay, limits of detection and quantification were about 2 and 8 haploid genome copies, respectively, the latter of which is 2.5-4-fold lower than other reported wheat endogenous reference genes. Construct-specific PCR assays were developed using RPL21 as an endogenous reference gene, and as little as 0.5% of GM wheat contents containing Arabidopsis NPR1 were properly quantified.
PanGEA: identification of allele specific gene expression using the 454 technology.
Kofler, Robert; Teixeira Torres, Tatiana; Lelley, Tamas; Schlötterer, Christian
2009-05-14
Next generation sequencing technologies hold great potential for many biological questions. While mainly used for genomic sequencing, they are also very promising for gene expression profiling. Sequencing of cDNA not only provides an estimate of the absolute expression level, it can also be used for the identification of allele specific gene expression. We developed PanGEA, a tool which enables a fast and user-friendly analysis of allele specific gene expression using the 454 technology. PanGEA allows mapping of 454-ESTs to genes or whole genomes, displaying gene expression profiles, identification of SNPs and the quantification of allele specific gene expression. The intuitive GUI of PanGEA facilitates a flexible and interactive analysis of the data. PanGEA additionally implements a modification of the Smith-Waterman algorithm which deals with incorrect estimates of homopolymer length as occurring in the 454 technology. To our knowledge, PanGEA is the first tool which facilitates the identification of allele specific gene expression. PanGEA is distributed under the Mozilla Public License and available at: http://www.kofler.or.at/bioinformatics/PanGEA
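For reference, the unmodified Smith-Waterman local-alignment recurrence that PanGEA builds on can be sketched as below; PanGEA's homopolymer-aware scoring for 454 reads is a refinement of this scheme and is not reproduced here (the scoring parameters are illustrative defaults):

```python
def smith_waterman(a, b, match=2, mismatch=-1, gap=-2):
    """Smith-Waterman local alignment score by dynamic programming.

    H[i][j] is the best score of any local alignment ending at
    a[i-1], b[j-1]; cells are floored at 0 so alignments can restart.
    """
    rows, cols = len(a) + 1, len(b) + 1
    H = [[0] * cols for _ in range(rows)]
    best = 0
    for i in range(1, rows):
        for j in range(1, cols):
            diag = H[i - 1][j - 1] + (match if a[i - 1] == b[j - 1] else mismatch)
            H[i][j] = max(0, diag, H[i - 1][j] + gap, H[i][j - 1] + gap)
            best = max(best, H[i][j])
    return best
```

A homopolymer-aware variant would charge less for gaps that merely lengthen or shorten a run of identical bases, the dominant 454 error mode.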
PanGEA: Identification of allele specific gene expression using the 454 technology
Kofler, Robert; Teixeira Torres, Tatiana; Lelley, Tamas; Schlötterer, Christian
2009-01-01
Background Next generation sequencing technologies hold great potential for many biological questions. While mainly used for genomic sequencing, they are also very promising for gene expression profiling. Sequencing of cDNA not only provides an estimate of the absolute expression level, it can also be used for the identification of allele specific gene expression. Results We developed PanGEA, a tool which enables a fast and user-friendly analysis of allele specific gene expression using the 454 technology. PanGEA allows mapping of 454-ESTs to genes or whole genomes, displaying gene expression profiles, identification of SNPs and the quantification of allele specific gene expression. The intuitive GUI of PanGEA facilitates a flexible and interactive analysis of the data. PanGEA additionally implements a modification of the Smith-Waterman algorithm which deals with incorrect estimates of homopolymer length as occurring in the 454 technology. Conclusion To our knowledge, PanGEA is the first tool which facilitates the identification of allele specific gene expression. PanGEA is distributed under the Mozilla Public License and available at: PMID:19442283
Comparative Analysis of Single-Cell RNA Sequencing Methods.
Ziegenhain, Christoph; Vieth, Beate; Parekh, Swati; Reinius, Björn; Guillaumet-Adkins, Amy; Smets, Martha; Leonhardt, Heinrich; Heyn, Holger; Hellmann, Ines; Enard, Wolfgang
2017-02-16
Single-cell RNA sequencing (scRNA-seq) offers new possibilities to address biological and medical questions. However, systematic comparisons of the performance of diverse scRNA-seq protocols are lacking. We generated data from 583 mouse embryonic stem cells to evaluate six prominent scRNA-seq methods: CEL-seq2, Drop-seq, MARS-seq, SCRB-seq, Smart-seq, and Smart-seq2. While Smart-seq2 detected the most genes per cell and across cells, CEL-seq2, Drop-seq, MARS-seq, and SCRB-seq quantified mRNA levels with less amplification noise due to the use of unique molecular identifiers (UMIs). Power simulations at different sequencing depths showed that Drop-seq is more cost-efficient for transcriptome quantification of large numbers of cells, while MARS-seq, SCRB-seq, and Smart-seq2 are more efficient when analyzing fewer cells. Our quantitative comparison offers the basis for an informed choice among six prominent scRNA-seq methods, and it provides a framework for benchmarking further improvements of scRNA-seq protocols. Copyright © 2017 Elsevier Inc. All rights reserved.
UV-Visible Spectroscopy-Based Quantification of Unlabeled DNA Bound to Gold Nanoparticles.
Baldock, Brandi L; Hutchison, James E
2016-12-20
DNA-functionalized gold nanoparticles have been increasingly applied as sensitive and selective analytical probes and biosensors. The DNA ligands bound to a nanoparticle dictate its reactivity, making it essential to know the type and number of DNA strands bound to the nanoparticle surface. Existing methods used to determine the number of DNA strands per gold nanoparticle (AuNP) require that the sequences be fluorophore-labeled, which may affect the DNA surface coverage and reactivity of the nanoparticle and/or require specialized equipment and other fluorophore-containing reagents. We report a UV-visible-based method to conveniently and inexpensively determine the number of DNA strands attached to AuNPs of different core sizes. When this method is used in tandem with a fluorescence dye assay, it is possible to determine the ratio of two unlabeled sequences of different lengths bound to AuNPs. Two sizes of citrate-stabilized AuNPs (5 and 12 nm) were functionalized with mixtures of short (5 base) and long (32 base) disulfide-terminated DNA sequences, and the ratios of sequences bound to the AuNPs were determined using the new method. The long DNA sequence was present as a lower proportion of the ligand shell than in the ligand exchange mixture, suggesting it had a lower propensity to bind the AuNPs than the short DNA sequence. The ratio of DNA sequences bound to the AuNPs was not the same for the large and small AuNPs, which suggests that the radius of curvature had a significant influence on the assembly of DNA strands onto the AuNPs.
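The underlying arithmetic of such a UV-visible quantification, molar concentrations from the Beer-Lambert law and their ratio as strands per particle, can be sketched as below. This is a generic illustration, not the paper's exact workup; the extinction coefficients and absorbance values in the example are placeholders:

```python
def strands_per_nanoparticle(a_dna, eps_dna, a_np, eps_np, path_cm=1.0):
    """Mean number of DNA strands per AuNP from two absorbance readings.

    Beer-Lambert: c = A / (eps * l) gives each species' molar
    concentration; their ratio is strands per particle. Generic
    illustration only -- the paper's actual workup (e.g. separating
    the DNA and AuNP contributions to the spectrum) is not reproduced.
    """
    c_dna = a_dna / (eps_dna * path_cm)   # mol/L of DNA strands
    c_np = a_np / (eps_np * path_cm)      # mol/L of nanoparticles
    return c_dna / c_np

# Placeholder values: A260 = 0.5 with eps = 2e5 L/(mol*cm) for the DNA,
# A520 = 0.25 with eps = 1e8 L/(mol*cm) for the AuNP core.
ratio = strands_per_nanoparticle(0.5, 2e5, 0.25, 1e8)
```

With these placeholder numbers the ratio works out to 1000 strands per particle; real AuNP extinction coefficients depend strongly on core size, which is why the method must be calibrated per particle size.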
An optimized protocol for generation and analysis of Ion Proton sequencing reads for RNA-Seq.
Yuan, Yongxian; Xu, Huaiqian; Leung, Ross Ka-Kit
2016-05-26
Previous studies compared the running cost, time and other performance measures of popular sequencing platforms. However, a comprehensive assessment of library construction and analysis protocols for the Proton sequencing platform remains unexplored. Unlike Illumina sequencing platforms, Proton reads are heterogeneous in length and quality; when sequencing data from different platforms are combined, this results in reads of varying length. Whether the commonly used software performs satisfactorily on such data is unknown. Using universal human reference RNA as the initial material, RNase III and chemical fragmentation methods in library construction showed similar results in the number of genes and junctions discovered and in expression-level estimation accuracy. In contrast, sequencing quality, read length and the choice of software affected the mapping rate to a much larger extent. The unspliced aligner TMAP attained the highest mapping rate (97.27 % to genome, 86.46 % to transcriptome), though 47.83 % of mapped reads were clipped. Long reads could paradoxically reduce mapping at junctions. With a reference annotation guide, the mapping rate of TopHat2 significantly increased from 75.79 to 92.09 %, especially for long (>150 bp) reads. Sailfish, a k-mer based gene expression quantifier, attained results highly consistent with those of the TaqMan array, and the highest sensitivity. We provide, for the first time, reference statistics on library preparation methods, gene detection and quantification, and junction discovery for RNA-Seq on the Ion Proton platform. Chemical fragmentation performed equally well as the enzyme-based method. The optimal Ion Proton sequencing options and analysis software have been evaluated.
Li, Meng; Ford, Tim; Li, Xiaoyan; Gu, Ji-Dong
2011-04-15
A newly designed primer set (AnnirS), together with a previously published primer set (ScnirS), was used to detect anammox bacterial nirS genes from sediments collected from three marine environments. Phylogenetic analysis demonstrated that all retrieved sequences were clearly different from typical denitrifiers' nirS, but group together with the known anammox bacterial nirS. Sequences targeted by ScnirS are closely related to Scalindua nirS genes recovered from the Peruvian oxygen minimum zone (OMZ), whereas sequences targeted by AnnirS are more closely affiliated with the nirS of Candidatus 'Kuenenia stuttgartiensis' and even form a new phylogenetic nirS clade, which might be related to other genera of the anammox bacteria. Analysis demonstrated that retrieved sequences had higher sequence identities (>60%) with known anammox bacterial nirS genes than with denitrifiers' nirS, at both the nucleotide and amino acid levels. Compared to the 16S rRNA and hydrazine oxidoreductase (hzo) genes, the anammox bacterial nirS not only showed consistent phylogenetic relationships but also afforded more reliable quantification of anammox bacteria, because of the single copy of the nirS gene in the anammox bacterial genome and the specificity of the PCR primers for different genera of anammox bacteria, thus providing a suitable functional biomarker for investigation of anammox bacteria.
In-vessel coolability and retention of a core melt
DOE Office of Scientific and Technical Information (OSTI.GOV)
Theofanous, T.G.; Liu, C.; Additon, S.
1997-02-01
The efficacy of external flooding of a reactor vessel as a severe accident management strategy is assessed for an AP600-like reactor design. The overall approach is based on the Risk Oriented Accident Analysis Methodology (ROAAM), and the assessment includes consideration of bounding scenarios and sensitivity studies, as well as arbitrary parametric evaluations that allow the delineation of the failure boundaries. The technical treatment in this assessment includes: (a) new data on energy flow from either volumetrically heated pools or non-heated layers on top, boiling and critical heat flux in inverted, curved geometries, emissivity of molten (superheated) samples of steel, and chemical reactivity proof tests, (b) a simple but accurate mathematical formulation that allows prediction of thermal loads by means of convenient hand calculations, (c) a detailed model programmed on the computer to sample input parameters over the uncertainty ranges, and to produce probability distributions of thermal loads and margins for departure from nucleate boiling at each angular position on the lower head, and (d) detailed structural evaluations that demonstrate that departure from nucleate boiling is a necessary and sufficient criterion for failure. Quantification of the input parameters is carried out for an AP600-like design, and the results of the assessment demonstrate that lower head failure is "physically unreasonable." Use of this conclusion for any specific application is subject to verifying the required reliability of the depressurization and cavity-flooding systems, and to showing the appropriateness (in relation to the database presented here, or by further testing as necessary) of the thermal insulation design and of the external surface properties of the lower head, including any applicable coatings.
Mencia-Trinchant, Nuria; Hu, Yang; Alas, Maria Antonina; Ali, Fatima; Wouters, Bas J; Lee, Sangmin; Ritchie, Ellen K; Desai, Pinkal; Guzman, Monica L; Roboz, Gail J; Hassane, Duane C
2017-07-01
The presence of minimal residual disease (MRD) is widely recognized as a powerful predictor of therapeutic outcome in acute myeloid leukemia (AML), but methods of measurement and quantification of MRD in AML are not yet standardized in clinical practice. There is an urgent, unmet need for robust and sensitive assays that can be readily adopted as real-time tools for disease monitoring. NPM1 frameshift mutations are an established MRD marker present in half of patients with cytogenetically normal AML. However, detection is complicated by the existence of hundreds of potential frameshift insertions, clonal heterogeneity, and absence of sequence information when the NPM1 mutation is identified using capillary electrophoresis. Thus, some patients are ineligible for NPM1 MRD monitoring. Furthermore, a subset of patients with NPM1-mutated AML will have false-negative MRD results because of clonal evolution. To simplify and improve MRD testing for NPM1, we present a novel digital PCR technique composed of massively multiplex pools of insertion-specific primers that selectively detect mutated but not wild-type NPM1. By measuring reaction end points using digital PCR technology, the resulting single assay enables sensitive and specific quantification of most NPM1 exon 12 mutations in a manner that is robust to clonal heterogeneity, does not require NPM1 sequence information, and obviates the need for maintenance of hundreds of type-specific assays and associated plasmid standards. Copyright © 2017 American Society for Investigative Pathology and the Association for Molecular Pathology. Published by Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Petre, Brînduşa-Alina; Ulrich, Martina; Stumbaum, Mihaela; Bernevic, Bogdan; Moise, Adrian; Döring, Gerd; Przybylski, Michael
2012-11-01
Tyrosine nitration in proteins occurs under physiologic conditions and is increased at disease conditions associated with oxidative stress, such as inflammation and Alzheimer's disease. Identification and quantification of tyrosine-nitrations are crucial for understanding nitration mechanism(s) and their functional consequences. Mass spectrometry (MS) is best suited to identify nitration sites, but is hampered by low stabilities and modification levels and possible structural changes induced by nitration. In this insight, we discuss methods for identifying and quantifying nitration sites by proteolytic affinity extraction using nitrotyrosine (NT)-specific antibodies, in combination with electrospray-MS. The efficiency of this approach is illustrated by identification of specific nitration sites in two proteins in eosinophil granules from several biological samples, eosinophil-cationic protein (ECP) and eosinophil-derived neurotoxin (EDN). Affinity extraction combined with Edman sequencing enabled the quantification of nitration levels, which were found to be 8 % and 15 % for ECP and EDN, respectively. Structure modeling utilizing available crystal structures and affinity studies using synthetic NT-peptides suggest a tyrosine nitration sequence motif comprising positively charged residues in the vicinity of the NT-residue, located at specific surface-accessible sites of the protein structure. Affinities of Tyr-nitrated peptides from ECP and EDN to NT-antibodies, determined by online bioaffinity-MS, provided nanomolar KD values. In contrast, false-positive identifications of nitrations were obtained in proteins from cystic fibrosis patients upon using NT-specific antibodies, and were shown to be hydroxy-tyrosine modifications. These results demonstrate affinity-mass spectrometry approaches to be essential for unequivocal identification of biological tyrosine nitrations.
NASA Astrophysics Data System (ADS)
Schepkin, Victor D.; Neubauer, Andreas; Nagel, Armin M.; Budinger, Thomas F.
2017-04-01
Potassium and sodium specific binding in vivo were explored at 21.1 T by triple quantum (TQ) magnetic resonance (MR) signals without filtration to achieve high sensitivities and precise quantifications. The pulse sequence used time proportional phase increments (TPPI). During simultaneous phase-time increments, it provided total single quantum (SQ) and TQ MR signals in the second dimension at single and triple quantum frequencies, respectively. The detection of both TQ and SQ signals was performed at identical experimental conditions and the resulting TQ signal equals 60 ± 3% of the SQ signal when all ions experience sufficient time for binding. In a rat head in vivo the TQ percentage relative to SQ for potassium is 41.5 ± 3% and for sodium is 16.1 ± 1%. These percentages were compared to the matching values in an agarose tissue model with MR relaxation times similar to those of mammalian brain tissue. The sodium TQ signal in agarose samples decreased in the presence of potassium, suggesting a competitive binding of potassium relative to sodium ions for the same binding sites. The TQTPPI signals correspond to almost two times more effective binding of potassium than sodium. In vivo, up to ∼69% of total potassium and ∼27% of total sodium can be regarded as bound or experiencing an association time in the range of several milliseconds. Experimental data analyses show that more than half of the in vivo total sodium TQ signal could be from extracellular space, which is an important factor for quantification of intracellular MR signals.
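The bound-fraction estimates quoted above follow from dividing the in vivo TQ/SQ percentage by the fully bound reference value of 60%, assuming the TQ/SQ ratio scales linearly with the fraction of ions bound (the linear scaling is an assumption of this sketch, but it reproduces the ~69% and ~27% figures in the abstract):

```python
def bound_fraction(tq_sq_percent, fully_bound_percent=60.0):
    """Fraction of ions bound, assuming the TQ/SQ ratio grows linearly
    with the bound fraction and reaches 60% of SQ when all ions
    experience sufficient time for binding (the reference condition
    reported above)."""
    return tq_sq_percent / fully_bound_percent

potassium = bound_fraction(41.5)  # ~0.69, the ~69% quoted for potassium
sodium = bound_fraction(16.1)     # ~0.27, the ~27% quoted for sodium
```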
Prakash, Kasthuri; Rydell, Gustaf E; Larsson, Simon B; Andersson, Maria; Norkrans, Gunnar; Norder, Heléne; Lindh, Magnus
2018-05-15
Hepatocytes infected by hepatitis B virus (HBV) produce different HBV RNA species, including pregenomic RNA (pgRNA), which is reverse transcribed during replication. Particles containing HBV RNA are present in serum of infected individuals, and quantification of this HBV RNA could be clinically useful. In a retrospective study of 95 patients with chronic HBV infection, we characterised HBV RNA in serum in terms of concentration, particle association and sequence. HBV RNA was detected by real-time PCR at levels almost as high as HBV DNA. The HBV RNA was protected from RNase and it was found in particles of similar density as particles containing HBV DNA after fractionation on a Nycodenz gradient. Sequencing the epsilon region of the RNA did not reveal mutations that would preclude its binding to the viral polymerase before encapsidation. Specific quantification of precore RNA and pgRNA by digital PCR showed almost seven times lower ratio of precore RNA/pgRNA in serum than in liver tissue, which corresponds to poorer encapsidation of this RNA as compared with pgRNA. The serum ratio between HBV DNA and HBV RNA was higher in genotype D as compared with other genotypes. The results suggest that HBV RNA in serum is present in viral particles with failing reverse transcription activity, which are produced at almost as high rates as viral particles containing DNA. The results encourage further studies of the mechanisms by which these particles are produced, the impact of genotype, and the potential clinical utility of quantifying HBV RNA in serum.
Roussel, Nicolas; Sprenger, Jeff; Tappan, Susan J; Glaser, Jack R
2014-01-01
The behavior of the well-characterized nematode, Caenorhabditis elegans (C. elegans), is often used to study the neurologic control of sensory and motor systems in models of health and neurodegenerative disease. To advance behavioral quantification to match the breakthroughs made in genetics, RNA, proteins, and neuronal circuitry, analysis must be able to extract subtle changes in worm locomotion across a population. The analysis of worm crawling motion is complex due to self-overlap, coiling, and entanglement. Using current techniques, the scope of the analysis is typically restricted to worms in their non-occluded, uncoiled state, which is incomplete and fundamentally biased. Using a model describing the worm shape and crawling motion, we designed a deformable shape estimation algorithm that is robust to coiling and entanglement. This model-based shape estimation algorithm has been incorporated into a framework where multiple worms can be automatically detected and tracked simultaneously throughout the entire video sequence, thereby increasing throughput as well as data validity. The newly developed algorithms were validated against 10 manually labeled datasets obtained from video sequences comprising various image resolutions and video frame rates. The data presented demonstrate that the tracking methods incorporated in WormLab enable stable and accurate detection of worms through coiling and entanglement. Such challenging tracking scenarios are common occurrences during normal worm locomotion. The ability of the described approach to provide stable and accurate detection of C. elegans is critical to achieve unbiased locomotory analysis of worm motion. PMID:26435884
Kang, Kang; Zhang, Xiaoying; Liu, Hongtao; Wang, Zhiwei; Zhong, Jiasheng; Huang, Zhenting; Peng, Xiao; Zeng, Yan; Wang, Yuna; Yang, Yi; Luo, Jun; Gou, Deming
2012-01-01
Background MicroRNAs (miRNAs) are small, non-coding RNAs capable of post-transcriptionally regulating gene expression. Accurate expression profiling is crucial for understanding the biological roles of miRNAs, and exploring them as biomarkers of diseases. Methodology/Principal Findings A novel, highly sensitive, and reliable miRNA quantification approach, termed the S-Poly(T) miRNA assay, is designed. In this assay, miRNAs are subjected to polyadenylation and reverse transcription with an S-Poly(T) primer that contains a universal reverse primer, a universal TaqMan probe, an oligo(dT)11 sequence and six miRNA-specific bases. Individual miRNAs are then amplified by a specific forward primer and a universal reverse primer, and the PCR products are detected by a universal TaqMan probe. The S-Poly(T) assay showed a minimum of 4-fold increase in sensitivity as compared with the stem-loop or poly(A)-based methods. A remarkable specificity in discriminating among miRNAs with high sequence similarity was also obtained with this approach. Using this method, we profiled miRNAs in human pulmonary arterial smooth muscle cells (HPASMC) and identified 9 differentially expressed miRNAs associated with hypoxia treatment. Due to its outstanding sensitivity, the number of circulating miRNAs detected in normal human serum was significantly expanded from 368 to 518. Conclusions/Significance With excellent sensitivity, specificity, and high throughput, the S-Poly(T) method provides a powerful tool for miRNA quantification and identification of tissue- or disease-specific miRNA biomarkers. PMID:23152780
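The S-Poly(T) primer architecture described, a universal tail, an oligo(dT)11 stretch, and six miRNA-specific bases, can be illustrated schematically. The tail below is a placeholder (the published universal reverse primer and TaqMan probe sequences are not reproduced), and the example miRNA is let-7a-5p; only the assembly logic is shown:

```python
def revcomp_dna_of_rna(rna):
    """DNA reverse complement of an RNA sequence (U pairs with A)."""
    pair = {"A": "T", "C": "G", "G": "C", "U": "A"}
    return "".join(pair[b] for b in reversed(rna))

def s_poly_t_rt_primer(mirna, universal_tail="<TAIL>"):
    """Schematic S-Poly(T)-style RT primer: a universal 5' tail
    (placeholder -- the published tail carrying the universal reverse
    primer and TaqMan probe sites is not reproduced), an oligo(dT)11
    stretch annealing to the added poly(A) tail, and six DNA bases
    complementary to the miRNA's 3' end for specificity."""
    specific = revcomp_dna_of_rna(mirna[-6:])
    return universal_tail + "T" * 11 + specific

# let-7a-5p: the primer's 3' end pairs with the miRNA's last six bases.
primer = s_poly_t_rt_primer("UGAGGUAGUAGGUUGUAUAGUU", universal_tail="")
```

The six 3'-terminal bases are what distinguish near-identical family members, which is the source of the assay's reported specificity.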
van Zwieten, Rob; Veldthuis, Martijn; Delzenne, Barend; Berghuis, Jeffrey; Groen, Joke; Ait Ichou, Fatima; Clifford, Els; Harteveld, Cornelis L; Stroobants, An K
2014-01-01
More than 20,000 blood samples of individuals living in The Netherlands and suspected of having hemolytic anemia or diabetes were analyzed by high resolution cation exchange high performance liquid chromatography (HPLC). Besides common disease-related hemoglobins (Hbs), rare variants were also detected. The variant Hbs were retrospectively analyzed by capillary zone electrophoresis (CZE) and by isoelectric focusing (IEF). For unambiguous identification, the globin genes were sequenced. Most of the 80 Hb variants detected by initial screening on HPLC were also separated by capillary electrophoresis (CE), but a few variants were only detectable with one of these methods. Some variants were unstable, had thalassemic properties or increased oxygen affinity, and some interfered with Hb A2 measurement, detection of sickle cell Hb, or Hb A1c quantification. Two of the six novel variants, Hb Enschede (HBA2: c.308G > A, p.Ser103Asn) and Hb Weesp (HBA1: c.301C > T, p.Leu101Phe), had no clinical consequences. In contrast, two others appeared clinically significant: Hb Ede (HBB: c.53A > T, p.Lys18Met) caused thalassemia and Hb Waterland (HBB: c.428C > T, p.Ala143Val) was related to mild polycythemia. Hb A2-Venlo (HBD: c.193G > A, p.Gly65Ser) and Hb A2-Rotterdam (HBD: c.38A > C, p.Asn13Thr) interfered with Hb A2 quantification. This survey shows that HPLC analysis followed by globin gene sequencing of rare variants is an effective method to reveal Hb variants.
Cost analysis of whole genome sequencing in German clinical practice.
Plöthner, Marika; Frank, Martin; von der Schulenburg, J-Matthias Graf
2017-06-01
Whole genome sequencing (WGS) is an emerging tool in clinical diagnostics. However, little has been said about its procedure costs, owing to a dearth of related cost studies. This study helps fill this research gap by analyzing the execution costs of WGS within the setting of German clinical practice. First, to estimate costs, a sequencing process reflecting clinical practice was mapped out. Once relevant resources were identified, a quantification and monetary evaluation was conducted using data and information from expert interviews with clinical geneticists and with personnel at private enterprises and hospitals. This study focuses on identifying the costs associated with the standard sequencing process, and the procedure costs for a single WGS were analyzed on the basis of two sequencing platforms, namely the HiSeq 2500 and the HiSeq X Ten, both by Illumina, Inc. In addition, sensitivity analyses were performed to assess the influence of varying degrees of platform utilization and varying coverage values on fixed-cost degression. In the base-case scenario, which features 80% utilization and 30-fold coverage, the cost of a single WGS analysis on the HiSeq 2500 was estimated at €3858.06: sequencing materials accounted for €2848.08, personnel costs for €396.94, and acquisition/maintenance costs for €607.39. In comparison, sequencing with the latest technology (the HiSeq X Ten) was approximately 63% cheaper, at €1411.20. The estimated costs of WGS currently exceed the predicted 'US$1000 per genome' by more than a factor of 3.8; indeed, the material costs alone exceed this predicted cost.
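The fixed-cost degression examined in the sensitivity analyses can be sketched numerically. The split into per-sample costs (materials, personnel) and utilization-dependent costs (acquisition/maintenance) is an assumption for illustration only; the euro figures are the base-case components reported above:

```python
# Hedged sketch of per-sample WGS cost under varying platform utilization.
# The cost split is an illustrative assumption, not the authors' exact model.
MATERIALS = 2848.08   # EUR per genome (HiSeq 2500, 30-fold coverage)
PERSONNEL = 396.94    # EUR per genome
EQUIPMENT = 607.39    # EUR per genome at 80% platform utilization

def cost_per_genome(utilization: float) -> float:
    """Fixed-cost degression: the equipment share per sample falls as
    platform utilization rises (0 < utilization <= 1)."""
    return MATERIALS + PERSONNEL + EQUIPMENT * 0.8 / utilization

print(round(cost_per_genome(0.8), 2))  # base-case utilization
print(round(cost_per_genome(0.4), 2))  # halving utilization doubles the equipment share
```

Under this simple split, only the acquisition/maintenance component is sensitive to utilization, which is why higher sample throughput drives the per-genome cost down.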
Utro, Filippo; Di Benedetto, Valeria; Corona, Davide F V; Giancarlo, Raffaele
2016-03-15
Thanks to research spanning nearly 30 years, two major models have emerged that account for nucleosome organization in chromatin: statistical and sequence-specific. The former is based on elegant, easy-to-compute, closed-form mathematical formulas that make no assumptions about the physical and chemical properties of the underlying DNA sequence; moreover, they need no training on the data for their computation. The latter is based on certain sequence regularities but, as opposed to the statistical model, lacks the same type of closed-form formulas that, in this case, should be based on the DNA sequence only. We contribute to closing this important methodological gap between the two models by providing three very simple formulas for the sequence-specific one. They are all based on well-known formulas in computer science and bioinformatics, and they give different quantifications of how complex a sequence is. In view of how remarkably well they perform, it is very surprising that measures of sequence complexity have not previously been considered as candidates to close the mentioned gap. We provide experimental evidence that the intrinsic level of combinatorial organization and information-theoretic content of subsequences within a genome is strongly correlated with the level of DNA-encoded nucleosome organization discovered by Kaplan et al. Our results establish an important connection between the intrinsic complexity of subsequences in a genome and the intrinsic, i.e., DNA-encoded, nucleosome organization of eukaryotic genomes. This is a first step towards a mathematical characterization of this latter 'encoding'. Supplementary data are available at Bioinformatics online.
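As an illustration of the kind of training-free, closed-form complexity measure the abstract alludes to, the following sketch computes the Shannon entropy of a sequence's k-mer distribution. This is a generic textbook measure, not necessarily one of the paper's three formulas:

```python
from collections import Counter
from math import log2

def kmer_entropy(seq: str, k: int = 2) -> float:
    """Shannon entropy (bits) of the k-mer distribution of `seq`:
    one simple, training-free quantification of sequence complexity."""
    kmers = [seq[i:i + k] for i in range(len(seq) - k + 1)]
    counts = Counter(kmers)
    n = len(kmers)
    return -sum(c / n * log2(c / n) for c in counts.values())

# A homopolymer run is minimally complex; a mixed sequence is not.
print(kmer_entropy("AAAAAAAAAA"))  # 0.0
print(kmer_entropy("ACGTACGTAC") > kmer_entropy("AAAAAAAAAA"))  # True
```

Measures of this family are attractive for the stated purpose precisely because, like the statistical model's formulas, they depend on the DNA sequence alone and need no fitting.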
Browning, J.V.; Miller, K.G.; McLaughlin, P.P.; Kominz, M.A.; Sugarman, P.J.; Monteverde, D.; Feigenson, M.D.; Hernandez, J.C.
2006-01-01
We use backstripping to quantify the roles of variations in global sea level (eustasy), subsidence, and sediment supply in the development of the Miocene stratigraphic record of the mid-Atlantic continental margin of the United States (New Jersey, Delaware, and Maryland). Eustasy is a primary influence on sequence patterns, determining the global template of sequences (i.e., times when sequences can be preserved) and explaining similarities in Miocene sequence architecture on margins throughout the world. Sequences can be correlated throughout the mid-Atlantic region with Sr-isotopic chronology (±0.6 m.y. to ±1.2 m.y.). Eight Miocene sequences correlate regionally and can be correlated to global δ18O increases, indicating glacioeustatic control. This margin is dominated by passive subsidence with little evidence for active tectonic overprints, except possibly in Maryland during the early Miocene. However, early Miocene sequences in New Jersey and Delaware display a patchwork distribution that is attributable to minor (tens of meters) intervals of excess subsidence. Backstripping shows that excess subsidence began in Delaware at ca. 21 Ma and continued until 12 Ma, with maximum rates from ca. 21-16 Ma. We attribute this enhanced subsidence to a local flexural response to the progradation of thick sequences offshore and adjacent to this area. Removing this excess subsidence in Delaware yields a record that is remarkably similar to New Jersey eustatic estimates. We conclude that sea-level rise and fall is a first-order control on accommodation, providing similar timing on all margins to the sequence record. Tectonic changes due to movement of the crust can overprint the record, resulting in large gaps in the stratigraphic record. Smaller differences in sequences can be attributed to local flexural loading effects, particularly in regions experiencing large-scale progradation. © 2006 Geological Society of America.
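The backstripping step can be sketched with the textbook one-dimensional Airy-isostatic formula (the Steckler and Watts form). The density constants are standard assumed values, and this is a minimal sketch rather than the authors' exact implementation:

```python
# Textbook 1-D Airy backstripping sketch; densities are assumed constants.
RHO_MANTLE = 3330.0  # kg/m^3
RHO_WATER = 1030.0   # kg/m^3

def tectonic_subsidence(s: float, rho_sed: float,
                        wd: float = 0.0, dsl: float = 0.0) -> float:
    """Remove the sediment load from decompacted thickness `s` (m) to
    isolate tectonic subsidence. `rho_sed` is the bulk sediment density,
    `wd` the paleo-water depth, `dsl` the eustatic sea-level offset (m)."""
    f = (RHO_MANTLE - rho_sed) / (RHO_MANTLE - RHO_WATER)
    return s * f + wd - dsl * RHO_MANTLE / (RHO_MANTLE - RHO_WATER)

# e.g. 1000 m of sediment with bulk density 2300 kg/m^3, zero water depth
print(round(tectonic_subsidence(1000.0, 2300.0), 1))  # 447.8
```

In the study above, the residual subsidence left after removing the sediment load and the eustatic signal is what isolates the "excess subsidence" attributed to flexural loading.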
Absolute quantification of microbial taxon abundances.
Props, Ruben; Kerckhof, Frederiek-Maarten; Rubbens, Peter; De Vrieze, Jo; Hernandez Sanabria, Emma; Waegeman, Willem; Monsieurs, Pieter; Hammes, Frederik; Boon, Nico
2017-02-01
High-throughput amplicon sequencing has become a well-established approach for microbial community profiling. Correlating shifts in the relative abundances of bacterial taxa with environmental gradients is the goal of many microbiome surveys. As the abundances generated by this technology are semi-quantitative by definition, the observed dynamics may not accurately reflect those of the actual taxon densities. We combined the sequencing approach (16S rRNA gene) with robust single-cell enumeration technologies (flow cytometry) to quantify the absolute taxon abundances. A detailed longitudinal analysis of the absolute abundances resulted in distinct abundance profiles that were less ambiguous and expressed in units that can be directly compared across studies. We further provide evidence that the enrichment of taxa (increase in relative abundance) does not necessarily relate to the outgrowth of taxa (increase in absolute abundance). Our results highlight that both relative and absolute abundances should be considered for a comprehensive biological interpretation of microbiome surveys.
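The core conversion, and the enrichment-versus-outgrowth distinction it enables, can be sketched as follows. This is a simplified illustration of the idea: 16S copy-number and detection biases are ignored:

```python
def absolute_abundances(rel_abund: dict, total_density: float) -> dict:
    """Convert relative 16S read fractions to absolute taxon densities
    (cells/mL) by scaling with a flow-cytometric total cell count.
    Simplified sketch: copy-number and detection biases are ignored."""
    assert abs(sum(rel_abund.values()) - 1.0) < 1e-6
    return {taxon: frac * total_density for taxon, frac in rel_abund.items()}

# Enrichment without outgrowth: taxon A's relative share rises between
# two time points even though its absolute density stays flat, because
# the total community density drops.
t1 = absolute_abundances({"A": 0.2, "B": 0.8}, 1e6)
t2 = absolute_abundances({"A": 0.4, "B": 0.6}, 5e5)
print(t1["A"], t2["A"])  # both ≈ 2.0e5 cells/mL
```

The worked numbers show why relative abundances alone can mislead: A doubles its relative share without any actual growth.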
Efficient robust reconstruction of dynamic PET activity maps with radioisotope decay constraints.
Gao, Fei; Liu, Huafeng; Shi, Pengcheng
2010-01-01
Dynamic PET imaging performs a sequence of data acquisitions in order to visualize and quantify physiological changes in specific tissues and organs. Reconstruction of the activity maps is generally the first step in dynamic PET. State-space H∞ approaches have proven robust for PET image reconstruction; however, they do not consider temporal constraints during the reconstruction process. In addition, state-space strategies for PET image reconstruction have been computationally prohibitive for practical use because of the need for matrix inversion. In this paper, we present a minimax formulation of the dynamic PET imaging problem in which a radioisotope decay model is employed as a physics-based temporal constraint on the photon counts. Furthermore, a robust steady-state H∞ filter is developed to significantly improve computational efficiency with minimal loss of accuracy. Experiments are conducted on Monte Carlo simulated image sequences for quantitative analysis and validation.
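The radioisotope decay model used as the temporal constraint reduces to simple exponential decay. The H∞ filtering itself is not reproduced here, and the F-18 half-life is an illustrative value:

```python
from math import exp, log

def decayed_activity(a0: float, t_min: float, half_life_min: float) -> float:
    """Radioisotope decay model usable as a temporal constraint on photon
    counts: A(t) = A0 * exp(-lambda * t), with lambda = ln 2 / T_half."""
    lam = log(2) / half_life_min
    return a0 * exp(-lam * t_min)

# After one half-life the expected activity halves (F-18: ~109.77 min).
print(decayed_activity(100.0, 109.77, 109.77))  # ≈ 50.0
```

Constraining successive frames to this curve ties the reconstructed activity maps together in time instead of treating each frame independently.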
Markerless video analysis for movement quantification in pediatric epilepsy monitoring.
Lu, Haiping; Eng, How-Lung; Mandal, Bappaditya; Chan, Derrick W S; Ng, Yen-Ling
2011-01-01
This paper proposes a markerless video analytics system for quantifying body-part movements in pediatric epilepsy monitoring. The system uses colored pajamas worn by a patient in bed to extract body-part movement trajectories, from which various features can be derived for seizure detection and analysis. It is therefore non-intrusive and requires no sensor or marker to be attached to the patient's body. The system takes raw video sequences as input, and a simple user initialization indicates the body parts to be examined. For background/foreground modeling, Gaussian mixture models are employed in conjunction with HSV-based modeling. Body-part detection follows a coarse-to-fine paradigm with graph-cut-based segmentation. Finally, body-part parameters are estimated with domain-knowledge guidance. Experimental studies are reported on sequences captured in an Epilepsy Monitoring Unit at a local hospital. The results demonstrate the feasibility of the proposed system for pediatric epilepsy monitoring and seizure detection.