Sample records for specific method based

  1. Development of performance-based evaluation methods and specifications for roadside maintenance.

    DOT National Transportation Integrated Search

    2011-01-01

    This report documents the work performed during Project 0-6387, Performance-Based Roadside Maintenance Specifications. Quality assurance methods and specifications for roadside performance-based maintenance contracts (PBMCs) were developed ...

  2. SB certification handout material requirements, test methods, responsibilities, and minimum classification levels for mixture-based specification for flexible base.

    DOT National Transportation Integrated Search

    2012-10-01

    A handout with tables representing the material requirements, test methods, responsibilities, and minimum classification levels for the mixture-based specification for flexible base, with details on the aggregate and test methods employed, along with agency and co...

  3. System and method for deriving a process-based specification

    NASA Technical Reports Server (NTRS)

    Hinchey, Michael Gerard (Inventor); Rouff, Christopher A. (Inventor); Rash, James Larry (Inventor)

    2009-01-01

    A system and method for deriving a process-based specification for a system is disclosed. The process-based specification is mathematically inferred from a trace-based specification. The trace-based specification is derived from a non-empty set of traces or natural language scenarios. The process-based specification is mathematically equivalent to the trace-based specification. Code is generated, if applicable, from the process-based specification. A process, or phases of a process, using the features disclosed can be reversed and repeated to allow for an interactive development and modification of legacy systems. The process is applicable to any class of system, including, but not limited to, biological and physical systems, electrical and electro-mechanical systems in addition to software, hardware and hybrid hardware-software systems.

  4. Systems, methods and apparatus for verification of knowledge-based systems

    NASA Technical Reports Server (NTRS)

    Rash, James L. (Inventor); Gracinin, Denis (Inventor); Erickson, John D. (Inventor); Rouff, Christopher A. (Inventor); Hinchey, Michael G. (Inventor)

    2010-01-01

    Systems, methods and apparatus are provided through which in some embodiments, domain knowledge is translated into a knowledge-based system. In some embodiments, a formal specification is derived from rules of a knowledge-based system, the formal specification is analyzed, and flaws in the formal specification are used to identify and correct errors in the domain knowledge, from which a knowledge-based system is translated.

  5. Methods for chromosome-specific staining

    DOEpatents

    Gray, Joe W.; Pinkel, Daniel

    1995-01-01

    Methods and compositions for chromosome-specific staining are provided. Compositions comprise heterogeneous mixtures of labeled nucleic acid fragments having substantially complementary base sequences to unique sequence regions of the chromosomal DNA for which their associated staining reagent is specific. Methods include methods for making the chromosome-specific staining compositions of the invention, and methods for applying the staining compositions to chromosomes.

  6. Methods and compositions for chromosome-specific staining

    DOEpatents

    Gray, Joe W.; Pinkel, Daniel

    2003-07-22

    Methods and compositions for chromosome-specific staining are provided. Compositions comprise heterogeneous mixtures of labeled nucleic acid fragments having substantially complementary base sequences to unique sequence regions of the chromosomal DNA for which their associated staining reagent is specific. Methods include methods for making the chromosome-specific staining compositions of the invention, and methods for applying the staining compositions to chromosomes.

  7. Irrigation scheduling as affected by field capacity and wilting point water content from different data sources

    USDA-ARS?s Scientific Manuscript database

    Soil water content at field capacity and at wilting point is critical information for irrigation scheduling, whether a soil water sensor-based method (SM) or an evapotranspiration (ET)-based method is used. Both methods require knowledge of site-specific and soil-specific Management Allowable De...

  8. Performance and Specificity of the Covalently Linked Immunomagnetic Separation-ATP Method for Rapid Detection and Enumeration of Enterococci in Coastal Environments

    PubMed Central

    Zimmer-Faust, Amity G.; Thulsiraj, Vanessa; Ferguson, Donna

    2014-01-01

    The performance and specificity of the covalently linked immunomagnetic separation-ATP (Cov-IMS/ATP) method for the detection and enumeration of enterococci was evaluated in recreational waters. Cov-IMS/ATP performance was compared with standard methods: defined substrate technology (Enterolert; IDEXX Laboratories), membrane filtration (EPA Method 1600), and an Enterococcus-specific quantitative PCR (qPCR) assay (EPA Method A). We extend previous studies by (i) analyzing the stability of the relationship between the Cov-IMS/ATP method and culture-based methods at different field sites, (ii) evaluating specificity of the assay for seven ATCC Enterococcus species, (iii) identifying cross-reacting organisms binding the antibody-bead complexes with 16S rRNA gene sequencing and evaluating specificity of the assay to five nonenterococcus species, and (iv) conducting preliminary tests of preabsorption as a means of improving the assay. Cov-IMS/ATP was found to perform consistently and with strong agreement rates (based on exceedance/compliance with regulatory limits) of between 83% and 100% compared to the culture-based Enterolert method at a variety of sites with complex inputs. The Cov-IMS/ATP method is specific to five of seven different Enterococcus spp. tested. However, there is potential for nontarget bacteria to bind the antibody, which may be reduced by purification of the IgG serum with preabsorption at problematic sites. The findings of this study help to validate the Cov-IMS/ATP method, suggesting a predictable relationship between the Cov-IMS/ATP method and traditional culture-based methods, which will allow for more widespread application of this rapid and field-portable method for coastal water quality assessment. PMID:24561583

  9. A Time-Space Domain Information Fusion Method for Specific Emitter Identification Based on Dempster-Shafer Evidence Theory.

    PubMed

    Jiang, Wen; Cao, Ying; Yang, Lin; He, Zichang

    2017-08-28

    Specific emitter identification plays an important role in contemporary military affairs. However, most existing specific emitter identification methods do not take the processing of uncertain information into account. This paper therefore proposes a time-space domain information fusion method based on Dempster-Shafer evidence theory, which can handle uncertain information during specific emitter identification. Each radar generates a group of evidence based on the information it obtains, and the main task is to fuse the multiple groups of evidence into a reasonable result. Within the framework of a recursive centralized fusion model, the proposed method incorporates a correlation coefficient, which measures the relevance between bodies of evidence, and a quantum mechanical approach based on the parameters of the radar itself. Simulation results for an illustrative example demonstrate that the proposed method can effectively deal with uncertain information and reach a reasonable recognition result.
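
    As a point of reference for the fusion step described above, the sketch below (Python, illustrative values only) applies Dempster's rule of combination to two mass functions from two hypothetical radars over a small frame of discernment. The correlation-coefficient and quantum-mechanics-based weightings of the paper are not reproduced here.

```python
from itertools import product

def dempster_combine(m1, m2):
    """Combine two mass functions (dicts mapping frozenset hypotheses to mass)
    with Dempster's rule; raises if the evidence is totally conflicting."""
    combined = {}
    conflict = 0.0
    for (a, ma), (b, mb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + ma * mb
        else:
            conflict += ma * mb
    if conflict >= 1.0:
        raise ValueError("totally conflicting evidence")
    return {h: v / (1.0 - conflict) for h, v in combined.items()}

# Two radars assign mass to candidate emitters E1, E2 (hypothetical numbers).
E1, E2 = frozenset({"E1"}), frozenset({"E2"})
both = E1 | E2
radar_a = {E1: 0.6, E2: 0.1, both: 0.3}
radar_b = {E1: 0.5, E2: 0.2, both: 0.3}
print(dempster_combine(radar_a, radar_b))
```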

  10. Methods for chromosome-specific staining

    DOEpatents

    Gray, J.W.; Pinkel, D.

    1995-09-05

    Methods and compositions for chromosome-specific staining are provided. Compositions comprise heterogeneous mixtures of labeled nucleic acid fragments having substantially complementary base sequences to unique sequence regions of the chromosomal DNA for which their associated staining reagent is specific. Methods include ways for making the chromosome-specific staining compositions of the invention, and methods for applying the staining compositions to chromosomes. 3 figs.

  11. Simulation of blood flow in deformable vessels using subject-specific geometry and spatially varying wall properties

    PubMed Central

    Xiong, Guanglei; Figueroa, C. Alberto; Xiao, Nan; Taylor, Charles A.

    2011-01-01

    Simulation of blood flow using image-based models and computational fluid dynamics has found widespread application to quantifying hemodynamic factors relevant to the initiation and progression of cardiovascular diseases and for planning interventions. Methods for creating subject-specific geometric models from medical imaging data have improved substantially in the last decade but, for many problems, still require significant user interaction. In addition, while fluid–structure interaction methods are being employed to model blood flow and vessel wall dynamics, tissue properties are often assumed to be uniform. In this paper, we propose a novel workflow for simulating blood flow using subject-specific geometry and spatially varying wall properties. The geometric model construction is based on 3D segmentation and geometric processing. Variable wall properties are assigned to the model based on combining centerline-based and surface-based methods. We finally demonstrate these new methods using an idealized cylindrical model and two subject-specific vascular models with thoracic and cerebral aneurysms. PMID:21765984

  12. An overview of very high level software design methods

    NASA Technical Reports Server (NTRS)

    Asdjodi, Maryam; Hooper, James W.

    1988-01-01

    Very high level design methods emphasize automatic transfer of requirements to formal design specifications, and/or may concentrate on automatic transformation of formal design specifications that include some semantic information about the system into machine-executable form. Very high level design methods range from general, domain-independent methods to approaches implementable for specific applications or domains. Different approaches to higher level software design are being developed by applying AI techniques, abstract programming methods, domain heuristics, software engineering tools, library-based programming, and other methods. Though a given approach does not always fall exactly into any specific class, this paper provides a classification of very high level design methods, including examples for each class. These methods are analyzed and compared based on their basic approaches, strengths, and feasibility for future expansion toward automatic development of software systems.

  13. [Development and validation of event-specific quantitative PCR method for genetically modified maize LY038].

    PubMed

    Mano, Junichi; Masubuchi, Tomoko; Hatano, Shuko; Futo, Satoshi; Koiwa, Tomohiro; Minegishi, Yasutaka; Noguchi, Akio; Kondo, Kazunari; Akiyama, Hiroshi; Teshima, Reiko; Kurashima, Takeyo; Takabatake, Reona; Kitta, Kazumi

    2013-01-01

    In this article, we report a novel real-time PCR-based analytical method for quantitation of the GM maize event LY038. We designed LY038-specific and maize endogenous reference DNA-specific PCR amplifications. After confirming the specificity and linearity of the LY038-specific PCR amplification, we determined the conversion factor required to calculate the weight-based content of GM organism (GMO) in a multilaboratory evaluation. Finally, in order to validate the developed method, an interlaboratory collaborative trial according to the internationally harmonized guidelines was performed with blind DNA samples containing LY038 at the mixing levels of 0, 0.5, 1.0, 5.0 and 10.0%. The precision of the method was evaluated as the RSD of reproducibility (RSDR), and the values obtained were all less than 25%. The limit of quantitation of the method was judged to be 0.5% based on the definition of ISO 24276 guideline. The results from the collaborative trial suggested that the developed quantitative method would be suitable for practical testing of LY038 maize.
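
    The conversion-factor arithmetic referred to above is, in outline, a ratio of copy-number ratios. The sketch below is a minimal illustration of that step, assuming Cf is the event-to-endogenous copy ratio measured in 100% GM material; the copy numbers and Cf value are hypothetical, not those of the LY038 assay.

```python
def gmo_percent(event_copies, endogenous_copies, cf):
    """Weight-based GM content (%) from copy-number ratios and a
    conversion factor Cf = copy ratio measured in 100% GM material."""
    if endogenous_copies <= 0 or cf <= 0:
        raise ValueError("copy numbers and Cf must be positive")
    return (event_copies / endogenous_copies) / cf * 100.0

# Hypothetical qPCR estimates for one blind sample.
print(round(gmo_percent(event_copies=1.2e3, endogenous_copies=2.5e5, cf=0.45), 3))
```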

  14. Examination of a Method to Determine the Reference Region for Calculating the Specific Binding Ratio in Dopamine Transporter Imaging.

    PubMed

    Watanabe, Ayumi; Inoue, Yusuke; Asano, Yuji; Kikuchi, Kei; Miyatake, Hiroki; Tokushige, Takanobu

    2017-01-01

    The specific binding ratio (SBR) was first reported by Tossici-Bolt et al. as a quantitative indicator for dopamine transporter (DAT) imaging. It is defined as the ratio of the specific binding concentration of the striatum to the non-specific binding concentration of the whole brain other than the striatum. The non-specific binding concentration is calculated based on a region of interest (ROI) set 20 mm inside the outer contour, which is defined by a threshold technique. Tossici-Bolt et al. used a 50% threshold, but with a 50% threshold we could not always define the ROI for the non-specific binding concentration (the reference region) and calculate the SBR appropriately. We therefore sought a new method for determining the reference region when calculating the SBR. We used data from 20 patients who had undergone DAT imaging in our hospital to calculate the non-specific binding concentration by the following methods: the threshold defining the reference region was fixed at specific values (the fixing method), or the reference region was visually optimized by an examiner for every examination (the visual optimization method). First, we assessed the reference region of each method visually; afterward, we quantitatively compared the SBR calculated with each method. In the visual assessment, the scores of the fixing method at 30% and of the visual optimization method were higher than the scores of the fixing method at other values, with or without scatter correction. In the quantitative assessment, the SBR obtained by visual optimization of the reference region, based on the consensus of three radiological technologists, was used as a baseline (the standard method). The SBR values showed good agreement between the standard method and both the fixing method at 30% and the visual optimization method, with or without scatter correction. Therefore, the fixing method at 30% and the visual optimization method were equally suitable for determining the reference region.
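
    A simplified reading of the SBR definition quoted above (specific striatal concentration relative to the non-specific reference concentration) can be written in a few lines. The sketch below derives the reference region from a simple intensity threshold and does not reproduce the 20-mm erosion of the outer contour used in the actual method; the toy counts are hypothetical.

```python
import numpy as np

def specific_binding_ratio(brain, striatal_mask, threshold_frac=0.30):
    """Simplified SBR: reference region = voxels above threshold_frac of the
    image maximum, excluding the striatal VOI; SBR = (C_striatum - C_ref) / C_ref."""
    brain = np.asarray(brain, dtype=float)
    reference_mask = (brain >= threshold_frac * brain.max()) & ~striatal_mask
    c_ref = brain[reference_mask].mean()
    c_str = brain[striatal_mask].mean()
    return (c_str - c_ref) / c_ref

# Toy volume: background 0, non-specific brain uptake 20, striatum 50 (hypothetical).
vol = np.zeros((20, 20, 20))
vol[2:18, 2:18, 2:18] = 20.0
striatum = np.zeros_like(vol, dtype=bool)
striatum[8:12, 8:12, 8:12] = True
vol[striatum] = 50.0
print(round(specific_binding_ratio(vol, striatum), 2))
```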

  15. Computational intelligence-based polymerase chain reaction primer selection based on a novel teaching-learning-based optimisation.

    PubMed

    Cheng, Yu-Huei

    2014-12-01

    Specific primers play an important role in polymerase chain reaction (PCR) experiments, and it is therefore essential to find specific primers of outstanding quality. Unfortunately, many PCR constraints must be inspected simultaneously, which makes specific primer selection difficult and time-consuming. This paper introduces a novel computational intelligence-based method, Teaching-Learning-Based Optimisation, to select specific and feasible primers. Runs were performed for specified PCR product lengths of 150-300 bp and 500-800 bp with three melting temperature formulae: Wallace's formula, Bolton and McCarthy's formula, and SantaLucia's formula. The authors calculate the optimal frequency to estimate the quality of primer selection based on a total of 500 runs for 50 random nucleotide sequences of 'Homo species' retrieved from the National Center for Biotechnology Information. The method was then compared fairly with the genetic algorithm (GA) and memetic algorithm (MA) for primer selection in the literature. The results show that the method easily found suitable primers satisfying the set primer constraints and performed better than the GA and the MA. Furthermore, the method was also compared with the common tool Primer3 in terms of method type, primer presentation, parameter settings, speed and memory usage. In conclusion, it is an interesting primer selection method and a valuable tool for automatic high-throughput analysis. In the future, the primers should be carefully validated in the wet lab to increase the reliability of the method.
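
    Wallace's formula, one of the three melting-temperature formulae mentioned above, is simple enough to state directly. The sketch below computes it together with two illustrative primer constraints (GC content and a Tm window); the thresholds and the example primer are hypothetical and not taken from the paper.

```python
def wallace_tm(primer):
    """Wallace rule: Tm = 2*(A+T) + 4*(G+C), in degrees Celsius."""
    p = primer.upper()
    return 2 * (p.count("A") + p.count("T")) + 4 * (p.count("G") + p.count("C"))

def gc_content(primer):
    p = primer.upper()
    return (p.count("G") + p.count("C")) / len(p)

def passes_basic_constraints(primer, tm_range=(50, 62), gc_range=(0.40, 0.60)):
    """Illustrative subset of the constraints a primer-design search must satisfy."""
    tm, gc = wallace_tm(primer), gc_content(primer)
    return tm_range[0] <= tm <= tm_range[1] and gc_range[0] <= gc <= gc_range[1]

primer = "AGCTGACCTGAAGTCTAGCA"  # hypothetical 20-mer
print(wallace_tm(primer), round(gc_content(primer), 2), passes_basic_constraints(primer))
```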

  16. Automated bone segmentation from dental CBCT images using patch-based sparse representation and convex optimization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Li; Gao, Yaozong; Shi, Feng

    Purpose: Cone-beam computed tomography (CBCT) is an increasingly utilized imaging modality for the diagnosis and treatment planning of the patients with craniomaxillofacial (CMF) deformities. Accurate segmentation of CBCT image is an essential step to generate three-dimensional (3D) models for the diagnosis and treatment planning of the patients with CMF deformities. However, due to the poor image quality, including very low signal-to-noise ratio and the widespread image artifacts such as noise, beam hardening, and inhomogeneity, it is challenging to segment the CBCT images. In this paper, the authors present a new automatic segmentation method to address these problems. Methods: To segment CBCT images, the authors propose a new method for fully automated CBCT segmentation by using patch-based sparse representation to (1) segment bony structures from the soft tissues and (2) further separate the mandible from the maxilla. Specifically, a region-specific registration strategy is first proposed to warp all the atlases to the current testing subject and then a sparse-based label propagation strategy is employed to estimate a patient-specific atlas from all aligned atlases. Finally, the patient-specific atlas is integrated into a maximum a posteriori probability-based convex segmentation framework for accurate segmentation. Results: The proposed method has been evaluated on a dataset with 15 CBCT images. The effectiveness of the proposed region-specific registration strategy and patient-specific atlas has been validated by comparing with the traditional registration strategy and population-based atlas. The experimental results show that the proposed method achieves the best segmentation accuracy by comparison with other state-of-the-art segmentation methods. Conclusions: The authors have proposed a new CBCT segmentation method by using patch-based sparse representation and convex optimization, which can achieve considerably accurate segmentation results in CBCT segmentation based on 15 patients.

  17. Trajectories for High Specific Impulse High Specific Power Deep Space Exploration

    NASA Technical Reports Server (NTRS)

    Polsgrove, T.; Adams, R. B.; Brady, Hugh J. (Technical Monitor)

    2002-01-01

    Preliminary results are presented for two methods to approximate the mission performance of high specific impulse high specific power vehicles. The first method is based on an analytical approximation derived by Williams and Shepherd and can be used to approximate mission performance to outer planets and interstellar space. The second method is based on a parametric analysis of trajectories created using the well known trajectory optimization code, VARITOP. This parametric analysis allows the reader to approximate payload ratios and optimal power requirements for both one-way and round-trip missions. While this second method only addresses missions to and from Jupiter, future work will encompass all of the outer planet destinations and some interstellar precursor missions.

  18. Enhanced Imaging of Specific Cell-Surface Glycosylation Based on Multi-FRET.

    PubMed

    Yuan, Baoyin; Chen, Yuanyuan; Sun, Yuqiong; Guo, Qiuping; Huang, Jin; Liu, Jianbo; Meng, Xiangxian; Yang, Xiaohai; Wen, Xiaohong; Li, Zenghui; Li, Lie; Wang, Kemin

    2018-05-15

    Cell-surface glycosylation contains abundant biological information that reflects cell physiological state, and it is of great value to image cell-surface glycosylation to elucidate its functions. Here we present a hybridization chain reaction (HCR)-based multifluorescence resonance energy transfer (multi-FRET) method for specific imaging of cell-surface glycosylation. By installing donors through metabolic glycan labeling and acceptors through aptamer-tethered nanoassemblies on the same glycoconjugate, intramolecular multi-FRET occurs due to the close donor-acceptor distance. Benefiting from the amplification effect and spatial flexibility of the HCR nanoassemblies, enhanced multi-FRET imaging of specific cell-surface glycosylation can be obtained. With this HCR-based multi-FRET method, we achieved obvious contrast in imaging of protein-specific GalNAcylation on 7211 cell surfaces. In addition, we demonstrated the general applicability of this method by visualizing protein-specific sialylation on CEM cell surfaces. Furthermore, the expression changes of CEM cell-surface protein-specific sialylation under drug treatment were accurately monitored. The developed imaging method may provide a powerful tool for researching glycosylation functions, discovering biomarkers, and screening drugs.

  19. Design and performance analysis of gas and liquid radial turbines

    NASA Astrophysics Data System (ADS)

    Tan, Xu

    In the first part of the research, pumps running in reverse as turbines are studied. This work uses experimental data from a wide range of pumps representing centrifugal pump configurations in terms of specific speed. Based on specific speed and specific diameter, an accurate correlation is developed to predict performance at the best efficiency point of a centrifugal pump in turbine-mode operation. The proposed prediction method yields very good results compared to previous attempts; compared against nine previous methods found in the literature, the method proposed in this paper is the most accurate. The proposed method can be further complemented and refined by future tests to increase its accuracy, and it is meaningful because it is based on both specific speed and specific diameter. The second part of the research focuses on the design and analysis of a radial gas turbine. The turbine specification is obtained from a solar-biogas hybrid system, which is theoretically analyzed and constructed around a purchased compressor. The theoretical analysis results in a specification of 100 lb/min mass flow, 900 °C inlet total temperature, and 1.575 atm inlet total pressure. The 1-D and 3-D geometry of the rotor is generated based on Aungier's method, and 1-D loss-model analysis and 3-D CFD simulations are performed to examine rotor performance. The total-to-total efficiency of the rotor exceeds 90%. With the help of the CFD analysis, modifications to the preliminary design yielded optimized aerodynamic performance. Finally, a theoretical performance analysis of the hybrid system is performed with the designed turbine.
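
    The correlation itself is not given in the abstract, but the two non-dimensional groups it is built on are standard. The sketch below computes dimensionless specific speed and specific diameter from best-efficiency-point data; the pump data are hypothetical.

```python
import math

G = 9.81  # m/s^2

def specific_speed(omega_rad_s, q_m3_s, head_m):
    """Dimensionless specific speed N_s = omega*sqrt(Q) / (g*H)^(3/4)."""
    return omega_rad_s * math.sqrt(q_m3_s) / (G * head_m) ** 0.75

def specific_diameter(d_m, q_m3_s, head_m):
    """Dimensionless specific diameter D_s = D*(g*H)^(1/4) / sqrt(Q)."""
    return d_m * (G * head_m) ** 0.25 / math.sqrt(q_m3_s)

# Hypothetical best-efficiency-point data for a pump run as a turbine.
omega = 1450 * 2 * math.pi / 60    # 1450 rpm in rad/s
q, head, d = 0.05, 20.0, 0.25      # m^3/s, m, m
print(round(specific_speed(omega, q, head), 3), round(specific_diameter(d, q, head), 3))
```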

  20. Systems, methods and apparatus for modeling, specifying and deploying policies in autonomous and autonomic systems using agent-oriented software engineering

    NASA Technical Reports Server (NTRS)

    Sterritt, Roy (Inventor); Hinchey, Michael G. (Inventor); Penn, Joaquin (Inventor)

    2011-01-01

    Systems, methods and apparatus are provided through which, in some embodiments, an agent-oriented specification modeled with MaCMAS is analyzed, flaws in the agent-oriented specification modeled with MaCMAS are corrected, and an implementation is derived from the corrected agent-oriented specification. Described herein are systems, methods and apparatus that produce fully (mathematically) tractable development of agent-oriented specification(s) modeled with the methodology fragment for analyzing complex multiagent systems (MaCMAS) and policies for autonomic systems, from requirements through to code generation. The systems, methods and apparatus described herein are illustrated through an example showing how user-formulated policies can be translated into a formal model, which can then be converted to code. The requirements-based programming systems, methods and apparatus described herein may provide faster, higher-quality development and maintenance of autonomic systems based on user formulation of policies.

  1. Isolation of phage-display library-derived scFv antibody specific to Listeria monocytogenes by a novel immobilized method.

    PubMed

    Nguyen, X-H; Trinh, T-L; Vu, T-B-H; Le, Q-H; To, K-A

    2018-02-01

    To select Listeria monocytogenes-specific single-chain fragment variable (scFv) antibodies from a phage-display library by a novel simple and cost-effective immobilization method. Light expanded clay aggregate (LECA) was used as biomass support matrix for biopanning of a phage-display library to select L. monocytogenes-specific scFv antibody. Four rounds of positive selection against LECA-immobilized L. monocytogenes and an additional subtractive panning against Listeria innocua were performed. The phage clones selected using this panning scheme and LECA-based immobilization method exhibited the ability to bind L. monocytogenes without cross-reactivity toward 10 other non-L. monocytogenes bacteria. One of the selected phage clones was able to specifically recognize three major pathogenic serotypes (1/2a, 1/2b and 4b) of L. monocytogenes and 11 tested L. monocytogenes strains isolated from foods. The LECA-based immobilization method is applicable for isolating species-specific anti-L. monocytogenes scFv antibodies by phage display. The isolated scFv antibody has potential use in development of immunoassay-based methods for rapid detection of L. monocytogenes in food and environmental samples. In addition, the LECA immobilization method described here could feasibly be employed to isolate specific monoclonal antibodies against any given species of pathogenic bacteria from phage-display libraries. © 2017 The Society for Applied Microbiology.

  2. Probe-specific mixed-model approach to detect copy number differences using multiplex ligation-dependent probe amplification (MLPA)

    PubMed Central

    González, Juan R; Carrasco, Josep L; Armengol, Lluís; Villatoro, Sergi; Jover, Lluís; Yasui, Yutaka; Estivill, Xavier

    2008-01-01

    Background: The MLPA method is a potentially useful semi-quantitative method to detect copy number alterations in targeted regions. In this paper, we propose a method for the normalization procedure based on a non-linear mixed model, as well as a new approach for determining the statistical significance of altered probes based on a linear mixed model. This method establishes a threshold by using different tolerance intervals that accommodate the specific random error variability observed in each test sample. Results: Through simulation studies we have shown that our proposed method outperforms two existing methods that are based on simple threshold rules or iterative regression. We have illustrated the method using a controlled MLPA assay in which targeted regions are variable in copy number in individuals suffering from different disorders such as Prader-Willi, DiGeorge or autism, where it shows the best performance. Conclusion: Using the proposed mixed model, we are able to determine thresholds to decide whether a region is altered. These thresholds are specific to each individual, incorporating experimental variability, and result in improved sensitivity and specificity, as the examples with real data have revealed. PMID:18522760

  3. Computer-Based Radiographic Quantification of Joint Space Narrowing Progression Using Sequential Hand Radiographs: Validation Study in Rheumatoid Arthritis Patients from Multiple Institutions.

    PubMed

    Ichikawa, Shota; Kamishima, Tamotsu; Sutherland, Kenneth; Fukae, Jun; Katayama, Kou; Aoki, Yuko; Okubo, Takanobu; Okino, Taichi; Kaneda, Takahiko; Takagi, Satoshi; Tanimura, Kazuhide

    2017-10-01

    We have developed a refined computer-based method to detect joint space narrowing (JSN) progression with the joint space narrowing progression index (JSNPI), obtained by superimposing sequential hand radiographs. The purpose of this study is to assess the validity of the computer-based method using images obtained from multiple institutions in rheumatoid arthritis (RA) patients. Sequential hand radiographs of 42 patients (37 females and 5 males) with RA from two institutions were analyzed by the computer-based method and by visual scoring systems as a standard of reference. A JSNPI above the smallest detectable difference (SDD) defined JSN progression at the joint level. The sensitivity and specificity of the computer-based method for JSN progression were calculated using the SDD and a receiver operating characteristic (ROC) curve. Out of 314 metacarpophalangeal joints, 34 joints progressed based on the SDD, while 11 joints widened. Twenty-one joints progressed in the computer-based method, 11 joints in the scoring systems, and 13 joints in both methods. Based on the SDD, we found lower sensitivity and higher specificity, at 54.2% and 92.8%, respectively. At the most discriminant cutoff point according to the ROC curve, the sensitivity and specificity were 70.8% and 81.7%, respectively. The proposed computer-based method provides quantitative measurement of JSN progression using sequential hand radiographs and may be a useful tool in the follow-up assessment of joint damage in RA patients.
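
    Two quantities used above can be illustrated directly: the smallest detectable difference (here taken, as one common definition, to be 1.96 times the standard deviation of test-retest differences) and per-joint sensitivity/specificity against a reference standard. The data in the sketch below are hypothetical.

```python
import numpy as np

def smallest_detectable_difference(test, retest):
    """One common definition: SDD = 1.96 * SD of the test-retest differences."""
    return 1.96 * np.std(np.asarray(test) - np.asarray(retest), ddof=1)

def sensitivity_specificity(index_positive, reference_positive):
    """Per-joint agreement of the computer-based call against the reference standard."""
    idx = np.asarray(index_positive, dtype=bool)
    ref = np.asarray(reference_positive, dtype=bool)
    tp, fn = np.sum(idx & ref), np.sum(~idx & ref)
    tn, fp = np.sum(~idx & ~ref), np.sum(idx & ~ref)
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical repeated JSNPI measurements and per-joint progression calls.
rng = np.random.default_rng(0)
test = rng.normal(0.0, 0.05, 40)
retest = test + rng.normal(0.0, 0.02, 40)
print(round(smallest_detectable_difference(test, retest), 3))
calls = [True, True, False, True, False, False, True, False]
truth = [True, False, False, True, False, True, True, False]
print(sensitivity_specificity(calls, truth))
```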

  4. Systems, methods and apparatus for implementation of formal specifications derived from informal requirements

    NASA Technical Reports Server (NTRS)

    Hinchey, Michael G. (Inventor); Rouff, Christopher A. (Inventor); Rash, James L. (Inventor); Erickson, John D. (Inventor); Gracinin, Denis (Inventor)

    2010-01-01

    Systems, methods and apparatus are provided through which in some embodiments an informal specification is translated without human intervention into a formal specification. In some embodiments the formal specification is a process-based specification. In some embodiments, the formal specification is translated into a high-level computer programming language which is further compiled into a set of executable computer instructions.

  5. HIV-1 protease cleavage site prediction based on two-stage feature selection method.

    PubMed

    Niu, Bing; Yuan, Xiao-Cheng; Roeper, Preston; Su, Qiang; Peng, Chun-Rong; Yin, Jing-Yuan; Ding, Juan; Li, HaiPeng; Lu, Wen-Cong

    2013-03-01

    Knowledge of the mechanism of HIV protease cleavage specificity is critical to the design of specific and effective HIV inhibitors. Searching for an accurate, robust, and rapid method to correctly predict the cleavage sites in proteins is crucial when searching for possible HIV inhibitors. In this article, HIV-1 protease specificity was studied using the correlation-based feature subset (CfsSubset) selection method combined with Genetic Algorithms method. Thirty important biochemical features were found based on a jackknife test from the original data set containing 4,248 features. By using the AdaBoost method with the thirty selected features the prediction model yields an accuracy of 96.7% for the jackknife test and 92.1% for an independent set test, with increased accuracy over the original dataset by 6.7% and 77.4%, respectively. Our feature selection scheme could be a useful technique for finding effective competitive inhibitors of HIV protease.

  6. Texture-specific bag of visual words model and spatial cone matching-based method for the retrieval of focal liver lesions using multiphase contrast-enhanced CT images.

    PubMed

    Xu, Yingying; Lin, Lanfen; Hu, Hongjie; Wang, Dan; Zhu, Wenchao; Wang, Jian; Han, Xian-Hua; Chen, Yen-Wei

    2018-01-01

    The bag of visual words (BoVW) model is a powerful tool for feature representation that can integrate various handcrafted features like intensity, texture, and spatial information. In this paper, we propose a novel BoVW-based method that incorporates texture and spatial information for the content-based image retrieval to assist radiologists in clinical diagnosis. This paper presents a texture-specific BoVW method to represent focal liver lesions (FLLs). Pixels in the region of interest (ROI) are classified into nine texture categories using the rotation-invariant uniform local binary pattern method. The BoVW-based features are calculated for each texture category. In addition, a spatial cone matching (SCM)-based representation strategy is proposed to describe the spatial information of the visual words in the ROI. In a pilot study, eight radiologists with different clinical experience performed diagnoses for 20 cases with and without the top six retrieved results. A total of 132 multiphase computed tomography volumes including five pathological types were collected. The texture-specific BoVW was compared to other BoVW-based methods using the constructed dataset of FLLs. The results show that our proposed model outperforms the other three BoVW methods in discriminating different lesions. The SCM method, which adds spatial information to the orderless BoVW model, impacted the retrieval performance. In the pilot trial, the average diagnosis accuracy of the radiologists was improved from 66 to 80% using the retrieval system. The preliminary results indicate that the texture-specific features and the SCM-based BoVW features can effectively characterize various liver lesions. The retrieval system has the potential to improve the diagnostic accuracy and the confidence of the radiologists.
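
    The pixel-level texture labelling named above, rotation-invariant uniform LBP, can be sketched compactly. The code below implements only the basic 8-neighbour riu2 labelling; the per-category visual vocabularies and the SCM matching of the paper are not reproduced, and the 10 raw riu2 labels are not regrouped into the paper's nine categories.

```python
import numpy as np

def lbp_riu2(image):
    """8-neighbour rotation-invariant uniform LBP (riu2): labels 0-8 for uniform
    patterns (number of neighbours >= centre), label 9 for non-uniform patterns."""
    img = np.asarray(image, dtype=float)
    # Offsets of the 8 neighbours in circular order.
    offs = [(-1, -1), (-1, 0), (-1, 1), (0, 1), (1, 1), (1, 0), (1, -1), (0, -1)]
    h, w = img.shape
    centre = img[1:h - 1, 1:w - 1]
    bits = np.stack([(img[1 + dy:h - 1 + dy, 1 + dx:w - 1 + dx] >= centre).astype(int)
                     for dy, dx in offs])
    transitions = np.sum(bits != np.roll(bits, 1, axis=0), axis=0)
    ones = bits.sum(axis=0)
    return np.where(transitions <= 2, ones, 9)

# Toy image: the label map can then be used to bin pixels into texture categories.
rng = np.random.default_rng(0)
labels = lbp_riu2(rng.integers(0, 256, size=(32, 32)))
print(np.bincount(labels.ravel(), minlength=10))
```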

  7. Real-Time PCR-Based Quantitation Method for the Genetically Modified Soybean Line GTS 40-3-2.

    PubMed

    Kitta, Kazumi; Takabatake, Reona; Mano, Junichi

    2016-01-01

    This chapter describes a real-time PCR-based method for quantitation of the relative amount of genetically modified (GM) soybean line GTS 40-3-2 [Roundup Ready(®) soybean (RRS)] contained in a batch. The method targets a taxon-specific soybean gene (lectin gene, Le1) and the specific DNA construct junction region between the Petunia hybrida chloroplast transit peptide sequence and the Agrobacterium 5-enolpyruvylshikimate-3-phosphate synthase gene (epsps) sequence present in GTS 40-3-2. The method employs plasmid pMulSL2 as a reference material in order to quantify the relative amount of GTS 40-3-2 in soybean samples using a conversion factor (Cf) equal to the ratio of the RRS-specific DNA to the taxon-specific DNA in representative genuine GTS 40-3-2 seeds.

  8. A combined learning algorithm for prostate segmentation on 3D CT images.

    PubMed

    Ma, Ling; Guo, Rongrong; Zhang, Guoyi; Schuster, David M; Fei, Baowei

    2017-11-01

    Segmentation of the prostate on CT images has many applications in the diagnosis and treatment of prostate cancer. Because of the low soft-tissue contrast on CT images, prostate segmentation is a challenging task. A learning-based segmentation method is proposed for the prostate on three-dimensional (3D) CT images. We combine population-based and patient-based learning methods for segmenting the prostate on CT images. Population data can provide useful information to guide the segmentation processing. Because of inter-patient variations, patient-specific information is particularly useful to improve the segmentation accuracy for an individual patient. In this study, we combine a population learning method and a patient-specific learning method to improve the robustness of prostate segmentation on CT images. We train a population model based on the data from a group of prostate patients. We also train a patient-specific model based on the data of the individual patient and incorporate the information as marked by the user interaction into the segmentation processing. We calculate the similarity between the two models to obtain applicable population and patient-specific knowledge to compute the likelihood of a pixel belonging to the prostate tissue. A new adaptive threshold method is developed to convert the likelihood image into a binary image of the prostate, and thus complete the segmentation of the gland on CT images. The proposed learning-based segmentation algorithm was validated using 3D CT volumes of 92 patients. All of the CT image volumes were manually segmented independently three times by two clinically experienced radiologists, and the manual segmentation results served as the gold standard for evaluation. The experimental results show that the segmentation method achieved a Dice similarity coefficient of 87.18 ± 2.99%, compared to the manual segmentation. By combining the population learning and patient-specific learning methods, the proposed method is effective for segmenting the prostate on 3D CT images. The prostate CT segmentation method can be used in various applications including volume measurement and treatment planning of the prostate. © 2017 American Association of Physicists in Medicine.
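
    Two pieces of the pipeline above are easy to make concrete: the Dice similarity coefficient used for evaluation, and a simple similarity-weighted blend of population and patient-specific likelihood maps (a stand-in for the paper's combination rule, which is not specified here). All arrays and the fixed threshold in the sketch are hypothetical.

```python
import numpy as np

def dice_coefficient(seg, ref):
    """Dice similarity coefficient between two binary masks."""
    seg, ref = np.asarray(seg, dtype=bool), np.asarray(ref, dtype=bool)
    return 2.0 * np.sum(seg & ref) / (np.sum(seg) + np.sum(ref))

def fuse_likelihoods(pop_prob, pat_prob, similarity):
    """Blend population and patient-specific likelihood maps; `similarity`
    in [0, 1] is a stand-in for the model-agreement weight described above."""
    return similarity * pat_prob + (1.0 - similarity) * pop_prob

# Toy 2D example with hypothetical probability maps.
pop = np.random.default_rng(1).random((64, 64))
pat = np.random.default_rng(2).random((64, 64))
prob = fuse_likelihoods(pop, pat, similarity=0.7)
seg = prob > 0.5                      # a fixed threshold stands in for the adaptive one
ref = pat > 0.5                       # hypothetical "gold standard" mask
print(round(dice_coefficient(seg, ref), 3))
```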

  9. Optimization of cerebellar purkinje neuron cultures and development of a plasmid-based method for purkinje neuron-specific, miRNA-mediated protein knockdown.

    PubMed

    Alexander, C J; Hammer, J A

    2016-01-01

    We present a simple and efficient method to knock down proteins specifically in Purkinje neurons (PN) present in mixed mouse primary cerebellar cultures. This method utilizes the introduction via nucleofection of a plasmid encoding a specific miRNA downstream of the L7/Pcp2 promoter, which drives PN-specific expression. As proof-of-principle, we used this plasmid to knock down the motor protein myosin Va, which is required for the targeting of smooth endoplasmic reticulum (ER) into PN spines. Consistent with effective knockdown, transfected PNs robustly phenocopied PNs from dilute-lethal (myosin Va-null) mice with regard to the ER targeting defect. Importantly, our plasmid-based approach is less challenging technically and more specific to PNs than several alternative methods (e.g., biolistic- and lentiviral-based introduction of siRNAs). We also present a number of improvements for generating mixed cerebellar cultures that shorten the procedure and improve the total yield of PNs, and of transfected PNs, considerably. Finally, we present a method to rescue cerebellar cultures that develop large cell aggregates, a common problem that otherwise precludes the further use of the culture. Copyright © 2016 Elsevier Inc. All rights reserved.

  10. Development and in-house validation of the event-specific polymerase chain reaction detection methods for genetically modified soybean MON89788 based on the cloned integration flanking sequence.

    PubMed

    Liu, Jia; Guo, Jinchao; Zhang, Haibo; Li, Ning; Yang, Litao; Zhang, Dabing

    2009-11-25

    Various polymerase chain reaction (PCR) methods have been developed to implement genetically modified organism (GMO) labeling policies, among which event-specific PCR detection based on the flanking sequence of the exogenous integration is the primary approach in GMO detection due to its high specificity. In this study, the 5' and 3' flanking sequences of the exogenous integration of MON89788 soybean were revealed by thermal asymmetric interlaced PCR. The event-specific PCR primers and TaqMan probe were designed based upon the revealed 5' flanking sequence, and qualitative and quantitative PCR assays were established employing these designed primers and probes. In the qualitative PCR, the limit of detection (LOD) was about 0.01 ng of genomic DNA, corresponding to 10 copies of the haploid soybean genome. In the quantitative PCR assay, the LOD was as low as two haploid genome copies, and the limit of quantification was five haploid genome copies. Furthermore, the developed PCR methods were validated in-house by five researchers, and the validation results indicated that the developed event-specific PCR methods can be used for identification and quantification of MON89788 soybean and its derivatives.

  11. A method for radiological characterization based on fluence conversion coefficients

    NASA Astrophysics Data System (ADS)

    Froeschl, Robert

    2018-06-01

    Radiological characterization of components in accelerator environments is often required to ensure adequate radiation protection during maintenance, transport and handling, as well as for the selection of the proper disposal pathway. The relevant quantities are typically weighted sums of specific activities with radionuclide-specific weighting coefficients. Traditional methods based on Monte Carlo simulations are either based on radionuclide creation events, or the particle fluences in the regions of interest are scored and then weighted off-line with radionuclide production cross sections. The presented method bases the radiological characterization on a set of fluence conversion coefficients. For a given irradiation profile and cool-down time, radionuclide production cross sections, material composition and radionuclide-specific weighting coefficients, a set of particle-type- and energy-dependent fluence conversion coefficients is computed. These fluence conversion coefficients can then be used in a Monte Carlo transport code to perform on-line weighting and directly obtain the desired radiological characterization, either by using built-in multiplier features, such as in the PHITS code, or by writing a dedicated user routine, such as for the FLUKA code. The presented method has been validated against the standard event-based methods directly available in Monte Carlo transport codes.
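
    Once the coefficients are available, the folding step described above reduces to a sum over particle types and energy bins of scored fluence times the corresponding fluence conversion coefficient. The sketch below shows only that final weighting; all coefficients and fluences are hypothetical placeholders for values that would come from the cross-section folding and the transport code.

```python
import numpy as np

# Hypothetical per-bin fluence conversion coefficients h[p, e] (weighted activity
# per unit fluence) for two particle types and four energy bins; in practice these
# are precomputed from cross sections, the irradiation profile, cool-down time,
# material composition and the radionuclide-specific weighting coefficients.
conv_coeff = np.array([[1.2e-6, 3.4e-6, 5.1e-6, 2.2e-6],    # neutrons
                       [0.4e-6, 0.9e-6, 1.6e-6, 0.8e-6]])   # protons

# Fluence scored by the Monte Carlo transport code in the same bins (cm^-2).
fluence = np.array([[2.0e9, 8.0e8, 1.5e8, 3.0e7],
                    [5.0e8, 2.0e8, 6.0e7, 1.0e7]])

# On-line weighting reduces to a single sum over particle types and energy bins.
weighted_quantity = np.sum(conv_coeff * fluence)
print(f"{weighted_quantity:.3e}")
```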

  12. Application of probabilistic analysis/design methods in space programs - The approaches, the status, and the needs

    NASA Technical Reports Server (NTRS)

    Ryan, Robert S.; Townsend, John S.

    1993-01-01

    The prospective improvement of probabilistic methods for space program analysis/design entails the further development of theories, codes, and tools which match specific areas of application, the drawing of lessons from previous uses of probability and statistics data bases, the enlargement of data bases (especially in the field of structural failures), and the education of engineers and managers on the advantages of these methods. An evaluation is presently made of the current limitations of probabilistic engineering methods. Recommendations are made for specific applications.

  13. Morphometry-based impedance boundary conditions for patient-specific modeling of blood flow in pulmonary arteries.

    PubMed

    Spilker, Ryan L; Feinstein, Jeffrey A; Parker, David W; Reddy, V Mohan; Taylor, Charles A

    2007-04-01

    Patient-specific computational models could aid in planning interventions to relieve pulmonary arterial stenoses common in many forms of congenital heart disease. We describe a new approach to simulate blood flow in subject-specific models of the pulmonary arteries that consists of a numerical model of the proximal pulmonary arteries created from three-dimensional medical imaging data with terminal impedance boundary conditions derived from linear wave propagation theory applied to morphometric models of distal vessels. A tuning method, employing numerical solution methods for nonlinear systems of equations, was developed to modify the distal vasculature to match measured pressure and flow distribution data. One-dimensional blood flow equations were solved with a finite element method in image-based pulmonary arterial models using prescribed inlet flow and morphometry-based impedance at the outlets. Application of these methods in a pilot study of the effect of removal of unilateral pulmonary arterial stenosis induced in a pig showed good agreement with experimental measurements for flow redistribution and main pulmonary arterial pressure. Next, these methods were applied to a patient with repaired tetralogy of Fallot and predicted insignificant hemodynamic improvement with relief of the stenosis. This method of coupling image-based and morphometry-based models could enable increased fidelity in pulmonary hemodynamic simulation.

  14. Application of nanomaterials in the bioanalytical detection of disease-related genes.

    PubMed

    Zhu, Xiaoqian; Li, Jiao; He, Hanping; Huang, Min; Zhang, Xiuhua; Wang, Shengfu

    2015-12-15

    In the diagnosis of genetic diseases and disorders, nanomaterials-based gene detection systems have significant advantages over conventional diagnostic systems in terms of simplicity, sensitivity, specificity, and portability. In this review, we describe the application of nanomaterials to disease-related gene detection by methods other than PCR-related methods, such as colorimetry, fluorescence-based methods, electrochemistry, microarray methods, surface-enhanced Raman spectroscopy (SERS), quartz crystal microbalance (QCM) methods, and dynamic light scattering (DLS). The most commonly used nanomaterials are gold, silver, carbon and semiconducting nanoparticles. Various nanomaterials-based gene detection methods are introduced, their respective advantages are discussed, and selected examples are provided to illustrate the properties of these nanomaterials and their emerging applications for the detection of specific nucleic acid sequences. Copyright © 2015. Published by Elsevier B.V.

  15. Development of melting temperature-based SYBR Green I polymerase chain reaction methods for multiplex genetically modified organism detection.

    PubMed

    Hernández, Marta; Rodríguez-Lázaro, David; Esteve, Teresa; Prat, Salomé; Pla, Maria

    2003-12-15

    Commercialization of several genetically modified crops has been approved worldwide to date. Uniplex polymerase chain reaction (PCR)-based methods to identify these different insertion events have been developed, but their use in the analysis of all commercially available genetically modified organisms (GMOs) is becoming progressively insufficient. These methods require a large number of assays to detect all possible GMOs present in a sample, so the development of multiplex PCR systems using combined probes and primers targeted to sequences specific to various GMOs is needed to detect this increasing number of GMOs. Here we report on the development of a multiplex real-time PCR suitable for multiple GMO identification, based on the intercalating dye SYBR Green I and the analysis of the melting curves of the amplified products. Using this method, different amplification products specific for Maximizer 176, Bt11, MON810, and GA21 maize and for GTS 40-3-2 soybean were obtained and identified by their specific Tm. We have combined amplification of these products in a number of multiplex reactions and show the suitability of the methods for identification of GMOs with a sensitivity of 0.1% in duplex reactions. The described methods offer an economical and simple alternative to real-time PCR systems based on sequence-specific probes (i.e., TaqMan chemistry). These methods can be used as selection tests and further optimized for uniplex GMO quantification.

  16. Treatment selection in a randomized clinical trial via covariate-specific treatment effect curves.

    PubMed

    Ma, Yunbei; Zhou, Xiao-Hua

    2017-02-01

    For time-to-event data in a randomized clinical trial, we proposed two new methods for selecting an optimal treatment for a patient based on the covariate-specific treatment effect curve, which is used to represent the clinical utility of a predictive biomarker. To select an optimal treatment for a patient with a specific biomarker value, we proposed pointwise confidence intervals for each covariate-specific treatment effect curve and for the difference between the covariate-specific treatment effect curves of two treatments. Furthermore, to select an optimal treatment for a future biomarker-defined subpopulation of patients, we proposed confidence bands for each covariate-specific treatment effect curve and for the difference between each pair of covariate-specific treatment effect curves over a fixed interval of biomarker values. We constructed the confidence bands based on a resampling technique. We also conducted simulation studies to evaluate the finite-sample properties of the proposed estimation methods. Finally, we illustrated the application of the proposed method in a real-world data set.

  17. Evaluation of microplate immunocapture method for detection of Vibrio cholerae, Salmonella Typhi and Shigella flexneri from food.

    PubMed

    Fakruddin, Md; Hossain, Md Nur; Ahmed, Monzur Morshed

    2017-08-29

    Improved methods with better separation and concentration ability are in constant need for the detection of foodborne pathogens. The aim of this study was to evaluate a microplate immunocapture (IC) method for the detection of Salmonella Typhi, Shigella flexneri and Vibrio cholerae from food samples, to provide a better alternative to conventional culture-based methods. The IC method was optimized for incubation time, bacterial concentration, and capture efficiency; a 6 h incubation and a cell concentration of 6 log CFU/ml provided optimal results. The method was shown to be highly specific for the pathogens concerned. Capture efficiency (CE) was around 100% for the target pathogens, whereas CE was either zero or very low for non-target pathogens. The IC method also showed better pathogen detection ability at different cell concentrations in artificially contaminated food samples in comparison with culture-based methods. The performance parameters of the method were comparable to, and even better than, those of culture-based methods (IC: detection limit 25 CFU/25 g, sensitivity 100%, specificity 96.8%, accuracy 96.7%; culture: detection limit 125 CFU/25 g, sensitivity 95.9%, specificity 97%, accuracy 96.2%). The IC method has the potential to be used as a method of choice for the detection of foodborne pathogens in routine laboratory practice after proper validation.
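
    The reported figures rest on simple arithmetic: capture efficiency as the recovered fraction of input CFU, and sensitivity/specificity/accuracy from a confusion table. The sketch below shows that arithmetic with hypothetical counts (not the study's data).

```python
def capture_efficiency(captured_cfu, input_cfu):
    """Capture efficiency (%) of the immunocapture step."""
    return 100.0 * captured_cfu / input_cfu

def performance(tp, fp, tn, fn):
    """Sensitivity, specificity and accuracy from confusion-table counts."""
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    accuracy = (tp + tn) / (tp + fp + tn + fn)
    return sensitivity, specificity, accuracy

# Hypothetical spiked-sample results.
print(round(capture_efficiency(captured_cfu=9.6e5, input_cfu=1.0e6), 1))
print([round(x, 3) for x in performance(tp=48, fp=2, tn=60, fn=0)])
```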

  18. Estimating pathway-specific contributions to biodegradation in aquifers based on dual isotope analysis: theoretical analysis and reactive transport simulations.

    PubMed

    Centler, Florian; Heße, Falk; Thullner, Martin

    2013-09-01

    At field sites with varying redox conditions, different redox-specific microbial degradation pathways contribute to total contaminant degradation. The identification of pathway-specific contributions to total contaminant removal is of high practical relevance, yet difficult to achieve with current methods. Current stable-isotope-fractionation-based techniques focus on the identification of dominant biodegradation pathways under constant environmental conditions. We present an approach based on dual stable isotope data to estimate the individual contributions of two redox-specific pathways. We apply this approach to carbon and hydrogen isotope data obtained from reactive transport simulations of an organic contaminant plume in a two-dimensional aquifer cross section to test the applicability of the method. To take aspects typically encountered at field sites into account, additional simulations addressed the effects of transverse mixing, diffusion-induced stable-isotope fractionation, heterogeneities in the flow field, and mixing in sampling wells on isotope-based estimates for aerobic and anaerobic pathway contributions to total contaminant biodegradation. Results confirm the general applicability of the presented estimation method which is most accurate along the plume core and less accurate towards the fringe where flow paths receive contaminant mass and associated isotope signatures from the core by transverse dispersion. The presented method complements the stable-isotope-fractionation-based analysis toolbox. At field sites with varying redox conditions, it provides a means to identify the relative importance of individual, redox-specific degradation pathways. © 2013.
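
    A deliberately simplified version of the apportionment idea above: if each pathway has a characteristic dual-isotope slope and the observed slope is treated as the contribution-weighted mean of the two, the aerobic fraction follows by linear interpolation. This is a strong simplification of the paper's approach, and all slope values below are illustrative.

```python
def aerobic_fraction(lambda_obs, lambda_aerobic, lambda_anaerobic):
    """Fraction of degradation via the aerobic pathway, assuming the observed
    dual-isotope slope is (approximately) the contribution-weighted mean of the
    two pathway-specific slopes -- a deliberate simplification of the full method."""
    f = (lambda_obs - lambda_anaerobic) / (lambda_aerobic - lambda_anaerobic)
    return min(max(f, 0.0), 1.0)  # clip to the physically meaningful range

# Illustrative slopes (per mil delta-2H per per mil delta-13C).
print(aerobic_fraction(lambda_obs=12.0, lambda_aerobic=20.0, lambda_anaerobic=4.0))
```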

  19. Identification of Single- and Multiple-Class Specific Signature Genes from Gene Expression Profiles by Group Marker Index

    PubMed Central

    Tsai, Yu-Shuen; Aguan, Kripamoy; Pal, Nikhil R.; Chung, I-Fang

    2011-01-01

    Informative genes from microarray data can be used to construct prediction model and investigate biological mechanisms. Differentially expressed genes, the main targets of most gene selection methods, can be classified as single- and multiple-class specific signature genes. Here, we present a novel gene selection algorithm based on a Group Marker Index (GMI), which is intuitive, of low-computational complexity, and efficient in identification of both types of genes. Most gene selection methods identify only single-class specific signature genes and cannot identify multiple-class specific signature genes easily. Our algorithm can detect de novo certain conditions of multiple-class specificity of a gene and makes use of a novel non-parametric indicator to assess the discrimination ability between classes. Our method is effective even when the sample size is small as well as when the class sizes are significantly different. To compare the effectiveness and robustness we formulate an intuitive template-based method and use four well-known datasets. We demonstrate that our algorithm outperforms the template-based method in difficult cases with unbalanced distribution. Moreover, the multiple-class specific genes are good biomarkers and play important roles in biological pathways. Our literature survey supports that the proposed method identifies unique multiple-class specific marker genes (not reported earlier to be related to cancer) in the Central Nervous System data. It also discovers unique biomarkers indicating the intrinsic difference between subtypes of lung cancer. We also associate the pathway information with the multiple-class specific signature genes and cross-reference to published studies. We find that the identified genes participate in the pathways directly involved in cancer development in leukemia data. Our method gives a promising way to find genes that can involve in pathways of multiple diseases and hence opens up the possibility of using an existing drug on other diseases as well as designing a single drug for multiple diseases. PMID:21909426

  20. Properties of a Formal Method for Prediction of Emergent Behaviors in Swarm-based Systems

    NASA Technical Reports Server (NTRS)

    Rouff, Christopher; Vanderbilt, Amy; Hinchey, Mike; Truszkowski, Walt; Rash, James

    2004-01-01

    Autonomous intelligent swarms of satellites are being proposed for NASA missions that have complex behaviors and interactions. The emergent properties of swarms make these missions powerful, but at the same time more difficult to design and assure that proper behaviors will emerge. This paper gives the results of research into formal methods techniques for verification and validation of NASA swarm-based missions. Multiple formal methods were evaluated to determine their effectiveness in modeling and assuring the behavior of swarms of spacecraft. The NASA ANTS mission was used as an example of swarm intelligence for which to apply the formal methods. This paper will give the evaluation of these formal methods and give partial specifications of the ANTS mission using four selected methods. We then give an evaluation of the methods and the needed properties of a formal method for effective specification and prediction of emergent behavior in swarm-based systems.

  1. Development of a specification for flexible base construction.

    DOT National Transportation Integrated Search

    2014-01-01

    The Texas Department of Transportation (TxDOT) currently uses Item 247 Flexible Base to specify a pavement foundation course. The goal of this project was to evaluate the current method of base course acceptance and investigate methods to r...

  2. A mass spectrometry-based multiplex SNP genotyping by utilizing allele-specific ligation and strand displacement amplification.

    PubMed

    Park, Jung Hun; Jang, Hyowon; Jung, Yun Kyung; Jung, Ye Lim; Shin, Inkyung; Cho, Dae-Yeon; Park, Hyun Gyu

    2017-05-15

    We herein describe a new mass spectrometry-based method for multiplex SNP genotyping by utilizing allele-specific ligation and strand displacement amplification (SDA) reaction. In this method, allele-specific ligation is first performed to discriminate base sequence variations at the SNP site within the PCR-amplified target DNA. The primary ligation probe is extended by a universal primer annealing site while the secondary ligation probe has base sequences as an overhang with a nicking enzyme recognition site and complementary mass marker sequence. The ligation probe pairs are ligated by DNA ligase only at specific allele in the target DNA and the resulting ligated product serves as a template to promote the SDA reaction using a universal primer. This process isothermally amplifies short DNA fragments, called mass markers, to be analyzed by mass spectrometry. By varying the sizes of the mass markers, we successfully demonstrated the multiplex SNP genotyping capability of this method by reliably identifying several BRCA mutations in a multiplex manner with mass spectrometry. Copyright © 2016 Elsevier B.V. All rights reserved.

  3. Fabrication technology of CNT-Nickel Oxide based planar pseudocapacitor for MEMS and NEMS

    NASA Astrophysics Data System (ADS)

    Lebedev, E. A.; Kitsyuk, E. P.; Gavrilin, I. M.; Gromov, D. G.; Gruzdev, N. E.; Gavrilov, S. A.; Dronov, A. A.; Pavlov, A. A.

    2015-11-01

    The fabrication technology of a planar pseudocapacitor (PsC) based on a carbon nanotube (CNT) forest, synthesized by the plasma enhanced chemical vapor deposition (PECVD) method and covered with a thin nickel oxide layer deposited by the successive ionic layer adsorption and reaction (SILAR) method, is demonstrated. The dependence of the device's specific capacity on the thickness of the deposited oxide layer is studied. It is shown that the pseudocapacitance of the thin nickel oxide layer increases the specific capacity of the CNT-based device by up to 2.5 times.

  4. Paper-Based MicroRNA Expression Profiling from Plasma and Circulating Tumor Cells.

    PubMed

    Leong, Sai Mun; Tan, Karen Mei-Ling; Chua, Hui Wen; Huang, Mo-Chao; Cheong, Wai Chye; Li, Mo-Huang; Tucker, Steven; Koay, Evelyn Siew-Chuan

    2017-03-01

    Molecular characterization of circulating tumor cells (CTCs) holds great promise for monitoring metastatic progression and characterizing metastatic disease. However, leukocyte and red blood cell contamination of routinely isolated CTCs makes CTC-specific molecular characterization extremely challenging. Here we report the use of a paper-based medium for efficient extraction of microRNAs (miRNAs) from limited amounts of biological samples such as rare CTCs harvested from cancer patient blood. Specifically, we devised a workflow involving the use of Flinders Technology Associates (FTA) ® Elute Card with a digital PCR-inspired "partitioning" method to extract and purify miRNAs from plasma and CTCs. We demonstrated the sensitivity of this method to detect miRNA expression from as few as 3 cancer cells spiked into human blood. Using this method, background miRNA expression was excluded from contaminating blood cells, and CTC-specific miRNA expression profiles were derived from breast and colorectal cancer patients. Plasma separated out during purification of CTCs could likewise be processed using the same paper-based method for miRNA detection, thereby maximizing the amount of patient-specific information that can be derived from a single blood draw. Overall, this paper-based extraction method enables an efficient, cost-effective workflow for maximized recovery of small RNAs from limited biological samples for downstream molecular analyses. © 2016 American Association for Clinical Chemistry.

  5. Characterization and prediction of residues determining protein functional specificity.

    PubMed

    Capra, John A; Singh, Mona

    2008-07-01

    Within a homologous protein family, proteins may be grouped into subtypes that share specific functions that are not common to the entire family. Often, the amino acids present in a small number of sequence positions determine each protein's particular functional specificity. Knowledge of these specificity determining positions (SDPs) aids in protein function prediction, drug design and experimental analysis. A number of sequence-based computational methods have been introduced for identifying SDPs; however, their further development and evaluation have been hindered by the limited number of known experimentally determined SDPs. We combine several bioinformatics resources to automate a process, typically undertaken manually, to build a dataset of SDPs. The resulting large dataset, which consists of SDPs in enzymes, enables us to characterize SDPs in terms of their physicochemical and evolutionary properties. It also facilitates the large-scale evaluation of sequence-based SDP prediction methods. We present a simple sequence-based SDP prediction method, GroupSim, and show that, surprisingly, it is competitive with a representative set of current methods. We also describe ConsWin, a heuristic that considers sequence conservation of neighboring amino acids, and demonstrate that it improves the performance of all methods tested on our large dataset of enzyme SDPs. Datasets and GroupSim code are available online at http://compbio.cs.princeton.edu/specificity/. Supplementary data are available at Bioinformatics online.
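
    GroupSim itself is not reproduced in this record. As a hedged sketch of the general idea behind sequence-based SDP scoring (residues conserved within each functional subtype but different between subtypes), the toy scorer below rates a single alignment column; the gap handling and the scoring formula are simplifications of this editor, not the published method.

    ```python
    from collections import Counter

    def column_sdp_score(column, subtype_labels):
        """Toy specificity-determining-position score for one alignment column.

        column         : string of amino acids, one per sequence (gaps as '-')
        subtype_labels : list of subtype names, aligned with `column`
        Higher scores favor columns that are conserved within each subtype
        but differ between subtypes.
        """
        groups = {}
        for aa, label in zip(column, subtype_labels):
            if aa != '-':
                groups.setdefault(label, []).append(aa)
        if not groups:
            return 0.0

        # Within-group conservation: mean frequency of each group's majority residue.
        within = sum(Counter(res).most_common(1)[0][1] / len(res)
                     for res in groups.values()) / len(groups)

        # Between-group divergence: fraction of groups with a distinct majority residue.
        majorities = [Counter(res).most_common(1)[0][0] for res in groups.values()]
        between = len(set(majorities)) / len(majorities)

        return within * between

    # Example: column 'AAADDD' with labels ['s1']*3 + ['s2']*3 scores 1.0.
    ```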

  6. Literature-based condition-specific miRNA-mRNA target prediction.

    PubMed

    Oh, Minsik; Rhee, Sungmin; Moon, Ji Hwan; Chae, Heejoon; Lee, Sunwon; Kang, Jaewoo; Kim, Sun

    2017-01-01

    miRNAs are small non-coding RNAs that regulate gene expression by binding to the 3'-UTR of genes. Many recent studies have reported that miRNAs play important biological roles by regulating specific mRNAs or genes. Many sequence-based target prediction algorithms have been developed to predict miRNA targets. However, these methods are not designed for condition-specific target predictions and produce many false positives; thus, expression-based target prediction algorithms have been developed for condition-specific target predictions. A typical strategy to utilize expression data is to leverage the negative regulatory roles of miRNAs on their target genes. To control false positives, a stringent cutoff value is typically set, but in this case, these methods tend to reject many true target relationships, i.e., they produce false negatives. To overcome these limitations, additional information should be utilized, and the literature is probably the best resource available. Recent literature mining systems compile millions of articles with experiments designed for specific biological questions, and the systems provide a function to search for specific information. To utilize the literature information, we used a literature mining system, BEST, that automatically extracts information from the literature in PubMed and that allows the user to perform searches of the literature with any English words. By integrating omics data analysis methods and BEST, we developed Context-MMIA, a miRNA-mRNA target prediction method that combines expression data analysis results and the literature information extracted based on the user-specified context. In the pathway enrichment analysis using genes included in the top 200 miRNA targets, Context-MMIA outperformed the four existing target prediction methods that we tested. In another test of whether prediction methods can reproduce experimentally validated target relationships, Context-MMIA again outperformed the four existing target prediction methods. In summary, Context-MMIA allows the user to specify a context of the experimental data to predict miRNA targets, and we believe that Context-MMIA is very useful for predicting condition-specific miRNA targets.

  7. Automatic construction of subject-specific human airway geometry including trifurcations based on a CT-segmented airway skeleton and surface

    PubMed Central

    Miyawaki, Shinjiro; Tawhai, Merryn H.; Hoffman, Eric A.; Wenzel, Sally E.; Lin, Ching-Long

    2016-01-01

    We propose a method to construct three-dimensional airway geometric models based on airway skeletons, or centerlines (CLs). Given a CT-segmented airway skeleton and surface, the proposed CL-based method automatically constructs subject-specific models that contain anatomical information regarding branches, include bifurcations and trifurcations, and extend from the trachea to terminal bronchioles. The resulting model can be anatomically realistic with the assistance of an image-based surface; alternatively a model with an idealized skeleton and/or branch diameters is also possible. This method systematically identifies and classifies trifurcations to successfully construct the models, which also provides the number and type of trifurcations for the analysis of the airways from an anatomical point of view. We applied this method to 16 normal and 16 severe asthmatic subjects using their computed tomography images. The average distance between the surface of the model and the image-based surface was 11% of the average voxel size of the image. The four most frequent locations of trifurcations were the left upper division bronchus, left lower lobar bronchus, right upper lobar bronchus, and right intermediate bronchus. The proposed method automatically constructed accurate subject-specific three-dimensional airway geometric models that contain anatomical information regarding branches using airway skeleton, diameters, and image-based surface geometry. The proposed method can construct (i) geometry automatically for population-based studies, (ii) trifurcations to retain the original airway topology, (iii) geometry that can be used for automatic generation of computational fluid dynamics meshes, and (iv) geometry based only on a skeleton and diameters for idealized branches. PMID:27704229

  8. A hypersonic aeroheating calculation method based on inviscid outer edge of boundary layer parameters

    NASA Astrophysics Data System (ADS)

    Meng, ZhuXuan; Fan, Hu; Peng, Ke; Zhang, WeiHua; Yang, HuiXin

    2016-12-01

    This article presents a rapid and accurate aeroheating calculation method for hypersonic vehicles. The main innovation is to combine the accuracy of a numerical method with the efficiency of an engineering method, making aeroheating simulation both more precise and faster. Based on Prandtl boundary layer theory, the flow field is divided at the outer edge of the boundary layer into an inviscid region and a viscous region. The parameters at the outer edge of the boundary layer are calculated numerically under the inviscid-flow assumption. The thermodynamic parameters (constant-volume specific heat, constant-pressure specific heat, and specific heat ratio) are then calculated, the streamlines on the vehicle surface are derived, and the heat flux is obtained. The results for the double cone show that, at 0° and 10° angles of attack, the aeroheating calculation based on inviscid outer-edge-of-boundary-layer parameters reproduces the experimental data better than the engineering method. The proposed simulation results for the flight vehicle also reproduce the viscous numerical results well. Hence, this method provides a promising way to overcome the high cost of fully numerical calculation while improving precision.

  9. Language and Social Factors in the Use of Cell Phone Technology by Adolescents with and without Specific Language Impairment (SLI)

    ERIC Educational Resources Information Center

    Conti-Ramsden, Gina; Durkin, Kevin; Simkin, Zoe

    2010-01-01

    Purpose: This study aimed to compare cell phone use (both oral and text-based) by adolescents with and without specific language impairment (SLI) and examine the extent to which language and social factors affect frequency of use. Method: Both interview and diary methods were used to compare oral and text-based communication using cell phones by…

  10. Frequency-Specific Fractal Analysis of Postural Control Accounts for Control Strategies

    PubMed Central

    Gilfriche, Pierre; Deschodt-Arsac, Véronique; Blons, Estelle; Arsac, Laurent M.

    2018-01-01

    Diverse indicators of postural control in humans have been explored for decades, mostly based on the trajectory of the center of pressure. Classical approaches focus on variability, based on the notion that if a posture is too variable, the subject is not stable. Going deeper, an improved understanding of the underlying physiology has been gained from studying variability in different frequency ranges, pointing to specific short loops (proprioception) and long loops (visuo-vestibular) in neural control. More recently, fractal analyses have proliferated and become useful additional metrics of postural control; they have allowed the identification of two scaling phenomena, in short and long timescales respectively. Here, we show that one of the most widely used methods for fractal analysis, Detrended Fluctuation Analysis, can be enhanced to account for scaling in specific frequency ranges. By computing and filtering a bank of synthetic fractal signals, we established how scaling analysis can be focused on specific frequency components. We called the resulting method Frequency-specific Fractal Analysis (FsFA) and used it to associate the two scaling phenomena of postural control with the proprioception-based and visuo-vestibular-based control loops. Convincing arguments for the method's validity then came from an application to the study of unaltered vs. altered postural control in athletes. Overall, the analysis suggests that at least two timescales contribute to postural control: a velocity-based control in short timescales relying on proprioceptive sensors, and a position-based control in longer timescales relying on visuo-vestibular sensors, which is a new view of postural control. Frequency-specific scaling exponents are promising markers of control strategies in humans. PMID:29643816
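
    FsFA itself is not spelled out in this record. The sketch below shows only the standard Detrended Fluctuation Analysis step it builds on, with the frequency restriction reduced to a simple band-pass pre-filter; the filter settings, scale choices, and function names are illustrative assumptions, not the authors' exact recipe.

    ```python
    import numpy as np
    from scipy.signal import butter, filtfilt

    def dfa_alpha(x, scales=None, order=1):
        """Standard DFA: scaling exponent alpha of a (reasonably long) signal x."""
        x = np.asarray(x, dtype=float)
        y = np.cumsum(x - x.mean())                       # integrated profile
        if scales is None:
            # window sizes from 100 samples up to a quarter of the record
            scales = np.unique(np.logspace(2, np.log10(len(x) // 4), 20).astype(int))
        flucts = []
        for n in scales:
            n_win = len(y) // n
            segs = y[:n_win * n].reshape(n_win, n)
            t = np.arange(n)
            rms = []
            for seg in segs:
                coeffs = np.polyfit(t, seg, order)        # local polynomial trend
                rms.append(np.sqrt(np.mean((seg - np.polyval(coeffs, t)) ** 2)))
            flucts.append(np.mean(rms))
        alpha, _ = np.polyfit(np.log(scales), np.log(flucts), 1)
        return alpha

    def frequency_focused_alpha(x, fs, band, filt_order=4):
        """Illustration only: band-pass the signal before DFA to focus the scaling
        estimate on one frequency range (a simplification of the published FsFA)."""
        b, a = butter(filt_order, [f / (fs / 2.0) for f in band], btype='band')
        return dfa_alpha(filtfilt(b, a, x))
    ```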

  11. A rapid low-cost high-density DNA-based multi-detection test for routine inspection of meat species.

    PubMed

    Lin, Chun Chi; Fung, Lai Ling; Chan, Po Kwok; Lee, Cheuk Man; Chow, Kwok Fai; Cheng, Shuk Han

    2014-02-01

    The increasing occurrence of food fraud suggests that species identification should be part of food authentication. Current molecular species identification methods have their own limitations or drawbacks, such as relatively time-consuming experimental steps and expensive equipment; in particular, these methods cannot identify mixed species in a single experiment. This project proposes an improved method involving PCR amplification of the COI gene and detection of species-specific sequences by hybridisation. The major innovation lies in the detection of multiple species, including pork, beef, lamb, horse, cat, dog and mouse, from a mixed sample within a single experiment. The probes used are species-specific in both single-species and mixed-species samples. The proposed method can detect as little as 5 pg of DNA template in the PCR. By designing species-specific probes and adopting reverse dot blot hybridisation and flow-through hybridisation, a low-cost high-density DNA-based multi-detection test suitable for routine inspection of meat species was developed. © 2013.

  12. Assessment of public health impact of work-related asthma.

    PubMed

    Jaakkola, Maritta S; Jaakkola, Jouni J K

    2012-03-05

    Asthma is among the most common chronic diseases in working-aged populations, and occupational exposures are important causal agents. Our aims were to evaluate the best methods to assess occurrence, public health impact, and burden to society related to occupational or work-related asthma and to achieve comparable estimates for different populations. We addressed three central questions: 1: What is the best method to assess the occurrence of occupational asthma? We evaluated: 1) assessment of the occurrence of occupational asthma per se, and 2) assessment of adult-onset asthma and the population attributable fractions due to specific occupational exposures. 2: What are the best methods to assess public health impact and burden to society related to occupational or work-related asthma? We evaluated methods based on assessment of excess burden of disease due to specific occupational exposures. 3: How can comparable estimates be achieved for different populations? We evaluated the comparability of estimates of occurrence and burden attributable to occupational asthma based on different methods. Assessment of the occurrence of occupational asthma per se can be used in countries with good coverage of the identification system for occupational asthma, i.e. countries with well-functioning occupational health services. Assessment based on adult-onset asthma and population attributable fractions due to specific occupational exposures is a good approach to estimate the occurrence of occupational asthma at the population level. For assessment of public health impact from work-related asthma we recommend assessing the excess burden of disease due to specific occupational exposures, including excess incidence of asthma complemented by an assessment of disability from it. International comparability of estimates can best be achieved by methods based on population attributable fractions. Public health impact assessment for occupational asthma is central in prevention and health policy planning and could be improved by purposeful development of methods for assessing health benefits from preventive actions. Registry-based methods are suitable for evaluating time-trends of occurrence in a given population, but they face serious limitations for international comparisons. Assessment of excess burden of disease due to specific occupational exposure is a useful measure when there is valid information on population exposure and attributable fractions.
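
    The population attributable fraction used above is not defined in the abstract. For reference, the standard Levin formula (general epidemiological background, not a result of this study) is

    ```latex
    \mathrm{PAF} = \frac{p_e\,(\mathrm{RR} - 1)}{1 + p_e\,(\mathrm{RR} - 1)},
    ```

    where p_e is the prevalence of the occupational exposure in the population and RR is the relative risk of adult-onset asthma among the exposed. The excess (attributable) incidence is then the PAF multiplied by the observed incidence of adult-onset asthma.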

  13. Thermal Desorption Analysis of Effective Specific Soil Surface Area

    NASA Astrophysics Data System (ADS)

    Smagin, A. V.; Bashina, A. S.; Klyueva, V. V.; Kubareva, A. V.

    2017-12-01

    A new method of assessing the effective specific surface area, based on the successive thermal desorption of water vapor at different temperature stages of sample drying, is analyzed in comparison with the conventional static adsorption method using a representative set of soil samples of different genesis and degree of dispersion. The theory of the method uses the fundamental relationship between the thermodynamic water potential (Ψ) and the absolute drying temperature (T): Ψ = Q - aT, where Q is the specific heat of vaporization and a is a physically based parameter related to the initial temperature and relative humidity of the air in the external thermodynamic reservoir (laboratory). From gravimetric data on the mass fraction of water (W) and the Ψ value, Polanyi potential curves (W(Ψ)) for the studied samples are plotted. Water sorption isotherms are then calculated, from which the monolayer capacity and the target effective specific surface area are determined using the BET theory. Comparative analysis shows that the new method agrees well with the conventional estimation of the degree of dispersion by the BET and Kutilek methods over a wide range of specific surface area values, between 10 and 250 m2/g.
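
    A minimal numerical sketch of the final step described above, the BET estimate of monolayer capacity and specific surface area from a water sorption isotherm; how the isotherm is obtained from the desorption data follows the paper's own procedure and is not reproduced here. The water-molecule cross-section is a commonly assumed value, not taken from the study.

    ```python
    import numpy as np

    N_A = 6.022e23          # molecules per mol
    M_WATER = 18.0e-3       # kg per mol
    SIGMA_WATER = 1.06e-19  # m^2 per adsorbed water molecule (assumed value, ~10.6 A^2)

    def bet_surface_area(rel_pressure, w):
        """Monolayer capacity and specific surface area via the linearized BET
        equation, valid roughly for 0.05 < P/P0 < 0.35.

        rel_pressure : array of P/P0 values of the water sorption isotherm
        w            : gravimetric water content (kg water per kg soil) at each P/P0
        """
        x = np.asarray(rel_pressure, float)
        w = np.asarray(w, float)
        y = x / (w * (1.0 - x))                     # BET transform
        slope, intercept = np.polyfit(x, y, 1)
        w_m = 1.0 / (slope + intercept)             # monolayer capacity, kg/kg
        c = slope / intercept + 1.0                 # BET constant
        area = w_m / M_WATER * N_A * SIGMA_WATER    # m^2 per kg of soil
        return w_m, c, area / 1000.0                # specific surface in m^2/g
    ```

    With a monolayer capacity around 0.03 kg/kg this yields roughly 100 m2/g, i.e. within the 10-250 m2/g range quoted above.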

  14. Evaluation of the Technical Adequacy of Three Methods for Identifying Specific Learning Disabilities Based on Cognitive Discrepancies

    ERIC Educational Resources Information Center

    Stuebing, Karla K.; Fletcher, Jack M.; Branum-Martin, Lee; Francis, David J.

    2012-01-01

    This study used simulation techniques to evaluate the technical adequacy of three methods for the identification of specific learning disabilities via patterns of strengths and weaknesses in cognitive processing. Latent and observed data were generated and the decision-making process of each method was applied to assess concordance in…

  15. A bibliography on formal methods for system specification, design and validation

    NASA Technical Reports Server (NTRS)

    Meyer, J. F.; Furchtgott, D. G.; Movaghar, A.

    1982-01-01

    Literature on the specification, design, verification, testing, and evaluation of avionics systems was surveyed, providing 655 citations. Journal papers, conference papers, and technical reports are included. Manual and computer-based methods were employed. Keywords used in the online search are listed.

  16. Droplet Digital PCR-Based Chimerism Analysis for Primary Immunodeficiency Diseases.

    PubMed

    Okano, Tsubasa; Tsujita, Yuki; Kanegane, Hirokazu; Mitsui-Sekinaka, Kanako; Tanita, Kay; Miyamoto, Satoshi; Yeh, Tzu-Wen; Yamashita, Motoi; Terada, Naomi; Ogura, Yumi; Takagi, Masatoshi; Imai, Kohsuke; Nonoyama, Shigeaki; Morio, Tomohiro

    2018-04-01

    In the current study, we aimed to accurately evaluate donor/recipient or male/female chimerism in samples from patients who underwent hematopoietic stem cell transplantation (HSCT). We designed droplet digital polymerase chain reaction (ddPCR) assays for SRY and RPP30 to detect male/female chimerism. We also developed mutation-specific ddPCR assays for four primary immunodeficiency diseases. The accuracy of the male/female chimerism analysis using ddPCR was confirmed by comparing the results with those of conventional methods (fluorescence in situ hybridization and short tandem repeat-PCR) and by evaluating dilution assays. In particular, we found that this method was useful for analyzing small samples; thus, it could be applied to patient samples, especially sorted leukocyte subpopulations, during the early post-transplant period. The four mutation-specific ddPCR assays accurately detected post-transplant chimerism. ddPCR-based male/female chimerism analysis and mutation-specific ddPCR were useful for all HSCT cases, and these simple methods help in following post-transplant chimerism, especially in disease-specific small leukocyte fractions.
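
    A short sketch of the Poisson arithmetic that underlies any ddPCR quantification (not code from the study; the droplet volume and the duplex-assay framing are assumptions): convert positive/negative droplet counts into a concentration, then express chimerism as the ratio of the informative target to the total.

    ```python
    import math

    DROPLET_VOL_UL = 0.85e-3   # ~0.85 nL per droplet; an assumed partition volume

    def copies_per_ul(positive, total, droplet_vol_ul=DROPLET_VOL_UL):
        """Poisson-corrected target concentration from droplet counts."""
        if positive >= total:
            raise ValueError("all droplets positive: concentration not quantifiable")
        lam = -math.log(1.0 - positive / total)    # mean copies per droplet
        return lam / droplet_vol_ul

    def donor_fraction(donor_pos, recipient_pos, total):
        """Donor chimerism from a duplexed donor-/recipient-informative assay."""
        donor = copies_per_ul(donor_pos, total)
        recipient = copies_per_ul(recipient_pos, total)
        return donor / (donor + recipient)

    # Example: donor_fraction(4200, 310, 15000) is approximately 0.94.
    ```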

  17. Homotopy method for optimization of variable-specific-impulse low-thrust trajectories

    NASA Astrophysics Data System (ADS)

    Chi, Zhemin; Yang, Hongwei; Chen, Shiyu; Li, Junfeng

    2017-11-01

    The homotopy method has been a useful tool for solving fuel-optimal trajectory problems with constant-specific-impulse low thrust. However, the specific impulse is variable for many practical solar-electric power-limited thrusters. This paper investigates the application of the homotopy method to the optimization of variable-specific-impulse low-thrust trajectories. Difficulties arise when the two commonly used homotopy functions are employed for trajectory optimization: the optimal power throttle level and the optimal specific impulse are coupled under the commonly used quadratic and logarithmic homotopy functions. To overcome these difficulties, a modified logarithmic homotopy function is proposed to serve as a gateway for trajectory optimization, leading to decoupled expressions for both the optimal power throttle level and the optimal specific impulse. A homotopy method based on this function is then proposed. Numerical simulations validate the feasibility and high efficiency of the proposed method.
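
    For orientation only: the paper's modified logarithmic homotopy function is not reproduced in this record. The two commonly used smoothing functions it refers to are usually written, for a throttle u in (0, 1) and homotopy parameter ε in (0, 1], in forms such as the following (standard background from the low-thrust literature, not the authors' modified function); letting ε tend to 0 recovers the fuel-optimal bang-bang structure.

    ```latex
    J_\varepsilon^{\mathrm{quad}} = \int_{t_0}^{t_f} \big[\, u - \varepsilon\, u\,(1 - u) \,\big]\,\mathrm{d}t ,
    \qquad
    J_\varepsilon^{\mathrm{log}} = \int_{t_0}^{t_f} \big\{\, u - \varepsilon \big[\ln u + \ln (1 - u)\big] \,\big\}\,\mathrm{d}t .
    ```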

  18. MSP-HTPrimer: a high-throughput primer design tool to improve assay design for DNA methylation analysis in epigenetics.

    PubMed

    Pandey, Ram Vinay; Pulverer, Walter; Kallmeyer, Rainer; Beikircher, Gabriel; Pabinger, Stephan; Kriegner, Albert; Weinhäusel, Andreas

    2016-01-01

    Bisulfite (BS) conversion-based and methylation-sensitive restriction enzyme (MSRE)-based PCR methods have been the most commonly used techniques for locus-specific DNA methylation analysis. However, both methods have advantages and limitations, so an integrated approach would be extremely useful to quantify DNA methylation status with high sensitivity and specificity. Designing specific and optimized primers for the target regions is the most critical and challenging step in obtaining adequate DNA methylation results with PCR-based methods, yet no integrated, optimized, and high-throughput methylation-specific primer design software is currently available for both BS- and MSRE-based methods. We have developed a new web-based pipeline, called MSP-HTPrimer, to design primer pairs for MSP, BSP, pyrosequencing, COBRA, and MSRE assays on both genomic strands. First, the pipeline converts all target sequences into bisulfite-treated templates for both the forward and reverse strands and designs all possible primer pairs, followed by filtering for single nucleotide polymorphisms (SNPs) and known repeat regions. Next, each primer pair is annotated with the upstream and downstream RefSeq genes, CpG islands, and cut sites (for COBRA and MSRE). Finally, MSP-HTPrimer selects specific primers from both strands based on custom, user-defined hierarchical selection criteria. MSP-HTPrimer produces a primer pair summary table in TXT and HTML format for display, and UCSC custom tracks for the resulting primer pairs in GTF format. MSP-HTPrimer is an integrated, web-based, high-throughput pipeline with no limitation on the number or size of target sequences, and it is the only pipeline that automatically designs primers on both genomic strands to increase the success rate. The pipeline is fully configured within a virtual machine and can therefore be used without any setup. We have experimentally validated primer pairs designed by the pipeline and shown a very high success rate: out of 66 BSP primer pairs, 63 were successfully validated without any further optimization and under the same qPCR conditions. The MSP-HTPrimer pipeline is freely available from http://sourceforge.net/p/msp-htprimer.
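
    As a hedged illustration of the first pipeline step described above (converting a target into bisulfite-treated templates for both strands), the helper below performs a naive in-silico conversion in which every cytosine outside a CpG context becomes thymine and CpG cytosines are written as the ambiguity code Y; this is a common convention for methylation-aware primer design, not the pipeline's actual code.

    ```python
    from typing import Tuple

    COMPLEMENT = str.maketrans("ACGTN", "TGCAN")

    def reverse_complement(seq: str) -> str:
        return seq.translate(COMPLEMENT)[::-1]

    def bisulfite_convert(seq: str) -> str:
        """Naive in-silico bisulfite conversion of one strand.

        Non-CpG cytosines are deaminated to T; CpG cytosines may be methylated
        (protected) or not, so they are written as the IUPAC code Y (C or T).
        """
        seq = seq.upper()
        out = []
        for i, base in enumerate(seq):
            if base == "C":
                out.append("Y" if i + 1 < len(seq) and seq[i + 1] == "G" else "T")
            else:
                out.append(base)
        return "".join(out)

    def bisulfite_templates(seq: str) -> Tuple[str, str]:
        """Converted top strand and converted bottom strand of a target region."""
        return bisulfite_convert(seq), bisulfite_convert(reverse_complement(seq))

    # Example: bisulfite_convert("ACGTCCAT") -> "AYGTTTAT"
    ```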

  19. A comparison study of size-specific dose estimate calculation methods.

    PubMed

    Parikh, Roshni A; Wien, Michael A; Novak, Ronald D; Jordan, David W; Klahr, Paul; Soriano, Stephanie; Ciancibello, Leslie; Berlin, Sheila C

    2018-01-01

    The size-specific dose estimate (SSDE) has emerged as an improved metric for use by medical physicists and radiologists for estimating individual patient dose. Several methods of calculating SSDE have been described, ranging from patient thickness or attenuation-based (automated and manual) measurements to weight-based techniques. To compare the accuracy of thickness vs. weight measurement of body size to allow for the calculation of the size-specific dose estimate (SSDE) in pediatric body CT. We retrospectively identified 109 pediatric body CT examinations for SSDE calculation. We examined two automated methods measuring a series of level-specific diameters of the patient's body: method A used the effective diameter and method B used the water-equivalent diameter. Two manual methods measured patient diameter at two predetermined levels: the superior endplate of L2, where body width is typically most thin, and the superior femoral head or iliac crest (for scans that did not include the pelvis), where body width is typically most thick; method C averaged lateral measurements at these two levels from the CT projection scan, and method D averaged lateral and anteroposterior measurements at the same two levels from the axial CT images. Finally, we used body weight to characterize patient size, method E, and compared this with the various other measurement methods. Methods were compared across the entire population as well as by subgroup based on body width. Concordance correlation (ρc) between each of the SSDE calculation methods (methods A-E) was greater than 0.92 across the entire population, although the range was wider when analyzed by subgroup (0.42-0.99). When we compared each SSDE measurement method with CTDIvol, there was poor correlation, ρc < 0.77, with percentage differences between 20.8% and 51.0%. Automated computer algorithms are accurate and efficient in the calculation of SSDE. Manual methods based on patient thickness provide acceptable dose estimates for pediatric patients <30 cm in body width. Body weight provides a quick and practical method to identify conversion factors that can be used to estimate SSDE with reasonable accuracy in pediatric patients with body width ≥20 cm.
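
    A minimal sketch of the SSDE arithmetic common to the methods compared above: an effective diameter is derived from the body dimensions and a size-dependent conversion factor is applied to CTDIvol. The exponential-fit coefficients below are the values commonly quoted from AAPM Report 204 for the 32 cm body phantom; treat them as assumptions and verify against the report for the phantom actually used.

    ```python
    import math

    def effective_diameter(ap_cm: float, lat_cm: float) -> float:
        """Effective diameter (cm) from anteroposterior and lateral body dimensions."""
        return math.sqrt(ap_cm * lat_cm)

    def ssde(ctdi_vol_mgy: float, eff_diam_cm: float,
             a: float = 3.704369, b: float = 0.03671937) -> float:
        """SSDE (mGy) = size-dependent conversion factor x CTDIvol.

        a, b: exponential-fit coefficients commonly quoted from AAPM Report 204
        for the 32 cm reference phantom (assumed here, not taken from this study).
        """
        f = a * math.exp(-b * eff_diam_cm)
        return f * ctdi_vol_mgy

    # Example: ssde(5.0, 20.0) is about 8.9 mGy for a small pediatric patient.
    ```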

  20. Evaluation of Two PCR-Based Swine-Specific Fecal Source Tracking Assays (Poster)

    EPA Science Inventory

    Several PCR-based methods have been proposed to identify swine fecal pollution in environmental waters. However, the specificity and distribution of these targets have not been adequately assessed. Consequently, the utility of these assays in identifying swine fecal contamination...

  1. Design of a species-specific PCR method for the detection of the heat-resistant fungi Talaromyces macrosporus and Talaromyces trachyspermus.

    PubMed

    Yamashita, S; Nakagawa, H; Sakaguchi, T; Arima, T-H; Kikoku, Y

    2018-01-01

    Heat-resistant fungi occur sporadically and are a continuing problem for the food and beverage industry. The genus Talaromyces is typical of such fungi, producing the heat-resistant ascospores responsible for the spoilage of processed food products. Isocitrate lyase, a signature enzyme of the glyoxylate cycle, is required for the metabolism of non-fermentable carbon compounds such as acetate and ethanol. Here, species-specific primer sets for detection and identification of DNA derived from Talaromyces macrosporus and Talaromyces trachyspermus were designed based on the nucleotide sequences of their isocitrate lyase genes. Polymerase chain reaction (PCR) using a species-specific primer set amplified products specific to T. macrosporus and T. trachyspermus. Other fungal species, such as Byssochlamys fulva and Hamigera striata, which cause food spoilage, were not detected using the Talaromyces-specific primer sets. The detection limit for each species-specific primer set was determined to be 50 pg of template DNA, without using a nested PCR method. The specificity of each species-specific primer set was maintained in the presence of 1,000-fold amounts of genomic DNA from other fungi. The method also detected fungal DNA extracted from blueberry inoculated with T. macrosporus. This PCR method provides a quick, simple, powerful and reliable way to detect T. macrosporus and T. trachyspermus. PCR-based detection is rapid, convenient and sensitive compared with traditional methods of detecting heat-resistant fungi. In this study, a PCR-based method was developed for the detection and identification of amplification products from Talaromyces macrosporus and Talaromyces trachyspermus using primer sets that target the isocitrate lyase gene. This method could be used for the on-site detection of T. macrosporus and T. trachyspermus in the near future, and will be helpful in the safety control of raw materials and in food and beverage production. © 2017 The Authors. Letters in Applied Microbiology published by John Wiley & Sons Ltd on behalf of The Society for Applied Microbiology.

  2. Improved regulatory element prediction based on tissue-specific local epigenomic signatures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    He, Yupeng; Gorkin, David U.; Dickel, Diane E.

    Accurate enhancer identification is critical for understanding the spatiotemporal transcriptional regulation during development as well as the functional impact of disease-related noncoding genetic variants. Computational methods have been developed to predict the genomic locations of active enhancers based on histone modifications, but the accuracy and resolution of these methods remain limited. Here, we present an algorithm, regulatory element prediction based on tissue-specific local epigenetic marks (REPTILE), which integrates histone modification and whole-genome cytosine DNA methylation profiles to identify the precise location of enhancers. We tested the ability of REPTILE to identify enhancers previously validated in reporter assays. Compared with existing methods, REPTILE shows consistently superior performance across diverse cell and tissue types, and the enhancer locations are significantly more refined. We show that, by incorporating base-resolution methylation data, REPTILE greatly improves upon current methods for annotation of enhancers across a variety of cell and tissue types.

  3. Improved regulatory element prediction based on tissue-specific local epigenomic signatures

    DOE PAGES

    He, Yupeng; Gorkin, David U.; Dickel, Diane E.; ...

    2017-02-13

    Accurate enhancer identification is critical for understanding the spatiotemporal transcriptional regulation during development as well as the functional impact of disease-related noncoding genetic variants. Computational methods have been developed to predict the genomic locations of active enhancers based on histone modifications, but the accuracy and resolution of these methods remain limited. Here, we present an algorithm, regulatory element prediction based on tissue-specific local epigenetic marks (REPTILE), which integrates histone modification and whole-genome cytosine DNA methylation profiles to identify the precise location of enhancers. We tested the ability of REPTILE to identify enhancers previously validated in reporter assays. Compared with existing methods, REPTILE shows consistently superior performance across diverse cell and tissue types, and the enhancer locations are significantly more refined. We show that, by incorporating base-resolution methylation data, REPTILE greatly improves upon current methods for annotation of enhancers across a variety of cell and tissue types.

  4. Combining population and patient-specific characteristics for prostate segmentation on 3D CT images

    NASA Astrophysics Data System (ADS)

    Ma, Ling; Guo, Rongrong; Tian, Zhiqiang; Venkataraman, Rajesh; Sarkar, Saradwata; Liu, Xiabi; Tade, Funmilayo; Schuster, David M.; Fei, Baowei

    2016-03-01

    Prostate segmentation on CT images is a challenging task. In this paper, we explore population and patient-specific characteristics for the segmentation of the prostate on CT images. Because population learning does not consider inter-patient variations and because patient-specific learning may not perform well for different patients, we combine population and patient-specific information to improve segmentation performance. Specifically, we train a population model based on the population data and train a patient-specific model based on the manual segmentation of three slices of the new patient. We compute the similarity between the two models to explore the influence of applicable population knowledge on the specific patient. By combining the patient-specific knowledge with this influence, we can capture both population and patient-specific characteristics to calculate the probability of a pixel belonging to the prostate. Finally, we smooth the prostate surface according to the prostate-density value of the pixels in the distance transform image. We conducted leave-one-out validation experiments on a set of CT volumes from 15 patients, with manual segmentation results from a radiologist serving as the gold standard for the evaluation. Experimental results show that our method achieved an average DSC of 85.1% compared with the manual segmentation gold standard, outperforming both the population learning method and the patient-specific learning approach alone. The CT segmentation method can have various applications in prostate cancer diagnosis and therapy.

  5. Qualitative Assessment of Inquiry-Based Teaching Methods

    ERIC Educational Resources Information Center

    Briggs, Michael; Long, George; Owens, Katrina

    2011-01-01

    A new approach to teaching method assessment using student focused qualitative studies and the theoretical framework of mental models is proposed. The methodology is considered specifically for the advantages it offers when applied to the assessment of inquiry-based teaching methods. The theoretical foundation of mental models is discussed, and…

  6. Disease gene prioritization by integrating tissue-specific molecular networks using a robust multi-network model.

    PubMed

    Ni, Jingchao; Koyuturk, Mehmet; Tong, Hanghang; Haines, Jonathan; Xu, Rong; Zhang, Xiang

    2016-11-10

    Accurately prioritizing candidate disease genes is an important and challenging problem. Various network-based methods have been developed to predict potential disease genes by utilizing the disease similarity network and molecular networks such as protein interaction or gene co-expression networks. Although successful, a common limitation of the existing methods is that they assume all diseases share the same molecular network, so a single generic molecular network is used to predict candidate genes for all diseases. However, different diseases tend to manifest in different tissues, and the molecular networks in different tissues are usually different. An ideal method should be able to incorporate tissue-specific molecular networks for different diseases. In this paper, we develop a robust and flexible method to integrate tissue-specific molecular networks for disease gene prioritization. Our method allows each disease to have its own tissue-specific network(s). We formulate the problem of candidate gene prioritization as an optimization problem based on network propagation. When there are multiple tissue-specific networks available for a disease, our method can automatically infer the relative importance of each tissue-specific network; it is therefore robust to noisy and incomplete network data. To solve the optimization problem, we develop fast algorithms with linear time complexity in the number of nodes in the molecular networks. We also provide rigorous theoretical foundations for our algorithms in terms of their optimality and convergence properties. Extensive experimental results show that our method can significantly improve the accuracy of candidate gene prioritization compared with the state-of-the-art methods. In our experiments, we compare our methods with 7 popular network-based disease gene prioritization algorithms on diseases from the Online Mendelian Inheritance in Man (OMIM) database. The experimental results demonstrate that our methods recover true associations more accurately than other methods in terms of AUC values, and the performance differences are significant (with paired t-test p-values less than 0.05). This validates the importance of integrating tissue-specific molecular networks for disease gene prioritization and shows the superiority of our network models and ranking algorithms for this purpose. The source code and datasets are available at http://nijingchao.github.io/CRstar/ .
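
    The optimization described above is not reproduced in this record. As a hedged sketch of the network-propagation core that such prioritization methods share, the snippet below runs a random walk with restart on a (possibly tissue-specific) molecular network, with the network combination reduced to a fixed weighted average; the weighting scheme and names are illustrative and are not the CRstar algorithm.

    ```python
    import numpy as np

    def column_normalize(adj):
        """Column-normalize an adjacency matrix so propagation conserves probability."""
        adj = np.asarray(adj, float)
        col_sums = adj.sum(axis=0)
        col_sums[col_sums == 0] = 1.0
        return adj / col_sums

    def random_walk_with_restart(adj, seeds, restart=0.5, tol=1e-8, max_iter=1000):
        """Propagate seed-gene scores over a molecular network.

        adj   : (n x n) adjacency matrix of the tissue-specific network
        seeds : (n,) vector, nonzero at known disease genes
        Returns the stationary score vector used to rank candidate genes.
        """
        w = column_normalize(adj)
        seeds = np.asarray(seeds, float)
        p0 = seeds / seeds.sum()
        p = p0.copy()
        for _ in range(max_iter):
            p_next = (1.0 - restart) * w @ p + restart * p0
            if np.abs(p_next - p).sum() < tol:
                break
            p = p_next
        return p

    def combine_tissue_networks(adjs, weights):
        """Fixed-weight combination of several tissue-specific networks (illustrative
        stand-in for the paper's automatically inferred network weights)."""
        weights = np.asarray(weights, float) / np.sum(weights)
        return sum(w * np.asarray(a, float) for w, a in zip(weights, adjs))
    ```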

  7. HomPPI: a class of sequence homology based protein-protein interface prediction methods

    PubMed Central

    2011-01-01

    Background: Although homology-based methods are among the most widely used methods for predicting the structure and function of proteins, the question as to whether interface sequence conservation can be effectively exploited in predicting protein-protein interfaces has been a subject of debate. Results: We studied more than 300,000 pair-wise alignments of protein sequences from structurally characterized protein complexes, including both obligate and transient complexes. We identified sequence similarity criteria required for accurate homology-based inference of interface residues in a query protein sequence. Based on these analyses, we developed HomPPI, a class of sequence homology-based methods for predicting protein-protein interface residues. We present two variants of HomPPI: (i) NPS-HomPPI (Non partner-specific HomPPI), which can be used to predict interface residues of a query protein in the absence of knowledge of the interaction partner; and (ii) PS-HomPPI (Partner-specific HomPPI), which can be used to predict the interface residues of a query protein with a specific target protein. Our experiments on a benchmark dataset of obligate homodimeric complexes show that NPS-HomPPI can reliably predict protein-protein interface residues in a given protein, with an average correlation coefficient (CC) of 0.76, sensitivity of 0.83, and specificity of 0.78, when sequence homologs of the query protein can be reliably identified. NPS-HomPPI also reliably predicts the interface residues of intrinsically disordered proteins. Our experiments suggest that NPS-HomPPI is competitive with several state-of-the-art interface prediction servers, including those that exploit the structure of the query proteins. The partner-specific classifier, PS-HomPPI, can, on a large dataset of transient complexes, predict the interface residues of a query protein with a specific target, with a CC of 0.65, sensitivity of 0.69, and specificity of 0.70, when homologs of both the query and the target can be reliably identified. The HomPPI web server is available at http://homppi.cs.iastate.edu/. Conclusions: Sequence homology-based methods offer a class of computationally efficient and reliable approaches for predicting the protein-protein interface residues that participate in either obligate or transient interactions. For query proteins involved in transient interactions, the reliability of interface residue prediction can be improved by exploiting knowledge of putative interaction partners. PMID:21682895

  8. [Study on commercial specification of atractylodes based on Delphi method].

    PubMed

    Wang, Hao; Chen, Li-Xiao; Huang, Lu-Qi; Zhang, Tian-Tian; Li, Ying; Zheng, Yu-Guang

    2016-03-01

    This research adopts "Delphi method" to evaluate atractylodes traditional traits and rank correlation. By using methods of mathematical statistics the relationship of the traditional identification indicators and atractylodes goods rank correlation was analyzed, It is found that the main characteristics affectingatractylodes commodity specifications and grades of main characters wereoil points of transaction,color of transaction,color of surface,grain of transaction,texture of transaction andspoilage. The study points out that the original "seventy-six kinds of medicinal materials commodity specification standards of atractylodes differentiate commodity specification" is not in conformity with the actual market situation, we need to formulate corresponding atractylodes medicinal products specifications and grades.This study combined with experimental results "Delphi method" and the market actual situation, proposed the new draft atractylodes commodity specifications and grades, as the new atractylodes commodity specifications and grades standards. It provides a reference and theoretical basis. Copyright© by the Chinese Pharmaceutical Association.

  9. Prediction of protein-protein interaction network using a multi-objective optimization approach.

    PubMed

    Chowdhury, Archana; Rakshit, Pratyusha; Konar, Amit

    2016-06-01

    Protein-Protein Interactions (PPIs) are very important as they coordinate almost all cellular processes. This paper formulates the PPI prediction problem in a multi-objective optimization framework. The scoring functions for a trial solution deal with the simultaneous maximization of functional similarity, the strength of the domain interaction profiles, and the number of common neighbors of the proteins predicted to be interacting. The resulting optimization problem is solved using the proposed Firefly Algorithm with Nondominated Sorting. Experiments reveal that the proposed PPI prediction technique outperforms existing methods, including the gene ontology-based Relative Specific Similarity, the multi-domain-based Domain Cohesion Coupling method, the domain-based Random Decision Forest method, Bagging with REP Tree, and evolutionary/swarm algorithm-based approaches, with respect to sensitivity, specificity, and F1 score.

  10. A hybrid patient-specific biomechanical model based image registration method for the motion estimation of lungs.

    PubMed

    Han, Lianghao; Dong, Hua; McClelland, Jamie R; Han, Liangxiu; Hawkes, David J; Barratt, Dean C

    2017-07-01

    This paper presents a new hybrid biomechanical model-based non-rigid image registration method for lung motion estimation. In the proposed method, a patient-specific biomechanical modelling process captures major physically realistic deformations with explicit physical modelling of sliding motion, whilst a subsequent non-rigid image registration process compensates for small residuals. The proposed algorithm was evaluated with 10 4D CT datasets of lung cancer patients. The target registration error (TRE), defined as the Euclidean distance of landmark pairs, was significantly lower with the proposed method (TRE = 1.37 mm) than with biomechanical modelling (TRE = 3.81 mm) and intensity-based image registration without specific considerations for sliding motion (TRE = 4.57 mm). The proposed method achieved a comparable accuracy as several recently developed intensity-based registration algorithms with sliding handling on the same datasets. A detailed comparison on the distributions of TREs with three non-rigid intensity-based algorithms showed that the proposed method performed especially well on estimating the displacement field of lung surface regions (mean TRE = 1.33 mm, maximum TRE = 5.3 mm). The effects of biomechanical model parameters (such as Poisson's ratio, friction and tissue heterogeneity) on displacement estimation were investigated. The potential of the algorithm in optimising biomechanical models of lungs through analysing the pattern of displacement compensation from the image registration process has also been demonstrated. Copyright © 2017 Elsevier B.V. All rights reserved.

  11. Event specific qualitative and quantitative polymerase chain reaction detection of genetically modified MON863 maize based on the 5'-transgene integration sequence.

    PubMed

    Yang, Litao; Xu, Songci; Pan, Aihu; Yin, Changsong; Zhang, Kewei; Wang, Zhenying; Zhou, Zhigang; Zhang, Dabing

    2005-11-30

    To support the genetically modified organism (GMO) labeling policies issued in many countries and areas, polymerase chain reaction (PCR) methods such as screening, gene-specific, construct-specific, and event-specific PCR detection methods have been developed and have become a mainstay of GMO detection. The event-specific PCR detection method is the primary trend in GMO detection because of its high specificity, which is based on the flanking sequence of the exogenous integrant. The genetically modified maize MON863 contains a Cry3Bb1 coding sequence that produces a protein with enhanced insecticidal activity against the coleopteran pest corn rootworm. In this study, the 5'-integration junction sequence between the host plant DNA and the integrated gene construct of MON863 was revealed by means of thermal asymmetric interlaced PCR, and specific PCR primers and a TaqMan probe were designed based on the revealed 5'-integration junction sequence; conventional qualitative PCR and quantitative TaqMan real-time PCR detection methods employing these primers and probes were successfully developed. In the conventional qualitative PCR assay, the limit of detection (LOD) was 0.1% for MON863 in 100 ng of maize genomic DNA per reaction. In the quantitative TaqMan real-time PCR assay, the LOD and the limit of quantification were eight and 80 haploid genome copies, respectively. In addition, three mixed maize samples with known MON863 contents were analyzed using the established real-time PCR systems, and the results indicated that the established event-specific real-time PCR detection systems were reliable, sensitive, and accurate.

  12. Correction factors for the NMi free-air ionization chamber for medium-energy x-rays calculated with the Monte Carlo method.

    PubMed

    Grimbergen, T W; van Dijk, E; de Vries, W

    1998-11-01

    A new method is described for the determination of x-ray quality dependent correction factors for free-air ionization chambers. The method is based on weighting correction factors for mono-energetic photons, which are calculated using the Monte Carlo method, with measured air kerma spectra. With this method, correction factors for electron loss, scatter inside the chamber and transmission through the diaphragm and front wall have been calculated for the NMi free-air chamber for medium-energy x-rays for a wide range of x-ray qualities in use at NMi. The newly obtained correction factors were compared with the values in use at present, which are based on interpolation of experimental data for a specific set of x-ray qualities. For x-ray qualities which are similar to this specific set, the agreement between the correction factors determined with the new method and those based on the experimental data is better than 0.1%, except for heavily filtered x-rays generated at 250 kV. For x-ray qualities dissimilar to the specific set, differences up to 0.4% exist, which can be explained by uncertainties in the interpolation procedure of the experimental data. Since the new method does not depend on experimental data for a specific set of x-ray qualities, the new method allows for a more flexible use of the free-air chamber as a primary standard for air kerma for any x-ray quality in the medium-energy x-ray range.
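
    A minimal sketch of the weighting step described above (the data layout is an assumption of this editor): the quality-dependent correction factor is the air-kerma-spectrum-weighted average of the Monte Carlo factors computed for mono-energetic photons.

    ```python
    import numpy as np

    def quality_correction_factor(kerma_spectrum, k_mono):
        """Weight mono-energetic correction factors with a measured air kerma spectrum.

        kerma_spectrum : air kerma per energy bin (any consistent unit)
        k_mono         : Monte Carlo correction factor per bin (e.g. the product of
                         electron-loss, scatter, and diaphragm/front-wall transmission
                         corrections for that photon energy)
        Returns the spectrum-averaged correction factor for the x-ray quality.
        """
        w = np.asarray(kerma_spectrum, float)
        k = np.asarray(k_mono, float)
        return float(np.sum(w * k) / np.sum(w))
    ```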

  13. Alignment-free genome tree inference by learning group-specific distance metrics.

    PubMed

    Patil, Kaustubh R; McHardy, Alice C

    2013-01-01

    Understanding the evolutionary relationships between organisms is vital for their in-depth study. Gene-based methods are often used to infer such relationships, which are not without drawbacks. One can now attempt to use genome-scale information, because of the ever increasing number of genomes available. This opportunity also presents a challenge in terms of computational efficiency. Two fundamentally different methods are often employed for sequence comparisons, namely alignment-based and alignment-free methods. Alignment-free methods rely on the genome signature concept and provide a computationally efficient way that is also applicable to nonhomologous sequences. The genome signature contains evolutionary signal as it is more similar for closely related organisms than for distantly related ones. We used genome-scale sequence information to infer taxonomic distances between organisms without additional information such as gene annotations. We propose a method to improve genome tree inference by learning specific distance metrics over the genome signature for groups of organisms with similar phylogenetic, genomic, or ecological properties. Specifically, our method learns a Mahalanobis metric for a set of genomes and a reference taxonomy to guide the learning process. By applying this method to more than a thousand prokaryotic genomes, we showed that, indeed, better distance metrics could be learned for most of the 18 groups of organisms tested here. Once a group-specific metric is available, it can be used to estimate the taxonomic distances for other sequenced organisms from the group. This study also presents a large scale comparison between 10 methods--9 alignment-free and 1 alignment-based.
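
    A hedged sketch of the two building blocks named above, with the metric-learning step omitted: a tetranucleotide genome signature and a Mahalanobis distance under a given (here assumed already learned) matrix. The signature choice and names are simplifications, not the published procedure.

    ```python
    import itertools
    import numpy as np

    KMERS = ["".join(p) for p in itertools.product("ACGT", repeat=4)]
    KMER_INDEX = {k: i for i, k in enumerate(KMERS)}

    def genome_signature(sequence: str) -> np.ndarray:
        """Normalized tetranucleotide frequency vector (one simple genome signature)."""
        counts = np.zeros(len(KMERS))
        seq = sequence.upper()
        for i in range(len(seq) - 3):
            idx = KMER_INDEX.get(seq[i:i + 4])
            if idx is not None:          # skip windows containing ambiguous bases
                counts[idx] += 1
        total = counts.sum()
        return counts / total if total else counts

    def mahalanobis_distance(x: np.ndarray, y: np.ndarray, m: np.ndarray) -> float:
        """Distance between two signatures under a positive semi-definite matrix m."""
        d = x - y
        return float(np.sqrt(d @ m @ d))

    # With m = np.eye(len(KMERS)) this reduces to the ordinary Euclidean distance;
    # the paper's contribution is learning a group-specific m from a reference taxonomy.
    ```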

  14. ROKU: a novel method for identification of tissue-specific genes.

    PubMed

    Kadota, Koji; Ye, Jiazhen; Nakai, Yuji; Terada, Tohru; Shimizu, Kentaro

    2006-06-12

    One of the important goals of microarray research is the identification of genes whose expression is considerably higher or lower in some tissues than in others, and ways of identifying such tissue-specific genes are needed. We describe a method, ROKU, which selects tissue-specific patterns from gene expression data for many tissues and thousands of genes. ROKU ranks genes according to their overall tissue specificity using Shannon entropy and detects the tissues specific to each gene, if any exist, using an outlier detection method. We evaluated the capacity for detecting various specific expression patterns using synthetic and real data, and observed that ROKU was superior to a conventional entropy-based method in its ability to rank genes according to overall tissue specificity and to detect genes whose expression patterns are specific only to the tissues of interest. ROKU is useful for the detection of various tissue-specific expression patterns. The framework is also directly applicable to the selection of diagnostic markers for molecular classification with multiple classes.
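
    ROKU's exact processing (a data adjustment before the entropy step plus a dedicated outlier test) is not reproduced in this record. The sketch below shows only the conventional Shannon-entropy specificity measure it improves upon, with a crude z-score flag standing in for the outlier-detection step; both simplifications are this editor's.

    ```python
    import numpy as np

    def tissue_specificity_entropy(expression):
        """Shannon entropy of an expression vector across tissues.

        Low entropy = expression concentrated in few tissues (tissue-specific);
        entropy near log2(n_tissues) = broadly expressed.
        """
        x = np.clip(np.asarray(expression, float), 0, None)
        p = x / x.sum()
        p = p[p > 0]
        return float(-(p * np.log2(p)).sum())

    def flag_specific_tissues(expression, z_cut=2.0):
        """Crude stand-in for an outlier test: tissues whose expression deviates
        from the across-tissue mean by more than z_cut standard deviations."""
        x = np.asarray(expression, float)
        z = (x - x.mean()) / (x.std() or 1.0)
        return np.where(np.abs(z) > z_cut)[0]

    # Example: tissue_specificity_entropy([0, 0, 0, 8, 0]) == 0.0 (fully specific).
    ```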

  15. Fuzzy pulmonary vessel segmentation in contrast enhanced CT data

    NASA Astrophysics Data System (ADS)

    Kaftan, Jens N.; Kiraly, Atilla P.; Bakai, Annemarie; Das, Marco; Novak, Carol L.; Aach, Til

    2008-03-01

    Pulmonary vascular tree segmentation has numerous applications in medical imaging and computer-aided diagnosis (CAD), including detection and visualization of pulmonary emboli (PE), improved lung nodule detection, and quantitative vessel analysis. We present a novel approach to pulmonary vessel segmentation based on a fuzzy segmentation concept, combining the strengths of both threshold and seed point based methods. The lungs of the original image are first segmented and a threshold-based approach identifies core vessel components with a high specificity. These components are then used to automatically identify reliable seed points for a fuzzy seed point based segmentation method, namely fuzzy connectedness. The output of the method consists of the probability of each voxel belonging to the vascular tree. Hence, our method provides the possibility to adjust the sensitivity/specificity of the segmentation result a posteriori according to application-specific requirements, through definition of a minimum vessel-probability required to classify a voxel as belonging to the vascular tree. The method has been evaluated on contrast-enhanced thoracic CT scans from clinical PE cases and demonstrates overall promising results. For quantitative validation we compare the segmentation results to randomly selected, semi-automatically segmented sub-volumes and present the resulting receiver operating characteristic (ROC) curves. Although we focus on contrast enhanced chest CT data, the method can be generalized to other regions of the body as well as to different imaging modalities.

  16. Multi-approach assessment of the spatial distribution of the specific yield: application to the Crau plain aquifer, France

    NASA Astrophysics Data System (ADS)

    Seraphin, Pierre; Gonçalvès, Julio; Vallet-Coulomb, Christine; Champollion, Cédric

    2018-06-01

    Spatially distributed values of the specific yield, a fundamental parameter for transient groundwater mass balance calculations, were obtained by means of three independent methods for the Crau plain, France. In contrast to its traditional use to assess recharge based on a given specific yield, the water-table fluctuation (WTF) method, applied using major recharging events, gave a first set of reference values. Then, large infiltration processes recorded by monitored boreholes and caused by major precipitation events were interpreted in terms of specific yield by means of a one-dimensional vertical numerical model solving Richards' equations within the unsaturated zone. Finally, two gravity field campaigns, at low and high piezometric levels, were carried out to assess the groundwater mass variation and thus alternative specific yield values. The range obtained by the WTF method for this aquifer made of alluvial detrital material was 2.9-26%, in line with the scarce data available so far. The average spatial value of specific yield by the WTF method (9.1%) is consistent with the aquifer scale value from the hydro-gravimetric approach. In this investigation, an estimate of the hitherto unknown spatial distribution of the specific yield over the Crau plain was obtained using the most reliable method (the WTF method). A groundwater mass balance calculation over the domain using this distribution yielded similar results to an independent quantification based on a stable isotope-mixing model. This agreement reinforces the relevance of such estimates, which can be used to build a more accurate transient hydrogeological model.
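
    A minimal sketch of the water-table fluctuation arithmetic referred to above, inverted to estimate the specific yield from recharge events; the event recharge and the recession-corrected head rise are assumed inputs here, whereas the study derives them from its own data.

    ```python
    def specific_yield_wtf(event_recharge_mm, head_rise_mm):
        """Specific yield from one major recharge event via the WTF relation
        R = Sy * dh, rearranged as Sy = R / dh.

        event_recharge_mm : recharge reaching the water table during the event (mm)
        head_rise_mm      : water-table rise attributable to the event, ideally the
                            peak rise minus the extrapolated antecedent recession (mm)
        """
        return event_recharge_mm / head_rise_mm

    # Example: 45 mm of event recharge producing a 500 mm rise gives Sy = 0.09,
    # i.e. 9%, the order of magnitude reported for the Crau plain.
    ```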

  17. Multi-approach assessment of the spatial distribution of the specific yield: application to the Crau plain aquifer, France

    NASA Astrophysics Data System (ADS)

    Seraphin, Pierre; Gonçalvès, Julio; Vallet-Coulomb, Christine; Champollion, Cédric

    2018-03-01

    Spatially distributed values of the specific yield, a fundamental parameter for transient groundwater mass balance calculations, were obtained by means of three independent methods for the Crau plain, France. In contrast to its traditional use to assess recharge based on a given specific yield, the water-table fluctuation (WTF) method, applied using major recharging events, gave a first set of reference values. Then, large infiltration processes recorded by monitored boreholes and caused by major precipitation events were interpreted in terms of specific yield by means of a one-dimensional vertical numerical model solving Richards' equations within the unsaturated zone. Finally, two gravity field campaigns, at low and high piezometric levels, were carried out to assess the groundwater mass variation and thus alternative specific yield values. The range obtained by the WTF method for this aquifer made of alluvial detrital material was 2.9-26%, in line with the scarce data available so far. The average spatial value of specific yield by the WTF method (9.1%) is consistent with the aquifer scale value from the hydro-gravimetric approach. In this investigation, an estimate of the hitherto unknown spatial distribution of the specific yield over the Crau plain was obtained using the most reliable method (the WTF method). A groundwater mass balance calculation over the domain using this distribution yielded similar results to an independent quantification based on a stable isotope-mixing model. This agreement reinforces the relevance of such estimates, which can be used to build a more accurate transient hydrogeological model.

  18. Comparison of PCR-based methods for the simultaneous detection of Neisseria meningitidis, Haemophilus influenzae, and Streptococcus pneumoniae in clinical samples.

    PubMed

    de Filippis, Ivano; de Andrade, Claudia Ferreira; Caldeira, Nathalia; de Azevedo, Aline Carvalho; de Almeida, Antonio Eugenio

    2016-01-01

    Several in-house PCR-based assays have been described for the detection of bacterial meningitis caused by Neisseria meningitidis, Streptococcus pneumoniae, and Haemophilus influenzae from clinical samples. PCR-based methods targeting different bacterial genes are frequently used by different laboratories worldwide, but no standard method has ever been established. The aim of our study was to compare different in-house PCR-based tests and a commercial test for the detection of bacterial pathogens causing meningitis and invasive disease in humans. A total of 110 isolates and 134 clinical samples (99 cerebrospinal fluid and 35 blood samples) collected from suspected cases of invasive disease were analyzed. Specific sets of primers frequently used for PCR diagnosis of the three pathogens were used and compared with the results achieved using the multiplex approach described here. Several different gene targets were used for each microorganism, namely ctrA, crgA and nspA for N. meningitidis, ply for S. pneumoniae, and P6 and bexA for H. influenzae. All of the methods used were fast, specific and sensitive, and some of the targets used for the in-house PCR assay detected lower concentrations of genomic DNA than the commercial method. An additional PCR reaction is described for the differentiation of capsulated and non-capsulated H. influenzae strains, while the commercial method only detects capsulated strains. The in-house PCR methods compared here were shown to be rapid, sensitive, highly specific, and cheaper than commercial methods, and could be easily adopted by public laboratories of developing countries for diagnostic purposes. The best results were achieved using primers targeting the genes nspA, ply, and P6, which detected the lowest DNA concentrations for each specific target. Copyright © 2016 Elsevier Editora Ltda. All rights reserved.

  19. Intensity-Based Registration for Lung Motion Estimation

    NASA Astrophysics Data System (ADS)

    Cao, Kunlin; Ding, Kai; Amelon, Ryan E.; Du, Kaifang; Reinhardt, Joseph M.; Raghavan, Madhavan L.; Christensen, Gary E.

    Image registration plays an important role within pulmonary image analysis. The task of registration is to find the spatial mapping that brings two images into alignment. Registration algorithms designed for matching 4D lung scans or two 3D scans acquired at different inflation levels can catch the temporal changes in position and shape of the region of interest. Accurate registration is critical to post-analysis of lung mechanics and motion estimation. In this chapter, we discuss lung-specific adaptations of intensity-based registration methods for 3D/4D lung images and review approaches for assessing registration accuracy. Then we introduce methods for estimating tissue motion and studying lung mechanics. Finally, we discuss methods for assessing and quantifying specific volume change, specific ventilation, strain/stretch information and lobar sliding.

  20. Method Engineering: A Service-Oriented Approach

    NASA Astrophysics Data System (ADS)

    Cauvet, Corine

    In the past, a large variety of methods have been published ranging from very generic frameworks to methods for specific information systems. Method Engineering has emerged as a research discipline for designing, constructing and adapting methods for Information Systems development. Several approaches have been proposed as paradigms in method engineering. The meta modeling approach provides means for building methods by instantiation, the component-based approach aims at supporting the development of methods by using modularization constructs such as method fragments, method chunks and method components. This chapter presents an approach (SO2M) for method engineering based on the service paradigm. We consider services as autonomous computational entities that are self-describing, self-configuring and self-adapting. They can be described, published, discovered and dynamically composed for processing a consumer's demand (a developer's requirement). The method service concept is proposed to capture a development process fragment for achieving a goal. Goal orientation in service specification and the principle of service dynamic composition support method construction and method adaptation to different development contexts.

  1. Evaluation of Two PCR-based Swine-specific Fecal Source Tracking Assays (Abstract)

    EPA Science Inventory

    Several PCR-based methods have been proposed to identify swine fecal pollution in environmental waters. However, the utility of these assays in identifying swine fecal contamination on a broad geographic scale is largely unknown. In this study, we evaluated the specificity, distr...

  2. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xu, Q; Han, H; Xing, L

    Purpose: Dictionary learning based methods have attracted more and more attention in low-dose CT due to their superior performance in suppressing noise and preserving structural details. Considering that the structures and noise vary from region to region within one imaging object, we propose a region-specific dictionary learning method to improve low-dose CT reconstruction. Methods: A set of normal-dose images was used for dictionary learning. Segmentations were performed on these images so that the training patch sets corresponding to different regions could be extracted. After that, region-specific dictionaries were learned from these training sets. For the low-dose CT reconstruction, a conventional reconstruction, such as filtered back-projection (FBP), was performed first, and segmentation then followed to divide the image into different regions. Sparsity constraints of each region based on its dictionary were used as regularization terms. The regularization parameters were selected adaptively according to the different regions. A low-dose human thorax dataset was used to evaluate the proposed method, with a single-dictionary based method performed for comparison. Results: Since the lung region is very different from the rest of the thorax, two dictionaries, corresponding to the lung region and the rest of the thorax respectively, were learned to better express the structural details and avoid artifacts. With only one dictionary, some artifacts appeared in the body region, caused by the spot atoms corresponding to structures in the lung region, and some structures in the lung region could not be recovered well. The quantitative indices of the result obtained by the proposed method were also slightly improved compared to the single-dictionary based method. Conclusion: A region-specific dictionary makes the dictionary more adaptive to different region characteristics, which is desirable for enhancing the performance of dictionary learning based methods.
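
    As an illustration of the region-specific dictionary idea, the hedged sketch below learns one dictionary per segmented region and sparse-codes each region's patches against its own dictionary. It uses scikit-learn's MiniBatchDictionaryLearning on random stand-in patches; the patch size, number of atoms, and sparsity level are assumptions, and the reconstruction and regularization steps of the abstract are not reproduced.

```python
# Hedged sketch: one dictionary per region (e.g. lung vs. rest of thorax),
# learned from patches of normal-dose training images, then used for sparse
# coding of patches from the matching region. Patches here are random.
import numpy as np
from sklearn.decomposition import MiniBatchDictionaryLearning

rng = np.random.default_rng(0)
patches_lung = rng.normal(size=(500, 64))   # stand-in 8x8 patches, lung region
patches_body = rng.normal(size=(500, 64))   # stand-in 8x8 patches, rest of thorax

dicts = {}
for name, patches in [("lung", patches_lung), ("body", patches_body)]:
    learner = MiniBatchDictionaryLearning(
        n_components=100,             # atoms in the region-specific dictionary
        transform_algorithm="omp",    # sparse coding by orthogonal matching pursuit
        transform_n_nonzero_coefs=5,  # sparsity level per patch
        random_state=0,
    )
    dicts[name] = learner.fit(patches)

# At reconstruction time, patches of a segmented low-dose image would be coded
# with the dictionary of the region they fall in:
codes_lung = dicts["lung"].transform(patches_lung[:10])
print(codes_lung.shape)  # (10, 100) sparse coefficient vectors
```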

  3. Valid analytical performance specifications for combined analytical bias and imprecision for the use of common reference intervals.

    PubMed

    Hyltoft Petersen, Per; Lund, Flemming; Fraser, Callum G; Sandberg, Sverre; Sölétormos, György

    2018-01-01

    Background: Many clinical decisions are based on comparison of patient results with reference intervals. Therefore, an estimation of the analytical performance specifications for the quality that would be required to allow sharing common reference intervals is needed. The International Federation of Clinical Chemistry (IFCC) recommended a minimum of 120 reference individuals to establish reference intervals. This number implies a certain level of quality, which could then be used for defining analytical performance specifications as the maximum combination of analytical bias and imprecision required for sharing common reference intervals, the aim of this investigation. Methods: Two methods were investigated for defining the maximum combination of analytical bias and imprecision that would give the same quality of common reference intervals as the IFCC recommendation. Method 1 is based on a formula for the combination of analytical bias and imprecision, and Method 2 is based on the Microsoft Excel formula NORMINV, including the fractional probability of reference individuals outside each limit and the Gaussian variables of mean and standard deviation. The combinations of normalized bias and imprecision are illustrated for both methods. The formulae are identical for Gaussian and log-Gaussian distributions. Results: Method 2 gives the correct results with a constant percentage of 4.4% for all combinations of bias and imprecision. Conclusion: The Microsoft Excel formula NORMINV is useful for the estimation of analytical performance specifications for both Gaussian and log-Gaussian distributions of reference intervals.
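
    One way to read Method 2 is as computing, for a given normalized bias and imprecision, the fraction of the reference population that falls outside the original 2.5% and 97.5% reference limits. The sketch below is a hedged Python analogue of the Excel NORM functions (using scipy.stats.norm); the normalization and the exact formulation in the paper may differ.

```python
# Hedged analogue of the NORMINV-based calculation: bias and analytical SD are
# expressed in units of the reference-population SD; the function returns the
# fraction of reference individuals outside the original reference limits.
from scipy.stats import norm

def fraction_outside(bias, analytical_sd):
    lower, upper = norm.ppf(0.025), norm.ppf(0.975)   # original reference limits
    total_sd = (1.0 + analytical_sd**2) ** 0.5        # biological + analytical spread
    below = norm.cdf((lower - bias) / total_sd)
    above = 1.0 - norm.cdf((upper - bias) / total_sd)
    return below + above

print(f"{fraction_outside(0.0, 0.0):.1%}")   # no added error -> nominal 5.0%
print(f"{fraction_outside(0.25, 0.5):.1%}")  # bias and imprecision push more outside
```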

  4. Methods for selective functionalization and separation of carbon nanotubes

    NASA Technical Reports Server (NTRS)

    Strano, Michael S. (Inventor); Usrey, Monica (Inventor); Barone, Paul (Inventor); Dyke, Christopher A. (Inventor); Tour, James M. (Inventor); Kittrell, W. Carter (Inventor); Hauge, Robert H (Inventor); Smalley, Richard E. (Inventor); Marek, legal representative, Irene Marie (Inventor)

    2011-01-01

    The present invention is directed toward methods of selectively functionalizing carbon nanotubes of a specific type or range of types, based on their electronic properties, using diazonium chemistry. The present invention is also directed toward methods of separating carbon nanotubes into populations of specific types or range(s) of types via selective functionalization and electrophoresis, and also to the novel compositions generated by such separations.

  5. A PCR-Based Method for RNA Probes and Applications in Neuroscience.

    PubMed

    Hua, Ruifang; Yu, Shanshan; Liu, Mugen; Li, Haohong

    2018-01-01

    In situ hybridization (ISH) is a powerful technique that is used to detect the localization of specific nucleic acid sequences for understanding the organization, regulation, and function of genes. However, in most cases, RNA probes are obtained by in vitro transcription from plasmids containing specific promoter elements and mRNA-specific cDNA. Probes originating from plasmid vectors are time-consuming and not suitable for the rapid gene mapping. Here, we introduce a simplified method to prepare digoxigenin (DIG)-labeled non-radioactive RNA probes based on polymerase chain reaction (PCR) amplification and applications in free-floating mouse brain sections. Employing a transgenic reporter line, we investigate the expression of the somatostatin (SST) mRNA in the adult mouse brain. The method can be applied to identify the colocalization of SST mRNA and proteins including corticotrophin-releasing hormone (CRH) and protein kinase C delta type (PKC-δ) using double immunofluorescence, which is useful for understanding the organization of complex brain nuclei. Moreover, the method can also be incorporated with retrograde tracing to visualize the functional connection in the neural circuitry. Briefly, the PCR-based method for non-radioactive RNA probes is a useful tool that can be substantially utilized in neuroscience studies.

  6. Genetic potential of black bean genotypes with predictable behaviors in multienvironment trials.

    PubMed

    Torga, P P; Melo, P G S; Pereira, H S; Faria, L C; Melo, L C

    2016-10-24

    The aim of this study was to evaluate the phenotypic stability and specific and broad adaptability of common black bean genotypes for the Central and Center-South regions of Brazil by using the Annicchiarico and AMMI (weighted average of absolute scores: WAAS, and weighted average of absolute scores and productivity: WAASP) methodologies. We carried out 69 trials, with 43 and 26 trials in the Central and Center-South regions, respectively. Thirteen genotypes were evaluated in a randomized block design with three replications, during the rainy, dry, and winter seasons in 2 years. To obtain estimates of specific adaptation, we analyzed the parameters for each method obtained in the two geographic regions separately. To estimate broad adaptation, we used the average of the parameters obtained from each region. The lines identified with high specific adaptation in each region were not the same based on the Annicchiarico and AMMI (WAAS) methodologies. It was not possible to identify the same genotypes with specific or broad stability by using these methods. By contrast, the Annicchiarico and AMMI (WAASP) methods presented very similar estimates of broad and specific adaptation. Based on these methods, the lines with more specific adaptation were CNFP 8000 and CNFP 7994, in the Central and Center-South regions, respectively, of which the CNFP 8000 line was more widely adapted.

  7. Enabling Requirements-Based Programming for Highly-Dependable Complex Parallel and Distributed Systems

    NASA Technical Reports Server (NTRS)

    Hinchey, Michael G.; Rash, James L.; Rouff, Christopher A.

    2005-01-01

    The manual application of formal methods in system specification has produced successes, but in the end, despite any claims and assertions by practitioners, there is no provable relationship between a manually derived system specification or formal model and the customer's original requirements. Complex parallel and distributed systems present the worst-case implications of today's dearth of viable approaches for achieving system dependability. No avenue other than formal methods constitutes a serious contender for resolving the problem, and so recognition of requirements-based programming has come at a critical juncture. We describe a new, NASA-developed automated requirements-based programming method that can be applied to certain classes of systems, including complex parallel and distributed systems, to achieve a high degree of dependability.

  8. A Model Based Security Testing Method for Protocol Implementation

    PubMed Central

    Fu, Yu Long; Xin, Xiao Long

    2014-01-01

    The security of protocol implementation is important and hard to be verified. Since the penetration testing is usually based on the experience of the security tester and the specific protocol specifications, a formal and automatic verification method is always required. In this paper, we propose an extended model of IOLTS to describe the legal roles and intruders of security protocol implementations, and then combine them together to generate the suitable test cases to verify the security of protocol implementation. PMID:25105163

  9. A model based security testing method for protocol implementation.

    PubMed

    Fu, Yu Long; Xin, Xiao Long

    2014-01-01

    The security of protocol implementation is important and hard to be verified. Since the penetration testing is usually based on the experience of the security tester and the specific protocol specifications, a formal and automatic verification method is always required. In this paper, we propose an extended model of IOLTS to describe the legal roles and intruders of security protocol implementations, and then combine them together to generate the suitable test cases to verify the security of protocol implementation.

  10. Implications to Postsecondary Faculty of Alternative Calculation Methods of Gender-Based Wage Differentials.

    ERIC Educational Resources Information Center

    Hagedorn, Linda Serra

    1998-01-01

    A study explored two distinct methods of calculating a precise measure of gender-based wage differentials among college faculty. The first estimation considered wage differences using a formula based on human capital; the second included compensation for past discriminatory practices. Both measures were used to predict three specific aspects of…

  11. Address tracing for parallel machines

    NASA Technical Reports Server (NTRS)

    Stunkel, Craig B.; Janssens, Bob; Fuchs, W. Kent

    1991-01-01

    Recently implemented parallel system address-tracing methods based on several metrics are surveyed. The issues specific to collection of traces for both shared and distributed memory parallel computers are highlighted. Five general categories of address-trace collection methods are examined: hardware-captured, interrupt-based, simulation-based, altered microcode-based, and instrumented program-based traces. The problems unique to shared memory and distributed memory multiprocessors are examined separately.

  12. A universal TaqMan-based RT-PCR protocol for cost-efficient detection of small noncoding RNA.

    PubMed

    Jung, Ulrike; Jiang, Xiaoou; Kaufmann, Stefan H E; Patzel, Volker

    2013-12-01

    Several methods for the detection of RNA have been developed over time. For small RNA detection, a stem-loop reverse primer-based protocol relying on TaqMan RT-PCR has been described. This protocol requires an individual, target-specific TaqMan probe for each target RNA and, hence, is highly cost-intensive for experiments with small sample sizes or large numbers of different samples. We describe a universal TaqMan-based probe protocol which can be used to detect any target sequence, and demonstrate its applicability for the detection of endogenous as well as artificial eukaryotic and bacterial small RNAs. While the specific and the universal probe-based protocols showed the same sensitivity, the absolute sensitivity of detection was found to be more than 100-fold lower for both than previously reported. In subsequent experiments, we found previously unknown limitations intrinsic to the method that affect its feasibility for determining incorporation of the mature template into RISC as well as for multiplexing. Both protocols were equally specific in discriminating between correct and incorrect small RNA targets or between mature miRNA and its unprocessed RNA precursor, indicating that the stem-loop RT primer, rather than the TaqMan probe, confers target specificity. The presented universal TaqMan-based RT-PCR protocol represents a cost-efficient method for the detection of small RNAs.

  13. Microsiemens or Milligrams: Measures of Ionic Mixtures

    EPA Science Inventory

    In December of 2016, EPA released the Draft Field-Based Methods for Developing Aquatic Life Criteria for Specific Conductivity for public comment. Once final, states and authorized tribes may use these methods to derive field-based ecoregional ambient Aquatic Life Ambient Water Q...

  14. Detection of VR-2332 strain of porcine reproductive and respiratory syndrome virus type II using an aptamer-based sandwich-type assay.

    PubMed

    Lee, Su Jin; Kwon, Young Seop; Lee, Ji-eun; Choi, Eun-Jin; Lee, Chang-Hee; Song, Jae-Young; Gu, Man Bock

    2013-01-02

    Porcine reproductive and respiratory syndrome virus (PRRSV) causes porcine reproductive and respiratory syndrome disease (PRRS), a disease that has a significant economic impact on the swine industry. In this study, single-stranded DNA (ssDNA) aptamers with high specificity and affinity against the VR-2332 strain of PRRSV type II were successfully obtained. Of 19 candidates, the LB32 aptamer was found to be the most specific and sensitive to the VR-2332 strain according to an aptamer-based surface plasmon resonance (SPR) analysis. The detection of the VR-2332 strain of PRRSV type II was successfully accomplished using the enzyme-linked antibody-aptamer sandwich (ELAAS) method. The detection limit of ELAAS was 4.8 × 10⁰ TCID₅₀/mL, which is comparable to some previous reports of PCR-based detection, yet without requiring any complicated equipment or extra costs. Moreover, this ELAAS-based PRRSV detection showed similar sensitivity for VR-2332 samples spiked in diluted swine serum and in buffer. Therefore, this VR-2332 strain-specific aptamer and its highly specific assay method can be used as an alternative method for the fast and precise detection of PRRSV.

  15. Adaptive smoothing based on Gaussian processes regression increases the sensitivity and specificity of fMRI data.

    PubMed

    Strappini, Francesca; Gilboa, Elad; Pitzalis, Sabrina; Kay, Kendrick; McAvoy, Mark; Nehorai, Arye; Snyder, Abraham Z

    2017-03-01

    Temporal and spatial filtering of fMRI data is often used to improve statistical power. However, conventional methods, such as smoothing with fixed-width Gaussian filters, remove fine-scale structure in the data, necessitating a tradeoff between sensitivity and specificity. Specifically, smoothing may increase sensitivity (reduce noise and increase statistical power) but at the cost of specificity, in that fine-scale structure in neural activity patterns is lost. Here, we propose an alternative smoothing method based on Gaussian process (GP) regression for single-subject fMRI experiments. This method adapts the level of smoothing on a voxel-by-voxel basis according to the characteristics of the local neural activity patterns. GP-based fMRI analysis has heretofore been impractical owing to computational demands. Here, we demonstrate a new implementation of GP that makes it possible to handle the massive data dimensionality of the typical fMRI experiment. We demonstrate how GP can be used as a drop-in replacement for conventional preprocessing steps for temporal and spatial smoothing in a standard fMRI pipeline. We present simulated and experimental results that show the increased sensitivity and specificity compared to conventional smoothing strategies. Hum Brain Mapp 38:1438-1459, 2017. © 2016 Wiley Periodicals, Inc.
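
    The sketch below illustrates the core idea on a single toy voxel time series: a GP with an RBF plus white-noise kernel is fitted per voxel, so the learned hyperparameters (and hence the effective smoothing) adapt to the local signal. It uses scikit-learn's GaussianProcessRegressor and is not the authors' large-scale implementation.

```python
# Hedged sketch: GP regression as an adaptive temporal smoother for one voxel.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(1)
t = np.arange(200, dtype=float)[:, None]                 # scan index (TR)
signal = np.sin(t[:, 0] / 15.0)                          # toy "neural" signal
y = signal + rng.normal(scale=0.5, size=t.shape[0])      # noisy voxel time course

kernel = 1.0 * RBF(length_scale=10.0) + WhiteKernel(noise_level=0.25)
gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(t, y)

smoothed = gp.predict(t)     # denoised time course for this voxel
print(gp.kernel_)            # fitted, voxel-specific hyperparameters
```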

  16. Application of image recognition-based automatic hyphae detection in fungal keratitis.

    PubMed

    Wu, Xuelian; Tao, Yuan; Qiu, Qingchen; Wu, Xinyi

    2018-03-01

    The purpose of this study was to evaluate the accuracy of two methods for diagnosing fungal keratitis: automatic hyphae detection based on image recognition and corneal smear examination. We evaluated the sensitivity and specificity of the image recognition-based automatic hyphae detection method, analyzed the consistency between clinical symptoms and hyphae density, and quantified hyphae density using this method. Our study included 56 cases of fungal keratitis (single eye only) and 23 cases of bacterial keratitis. All cases underwent routine slit-lamp biomicroscopy, corneal smear examination, microorganism culture, and assessment of in vivo confocal microscopy images before starting medical treatment. The hyphae in the in vivo confocal microscopy images were then identified using automatic hyphae detection based on image recognition, whose sensitivity and specificity were evaluated and compared with those of corneal smear examination. A density index was then used to assess the severity of infection, and its correlation and consistency with the patients' clinical symptoms were evaluated. The accuracy of this technology was superior to corneal smear examination (p < 0.05). The sensitivity of automatic hyphae detection based on image recognition was 89.29%, and the specificity was 95.65%. The area under the ROC curve was 0.946. The correlation coefficient between the severity grading of fungal keratitis by automatic hyphae detection based on image recognition and the clinical grading was 0.87. Automatic hyphae detection based on image recognition identified fungal keratitis with high sensitivity and specificity, outperforming corneal smear examination. Compared with conventional manual identification of confocal microscopy corneal images, it has the advantages of being accurate, stable, and independent of human expertise, and it is most useful to clinicians who are not familiar with fungal keratitis. The technology can also quantify hyphae density and grade its severity. Being noninvasive, it can provide an evaluation criterion for fungal keratitis in a timely, accurate, objective, and quantitative manner.
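
    The reported figures (sensitivity, specificity, area under the ROC curve) can be reproduced from per-case decisions and scores as in the hedged sketch below; the labels and scores are synthetic stand-ins, not the study's data.

```python
# Hedged sketch: computing sensitivity, specificity and ROC AUC from per-case
# predictions of an automated detector (synthetic data).
import numpy as np
from sklearn.metrics import confusion_matrix, roc_auc_score

rng = np.random.default_rng(2)
y_true = np.array([1] * 56 + [0] * 23)        # 56 fungal, 23 bacterial cases
scores = np.clip(0.7 * y_true + rng.normal(0.2, 0.25, size=y_true.size), 0, 1)
y_pred = (scores >= 0.5).astype(int)          # detector's binary decision

tn, fp, fn, tp = confusion_matrix(y_true, y_pred).ravel()
sensitivity = tp / (tp + fn)
specificity = tn / (tn + fp)
print(f"sensitivity={sensitivity:.2%}, specificity={specificity:.2%}, "
      f"AUC={roc_auc_score(y_true, scores):.3f}")
```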

  17. Probability-based estimates of site-specific copper water quality criteria for the Chesapeake Bay, USA.

    PubMed

    Arnold, W Ray; Warren-Hicks, William J

    2007-01-01

    The object of this study was to estimate site- and region-specific dissolved copper criteria for a large embayment, the Chesapeake Bay, USA. The intent is to show the utility of 2 copper saltwater quality site-specific criteria estimation models and associated region-specific criteria selection methods. The criteria estimation models and selection methods are simple, efficient, and cost-effective tools for resource managers. The methods are proposed as potential substitutes for the US Environmental Protection Agency's water effect ratio methods. Dissolved organic carbon data and the copper criteria models were used to produce probability-based estimates of site-specific copper saltwater quality criteria. Site- and date-specific criteria estimations were made for 88 sites (n = 5,296) in the Chesapeake Bay. The average and range of estimated site-specific chronic dissolved copper criteria for the Chesapeake Bay were 7.5 and 5.3 to 16.9 microg Cu/L. The average and range of estimated site-specific acute dissolved copper criteria for the Chesapeake Bay were 11.7 and 8.3 to 26.4 microg Cu/L. The results suggest that applicable national and state copper criteria can increase in much of the Chesapeake Bay and remain protective. Virginia Department of Environmental Quality copper criteria near the mouth of the Chesapeake Bay, however, need to decrease to protect species of equal or greater sensitivity to that of the marine mussel, Mytilus sp.

  18. SU-E-J-122: The CBCT Dose Calculation Using a Patient Specific CBCT Number to Mass Density Conversion Curve Based On a Novel Image Registration and Organ Mapping Method in Head-And-Neck Radiation Therapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhou, J; Lasio, G; Chen, S

    2015-06-15

    Purpose: To develop a CBCT HU correction method using a patient-specific HU to mass density conversion curve, based on a novel image registration and organ mapping method, for head-and-neck radiation therapy. Methods: There are three steps to generate a patient-specific CBCT HU to mass density conversion curve. First, we developed a novel robust image registration method based on sparseness analysis to register the planning CT (PCT) and the CBCT. Second, a novel organ mapping method was developed to transfer the organs at risk (OAR) contours from the PCT to the CBCT, and the corresponding mean HU values of each OAR were measured in both the PCT and CBCT volumes. Third, a set of PCT and CBCT HU to mass density conversion curves were created based on the mean HU values of the OARs and the corresponding mass densities of the OARs in the PCT. We then compared our proposed conversion curve with the traditional Catphan phantom-based CBCT HU to mass density calibration curve. Both curves were input into the treatment planning system (TPS) for dose calculation. Last, the PTV and OAR doses, DVHs, and dose distributions of the CBCT plans were compared to the original treatment plan. Results: One head-and-neck case, which contained a pair of PCT and CBCT scans, was used. The dose differences between the PCT and CBCT plans using the proposed method are −1.33% for the mean PTV, 0.06% for PTV D95%, and −0.56% for the left neck. The dose differences between the plans of the PCT and the CBCT corrected using the Catphan-based method are −4.39% for mean PTV, 4.07% for PTV D95%, and −2.01% for the left neck. Conclusion: The proposed CBCT HU correction method achieves better agreement with the original treatment plan compared to the traditional Catphan-based calibration method.
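
    A hedged sketch of the conversion-curve step: the mean CBCT HU of each mapped organ is paired with the mass density of the same organ in the planning CT, and the resulting points define a piecewise-linear HU-to-density lookup. The organ values below are placeholders, not values from the study.

```python
# Hedged sketch: patient-specific CBCT HU -> mass density curve built from the
# mean HU of mapped organs (placeholder values), applied with linear interpolation.
import numpy as np

organ_mean_hu = np.array([-780.0, -60.0, 40.0, 250.0, 900.0])   # e.g. lung, fat, muscle, cartilage, bone
organ_density = np.array([0.26,   0.92,  1.05, 1.10,  1.60])    # g/cm^3 from the planning CT

def hu_to_density(hu):
    """Piecewise-linear CBCT HU -> mass density lookup for dose calculation."""
    return np.interp(hu, organ_mean_hu, organ_density)

cbct_hu = np.array([[-800.0, -50.0], [100.0, 700.0]])   # toy CBCT voxel values
print(hu_to_density(cbct_hu))
```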

  19. Method for the determination of natural ester-type gum bases used as food additives via direct analysis of their constituent wax esters using high-temperature GC/MS.

    PubMed

    Tada, Atsuko; Ishizuki, Kyoko; Yamazaki, Takeshi; Sugimoto, Naoki; Akiyama, Hiroshi

    2014-07-01

    Natural ester-type gum bases, which are used worldwide as food additives, mainly consist of wax esters composed of long-chain fatty acids and long-chain fatty alcohols. There are many varieties of ester-type gum bases, and thus a useful method for their discrimination is needed in order to establish official specifications and manage their quality control. Herein is reported a rapid and simple method for the analysis of different ester-type gum bases used as food additives by high-temperature gas chromatography/mass spectrometry (GC/MS). With this method, the constituent wax esters in ester-type gum bases can be detected without hydrolysis and derivatization. The method was applied to the determination of 10 types of gum bases, including beeswax, carnauba wax, lanolin, and jojoba wax, and it was demonstrated that the gum bases derived from identical origins have specific and characteristic total ion chromatogram (TIC) patterns and ester compositions. Food additive gum bases were thus distinguished from one another based on their TIC patterns and then more clearly discriminated using simultaneous monitoring of the fragment ions corresponding to the fatty acid moieties of the individual molecular species of the wax esters. This direct high-temperature GC/MS method was shown to be very useful for the rapid and simple discrimination of varieties of ester-type gum bases used as food additives.

  20. Method for the determination of natural ester-type gum bases used as food additives via direct analysis of their constituent wax esters using high-temperature GC/MS

    PubMed Central

    Tada, Atsuko; Ishizuki, Kyoko; Yamazaki, Takeshi; Sugimoto, Naoki; Akiyama, Hiroshi

    2014-01-01

    Natural ester-type gum bases, which are used worldwide as food additives, mainly consist of wax esters composed of long-chain fatty acids and long-chain fatty alcohols. There are many varieties of ester-type gum bases, and thus a useful method for their discrimination is needed in order to establish official specifications and manage their quality control. Herein is reported a rapid and simple method for the analysis of different ester-type gum bases used as food additives by high-temperature gas chromatography/mass spectrometry (GC/MS). With this method, the constituent wax esters in ester-type gum bases can be detected without hydrolysis and derivatization. The method was applied to the determination of 10 types of gum bases, including beeswax, carnauba wax, lanolin, and jojoba wax, and it was demonstrated that the gum bases derived from identical origins have specific and characteristic total ion chromatogram (TIC) patterns and ester compositions. Food additive gum bases were thus distinguished from one another based on their TIC patterns and then more clearly discriminated using simultaneous monitoring of the fragment ions corresponding to the fatty acid moieties of the individual molecular species of the wax esters. This direct high-temperature GC/MS method was shown to be very useful for the rapid and simple discrimination of varieties of ester-type gum bases used as food additives. PMID:25473499

  1. Maintenance Training Equipment: Design Specification Based on Instructional System Development. Revision

    DTIC Science & Technology

    1984-12-01

    model provides a method for communicating a specific training equipment design to the procurement office after an ISD analysis has established a...maintenance trainer has been identified. The model provides a method by which a training equipment design can be communicated to the System Project Office...ensure ease of development of procurement specifications and consistency between different documented designs. A completed application of this model

  2. Self-stabilizing byzantine-fault-tolerant clock synchronization system and method

    NASA Technical Reports Server (NTRS)

    Malekpour, Mahyar R. (Inventor)

    2012-01-01

    Systems and methods for rapid Byzantine-fault-tolerant self-stabilizing clock synchronization are provided. The systems and methods are based on a protocol comprising a state machine and a set of monitors that execute once every local oscillator tick. The protocol is independent of specific application specific requirements. The faults are assumed to be arbitrary and/or malicious. All timing measures of variables are based on the node's local clock and thus no central clock or externally generated pulse is used. Instances of the protocol are shown to tolerate bursts of transient failures and deterministically converge with a linear convergence time with respect to the synchronization period as predicted.

  3. Ontology-based configuration of problem-solving methods and generation of knowledge-acquisition tools: application of PROTEGE-II to protocol-based decision support.

    PubMed

    Tu, S W; Eriksson, H; Gennari, J H; Shahar, Y; Musen, M A

    1995-06-01

    PROTEGE-II is a suite of tools and a methodology for building knowledge-based systems and domain-specific knowledge-acquisition tools. In this paper, we show how PROTEGE-II can be applied to the task of providing protocol-based decision support in the domain of treating HIV-infected patients. To apply PROTEGE-II, (1) we construct a decomposable problem-solving method called episodic skeletal-plan refinement, (2) we build an application ontology that consists of the terms and relations in the domain, and of method-specific distinctions not already captured in the domain terms, and (3) we specify mapping relations that link terms from the application ontology to the domain-independent terms used in the problem-solving method. From the application ontology, we automatically generate a domain-specific knowledge-acquisition tool that is custom-tailored for the application. The knowledge-acquisition tool is used for the creation and maintenance of domain knowledge used by the problem-solving method. The general goal of the PROTEGE-II approach is to produce systems and components that are reusable and easily maintained. This is the rationale for constructing ontologies and problem-solving methods that can be composed from a set of smaller-grained methods and mechanisms. This is also why we tightly couple the knowledge-acquisition tools to the application ontology that specifies the domain terms used in the problem-solving systems. Although our evaluation is still preliminary, for the application task of providing protocol-based decision support, we show that these goals of reusability and easy maintenance can be achieved. We discuss design decisions and the tradeoffs that have to be made in the development of the system.

  4. Benchmarking Procedures for High-Throughput Context Specific Reconstruction Algorithms

    PubMed Central

    Pacheco, Maria P.; Pfau, Thomas; Sauter, Thomas

    2016-01-01

    Recent progress in high-throughput data acquisition has shifted the focus from data generation to processing and understanding of how to integrate the collected information. Context-specific reconstruction based on generic genome-scale models like ReconX or HMR has the potential to become a diagnostic and treatment tool tailored to the analysis of specific individuals. The respective computational algorithms require a high level of predictive power, robustness and sensitivity. Although multiple context-specific reconstruction algorithms were published in the last 10 years, only a fraction of them is suitable for model building based on human high-throughput data. Among other reasons, this might be due to problems arising from the limitation to only one metabolic target function or from arbitrary thresholding. This review describes and analyses common validation methods used for testing model-building algorithms. Two major methods can be distinguished: consistency testing and comparison-based testing. The first is concerned with robustness against noise, e.g., missing data due to the impossibility of distinguishing between the signal and the background of non-specific binding of probes in a microarray experiment, and with whether distinct sets of input expressed genes corresponding to, e.g., different tissues yield distinct models. The latter covers methods comparing sets of functionalities, comparison with existing networks, or comparison with additional databases. We test those methods on several available algorithms and deduce properties of these algorithms that can be compared with future developments. The set of tests performed can therefore serve as a benchmarking procedure for future algorithms. PMID:26834640

  5. Analysis of visual quality improvements provided by known tools for HDR content

    NASA Astrophysics Data System (ADS)

    Kim, Jaehwan; Alshina, Elena; Lee, JongSeok; Park, Youngo; Choi, Kwang Pyo

    2016-09-01

    In this paper, the visual quality of different solutions for high dynamic range (HDR) compression is analyzed using MPEG test contents. We also simulate a method for efficient HDR compression based on the statistical properties of the signal. The method is compliant with the HEVC specification and is also easily compatible with other alternative methods that might require HEVC specification changes. It was subjectively tested on commercial TVs and compared with alternative solutions for HDR coding. Subjective visual quality tests were performed on a SUHD TV model (SAMSUNG JS9500) with maximum luminance up to 1000 nit. The solution based on the statistical properties of the signal shows not only improved objective performance but also improved visual quality compared to other HDR solutions, while remaining compatible with the HEVC specification.

  6. Sequence Based Prediction of Antioxidant Proteins Using a Classifier Selection Strategy

    PubMed Central

    Zhang, Lina; Zhang, Chengjin; Gao, Rui; Yang, Runtao; Song, Qing

    2016-01-01

    Antioxidant proteins perform significant functions in maintaining oxidation/antioxidation balance and have potential therapies for some diseases. Accurate identification of antioxidant proteins could contribute to revealing physiological processes of oxidation/antioxidation balance and developing novel antioxidation-based drugs. In this study, an ensemble method is presented to predict antioxidant proteins with hybrid features, incorporating SSI (Secondary Structure Information), PSSM (Position Specific Scoring Matrix), RSA (Relative Solvent Accessibility), and CTD (Composition, Transition, Distribution). The prediction results of the ensemble predictor are determined by an average of prediction results of multiple base classifiers. Based on a classifier selection strategy, we obtain an optimal ensemble classifier composed of RF (Random Forest), SMO (Sequential Minimal Optimization), NNA (Nearest Neighbor Algorithm), and J48 with an accuracy of 0.925. A Relief combined with IFS (Incremental Feature Selection) method is adopted to obtain optimal features from hybrid features. With the optimal features, the ensemble method achieves improved performance with a sensitivity of 0.95, a specificity of 0.93, an accuracy of 0.94, and an MCC (Matthew’s Correlation Coefficient) of 0.880, far better than the existing method. To evaluate the prediction performance objectively, the proposed method is compared with existing methods on the same independent testing dataset. Encouragingly, our method performs better than previous studies. In addition, our method achieves more balanced performance with a sensitivity of 0.878 and a specificity of 0.860. These results suggest that the proposed ensemble method can be a potential candidate for antioxidant protein prediction. For public access, we develop a user-friendly web server for antioxidant protein identification that is freely accessible at http://antioxidant.weka.cc. PMID:27662651
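
    The ensemble step (averaging the predictions of the selected base classifiers) can be sketched as below. The Weka learners named in the abstract (SMO, NNA, J48) are replaced by approximate scikit-learn counterparts, the features are synthetic, and the Relief/IFS feature selection is omitted, so this is only an illustration of the averaging idea.

```python
# Hedged sketch: soft-voting ensemble over RF plus stand-ins for SMO (SVC),
# NNA (k-nearest neighbours) and J48 (decision tree), on synthetic features.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier, VotingClassifier
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=400, n_features=40, random_state=0)

ensemble = VotingClassifier(
    estimators=[
        ("rf", RandomForestClassifier(n_estimators=100, random_state=0)),
        ("svm", SVC(probability=True, random_state=0)),    # stand-in for SMO
        ("knn", KNeighborsClassifier(n_neighbors=5)),      # stand-in for NNA
        ("tree", DecisionTreeClassifier(random_state=0)),  # stand-in for J48
    ],
    voting="soft",   # average the predicted class probabilities
)
print(cross_val_score(ensemble, X, y, cv=5).mean())
```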

  7. Computed tomography landmark-based semi-automated mesh morphing and mapping techniques: generation of patient specific models of the human pelvis without segmentation.

    PubMed

    Salo, Zoryana; Beek, Maarten; Wright, David; Whyne, Cari Marisa

    2015-04-13

    Current methods for the development of pelvic finite element (FE) models generally are based upon specimen-specific computed tomography (CT) data. This approach has traditionally required segmentation of CT data sets, which is time consuming and necessitates high levels of user intervention due to the complex pelvic anatomy. The purpose of this research was to develop and assess CT landmark-based semi-automated mesh morphing and mapping techniques to aid the generation and mechanical analysis of specimen-specific FE models of the pelvis without the need for segmentation. A specimen-specific pelvic FE model (source) was created using traditional segmentation methods and morphed onto a CT scan of a different (target) pelvis using a landmark-based method. The morphed model was then refined through mesh mapping by moving the nodes to the bone boundary. A second target model was created using traditional segmentation techniques. CT intensity based material properties were assigned to the morphed/mapped model and to the traditionally segmented target models. Models were analyzed to evaluate their geometric concurrency and strain patterns. Strains generated in a double-leg stance configuration were compared to experimental strain gauge data generated from the same target cadaver pelvis. CT landmark-based morphing and mapping techniques were efficiently applied to create a geometrically multifaceted specimen-specific pelvic FE model, which was similar to the traditionally segmented target model and better replicated the experimental strain results (R² = 0.873). This study has shown that mesh morphing and mapping represents an efficient validated approach for pelvic FE model generation without the need for segmentation. Copyright © 2015 Elsevier Ltd. All rights reserved.

  8. Automated real time constant-specificity surveillance for disease outbreaks.

    PubMed

    Wieland, Shannon C; Brownstein, John S; Berger, Bonnie; Mandl, Kenneth D

    2007-06-13

    For real time surveillance, detection of abnormal disease patterns is based on a difference between patterns observed, and those predicted by models of historical data. The usefulness of outbreak detection strategies depends on their specificity; the false alarm rate affects the interpretation of alarms. We evaluate the specificity of five traditional models: autoregressive, Serfling, trimmed seasonal, wavelet-based, and generalized linear. We apply each to 12 years of emergency department visits for respiratory infection syndromes at a pediatric hospital, finding that the specificity of the five models was almost always a non-constant function of the day of the week, month, and year of the study (p < 0.05). We develop an outbreak detection method, called the expectation-variance model, based on generalized additive modeling to achieve a constant specificity by accounting for not only the expected number of visits, but also the variance of the number of visits. The expectation-variance model achieves constant specificity on all three time scales, as well as earlier detection and improved sensitivity compared to traditional methods in most circumstances. Modeling the variance of visit patterns enables real-time detection with known, constant specificity at all times. With constant specificity, public health practitioners can better interpret the alarms and better evaluate the cost-effectiveness of surveillance systems.
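
    The expectation-variance idea can be sketched as an alarm rule that uses both the predicted count and its predicted variance, so the nominal specificity stays constant over time. The expectation and variance models below are placeholders (a fixed weekly pattern with Poisson-like variance), not the generalized additive models fitted in the study.

```python
# Hedged sketch: constant-specificity alarms using expectation and variance.
import numpy as np
from scipy.stats import norm

def alarms(observed, expected, variance, specificity=0.97):
    """Boolean alarm vector: observed count exceeds expected + z*sqrt(variance)."""
    z = norm.ppf(specificity)                      # e.g. 0.97 -> z ~ 1.88
    return observed > expected + z * np.sqrt(variance)

days = np.arange(60)
expected = 50 + 10 * np.sin(2 * np.pi * days / 7)  # placeholder expectation model
variance = expected                                # placeholder Poisson-like variance
observed = np.random.default_rng(3).poisson(expected)
observed[25] += 40                                 # injected outbreak on day 25
print(np.flatnonzero(alarms(observed, expected, variance)))
```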

  9. Using groundwater levels to estimate recharge

    USGS Publications Warehouse

    Healy, R.W.; Cook, P.G.

    2002-01-01

    Accurate estimation of groundwater recharge is extremely important for proper management of groundwater systems. Many different approaches exist for estimating recharge. This paper presents a review of methods that are based on groundwater-level data. The water-table fluctuation method may be the most widely used technique for estimating recharge; it requires knowledge of specific yield and changes in water levels over time. Advantages of this approach include its simplicity and an insensitivity to the mechanism by which water moves through the unsaturated zone. Uncertainty in estimates generated by this method relate to the limited accuracy with which specific yield can be determined and to the extent to which assumptions inherent in the method are valid. Other methods that use water levels (mostly based on the Darcy equation) are also described. The theory underlying the methods is explained. Examples from the literature are used to illustrate applications of the different methods.
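
    In its usual (forward) direction, the water-table fluctuation method turns each hydrograph rise attributed to recharge into a recharge depth, R = Sy × Δh. The minimal sketch below uses an assumed specific yield and illustrative rises.

```python
# Hedged sketch of the forward WTF calculation: recharge per rise event is
# specific yield times head rise; values are illustrative only.
specific_yield = 0.09                  # dimensionless, assumed known
rise_events_m = [0.35, 0.12, 0.60]     # water-table rises attributed to recharge

recharge_per_event_m = [specific_yield * dh for dh in rise_events_m]
print(f"total recharge: {sum(recharge_per_event_m) * 1000:.0f} mm")
```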

  10. Improved regulatory element prediction based on tissue-specific local epigenomic signatures

    PubMed Central

    He, Yupeng; Gorkin, David U.; Dickel, Diane E.; Nery, Joseph R.; Castanon, Rosa G.; Lee, Ah Young; Shen, Yin; Visel, Axel; Pennacchio, Len A.; Ren, Bing; Ecker, Joseph R.

    2017-01-01

    Accurate enhancer identification is critical for understanding the spatiotemporal transcriptional regulation during development as well as the functional impact of disease-related noncoding genetic variants. Computational methods have been developed to predict the genomic locations of active enhancers based on histone modifications, but the accuracy and resolution of these methods remain limited. Here, we present an algorithm, regulatory element prediction based on tissue-specific local epigenetic marks (REPTILE), which integrates histone modification and whole-genome cytosine DNA methylation profiles to identify the precise location of enhancers. We tested the ability of REPTILE to identify enhancers previously validated in reporter assays. Compared with existing methods, REPTILE shows consistently superior performance across diverse cell and tissue types, and the enhancer locations are significantly more refined. We show that, by incorporating base-resolution methylation data, REPTILE greatly improves upon current methods for annotation of enhancers across a variety of cell and tissue types. REPTILE is available at https://github.com/yupenghe/REPTILE/. PMID:28193886

  11. ROKU: a novel method for identification of tissue-specific genes

    PubMed Central

    Kadota, Koji; Ye, Jiazhen; Nakai, Yuji; Terada, Tohru; Shimizu, Kentaro

    2006-01-01

    Background: One of the important goals of microarray research is the identification of genes whose expression is considerably higher or lower in some tissues than in others. We would like to have ways of identifying such tissue-specific genes. Results: We describe a method, ROKU, which selects tissue-specific patterns from gene expression data for many tissues and thousands of genes. ROKU ranks genes according to their overall tissue specificity using Shannon entropy and detects the tissues specific to each gene, if any exist, using an outlier detection method. We evaluated the capacity for the detection of various specific expression patterns using synthetic and real data. We observed that ROKU was superior to a conventional entropy-based method in its ability to rank genes according to overall tissue specificity and to detect genes whose expression patterns are specific only to the tissues of interest. Conclusion: ROKU is useful for the detection of various tissue-specific expression patterns. The framework is also directly applicable to the selection of diagnostic markers for molecular classification of multiple classes. PMID:16764735
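
    The entropy part of the ranking can be illustrated directly: the Shannon entropy of a gene's expression profile across tissues is high for uniformly expressed genes and low for tissue-specific ones. The sketch below covers only this entropy score; ROKU's additional preprocessing and outlier-detection steps are omitted.

```python
# Hedged sketch: Shannon entropy of an expression profile across tissues as a
# tissue-specificity score (low entropy = more tissue-specific).
import numpy as np

def expression_entropy(expr):
    """Shannon entropy (bits) of a non-negative expression vector across tissues."""
    p = np.asarray(expr, dtype=float)
    p = p / p.sum()
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

uniform  = [5, 5, 5, 5, 5, 5, 5, 5]     # housekeeping-like pattern
specific = [40, 1, 1, 1, 1, 1, 1, 1]    # expressed mainly in one tissue
print(expression_entropy(uniform), expression_entropy(specific))  # ~3.0 vs ~1.0
```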

  12. Improved segmentation of abnormal cervical nuclei using a graph-search based approach

    NASA Astrophysics Data System (ADS)

    Zhang, Ling; Liu, Shaoxiong; Wang, Tianfu; Chen, Siping; Sonka, Milan

    2015-03-01

    Reliable segmentation of abnormal nuclei in cervical cytology is of paramount importance in automation-assisted screening techniques. This paper presents a general method for improving the segmentation of abnormal nuclei using a graph-search based approach. More specifically, the proposed method focuses on the improvement of coarse (initial) segmentation. The improvement relies on a transform that maps round-like border in the Cartesian coordinate system into lines in the polar coordinate system. The costs consisting of nucleus-specific edge and region information are assigned to the nodes. The globally optimal path in the constructed graph is then identified by dynamic programming. We have tested the proposed method on abnormal nuclei from two cervical cell image datasets, Herlev and H&E-stained liquid-based cytology (HELBC), and the comparative experiments with recent state-of-the-art approaches demonstrate the superior performance of the proposed method.
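
    After the polar transform, the nucleus border becomes a roughly horizontal path through a (radius × angle) cost image, and dynamic programming recovers the minimum-cost path under a smoothness constraint on how much the radius may change per angle step. The hedged sketch below uses a random cost image; in the method the costs combine nucleus-specific edge and region terms.

```python
# Hedged sketch: column-wise dynamic programming for a minimum-cost border
# path in a polar (radius x angle) cost image, with a radius-jump constraint.
import numpy as np

def min_cost_border(cost, max_jump=1):
    """Return one radius index per angle (column) along the optimal path."""
    n_radius, n_angle = cost.shape
    acc = cost.copy()
    back = np.zeros_like(cost, dtype=int)
    for j in range(1, n_angle):
        for r in range(n_radius):
            lo, hi = max(0, r - max_jump), min(n_radius, r + max_jump + 1)
            prev = int(np.argmin(acc[lo:hi, j - 1])) + lo
            back[r, j] = prev
            acc[r, j] += acc[prev, j - 1]
    path = [int(np.argmin(acc[:, -1]))]
    for j in range(n_angle - 1, 0, -1):        # backtrack from the last column
        path.append(int(back[path[-1], j]))
    return path[::-1]

cost = np.random.default_rng(4).random((30, 60))
print(min_cost_border(cost)[:10])
```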

  13. Molecular testing for clinical diagnosis and epidemiological investigations of intestinal parasitic infections.

    PubMed

    Verweij, Jaco J; Stensvold, C Rune

    2014-04-01

    Over the past few decades, nucleic acid-based methods have been developed for the diagnosis of intestinal parasitic infections. Advantages of nucleic acid-based methods are numerous; typically, these include increased sensitivity and specificity and simpler standardization of diagnostic procedures. DNA samples can also be stored and used for genetic characterization and molecular typing, providing a valuable tool for surveys and surveillance studies. A variety of technologies have been applied, and some specific and general pitfalls and limitations have been identified. This review provides an overview of the multitude of methods that have been reported for the detection of intestinal parasites and offers some guidance in applying these methods in the clinical laboratory and in epidemiological studies.

  14. Molecular Testing for Clinical Diagnosis and Epidemiological Investigations of Intestinal Parasitic Infections

    PubMed Central

    Stensvold, C. Rune

    2014-01-01

    SUMMARY Over the past few decades, nucleic acid-based methods have been developed for the diagnosis of intestinal parasitic infections. Advantages of nucleic acid-based methods are numerous; typically, these include increased sensitivity and specificity and simpler standardization of diagnostic procedures. DNA samples can also be stored and used for genetic characterization and molecular typing, providing a valuable tool for surveys and surveillance studies. A variety of technologies have been applied, and some specific and general pitfalls and limitations have been identified. This review provides an overview of the multitude of methods that have been reported for the detection of intestinal parasites and offers some guidance in applying these methods in the clinical laboratory and in epidemiological studies. PMID:24696439

  15. A broad range assay for rapid detection and etiologic characterization of bacterial meningitis: performance testing in samples from sub-Sahara.

    PubMed

    Won, Helen; Yang, Samuel; Gaydos, Charlotte; Hardick, Justin; Ramachandran, Padmini; Hsieh, Yu-Hsiang; Kecojevic, Alexander; Njanpop-Lafourcade, Berthe-Marie; Mueller, Judith E; Tameklo, Tsidi Agbeko; Badziklou, Kossi; Gessner, Bradford D; Rothman, Richard E

    2012-09-01

    This study aimed to conduct a pilot evaluation of broad-based multiprobe polymerase chain reaction (PCR) in clinical cerebrospinal fluid (CSF) samples compared to local conventional PCR/culture methods used for bacterial meningitis surveillance. A previously described PCR consisting of initial broad-based detection of Eubacteriales by a universal probe, followed by Gram typing, and pathogen-specific probes was designed targeting variable regions of the 16S rRNA gene. The diagnostic performance of the 16S rRNA assay in 127 CSF samples was evaluated in samples from patients from Togo, Africa, by comparison to conventional PCR/culture methods. Our probes detected Neisseria meningitidis, Streptococcus pneumoniae, and Haemophilus influenzae. Uniprobe sensitivity and specificity versus conventional PCR were 100% and 54.6%, respectively. Sensitivity and specificity of uniprobe versus culture methods were 96.5% and 52.5%, respectively. Gram-typing probes correctly typed 98.8% (82/83) and pathogen-specific probes identified 96.4% (80/83) of the positives. This broad-based PCR algorithm successfully detected and provided species level information for multiple bacterial meningitis agents in clinical samples. Copyright © 2012 Elsevier Inc. All rights reserved.

  16. A broad range assay for rapid detection and etiologic characterization of bacterial meningitis: performance testing in samples from sub-Sahara☆, ☆☆,★

    PubMed Central

    Won, Helen; Yang, Samuel; Gaydos, Charlotte; Hardick, Justin; Ramachandran, Padmini; Hsieh, Yu-Hsiang; Kecojevic, Alexander; Njanpop-Lafourcade, Berthe-Marie; Mueller, Judith E.; Tameklo, Tsidi Agbeko; Badziklou, Kossi; Gessner, Bradford D.; Rothman, Richard E.

    2012-01-01

    This study aimed to conduct a pilot evaluation of broad-based multiprobe polymerase chain reaction (PCR) in clinical cerebrospinal fluid (CSF) samples compared to local conventional PCR/culture methods used for bacterial meningitis surveillance. A previously described PCR consisting of initial broad-based detection of Eubacteriales by a universal probe, followed by Gram typing, and pathogen-specific probes was designed targeting variable regions of the 16S rRNA gene. The diagnostic performance of the 16S rRNA assay in 127 CSF samples was evaluated in samples from patients from Togo, Africa, by comparison to conventional PCR/culture methods. Our probes detected Neisseria meningitidis, Streptococcus pneumoniae, and Haemophilus influenzae. Uniprobe sensitivity and specificity versus conventional PCR were 100% and 54.6%, respectively. Sensitivity and specificity of uniprobe versus culture methods were 96.5% and 52.5%, respectively. Gram-typing probes correctly typed 98.8% (82/83) and pathogen-specific probes identified 96.4% (80/83) of the positives. This broad-based PCR algorithm successfully detected and provided species level information for multiple bacterial meningitis agents in clinical samples. PMID:22809694

  17. Statistical method evaluation for differentially methylated CpGs in base resolution next-generation DNA sequencing data.

    PubMed

    Zhang, Yun; Baheti, Saurabh; Sun, Zhifu

    2018-05-01

    High-throughput bisulfite methylation sequencing such as reduced representation bisulfite sequencing (RRBS), Agilent SureSelect Human Methyl-Seq (Methyl-seq) or whole-genome bisulfite sequencing is commonly used for base-resolution methylome research. These data are represented either by the ratio of methylated cytosines versus total coverage at a CpG site or by the numbers of methylated and unmethylated cytosines. Multiple statistical methods can be used to detect differentially methylated CpGs (DMCs) between conditions, and these methods are often the basis for the next step of differentially methylated region identification. The ratio data have the flexibility of fitting many linear models, whereas the raw count data take coverage information into account. There is an array of options for DMC detection with each data type; however, it is not clear which statistical method is optimal. In this study, we systematically evaluated four statistical methods on methylation ratio data and four methods on count-based data and compared their performances with regard to type I error control, sensitivity and specificity of DMC detection, and computational resource demands, using real RRBS data along with simulation. Our results show that the ratio-based tests are generally more conservative (less sensitive) than the count-based tests. However, some count-based methods have high false-positive rates and should be avoided. The beta-binomial model gives a good balance between sensitivity and specificity and is the preferred method. Selection of methods in different settings, signal versus noise, and sample size estimation are also discussed.
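
    As an illustration of the count-based approach that the study favours, the hedged sketch below tests one CpG with a beta-binomial likelihood-ratio test: methylated counts out of coverage are modelled per group, and a shared-proportion null model is compared against group-specific proportions. The fixed overdispersion and the exact parameterization are simplifying assumptions, not the benchmarked implementation.

```python
# Hedged sketch: beta-binomial likelihood-ratio test for one CpG site.
import numpy as np
from scipy.optimize import minimize_scalar
from scipy.stats import betabinom, chi2

def neg_loglik(mu, k, n, rho=0.1):
    """Negative beta-binomial log-likelihood with mean mu and fixed overdispersion rho."""
    a = mu * (1 - rho) / rho
    b = (1 - mu) * (1 - rho) / rho
    return -np.sum(betabinom.logpmf(k, n, a, b))

def dmc_test(k1, n1, k2, n2):
    fit = lambda k, n: minimize_scalar(neg_loglik, bounds=(1e-4, 1 - 1e-4),
                                       args=(k, n), method="bounded").fun
    ll_alt = -(fit(k1, n1) + fit(k2, n2))            # group-specific methylation means
    ll_null = -fit(np.r_[k1, k2], np.r_[n1, n2])     # shared methylation mean
    return chi2.sf(2 * (ll_alt - ll_null), df=1)     # LRT p-value

# Toy counts: ~20% methylated in group 1 vs ~70% in group 2 at one CpG.
p = dmc_test(k1=np.array([4, 6, 5]),    n1=np.array([25, 30, 28]),
             k2=np.array([20, 18, 22]), n2=np.array([28, 26, 30]))
print(f"p-value: {p:.3g}")
```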

  18. EVALUATE THE UTILITY OF ENTEROCOCCI AS INDICATORS OF THE SOURCES OF FECAL CONTAMINATION IN IMPAIRED SUBWATERSHEDS THROUGH DNA-BASED MOLECULAR TECHNIQUES

    EPA Science Inventory

    Microbial source tracking (MST) is based on the assumption that specific strains of bacteria are associated with specific host species. MST methods are attractive because their application on environmental samples could help define the nature of water quality problems in impaire...

  19. Common and Specific Factors Approaches to Home-Based Treatment: I-FAST and MST

    ERIC Educational Resources Information Center

    Lee, Mo Yee; Greene, Gilbert J.; Fraser, J. Scott; Edwards, Shivani G.; Grove, David; Solovey, Andrew D.; Scott, Pamela

    2013-01-01

    Objectives: This study examined the treatment outcomes of integrated families and systems treatment (I-FAST), a moderated common factors approach, in reference to multisystemic therapy (MST), an established specific factor approach, for treating at-risk children and adolescents and their families in an intensive community-based setting. Method:…

  20. Comparison of MI, Chromocult® coliform, and Compass CC chromogenic culture-based methods to detect Escherichia coli and total coliforms in water using 16S rRNA sequencing for colony identification.

    PubMed

    Maheux, Andrée F; Bouchard, Sébastien; Bérubé, Ève; Bergeron, Michel G

    2017-06-01

    The MI, Chromocult® coliform, and Compass CC chromogenic culture-based methods used to assess water quality by the detection of Escherichia coli and total coliforms were compared in terms of their specificity and sensitivity, using 16S rRNA sequencing for colony identification. A sewage water sample was divided into 2-μL subsamples for testing by all three culture-based methods. All growing colonies were harvested and subjected to 16S rRNA sequencing. Test results showed that all E. coli colonies were correctly identified by all three methods, for a specificity and a sensitivity of 100%. However, for total coliform detection, the MI agar, Chromocult® coliform agar, and Compass CC agar had specificities of only 69.2% (9/13), 47.2% (25/53), and 40.5% (17/42), and sensitivities of 97.8% (45/46), 97.5% (39/40), and 85.7% (24/28), respectively. Thus, given the low specificity of these methods for the detection of total coliforms, confirming the identity of total coliform colonies could help inform public health decisions, in particular for cities connected to a public drinking water distribution system, since the growth of even a few putative total coliform colonies on chromogenic agar is problematic and can lead to unnecessary and costly boil-water notices from public health authorities.

  1. A time domain frequency-selective multivariate Granger causality approach.

    PubMed

    Leistritz, Lutz; Witte, Herbert

    2016-08-01

    The investigation of effective connectivity is one of the major topics in computational neuroscience to understand the interaction between spatially distributed neuronal units of the brain. Thus, a wide variety of methods has been developed during the last decades to investigate functional and effective connectivity in multivariate systems. Their spectrum ranges from model-based to model-free approaches with a clear separation into time and frequency range methods. We present in this simulation study a novel time domain approach based on Granger's principle of predictability, which allows frequency-selective considerations of directed interactions. It is based on a comparison of prediction errors of multivariate autoregressive models fitted to systematically modified time series. These modifications are based on signal decompositions, which enable a targeted cancellation of specific signal components with specific spectral properties. Depending on the embedded signal decomposition method, a frequency-selective or data-driven signal-adaptive Granger Causality Index may be derived.
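
    A minimal numerical sketch of the predictability principle underlying this index, assuming only NumPy and hypothetical simulated data: autoregressive models are fitted with and without the candidate driving signal, and the causality index is the log ratio of the resulting prediction-error variances. The frequency selectivity obtained in the paper via signal decomposition and targeted cancellation of spectral components is not reproduced here.

```python
import numpy as np

def ar_residual_var(y, regressors, order=2):
    """Fit y(t) on lagged regressors by least squares; return residual variance."""
    n = len(y)
    X = np.column_stack([np.column_stack([r[order - k - 1:n - k - 1] for k in range(order)])
                         for r in regressors])
    target = y[order:]
    beta, *_ = np.linalg.lstsq(X, target, rcond=None)
    resid = target - X @ beta
    return resid.var()

rng = np.random.default_rng(0)
n = 2000
x = rng.standard_normal(n)
y = np.zeros(n)
for t in range(2, n):                      # y is driven by past x, so x "Granger-causes" y
    y[t] = 0.5 * y[t - 1] + 0.8 * x[t - 1] + 0.1 * rng.standard_normal()

var_full = ar_residual_var(y, [y, x])      # prediction error with x included
var_restricted = ar_residual_var(y, [y])   # prediction error with x removed
gci = np.log(var_restricted / var_full)    # Granger causality index (x -> y)
print(f"GCI(x -> y) = {gci:.3f}")
```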

  2. Secure password-based authenticated key exchange for web services

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liang, Fang; Meder, Samuel; Chevassut, Olivier

    This paper discusses an implementation of an authenticated key-exchange method rendered on message primitives defined in the WS-Trust and WS-SecureConversation specifications. This IEEE-specified cryptographic method (AuthA) is proven-secure for password-based authentication and key exchange, while the WS-Trust and WS-SecureConversation are emerging Web Services Security specifications that extend the WS-Security specification. A prototype of the presented protocol is integrated in the WSRF-compliant Globus Toolkit V4. Further hardening of the implementation is expected to result in a version that will be shipped with future Globus Toolkit releases. This could help to address the current unavailability of decent shared-secret-based authentication options in the Web Services and Grid world. Future work will be to integrate One-Time-Password (OTP) features in the authentication protocol.

  3. Imaging quality analysis of computer-generated holograms using the point-based method and slice-based method

    NASA Astrophysics Data System (ADS)

    Zhang, Zhen; Chen, Siqing; Zheng, Huadong; Sun, Tao; Yu, Yingjie; Gao, Hongyue; Asundi, Anand K.

    2017-06-01

    Computer holography has made notable progress in recent years. The point-based method and the slice-based method are the chief calculation algorithms for generating holograms in holographic display. Although both methods have been validated numerically and optically, the differences in imaging quality between them have not been specifically analyzed. In this paper, we analyze the imaging quality of computer-generated phase holograms generated by point-based Fresnel zone plates (PB-FZP), the point-based Fresnel diffraction algorithm (PB-FDA) and the slice-based Fresnel diffraction algorithm (SB-FDA). The calculation formulas and hologram generation with the three methods are demonstrated. In order to suppress speckle noise, sequential phase-only holograms are generated in our work. Numerically and experimentally reconstructed images are also presented. By comparing the imaging quality, the merits and drawbacks of the three methods are analyzed, and conclusions are drawn.
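
    A minimal sketch of the point-based Fresnel zone plate (PB-FZP) idea under the paraxial approximation, assuming NumPy; the wavelength, pixel pitch and object points below are illustrative assumptions, not the paper's values. Each object point contributes a quadratic phase on the hologram plane, the complex contributions are superposed, and a phase-only hologram is taken from the argument of the field. The sequential phase-only encoding and the slice-based variants compared in the paper are not reproduced.

```python
import numpy as np

# Illustrative hologram parameters (assumptions, not the paper's values).
wavelength = 532e-9          # m
pitch = 8e-6                 # hologram pixel pitch, m
nx = ny = 512
x = (np.arange(nx) - nx / 2) * pitch
y = (np.arange(ny) - ny / 2) * pitch
X, Y = np.meshgrid(x, y)

# A few object points: (x0, y0, z0, amplitude).
points = [(0.0, 0.0, 0.10, 1.0),
          (0.5e-3, -0.3e-3, 0.12, 0.8)]

field = np.zeros((ny, nx), dtype=complex)
k = 2 * np.pi / wavelength
for x0, y0, z0, amp in points:
    # Paraxial (Fresnel) phase of the zone plate produced by one point source.
    phase = k * ((X - x0) ** 2 + (Y - y0) ** 2) / (2 * z0)
    field += amp * np.exp(1j * phase)

phase_hologram = np.angle(field)   # phase-only CGH, values in (-pi, pi]
print(phase_hologram.shape, phase_hologram.min(), phase_hologram.max())
```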

  4. Improving lab compaction specifications for flexible bases within the Texas DOT.

    DOT National Transportation Integrated Search

    2009-04-01

    In Test Methods Tex-113-E and Tex-114-E, the Texas Department of Transportation (TxDOT) employs an impact hammer method of sample compaction for laboratory preparation of road base and subgrade materials for testing. In this third and final report do...

  5. Multiple Imputation based Clustering Validation (MIV) for Big Longitudinal Trial Data with Missing Values in eHealth.

    PubMed

    Zhang, Zhaoyang; Fang, Hua; Wang, Honggang

    2016-06-01

    Web-delivered trials are an important component in eHealth services. These trials, mostly behavior-based, generate big heterogeneous data that are longitudinal, high dimensional with missing values. Unsupervised learning methods have been widely applied in this area; however, validating the optimal number of clusters has been challenging. Built upon our multiple imputation (MI) based fuzzy clustering, MIfuzzy, we proposed a new multiple imputation based validation (MIV) framework and corresponding MIV algorithms for clustering big longitudinal eHealth data with missing values, more generally for fuzzy-logic based clustering methods. Specifically, we detect the optimal number of clusters by auto-searching and -synthesizing a suite of MI-based validation methods and indices, including conventional (bootstrap or cross-validation based) and emerging (modularity-based) validation indices for general clustering methods as well as the specific one (Xie and Beni) for fuzzy clustering. The MIV performance was demonstrated on a big longitudinal dataset from a real web-delivered trial and using simulation. The results indicate that the MI-based Xie and Beni index for fuzzy clustering is more appropriate for detecting the optimal number of clusters for such complex data. The MIV concept and algorithms could be easily adapted to different types of clustering that could process big incomplete longitudinal trial data in eHealth services.
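
    As a minimal sketch of the fuzzy-clustering-specific validity index mentioned above (Xie and Beni), assuming NumPy, hypothetical data, and a membership matrix as produced by any fuzzy c-means style algorithm: lower values indicate more compact, better-separated clusters, so among candidate cluster numbers the one minimizing the index would be preferred. The multiple-imputation synthesis across indices described in the abstract is not shown.

```python
import numpy as np

def xie_beni_index(X, centers, U, m=2.0):
    """Xie-Beni validity index for fuzzy clustering.

    X: (n_samples, n_features) data matrix
    centers: (n_clusters, n_features) cluster centers
    U: (n_clusters, n_samples) fuzzy membership matrix (columns sum to 1)
    m: fuzzifier exponent
    """
    n = X.shape[0]
    # Compactness: membership-weighted squared distances to centers.
    d2 = ((X[None, :, :] - centers[:, None, :]) ** 2).sum(axis=2)  # (c, n)
    compactness = ((U ** m) * d2).sum()
    # Separation: smallest squared distance between distinct centers.
    c2 = ((centers[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    np.fill_diagonal(c2, np.inf)
    separation = c2.min()
    return compactness / (n * separation)

# Tiny illustrative example with two obvious clusters (hypothetical data).
X = np.array([[0.0, 0.0], [0.1, 0.2], [5.0, 5.1], [5.2, 4.9]])
centers = np.array([[0.05, 0.1], [5.1, 5.0]])
U = np.array([[0.95, 0.9, 0.05, 0.1],
              [0.05, 0.1, 0.95, 0.9]])
print(f"Xie-Beni index: {xie_beni_index(X, centers, U):.4f}")
```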

  6. Multiple Imputation based Clustering Validation (MIV) for Big Longitudinal Trial Data with Missing Values in eHealth

    PubMed Central

    Zhang, Zhaoyang; Wang, Honggang

    2016-01-01

    Web-delivered trials are an important component in eHealth services. These trials, mostly behavior-based, generate big heterogeneous data that are longitudinal, high dimensional with missing values. Unsupervised learning methods have been widely applied in this area; however, validating the optimal number of clusters has been challenging. Built upon our multiple imputation (MI) based fuzzy clustering, MIfuzzy, we proposed a new multiple imputation based validation (MIV) framework and corresponding MIV algorithms for clustering big longitudinal eHealth data with missing values, more generally for fuzzy-logic based clustering methods. Specifically, we detect the optimal number of clusters by auto-searching and -synthesizing a suite of MI-based validation methods and indices, including conventional (bootstrap or cross-validation based) and emerging (modularity-based) validation indices for general clustering methods as well as the specific one (Xie and Beni) for fuzzy clustering. The MIV performance was demonstrated on a big longitudinal dataset from a real web-delivered trial and using simulation. The results indicate that the MI-based Xie and Beni index for fuzzy clustering is more appropriate for detecting the optimal number of clusters for such complex data. The MIV concept and algorithms could be easily adapted to different types of clustering that could process big incomplete longitudinal trial data in eHealth services. PMID:27126063

  7. Enzyme immunoassays for IgG and IgM antibodies to Toxoplasma gondii based on enhanced chemiluminescence.

    PubMed Central

    Crouch, C F

    1995-01-01

    AIMS--To evaluate the clinical performance of enzyme immunoassays for IgG and IgM antibodies to Toxoplasma gondii based on enhanced chemiluminescence. METHODS--Classification of routine clinical samples from the originating laboratories was compared with that obtained using the chemiluminescence based assays. Resolution of discordant results was achieved by testing in alternative enzyme immunoassays (IgM) or by an independent laboratory using the dye test (IgG). RESULTS--Compared with resolved data, the IgM assay was found to be highly specific (100%) with a cut off selected to give optimal performance with respect to both the early detection of specific IgM and the detection of persistent levels of specific IgM (sensitivity 98%). Compared with resolved data, the IgG assay was shown to have a sensitivity and a specificity of 99.4%. CONCLUSIONS--The Amerlite Toxo IgM assay possesses high levels of sensitivity and specificity. Assay interference due to rheumatoid factor like substances is not a problem. The Amerlite Toxo IgG assay possesses good sensitivity and specificity, but is less sensitive for the detection of seroconversion than methods detecting both IgG and IgM. PMID:7560174

  8. Population-based absolute risk estimation with survey data

    PubMed Central

    Kovalchik, Stephanie A.; Pfeiffer, Ruth M.

    2013-01-01

    Absolute risk is the probability that a cause-specific event occurs in a given time interval in the presence of competing events. We present methods to estimate population-based absolute risk from a complex survey cohort that can accommodate multiple exposure-specific competing risks. The hazard function for each event type consists of an individualized relative risk multiplied by a baseline hazard function, which is modeled nonparametrically or parametrically with a piecewise exponential model. An influence method is used to derive a Taylor-linearized variance estimate for the absolute risk estimates. We introduce novel measures of the cause-specific influences that can guide modeling choices for the competing event components of the model. To illustrate our methodology, we build and validate cause-specific absolute risk models for cardiovascular and cancer deaths using data from the National Health and Nutrition Examination Survey. Our applications demonstrate the usefulness of survey-based risk prediction models for predicting health outcomes and quantifying the potential impact of disease prevention programs at the population level. PMID:23686614
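
    A minimal numerical sketch of the absolute-risk construction described above, assuming NumPy, a piecewise exponential (piecewise-constant) baseline hazard on yearly intervals, and hypothetical relative risks: the cause-specific hazard is the individualized relative risk times the baseline hazard, and the absolute risk of the event of interest accumulates while discounting by survival from all competing causes. The survey weighting and influence-based variance estimation of the paper are omitted.

```python
import numpy as np

# Yearly baseline hazards for two competing events (hypothetical values).
baseline_cause = np.array([0.002, 0.003, 0.004, 0.005, 0.006])   # event of interest
baseline_compet = np.array([0.010, 0.011, 0.012, 0.013, 0.015])  # competing event
widths = np.ones_like(baseline_cause)                             # 1-year intervals

# Individualized relative risks (hypothetical covariate effects).
rr_cause, rr_compet = 1.8, 1.2
lam1 = rr_cause * baseline_cause
lam2 = rr_compet * baseline_compet

absolute_risk = 0.0
survival = 1.0                      # probability of being event-free at interval start
for l1, l2, w in zip(lam1, lam2, widths):
    total = l1 + l2
    # Probability that the event of interest occurs in this interval,
    # given event-free at its start (piecewise exponential model).
    p_event = (l1 / total) * (1.0 - np.exp(-total * w))
    absolute_risk += survival * p_event
    survival *= np.exp(-total * w)

print(f"5-year absolute risk of the event of interest: {absolute_risk:.4f}")
```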

  9. Detection of nucleic acids by multiple sequential invasive cleavages

    DOEpatents

    Hall, Jeff G.; Lyamichev, Victor I.; Mast, Andrea L.; Brow, Mary Ann D.

    1999-01-01

    The present invention relates to means for the detection and characterization of nucleic acid sequences, as well as variations in nucleic acid sequences. The present invention also relates to methods for forming a nucleic acid cleavage structure on a target sequence and cleaving the nucleic acid cleavage structure in a site-specific manner. The structure-specific nuclease activity of a variety of enzymes is used to cleave the target-dependent cleavage structure, thereby indicating the presence of specific nucleic acid sequences or specific variations thereof. The present invention further relates to methods and devices for the separation of nucleic acid molecules based on charge. The present invention also provides methods for the detection of non-target cleavage products via the formation of a complete and activated protein binding region. The invention further provides sensitive and specific methods for the detection of human cytomegalovirus nucleic acid in a sample.

  10. Nucleic acid detection kits

    DOEpatents

    Hall, Jeff G.; Lyamichev, Victor I.; Mast, Andrea L.; Brow, Mary Ann; Kwiatkowski, Robert W.; Vavra, Stephanie H.

    2005-03-29

    The present invention relates to means for the detection and characterization of nucleic acid sequences, as well as variations in nucleic acid sequences. The present invention also relates to methods for forming a nucleic acid cleavage structure on a target sequence and cleaving the nucleic acid cleavage structure in a site-specific manner. The structure-specific nuclease activity of a variety of enzymes is used to cleave the target-dependent cleavage structure, thereby indicating the presence of specific nucleic acid sequences or specific variations thereof. The present invention further relates to methods and devices for the separation of nucleic acid molecules based on charge. The present invention also provides methods for the detection of non-target cleavage products via the formation of a complete and activated protein binding region. The invention further provides sensitive and specific methods for the detection of nucleic acid from various viruses in a sample.

  11. Detection of nucleic acids by multiple sequential invasive cleavages 02

    DOEpatents

    Hall, Jeff G.; Lyamichev, Victor I.; Mast, Andrea L.; Brow, Mary Ann D.

    2002-01-01

    The present invention relates to means for the detection and characterization of nucleic acid sequences, as well as variations in nucleic acid sequences. The present invention also relates to methods for forming a nucleic acid cleavage structure on a target sequence and cleaving the nucleic acid cleavage structure in a site-specific manner. The structure-specific nuclease activity of a variety of enzymes is used to cleave the target-dependent cleavage structure, thereby indicating the presence of specific nucleic acid sequences or specific variations thereof. The present invention further relates to methods and devices for the separation of nucleic acid molecules based on charge. The present invention also provides methods for the detection of non-target cleavage products via the formation of a complete and activated protein binding region. The invention further provides sensitive and specific methods for the detection of human cytomegalovirus nucleic acid in a sample.

  12. Detection of nucleic acids by multiple sequential invasive cleavages

    DOEpatents

    Hall, Jeff G; Lyamichev, Victor I; Mast, Andrea L; Brow, Mary Ann D

    2012-10-16

    The present invention relates to means for the detection and characterization of nucleic acid sequences, as well as variations in nucleic acid sequences. The present invention also relates to methods for forming a nucleic acid cleavage structure on a target sequence and cleaving the nucleic acid cleavage structure in a site-specific manner. The structure-specific nuclease activity of a variety of enzymes is used to cleave the target-dependent cleavage structure, thereby indicating the presence of specific nucleic acid sequences or specific variations thereof. The present invention further relates to methods and devices for the separation of nucleic acid molecules based on charge. The present invention also provides methods for the detection of non-target cleavage products via the formation of a complete and activated protein binding region. The invention further provides sensitive and specific methods for the detection of human cytomegalovirus nucleic acid in a sample.

  13. An improved method for detecting circulating microRNAs with S-Poly(T) Plus real-time PCR

    PubMed Central

    Niu, Yanqin; Zhang, Limin; Qiu, Huiling; Wu, Yike; Wang, Zhiwei; Zai, Yujia; Liu, Lin; Qu, Junle; Kang, Kang; Gou, Deming

    2015-01-01

    We herein describe a simple, sensitive and specific method for the analysis of circulating microRNAs (miRNAs), termed the S-Poly(T) Plus real-time PCR assay. This new method is based on our previously developed S-Poly(T) method, in which a unique S-Poly(T) primer is used during reverse transcription to increase sensitivity and specificity. Further increases in the sensitivity and simplicity of S-Poly(T) Plus, in comparison with the S-Poly(T) method, were achieved by a single-step, multiple-stage reaction, in which RNAs were polyadenylated and reverse-transcribed at the same time. The sensitivity of circulating miRNA detection was further improved by a modified method of total RNA isolation from serum/plasma, S/P miRsol, in which glycogen was used to increase the RNA yield. We validated our methods by quantifying miRNA expression profiles in the sera of patients with pulmonary arterial hypertension associated with congenital heart disease. In conclusion, we developed a simple, sensitive, and specific method for detecting circulating miRNAs that allows the measurement of 266 miRNAs from 100 μl of serum or plasma. This method presents a promising tool for basic miRNA research and for the clinical diagnosis of human diseases based on miRNA biomarkers. PMID:26459910

  14. Differentially co-expressed interacting protein pairs discriminate samples under distinct stages of HIV type 1 infection.

    PubMed

    Yoon, Dukyong; Kim, Hyosil; Suh-Kim, Haeyoung; Park, Rae Woong; Lee, KiYoung

    2011-01-01

    Microarray analyses based on differentially expressed genes (DEGs) have been widely used to distinguish samples across different cellular conditions. However, studies based on DEGs have not been able to clearly determine significant differences between samples of pathophysiologically similar HIV-1 stages, e.g., between acute and chronic progressive (or AIDS) or between uninfected and clinically latent stages. We here suggest a novel approach to allow such discrimination based on stage-specific genetic features of HIV-1 infection. Our approach is based on co-expression changes of genes known to interact. The method can identify a genetic signature for a single sample, in contrast with existing protein-protein-based analyses with correlational designs. Our approach distinguishes each sample using differentially co-expressed interacting protein pairs (DEPs) based on co-expression scores of individual interacting pairs within a sample. The co-expression score is positive if the two genes in a sample are simultaneously up-regulated or down-regulated, and its absolute value is higher if the expression-change ratios of the two genes are similar. We compared the characteristics of DEPs with those of DEGs by evaluating their usefulness in separating HIV-1 stages, and we identified DEP-based network modules and their gene-ontology enrichment to find the HIV-1 stage-specific gene signature. Based on the DEP approach, we observed clear separation among samples from distinct HIV-1 stages using clustering and principal component analyses. Moreover, the discrimination power of DEPs on the samples (70-100% accuracy) was much higher than that of DEGs (35-45%) using several well-known classifiers. DEP-based network analysis also revealed the HIV-1 stage-specific network modules; the main biological processes were related to "translation," "RNA splicing," "mRNA, RNA, and nucleic acid transport," and "DNA metabolism." Through the HIV-1 stage-related modules, changing stage-specific patterns of protein interactions could be observed. The DEP-based method discriminated the HIV-1 infection stages clearly and revealed an HIV-1 stage-specific gene signature. The proposed DEP-based method might complement existing DEG-based approaches in various microarray expression analyses.
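
    The abstract states the intended behaviour of the per-sample co-expression score (positive when both interacting genes move in the same direction, larger in magnitude when their expression-change ratios are similar) but not its exact formula; the Python sketch below is therefore only one plausible score with those properties, applied to hypothetical log fold-changes, and should not be read as the published definition.

```python
import numpy as np

def coexpression_score(log_ratio_a, log_ratio_b):
    """One plausible per-sample co-expression score for an interacting gene pair.

    Positive if both genes are up- or both down-regulated, negative otherwise;
    magnitude grows as the two expression-change ratios become more similar.
    (Illustrative only -- the published score may differ.)
    """
    if log_ratio_a == 0 or log_ratio_b == 0:
        return 0.0
    sign = np.sign(log_ratio_a) * np.sign(log_ratio_b)
    similarity = min(abs(log_ratio_a), abs(log_ratio_b)) / max(abs(log_ratio_a), abs(log_ratio_b))
    magnitude = (abs(log_ratio_a) + abs(log_ratio_b)) / 2.0
    return sign * similarity * magnitude

# Hypothetical log2 fold-changes of two interacting genes in one sample.
print(coexpression_score(1.9, 2.1))    # both up, similar ratios -> strongly positive
print(coexpression_score(1.9, -0.3))   # discordant -> negative
```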

  15. Learning predictive models that use pattern discovery--a bootstrap evaluative approach applied in organ functioning sequences.

    PubMed

    Toma, Tudor; Bosman, Robert-Jan; Siebes, Arno; Peek, Niels; Abu-Hanna, Ameen

    2010-08-01

    An important problem in the Intensive Care is how to predict, on a given day of stay, the eventual hospital mortality for a specific patient. A recent approach to this problem suggested the use of frequent temporal sequences (FTSs) as predictors. Methods following this approach were evaluated in the past by inducing a model from a training set and validating the prognostic performance on an independent test set. Although this evaluative approach addresses the validity of the specific models induced in an experiment, it falls short of evaluating the inductive method itself. To achieve this, one must account for the inherent sources of variation in the experimental design. The main aim of this work is to demonstrate a procedure based on bootstrapping, specifically the .632 bootstrap procedure, for evaluating inductive methods that discover patterns, such as FTSs. A second aim is to apply this approach to find out whether a recently suggested inductive method that discovers FTSs of organ functioning status is superior to a traditional method that does not use temporal sequences when compared on each successive day of stay at the Intensive Care Unit. The use of bootstrapping with logistic regression using pre-specified covariates is known in the statistical literature. Using inductive methods that build prognostic models based on temporal sequence discovery within the bootstrap procedure is, however, novel, at least for predictive models in the Intensive Care. Our results of applying the bootstrap-based evaluative procedure demonstrate the superiority of the FTS-based inductive method over the traditional method in terms of discrimination as well as accuracy. In addition, we illustrate the insights gained by the analyst into the discovered FTSs from the bootstrap samples. Copyright 2010 Elsevier Inc. All rights reserved.
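
    A minimal sketch of the .632 bootstrap estimate on which the evaluative procedure is based, assuming NumPy, simulated data, and scikit-learn's LogisticRegression as a stand-in inductive method (in the paper, pattern discovery and model induction are carried out within the bootstrap procedure, which is what makes it evaluate the method rather than a single fitted model): the estimate combines the apparent (resubstitution) error with the average out-of-bag error as 0.368 * err_apparent + 0.632 * err_oob.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(42)
n, p = 300, 5
X = rng.standard_normal((n, p))
y = (X[:, 0] + 0.5 * X[:, 1] + rng.standard_normal(n) > 0).astype(int)

def error_rate(model, X, y):
    return np.mean(model.predict(X) != y)

# Apparent error: fit and evaluate on the full sample.
apparent = error_rate(LogisticRegression(max_iter=1000).fit(X, y), X, y)

# Bootstrap out-of-bag error, averaged over B replicates.
B, oob_errors = 200, []
for _ in range(B):
    idx = rng.integers(0, n, size=n)          # bootstrap sample (with replacement)
    oob = np.setdiff1d(np.arange(n), idx)     # cases not drawn form the test set
    if oob.size == 0:
        continue
    model = LogisticRegression(max_iter=1000).fit(X[idx], y[idx])
    oob_errors.append(error_rate(model, X[oob], y[oob]))

err_632 = 0.368 * apparent + 0.632 * np.mean(oob_errors)
print(f".632 bootstrap error estimate: {err_632:.3f}")
```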

  16. Prediction of TF target sites based on atomistic models of protein-DNA complexes

    PubMed Central

    Angarica, Vladimir Espinosa; Pérez, Abel González; Vasconcelos, Ana T; Collado-Vides, Julio; Contreras-Moreira, Bruno

    2008-01-01

    Background The specific recognition of genomic cis-regulatory elements by transcription factors (TFs) plays an essential role in the regulation of coordinated gene expression. Studying the mechanisms determining binding specificity in protein-DNA interactions is thus an important goal. Most current approaches for modeling TF specific recognition rely on the knowledge of large sets of cognate target sites and consider only the information contained in their primary sequence. Results Here we describe a structure-based methodology for predicting sequence motifs starting from the coordinates of a TF-DNA complex. Our algorithm combines information regarding the direct and indirect readout of DNA into an atomistic statistical model, which is used to estimate the interaction potential. We first measure the ability of our method to correctly estimate the binding specificities of eight prokaryotic and eukaryotic TFs that belong to different structural superfamilies. Secondly, the method is applied to two homology models, finding that sampling of interface side-chain rotamers remarkably improves the results. Thirdly, the algorithm is compared with a reference structural method based on contact counts, obtaining comparable predictions for the experimental complexes and more accurate sequence motifs for the homology models. Conclusion Our results demonstrate that atomic-detail structural information can be feasibly used to predict TF binding sites. The computational method presented here is universal and might be applied to other systems involving protein-DNA recognition. PMID:18922190

  17. Combining Evidence of Preferential Gene-Tissue Relationships from Multiple Sources

    PubMed Central

    Guo, Jing; Hammar, Mårten; Öberg, Lisa; Padmanabhuni, Shanmukha S.; Bjäreland, Marcus; Dalevi, Daniel

    2013-01-01

    An important challenge in drug discovery and disease prognosis is to predict genes that are preferentially expressed in one or a few tissues, i.e. showing a considerably higher expression in one tissue(s) compared to the others. Although several data sources and methods have been published explicitly for this purpose, they often disagree, and it is not evident how to retrieve these genes and how to distinguish true biological findings from those that are due to choice of method and/or experimental settings. In this work we have developed a computational approach that combines results from multiple methods and datasets with the aim of eliminating method/study-specific biases and improving the predictability of preferentially expressed human genes. A rule-based score is used to merge and assign support to the results. Five sets of genes with known tissue specificity were used for parameter pruning and cross-validation. In total we identify 3434 tissue-specific genes. We compare the genes of highest scores with the public databases: PaGenBase (microarray), TiGER (EST) and HPA (protein expression data). The results show 85% overlap with PaGenBase, 71% with TiGER and only 28% with HPA. 99% of our predictions have support from at least one of these databases. Our approach also performs better than any of the databases on identifying drug targets and biomarkers with known tissue-specificity. PMID:23950964

  18. A new automated NaCl based robust method for routine production of gallium-68 labeled peptides

    PubMed Central

    Schultz, Michael K.; Mueller, Dirk; Baum, Richard P.; Watkins, G. Leonard; Breeman, Wouter A. P.

    2017-01-01

    A new NaCl based method for the preparation of gallium-68 labeled radiopharmaceuticals has been adapted for use with an automated gallium-68 generator system. The method was evaluated based on 56 preparations of [68Ga]DOTATOC and compared to a similar acetone-based approach. Advantages of the new NaCl approach include reduced preparation time (< 15 min) and removal of organic solvents. The method produces a high peptide-bound fraction (> 97%) and high specific activity (> 40 MBq nmole−1 [68Ga]DOTATOC), and is well suited for clinical production of radiopharmaceuticals. PMID:23026223

  19. Recommendation Method for Build-to-Order Products Considering Substitutability of Specifications and Stock Consumption Balance of Components

    NASA Astrophysics Data System (ADS)

    Shimoda, Atsushi; Kosugi, Hidenori; Karino, Takafumi; Komoda, Norihisa

    This study focuses on a stock reduction method for build-to-order (BTO) products that moves surplus parts out to the market through sale by recommendation. A sale by recommendation is repeated in each business negotiation using a recommended configuration selected from the inventory of parts to minimize the stock deficiency or excess at the end of a certain period of the production plan. The method is based on the potential of a customer specification to be replaced by an alternative one if the alternative is close to the initial customer specification. A recommendation method is proposed that decides the recommended product configuration by balancing part consumption so that the alternative specification of the configuration is close enough to the initial customer specification for substitutability. The method was evaluated by a simulation using real BTO manufacturing data, and the results demonstrate that the imbalance in the consumption of the parts inventory is reduced.

  20. A Comparison of Evaluation Metrics for Biomedical Journals, Articles, and Websites in Terms of Sensitivity to Topic

    PubMed Central

    Fu, Lawrence D.; Aphinyanaphongs, Yindalon; Wang, Lily; Aliferis, Constantin F.

    2011-01-01

    Evaluating the biomedical literature and health-related websites for quality are challenging information retrieval tasks. Current commonly used methods include impact factor for journals, PubMed’s clinical query filters and machine learning-based filter models for articles, and PageRank for websites. Previous work has focused on the average performance of these methods without considering the topic, and it is unknown how performance varies for specific topics or focused searches. Clinicians, researchers, and users should be aware when expected performance is not achieved for specific topics. The present work analyzes the behavior of these methods for a variety of topics. Impact factor, clinical query filters, and PageRank vary widely across different topics while a topic-specific impact factor and machine learning-based filter models are more stable. The results demonstrate that a method may perform excellently on average but struggle when used on a number of narrower topics. Topic adjusted metrics and other topic robust methods have an advantage in such situations. Users of traditional topic-sensitive metrics should be aware of their limitations. PMID:21419864

  1. Identifying functional cancer-specific miRNA-mRNA interactions in testicular germ cell tumor.

    PubMed

    Sedaghat, Nafiseh; Fathy, Mahmood; Modarressi, Mohammad Hossein; Shojaie, Ali

    2016-09-07

    Testicular cancer is the most common cancer in men aged between 15 and 35, and more than 90% of testicular neoplasms originate from germ cells. Recent research has shown the impact of microRNAs (miRNAs) in different types of cancer, including testicular germ cell tumor (TGCT). MicroRNAs are small non-coding RNAs which affect the development and progression of cancer cells by binding to mRNAs and regulating their expression. The identification of functional miRNA-mRNA interactions in cancers, i.e. those that alter the expression of genes in cancer cells, can help delineate post-regulatory mechanisms and may lead to new treatments to control the progression of cancer. A number of sequence-based methods have been developed to predict miRNA-mRNA interactions based on the complementarity of sequences. While necessary, sequence complementarity is, however, not sufficient for the presence of functional interactions. Alternative methods have thus been developed to refine the sequence-based interactions using concurrent expression profiles of miRNAs and mRNAs. This study aims to find functional cancer-specific miRNA-mRNA interactions in TGCT. To this end, the sequence-based predicted interactions are first refined using an ensemble learning method based on two well-known methods of learning miRNA-mRNA interactions, namely TaLasso and GenMiR++. Additional functional analyses were then used to identify a subset of interactions most likely to be functional and specific to TGCT. The final list of 13 miRNA-mRNA interactions provides potential targets for identifying TGCT-specific interactions and for future laboratory experiments to develop new therapies. Copyright © 2016 Elsevier Ltd. All rights reserved.

  2. EVALUATION OF HOST SPECIFIC PCR-BASED METHODS FOR THE IDENTIFICATION OF FECAL POLLUTION

    EPA Science Inventory

    Microbial Source Tracking (MST) is an approach to determine the origin of fecal pollution impacting a body of water. MST is based on the assumption that, given the appropriate method and indicator, the source of microbial pollution can be identified. One of the key elements of...

  3. Method for indexing and retrieving manufacturing-specific digital imagery based on image content

    DOEpatents

    Ferrell, Regina K.; Karnowski, Thomas P.; Tobin, Jr., Kenneth W.

    2004-06-15

    A method for indexing and retrieving manufacturing-specific digital images based on image content comprises three steps. First, at least one feature vector can be extracted from a manufacturing-specific digital image stored in an image database. In particular, each extracted feature vector corresponds to a particular characteristic of the manufacturing-specific digital image, for instance, a digital image modality and overall characteristic, a substrate/background characteristic, and an anomaly/defect characteristic. Notably, the extracting step includes generating a defect mask using a detection process. Second, using an unsupervised clustering method, each extracted feature vector can be indexed in a hierarchical search tree. Third, a manufacturing-specific digital image associated with a feature vector stored in the hierarchical search tree can be retrieved, wherein the manufacturing-specific digital image has image content comparably related to the image content of the query image. More particularly, retrieval can include two data reductions, the first performed based upon a query vector extracted from a query image. Subsequently, a user can select relevant images resulting from the first data reduction. From the selection, a prototype vector can be calculated, from which a second-level data reduction can be performed. The second-level data reduction can result in a subset of feature vectors comparable to the prototype vector, and further comparable to the query vector. An additional fourth step can include managing the hierarchical search tree by substituting a vector average for several redundant feature vectors encapsulated by nodes in the hierarchical search tree.
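
    A minimal sketch of the two-stage retrieval flow described in the patent abstract, assuming NumPy, a flat array of pre-extracted feature vectors in place of the hierarchical search tree, and Euclidean distance: a first data reduction ranks database images against the query vector, and a second reduction re-ranks them against a prototype vector averaged from user-selected relevant images.

```python
import numpy as np

rng = np.random.default_rng(1)
database = rng.standard_normal((1000, 64))   # pre-extracted feature vectors (hypothetical)
query = rng.standard_normal(64)              # feature vector of the query image

def nearest(vectors, target, k):
    """Indices of the k vectors closest to target (Euclidean distance)."""
    d = np.linalg.norm(vectors - target, axis=1)
    return np.argsort(d)[:k]

# First-level data reduction: candidates closest to the query vector.
candidates = nearest(database, query, k=50)

# User marks some candidates as relevant (here: hypothetically the first five).
relevant = candidates[:5]
prototype = database[relevant].mean(axis=0)  # prototype vector from the selection

# Second-level data reduction: re-rank candidates against the prototype.
refined = candidates[nearest(database[candidates], prototype, k=10)]
print("retrieved image indices:", refined)
```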

  4. Worker-specific exposure monitor and method for surveillance of workers

    DOEpatents

    Lovejoy, Michael L.; Peeters, John P.; Johnson, A. Wayne

    2000-01-01

    A person-specific monitor that provides sensor information regarding hazards to which the person is exposed and means to geolocate the person at the time of the exposure. The monitor also includes means to communicate with a remote base station. Information from the monitor can be downloaded at the base station for long term storage and analysis. The base station can also include means to recharge the monitor.

  5. The Effect of Teaching Methods and Learning Style on Learning Program Design in Web-Based Education Systems

    ERIC Educational Resources Information Center

    Hung, Yen-Chu

    2012-01-01

    The instructional value of web-based education systems has been an important area of research in information systems education. This study investigates the effect of various teaching methods on program design learning for students with specific learning styles in web-based education systems. The study takes first-year Computer Science and…

  6. System and method for air temperature control in an oxygen transport membrane based reactor

    DOEpatents

    Kelly, Sean M

    2016-09-27

    A system and method for air temperature control in an oxygen transport membrane based reactor is provided. The system and method involves introducing a specific quantity of cooling air or trim air in between stages in a multistage oxygen transport membrane based reactor or furnace to maintain generally consistent surface temperatures of the oxygen transport membrane elements and associated reactors. The associated reactors may include reforming reactors, boilers or process gas heaters.

  7. System and method for temperature control in an oxygen transport membrane based reactor

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kelly, Sean M.

    A system and method for temperature control in an oxygen transport membrane based reactor is provided. The system and method involves introducing a specific quantity of cooling air or trim air in between stages in a multistage oxygen transport membrane based reactor or furnace to maintain generally consistent surface temperatures of the oxygen transport membrane elements and associated reactors. The associated reactors may include reforming reactors, boilers or process gas heaters.

  8. Pyrolyzed-parylene based sensors and method of manufacture

    NASA Technical Reports Server (NTRS)

    Tai, Yu-Chong (Inventor); Liger, Matthieu (Inventor); Miserendino, Scott (Inventor); Konishi, Satoshi (Inventor)

    2007-01-01

    A method (and resulting structure) for fabricating a sensing device. The method includes providing a substrate comprising a surface region and forming an insulating material overlying the surface region. The method also includes forming a film of carbon based material overlying the insulating material and treating the film of carbon based material to pyrolyze it, causing formation of a film of substantially carbon based material having a resistivity ranging within a predetermined range. The method also provides at least a portion of the pyrolyzed carbon based material in a sensor application and uses the portion of the pyrolyzed carbon based material in the sensing application. In a specific embodiment, the sensing application is selected from chemical, humidity, piezoelectric, radiation, mechanical strain or temperature.

  9. A method for measuring different classes of human immunoglobulins specific for the penicilloyl group

    PubMed Central

    Wheeler, A. W.

    1971-01-01

    A method is described for the detection of human immunoglobulins of the four main classes specific for the penicilloyl group. The technique is an adaptation of the red cell linked antigen antiglobulin reaction based on the finding that benzyl penicilloylated rabbit γ-globulin, specific for human erythrocytes, reacted specifically with erythrocytes but did not agglutinate them. In turn this complex reacted specifically with human penicilloyl antibody and it was then possible to titrate each immunoglobulin class by the addition of anti-immunoglobulin sera. The method described here was used to compare titres of penicilloyl-specific immunoglobulins of the same class between different sera. The test was found to be less sensitive than the hapten-modified bacteriophage reduction test but had the advantage that individual immunoglobulin classes could be compared. In the absence of a reliable method for the diagnosis of penicillin allergy, it is hoped that the technique described will be a useful addition to existing in vivo and in vitro methods of determining the antibody response of the patient to the penicilloyl group. PMID:4105475

  10. Validation of Field Methods to Assess Body Fat Percentage in Elite Youth Soccer Players.

    PubMed

    Munguia-Izquierdo, Diego; Suarez-Arrones, Luis; Di Salvo, Valter; Paredes-Hernandez, Victor; Alcazar, Julian; Ara, Ignacio; Kreider, Richard; Mendez-Villanueva, Alberto

    2018-05-01

    This study determined the most effective field method for quantifying body fat percentage in male elite youth soccer players and developed prediction equations based on anthropometric variables. Forty-four male elite-standard youth soccer players aged 16.3-18.0 years underwent body fat percentage assessments, including bioelectrical impedance analysis and the calculation of various skinfold-based prediction equations. Dual X-ray absorptiometry provided a criterion measure of body fat percentage. Correlation coefficients, bias, limits of agreement, and differences were used as validity measures, and regression analyses were used to develop soccer-specific prediction equations. The equations from Sarria et al. (1998) and Durnin & Rahaman (1967) showed very large correlations and the lowest biases, and the differences between methods reached neither the practically worthwhile difference nor the substantial difference. The new youth soccer-specific skinfold equation included a combination of triceps and supraspinale skinfolds. None of the practical methods compared in this study are adequate for estimating body fat percentage in male elite youth soccer players, except for the equations from Sarria et al. (1998) and Durnin & Rahaman (1967). The new youth soccer-specific equation calculated in this investigation is the only field method specifically developed and validated in elite male players, and it shows potentially good predictive power. © Georg Thieme Verlag KG Stuttgart · New York.

  11. Research on segmentation based on multi-atlas in brain MR image

    NASA Astrophysics Data System (ADS)

    Qian, Yuejing

    2018-03-01

    Accurate segmentation of specific tissues in brain MR images can be effectively achieved with the multi-atlas-based segmentation method, and the accuracy mainly depends on the image registration accuracy and the fusion scheme. This paper proposes an automatic multi-atlas-based segmentation method for brain MR images. Firstly, to improve the registration accuracy in the area to be segmented, we employ a target-oriented image registration method for the refinement. Then, in the label fusion step, we propose a new algorithm that detects abnormal sparse patches and simultaneously discards the corresponding abnormal sparse coefficients; labels are then estimated from the remaining sparse coefficients combined with a multipoint label estimator strategy. The performance of the proposed method was compared with those of the nonlocal patch-based label fusion method (Nonlocal-PBM), the sparse patch-based label fusion method (Sparse-PBM) and the majority voting method (MV). Our experimental results show that the proposed method is efficient for brain MR image segmentation compared with the MV, Nonlocal-PBM, and Sparse-PBM methods.
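
    As a minimal point of reference for the fusion schemes compared above, the sketch below (NumPy, hypothetical registered atlas labels) implements plain majority voting (MV) per voxel; the nonlocal and sparse patch-based variants replace the equal votes with patch-similarity or sparse-coding weights, and the proposed method additionally detects and discards abnormal sparse patches, which is not reproduced here.

```python
import numpy as np

def majority_vote(atlas_labels):
    """Fuse per-voxel labels from multiple registered atlases by majority voting.

    atlas_labels: (n_atlases, *volume_shape) integer label array.
    """
    n_atlases = atlas_labels.shape[0]
    flat = atlas_labels.reshape(n_atlases, -1)
    fused = np.array([np.bincount(flat[:, v]).argmax() for v in range(flat.shape[1])])
    return fused.reshape(atlas_labels.shape[1:])

# Hypothetical toy example: 4 atlases propagating labels onto a 2x3 target image.
atlas_labels = np.array([
    [[0, 1, 1], [2, 2, 0]],
    [[0, 1, 1], [2, 0, 0]],
    [[0, 1, 2], [2, 2, 0]],
    [[1, 1, 1], [2, 2, 0]],
])
print(majority_vote(atlas_labels))
# -> [[0 1 1]
#     [2 2 0]]
```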

  12. EVALUATE THE UTILITY OF ENTEROCOCCI AND BACTEROIDES AS INDICATORS OF THE SOURCES OF FECAL CONTAMINATION IN IMPAIRED SUBWATERSHEDS THROUGH DNA-BASED MOLECULAR TECHNIQUES.

    EPA Science Inventory

    Microbial source tracking (MST) is based on the assumption that specific strains of bacteria are associated with specific host species. MST methods are attractive because their application on environmental samples could help define the nature of water quality problems in impaire...

  13. Gel-based methods in redox proteomics.

    PubMed

    Charles, Rebecca; Jayawardhana, Tamani; Eaton, Philip

    2014-02-01

    The key to understanding the full significance of oxidants in health and disease is the development of tools and methods that allow the study of proteins that sense and transduce changes in cellular redox. Oxidant-reactive deprotonated thiols commonly operate as redox sensors in proteins and a variety of methods have been developed that allow us to monitor their oxidative modification. This outline review specifically focuses on gel-based methods used to detect, quantify and identify protein thiol oxidative modifications. The techniques we discuss fall into one of two broad categories. Firstly, methods that allow oxidation of thiols in specific proteins or the global cellular pool to be monitored are discussed. These typically utilise thiol-labelling reagents that add a reporter moiety (e.g. affinity tag, fluorophore, chromophore), in which loss of labelling signifies oxidation. Secondly, we outline methods that allow specific thiol oxidation states of proteins (e.g. S-sulfenylation, S-nitrosylation, S-thionylation and interprotein disulfide bond formation) to be investigated. A variety of different gel-based methods for identifying thiol proteins that are sensitive to oxidative modifications have been developed. These methods can aid the detection and quantification of thiol redox state, as well as identifying the sensor protein. By understanding how cellular redox is sensed and transduced to a functional effect by protein thiol redox sensors, this will help us better appreciate the role of oxidants in health and disease. This article is part of a Special Issue entitled Current methods to study reactive oxygen species - pros and cons and biophysics of membrane proteins. Guest Editor: Christine Winterbourn. Copyright © 2013 Elsevier B.V. All rights reserved.

  14. Determination of allergenic egg proteins in food by protein-, mass spectrometry-, and DNA-based methods.

    PubMed

    Lee, Ji-Yun; Kim, Chang Jong

    2010-01-01

    Egg allergy is one of the most common food allergies in both adults and children, and foods including eggs and their byproducts should be declared under food allergen labeling policies in industrial countries. Therefore, to develop and validate a sensitive and specific method to detect hidden egg allergens in foods, we compared immunochemical, DNA-based, and proteomic methods for detecting egg allergens in foods using egg allergen standards such as egg whole protein, egg white protein, egg yolk protein, ovomucoid, ovalbumin, ovotransferrin, lysozyme, and alpha-livetin. Protein-based immunochemical methods, including ELISA as an initial screening quantitative analysis and immunoblotting as a final confirmatory qualitative analysis, were very sensitive and specific in detecting potentially allergenic egg residues in processed foods in trace amounts. In contrast, the proteomics-based, matrix-assisted laser desorption/ionization time-of-flight MS and LC-tandem quadrupole time-of-flight MS methods were not able to detect some egg allergens, such as ovomucoid, because of its nondenaturing property under urea and trypsin. The DNA-based PCR method could not distinguish between egg and chicken meat because it is tissue-nonspecific. In further studies for the feasibility of these immunochemical methods on 100 real raw dietary samples, four food samples without listed egg ingredients produced a positive response by ELISA, but exhibited negative results by immunoblotting.

  15. Selective functionalization of carbon nanotubes

    NASA Technical Reports Server (NTRS)

    Strano, Michael S. (Inventor); Usrey, Monica (Inventor); Barone, Paul (Inventor); Dyke, Christopher A. (Inventor); Tour, James M. (Inventor); Kittrell, W. Carter (Inventor); Hauge, Robert H. (Inventor); Smalley, Richard E. (Inventor)

    2009-01-01

    The present invention is directed toward methods of selectively functionalizing carbon nanotubes of a specific type or range of types, based on their electronic properties, using diazonium chemistry. The present invention is also directed toward methods of separating carbon nanotubes into populations of specific types or range(s) of types via selective functionalization and electrophoresis, and also to the novel compositions generated by such separations.

  16. Enumeration of antigen-specific CD8+ T lymphocytes by single-platform, HLA tetramer-based flow cytometry: a European multicenter evaluation.

    PubMed

    Heijnen, Ingmar A F M; Barnett, David; Arroz, Maria J; Barry, Simon M; Bonneville, Marc; Brando, Bruno; D'hautcourt, Jean-Luc; Kern, Florian; Tötterman, Thomas H; Marijt, Erik W A; Bossy, David; Preijers, Frank W M B; Rothe, Gregor; Gratama, Jan W

    2004-11-01

    HLA class I peptide tetramers represent powerful diagnostic tools for detection and monitoring of antigen-specific CD8(+) T cells. The impetus for the current multicenter study is the critical need to standardize tetramer flow cytometry if it is to be implemented as a routine diagnostic assay. Hence, the European Working Group on Clinical Cell Analysis set out to develop and evaluate a single-platform tetramer-based method that used cytomegalovirus (CMV) as the antigenic model. Absolute numbers of CMV-specific CD8(+) T cells were obtained by combining the percentage of tetramer-binding cells with the absolute CD8(+) T-cell count. Six send-outs of stabilized blood from healthy individuals or CMV-carrying donors with CMV-specific CD8(+) T-cell counts of 3 to 10 cells/microl were distributed to 7 to 16 clinical sites. These sites were requested to enumerate CD8(+) T cells and, in the case of CMV-positive donors, CMV-specific subsets on three separate occasions using the standard method. Between-site coefficients of variation of less than 10% (absolute CD8(+) T-cell counts) and approximately 30% (percentage and absolute numbers of CMV-specific CD8(+) T cells) were achieved. Within-site coefficients of variation were approximately 5% (absolute CD8(+) T-cell counts), approximately 9% (percentage CMV-specific CD8(+) T cells), and approximately 17% (absolute CMV-specific CD8(+) T-cell counts). The degree of variation tended to correlate inversely with the proportion of CMV-specific CD8(+) T-cell subsets. The single-platform MHC tetramer-based method for antigen-specific CD8(+) T-cell counting has been evaluated by a European group of laboratories and can be considered a reproducible assay for routine enumeration of antigen-specific CD8(+) T cells. (c) 2004 Wiley-Liss, Inc.

  17. Connectivity-based fixel enhancement: Whole-brain statistical analysis of diffusion MRI measures in the presence of crossing fibres

    PubMed Central

    Raffelt, David A.; Smith, Robert E.; Ridgway, Gerard R.; Tournier, J-Donald; Vaughan, David N.; Rose, Stephen; Henderson, Robert; Connelly, Alan

    2015-01-01

    In brain regions containing crossing fibre bundles, voxel-average diffusion MRI measures such as fractional anisotropy (FA) are difficult to interpret, and lack within-voxel single fibre population specificity. Recent work has focused on the development of more interpretable quantitative measures that can be associated with a specific fibre population within a voxel containing crossing fibres (herein we use fixel to refer to a specific fibre population within a single voxel). Unfortunately, traditional 3D methods for smoothing and cluster-based statistical inference cannot be used for voxel-based analysis of these measures, since the local neighbourhood for smoothing and cluster formation can be ambiguous when adjacent voxels may have different numbers of fixels, or ill-defined when they belong to different tracts. Here we introduce a novel statistical method to perform whole-brain fixel-based analysis called connectivity-based fixel enhancement (CFE). CFE uses probabilistic tractography to identify structurally connected fixels that are likely to share underlying anatomy and pathology. Probabilistic connectivity information is then used for tract-specific smoothing (prior to the statistical analysis) and enhancement of the statistical map (using a threshold-free cluster enhancement-like approach). To investigate the characteristics of the CFE method, we assessed sensitivity and specificity using a large number of combinations of CFE enhancement parameters and smoothing extents, using simulated pathology generated with a range of test-statistic signal-to-noise ratios in five different white matter regions (chosen to cover a broad range of fibre bundle features). The results suggest that CFE input parameters are relatively insensitive to the characteristics of the simulated pathology. We therefore recommend a single set of CFE parameters that should give near optimal results in future studies where the group effect is unknown. We then demonstrate the proposed method by comparing apparent fibre density between motor neurone disease (MND) patients with control subjects. The MND results illustrate the benefit of fixel-specific statistical inference in white matter regions that contain crossing fibres. PMID:26004503
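
    A minimal 1-D sketch of the threshold-free cluster enhancement (TFCE) style integration that CFE builds on, assuming NumPy, simple adjacency along one dimension as the notion of cluster extent, and placeholder exponents E = 0.5 and H = 2: each suprathreshold point is credited, across all thresholds, with a term that grows with both cluster extent and threshold height. CFE itself replaces the spatial neighbourhood with structurally connected fixels weighted by probabilistic tractography, which is not reproduced here.

```python
import numpy as np

def tfce_1d(stat, dh=0.1, E=0.5, H=2.0):
    """Threshold-free cluster enhancement of a 1-D statistic map.

    For each threshold h, every point above h is credited with
    extent(h)**E * h**H * dh, where extent is the size of the
    suprathreshold cluster (run of adjacent points) containing it.
    """
    enhanced = np.zeros_like(stat, dtype=float)
    for h in np.arange(dh, stat.max() + dh, dh):
        above = stat >= h
        # Label contiguous runs of suprathreshold points.
        labels = np.cumsum(np.diff(np.concatenate(([0], above.astype(int)))) == 1) * above
        for lab in np.unique(labels[labels > 0]):
            members = labels == lab
            enhanced[members] += members.sum() ** E * h ** H * dh
    return enhanced

stat = np.array([0.1, 0.2, 2.5, 2.7, 2.6, 0.3, 1.2, 0.2])
print(np.round(tfce_1d(stat), 2))
```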

  18. Rectal swab sampling followed by an enrichment culture-based real-time PCR assay to detect Salmonella enterocolitis in children.

    PubMed

    Lin, L-H; Tsai, C-Y; Hung, M-H; Fang, Y-T; Ling, Q-D

    2011-09-01

    Although routine bacterial culture is the traditional reference standard method for the detection of Salmonella infection in children with diarrhoea, it is a time-consuming procedure that usually only gives results after 3-4 days. Some molecular detection methods can improve the turn-around time to within 24 h, but these methods are not applied directly from stool or rectal swab specimens as routine diagnostic methods for the detection of gastrointestinal pathogens. In this study, we tested the feasibility of a bacterial enrichment culture-based real-time PCR assay method for detecting and screening for diarrhoea in children caused by Salmonella. Our results showed that the minimum real-time PCR assay time required to detect enriched bacterial culture from a swab was 3 h. In all children with suspected Salmonella diarrhoea, the enrichment culture-based real-time PCR achieved 85.4% sensitivity and 98.1% specificity, as compared with the 53.7% sensitivity and 100% specificity of detection with the routine bacterial culture method. We suggest that rectal swab sampling followed by enrichment culture-based real-time PCR is suitable as a rapid method for detecting and screening for Salmonella in paediatric patients. © 2011 The Authors. Clinical Microbiology and Infection © 2011 European Society of Clinical Microbiology and Infectious Diseases.

  19. DO TIE LABORATORY BASED METHODS REALLY REFLECT FIELD CONDITIONS

    EPA Science Inventory

    Sediment Toxicity Identification and Evaluation (TIE) methods have been developed for both interstitial waters and whole sediments. These relatively simple laboratory methods are designed to identify specific toxicants or classes of toxicants in sediments; however, the question ...

  20. Microsiemens or Milligrams: Measures of Ionic Mixtures ...

    EPA Pesticide Factsheets

    In December of 2016, EPA released the Draft Field-Based Methods for Developing Aquatic Life Criteria for Specific Conductivity for public comment. Once final, states and authorized tribes may use these methods to derive field-based ecoregional ambient Aquatic Life Ambient Water Quality Criteria (AWQC) for specific conductivity (SC) in flowing waters. The methods provide flexible approaches for developing science-based SC criteria that reflect ecoregional or state-specific factors. The concentration of a dissolved salt mixture can be measured in a number of ways, including measurement of total dissolved solids, freezing point depression, refractive index, density, or the sum of the concentrations of individually measured ions. For the draft method, SC was selected as the measure because SC is a measure of all ions in the mixture; the measurement technology is fast, inexpensive, and accurate, and it measures only dissolved ions. When developing water quality criteria for major ions, some stakeholders may prefer to identify the ionic constituents as a measure of exposure instead of SC. A field-based method was used to derive example chronic and acute water quality criteria for SC and for a common mixture of two anions (bicarbonate plus sulfate, [HCO3−] + [SO42−] in mg/L) that represents common ion mixtures in streams. These two anions are sufficient to model the ion mixture and SC (R2 = 0.94). Using [HCO3−] + [SO42−] does not imply that these two anions are the

  1. A method for the development of disease-specific reference standards vocabularies from textual biomedical literature resources

    PubMed Central

    Wang, Liqin; Bray, Bruce E.; Shi, Jianlin; Fiol, Guilherme Del; Haug, Peter J.

    2017-01-01

    Objective Disease-specific vocabularies are fundamental to many knowledge-based intelligent systems and applications like text annotation, cohort selection, disease diagnostic modeling, and therapy recommendation. Reference standards are critical in the development and validation of automated methods for disease-specific vocabularies. The goal of the present study is to design and test a generalizable method for the development of vocabulary reference standards from expert-curated, disease-specific biomedical literature resources. Methods We formed disease-specific corpora from literature resources like textbooks, evidence-based synthesized online sources, clinical practice guidelines, and journal articles. Medical experts annotated and adjudicated disease-specific terms in four classes (i.e., causes or risk factors, signs or symptoms, diagnostic tests or results, and treatment). Annotations were mapped to UMLS concepts. We assessed source variation, the contribution of each source to build disease-specific vocabularies, the saturation of the vocabularies with respect to the number of used sources, and the generalizability of the method with different diseases. Results The study resulted in 2588 string-unique annotations for heart failure in four classes, and 193 and 425, respectively, for pulmonary embolism and rheumatoid arthritis in the treatment class. Approximately 80% of the annotations were mapped to UMLS concepts. The agreement among heart failure sources ranged between 0.28 and 0.46. The contribution of these sources to the final vocabulary ranged between 18% and 49%. With the sources explored, the heart failure vocabulary reached near saturation in all four classes with the inclusion of a minimum of six sources (or between four and seven sources if only counting terms that occurred in two or more sources). It took fewer sources to reach near saturation for the other two diseases in terms of the treatment class. Conclusions We developed a method for the development of disease-specific reference vocabularies. Expert-curated biomedical literature resources are a substantial source for acquiring disease-specific medical knowledge. It is feasible to reach near saturation in a disease-specific vocabulary using a relatively small number of literature sources. PMID:26971304

  2. Methodology for the specification of communication activities within the framework of a multi-layered architecture: Toward the definition of a knowledge base

    NASA Astrophysics Data System (ADS)

    Amyay, Omar

    A method defined in terms of synthesis and verification steps is presented. The specification of the services and protocols of communication within a multilayered architecture of the Open Systems Interconnection (OSI) type is an essential issue for the design of computer networks. The aim is to obtain an operational specification of the protocol-service couple of a given layer. The planning of synthesis and verification steps constitutes a specification trajectory. The latter is based on the progressive integration of the 'initial data' constraints and on verification of the specification originating from each synthesis step, through validity constraints that characterize an admissible solution. Two types of trajectory are proposed according to the style of the initial specification of the protocol-service couple: an operational type with a service-supplier viewpoint, and a knowledge-property-oriented type with a service viewpoint. Synthesis and verification activities were developed and formalized in terms of labeled transition systems, temporal logic, and epistemic logic. The originality of the second specification trajectory and the use of epistemic logic are shown. An 'artificial intelligence' approach enables a conceptual model to be defined for a knowledge base system implementing the proposed method. It is structured in three levels, representing knowledge of the domain, the reasoning characterizing synthesis and verification activities, and the planning of the steps of a specification trajectory.

  3. Development and evaluation of a PCR-based assay kit for authentication of Zaocys dhumnades in traditional Chinese medicine.

    PubMed

    Zhang, Xiaomei; Zhou, Tingting; Yu, Wenjing; Ai, Jinxia; Wang, Xuesong; Gao, Lijun; Yuan, Guangxin; Li, Mingcheng

    2018-01-01

    We developed a Zaocys dhumnades DNA test kit, and its performance indexes, including specificity, sensitivity, and stability, were evaluated and compared with the method recorded in the Chinese Pharmacopoeia (2010 edition). Bioinformatics tools were used for primer design, sequencing, and BLAST analysis, in conjunction with PCR technology, based on the characteristics of the Z. dhumnades cytochrome b (Cyt b) gene. The efficiency of nucleic acid extraction by the kit was assessed in accordance with the Pharmacopoeia method. Stability testing showed that the kit remained effective after 20 freeze-thaw cycles. The sensitivity results indicated that the lowest amount detected by the kit was 0.025 g of each specimen. The specificity test of the kit was 100% specific. Repeatability tests conducted three times gave the same results. Compared with the method recorded in the Chinese Pharmacopoeia, the PCR-based assay kit developed by our team is accurate and effective in identifying Z. dhumnades; it is also simple and fast, demonstrating broad prospects for quality inspection of Z. dhumnades in the future.

  4. Sequence-specific bias correction for RNA-seq data using recurrent neural networks.

    PubMed

    Zhang, Yao-Zhong; Yamaguchi, Rui; Imoto, Seiya; Miyano, Satoru

    2017-01-25

    The recent success of deep learning techniques in machine learning and artificial intelligence has stimulated a great deal of interest among bioinformaticians, who now wish to bring the power of deep learning to bear on a host of bioinformatics problems. Deep learning is ideally suited for biological problems that require automatic or hierarchical feature representation for biological data when prior knowledge is limited. In this work, we address the sequence-specific bias correction problem for RNA-seq data using Recurrent Neural Networks (RNNs) to model nucleotide sequences without pre-determining sequence structures. The sequence-specific bias of a read is then calculated based on the sequence probabilities estimated by RNNs, and used in the estimation of gene abundance. We explore the application of two popular RNN recurrent units for this task and demonstrate that RNN-based approaches provide a flexible way to model nucleotide sequences without knowledge of predetermined sequence structures. Our experiments show that training an RNN-based nucleotide sequence model is efficient and that RNN-based bias correction methods compare well with the state-of-the-art sequence-specific bias correction method on the commonly used MAQC-III data set. RNNs provide an alternative and flexible way to calculate sequence-specific bias without explicitly pre-determining sequence structures.
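
    As an illustration of the modeling idea described above (not the authors' implementation), the following sketch defines a small GRU-based nucleotide model in PyTorch and scores a read by the log-probability the model assigns to its sequence; in the paper such probabilities feed the sequence-specific bias correction. The model size and the example read are arbitrary assumptions.

      # Minimal sketch of an RNN nucleotide-sequence model: a GRU predicts the next
      # base, and the probability assigned to a read's sequence can then serve as an
      # estimate of sequence-specific bias.
      import torch
      import torch.nn as nn

      BASES = {"A": 0, "C": 1, "G": 2, "T": 3}

      class NucleotideRNN(nn.Module):
          def __init__(self, hidden=32):
              super().__init__()
              self.embed = nn.Embedding(4, 8)
              self.gru = nn.GRU(8, hidden, batch_first=True)
              self.out = nn.Linear(hidden, 4)

          def forward(self, x):                 # x: (batch, seq_len) of base indices
              h, _ = self.gru(self.embed(x))
              return self.out(h)                # logits for the next base at each position

      def sequence_log_prob(model, seq):
          """Log-probability of a read under the model (first base not scored)."""
          idx = torch.tensor([[BASES[b] for b in seq]])
          logits = model(idx[:, :-1])                       # predict bases 2..L
          logp = torch.log_softmax(logits, dim=-1)
          targets = idx[:, 1:]
          return logp.gather(-1, targets.unsqueeze(-1)).sum().item()

      model = NucleotideRNN()
      print(sequence_log_prob(model, "ACGTTGCA"))   # untrained model: roughly L*log(1/4)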

  5. Molecular differentiation of Russian wild ginseng using mitochondrial nad7 intron 3 region.

    PubMed

    Li, Guisheng; Cui, Yan; Wang, Hongtao; Kwon, Woo-Saeng; Yang, Deok-Chun

    2017-07-01

    Cultivated ginseng is often introduced as a substitute and adulterant of Russian wild ginseng due to its lower cost or to misidentification caused by its similarity in appearance with wild ginseng. The aim of this study is to develop a simple and reliable method to differentiate Russian wild ginseng from cultivated ginseng. The mitochondrial NADH dehydrogenase subunit 7 (nad7) intron 3 regions of Russian wild ginseng and Chinese cultivated ginseng were analyzed. Based on the multiple sequence alignment result, a specific primer for Russian wild ginseng was designed by introducing an additional mismatch, and allele-specific polymerase chain reaction (PCR) was performed for identification of wild ginseng. Real-time allele-specific PCR with endpoint analysis was used for validation of the developed Russian wild ginseng single nucleotide polymorphism (SNP) marker. An SNP site specific to Russian wild ginseng was identified by multiple alignment of the mitochondrial nad7 intron 3 regions of different ginseng samples. With the SNP-based specific primer, Russian wild ginseng was successfully discriminated from Chinese and Korean cultivated ginseng samples by allele-specific PCR. The reliability and specificity of the SNP marker were validated by checking 20 individual Russian wild ginseng samples with the real-time allele-specific PCR assay. An effective DNA method for molecular discrimination of Russian wild ginseng from Chinese and Korean cultivated ginseng was developed. The established real-time allele-specific PCR assay is simple and reliable, and the present method should be a crucial complement to chemical analysis for authentication of Russian wild ginseng.

  6. The integrative review: updated methodology.

    PubMed

    Whittemore, Robin; Knafl, Kathleen

    2005-12-01

    The aim of this paper is to distinguish the integrative review method from other review methods and to propose methodological strategies specific to the integrative review method to enhance the rigour of the process. Recent evidence-based practice initiatives have increased the need for and the production of all types of reviews of the literature (integrative reviews, systematic reviews, meta-analyses, and qualitative reviews). The integrative review method is the only approach that allows for the combination of diverse methodologies (for example, experimental and non-experimental research), and has the potential to play a greater role in evidence-based practice for nursing. With respect to the integrative review method, strategies to enhance data collection and extraction have been developed; however, methods of analysis, synthesis, and conclusion drawing remain poorly formulated. A modified framework for research reviews is presented to address issues specific to the integrative review method. Issues related to specifying the review purpose, searching the literature, evaluating data from primary sources, analysing data, and presenting the results are discussed. Data analysis methods of qualitative research are proposed as strategies that enhance the rigour of combining diverse methodologies as well as empirical and theoretical sources in an integrative review. An updated integrative review method has the potential to allow for diverse primary research methods to become a greater part of evidence-based practice initiatives.

  7. Computer-aided diagnostic method for classification of Alzheimer's disease with atrophic image features on MR images

    NASA Astrophysics Data System (ADS)

    Arimura, Hidetaka; Yoshiura, Takashi; Kumazawa, Seiji; Tanaka, Kazuhiro; Koga, Hiroshi; Mihara, Futoshi; Honda, Hiroshi; Sakai, Shuji; Toyofuku, Fukai; Higashida, Yoshiharu

    2008-03-01

    Our goal in this study was to develop a computer-aided diagnostic (CAD) method for classification of Alzheimer's disease (AD) with atrophic image features derived from specific anatomical regions in three-dimensional (3-D) T1-weighted magnetic resonance (MR) images. In this study, the specific regions related to the cerebral atrophy of AD were white matter, gray matter, and cerebrospinal fluid (CSF) regions. Cerebral cortical gray matter regions were determined by extracting the brain and white matter regions with a level set-based method, whose speed function depended on gradient vectors in the original image and pixel values in grown regions. The CSF regions in cerebral sulci and lateral ventricles were extracted by wrapping the brain tightly with a zero level set determined from a level set function. Volumes of the specific regions and the cortical thickness were determined as atrophic image features. Average cortical thickness was calculated in 32 subregions, which were obtained by dividing each brain region. Finally, AD patients were classified by using a support vector machine, which was trained on the image features of AD and non-AD cases. We applied our CAD method to MR images of whole brains obtained from 29 clinically diagnosed AD cases and 25 non-AD cases. As a result, the area under the receiver operating characteristic (ROC) curve obtained by our computerized method was 0.901 based on a leave-one-out test in identification of AD cases among the 54 cases, including 8 AD patients at early stages. The accuracy for discrimination between the 29 AD patients and 25 non-AD subjects was 0.840, determined at the point where the sensitivity equaled the specificity on the ROC curve. These results suggest that our CAD method based on atrophic image features may be promising for detecting AD patients in 3-D MR images.
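
    The classification stage described above can be illustrated with a short scikit-learn sketch: a support vector machine trained on atrophic feature vectors and evaluated with a leave-one-out test and ROC analysis. The feature matrix and labels below are random placeholders, not the study's MR-derived measurements.

      # Minimal sketch of the classification stage: an SVM trained on atrophic image
      # features (volumes, cortical thickness) and evaluated with a leave-one-out
      # test and ROC analysis. The feature matrix here is random placeholder data.
      import numpy as np
      from sklearn.svm import SVC
      from sklearn.model_selection import LeaveOneOut
      from sklearn.metrics import roc_auc_score

      rng = np.random.default_rng(0)
      X = rng.normal(size=(54, 35))         # 54 cases x (3 volumes + 32 thickness values)
      y = rng.integers(0, 2, size=54)       # 1 = AD, 0 = non-AD (placeholder labels)

      scores = np.empty(len(y))
      for train, test in LeaveOneOut().split(X):
          clf = SVC(kernel="rbf", probability=True).fit(X[train], y[train])
          scores[test] = clf.predict_proba(X[test])[:, 1]

      print("leave-one-out ROC AUC:", roc_auc_score(y, scores))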

  8. Application of the Bootstrap Statistical Method in Deriving Vibroacoustic Specifications

    NASA Technical Reports Server (NTRS)

    Hughes, William O.; Paez, Thomas L.

    2006-01-01

    This paper discusses the Bootstrap Method for deriving vibroacoustic test specifications. Vibroacoustic test specifications are necessary to properly accept or qualify a spacecraft and its components for the expected acoustic, random vibration, and shock environments seen on an expendable launch vehicle. Traditionally, NASA and the U.S. Air Force have employed methods of Normal Tolerance Limits to derive these test levels based upon the amount of data available and the probability and confidence levels desired. The Normal Tolerance Limit method contains inherent assumptions about the distribution of the data. The Bootstrap is a distribution-free statistical subsampling method which uses the measured data themselves to establish estimates of statistical measures of random sources. This is achieved through the computation of large numbers of Bootstrap replicates of a data measure of interest and the use of these replicates to derive test levels consistent with the probability and confidence desired. The comparison of the results of these two methods is illustrated via an example utilizing actual spacecraft vibroacoustic data.
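
    A minimal sketch of the bootstrap idea, assuming hypothetical measured levels rather than actual flight data: resample the measurements with replacement, compute the statistic of interest for each replicate, and read the test level off the replicate distribution at the desired probability and confidence (here a P95 level at 50% confidence).

      # Minimal sketch (not NASA's implementation) of deriving a test level by
      # bootstrap subsampling of measured vibroacoustic levels.
      import numpy as np

      rng = np.random.default_rng(1)
      measured_db = rng.normal(130.0, 3.0, size=12)   # hypothetical levels, dB

      def bootstrap_level(data, probability=0.95, confidence=0.50, n_boot=10000):
          reps = np.empty(n_boot)
          for i in range(n_boot):
              sample = rng.choice(data, size=len(data), replace=True)
              reps[i] = np.quantile(sample, probability)   # P95 of each replicate
          return np.quantile(reps, confidence)             # confidence bound on P95

      print("P95/50 test level:", bootstrap_level(measured_db), "dB")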

  9. Assessing differential expression in two-color microarrays: a resampling-based empirical Bayes approach.

    PubMed

    Li, Dongmei; Le Pape, Marc A; Parikh, Nisha I; Chen, Will X; Dye, Timothy D

    2013-01-01

    Microarrays are widely used for examining differential gene expression, identifying single nucleotide polymorphisms, and detecting methylation loci. Multiple testing methods in microarray data analysis aim at controlling both Type I and Type II error rates; however, real microarray data do not always fit their distribution assumptions. Smyth's ubiquitous parametric method, for example, inadequately accommodates violations of normality assumptions, resulting in inflated Type I error rates. The Significance Analysis of Microarrays, another widely used microarray data analysis method, is based on a permutation test and is robust to non-normally distributed data; however, the Significance Analysis of Microarrays method's fold change criteria are problematic and can critically alter the conclusion of a study as a result of compositional changes of the control data set in the analysis. We propose a novel approach combining resampling with empirical Bayes methods: the Resampling-based empirical Bayes Methods. This approach not only reduces false discovery rates for non-normally distributed microarray data, but it is also impervious to the fold change threshold since no control data set selection is needed. Through simulation studies, sensitivities, specificities, total rejections, and false discovery rates are compared across Smyth's parametric method, the Significance Analysis of Microarrays, and the Resampling-based empirical Bayes Methods. Differences in false discovery rate control between the approaches are illustrated through a preterm delivery methylation study. The results show that the Resampling-based empirical Bayes Methods offer significantly higher specificity and lower false discovery rates compared to Smyth's parametric method when data are not normally distributed. The Resampling-based empirical Bayes Methods also offer higher statistical power than the Significance Analysis of Microarrays method when the proportion of significantly differentially expressed genes is large, for both normally and non-normally distributed data. Finally, the Resampling-based empirical Bayes Methods are generalizable to next-generation sequencing RNA-seq data analysis.

  10. Mesh-free based variational level set evolution for breast region segmentation and abnormality detection using mammograms.

    PubMed

    Kashyap, Kanchan L; Bajpai, Manish K; Khanna, Pritee; Giakos, George

    2018-01-01

    Automatic segmentation of abnormal regions is a crucial task in computer-aided detection systems using mammograms. In this work, an automatic abnormality detection algorithm using mammographic images is proposed. In the preprocessing step, a partial differential equation-based variational level set method is used for breast region extraction. The evolution of the level set method is done by applying a mesh-free radial basis function (RBF) method; the limitation of the mesh-based approach is thereby removed. For comparison, the evolution of the variational level set function is also done by a mesh-based finite difference method. Unsharp masking and median filtering are used for mammogram enhancement. Suspicious abnormal regions are segmented by applying fuzzy c-means clustering. Texture features are extracted from the segmented suspicious regions by computing the local binary pattern and dominant rotated local binary pattern (DRLBP). Finally, suspicious regions are classified as normal or abnormal by means of a support vector machine with linear, multilayer perceptron, radial basis, and polynomial kernel functions. The algorithm is validated on 322 sample mammograms of the mammographic image analysis society (MIAS) dataset and 500 mammograms from the digital database for screening mammography (DDSM) dataset. Proficiency of the algorithm is quantified by using sensitivity, specificity, and accuracy. The highest sensitivity, specificity, and accuracy of 93.96%, 95.01%, and 94.48%, respectively, are obtained on the MIAS dataset using the DRLBP feature with the RBF kernel function, whereas the highest values on the DDSM dataset, 92.31% sensitivity, 98.45% specificity, and 96.21% accuracy, are also achieved using the DRLBP feature with the RBF kernel function. Copyright © 2017 John Wiley & Sons, Ltd.

  11. Comparing writing style feature-based classification methods for estimating user reputations in social media.

    PubMed

    Suh, Jong Hwan

    2016-01-01

    In recent years, the anonymous nature of the Internet has made it difficult to detect manipulated user reputations in social media, as well as to ensure the quality of users and their posts. To deal with this, this study designs and examines an automatic approach that adopts writing style features to estimate user reputations in social media. Under varying ways of defining Good and Bad classes of user reputations based on the collected data, it evaluates the classification performance of state-of-the-art methods: four writing style feature sets, i.e., lexical, syntactic, structural, and content-specific, and eight classification techniques, i.e., four base learners (C4.5, Neural Network (NN), Support Vector Machine (SVM), and Naïve Bayes (NB)) and four Random Subspace (RS) ensemble methods based on the four base learners. When South Korea's Web forum, Daum Agora, was selected as a test bed, the experimental results show that the configuration of the full feature set containing content-specific features and RS-SVM, combining RS and SVM, gives the best classification accuracy if the test bed poster reputations are segmented strictly into Good and Bad classes by the portfolio approach. Pairwise t tests on accuracy confirm two expectations from the literature reviews: first, the feature set adding content-specific features outperforms the others; second, ensemble learning methods are more viable than base learners. Moreover, among the four ways of defining the classes of user reputations, i.e., like, dislike, sum, and portfolio, the results show that the portfolio approach gives the highest accuracy.

  12. Targeted, Site-specific quantitation of N- and O-glycopeptides using 18O-labeling and product ion based mass spectrometry.

    PubMed

    Srikanth, Jandhyam; Agalyadevi, Rathinasamy; Babu, Ponnusamy

    2017-02-01

    The site-specific quantitation of N- and O-glycosylation is vital to understanding the function(s) of different glycans expressed at a given site of a protein under physiological and disease conditions. The most commonly used precursor ion intensity-based quantification method is less accurate, and other labeled methods are expensive and require enrichment of glycopeptides. Here, we used glycopeptide product (y and Y0) ions and 18O-labeling of the C-terminal carboxyl group as a strategy to obtain quantitative information about fold-change and relative abundance of most of the glycoforms attached to the glycopeptides. As a proof of concept, the accuracy and robustness of this targeted, relative quantification LC-MS method were demonstrated using Rituximab. Furthermore, the N-glycopeptide quantification results were compared with a biosimilar of Rituximab and validated with quantitative data obtained from the 2-AB-UHPLC-FL method. We further demonstrated the intensity fold-change and relative abundance of 46 unique N- and O-glycopeptides and aglycopeptides from innovator and biosimilar samples of Etanercept using both normal-MS and product ion-based quantitation. The results showed a very similar site-specific expression of N- and O-glycopeptides between the samples, but with subtle differences. Interestingly, we were also able to quantify the macro-heterogeneity of all N- and O-glycopeptides of Etanercept. In addition to applications in biotherapeutics, the developed method can also be used for site-specific quantitation of N- and O-glycopeptides and aglycopeptides of glycoproteins with known glycosylation patterns.

  13. A network function-based definition of communities in complex networks.

    PubMed

    Chauhan, Sanjeev; Girvan, Michelle; Ott, Edward

    2012-09-01

    We consider an alternate definition of community structure that is functionally motivated. We define network community structure based on the function the network system is intended to perform. In particular, as a specific example of this approach, we consider communities whose function is enhanced by the ability to synchronize and/or by resilience to node failures. Previous work has shown that, in many cases, the largest eigenvalue of the network's adjacency matrix controls the onset of both synchronization and percolation processes. Thus, for networks whose functional performance is dependent on these processes, we propose a method that divides a given network into communities based on maximizing a function of the largest eigenvalues of the adjacency matrices of the resulting communities. We also explore the differences between the partitions obtained by our method and the modularity approach (which is based solely on consideration of network structure). We do this for several different classes of networks. We find that, in many cases, modularity-based partitions do almost as well as our function-based method in finding functional communities, even though modularity does not specifically incorporate consideration of function.
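
    The objective sketched below illustrates the idea: for a candidate partition, compute the largest adjacency eigenvalue of each community's subgraph and combine them (here simply summed, one possible choice of the function mentioned above). The partition search itself is not reproduced; the graph and the split are toy examples.

      # Minimal sketch of a function-based community objective: sum of the largest
      # adjacency eigenvalues of the communities' subgraphs for one candidate
      # partition. A real method would search over partitions to maximize this.
      import networkx as nx
      import numpy as np

      def largest_eigenvalue(graph):
          A = nx.to_numpy_array(graph)
          return np.max(np.linalg.eigvalsh(A)) if len(graph) else 0.0

      def partition_score(graph, communities):
          """Sum of the largest eigenvalues of the communities' adjacency matrices."""
          return sum(largest_eigenvalue(graph.subgraph(c)) for c in communities)

      G = nx.karate_club_graph()
      candidate = [set(range(0, 17)), set(range(17, 34))]   # an arbitrary 2-way split
      print("objective:", partition_score(G, candidate))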

  14. Comparative analysis of machine learning methods in ligand-based virtual screening of large compound libraries.

    PubMed

    Ma, Xiao H; Jia, Jia; Zhu, Feng; Xue, Ying; Li, Ze R; Chen, Yu Z

    2009-05-01

    Machine learning methods have been explored as ligand-based virtual screening tools for facilitating drug lead discovery. These methods predict compounds with specific pharmacodynamic, pharmacokinetic, or toxicological properties based on structural and physicochemical properties derived from their structures. Increasing attention has been directed at these methods because of their capability to predict compounds of diverse structures and complex structure-activity relationships without requiring knowledge of the target 3D structure. This article reviews current progress in using machine learning methods for virtual screening of pharmacodynamically active compounds from large compound libraries, and analyzes and compares the reported performances of machine learning tools with those of structure-based and other ligand-based (such as pharmacophore and clustering) virtual screening methods. The feasibility of improving the performance of machine learning methods in screening large libraries is discussed.

  15. DO TIE LABORATORY BASED ASSESSMENT METHODS REALLY PREDICT FIELD EFFECTS?

    EPA Science Inventory

    Sediment Toxicity Identification and Evaluation (TIE) methods have been developed for both porewaters and whole sediments. These relatively simple laboratory methods are designed to identify specific toxicants or classes of toxicants in sediments; however, the question of whethe...

  16. Event-specific qualitative and quantitative PCR detection of the GMO carnation (Dianthus caryophyllus) variety Moonlite based upon the 5'-transgene integration sequence.

    PubMed

    Li, P; Jia, J W; Jiang, L X; Zhu, H; Bai, L; Wang, J B; Tang, X M; Pan, A H

    2012-04-27

    To ensure the implementation of genetically modified organism (GMO) labeling regulations, an event-specific detection method was developed based on the junction sequence of an exogenous integrant in the transgenic carnation variety Moonlite. The 5'-transgene integration sequence was isolated by thermal asymmetric interlaced PCR. Based upon the 5'-transgene integration sequence, event-specific primers and a TaqMan probe were designed to amplify fragments spanning the exogenous DNA and carnation genomic DNA. Qualitative and quantitative PCR assays were developed employing the designed primers and probe. The detection limit of the qualitative PCR assay was 0.05% for Moonlite in 100 ng total carnation genomic DNA, corresponding to about 79 copies of the carnation haploid genome; the limits of detection and quantification of the quantitative PCR assay were estimated to be 38 and 190 copies of haploid carnation genomic DNA, respectively. Carnation samples with different contents of genetically modified components were quantified, and the bias between the observed and true values for three samples was lower than the acceptance criterion (<25%) of the GMO detection method. These results indicated that these event-specific methods would be useful for the identification and quantification of the GMO carnation Moonlite.

  17. Method of improving system performance and survivability through changing function

    NASA Technical Reports Server (NTRS)

    Hinchey, Michael G. (Inventor); Vassev, Emil I. (Inventor)

    2012-01-01

    A biologically-inspired system and method is provided for self-adapting behavior of swarm-based exploration missions, whereby individual components, for example, spacecraft, in the system can sacrifice themselves for the greater good of the entire system. The swarm-based system can exhibit emergent self-adapting behavior. Each component can be configured to exhibit self-sacrifice behavior based on Autonomic System Specification Language (ASSL).

  18. Kernelized Locality-Sensitive Hashing for Fast Image Landmark Association

    DTIC Science & Technology

    2011-03-24

    based Simultaneous Localization and Mapping (SLAM). The problem, however, is that vision-based navigation techniques can require excessive amounts of ... up and optimizing the data association process in vision-based SLAM. Specifically, this work studies the current methods that algorithms use to ... required for location identification than that of other methods. This work can then be extended into a vision-SLAM implementation to subsequently

  19. Pedagogical Strategies Used by Selected Leading Mixed Methodologists in Mixed Research Courses

    ERIC Educational Resources Information Center

    Frels, Rebecca K.; Onwuegbuzie, Anthony J.; Leech, Nancy L.; Collins, Kathleen M. T.

    2014-01-01

    The teaching of research methods is common across multiple fields in the social and educational sciences for establishing evidence-based practices and furthering the knowledge base through scholarship. Yet, specific to mixed methods, scant information exists as to how to approach teaching complex concepts for meaningful learning experiences. Thus,…

  20. Service-Learning's Ongoing Journey as a Method of Instruction: Implications for School-Based Agricultural Education

    ERIC Educational Resources Information Center

    Roberts, Richie; Edwards, M. Craig

    2015-01-01

    American education's journey has witnessed the rise and fall of various progressive education approaches, including service-learning. In many respects, however, service-learning is still undergoing formation and adoption as a teaching method, specifically in School-Based Agricultural Education (SBAE). For this reason, the interest existed to…

  1. Early Dose Response to Yttrium-90 Microsphere Treatment of Metastatic Liver Cancer by a Patient-Specific Method Using Single Photon Emission Computed Tomography and Positron Emission Tomography

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Campbell, Janice M.; Department of Radiation Oncology, Wayne State University, Detroit, MI; Wong, C. Oliver

    2009-05-01

    Purpose: To evaluate a patient-specific single photon emission computed tomography (SPECT)-based method of dose calculation for treatment planning of yttrium-90 (90Y) microsphere selective internal radiotherapy (SIRT). Methods and Materials: Fourteen consecutive 90Y SIRTs for colorectal liver metastasis were retrospectively analyzed. Absorbed dose to tumor and normal liver tissue was calculated by partition methods with two different tumor/normal liver vascularity ratios: an average 3:1 and a patient-specific ratio derived from pretreatment technetium-99m macroaggregated albumin SPECT. Tumor response was quantitatively evaluated from fluorine-18 fluoro-2-deoxy-D-glucose positron emission tomography scans. Results: Positron emission tomography showed a significant decrease in total tumor standardized uptake value (average, 52%). There was a significant difference in the tumor absorbed dose between the average and specific methods (p = 0.009). Response vs. dose curves fit by linear and linear-quadratic modeling showed similar results. Linear fit r values increased for all tumor response parameters with the specific method (+0.20 for mean standardized uptake value). Conclusion: Tumor dose calculated with the patient-specific method was more predictive of response in liver-directed 90Y SIRT.

  2. Evaluation of methods to reduce background using the Python-based ELISA_QC program.

    PubMed

    Webster, Rose P; Cohen, Cinder F; Saeed, Fatima O; Wetzel, Hanna N; Ball, William J; Kirley, Terence L; Norman, Andrew B

    2018-05-01

    Almost all immunological approaches [immunohistochemistry, enzyme-linked immunosorbent assay (ELISA), Western blot] that are used to quantitate specific proteins have had to address high backgrounds due to non-specific reactivity. We report here for the first time a quantitative comparison of methods for reduction of the background of commercial biotinylated antibodies using the Python-based ELISA_QC program. This is demonstrated using a recombinant humanized anti-cocaine monoclonal antibody. Several approaches, such as adjustment of the incubation time and the concentration of blocking agent, as well as the dilution of secondary antibodies, have been explored to address this issue. In this report, systematic comparisons of two different methods, contrasted with other more traditional methods to address this problem, are provided. Addition of heparin (HP) at 1 μg/ml to the wash buffer prior to addition of the secondary biotinylated antibody reduced the elevated background absorbance values (from a mean of 0.313 ± 0.015 to 0.137 ± 0.002). A novel immunodepletion (ID) method also reduced the background (from a mean of 0.331 ± 0.010 to 0.146 ± 0.013). Overall, as analyzed by the Python-based ELISA_QC program, the ID method generated results at each concentration of the ELISA standard curve that were more similar to those obtained with standard lot 1 than did the HP method. We conclude that the ID method, while more laborious, provides the best solution to resolve the high background seen with specific lots of biotinylated secondary antibody. Copyright © 2018. Published by Elsevier B.V.

  3. A high-throughput multiplex method adapted for GMO detection.

    PubMed

    Chaouachi, Maher; Chupeau, Gaëlle; Berard, Aurélie; McKhann, Heather; Romaniuk, Marcel; Giancola, Sandra; Laval, Valérie; Bertheau, Yves; Brunel, Dominique

    2008-12-24

    A high-throughput multiplex assay for the detection of genetically modified organisms (GMO) was developed on the basis of the existing SNPlex method designed for SNP genotyping. This SNPlex assay allows the simultaneous detection of up to 48 short DNA sequences (approximately 70 bp; "signature sequences") from taxa endogenous reference genes, from GMO constructions, screening targets, construct-specific, and event-specific targets, and finally from donor organisms. This assay avoids certain shortcomings of multiplex PCR-based methods already in widespread use for GMO detection. The assay demonstrated high specificity and sensitivity. The results suggest that this assay is reliable, flexible, and cost- and time-effective for high-throughput GMO detection.

  4. Rapid method to detect duplex formation in sequencing by hybridization methods, a method for constructing containment structures for reagent interaction

    DOEpatents

    Mirzabekov, Andrei Darievich; Yershov, Gennadiy Moiseyevich; Guschin, Dmitry Yuryevich; Gemmell, Margaret Anne; Shick, Valentine V.; Proudnikov, Dmitri Y.; Timofeev, Edward N.

    2002-01-01

    A method for determining the existence of duplexes of oligonucleotide complementary molecules is provided whereby a plurality of immobilized oligonucleotide molecules, each of a specific length and each having a specific base sequence, is contacted with complementary, single stranded oligonucleotide molecules to form a duplex so as to facilitate intercalation of a fluorescent dye between the base planes of the duplex. The invention also provides for a method for constructing oligonucleotide matrices comprising confining light sensitive fluid to a surface, exposing said light-sensitive fluid to a light pattern so as to cause the fluid exposed to the light to polymerize into discrete units and adhere to the surface; and contacting each of the units with a set of different oligonucleotide molecules so as to allow the molecules to disperse into the units.

  5. Novel Spectrofluorimetric Method for the Determination of Perindopril Erbumine Based on Fluorescence Quenching of Rhodamine B.

    PubMed

    Fael, Hanan; Sakur, Amir Al-Haj

    2015-11-01

    A novel, simple and specific spectrofluorimetric method was developed and validated for the determination of perindopril erbumine (PDE). The method is based on the fluorescence quenching of Rhodamine B upon adding perindopril erbumine. The quenched fluorescence was monitored at 578 nm after excitation at 500 nm. The reaction conditions, such as the solvent, reagent concentration, and reaction time, were optimized. Under the optimum conditions, the fluorescence quenching was linear over a concentration range of 1.0-6.0 μg/mL. The proposed method was fully validated and successfully applied to the analysis of perindopril erbumine in pure form and in tablets. Statistical comparison of the results obtained by the developed and reference methods revealed no significant differences between the methods in terms of accuracy and precision. The method was shown to be highly specific in the presence of indapamide, a diuretic that is commonly combined with perindopril erbumine. The mechanism of Rhodamine B quenching is also discussed.

  6. Synchronous acquisition of multi-channel signals by single-channel ADC based on square wave modulation

    NASA Astrophysics Data System (ADS)

    Yi, Xiaoqing; Hao, Liling; Jiang, Fangfang; Xu, Lisheng; Song, Shaoxiu; Li, Gang; Lin, Ling

    2017-08-01

    Synchronous acquisition of multi-channel biopotential signals, such as electrocardiograph (ECG) and electroencephalograph signals, has vital significance in health care and clinical diagnosis. In this paper, we propose a new method which uses a single-channel ADC to synchronously acquire multi-channel biopotential signals modulated by square waves. In this method, a specific modulation and demodulation scheme has been investigated that avoids complex signal processing. For each channel, the sampling rate does not decline as the number of signal channels increases. More specifically, the signal-to-noise ratio of each channel is n times that of the time-division method, an improvement of 3.01 × log2(n) dB, where n represents the number of signal channels. A numerical simulation shows the feasibility and validity of this method. In addition, a newly developed 8-lead ECG system based on the new method is introduced. These experiments illustrate that the method is practicable and thus has potential for low-cost medical monitors.
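
    A minimal numerical sketch of the general idea, using orthogonal Walsh (Hadamard) codes as the square-wave carriers; this is an illustrative assumption, not the paper's exact modulation and demodulation scheme. Each channel is spread by its code, all channels are summed into one stream (the single ADC), and demodulation recovers each channel by correlating with its code.

      # Illustrative sketch: square-wave (Walsh-code) multiplexing of several slow
      # signals onto one sample stream, then demodulation by per-frame correlation.
      import numpy as np
      from scipy.linalg import hadamard

      n_channels, chips, frames = 4, 4, 250
      codes = hadamard(chips)[:n_channels]            # +/-1 square-wave codes

      t = np.arange(frames)
      signals = np.vstack([np.sin(2 * np.pi * (0.01 + 0.01 * k) * t)
                           for k in range(n_channels)])      # slow biopotential-like signals

      # Modulate: each frame of each channel is spread by its code, then all
      # channels are summed into the single stream seen by one ADC.
      stream = (signals[:, :, None] * codes[:, None, :]).sum(axis=0).ravel()

      # Demodulate: correlate the stream with each channel's code, frame by frame.
      frames_mat = stream.reshape(frames, chips)
      recovered = frames_mat @ codes.T / chips        # (frames, n_channels)

      print("max reconstruction error:", np.abs(recovered.T - signals).max())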

  7. Novel approach for the simultaneous detection of DNA from different fish species based on a nuclear target: quantification potential.

    PubMed

    Prado, Marta; Boix, Ana; von Holst, Christoph

    2012-07-01

    The development of DNA-based methods for the identification and quantification of fish in food and feed samples is frequently focused on a specific fish species and/or on the detection of mitochondrial DNA of fish origin. However, a quantitative method for the most common fish species used by the food and feed industry is needed for official control purposes, and such a method should rely on the use of a single-copy nuclear DNA target owing to its more stable copy number in different tissues. In this article, we report on the development of a real-time PCR method based on the use of a nuclear gene as a target for the simultaneous detection of fish DNA from different species and on the evaluation of its quantification potential. The method was tested in 22 different fish species, including those most commonly used by the food and feed industry, and in negative control samples, which included 15 animal species and nine feed ingredients. The results show that the method reported here complies with the requirements concerning specificity and with the criteria required for real-time PCR methods with high sensitivity.

  8. FUN-LDA: A Latent Dirichlet Allocation Model for Predicting Tissue-Specific Functional Effects of Noncoding Variation: Methods and Applications.

    PubMed

    Backenroth, Daniel; He, Zihuai; Kiryluk, Krzysztof; Boeva, Valentina; Pethukova, Lynn; Khurana, Ekta; Christiano, Angela; Buxbaum, Joseph D; Ionita-Laza, Iuliana

    2018-05-03

    We describe a method based on a latent Dirichlet allocation model for predicting functional effects of noncoding genetic variants in a cell-type- and/or tissue-specific way (FUN-LDA). Using this unsupervised approach, we predict tissue-specific functional effects for every position in the human genome in 127 different tissues and cell types. We demonstrate the usefulness of our predictions by using several validation experiments. Using eQTL data from several sources, including the GTEx project, Geuvadis project, and TwinsUK cohort, we show that eQTLs in specific tissues tend to be most enriched among the predicted functional variants in relevant tissues in Roadmap. We further show how these integrated functional scores can be used for (1) deriving the most likely cell or tissue type causally implicated for a complex trait by using summary statistics from genome-wide association studies and (2) estimating a tissue-based correlation matrix of various complex traits. We found large enrichment of heritability in functional components of relevant tissues for various complex traits, and FUN-LDA yielded higher enrichment estimates than existing methods. Finally, using experimentally validated functional variants from the literature and variants possibly implicated in disease by previous studies, we rigorously compare FUN-LDA with state-of-the-art functional annotation methods and show that FUN-LDA has better prediction accuracy and higher resolution than these methods. In particular, our results suggest that tissue- and cell-type-specific functional prediction methods tend to have substantially better prediction accuracy than organism-level prediction methods. Scores for each position in the human genome and for each ENCODE and Roadmap tissue are available online (see Web Resources). Copyright © 2018 American Society of Human Genetics. Published by Elsevier Inc. All rights reserved.
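
    As a rough illustration of the modeling idea (not the FUN-LDA implementation), the sketch below fits a latent Dirichlet allocation model to a positions-by-epigenetic-marks count matrix and reads off per-position latent class probabilities; the data are random placeholders.

      # Minimal illustration of fitting an LDA model where each "document" is a
      # genomic position and each "word" is an epigenetic mark observed there.
      import numpy as np
      from sklearn.decomposition import LatentDirichletAllocation

      rng = np.random.default_rng(0)
      X = rng.poisson(1.0, size=(500, 12))      # 500 positions x 12 epigenetic marks

      lda = LatentDirichletAllocation(n_components=3, random_state=0).fit(X)
      theta = lda.transform(X)                  # per-position latent class probabilities

      # A "functional" score could then be read off as the probability that a
      # position belongs to the latent class enriched for active marks.
      print(theta[:5].round(3))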

  9. Cosine Kuramoto Based Distribution of a Convoy with Limit-Cycle Obstacle Avoidance Through the Use of Simulated Agents

    NASA Astrophysics Data System (ADS)

    Howerton, William

    This thesis presents a method for the integration of complex network control algorithms with localized agent-specific algorithms for maneuvering and obstacle avoidance. This method allows for successful implementation of both group and agent-specific behaviors. It has proven to be robust and works for a variety of vehicle platforms. Initially, a review and implementation of two specific algorithms is detailed. The first, a modified Kuramoto model developed by Xu [1], utilizes tools from graph theory to efficiently perform the task of distributing agents. The second algorithm, developed by Kim [2], is an effective method for wheeled robots to avoid local obstacles using a limit-cycle navigation method. The results of implementing these methods on a test-bed of wheeled robots are presented. Control issues related to outside disturbances not anticipated in the original theory are then discussed. A novel method of using simulated agents to separate the task of distributing agents from agent-specific velocity and heading commands has been developed and implemented to address these issues. This new method can be used to combine various behaviors and is not limited to a specific control algorithm.
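
    For reference, the following sketch integrates the classic Kuramoto phase model that the convoy-distribution algorithm builds on; the modified, graph-weighted form used in the thesis and the limit-cycle obstacle avoidance are not reproduced. The coupling strength, graph, and step size are arbitrary assumptions.

      # Minimal sketch of the classic Kuramoto phase model with Euler integration.
      import numpy as np

      def kuramoto_step(theta, omega, K, A, dt=0.01):
          """One Euler step: dtheta_i/dt = omega_i + (K/N) * sum_j A_ij sin(theta_j - theta_i)."""
          n = len(theta)
          coupling = (A * np.sin(theta[None, :] - theta[:, None])).sum(axis=1)
          return theta + dt * (omega + K / n * coupling)

      n = 6
      rng = np.random.default_rng(0)
      theta = rng.uniform(0, 2 * np.pi, n)        # agent phases (e.g., positions on a loop)
      omega = np.zeros(n)                          # identical natural frequencies
      A = np.ones((n, n)) - np.eye(n)              # all-to-all communication graph

      for _ in range(5000):
          theta = kuramoto_step(theta, omega, K=1.0, A=A)

      # The order parameter approaches 1 as the phases synchronize.
      print("phase coherence (order parameter):", abs(np.exp(1j * theta).mean()))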

  10. The composite dynamic method as evidence for age-specific waterfowl mortality

    USGS Publications Warehouse

    Burnham, Kenneth P.; Anderson, David R.

    1979-01-01

    For the past 25 years, estimation of mortality rates for waterfowl has been based almost entirely on the composite dynamic life table. We examined the specific assumptions for this method and derived a valid goodness-of-fit test. We performed this test on 45 data sets representing a cross section of banded samples for various waterfowl species, geographic areas, banding periods, and age/sex classes. We found that: (1) the composite dynamic method was rejected (P < 0.001) in 37 of the 45 data sets (in fact, 29 were rejected at P < 0.00001) and (2) recovery and harvest rates are year-specific (a critical violation of the necessary assumptions). We conclude that the restrictive assumptions required for the composite dynamic method to produce valid estimates of mortality rates are not met in waterfowl data. We also demonstrate that even when the required assumptions are met, the method produces very biased estimates of age-specific mortality rates. We believe the composite dynamic method should not be used in the analysis of waterfowl banding data. Furthermore, the composite dynamic method does not provide valid evidence for age-specific mortality rates in waterfowl.

  11. Genome Editing of Monkey.

    PubMed

    Liu, Zhen; Cai, Yijun; Sun, Qiang

    2017-01-01

    Gene-modified monkey models would be particularly valuable in biomedical and neuroscience research. Virus-based transgenic methods and programmable nuclease-based site-specific gene editing methods (TALEN, CRISPR-Cas9) enable the generation of gene-modified monkeys with gain or loss of function of specific genes. Here, we describe the generation of transgenic and knock-out (KO) monkeys with high efficiency by lentivirus and programmable nucleases.

  12. Development of a screening method for genetically modified soybean by plasmid-based quantitative competitive polymerase chain reaction.

    PubMed

    Shimizu, Eri; Kato, Hisashi; Nakagawa, Yuki; Kodama, Takashi; Futo, Satoshi; Minegishi, Yasutaka; Watanabe, Takahiro; Akiyama, Hiroshi; Teshima, Reiko; Furui, Satoshi; Hino, Akihiro; Kitta, Kazumi

    2008-07-23

    A novel type of quantitative competitive polymerase chain reaction (QC-PCR) system for the detection and quantification of the Roundup Ready soybean (RRS) was developed. This system was designed to take advantage of a fully validated real-time PCR method used for the quantification of RRS in Japan. A plasmid was constructed as a competitor for the detection and quantification of the genetically modified soy RRS. The plasmid contained the construct-specific sequence of RRS and the taxon-specific sequence of lectin1 (Le1), each carrying a 21-bp oligonucleotide insertion. The plasmid DNA was used as a reference molecule instead of ground seeds, which enabled us to precisely and stably adjust the copy number of targets. The present study demonstrated that the novel plasmid-based QC-PCR method could be a simple and feasible alternative to the real-time PCR method used for the quantification of genetically modified organism content.

  13. DNA aptamer-based colorimetric detection platform for Salmonella Enteritidis.

    PubMed

    Bayraç, Ceren; Eyidoğan, Füsun; Avni Öktem, Hüseyin

    2017-12-15

    Food safety is a major public health issue, and a key challenge is to find detection methods for identification of hazards in food. Foodborne infections affect millions of people each year, and among pathogens, Salmonella Enteritidis is the most widely found bacterium causing foodborne disease. Therefore, simple, rapid, and specific detection methods are needed for food safety. In this study, we demonstrate the selection of DNA aptamers with high affinity and specificity against S. Enteritidis via Cell Systematic Evolution of Ligands by Exponential Enrichment (Cell-SELEX) and the development of sandwich-type aptamer-based colorimetric platforms for its detection. Two highly specific aptamers, crn-1 and crn-2, were developed through 12 rounds of selection, with Kd values of 0.971 µM and 0.309 µM, respectively. Both aptamers were used to construct sandwich-type capillary detection platforms. With a detection limit of 10^3 CFU/mL, the crn-1 and crn-2 based platforms detected the target bacteria specifically based on a color change. The platform is also suitable for detection of S. Enteritidis in complex food matrices. Thus, this is the first demonstration of Salmonella aptamers used to develop a colorimetric aptamer-based detection platform for identification and detection with the naked eye at the point of care. Copyright © 2017 Elsevier B.V. All rights reserved.

  14. Ethnographic Methods in Academic Libraries: A Review

    ERIC Educational Resources Information Center

    Ramsden, Bryony

    2016-01-01

    Research in academic libraries has recently seen an increase in the use of ethnographic-based methods to collect data. Primarily used to learn about library users and their interaction with spaces and resources, the methods are proving particularly useful to academic libraries. The data ethnographic methods retrieve is rich, context specific, and…

  15. 78 FR 35072 - Proposed Revisions to Reliability Assurance Program

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-06-11

    ... current staff review methods and practices based on lessons learned from NRC reviews of design... following methods (unless this document describes a different method for submitting comments on a specific... possesses and is publicly-available, by the following methods: Federal Rulemaking Web site: Go to http://www...

  16. Development and selection of Asian-specific humeral implants based on statistical atlas: toward planning minimally invasive surgery.

    PubMed

    Wu, K; Daruwalla, Z J; Wong, K L; Murphy, D; Ren, H

    2015-08-01

    The commercial humeral implants based on the Western population are currently not entirely compatible with Asian patients, due to differences in bone size, shape, and structure. Surgeons may have to compromise or use different implants that are less conforming, which may cause complications as well as compromise implant positioning. The construction of Asian humerus atlases of different clusters has therefore been proposed to eradicate this problem and to facilitate the planning of minimally invasive surgical procedures [6,31]. According to the features of the atlases, new implants could be designed specifically for different patients. Furthermore, an automatic implant selection algorithm has been proposed in order to reduce the complications caused by implant-bone mismatch. Prior to the design of the implant, data clustering and extraction of the relevant features were carried out on the datasets of each gender. The fuzzy C-means clustering method is explored in this paper. In addition, two new schemes of implant selection procedures, namely the Procrustes analysis-based scheme and the group average distance-based scheme, were proposed to better search the database for matching implants for new patients. Neither of these two algorithms has previously been used in this area, yet both turn out to have excellent performance in implant selection. Additionally, algorithms to calculate the matching scores between various implants and the patient data are proposed in this paper to assist the implant selection procedure. The results obtained indicate the feasibility of the proposed development and selection scheme. The 16 sets of male data were divided into two clusters with 8 and 8 subjects, respectively, and the 11 female datasets were also divided into two clusters, with 5 and 6 subjects, respectively. Based on the features of each cluster, the implants designed by the proposed algorithm fit very well on their reference humeri, and the proposed implant selection procedure allows for a scenario of treating a patient with merely a preoperative anatomical model in order to correctly select the implant that has the best fit. Based on leave-one-out validation, it can be concluded that both the PA-based method and the GAD-based method achieve excellent performance when dealing with the problem of implant selection. The accuracy and average execution time for the PA-based method were 100% and 0.132 s, respectively, while those of the GAD-based method were 100% and 0.058 s. Therefore, the GAD-based method outperformed the PA-based method in terms of execution speed. The primary contributions of this paper include the proposal of methods for the development of Asian-, gender- and cluster-specific implants based on shape features and the selection of the best-fit implants for future patients according to their features. To the best of our knowledge, this is the first work that proposes implant design and selection for Asian patients automatically based on features extracted from cluster-specific statistical atlases.
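
    A minimal sketch of a Procrustes-analysis-based matching step, assuming hypothetical landmark sets: each candidate implant is compared with the patient's landmarks, and the implant with the lowest Procrustes disparity is selected. This illustrates the PA-based scheme only; the clustering, atlas construction, and GAD-based scheme are not shown.

      # Minimal sketch of Procrustes-based implant matching with placeholder
      # landmark arrays standing in for corresponding anatomical points.
      import numpy as np
      from scipy.spatial import procrustes

      rng = np.random.default_rng(0)
      patient = rng.normal(size=(20, 3))                       # 20 landmarks in 3-D
      implants = {f"implant_{i}": patient + rng.normal(scale=0.1 * (i + 1), size=(20, 3))
                  for i in range(3)}

      def best_fit(patient_pts, implant_db):
          scores = {}
          for name, pts in implant_db.items():
              _, _, disparity = procrustes(patient_pts, pts)   # lower disparity = better fit
              scores[name] = disparity
          return min(scores, key=scores.get), scores

      best, scores = best_fit(patient, implants)
      print("best-fitting implant:", best, scores)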

  17. An efficient diagnosis system for Parkinson's disease using kernel-based extreme learning machine with subtractive clustering features weighting approach.

    PubMed

    Ma, Chao; Ouyang, Jihong; Chen, Hui-Ling; Zhao, Xue-Hua

    2014-01-01

    A novel hybrid method named SCFW-KELM, which integrates effective subtractive clustering features weighting and a fast classifier kernel-based extreme learning machine (KELM), has been introduced for the diagnosis of PD. In the proposed method, SCFW is used as a data preprocessing tool, which aims at decreasing the variance in features of the PD dataset, in order to further improve the diagnostic accuracy of the KELM classifier. The impact of the type of kernel functions on the performance of KELM has been investigated in detail. The efficiency and effectiveness of the proposed method have been rigorously evaluated against the PD dataset in terms of classification accuracy, sensitivity, specificity, area under the receiver operating characteristic (ROC) curve (AUC), f-measure, and kappa statistics value. Experimental results have demonstrated that the proposed SCFW-KELM significantly outperforms SVM-based, KNN-based, and ELM-based approaches and other methods in the literature and achieved the highest classification results reported so far via a 10-fold cross-validation scheme, with a classification accuracy of 99.49%, sensitivity of 100%, specificity of 99.39%, AUC of 99.69%, f-measure value of 0.9964, and kappa value of 0.9867. Promisingly, the proposed method might serve as a new candidate among powerful methods for the diagnosis of PD with excellent performance.
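
    The classifier at the core of SCFW-KELM can be illustrated with a short kernel extreme learning machine sketch: with an RBF kernel matrix Omega, the output weights beta solve (I/C + Omega) beta = T in closed form. The data below are random placeholders, not the PD dataset, and the SCFW preprocessing step is omitted.

      # Minimal sketch of a kernel-based extreme learning machine (KELM) classifier.
      import numpy as np

      def rbf_kernel(X, Y, gamma=0.1):
          d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
          return np.exp(-gamma * d2)

      def kelm_train(X, y, C=10.0, gamma=0.1):
          T = np.where(y == 1, 1.0, -1.0)                 # bipolar targets
          Omega = rbf_kernel(X, X, gamma)
          return np.linalg.solve(np.eye(len(X)) / C + Omega, T)   # output weights beta

      def kelm_predict(X_train, beta, X_new, gamma=0.1):
          return (rbf_kernel(X_new, X_train, gamma) @ beta > 0).astype(int)

      rng = np.random.default_rng(0)
      X = rng.normal(size=(100, 22))                      # e.g., 22 features per subject
      y = (X[:, 0] + 0.3 * rng.normal(size=100) > 0).astype(int)
      beta = kelm_train(X, y)
      print("training accuracy:", (kelm_predict(X, beta, X) == y).mean())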

  18. An Efficient Diagnosis System for Parkinson's Disease Using Kernel-Based Extreme Learning Machine with Subtractive Clustering Features Weighting Approach

    PubMed Central

    Ma, Chao; Ouyang, Jihong; Chen, Hui-Ling; Zhao, Xue-Hua

    2014-01-01

    A novel hybrid method named SCFW-KELM, which integrates effective subtractive clustering features weighting and a fast classifier kernel-based extreme learning machine (KELM), has been introduced for the diagnosis of PD. In the proposed method, SCFW is used as a data preprocessing tool, which aims at decreasing the variance in features of the PD dataset, in order to further improve the diagnostic accuracy of the KELM classifier. The impact of the type of kernel functions on the performance of KELM has been investigated in detail. The efficiency and effectiveness of the proposed method have been rigorously evaluated against the PD dataset in terms of classification accuracy, sensitivity, specificity, area under the receiver operating characteristic (ROC) curve (AUC), f-measure, and kappa statistics value. Experimental results have demonstrated that the proposed SCFW-KELM significantly outperforms SVM-based, KNN-based, and ELM-based approaches and other methods in the literature and achieved the highest classification results reported so far via a 10-fold cross-validation scheme, with a classification accuracy of 99.49%, sensitivity of 100%, specificity of 99.39%, AUC of 99.69%, f-measure value of 0.9964, and kappa value of 0.9867. Promisingly, the proposed method might serve as a new candidate among powerful methods for the diagnosis of PD with excellent performance. PMID:25484912

  19. Palaeomagnetic dating method accounting for post-depositional remanence and its application to geomagnetic field modelling

    NASA Astrophysics Data System (ADS)

    Nilsson, A.; Suttie, N.

    2016-12-01

    Sedimentary palaeomagnetic data may exhibit some degree of smoothing of the recorded field due to the gradual processes by which the magnetic signal is 'locked in' over time. Here we present a new Bayesian method to construct age-depth models based on palaeomagnetic data, taking into account and correcting for potential lock-in delay. The age-depth model is built on the widely used "Bacon" dating software by Blaauw and Christen (2011, Bayesian Analysis 6, 457-474) and is designed to combine both radiocarbon and palaeomagnetic measurements. To our knowledge, this is the first palaeomagnetic dating method that addresses the potential problems related to post-depositional remanent magnetisation acquisition in age-depth modelling. Age-depth models, including site-specific lock-in depth and lock-in filter function, produced with this method are shown to be consistent with independent results based on radiocarbon wiggle-match dated sediment sections. Besides its primary use as a dating tool, our new method can also be used specifically to identify the most likely lock-in parameters for a specific record. We explore the potential to use these results to construct high-resolution geomagnetic field models based on sedimentary palaeomagnetic data, adjusting for the smoothing induced by post-depositional remanent magnetisation acquisition. Potentially, this technique could enable reconstructions of the Holocene geomagnetic field with the same amplitude of variability observed in archaeomagnetic field models for the past three millennia.

  20. Computational Prediction of Protein Epsilon Lysine Acetylation Sites Based on a Feature Selection Method.

    PubMed

    Gao, JianZhao; Tao, Xue-Wen; Zhao, Jia; Feng, Yuan-Ming; Cai, Yu-Dong; Zhang, Ning

    2017-01-01

    Lysine acetylation, as one type of post-translational modification (PTM), plays key roles in cellular regulation and can be involved in a variety of human diseases. However, it is often costly and time-consuming to identify lysine acetylation sites with traditional experimental approaches. Therefore, effective computational methods should be developed to predict the acetylation sites. In this study, we developed a position-specific method for epsilon lysine acetylation site prediction. Sequences of acetylated proteins were retrieved from the UniProt database. Various kinds of features, such as the position-specific scoring matrix (PSSM), amino acid factors (AAF), and disorder, were incorporated. A feature selection method based on mRMR (Maximum Relevance Minimum Redundancy) and IFS (Incremental Feature Selection) was employed. Finally, 319 optimal features were selected from the total of 541 features. Using the 319 optimal features to encode peptides, a predictor was constructed based on dagging. As a result, an accuracy of 69.56% with an MCC of 0.2792 was achieved. We analyzed the optimal features, which suggested some important factors determining the lysine acetylation sites. In summary, we developed a position-specific method for epsilon lysine acetylation site prediction and selected a set of optimal features. Analysis of the optimal features provided insights into the mechanism of lysine acetylation sites, providing guidance for experimental validation. Copyright© Bentham Science Publishers; For any queries, please email at epub@benthamscience.org.
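
    A minimal sketch of the mRMR ranking followed by incremental feature selection, assuming random placeholder features and a decision tree standing in for the dagging classifier used in the study: features are greedily ordered by relevance minus average redundancy (both estimated with mutual information), and growing prefixes of that ordering are cross-validated to pick the best feature-set size.

      # Minimal sketch of mRMR feature ranking followed by IFS evaluation.
      import numpy as np
      from sklearn.feature_selection import mutual_info_classif
      from sklearn.metrics import mutual_info_score
      from sklearn.model_selection import cross_val_score
      from sklearn.tree import DecisionTreeClassifier

      rng = np.random.default_rng(0)
      X = rng.integers(0, 5, size=(300, 30))      # 300 peptides x 30 discrete features
      y = rng.integers(0, 2, size=300)

      relevance = mutual_info_classif(X, y, discrete_features=True, random_state=0)
      selected, remaining = [], list(range(X.shape[1]))
      while remaining:                            # greedy mRMR ordering
          def score(j):
              red = np.mean([mutual_info_score(X[:, j], X[:, k]) for k in selected]) if selected else 0.0
              return relevance[j] - red
          best = max(remaining, key=score)
          selected.append(best)
          remaining.remove(best)

      # IFS: evaluate growing prefixes of the mRMR-ranked list, keep the best size.
      accs = [cross_val_score(DecisionTreeClassifier(random_state=0),
                              X[:, selected[:k]], y, cv=5).mean()
              for k in range(1, len(selected) + 1)]
      print("best feature-set size:", int(np.argmax(accs)) + 1)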

  1. Physically consistent data assimilation method based on feedback control for patient-specific blood flow analysis.

    PubMed

    Ii, Satoshi; Adib, Mohd Azrul Hisham Mohd; Watanabe, Yoshiyuki; Wada, Shigeo

    2018-01-01

    This paper presents a novel data assimilation method for patient-specific blood flow analysis based on feedback control theory, called the physically consistent feedback control-based data assimilation (PFC-DA) method. In the PFC-DA method, the signal, which is the residual error term of the velocity when comparing the numerical and reference measurement data, is cast as a source term in a Poisson equation for the scalar potential field that induces flow in a closed system. The pressure values at the inlet and outlet boundaries are recursively calculated from this scalar potential field. Hence, the flow field is physically consistent because it is driven by the calculated inlet and outlet pressures, without any artificial body forces. Compared with existing variational approaches, although this PFC-DA method does not guarantee the optimal solution, only one additional Poisson equation for the scalar potential field is required, providing a remarkable improvement for a small additional computational cost at every iteration. Through numerical examples for 2D and 3D exact flow fields, with both noise-free and noisy reference data, as well as a blood flow analysis of a cerebral aneurysm using actual patient data, the robustness and accuracy of this approach are shown. Moreover, the feasibility of a patient-specific practical blood flow analysis is demonstrated. Copyright © 2017 John Wiley & Sons, Ltd.
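
    A heavily simplified one-dimensional illustration of the core step described above: the velocity residual (reference minus computed) is used as the source of a Poisson equation for a scalar potential, and the boundary gradients of that potential provide the inlet and outlet pressure updates. The grid, constants, and boundary treatment are illustrative assumptions, not the paper's formulation.

      # Illustrative 1-D sketch of a Poisson solve driven by a velocity residual.
      import numpy as np

      n, h = 51, 1.0 / 50
      u_measured = np.linspace(1.0, 1.2, n)           # reference velocity (placeholder)
      u_computed = np.full(n, 1.1)                    # current CFD estimate (placeholder)
      residual = u_measured - u_computed              # feedback signal

      # Assemble the 1-D Laplacian with homogeneous Dirichlet ends and solve
      # d^2(phi)/dx^2 = residual.
      A = (np.diag(-2.0 * np.ones(n - 2)) +
           np.diag(np.ones(n - 3), 1) + np.diag(np.ones(n - 3), -1)) / h**2
      phi = np.zeros(n)
      phi[1:-1] = np.linalg.solve(A, residual[1:-1])

      # Pressure corrections proportional to the potential gradient at the boundaries.
      dp_inlet = (phi[1] - phi[0]) / h
      dp_outlet = (phi[-1] - phi[-2]) / h
      print("inlet/outlet pressure corrections:", dp_inlet, dp_outlet)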

  2. Annual Conference on HAN-Based Liquid Propellants. Volume 1

    DTIC Science & Technology

    1989-05-01

    Fischer. This situation is obviously not ideal and effort is being made to find a suitable method. However we have been assured that there has been... CLASSIFICATION OF HAN-BASED LIQUID PROPELLANT LP101 (S. Westlake)... POSSIBLE TEST METHODS TO STUDY THE THERMAL STABILITY OF... specifications for LP. The phase of the program which is now in progress has dealt with (1) reviewing, recommending and developing applicable analytical methods

  3. A high-throughput liquid bead array-based screening technology for Bt presence in GMO manipulation.

    PubMed

    Fu, Wei; Wang, Huiyu; Wang, Chenguang; Mei, Lin; Lin, Xiangmei; Han, Xueqing; Zhu, Shuifang

    2016-03-15

    The number of species and the planting area of genetically modified organisms (GMOs) have grown rapidly during the past ten years. For the purpose of GMO inspection, quarantine and manipulation, we have devised a high-throughput Bt-based GMO screening method built on a liquid bead array. The method relies on direct competitive recognition between biotinylated antibodies and bead-coupled antigens, detecting Bt presence in samples containing Bt Cry1 Aa, Bt Cry1 Ab, Bt Cry1 Ac, Bt Cry1 Ah, Bt Cry1 B, Bt Cry1 C, Bt Cry1 F, Bt Cry2 A, Bt Cry3 or Bt Cry9 C. The method has wide GMO species coverage, so that more than 90% of the commercialized GMO species worldwide can be identified. After optimization and validation of its specificity, sensitivity, repeatability and availability, the method shows high specificity and a quantification sensitivity of 10-50 ng/mL. We then assessed more than 1800 samples from the field and the food market to demonstrate the capacity of the method for high-throughput screening in GMO manipulation. The method offers an applicable platform for further inspection of and research on GMO plants. Copyright © 2015 Elsevier B.V. All rights reserved.

  4. Do Toxicity Identification and Evaluation Laboratory-Based Methods Reflect Causes of Field Impairment?

    EPA Science Inventory

    Sediment Toxicity Identification and Evaluation (TIE) methods have been developed for both interstitial waters and whole sediments. These relatively simple laboratory methods are designed to identify specific toxicants or classes of toxicants in sediments; however, the question ...

  5. AN APPROACH TO METHODS DEVELOPMENT FOR HUMAN EXPOSURE ASSESSMENT STUDIES

    EPA Science Inventory

    Human exposure assessment studies require methods that are rapid, cost-effective and have a high sample through-put. The development of analytical methods for exposure studies should be based on specific information for individual studies. Human exposure studies suggest that di...

  6. Qualification of a Quantitative Method for Monitoring Aspartate Isomerization of a Monoclonal Antibody by Focused Peptide Mapping.

    PubMed

    Cao, Mingyan; Mo, Wenjun David; Shannon, Anthony; Wei, Ziping; Washabaugh, Michael; Cash, Patricia

    Aspartate (Asp) isomerization is a common post-translational modification of recombinant therapeutic proteins that can occur during manufacturing, storage, or administration. Asp isomerization in the complementarity-determining regions of a monoclonal antibody may affect the target binding and thus a sufficiently robust quality control method for routine monitoring is desirable. In this work, we utilized a liquid chromatography-mass spectrometry (LC/MS)-based approach to identify the Asp isomerization in the complementarity-determining regions of a therapeutic monoclonal antibody. To quantitate the site-specific Asp isomerization of the monoclonal antibody, a UV detection-based quantitation assay utilizing the same LC platform was developed. The assay was qualified and implemented for routine monitoring of this product-specific modification. Compared with existing methods, this analytical paradigm is applicable to identify Asp isomerization (or other modifications) and subsequently develop a rapid, sufficiently robust quality control method for routine site-specific monitoring and quantitation to ensure product quality. This approach first identifies and locates a product-related impurity (a critical quality attribute) caused by isomerization, deamidation, oxidation, or other post-translational modifications, and then utilizes synthetic peptides and MS to assist the development of a LC-UV-based chromatographic method that separates and quantifies the product-related impurities by UV peaks. The established LC-UV method has acceptable peak specificity, precision, linearity, and accuracy; it can be validated and used in a good manufacturing practice environment for lot release and stability testing. Aspartate isomerization is a common post-translational modification of recombinant proteins during manufacture process and storage. Isomerization in the complementarity-determining regions (CDRs) of a monoclonal antibody A (mAb-A) has been detected and has been shown to have impact on the binding affinity to the antigen. In this work, we utilized a mass spectrometry-based peptide mapping approach to detect and quantitate the Asp isomerization in the CDRs of mAb-A. To routinely monitor the CDR isomerization of mAb-A, a focused peptide mapping method utilizing reversed phase chromatographic separation and UV detection has been developed and qualified. This approach is generally applicable to monitor isomerization and other post-translational modifications of proteins in a specific and high-throughput mode to ensure product quality. © PDA, Inc. 2016.

  7. EPA Scientists Develop Research Methods for Studying Mold Fact Sheet

    EPA Pesticide Factsheets

    In 2002, U.S. Environmental Protection Agency researchers developed a DNA-based Mold Specific Quantitative Polymerase Chain Reaction method (MSQPCR) for identifying and quantifying over 100 common molds and fungi.

  8. 40 CFR 146.95 - Class VI injection depth waiver requirements.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... detection tools), unless the Director determines, based on site-specific geology, that such methods are not... geology, that such methods are not appropriate; (5) Any additional requirements requested by the Director...

  9. Rapid method to detect duplex formation in sequencing by hybridization methods

    DOEpatents

    Mirzabekov, A.D.; Timofeev, E.N.; Florentiev, V.L.; Kirillov, E.V.

    1999-01-19

    A method for determining the existence of duplexes of oligonucleotide complementary molecules is provided. A plurality of immobilized oligonucleotide molecules, each of a specific length and each having a specific base sequence, is contacted with complementary, single stranded oligonucleotide molecules to form a duplex. Each duplex facilitates intercalation of a fluorescent dye between the base planes of the duplex. The invention also provides for a method for constructing oligonucleotide matrices comprising confining light sensitive fluid to a surface and exposing the light-sensitive fluid to a light pattern. This causes the fluid exposed to the light to coalesce into discrete units and adhere to the surface. This places each of the units in contact with a set of different oligonucleotide molecules so as to allow the molecules to disperse into the units. 13 figs.

  10. Rapid method to detect duplex formation in sequencing by hybridization methods

    DOEpatents

    Mirzabekov, Andrei Darievich; Timofeev, Edward Nikolaevich; Florentiev, Vladimer Leonidovich; Kirillov, Eugene Vladislavovich

    1999-01-01

    A method for determining the existence of duplexes of oligonucleotide complementary molecules is provided whereby a plurality of immobilized oligonucleotide molecules, each of a specific length and each having a specific base sequence, is contacted with complementary, single stranded oligonucleotide molecules to form a duplex so as to facilitate intercalation of a fluorescent dye between the base planes of the duplex. The invention also provides for a method for constructing oligonucleotide matrices comprising confining light sensitive fluid to a surface, exposing said light-sensitive fluid to a light pattern so as to cause the fluid exposed to the light to coalesce into discrete units and adhere to the surface; and contacting each of the units with a set of different oligonucleotide molecules so as to allow the molecules to disperse into the units.

  11. Detecting crop growth stages of maize and soybeans by using time-series MODIS data

    NASA Astrophysics Data System (ADS)

    Sakamoto, T.; Wardlow, B. D.; Gitelson, A. A.; Verma, S. B.; Suyker, A. E.; Arkebauer, T. J.

    2009-12-01

    Crop phenological stages are essential parameters for evaluating crop productivity with a crop simulation model. In this study, we improved the Wavelet-based Filter for detecting Crop Phenology (WFCP) to detect specific phenological dates of maize and soybeans. The improved method was applied to the MODIS-derived Wide Dynamic Range Vegetation Index (WDRVI) over a 6-year period (2003 to 2008) for three experimental fields planted to either maize or soybeans as part of the Carbon Sequestration Program (CSP) at the University of Nebraska-Lincoln (UNL). Using the ground-based crop growth stage observations collected by the CSP, it was confirmed that the improved method can estimate specific phenological dates of maize (V2.5, R1, R5 and R6) and soybeans (V1, R5, R6 and R7) with reasonable accuracy.
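
    The Python sketch below illustrates the general idea of wavelet-based smoothing of a noisy vegetation-index time series and reading a phenological date (here, the day of peak greenness) off the smoothed curve; it is not the published WFCP algorithm, and the wavelet, decomposition level, and synthetic WDRVI profile are assumptions.

    import numpy as np
    import pywt

    def wavelet_smooth(series, wavelet="db4", level=3):
        # Zero the detail coefficients to keep only the coarse seasonal shape.
        coeffs = pywt.wavedec(series, wavelet, level=level, mode="periodization")
        coeffs[1:] = [np.zeros_like(c) for c in coeffs[1:]]
        return pywt.waverec(coeffs, wavelet, mode="periodization")[: len(series)]

    t = np.arange(365)
    rng = np.random.default_rng(1)
    wdrvi = np.exp(-((t - 200) / 40.0) ** 2) + 0.05 * rng.normal(size=365)  # toy profile
    smoothed = wavelet_smooth(wdrvi)
    print("estimated day of peak greenness:", int(np.argmax(smoothed)))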

  12. Are the classic diagnostic methods in mycology still state of the art?

    PubMed

    Wiegand, Cornelia; Bauer, Andrea; Brasch, Jochen; Nenoff, Pietro; Schaller, Martin; Mayser, Peter; Hipler, Uta-Christina; Elsner, Peter

    2016-05-01

    The diagnostic workup of cutaneous fungal infections is traditionally based on microscopic KOH preparations as well as culturing of the causative organism from sample material. Another possible option is the detection of fungal elements by dermatohistology. If performed correctly, these methods are generally suitable for the diagnosis of mycoses. However, the advent of personalized medicine and the tasks arising therefrom require new procedures marked by simplicity, specificity, and swiftness. The additional use of DNA-based molecular techniques further enhances sensitivity and diagnostic specificity, and reduces the diagnostic interval to 24-48 hours, compared to weeks required for conventional mycological methods. Given the steady evolution in the field of personalized medicine, simple analytical PCR-based systems are conceivable, which allow for instant diagnosis of dermatophytes in the dermatology office (point-of-care tests). © 2016 Deutsche Dermatologische Gesellschaft (DDG). Published by John Wiley & Sons Ltd.

  13. Extension of specification language for soundness and completeness of service workflow

    NASA Astrophysics Data System (ADS)

    Viriyasitavat, Wattana; Xu, Li Da; Bi, Zhuming; Sapsomboon, Assadaporn

    2018-05-01

    A service workflow is an aggregation of distributed services that fulfils specific functionalities. With the ever-increasing number of available services, methodologies for selecting services against given requirements have become a main research subject in multiple disciplines. A few researchers have contributed formal specification languages and methods for model checking; however, existing methods have difficulty tackling the complexity of workflow compositions. In this paper, we propose to formalize the specification language to reduce the complexity of workflow composition. To this end, we extend a specification language with formal logic, so that effective theorems can be derived for the verification of syntax, semantics, and inference rules in the workflow composition. The logic-based approach automates compliance checking effectively. The Service Workflow Specification (SWSpec) has been extended and formulated, and the soundness, completeness, and consistency of SWSpec applications have been verified; note that a logic-based SWSpec is a prerequisite for model checking. The application of the proposed SWSpec is demonstrated by examples that address soundness, completeness, and consistency.

  14. Quantitative evaluation of specific vulnerability to nitrate for groundwater resource protection based on process-based simulation model.

    PubMed

    Huan, Huan; Wang, Jinsheng; Zhai, Yuanzheng; Xi, Beidou; Li, Juan; Li, Mingxiao

    2016-04-15

    It has been proved that groundwater vulnerability assessment is an effective tool for groundwater protection. Nowadays, quantitative assessment methods for specific vulnerability are scarce due to a limited understanding of the complicated contaminant fate and transport processes in the groundwater system. In this paper, a process-based simulation model for specific vulnerability to nitrate, using a 1D flow and solute transport model in the unsaturated vadose zone, is presented for groundwater resource protection. For this case study in Jilin City of northeast China, rate constants of denitrification and nitrification as well as adsorption constants of ammonium and nitrate in the vadose zone were acquired by laboratory experiments. The transfer time to the groundwater table, t50, was taken as the specific vulnerability indicator. Finally, overall vulnerability was assessed by establishing the relationship between groundwater net recharge, layer thickness and t50. The results suggested that the most vulnerable regions of Jilin City were mainly distributed in the floodplains of the Songhua and Mangniu Rivers. The least vulnerable areas mostly appear in the second terrace and the back of the first terrace. The overall area of low, relatively low and moderate vulnerability accounted for 76% of the study area, suggesting a relatively low likelihood of nitrate contamination. In addition, the sensitivity analysis showed that the most sensitive factors of specific vulnerability in the vadose zone included the groundwater net recharge rate, the physical properties of the soil medium and the rate constants of nitrate denitrification. By validating the suitability of the process-based simulation model for specific vulnerability and comparing it with an index-based method using a group of integrated indicators, we show that more realistic and accurate specific vulnerability mapping can be acquired with the process-based simulation model. In addition, the advantages, disadvantages, constraints and application prospects of the quantitative approach for specific vulnerability assessment are discussed. Copyright © 2016 Elsevier B.V. All rights reserved.
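
    As a back-of-the-envelope counterpart to the full 1D flow and solute transport model used to obtain t50, the Python sketch below computes a piston-flow travel time through a layered vadose zone; the layer thicknesses, water contents, and recharge rate are hypothetical, and the simplification ignores dispersion, sorption, and the nitrogen reactions considered in the study.

    def piston_flow_t50(layer_thickness_m, water_content, net_recharge_m_per_yr):
        # Piston-flow estimate: travel time = stored water / recharge flux.
        stored_water_m = sum(d * theta for d, theta in zip(layer_thickness_m, water_content))
        return stored_water_m / net_recharge_m_per_yr

    # Hypothetical profile: 3 m of silt (theta = 0.30) over 5 m of sand (theta = 0.15),
    # with 0.2 m/yr of net recharge.
    print(piston_flow_t50([3.0, 5.0], [0.30, 0.15], 0.2), "years to the water table")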

  15. DNA-based identification of spices: DNA isolation, whole genome amplification, and polymerase chain reaction.

    PubMed

    Focke, Felix; Haase, Ilka; Fischer, Markus

    2011-01-26

    Usually, spices are identified morphologically using simple tools like magnifying glasses or microscopes. On the other hand, molecular biological methods like the polymerase chain reaction (PCR) enable accurate and specific detection even in complex matrices. Generally, the origins of spices are plants with diverse genetic backgrounds and relationships. The processing methods used for the production of spices are complex and individual. Consequently, the development of a reliable DNA-based method for spice analysis is a challenging undertaking. However, once established, such a method can easily be adapted to less difficult food matrices. In the current study, several alternative methods for the isolation of DNA from spices have been developed and evaluated in detail with regard to (i) purity (photometric), (ii) yield (fluorimetric), and (iii) amplifiability (PCR). Whole genome amplification methods were used to preamplify isolates to improve the ratio between amplifiable DNA and inhibiting substances. Specific primer sets were designed, and the PCR conditions were optimized to detect 18 spices selectively. Assays of self-made spice mixtures were performed to prove the applicability of the developed methods.

  16. Reconstruction of metabolic pathways by combining probabilistic graphical model-based and knowledge-based methods

    PubMed Central

    2014-01-01

    Automatic reconstruction of metabolic pathways for an organism from genomics and transcriptomics data has been a challenging and important problem in bioinformatics. Traditionally, known reference pathways can be mapped onto organism-specific ones based on genome annotation and protein homology. However, this simple knowledge-based mapping method might produce incomplete pathways and generally cannot predict unknown new relations and reactions. In contrast, ab initio metabolic network construction methods can predict novel reactions and interactions, but their accuracy tends to be low, leading to many false positives. Here we combine existing pathway knowledge and a new ab initio Bayesian probabilistic graphical model in a novel fashion to improve automatic reconstruction of metabolic networks. Specifically, we built a knowledge database containing known, individual gene/protein interactions and metabolic reactions extracted from existing reference pathways. Known reactions and interactions were then used as constraints for Bayesian network learning methods to predict metabolic pathways. Using individual reactions and interactions extracted from different pathways of many organisms to guide pathway construction is new and improves both the coverage and accuracy of metabolic pathway construction. We applied this probabilistic knowledge-based approach to construct the metabolic networks from yeast gene expression data and compared its results with 62 known metabolic networks in the KEGG database. The experiment showed that the method improved the coverage of metabolic network construction over the traditional reference pathway mapping method and was more accurate than pure ab initio methods. PMID:25374614

  17. YIP Formal Synthesis of Software-Based Control Protocols for Fractionated,Composable Autonomous Systems

    DTIC Science & Technology

    2016-07-08

    Systems Using Automata Theory and Barrier Certificates. We developed a sound but incomplete method for the computational verification of specifications... The method merges ideas from automata-based model checking with those from control theory, including so-called barrier certificates and optimization-based... "Automata theory meets barrier certificates: Temporal logic verification of nonlinear systems," IEEE Transactions on Automatic Control, 2015.

  18. Age assessment based on third molar mineralisation : An epidemiological-radiological study on a Central-European population.

    PubMed

    Hofmann, Elisabeth; Robold, Matthias; Proff, Peter; Kirschneck, Christian

    2017-03-01

    The method published in 1973 by Demirjian et al. to assess age based on the mineralisation stage of permanent teeth is standard practice in forensic and orthodontic diagnostics. From age 14 onwards, however, this method is only applicable to third molars. No current epidemiological data on third molar mineralisation are available for Caucasian Central-Europeans. Thus, a method for assessing age in this population based on third molar mineralisation is presented, taking into account possible topographic and gender-specific differences. The study included 486 Caucasian Central-European orthodontic patients (9-24 years) with unaffected dental development. In an anonymized, randomized, and blinded manner, one orthopantomogram of each patient at either the start, middle or end of treatment was visually analysed regarding the mineralisation stage of the third molars according to the method by Demirjian et al. Corresponding topographic and gender-specific point scores were determined and added to form a dental maturity score. Prediction equations for age assessment were derived by linear regression analysis with chronological age and checked for reliability within the study population. Mineralisation of the lower third molars was slower than mineralisation of the upper third molars, whereas no jaw-side-specific differences were detected. Gender-specific differences were relatively small, but girls reached mineralisation stage C earlier than boys, whereas boys showed an accelerated mineralisation between the ages of 15 and 16. The global equation generated by regression analysis, age = -1.103 + 0.268 × (sum of the dental maturity scores of teeth 18, 28, 38 and 48), is sufficiently accurate and reliable for clinical use. Age assessment based only on either the maxilla or the mandible also shows good prognostic reliability.
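
    The global regression equation quoted above translates directly into code; the Python sketch below implements it, with the four maturity point scores in the example chosen arbitrarily for illustration.

    def estimated_age(score_18, score_28, score_38, score_48):
        # Global equation from the abstract:
        # age = -1.103 + 0.268 x (sum of the maturity scores of teeth 18, 28, 38, 48)
        return -1.103 + 0.268 * (score_18 + score_28 + score_38 + score_48)

    print(round(estimated_age(17.0, 17.5, 16.0, 16.5), 1), "years")  # hypothetical scores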

  19. Cortical Enhanced Tissue Segmentation of Neonatal Brain MR Images Acquired by a Dedicated Phased Array Coil

    PubMed Central

    Shi, Feng; Yap, Pew-Thian; Fan, Yong; Cheng, Jie-Zhi; Wald, Lawrence L.; Gerig, Guido; Lin, Weili; Shen, Dinggang

    2010-01-01

    The acquisition of high quality MR images of neonatal brains is largely hampered by their characteristically small head size and low tissue contrast. As a result, subsequent image processing and analysis, especially brain tissue segmentation, are often hindered. To overcome this problem, a dedicated phased array neonatal head coil is utilized to improve MR image quality by effectively combining images obtained from 8 coil elements without lengthening data acquisition time. In addition, a subject-specific atlas-based tissue segmentation algorithm is specifically developed for the delineation of fine structures in the acquired neonatal brain MR images. The proposed tissue segmentation method first enhances the sheet-like cortical gray matter (GM) structures in neonatal images with a Hessian filter to generate a cortical GM prior. Then, the prior is combined with our neonatal population atlas to form a cortical-enhanced hybrid atlas, which we refer to as the subject-specific atlas. Various experiments are conducted to compare the proposed method with manual segmentation results, as well as with two additional population-atlas-based segmentation methods. Results show that the proposed method is capable of segmenting the neonatal brain with the highest accuracy compared to the other two methods. PMID:20862268
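
    A rough numpy-only sketch of Hessian-based enhancement of sheet-like structures: voxels where one Hessian eigenvalue is strongly negative while the other two stay near zero score highest. The particular sheetness measure, the single-scale gradient-of-gradient Hessian, and the absence of scale-space handling are simplifying assumptions and not the exact filter used in the paper.

    import numpy as np

    def sheetness(volume, eps=1e-6):
        # Build the 3x3 Hessian at every voxel from repeated finite differences.
        gradients = np.gradient(volume)
        hessian = np.empty(volume.shape + (3, 3))
        for i, g in enumerate(gradients):
            hessian[..., i, 0], hessian[..., i, 1], hessian[..., i, 2] = np.gradient(g)
        eigvals = np.linalg.eigvalsh(hessian)            # ascending per voxel
        l1, l2, l3 = eigvals[..., 0], eigvals[..., 1], eigvals[..., 2]
        # Sheet-like: l1 strongly negative, |l2| and |l3| small.
        score = np.abs(l1) * np.exp(-(np.abs(l2) + np.abs(l3)) / (np.abs(l1) + eps))
        score[l1 >= 0] = 0.0
        return score

    rng = np.random.default_rng(0)
    toy = rng.normal(size=(32, 32, 32))
    toy[:, 15:17, :] += 3.0                              # embed a bright "sheet"
    s = sheetness(toy)
    print(float(s[:, 16, :].mean()), float(s.mean()))    # sheet voxels score higher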

  20. A comparison of evaluation metrics for biomedical journals, articles, and websites in terms of sensitivity to topic.

    PubMed

    Fu, Lawrence D; Aphinyanaphongs, Yindalon; Wang, Lily; Aliferis, Constantin F

    2011-08-01

    Evaluating the biomedical literature and health-related websites for quality are challenging information retrieval tasks. Current commonly used methods include impact factor for journals, PubMed's clinical query filters and machine learning-based filter models for articles, and PageRank for websites. Previous work has focused on the average performance of these methods without considering the topic, and it is unknown how performance varies for specific topics or focused searches. Clinicians, researchers, and users should be aware when expected performance is not achieved for specific topics. The present work analyzes the behavior of these methods for a variety of topics. Impact factor, clinical query filters, and PageRank vary widely across different topics while a topic-specific impact factor and machine learning-based filter models are more stable. The results demonstrate that a method may perform excellently on average but struggle when used on a number of narrower topics. Topic-adjusted metrics and other topic robust methods have an advantage in such situations. Users of traditional topic-sensitive metrics should be aware of their limitations. Copyright © 2011 Elsevier Inc. All rights reserved.

  1. Novel techniques for enhancement and segmentation of acne vulgaris lesions.

    PubMed

    Malik, A S; Humayun, J; Kamel, N; Yap, F B-B

    2014-08-01

    More than 99% of acne patients suffer from acne vulgaris. While diagnosing the severity of acne vulgaris lesions, dermatologists have observed inter-rater and intra-rater variability in diagnosis results. This is because, during assessment, identifying and counting lesion types is a tedious job for dermatologists. To make the assessment objective and easier for dermatologists, an automated system based on image processing methods is proposed in this study. There are two main objectives: (i) to develop an algorithm for the enhancement of various acne vulgaris lesions; and (ii) to develop a method for the segmentation of enhanced acne vulgaris lesions. For the first objective, an algorithm is developed based on the theory of high dynamic range (HDR) images. The proposed algorithm uses a local rank transform to generate HDR images from a single acne image, followed by a log transformation. Then, segmentation is performed by clustering the pixels based on the Mahalanobis distance of each pixel from spectral models of acne vulgaris lesions. Two metrics are used to evaluate the enhancement of acne vulgaris lesions, i.e., contrast improvement factor (CIF) and image contrast normalization (ICN). The proposed algorithm is compared with two other methods. The proposed enhancement algorithm shows better results than both other methods based on CIF and ICN. In addition, sensitivity and specificity are calculated for the segmentation results. The proposed segmentation method shows higher sensitivity and specificity than the other methods. This article specifically discusses the contrast enhancement and segmentation for an automated diagnosis system of acne vulgaris lesions. The results are promising and can be used for further classification of acne vulgaris lesions and final grading of the lesions. © 2013 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
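
    To illustrate the clustering step, the Python snippet below computes the Mahalanobis distance of every pixel's colour vector from one lesion spectral model and thresholds it; the spectral mean, covariance, and threshold are made-up placeholders, since the real models are estimated from labelled acne-lesion samples.

    import numpy as np

    def mahalanobis_distance(pixels, model_mean, model_cov):
        # Distance of each pixel feature vector (rows) from a spectral model.
        inv_cov = np.linalg.inv(model_cov)
        diff = pixels - model_mean
        return np.sqrt(np.einsum("ij,jk,ik->i", diff, inv_cov, diff))

    rng = np.random.default_rng(0)
    pixels = rng.uniform(0.0, 1.0, size=(1000, 3))       # flattened image, 3 channels
    lesion_mean = np.array([0.6, 0.3, 0.3])              # hypothetical lesion model
    lesion_cov = 0.01 * np.eye(3)
    is_lesion = mahalanobis_distance(pixels, lesion_mean, lesion_cov) < 3.0
    print("pixels assigned to the lesion class:", int(is_lesion.sum()))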

  2. A Classroom-Based Assessment Method to Test Speaking Skills in English for Specific Purposes

    ERIC Educational Resources Information Center

    Alberola Colomar, María Pilar

    2014-01-01

    This article presents and analyses a classroom-based assessment method to test students' speaking skills in a variety of professional settings in tourism. The assessment system has been implemented in the Communication in English for Tourism course, as part of the Tourism Management degree programme, at Florida Universitaria (affiliated to the…

  3. A prior-based integrative framework for functional transcriptional regulatory network inference

    PubMed Central

    Siahpirani, Alireza F.

    2017-01-01

    Transcriptional regulatory networks specify regulatory proteins controlling the context-specific expression levels of genes. Inference of genome-wide regulatory networks is central to understanding gene regulation, but remains an open challenge. Expression-based network inference is among the most popular methods to infer regulatory networks; however, networks inferred with such methods have low overlap with experimentally derived (e.g. ChIP-chip and transcription factor (TF) knockout) networks. Currently we have a limited understanding of this discrepancy. To address this gap, we first develop a regulatory network inference algorithm, based on probabilistic graphical models, to integrate expression with auxiliary datasets supporting a regulatory edge. Second, we comprehensively analyze our and other state-of-the-art methods on different expression perturbation datasets. Networks inferred by integrating sequence-specific motifs with expression have substantially greater agreement with experimentally derived networks, while remaining more predictive of expression than motif-based networks. Our analysis suggests natural genetic variation as the most informative perturbation for network inference, and identifies core TFs whose targets are predictable from expression. Multiple reasons make the identification of targets of other TFs difficult, including network architecture and insufficient variation of TF mRNA levels. Finally, we demonstrate the utility of our inference algorithm to infer stress-specific regulatory networks and for regulator prioritization. PMID:27794550

  4. TU-F-BRF-02: MR-US Prostate Registration Using Patient-Specific Tissue Elasticity Property Prior for MR-Targeted, TRUS-Guided HDR Brachytherapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yang, X; Rossi, P; Ogunleye, T

    2014-06-15

    Purpose: High-dose-rate (HDR) brachytherapy has become a popular treatment modality for prostate cancer. Conventional transrectal ultrasound (TRUS)-guided prostate HDR brachytherapy could benefit significantly from an MR-targeted, TRUS-guided procedure where the tumor locations, acquired from multiparametric MRI, are incorporated into the treatment planning. In order to enable this integration, we have developed an MR-TRUS registration with a patient-specific biomechanical elasticity prior. Methods: The proposed method used a biomechanical elasticity prior to guide the prostate volumetric B-spline deformation in the MRI and TRUS registration. The patient-specific biomechanical elasticity prior was generated using ultrasound elastography, where two 3D TRUS prostate images were acquired under different probe-induced pressures during the HDR procedure, which takes 2-4 minutes. These two 3D TRUS images were used to calculate the local displacement (elasticity map) of the two prostate volumes. The B-spline transformation was calculated by minimizing the Euclidean distance between the normalized attribute vectors of the prostate surface landmarks on the MR and TRUS. This technique was evaluated through two studies: a prostate-phantom study and a pilot study with 5 patients undergoing prostate HDR treatment. The accuracy of our approach was assessed through the locations of several landmarks in the post-registration and TRUS images; our registration results were compared with the surface-based method. Results: For the phantom study, the mean landmark displacement of the proposed method was 1.29±0.11 mm. For the 5 patients, the mean landmark displacement of the surface-based method was 3.25±0.51 mm; our method, 1.71±0.25 mm. Therefore, our proposed method of prostate registration outperformed the surface-based registration significantly. Conclusion: We have developed a novel MR-TRUS prostate registration approach based on a patient-specific biomechanical elasticity prior. Successful integration of multi-parametric MR and TRUS prostate images provides a prostate-cancer map for treatment planning, enables accurate dose planning and delivery, and potentially enhances prostate HDR treatment outcome.

  5. A Varian DynaLog file-based procedure for patient dose-volume histogram-based IMRT QA.

    PubMed

    Calvo-Ortega, Juan F; Teke, Tony; Moragues, Sandra; Pozo, Miquel; Casals-Farran, Joan

    2014-03-06

    In the present study, we describe a method based on the analysis of the dynamic MLC log files (DynaLog) generated by the controller of a Varian linear accelerator in order to perform patient-specific IMRT QA. The DynaLog files of a Varian Millennium MLC, recorded during an IMRT treatment, can be processed using a MATLAB-based code in order to generate the actual fluence for each beam and so recalculate the actual patient dose distribution using the Eclipse treatment planning system. The accuracy of the DynaLog-based dose reconstruction procedure was assessed by introducing ten intended errors to perturb the fluence of the beams of a reference plan such that ten subsequent erroneous plans were generated. In-phantom measurements with an ionization chamber (ion chamber) and planar dose measurements using an EPID system were performed to investigate the correlation between the measured dose changes and the expected ones detected by the reconstructed plans for the ten intended erroneous cases. Moreover, the method was applied to 20 cases of clinical plans for different locations (prostate, lung, breast, and head and neck). A dose-volume histogram (DVH) metric was used to evaluate the impact of the delivery errors in terms of dose to the patient. The ionometric measurements revealed a significant positive correlation (R² = 0.9993) between the variations of the dose induced in the erroneous plans with respect to the reference plan and the corresponding changes indicated by the DynaLog-based reconstructed plans. The EPID measurements showed that the accuracy of the DynaLog-based method to reconstruct the beam fluence was comparable with the dosimetric resolution of the portal dosimetry used in this work (3%/3 mm). The DynaLog-based reconstruction method described in this study is a suitable tool to perform a patient-specific IMRT QA. This method allows us to perform patient-specific IMRT QA by evaluating the result based on the DVH metric of the planning CT image (patient DVH-based IMRT QA).
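
    The core of the fluence reconstruction can be pictured as accumulating dose-weighted open time along each leaf-pair axis. The Python sketch below does this for a single leaf pair, assuming the leaf positions and cumulative meterset fraction have already been parsed from the log; no specific DynaLog column layout or units are implied, and the MATLAB/Eclipse workflow of the paper is not reproduced.

    import numpy as np

    def accumulate_fluence(leaf_a_mm, leaf_b_mm, meterset_fraction, grid_mm):
        # At each logged snapshot, the segment open between bank A and bank B
        # receives the meterset delivered since the previous snapshot.
        fluence = np.zeros_like(grid_mm, dtype=float)
        previous_mu = 0.0
        for a, b, mu in zip(leaf_a_mm, leaf_b_mm, meterset_fraction):
            fluence[(grid_mm > a) & (grid_mm < b)] += mu - previous_mu
            previous_mu = mu
        return fluence

    grid = np.linspace(-50, 50, 201)                     # 0.5 mm sampling
    a_pos = np.linspace(-40, -5, 50)                     # hypothetical sliding window
    b_pos = np.linspace(-20, 15, 50)
    cumulative_mu = np.linspace(0.0, 1.0, 50)
    print(float(accumulate_fluence(a_pos, b_pos, cumulative_mu, grid).max()))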

  6. Rapid analysis method for the determination of 14C specific activity in irradiated graphite

    PubMed Central

    Remeikis, Vidmantas; Lagzdina, Elena; Garbaras, Andrius; Gudelis, Arūnas; Garankin, Jevgenij; Juodis, Laurynas; Duškesas, Grigorijus; Lingis, Danielius; Abdulajev, Vladimir; Plukis, Artūras

    2018-01-01

    14C is one of the limiting radionuclides used in the categorization of radioactive graphite waste; this categorization is crucial in selecting the appropriate graphite treatment/disposal method. We propose a rapid analysis method for 14C specific activity determination in small graphite samples in the 1–100 μg range. The method applies an oxidation procedure to the sample, which extracts 14C from the different carbonaceous matrices in a controlled manner. Because this method enables fast online measurement and 14C specific activity evaluation, it can be especially useful for characterizing 14C in irradiated graphite when dismantling graphite moderator and reflector parts, or when sorting radioactive graphite waste from decommissioned nuclear power plants. The proposed rapid method is based on graphite combustion and the subsequent measurement of both CO2 and 14C, using a commercial elemental analyser and the semiconductor detector, respectively. The method was verified using the liquid scintillation counting (LSC) technique. The uncertainty of this rapid method is within the acceptable range for radioactive waste characterization purposes. The 14C specific activity determination procedure proposed in this study takes approximately ten minutes, comparing favorably to the more complicated and time consuming LSC method. This method can be potentially used to radiologically characterize radioactive waste or used in biomedical applications when dealing with the specific activity determination of 14C in the sample. PMID:29370233
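
    The arithmetic behind the specific-activity figure is straightforward; the Python sketch below combines a net 14C count rate, an assumed detector efficiency, and the carbon mass inferred from the evolved CO2. The numbers and the omission of background and calibration corrections are illustrative assumptions.

    def c14_specific_activity(net_count_rate_cps, detector_efficiency, co2_mass_ug):
        # activity (Bq) = net count rate / detector efficiency
        # carbon mass (g) = CO2 mass x 12/44
        activity_bq = net_count_rate_cps / detector_efficiency
        carbon_mass_g = co2_mass_ug * 1e-6 * 12.0 / 44.0
        return activity_bq / carbon_mass_g               # Bq per gram of carbon

    # Example: 2.5 cps net, 30% detection efficiency, 80 micrograms of CO2 evolved
    print(f"{c14_specific_activity(2.5, 0.30, 80.0):.3e} Bq/g C")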

  7. Rapid analysis method for the determination of 14C specific activity in irradiated graphite.

    PubMed

    Remeikis, Vidmantas; Lagzdina, Elena; Garbaras, Andrius; Gudelis, Arūnas; Garankin, Jevgenij; Plukienė, Rita; Juodis, Laurynas; Duškesas, Grigorijus; Lingis, Danielius; Abdulajev, Vladimir; Plukis, Artūras

    2018-01-01

    14C is one of the limiting radionuclides used in the categorization of radioactive graphite waste; this categorization is crucial in selecting the appropriate graphite treatment/disposal method. We propose a rapid analysis method for 14C specific activity determination in small graphite samples in the 1-100 μg range. The method applies an oxidation procedure to the sample, which extracts 14C from the different carbonaceous matrices in a controlled manner. Because this method enables fast online measurement and 14C specific activity evaluation, it can be especially useful for characterizing 14C in irradiated graphite when dismantling graphite moderator and reflector parts, or when sorting radioactive graphite waste from decommissioned nuclear power plants. The proposed rapid method is based on graphite combustion and the subsequent measurement of both CO2 and 14C, using a commercial elemental analyser and the semiconductor detector, respectively. The method was verified using the liquid scintillation counting (LSC) technique. The uncertainty of this rapid method is within the acceptable range for radioactive waste characterization purposes. The 14C specific activity determination procedure proposed in this study takes approximately ten minutes, comparing favorably to the more complicated and time consuming LSC method. This method can be potentially used to radiologically characterize radioactive waste or used in biomedical applications when dealing with the specific activity determination of 14C in the sample.

  8. Meat authentication: a new HPLC-MS/MS based method for the fast and sensitive detection of horse and pork in highly processed food.

    PubMed

    von Bargen, Christoph; Brockmeyer, Jens; Humpf, Hans-Ulrich

    2014-10-01

    Fraudulent blending of food products with meat from undeclared species is a problem on a global scale, as exemplified by the European horse meat scandal in 2013. Routinely used methods such as ELISA and PCR can suffer from limited sensitivity or specificity when processed food samples are analyzed. In this study, we have developed an optimized method for the detection of horse and pork in different processed food matrices using MRM and MRM(3) detection of species-specific tryptic marker peptides. Identified marker peptides were sufficiently stable to resist thermal processing of different meat products and thus allow the sensitive and specific detection of pork or horse in processed food down to 0.24% in a beef matrix system. In addition, we were able to establish a rapid 2-min extraction protocol for the efficient protein extraction from processed food using high molar urea and thiourea buffers. Together, we present here the specific and sensitive detection of horse and pork meat in different processed food matrices using MRM-based detection of marker peptides. Notably, prefractionation of proteins using 2D-PAGE or off-gel fractionation is not necessary. The presented method is therefore easily applicable in analytical routine laboratories without dedicated proteomics background.

  9. The ratio method: A new tool to study one-neutron halo nuclei

    DOE PAGES

    Capel, Pierre; Johnson, R. C.; Nunes, F. M.

    2013-10-02

    Recently a new observable to study halo nuclei was introduced, based on the ratio between breakup and elastic angular cross sections. This new observable is shown by the analysis of specific reactions to be independent of the reaction mechanism and to provide nuclear-structure information of the projectile. Here we explore the details of this ratio method, including the sensitivity to binding energy and angular momentum of the projectile. We also study the reliability of the method with breakup energy. Lastly, we provide guidelines and specific examples for experimentalists who wish to apply this method.

  10. Ensemble stacking mitigates biases in inference of synaptic connectivity.

    PubMed

    Chambers, Brendan; Levy, Maayan; Dechery, Joseph B; MacLean, Jason N

    2018-01-01

    A promising alternative to directly measuring the anatomical connections in a neuronal population is inferring the connections from the activity. We employ simulated spiking neuronal networks to compare and contrast commonly used inference methods that identify likely excitatory synaptic connections using statistical regularities in spike timing. We find that simple adjustments to standard algorithms improve inference accuracy: A signing procedure improves the power of unsigned mutual-information-based approaches and a correction that accounts for differences in mean and variance of background timing relationships, such as those expected to be induced by heterogeneous firing rates, increases the sensitivity of frequency-based methods. We also find that different inference methods reveal distinct subsets of the synaptic network and each method exhibits different biases in the accurate detection of reciprocity and local clustering. To correct for errors and biases specific to single inference algorithms, we combine methods into an ensemble. Ensemble predictions, generated as a linear combination of multiple inference algorithms, are more sensitive than the best individual measures alone, and are more faithful to ground-truth statistics of connectivity, mitigating biases specific to single inference methods. These weightings generalize across simulated datasets, emphasizing the potential for the broad utility of ensemble-based approaches.
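
    A simplified reading of the stacking idea in the abstract: treat each candidate edge as a sample whose features are the scores assigned by the individual inference methods, and learn a linear weighting against known connections. The synthetic score matrices below stand in for real inference outputs, and fitting and evaluating on the same network (done here for brevity) is not how one would validate the ensemble.

    import numpy as np
    from sklearn.linear_model import LogisticRegression

    def stack_inferences(score_matrices, known_edges):
        # One row per candidate edge; one column per inference method.
        X = np.column_stack([m.ravel() for m in score_matrices])
        y = known_edges.ravel().astype(int)
        model = LogisticRegression(max_iter=1000).fit(X, y)
        ensemble_scores = model.predict_proba(X)[:, 1].reshape(known_edges.shape)
        return ensemble_scores, model.coef_.ravel()

    rng = np.random.default_rng(0)
    truth = (rng.uniform(size=(50, 50)) < 0.1).astype(int)           # sparse toy network
    methods = [truth + rng.normal(0.0, s, truth.shape) for s in (0.5, 0.8, 1.2)]
    scores, weights = stack_inferences(methods, truth)
    print("learned weight per method:", np.round(weights, 2))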

  11. ELEGANT ENVIRONMENTAL IMMUNOASSAYS

    EPA Science Inventory

    Immunochemical methods are based on selective antibodies directed to a particular target analyte. The specific binding between antibody and analyte can be used for detection and quantitation. Methods such as the enzyme-linked immunosorbent assay (ELISA) can provide a sensitiv...

  12. Comparison of 16S rDNA-based PCR and checkerboard DNA-DNA hybridisation for detection of selected endodontic pathogens.

    PubMed

    Siqueira, José F; Rôças, Isabela N; De Uzeda, Milton; Colombo, Ana P; Santos, Kátia R N

    2002-12-01

    Molecular methods have been used recently to investigate the bacteria encountered in human endodontic infections. The aim of the present study was to compare the ability of a 16S rDNA-based PCR assay and checkerboard DNA-DNA hybridisation to detect Actinobacillus actinomycetemcomitans, Bacteroides forsythus, Peptostreptococcus micros, Porphyromonas endodontalis, Por. gingivalis and Treponema denticola directly from clinical samples. Specimens were obtained from 50 cases of endodontic infections, and the presence of the target species was investigated by whole genomic DNA probes and checkerboard DNA-DNA hybridisation or by taxon-specific oligonucleotides with a PCR assay. Prevalence of the target species was based on data obtained by each method. The sensitivity and specificity of each molecular method were compared with the data generated by the other method as the reference, a value of 1.0 representing total agreement with the chosen standard. The methods were also compared with regard to the prevalence values for each target species. Regardless of the detection method used, T. denticola, Por. gingivalis, Por. endodontalis and B. forsythus were the most prevalent species. If the checkerboard data for these four species were used as the reference, PCR detection sensitivities ranged from 0.53 to 1.0, and specificities from 0.5 to 0.88, depending on the target bacterial species. When PCR data for the same species were used as the reference, the detection sensitivities for the checkerboard method ranged from 0.17 to 0.73, and specificities from 0.75 to 1.0. Accuracy values ranged from 0.6 to 0.74. On the whole, matching results between the two molecular methods ranged from 60% to 97.5%, depending on the target species. The major discrepancies between the methods comprised a number of PCR-positive but checkerboard-negative results. Significantly higher prevalence figures for Por. endodontalis and T. denticola were observed after PCR assessment. There was no further significant difference between the methods with regard to detection of the other target species.
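
    The cross-comparison described above (each method scored against the other as the reference) reduces to a two-by-two table per species; the Python sketch below computes sensitivity and specificity for one such comparison with simulated detection calls, which are placeholders for real PCR and checkerboard results.

    import numpy as np

    def sensitivity_specificity(test_calls, reference_calls):
        # Test method scored against the reference standard (boolean detection calls).
        test = np.asarray(test_calls, dtype=bool)
        reference = np.asarray(reference_calls, dtype=bool)
        tp = np.sum(test & reference)
        tn = np.sum(~test & ~reference)
        fn = np.sum(~test & reference)
        fp = np.sum(test & ~reference)
        return tp / (tp + fn), tn / (tn + fp)

    rng = np.random.default_rng(2)
    checkerboard = rng.uniform(size=50) < 0.4            # reference calls, 50 samples
    pcr = checkerboard ^ (rng.uniform(size=50) < 0.15)   # mostly agreeing test calls
    sens, spec = sensitivity_specificity(pcr, checkerboard)
    print(f"sensitivity {sens:.2f}, specificity {spec:.2f}")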

  13. Laboratory study of test methods for polymer modified asphalt in hot mix pavement.

    DOT National Transportation Integrated Search

    1989-11-01

    Increasing use of asphalt binders modified with elastomeric or plastic modifiers makes the specification of binders a difficult task. Ideally, a generic specification would allow various suppliers and additives to compete based on expected performanc...

  14. Muscle parameters estimation based on biplanar radiography.

    PubMed

    Dubois, G; Rouch, P; Bonneau, D; Gennisson, J L; Skalli, W

    2016-11-01

    The evaluation of muscle and joint forces in vivo is still a challenge. Musculo-skeletal models are used to compute forces based on movement analysis. Most of them are built from a scaled generic model based on cadaver measurements, which provides a low level of personalization, or from magnetic resonance images, which provide a personalized model in the lying position. This study proposes an original two-step method to obtain a subject-specific musculo-skeletal model in 30 min, based solely on biplanar X-rays. First, the subject-specific 3D geometry of bones and skin envelopes is reconstructed from biplanar X-ray radiography. Then, 2200 corresponding control points are identified between a reference model and the subject-specific X-ray model. Finally, the shape of 21 lower-limb muscles is estimated using a non-linear transformation between the control points in order to fit the muscle shape of the reference model to the X-ray model. Twelve musculo-skeletal models were reconstructed and compared to their reference. Muscle volume was not accurately estimated, with a standard deviation (SD) ranging from 10 to 68%. However, the method provided an accurate estimation of the muscle line of action, with an SD of the length difference lower than 2% and a positioning error lower than 20 mm. The moment arm was also well estimated, with an SD lower than 15% for most muscles, which was significantly better than the scaled-generic model for most muscles. This method opens the way to a quick modeling approach for gait analysis based on biplanar radiography.

  15. Evaluation and comparison of statistical methods for early temporal detection of outbreaks: A simulation-based study

    PubMed Central

    Le Strat, Yann

    2017-01-01

    The objective of this paper is to evaluate a panel of statistical algorithms for temporal outbreak detection. Based on a large dataset of simulated weekly surveillance time series, we performed a systematic assessment of 21 statistical algorithms, 19 implemented in the R package surveillance and two other methods. We estimated the false positive rate (FPR), probability of detection (POD), probability of detection during the first week, sensitivity, specificity, negative and positive predictive values and F1-measure for each detection method. Then, to identify the factors associated with these performance measures, we ran multivariate Poisson regression models adjusted for the characteristics of the simulated time series (trend, seasonality, dispersion, outbreak sizes, etc.). The FPR ranged from 0.7% to 59.9% and the POD from 43.3% to 88.7%. Some methods had a very high specificity, up to 99.4%, but a low sensitivity. Methods with a high sensitivity (up to 79.5%) had a low specificity. All methods had a high negative predictive value, over 94%, while positive predictive values ranged from 6.5% to 68.4%. Multivariate Poisson regression models showed that performance measures were strongly influenced by the characteristics of the time series. Past or current outbreak size and duration strongly influenced detection performance. PMID:28715489
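
    To show how such performance measures are obtained, the Python sketch below runs a deliberately simple mean-plus-two-standard-deviations detector (not one of the 21 evaluated algorithms, which live in the R package surveillance) over a simulated weekly series with one injected outbreak, then reports the share of outbreak weeks flagged and the false positive rate on the remaining weeks.

    import numpy as np

    def threshold_detector(counts, window=52, z=2.0):
        # Flag a week whose count exceeds the mean of the preceding `window`
        # weeks by more than z standard deviations.
        alarms = np.zeros(len(counts), dtype=bool)
        for t in range(window, len(counts)):
            history = counts[t - window:t]
            alarms[t] = counts[t] > history.mean() + z * history.std()
        return alarms

    rng = np.random.default_rng(3)
    weekly_counts = rng.poisson(5, size=300).astype(float)
    outbreak_weeks = np.zeros(300, dtype=bool)
    outbreak_weeks[200:205] = True
    weekly_counts[outbreak_weeks] += 15.0                # injected outbreak signal
    alarms = threshold_detector(weekly_counts)
    detected_share = alarms[outbreak_weeks].mean()       # share of outbreak weeks flagged
    false_positive_rate = alarms[~outbreak_weeks].mean()
    print(float(detected_share), round(float(false_positive_rate), 3))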

  16. [Enzymatic methods in the analysis of musts and wines].

    PubMed

    Lafon-Lafourcade, S

    1978-01-01

    The enzymatic methods are based on the property of enzymes to catalyse specifically and reversibly the conversion of certain metabolites. These methods, developed thanks to the industrial preparation of enzymes, can be applied with no major modification to the analysis of beverages. About 15 constituents of musts and wines can now be determined by these methods. Were their cost not relatively high, their specificity, sensitivity and rapidity would enable them to compete with the most precise chemical methods. This is why they are only used in analytical oenology when chemical analysis is not specific enough or too laborious. Thanks to its specificity, enzymatic measurement allows one to determine the amount of fermentable residual sugar in a dry wine, and thanks to its sensitivity, to verify the total disappearance of malic acid from the wine. Its rapidity makes it preferable to the long and not very specific chemical measurement, especially for the determination of citric acid. But glycerol, ethanol and acetic acid can be measured by chemical or chromatographic means with sufficient precision and at a more modest price. In oenology the methods are essentially used for research. They have permitted the study of the combinations of sulphur dioxide in wines (measurement of ketonic acids). The determination of the isomeric nature of the lactic acid produced from sugars by lactic acid bacteria is based on their application; this determination is a criterion for the identification and classification of these microorganisms. The measurement of lactic acid during vinification allows early detection of the first effects of bacterial development; conversely, it permits ruling out lactic spoilage, which a high volatile acidity might suggest. Lastly, the enzymatic measurement of gluconic acid allows the health of the crop to be assessed.

  17. Enzymatic determination of carbon-14 labeled L-alanine in biological samples

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Serra, F.; Palou, A.; Pons, A.

    A method for the determination of L-alanine-specific radioactivity in biological samples is presented. The method is based on the specific enzymatic transformation of L-alanine to pyruvic acid hydrazone catalyzed by the enzyme L-alanine dehydrogenase, formation of the pyruvic acid 2,4-dinitrophenylhydrazone derivative, and quantitative trapping on Amberlite XAD-7 columns, followed by radioactivity counting of the lipophilic eluate. No interference from other 14C-labeled materials such as D-glucose, glycerol, L-lactate, L-serine, L-glutamate, L-phenylalanine, glycine, L-leucine, and L-arginine was observed. This inexpensive and rapid method is applicable to the simultaneous determination of L-alanine-specific radioactivity for a large number of samples.

  18. Enhancing requirements engineering for patient registry software systems with evidence-based components.

    PubMed

    Lindoerfer, Doris; Mansmann, Ulrich

    2017-07-01

    Patient registries are instrumental for medical research. Often their structures are complex and their implementations use composite software systems to meet a wide spectrum of challenges. Commercial and open-source systems are available for registry implementation, but many research groups develop their own systems. Methodological approaches to the selection of software as well as to the construction of proprietary systems are needed. We propose an evidence-based checklist summarizing essential items for patient registry software systems (CIPROS) to accelerate the requirements engineering process. Requirements engineering activities for software systems follow traditional software requirements elicitation methods, general software requirements specification (SRS) templates, and standards. We performed a multistep procedure to develop a specific evidence-based CIPROS checklist: (1) a systematic literature review to build a comprehensive collection of technical concepts, (2) a qualitative content analysis to define a catalogue of relevant criteria, and (3) a checklist to construct a minimal appraisal standard. CIPROS is based on 64 publications and covers twelve sections with a total of 72 items. CIPROS also defines software requirements. Comparing CIPROS with traditional software requirements elicitation methods, SRS templates and standards shows a broad consensus but also differences in issues regarding registry-specific aspects. Using an evidence-based approach to requirements engineering for registry software adds aspects to the traditional methods and accelerates the software engineering process for registry software. The method we used to construct CIPROS serves as a potential template for creating evidence-based checklists in other fields. The CIPROS list supports developers in assessing requirements for existing systems and formulating requirements for their own systems, while strengthening the reporting of patient registry software system descriptions. It may be a first step towards creating standards for patient registry software system assessments. Copyright © 2017 Elsevier Inc. All rights reserved.

  19. RNA-templated single-base mutation detection based on T4 DNA ligase and reverse molecular beacon.

    PubMed

    Tang, Hongxing; Yang, Xiaohai; Wang, Kemin; Tan, Weihong; Li, Huimin; He, Lifang; Liu, Bin

    2008-06-15

    A novel RNA-templated single-base mutation detection method based on T4 DNA ligase and a reverse molecular beacon (rMB) has been developed and successfully applied to the identification of a single-base mutation in codon 273 of the p53 gene. The discrimination was carried out using allele-specific primers, which flanked the variable position in the target RNA and were ligated by T4 DNA ligase only when the primers perfectly matched the RNA template. The allele-specific primers also carried complementary stem structures with end-labels (fluorophore TAMRA, quencher DABCYL), which formed a molecular beacon after RNase H digestion. A one-base mismatch can be discriminated by analyzing the change in fluorescence intensity before and after RNase H digestion. This method has several advantages for practical applications, such as direct discrimination of single-base mismatches in RNA extracted from cells, no requirement for PCR amplification, homogeneous detection, and easy design of detection probes.

  20. Cause-Specific Mortality and Death Certificate Reporting in Adults with Moderate to Profound Intellectual Disability

    ERIC Educational Resources Information Center

    Tyrer, F.; McGrother, C.

    2009-01-01

    Background: The study of premature deaths in people with intellectual disability (ID) has become the focus of recent policy initiatives in England. This is the first UK population-based study to explore cause-specific mortality in adults with ID compared with the general population. Methods: Cause-specific standardised mortality ratios (SMRs) and…

  1. Colorimetric Detection of Ehrlichia Canis via Nucleic Acid Hybridization in Gold Nano-Colloids

    PubMed Central

    Muangchuen, Ajima; Chaumpluk, Piyasak; Suriyasomboon, Annop; Ekgasit, Sanong

    2014-01-01

    Canine monocytic ehrlichiosis (CME) is a major tick-borne disease of dogs caused by Ehrlichia canis. Detection of this causal agent outside the laboratory using conventional methods is not effective enough. Thus, an assay for E. canis detection based on the p30 outer membrane protein gene was developed. It relies on amplification of the p30 gene using loop-mediated isothermal DNA amplification (LAMP). A primer set specific to six regions within the target gene was designed and tested for sensitivity and specificity. Detection of the DNA signal was based on modulation of the gold nanoparticles' surface properties and on DNA/DNA hybridization using an oligonucleotide probe. The presence of target DNA caused aggregation of the gold colloid nanoparticles, with a plasmonic color change from ruby red to purple visible to the naked eye. All assay steps, including DNA extraction, were completed within 90 min without relying on standard laboratory facilities. The method was very specific to the target bacteria, and its sensitivity with probe hybridization was sufficient to detect 50 copies of target DNA. This method should provide an alternative choice for point-of-care control and management of the disease. PMID:25111239

  2. Colorimetric detection of Ehrlichia canis via nucleic acid hybridization in gold nano-colloids.

    PubMed

    Muangchuen, Ajima; Chaumpluk, Piyasak; Suriyasomboon, Annop; Ekgasit, Sanong

    2014-08-08

    Canine monocytic ehrlichiosis (CME) is a major tick-borne disease of dogs caused by Ehrlichia canis. Detection of this causal agent outside the laboratory using conventional methods is not effective enough. Thus, an assay for E. canis detection based on the p30 outer membrane protein gene was developed. It relies on amplification of the p30 gene using loop-mediated isothermal DNA amplification (LAMP). A primer set specific to six regions within the target gene was designed and tested for sensitivity and specificity. Detection of the DNA signal was based on modulation of the gold nanoparticles' surface properties and on DNA/DNA hybridization using an oligonucleotide probe. The presence of target DNA caused aggregation of the gold colloid nanoparticles, with a plasmonic color change from ruby red to purple visible to the naked eye. All assay steps, including DNA extraction, were completed within 90 min without relying on standard laboratory facilities. The method was very specific to the target bacteria, and its sensitivity with probe hybridization was sufficient to detect 50 copies of target DNA. This method should provide an alternative choice for point-of-care control and management of the disease.

  3. Analysis of evolutionary conservation patterns and their influence on identifying protein functional sites.

    PubMed

    Fang, Chun; Noguchi, Tamotsu; Yamana, Hayato

    2014-10-01

    Evolutionary conservation information included in the position-specific scoring matrix (PSSM) has been widely adopted by sequence-based methods for identifying protein functional sites, because all functional sites, whether in ordered or disordered proteins, are conserved to some extent. However, different functional sites have different conservation patterns: some are linearly contextual, some are mingled with highly variable residues, and others appear to be conserved independently. Each value in a PSSM is calculated independently of the others and therefore carries no contextual information about neighboring residues in the sequence. Consequently, adopting the direct output of a PSSM for prediction fails to consider the relationship between the conservation patterns of residues and the distribution of conservation scores in the PSSM. To demonstrate the importance of combining PSSMs with the specific conservation patterns of functional sites for prediction, three different PSSM-based methods for identifying three kinds of functional sites have been analyzed. The results suggest that different PSSM-based methods differ in their capability to identify different patterns of functional sites, and that better combining PSSMs with the specific conservation patterns of residues would greatly facilitate prediction.
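
    One concrete reading of the abstract's point that each PSSM value is computed independently is to add sequence context explicitly, for example by concatenating the PSSM rows of neighbouring residues into a sliding-window feature vector. The sketch below illustrates that idea only; the window size, zero-padding and helper name are assumptions, not the analyzed methods' implementations.

    ```python
    import numpy as np

    def windowed_pssm_features(pssm, window=7):
        """Sliding-window feature vectors from an (L x 20) PSSM.

        Concatenating the rows of neighbouring residues is one simple way to add
        the sequence context that a single, independently computed PSSM row lacks;
        terminal positions are zero-padded. Illustrative only.
        """
        half = window // 2
        pad = np.zeros((half, pssm.shape[1]))
        padded = np.vstack([pad, pssm, pad])
        return np.array([padded[i:i + window].ravel() for i in range(pssm.shape[0])])

    # Toy PSSM for a 10-residue sequence (random scores stand in for real ones).
    features = windowed_pssm_features(np.random.randn(10, 20), window=7)
    print(features.shape)  # -> (10, 140): each residue now carries its neighbours' scores
    ```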

  4. Subject-based feature extraction by using fisher WPD-CSP in brain-computer interfaces.

    PubMed

    Yang, Banghua; Li, Huarong; Wang, Qian; Zhang, Yunyuan

    2016-06-01

    Feature extraction of the electroencephalogram (EEG) plays a vital role in brain-computer interfaces (BCIs). In recent years, the common spatial pattern (CSP) has been proven to be an effective feature extraction method. However, traditional CSP has the disadvantages of requiring many input channels and lacking frequency information. To remedy these defects, wavelet packet decomposition (WPD) and CSP can be combined to extract effective features, but the WPD-CSP method gives little consideration to extracting features tailored to the specific subject. A subject-based feature extraction method using Fisher WPD-CSP is therefore proposed in this paper. The idea of the proposed method is to adapt Fisher WPD-CSP to each subject separately. It mainly includes the following six steps: (1) original EEG signals from all channels are decomposed into a series of sub-bands using WPD; (2) the average power values of the obtained sub-bands are computed; (3) the sub-bands with larger Fisher-distance values according to average power are selected for that particular subject; (4) each selected sub-band is reconstructed and regarded as a new EEG channel; (5) all new EEG channels are used as input to the CSP, which yields a six-dimensional feature vector, thereby forming the subject-based feature extraction model; and (6) a probabilistic neural network (PNN) is used as the classifier and the classification accuracy is obtained. Data from six subjects were processed by the subject-based Fisher WPD-CSP, the non-subject-based Fisher WPD-CSP and WPD-CSP, respectively. Compared with non-subject-based Fisher WPD-CSP and WPD-CSP, the results show that the proposed method yields better performance (sensitivity: 88.7±0.9%, specificity: 91±1%), and the classification accuracy of subject-based Fisher WPD-CSP is increased by 6-12% and 14%, respectively. The proposed subject-based Fisher WPD-CSP method not only remedies the disadvantages of CSP through WPD but also discards unhelpful sub-bands for each subject, so that the remaining, fewer sub-bands retain better separability by Fisher distance, leading to higher classification accuracy than the WPD-CSP method. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
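
    A compact sketch of the subject-specific pipeline described above, written against pywt, NumPy and SciPy: wavelet packet decomposition per channel, average sub-band power, Fisher-distance scoring of (channel, sub-band) pairs, and a classic CSP step. For brevity the CSP filters are computed on the raw channels rather than on reconstructed sub-band channels, and the wavelet, decomposition level and toy two-class data are assumptions rather than the paper's configuration.

    ```python
    import numpy as np
    import pywt
    from scipy.linalg import eigh

    def subband_power(trial, wavelet="db4", level=3):
        """Average power of each terminal WPD node for every channel.
        trial: (n_channels, n_samples) -> (n_channels, n_bands)."""
        powers = []
        for ch in trial:
            wp = pywt.WaveletPacket(ch, wavelet=wavelet, maxlevel=level)
            nodes = wp.get_level(level, order="freq")
            powers.append([np.mean(node.data ** 2) for node in nodes])
        return np.asarray(powers)

    def fisher_score(p_a, p_b):
        """Fisher distance of band power between two classes (per feature)."""
        return (p_a.mean(0) - p_b.mean(0)) ** 2 / (p_a.var(0) + p_b.var(0) + 1e-12)

    def csp_filters(cov_a, cov_b, n_pairs=3):
        """Classic CSP: generalized eigenvectors of (cov_a, cov_a + cov_b)."""
        vals, vecs = eigh(cov_a, cov_a + cov_b)
        order = np.argsort(vals)
        return vecs[:, np.r_[order[:n_pairs], order[-n_pairs:]]].T

    # Toy two-class data: trials x channels x samples.
    rng = np.random.default_rng(0)
    class_a = rng.standard_normal((20, 8, 512))
    class_b = rng.standard_normal((20, 8, 512))

    # Steps 1-3: sub-band power and Fisher-distance ranking of (channel, band) pairs.
    pow_a = np.stack([subband_power(t) for t in class_a])   # (trials, channels, bands)
    pow_b = np.stack([subband_power(t) for t in class_b])
    scores = fisher_score(pow_a.reshape(20, -1), pow_b.reshape(20, -1))
    top = np.argsort(scores)[-4:]
    print("top (channel, band) pairs:", [divmod(int(i), pow_a.shape[2]) for i in top])

    # Step 5 (simplified): CSP spatial filters and a 6-dimensional log-variance feature.
    cov_a = np.mean([t @ t.T / t.shape[1] for t in class_a], axis=0)
    cov_b = np.mean([t @ t.T / t.shape[1] for t in class_b], axis=0)
    W = csp_filters(cov_a, cov_b)
    print("CSP feature vector:", np.log(np.var(W @ class_a[0], axis=1)).round(2))
    ```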

  5. Development of a rapid, sensitive and specific diagnostic assay for fish Aquareovirus based on RT-PCR.

    PubMed

    Seng, E K; Fang, Q; Lam, T J; Sin, Y M

    2004-06-15

    A rapid, sensitive and highly specific detection method for Aquareovirus based on reverse-transcription polymerase chain reaction (RT-PCR) was developed. Based on multiple sequence alignment of the cloned sequences of local isolates, the Threadfin reovirus (TFV) and Guppy reovirus (GPV), with Grass carp reovirus (GCRV), a pair of degenerate primers was carefully selected and synthesized. Using this primer combination, only one specific product, approximately 450 bp in length, was obtained when RT-PCR was carried out on the genomic double-stranded RNA (dsRNA) of TFV, GPV and GCRV. Similar results were obtained when Chum salmon reovirus (CSRV) and Striped bass reovirus (SBRV) dsRNA were used as templates. No products were observed when nucleic acids other than the dsRNA of the aquareoviruses described above were used as RT-PCR templates. This technique could detect not only TFV but also GPV and GCRV in low-titer virus-infected cell cultures. Furthermore, the method was shown to be able to diagnose GPV-infected guppies (Poecilia reticulata) exhibiting clinical symptoms as well as GPV-carrier guppies. Collectively, these results show that the RT-PCR amplification method using the specific degenerate primers described here is very useful for rapid and accurate detection of a variety of aquareovirus strains isolated from different host species and origins.

  6. SPECIATION OF ORGANICS IN WATER

    EPA Science Inventory

    We describe herein a method for determining constants for simultaneously occurring, site-specific "microequilibria" (as with tautomers) for organics in water. The method is based in part on modeling temperature-variant Raman spectra according to the van't Hoff equation....

  7. IMMUNOCHEMICAL APPLICATIONS IN ENVIRONMENTAL SCIENCE

    EPA Science Inventory

    Immunochemical methods are based on selective antibodies combining with a particular target analyte or analyte group. The specific binding between antibody and analyte can be used to detect environmental contaminants in a variety of sample matrixes. Immunoassay methods provide ...

  8. Structured output-feedback controller synthesis with design specifications

    NASA Astrophysics Data System (ADS)

    Hao, Yuqing; Duan, Zhisheng

    2017-03-01

    This paper considers the problem of structured output-feedback controller synthesis with finite frequency specifications. Based on the orthogonal space information of the input matrix, an improved parameter-dependent Lyapunov function method is first proposed. Then, a two-stage construction method is designed, which depends on an initial centralised controller. Corresponding design conditions for three types of output-feedback controllers are presented in terms of unified representations. Moreover, heuristic algorithms are provided to explore the desirable controllers. Finally, the effectiveness of the proposed methods is illustrated via some practical examples.

  9. Male contraception: what is on the horizon?

    PubMed

    Blithe, Diana

    2008-10-01

    Male contraception remains an important area of research. Methods can inhibit sperm production or can be targeted to inhibit sperm functions such as motility, orientation or interaction with the egg. Hormonal methods appear to be safe and effective in proof of concept studies but efforts are underway to improve delivery options or lead time until full efficacy is achieved. Nonhormonal methods are based on numerous targets that impact sperm production or function. Several agents that inhibit the sperm-specific or testis-specific targets have been identified and studies in animals have shown promising results.

  10. SU-F-J-06: Optimized Patient Inclusion for NaF PET Response-Based Biopsies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Roth, A; Harmon, S; Perk, T

    Purpose: A method to guide mid-treatment biopsies using quantitative [F-18]NaF PET/CT response is being investigated in a clinical trial. This study aims to develop methodology to identify patients amenable to mid-treatment biopsy based on pre-treatment imaging characteristics. Methods: 35 metastatic prostate cancer patients had NaF PET/CT scans taken prior to the start of treatment and 9–12 weeks into treatment. For mid-treatment biopsy targeting, lesions must be at least 1.5 cm³ and located in a clinically feasible region (lumbar/sacral spine, pelvis, humerus, or femur). Three methods were developed based on the number of lesions present prior to treatment: a feasibility-restricted method, a location-restricted method, and an unrestricted method. The feasibility-restricted method only utilizes information from lesions meeting biopsy requirements in the pre-treatment scan. The unrestricted method accounts for all lesions present in the pre-treatment scan. For each method, optimized classification cutoffs for candidate patients were determined. Results: 13 of the 35 patients had enough lesions at mid-treatment to be biopsy candidates. Of 1749 lesions identified in all 35 patients at mid-treatment, only 9.8% were amenable to biopsy. Optimizing the feasibility-restricted method required 4 lesions at pre-treatment meeting the volume and region requirements for biopsy, resulting in a patient identification sensitivity of 0.8 and specificity of 0.7. Of 6 false-positive patients, only one lacked lesions for biopsy. Restricting for location alone showed poor results (sensitivity 0.2 and specificity 0.3). The optimized unrestricted method required patients to have at least 37 lesions in the pre-treatment scan, resulting in a sensitivity of 0.8 and a specificity of 0.8. There were 5 false positives, of which only one lacked lesions for biopsy. Conclusion: Incorporating the overall pre-treatment number of NaF PET/CT identified lesions provided the best prediction for identifying candidate patients for mid-treatment biopsy. This study provides validity for prediction-based inclusion criteria that can be extended to various clinical trial scenarios. Funded by the Prostate Cancer Foundation.
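
    The patient-selection step amounts to choosing a pre-treatment lesion-count cutoff that trades off sensitivity against specificity. The sketch below picks the cutoff maximizing Youden's J on synthetic data; the counts, the "truth" labels and the criterion are illustrative assumptions, not the trial's analysis.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    lesion_counts = rng.poisson(30, size=35)                 # pre-treatment lesions per patient
    amenable = (lesion_counts + rng.normal(0, 8, 35)) > 30   # synthetic mid-treatment "truth"

    best_j, best_cut = -1.0, None
    for cut in range(int(lesion_counts.min()), int(lesion_counts.max()) + 1):
        pred = lesion_counts >= cut           # predict "biopsy candidate" above the cutoff
        sens = np.mean(pred[amenable])        # true-positive rate
        spec = np.mean(~pred[~amenable])      # true-negative rate
        if sens + spec - 1 > best_j:          # Youden's J
            best_j, best_cut = sens + spec - 1, cut

    print("best lesion-count cutoff:", best_cut, " Youden's J:", round(best_j, 2))
    ```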

  11. NASA software specification and evaluation system design, part 2

    NASA Technical Reports Server (NTRS)

    1976-01-01

    A survey and analysis of the existing methods, tools and techniques employed in the development of software are presented, along with recommendations for the construction of reliable software. Functional designs for a software specification language and a database verifier are presented.

  12. Grain-Boundary Resistance in Copper Interconnects: From an Atomistic Model to a Neural Network

    NASA Astrophysics Data System (ADS)

    Valencia, Daniel; Wilson, Evan; Jiang, Zhengping; Valencia-Zapata, Gustavo A.; Wang, Kuang-Chung; Klimeck, Gerhard; Povolotskyi, Michael

    2018-04-01

    Orientation effects on the specific resistance of copper grain boundaries are studied systematically with two different atomistic tight-binding methods. A methodology is developed to model the specific resistance of grain boundaries in the ballistic limit using the embedded atom model, tight-binding methods, and nonequilibrium Green's functions. The methodology is validated against first-principles calculations for thin films with a single coincident grain boundary, with 6.4% deviation in the specific resistance. A statistical ensemble of 600 large, random structures with grains is studied. For structures with three grains, it is found that the distribution of specific resistances is close to normal. Finally, a compact model for grain-boundary-specific resistance is constructed based on a neural network.

  13. On the track for an efficient detection of Escherichia coli in water: A review on PCR-based methods.

    PubMed

    Mendes Silva, Diana; Domingues, Lucília

    2015-03-01

    Ensuring water safety is an ongoing challenge to public health providers. Assessing the presence of fecal contamination indicators in water is essential to protect public health from diseases caused by waterborne pathogens. For this purpose, the bacterium Escherichia coli has been used as the most reliable indicator of fecal contamination in water. The methods currently in use for monitoring the microbiological safety of water are based on culturing the microorganisms. However, these methods are not the desirable solution to prevent outbreaks, as they provide results with considerable delay and lack specificity and sensitivity. Moreover, viable but non-culturable microorganisms, which may be present as a result of environmental stress or water treatment processes, are not detected by culture-based methods and thus may result in false-negative assessments of E. coli in water samples. These limitations may place public health at significant risk, leading to substantial monetary losses in health care and, additionally, in costs related to reduced productivity in the area affected by the outbreak and in costs borne by the water quality control departments involved. Molecular methods, particularly polymerase chain reaction-based methods, have been studied as an alternative technology to overcome the current limitations, as they offer the possibility to reduce the assay time, to improve detection sensitivity and specificity, and to identify multiple targets and pathogens, including new or emerging strains. The variety of techniques and applications available for PCR-based methods has increased considerably and the costs involved have been substantially reduced, which together have contributed to the potential standardization of these techniques. However, they still require further refinement in order to be standardized and applied to the variety of environmental waters and their specific characteristics. The PCR-based methods under development for monitoring the presence of E. coli in water are discussed here. Special emphasis is given to methodologies that avoid pre-enrichment during water sample preparation so that the assay time is reduced and the legislated sensitivity is achieved. The advantages and limitations of these methods are also reviewed, contributing to a more comprehensive overview toward more conscious research in identifying E. coli in water. Copyright © 2014 Elsevier Inc. All rights reserved.

  14. EFFECT OF DIFFERENT REGIONS OF AMPLIFIED 16S RDNA ON A PERFORMANCE OF A MULTIPLEXED, BEAD-BASED METHOD FOR ANALYSIS OF DNA SEQUENCES IN ENVIRONMENTAL SAMPLES.

    EPA Science Inventory

    Using a bead-based method for multiplexed analysis of community DNA, the dynamics of aquatic microbial communities can be assessed. Capture probes, specific for a genus or species of bacteria, are attached to the surface of uniquely labeled, microscopic polystyrene beads. Primers...

  15. The Principle of the Micro-Electronic Neural Bridge and a Prototype System Design.

    PubMed

    Huang, Zong-Hao; Wang, Zhi-Gong; Lu, Xiao-Ying; Li, Wen-Yuan; Zhou, Yu-Xuan; Shen, Xiao-Yan; Zhao, Xin-Tai

    2016-01-01

    The micro-electronic neural bridge (MENB) aims to rebuild the lost motor function of paralyzed humans by routing movement-related signals from the brain, around the damaged part of the spinal cord, to external effectors. This study focused on the prototype system design of the MENB, including the principle of the MENB, the design of the neural signal detecting circuit and the functional electrical stimulation (FES) circuit, and the spike detecting and sorting algorithm. In this study, we developed a novel, improved amplitude-threshold spike detecting method based on a variable forward-difference threshold for both the training and bridging phases. The discrete wavelet transform (DWT), a new level-feature coefficient selection method based on the Lilliefors test, and k-means clustering based on the Mahalanobis distance were used for spike sorting. A real-time online spike detecting and sorting algorithm based on the DWT and Euclidean distance was also implemented for the bridging phase. Tested on the data sets available from Caltech, in the training phase the average sensitivity, specificity, and clustering accuracy were 99.43%, 97.83%, and 95.45%, respectively. Validated by three-fold cross-validation, the average sensitivity, specificity, and classification accuracy were 99.43%, 97.70%, and 96.46%, respectively.
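
    A hedged sketch of the processing chain the abstract describes: thresholding the forward difference of the signal to detect spikes, extracting DWT coefficients from each snippet, and clustering them with k-means. The synthetic signal, robust threshold rule, wavelet choice and plain Euclidean k-means (rather than the paper's variable threshold, Lilliefors-based coefficient selection and Mahalanobis clustering) are simplifying assumptions.

    ```python
    import numpy as np
    import pywt
    from scipy.cluster.vq import kmeans2

    rng = np.random.default_rng(2)
    signal = rng.normal(0, 1, 20000)
    spike_times = rng.choice(np.arange(100, 19900), size=60, replace=False)
    for t in spike_times:                        # inject crude synthetic spikes
        signal[t:t + 8] += np.hanning(8) * 25

    # Detection: threshold on the forward difference, with a robust noise estimate.
    diff = np.diff(signal)
    thr = 4 * np.median(np.abs(diff)) / 0.6745
    cand = np.flatnonzero(diff > thr)
    detections = cand[np.insert(np.diff(cand) > 16, 0, True)]   # one hit per spike

    # Features: DWT coefficients of a 32-sample snippet around each detection.
    snippets = np.array([signal[t - 8:t + 24] for t in detections
                         if 8 <= t <= len(signal) - 24])
    features = np.array([np.concatenate(pywt.wavedec(s, "haar", level=3))
                         for s in snippets])

    # Sorting: plain k-means (Euclidean) into two clusters.
    centroids, labels = kmeans2(features, 2, minit="++")
    print(len(snippets), "spikes detected, cluster sizes:", np.bincount(labels))
    ```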

  16. A branch-migration based fluorescent probe for straightforward, sensitive and specific discrimination of DNA mutations

    PubMed Central

    Xiao, Xianjin; Wu, Tongbo; Xu, Lei; Chen, Wei

    2017-01-01

    Genetic mutations are important biomarkers for cancer diagnostics and surveillance. Preferably, the methods for mutation detection should be straightforward, highly specific and sensitive to low-level mutations within various sequence contexts, fast and applicable at room-temperature. Though some of the currently available methods have shown very encouraging results, their discrimination efficiency is still very low. Herein, we demonstrate a branch-migration based fluorescent probe (BM probe) which is able to identify the presence of known or unknown single-base variations at abundances down to 0.3%-1% within 5 min, even in highly GC-rich sequence regions. The discrimination factors between the perfect-match target and single-base mismatched target are determined to be 89–311 by measurement of their respective branch-migration products via polymerase elongation reactions. The BM probe not only enabled sensitive detection of two types of EGFR-associated point mutations located in GC-rich regions, but also successfully identified the BRAF V600E mutation in the serum from a thyroid cancer patient which could not be detected by the conventional sequencing method. The new method would be an ideal choice for high-throughput in vitro diagnostics and precise clinical treatment. PMID:28201758

  17. An endoglycosidase-assisted LC-MS/MS-based strategy for the analysis of site-specific core-fucosylation of low-concentrated glycoproteins in human serum using prostate-specific antigen (PSA) as example.

    PubMed

    Lang, Robert; Leinenbach, Andreas; Karl, Johann; Swiatek-de Lange, Magdalena; Kobold, Uwe; Vogeser, Michael

    2018-05-01

    Recently, site-specific fucosylation of glycoproteins has attracted attention as it can be associated with several types of cancer, including prostate cancer. However, individual glycoproteins that might serve as potential cancer markers are often present at very low concentrations in complex serum matrices, and distinct glycan structures are hard to detect by immunoassays. Here, we present a mass spectrometry-based strategy for the simultaneous analysis of core-fucosylated and total prostate-specific antigen (PSA) in human serum in the low ng/ml concentration range. Sample preparation comprised an immunoaffinity capture step to enrich total PSA from human serum using anti-PSA antibody coated magnetic beads, followed by consecutive two-step on-bead partial deglycosylation with endoglycosidase F3 and tryptic digestion prior to LC-MS/MS analysis. The method was shown to be linear from 0.5 to 60 ng/ml total PSA and allows simultaneous quantification down to 1 ng/ml for core-fucosylated PSA and below 0.5 ng/ml for total PSA. The imprecision of the method over two days ranged from 9.7% to 23.2% for core-fucosylated PSA and from 10.3% to 18.3% for total PSA, depending on the PSA level. The feasibility of the method with native sera was shown using three human specimens. To our knowledge, this is the first MS-based method for quantification of core-fucosylated PSA in the low ng/ml concentration range in human serum. This method could be used in large patient cohorts, as core-fucosylated PSA may be a diagnostic biomarker for the differentiation of prostate cancer from other prostatic diseases, such as benign prostatic hyperplasia (BPH). Furthermore, the described strategy could be used to monitor potential changes in site-specific core-fucosylation of other low-concentration glycoproteins, which could serve as more specific markers ("marker refinement") in cancer research. Copyright © 2018 Elsevier B.V. All rights reserved.

  18. Investigation of self-adaptive LED surgical lighting based on entropy contrast enhancing method

    NASA Astrophysics Data System (ADS)

    Liu, Peng; Wang, Huihui; Zhang, Yaqin; Shen, Junfei; Wu, Rengmao; Zheng, Zhenrong; Li, Haifeng; Liu, Xu

    2014-05-01

    An investigation was performed to explore the possibility of enhancing contrast by varying the spectral power distribution (SPD) of the surgical lighting. Illumination scenes with different SPDs were generated by combining a self-adaptive white-light optimization method with the LED ceiling system; images of a biological sample were taken by a CCD camera and then processed by an entropy-based contrast evaluation model proposed specifically for the surgical setting. Compared with the neutral-white-LED-based and traditional algorithm-based image enhancing methods, the illumination-based enhancing method turns out to perform better in contrast enhancement, improving the average contrast value by about 9% and 6%, respectively. This low-cost method is simple and practicable, and thus may provide an alternative solution to expensive visual-facility medical instruments.
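
    The entropy-based contrast score is not specified in the abstract; a common stand-in is the Shannon entropy of the grey-level histogram, which rises as intensity values spread over more levels. A minimal sketch under that assumption:

    ```python
    import numpy as np

    def image_entropy(gray, bins=256):
        """Shannon entropy (bits) of the grey-level histogram: a simple stand-in
        for the entropy-based contrast score named in the abstract."""
        hist, _ = np.histogram(gray, bins=bins, range=(0, 255))
        p = hist[hist > 0] / hist.sum()
        return float(-(p * np.log2(p)).sum())

    rng = np.random.default_rng(4)
    flat   = np.clip(rng.normal(128, 5,  (64, 64)), 0, 255)   # low-contrast image
    spread = np.clip(rng.normal(128, 40, (64, 64)), 0, 255)   # higher-contrast image
    print("entropy (low contrast): ", round(image_entropy(flat), 2))
    print("entropy (high contrast):", round(image_entropy(spread), 2))
    ```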

  19. Comparison of hand-craft feature based SVM and CNN based deep learning framework for automatic polyp classification.

    PubMed

    Younghak Shin; Balasingham, Ilangko

    2017-07-01

    Colonoscopy is a standard method for screening polyps by highly trained physicians. Polyps missed during colonoscopy are a potential risk factor for colorectal cancer. In this study, we investigate an automatic polyp classification framework and compare two different approaches: a hand-crafted feature method and a convolutional neural network (CNN) based deep learning method. Combined shape and color features are used for hand-crafted feature extraction, and a support vector machine (SVM) is adopted for classification. For the CNN approach, a deep learning framework with three convolution and pooling layers is used for classification. The proposed framework is evaluated using three public polyp databases. The experimental results show that the CNN-based deep learning framework achieves better classification performance than the hand-crafted feature based method, with over 90% classification accuracy, sensitivity, specificity and precision.
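
    The hand-crafted-feature branch of such a comparison can be sketched with colour-histogram features and an SVM, as below; the toy images and random labels (so accuracy will sit near chance) are placeholders, and the shape features and the CNN branch used in the paper are omitted for brevity.

    ```python
    import numpy as np
    from sklearn.svm import SVC
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(5)
    images = rng.integers(0, 256, size=(120, 32, 32, 3))   # toy RGB patches
    labels = rng.integers(0, 2, size=120)                   # random polyp / non-polyp labels

    def colour_histogram(img, bins=8):
        """Concatenated per-channel histograms, normalised to sum to 1."""
        feats = [np.histogram(img[..., c], bins=bins, range=(0, 256))[0] for c in range(3)]
        h = np.concatenate(feats).astype(float)
        return h / h.sum()

    X = np.array([colour_histogram(im) for im in images])
    scores = cross_val_score(SVC(kernel="rbf", C=1.0), X, labels, cv=5)
    print("5-fold accuracy (chance-level here, labels are random):", scores.round(2))
    ```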

  20. Biomarker-specific conjugated nanopolyplexes for the active coloring of stem-like cancer cells

    NASA Astrophysics Data System (ADS)

    Hong, Yoochan; Lee, Eugene; Choi, Jihye; Haam, Seungjoo; Suh, Jin-Suck; Yang, Jaemoon

    2016-06-01

    Stem-like cancer cells possess intrinsic features, and their CD44 receptors regulate redox balance so that the cancer cells can survive under stress conditions. We have therefore fabricated biomarker-specific conjugated polyplexes from CD44-targetable hyaluronic acid and redox-sensitive polyaniline based on a nanoemulsion method. For the most sensitive recognition of the cellular redox state at the single-nanoparticle scale, a nano-scattering spectrum imaging analyzer system was introduced. The conjugated polyplexes showed specific targeting ability toward CD44-expressing cancer cells as well as a dramatic, redox-potential-dependent change in color in the light-scattered images. Therefore, these polyaniline-based conjugated polyplexes, together with analytical processes that include light-scattering imaging and measurement of scattering spectra, clearly establish a systematic method for the detection and monitoring of cancer microenvironments.

  1. Single Laboratory Comparison of Host-Specific PCR Assays for the Detection of Bovine Fecal Pollution

    EPA Science Inventory

    There are numerous PCR-based methods available to detect bovine fecal pollution in ambient waters. Each method targets a different gene and microorganism leading to differences in method performance, making it difficult to determine which approach is most suitable for field appl...

  2. Methods for Probing Magnetic Films with Neutrons

    NASA Astrophysics Data System (ADS)

    Kozhevnikov, S. V.; Ott, F.; Radu, F.

    2018-03-01

    We review various methods in the investigation of magnetic films with neutrons, including those based on the effects of Larmor precession, Zeeman spatial splitting of the beam, neutron spin resonance, and polarized neutron channeling. The underlying principles, examples of the investigated systems, specific features, applications, and perspectives of these methods are discussed.

  3. Mixed Methods in Intervention Research: Theory to Adaptation

    ERIC Educational Resources Information Center

    Nastasi, Bonnie K.; Hitchcock, John; Sarkar, Sreeroopa; Burkholder, Gary; Varjas, Kristen; Jayasena, Asoka

    2007-01-01

    The purpose of this article is to demonstrate the application of mixed methods research designs to multiyear programmatic research and development projects whose goals include integration of cultural specificity when generating or translating evidence-based practices. The authors propose a set of five mixed methods designs related to different…

  4. AMPHION: Specification-based programming for scientific subroutine libraries

    NASA Technical Reports Server (NTRS)

    Lowry, Michael; Philpot, Andrew; Pressburger, Thomas; Underwood, Ian; Waldinger, Richard; Stickel, Mark

    1994-01-01

    AMPHION is a knowledge-based software engineering (KBSE) system that guides a user in developing a diagram representing a formal problem specification. It then automatically implements a solution to this specification as a program consisting of calls to subroutines from a library. The diagram provides an intuitive domain oriented notation for creating a specification that also facilitates reuse and modification. AMPHION'S architecture is domain independent. AMPHION is specialized to an application domain by developing a declarative domain theory. Creating a domain theory is an iterative process that currently requires the joint expertise of domain experts and experts in automated formal methods for software development.

  5. Identification of family-specific residue packing motifs and their use for structure-based protein function prediction: I. Method development.

    PubMed

    Bandyopadhyay, Deepak; Huan, Jun; Prins, Jan; Snoeyink, Jack; Wang, Wei; Tropsha, Alexander

    2009-11-01

    Protein function prediction is one of the central problems in computational biology. We present a novel automated protein structure-based function prediction method using libraries of local residue packing patterns that are common to most proteins in a known functional family. Critical to this approach is the representation of a protein structure as a graph where residue vertices (residue name used as a vertex label) are connected by geometrical proximity edges. The approach employs two steps. First, it uses a fast subgraph mining algorithm to find all occurrences of family-specific labeled subgraphs for all well characterized protein structural and functional families. Second, it queries a new structure for occurrences of a set of motifs characteristic of a known family, using a graph index to speed up Ullman's subgraph isomorphism algorithm. The confidence of function inference from structure depends on the number of family-specific motifs found in the query structure compared with their distribution in a large non-redundant database of proteins. This method can assign a new structure to a specific functional family in cases where sequence alignments, sequence patterns, structural superposition and active site templates fail to provide accurate annotation.
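
    The representation described above can be sketched directly: residues become labelled vertices, geometric proximity becomes edges, and a family motif is searched for as a labelled subgraph. In the sketch below the coordinates, the 7 Å cutoff and the toy HIS-ASP-SER motif are illustrative, and networkx's VF2 matcher stands in for the graph-indexed Ullman search used by the authors.

    ```python
    import networkx as nx
    from networkx.algorithms.isomorphism import GraphMatcher, categorical_node_match

    residues = {  # residue id -> (name, C-alpha coordinates)
        1: ("HIS", (0.0, 0.0, 0.0)),
        2: ("ASP", (4.0, 1.0, 0.0)),
        3: ("SER", (2.0, 4.5, 1.0)),
        4: ("GLY", (12.0, 0.0, 3.0)),
    }

    def proximity_graph(residues, cutoff=7.0):
        """Residue names as vertex labels, edges between residues within `cutoff`."""
        g = nx.Graph()
        for rid, (name, _) in residues.items():
            g.add_node(rid, label=name)
        ids = list(residues)
        for i in range(len(ids)):
            for j in range(i + 1, len(ids)):
                pa, pb = residues[ids[i]][1], residues[ids[j]][1]
                if sum((a - b) ** 2 for a, b in zip(pa, pb)) ** 0.5 <= cutoff:
                    g.add_edge(ids[i], ids[j])
        return g

    protein = proximity_graph(residues)

    # A toy family motif: HIS, ASP and SER mutually in contact.
    motif = nx.Graph()
    motif.add_nodes_from([("h", {"label": "HIS"}), ("d", {"label": "ASP"}),
                          ("s", {"label": "SER"})])
    motif.add_edges_from([("h", "d"), ("d", "s"), ("h", "s")])

    matcher = GraphMatcher(protein, motif,
                           node_match=categorical_node_match("label", None))
    print("motif occurs in structure:", matcher.subgraph_is_isomorphic())
    ```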

  6. SNPase-ARMS qPCR: Ultrasensitive Mutation-Based Detection of Cell-Free Tumor DNA in Melanoma Patients

    PubMed Central

    Stadler, Julia; Eder, Johanna; Pratscher, Barbara; Brandt, Sabine; Schneller, Doris; Müllegger, Robert; Vogl, Claus; Trautinger, Franz; Brem, Gottfried; Burgstaller, Joerg P.

    2015-01-01

    Cell-free circulating tumor DNA in the plasma of cancer patients has become a common point of interest as an indicator of therapy options and treatment response in clinical cancer research. Patient- and tumor-specific single-nucleotide variants that accurately distinguish tumor DNA from wild-type DNA are especially promising targets. The reliable detection and quantification of these single-base DNA variants is technically challenging. Currently, a variety of techniques is applied, with no apparent “gold standard”. Here we present a novel qPCR protocol that meets the conditions of extreme sensitivity and specificity required for detection and quantification of tumor DNA. By consecutive application of two polymerases, one of them designed for extreme base-specificity, the method reaches unprecedented sensitivity and specificity. Three qPCR assays were tested with spike-in experiments, specific for the point mutations BRAF V600E, PTEN T167A and NRAS Q61L of melanoma cell lines. It was possible to detect down to one copy of tumor DNA per reaction (Poisson distribution) against a background of up to 200,000 wild-type DNA copies. To prove its clinical applicability, the method was successfully tested on a small cohort of BRAF V600E positive melanoma patients. PMID:26562020

  7. Polymerase chain reaction-hybridization method using urease gene sequences for high-throughput Ureaplasma urealyticum and Ureaplasma parvum detection and differentiation.

    PubMed

    Xu, Chen; Zhang, Nan; Huo, Qianyu; Chen, Minghui; Wang, Rengfeng; Liu, Zhili; Li, Xue; Liu, Yunde; Bao, Huijing

    2016-04-15

    In this article, we discuss the polymerase chain reaction (PCR)-hybridization assay that we developed for high-throughput simultaneous detection and differentiation of Ureaplasma urealyticum and Ureaplasma parvum using one set of primers and two specific DNA probes based on urease gene nucleotide sequence differences. First, U. urealyticum and U. parvum DNA samples were specifically amplified using one set of biotin-labeled primers. Furthermore, amine-modified DNA probes, which can specifically react with U. urealyticum or U. parvum DNA, were covalently immobilized to a DNA-BIND plate surface. The plate was then incubated with the PCR products to facilitate sequence-specific DNA binding. Horseradish peroxidase-streptavidin conjugation and a colorimetric assay were used. Based on the results, the PCR-hybridization assay we developed can specifically differentiate U. urealyticum and U. parvum with high sensitivity (95%) compared with cultivation (72.5%). Hence, this study demonstrates a new method for high-throughput simultaneous differentiation and detection of U. urealyticum and U. parvum with high sensitivity. Based on these observations, the PCR-hybridization assay developed in this study is ideal for detecting and discriminating U. urealyticum and U. parvum in clinical applications. Copyright © 2016 Elsevier Inc. All rights reserved.

  8. From user needs to system specifications: multi-disciplinary thematic seminars as a collaborative design method for development of health information systems.

    PubMed

    Scandurra, I; Hägglund, M; Koch, S

    2008-08-01

    This paper presents a new multi-disciplinary method for user needs analysis and requirements specification in the context of health information systems based on established theories from the fields of participatory design and computer supported cooperative work (CSCW). Whereas conventional methods imply a separate, sequential needs analysis for each profession, the "multi-disciplinary thematic seminar" (MdTS) method uses a collaborative design process. Application of the method in elderly homecare resulted in prototypes that were well adapted to the intended user groups. Vital information in the points of intersection between different care professions was elicited and a holistic view of the entire care process was obtained. Health informatics-usability specialists and clinical domain experts are necessary to apply the method. Although user needs acquisition can be time-consuming, MdTS was perceived to efficiently identify in-context user needs, and transformed these directly into requirements specifications. Consequently the method was perceived to expedite the entire ICT implementation process.

  9. Space construction data base

    NASA Technical Reports Server (NTRS)

    1979-01-01

    Construction of large systems in space is a technology requiring the development of construction methods to deploy, assemble, and fabricate the elements comprising such systems. A construction method is comprised of all essential functions and operations and related support equipment necessary to accomplish a specific construction task in a particular way. The data base objective is to provide to the designers of large space systems a compendium of the various space construction methods which could have application to their projects.

  10. High performance of a new PCR-based urine assay for HPV-DNA detection and genotyping.

    PubMed

    Tanzi, Elisabetta; Bianchi, Silvia; Fasolo, Maria Michela; Frati, Elena R; Mazza, Francesca; Martinelli, Marianna; Colzani, Daniela; Beretta, Rosangela; Zappa, Alessandra; Orlando, Giovanna

    2013-01-01

    Human papillomavirus (HPV) testing has been proposed as a means of replacing or supporting conventional cervical screening (Pap test). However, both methods require the collection of cervical samples. Urine samples are easier and more acceptable to collect and could help facilitate cervical cancer screening. The aim of this study was to evaluate the sensitivity and specificity of urine testing compared with conventional cervical smear testing using a PCR-based method with a new, specifically designed primer set. Paired cervical and first-voided urine samples collected from 107 women infected with HIV were subjected to HPV-DNA detection and genotyping using a PCR-based assay and a restriction fragment length polymorphism method. Sensitivity, specificity, positive predictive value (PPV) and negative predictive value (NPV) were calculated using McNemar's test for differences. Concordance between tests was assessed using Cohen's unweighted kappa (k). HPV DNA was detected in 64.5% (95% CI: 55.1-73.1%) of both cytobrush and urine samples. High concordance rates of HPV-DNA detection (k = 0.96; 95% CI: 0.90-1.0) and of high-risk-clade and low-risk genotyping in paired samples (k = 0.80; 95% CI: 0.67-0.92 and k = 0.74; 95% CI: 0.60-0.88, respectively) were observed. HPV-DNA detection in urine versus cervix testing showed a sensitivity of 98.6% (95% CI: 93.1-99.9%) and a specificity of 97.4% (95% CI: 87.7-99.9%), with a very high NPV (97.4%; 95% CI: 87.7-99.9%). The PCR-based assay utilized in this study proved highly sensitive and specific for HPV-DNA detection and genotyping in urine samples. These data suggest that a urine-based assay would be a suitable and effective tool for epidemiological surveillance and, most of all, screening programs. Copyright © 2012 Wiley Periodicals, Inc.
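
    The agreement statistics reported above are straightforward to reproduce on paired binary results; the sketch below computes sensitivity, specificity and Cohen's unweighted kappa with scikit-learn on a small made-up set of paired cervical/urine calls (the study used 107 pairs).

    ```python
    import numpy as np
    from sklearn.metrics import cohen_kappa_score, confusion_matrix

    # 1 = HPV DNA detected, 0 = not detected; the cervical result is the reference.
    cervix = np.array([1, 1, 1, 1, 0, 0, 0, 1, 0, 1, 1, 0])
    urine  = np.array([1, 1, 1, 0, 0, 0, 0, 1, 0, 1, 1, 0])

    tn, fp, fn, tp = confusion_matrix(cervix, urine).ravel()
    print("sensitivity:", round(tp / (tp + fn), 3))
    print("specificity:", round(tn / (tn + fp), 3))
    print("Cohen's kappa:", round(cohen_kappa_score(cervix, urine), 3))
    ```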

  11. Semi-Lagrangian particle methods for high-dimensional Vlasov-Poisson systems

    NASA Astrophysics Data System (ADS)

    Cottet, Georges-Henri

    2018-07-01

    This paper deals with the implementation of high order semi-Lagrangian particle methods to handle high dimensional Vlasov-Poisson systems. It is based on recent developments in the numerical analysis of particle methods and the paper focuses on specific algorithmic features to handle large dimensions. The methods are tested with uniform particle distributions in particular against a recent multi-resolution wavelet based method on a 4D plasma instability case and a 6D gravitational case. Conservation properties, accuracy and computational costs are monitored. The excellent accuracy/cost trade-off shown by the method opens new perspective for accurate simulations of high dimensional kinetic equations by particle methods.
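
    As a minimal illustration of the semi-Lagrangian idea, the sketch below performs one free-streaming step of a 1D-1V Vlasov distribution: each grid value is replaced by the distribution evaluated at the foot of its characteristic, x - v*dt, via periodic interpolation. Grid sizes and the initial condition are arbitrary, and simple linear interpolation on a grid stands in for the paper's high-order remapped particle method.

    ```python
    import numpy as np

    n_x, n_v, L, dt = 64, 33, 2 * np.pi, 0.1
    x = np.linspace(0.0, L, n_x, endpoint=False)
    v = np.linspace(-4.0, 4.0, n_v)
    f = (1.0 + 0.05 * np.cos(x)[:, None]) * np.exp(-0.5 * v[None, :] ** 2)

    def stream(f, dt):
        """Advect f(x, v) in x by v*dt using periodic linear interpolation."""
        out = np.empty_like(f)
        for j, vj in enumerate(v):
            feet = (x - vj * dt) % L                 # departure points of the characteristics
            out[:, j] = np.interp(feet, x, f[:, j], period=L)
        return out

    f_new = stream(f, dt)
    dx = L / n_x
    print("mass before / after:", f.sum() * dx, f_new.sum() * dx)
    ```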

  12. Use of Comparative Genomics-Based Markers for Discrimination of Host Specificity in Fusarium oxysporum.

    PubMed

    van Dam, Peter; de Sain, Mara; Ter Horst, Anneliek; van der Gragt, Michelle; Rep, Martijn

    2018-01-01

    The polyphyletic nature of many formae speciales of Fusarium oxysporum prevents molecular identification of newly encountered strains based on conserved, vertically inherited genes. Alternative molecular detection methods that could replace labor- and time-intensive disease assays are therefore highly desired. Effectors are functional elements in the pathogen-host interaction and have been found to show very limited sequence diversity between strains of the same forma specialis, which makes them potential markers for host-specific pathogenicity. We therefore compared candidate effector genes extracted from 60 existing and 22 newly generated genome assemblies, specifically targeting strains affecting cucurbit plant species. Based on these candidate effector genes, a total of 18 PCR primer pairs were designed to discriminate between each of the seven Cucurbitaceae-affecting formae speciales. When tested on a collection of strains encompassing different clonal lineages of these formae speciales, nonpathogenic strains, and strains of other formae speciales, they allowed clear recognition of the host range of each evaluated strain. Within Fusarium oxysporum f. sp. melonis more genetic variability exists than anticipated, resulting in three F. oxysporum f. sp. melonis marker patterns that partially overlapped with the cucurbit-infecting Fusarium oxysporum f. sp. cucumerinum, Fusarium oxysporum f. sp. niveum, Fusarium oxysporum f. sp. momordicae, and/or Fusarium oxysporum f. sp. lagenariae. For F. oxysporum f. sp. niveum, a multiplex TaqMan assay was evaluated and was shown to allow quantitative and specific detection of template DNA quantities as low as 2.5 pg. These results provide ready-to-use marker sequences for the mentioned F. oxysporum pathogens. Additionally, the method can be applied to find markers distinguishing other host-specific forms of F. oxysporum. IMPORTANCE Pathogenic strains of Fusarium oxysporum are differentiated into formae speciales based on their host range, which is normally restricted to only one or a few plant species. However, horizontal gene transfer between strains in the species complex has resulted in a polyphyletic origin of host specificity in many of these formae speciales. This hinders accurate and rapid pathogen detection through molecular methods. In our research, we compared the genomes of 88 strains of F. oxysporum with each other, specifically targeting virulence-related genes that are typically highly similar within each forma specialis. Using this approach, we identified marker sequences that allow the discrimination of F. oxysporum strains affecting various cucurbit plant species through different PCR-based methods. Copyright © 2017 American Society for Microbiology.

  13. A review of numerical techniques approaching microstructures of crystalline rocks

    NASA Astrophysics Data System (ADS)

    Zhang, Yahui; Wong, Louis Ngai Yuen

    2018-06-01

    The macro-mechanical behavior of crystalline rocks, including strength, deformability and failure pattern, is dominantly influenced by their grain-scale structures. Numerical techniques are commonly used to assist in understanding the complicated mechanisms from a microscopic perspective. Each numerical method has its respective strengths and limitations. This review paper elucidates how numerical techniques take geometrical aspects of the grain into consideration. Four categories of numerical methods are examined: particle-based methods, block-based methods, grain-based methods, and node-based methods. Focusing on grain-scale characteristics, specific relevant issues including the increasing complexity of the micro-structure, deformation and breakage of model elements, and fracturing and fragmentation processes are described in more detail. Therefore, the intrinsic capabilities and limitations of different numerical approaches in accounting for the micro-mechanics of crystalline rocks and their phenomenological mechanical behavior are explicitly presented.

  14. Combined Feature Based and Shape Based Visual Tracker for Robot Navigation

    NASA Technical Reports Server (NTRS)

    Deans, J.; Kunz, C.; Sargent, R.; Park, E.; Pedersen, L.

    2005-01-01

    We have developed a combined feature based and shape based visual tracking system designed to enable a planetary rover to visually track and servo to specific points chosen by a user with centimeter precision. The feature based tracker uses invariant feature detection and matching across a stereo pair, as well as matching pairs before and after robot movement in order to compute an incremental 6-DOF motion at each tracker update. This tracking method is subject to drift over time, which can be compensated by the shape based method. The shape based tracking method consists of 3D model registration, which recovers 6-DOF motion given sufficient shape and proper initialization. By integrating complementary algorithms, the combined tracker leverages the efficiency and robustness of feature based methods with the precision and accuracy of model registration. In this paper, we present the algorithms and their integration into a combined visual tracking system.

  15. Development and validation of a 48-target analytical method for high-throughput monitoring of genetically modified organisms.

    PubMed

    Li, Xiaofei; Wu, Yuhua; Li, Jun; Li, Yunjing; Long, Likun; Li, Feiwu; Wu, Gang

    2015-01-05

    The rapid increase in the number of genetically modified (GM) varieties has led to a demand for high-throughput methods to detect genetically modified organisms (GMOs). We describe a new dynamic array-based high-throughput method to simultaneously detect 48 targets in 48 samples on a Fluidigm system. The test targets included species-specific genes, common screening elements, most of the Chinese-approved GM events, and several unapproved events. The 48 TaqMan assays successfully amplified products from both single-event samples and complex samples with a GMO DNA amount of 0.05 ng, and displayed high specificity. To improve the sensitivity of detection, a preamplification step for the 48 pooled targets was added to enrich the amount of template before performing the dynamic chip assays. This dynamic chip-based method allows the synchronous high-throughput detection of multiple targets in multiple samples and thus represents an efficient, qualitative method for GMO multi-detection.

  16. Development and Validation of A 48-Target Analytical Method for High-throughput Monitoring of Genetically Modified Organisms

    PubMed Central

    Li, Xiaofei; Wu, Yuhua; Li, Jun; Li, Yunjing; Long, Likun; Li, Feiwu; Wu, Gang

    2015-01-01

    The rapid increase in the number of genetically modified (GM) varieties has led to a demand for high-throughput methods to detect genetically modified organisms (GMOs). We describe a new dynamic array-based high-throughput method to simultaneously detect 48 targets in 48 samples on a Fluidigm system. The test targets included species-specific genes, common screening elements, most of the Chinese-approved GM events, and several unapproved events. The 48 TaqMan assays successfully amplified products from both single-event samples and complex samples with a GMO DNA amount of 0.05 ng, and displayed high specificity. To improve the sensitivity of detection, a preamplification step for the 48 pooled targets was added to enrich the amount of template before performing the dynamic chip assays. This dynamic chip-based method allows the synchronous high-throughput detection of multiple targets in multiple samples and thus represents an efficient, qualitative method for GMO multi-detection. PMID:25556930

  17. The Weakest Link: Library Catalogs.

    ERIC Educational Resources Information Center

    Young, Terrence E., Jr.

    2002-01-01

    Describes methods of correcting MARC records in online public access catalogs in school libraries. Highlights include in-house methods; professional resources; conforming to library cataloging standards; vendor services, including Web-based services; software specifically developed for record cleanup; and outsourcing. (LRW)

  18. Rapid methods for the detection of foodborne bacterial pathogens: principles, applications, advantages and limitations

    PubMed Central

    Law, Jodi Woan-Fei; Ab Mutalib, Nurul-Syakima; Chan, Kok-Gan; Lee, Learn-Han

    2015-01-01

    The incidence of foodborne diseases has increased over the years, resulting in a major global public health problem. Foodborne pathogens can be found in various foods, and it is important to detect them to ensure a safe food supply and to prevent foodborne diseases. The conventional methods used to detect foodborne pathogens are time-consuming and laborious. Hence, a variety of methods have been developed for the rapid detection of foodborne pathogens, as required in many food analyses. Rapid detection methods can be categorized into nucleic acid-based, biosensor-based and immunological-based methods. This review emphasizes the principles and applications of recent rapid methods for the detection of foodborne bacterial pathogens. The detection methods included are simple polymerase chain reaction (PCR), multiplex PCR, real-time PCR, nucleic acid sequence-based amplification (NASBA), loop-mediated isothermal amplification (LAMP) and oligonucleotide DNA microarrays, which are classified as nucleic acid-based methods; optical, electrochemical and mass-based biosensors, which are classified as biosensor-based methods; and enzyme-linked immunosorbent assay (ELISA) and lateral flow immunoassay, which are classified as immunological-based methods. In general, rapid detection methods are time-efficient, sensitive, specific and labor-saving. The development of rapid detection methods is vital in the prevention and treatment of foodborne diseases. PMID:25628612

  19. Molecular diagnostic methods for invasive fungal disease: the horizon draws nearer?

    PubMed

    Halliday, C L; Kidd, S E; Sorrell, T C; Chen, S C-A

    2015-04-01

    Rapid, accurate diagnostic laboratory tests are needed to improve clinical outcomes of invasive fungal disease (IFD). Traditional direct microscopy, culture and histological techniques constitute the 'gold standard' against which newer tests are judged. Molecular diagnostic methods, whether broad-range or fungal-specific, have great potential to enhance sensitivity and speed of IFD diagnosis, but have varying specificities. The use of PCR-based assays, DNA sequencing, and other molecular methods including those incorporating proteomic approaches such as matrix-assisted laser desorption ionisation-time of flight mass spectroscopy (MALDI-TOF MS) have shown promising results. These are used mainly to complement conventional methods since they require standardisation before widespread implementation can be recommended. None are incorporated into diagnostic criteria for defining IFD. Commercial assays may assist standardisation. This review provides an update of molecular-based diagnostic approaches applicable to biological specimens and fungal cultures in microbiology laboratories. We focus on the most common pathogens, Candida and Aspergillus, and the mucormycetes. The position of molecular-based approaches in the detection of azole and echinocandin antifungal resistance is also discussed.

  20. A Simulation Study of Methods for Selecting Subgroup-Specific Doses in Phase I Trials

    PubMed Central

    Morita, Satoshi; Thall, Peter F.; Takeda, Kentaro

    2016-01-01

    Patient heterogeneity may complicate dose-finding in phase I clinical trials if the dose-toxicity curves differ between subgroups. Conducting separate trials within subgroups may lead to infeasibly small sample sizes in subgroups having low prevalence. Alternatively, it is not obvious how to conduct a single trial while accounting for heterogeneity. To address this problem, we consider a generalization of the continual reassessment method (O’Quigley et al., 1990) based on a hierarchical Bayesian dose-toxicity model that borrows strength between subgroups under the assumption that the subgroups are exchangeable. We evaluate a design using this model that includes subgroup-specific dose selection and safety rules. A simulation study is presented that includes comparison of this method to three alternative approaches, based on non-hierarchical models, that make different types of assumptions about within-subgroup dose-toxicity curves. The simulations show that the hierarchical model-based method is recommended in settings where the dose-toxicity curves are exchangeable between subgroups. We present practical guidelines for application, and provide computer programs for trial simulation and conduct. PMID:28111916
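
    For orientation, the sketch below shows a bare-bones continual reassessment method update for a single subgroup, using the one-parameter power model and a grid approximation to the posterior; the hierarchical prior that borrows strength across subgroups in the paper is not reproduced, and the skeleton, prior spread, target and toy outcomes are assumptions.

    ```python
    import numpy as np

    skeleton = np.array([0.05, 0.12, 0.25, 0.40])   # prior guesses of DLT probability per dose
    target = 0.25
    doses_given = np.array([0, 1, 1, 2, 2, 2])       # dose index of each treated patient
    toxicities  = np.array([0, 0, 0, 0, 1, 0])       # 1 = dose-limiting toxicity observed

    theta = np.linspace(-3.0, 3.0, 601)              # grid for the model parameter
    prior = np.exp(-0.5 * theta ** 2 / 1.34 ** 2)    # N(0, 1.34^2), a common CRM prior
    p = skeleton[None, :] ** np.exp(theta)[:, None]  # toxicity probability, per theta and dose

    likelihood = np.ones_like(theta)
    for d, y in zip(doses_given, toxicities):
        likelihood *= p[:, d] ** y * (1 - p[:, d]) ** (1 - y)

    posterior = prior * likelihood
    posterior /= posterior.sum()
    post_tox = (posterior[:, None] * p).sum(axis=0)  # posterior mean toxicity per dose
    next_dose = int(np.argmin(np.abs(post_tox - target)))
    print("posterior toxicity per dose:", post_tox.round(3), "-> next dose index:", next_dose)
    ```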

  1. A comparison of per sample global scaling and per gene normalization methods for differential expression analysis of RNA-seq data.

    PubMed

    Li, Xiaohong; Brock, Guy N; Rouchka, Eric C; Cooper, Nigel G F; Wu, Dongfeng; O'Toole, Timothy E; Gill, Ryan S; Eteleeb, Abdallah M; O'Brien, Liz; Rai, Shesh N

    2017-01-01

    Normalization is an essential step with considerable impact on high-throughput RNA sequencing (RNA-seq) data analysis. Although there are numerous methods for read count normalization, it remains a challenge to choose an optimal method due to multiple factors contributing to read count variability that affects the overall sensitivity and specificity. In order to properly determine the most appropriate normalization methods, it is critical to compare the performance and shortcomings of a representative set of normalization routines based on different dataset characteristics. Therefore, we set out to evaluate the performance of the commonly used methods (DESeq, TMM-edgeR, FPKM-CuffDiff, TC, Med UQ and FQ) and two new methods we propose: Med-pgQ2 and UQ-pgQ2 (per-gene normalization after per-sample median or upper-quartile global scaling). Our per-gene normalization approach allows for comparisons between conditions based on similar count levels. Using the benchmark Microarray Quality Control Project (MAQC) and simulated datasets, we performed differential gene expression analysis to evaluate these methods. When evaluating MAQC2 with two replicates, we observed that Med-pgQ2 and UQ-pgQ2 achieved a slightly higher area under the Receiver Operating Characteristic Curve (AUC), a specificity rate > 85%, the detection power > 92% and an actual false discovery rate (FDR) under 0.06 given the nominal FDR (≤0.05). Although the top commonly used methods (DESeq and TMM-edgeR) yield a higher power (>93%) for MAQC2 data, they trade off with a reduced specificity (<70%) and a slightly higher actual FDR than our proposed methods. In addition, the results from an analysis based on the qualitative characteristics of sample distribution for MAQC2 and human breast cancer datasets show that only our gene-wise normalization methods corrected data skewed towards lower read counts. However, when we evaluated MAQC3 with less variation in five replicates, all methods performed similarly. Thus, our proposed Med-pgQ2 and UQ-pgQ2 methods perform slightly better for differential gene analysis of RNA-seq data skewed towards lowly expressed read counts with high variation by improving specificity while maintaining a good detection power with a control of the nominal FDR level.
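
    A hedged sketch of the two-stage idea behind the proposed UQ-pgQ2 approach as the abstract describes it: a per-sample upper-quartile global scaling followed by a per-gene scaling (here, division by the gene's median across samples). The synthetic counts, the handling of zeros and the exact per-gene statistic are assumptions, not the authors' implementation.

    ```python
    import numpy as np

    rng = np.random.default_rng(6)
    counts = rng.negative_binomial(5, 0.3, size=(1000, 6)).astype(float)   # genes x samples

    # Stage 1: per-sample upper-quartile (75th percentile of non-zero counts) scaling.
    uq = np.array([np.percentile(col[col > 0], 75) for col in counts.T])
    scaled = counts / uq[None, :] * uq.mean()

    # Stage 2: per-gene scaling, here by each gene's median across samples.
    gene_median = np.median(scaled, axis=1)
    gene_median[gene_median == 0] = 1.0          # leave all-zero genes untouched
    normalized = scaled / gene_median[:, None]

    print("sample upper quartiles:", uq)
    print("median of per-gene medians after scaling:",
          round(float(np.median(np.median(normalized, axis=1))), 3))
    ```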

  2. A comparison of per sample global scaling and per gene normalization methods for differential expression analysis of RNA-seq data

    PubMed Central

    Li, Xiaohong; Brock, Guy N.; Rouchka, Eric C.; Cooper, Nigel G. F.; Wu, Dongfeng; O’Toole, Timothy E.; Gill, Ryan S.; Eteleeb, Abdallah M.; O’Brien, Liz

    2017-01-01

    Normalization is an essential step with considerable impact on high-throughput RNA sequencing (RNA-seq) data analysis. Although there are numerous methods for read count normalization, it remains a challenge to choose an optimal method due to multiple factors contributing to read count variability that affects the overall sensitivity and specificity. In order to properly determine the most appropriate normalization methods, it is critical to compare the performance and shortcomings of a representative set of normalization routines based on different dataset characteristics. Therefore, we set out to evaluate the performance of the commonly used methods (DESeq, TMM-edgeR, FPKM-CuffDiff, TC, Med UQ and FQ) and two new methods we propose: Med-pgQ2 and UQ-pgQ2 (per-gene normalization after per-sample median or upper-quartile global scaling). Our per-gene normalization approach allows for comparisons between conditions based on similar count levels. Using the benchmark Microarray Quality Control Project (MAQC) and simulated datasets, we performed differential gene expression analysis to evaluate these methods. When evaluating MAQC2 with two replicates, we observed that Med-pgQ2 and UQ-pgQ2 achieved a slightly higher area under the Receiver Operating Characteristic Curve (AUC), a specificity rate > 85%, the detection power > 92% and an actual false discovery rate (FDR) under 0.06 given the nominal FDR (≤0.05). Although the top commonly used methods (DESeq and TMM-edgeR) yield a higher power (>93%) for MAQC2 data, they trade off with a reduced specificity (<70%) and a slightly higher actual FDR than our proposed methods. In addition, the results from an analysis based on the qualitative characteristics of sample distribution for MAQC2 and human breast cancer datasets show that only our gene-wise normalization methods corrected data skewed towards lower read counts. However, when we evaluated MAQC3 with less variation in five replicates, all methods performed similarly. Thus, our proposed Med-pgQ2 and UQ-pgQ2 methods perform slightly better for differential gene analysis of RNA-seq data skewed towards lowly expressed read counts with high variation by improving specificity while maintaining a good detection power with a control of the nominal FDR level. PMID:28459823

  3. A narrative review of evidence-based recommendations for the physical examination of the lumbar spine, sacroiliac and hip joint complex.

    PubMed

    Wong, C K; Johnson, E K

    2012-09-01

    Non-specific low back pain is a frequent complaint in primary care, but the differential diagnosis for low back pain can be complex. Despite advances in diagnostic imaging, a specific pathoanatomical source of low back pain can remain elusive in up to 85% of individuals. Best practice guidelines recommend that clinicians conduct a focused physical examination to help to identify patients with non-specific low back pain and an evidence-based course of clinical management. The use of sensitive and specific clinical methods to assess the lumbar spine, sacroiliac and hip joints is critical for effective physical examination. Psychosocial factors also play an important role in the evaluation of individuals with low back pain, but are not included in this narrative review of physical examination methods. Physical examination of the lumbar spine, sacroiliac and hip joints is presented, organized around patient position for efficient and effective clinical assessment. Copyright © 2012 John Wiley & Sons, Ltd.

  4. Reliability of programs specified with equational specifications

    NASA Astrophysics Data System (ADS)

    Nikolik, Borislav

    Ultrareliability is desirable (and sometimes a demand of regulatory authorities) for safety-critical applications, such as commercial flight-control programs, medical applications, nuclear reactor control programs, etc. A method is proposed, called the Term Redundancy Method (TRM), for obtaining ultrareliable programs through specification-based testing. Current specification-based testing schemes need a prohibitively large number of testcases for estimating ultrareliability. They assume availability of an accurate program-usage distribution prior to testing, and they assume the availability of a test oracle. It is shown how to obtain ultrareliable programs (probability of failure near zero) with a practical number of testcases, without accurate usage distribution, and without a test oracle. TRM applies to the class of decision Abstract Data Type (ADT) programs specified with unconditional equational specifications. TRM is restricted to programs that do not exceed certain efficiency constraints in generating testcases. The effectiveness of TRM in failure detection and recovery is demonstrated on formulas from the aircraft collision avoidance system TCAS.

  5. Diagnostic accuracy of automatic normalization of CBV in glioma grading using T1-weighted DCE-MRI.

    PubMed

    Sahoo, Prativa; Gupta, Rakesh K; Gupta, Pradeep K; Awasthi, Ashish; Pandey, Chandra M; Gupta, Mudit; Patir, Rana; Vaishya, Sandeep; Ahlawat, Sunita; Saha, Indrajit

    2017-12-01

    The aim of this retrospective study was to compare the diagnostic accuracy of the proposed automatic normalization method for quantifying relative cerebral blood volume (rCBV) with the existing contralateral region of interest (ROI) based CBV normalization method for glioma grading using T1-weighted dynamic contrast enhanced MRI (DCE-MRI). Sixty patients with histologically confirmed gliomas were included in this retrospective study. CBV maps were generated using T1-weighted DCE-MRI and normalized by the contralateral ROI based method (rCBV_contra) and by unaffected white matter (rCBV_WM) and unaffected gray matter (rCBV_GM); the latter two were generated automatically. An expert radiologist with >10 years of experience in DCE-MRI and a non-expert with one year of experience independently measured rCBVs. Cutoff values for glioma grading were determined from ROC analysis. Agreement of histology with rCBV_WM, rCBV_GM and rCBV_contra was studied using Kappa statistics and the intra-class correlation coefficient (ICC). The diagnostic accuracy of glioma grading using rCBV_contra measured by the expert radiologist was high (sensitivity=1.00, specificity=0.96, p<0.001) compared to the non-expert user (sensitivity=0.65, specificity=0.78, p<0.001). On the other hand, both the expert and non-expert user showed similar diagnostic accuracy for the automatic rCBV_WM (sensitivity=0.89, specificity=0.87, p=0.001) and rCBV_GM (sensitivity=0.81, specificity=0.78, p=0.001) measures. Further, the contralateral based method applied by the expert user showed the highest agreement with histological grading of the tumor (kappa=0.96, agreement 98.33%, p<0.001); however, the automatic normalization method showed the same percentage of agreement for both the expert and non-expert user. rCBV_WM showed an agreement of 88.33% (kappa=0.76, p<0.001) with histopathological grading. It was inferred from this study that, in the absence of an expert user, automated normalization of CBV using the proposed method could provide better diagnostic accuracy than the manual contralateral based approach. Copyright © 2017 Elsevier Inc. All rights reserved.
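
    The normalization step itself reduces to dividing the CBV map by the mean CBV of a reference region. A minimal sketch, assuming the CBV map and an automatically segmented reference mask are available as NumPy arrays; the names and toy data below are hypothetical, not from the paper.

```python
import numpy as np

def relative_cbv(cbv_map, reference_mask):
    """Normalize a CBV map by the mean CBV inside a reference region.

    cbv_map: 2D/3D array of CBV values from DCE-MRI post-processing.
    reference_mask: boolean array of the same shape marking the reference
                    region (e.g. contralateral ROI, unaffected white matter
                    or gray matter).
    Returns the rCBV map (dimensionless ratio).
    """
    ref_mean = cbv_map[reference_mask].mean()
    return cbv_map / ref_mean

# toy usage with a synthetic 4x4 map and a white-matter-like reference mask
cbv = np.array([[1.0, 1.2, 4.0, 3.8],
                [1.1, 1.0, 4.2, 3.9],
                [1.0, 1.1, 1.2, 1.0],
                [0.9, 1.0, 1.1, 1.2]])
wm_mask = cbv < 1.5          # hypothetical "unaffected tissue" mask
print(relative_cbv(cbv, wm_mask))
```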

  6. Screening DNA chip and event-specific multiplex PCR detection methods for biotech crops.

    PubMed

    Lee, Seong-Hun

    2014-11-01

    There are about 80 biotech crop events that have been approved by safety assessment in Korea. They have been controlled by genetically modified organism (GMO) and living modified organism (LMO) labeling systems. The DNA-based detection method has been used as an efficient scientific management tool. Recently, multiplex polymerase chain reaction (PCR) and DNA chip methods have been developed for the simultaneous detection of several biotech crop events. The event-specific multiplex PCR method was developed to detect five biotech maize events: MIR604, Event 3272, LY 038, MON 88017 and DAS-59122-7. The specificity was confirmed and the sensitivity was 0.5%. The screening DNA chip was developed from four endogenous genes of soybean, maize, cotton and canola, respectively, along with two regulatory elements and seven genes: P35S, tNOS, pat, bar, epsps1, epsps2, pmi, cry1Ac and cry3B. The specificity was confirmed and the sensitivity was 0.5% for 12 events across the four crops: one soybean, six maize, three cotton and two canola events. The multiplex PCR and DNA chip can be used for screening, gene-specific and event-specific analysis of biotech crops as efficient detection methods that save workload and time. © 2014 Society of Chemical Industry.

  7. Removal of anti-Stokes emission background in STED microscopy by FPGA-based synchronous detection

    NASA Astrophysics Data System (ADS)

    Castello, M.; Tortarolo, G.; Coto Hernández, I.; Deguchi, T.; Diaspro, A.; Vicidomini, G.

    2017-05-01

    In stimulated emission depletion (STED) microscopy, the role of the STED beam is to de-excite, via stimulated emission, the fluorophores that have been previously excited by the excitation beam. This condition, together with specific beam intensity distributions, allows obtaining true sub-diffraction spatial resolution images. However, if the STED beam has a non-negligible probability to excite the fluorophores, a strong fluorescent background signal (anti-Stokes emission) reduces the effective resolution. For STED scanning microscopy, different synchronous detection methods have been proposed to remove this anti-Stokes emission background and recover the resolution. However, every method works only for a specific STED microscopy implementation. Here we present a user-friendly synchronous detection method compatible with any STED scanning microscope. It exploits a data acquisition (DAQ) card based on a field-programmable gate array (FPGA), which is progressively used in STED microscopy. In essence, the FPGA-based DAQ card synchronizes the fluorescent signal registration, the beam deflection, and the excitation beam interruption, providing a fully automatic pixel-by-pixel synchronous detection method. We validate the proposed method in both continuous wave and pulsed STED microscope systems.

  8. A novel polydopamine-based chemiluminescence resonance energy transfer method for microRNA detection coupling duplex-specific nuclease-aided target recycling strategy.

    PubMed

    Wang, Qian; Yin, Bin-Cheng; Ye, Bang-Ce

    2016-06-15

    MicroRNAs (miRNAs), functioning as oncogenes or tumor suppressors, play significant roles in regulating gene expression and serve as biomarkers for disease diagnostics and therapeutics. In this work, we have coupled a polydopamine (PDA) nanosphere-assisted chemiluminescence resonance energy transfer (CRET) platform with a duplex-specific nuclease (DSN)-assisted signal amplification strategy to develop a novel method for specific miRNA detection. With the assistance of hemin, luminol, and H2O2, the horseradish peroxidase (HRP)-mimicking G-rich sequence in the sensing probe produces chemiluminescence, which is quickly quenched by the CRET effect between PDA as energy acceptor and excited luminol as energy donor. The target miRNA triggers DSN to partially degrade the sensing probe in the DNA-miRNA heteroduplex, repeatedly releasing the G-quadruplex formed by the G-rich sequence from PDA for the production of chemiluminescence. The method allows quantitative detection of target miRNA in the range of 80 pM-50 nM with a detection limit of 49.6 pM. The method also shows excellent specificity to discriminate single-base differences, and can accurately quantify miRNA in biological samples, with good agreement with the result from a commercial miRNA detection kit. The procedure requires no organic dyes or labels, and is a simple and cost-effective method for miRNA detection for early clinical diagnosis. Copyright © 2016 Elsevier B.V. All rights reserved.

  9. Symmetric redox supercapacitor based on micro-fabrication with three-dimensional polypyrrole electrodes

    NASA Astrophysics Data System (ADS)

    Sun, Wei; Zheng, Ruilin; Chen, Xuyuan

    To achieve higher energy density and power density, we have designed and fabricated a symmetric redox supercapacitor based on microelectromechanical system (MEMS) technologies. The supercapacitor consists of a three-dimensional (3D) microstructure on a silicon substrate micromachined by a high-aspect-ratio deep reactive ion etching (DRIE) method, two sputtered Ti current collectors and two electrochemically polymerized polypyrrole (PPy) films as electrodes. Electrochemical tests, including cyclic voltammetry (CV), electrochemical impedance spectroscopy (EIS) and galvanostatic charge/discharge methods, have been carried out on the single PPy electrodes and the symmetric supercapacitor in different electrolytes. The specific capacitance (capacitance per unit footprint area) and specific power (power per unit footprint area) of the PPy electrodes and the symmetric supercapacitor can be calculated from the electrochemical test data. It is found that NaCl solution is a good electrolyte for the polymerized PPy electrodes. In NaCl electrolyte, single PPy electrodes exhibit 0.128 F cm⁻² specific capacitance and 1.28 mW cm⁻² specific power at a 20 mV s⁻¹ scan rate. The symmetric supercapacitor presents 0.056 F cm⁻² specific capacitance and 0.56 mW cm⁻² specific power at a 20 mV s⁻¹ scan rate.

  10. A high-throughput detection method for invasive fruit fly (Diptera: Tephritidae) species based on microfluidic dynamic array.

    PubMed

    Jiang, Fan; Fu, Wei; Clarke, Anthony R; Schutze, Mark Kurt; Susanto, Agus; Zhu, Shuifang; Li, Zhihong

    2016-11-01

    Invasive species can be detrimental to a nation's ecology, economy and human health. Rapid and accurate diagnostics are critical to limit the establishment and spread of exotic organisms. The increasing rate of biological invasions relative to the taxonomic expertise available generates a demand for high-throughput, DNA-based diagnostic methods for identification. We designed species-specific qPCR primer and probe combinations for 27 economically important Tephritidae species in six genera (Anastrepha, Bactrocera, Carpomya, Ceratitis, Dacus and Rhagoletis) based on 935 COI DNA barcode haplotypes from 181 fruit fly species publicly available in BOLD, and then tested the specificity of each primer pair and probe through qPCR of 35 of those species. We then developed a standardized reaction system for detecting the 27 target species based on a microfluidic dynamic array and also applied the method to identify unknown immature samples from port interceptions and field monitoring. This method allowed specific and simultaneous detection of all 27 species in 7.5 h, using only 0.2 μL of reaction system in each reaction chamber. The approach successfully discriminated among species within complexes that had genetic similarities of up to 98.48%, and it identified all immature samples consistently with the subsequent morphological examination of adults reared from larvae of cohorts from the same samples. We present an accurate, rapid and high-throughput approach for detecting fruit flies of quarantine concern. This new method has broad potential to become an international standard for plant quarantine and invasive species detection. © 2016 John Wiley & Sons Ltd.

  11. Sensitivity ranking for freshwater invertebrates towards hydrocarbon contaminants.

    PubMed

    Gerner, Nadine V; Cailleaud, Kevin; Bassères, Anne; Liess, Matthias; Beketov, Mikhail A

    2017-11-01

    Hydrocarbons are of utmost economic importance but may also cause substantial ecological impacts due to accidents or inadequate transportation and use. Currently, freshwater biomonitoring methods lack an indicator that can unequivocally reflect the impacts caused by hydrocarbons while being independent from effects of other stressors. The aim of the present study was to develop a sensitivity ranking for freshwater invertebrates towards hydrocarbon contaminants, which can be used in hydrocarbon-specific bioindicators. We employed the Relative Sensitivity method and developed the sensitivity ranking S_hydrocarbons based on literature ecotoxicological data supplemented with rapid and mesocosm test results. A first validation of the sensitivity ranking based on an earlier field study was conducted and revealed the S_hydrocarbons ranking to be promising for application in sensitivity-based indicators. Thus, the first results indicate that the ranking can serve as the core component of future hydrocarbon-specific, sensitivity-trait-based bioindicators.

  12. Action-based flood forecasting for triggering humanitarian action

    NASA Astrophysics Data System (ADS)

    Coughlan de Perez, Erin; van den Hurk, Bart; van Aalst, Maarten K.; Amuron, Irene; Bamanya, Deus; Hauser, Tristan; Jongma, Brenden; Lopez, Ana; Mason, Simon; Mendler de Suarez, Janot; Pappenberger, Florian; Rueth, Alexandra; Stephens, Elisabeth; Suarez, Pablo; Wagemaker, Jurjen; Zsoter, Ervin

    2016-09-01

    Too often, credible scientific early warning information of increased disaster risk does not result in humanitarian action. With financial resources tilted heavily towards response after a disaster, disaster managers have limited incentive and ability to process complex scientific data, including uncertainties. These incentives are beginning to change, with the advent of several new forecast-based financing systems that provide funding based on a forecast of an extreme event. Given the changing landscape, here we demonstrate a method to select and use appropriate forecasts for specific humanitarian disaster prevention actions, even in a data-scarce location. This action-based forecasting methodology takes into account the parameters of each action, such as action lifetime, when verifying a forecast. Forecasts are linked with action based on an understanding of (1) the magnitude of previous flooding events and (2) the willingness to act "in vain" for specific actions. This is applied in the context of the Uganda Red Cross Society forecast-based financing pilot project, with forecasts from the Global Flood Awareness System (GloFAS). Using this method, we define the "danger level" of flooding, and we select the probabilistic forecast triggers that are appropriate for specific actions. Results from this methodology can be applied globally across hazards and fed into a financing system that ensures that automatic, pre-funded early action will be triggered by forecasts.
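
    One plausible way to turn this into a trigger-selection rule is to pick the lowest forecast probability whose historical false-alarm ratio stays within the tolerance agreed for a given action. The sketch below illustrates that idea only; the threshold logic and the tolerance value are assumptions, not the GloFAS or Uganda Red Cross procedure.

```python
def select_trigger(forecast_probs, observed_floods, max_false_alarm_ratio=0.5):
    """Pick the lowest forecast-probability trigger whose historical false-alarm
    ratio stays within the tolerance agreed for a given action.

    forecast_probs: forecast probabilities of exceeding the danger level,
                    one per historical event window.
    observed_floods: booleans, True where the danger level was actually exceeded.
    max_false_alarm_ratio: acceptable fraction of triggers issued "in vain"
                           (a hypothetical, per-action tolerance).
    """
    for threshold in sorted(set(forecast_probs)):
        triggered = [p >= threshold for p in forecast_probs]
        hits = sum(t and f for t, f in zip(triggered, observed_floods))
        false_alarms = sum(t and not f for t, f in zip(triggered, observed_floods))
        issued = hits + false_alarms
        if issued and false_alarms / issued <= max_false_alarm_ratio:
            return threshold
    return None  # no trigger meets the tolerance

# toy usage with made-up historical forecasts
probs  = [0.1, 0.4, 0.7, 0.8, 0.3, 0.9, 0.6]
floods = [False, False, True, True, False, True, False]
print(select_trigger(probs, floods, max_false_alarm_ratio=0.34))
```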

  13. "The mirror" and "the village": a new method for teaching practice-based learning and improvement and systems-based practice.

    PubMed

    Ziegelstein, Roy C; Fiebach, Nicholas H

    2004-01-01

    Practice-based learning and improvement (PBLI) and systems-based practice (SBP) may be conceptually difficult for both residents and faculty. Methods for introducing these concepts are needed if PBLI and SBP are to be incorporated into education and practice. In 2001, PBLI and SBP were introduced at Johns Hopkins Bayview Medical Center in Baltimore, Maryland, using the metaphors "the mirror" and "the village." PBLI was likened to residents' holding up a mirror to document, assess, and improve their practice. Specific tools for residents (e.g., weekly morbidity and mortality morning reports, continuity clinic chart self-audits, and resident learning portfolios) became the mirrors. SBP was introduced through specific training activities (e.g., multidisciplinary patient care rounds, nursing evaluations, and quality assessment-systems improvement exercises) using the metaphor of the village made famous by Hillary Clinton in the phrase: "It takes a village to raise a child." Residents completed a questionnaire in which they rated these initiatives' impact on their training. The majority of residents who participated in specific activities agreed that quality assessment-systems improvement exercises (92.9%), multidisciplinary rounds (92.1%), morbidity and mortality morning reports (86.8%), clinic chart self-audits (76.4%), and nursing evaluations (52.8%) helped to improve their proficiency in specific aspects of PBLI and SBP. Residents' retrospective self-assessments of their PBLI abilities demonstrated significant improvement after the introduction of specific training activities. PBLI and SBP can be introduced effectively in residency training by incorporating specific activities that use the metaphors of the mirror and the village.

  14. Evaluation by latent class analysis of a magnetic capture based DNA extraction followed by real-time qPCR as a new diagnostic method for detection of Echinococcus multilocularis in definitive hosts.

    PubMed

    Maas, Miriam; van Roon, Annika; Dam-Deisz, Cecile; Opsteegh, Marieke; Massolo, Alessandro; Deksne, Gunita; Teunis, Peter; van der Giessen, Joke

    2016-10-30

    A new method, based on a magnetic capture based DNA extraction followed by qPCR, was developed for the detection of the zoonotic parasite Echinococcus multilocularis in definitive hosts. Latent class analysis was used to compare this new method with the currently used phenol-chloroform DNA extraction followed by single tube nested PCR. In total, 60 red foxes and coyotes from three different locations were tested with both molecular methods and the sedimentation and counting technique (SCT) or intestinal scraping technique (IST). Though based on a limited number of samples, it could be established that the magnetic capture based DNA extraction followed by qPCR showed similar sensitivity and specificity as the currently used phenol-chloroform DNA extraction followed by single tube nested PCR. All methods have a high specificity as shown by Bayesian latent class analysis. Both molecular assays have higher sensitivities than the combined SCT and IST, though the uncertainties in sensitivity estimates were wide for all assays tested. The magnetic capture based DNA extraction followed by qPCR has the advantage of not requiring hazardous chemicals like the phenol-chloroform DNA extraction followed by single tube nested PCR. This supports the replacement of the phenol-chloroform DNA extraction followed by single tube nested PCR by the magnetic capture based DNA extraction followed by qPCR for molecular detection of E. multilocularis in definitive hosts. Copyright © 2016 Elsevier B.V. All rights reserved.

  15. A state-specific approach to multireference coupled electron-pair approximation like methods: Development and applications

    NASA Astrophysics Data System (ADS)

    Chattopadhyay, Sudip; Pahari, Dola; Mukherjee, Debashis; Mahapatra, Uttam Sinha

    2004-04-01

    The traditional multireference (MR) coupled-cluster (CC) methods based on the effective Hamiltonian are often beset by the problem of intruder states, and are not suitable for studying potential energy surfaces (PES) involving real or avoided curve crossings. State-specific MR-based approaches obviate this limitation. The state-specific MRCC (SS-MRCC) method developed some years ago [Mahapatra et al., J. Chem. Phys. 110, 6171 (1999)] can handle quasidegeneracy of varying degrees over a wide range of the PES, including regions of real or avoided curve crossing. Motivated by its success, we have suggested and explored in this paper a suite of physically motivated coupled electron-pair approximation (SS-MRCEPA) like methods, which are designed to capture the essential strength of the parent SS-MRCC method without significantly sacrificing its accuracy. These SS-MRCEPA theories, like their CC counterparts, are based on a complete active space, treat all the reference functions on the same footing and provide a description of potentially uniform precision of the PES of states with varying MR character. The combining coefficients of the reference functions are self-consistently determined along with the cluster amplitudes themselves. The newly developed SS-MRCEPA methods are size-extensive, and are also size-consistent with localized orbitals. Among the various versions, there are two that are invariant with respect to restricted rotations among doubly occupied and active orbitals separately. The similar performance of these latter and the noninvariant versions at the crossing points of the degenerate orbitals implies that all the methods presented are rather robust with respect to rotations among degenerate orbitals. Illustrative numerical applications are presented for the PES of the ground state of a number of difficult test cases such as the model H4 and H8 problems, the insertion of Be into H2, and Li2, where intruders exist, and for a state of a molecule such as CH2 with pronounced MR character. Results obtained with SS-MRCEPA methods are found to be comparable in accuracy to the parent SS-MRCC and FCI/large scale CI results throughout the PES, which indicates the efficacy of our SS-MRCEPA methods over a wide range of geometries, despite their neglect of a host of complicated nonlinear terms, even when the traditional MR-based methods based on effective Hamiltonians fail due to intruders.

  16. A Geochemical Mass-Balance Method for Base-Flow Separation, Upper Hillsborough River Watershed, West-Central Florida, 2003-2005 and 2009

    USGS Publications Warehouse

    Kish, G.R.; Stringer, C.E.; Stewart, M.T.; Rains, M.C.; Torres, A.E.

    2010-01-01

    Geochemical mass-balance (GMB) and conductivity mass-balance (CMB) methods for hydrograph separation were used to determine the contribution of base flow to total stormflow at two sites in the upper Hillsborough River watershed in west-central Florida from 2003-2005 and at one site in 2009. The chemical and isotopic composition of streamflow and precipitation was measured during selected local and frontal low- and high-intensity storm events and compared to the geochemical and isotopic composition of groundwater. Input for the GMB method included cation, anion, and stable isotope concentrations of surface water and groundwater, whereas input for the CMB method included continuous or point-sample measurement of specific conductance. The surface water is a calcium-bicarbonate type water, which closely resembles groundwater geochemically, indicating that much of the surface water in the upper Hillsborough River basin is derived from local groundwater discharge. This discharge into the Hillsborough River at State Road 39 and at Hillsborough River State Park becomes diluted by precipitation and runoff during the wet season, but retains the calcium-bicarbonate characteristics of Upper Floridan aquifer water. Field conditions limited the application of the GMB method to low-intensity storms but the CMB method was applied to both low-intensity and high-intensity storms. The average contribution of base flow to total discharge for all storms ranged from 31 to 100 percent, whereas the contribution of base flow to total discharge during peak discharge periods ranged from less than 10 percent to 100 percent. Although calcium, magnesium, and silica were consistent markers of Upper Floridan aquifer chemistry, their use in calculating base flow by the GMB method was limited because the frequency of point data collected in this study was not sufficient to capture the complete hydrograph from pre-event base-flow to post-event base-flow concentrations. In this study, pre-event water represented somewhat diluted groundwater. Streamflow conductivity integrates the concentrations of the major ions, and the logistics of acquiring specific conductance at frequent time intervals are less complicated than data collection, sample processing, shipment, and analysis of water samples in a laboratory. The acquisition of continuous specific conductance data reduces uncertainty associated with less-frequently collected geochemical point data.
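
    The two-component conductivity mass-balance separation underlying the CMB method can be written directly from the tracer mass balance. A minimal sketch with hypothetical end-member conductances follows; the study's end members and time-stepping details are more involved.

```python
def cmb_baseflow(total_q, stream_cond, baseflow_cond, runoff_cond):
    """Two-component conductivity mass-balance separation.

    total_q:       total streamflow for the time step
    stream_cond:   measured specific conductance of the stream
    baseflow_cond: specific conductance of the groundwater end member
    runoff_cond:   specific conductance of the event-water (runoff) end member

    Baseflow fraction follows from a mass balance on the tracer:
        Q*C = Qb*Cb + (Q - Qb)*Cr  =>  Qb = Q * (C - Cr) / (Cb - Cr)
    """
    fraction = (stream_cond - runoff_cond) / (baseflow_cond - runoff_cond)
    fraction = min(max(fraction, 0.0), 1.0)   # clamp to the physical range
    return total_q * fraction

# toy usage (hypothetical end members in microsiemens per centimeter)
print(cmb_baseflow(total_q=12.0, stream_cond=310.0,
                   baseflow_cond=400.0, runoff_cond=60.0))
# -> about 8.8, i.e. roughly 74% of total flow attributed to base flow
```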

  17. Knowledge-based grouping of modeled HLA peptide complexes.

    PubMed

    Kangueane, P; Sakharkar, M K; Lim, K S; Hao, H; Lin, K; Chee, R E; Kolatkar, P R

    2000-05-01

    Human leukocyte antigens are the most polymorphic of human genes, and multiple sequence alignment shows that such polymorphisms are clustered in the functional peptide binding domains. Because of such polymorphism among the peptide binding residues, the prediction of peptides that bind to specific HLA molecules is very difficult. In recent years, two different types of computer-based prediction methods have been developed, and both have their own advantages and disadvantages. The nonavailability of allele-specific binding data restricts the use of knowledge-based prediction methods for a wide range of HLA alleles. Alternatively, the modeling scheme appears to be a promising predictive tool for the selection of peptides that bind to specific HLA molecules. The scoring of the modeled HLA-peptide complexes is a major concern. In the present study, knowledge-based rules (van der Waals clashes and solvent-exposed hydrophobic residues) are applied to distinguish binders from nonbinders. The rules, based on (1) the number of observed atomic clashes between the modeled peptide and the HLA structure, and (2) the number of solvent-exposed hydrophobic residues on the modeled peptide, effectively discriminate experimentally known binders from poor/nonbinders. Solved crystal complexes show no vdW clash (vdWC) in 95% of cases, and no solvent-exposed hydrophobic peptide residues (SEHPR) were seen in 86% of cases. When comparing experimental binding data with the scores predicted by this scheme, 77% of the peptides were correctly grouped as good binders with a sensitivity of 71%.
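
    The two rules can be expressed as a trivial classifier. The strictly zero thresholds below are an assumption suggested by the crystal-complex statistics in the abstract, not the exact published rule set.

```python
def classify_binder(vdw_clashes, exposed_hydrophobic_residues):
    """Knowledge-based grouping of a modeled HLA-peptide complex.

    vdw_clashes: number of atomic clashes between the modeled peptide
                 and the HLA structure.
    exposed_hydrophobic_residues: number of solvent-exposed hydrophobic
                 residues on the modeled peptide.

    Returns "binder" when both counts are zero (the pattern seen in most
    solved crystal complexes), otherwise "poor/non-binder".
    """
    if vdw_clashes == 0 and exposed_hydrophobic_residues == 0:
        return "binder"
    return "poor/non-binder"

# toy usage
print(classify_binder(0, 0))   # binder
print(classify_binder(2, 1))   # poor/non-binder
```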

  18. Evaluating a Control System Architecture Based on a Formally Derived AOCS Model

    NASA Astrophysics Data System (ADS)

    Ilic, Dubravka; Latvala, Timo; Varpaaniemi, Kimmo; Vaisanen, Pauli; Troubitsyna, Elena; Laibinis, Linas

    2010-08-01

    Attitude & Orbit Control System (AOCS) refers to a wider class of control systems which are used to determine and control the attitude of the spacecraft while in orbit, based on the information obtained from various sensors. In this paper, we propose an approach to evaluate a typical (yet somewhat simplified) AOCS architecture using formal development - based on the Event-B method. As a starting point, an Ada specification of the AOCS is translated into a formal specification and further refined to incorporate all the details of its original source code specification. This way we are able not only to evaluate the Ada specification by expressing and verifying specific system properties in our formal models, but also to determine how well the chosen modelling framework copes with the level of detail required for an actual implementation and code generation from the derived models.

  19. Online automatic tuning and control for fed-batch cultivation

    PubMed Central

    van Straten, Gerrit; van der Pol, Leo A.; van Boxtel, Anton J. B.

    2007-01-01

    Performance of controllers applied in biotechnological production is often below expectation. Online automatic tuning has the capability to improve control performance by adjusting control parameters. This work presents automatic tuning approaches for model reference specific growth rate control during fed-batch cultivation. The approaches are direct methods that use the error between observed specific growth rate and its set point; systematic perturbations of the cultivation are not necessary. Two automatic tuning methods proved to be efficient, in which the adaptation rate is based on a combination of the error, squared error and integral error. These methods are relatively simple and robust against disturbances, parameter uncertainties, and initialization errors. Application of the specific growth rate controller yields a stable system. The controller and automatic tuning methods are qualified by simulations and laboratory experiments with Bordetella pertussis. PMID:18157554
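
    As an illustration only, one possible adaptation law combining the error, a signed squared-error term and the integral error might look like the following; the structure and the gain values are assumptions, not the tuning laws reported in the paper.

```python
def tune_gain(controller_gain, error, integral_error, dt,
              k_e=0.05, k_e2=0.02, k_i=0.01):
    """One plausible online tuning step for a specific-growth-rate controller.

    The control gain is adapted using a weighted combination of the tracking
    error, its (signed) square, and its running integral, as the abstract
    suggests. The weights k_e, k_e2 and k_i are hypothetical constants.
    """
    integral_error += error * dt
    adaptation_rate = (k_e * error
                       + k_e2 * error * abs(error)   # signed squared error
                       + k_i * integral_error)
    controller_gain += adaptation_rate * dt
    return controller_gain, integral_error

# toy usage: adapt the gain while the observed growth rate lags the set point
gain, ierr = 1.0, 0.0
for mu_obs in [0.08, 0.09, 0.095, 0.10]:
    error = 0.10 - mu_obs                 # set point minus observation (1/h)
    gain, ierr = tune_gain(gain, error, ierr, dt=0.5)
print(round(gain, 4))
```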

  20. Chromosome-specific staining to detect genetic rearrangements

    DOEpatents

    Gray, Joe W.; Pinkel, Daniel; Tkachuk, Douglas; Westbrook, Carol

    2013-04-09

    Methods and compositions for staining based upon nucleic acid sequence that employ nucleic acid probes are provided. Said methods produce staining patterns that can be tailored for specific cytogenetic analyses. Said probes are appropriate for in situ hybridization and stain both interphase and metaphase chromosomal material with reliable signals. The nucleic acid probes are typically of a complexity greater than 50 kb, the complexity depending upon the cytogenetic application. Methods and reagents are provided for the detection of genetic rearrangements. Probes and test kits are provided for use in detecting genetic rearrangements, particularly for use in tumor cytogenetics, in the detection of disease related loci, specifically cancer, such as chronic myelogenous leukemia (CML) and for biological dosimetry. Methods and reagents are described for cytogenetic research, for the differentiation of cytogenetically similar but genetically different diseases, and for many prognostic and diagnostic applications.

  1. Development of a polymerase chain reaction applicable to rapid and sensitive detection of Clonorchis sinensis eggs in human stool samples

    PubMed Central

    Cho, Pyo Yun; Na, Byoung-Kuk; Mi Choi, Kyung; Kim, Jin Su; Cho, Shin-Hyeong; Lee, Won-Ja; Lim, Sung-Bin; Cha, Seok Ho; Park, Yun-Kyu; Pak, Jhang Ho; Lee, Hyeong-Woo; Hong, Sung-Jong; Kim, Tong-Soo

    2013-01-01

    Microscopic examination of eggs of parasitic helminths in stool samples has been the most widely used classical diagnostic method for infections, but the small size and low numbers of eggs in stool samples often hamper diagnosis of helminthic infections by classical microscopic examination. Moreover, it is also difficult to differentiate parasite eggs by the classical method if they have similar morphological characteristics. In this study, we developed a rapid and sensitive polymerase chain reaction (PCR)-based molecular diagnostic method for detection of Clonorchis sinensis eggs in stool samples. Nine primers were designed based on the long-terminal repeat (LTR) of the C. sinensis retrotransposon 1 (CsRn1) gene, and seven PCR primer sets were paired. Polymerase chain reaction with each primer pair produced specific amplicons for C. sinensis, but not for other trematodes including Metagonimus yokogawai and Paragonimus westermani. In particular, three primer sets were able to detect 10 C. sinensis eggs and were applicable to amplifying specific amplicons from DNA samples purified from stool of C. sinensis-infected patients. This PCR method could be useful for diagnosis of C. sinensis infections in human stool samples with a high level of specificity and sensitivity. PMID:23916334

  2. Toward patient-specific articular contact mechanics

    PubMed Central

    Ateshian, Gerard A.; Henak, Corinne R.; Weiss, Jeffrey A.

    2015-01-01

    The mechanics of contacting cartilage layers is fundamentally important to understanding the development, homeostasis and pathology of diarthrodial joints. Because of the highly nonlinear nature of both the materials and the contact problem itself, numerical methods such as the finite element method are typically incorporated to obtain solutions. Over the course of five decades, we have moved from an initial qualitative understanding of articular cartilage material behavior to the ability to perform complex, three-dimensional contact analysis, including multiphasic material representations. This history includes the development of analytical and computational contact analysis methods that now provide the ability to perform highly nonlinear analyses. Numerical implementations of contact analysis based on the finite element method are rapidly advancing and will soon enable patient-specific analysis of joint contact mechanics using models based on medical image data. In addition to contact stress on the articular surfaces, these techniques can predict variations in stress and strain through the cartilage layers, providing the basis to predict damage and failure. This opens up exciting areas for future research and application to patient-specific diagnosis and treatment planning applied to a variety of pathologies that affect joint function and cartilage homeostasis. PMID:25698236

  3. Practice Makes Perfect: Using a Computer-Based Business Simulation in Entrepreneurship Education

    ERIC Educational Resources Information Center

    Armer, Gina R. M.

    2011-01-01

    This article explains the use of a specific computer-based simulation program as a successful experiential learning model and as a way to increase student motivation while augmenting conventional methods of business instruction. This model is based on established adult learning principles.

  4. An Object-Based Requirements Modeling Method.

    ERIC Educational Resources Information Center

    Cordes, David W.; Carver, Doris L.

    1992-01-01

    Discusses system modeling and specification as it relates to object-based information systems development and software development. An automated system model based on the objects in the initial requirements document is described, the requirements document translator is explained, and a sample application of the technique is provided. (12…

  5. Validation of Skeletal Muscle cis-Regulatory Module Predictions Reveals Nucleotide Composition Bias in Functional Enhancers

    PubMed Central

    Kwon, Andrew T.; Chou, Alice Yi; Arenillas, David J.; Wasserman, Wyeth W.

    2011-01-01

    We performed a genome-wide scan for muscle-specific cis-regulatory modules (CRMs) using three computational prediction programs. Based on the predictions, 339 candidate CRMs were tested in cell culture with NIH3T3 fibroblasts and C2C12 myoblasts for the capacity to direct selective reporter gene expression to differentiated C2C12 myotubes. A subset of 19 CRMs validated as functional in the assay. The rate of predictive success reveals striking limitations of computational regulatory sequence analysis methods for CRM discovery. Motif-based methods performed no better than predictions based only on sequence conservation. Analysis of the properties of the functional sequences relative to inactive sequences identifies nucleotide sequence composition as an important characteristic to incorporate in future methods for improved predictive specificity. Muscle-related TFBSs predicted within the functional sequences display greater sequence conservation than non-TFBS flanking regions. Comparison with recent MyoD and histone modification ChIP-Seq data supports the validity of the functional regions. PMID:22144875

  6. Research on keyword retrieval method of HBase database based on index structure

    NASA Astrophysics Data System (ADS)

    Gong, Pijin; Lv, Congmin; Gong, Yongsheng; Ma, Haozhi; Sun, Yang; Wang, Lu

    2017-10-01

    With the rapid development of manned spaceflight engineering, the scientific experimental data in the space application system are increasing rapidly. How to efficiently query specific data within massive data volumes has become a problem. In this paper, a method is proposed for retrieving object data using object attributes as keywords. The HBase database is used to store the object data and object attributes, and a secondary index is constructed. The research shows that this method is an effective way to retrieve specified data based on object attributes.
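
    The secondary-index idea can be shown in a database-agnostic way. The sketch below uses plain Python containers rather than any HBase client API, and the object names are made up; in a real deployment both the data table and the index table would live in HBase.

```python
from collections import defaultdict

class AttributeIndex:
    """Toy secondary index: (attribute, value) -> set of object (row) keys."""

    def __init__(self):
        self.data = {}                      # row key -> {attribute: value}
        self.index = defaultdict(set)       # (attribute, value) -> row keys

    def put(self, row_key, attributes):
        """Store an object and update the secondary index for each attribute."""
        self.data[row_key] = attributes
        for attr, value in attributes.items():
            self.index[(attr, value)].add(row_key)

    def query(self, attribute, value):
        """Return the objects whose attribute matches the keyword."""
        return [self.data[k] for k in self.index.get((attribute, value), ())]

# toy usage
store = AttributeIndex()
store.put("exp001", {"instrument": "spectrometer", "orbit": "LEO"})
store.put("exp002", {"instrument": "camera", "orbit": "LEO"})
print(store.query("orbit", "LEO"))
```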

  7. QUALITY ASSURANCE FOR PCR

    EPA Science Inventory

    The U.S. Environmental Protection Agency (EPA) held a workshop in January 2003 on the detection of viruses in water using polymerase chain reaction (PCR)-based methods. Speakers were asked to address a series of specific questions, including whether a single standard method coul...

  8. 16 CFR 1000.29 - Directorate for Engineering Sciences.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... standards, product safety tests and test methods, performance criteria, design specifications, and quality control standards for consumer products, based on engineering and scientific methods. It conducts... consumer interest groups. The Directorate conducts human factors studies and research of consumer product...

  9. 16 CFR 1000.29 - Directorate for Engineering Sciences.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... standards, product safety tests and test methods, performance criteria, design specifications, and quality control standards for consumer products, based on engineering and scientific methods. It conducts... consumer interest groups. The Directorate conducts human factors studies and research of consumer product...

  10. Analysis and feasibility of asphalt pavement performance-based specifications for WisDOT.

    DOT National Transportation Integrated Search

    2016-12-25

    Literature review of the most recent methods used for effective characterization of asphalt mixtures resulted in selecting a set of test methods for measuring mixture resistance to rutting and moisture damage at high temperature, fatigue cracking at inte...

  11. DEVELOPMENT OF SULFHYDRYL-REACTIVE SILICA FOR PROTEIN IMMOBILIZATION IN HIGH-PERFORMANCE AFFINITY CHROMATOGRAPHY

    PubMed Central

    Mallik, Rangan; Wa, Chunling; Hage, David S.

    2008-01-01

    Two techniques were developed for the immobilization of proteins and other ligands to silica through sulfhydryl groups. These methods made use of maleimide-activated silica (the SMCC method) or iodoacetyl-activated silica (the SIA method). The resulting supports were tested for use in high-performance affinity chromatography by employing human serum albumin (HSA) as a model protein. Studies with normal and iodoacetamide-modified HSA indicated that these methods had a high selectivity for sulfhydryl groups on this protein, which accounted for the coupling of 77–81% of this protein to maleimide- or iodoacetyl-activated silica. These supports were also evaluated in terms of their total protein content, binding capacity, specific activity, non-specific binding, stability and chiral selectivity for several test solutes. HSA columns prepared using maleimide-activated silica gave the best overall results for these properties when compared to HSA that had been immobilized to silica through the Schiff base method (i.e., an amine-based coupling technique). A key advantage of the supports developed in this work is that they offer the potential of giving greater site-selective immobilization and ligand activity than amine-based coupling methods. These features make these supports attractive in the development of protein columns for such applications as the study of biological interactions and chiral separations. PMID:17297940

  12. On-line monitoring of H2 generation and the HTF degradation in parabolic trough solar thermal power plants: Development of an optical sensor based on an innovative approach

    NASA Astrophysics Data System (ADS)

    Pagola, Iñigo; Funcia, Ibai; Sánchez, Marcelino; Gil, Javier; González-Vallejo, Victoria; Bedoya, Maxi; Orellana, Guillermo

    2017-06-01

    The work presented in this paper offers a robust, effective and economically competitive method for online detection and monitoring of the presence of molecular hydrogen in the heat transfer fluids of parabolic trough collector plants. The novel method is based on a specific fluorescent sensor according to the ES2425002 patent ("Method for the detection and quantification of hydrogen in a heat transfer fluid").

  13. Research on Generating Method of Embedded Software Test Document Based on Dynamic Model

    NASA Astrophysics Data System (ADS)

    Qu, MingCheng; Wu, XiangHu; Tao, YongChao; Liu, Ying

    2018-03-01

    This paper presents a dynamic model-based test document generation method for embedded software that automatically generates two documents: the test requirements specification and the configuration item test documentation. The method implements dynamic test requirements in dynamic models, so that dynamic test-requirement tracking can be generated easily. It automatically produces standardized test requirements and test documentation, addresses issues such as inconsistency and lack of integrity in document content, and improves efficiency.

  14. Statistical methods for analysis of radiation effects with tumor and dose location-specific information with application to the WECARE study of asynchronous contralateral breast cancer

    PubMed Central

    Langholz, Bryan; Thomas, Duncan C.; Stovall, Marilyn; Smith, Susan A.; Boice, John D.; Shore, Roy E.; Bernstein, Leslie; Lynch, Charles F.; Zhang, Xinbo; Bernstein, Jonine L.

    2009-01-01

    Summary Methods for the analysis of individually matched case-control studies with location-specific radiation dose and tumor location information are described. These include likelihood methods for analyses that just use cases with precise location of tumor information and methods that also include cases with imprecise tumor location information. The theory establishes that each of these likelihood based methods estimates the same radiation rate ratio parameters, within the context of the appropriate model for location and subject level covariate effects. The underlying assumptions are characterized and the potential strengths and limitations of each method are described. The methods are illustrated and compared using the WECARE study of radiation and asynchronous contralateral breast cancer. PMID:18647297

  15. Staining Methods for Normal and Regenerative Myelin in the Nervous System.

    PubMed

    Carriel, Víctor; Campos, Antonio; Alaminos, Miguel; Raimondo, Stefania; Geuna, Stefano

    2017-01-01

    Histochemical techniques enable the specific identification of myelin by light microscopy. Here we describe three histochemical methods for the staining of myelin suitable for formalin-fixed and paraffin-embedded materials. The first method is the conventional luxol fast blue (LFB) method, which stains myelin in blue and Nissl bodies and mast cells in purple. The second method is an LFB-based method called MCOLL, which specifically stains the myelin as well as the collagen fibers and cells, giving an integrated overview of the histology and myelin content of the tissue. Finally, we describe the osmium tetroxide method, which consists of the osmication of previously fixed tissues. Osmication is performed prior to the embedding of tissues in paraffin, giving a permanent positive reaction for myelin as well as other lipids present in the tissue.

  16. A linear parameter-varying multiobjective control law design based on youla parametrization for a flexible blended wing body aircraft

    NASA Astrophysics Data System (ADS)

    Demourant, F.; Ferreres, G.

    2013-12-01

    This article presents a methodology, and results, for linear parameter-varying (LPV) multiobjective flight control law design for a blended wing body (BWB) aircraft. The method is a direct design of a parametrized control law (with respect to some measured flight parameters) through a multimodel convex design that optimizes a set of specifications over the full flight domain and different mass cases. The methodology is based on the Youla parameterization, which is very useful since closed-loop specifications are affine with respect to the Youla parameter. The LPV multiobjective design method is detailed and applied to the BWB flexible aircraft example.

  17. Recombinational Cloning Using Gateway and In-Fusion Cloning Schemes

    PubMed Central

    Throop, Andrea L.; LaBaer, Joshua

    2015-01-01

    The comprehensive study of protein structure and function, or proteomics, depends on the obtainability of full-length cDNAs in species-specific expression vectors and subsequent functional analysis of the expressed protein. Recombinational cloning is a universal cloning technique based on site-specific recombination that is independent of the insert DNA sequence of interest, which differentiates this method from the classical restriction enzyme-based cloning methods. Recombinational cloning enables rapid and efficient parallel transfer of DNA inserts into multiple expression systems. This unit summarizes strategies for generating expression-ready clones using the most popular recombinational cloning technologies, including the commercially available Gateway® (Life Technologies) and In-Fusion® (Clontech) cloning technologies. PMID:25827088

  18. A microRNA detection system based on padlock probes and rolling circle amplification

    PubMed Central

    Jonstrup, Søren Peter; Koch, Jørn; Kjems, Jørgen

    2006-01-01

    The differential expression and the regulatory roles of microRNAs (miRNAs) are being studied intensively these years. Their minute size of only 19–24 nucleotides and strong sequence similarity among related species call for enhanced methods for reliable detection and quantification. Moreover, miRNA expression is generally restricted to a limited number of specific cells within an organism and therefore requires highly sensitive detection methods. Here we present a simple and reliable miRNA detection protocol based on padlock probes and rolling circle amplification. It can be performed without specialized equipment and is capable of measuring the content of specific miRNAs in a few nanograms of total RNA. PMID:16888321

  19. A microRNA detection system based on padlock probes and rolling circle amplification.

    PubMed

    Jonstrup, Søren Peter; Koch, Jørn; Kjems, Jørgen

    2006-09-01

    The differential expression and the regulatory roles of microRNAs (miRNAs) are being studied intensively these years. Their minute size of only 19-24 nucleotides and strong sequence similarity among related species call for enhanced methods for reliable detection and quantification. Moreover, miRNA expression is generally restricted to a limited number of specific cells within an organism and therefore requires highly sensitive detection methods. Here we present a simple and reliable miRNA detection protocol based on padlock probes and rolling circle amplification. It can be performed without specialized equipment and is capable of measuring the content of specific miRNAs in a few nanograms of total RNA.

  20. Wideband dichroic-filter design for LED-phosphor beam-combining

    DOEpatents

    Falicoff, Waqidi

    2010-12-28

    A general method is disclosed of designing two-component dichroic short-pass filters operable for incidence angle distributions over the 0-30° range, and specific preferred embodiments are listed. The method is based on computer optimization algorithms for an N-layer design, specifically the N-dimensional conjugate-gradient minimization of a merit function based on difference from a target transmission spectrum, as well as subsequent cycles of needle synthesis for increasing N. A key feature of the method is the initial filter design, upon which the algorithm proceeds to iterate successive design candidates with smaller merit functions. This initial design, with high-index material H and low-index L, is (0.75 H, 0.5 L, 0.75 H)^m, denoting m (20-30) repetitions of a three-layer motif, giving rise to a filter with N = 2m + 1.
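
    The initial design and the merit function can be sketched as follows; interpreting 0.75 and 0.5 as fractions of a quarter-wave optical thickness is an assumption, since the abstract does not define the unit, and the refractive indices and target spectrum of a real design are not reproduced here.

```python
def initial_stack(m=25):
    """Build the initial short-pass design (0.75 H, 0.5 L, 0.75 H)^m.

    Each entry is (material, thickness), with thickness interpreted here as a
    fraction of a quarter-wave optical thickness at the design wavelength.
    The list has 3*m entries; adjacent H layers at motif boundaries merge
    physically, giving N = 2m + 1 distinct layers as stated in the abstract.
    """
    motif = [("H", 0.75), ("L", 0.5), ("H", 0.75)]
    return motif * m

def merit(transmission, target):
    """Sum-of-squared-differences merit function against a target spectrum."""
    return sum((t - g) ** 2 for t, g in zip(transmission, target))

# toy usage
stack = initial_stack(m=25)
print(len(stack), stack[:3])
print(merit([0.95, 0.90, 0.10], [1.0, 1.0, 0.0]))
```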

  1. Development of the IBSAL-SimMOpt Method for the Optimization of Quality in a Corn Stover Supply Chain

    DOE PAGES

    Chavez, Hernan; Castillo-Villar, Krystel; Webb, Erin

    2017-08-01

    Variability on the physical characteristics of feedstock has a relevant effect on the reactor’s reliability and operating cost. Most of the models developed to optimize biomass supply chains have failed to quantify the effect of biomass quality and preprocessing operations required to meet biomass specifications on overall cost and performance. The Integrated Biomass Supply Analysis and Logistics (IBSAL) model estimates the harvesting, collection, transportation, and storage cost while considering the stochastic behavior of the field-to-biorefinery supply chain. This paper proposes an IBSAL-SimMOpt (Simulation-based Multi-Objective Optimization) method for optimizing the biomass quality and costs associated with the efforts needed to meet conversion technology specifications. The method is developed in two phases. For the first phase, a SimMOpt tool that interacts with the extended IBSAL is developed. For the second phase, the baseline IBSAL model is extended so that the cost for meeting and/or penalization for failing in meeting specifications are considered. The IBSAL-SimMOpt method is designed to optimize quality characteristics of biomass, cost related to activities intended to improve the quality of feedstock, and the penalization cost. A case study based on 1916 farms in Ontario, Canada is considered for testing the proposed method. Analysis of the results demonstrates that this method is able to find a high-quality set of non-dominated solutions.

  2. Development of the IBSAL-SimMOpt Method for the Optimization of Quality in a Corn Stover Supply Chain

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chavez, Hernan; Castillo-Villar, Krystel; Webb, Erin

    Variability on the physical characteristics of feedstock has a relevant effect on the reactor’s reliability and operating cost. Most of the models developed to optimize biomass supply chains have failed to quantify the effect of biomass quality and preprocessing operations required to meet biomass specifications on overall cost and performance. The Integrated Biomass Supply Analysis and Logistics (IBSAL) model estimates the harvesting, collection, transportation, and storage cost while considering the stochastic behavior of the field-to-biorefinery supply chain. This paper proposes an IBSAL-SimMOpt (Simulation-based Multi-Objective Optimization) method for optimizing the biomass quality and costs associated with the efforts needed to meet conversion technology specifications. The method is developed in two phases. For the first phase, a SimMOpt tool that interacts with the extended IBSAL is developed. For the second phase, the baseline IBSAL model is extended so that the cost for meeting and/or penalization for failing in meeting specifications are considered. The IBSAL-SimMOpt method is designed to optimize quality characteristics of biomass, cost related to activities intended to improve the quality of feedstock, and the penalization cost. A case study based on 1916 farms in Ontario, Canada is considered for testing the proposed method. Analysis of the results demonstrates that this method is able to find a high-quality set of non-dominated solutions.

  3. Coupled double-distribution-function lattice Boltzmann method for the compressible Navier-Stokes equations.

    PubMed

    Li, Q; He, Y L; Wang, Y; Tao, W Q

    2007-11-01

    A coupled double-distribution-function lattice Boltzmann method is developed for the compressible Navier-Stokes equations. Different from existing thermal lattice Boltzmann methods, this method can recover the compressible Navier-Stokes equations with a flexible specific-heat ratio and Prandtl number. In the method, a density distribution function based on a multispeed lattice is used to recover the compressible continuity and momentum equations, while the compressible energy equation is recovered by an energy distribution function. The energy distribution function is then coupled to the density distribution function via the thermal equation of state. In order to obtain an adjustable specific-heat ratio, a constant related to the specific-heat ratio is introduced into the equilibrium energy distribution function. Two different coupled double-distribution-function lattice Boltzmann models are also proposed in the paper. Numerical simulations are performed for the Riemann problem, the double-Mach-reflection problem, and the Couette flow with a range of specific-heat ratios and Prandtl numbers. The numerical results are found to be in excellent agreement with analytical and/or other solutions.

  4. In vivo serial MRI-based models and statistical methods to quantify sensitivity and specificity of mechanical predictors for carotid plaque rupture: location and beyond.

    PubMed

    Wu, Zheyang; Yang, Chun; Tang, Dalin

    2011-06-01

    It has been hypothesized that mechanical risk factors may be used to predict future atherosclerotic plaque rupture. Truly predictive methods for plaque rupture and methods to identify the best predictor(s) from all the candidates are lacking in the literature. A novel combination of computational and statistical models based on serial magnetic resonance imaging (MRI) was introduced to quantify sensitivity and specificity of mechanical predictors to identify the best candidate for plaque rupture site prediction. Serial in vivo MRI data of carotid plaque from one patient was acquired with follow-up scan showing ulceration. 3D computational fluid-structure interaction (FSI) models using both baseline and follow-up data were constructed and plaque wall stress (PWS) and strain (PWSn) and flow maximum shear stress (FSS) were extracted from all 600 matched nodal points (100 points per matched slice, baseline matching follow-up) on the lumen surface for analysis. Each of the 600 points was marked "ulcer" or "nonulcer" using follow-up scan. Predictive statistical models for each of the seven combinations of PWS, PWSn, and FSS were trained using the follow-up data and applied to the baseline data to assess their sensitivity and specificity using the 600 data points for ulcer predictions. Sensitivity of prediction is defined as the proportion of the true positive outcomes that are predicted to be positive. Specificity of prediction is defined as the proportion of the true negative outcomes that are correctly predicted to be negative. Using probability 0.3 as a threshold to infer ulcer occurrence at the prediction stage, the combination of PWS and PWSn provided the best predictive accuracy with (sensitivity, specificity) = (0.97, 0.958). Sensitivity and specificity given by PWS, PWSn, and FSS individually were (0.788, 0.968), (0.515, 0.968), and (0.758, 0.928), respectively. The proposed computational-statistical process provides a novel method and a framework to assess the sensitivity and specificity of various risk indicators and offers the potential to identify the optimized predictor for plaque rupture using serial MRI with follow-up scan showing ulceration as the gold standard for method validation. While serial MRI data with actual rupture are hard to acquire, this single-case study suggests that combination of multiple predictors may provide potential improvement to existing plaque assessment schemes. With large-scale patient studies, this predictive modeling process may provide more solid ground for rupture predictor selection strategies and methods for image-based plaque vulnerability assessment.
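
    The sensitivity and specificity definitions used above translate directly into a small computation over the matched surface points. The sketch below uses made-up probabilities and labels, with the 0.3 threshold taken from the abstract.

```python
def sensitivity_specificity(probabilities, is_ulcer, threshold=0.3):
    """Compute sensitivity and specificity of ulcer-site predictions.

    probabilities: predicted ulcer probability at each lumen-surface point.
    is_ulcer:      True where the follow-up scan showed ulceration.
    threshold:     probability above which a point is predicted "ulcer".
    """
    tp = fn = tn = fp = 0
    for p, truth in zip(probabilities, is_ulcer):
        predicted = p >= threshold
        if truth and predicted:
            tp += 1
        elif truth and not predicted:
            fn += 1
        elif not truth and not predicted:
            tn += 1
        else:
            fp += 1
    sensitivity = tp / (tp + fn) if (tp + fn) else float("nan")
    specificity = tn / (tn + fp) if (tn + fp) else float("nan")
    return sensitivity, specificity

# toy usage with made-up predictions for six surface points
print(sensitivity_specificity([0.8, 0.4, 0.1, 0.05, 0.35, 0.2],
                              [True, True, False, False, False, False]))
```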

  5. Working with Sparse Data in Rated Language Tests: Generalizability Theory Applications

    ERIC Educational Resources Information Center

    Lin, Chih-Kai

    2017-01-01

    Sparse-rated data are common in operational performance-based language tests, as an inevitable result of assigning examinee responses to a fraction of available raters. The current study investigates the precision of two generalizability-theory methods (i.e., the rating method and the subdividing method) specifically designed to accommodate the…

  6. e-Learning Business Research Methods

    ERIC Educational Resources Information Center

    Cowie, Jonathan

    2004-01-01

    This paper outlines the development of a generic Business Research Methods course from a simple name in a box to a full e-Learning web based module. It highlights particular issues surrounding the nature of the discipline and the integration of a large number of cross faculty subject specific research methods courses into a single generic module.…

  7. Interactive Learning in the Classroom: Is Student Response Method Related to Performance?

    ERIC Educational Resources Information Center

    Elicker, Joelle D.; McConnell, Nicole L.

    2011-01-01

    This study examined three methods of responding to in-class multiple-choice concept questions in an Introduction to Psychology course. Specifically, this study compared exam performance and student reactions using three methods of responding to concept questions: (a) a technology-based network system, (b) hand-held flashcards, and (c) hand…

  8. Specific Yields Estimated from Gravity Change during Pumping Test

    NASA Astrophysics Data System (ADS)

    Chen, K. H.; Hwang, C.; Chang, L. C.

    2017-12-01

    Specific yield (Sy) is the most important parameter describing the available groundwater capacity of an unconfined aquifer. When Sy is estimated from a field pumping test, aquifer heterogeneity and well performance introduce large uncertainty. In this study, we use a gravity-based method to estimate Sy. During a pumping test, a known mass of groundwater is removed; if the drawdown cone is large and close enough to a high-precision gravimeter, the resulting gravity change can be detected. The gravity-based method uses gravity observations that are independent of traditional flow computations; only the drawdown cone needs to be modeled with observed head and hydrogeology data. The gravity method can be used in most groundwater field tests, including local pumping or injection tests as well as annual water-table variations driven by natural sources. We apply our gravity method at a few sites in Taiwan situated over different unconfined aquifers, where pumping tests for Sy determination were also carried out. We discuss why the gravity method produces results different from the traditional pumping test, as well as field designs and limitations of the gravity method.
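
    The link between a water-table change and the measurable gravity change is often approximated with an infinite Bouguer slab, which gives about 41.9 µGal per metre of free-standing water. The sketch below uses that approximation only, whereas the study models the finite geometry of the drawdown cone explicitly.

```python
import math

G = 6.674e-11          # gravitational constant, m^3 kg^-1 s^-2
RHO_WATER = 1000.0     # kg m^-3
UGAL_PER_MS2 = 1e8     # 1 m/s^2 = 1e8 microGal

def specific_yield_from_gravity(delta_g_ugal, delta_head_m):
    """Estimate specific yield from a gravity change and a water-table change.

    Uses the infinite Bouguer slab approximation
        delta_g = 2 * pi * G * rho_w * Sy * delta_h
    (about 41.9 microGal per metre of free-standing water), ignoring the
    finite geometry of the drawdown cone.
    """
    slab_factor = 2 * math.pi * G * RHO_WATER * UGAL_PER_MS2   # ~41.9 uGal/m
    return delta_g_ugal / (slab_factor * delta_head_m)

# toy usage: a 5 microGal change observed for a 1.2 m drawdown
print(round(specific_yield_from_gravity(5.0, 1.2), 3))   # ~0.10
```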

  9. Comparison of methods for in-house screening of HLA-B*57:01 to prevent abacavir hypersensitivity in HIV-1 care.

    PubMed

    De Spiegelaere, Ward; Philippé, Jan; Vervisch, Karen; Verhofstede, Chris; Malatinkova, Eva; Kiselinova, Maja; Trypsteen, Wim; Bonczkowski, Pawel; Vogelaers, Dirk; Callens, Steven; Ruelle, Jean; Kabeya, Kabamba; De Wit, Stephane; Van Acker, Petra; Van Sandt, Vicky; Emonds, Marie-Paule; Coucke, Paul; Sermijn, Erica; Vandekerckhove, Linos

    2015-01-01

    Abacavir is a nucleoside reverse transcriptase inhibitor used as part of combination antiretroviral therapy in HIV-1-infected patients. Because this drug can cause a hypersensitivity reaction that is correlated with the presence of the HLA-B*57:01 allotype, screening for the presence of HLA-B*57:01 is recommended before abacavir initiation. Different genetic assays have been developed for HLA-B*57:01 screening, each with specific sensitivity, turnaround time and assay costs. Here, a new real-time PCR (qPCR) based analysis is described and compared to sequence specific primer PCR with capillary electrophoresis (SSP PCR CE) on 149 patient-derived samples, using sequence specific oligonucleotide hybridization combined with high resolution SSP PCR as gold standard. In addition to these PCR based methods, a complementary approach was developed using flow cytometry with an HLA-B17 specific monoclonal antibody as a pre-screening assay to diminish the number of samples for genetic testing. All three assays had a maximum sensitivity of >99%. However, differences in specificity were recorded, i.e. 84.3%, 97.2% and >99% for flow cytometry, qPCR and SSP PCR CE, respectively. Our data indicate that the most specific and sensitive of the compared methods is the SSP PCR CE. Flow cytometry pre-screening can substantially decrease the number of genetic tests for HLA-B*57:01 typing in a clinical setting.

  10. Toward a method of collaborative, evidence-based response to desertification

    USDA-ARS's Scientific Manuscript database

    Overgeneralized narratives about how desertified ecosystems will respond to restoration actions may result in wasted resources, missed opportunities, or accelerated degradation. Evidence-based collaborative adaptive management (CAM) could solve this problem by providing site-specific information tha...

  11. Orbital and maxillofacial computer aided surgery: patient-specific finite element models to predict surgical outcomes.

    PubMed

    Luboz, Vincent; Chabanas, Matthieu; Swider, Pascal; Payan, Yohan

    2005-08-01

    This paper addresses an important issue raised for the clinical relevance of Computer-Assisted Surgical applications, namely the methodology used to automatically build patient-specific finite element (FE) models of anatomical structures. From this perspective, a method is proposed, based on a technique called the mesh-matching method, followed by a process that corrects mesh irregularities. The mesh-matching algorithm generates patient-specific volume meshes from an existing generic model. The mesh regularization process is based on the Jacobian matrix transform related to the FE reference element and the current element. This method for generating patient-specific FE models is first applied to computer-assisted maxillofacial surgery, and more precisely, to the FE elastic modelling of patient facial soft tissues. For each patient, the planned bone osteotomies (mandible, maxilla, chin) are used as boundary conditions to deform the FE face model, in order to predict the aesthetic outcome of the surgery. Seven FE patient-specific models were successfully generated by our method. For one patient, the prediction of the FE model is qualitatively compared with the patient's post-operative appearance, measured from a computer tomography scan. Then, our methodology is applied to computer-assisted orbital surgery. It is, therefore, evaluated for the generation of 11 patient-specific FE poroelastic models of the orbital soft tissues. These models are used to predict the consequences of the surgical decompression of the orbit. More precisely, an average law is extrapolated from the simulations carried out for each patient model. This law links the size of the osteotomy (i.e. the surgical gesture) and the backward displacement of the eyeball (the consequence of the surgical gesture).

  12. Individual subject classification for Alzheimer's disease based on incremental learning using a spatial frequency representation of cortical thickness data.

    PubMed

    Cho, Youngsang; Seong, Joon-Kyung; Jeong, Yong; Shin, Sung Yong

    2012-02-01

    Patterns of brain atrophy measured by magnetic resonance structural imaging have been utilized as significant biomarkers for diagnosis of Alzheimer's disease (AD). However, brain atrophy is variable across patients and is non-specific for AD in general. Thus, automatic methods for AD classification require a large number of structural data due to complex and variable patterns of brain atrophy. In this paper, we propose an incremental method for AD classification using cortical thickness data. We represent the cortical thickness data of a subject in terms of their spatial frequency components, employing the manifold harmonic transform. The basis functions for this transform are obtained from the eigenfunctions of the Laplace-Beltrami operator, which are dependent only on the geometry of a cortical surface but not on the cortical thickness defined on it. This facilitates individual subject classification based on incremental learning. In general, methods based on region-wise features poorly reflect the detailed spatial variation of cortical thickness, and those based on vertex-wise features are sensitive to noise. Adopting a vertex-wise cortical thickness representation, our method can still achieve robustness to noise by filtering out high frequency components of the cortical thickness data while reflecting their spatial variation. This compromise leads to high accuracy in AD classification. We utilized MR volumes provided by Alzheimer's Disease Neuroimaging Initiative (ADNI) to validate the performance of the method. Our method discriminated AD patients from Healthy Control (HC) subjects with 82% sensitivity and 93% specificity. It also discriminated Mild Cognitive Impairment (MCI) patients, who converted to AD within 18 months, from non-converted MCI subjects with 63% sensitivity and 76% specificity. Moreover, it showed that the entorhinal cortex was the most discriminative region for classification, which is consistent with previous pathological findings. In comparison with other classification methods, our method demonstrated high classification performance in both categories, which supports the discriminative power of our method in both AD diagnosis and AD prediction. Copyright © 2011 Elsevier Inc. All rights reserved.
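
    The core idea, filtering out high-frequency components of a vertex-wise signal using the low-order eigenfunctions of a Laplacian, can be illustrated with a small graph-Laplacian sketch. A graph Laplacian stands in here for the Laplace-Beltrami operator on the cortical surface, and the toy mesh, signal, and cut-off are assumptions for illustration only.

    ```python
    # Minimal sketch of low-pass filtering a per-vertex signal (e.g. cortical
    # thickness) with the eigenvectors of a graph Laplacian.
    import numpy as np


    def low_pass_filter(adjacency: np.ndarray, signal: np.ndarray, n_modes: int) -> np.ndarray:
        """Project a vertex-wise signal onto the first n_modes Laplacian eigenvectors."""
        degree = np.diag(adjacency.sum(axis=1))
        laplacian = degree - adjacency
        # Eigenvectors of the symmetric Laplacian, ordered by increasing frequency.
        _, eigvecs = np.linalg.eigh(laplacian)
        basis = eigvecs[:, :n_modes]
        coefficients = basis.T @ signal       # forward transform
        return basis @ coefficients           # keep only low-frequency components


    if __name__ == "__main__":
        # Toy 5-vertex ring graph with a noisy "thickness" signal.
        A = np.array([[0, 1, 0, 0, 1],
                      [1, 0, 1, 0, 0],
                      [0, 1, 0, 1, 0],
                      [0, 0, 1, 0, 1],
                      [1, 0, 0, 1, 0]], dtype=float)
        thickness = np.array([2.9, 3.1, 2.8, 3.2, 3.0])
        print(low_pass_filter(A, thickness, n_modes=2))
    ```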

  13. Microscale Concentration Measurements Using Laser Light Scattering Methods

    NASA Technical Reports Server (NTRS)

    Niederhaus, Charles; Miller, Fletcher

    2004-01-01

    The development of lab-on-a-chip devices for microscale biochemical assays has led to the need for microscale concentration measurements of specific analytes. While fluorescence methods are the current choice, they require developing a fluorophore-tagged conjugate for each analyte of interest. In addition, fluorescent imaging is a volume-based method and can become limiting as smaller detection regions are required.

  14. Recent developments in detection and enumeration of waterborne bacteria: a retrospective minireview.

    PubMed

    Deshmukh, Rehan A; Joshi, Kopal; Bhand, Sunil; Roy, Utpal

    2016-12-01

    Waterborne diseases have emerged as global health problems and their rapid and sensitive detection in environmental water samples is of great importance. Bacterial identification and enumeration in water samples is significant as it helps to maintain safe drinking water for public consumption. Culture-based methods are laborious, time-consuming, and yield false-positive results, whereas viable but nonculturable (VBNC) microorganisms cannot be recovered. Hence, numerous methods have been developed for rapid detection and quantification of waterborne pathogenic bacteria in water. These rapid methods can be classified into nucleic acid-based, immunology-based, and biosensor-based detection methods. This review summarizes the principle and current state of rapid methods for the monitoring and detection of waterborne bacterial pathogens. Rapid methods outlined are polymerase chain reaction (PCR), digital droplet PCR, real-time PCR, multiplex PCR, DNA microarray, next-generation sequencing (pyrosequencing, Illumina technology and genomics), and fluorescence in situ hybridization, which are categorized as nucleic acid-based methods. Enzyme-linked immunosorbent assay (ELISA) and immunofluorescence are classified into immunology-based methods. Optical, electrochemical, and mass-based biosensors are grouped into biosensor-based methods. Overall, these methods are sensitive, specific, time-effective, and important in prevention and diagnosis of waterborne bacterial diseases. © 2016 The Authors. MicrobiologyOpen published by John Wiley & Sons Ltd.

  15. A Novel Real-Time PCR Assay of microRNAs Using S-Poly(T), a Specific Oligo(dT) Reverse Transcription Primer with Excellent Sensitivity and Specificity

    PubMed Central

    Kang, Kang; Zhang, Xiaoying; Liu, Hongtao; Wang, Zhiwei; Zhong, Jiasheng; Huang, Zhenting; Peng, Xiao; Zeng, Yan; Wang, Yuna; Yang, Yi; Luo, Jun; Gou, Deming

    2012-01-01

    Background MicroRNAs (miRNAs) are small, non-coding RNAs capable of post-transcriptionally regulating gene expression. Accurate expression profiling is crucial for understanding the biological roles of miRNAs, and exploring them as biomarkers of diseases. Methodology/Principal Findings A novel, highly sensitive, and reliable miRNA quantification approach, termed the S-Poly(T) miRNA assay, is designed. In this assay, miRNAs are subjected to polyadenylation and reverse transcription with an S-Poly(T) primer that contains a universal reverse primer, a universal Taqman probe, an oligo(dT)11 sequence and six miRNA-specific bases. Individual miRNAs are then amplified by a specific forward primer and a universal reverse primer, and the PCR products are detected by a universal Taqman probe. The S-Poly(T) assay showed a minimum of 4-fold increase in sensitivity as compared with the stem-loop or poly(A)-based methods. A remarkable specificity in discriminating among miRNAs with high sequence similarity was also obtained with this approach. Using this method, we profiled miRNAs in human pulmonary arterial smooth muscle cells (HPASMC) and identified 9 differentially expressed miRNAs associated with hypoxia treatment. Due to its outstanding sensitivity, the number of circulating miRNAs detected in normal human serum was significantly expanded from 368 to 518. Conclusions/Significance With excellent sensitivity, specificity, and high throughput, the S-Poly(T) method provides a powerful tool for miRNA quantification and identification of tissue- or disease-specific miRNA biomarkers. PMID:23152780

  16. Use of PCR-Based Methods for Rapid Differentiation of Lactobacillus delbrueckii subsp. bulgaricus and L. delbrueckii subsp. lactis

    PubMed Central

    Torriani, Sandra; Zapparoli, Giacomo; Dellaglio, Franco

    1999-01-01

    Two PCR-based methods, specific PCR and randomly amplified polymorphic DNA PCR (RAPD-PCR), were used for rapid and reliable differentiation of Lactobacillus delbrueckii subsp. bulgaricus and L. delbrueckii subsp. lactis. PCR with a single combination of primers which targeted the proline iminopeptidase (pepIP) gene of L. delbrueckii subsp. bulgaricus allowed amplification of genomic fragments specific for the two subspecies when either DNA from a single colony or cells extracted from dairy products were used. A numerical analysis of the RAPD-PCR patterns obtained with primer M13 gave results that were consistent with the results of specific PCR for all strains except L. delbrueckii subsp. delbrueckii LMG 6412T, which clustered with L. delbrueckii subsp. lactis strains. In addition, RAPD-PCR performed with primer 1254 provided highly polymorphic profiles and thus was superior for distinguishing individual L. delbrueckii strains. PMID:10508059

  17. Use of PCR-based methods for rapid differentiation of Lactobacillus delbrueckii subsp. bulgaricus and L. delbrueckii subsp. lactis.

    PubMed

    Torriani, S; Zapparoli, G; Dellaglio, F

    1999-10-01

    Two PCR-based methods, specific PCR and randomly amplified polymorphic DNA PCR (RAPD-PCR), were used for rapid and reliable differentiation of Lactobacillus delbrueckii subsp. bulgaricus and L. delbrueckii subsp. lactis. PCR with a single combination of primers which targeted the proline iminopeptidase (pepIP) gene of L. delbrueckii subsp. bulgaricus allowed amplification of genomic fragments specific for the two subspecies when either DNA from a single colony or cells extracted from dairy products were used. A numerical analysis of the RAPD-PCR patterns obtained with primer M13 gave results that were consistent with the results of specific PCR for all strains except L. delbrueckii subsp. delbrueckii LMG 6412(T), which clustered with L. delbrueckii subsp. lactis strains. In addition, RAPD-PCR performed with primer 1254 provided highly polymorphic profiles and thus was superior for distinguishing individual L. delbrueckii strains.

  18. Development of representative magnetic resonance imaging-based atlases of the canine brain and evaluation of three methods for atlas-based segmentation.

    PubMed

    Milne, Marjorie E; Steward, Christopher; Firestone, Simon M; Long, Sam N; O'Brien, Terrence J; Moffat, Bradford A

    2016-04-01

    To develop representative MRI atlases of the canine brain and to evaluate 3 methods of atlas-based segmentation (ABS). 62 dogs without clinical signs of epilepsy and without MRI evidence of structural brain disease. The MRI scans from 44 dogs were used to develop 4 templates on the basis of brain shape (brachycephalic, mesaticephalic, dolichocephalic, and combined mesaticephalic and dolichocephalic). Atlas labels were generated by segmenting the brain, ventricular system, hippocampal formation, and caudate nuclei. The MRI scans from the remaining 18 dogs were used to evaluate 3 methods of ABS (manual brain extraction and application of a brain shape-specific template [A], automatic brain extraction and application of a brain shape-specific template [B], and manual brain extraction and application of a combined template [C]). The performance of each ABS method was compared by calculation of the Dice and Jaccard coefficients, with manual segmentation used as the gold standard. Method A had the highest mean Jaccard coefficient and was the most accurate ABS method assessed. Measures of overlap for ABS methods that used manual brain extraction (A and C) ranged from 0.75 to 0.95 and compared favorably with repeated measures of overlap for manual extraction, which ranged from 0.88 to 0.97. Atlas-based segmentation was an accurate and repeatable method for segmentation of canine brain structures. It could be performed more rapidly than manual segmentation, which should allow the application of computer-assisted volumetry to large data sets and clinical cases and facilitate neuroimaging research and disease diagnosis.
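
    The Dice and Jaccard coefficients used to score the segmentations are simple overlap ratios between a candidate mask and the manual gold standard. A minimal sketch with fabricated toy masks is shown below.

    ```python
    # Minimal sketch of the overlap measures used to compare atlas-based
    # segmentation against a manual gold standard; masks are illustrative.
    import numpy as np


    def dice(a: np.ndarray, b: np.ndarray) -> float:
        """Dice coefficient: 2*|A and B| / (|A| + |B|)."""
        a, b = a.astype(bool), b.astype(bool)
        intersection = np.logical_and(a, b).sum()
        return 2.0 * intersection / (a.sum() + b.sum())


    def jaccard(a: np.ndarray, b: np.ndarray) -> float:
        """Jaccard coefficient: |A and B| / |A or B|."""
        a, b = a.astype(bool), b.astype(bool)
        intersection = np.logical_and(a, b).sum()
        union = np.logical_or(a, b).sum()
        return intersection / union


    if __name__ == "__main__":
        manual = np.array([[1, 1, 0], [1, 1, 0], [0, 0, 0]])
        atlas = np.array([[1, 1, 0], [1, 0, 0], [0, 0, 0]])
        print(dice(manual, atlas), jaccard(manual, atlas))   # 0.857..., 0.75
    ```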

  19. Whole genome sequence analysis of unidentified genetically modified papaya for development of a specific detection method.

    PubMed

    Nakamura, Kosuke; Kondo, Kazunari; Akiyama, Hiroshi; Ishigaki, Takumi; Noguchi, Akio; Katsumata, Hiroshi; Takasaki, Kazuto; Futo, Satoshi; Sakata, Kozue; Fukuda, Nozomi; Mano, Junichi; Kitta, Kazumi; Tanaka, Hidenori; Akashi, Ryo; Nishimaki-Mogami, Tomoko

    2016-08-15

    Identification of transgenic sequences in an unknown genetically modified (GM) papaya (Carica papaya L.) by whole genome sequence analysis was demonstrated. Whole genome sequence data were generated for a GM-positive fresh papaya fruit commodity detected in monitoring using real-time polymerase chain reaction (PCR). The sequences obtained were mapped against an open database for papaya genome sequence. Transgenic construct- and event-specific sequences were identified as a GM papaya developed to resist infection from a Papaya ringspot virus. Based on the transgenic sequences, a specific real-time PCR detection method for GM papaya applicable to various food commodities was developed. Whole genome sequence analysis enabled identifying unknown transgenic construct- and event-specific sequences in GM papaya and development of a reliable method for detecting them in papaya food commodities. Copyright © 2016 Elsevier Ltd. All rights reserved.

  20. [Analysis of commercial specifications and grades of wild and cultivated Gentianae Macrophyllae Radix based on multi-indicative constituents].

    PubMed

    Yang, Yan-Mei; Lin, Li; Lu, You-Yuan; Ma, Xiao-Hui; Jin, Ling; Zhu, Tian-Tian

    2016-03-01

    The aim of this study was to analyze the commercial specifications and grades of wild and cultivated Gentianae Macrophyllae Radix based on multiple indicative constituents. Seven main chemical components of Gentianae Macrophyllae Radix were determined by UPLC, and the quality levels were then clustered and classified by modern statistical methods (canonical correspondence analysis, Fisher discriminant analysis and so on). Quality indices were selected and their correlations analyzed, and a comprehensive quantitative grade division was derived for the different commodity specifications, and for the grades within each specification, of wild and cultivated material. The results provide a basis for a reasonable division of the commercial specifications and grades of Gentianae Macrophyllae Radix. A range for the quality evaluation of the main index components (gentiopicrin, loganic acid and swertiamarin) was proposed, and a Herbal Quality Index (HQI) was introduced. A rank discriminant function based on quality was established by Fisher discriminant analysis. According to the analysis, among the commercial specifications the quality of both wild and cultivated Luobojiao was the best, Mahuajiao was intermediate, and Xiaoqinjiao was inferior. Among grades, third-class cultivated Luobojiao was the best, second class intermediate, and first class the worst; for wild Luobojiao, the second class was the best and the first class intermediate; for Mahuajiao, the first class was the best and the second class intermediate; for Xiaoqinjiao, the second class was slightly better than the first, although the difference was not significant. The method provides a new approach to the comprehensive quantitative evaluation of the quality of Gentianae Macrophyllae Radix. Copyright© by the Chinese Pharmaceutical Association.
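
    The grade-assignment step rests on Fisher discriminant analysis over multiple constituent contents. The sketch below illustrates that idea with scikit-learn's linear discriminant analysis; the constituent values, grade labels, and column meanings are fabricated placeholders, not data from the study.

    ```python
    # Illustrative sketch only: Fisher (linear) discriminant analysis used to
    # assign samples to quality grades from several measured constituents.
    import numpy as np
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

    # Rows: samples; columns: e.g. gentiopicrin, loganic acid, swertiamarin (mg/g).
    # All numbers are fabricated for illustration.
    X_train = np.array([
        [52.0, 28.0, 9.0],
        [49.5, 26.5, 8.4],
        [35.0, 18.0, 5.5],
        [33.0, 17.5, 5.1],
        [21.0, 10.0, 3.0],
        [19.5, 9.2, 2.8],
    ])
    y_train = ["first", "first", "second", "second", "third", "third"]

    lda = LinearDiscriminantAnalysis()
    lda.fit(X_train, y_train)

    # Predict the grade of a new sample from its measured constituents.
    print(lda.predict([[34.0, 18.2, 5.3]]))   # expected: ['second']
    ```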

  1. Recent advances in the fabrication and structure-specific applications of graphene-based inorganic hybrid membranes.

    PubMed

    Zhao, Xinne; Zhang, Panpan; Chen, Yuting; Su, Zhiqiang; Wei, Gang

    2015-03-12

    The preparation and applications of graphene (G)-based materials are attracting increasing interest due to their unique electronic, optical, magnetic, thermal, and mechanical properties. Compared to G-based hybrid and composite materials, G-based inorganic hybrid membranes (GIHMs) offer enormous advantages owing to their facile synthesis, planar two-dimensional multilayer structure, high specific surface area, and mechanical stability, as well as their unique optical and mechanical properties. In this review, we report recent advances in the fabrication and structure-specific applications of GIHMs with desirable thicknesses and compositions. In addition, the advantages and disadvantages of the methods used to create GIHMs are discussed in detail. Finally, potential applications and key challenges of GIHMs for future technical use are outlined.

  2. Modeling specific action potentials in the human atria based on a minimal single-cell model.

    PubMed

    Richter, Yvonne; Lind, Pedro G; Maass, Philipp

    2018-01-01

    We present an effective method to model empirical action potentials of specific patients in the human atria based on the minimal model of Bueno-Orovio, Cherry and Fenton adapted to atrial electrophysiology. In this model, three ionic currents are introduced, each governed by a characteristic time scale. By applying a nonlinear optimization procedure, the best combination of the respective time scales is determined, which allows one to reproduce specific action potentials with a given amplitude, width and shape. Possible applications for supporting clinical diagnosis are pointed out.
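
    The fitting procedure amounts to a nonlinear least-squares adjustment of characteristic time scales so that a parameterised waveform reproduces a recorded action potential. The sketch below uses a simple double-exponential surrogate in place of the Bueno-Orovio-Cherry-Fenton model; the surrogate shape, starting values, and bounds are assumptions for illustration only.

    ```python
    # Toy sketch of the fitting idea: adjust characteristic time scales of a
    # parameterised waveform so that it matches a recorded action potential.
    import numpy as np
    from scipy.optimize import least_squares


    def surrogate_ap(t, amplitude, tau_rise, tau_decay):
        """Simple upstroke/repolarisation shape governed by two time scales."""
        return amplitude * (1.0 - np.exp(-t / tau_rise)) * np.exp(-t / tau_decay)


    def fit_time_scales(t, v_measured):
        """Least-squares fit of the amplitude and the two time scales."""
        def residuals(p):
            return surrogate_ap(t, *p) - v_measured
        result = least_squares(residuals, x0=[1.0, 5.0, 150.0],
                               bounds=([0.1, 0.1, 10.0], [10.0, 50.0, 1000.0]))
        return result.x


    if __name__ == "__main__":
        t = np.linspace(0.0, 400.0, 200)                      # ms
        target = surrogate_ap(t, 1.2, 2.0, 180.0)
        target += 0.01 * np.random.default_rng(0).normal(size=t.size)
        print(fit_time_scales(t, target))                     # ~[1.2, 2.0, 180.0]
    ```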

  3. A Protocol Specification-Based Intrusion Detection System for VoIP and Its Evaluation

    NASA Astrophysics Data System (ADS)

    Phit, Thyda; Abe, Kôki

    We propose an Intrusion Detection System (IDS) architecture for VoIP that uses a protocol specification-based detection method to monitor network traffic and alert the administrator for further analysis of and response to suspicious activities. The protocol behaviors and their interactions are described by state machines. Traffic that behaves differently from the standard specifications is considered suspicious. The IDS has been implemented and simulated using OPNET Modeler, and verified to detect attacks. It was found that our system can detect typical attacks within a reasonable delay time.

  4. Three-dimensional compound comparison methods and their application in drug discovery.

    PubMed

    Shin, Woong-Hee; Zhu, Xiaolei; Bures, Mark Gregory; Kihara, Daisuke

    2015-07-16

    Virtual screening has been widely used in the drug discovery process. Ligand-based virtual screening (LBVS) methods compare a library of compounds with a known active ligand. Two notable advantages of LBVS methods are that they do not require structural information of a target receptor and that they are faster than structure-based methods. LBVS methods can be classified based on the complexity of ligand structure information utilized: one-dimensional (1D), two-dimensional (2D), and three-dimensional (3D). Unlike 1D and 2D methods, 3D methods can have enhanced performance since they treat the conformational flexibility of compounds. In this paper, a number of 3D methods will be reviewed. In addition, four representative 3D methods were benchmarked to understand their performance in virtual screening. Specifically, we tested overall performance in key aspects including the ability to find dissimilar active compounds, and computational speed.

  5. Evaluation Framework for NASA's Educational Outreach Programs

    NASA Technical Reports Server (NTRS)

    Berg, Rick; Booker, Angela; Linde, Charlotte; Preston, Connie

    1999-01-01

    The objective of the proposed work is to develop an evaluation framework for NASA's educational outreach efforts. We focus on public (rather than technical or scientific) dissemination efforts, specifically on Internet-based outreach sites for children. The outcome of this work is to propose both methods and criteria for evaluation, which would enable NASA to do a more analytic evaluation of its outreach efforts. The proposed framework is based on IRL's ethnographic and video-based observational methods, which allow us to analyze how these sites are actually used.

  6. A high-throughput microtiter plate based method for the determination of peracetic acid and hydrogen peroxide.

    PubMed

    Putt, Karson S; Pugh, Randall B

    2013-01-01

    Peracetic acid is gaining usage in numerous industries that have found a myriad of uses for its antimicrobial activity. However, rapid high-throughput quantitation methods for peracetic acid and hydrogen peroxide are lacking. Herein, we describe the development of a high-throughput microtiter plate-based assay built upon well-known and trusted titration chemistries. The adaptation of these titration chemistries to rapid plate-based absorbance methods for the sequential determination of hydrogen peroxide specifically and of the total amount of peroxides present in solution is described. The results of these methods were compared to those of a standard titration and found to be in good agreement. Additionally, the utility of the developed method is demonstrated through the generation of degradation curves of both peracetic acid and hydrogen peroxide in a mixed solution.

  7. A High-Throughput Microtiter Plate Based Method for the Determination of Peracetic Acid and Hydrogen Peroxide

    PubMed Central

    Putt, Karson S.; Pugh, Randall B.

    2013-01-01

    Peracetic acid is gaining usage in numerous industries that have found a myriad of uses for its antimicrobial activity. However, rapid high-throughput quantitation methods for peracetic acid and hydrogen peroxide are lacking. Herein, we describe the development of a high-throughput microtiter plate-based assay built upon well-known and trusted titration chemistries. The adaptation of these titration chemistries to rapid plate-based absorbance methods for the sequential determination of hydrogen peroxide specifically and of the total amount of peroxides present in solution is described. The results of these methods were compared to those of a standard titration and found to be in good agreement. Additionally, the utility of the developed method is demonstrated through the generation of degradation curves of both peracetic acid and hydrogen peroxide in a mixed solution. PMID:24260173

  8. Discipline-Specific Language Instruction for International Students in Introductory Economics

    ERIC Educational Resources Information Center

    Nguyen, Trien T.; Williams, Julia; Trimarchi, Angela

    2015-01-01

    This paper explores student perceptions of the effects of pairing discipline-specific language instruction with the traditional method of course delivery in economics. Our research involved teaching content-based English as an additional language (EAL) tutorials to a small group of ten international students taking first-year introductory…

  9. Rapid and Specific Method for Evaluating Streptomyces Competitive Dynamics in Complex Soil Communities

    USDA-ARS's Scientific Manuscript database

    Quantifying target microbial populations in complex communities remains a barrier to studying species interactions in soil environments. Quantitative real-time PCR (qPCR) offers a rapid and specific means to assess populations of target microorganisms. SYBR Green and TaqMan-based qPCR assays were de...

  10. The Heat Is on: An Inquiry-Based Investigation for Specific Heat

    ERIC Educational Resources Information Center

    Herrington, Deborah G.

    2011-01-01

    A substantial number of upper-level science students and practicing physical science teachers demonstrate confusion about thermal equilibrium, heat transfer, heat capacity, and specific heat capacity. The traditional method of instruction, which involves learning the related definitions and equations, using equations to solve heat transfer…

  11. Nanotechnology: a promising method for oral cancer detection and diagnosis.

    PubMed

    Chen, Xiao-Jie; Zhang, Xue-Qiong; Liu, Qi; Zhang, Jing; Zhou, Gang

    2018-06-11

    Oral cancer is a common and aggressive cancer with high morbidity, mortality, and recurrence rate globally. Early detection is of utmost importance for cancer prevention and disease management. Currently, tissue biopsy remains the gold standard for oral cancer diagnosis, but it is invasive, which may cause patient discomfort. The application of traditional noninvasive methods, such as vital staining, exfoliative cytology, and molecular imaging, is limited by insufficient sensitivity and specificity. Thus, there is an urgent need for exploring noninvasive, highly sensitive, and specific diagnostic techniques. Nano detection systems are known as new emerging noninvasive strategies that bring the detection sensitivity of biomarkers to nano-scale. Moreover, compared to current imaging contrast agents, nanoparticles are more biocompatible, easier to synthesize, and able to target specific surface molecules. Nanoparticles generate localized surface plasmon resonances at near-infrared wavelengths, providing higher image contrast and resolution. Therefore, using nano-based techniques can help clinicians to detect and better monitor diseases during different phases of oral malignancy. Here, we review the progress of nanotechnology-based methods in oral cancer detection and diagnosis.

  12. Affinity entrapment of oligosaccharides and glycopeptides using free lectin solution.

    PubMed

    Yodoshi, Masahiro; Oyama, Takehiro; Masaki, Ken; Kakehi, Kazuaki; Hayakawa, Takao; Suzuki, Shigeo

    2011-01-01

    Two procedures were proposed for the specific recovery of fluorescent derivatives of glycoprotein-derived oligosaccharides and tryptic glycopeptides using certain plant lectins. The first was based on the salting out of oligosaccharide-lectin conjugates with ammonium sulfate. Oligosaccharides specifically bound to lectins were recovered free from lectins using ethanol precipitation after dissolution in water. This method enabled group separation of 2-aminopyridine-labeled oligosaccharides derived from ovalbumin to galacto-oligosaccharides and agalacto-oligosaccharides by Ricinus communis agglutinin, and to high mannose- and hybrid-type oligosaccharides by wheat-germ agglutinin. Fractional precipitation based on differences in affinity for concanavalin A was accomplished by adding an appropriate concentration of methyl α-mannoside as an inhibitor. In the second method, tryptic digests of glycoproteins were mixed with a lectin solution, and the glycopeptide-lectin conjugates were specifically trapped on a centrifugal ultrafiltration membrane with cut-off of 10 kD. Trapped glycopeptides, as retentates, were passed through membranes by resuspension in diluted acid. This method is particularly useful for the enrichment of glycopeptides in protease digestion mixtures for glycosylation analyses by liquid chromatography-mass spectrometry.

  13. Electron microscopic visualization of complementary labeled DNA with platinum-containing guanine derivative.

    PubMed

    Loukanov, Alexandre; Filipov, Chavdar; Mladenova, Polina; Toshev, Svetlin; Emin, Saim

    2016-04-01

    The object of the present report is to provide a method for the visualization of DNA in TEM by complementary labeling of cytosine with a guanine derivative that contains platinum as a contrast-enhancing heavy element. Stretched single-chain DNA was obtained by modifying double-stranded DNA. The labeling method comprises the following steps: (i) stretching and adsorption of DNA on the support film of an electron microscope grid (the hydrophobic carbon film holding negatively charged DNA); (ii) complementary labeling of the cytosine bases of the stretched single-stranded DNA pieces on the support film with the platinum-containing guanine derivative to form base-specific hydrogen bonds; and (iii) producing a magnified image of the base-specifically labeled DNA. Stretched single-stranded DNA on a support film is obtained by rapid elongation of DNA pieces at the interface between air and aqueous buffer solution. The attached platinum-containing guanine derivative serves as a high-density marker that can be discriminated from the surrounding background of the support carbon film and visualized by conventional TEM observation at 100 kV accelerating voltage. This method allows examination of specific nucleic acid macromolecules through atom-by-atom analysis and is a promising route toward future DNA sequencing or molecular diagnostics of nucleic acids by electron microscopic observation. © 2016 Wiley Periodicals, Inc.

  14. Detection and identification of genetically modified EE-1 brinjal (Solanum melongena) by single, multiplex and SYBR® real-time PCR.

    PubMed

    Ballari, Rajashekhar V; Martin, Asha; Gowda, Lalitha R

    2013-01-01

    Brinjal is an important vegetable crop. Major crop loss of brinjal is due to insect attack. Insect-resistant EE-1 brinjal has been developed and is awaiting approval for commercial release. Consumer health concerns and implementation of international labelling legislation demand reliable analytical detection methods for genetically modified (GM) varieties. End-point and real-time polymerase chain reaction (PCR) methods were used to detect EE-1 brinjal. In end-point PCR, primer pairs specific to 35S CaMV promoter, NOS terminator and nptII gene common to other GM crops were used. Based on the revealed 3' transgene integration sequence, primers specific for the event EE-1 brinjal were designed. These primers were used for end-point single, multiplex and SYBR-based real-time PCR. End-point single PCR showed that the designed primers were highly specific to event EE-1 with a sensitivity of 20 pg of genomic DNA, corresponding to 20 copies of haploid EE-1 brinjal genomic DNA. The limits of detection and quantification for SYBR-based real-time PCR assay were 10 and 100 copies respectively. The prior development of detection methods for this important vegetable crop will facilitate compliance with any forthcoming labelling regulations. Copyright © 2012 Society of Chemical Industry.

  15. Nested-PCR and a new ELISA-based NovaLisa test kit for malaria diagnosis in an endemic area of Thailand.

    PubMed

    Thongdee, Pimwan; Chaijaroenkul, Wanna; Kuesap, Jiraporn; Na-Bangchang, Kesara

    2014-08-01

    Microscopy is considered the gold standard for malaria diagnosis, although its wide application is limited by the requirement for highly experienced microscopists. PCR and serological tests provide efficient diagnostic performance and have been applied to malaria diagnosis and research. The aim of this study was to investigate the diagnostic performance of nested PCR and a recently developed ELISA-based rapid diagnostic test (RDT), the NovaLisa test kit, for diagnosis of malaria infection, using the microscopic method as the gold standard. The performance of nested PCR as a malaria diagnostic tool is excellent with respect to its high accuracy, sensitivity, specificity, and ability to discriminate Plasmodium species. The sensitivity and specificity of nested PCR compared with the microscopic method for detection of Plasmodium falciparum, Plasmodium vivax, and P. falciparum/P. vivax mixed infection were 71.4 vs 100%, 100 vs 98.7%, and 100 vs 95.0%, respectively. The sensitivity and specificity of the ELISA-based NovaLisa test kit compared with the microscopic method for detection of the Plasmodium genus were 89.0 vs 91.6%, respectively. The NovaLisa test kit provided comparable diagnostic performance, and its relatively low cost, simplicity, and rapidity enable large-scale field application.

  16. Gel Electrophoresis of Gold-DNA Nanoconjugates

    DOE PAGES

    Pellegrino, T.; Sperling, R. A.; Alivisatos, A. P.; ...

    2007-01-01

    Gold-DNA conjugates were investigated in detail by a comprehensive gel electrophoresis study based on 1200 gels. A controlled number of single-stranded DNA molecules of different lengths was attached specifically via thiol-Au bonds to phosphine-stabilized colloidal gold nanoparticles. Alternatively, the surface of the gold particles was saturated with single-stranded DNA of different lengths, either specifically via thiol-Au bonds or by nonspecific adsorption. From the experimentally determined electrophoretic mobilities, estimates for the effective diameters of the gold-DNA conjugates were derived by applying two different data treatment approaches. The first method is based on making a calibration curve for the relation between effective diameters and mobilities with gold nanoparticles of known diameter. The second method is based on Ferguson analysis, which uses gold nanoparticles of known diameter as a reference database. Our study shows that effective diameters derived from gel electrophoresis measurements carry a large uncertainty, as the determined values strongly depend on the method of evaluation, though relative changes in size upon binding of molecules can be detected with high precision. Furthermore, in this study, the specific attachment of DNA via gold-thiol bonds to Au nanoparticles is compared to nonspecific adsorption of DNA. Also, the maximum number of DNA molecules that can be bound per particle was determined.
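
    The first data-treatment approach, a calibration curve relating electrophoretic mobility to known particle diameter, can be sketched as follows; the calibration points, the assumed linear model, and the example mobility are fabricated for illustration and do not reproduce the study's data.

    ```python
    # Illustrative sketch of the calibration-curve approach: fit the relation
    # between known particle diameters and measured mobilities, then invert it
    # to estimate effective diameters of DNA-gold conjugates.
    import numpy as np

    # Known gold-particle diameters (nm) and their measured relative mobilities
    # (fabricated placeholder values).
    diameters_nm = np.array([5.0, 10.0, 15.0, 20.0, 30.0])
    mobilities = np.array([1.00, 0.86, 0.75, 0.66, 0.52])

    # Fit mobility as a linear function of diameter (a simple assumed model).
    slope, intercept = np.polyfit(diameters_nm, mobilities, deg=1)


    def effective_diameter(measured_mobility: float) -> float:
        """Invert the calibration line to estimate an effective diameter (nm)."""
        return (measured_mobility - intercept) / slope


    if __name__ == "__main__":
        # A conjugate running with relative mobility 0.70 maps to ~19 nm here.
        print(round(effective_diameter(0.70), 1))
    ```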

  17. Methods of biological dosimetry employing chromosome-specific staining

    DOEpatents

    Gray, Joe W.; Pinkel, Daniel

    2000-01-01

    Methods and compositions for staining based upon nucleic acid sequence that employ nucleic acid probes are provided. Said methods produce staining patterns that can be tailored for specific cytogenetic analyses. Said probes are appropriate for in situ hybridization and stain both interphase and metaphase chromosomal material with reliable signals. The nucleic acid probes are typically of a complexity greater than 50 kb, the complexity depending upon the cytogenetic application. Methods are provided to disable the hybridization capacity of shared, high copy repetitive sequences and/or remove such sequences to provide for useful contrast. Still further methods are provided to produce chromosome-specific staining reagents which are made specific to the targeted chromosomal material, which can be one or more whole chromosomes, one or more regions on one or more chromosomes, subsets of chromosomes and/or the entire genome. Probes and test kits are provided for use in tumor cytogenetics, in the detection of disease related loci, in analysis of structural abnormalities, such as translocations, and for biological dosimetry. Further, methods and prenatal test kits are provided to stain targeted chromosomal material of fetal cells, including fetal cells obtained from maternal blood. Still further, the invention provides for automated means to detect and analyse chromosomal abnormalities.

  18. Methods And Compositions For Chromosome-Specific Staining

    DOEpatents

    Gray, Joe W.; Pinkel, Daniel

    2003-08-19

    Methods and compositions for staining based upon nucleic acid sequence that employ nucleic acid probes are provided. Said methods produce staining patterns that can be tailored for specific cytogenetic analyses. Said probes are appropriate for in situ hybridization and stain both interphase and metaphase chromosomal material with reliable signals. The nucleic acid probes are typically of a complexity greater than 50 kb, the complexity depending upon the cytogenetic application. Methods are provided to disable the hybridization capacity of shared, high copy repetitive sequences and/or remove such sequences to provide for useful contrast. Still further methods are provided to produce chromosome-specific staining reagents which are made specific to the targeted chromosomal material, which can be one or more whole chromosomes, one or more regions on one or more chromosomes, subsets of chromosomes and/or the entire genome. Probes and test kits are provided for use in tumor cytogenetics, in the detection of disease related loci, in analysis of structural abnormalities, such as translocations, and for biological dosimetry. Further, methods and prenatal test kits are provided to stain targeted chromosomal material of fetal cells, including fetal cells obtained from maternal blood. Still further, the invention provides for automated means to detect and analyse chromosomal abnormalities.

  19. A plasmid-based reporter system for live cell imaging of dengue virus infected cells.

    PubMed

    Medin, Carey L; Valois, Sierra; Patkar, Chinmay G; Rothman, Alan L

    2015-01-01

    Cell culture models are used widely to study the effects of dengue virus (DENV) on host cell function. Current methods for identifying cells infected with an unmodified DENV require fixation and permeabilization of cells to allow DENV-specific antibody staining. This approach does not permit imaging of viable cells over time. In this report, a plasmid-based reporter was developed to allow non-destructive identification of DENV-infected cells. The plasmid-based reporter was demonstrated to be broadly applicable to the four DENV serotypes, including low-passaged strains, and was specifically cleaved by the viral protease with minimal interference with viral production. This study reveals the potential for this novel reporter system to advance studies of virus-host interactions during DENV infection. Copyright © 2014 Elsevier B.V. All rights reserved.

  20. Comparability of fish-based ecological quality assessments for geographically distinct Iberian regions.

    PubMed

    Segurado, P; Caiola, N; Pont, D; Oliveira, J M; Delaigue, O; Ferreira, M T

    2014-04-01

    In this work we compare two Iberian fish-based methods and a pan-European method for assessing ecological quality in rivers: the Fish-based Index of Biotic Integrity for Portuguese Wadeable Streams (F-IBIP), the Mediterranean Index of Biotic Integrity (IBIMED) and the pan-European Fish Index (EFI+). The results presented herein were developed in the context of the 2nd phase of the Intercalibration Exercise (IC), as required by the Water Framework Directive (WFD). The IC is aimed at ensuring comparability of the quality boundaries among the different WFD assessment methods developed by the Member States for each biological quality element. Although the two national assessment methods were developed for very distinct regions of Iberia (the Western and Eastern Iberian Peninsula), they share the same methodological background: both are type-specific and guild-based multimetric indices. EFI+ is a multimetric guild-based model, but it is site-specific and uses a predictive modelling approach. The three indices were computed for all sites included in the Iberian Intercalibration database to allow direct comparison, by means of linear regressions, of the resulting three quality values per site. Quality boundary harmonization between the two Iberian methods was only possible through an indirect comparison between the two indices, using EFI+ as a common metric. The three indices were also shown to be responsive to a common set of human-induced pressures. This study highlights the need to develop general assessment methods adapted to wide geographical ranges with high species turnover to help intercalibrate assessment methods tailored for geographically more restricted regions. © 2013.

  1. Methods of Measurement in epidemiology: Sedentary Behaviour

    PubMed Central

    Atkin, Andrew J; Gorely, Trish; Clemes, Stacy A; Yates, Thomas; Edwardson, Charlotte; Brage, Soren; Salmon, Jo; Marshall, Simon J; Biddle, Stuart JH

    2012-01-01

    Background Research examining sedentary behaviour as a potentially independent risk factor for chronic disease morbidity and mortality has expanded rapidly in recent years. Methods We present a narrative overview of the sedentary behaviour measurement literature. Subjective and objective methods of measuring sedentary behaviour suitable for use in population-based research with children and adults are examined. The validity and reliability of each method is considered, gaps in the literature specific to each method identified and potential future directions discussed. Results To date, subjective approaches to sedentary behaviour measurement, e.g. questionnaires, have focused predominantly on TV viewing or other screen-based behaviours. Typically, such measures demonstrate moderate reliability but slight to moderate validity. Accelerometry is increasingly being used for sedentary behaviour assessments; this approach overcomes some of the limitations of subjective methods, but detection of specific postures and postural changes by this method is somewhat limited. Instruments developed specifically for the assessment of body posture have demonstrated good reliability and validity in the limited research conducted to date. Miniaturization of monitoring devices, interoperability between measurement and communication technologies and advanced analytical approaches are potential avenues for future developments in this field. Conclusions High-quality measurement is essential in all elements of sedentary behaviour epidemiology, from determining associations with health outcomes to the development and evaluation of behaviour change interventions. Sedentary behaviour measurement remains relatively under-developed, although new instruments, both objective and subjective, show considerable promise and warrant further testing. PMID:23045206

  2. Towards high-throughput molecular detection of Plasmodium: new approaches and molecular markers

    PubMed Central

    Steenkeste, Nicolas; Incardona, Sandra; Chy, Sophy; Duval, Linda; Ekala, Marie-Thérèse; Lim, Pharath; Hewitt, Sean; Sochantha, Tho; Socheat, Doung; Rogier, Christophe; Mercereau-Puijalon, Odile; Fandeur, Thierry; Ariey, Frédéric

    2009-01-01

    Background Several strategies are currently deployed in many countries in the tropics to strengthen malaria control toward malaria elimination. To measure the impact of any intervention, there is a need to detect malaria properly. Decisions still rely mostly on microscopy diagnosis, but sensitive diagnostic tools able to handle large numbers of samples are needed. The molecular detection approach offers a much higher sensitivity, and the flexibility to be automated and upgraded. Methods Two new molecular methods were developed: dot18S, a Plasmodium-specific nested PCR based on the 18S rRNA gene followed by dot-blot detection of species using species-specific probes, and CYTB, a Plasmodium-specific nested PCR based on the cytochrome b gene followed by species detection using SNP analysis. The results were compared to those obtained with microscopic examination and the "standard" 18S rRNA gene-based nested PCR using species-specific primers. In total, 337 samples were diagnosed. Results Compared to microscopy, the three molecular methods were more sensitive, greatly increasing the estimated prevalence of Plasmodium infection, including P. malariae and P. ovale. A high rate of mixed infections was uncovered, with about one third of the villagers infected with more than one malaria parasite species. Dot18S and CYTB sensitivity outranged the "standard" nested PCR method, with CYTB being the most sensitive. As a consequence, compared to the "standard" nested PCR method for the detection of Plasmodium spp., the sensitivity of dot18S and CYTB was 95.3% and 97.3%, respectively. Consistent detection of Plasmodium spp. by the three molecular methods was obtained for 83% of tested isolates. Contradictory results were mostly related to detection of Plasmodium malariae and Plasmodium ovale in mixed infections, due to an "all-or-none" detection effect at low-level parasitaemia. Conclusion A large reservoir of asymptomatic infections was uncovered using the molecular methods. Dot18S and CYTB, the new methods reported herein, are highly sensitive, allow parasite DNA extraction as well as genus- and species-specific diagnosis of several hundreds of samples, and are amenable to high-throughput scaling up for larger sample sizes. Such methods provide novel information on malaria prevalence and epidemiology and are suited for active malaria detection. The usefulness of such sensitive malaria diagnosis tools, especially in low-endemicity areas where eradication plans are now ongoing, is discussed in this paper. PMID:19402894

  3. Lexicon-enhanced sentiment analysis framework using rule-based classification scheme.

    PubMed

    Asghar, Muhammad Zubair; Khan, Aurangzeb; Ahmad, Shakeel; Qasim, Maria; Khan, Imran Ali

    2017-01-01

    With the rapid increase in social networks and blogs, social media services are increasingly being used by online communities to share their views and experiences about particular products, policies and events. Due to the economic importance of these reviews, there is a growing trend of writing user reviews to promote a product. Nowadays, users consult online blogs and review sites before purchasing products, so user reviews are considered an important source of information in Sentiment Analysis (SA) applications for decision making. In this work, we exploit the wealth of user reviews available through online forums to analyze the semantic orientation of words by categorizing them into positive and negative classes, identifying and classifying emoticons, modifiers, and general-purpose and domain-specific words expressed in the public's feedback about products. The unsupervised learning approach employed in previous studies is becoming less effective due to data sparseness and low accuracy, because emoticons, modifiers, and domain-specific words are not considered and may therefore lead to inaccurate classification of users' reviews. Lexicon-enhanced sentiment analysis based on a rule-based classification scheme is an alternative approach for improving sentiment classification of users' reviews in online communities. In addition to the sentiment terms used in general-purpose sentiment analysis, we integrate emoticons, modifiers and domain-specific terms to analyze the reviews posted in online communities. To test the effectiveness of the proposed method, we considered users' reviews in three domains. The results obtained from different experiments demonstrate that the proposed method overcomes the limitations of previous methods and that the performance of sentiment analysis is improved after considering emoticons, modifiers, negations, and domain-specific terms, when compared to baseline methods.
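
    The rule-based scoring idea, combining general-purpose and domain-specific lexicons with emoticons, intensity modifiers, and negation handling, can be sketched in a few lines. All lexicon entries, rules, and the example review below are illustrative assumptions, not the authors' resources.

    ```python
    # Minimal sketch of lexicon-enhanced, rule-based sentiment scoring.
    # Lexicons and rules are toy placeholders for illustration only.
    GENERAL_LEXICON = {"good": 1, "great": 2, "bad": -1, "poor": -2}
    DOMAIN_LEXICON = {"laggy": -2, "crisp": 2}          # domain-specific terms
    EMOTICONS = {":)": 1, ":(": -1}
    MODIFIERS = {"very": 1.5, "slightly": 0.5}
    NEGATIONS = {"not", "never", "no"}


    def score_review(text: str) -> float:
        """Sum word scores, scaling by modifiers and flipping sign after negation."""
        score, weight, negate = 0.0, 1.0, False
        for tok in text.lower().split():
            if tok in NEGATIONS:
                negate = True
                continue
            if tok in MODIFIERS:
                weight = MODIFIERS[tok]
                continue
            polarity = GENERAL_LEXICON.get(tok, DOMAIN_LEXICON.get(tok, EMOTICONS.get(tok, 0)))
            if polarity:
                score += (-polarity if negate else polarity) * weight
                weight, negate = 1.0, False   # modifier/negation apply to one sentiment term
        return score


    if __name__ == "__main__":
        print(score_review("the screen is very crisp :) but the app is not good"))  # 3.0
    ```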

  4. RuleMonkey: software for stochastic simulation of rule-based models

    PubMed Central

    2010-01-01

    Background The system-level dynamics of many molecular interactions, particularly protein-protein interactions, can be conveniently represented using reaction rules, which can be specified using model-specification languages, such as the BioNetGen language (BNGL). A set of rules implicitly defines a (bio)chemical reaction network. The reaction network implied by a set of rules is often very large, and as a result, generation of the network implied by rules tends to be computationally expensive. Moreover, the cost of many commonly used methods for simulating network dynamics is a function of network size. Together these factors have limited application of the rule-based modeling approach. Recently, several methods for simulating rule-based models have been developed that avoid the expensive step of network generation. The cost of these "network-free" simulation methods is independent of the number of reactions implied by rules. Software implementing such methods is now needed for the simulation and analysis of rule-based models of biochemical systems. Results Here, we present a software tool called RuleMonkey, which implements a network-free method for simulation of rule-based models that is similar to Gillespie's method. The method is suitable for rule-based models that can be encoded in BNGL, including models with rules that have global application conditions, such as rules for intramolecular association reactions. In addition, the method is rejection free, unlike other network-free methods that introduce null events, i.e., steps in the simulation procedure that do not change the state of the reaction system being simulated. We verify that RuleMonkey produces correct simulation results, and we compare its performance against DYNSTOC, another BNGL-compliant tool for network-free simulation of rule-based models. We also compare RuleMonkey against problem-specific codes implementing network-free simulation methods. Conclusions RuleMonkey enables the simulation of rule-based models for which the underlying reaction networks are large. It is typically faster than DYNSTOC for benchmark problems that we have examined. RuleMonkey is freely available as a stand-alone application http://public.tgen.org/rulemonkey. It is also available as a simulation engine within GetBonNie, a web-based environment for building, analyzing and sharing rule-based models. PMID:20673321
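
    For readers unfamiliar with the underlying stochastic simulation principle, the sketch below shows Gillespie's direct method on a toy two-reaction system. It illustrates only the general approach that network-free tools build on; it is not RuleMonkey's rule-based, network-free algorithm.

    ```python
    # Generic sketch of Gillespie's direct method for a toy reaction system
    # (A + B -> C and C -> A + B).  Parameters are illustrative placeholders.
    import math
    import random


    def gillespie(a=100, b=100, c=0, k_bind=0.01, k_unbind=0.1, t_end=10.0, seed=1):
        random.seed(seed)
        t = 0.0
        while t < t_end:
            rate_bind = k_bind * a * b
            rate_unbind = k_unbind * c
            total = rate_bind + rate_unbind
            if total == 0.0:
                break
            # Time to the next reaction is exponentially distributed.
            t += -math.log(random.random()) / total
            # Choose which reaction fires, proportional to its propensity.
            if random.random() * total < rate_bind:
                a, b, c = a - 1, b - 1, c + 1
            else:
                a, b, c = a + 1, b + 1, c - 1
        return a, b, c


    if __name__ == "__main__":
        print(gillespie())
    ```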

  5. Utility of NIST Whole-Genome Reference Materials for the Technical Validation of a Multigene Next-Generation Sequencing Test.

    PubMed

    Shum, Bennett O V; Henner, Ilya; Belluoccio, Daniele; Hinchcliffe, Marcus J

    2017-07-01

    The sensitivity and specificity of next-generation sequencing laboratory developed tests (LDTs) are typically determined by an analyte-specific approach. Analyte-specific validations use disease-specific controls to assess an LDT's ability to detect known pathogenic variants. Alternatively, a methods-based approach can be used for LDT technical validations. Methods-focused validations do not use disease-specific controls but use benchmark reference DNA that contains known variants (benign, variants of unknown significance, and pathogenic) to assess variant calling accuracy of a next-generation sequencing workflow. Recently, four whole-genome reference materials (RMs) from the National Institute of Standards and Technology (NIST) were released to standardize methods-based validations of next-generation sequencing panels across laboratories. We provide a practical method for using NIST RMs to validate multigene panels. We analyzed the utility of RMs in validating a novel newborn screening test that targets 70 genes, called NEO1. Despite the NIST RM variant truth set originating from multiple sequencing platforms, replicates, and library types, we discovered a 5.2% false-negative variant detection rate in the RM truth set genes that were assessed in our validation. We developed a strategy using complementary non-RM controls to demonstrate 99.6% sensitivity of the NEO1 test in detecting variants. Our findings have implications for laboratories or proficiency testing organizations using whole-genome NIST RMs for testing. Copyright © 2017 American Society for Investigative Pathology and the Association for Molecular Pathology. Published by Elsevier Inc. All rights reserved.

  6. A multilevel-ROI-features-based machine learning method for detection of morphometric biomarkers in Parkinson's disease.

    PubMed

    Peng, Bo; Wang, Suhong; Zhou, Zhiyong; Liu, Yan; Tong, Baotong; Zhang, Tao; Dai, Yakang

    2017-06-09

    Machine learning methods have been widely used in recent years for detecting neuroimaging biomarkers in regions of interest (ROIs) and assisting the diagnosis of neurodegenerative diseases. The innovation of this study is to use a multilevel-ROI-features-based machine learning method to detect sensitive morphometric biomarkers in Parkinson's disease (PD). Specifically, low-level ROI features (gray matter volume, cortical thickness, etc.) and high-level correlative features (connectivity between ROIs) are integrated to construct the multilevel ROI features. Filter- and wrapper-based feature selection methods and a multi-kernel support vector machine (SVM) are used in the classification algorithm. T1-weighted brain magnetic resonance (MR) images of 69 PD patients and 103 normal controls from the Parkinson's Progression Markers Initiative (PPMI) dataset are included in the study. The machine learning method performs well in classifying PD patients and normal controls, with an accuracy of 85.78%, a specificity of 87.79%, and a sensitivity of 87.64%. The most sensitive biomarkers between PD patients and normal controls are mainly distributed in the frontal lobe, parietal lobe, limbic lobe, temporal lobe, and central region. The classification performance of our method with multilevel ROI features is significantly improved compared with other classification methods using single-level features. The proposed method shows promising identification ability for detecting morphometric biomarkers in PD, thus confirming its potential for assisting diagnosis of the disease. Copyright © 2017 Elsevier B.V. All rights reserved.
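
    The classification step, concatenating low-level ROI features with higher-level connectivity features and training an SVM under cross-validation, can be sketched as follows. The random data, feature dimensions, and RBF kernel are assumptions for illustration; the study's full pipeline additionally uses filter/wrapper feature selection and a multi-kernel SVM.

    ```python
    # Minimal sketch: combine two feature levels and cross-validate an SVM.
    # All data here are random placeholders, not PPMI data.
    import numpy as np
    from sklearn.svm import SVC
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(0)
    n_subjects = 60
    roi_features = rng.normal(size=(n_subjects, 90))     # e.g. volume/thickness per ROI
    conn_features = rng.normal(size=(n_subjects, 120))   # e.g. ROI-ROI correlations
    X = np.hstack([roi_features, conn_features])         # multilevel feature vector
    y = rng.integers(0, 2, size=n_subjects)              # 0 = control, 1 = patient

    clf = SVC(kernel="rbf", C=1.0)
    scores = cross_val_score(clf, X, y, cv=5)
    print(scores.mean())
    ```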

  7. Differentiation in MALDI-TOF MS and FTIR spectra between two pathovars of Xanthomonas oryzae

    NASA Astrophysics Data System (ADS)

    Ge, Mengyu; Li, Bin; Wang, Li; Tao, Zhongyun; Mao, Shengfeng; Wang, Yangli; Xie, Guanlin; Sun, Guochang

    2014-12-01

    Xanthomonas oryzae pv. oryzae (Xoo) and Xanthomonas oryzae pv. oryzicola (Xoc) strains are closely related phenotypically and genetically, which makes it difficult to differentiate between the two pathovars using phenotypic and DNA-based methods. In this study, a fast and accurate method was developed based on the differences in MALDI-TOF MS and FTIR spectra between the two pathovars. MALDI-TOF MS analysis revealed that 9 and 10 peaks are specific to Xoo and Xoc, respectively, which can be used as biomarkers to identify and differentiate the two closely related pathovars. Furthermore, FTIR analysis showed a significant difference in both the band frequencies and the absorption intensities of various functional groups between the two pathovars. In particular, 6 peaks at 3433, 2867, 1273, 1065, 983 and 951 cm-1 were specific to the Xoo strains, while one peak at 1572 cm-1 was specific to the Xoc strains. Overall, this study represents the first attempt to identify and differentiate the two pathovars of X. oryzae based on mass and FTIR spectra, which will be helpful for the early detection and prevention of the two rice diseases caused by the X. oryzae pathovars.

  8. Comparison of two PCR-based methods and automated DNA sequencing for prop-1 genotyping in Ames dwarf mice.

    PubMed

    Gerstner, Arpad; DeFord, James H; Papaconstantinou, John

    2003-07-25

    Ames dwarfism is caused by a homozygous single nucleotide mutation in the pituitary-specific prop-1 gene, resulting in combined pituitary hormone deficiency, reduced growth and extended lifespan. Thus, these mice serve as an important model system for endocrinological, aging and longevity studies. Because the phenotype of wild-type and heterozygous mice is indistinguishable, it is imperative for successful breeding to accurately genotype these animals. Here we report a novel, yet simple, approach for prop-1 genotyping using PCR-based allele-specific amplification (PCR-ASA). We also compare this method to other potential genotyping techniques, i.e. PCR-based restriction fragment length polymorphism analysis (PCR-RFLP) and fluorescence automated DNA sequencing. We demonstrate that the single-step PCR-ASA has several advantages over the classical PCR-RFLP because the procedure is simple, less expensive and rapid. To further increase the specificity and sensitivity of the PCR-ASA, we introduced a single-base mismatch at the 3' penultimate position of the mutant primer. Our results also reveal that fluorescence automated DNA sequencing has limitations for detecting a single nucleotide polymorphism in the prop-1 gene, particularly in heterozygotes.

  9. Proceedings of the Second NASA Formal Methods Symposium

    NASA Technical Reports Server (NTRS)

    Munoz, Cesar (Editor)

    2010-01-01

    This publication contains the proceedings of the Second NASA Formal Methods Symposium sponsored by the National Aeronautics and Space Administration and held in Washington D.C. April 13-15, 2010. Topics covered include: Decision Engines for Software Analysis using Satisfiability Modulo Theories Solvers; Verification and Validation of Flight-Critical Systems; Formal Methods at Intel -- An Overview; Automatic Review of Abstract State Machines by Meta Property Verification; Hardware-independent Proofs of Numerical Programs; Slice-based Formal Specification Measures -- Mapping Coupling and Cohesion Measures to Formal Z; How Formal Methods Impels Discovery: A Short History of an Air Traffic Management Project; A Machine-Checked Proof of A State-Space Construction Algorithm; Automated Assume-Guarantee Reasoning for Omega-Regular Systems and Specifications; Modeling Regular Replacement for String Constraint Solving; Using Integer Clocks to Verify the Timing-Sync Sensor Network Protocol; Can Regulatory Bodies Expect Efficient Help from Formal Methods?; Synthesis of Greedy Algorithms Using Dominance Relations; A New Method for Incremental Testing of Finite State Machines; Verification of Faulty Message Passing Systems with Continuous State Space in PVS; Phase Two Feasibility Study for Software Safety Requirements Analysis Using Model Checking; A Prototype Embedding of Bluespec System Verilog in the PVS Theorem Prover; SimCheck: An Expressive Type System for Simulink; Coverage Metrics for Requirements-Based Testing: Evaluation of Effectiveness; Software Model Checking of ARINC-653 Flight Code with MCP; Evaluation of a Guideline by Formal Modelling of Cruise Control System in Event-B; Formal Verification of Large Software Systems; Symbolic Computation of Strongly Connected Components Using Saturation; Towards the Formal Verification of a Distributed Real-Time Automotive System; Slicing AADL Specifications for Model Checking; Model Checking with Edge-valued Decision Diagrams; and Data-flow based Model Analysis.

  10. Comparison of Effectiveness of the Metacognition Treatment and the Mindfulness-Based Stress Reduction Treatment on Global and Specific Life Quality of Women with Breast Cancer

    PubMed Central

    Rahmani, Soheila; Talepasand, Siavash; Ghanbary-Motlagh, ALi

    2014-01-01

    Background: This study compared the effects of metacognition treatment and mindfulness-based stress reduction treatment on the quality of life of women with breast cancer. Methods: In a quasi-experimental design with pre-test, post-test, and control group, 36 patients diagnosed with breast cancer who had been referred to the Division of Oncology and Radiotherapy of Imam Hossein hospital in Tehran were selected by convenience sampling and randomly assigned to three groups: the first received metacognition treatment (n=12), the second received a mindfulness-based stress reduction program (n=12), and the third served as the control group. Participants completed a global quality-of-life questionnaire for cancer patients and a breast-cancer-specific quality-of-life questionnaire at three time points: baseline, after the intervention, and at two-month follow-up. Data were analyzed using a multivariate repeated-measures model. Results: Both treatments were effective in improving global and specific quality of life in patients with breast cancer. Compared with the metacognition treatment, the mindfulness-based stress reduction treatment was superior for functioning and role performance, fatigue, pain, future perspective, and treatment side-effect symptoms at the end of treatment and at follow-up. Conclusion: The mindfulness-based stress reduction treatment can be effective in improving global and specific quality of life in women with breast cancer and is a suitable option for improving quality of life in these patients. PMID:25628839

  11. The research on user behavior evaluation method for network state

    NASA Astrophysics Data System (ADS)

    Zhang, Chengyuan; Xu, Haishui

    2017-08-01

    Based on the correlation between user behavior and network running state, this paper proposes a method for evaluating user behavior with respect to network state. Drawing on analysis and evaluation methods from other fields of study, we introduce the theory and tools of data mining. Using the network status information provided by the trusted network view, the user behavior data and the network state data are analysed. Finally, we construct a user behavior evaluation index and its weights; on this basis, the degree to which specific behaviors of different users influence changes in the network running state can be quantified accurately, providing a basis for user behavior control decisions.

  12. Agricultural soil greenhouse gas emissions: a review of national inventory methods.

    PubMed

    Lokupitiya, Erandathie; Paustian, Keith

    2006-01-01

    Parties to the United Nations Framework Convention on Climate Change (UNFCCC) are required to submit national greenhouse gas (GHG) inventories, together with information on methods used in estimating their emissions. Currently agricultural activities contribute a significant portion (approximately 20%) of global anthropogenic GHG emissions, and agricultural soils have been identified as one of the main GHG source categories within the agricultural sector. However, compared to many other GHG sources, inventory methods for soils are relatively more complex and have been implemented only to varying degrees among member countries. This review summarizes and evaluates the methods used by Annex 1 countries in estimating CO2 and N2O emissions in agricultural soils. While most countries utilize the Intergovernmental Panel on Climate Change (IPCC) default methodology, several Annex 1 countries are developing more advanced methods that are tailored for specific country circumstances. Based on the latest national inventory reporting, about 56% of the Annex 1 countries use IPCC Tier 1 methods, about 26% use Tier 2 methods, and about 18% do not estimate or report N2O emissions from agricultural soils. More than 65% of the countries do not report CO2 emissions from the cultivation of mineral soils, organic soils, or liming, and only a handful of countries have used country-specific, Tier 3 methods. Tier 3 methods usually involve process-based models and detailed, geographically specific activity data. Such methods can provide more robust, accurate estimates of emissions and removals but require greater diligence in documentation, transparency, and uncertainty assessment to ensure comparability between countries. Availability of detailed, spatially explicit activity data is a major constraint to implementing higher tiered methods in many countries.
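
    The Tier 1 approach mentioned above reduces, in its simplest form, to activity data multiplied by a default emission factor. The sketch below shows that arithmetic for direct soil N2O; the nitrogen inputs are hypothetical, and the 0.01 kg N2O-N per kg N factor is the commonly cited IPCC 2006 default rather than a value taken from this review.

      # Sketch of an IPCC Tier 1-style estimate of direct N2O emissions from
      # managed soils: nitrogen applied (activity data) times a default emission
      # factor, converted from N2O-N to N2O by the molar-mass ratio 44/28.
      # All input values are illustrative, and EF1 = 0.01 is assumed here.
      def tier1_direct_n2o(n_applied_kg, ef1=0.01):
          n2o_n = n_applied_kg * ef1          # kg N2O-N emitted
          return n2o_n * (44.0 / 28.0)        # kg N2O

      synthetic_fertiliser_n = 2.0e8   # kg N per year (hypothetical national total)
      manure_n = 5.0e7                 # kg N per year (hypothetical)
      total = tier1_direct_n2o(synthetic_fertiliser_n + manure_n)
      print(f"direct soil N2O: {total / 1e6:.1f} Gg N2O per year")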

  13. Dual channel sensitive detection of hsa-miR-21 based on rolling circle amplification and quantum dots tagging.

    PubMed

    Wangt, Dan-Chen; Hu, Li-Hui; Zhou, Yu-Hui; Huang, Yu-Ting; Li, Xinhua; Zhu, Jun-Jie

    2014-04-01

    An isothermal, highly sensitive and specific assay for the detection of hsa-miR-21, integrating quantum dot (QD) tagging and rolling circle amplification, is presented. In addition, a dual-channel strategy for miRNA detection was proposed: anodic stripping voltammetry (ASV) and a fluorescence method were both used for the final Cd2+ signal readout. The designed strategy exhibited good specificity for hsa-miR-21, and the two detection channels gave comparable results.

  14. Predicting locations of rare aquatic species’ habitat with a combination of species-specific and assemblage-based models

    USGS Publications Warehouse

    McKenna, James E.; Carlson, Douglas M.; Payne-Wynne, Molly L.

    2013-01-01

    Aim: Rare aquatic species are a substantial component of biodiversity, and their conservation is a major objective of many management plans. However, they are difficult to assess, and their optimal habitats are often poorly known. Methods to effectively predict the likely locations of suitable rare aquatic species habitats are needed. We combine two modelling approaches to predict occurrence and general abundance of several rare fish species. Location: Allegheny watershed of western New York State (USA). Methods: Our method used two empirical neural network modelling approaches (species specific and assemblage based) to predict stream-by-stream occurrence and general abundance of rare darters, based on broad-scale habitat conditions. Species-specific models were developed for longhead darter (Percina macrocephala), spotted darter (Etheostoma maculatum) and variegate darter (Etheostoma variatum) in the Allegheny drainage. An additional model predicted the type of rare darter-containing assemblage expected in each stream reach. Predictions from both models were then combined inclusively and exclusively and compared with additional independent data. Results: Example rare darter predictions demonstrate the method's effectiveness. Models performed well (R2 ≥ 0.79), identified where suitable darter habitat was most likely to occur, and predictions matched well to those of collection sites. Additional independent data showed that the most conservative (exclusive) model slightly underestimated the distributions of these rare darters or predictions were displaced by one stream reach, suggesting that new darter habitat types were detected in the later collections. Main conclusions: Broad-scale habitat variables can be used to effectively identify rare species' habitats. Combining species-specific and assemblage-based models enhances our ability to make use of the sparse data on rare species and to identify habitat units most likely and least likely to support those species. This hybrid approach may assist managers with the prioritization of habitats to be examined or conserved for rare species.
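
    A toy illustration of the inclusive/exclusive combination of the two model outputs described above; the reach identifiers, scores, and the 0.5 threshold are hypothetical.

      # Combine a species-specific prediction with an assemblage-based prediction
      # per stream reach: inclusively (either model flags suitable habitat) and
      # exclusively (both must agree). All values below are hypothetical.
      species_score = {"reach_01": 0.82, "reach_02": 0.31, "reach_03": 0.64}
      assemblage_flag = {"reach_01": True, "reach_02": True, "reach_03": False}

      def combine(reach, threshold=0.5):
          sp = species_score[reach] >= threshold
          asm = assemblage_flag[reach]
          return {"inclusive": sp or asm, "exclusive": sp and asm}

      for reach in species_score:
          print(reach, combine(reach))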

  15. Real-time PCR assay is superior to other methods for the detection of mycoplasma contamination in the cell lines of the National Cell Bank of Iran.

    PubMed

    Molla Kazemiha, Vahid; Bonakdar, Shahin; Amanzadeh, Amir; Azari, Shahram; Memarnejadian, Arash; Shahbazi, Shirin; Shokrgozar, Mohammad Ali; Mahdian, Reza

    2016-08-01

    Mycoplasmas are the most important contaminants of cell cultures throughout the world. They are considered as a major problem in biological studies and biopharmaceutical economic issues. In this study, our aim was to find the best standard technique as a rapid method with high sensitivity, specificity and accuracy for the detection of mycoplasma contamination in the cell lines of the National Cell Bank of Iran. Thirty cell lines suspected of mycoplasma contamination were evaluated by five different techniques including microbial culture, indirect DNA DAPI staining, enzymatic mycoalert(®) assay, conventional PCR and real-time PCR. Five mycoplasma-contaminated cell lines were assigned as positive controls and five mycoplasma-free cell lines as negative controls. The enzymatic method was performed using the mycoalert(®) mycoplasma detection kit. Real-time PCR technique was conducted by PromoKine diagnostic kits. In the conventional PCR method, mycoplasma genus-specific primers were designed to analyze the sequences based on a fixed and common region on 16S ribosomal RNA with PCR product size of 425 bp. Mycoplasma contamination was observed in 60, 56.66, 53.33, 46.66 and 33.33 % of 30 different cell cultures by real-time PCR, PCR, enzymatic mycoalert(®), indirect DNA DAPI staining and microbial culture methods, respectively. The analysis of the results of the different methods showed that the real-time PCR assay was superior to the other methods, with sensitivity, specificity, accuracy, and positive and negative predictive values of 100%. These values were 94.44, 100, 96.77, 100 and 92.85% for the conventional PCR method, respectively. Therefore, this study showed that real-time PCR and PCR assays based on the common sequences in the 16S ribosomal RNA are reliable methods with high sensitivity, specificity and accuracy for detection of mycoplasma contamination in cell cultures and other biological products.
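
    The performance figures quoted above follow directly from a 2x2 confusion table. The sketch below shows the arithmetic; the counts used are hypothetical, not the study's data.

      # Derive sensitivity, specificity, accuracy and the positive/negative
      # predictive values from a 2x2 confusion table (hypothetical counts).
      def diagnostic_metrics(tp, fp, tn, fn):
          return {
              "sensitivity": tp / (tp + fn),
              "specificity": tn / (tn + fp),
              "accuracy":    (tp + tn) / (tp + fp + tn + fn),
              "ppv":         tp / (tp + fp),   # predictive value of a positive result
              "npv":         tn / (tn + fn),   # predictive value of a negative result
          }

      # Example: 17 true positives, 1 false negative, 12 true negatives, 0 false positives
      print(diagnostic_metrics(tp=17, fp=0, tn=12, fn=1))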

  16. Practical Methods for Including Torsional Anharmonicity in Thermochemical Calculations on Complex Molecules: The Internal-Coordinate Multi-Structural Approximation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zheng, J.; Yu, T.; Papajak, E.

    2011-01-01

    Many methods for correcting harmonic partition functions for the presence of torsional motions employ some form of one-dimensional torsional treatment to replace the harmonic contribution of a specific normal mode. However, torsions are often strongly coupled to other degrees of freedom, especially other torsions and low-frequency bending motions, and this coupling can make assigning torsions to specific normal modes problematic. Here, we present a new class of methods, called multi-structural (MS) methods, that circumvents the need for such assignments by instead adjusting the harmonic results by torsional correction factors that are determined using internal coordinates. We present three versions of the MS method: (i) MS-AS based on including all structures (AS), i.e., all conformers generated by internal rotations; (ii) MS-ASCB based on all structures augmented with explicit conformational barrier (CB) information, i.e., including explicit calculations of all barrier heights for internal-rotation barriers between the conformers; and (iii) MS-RS based on including all conformers generated from a reference structure (RS) by independent torsions. In the MS-AS scheme, one has two options for obtaining the local periodicity parameters, one based on consideration of the nearly separable limit and one based on strongly coupled torsions. The latter involves assigning the local periodicities on the basis of Voronoi volumes. The methods are illustrated with calculations for ethanol, 1-butanol, and 1-pentyl radical as well as two one-dimensional torsional potentials. The MS-AS method is particularly interesting because it does not require any information about conformational barriers or about the paths that connect the various structures.

  17. Practical methods for including torsional anharmonicity in thermochemical calculations on complex molecules: the internal-coordinate multi-structural approximation.

    PubMed

    Zheng, Jingjing; Yu, Tao; Papajak, Ewa; Alecu, I M; Mielke, Steven L; Truhlar, Donald G

    2011-06-21

    Many methods for correcting harmonic partition functions for the presence of torsional motions employ some form of one-dimensional torsional treatment to replace the harmonic contribution of a specific normal mode. However, torsions are often strongly coupled to other degrees of freedom, especially other torsions and low-frequency bending motions, and this coupling can make assigning torsions to specific normal modes problematic. Here, we present a new class of methods, called multi-structural (MS) methods, that circumvents the need for such assignments by instead adjusting the harmonic results by torsional correction factors that are determined using internal coordinates. We present three versions of the MS method: (i) MS-AS based on including all structures (AS), i.e., all conformers generated by internal rotations; (ii) MS-ASCB based on all structures augmented with explicit conformational barrier (CB) information, i.e., including explicit calculations of all barrier heights for internal-rotation barriers between the conformers; and (iii) MS-RS based on including all conformers generated from a reference structure (RS) by independent torsions. In the MS-AS scheme, one has two options for obtaining the local periodicity parameters, one based on consideration of the nearly separable limit and one based on strongly coupled torsions. The latter involves assigning the local periodicities on the basis of Voronoi volumes. The methods are illustrated with calculations for ethanol, 1-butanol, and 1-pentyl radical as well as two one-dimensional torsional potentials. The MS-AS method is particularly interesting because it does not require any information about conformational barriers or about the paths that connect the various structures.
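
    Schematically, the multi-structural correction described above amounts to a Boltzmann-weighted sum of harmonic conformer contributions, each scaled by internal-coordinate torsional factors. The form below is a hedged paraphrase of that description in assumed notation, not a transcription of the paper's equations:

      Q_{con-rovib}^{MS}(T) = \sum_{j=1}^{J} Q_{j}^{rot}\, e^{-U_j / k_B T}\, Q_{j}^{HO}(T) \prod_{\eta=1}^{t} f_{j,\eta}(T)

    where U_j is the energy of conformer j relative to the lowest-energy structure, Q_j^{HO} is its harmonic-oscillator partition function, and f_{j,\eta} is the correction factor applied for torsion \eta; the MS-AS, MS-ASCB and MS-RS variants differ in which conformers enter the sum and how the f_{j,\eta} are determined.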

  18. ANTONIA perfusion and stroke. A software tool for the multi-purpose analysis of MR perfusion-weighted datasets and quantitative ischemic stroke assessment.

    PubMed

    Forkert, N D; Cheng, B; Kemmling, A; Thomalla, G; Fiehler, J

    2014-01-01

    The objective of this work is to present the software tool ANTONIA, which has been developed to facilitate a quantitative analysis of perfusion-weighted MRI (PWI) datasets in general as well as the subsequent multi-parametric analysis of additional datasets for the specific purpose of acute ischemic stroke patient dataset evaluation. Three different methods for the analysis of DSC or DCE PWI datasets are currently implemented in ANTONIA, which can be case-specifically selected based on the study protocol. These methods comprise a curve fitting method as well as a deconvolution-based and deconvolution-free method integrating a previously defined arterial input function. The perfusion analysis is extended for the purpose of acute ischemic stroke analysis by additional methods that enable an automatic atlas-based selection of the arterial input function, an analysis of the perfusion-diffusion and DWI-FLAIR mismatch as well as segmentation-based volumetric analyses. For reliability evaluation, the described software tool was used by two observers for quantitative analysis of 15 datasets from acute ischemic stroke patients to extract the acute lesion core volume, FLAIR ratio, perfusion-diffusion mismatch volume with manually as well as automatically selected arterial input functions, and follow-up lesion volume. The results of this evaluation revealed that the described software tool leads to highly reproducible results for all parameters if the automatic arterial input function selection method is used. Due to the broad selection of processing methods that are available in the software tool, ANTONIA is especially helpful to support image-based perfusion and acute ischemic stroke research projects.

  19. Development and evaluation of a polydiacetylene based biosensor for the detection of H5 influenza virus.

    PubMed

    Jiang, Lixiang; Luo, Jing; Dong, Wenjie; Wang, Chengmin; Jin, Wen; Xia, Yuetong; Wang, Haijing; Ding, Hua; Jiang, Long; He, Hongxuan

    2015-07-01

    H5N1 avian influenza has caused serious economic losses as well as posed significant threats to public health, agriculture and wildlife. It is important to develop a rapid, sensitive and specific detection platform suitable for disease surveillance and control. In this study, a highly sensitive, specific and rapid biosensor based on polydiacetylene was developed for detecting H5 influenza virus. The polydiacetylene-based biosensor was produced from an optimized ratio of 10,12-pentacosadiynoic acid and 1,2-dimyristoyl-sn-glycero-3-phosphocholine, with the anti-H5 influenza antibody embedded onto the vesicle surface. The optimized polydiacetylene vesicle could detect H5 influenza virus sensitively with a detection limit of 0.53 copies/μL, showing a dramatic blue-to-red color change that can be observed directly by the naked eye and recorded by a UV-vis spectrometer. The sensitivity, specificity and accuracy of the biosensor were also evaluated. The sensor could specifically differentiate H5 influenza virus from H3 influenza virus, Newcastle disease virus and porcine reproductive and respiratory syndrome virus. Detection using tracheal swabs was in accord with virus isolation results, and comparable to the RT-PCR method. These results demonstrate the potential of a simple polydiacetylene-based bio-analytical method for influenza surveillance. Copyright © 2015 Elsevier B.V. All rights reserved.

  20. The efficacy of the addition of the Pilates method over a minimal intervention in the treatment of chronic nonspecific low back pain: a study protocol of a randomized controlled trial

    PubMed Central

    Miyamoto, Gisela C.; Costa, Leonardo O.P.; Galvanin, Thalissa; Cabral, Cristina M.N.

    2011-01-01

    Objective There is little high-quality evidence on the efficacy of the Pilates-based exercises for the treatment of chronic nonspecific low back pain. Therefore, the objective of this paper is to present a study protocol to investigate the efficacy of adding Pilates-based exercises to a minimum intervention in patients with chronic non-specific low back pain. Methods This randomized controlled trial will recruit 86 patients of both sexes, aged between 18 and 60 years, with chronic non-specific low back pain. The participants will be randomly allocated into 2 treatment groups: the Booklet Group, which will receive a booklet with postural orientations, and the Pilates Group, which will receive the same booklet in addition to a Pilates-based exercises program. The general and specific functional capacities of the patient, kinesiophobia, pain intensity, and the global perceived effect will be evaluated by a blinded assessor before randomization and at 6 weeks and 6 months after randomization. In addition, the expectations of the participants and their confidence in the treatment will be evaluated before the randomization and after the first treatment session, respectively. Conclusions It is hoped that the results of this study will provide high-quality evidence on the usefulness of Pilates-based exercises in the treatment of chronic non-specific low back pain. PMID:22654682

  1. Evaluation and integration of existing methods for computational prediction of allergens

    PubMed Central

    2013-01-01

    Background: Allergy involves a series of complex reactions and factors that contribute to the development of the disease and the triggering of symptoms, including rhinitis, asthma, atopic eczema, skin sensitivity, and even acute and fatal anaphylactic shock. Prediction and evaluation of potential allergenicity are important for the safety evaluation of foods and other environmental factors. Although several computational approaches for assessing the potential allergenicity of proteins have been developed, their performance and relative merits and shortcomings have not been compared systematically. Results: To evaluate and improve the existing methods for allergen prediction, we collected an up-to-date definitive dataset consisting of 989 known allergens and a large set of putative non-allergens. The three most widely used computational allergen prediction approaches, namely sequence-, motif- and SVM-based (Support Vector Machine) methods, were systematically compared using the defined parameters, and we found that the SVM-based method outperformed the other two methods with higher accuracy and specificity. The sequence-based method with the criteria defined by FAO/WHO (FAO: Food and Agriculture Organization of the United Nations; WHO: World Health Organization) has a higher sensitivity of over 98%, but a low specificity. The advantage of the motif-based method is the ability to visualize the key motif within the allergen. Notably, the performance of the sequence-based method defined by FAO/WHO and of the motif-eliciting strategy could be improved by optimization of parameters. To facilitate allergen prediction, we integrated these three methods in a web-based application, proAP, which provides a global search of the known allergens and a powerful tool for allergen prediction. Flexible parameter setting and batch prediction were also implemented. The proAP can be accessed at http://gmobl.sjtu.edu.cn/proAP/main.html. Conclusions: This study comprehensively evaluated sequence-, motif- and SVM-based computational prediction approaches for allergens and optimized their parameters to obtain better performance. These findings may provide helpful guidance for researchers in allergen prediction. Furthermore, we integrated these methods into the web application proAP, greatly facilitating customizable allergen search and prediction. PMID:23514097

  2. Evaluation and integration of existing methods for computational prediction of allergens.

    PubMed

    Wang, Jing; Yu, Yabin; Zhao, Yunan; Zhang, Dabing; Li, Jing

    2013-01-01

    Allergy involves a series of complex reactions and factors that contribute to the development of the disease and the triggering of symptoms, including rhinitis, asthma, atopic eczema, skin sensitivity, and even acute and fatal anaphylactic shock. Prediction and evaluation of potential allergenicity are important for the safety evaluation of foods and other environmental factors. Although several computational approaches for assessing the potential allergenicity of proteins have been developed, their performance and relative merits and shortcomings have not been compared systematically. To evaluate and improve the existing methods for allergen prediction, we collected an up-to-date definitive dataset consisting of 989 known allergens and a large set of putative non-allergens. The three most widely used computational allergen prediction approaches, namely sequence-, motif- and SVM-based (Support Vector Machine) methods, were systematically compared using the defined parameters, and we found that the SVM-based method outperformed the other two methods with higher accuracy and specificity. The sequence-based method with the criteria defined by FAO/WHO (FAO: Food and Agriculture Organization of the United Nations; WHO: World Health Organization) has a higher sensitivity of over 98%, but a low specificity. The advantage of the motif-based method is the ability to visualize the key motif within the allergen. Notably, the performance of the sequence-based method defined by FAO/WHO and of the motif-eliciting strategy could be improved by optimization of parameters. To facilitate allergen prediction, we integrated these three methods in a web-based application, proAP, which provides a global search of the known allergens and a powerful tool for allergen prediction. Flexible parameter setting and batch prediction were also implemented. The proAP can be accessed at http://gmobl.sjtu.edu.cn/proAP/main.html. This study comprehensively evaluated sequence-, motif- and SVM-based computational prediction approaches for allergens and optimized their parameters to obtain better performance. These findings may provide helpful guidance for researchers in allergen prediction. Furthermore, we integrated these methods into the web application proAP, greatly facilitating customizable allergen search and prediction.
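
    The FAO/WHO sequence criteria referred to above can be sketched as two checks: an identity of at least 35% over an 80-amino-acid window, and any shared stretch of six identical contiguous residues. The version below uses a simple ungapped window comparison for brevity, whereas the guideline specifies an alignment-based comparison; the function names and sequences are illustrative.

      # Minimal sketch of FAO/WHO-style sequence rules: flag a query protein if
      # (i) any 80-residue window shares >= 35% identity with an allergen
      # (ungapped comparison here, a simplification of the alignment-based rule)
      # or (ii) it shares any identical 6-mer with the allergen.
      def identity_over_windows(query, allergen, window=80, cutoff=0.35):
          for i in range(len(query) - window + 1):
              for j in range(len(allergen) - window + 1):
                  q, a = query[i:i + window], allergen[j:j + window]
                  ident = sum(x == y for x, y in zip(q, a)) / window
                  if ident >= cutoff:
                      return True
          return False

      def shares_six_mer(query, allergen):
          kmers = {allergen[i:i + 6] for i in range(len(allergen) - 5)}
          return any(query[i:i + 6] in kmers for i in range(len(query) - 5))

      def fao_who_flag(query, allergen):
          return identity_over_windows(query, allergen) or shares_six_mer(query, allergen)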

  3. Image Quality Assessment Based on Local Linear Information and Distortion-Specific Compensation.

    PubMed

    Wang, Hanli; Fu, Jie; Lin, Weisi; Hu, Sudeng; Kuo, C-C Jay; Zuo, Lingxuan

    2016-12-14

    Image Quality Assessment (IQA) is a fundamental yet constantly developing task for computer vision and image processing. Most IQA evaluation mechanisms are based on the pertinence of subjective and objective estimation. Each image distortion type has its own property correlated with human perception. However, this intrinsic property may not be fully exploited by existing IQA methods. In this paper, we make two main contributions to the IQA field. First, a novel IQA method is developed based on a local linear model that examines the distortion between the reference and the distorted images for better alignment with human visual experience. Second, a distortion-specific compensation strategy is proposed to offset the negative effect on IQA modeling caused by different image distortion types. These score offsets are learned from several known distortion types. Furthermore, for an image with an unknown distortion type, a Convolutional Neural Network (CNN) based method is proposed to compute the score offset automatically. Finally, an integrated IQA metric is proposed by combining the aforementioned two ideas. Extensive experiments are performed to verify the proposed IQA metric, which demonstrate that the local linear model is useful in human perception modeling, especially for individual image distortion, and the overall IQA method outperforms several state-of-the-art IQA approaches.
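
    A toy version of the local-linear idea discussed above: fit a per-patch linear map from the reference image to the distorted image and pool the residual energy as a distortion measure. The patch size, pooling, and synthetic images are assumptions, not the paper's model.

      # Per-patch least-squares fit distorted = a * reference + b; the residual
      # energy, averaged over patches, serves as a crude local distortion score.
      import numpy as np

      def local_linear_residuals(ref, dist, patch=8):
          h, w = ref.shape
          scores = []
          for i in range(0, h - patch + 1, patch):
              for j in range(0, w - patch + 1, patch):
                  r = ref[i:i + patch, j:j + patch].ravel()
                  d = dist[i:i + patch, j:j + patch].ravel()
                  A = np.column_stack([r, np.ones_like(r)])
                  coeff, *_ = np.linalg.lstsq(A, d, rcond=None)
                  scores.append(np.mean((A @ coeff - d) ** 2))
          return np.mean(scores)   # higher residual -> stronger local distortion

      rng = np.random.default_rng(1)
      ref = rng.random((64, 64))
      print(local_linear_residuals(ref, ref + 0.05 * rng.standard_normal((64, 64))))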

  4. "Spoligoriftyping," a dual-priming-oligonucleotide-based direct-hybridization assay for tuberculosis control with a multianalyte microbead-based hybridization system.

    PubMed

    Gomgnimbou, Michel Kiréopori; Abadia, Edgar; Zhang, Jian; Refrégier, Guislaine; Panaiotov, Stefan; Bachiyska, Elizabeta; Sola, Christophe

    2012-10-01

    We developed "spoligoriftyping," a 53-plex assay based on two preexisting methods, the spoligotyping and "rifoligotyping" assays, by combining them into a single assay. Spoligoriftyping allows simultaneous spoligotyping (i.e., clustered regularly interspaced short palindromic repeat [CRISPR]-based genotyping) and characterization of the main rifampin drug resistance mutations on the rpoB hot spot region in a few hours. This test partly uses the dual-priming-oligonucleotide (DPO) principle, which allows simultaneous efficient amplifications of rpoB and the CRISPR locus in the same sample. We tested this method on a set of 114 previously phenotypically and genotypically characterized multidrug-resistant (MDR) Mycobacterium tuberculosis or drug-susceptible M. tuberculosis DNA extracted from clinical isolates obtained from patients from Bulgaria, Nigeria, and Germany. We showed that our method is 100% concordant with rpoB sequencing results and 99.95% (3,911/3,913 spoligotype data points) correlated with classical spoligotyping results. The sensitivity and specificity of our assay were 99 and 100%, respectively, compared to those of phenotypic drug susceptibility testing. Such assays pave the way to the implementation of locally and specifically adapted methods of performing in a single tube both drug resistance mutation detection and genotyping in a few hours.

  5. Multi-objective Decision Based Available Transfer Capability in Deregulated Power System Using Heuristic Approaches

    NASA Astrophysics Data System (ADS)

    Pasam, Gopi Krishna; Manohar, T. Gowri

    2016-09-01

    Determination of available transfer capability (ATC) requires experience, intuition and sound judgment in order to meet several significant requirements of the deregulated environment. On this basis, this paper proposes two heuristic approaches to compute ATC. The first heuristic algorithm integrates five methods, namely continuation repeated power flow, repeated optimal power flow, radial basis function neural network, back-propagation neural network and adaptive neuro-fuzzy inference system, to obtain ATC. The second heuristic model is used to obtain multiple ATC values; from these, a specific ATC value is selected based on a number of social, economic and deregulated-environment constraints and on specific applications such as optimization, on-line monitoring and ATC forecasting, a scheme referred to as multi-objective decision based optimal ATC. The validity of the results obtained through these proposed methods is scrupulously verified on various buses of the IEEE 24-bus reliability test system. The results and conclusions presented in this paper are useful for the planning, operation and maintenance of reliable power in any power system and for its on-line monitoring in a deregulated environment. In this way, the proposed heuristic methods offer a practical approach for assessing multi-objective ATC using integrated methods.

  6. OFFGEL electrophoresis and tandem mass spectrometry approach compared with DNA-based PCR method for authentication of meat species from raw and cooked ground meat mixtures containing cattle meat, water buffalo meat and sheep meat.

    PubMed

    Naveena, Basappa M; Jagadeesh, Deepak S; Jagadeesh Babu, A; Madhava Rao, T; Kamuni, Veeranna; Vaithiyanathan, S; Kulkarni, Vinayak V; Rapole, Srikanth

    2017-10-15

    The present study compared the accuracy of an OFFGEL electrophoresis and tandem mass spectrometry-based proteomic approach with a DNA-based method for meat species identification from raw and cooked ground meat mixes containing cattle, water buffalo and sheep meat. The proteomic approach involved the separation of myofibrillar proteins using OFFGEL electrophoresis, SDS-PAGE and protein identification by MALDI-TOF MS. Species-specific peptides derived from myosin light chain-1 and 2 were identified for authenticating buffalo meat spiked at a minimum 0.5% level in sheep meat with high confidence. Relative quantification of buffalo meat mixed with sheep meat was done by quantitative label-free mass spectrometry using UPLC-QTOF and the PLGS search engine to substantiate the confidence level of the data. In the DNA-based method, PCR amplification of the mitochondrial D-loop gene using species-specific primers yielded 226-bp and 126-bp amplicons for buffalo and cattle meat, respectively. The method was efficient in detecting a minimum of 0.5% and 1.0% when buffalo meat was spiked with cattle meat in raw and cooked meat mixes. Copyright © 2017 Elsevier Ltd. All rights reserved.

  7. Method for Predicting the Energy Characteristics of Li-Ion Cells Designed for High Specific Energy

    NASA Technical Reports Server (NTRS)

    Bennett, William R.

    2012-01-01

    Novel electrode materials with increased specific capacity and voltage performance are critical to the NASA goals for developing Li-ion batteries with increased specific energy and energy density. Although performance metrics of the individual electrodes are critically important, a fundamental understanding of the interactions of electrodes in a full cell is essential to achieving the desired performance, and for establishing meaningful goals for electrode performance in the first place. This paper presents design considerations for matching positive and negative electrodes in a viable design. Methods for predicting cell-level performance, based on laboratory data for individual electrodes, are presented and discussed.
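
    A simplified version of the cell-level prediction described above: the capacity of the limiting electrode, an average cell voltage, and the total (active plus inactive) mass give a first-order specific-energy estimate. The numbers, the N/P ratio, and the lumped inactive-mass fraction below are illustrative assumptions, not NASA design values.

      # Simplified cell-level specific-energy estimate from electrode data:
      # capacity is set by the positive (limiting) electrode, energy is capacity
      # times the average cell voltage, and the mass includes both active
      # materials plus a lumped inactive fraction. All inputs are illustrative.
      def cell_specific_energy(q_pos_mAh_g, q_neg_mAh_g, v_cell_avg,
                               np_ratio=1.1, inactive_mass_fraction=0.4):
          m_pos = 1.0                                      # 1 g positive active material
          m_neg = np_ratio * q_pos_mAh_g / q_neg_mAh_g     # size negative by N/P ratio
          capacity_mAh = q_pos_mAh_g * m_pos               # positive-limited design
          total_mass = (m_pos + m_neg) / (1.0 - inactive_mass_fraction)
          return capacity_mAh * v_cell_avg / total_mass    # mWh/g, numerically Wh/kg

      print(cell_specific_energy(q_pos_mAh_g=200, q_neg_mAh_g=350, v_cell_avg=3.6))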

  8. Detection of nucleic acid sequences by invader-directed cleavage

    DOEpatents

    Brow, Mary Ann D.; Hall, Jeff Steven Grotelueschen; Lyamichev, Victor; Olive, David Michael; Prudent, James Robert

    1999-01-01

    The present invention relates to means for the detection and characterization of nucleic acid sequences, as well as variations in nucleic acid sequences. The present invention also relates to methods for forming a nucleic acid cleavage structure on a target sequence and cleaving the nucleic acid cleavage structure in a site-specific manner. The 5' nuclease activity of a variety of enzymes is used to cleave the target-dependent cleavage structure, thereby indicating the presence of specific nucleic acid sequences or specific variations thereof. The present invention further relates to methods and devices for the separation of nucleic acid molecules based on charge.

  9. PCR-Based Method for the Detection of Toxic Mushrooms Causing Food-Poisoning Incidents.

    PubMed

    Nomura, Chie; Masayama, Atsushi; Yamaguchi, Mizuka; Sakuma, Daisuke; Kajimura, Keiji

    2017-01-01

    In this study, species-specific identification of five toxic mushrooms, Chlorophyllum molybdites, Gymnopilus junonius, Hypholoma fasciculare, Pleurocybella porrigens, and Tricholoma ustale, which have been involved in food-poisoning incidents in Japan, was investigated. Specific primer pairs targeting internal transcribed spacer (ITS) regions were designed for PCR detection. The specific amplicons were obtained from fresh, cooked, and simulated gastric fluid (SGF)-treated samples. No amplicons were detected from other mushrooms with similar morphology. Our method using one-step extraction of mushrooms allows rapid detection within 2.5 hr. It could be utilized for rapid identification or screening of toxic mushrooms.

  10. Single tube genotyping of sickle cell anaemia using PCR-based SNP analysis.

    PubMed

    Waterfall, C M; Cobb, B D

    2001-12-01

    Allele-specific amplification (ASA) is a generally applicable technique for the detection of known single nucleotide polymorphisms (SNPs), deletions, insertions and other sequence variations. Conventionally, two reactions are required to determine the zygosity of DNA in a two-allele system, along with significant upstream optimisation to define the specific test conditions. Here, we combine single tube bi-directional ASA with a 'matrix-based' optimisation strategy, speeding up the whole process in a reduced reaction set. We use sickle cell anaemia as our model SNP system, a genetic disease that is currently screened using ASA methods. Discriminatory conditions were rapidly optimised enabling the unambiguous identification of DNA from homozygous sickle cell patients (HbS/S), heterozygous carriers (HbA/S) or normal DNA in a single tube. Simple downstream mathematical analyses based on product yield across the optimisation set allow an insight into the important aspects of priming competition and component interactions in this competitive PCR. This strategy can be applied to any polymorphism, defining specific conditions using a multifactorial approach. The inherent simplicity and low cost of this PCR-based method validates bi-directional ASA as an effective tool in future clinical screening and pharmacogenomic research where more expensive fluorescence-based approaches may not be desirable.

  11. Chromosome-specific staining to detect genetic rearrangements associated with chromosome 3 and/or chromosome 17

    DOEpatents

    Gray, Joe W.; Pinkel, Daniel; Kallioniemi, Olli-Pekka; Kallioniemi, Anne; Sakamoto, Masaru

    2002-01-01

    Methods and compositions for staining based upon nucleic acid sequence that employ nucleic acid probes are provided. Said methods produce staining patterns that can be tailored for specific cytogenetic analyses. Said probes are appropriate for in situ hybridization and stain both interphase and metaphase chromosomal material with reliable signals. The nucleic acid probes are typically of a complexity greater than 50 kb, the complexity depending upon the cytogenetic application. Methods and reagents are provided for the detection of genetic rearrangements. Probes and test kits are provided for use in detecting genetic rearrangements, particularly for use in tumor cytogenetics, in the detection of disease related loci, specifically cancer, such as chronic myelogenous leukemia (CML), retinoblastoma, ovarian and uterine cancers, and for biological dosimetry. Methods and reagents are described for cytogenetic research, for the differentiation of cytogenetically similar but genetically different diseases, and for many prognostic and diagnostic applications.

  12. Chromosome-specific staining to detect genetic rearrangements associated with chromosome 3 and/or chromosome 17

    DOEpatents

    Gray, Joe W.; Pinkel, Daniel; Kallioniemi, Olli-Pekka; Kallioniemi, Anne; Sakamoto, Masaru

    2008-09-09

    Methods and compositions for staining based upon nucleic acid sequence that employ nucleic acid probes are provided. Said methods produce staining patterns that can be tailored for specific cytogenetic analyses. Said probes are appropriate for in situ hybridization and stain both interphase and metaphase chromosomal material with reliable signals. The nucleic acid probes are typically of a complexity greater than 50 kb, the complexity depending upon the cytogenetic application. Methods and reagents are provided for the detection of genetic rearrangements. Probes and test kits are provided for use in detecting genetic rearrangements, particularly for use in tumor cytogenetics, in the detection of disease related loci, specifically cancer, such as chronic myelogenous leukemia (CML), retinoblastoma, ovarian and uterine cancers, and for biological dosimetry. Methods and reagents are described for cytogenetic research, for the differentiation of cytogenetically similar but genetically different diseases, and for many prognostic and diagnostic applications.

  13. Chromosome-specific staining to detect genetic rearrangements associated with chromosome 3 and/or chromosome 17

    DOEpatents

    Gray, Joe W. [San Francisco, CA]; Pinkel, Daniel [Lafayette, CA]; Kallioniemi, Olli-Pekka [Turku, FI]; Kallioniemi, Anne [Tampere, FI]; Sakamoto, Masaru [Tokyo, JP]

    2009-10-06

    Methods and compositions for staining based upon nucleic acid sequence that employ nucleic acid probes are provided. Said methods produce staining patterns that can be tailored for specific cytogenetic analyses. Said probes are appropriate for in situ hybridization and stain both interphase and metaphase chromosomal material with reliable signals. The nucleic acid probes are typically of a complexity greater than 50 kb, the complexity depending upon the cytogenetic application. Methods and reagents are provided for the detection of genetic rearrangements. Probes and test kits are provided for use in detecting genetic rearrangements, particularly for use in tumor cytogenetics, in the detection of disease related loci, specifically cancer, such as chronic myelogenous leukemia (CML), retinoblastoma, ovarian and uterine cancers, and for biological dosimetry. Methods and reagents are described for cytogenetic research, for the differentiation of cytogenetically similar but genetically different diseases, and for many prognostic and diagnostic applications.

  14. Chromosome-Specific Staining To Detect Genetic Rearrangements Associated With Chromosome 3 And/Or Chromosome 17

    DOEpatents

    Gray, Joe W.; Pinkel, Daniel; Kallioniemi, Olli-Pekka; Kallioniemi, Anne; Sakamoto, Masaru

    2002-02-05

    Methods and compositions for staining based upon nucleic acid sequence that employ nucleic acid probes are provided. Said methods produce staining patterns that can be tailored for specific cytogenetic analyses. Said probes are appropriate for in situ hybridization and stain both interphase and metaphase chromosomal material with reliable signals. The nucleic acid probes are typically of a complexity greater than 50 kb, the complexity depending upon the cytogenetic application. Methods and reagents are provided for the detection of genetic rearrangements. Probes and test kits are provided for use in detecting genetic rearrangements, particularly for use in tumor cytogenetics, in the detection of disease related loci, specifically cancer, such as chronic myelogenous leukemia (CML), retinoblastoma, ovarian and uterine cancers, and for biological dosimetry. Methods and reagents are described for cytogenetic research, for the differentiation of cytogenetically similar but genetically different diseases, and for many prognostic and diagnostic applications.

  15. HPV Genotyping of Modified General Primer-Amplicons Is More Analytically Sensitive and Specific by Sequencing than by Hybridization

    PubMed Central

    Meisal, Roger; Rounge, Trine Ballestad; Christiansen, Irene Kraus; Eieland, Alexander Kirkeby; Worren, Merete Molton; Molden, Tor Faksvaag; Kommedal, Øyvind; Hovig, Eivind; Leegaard, Truls Michael

    2017-01-01

    Sensitive and specific genotyping of human papillomaviruses (HPVs) is important for population-based surveillance of carcinogenic HPV types and for monitoring vaccine effectiveness. Here we compare HPV genotyping by Next Generation Sequencing (NGS) to an established DNA hybridization method. In DNA isolated from urine, the overall analytical sensitivity of NGS was found to be 22% higher than that of hybridization. NGS was also found to be the most specific method and expanded the detection repertoire beyond the 37 types of the DNA hybridization assay. Furthermore, NGS provided an increased resolution by identifying genetic variants of individual HPV types. The same Modified General Primers (MGP)-amplicon was used in both methods. The NGS method is described in detail to facilitate implementation in the clinical microbiology laboratory and includes suggestions for new standards for detection and calling of types and variants with improved resolution. PMID:28045981

  16. HPV Genotyping of Modified General Primer-Amplicons Is More Analytically Sensitive and Specific by Sequencing than by Hybridization.

    PubMed

    Meisal, Roger; Rounge, Trine Ballestad; Christiansen, Irene Kraus; Eieland, Alexander Kirkeby; Worren, Merete Molton; Molden, Tor Faksvaag; Kommedal, Øyvind; Hovig, Eivind; Leegaard, Truls Michael; Ambur, Ole Herman

    2017-01-01

    Sensitive and specific genotyping of human papillomaviruses (HPVs) is important for population-based surveillance of carcinogenic HPV types and for monitoring vaccine effectiveness. Here we compare HPV genotyping by Next Generation Sequencing (NGS) to an established DNA hybridization method. In DNA isolated from urine, the overall analytical sensitivity of NGS was found to be 22% higher than that of hybridization. NGS was also found to be the most specific method and expanded the detection repertoire beyond the 37 types of the DNA hybridization assay. Furthermore, NGS provided an increased resolution by identifying genetic variants of individual HPV types. The same Modified General Primers (MGP)-amplicon was used in both methods. The NGS method is described in detail to facilitate implementation in the clinical microbiology laboratory and includes suggestions for new standards for detection and calling of types and variants with improved resolution.

  17. System, apparatus and methods to implement high-speed network analyzers

    DOEpatents

    Ezick, James; Lethin, Richard; Ros-Giralt, Jordi; Szilagyi, Peter; Wohlford, David E

    2015-11-10

    Systems, apparatus and methods for the implementation of high-speed network analyzers are provided. A set of high-level specifications is used to define the behavior of the network analyzer emitted by a compiler. An optimized inline workflow to process regular expressions is presented without sacrificing the semantic capabilities of the processing engine. An optimized packet dispatcher implements a subset of the functions implemented by the network analyzer, providing a fast and slow path workflow used to accelerate specific processing units. Such a dispatcher facility can also be used as a cache of policies: if a policy is found, the packet manipulations associated with that policy can be performed quickly. An optimized method of generating DFA specifications for network signatures is also presented. The method accepts several optimization criteria, such as min-max allocations or optimal allocations based on the probability of occurrence of each signature input bit.
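
    As a toy illustration of the DFA-based matching that such analyzers compile to, the sketch below builds a table-driven automaton for a single byte signature and scans a payload with it. The signature and payload are hypothetical, and a real analyzer compiles many regular expressions into one optimized DFA, which this sketch does not attempt.

      # Table-driven DFA that scans a payload for one fixed byte signature.
      # State i means "the first i bytes of the signature have been matched";
      # transitions fall back to the longest signature prefix that remains a
      # suffix of the bytes seen so far (the classic KMP-style automaton).
      def build_dfa(sig: bytes):
          table = [dict() for _ in range(len(sig) + 1)]
          for state in range(len(sig)):
              for b in range(256):
                  cand = sig[:state] + bytes([b])
                  k = min(len(cand), len(sig))
                  while k and cand[-k:] != sig[:k]:
                      k -= 1
                  table[state][b] = k
          return table

      def matches(table, payload: bytes, accept_state: int):
          state = 0
          for b in payload:
              state = table[state].get(b, 0)
              if state == accept_state:
                  return True
          return False

      sig = b"GET /admin"                       # hypothetical signature
      dfa = build_dfa(sig)
      print(matches(dfa, b"xxGET /admin HTTP/1.1", len(sig)))   # True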

  18. Use of superparamagnetic beads for the isolation of a peptide with specificity to cymbidium mosaic virus.

    PubMed

    Ooi, Diana Jia Miin; Dzulkurnain, Adriya; Othman, Rofina Yasmin; Lim, Saw Hoon; Harikrishna, Jennifer Ann

    2006-09-01

    A modified method for the rapid isolation of specific ligands to whole virus particles is described. Biopanning against cymbidium mosaic virus was carried out with a commercial 12-mer random peptide display library. A solution phase panning method was devised using streptavidin-coated superparamagnetic beads. The solution based panning method was more efficient than conventional immobilized target panning when using whole viral particles of cymbidium mosaic virus as a target. Enzyme-linked immunosorbent assay of cymbidium mosaic virus-binding peptides isolated from the library identified seven peptides with affinity for cymbidium mosaic virus and one peptide which was specific to cymbidium mosaic virus and had no significant binding to odontoglossum ringspot virus. This method should have broad application for the screening of whole viral particles towards the rapid development of diagnostic reagents without the requirement for cloning and expression of single antigens.

  19. MO-C-18A-01: Advances in Model-Based 3D Image Reconstruction

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, G; Pan, X; Stayman, J

    2014-06-15

    Recent years have seen the emergence of CT image reconstruction techniques that exploit physical models of the imaging system, photon statistics, and even the patient to achieve improved 3D image quality and/or reduction of radiation dose. With numerous advantages in comparison to conventional 3D filtered backprojection, such techniques bring a variety of challenges as well, including: a demanding computational load associated with sophisticated forward models and iterative optimization methods; nonlinearity and nonstationarity in image quality characteristics; a complex dependency on multiple free parameters; and the need to understand how best to incorporate prior information (including patient-specific prior images) within the reconstruction process. The advantages, however, are even greater – for example: improved image quality; reduced dose; robustness to noise and artifacts; task-specific reconstruction protocols; suitability to novel CT imaging platforms and noncircular orbits; and incorporation of known characteristics of the imager and patient that are conventionally discarded. This symposium features experts in 3D image reconstruction, image quality assessment, and the translation of such methods to emerging clinical applications. Dr. Chen will address novel methods for the incorporation of prior information in 3D and 4D CT reconstruction techniques. Dr. Pan will show recent advances in optimization-based reconstruction that enable potential reduction of dose and sampling requirements. Dr. Stayman will describe a “task-based imaging” approach that leverages models of the imaging system and patient in combination with a specification of the imaging task to optimize both the acquisition and reconstruction process. Dr. Samei will describe the development of methods for image quality assessment in such nonlinear reconstruction techniques and the use of these methods to characterize and optimize image quality and dose in a spectrum of clinical applications. Learning Objectives: Learn the general methodologies associated with model-based 3D image reconstruction. Learn the potential advantages in image quality and dose associated with model-based image reconstruction. Learn the challenges associated with computational load and image quality assessment for such reconstruction methods. Learn how imaging task can be incorporated as a means to drive optimal image acquisition and reconstruction techniques. Learn how model-based reconstruction methods can incorporate prior information to improve image quality, ease sampling requirements, and reduce dose.
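
    A minimal sketch of the model-based reconstruction idea discussed in the session: minimize a penalized weighted least-squares objective built on a forward model by gradient descent. The toy dense system matrix, weights, penalty, and step size are assumptions; clinical CT uses large sparse projectors and edge-preserving penalties, which this sketch does not attempt.

      # Penalized weighted least-squares reconstruction on a toy linear system:
      # minimize 0.5 * || y - A x ||_W^2 + beta * || x ||^2 by gradient descent.
      import numpy as np

      rng = np.random.default_rng(0)
      n_pix, n_meas = 64, 128
      x_true = rng.random(n_pix)
      A = rng.normal(size=(n_meas, n_pix)) / np.sqrt(n_pix)   # toy forward model
      y = A @ x_true + 0.01 * rng.standard_normal(n_meas)     # noisy measurements
      w = np.full(n_meas, 1.0)                                 # statistical weights
      beta = 0.05                                              # quadratic penalty strength

      x = np.zeros(n_pix)
      step = 0.1
      for _ in range(500):
          grad = A.T @ (w * (A @ x - y)) + 2.0 * beta * x
          x -= step * grad

      print("relative error:", np.linalg.norm(x - x_true) / np.linalg.norm(x_true))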

  20. Betweenness-Based Method to Identify Critical Transmission Sectors for Supply Chain Environmental Pressure Mitigation.

    PubMed

    Liang, Sai; Qu, Shen; Xu, Ming

    2016-02-02

    To develop industry-specific policies for mitigating environmental pressures, previous studies primarily focus on identifying sectors that directly generate large amounts of environmental pressures (a.k.a. production-based method) or indirectly drive large amounts of environmental pressures through supply chains (e.g., consumption-based method). In addition to those sectors as important environmental pressure producers or drivers, there exist sectors that are also important to environmental pressure mitigation as transmission centers. Economy-wide environmental pressure mitigation might be achieved by improving the production efficiency of these key transmission sectors, that is, using fewer upstream inputs to produce a unit of output. We develop a betweenness-based method to measure the importance of transmission sectors, borrowing the betweenness concept from network analysis. We quantify the betweenness of sectors by examining supply chain paths extracted from structural path analysis that pass through a particular sector. We take China as an example and find that the critical transmission sectors identified by the betweenness-based method are not always identifiable by existing methods. This indicates that the betweenness-based method can provide additional insights, not obtainable with existing methods, on the roles individual sectors play in generating economy-wide environmental pressures. The betweenness-based method proposed here can therefore complement existing methods for guiding sector-level environmental pressure mitigation strategies.
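
    A toy illustration of the betweenness measure described above: expand the supply chain into short paths via the technical-coefficient matrix (as in structural path analysis), weight each path by the emitter's pressure intensity, the chained coefficients, and final demand, and credit that weight to every intermediate sector on the path. The three-sector data and the path-length cap are hypothetical.

      # Betweenness-style score for sectors in a small input-output system.
      import numpy as np
      from itertools import product

      A = np.array([[0.10, 0.30, 0.05],     # technical coefficients a_ij (hypothetical)
                    [0.20, 0.05, 0.25],
                    [0.05, 0.10, 0.10]])
      f = np.array([100.0, 50.0, 80.0])     # final demand by sector
      e = np.array([0.9, 0.4, 0.2])         # direct pressure per unit output

      n, L = A.shape[0], 3
      betweenness = np.zeros(n)
      for length in range(2, L + 1):                      # paths with >= 1 intermediate node
          for path in product(range(n), repeat=length + 1):
              weight = e[path[0]] * f[path[-1]]           # emitter intensity x final demand
              for a, b in zip(path[:-1], path[1:]):
                  weight *= A[a, b]                       # chain of technical coefficients
              for mid in path[1:-1]:                      # credit intermediate sectors only
                  betweenness[mid] += weight

      print("sector betweenness (toy):", np.round(betweenness, 2))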

  1. Specific detection and identification of [Actinobacillus] muris by PCR using primers targeting the 16S-23S rRNA internal transcribed spacer regions.

    PubMed

    Benga, Laurentiu; Benten, W Peter M; Engelhardt, Eva; Gougoula, Christina; Sager, Martin

    2013-08-01

    [Actinobacillus] muris is, along with [Pasteurella] pneumotropica, the most prevalent Pasteurellaceae species isolated from the laboratory mouse. Despite the biological and economic importance of Pasteurellaceae in relation to experimental animals, no molecular methods for the identification of [A.] muris are available. The aim of the present investigation was to develop a PCR method allowing detection and identification of [A.] muris. In this assay, a Pasteurellaceae common forward primer based on a conserved region of the 16S rRNA gene was used in conjunction with two different reverse primers specific for [A.] muris, targeting the 16S-23S internal transcribed spacer sequences. The specificity of the assay was tested against 78 reference and clinical isolates of Pasteurellaceae, including 37 strains of [A.] muris. In addition, eight other mouse-associated bacterial species that could pose a diagnostic problem were included. The assay showed 100% sensitivity and 97.95% specificity. Identification of the clinical isolates was validated by ITS profiling and, when necessary, by 16S rRNA sequencing. This multiplex PCR represents the first molecular tool able to detect [A.] muris and may become a reliable alternative to the present diagnostic methods. Copyright © 2013 Elsevier B.V. All rights reserved.

  2. A simple real-time polymerase chain reaction (PCR)-based assay for authentication of the Chinese Panax ginseng cultivar Damaya from a local ginseng population.

    PubMed

    Wang, H; Wang, J; Li, G

    2016-06-27

    Panax ginseng is one of the most important medicinal plants in the Orient. Owing to its increasing demand in the world market, cultivated ginseng has become the main source of medicinal material. Among the Chinese ginseng cultivars, Damaya commands higher prices and is grown in significant proportions among the local ginseng population. Due to the lack of rapid and accurate authentication methods, Damaya is distributed among different cultivars in the local ginseng population in China. Here, we identified a unique, Damaya-specific single nucleotide polymorphism (SNP) site present in the second intron of mitochondrial cytochrome c oxidase subunit 2 (cox2). Based on this SNP, a Damaya cultivar-specific primer was designed and an allele-specific polymerase chain reaction (PCR) was optimized for the effective molecular authentication of Damaya. We designed a method combining a simple DNA isolation protocol with real-time allele-specific PCR using SYBR Green I fluorescent dye, and demonstrated its efficacy in clearly discriminating the Damaya cultivar from other Chinese ginseng cultivars in the allelic discrimination analysis. Hence, this study provides a simple and rapid assay for the differentiation and conservation of Damaya within the local Chinese ginseng population.

  3. The Pixon Method for Data Compression, Image Classification, and Image Reconstruction

    NASA Technical Reports Server (NTRS)

    Puetter, Richard; Yahil, Amos

    2002-01-01

    As initially proposed, this program had three goals: (1) continue to develop the highly successful Pixon method for image reconstruction and support other scientists in implementing this technique for their applications; (2) develop image compression techniques based on the Pixon method; and (3) develop artificial intelligence algorithms for image classification based on the Pixon approach for simplifying neural networks. Subsequent to proposal review, the scope of the program was greatly reduced, and it was decided to investigate the ability of the Pixon method to provide superior restorations of images compressed with standard image compression schemes, specifically JPEG-compressed images.

  4. Comparison of Three Different Hepatitis C Virus Genotyping Methods: 5'NCR PCR-RFLP, Core Type-Specific PCR, and NS5b Sequencing in a Tertiary Care Hospital in South India.

    PubMed

    Daniel, Hubert D-J; David, Joel; Raghuraman, Sukanya; Gnanamony, Manu; Chandy, George M; Sridharan, Gopalan; Abraham, Priya

    2017-05-01

    Based on genetic heterogeneity, hepatitis C virus (HCV) is classified into seven major genotypes and 64 subtypes. In spite of the sequence heterogeneity, all genotypes share an identical complement of colinear genes within the large open reading frame. The genetic interrelationships between these genes are consistent among genotypes. Due to this property, complete sequencing of the HCV genome is not required. HCV genotypes along with subtypes are critical for planning antiviral therapy. Certain genotypes are also associated with higher progression to liver cirrhosis. In this study, 100 blood samples were collected from individuals who came for routine HCV genotype identification. These samples were used for the comparison of two different genotyping methods (5'NCR PCR-RFLP and HCV core type-specific PCR) with NS5b sequencing. Of the 100 samples genotyped using 5'NCR PCR-RFLP and HCV core type-specific PCR, 90% (κ = 0.913, P < 0.00) and 96% (κ = 0.794, P < 0.00) correlated with NS5b sequencing, respectively. Sixty percent and 75% of discordant samples by 5'NCR PCR-RFLP and HCV core type-specific PCR, respectively, belonged to genotype 6. All the HCV genotype 1 subtypes were classified accurately by both the methods. This study shows that the 5'NCR-based PCR-RFLP and the HCV core type-specific PCR-based assays correctly identified HCV genotypes except genotype 6 from this region. Direct sequencing of the HCV core region was able to identify all the genotype 6 from this region and serves as an alternative to NS5b sequencing. © 2016 Wiley Periodicals, Inc.
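
    For readers unfamiliar with the agreement statistics quoted above, the toy sketch below (hypothetical genotype calls, scikit-learn assumed available) shows how percent agreement and Cohen's kappa between a rapid assay and NS5b sequencing could be computed; it does not use the study's actual data.

```python
# Illustrative only: quantifying agreement between a rapid genotyping assay and
# NS5b sequencing with percent agreement and Cohen's kappa (hypothetical labels).
from sklearn.metrics import cohen_kappa_score

ns5b = ["1a", "1b", "3a", "6", "6", "1b", "3a", "1a"]   # reference genotypes
rflp = ["1a", "1b", "3a", "1b", "6", "1b", "3a", "1a"]  # 5'NCR PCR-RFLP calls

agreement = sum(a == b for a, b in zip(ns5b, rflp)) / len(ns5b)
kappa = cohen_kappa_score(ns5b, rflp)
print(f"agreement = {agreement:.0%}, kappa = {kappa:.2f}")
```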

  5. Microplate-based method to screen inhibitors of isozymes of cyclic nucleotide phosphodiesterase fused to SUMO.

    PubMed

    Chen, Chunyan; Liu, Miaomiao; Wu, Jing; Yang, Xiaolan; Hu, Xiaolei; Pu, Jun; Long, Gaobo; Xie, Yanling; Jiang, Hairong; Yuan, Yonghua; Liao, Fei

    2014-12-01

    The feasibility of microplate-based screening of inhibitors of isozymes of cyclic nucleotide phosphodiesterase (PDE) was tested via the coupled action of a phosphatase on adenosine-5'-monophosphate and an improved malachite green assay of phosphate. Human full-length PDE4B2 and a truncated mutant (152-528aa) were expressed in Escherichia coli via fusion to SUMO, which after purification through a Ni-NTA column exhibited specific activities >0.017 U mg⁻¹. In the presence of proteins <30 mg L⁻¹, absorbance for 10 µM phosphate was measurable; a PDE isozyme with a specific activity over 0.008 U mg⁻¹ after a 20 min reaction was thus suitable for microplate-based screening of inhibitors. Using a Biotek ELX 800 microplate reader, the affinities of the two forms of PDE4B2 for cAMP, rolipram, and papaverine varied over three orders of magnitude and were consistent with those obtained by the routine assay. Hence, the proposed method is promising for high-throughput screening of inhibitors of phosphate-releasing enzymes with specific activities over 0.008 U mg⁻¹.

  6. Lab-based validation of different data processing methods for wrist-worn ActiGraph accelerometers in young adults.

    PubMed

    Ellingson, Laura D; Hibbing, Paul R; Kim, Youngwon; Frey-Law, Laura A; Saint-Maurice, Pedro F; Welk, Gregory J

    2017-06-01

    The wrist is increasingly being used as the preferred site for objectively assessing physical activity, but the relative accuracy of processing methods for wrist data has not been determined. This study evaluates the validity of four processing methods for wrist-worn ActiGraph (AG) data against energy expenditure (EE) measured using a portable metabolic analyzer (OM; Oxycon mobile) and the Compendium of physical activity. Fifty-one adults (ages 18-40) completed 15 activities ranging from sedentary to vigorous in a laboratory setting while wearing an AG and the OM. Estimates of EE and categorization of activity intensity were obtained from the AG using a linear method based on Hildebrand cutpoints (HLM), a non-linear modification of this method (HNLM), and two methods developed by Staudenmayer based on a Linear Model (SLM) and using random forest (SRF). Estimated EE and classification accuracy were compared to the OM and Compendium using Bland-Altman plots, equivalence testing, mean absolute percent error (MAPE), and Kappa statistics. Overall, classification agreement with the Compendium was similar across methods, ranging from a Kappa of 0.46 (HLM) to 0.54 (HNLM). However, specificity and sensitivity varied by method and intensity, ranging from a sensitivity of 0% (HLM for sedentary) to a specificity of ~99% for all methods for vigorous. None of the methods was significantly equivalent to the OM (p > 0.05). Across activities, none of the methods evaluated had a high level of agreement with criterion measures. Additional research is needed to further refine the accuracy of processing wrist-worn accelerometer data.

  7. OPTIMIZATION OF RAMAN SPECTROSCOPY FOR SPECIATION OF ORGANICS IN WATER

    EPA Science Inventory

    We describe herein a method for determining constants for simultaneously occurring, site-specific "microequilibria" (as with tautomers) for organics in water. The method is based in part on modeling temperature-variant Raman spectra according to the van't Hoff equation. Spectra a...

  8. LABORATORY TOXICITY TESTS FOR EVALUATING POTENTIAL EFFECTS OF ENDOCRINE-DISRUPTING COMPOUNDS

    EPA Science Inventory

    The scope of the Laboratory Testing Work Group was to evaluate methods for testing aquatic and terrestrial invertebrates in the laboratory. Specifically, discussions focused on the following objectives: 1) assess the extent to which consensus-based standard methods and other pub...

  9. Cross-Validation of Survival Bump Hunting by Recursive Peeling Methods.

    PubMed

    Dazard, Jean-Eudes; Choe, Michael; LeBlanc, Michael; Rao, J Sunil

    2014-08-01

    We introduce a survival/risk bump hunting framework to build a bump hunting model with a possibly censored time-to-event type of response and to validate model estimates. First, we describe the use of adequate survival peeling criteria to build a survival/risk bump hunting model based on recursive peeling methods. Our method called "Patient Recursive Survival Peeling" is a rule-induction method that makes use of specific peeling criteria such as hazard ratio or log-rank statistics. Second, to validate our model estimates and improve survival prediction accuracy, we describe a resampling-based validation technique specifically designed for the joint task of decision rule making by recursive peeling (i.e. decision-box) and survival estimation. This alternative technique, called "combined" cross-validation is done by combining test samples over the cross-validation loops, a design allowing for bump hunting by recursive peeling in a survival setting. We provide empirical results showing the importance of cross-validation and replication.

  10. Cross-Validation of Survival Bump Hunting by Recursive Peeling Methods

    PubMed Central

    Dazard, Jean-Eudes; Choe, Michael; LeBlanc, Michael; Rao, J. Sunil

    2015-01-01

    We introduce a survival/risk bump hunting framework to build a bump hunting model with a possibly censored time-to-event type of response and to validate model estimates. First, we describe the use of adequate survival peeling criteria to build a survival/risk bump hunting model based on recursive peeling methods. Our method called “Patient Recursive Survival Peeling” is a rule-induction method that makes use of specific peeling criteria such as hazard ratio or log-rank statistics. Second, to validate our model estimates and improve survival prediction accuracy, we describe a resampling-based validation technique specifically designed for the joint task of decision rule making by recursive peeling (i.e. decision-box) and survival estimation. This alternative technique, called “combined” cross-validation is done by combining test samples over the cross-validation loops, a design allowing for bump hunting by recursive peeling in a survival setting. We provide empirical results showing the importance of cross-validation and replication. PMID:26997922

  11. Electrophoretic mobility shift scanning using an automated infrared DNA sequencer.

    PubMed

    Sano, M; Ohyama, A; Takase, K; Yamamoto, M; Machida, M

    2001-11-01

    Electrophoretic mobility shift assay (EMSA) is widely used in the study of sequence-specific DNA-binding proteins, including transcription factors and mismatch binding proteins. We have established a non-radioisotope-based protocol for EMSA that features an automated DNA sequencer with an infrared fluorescent dye (IRDye) detection unit. Our modification of the electrophoresis unit, which includes cooling the gel plates with a reduced well-to-read length, has made it possible to detect shifted bands within 1 h. Further, we have developed a rapid ligation-based method for generating IRDye-labeled probes with an approximately 60% cost reduction. This method has the advantages of real-time scanning, stability of labeled probes, and better safety associated with nonradioactive methods of detection. Analysis of a promoter from an industrially important filamentous fungus, Aspergillus oryzae, in a prototype experiment revealed that the method we describe has potential for use in systematic scanning and identification of the functionally important elements to which cellular factors bind in a sequence-specific manner.

  12. Dose escalation methods in phase I cancer clinical trials.

    PubMed

    Le Tourneau, Christophe; Lee, J Jack; Siu, Lillian L

    2009-05-20

    Phase I clinical trials are an essential step in the development of anticancer drugs. The main goal of these studies is to establish the recommended dose and/or schedule of new drugs or drug combinations for phase II trials. The guiding principle for dose escalation in phase I trials is to avoid exposing too many patients to subtherapeutic doses while preserving safety and maintaining rapid accrual. Here we review dose escalation methods for phase I trials, including the rule-based and model-based dose escalation methods that have been developed to evaluate new anticancer agents. Toxicity has traditionally been the primary endpoint for phase I trials involving cytotoxic agents. However, with the emergence of molecularly targeted anticancer agents, potential alternative endpoints to delineate optimal biological activity, such as plasma drug concentration and target inhibition in tumor or surrogate tissues, have been proposed along with new trial designs. We also describe specific methods for drug combinations as well as methods that use a time-to-event endpoint or both toxicity and efficacy as endpoints. Finally, we present the advantages and drawbacks of the various dose escalation methods and discuss specific applications of the methods in developmental oncotherapeutics.

  13. Highly sensitive fluorescence detection of metastatic lymph nodes of gastric cancer with photo-oxidation of protoporphyrin IX.

    PubMed

    Koizumi, N; Harada, Y; Beika, M; Minamikawa, T; Yamaoka, Y; Dai, P; Murayama, Y; Yanagisawa, A; Otsuji, E; Tanaka, H; Takamatsu, T

    2016-08-01

    The establishment of a precise and rapid method to detect metastatic lymph nodes (LNs) is essential to perform less invasive surgery with reduced gastrectomy along with reduced lymph node dissection. We herein describe a novel imaging strategy to detect 5-aminolevulinic acid (5-ALA)-induced protoporphyrin IX (PpIX) fluorescence in excised LNs specifically with reduced effects of tissue autofluorescence based on photo-oxidation of PpIX. We applied the method in a clinical setting, and evaluated its feasibility. To reduce the unfavorable effect of autofluorescence, we focused on photo-oxidation of PpIX: Following light irradiation, PpIX changes into another substance, photo-protoporphyrin, via an oxidative process, which has a different spectral peak, at 675 nm, whereas PpIX has its spectral peak at 635 nm. Based on the unique spectral alteration, fluorescence spectral imaging before and after light irradiation and subsequent originally-developed image processing was performed. Following in vitro study, we applied this method to a total of 662 excised LNs obtained from 30 gastric cancer patients administered 5-ALA preoperatively. Specific visualization of PpIX was achieved in in vitro study. The method allowed highly sensitive detection of metastatic LNs, with sensitivity of 91.9% and specificity of 90.8% in the in vivo clinical trial. Receiver operating characteristic analysis indicated high diagnostic accuracy, with the area under the curve of 0.926. We established a highly sensitive and specific 5-ALA-induced fluorescence imaging method applicable in clinical settings. The novel method has a potential to become a useful tool for intraoperative rapid diagnosis of LN metastasis. Copyright © 2016 Elsevier Ltd. All rights reserved.

  14. Diagnostic algorithm for detection of targetable driver mutations in lung adenocarcinomas: Comprehensive analyses of 205 cases with immunohistochemistry, real-time PCR and fluorescence in situ hybridization methods.

    PubMed

    Kao, Hua-Lin; Yeh, Yi-Chen; Lin, Chin-Hsuan; Hsu, Wei-Fang; Hsieh, Wen-Yu; Ho, Hsiang-Ling; Chou, Teh-Ying

    2016-11-01

    Analysis of the targetable driver mutations is now recommended in all patients with advanced lung adenocarcinoma. Molecular-based methods are usually adopted; however, along with the implementation of highly sensitive and/or mutation-specific antibodies, immunohistochemistry (IHC) has been considered an alternative method for identifying driver mutations in lung adenocarcinomas. A total of 205 lung adenocarcinomas were examined for EGFR mutations and ALK and ROS1 rearrangements using real-time PCR, fluorescence in situ hybridization (FISH) and IHC in parallel. The performance of different commercially available IHC antibody clones toward targetable driver mutations was evaluated. The association between these driver mutations and clinicopathological characteristics was also analyzed. In the 205 cases we studied, 58.5% were found to harbor EGFR mutations, 6.3% ALK rearrangements and 1.0% ROS1 rearrangements. Compared to molecular-based methods, IHC of EGFR mutations showed excellent specificity but suboptimal sensitivity, while IHC of ALK and ROS1 rearrangements demonstrated high sensitivity and specificity. No significant difference regarding the performance of different antibody clones toward these driver mutations was observed, except that clone SP125 showed a higher sensitivity than 43B2 in the detection of p.L858R of EGFR. In circumstances such as poor quality of nucleic acids or low content of tumor cells, IHC with EGFR mutation-specific antibodies could be used as an alternative method. Patients negative for EGFR mutations are subjected to further analysis of ALK and ROS1 rearrangements using IHC methods. Herein, we proposed a lung adenocarcinoma testing algorithm for the application of IHC in therapeutic diagnosis. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  15. Comparison of methods for evaluating ground-contact copper preservative depletion

    Treesearch

    Stan Lebow; Steven Halverson

    2008-01-01

    Depletion of the biocide(s) used to treat wood has a major influence on service life and environmental concerns. However, little is known about how the measured extent of depletion depends on the specific leaching method employed. Wood treated with two types of copper-based preservatives was leached using three different methods: field stakes (American Wood Protection Association (AWPA...

  16. Restriction-Site-Specific PCR as a Rapid Test To Detect Enterohemorrhagic Escherichia coli O157:H7 Strains in Environmental Samples

    PubMed Central

    Kimura, Richard; Mandrell, Robert E.; Galland, John C.; Hyatt, Doreene; Riley, Lee W.

    2000-01-01

    Enterohemorrhagic Escherichia coli (EHEC) O157:H7 is an important food-borne pathogen in industrialized countries. We developed a rapid and simple test for detecting E. coli O157:H7 using a method based on restriction site polymorphisms. Restriction-site-specific PCR (RSS-PCR) involves the amplification of DNA fragments using primers based on specific restriction enzyme recognition sequences, without the use of endonucleases, to generate a set of amplicons that yield “fingerprint” patterns when resolved electrophoretically on an agarose gel. The method was evaluated in a blinded study of E. coli isolates obtained from environmental samples collected at beef cattle feedyards. The 54 isolates were all initially identified by a commonly used polyclonal antibody test as belonging to O157:H7 serotype. They were retested by anti-O157 and anti-H7 monoclonal antibody enzyme-linked immunosorbent assay (ELISA). The RSS-PCR method identified all 28 isolates that were shown to be E. coli O157:H7 by the monoclonal antibody ELISA as belonging to the O157:H7 serotype. Of the remaining 26 ELISA-confirmed non-O157:H7 strains, the method classified 25 strains as non-O157:H7. The specificity of the RSS-PCR results correlated better with the monoclonal antibody ELISA than with the polyclonal antibody latex agglutination tests. The RSS-PCR method may be a useful test to distinguish E. coli O157:H7 from a large number of E. coli isolates from environmental samples. PMID:10831431

  17. Localization and force analysis at the single virus particle level using atomic force microscopy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, Chih-Hao; Horng, Jim-Tong; Chang, Jeng-Shian

    2012-01-06

    Highlights: localization of single virus particles; force measurements; force mapping. Abstract: Atomic force microscopy (AFM) is a vital instrument in nanobiotechnology. In this study, we developed a method that enables AFM to simultaneously measure specific unbinding force and map the viral glycoprotein at the single virus particle level. The average diameter of virus particles from AFM images and the specificity between the viral surface antigen and antibody probe were integrated to design a three-stage method that sets the measuring area to a single virus particle before obtaining the force measurements; the influenza virus was used as the object of measurement. Based on the proposed method and the performed analysis, several findings can be derived from the results. The mean unbinding force of a single virus particle can be quantified, and no significant difference exists in this value among virus particles. Furthermore, the repeatability of the proposed method is demonstrated. The force mapping images reveal that the distributions of surface viral antigens recognized by the antibody probe were dispersed over the whole surface of individual virus particles under the proposed method and experimental criteria; meanwhile, the binding probabilities are similar among particles. This approach can be easily applied to most AFM systems without specific components or configurations. These results help understand force-based analysis at the single virus particle level and can therefore reinforce the capability of AFM to investigate a specific type of viral surface protein and its distribution.

  18. Nested PCR Assay for Eight Pathogens: A Rapid Tool for Diagnosis of Bacterial Meningitis.

    PubMed

    Bhagchandani, Sharda P; Kubade, Sushant; Nikhare, Priyanka P; Manke, Sonali; Chandak, Nitin H; Kabra, Dinesh; Baheti, Neeraj N; Agrawal, Vijay S; Sarda, Pankaj; Mahajan, Parikshit; Ganjre, Ashish; Purohit, Hemant J; Singh, Lokendra; Taori, Girdhar M; Daginawala, Hatim F; Kashyap, Rajpal S

    2016-02-01

    Bacterial meningitis is a dreadful infectious disease with high mortality and morbidity if it remains undiagnosed. Traditional diagnostic methods for bacterial meningitis pose a challenge in accurate identification of the pathogen, making prognosis difficult. The present study therefore aimed to design and evaluate a specific and sensitive nested 16S rDNA genus-based polymerase chain reaction (PCR) assay using clinical cerebrospinal fluid (CSF) for rapid diagnosis of eight pathogens causing the disease. The present work was dedicated to the development of an in-house genus-specific 16S rDNA nested PCR covering pathogens of eight genera responsible for bacterial meningitis, using newly designed as well as literature-based primers for each genus. A total of 150 CSF samples from suspected meningitis patients admitted to the Central India Institute of Medical Sciences (CIIMS), India, during the period from August 2011 to May 2014 were used to evaluate the clinical sensitivity and specificity of the optimized PCR assays. The analytical sensitivity and specificity of our newly designed genus-specific 16S rDNA PCR were found to be ≥92%. With such high sensitivity and specificity, our in-house nested PCR was able to give 100% sensitivity in clinically confirmed positive cases and 100% specificity in clinically confirmed negative cases, indicating its applicability in clinical diagnosis. Our in-house nested PCR system can therefore identify the exact pathogen causing bacterial meningitis and thus be useful in selecting a specific treatment line to minimize morbidity. Results are obtained within 24 h, and the high sensitivity makes this nested PCR assay a rapid and accurate diagnostic tool compared with traditional culture-based methods.

  19. Using ROC Curves to Choose Minimally Important Change Thresholds when Sensitivity and Specificity Are Valued Equally: The Forgotten Lesson of Pythagoras. Theoretical Considerations and an Example Application of Change in Health Status

    PubMed Central

    Froud, Robert; Abel, Gary

    2014-01-01

    Background Receiver Operator Characteristic (ROC) curves are being used to identify Minimally Important Change (MIC) thresholds on scales that measure a change in health status. In quasi-continuous patient reported outcome measures, such as those that measure changes in chronic diseases with variable clinical trajectories, sensitivity and specificity are often valued equally. Notwithstanding methodologists agreeing that these should be valued equally, different approaches have been taken to estimating MIC thresholds using ROC curves. Aims and objectives We aimed to compare the different approaches used with a new approach, exploring the extent to which the methods choose different thresholds, and considering the effect of differences on conclusions in responder analyses. Methods Using graphical methods, hypothetical data, and data from a large randomised controlled trial of manual therapy for low back pain, we compared two existing approaches with a new approach that is based on the addition of the sums of squares of 1-sensitivity and 1-specificity. Results There can be divergence in the thresholds chosen by different estimators. The cut-point selected by different estimators is dependent on the relationship between the cut-points in ROC space and the different contours described by the estimators. In particular, asymmetry and the number of possible cut-points affects threshold selection. Conclusion Choice of MIC estimator is important. Different methods for choosing cut-points can lead to materially different MIC thresholds and thus affect results of responder analyses and trial conclusions. An estimator based on the smallest sum of squares of 1-sensitivity and 1-specificity is preferable when sensitivity and specificity are valued equally. Unlike other methods currently in use, the cut-point chosen by the sum of squares method always and efficiently chooses the cut-point closest to the top-left corner of ROC space, regardless of the shape of the ROC curve. PMID:25474472
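
    A minimal sketch of the sum-of-squares rule advocated above, using scikit-learn's roc_curve on made-up change scores and anchor labels: the threshold whose ROC point minimizes (1 - sensitivity)^2 + (1 - specificity)^2 is, by construction, the one closest to the top-left corner of ROC space.

```python
# Minimal sketch of the sum-of-squares cut-point rule described above:
# pick the threshold whose ROC point lies closest to the top-left corner,
# i.e. minimise (1 - sensitivity)^2 + (1 - specificity)^2.
# The toy data and variable names are illustrative, not from the trial.
import numpy as np
from sklearn.metrics import roc_curve

change_score = np.array([1, 3, 5, 2, 8, 9, 4, 7, 6, 10])   # observed change on the scale
improved     = np.array([0, 0, 1, 0, 1, 1, 0, 1, 1, 1])    # external anchor: 1 = improved

fpr, tpr, thresholds = roc_curve(improved, change_score)
d2 = (1 - tpr) ** 2 + fpr ** 2          # fpr = 1 - specificity
mic = thresholds[np.argmin(d2)]
print("MIC threshold closest to (0, 1):", mic)
```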

  20. Characterization of Chronic Aortic and Mitral Regurgitation Undergoing Valve Surgery Using Cardiovascular Magnetic Resonance.

    PubMed

    Polte, Christian L; Gao, Sinsia A; Johnsson, Åse A; Lagerstrand, Kerstin M; Bech-Hanssen, Odd

    2017-06-15

    Grading of chronic aortic regurgitation (AR) and mitral regurgitation (MR) by cardiovascular magnetic resonance (CMR) is currently based on thresholds, which are neither modality nor quantification method specific. Accordingly, this study sought to identify CMR-specific and quantification method-specific thresholds for regurgitant volumes (RVols), RVol indexes, and regurgitant fractions (RFs), which denote severe chronic AR or MR with an indication for surgery. The study comprised patients with moderate and severe chronic AR (n = 38) and MR (n = 40). Echocardiography and CMR were performed at baseline and in all operated AR/MR patients (n = 23/25) 10 ± 1 months after surgery. CMR quantification of AR: direct (aortic flow) and indirect method (left ventricular stroke volume [LVSV] - pulmonary stroke volume [PuSV]); MR: 2 indirect methods (LVSV - aortic forward flow [AoFF]; mitral inflow [MiIF] - AoFF). All operated patients had severe regurgitation and benefited from surgery, indicated by a significant postsurgical reduction in end-diastolic volume index and improvement or relief of symptoms. The discriminatory ability between moderate and severe AR was strong for RVol >40 ml, RVol index >20 ml/m², and RF >30% (direct method) and RVol >62 ml, RVol index >31 ml/m², and RF >36% (LVSV-PuSV) with a negative likelihood ratio ≤ 0.2. In MR, the discriminatory ability was very strong for RVol >64 ml, RVol index >32 ml/m², and RF >41% (LVSV-AoFF) and RVol >40 ml, RVol index >20 ml/m², and RF >30% (MiIF-AoFF) with a negative likelihood ratio < 0.1. In conclusion, CMR grading of chronic AR and MR should be based on modality-specific and quantification method-specific thresholds, as they differ largely from recognized guideline criteria, to assure appropriate clinical decision-making and timing of surgery. Copyright © 2017 Elsevier Inc. All rights reserved.

  1. Statistical Methods for Proteomic Biomarker Discovery based on Feature Extraction or Functional Modeling Approaches.

    PubMed

    Morris, Jeffrey S

    2012-01-01

    In recent years, developments in molecular biotechnology have led to the increased promise of detecting and validating biomarkers, or molecular markers that relate to various biological or medical outcomes. Proteomics, the direct study of proteins in biological samples, plays an important role in the biomarker discovery process. These technologies produce complex, high-dimensional functional and image data that present many analytical challenges that must be addressed properly for effective comparative proteomics studies that can yield potential biomarkers. Specific challenges include experimental design, preprocessing, feature extraction, and statistical analysis accounting for the inherent multiple testing issues. This paper reviews various computational aspects of comparative proteomic studies, and summarizes contributions that I, along with numerous collaborators, have made. First, there is an overview of comparative proteomics technologies, followed by a discussion of important experimental design and preprocessing issues that must be considered before statistical analysis can be done. Next, the two key approaches to analyzing proteomics data, feature extraction and functional modeling, are described. Feature extraction involves detection and quantification of discrete features like peaks or spots that theoretically correspond to different proteins in the sample. After an overview of the feature extraction approach, specific methods for mass spectrometry (Cromwell) and 2D gel electrophoresis (Pinnacle) are described. The functional modeling approach involves modeling the proteomic data in their entirety as functions or images. A general discussion of the approach is followed by the presentation of a specific method that can be applied, wavelet-based functional mixed models, and its extensions. All methods are illustrated by application to two example proteomic data sets, one from mass spectrometry and one from 2D gel electrophoresis. While the specific methods presented are applied to two specific proteomic technologies, MALDI-TOF and 2D gel electrophoresis, these methods and the other principles discussed in the paper apply much more broadly to other expression proteomics technologies.

  2. Prospective regularization design in prior-image-based reconstruction

    NASA Astrophysics Data System (ADS)

    Dang, Hao; Siewerdsen, Jeffrey H.; Webster Stayman, J.

    2015-12-01

    Prior-image-based reconstruction (PIBR) methods leveraging patient-specific anatomical information from previous imaging studies and/or sequences have demonstrated dramatic improvements in dose utilization and image quality for low-fidelity data. However, a proper balance of information from the prior images and information from the measurements is required (e.g. through careful tuning of regularization parameters). Inappropriate selection of reconstruction parameters can lead to detrimental effects including false structures and failure to improve image quality. Traditional methods based on heuristics are subject to error and sub-optimal solutions, while exhaustive searches require a large number of computationally intensive image reconstructions. In this work, we propose a novel method that prospectively estimates the optimal amount of prior image information for accurate admission of specific anatomical changes in PIBR without performing full image reconstructions. This method leverages an analytical approximation to the implicitly defined PIBR estimator, and introduces a predictive performance metric leveraging this analytical form and knowledge of a particular presumed anatomical change whose accurate reconstruction is sought. Additionally, since model-based PIBR approaches tend to be space-variant, a spatially varying prior image strength map is proposed to optimally admit changes everywhere in the image (eliminating the need to know change locations a priori). Studies were conducted in both an ellipse phantom and a realistic thorax phantom emulating a lung nodule surveillance scenario. The proposed method demonstrated accurate estimation of the optimal prior image strength while achieving a substantial computational speedup (about a factor of 20) compared to traditional exhaustive search. Moreover, the use of the proposed prior strength map in PIBR demonstrated accurate reconstruction of anatomical changes without foreknowledge of change locations in phantoms where the optimal parameters vary spatially by an order of magnitude or more. In a series of studies designed to explore potential unknowns associated with accurate PIBR, optimal prior image strength was found to vary with attenuation differences associated with anatomical change but exhibited only small variations as a function of the shape and size of the change. The results suggest that, given a target change attenuation, prospective patient-, change-, and data-specific customization of the prior image strength can be performed to ensure reliable reconstruction of specific anatomical changes.

  3. [Development of molecular detection of food-borne pathogenic bacteria using miniaturized microfluidic devices].

    PubMed

    Iván, Kristóf; Maráz, Anna

    2015-12-20

    Detection and identification of food-borne pathogenic bacteria are key points for the assurance of microbiological food safety. Traditional culture-based methods are increasingly replaced by or supplemented with nucleic acid-based molecular techniques, targeting specific (preferably virulence) genes in the genomes. Internationally validated DNA amplification methods - most frequently real-time polymerase chain reaction - are applied by food microbiological testing laboratories for routine analysis, which not only shortens the time to results but also improves the performance characteristics (e.g. sensitivity, specificity) of the methods. Besides the numerous advantages of polymerase chain reaction-based techniques for routine microbiological analysis, certain drawbacks have to be mentioned, such as the high cost of the equipment and reagents, as well as the risk of contamination of the laboratory environment by the polymerase chain reaction amplicons, which requires construction of an isolated laboratory system. Lab-on-a-chip systems can integrate most of these laboratory processes within a miniaturized device that delivers the same specificity and reliability as the standard protocols. The benefits of miniaturized devices are: simple - often automated - use, small overall size, portability, and sterility due to the possibility of single use. These miniaturized rapid diagnostic tests are being researched and developed at the best research centers around the globe, implementing various sample preparation and molecular DNA amplification methods on-chip. In parallel, the aim of the authors' research is to develop microfluidic Lab-on-a-chip devices for the detection and identification of food-borne pathogenic bacteria.

  4. Supervised DNA Barcodes species classification: analysis, comparisons and results

    PubMed Central

    2014-01-01

    Background Specific fragments, coming from short portions of DNA (e.g., mitochondrial, nuclear, and plastid sequences), have been defined as DNA Barcode and can be used as markers for organisms of the main life kingdoms. Species classification with DNA Barcode sequences has been proven effective on different organisms. Indeed, specific gene regions have been identified as Barcode: COI in animals, rbcL and matK in plants, and ITS in fungi. The classification problem assigns an unknown specimen to a known species by analyzing its Barcode. This task has to be supported with reliable methods and algorithms. Methods In this work the efficacy of supervised machine learning methods to classify species with DNA Barcode sequences is shown. The Weka software suite, which includes a collection of supervised classification methods, is adopted to address the task of DNA Barcode analysis. Classifier families are tested on synthetic and empirical datasets belonging to the animal, fungus, and plant kingdoms. In particular, the function-based method Support Vector Machines (SVM), the rule-based RIPPER, the decision tree C4.5, and the Naïve Bayes method are considered. Additionally, the classification results are compared with respect to ad-hoc and well-established DNA Barcode classification methods. Results A software tool that converts the DNA Barcode FASTA sequences to the Weka format is released, to adapt different input formats and to allow the execution of the classification procedure. The analysis of results on synthetic and real datasets shows that SVM and Naïve Bayes outperform on average the other considered classifiers, although they do not provide a human-interpretable classification model. Rule-based methods have slightly inferior classification performances, but deliver the species-specific positions and nucleotide assignments. On synthetic data the supervised machine learning methods obtain superior classification performances with respect to the traditional DNA Barcode classification methods. On empirical data their classification performances are at a comparable level to the other methods. Conclusions The classification analysis shows that supervised machine learning methods are promising candidates for successfully handling the DNA Barcoding species classification problem, obtaining excellent performances. To conclude, a powerful tool to perform species identification is now available to the DNA Barcoding community. PMID:24721333
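
    The study itself uses the Weka suite; as a loosely analogous sketch, the following Python/scikit-learn snippet classifies toy DNA Barcode sequences from k-mer counts with two of the classifier families mentioned above (SVM and Naive Bayes). Sequences, labels, and the k-mer length are placeholders, not data from the paper.

```python
# Hedged sketch (not the paper's Weka pipeline): classifying DNA Barcode
# sequences with k-mer counts and two of the classifier families mentioned
# above (SVM and Naive Bayes). Sequences and labels are toy placeholders.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline
from sklearn.svm import LinearSVC

seqs   = ["ATGCGTACGTTAGC", "ATGCGTACGATAGC", "TTGCCATGGCAAGT", "TTGCCATCGCAAGT"]
labels = ["species_A", "species_A", "species_B", "species_B"]

# 4-mer character counts as features.
kmers = CountVectorizer(analyzer="char", ngram_range=(4, 4), lowercase=False)
for clf in (LinearSVC(), MultinomialNB()):
    model = make_pipeline(kmers, clf)
    model.fit(seqs, labels)
    print(type(clf).__name__, model.predict(["ATGCGTACGTTAGC"]))
```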

  5. A Tale of Two Methods: Chart and Interview Methods for Identifying Delirium

    PubMed Central

    Saczynski, Jane S.; Kosar, Cyrus M.; Xu, Guoquan; Puelle, Margaret R.; Schmitt, Eva; Jones, Richard N.; Marcantonio, Edward R.; Wong, Bonnie; Isaza, Ilean; Inouye, Sharon K.

    2014-01-01

    Background Interview and chart-based methods for identifying delirium have been validated. However, relative strengths and limitations of each method have not been described, nor has a combined approach (using both interviews and chart) been systematically examined. Objectives To compare chart and interview-based methods for identification of delirium. Design, Setting and Participants Participants were 300 patients aged 70+ undergoing major elective surgery (majority were orthopedic surgery) interviewed daily during hospitalization for delirium using the Confusion Assessment Method (CAM; interview-based method) and whose medical charts were reviewed for delirium using a validated chart-review method (chart-based method). We examined rate of agreement on the two methods and patient characteristics of those identified using each approach. Predictive validity for clinical outcomes (length of stay, postoperative complications, discharge disposition) was compared. In the absence of a gold standard, predictive value could not be calculated. Results The cumulative incidence of delirium was 23% (n = 68) by the interview-based method, 12% (n = 35) by the chart-based method and 27% (n = 82) by the combined approach. Overall agreement was 80%; kappa was 0.30. The methods differed in detection of psychomotor features and time of onset. The chart-based method missed delirium in CAM-identified patients lacking features of psychomotor agitation or inappropriate behavior. The CAM-based method missed chart-identified cases occurring during the night shift. The combined method had high predictive validity for all clinical outcomes. Conclusions Interview and chart-based methods have specific strengths for identification of delirium. A combined approach captures the largest number and the broadest range of delirium cases. PMID:24512042

  6. A highly sensitive and specific method for the screening detection of genetically modified organisms based on digital PCR without pretreatment

    PubMed Central

    Fu, Wei; Zhu, Pengyu; Wang, Chenguang; Huang, Kunlun; Du, Zhixin; Tian, Wenying; Wang, Qin; Wang, Huiyu; Xu, Wentao; Zhu, Shuifang

    2015-01-01

    Digital PCR has developed rapidly since it was first reported in the 1990s. It was recently reported that an improved method facilitated the detection of genetically modified organisms (GMOs). However, to use this improved method, the samples must be pretreated, which could introduce inaccuracy into the results. In our study, we explored a pretreatment-free digital PCR detection method for the screening for GMOs. We chose the CaMV35s promoter and the NOS terminator as the templates in our assay. To determine the specificity of our method, 9 events of GMOs were collected, including MON810, MON863, TC1507, MIR604, MIR162, GA21, T25, NK603 and Bt176. Moreover, the sensitivity, intra-laboratory and inter-laboratory reproducibility of our detection method were assessed. The results showed that the limit of detection of our method was 0.1%, which was lower than the labeling threshold level of the EU. The specificity and stability among the 9 events were consistent, respectively. The intra-laboratory and inter-laboratory reproducibility were both good. Finally, the perfect fitness for the detection of eight double-blind samples indicated the good practicability of our method. In conclusion, the method in our study would allow more sensitive, specific and stable screening detection of the GMO content of international trading products. PMID:26239916

  7. A highly sensitive and specific method for the screening detection of genetically modified organisms based on digital PCR without pretreatment.

    PubMed

    Fu, Wei; Zhu, Pengyu; Wang, Chenguang; Huang, Kunlun; Du, Zhixin; Tian, Wenying; Wang, Qin; Wang, Huiyu; Xu, Wentao; Zhu, Shuifang

    2015-08-04

    Digital PCR has developed rapidly since it was first reported in the 1990s. It was recently reported that an improved method facilitated the detection of genetically modified organisms (GMOs). However, to use this improved method, the samples must be pretreated, which could introduce inaccuracy into the results. In our study, we explored a pretreatment-free digital PCR detection method for the screening for GMOs. We chose the CaMV35s promoter and the NOS terminator as the templates in our assay. To determine the specificity of our method, 9 events of GMOs were collected, including MON810, MON863, TC1507, MIR604, MIR162, GA21, T25, NK603 and Bt176. Moreover, the sensitivity, intra-laboratory and inter-laboratory reproducibility of our detection method were assessed. The results showed that the limit of detection of our method was 0.1%, which was lower than the labeling threshold level of the EU. The specificity and stability among the 9 events were consistent, respectively. The intra-laboratory and inter-laboratory reproducibility were both good. Finally, the perfect fitness for the detection of eight double-blind samples indicated the good practicability of our method. In conclusion, the method in our study would allow more sensitive, specific and stable screening detection of the GMO content of international trading products.
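
    The abstract does not spell out the partition arithmetic; purely as general background on how digital PCR quantifies targets (not a detail taken from this paper), the sketch below applies the standard Poisson correction to the fraction of positive partitions and expresses GMO content as a transgene-to-reference ratio. All counts are invented.

```python
# Background sketch (standard digital PCR arithmetic, not specific to this paper):
# estimate copies per partition from the fraction of positive partitions using
# the Poisson correction, then express GMO content as a transgene/reference ratio.
import math

def copies_per_partition(positive, total):
    """lambda = -ln(1 - p), where p is the fraction of positive partitions."""
    return -math.log(1.0 - positive / total)

transgene = copies_per_partition(positive=60,   total=20000)   # e.g. CaMV35s assay
reference = copies_per_partition(positive=5900, total=20000)   # species reference gene
print(f"estimated GMO content: {100 * transgene / reference:.2f}%")
```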

  8. Estimation of groundwater consumption by phreatophytes using diurnal water table fluctuations: A saturated‐unsaturated flow assessment

    USGS Publications Warehouse

    Loheide, Steven P.; Butler, James J.; Gorelick, Steven M.

    2005-01-01

    Groundwater consumption by phreatophytes is a difficult‐to‐measure but important component of the water budget in many arid and semiarid environments. Over the past 70 years the consumptive use of groundwater by phreatophytes has been estimated using a method that analyzes diurnal trends in hydrographs from wells that are screened across the water table (White, 1932). The reliability of estimates obtained with this approach has never been rigorously evaluated using saturated‐unsaturated flow simulation. We present such an evaluation for common flow geometries and a range of hydraulic properties. Results indicate that the major source of error in the White method is the uncertainty in the estimate of specific yield. Evapotranspirative consumption of groundwater will often be significantly overpredicted with the White method if the effects of drainage time and the depth to the water table on specific yield are ignored. We utilize the concept of readily available specific yield as the basis for estimation of the specific yield value appropriate for use with the White method. Guidelines are defined for estimating readily available specific yield based on sediment texture. Use of these guidelines with the White method should enable the evapotranspirative consumption of groundwater to be more accurately quantified.
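
    A minimal sketch of the hydrograph arithmetic behind the White (1932) method referenced above, with toy numbers; sign conventions vary between authors, and Sy here stands for the readily available specific yield that the study recommends using.

```python
# Minimal sketch of the White (1932) hydrograph calculation referenced above:
# ETg = Sy * (24 * r + delta_s), where r is the water-table recovery rate
# (typically estimated from the pre-dawn rise) and delta_s is the net decline in
# water-table elevation over the 24 h period. Sy should be the "readily
# available" specific yield appropriate for the drainage time and water-table depth.
def white_method_et(sy, recovery_rate_m_per_hr, net_decline_m):
    """Daily evapotranspirative groundwater consumption (m of water per day)."""
    return sy * (24.0 * recovery_rate_m_per_hr + net_decline_m)

# Toy numbers: Sy = 0.08, pre-dawn recovery of 2 mm/h, net 24 h decline of 10 mm.
print(white_method_et(sy=0.08, recovery_rate_m_per_hr=0.002, net_decline_m=0.010))
```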

  9. Development of multiplex PCR assay for authentication of Cornu Cervi Pantotrichum in traditional Chinese medicine based on cytochrome b and C oxidase subunit 1 genes.

    PubMed

    Gao, Lijun; Xia, Wei; Ai, Jinxia; Li, Mingcheng; Yuan, Guanxin; Niu, Jiamu; Fu, Guilian; Zhang, Lihua

    2016-07-01

    This study describes a method for discriminating true Cervus antlers from their counterfeits using multiplex PCR. Bioinformatic analyses were carried out to design allele-specific primers for the mitochondrial (mt) cytochrome b (Cyt b) and cytochrome C oxidase subunit 1 (Cox 1) genes. The mtDNA and genomic DNA were extracted from Cornu Cervi Pantotrichum and from its counterfeits using a modified alkaline method and a salt-extraction method, respectively. Sufficient DNA template was obtained from all samples with both extraction methods, and the joint fragments of 354 bp and 543 bp specifically amplified from true Cervus antlers served as a standard control. The data revealed that the multiplex PCR-based assays using the two primer sets can be used for forensic and quantitative identification of original Cervus deer products from counterfeit antlers in a single step.

  10. Variable context Markov chains for HIV protease cleavage site prediction.

    PubMed

    Oğul, Hasan

    2009-06-01

    Deciphering the knowledge of HIV protease specificity and developing computational tools for detecting its cleavage sites in protein polypeptide chains are very desirable for designing efficient and specific chemical inhibitors to prevent acquired immunodeficiency syndrome. In this study, we developed a generative model based on a generalization of variable order Markov chains (VOMC) for peptide sequences and adapted the model for prediction of their cleavability by certain proteases. The new method, called variable context Markov chains (VCMC), attempts to identify context equivalence based on the evolutionary similarities between individual amino acids. It was applied to the HIV-1 protease cleavage site prediction problem and shown to outperform existing methods in terms of prediction accuracy on a common dataset. In general, the method is a promising tool for prediction of cleavage sites of all proteases, and its use is encouraged for any kind of peptide classification problem as well.
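
    The paper's variable context Markov chains are more elaborate than what can be shown here; purely as an illustration of the generative-model idea, the sketch below trains fixed first-order Markov models on toy "cleaved" and "non-cleaved" octapeptides and classifies a query by log-likelihood ratio.

```python
# Greatly simplified, for illustration only: the paper uses variable-context
# Markov chains; here a fixed first-order Markov model is trained per class and
# an octapeptide is classified by its log-likelihood ratio. Sequences are toy data.
from collections import defaultdict
import math

ALPHABET = "ACDEFGHIKLMNPQRSTVWY"

def train(seqs, pseudo=1.0):
    counts = defaultdict(lambda: defaultdict(lambda: pseudo))
    for s in seqs:
        for a, b in zip(s, s[1:]):
            counts[a][b] += 1
    return {a: {b: counts[a][b] / sum(counts[a][c] for c in ALPHABET)
                for b in ALPHABET} for a in ALPHABET}

def loglik(model, s):
    return sum(math.log(model[a][b]) for a, b in zip(s, s[1:]))

cleaved    = ["SQNYPIVQ", "ARVLAEAM", "SQVSQNYP"]   # toy positive octamers
noncleaved = ["AAAAAAAA", "GGGGGGGG", "PPPPPPPP"]   # toy negatives
pos, neg = train(cleaved), train(noncleaved)

query = "SQNYPIVQ"
print("predicted cleavable:", loglik(pos, query) > loglik(neg, query))
```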

  11. Develop nondestructive rapid pavement quality assurance/quality control evaluation test methods and supporting technology : project summary.

    DOT National Transportation Integrated Search

    2017-01-01

    The findings from the proof of concept with mechanics-based models for flexible base suggest additional validation work should be performed, draft construction specification frameworks should be developed, and work extending the technology to stabili...

  12. Develop nondestructive rapid pavement quality Assurance/quality control evaluation test methods and supporting technology : project summary.

    DOT National Transportation Integrated Search

    2017-01-01

    The findings from the proof of concept with mechanics-based models for flexible base suggest additional validation work should be performed, draft construction specification frameworks should be developed, and work extending the technology to stabili...

  13. Assessment of Matrix Multiplication Learning with a Rule-Based Analytical Model--"A Bayesian Network Representation"

    ERIC Educational Resources Information Center

    Zhang, Zhidong

    2016-01-01

    This study explored an alternative assessment procedure to examine learning trajectories of matrix multiplication. It took rule-based analytical and cognitive task analysis methods specifically to break down operation rules for a given matrix multiplication. Based on the analysis results, a hierarchical Bayesian network, an assessment model,…

  14. Thermal Property Measurement of Semiconductor Melt using Modified Laser Flash Method

    NASA Technical Reports Server (NTRS)

    Lin, Bochuan; Zhu, Shen; Ban, Heng; Li, Chao; Scripa, Rosalla N.; Su, Ching-Hua; Lehoczky, Sandor L.

    2003-01-01

    This study further developed the standard laser flash method to measure multiple thermal properties of semiconductor melts. The modified method can determine thermal diffusivity, thermal conductivity, and specific heat capacity of the melt simultaneously. The transient heat transfer process in the melt and its quartz container was numerically studied in detail. A fitting procedure, based on numerical simulation results and least root-mean-square-error fitting to the experimental data, was used to extract the values of specific heat capacity, thermal conductivity, and thermal diffusivity. This modified method is a step forward from the standard laser flash method, which is usually used to measure the thermal diffusivity of solids. The results for tellurium (Te) at 873 K, namely a specific heat capacity of 300.2 J/(kg·K), a thermal conductivity of 3.50 W/(m·K), and a thermal diffusivity of 2.04 × 10⁻⁶ m²/s, are within the ranges reported in the literature. The uncertainty analysis showed the quantitative effects of sample geometry, the measured transient temperature, and the energy of the laser pulse.
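
    The study fits a full numerical model of the melt and its quartz container; as a much simpler illustration of the flash principle it extends, the sketch below applies the classic Parker half-rise relation for an adiabatic slab and then derives conductivity from k = alpha * rho * cp. The thickness, half-rise time, and density used here are assumed values, not measurements from the paper.

```python
# Simplified illustration only: for an adiabatic slab, thermal diffusivity follows
# from the rear-face half-rise time (Parker et al.): alpha ≈ 0.1388 * L**2 / t_half.
# Conductivity then follows from k = alpha * rho * cp. Numbers below are toy values,
# except cp, which is taken from the result quoted in the abstract.
def flash_diffusivity(thickness_m, t_half_s):
    return 0.1388 * thickness_m ** 2 / t_half_s

L, t_half = 2.0e-3, 0.27            # sample thickness (m) and half-rise time (s), assumed
alpha = flash_diffusivity(L, t_half)
rho, cp = 5700.0, 300.2             # density (kg/m^3, assumed) and cp (J/(kg K), from the paper)
print(f"alpha = {alpha:.2e} m^2/s, k = {alpha * rho * cp:.2f} W/(m K)")
```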

  15. Mass Spectrometry Based Ultrasensitive DNA Methylation Profiling Using Target Fragmentation Assay.

    PubMed

    Lin, Xiang-Cheng; Zhang, Ting; Liu, Lan; Tang, Hao; Yu, Ru-Qin; Jiang, Jian-Hui

    2016-01-19

    Efficient tools for profiling DNA methylation in specific genes are essential for epigenetics and clinical diagnostics. Current DNA methylation profiling techniques have been limited by inconvenient implementation, the requirement for specific reagents, and inferior accuracy in quantifying the degree of methylation. We develop a novel mass spectrometry method, target fragmentation assay (TFA), which enables methylation profiling in specific sequences. This method combines selective capture of the DNA target from restricted cleavage of genomic DNA using magnetic separation with MS detection of the nonenzymatic hydrolysates of the target DNA. This method is shown to be highly sensitive, with a detection limit as low as 0.056 amol, allowing direct profiling of methylation using genomic DNA without preamplification. Moreover, this method offers a unique advantage in accurately determining the DNA methylation level. The clinical applicability was demonstrated by DNA methylation analysis using prostate tissue samples, implying the potential of this method as a useful tool for DNA methylation profiling in the early detection of related diseases.

  16. Human joint motion estimation for electromyography (EMG)-based dynamic motion control.

    PubMed

    Zhang, Qin; Hosoda, Ryo; Venture, Gentiane

    2013-01-01

    This study aims to investigate a joint motion estimation method from Electromyography (EMG) signals during dynamic movement. In most EMG-based humanoid or prosthetic control systems, EMG features are directly or indirectly used to trigger intended motions. However, both physiological and nonphysiological factors can influence EMG characteristics during dynamic movements, resulting in subject-specific, non-stationary and crosstalk problems. Particularly, when motion velocity and/or joint torque are not constrained, joint motion estimation from EMG signals is more challenging. In this paper, we propose a joint motion estimation method based on muscle activation recorded from a pair of agonist and antagonist muscles of the joint. A linear state-space model with multiple inputs and a single output is proposed to map the muscle activity to joint motion. An adaptive estimation method is proposed to train the model. The estimation performance is evaluated on a single elbow flexion-extension movement performed by two subjects. All the results in the two subjects at two load levels indicate the feasibility and suitability of the proposed method in joint motion estimation. The estimation root-mean-square error is within 8.3%-10.6%, which is lower than that reported in several previous studies. Moreover, this method is able to overcome the subject-specific problem and compensate for non-stationary EMG properties.
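
    A hedged, much-simplified sketch of the idea: estimate joint angle from an agonist/antagonist activation pair with a linear autoregressive model whose coefficients are updated online by recursive least squares. The paper's exact state-space formulation and adaptation law may differ; all signals below are synthetic.

```python
# Simplified sketch: online (recursive least squares) identification of a linear
# model mapping a pair of agonist/antagonist muscle activations to joint angle.
import numpy as np

rng = np.random.default_rng(0)
T = 500
u_ag, u_ant = rng.random(T), rng.random(T)                # muscle activations (toy)
angle = np.zeros(T)
for t in range(1, T):                                      # synthetic "true" joint motion
    angle[t] = 0.9 * angle[t - 1] + 0.5 * u_ag[t] - 0.4 * u_ant[t]

theta = np.zeros(3)                                        # [a, b_agonist, b_antagonist]
P = np.eye(3) * 1e3
lam = 0.99                                                 # forgetting factor
for t in range(1, T):
    phi = np.array([angle[t - 1], u_ag[t], u_ant[t]])
    k = P @ phi / (lam + phi @ P @ phi)
    theta += k * (angle[t] - phi @ theta)                  # update with prediction error
    P = (P - np.outer(k, phi @ P)) / lam
print("estimated coefficients:", np.round(theta, 3))       # approaches [0.9, 0.5, -0.4]
```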

  17. Estimation of Fine Particulate Matter in Taipei Using Landuse Regression and Bayesian Maximum Entropy Methods

    PubMed Central

    Yu, Hwa-Lung; Wang, Chih-Hsih; Liu, Ming-Che; Kuo, Yi-Ming

    2011-01-01

    Fine airborne particulate matter (PM2.5) has adverse effects on human health. Assessing the long-term effects of PM2.5 exposure on human health and ecology is often limited by a lack of reliable PM2.5 measurements. In Taipei, PM2.5 levels were not systematically measured until August 2005. Due to the popularity of geographic information systems (GIS), the landuse regression method has been widely used in the spatial estimation of PM concentrations. This method accounts for the potential contributing factors of the local environment, such as traffic volume. Geostatistical methods, on the other hand, account for the spatiotemporal dependence among the observations of ambient pollutants. This study assesses the performance of the landuse regression model for the spatiotemporal estimation of PM2.5 in the Taipei area. Specifically, this study integrates the landuse regression model with the geostatistical approach within the framework of the Bayesian maximum entropy (BME) method. The resulting epistemic framework can assimilate knowledge bases including: (a) empirical-based spatial trends of PM concentration based on landuse regression, (b) the spatio-temporal dependence among PM observation information, and (c) site-specific PM observations. The proposed approach performs the spatiotemporal estimation of PM2.5 levels in the Taipei area (Taiwan) from 2005 to 2007. PMID:21776223

  18. Estimation of fine particulate matter in Taipei using landuse regression and bayesian maximum entropy methods.

    PubMed

    Yu, Hwa-Lung; Wang, Chih-Hsih; Liu, Ming-Che; Kuo, Yi-Ming

    2011-06-01

    Fine airborne particulate matter (PM2.5) has adverse effects on human health. Assessing the long-term effects of PM2.5 exposure on human health and ecology is often limited by a lack of reliable PM2.5 measurements. In Taipei, PM2.5 levels were not systematically measured until August 2005. Due to the popularity of geographic information systems (GIS), the landuse regression method has been widely used in the spatial estimation of PM concentrations. This method accounts for the potential contributing factors of the local environment, such as traffic volume. Geostatistical methods, on the other hand, account for the spatiotemporal dependence among the observations of ambient pollutants. This study assesses the performance of the landuse regression model for the spatiotemporal estimation of PM2.5 in the Taipei area. Specifically, this study integrates the landuse regression model with the geostatistical approach within the framework of the Bayesian maximum entropy (BME) method. The resulting epistemic framework can assimilate knowledge bases including: (a) empirical-based spatial trends of PM concentration based on landuse regression, (b) the spatio-temporal dependence among PM observation information, and (c) site-specific PM observations. The proposed approach performs the spatiotemporal estimation of PM2.5 levels in the Taipei area (Taiwan) from 2005 to 2007.
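
    Purely as an illustration of the land-use regression component (the BME fusion the study adds on top is not shown), the sketch below regresses station PM2.5 on made-up local predictors and predicts the spatial trend at an unmonitored site.

```python
# Illustrative land-use regression step only (the geostatistical BME fusion that
# the study adds on top is not shown): regress station PM2.5 on local predictors
# such as traffic volume and road density. All values are made up.
import numpy as np
from sklearn.linear_model import LinearRegression

# rows = monitoring sites; columns = [traffic volume, major-road length (km), green space fraction]
X = np.array([[1200, 3.2, 0.10],
              [400,  1.1, 0.45],
              [2500, 5.0, 0.05],
              [800,  2.0, 0.30]])
pm25 = np.array([38.0, 21.0, 52.0, 29.0])             # observed PM2.5 (ug/m^3)

lur = LinearRegression().fit(X, pm25)
site = np.array([[1500, 4.0, 0.12]])                   # unmonitored location
print("spatial-trend estimate:", lur.predict(site))    # would feed the BME mean trend
```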

  19. Competitive Protein-binding assay-based Enzyme-immunoassay Method, Compared to High-pressure Liquid Chromatography, Has a Very Lower Diagnostic Value to Detect Vitamin D Deficiency in 9-12 Years Children.

    PubMed

    Zahedi Rad, Maliheh; Neyestani, Tirang Reza; Nikooyeh, Bahareh; Shariatzadeh, Nastaran; Kalayi, Ali; Khalaji, Niloufar; Gharavi, Azam

    2015-01-01

    The most reliable indicator of Vitamin D status is the circulating concentration of 25-hydroxycalciferol (25(OH)D), routinely determined by enzyme-immunoassay (EIA) methods. This study was performed to compare a commonly used competitive protein-binding assay (CPBA)-based EIA with the gold standard, high-pressure liquid chromatography (HPLC). Concentrations of 25(OH)D in sera from 257 randomly selected school children aged 9-11 years were determined by the two methods, CPBA and HPLC. Mean 25(OH)D concentration was 22 ± 18.8 and 21.9 ± 15.6 nmol/L by CPBA and HPLC, respectively. However, the mean 25(OH)D concentrations of the two methods differed after excluding undetectable samples (25.1 ± 18.9 vs. 29 ± 14.5 nmol/L, respectively; P = 0.04). Based on Vitamin D deficiency predefined as 25(OH)D < 12.5 nmol/L, CPBA sensitivity and specificity were 44.2% and 60.6%, respectively, compared to HPLC. In receiver operating characteristic curve analysis, the best cut-off for CPBA was 5.8 nmol/L, which gave 82% sensitivity but only 17% specificity. Though CPBA may be used as a screening tool, more reliable methods are needed for diagnostic purposes.

  20. An overview on the emerging area of identification, characterization, and assessment of health apps.

    PubMed

    Paglialonga, Alessia; Lugo, Alessandra; Santoro, Eugenio

    2018-05-28

    The need to characterize and assess health apps has inspired a significant amount of research in the past years, in search of methods able to provide potential app users with relevant, meaningful knowledge. This article presents an overview of the recent literature in this field and categorizes - by discussing some specific examples - the various methodologies introduced so far for the identification, characterization, and assessment of health apps. Specifically, this article outlines the most significant web-based resources for app identification, relevant frameworks for descriptive characterization of apps' features, and a number of methods for the assessment of quality along its various components (e.g., evidence base, trustworthiness, privacy, or user engagement). The development of methods to characterize the apps' features and to assess their quality is important to define benchmarks and minimum requirements. Similarly, such methods are important to categorize potential risks and challenges in the field so that risks can be minimized, whenever possible, by design. Understanding methods to assess apps is key to raising the standards of quality of health apps on the market, towards the final goal of delivering apps that are built on the pillars of evidence base, reliability, long-term effectiveness, and user-oriented quality. Copyright © 2018. Published by Elsevier Inc.

  1. Structure-based multiscale approach for identification of interaction partners of PDZ domains.

    PubMed

    Tiwari, Garima; Mohanty, Debasisa

    2014-04-28

    PDZ domains are peptide recognition modules which mediate specific protein-protein interactions and are known to have a complex specificity landscape. We have developed a novel structure-based multiscale approach which identifies crucial specificity determining residues (SDRs) of PDZ domains from explicit solvent molecular dynamics (MD) simulations on PDZ-peptide complexes and uses these SDRs in combination with knowledge-based scoring functions for proteome-wide identification of their interaction partners. Multiple explicit solvent simulations ranging from 5 to 50 ns duration have been carried out on 28 PDZ-peptide complexes with known binding affinities. MM/PBSA binding energy values calculated from these simulations show a correlation coefficient of 0.755 with the experimental binding affinities. On the basis of the SDRs of PDZ domains identified by MD simulations, we have developed a simple scoring scheme for evaluating binding energies for PDZ-peptide complexes using residue-based statistical pair potentials. This multiscale approach has been benchmarked on a mouse PDZ proteome array data set by calculating the binding energies for 217 different substrate peptides in binding pockets of 64 different mouse PDZ domains. Receiver operating characteristic (ROC) curve analysis indicates that the area under the curve (AUC) for binder vs nonbinder classification by our structure-based method is 0.780. Our structure-based method does not require experimental PDZ-peptide binding data for training.

  2. Automated seeding-based nuclei segmentation in nonlinear optical microscopy.

    PubMed

    Medyukhina, Anna; Meyer, Tobias; Heuke, Sandro; Vogler, Nadine; Dietzek, Benjamin; Popp, Jürgen

    2013-10-01

    Nonlinear optical (NLO) microscopy based, e.g., on coherent anti-Stokes Raman scattering (CARS) or two-photon-excited fluorescence (TPEF) is a fast label-free imaging technique with great potential for biomedical applications. However, NLO microscopy as a diagnostic tool is still in its infancy; there is a lack of robust and durable nuclei segmentation methods capable of accurate image processing in cases of variable image contrast, nuclear density, and type of investigated tissue. Nonetheless, such algorithms specifically adapted to NLO microscopy are one prerequisite for the technology to be routinely used, e.g., in pathology or intraoperatively for surgical guidance. In this paper, we compare the applicability of different seeding and boundary detection methods to NLO microscopic images in order to develop an optimal seeding-based approach capable of accurate segmentation of both TPEF and CARS images. Among different methods, the Laplacian of Gaussian filter showed the best accuracy for the seeding of the image, while a modified seeded watershed segmentation was the most accurate in the task of boundary detection. The resulting combination of these methods, followed by verification of the detected nuclei, achieves high average sensitivity and specificity when applied to various types of NLO microscopy images.
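
    A minimal sketch of the winning combination named in the abstract (Laplacian-of-Gaussian seeding followed by seeded watershed) is given below using scikit-image and SciPy on a synthetic two-nuclei image; the smoothing scales, thresholds, and minimum peak distance are illustrative assumptions.

      import numpy as np
      from scipy import ndimage as ndi
      from skimage import filters, feature, segmentation, measure

      def segment_nuclei(image, log_sigma=4.0, min_distance=5):
          """Seed nuclei with a Laplacian-of-Gaussian response, then refine boundaries by seeded watershed."""
          smooth = filters.gaussian(image, sigma=1.0)
          log_response = -ndi.gaussian_laplace(smooth, sigma=log_sigma)   # bright blobs -> positive peaks
          peaks = feature.peak_local_max(log_response, min_distance=min_distance, threshold_rel=0.2)
          seeds = np.zeros(image.shape, dtype=int)
          seeds[tuple(peaks.T)] = np.arange(1, len(peaks) + 1)
          mask = smooth > filters.threshold_otsu(smooth)                  # restrict watershed to foreground
          return segmentation.watershed(-smooth, markers=seeds, mask=mask)

      # Synthetic example: two Gaussian "nuclei"
      yy, xx = np.mgrid[0:64, 0:64]
      img = np.exp(-((yy - 20)**2 + (xx - 20)**2) / 50.0) + np.exp(-((yy - 42)**2 + (xx - 44)**2) / 50.0)
      labels = segment_nuclei(img)
      print("nuclei found:", len(measure.regionprops(labels)))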

  3. Detection of blob objects in microscopic zebrafish images based on gradient vector diffusion.

    PubMed

    Li, Gang; Liu, Tianming; Nie, Jingxin; Guo, Lei; Malicki, Jarema; Mara, Andrew; Holley, Scott A; Xia, Weiming; Wong, Stephen T C

    2007-10-01

    The zebrafish has become an important vertebrate animal model for the study of developmental biology, functional genomics, and disease mechanisms. It is also being used for drug discovery. Computerized detection of blob objects has been one of the important tasks in quantitative phenotyping of zebrafish. We present a new automated method that is able to detect blob objects, such as nuclei or cells in microscopic zebrafish images. This method is composed of three key steps. The first step is to produce a diffused gradient vector field by a physical elastic deformable model. In the second step, the flux image is computed on the diffused gradient vector field. The third step performs thresholding and nonmaximum suppression based on the flux image. We report the validation and experimental results of this method using zebrafish image datasets from three independent research labs. Both sensitivity and specificity of this method are over 90%. This method is able to differentiate closely juxtaposed or connected blob objects, with high sensitivity and specificity in different situations. It is characterized by a good, consistent performance in blob object detection.

  4. Estimating patient-specific and anatomically correct reference model for craniomaxillofacial deformity via sparse representation

    PubMed Central

    Wang, Li; Ren, Yi; Gao, Yaozong; Tang, Zhen; Chen, Ken-Chung; Li, Jianfu; Shen, Steve G. F.; Yan, Jin; Lee, Philip K. M.; Chow, Ben; Xia, James J.; Shen, Dinggang

    2015-01-01

    Purpose: A significant number of patients suffer from craniomaxillofacial (CMF) deformity and require CMF surgery in the United States. The success of CMF surgery depends not only on the surgical techniques but also on accurate surgical planning. However, surgical planning for CMF surgery is challenging due to the absence of a patient-specific reference model. Currently, the outcome of the surgery is often subjective and highly dependent on the surgeon’s experience. In this paper, the authors present an automatic method to estimate an anatomically correct reference shape of jaws for orthognathic surgery, a common type of CMF surgery. Methods: To estimate a patient-specific jaw reference model, the authors use a data-driven method based on sparse shape composition. Given a dictionary of normal subjects, the authors first use the sparse representation to represent the midface of a patient by the midfaces of the normal subjects in the dictionary. Then, the derived sparse coefficients are used to reconstruct a patient-specific reference jaw shape. Results: The authors have validated the proposed method on both synthetic and real patient data. Experimental results show that the authors’ method can effectively reconstruct the normal jaw shape for patients. Conclusions: The authors have presented a novel method to automatically estimate a patient-specific reference model for patients suffering from CMF deformity. PMID:26429255
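
    The sparse shape composition step can be pictured with a small Python sketch: the patient's midface landmark vector is approximated as a sparse combination of normal-subject midfaces, and the same coefficients are applied to the normal jaws. The landmark dimensions, Lasso penalty, and positivity constraint below are illustrative assumptions, not the authors' exact formulation.

      import numpy as np
      from sklearn.linear_model import Lasso

      def estimate_reference_jaw(patient_midface, midface_dict, jaw_dict, alpha=0.01):
          """Columns of midface_dict / jaw_dict are flattened landmark vectors of normal subjects."""
          lasso = Lasso(alpha=alpha, positive=True, fit_intercept=False, max_iter=10000)
          lasso.fit(midface_dict, patient_midface)          # sparse coefficients over the dictionary
          return jaw_dict @ lasso.coef_, lasso.coef_        # transfer coefficients to the jaw shapes

      # Toy example: 30 midface / 40 jaw landmarks (x, y, z), 15 normal subjects
      rng = np.random.default_rng(2)
      midface_dict = rng.normal(size=(90, 15))
      jaw_dict = rng.normal(size=(120, 15))
      true_coef = np.zeros(15); true_coef[[2, 7]] = [0.6, 0.4]
      patient_midface = midface_dict @ true_coef + rng.normal(0, 0.01, 90)
      ref_jaw, coef = estimate_reference_jaw(patient_midface, midface_dict, jaw_dict)
      print("active dictionary subjects:", np.nonzero(coef)[0])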

  5. Glaucoma progression detection: agreement, sensitivity, and specificity of expert visual field evaluation, event analysis, and trend analysis.

    PubMed

    Antón, Alfonso; Pazos, Marta; Martín, Belén; Navero, José Manuel; Ayala, Miriam Eleonora; Castany, Marta; Martínez, Patricia; Bardavío, Javier

    2013-01-01

    To assess sensitivity, specificity, and agreement among automated event analysis, automated trend analysis, and expert evaluation to detect glaucoma progression. This was a prospective study that included 37 eyes with a follow-up of 36 months. All had glaucomatous disks and fields and performed reliable visual fields every 6 months. Each series of fields was assessed with 3 different methods: subjective assessment by 2 independent teams of glaucoma experts, glaucoma/guided progression analysis (GPA) event analysis, and GPA (visual field index-based) trend analysis. Kappa agreement coefficient between methods and sensitivity and specificity for each method using expert opinion as gold standard were calculated. The incidence of glaucoma progression was 16% to 18% in 3 years but only 3 cases showed progression with all 3 methods. Kappa agreement coefficient was high (k=0.82) between subjective expert assessment and GPA event analysis, and only moderate between these two and GPA trend analysis (k=0.57). Sensitivity and specificity for GPA event and GPA trend analysis were 71% and 96%, and 57% and 93%, respectively. The 3 methods detected similar numbers of progressing cases. The GPA event analysis and expert subjective assessment showed high agreement between them and moderate agreement with GPA trend analysis. In a period of 3 years, both methods of GPA analysis offered high specificity, event analysis showed 83% sensitivity, and trend analysis had a 66% sensitivity.
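
    The agreement and accuracy statistics reported above can be computed with a few lines of Python; the per-eye labels below are hypothetical, with expert assessment treated as the gold standard as in the study.

      import numpy as np
      from sklearn.metrics import cohen_kappa_score, confusion_matrix

      expert    = np.array([1, 0, 0, 1, 0, 1, 0, 0, 1, 0, 0, 0, 1, 0, 0])   # 1 = progression
      gpa_event = np.array([1, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 0, 1, 0, 0])
      gpa_trend = np.array([1, 0, 1, 0, 0, 0, 0, 0, 1, 0, 1, 0, 1, 0, 0])

      def sens_spec(gold, test):
          tn, fp, fn, tp = confusion_matrix(gold, test, labels=[0, 1]).ravel()
          return tp / (tp + fn), tn / (tn + fp)

      for name, method in [("GPA event", gpa_event), ("GPA trend", gpa_trend)]:
          sens, spec = sens_spec(expert, method)
          kappa = cohen_kappa_score(expert, method)
          print(f"{name}: kappa={kappa:.2f} sensitivity={sens:.2f} specificity={spec:.2f}")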

  6. Evaluation of new gyrB-based real-time PCR system for the detection of B. fragilis as an indicator of human-specific fecal contamination.

    PubMed

    Lee, Chang Soo; Lee, Jiyoung

    2010-09-01

    A rapid and specific gyrB-based real-time PCR system has been developed for detecting Bacteroides fragilis as a human-specific marker of fecal contamination. Its specificity and sensitivity were evaluated by comparison with other 16S rRNA gene-based primers using closely related Bacteroides and Prevotella. Many studies have used 16S rRNA gene-based methods targeting Bacteroides because this genus is relatively abundant in human feces and is useful for microbial source tracking. However, 16S rRNA gene-based primers are evolutionarily too conserved among taxa to discriminate between human-specific species of Bacteroides and other closely related genera, such as Prevotella. Recently, one of the housekeeping genes, gyrB, has been used as an alternative target in multilocus sequence analysis (MLSA) to provide greater phylogenetic resolution. In this study, a new B. fragilis-specific primer set (Bf904F/Bf958R) was designed by alignments of 322 gyrB genes and was compared with the performance of the 16S rRNA gene-based primers in the presence of B. fragilis, Bacteroides ovatus and Prevotella melaninogenica. Amplicons were sequenced and a phylogenetic tree was constructed to confirm the specificity of the primers to B. fragilis. The gyrB-based primers successfully discriminated B. fragilis from B. ovatus and P. melaninogenica. Real-time PCR results showed that the gyrB primer set had a comparable sensitivity in the detection of B. fragilis when compared with the 16S rRNA primer set. The host-specificity of our gyrB-based primer set was validated with human, pig, cow, and dog fecal samples. The gyrB primer system had superior human-specificity. The gyrB-based system can rapidly detect human-specific fecal sources and can be used for improved source tracking of human contamination. (c) 2010 Elsevier B.V. All rights reserved.

  7. MO-FG-CAMPUS-TeP1-05: Rapid and Efficient 3D Dosimetry for End-To-End Patient-Specific QA of Rotational SBRT Deliveries Using a High-Resolution EPID

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yang, Y M; Han, B; Xing, L

    2016-06-15

    Purpose: EPID-based patient-specific quality assurance provides verification of the planning setup and delivery process that phantomless QA and log-file based virtual dosimetry methods cannot achieve. We present a method for EPID-based QA utilizing spatially-variant EPID response kernels that allows for direct calculation of the entrance fluence and 3D phantom dose. Methods: An EPID dosimetry system was utilized for 3D dose reconstruction in a cylindrical phantom for the purposes of end-to-end QA. Monte Carlo (MC) methods were used to generate pixel-specific point-spread functions (PSFs) characterizing the spatially non-uniform EPID portal response in the presence of phantom scatter. The spatially-variant PSFs were decomposed into spatially-invariant basis PSFs with the symmetric central-axis kernel as the primary basis kernel and off-axis representing orthogonal perturbations in pixel-space. This compact and accurate characterization enables the use of a modified Richardson-Lucy deconvolution algorithm to directly reconstruct entrance fluence from EPID images without iterative scatter subtraction. High-resolution phantom dose kernels were cogenerated in MC with the PSFs enabling direct recalculation of the resulting phantom dose by rapid forward convolution once the entrance fluence was calculated. A Delta4 QA phantom was used to validate the dose reconstructed in this approach. Results: The spatially-invariant representation of the EPID response accurately reproduced the entrance fluence with >99.5% fidelity with a simultaneous reduction of >60% in computational overhead. 3D dose for 10^6 voxels was reconstructed for the entire phantom geometry. A 3D global gamma analysis demonstrated a >95% pass rate at 3%/3mm. Conclusion: Our approach demonstrates the capabilities of an EPID-based end-to-end QA methodology that is more efficient than traditional EPID dosimetry methods. Displacing the point of measurement external to the QA phantom reduces the necessary complexity of the phantom itself while offering a method that is highly scalable and inherently generalizable to rotational and trajectory based deliveries. This research was partially supported by Varian.
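
    The deconvolve-then-reconvolve pipeline described in the Methods can be illustrated with spatially invariant stand-in kernels (the paper derives spatially variant, Monte Carlo generated PSFs, which are not reproduced here). The sketch recovers an entrance fluence from a blurred portal image by Richardson-Lucy deconvolution and forward-convolves it with a dose kernel.

      import numpy as np
      from scipy.signal import fftconvolve
      from skimage.restoration import richardson_lucy

      def gaussian_kernel(size, sigma):
          ax = np.arange(size) - size // 2
          xx, yy = np.meshgrid(ax, ax)
          k = np.exp(-(xx**2 + yy**2) / (2 * sigma**2))
          return k / k.sum()

      epid_psf = gaussian_kernel(31, sigma=4.0)       # stand-in for detector + phantom-scatter response
      dose_kernel = gaussian_kernel(31, sigma=2.5)    # stand-in for the phantom dose-deposition kernel

      fluence_true = np.zeros((128, 128)); fluence_true[40:90, 40:90] = 1.0   # open square field
      epid_image = fftconvolve(fluence_true, epid_psf, mode="same")           # simulated portal image

      fluence_est = richardson_lucy(epid_image, epid_psf, 30)                 # back to entrance fluence
      dose_plane = fftconvolve(fluence_est, dose_kernel, mode="same")         # forward dose convolution
      print("mean fluence recovery error:", float(np.abs(fluence_est - fluence_true).mean()))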

  8. Real-time design with peer tasks

    NASA Technical Reports Server (NTRS)

    Goforth, Andre; Howes, Norman R.; Wood, Jonathan D.; Barnes, Michael J.

    1995-01-01

    We introduce a real-time design methodology for large scale, distributed, parallel architecture, real-time systems (LDPARTS), as an alternative to those methods using rate or deadline monotonic analysis. In our method the fundamental units of prioritization, work items, are domain-specific objects with timing requirements (deadlines) found in the user's specification. A work item consists of a collection of tasks of equal priority. Current scheduling theories are applied with artifact deadlines introduced by the designer, whereas our method schedules work items to meet the user's specification deadlines (sometimes called end-to-end deadlines). Our method has the following scheduling properties. First, work item scheduling is based on domain-specific importance instead of task-level urgency and still meets as many user specification deadlines as can be met by scheduling tasks with respect to urgency. Second, the minimum (closest) on-line deadline that can be guaranteed for a work item of highest importance, scheduled at run time, is approximately the inverse of the throughput, measured in work items per second. Third, throughput is not degraded during overload and instead of resorting to task shedding during overload, the designer can specify which work items to shed. We prove these properties in a mathematical model.

  9. Simple Methods and Rational Design for Enhancing Aptamer Sensitivity and Specificity

    PubMed Central

    Kalra, Priya; Dhiman, Abhijeet; Cho, William C.; Bruno, John G.; Sharma, Tarun K.

    2018-01-01

    Aptamers are structured nucleic acid molecules that can bind to their targets with high affinity and specificity. However, conventional SELEX (Systematic Evolution of Ligands by EXponential enrichment) methods may not necessarily produce aptamers of desired affinity and specificity. Thus, to address these questions, this perspective is intended to suggest some approaches and tips along with novel selection methods to enhance evolution of aptamers. This perspective covers latest novel innovations as well as a broad range of well-established approaches to improve the individual binding parameters (aptamer affinity, avidity, specificity and/or selectivity) of aptamers during and/or post-SELEX. The advantages and limitations of individual aptamer selection methods and post-SELEX optimizations, along with rational approaches to overcome these limitations are elucidated in each case. Further the impact of chosen selection milieus, linker-systems, aptamer cocktails and detection modules utilized in conjunction with target-specific aptamers, on the overall assay performance are discussed in detail, each with its own advantages and limitations. The simple variations suggested are easily available for facile implementation during and/or post-SELEX to develop ultrasensitive and specific assays. Finally, success studies of established aptamer-based assays are discussed, highlighting how they utilized some of the suggested methodologies to develop commercially successful point-of-care diagnostic assays. PMID:29868605

  10. Simple Methods and Rational Design for Enhancing Aptamer Sensitivity and Specificity.

    PubMed

    Kalra, Priya; Dhiman, Abhijeet; Cho, William C; Bruno, John G; Sharma, Tarun K

    2018-01-01

    Aptamers are structured nucleic acid molecules that can bind to their targets with high affinity and specificity. However, conventional SELEX (Systematic Evolution of Ligands by EXponential enrichment) methods may not necessarily produce aptamers of desired affinity and specificity. Thus, to address these questions, this perspective is intended to suggest some approaches and tips along with novel selection methods to enhance evolution of aptamers. This perspective covers latest novel innovations as well as a broad range of well-established approaches to improve the individual binding parameters (aptamer affinity, avidity, specificity and/or selectivity) of aptamers during and/or post-SELEX. The advantages and limitations of individual aptamer selection methods and post-SELEX optimizations, along with rational approaches to overcome these limitations are elucidated in each case. Further the impact of chosen selection milieus, linker-systems, aptamer cocktails and detection modules utilized in conjunction with target-specific aptamers, on the overall assay performance are discussed in detail, each with its own advantages and limitations. The simple variations suggested are easily available for facile implementation during and/or post-SELEX to develop ultrasensitive and specific assays. Finally, success studies of established aptamer-based assays are discussed, highlighting how they utilized some of the suggested methodologies to develop commercially successful point-of-care diagnostic assays.

  11. Assessing groundwater vulnerability to agrichemical contamination in the Midwest US

    USGS Publications Warehouse

    Burkart, M.R.; Kolpin, D.W.; James, D.E.

    1999-01-01

    Agrichemicals (herbicides and nitrate) are significant sources of diffuse pollution to groundwater. Indirect methods are needed to assess the potential for groundwater contamination by diffuse sources because groundwater monitoring is too costly to adequately define the geographic extent of contamination at a regional or national scale. This paper presents examples of the application of statistical, overlay and index, and process-based modeling methods for groundwater vulnerability assessments to a variety of data from the Midwest U.S. The principles for vulnerability assessment include both intrinsic (pedologic, climatologic, and hydrogeologic factors) and specific (contaminant and other anthropogenic factors) vulnerability of a location. Statistical methods use the frequency of contaminant occurrence, contaminant concentration, or contamination probability as a response variable. Statistical assessments are useful for defining the relations among explanatory and response variables whether they define intrinsic or specific vulnerability. Multivariate statistical analyses are useful for ranking variables critical to estimating water quality responses of interest. Overlay and index methods involve intersecting maps of intrinsic and specific vulnerability properties and indexing the variables by applying appropriate weights. Deterministic models use process-based equations to simulate contaminant transport and are distinguished from the other methods in their potential to predict contaminant transport in both space and time. An example of a one-dimensional leaching model linked to a geographic information system (GIS) to define a regional metamodel for contamination in the Midwest is included.
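
    The overlay-and-index class of methods mentioned above amounts to a weighted sum of rated raster layers. The toy sketch below combines intrinsic and specific vulnerability layers with DRASTIC-style weights; the layers, ratings, and weights are placeholders, not values from the paper.

      import numpy as np

      # Hypothetical 2 x 2 rasters, each cell already rated 1-10 for vulnerability
      depth_to_water    = np.array([[9, 7], [5, 3]])   # intrinsic: shallower water table -> higher rating
      soil_permeability = np.array([[8, 6], [4, 2]])   # intrinsic
      net_recharge      = np.array([[7, 7], [5, 5]])   # intrinsic
      nitrogen_applied  = np.array([[9, 2], [8, 1]])   # specific (anthropogenic loading)

      layers  = [depth_to_water, soil_permeability, net_recharge, nitrogen_applied]
      weights = [5, 3, 4, 4]                           # assumed relative importance of each layer

      vulnerability_index = sum(w * layer for w, layer in zip(weights, layers))
      print(vulnerability_index)                       # higher cells = higher relative vulnerability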

  12. Visual Fast Mapping in School-Aged Children with Specific Language Impairment

    ERIC Educational Resources Information Center

    Alt, Mary

    2013-01-01

    Purpose: To determine whether children with specific language impairment (SLI) demonstrate impaired visual fast mapping skills compared with unimpaired peers and to test components of visual working memory that may contribute to a visual working memory deficit. Methods: Fifty children (25 SLI) played 2 computer-based visual fast mapping games…

  13. Disturbance ecology and forest management: A review of the literature

    Treesearch

    Paul Rogers

    1996-01-01

    This review of the disturbance ecology literature, and how it pertains to forest management, is a resource for forest managers and researchers interested in disturbance theory, specific disturbance agents, their interactions, and appropriate methods of inquiry for specific geographic regions. Implications for the future of disturbance ecology-based management are...

  14. SELECTING AND TRAINING THE TRAINING OFFICER.

    ERIC Educational Resources Information Center

    TAYLOR, NANCY

    To achieve the objectives of training in industry--technical and liberal education, specific job skills, and the development of attitudes--the training officer must know the company within which he is working, as well as management theory and training methods. The selection of training officers is based on a job specification, an outgrowth of a…

  15. Relationships of host plant phylogeny, chemistry and host plant specificity of several agents of yellow starthistle

    USDA-ARS?s Scientific Manuscript database

    Plant species used for host specificity testing are usually chosen based on the assumption that the risk of attack by a prospective biological control agent decreases with increasing phylogenetic distance from the target weed. Molecular genetics methods have greatly improved our ability to measure ...

  16. A Classification Scheme for Adult Education. Education Libraries Bulletin, Supplement Twelve.

    ERIC Educational Resources Information Center

    Greaves, Monica A., Comp.

    This classification scheme, based on the 'facet formula' theory of Ranganathan, is designed primarily for the library of the National Institute of Adult Education in London, England. Kinds of persons being educated (educands), methods and problems of education, specific countries, specific organizations, and forms in which the information is…

  17. Experimental verification of a real-time tuning method of a model-based controller by perturbations to its poles

    NASA Astrophysics Data System (ADS)

    Kajiwara, Itsuro; Furuya, Keiichiro; Ishizuka, Shinichi

    2018-07-01

    Model-based controllers with adaptive design variables are often used to control an object with time-dependent characteristics. However, the controller's performance is influenced by many factors such as modeling accuracy and fluctuations in the object's characteristics. One method to overcome these negative factors is to tune model-based controllers. Herein we propose an online tuning method to maintain control performance for an object that exhibits time-dependent variations. The proposed method employs the poles of the controller as design variables because the poles significantly impact performance. Specifically, we use the simultaneous perturbation stochastic approximation (SPSA) to optimize a model-based controller with multiple design variables. Moreover, a vibration control experiment on an object whose characteristics vary with temperature demonstrates that the proposed method allows adaptive control and stably maintains the closed-loop characteristics.
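
    The SPSA step at the heart of the tuning method uses only two noisy cost evaluations per iteration to form a gradient estimate. The sketch below is a generic SPSA minimizer applied to a toy cost standing in for measured closed-loop performance as a function of two controller poles; the gain constants follow common SPSA defaults and are not taken from the paper.

      import numpy as np

      def spsa_minimize(cost, theta0, iters=200, a=0.1, c=0.1, alpha=0.602, gamma=0.101, seed=0):
          """Simultaneous perturbation stochastic approximation with Rademacher perturbations."""
          rng = np.random.default_rng(seed)
          theta = np.asarray(theta0, dtype=float)
          for k in range(1, iters + 1):
              ak, ck = a / k**alpha, c / k**gamma
              delta = rng.choice([-1.0, 1.0], size=theta.size)
              g_hat = (cost(theta + ck * delta) - cost(theta - ck * delta)) / (2 * ck * delta)
              theta = theta - ak * g_hat
          return theta

      # Toy stand-in: "performance" is worst when the poles drift away from a target location
      target_poles = np.array([-2.0, -5.0])
      noisy_cost = lambda p: float(np.sum((p - target_poles) ** 2) + np.random.normal(0, 0.01))
      print(spsa_minimize(noisy_cost, theta0=[-1.0, -1.0]))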

  18. Molecular beacon probes-base multiplex NASBA Real-time for detection of HIV-1 and HCV.

    PubMed

    Mohammadi-Yeganeh, S; Paryan, M; Mirab Samiee, S; Kia, V; Rezvan, H

    2012-06-01

    Developed in 1991, nucleic acid sequence-based amplification (NASBA) has been introduced as a rapid molecular diagnostic technique, where it has been shown to give quicker results than PCR, and it can also be more sensitive. This paper describes the development of a molecular beacon-based multiplex NASBA assay for simultaneous detection of HIV-1 and HCV in plasma samples. A well-conserved region in the HIV-1 pol gene and 5'-NCR of HCV genome were used for primers and molecular beacon design. The performance features of HCV/HIV-1 multiplex NASBA assay including analytical sensitivity and specificity, clinical sensitivity and clinical specificity were evaluated. The analysis of scalar concentrations of the samples indicated that the limit of quantification of the assay was <1000 copies/ml for HIV-1 and <500 copies/ml for HCV with 95% confidence interval. Multiplex NASBA assay showed a 98% sensitivity and 100% specificity. The analytical specificity study with BLAST software demonstrated that the primers do not attach to any other sequences except for that of HIV-1 or HCV. The primers and molecular beacon probes detected all HCV genotypes and all major variants of HIV-1. This method may represent a relatively inexpensive isothermal method for detection of HIV-1/HCV co-infection in monitoring of patients.

  19. Modeling and E-M estimation of haplotype-specific relative risks from genotype data for a case-control study of unrelated individuals.

    PubMed

    Stram, Daniel O; Leigh Pearce, Celeste; Bretsky, Phillip; Freedman, Matthew; Hirschhorn, Joel N; Altshuler, David; Kolonel, Laurence N; Henderson, Brian E; Thomas, Duncan C

    2003-01-01

    The US National Cancer Institute has recently sponsored the formation of a Cohort Consortium (http://2002.cancer.gov/scpgenes.htm) to facilitate the pooling of data on very large numbers of people, concerning the effects of genes and environment on cancer incidence. One likely goal of these efforts will be to generate a large population-based case-control series for which a number of candidate genes will be investigated using SNP haplotype as well as genotype analysis. The goal of this paper is to outline the issues involved in choosing a method for estimating haplotype-specific risks for such data that is technically appropriate and yet attractive to epidemiologists who are already comfortable with odds ratios and logistic regression. Our interest is to develop and evaluate extensions of methods, based on haplotype imputation, that have been recently described (Schaid et al., Am J Hum Genet, 2002, and Zaykin et al., Hum Hered, 2002) as providing score tests of the null hypothesis of no effect of SNP haplotypes upon risk, which may be used for more complex tasks, such as providing confidence intervals, and tests of equivalence of haplotype-specific risks in two or more separate populations. In order to do so we (1) develop a cohort approach towards odds ratio analysis by expanding the E-M algorithm to provide maximum likelihood estimates of haplotype-specific odds ratios as well as genotype frequencies; (2) show how to correct the cohort approach, to give essentially unbiased estimates for population-based or nested case-control studies by incorporating the probability of selection as a case or control into the likelihood, based on a simplified model of case and control selection, and (3) finally, in an example data set (CYP17 and breast cancer, from the Multiethnic Cohort Study) we compare likelihood-based confidence interval estimates from the two methods with each other, and with the use of the single-imputation approach of Zaykin et al. applied under both null and alternative hypotheses. We conclude that so long as haplotypes are well predicted by SNP genotypes (we use the Rh2 criteria of Stram et al. [1]) the differences between the three methods are very small and in particular that the single imputation method may be expected to work extremely well. Copyright 2003 S. Karger AG, Basel
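
    To make the E-M step concrete, the following toy Python sketch estimates haplotype frequencies for two biallelic SNPs from unphased genotypes; only double heterozygotes have ambiguous phase. It does not include the case-control ascertainment correction or the odds-ratio modelling developed in the paper.

      import numpy as np
      from itertools import combinations_with_replacement

      HAPS = ["AB", "Ab", "aB", "ab"]
      ALLELES = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])   # minor-allele content of each haplotype

      def compatible_pairs(genotype):
          """All unordered haplotype pairs whose summed allele counts match the genotype (g1, g2)."""
          return [(i, j) for i, j in combinations_with_replacement(range(4), 2)
                  if tuple(ALLELES[i] + ALLELES[j]) == tuple(genotype)]

      def em_haplotype_freqs(genotypes, n_iter=50):
          freqs = np.full(4, 0.25)
          pair_lists = [compatible_pairs(g) for g in genotypes]
          for _ in range(n_iter):
              counts = np.zeros(4)
              for pairs in pair_lists:
                  # E-step: posterior probability of each compatible phase under current frequencies
                  w = np.array([(2 - (i == j)) * freqs[i] * freqs[j] for i, j in pairs])
                  w /= w.sum()
                  for (i, j), wk in zip(pairs, w):
                      counts[i] += wk
                      counts[j] += wk
              freqs = counts / counts.sum()                  # M-step
          return dict(zip(HAPS, np.round(freqs, 3)))

      # Example: genotypes as minor-allele counts at (SNP1, SNP2); (1, 1) individuals are phase-ambiguous
      genotypes = [(0, 0), (0, 0), (2, 2), (1, 1), (1, 1), (1, 0), (0, 1), (2, 1)]
      print(em_haplotype_freqs(genotypes))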

  20. Switching industrial production processes from complex to defined media: method development and case study using the example of Penicillium chrysogenum.

    PubMed

    Posch, Andreas E; Spadiut, Oliver; Herwig, Christoph

    2012-06-22

    Filamentous fungi are versatile cell factories and widely used for the production of antibiotics, organic acids, enzymes and other industrially relevant compounds at large scale. As a fact, industrial production processes employing filamentous fungi are commonly based on complex raw materials. However, considerable lot-to-lot variability of complex media ingredients not only demands for exhaustive incoming components inspection and quality control, but unavoidably affects process stability and performance. Thus, switching bioprocesses from complex to defined media is highly desirable. This study presents a strategy for strain characterization of filamentous fungi on partly complex media using redundant mass balancing techniques. Applying the suggested method, interdependencies between specific biomass and side-product formation rates, production of fructooligosaccharides, specific complex media component uptake rates and fungal strains were revealed. A 2-fold increase of the overall penicillin space time yield and a 3-fold increase in the maximum specific penicillin formation rate were reached in defined media compared to complex media. The newly developed methodology enabled fast characterization of two different industrial Penicillium chrysogenum candidate strains on complex media based on specific complex media component uptake kinetics and identification of the most promising strain for switching the process from complex to defined conditions. Characterization at different complex/defined media ratios using only a limited number of analytical methods allowed maximizing the overall industrial objectives of increasing both, method throughput and the generation of scientific process understanding.

  1. Switching industrial production processes from complex to defined media: method development and case study using the example of Penicillium chrysogenum

    PubMed Central

    2012-01-01

    Background Filamentous fungi are versatile cell factories and widely used for the production of antibiotics, organic acids, enzymes and other industrially relevant compounds at large scale. As a fact, industrial production processes employing filamentous fungi are commonly based on complex raw materials. However, considerable lot-to-lot variability of complex media ingredients not only demands for exhaustive incoming components inspection and quality control, but unavoidably affects process stability and performance. Thus, switching bioprocesses from complex to defined media is highly desirable. Results This study presents a strategy for strain characterization of filamentous fungi on partly complex media using redundant mass balancing techniques. Applying the suggested method, interdependencies between specific biomass and side-product formation rates, production of fructooligosaccharides, specific complex media component uptake rates and fungal strains were revealed. A 2-fold increase of the overall penicillin space time yield and a 3-fold increase in the maximum specific penicillin formation rate were reached in defined media compared to complex media. Conclusions The newly developed methodology enabled fast characterization of two different industrial Penicillium chrysogenum candidate strains on complex media based on specific complex media component uptake kinetics and identification of the most promising strain for switching the process from complex to defined conditions. Characterization at different complex/defined media ratios using only a limited number of analytical methods allowed maximizing the overall industrial objectives of increasing both, method throughput and the generation of scientific process understanding. PMID:22727013

  2. Optimizing Cognitive Rehabilitation: Effective Instructional Methods

    ERIC Educational Resources Information Center

    Sohlberg, McKay Moore; Turkstra, Lyn S.

    2011-01-01

    Rehabilitation professionals face a key challenge when working with clients with acquired cognitive impairments: how to teach new skills to individuals who have difficulty learning. Unique in its focus, this book presents evidence-based instructional methods specifically designed to help this population learn more efficiently. The expert authors…

  3. Field-based evaluation of a male-specific (F+) RNA coliphage concentration method

    EPA Science Inventory

    Fecal contamination of water poses a significant risk to public health due to the potential presence of pathogens, including enteric viruses. Thus, sensitive, reliable and easy to use methods for the detection of microorganisms are needed to evaluate water quality. In this stud...

  4. New Performance Metrics for Quantitative Polymerase Chain Reaction-Based Microbial Source Tracking Methods

    EPA Science Inventory

    Binary sensitivity and specificity metrics are not adequate to describe the performance of quantitative microbial source tracking methods because the estimates depend on the amount of material tested and limit of detection. We introduce a new framework to compare the performance ...

  5. Particle-based and meshless methods with Aboria

    NASA Astrophysics Data System (ADS)

    Robinson, Martin; Bruna, Maria

    Aboria is a powerful and flexible C++ library for the implementation of particle-based numerical methods. The particles in such methods can represent actual particles (e.g. Molecular Dynamics) or abstract particles used to discretise a continuous function over a domain (e.g. Radial Basis Functions). Aboria provides a particle container, compatible with the Standard Template Library, spatial search data structures, and a Domain Specific Language to specify non-linear operators on the particle set. This paper gives an overview of Aboria's design, an example of use, and a performance benchmark.

  6. Predicting metabolic syndrome using decision tree and support vector machine methods.

    PubMed

    Karimi-Alavijeh, Farzaneh; Jalili, Saeed; Sadeghi, Masoumeh

    2016-05-01

    Metabolic syndrome, which underlies the increased prevalence of cardiovascular disease and Type 2 diabetes, is considered a group of metabolic abnormalities including central obesity, hypertriglyceridemia, glucose intolerance, hypertension, and dyslipidemia. Recently, artificial intelligence-based health-care systems have been highly regarded because of their success in diagnosis, prediction, and choice of treatment. This study employs machine learning techniques to predict metabolic syndrome. Specifically, it aims to employ decision tree and support vector machine (SVM) methods to predict the 7-year incidence of metabolic syndrome. This research is a practical one in which data from 2107 participants of the Isfahan Cohort Study have been utilized. The subjects without metabolic syndrome according to the ATPIII criteria were selected. The features used in this data set include: gender, age, weight, body mass index, waist circumference, waist-to-hip ratio, hip circumference, physical activity, smoking, hypertension, antihypertensive medication use, systolic blood pressure (BP), diastolic BP, fasting blood sugar, 2-hour blood glucose, triglycerides (TGs), total cholesterol, low-density lipoprotein, high density lipoprotein-cholesterol, mean corpuscular volume, and mean corpuscular hemoglobin. Metabolic syndrome was diagnosed based on ATPIII criteria, and the two methods, decision tree and SVM, were selected to predict metabolic syndrome. The criteria of sensitivity, specificity and accuracy were used for validation. Sensitivity, specificity and accuracy were 0.774 (0.758), 0.74 (0.72) and 0.757 (0.739) for the SVM (decision tree) method. The results show that the SVM method is more efficient than the decision tree in terms of sensitivity, specificity and accuracy. The results of the decision tree method show that TG is the most important feature in predicting metabolic syndrome. According to this study, in cases where only the final result of the decision is regarded as significant, the SVM method can be used with acceptable accuracy for medical decision-making. This method has not been implemented in previous research.
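
    A compact scikit-learn version of the comparison (on synthetic data, not the Isfahan cohort) shows how sensitivity, specificity, and accuracy would be computed for the two classifiers; feature counts and hyperparameters are illustrative assumptions.

      import numpy as np
      from sklearn.datasets import make_classification
      from sklearn.model_selection import train_test_split
      from sklearn.pipeline import make_pipeline
      from sklearn.preprocessing import StandardScaler
      from sklearn.svm import SVC
      from sklearn.tree import DecisionTreeClassifier
      from sklearn.metrics import confusion_matrix, accuracy_score

      # Synthetic stand-in for the 21 cohort features (age, BMI, waist circumference, TG, ...)
      X, y = make_classification(n_samples=2000, n_features=21, n_informative=8, random_state=0)
      X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

      models = {
          "SVM": make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0)),
          "decision tree": DecisionTreeClassifier(max_depth=5, random_state=0),
      }
      for name, model in models.items():
          y_hat = model.fit(X_tr, y_tr).predict(X_te)
          tn, fp, fn, tp = confusion_matrix(y_te, y_hat).ravel()
          print(f"{name}: sensitivity={tp/(tp+fn):.3f} specificity={tn/(tn+fp):.3f} "
                f"accuracy={accuracy_score(y_te, y_hat):.3f}")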

  7. Matrix Assisted Laser Desorption Ionization Mass Spectrometric Analysis of Bacillus anthracis: From Fingerprint Analysis of the Bacterium to Quantification of its Toxins in Clinical Samples

    NASA Astrophysics Data System (ADS)

    Woolfitt, Adrian R.; Boyer, Anne E.; Quinn, Conrad P.; Hoffmaster, Alex R.; Kozel, Thomas R.; de, Barun K.; Gallegos, Maribel; Moura, Hercules; Pirkle, James L.; Barr, John R.

    A range of mass spectrometry-based techniques have been used to identify, characterize and differentiate Bacillus anthracis, both in culture for forensic applications and for diagnosis during infection. This range of techniques could usefully be considered to exist as a continuum, based on the degrees of specificity involved. We show two examples here, a whole-organism fingerprinting method and a high-specificity assay for one unique protein, anthrax lethal factor.

  8. Defining the wheat gluten peptide fingerprint via a discovery and targeted proteomics approach.

    PubMed

    Martínez-Esteso, María José; Nørgaard, Jørgen; Brohée, Marcel; Haraszi, Reka; Maquet, Alain; O'Connor, Gavin

    2016-09-16

    Accurate, reliable and sensitive detection methods for gluten are required to support current EU regulations. The enforcement of legislative levels requires that measurement results are comparable over time and between methods. This is not a trivial task for gluten which comprises a large number of protein targets. This paper describes a strategy for defining a set of specific analytical targets for wheat gluten. A comprehensive proteomic approach was applied by fractionating wheat gluten using RP-HPLC (reversed phase high performance liquid chromatography) followed by a multi-enzymatic digestion (LysC, trypsin and chymotrypsin) with subsequent mass spectrometric analysis. This approach identified 434 peptide sequences from gluten. Peptides were grouped based on two criteria: unique to a single gluten protein sequence; contained known immunogenic and toxic sequences in the context of coeliac disease. An LC-MS/MS method based on selected reaction monitoring (SRM) was developed on a triple quadrupole mass spectrometer for the specific detection of the target peptides. The SRM based screening approach was applied to gluten containing cereals (wheat, rye, barley and oats) and non-gluten containing flours (corn, soy and rice). A unique set of wheat gluten marker peptides were identified and are proposed as wheat specific markers. The measurement of gluten in processed food products in support of regulatory limits is performed routinely. Mass spectrometry is emerging as a viable alternative to ELISA based methods. Here we outline a set of peptide markers that are representative of gluten and consider the end user's needs in protecting those with coeliac disease. The approach taken has been applied to wheat but can be easily extended to include other species potentially enabling the MS quantification of different gluten containing species from the identified markers. Copyright © 2016 The Authors. Published by Elsevier B.V. All rights reserved.

  9. Improving binding mode and binding affinity predictions of docking by ligand-based search of protein conformations: evaluation in D3R grand challenge 2015

    NASA Astrophysics Data System (ADS)

    Xu, Xianjin; Yan, Chengfei; Zou, Xiaoqin

    2017-08-01

    The growing number of protein-ligand complex structures, particularly the structures of proteins co-bound with different ligands, in the Protein Data Bank helps us tackle two major challenges in molecular docking studies: the protein flexibility and the scoring function. Here, we introduced a systematic strategy by using the information embedded in the known protein-ligand complex structures to improve both binding mode and binding affinity predictions. Specifically, a ligand similarity calculation method was employed to search for a receptor structure with a bound ligand sharing high similarity with the query ligand for docking. The strategy was applied to the two datasets (HSP90 and MAP4K4) in the recent D3R Grand Challenge 2015. In addition, for the HSP90 dataset, a system-specific scoring function (ITScore2_hsp90) was generated by recalibrating our statistical potential-based scoring function (ITScore2) using the known protein-ligand complex structures and the statistical mechanics-based iterative method. For the HSP90 dataset, better performances were achieved for both binding mode and binding affinity predictions compared with the original ITScore2 and with ensemble docking. For the MAP4K4 dataset, although there were only eight known protein-ligand complex structures, our docking strategy achieved a performance comparable with ensemble docking. Our method for receptor conformational selection and iterative method for the development of system-specific statistical potential-based scoring functions can be easily applied to other protein targets that have a number of protein-ligand complex structures available to improve predictions on binding.
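
    The receptor-selection step can be sketched with RDKit using Morgan fingerprints and Tanimoto similarity as a simple stand-in for the paper's ligand similarity calculation; the PDB identifiers and SMILES strings below are hypothetical.

      from rdkit import Chem, DataStructs
      from rdkit.Chem import AllChem

      def most_similar_cobound_ligand(query_smiles, receptor_library):
          """Pick the receptor whose co-bound ligand is most similar to the query ligand;
          that receptor conformation is then used for docking."""
          query_fp = AllChem.GetMorganFingerprintAsBitVect(Chem.MolFromSmiles(query_smiles), 2, nBits=2048)
          scored = []
          for pdb_id, ligand_smiles in receptor_library.items():
              fp = AllChem.GetMorganFingerprintAsBitVect(Chem.MolFromSmiles(ligand_smiles), 2, nBits=2048)
              scored.append((DataStructs.TanimotoSimilarity(query_fp, fp), pdb_id))
          return max(scored)

      # Hypothetical co-crystal library: PDB id -> SMILES of its bound ligand
      library = {"2XAB": "c1ccccc1C(=O)NC2CCNCC2", "3HEK": "CCOC(=O)c1ccc(N)cc1"}
      print(most_similar_cobound_ligand("c1ccccc1C(=O)NC2CCOCC2", library))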

  10. An innovative SNP genotyping method adapting to multiple platforms and throughputs.

    PubMed

    Long, Y M; Chao, W S; Ma, G J; Xu, S S; Qi, L L

    2017-03-01

    An innovative genotyping method designated as semi-thermal asymmetric reverse PCR (STARP) was developed for genotyping individual SNPs with improved accuracy, flexible throughputs, low operational costs, and high platform compatibility. Multiplex chip-based technology for genome-scale genotyping of single nucleotide polymorphisms (SNPs) has made great progress in the past two decades. However, PCR-based genotyping of individual SNPs still remains problematic in accuracy, throughput, simplicity, and/or operational costs as well as the compatibility with multiple platforms. Here, we report a novel SNP genotyping method designated semi-thermal asymmetric reverse PCR (STARP). In this method, genotyping assay was performed under unique PCR conditions using two universal priming element-adjustable primers (PEA-primers) and one group of three locus-specific primers: two asymmetrically modified allele-specific primers (AMAS-primers) and their common reverse primer. The two AMAS-primers each were substituted one base in different positions at their 3' regions to significantly increase the amplification specificity of the two alleles and tailed at 5' ends to provide priming sites for PEA-primers. The two PEA-primers were developed for common use in all genotyping assays to stringently target the PCR fragments generated by the two AMAS-primers with similar PCR efficiencies and for flexible detection using either gel-free fluorescence signals or gel-based size separation. The state-of-the-art primer design and unique PCR conditions endowed STARP with all the major advantages of high accuracy, flexible throughputs, simple assay design, low operational costs, and platform compatibility. In addition to SNPs, STARP can also be employed in genotyping of indels (insertion-deletion polymorphisms). As vast variations in DNA sequences are being unearthed by many genome sequencing projects and genotyping by sequencing, STARP will have wide applications across all biological organisms in agriculture, medicine, and forensics.

  11. Rapid PCR-mediated synthesis of competitor molecules for accurate quantification of beta(2) GABA(A) receptor subunit mRNA.

    PubMed

    Vela, J; Vitorica, J; Ruano, D

    2001-12-01

    We describe a fast and easy method for the synthesis of competitor molecules based on non-specific conditions of PCR. RT-competitive PCR is a sensitive technique that allows quantification of very low quantities of mRNA molecules in small tissue samples. This technique is based on the competition established between the native and standard templates for nucleotides, primers or other factors during PCR. Thus, the most critical parameter is the use of good internal standards to generate a standard curve from which the amount of native sequences can be properly estimated. At the present time different types of internal standards and methods for their synthesis have been described. Normally, most of these methods are time-consuming and require the use of different sets of primers, different rounds of PCR or specific modifications, such as site-directed mutagenesis, that need subsequent analysis of the PCR products. Using our method, we obtained in a single round of PCR and with the same primer pair, competitor molecules that were successfully used in RT-competitive PCR experiments. The principal advantage of this method is high versatility and economy. Theoretically it is possible to synthesize a specific competitor molecule for each primer pair used. Finally, using this method we have been able to quantify the increase in the expression of the beta(2) GABA(A) receptor subunit mRNA that occurs during rat hippocampus development.

  12. Advances in serological, imaging techniques and molecular diagnosis of Toxoplasma gondii infection.

    PubMed

    Rostami, Ali; Karanis, Panagiotis; Fallahi, Shirzad

    2018-06-01

    Toxoplasmosis is a worldwide distributed zoonotic infectious disease with medical importance in immunocompromised patients, pregnant women and congenitally infected newborns. Having basic information on the traditional and newly developed methods is essential for general physicians and infectious disease specialists for choosing a suitable diagnostic approach for rapid and accurate diagnosis of the disease and, consequently, timely and effective treatment. We conducted English literature searches in PubMed from 1989 to 2016 using relevant keywords and summarized the recent advances in diagnosis of toxoplasmosis. Enzyme-linked immunosorbent assay (ELISA) was the most used method in the past century. Recently, advanced ELISA-based methods including chemiluminescence assays (CLIA), enzyme-linked fluorescence assay (ELFA), immunochromatographic test (ICT), serum IgG avidity test and immunosorbent agglutination assays (ISAGA) have shown high sensitivity and specificity. Recent studies using recombinant or chimeric antigens and multiepitope peptide methods demonstrated very promising results for the development of new strategies capable of discriminating recently acquired infections from chronic infection. Real-time PCR and loop-mediated isothermal amplification (LAMP) are two recently developed PCR-based methods with high sensitivity and specificity and could be useful for early diagnosis of infection. Computed tomography, magnetic resonance imaging, nuclear imaging and ultrasonography could be useful, although their results might not be specific alone. This review provides a summary of recently developed methods and also attempts to improve their sensitivity for diagnosis of toxoplasmosis. Serological, molecular and imaging technologies each have their own advantages and limitations, and combining these diagnostic techniques can achieve a definitive diagnosis of toxoplasmosis.

  13. Normalization to specific gravity prior to analysis improves information recovery from high resolution mass spectrometry metabolomic profiles of human urine.

    PubMed

    Edmands, William M B; Ferrari, Pietro; Scalbert, Augustin

    2014-11-04

    Extraction of meaningful biological information from urinary metabolomic profiles obtained by liquid-chromatography coupled to mass spectrometry (MS) necessitates the control of unwanted sources of variability associated with large differences in urine sample concentrations. Different methods of normalization either before analysis (preacquisition normalization) through dilution of urine samples to the lowest specific gravity measured by refractometry, or after analysis (postacquisition normalization) to urine volume, specific gravity and median fold change are compared for their capacity to recover lead metabolites for a potential future use as dietary biomarkers. Twenty-four urine samples of 19 subjects from the European Prospective Investigation into Cancer and nutrition (EPIC) cohort were selected based on their high and low/nonconsumption of six polyphenol-rich foods as assessed with a 24 h dietary recall. MS features selected on the basis of minimum discriminant selection criteria were related to each dietary item by means of orthogonal partial least-squares discriminant analysis models. Normalization methods ranked in the following decreasing order when comparing the number of total discriminant MS features recovered to that obtained in the absence of normalization: preacquisition normalization to specific gravity (4.2-fold), postacquisition normalization to specific gravity (2.3-fold), postacquisition median fold change normalization (1.8-fold increase), postacquisition normalization to urinary volume (0.79-fold). A preventative preacquisition normalization based on urine specific gravity was found to be superior to all curative postacquisition normalization methods tested for discovery of MS features discriminant of dietary intake in these urinary metabolomic datasets.
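
    The specific gravity correction itself is a one-line scaling by (SG - 1) ratios; the sketch below shows both the post-acquisition form and the dilution factor used for the pre-acquisition variant. The reference specific gravity of 1.020 is an arbitrary illustrative choice.

      import numpy as np

      def sg_normalize_postacquisition(intensities, sg_sample, sg_reference=1.020):
          """Scale feature intensities by the ratio of (specific gravity - 1) values."""
          return intensities * (sg_reference - 1.0) / (sg_sample - 1.0)

      def sg_dilution_factor_preacquisition(sg_sample, sg_lowest):
          """Fold-dilution needed to bring each sample down to the lowest specific gravity in the batch."""
          return (sg_sample - 1.0) / (sg_lowest - 1.0)

      # Hypothetical refractometry readings and one feature's raw intensities
      sg = np.array([1.010, 1.018, 1.030])
      raw = np.array([1.0e5, 1.9e5, 3.2e5])
      print(sg_normalize_postacquisition(raw, sg))
      print(sg_dilution_factor_preacquisition(sg, sg_lowest=sg.min()))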

  14. Accuracy of dementia diagnosis: a direct comparison between radiologists and a computerized method.

    PubMed

    Klöppel, Stefan; Stonnington, Cynthia M; Barnes, Josephine; Chen, Frederick; Chu, Carlton; Good, Catriona D; Mader, Irina; Mitchell, L Anne; Patel, Ameet C; Roberts, Catherine C; Fox, Nick C; Jack, Clifford R; Ashburner, John; Frackowiak, Richard S J

    2008-11-01

    There has been recent interest in the application of machine learning techniques to neuroimaging-based diagnosis. These methods promise fully automated, standard PC-based clinical decisions, unbiased by variable radiological expertise. We recently used support vector machines (SVMs) to separate sporadic Alzheimer's disease from normal ageing and from fronto-temporal lobar degeneration (FTLD). In this study, we compare the results to those obtained by radiologists. A binary diagnostic classification was made by six radiologists with different levels of experience on the same scans and information that had been previously analysed with SVM. SVMs correctly classified 95% (sensitivity/specificity: 95/95) of sporadic Alzheimer's disease and controls into their respective groups. Radiologists correctly classified 65-95% (median 89%; sensitivity/specificity: 88/90) of scans. SVM correctly classified another set of sporadic Alzheimer's disease in 93% (sensitivity/specificity: 100/86) of cases, whereas radiologists ranged between 80% and 90% (median 83%; sensitivity/specificity: 80/85). SVMs were better at separating patients with sporadic Alzheimer's disease from those with FTLD (SVM 89%; sensitivity/specificity: 83/95; compared to radiological range from 63% to 83%; median 71%; sensitivity/specificity: 64/76). Radiologists were always accurate when they reported a high degree of diagnostic confidence. The results show that well-trained neuroradiologists classify typical Alzheimer's disease-associated scans comparable to SVMs. However, SVMs require no expert knowledge and trained SVMs can readily be exchanged between centres for use in diagnostic classification. These results are encouraging and indicate a role for computerized diagnostic methods in clinical practice.

  15. Accuracy of dementia diagnosis—a direct comparison between radiologists and a computerized method

    PubMed Central

    Stonnington, Cynthia M.; Barnes, Josephine; Chen, Frederick; Chu, Carlton; Good, Catriona D.; Mader, Irina; Mitchell, L. Anne; Patel, Ameet C.; Roberts, Catherine C.; Fox, Nick C.; Jack, Clifford R.; Ashburner, John; Frackowiak, Richard S. J.

    2008-01-01

    There has been recent interest in the application of machine learning techniques to neuroimaging-based diagnosis. These methods promise fully automated, standard PC-based clinical decisions, unbiased by variable radiological expertise. We recently used support vector machines (SVMs) to separate sporadic Alzheimer's disease from normal ageing and from fronto-temporal lobar degeneration (FTLD). In this study, we compare the results to those obtained by radiologists. A binary diagnostic classification was made by six radiologists with different levels of experience on the same scans and information that had been previously analysed with SVM. SVMs correctly classified 95% (sensitivity/specificity: 95/95) of sporadic Alzheimer's disease and controls into their respective groups. Radiologists correctly classified 65–95% (median 89%; sensitivity/specificity: 88/90) of scans. SVM correctly classified another set of sporadic Alzheimer's disease in 93% (sensitivity/specificity: 100/86) of cases, whereas radiologists ranged between 80% and 90% (median 83%; sensitivity/specificity: 80/85). SVMs were better at separating patients with sporadic Alzheimer's disease from those with FTLD (SVM 89%; sensitivity/specificity: 83/95; compared to radiological range from 63% to 83%; median 71%; sensitivity/specificity: 64/76). Radiologists were always accurate when they reported a high degree of diagnostic confidence. The results show that well-trained neuroradiologists classify typical Alzheimer's disease-associated scans comparable to SVMs. However, SVMs require no expert knowledge and trained SVMs can readily be exchanged between centres for use in diagnostic classification. These results are encouraging and indicate a role for computerized diagnostic methods in clinical practice. PMID:18835868

  16. Ensemble-based methods for forecasting census in hospital units

    PubMed Central

    2013-01-01

    Background The ability to accurately forecast census counts in hospital departments has considerable implications for hospital resource allocation. In recent years several different methods have been proposed forecasting census counts, however many of these approaches do not use available patient-specific information. Methods In this paper we present an ensemble-based methodology for forecasting the census under a framework that simultaneously incorporates both (i) arrival trends over time and (ii) patient-specific baseline and time-varying information. The proposed model for predicting census has three components, namely: current census count, number of daily arrivals and number of daily departures. To model the number of daily arrivals, we use a seasonality adjusted Poisson Autoregressive (PAR) model where the parameter estimates are obtained via conditional maximum likelihood. The number of daily departures is predicted by modeling the probability of departure from the census using logistic regression models that are adjusted for the amount of time spent in the census and incorporate both patient-specific baseline and time varying patient-specific covariate information. We illustrate our approach using neonatal intensive care unit (NICU) data collected at Women & Infants Hospital, Providence RI, which consists of 1001 consecutive NICU admissions between April 1st 2008 and March 31st 2009. Results Our results demonstrate statistically significant improved prediction accuracy for 3, 5, and 7 day census forecasts and increased precision of our forecasting model compared to a forecasting approach that ignores patient-specific information. Conclusions Forecasting models that utilize patient-specific baseline and time-varying information make the most of data typically available and have the capacity to substantially improve census forecasts. PMID:23721123
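
    The census bookkeeping behind the framework (census tomorrow = census today + arrivals - departures) can be sketched as a small Monte Carlo simulation; the arrival rates and the logistic departure probability below are invented placeholders rather than the fitted PAR and logistic regression models of the paper.

      import numpy as np

      rng = np.random.default_rng(3)

      def forecast_census(current_los, daily_arrival_rates, departure_prob, horizon=7):
          """One simulated census trajectory; current_los holds each current patient's length of stay."""
          census = list(current_los)
          trajectory = []
          for day in range(horizon):
              stayers = [los + 1 for los in census if rng.random() > departure_prob(los)]
              arrivals = rng.poisson(daily_arrival_rates[day % len(daily_arrival_rates)])
              census = stayers + [0] * arrivals          # new admissions start at LOS 0
              trajectory.append(len(census))
          return trajectory

      # Assumed inputs: current patients' LOS (days), a weekly arrival-rate profile,
      # and a discharge probability that rises with length of stay
      departure_prob = lambda los: 1.0 / (1.0 + np.exp(-(0.15 * los - 3.0)))
      print(forecast_census([2, 5, 11, 30, 1, 8], [2.1, 2.4, 2.0, 1.8, 2.2, 1.5, 1.3], departure_prob))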

  17. Developing rapid methods for analyzing upland riparian functions and values.

    PubMed

    Hruby, Thomas

    2009-06-01

    Regulators protecting riparian areas need to understand the integrity, health, beneficial uses, functions, and values of this resource. Up to now most methods providing information about riparian areas are based on analyzing condition or integrity. These methods, however, provide little information about functions and values. Different methods are needed that specifically address this aspect of riparian areas. In addition to information on functions and values, regulators have very specific needs that include: an analysis at the site scale, low cost, usability, and inclusion of policy interpretations. To meet these needs a rapid method has been developed that uses a multi-criteria decision matrix to categorize riparian areas in Washington State, USA. Indicators are used to identify the potential of the site to provide a function, the potential of the landscape to support the function, and the value the function provides to society. To meet legal needs fixed boundaries for assessment units are established based on geomorphology, the distance from "Ordinary High Water Mark" and different categories of land uses. Assessment units are first classified based on ecoregions, geomorphic characteristics, and land uses. This simplifies the data that need to be collected at a site, but it requires developing and calibrating a separate model for each "class." The approach to developing methods is adaptable to other locations as its basic structure is not dependent on local conditions.

  18. Feature selection from hyperspectral imaging for guava fruit defects detection

    NASA Astrophysics Data System (ADS)

    Mat Jafri, Mohd. Zubir; Tan, Sou Ching

    2017-06-01

    Development of technology has made hyperspectral imaging commonly used for defect detection. In this research, a hyperspectral imaging system was set up in the lab to target guava fruit defect detection. Guava fruit was selected as the object because, to our knowledge, few attempts have been made at guava defect detection based on hyperspectral imaging. A common fluorescent light source was used to represent an uncontrolled lighting condition in the lab, and analysis was carried out in a specific wavelength range due to the inefficiency of this particular light source. Based on the data, the reflectance intensity of this specific setup could be categorized into two groups. Sequential feature selection with linear discriminant (LD) and quadratic discriminant (QD) functions was used to select features that could potentially be used in defect detection. Besides the ordinary training method, the discriminant training dataset was separated into two parts, corresponding to the brighter and dimmer areas, to cater for the uncontrolled lighting condition. Four combinations were evaluated: LD with the common training method, QD with the common training method, LD with the two-part training method and QD with the two-part training method. These combinations were evaluated using the F1-score over a total of 48 defect areas. Experiments showed that the F1-score of the linear discriminant with the compensated (two-part) training method reached 0.8, the highest score among all.
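
    The sketch below illustrates the evaluation pattern described above: sequential feature (band) selection wrapped around LD and QD classifiers, scored by F1, applied to synthetic reflectance data; the band counts and data are invented placeholders.

```python
# Sketch: sequential (wavelength-band) feature selection with LDA/QDA classifiers,
# scored by F1; the reflectance spectra here are synthetic stand-ins for hyperspectral pixels.
import numpy as np
from sklearn.discriminant_analysis import (LinearDiscriminantAnalysis,
                                           QuadraticDiscriminantAnalysis)
from sklearn.feature_selection import SequentialFeatureSelector
from sklearn.model_selection import train_test_split
from sklearn.metrics import f1_score

rng = np.random.default_rng(2)
n_pixels, n_bands = 400, 60
X = rng.normal(size=(n_pixels, n_bands))
y = (X[:, 10] + 0.8 * X[:, 25] + rng.normal(0, 0.5, n_pixels) > 0).astype(int)  # 1 = defect

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0, stratify=y)

for name, clf in [("LD", LinearDiscriminantAnalysis()),
                  ("QD", QuadraticDiscriminantAnalysis())]:
    sfs = SequentialFeatureSelector(clf, n_features_to_select=5).fit(X_tr, y_tr)
    bands = sfs.get_support(indices=True)          # selected wavelength-band indices
    clf.fit(X_tr[:, bands], y_tr)
    f1 = f1_score(y_te, clf.predict(X_te[:, bands]))
    print(f"{name}: selected bands {bands}, F1 = {f1:.2f}")
```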

  19. Competitive Protein-binding assay-based Enzyme-immunoassay Method, Compared to High-pressure Liquid Chromatography, Has a Very Lower Diagnostic Value to Detect Vitamin D Deficiency in 9–12 Years Children

    PubMed Central

    Zahedi Rad, Maliheh; Neyestani, Tirang Reza; Nikooyeh, Bahareh; Shariatzadeh, Nastaran; Kalayi, Ali; Khalaji, Niloufar; Gharavi, Azam

    2015-01-01

    Background: The most reliable indicator of Vitamin D status is the circulating concentration of 25-hydroxycalciferol (25(OH) D), routinely determined by enzyme-immunoassay (EIA) methods. This study was performed to compare the commonly used competitive protein-binding assay (CPBA)-based EIA with the gold standard, high-pressure liquid chromatography (HPLC). Methods: Concentrations of 25(OH) D in sera from 257 randomly selected school children aged 9–11 years were determined by the two methods, CPBA and HPLC. Results: Mean 25(OH) D concentration was 22 ± 18.8 and 21.9 ± 15.6 nmol/L by CPBA and HPLC, respectively. However, mean 25(OH) D concentrations of the two methods became different after excluding undetectable samples (25.1 ± 18.9 vs. 29 ± 14.5 nmol/L, respectively; P = 0.04). With Vitamin D deficiency predefined as 25(OH) D < 12.5 nmol/L, CPBA sensitivity and specificity were 44.2% and 60.6%, respectively, compared to HPLC. In receiver operating characteristic curve analysis, the best cut-off for CPBA was 5.8 nmol/L, which gave 82% sensitivity but only 17% specificity. Conclusions: Though CPBA may be used as a screening tool, more reliable methods are needed for diagnostic purposes. PMID:26330983
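
    A small sketch of how sensitivity, specificity, and a best cut-off can be computed when an assay is compared against a reference method, using the 12.5 nmol/L deficiency threshold from the abstract; the measurement values themselves are synthetic.

```python
# Sketch: sensitivity/specificity of an assay (CPBA-like) against a reference (HPLC-like),
# scanning cut-offs for the best Youden index; the measurements below are synthetic.
import numpy as np

rng = np.random.default_rng(3)
n = 257
hplc = rng.gamma(shape=2.0, scale=11.0, size=n)          # "true" 25(OH)D, nmol/L
cpba = hplc + rng.normal(0, 10, n)                       # noisy assay readings

truly_deficient = hplc < 12.5                            # definition used in the study

def sens_spec(cutoff):
    flagged = cpba < cutoff                              # assay calls "deficient"
    tp = np.sum(flagged & truly_deficient)
    tn = np.sum(~flagged & ~truly_deficient)
    fn = np.sum(~flagged & truly_deficient)
    fp = np.sum(flagged & ~truly_deficient)
    return tp / (tp + fn), tn / (tn + fp)

cutoffs = np.linspace(5, 40, 200)
pairs = [sens_spec(c) for c in cutoffs]
best = max(range(len(cutoffs)), key=lambda i: pairs[i][0] + pairs[i][1] - 1)  # Youden J
print(f"best cutoff {cutoffs[best]:.1f} nmol/L, "
      f"sensitivity {pairs[best][0]:.2f}, specificity {pairs[best][1]:.2f}")
```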

  20. Maximal likelihood correspondence estimation for face recognition across pose.

    PubMed

    Li, Shaoxin; Liu, Xin; Chai, Xiujuan; Zhang, Haihong; Lao, Shihong; Shan, Shiguang

    2014-10-01

    Due to the misalignment of image features, the performance of many conventional face recognition methods degrades considerably in the across-pose scenario. To address this problem, many image-matching-based methods have been proposed to estimate semantic correspondence between faces in different poses. In this paper, we aim to solve two critical problems in previous image-matching-based correspondence learning methods: 1) failure to fully exploit face-specific structure information in correspondence estimation and 2) failure to learn personalized correspondence for each probe image. To this end, we first build a model, termed morphable displacement field (MDF), to encode face-specific structure information of semantic correspondence from a set of real samples of correspondences calculated from 3D face models. Then, we propose a maximal likelihood correspondence estimation (MLCE) method to learn personalized correspondence based on a maximal likelihood frontal face assumption. After obtaining the semantic correspondence encoded in the learned displacement, we can synthesize virtual frontal images of the profile faces for subsequent recognition. Using a linear discriminant analysis method with pixel-intensity features, state-of-the-art performance is achieved on three multipose benchmarks, i.e., the CMU-PIE, FERET, and MultiPIE databases. Owing to the rational MDF regularization and the use of the novel maximal likelihood objective, the proposed MLCE method can reliably learn correspondence between faces in different poses even in complex unconstrained environments, i.e., the Labeled Faces in the Wild database.

  1. Efficient Agent-Based Cluster Ensembles

    NASA Technical Reports Server (NTRS)

    Agogino, Adrian; Tumer, Kagan

    2006-01-01

    Numerous domains ranging from distributed data acquisition to knowledge reuse need to solve the cluster ensemble problem of combining multiple clusterings into a single unified clustering. Unfortunately, current non-agent-based cluster-combining methods do not work in a distributed environment, are not robust to corrupted clusterings, and require centralized access to all original clusterings. Overcoming these issues will allow cluster ensembles to be used in fundamentally distributed and failure-prone domains such as data acquisition from satellite constellations, in addition to domains demanding confidentiality such as combining clusterings of user profiles. This paper proposes an efficient, distributed, agent-based clustering ensemble method that addresses these issues. In this approach each agent is assigned a small subset of the data and votes on which final cluster its data points should belong to. The final clustering is then evaluated by a global utility, computed in a distributed way. This clustering is also evaluated using an agent-specific utility that is shown to be easier for the agents to maximize. Results show that agents using the agent-specific utility can achieve better performance than traditional non-agent-based methods and are effective even when up to 50% of the agents fail.
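
    For context, the sketch below shows a standard co-association consensus for combining multiple clusterings; it is not the paper's agent-based voting and utility scheme, only a compact illustration of the cluster ensemble problem being solved.

```python
# Sketch: a standard co-association consensus for cluster ensembles (not the paper's
# agent-utility scheme): count how often each pair of points is co-clustered across
# base clusterings, then cut a hierarchical clustering of the resulting similarity.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import squareform

rng = np.random.default_rng(4)
n_points, n_clusterings, k = 30, 8, 3

# Synthetic base clusterings: noisy copies of a hidden ground-truth partition.
truth = rng.integers(0, k, n_points)
base = [np.where(rng.random(n_points) < 0.8, truth, rng.integers(0, k, n_points))
        for _ in range(n_clusterings)]

# Co-association matrix: fraction of base clusterings that put i and j together.
coassoc = np.zeros((n_points, n_points))
for labels in base:
    coassoc += (labels[:, None] == labels[None, :])
coassoc /= n_clusterings

# Consensus clustering: average-linkage on distance = 1 - co-association.
dist = 1.0 - coassoc
np.fill_diagonal(dist, 0.0)
consensus = fcluster(linkage(squareform(dist), method="average"), t=k, criterion="maxclust")
print("consensus labels:", consensus)
```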

  2. Subject-Specific Sparse Dictionary Learning for Atlas-Based Brain MRI Segmentation.

    PubMed

    Roy, Snehashis; He, Qing; Sweeney, Elizabeth; Carass, Aaron; Reich, Daniel S; Prince, Jerry L; Pham, Dzung L

    2015-09-01

    Quantitative measurements from segmentations of human brain magnetic resonance (MR) images provide important biomarkers for normal aging and disease progression. In this paper, we propose a patch-based tissue classification method from MR images that uses a sparse dictionary learning approach and atlas priors. Training data for the method consists of an atlas MR image, prior information maps depicting where different tissues are expected to be located, and a hard segmentation. Unlike most atlas-based classification methods that require deformable registration of the atlas priors to the subject, only affine registration is required between the subject and training atlas. A subject-specific patch dictionary is created by learning relevant patches from the atlas. Then the subject patches are modeled as sparse combinations of learned atlas patches leading to tissue memberships at each voxel. The combination of prior information in an example-based framework enables us to distinguish tissues having similar intensities but different spatial locations. We demonstrate the efficacy of the approach on the application of whole-brain tissue segmentation in subjects with healthy anatomy and normal pressure hydrocephalus, as well as lesion segmentation in multiple sclerosis patients. For each application, quantitative comparisons are made against publicly available state-of-the-art approaches.
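
    A toy sketch of the core step described above: a subject patch is expressed as a sparse combination of atlas patches (here via orthogonal matching pursuit, one possible sparse solver) and tissue memberships are formed from the label-weighted coefficients; the patches, labels, and dimensions are synthetic assumptions.

```python
# Sketch: a subject image patch expressed as a sparse combination of atlas patches;
# tissue membership is then the coefficient-weighted average of the atlas patch labels.
# Patches and labels here are synthetic placeholders, not a real atlas.
import numpy as np
from sklearn.decomposition import SparseCoder

rng = np.random.default_rng(5)
patch_dim, n_atoms, n_tissues = 27, 200, 3            # e.g. 3x3x3 patches, 200 atlas patches

atlas_patches = rng.normal(size=(n_atoms, patch_dim))
atlas_patches /= np.linalg.norm(atlas_patches, axis=1, keepdims=True)  # unit-norm atoms
atlas_labels = np.eye(n_tissues)[rng.integers(0, n_tissues, n_atoms)]  # one-hot tissue labels

subject_patch = rng.normal(size=(1, patch_dim))

coder = SparseCoder(dictionary=atlas_patches,
                    transform_algorithm="omp", transform_n_nonzero_coefs=5)
coef = coder.transform(subject_patch)[0]               # sparse weights over atlas patches

weights = np.abs(coef)
membership = weights @ atlas_labels / weights.sum()    # soft tissue memberships at the voxel
print("tissue memberships:", np.round(membership, 3))
```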

  3. Benchmarking of Methods for Genomic Taxonomy

    DOE PAGES

    Larsen, Mette V.; Cosentino, Salvatore; Lukjancenko, Oksana; ...

    2014-02-26

    One of the first issues that emerges when a prokaryotic organism of interest is encountered is the question of what it is—that is, which species it is. The 16S rRNA gene formed the basis of the first method for sequence-based taxonomy and has had a tremendous impact on the field of microbiology. Nevertheless, the method has been found to have a number of shortcomings. In this paper, we trained and benchmarked five methods for whole-genome sequence-based prokaryotic species identification on a common data set of complete genomes: (i) SpeciesFinder, which is based on the complete 16S rRNA gene; (ii) Reads2Type, which searches for species-specific 50-mers in either the 16S rRNA gene or the gyrB gene (for the Enterobacteriaceae family); (iii) the ribosomal multilocus sequence typing (rMLST) method, which samples up to 53 ribosomal genes; (iv) TaxonomyFinder, which is based on species-specific functional protein domain profiles; and finally (v) KmerFinder, which examines the number of co-occurring k-mers (substrings of k nucleotides in DNA sequence data). The performances of the methods were subsequently evaluated on three data sets of short sequence reads or draft genomes from public databases. In total, the evaluation sets constituted sequence data from more than 11,000 isolates covering 159 genera and 243 species. Our results indicate that methods that sample only chromosomal, core genes have difficulties in distinguishing closely related species which only recently diverged. Finally, the KmerFinder method had the overall highest accuracy and correctly identified from 93% to 97% of the isolates in the evaluation sets.
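
    A minimal sketch of k-mer-based species assignment in the spirit of KmerFinder: the query is matched to the reference sharing the largest fraction of its k-mers. The sequences, k value, and scoring are illustrative assumptions, not the published tool.

```python
# Sketch of k-mer-based identification: the query is assigned to the reference genome
# sharing the largest fraction of its k-mers. The "genomes" below are short synthetic
# strings; real genomes are megabases long.
def kmers(seq, k=16):
    return {seq[i:i + k] for i in range(len(seq) - k + 1)}

def identify(query, references, k=16):
    q = kmers(query, k)
    scores = {name: len(q & kmers(genome, k)) / len(q)   # fraction of shared k-mers
              for name, genome in references.items()}
    return max(scores, key=scores.get), scores

references = {
    "species_A": "ATGCGTACGTTAGCATCGATCGGATCCTAGCTAGGCTAACGTTAGC" * 20,
    "species_B": "TTGACCGGTAACGTGCATCGAATTCCGGAAGCTTGCGCCATATAGC" * 20,
}
query = references["species_A"][100:600]       # a read-like fragment of species_A
best, scores = identify(query, references)
print(best, {name: round(v, 3) for name, v in scores.items()})
```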

  4. Direct Fluorescence Detection of Allele-Specific PCR Products Using Novel Energy-Transfer Labeled Primers.

    PubMed

    Winn-Deen

    1998-12-01

    Background: Currently, analysis of point mutations can be done by allele-specific polymerase chain reaction (PCR) followed by gel analysis or by gene-specific PCR followed by hybridization with an allele-specific probe. Both of these mutation detection methods require post-PCR laboratory time and run the risk of contaminating subsequent experiments with the PCR product liberated during the detection step. The author has combined the PCR amplification and detection steps into a single procedure suitable for closed-tube analysis. Methods and Results: Allele-specific PCR primers were designed as Sunrise energy-transfer primers and contained a 3' terminal mismatch to distinguish between normal and mutant DNA. Cloned normal (W64) and mutant (R64) templates of the beta3-adrenergic receptor gene were tested to verify amplification specificity and yield. A no-target negative control was also run with each reaction. After PCR, each reaction was tested for fluorescence yield by measuring fluorescence on a spectrofluorimeter or fluorescent microtitre plate reader. The cloned controls and 24 patient samples were tested for the W64R mutation by two methods. The direct fluorescence results with the Sunrise allele-specific PCR method gave genotypes comparable to those obtained with the PCR/restriction digest/gel electrophoresis control method. No PCR artifacts were observed in the negative controls or in the PCR reactions run with the mismatched target. Conclusions: The results of this pilot study indicate good PCR product and fluorescence yield from allele-specific energy-transfer labeled primers, and the capability of distinguishing between normal and mutant alleles based on fluorescence alone, without the need for restriction digestion, gel electrophoresis, or hybridization with an allele-specific probe.

  5. Comparative analysis of QSAR models for predicting pK(a) of organic oxygen acids and nitrogen bases from molecular structure.

    PubMed

    Yu, Haiying; Kühne, Ralph; Ebert, Ralf-Uwe; Schüürmann, Gerrit

    2010-11-22

    For 1143 organic compounds comprising 580 oxygen acids and 563 nitrogen bases that cover more than 17 orders of experimental pK(a) (from -5.00 to 12.23), the pK(a) prediction performances of ACD, SPARC, and two calibrations of a semiempirical quantum chemical (QC) AM1 approach have been analyzed. The overall root-mean-square errors (rms) for the acids are 0.41, 0.58 (0.42 without ortho-substituted phenols with intramolecular H-bonding), and 0.55 and for the bases are 0.65, 0.70, 1.17, and 1.27 for ACD, SPARC, and both QC methods, respectively. Method-specific performances are discussed in detail for six acid subsets (phenols and aromatic and aliphatic carboxylic acids with different substitution patterns) and nine base subsets (anilines, primary, secondary and tertiary amines, meta/para-substituted and ortho-substituted pyridines, pyrimidines, imidazoles, and quinolines). The results demonstrate an overall better performance for acids than for bases but also a substantial variation across subsets. For the overall best-performing ACD, rms ranges from 0.12 to 1.11 and 0.40 to 1.21 pK(a) units for the acid and base subsets, respectively. With regard to the squared correlation coefficient r², the results are 0.86 to 0.96 (acids) and 0.79 to 0.95 (bases) for ACD, 0.77 to 0.95 (acids) and 0.85 to 0.97 (bases) for SPARC, and 0.64 to 0.87 (acids) and 0.43 to 0.83 (bases) for the QC methods, respectively. Attention is paid to structural and method-specific causes for observed pitfalls. The significant subset dependence of the prediction performances suggests a consensus modeling approach.
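
    For reference, the two statistics quoted throughout this comparison (rms error and squared correlation r²) can be computed as in the sketch below; the pKa values shown are illustrative, not the study data.

```python
# Sketch: the rms error and squared correlation used to compare predicted and
# experimental pKa values; the value arrays here are illustrative, not the study data.
import numpy as np

pka_exp  = np.array([4.76, 9.99, 2.34, 10.66, 4.20, 7.14])   # hypothetical experimental pKa
pka_pred = np.array([4.90, 9.60, 2.80, 10.30, 4.05, 7.60])   # hypothetical model predictions

rms = np.sqrt(np.mean((pka_pred - pka_exp) ** 2))
r2 = np.corrcoef(pka_pred, pka_exp)[0, 1] ** 2
print(f"rms = {rms:.2f}, r^2 = {r2:.3f}")
```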

  6. Template‐based field map prediction for rapid whole brain B0 shimming

    PubMed Central

    Shi, Yuhang; Vannesjo, S. Johanna; Miller, Karla L.

    2017-01-01

    Purpose In typical MRI protocols, time is spent acquiring a field map to calculate the shim settings for best image quality. We propose a fast template‐based field map prediction method that yields near‐optimal shims without measuring the field. Methods The template‐based prediction method uses prior knowledge of the B0 distribution in the human brain, based on a large database of field maps acquired from different subjects, together with subject‐specific structural information from a quick localizer scan. The shimming performance of using the template‐based prediction is evaluated in comparison to a range of potential fast shimming methods. Results Static B0 shimming based on predicted field maps performed almost as well as shimming based on individually measured field maps. In experimental evaluations at 7 T, the proposed approach yielded a residual field standard deviation in the brain of on average 59 Hz, compared with 50 Hz using measured field maps and 176 Hz using no subject‐specific shim. Conclusions This work demonstrates that shimming based on predicted field maps is feasible. The field map prediction accuracy could potentially be further improved by generating the template from a subset of subjects, based on parameters such as head rotation and body mass index. Magn Reson Med 80:171–180, 2018. PMID:29193340
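
    As a simple illustration of what "calculating the shim settings" involves, the sketch below solves a least-squares shim: coil currents are chosen to cancel a field map over brain voxels, and the residual field standard deviation is reported. The basis fields and field map are synthetic; this is not the template prediction method itself.

```python
# Sketch: static B0 shimming as a least-squares fit, choosing shim-coil currents that
# cancel a (measured or predicted) field map over brain voxels. Basis fields and the
# field map below are synthetic; real systems use calibrated coil field maps.
import numpy as np

rng = np.random.default_rng(6)
n_vox, n_coils = 5000, 8
A = rng.normal(size=(n_vox, n_coils))          # field from unit current in each coil (Hz)
b = A @ rng.normal(size=n_coils) + rng.normal(0, 30, n_vox)   # field map to be shimmed (Hz)

currents, *_ = np.linalg.lstsq(A, -b, rcond=None)   # minimise ||b + A c||
residual = b + A @ currents
print(f"field std before: {b.std():.1f} Hz, after shim: {residual.std():.1f} Hz")
```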

  7. Assessing Auditory Discrimination Skill of Malay Children Using Computer-based Method.

    PubMed

    Ting, H; Yunus, J; Mohd Nordin, M Z

    2005-01-01

    The purpose of this paper is to investigate the auditory discrimination skill of Malay children using a computer-based method. Currently, most auditory discrimination assessments are conducted manually by a Speech-Language Pathologist. These conventional tests are actually general tests of sound discrimination, which do not reflect the client's specific speech sound errors. Thus, we propose a computer-based Malay auditory discrimination test to automate the whole assessment process as well as to customize the test according to the client's specific speech error sounds. The ability to discriminate voiced and unvoiced Malay speech sounds was studied for Malay children aged between 7 and 10 years old. The study showed no major difficulty for the children in discriminating the Malay speech sounds except for differentiating the /g/-/k/ sounds. On average, the 7-year-old children failed to discriminate the /g/-/k/ sounds.

  8. Education and Training to Address Specific Needs During the Career Progression of Surgeons.

    PubMed

    Sachdeva, Ajit K; Blair, Patrice Gabler; Lupi, Linda K

    2016-02-01

    Surgeons have specific education and training needs as they enter practice, progress through the core period of active practice, and then as they wind down their clinical work before retirement. These transitions and the career progression process, combined with the dynamic health care environment, present specific opportunities for innovative education and training based on practice-based learning and improvement, and continuous professional development methods. Cutting-edge technologies, blended models, simulation, mentoring, preceptoring, and integrated approaches can play critical roles in supporting surgeons as they provide the best surgical care throughout various phases of their careers. Copyright © 2016 Elsevier Inc. All rights reserved.

  9. Generating Test Templates via Automated Theorem Proving

    NASA Technical Reports Server (NTRS)

    Kancherla, Mani Prasad

    1997-01-01

    Testing can be used during the software development process to maintain fidelity between evolving specifications, program designs, and code implementations. We use a form of specification-based testing that employs an automated theorem prover to generate test templates. A similar approach was developed using a model checker on state-intensive systems. This method applies to systems with functional rather than state-based behaviors. This approach allows for the use of incomplete specifications to aid in the generation of tests for potential failure cases. We illustrate the technique on the canonical triangle testing problem and discuss its use in the analysis of a spacecraft scheduling system.

  10. Trajectories for High Specific Impulse High Specific Power Deep Space Exploration

    NASA Technical Reports Server (NTRS)

    Polsgrove, Tara; Adams, Robert B.; Brady, Hugh J. (Technical Monitor)

    2002-01-01

    Flight times and deliverable masses for electric and fusion propulsion systems are difficult to approximate. Numerical integration is required for these continuous thrust systems. Many scientists are not equipped with the tools and expertise to conduct interplanetary and interstellar trajectory analysis for their concepts. Several charts plotting the results of well-known trajectory simulation codes were developed and are contained in this paper. These charts illustrate the dependence of time of flight and payload ratio on jet power, initial mass, specific impulse and specific power. These charts are intended to be a tool by which people in the propulsion community can explore the possibilities of their propulsion system concepts. Trajectories were simulated using the tools VARITOP and IPOST. VARITOP is a well known trajectory optimization code that involves numerical integration based on calculus of variations. IPOST has several methods of trajectory simulation; the one used in this paper is Cowell's method for full integration of the equations of motion. An analytical method derived in the companion paper was also evaluated. The accuracy of this method is discussed in the paper.
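
    The sketch below illustrates Cowell's method in miniature: direct numerical integration of the equations of motion for a planar low-thrust spiral with constant tangential thrust, in normalised units. It is a toy analogue of what codes such as IPOST do, not their implementation.

```python
# Sketch: Cowell's method (direct numerical integration of the equations of motion)
# for a planar low-thrust trajectory with constant tangential thrust; units are
# normalised (mu = 1, starting on a circular orbit of radius 1). Illustrative only.
import numpy as np
from scipy.integrate import solve_ivp

mu, thrust, isp_g0 = 1.0, 5e-3, 50.0        # gravitational parameter, thrust, Isp*g0 (normalised)

def eom(t, s):
    x, y, vx, vy, m = s
    r = np.hypot(x, y)
    v = np.hypot(vx, vy)
    a_thrust = thrust / m                    # thrust acceleration, applied along velocity
    return [vx, vy,
            -mu * x / r**3 + a_thrust * vx / v,
            -mu * y / r**3 + a_thrust * vy / v,
            -thrust / isp_g0]                # mass flow rate

s0 = [1.0, 0.0, 0.0, 1.0, 1.0]              # circular orbit, unit initial mass
sol = solve_ivp(eom, (0.0, 20.0), s0, rtol=1e-9, atol=1e-12)
xf, yf, vxf, vyf, mf = sol.y[:, -1]
print(f"final radius {np.hypot(xf, yf):.3f}, final mass fraction {mf:.3f}")
```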

  11. Computerization of guidelines: a knowledge specification method to convert text to detailed decision tree for electronic implementation.

    PubMed

    Aguirre-Junco, Angel-Ricardo; Colombet, Isabelle; Zunino, Sylvain; Jaulent, Marie-Christine; Leneveut, Laurence; Chatellier, Gilles

    2004-01-01

    The initial step in the computerization of guidelines is the specification of knowledge from the prose text of the guidelines. We describe a method of knowledge specification based on a structured and systematic analysis of text allowing detailed specification of a decision tree. We use decision tables to validate the decision algorithm and decision trees to specify and represent this algorithm, along with elementary messages of recommendation. Editing tools are also necessary to facilitate the process of validation and the workflow between the expert physicians who will validate the specified knowledge and the computer scientists who will encode it in a guideline model. Applied to eleven different guidelines issued by an official agency, the method allows quick and valid computerization and integration into a larger decision support system called EsPeR (Personalized Estimate of Risks). The quality of the source text guidelines, however, still needs further improvement. The method used for computerization could help to define a framework usable at the initial step of guideline development in order to produce guidelines ready for electronic implementation.
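
    The sketch below shows one way the specified knowledge can be represented: a small decision tree whose internal nodes hold questions and whose leaves hold elementary recommendation messages. The clinical content is invented purely to illustrate the data structure, not taken from the guidelines discussed above.

```python
# Sketch: a guideline fragment specified as a small decision tree whose leaves carry
# elementary recommendation messages; the clinical content here is hypothetical.
TREE = {
    "question": "Is the patient's systolic blood pressure >= 140 mmHg?",
    "yes": {
        "question": "Does the patient have diabetes?",
        "yes": {"recommendation": "Start antihypertensive treatment and refer for review."},
        "no":  {"recommendation": "Repeat the measurement and reassess within 3 months."},
    },
    "no": {"recommendation": "No action; routine follow-up."},
}

def walk(node, answers):
    """Follow the tree using a dict of question -> True/False answers."""
    while "recommendation" not in node:
        node = node["yes"] if answers[node["question"]] else node["no"]
    return node["recommendation"]

print(walk(TREE, {
    "Is the patient's systolic blood pressure >= 140 mmHg?": True,
    "Does the patient have diabetes?": False,
}))
```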

  12. Automatic classification of transiently evoked otoacoustic emissions using an artificial neural network.

    PubMed

    Buller, G; Lutman, M E

    1998-08-01

    The increasing use of transiently evoked otoacoustic emissions (TEOAE) in large neonatal hearing screening programmes makes a standardized method of response classification desirable. Until now methods have been either subjective or based on arbitrary response characteristics. This study takes an expert system approach to standardize the subjective judgements of an experienced scorer. The method that is developed comprises three stages. First, it transforms TEOAEs from waveforms in the time domain into a simplified parameter set. Second, the parameter set is classified by an artificial neural network that has been taught on a large database of TEOAE waveforms and corresponding expert scores. Third, additional fuzzy logic rules automatically detect probable artefacts in the waveforms and synchronized spontaneous emission components. In this way, the knowledge of the experienced scorer is encapsulated in the expert system software and thereafter can be accessed by non-experts. Teaching and evaluation of the neural network were based on TEOAEs from a database totalling 2190 neonatal hearing screening tests. The database was divided into learning and test groups with 820 and 1370 waveforms respectively. From each recorded waveform a set of 12 parameters was calculated, representing static and dynamic signal properties. The artificial network was taught with parameter sets from the learning group only. Reproduction of the human scorer classification by the neural net in the learning group showed a sensitivity for detecting screen fails of 99.3% (299 from 301 failed results on subjective scoring) and a specificity for detecting screen passes of 81.1% (421 of 519 pass results). To quantify the post hoc performance of the net (generalization), the test group was then presented to the network input. Sensitivity was 99.4% (474 from 477) and specificity was 87.3% (780 from 893). To check the efficiency of the classification method, a second learning group was selected out of the previous test group, and the previous learning group was used as the test group. Repeating the learning and test procedures yielded 99.3% sensitivity and 80.7% specificity for reproduction, and 99.4% sensitivity and 86.7% specificity for generalization. In all respects, performance was better than for a previously optimized method based simply on cross-correlation between replicate non-linear waveforms. It is concluded that classification methods based on neural networks show promise for application to large neonatal screening programmes utilizing TEOAEs.

  13. PCR-based methods for the detection of L1014 kdr mutation in Anopheles culicifacies sensu lato

    PubMed Central

    Singh, Om P; Bali, Prerna; Hemingway, Janet; Subbarao, Sarala K; Dash, Aditya P; Adak, Tridibes

    2009-01-01

    Background Anopheles culicifacies s.l., a major malaria vector in India, has developed widespread resistance to DDT and is becoming resistant to pyrethroids–the only insecticide class recommended for the impregnation of bed nets. Knock-down resistance due to a point mutation in the voltage gated sodium channel at L1014 residue (kdr) is a common mechanism of resistance to DDT and pyrethroids. The selection of this resistance may pose a serious threat to the success of the pyrethroid-impregnated bed net programme. This study reports the presence of kdr mutation (L1014F) in a field population of An. culicifacies s.l. and three new PCR-based methods for kdr genotyping. Methods The IIS4-IIS5 linker to IIS6 segments of the para type voltage gated sodium channel gene of DDT and pyrethroid resistant An. culicifacies s.l. population from the Surat district of India was sequenced. This revealed the presence of an A-to-T substitution at position 1014 leading to a leucine-phenylalanine mutation (L1014F) in a few individuals. Three molecular methods viz. Allele Specific PCR (AS-PCR), an Amplification Refractory Mutation System (ARMS) and Primer Introduced Restriction Analysis-PCR (PIRA-PCR) were developed and tested for kdr genotyping. The specificity of the three assays was validated following DNA sequencing of the samples genotyped. Results The genotyping of this An. culicifacies s.l. population by the three PCR based assays provided consistent result and were in agreement with DNA sequencing result. A low frequency of the kdr allele mostly in heterozygous condition was observed in the resistant population. Frequencies of the different genotypes were in Hardy-Weinberg equilibrium. Conclusion The Leu-Phe mutation, which generates the kdr phenotype in many insects, was detected in a pyrethroid and DDT resistant An. culicifacies s.l. population. Three PCR-based methods were developed for kdr genotyping. All the three assays were specific. The ARMS method was refractory to non-specific amplification in non-stringent amplification conditions. The PIRA-PCR assay is able to detect both the codons for the phenylalanine mutation at kdr locus, i.e., TTT and TTC, in a single assay, although the latter codon was not found in the population genotyped. PMID:19594947

  14. SU-F-I-38: Patient Organ Specific Dose Assessment in Coronary CT Angiograph Using Voxellaized Volume Dose Index in Monte Carlo Simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fallal, Mohammadi Gh.; Riyahi, Alam N.; Graily, Gh.

    Purpose: Clinical use of multi-detector computed tomography (MDCT) in the diagnosis of disease has increased significantly owing to its high speed of data acquisition and high spatial resolution. Given the high radiation dose in CT and the need for patient-specific radiation risk assessment, new methods for calculating organ dose are required. In this study, by introducing a conversion factor, patient organ doses in the thorax region were calculated from CT image data using a Monte Carlo (MC) system. Methods: The geometry of the x-ray tube, inherent filter, bow-tie filter and collimator was designed using EGSnrc/BEAMnrc MC-system component modules according to the geometry of a GE LightSpeed 64-slice CT scanner. A CT image of a patient thorax, used as a patient-specific phantom, was voxelized with a 6.25 mm³ voxel size and a 64×64×20 matrix. Doses to thorax organs, including the esophagus, lung, heart, breast, ribs, muscle, spine and spinal cord, were calculated under the imaging conditions of prospectively gated coronary CT angiography (PGT), a step-and-shoot technique. Irradiation of the patient-specific phantom was simulated using the dedicated MC code DOSXYZnrc with a PGT irradiation model. The ratio of the organ dose calculated by the MC method to the volume CT dose index (CTDIvol) reported by the CT scanner for the PGT technique was introduced as a conversion factor. Results: In the PGT method, CTDIvol was 10.6 mGy, and the organ dose/CTDIvol conversion factors for the esophagus, lung, heart, breast, ribs, muscle, spine and spinal cord were 0.96, 1.46, 1.2, 3.28, 6.68, 1.35, 3.41 and 0.93, respectively. Conclusion: The results showed that dose calculation based on CTDIvol alone underestimates patient dose, and that the dose to the breast was higher than in other studies. The method in this study can therefore be used to obtain actual patient organ doses in CT imaging from CTDIvol, and hence realistic effective dose (ED) estimates based on organ dose. This work was supported by the research chancellor of Tehran University of Medical Sciences (TUMS), School of Medicine, Tehran, Iran.
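
    Applying the reported conversion factors is simple arithmetic: organ dose ≈ conversion factor × CTDIvol. The sketch below uses the factors and the 10.6 mGy CTDIvol quoted in the abstract.

```python
# Sketch: applying the reported organ-dose/CTDIvol conversion factors to the scanner's
# CTDIvol to estimate organ doses for the PGT protocol (factors and CTDIvol as reported
# in the abstract above).
CTDI_VOL_MGY = 10.6
FACTORS = {"esophagus": 0.96, "lung": 1.46, "heart": 1.20, "breast": 3.28,
           "ribs": 6.68, "muscle": 1.35, "spine": 3.41, "spinal cord": 0.93}

organ_dose = {organ: f * CTDI_VOL_MGY for organ, f in FACTORS.items()}
for organ, dose in organ_dose.items():
    print(f"{organ:12s} {dose:5.1f} mGy")
```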

  15. A new nanostructured Silicon biosensor for diagnostics of bovine leucosis

    NASA Astrophysics Data System (ADS)

    Luchenko, A. I.; Melnichenko, M. M.; Starodub, N. F.; Shmyryeva, O. M.

    2010-08-01

    In this report we propose a new instrumental method for the biochemical diagnostics of bovine leucosis through the registration of the formation of the specific immune complex (antigen-antibody) with the help of a biosensor based on nano-structured silicon. The principle of the measurement is based on the determination of the photosensitivity of the surface. In contrast to existing traditional methods for the biochemical diagnostics of bovine leucosis, the proposed approach can provide rapid control of milk quality both directly on the farm and during the processing of raw materials. The proposed variant of the biosensor based on nano-structured silicon may be applied for the determination of the concentration of different substances which form a specific complex as a result of bioaffinity reactions. A new immune technique based on nanostructured silicon and intended for the quantitative determination of some toxic substances is also offered. The sensitivity of this biosensor allows T-2 mycotoxin to be detected at a concentration of 10 ng/ml within several minutes.

  16. Identification of species origin of meat and meat products on the DNA basis: a review.

    PubMed

    Kumar, Arun; Kumar, Rajiv Ranjan; Sharma, Brahm Deo; Gokulakrishnan, Palanisamy; Mendiratta, Sanjod Kumar; Sharma, Deepak

    2015-01-01

    The adulteration/substitution of meat has always been a concern for various reasons such as public health, religious factors, wholesomeness, and unhealthy competition in the meat market. Consumers should be protected from these malicious practices of meat adulteration by quick, precise, and specific identification of meat animal species. Several analytical methodologies have been employed for meat speciation based on anatomical, histological, microscopic, organoleptic, chemical, electrophoretic, chromatographic, or immunological principles. However, by virtue of their inherent limitations, most of these techniques have been replaced by recent DNA-based molecular techniques. In the last decades, several methods based on the polymerase chain reaction have been proposed as useful means of identifying the species origin of meat and meat products, due to their high specificity and sensitivity, as well as rapid processing time and low cost. This review intends to provide an updated and extensive overview of the DNA-based methods for species identification in meat and meat products.

  17. MCTDH on-the-fly: Efficient grid-based quantum dynamics without pre-computed potential energy surfaces

    NASA Astrophysics Data System (ADS)

    Richings, Gareth W.; Habershon, Scott

    2018-04-01

    We present significant algorithmic improvements to a recently proposed direct quantum dynamics method, based upon combining well established grid-based quantum dynamics approaches and expansions of the potential energy operator in terms of a weighted sum of Gaussian functions. Specifically, using a sum of low-dimensional Gaussian functions to represent the potential energy surface (PES), combined with a secondary fitting of the PES using singular value decomposition, we show how standard grid-based quantum dynamics methods can be dramatically accelerated without loss of accuracy. This is demonstrated by on-the-fly simulations (using both standard grid-based methods and multi-configuration time-dependent Hartree) of both proton transfer on the electronic ground state of salicylaldimine and the non-adiabatic dynamics of pyrazine.
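
    A toy one-dimensional analogue of the PES expansion described above: sampled energies are fitted as a weighted sum of fixed Gaussians, with the weights obtained from an SVD-based linear least-squares solve; the potential, centres, and width are illustrative assumptions.

```python
# Sketch: representing a 1D potential as a weighted sum of Gaussians, with the weights
# obtained by an SVD-based linear least-squares fit to sampled energies (a toy analogue
# of the PES expansion used in the direct-dynamics method).
import numpy as np

x = np.linspace(-2.0, 2.0, 200)
v = x**4 - 1.5 * x**2                          # sampled "ab initio" energies (double well)

centers = np.linspace(-2.0, 2.0, 15)           # fixed Gaussian centres
width = 0.4
G = np.exp(-(x[:, None] - centers[None, :])**2 / (2 * width**2))   # design matrix

weights, *_ = np.linalg.lstsq(G, v, rcond=None)   # lstsq solves via SVD
v_fit = G @ weights
print(f"max fit error: {np.max(np.abs(v_fit - v)):.2e}")
```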

  18. Compositions for chromosome-specific staining

    DOEpatents

    Gray, Joe W.; Pinkel, Daniel

    1998-01-01

    Methods and compositions for staining based upon nucleic acid sequence that employ nucleic acid probes are provided. Said methods produce staining patterns that can be tailored for specific cytogenetic analyses. Said probes are appropriate for in situ hybridization and stain both interphase and metaphase chromosomal material with reliable signals. The nucleic acid probes are typically of a complexity greater than 50 kb, the complexity depending upon the cytogenetic application. Methods are provided to disable the hybridization capacity of shared, high copy repetitive sequences and/or remove such sequences to provide for useful contrast. Still further methods are provided to produce chromosome-specific staining reagents which are made specific to the targeted chromosomal material, which can be one or more whole chromosomes, one or more regions on one or more chromosomes, subsets of chromosomes and/or the entire genome. Probes and test kits are provided for use in tumor cytogenetics, in the detection of disease related loci, in analysis of structural abnormalities, such as translocations, and for biological dosimetry. Further, methods and prenatal test kits are provided to stain targeted chromosomal material of fetal cells, including fetal cells obtained from maternal blood. Still further, the invention provides for automated means to detect and analyse chromosomal abnormalities.

  19. Compositions for chromosome-specific staining

    DOEpatents

    Gray, J.W.; Pinkel, D.

    1998-05-26

    Methods and compositions for staining based upon nucleic acid sequence that employ nucleic acid probes are provided. The methods produce staining patterns that can be tailored for specific cytogenetic analyses. The probes are appropriate for in situ hybridization and stain both interphase and metaphase chromosomal material with reliable signals. The nucleic acid probes are typically of a complexity greater than 50 kb, the complexity depending upon the cytogenetic application. Methods are provided to disable the hybridization capacity of shared, high copy repetitive sequences and/or remove such sequences to provide for useful contrast. Still further methods are provided to produce chromosome-specific staining reagents which are made specific to the targeted chromosomal material, which can be one or more whole chromosomes, one or more regions on one or more chromosomes, subsets of chromosomes and/or the entire genome. Probes and test kits are provided for use in tumor cytogenetics, in the detection of disease related loci, in analysis of structural abnormalities, such as translocations, and for biological dosimetry. Methods and prenatal test kits are provided to stain targeted chromosomal material of fetal cells, including fetal cells obtained from maternal blood. The invention provides for automated means to detect and analyze chromosomal abnormalities. 17 figs.

  20. Differential activity staining: its use in characterization of guanylyl-specific ribonuclease in the genus Ustilago.

    PubMed Central

    Blank, A; Dekker, C A

    1975-01-01

    Guanylyl-specific ribonuclease can be identified by a novel technique employing electrophoresis in polyacrylamide slabs followed by differential activity staining. The technique requires as little as 7 ng of enzyme which may be grossly admixed with contaminants, including other ribonucleases. Upon electrophoresis and activity staining, a variety of ribonucleases can be visualized as light or clear bands in a colored background formed by toluidine blue complexed with oligonucleotide substrate. Guanylyl-specific ribonuclease, which is detectable when using an oligonucleotide substrate of random base sequence, does not yield a band when using oligonucleotides bearing guanylyl residues at the 3'-termini only and containing, therefore, no susceptible internucleotide bonds; in contrast, a ribonuclease with a different base specificity or no base specificity yields a band with either substrate. This differential activity staining method for establishing guanylyl specificity permits estimation of the extent of nonspecific cleavage of internucleotide linkages by a putatively guanylyl-specific enzyme and is at least as sensitive as conventional procedures for determination of base specificity. With this new technique guanyloribonuclease has been identified in the unfractionated culture medium of 10 organisms belonging to the phytopathogenic fungal genus Ustilago. It is suggested that guanylyl-specific ribonuclease is widely distributed among Ustilago species; its electrophoretic properties may be revealing of phylogenetic relationships among these plant parasites and among their hosts. The general technique of differential activity staining, developed for determination of the base specificity of ribonucleases, may be widely applicable to analysis of enzymes catalyzing depolymerization reactions. Images PMID:813217
