Science.gov

Sample records for level computational identification

  1. Model Identification and Computer Algebra.

    PubMed

    Bollen, Kenneth A; Bauldry, Shawn

    2010-10-07

    Multiequation models that contain observed or latent variables are common in the social sciences. To determine whether unique parameter values exist for such models, one needs to assess model identification. In practice analysts rely on empirical checks that evaluate the singularity of the information matrix evaluated at sample estimates of parameters. The discrepancy between estimates and population values, the limitations of numerical assessments of ranks, and the difference between local and global identification make this practice less than perfect. In this paper we outline how to use computer algebra systems (CAS) to determine the local and global identification of multiequation models with or without latent variables. We demonstrate a symbolic CAS approach to local identification and develop a CAS approach to obtain explicit algebraic solutions for each of the model parameters. We illustrate the procedures with several examples, including a new proof of the identification of a model for handling missing data using auxiliary variables. We present an identification procedure for Structural Equation Models that makes use of CAS and that is a useful complement to current methods.
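
    The symbolic rank check described above can be sketched with any general-purpose CAS. The fragment below uses SymPy and an illustrative one-factor, three-indicator model (an assumption, not one of the paper's examples): local identification is suggested by full column rank of the Jacobian of the model-implied covariances with respect to the free parameters, and explicit algebraic solutions follow from solving the covariance equations symbolically.

      # Sketch of a CAS-based identification check; SymPy stands in for the CAS, and the
      # one-factor model with the first loading fixed to 1 is an illustrative assumption.
      import sympy as sp

      l2, l3, phi, t1, t2, t3 = sp.symbols('lambda2 lambda3 phi theta1 theta2 theta3', positive=True)
      params = [l2, l3, phi, t1, t2, t3]

      Lam = sp.Matrix([1, l2, l3])               # factor loadings
      Theta = sp.diag(t1, t2, t3)                # unique variances
      Sigma = Lam * phi * Lam.T + Theta          # model-implied covariance matrix

      # Stack the non-redundant (lower-triangular) elements of Sigma
      sigma_vec = sp.Matrix([Sigma[i, j] for i in range(3) for j in range(i + 1)])

      # Local identification holds generically if the Jacobian has full column rank
      J = sigma_vec.jacobian(params)
      print(J.rank() == len(params))             # True -> locally identified

      # Global identification: solve the covariance equations for the parameters symbolically
      s = {(i, j): sp.Symbol(f's{i}{j}') for i in range(3) for j in range(i + 1)}
      eqs = [sp.Eq(Sigma[i, j], s[(i, j)]) for i in range(3) for j in range(i + 1)]
      print(sp.solve(eqs, params, dict=True))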

  2. Computational methods for remote homolog identification.

    PubMed

    Wan, Xiu-Feng; Xu, Dong

    2005-12-01

    As more and more protein sequences become available, homolog identification becomes increasingly important for functional, structural, and evolutionary studies of proteins. Many homologous proteins were separated a very long time ago in their evolutionary history, and thus their sequences share low sequence identity. These remote homologs have become a research focus in bioinformatics over the past decade, and some significant advances have been achieved. In this paper, we provide a comprehensive review of computational techniques used in remote homolog identification, grouped by method: sequence-sequence comparison, sequence-structure comparison, and structure-structure comparison. Other miscellaneous approaches are also summarized. Pointers to the online resources of these methods and their related databases are provided. Comparisons among the different methods in terms of their technical approaches, strengths, and limitations follow. Studies on proteins in SARS-CoV are shown as an example application of remote homolog identification.

  3. Computing Health: Programing Problem 3, Computing Peak Blood Alcohol Levels.

    ERIC Educational Resources Information Center

    Gold, Robert S.

    1985-01-01

    The Alcohol Metabolism Program, a computer program used to compute peak blood alcohol levels, is expanded upon to include a cover page, brief introduction, and techniques for generalizing the program to calculate peak levels for any number of drinks. (DF)
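
    The ERIC record does not reproduce the program itself; a common classroom formulation of the peak blood alcohol computation is the Widmark relation, sketched below with typical textbook constants (drink size, distribution ratios, and elimination rate are assumptions, not values taken from the original program).

      # Peak blood alcohol estimate via the Widmark relation (a sketch with assumed constants;
      # not the Alcohol Metabolism Program from the ERIC record).
      def peak_bac(n_drinks, body_weight_kg, is_male=True, hours_since_first_drink=0.0,
                   grams_alcohol_per_drink=14.0, elimination_rate=0.015):
          """Return an estimated peak blood alcohol concentration in g/100 mL (percent BAC)."""
          r = 0.68 if is_male else 0.55          # approximate Widmark distribution ratio
          body_weight_g = body_weight_kg * 1000.0
          bac = (n_drinks * grams_alcohol_per_drink) / (body_weight_g * r) * 100.0
          return max(bac - elimination_rate * hours_since_first_drink, 0.0)

      print(f"{peak_bac(4, 70, hours_since_first_drink=2.0):.3f}")   # ~0.118 - 0.030 = 0.088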

  4. Computer method for identification of boiler transfer functions

    NASA Technical Reports Server (NTRS)

    Miles, J. H.

    1972-01-01

    An iterative, computer-aided procedure was developed for identifying boiler transfer functions from frequency response data. The method uses the frequency response data to obtain satisfactory transfer functions for both high and low vapor exit quality data.
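
    The NTRS record does not list the procedure itself; a minimal modern analogue fits an assumed transfer function structure to frequency response data by complex least squares, sketched below with SciPy and a synthetic first-order-plus-gain model (both are assumptions, not the report's boiler model).

      # Minimal sketch of fitting a transfer function to frequency response data by least squares.
      # The first-order model K/(tau*s + 1) and the synthetic data are illustrative assumptions.
      import numpy as np
      from scipy.optimize import least_squares

      w = np.logspace(-2, 1, 50)                        # frequency grid, rad/s
      true_K, true_tau = 2.0, 5.0
      H_meas = true_K / (1j * w * true_tau + 1.0)       # "measured" frequency response

      def residuals(p):
          K, tau = p
          H = K / (1j * w * tau + 1.0)
          err = H - H_meas
          return np.concatenate([err.real, err.imag])   # least_squares needs real residuals

      fit = least_squares(residuals, x0=[1.0, 1.0])
      print(fit.x)                                      # recovers approximately [2.0, 5.0]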

  5. Multi level programming Paradigm for Extreme Computing

    NASA Astrophysics Data System (ADS)

    Petiton, S.; Sato, M.; Emad, N.; Calvin, C.; Tsuji, M.; Dandouna, M.

    2014-06-01

    In order to propose a framework and programming paradigms for post-petascale computing, on the road to exascale computing and beyond, we introduced new languages, associated with a hierarchical multi-level programming paradigm, allowing scientific end-users and developers to program highly hierarchical architectures designed for extreme computing. In this paper, we explain the interest of such a hierarchical multi-level programming paradigm for extreme computing and its suitability for several large computational science applications, such as linear algebra solvers used for reactor core physics. We describe the YML language and framework, which allow describing graphs of parallel components that may be developed using PGAS-like languages such as XMP and then scheduled and computed on supercomputers. We then propose experiments on supercomputers (such as the "K" and "Hooper" machines) with the hybrid method MERAM (Multiple Explicitly Restarted Arnoldi Method) as a case study for iterative methods manipulating sparse matrices, and the block Gauss-Jordan method as a case study for direct methods manipulating dense matrices. We conclude by proposing evolutions of this programming paradigm.

  6. Identification of Protein–Excipient Interaction Hotspots Using Computational Approaches

    PubMed Central

    Barata, Teresa S.; Zhang, Cheng; Dalby, Paul A.; Brocchini, Steve; Zloh, Mire

    2016-01-01

    Protein formulation development relies on the selection of excipients that inhibit protein–protein interactions, preventing aggregation. Empirical strategies involve screening many excipient and buffer combinations using forced degradation studies. Such methods do not readily provide information on intermolecular interactions responsible for the protective effects of excipients. This study describes a molecular docking approach to screen and rank interactions allowing for the identification of protein–excipient hotspots to aid in the selection of excipients to be experimentally screened. Previously published work with Drosophila Su(dx) was used to develop and validate the computational methodology, which was then used to determine the formulation hotspots for Fab A33. Commonly used excipients were examined and compared to the regions in Fab A33 prone to protein–protein interactions that could lead to aggregation. This approach could provide information on a molecular level about the protective interactions of excipients in protein formulations to aid the more rational development of future formulations. PMID:27258262

  7. Multi-level RF identification system

    DOEpatents

    Steele, Kerry D.; Anderson, Gordon A.; Gilbert, Ronald W.

    2004-07-20

    A radio frequency identification system having a radio frequency transceiver for generating a continuous wave RF interrogation signal that impinges upon an RF identification tag. An oscillation circuit in the RF identification tag modulates the interrogation signal with a subcarrier of a predetermined frequency and reflects the modulated signal back to the transmitting interrogator. The interrogator recovers and analyzes the subcarrier signal and determines its frequency. The interrogator generates an output indicative of the subcarrier frequency, thereby identifying the responding RFID tag as one of a "class" of RFID tags configured to respond with a subcarrier signal of a predetermined frequency.

  8. Enzyme optimization for next level molecular computing

    NASA Astrophysics Data System (ADS)

    Wąsiewicz, Piotr; Malinowski, Michal; Plucienniczak, Andrzej

    2006-10-01

    The main concept of molecular computing depends on DNA self-assembly abilities and on modifying DNA with the help of enzymes during genetic operations. In typical DNA computing, a sequence of operations executed on DNA strings in parallel is called an algorithm, which is also determined by a model of DNA strings. This methodology resembles a soft-hardware specialized architecture, driven here by heating, cooling, and enzymes, especially the polymerases used for copying strings. As described in this paper, the properties of Taq polymerase are changed by modifying its DNA sequence in such a way that polymerase side activities, together with the peptide chains responsible for destroying amplified strings, are cut off. Thus, it introduces the next level of molecular computing. The succession of genetic operations and the given molecule model with designed nucleotide sequences produce computation results and additionally modify the enzymes, which directly influence the computation process. The information flow begins to circulate. Additionally, such optimized enzymes are more suitable for nanoconstruction, because they have only the desired characteristics. An experiment was proposed to confirm the possibilities of the suggested implementation.

  9. SLIMM: species level identification of microorganisms from metagenomes

    PubMed Central

    Renard, Bernhard Y.; Wieler, Lothar H.; Semmler, Torsten; Reinert, Knut

    2017-01-01

    Identification and quantification of microorganisms is a significant step in studying the alpha and beta diversities within and between microbial communities, respectively. Both identification and quantification of a given microbial community can be carried out using whole genome shotgun sequences with less bias than when using 16S-rDNA sequences. However, shared regions of DNA among reference genomes and taxonomic units pose a significant challenge in assigning reads correctly to their true origins. Existing microbial community profiling tools commonly deal with this problem by either preparing signature-based unique references or assigning an ambiguous read to its least common ancestor in a taxonomic tree. The former method is limited to using the reads that can be mapped to the curated regions, while the latter suffers from a lack of uniquely mapped reads at lower (more specific) taxonomic ranks. Moreover, even when the tools exhibit good performance in calling the organisms present in a sample, there is still room for improvement in determining the correct relative abundance of the organisms. We present a new method, Species Level Identification of Microorganisms from Metagenomes (SLIMM), which addresses the above issues by using coverage information of reference genomes to remove unlikely genomes from the analysis and subsequently gain more uniquely mapped reads to assign at lower ranks of a taxonomic tree. SLIMM is based on a few seemingly simple steps which, when combined, create a tool that outperforms state-of-the-art tools in run time and memory usage while being on par or better in computing quantitative and qualitative information at the species level. PMID:28367376
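
    SLIMM itself is a C++/SeqAn tool; the coverage-filtering idea in the abstract can be illustrated with the toy sketch below, in which reference genomes with low breadth of coverage are discarded and reads that then hit only one remaining genome become uniquely assignable (bin size, cutoff, and data structures are illustrative assumptions).

      # Toy sketch of coverage-based filtering in the spirit of SLIMM (not the actual tool).
      from collections import defaultdict

      BIN = 1000                     # bp per coverage bin (assumed)
      MIN_BREADTH = 0.3              # fraction of bins that must be covered to keep a genome (assumed)

      def breadth_of_coverage(alignments, genome_lengths):
          """alignments: list of (read_id, genome, position). Returns genome -> covered fraction."""
          bins = defaultdict(set)
          for _, genome, pos in alignments:
              bins[genome].add(pos // BIN)
          return {g: len(bins[g]) / (genome_lengths[g] // BIN + 1) for g in genome_lengths}

      def filter_and_reassign(alignments, genome_lengths):
          breadth = breadth_of_coverage(alignments, genome_lengths)
          kept = {g for g, b in breadth.items() if b >= MIN_BREADTH}
          # After unlikely genomes are removed, reads hitting only one kept genome become unique
          hits = defaultdict(set)
          for read, genome, _ in alignments:
              if genome in kept:
                  hits[read].add(genome)
          return {read: next(iter(gs)) for read, gs in hits.items() if len(gs) == 1}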

  10. Computational Strategies for a System-Level Understanding of Metabolism

    PubMed Central

    Cazzaniga, Paolo; Damiani, Chiara; Besozzi, Daniela; Colombo, Riccardo; Nobile, Marco S.; Gaglio, Daniela; Pescini, Dario; Molinari, Sara; Mauri, Giancarlo; Alberghina, Lilia; Vanoni, Marco

    2014-01-01

    Cell metabolism is the biochemical machinery that provides energy and building blocks to sustain life. Understanding its fine regulation is of pivotal relevance in several fields, from metabolic engineering applications to the treatment of metabolic disorders and cancer. Sophisticated computational approaches are needed to unravel the complexity of metabolism. To this aim, a plethora of methods have been developed, yet it is generally hard to identify which computational strategy is most suited for the investigation of a specific aspect of metabolism. This review provides an up-to-date description of the computational methods available for the analysis of metabolic pathways, discussing their main advantages and drawbacks.  In particular, attention is devoted to the identification of the appropriate scale and level of accuracy in the reconstruction of metabolic networks, and to the inference of model structure and parameters, especially when dealing with a shortage of experimental measurements. The choice of the proper computational methods to derive in silico data is then addressed, including topological analyses, constraint-based modeling and simulation of the system dynamics. A description of some computational approaches to gain new biological knowledge or to formulate hypotheses is finally provided. PMID:25427076

  11. Occupational risk identification using hand-held or laptop computers.

    PubMed

    Naumanen, Paula; Savolainen, Heikki; Liesivuori, Jyrki

    2008-01-01

    This paper describes the Work Environment Profile (WEP) program and its use in risk identification by computer. It is installed into a hand-held computer or a laptop to be used in risk identification during work site visits. A 5-category system is used to describe the identified risks in 7 groups, i.e., accidents, biological and physical hazards, ergonomic and psychosocial load, chemicals, and information technology hazards. Each group contains several qualifying factors. These 5 categories are colour-coded at this stage to aid with visualization. Risk identification produces visual summary images the interpretation of which is facilitated by colours. The WEP program is a tool for risk assessment which is easy to learn and to use both by experts and nonprofessionals. It is especially well adapted to be used both in small and in larger enterprises. Considerable time is saved as no paper notes are needed.

  12. Computational identification of transcriptional regulatory elements in DNA sequence

    PubMed Central

    GuhaThakurta, Debraj

    2006-01-01

    Identification and annotation of all the functional elements in the genome, including genes and the regulatory sequences, is a fundamental challenge in genomics and computational biology. Since regulatory elements are frequently short and variable, their identification and discovery using computational algorithms is difficult. However, significant advances have been made in the computational methods for modeling and detection of DNA regulatory elements. The availability of complete genome sequence from multiple organisms, as well as mRNA profiling and high-throughput experimental methods for mapping protein-binding sites in DNA, have contributed to the development of methods that utilize these auxiliary data to inform the detection of transcriptional regulatory elements. Progress is also being made in the identification of cis-regulatory modules and higher order structures of the regulatory sequences, which is essential to the understanding of transcription regulation in the metazoan genomes. This article reviews the computational approaches for modeling and identification of genomic regulatory elements, with an emphasis on the recent developments, and current challenges. PMID:16855295

  13. Human operator identification model and related computer programs

    NASA Technical Reports Server (NTRS)

    Kessler, K. M.; Mohr, J. N.

    1978-01-01

    Four computer programs which provide computational assistance in the analysis of man/machine systems are reported. The programs are: (1) Modified Transfer Function Program (TF); (2) Time Varying Response Program (TVSR); (3) Optimal Simulation Program (TVOPT); and (4) Linear Identification Program (SCIDNT). The TF program converts the time-domain state-variable system representation to a frequency-domain transfer-function system representation. The TVSR program computes time histories of the input/output responses of the human operator model. The TVOPT program is an optimal simulation program and is similar to TVSR in that it produces time histories of system states associated with an operator-in-the-loop system. The differences between the two programs are presented. The SCIDNT program is an open-loop identification code which operates on simulated data from TVOPT (or TVSR) or real operator data from motion simulators.

  14. Identification of Computational and Experimental Reduced-Order Models

    NASA Technical Reports Server (NTRS)

    Silva, Walter A.; Hong, Moeljo S.; Bartels, Robert E.; Piatak, David J.; Scott, Robert C.

    2003-01-01

    The identification of computational and experimental reduced-order models (ROMs) for the analysis of unsteady aerodynamic responses and for efficient aeroelastic analyses is presented. For the identification of a computational aeroelastic ROM, the CFL3Dv6.0 computational fluid dynamics (CFD) code is used. Flutter results for the AGARD 445.6 Wing and for a Rigid Semispan Model (RSM) computed using CFL3Dv6.0 are presented, including discussion of associated computational costs. Modal impulse responses of the unsteady aerodynamic system are computed using the CFL3Dv6.0 code and transformed into state-space form. The unsteady aerodynamic state-space ROM is then combined with a state-space model of the structure to create an aeroelastic simulation using the MATLAB/SIMULINK environment. The MATLAB/SIMULINK ROM is then used to rapidly compute aeroelastic transients, including flutter. The ROM shows excellent agreement with the aeroelastic analyses computed using the CFL3Dv6.0 code directly. For the identification of experimental unsteady pressure ROMs, results are presented for two configurations: the RSM and a Benchmark Supercritical Wing (BSCW). Both models were used to acquire unsteady pressure data due to pitching oscillations on the Oscillating Turntable (OTT) system at the Transonic Dynamics Tunnel (TDT). A deconvolution scheme involving a step input in pitch and the resultant step response in pressure, for several pressure transducers, is used to identify the unsteady pressure impulse responses. The identified impulse responses are then used to predict the pressure responses due to pitching oscillations at several frequencies. Comparisons with the experimental data are then presented.
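
    A much-simplified discrete version of the step-to-impulse deconvolution mentioned in the abstract is sketched below: in the noise-free, ideal case the impulse response is recovered by first-differencing the step response, and the identified response then predicts the output for arbitrary pitch inputs by convolution. The signals are synthetic, not data from the OTT/TDT experiments.

      # Simplified sketch: recover an impulse response from a step response and reuse it to
      # predict responses to other inputs (synthetic signals; not the CFL3D/OTT pipeline).
      import numpy as np

      dt = 0.01
      t = np.arange(0.0, 1.0, dt)
      h_true = np.exp(-5 * t) * np.sin(20 * t)          # hypothetical pressure impulse response
      step_response = np.cumsum(h_true) * dt            # response to a unit step in pitch

      h_identified = np.diff(step_response, prepend=0.0) / dt
      print(np.allclose(h_identified, h_true))          # True in this noise-free example

      # The identified impulse response predicts the response to, e.g., a 2 Hz pitching oscillation
      pitch_input = np.sin(2 * np.pi * 2 * t)
      pressure_pred = np.convolve(h_identified, pitch_input)[:len(t)] * dt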

  15. Computational identification of operons in microbial genomes.

    PubMed

    Zheng, Yu; Szustakowski, Joseph D; Fortnow, Lance; Roberts, Richard J; Kasif, Simon

    2002-08-01

    By applying graph representations to biochemical pathways, a new computational pipeline is proposed to find potential operons in microbial genomes. The algorithm relies on the fact that enzyme genes in operons tend to catalyze successive reactions in metabolic pathways. We applied this algorithm to 42 microbial genomes to identify putative operon structures. The predicted operons from Escherichia coli were compared with a selected metabolism-related operon dataset from the RegulonDB database, yielding a prediction sensitivity (89%) and specificity (87%) relative to this dataset. Several examples of detected operons are given and analyzed. Modular gene cluster transfer and operon fusion are observed. A further use of predicted operon data to assign function to putative genes was suggested and, as an example, a previous putative gene (MJ1604) from Methanococcus jannaschii is now annotated as a phosphofructokinase, which was regarded previously as a missing enzyme in this organism. GC content changes in the operon region and nonoperon region were examined. The results reveal a clear GC content transition at the boundaries of putative operons. We looked further into the conservation of operons across genomes. A trp operon alignment is analyzed in depth to show gene loss and rearrangement in different organisms during operon evolution.
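
    The GC-content transition mentioned at the end of the abstract can be illustrated with a simple windowed calculation: compare GC content on either side of a putative operon boundary. The sketch below uses a placeholder sequence and window sizes chosen only for illustration.

      # Sketch of a windowed GC-content comparison across a putative operon boundary
      # (placeholder sequence and window sizes; not the paper's pipeline).
      def gc_content(seq):
          seq = seq.upper()
          return (seq.count('G') + seq.count('C')) / max(len(seq), 1)

      def windowed_gc(seq, window=200, step=50):
          return [(i, gc_content(seq[i:i + window])) for i in range(0, len(seq) - window + 1, step)]

      def gc_shift_at(seq, boundary, flank=500):
          """GC-content difference between the regions just downstream and upstream of a boundary."""
          return gc_content(seq[boundary:boundary + flank]) - gc_content(seq[boundary - flank:boundary])

      genome = "ATGC" * 5000                       # placeholder sequence
      print(gc_shift_at(genome, boundary=10000))   # ~0 for this uniform toy sequence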

  16. Multilevel Relaxation in Low Level Computer Vision.

    DTIC Science & Technology

    1982-01-01

    Hanson, A. and Riseman, E. M., "Processing Cones: A Computational Structure for Image Analysis," in: Klinger, A. (Editors), Structured Computer Vision: Machine Perception through Hierarchical Computation Structures, Academic Press, New York, 1980. (The record's abstract field contains only these reference fragments.)

  17. 48 CFR 227.7203-10 - Contractor identification and marking of computer software or computer software documentation to...

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... and marking of computer software or computer software documentation to be furnished with restrictive... Rights in Computer Software and Computer Software Documentation 227.7203-10 Contractor identification and marking of computer software or computer software documentation to be furnished with restrictive...

  18. 48 CFR 227.7203-10 - Contractor identification and marking of computer software or computer software documentation to...

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... and marking of computer software or computer software documentation to be furnished with restrictive... Rights in Computer Software and Computer Software Documentation 227.7203-10 Contractor identification and marking of computer software or computer software documentation to be furnished with restrictive...

  19. 48 CFR 227.7203-10 - Contractor identification and marking of computer software or computer software documentation to...

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... and marking of computer software or computer software documentation to be furnished with restrictive... Rights in Computer Software and Computer Software Documentation 227.7203-10 Contractor identification and marking of computer software or computer software documentation to be furnished with restrictive...

  20. 48 CFR 227.7203-10 - Contractor identification and marking of computer software or computer software documentation to...

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... and marking of computer software or computer software documentation to be furnished with restrictive... Rights in Computer Software and Computer Software Documentation 227.7203-10 Contractor identification and marking of computer software or computer software documentation to be furnished with restrictive...

  1. 48 CFR 227.7203-10 - Contractor identification and marking of computer software or computer software documentation to...

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... and marking of computer software or computer software documentation to be furnished with restrictive... Rights in Computer Software and Computer Software Documentation 227.7203-10 Contractor identification and marking of computer software or computer software documentation to be furnished with restrictive...

  2. Computed tomographic identification of calcified optic nerve drusen

    SciTech Connect

    Ramirez, H.; Blatt, E.S.; Hibri, N.S.

    1983-07-01

    Four cases of optic disk drusen were accurately diagnosed with orbital computed tomography (CT). The radiologist should be aware of the characteristic CT finding of discrete calcification within an otherwise normal optic disk. This benign process is easily differentiated from lesions such as calcific neoplastic processes of the posterior globe. CT identification of optic disk drusen is essential in the evaluation of visual field defects, migraine-like headaches, and pseudopapilledema.

  3. An experimental modal testing/identification technique for personal computers

    NASA Technical Reports Server (NTRS)

    Roemer, Michael J.; Schlonski, Steven T.; Mook, D. Joseph

    1990-01-01

    A PC-based system for mode shape identification is evaluated. A time-domain modal identification procedure is utilized to identify the mode shapes of a beam apparatus from discrete time-domain measurements. The apparatus includes a cantilevered aluminum beam, four accelerometers, four low-pass filters, and the computer. The method combines an identification algorithm, the Eigensystem Realization Algorithm (ERA), with an estimation algorithm called Minimum Model Error (MME). The identification ability of this algorithm is compared with ERA alone, a frequency-response-function technique, and an Euler-Bernoulli beam model. Detection of modal parameters and mode shapes by the PC-based time-domain system is shown to be accurate in an application with an aluminum beam, while mode shapes identified by the frequency-domain technique are not as accurate as predicted. The new method is shown to be significantly less sensitive to noise and poorly excited modes than other leading methods. The results support the use of time-domain identification systems for mode shape prediction.

  4. Multi-level hot zone identification for pedestrian safety.

    PubMed

    Lee, Jaeyoung; Abdel-Aty, Mohamed; Choi, Keechoo; Huang, Helai

    2015-03-01

    According to the National Highway Traffic Safety Administration (NHTSA), while fatalities from traffic crashes have decreased, the proportion of pedestrian fatalities has steadily increased from 11% to 14% over the past decade. This study aims to identify hot zones at two zonal levels: zones where pedestrian crashes occur, and zones from which crash-involved pedestrians originate. A Bayesian Poisson lognormal simultaneous equation spatial error model (BPLSESEM) was estimated and revealed significant factors for the two target variables. Then, PSIs (potential for safety improvements) were computed using the model. Subsequently, a novel hot zone identification method was suggested to combine hot zones from which vulnerable pedestrians originated with hot zones where many pedestrian crashes occur. For the former zones, targeted safety education and awareness campaigns can be provided as countermeasures, whereas area-wide engineering treatments and enforcement may be effective safety treatments for the latter ones. Thus, it is expected that practitioners will be able to suggest appropriate safety treatments for pedestrian crashes using the method and results from this study.

  5. Splign: algorithms for computing spliced alignments with identification of paralogs

    PubMed Central

    Kapustin, Yuri; Souvorov, Alexander; Tatusova, Tatiana; Lipman, David

    2008-01-01

    Background The computation of accurate alignments of cDNA sequences against a genome is at the foundation of modern genome annotation pipelines. Several factors such as presence of paralogs, small exons, non-consensus splice signals, sequencing errors and polymorphic sites pose recognized difficulties to existing spliced alignment algorithms. Results We describe a set of algorithms behind a tool called Splign for computing cDNA-to-Genome alignments. The algorithms include a high-performance preliminary alignment, a compartment identification based on a formally defined model of adjacent duplicated regions, and a refined sequence alignment. In a series of tests, Splign has produced more accurate results than other tools commonly used to compute spliced alignments, in a reasonable amount of time. Conclusion Splign's ability to deal with various issues complicating the spliced alignment problem makes it a helpful tool in eukaryotic genome annotation processes and alternative splicing studies. Its performance is enough to align the largest currently available pools of cDNA data such as the human EST set on a moderate-sized computing cluster in a matter of hours. The duplications identification (compartmentization) algorithm can be used independently in other areas such as the study of pseudogenes. Reviewers This article was reviewed by: Steven Salzberg, Arcady Mushegian and Andrey Mironov (nominated by Mikhail Gelfand). PMID:18495041

  6. Computational Issues in Damping Identification for Large Scale Problems

    NASA Technical Reports Server (NTRS)

    Pilkey, Deborah L.; Roe, Kevin P.; Inman, Daniel J.

    1997-01-01

    Two damping identification methods are tested for efficiency in large-scale applications. One is an iterative routine, and the other a least squares method. Numerical simulations have been performed on multiple degree-of-freedom models to test the effectiveness of the algorithm and the usefulness of parallel computation for the problems. High Performance Fortran is used to parallelize the algorithm. Tests were performed using the IBM-SP2 at NASA Ames Research Center. The least squares method tested incurs high communication costs, which reduces the benefit of high performance computing. This method's memory requirement grows at a very rapid rate, meaning that larger problems can quickly exceed available computer memory. The iterative method's memory requirement grows at a much slower pace and is able to handle problems with 500+ degrees of freedom on a single processor. This method benefits from parallelization, and significant speedup can be seen for problems of 100+ degrees of freedom.

  7. Computer Literacy for Teachers: Level 1.

    ERIC Educational Resources Information Center

    Oliver, Marvin E.

    This brief, non-technical narrative for teachers addresses the following questions: (1) What are computers? (2) How do they work? (3) What can they do? (4) Why should we care? (5) What do they have to do with reading instruction? and (6) What is the future of computers for teachers and students? Specific topics discussed include the development of…

  8. Chip level simulation of fault tolerant computers

    NASA Technical Reports Server (NTRS)

    Armstrong, J. R.

    1983-01-01

    Chip level modeling techniques, functional fault simulation, simulation software development, a more efficient, high level version of GSP, and a parallel architecture for functional simulation are discussed.

  9. Factors Influencing Exemplary Science Teachers' Levels of Computer Use

    ERIC Educational Resources Information Center

    Hakverdi, Meral; Dana, Thomas M.; Swain, Colleen

    2011-01-01

    The purpose of this study was to examine exemplary science teachers' use of technology in science instruction, factors influencing their level of computer use, their level of knowledge/skills in using specific computer applications for science instruction, their use of computer-related applications/tools during their instruction, and their…

  10. Computer program to predict aircraft noise levels

    NASA Technical Reports Server (NTRS)

    Clark, B. J.

    1981-01-01

    Methods developed at the NASA Lewis Research Center for predicting the noise contributions from various aircraft noise sources were programmed to predict aircraft noise levels either in flight or in ground tests. The noise sources include fan inlet and exhaust, jet, flap (for powered lift), core (combustor), turbine, and airframe. Noise propagation corrections are available for atmospheric attenuation, ground reflections, extra ground attenuation, and shielding. Outputs can include spectra, overall sound pressure level, perceived noise level, tone-weighted perceived noise level, and effective perceived noise level at locations specified by the user. Footprint contour coordinates and approximate footprint areas can also be calculated. Inputs and outputs can be in either System International or U.S. customary units. The subroutines for each noise source and propagation correction are described. A complete listing is given.
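
    The record describes combining contributions from several noise sources at an observer location; overall sound pressure levels add logarithmically, as in the generic sketch below (the component levels are made-up values, and this is not the NASA Lewis program itself).

      # Combining component sound pressure levels into an overall level (standard logarithmic
      # addition of incoherent sources; illustrative values, not output of the NASA program).
      import math

      def combine_spl(levels_db):
          """Overall SPL, in dB, from independent source contributions."""
          return 10.0 * math.log10(sum(10.0 ** (L / 10.0) for L in levels_db))

      # e.g. fan inlet, jet, core, turbine, and airframe contributions at one observer location
      print(round(combine_spl([92.0, 95.0, 88.0, 85.0, 80.0]), 1))   # ~97.6 dB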

  11. Factors influencing exemplary science teachers' levels of computer use

    NASA Astrophysics Data System (ADS)

    Hakverdi, Meral

    This study examines exemplary science teachers' use of technology in science instruction, factors influencing their level of computer use, their level of knowledge/skills in using specific computer applications for science instruction, their use of computer-related applications/tools during their instruction, and their students' use of computer applications/tools in or for their science class. After a relevant review of the literature certain variables were selected for analysis. These variables included personal self-efficacy in teaching with computers, outcome expectancy, pupil-control ideology, level of computer use, age, gender, teaching experience, personal computer use, professional computer use and science teachers' level of knowledge/skills in using specific computer applications for science instruction. The sample for this study includes middle and high school science teachers who received the Presidential Award for Excellence in Science Teaching Award (sponsored by the White House and the National Science Foundation) between the years 1997 and 2003 from all 50 states and U.S. territories. Award-winning science teachers were contacted about the survey via e-mail or letter with an enclosed return envelope. Of the 334 award-winning science teachers, usable responses were received from 92 science teachers, which made a response rate of 27.5%. Analysis of the survey responses indicated that exemplary science teachers have a variety of knowledge/skills in using computer related applications/tools. The most commonly used computer applications/tools are information retrieval via the Internet, presentation tools, online communication, digital cameras, and data collection probes. Results of the study revealed that students' use of technology in their science classroom is highly correlated with the frequency of their science teachers' use of computer applications/tools. The results of the multiple regression analysis revealed that personal self-efficacy related to

  12. Identification and levels of airborne fungi in Portuguese primary schools.

    PubMed

    Madureira, Joana; Pereira, Cristiana; Paciência, Inês; Teixeira, João Paulo; de Oliveira Fernandes, Eduardo

    2014-01-01

    Several studies found associations between exposure to airborne fungi and allergy, infection, or irritation. This study aimed to characterize airborne fungi populations present in public primary schools in Porto, Portugal, during winter through quantification and identification procedures. Fungal concentration levels and identification were obtained in a total of 73 classrooms. The AirIdeal portable air sampler was used in combination with chloramphenicol malt extract agar. Results showed a wide range of indoor fungi levels, with indoor concentrations higher than outdoors. The most prevalent fungi found indoors were Penicillium sp. (>70%) and Cladosporium sp. As evidence indicates that indoor fungal exposure plays a role in asthma clinical status, these results may contribute to (1) promoting and implementing public health prevention programs and (2) formulating recommendations aimed at providing healthier school environments.

  13. Computational system identification of continuous-time nonlinear systems using approximate Bayesian computation

    NASA Astrophysics Data System (ADS)

    Krishnanathan, Kirubhakaran; Anderson, Sean R.; Billings, Stephen A.; Kadirkamanathan, Visakan

    2016-11-01

    In this paper, we derive a system identification framework for continuous-time nonlinear systems, for the first time using a simulation-focused computational Bayesian approach. Simulation approaches to nonlinear system identification have been shown to outperform regression methods under certain conditions, such as non-persistently exciting inputs and fast-sampling. We use the approximate Bayesian computation (ABC) algorithm to perform simulation-based inference of model parameters. The framework has the following main advantages: (1) parameter distributions are intrinsically generated, giving the user a clear description of uncertainty, (2) the simulation approach avoids the difficult problem of estimating signal derivatives as is common with other continuous-time methods, and (3) as noted above, the simulation approach improves identification under conditions of non-persistently exciting inputs and fast-sampling. Term selection is performed by judging parameter significance using parameter distributions that are intrinsically generated as part of the ABC procedure. The results from a numerical example demonstrate that the method performs well in noisy scenarios, especially in comparison to competing techniques that rely on signal derivative estimation.
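
    A minimal ABC rejection sketch conveys the simulation-based idea: draw parameters from a prior, simulate the continuous-time model forward, and accept draws whose simulated output lies within a tolerance of the data. The first-order model, prior, and tolerance below are illustrative assumptions; the paper's ABC variant and term-selection procedure are more elaborate.

      # Minimal ABC rejection sampler for a continuous-time model (illustrative model and
      # tolerances; not the algorithm configuration used in the paper).
      import numpy as np
      from scipy.integrate import odeint

      rng = np.random.default_rng(0)
      t = np.linspace(0.0, 5.0, 100)
      u = np.sin(t)                                            # input signal

      def simulate(theta):
          # dx/dt = -theta * x + u(t), x(0) = 0
          dyn = lambda x, ti: -theta * x + np.interp(ti, t, u)
          return odeint(dyn, 0.0, t).ravel()

      theta_true = 1.5
      data = simulate(theta_true) + rng.normal(0.0, 0.02, size=t.size)

      accepted = []
      for _ in range(5000):
          theta = rng.uniform(0.1, 5.0)                        # draw from the prior
          if np.sqrt(np.mean((simulate(theta) - data) ** 2)) < 0.05:   # distance below tolerance
              accepted.append(theta)

      print(np.mean(accepted), np.std(accepted))               # approximate posterior near 1.5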

  14. Sound source localization identification accuracy: Level and duration dependencies.

    PubMed

    Yost, William A

    2016-07-01

    Sound source localization accuracy for noises was measured for sources in the front azimuthal open field mainly as a function of overall noise level and duration. An identification procedure was used in which listeners identify which loudspeakers presented a sound. Noises were filtered and differed in bandwidth and center frequency. Sound source localization accuracy depended on the bandwidth of the stimuli, and for the narrow bandwidths, accuracy depended on the filter's center frequency. Sound source localization accuracy did not depend on overall level or duration.

  15. Proficiency Level--A Fuzzy Variable in Computer Learner Corpora

    ERIC Educational Resources Information Center

    Carlsen, Cecilie

    2012-01-01

    This article focuses on the proficiency level of texts in Computer Learner Corpora (CLCs). A claim is made that proficiency levels are often poorly defined in CLC design, and that the methods used for level assignment of corpus texts are not always adequate. Proficiency level can therefore, best be described as a fuzzy variable in CLCs,…

  16. Evaluation of new computer-enhanced identification program for microorganisms: adaptation of BioBASE for identification of members of the family Enterobacteriaceae.

    PubMed

    Miller, J M; Alachi, P

    1996-01-01

    We report the use of BioBASE, a computer-enhanced numerical identification software package, as a valuable aid for the rapid identification of unknown enteric bacilli when using conventional biochemicals. We compared BioBASE identification results with those of the Centers for Disease Control and Prevention's mainframe computer to determine the former's accuracy in identifying both common and rare unknown isolates of the family Enterobacteriaceae by using the same compiled data matrix. Of 293 enteric strains tested by BioBASE, 278 (94.9%) were correctly identified to the species level; 13 (4.4%) were assigned unacceptable or low discrimination profiles, but 8 of these (2.7%) were listed as the first choice; and 2 (0.7%) were not identified correctly because of their highly unusual biochemical profiles. The software is user friendly, rapid, and accurate and would be of value to any laboratory that uses conventional biochemicals.

  17. Learning support assessment study of a computer simulation for the development of microbial identification strategies.

    PubMed

    Johnson, T E; Gedney, C

    2001-05-01

    This paper describes a study that examined how microbiology students construct knowledge of bacterial identification while using a computer simulation. The purpose of this study was to understand how the simulation affects the cognitive processing of students during thinking, problem solving, and learning about bacterial identification and to determine how the simulation facilitates the learning of a domain-specific problem-solving strategy. As part of an upper-division microbiology course, five students participated in several simulation assignments. The data were collected using think-aloud protocol and video action logs as the students used the simulation. The analysis revealed two major themes that determined the performance of the students: Simulation Usage (how the students used the software features) and Problem-Solving Strategy Development (the strategy level students started with and the skill level they achieved when they completed their use of the simulation). Several conclusions emerged from the analysis of the data: (i) The simulation affects various aspects of cognitive processing by creating an environment that makes it possible to practice the application of a problem-solving strategy. The simulation was used as an environment that allowed students to practice the cognitive skills required to solve an unknown. (ii) Identibacter (the computer simulation) may be considered to be a cognitive tool to facilitate the learning of a bacterial identification problem-solving strategy. (iii) The simulation characteristics did support student learning of a problem-solving strategy. (iv) Students demonstrated problem-solving strategy development specific to bacterial identification. (v) Participants demonstrated an improved performance from their repeated use of the simulation.

  18. Computational Identification of Novel Genes: Current and Future Perspectives

    PubMed Central

    Klasberg, Steffen; Bitard-Feildel, Tristan; Mallet, Ludovic

    2016-01-01

    While it has long been thought that all genomic novelties are derived from the existing material, many genes lacking homology to known genes were found in recent genome projects. Some of these novel genes were proposed to have evolved de novo, i.e., out of noncoding sequences, whereas some have been shown to follow a duplication and divergence process. Their discovery called for an extension of the historical hypotheses about gene origination. Besides the theoretical breakthrough, increasing evidence accumulated that novel genes play important roles in evolutionary processes, including adaptation and speciation events. Different techniques are available to identify genes and classify them as novel. Their classification as novel is usually based on their similarity to known genes, or lack thereof, detected by comparative genomics or against databases. Computational approaches are further prime methods that can be based on existing models or on leveraging biological evidence from experiments. Identification of novel genes remains, however, a challenging task. With constant software and technology updates, no gold standard, and no available benchmark, evaluation and characterization of genomic novelty is a vibrant field. In this review, the classical and state-of-the-art tools for gene prediction are introduced. The current methods for novel gene detection are presented; the methodological strategies and their limits are discussed along with perspective approaches for further studies. PMID:27493475

  19. A Model of Computation for Bit-Level Concurrent Computing and Programming: APEC

    NASA Astrophysics Data System (ADS)

    Ajiro, Takashi; Tsuchida, Kensei

    A concurrent model of computation and a language based on the model for bit-level operation are useful for developing asynchronous and concurrent programs compositionally, which frequently use bit-level operations. Some examples are programs for video games, hardware emulation (including virtual machines), and signal processing. However, few models and languages are optimized and oriented to bit-level concurrent computation. We previously developed a visual programming language called A-BITS for bit-level concurrent programming. The language is based on a dataflow-like model that computes using processes that provide serial bit-level operations and FIFO buffers connected to them. It can express bit-level computation naturally and develop compositionally. We then devised a concurrent computation model called APEC (Asynchronous Program Elements Connection) for bit-level concurrent computation. This model enables precise and formal expression of the process of computation, and a notion of primitive program elements for controlling and operating can be expressed synthetically. Specifically, the model is based on a notion of uniform primitive processes, called primitives, that have three terminals and four ordered rules at most, as well as on bidirectional communication using vehicles called carriers. A new notion is that a carrier moving between two terminals can briefly express some kinds of computation such as synchronization and bidirectional communication. The model's properties make it most applicable to bit-level computation compositionally, since the uniform computation elements are enough to develop components that have practical functionality. Through future application of the model, our research may enable further research on a base model of fine-grain parallel computer architecture, since the model is suitable for expressing massive concurrency by a network of primitives.

  20. The algorithmic level is the bridge between computation and brain.

    PubMed

    Love, Bradley C

    2015-04-01

    Every scientist chooses a preferred level of analysis and this choice shapes the research program, even determining what counts as evidence. This contribution revisits Marr's (1982) three levels of analysis (implementation, algorithmic, and computational) and evaluates the prospect of making progress at each individual level. After reviewing limitations of theorizing within a level, two strategies for integration across levels are considered. One is top-down in that it attempts to build a bridge from the computational to algorithmic level. Limitations of this approach include insufficient theoretical constraint at the computation level to provide a foundation for integration, and that people are suboptimal for reasons other than capacity limitations. Instead, an inside-out approach is forwarded in which all three levels of analysis are integrated via the algorithmic level. This approach maximally leverages mutual data constraints at all levels. For example, algorithmic models can be used to interpret brain imaging data, and brain imaging data can be used to select among competing models. Examples of this approach to integration are provided. This merging of levels raises questions about the relevance of Marr's tripartite view.

  1. Computer ethics and tertiary level education in Hong Kong

    SciTech Connect

    Wong, E.Y.W.; Davison, R.M.; Wade, P.W.

    1994-12-31

    This paper seeks to highlight some ethical issues relating to the increasing proliferation of Information Technology into our everyday lives. The authors explain their understanding of computer ethics, and give some reasons why the study of computer ethics is becoming increasingly pertinent. The paper looks at some of the problems that arise in attempting to develop appropriate ethical concepts in a constantly changing environment, and explores some of the ethical dilemmas arising from the increasing use of computers. Some initial research undertaken to explore the ideas and understanding of tertiary level students in Hong Kong on a number of ethical issues of interest is described, and our findings discussed. We hope that presenting this paper and eliciting subsequent discussion will enable us to draw up more comprehensive guidelines for the teaching of computer related ethics to tertiary level students, as well as reveal some directions for future research.

  2. Computing NLTE Opacities -- Node Level Parallel Calculation

    SciTech Connect

    Holladay, Daniel

    2015-09-11

    Presentation. The goal: to produce a robust library capable of computing reasonably accurate opacities inline with the assumption of LTE relaxed (non-LTE). Near term: demonstrate acceleration of non-LTE opacity computation. Far term (if funded): connect to application codes with in-line capability and compute opacities. Study science problems. Use efficient algorithms that expose many levels of parallelism and utilize good memory access patterns for use on advanced architectures. Portability to multiple types of hardware including multicore processors, manycore processors such as KNL, GPUs, etc. Easily coupled to radiation hydrodynamics and thermal radiative transfer codes.

  3. Identification of natural images and computer-generated graphics based on statistical and textural features.

    PubMed

    Peng, Fei; Li, Jiao-ting; Long, Min

    2015-03-01

    To discriminate the acquisition pipelines of digital images, a novel scheme for the identification of natural images and computer-generated graphics is proposed based on statistical and textural features. First, the differences between them are investigated from the viewpoint of statistics and texture, and a 31-dimensional feature vector is acquired for identification. Then, LIBSVM is used for the classification. Finally, the experimental results are presented. The results show that the scheme can achieve an identification accuracy of 97.89% for computer-generated graphics, and an identification accuracy of 97.75% for natural images. The analyses also demonstrate that the proposed method performs well compared with some existing methods based only on statistical features or other features. The method has great potential for the identification of natural images and computer-generated graphics.
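
    The classification stage can be sketched with scikit-learn's SVC, which wraps LIBSVM; the 31-dimensional feature vectors below are random placeholders rather than the paper's statistical and textural descriptors.

      # Sketch of the SVM classification step over pre-extracted feature vectors.
      # scikit-learn's SVC (a LIBSVM wrapper) stands in; the features are placeholders.
      import numpy as np
      from sklearn.model_selection import train_test_split
      from sklearn.preprocessing import StandardScaler
      from sklearn.svm import SVC

      rng = np.random.default_rng(42)
      X_natural = rng.normal(0.0, 1.0, size=(200, 31))         # placeholder 31-D feature vectors
      X_cg = rng.normal(0.5, 1.0, size=(200, 31))
      X = np.vstack([X_natural, X_cg])
      y = np.array([0] * 200 + [1] * 200)                      # 0 = natural, 1 = computer-generated

      X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
      scaler = StandardScaler().fit(X_tr)                      # fit scaling on the training split only
      clf = SVC(kernel='rbf', C=1.0, gamma='scale').fit(scaler.transform(X_tr), y_tr)
      print(clf.score(scaler.transform(X_te), y_te))           # test accuracy on the toy data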

  4. Identification of Learning Processes by Means of Computer Graphics.

    ERIC Educational Resources Information Center

    Sorensen, Birgitte Holm

    1993-01-01

    Describes a development project for the use of computer graphics and video in connection with an inservice training course for primary education teachers in Denmark. Topics addressed include research approaches to computers; computer graphics in learning processes; activities relating to computer graphics; the role of the teacher; and student…

  5. Dysregulation in level of goal and action identification across psychological disorders

    PubMed Central

    Watkins, Edward

    2011-01-01

    Goals, events, and actions can be mentally represented within a hierarchical framework that ranges from more abstract to more concrete levels of identification. A more abstract level of identification involves general, superordinate, and decontextualized mental representations that convey the meaning of goals, events, and actions, “why” an action is performed, and its purpose, ends, and consequences. A more concrete level of identification involves specific and subordinate mental representations that include contextual details of goals, events, and actions, and the specific “how” details of an action. This review considers three lines of evidence for considering that dysregulation of level of goal/action identification may be a transdiagnostic process. First, there is evidence that different levels of identification have distinct functional consequences and that in non-clinical samples level of goal/action identification appears to be regulated in a flexible and adaptive way to match the level of goal/action identification to circumstances. Second, there is evidence that level of goal/action identification causally influences symptoms and processes involved in psychological disorders, including emotional response, repetitive thought, impulsivity, problem solving and procrastination. Third, there is evidence that the level of goal/action identification is biased and/or dysregulated in certain psychological disorders, with a bias towards more abstract identification for negative events in depression, GAD, PTSD, and social anxiety. PMID:20579789

  6. Computational simulation of drug delivery at molecular level.

    PubMed

    Li, Youyong; Hou, Tingjun

    2010-01-01

    The field of drug delivery is advancing rapidly. By controlling the precise level and/or location of a given drug in the body, side effects are reduced, doses are lowered, and new therapies are possible. Nonetheless, substantial challenges remain for delivering specific drugs into specific cells. Computational methods to predict the binding and dynamics between drug molecule and its carrier are increasingly desirable to minimize the investment in drug design and development. Significant progress in computational simulation is making it possible to understand the mechanism of drug delivery. This review summarizes the computational methods and progress of four categories of drug delivery systems: dendrimers, polymer micelle, liposome and carbon nanotubes. Computational simulations are particularly valuable in designing better drug carriers and addressing issues that are difficult to be explored by laboratory experiments, such as diffusion, dynamics, etc.

  7. A Program for the Identification of the Enterobacteriaceae for Use in Teaching the Principles of Computer Identification of Bacteria.

    ERIC Educational Resources Information Center

    Hammonds, S. J.

    1990-01-01

    A technique for the numerical identification of bacteria using normalized likelihoods calculated from a probabilistic database is described, and the principles of the technique are explained. The listing of the computer program is included. Specimen results from the program, and examples of how they should be interpreted, are given. (KR)
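
    The normalized-likelihood calculation described in the record can be sketched as follows: each taxon carries a probability of a positive result for each biochemical test, the likelihood of the observed test pattern is computed per taxon, and the likelihoods are normalized to sum to one. The taxa, tests, and probabilities below are toy values, not the program's database.

      # Sketch of numerical identification by normalized likelihoods (toy probability matrix;
      # not the database or program described in the record).
      TESTS = ["indole", "citrate", "urease"]

      DATABASE = {   # P(positive result | taxon), illustrative values only
          "Escherichia coli":      {"indole": 0.98, "citrate": 0.01, "urease": 0.01},
          "Klebsiella pneumoniae": {"indole": 0.01, "citrate": 0.95, "urease": 0.95},
          "Proteus mirabilis":     {"indole": 0.02, "citrate": 0.60, "urease": 0.98},
      }

      def normalized_likelihoods(observed):
          """observed: dict test -> bool. Returns taxon -> normalized likelihood."""
          scores = {}
          for taxon, probs in DATABASE.items():
              L = 1.0
              for test in TESTS:
                  p = probs[test]
                  L *= p if observed[test] else (1.0 - p)
              scores[taxon] = L
          total = sum(scores.values())
          return {taxon: L / total for taxon, L in scores.items()}

      print(normalized_likelihoods({"indole": True, "citrate": False, "urease": False}))
      # -> Escherichia coli dominates with a normalized likelihood near 1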

  8. The Reality of Computers at the Community College Level.

    ERIC Educational Resources Information Center

    Leone, Stephen J.

    Writing teachers at the community college level who teach using a computer have come to accept the fact that it is more than "just teaching" composition. Such teaching often requires instructors to be as knowledgeable as some of the technicians. Two-year college students and faculty are typically given little support in using computers…

  9. OS friendly microprocessor architecture: Hardware level computer security

    NASA Astrophysics Data System (ADS)

    Jungwirth, Patrick; La Fratta, Patrick

    2016-05-01

    We present an introduction to the patented OS Friendly Microprocessor Architecture (OSFA) and hardware level computer security. Conventional microprocessors have not tried to balance hardware performance and OS performance at the same time. Conventional microprocessors have depended on the Operating System for computer security and information assurance. The goal of the OS Friendly Architecture is to provide a high performance and secure microprocessor and OS system. We are interested in cyber security, information technology (IT), and SCADA control professionals reviewing the hardware level security features. The OS Friendly Architecture is a switched set of cache memory banks in a pipeline configuration. For light-weight threads, the memory pipeline configuration provides near instantaneous context switching times. The pipelining and parallelism provided by the cache memory pipeline provides for background cache read and write operations while the microprocessor's execution pipeline is running instructions. The cache bank selection controllers provide arbitration to prevent the memory pipeline and microprocessor's execution pipeline from accessing the same cache bank at the same time. This separation allows the cache memory pages to transfer to and from level 1 (L1) caching while the microprocessor pipeline is executing instructions. Computer security operations are implemented in hardware. By extending Unix file permissions bits to each cache memory bank and memory address, the OSFA provides hardware level computer security.

  10. Logic as Marr's Computational Level: Four Case Studies.

    PubMed

    Baggio, Giosuè; van Lambalgen, Michiel; Hagoort, Peter

    2015-04-01

    We sketch four applications of Marr's levels-of-analysis methodology to the relations between logic and experimental data in the cognitive neuroscience of language and reasoning. The first part of the paper illustrates the explanatory power of computational level theories based on logic. We show that a Bayesian treatment of the suppression task in reasoning with conditionals is ruled out by EEG data, supporting instead an analysis based on defeasible logic. Further, we describe how results from an EEG study on temporal prepositions can be reanalyzed using formal semantics, addressing a potential confound. The second part of the article demonstrates the predictive power of logical theories drawing on EEG data on processing progressive constructions and on behavioral data on conditional reasoning in people with autism. Logical theories can constrain processing hypotheses all the way down to neurophysiology, and conversely neuroscience data can guide the selection of alternative computational level models of cognition.

  11. Identification of condition-specific regulatory modules through multi-level motif and mRNA expression analysis

    PubMed Central

    Chen, Li; Wang, Yue; Hoffman, Eric P.; Riggins, Rebecca B.; Clarke, Robert

    2013-01-01

    Many computational methods for identification of transcription regulatory modules often result in many false positives in practice due to noise in the binding information and gene expression profiling data. In this paper, we propose a multi-level strategy for condition-specific gene regulatory module identification by integrating motif binding information and gene expression data through support vector regression and significance analysis. We have demonstrated the feasibility of the proposed method on a yeast cell cycle data set. The study on a breast cancer microarray data set shows that it can successfully identify significant and reliable regulatory modules associated with breast cancer. PMID:20054984
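
    The support vector regression step can be illustrated with a toy sketch: regress condition-specific expression changes on motif binding scores and rank motifs by the magnitude of their weights. Random matrices stand in for the motif-score and expression data, and a linear kernel is assumed so that per-motif weights can be read off directly; the paper's actual pipeline and significance analysis are not reproduced.

      # Toy sketch of the SVR step: expression log-ratios regressed on motif binding scores,
      # with motifs ranked by weight magnitude (random stand-in data; linear kernel assumed).
      import numpy as np
      from sklearn.svm import SVR

      rng = np.random.default_rng(1)
      n_genes, n_motifs = 500, 20
      X = rng.normal(size=(n_genes, n_motifs))        # motif binding scores per gene
      true_w = np.zeros(n_motifs)
      true_w[[2, 7]] = [1.5, -1.0]                    # two "active" regulators in this condition
      y = X @ true_w + rng.normal(0.0, 0.3, n_genes)  # condition-specific expression changes

      svr = SVR(kernel='linear', C=1.0, epsilon=0.1).fit(X, y)
      weights = svr.coef_.ravel()
      ranked = np.argsort(-np.abs(weights))
      print(ranked[:3], weights[ranked[:3]])          # motifs 2 and 7 should rank at the top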

  12. Contours identification of elements in a cone beam computed tomography for investigating maxillary cysts

    NASA Astrophysics Data System (ADS)

    Chioran, Doina; Nicoarǎ, Adrian; Roşu, Şerban; Cǎrligeriu, Virgil; Ianeş, Emilia

    2013-10-01

    Digital processing of two-dimensional cone beam computed tomography slices starts with identification of the contours of the elements within them. This paper deals with the collective work of specialists in medicine, applied mathematics, and computer science on the elaboration and implementation of algorithms for dental 2D imagery.

  13. Computational model for supporting SHM systems design: Damage identification via numerical analyses

    NASA Astrophysics Data System (ADS)

    Sartorato, Murilo; de Medeiros, Ricardo; Vandepitte, Dirk; Tita, Volnei

    2017-02-01

    This work presents a computational model to simulate thin structures monitored by piezoelectric sensors in order to support the design of SHM systems based on vibration methods. A new shell finite element model was proposed and implemented via a User ELement subroutine (UEL) in the commercial package ABAQUS™. This model was based on a modified First Order Shear Theory (FOST) for piezoelectric composite laminates. Damaged cantilever beams with two piezoelectric sensors in different positions were then investigated using experimental analyses and the proposed computational model. A maximum difference of 7.45% between the magnitudes of the numerical and experimental FRFs was found near the resonance regions. For damage identification, different levels of damage severity were evaluated with seven damage metrics, including one proposed by the present authors. Numerical and experimental damage metric values were compared and showed good agreement in terms of trend. Finally, based on comparisons of numerical and experimental results, the potential and limitations of the proposed computational model for supporting SHM system design are discussed.

  14. Computing Bounds on Resource Levels for Flexible Plans

    NASA Technical Reports Server (NTRS)

    Muscettola, Nicola; Rijsman, David

    2009-01-01

    A new algorithm efficiently computes the tightest exact bound on the levels of resources induced by a flexible activity plan (see figure). Tightness of bounds is extremely important for computations involved in planning because tight bounds can save potentially exponential amounts of search (through early backtracking and detection of solutions), relative to looser bounds. The bound computed by the new algorithm, denoted the resource-level envelope, constitutes the measure of maximum and minimum consumption of resources at any time for all fixed-time schedules in the flexible plan. At each time, the envelope guarantees that there are two fixed-time instantiations: one that produces the minimum level and one that produces the maximum level. Therefore, the resource-level envelope is the tightest possible resource-level bound for a flexible plan, because any tighter bound would exclude the contribution of at least one fixed-time schedule. If the resource-level envelope can be computed efficiently, it could replace the looser bounds currently used in the inner cores of constraint-posting scheduling algorithms, with the potential for great improvements in performance. What is needed to reduce the cost of computation is an algorithm whose measure of complexity is no greater than a low-degree polynomial in N (where N is the number of activities). The new algorithm satisfies this need. In this algorithm, the computation of resource-level envelopes is based on a novel combination of (1) the theory of shortest paths in the temporal-constraint network for the flexible plan and (2) the theory of maximum flows for a flow network derived from the temporal and resource constraints. The measure of asymptotic complexity of the algorithm is O(N · O(maxflow(N))), where O(x) denotes an amount of computing time or a number of arithmetic operations proportional to a number of the order of x, and O(maxflow(N)) is the measure of complexity (and thus of cost) of a maximum-flow computation.
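
    The meaning of the envelope can be illustrated by brute force on a toy plan: the sketch below enumerates every fixed-time instantiation of two flexible activities and takes the pointwise minimum and maximum resource level. This exponential enumeration is exactly what the polynomial-time shortest-path/maximum-flow algorithm described above avoids; the activity data are invented.

        # Brute-force resource-level envelope for a toy flexible plan: each
        # activity holds one resource unit while it runs and has a flexible
        # integer start time. The envelope is the pointwise min/max over all
        # fixed-time instantiations.
        from itertools import product

        # (earliest_start, latest_start, duration, resource_use_while_running)
        activities = [(0, 3, 2, 1), (1, 4, 3, 1)]
        horizon = 10

        def level_profile(starts):
            """Resource level at each integer time for one fixed-time schedule."""
            profile = [0] * horizon
            for (lo, hi, dur, use), s in zip(activities, starts):
                for t in range(s, min(s + dur, horizon)):
                    profile[t] += use
            return profile

        start_choices = product(*[range(lo, hi + 1) for lo, hi, _, _ in activities])
        profiles = [level_profile(s) for s in start_choices]
        envelope_min = [min(p[t] for p in profiles) for t in range(horizon)]
        envelope_max = [max(p[t] for p in profiles) for t in range(horizon)]
        print("min envelope:", envelope_min)
        print("max envelope:", envelope_max)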

  15. Domain identification in impedance computed tomography by spline collocation method

    NASA Technical Reports Server (NTRS)

    Kojima, Fumio

    1990-01-01

    A method for estimating an unknown domain in elliptic boundary value problems is considered. The problem is formulated as an inverse problem of integral equations of the second kind. A computational method is developed using a spline collocation scheme. The results can be applied to the inverse problem of impedance computed tomography (ICT) for image reconstruction.

  16. Root morphology and anatomical patterns in forensic dental identification: a comparison of computer-aided identification with traditional forensic dental identification.

    PubMed

    van der Meer, Dirk T; Brumit, Paula C; Schrader, Bruce A; Dove, Stephen B; Senn, David R

    2010-11-01

    An online forensic dental identification exercise was conducted involving 24 antemortem-postmortem (AM-PM) dental radiograph pairs from actual forensic identification cases. Images had been digitally cropped to remove coronal tooth structure and dental restorations. Volunteer forensic odontologists were passively recruited to compare the AM-PM dental radiographs online and conclude identification status using the guidelines for identification from the American Board of Forensic Odontology. The mean accuracy rate for identification was 86.0% (standard deviation 9.2%). The same radiograph pairs were compared using a digital imaging software algorithm, which generated a normalized coefficient of similarity for each pair. Twenty of the radiograph pairs generated a mean accuracy of 85.0%; four of the pairs could not be used to generate a coefficient of similarity. Receiver operating characteristic (ROC) and area under the curve (AUC) analyses confirmed the good discrimination ability of both methods (online exercise = 0.978; UT-ID index = 0.923), and Spearman's rank correlation analysis (0.683) indicated good correlation between the results of the two methods. Computer-aided dental identification allows for an objective comparison of AM-PM radiographs and can be a useful tool to support a forensic dental identification conclusion.

  17. Identification of the pleural fissures with computed tomography

    SciTech Connect

    Marks, B.W.; Kuhns, L.R.

    1982-04-01

    The pleural fissures can be identified as avascular planes within the pulmonary parenchyma on CT scans. A retrospective analysis of 23 consecutive scans was conducted to assess identification of the fissures. On 21% of the axial images, a "ground glass" band was identified within the avascular plane, probably due to partial volume averaging of the pleural fissure with the adjacent lung. The pleural fissures could be identified in 84% of cases.

  18. Data identification for improving gene network inference using computational algebra.

    PubMed

    Dimitrova, Elena; Stigler, Brandilyn

    2014-11-01

    Identification of models of gene regulatory networks is sensitive to the amount of data used as input. Considering the substantial costs in conducting experiments, it is of value to have an estimate of the amount of data required to infer the network structure. To minimize wasted resources, it is also beneficial to know which data are necessary to identify the network. Knowledge of the data and knowledge of the terms in polynomial models are often required a priori in model identification. In applications, it is unlikely that the structure of a polynomial model will be known, which may force data sets to be unnecessarily large in order to identify a model. Furthermore, none of the known results provides any strategy for constructing data sets to uniquely identify a model. We provide a specialization of an existing criterion for deciding when a set of data points identifies a minimal polynomial model when its monomial terms have been specified. Then, we relax the requirement of the knowledge of the monomials and present results for model identification given only the data. Finally, we present a method for constructing data sets that identify minimal polynomial models.
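
    The linear-algebra core of the "terms specified" case can be sketched briefly: once the monomial support of a model is fixed, the data determine its coefficients uniquely exactly when the matrix of monomial values at the data points has full column rank. The monomial set and data below are hypothetical, and the rank test is only an illustration, not the paper's algebraic criterion.

        # Rank test for unique identifiability once the monomial terms are fixed.
        import numpy as np

        # Hypothetical monomial support for one node's update rule: {1, x1, x1*x2}
        def monomials(state):
            x1, x2 = state
            return [1.0, x1, x1 * x2]

        data_points = [(0, 1), (1, 0), (1, 1)]      # observed network states
        M = np.array([monomials(s) for s in data_points])

        identifiable = np.linalg.matrix_rank(M) == M.shape[1]
        print("coefficients uniquely identifiable:", identifiable)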

  19. Use of a Computer-Assisted Identification System in the Identification of the Remains of Deceased USAF Personnel.

    DTIC Science & Technology

    1988-04-01

    In 1987 he completed a Master of Arts degree in Computer Resource Management. Major Triplett also graduated from AFIP's Forensic Odontology Course and ... Thomas, 1973. Articles and Periodicals: 4. Brannon, Lawrence S., Capt, USA. "Forensic Odontology: An Application for the Army Dentist." Military ... Military Medicine, Vol. 146 (April 1981), pp. 262-264. 12. Kim, Ho Wohn. "The Role of Forensic Odontology in the Field of Human Identification" ...

  20. All-memristive neuromorphic computing with level-tuned neurons

    NASA Astrophysics Data System (ADS)

    Pantazi, Angeliki; Woźniak, Stanisław; Tuma, Tomas; Eleftheriou, Evangelos

    2016-09-01

    In the new era of cognitive computing, systems will be able to learn and interact with the environment in ways that will drastically enhance the capabilities of current processors, especially in extracting knowledge from the vast amounts of data obtained from many sources. Brain-inspired neuromorphic computing systems increasingly attract research interest as an alternative to the classical von Neumann processor architecture, mainly because of the coexistence of memory and processing units. In these systems, the basic components are neurons interconnected by synapses. The neurons, based on their nonlinear dynamics, generate spikes that provide the main communication mechanism. The computational tasks are distributed across the neural network, where synapses implement both the memory and the computational units, by means of learning mechanisms such as spike-timing-dependent plasticity. In this work, we present an all-memristive neuromorphic architecture comprising neurons and synapses realized by using the physical properties and state dynamics of phase-change memristors. The architecture employs a novel concept of interconnecting the neurons in the same layer, resulting in level-tuned neuronal characteristics that preferentially process input information. We demonstrate the proposed architecture in the tasks of unsupervised learning and detection of multiple temporal correlations in parallel input streams. The efficiency of the neuromorphic architecture, along with the homogeneous neuro-synaptic dynamics implemented with nanoscale phase-change memristors, represents a significant step towards the development of ultrahigh-density neuromorphic co-processors.

  1. All-memristive neuromorphic computing with level-tuned neurons.

    PubMed

    Pantazi, Angeliki; Woźniak, Stanisław; Tuma, Tomas; Eleftheriou, Evangelos

    2016-09-02

    In the new era of cognitive computing, systems will be able to learn and interact with the environment in ways that will drastically enhance the capabilities of current processors, especially in extracting knowledge from the vast amounts of data obtained from many sources. Brain-inspired neuromorphic computing systems increasingly attract research interest as an alternative to the classical von Neumann processor architecture, mainly because of the coexistence of memory and processing units. In these systems, the basic components are neurons interconnected by synapses. The neurons, based on their nonlinear dynamics, generate spikes that provide the main communication mechanism. The computational tasks are distributed across the neural network, where synapses implement both the memory and the computational units, by means of learning mechanisms such as spike-timing-dependent plasticity. In this work, we present an all-memristive neuromorphic architecture comprising neurons and synapses realized by using the physical properties and state dynamics of phase-change memristors. The architecture employs a novel concept of interconnecting the neurons in the same layer, resulting in level-tuned neuronal characteristics that preferentially process input information. We demonstrate the proposed architecture in the tasks of unsupervised learning and detection of multiple temporal correlations in parallel input streams. The efficiency of the neuromorphic architecture, along with the homogeneous neuro-synaptic dynamics implemented with nanoscale phase-change memristors, represents a significant step towards the development of ultrahigh-density neuromorphic co-processors.

  2. Computer-guided drug repurposing: identification of trypanocidal activity of clofazimine, benidipine and saquinavir.

    PubMed

    Bellera, Carolina L; Balcazar, Darío E; Vanrell, M Cristina; Casassa, A Florencia; Palestro, Pablo H; Gavernet, Luciana; Labriola, Carlos A; Gálvez, Jorge; Bruno-Blanch, Luis E; Romano, Patricia S; Carrillo, Carolina; Talevi, Alan

    2015-03-26

    In spite of remarkable advances in knowledge of Trypanosoma cruzi biology, no medications to treat Chagas disease have been approved in the last 40 years and almost 8 million people remain infected. Since the public sector and non-profit organizations play a significant role in research efforts on Chagas disease, it is important to implement research strategies that promote translation of basic research into clinical practice. Recent international public-private initiatives address the potential of drug repositioning (i.e. finding second or further medical uses for known medications), which can substantially improve the success rate of clinical trials and innovation in the pharmaceutical field. In this work, we present the computer-aided identification of the approved drugs clofazimine, benidipine and saquinavir as potential trypanocidal compounds and test their effects at both the biochemical and the cellular level on different parasite stages. Based on the results, we discuss the biopharmaceutical, toxicological and physiopathological criteria applied in deciding to move clofazimine and benidipine into the preclinical phase, in an acute model of infection. The article illustrates the potential of computer-guided drug repositioning to integrate and optimize drug discovery and preclinical development; it also proposes rational rules for selecting which of the repositioned candidates should advance to investigational drug status, and offers new insight on clofazimine and benidipine as candidate treatments for Chagas disease. One Sentence Summary: We present the computer-guided drug repositioning of three approved drugs as potential new treatments for Chagas disease, integrating computer-aided drug screening with biochemical, cellular and preclinical tests.

  3. A new computer-assisted technique to aid personal identification.

    PubMed

    De Angelis, Danilo; Sala, Remo; Cantatore, Angela; Grandi, Marco; Cattaneo, Cristina

    2009-07-01

    The paper describes a procedure aimed at identification from two-dimensional (2D) images (video-surveillance tapes, for example) by comparison with a three-dimensional (3D) facial model of a suspect. The application is intended to provide a tool which can help in analyzing compatibility or incompatibility between a criminal's and a suspect's facial traits. The authors apply the concept of "geometrically compatible images". The idea is to use a scanner to reconstruct a 3D facial model of a suspect and to compare it to a frame extracted from the video-surveillance sequence which shows the face of the perpetrator. Repositioning and reorientation of the 3D model according to the subject's face as framed in the crime-scene photo are accomplished manually, after automatic resizing. Repositioning and reorientation are performed at anthropometric landmarks that are distinctive for that person and detected both on the 2D face and on the 3D model. In this way, the superimposition between the original two-dimensional facial image and the three-dimensional one is obtained, and a judgment is formulated by an expert on the basis of the fit between the anatomical facial regions of the two subjects. The procedure reduces the influence of face orientation and may be a useful tool in identification.

  4. Parallel Computation of the Topology of Level Sets

    SciTech Connect

    Pascucci, V; Cole-McLaughlin, K

    2004-12-16

    This paper introduces two efficient algorithms that compute the Contour Tree of a 3D scalar field F and its augmented version with the Betti numbers of each isosurface. The Contour Tree is a fundamental data structure in scientific visualization that is used to preprocess the domain mesh to allow optimal computation of isosurfaces with minimal overhead storage. The Contour Tree can also be used to build user interfaces reporting the complete topological characterization of a scalar field, as shown in Figure 1. Data exploration time is reduced since the user understands the evolution of level set components with changing isovalue. The Augmented Contour Tree provides even more accurate information by segmenting the range space of the scalar field into portions of invariant topology. The exploration time for a single isosurface is also improved since its genus is known in advance. Our first new algorithm augments any given Contour Tree with the Betti numbers of all possible corresponding isocontours in time linear in the size of the tree. Moreover, we show how to extend the scheme introduced in [3] with the Betti number computation without increasing its complexity. Thus, we improve the time complexity of our previous approach [10] from O(m log m) to O(n log n + m), where m is the number of cells and n is the number of vertices in the domain of F. Our second contribution is a new divide-and-conquer algorithm that computes the Augmented Contour Tree with improved efficiency. The approach computes the output Contour Tree by merging two intermediate Contour Trees and is independent of the interpolant. In this way we confine any knowledge regarding a specific interpolant to an independent function that computes the tree for a single cell. We have implemented this function for the trilinear interpolant and plan to replace it with higher-order interpolants when needed. The time complexity is O(n + t log n), where t is the number of critical points of F. For the first time
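
    As a small illustration of the information a Contour Tree organizes, the sketch below counts the connected components of superlevel sets of a synthetic 2D field as the isovalue changes; this brute-force count per isovalue is what the tree encodes compactly, and it is not the algorithm proposed in the paper.

        # Count connected components of superlevel sets of a synthetic field.
        import numpy as np
        from scipy import ndimage

        y, x = np.mgrid[0:64, 0:64]
        F = (np.exp(-((x - 20) ** 2 + (y - 20) ** 2) / 60.0)
             + 0.6 * np.exp(-((x - 45) ** 2 + (y - 45) ** 2) / 60.0))

        for iso in (0.2, 0.5, 0.9):
            superlevel = F >= iso                    # binary superlevel set
            _, n_components = ndimage.label(superlevel)
            print(f"isovalue {iso}: {n_components} component(s)")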

  5. Correlations of Electrophysiological Measurements with Identification Levels of Ancient Chinese Characters

    PubMed Central

    Qi, Zhengyang; Wang, Xiaolong; Hao, Shuang; Zhu, Chuanlin; He, Weiqi; Luo, Wenbo

    2016-01-01

    Studies of event-related potentials (ERPs) in the human brain have shown that the N170 component can reliably distinguish among different object categories. However, it is unclear whether this is true for different identifiable levels within a single category. In the present study, we used ERP recordings to examine the neural response to different identification levels and orientations (upright vs. inverted) of Chinese characters. The results showed that P1, N170, and P250 were modulated by the different identification levels of the Chinese characters. Moreover, time-frequency analysis showed similar results, indicating that identification levels were associated with object recognition, particularly during the processing of a single categorical stimulus. PMID:26982215

  6. A Simple Computer Application for the Identification of Conifer Genera

    ERIC Educational Resources Information Center

    Strain, Steven R.; Chmielewski, Jerry G.

    2010-01-01

    The National Science Education Standards prescribe that an understanding of the importance of classifying organisms be one component of a student's educational experience in the life sciences. The use of a classification scheme to identify organisms is one way of addressing this goal. We describe Conifer ID, a computer application that assists…

  7. A survey of computational methods and error rate estimation procedures for peptide and protein identification in shotgun proteomics

    PubMed Central

    Nesvizhskii, Alexey I.

    2010-01-01

    This manuscript provides a comprehensive review of the peptide and protein identification process using tandem mass spectrometry (MS/MS) data generated in shotgun proteomic experiments. The commonly used methods for assigning peptide sequences to MS/MS spectra are critically discussed and compared, from basic strategies to advanced multi-stage approaches. Particular attention is paid to the problem of false-positive identifications. Existing statistical approaches for assessing the significance of peptide-to-spectrum matches are surveyed, ranging from single-spectrum approaches such as expectation values to global error rate estimation procedures such as false discovery rates and posterior probabilities. The importance of using auxiliary discriminant information (mass accuracy, peptide separation coordinates, digestion properties, etc.) is discussed, and advanced computational approaches for joint modeling of multiple sources of information are presented. This review also includes a detailed analysis of the issues affecting the interpretation of data at the protein level, including the amplification of error rates when going from the peptide to the protein level, and the ambiguities in inferring the identities of sample proteins in the presence of shared peptides. Commonly used methods for computing protein-level confidence scores are discussed in detail. The review concludes with a discussion of several outstanding computational issues. PMID:20816881
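
    One of the global error-rate procedures mentioned above, target-decoy FDR estimation, can be sketched in a few lines; the peptide-spectrum match scores below are synthetic, and real pipelines typically refine this estimate with posterior probabilities.

        # Target-decoy FDR estimate over synthetic peptide-spectrum match scores.
        import numpy as np

        rng = np.random.default_rng(1)
        target_scores = rng.normal(3.0, 1.0, 1000)   # matches against the real database
        decoy_scores = rng.normal(1.5, 1.0, 1000)    # matches against a decoy database

        def estimated_fdr(threshold):
            n_target = np.sum(target_scores >= threshold)
            n_decoy = np.sum(decoy_scores >= threshold)
            return n_decoy / max(n_target, 1)        # decoys estimate false targets

        for thr in (2.0, 3.0, 4.0):
            print(f"score >= {thr}: estimated FDR = {estimated_fdr(thr):.3f}")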

  8. Diagnostic reference level of computed tomography (CT) in Japan.

    PubMed

    Fukushima, Yasuhiro; Tsushima, Yoshito; Takei, Hiroyuki; Taketomi-Takahashi, Ayako; Otake, Hidenori; Endo, Keigo

    2012-08-01

    Optimisation of computed tomography (CT) parameters is important in avoiding excess radiation exposure. The aim of this study is to establish diagnostic reference levels (DRLs) for CT in Japan by using the dose-length product (DLP). Datasheets were sent to all hospitals/clinics which had CT scanner(s) in Gunma prefecture. Data were obtained for all patients who underwent CT during a single month (June 2010), and the distributions of DLP were evaluated for eight anatomical regions and five patient age groups. The DRL was defined as the 25th and 75th percentiles of DLP. Datasheets were collected from 80 of 192 hospitals/clinics (26 090 patients). DLPs for paediatric head CT tended to be higher in Japan than the DRLs for paediatric head CT reported from the EU or Syria. Although this study was performed with limited samples, DLPs for adult patients were at comparable levels for all anatomical regions.
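
    The percentile computation behind a DRL is simple enough to sketch; the DLP values below are synthetic placeholders rather than the Gunma survey data, and the 25th and 75th percentiles follow the definition used in the study.

        # DRL as percentiles of a pooled DLP distribution (synthetic values).
        import numpy as np

        dlp_head_mGy_cm = np.random.default_rng(2).normal(loc=950, scale=200, size=500)
        drl_25, drl_75 = np.percentile(dlp_head_mGy_cm, [25, 75])
        print(f"head CT DRL range: {drl_25:.0f}-{drl_75:.0f} mGy.cm")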

  9. Computational Identification of Active Enhancers in Model Organisms

    PubMed Central

    Wang, Chengqi; Zhang, Michael Q.; Zhang, Zhihua

    2013-01-01

    As a class of cis-regulatory elements, enhancers were first identified as the genomic regions that are able to markedly increase the transcription of genes nearly 30 years ago. Enhancers can regulate gene expression in a cell-type specific and developmental stage specific manner. Although experimental technologies have been developed to identify enhancers genome-wide, the design principle of the regulatory elements and the way they rewire the transcriptional regulatory network tempo-spatially are far from clear. At present, developing predictive methods for enhancers, particularly for the cell-type specific activity of enhancers, is central to computational biology. In this review, we survey the current computational approaches for active enhancer prediction and discuss future directions. PMID:23685394

  10. Computational identification of Ciona intestinalis microRNAs.

    PubMed

    Keshavan, Raja; Virata, Michael; Keshavan, Anisha; Zeller, Robert W

    2010-02-01

    MicroRNAs (miRNAs) are conserved non-coding small RNAs with potent post-transcriptional gene regulatory functions. Recent computational approaches and sequencing of small RNAs had indicated the existence of about 80 Ciona intestinalis miRNAs, although it was not clear whether other miRNA genes were present in the genome. We undertook an alternative computational approach to look for Ciona miRNAs. Conserved non-coding sequences from the C. intestinalis genome were extracted and computationally folded to identify putative hairpin-like structures. After applying additional criteria, we obtained 458 miRNA candidates whose sequences were used to design a custom microarray. Over 100 of our predicted hairpins were identified in this array when probed with RNA from various Ciona stages. We also compared our predictions to recently deposited sequences of Ciona small RNAs and report that 170 of our predicted hairpins are represented in this data set. Altogether, about 250 of our 458 predicted miRNAs were represented in either our array data or the small-RNA sequence database. These results suggest that Ciona has a large number of genomically encoded miRNAs that play an important role in modulating gene activity in developing embryos and adults.

  11. MAX--An Interactive Computer Program for Teaching Identification of Clay Minerals by X-ray Diffraction.

    ERIC Educational Resources Information Center

    Kohut, Connie K.; And Others

    1993-01-01

    Discusses MAX, an interactive computer program for teaching identification of clay minerals based on standard x-ray diffraction characteristics. The program provides tutorial-type exercises for identification of 16 clay standards, self-evaluation exercises, diffractograms of 28 soil clay minerals, and identification of nonclay minerals. (MDH)

  12. Cloud identification using genetic algorithms and massively parallel computation

    NASA Technical Reports Server (NTRS)

    Buckles, Bill P.; Petry, Frederick E.

    1996-01-01

    As a Guest Computational Investigator under the NASA-administered component of the High Performance Computing and Communication Program, we implemented a massively parallel genetic algorithm on the MasPar SIMD computer. Experiments were conducted using Earth Science data in the domains of meteorology and oceanography. Results obtained in these domains are competitive with, and in most cases better than, similar problems solved using other methods. In the meteorological domain, we chose to identify clouds using AVHRR spectral data. Four cloud speciations were used, although most researchers settle for three. Results were remarkably consistent across all tests (91% accuracy). Refinements of this method may lead to more timely and complete information for Global Circulation Models (GCMs) that are prevalent in weather forecasting and global environment studies. In the oceanographic domain, we chose to identify ocean currents from a spectrometer having similar characteristics to AVHRR. Here the results were mixed (60% to 80% accuracy). If one is willing to run the experiment several times (say, 10), it is acceptable to claim the higher accuracy rating. This problem has never been successfully automated; therefore, these results are encouraging even though less impressive than the cloud experiment. Successful completion of an automated ocean current detection system would impact coastal fishing, naval tactics, and the study of micro-climates. Finally, we contributed to the basic knowledge of GA (genetic algorithm) behavior in parallel environments. We developed better knowledge of the use of subpopulations in the context of shared breeding pools and the migration of individuals. Rigorous experiments were conducted based on quantifiable performance criteria. While much of the work confirmed current wisdom, for the first time we were able to submit conclusive evidence. The software developed under this grant was placed in the public domain. An extensive user

  13. Computer method for identification of boiler transfer functions

    NASA Technical Reports Server (NTRS)

    Miles, J. H.

    1971-01-01

    An iterative computer method is described for identifying boiler transfer functions using frequency response data. An objective penalized performance measure and a nonlinear minimization technique are used to make the locus of points generated by a transfer function resemble the locus of points obtained from frequency response measurements. Different transfer functions can be tried until a satisfactory empirical transfer function for the system is found. To illustrate the method, some examples and some results from a study of a set of data consisting of measurements of the inlet impedance of a single-tube forced-flow boiler with inserts are given.
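
    The general fitting idea can be sketched with a modern nonlinear least-squares routine; the first-order candidate model and synthetic frequency-response data below are illustrative and do not correspond to the boiler data or the penalized performance measure of the report.

        # Fit a candidate transfer function to frequency-response data by
        # nonlinear least squares on the real and imaginary residuals.
        import numpy as np
        from scipy.optimize import least_squares

        def model(params, w):
            K, tau = params
            return K / (1.0 + 1j * tau * w)           # first-order candidate model

        w = np.logspace(-1, 2, 60)                    # measurement frequencies (rad/s)
        rng = np.random.default_rng(3)
        measured = model([2.0, 0.5], w) + 0.02 * (rng.standard_normal(w.size)
                                                  + 1j * rng.standard_normal(w.size))

        def residuals(params):
            err = model(params, w) - measured
            return np.concatenate([err.real, err.imag])

        fit = least_squares(residuals, x0=[1.0, 1.0])
        print("estimated K, tau:", fit.x)             # should be close to 2.0, 0.5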

  14. Novel computational identification of highly selective biomarkers of pollutant exposure.

    PubMed

    Weisman, David; Liu, Hong; Redfern, Jessica; Zhu, Liya; Colón-Carmona, Adán

    2011-06-15

    The use of in vivo biosensors to acquire environmental pollution data is an emerging and promising paradigm. One major challenge is the identification of highly specific biomarkers that selectively report exposure to a target pollutant while remaining quiescent under a diverse set of other, often unknown, environmental conditions. This study hypothesized that a microarray data mining approach can identify highly specific biomarkers and that their robustness generalizes to unforeseen environmental conditions. Starting with Arabidopsis thaliana microarray data measuring responses to a variety of treatments, the study used the top scoring pair (TSP) algorithm to identify mRNA transcripts that respond uniquely to phenanthrene, a model polycyclic aromatic hydrocarbon. Subsequent in silico analysis with a larger set of microarray data indicated that the biomarkers remained robust under new conditions. Finally, in vivo experiments were performed with unforeseen conditions that mimic phenanthrene stress, and the biomarkers were assayed using qRT-PCR. In these experiments, the biomarkers always responded positively to phenanthrene and never responded to the unforeseen conditions, thereby supporting the hypotheses. This data mining approach requires only microarray or next-generation RNA-seq data and, in principle, can be applied to arbitrary biomonitoring organisms and chemical exposures.
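
    The top scoring pair idea itself is compact: for every pair of genes, compare between classes the probability that one gene is expressed below the other, and keep the pair with the largest difference. The sketch below runs on synthetic expression data and is only a bare-bones illustration of the TSP score, not the study's full workflow.

        # Bare-bones top scoring pair on synthetic expression data.
        import numpy as np
        from itertools import combinations

        rng = np.random.default_rng(5)
        n_genes = 6
        control = rng.random((30, n_genes))           # rows = samples, cols = genes
        exposed = rng.random((30, n_genes))
        exposed[:, 2] += 1.0                          # gene 2 responds to the pollutant

        def top_scoring_pair(expr_a, expr_b):
            best_pair, best_score = None, -1.0
            for i, j in combinations(range(n_genes), 2):
                p_a = np.mean(expr_a[:, i] < expr_a[:, j])
                p_b = np.mean(expr_b[:, i] < expr_b[:, j])
                if abs(p_a - p_b) > best_score:
                    best_pair, best_score = (i, j), abs(p_a - p_b)
            return best_pair, best_score

        print(top_scoring_pair(control, exposed))     # expect a pair involving gene 2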

  15. A color and texture based multi-level fusion scheme for ethnicity identification

    NASA Astrophysics Data System (ADS)

    Du, Hongbo; Salah, Sheerko Hma; Ahmed, Hawkar O.

    2014-05-01

    Ethnicity identification from face images is of interest in many areas of application. Different from face recognition of individuals, ethnicity identification classifies faces according to the common features of a specific ethnic group. This paper presents a multi-level fusion scheme for ethnicity identification that combines texture features of local facial areas, computed with local binary patterns, with color features obtained from HSV binning. The scheme fuses the decisions of a k-nearest neighbor classifier and a support vector machine classifier into a final identification decision. We have tested the scheme on a collection of face images from a number of publicly available databases. The results demonstrate the effectiveness of the combined features and the improvement in identification accuracy achieved by the fusion scheme over identification using individual features and other state-of-the-art techniques.
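
    A rough sketch of the two feature families and the decision-level fusion, assuming synthetic stand-in images, scikit-image for LBP and HSV conversion, and probability averaging between a k-NN and an SVM classifier; the real system's local face regions, bin counts, and fusion rule are not reproduced.

        # LBP texture + HSV color histograms, fused by averaging classifier
        # probabilities (synthetic stand-in images, two hypothetical groups).
        import numpy as np
        from skimage.color import rgb2gray, rgb2hsv
        from skimage.feature import local_binary_pattern
        from sklearn.neighbors import KNeighborsClassifier
        from sklearn.svm import SVC

        def features(img_rgb):
            gray = (rgb2gray(img_rgb) * 255).astype(np.uint8)
            lbp = local_binary_pattern(gray, P=8, R=1, method="uniform")
            tex_hist, _ = np.histogram(lbp, bins=10, range=(0, 10), density=True)
            hue = rgb2hsv(img_rgb)[..., 0]
            col_hist, _ = np.histogram(hue, bins=8, range=(0, 1), density=True)
            return np.concatenate([tex_hist, col_hist])

        rng = np.random.default_rng(6)
        images = rng.random((40, 32, 32, 3))
        labels = rng.integers(0, 2, 40)
        X = np.array([features(im) for im in images])

        knn = KNeighborsClassifier(n_neighbors=3).fit(X, labels)
        svm = SVC(probability=True).fit(X, labels)
        fused = (knn.predict_proba(X) + svm.predict_proba(X)) / 2.0
        print("fused decisions:", fused.argmax(axis=1)[:10])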

  16. Australian diagnostic reference levels for multi detector computed tomography.

    PubMed

    Hayton, Anna; Wallace, Anthony; Marks, Paul; Edmonds, Keith; Tingey, David; Johnston, Peter

    2013-03-01

    The Australian Radiation Protection and Nuclear Safety Agency (ARPANSA) is undertaking web-based surveys to obtain data to establish national diagnostic reference levels (DRLs) for diagnostic imaging. The first set of DRLs to be established is for multi-detector computed tomography (MDCT). The survey samples MDCT dosimetry metrics, dose-length product (DLP, mGy.cm) and volume computed tomography dose index (CTDIvol, mGy), for six common protocols/habitus (Head, Neck, Chest, AbdoPelvis, ChestAbdoPelvis and Lumbar Spine) from individual radiology clinics and platforms. A practice reference level (PRL) for a given platform and protocol is calculated from a compliant survey containing data collected from at least ten patients. The PRL is defined as the median of the DLP/CTDIvol values for a single compliant survey. Australian National DRLs are defined as the 75th percentile of the distribution of the PRLs for each protocol and age group. Australian National DRLs for adult MDCT have been determined in terms of DLP and CTDIvol. In terms of DLP, the national DRLs are 1,000 mGy cm, 600 mGy cm, 450 mGy cm, 700 mGy cm, 1,200 mGy cm, and 900 mGy cm for the Head, Neck, Chest, AbdoPelvis, ChestAbdoPelvis and Lumbar Spine protocols, respectively. Average dose values obtained from the European survey Dose Datamed I reveal Australian doses to be higher by comparison for four of the six protocols. The survey is ongoing, allowing practices to optimise dose delivery as well as allowing the periodic update of DRLs to reflect changes in technology and technique.

  17. A computational approach to studying ageing at the individual level

    PubMed Central

    Mourão, Márcio A.; Schnell, Santiago; Pletcher, Scott D.

    2016-01-01

    The ageing process is actively regulated throughout an organism's life, but studying the rate of ageing in individuals is difficult with conventional methods. Consequently, ageing studies typically make biological inference based on population mortality rates, which often do not accurately reflect the probabilities of death at the individual level. To study the relationship between individual and population mortality rates, we integrated in vivo switch experiments with in silico stochastic simulations to elucidate how carefully designed experiments allow key aspects of individual ageing to be deduced from group mortality measurements. As our case study, we used the recent report demonstrating that pheromones of the opposite sex decrease lifespan in Drosophila melanogaster by reversibly increasing population mortality rates. We showed that the population mortality reversal following pheromone removal was almost surely occurring in individuals, albeit more slowly than suggested by population measures. Furthermore, heterogeneity among individuals due to the inherent stochasticity of behavioural interactions skewed population mortality rates in middle-age away from the individual-level trajectories of which they are comprised. This article exemplifies how computational models function as important predictive tools for designing wet-laboratory experiments to use population mortality rates to understand how genetic and environmental manipulations affect ageing in the individual. PMID:26865300

  18. Computationally Inexpensive Identification of Non-Informative Model Parameters

    NASA Astrophysics Data System (ADS)

    Mai, J.; Cuntz, M.; Kumar, R.; Zink, M.; Samaniego, L. E.; Schaefer, D.; Thober, S.; Rakovec, O.; Musuuza, J. L.; Craven, J. R.; Spieler, D.; Schrön, M.; Prykhodko, V.; Dalmasso, G.; Langenberg, B.; Attinger, S.

    2014-12-01

    Sensitivity analysis is used, for example, to identify parameters which induce the largest variability in model output and are thus informative during calibration. Variance-based techniques are employed for this purpose, but they unfortunately require a large number of model evaluations and are thus impractical for complex environmental models. We therefore developed a computationally inexpensive screening method, based on Elementary Effects, that automatically separates informative from non-informative model parameters. The method was tested using the mesoscale hydrologic model (mHM) with 52 parameters. The model was applied in three European catchments with different hydrological characteristics, i.e. Neckar (Germany), Sava (Slovenia), and Guadalquivir (Spain). The method identified the same informative parameters as the standard Sobol method but with less than 1% of the model runs. In Germany and Slovenia, 22 of 52 parameters were informative, mostly in the formulations of evapotranspiration, interflow and percolation. In Spain, 19 of 52 parameters were informative, with an increased importance of soil parameters. We showed further that Sobol' indices calculated for the subset of informative parameters are practically the same as the Sobol' indices before the screening, while the number of model runs is reduced by more than 50%. The model mHM was then calibrated twice in the three test catchments: first, all 52 parameters were taken into account; then, only the informative parameters were calibrated while all others were kept fixed. The Nash-Sutcliffe efficiencies were 0.87 and 0.83 in Germany, 0.89 and 0.88 in Slovenia, and 0.86 and 0.85 in Spain, respectively. This minor loss of at most 4% in model performance comes along with a substantial decrease of at least 65% in model evaluations. In summary, we propose an efficient screening method to identify non-informative model parameters that can be discarded during further applications. We have shown that sensitivity
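
    A compact elementary-effects sketch conveys the idea: perturb one parameter at a time around random base points and use the mean absolute effect to separate informative from non-informative parameters. The toy model below stands in for an expensive hydrologic model such as mHM, and the thresholding rule is an arbitrary illustration rather than the authors' screening criterion.

        # One-at-a-time elementary effects around random base points.
        import numpy as np

        def toy_model(p):                             # only p[0] and p[2] matter
            return 3.0 * p[0] + np.sin(2.0 * p[2])

        n_params, n_points, delta = 5, 20, 0.1
        rng = np.random.default_rng(7)
        effects = np.zeros((n_points, n_params))

        for k in range(n_points):
            base = rng.random(n_params)
            f0 = toy_model(base)
            for i in range(n_params):
                bumped = base.copy()
                bumped[i] += delta
                effects[k, i] = abs(toy_model(bumped) - f0) / delta

        mu_star = effects.mean(axis=0)                # Morris-type mu* per parameter
        print("mean absolute elementary effects:", np.round(mu_star, 3))
        print("informative parameters:", np.where(mu_star > 0.1 * mu_star.max())[0])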

  19. Computational health economics for identification of unprofitable health care enrollees.

    PubMed

    Rose, Sherri; Bergquist, Savannah L; Layton, Timothy J

    2017-03-22

    Health insurers may attempt to design their health plans to attract profitable enrollees while deterring unprofitable ones. Such insurers would not be delivering socially efficient levels of care by providing health plans that maximize societal benefit, but rather intentionally distorting plan benefits to avoid high-cost enrollees, potentially to the detriment of health and efficiency. In this work, we focus on a specific component of health plan design at risk for health insurer distortion in the Health Insurance Marketplaces: the prescription drug formulary. We introduce an ensembled machine learning function to determine whether drug utilization variables are predictive of a new measure of enrollee unprofitability we derive, and thus vulnerable to distortions by insurers. Our implementation also contains a unique application-specific variable selection tool. This study demonstrates that super learning is effective in extracting the relevant signal for this prediction problem, and that a small number of drug variables can be used to identify unprofitable enrollees. The results are both encouraging and concerning. While risk adjustment appears to have been reasonably successful at weakening the relationship between therapeutic-class-specific drug utilization and unprofitability, some classes remain predictive of insurer losses. The vulnerable enrollees whose prescription drug regimens include drugs in these classes may need special protection from regulators in health insurance market design.

  20. Frequency domain transfer function identification using the computer program SYSFIT

    SciTech Connect

    Trudnowski, D.J.

    1992-12-01

    Because the primary application of SYSFIT for BPA involves studying power system dynamics, this investigation was geared toward simulating the effects that might be encountered in studying electromechanical oscillations in power systems. Although the intended focus of this work is power system oscillations, the studies are sufficiently generic that the results can be applied to many types of oscillatory systems with closely spaced modes. In general, there are two possible ways of solving the optimization problem. One is to use a least-squares optimization function and to write the system in such a form that the problem becomes one of linear least squares; the solution can then be obtained using a standard least-squares technique. The other method involves using a search method to obtain the optimal model. This method allows considerably more freedom in forming the optimization function and model, but it requires an initial guess of the system parameters. SYSFIT employs this second approach. Detailed investigations were conducted into three main areas: (1) fitting to exact frequency response data of a linear system; (2) fitting to the discrete Fourier transformation of noisy data; and (3) fitting to multi-path systems. The first area consisted of investigating the effects of alternative optimization cost function options; different optimization search methods; incorrect model order; missing response data; closely spaced poles; and closely spaced pole-zero pairs. Within the second area, different noise colorations and levels were studied. In the third area, methods were investigated for improving fitting results by incorporating more than one system path. The following is a list of guidelines and properties developed from the study for fitting a transfer function to the frequency response of a system using optimization search methods.

  1. Level-set surface segmentation and registration for computing intrasurgical deformations

    NASA Astrophysics Data System (ADS)

    Audette, Michel A.; Peters, Terence M.

    1999-05-01

    We propose a method for estimating intrasurgical brain shift for image-guided surgery. This method consists of five stages: the identification of relevant anatomical surfaces within the MRI/CT volume, range-sensing of the skin and cortex in the OR, rigid registration of the skin range image with its MRI/CT homologue, non-rigid motion tracking over time of cortical range images, and lastly, interpolation of this surface displacement information over the whole brain volume via a realistically valued finite element model of the head. This paper focuses on the anatomical surface identification and cortical range surface tracking problems. The surface identification scheme implements a recent algorithm which embeds 3D surface segmentation as the level set of a 4D moving front. A by-product of this stage is a Euclidean distance and closest-point map which is later exploited to speed up the rigid and non-rigid surface registration. The range sensor uses both laser-based triangulation and defocusing techniques to produce a 2D range profile, and is linearly swept across the skin or cortical surface to produce a 3D range image. The surface registration technique is of the iterative closest point type, where each iteration benefits from looking up, rather than searching for, explicit closest-point pairs. These explicit point pairs in turn are used in conjunction with a closed-form SVD-based rigid transformation computation and with fast recursive splines to make each rigid and non-rigid registration iteration essentially instantaneous. Our method is validated with a novel deformable brain-shaped phantom made of polyvinyl alcohol cryogel.
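
    The rigid registration stage can be sketched as a plain iterative-closest-point loop with KD-tree lookups and a closed-form SVD (Kabsch) solution for each step, as below; the point clouds are synthetic stand-ins for the range images, and the non-rigid spline stage is not shown.

        # Rigid ICP: KD-tree closest points + closed-form SVD (Kabsch) updates.
        import numpy as np
        from scipy.spatial import cKDTree

        def best_rigid_transform(P, Q):
            """Closed-form rotation R and translation t mapping points P onto Q."""
            cp, cq = P.mean(axis=0), Q.mean(axis=0)
            U, _, Vt = np.linalg.svd((P - cp).T @ (Q - cq))
            R = Vt.T @ U.T
            if np.linalg.det(R) < 0:                  # guard against reflections
                Vt[-1] *= -1
                R = Vt.T @ U.T
            return R, cq - R @ cp

        rng = np.random.default_rng(8)
        target = rng.random((200, 3))                 # stand-in for the MRI/CT surface
        angle = 0.3
        R_true = np.array([[np.cos(angle), -np.sin(angle), 0.0],
                           [np.sin(angle),  np.cos(angle), 0.0],
                           [0.0, 0.0, 1.0]])
        moving = target @ R_true.T + np.array([0.2, -0.1, 0.05])   # "range image"

        tree = cKDTree(target)
        for _ in range(20):                           # ICP iterations
            _, idx = tree.query(moving)               # explicit closest-point pairs
            R, t = best_rigid_transform(moving, target[idx])
            moving = moving @ R.T + t

        _, idx = tree.query(moving)
        print("mean residual after alignment:",
              np.linalg.norm(moving - target[idx], axis=1).mean())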

  2. Identification of Restrictive Computer and Software Variables among Preoperational Users of a Computer Learning Center.

    ERIC Educational Resources Information Center

    Kozubal, Diane K.

    While manufacturers have produced a wide variety of software said to be easy for even the youngest child to use, there are conflicting perspectives on computer issues such as ease of use, influence on meeting educational objectives, effects on procedural learning, and rationale for use with young children. Addressing these concerns, this practicum…

  3. Blind source computer device identification from recorded VoIP calls for forensic investigation.

    PubMed

    Jahanirad, Mehdi; Anuar, Nor Badrul; Wahab, Ainuddin Wahid Abdul

    2017-03-01

    VoIP services provide fertile ground for criminal activity; thus, identifying the transmitting computer device from a recorded VoIP call may help the forensic investigator reveal useful information. It can also help prove the authenticity of a call recording submitted to the court as evidence. This paper extends a previous study on the use of recorded VoIP calls for blind source computer device identification. Although the initial results were promising, a theoretical explanation for them is yet to be found. The study suggested computing the entropy of mel-frequency cepstrum coefficients (entropy-MFCC) from near-silent segments as an intrinsic feature set that captures the device response function due to the tolerances in the electronic components of individual computer devices. By applying the supervised learning techniques of naïve Bayes, linear logistic regression, neural networks and support vector machines to the entropy-MFCC features, state-of-the-art identification accuracy of nearly 99.9% was achieved on different sets of computer devices for both call recording and microphone recording scenarios. Furthermore, unsupervised learning techniques, including simple k-means, expectation-maximization and density-based spatial clustering of applications with noise (DBSCAN), provided promising results for the call recording dataset by assigning the majority of instances to their correct clusters.
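
    A hedged sketch of an entropy-of-MFCC feature vector for one recording is shown below, in the spirit of the entropy-MFCC features described; the audio is synthetic noise, and the exact frame selection, parameters, and classifiers of the study are not reproduced.

        # Entropy of each MFCC coefficient across frames of a (synthetic) recording.
        import numpy as np
        import librosa
        from scipy.stats import entropy

        sr = 16000
        near_silent = 0.001 * np.random.default_rng(9).standard_normal(sr * 2)
        mfcc = librosa.feature.mfcc(y=near_silent.astype(np.float32), sr=sr, n_mfcc=13)

        def coefficient_entropy(row, bins=32):
            hist, _ = np.histogram(row, bins=bins, density=True)
            hist = hist[hist > 0]
            return entropy(hist)                      # Shannon entropy of the histogram

        feature_vector = np.array([coefficient_entropy(row) for row in mfcc])
        print("entropy-MFCC features:", np.round(feature_vector, 2))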

  4. Parallel Computers for Region-Level Image Processing.

    DTIC Science & Technology

    1980-11-01

    34corner" proces- sor in each region, we can use it as the region’s represen- tative in the adjacency-graph- structured computer ; but we need to give it the...graph of bounded degree (% 5). We now briefly describe how to construct a quadtree- structured computer corresponding to (the quadtree of) a given

  5. Identifying the Computer Competency Levels of Recreation Department Undergraduates

    ERIC Educational Resources Information Center

    Zorba, Erdal

    2011-01-01

    Computer-based and web-based applications are major instructional tools for increasing undergraduates' motivation at school. In the recreation field, the use of computer- and internet-based recreational applications has become more prevalent as a way to present visual and interactive entertainment activities. Recreation department undergraduates…

  6. Quantum One Go Computation and the Physical Computation Level of Biological Information Processing

    NASA Astrophysics Data System (ADS)

    Castagnoli, Giuseppe

    2010-02-01

    By extending the representation of quantum algorithms to problem-solution interdependence, the unitary evolution part of the algorithm entangles the register containing the problem with the register containing the solution. Entanglement becomes correlation, or mutual causality, between the two measurement outcomes: the string of bits encoding the problem and that encoding the solution. In former work, we showed that this is equivalent to the algorithm knowing in advance 50% of the bits of the solution it will find in the future, which explains the quantum speed up. Mutual causality between bits of information is also equivalent to seeing quantum measurement as a many body interaction between the parts of a perfect classical machine whose normalized coordinates represent the qubit populations. This “hidden machine” represents the problem to be solved. The many body interaction (measurement) satisfies all the constraints of a nonlinear Boolean network “together and at the same time”—in one go—thus producing the solution. Quantum one go computation can formalize the physical computation level of the theories that place consciousness in quantum measurement. In fact, in visual perception, we see, thus recognize, thus process, a significant amount of information “together and at the same time”. Identifying the fundamental mechanism of consciousness with that of the quantum speed up gives quantum consciousness, with respect to classical consciousness, a potentially enormous evolutionary advantage.

  7. Computational Identification of Key Regulators in Two Different Colorectal Cancer Cell Lines

    PubMed Central

    Wlochowitz, Darius; Haubrock, Martin; Arackal, Jetcy; Bleckmann, Annalen; Wolff, Alexander; Beißbarth, Tim; Wingender, Edgar; Gültas, Mehmet

    2016-01-01

    Transcription factors (TFs) are gene regulatory proteins that are essential for effective regulation of the transcriptional machinery. Today, it is known that their expression plays an important role in several types of cancer. Computational identification of the key players in specific cancer cell lines is still an open challenge in cancer research. In this study, we present a systematic approach which combines two colorectal cancer (CRC) cell lines, namely 1638N-T1 and CMT-93, with well-established computational methods in order to compare these cell lines at the level of transcriptional regulation as well as at the pathway level, i.e., the cancer cell-intrinsic pathway repertoire. For this purpose, we first applied the Trinity platform to detect signature genes and then applied analyses from the geneXplain platform to these genes to detect upstream transcriptional regulators and their regulatory networks. We created a CRC-specific position weight matrix (PWM) library based on the TRANSFAC database (release 2014.1) to minimize the rate of false predictions in the promoter analyses. Using our proposed workflow, we specifically focused on revealing the similarities and differences in transcriptional regulation between the two CRC cell lines, and we report a number of well-known, cancer-associated TFs with significantly enriched binding sites in the promoter regions of the signature genes. We show that, although the signature genes of the two cell lines show no overlap, they may still be regulated by common TFs in CRC. Based on our findings, we suggest that canonical Wnt signaling is activated in 1638N-T1 but inhibited in CMT-93 through cross-talk of Wnt signaling with the VDR signaling pathway and/or LXR-related pathways. Furthermore, our findings indicate the presence of several master regulators, such as MLK3 and Mapk1 (ERK2), which might be important in cell proliferation, migration, and invasion of 1638N-T1 and CMT-93, respectively. Taken together, we provide
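
    The basic operation behind PWM-based promoter analysis, scanning a sequence with a log-odds matrix, can be sketched briefly; the 4 x 4 matrix below is a made-up toy motif, not a TRANSFAC entry, and real workflows add background models and significance thresholds.

        # Scan a promoter with a toy log-odds position weight matrix.
        import numpy as np

        bases = {"A": 0, "C": 1, "G": 2, "T": 3}
        # Rows A, C, G, T; columns are motif positions (invented values).
        pwm = np.array([[ 1.2, -0.5, -1.0,  0.8],
                        [-0.7,  1.0, -0.5, -1.2],
                        [-0.9, -0.8,  1.5, -0.6],
                        [ 0.1, -0.4, -0.9,  0.9]])

        promoter = "TTACGTAGCATTACGT"
        width = pwm.shape[1]
        scores = [sum(pwm[bases[promoter[i + k]], k] for k in range(width))
                  for i in range(len(promoter) - width + 1)]
        best = int(np.argmax(scores))
        print(f"best hit at position {best}: {promoter[best:best + width]} "
              f"(score {scores[best]:.2f})")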

  8. Automatic Measurement of Water Levels by Using Image Identification Method in Open Channel

    NASA Astrophysics Data System (ADS)

    Chung Yang, Han; Xue Yang, Jia

    2014-05-01

    Water level data are indispensable to hydrology research and provide important information for hydraulic engineering and the overall utilization of water resources. Water level information can be transmitted over a network to the management office so that staff know whether the river level has exceeded the warning line. Existing water level measurement methods can only present water levels as numbers, without any images, so the data lack a sense of the real scene; images of a rising or overflowing river cannot be obtained by these methods at the same time. Therefore, this research employs a new, improved method for water level measurement. A video surveillance system records images on site; a snapshot of the water surface is captured, pre-processed, and compared with altitude reference values to obtain a water level altitude. As technology advances, the scope of application of image identification keeps widening, and this research attempts to use image identification technology to analyze water levels automatically. The image observation method used in this research is a non-contact water level gauge, but it is quite different from other such gauges: it is inexpensive, and the facilities can be set up beside a river embankment or near houses, so the impact of external factors is significantly reduced and a real scene picture can be transmitted wirelessly. According to a dynamic water flow test conducted in an indoor experimental channel, the error of water level identification was less than 2% in all cases, which means the image identification approach can produce identification results at different water levels. This new measurement method can offer instant river level figures and on-site video so that a
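
    A toy version of the core image-identification step is sketched below: locate the waterline row in a grayscale frame as the sharpest vertical brightness change and convert that row index to a level using reference altitudes. The synthetic frame and calibration constants are placeholders for the on-site camera setup.

        # Find the waterline as the sharpest vertical brightness change, then
        # convert the row index to a level with calibration constants.
        import numpy as np

        height, width = 240, 80
        frame = np.full((height, width), 200, dtype=np.uint8)   # bright gauge/bank
        frame[150:, :] = 60                                     # darker water below row 150

        row_means = frame.mean(axis=1)
        waterline_row = int(np.argmax(np.abs(np.diff(row_means))))

        top_altitude_m, metres_per_pixel = 12.00, 0.01          # camera calibration
        water_level_m = top_altitude_m - waterline_row * metres_per_pixel
        print(f"waterline at row {waterline_row}, estimated level {water_level_m:.2f} m")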

  9. Identification of areas with high levels of untreated dental caries.

    PubMed

    Ellwood, R P; O'Mullane, D M

    1996-02-01

    In order to examine the geographical variation of dental health within 10 county districts in North Wales, 3538 children were examined. The associations between three demographic indicators, based on the 1981 OPCS census, and dental health outcomes were assessed for electoral wards within the county districts. The Townsend and Jarman indices were the first two indicators employed, and the third was based on a mathematical model representing the variation in the mean number of untreated decayed surfaces per person for the wards. This model was developed using the children examined in the five most westerly county districts. Using the data derived from the five most easterly county districts, the three indicators were assessed. All three showed strong correlations (r ≥ 0.88) with dental health. These results indicate that measures of dental health based on large administrative units may obscure variation within them. It is concluded that geographical methods of this type may be useful for targeting dental resources at small areas with high levels of need.

  10. Evolutionary Computation for the Identification of Emergent Behavior in Autonomous Systems

    NASA Technical Reports Server (NTRS)

    Terrile, Richard J.; Guillaume, Alexandre

    2009-01-01

    Over the past several years the Center for Evolutionary Computation and Automated Design at the Jet Propulsion Laboratory has developed a technique based on Evolutionary Computational Methods (ECM) that allows for the automated optimization of complex computationally modeled systems. An important application of this technique is for the identification of emergent behaviors in autonomous systems. Mobility platforms such as rovers or airborne vehicles are now being designed with autonomous mission controllers that can find trajectories over a solution space that is larger than can reasonably be tested. It is critical to identify control behaviors that are not predicted and can have surprising results (both good and bad). These emergent behaviors need to be identified, characterized and either incorporated into or isolated from the acceptable range of control characteristics. We use cluster analysis of automatically retrieved solutions to identify isolated populations of solutions with divergent behaviors.

  11. Trusted Computing Exemplar: Low-Level Design Document Standards

    DTIC Science & Technology

    2014-12-12

    ... for writing low-level design documents. Low-level design documents provide a detailed description of one or more modules. The level of detail should ... further this goal, the NPS CAG and NPS CISR ask that any derivative products, code, writings, and/or other derivative materials include an attribution ... functionality, whether accidental or intentional. This document provides the standard format for writing low-level design documents. Low-level design ...

  12. Reliability of frontal sinus by cone beam-computed tomography (CBCT) for individual identification.

    PubMed

    Cossellu, Gianguido; De Luca, Stefano; Biagi, Roberto; Farronato, Giampietro; Cingolani, Mariano; Ferrante, Luigi; Cameriere, Roberto

    2015-12-01

    Analysis of the frontal sinus is an important tool in personal identification. Cone beam-computed tomography (CBCT) is also progressively replacing conventional radiography and multi-slice computed tomography (MSCT) in human identification. The aim of this study is to develop a reproducible technique and measurements from 3D reconstructions obtained with CBCT for use in human identification. CBCT scans from 150 patients (91 female, 59 male), aged between 15 and 78 years, were analysed with the software program MIMICS 11.11 (Materialise N.V., Leuven, Belgium). Corresponding 3D volumes were generated, and the maximal dimensions along the three directions (x, y, z), X_M, Y_M, Z_M (in mm), the total volume, V_t (in mm³), and the total surface area, S_t (in mm²), were calculated. Correlation analysis showed that sinus surface areas were strongly correlated with sinus volumes (r = 0.976). Frontal sinuses were separate in 21 subjects (14%), fused in 67 (44.6%) and found on only one side (unilateral) in 9 (6%). A Prominent Middle of Fused Sinus (PMS) was found in 53 subjects (35.3%). The intra-observer (0.963-0.999) and inter-observer (0.973-0.999) reliability values showed great agreement and substantial homogeneity of evaluation.

  13. Computer-Assisted Photo Identification Outperforms Visible Implant Elastomers in an Endangered Salamander, Eurycea tonkawae

    PubMed Central

    Bendik, Nathan F.; Morrison, Thomas A.; Gluesenkamp, Andrew G.; Sanders, Mark S.; O’Donnell, Lisa J.

    2013-01-01

    Despite recognition that nearly one-third of the 6300 amphibian species are threatened with extinction, our understanding of the general ecology and population status of many amphibians is relatively poor. A widely-used method for monitoring amphibians involves injecting captured individuals with unique combinations of colored visible implant elastomer (VIE). We compared VIE identification to a less-invasive method – computer-assisted photographic identification (photoID) – in endangered Jollyville Plateau salamanders (Eurycea tonkawae), a species with a known range limited to eight stream drainages in central Texas. We based photoID on the unique pigmentation patterns on the dorsal head region of 1215 individual salamanders using identification software Wild-ID. We compared the performance of photoID methods to VIEs using both ‘high-quality’ and ‘low-quality’ images, which were taken using two different camera types and technologies. For high-quality images, the photoID method had a false rejection rate of 0.76% compared to 1.90% for VIEs. Using a comparable dataset of lower-quality images, the false rejection rate was much higher (15.9%). Photo matching scores were negatively correlated with time between captures, suggesting that evolving natural marks could increase misidentification rates in longer term capture-recapture studies. Our study demonstrates the utility of large-scale capture-recapture using photo identification methods for Eurycea and other species with stable natural marks that can be reliably photographed. PMID:23555669

  14. Application of superimposition-based personal identification using skull computed tomography images.

    PubMed

    Ishii, Masuko; Yayama, Kazuhiro; Motani, Hisako; Sakuma, Ayaka; Yasjima, Daisuke; Hayakawa, Mutumi; Yamamoto, Seiji; Iwase, Hirotaro

    2011-07-01

    Superimposition has been applied to skulls of unidentified skeletonized corpses as a personal identification method. The current method involves layering of a skull and a facial image of a suspected person and thus requires a real skeletonized skull. In this study, we scanned skulls of skeletonized corpses by computed tomography (CT), reconstructed three-dimensional (3D) images of skulls from the CT images, and superimposed the 3D images with facial images of the corresponding persons taken in their lives. Superimposition using 3D-reconstructed skull images demonstrated, as did superimposition using real skulls, an adequate degree of morphological consistency between the 3D-reconstructed skulls and persons in the facial images. Three-dimensional skull images reconstructed from CT images can be saved as data files and the use of these images in superimposition is effective for personal identification of unidentified bodies.

  15. Computational Acoustic Beamforming for Noise Source Identification for Small Wind Turbines

    PubMed Central

    Lien, Fue-Sang

    2017-01-01

    This paper develops a computational acoustic beamforming (CAB) methodology for identification of sources of small wind turbine noise. This methodology is validated using the case of the NACA 0012 airfoil trailing edge noise. For this validation case, the predicted acoustic maps were in excellent conformance with the results of the measurements obtained from the acoustic beamforming experiment. Following this validation study, the CAB methodology was applied to the identification of noise sources generated by a commercial small wind turbine. The simulated acoustic maps revealed that the blade tower interaction and the wind turbine nacelle were the two primary mechanisms for sound generation for this small wind turbine at frequencies between 100 and 630 Hz. PMID:28378012

  16. Computational Acoustic Beamforming for Noise Source Identification for Small Wind Turbines.

    PubMed

    Ma, Ping; Lien, Fue-Sang; Yee, Eugene

    2017-01-01

    This paper develops a computational acoustic beamforming (CAB) methodology for identification of sources of small wind turbine noise. This methodology is validated using the case of the NACA 0012 airfoil trailing edge noise. For this validation case, the predicted acoustic maps were in excellent conformance with the results of the measurements obtained from the acoustic beamforming experiment. Following this validation study, the CAB methodology was applied to the identification of noise sources generated by a commercial small wind turbine. The simulated acoustic maps revealed that the blade tower interaction and the wind turbine nacelle were the two primary mechanisms for sound generation for this small wind turbine at frequencies between 100 and 630 Hz.
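
    As an illustration of the delay-and-sum idea underlying acoustic beamforming, the sketch below forms an acoustic map from a cross-spectral matrix of microphone signals over a scan grid. It is a generic frequency-domain beamformer, not the authors' CAB implementation; the array geometry, cross-spectral matrix and speed of sound are assumed inputs.

    ```python
    import numpy as np

    def delay_and_sum_map(csm, mic_xyz, grid_xyz, freq, c=343.0):
        """Frequency-domain delay-and-sum beamforming.

        csm      : (M, M) cross-spectral matrix of the microphone signals at `freq`
        mic_xyz  : (M, 3) microphone positions
        grid_xyz : (G, 3) scan-grid points
        Returns an acoustic map of length G (one power value per grid point)."""
        k = 2.0 * np.pi * freq / c
        # distances from every grid point to every microphone: (G, M)
        r = np.linalg.norm(grid_xyz[:, None, :] - mic_xyz[None, :, :], axis=-1)
        # steering vectors compensating for propagation delay and spherical spreading
        e = np.exp(-1j * k * r) / r
        e /= np.linalg.norm(e, axis=1, keepdims=True)
        # beamformer output power at each grid point: e^H C e
        return np.real(np.einsum('gm,mn,gn->g', e.conj(), csm, e))
    ```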

  17. Three Computer Programs for Use in Introductory Level Physics Laboratories.

    ERIC Educational Resources Information Center

    Kagan, David T.

    1984-01-01

    Describes three computer programs which operate on Apple II+ microcomputers: (1) a menu-driven graph drawing program; (2) a simulation of the Millikan oil drop experiment; and (3) a program used to study the half-life of silver. (Instructions for obtaining the programs from the author are included.) (JN)

  18. Efficient Computation of the Topology of Level Sets

    SciTech Connect

    Pascucci, V; Cole-McLaughlin, K

    2002-07-19

    This paper introduces two efficient algorithms that compute the Contour Tree of a 3D scalar field F and its augmented version with the Betti numbers of each isosurface. The Contour Tree is a fundamental data structure in scientific visualization that is used to pre-process the domain mesh to allow optimal computation of isosurfaces with minimal storage overhead. The Contour Tree can also be used to build user interfaces reporting the complete topological characterization of a scalar field, as shown in Figure 1. In the first part of the paper we present a new scheme that augments the Contour Tree with the Betti numbers of each isocontour in linear time. We show how to extend the scheme introduced in [3] with the Betti number computation without increasing its complexity. Thus we improve the time complexity of our previous approach [8] from O(m log m) to O(n log n + m), where m is the number of tetrahedra and n is the number of vertices in the domain of F. In the second part of the paper we introduce a new divide-and-conquer algorithm that computes the Augmented Contour Tree for scalar fields defined on rectilinear grids. The central part of the scheme computes the output contour tree by merging two intermediate contour trees and is independent of the interpolant. In this way we confine any knowledge regarding a specific interpolant to an oracle that computes the tree for a single cell. We have implemented this oracle for the trilinear interpolant and plan to replace it with higher-order interpolants when needed. The complexity of the scheme is O(n + t log n), where t is the number of critical points of F. This allows, for the first time, the Contour Tree to be computed in linear time in many practical cases, when t = O(n^(1-ε)). We report the running times for a parallel implementation of our algorithm, showing good scalability with the number of processors.
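
    The contour tree is usually assembled from simpler join and split (merge) trees. As a much-simplified illustration of that building block (not the algorithm of the paper), the sketch below computes an augmented merge tree of a scalar field on an arbitrary graph with a union-find sweep over the vertices in increasing order of value.

    ```python
    def merge_tree(values, edges):
        """Sketch of an augmented merge (join) tree of a scalar field on a graph.

        values : sequence of scalar values, one per vertex
        edges  : iterable of (u, v) vertex-index pairs
        Returns a list of tree arcs (lower_vertex, upper_vertex)."""
        n = len(values)
        adj = [[] for _ in range(n)]
        for u, v in edges:
            adj[u].append(v)
            adj[v].append(u)

        parent = list(range(n))     # union-find over sublevel-set components
        highest = list(range(n))    # highest processed vertex of each component

        def find(x):
            while parent[x] != x:
                parent[x] = parent[parent[x]]   # path halving
                x = parent[x]
            return x

        arcs = []
        processed = [False] * n
        for v in sorted(range(n), key=lambda i: values[i]):   # sweep low to high
            processed[v] = True
            for u in adj[v]:
                if not processed[u]:
                    continue
                ru, rv = find(u), find(v)
                if ru != rv:
                    # the component rooted at ru is absorbed at vertex v
                    arcs.append((highest[ru], v))
                    parent[ru] = rv
                highest[find(v)] = v
            # minima simply start a new component (no arc emitted)
        return arcs
    ```

    For a rectilinear grid, `edges` would be the grid's cell edges; merging two such intermediate trees into the full contour tree is exactly the step the paper delegates to its divide-and-conquer scheme.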

  19. Automatic de-identification of electronic medical records using token-level and character-level conditional random fields.

    PubMed

    Liu, Zengjian; Chen, Yangxin; Tang, Buzhou; Wang, Xiaolong; Chen, Qingcai; Li, Haodi; Wang, Jingfeng; Deng, Qiwen; Zhu, Suisong

    2015-12-01

    De-identification, identifying and removing all protected health information (PHI) present in clinical data including electronic medical records (EMRs), is a critical step in making clinical data publicly available. The 2014 i2b2 (Center of Informatics for Integrating Biology and Bedside) clinical natural language processing (NLP) challenge set up a track for de-identification (track 1). In this study, we propose a hybrid system based on both machine learning and rule-based approaches for the de-identification track. In our system, PHI instances are first identified by two conditional random fields (CRFs), one token-level and one character-level, and a rule-based classifier, and their outputs are then merged by a set of rules. Experiments conducted on the i2b2 corpus show that our system submitted for the challenge achieves the highest micro F-scores of 94.64%, 91.24% and 91.63% under the "token", "strict" and "relaxed" criteria respectively, which is among the top-ranked systems of the 2014 i2b2 challenge. After integrating some refined localization dictionaries, our system is further improved, with F-scores of 94.83%, 91.57% and 91.95% under the "token", "strict" and "relaxed" criteria respectively.
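
    The hybrid design described above combines statistical taggers with hand-written rules. The sketch below is only an illustration of that pattern: token-level features of the kind typically fed to a linear-chain CRF, a couple of hypothetical regex rules for easy PHI types, and a merge step that prefers the CRF output. It is not the authors' feature set or rule base.

    ```python
    import re

    def token_features(tokens, i):
        """Illustrative token-level features for a linear-chain CRF."""
        w = tokens[i]
        return {
            'lower': w.lower(),
            'is_title': w.istitle(),
            'is_digit': w.isdigit(),
            'shape': re.sub(r'[A-Z]', 'X', re.sub(r'[a-z]', 'x', re.sub(r'\d', '9', w))),
            'prefix3': w[:3],
            'suffix3': w[-3:],
            'prev': tokens[i - 1].lower() if i > 0 else '<BOS>',
            'next': tokens[i + 1].lower() if i + 1 < len(tokens) else '<EOS>',
        }

    # Hypothetical rule-based tagger for a few easy PHI types (dates, phone numbers).
    RULES = [
        ('DATE',  re.compile(r'^\d{1,2}/\d{1,2}/\d{2,4}$')),
        ('PHONE', re.compile(r'^\(?\d{3}\)?[-.\s]?\d{3}[-.\s]?\d{4}$')),
    ]

    def rule_tags(tokens):
        tags = ['O'] * len(tokens)
        for i, w in enumerate(tokens):
            for label, pattern in RULES:
                if pattern.match(w):
                    tags[i] = label
                    break
        return tags

    def merge(crf_tags, rule_based):
        """Prefer CRF predictions; fall back to rule hits where the CRF says 'O'."""
        return [c if c != 'O' else r for c, r in zip(crf_tags, rule_based)]
    ```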

  20. Reliable identification at the species level of Brucella isolates with MALDI-TOF-MS

    PubMed Central

    2011-01-01

    Background The genus Brucella contains highly infectious species that are classified as biological threat agents. The timely detection and identification of the microorganism involved is essential for an effective response not only to biological warfare attacks but also to natural outbreaks. Matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-TOF-MS) is a rapid method for the analysis of biological samples. The advantages of this method, compared to conventional techniques, are rapidity, cost-effectiveness, accuracy and suitability for the high-throughput identification of bacteria. Discrepancies between taxonomy and genetic relatedness on the species and biovar level complicate the development of detection and identification assays. Results In this study, the accurate identification of Brucella species using MALDI-TOF-MS was achieved by constructing a Brucella reference library based on multilocus variable-number tandem repeat analysis (MLVA) data. By comparing MS-spectra from Brucella species against a custom-made MALDI-TOF-MS reference library, MALDI-TOF-MS could be used as a rapid identification method for Brucella species. In this way, 99.3% of the 152 isolates tested were identified at the species level, and B. suis biovar 1 and 2 were identified at the level of their biovar. This result demonstrates that for Brucella, even minimal genomic differences between these serovars translate to specific proteomic differences. Conclusions MALDI-TOF-MS can be developed into a fast and reliable identification method for genetically highly related species when potential taxonomic and genetic inconsistencies are taken into consideration during the generation of the reference library. PMID:22192890
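
    Library matching of MALDI-TOF spectra can be illustrated with a simple binned-spectrum similarity search. The sketch below bins and normalises spectra and ranks reference entries by cosine similarity; the m/z range, bin width and scoring are assumptions, not the scoring used with the MLVA-based reference library in the study.

    ```python
    import numpy as np

    def bin_spectrum(mz, intensity, lo=2000.0, hi=20000.0, width=5.0):
        """Bin an (m/z, intensity) spectrum onto a fixed grid and normalise it."""
        edges = np.arange(lo, hi + width, width)
        binned, _ = np.histogram(mz, bins=edges, weights=intensity)
        norm = np.linalg.norm(binned)
        return binned / norm if norm > 0 else binned

    def identify(unknown, library):
        """Rank library entries by cosine similarity to the unknown spectrum.

        unknown : binned, normalised spectrum (1-D array)
        library : dict mapping species/biovar name -> binned, normalised spectrum"""
        scores = {name: float(np.dot(unknown, ref)) for name, ref in library.items()}
        return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)
    ```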

  1. Social Identification and Interpersonal Communication in Computer-Mediated Communication: What You Do versus Who You Are in Virtual Groups

    ERIC Educational Resources Information Center

    Wang, Zuoming; Walther, Joseph B.; Hancock, Jeffrey T.

    2009-01-01

    This study investigates the influence of interpersonal communication and intergroup identification on members' evaluations of computer-mediated groups. Participants (N= 256) in 64 four-person groups interacted through synchronous computer chat. Subgroup assignments to minimal groups instilled significantly greater in-group versus out-group…

  2. Computerized nipple identification for multiple image analysis in computer-aided diagnosis

    SciTech Connect

    Zhou Chuan; Chan Heangping; Paramagul, Chintana; Roubidoux, Marilyn A.; Sahiner, Berkman; Hadjiiski, Labomir M.; Petrick, Nicholas

    2004-10-01

    Correlation of information from multiple-view mammograms (e.g., MLO and CC views, bilateral views, or current and prior mammograms) can improve the performance of breast cancer diagnosis by radiologists or by computer. The nipple is a reliable and stable landmark on mammograms for the registration of multiple mammograms. However, accurate identification of nipple location on mammograms is challenging because of the variations in image quality and in the nipple projections, resulting in some nipples being nearly invisible on the mammograms. In this study, we developed a computerized method to automatically identify the nipple location on digitized mammograms. First, the breast boundary was obtained using a gradient-based boundary tracking algorithm, and then the gray level profiles along the inside and outside of the boundary were identified. A geometric convergence analysis was used to limit the nipple search to a region of the breast boundary. A two-stage nipple detection method was developed to identify the nipple location using the gray level information around the nipple, the geometric characteristics of nipple shapes, and the texture features of glandular tissue or ducts which converge toward the nipple. At the first stage, a rule-based method was designed to identify the nipple location by detecting significant changes of intensity along the gray level profiles inside and outside the breast boundary and the changes in the boundary direction. At the second stage, a texture orientation-field analysis was developed to estimate the nipple location based on the convergence of the texture pattern of glandular tissue or ducts towards the nipple. The nipple location was finally determined from the detected nipple candidates by a rule-based confidence analysis. In this study, 377 and 367 randomly selected digitized mammograms were used for training and testing the nipple detection algorithm, respectively. Two experienced radiologists identified the nipple locations

  3. Systems Level Analysis and Identification of Pathways and Networks Associated with Liver Fibrosis

    DTIC Science & Technology

    2014-11-07

    Toxic liver injury causes necrosis and fibrosis, which may lead to cirrhosis and liver failure. Despite recent progress in understanding the...mechanism of liver fibrosis, our knowledge of the molecular-level details of this disease is still incomplete. The elucidation of networks and pathways

  4. An effective computational tool for parametric studies and identification problems in materials mechanics

    NASA Astrophysics Data System (ADS)

    Bolzon, Gabriella; Buljak, Vladimir

    2011-12-01

    Parametric studies and identification problems require repeated analyses in which only a few of the input parameters defining the problem of interest are varied, often in connection with complex numerical simulations. In fact, the physical phenomena relevant to several practical applications involve coupled material and geometric non-linearities. In these situations, accurate but expensive computations, usually carried out by the finite element method, may be replaced by numerical procedures based on proper orthogonal decomposition combined with radial basis function interpolation. Besides drastically reducing computing times and costs, this approach is capable of retaining the essential features of the considered system responses while filtering out most disturbances. These features are illustrated in this paper with specific reference to some elastic-plastic problems. The presented results can, however, be easily extended to other meaningful engineering situations.
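
    The surrogate strategy mentioned above (proper orthogonal decomposition combined with radial basis function interpolation) can be sketched in a few lines: a reduced basis is extracted from full-model snapshots by SVD, and the reduced coefficients are interpolated over the parameter space with Gaussian RBFs. This is a minimal illustration with assumed array shapes and kernel width, not the authors' implementation.

    ```python
    import numpy as np

    def build_pod_rbf(params, snapshots, n_modes=5, eps=1.0):
        """params    : (N, p) training parameter sets
           snapshots : (N, m) full-model responses (one row per parameter set)
           Returns surrogate(theta) -> approximate m-dimensional response."""
        # POD basis from the snapshot matrix (rows = snapshots)
        U, s, Vt = np.linalg.svd(snapshots, full_matrices=False)
        basis = Vt[:n_modes]                   # (n_modes, m)
        coeffs = snapshots @ basis.T           # (N, n_modes) reduced coordinates

        # Gaussian RBF interpolation of each reduced coefficient over parameter space
        def kernel(A, B):
            d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
            return np.exp(-eps * d2)

        K = kernel(params, params)
        weights = np.linalg.solve(K + 1e-10 * np.eye(len(params)), coeffs)

        def surrogate(theta):
            theta = np.atleast_2d(theta)
            return (kernel(theta, params) @ weights) @ basis
        return surrogate
    ```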

  5. Computational aspects of hot-wire identification of thermal conductivity and diffusivity under high temperature

    NASA Astrophysics Data System (ADS)

    Vala, Jiří; Jarošová, Petra

    2016-07-01

    The development of advanced materials resistant to high temperature, needed namely for the design of heat storage for low-energy and passive buildings, requires simple, inexpensive and reliable methods of identifying their temperature-sensitive thermal conductivity and diffusivity, covering both careful experimental design and the implementation of robust and effective computational algorithms. Special geometrical configurations offer a possibility of quasi-analytical evaluation of temperature development for direct problems, whereas inverse problems of simultaneous evaluation of thermal conductivity and diffusivity must be handled carefully, using least-squares (minimum variance) arguments. This paper demonstrates the proper mathematical and computational approach to such a model problem, exploiting the radial symmetry of hot-wire measurements, and includes its numerical implementation.
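
    A common quasi-analytical model for such hot-wire measurements is the long-time asymptote ΔT(t) ≈ q/(4πλ)·[ln(4at/r²) − γ], from which the thermal conductivity λ follows from the slope in ln t and the diffusivity a from the intercept. The least-squares sketch below fits both parameters with scipy; the heat input q and sensor radius r are assumed constants, and the formula is the standard textbook asymptote rather than the authors' exact model.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    GAMMA = 0.5772156649015329     # Euler-Mascheroni constant
    Q = 25.0                       # assumed heat input per unit wire length, W/m
    R = 1.0e-3                     # assumed radial distance of the sensor, m

    def hot_wire_rise(t, lam, a):
        """Long-time hot-wire asymptote: dT = q/(4*pi*lam)*(ln(4*a*t/r^2) - gamma)."""
        return Q / (4.0 * np.pi * lam) * (np.log(4.0 * a * t / R**2) - GAMMA)

    def identify(t, dT, lam0=1.0, a0=1e-6):
        """Least-squares identification of thermal conductivity lam (W/m/K)
        and diffusivity a (m^2/s) from a measured temperature-rise record."""
        (lam, a), cov = curve_fit(hot_wire_rise, t, dT, p0=(lam0, a0))
        return lam, a, cov
    ```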

  6. A Functional Level Preprocessor for Computer Aided Digital Design.

    DTIC Science & Technology

    1980-12-01

    the parsing of user input, is based on that for the computer language, PASCAL [J2,1]. The procedure is the author's original design. Each line of input...NIKLAUS WIRTH. PASCAL - USER MANUAL AND REPORT. NEW YORK, NY: SPRINGER-VERLAG, 1978...LANCASTER, DON. CMOS COOKBOOK. INDIANAPOLIS, IND: HOWARD...messages generated by SISL during its last run. Each message is of the format: subroutine generating the message, format number, and

  7. A conceptual framework of computations in mid-level vision.

    PubMed

    Kubilius, Jonas; Wagemans, Johan; Op de Beeck, Hans P

    2014-01-01

    If a picture is worth a thousand words, as an English idiom goes, what should those words-or, rather, descriptors-capture? What format of image representation would be sufficiently rich if we were to reconstruct the essence of images from their descriptors? In this paper, we set out to develop a conceptual framework that would be: (i) biologically plausible in order to provide a better mechanistic understanding of our visual system; (ii) sufficiently robust to apply in practice on realistic images; and (iii) able to tap into underlying structure of our visual world. We bring forward three key ideas. First, we argue that surface-based representations are constructed based on feature inference from the input in the intermediate processing layers of the visual system. Such representations are computed in a largely pre-semantic (prior to categorization) and pre-attentive manner using multiple cues (orientation, color, polarity, variation in orientation, and so on), and explicitly retain configural relations between features. The constructed surfaces may be partially overlapping to compensate for occlusions and are ordered in depth (figure-ground organization). Second, we propose that such intermediate representations could be formed by a hierarchical computation of similarity between features in local image patches and pooling of highly-similar units, and reestimated via recurrent loops according to the task demands. Finally, we suggest to use datasets composed of realistically rendered artificial objects and surfaces in order to better understand a model's behavior and its limitations.

  8. A conceptual framework of computations in mid-level vision

    PubMed Central

    Kubilius, Jonas; Wagemans, Johan; Op de Beeck, Hans P.

    2014-01-01

    If a picture is worth a thousand words, as an English idiom goes, what should those words—or, rather, descriptors—capture? What format of image representation would be sufficiently rich if we were to reconstruct the essence of images from their descriptors? In this paper, we set out to develop a conceptual framework that would be: (i) biologically plausible in order to provide a better mechanistic understanding of our visual system; (ii) sufficiently robust to apply in practice on realistic images; and (iii) able to tap into underlying structure of our visual world. We bring forward three key ideas. First, we argue that surface-based representations are constructed based on feature inference from the input in the intermediate processing layers of the visual system. Such representations are computed in a largely pre-semantic (prior to categorization) and pre-attentive manner using multiple cues (orientation, color, polarity, variation in orientation, and so on), and explicitly retain configural relations between features. The constructed surfaces may be partially overlapping to compensate for occlusions and are ordered in depth (figure-ground organization). Second, we propose that such intermediate representations could be formed by a hierarchical computation of similarity between features in local image patches and pooling of highly-similar units, and reestimated via recurrent loops according to the task demands. Finally, we suggest to use datasets composed of realistically rendered artificial objects and surfaces in order to better understand a model's behavior and its limitations. PMID:25566044

  9. Assessing Pre-Service Teachers' Computer Phobia Levels in Terms of Gender and Experience, Turkish Sample

    ERIC Educational Resources Information Center

    Ursavas, Omer Faruk; Karal, Hasan

    2009-01-01

    In this study it is aimed to determine the level of pre-service teachers' computer phobia. Whether or not computer phobia meaningfully varies statistically according to gender and computer experience has been tested in the study. The study was performed on 430 pre-service teachers at the Education Faculty in Rize/Turkey. Data in the study were…

  10. Identification of Propionibacteria to the species level using Fourier transform infrared spectroscopy and artificial neural networks.

    PubMed

    Dziuba, B

    2013-01-01

    Fourier transform infrared spectroscopy (FTIR) and artificial neural networks (ANNs) were used to identify species of Propionibacteria strains. The aim of the study was to improve the methodology for identifying Propionibacteria strains to the species level, because the earlier approach, in which the differentiation index D (calculated from Pearson's correlation) and cluster analyses were used to describe the correlation between the Fourier transform infrared spectra and the bacteria as molecular systems, brought unsatisfactory results. More advanced statistical methods of identifying the FTIR spectra, based on artificial neural networks (ANNs), were therefore used. In this experiment, the FTIR spectra of Propionibacteria strains stored in the library were used to develop artificial neural networks for their identification. Several multilayer perceptrons (MLP) and probabilistic neural networks (PNN) were tested. The practical value of the selected artificial neural networks was assessed based on the identification results for the spectra of 9 reference strains and 28 isolates. To verify the results of isolate identification, a PCR-based method with pairs of species-specific primers was used. The use of artificial neural networks in FTIR spectral analyses, as the most advanced chemometric method, supported correct identification of 93% of bacteria of the genus Propionibacterium to the species level.
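
    A present-day equivalent of the MLP classification used in the study can be assembled with scikit-learn. The sketch below is an assumed, generic pipeline (standardised spectra feeding a small multilayer perceptron, evaluated on a held-out split), not the network architectures or spectral preprocessing of the original work.

    ```python
    from sklearn.model_selection import train_test_split
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.neural_network import MLPClassifier

    def train_ftir_mlp(spectra, species, hidden=(64,), seed=0):
        """spectra : (N, n_wavenumbers) preprocessed FTIR spectra
           species : (N,) species labels
           Returns the fitted pipeline and its accuracy on a held-out split."""
        X_tr, X_te, y_tr, y_te = train_test_split(
            spectra, species, test_size=0.25, stratify=species, random_state=seed)
        model = make_pipeline(
            StandardScaler(),
            MLPClassifier(hidden_layer_sizes=hidden, max_iter=2000, random_state=seed))
        model.fit(X_tr, y_tr)
        return model, model.score(X_te, y_te)
    ```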

  11. Precision of cephalometric landmark identification: Cone-beam computed tomography vs conventional cephalometric views

    PubMed Central

    Ludlow, John B.; Gubler, Maritzabel; Cevidanes, Lucia; Mol, André

    2009-01-01

    Introduction In this study, we compared the precision of landmark identification using displays of multi-planar cone-beam computed tomographic (CBCT) volumes and conventional lateral cephalograms (Ceph). Methods Twenty presurgical orthodontic patients were radiographed with conventional Ceph and CBCT techniques. Five observers plotted 24 landmarks using computer displays of multi-planar reconstruction (MPR) CBCT and Ceph views during separate sessions. Absolute differences between each observer's plot and the mean of all observers were averaged as one measure of variability (ODM). The absolute difference of each observer from any other observer was averaged as a second measure of variability (DEO). ANOVA and paired t tests were used to analyze variability differences. Results Radiographic modality and landmark were significant at P < 0.0001 for DEO and ODM calculations. DEO calculations of observer variability were consistently greater than ODM. The overall correlation of 1920 paired ODM and DEO measurements was excellent at 0.972. All bilateral landmarks had increased precision when identified in the MPR views. Mediolateral variability was statistically greater than anteroposterior or caudal-cranial variability for 5 landmarks in the MPR views. Conclusions The MPR displays of CBCT volume images provide generally more precise identification of traditional cephalometric landmarks. More precise location of condylion, gonion, and orbitale overcomes the problem of superimposition of these bilateral landmarks seen in Ceph. Greater variability of certain landmarks in the mediolateral direction is probably related to inadequate definition of the landmarks in the third dimension. PMID:19732656
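
    The two variability measures have a direct numerical form: ODM averages each observer's absolute deviation from the all-observer mean plot, while DEO averages the absolute deviation of each observer from every other observer. The sketch below assumes the plots are stored as an observers × landmarks × (x, y) array and uses per-coordinate absolute differences; the original study may have aggregated the coordinates differently.

    ```python
    import numpy as np

    def odm_deo(plots):
        """plots : array of shape (n_observers, n_landmarks, 2) with (x, y) plots.

        ODM: mean absolute difference of each observer from the all-observer mean.
        DEO: mean absolute difference of each observer from every other observer."""
        mean_plot = plots.mean(axis=0, keepdims=True)
        odm = float(np.abs(plots - mean_plot).mean())

        n = plots.shape[0]
        diffs = [np.abs(plots[i] - plots[j]).mean()
                 for i in range(n) for j in range(n) if i != j]
        deo = float(np.mean(diffs))
        return odm, deo
    ```

    Because DEO compares pairs of individual observers rather than observers against their mean, it is expected to be the larger of the two, consistent with the results reported above.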

  12. Secondary School Students' Levels of Understanding in Computing Exponents

    ERIC Educational Resources Information Center

    Pitta-Pantazi, Demetra; Christou, Constantinos; Zachariades, Theodossios

    2007-01-01

    The aim of this study is to describe and analyze students' levels of understanding of exponents within the context of procedural and conceptual learning via the conceptual change and prototypes' theory. The study was conducted with 202 secondary school students with the use of a questionnaire and semi-structured interviews. The results suggest…

  13. Effects of portable computing devices on posture, muscle activation levels and efficiency.

    PubMed

    Werth, Abigail; Babski-Reeves, Kari

    2014-11-01

    Very little research exists on ergonomic exposures when using portable computing devices. This study quantified muscle activity (forearm and neck), posture (wrist, forearm and neck), and performance (gross typing speed and error rates) differences across three portable computing devices (laptop, netbook, and slate computer) and two work settings (desk and sofa) during data entry tasks. Twelve participants completed test sessions on a single computer using a test-rest-test protocol (30 min of work at one work setting, 15 min of rest, 30 min of work at the other work setting). The slate computer resulted in significantly more non-neutral wrist, elbow and neck postures, particularly when working on the sofa. Performance on the slate computer was four times lower than on the other computers, though lower muscle activity levels were also found. The potential for injury or illness may be elevated when working on smaller, portable computers in non-traditional work settings.

  14. Identification of oxygen-related midgap level in GaAs

    NASA Technical Reports Server (NTRS)

    Lagowski, J.; Lin, D. G.; Gatos, H. C.; Aoyama, T.

    1984-01-01

    An oxygen-related deep level ELO was identified in GaAs employing Bridgman-grown crystals with controlled oxygen doping. The activation energy of ELO is almost the same as that of the dominant midgap level: EL2. This fact impedes the identification of ELO by standard deep level transient spectroscopy. However, it was found that the electron capture cross section of ELO is about four times greater than that of EL2. This characteristic served as the basis for the separation and quantitative investigation of ELO employing detailed capacitance transient measurements in conjunction with reference measurements on crystals grown without oxygen doping and containing only EL2.

  15. Assigning unique identification numbers to new user accounts and groups in a computing environment with multiple registries

    DOEpatents

    DeRobertis, Christopher V.; Lu, Yantian T.

    2010-02-23

    A method, system, and program storage device for creating a new user account or user group with a unique identification number in a computing environment having multiple user registries is provided. In response to receiving a command to create a new user account or user group, an operating system of a clustered computing environment automatically checks multiple registries configured for the operating system to determine whether a candidate identification number for the new user account or user group has been assigned already to one or more existing user accounts or groups, respectively. The operating system automatically assigns the candidate identification number to the new user account or user group created in a target user registry if the checking indicates that the candidate identification number has not been assigned already to any of the existing user accounts or user groups, respectively.
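
    The check-then-assign flow described in the abstract is easy to sketch. The fragment below is a hypothetical, simplified illustration in which each registry is modelled as a mapping from identification numbers to account or group names; a real operating system would additionally need to make the check and the assignment atomic across registries.

    ```python
    def assign_unique_id(candidate_id, registries, target_registry, new_account):
        """Assign `candidate_id` to `new_account` in `target_registry` only if no
        configured registry already maps that id to an existing account or group.

        registries      : iterable of dict-like {id: account_name} mappings
        target_registry : dict-like registry receiving the new entry"""
        for registry in registries:
            if candidate_id in registry:   # collision in any configured registry
                raise ValueError(
                    f"id {candidate_id} already assigned to {registry[candidate_id]}")
        target_registry[candidate_id] = new_account
        return candidate_id
    ```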

  16. Fostering Girls' Computer Literacy through Laptop Learning: Can Mobile Computers Help To Level Out the Gender Difference?

    ERIC Educational Resources Information Center

    Schaumburg, Heike

    The goal of this study was to find out if the difference between boys and girls in computer literacy can be leveled out in a laptop program where each student has his/her own mobile computer to work with at home and at school. Ninth grade students (n=113) from laptop and non-laptop classes in a German high school were tested for their computer…

  17. A new experimental approach to computer-aided face/skull identification in forensic anthropology.

    PubMed

    Ricci, Alessio; Marella, Gian Luca; Apostol, Mario Alexandru

    2006-03-01

    The present study introduces a new approach to computer-assisted face/skull matching used for personal identification purposes in forensic anthropology. In this experiment, the authors formulated an algorithm able to identify the face of a person suspected to have disappeared, by comparing the respective person's facial image with the skull radiograph. A total of 14 subjects were selected for the study, from which a facial photograph and skull radiograph were taken and ultimately compiled into a database, saved to the hard drive of a computer. The photographs of the faces and corresponding skull radiographs were then drafted using common photographic software, taking caution not to alter the informational content of the images. Once computer generated, the facial images and menu were displayed on a color monitor. In the first phase, a few anatomic points of each photograph were selected and marked with a cross to facilitate and more accurately match the face with its corresponding skull. In the second phase, the above-mentioned cross grid was superimposed on the radiographic image of the skull and brought to scale. In the third phase, the crosses were transferred to the cranial points of the radiograph. In the fourth phase, the algorithm calculated the distance of each transferred cross and the corresponding average. The smaller the mean value, the greater the index of similarity between the face and skull. A total of 196 cross-comparisons were conducted, with positive identification resulting in each case. Hence, the algorithm matched a facial photograph to the correct skull in 100% of the cases.
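
    The final phases reduce to scaling the facial cross grid onto the radiograph and averaging the residual distances. The sketch below is a simplified stand-in for that computation: it uses a least-squares isotropic scale about the centroids (no rotation) and returns the mean cross-to-landmark distance, with smaller values indicating greater face/skull similarity. The actual scaling and distance definition of the published algorithm may differ.

    ```python
    import numpy as np

    def mean_cross_distance(face_pts, skull_pts):
        """face_pts, skull_pts : (k, 2) corresponding landmark crosses.

        Scales and translates the facial crosses onto the skull radiograph with a
        least-squares similarity fit, then returns the mean residual distance
        (smaller value = greater face/skull similarity)."""
        f = face_pts - face_pts.mean(axis=0)
        s = skull_pts - skull_pts.mean(axis=0)
        scale = (f * s).sum() / (f * f).sum()       # least-squares isotropic scale
        residuals = np.linalg.norm(scale * f - s, axis=1)
        return residuals.mean()
    ```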

  18. The conclusive role of postmortem computed tomography (CT) of the skull and computer-assisted superimposition in identification of an unknown body.

    PubMed

    Lorkiewicz-Muszyńska, Dorota; Kociemba, Wojciech; Żaba, Czesław; Łabęcka, Marzena; Koralewska-Kordel, Małgorzata; Abreu-Głowacka, Monica; Przystańska, Agnieszka

    2013-05-01

    Computed tomography is commonly used in modern medicine, and thus, it is often helpful for medicolegal purposes, especially as part of the antemortem record. The application of postmortem computed tomography and 3D reconstruction of the skull in challenging cases is reported, and its valuable contribution to positive identification is discussed. This paper presents a case in which the body of an unknown individual is identified. Positive identification had not been possible despite a multidisciplinary examination. The postmortem use of computerized tomography and 3D reconstruction of the skull followed by the comparison of individual morphological characteristics of the viscerocranium showed the concordant points between the deceased and a missing person. Finally, superimposition using a 3D-reconstructed skull instead of the skeletonized skull demonstrated an adequate degree of morphological consistency in the facial images of the analyzed individuals that led to positive identification. It was concluded that where other methods of personal identification had failed, the use of postmortem computed tomography had proved to be instrumental in the positive identification of the deceased.

  19. High level waste storage tanks 242-A evaporator standards/requirement identification document

    SciTech Connect

    Biebesheimer, E.

    1996-01-01

    This document, the Standards/Requirements Identification Document (S/RIDS) for the subject facility, represents the necessary and sufficient requirements to provide an adequate level of protection of the worker, public health and safety, and the environment. It lists those source documents from which requirements were extracted, and those requirements documents considered, but from which no requirements were taken. Documents considered as source documents included State and Federal Regulations, DOE Orders, and DOE Standards.

  20. On parameters identification of computational models of vibrations during quiet standing of humans

    NASA Astrophysics Data System (ADS)

    Barauskas, R.; Krušinskienė, R.

    2007-12-01

    Vibration of the center of pressure (COP) of the human body on the base of support during quiet standing is a very popular clinical measurement, which provides useful information about the physical and health condition of an individual. In this work, vibrations of the COP of a human body in the forward-backward direction during still standing are generated using a controlled inverted pendulum (CIP) model with a single degree of freedom (dof) supplied with a proportional, integral and differential (PID) controller, which represents the behavior of the central neural system of the human body, and excited by a cumulative disturbance vibration generated within the body due to breathing or any other physical condition. The identification of the model and disturbance parameters is an important stage in creating a close-to-reality computational model able to evaluate the features of the disturbance. The aim of this study is to present a CIP model parameter identification approach based on the information captured by the time series of the COP signal. The identification procedure is based on the minimization of an error function. The error function is formulated in terms of the time laws of the computed and experimentally measured COP vibrations. As an alternative, the error function is formulated in terms of the stabilogram diffusion function (SDF). The minimization of the error functions is carried out by employing methods based on sensitivity functions of the error with respect to the model and excitation parameters. The sensitivity functions are obtained by using variational techniques. The inverse dynamic problem approach has been employed in order to establish the properties of the disturbance time laws ensuring satisfactory coincidence of the measured and computed COP vibration laws. The main difficulty of the investigated problem is encountered during the model validation stage. Generally, neither the PID controller parameter set nor the disturbance time law is known in advance. In this work, an error function
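
    The identification loop described above can be illustrated by simulating a single-dof inverted pendulum under a PID controller and an internal disturbance, and minimising the mismatch between simulated and measured COP time series. The sketch below uses an assumed sinusoidal disturbance parameterisation, crude explicit Euler integration and a Nelder-Mead search; it is a toy version of the approach, not the sensitivity-function-based minimisation of the paper.

    ```python
    import numpy as np
    from scipy.optimize import minimize

    def simulate_cop(params, t, g=9.81, L=1.0):
        """Single-dof inverted pendulum with a PID controller and a sinusoidal
        disturbance (amplitude/frequency are part of the identified parameters)."""
        kp, ki, kd, amp, freq = params
        dt = t[1] - t[0]
        theta, omega, integ = 0.001, 0.0, 0.0        # small initial lean
        cop = np.zeros_like(t)
        for i, ti in enumerate(t):
            disturbance = amp * np.sin(2 * np.pi * freq * ti)
            torque = -(kp * theta + ki * integ + kd * omega) + disturbance
            alpha = (g / L) * np.sin(theta) + torque  # normalised toy dynamics
            omega += alpha * dt
            theta += omega * dt
            integ += theta * dt
            cop[i] = L * np.sin(theta)                # forward-backward COP excursion
        return cop

    def identify(t, cop_measured, x0=(500.0, 10.0, 150.0, 0.1, 0.3)):
        """Least-squares fit of PID and disturbance parameters to a measured COP record."""
        err = lambda p: np.mean((simulate_cop(p, t) - cop_measured) ** 2)
        return minimize(err, x0, method='Nelder-Mead')
    ```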

  1. Bladed-shrouded-disc aeroelastic analyses: Computer program updates in NASTRAN level 17.7

    NASA Technical Reports Server (NTRS)

    Gallo, A. M.; Elchuri, V.; Skalski, S. C.

    1981-01-01

    In October 1979, a computer program based on the state-of-the-art compressor and structural technologies applied to bladed-shrouded discs was developed. The program was made operational in NASTRAN Level 16. The bladed-disc computer program was updated for operation in NASTRAN Level 17.7. The supersonic cascade unsteady aerodynamics routine UCAS, delivered as part of the NASTRAN Level 16 program, was recoded to improve its execution time. These improvements are presented.

  2. Identify Skills and Proficiency Levels Necessary for Entry-Level Employment for All Vocational Programs Using Computers to Process Data. Final Report.

    ERIC Educational Resources Information Center

    Crowe, Jacquelyn

    This study investigated computer and word processing operator skills necessary for employment in today's high technology office. The study was comprised of seven major phases: (1) identification of existing community college computer operator programs in the state of Washington; (2) attendance at an information management seminar; (3) production…

  3. Computed Tomography (CT) Scanning Facilitates Early Identification of Neonatal Cystic Fibrosis Piglets

    PubMed Central

    Guillon, Antoine; Chevaleyre, Claire; Barc, Celine; Berri, Mustapha; Adriaensen, Hans; Lecompte, François; Villemagne, Thierry; Pezant, Jérémy; Delaunay, Rémi; Moënne-Loccoz, Joseph; Berthon, Patricia; Bähr, Andrea; Wolf, Eckhard; Klymiuk, Nikolai; Attucci, Sylvie; Ramphal, Reuben; Sarradin, Pierre; Buzoni-Gatel, Dominique; Si-Tahar, Mustapha; Caballero, Ignacio

    2015-01-01

    Background Cystic Fibrosis (CF) is the most prevalent autosomal recessive disease in the Caucasian population. A cystic fibrosis transmembrane conductance regulator knockout (CFTR-/-) pig that displays most of the features of the human CF disease has recently been developed. However, CFTR-/- pigs present a 100% prevalence of meconium ileus that leads to death in the first hours after birth, requiring a rapid diagnosis and surgical intervention to relieve intestinal obstruction. Identification of CFTR-/- piglets is usually performed by PCR genotyping, a procedure that lasts between 4 and 6 h. Here, we aimed to develop a procedure for rapid identification of CFTR-/- piglets that would allow placing them under intensive care soon after birth and immediately proceeding with the surgical correction. Methods and Principal Findings Male and female CFTR+/- pigs were crossed and the progeny were examined by computed tomography (CT) scan to detect the presence of meconium ileus and facilitate a rapid post-natal surgical intervention. Genotype was confirmed by PCR. CT scanning showed a sensitivity of 94.4% for diagnosing CFTR-/- piglets. Diagnosis by CT scan reduced the birth-to-surgery time from a minimum of 10 h down to a minimum of 2.5 h and increased the survival of CFTR-/- piglets to a maximum of 13 days post-surgery, as opposed to just 66 h when surgery was performed later. Conclusion CT scan imaging of meconium ileus is an accurate method for rapid identification of CFTR-/- piglets. Early CT detection of meconium ileus may help to extend the lifespan of CFTR-/- piglets and, thus, improve experimental research on CF, still an incurable disease. PMID:26600426

  4. Single-board computer based control system for a portable Raman device with integrated chemical identification

    NASA Astrophysics Data System (ADS)

    Mobley, Joel; Cullum, Brian M.; Wintenberg, Alan L.; Shane Frank, S.; Maples, Robert A.; Stokes, David L.; Vo-Dinh, Tuan

    2004-06-01

    We report the development of a battery-powered portable chemical identification device for field use consisting of an acousto-optic tunable filter (AOTF)-based Raman spectrometer with integrated data processing and analysis software. The various components and custom circuitry are integrated into a self-contained instrument by control software that runs on an embedded single-board computer (SBC), which communicates with the various instrument modules through a 48-line bidirectional TTL bus. The user interacts with the instrument via a touch-sensitive liquid crystal display unit (LCD) that provides soft buttons for user control as well as visual feedback (e.g., spectral plots, stored data, instrument settings, etc.) from the instrument. The control software manages all operational aspects of the instrument with the exception of the power management module that is run by embedded firmware. The SBC-based software includes both automated and manual library searching capabilities, permitting rapid identification of samples in the field. The use of the SBC in tandem with the LCD touchscreen for interfacing and control provides the instrument with a great deal of flexibility as its function can be customized to specific users or tasks via software modifications alone. The instrument, as currently configured, can be operated as a research-grade Raman spectrometer for scientific applications and as a "black-box" chemical identification system for field use. The instrument can acquire 198-point spectra over a spectral range of 238-1620 cm-1, perform a library search, and display the results in less than 14 s. The operating modes of the instrument are demonstrated illustrating the utility and flexibility afforded the system by the SBC-LCD control module.

  5. Identification of intestinal wall abnormalities and ischemia by modeling spatial uncertainty in computed tomography imaging findings.

    PubMed

    Tsunoyama, Taichiro; Pham, Tuan D; Fujita, Takashi; Sakamoto, Tetsuya

    2014-10-01

    Intestinal abnormalities and ischemia are medical conditions in which inflammation and injury of the intestine are caused by inadequate blood supply. Acute ischemia of the small bowel can be life-threatening. Computed tomography (CT) is currently a gold standard for the diagnosis of acute intestinal ischemia in the emergency department. However, the assessment of the diagnostic performance of CT findings in the detection of intestinal abnormalities and ischemia has been a difficult task for both radiologists and surgeons. Little effort has so far been devoted to developing computerized systems for the automated identification of these types of complex gastrointestinal disorders. In this paper, a geostatistical mapping of spatial uncertainty in CT scans is introduced for medical image feature extraction, which can be effectively applied for diagnostic detection of intestinal abnormalities and ischemia from control patterns. Experimental results obtained from the analysis of clinical data suggest the usefulness of the proposed uncertainty mapping model.

  6. Supervised neural computing solutions for fluorescence identification of benzimidazole fungicides. Data and decision fusion strategies.

    PubMed

    Suárez-Araujo, Carmen Paz; García Báez, Patricio; Sánchez Rodríguez, Álvaro; Santana-Rodrríguez, José Juan

    2016-12-01

    Benzimidazole fungicides (BFs) are a type of pesticide of high environmental interest characterized by a heavy fluorescence spectral overlap which complicates their detection in mixtures. In this paper, we present a computational study based on supervised neural networks for a multi-label classification problem. Specifically, backpropagation networks (BPNs) with data fusion and ensemble schemes are used for the simultaneous resolution of difficult multi-fungicide mixtures. We designed, optimized and compared simple BPNs, BPNs with data fusion and BPN ensembles. The information environment used is made up of synchronous and conventional BF fluorescence spectra. The mixture spectra are used in neither the training nor the validation stage. This study allows us to assess the benefit of fusing the labels of carbendazim and benomyl for the identification of BFs in complex multi-fungicide mixtures.

  7. Computational methods for the identification of spatially varying stiffness and damping in beams

    NASA Technical Reports Server (NTRS)

    Banks, H. T.; Rosen, I. G.

    1986-01-01

    A numerical approximation scheme for the estimation of functional parameters in Euler-Bernoulli models for the transverse vibration of flexible beams with tip bodies is developed. The method permits the identification of spatially varying flexural stiffness and Voigt-Kelvin viscoelastic damping coefficients which appear in the hybrid system of ordinary and partial differential equations and boundary conditions describing the dynamics of such structures. An inverse problem is formulated as a least squares fit to data subject to constraints in the form of a vector system of abstract first order evolution equations. Spline-based finite element approximations are used to finite dimensionalize the problem. Theoretical convergence results are given and numerical studies carried out on both conventional (serial) and vector computers are discussed.

  8. A computer program for linear nonparametric and parametric identification of biological data.

    PubMed

    Werness, S A; Anderson, D J

    1984-01-01

    A computer program package for parametric and nonparametric linear system identification of both static and dynamic biological data, written for an LSI-11 minicomputer with 28 K of memory, is described. The program has 11 possible commands including an instructional help command. A user can perform nonparametric spectral analysis and estimation of autocorrelation and partial autocorrelation functions of univariate data and estimate nonparametrically the transfer function and possibly an associated noise series of bivariate data. In addition, the commands provide the user with the means to derive a parametric autoregressive moving average model for univariate data, to derive a parametric transfer function and noise model for bivariate data, and to perform several model evaluation tests such as pole-zero cancellation, examination of residual whiteness and uncorrelatedness with the input. The program, consisting of a main program and driver subroutine as well as six overlay segments, may be run interactively or automatically.

  9. NLSCIDNT user's guide maximum likelihood parameter identification computer program with nonlinear rotorcraft model

    NASA Technical Reports Server (NTRS)

    1979-01-01

    A nonlinear, maximum likelihood, parameter identification computer program (NLSCIDNT) is described which evaluates rotorcraft stability and control coefficients from flight test data. The optimal estimates of the parameters (stability and control coefficients) are determined (identified) by minimizing the negative log likelihood cost function. The minimization technique is the Levenberg-Marquardt method, which behaves like the steepest descent method when it is far from the minimum and behaves like the modified Newton-Raphson method when it is nearer the minimum. Twenty-one states and 40 measurement variables are modeled, and any subset may be selected. States which are not integrated may be fixed at an input value, or time history data may be substituted for the state in the equations of motion. Any aerodynamic coefficient may be expressed as a nonlinear polynomial function of selected 'expansion variables'.

  10. An integrated transcriptomic and computational analysis for biomarker identification in gastric cancer

    PubMed Central

    Cui, Juan; Chen, Yunbo; Chou, Wen-Chi; Sun, Liankun; Chen, Li; Suo, Jian; Ni, Zhaohui; Zhang, Ming; Kong, Xiaoxia; Hoffman, Lisabeth L.; Kang, Jinsong; Su, Yingying; Olman, Victor; Johnson, Darryl; Tench, Daniel W.; Amster, I. Jonathan; Orlando, Ron; Puett, David; Li, Fan; Xu, Ying

    2011-01-01

    This report describes an integrated study on identification of potential markers for gastric cancer in patients’ cancer tissues and sera based on: (i) genome-scale transcriptomic analyses of 80 paired gastric cancer/reference tissues and (ii) computational prediction of blood-secretory proteins supported by experimental validation. Our findings show that: (i) 715 and 150 genes exhibit significantly differential expressions in all cancers and early-stage cancers versus reference tissues, respectively; and a substantial percentage of the alteration is found to be influenced by age and/or by gender; (ii) 21 co-expressed gene clusters have been identified, some of which are specific to certain subtypes or stages of the cancer; (iii) the top-ranked gene signatures give better than 94% classification accuracy between cancer and the reference tissues, some of which are gender-specific; and (iv) 136 of the differentially expressed genes were predicted to have their proteins secreted into blood, 81 of which were detected experimentally in the sera of 13 validation samples and 29 found to have differential abundances in the sera of cancer patients versus controls. Overall, the novel information obtained in this study has led to identification of promising diagnostic markers for gastric cancer and can benefit further analyses of the key (early) abnormalities during its development. PMID:20965966

  11. Multivariate Effects of Level of Education, Computer Ownership, and Computer Use on Female Students' Attitudes towards CALL

    ERIC Educational Resources Information Center

    Rahimi, Mehrak; Yadollahi, Samaneh

    2012-01-01

    The aim of this study was investigating Iranian female students' attitude towards CALL and its relationship with their level of education, computer ownership, and frequency of use. One hundred and forty-two female students (50 junior high-school students, 49 high-school students and 43 university students) participated in this study. They filled…

  12. Ubiquitin Ligase Substrate Identification through Quantitative Proteomics at Both the Protein and Peptide Levels

    PubMed Central

    Lee, Kimberly A.; Hammerle, Lisa P.; Andrews, Paul S.; Stokes, Matthew P.; Mustelin, Tomas; Silva, Jeffrey C.; Black, Roy A.; Doedens, John R.

    2011-01-01

    Protein ubiquitination is a key regulatory process essential to life at a cellular level; significant efforts have been made to identify ubiquitinated proteins through proteomics studies, but the level of success has not reached that of heavily studied post-translational modifications, such as phosphorylation. HRD1, an E3 ubiquitin ligase, has been implicated in rheumatoid arthritis, but no disease-relevant substrates have been identified. To identify these substrates, we have taken both peptide and protein level approaches to enrich for ubiquitinated proteins in the presence and absence of HRD1. At the protein level, a two-step strategy was taken using cells expressing His6-tagged ubiquitin, enriching proteins first based on their ubiquitination and second based on the His tag with protein identification by LC-MS/MS. Application of this method resulted in identification and quantification of more than 400 ubiquitinated proteins, a fraction of which were found to be sensitive to HRD1 and were therefore deemed candidate substrates. In a second approach, ubiquitinated peptides were enriched after tryptic digestion by peptide immunoprecipitation using an antibody specific for the diglycine-labeled internal lysine residue indicative of protein ubiquitination, with peptides and ubiquitination sites identified by LC-MS/MS. Peptide immunoprecipitation resulted in identification of over 1800 ubiquitinated peptides on over 900 proteins in each study, with several proteins emerging as sensitive to HRD1 levels. Notably, significant overlap exists between the HRD1 substrates identified by the protein-based and the peptide-based strategies, with clear cross-validation apparent both qualitatively and quantitatively, demonstrating the effectiveness of both strategies and furthering our understanding of HRD1 biology. PMID:21987572

  13. New Fe I Level Energies and Line Identifications from Stellar Spectra

    NASA Astrophysics Data System (ADS)

    Peterson, Ruth C.; Kurucz, Robert L.

    2015-01-01

    The spectrum of the Fe I atom is critical to many areas of astrophysics and beyond. Measurements of the energies of its high-lying levels remain woefully incomplete, however, despite extensive laboratory and solar analysis. In this work, we use high-resolution archival absorption-line ultraviolet and optical spectra of stars whose warm temperatures favor moderate Fe I excitation. We derive the energy for a particular upper level in Kurucz's semiempirical calculations by adopting a trial value that yields the same wavelength for a given line predicted to be about as strong as that of a strong unidentified spectral line observed in the stellar spectra, then checking the new wavelengths of other strong predicted transitions that share the same upper level for coincidence with other strong observed unidentified lines. To date, this analysis has provided the upper energies of 66 Fe I levels. Many new energy levels are higher than those accessible to laboratory experiments; several exceed the Fe I ionization energy. These levels provide new identifications for over 2000 potentially detectable lines. Almost all of the new levels of odd parity include UV lines that were detected but unclassified in laboratory Fe I absorption spectra, providing an external check on the energy values. We motivate and present the procedure, provide the resulting new energy levels and their uncertainties, list all the potentially detectable UV and optical new Fe I line identifications and their gf values, point out new lines of astrophysical interest, and discuss the prospects for additional Fe I energy level determinations.
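
    The bookkeeping behind a trial upper-level energy is the wavenumber relation E_upper = E_lower + 10^8/λ_vac(Å), with energies in cm⁻¹. The fragment below is only that arithmetic; it ignores the air-to-vacuum wavelength correction needed for optical lines, and the numbers in the comment are illustrative, not actual Fe I levels.

    ```python
    def upper_level_energy(lower_cm1, vacuum_wavelength_angstrom):
        """Upper-level energy (cm^-1) implied by a transition at the given
        vacuum wavelength (Angstrom) from a known lower level (cm^-1)."""
        return lower_cm1 + 1.0e8 / vacuum_wavelength_angstrom

    # Example (illustrative numbers only): a line at 2500.00 A (vacuum) from a
    # lower level at 7000.00 cm^-1 implies an upper level near
    # 7000.00 + 40000.00 = 47000.00 cm^-1.
    ```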

  14. Computer-Assisted Instruction in Elementary Logic at the University Level. Technical Report No. 239.

    ERIC Educational Resources Information Center

    Goldberg, Adele; Suppes, Patrick

    Earlier research by the authors in the design and use of computer-assisted instructional systems and curricula for teaching mathematical logic to gifted elementary school students has been extended to the teaching of university-level courses. This report is a description of the curriculum and problem types of a computer-based course offered at…

  15. A Study of Effectiveness of Computer Assisted Instruction (CAI) over Classroom Lecture (CRL) at ICS Level

    ERIC Educational Resources Information Center

    Kaousar, Tayyeba; Choudhry, Bushra Naoreen; Gujjar, Aijaz Ahmed

    2008-01-01

    This study was aimed to evaluate the effectiveness of CAI vs. classroom lecture for computer science at ICS level. The objectives were to compare the learning effects of two groups with classroom lecture and computer-assisted instruction studying the same curriculum and the effects of CAI and CRL in terms of cognitive development. Hypotheses of…

  16. Computational techniques in tribology and material science at the atomic level

    NASA Technical Reports Server (NTRS)

    Ferrante, J.; Bozzolo, G. H.

    1992-01-01

    Computations in tribology and material science at the atomic level present considerable difficulties. Computational techniques ranging from first-principles to semi-empirical and their limitations are discussed. Example calculations of metallic surface energies using semi-empirical techniques are presented. Finally, application of the methods to calculation of adhesion and friction are presented.

  17. Identification of rounded atelectasis in workers exposed to asbestos by contrast helical computed tomography.

    PubMed

    Terra-Filho, M; Kavakama, J; Bagatin, E; Capelozzi, V L; Nery, L E; Tavares, R

    2003-10-01

    Rounded atelectasis (RA) is a benign and unusual form of subpleural lung collapse that has been described mostly in asbestos-exposed workers. This form of atelectasis manifests as a lung nodule and can be confused with bronchogenic carcinoma upon conventional radiologic examination. The objective of the present study was to evaluate the variation in contrast uptake in computed tomography for the identification of asbestos-related RA in Brazil. Between January 1998 and December 2000, high-resolution computed tomography (HRCT) was performed in 1658 asbestos-exposed workers. The diagnosis was made in nine patients based on a history of prior asbestos exposure, the presence of characteristic HRCT findings, and lesions unchanged in size over 2 years or more. In three of them the diagnosis was confirmed during surgery. The dynamic contrast enhancement study was modified to evaluate nodules and pulmonary masses. All nine patients with RA received iodide contrast according to weight. After iodide contrast was infused, the average attenuation, reported in Hounsfield units (HU), increased from 62.5 ± 9.7 to 125.4 ± 20.7 (P < 0.05), with a mean enhancement of 62.5 ± 19.7 HU (range 40 to 89) and with uniform dense opacification. In conclusion, in this study all patients with RA showed contrast enhancement with uniform dense opacification. The main clinical implication of this finding is that this procedure does not permit differentiation between RA and malignant pulmonary neoplasm.

  18. A Computational Model for the Identification of Biochemical Pathways in the Krebs Cycle

    SciTech Connect

    Oliveira, Joseph S.; Bailey, Colin G.; Jones-Oliveira, Janet B.; Dixon, David A.; Gull, Dean W.; Chandler, Mary L.

    2003-03-01

    We have applied an algorithmic methodology which provably decomposes any complex network into a complete family of principal subcircuits to study the minimal circuits that describe the Krebs cycle. Every operational behavior that the network is capable of exhibiting can be represented by some combination of these principal subcircuits, and this computational decomposition is linearly efficient. We have developed a computational model that can be applied to biochemical reaction systems which accurately renders pathways of such reactions via directed hypergraphs (Petri nets). We have applied the model to the citric acid cycle (Krebs cycle). The Krebs cycle, which oxidizes the acetyl group of acetyl CoA to CO2 and reduces NAD and FAD to NADH and FADH2, is a complex interacting set of nine subreaction networks. The Krebs cycle was selected because of its familiarity to the biological community and because it exhibits enough complexity to be interesting in order to introduce this novel analytic approach. This study validates the algorithmic methodology for the identification of significant biochemical signaling subcircuits, based solely upon the mathematical model and not upon prior biological knowledge. The utility of the algebraic-combinatorial model for identifying the complete set of biochemical subcircuits as a data set is demonstrated for this important metabolic process.

  19. A computational model for the identification of biochemical pathways in the krebs cycle.

    PubMed

    Oliveira, Joseph S; Bailey, Colin G; Jones-Oliveira, Janet B; Dixon, David A; Gull, Dean W; Chandler, Mary L

    2003-01-01

    We have applied an algorithmic methodology which provably decomposes any complex network into a complete family of principal subcircuits to study the minimal circuits that describe the Krebs cycle. Every operational behavior that the network is capable of exhibiting can be represented by some combination of these principal subcircuits and this computational decomposition is linearly efficient. We have developed a computational model that can be applied to biochemical reaction systems which accurately renders pathways of such reactions via directed hypergraphs (Petri nets). We have applied the model to the citric acid cycle (Krebs cycle). The Krebs cycle, which oxidizes the acetyl group of acetyl CoA to CO(2) and reduces NAD and FAD to NADH and FADH(2), is a complex interacting set of nine subreaction networks. The Krebs cycle was selected because of its familiarity to the biological community and because it exhibits enough complexity to be interesting in order to introduce this novel analytic approach. This study validates the algorithmic methodology for the identification of significant biochemical signaling subcircuits, based solely upon the mathematical model and not upon prior biological knowledge. The utility of the algebraic-combinatorial model for identifying the complete set of biochemical subcircuits as a data set is demonstrated for this important metabolic process.

  20. CaPSID: A bioinformatics platform for computational pathogen sequence identification in human genomes and transcriptomes

    PubMed Central

    2012-01-01

    Background It is now well established that nearly 20% of human cancers are caused by infectious agents, and the list of human oncogenic pathogens will grow in the future for a variety of cancer types. Whole tumor transcriptome and genome sequencing by next-generation sequencing technologies presents an unparalleled opportunity for pathogen detection and discovery in human tissues but requires development of new genome-wide bioinformatics tools. Results Here we present CaPSID (Computational Pathogen Sequence IDentification), a comprehensive bioinformatics platform for identifying, querying and visualizing both exogenous and endogenous pathogen nucleotide sequences in tumor genomes and transcriptomes. CaPSID includes a scalable, high performance database for data storage and a web application that integrates the genome browser JBrowse. CaPSID also provides useful metrics for sequence analysis of pre-aligned BAM files, such as gene and genome coverage, and is optimized to run efficiently on multiprocessor computers with low memory usage. Conclusions To demonstrate the usefulness and efficiency of CaPSID, we carried out a comprehensive analysis of both a simulated dataset and transcriptome samples from ovarian cancer. CaPSID correctly identified all of the human and pathogen sequences in the simulated dataset, while in the ovarian dataset CaPSID’s predictions were successfully validated in vitro. PMID:22901030

  1. Crop species identification using machine vision of computer extracted individual leaves

    NASA Astrophysics Data System (ADS)

    Camargo Neto, João; Meyer, George E.

    2005-11-01

    An unsupervised method for plant species identification was developed which uses computer-extracted individual whole leaves from color images of crop canopies. Green canopies were isolated from soil/residue backgrounds using a modified Excess Green and Excess Red separation method. Connected components of isolated green regions of interest were changed into pixel fragments using the Gustafson-Kessel fuzzy clustering method. The fragments were reassembled as individual leaves using a genetic optimization algorithm and a fitness method. Pixels of whole leaves were then analyzed using the elliptic Fourier shape and Haralick's classical textural feature analyses. A binary template was constructed to represent each selected leaf region of interest. Elliptic Fourier descriptors were generated from a chain encoding of the leaf boundary. Leaf template orientation was corrected by rotating each extracted leaf to a standard horizontal position. This was done using information provided by the first harmonic set of coefficients. Textural features were computed from the grayscale co-occurrence matrix of the leaf pixel set. Standardized leaf orientation significantly improved the leaf textural venation results. Principal component analysis from SAS (R) was used to select the best Fourier descriptors and textural indices. Indices of local homogeneity and entropy were found to contribute to improved classification rates. A SAS classification model was developed and correctly classified 83% of redroot pigweed, 100% of sunflower, 83% of soybean, and 73% of velvetleaf species. An overall plant species correct classification rate of 86% was attained.
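
    For the textural step of a pipeline like this one, Haralick-style indices are read off a grey-level co-occurrence matrix (GLCM) of the extracted leaf pixels. The sketch below assumes scikit-image (graycomatrix/graycoprops) is available and that the input is an already-extracted grayscale leaf image; it computes homogeneity and entropy, the two indices highlighted above, and is an illustration rather than the authors' SAS-based implementation.

        # Sketch: GLCM texture features (homogeneity, entropy) for an extracted
        # leaf image. Assumes scikit-image >= 0.19; the random image at the end
        # is a stand-in for a real leaf region.
        import numpy as np
        from skimage.feature import graycomatrix, graycoprops

        def leaf_texture_features(leaf, levels=32):
            # Quantise to fewer grey levels so the co-occurrence matrix stays small.
            q = (leaf.astype(np.uint16) * levels // 256).astype(np.uint8)
            glcm = graycomatrix(q, distances=[1], angles=[0, np.pi / 2],
                                levels=levels, symmetric=True, normed=True)
            homogeneity = graycoprops(glcm, "homogeneity").mean()
            p = glcm.astype(float)
            # Entropy summed over the two orientations of the normalised GLCM.
            entropy = -np.sum(p * np.log2(p, where=p > 0, out=np.zeros_like(p)))
            return {"homogeneity": float(homogeneity), "entropy": float(entropy)}

        rng = np.random.default_rng(0)
        print(leaf_texture_features(rng.integers(0, 256, (64, 64), dtype=np.uint8)))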

  2. Computer experiments on periodic systems identification using rotor blade transient flapping-torsion responses at high advance ratio

    NASA Technical Reports Server (NTRS)

    Hohenemser, K. H.; Prelewicz, D. A.

    1974-01-01

    Systems identification methods have recently been applied to rotorcraft to estimate stability derivatives from transient flight control response data. While these applications assumed a linear constant-coefficient representation of the rotorcraft, the computer experiments described in this paper used transient responses in flap-bending and torsion of a rotor blade at high advance ratio, which is a rapidly time-varying periodic system.

  3. Identification of Nonlinear Micron-Level Mechanics for a Precision Deployable Joint

    NASA Technical Reports Server (NTRS)

    Bullock, S. J.; Peterson, L. D.

    1994-01-01

    The experimental identification of micron-level nonlinear joint mechanics and dynamics for a pin-clevis joint used in a precision, adaptive, deployable space structure is investigated. The force-state mapping method is used to identify the behavior of the joint under a preload. The results of applying a single tension-compression cycle to the joint under a tensile preload are presented. The observed micron-level behavior is highly nonlinear and involves all six rigid-body motion degrees of freedom of the joint. It also suggests that, at micron levels of motion, modelling of the joint mechanics and dynamics must include the interactions between all internal components, such as the pin, bushings, and the joint node.

  4. The identification of a selective dopamine D2 partial agonist, D3 antagonist displaying high levels of brain exposure.

    PubMed

    Holmes, Ian P; Blunt, Richard J; Lorthioir, Olivier E; Blowers, Stephen M; Gribble, Andy; Payne, Andrew H; Stansfield, Ian G; Wood, Martyn; Woollard, Patrick M; Reavill, Charlie; Howes, Claire M; Micheli, Fabrizio; Di Fabio, Romano; Donati, Daniele; Terreni, Silvia; Hamprecht, Dieter; Arista, Luca; Worby, Angela; Watson, Steve P

    2010-03-15

    The identification of a highly selective D(2) partial agonist, D(3) antagonist tool molecule which demonstrates high levels of brain exposure and selectivity against an extensive range of dopamine, serotonin, adrenergic, histamine, and muscarinic receptors is described.

  5. Identification of pumping influences in long-term water level fluctuations.

    PubMed

    Harp, Dylan R; Vesselinov, Velimir V

    2011-01-01

    Identification of the pumping influences at monitoring wells caused by spatially and temporally variable water supply pumping can be a challenging yet important hydrogeological task. The information that can be obtained can be critical for conceptualization of the hydrogeological conditions and indications of the zone of influence of the individual pumping wells. However, the pumping influences are often intermittent and small in magnitude, with variable production rates from multiple pumping wells. While these difficulties may support an inclination to abandon the existing dataset and conduct a dedicated cross-hole pumping test, that option can be challenging and expensive to coordinate and execute. This paper presents a method that applies a simple analytical model to a long-term water level record within an inverse modeling framework. The methodology allows the identification of the pumping wells influencing the water level fluctuations. Thus, the analysis provides an efficient and cost-effective alternative to designed and coordinated cross-hole pumping tests. We apply this method to a dataset from the Los Alamos National Laboratory site. Our analysis also provides (1) an evaluation of the information content of the transient water level data; (2) indications of potential structures of the aquifer heterogeneity inhibiting or promoting pressure propagation; and (3) guidance for the development of more complicated models requiring detailed specification of the aquifer heterogeneity.

  6. A riboprinting scheme for identification of unknown Acanthamoeba isolates at species level

    PubMed Central

    Kong, Hyun-Hee

    2002-01-01

    We describe a riboprinting scheme for identification of unknown Acanthamoeba isolates at the species level. It involved PCR-RFLP analysis of the small subunit ribosomal RNA gene (riboprint) of 24 reference strains with 4 restriction enzymes. Seven strains in morphological groups I and III were identified at the species level by their unique PCR product sizes and Rsa I riboprint types. The unique RFLP patterns of 17 strains in group II obtained with Dde I, Taq I and Hae III classified them into: (1) four taxa that were identifiable at the species level, (2) a subgroup of 4 taxa and a pair of 2 taxa that were identical with each other, and (3) a species complex of 7 taxa assigned to the A. castellanii complex that were closely related. These results were consistent with those obtained by 18S rDNA sequence analysis. This approach provides an alternative to rDNA sequencing for rapid identification of a new clinical isolate or a large number of environmental isolates of Acanthamoeba. PMID:11949210

  7. Computational identification of miRNAs and their targets in Phaseolus vulgaris.

    PubMed

    Han, J; Xie, H; Kong, M L; Sun, Q P; Li, R Z; Pan, J B

    2014-01-21

    MicroRNAs (miRNAs) are a class of non-coding small RNAs that negatively regulate gene expression at the post-transcriptional level. Although thousands of miRNAs have been identified in plants, limited information is available about miRNAs in Phaseolus vulgaris, despite it being an important food legume worldwide. The high conservation of plant miRNAs enables the identification of new miRNAs in P. vulgaris by homology analysis. Here, 1804 known and unique plant miRNAs from 37 plant species were blast-searched against expressed sequence tag and genomic survey sequence databases to identify novel miRNAs in P. vulgaris. All candidate sequences were screened by a series of miRNA filtering criteria. Finally, we identified 27 conserved miRNAs, belonging to 24 miRNA families. When compared against known miRNAs in P. vulgaris, we found that 24 of the 27 miRNAs were newly discovered. Further, we identified 92 potential target genes with known functions for these novel miRNAs. Most of these target genes were predicted to be involved in plant development, signal transduction, metabolic pathways, disease resistance, and environmental stress response. The identification of the novel miRNAs in P. vulgaris is anticipated to provide baseline information for further research about the biological functions and evolution of miRNAs in P. vulgaris.
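
    The homology pipeline described above amounts to BLASTing known plant miRNAs against EST/GSS sequences and then filtering the hits. The sketch below shows one plausible filter of that kind; the thresholds (length 18-25 nt, at most 4 mismatches, no gaps, no ambiguous bases) and the Hit record layout are assumptions for illustration, not the criteria used in the paper.

        # Illustrative filter for homology-based miRNA candidates. The Hit
        # fields mirror a parsed tabular BLASTN output; thresholds are assumed
        # for the sketch, not taken from the paper.
        from dataclasses import dataclass

        @dataclass
        class Hit:
            query_mirna: str    # known plant miRNA id, e.g. "ath-miR156a"
            subject_id: str     # P. vulgaris EST/GSS accession
            aligned_len: int
            mismatches: int
            gaps: int
            subject_seq: str    # candidate mature miRNA sequence

        def passes_filters(hit: Hit) -> bool:
            return (18 <= hit.aligned_len <= 25       # mature miRNA length range
                    and hit.mismatches <= 4           # limited divergence allowed
                    and hit.gaps == 0                 # no indels in the mature region
                    and "N" not in hit.subject_seq)   # discard ambiguous bases

        hits = [Hit("ath-miR156a", "PV_EST_0001", 20, 2, 0, "TGACAGAAGAGAGTGAGCAC"),
                Hit("ath-miR159a", "PV_GSS_0042", 21, 6, 1, "TTTGGATTGAAGGGAGCTCTN")]
        candidates = [h for h in hits if passes_filters(h)]
        print([h.subject_id for h in candidates])     # -> ['PV_EST_0001']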

  8. Absolute identification of muramic acid, at trace levels, in human septic synovial fluids in vivo and absence in aseptic fluids.

    PubMed

    Fox, A; Fox, K; Christensson, B; Harrelson, D; Krahmer, M

    1996-09-01

    This is the first report of a study employing the state-of-the-art technique of gas chromatography-tandem mass spectrometry for absolute identification of muramic acid (a marker for peptidoglycan) at trace levels in a human or animal body fluid or tissue. Daughter mass spectra of synovial fluid muramic acid peaks (≥30 ng/ml) were identical to those of pure muramic acid. Absolute chemical identification at this level represents a 1,000-fold increase in sensitivity over previous gas chromatography-mass spectrometry identifications. Muramic acid was positively identified in synovial fluids during infection and was eliminated over time but was absent from aseptic fluids.

  9. Groundwater contamination: identification of source signal by time-reverse mass transport computation and filtering

    NASA Astrophysics Data System (ADS)

    Koussis, A. S.; Mazi, K.; Lykoudis, S.; Argyriou, A.

    2003-04-01

    Source signal identification is a forensic task, within regulatory and legal activities. Estimation of the contaminant's release history by reverse solution (stepping back in time) of the mass transport equation, ∂C/∂t + u ∂C/∂x = D ∂²C/∂x², is an ill-posed problem (its solution is non-unique and unstable). For this reason we propose the recovery of the source signal from measured concentration profile data through a numerical technique that is based on the premise of advection-dominated transport. We derive an explicit numerical scheme by discretising the pure advection equation, ∂C/∂t + u ∂C/∂x = 0, such that it also models gradient transport by matching numerical diffusion (the leading truncation error term) to physical dispersion. The match is achieved by appropriate choice of the scheme's spatial weighting coefficient θ as a function of the grid Peclet number P = uΔx/D: θ = 0.5 - 1/P. This is a novel and efficient direct solution approach for the signal identification problem at hand that can accommodate space-variable transport parameters as well. First, we perform numerical experiments to define proper grids (in terms of the Courant number C = uΔt/Δx and the grid Peclet number P) for control of spurious oscillations (instability). We then assess recovery of source signals from perfect as well as from error-seeded field data, considering field data resulting from single- and double-peaked source signals. With perfect data, the scheme recovers source signals with very good accuracy. With imperfect data, however, additional data conditioning is required for control of signal noise. Alternating reverse profile computation with Savitzky-Golay low-pass filtering allows the recovery of well-timed and smooth source signals that satisfy mass conservation very well. Current research focuses on: a) optimising the performance of Savitzky-Golay filters, through selection of appropriate parameters (order of least
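
    The central numerical idea above, choosing the spatial weight θ = 0.5 - 1/P so that the scheme's numerical diffusion uΔx(0.5 - θ) equals the physical dispersion D, together with the Savitzky-Golay smoothing pass, can be illustrated in a few lines. This is a hedged sketch of the weighting and the filtering step only, not a reimplementation of the authors' reverse-time solver; all parameter values and the profile are invented for the example.

        # Sketch: matched-diffusion weighting plus Savitzky-Golay smoothing for
        # a reverse transport computation. Parameter values are illustrative.
        import numpy as np
        from scipy.signal import savgol_filter

        u, D = 1.0, 0.05           # velocity [m/d] and dispersion coefficient [m^2/d]
        dx = 0.5                   # grid spacing (chosen so that P > 2 and theta > 0)
        P = u * dx / D             # grid Peclet number
        theta = 0.5 - 1.0 / P      # spatial weight from the record
        D_num = u * dx * (0.5 - theta)   # numerical diffusion of the weighted scheme
        assert np.isclose(D_num, D)      # matches the physical dispersion by construction

        # A noisy "measured" concentration profile of the kind that would be
        # stepped back in time; only the smoothing pass between reverse steps
        # is shown here.
        x = np.arange(0.0, 50.0, dx)
        rng = np.random.default_rng(1)
        profile = np.exp(-(x - 30.0) ** 2 / 20.0) + 0.02 * rng.normal(size=x.size)
        smoothed = savgol_filter(profile, window_length=11, polyorder=3)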

  10. Computational Identification of Novel MicroRNAs and Their Targets in Vigna unguiculata.

    PubMed

    Lu, Yongzhong; Yang, Xiaoyun

    2010-01-01

    MicroRNAs (miRNAs) are a class of endogenous, noncoding, short RNAs directly involved in regulating gene expression at the posttranscriptional level. High conservation of miRNAs in plant provides the foundation for identification of new miRNAs in other plant species through homology alignment. Here, previous known plant miRNAs were BLASTed against the Expressed Sequence Tag (EST) and Genomic Survey Sequence (GSS) databases of Vigna unguiculata, and according to a series of filtering criteria, a total of 47 miRNAs belonging to 13 miRNA families were identified, and 30 potential target genes of them were subsequently predicted, most of which seemed to encode transcription factors or enzymes participating in regulation of development, growth, metabolism, and other physiological processes. Overall, our findings lay the foundation for further researches of miRNAs function in Vigna unguiculata.

  11. High level language for measurement complex control based on the computer E-100I

    NASA Technical Reports Server (NTRS)

    Zubkov, B. V.

    1980-01-01

    A high-level language was designed to control the process of conducting an experiment using the computer "Elektronika-100I". Program examples are given to control the measuring and actuating devices. The procedure for including these programs in the suggested high-level language is described.

  12. A Unique Automation Platform for Measuring Low Level Radioactivity in Metabolite Identification Studies

    PubMed Central

    Krauser, Joel; Walles, Markus; Wolf, Thierry; Graf, Daniel; Swart, Piet

    2012-01-01

    Generation and interpretation of biotransformation data on drugs, i.e. identification of physiologically relevant metabolites, defining metabolic pathways and elucidation of metabolite structures, have become increasingly important to the drug development process. Profiling using 14C or 3H radiolabel is defined as the chromatographic separation and quantification of drug-related material in a given biological sample derived from an in vitro, preclinical in vivo or clinical study. Metabolite profiling is a very time intensive activity, particularly for preclinical in vivo or clinical studies which have defined limitations on radiation burden and exposure levels. A clear gap exists for certain studies which do not require specialized high volume automation technologies, yet these studies would still clearly benefit from automation. Use of radiolabeled compounds in preclinical and clinical ADME studies, specifically for metabolite profiling and identification are a very good example. The current lack of automation for measuring low level radioactivity in metabolite profiling requires substantial capacity, personal attention and resources from laboratory scientists. To help address these challenges and improve efficiency, we have innovated, developed and implemented a novel and flexible automation platform that integrates a robotic plate handling platform, HPLC or UPLC system, mass spectrometer and an automated fraction collector. PMID:22723932

  13. A unique automation platform for measuring low level radioactivity in metabolite identification studies.

    PubMed

    Krauser, Joel; Walles, Markus; Wolf, Thierry; Graf, Daniel; Swart, Piet

    2012-01-01

    Generation and interpretation of biotransformation data on drugs, i.e. identification of physiologically relevant metabolites, defining metabolic pathways and elucidation of metabolite structures, have become increasingly important to the drug development process. Profiling using (14)C or (3)H radiolabel is defined as the chromatographic separation and quantification of drug-related material in a given biological sample derived from an in vitro, preclinical in vivo or clinical study. Metabolite profiling is a very time intensive activity, particularly for preclinical in vivo or clinical studies which have defined limitations on radiation burden and exposure levels. A clear gap exists for certain studies which do not require specialized high volume automation technologies, yet these studies would still clearly benefit from automation. Use of radiolabeled compounds in preclinical and clinical ADME studies, specifically for metabolite profiling and identification are a very good example. The current lack of automation for measuring low level radioactivity in metabolite profiling requires substantial capacity, personal attention and resources from laboratory scientists. To help address these challenges and improve efficiency, we have innovated, developed and implemented a novel and flexible automation platform that integrates a robotic plate handling platform, HPLC or UPLC system, mass spectrometer and an automated fraction collector.

  14. Ingroup identification and group-level narcissism as predictors of U.S. citizens' attitudes and behavior toward Arab immigrants.

    PubMed

    Lyons, Patricia A; Kenworthy, Jared B; Popan, Jason R

    2010-09-01

    In four studies, the authors explored factors contributing to negative attitudes and behavior toward Arab immigrants in the United States. In Study 1, Americans reported greater threat from Arabs compared to other groups (e.g., Latino, Asian). In Study 2, the authors tested the effects of ingroup identification and group-level narcissism on attitudes toward Arab, Latino, Asian, and European immigrants. Identification interacted with group narcissism in predicting attitudes toward Arab (but not other) immigrants, such that identification predicted negative attitudes toward Arab immigrants only at mean and high levels of group narcissism. Study 3 explored the convergent and discriminant validity of the group narcissism construct. In Study 4, the authors added a behavioral dependent measure. Again, ingroup identification predicted negative behavior and attitudes toward an Arab immigrant group (but not comparison groups) only at mean and high levels of group narcissism. Theoretical and practical implications are discussed.

  15. Ontology-Based High-Level Context Inference for Human Behavior Identification

    PubMed Central

    Villalonga, Claudia; Razzaq, Muhammad Asif; Khan, Wajahat Ali; Pomares, Hector; Rojas, Ignacio; Lee, Sungyoung; Banos, Oresti

    2016-01-01

    Recent years have witnessed a huge progress in the automatic identification of individual primitives of human behavior, such as activities or locations. However, the complex nature of human behavior demands more abstract contextual information for its analysis. This work presents an ontology-based method that combines low-level primitives of behavior, namely activity, locations and emotions, unprecedented to date, to intelligently derive more meaningful high-level context information. The paper contributes with a new open ontology describing both low-level and high-level context information, as well as their relationships. Furthermore, a framework building on the developed ontology and reasoning models is presented and evaluated. The proposed method proves to be robust while identifying high-level contexts even in the event of erroneously-detected low-level contexts. Despite reasonable inference times being obtained for a relevant set of users and instances, additional work is required to scale to long-term scenarios with a large number of users. PMID:27690050

  16. Rapid identification of mycobacteria to the species level by polymerase chain reaction and restriction enzyme analysis.

    PubMed Central

    Telenti, A; Marchesi, F; Balz, M; Bally, F; Böttger, E C; Bodmer, T

    1993-01-01

    A method for the rapid identification of mycobacteria to the species level was developed on the basis of polymerase chain reaction (PCR) evaluation of the gene encoding the 65-kDa protein. The method involves restriction enzyme analysis of PCR products obtained with primers common to all mycobacteria. Using two restriction enzymes, BstEII and HaeIII, medically relevant and other frequent laboratory isolates were differentiated to the species or subspecies level by PCR-restriction enzyme pattern analysis. PCR-restriction enzyme pattern analysis was performed on isolates (n = 330) from solid and fluid culture media, including BACTEC, or from frozen and lyophilized stocks. The procedure does not involve hybridization steps or the use of radioactivity and can be completed within 1 working day. PMID:8381805
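
    The pattern analysis step above is, in effect, an in-silico restriction digest of the amplified 65-kDa-gene fragment followed by a lookup of the BstEII and HaeIII fragment-size pattern. A minimal sketch using the enzymes' recognition sequences (BstEII: GGTNACC, HaeIII: GGCC) is shown below; the amplicon string is a made-up placeholder, not a real mycobacterial sequence, and the exact cut offset within each site is ignored for simplicity.

        # Minimal in-silico PRA: fragment sizes after cutting a PCR product at
        # BstEII (GGTNACC) and HaeIII (GGCC) recognition sites. The amplicon is
        # a made-up placeholder; cut offsets within the sites are ignored.
        import re

        RECOGNITION = {"BstEII": "GGT[ACGT]ACC", "HaeIII": "GGCC"}

        def fragment_sizes(seq, site_regex):
            cuts = [m.start() for m in re.finditer(site_regex, seq)]
            bounds = [0] + cuts + [len(seq)]
            return [b - a for a, b in zip(bounds, bounds[1:]) if b > a]

        amplicon = "ATGGCCAAGGTAACCTTGGCCTAGGTCACCGGA"   # placeholder sequence
        pattern = {enzyme: fragment_sizes(amplicon, rx)
                   for enzyme, rx in RECOGNITION.items()}
        print(pattern)   # {'BstEII': [8, 15, 10], 'HaeIII': [2, 15, 16]}
        # Species assignment would compare this pattern against a table of
        # published BstEII/HaeIII fragment-size patterns.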

  17. Computation of likelihood ratios in fingerprint identification for configurations of any number of minutiae.

    PubMed

    Neumann, Cédric; Champod, Christophe; Puch-Solis, Roberto; Egli, Nicole; Anthonioz, Alexandre; Bromage-Griffiths, Andie

    2007-01-01

    Recent court challenges have highlighted the need for statistical research on fingerprint identification. This paper proposes a model for computing likelihood ratios (LRs) to assess the evidential value of comparisons with any number of minutiae. The model considers minutiae type, direction and relative spatial relationships. It expands on previous work on three minutiae by adopting spatial modeling using radial triangulation and a probabilistic distortion model for assessing the numerator of the LR. The model has been tested on a sample of 686 ulnar loops and 204 arches. Feature vectors used for statistical analysis were obtained following a preprocessing step based on Gabor filtering and image processing to extract minutiae data. The metric used to assess similarity between two feature vectors is based on a Euclidean distance measure. Tippett plots and rates of misleading evidence have been used as performance indicators of the model. The model has shown encouraging behavior, with low rates of misleading evidence and an LR power that increases significantly with the number of minutiae. The LRs that it provides are highly indicative of identity of source in a significant proportion of cases, even when considering configurations with few minutiae. In contrast with previous research, the model, in addition to minutia type and direction, incorporates spatial relationships of minutiae without introducing probabilistic independence assumptions. The model also accounts for finger distortion.
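
    At its core, a model of this kind evaluates the ratio of two densities at the observed score: one fitted to comparisons known to share a source and one fitted to comparisons known not to. The sketch below shows that generic score-based construction with kernel density estimates over a Euclidean-distance score; it is not the radial-triangulation and distortion model of the paper, and the score data are simulated.

        # Generic score-based likelihood ratio as a stand-in for the full
        # feature-based model described in this record. Scores are simulated.
        import numpy as np
        from scipy.stats import gaussian_kde

        rng = np.random.default_rng(42)
        # Euclidean-distance scores: same-source comparisons tend to be small,
        # different-source comparisons larger (both simulated here).
        same_source = rng.normal(loc=2.0, scale=0.8, size=500).clip(min=0.0)
        diff_source = rng.normal(loc=6.0, scale=1.5, size=5000).clip(min=0.0)

        f_same = gaussian_kde(same_source)   # numerator density
        f_diff = gaussian_kde(diff_source)   # denominator density

        def likelihood_ratio(score):
            return (f_same(score) / f_diff(score))[0]

        print(likelihood_ratio(2.5))   # LR >> 1 supports the same-source proposition
        print(likelihood_ratio(7.0))   # LR << 1 supports the different-source proposition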

  18. Computational Tools for Allosteric Drug Discovery: Site Identification and Focus Library Design.

    PubMed

    Huang, Wenkang; Nussinov, Ruth; Zhang, Jian

    2017-01-01

    Allostery is an intrinsic phenomenon of biological macromolecules involving regulation and/or signal transduction induced by a ligand binding to an allosteric site distinct from a molecule's active site. Allosteric drugs are currently receiving increased attention in drug discovery because drugs that target allosteric sites can provide important advantages over the corresponding orthosteric drugs including specific subtype selectivity within receptor families. Consequently, targeting allosteric sites, instead of orthosteric sites, can reduce drug-related side effects and toxicity. On the down side, allosteric drug discovery can be more challenging than traditional orthosteric drug discovery due to difficulties associated with determining the locations of allosteric sites and designing drugs based on these sites and the need for the allosteric effects to propagate through the structure, reach the ligand binding site and elicit a conformational change. In this study, we present computational tools ranging from the identification of potential allosteric sites to the design of "allosteric-like" modulator libraries. These tools may be particularly useful for allosteric drug discovery.

  19. A computational method for the identification of new candidate carcinogenic and non-carcinogenic chemicals.

    PubMed

    Chen, Lei; Chu, Chen; Lu, Jing; Kong, Xiangyin; Huang, Tao; Cai, Yu-Dong

    2015-09-01

    Cancer is one of the leading causes of human death. Based on current knowledge, one of the causes of cancer is exposure to toxic chemical compounds, including radioactive compounds, dioxin, and arsenic. The identification of new carcinogenic chemicals may warn us of potential danger and help to identify new ways to prevent cancer. In this study, a computational method was proposed to identify potential carcinogenic chemicals, as well as non-carcinogenic chemicals. According to the current validated carcinogenic and non-carcinogenic chemicals from the CPDB (Carcinogenic Potency Database), the candidate chemicals were searched in a weighted chemical network constructed according to chemical-chemical interactions. Then, the obtained candidate chemicals were further selected by a randomization test and information on chemical interactions and structures. The analyses identified several candidate carcinogenic chemicals, while those candidates identified as non-carcinogenic were supported by a literature search. In addition, several candidate carcinogenic/non-carcinogenic chemicals exhibit structural dissimilarity with validated carcinogenic/non-carcinogenic chemicals.

  20. [Determination of antibiotype by a computer program. Epidemiological significance and antibiogram-identification correlates].

    PubMed

    Fosse, T; Macone, F; Laffont, C

    1988-06-01

    By means of a computer program, disk diffusion diameters were analysed and an antibiotic susceptibility code (antibiotype) was determined for Enterobacteriaceae. The code is a six-figure number, each figure summarising susceptibility (susceptible or resistant) to three antibiotics, so a series of 18 antibiotics is needed to calculate the six-figure code. At a minimum, the following antibiotics were chosen for their characteristic behavior: amoxycillin, ticarcillin, amoxycillin + clavulanic acid, cephalothin, ticarcillin + clavulanic acid, cefotaxime, gentamycin, tobramycin, amikacin, nalidixic acid, pefloxacin, ciprofloxacin, fosfomycin and colistin. The code allows three kinds of use: epidemiology, by comparing the biochemical and susceptibility patterns of isolates of the same species; laboratory control, in which a database of the main antibiotic susceptibility patterns for each species allows a rapid check that the biochemical identification is compatible with the antibiogram, an inconsistent result leading either to re-checking of the biochemical and susceptibility tests or to recording a new code in a file to further enrich the database; and printing of a message, depending on the code, for therapeutic purposes.
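
    The coding scheme itself is easy to reproduce: eighteen susceptibility calls, taken three at a time in a fixed panel order, give six figures. In the sketch below each triplet of susceptible/resistant calls is read as a 3-bit number (0-7); that digit convention and the four antibiotics added to complete the 18-drug panel are assumptions for the illustration, since the abstract names only fourteen drugs and does not give the program's exact coding rule.

        # Sketch of a 6-figure antibiotype: 18 S/R calls grouped into triplets,
        # each triplet read as a 3-bit digit (0-7). The digit convention and the
        # four panel entries after colistin are assumptions for this example.
        PANEL = [
            "amoxycillin", "ticarcillin", "amoxycillin+clavulanate",
            "cephalothin", "ticarcillin+clavulanate", "cefotaxime",
            "gentamicin", "tobramycin", "amikacin",
            "nalidixic acid", "pefloxacin", "ciprofloxacin",
            "fosfomycin", "colistin",
            "trimethoprim", "chloramphenicol", "tetracycline", "nitrofurantoin",
        ]

        def antibiotype(results):
            """results maps antibiotic name -> 'S' or 'R'; returns a 6-figure code."""
            bits = ["1" if results[a] == "R" else "0" for a in PANEL]
            return "".join(str(int("".join(bits[i:i + 3]), 2)) for i in range(0, 18, 3))

        # Example: a strain resistant only to amoxycillin and cephalothin.
        calls = {a: "S" for a in PANEL}
        calls["amoxycillin"] = calls["cephalothin"] = "R"
        print(antibiotype(calls))   # -> '440000'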

  1. Identification, Recovery, and Refinement of Hitherto Undescribed Population-Level Genomes from the Human Gastrointestinal Tract

    PubMed Central

    Laczny, Cedric C.; Muller, Emilie E. L.; Heintz-Buschart, Anna; Herold, Malte; Lebrun, Laura A.; Hogan, Angela; May, Patrick; de Beaufort, Carine; Wilmes, Paul

    2016-01-01

    Linking taxonomic identity and functional potential at the population-level is important for the study of mixed microbial communities and is greatly facilitated by the availability of microbial reference genomes. While the culture-independent recovery of population-level genomes from environmental samples using the binning of metagenomic data has expanded available reference genome catalogs, several microbial lineages remain underrepresented. Here, we present two reference-independent approaches for the identification, recovery, and refinement of hitherto undescribed population-level genomes. The first approach is aimed at genome recovery of varied taxa and involves multi-sample automated binning using CANOPY CLUSTERING complemented by visualization and human-augmented binning using VIZBIN post hoc. The second approach is particularly well-suited for the study of specific taxa and employs VIZBIN de novo. Using these approaches, we reconstructed a total of six population-level genomes of distinct and divergent representatives of the Alphaproteobacteria class, the Mollicutes class, the Clostridiales order, and the Melainabacteria class from human gastrointestinal tract-derived metagenomic data. Our results demonstrate that, while automated binning approaches provide great potential for large-scale studies of mixed microbial communities, these approaches should be complemented with informative visualizations because expert-driven inspection and refinements are critical for the recovery of high-quality population-level genomes. PMID:27445992

  2. Enhanced fault-tolerant quantum computing in d-level systems.

    PubMed

    Campbell, Earl T

    2014-12-05

    Error-correcting codes protect quantum information and form the basis of fault-tolerant quantum computing. Leading proposals for fault-tolerant quantum computation require codes with an exceedingly rare property, a transversal non-Clifford gate. Codes with the desired property are presented for d-level qudit systems with prime d. The codes use n=d-1 qudits and can detect up to ∼d/3 errors. We quantify the performance of these codes for one approach to quantum computation known as magic-state distillation. Unlike prior work, we find performance is always enhanced by increasing d.

  3. Level sequence and splitting identification of closely spaced energy levels by angle-resolved analysis of fluorescence light

    NASA Astrophysics Data System (ADS)

    Wu, Z. W.; Volotka, A. V.; Surzhykov, A.; Dong, C. Z.; Fritzsche, S.

    2016-06-01

    The angular distribution and linear polarization of the fluorescence light following the resonant photoexcitation is investigated within the framework of density matrix and second-order perturbation theory. Emphasis has been placed on "signatures" for determining the level sequence and splitting of intermediate (partially) overlapping resonances, if analyzed as a function of photon energy of incident light. Detailed computations within the multiconfiguration Dirac-Fock method have been performed, especially for the 1s²2s²2p⁶3s, Ji = 1/2 + γ₁ → (1s²2s2p⁶3s)₁3p₃/₂, J = 1/2, 3/2 → 1s²2s²2p⁶3s, Jf = 1/2 + γ₂ photoexcitation and subsequent fluorescence emission of atomic sodium. A remarkably strong dependence of the angular distribution and linear polarization of the γ₂ fluorescence emission is found upon the level sequence and splitting of the intermediate (1s²2s2p⁶3s)₁3p₃/₂, J = 1/2, 3/2 overlapping resonances owing to their finite lifetime (linewidth). We therefore suggest that accurate measurements of the angular distribution and linear polarization might help identify the sequence and small splittings of closely spaced energy levels, even if they cannot be spectroscopically resolved.

  4. Development of a computer-assisted forensic radiographic identification method using the lateral cervical and lumbar spine.

    PubMed

    Derrick, Sharon M; Raxter, Michelle H; Hipp, John A; Goel, Priya; Chan, Elaine F; Love, Jennifer C; Wiersema, Jason M; Akella, N Shastry

    2015-01-01

    Medical examiners and coroners (ME/C) in the United States hold statutory responsibility to identify deceased individuals who fall under their jurisdiction. The computer-assisted decedent identification (CADI) project was designed to modify software used in diagnosis and treatment of spinal injuries into a mathematically validated tool for ME/C identification of fleshed decedents. CADI software analyzes the shapes of targeted vertebral bodies imaged in an array of standard radiographs and quantifies the likelihood that any two of the radiographs contain matching vertebral bodies. Six validation tests measured the repeatability, reliability, and sensitivity of the method, and the effects of age, sex, and number of radiographs in array composition. CADI returned a 92-100% success rate in identifying the true matching pair of vertebrae within arrays of five to 30 radiographs. Further development of CADI is expected to produce a novel identification method for use in ME/C offices that is reliable, timely, and cost-effective.

  5. Computational identification and functional validation of regulatory motifs in cartilage-expressed genes

    PubMed Central

    Davies, Sherri R.; Chang, Li-Wei; Patra, Debabrata; Xing, Xiaoyun; Posey, Karen; Hecht, Jacqueline; Stormo, Gary D.; Sandell, Linda J.

    2007-01-01

    Chondrocyte gene regulation is important for the generation and maintenance of cartilage tissues. Several regulatory factors have been identified that play a role in chondrogenesis, including the positive transacting factors of the SOX family such as SOX9, SOX5, and SOX6, as well as negative transacting factors such as C/EBP and delta EF1. However, a complete understanding of the intricate regulatory network that governs the tissue-specific expression of cartilage genes is not yet available. We have taken a computational approach to identify cis-regulatory, transcription factor (TF) binding motifs in a set of cartilage characteristic genes to better define the transcriptional regulatory networks that regulate chondrogenesis. Our computational methods have identified several TFs, whose binding profiles are available in the TRANSFAC database, as important to chondrogenesis. In addition, a cartilage-specific SOX-binding profile was constructed and used to identify both known, and novel, functional paired SOX-binding motifs in chondrocyte genes. Using DNA pattern-recognition algorithms, we have also identified cis-regulatory elements for unknown TFs. We have validated our computational predictions through mutational analyses in cell transfection experiments. One novel regulatory motif, N1, found at high frequency in the COL2A1 promoter, was found to bind to chondrocyte nuclear proteins. Mutational analyses suggest that this motif binds a repressive factor that regulates basal levels of the COL2A1 promoter. PMID:17785538
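
    The motif-scanning part of such an analysis reduces to scoring every window of a promoter against a position weight matrix and keeping high-scoring hits. The sketch below does this with a short, invented SOX-like count matrix (consensus AACAATG) purely for illustration; it is not the cartilage-specific SOX profile constructed in the study, and the toy promoter and threshold are likewise assumptions.

        # Sketch: log-odds PWM scan of a promoter sequence. The 7-bp count
        # matrix (consensus AACAATG) and the threshold are invented for the
        # example; they are not the study's cartilage-specific SOX profile.
        import numpy as np

        BASES = "ACGT"
        counts = np.array([   # rows A, C, G, T; columns are motif positions 1-7
            [8, 9, 0, 8, 8, 0, 1],
            [1, 0, 9, 1, 0, 1, 1],
            [0, 0, 0, 0, 1, 0, 7],
            [1, 1, 1, 1, 1, 9, 1],
        ], dtype=float)
        pwm = np.log2((counts + 0.5) / (counts.sum(axis=0) + 2.0) / 0.25)

        def scan(seq, threshold=6.0):
            """Yield (position, score) for windows scoring above the threshold."""
            idx = {b: i for i, b in enumerate(BASES)}
            width = counts.shape[1]
            for pos in range(len(seq) - width + 1):
                window = seq[pos:pos + width]
                if any(b not in idx for b in window):
                    continue
                score = sum(pwm[idx[b], j] for j, b in enumerate(window))
                if score >= threshold:
                    yield pos, round(float(score), 2)

        promoter = "GGGAACAATGCCTTAACAATAGG"   # toy sequence with two candidate sites
        print(list(scan(promoter)))            # hits at positions 3 and 14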

  6. [Multi-level identification and analysis about infrared spectroscopy of lophatheri herba].

    PubMed

    Shao, Ying; Wu, Qi-Nan; Gu, Wei; Yue, Wei; Wu, Da-Wei; Fan, Xiu-He

    2014-05-01

    Based on the infrared spectra of Lophatheri Herba and Commelinae Herba, one-dimensional infrared spectra, second-derivative spectra and two-dimensional correlation spectra were used to find the differences between Lophatheri Herba and its imitations. Common-peak ratio and variant-peak ratio dual-index sequences were calculated and established from the infrared spectra of eleven batches of herbs. Cluster analysis was applied to the infrared spectral data of Lophatheri Herba to explore the similarity between samples. The grouping trends obtained from the dual-index sequential analysis and from the cluster analysis were consistent. The results showed that the differences could be found by multi-level identification, and that the source and quality of the herbs could be effectively distinguished by the two analysis methods. The infrared spectroscopy used in the present work has the advantages of a quick procedure, small sample requirement and reliable results, and could provide a new method for distinguishing traditional Chinese medicines from their imitations and adulterants and for controlling quality and origin.

  7. 48 CFR 227.7203-3 - Early identification of computer software or computer software documentation to be furnished to...

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... use, reproduction or disclosure. 227.7203-3 Section 227.7203-3 Federal Acquisition Regulations System... restrictions on use, reproduction or disclosure. (a) Use the provision at 252.227-7017, Identification and... for which restrictions, other than copyright, on use, modification, reproduction, release,...

  8. 48 CFR 227.7203-3 - Early identification of computer software or computer software documentation to be furnished to...

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... use, reproduction or disclosure. 227.7203-3 Section 227.7203-3 Federal Acquisition Regulations System... restrictions on use, reproduction or disclosure. (a) Use the provision at 252.227-7017, Identification and... for which restrictions, other than copyright, on use, modification, reproduction, release,...

  9. 48 CFR 227.7203-3 - Early identification of computer software or computer software documentation to be furnished to...

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... use, reproduction or disclosure. 227.7203-3 Section 227.7203-3 Federal Acquisition Regulations System... restrictions on use, reproduction or disclosure. (a) Use the provision at 252.227-7017, Identification and... for which restrictions, other than copyright, on use, modification, reproduction, release,...

  10. 48 CFR 227.7203-3 - Early identification of computer software or computer software documentation to be furnished to...

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... use, reproduction or disclosure. 227.7203-3 Section 227.7203-3 Federal Acquisition Regulations System... restrictions on use, reproduction or disclosure. (a) Use the provision at 252.227-7017, Identification and... for which restrictions, other than copyright, on use, modification, reproduction, release,...

  11. 48 CFR 227.7203-3 - Early identification of computer software or computer software documentation to be furnished to...

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... use, reproduction or disclosure. 227.7203-3 Section 227.7203-3 Federal Acquisition Regulations System... restrictions on use, reproduction or disclosure. (a) Use the provision at 252.227-7017, Identification and... for which restrictions, other than copyright, on use, modification, reproduction, release,...

  12. Computational medicinal chemistry for rational drug design: Identification of novel chemical structures with potential anti-tuberculosis activity.

    PubMed

    Koseki, Yuji; Aoki, Shunsuke

    2014-01-01

    Tuberculosis (TB) is caused by the bacterium Mycobacterium tuberculosis and is a common infectious disease with high mortality and morbidity. The increasing prevalence of drug-resistant strains of TB presents a major public health problem. Due to the lack of effective drugs to treat these drug-resistant strains, the discovery or development of novel anti-TB drugs is important. Computer-aided drug design has become an established strategy for the identification of novel active chemicals through a combination of several drug design tools. In this review, we summarise the current chemotherapy for TB, describe attractive target proteins for the development of antibiotics against TB, and detail several computational drug design strategies that may contribute to the further identification of active chemicals for the treatment of not only TB but also other diseases.

  13. User-microprogrammable, local host computer with low-level parallelism

    SciTech Connect

    Tomita, S.; Shibayama, K.; Kitamura, T.; Nakata, T.; Hagiwara, H.

    1983-01-01

    This paper describes the architecture of a dynamically microprogrammable computer with low-level parallelism, called QA-2, which is designed as a high-performance, local host computer for laboratory use. The architectural principle of the QA-2 is the marriage of the high-speed, parallel processing capability offered by four powerful arithmetic and logic units (ALUs) with the architectural flexibility provided by large-scale, dynamic user-microprogramming. By changing its writable control storage dynamically, the QA-2 can be tailored to a wide spectrum of research-oriented applications covering high-level language processing and real-time processing. 11 references.

  14. [Tri-Level Infrared Spectroscopic Identification of Hot Melting Reflective Road Marking Paint].

    PubMed

    Li, Hao; Ma, Fang; Sun, Su-qin

    2015-12-01

    In order to detect road marking paint in trace evidence from traffic accident scenes and to differentiate brands, we used tri-level infrared spectroscopic identification, which employs Fourier transform infrared spectroscopy (FTIR), second-derivative infrared spectroscopy (SD-IR) and two-dimensional correlation infrared spectroscopy (2D-IR), to identify three selected domestic brands of hot-melting reflective road marking paint and the raw materials in their formulas. The experimental results show that the ATR and FTIR spectrograms of the three brands of coating are very similar in shape and differ only in absorption peak wavenumbers; all have wide, strong absorption peaks near 1435 cm⁻¹ and strong absorption peaks near 879, 2955, 2919 and 2870 cm⁻¹. After enlarging partial areas of the spectrograms and comparing them with the spectrograms of each raw material in the formula, the brands can be distinguished. The spectrograms in the 700-970 and 1370-1660 cm⁻¹ regions mainly reflect the different relative contents of heavy calcium carbonate in the three brands of paint, and the 2800-2960 cm⁻¹ region reflects those of polyethylene wax (PE wax), ethylene vinyl acetate resin (EVA) and dioctyl phthalate (DOP). The SD-IR spectra not only verify the results of the FTIR analysis but also further magnify the microscopic differences and reflect the different relative contents of quartz sand in the 512-799 cm⁻¹ region. Within the 1351-1525 cm⁻¹ range, the 2D-IR spectra show more significant differences in the positions and numbers of auto-peaks. Therefore, tri-level infrared spectroscopic identification is a fast and effective method for distinguishing hot-melting road marking paints, with a gradual improvement in apparent resolution at each level.

  15. DNA Barcoding for Efficient Species- and Pathovar-Level Identification of the Quarantine Plant Pathogen Xanthomonas

    PubMed Central

    Tian, Qian; Zhao, Wenjun; Lu, Songyu; Zhu, Shuifang; Li, Shidong

    2016-01-01

    Genus Xanthomonas comprises many economically important plant pathogens that affect a wide range of hosts. Indeed, fourteen Xanthomonas species/pathovars have been regarded as official quarantine bacteria for imports in China. To date, however, a rapid and accurate method capable of identifying all of the quarantine species/pathovars has yet to be developed. In this study, we therefore evaluated the capacity of DNA barcoding as a digital identification method for discriminating quarantine species/pathovars of Xanthomonas. For these analyses, 327 isolates, representing 45 Xanthomonas species/pathovars, as well as five additional species/pathovars from GenBank (50 species/pathovars total), were utilized to test the efficacy of four DNA barcode candidate genes (16S rRNA gene, cpn60, gyrB, and avrBs2). Of these candidate genes, cpn60 displayed the highest rate of PCR amplification and sequencing success. The tree-building (Neighbor-joining), ‘best close match’, and barcode gap methods were subsequently employed to assess the species- and pathovar-level resolution of each gene. Notably, all isolates of each quarantine species/pathovars formed a monophyletic group in the neighbor-joining tree constructed using the cpn60 sequences. Moreover, cpn60 also demonstrated the most satisfactory results in both barcoding gap analysis and the ‘best close match’ test. Thus, compared with the other markers tested, cpn60 proved to be a powerful DNA barcode, providing a reliable and effective means for the species- and pathovar-level identification of the quarantine plant pathogen Xanthomonas. PMID:27861494
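
    Of the three evaluation methods listed, the 'best close match' test has a particularly simple form: each query is assigned the label of its nearest barcode neighbour, provided that neighbour lies within a distance threshold. The sketch below implements that idea with uncorrected p-distances over aligned cpn60 sequences; the reference sequences, labels and the 1% threshold are invented for the example and are not taken from the study's data.

        # Sketch of a 'best close match' assignment on aligned barcode
        # sequences. Reference sequences and the threshold are invented.
        def p_distance(a, b):
            """Uncorrected p-distance between two aligned, equal-length sequences."""
            pairs = [(x, y) for x, y in zip(a, b) if x != "-" and y != "-"]
            return sum(x != y for x, y in pairs) / len(pairs)

        references = {   # accession: (species/pathovar label, aligned cpn60 fragment)
            "ref1": ("X. oryzae pv. oryzae", "ATGGCTGCAAAAGACGTT"),
            "ref2": ("X. campestris pv. campestris", "ATGGCAGCGAAAGATGTT"),
        }

        def best_close_match(query, references, threshold=0.01):
            label, dist = min(((lab, p_distance(query, seq))
                               for lab, seq in references.values()),
                              key=lambda t: t[1])
            return label if dist <= threshold else "no identification"

        print(best_close_match("ATGGCTGCAAAAGACGTT", references))  # -> X. oryzae pv. oryzae
        print(best_close_match("ATGGCTGCTAAGGACGTC", references))  # -> no identification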

  16. Computational Identification and Comparative Analysis of Secreted and Transmembrane Proteins in Six Burkholderia Species

    PubMed Central

    Nguyen, Thao Thi; Lee, Hyun-Hee; Park, Jungwook; Park, Inmyoung; Seo, Young-Su

    2017-01-01

    As a step towards discovering novel pathogenesis-related proteins, we performed a genome scale computational identification and characterization of secreted and transmembrane (TM) proteins, which are mainly responsible for bacteria-host interactions and interactions with other bacteria, in the genomes of six representative Burkholderia species. The species comprised plant pathogens (B. glumae BGR1, B. gladioli BSR3), human pathogens (B. pseudomallei K96243, B. cepacia LO6), and plant-growth promoting endophytes (Burkholderia sp. KJ006, B. phytofirmans PsJN). The proportions of putative classically secreted proteins (CSPs) and TM proteins among the species were relatively high, up to approximately 20%. Lower proportions of putative type 3 non-classically secreted proteins (T3NCSPs) (~10%) and unclassified non-classically secreted proteins (NCSPs) (~5%) were observed. The numbers of TM proteins among the three clusters (plant pathogens, human pathogens, and endophytes) were different, while the distribution of these proteins according to the number of TM domains was conserved in which TM proteins possessing 1, 2, 4, or 12 TM domains were the dominant groups in all species. In addition, we observed conservation in the protein size distribution of the secreted protein groups among the species. There were species-specific differences in the functional characteristics of these proteins in the various groups of CSPs, T3NCSPs, and unclassified NCSPs. Furthermore, we assigned the complete sets of the conserved and unique NCSP candidates of the collected Burkholderia species using sequence similarity searching. This study could provide new insights into the relationship among plant-pathogenic, human-pathogenic, and endophytic bacteria. PMID:28381962

  17. An integrated transcriptomic and computational analysis for biomarker identification in human glioma.

    PubMed

    Xing, Wenli; Zeng, Chun

    2016-06-01

    Malignant glioma is one of the most common primary brain tumors and is among the deadliest of human cancers. The molecular mechanism of human glioma is poorly understood, and early prognosis and early treatment of this disease are vital. Thus, it is crucial to target the key genes controlling pathogenesis in the early stage of glioma. In this study, differentially expressed genes in human glioma and paired peritumoral tissues were detected by transcriptome microarray analysis. Following the microarray analysis, the gene expression profiles of gliomas of different grades were further validated by bioinformatic analyses and co-expression network construction. Microarray analysis revealed that 1725 genes were differentially expressed and could be classified by glioma stage. The analysis revealed 14 genes that were significantly associated with survival after false discovery rate correction. Among these genes, macrophage capping protein (CAPG), a member of the actin-regulatory protein family, was the key gene in a 20-gene network that modulates cell motility by interacting with the cytoskeleton. Furthermore, the prognostic impact of CAPG was validated by quantitative real-time polymerase chain reaction (qPCR) and immunohistochemistry on human glioma tissue. CAPG protein was significantly upregulated in clinical high-grade glioblastoma compared with normal brain tissues. Higher CAPG levels also predicted shorter overall survival of glioma patients. These data demonstrate that CAPG protein expression in human glioma is associated with tumorigenesis and may be a biomarker for identification of the pathological grade of glioma.

  18. A Simple Gauss-Newton Procedure for Covariance Structure Analysis with High-Level Computer Languages.

    ERIC Educational Resources Information Center

    Cudeck, Robert; And Others

    1993-01-01

    An implementation of the Gauss-Newton algorithm for the analysis of covariance structure that is specifically adapted for high-level computer languages is reviewed. This simple method for estimating structural equation models is useful for a variety of standard models, as is illustrated. (SLD)

  19. The Relationship between Internet and Computer Game Addiction Level and Shyness among High School Students

    ERIC Educational Resources Information Center

    Ayas, Tuncay

    2012-01-01

    This study is conducted to determine the relationship between the internet and computer games addiction level and the shyness among high school students. The participants of the study consist of 365 students attending high schools in Giresun city centre during 2009-2010 academic year. As a result of the study a positive, meaningful, and high…

  20. Can Synchronous Computer-Mediated Communication (CMC) Help Beginning-Level Foreign Language Learners Speak?

    ERIC Educational Resources Information Center

    Ko, Chao-Jung

    2012-01-01

    This study investigated the possibility that initial-level learners may acquire oral skills through synchronous computer-mediated communication (SCMC). Twelve Taiwanese French as a foreign language (FFL) students, divided into three groups, were required to conduct a variety of tasks in one of the three learning environments (video/audio, audio,…

  1. 24 CFR 990.165 - Computation of project expense level (PEL).

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 24 Housing and Urban Development 4 2010-04-01 2010-04-01 false Computation of project expense level (PEL). 990.165 Section 990.165 Housing and Urban Development Regulations Relating to Housing and Urban Development (Continued) OFFICE OF ASSISTANT SECRETARY FOR PUBLIC AND INDIAN HOUSING, DEPARTMENT...

  2. Evaluation of the Wider System, a New Computer-Assisted Image-Processing Device for Bacterial Identification and Susceptibility Testing

    PubMed Central

    Cantón, Rafael; Pérez-Vázquez, María; Oliver, Antonio; Sánchez Del Saz, Begoña; Gutiérrez, M. Olga; Martínez-Ferrer, Manuel; Baquero, Fernando

    2000-01-01

    The Wider system is a newly developed computer-assisted image-processing device for both bacterial identification and antimicrobial susceptibility testing. It has been adapted to be able to read and interpret commercial MicroScan panels. Two hundred forty-four fresh consecutive clinical isolates (138 isolates of the family Enterobacteriaceae, 25 nonfermentative gram-negative rods [NFGNRs], and 81 gram-positive cocci) were tested. In addition, 100 enterobacterial strains with known β-lactam resistance mechanisms (22 strains with chromosomal AmpC β-lactamase, 8 strains with chromosomal class A β-lactamase, 21 broad-spectrum and IRT β-lactamase-producing strains, 41 extended-spectrum β-lactamase-producing strains, and 8 permeability mutants) were tested. API galleries and National Committee for Clinical Laboratory Standards (NCCLS) microdilution methods were used as reference methods. The Wider system correctly identified 97.5% of the clinical isolates at the species level. Overall essential agreement (±1 log2 dilution for 3,719 organism-antimicrobial drug combinations) was 95.6% (isolates of the family Enterobacteriaceae, 96.6%; NFGNRs, 88.0%; gram-positive cocci, 95.6%). The lowest essential agreement was observed with Enterobacteriaceae versus imipenem (84.0%), NFGNR versus piperacillin (88.0%) and cefepime (88.0%), and gram-positive isolates versus penicillin (80.4%). The category error rate (NCCLS criteria) was 4.2% (2.0% very major errors, 0.6% major errors, and 1.5% minor errors). Essential agreement and interpretive error rates for eight β-lactam antibiotics against isolates of the family Enterobacteriaceae with known β-lactam resistance mechanisms were 94.8 and 5.4%, respectively. Interestingly, the very major error rate was only 0.8%. Minor errors (3.6%) were mainly observed with amoxicillin-clavulanate and cefepime against extended-spectrum β-lactamase-producing isolates. The Wider system is a new reliable tool which applies the image

  3. Automatic Identification of the Repolarization Endpoint by Computing the Dominant T-wave on a Reduced Number of Leads

    PubMed Central

    Giuliani, C.; Agostinelli, A.; Di Nardo, F.; Fioretti, S.; Burattini, L.

    2016-01-01

    Electrocardiographic (ECG) T-wave endpoint (Tend) identification suffers from a lack of reliability due to the presence of noise and variability among leads. Tend identification can be improved by using global repolarization waveforms obtained by combining several leads. The dominant T-wave (DTW) is a global repolarization waveform that proved to improve Tend identification when computed using the 15 (I to III, aVr, aVl, aVf, V1 to V6, X, Y, Z) leads usually available in clinics, of which only 8 (I, II, V1 to V6) are independent. The aim of the present study was to evaluate whether the 8 independent leads are sufficient to obtain a DTW which allows a reliable Tend identification. To this aim, Tend measures automatically identified from 15-dependent-lead DTWs of 46 control healthy subjects (CHS) and 103 acute myocardial infarction patients (AMIP) were compared with those obtained from 8-independent-lead DTWs. Results indicate that the Tend distributions do not have statistically different median values (CHS: 340 ms vs. 340 ms, respectively; AMIP: 325 ms vs. 320 ms, respectively), besides being strongly correlated (CHS: ρ=0.97, AMIP: ρ=0.88; P<10⁻²⁷). Thus, measuring Tend from the 15-dependent-lead DTWs is statistically equivalent to measuring Tend from the 8-independent-lead DTWs. In conclusion, for the clinical purpose of automatic Tend identification from DTW, the 8 independent leads can be used without a statistically significant loss of accuracy but with a significant decrement of computational effort. The lead dependence of 7 out of 15 leads does not introduce a significant bias in the Tend determination from 15-dependent-lead DTWs. PMID:27347218
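
    A common way to build a single dominant repolarization waveform from a set of leads is to take the rank-1 (first singular vector) component of the lead-by-time T-wave matrix; whether that construction matches the exact DTW definition used in this record is an assumption of the sketch below, and the 8-lead signal matrix and the 5% Tend threshold are synthetic illustrations.

        # Sketch: a dominant repolarization waveform from 8 independent leads
        # via the first singular vector, plus a crude Tend estimate. Synthetic
        # data; the record's exact DTW definition and Tend rule may differ.
        import numpy as np

        rng = np.random.default_rng(0)
        t = np.linspace(0.0, 0.4, 400)                  # 400 ms repolarization window
        template = np.exp(-((t - 0.22) ** 2) / 0.004)   # idealised T-wave shape
        amplitudes = [0.9, 1.2, 0.4, 0.7, 1.0, 1.1, 0.6, 0.8]
        leads = np.array([a * template for a in amplitudes])   # 8 x 400 matrix
        leads += 0.02 * rng.normal(size=leads.shape)            # measurement noise

        U, S, Vt = np.linalg.svd(leads, full_matrices=False)
        dominant_t_wave = S[0] * Vt[0]   # shared temporal waveform (sign is arbitrary)

        # Crude Tend estimate: last sample whose magnitude exceeds 5% of the peak.
        peak = np.max(np.abs(dominant_t_wave))
        tend_index = np.nonzero(np.abs(dominant_t_wave) > 0.05 * peak)[0][-1]
        print(f"Tend is about {t[tend_index] * 1000:.0f} ms into the window")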

  4. Cluster chemical ionization for improved confidence level in sample identification by gas chromatography/mass spectrometry.

    PubMed

    Fialkov, Alexander B; Amirav, Aviv

    2003-01-01

    Upon the supersonic expansion of helium mixed with vapor from an organic solvent (e.g. methanol), various clusters of the solvent with the sample molecules can be formed. As a result of 70 eV electron ionization of these clusters, cluster chemical ionization (cluster CI) mass spectra are obtained. These spectra are characterized by the combination of EI mass spectra of vibrationally cold molecules in the supersonic molecular beam (cold EI) with CI-like appearance of abundant protonated molecules, together with satellite peaks corresponding to protonated or non-protonated clusters of sample compounds with 1-3 solvent molecules. Like CI, cluster CI preferably occurs for polar compounds with high proton affinity. However, in contrast to conventional CI, for non-polar compounds or those with reduced proton affinity the cluster CI mass spectrum converges to that of cold EI. The appearance of a protonated molecule and its solvent cluster peaks, plus the lack of protonation and cluster satellites for prominent EI fragments, enable the unambiguous identification of the molecular ion. In turn, the insertion of the proper molecular ion into the NIST library search of the cold EI mass spectra eliminates those candidates with incorrect molecular mass and thus significantly increases the confidence level in sample identification. Furthermore, molecular mass identification is of prime importance for the analysis of unknown compounds that are absent in the library. Examples are given with emphasis on the cluster CI analysis of carbamate pesticides, high explosives and unknown samples, to demonstrate the usefulness of Supersonic GC/MS (GC/MS with supersonic molecular beam) in the analysis of these thermally labile compounds. Cluster CI is shown to be a practical ionization method, due to its ease-of-use and fast instrumental conversion between EI and cluster CI, which involves the opening of only one valve located at the make-up gas path. The ease-of-use of cluster CI is analogous

  5. Computational identification of CDR3 sequence archetypes among immunoglobulin sequences in chronic lymphocytic leukemia.

    PubMed

    Messmer, Bradley T; Raphael, Benjamin J; Aerni, Sarah J; Widhopf, George F; Rassenti, Laura Z; Gribben, John G; Kay, Neil E; Kipps, Thomas J

    2009-03-01

    The leukemia cells of unrelated patients with chronic lymphocytic leukemia (CLL) display a restricted repertoire of immunoglobulin (Ig) gene rearrangements with preferential usage of certain Ig gene segments. We developed a computational method to rigorously quantify biases in Ig sequence similarity in large patient databases and to identify groups of patients with unusual levels of sequence similarity. We applied our method to sequences from 1577 CLL patients through the CLL Research Consortium (CRC), and identified 67 similarity groups into which roughly 20% of all patients could be assigned. Immunoglobulin light chain class was highly correlated within all groups and light chain gene usage was similar within sets. Surprisingly, over 40% of the identified groups were composed of somatically mutated genes. This study significantly expands the evidence that antigen selection shapes the Ig repertoire in CLL.

  6. The Development of a Computer-Directed Training Subsystem and Computer Operator Training Material for the Air Force Phase II Base Level System. Final Report.

    ERIC Educational Resources Information Center

    System Development Corp., Santa Monica, CA.

    The design, development, and evaluation of an integrated Computer-Directed Training Subsystem (CDTS) for the Air Force Phase II Base Level System is described in this report. The development and evaluation of a course to train computer operators of the Air Force Phase II Base Level System under CDTS control is also described. Detailed test results…

  7. Identification of "Streptococcus milleri" group isolates to the species level with a commercially available rapid test system.

    PubMed

    Flynn, C E; Ruoff, K L

    1995-10-01

    Clinical isolates of the "Streptococcus milleri" species group were examined by conventional methods and a rapid, commercially available method for the identification of these strains to the species level. The levels of agreement between the identifications obtained with the commercially available system (Fluo-Card Milleri; KEY Scientific, Round Rock, Tex.) and conventional methods were 98% for 50 Streptococcus anginosus strains, 97% for 31 Streptococcus constellatus strains, and 88% for 17 isolates identified as Streptococcus intermedius. Patient records were also studied in order to gain information on the frequency and sites of isolation of each of the three "S. milleri" group species.

  8. Assessment of Social Vulnerability Identification at Local Level around Merapi Volcano - A Self Organizing Map Approach

    NASA Astrophysics Data System (ADS)

    Lee, S.; Maharani, Y. N.; Ki, S. J.

    2015-12-01

    Applying the Self-Organizing Map (SOM) to analyze social vulnerability and recognize resilience within sites is a challenging task. The aim of this study is to propose a computational method to identify sites according to their similarity and to determine the most relevant variables characterizing the social vulnerability in each cluster. For this purpose, the SOM is considered an effective platform for the analysis of high-dimensional data. By considering the cluster structure, the social vulnerability characteristics of the identified sites can be fully understood. In this study, social vulnerability is constructed from 17 variables, i.e. 12 independent variables representing socio-economic concepts and 5 dependent variables representing the damage and losses due to the Merapi eruption in 2010. These variables collectively represent the local situation of the study area, based on fieldwork conducted in September 2013. By using both independent and dependent variables, we can identify whether the social vulnerability is reflected in the actual situation, in this case the 2010 Merapi eruption. However, social vulnerability analysis of local communities involves a number of variables representing their socio-economic condition, and some of the variables employed in this study might be more or less redundant. Therefore, the SOM is used to reduce redundant variables by selecting representative variables using the component planes and the correlation coefficients between variables, in order to find the effective sample size. The selected dataset is then clustered according to similarity. Finally, this approach can produce reliable clusters, recognize the most significant variables, and could be useful for social vulnerability assessment, especially for stakeholders as decision makers. This research was supported by a grant 'Development of Advanced Volcanic Disaster Response System considering
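
    The following minimal NumPy sketch illustrates the kind of SOM training such a study relies on; the grid size, learning-rate schedule, and input standardization are assumptions of the sketch, not values taken from the paper.

```python
import numpy as np

def train_som(data, grid=(6, 6), iters=2000, lr0=0.5, sigma0=2.0, seed=0):
    """Train a small Self-Organizing Map (minimal sketch; the cited study's
    exact grid size, learning schedule, and preprocessing are assumptions).

    `data` is an (n_sites, n_variables) array of standardized indicators.
    Returns the codebook of shape (rows, cols, n_variables).
    """
    rng = np.random.default_rng(seed)
    rows, cols = grid
    codebook = rng.standard_normal((rows, cols, data.shape[1]))
    # Grid coordinates used to compute neighborhood distances on the map.
    yy, xx = np.mgrid[0:rows, 0:cols]
    coords = np.stack([yy, xx], axis=-1).astype(float)

    for step in range(iters):
        frac = step / iters
        lr = lr0 * (1.0 - frac)               # linearly decaying learning rate
        sigma = sigma0 * (1.0 - frac) + 0.5   # shrinking neighborhood radius
        x = data[rng.integers(len(data))]
        # Best-matching unit (BMU): codebook vector closest to the sample.
        dists = np.linalg.norm(codebook - x, axis=-1)
        bmu = np.unravel_index(np.argmin(dists), dists.shape)
        # Gaussian neighborhood around the BMU on the map grid.
        grid_d2 = np.sum((coords - np.array(bmu)) ** 2, axis=-1)
        h = np.exp(-grid_d2 / (2 * sigma ** 2))[..., None]
        codebook += lr * h * (x - codebook)
    return codebook

# Hypothetical usage: 50 sites described by 17 standardized indicators.
sites = np.random.default_rng(1).standard_normal((50, 17))
print(train_som(sites).shape)  # (6, 6, 17)
```

    Component planes (one slice of the codebook per variable) can then be inspected to spot redundant, highly correlated indicators, as described in the abstract.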

  9. Computing converged free energy differences between levels of theory via nonequilibrium work methods: Challenges and opportunities.

    PubMed

    Kearns, Fiona L; Hudson, Phillip S; Woodcock, Henry L; Boresch, Stefan

    2017-03-08

    We demonstrate that Jarzynski's equation can be used to reliably compute free energy differences between low and high level representations of systems. The need for such a calculation arises when employing the so-called "indirect" approach to free energy simulations with mixed quantum mechanical/molecular mechanical (QM/MM) Hamiltonians, a popular technique for circumventing extensive simulations involving quantum chemical computations. We have applied this methodology to several small and medium sized organic molecules, both in the gas phase and in explicit solvent. Test cases include several systems for which the standard approach, that is, free energy perturbation between the low and high level descriptions, fails to converge. Finally, we identify three major areas in which the difference between low and high level representations makes the calculation of ΔA_low→high difficult: bond stretching and angle bending, different preferred conformations, and the response of the MM region to the charge distribution of the QM region. © 2016 Wiley Periodicals, Inc.
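
    Jarzynski's equality states that ΔA = -kT ln⟨exp(-W/kT)⟩, where W is the nonequilibrium work of switching a configuration from the low-level to the high-level Hamiltonian. A minimal sketch of this exponential work averaging is shown below; the work values are purely illustrative, not results from the paper.

```python
import numpy as np

def jarzynski_free_energy(work_kcal, temperature=300.0):
    """Estimate ΔA (kcal/mol) from nonequilibrium work values via Jarzynski's
    equality, ΔA = -kT ln <exp(-W/kT)>, using a log-sum-exp shift for
    numerical stability. For an instantaneous low->high switch the work of a
    configuration x is simply E_high(x) - E_low(x).
    """
    k_b = 0.0019872041                      # Boltzmann constant, kcal/(mol K)
    beta = 1.0 / (k_b * temperature)
    w = np.asarray(work_kcal, dtype=float)
    shift = w.min()                         # stabilizes the exponential average
    log_avg = -beta * shift + np.log(np.mean(np.exp(-beta * (w - shift))))
    return -log_avg / beta

# Illustrative work values (kcal/mol) for 500 sampled configurations.
rng = np.random.default_rng(0)
work = rng.normal(loc=2.0, scale=0.5, size=500)
print(round(jarzynski_free_energy(work), 3))   # lies below the mean work, as expected
```

    The estimate is dominated by the low-work tail of the distribution, which is why convergence of this average is the practical challenge the paper discusses.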

  10. Identification of histamine receptors and reduction of squalene levels by an antihistamine in sebocytes.

    PubMed

    Pelle, Edward; McCarthy, James; Seltmann, Holger; Huang, Xi; Mammone, Thomas; Zouboulis, Christos C; Maes, Daniel

    2008-05-01

    Overproduction of sebum, especially during adolescence, is causally related to acne and inflammation. As a way to reduce sebum and its interference with the process of follicular keratinization in the pilosebaceous unit leading to inflammatory acne lesions, antihistamines were investigated for their effect on sebocytes, the major cell of the sebaceous gland responsible for producing sebum. Reverse transcriptase-PCR analysis and immunofluorescence of an immortalized sebocyte cell line (SZ95) revealed the presence of histamine-1 receptor (H-1 receptor), and thus indicated that histamines and, conversely, antihistamines could potentially modulate sebocyte function directly. When sebocytes were incubated with an H-1 receptor antagonist, diphenhydramine (DPH), at non-cytotoxic doses, a significant decrease in squalene levels, a biomarker for sebum, was observed. As determined by high-performance liquid chromatography, untreated sebocytes contained 6.27 (±0.73) nmol squalene per 10⁶ cells, whereas for DPH-treated cells, the levels were 2.37 (±0.24) and 2.03 (±0.97) nmol squalene per 10⁶ cells at 50 and 100 µM, respectively. These data were further substantiated by the identification of histamine receptors in human sebaceous glands. In conclusion, our data show the presence of histamine receptors on sebocytes, demonstrate how an antagonist to these receptors modulated cellular function, and may indicate a new paradigm for acne therapy involving an H-1 receptor-mediated pathway.

  11. Computational identification of conserved microRNAs and their targets from expression sequence tags of blueberry (Vaccinium corybosum).

    PubMed

    Li, Xuyan; Hou, Yanming; Zhang, Li; Zhang, Wenhao; Quan, Chen; Cui, Yuhai; Bian, Shaomin

    2014-01-01

    MicroRNAs (miRNAs) are a class of endogenous non-coding RNAs, approximately 21 nt in length, which mediate the expression of target genes primarily at post-transcriptional levels. miRNAs play critical roles in almost all plant cellular and metabolic processes. Although numerous miRNAs have been identified in the plant kingdom, the miRNAs in blueberry, which is an economically important small fruit crop, still remain totally unknown. In this study, we report a computational identification of miRNAs and their targets in blueberry. By conducting an EST-based comparative genomics approach, 9 potential vco-miRNAs were discovered from 22,402 blueberry ESTs according to a series of filtering criteria, designated as vco-miR156-5p, vco-miR156-3p, vco-miR1436, vco-miR1522, vco-miR4495, vco-miR5120, vco-miR5658, vco-miR5783, and vco-miR5986. Based on sequence complementarity between each miRNA and its target transcript, 34 target ESTs from blueberry and 70 targets from other species were identified for the vco-miRNAs. The targets were found to be involved in transcription, RNA splicing and binding, DNA duplication, signal transduction, transport and trafficking, stress response, as well as synthesis and metabolic processes. These findings will greatly contribute to future research on the functions and regulatory mechanisms of blueberry miRNAs.

  12. Level set discrete element method for three-dimensional computations with triaxial case study

    NASA Astrophysics Data System (ADS)

    Kawamoto, Reid; Andò, Edward; Viggiani, Gioacchino; Andrade, José E.

    2016-06-01

    In this paper, we outline the level set discrete element method (LS-DEM) which is a discrete element method variant able to simulate systems of particles with arbitrary shape using level set functions as a geometric basis. This unique formulation allows seamless interfacing with level set-based characterization methods as well as computational ease in contact calculations. We then apply LS-DEM to simulate two virtual triaxial specimens generated from XRCT images of experiments and demonstrate LS-DEM's ability to quantitatively capture and predict stress-strain and volume-strain behavior observed in the experiments.
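
    The key LS-DEM ingredient is that each particle's geometry is stored as a discretized level set (signed distance) function, so a contact check reduces to interpolating a neighbor's level set at a boundary node: a negative value means penetration. The 2D toy sketch below, with a circular particle and hypothetical helper names, illustrates this idea only; it is not the paper's 3D implementation.

```python
import numpy as np

def signed_distance_circle(radius, grid_pts):
    """Discretized level set (signed distance) of a circular particle,
    sampled on a square grid centered on the particle. Negative inside."""
    return np.linalg.norm(grid_pts, axis=-1) - radius

def level_set_value(phi, spacing, origin, point):
    """Bilinear interpolation of the level set grid `phi` at `point`.
    In LS-DEM a boundary node of one particle is in contact with another
    particle when that particle's interpolated level set value is negative;
    its magnitude gives the penetration depth. 2D toy version for clarity."""
    ix, iy = (point - origin) / spacing
    i0, j0 = int(np.floor(ix)), int(np.floor(iy))
    fx, fy = ix - i0, iy - j0
    v00, v10 = phi[i0, j0], phi[i0 + 1, j0]
    v01, v11 = phi[i0, j0 + 1], phi[i0 + 1, j0 + 1]
    return (v00 * (1 - fx) * (1 - fy) + v10 * fx * (1 - fy)
            + v01 * (1 - fx) * fy + v11 * fx * fy)

# Toy example: a circular particle of radius 1 on a 101x101 grid.
n, half = 101, 1.5
xs = np.linspace(-half, half, n)
grid = np.stack(np.meshgrid(xs, xs, indexing="ij"), axis=-1)
phi = signed_distance_circle(1.0, grid)
spacing, origin = xs[1] - xs[0], np.array([-half, -half])

node = np.array([0.95, 0.0])                      # node of a neighboring particle
d = level_set_value(phi, spacing, origin, node)
print(f"penetration depth: {max(0.0, -d):.3f}")   # ~0.05 -> contact
```

    The same interpolated field also supplies the contact normal (its gradient), which is what keeps LS-DEM contact calculations cheap for arbitrary shapes.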

  13. Energy Use and Power Levels in New Monitors and Personal Computers

    SciTech Connect

    Roberson, Judy A.; Homan, Gregory K.; Mahajan, Akshay; Nordman, Bruce; Webber, Carrie A.; Brown, Richard E.; McWhinney, Marla; Koomey, Jonathan G.

    2002-07-23

    Our research was conducted in support of the EPA ENERGY STAR Office Equipment program, whose goal is to reduce the amount of electricity consumed by office equipment in the U.S. The most energy-efficient models in each office equipment category are eligible for the ENERGY STAR label, which consumers can use to identify and select efficient products. As the efficiency of each category improves over time, the ENERGY STAR criteria need to be revised accordingly. The purpose of this study was to provide reliable data on the energy consumption of the newest personal computers and monitors that the EPA can use to evaluate revisions to current ENERGY STAR criteria as well as to improve the accuracy of ENERGY STAR program savings estimates. We report the results of measuring the power consumption and power management capabilities of a sample of new monitors and computers. These results will be used to improve estimates of program energy savings and carbon emission reductions, and to inform revisions of the ENERGY STAR criteria for these products. Our sample consists of 35 monitors and 26 computers manufactured between July 2000 and October 2001; it includes cathode ray tube (CRT) and liquid crystal display (LCD) monitors, Macintosh and Intel-architecture computers, desktop and laptop computers, and integrated computer systems, in which power consumption of the computer and monitor cannot be measured separately. For each machine we measured power consumption when off, on, and in each low-power level. We identify trends in and opportunities to reduce power consumption in new personal computers and monitors. Our results include a trend among monitor manufacturers to provide a single very low low-power level, well below the current ENERGY STAR criteria for sleep power consumption. These very low sleep power results mean that energy consumed when monitors are off or in active use has become more important in terms of contribution to the overall unit energy consumption (UEC

  14. Cellular Automata as a Computational Model for Low-Level Vision

    NASA Astrophysics Data System (ADS)

    Broggi, Alberto; D'Andrea, Vincenzo; Destri, Giulio

    In this paper we discuss the use of the Cellular Automata (CA) computational model in computer vision applications on massively parallel architectures. Motivations and guidelines of this approach to low-level vision within the framework of the PROMETHEUS project are discussed. The hard real-time requirements of actual applications can only be satisfied using an ad hoc VLSI massively parallel architecture (PAPRICA). The hardware solutions and the specific algorithms can be efficiently verified and tested only by using, as a simulator, a general purpose machine with a parent architecture (CM-2). An example application related to feature extraction is discussed.
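
    As an illustration of the CA model of computation applied to low-level vision, the sketch below applies a synchronous local majority rule to a binary image, a simple noise-cleaning operator. It stands in for, but is not, the PAPRICA feature-extraction algorithms discussed in the paper.

```python
import numpy as np

def ca_majority_step(image: np.ndarray) -> np.ndarray:
    """One synchronous cellular-automaton update on a binary image.

    Each pixel becomes 1 if the majority of its 3x3 neighborhood (including
    itself) is 1, which acts as a simple noise-cleaning / smoothing rule.
    This is an illustrative CA-style low-level vision operator, not the
    specific algorithms used on the PAPRICA architecture.
    """
    padded = np.pad(image, 1, mode="edge")
    # Sum of the 3x3 neighborhood of every cell, computed with shifted views.
    neigh = sum(
        padded[1 + dy : padded.shape[0] - 1 + dy, 1 + dx : padded.shape[1] - 1 + dx]
        for dy in (-1, 0, 1)
        for dx in (-1, 0, 1)
    )
    return (neigh >= 5).astype(image.dtype)

# Example: clean salt-and-pepper noise from a synthetic binary image.
rng = np.random.default_rng(0)
img = np.zeros((64, 64), dtype=np.uint8)
img[16:48, 16:48] = 1
noisy = np.where(rng.random(img.shape) < 0.05, 1 - img, img)
cleaned = noisy
for _ in range(3):          # iterate the local rule a few times
    cleaned = ca_majority_step(cleaned)
print(abs(cleaned.astype(int) - img.astype(int)).sum())   # few remaining errors
```

    Because every cell is updated from a fixed local neighborhood, rules of this kind map directly onto the massively parallel, per-pixel processing elements the paper targets.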

  15. E-Predict: a computational strategy for species identification based on observed DNA microarray hybridization patterns.

    PubMed

    Urisman, Anatoly; Fischer, Kael F; Chiu, Charles Y; Kistler, Amy L; Beck, Shoshannah; Wang, David; DeRisi, Joseph L

    2005-01-01

    DNA microarrays may be used to identify microbial species present in environmental and clinical samples. However, automated tools for reliable species identification based on observed microarray hybridization patterns are lacking. We present an algorithm, E-Predict, for microarray-based species identification. E-Predict compares observed hybridization patterns with theoretical energy profiles representing different species. We demonstrate the application of the algorithm to viral detection in a set of clinical samples and discuss its relevance to other metagenomic applications.

  16. Computational identification of altered metabolism using gene expression and metabolic pathways.

    PubMed

    Nam, Hojung; Lee, Jinwon; Lee, Doheon

    2009-07-01

    Understanding altered metabolism is an important issue because altered metabolism is often revealed as a cause or an effect in pathogenesis. It has also been shown to be an important factor in the manipulation of an organism's metabolism in metabolic engineering. Unfortunately, it is not yet possible to measure the concentration levels of all metabolites on the genome-wide scale of a metabolic network; consequently, a method that infers the alteration of metabolism is beneficial. The present study proposes a computational method that identifies genome-wide altered metabolism by analyzing functional units of KEGG pathways. As control of a metabolic pathway is accomplished by altering the activity of at least one rate-determining step enzyme, not all enzyme gene expression levels in the pathway demonstrate significant changes even if the pathway is altered. Therefore, we measure the alteration level of a metabolic pathway by selectively observing the expression levels of significantly changed genes in the pathway. The proposed method was applied to gene expression profiles of two strains of Saccharomyces cerevisiae measured in very high-gravity (VHG) fermentation. The method identified altered metabolic pathways whose properties are related to ethanol and osmotic stress responses, which are known to be observed in VHG fermentation because of the high sugar concentration in growth media and the high ethanol concentration in fermentation products. With the identified altered pathways, the proposed method achieved the best accuracy and sensitivity rates for the Red Star (RS) strain compared to three other related studies (gene-set enrichment analysis (GSEA), significance analysis of microarray to gene set (SAM-GS), reporter metabolite), and for the CEN.PK 113-7D (CEN) strain, the proposed method and the GSEA method showed comparable performance.

  17. A novel computer-aided detection system for pulmonary nodule identification in CT images

    NASA Astrophysics Data System (ADS)

    Han, Hao; Li, Lihong; Wang, Huafeng; Zhang, Hao; Moore, William; Liang, Zhengrong

    2014-03-01

    Computer-aided detection (CADe) of pulmonary nodules from computed tomography (CT) scans is critical for assisting radiologists in identifying lung lesions at an early stage. In this paper, we propose a novel approach for CADe of lung nodules using a two-stage vector quantization (VQ) scheme. The first-stage VQ aims to extract the lungs from the chest volume, while the second-stage VQ is designed to extract initial nodule candidates (INCs) within the lung volume. Rule-based expert filtering is then employed to prune obvious false positives (FPs) from the INCs, and the commonly used support vector machine (SVM) classifier is adopted to further reduce the FPs. The proposed system was validated on 100 CT scans randomly selected from the 262 scans that have at least one juxta-pleural nodule annotation in the publicly available Lung Image Database Consortium and Image Database Resource Initiative (LIDC-IDRI) database. The two-stage VQ missed only 2 of the 207 nodules at agreement level 1, and INC detection for each scan took about 30 seconds on average. Expert filtering reduced FPs more than 18-fold while maintaining a sensitivity of 93.24%. As it is trivial to distinguish INCs attached to the pleural wall from those that are not, we investigated the feasibility of training separate SVM classifiers to further reduce FPs from these two kinds of INCs. Experimental results indicated that SVM classification over the entire set of INCs was favored; at its optimal operating point, our CADe system achieved a sensitivity of 89.4% at a specificity of 86.8%.
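
    A hedged sketch of the final classification stage is shown below: an RBF-kernel SVM trained on candidate features to separate nodules from false positives, using scikit-learn and synthetic stand-in data rather than the paper's LIDC-IDRI features or tuned hyperparameters.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Hypothetical candidate features (e.g., volume, sphericity, mean intensity,
# wall-attachment flag); label 1 = true nodule, 0 = false positive. Random
# data stands in for the measurements derived from the CT scans.
rng = np.random.default_rng(0)
n_candidates = 400
features = rng.standard_normal((n_candidates, 4))
labels = (features[:, 0] + 0.5 * features[:, 1]
          + 0.3 * rng.standard_normal(n_candidates) > 0).astype(int)

# RBF-kernel SVM for false-positive reduction; the hyperparameters below are
# scikit-learn defaults, not the study's tuned values.
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0, gamma="scale"))
print(cross_val_score(clf, features, labels, cv=5).round(2))
```

    In the paper's setting, the same classifier would simply be trained once over all INCs rather than separately for wall-attached and non-attached candidates, per the reported result.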

  18. Identification of Cognitive Processes of Effective and Ineffective Students during Computer Programming

    ERIC Educational Resources Information Center

    Renumol, V. G.; Janakiram, Dharanipragada; Jayaprakash, S.

    2010-01-01

    Identifying the set of cognitive processes (CPs) a student can go through during computer programming is an interesting research problem. It can provide a better understanding of the human aspects in computer programming process and can also contribute to the computer programming education in general. The study identified the presence of a set of…

  19. Reshaping Computer Literacy Teaching in Higher Education: Identification of Critical Success Factors

    ERIC Educational Resources Information Center

    Taylor, Estelle; Goede, Roelien; Steyn, Tjaart

    2011-01-01

    Purpose: Acquiring computer skills is more important today than ever before, especially in a developing country. Teaching of computer skills, however, has to adapt to new technology. This paper aims to model factors influencing the success of the learning of computer literacy by means of an e-learning environment. The research question for this…

  20. Are accurate computations of the 13C' shielding feasible at the DFT level of theory?

    PubMed

    Vila, Jorge A; Arnautova, Yelena A; Martin, Osvaldo A; Scheraga, Harold A

    2014-02-05

    The goal of this study is twofold. First, to investigate the relative influence of the main structural factors affecting the computation of the 13C' shielding, namely, the conformation of the residue itself and the next nearest-neighbor effects. Second, to determine whether calculation of the 13C' shielding at the density functional level of theory (DFT), with an accuracy similar to that of the 13Cα shielding, is feasible with the existing computational resources. The DFT calculations, carried out for a large number of possible conformations of the tripeptide Ac-GXY-NMe, with different combinations of X and Y residues, enable us to conclude that the accurate computation of the 13C' shielding for a given residue X depends on the: (i) (ϕ,ψ) backbone torsional angles of X; (ii) side-chain conformation of X; (iii) (ϕ,ψ) torsional angles of Y; and (iv) identity of residue Y. Consequently, DFT-based quantum mechanical calculations of the 13C' shielding, with all these factors taken into account, are two orders of magnitude more CPU demanding than the computation, with similar accuracy, of the 13Cα shielding. Despite not considering the effect of the possible hydrogen bond interaction of the carbonyl oxygen, this work contributes to our general understanding of the main structural factors affecting the accurate computation of the 13C' shielding in proteins and may spur significant progress in efforts to develop new validation methods for protein structures.

  1. High-level waste storage tank farms/242-A evaporator Standards/Requirements Identification Document (S/RID), Volume 4

    SciTech Connect

    Not Available

    1994-04-01

    The High-Level Waste Storage Tank Farms/242-A Evaporator Standards/Requirements Identification Document (S/RID) is contained in multiple volumes. This document (Volume 4) presents the standards and requirements for the following sections: Radiation Protection and Operations.

  2. High-level waste storage tank farms/242-A evaporator Standards/Requirements Identification Document (S/RID)

    SciTech Connect

    Not Available

    1994-04-01

    The High-Level Waste Storage Tank Farms/242-A Evaporator Standards/Requirements Identification Document (S/RID) is contained in multiple volumes. This document (Volume 3) presents the standards and requirements for the following sections: Safeguards and Security, Engineering Design, and Maintenance.

  3. High-level waste storage tank farms/242-A evaporator Standards/Requirements Identification Document (S/RID), Volume 5

    SciTech Connect

    Not Available

    1994-04-01

    The High-Level Waste Storage Tank Farms/242-A Evaporator Standards/Requirements Identification Document (S/RID) is contained in multiple volumes. This document (Volume 5) outlines the standards and requirements for the Fire Protection and Packaging and Transportation sections.

  4. High level waste storage tank farms/242-A evaporator Standards/Requirements Identification Document (S/RID), Volume 6

    SciTech Connect

    Not Available

    1994-04-01

    The High-Level Waste Storage Tank Farms/242-A Evaporator Standards/Requirements Identification Document (S/RID) is contained in multiple volumes. This document (Volume 6) outlines the standards and requirements for the sections on: Environmental Restoration and Waste Management, Research and Development and Experimental Activities, and Nuclear Safety.

  5. Computational Psychiatry of ADHD: Neural Gain Impairments across Marrian Levels of Analysis

    PubMed Central

    Hauser, Tobias U.; Fiore, Vincenzo G.; Moutoussis, Michael; Dolan, Raymond J.

    2016-01-01

    Attention-deficit hyperactivity disorder (ADHD), one of the most common psychiatric disorders, is characterised by unstable response patterns across multiple cognitive domains. However, the neural mechanisms that explain these characteristic features remain unclear. Using a computational multilevel approach, we propose that ADHD is caused by impaired gain modulation in systems that generate this phenotypic increased behavioural variability. Using Marr's three levels of analysis as a heuristic framework, we focus on this variable behaviour, detail how it can be explained algorithmically, and how it might be implemented at a neural level through catecholamine influences on corticostriatal loops. This computational, multilevel, approach to ADHD provides a framework for bridging gaps between descriptions of neuronal activity and behaviour, and provides testable predictions about impaired mechanisms. PMID:26787097

  6. Evolving tactics using levels of intelligence in computer-generated forces

    NASA Astrophysics Data System (ADS)

    Porto, Vincent W.; Hardt, Michael; Fogel, David B.; Kreutz-Delgado, Kenneth; Fogel, Lawrence J.

    1999-06-01

    Simulated evolution on a computer can provide a means for generating appropriate tactics in real-time combat scenarios. Individual units or higher-level organizations, such as tanks and platoons, can use evolutionary computation to adapt to the current and projected situations. We briefly review current knowledge in evolutionary algorithms and offer an example of applying these techniques to generate adaptive behavior in a platoon-level engagement of tanks where the mission of one platoon is changed on the fly. We also study the effects of increasing the intelligence of one side in a one-on-one tank engagement. The results indicate that measured performance increases with increased intelligence; however, this does not always come at the expense of the opposing side.
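
    The sketch below shows the general shape of such an evolutionary loop: a population of real-valued tactic parameter vectors is mutated and selected against a fitness function. The fitness used here is a toy stand-in for the combat-simulation scoring described in the paper, and all names are hypothetical.

```python
import numpy as np

def evolve_tactics(fitness, dim=4, pop_size=30, generations=100, seed=0):
    """Minimal evolutionary-programming-style loop: each parent spawns one
    Gaussian-mutated offspring and the best half of parents plus offspring
    survives. `fitness` scores a real-valued tactic parameter vector (e.g.,
    engagement range, spacing, speed); the tank-engagement simulator of the
    paper is replaced here by whatever callable the user supplies.
    """
    rng = np.random.default_rng(seed)
    pop = rng.standard_normal((pop_size, dim))
    for _ in range(generations):
        offspring = pop + 0.3 * rng.standard_normal(pop.shape)   # mutation
        both = np.vstack([pop, offspring])
        scores = np.array([fitness(ind) for ind in both])
        pop = both[np.argsort(scores)[-pop_size:]]                # keep best
    return pop[-1]

# Toy fitness: prefer tactic parameters close to a hypothetical optimum.
target = np.array([1.0, -2.0, 0.5, 3.0])
best = evolve_tactics(lambda x: -np.sum((x - target) ** 2))
print(best.round(2))
```

    Replacing the toy fitness with a call into the engagement simulator is what lets the same loop adapt tactics on the fly when a platoon's mission changes.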

  7. Using the high-level based program interface to facilitate the large scale scientific computing.

    PubMed

    Shang, Yizi; Shang, Ling; Gao, Chuanchang; Lu, Guiming; Ye, Yuntao; Jia, Dongdong

    2014-01-01

    This paper furthers research on facilitating large-scale scientific computing on grid and desktop grid platforms. The related issues include the programming method, the overhead of middleware based on a high-level program interface, and data anticipation migration. The block-based Gauss-Jordan algorithm is used as a real example of large-scale scientific computing to evaluate the issues presented above. The results show that the high-level program interface makes complex scientific applications on a large-scale scientific platform easier to build, though a little overhead is unavoidable. Also, the data anticipation migration mechanism can improve the efficiency of a platform that needs to process big-data-based scientific applications.

  8. Using the High-Level Based Program Interface to Facilitate the Large Scale Scientific Computing

    PubMed Central

    Shang, Yizi; Shang, Ling; Gao, Chuanchang; Lu, Guiming; Ye, Yuntao; Jia, Dongdong

    2014-01-01

    This paper furthers research on facilitating large-scale scientific computing on grid and desktop grid platforms. The related issues include the programming method, the overhead of middleware based on a high-level program interface, and data anticipation migration. The block-based Gauss-Jordan algorithm is used as a real example of large-scale scientific computing to evaluate the issues presented above. The results show that the high-level program interface makes complex scientific applications on a large-scale scientific platform easier to build, though a little overhead is unavoidable. Also, the data anticipation migration mechanism can improve the efficiency of a platform that needs to process big-data-based scientific applications. PMID:24574931
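
    The block-based Gauss-Jordan algorithm referred to above inverts a matrix by sweeping over its diagonal tiles and updating the remaining tiles with dense tile operations, which is what makes it natural to distribute. The sequential NumPy sketch below shows only this block structure, not the paper's middleware, scheduling, or data migration.

```python
import numpy as np

def block_gauss_jordan_inverse(a: np.ndarray, block: int) -> np.ndarray:
    """Invert a square matrix with a block Gauss-Jordan sweep on a working copy.

    The matrix is viewed as an (nb x nb) grid of (block x block) tiles; each
    step inverts the diagonal pivot tile and updates the remaining tiles with
    dense tile operations. This sketch assumes the size is divisible by
    `block` and that every pivot tile is invertible (no block pivoting).
    """
    n = a.shape[0]
    assert n % block == 0
    nb = n // block
    a = a.copy()
    t = lambda i: slice(i * block, (i + 1) * block)   # tile index -> slice

    for k in range(nb):
        piv_inv = np.linalg.inv(a[t(k), t(k)])
        a[t(k), t(k)] = piv_inv
        for j in range(nb):
            if j != k:
                a[t(k), t(j)] = piv_inv @ a[t(k), t(j)]        # scale pivot row
        for i in range(nb):
            if i == k:
                continue
            for j in range(nb):
                if j != k:
                    a[t(i), t(j)] -= a[t(i), t(k)] @ a[t(k), t(j)]  # eliminate
            a[t(i), t(k)] = -a[t(i), t(k)] @ piv_inv
    return a

m = np.random.default_rng(0).standard_normal((8, 8)) + 8 * np.eye(8)
print(np.allclose(block_gauss_jordan_inverse(m, 2) @ m, np.eye(8)))  # True
```

    Because each update touches only a handful of tiles, the tiles can be placed on different nodes and the per-step dependencies drive the data migration that the middleware anticipates.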

  9. The Duke Personal Computer Project: A Strategy for Computing Literacy.

    ERIC Educational Resources Information Center

    Gallie, Thomas M.; And Others

    1981-01-01

    The introduction of an instructional computing strategy at Duke University that led to identification of three levels of user subsets and establishment of equipment clusters is described. Uses of the system in establishing computer literacy and special applications in chemistry, computer science, and the social sciences are reviewed. (MP)

  10. POPCYCLE: a computer code for calculating nuclear and fossil plant levelized life-cycle power costs

    SciTech Connect

    Hardie, R.W.

    1982-02-01

    POPCYCLE, a computer code designed to calculate levelized life-cycle power costs for nuclear and fossil electrical generating plants is described. Included are (1) derivations of the equations and a discussion of the methodology used by POPCYCLE, (2) a description of the input required by the code, (3) a listing of the input for a sample case, and (4) the output for a sample case.
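
    At its core, a levelized life-cycle power cost is the ratio of discounted life-cycle costs to discounted electricity generation. The sketch below shows that generic textbook form with purely illustrative numbers; POPCYCLE's actual equations include additional cost components (e.g., decommissioning, taxes, escalation) not reproduced here.

```python
def levelized_cost(capital, annual_om, annual_fuel, annual_mwh, years, discount):
    """Levelized life-cycle power cost ($/MWh) as the ratio of discounted
    life-cycle costs to discounted generation. This is the generic textbook
    form, not POPCYCLE's exact equations."""
    disc_cost = capital          # capital assumed spent at year 0
    disc_energy = 0.0
    for y in range(1, years + 1):
        df = 1.0 / (1.0 + discount) ** y
        disc_cost += (annual_om + annual_fuel) * df
        disc_energy += annual_mwh * df
    return disc_cost / disc_energy

# Illustrative inputs only (not taken from the report).
print(round(levelized_cost(capital=2.5e9, annual_om=8.0e7, annual_fuel=5.0e7,
                           annual_mwh=7.5e6, years=30, discount=0.05), 2))
```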

  11. Identification of masses in digital mammogram using gray level co-occurrence matrices.

    PubMed

    Mohd Khuzi, A; Besar, R; Wan Zaki, Wmd; Ahmad, Nn

    2009-07-01

    Digital mammography has become the most effective modality for early breast cancer detection. A digital mammogram takes an electronic image of the breast and stores it directly in a computer. The aim of this study is to develop an automated system for assisting the analysis of digital mammograms. Computer image processing techniques are applied to enhance the images, followed by segmentation of the region of interest (ROI). Subsequently, textural features are extracted from the ROI and used to classify the ROIs as either masses or non-masses. In this study, normal breast images and breast images with masses, used as the standard input to the proposed system, are taken from the Mammographic Image Analysis Society (MIAS) digital mammogram database. In the MIAS database, masses are grouped as either spiculated, circumscribed or ill-defined; additional information includes the locations of mass centres and the radii of masses. The textural features of the ROIs are extracted using gray level co-occurrence matrices (GLCM) constructed for four different directions for each ROI. The results show that the GLCMs at 0°, 45°, 90° and 135° with a block size of 8×8 give significant texture information for discriminating between mass and non-mass tissues. Analysis of the GLCM properties, i.e. contrast, energy and homogeneity, resulted in receiver operating characteristic (ROC) curve areas of Az = 0.84 for Otsu's method, Az = 0.82 for the thresholding method and Az = 0.7 for K-means clustering. An ROC curve area of 0.8-0.9 is rated as a good result. The authors' proposed method contains no complicated algorithm; the detection is based on a decision tree with five criteria to be analysed. This simplicity leads to less computational time. Thus, this approach is suitable for an automated real-time breast cancer diagnosis system.

  12. Identification of masses in digital mammogram using gray level co-occurrence matrices

    PubMed Central

    Mohd. Khuzi, A; Besar, R; Wan Zaki, WMD; Ahmad, NN

    2009-01-01

    Digital mammography has become the most effective modality for early breast cancer detection. A digital mammogram takes an electronic image of the breast and stores it directly in a computer. The aim of this study is to develop an automated system for assisting the analysis of digital mammograms. Computer image processing techniques are applied to enhance the images, followed by segmentation of the region of interest (ROI). Subsequently, textural features are extracted from the ROI and used to classify the ROIs as either masses or non-masses. In this study, normal breast images and breast images with masses, used as the standard input to the proposed system, are taken from the Mammographic Image Analysis Society (MIAS) digital mammogram database. In the MIAS database, masses are grouped as either spiculated, circumscribed or ill-defined; additional information includes the locations of mass centres and the radii of masses. The textural features of the ROIs are extracted using gray level co-occurrence matrices (GLCM) constructed for four different directions for each ROI. The results show that the GLCMs at 0°, 45°, 90° and 135° with a block size of 8×8 give significant texture information for discriminating between mass and non-mass tissues. Analysis of the GLCM properties, i.e. contrast, energy and homogeneity, resulted in receiver operating characteristic (ROC) curve areas of Az = 0.84 for Otsu's method, Az = 0.82 for the thresholding method and Az = 0.7 for K-means clustering. An ROC curve area of 0.8-0.9 is rated as a good result. The authors' proposed method contains no complicated algorithm; the detection is based on a decision tree with five criteria to be analysed. This simplicity leads to less computational time. Thus, this approach is suitable for an automated real-time breast cancer diagnosis system. PMID:21611053
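
    The sketch below computes a co-occurrence matrix for one pixel offset and the three properties the study analyses (contrast, energy, homogeneity). The quantization to 8 gray levels and the toy 8×8 ROI are assumptions of the sketch rather than details taken from the paper.

```python
import numpy as np

def glcm(roi, dx, dy, levels=8):
    """Gray level co-occurrence matrix for one pixel offset (dx, dy).
    `roi` is a 2D array already quantized to `levels` gray levels.
    The matrix is symmetrized and normalized so its entries sum to 1."""
    h, w = roi.shape
    m = np.zeros((levels, levels), dtype=float)
    for y in range(max(0, -dy), min(h, h - dy)):
        for x in range(max(0, -dx), min(w, w - dx)):
            m[roi[y, x], roi[y + dy, x + dx]] += 1
    m = m + m.T                   # make the matrix direction-symmetric
    return m / m.sum()

def glcm_features(p):
    """Contrast, energy and homogeneity of a normalized GLCM `p`, the three
    properties analysed in the study to separate masses from non-masses."""
    i, j = np.indices(p.shape)
    contrast = np.sum(p * (i - j) ** 2)
    energy = np.sum(p ** 2)
    homogeneity = np.sum(p / (1.0 + np.abs(i - j)))
    return contrast, energy, homogeneity

# Toy 8x8 ROI quantized to 8 gray levels; offsets correspond to 0° and 90°.
rng = np.random.default_rng(0)
roi = (rng.random((8, 8)) * 8).astype(int)
for name, (dx, dy) in {"0 deg": (1, 0), "90 deg": (0, 1)}.items():
    print(name, glcm_features(glcm(roi, dx, dy)))
```

    The per-direction feature triples would then feed the decision tree (or another classifier) that labels each ROI as mass or non-mass.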

  13. Computer Based Instruction in the U.S. Army’s Entry Level Enlisted Training.

    DTIC Science & Technology

    1985-03-13

    Report by Captain James A. Eldredge, Student, HQDA, MILPERCEN (DAPC-OPA-E). [Scanned report documentation page; only fragments of the abstract are legible, including "... battalion and brigade level. CAS3 is mandatory for all senior Captains ..."]

  14. Tenth Grade Students' Time Using a Computer as a Predictor of the Highest Level of Education Attempted

    ERIC Educational Resources Information Center

    Gaffey, Adam John

    2014-01-01

    As computing technology continued to grow in the lives of secondary students from 2002 to 2006, researchers failed to identify the influence using computers would have on the highest level of education students attempted. During the early part of the century schools moved towards increasing the usage of computers. Numerous stakeholders were unsure…

  15. Compiling high-level languages for configurable computers: applying lessons from heterogeneous processing

    NASA Astrophysics Data System (ADS)

    Weaver, Glen E.; Weems, Charles C.; McKinley, Kathryn S.

    1996-10-01

    Configurable systems offer increased performance by providing hardware that matches the computational structure of a problem. This hardware is currently programmed with CAD tools and explicit library calls. To attain widespread acceptance, configurable computing must become transparently accessible from high-level programming languages, but the changeable nature of the target hardware presents a major challenge to traditional compiler technology. A compiler for a configurable computer should optimize the use of functions embedded in hardware and schedule hardware reconfigurations. The hurdles to be overcome in achieving this capability are similar in some ways to those facing compilation for heterogeneous systems. For example, current traditional compilers have neither an interface to accept new primitive operators, nor a mechanism for applying optimizations to new operators. We are building a compiler for heterogeneous computing, called Scale, which replaces the traditional monolithic compiler architecture with a flexible framework. Scale has three main parts: a translation director, a compilation library, and a persistent store which holds our intermediate representation as well as other data structures. The translation director exploits the framework's flexibility by using architectural information to build a plan to direct each compilation. The translation library serves as a toolkit for use by the translation director. Our compiler intermediate representation, Score, facilitates the addition of new IR nodes by distinguishing features used in defining nodes from properties on which transformations depend. In this paper, we present an overview of the Scale architecture and its capabilities for dealing with heterogeneity, followed by a discussion of how those capabilities apply to problems in configurable computing. We then address aspects of configurable computing that are likely to require extensions to our approach and propose some extensions.

  16. Computational Identification of Mechanistic Factors That Determine the Timing and Intensity of the Inflammatory Response

    DTIC Science & Technology

    2016-05-09

    Published in PLOS Computational Biology, DOI:10.1371/journal.pcbi.1004460 (December 3, 2015). [Only page-header fragments of the abstract survived extraction, mentioning the regulation of inflammation timing and intensity indices, a receptor 1 agonist, dexamethasone, and peak height values.]

  17. Prediction of monthly regional groundwater levels through hybrid soft-computing techniques

    NASA Astrophysics Data System (ADS)

    Chang, Fi-John; Chang, Li-Chiu; Huang, Chien-Wei; Kao, I.-Feng

    2016-10-01

    Groundwater systems are intrinsically heterogeneous with dynamic temporal-spatial patterns, which cause great difficulty in quantifying their complex processes, while reliable predictions of regional groundwater levels are commonly needed for managing water resources to ensure proper service of water demands within a region. In this study, we proposed a novel and flexible soft-computing technique that could effectively extract the complex high-dimensional input-output patterns of basin-wide groundwater-aquifer systems in an adaptive manner. The soft-computing models combined the Self Organized Map (SOM) and the Nonlinear Autoregressive with Exogenous Inputs (NARX) network for predicting monthly regional groundwater levels based on hydrologic forcing data. The SOM could effectively classify the temporal-spatial patterns of regional groundwater levels, the NARX could accurately predict the mean of regional groundwater levels for adjusting the selected SOM, the Kriging was used to interpolate the predictions of the adjusted SOM into finer grids of locations, and consequently the prediction of a monthly regional groundwater level map could be obtained. The Zhuoshui River basin in Taiwan was the study case, and its monthly data sets collected from 203 groundwater stations, 32 rainfall stations and 6 flow stations during 2000 and 2013 were used for modelling purpose. The results demonstrated that the hybrid SOM-NARX model could reliably and suitably predict monthly basin-wide groundwater levels with high correlations (R2 > 0.9 in both training and testing cases). The proposed methodology presents a milestone in modelling regional environmental issues and offers an insightful and promising way to predict monthly basin-wide groundwater levels, which is beneficial to authorities for sustainable water resources management.

  18. Survey of computed tomography scanners in Taiwan: Dose descriptors, dose guidance levels, and effective doses

    SciTech Connect

    Tsai, H. Y.; Tung, C. J.; Yu, C. C.; Tyan, Y. S.

    2007-04-15

    The IAEA and the ICRP recommended dose guidance levels for the most frequent computed tomography (CT) examinations to promote strategies for the optimization of radiation dose to CT patients. A national survey, including on-site measurements and questionnaires, was conducted in Taiwan in order to establish dose guidance levels and evaluate effective doses for CT. The beam quality and output and the phantom doses were measured for nine representative CT scanners. Questionnaire forms were completed by respondents from facilities of 146 CT scanners out of 285 total scanners. Information on patient, procedure, scanner, and technique for the head and body examinations was provided. The weighted computed tomography dose index (CTDIw), the dose length product (DLP), organ doses and effective dose were calculated using measured data, questionnaire information and Monte Carlo simulation results. A cost-effective analysis was applied to derive the dose guidance levels on CTDIw and DLP for several CT examinations. The mean effective dose ± standard deviation ranges from 1.6 ± 0.9 mSv for the routine head examination to 13 ± 11 mSv for the examination of liver, spleen, and pancreas. The surveyed results and the dose guidance levels were provided to the national authorities to develop quality control standards and protocols for CT examinations.
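
    The standard relations behind these descriptors are CTDIw = (1/3)·CTDI_center + (2/3)·CTDI_periphery, CTDIvol = CTDIw / pitch, DLP = CTDIvol × scan length, and effective dose ≈ k·DLP with a region-specific conversion coefficient. The sketch below uses hypothetical chamber readings and an illustrative k value, not numbers from the survey.

```python
def ctdi_w(ctdi_center, ctdi_periphery):
    """Weighted CT dose index (mGy) from centre and peripheral phantom-chamber
    readings: CTDIw = 1/3 * centre + 2/3 * periphery."""
    return ctdi_center / 3.0 + 2.0 * ctdi_periphery / 3.0

def dose_descriptors(ctdi_center, ctdi_periphery, pitch, scan_length_cm, k):
    """Return (CTDIw, CTDIvol, DLP, effective-dose estimate).

    DLP = CTDIvol * scan length, and the effective dose is approximated as
    k * DLP with a region-specific conversion coefficient in mSv/(mGy*cm).
    The k used in the example is an illustrative literature-style figure,
    not one taken from this survey.
    """
    w = ctdi_w(ctdi_center, ctdi_periphery)
    vol = w / pitch
    dlp = vol * scan_length_cm
    return w, vol, dlp, k * dlp

# Hypothetical chest scan: centre 8 mGy, periphery 12 mGy, pitch 1.0, 30 cm.
print(dose_descriptors(8.0, 12.0, pitch=1.0, scan_length_cm=30.0, k=0.014))
```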

  19. Computation of the intervals of uncertainties about the parameters found for identification

    NASA Technical Reports Server (NTRS)

    Mereau, P.; Raymond, J.

    1982-01-01

    A modeling method to calculate the intervals of uncertainty for parameters found by identification is described. The region of confidence and the general approach to the calculation of these intervals are discussed. The general subprograms for determination of dimensions are described. They provide the organizational charts for the subprograms, the tests carried out and the listings of the different subprograms.

  20. A Comparative Study to Evaluate the Effectiveness of Computer Assisted Instruction (CAI) versus Class Room Lecture (CRL) for Computer Science at ICS Level

    ERIC Educational Resources Information Center

    Kausar, Tayyaba; Choudhry, Bushra Naoreen; Gujjar, Aijaz Ahmed

    2008-01-01

    This study was aimed to evaluate the effectiveness of CAI vs. classroom lecture for computer science at ICS level. The objectives were to compare the learning effects of two groups with class room lecture and computer assisted instruction studying the same curriculum and the effects of CAI and CRL in terms of cognitive development. Hypothesis of…

  1. A Comparative Study to Evaluate the Effectiveness of Computer Assisted Instruction (CAI) versus Class Room Lecture (CRL) for Computer Science at ICS Level

    ERIC Educational Resources Information Center

    Kausar, Tayyaba; Choudhry, Bushra Naoreen; Gujjar, Aijaz Ahmed

    2008-01-01

    This study was aimed to evaluate the effectiveness of CAI vs. classroom lecture for computer science at ICS level. The objectives were to compare the learning effects of two groups with class room lecture and computer assisted instruction studying the same curriculum and the effects of CAI and CRL in terms of cognitive development. Hypothesis of…

  2. Comparative phyloinformatics of virus genes at micro and macro levels in a distributed computing environment

    PubMed Central

    Singh, Dadabhai T; Trehan, Rahul; Schmidt, Bertil; Bretschneider, Timo

    2008-01-01

    Background Preparedness for a possible global pandemic caused by viruses such as the highly pathogenic influenza A subtype H5N1 has become a global priority. In particular, it is critical to monitor the appearance of any new emerging subtypes. Comparative phyloinformatics can be used to monitor, analyze, and possibly predict the evolution of viruses. However, in order to utilize the full functionality of available analysis packages for large-scale phyloinformatics studies, a team of computer scientists, biostatisticians and virologists is needed – a requirement which cannot be fulfilled in many cases. Furthermore, the time complexities of many algorithms involved leads to prohibitive runtimes on sequential computer platforms. This has so far hindered the use of comparative phyloinformatics as a commonly applied tool in this area. Results In this paper the graphical-oriented workflow design system called Quascade and its efficient usage for comparative phyloinformatics are presented. In particular, we focus on how this task can be effectively performed in a distributed computing environment. As a proof of concept, the designed workflows are used for the phylogenetic analysis of neuraminidase of H5N1 isolates (micro level) and influenza viruses (macro level). The results of this paper are hence twofold. Firstly, this paper demonstrates the usefulness of a graphical user interface system to design and execute complex distributed workflows for large-scale phyloinformatics studies of virus genes. Secondly, the analysis of neuraminidase on different levels of complexity provides valuable insights of this virus's tendency for geographical based clustering in the phylogenetic tree and also shows the importance of glycan sites in its molecular evolution. Conclusion The current study demonstrates the efficiency and utility of workflow systems providing a biologist friendly approach to complex biological dataset analysis using high performance computing. In particular, the

  3. Contracted basis Lanczos methods for computing numerically exact rovibrational levels of methane

    NASA Astrophysics Data System (ADS)

    Wang, Xiao-Gang; Carrington, Tucker

    2004-08-01

    We present a numerically exact calculation of rovibrational levels of a five-atom molecule. Two contracted basis Lanczos strategies are proposed. The first and preferred strategy is a two-stage contraction. Products of eigenfunctions of a four-dimensional (4D) stretch problem and eigenfunctions of 5D bend-rotation problems, one for each K, are used as basis functions for computing eigenfunctions and eigenvalues (for each K) of the Hamiltonian without the Coriolis coupling term, denoted H0. Finally, energy levels of the full Hamiltonian are calculated in a basis of the eigenfunctions of H0. The second strategy is a one-stage contraction in which energy levels of the full Hamiltonian are computed in the product contracted basis (without first computing eigenfunctions of H0). The two-stage contraction strategy, albeit more complicated, has the crucial advantage that it is trivial to parallelize the calculation so that the CPU and memory costs are independent of J. For the one-stage contraction strategy the CPU and memory costs of the difficult part of the calculation scale linearly with J. We use the polar coordinates associated with orthogonal Radau vectors and spherical harmonic type rovibrational basis functions. A parity-adapted rovibrational basis suitable for a five-atom molecule is proposed and employed to obtain bend-rotation eigenfunctions in the first step of both contraction methods. The effectiveness of the two methods is demonstrated by calculating a large number of converged J=1 rovibrational levels of methane using a global potential energy surface.

  4. Contracted basis Lanczos methods for computing numerically exact rovibrational levels of methane.

    PubMed

    Wang, Xiao-Gang; Carrington, Tucker

    2004-08-15

    We present a numerically exact calculation of rovibrational levels of a five-atom molecule. Two contracted basis Lanczos strategies are proposed. The first and preferred strategy is a two-stage contraction. Products of eigenfunctions of a four-dimensional (4D) stretch problem and eigenfunctions of 5D bend-rotation problems, one for each K, are used as basis functions for computing eigenfunctions and eigenvalues (for each K) of the Hamiltonian without the Coriolis coupling term, denoted H0. Finally, energy levels of the full Hamiltonian are calculated in a basis of the eigenfunctions of H0. The second strategy is a one-stage contraction in which energy levels of the full Hamiltonian are computed in the product contracted basis (without first computing eigenfunctions of H0). The two-stage contraction strategy, albeit more complicated, has the crucial advantage that it is trivial to parallelize the calculation so that the CPU and memory costs are independent of J. For the one-stage contraction strategy the CPU and memory costs of the difficult part of the calculation scale linearly with J. We use the polar coordinates associated with orthogonal Radau vectors and spherical harmonic type rovibrational basis functions. A parity-adapted rovibrational basis suitable for a five-atom molecule is proposed and employed to obtain bend-rotation eigenfunctions in the first step of both contraction methods. The effectiveness of the two methods is demonstrated by calculating a large number of converged J = 1 rovibrational levels of methane using a global potential energy surface.
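
    The Lanczos algorithm underlying both contraction strategies builds a small tridiagonal matrix in a Krylov basis whose extreme eigenvalues converge rapidly to those of the full Hamiltonian. The sketch below is a generic dense-matrix illustration with full reorthogonalization; the production calculations instead supply a contracted-basis Hamiltonian matrix-vector product and avoid storing all Lanczos vectors.

```python
import numpy as np

def lanczos_lowest_eigenvalues(matvec, dim, n_iter=120, n_eigs=5, seed=0):
    """Approximate the lowest eigenvalues of a symmetric operator with the
    Lanczos algorithm, using full reorthogonalization (fine at this size).

    `matvec` applies the Hamiltonian to a vector; in the paper this would be
    the contracted-basis Hamiltonian, but any symmetric matrix works here.
    """
    rng = np.random.default_rng(seed)
    q = rng.standard_normal(dim)
    q /= np.linalg.norm(q)
    qs, alphas, betas = [q], [], []
    for j in range(n_iter):
        w = matvec(qs[-1])
        alpha = qs[-1] @ w
        w = w - alpha * qs[-1] - (betas[-1] * qs[-2] if j > 0 else 0.0)
        for v in qs:                      # full reorthogonalization
            w -= (v @ w) * v
        beta = np.linalg.norm(w)
        alphas.append(alpha)
        if beta < 1e-12:
            break
        betas.append(beta)
        qs.append(w / beta)
    m = len(alphas)
    t = (np.diag(alphas) + np.diag(betas[: m - 1], 1)
                         + np.diag(betas[: m - 1], -1))
    return np.sort(np.linalg.eigvalsh(t))[:n_eigs]

# Test on a random symmetric matrix: Ritz values approximate the true spectrum.
a = np.random.default_rng(1).standard_normal((300, 300))
h = (a + a.T) / 2
print(lanczos_lowest_eigenvalues(lambda v: h @ v, 300))
print(np.sort(np.linalg.eigvalsh(h))[:5])
```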

  5. Systems Level Analysis and Identification of Pathways and Networks Associated with Liver Fibrosis

    PubMed Central

    AbdulHameed, Mohamed Diwan M.; Tawa, Gregory J.; Kumar, Kamal; Ippolito, Danielle L.; Lewis, John A.; Stallings, Jonathan D.; Wallqvist, Anders

    2014-01-01

    Toxic liver injury causes necrosis and fibrosis, which may lead to cirrhosis and liver failure. Despite recent progress in understanding the mechanism of liver fibrosis, our knowledge of the molecular-level details of this disease is still incomplete. The elucidation of networks and pathways associated with liver fibrosis can provide insight into the underlying molecular mechanisms of the disease, as well as identify potential diagnostic or prognostic biomarkers. Towards this end, we analyzed rat gene expression data from a range of chemical exposures that produced observable periportal liver fibrosis as documented in DrugMatrix, a publicly available toxicogenomics database. We identified genes relevant to liver fibrosis using standard differential expression and co-expression analyses, and then used these genes in pathway enrichment and protein-protein interaction (PPI) network analyses. We identified a PPI network module associated with liver fibrosis that includes known liver fibrosis-relevant genes, such as tissue inhibitor of metalloproteinase-1, galectin-3, connective tissue growth factor, and lipocalin-2. We also identified several new genes, such as perilipin-3, legumain, and myocilin, which were associated with liver fibrosis. We further analyzed the expression pattern of the genes in the PPI network module across a wide range of 640 chemical exposure conditions in DrugMatrix and identified early indications of liver fibrosis for carbon tetrachloride and lipopolysaccharide exposures. Although it is well known that carbon tetrachloride and lipopolysaccharide can cause liver fibrosis, our network analysis was able to link these compounds to potential fibrotic damage before histopathological changes associated with liver fibrosis appeared. These results demonstrated that our approach is capable of identifying early-stage indicators of liver fibrosis and underscore its potential to aid in predictive toxicity, biomarker identification, and to generally identify

  6. Effect of Computer Simulations at the Particulate and Macroscopic Levels on Students' Understanding of the Particulate Nature of Matter

    ERIC Educational Resources Information Center

    Tang, Hui; Abraham, Michael R.

    2016-01-01

    Computer-based simulations can help students visualize chemical representations and understand chemistry concepts, but simulations at different levels of representation may vary in effectiveness on student learning. This study investigated the influence of computer activities that simulate chemical reactions at different levels of representation…

  7. Minimal marker: an algorithm and computer program for the identification of minimal sets of discriminating DNA markers for efficient variety identification.

    PubMed

    Fujii, Hiroshi; Ogata, Tatsushi; Shimada, Takehiko; Endo, Tomoko; Iketani, Hiroyuki; Shimizu, Tokurou; Yamamoto, Toshiya; Omura, Mitsuo

    2013-04-01

    DNA markers are frequently used to analyze crop varieties, with the coded marker data summarized in a computer-generated table. Such summary tables often provide extraneous data about individual crop genotypes, needlessly complicating and prolonging DNA-based differentiation between crop varieties. At present, it is difficult to identify minimal marker sets--the smallest sets that can distinguish between all crop varieties listed in a marker-summary table--due to the absence of algorithms capable of such characterization. Here, we describe the development of just such an algorithm and MinimalMarker, its accompanying Perl-based computer program. MinimalMarker has been validated in variety identification of fruit trees using published datasets and is available for use with both dominant and co-dominant markers, regardless of the number of alleles, including SSR markers with numeric notation. We expect that this program will prove useful not only to genomics researchers but also to government agencies that use DNA markers to support a variety of food-inspection and -labeling regulations.
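
    Finding a smallest set of markers that separates every pair of varieties is a set-cover-style problem. The sketch below uses a greedy heuristic on hypothetical SSR data to illustrate the task; MinimalMarker itself is designed to return truly minimal sets, which a greedy pass does not guarantee.

```python
from itertools import combinations

def greedy_minimal_markers(genotypes):
    """Pick a small set of markers that distinguishes every pair of varieties.

    `genotypes` maps variety name -> list of marker genotypes (any hashable
    coding, dominant or co-dominant). The greedy heuristic repeatedly adds
    the marker separating the most not-yet-separated pairs; unlike
    MinimalMarker, it does not guarantee a minimal set.
    """
    varieties = list(genotypes)
    n_markers = len(next(iter(genotypes.values())))
    pairs = {p for p in combinations(varieties, 2)
             if genotypes[p[0]] != genotypes[p[1]]}   # only separable pairs
    chosen = []
    while pairs:
        best = max(range(n_markers),
                   key=lambda m: sum(genotypes[a][m] != genotypes[b][m]
                                     for a, b in pairs))
        chosen.append(best)
        pairs = {(a, b) for a, b in pairs
                 if genotypes[a][best] == genotypes[b][best]}
    return chosen

# Hypothetical SSR allele calls: 4 varieties typed at 4 markers.
data = {"VarietyA": ["180/184", "200/200", "150/156", "172/176"],
        "VarietyB": ["180/188", "200/204", "150/156", "172/176"],
        "VarietyC": ["184/184", "200/204", "150/150", "172/176"],
        "VarietyD": ["180/184", "200/204", "150/150", "172/180"]}
print(greedy_minimal_markers(data))   # indices of the selected markers, e.g. [0, 1]
```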

  8. Zebra tape identification for the instantaneous angular speed computation and angular resampling of motorbike valve train measurements

    NASA Astrophysics Data System (ADS)

    Rivola, Alessandro; Troncossi, Marco

    2014-02-01

    An experimental test campaign was performed on the valve train of a racing motorbike engine in order to gain insight into the dynamics of the system. In particular, the valve motion was acquired in cold test conditions by means of a laser vibrometer able to acquire displacement and velocity signals. The valve time-dependent measurements needed to be referred to the camshaft angular position in order to analyse the data in the angular domain, as usually done for rotating machines. For this purpose the camshaft was fitted with a zebra tape whose dark and light stripes were tracked by means of an optical probe. Unfortunately, both manufacturing and mounting imperfections of the employed zebra tape, resulting in stripes with slightly different widths, precluded the possibility of directly obtaining the correct relationship between camshaft angular position and time. In order to overcome this problem, the identification of the zebra tape was performed by means of an original and practical procedure that is the focus of the present paper. The method consists of three main steps: an ad hoc test corresponding to special operating conditions, the computation of the instantaneous angular speed, and the final association of the stripes with the corresponding shaft angular position. The results reported in the paper demonstrate the suitability of the simple procedure for zebra tape identification, performed with the final purpose of implementing a computed order tracking technique for the data analysis.
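
    Once each stripe's angular width has been identified, the instantaneous angular speed follows from the width divided by the time between the stripe's edges, and the angle-versus-time relation can be interpolated to resample signals at constant angular increments (computed order tracking). The sketch below illustrates both steps with an idealized tape; the identification procedure itself is not reproduced and all data are synthetic.

```python
import numpy as np

def instantaneous_angular_speed(edge_times, stripe_angles_deg):
    """Instantaneous angular speed (rev/s) between successive zebra-tape edges.

    `edge_times` holds the arrival times of the stripe edges (one more entry
    than there are stripes) and `stripe_angles_deg` the identified angular
    width of each stripe; the paper's identification step supplies these
    widths, since a real tape's stripes are not perfectly equal.
    """
    dt = np.diff(edge_times)
    dphi = np.asarray(stripe_angles_deg)[: len(dt)] / 360.0   # revolutions
    return dphi / dt

def angular_resample(signal_t, signal, edge_times, stripe_angles_deg, n_per_rev):
    """Resample a time signal at evenly spaced shaft angles (computed order
    tracking): the shaft angle is known at every edge time, so interpolation
    gives the times at which uniform angle increments occur."""
    angles_at_edges = np.concatenate(([0.0], np.cumsum(stripe_angles_deg)))
    target_angles = np.arange(0.0, angles_at_edges[-1], 360.0 / n_per_rev)
    target_times = np.interp(target_angles, angles_at_edges, edge_times)
    return np.interp(target_times, signal_t, signal)

# Toy data: two revolutions of an ideal 36-stripe tape (10 deg per stripe)
# on a shaft that slowly accelerates.
dt_per_stripe = 0.002 * np.linspace(1.0, 0.8, 72)        # seconds per stripe
edges = np.concatenate(([0.0], np.cumsum(dt_per_stripe)))
widths = np.full(72, 10.0)

ias = instantaneous_angular_speed(edges, widths)
print(ias[:3].round(1), ias[-3:].round(1))                # rev/s increases

t = np.linspace(0.0, edges[-1], 4000)
resampled = angular_resample(t, np.sin(2 * np.pi * 300.0 * t), edges, widths, 256)
print(len(resampled))                                     # 512 samples = 2 revs
```

    In the study, the resampling step would be applied to the vibrometer displacement and velocity signals once the stripe widths have been identified.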

  9. New Fe i Level Energies and Line Identifications from Stellar Spectra. II. Initial Results from New Ultraviolet Spectra of Metal-poor Stars

    NASA Astrophysics Data System (ADS)

    Peterson, Ruth C.; Kurucz, Robert L.; Ayres, Thomas R.

    2017-04-01

    The Fe i spectrum is critical to many areas of astrophysics, yet many of the high-lying levels remain uncharacterized. To remedy this deficiency, Peterson & Kurucz identified Fe i lines in archival ultraviolet and optical spectra of metal-poor stars, whose warm temperatures favor moderate Fe i excitation. Sixty-five new levels were recovered, with 1500 detectable lines, including several bound levels in the ionization continuum of Fe i. Here, we extend the previous work by identifying 59 additional levels, with 1400 detectable lines, by incorporating new high-resolution UV spectra of warm metal-poor stars recently obtained by the Hubble Space Telescope Imaging Spectrograph. We provide gf values for these transitions, both computed as well as adjusted to fit the stellar spectra. We also expand our spectral calculations to the infrared, confirming three levels by matching high-quality spectra of the Sun and two cool stars in the H-band. The predicted gf values suggest that an additional 3700 Fe i lines should be detectable in existing solar infrared spectra. Extending the empirical line identification work to the infrared would help confirm additional Fe i levels, as would new high-resolution UV spectra of metal-poor turnoff stars below 1900 Å.

  10. Dose Assessment in Computed Tomography Examination and Establishment of Local Diagnostic Reference Levels in Mazandaran, Iran

    PubMed Central

    Janbabanezhad Toori, A.; Shabestani-Monfared, A.; Deevband, M.R.; Abdi, R.; Nabahati, M.

    2015-01-01

    Background Medical X-rays are the largest man-made source of public exposure to ionizing radiation. While the benefits of Computed Tomography (CT) for accurate diagnosis are well known, those benefits are not risk-free. CT is a modality with a higher patient dose in comparison with other conventional radiological procedures. Objective This study is aimed at evaluating the radiation dose to patients from Computed Tomography (CT) examinations in Mazandaran hospitals and defining local diagnostic reference levels (DRLs). Methods Patient-related data on the CT protocols for four common CT examinations, including brain, sinus, chest and abdomen & pelvis, were collected. In each center, Computed Tomography Dose Index (CTDI) measurements were performed using a pencil ionization chamber and CT dosimetry phantoms according to AAPM report No. 96 for those techniques. Then, the Weighted Computed Tomography Dose Index (CTDIw), Volume Computed Tomography Dose Index (CTDIvol) and Dose Length Product (DLP) were calculated. Results The CTDIw for brain, sinus, chest and abdomen & pelvis ranged over (15.6-73), (3.8-25.8), (4.5-16.3) and (7-16.3), respectively. Values of DLP had a range of (197.4-981), (41.8-184), (131-342.3) and (283.6-486) for brain, sinus, chest and abdomen & pelvis, respectively. The 3rd quartile of the CTDIw distribution for each examination is the proposed quantity for the DRL. The DRLs for brain, sinus, chest and abdomen & pelvis are 59.5, 17, 7.8 and 11 mGy, respectively. Conclusion The results of this study demonstrated wide variations in dose for the same examination among different centers. For all examinations, our values were lower than international reference doses. PMID:26688796

  11. A Framework for Different Levels of Integration of Computational Models Into Web-Based Virtual Patients

    PubMed Central

    Narracott, Andrew J; Manini, Simone; Bayley, Martin J; Lawford, Patricia V; McCormack, Keith; Zary, Nabil

    2014-01-01

    Background Virtual patients are increasingly common tools used in health care education to foster the learning of clinical reasoning skills. One potential way to expand their functionality is to augment virtual patients’ interactivity by enriching them with computational models of physiological and pathological processes. Objective The primary goal of this paper was to propose a conceptual framework for the integration of computational models within virtual patients, with particular focus on (1) characteristics to be addressed while preparing the integration, (2) the extent of the integration, (3) strategies to achieve integration, and (4) methods for evaluating the feasibility of integration. An additional goal was to pilot the first investigation of how changing framework variables alters perceptions of integration. Methods The framework was constructed using an iterative process informed by Soft Systems Methodology. The Virtual Physiological Human (VPH) initiative was used as a source of new computational models. The technical challenges associated with the development of virtual patients enhanced by computational models are discussed from the perspectives of a number of different stakeholders. Concrete design and evaluation steps are discussed in the context of an exemplar virtual patient employing the results of the VPH ARCH project, as well as improvements for future iterations. Results The proposed framework consists of four main elements. The first element is a list of feasibility features characterizing the integration process from three perspectives: the computational modelling researcher, the health care educationalist, and the virtual patient system developer. The second element includes three integration levels: basic, where a single set of simulation outcomes is generated for specific nodes in the activity graph; intermediate, involving pre-generation of simulation datasets over a range of input parameters; and advanced, including dynamic solution of the

  12. K-nearest neighbors based methods for identification of different gear crack levels under different motor speeds and loads: Revisited

    NASA Astrophysics Data System (ADS)

    Wang, Dong

    2016-03-01

    Gears are the most commonly used components in mechanical transmission systems. Their failures may cause transmission system breakdown and result in economic loss. Identification of different gear crack levels is important to prevent unexpected gear failure, because gear cracks lead to gear tooth breakage. Signal processing based methods mainly require expertise to interpret gear fault signatures, which ordinary users usually cannot easily provide. In order to automatically identify different gear crack levels, intelligent gear crack identification methods should be developed. Previous case studies experimentally showed that K-nearest neighbors based methods exhibit high prediction accuracies for identification of 3 different gear crack levels under different motor speeds and loads. In this short communication, to further enhance the prediction accuracies of existing K-nearest neighbors based methods and to extend identification from 3 to 5 different gear crack levels, redundant statistical features are constructed using the Daubechies 44 (db44) binary wavelet packet transform at different wavelet decomposition levels, prior to the use of a K-nearest neighbors method. The dimensionality of the redundant statistical features is 620, which provides richer gear fault signatures. Since many of these statistical features are redundant and highly correlated with each other, dimensionality reduction of the redundant statistical features is conducted to obtain new significant statistical features. Finally, the K-nearest neighbors method is used to identify 5 different gear crack levels under different motor speeds and loads. A case study including 3 experiments is investigated to demonstrate that the developed method provides higher prediction accuracies than the existing K-nearest neighbors based methods for recognizing different gear crack levels under different motor speeds and loads. Based on the new significant statistical
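
    As an illustration of the pipeline described above, the following sketch extracts statistical features from wavelet packet nodes and feeds them to a K-nearest neighbors classifier. It is not the authors' code: 'db20' stands in for db44 (which may not be available in PyWavelets), the data arrays are random placeholders, and the dimensionality-reduction step is omitted.

```python
import numpy as np
import pywt
from scipy.stats import kurtosis, skew
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import cross_val_score

def wavelet_packet_features(signal, wavelet="db20", level=3):
    """Statistical features from every terminal node of a wavelet packet tree."""
    wp = pywt.WaveletPacket(data=signal, wavelet=wavelet, maxlevel=level)
    feats = []
    for node in wp.get_level(level, order="natural"):
        c = node.data
        feats += [np.mean(c), np.std(c), np.max(np.abs(c)),
                  skew(c), kurtosis(c), np.sqrt(np.mean(c ** 2))]
    return np.array(feats)

# segments: (n_samples, n_points) vibration snapshots; labels: crack level 0..4
segments = np.random.randn(100, 2048)          # placeholder data
labels = np.random.randint(0, 5, size=100)     # placeholder labels
X = np.vstack([wavelet_packet_features(s) for s in segments])

knn = KNeighborsClassifier(n_neighbors=5)
print(cross_val_score(knn, X, labels, cv=5).mean())
```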

  13. MGEScan-non-LTR: computational identification and classification of autonomous non-LTR retrotransposons in eukaryotic genomes

    PubMed Central

    Rho, Mina; Tang, Haixu

    2009-01-01

    Computational methods for genome-wide identification of mobile genetic elements (MGEs) have become increasingly necessary for both genome annotation and evolutionary studies. Non-long terminal repeat (non-LTR) retrotransposons are a class of MGEs that have been found in most eukaryotic genomes, sometimes in extremely high numbers. In this article, we present a computational tool, MGEScan-non-LTR, for the identification of non-LTR retrotransposons in genomic sequences, following a computational approach inspired by a generalized hidden Markov model (GHMM). Three different states represent two different protein domains and inter-domain linker regions encoded in the non-LTR retrotransposons, and their scores are evaluated by using profile hidden Markov models (for protein domains) and Gaussian Bayes classifiers (for linker regions), respectively. In order to classify the non-LTR retrotransposons into one of the 12 previously characterized clades using the same model, we defined separate states for different clades. MGEScan-non-LTR was tested on the genome sequences of four eukaryotic organisms, Drosophila melanogaster, Daphnia pulex, Ciona intestinalis and Strongylocentrotus purpuratus. For the D. melanogaster genome, MGEScan-non-LTR found all known ‘full-length’ elements and simultaneously classified them into the clades CR1, I, Jockey, LOA and R1. Notably, for the D. pulex genome, in which no non-LTR retrotransposon has been annotated, MGEScan-non-LTR found a significantly larger number of elements than did RepeatMasker, using the current version of the RepBase Update library. We also identified novel elements in the other two genomes, which have only been partially studied for non-LTR retrotransposons. PMID:19762481

  14. The Effects of Linear Microphone Array Changes on Computed Sound Exposure Level Footprints

    NASA Technical Reports Server (NTRS)

    Mueller, Arnold W.; Wilson, Mark R.

    1997-01-01

    Airport land planning commissions often are faced with determining how much area around an airport is affected by the sound exposure levels (SELs) associated with helicopter operations. This paper presents a study of the effects that changing the size and composition of a microphone array has on the computed SEL contour (ground footprint) areas used by such commissions. Descent flight acoustic data measured by a fifteen-microphone array were reprocessed for five different combinations of microphones within this array. This resulted in data for six different arrays, for which SEL contours were computed. The fifteen-microphone array was defined as the 'baseline' array since it contained the greatest amount of data. The computations used a newly developed technique, the Acoustic Re-propagation Technique (ART), which uses parts of the NASA noise prediction program ROTONET. After the areas of the SEL contours were calculated, the differences between the areas were determined. The area differences for the six arrays show that a five-microphone and a three-microphone array (with spacing typical of that required by the FAA FAR Part 36 noise certification procedure) compare well with the fifteen-microphone array. All data were obtained from a database resulting from a joint project conducted by NASA and U.S. Army researchers at Langley and Ames Research Centers. A brief description of the joint project test design, microphone array set-up, and data reduction methodology associated with the database is given.
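
    The sound exposure level itself follows from the standard definition SEL = 10 log10(∫ p²(t) dt / (p_ref² t_ref)) with p_ref = 20 µPa and t_ref = 1 s. The short sketch below, with a synthetic pressure trace in place of measured fly-over data, illustrates that computation only; it is not the ART/ROTONET processing chain.

```python
import numpy as np

P_REF = 20e-6   # reference pressure, Pa
T_REF = 1.0     # reference duration, s

def sound_exposure_level(pressure, fs):
    """SEL (dB) from a sound-pressure time history sampled at fs Hz."""
    exposure = np.trapz(pressure ** 2, dx=1.0 / fs)     # integral of p^2, Pa^2*s
    return 10.0 * np.log10(exposure / (P_REF ** 2 * T_REF))

# Example: a 2-second synthetic fly-over-like burst at 1 kHz
fs = 44100
t = np.arange(0, 2.0, 1.0 / fs)
p = 0.5 * np.exp(-((t - 1.0) ** 2) / 0.1) * np.sin(2 * np.pi * 1000 * t)
print(f"SEL = {sound_exposure_level(p, fs):.1f} dB")
```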

  15. French diagnostic reference levels in diagnostic radiology, computed tomography and nuclear medicine: 2004-2008 review.

    PubMed

    Roch, P; Aubert, B

    2013-04-01

    After 5 y of collecting data on diagnostic reference levels (DRLs), the French Institute for Radiation Protection and Nuclear Safety (IRSN) presents an analysis of these data. The analysis of the data collected for radiology, computed tomography (CT) and nuclear medicine allows IRSN to estimate the level of regulatory compliance by health professionals and the representativeness of the current DRLs in terms of relevant examinations, dosimetric quantities, numerical values and patient morphologies. Since 2004, the involvement of professionals has increased considerably, especially in nuclear medicine, followed by CT and then radiology. The analyses show some discordance between the regulatory examinations and clinical practice. Some of the dosimetric quantities used for setting DRLs are insufficient or not relevant enough, and some numerical values should also be reviewed. On the basis of these findings, IRSN formulates recommendations to update the regulatory DRLs with current and relevant examination lists, dosimetric quantities and numerical values.

  16. Establishment of multi-slice computed tomography (MSCT) reference level in Johor, Malaysia

    NASA Astrophysics Data System (ADS)

    Karim, M. K. A.; Hashim, S.; Bakar, K. A.; Muhammad, H.; Sabarudin, A.; Ang, W. C.; Bahruddin, N. A.

    2016-03-01

    Radiation doses from computed tomography (CT) are the highest and most hazardous among diagnostic imaging modalities. This study aimed to evaluate the radiation dose delivered to patients in Johor, Malaysia during CT examinations of the brain, chest and abdomen, and to establish local diagnostic reference levels (DRLs) for current, state-of-the-art multi-slice CT scanners. Survey forms were sent to five centres performing CT to obtain data on acquisition parameters as well as dose information from the CT consoles. CT-EXPO (Version 2.3.1, Germany) was used to validate the dose information. The proposed DRLs were obtained by rounding the third quartiles of the whole dose distributions; the resulting values of CTDIw (mGy), CTDIvol (mGy) and DLP (mGy.cm), which were comparable with other reference levels, were 63, 63 and 1015, respectively, for CT brain; 15, 14 and 450, respectively, for CT thorax; and 16, 17 and 590, respectively, for CT abdomen. The study revealed that CT practice and dose output are changing rapidly and must keep pace with the introduction of new technology. We suggest that CTDIvol should be included in the current national DRLs, as modern CT scanners are configured with a higher number of detectors and CTDIvol takes the pitch factor into account.

  17. High-level ab initio computations of the absorption spectra of organic iridium complexes.

    PubMed

    Plasser, Felix; Dreuw, Andreas

    2015-02-12

    The excited states of fac-tris(phenylpyridinato)iridium [Ir(ppy)3] and the smaller model complex Ir(C3H4N)3 are computed using a number of high-level ab initio methods, including the recently implemented third-order algebraic diagrammatic construction method, ADC(3). A detailed description of the states is provided through advanced analysis methods, which allow a quantification of different charge-transfer and orbital-relaxation effects and give extended insight into the many-body wave functions. Compared to the ADC(3) benchmark, an unexpectedly striking deviation of ADC(2) is found for Ir(C3H4N)3, which derives from an overstabilization of charge-transfer effects. Time-dependent density functional theory (TDDFT) using the B3LYP functional shows an analogous but less severe error for charge-transfer states, whereas the ωB97 results are in good agreement with ADC(3). Multireference configuration interaction computations, which are in reasonable agreement with ADC(3), reveal that static correlation does not play a significant role. In the case of the larger Ir(ppy)3 complex, results at the TDDFT/B3LYP and TDDFT/ωB97 levels of theory are presented. Strong discrepancies between the two functionals with respect to the energies, characters, and density of the low-lying states are discussed in detail and compared to experiment.

  18. Derivation of Australian diagnostic reference levels for paediatric multi detector computed tomography.

    PubMed

    Hayton, Anna; Wallace, Anthony

    2016-09-01

    Australian National Diagnostic Reference Levels for paediatric multi-detector computed tomography were established by the Australian Radiation Protection and Nuclear Safety Agency (ARPANSA) in 2012 for three protocols (Head, Chest and AbdoPelvis) across two age groups (Baby/Infant, 0-4 years, and Child, 5-14 years). The establishment of Australian paediatric DRLs is an important step towards lowering patient CT doses on a national scale. While the adult DRLs were calculated with data collected from the web-based Australian National Diagnostic Reference Level Service, no paediatric data were submitted in the first year of service operation. Data from an independent Royal Australian and New Zealand College of Radiologists Quality Use of Diagnostic Imaging paediatric optimisation survey were used instead. The paediatric DRLs were defined for CTDIvol (mGy) and DLP (mGy·cm) values referenced to the 16 cm PMMA phantom for the Head protocol and the 32 cm PMMA phantom for the body protocols, for both paediatric age groups. The Australian paediatric DRLs for multi-detector computed tomography are, for the Head, Chest and AbdoPelvis protocols respectively, 470, 60 and 170 mGy·cm for the Baby/Infant age group and 600, 110 and 390 mGy·cm for the Child age group. A comparison with published international paediatric DRLs for computed tomography reveals the Australian paediatric DRLs to be lower on average. However, the comparison is complicated by the misalignment of the defined age ranges. It is the intention of ARPANSA to review the paediatric DRLs in conjunction with a review of the adult DRLs, which should occur within 5 years of their publication.

  19. Evaluation of Staf-Sistem 18-R for identification of staphylococcal clinical isolates to the species level.

    PubMed Central

    Piccolomini, R; Catamo, G; Picciani, C; D'Antonio, D

    1994-01-01

    The accuracy and efficiency of Staf-Sistem 18-R (Liofilchem s.r.l., Roseto degli Abruzzi, Teramo, Italy) were compared with those of conventional biochemical methods for identifying 523 strains belonging to 16 different human Staphylococcus species. Overall, 491 strains (93.9%) were correctly identified (percentage of identification ≥90.0%), with 28 (5.4%) requiring supplementary tests for complete identification. For 14 isolates (2.8%), the strains did not correspond to any key in the codebook and could not be identified by the manufacturer's computer service. Only 18 isolates (3.4%) were misidentified. The system is simple to use, easy to handle, gives highly reproducible results, and is inexpensive. With the inclusion of more discriminating tests and adjustment of the supplementary code numbers for some species, such as Staphylococcus lugdunensis and Staphylococcus schleiferi, Staf-Sistem 18-R is a suitable alternative for the identification of human coagulase-positive and coagulase-negative Staphylococcus species in microbiological laboratories. PMID:8195373

  20. The utility of including pathology reports in improving the computational identification of patients

    PubMed Central

    Chen, Wei; Huang, Yungui; Boyle, Brendan; Lin, Simon

    2016-01-01

    Background: Celiac disease (CD) is a common autoimmune disorder. Efficient identification of patients may improve chronic management of the disease. Prior studies have shown that searching International Classification of Diseases-9 (ICD-9) codes alone is inaccurate for identifying patients with CD. In this study, we developed automated classification algorithms leveraging pathology reports and other clinical data in Electronic Health Records (EHRs) to refine the subset population preselected using the ICD-9 code (579.0). Materials and Methods: EHRs were searched for the established ICD-9 code (579.0) suggesting CD, based on which an initial identification of cases was obtained. In addition, laboratory results for tissue transglutaminase were extracted. Using natural language processing, we analyzed pathology reports from upper endoscopy. Twelve machine learning classifiers using different combinations of variables related to ICD-9 CD status, laboratory result status, and pathology reports were evaluated to find the best possible CD classifier. Ten-fold cross-validation was used to assess the results. Results: A total of 1498 patient records were used, including 363 confirmed cases and 1135 false positive cases that served as controls. A logistic regression model based on both clinical and pathology report features produced the best results: a kappa of 0.78, an F1 of 0.92, and an area under the curve (AUC) of 0.94, whereas using ICD-9 codes alone generated poor results: a kappa of 0.28, an F1 of 0.75, and an AUC of 0.63. Conclusion: Our automated classification system presents an efficient and reliable way to improve the performance of CD patient identification. PMID:27994938
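
    A minimal sketch of the final classification step is given below. The three binary features and the random labels are placeholders for the study's ICD-9, laboratory, and NLP-derived pathology variables; only the logistic-regression-with-10-fold-cross-validation pattern is meant to carry over.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_validate

rng = np.random.default_rng(0)
n = 1498
X = np.column_stack([
    rng.integers(0, 2, n),      # ICD-9 579.0 present
    rng.integers(0, 2, n),      # tTG lab result positive
    rng.integers(0, 2, n),      # pathology report flag from NLP (hypothetical)
])
y = rng.integers(0, 2, n)       # confirmed CD status (placeholder labels)

clf = LogisticRegression()
scores = cross_validate(clf, X, y, cv=10, scoring=("f1", "roc_auc"))
print("F1  =", scores["test_f1"].mean())
print("AUC =", scores["test_roc_auc"].mean())
```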

  1. EUDOC: a computer program for identification of drug interaction sites in macromolecules and drug leads from chemical databases.

    PubMed

    Pang, Yuan-Ping; Perola, Emanuele; Xu, Kun; Prendergast, Franklyn G.

    2001-11-30

    The completion of the Human Genome Project, the growing effort on proteomics, and the Structural Genomics Initiative have recently intensified the attention being paid to reliable computer docking programs able to identify molecules that can affect the function of a macromolecule through molecular complexation. We report herein an automated computer docking program, EUDOC, for prediction of ligand-receptor complexes from 3D receptor structures, including metalloproteins, and for identification of a subset enriched in drug leads from chemical databases. This program was evaluated from the standpoints of force field and sampling issues using 154 experimentally determined ligand-receptor complexes and four "real-life" applications of the EUDOC program. The results provide evidence for the reliability and accuracy of the EUDOC program. In addition, key principles underlying molecular recognition, and the effects of structural water molecules in the active site and different atomic charge models on docking results are discussed. Copyright 2001 John Wiley & Sons, Inc. J Comput Chem 22: 1750-1771, 2001

  2. 3D Multislice and Cone-beam Computed Tomography Systems for Dental Identification.

    PubMed

    Eliášová, Hana; Dostálová, Taťjana

    2017-01-01

    3D multislice and cone-beam computed tomography (CBCT) in forensic odontology have been shown to be useful not only for cases involving one or a few deceased individuals but also in multiple-fatality incidents. The use of 3D multislice and cone-beam computed tomography and digital radiography was demonstrated in forensic examinations. 3D images of the skull and teeth were analysed and validated for long ante mortem/post mortem intervals. Image acquisition was instantaneous; the images could be optically enlarged, measured, superimposed and compared prima vista or using special software, and exported as files. Digital radiology and computed tomography have been shown to be important both in common criminalistics practice and in multiple-fatality incidents. Our study demonstrated that CBCT imaging offers fewer image artifacts, short image reconstruction times, mobility of the unit and considerably lower equipment cost.

  3. Power levels in office equipment: Measurements of new monitors and personal computers

    SciTech Connect

    Roberson, Judy A.; Brown, Richard E.; Nordman, Bruce; Webber, Carrie A.; Homan, Gregory H.; Mahajan, Akshay; McWhinney, Marla; Koomey, Jonathan G.

    2002-05-14

    Electronic office equipment has proliferated rapidly over the last twenty years and is projected to continue growing in the future. Efforts to reduce the growth in office equipment energy use have focused on power management to reduce the power consumption of electronic devices when they are not being used for their primary purpose. The EPA ENERGY STAR® program has been instrumental in gaining widespread support for power management in office equipment, and accurate information about the energy used by office equipment at all power levels is important to improving program design and evaluation. This paper presents the results of a field study conducted during 2001 to measure the power levels of new monitors and personal computers. We measured off, on, and low-power levels in about 60 units manufactured since July 2000. The paper summarizes the power data collected, explores differences within the sample (e.g., between CRT and LCD monitors), and discusses some issues that arise in metering office equipment. We also present conclusions to help improve the success of future power management programs. Our findings include a trend among monitor manufacturers to provide a single, very low low-power level, and the need to standardize methods for measuring monitor on power in order to more accurately estimate the annual energy consumption of office equipment, as well as the actual and potential energy savings from power management.

  4. TPASS: a gamma-ray spectrum analysis and isotope identification computer code

    SciTech Connect

    Dickens, J.K.

    1981-03-01

    The gamma-ray spectral data-reduction and analysis computer code TPASS is described. This computer code is used to analyze complex Ge(Li) gamma-ray spectra to obtain peak areas corrected for detector efficiencies, from which are determined gamma-ray yields. These yields are compared with an isotope gamma-ray data file to determine the contributions to the observed spectrum from decay of specific radionuclides. A complete FORTRAN listing of the code and a complex test case are given.

  5. Identification of phytophthora isolates to species level using restriction fragment length polymorphism analysis of a polymerase chain reaction-amplified region of mitochondrial DNA.

    PubMed

    Martin, Frank N; Tooley, Paul W

    2004-09-01

    Polymerase chain reaction primers spanning the mitochondrially encoded coxI and coxII genes have been identified that are capable of amplifying target DNA from all 152 isolates of 31 species in the genus Phytophthora that were tested. Digestion of the amplicons with restriction enzymes generated species-specific restriction fragment length polymorphism banding profiles that were effective for classifying isolates to the species level. Of the 24 species for which multiple isolates were examined, intraspecific polymorphisms were not observed for 16 species, while 5 species exhibited limited intraspecific polymorphism that could be explained by the addition or loss of a single restriction site. Intraspecific polymorphisms were observed for P. megakarya, P. megasperma, and P. syringae; however, these differences may be a reflection of the variation that exists in these species as reported in the literature. Although digestion with AluI alone could differentiate most species tested, single digests with a total of four restriction enzymes were used in this investigation to enhance the accuracy of the technique and minimize the effect of intraspecific variability on correct isolate identification. The use of the computer program BioNumerics simplified data analysis and isolate identification. Successful template amplification was obtained with DNA recovered from hyphae using a boiling miniprep procedure, thereby reducing the time and materials needed for conducting this analysis.
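
    The in-silico side of such an analysis reduces to predicting fragment lengths from enzyme recognition sites. The toy function below digests a made-up amplicon with AluI (recognition site AG^CT); it illustrates the principle only and is not the BioNumerics workflow used in the study.

```python
def alu1_fragments(seq):
    """Return fragment lengths from a single AluI digest (cuts between AG and CT)."""
    cuts = [0]
    for i in range(len(seq) - 3):
        if seq[i:i + 4].upper() == "AGCT":
            cuts.append(i + 2)           # AluI cuts AG^CT
    cuts.append(len(seq))
    return [b - a for a, b in zip(cuts, cuts[1:])]

amplicon = "ATGCAGCTTACGGATCCAGCTGGAAGCTTTACG"   # placeholder sequence, not a real coxI/II amplicon
print(alu1_fragments(amplicon))                  # -> [6, 13, 7, 7]
```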

  6. Early Identification of Handicapped Children. Computer Assisted Remedial Education Report No. R-36.

    ERIC Educational Resources Information Center

    Cartwright, G. Phillip; Cartwright, Carol A.

    The handbook is intended to be part of a graduate course entitled "Introduction to Exceptional Children" which is taught via computer assisted instruction and emphasizes the social, psychological, and physiological characteristics of the mentally, visually, aurally, physically, emotionally, or neurologically handicapped primary grade child to…

  7. Drug target identification in sphingolipid metabolism by computational systems biology tools: metabolic control analysis and metabolic pathway analysis.

    PubMed

    Ozbayraktar, F Betül Kavun; Ulgen, Kutlu O

    2010-08-01

    Sphingolipids regulate cellular processes that are critically important for the cell's fate and function in cancer development and progression. This fact underlies a novel approach to cancer therapy. Pharmacological manipulation of sphingolipid metabolism in cancer therapeutics requires a detailed understanding of the pathway. Two computational systems biology tools are used to identify potential drug-target enzymes in the sphingolipid pathway that can be further utilized in drug-design studies for cancer therapy. The enzymes in the sphingolipid pathway were ranked according to their roles in controlling the metabolic network by metabolic control analysis. The physiologically connected reactions, i.e. the biologically significant and functional modules of the network, were identified by metabolic pathway analysis. The final set of candidate drug-target enzymes is selected such that their manipulation leads to ceramide accumulation and depletion of long-chain base phosphates. The efficiency of the mathematical tools used in this study for drug target identification is validated against clinically available drugs.
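
    Metabolic control analysis itself can be demonstrated on a toy pathway. The sketch below computes flux control coefficients C_i = (e_i/J)(dJ/de_i) for an invented two-step linear pathway; the kinetics and parameter values are illustrative only and unrelated to the sphingolipid model.

```python
# Toy pathway S -> X -> P with rates v1 = e1*(S - X), v2 = e2*X.
def steady_state_flux(e1, e2, S=10.0):
    # v1 = v2  =>  X = e1*S/(e1+e2)  =>  J = e1*e2*S/(e1+e2)
    return e1 * e2 * S / (e1 + e2)

def control_coefficient(e1, e2, which, h=1e-6):
    """Flux control coefficient by finite-difference perturbation of one enzyme."""
    J = steady_state_flux(e1, e2)
    if which == 1:
        dJ = (steady_state_flux(e1 * (1 + h), e2) - J) / (e1 * h)
        return e1 * dJ / J
    dJ = (steady_state_flux(e1, e2 * (1 + h)) - J) / (e2 * h)
    return e2 * dJ / J

e1, e2 = 2.0, 3.0
c1 = control_coefficient(e1, e2, 1)
c2 = control_coefficient(e1, e2, 2)
print(f"C1 = {c1:.3f}, C2 = {c2:.3f}, sum = {c1 + c2:.3f}")  # sum ~ 1 (summation theorem)
```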

  8. Computer-aided identification of the water diffusion coefficient for maize kernels dried in a thin layer

    NASA Astrophysics Data System (ADS)

    Kujawa, Sebastian; Weres, Jerzy; Olek, Wiesław

    2016-07-01

    Uncertainties in mathematical modelling of water transport in cereal grain kernels during drying and storage are mainly due to implementing unreliable values of the water diffusion coefficient and simplifying the geometry of kernels. In the present study an attempt was made to reduce the uncertainties by developing a method for computer-aided identification of the water diffusion coefficient and more accurate 3D geometry modelling for individual kernels using original inverse finite element algorithms. The approach was exemplified by identifying the water diffusion coefficient for maize kernels subjected to drying. On the basis of the developed method, values of the water diffusion coefficient were estimated, 3D geometry of a maize kernel was represented by isoparametric finite elements, and the moisture content inside maize kernels dried in a thin layer was predicted. Validation of the results against experimental data showed significantly lower error values than in the case of results obtained for the water diffusion coefficient values available in the literature.
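
    A much-simplified version of such an identification, fitting a single diffusion coefficient to thin-layer drying data, is sketched below. It approximates the kernel as a sphere and fits only the first term of the analytical series solution, whereas the study uses inverse finite-element analysis on 3D kernel geometry; the radius and the drying data are invented.

```python
import numpy as np
from scipy.optimize import curve_fit

R = 4e-3  # assumed equivalent kernel radius, m

def moisture_ratio(t, D):
    """First term of the series solution for diffusion out of a sphere.

    Valid for the later (falling-rate) stage of drying, so no t = 0 point is used.
    """
    return (6.0 / np.pi ** 2) * np.exp(-np.pi ** 2 * D * t / R ** 2)

# Hypothetical thin-layer drying data: time (s) vs. moisture ratio
t_data = np.array([600, 1200, 2400, 3600, 7200, 10800], dtype=float)
mr_data = np.array([0.54, 0.48, 0.38, 0.30, 0.14, 0.07])

(D_fit,), _ = curve_fit(moisture_ratio, t_data, mr_data, p0=[1e-10])
print(f"Identified D = {D_fit:.2e} m^2/s")
```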

  9. Self-Assessment and Student Improvement in an Introductory Computer Course at the Community College-Level

    ERIC Educational Resources Information Center

    Spicer-Sutton, Jama

    2013-01-01

    The purpose of this study was to determine a student's computer knowledge upon course entry and if there was a difference in college students' improvement scores as measured by the difference in pretest and posttest scores of new or novice users, moderate users, and expert users at the end of a college-level introductory computing class. This…

  10. High-level waste storage tank farms/242-A evaporator standards/requirements identification document (S/RID), Vol. 7

    SciTech Connect

    Not Available

    1994-04-01

    This Requirements Identification Document (RID) describes an Occupational Health and Safety Program as defined through the Relevant DOE Orders, regulations, industry codes/standards, industry guidance documents and, as appropriate, good industry practice. The definition of an Occupational Health and Safety Program as specified by this document is intended to address Defense Nuclear Facilities Safety Board Recommendations 90-2 and 91-1, which call for the strengthening of DOE complex activities through the identification and application of relevant standards which supplement or exceed requirements mandated by DOE Orders. This RID applies to the activities, personnel, structures, systems, components, and programs involved in maintaining the facility and executing the mission of the High-Level Waste Storage Tank Farms.

  11. Computer models of possible physiological contributions to low-level auditory scene analysis

    NASA Astrophysics Data System (ADS)

    Meddis, Ray

    2004-05-01

    Auditory selective attention is a general term for a wide range of phenomena including grouping and streaming of sound sources. While we know a great deal about the circumstances in which these phenomena occur, we have little understanding of the physiological mechanisms that give rise to these effects. Because attention is sometimes under conscious control, it is tempting to conclude that attention is a high-level/cortical function and beyond our current understanding of brain physiology. However, a number of mechanisms operating at the level of the brainstem may well make an important contribution to auditory scene analysis. Because we know much more about the response of the brainstem to auditory stimulation, we can begin some speculative modeling concerning their possible involvement. Two mechanisms will be discussed in terms of their possible relevance: lateral and recurrent inhibition at the level of the cochlear nucleus. These are likely to contribute to the selection of auditory channels. A new approach to within-channel selection on the basis of pitch will also be discussed. These approaches will be illustrated using computer models of the underlying physiology and their response to stimuli used in psychophysical experiments.

  12. Patient dose, gray level and exposure index with a computed radiography system

    NASA Astrophysics Data System (ADS)

    Silva, T. R.; Yoshimura, E. M.

    2014-02-01

    Computed radiography (CR) is gradually replacing conventional screen-film systems in Brazil. To assess image quality, manufacturers provide the calculation of an exposure index through the acquisition software of the CR system. The objective of this study is to verify whether the CR image can also be used to evaluate the patient absorbed dose, through a relationship between the entrance skin dose and the exposure index or the gray-level values obtained in the image. The CR system used for this study (Agfa model 30-X with NX acquisition software) calculates an exposure index called Log of the Median (lgM), related to the absorbed dose to the imaging plate (IP). The lgM value depends on the average gray level, called the Scan Average Level (SAL), of the segmented pixel-value histogram of the whole image. A Rando male phantom was used to simulate a human body (chest and head) and was irradiated with an X-ray unit using radiographic techniques typical of chest examinations. Thermoluminescent dosimeters (LiF, TLD-100) were used to evaluate the entrance skin dose and the exit dose. The results showed a logarithmic relation between the entrance dose and the SAL in the image center, regardless of the beam filtration. The exposure index varies linearly with the entrance dose, but the slope is beam-quality dependent. We conclude that, with an adequate calibration, the CR system can be used to evaluate the patient absorbed dose.
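
    The reported relations can be illustrated with a simple fit. The calibration points below are invented, not the study's measurements; the sketch merely shows how a logarithmic SAL-dose relation and a linear lgM-dose relation would be extracted from such data.

```python
import numpy as np

dose = np.array([0.05, 0.1, 0.2, 0.4, 0.8])     # hypothetical entrance doses, mGy
sal  = np.array([900, 1150, 1400, 1650, 1900])  # hypothetical scan average levels
lgm  = np.array([1.55, 1.85, 2.15, 2.45, 2.75]) # hypothetical exposure indices

# SAL ~ a*ln(dose) + b  (logarithmic relation)
a, b = np.polyfit(np.log(dose), sal, 1)
# lgM ~ c*dose + d      (linear relation; the slope depends on beam quality)
c, d = np.polyfit(dose, lgm, 1)

print(f"SAL = {a:.1f}*ln(dose) + {b:.1f}")
print(f"lgM = {c:.2f}*dose + {d:.2f}")
```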

  13. Computational approaches to analyse and predict small molecule transport and distribution at cellular and subcellular levels.

    PubMed

    Min, Kyoung Ah; Zhang, Xinyuan; Yu, Jing-yu; Rosania, Gus R

    2014-01-01

    Quantitative structure-activity relationship (QSAR) studies and mechanistic mathematical modeling approaches have been independently employed for analysing and predicting the transport and distribution of small molecule chemical agents in living organisms. Both of these computational approaches have been useful for interpreting experiments measuring the transport properties of small molecule chemical agents, in vitro and in vivo. Nevertheless, mechanistic cell-based pharmacokinetic models have been especially useful to guide the design of experiments probing the molecular pathways underlying small molecule transport phenomena. Unlike QSAR models, mechanistic models can be integrated from microscopic to macroscopic levels, to analyse the spatiotemporal dynamics of small molecule chemical agents from intracellular organelles to whole organs, well beyond the experiments and training data sets upon which the models are based. Based on differential equations, mechanistic models can also be integrated with other differential equations-based systems biology models of biochemical networks or signaling pathways. Although the origin and evolution of mathematical modeling approaches aimed at predicting drug transport and distribution has occurred independently from systems biology, we propose that the incorporation of mechanistic cell-based computational models of drug transport and distribution into a systems biology modeling framework is a logical next step for the advancement of systems pharmacology research.

  14. Identification and analysis of unsatisfactory psychosocial work situations: a participatory approach employing video-computer interaction.

    PubMed

    Hanse, J J; Forsman, M

    2001-02-01

    A method for psychosocial evaluation of potentially stressful or unsatisfactory situations in manual work was developed. It focuses on subjective responses regarding specific situations and is based on interactive worker assessment when viewing video recordings of oneself. The worker is first video-recorded during work. The video is then displayed on the computer terminal, and the filmed worker clicks on virtual controls on the screen whenever an unsatisfactory psychosocial situation appears; a window of questions regarding psychological demands, mental strain and job control is then opened. A library with pictorial information and comments on the selected situations is formed in the computer. The evaluation system, called PSIDAR, was applied in two case studies, one of manual materials handling in an automotive workshop and one of a group of workers producing and testing instrument panels. The findings indicate that PSIDAR can provide data that are useful in a participatory ergonomic process of change.

  15. Two-Level Verification of Data Integrity for Data Storage in Cloud Computing

    NASA Astrophysics Data System (ADS)

    Xu, Guangwei; Chen, Chunlin; Wang, Hongya; Zang, Zhuping; Pang, Mugen; Jiang, Ping

    Data storage in cloud computing can save capital expenditure and relieve the burden of storage management for users. As the loss or corruption of stored files may happen, many researchers focus on the verification of data integrity. However, massive numbers of users often bring large numbers of verification tasks to the auditor. Moreover, users also need to pay an extra fee for these verification tasks beyond the storage fee. Therefore, we propose a two-level verification of data integrity to alleviate these problems. The key idea is that users routinely verify the data integrity themselves, while the auditor arbitrates challenges between the user and the cloud provider according to the MACs and ϕ values. Extensive performance simulations show that the proposed scheme markedly decreases the auditor's verification tasks and the ratio of wrong arbitrations.
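
    The user-side (first-level) check can be pictured with a short sketch: block-wise MACs computed at upload time and re-verified against data returned by the provider. The fragment below is illustrative only; key management, the ϕ values, and the auditor's arbitration protocol of the actual scheme are omitted.

```python
import hmac, hashlib, os

BLOCK = 4096

def block_macs(data: bytes, key: bytes):
    """MAC per fixed-size block, stored locally by the user at upload time."""
    return [hmac.new(key, data[i:i + BLOCK], hashlib.sha256).digest()
            for i in range(0, len(data), BLOCK)]

def verify(data: bytes, key: bytes, stored_macs):
    """Return indices of blocks whose MACs no longer match."""
    return [i for i, mac in enumerate(block_macs(data, key))
            if not hmac.compare_digest(mac, stored_macs[i])]

key = os.urandom(32)
original = os.urandom(3 * BLOCK)
macs = block_macs(original, key)

tampered = bytearray(original)
tampered[5000] ^= 0xFF                       # corrupt one byte in the second block
print(verify(bytes(tampered), key, macs))    # -> [1]
```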

  16. Identification of Problems Hindering Logistics Support of Commercial- Off-The-Shelf Computer Equipment

    DTIC Science & Technology

    1993-09-01

    ... accelerating technology taking place in the computer industry. Market-Based Pricing. Because the Air Force is buying the COTS equipment "off-the-shelf" ... supporting, participating, and using commands. b. Visit Air Force and private industry to get their perspective, approaches, and procedures on both effective ... teams to collect data from industry, acquirers, and users/supporters. The industry team collected data from The Bank of Boston, Commonwealth of ...

  17. A Computational Wireless Network Backplane: Performance in a Distributed Speaker Identification Application Postprint

    DTIC Science & Technology

    2008-12-01

    ... traffic patterns are intense but constrained to a local area. Examples include peer-to-peer applications or sensor data processing in the region. In such ... vol. 30, no. 4, pp. 68–74, 1997. [7] J. Dean and S. Ghemawat, "MapReduce: simplified data processing on large clusters," Commun. ACM, vol. 51, no. 1 ... DWARF, a general distributed application execution framework for wireless ad-hoc networks which dynamically allocates computation resources and manages ...

  18. Goal-Directed Behavior and Instrumental Devaluation: A Neural System-Level Computational Model

    PubMed Central

    Mannella, Francesco; Mirolli, Marco; Baldassarre, Gianluca

    2016-01-01

    Devaluation is the key experimental paradigm used to demonstrate the presence of instrumental behaviors guided by goals in mammals. We propose a neural system-level computational model to address the question of which brain mechanisms allow the current value of rewards to control instrumental actions. The model pivots on and shows the computational soundness of the hypothesis that the internal representation of instrumental manipulanda (e.g., levers) activates the representation of rewards (or “action-outcomes”, e.g., foods) while attributing to them a value which depends on the current internal state of the animal (e.g., satiation for some but not all foods). The model also proposes an initial hypothesis of the integrated system of key brain components supporting this process and allowing the recalled outcomes to bias action selection: (a) the sub-system formed by the basolateral amygdala and insular cortex acquiring the manipulanda-outcomes associations and attributing the current value to the outcomes; (b) three basal ganglia-cortical loops selecting respectively goals, associative sensory representations, and actions; (c) the cortico-cortical and striato-nigro-striatal neural pathways supporting the selection, and selection learning, of actions based on habits and goals. The model reproduces and explains the results of several devaluation experiments carried out with control rats and rats with pre- and post-training lesions of the basolateral amygdala, the nucleus accumbens core, the prelimbic cortex, and the dorso-medial striatum. The results support the soundness of the hypotheses of the model and show its capacity to integrate, at the system level, the operations of the key brain structures underlying devaluation. Based on its hypotheses and predictions, the model also represents an operational framework to support the design and analysis of new experiments on the motivational aspects of goal-directed behavior. PMID:27803652

  19. Goal-Directed Behavior and Instrumental Devaluation: A Neural System-Level Computational Model.

    PubMed

    Mannella, Francesco; Mirolli, Marco; Baldassarre, Gianluca

    2016-01-01

    Devaluation is the key experimental paradigm used to demonstrate the presence of instrumental behaviors guided by goals in mammals. We propose a neural system-level computational model to address the question of which brain mechanisms allow the current value of rewards to control instrumental actions. The model pivots on and shows the computational soundness of the hypothesis that the internal representation of instrumental manipulanda (e.g., levers) activates the representation of rewards (or "action-outcomes", e.g., foods) while attributing to them a value which depends on the current internal state of the animal (e.g., satiation for some but not all foods). The model also proposes an initial hypothesis of the integrated system of key brain components supporting this process and allowing the recalled outcomes to bias action selection: (a) the sub-system formed by the basolateral amygdala and insular cortex acquiring the manipulanda-outcomes associations and attributing the current value to the outcomes; (b) three basal ganglia-cortical loops selecting respectively goals, associative sensory representations, and actions; (c) the cortico-cortical and striato-nigro-striatal neural pathways supporting the selection, and selection learning, of actions based on habits and goals. The model reproduces and explains the results of several devaluation experiments carried out with control rats and rats with pre- and post-training lesions of the basolateral amygdala, the nucleus accumbens core, the prelimbic cortex, and the dorso-medial striatum. The results support the soundness of the hypotheses of the model and show its capacity to integrate, at the system level, the operations of the key brain structures underlying devaluation. Based on its hypotheses and predictions, the model also represents an operational framework to support the design and analysis of new experiments on the motivational aspects of goal-directed behavior.

  20. Instruction-Level Characterization of Scientific Computing Applications Using Hardware Performance Counters

    SciTech Connect

    Luo, Y.; Cameron, K.W.

    1998-11-24

    Workload characterization has proven to be an essential tool for architecture design and performance evaluation in both scientific and commercial computing. Traditional workload characterization metrics include FLOPS rate, cache miss ratios, and CPI (cycles per instruction) or its inverse IPC (instructions per cycle). Given the complexity of sophisticated modern superscalar microprocessors, these traditional characterization techniques are not powerful enough to pinpoint the performance bottleneck of an application on a specific microprocessor. They are also incapable of immediately demonstrating the potential performance benefit of an architectural or functional improvement in a new processor design. To address these problems, many people rely on simulators, which have substantial constraints, especially for large-scale scientific computing applications. This paper presents a new technique for characterizing applications at the instruction level using hardware performance counters. It has the advantage of collecting instruction-level characteristics in a few runs with virtually no overhead or slowdown. A variety of instruction counts can be used to calculate average abstract workload parameters corresponding to microprocessor pipelines or functional units. Based on the microprocessor's architectural constraints and these calculated abstract parameters, the architectural performance bottleneck for a specific application can be estimated. In particular, the analysis results provide some insight into why only a small percentage of processor peak performance can be achieved even for many very cache-friendly codes. Meanwhile, the bottleneck estimation can suggest viable architectural or functional improvements for certain workloads. Eventually, these abstract parameters can lead to the creation of an analytical microprocessor pipeline model and memory hierarchy model.
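
    Once raw counter values are available, the derived workload parameters are simple ratios. The sketch below uses hypothetical counter names and values; on a real system they would come from a measurement tool such as perf or PAPI.

```python
counters = {
    "cycles":        8.4e9,
    "instructions":  6.3e9,
    "loads":         1.9e9,
    "stores":        0.7e9,
    "fp_ops":        1.2e9,
    "l2_misses":     4.5e7,
}

ipc = counters["instructions"] / counters["cycles"]
cpi = 1.0 / ipc
mem_ratio = (counters["loads"] + counters["stores"]) / counters["instructions"]
fp_ratio = counters["fp_ops"] / counters["instructions"]
l2_mpki = 1000 * counters["l2_misses"] / counters["instructions"]

print(f"IPC = {ipc:.2f}, CPI = {cpi:.2f}")
print(f"memory ops / instr = {mem_ratio:.2f}, FP ops / instr = {fp_ratio:.2f}")
print(f"L2 misses per kilo-instruction = {l2_mpki:.2f}")
```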

  1. Establishment of diagnostic reference levels in computed tomography for select procedures in Pudhuchery, India.

    PubMed

    Saravanakumar, A; Vaideki, K; Govindarajan, K N; Jayakumar, S

    2014-01-01

    Under operating conditions, the computed tomography (CT) scanner has become a major source of human exposure to diagnostic X-rays. In this context, the weighted CT dose index (CTDIw), the volumetric CT dose index (CTDIv), and the dose length product (DLP) are important parameters for assessing CT procedures, serving as surrogate dose quantities for patient dose optimization. The current work aims to estimate the existing dose levels of CT scanners for head, chest, and abdomen procedures in Pudhuchery in south India and to establish dose reference levels (DRLs) for the region. The study was carried out on six CT scanners in six different radiology departments using a 100 mm long pencil ionization chamber and a polymethylmethacrylate (PMMA) phantom. From each CT scanner, data pertaining to patient and machine details were collected for 50 head, 50 chest, and 50 abdomen procedures performed over a period of 1 year. The experimental work was carried out using the machine operating parameters used during the procedures. Initially, the dose received in the phantom at the center and periphery was measured by the five-point method. Using these values, CTDIw, CTDIv, and DLP were calculated. The DRLs were established based on the third quartile values of CTDIv and DLP, which are 32 mGy and 925 mGy.cm for head, 12 mGy and 456 mGy.cm for chest, and 16 mGy and 482 mGy.cm for abdomen procedures. These values are well below the European Commission Dose Reference Levels (EC DRLs) and comparable with the third quartile values reported for the Tamil Nadu region in India. The present study is the first of its kind to determine DRLs for scanners operating in the Pudhuchery region. Similar studies in other regions of India are necessary in order to establish national dose reference levels.

  2. Establishment of diagnostic reference levels in computed tomography for select procedures in Pudhuchery, India

    PubMed Central

    Saravanakumar, A.; Vaideki, K.; Govindarajan, K. N.; Jayakumar, S.

    2014-01-01

    Under operating conditions, the computed tomography (CT) scanner has become a major source of human exposure to diagnostic X-rays. In this context, the weighted CT dose index (CTDIw), the volumetric CT dose index (CTDIv), and the dose length product (DLP) are important parameters for assessing CT procedures, serving as surrogate dose quantities for patient dose optimization. The current work aims to estimate the existing dose levels of CT scanners for head, chest, and abdomen procedures in Pudhuchery in south India and to establish dose reference levels (DRLs) for the region. The study was carried out on six CT scanners in six different radiology departments using a 100 mm long pencil ionization chamber and a polymethylmethacrylate (PMMA) phantom. From each CT scanner, data pertaining to patient and machine details were collected for 50 head, 50 chest, and 50 abdomen procedures performed over a period of 1 year. The experimental work was carried out using the machine operating parameters used during the procedures. Initially, the dose received in the phantom at the center and periphery was measured by the five-point method. Using these values, CTDIw, CTDIv, and DLP were calculated. The DRLs were established based on the third quartile values of CTDIv and DLP, which are 32 mGy and 925 mGy.cm for head, 12 mGy and 456 mGy.cm for chest, and 16 mGy and 482 mGy.cm for abdomen procedures. These values are well below the European Commission Dose Reference Levels (EC DRLs) and comparable with the third quartile values reported for the Tamil Nadu region in India. The present study is the first of its kind to determine DRLs for scanners operating in the Pudhuchery region. Similar studies in other regions of India are necessary in order to establish national dose reference levels. PMID:24600173

  3. Revisiting thoracic surface anatomy in an adult population: A computed tomography evaluation of vertebral level.

    PubMed

    Badshah, Masroor; Soames, Roger; Khan, Muhammad Jaffar; Ibrahim, Muhammad; Khan, Adnan

    2017-03-01

    To compare key thoracic anatomical surface landmarks between healthy and patient adult populations using Computed Tomography (CT). Sixteen-slice CT images of 250 age- and gender-matched healthy individuals and 99 patients with lung parenchymal disease were analyzed to determine the relationship between 17 thoracic structures and their vertebral levels using a 32-bit Radiant DICOM viewer. The structures studied were: aortic hiatus, azygos vein, brachiocephalic artery, gastroesophageal junction (GEJ), left and right common carotid arteries, left and right subclavian arteries, pulmonary trunk bifurcation, superior vena cava junction with the right atrium, carina, cardiac apex, manubriosternal junction, xiphisternal joint, inferior vena cava (IVC) crossing the diaphragm, aortic arch and junction of brachiocephalic veins. The surface anatomy of all structures varied among individuals, with no significant effect of age. Binary logistic regression analysis showed a significant association between individual health status and vertebral level for brachiocephalic artery (P = 0.049), GEJ (P = 0.020), right common carotid (P = 0.009) and subclavian arteries (P = 0.009), pulmonary trunk bifurcation (P = 0.049), carina (P = 0.004), and IVC crossing the diaphragm (P = 0.025). These observations differ from those reported in a healthy white Caucasian population and from the vertebral levels of the IVC, esophagus, and aorta crossing the diaphragm in an Iranian population. The differences observed in this study provide insight into the effect of lung pathology on specific thoracic structures and their vertebral levels. Further studies are needed to determine whether these are general changes or pathology-specific. Clin. Anat. 30:227-236, 2017. © 2017 Wiley Periodicals, Inc.

  4. Combining computer algorithms with experimental approaches permits the rapid and accurate identification of T cell epitopes from defined antigens.

    PubMed

    Schirle, M; Weinschenk, T; Stevanović, S

    2001-11-01

    The identification of T cell epitopes from immunologically relevant antigens remains a critical step in the development of vaccines and methods for monitoring of T cell responses. This review presents an overview of strategies that employ computer algorithms for the selection of candidate peptides from defined proteins and subsequent verification of their in vivo relevance by experimental approaches. Several computer algorithms are currently being used for epitope prediction of various major histocompatibility complex (MHC) class I and II molecules, based either on the analysis of natural MHC ligands or on the binding properties of synthetic peptides. Moreover, the analysis of proteasomal digests of peptides and whole proteins has led to the development of algorithms for the prediction of proteasomal cleavages. In order to verify the generation of the predicted peptides during antigen processing in vivo as well as their immunogenic potential, several experimental approaches have been pursued in the recent past. Mass spectrometry-based bioanalytical approaches have been used specifically to detect predicted peptides among isolated natural ligands. Other strategies employ various methods for the stimulation of primary T cell responses against the predicted peptides and subsequent testing of the recognition pattern towards target cells that express the antigen.
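
    A toy version of matrix-based epitope prediction, one of the algorithm families surveyed here, is sketched below: every 9-mer of a protein is scored with a position-specific scoring matrix. The matrix is random and the sequence is only an example; real predictors use trained coefficients derived from natural MHC ligands or peptide-binding data.

```python
import numpy as np

AA = "ACDEFGHIKLMNPQRSTVWY"
rng = np.random.default_rng(42)
pssm = rng.normal(size=(9, len(AA)))          # placeholder 9-position scoring matrix

def score_9mer(pep):
    """Sum of position-specific scores for a 9-residue peptide."""
    return sum(pssm[i, AA.index(a)] for i, a in enumerate(pep))

def top_epitopes(protein, k=3):
    """Rank all 9-mers of a protein and return the k best-scoring candidates."""
    peps = [protein[i:i + 9] for i in range(len(protein) - 8)]
    return sorted(peps, key=score_9mer, reverse=True)[:k]

protein = "MTEYKLVVVGAGGVGKSALTIQLIQNHFVDEYDPTIEDSY"   # example sequence
for pep in top_epitopes(protein):
    print(pep, round(score_9mer(pep), 2))
```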

  5. Development of computer program ENAUDIBL for computation of the sensation levels of multiple, complex, intrusive sounds in the presence of residual environmental masking noise

    SciTech Connect

    Liebich, R. E.; Chang, Y.-S.; Chun, K. C.

    2000-03-31

    The relative audibility of multiple sounds occurs in separate, independent channels (frequency bands) termed critical bands or equivalent rectangular (filter-response) bandwidths (ERBs) of frequency. The true nature of human hearing is a function of a complex combination of subjective factors, both auditory and nonauditory. Assessment of the probability of individual annoyance, community-complaint reaction levels, speech intelligibility, and the most cost-effective mitigation actions requires sensation-level data; these data are among the most important auditory factors. However, sensation levels cannot be calculated from single-number, A-weighted sound level values. This paper describes the specific steps used to compute sensation levels. A unique, newly developed procedure is used, which simplifies and improves the accuracy of such computations through the use of the maximum sensation level that occurs, for each intrusive-sound spectrum, within each ERB. The newly developed program ENAUDIBL makes use of ERB sensation-level values generated with computational subroutines developed for the previously documented program SPECTRAN.
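
    The band-wise idea can be illustrated as follows: the sensation level in each ERB is the intrusive-sound band level minus the masked threshold set by the residual background in that band. The spectra below are hypothetical, and the ERB width uses the Glasberg-Moore approximation ERB(f) = 24.7(4.37 f/1000 + 1) Hz; this is a sketch of the concept, not the ENAUDIBL algorithm.

```python
import numpy as np

def erb_width(f_hz):
    """Glasberg & Moore approximation of the ERB width at centre frequency f."""
    return 24.7 * (4.37 * f_hz / 1000.0 + 1.0)

centre_freqs = np.array([125, 250, 500, 1000, 2000, 4000, 8000], dtype=float)
intrusive_db = np.array([55, 58, 60, 62, 57, 50, 42], dtype=float)  # intrusive-sound band levels
masking_db   = np.array([48, 45, 44, 40, 35, 30, 28], dtype=float)  # residual-noise masked thresholds

sensation_level = intrusive_db - masking_db   # dB above the masked threshold in each band

for f, sl in zip(centre_freqs, sensation_level):
    status = "audible" if sl > 0 else "masked"
    print(f"{f:6.0f} Hz  ERB ~ {erb_width(f):5.1f} Hz  SL = {sl:5.1f} dB  ({status})")
```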

  6. Computational identification of putative miRNAs and their target genes in pathogenic amoeba Naegleria fowleri

    PubMed Central

    Padmashree, Dyavegowda; Swamy, Narayanaswamy Ramachandra

    2015-01-01

    Naegleria fowleri is a parasitic, free-living, unicellular eukaryotic amoeba. The parasite spreads through contaminated water and causes primary amoebic meningoencephalitis (PAM). Therefore, it is of interest to understand its molecular pathogenesis. Hence, we analyzed the parasite genome for miRNAs (microRNAs), which are non-coding, single-stranded RNA molecules. We identified 245 miRNAs in N. fowleri using computational methods, of which five miRNAs are conserved. The predicted miRNA targets were analyzed using miRanda (software), and their functions were further studied by annotation with AmiGo (a gene ontology web tool). PMID:26770029

  7. Identification of Cardiac and Aortic Injuries in Trauma with Multi-detector Computed Tomography.

    PubMed

    Shergill, Arvind K; Maraj, Tishan; Barszczyk, Mark S; Cheung, Helen; Singh, Navneet; Zavodni, Anna E

    2015-01-01

    Blunt and penetrating cardiovascular (CV) injuries are associated with a high morbidity and mortality. Rapid detection of these injuries in trauma is critical for patient survival. The advent of multi-detector computed tomography (MDCT) has led to increased detection of CV injuries during rapid comprehensive scanning of stabilized major trauma patients. MDCT has the ability to acquire images with a higher temporal and spatial resolution, as well as the capability to create multiplanar reformats. This pictorial review illustrates several common and life-threatening traumatic CV injuries from a regional trauma center.

  8. Identification of Cardiac and Aortic Injuries in Trauma with Multi-detector Computed Tomography

    PubMed Central

    Shergill, Arvind K; Maraj, Tishan; Barszczyk, Mark S; Cheung, Helen; Singh, Navneet; Zavodni, Anna E

    2015-01-01

    Blunt and penetrating cardiovascular (CV) injuries are associated with a high morbidity and mortality. Rapid detection of these injuries in trauma is critical for patient survival. The advent of multi-detector computed tomography (MDCT) has led to increased detection of CV injuries during rapid comprehensive scanning of stabilized major trauma patients. MDCT has the ability to acquire images with a higher temporal and spatial resolution, as well as the capability to create multiplanar reformats. This pictorial review illustrates several common and life-threatening traumatic CV injuries from a regional trauma center. PMID:26430541

  9. Computational identification of putative miRNAs and their target genes in pathogenic amoeba Naegleria fowleri.

    PubMed

    Padmashree, Dyavegowda; Swamy, Narayanaswamy Ramachandra

    2015-01-01

    Naegleria fowleri is a parasitic, free-living, unicellular eukaryotic amoeba. The parasite spreads through contaminated water and causes primary amoebic meningoencephalitis (PAM). Therefore, it is of interest to understand its molecular pathogenesis. Hence, we analyzed the parasite genome for miRNAs (microRNAs), which are non-coding, single-stranded RNA molecules. We identified 245 miRNAs in N. fowleri using computational methods, of which five miRNAs are conserved. The predicted miRNA targets were analyzed using miRanda (software), and their functions were further studied by annotation with AmiGo (a gene ontology web tool).

  10. Computational Identification of Genomic Features That Influence 3D Chromatin Domain Formation

    PubMed Central

    Mourad, Raphaël; Cuvier, Olivier

    2016-01-01

    Recent advances in long-range Hi-C contact mapping have revealed the importance of the 3D structure of chromosomes in gene expression. A current challenge is to identify the key molecular drivers of this 3D structure. Several genomic features, such as architectural proteins and functional elements, were shown to be enriched at topological domain borders using classical enrichment tests. Here we propose multiple logistic regression to identify those genomic features that positively or negatively influence domain border establishment or maintenance. The model is flexible, and can account for statistical interactions among multiple genomic features. Using both simulated and real data, we show that our model outperforms enrichment test and non-parametric models, such as random forests, for the identification of genomic features that influence domain borders. Using Drosophila Hi-C data at a very high resolution of 1 kb, our model suggests that, among architectural proteins, BEAF-32 and CP190 are the main positive drivers of 3D domain borders. In humans, our model identifies well-known architectural proteins CTCF and cohesin, as well as ZNF143 and Polycomb group proteins as positive drivers of domain borders. The model also reveals the existence of several negative drivers that counteract the presence of domain borders including P300, RXRA, BCL11A and ELK1. PMID:27203237
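
    The modelling idea reduces to a logistic regression with interaction terms. The condensed sketch below uses random placeholder signals for a few of the factors mentioned above; the real analysis operates on ChIP coverage at Hi-C bin resolution and on measured border calls.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.preprocessing import PolynomialFeatures

rng = np.random.default_rng(1)
factors = ["BEAF-32", "CP190", "CTCF", "P300"]
X = rng.random((5000, len(factors)))            # placeholder binding signals per bin
y = rng.integers(0, 2, 5000)                    # 1 = bin overlaps a domain border (placeholder)

# degree-2 interaction-only expansion adds pairwise feature products
X_int = PolynomialFeatures(degree=2, interaction_only=True,
                           include_bias=False).fit_transform(X)
model = LogisticRegression(max_iter=1000).fit(X_int, y)

# Positive coefficients suggest factors (or combinations) favouring borders,
# negative ones suggest factors counteracting them.
for name, coef in zip(factors, model.coef_[0][:len(factors)]):
    print(f"{name:8s} {coef:+.3f}")
```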

  11. Computational identification and analysis of MADS box genes in Camellia sinensis.

    PubMed

    Gogoi, Madhurjya; Borchetia, Sangeeta; Bandyopadhyay, Tanoy

    2015-01-01

    MADS (Minichromosome Maintenance1 Agamous Deficiens Serum response factor) box genes encode transcription factors that play a key role in the growth and development of flowering plants. There are two types of MADS box genes: Type I (serum response factor (SRF)-like) and Type II (myocyte enhancer factor 2 (MEF2)-like). Type II MADS box genes have a conserved MIKC domain (MADS DNA-binding domain, intervening domain, keratin-like domain, and C-terminal domain) and have been extensively studied in plants. Compared with other plants, very little is known about MADS box genes in Camellia sinensis. The present study aims at identifying and analyzing the MADS-box genes present in Camellia sinensis. A comparative bioinformatics and phylogenetic analysis of Camellia sinensis sequences, together with Arabidopsis thaliana MADS box sequences available in public domain databases, led to the identification of 16 genes orthologous to Type II MADS box gene family members. The protein sequences were classified into distinct clades associated with the conserved functions of flower and seed development. The identified genes may be used for gene expression and gene manipulation studies to elucidate their role in the development and flowering of tea, which may pave the way to improved crop productivity.

  12. Computational identification and analysis of MADS box genes in Camellia sinensis

    PubMed Central

    Gogoi, Madhurjya; Borchetia, Sangeeta; Bandyopadhyay, Tanoy

    2015-01-01

    MADS (Minichromosome Maintenance1 Agamous Deficiens Serum response factor) box genes encode transcription factors that play a key role in the growth and development of flowering plants. There are two types of MADS box genes: Type I (serum response factor (SRF)-like) and Type II (myocyte enhancer factor 2 (MEF2)-like). Type II MADS box genes have a conserved MIKC domain (MADS DNA-binding domain, intervening domain, keratin-like domain, and C-terminal domain) and have been extensively studied in plants. Compared with other plants, very little is known about MADS box genes in Camellia sinensis. The present study aims at identifying and analyzing the MADS-box genes present in Camellia sinensis. A comparative bioinformatics and phylogenetic analysis of Camellia sinensis sequences, together with Arabidopsis thaliana MADS box sequences available in public domain databases, led to the identification of 16 genes orthologous to Type II MADS box gene family members. The protein sequences were classified into distinct clades associated with the conserved functions of flower and seed development. The identified genes may be used for gene expression and gene manipulation studies to elucidate their role in the development and flowering of tea, which may pave the way to improved crop productivity. PMID:25914445

  13. Computational identification of human long intergenic non-coding RNAs using a GA-SVM algorithm.

    PubMed

    Wang, Yanqiu; Li, Yang; Wang, Qi; Lv, Yingli; Wang, Shiyuan; Chen, Xi; Yu, Xuexin; Jiang, Wei; Li, Xia

    2014-01-01

    Long intergenic non-coding RNAs (lincRNAs) are a new class of non-coding RNAs and are closely related to the occurrence and development of diseases. In previous studies, most lincRNAs have been identified through next-generation sequencing. Because lincRNAs exhibit tissue-specific expression, the reproducibility of lincRNA discovery across studies is very poor. In this study, without relying on lincRNA expression, we used sequence, structural and protein-coding potential features to construct a classifier that distinguishes lincRNAs from non-lincRNAs. The GA-SVM algorithm was applied to extract an optimized feature subset. Five-fold cross-validation showed that, compared with several other feature subsets, this optimized subset exhibited the best performance for the identification of human lincRNAs. Moreover, the LincRNA Classifier based on Selected Features (linc-SF) was constructed with a support vector machine (SVM) using the optimized feature subset. The performance of this classifier was further evaluated by predicting lincRNAs from two independent lincRNA sets; the recognition rates for the two sets were 100% and 99.8%, indicating that linc-SF is effective for the prediction of human lincRNAs.
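
    A minimal sketch of the SVM-with-feature-subset idea described above, evaluated by five-fold cross-validation; the random-subset loop only stands in for the paper's genetic-algorithm search, and all data and feature counts are simulated.

        import numpy as np
        from sklearn.svm import SVC
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(1)

        # Illustrative data: 600 transcripts x 30 candidate features (sequence,
        # structure and coding-potential descriptors); label 1 = lincRNA, 0 = not.
        X = rng.normal(size=(600, 30))
        y = rng.integers(0, 2, size=600)

        def cv_accuracy(feature_idx):
            """Five-fold cross-validated accuracy of an SVM on a feature subset."""
            clf = SVC(kernel="rbf", gamma="scale")
            return cross_val_score(clf, X[:, feature_idx], y, cv=5).mean()

        # Stand-in for the GA search: score a few random subsets and keep the best.
        best = max((rng.choice(30, size=10, replace=False) for _ in range(20)),
                   key=cv_accuracy)
        print("selected features:", sorted(best), "accuracy:", cv_accuracy(best))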

  14. The sensitivity of nonlinear computational models of trabecular bone to tissue level constitutive model.

    PubMed

    Baumann, Andrew P; Shi, Xiutao; Roeder, Ryan K; Niebur, Glen L

    2016-01-01

    Microarchitectural finite element models have become a key tool in the analysis of trabecular bone. Robust, accurate, and validated constitutive models would enhance confidence in predictive applications of these models and in their usefulness as accurate assays of tissue properties. Human trabecular bone specimens from the femoral neck (n = 3), greater trochanter (n = 6), and lumbar vertebra (n = 1) of eight different donors were scanned by μ-CT and converted to voxel-based finite element models. Unconfined uniaxial compression and shear loading were simulated for each of three different constitutive models: a principal strain-based model, Drucker-Lode, and Drucker-Prager. The latter was applied with both infinitesimal and finite kinematics. Apparent yield strains exhibited minimal dependence on the constitutive model, differing by at most 16.1%, with the kinematic formulation being influential in compression loading. At the tissue level, the quantities and locations of yielded tissue were insensitive to the constitutive model, with the exception of the Drucker-Lode model, suggesting that correlation of microdamage with computational models does not improve the ability to discriminate between constitutive laws. Taken together, it is unlikely that a tissue constitutive model can be fully validated from apparent-level experiments alone, as the calculations are too insensitive to identify differences in the outcomes. Rather, any asymmetric criterion with a valid yield surface will likely be suitable for most trabecular bone models.

  15. National diagnostic reference level initiative for computed tomography examinations in Kenya

    PubMed Central

    Korir, Geoffrey K.; Wambani, Jeska S.; Korir, Ian K.; Tries, Mark A.; Boen, Patrick K.

    2016-01-01

    The purpose of this study was to estimate the computed tomography (CT) examination frequency, patient radiation exposure, effective doses and national diagnostic reference levels (NDRLs) associated with CT examinations in clinical practice. A structured questionnaire-type form was developed for recording examination frequency, scanning protocols and patient radiation exposure during CT procedures in fully equipped medical facilities across the country. The national annual number of CT examinations per 1000 people was estimated to be 3 procedures. The volume-weighted CT dose index, dose length product, effective dose and NDRLs were determined for 20 types of adult and paediatric CT examinations. Additionally, the CT annual collective effective dose and effective dose per capita were approximated. The radiation exposure during CT examinations was broadly distributed between the facilities that took part in the study. This calls for a need to develop and implement diagnostic reference levels as a standardisation and optimisation tool for the radiological protection of patients at all the CT facilities nationwide. PMID:25790825

  16. Identification of miRNAs Potentially Involved in Bronchiolitis Obliterans Syndrome: A Computational Study

    PubMed Central

    Politano, Gianfranco; Inghilleri, Simona; Morbini, Patrizia; Calabrese, Fiorella; Benso, Alfredo; Savino, Alessandro; Cova, Emanuela; Zampieri, Davide; Meloni, Federica

    2016-01-01

    The pathogenesis of Bronchiolitis Obliterans Syndrome (BOS), the main clinical phenotype of chronic lung allograft dysfunction, is poorly understood. Recent studies suggest that epigenetic regulation of microRNAs might play a role in its development. In this paper we present the application of a complex computational pipeline to perform enrichment analysis of miRNAs in pathways applied to the study of BOS. The analysis considered the full set of miRNAs annotated in miRBase (version 21), and applied a sequence of filtering approaches and statistical analyses to reduce this set and to score the candidate miRNAs according to their potential involvement in BOS development. Dysregulation of two of the selected candidate miRNAs, miR-34a and miR-21, was clearly shown in in-situ hybridization (ISH) on five explanted human BOS lungs and on a rat model of acute and chronic lung rejection, thus definitely identifying miR-34a and miR-21 as pathogenic factors in BOS and confirming the effectiveness of the computational pipeline. PMID:27564214

  17. Facile identification of dual FLT3-Aurora A inhibitors: a computer-guided drug design approach.

    PubMed

    Chang Hsu, Yung; Ke, Yi-Yu; Shiao, Hui-Yi; Lee, Chieh-Chien; Lin, Wen-Hsing; Chen, Chun-Hwa; Yen, Kuei-Jung; Hsu, John T-A; Chang, Chungming; Hsieh, Hsing-Pang

    2014-05-01

    Computer-guided drug design is a powerful tool for drug discovery. Herein we disclose the use of this approach for the discovery of dual FMS-like receptor tyrosine kinase-3 (FLT3)-Aurora A inhibitors against cancer. An Aurora hit compound was selected as a starting point, from which 288 virtual molecules were screened. Subsequently, some of these were synthesized and evaluated for their capacity to inhibit FLT3 and Aurora kinase A. To further enhance FLT3 inhibition, structure-activity relationship studies of the lead compound were conducted through a simplification strategy and bioisosteric replacement, followed by the use of computer-guided drug design to prioritize molecules bearing a variety of different terminal groups in terms of favorable binding energy. Selected compounds were then synthesized, and their bioactivity was evaluated. Of these, one novel inhibitor was found to exhibit excellent inhibition of FLT3 and Aurora kinase A and exert a dramatic antiproliferative effect on MOLM-13 and MV4-11 cells, with an IC50 value of 7 nM. Accordingly, it is considered a highly promising candidate for further development.

  18. Computational Identification of Mechanistic Factors That Determine the Timing and Intensity of the Inflammatory Response

    PubMed Central

    Nagaraja, Sridevi; Reifman, Jaques; Mitrophanov, Alexander Y.

    2015-01-01

    Timely resolution of inflammation is critical for the restoration of homeostasis in injured or infected tissue. Chronic inflammation is often characterized by a persistent increase in the concentrations of inflammatory cells and molecular mediators, whose distinct amount and timing characteristics offer an opportunity to identify effective therapeutic regulatory targets. Here, we used our recently developed computational model of local inflammation to identify potential targets for molecular interventions and to investigate the effects of individual and combined inhibition of such targets. This was accomplished via the development and application of computational strategies involving the simulation and analysis of thousands of inflammatory scenarios. We found that modulation of macrophage influx and efflux is an effective potential strategy to regulate the amount of inflammatory cells and molecular mediators in both normal and chronic inflammatory scenarios. We identified three molecular mediators − tumor necrosis factor-α (TNF-α), transforming growth factor-β (TGF-β), and the chemokine CXCL8 − as potential molecular targets whose individual or combined inhibition may robustly regulate both the amount and timing properties of the kinetic trajectories for neutrophils and macrophages in chronic inflammation. Modulation of macrophage flux, as well as of the abundance of TNF-α, TGF-β, and CXCL8, may improve the resolution of chronic inflammation. PMID:26633296

  19. Myocardial strain estimation from CT: towards computer-aided diagnosis on infarction identification

    NASA Astrophysics Data System (ADS)

    Wong, Ken C. L.; Tee, Michael; Chen, Marcus; Bluemke, David A.; Summers, Ronald M.; Yao, Jianhua

    2015-03-01

    Regional myocardial strains have the potential for early quantification and detection of cardiac dysfunction. Although imaging modalities such as tagged and strain-encoded MRI can provide motion information for the myocardium, they are uncommon in clinical routine. In contrast, cardiac CT images are usually available, but they only provide motion information at salient features such as the cardiac boundaries. To estimate myocardial strains from a CT image sequence, we adopted a cardiac biomechanical model with hyperelastic material properties to relate the motion of the cardiac boundaries to the myocardial deformation. The frame-to-frame displacements of the cardiac boundaries are obtained using B-spline deformable image registration based on mutual information and are enforced as boundary conditions on the biomechanical model. The system equation is solved by the finite element method to provide the dense displacement field of the myocardium, and the regional values of the three principal strains and the six strains in cylindrical coordinates are computed according to the American Heart Association nomenclature. To study the potential of the estimated regional strains for identifying myocardial infarction, experiments were performed on cardiac CT image sequences of ten canines with artificially induced myocardial infarctions. Leave-one-subject-out cross validation shows that, using the optimal strain magnitude thresholds computed from ROC curves, the radial strain and the first principal strain have the best performance.

  20. Identification of contaminant source architectures—A statistical inversion that emulates multiphase physics in a computationally practicable manner

    NASA Astrophysics Data System (ADS)

    Koch, J.; Nowak, W.

    2016-02-01

    The goal of this work is to improve the inference of nonaqueous-phase contaminated source zone architectures (CSA) from field data. We follow the idea that a physically motivated model for CSA formation helps in this inference by providing relevant relationships between observables and the unknown CSA. Typical multiphase models are computationally too expensive to be applied for inverse modeling; thus, state-of-the-art CSA identification techniques do not yet use physically based CSA formation models. To overcome this shortcoming, we apply a stochastic multiphase model with reduced computational effort that can be used to generate a large ensemble of possible CSA realizations. Further, we apply a reverse transport formulation in order to accelerate the inversion of transport-related data such as downgradient aqueous-phase concentrations. We combine these approaches within an inverse Bayesian methodology for joint inversion of CSA and aquifer parameters. Because we use multiphase physics to constrain and inform the inversion, (1) only physically meaningful CSAs are inferred; (2) each conditional realization is statistically meaningful; (3) we obtain physically meaningful spatial dependencies for interpolation and extrapolation of point-like observations between the different involved unknowns and observables, and (4) dependencies far beyond simple correlation; (5) the inversion yields meaningful uncertainty bounds. We illustrate our concept by inferring three-dimensional probability distributions of DNAPL residence, contaminant mass discharge, and of other CSA characteristics. In the inference example, we use synthetic numerical data on permeability, DNAPL saturation and downgradient aqueous-phase concentration, and we substantiate our claims about the advantages of emulating a multiphase flow model with reduced computational requirement in the inversion.

  1. Computational identification of the selenocysteine tRNA (tRNASec) in genomes

    PubMed Central

    2017-01-01

    Selenocysteine (Sec) is known as the 21st amino acid, a cysteine analogue with selenium replacing sulphur. Sec is inserted co-translationally in a small fraction of proteins called selenoproteins. In selenoprotein genes, the Sec specific tRNA (tRNASec) drives the recoding of highly specific UGA codons from stop signals to Sec. Although found in organisms from the three domains of life, Sec is not universal. Many species are completely devoid of selenoprotein genes and lack the ability to synthesize Sec. Since tRNASec is a key component in selenoprotein biosynthesis, its efficient identification in genomes is instrumental to characterize the utilization of Sec across lineages. Available tRNA prediction methods fail to accurately predict tRNASec, due to its unusual structural fold. Here, we present Secmarker, a method based on manually curated covariance models capturing the specific tRNASec structure in archaea, bacteria and eukaryotes. We exploited the non-universality of Sec to build a proper benchmark set for tRNASec predictions, which is not possible for the predictions of other tRNAs. We show that Secmarker greatly improves the accuracy of previously existing methods constituting a valuable tool to identify tRNASec genes, and to efficiently determine whether a genome contains selenoproteins. We used Secmarker to analyze a large set of fully sequenced genomes, and the results revealed new insights in the biology of tRNASec, led to the discovery of a novel bacterial selenoprotein family, and shed additional light on the phylogenetic distribution of selenoprotein containing genomes. Secmarker is freely accessible for download, or online analysis through a web server at http://secmarker.crg.cat. PMID:28192430

  2. Computational Identification and Systematic Classification of Novel Cytochrome P450 Genes in Salvia miltiorrhiza

    PubMed Central

    Nelson, David R.; Wu, Kai; Liu, Chang

    2014-01-01

    Salvia miltiorrhiza is one of the most economically important medicinal plants. Cytochrome P450 (CYP450) genes have been implicated in the biosynthesis of its active components. However, only a dozen full-length CYP450 genes have been described, and there is no systematic classification of CYP450 genes in S. miltiorrhiza. We obtained 77,549 unigenes from three tissue types of S. miltiorrhiza using RNA-Seq technology. Combining our data with previously identified CYP450 sequences and scanning with the CYP450 model from Pfam resulted in the identification of 116 full-length and 135 partial-length CYP450 genes. The 116 genes were classified into 9 clans and 38 families using standard criteria. The RNA-Seq results showed that 35 CYP450 genes were co-expressed with CYP76AH1, a marker gene for tanshinone biosynthesis, using r≥0.9 as a cutoff. The expression profiles for 16 of 19 randomly selected CYP450 obtained from RNA-Seq were validated by qRT-PCR. Comparing against the KEGG database, 10 CYP450 genes were found to be associated with diterpenoid biosynthesis. Considering all the evidence, 3 CYP450 genes were identified to be potentially involved in terpenoid biosynthesis. Moreover, we found that 15 CYP450 genes were possibly regulated by antisense transcripts (r≥0.9 or r≤–0.9). Lastly, a web resource (SMCYP450, http://www.herbalgenomics.org/samicyp450) was set up, which allows users to browse, search, retrieve and compare CYP450 genes and can serve as a centralized resource. PMID:25493946
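
    A small sketch of the co-expression screen mentioned above (Pearson correlation against a marker gene with an r >= 0.9 cutoff); the expression matrix and every gene name except CYP76AH1 are placeholders, not data from the study.

        import numpy as np
        import pandas as pd

        rng = np.random.default_rng(2)

        # Illustrative expression matrix: rows = genes, columns = tissue samples.
        expr = pd.DataFrame(rng.normal(size=(50, 6)),
                            index=[f"CYP450_{i}" for i in range(50)],
                            columns=["root", "stem", "leaf", "fl_1", "fl_2", "fl_3"])
        expr.loc["CYP76AH1"] = rng.normal(size=6)
        # One fabricated co-expressed gene so the filter returns something.
        expr.loc["CYP450_7"] = 2.0 * expr.loc["CYP76AH1"] + 0.01 * rng.normal(size=6)

        # Pearson correlation of each gene with the marker, keeping r >= 0.9.
        marker = expr.loc["CYP76AH1"]
        r = expr.apply(lambda row: np.corrcoef(row, marker)[0, 1], axis=1)
        print(r[(r >= 0.9) & (r.index != "CYP76AH1")])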

  3. Computational approaches for identification of conserved/unique binding pockets in the A chain of ricin

    SciTech Connect

    Ecale Zhou, C L; Zemla, A T; Roe, D; Young, M; Lam, M; Schoeniger, J; Balhorn, R

    2005-01-29

    Specific and sensitive ligand-based protein detection assays that employ antibodies or small molecules such as peptides, aptamers, or other small molecules require that the corresponding surface region of the protein be accessible and that there be minimal cross-reactivity with non-target proteins. To reduce the time and cost of laboratory screening efforts for diagnostic reagents, we developed new methods for evaluating and selecting protein surface regions for ligand targeting. We devised combined structure- and sequence-based methods for identifying 3D epitopes and binding pockets on the surface of the A chain of ricin that are conserved with respect to a set of ricin A chains and unique with respect to other proteins. We (1) used structure alignment software to detect structural deviations and extracted from this analysis the residue-residue correspondence, (2) devised a method to compare corresponding residues across sets of ricin structures and structures of closely related proteins, (3) devised a sequence-based approach to determine residue infrequency in local sequence context, and (4) modified a pocket-finding algorithm to identify surface crevices in close proximity to residues determined to be conserved/unique based on our structure- and sequence-based methods. In applying this combined informatics approach to ricin A, we identified a conserved/unique pocket in close proximity to (but not overlapping) the active site that is suitable for bi-dentate ligand development. These methods are generally applicable to the identification of surface epitopes and binding pockets for the development of diagnostic reagents, therapeutics, and vaccines.

  4. A study to establish international diagnostic reference levels for paediatric computed tomography.

    PubMed

    Vassileva, J; Rehani, M; Kostova-Lefterova, D; Al-Naemi, H M; Al Suwaidi, J S; Arandjic, D; Bashier, E H O; Kodlulovich Renha, S; El-Nachef, L; Aguilar, J G; Gershan, V; Gershkevitsh, E; Gruppetta, E; Hustuc, A; Jauhari, A; Kharita, Mohammad Hassan; Khelassi-Toutaoui, N; Khosravi, H R; Khoury, H; Kralik, I; Mahere, S; Mazuoliene, J; Mora, P; Muhogora, W; Muthuvelu, P; Nikodemova, D; Novak, L; Pallewatte, A; Pekarovič, D; Shaaban, M; Shelly, E; Stepanyan, K; Thelsy, N; Visrutaratna, P; Zaman, A

    2015-07-01

    The article reports results from the largest international dose survey in paediatric computed tomography (CT) in 32 countries and proposes international diagnostic reference levels (DRLs) in terms of computed tomography dose index (CTDIvol) and dose length product (DLP). It also assesses whether mean or median values of individual facilities should be used. A total of 6115 individual patient data were recorded among four age groups: <1 y, >1-5 y, >5-10 y and >10-15 y. CTDIw, CTDIvol and DLP from the CT console were recorded in dedicated forms together with patient data and technical parameters. Statistical analysis was performed, and international DRLs were established at rounded 75th percentile values of distribution of median values from all CT facilities. The study presents evidence in favour of using median rather than mean of patient dose indices as the representative of typical local dose in a facility, and for establishing DRLs as third quartile of median values. International DRLs were established for paediatric CT examinations for routine head, chest and abdomen in the four age groups. DRLs for CTDIvol are similar to the reference values from other published reports, with some differences for chest and abdomen CT. Higher variations were observed between DLP values, based on a survey of whole multi-phase exams. It may be noted that other studies in literature were based on single phase only. DRLs reported in this article can be used in countries without sufficient medical physics support to identify non-optimised practice. Recommendations to improve the accuracy and importance of future surveys are provided.
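
    The third-quartile-of-facility-medians convention described above can be written in a few lines; the dose values in this sketch are invented and do not reproduce the survey data.

        import numpy as np

        # Illustrative CTDIvol values (mGy) per facility for one examination type.
        facility_doses = {
            "facility_A": [9.8, 11.2, 10.5, 12.0],
            "facility_B": [14.1, 13.5, 15.2],
            "facility_C": [8.9, 9.4, 10.1, 9.9, 10.8],
        }

        # Typical local dose per facility = median of its patient dose indices.
        medians = [np.median(v) for v in facility_doses.values()]

        # DRL = rounded 75th percentile of the distribution of facility medians.
        drl = round(float(np.percentile(medians, 75)), 1)
        print("proposed DRL (CTDIvol, mGy):", drl)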

  5. Identification and Utilization of Employer Requirements for Entry-Level Health Occupations Workers. Final Report.

    ERIC Educational Resources Information Center

    Zukowski, James J.

    The purpose of a research project was to identify employer expectations regarding entry-level competency requirements for selected assistance-level health occupations. Physicians, dentists, and other health professionals reviewed lists of competencies associated with the performance of assistance-level employees in nursing, medical laboratory, and…

  6. Social Identification as a Determinant of Concerns about Individual-, Group-, and Inclusive-Level Justice

    ERIC Educational Resources Information Center

    Wenzel, Michael

    2004-01-01

    Extending concepts of micro- and macrojustice, three levels of justice are distinguished. Individual-, group-, and inclusive-level justice are defined in terms of the target of justice concerns: one's individual treatment, one's group's treatment, and the distribution in the collective (e.g., nation). Individual-level justice permits a more…

  7. Analysis of the Computer Anxiety Levels of Secondary Technical Education Teachers in West Virginia.

    ERIC Educational Resources Information Center

    Gordon, Howard R. D.

    1995-01-01

    Responses from 116 secondary technical education teachers (91%) who completed Oetting's Computer Anxiety Scale indicated that 46% experienced general computer anxiety. Development of computer and typing skills explained a large portion of the variance. More hands-on training was recommended. (SK)

  8. Computational genomic identification and functional reconstitution of plant natural product biosynthetic pathways

    PubMed Central

    2016-01-01

    Covering: 2003 to 2016. The last decade has seen the first major discoveries regarding the genomic basis of plant natural product biosynthetic pathways. Four key computationally driven strategies have been developed to identify such pathways, which make use of physical clustering, co-expression, evolutionary co-occurrence and epigenomic co-regulation of the genes involved in producing a plant natural product. Here, we discuss how these approaches can be used for the discovery of plant biosynthetic pathways encoded by both chromosomally clustered and non-clustered genes. Additionally, we will discuss opportunities to prioritize plant gene clusters for experimental characterization, and end with a forward-looking perspective on how synthetic biology technologies will allow effective functional reconstitution of candidate pathways using a variety of genetic systems. PMID:27321668

  9. Identification of potential drug targets based on a computational biology algorithm for venous thromboembolism.

    PubMed

    Xie, Ruiqiang; Li, Lei; Chen, Lina; Li, Wan; Chen, Binbin; Jiang, Jing; Huang, Hao; Li, Yiran; He, Yuehan; Lv, Junjie; He, Weiming

    2017-02-01

    Venous thromboembolism (VTE) is a common, fatal and frequently recurrent disease. Changes in the activity of different coagulation factors serve as a pathophysiological basis for the recurrent risk of VTE. Systems biology approaches provide a better understanding of the pathological mechanisms responsible for recurrent VTE. In this study, a novel computational method was presented to identify the recurrent risk modules (RRMs) based on the integration of expression profiles and human signaling network, which hold promise for achieving new and deeper insights into the mechanisms responsible for VTE. The results revealed that the RRMs had good classification performance to discriminate patients with recurrent VTE. The functional annotation analysis demonstrated that the RRMs played a crucial role in the pathogenesis of VTE. Furthermore, a variety of approved drug targets in the RRM M5 were related to VTE. Thus, the M5 may be applied to select potential drug targets for combination therapy and the extended treatment of VTE.

  10. Computational identification of gene over-expression targets for metabolic engineering of taxadiene production.

    PubMed

    Boghigian, Brett A; Armando, John; Salas, Daniel; Pfeifer, Blaine A

    2012-03-01

    Taxadiene is the first dedicated intermediate in the biosynthetic pathway of the anticancer compound Taxol. Recent studies have taken advantage of heterologous hosts to produce taxadiene and other isoprenoid compounds, and such ventures now offer research opportunities that take advantage of the engineering tools associated with the surrogate host. In this study, metabolic engineering was applied in the context of over-expression targets predicted to improve taxadiene production. Identified targets included genes both within and outside of the isoprenoid precursor pathway. These targets were then tested for experimental over-expression in a heterologous Escherichia coli host designed to support isoprenoid biosynthesis. Results confirmed the computationally predicted improvements and indicated a synergy between targets within the expected isoprenoid precursor pathway and those outside this pathway. The presented algorithm is broadly applicable to other host systems and/or product choices.

  11. Key for protein coding sequences identification: computer analysis of codon strategy.

    PubMed Central

    Rodier, F; Gabarro-Arpa, J; Ehrlich, R; Reiss, C

    1982-01-01

    The signal qualifying an AUG or GUG as an initiator in mRNAs processed by E. coli ribosomes is not found to be a systematic, literal homology sequence. In contrast, stability analysis reveals that initiators always occur within nucleic acid domains of low stability, for which a high A/U content is observed. Since no amino acid selection pressure can be detected at the N-termini of the proteins, the A/U enrichment results from a biased usage of the degeneracy of the code. A computer analysis is presented which allows easy detection of this codon strategy. N-terminal codons rather systematically carry A or U in the third position, which suggests a mechanism for translation initiation and helps to detect protein coding sequences in sequenced DNA. PMID:7038623
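
    A toy version of the codon-strategy check described above: the fraction of A/U at the third codon position over the first few codons of a coding sequence. The example mRNA is hypothetical, and this is only a simplified stand-in for the published analysis.

        def third_position_au_fraction(cds, n_codons=20):
            """Fraction of A/U at the third codon position among the first n codons.

            A high value near the 5' end is the kind of biased codon usage the
            analysis above associates with translation-initiation regions.
            """
            codons = [cds[i:i + 3] for i in range(0, min(len(cds), 3 * n_codons), 3)]
            thirds = [c[2] for c in codons if len(c) == 3]
            return sum(base in "AU" for base in thirds) / len(thirds)

        # Hypothetical mRNA coding sequence (not from the paper).
        example = "AUGAAAUUAGCUAAAGAAUUUGCAAAAGGUCUUAAAGAAUAA"
        print(round(third_position_au_fraction(example, n_codons=10), 2))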

  12. Cutting edge: identification of novel T cell epitopes in Lol p5a by computational prediction.

    PubMed

    de Lalla, C; Sturniolo, T; Abbruzzese, L; Hammer, J; Sidoli, A; Sinigaglia, F; Panina-Bordignon, P

    1999-08-15

    Although atopic allergy affects [...] computational prediction of DR ligands might thus allow the design of T cell epitopes with potentially useful applications in novel immunotherapy strategies.

  13. Computer Folding of RNA Tetraloops: Identification of Key Force Field Deficiencies.

    PubMed

    Kührová, Petra; Best, Robert B; Bottaro, Sandro; Bussi, Giovanni; Šponer, Jiří; Otyepka, Michal; Banáš, Pavel

    2016-09-13

    The computer-aided folding of biomolecules, particularly RNAs, is one of the most difficult challenges in computational structural biology. RNA tetraloops are fundamental RNA motifs playing key roles in RNA folding and RNA-RNA and RNA-protein interactions. Although state-of-the-art Molecular Dynamics (MD) force fields correctly describe the native state of these tetraloops as a stable free-energy basin on the microsecond time scale, enhanced sampling techniques reveal that the native state is not the global free energy minimum, suggesting yet unidentified significant imbalances in the force fields. Here, we tested our ability to fold the RNA tetraloops in various force fields and simulation settings. We employed three different enhanced sampling techniques, namely, temperature replica exchange MD (T-REMD), replica exchange with solute tempering (REST2), and well-tempered metadynamics (WT-MetaD). We aimed to separate problems caused by limited sampling from those due to force-field inaccuracies. We found that none of the contemporary force fields is able to correctly describe folding of the 5'-GAGA-3' tetraloop over a range of simulation conditions. We thus aimed to identify which terms of the force field are responsible for this poor description of TL folding. We showed that at least two different imbalances contribute to this behavior, namely, overstabilization of base-phosphate and/or sugar-phosphate interactions and underestimated stability of the hydrogen bonding interaction in base pairing. The first artifact stabilizes the unfolded ensemble, while the second one destabilizes the folded state. The former problem might be partially alleviated by reparametrization of the van der Waals parameters of the phosphate oxygens suggested by Case et al., while in order to overcome the latter effect we suggest local potentials to better capture hydrogen bonding interactions.

  14. Computationally Guided Identification of Novel Mycobacterium tuberculosis GlmU Inhibitory Leads, Their Optimization, and in Vitro Validation.

    PubMed

    Mehra, Rukmankesh; Rani, Chitra; Mahajan, Priya; Vishwakarma, Ram Ashrey; Khan, Inshad Ali; Nargotra, Amit

    2016-02-08

    Mycobacterium tuberculosis (Mtb) infections are causing serious health concerns worldwide. Antituberculosis drug resistance threatens the current therapies and causes further need to develop effective antituberculosis therapy. GlmU represents an interesting target for developing novel Mtb drug candidates. It is a bifunctional acetyltransferase/uridyltransferase enzyme that catalyzes the biosynthesis of UDP-N-acetyl-glucosamine (UDP-GlcNAc) from glucosamine-1-phosphate (GlcN-1-P). UDP-GlcNAc is a substrate for the biosynthesis of lipopolysaccharide and peptidoglycan that are constituents of the bacterial cell wall. In the current study, structure and ligand based computational models were developed and rationally applied to screen a drug-like compound repository of 20,000 compounds procured from ChemBridge DIVERSet database for the identification of probable inhibitors of Mtb GlmU. The in vitro evaluation of the in silico identified inhibitor candidates resulted in the identification of 15 inhibitory leads of this target. Literature search of these leads through SciFinder and their similarity analysis with the PubChem training data set (AID 1376) revealed the structural novelty of these hits with respect to Mtb GlmU. IC50 of the most potent identified inhibitory lead (5810599) was found to be 9.018 ± 0.04 μM. Molecular dynamics (MD) simulation of this inhibitory lead (5810599) in complex with protein affirms the stability of the lead within the binding pocket and also emphasizes on the key interactive residues for further designing. Binding site analysis of the acetyltransferase pocket with respect to the identified structural moieties provides a thorough analysis for carrying out the lead optimization studies.

  15. Perceptions of teaching and learning automata theory in a college-level computer science course

    NASA Astrophysics Data System (ADS)

    Weidmann, Phoebe Kay

    This dissertation identifies and describes student and instructor perceptions that contribute to effective teaching and learning of Automata Theory in a competitive college-level Computer Science program. Effective teaching is the ability to create an appropriate learning environment in order to provide effective learning. We define effective learning as the ability of a student to meet instructor set learning objectives, demonstrating this by passing the course, while reporting a good learning experience. We conducted our investigation through a detailed qualitative case study of two sections (118 students) of Automata Theory (CS 341) at The University of Texas at Austin taught by Dr. Lily Quilt. Because Automata Theory has a fixed curriculum in the sense that many curricula and textbooks agree on what Automata Theory contains, differences being depth and amount of material to cover in a single course, a case study would allow for generalizable findings. Automata Theory is especially problematic in a Computer Science curriculum since students are not experienced in abstract thinking before taking this course, fail to understand the relevance of the theory, and prefer classes with more concrete activities such as programming. This creates a special challenge for any instructor of Automata Theory as motivation becomes critical for student learning. Through the use of student surveys, instructor interviews, classroom observation, material and course grade analysis we sought to understand what students perceived, what instructors expected of students, and how those perceptions played out in the classroom in terms of structure and instruction. Our goal was to create suggestions that would lead to a better designed course and thus a higher student success rate in Automata Theory. We created a unique theoretical basis, pedagogical positivism, on which to study college-level courses. Pedagogical positivism states that through examining instructor and student perceptions

  16. LEVEL: A computer program for solving the radial Schrödinger equation for bound and quasibound levels

    NASA Astrophysics Data System (ADS)

    Le Roy, Robert J.

    2017-01-01

    This paper describes program LEVEL, which can solve the radial or one-dimensional Schrödinger equation and automatically locate either all of, or a selected number of, the bound and/or quasibound levels of any smooth single- or double-minimum potential, and calculate inertial rotation and centrifugal distortion constants and various expectation values for those levels. It can also calculate Franck-Condon factors and other off-diagonal matrix elements, either between levels of a single potential or between levels of two different potentials. The potential energy function may be defined by any one of a number of analytic functions, or by a set of input potential function values which the code will interpolate over and extrapolate beyond to span the desired range.
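
    As a very rough illustration (this is not LEVEL, which is far more capable and accurate), bound levels of a one-dimensional potential can be located by diagonalizing a finite-difference Hamiltonian on a grid; the reduced mass and Morse parameters below are arbitrary and given in atomic units.

        import numpy as np

        # Finite-difference Hamiltonian H = -(1/2mu) d^2/dr^2 + V(r) on a grid.
        mu = 1000.0                                  # illustrative reduced mass
        r = np.linspace(0.3, 15.0, 1500)
        h = r[1] - r[0]

        # Illustrative Morse potential with its dissociation limit at zero energy.
        De, a, re = 0.1, 1.0, 2.0
        V = De * (1.0 - np.exp(-a * (r - re))) ** 2 - De

        diag = 1.0 / (mu * h**2) + V
        off = -0.5 / (mu * h**2) * np.ones(len(r) - 1)
        E = np.linalg.eigvalsh(np.diag(diag) + np.diag(off, 1) + np.diag(off, -1))

        bound = E[E < 0.0]                           # levels below dissociation
        print(f"{len(bound)} bound levels; lowest: {bound[:3].round(5)}")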

  17. Computational identification of genetic subnetwork modules associated with maize defense response to Fusarium verticillioides

    PubMed Central

    2015-01-01

    Background: Maize, a crop of global significance, is vulnerable to a variety of biotic stresses resulting in economic losses. Fusarium verticillioides (teleomorph Gibberella moniliformis) is one of the key fungal pathogens of maize, causing ear rots and stalk rots. To better understand the genetic mechanisms involved in maize defense as well as F. verticillioides virulence, a systematic investigation of the host-pathogen interaction is needed. The aim of this study was to computationally identify potential maize subnetwork modules associated with its defense response against F. verticillioides. Results: We obtained time-course RNA-seq data from B73 maize inoculated with wild type F. verticillioides and a loss-of-virulence mutant, and subsequently established a computational pipeline for network-based comparative analysis. Specifically, we first analyzed the RNA-seq data by a cointegration-correlation-expression approach, where maize genes were jointly analyzed with known F. verticillioides virulence genes to find candidate maize genes likely associated with the defense mechanism. We predicted maize co-expression networks around the selected maize candidate genes based on partial correlation, and subsequently searched for subnetwork modules that were differentially activated when inoculated with two different fungal strains. Based on our analysis pipeline, we identified four potential maize defense subnetwork modules. Two were directly associated with maize defense response and were associated with significant GO terms such as GO:0009817 (defense response to fungus) and GO:0009620 (response to fungus). The other two predicted modules were indirectly involved in the defense response, where the most significant GO terms associated with these modules were GO:0046914 (transition metal ion binding) and GO:0046686 (response to cadmium ion). Conclusion: Through our RNA-seq data analysis, we have shown that a network-based approach can enhance our understanding of the
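
    A standard way to obtain the partial correlations underlying such a co-expression network is via the inverse covariance (precision) matrix, as in the sketch below; the data are random and the details of the authors' own pipeline may differ.

        import numpy as np

        # Illustrative expression matrix: 40 samples x 8 genes (random values).
        rng = np.random.default_rng(3)
        X = rng.normal(size=(40, 8))

        # Partial correlations from the precision matrix:
        # rho_ij = -P_ij / sqrt(P_ii * P_jj), conditioning on all other genes.
        P = np.linalg.inv(np.cov(X, rowvar=False))
        d = np.sqrt(np.diag(P))
        partial_corr = -P / np.outer(d, d)
        np.fill_diagonal(partial_corr, 1.0)

        # Keep only stronger conditional dependencies as network edges
        # (the 0.3 cutoff is arbitrary, for illustration only).
        edges = np.argwhere(np.triu(np.abs(partial_corr) > 0.3, k=1))
        print("edges (gene index pairs):", edges.tolist())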

  18. Computational identification of RNA functional determinants by three-dimensional quantitative structure–activity relationships

    PubMed Central

    Blanchet, Marc-Frédérick; St-Onge, Karine; Lisi, Véronique; Robitaille, Julie; Hamel, Sylvie; Major, François

    2014-01-01

    Anti-infection drugs target vital functions of infectious agents, including their ribosome and other essential non-coding RNAs. One of the reasons infectious agents become resistant to drugs is due to mutations that eliminate drug-binding affinity while maintaining vital elements. Identifying these elements is based on the determination of viable and lethal mutants and associated structures. However, determining the structure of enough mutants at high resolution is not always possible. Here, we introduce a new computational method, MC-3DQSAR, to determine the vital elements of target RNA structure from mutagenesis and available high-resolution data. We applied the method to further characterize the structural determinants of the bacterial 23S ribosomal RNA sarcin–ricin loop (SRL), as well as those of the lead-activated and hammerhead ribozymes. The method was accurate in confirming experimentally determined essential structural elements and predicting the viability of new SRL variants, which were either observed in bacteria or validated in bacterial growth assays. Our results indicate that MC-3DQSAR could be used systematically to evaluate the drug-target potentials of any RNA sites using current high-resolution structural data. PMID:25200082

  19. Identification of natural allosteric inhibitor for Akt1 protein through computational approaches and in vitro evaluation.

    PubMed

    Pragna Lakshmi, T; Kumar, Amit; Vijaykumar, Veena; Natarajan, Sakthivel; Krishna, Ramadas

    2017-03-01

    Akt, a serine/threonine protein kinase, is often hyperactivated in breast and prostate cancers and is associated with poor prognosis. Allosteric inhibitors regulate aberrant kinase activity by stabilizing the protein in an inactive conformation. Several natural compounds have been reported as kinase inhibitors. In this study, to identify a potential natural allosteric inhibitor of Akt1, we generated a seven-point pharmacophore model and screened it against a natural compound library. Quercetin-7-O-β-d-glucopyranoside (Q7G) was found to be the best among the selected molecules based on its hydrogen bond occupancy with key allosteric residues, its persistent polar contacts and salt bridges that stabilize Akt1 in the inactive conformation, and its minimum binding free energy during molecular dynamics simulation. Q7G induced dose-dependent inhibition of breast cancer cells (MDA MB-231) and arrested them in the G1 and sub-G phases. This was associated with down-regulation of the anti-apoptotic protein Bcl-2 and up-regulation of cleaved caspase-3 and PARP. Expression of p-Akt (Ser473) was also down-regulated, which might be due to Akt1 inhibition in the inactive conformation. We further confirmed the Akt1-Q7G interaction, which was observed to have a dissociation constant (Kd) of 0.246 μM. With these computational, biological and thermodynamic studies, we suggest Q7G as a lead molecule and propose its further optimization.

  20. Computational identification of microRNAs and their targets in cassava (Manihot esculenta Crantz.).

    PubMed

    Patanun, Onsaya; Lertpanyasampatha, Manassawe; Sojikul, Punchapat; Viboonjun, Unchera; Narangajavana, Jarunya

    2013-03-01

    MicroRNAs (miRNAs) are a newly discovered class of noncoding endogenous small RNAs involved in plant growth and development as well as in the response to environmental stresses. miRNAs have been extensively studied in various plant species; however, only limited information is available for cassava, which serves as a staple food crop, a biofuel crop, animal feed and an industrial raw material. In this study, 169 potential cassava miRNAs belonging to 34 miRNA families were identified by a computational approach. Interestingly, mes-miR319b was identified as the first putative mirtron demonstrated in cassava. A total of 15 miRNA clusters involving 7 miRNA families, and 12 pairs of sense and antisense strand cassava miRNAs belonging to six different miRNA families, were discovered. Prediction of potential miRNA target genes revealed functions involved in various important plant biological processes. Cis-regulatory elements relevant to drought stress and plant hormone response were identified in the promoter regions of these miRNA genes. The results provide a foundation for further investigation of the functional role of known transcription factors in the regulation of cassava miRNAs. A better understanding of the complexity of the miRNA-mediated gene network in cassava would help unravel its complex biology in storage root development and in coping with environmental stresses, thus providing more insight for future exploitation in cassava improvement.

  1. Toward accurate tooth segmentation from computed tomography images using a hybrid level set model

    SciTech Connect

    Gan, Yangzhou; Zhao, Qunfei; Xia, Zeyang E-mail: jing.xiong@siat.ac.cn; Hu, Ying; Xiong, Jing E-mail: jing.xiong@siat.ac.cn; Zhang, Jianwei

    2015-01-15

    Purpose: A three-dimensional (3D) model of the teeth provides important information for orthodontic diagnosis and treatment planning. Tooth segmentation is an essential step in generating the 3D digital model from computed tomography (CT) images. The aim of this study is to develop an accurate and efficient tooth segmentation method from CT images. Methods: The 3D dental CT volumetric images are segmented slice by slice in a two-dimensional (2D) transverse plane. The 2D segmentation is composed of a manual initialization step and an automatic slice by slice segmentation step. In the manual initialization step, the user manually picks a starting slice and selects a seed point for each tooth in this slice. In the automatic slice segmentation step, a developed hybrid level set model is applied to segment tooth contours from each slice. Tooth contour propagation strategy is employed to initialize the level set function automatically. Cone beam CT (CBCT) images of two subjects were used to tune the parameters. Images of 16 additional subjects were used to validate the performance of the method. Volume overlap metrics and surface distance metrics were adopted to assess the segmentation accuracy quantitatively. The volume overlap metrics were volume difference (VD, mm³) and Dice similarity coefficient (DSC, %). The surface distance metrics were average symmetric surface distance (ASSD, mm), RMS (root mean square) symmetric surface distance (RMSSSD, mm), and maximum symmetric surface distance (MSSD, mm). Computation time was recorded to assess the efficiency. The performance of the proposed method has been compared with two state-of-the-art methods. Results: For the tested CBCT images, the VD, DSC, ASSD, RMSSSD, and MSSD for the incisor were 38.16 ± 12.94 mm³, 88.82 ± 2.14%, 0.29 ± 0.03 mm, 0.32 ± 0.08 mm, and 1.25 ± 0.58 mm, respectively; the VD, DSC, ASSD, RMSSSD, and MSSD for the canine were 49.12 ± 9.33 mm³, 91.57 ± 0.82%, 0.27 ± 0.02 mm, 0
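
    The two volume overlap metrics quoted above are straightforward to compute from binary masks, as in the illustrative sketch below; the toy masks and voxel size are made up.

        import numpy as np

        def dice_and_volume_difference(seg, ref, voxel_volume_mm3=1.0):
            """Dice similarity coefficient (%) and volume difference (mm^3)
            between a binary segmentation and a binary reference mask."""
            seg, ref = seg.astype(bool), ref.astype(bool)
            dsc = 200.0 * np.logical_and(seg, ref).sum() / (seg.sum() + ref.sum())
            vd = abs(int(seg.sum()) - int(ref.sum())) * voxel_volume_mm3
            return dsc, vd

        # Toy 3D masks standing in for a segmented tooth and its ground truth.
        ref = np.zeros((30, 30, 30), dtype=bool); ref[8:22, 8:22, 8:22] = True
        seg = np.zeros_like(ref); seg[9:23, 8:22, 8:22] = True
        print(dice_and_volume_difference(seg, ref, voxel_volume_mm3=0.125))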

  2. A component-level failure detection and identification algorithm based on open-loop and closed-loop state estimators

    NASA Astrophysics Data System (ADS)

    You, Seung-Han; Cho, Young Man; Hahn, Jin-Oh

    2013-04-01

    This study presents a component-level failure detection and identification (FDI) algorithm for a cascade mechanical system subsuming a plant driven by an actuator unit. The novelty of the FDI algorithm presented in this study is that it is able to discriminate failure occurring in the actuator unit, the sensor measuring the output of the actuator unit, and the plant driven by the actuator unit. The proposed FDI algorithm exploits the measurement of the actuator unit output together with its estimates generated by open-loop (OL) and closed-loop (CL) estimators to enable FDI at the component's level. In this study, the OL estimator is designed based on the system identification of the actuator unit. The CL estimator, which is guaranteed to be stable against variations in the plant, is synthesized based on the dynamics of the entire cascade system. The viability of the proposed algorithm is demonstrated using a hardware-in-the-loop simulation (HILS), which shows that it can detect and identify target failures reliably in the presence of plant uncertainties.
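
    A toy sketch of residual-based detection in the spirit of the open-loop/closed-loop estimator comparison described above; the decision table and threshold here are invented for illustration and are not the paper's algorithm.

        import numpy as np

        def classify_failure(y_meas, y_ol, y_cl, tol=0.05):
            """Compare the measured actuator output with open-loop and closed-loop
            estimates and threshold the residuals (illustrative logic only)."""
            r_ol = np.mean(np.abs(y_meas - y_ol))   # residual vs. open-loop model
            r_cl = np.mean(np.abs(y_meas - y_cl))   # residual vs. closed-loop observer
            if r_ol < tol and r_cl < tol:
                return "no failure detected"
            if r_ol >= tol and r_cl >= tol:
                return "sensor or actuator failure suspected"
            return "plant-side failure suspected"

        t = np.linspace(0, 1, 200)
        y_true = np.sin(2 * np.pi * t)
        print(classify_failure(y_true + 0.2, y_true, y_true))   # constant sensor bias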

  3. Genotypic and Phenotypic Applications for the Differentiation and Species-Level Identification of Achromobacter for Clinical Diagnoses

    PubMed Central

    Gomila, Margarita; Prince-Manzano, Claudia; Svensson-Stadler, Liselott; Busquets, Antonio; Erhard, Marcel; Martínez, Deny L.; Lalucat, Jorge; Moore, Edward R. B.

    2014-01-01

    Achromobacter is a genus in the family Alcaligenaceae, comprising fifteen species isolated from different sources, including clinical samples. The ability to detect and correctly identify Achromobacter species, particularly A. xylosoxidans, and to differentiate them from other phenotypically similar and genotypically related Gram-negative, aerobic, non-fermenting species is important for patients with cystic fibrosis (CF), as well as for nosocomial and other opportunistic infections. Traditional phenotypic profile-based analyses have been demonstrated to be inadequate for reliable identification of Achromobacter isolates, and genotypic assays relying upon comparative 16S rRNA gene sequence analyses are not able to ensure definitive identification of Achromobacter species, due to the inherently conserved nature of the gene. Alternative methodologies that enable high-resolution differentiation between the species in the genus are needed. A comparative multi-locus sequence analysis (MLSA) of four selected ‘house-keeping’ genes (atpD, gyrB, recA, and rpoB) assessed the individual gene sequences for their potential in developing a reliable, rapid and cost-effective diagnostic protocol for Achromobacter species identification. The analysis of the type strains of the species of the genus and 46 strains of Achromobacter species showed congruence between the cluster analyses derived from the individual genes. The MLSA gene sequences exhibited different levels of resolution in delineating the validly published Achromobacter species and elucidated strains that represent new genotypes and probable new species of the genus. Our results also suggest that the recently described A. spritinus is a later heterotypic synonym of A. marplatensis. Strains were analyzed using whole-cell Matrix-Assisted Laser Desorption/Ionization Time-Of-Flight mass spectrometry (MALDI-TOF MS), as an alternative phenotypic profile-based method with the potential to

  4. Mechanical Behaviour of Light Metal Alloys at High Strain Rates. Computer Simulation on Mesoscale Levels

    NASA Astrophysics Data System (ADS)

    Skripnyak, Vladimir; Skripnyak, Evgeniya; Meyer, Lothar W.; Herzig, Norman; Skripnyak, Nataliya

    2012-02-01

    Research in recent years has established that the laws of deformation and fracture of bulk ultrafine-grained and coarse-grained materials differ under both static and dynamic loading conditions. The development of adequate constitutive equations describing the mechanical behavior of bulk ultrafine-grained materials under intense dynamic loading is complicated by insufficient knowledge of the general rules governing inelastic deformation and the nucleation and growth of cracks. A multi-scale computational model was used to investigate the deformation and fracture of bulk structured aluminum and magnesium alloys under stress pulse loading at the mesoscale level. The increment of plastic deformation is defined as the sum of the increments caused by the nucleation and glide of dislocations, twinning, meso-block movement, and grain boundary sliding. The model takes into account the influence of the average grain size, the grain size distribution, and the concentration of precipitates on the mechanical properties of the alloys. It was found that the nucleation and glide of dislocations caused a higher attenuation rate of the elastic precursor in ultrafine-grained alloys than in their coarse-grained counterparts.

  5. Extended gray level co-occurrence matrix computation for 3D image volume

    NASA Astrophysics Data System (ADS)

    Salih, Nurulazirah M.; Dewi, Dyah Ekashanti Octorina

    2017-02-01

    The Gray Level Co-occurrence Matrix (GLCM) is one of the main techniques for texture analysis and has been widely used in many applications. Conventional GLCMs usually focus on two-dimensional (2D) image texture analysis only. However, a three-dimensional (3D) image volume requires a specific texture analysis computation. In this paper, an extended 2D to 3D GLCM approach is proposed, based on the concept of multiple 2D plane positions and pixel orientation directions in the 3D environment. The algorithm was implemented by breaking down the 3D image volume into 2D slices based on five different plane positions (coordinate axes and oblique axes), resulting in 13 independent directions, and then calculating the GLCMs. The resulting GLCMs were averaged to obtain normalized values, and the 3D texture features were then calculated. A preliminary examination was performed on a 3D image volume (64 x 64 x 64 voxels). Our analysis confirmed that the proposed technique is capable of extracting 3D texture features with the extended GLCM approach. It is a simple and comprehensive technique that can contribute to 3D image analysis.
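
    A simplified stand-in for the extended GLCM idea, computing a co-occurrence matrix directly in 3D for a handful of offset directions and averaging them; the direction subset, quantization and the single texture feature below are illustrative choices, not the paper's exact 13-direction scheme.

        import numpy as np

        def glcm_3d(volume, offset, levels=8):
            """Gray level co-occurrence matrix of a 3D volume for one (dz, dy, dx) offset."""
            q = (volume.astype(float) / volume.max() * (levels - 1)).astype(int)
            dz, dy, dx = offset
            a = q[max(0, -dz):q.shape[0] - max(0, dz),
                  max(0, -dy):q.shape[1] - max(0, dy),
                  max(0, -dx):q.shape[2] - max(0, dx)]
            b = q[max(0, dz):q.shape[0] - max(0, -dz),
                  max(0, dy):q.shape[1] - max(0, -dy),
                  max(0, dx):q.shape[2] - max(0, -dx)]
            glcm = np.zeros((levels, levels))
            np.add.at(glcm, (a.ravel(), b.ravel()), 1)
            return glcm / glcm.sum()

        volume = np.random.default_rng(4).integers(0, 256, size=(64, 64, 64))
        directions = [(0, 0, 1), (0, 1, 0), (1, 0, 0), (1, 1, 1)]   # subset of the 13
        mean_glcm = np.mean([glcm_3d(volume, d) for d in directions], axis=0)
        contrast = (mean_glcm * np.subtract.outer(np.arange(8), np.arange(8)) ** 2).sum()
        print("contrast:", contrast)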

  6. Radiation safety concerns and diagnostic reference levels for computed tomography scanners in Tamil Nadu.

    PubMed

    Livingstone, Roshan S; Dinakaran, Paul M

    2011-01-01

    Radiation safety in computed tomography (CT) scanners is of concern due to their widespread use in the field of radiological imaging. This study intends to evaluate the radiation doses imparted to patients undergoing thorax, abdomen and pelvic CT examinations and to formulate regional diagnostic reference levels (DRLs) in Tamil Nadu, South India. On-site CT dose measurement was performed on 127 CT scanners in Tamil Nadu over a period of 2 years as part of an Atomic Energy Regulatory Board (AERB)-funded project. Of the 127 CT scanners, 13 were conventional, 53 were single-slice helical scanners (SSHS), 44 were multislice CT (MSCT) scanners, and 17 were refurbished scanners. The CT dose index (CTDI) was measured using a 32-cm polymethyl methacrylate (PMMA) body phantom in each CT scanner. The dose length product (DLP) for different anatomical regions was generated using the CTDI values. The regional DRLs for thorax, abdomen and pelvis examinations were 557, 521 and 294 mGy cm, respectively. The mean effective dose was estimated using the DLP values and was found to be 8.04, 6.69 and 4.79 mSv for thorax, abdomen and pelvic CT examinations, respectively. The establishment of DRLs in this study is the first step towards the optimization of CT doses in the Indian context.

  7. Validation of a computer-aided diagnosis system for the automatic identification of carotid atherosclerosis.

    PubMed

    Bonanno, Lilla; Marino, Silvia; Bramanti, Placido; Sottile, Fabrizio

    2015-02-01

    Carotid atherosclerosis represents one of the most important causes of brain stroke. The degree of carotid stenosis is, to date, considered one of the most important features for determining the risk of brain stroke. Ultrasound (US) is a non-invasive, relatively inexpensive, portable technique with excellent temporal resolution. Computer-aided diagnosis (CAD) has become one of the major research fields in medical and diagnostic imaging. We studied US images of 44 patients, 22 with and 22 without carotid artery stenosis, using US examination and applying a CAD system, a prototype software tool that automatically detects carotid plaques. We obtained 287 regions: 60 were classified as plaques, with an average signal echogenicity of 244.1 ± 20.0, and 227 were classified as non-plaques, with an average signal echogenicity of 193.8 ± 38.6, compared with the opinion of an expert neurologist (golden test). Receiver operating characteristic (ROC) analysis revealed an area under the ROC curve that differed highly significantly from 0.5 (null hypothesis) in the discrimination between plaques and non-plaques; the diagnostic accuracy was 89% (95% CI: 0.85-0.92), with an appropriate cut-off value of 236.8, a sensitivity of 83% and a specificity of 85%. The experimental results showed that the proposed method is feasible and in good agreement with the expert neurologist. Without the need for any user interaction, this method generates a detection output that may be useful as a second opinion.
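
    An echogenicity cut-off of the kind reported above is typically read off an ROC curve; the sketch below picks a threshold with the Youden index on simulated scores (the data are generated from the reported means and spreads purely for illustration and do not reproduce the study's measurements or its exact cut-off rule).

        import numpy as np
        from sklearn.metrics import roc_curve, roc_auc_score

        # Simulated echogenicity scores for plaque (1) and non-plaque (0) regions.
        rng = np.random.default_rng(5)
        scores = np.concatenate([rng.normal(244, 20, 60), rng.normal(194, 39, 227)])
        labels = np.concatenate([np.ones(60), np.zeros(227)])

        fpr, tpr, thresholds = roc_curve(labels, scores)
        youden = tpr - fpr                      # one common rule for picking a cutoff
        best = np.argmax(youden)
        print("AUC:", round(roc_auc_score(labels, scores), 3))
        print("cutoff:", round(thresholds[best], 1),
              "sensitivity:", round(tpr[best], 2),
              "specificity:", round(1 - fpr[best], 2))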

  8. Identification of novel microRNAs in Hevea brasiliensis and computational prediction of their targets

    PubMed Central

    2012-01-01

    Background Plants respond to external stimuli through fine regulation of gene expression, partially ensured by small RNAs. Of these, microRNAs (miRNAs) play a crucial role. They negatively regulate gene expression by targeting the cleavage or translational inhibition of target messenger RNAs (mRNAs). In Hevea brasiliensis, environmental and harvesting stresses are known to affect natural rubber production. This study set out to identify abiotic stress-related miRNAs in Hevea using next-generation sequencing and bioinformatic analysis. Results Deep sequencing of small RNAs was carried out on plantlets subjected to severe abiotic stress using the Solexa technique. By combining the LeARN pipeline, data from the Plant microRNA database (PMRD) and Hevea EST sequences, we identified 48 conserved miRNA families already characterized in other plant species, and 10 putatively novel miRNA families. The results showed the most abundant size for miRNAs to be 24 nucleotides, except for seven families. Several MIR genes produced both 20-22 nucleotide and 23-27 nucleotide miRNAs. The two miRNA size classes were detected for both conserved and putative novel miRNA families, suggesting their functional duality. The EST databases were scanned with conserved and novel miRNA sequences. MiRNA targets were computationally predicted and analysed. The predicted targets involved in "response to stimuli", "antioxidant activity" and "transcription activity" are presented. Conclusions Deep sequencing of small RNAs combined with transcriptomic data is a powerful tool for identifying conserved and novel miRNAs when the complete genome is not yet available. Our study provided additional information for evolutionary studies and revealed potentially specific regulation of the control of redox status in Hevea. PMID:22330773

  9. Identification of a potential conformationally disordered mesophase in a small molecule: experimental and computational approaches.

    PubMed

    Chakravarty, Paroma; Bates, Simon; Thomas, Leonard

    2013-08-05

    GNE068, a small organic molecule, was obtained as an amorphous form (GNE068-A) after isolation from ethanol and as a partially disordered form (GNE068-PC) from ethyl acetate. On subsequent characterization, GNE068-PC exhibited a number of properties that were anomalous for a two phase crystalline-amorphous system but consistent with the presence of a solid state phase having intermediate order (mesomorphous). Modulated DSC measurements of GNE068-PC revealed an overlapping endotherm and glass transition in the 135-145 °C range. ΔH of the endotherm showed strong heating rate dependence. Variable temperature XRPD (25-160 °C) revealed structure loss in GNE068-PC, suggesting the endotherm to be an "apparent melt". In addition, gentle grinding of GNE068-PC in a mortar led to a marked decrease in XRPD peak intensities, indicating a "soft" crystalline lattice. Computational analysis of XRPD data revealed the presence of two noncrystalline contributions, one of which was associated with GNE068-A. The second was a variable component that could be modeled as diffuse scattering from local disorder within the associated crystal structure, suggesting a mesomorphous system. Owing to the dominance of the noncrystalline diffuse scattering in GNE068-PC and the observed lattice deformation, the mesomorphous phase exhibited properties consistent with a conformationally disordered mesophase. Because of the intimate association of the residual solvent (ethyl acetate) with the lattice long-range order, loss of solvent on heating through the glass transition temperature of the local disorder caused irrecoverable loss of the long-range order. This precluded the observation of characteristic thermodynamic mesophase behavior above the glass transition temperature.

  10. ENHANCING THE ATOMIC-LEVEL UNDERSTANDING OF CO2 MINERAL SEQUESTRATION MECHANISMS VIA ADVANCED COMPUTATIONAL MODELING

    SciTech Connect

    A.V.G. Chizmeshya; M.J. McKelvy; G.H. Wolf; R.W. Carpenter; D.A. Gormley; J.R. Diefenbacher; R. Marzke

    2006-03-01

    significantly improved our understanding of mineral carbonation. Group members at the Albany Research Center have recently shown that carbonation of olivine and serpentine, which naturally occurs over geological time (i.e., 100,000s of years), can be accelerated to near completion in hours. Further process refinement will require a synergetic science/engineering approach that emphasizes simultaneous investigation of both thermodynamic processes and the detailed microscopic, atomic-level mechanisms that govern carbonation kinetics. Our previously funded Phase I Innovative Concepts project demonstrated the value of advanced quantum-mechanical modeling as a complementary tool in bridging important gaps in our understanding of the atomic/molecular structure and reaction mechanisms that govern CO2 mineral sequestration reaction processes for the model Mg-rich lamellar hydroxide feedstock material Mg(OH)2. In the present simulation project, improved techniques and more efficient computational schemes have allowed us to expand and augment these capabilities and explore more complex Mg-rich, lamellar hydroxide-based feedstock materials, including the serpentine-based minerals. These feedstock materials are being actively investigated due to their wide availability, and low-cost CO2 mineral sequestration potential. Cutting-edge first principles quantum chemical, computational solid-state and materials simulation methodology studies proposed herein, have been strategically integrated with our new DOE supported (ASU-Argonne National Laboratory) project to investigate the mechanisms that govern mineral feedstock heat-treatment and aqueous/fluid-phase serpentine mineral carbonation in situ. This unified, synergetic theoretical and experimental approach has provided a deeper understanding of the key reaction mechanisms than either individual approach can alone. We used ab initio techniques to significantly advance our understanding of atomic-level processes at the solid/solution interface by

  11. ENHANCING THE ATOMIC-LEVEL UNDERSTANDING OF CO2 MINERAL SEQUESTRATION MECHANISMS VIA ADVANCED COMPUTATIONAL MODELING

    SciTech Connect

    A.V.G. Chizmeshya

    2003-12-19

    /NETL managed National Mineral Sequestration Working Group we have already significantly improved our understanding of mineral carbonation. Group members at the Albany Research Center have recently shown that carbonation of olivine and serpentine, which naturally occurs over geological time (i.e., 100,000s of years), can be accelerated to near completion in hours. Further process refinement will require a synergetic science/engineering approach that emphasizes simultaneous investigation of both thermodynamic processes and the detailed microscopic, atomic-level mechanisms that govern carbonation kinetics. Our previously funded Phase I Innovative Concepts project demonstrated the value of advanced quantum-mechanical modeling as a complementary tool in bridging important gaps in our understanding of the atomic/molecular structure and reaction mechanisms that govern CO{sub 2} mineral sequestration reaction processes for the model Mg-rich lamellar hydroxide feedstock material Mg(OH){sub 2}. In the present simulation project, improved techniques and more efficient computational schemes have allowed us to expand and augment these capabilities and explore more complex Mg-rich, lamellar hydroxide-based feedstock materials, including the serpentine-based minerals. These feedstock materials are being actively investigated due to their wide availability, and low-cost CO{sub 2} mineral sequestration potential. Cutting-edge first principles quantum chemical, computational solid-state and materials simulation methodology studies proposed herein, have been strategically integrated with our new DOE supported (ASU-Argonne National Laboratory) project to investigate the mechanisms that govern mineral feedstock heat-treatment and aqueous/fluid-phase serpentine mineral carbonation in situ. This unified, synergetic theoretical and experimental approach will provide a deeper understanding of the key reaction mechanisms than either individual approach can alone. Ab initio techniques will also

  12. ENHANCING THE ATOMIC-LEVEL UNDERSTANDING OF CO2 MINERAL SEQUESTRATION MECHANISMS VIA ADVANCED COMPUTATIONAL MODELING

    SciTech Connect

    A.V.G. Chizmeshya

    2002-12-19

    /NETL managed National Mineral Sequestration Working Group we have already significantly improved our understanding of mineral carbonation. Group members at the Albany Research Center have recently shown that carbonation of olivine and serpentine, which naturally occurs over geological time (i.e., 100,000s of years), can be accelerated to near completion in hours. Further process refinement will require a synergetic science/engineering approach that emphasizes simultaneous investigation of both thermodynamic processes and the detailed microscopic, atomic-level mechanisms that govern carbonation kinetics. Our previously funded Phase I Innovative Concepts project demonstrated the value of advanced quantum-mechanical modeling as a complementary tool in bridging important gaps in our understanding of the atomic/molecular structure and reaction mechanisms that govern CO{sub 2} mineral sequestration reaction processes for the model Mg-rich lamellar hydroxide feedstock material Mg(OH){sub 2}. In the present simulation project, improved techniques and more efficient computational schemes have allowed us to expand and augment these capabilities and explore more complex Mg-rich, lamellar hydroxide-based feedstock materials, including the serpentine-based minerals. These feedstock materials are being actively investigated due to their wide availability, and low-cost CO{sub 2} mineral sequestration potential. Cutting-edge first principles quantum chemical, computational solid-state and materials simulation methodology studies proposed herein, have been strategically integrated with our new DOE supported (ASU-Argonne National Laboratory) project to investigate the mechanisms that govern mineral feedstock heat-treatment and aqueous/fluid-phase serpentine mineral carbonation in situ. This unified, synergetic theoretical and experimental approach will provide a deeper understanding of the key reaction mechanisms than either individual approach can alone. Ab initio techniques will also

  13. Interactomes to Biological Phase Space: a call to begin thinking at a new level in computational biology.

    SciTech Connect

    Davidson, George S.; Brown, William Michael

    2007-09-01

    Techniques for high-throughput determination of interactomes, together with high-resolution protein co-localization maps within organelles and across membranes, will soon create a vast resource. With these data, biological descriptions, akin to the high-dimensional phase spaces familiar to physicists, will become possible. These descriptions will capture sufficient information to make possible realistic, system-level models of cells. The descriptions and the computational models they enable will require powerful computing techniques. This report is offered as a call to the computational biology community to begin thinking at this scale and as a challenge to develop the required algorithms and codes to make use of the new data.

  14. Three dimensional morphological studies of Larger Benthic Foraminifera at the population level using micro computed tomography

    NASA Astrophysics Data System (ADS)

    Kinoshita, Shunichi; Eder, Wolfgang; Woeger, Julia; Hohenegger, Johann; Briguglio, Antonino; Ferrandez-Canadell, Carles

    2015-04-01

    Symbiont-bearing larger benthic Foraminifera (LBF) are long-living (at least 1 year), single-celled marine organisms with complex calcium carbonate shells. Their morphology has been intensively studied since the middle of the nineteenth century. This led to a broad spectrum of taxonomic results, important from biostratigraphy to ecology in shallow-water tropical to warm temperate marine palaeo-environments. However, traditional investigation methods required cutting or destroying specimens to analyse the taxonomically important inner structures. X-ray micro-computed tomography (microCT) is one of the newest techniques used in morphological studies. Its greatest advantage is the non-destructive acquisition of inner structures. Furthermore, ongoing improvements in microCT scanner hardware and software provide high-resolution, short-duration scans well suited to LBF. Three-dimensional imaging techniques allow each chamber to be selected and extracted, and its volume, surface and several form parameters used for morphometric analyses to be measured easily. Thus, 3-dimensional visualisation of LBF tests is a very big step forward from traditional morphology based on 2-dimensional data. The quantification of chamber form is a great opportunity to tackle LBF structures, architectures and bauplan geometry. Micrometric digital resolution is the only way to resolve many controversies in the phylogeny and evolutionary trends of LBF. For the present study we used micro-computed tomography to investigate the chamber number of every specimen from a statistically representative part of each population in order to estimate population dynamics. Samples of living individuals are collected at monthly intervals from fixed locations. Specific preparation allows up to 35 specimens to be scanned per run within 2 hours and the complete digital dataset to be obtained for each specimen of the population. MicroCT thus enables a fast and precise count of all chambers built by the foraminifer from its

  15. Identification of a Small Molecule That Selectively Inhibits Mouse PC2 over Mouse PC1/3: A Computational and Experimental Study

    PubMed Central

    Yongye, Austin B.; Vivoli, Mirella; Lindberg, Iris; Appel, Jon R.; Houghten, Richard A.; Martinez-Mayorga, Karina

    2013-01-01

    The calcium-dependent serine endoproteases prohormone convertase 1/3 (PC1/3) and prohormone convertase 2 (PC2) play important roles in the homeostatic regulation of blood glucose levels and are hence implicated in diabetes mellitus. Specifically, the absence of PC2 has been associated with chronic hypoglycemia. Since there is reasonably good conservation of the catalytic domain between species, translation of inhibitory effects is likely. In fact, similar results have been found using both mouse and human recombinant enzymes. Here, we employed computational structure-based approaches to screen 14,400 compounds from the Maybridge small molecule library towards mouse PC2. Our most remarkable finding was the identification of a potent and selective PC2 inhibitor. Kinetic data showed the compound to be an allosteric inhibitor. The compound identified is one of the few reported selective, small-molecule inhibitors of PC2. In addition, this new PC2 inhibitor is structurally different and of smaller size than those reported previously. This is advantageous for future studies where structural analogues can be built upon it. PMID:23451118

  16. Computed Tomography Image Origin Identification based on Original Sensor Pattern Noise and 3D Image Reconstruction Algorithm Footprints.

    PubMed

    Duan, Yuping; Bouslimi, Dalel; Yang, Guanyu; Shu, Huazhong; Coatrieux, Gouenou

    2016-06-08

    In this paper, we focus on the "blind" identification of the Computed Tomography (CT) scanner that has produced a CT image. To do so, we propose a set of noise features derived from the image acquisition chain which can be used as a CT-scanner footprint. We propose two approaches. The first aims at identifying a CT scanner based on the Original Sensor Pattern Noise (OSPN) that is intrinsic to its X-ray detectors. The second identifies an acquisition system based on the way this noise is modified by its 3D image reconstruction algorithm. As these reconstruction algorithms are manufacturer dependent and kept secret, our features are used as input to train an SVM-based classifier so as to discriminate acquisition systems. Experiments conducted on images from 15 different CT-scanner models of 4 distinct manufacturers demonstrate that our system identifies the origin of a CT image with a detection rate of at least 94% and that it achieves better performance than the Sensor Pattern Noise (SPN) based strategy proposed for consumer camera devices.
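    The classification stage described above can be sketched as follows: noise-derived feature vectors (however they are extracted) are fed to an SVM to discriminate scanner models. The feature extraction itself is not reproduced; the feature dimension, kernel settings, and synthetic data below are placeholders, not the authors' setup.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Placeholder data: 20 images per scanner model, 32 noise-derived features
# each, with a small class-dependent shift so there is something to learn.
rng = np.random.default_rng(1)
y = np.repeat(np.arange(15), 20)                     # 15 scanner models
X = rng.normal(size=(y.size, 32)) + 0.5 * y[:, None]

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=10.0, gamma="scale"))
scores = cross_val_score(clf, X, y, cv=5)            # stratified 5-fold CV
print("cross-validated identification rate: %.2f" % scores.mean())
```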

  17. Diagnostic accuracy of cone beam computed tomography in identification and postoperative evaluation of furcation defects

    PubMed Central

    Pajnigara, Natasha; Kolte, Abhay; Kolte, Rajashri; Pajnigara, Nilufer; Lathiya, Vrushali

    2016-01-01

    Background: Decision-making in periodontal therapeutics is critical and is influenced by accurate diagnosis of osseous defects, especially furcation involvement. Commonly used diagnostic methods such as clinical probing and conventional radiography have their own limitations. Hence, this study was planned to evaluate the dimensions of furcation defects clinically (pre- and post-surgery), intra-surgically, and by cone beam computed tomography (CBCT) (pre- and post-surgery). Materials and Methods: The study comprised a total of 200 Grade II furcation defects in forty patients, with a mean age of 38.05 ± 4.77 years, diagnosed with chronic periodontitis, which were evaluated clinically (pre- and post-surgically), by CBCT (pre- and post-surgically), and intrasurgically after flap reflection (40 defects in each). After the presurgical clinical and CBCT measurements, demineralized freeze-dried bone allograft was placed in the furcation defect and the flaps were sutured back. Six months later, these defects were evaluated by recording measurements clinically, i.e., postsurgery clinical measurements and also postsurgery CBCT measurements (40 defects each). Results: Presurgery clinical measurements (vertical 6.15 ± 1.71 mm and horizontal 3.05 ± 0.84 mm) and CBCT measurements (vertical 7.69 ± 1.67 mm and horizontal 4.62 ± 0.77 mm) underestimated intrasurgery measurements (vertical 8.025 ± 1.67 mm and horizontal 4.82 ± 0.67 mm) in both vertical and horizontal aspects, and the difference was statistically not significant (vertical P = 1.000, 95% confidence interval [CI], horizontal P = 0.867, 95% CI). Further, postsurgery clinical measurements (vertical 2.9 ± 0.74 mm and horizontal 1.52 ± 0.59 mm) underestimated CBCT measurements (vertical 3.67 ± 1.17 mm and horizontal 2.45 ± 0.48 mm). There was a statistically significant difference between presurgery clinical–presurgery CBCT (P < 0.0001, 95% CI) versus postsurgery clinical–postsurgery CBCT (P < 0.0001, 95% CI

  18. Identification at biovar level of Brucella isolates causing abortion in small ruminants of iran.

    PubMed

    Behroozikhah, Ali Mohammad; Bagheri Nejad, Ramin; Amiri, Karim; Bahonar, Ali Reza

    2012-01-01

    To determine the most prevalent biovar responsible for brucellosis in sheep and goat populations of Iran, a cross-sectional study was carried out over 2 years in six provinces selected based on geography and disease prevalence. Specimens obtained from referred aborted sheep and goat fetuses were cultured on Brucella selective media for microbiological isolation. Brucellae were isolated from 265 fetuses and examined for biovar identification using standard microbiological methods. Results showed that 246 isolates (92.8%) were B. melitensis biovar 1, 18 isolates (6.8%) were B. melitensis biovar 2, and, interestingly, one isolate (0.4%) obtained from Mazandaran province was B. abortus biovar 3. In this study, B. melitensis biovar 3 was isolated in none of the selected provinces, and all isolates from 3 provinces (i.e., Chehar-mahal Bakhtiari, Markazi, and Ilam) were identified only as B. melitensis biovar 1. In conclusion, we found that B. melitensis biovar 1 remains the most prevalent cause of small ruminant brucellosis in various provinces of Iran.

  19. Identification of New cis Vibrational Levels in the S1 State of C2H2

    NASA Astrophysics Data System (ADS)

    Baraban, J. H.; Changala, P. B.; Shaver, R. G.; Field, R. W.; Stanton, J. F.; Merer, A. J.

    2012-06-01

    Although the S1 (Ã ¹Au) state of the trans conformer of acetylene has been known for many years, the corresponding S1 (Ã ¹A2) state of the cis conformer was only discovered recently. Transitions to it from the ground state are electronically forbidden, but its vibrational levels acquire intensity by tunneling through the isomerization barrier and interacting with levels of the trans conformer. We have recently identified two new vibrational levels (3² and 4¹6¹) of the cis conformer of S1 C2H2, bringing the total number of levels observed to six out of an expected ten up to the energies studied in this work. The appearance of these levels in IR-UV double resonance LIF spectra will be discussed, along with their vibrational assignments. Experimentally determined vibrational parameters and ab initio anharmonic force fields for both the trans and cis conformers will be presented as part of the evidence supporting these assignments. These results shed new light on the vibrational level structure of both conformers in this isomerizing system. A. J. Merer, A. H. Steeves, J. H. Baraban, H. A. Bechtel, and R. W. Field. J. Chem. Phys., 134(24):244310, 2011.

  20. Risk of node metastasis of sentinel lymph nodes detected in level II/III of the axilla by single-photon emission computed tomography/computed tomography

    PubMed Central

    SHIMA, HIROAKI; KUTOMI, GORO; SATOMI, FUKINO; MAEDA, HIDEKI; TAKAMARU, TOMOKO; KAMESHIMA, HIDEKAZU; OMURA, TOSEI; MORI, MITSURU; HATAKENAKA, MASAMITSU; HASEGAWA, TADASHI; HIRATA, KOICHI

    2014-01-01

    In breast cancer, single-photon emission computed tomography/computed tomography (SPECT/CT) shows the exact anatomical location of sentinel nodes (SN). SPECT/CT mainly depicts the axilla and occasionally reveals atypical sites of extra-axillary lymphatic drainage. How these atypical hot nodes are involved in lymphatic metastasis was retrospectively investigated in the present study, particularly in the level II/III region. SPECT/CT was performed in 92 clinical stage 0-IIA breast cancer patients. Sentinel lymph nodes are depicted as hot nodes on SPECT/CT. Patients were divided into two groups: with or without a hot node in level II/III on SPECT/CT. The existence of metastasis in level II/III was investigated and the risk factors were identified. A total of 12 patients were sentinel lymph node biopsy metastasis-positive, and axillary lymph node dissection (ALND) was performed. These patients were divided into two groups: with and without an SN in level II/III, and nodes in level II/III were pathologically proven. In 11 of the 92 patients, hot nodes were detected in level II/III. There was a significant difference in node metastasis depending on whether there were hot nodes in level II/III (P=0.0319). Multivariate analysis indicated that hot nodes in level II/III and lymphatic invasion were independent factors associated with node metastasis. There were 12 SN-positive patients followed by ALND. In four of the 12 patients, hot nodes were observed in level II/III. In two of these four patients, metastatic nodes were pathologically evident in the same region as the hot nodes depicted by SPECT/CT. Therefore, the present study indicated that a hot node in level II/III depicted by SPECT/CT may indicate a risk of SN metastasis, including deeper nodes. PMID:25289038

  1. Identification of Hemoglobin Levels Based on Anthropometric Indices in Elderly Koreans

    PubMed Central

    Kim, Jong Yeol

    2016-01-01

    Objectives Anemia is independently and strongly associated with an increased risk of mortality in older people and is also strongly associated with obesity. The objectives of the present study were to examine the associations between the hemoglobin level and various anthropometric indices, to predict low and normal hemoglobin levels using combined anthropometric indices, and to assess differences in the hemoglobin level and anthropometric indices between Korean men and women. Methods A total of 7,156 individuals ranging in age from 53–90 years participated in this retrospective cross-sectional study. Binary logistic regression (LR) and naïve Bayes (NB) models were used to identify significant differences in the anthropometric indices between subjects with low and normal hemoglobin levels and to assess the predictive power of these indices for the hemoglobin level. Results Among all of the variables, age displayed the strongest association with the hemoglobin level in both men (p < 0.0001, odds ratio [OR] = 0.487, area under the receiver operating characteristic curve based on the LR [LR-AUC] = 0.702, NB-AUC = 0.701) and women (p < 0.0001, OR = 0.636, LR-AUC = 0.625, NB-AUC = 0.624). Among the anthropometric indices, weight and body mass index (BMI) were the best predictors of the hemoglobin level. The predictive powers of all of the variables were higher in men than in women. The AUC values for the NB-Wrapper and LR-Wrapper predictive models generated using combined anthropometric indices were 0.734 and 0.723, respectively, for men and 0.649 and 0.652, respectively, for women. The use of combined anthropometric indices may improve the predictive power for the hemoglobin level. Discussion Among the various anthropometric indices, with the exception of age, we did not identify any indices that were better predictors than weight and BMI for low and normal hemoglobin levels. In addition, none of the ratios between pairs of indices were good indicators of the
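    The model comparison described above, logistic regression versus naive Bayes AUC for predicting low versus normal hemoglobin from anthropometric indices, can be sketched as below. The synthetic predictors, labels, and train/test split are hypothetical placeholders, not the study's data or protocol.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB

# Synthetic stand-in for the anthropometric data: age, weight, BMI, with the
# label loosely tied to age so the models have a weak signal to pick up.
rng = np.random.default_rng(42)
n = 1000
age = rng.normal(70, 8, n)
weight = rng.normal(62, 10, n)
bmi = rng.normal(24, 3, n)
X = np.column_stack([age, weight, bmi])
y = (age + rng.normal(0, 20, n) > 75).astype(int)   # 1 = low hemoglobin (synthetic)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
for name, model in [("LR", LogisticRegression(max_iter=1000)), ("NB", GaussianNB())]:
    auc = roc_auc_score(y_te, model.fit(X_tr, y_tr).predict_proba(X_te)[:, 1])
    print(f"{name}-AUC = {auc:.3f}")
```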

  2. High quality machine-robust image features: Identification in nonsmall cell lung cancer computed tomography images

    PubMed Central

    Hunter, Luke A.; Krafft, Shane; Stingo, Francesco; Choi, Haesun; Martel, Mary K.; Kry, Stephen F.; Court, Laurence E.

    2013-01-01

    Purpose: For nonsmall cell lung cancer (NSCLC) patients, quantitative image features extracted from computed tomography (CT) images can be used to improve tumor diagnosis, staging, and response assessment. For these findings to be clinically applied, image features need to have high intra- and intermachine reproducibility. The objective of this study is to identify CT image features that are reproducible, nonredundant, and informative across multiple machines. Methods: Noncontrast-enhanced, test-retest CT image pairs were obtained from 56 NSCLC patients imaged on three CT machines from two institutions. Two machines (“M1” and “M2”) used cine 4D-CT and one machine (“M3”) used breath-hold helical 3D-CT. Gross tumor volumes (GTVs) were semiautonomously segmented and then pruned by removing voxels with CT numbers less than a prescribed Hounsfield unit (HU) cutoff. Three hundred and twenty-eight quantitative image features were extracted from each pruned GTV based on its geometry, intensity histogram, absolute gradient image, co-occurrence matrix, and run-length matrix. For each machine, features with concordance correlation coefficient values greater than 0.90 were considered reproducible. The Dice similarity coefficient (DSC) and the Jaccard index (JI) were used to quantify reproducible feature set agreement between machines. Multimachine reproducible feature sets were created by taking the intersection of individual machine reproducible feature sets. Redundant features were removed through hierarchical clustering based on the average correlation between features across multiple machines. Results: For all image types, GTV pruning was found to negatively affect reproducibility (reported results use no HU cutoff). The reproducible feature percentage was highest for average images (M1 = 90.5%, M2 = 94.5%, M1∩M2 = 86.3%), intermediate for end-exhale images (M1 = 75.0%, M2 = 71.0%, M1∩M2 = 52.1%), and lowest for breath-hold images (M3 = 61.0%). Between M1 and M2
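    A minimal sketch of the reproducibility screening described above: Lin's concordance correlation coefficient (CCC) is computed for each feature over test-retest pairs, features with CCC > 0.90 are kept, and the Dice and Jaccard indices compare the reproducible sets from two machines. The simulated feature values and feature names are assumptions for illustration only.

```python
import numpy as np

def ccc(x, y):
    """Lin's concordance correlation coefficient between two measurements."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    sxy = np.cov(x, y, bias=True)[0, 1]
    return 2 * sxy / (x.var() + y.var() + (x.mean() - y.mean()) ** 2)

def reproducible_features(test, retest, names, threshold=0.90):
    """Names of features whose test-retest CCC exceeds the threshold."""
    return {n for n, t, r in zip(names, test.T, retest.T) if ccc(t, r) > threshold}

def dice(a, b):
    return 2 * len(a & b) / (len(a) + len(b)) if (a or b) else 1.0

def jaccard(a, b):
    return len(a & b) / len(a | b) if (a | b) else 1.0

# Hypothetical test-retest data: 56 patients x 4 features on two machines;
# noisier features fail the CCC > 0.90 screen.
rng = np.random.default_rng(0)
names = ["volume", "entropy", "contrast", "energy"]

def simulate(noise_per_feature):
    base = rng.normal(size=(56, len(names)))
    return base, base + rng.normal(scale=noise_per_feature, size=base.shape)

m1 = reproducible_features(*simulate([0.05, 0.05, 1.0, 0.05]), names)
m2 = reproducible_features(*simulate([0.05, 1.0, 1.0, 0.05]), names)
print("M1:", m1, "M2:", m2, "DSC:", dice(m1, m2), "JI:", jaccard(m1, m2))
```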

  3. Analysis of the Computer Anxiety Levels of Secondary Technical Education Teachers in West Virginia.

    ERIC Educational Resources Information Center

    Gordon, Howard R. D.

    The computer anxiety of 116 randomly selected secondary technical education teachers from 8 area vocational-technical centers in West Virginia was the focus of a study. The mailed questionnaire consisted of two parts: Oetting's Computer Anxiety Scale (COMPAS) and closed-form questions to obtain general demographic information about the teachers…

  4. Determining the Effectiveness of the 3D Alice Programming Environment at the Computer Science I Level

    ERIC Educational Resources Information Center

    Sykes, Edward R.

    2007-01-01

    Student retention in Computer Science is becoming a serious concern among Educators in many colleges and universities. Most institutions currently face a significant drop in enrollment in Computer Science. A number of different tools and strategies have emerged to address this problem (e.g., BlueJ, Karel Robot, etc.). Although these tools help to…

  5. Integration of Computer Technology Into an Introductory-Level Neuroscience Laboratory

    ERIC Educational Resources Information Center

    Evert, Denise L.; Goodwin, Gregory; Stavnezer, Amy Jo

    2005-01-01

    We describe 3 computer-based neuroscience laboratories. In the first 2 labs, we used commercially available interactive software to enhance the study of functional and comparative neuroanatomy and neurophysiology. In the remaining lab, we used customized software and hardware in 2 psychophysiological experiments. With the use of the computer-based…

  6. Computing at the High School Level: Changing What Teachers and Students Know and Believe

    ERIC Educational Resources Information Center

    Munson, Ashlyn; Moskal, Barbara; Harriger, Alka; Lauriski-Karriker, Tonya; Heersink, Daniel

    2011-01-01

    Research indicates that students often opt out of computing majors due to a lack of prior experience in computing and a lack of knowledge of field-based job opportunities. In addition, it has been found that students respond positively to new subjects when teachers and counselors are enthusiastic and knowledgeable about the area. The summer…

  7. Computer-assisted identification of novel small molecule inhibitors targeting GLUT1

    NASA Astrophysics Data System (ADS)

    Wan, Zhining; Li, Xin; Sun, Rong; Li, Yuanyuan; Wang, Xiaoyun; Li, Xinru; Rong, Li; Shi, Zheng; Bao, Jinku

    2015-12-01

    Glucose transporters (GLUTs) are the main carriers of glucose that facilitate the diffusion of glucose in mammalian cells, especially GLUT1. Notably, GLUT1 is a rate-limiting transporter for glucose uptake, and its overexpression is a common characteristic of most cancers. Thus, the inhibition of GLUT1 by novel small compounds to lower glucose levels for cancer cells has become an emerging strategy. Herein, we employed high-throughput screening approaches to identify potential inhibitors against the sugar-binding site of GLUT1. Firstly, molecular docking screening was carried out against the Specs compound library, and three molecules (ZINC19909927, ZINC19908826, and ZINC19815451) were selected as candidate GLUT1 inhibitors for further analysis. Then, taking the initial ligand β-NG as a reference, molecular dynamics (MD) simulations and the molecular mechanics/generalized Born surface area (MM/GBSA) method were applied to evaluate the binding stability and affinity of the three candidates towards GLUT1. Finally, we found that ZINC19909927 might have the highest affinity for the binding site of GLUT1. Meanwhile, energy decomposition analysis identified several residues in the substrate-binding site that might provide clues for future inhibitor discovery towards GLUT1. Taken together, the results of our study may provide valuable information for identifying new inhibitors targeting GLUT1-mediated glucose transport and metabolism for cancer therapeutics.

  8. Identification of overlapping communities and their hierarchy by locally calculating community-changing resolution levels

    NASA Astrophysics Data System (ADS)

    Havemann, Frank; Heinz, Michael; Struck, Alexander; Gläser, Jochen

    2011-01-01

    We propose a new local, deterministic and parameter-free algorithm that detects fuzzy and crisp overlapping communities in a weighted network and simultaneously reveals their hierarchy. Using a local fitness function, the algorithm greedily expands natural communities of seeds until the whole graph is covered. The hierarchy of communities is obtained analytically by calculating resolution levels at which communities grow rather than numerically by testing different resolution levels. This analytic procedure is not only more exact than its numerical alternatives such as LFM and GCE but also much faster. Critical resolution levels can be identified by searching for intervals in which large changes of the resolution do not lead to growth of communities. We tested our algorithm on benchmark graphs and on a network of 492 papers in information science. Combined with a specific post-processing, the algorithm gives much more precise results on LFR benchmarks with high overlap compared to other algorithms and performs very similarly to GCE.
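    The general idea of greedily expanding a community from a seed under a local fitness function, as used by the LFM-style methods referenced above, can be sketched as below. The fitness form f(C) = k_in / (k_in + k_out)^alpha and the greedy stopping rule are the standard LFM ingredients, not the authors' analytic resolution-level procedure; the karate-club graph is just a convenient test network.

```python
import networkx as nx

def fitness(G, community, alpha=1.0):
    """Local fitness f(C) = k_in / (k_in + k_out)**alpha of a node set C."""
    k_in = 2 * G.subgraph(community).number_of_edges()
    k_tot = sum(d for _, d in G.degree(community))
    k_out = k_tot - k_in
    return k_in / (k_in + k_out) ** alpha if (k_in + k_out) else 0.0

def grow_community(G, seed, alpha=1.0):
    """Greedily add the neighbouring node that most increases the fitness."""
    community = {seed}
    while True:
        frontier = set().union(*(G.neighbors(n) for n in community)) - community
        base = fitness(G, community, alpha)
        gains = {v: fitness(G, community | {v}, alpha) - base for v in frontier}
        best = max(gains, key=gains.get, default=None)
        if best is None or gains[best] <= 0:
            return community
        community.add(best)

G = nx.karate_club_graph()
print(sorted(grow_community(G, seed=0, alpha=1.0)))
```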

  9. The combined rapid detection and species-level identification of yeasts in simulated blood culture using a colorimetric sensor array

    PubMed Central

    Lim, Sung H.; Wilson, Deborah A.; SalasVargas, Ana Victoria; Churi, Yair S.; Rhodes, Paul A.; Mazzone, Peter J.; Procop, Gary W.

    2017-01-01

    Background A colorimetric sensor array (CSA) has been demonstrated to rapidly detect and identify bacteria growing in blood cultures by obtaining a species-specific “fingerprint” of the volatile organic compounds (VOCs) produced during growth. This capability has been demonstrated in prokaryotes, but has not been reported for eukaryotic cells growing in culture. The purpose of this study was to explore if a disposable CSA could differentially identify 7 species of pathogenic yeasts growing in blood culture. Methods Culture trials of whole blood inoculated with a panel of clinically important pathogenic yeasts at four different microorganism loads were performed. Cultures were done in both standard BacT/Alert and CSA-embedded bottles, after adding 10 mL of spiked blood to each bottle. Color changes in the CSA were captured as images by an optical scanner at defined time intervals. The captured images were analyzed to identify the yeast species. Time to detection by the CSA was compared to that in the BacT/Alert system. Results One hundred sixty-two yeast culture trials were performed, including strains of several species of Candida (Ca. albicans, Ca. glabrata, Ca. parapsilosis, and Ca. tropicalis), Clavispora (synonym Candida) lusitaniae, Pichia kudriavzevii (synonym Candida krusei) and Cryptococcus neoformans, at loads of 8.2 × 10^5, 8.3 × 10^3, 8.5 × 10^1, and 1.7 CFU/mL. In addition, 8 negative trials (no yeast) were conducted. All negative trials were correctly identified as negative, and all positive trials were detected. Colorimetric responses were species-specific and did not vary by inoculum load over the 500,000-fold range of loads tested, allowing for accurate species-level identification. The mean sensitivity for species-level identification by CSA was 74% at detection, and increased with time, reaching almost 95% at 4 hours after detection. At an inoculum load of 1.7 CFU/mL, mean time to detection with the CSA was 6.8 hours (17%) less than with the

  10. Identification of entry-level competencies for associate degree radiographers as perceived by primary role definers

    SciTech Connect

    Thorpe, R.L.

    1981-01-01

    The primary purpose of this study was to identify those competencies needed by Associate Degree Radiographers when they assume employment as entry-level practitioners. A second purpose of the study was to rank order the identified competencies within the role delineations recognized by the Essentials and Guidelines of an Accredited Educational Program for the Radiographer. These role delineations include: radiation protection, exercise of discretion and judgment, emergency and life-saving techniques, patient care and interpersonal communication, and the role as a professional member. A third purpose of the study was to examine the degree of consensus on role definition of entry-level competencies needed by Associate Degree Radiographers as perceived by primary role definers (such as employers, employees, and educators), and by other selected variables: age, sex, length of experience in radiologic technology, level of formal education, and place of employment. A major finding of this study was that respondents did not differ significantly in their ranking of entry-level competencies needed by Associate Degree Radiographers when the responses were analyzed according to position, age, sex, length of experience, level of education, or place of employment. Another important finding was that respondents considered all 63 competencies as important and needed by Associate Degree Radiographers upon initial employment. A major conclusion and recommendation of this study, in view of the high agreement on the rank ordering of competencies, was that these competencies should be included in a competency-based education model. It was further recommended that a three-way system of communication between employers, employees, and educators be considered in order to pool resources and to increase understanding of each position group's contribution and influence on entry-level Associate Degree Radiographers.

  11. Clinical Identification of the Vertebral Level at Which the Lumbar Sympathetic Ganglia Aggregate

    PubMed Central

    An, Ji Won; Koh, Jae Chul; Sun, Jong Min; Park, Ju Yeon; Choi, Jong Bum; Shin, Myung Ju

    2016-01-01

    Background The location and the number of lumbar sympathetic ganglia (LSG) vary between individuals. The aim of this study was to determine the appropriate level for a lumbar sympathetic ganglion block (LSGB), corresponding to the level at which the LSG principally aggregate. Methods Seventy-four consecutive subjects, including 31 women and 31 men, underwent LSGB either on the left (n = 31) or the right side (n = 43). The primary site of needle entry was randomly selected at the L3 or L4 vertebra. A total of less than 1 ml of radio opaque dye with 4% lidocaine was injected, taking caution not to traverse beyond the level of one vertebral body. The procedure was considered responsive when the skin temperature increased by more than 1℃ within 5 minutes. Results The median responsive level was significantly different between the left (lower third of the L4 body) and right (lower margin of the L3 body) sides (P = 0.021). However, there was no significant difference in the values between men and women. The overall median responsive level was the upper third of the L4 body. The mean responsive level did not correlate with height or BMI. There were no complications on short-term follow-up. Conclusions Selection of the primary target in the left lower third of the L4 vertebral body and the right lower margin of the L3 vertebral body may reduce the number of needle insertions and the volume of agents used in conventional or neurolytic LSGB and radiofrequency thermocoagulation. PMID:27103965

  12. Flow cytometric identification and enumeration of photosynthetic sulfur bacteria and potential for ecophysiological studies at the single-cell level.

    PubMed

    Casamayor, Emilio O; Ferrera, Isabel; Cristina, Xavier; Borrego, Carles M; Gasol, Josep M

    2007-08-01

    We show the potential of flow cytometry as a fast tool for population identification and enumeration of photosynthetic sulfur bacteria. Purple (PSB) and green sulfur bacteria (GSB) oxidize hydrogen sulfide to elemental sulfur, which can act as a storage compound to be further oxidized to sulfate, generating the reducing power required for growth. Both groups have different elemental sulfur allocation strategies: whereas PSB store elemental sulfur as intracellular inclusions, GSB allocate sulfur globules externally. We used well-characterized laboratory strains and complex natural photosynthetic populations developing in a sharply stratified meromictic lake to show that PSB and GSB could be detected, differentiated and enumerated in unstained samples using a blue laser-based flow cytometer. Variations in cell-specific pigment content and the dynamics of sulfur accumulation, both intra- and extracellularly, were also detected in flow cytometric plots, as sulfur accumulation changed the light scatter characteristics of the cells. These data were used to show the potential for studies on the metabolic status and the rate of activity at the single-cell level. Flow cytometric identification and enumeration resulted in faster and more precise analyses than previous approaches, and may open the door to more complex ecophysiological experiments with photosynthetic sulfur bacteria in mixed cultures and natural environments.

  13. SU-E-P-10: Establishment of Local Diagnostic Reference Levels of Routine Exam in Computed Tomography

    SciTech Connect

    Yeh, M; Wang, Y; Weng, H

    2015-06-15

    Introduction: National diagnostic reference levels (NDRLs) provide reference doses for radiological examinations and serve as a basis for optimizing patient dose. Local diagnostic reference levels (LDRLs), established by periodically reviewing and checking doses, offer a more efficient way to improve examination practice; the essential first step is therefore to establish a diagnostic reference level. Radiation dose reference values for computed tomography have already been established in Taiwan, and many studies report that CT scans contribute the largest share of the medical radiation dose. This study therefore reviews the international status of DRLs and establishes diagnostic reference levels for computed tomography at our hospital. Methods and Materials: Two clinical CT scanners (a Toshiba Aquilion and a Siemens Sensation) were included in this study. For CT examinations the basic recommended dosimetric quantity is the Computed Tomography Dose Index (CTDI). For each examination of each body part, at least 10 patients were collected. Routine examinations were carried out, all exposure parameters were collected, and the corresponding CTDIvol and DLP values were determined. Results: The majority of patients (75%) weighed between 60 and 70 kg. There were 25 examinations in this study. Table 1 shows the LDRL of each routine CT examination. Conclusions: This study reviews the international status of DRLs, establishes local reference levels for computed tomography at our hospital, and provides a radiation dose reference as a basis for optimizing patient dose.
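    The abstract does not spell out the statistic used for the LDRL; a common convention, shown as an assumption in the sketch below, is to take the 75th percentile of the per-patient CTDIvol (or DLP) values collected for a given examination.

```python
import numpy as np

def local_drl(dose_values, percentile=75):
    """LDRL as the chosen percentile of the collected per-patient dose values."""
    return float(np.percentile(dose_values, percentile))

# Hypothetical CTDIvol values (mGy) for one routine examination on one scanner.
ctdi_vol = [13.2, 15.1, 14.8, 16.3, 12.9, 17.6, 14.9, 15.2, 13.8, 16.7]
print("LDRL (CTDIvol):", local_drl(ctdi_vol), "mGy")
```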

  14. Identification of the source of elevated hepatocyte growth factor levels in multiple myeloma patients

    PubMed Central

    2014-01-01

    Background Hepatocyte growth factor (HGF) is a pleiotropic cytokine which can lead to cancer cell proliferation, migration and metastasis. In multiple myeloma (MM) patients it is an abundant component of the bone marrow. HGF levels are elevated in 50% of patients and associated with poor prognosis. Here we aim to investigate its source in myeloma. Methods HGF mRNA levels in bone marrow core biopsies from healthy individuals and myeloma patients were quantified by real-time PCR. HGF gene expression profiling in CD138+ cells isolated from bone marrow aspirates of healthy individuals and MM patients was performed by microarray analysis. HGF protein concentrations present in peripheral blood of MM patients were measured by enzyme-linked immunosorbent assay (ELISA). Cytogenetic status of CD138+ cells was determined by fluorescence in situ hybridization (FISH) and DNA sequencing of the HGF gene promoter. HGF secretion in co-cultures of human myeloma cell lines and bone marrow stromal cells was measured by ELISA. Results HGF gene expression profiling in both bone marrow core biopsies and CD138+ cells showed elevated HGF mRNA levels in myeloma patients. HGF mRNA levels in biopsies and in myeloma cells correlated. Quantification of HGF protein levels in serum also correlated with HGF mRNA levels in CD138+ cells from corresponding patients. Cytogenetic analysis showed myeloma cell clones with HGF copy numbers between 1 and 3 copies. There was no correlation between HGF copy number and HGF mRNA levels. Co-cultivation of the human myeloma cell lines ANBL-6 and JJN3 with bone marrow stromal cells or the HS-5 cell line resulted in a significant increase in secreted HGF. Conclusions We here show that in myeloma patients HGF is primarily produced by malignant plasma cells, and that HGF production by these cells might be supported by the bone marrow microenvironment. Considering the fact that elevated HGF serum and plasma levels predict poor prognosis, these findings are of

  15. Policy Matters: An Analysis of District-Level Efforts to Increase the Identification of Underrepresented Learners

    ERIC Educational Resources Information Center

    McBee, Matthew T.; Shaunessy, Elizabeth; Matthews, Michael S.

    2012-01-01

    Policies delegating control of educational policy to the local level are widespread, yet there has been little examination of the effects of such distributed decision making in the area of advanced education programming. We used propensity score matching to examine the effectiveness of locally developed policies for identifying intellectually…

  16. DIATOM INDICES OF STREAM ECOSYSTEM CONDITIONS: COMPARISON OF GENUS VS. SPECIES LEVEL IDENTIFICATIONS

    EPA Science Inventory

    Diatom assemblage data collected between 1993 and 1995 from 233 Mid-Appalachian streams were used to compare indices of biotic integrity based on genus vs. species level taxonomy. Thirty-seven genera and 197 species of diatoms were identified from these samples. Metrics included...

  17. Demonstration of Cost-Effective, High-Performance Computing at Performance and Reliability Levels Equivalent to a 1994 Vector Supercomputer

    NASA Technical Reports Server (NTRS)

    Babrauckas, Theresa

    2000-01-01

    The Affordable High Performance Computing (AHPC) project demonstrated that high-performance computing based on a distributed network of computer workstations is a cost-effective alternative to vector supercomputers for running CPU- and memory-intensive design and analysis tools. The AHPC project created an integrated system called a Network Supercomputer. By connecting computer workstations through a network and utilizing the workstations when they are idle, the resulting distributed-workstation environment has the same performance and reliability levels as the Cray C90 vector supercomputer at less than 25 percent of the C90 cost. In fact, the cost comparison between a Cray C90 supercomputer and Sun workstations showed that the number of distributed networked workstations equivalent to a C90 costs approximately 8 percent of the C90.

  18. Tin-carbon clusters and the onset of microscopic level immiscibility: Experimental and computational study

    NASA Astrophysics Data System (ADS)

    Bernstein, J.; Landau, A.; Zemel, E.; Kolodney, E.

    2015-09-01

    We report the experimental observation and computational analysis of the binary tin-carbon gas phase species. These novel ionic compounds are generated by impact of C60(-) anions on a clean tin target at some kiloelectronvolts kinetic energies. Positive Sn(m)C(n)(+) (m = 1-12, 1 ≤ n ≤ 8) ions were detected mass spectrometrically following ejection from the surface. Impact induced shattering of the C60(-) ion followed by sub-surface penetration of the resulting atomic carbon flux forces efficient mixing between target and projectile atoms even though the two elements (Sn/C) are completely immiscible in the bulk. This approach of C60(-) ion beam induced synthesis can be considered as an effective way for producing novel metal-carbon species of the so-called non-carbide forming elements, thus exploring the possible onset of molecular level miscibility in these systems. Sn2C2(+) was found to be the most abundant carbide cluster ion. Its instantaneous formation kinetics and its measured kinetic energy distribution while exiting the surface demonstrate a single impact formation/emission event (on the sub-ps time scale). Optimal geometries were calculated for both neutral and positively charged species using Born-Oppenheimer molecular dynamics for identifying global minima, followed by density functional theory (DFT) structure optimization and energy calculations at the coupled cluster singles, doubles and perturbative triples [CCSD(T)] level. The calculated structures reflect two distinct binding tendencies. The carbon rich species exhibit polyynic/cummulenic nature (tin end capped carbon chains) while the more stoichiometrically balanced species have larger contributions of metal-metal bonding, sometimes resulting in distinct tin and carbon moieties attached to each other (segregated structures). The Sn2C(n) (n = 3-8) and Sn2C(n)(+) (n = 2-8) are polyynic/cummulenic while all neutral Sn(m)C(n) structures (m = 3-4) could be described as small tin clusters (dimer, trimer, and tetramer

  19. Tin-carbon clusters and the onset of microscopic level immiscibility: Experimental and computational study.

    PubMed

    Bernstein, J; Landau, A; Zemel, E; Kolodney, E

    2015-09-21

    We report the experimental observation and computational analysis of the binary tin-carbon gas phase species. These novel ionic compounds are generated by impact of C60(-) anions on a clean tin target at some kiloelectronvolts kinetic energies. Positive Sn(m)C(n)(+) (m = 1-12, 1 ≤ n ≤ 8) ions were detected mass spectrometrically following ejection from the surface. Impact induced shattering of the C60(-) ion followed by sub-surface penetration of the resulting atomic carbon flux forces efficient mixing between target and projectile atoms even though the two elements (Sn/C) are completely immiscible in the bulk. This approach of C60(-) ion beam induced synthesis can be considered as an effective way for producing novel metal-carbon species of the so-called non-carbide forming elements, thus exploring the possible onset of molecular level miscibility in these systems. Sn2C2(+) was found to be the most abundant carbide cluster ion. Its instantaneous formation kinetics and its measured kinetic energy distribution while exiting the surface demonstrate a single impact formation/emission event (on the sub-ps time scale). Optimal geometries were calculated for both neutral and positively charged species using Born-Oppenheimer molecular dynamics for identifying global minima, followed by density functional theory (DFT) structure optimization and energy calculations at the coupled cluster singles, doubles and perturbative triples [CCSD(T)] level. The calculated structures reflect two distinct binding tendencies. The carbon rich species exhibit polyynic/cummulenic nature (tin end capped carbon chains) while the more stoichiometrically balanced species have larger contributions of metal-metal bonding, sometimes resulting in distinct tin and carbon moieties attached to each other (segregated structures). The Sn2C(n) (n = 3-8) and Sn2C(n)(+) (n = 2-8) are polyynic/cummulenic while all neutral Sn(m)C(n) structures (m = 3-4) could be described as small tin clusters (dimer

  20. High-level waste storage tank farms/242-A evaporator standards/requirements identification document (S/RID), Vol. 3

    SciTech Connect

    Not Available

    1994-04-01

    The Safeguards and Security (S&S) Functional Area addresses the programmatic and technical requirements, controls, and standards which assure compliance with applicable S&S laws and regulations. Numerous S&S responsibilities are performed on behalf of the Tank Farm Facility by site-level organizations. Certain other responsibilities are shared, and the remainder are the sole responsibility of the Tank Farm Facility. This Requirements Identification Document describes a complete functional Safeguards and Security Program that is presumed to be the responsibility of the Tank Farm Facility. The following list identifies the programmatic elements in the S&S Functional Area: Program Management, Protection Program Scope and Evaluation, Personnel Security, Physical Security Systems, Protection Program Operations, Material Control and Accountability, Information Security, and Key Program Interfaces.

  1. Multi-level slug tests in highly permeable formations: 2. Hydraulic conductivity identification, method verification, and field applications

    USGS Publications Warehouse

    Zlotnik, V.A.; McGuire, V.L.

    1998-01-01

    Using the developed theory and modified Springer-Gelhar (SG) model, an identification method is proposed for estimating hydraulic conductivity from multi-level slug tests. The computerized algorithm calculates hydraulic conductivity from both monotonic and oscillatory well responses obtained using a double-packer system. Field verification of the method was performed at a specially designed fully penetrating well of 0.1-m diameter with a 10-m screen in a sand and gravel alluvial aquifer (MSEA site, Shelton, Nebraska). During well installation, disturbed core samples were collected every 0.6 m using a split-spoon sampler. Vertical profiles of hydraulic conductivity were produced on the basis of grain-size analysis of the disturbed core samples. These results closely correlate with the vertical profile of horizontal hydraulic conductivity obtained by interpreting multi-level slug test responses using the modified SG model. The identification method was applied to interpret the response from 474 slug tests in 156 locations at the MSEA site. More than 60% of responses were oscillatory. The method produced a good match to experimental data for both oscillatory and monotonic responses using an automated curve matching procedure. The proposed method allowed us to drastically increase the efficiency of each well used for aquifer characterization and to process massive arrays of field data. Recommendations generalizing this experience to massive application of the proposed method are developed.
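    In the same spirit as the automated curve-matching procedure described above, the sketch below fits an underdamped slug-test response with scipy.optimize.curve_fit. The damped-cosine model is a simplification for illustration, not the modified Springer-Gelhar model itself; converting the fitted parameters to hydraulic conductivity requires that model and the well geometry.

```python
import numpy as np
from scipy.optimize import curve_fit

def damped_response(t, h0, beta, omega):
    """Normalised head for an underdamped (oscillatory) slug-test response."""
    return h0 * np.exp(-beta * t) * np.cos(omega * t)

# Synthetic observed response: normalised head vs. time, with measurement noise.
rng = np.random.default_rng(0)
t = np.linspace(0.0, 30.0, 300)                      # seconds
observed = damped_response(t, 1.0, 0.15, 1.2) + rng.normal(0.0, 0.02, t.size)

popt, _ = curve_fit(damped_response, t, observed, p0=[1.0, 0.1, 1.0])
h0, beta, omega = popt
print(f"fitted damping beta = {beta:.3f} 1/s, frequency omega = {omega:.3f} rad/s")
```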

  2. Modeling Cardiac Electrophysiology at the Organ Level in the Peta FLOPS Computing Age

    NASA Astrophysics Data System (ADS)

    Mitchell, Lawrence; Bishop, Martin; Hötzl, Elena; Neic, Aurel; Liebmann, Manfred; Haase, Gundolf; Plank, Gernot

    2010-09-01

    Despite a steep increase in available compute power, in-silico experimentation with highly detailed models of the heart remains challenging due to the high computational cost involved. It is hoped that next generation high performance computing (HPC) resources will lead to significant reductions in execution times to leverage a new class of in-silico applications. However, performance gains with these new platforms can only be achieved by engaging a much larger number of compute cores, necessitating strongly scalable numerical techniques. So far strong scalability has been demonstrated only for a moderate number of cores, orders of magnitude below the range required to achieve the desired performance boost. In this study, strong scalability of currently used techniques to solve the bidomain equations is investigated. Benchmark results suggest that scalability is limited to 512-4096 cores within the range of relevant problem sizes even when systems are carefully load-balanced and advanced IO strategies are employed.

  3. Modeling Cardiac Electrophysiology at the Organ Level in the Peta FLOPS Computing Age

    SciTech Connect

    Mitchell, Lawrence; Bishop, Martin; Hoetzl, Elena; Neic, Aurel; Liebmann, Manfred; Haase, Gundolf; Plank, Gernot

    2010-09-30

    Despite a steep increase in available compute power, in-silico experimentation with highly detailed models of the heart remains challenging due to the high computational cost involved. It is hoped that next generation high performance computing (HPC) resources will lead to significant reductions in execution times to leverage a new class of in-silico applications. However, performance gains with these new platforms can only be achieved by engaging a much larger number of compute cores, necessitating strongly scalable numerical techniques. So far strong scalability has been demonstrated only for a moderate number of cores, orders of magnitude below the range required to achieve the desired performance boost. In this study, strong scalability of currently used techniques to solve the bidomain equations is investigated. Benchmark results suggest that scalability is limited to 512-4096 cores within the range of relevant problem sizes even when systems are carefully load-balanced and advanced IO strategies are employed.

  4. The style of a stranger: Identification expertise generalizes to coarser level categories.

    PubMed

    Searston, Rachel A; Tangen, Jason M

    2016-12-05

    Experience identifying visual objects and categories improves generalization within the same class (e.g., discriminating bird species improves transfer to new bird species), but does such perceptual expertise transfer to coarser category judgments? We tested whether fingerprint experts, who spend their days comparing pairs of prints and judging whether they were left by the same finger or two different fingers, can generalize their finger discrimination expertise to people more broadly. That is, can these experts identify prints from Jones's right thumb and prints from Jones's right index finger as instances of the same "Jones" category? Novices and experts were both sensitive to the style of a stranger's prints; despite lower levels of confidence, experts were significantly more sensitive to this style than novices. This expert advantage persisted even when we reduced the number of exemplars provided. Our results demonstrate that perceptual expertise can be flexible to upwards shifts in the level of specificity, suggesting a dynamic memory retrieval process.

  5. Identification of technical problems encountered in the shallow land burial of low-level radioactive wastes

    SciTech Connect

    Jacobs, D.G.; Epler, J.S.; Rose, R.R.

    1980-03-01

    A review of problems encountered in the shallow land burial of low-level radioactive wastes has been made in support of the technical aspects of the National Low-Level Waste (LLW) Management Research and Development Program being administered by the Low-Level Waste Management Program Office, Oak Ridge National Laboratory. The operating histories of burial sites at six major DOE and five commercial facilities in the US have been examined and several major problems identified. The problems experienced at the sites have been grouped into general categories dealing with site development, waste characterization, operation, and performance evaluation. Based on this grouping of the problems, a number of major technical issues have been identified which should be incorporated into program plans for further research and development. For each technical issue a discussion is presented relating the issue to a particular problem, identifying some recent or current related research, and suggesting further work necessary for resolving the issue. Major technical issues which have been identified include the need for improved water management, further understanding of the effect of chemical and physical parameters on radionuclide migration, more comprehensive waste records, improved programs for performance monitoring and evaluation, development of better predictive capabilities, evaluation of space utilization, and improved management control.

  6. Identification and quantification method of spiramycin and tylosin in feedingstuffs with HPLC-UV/DAD at 1 ppm level.

    PubMed

    Civitareale, C; Fiori, M; Ballerini, A; Brambilla, G

    2004-10-29

    The use of the two macrolide antibiotics Spiramycin (S) and Tylosin (T) as growth promoters in animal feeding has recently been withdrawn in the European Union due to concern about outbreaks of pharmacoresistance phenomena as a possible hazard for humans. For feed additives monitoring purposes, an analytical method has been developed for their extraction, purification and identification in different animal feedingstuffs (pelleted beef, pig and poultry feeds and calves' milk replacer) at a minimum required performance limit (MRPL) of 1 microg g(-1) (ppm). This limit has been established according to the lowest dosage of additives still able to elicit an appreciable growth promoting effect. Blank feeds were spiked at two concentration levels, 1.0 and 2.5 ppm, in six replicates. After methanolic extraction, samples were cleaned up on SPE CN columns and the extracts analysed by HPLC-UV/DAD, using gradient elution. Detection limits, calculated as three times the mean noise of 20 blank feeds, were 176 and 118 ng g(-1) for S and T, respectively. Results show good repeatability (CV% not exceeding 15) and mean recoveries in the range of 99-74% and 81-53% for S and T, respectively, at 1 ppm. When the standards were injected at up to 250 ng, the chromatographic method can resolve the components of the analytes (Spiramycin I, II and III; Tylosin A and B) but cannot resolve these components in real feed samples at the spiked levels considered. For this reason the identification and quantification of the analytes in the matrix were carried out using the main compound of each drug (Spiramycin I and Tylosin A). As a verification, the overlapping of UV spectra in the range 220-350 nm between analytical standards and the compounds in the matrix was considered.
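
    The detection limits quoted above follow the common convention of three times the mean blank noise, converted to concentration through the calibration. A minimal sketch of that arithmetic is given below; the noise values and calibration slope are invented, not the study's figures.

        # Detection-limit arithmetic, assuming LOD = 3 x mean blank noise converted to
        # concentration through a calibration slope. All numbers are invented.
        blank_noise = [60.2, 58.7, 61.5, 59.9]      # baseline noise, arbitrary signal units
        calibration_slope = 1.02                    # signal units per ng g-1 (hypothetical)

        mean_noise = sum(blank_noise) / len(blank_noise)
        lod = 3.0 * mean_noise / calibration_slope
        print(f"estimated LOD: {lod:.0f} ng g-1")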

  7. Computer Use Ethics among University Students and Staffs: The Influence of Gender, Religious Work Value and Organizational Level

    ERIC Educational Resources Information Center

    Mohamed, Norshidah; Karim, Nor Shahriza Abdul; Hussein, Ramlah

    2012-01-01

    Purpose: The purpose of this paper is to investigate the extent to which individual characteristics, which are gender, religious (Islamic) work value, and organization level (students and staff), are related to attitudes toward computer use ethics. This investigation is conducted in an academic setting in Malaysia, among those subscribing to the…

  8. POOLMS: A computer program for fitting and model selection for two level factorial replication-free experiments

    NASA Technical Reports Server (NTRS)

    Amling, G. E.; Holms, A. G.

    1973-01-01

    A computer program is described that performs a statistical multiple-decision procedure called chain pooling, in which a number of mean squares are assigned to the error variance, conditioned on the relative magnitudes of the mean squares. Model selection is done according to user-specified levels of type 1 or type 2 error probabilities.

  9. Student Conceptions about the DNA Structure within a Hierarchical Organizational Level: Improvement by Experiment- and Computer-Based Outreach Learning

    ERIC Educational Resources Information Center

    Langheinrich, Jessica; Bogner, Franz X.

    2015-01-01

    As non-scientific conceptions interfere with learning processes, teachers need both, to know about them and to address them in their classrooms. For our study, based on 182 eleventh graders, we analyzed the level of conceptual understanding by implementing the "draw and write" technique during a computer-supported gene technology module.…

  10. The Influence of the Level of Free-Choice Learning Activities on the Use of an Educational Computer Game

    ERIC Educational Resources Information Center

    Barendregt, Wolmet; Bekker, Tilde M.

    2011-01-01

    Employing a mixed-method explorative approach, this study examined the in situ use of and opinions about an educational computer game for learning English introduced in three schools offering different levels of freedom to choose school activities. The results indicated that the general behaviour of the children with the game was very different…

  11. Analysis of the applicability of geophysical methods and computer modelling in determining groundwater level

    NASA Astrophysics Data System (ADS)

    Czaja, Klaudia; Matula, Rafal

    2014-05-01

    The paper presents an analysis of the possibilities of applying geophysical methods to the investigation of groundwater conditions. In this paper groundwater is defined as liquid water flowing through shallow aquifers. Groundwater conditions are described through the distribution of permeable layers (like sand, gravel, fractured rock) and impermeable or low-permeability layers (like clay, till, solid rock) in the subsurface. GPR (Ground Penetrating Radar), ERT (Electrical Resistivity Tomography), VES (Vertical Electric Soundings) and seismic reflection, refraction and MASW (Multichannel Analysis of Surface Waves) belong to the non-invasive, surface geophysical methods. Due to differences in physical parameters such as dielectric constant, resistivity, density and elastic properties between saturated and unsaturated zones, it is possible to use geophysical techniques for groundwater investigations. A few programs for GPR, ERT, VES and seismic modelling were applied in order to verify and compare results. The models differ in the values of physical parameters such as dielectric constant, electrical conductivity, P- and S-wave velocity and density, in layer thickness, and in the depth of occurrence of the groundwater level. Results of computer modelling for the GPR and seismic methods and the interpretation of test-field measurements are presented. In all of these methods vertical resolution is the most important issue in groundwater investigations. This requires a proper measurement methodology, e.g. antennas with sufficiently high frequencies, a Wenner array in electrical surveys, and proper geometry for seismic studies. Seismic velocities of unconsolidated rocks like sand and gravel are strongly influenced by porosity and water saturation. No influence of the degree of water saturation on seismic velocities is observed below a value of about 90% water saturation. A further saturation increase leads to a strong increase of P-wave velocity and a slight decrease of S-wave velocity. But in the case of a few models only the

  12. Identification of new stress-induced microRNA and their targets in wheat using computational approach

    PubMed Central

    Pandey, Bharati; Gupta, Om Prakash; Pandey, Dev Mani; Sharma, Indu; Sharma, Pradeep

    2013-01-01

    MicroRNAs (miRNAs) are a class of short endogenous non-coding small RNA molecules of about 18–22 nucleotides in length. Their main function is to downregulate gene expression in different manners such as translational repression, mRNA cleavage and epigenetic modification. Computational predictions have raised the number of miRNAs in wheat significantly using an EST-based approach. Hence, a combinatorial approach, an amalgamation of bioinformatics software and Perl scripts, was used to identify new miRNAs to add to the growing database of wheat miRNA. Identification of miRNAs was initiated by mining the EST (Expressed Sequence Tags) database available at the National Center for Biotechnology Information. In this investigation, 4677 mature microRNA sequences belonging to 50 miRNA families from different plant species were used to predict miRNAs in wheat. A total of five abiotic stress-responsive new miRNAs were predicted and named Ta-miR5653, Ta-miR855, Ta-miR819k, Ta-miR3708 and Ta-miR5156. In addition, four previously identified miRNAs, i.e., Ta-miR1122, miR1117, Ta-miR1134 and Ta-miR1133, were predicted in newly identified EST sequences, and 14 potential target genes were subsequently predicted, most of which seem to encode ubiquitin carrier protein, serine/threonine protein kinase, 40S ribosomal protein, F-box/kelch-repeat protein, BTB/POZ domain-containing protein, and transcription factors, which are involved in growth, development, metabolism and stress response. Our results have increased the number of miRNAs in wheat, which should be useful for further investigation into the biological functions and evolution of miRNAs in wheat and other plant species. PMID:23511197
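
    The core of an EST-based prediction like this is a homology screen of known mature miRNAs against EST sequences, with candidates then filtered by secondary-structure and target criteria. The sketch below shows only the first, mismatch-limited screening step on invented toy sequences; it is not the study's actual pipeline, which combined dedicated bioinformatics software with Perl scripts.

        # First step of an EST-based miRNA screen: slide each known mature miRNA along an
        # EST and keep windows with at most a fixed number of mismatches. Sequences are
        # invented toy examples; the study's real pipeline adds structure and target checks.
        def mismatches(a, b):
            return sum(1 for x, y in zip(a, b) if x != y)

        def screen_est(est_seq, known_mirnas, max_mismatch=3):
            hits = []
            for name, mirna in known_mirnas.items():
                for i in range(len(est_seq) - len(mirna) + 1):
                    mm = mismatches(est_seq[i:i + len(mirna)], mirna)
                    if mm <= max_mismatch:
                        hits.append((name, i, mm))
            return hits

        known = {"mir-toy-1": "UGGAAGCUAGCUUAGGCAUC"}        # made-up mature miRNA
        est = "AUGCUGGAAGCUAGCUUAGGCAUCUUACG"                 # made-up EST fragment
        print(screen_est(est, known))                         # -> [('mir-toy-1', 4, 0)]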

  13. Identification of antipsychotic drug fluspirilene as a potential p53-MDM2 inhibitor: a combined computational and experimental study

    NASA Astrophysics Data System (ADS)

    Patil, Sachin P.; Pacitti, Michael F.; Gilroy, Kevin S.; Ruggiero, John C.; Griffin, Jonathan D.; Butera, Joseph J.; Notarfrancesco, Joseph M.; Tran, Shawn; Stoddart, John W.

    2015-02-01

    The inhibition of the tumor suppressor p53 protein due to its direct interaction with the oncogenic murine double minute 2 (MDM2) protein plays a central role in almost 50% of all human tumor cells. Therefore, pharmacological inhibition of the p53-binding pocket on MDM2, leading to p53 activation, presents an important therapeutic target against these cancers expressing wild-type p53. In this context, the present study utilized an integrated virtual and experimental screening approach to screen a database of approved drugs for potential p53-MDM2 interaction inhibitors. Specifically, using an ensemble rigid-receptor docking approach with four MDM2 protein crystal structures, six drug molecules were identified as possible p53-MDM2 inhibitors. These drug molecules were then subjected to further molecular modeling investigation through flexible-receptor docking followed by Prime/MM-GBSA binding energy analysis. These studies identified fluspirilene, an approved antipsychotic drug, as a top hit with an MDM2 binding mode and energy similar to those of a native MDM2 crystal ligand. Molecular dynamics simulations suggested stable binding of fluspirilene to the p53-binding pocket on the MDM2 protein. Experimental testing of fluspirilene showed significant growth inhibition of human colon tumor cells in a p53-dependent manner. Fluspirilene also inhibited growth of several other human tumor cell lines in the NCI60 cell line panel. Taken together, these computational and experimental data suggest a potentially novel role of fluspirilene in inhibiting the p53-MDM2 interaction. It is noteworthy here that fluspirilene has a long history of safe human use, thus presenting immediate clinical potential as a cancer therapeutic. Furthermore, fluspirilene could also serve as a structurally-novel lead molecule for the development of more potent, small-molecule p53-MDM2 inhibitors against several types of cancer. Importantly, the combined computational and experimental screening protocol

  14. Identification of low level gamma-irradiation of meats by high sensitivity comet assay

    NASA Astrophysics Data System (ADS)

    Miyahara, Makoto; Saito, Akiko; Ito, Hitoshi; Toyoda, Masatake

    2002-03-01

    The detection of low levels of irradiation in meats (pork, beef, and chicken) using the new comet assay was investigated in order to assess the capability of the procedure. The new assay includes a process that improves its sensitivity to irradiation and a novel evaluation system for each slide (influence score and comet-type distribution). Samples were purchased at retailers and were irradiated at 0.5 and 2 kGy at 0°C. The samples were processed to obtain comets. Slides were evaluated by typing comets, calculating the influence score and analyzing the comet-type distribution chart shown on the slide. Influence scores of beef, pork, and chicken at 0 kGy were 287 (SD=8.0), 305 (SD=12.9), and 320 (SD=21.0), respectively. Those at 500 Gy were 305 (SD=5.3), 347 (SD=10.6), and 364 (SD=12.6), respectively. Irradiation levels in food were successfully determined. Sensitivity to irradiation differed among samples (chicken>pork>beef).

  15. Oil contamination of fish in the North Sea. Determination of levels and identification of sources

    SciTech Connect

    Johnsen, S.; Restucci, R.; Klungsoyr, J.

    1996-12-31

    Two fish species, cod and haddock, were sampled from five different regions in the Norwegian sector of the North Sea, the Haltenbanken and the Barents Sea. Three of the five sampling areas were located in regions with no local oil or gas production, while the remaining two areas represented regions with a high density of oil and gas production fields. A total of 25 specimens of each of the two fish species were collected, and liver (all samples) and muscle (10 samples from each group) were analyzed for the content of total hydrocarbons (THC), selected aromatic compounds (NPD and PAH) and bicyclic aliphatic decalins. The present paper outlines the results of the liver sample analyses from four of the sampled regions: the northern North Sea region and the three reference regions Egersundbanken, Haltenbanken and the Barents Sea. In general, no significant difference was observed between the hydrocarbon levels within the sampled regions. The only observed exception was a moderate but significant increase in decalin levels in haddock liver from the northern North Sea region. The qualitative interpretation of the results showed that the sources of hydrocarbon contamination varied within the total sampling area. This observation indicates that local discharges in areas with high petroleum production activity are the sources of hydrocarbons in fish from such areas. However, it was not possible to identify single discharges as a contamination source from the present results.

  16. Identification of risk factors for Campylobacter contamination levels on broiler carcasses during the slaughter process.

    PubMed

    Seliwiorstow, Tomasz; Baré, Julie; Berkvens, Dirk; Van Damme, Inge; Uyttendaele, Mieke; De Zutter, Lieven

    2016-06-02

    Campylobacter carcass contamination was quantified across the slaughter line during processing of Campylobacter-positive batches. These quantitative data were combined with information describing slaughterhouse and batch-related characteristics in order to identify risk factors for Campylobacter contamination levels on broiler carcasses. The results revealed that Campylobacter counts are influenced by the contamination of incoming birds (both the initial external carcass contamination and the colonization level of the caeca) and by the duration of transport and holding time, which can be linked to the feed withdrawal period. In addition, technical aspects of the slaughter process such as a dump-based unloading system, electrical stunning, a lower scalding temperature, and incorrect settings of the plucking, vent cutter and evisceration machines were identified as risk factors associated with increased Campylobacter counts on processed carcasses. As such, the study indicates possible improvements of the slaughter process that can result in better control of Campylobacter numbers under routine processing of Campylobacter-positive batches without the use of chemical or physical decontamination. Moreover, all investigated factors were existing variations of routine processing practices, and the proposed interventions are therefore practically and economically achievable.

  17. Assessment of the level of the actual and desirable levels of computer literacy, usage and expected knowledge of undergraduate students of nursing.

    PubMed

    Hardy, J L

    1995-01-01

    The aim of the study was to determine the perceptions of students entering nursing at the bachelor's level of their actual and desirable knowledge about computers and their applications relevant to nursing and health care. Within the health care system the use of computerized systems is increasing rapidly. In Australia, the NSW Health Department's Information Management Resource Consortium pilot project--to introduce the First Data Hospital Information System into NSW public hospitals--is a significant example of this trend. While the importance of computerized systems is fairly well recognized for the areas of management and research, they are becoming increasingly significant in the delivery of clinical care and quality assurance. It is important that nurses during their undergraduate education develop the computer literacy and awareness that will allow them access to both the information and its management. To achieve this effectively it is essential to determine both the entry knowledge of students and what skills and knowledge are essential and desirable for their future roles as nurses. The research undertaken replicated an American study [2] and used its validated questionnaire. Both pre-registration (n=20) and post-registration (n=24) undergraduate students responded to a 21-item questionnaire. The issues addressed related to computer literacy, usage, and knowledge of clinical applications. Both groups desired more "hands on" experience and knowledge of the nurse's role in developing applications and using computers to help care for patients. These results were consistent with the Parks et al. study. The significance of comparing actual and desirable levels of computer knowledge and awareness is in assisting educators to shape curriculum and course content to more effectively meet the educational needs of these groups in terms of Health Informatics.

  18. Rapid identification of Brucella isolates to the species level by real time PCR based single nucleotide polymorphism (SNP) analysis

    PubMed Central

    Gopaul, Krishna K; Koylass, Mark S; Smith, Catherine J; Whatmore, Adrian M

    2008-01-01

    Background Brucellosis, caused by members of the genus Brucella, remains one of the world's major zoonotic diseases. Six species have classically been recognised within the genus Brucella largely based on a combination of classical microbiology and host specificity, although more recently additional isolations of novel Brucella have been reported from various marine mammals and voles. Classical identification to the species level is based on a biotyping approach that is lengthy, requires extensive and hazardous culturing and can be difficult to interpret. Here we describe a simple and rapid approach to identification of Brucella isolates to the species level based on real-time PCR analysis of species-specific single nucleotide polymorphisms (SNPs) that were identified following a robust and extensive phylogenetic analysis of the genus. Results Seven pairs of short-sequence Minor Groove Binding (MGB) probes were designed corresponding to SNPs shown to possess an allele specific for each of the six classical Brucella spp. and the marine mammal Brucella. Assays were optimised to identical reaction parameters in order to give a multiple-outcome assay that can differentiate all the classical species and Brucella isolated from marine mammals. The scope of the assay was confirmed by testing of over 300 isolates of Brucella, all of which typed as predicted when compared to other phenotypic and genotypic approaches. The assay is sensitive, being capable of detecting and differentiating down to 15 genome equivalents. We further describe the design and testing of assays based on three additional SNPs located within the 16S rRNA gene that ensure positive discrimination of Brucella from close phylogenetic relatives on the same platform. Conclusion The multiple-outcome assay described represents a new tool for the rapid, simple and unambiguous characterisation of Brucella to the species level. Furthermore, being based on a robust phylogenetic framework, the assay provides a platform

  19. Identification of Geotrichum candidum at the species and strain level: proposal for a standardized protocol.

    PubMed

    Gente, S; Sohier, D; Coton, E; Duhamel, C; Gueguen, M

    2006-12-01

    In this study, the M13 primer was used to distinguish Geotrichum candidum from the anamorphic and teleomorphic forms of other arthrospore-forming species (discriminatory power = 0.99). For intraspecific characterization, the GATA4 primer showed the highest level of discrimination for G. candidum among the 20 microsatellite primers tested. A molecular typing protocol (DNA concentration, hybridization temperature and type of PCR machine) was optimized through a series of intra- and interlaboratory trials. This protocol was validated using 75 strains of G. candidum, one strain of G. capitatum and one strain of G. fragrans, and exhibited a discrimination score of 0.87. This method could therefore be used in the agro-food industries to identify and to evaluate biodiversity and trace strains of G. candidum. The results show that the GATA4 primer might be used to differentiate strains according to their ecological niche.

  20. Analysis and identification of two reconstituted tobacco sheets by three-level infrared spectroscopy

    NASA Astrophysics Data System (ADS)

    Wu, Xian-xue; Xu, Chang-hua; Li, Ming; Sun, Su-qin; Li, Jin-ming; Dong, Wei

    2014-07-01

    Two kinds of reconstituted tobacco (RT) from France (RTF) and China (RTC) were analyzed and identified by a three-level infrared spectroscopy method (Fourier-transform infrared spectroscopy (FT-IR) coupled with second derivative infrared spectroscopy (SD-IR) and two-dimensional infrared correlation spectroscopy (2D-IR)). The conventional IR spectra of the RTF parallel samples were more consistent than those of RTC according to their overlapped parallel spectra and IR spectra correlation coefficients. The FT-IR spectra of the two RTs were similar in holistic spectral profile except for small differences around 1430 cm-1, indicating that they have similar chemical constituents. By analysis of the SD-IR spectra of RTFs and RTCs, more distinct fingerprint features, especially peaks at 1106 (1110), 1054 (1059) and 877 (874) cm-1, were disclosed. Even better reproducibility of the five SD-IR spectra of RTF in 1750-1400 cm-1 could be seen intuitively from their stacked spectra and could be confirmed by further similarity evaluation of the SD-IR spectra. The existence of calcium carbonate and calcium oxalate could be easily observed in the two RTs by comparing their spectra with references. Furthermore, the 2D-IR spectra provided obvious, vivid and intuitive differences between RTF and RTC. Both RTs had a pair of strong positive auto-peaks in 1600-1400 cm-1. Specifically, the auto-peak at 1586 cm-1 in RTF was stronger than the one around 1421 cm-1, whereas the one at 1587 cm-1 in RTC was weaker than that at 1458 cm-1. Consequently, the RTs of the two different brands were analyzed and identified thoroughly, and RTF had better homogeneity than RTC. As a result, the three-level infrared spectroscopy method has proved to be a simple, convenient and efficient method for rapid discrimination and homogeneity estimation of RT.

  1. Student conceptions about the DNA structure within a hierarchical organizational level: Improvement by experiment- and computer-based outreach learning.

    PubMed

    Langheinrich, Jessica; Bogner, Franz X

    2015-01-01

    As non-scientific conceptions interfere with learning processes, teachers need both to know about them and to address them in their classrooms. For our study, based on 182 eleventh graders, we analyzed the level of conceptual understanding by implementing the "draw and write" technique during a computer-supported gene technology module. Giving participants the hierarchical organizational level which they had to draw was a specific feature of our study. We introduced two objective category systems for analyzing drawings and inscriptions. Our results indicated a long-term as well as a short-term increase in the level of conceptual understanding and in the number of drawn elements and their grades concerning the DNA structure. Consequently, we regard the "draw and write" technique as a tool for teachers to get to know students' alternative conceptions. Furthermore, our study points to the modification potential of hands-on and computer-supported learning modules.

  2. Identification of high-level functional/system requirements for future civil transports

    NASA Technical Reports Server (NTRS)

    Swink, Jay R.; Goins, Richard T.

    1992-01-01

    In order to accommodate the rapid growth in commercial aviation throughout the remainder of this century, the Federal Aviation Administration (FAA) is faced with a formidable challenge to upgrade and/or modernize the National Airspace System (NAS) without compromising safety or efficiency. A recurring theme in both the Aviation System Capital Investment Plan (CIP), which has replaced the NAS Plan, and the new FAA Plan for Research, Engineering, and Development (RE&D) is the reliance on the application of new technologies and a greater use of automation. Identifying the high-level functional and system impacts of such modernization efforts on future civil transport operational requirements, particularly in terms of cockpit functionality and information transfer, was the primary objective of this project. The FAA planning documents for the NAS of the 2005 era and beyond were surveyed; major aircraft functional capabilities and system components required for such an operating environment were identified. A hierarchical structured analysis of the information processing and flows emanating from such functional/system components was conducted and the results documented in graphical form depicting the relationships between functions and systems.

  3. Simple and rapid molecular techniques for identification of amylose levels in rice varieties.

    PubMed

    Cheng, Acga; Ismail, Ismanizan; Osman, Mohamad; Hashim, Habibuddin

    2012-01-01

    The polymorphisms of Waxy (Wx) microsatellite and G-T single-nucleotide polymorphism (SNP) in the Wx gene region were analyzed using simplified techniques in fifteen rice varieties. A rapid and reliable electrophoresis method, MetaPhor agarose gel electrophoresis (MAGE), was effectively employed as an alternative to polyacrylamide gel electrophoresis (PAGE) for separating Wx microsatellite alleles. The amplified products containing the Wx microsatellite ranged from 100 to 130 bp in length. Five Wx microsatellite alleles, namely (CT)(10), (CT)(11), (CT)(16), (CT)(17), and (CT)(18) were identified. Of these, (CT)(11) and (CT)(17) were the predominant classes among the tested varieties. All varieties with an apparent amylose content higher than 24% were associated with the shorter repeat alleles; (CT)(10) and (CT)(11), while varieties with 24% or less amylose were associated with the longer repeat alleles. All varieties with intermediate and high amylose content had the sequence AGGTATA at the 5'-leader intron splice site, while varieties with low amylose content had the sequence AGTTATA. The G-T polymorphism was further verified by the PCR-AccI cleaved amplified polymorphic sequence (CAPS) method, in which only genotypes containing the AGGTATA sequence were cleaved by AccI. Hence, varieties with desirable amylose levels can be developed rapidly using the Wx microsatellite and G-T SNP, along with MAGE.

  4. A client-server software for the identification of groundwater vulnerability to pesticides at regional level.

    PubMed

    Di Guardo, Andrea; Finizio, Antonio

    2015-10-15

    The groundwater VULnerability to PESticide software system (VULPES) is a user-friendly, GIS-based, client-server software system developed to identify areas vulnerable to pesticides at the regional level making use of pesticide fate models. It is a Decision Support System aimed at assisting public policy makers in investigating areas sensitive to specific substances and in proposing limitations of use or mitigation measures. VULPES identifies so-called Uniform Geographical Units (UGUs), which are areas characterised by the same agro-environmental conditions. In each UGU it applies the PELMO model, obtaining the 80th percentile of the substance concentration at 1 metre depth; VULPES then creates a vulnerability map in shapefile format which classifies the outputs by comparing them with a lower threshold set to the legal limit concentration in groundwater (0.1 μg/l). This paper describes the software structure in detail, together with a case study applying the herbicide terbuthylazine to the Lombardy region territory. Three zones with different degrees of vulnerability have been identified and described.
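
    The classification step described above reduces to comparing each UGU's modelled concentration with the 0.1 μg/l legal limit. The sketch below illustrates that comparison; the class names, the upper band edge and the example concentrations are assumptions for illustration, not VULPES' actual output categories.

        # Vulnerability classification sketch: each UGU is summarized by its 80th-percentile
        # predicted concentration at 1 m depth (PEC80, ug/L) and compared with the 0.1 ug/L
        # legal limit. Class names, the upper band edge and the PEC80 values are assumptions.
        LEGAL_LIMIT = 0.1  # ug/L

        def classify_ugu(pec80):
            if pec80 < LEGAL_LIMIT:
                return "not vulnerable"
            elif pec80 < 10 * LEGAL_LIMIT:
                return "vulnerable"
            return "highly vulnerable"

        ugus = {"UGU-001": 0.02, "UGU-002": 0.35, "UGU-003": 4.8}   # invented PEC80 values
        for name, pec in ugus.items():
            print(name, classify_ugu(pec))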

  5. Identification of oxidized phospholipids in bronchoalveolar lavage exposed to low ozone levels using multivariate analysis

    PubMed Central

    Almstrand, Ann-Charlotte; Voelker, Dennis; Murphy, Robert C

    2015-01-01

    Chemical reactions with unsaturated phospholipids in the respiratory tract lining fluid have been identified as one of the first important steps in the mechanisms mediating environmental ozone toxicity. As a consequence of these reactions, complex mixtures of oxidized lipids are generated in the presence of mixtures of non-oxidized naturally occurring phospholipid molecular species, which challenge methods of analysis. Untargeted mass spectrometry and statistical methods were employed to approach these complex spectra. Human bronchoalveolar lavage (BAL) was exposed to low levels of ozone and samples, with and without derivatization of aldehydes, were analyzed by liquid chromatography electrospray ionization tandem mass spectrometry. Data processing was carried out using principal component analysis (PCA). Resulting PCA score plots indicated an ozone dose-dependent increase, with apparent separation between BAL samples exposed to 60 ppb ozone and non-exposed BAL samples, and a clear separation between ozonized samples before and after derivatization. Corresponding loadings plots revealed that more than 30 phosphatidylcholine (PC) species decreased due to ozonation. A total of 13 PC and 6 phosphatidylglycerol oxidation products were identified with the majority being structurally characterized as chain-shortened aldehyde products. This method exemplifies an approach for comprehensive detection of low abundance, yet important, components in complex lipid samples. PMID:25575758
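
    The multivariate step described here reduces to running PCA on a samples-by-features intensity matrix and inspecting the score plot for separation between exposed and unexposed samples. A minimal sketch of that step is given below; the intensity matrix is a random placeholder, not the study's LC-MS/MS data.

        # PCA on an untargeted feature table (rows: BAL samples, columns: LC-MS/MS feature
        # intensities). The matrix here is a random placeholder; in the study, score plots
        # separated ozone-exposed from unexposed samples along the leading components.
        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.preprocessing import StandardScaler

        rng = np.random.default_rng(0)
        X = rng.lognormal(mean=8.0, sigma=1.0, size=(12, 200))    # 12 samples x 200 features
        labels = ["control"] * 6 + ["ozone_60ppb"] * 6

        scores = PCA(n_components=2).fit_transform(StandardScaler().fit_transform(np.log10(X)))
        for lab, (pc1, pc2) in zip(labels, scores):
            print(f"{lab:12s} PC1={pc1:7.2f} PC2={pc2:7.2f}")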

  6. Cytokinins of the Developing Mango Fruit : Isolation, Identification, and Changes in Levels during Maturation.

    PubMed

    Chen, W S

    1983-02-01

    The cytokinin activity has been isolated and identified from extracts of immature mango (Mangifera indica L.) seeds. The structures of zeatin, zeatin riboside, and N(6)-(Delta(2)-isopentenyl)adenine riboside were confirmed on the basis of their chromatographic behavior and mass spectra of trimethylsilyl derivatives. Both trans and cis isomers of zeatin and zeatin riboside were also identified by the retention times of high performance liquid chromatography. In addition, an unidentified compound appeared to be a cytokinin glucoside.The concentration of cytokinins in the panicle and pulp of mango reached a maximum 5 to 10 days after full bloom and decreased rapidly thereafter. The cytokinin level in the seed remained high until the 28th day after full bloom. The quantity of cytokinins in pulp per fruit increased from the 10th day after full bloom, the maximum being attained around the 50th day after full bloom. Similarly, the amount of cytokinins per seed increased from the 10th day after full bloom, reaching a peak on the 40th day and decreasing gradually thereafter.A high percentage of fruit set in mango was persistently maintained by supplying 6-benzylaminopurine (1.5 x 10(3) micromolar) onto the panicle at the anthesis stage and by supplying gibberellic acid (7.2 x 10(2) micromolar) and naphthalene acetamide (3.1 x 10 micromolar) at the young fruit stage.

  7. Feature-level signal processing for near-real-time odor identification

    NASA Astrophysics Data System (ADS)

    Roppel, Thaddeus A.; Padgett, Mary Lou; Waldemark, Joakim T. A.; Wilson, Denise M.

    1998-09-01

    Rapid detection and classification of odor is of particular interest in applications such as manufacturing of consumer items, food processing, drug and explosives detection, and battlefield situation assessment. Various detection and classification techniques are under investigation so that end users can have access to useful information from odor sensor arrays in near-real-time. Feature-level data clustering and classification techniques are proposed that are (1) parallelizable to permit efficient hardware implementation, (2) adaptable to readily incorporate new data classes, (3) capable of gracefully handling outlier data points and failed sensor conditions, and (4) can provide confidence intervals and/or a traceable decision record along with each classification to permit validation and verification. Results from using specific techniques will be presented and compared. The techniques studied include principal components analysis, automated outlier determination, radial basis functions (RBF), multi-layer perceptrons (MLP), and pulse-coupled neural networks (PCNN). The results reported here are based on data from a testbed in which a gas sensor array is exposed to odor samples on a continuous basis. We have reported previously that more detailed and faster discrimination can be obtained by using sensor transient response in addition to steady state response. As the size of the data set grows we are able to more accurately model performance of a sensor array under realistic conditions.
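
    A feature-level pipeline of the kind surveyed here can be assembled from standard building blocks: per-sensor steady-state and transient descriptors, principal components analysis for dimensionality reduction, and a multi-layer perceptron classifier. The sketch below is one such assembly on synthetic placeholder data; it omits the RBF, PCNN and outlier-handling variants discussed above.

        # Feature-level pipeline sketch: per-sensor steady-state and transient descriptors,
        # PCA for dimensionality reduction, and an MLP classifier. Data are synthetic
        # placeholders; RBF, PCNN and outlier-handling variants are not shown.
        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.neural_network import MLPClassifier
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler

        rng = np.random.default_rng(1)
        n_samples, n_sensors = 120, 8
        steady = rng.normal(size=(n_samples, n_sensors))          # steady-state responses
        transient = rng.normal(size=(n_samples, n_sensors))       # e.g. initial response slopes
        X = np.hstack([steady, transient])
        y = rng.integers(0, 3, size=n_samples)                    # three odor classes (random)

        clf = make_pipeline(StandardScaler(), PCA(n_components=6),
                            MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0))
        clf.fit(X[:90], y[:90])
        print("held-out accuracy:", clf.score(X[90:], y[90:]))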

  8. Identification of a gene, FMP21, whose expression levels are involved in thermotolerance in Saccharomyces cerevisiae

    PubMed Central

    2014-01-01

    Elucidation of the mechanism of high temperature tolerance in yeasts is important for the molecular breeding of high temperature-tolerant yeasts that can be used in bioethanol production. We identified genes whose expression is correlated with the degree of thermotolerance in Saccharomyces cerevisiae by DNA microarray analysis. Gene expression profiles of three S. cerevisiae strains showing different levels of thermotolerance were compared, and three genes were chosen as candidates. Among these genes, FMP21 was investigated as a thermotolerance-related gene in S. cerevisiae by comparing growth at high temperature with gene expression in eight strains. The expression ratio of FMP21 at 37°C was correlated with the doubling time ratio with a coefficient of determination of 0.787. The potential involvement of Fmp21 in the thermotolerance of yeasts was evaluated. The FMP21 deletion variant showed a decreased respiratory growth rate and increased thermosensitivity. Furthermore, the overexpression of FMP21 improved thermotolerance in yeasts. In conclusion, the function of Fmp21 is important for thermotolerance in yeasts. PMID:25177541

  9. Systems-level modeling of mycobacterial metabolism for the identification of new (multi-)drug targets.

    PubMed

    Rienksma, Rienk A; Suarez-Diez, Maria; Spina, Lucie; Schaap, Peter J; Martins dos Santos, Vitor A P

    2014-12-01

    Systems-level metabolic network reconstructions and the derived constraint-based (CB) mathematical models are efficient tools to explore bacterial metabolism. Approximately one-fourth of the Mycobacterium tuberculosis (Mtb) genome contains genes that encode proteins directly involved in its metabolism. These represent potential drug targets that can be systematically probed with CB models through the prediction of genes (or combinations of genes) essential for the pathogen to grow. However, gene essentiality depends on the growth conditions and, so far, no in vitro model precisely mimics the host at the different stages of mycobacterial infection, limiting model predictions. These limitations can be circumvented by combining expression data from in vivo samples with a validated CB model, creating an accurate description of pathogen metabolism in the host. To this end, we present here a thoroughly curated and extended genome-scale CB metabolic model of Mtb quantitatively validated using 13C measurements. We describe some of the efforts made in integrating CB models and high-throughput data to generate condition-specific models, and we discuss the challenges ahead. This knowledge and the framework presented herein will enable the identification of potential new drug targets and will foster the development of optimal therapeutic strategies.
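
    At its core, a constraint-based prediction of growth and gene essentiality is a linear program: maximize the biomass flux subject to steady-state mass balance and flux bounds, then repeat with individual reactions disabled. The sketch below illustrates this on a toy three-reaction network, not the genome-scale Mtb model; dedicated COBRA-style toolboxes are normally used for real reconstructions.

        # Toy flux-balance analysis: maximize biomass flux subject to steady state (S v = 0)
        # and flux bounds, then re-optimize with each reaction knocked out. This is a
        # three-reaction illustration, not the genome-scale Mtb reconstruction.
        import numpy as np
        from scipy.optimize import linprog

        # Metabolites: A, B. Reactions: uptake (-> A), conversion (A -> B), biomass (B ->).
        S = np.array([[1.0, -1.0,  0.0],
                      [0.0,  1.0, -1.0]])
        bounds = [(0, 10), (0, 1000), (0, 1000)]          # uptake capped at 10 units
        c = np.array([0.0, 0.0, -1.0])                    # maximize biomass = minimize -v_biomass

        res = linprog(c, A_eq=S, b_eq=np.zeros(S.shape[0]), bounds=bounds, method="highs")
        print("wild-type biomass flux:", -res.fun)

        # Reaction essentiality screen: disable one reaction at a time and re-solve.
        for j in range(S.shape[1]):
            ko_bounds = list(bounds)
            ko_bounds[j] = (0, 0)
            ko = linprog(c, A_eq=S, b_eq=np.zeros(S.shape[0]), bounds=ko_bounds, method="highs")
            print(f"reaction {j} knocked out -> biomass {-ko.fun:.2f}")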

  10. Introducing Creativity in a Design Laboratory for a Freshman Level Electrical and Computer Engineering Course

    ERIC Educational Resources Information Center

    Burkett, Susan L.; Kotru, Sushma; Lusth, John C.; McCallum, Debra; Dunlap, Sarah

    2014-01-01

    In the electrical and computer engineering (ECE) curriculum at The University of Alabama, freshmen are introduced to fundamental electrical concepts and units, DC circuit analysis techniques, operational amplifiers, circuit simulation, design, and professional ethics. The two credit course has both…

  11. Computer-Assisted Instruction: Potential for College Level Instruction and Review of Research.

    ERIC Educational Resources Information Center

    Dwyer, Francis M.

    Some basic concepts and types of computer assisted instruction (CAI) are presented, and their application in college and university settings is considered. CAI literature of the late 1960's--including descriptions of specific CAI systems together with studies of instructional effectiveness, learning time, and student attitudes--is then summarized.…

  12. Assessing the Computational Literacy of Elementary Students on a National Level in Korea

    ERIC Educational Resources Information Center

    Jun, SooJin; Han, SunGwan; Kim, HyeonCheol; Lee, WonGyu

    2014-01-01

    Information and communication technology (ICT) literacy education has become an important issue, and the necessity of computational literacy (CL) has been increasing in our growing information society. CL is becoming an important element for future talents, and many countries, including the USA, are developing programs for CL education.…

  13. Photochemistry of Naphthopyrans and Derivatives: A Computational Experiment for Upper-Level Undergraduates or Graduate Students

    ERIC Educational Resources Information Center

    Castet, Frédéric; Méreau, Raphaël; Liotard, Daniel

    2014-01-01

    In this computational experiment, students use advanced quantum chemistry tools to simulate the photochromic reaction mechanism in naphthopyran derivatives. The first part aims to make students familiar with excited-state reaction mechanisms and addresses the photoisomerization of the benzopyran molecule by means of semiempirical quantum chemical…

  14. Developing Instructional Applications at the Secondary Level. The Computer as a Tool.

    ERIC Educational Resources Information Center

    McManus, Jack; And Others

    Case studies are presented for seven Los Angeles area (California) high schools that worked with Pepperdine University in the IBM/ETS (International Business Machines/Educational Testing Service) Model Schools program, a project which provided training for selected secondary school teachers in the use of personal computers and selected software as…

  15. The Effects of Computer-Assisted Material on Students' Cognitive Levels, Misconceptions and Attitudes Towards Science

    ERIC Educational Resources Information Center

    Cepni, Salih; Tas, Erol; Kose, Sacit

    2006-01-01

    The purpose of this study was to investigate the effects of a Computer-Assisted Instruction Material (CAIM) related to "photosynthesis" topic on student cognitive development, misconceptions and attitudes. The study conducted in 2002-2003 academic year and was carried out in two different classes taught by the same teacher, in which…

  16. Examining the Use of Computer Algebra Systems in University-Level Mathematics Teaching

    ERIC Educational Resources Information Center

    Lavicza, Zsolt

    2009-01-01

    The use of Computer Algebra Systems (CAS) is becoming increasingly important and widespread in mathematics research and teaching. In this paper, I will report on a questionnaire study enquiring about mathematicians' use of CAS in mathematics teaching in three countries; the United States, the United Kingdom, and Hungary. Based on the responses…

  17. Computer Assisted Vocational Math. Written for TRS-80, Model I, Level II, 16K.

    ERIC Educational Resources Information Center

    Daly, Judith; And Others

    This computer-assisted curriculum is intended to be used to enhance a vocational mathematics/applied mathematics course. A total of 32 packets were produced to increase the basic mathematics skills of students in the following vocational programs: automotive trades, beauty culture, building trades, climate control, electrical trades,…

  18. An Investigation into Specifying Service Level Agreements for Provisioning Cloud Computing Services

    DTIC Science & Technology

    2012-12-01

    …of the computing environment. An SLA for a traditional network system covers network support services, application performance, client-side… Later sections of the report cover lessons learned from the Amazon EC2 blackouts, conclusions, and issues and lessons learned.

  19. Investigating the Relationship between Curiosity Level and Computer Self Efficacy Beliefs of Elementary Teachers Candidates

    ERIC Educational Resources Information Center

    Gulten, Dilek Cagirgan; Yaman, Yavuz; Deringol, Yasemin; Ozsari, Ismail

    2011-01-01

    Nowadays, "lifelong learning individual" concept is gaining importance in which curiosity is one important feature that an individual should have as a requirement of learning. It is known that learning will naturally occur spontaneously when curiosity instinct is awakened during any learning-teaching process. Computer self-efficacy…

  20. Teaching Computer Languages and Elementary Theory for Mixed Audiences at University Level

    ERIC Educational Resources Information Center

    Christiansen, Henning

    2004-01-01

    Theoretical issues of computer science are traditionally taught in a way that presupposes a solid mathematical background and are usually considered more or less inaccessible for students without this. An effective methodology is described which has been developed for a target group of university students with different backgrounds such as natural…

  1. Computers and Traditional Teaching Practices: Factors Influencing Middle Level Students' Science Achievement and Attitudes about Science

    ERIC Educational Resources Information Center

    Odom, Arthur Louis; Marszalek, Jacob M.; Stoddard, Elizabeth R.; Wrobel, Jerzy M.

    2011-01-01

    The purpose of this study was to examine the association of middle school student science achievement and attitudes toward science with student-reported frequency of using computers to learn science and other classroom practices. Baseline comparison data were collected on the frequency of student-centred teaching practices (e.g. the use of group…

  2. Analysis of lidar elevation data for improved identification and delineation of lands vulnerable to sea-level rise

    USGS Publications Warehouse

    Gesch, D.B.

    2009-01-01

    The importance of sea-level rise in shaping coastal landscapes is well recognized within the earth science community, but as with many natural hazards, communicating the risks associated with sea-level rise remains a challenge. Topography is a key parameter that influences many of the processes involved in coastal change, and thus, up-to-date, high-resolution, high-accuracy elevation data are required to model the coastal environment. Maps of areas subject to potential inundation have great utility to planners and managers concerned with the effects of sea-level rise. However, most of the maps produced to date are simplistic representations derived from older, coarse elevation data. In the last several years, vast amounts of high quality elevation data derived from lidar have become available. Because of their high vertical accuracy and spatial resolution, these lidar data are an excellent source of up-to-date information from which to improve identification and delineation of vulnerable lands. Four elevation datasets of varying resolution and accuracy were processed to demonstrate that the improved quality of lidar data leads to more precise delineation of coastal lands vulnerable to inundation. A key component of the comparison was to calculate and account for the vertical uncertainty of the elevation datasets. This comparison shows that lidar allows for a much more detailed delineation of the potential inundation zone when compared to other types of elevation models. It also shows how the certainty of the delineation of lands vulnerable to a given sea-level rise scenario is much improved when derived from higher resolution lidar data. © 2009 Coastal Education and Research Foundation.
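
    Accounting for vertical uncertainty when delineating potential inundation typically means widening the elevation threshold by a confidence margin derived from the dataset's vertical RMSE. The sketch below shows that idea on a tiny invented elevation grid; the 1.96 x RMSE margin is a common convention and an assumption here, not a value from the study.

        # Delineating land vulnerable to a sea-level rise scenario from a DEM, widening the
        # threshold by a margin derived from the data's vertical RMSE. The grid and the
        # 1.96 x RMSE convention are illustrative assumptions, not values from the study.
        import numpy as np

        dem = np.array([[0.2, 0.6, 1.4],
                        [0.1, 0.9, 2.3],
                        [0.4, 1.1, 3.0]])     # elevations (m above mean sea level)

        slr = 1.0                             # sea-level rise scenario (m)
        rmse = 0.15                           # vertical RMSE of the lidar-derived DEM (m)
        margin = 1.96 * rmse                  # ~95% confidence margin on elevations

        certainly_inundated = dem <= (slr - margin)
        possibly_inundated = (dem > (slr - margin)) & (dem <= (slr + margin))
        print("cells certainly inundated:", int(certainly_inundated.sum()))
        print("cells possibly inundated: ", int(possibly_inundated.sum()))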

  3. Identification of Nucleotide-Level Changes Impacting Gene Content and Genome Evolution in Orthopoxviruses

    PubMed Central

    Hatcher, Eneida L.; Hendrickson, Robert Curtis

    2014-01-01

    ABSTRACT Poxviruses are composed of large double-stranded DNA (dsDNA) genomes coding for several hundred genes whose variation has supported virus adaptation to a wide variety of hosts over their long evolutionary history. Comparative genomics has suggested that the Orthopoxvirus genus in particular has undergone reductive evolution, with the most recent common ancestor likely possessing a gene complement consisting of all genes present in any existing modern-day orthopoxvirus species, similar to the current Cowpox virus species. As orthopoxviruses adapt to new environments, the selection pressure on individual genes may be altered, driving sequence divergence and possible loss of function. This is evidenced by accumulation of mutations and loss of protein-coding open reading frames (ORFs) that progress from individual missense mutations to gene truncation through the introduction of early stop mutations (ESMs), gene fragmentation, and in some cases, a total loss of the ORF. In this study, we have constructed a whole-genome alignment for representative isolates from each Orthopoxvirus species and used it to identify the nucleotide-level changes that have led to gene content variation. By identifying the changes that have led to ESMs, we were able to determine that short indels were the major cause of gene truncations and that the genome length is inversely proportional to the number of ESMs present. We also identified the number and types of protein functional motifs still present in truncated genes to assess their functional significance. IMPORTANCE This work contributes to our understanding of reductive evolution in poxviruses by identifying genomic remnants such as single nucleotide polymorphisms (SNPs) and indels left behind by evolutionary processes. Our comprehensive analysis of the genomic changes leading to gene truncation and fragmentation was able to detect some of the remnants of these evolutionary processes still present in orthopoxvirus genomes and
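
    One concrete piece of the analysis described above is spotting early stop mutations: reading an open reading frame codon by codon and flagging stop codons that occur before the annotated end. The sketch below does exactly that on invented toy sequences; the real study worked from a whole-genome alignment of representative orthopoxvirus isolates.

        # Flagging early stop mutations (ESMs): read an ORF codon by codon and report stop
        # codons occurring before the final codon. Sequences are invented toy examples.
        STOP_CODONS = {"TAA", "TAG", "TGA"}

        def early_stops(orf_seq):
            # Return 0-based codon indices of stops before the annotated terminal codon.
            codons = [orf_seq[i:i + 3] for i in range(0, len(orf_seq) - 2, 3)]
            return [i for i, c in enumerate(codons[:-1]) if c in STOP_CODONS]

        intact = "ATGGCTGCAAAAGGTTTAGCCTAA"         # no internal stop codon
        truncated = "ATGGCTTGAAAAGGTTTAGCCTAA"      # TGA introduced at codon 2
        print(early_stops(intact))                   # -> []
        print(early_stops(truncated))                # -> [2]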

  4. Achieving production-level use of HEP software at the Argonne Leadership Computing Facility

    NASA Astrophysics Data System (ADS)

    Uram, T. D.; Childers, J. T.; LeCompte, T. J.; Papka, M. E.; Benjamin, D.

    2015-12-01

    HEP's demand for computing resources has grown beyond the capacity of the Grid, and these demands will accelerate with the higher energy and luminosity planned for Run II. Mira, the ten petaFLOPs supercomputer at the Argonne Leadership Computing Facility, is a potentially significant compute resource for HEP research. Through an award of fifty million hours on Mira, we have delivered millions of events to LHC experiments by establishing the means of marshaling jobs through serial stages on local clusters, and parallel stages on Mira. We are running several HEP applications, including Alpgen, Pythia, Sherpa, and Geant4. Event generators, such as Sherpa, typically have a split workload: a small scale integration phase, and a second, more scalable, event-generation phase. To accommodate this workload on Mira we have developed two Python-based Django applications, Balsam and ARGO. Balsam is a generalized scheduler interface which uses a plugin system for interacting with scheduler software such as HTCondor, Cobalt, and TORQUE. ARGO is a workflow manager that submits jobs to instances of Balsam. Through these mechanisms, the serial and parallel tasks within jobs are executed on the appropriate resources. This approach and its integration with the PanDA production system will be discussed.
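
    The plugin arrangement described for Balsam can be pictured as a small common interface with one backend class per scheduler. The sketch below is a hypothetical illustration of that pattern only; the class and method names are invented and do not reproduce Balsam's or ARGO's actual API.

        # Hypothetical plugin-style scheduler interface: a common submit contract with one
        # backend per scheduler. Names are invented; this does not reproduce Balsam's API.
        from abc import ABC, abstractmethod

        class SchedulerPlugin(ABC):
            @abstractmethod
            def submit(self, script_path: str, nodes: int, walltime_minutes: int) -> str:
                """Submit a job script and return the scheduler-assigned job id."""

        class CobaltPlugin(SchedulerPlugin):
            def submit(self, script_path, nodes, walltime_minutes):
                # A real plugin would shell out to the Cobalt qsub with these options.
                return f"cobalt-job-for-{script_path}"

        class HTCondorPlugin(SchedulerPlugin):
            def submit(self, script_path, nodes, walltime_minutes):
                # A real plugin would build and submit an HTCondor submit description.
                return f"condor-job-for-{script_path}"

        def dispatch(plugin: SchedulerPlugin, script: str) -> str:
            return plugin.submit(script, nodes=128, walltime_minutes=60)

        print(dispatch(CobaltPlugin(), "run_sherpa.sh"))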

  5. Interactions between Levels of Instructional Detail and Expertise When Learning with Computer Simulations

    ERIC Educational Resources Information Center

    Hsu, Yuling; Gao, Yuan; Liu, Tzu-Chien; Sweller, John

    2015-01-01

    Based on cognitive load theory, the effect of different levels of instructional detail and expertise in a simulation-based environment on learning about concepts of correlation was investigated. Separate versions of the learning environment were designed for the four experimental conditions which differed only with regard to the levels of written…

  6. PRESTO-II computer code for safety assessment on shallow land disposal of low-level wastes

    SciTech Connect

    Uslu, I.; Fields, D.E.; Yalcintas, M.G.

    1987-01-01

    The PRESTO-II (Prediction of Radiation Effects from Shallow Trench Operations) computer code has been applied to the following sites: Koteyli, Balikesir and Kozakli, Nevsehir in Turkey. This site selection was based partially on the need to consider a variety of hydrologic and climatic situations, and partially on the availability of data. The results obtained for the operational low-level waste disposal site at Barnwell, South Carolina, are presented for comparison. 6 refs., 2 figs., 1 tab.

  7. An automatic variational level set segmentation framework for computer aided dental X-rays analysis in clinical environments.

    PubMed

    Li, Shuo; Fevens, Thomas; Krzyzak, Adam; Li, Song

    2006-03-01

    An automatic variational level set segmentation framework for Computer Aided Dental X-rays Analysis (CADXA) in clinical environments is proposed. Designed for clinical environments, the segmentation contains two stages: a training stage and a segmentation stage. During the training stage, manually chosen representative images are first segmented using hierarchical level set region detection. Window-based feature extraction followed by principal component analysis (PCA) is then applied, and the results are used to train a support vector machine (SVM) classifier. During the segmentation stage, dental X-rays are first classified by the trained SVM. The classifier provides initial contours, close to the correct boundaries, for three coupled level sets driven by a proposed pathologically variational modeling, which greatly accelerates the level set segmentation. Based on the segmentation results and uncertainty maps built from a proposed uncertainty measurement, a computer aided analysis scheme is applied. The experimental results show that the proposed method is able to provide an automatic pathological segmentation which naturally segments those problem areas. Based on the segmentation results, the analysis scheme is able to provide dentists with indications of possible problem areas of bone loss and decay. The experimental results also show that the proposed segmentation framework is able to speed up level set segmentation in clinical environments.
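
    The training stage described above can be pictured as a small supervised pipeline: window-based features extracted from representative images, reduced with PCA and used to fit an SVM. The sketch below shows that pipeline on random arrays standing in for dental X-rays; the window statistics, labels and parameters are illustrative assumptions, and the level set stages are not reproduced.

        # Training-stage classification sketch: window-based features from representative
        # images, PCA, then an SVM. The random arrays, window statistics, labels and
        # parameters are placeholders; the level set stages are not reproduced here.
        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.pipeline import make_pipeline
        from sklearn.svm import SVC

        rng = np.random.default_rng(2)

        def window_features(image, size=8):
            # Mean and standard deviation of non-overlapping size x size windows.
            h, w = image.shape
            feats = []
            for r in range(0, h - size + 1, size):
                for c in range(0, w - size + 1, size):
                    win = image[r:r + size, c:c + size]
                    feats.extend([win.mean(), win.std()])
            return np.asarray(feats)

        images = rng.random((40, 64, 64))            # stand-ins for training X-ray patches
        labels = rng.integers(0, 2, size=40)         # e.g. region class per patch (random)
        X = np.stack([window_features(img) for img in images])

        clf = make_pipeline(PCA(n_components=10), SVC(kernel="rbf"))
        clf.fit(X, labels)
        print("training accuracy:", clf.score(X, labels))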

  8. Directional coarsening of Ni-based superalloys: Computer simulation at the mesoscopic level

    SciTech Connect

    Veron, M.; Brechet, Y.; Louchet, F.

    1996-09-01

    Directional coarsening of nickel based superalloys is modeled taking into account the role of anisotropic misfit relaxation by dislocations generated during creep. The directional coarsening is the response to the gradient in elastic energy induced by this anisotropic relaxation. A 3-dimensional computer simulation has been developed to describe both the morphologies and the kinetics of the phenomenon. Results of the simulation are presented and compared to experimental observations on an AM1 superalloy. Morphology maps describing the expected rafting geometry for other superalloys, as a function of misfit and applied stress, are established and discussed.

  9. Lake Erie water level study. Appendix E. Power. Annex D. Computer programs. Final report

    SciTech Connect

    Not Available

    1981-07-01

    This Annex is part of Appendix E - Power. Appendix E contains the economic evaluation of Lake Erie regulation plans in terms of their effects on the generation of hydroelectric power on the connecting channels of the Great Lakes and on the St. Lawrence River. It also contains a description of the methodology that was developed for the purpose of carrying out this evaluation. The purpose of Annex D is to document the computer programs that were used for the determination of power output at each of the power plants. The documentation also provides sufficient user instructions to permit the economic evaluation results to be readily reproducible.

  10. Unbiased species-level identification of clinical isolates of coagulase-negative Staphylococci: does it change the perspective on Staphylococcus lugdunensis?

    PubMed

    Elamin, Wael F; Ball, David; Millar, Michael

    2015-01-01

    Unbiased species-level identification of coagulase-negative staphylococci (CoNS) using matrix-assisted laser desorption ionization-time of flight mass spectrometry (MALDI-TOF MS) identified Staphylococcus lugdunensis to be a more commonly isolated CoNS in our laboratory than previously observed. It has also highlighted the possibility of vertical transmission.

  11. High-level waste storage tank farms/242-A evaporator Standards/Requirements Identification Document (S/RID), Volume 7. Revision 1

    SciTech Connect

    Burt, D.L.

    1994-04-01

    The High-Level Waste Storage Tank Farms/242-A Evaporator Standards/Requirements Identification Document (S/RID) is contained in multiple volumes. This document (Volume 7) presents the standards and requirements for the following sections: Occupational Safety and Health, and Environmental Protection.

  12. Pi30 DNA probe may be useful for the identification of Prevotella intermedia at the species or strain level.

    PubMed

    Shin, Yong Kook; Jeong, Seung-U; Yoo, So Young; Kim, Mi-Kwang; Kim, Hwa-Sook; Kim, Byung-Ock; Kim, Do Kyung; Hwang, Ho-Keel; Kook, Joong-Ki

    2004-01-01

    Recently, we introduced a new method for the rapid screening of bacterial species- or subspecies-specific DNA probes, named the "inverted dot blot hybridization screening method." This method has subsequently been applied to develop species- or strain-specific DNA probes for Prevotella intermedia and Prevotella nigrescens. In a previous study, the inverted dot blot hybridization data showed that a probe, Pi30, was specific for P. intermedia. In this study, the DNA probe Pi30 was evaluated by Southern blot analysis to determine if it could distinguish P. intermedia from P. nigrescens. The data showed that the probe Pi30 reacted with the genomic DNAs from the reference strains and clinical isolates of both P. intermedia and P. nigrescens, but the size of the signal bands was different. In addition, the probe Pi30 reacted with a 1.4 kbp fragment of the Pst I-digested genomic DNAs of the P. intermedia strains but not with any fragments of the P. nigrescens strains. These results indicate that the probe Pi30 could be useful for the identification of P. intermedia by restriction fragment length polymorphism (RFLP) at the species or strain level.

  13. Identification of nitrate long term trends in Loire-Brittany river district (France) in connection with hydrogeological contexts, agricultural practices and water table level variations

    NASA Astrophysics Data System (ADS)

    Lopez, B.; Baran, N.; Bourgine, B.; Ratheau, D.

    2009-04-01

    The European Union (EU) has adopted directives requiring that Member States take measures to reach a "good" chemical status of water resources by the year 2015 (Water Framework Directive: WFD). Alongside, the Nitrates Directive (91/676/EEC) aims at controlling nitrogen pollution and requires Member States to identify groundwaters that contain more than 50 mg NO3 L-1 or could exceed this limit if preventive measures are not taken. In order to achieve these environmental objectives in the Loire-Brittany river basin, or to justify the non-achievement of these objectives, a large dataset of nitrate concentrations (117,056 raw data points distributed over 7,341 time series) and water table level time series (1,371,655 data points distributed over 511 piezometers) is analysed from 1945 to 2007. The 156,700 sq km Loire-Brittany river basin shows various hydrogeological contexts, ranging from sedimentary aquifers to basement ones, with a few volcanic-rock aquifers. Knowledge of the evolution of agricultural practices is important in such a study and, even if this information is not locally available, agricultural practices have globally changed since the 1991 Nitrates Directive. The detailed dataset available for the Loire-Brittany basin aquifers is used to evaluate tools and to propose efficient methodologies for identifying and quantifying past and current trends in nitrate concentrations. The challenge of this study is therefore to propose a global and integrated approach which allows nitrate trend identification for the whole Loire-Brittany river basin. The temporal piezometric behaviour of each aquifer is defined using geostatistical analysis of the water table level time series. This method requires the calculation of an experimental temporal variogram that can be fitted with a theoretical model valid for a large time range. Identification of contrasted behaviours (short term, annual or pluriannual water table fluctuations) allows a systematic classification of the Loire
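
    The geostatistical classification of piezometric behaviour described above rests on fitting a temporal variogram to each water-table time series. The sketch below computes a simple experimental temporal variogram for a regularly sampled series; it is an illustrative simplification, and the lag range and handling of gaps are assumptions rather than the study's actual processing chain.

```python
import numpy as np

def experimental_variogram(z, max_lag):
    """Experimental temporal variogram gamma(h) = 1/(2*N(h)) * sum (z[t+h]-z[t])^2
    for a regularly sampled water-table level series z (gaps as NaN)."""
    z = np.asarray(z, dtype=float)
    gammas = []
    for h in range(1, max_lag + 1):
        d = z[h:] - z[:-h]
        d = d[~np.isnan(d)]                     # ignore gaps in the record
        gammas.append(0.5 * np.mean(d**2) if d.size else np.nan)
    return np.arange(1, max_lag + 1), np.array(gammas)

# Contrasted behaviours (annual vs. pluriannual fluctuations) show up as
# different ranges and sills when a theoretical model is fitted to gamma(h).
```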

  14. Superimposition-based personal identification using skull computed tomographic images: application to skull with mouth fixed open by residual soft tissue.

    PubMed

    Ishii, Masuko; Saitoh, Hisako; Yasjima, Daisuke; Yohsuk, Makino; Sakuma, Ayaka; Yayama, Kazuhiro; Iwase, Hirotaro

    2013-09-01

    We previously reported that superimposition of 3-dimensional (3D) images reconstructed from computed tomographic images of skeletonized skulls on photographs of the actual skulls afforded a match of skull contours, thereby demonstrating that superimposition of 3D-reconstructed images provides results identical to those obtained with actual skulls. The current superimposition procedure requires a skeletonized skull with the mouth closed and thus is not applicable to personal identification using a skull with residual soft tissue or the mouth fixed open, such as those found in mummified or burned bodies. In this study, we used computed tomography to scan the skulls of a mummified body and an immersed body whose mandibles were fixed open by residual soft tissue, created 3D-reconstructed skull images that were digitally processed by computer software to close the mandible, and superimposed the images on antemortem facial photographs. The results demonstrated morphological consistency between the 3D-reconstructed skull images and the facial photographs, indicating the applicability of the method to personal identification.

  15. A comparative analysis of multi-level computer-assisted decision making systems for traumatic injuries

    PubMed Central

    2009-01-01

    Background This paper focuses on the creation of a predictive computer-assisted decision making system for traumatic injury using machine learning algorithms. Trauma experts must make several difficult decisions based on a large number of patient attributes, usually in a short period of time. The aim is to compare the existing machine learning methods available for medical informatics, and develop reliable, rule-based computer-assisted decision-making systems that provide recommendations for the course of treatment for new patients, based on previously seen cases in trauma databases. Datasets of traumatic brain injury (TBI) patients are used to train and test the decision making algorithm. The work is also applicable to patients with traumatic pelvic injuries. Methods Decision-making rules are created by processing patterns discovered in the datasets, using machine learning techniques. More specifically, CART and C4.5 are used, as they provide grammatical expressions of knowledge extracted by applying logical operations to the available features. The resulting rule sets are tested against other machine learning methods, including AdaBoost and SVM. The rule creation algorithm is applied to multiple datasets, both with and without prior filtering to discover significant variables. This filtering is performed via logistic regression prior to the rule discovery process. Results For survival prediction using all variables, CART outperformed the other machine learning methods. When using only significant variables, neural networks performed best. A reliable rule-base was generated using combined C4.5/CART. The average predictive rule performance was 82% when using all variables, and approximately 84% when using significant variables only. The average performance of the combined C4.5 and CART system using significant variables was 89.7% in predicting the exact outcome (home or rehabilitation), and 93.1% in predicting the ICU length of stay for airlifted TBI patients
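
    The rule-extraction step described above (grammatical decision rules from CART/C4.5-style trees, optionally after a logistic-regression filtering of significant variables) can be sketched with scikit-learn. The data loading, threshold, and tree settings below are hypothetical; C4.5 itself is not in scikit-learn, so a CART-style tree stands in for both.

```python
# Sketch: fit a CART-style tree on trauma-patient attributes and print the
# learned decision rules. Logistic-regression filtering is shown as an
# optional prior step, as in the described pipeline.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.tree import DecisionTreeClassifier, export_text

def build_rule_base(X, y, feature_names, filter_significant=True):
    """X: (n_patients, n_features) array; y: outcome labels."""
    if filter_significant:
        lr = LogisticRegression(max_iter=1000).fit(X, y)
        keep = np.abs(lr.coef_[0]) > 0.1        # crude significance proxy
        X = X[:, keep]
        feature_names = [f for f, k in zip(feature_names, keep) if k]
    tree = DecisionTreeClassifier(max_depth=4, min_samples_leaf=20).fit(X, y)
    return export_text(tree, feature_names=feature_names)   # human-readable rules
```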

  16. Ambient radiation levels in positron emission tomography/computed tomography (PET/CT) imaging center

    PubMed Central

    Santana, Priscila do Carmo; de Oliveira, Paulo Marcio Campos; Mamede, Marcelo; Silveira, Mariana de Castro; Aguiar, Polyanna; Real, Raphaela Vila; da Silva, Teógenes Augusto

    2015-01-01

    Objective To evaluate the level of ambient radiation in a PET/CT center. Materials and Methods Previously selected and calibrated TLD-100H thermoluminescent dosimeters were utilized to measure room radiation levels. During 32 days, the detectors were placed at several strategically selected points inside the PET/CT center and in adjacent buildings. After the exposure period the dosimeters were collected and processed to determine the radiation level. Results At none of the points selected for measurement did the values exceed the radiation dose threshold for a controlled area (5 mSv/year) or a free area (0.5 mSv/year) recommended by the Brazilian regulations. Conclusion In the present study the authors demonstrated that the whole shielding system is appropriate and, consequently, the workers are exposed to doses below the threshold established by Brazilian standards, provided the radiation protection standards are followed. PMID:25798004

  17. Knowledge-based low-level image analysis for computer vision systems

    NASA Technical Reports Server (NTRS)

    Dhawan, Atam P.; Baxi, Himanshu; Ranganath, M. V.

    1988-01-01

    Two algorithms for entry-level image analysis and preliminary segmentation are proposed which are flexible enough to incorporate local properties of the image. The first algorithm involves pyramid-based multiresolution processing and a strategy to define and use interlevel and intralevel link strengths. The second algorithm, which is designed for selected window processing, extracts regions adaptively using local histograms. The preliminary segmentation and a set of features are employed as the input to an efficient rule-based low-level analysis system, resulting in suboptimal meaningful segmentation.

  18. Computational Design of Self-Assembling Protein Nanomaterials with Atomic Level Accuracy

    SciTech Connect

    King, Neil P.; Sheffler, William; Sawaya, Michael R.; Vollmar, Breanna S.; Sumida, John P.; André, Ingemar; Gonen, Tamir; Yeates, Todd O.; Baker, David

    2015-09-17

    We describe a general computational method for designing proteins that self-assemble to a desired symmetric architecture. Protein building blocks are docked together symmetrically to identify complementary packing arrangements, and low-energy protein-protein interfaces are then designed between the building blocks in order to drive self-assembly. We used trimeric protein building blocks to design a 24-subunit, 13-nm diameter complex with octahedral symmetry and a 12-subunit, 11-nm diameter complex with tetrahedral symmetry. The designed proteins assembled to the desired oligomeric states in solution, and the crystal structures of the complexes revealed that the resulting materials closely match the design models. The method can be used to design a wide variety of self-assembling protein nanomaterials.

  19. Exaggerated force production in altered Gz-levels during parabolic flight: the role of computational resources allocation.

    PubMed

    Mierau, Andreas; Girgenrath, Michaela

    2010-02-01

    The purpose of the present experiment was to examine whether the previously observed exaggerated isometric force production in changed-Gz during parabolic flight (Mierau et al. 2008) can be explained by a higher computational demand and, thus, inadequate allocation of the brain's computational resources to the task. Subjects (n = 12) were tested during the micro-Gz, high-Gz and normal-Gz episodes of parabolic flight. They produced isometric forces of different magnitudes and directions, according to visually prescribed vectors, with their right, dominant hand and performed a choice reaction-time task with their left hand. Tasks were performed either separately (single-task) or simultaneously (dual-task). Dual-task interference was present for both tasks, indicating that each task was resource-demanding. However, this interference remained unaffected by the Gz-level. It was concluded that exaggerated force production in changed-Gz is probably not related to inadequate allocation of the brain's computational resources to the force production task. Statement of Relevance: The present study shows that deficient motor performance in changed-Gz environments (both micro-Gz and high-Gz) is not necessarily related to inadequate computational resources allocation, as was suggested in some previous studies. This finding is of great relevance not only for fundamental research, but also for the training and safety of humans operating in changed-Gz environments, such as astronauts and jet pilots.

  20. Computational identification of a metal organic framework for high selectivity membrane-based CO2/CH4 separations: Cu(hfipbb)(H2hfipbb)0.5.

    PubMed

    Watanabe, Taku; Keskin, Seda; Nair, Sankar; Sholl, David S

    2009-12-28

    The identification of membrane materials with high selectivity for CO2/CH4 mixtures could revolutionize this industrially important separation. We predict using computational methods that a metal organic framework (MOF), Cu(hfipbb)(H2hfipbb)0.5, has unprecedented selectivity for membrane-based separation of CO2/CH4 mixtures. Our calculations combine molecular dynamics, transition state theory, and plane wave DFT calculations to assess the importance of framework flexibility in the MOF during molecular diffusion. This combination of methods should also make it possible to identify other MOFs with attractive properties for kinetic separations.

  1. A computational model for exploratory activity of rats with different anxiety levels in elevated plus-maze.

    PubMed

    Costa, Ariadne A; Morato, Silvio; Roque, Antonio C; Tinós, Renato

    2014-10-30

    The elevated plus-maze is an apparatus widely used to study the level of anxiety in rodents. The maze is plus-shaped, with two enclosed arms and two open arms, and elevated 50 cm from the floor. During a test, which usually lasts for 5 min, the animal is initially put at the center and is free to move and explore the entire maze. The level of anxiety is measured by variables such as the percentage of time spent in, and the number of entries into, the enclosed arms. A high percentage of time spent in, and a high number of entries into, the enclosed arms indicate anxiety. Here we propose a computational model of rat behavior in the elevated plus-maze based on an artificial neural network trained by a genetic algorithm. The fitness function of the genetic algorithm is composed of reward (positive) and punishment (negative) terms, which are incremented as the computational agent (virtual rat) moves in the maze. The punishment term is modulated by a parameter that simulates the effects of different drugs. Unlike other computational models, the virtual rat is built independently of previously known experimental data. The exploratory behaviors generated by the model for different simulated pharmacological conditions are in good agreement with data from real rats.
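
    The fitness function described above, reward and punishment terms accumulated as the virtual rat moves with the punishment scaled by a drug-like parameter, can be written compactly. The sketch below is a schematic reading of that description; the location labels, reward/punishment magnitudes, and scaling are hypothetical, not the published model's values.

```python
# Schematic fitness evaluation for a virtual rat in the elevated plus-maze.
# `trajectory` is a sequence of visited locations labelled 'enclosed', 'open',
# or 'center'; `drug_factor` scales the punishment term, loosely mimicking an
# anxiolytic (low factor) versus a control condition (factor = 1).
def fitness(trajectory, drug_factor=1.0, reward_open=1.0, punish_open=2.0):
    reward, punishment = 0.0, 0.0
    for location in trajectory:
        if location == 'open':
            reward += reward_open                    # exploration is rewarded
            punishment += drug_factor * punish_open  # but aversive under anxiety
    return reward - punishment

# A genetic algorithm would evolve neural-network controllers whose generated
# trajectories maximise this fitness under a given drug_factor.
```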

  2. Comparing levels of school performance to science teachers' reports on knowledge/skills, instructional use and student use of computers

    NASA Astrophysics Data System (ADS)

    Kerr, Rebecca

    The purpose of this descriptive quantitative and basic qualitative study was to examine fifth and eighth grade science teachers' responses, perceptions of the role of technology in the classroom, and how they felt that computer applications, tools, and the Internet influence student understanding. The purposeful sample included survey and interview responses from fifth grade and eighth grade general and physical science teachers. Even though they may not be generalizable to other teachers or classrooms due to a low response rate, findings from this study indicated teachers with fewer years of teaching science had a higher level of computer use but less computer access, especially for students, in the classroom. Furthermore, teachers' choice of professional development moderated the relationship between the level of school performance and teachers' knowledge/skills, with the most positive relationship being with workshops that occurred outside of the school. Eighteen interviews revealed that teachers perceived the role of technology in classroom instruction mainly as teacher-centered and supplemental, rather than student-centered activities.

  3. Using RoboCup in University-Level Computer Science Education

    ERIC Educational Resources Information Center

    Sklar, Elizabeth; Parsons, Simon; Stone, Peter

    2004-01-01

    In the education literature, team-based projects have proven to be an effective pedagogical methodology. We have been using "RoboCup" challenges as the basis for class projects in undergraduate and masters level courses. This article discusses several independent efforts in this direction and presents our work in the development of shared…

  4. Tablet Computer Literacy Levels of the Physical Education and Sports Department Students

    ERIC Educational Resources Information Center

    Hergüner, Gülten

    2016-01-01

    Education systems are being affected in parallel by newly emerging hardware and new developments occurring in technology daily. Tablet usage especially is becoming ubiquitous in the teaching-learning processes in recent years. Therefore, using the tablets effectively, managing them and having a high level of tablet literacy play an important role…

  5. Deriving Hounsfield units using grey levels in cone beam computed tomography

    PubMed Central

    Mah, P; Reeves, T E; McDavid, W D

    2010-01-01

    Objectives An in vitro study was performed to investigate the relationship between grey levels in dental cone beam CT (CBCT) and Hounsfield units (HU) in CBCT scanners. Methods A phantom containing 8 different materials of known composition and density was imaged with 11 different dental CBCT scanners and 2 medical CT scanners. The phantom was scanned under three conditions: phantom alone and phantom in a small and large water container. The reconstructed data were exported as Digital Imaging and Communications in Medicine (DICOM) and analysed with On Demand 3D® by Cybermed, Seoul, Korea. The relationship between grey levels and linear attenuation coefficients was investigated. Results It was demonstrated that a linear relationship between the grey levels and the attenuation coefficients of each of the materials exists at some “effective” energy. From the linear regression equation of the reference materials, attenuation coefficients were obtained for each of the materials and CT numbers in HU were derived using the standard equation. Conclusions HU can be derived from the grey levels in dental CBCT scanners using linear attenuation coefficients as an intermediate step. PMID:20729181
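
    The two-step calibration described above can be reproduced numerically: regress the CBCT grey levels of the reference materials against their known linear attenuation coefficients at some effective energy, convert a new grey level to an attenuation coefficient, and apply the standard HU definition. The numbers below are placeholders for illustration, not the study's data.

```python
import numpy as np

def grey_level_to_hu(grey_levels_ref, mu_ref, mu_water, grey_level):
    """grey_levels_ref, mu_ref: measured grey levels and known linear attenuation
    coefficients (cm^-1) of the reference materials. Returns the derived CT
    number in Hounsfield units for a new grey level."""
    slope, intercept = np.polyfit(grey_levels_ref, mu_ref, 1)    # linear fit
    mu = slope * grey_level + intercept                          # grey level -> mu
    return 1000.0 * (mu - mu_water) / mu_water                   # standard HU equation

# Placeholder example (air-like, water-like and bone-like reference points):
hu = grey_level_to_hu([-1000.0, 0.0, 1200.0], [0.0, 0.20, 0.45],
                      mu_water=0.20, grey_level=300.0)
```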

  6. Scaling up ATLAS Event Service to production levels on opportunistic computing platforms

    NASA Astrophysics Data System (ADS)

    Benjamin, D.; Caballero, J.; Ernst, M.; Guan, W.; Hover, J.; Lesny, D.; Maeno, T.; Nilsson, P.; Tsulaia, V.; van Gemmeren, P.; Vaniachine, A.; Wang, F.; Wenaus, T.; ATLAS Collaboration

    2016-10-01

    Continued growth in public cloud and HPC resources is on track to exceed the dedicated resources available for ATLAS on the WLCG. Examples of such platforms are Amazon AWS EC2 Spot Instances, Edison Cray XC30 supercomputer, backfill at Tier 2 and Tier 3 sites, opportunistic resources at the Open Science Grid (OSG), and ATLAS High Level Trigger farm between the data taking periods. Because of specific aspects of opportunistic resources such as preemptive job scheduling and data I/O, their efficient usage requires workflow innovations provided by the ATLAS Event Service. Thanks to the finer granularity of the Event Service data processing workflow, the opportunistic resources are used more efficiently. We report on our progress in scaling opportunistic resource usage to double-digit levels in ATLAS production.

  7. Computing multiple aggregation levels and contextual features for road facilities recognition using mobile laser scanning data

    NASA Astrophysics Data System (ADS)

    Yang, Bisheng; Dong, Zhen; Liu, Yuan; Liang, Fuxun; Wang, Yongjun

    2017-04-01

    Updating the inventory of road infrastructure based on field work is labor intensive, time consuming, and costly. Fortunately, vehicle-based mobile laser scanning (MLS) systems provide an efficient solution to rapidly capture three-dimensional (3D) point clouds of road environments with high flexibility and precision. However, robust recognition of road facilities from huge volumes of 3D point clouds is still a challenging issue because of complicated and incomplete structures, occlusions and varied point densities. Most existing methods utilize point- or object-based features to recognize object candidates, and can only extract limited types of objects with a relatively low recognition rate, especially for incomplete and small objects. To overcome these drawbacks, this paper proposes a semantic labeling framework that combines multiple aggregation levels (point-segment-object) of features and contextual features to recognize road facilities, such as road surfaces, road boundaries, buildings, guardrails, street lamps, traffic signs, roadside trees, power lines, and cars, for highway infrastructure inventory. The proposed method first identifies ground and non-ground points, and extracts road surface facilities from the ground points. Non-ground points are segmented into individual candidate objects based on the proposed multi-rule region growing method. Then, the multiple aggregation levels of features and the contextual features (relative positions, relative directions, and spatial patterns) associated with each candidate object are calculated and fed into an SVM classifier to label the corresponding candidate object. The recognition performance of combining multiple aggregation levels and contextual features was compared with single-level (point, segment, or object) based features using large-scale highway scene point clouds. Comparative studies demonstrated that the proposed semantic labeling framework significantly improves road facilities recognition

  8. Simultaneous segmentation and reconstruction: A level set method approach for limited view computed tomography

    PubMed Central

    Yoon, Sungwon; Pineda, Angel R.; Fahrig, Rebecca

    2010-01-01

    Purpose: An iterative tomographic reconstruction algorithm that simultaneously segments and reconstructs the reconstruction domain is proposed and applied to tomographic reconstructions from a sparse number of projection images. Methods: The proposed algorithm uses a two-phase level set method segmentation in conjunction with an iterative tomographic reconstruction to achieve simultaneous segmentation and reconstruction. The simultaneous segmentation and reconstruction is achieved by alternating between level set function evolutions and per-region intensity value updates. To deal with the limited number of projections, a priori information about the reconstruction is enforced via penalized likelihood function. Specifically, smooth function within each region (piecewise smooth function) and bounded function intensity values for each region are assumed. Such a priori information is formulated into a quadratic objective function with linear bound constraints. The level set function evolutions are achieved by artificially time evolving the level set function in the negative gradient direction; the intensity value updates are achieved by using the gradient projection conjugate gradient algorithm. Results: The proposed simultaneous segmentation and reconstruction results were compared to “conventional” iterative reconstruction (with no segmentation), iterative reconstruction followed by segmentation, and filtered backprojection. Improvements of 6%–13% in the normalized root mean square error were observed when the proposed algorithm was applied to simulated projections of a numerical phantom and to real fan-beam projections of the Catphan phantom, both of which did not satisfy the a priori assumptions. Conclusions: The proposed simultaneous segmentation and reconstruction resulted in improved reconstruction image quality. The algorithm correctly segments the reconstruction space into regions, preserves sharp edges between different regions, and smoothes the noise

  9. Simultaneous segmentation and reconstruction: A level set method approach for limited view computed tomography

    SciTech Connect

    Yoon, Sungwon; Pineda, Angel R.; Fahrig, Rebecca

    2010-05-15

    Purpose: An iterative tomographic reconstruction algorithm that simultaneously segments and reconstructs the reconstruction domain is proposed and applied to tomographic reconstructions from a sparse number of projection images. Methods: The proposed algorithm uses a two-phase level set method segmentation in conjunction with an iterative tomographic reconstruction to achieve simultaneous segmentation and reconstruction. The simultaneous segmentation and reconstruction is achieved by alternating between level set function evolutions and per-region intensity value updates. To deal with the limited number of projections, a priori information about the reconstruction is enforced via penalized likelihood function. Specifically, smooth function within each region (piecewise smooth function) and bounded function intensity values for each region are assumed. Such a priori information is formulated into a quadratic objective function with linear bound constraints. The level set function evolutions are achieved by artificially time evolving the level set function in the negative gradient direction; the intensity value updates are achieved by using the gradient projection conjugate gradient algorithm. Results: The proposed simultaneous segmentation and reconstruction results were compared to "conventional" iterative reconstruction (with no segmentation), iterative reconstruction followed by segmentation, and filtered backprojection. Improvements of 6%-13% in the normalized root mean square error were observed when the proposed algorithm was applied to simulated projections of a numerical phantom and to real fan-beam projections of the Catphan phantom, both of which did not satisfy the a priori assumptions. Conclusions: The proposed simultaneous segmentation and reconstruction resulted in improved reconstruction image quality. The algorithm correctly segments the reconstruction space into regions, preserves sharp edges between different regions, and smoothes the noise

  10. The PVM (Parallel Virtual Machine) system: Supercomputer level concurrent computation on a network of IBM RS/6000 power stations

    SciTech Connect

    Sunderam, V.S. . Dept. of Mathematics and Computer Science); Geist, G.A. )

    1991-01-01

    The PVM (Parallel Virtual Machine) system enables supercomputer level concurrent computations to be performed on interconnected networks of heterogeneous computer systems. Specifically, a network of 13 IBM RS/6000 powerstations has been successfully used to execute production quality runs of superconductor modeling codes at more than 250 Mflops. This work demonstrates the effectiveness of cooperative concurrent processing for high performance applications, and shows that supercomputer level computations may be attained at a fraction of the cost on distributed computing platforms. This paper describes the PVM programming environment and user facilities, as they apply to hardware platforms comprising a network of IBM RS/6000 powerstations. The salient design features of PVM will be discussed, including heterogeneity, scalability, multilanguage support, provisions for fault tolerance, the use of multiprocessors and scalar machines, an interactive graphical front end, and support for profiling, tracing, and visual analysis. The PVM system has been used extensively, and a range of production quality concurrent applications have been successfully executed using PVM on a variety of networked platforms. The paper will mention representative examples, and discuss two in detail. The first is a material sciences problem that was originally developed on a Cray 2. This application code calculates the electronic structure of metallic alloys from first principles and is based on the KKR-CPA algorithm. The second is a molecular dynamics simulation for calculating materials properties. Performance results for both applications on networks of RS/6000 powerstations will be presented, and accompanied by discussions of the other advantages of PVM and its potential as a complement or alternative to conventional supercomputers.

  11. Computational aspects of helicopter trim analysis and damping levels from Floquet theory

    NASA Technical Reports Server (NTRS)

    Gaonkar, Gopal H.; Achar, N. S.

    1992-01-01

    Helicopter trim settings of periodic initial state and control inputs are investigated for convergence of Newton iteration in computing the settings sequentially and in parallel. The trim analysis uses a shooting method and a weak version of two temporal finite element methods with displacement formulation and with mixed formulation of displacements and momenta. These three methods broadly represent two main approaches of trim analysis: adaptation of initial-value and finite element boundary-value codes to periodic boundary conditions, particularly for unstable and marginally stable systems. In each method, both the sequential and in-parallel schemes are used and the resulting nonlinear algebraic equations are solved by damped Newton iteration with an optimally selected damping parameter. The impact of damped Newton iteration, including earlier-observed divergence problems in trim analysis, is demonstrated by the maximum condition number of the Jacobian matrices of the iterative scheme and by virtual elimination of divergence. The advantages of the in-parallel scheme over the conventional sequential scheme are also demonstrated.
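
    The damped Newton iteration with an optimally selected damping parameter, which the study credits with the virtual elimination of divergence, can be sketched generically. The residual function, its Jacobian, and the fixed damping schedule below are illustrative assumptions, not the paper's trim equations; an adaptive choice of the damping factor would play the role of the "optimally selected" parameter.

```python
import numpy as np

def damped_newton(residual, jacobian, x0, damping=0.5, tol=1e-8, max_iter=50):
    """Solve residual(x) = 0 by Newton iteration with a damped update step."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        r = residual(x)
        if np.linalg.norm(r) < tol:
            break
        dx = np.linalg.solve(jacobian(x), -r)   # full Newton step
        x = x + damping * dx                    # damped update
    return x

# In a shooting-type trim analysis the residual would be something like
# residual(x0) = x(T; x0, controls) - x0, i.e. the periodicity defect.
```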

  12. Leveling

    USGS Publications Warehouse

    1966-01-01

    Geodetic leveling by the U.S. Geological Survey provides a framework of accurate elevations for topographic mapping. Elevations are referred to the Sea Level Datum of 1929. Lines of leveling may be run either with automatic or with precise spirit levels, by either the center-wire or the three-wire method. For future use, the surveys are monumented with bench marks, using standard metal tablets or other marking devices. The elevations are adjusted by least squares or other suitable method and are published in lists of control.

  13. A distributed computational search strategy for the identification of diagnostics targets: application to finding aptamer targets for methicillin-resistant staphylococci.

    PubMed

    Flanagan, Keith; Cockell, Simon; Harwood, Colin; Hallinan, Jennifer; Nakjang, Sirintra; Lawry, Beth; Wipat, Anil

    2014-06-30

    The rapid and cost-effective identification of bacterial species is crucial, especially for clinical diagnosis and treatment. Peptide aptamers have been shown to be valuable for use as a component of novel, direct detection methods. These small peptides have a number of advantages over antibodies, including greater specificity and longer shelf life. These properties facilitate their use as the detector components of biosensor devices. However, the identification of suitable aptamer targets for particular groups of organisms is challenging. We present a semi-automated processing pipeline for the identification of candidate aptamer targets from whole bacterial genome sequences. The pipeline can be configured to search for protein sequence fragments that uniquely identify a set of strains of interest. The system is also capable of identifying additional organisms that may be of interest due to their possession of protein fragments in common with the initial set. Through the use of Cloud computing technology and distributed databases, our system is capable of scaling with the rapidly growing genome repositories, and consequently of keeping the resulting data sets up-to-date. The system described is also more generically applicable to the discovery of specific targets for other diagnostic approaches such as DNA probes, PCR primers and antibodies.
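
    The core of the pipeline, finding protein-sequence fragments present in every proteome of the target set and absent from all others, reduces to set operations over fixed-length fragments. The sketch below shows that idea on in-memory data; the fragment length is an arbitrary assumption, and the distributed/Cloud database aspects and mismatch tolerance of the real system are ignored.

```python
def kmers(protein_seq, k=10):
    """All length-k peptide fragments of a protein sequence."""
    return {protein_seq[i:i + k] for i in range(len(protein_seq) - k + 1)}

def candidate_targets(target_proteomes, background_proteomes, k=10):
    """Fragments shared by every target proteome and found in no background one.
    Each proteome is a list of protein sequences (strings)."""
    def proteome_kmers(proteome):
        frags = set()
        for seq in proteome:
            frags |= kmers(seq, k)
        return frags

    shared = set.intersection(*(proteome_kmers(p) for p in target_proteomes))
    for p in background_proteomes:
        shared -= proteome_kmers(p)   # remove anything also seen elsewhere
    return shared
```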

  14. A computational method for the detection of activation/deactivation patterns in biological signals with three levels of electric intensity.

    PubMed

    Guerrero, J A; Macías-Díaz, J E

    2014-02-01

    In the present work, we develop a computational technique to approximate the changes of phase in temporal series associated with electric signals of muscles which perform activities at three different levels of intensity. We suppose that the temporal series are samples of independent, normally distributed random variables with mean equal to zero, and variance equal to one of three possible values, each of them associated to a certain degree of electric intensity. For example, these intensity levels may represent a leg muscle at rest, or active during a light activity (walking), or active during a highly demanding performance (jogging). The model is presented as a maximum likelihood problem involving discrete variables. In turn, this problem is transformed into a continuous one via the introduction of continuous variables with penalization parameters, and it is solved recursively through an iterative numerical method. An a posteriori treatment of the results is used in order to avoid the detection of relatively short periods of silence or activity. We perform simulations with synthetic data in order to assess the validity of our technique. Our computational results show that the method approximates well the occurrence of the change points in synthetic temporal series, even in the presence of autocorrelated sequences. Along the way, we show that a generalization of a computational technique for the change-point detection of electric signals with two phases of activity (Esquivel-Frausto et al., 2010 [40]), may be inapplicable in cases of temporal series with three levels of intensity. In this sense, the method proposed in the present manuscript improves previous efforts of the authors.
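
    A much-simplified version of the three-level problem, assigning each sample of a zero-mean signal to one of three known variances by maximum likelihood and then suppressing implausibly short runs, can be written as below. This is an illustration of the problem setup under assumed variance values, not the authors' penalized, iterative method.

```python
import numpy as np

def classify_three_levels(x, sigmas=(0.2, 1.0, 3.0), min_run=50):
    """Per-sample maximum-likelihood level under N(0, sigma_k^2), followed by
    removal of runs shorter than `min_run` samples (a posteriori smoothing)."""
    x = np.asarray(x, dtype=float)
    sigmas = np.asarray(sigmas, dtype=float)
    # log-likelihood of each sample under each candidate variance (constants dropped)
    ll = -0.5 * (x[:, None] / sigmas[None, :])**2 - np.log(sigmas)[None, :]
    labels = np.argmax(ll, axis=1)
    # merge runs that are too short into the preceding level
    start = 0
    for i in range(1, len(labels) + 1):
        if i == len(labels) or labels[i] != labels[start]:
            if i - start < min_run and start > 0:
                labels[start:i] = labels[start - 1]
            start = i
    return labels
```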

  15. COMGEN: A computer program for generating finite element models of composite materials at the micro level

    NASA Technical Reports Server (NTRS)

    Melis, Matthew E.

    1990-01-01

    COMGEN (Composite Model Generator) is an interactive FORTRAN program which can be used to create a wide variety of finite element models of continuous fiber composite materials at the micro level. It quickly generates batch or session files to be submitted to the finite element pre- and postprocessor PATRAN based on a few simple user inputs such as fiber diameter and percent fiber volume fraction of the composite to be analyzed. In addition, various mesh densities, boundary conditions, and loads can be assigned easily to the models within COMGEN. PATRAN uses a session file to generate finite element models and their associated loads which can then be translated to virtually any finite element analysis code such as NASTRAN or MARC.

  16. Computational Visual Stress Level Analysis of Calcareous Algae Exposed to Sedimentation.

    PubMed

    Osterloff, Jonas; Nilssen, Ingunn; Eide, Ingvar; de Oliveira Figueiredo, Marcia Abreu; de Souza Tâmega, Frederico Tapajós; Nattkemper, Tim W

    2016-01-01

    This paper presents a machine learning based approach for analyses of photos collected from laboratory experiments conducted to assess the potential impact of water-based drill cuttings on deep-water rhodolith-forming calcareous algae. This pilot study uses imaging technology to quantify and monitor the stress levels of the calcareous algae Mesophyllum engelhartii (Foslie) Adey caused by various degrees of light exposure, flow intensity and amount of sediment. A machine learning based algorithm was applied to assess the temporal variation of the calcareous algae size (∼ mass) and color automatically. Measured size and color were correlated to the photosynthetic efficiency (maximum quantum yield of charge separation in photosystem II, ΦPSIImax) and degree of sediment coverage using multivariate regression. The multivariate regression showed correlations between time and calcareous algae sizes, as well as correlations between fluorescence and calcareous algae colors.

  17. Computational Visual Stress Level Analysis of Calcareous Algae Exposed to Sedimentation

    PubMed Central

    Nilssen, Ingunn; Eide, Ingvar; de Oliveira Figueiredo, Marcia Abreu; de Souza Tâmega, Frederico Tapajós; Nattkemper, Tim W.

    2016-01-01

    This paper presents a machine learning based approach for analyses of photos collected from laboratory experiments conducted to assess the potential impact of water-based drill cuttings on deep-water rhodolith-forming calcareous algae. This pilot study uses imaging technology to quantify and monitor the stress levels of the calcareous algae Mesophyllum engelhartii (Foslie) Adey caused by various degrees of light exposure, flow intensity and amount of sediment. A machine learning based algorithm was applied to assess the temporal variation of the calcareous algae size (∼ mass) and color automatically. Measured size and color were correlated to the photosynthetic efficiency (maximum quantum yield of charge separation in photosystem II, ΦPSIImax) and degree of sediment coverage using multivariate regression. The multivariate regression showed correlations between time and calcareous algae sizes, as well as correlations between fluorescence and calcareous algae colors. PMID:27285611

  18. Multi-level security for computer networking - SAC digital network approach

    NASA Astrophysics Data System (ADS)

    Griess, W.; Poutre, D. L.

    The functional features and architecture of the SACDIN (SAC digital network) are detailed. SACDIN is the new data transmission segment for directing SAC's strategic forces. The system has 135 processor nodes at 32 locations and processes, distributes and stores data of any level of security classification. The sophistication of access nodes is dependent on the location. A reference monitor mediates the multilevel security by implementation of the multi-state machine concept, i.e., the Bell-LaPadula model (1973, 1974), which concludes that a secure state can never lead to an insecure state. The monitor is controlled by the internal access control mechanism, which resides in PROM. Details of the access process are provided, including message flow on trusted paths appropriate to the security clearance of the user.
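
    The Bell-LaPadula rules that such a reference monitor enforces can be stated in a few lines: a subject may not read objects above its clearance (the simple security property) and may not write to objects below its level (the *-property). The sketch below is a generic formulation of those two checks with illustrative level names; it is not SACDIN's actual access-control code.

```python
# Generic Bell-LaPadula mediation: classification levels as ordered integers.
LEVELS = {"UNCLASSIFIED": 0, "CONFIDENTIAL": 1, "SECRET": 2, "TOP_SECRET": 3}

def may_read(subject_clearance, object_level):
    # Simple security property: no read up.
    return LEVELS[subject_clearance] >= LEVELS[object_level]

def may_write(subject_clearance, object_level):
    # *-property: no write down.
    return LEVELS[subject_clearance] <= LEVELS[object_level]

# A reference monitor grants an access only if the relevant check passes, so a
# secure state can never transition into an insecure one.
```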

  19. A Well-Mixed Computational Model for Estimating Room Air Levels of Selected Constituents from E-Vapor Product Use

    PubMed Central

    Rostami, Ali A.; Pithawalla, Yezdi B.; Liu, Jianmin; Oldham, Michael J.; Wagner, Karl A.; Frost-Pineda, Kimberly; Sarkar, Mohamadi A.

    2016-01-01

    Concerns have been raised in the literature for the potential of secondhand exposure from e-vapor product (EVP) use. It would be difficult to experimentally determine the impact of various factors on secondhand exposure including, but not limited to, room characteristics (indoor space size, ventilation rate), device specifications (aerosol mass delivery, e-liquid composition), and use behavior (number of users and usage frequency). Therefore, a well-mixed computational model was developed to estimate the indoor levels of constituents from EVPs under a variety of conditions. The model is based on physical and thermodynamic interactions between aerosol, vapor, and air, similar to indoor air models referred to by the Environmental Protection Agency. The model results agree well with measured indoor air levels of nicotine from two sources: smoking machine-generated aerosol and aerosol exhaled from EVP use. Sensitivity analysis indicated that increasing air exchange rate reduces room air level of constituents, as more material is carried away. The effect of the amount of aerosol released into the space due to variability in exhalation was also evaluated. The model can estimate the room air level of constituents as a function of time, which may be used to assess the level of non-user exposure over time. PMID:27537903
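
    A well-mixed single-zone model of the kind referred to here reduces to a mass balance: the room concentration C(t) rises with the emission rate and falls with ventilation, dC/dt = E(t)/V − ACH·C. The forward-Euler sketch below, with hypothetical emission and room parameters, illustrates that balance; it omits the aerosol/vapor thermodynamics of the published model.

```python
import numpy as np

def room_concentration(emission_rate, volume_m3, ach_per_h, t_end_h, dt_h=0.001):
    """Integrate dC/dt = E(t)/V - ACH*C for a well-mixed room.
    emission_rate: function of time (h) returning mass emitted per hour (e.g. mg/h).
    Returns times (h) and concentrations (mass per m^3)."""
    n = int(t_end_h / dt_h)
    t = np.arange(n) * dt_h
    c = np.zeros(n)
    for i in range(1, n):
        dcdt = emission_rate(t[i - 1]) / volume_m3 - ach_per_h * c[i - 1]
        c[i] = c[i - 1] + dt_h * dcdt
    return t, c

# Hypothetical example: a constant 0.5 mg/h release into a 30 m^3 room at 1 air change/h.
times, conc = room_concentration(lambda t: 0.5, volume_m3=30.0, ach_per_h=1.0, t_end_h=8.0)
```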

  20. Optimization of energy level for coronary angiography with dual-energy and dual-source computed tomography.

    PubMed

    Okayama, Satoshi; Seno, Ayako; Soeda, Tsunenari; Takami, Yasuhiro; Kawakami, Rika; Somekawa, Satoshi; Ishigami, Ken-Ichi; Takeda, Yukiji; Kawata, Hiroyuki; Horii, Manabu; Uemura, Shiro; Saito, Yoshihiko

    2012-04-01

    Dual-energy computed tomography (DE-CT) uses polyenergetic X-rays at 100- and 140-kVp tube energy, and generates 120-kVp composite images that are referred to as polyenergetic images (PEIs). Moreover, DE-CT can produce monoenergetic images (MEIs) at any effective energy level. We evaluated whether the image quality of coronary angiography is improved by optimizing the energy levels of DE-CT. We retrospectively evaluated data sets obtained from 24 consecutive patients using cardiac DE-CT at 100- and 140-kVp tube energy with a dual-source scanner. Signal-to-noise ratios (SNRs) were evaluated in the left ascending coronary artery in PEIs, and in MEIs reconstructed at 40, 50, 60, 70, 80, 90, 100, 130, 160 and 190 keV. Energy levels of 100, 120 and 140 kVp generated the highest SNRs in PEIs from 10, 12 and 2 patients, respectively, at 60, 70 and 80 keV in MEIs from 2, 10 and 10 patients, respectively, and at 90 and 100 keV in those from one patient each. Optimization of the energy level for each patient increased the SNR by 16.6% in PEIs (P < 0.0001) and by 18.2% in MEIs (P < 0.05), compared with 120-kVp composite images. The image quality of coronary angiography using DE-CT can be improved by optimizing the energy level for individual patients.
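
    Selecting the per-patient optimal energy amounts to computing the SNR of the vessel region of interest in each reconstructed image set and keeping the maximum. The sketch below assumes the ROI pixel values have already been extracted for each energy level; it is a schematic of the comparison, not the clinical workflow.

```python
import numpy as np

def roi_snr(roi_pixels):
    """Signal-to-noise ratio of a region of interest: mean over standard deviation."""
    roi = np.asarray(roi_pixels, dtype=float)
    return roi.mean() / roi.std()

def best_energy_level(images_by_level):
    """images_by_level: dict mapping an energy label (e.g. '70 keV' or
    '120 kVp composite') to the ROI pixel values from that reconstruction."""
    snrs = {level: roi_snr(roi) for level, roi in images_by_level.items()}
    best = max(snrs, key=snrs.get)
    return best, snrs
```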

  1. Integrating Compact Constraint and Distance Regularization with Level Set for Hepatocellular Carcinoma (HCC) Segmentation on Computed Tomography (CT) Images

    NASA Astrophysics Data System (ADS)

    Gui, Luying; He, Jian; Qiu, Yudong; Yang, Xiaoping

    2017-01-01

    This paper presents a variational level set approach to segmenting lesions with compact shapes on medical images. In this study, we address the problem of segmenting hepatocellular carcinomas, which usually have various shapes, variable intensities, and weak boundaries. An efficient constraint, called the isoperimetric constraint, is applied in this method to describe the compactness of shapes. In addition, in order to ensure precise segmentation and stable movement of the level set, a distance regularization is also implemented in the proposed variational framework. Our method is applied to segment various hepatocellular carcinoma regions on Computed Tomography images with promising results. Comparison results also show that the proposed method is more accurate than the other two approaches.

  2. Group-level self-definition and self-investment: a hierarchical (multicomponent) model of in-group identification.

    PubMed

    Leach, Colin Wayne; van Zomeren, Martijn; Zebel, Sven; Vliek, Michael L W; Pennekamp, Sjoerd F; Doosje, Bertjan; Ouwerkerk, Jaap W; Spears, Russell

    2008-07-01

    Recent research shows individuals' identification with in-groups to be psychologically important and socially consequential. However, there is little agreement about how identification should be conceptualized or measured. On the basis of previous work, the authors identified 5 specific components of in-group identification and offered a hierarchical 2-dimensional model within which these components are organized. Studies 1 and 2 used confirmatory factor analysis to validate the proposed model of self-definition (individual self-stereotyping, in-group homogeneity) and self-investment (solidarity, satisfaction, and centrality) dimensions, across 3 different group identities. Studies 3 and 4 demonstrated the construct validity of the 5 components by examining their (concurrent) correlations with established measures of in-group identification. Studies 5-7 demonstrated the predictive and discriminant validity of the 5 components by examining their (prospective) prediction of individuals' orientation to, and emotions about, real intergroup relations. Together, these studies illustrate the conceptual and empirical value of a hierarchical multicomponent model of in-group identification.

  3. Genegis: Computational Tools for Spatial Analyses of DNA Profiles with Associated Photo-Identification and Telemetry Records of Marine Mammals

    DTIC Science & Technology

    2013-09-30

    Calambokidis and Erin Falcone of Cascadia Research Collective. This approach takes advantage of an existing open-source software framework supporting capture...genotyping and photo-identification. geneGIS in Wildbook: Wildbook (http://www.wildme.org/wildbook/) is an open-source database framework designed to...software. Although both developments of the ArcGIS toolbox and Wildbook are open source, there are two different potential licenses, the Mozilla Public

  4. A new method for the identification of the origin of natural products. Quantitative ²H NMR at the natural abundance level applied to the characterization of anetholes

    SciTech Connect

    Martin, G.J.; Martin, M.L.; Mabon, F.; Bricont, J.

    1982-05-05

    We have shown by high-field ²H NMR spectrometry at the natural abundance level that very spectacular differences exist in the internal distribution of ²H in organic molecules. This phenomenon has been exemplified in particular by the case of ethyl and vinyl derivatives. In this study of various anethole samples, we show the potential of this new method as a very powerful tool for the characterization and identification of natural products from different origins.

  5. Comparison of species-level identification and antifungal susceptibility results from diagnostic and reference laboratories for bloodstream Candida surveillance isolates, South Africa, 2009-2010.

    PubMed

    Naicker, Serisha D; Govender, Nevashan; Patel, Jaymati; Zietsman, Inge L; Wadula, Jeannette; Coovadia, Yacoob; Kularatne, Ranmini; Seetharam, Sharona; Govender, Nelesh P

    2016-11-01

    From February 2009 through August 2010, we compared species-level identification of bloodstream Candida isolates and susceptibility to fluconazole, voriconazole, and caspofungin between diagnostic and reference South African laboratories during national surveillance for candidemia. Diagnostic laboratories identified isolates to genus/species level and performed antifungal susceptibility testing, as indicated. At a reference laboratory, viable Candida isolates were identified to species-level using automated systems, biochemical tests, or DNA sequencing; broth dilution susceptibility testing was performed. Categorical agreement (CA) was calculated for susceptibility results of isolates with concordant species identification. Overall, 2172 incident cases were detected, 773 (36%) by surveillance audit. The Vitek 2 YST system (bioMérieux Inc, Marcy l'Etoile, France) was used for identification (360/863, 42%) and susceptibility testing (198/473, 42%) of a large proportion of isolates. For the five most common species (n = 1181), species-level identification was identical in the majority of cases (Candida albicans: 98% (507/517); Candida parapsilosis: 92% (450/488); Candida glabrata: 89% (89/100); Candida tropicalis: 91% (49/54), and Candida krusei: 86% (19/22)). However, diagnostic laboratories were significantly less likely to correctly identify Candida species other than C. albicans versus C. albicans (607/664, 91% vs. 507/517, 98%; P < .001). Susceptibility data were compared for isolates belonging to the five most common species and fluconazole, voriconazole, and caspofungin in 860, 580, and 99 cases, respectively. Diagnostic laboratories significantly under-reported fluconazole resistance in C. parapsilosis (225/393, 57% vs. 239/393, 61%; P < .001) but over-reported fluconazole non-susceptibility in C. albicans (36/362, 10% vs. 3/362, 0.8%; P < .001). Diagnostic laboratories were less likely to correctly identify Candida species other than C. albicans, under

  6. Identification of light absorbing oligomers from glyoxal and methylglyoxal aqueous processing: a comparative study at the molecular level

    NASA Astrophysics Data System (ADS)

    Finessi, Emanuela; Hamilton, Jacqueline; Rickard, Andrew; Baeza-Romero, Maria; Healy, Robert; Peppe, Salvatore; Adams, Tom; Daniels, Mark; Ball, Stephen; Goodall, Iain; Monks, Paul; Borras, Esther; Munoz, Amalia

    2014-05-01

    Numerous studies point to the reactive uptake of gaseous low molecular weight carbonyls onto atmospheric waters (clouds/fog droplets and wet aerosols) as an important SOA formation route not yet included in current models. However, the evaluation of these processes is challenging because water provides a medium for a complex array of reactions to take place such as self-oligomerization, aldol condensation and Maillard-type browning reactions in the presence of ammonium salts. In addition to adding to SOA mass, aqueous chemistry products have been shown to include light absorbing, surface-active and high molecular weight oligomeric species, and can therefore affect climatically relevant aerosol properties such as light absorption and hygroscopicity. Glyoxal (GLY) and methylglyoxal (MGLY) are the gaseous carbonyls that have perhaps received the most attention to date owing to their ubiquity, abundance and reactivity in water, with the majority of studies focussing on bulk physical properties. However, very little is known at the molecular level, in particular for MGLY, and the relative potential of these species as aqueous SOA precursors in ambient air is still unclear. We have conducted experiments with both laboratory solutions and chamber-generated particles to simulate the aqueous processing of GLY and MGLY with ammonium sulphate (AS) under typical atmospheric conditions and investigated their respective aging products. Both high performance liquid chromatography coupled with UV-Vis detection and ion trap mass spectrometry (HPLC-DAD-MSn) and high resolution mass spectrometry (FTICRMS) have been used for molecular identification purposes. Comprehensive gas chromatography with nitrogen chemiluminescence detection (GCxGC-NCD) has been applied for the first time to these systems, revealing a surprisingly high number of nitrogen-containing organics (ONs), with a large extent of polarities. GCxGC-NCD proved to be a valuable tool to determine overall amount and rates of

  7. A Full Dynamic Compound Inverse Method for output-only element-level system identification and input estimation from earthquake response signals

    NASA Astrophysics Data System (ADS)

    Pioldi, Fabio; Rizzi, Egidio

    2016-08-01

    This paper proposes a new output-only element-level system identification and input estimation technique, towards the simultaneous identification of modal parameters, input excitation time history and structural features at the element level by adopting earthquake-induced structural response signals. The method, named Full Dynamic Compound Inverse Method (FDCIM), relaxes strong assumptions of earlier element-level techniques by working with a two-stage iterative algorithm. Jointly, a Statistical Average technique, a modification process and a parameter projection strategy are adopted at each stage to achieve stronger convergence for the identified estimates. The proposed method works in a deterministic way and is completely developed in State-Space form. Further, it does not require continuous- to discrete-time transformations and does not depend on initialization conditions. Synthetic earthquake-induced response signals from different shear-type buildings are generated to validate the implemented procedure, also with noise-corrupted cases. The achieved results provide a necessary condition to demonstrate the effectiveness of the proposed identification method.

  8. A peptide identification-free, genome sequence-independent shotgun proteomics workflow for strain-level bacterial differentiation.

    PubMed

    Shao, Wenguang; Zhang, Min; Lam, Henry; Lau, Stanley C K

    2015-09-23

    Shotgun proteomics is an emerging tool for bacterial identification and differentiation. However, the identification of the mass spectra of peptides to genome-derived peptide sequences remains a key issue that limits the use of shotgun proteomics to bacteria with genome sequences available. In this proof-of-concept study, we report a novel bacterial fingerprinting method that enjoys the resolving power and accuracy of mass spectrometry without the burden of peptide identification (i.e. genome sequence-independent). This method uses a similarity-clustering algorithm to search for mass spectra that are derived from the same peptide and merge them into a unique consensus spectrum as the basis to generate proteomic fingerprints of bacterial isolates. In comparison to a traditional peptide identification-based shotgun proteomics workflow and a PCR-based DNA fingerprinting method targeting the repetitive extragenic palindromes elements in bacterial genomes, the novel method generated fingerprints that were richer in information and more discriminative in differentiating E. coli isolates by their animal sources. The novel method is readily deployable to any cultivable bacteria, and may be used for several fields of study such as environmental microbiology, applied microbiology, and clinical microbiology.
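
    The identification-free core, grouping spectra that are likely to come from the same peptide by spectral similarity and merging each group into a consensus spectrum, can be sketched with binned intensity vectors and cosine similarity. The greedy clustering and threshold below are simplifying assumptions, not the published algorithm.

```python
import numpy as np

def cosine(a, b):
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

def consensus_spectra(binned_spectra, threshold=0.7):
    """Greedy similarity clustering of binned MS/MS spectra (equal-length
    intensity vectors). Each cluster is merged into an averaged consensus
    spectrum; the set of consensus spectra forms a proteomic fingerprint."""
    clusters = []                                  # list of lists of spectra
    for s in binned_spectra:
        s = np.asarray(s, dtype=float)
        for cl in clusters:
            if cosine(s, np.mean(cl, axis=0)) >= threshold:
                cl.append(s)                       # join the best-matching cluster
                break
        else:
            clusters.append([s])                   # start a new cluster
    return [np.mean(cl, axis=0) for cl in clusters]
```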

  9. A peptide identification-free, genome sequence-independent shotgun proteomics workflow for strain-level bacterial differentiation

    PubMed Central

    Shao, Wenguang; Zhang, Min; Lam, Henry; Lau, Stanley C. K.

    2015-01-01

    Shotgun proteomics is an emerging tool for bacterial identification and differentiation. However, the identification of the mass spectra of peptides to genome-derived peptide sequences remains a key issue that limits the use of shotgun proteomics to bacteria with genome sequences available. In this proof-of-concept study, we report a novel bacterial fingerprinting method that enjoys the resolving power and accuracy of mass spectrometry without the burden of peptide identification (i.e. genome sequence-independent). This method uses a similarity-clustering algorithm to search for mass spectra that are derived from the same peptide and merge them into a unique consensus spectrum as the basis to generate proteomic fingerprints of bacterial isolates. In comparison to a traditional peptide identification-based shotgun proteomics workflow and a PCR-based DNA fingerprinting method targeting the repetitive extragenic palindromes elements in bacterial genomes, the novel method generated fingerprints that were richer in information and more discriminative in differentiating E. coli isolates by their animal sources. The novel method is readily deployable to any cultivable bacteria, and may be used for several fields of study such as environmental microbiology, applied microbiology, and clinical microbiology. PMID:26395646

  10. Disturbed Interplay between Mid- and High-Level Vision in ASD? Evidence from a Contour Identification Task with Everyday Objects

    ERIC Educational Resources Information Center

    Evers, Kris; Panis, Sven; Torfs, Katrien; Steyaert, Jean; Noens, Ilse; Wagemans, Johan

    2014-01-01

    Atypical visual processing in children with autism spectrum disorder (ASD) does not seem to reside in an isolated processing component, such as global or local processing. We therefore developed a paradigm that requires the interaction between different processes--an identification task with Gaborized object outlines--and applied this to two age…

  11. Computer Music

    NASA Astrophysics Data System (ADS)

    Cook, Perry R.

    This chapter covers algorithms, technologies, computer languages, and systems for computer music. Computer music involves the application of computers and other digital/electronic technologies to music composition, performance, theory, history, and the study of perception. The field combines digital signal processing, computational algorithms, computer languages, hardware and software systems, acoustics, psychoacoustics (low-level perception of sounds from the raw acoustic signal), and music cognition (higher-level perception of musical style, form, emotion, etc.).

  12. Identification of Novel Activators of Constitutive Androstane Receptor from FDA-approved Drugs by Integrated Computational and Biological Approaches

    PubMed Central

    Lynch, Caitlin; Pan, Yongmei; Li, Linhao; Ferguson, Stephen S.; Xia, Menghang; Swaan, Peter W.; Wang, Hongbing

    2012-01-01

    Purpose The constitutive androstane receptor (CAR, NR1I3) is a xenobiotic sensor governing the transcription of numerous hepatic genes associated with drug metabolism and clearance. Recent evidence suggests that CAR also modulates energy homeostasis and cancer development. Thus, identification of novel human (h) CAR activators is of both clinical importance and scientific interest. Methods Docking and ligand-based structure-activity models were used for virtual screening of a database containing over 2000 FDA-approved drugs. Identified lead compounds were evaluated in cell-based reporter assays to determine hCAR activation. Potential activators were further tested in human primary hepatocytes (HPHs) for the expression of the prototypical hCAR target gene CYP2B6. Results Nineteen lead compounds with optimal modeling parameters were selected for biological evaluation. Seven of the 19 leads exhibited moderate to potent activation of hCAR. Five out of the seven compounds translocated hCAR from the cytoplasm to the nucleus of HPHs in a concentration-dependent manner. These compounds also induce the expression of CYP2B6 in HPHs with rank-order of efficacies closely resembling that of hCAR activation. Conclusion These results indicate that our strategically integrated approaches are effective in the identification of novel hCAR modulators, which may function as valuable research tools or potential therapeutic molecules. PMID:23090669

  13. Evaluation of a wireless wearable tongue-computer interface by individuals with high-level spinal cord injuries

    NASA Astrophysics Data System (ADS)

    Huo, Xueliang; Ghovanloo, Maysam

    2010-04-01

    The tongue drive system (TDS) is an unobtrusive, minimally invasive, wearable and wireless tongue-computer interface (TCI), which can infer its users' intentions, represented in their volitional tongue movements, by detecting the position of a small permanent magnetic tracer attached to the users' tongues. Any specific tongue movements can be translated into user-defined commands and used to access and control various devices in the users' environments. The latest external TDS (eTDS) prototype is built on a wireless headphone and interfaced to a laptop PC and a powered wheelchair. Using customized sensor signal processing algorithms and a graphical user interface, the eTDS performance was evaluated by 13 naive subjects with high-level spinal cord injuries (C2-C5) at the Shepherd Center in Atlanta, GA. Results of the human trial show that an average information transfer rate of 95 bits/min was achieved for computer access with 82% accuracy. This information transfer rate is about twice that of EEG-based BCIs tested on human subjects. It was also demonstrated that the subjects had immediate and full control over the powered wheelchair to the extent that they were able to perform complex wheelchair navigation tasks, such as driving through an obstacle course.

  14. Causal Attributions of Success and Failure Made by Undergraduate Students in an Introductory-Level Computer Programming Course

    ERIC Educational Resources Information Center

    Hawi, N.

    2010-01-01

    The purpose of this research is to identify the causal attributions of business computing students in an introductory computer programming course, in the computer science department at Notre Dame University, Louaize. Forty-five male and female undergraduates who completed the computer programming course that extended for a 13-week semester…

  15. Computational identification and characterization of conserved miRNAs and their target genes in garlic (Allium sativum L.) expressed sequence tags.

    PubMed

    Panda, Debashis; Dehury, Budheswar; Sahu, Jagajjit; Barooah, Madhumita; Sen, Priyabrata; Modi, Mahendra K

    2014-03-10

    Endogenous small non-coding microRNAs (miRNAs), ~21 to 24 nucleotides in length, play a pivotal role in gene expression in plants and animals by silencing genes, either by degrading homologous mRNAs or by blocking their translation. Although various high-throughput, time-consuming and expensive techniques such as forward genetics and direct cloning are employed to detect miRNAs in plants, comparative genomics complemented with novel bioinformatic tools paves the way for efficient and cost-effective identification of miRNAs through homologous sequence searches against previously known miRNAs. In this study, an attempt was made to identify and characterize conserved miRNAs in garlic expressed sequence tags (ESTs) through computational means. For identification of novel miRNAs in garlic, a total of 3227 known mature miRNAs of the plant kingdom Viridiplantae were searched for homology against 21,637 EST sequences, resulting in the identification of 6 potential miRNA candidates belonging to 6 different miRNA families. The psRNATarget server predicted 33 potential target genes and their probable functions for the six identified miRNA families in garlic. Most of the garlic miRNA target genes appear to encode transcription factors as well as genes involved in stress response, metabolism, plant growth and development. The results of the present study will shed more light on the molecular mechanisms of miRNAs in garlic, which may aid in the development of novel and precise techniques for understanding post-transcriptional gene silencing mechanisms in response to stress.
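
    The homology step above can be sketched as a mismatch-tolerant scan of known mature miRNAs against EST sequences. This is only a rough stand-in for the actual pipeline, which typically uses BLAST plus secondary-structure and minimum-free-energy checks on candidate precursors; the mismatch cutoff, function names, and use of Biopython are assumptions.

    ```python
    from Bio import SeqIO  # Biopython, assumed available

    def count_mismatches(a, b):
        return sum(1 for x, y in zip(a, b) if x != y)

    def scan_ests_for_mirnas(known_mirnas, est_fasta, max_mismatch=3):
        """Slide each known mature miRNA along every EST (both strands) and
        report hits with at most `max_mismatch` mismatches."""
        hits = []
        for rec in SeqIO.parse(est_fasta, "fasta"):
            strands = {
                "+": str(rec.seq).upper().replace("U", "T"),
                "-": str(rec.seq.reverse_complement()).upper().replace("U", "T"),
            }
            for name, mirna in known_mirnas.items():
                query = mirna.upper().replace("U", "T")
                for strand, est in strands.items():
                    for i in range(len(est) - len(query) + 1):
                        if count_mismatches(query, est[i:i + len(query)]) <= max_mismatch:
                            hits.append((rec.id, name, strand, i))
        return hits
    ```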

  16. Computational identification of potential multitarget treatments for ameliorating the adverse effects of amyloid-β on synaptic plasticity

    PubMed Central

    Anastasio, Thomas J.

    2014-01-01

    The leading hypothesis on Alzheimer Disease (AD) is that it is caused by buildup of the peptide amyloid-β (Aβ), which initially causes dysregulation of synaptic plasticity and eventually causes destruction of synapses and neurons. Pharmacological efforts to limit Aβ buildup have proven ineffective, and this raises the twin challenges of understanding the adverse effects of Aβ on synapses and of suggesting pharmacological means to prevent them. The purpose of this paper is to initiate a computational approach to understanding the dysregulation by Aβ of synaptic plasticity and to offer suggestions whereby combinations of various chemical compounds could be arrayed against it. This data-driven approach confronts the complexity of synaptic plasticity by representing findings from the literature in a coarse-grained manner, and focuses on understanding the aggregate behavior of many molecular interactions. The same set of interactions is modeled by two different computer programs, each written using a different programming modality: one imperative, the other declarative. Both programs compute the same results over an extensive test battery, providing an essential crosscheck. Then the imperative program is used for the computationally intensive purpose of determining the effects on the model of every combination of ten different compounds, while the declarative program is used to analyze model behavior using temporal logic. Together these two model implementations offer new insights into the mechanisms by which Aβ dysregulates synaptic plasticity and suggest many drug combinations that potentially may reduce or prevent it. PMID:24847263

  17. Developing an Educational Computer Game for Migratory Bird Identification Based on a Two-Tier Test Approach

    ERIC Educational Resources Information Center

    Chu, Hui-Chun; Chang, Shao-Chen

    2014-01-01

    Although educational computer games have been recognized as being a promising approach, previous studies have indicated that, without supportive models, students might only show temporary interest during the game-based learning process, and their learning performance is often not as good as expected. Therefore, in this paper, a two-tier test…

  18. Identification of an intra-cranial intra-axial porcupine quill foreign body with computed tomography in a canine patient.

    PubMed

    Sauvé, Christopher P; Sereda, Nikki C; Sereda, Colin W

    2012-02-01

    An intra-cranial intra-axial foreign body was diagnosed in a golden retriever dog through the use of computed tomography (CT). Confirmed by necropsy, a porcupine quill had migrated to the patient's left cerebral hemisphere, likely through the oval foramen. This case study demonstrates the efficacy of CT in visualizing a quill in the canine brain.

  19. Identification of transformation products of rosuvastatin in water during ZnO photocatalytic degradation through the use of associated LC-QTOF-MS to computational chemistry.

    PubMed

    Segalin, Jéferson; Sirtori, Carla; Jank, Louíse; Lima, Martha F S; Livotto, Paolo R; Machado, Tiele C; Lansarin, Marla A; Pizzolato, Tânia M

    2015-12-15

    Rosuvastatin (RST), a synthetic statin, is a 3-hydroxy-3-methylglutaryl coenzyme A reductase inhibitor with a number of pleiotropic properties, such as anti-inflammation, antioxidation and cardiac remodelling attenuation. According to IMS Health, rosuvastatin was the third best-selling drug in the United States in 2012. RST was recently found in European effluent samples at a detection frequency of 36%. In this study, we evaluate the identification of the major transformation products (TPs) of RST generated during heterogeneous photocatalysis with ZnO. The degradation of the parent molecule and the identification of the main TPs were studied in demineralised water. The TPs were monitored and identified by liquid chromatography-quadrupole time-of-flight mass spectrometry (LC-QTOF-MS/MS). Ten TPs were tentatively identified, some of them originating from hydroxylation of the aromatic ring during the initial stages of the process. Structural elucidation of some of the most abundant or persistent TPs was supported by computational analysis, which demonstrated that this approach can be used as a tool to help elucidate the structures of unknown molecules. The analysis of parameters obtained from ab initio calculations for different isomers revealed the most stable structures and, consequently, those most likely to be found.

  20. High level waste storage tank farms/242-A evaporator standards/requirements identification document phase 1 assessment report

    SciTech Connect

    Biebesheimer, E., Westinghouse Hanford Co.

    1996-09-30

    This document, the Standards/Requirements Identification Document (S/RID) Phase I Assessment Report for the subject facility, represents the results of an Administrative Assessment to determine whether S/RID requirements are fully addressed by existing policies, plans or procedures. It contains the compliance status, remedial actions, and an implementing-manuals report linking each S/RID element to its requirement source and to the corresponding implementing manual and section.

  1. Utility of indels for species-level identification of a biologically complex plant group: a study with intergenic spacer in Citrus.

    PubMed

    Mahadani, Pradosh; Ghosh, Sankar Kumar

    2014-11-01

    The Consortium for the Barcode of Life plant working group proposed using defined portions of the plastid genes rbcL and matK, either singly or in combination, as the standard DNA barcode for plants. However, DNA barcode-based identification of biologically complex plant groups is always a challenging task due to the occurrence of natural hybridization. Here, we examined the use of indel polymorphism in trnH-psbA and trnL-trnF sequences for rapid species identification of Citrus. DNA was isolated from young leaves of selected Citrus species, and the matK gene (~800 bp) and trnH-psbA spacer (~450 bp) of chloroplast DNA were amplified for species-level identification. The sequences of the Citrus taxa were aligned using the ClustalX program, and a few obvious misalignments were corrected manually using the similarity criterion. We identified a 54 bp inverted repeat or palindrome sequence (positions 27-80) and 6 multi-residue indel-coding regions. Large inverted repeats in cpDNA provided authentication at higher taxonomic levels. These diagnostic indel markers from trnH-psbA successfully identified different species (5 out of 7) within the studied Citrus, the exceptions being Citrus limon and Citrus medica. These two closely related species were distinguished through a 6 bp deletion in trnL-trnF. This study demonstrated that the indel polymorphism-based approach readily characterizes Citrus species, and the same may be applied to other complex groups. Likewise, other indels occurring in intergenic spacers of chloroplast regions may be tested for rapid identification of other secondary Citrus species.
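
    As an illustration of how such diagnostic indels can be pulled out of a multiple alignment programmatically, the sketch below records which sequences share a gap at each alignment column; a gap pattern restricted to one species or species pair then behaves like the 6 bp trnL-trnF deletion mentioned above. The function name and the dictionary-of-aligned-strings input are hypothetical, not part of the study.

    ```python
    def indel_profile(aligned_seqs):
        """aligned_seqs: dict mapping taxon name -> aligned sequence (equal
        lengths, '-' for gaps). Returns a mapping from each shared gap pattern
        (frozenset of taxa) to the alignment columns where it occurs."""
        length = len(next(iter(aligned_seqs.values())))
        profile = {}
        for col in range(length):
            gapped = frozenset(name for name, seq in aligned_seqs.items()
                               if seq[col] == "-")
            # Ignore columns where no taxon, or every taxon, has a gap.
            if gapped and len(gapped) < len(aligned_seqs):
                profile.setdefault(gapped, []).append(col)
        return profile
    ```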

  2. A computational approach to Quaternary lake-level reconstruction applied in the central Rocky Mountains, Wyoming, USA

    NASA Astrophysics Data System (ADS)

    Pribyl, Paul; Shuman, Bryan N.

    2014-07-01

    Sediment-based reconstructions of late-Quaternary lake levels provide direct evidence of hydrologic responses to climate change, but many studies only provide approximate lake-elevation curves. Here, we demonstrate a new method for producing quantitative time series of lake elevation based on the facies and elevations of multiple cores collected from a lake's margin. The approach determines the facies represented in each core using diagnostic data, such as sand content, and then compares the results across cores to determine the elevation of the littoral zone over time. By applying the approach computationally, decisions are made systematically and iteratively using different facies classification schemes to evaluate the associated uncertainty. After evaluating our assumptions using ground-penetrating radar (GPR), we quantify past lake-elevation changes, precipitation minus evapotranspiration (ΔP-ET), and uncertainty in both at Lake of the Woods and Little Windy Hill Pond, Wyoming. The well-correlated (r = 0.802 ± 0.002) reconstructions indicate that water levels at both lakes fell at >11,300, 8000-5500, and 4700-1600 cal yr BP when ΔP-ET decreased to -50 to -250 mm/yr. Differences between the reconstructions are typically small (10 ± 24 mm/yr since 7000 cal yr BP), and the similarity indicates that our reconstruction method can produce statistically comparable paleohydrologic datasets across networks of sites.

  3. Computational and informatics strategies for identification of specific protein interaction partners in affinity purification mass spectrometry experiments

    PubMed Central

    Nesvizhskii, Alexey I.

    2013-01-01

    Analysis of protein interaction networks and protein complexes using affinity purification and mass spectrometry (AP/MS) is among the most commonly used and successful applications of proteomics technologies. One of the foremost challenges of AP/MS data is the large number of false-positive protein interactions present in unfiltered datasets. Here we review computational and informatics strategies for detecting specific protein interaction partners in AP/MS experiments, with a focus on incomplete (as opposed to genome-wide) interactome mapping studies. These strategies range from standard statistical approaches, to empirical scoring schemes optimized for a particular type of data, to advanced computational frameworks. The common denominator among these methods is the use of label-free quantitative information such as spectral counts or integrated peptide intensities that can be extracted from AP/MS data. We also discuss related issues such as combining multiple biological or technical replicates, and dealing with data generated using different tagging strategies. Computational approaches for benchmarking of scoring methods are discussed, and the need for generation of reference AP/MS datasets is highlighted. Finally, we discuss the possibility of more extended modeling of experimental AP/MS data, including integration with external information such as protein interaction predictions based on functional genomics data. PMID:22611043
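
    To make the label-free scoring idea concrete, here is a deliberately naive specificity score for a single prey protein based on spectral counts in bait versus negative-control purifications. It is only loosely in the spirit of published scoring schemes such as SAINT or CompPASS (not an implementation of either), and the pseudocount is an arbitrary assumption.

    ```python
    def fold_change_score(bait_counts, control_counts, pseudocount=1.0):
        """Naive label-free specificity score for one prey protein: mean spectral
        count across bait purifications divided by the mean across negative
        controls, with a pseudocount to avoid division by zero."""
        bait_avg = sum(bait_counts) / len(bait_counts)
        ctrl_avg = sum(control_counts) / len(control_counts)
        return (bait_avg + pseudocount) / (ctrl_avg + pseudocount)

    # Example: a prey seen 12, 9 and 15 times with the bait but 0-1 times in
    # controls scores far above background, flagging it as a candidate partner.
    print(fold_change_score([12, 9, 15], [0, 1, 0]))
    ```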

  4. Identification of Potent Chemotypes Targeting Leishmania major Using a High-Throughput, Low-Stringency, Computationally Enhanced, Small Molecule Screen

    PubMed Central

    Sharlow, Elizabeth R.; Close, David; Shun, Tongying; Leimgruber, Stephanie; Reed, Robyn; Mustata, Gabriela; Wipf, Peter; Johnson, Jacob; O'Neil, Michael; Grögl, Max; Magill, Alan J.; Lazo, John S.

    2009-01-01

    Patients with clinical manifestations of leishmaniasis, including cutaneous leishmaniasis, have limited treatment options, and existing therapies frequently have significant untoward liabilities. Rapid expansion in the diversity of available cutaneous leishmanicidal chemotypes is the initial step in finding alternative efficacious treatments. To this end, we combined a low-stringency Leishmania major promastigote growth inhibition assay with a structural computational filtering algorithm. After a rigorous assay validation process, we interrogated ∼200,000 unique compounds for L. major promastigote growth inhibition. Using iterative computational filtering of the compounds exhibiting >50% inhibition, we identified 553 structural clusters and 640 compound singletons. Secondary confirmation assays yielded 93 compounds with EC50s ≤ 1 µM, with none of the identified chemotypes being structurally similar to known leishmanicidals and most having favorable in silico predicted bioavailability characteristics. The leishmanicidal activity of a representative subset of 15 chemotypes was confirmed in two independent assay formats, and L. major parasite specificity was demonstrated by assaying against a panel of human cell lines. Thirteen chemotypes inhibited the growth of a L. major axenic amastigote-like population. Murine in vivo efficacy studies using one of the new chemotypes document inhibition of footpad lesion development. These results authenticate that low stringency, large-scale compound screening combined with computational structure filtering can rapidly expand the chemotypes targeting in vitro and in vivo Leishmania growth and viability. PMID:19888337

  5. Computer Music

    NASA Astrophysics Data System (ADS)

    Cook, Perry

    This chapter covers algorithms, technologies, computer languages, and systems for computer music. Computer music involves the application of computers and other digital/electronic technologies to music composition, performance, theory, history, and perception. The field combines digital signal processing, computational algorithms, computer languages, hardware and software systems, acoustics, psychoacoustics (low-level perception of sounds from the raw acoustic signal), and music cognition (higher-level perception of musical style, form, emotion, etc.). Although most people would think that analog synthesizers and electronic music substantially predate the use of computers in music, many experiments and complete computer music systems were being constructed and used as early as the 1950s.

  6. The use of isodose levels to interpret radiation induced lung injury: a quantitative analysis of computed tomography changes

    PubMed Central

    Knoll, Miriam A.; Sheu, Ren Dih; Knoll, Abraham D.; Kerns, Sarah L.; Lo, Yeh-Chi; Rosenzweig, Kenneth E.

    2016-01-01

    Background Patients treated with stereotactic body radiation therapy (SBRT) for lung cancer are often found to have radiation-induced lung injury (RILI) surrounding the treated tumor. We investigated whether treatment isodose levels could predict RILI. Methods Thirty-seven lung lesions in 32 patients were treated with SBRT and received post-treatment follow up (FU) computed tomography (CT). Each CT was fused with the original simulation CT and treatment isodose levels were overlaid. The RILI surrounding the treated lesion was contoured. The RILI extension index [fibrosis extension index (FEI)] was defined as the volume of RILI extending outside a given isodose level relative to the total volume of RILI and was expressed as a percentage. Results Univariate analysis revealed that the planning target volume (PTV) was positively correlated with RILI volume at FU: correlation coefficient (CC) =0.628 and P<0.0001 at 1st FU; CC =0.401 and P=0.021 at 2nd FU; CC =0.265 and P=0.306 at 3rd FU. FEI-40 Gy at 1st FU was significantly positively correlated with FEI-40 Gy at subsequent FUs (CC =0.689 and P=6.5×10^-5 comparing 1st and 2nd FU; CC =0.901 and P=0.020 comparing 2nd and 3rd FU). Ninety-six percent of the RILI was found within the 20 Gy isodose line. Sixty-five percent of patients were found to have a decrease in RILI on the 2nd FU CT. Conclusions We have shown that RILI evolves over time and that the 1st FU CT correlates well with subsequent CTs. Ninety-six percent of the RILI occurred within the 20 Gy isodose lines, which may prove beneficial to radiologists attempting to distinguish recurrence vs. RILI. PMID:26981453
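
    The fibrosis extension index defined above is a simple ratio and can be computed directly from a contoured RILI mask and the planned dose grid; the sketch below assumes both are co-registered NumPy arrays of equal shape, and the variable names are illustrative rather than taken from the study.

    ```python
    import numpy as np

    def fibrosis_extension_index(rili_mask, dose_grid_gy, isodose_gy):
        """FEI at a given isodose level: percentage of the contoured RILI volume
        that lies outside (i.e. receives less than) the chosen isodose, e.g. 40 Gy."""
        rili = rili_mask.astype(bool)
        total_voxels = rili.sum()
        if total_voxels == 0:
            return 0.0
        outside = (rili & (dose_grid_gy < isodose_gy)).sum()
        return 100.0 * outside / total_voxels
    ```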

  7. High-level waste storage tank farms/242-A evaporator standards/requirements identification document (S/RID), Vol. 1

    SciTech Connect

    Not Available

    1994-04-01

    The purpose of this Requirements Identification Document (RID) section is to identify, in one location, all of the facility specific requirements and good industry practices which are necessary or important to establish an effective Issues Management Program for the Tank Farm Facility. The Management Systems Functional Area includes the site management commitment to environmental safety and health (ES&H) policies and controls, to compliance management, to development and management of policy and procedures, to occurrence reporting and corrective actions, resource and issue management, and to the self-assessment process.

  8. CAL-laborate: A Collaborative Publication on the Use of Computer Aided Learning for Tertiary Level Physical Sciences and Geosciences.

    ERIC Educational Resources Information Center

    Fernandez, Anne, Ed.; Sproats, Lee, Ed.; Sorensen, Stacey, Ed.

    2000-01-01

    The science community has been trying to use computers in teaching for many years. There has been much conformity in how this was to be achieved, and the wheel has been re-invented again and again as enthusiast after enthusiast has "done their bit" towards getting computers accepted. Computers are now used by science undergraduates (as well as…

  9. Identification of the wave speed and the second viscosity of cavitation flows with 2D RANS computations - Part I

    NASA Astrophysics Data System (ADS)

    Decaix, J.; Alligné, S.; Nicolet, C.; Avellan, F.; Münch, C.

    2015-12-01

    1D hydro-electric models are useful to predict dynamic behaviour of hydro-power plants. Regarding vortex rope and cavitation surge in Francis turbines, the 1D models require some inputs that can be provided by numerical simulations. In this paper, a 2D cavitating Venturi is considered. URANS computations are performed to investigate the dynamic behaviour of the cavitation sheet depending on the frequency variation of the outlet pressure. The results are used to calibrate and to assess the reliability of the 1D models.

  10. Identification of RNA polymerase III-transcribed Alu loci by computational screening of RNA-Seq data.

    PubMed

    Conti, Anastasia; Carnevali, Davide; Bollati, Valentina; Fustinoni, Silvia; Pellegrini, Matteo; Dieci, Giorgio

    2015-01-01

    Of the ∼ 1.3 million Alu elements in the human genome, only a tiny number are estimated to be active in transcription by RNA polymerase (Pol) III. Tracing the individual loci from which Alu transcripts originate is complicated by their highly repetitive nature. By exploiting RNA-Seq data sets and unique Alu DNA sequences, we devised a bioinformatic pipeline allowing us to identify Pol III-dependent transcripts of individual Alu elements. When applied to ENCODE transcriptomes of seven human cell lines, this search strategy identified ∼ 1300 Alu loci corresponding to detectable transcripts, with ∼ 120 of them expressed in at least three cell lines. In vitro transcription of selected Alus did not reflect their in vivo expression properties, and required the native 5'-flanking region in addition to internal promoter. We also identified a cluster of expressed AluYa5-derived transcription units, juxtaposed to snaR genes on chromosome 19, formed by a promoter-containing left monomer fused to an Alu-unrelated downstream moiety. Autonomous Pol III transcription was also revealed for Alus nested within Pol II-transcribed genes. The ability to investigate Alu transcriptomes at single-locus resolution will facilitate both the identification of novel biologically relevant Alu RNAs and the assessment of Alu expression alteration under pathological conditions.

  11. Identification of RNA polymerase III-transcribed Alu loci by computational screening of RNA-Seq data

    PubMed Central

    Conti, Anastasia; Carnevali, Davide; Bollati, Valentina; Fustinoni, Silvia; Pellegrini, Matteo; Dieci, Giorgio

    2015-01-01

    Of the ∼1.3 million Alu elements in the human genome, only a tiny number are estimated to be active in transcription by RNA polymerase (Pol) III. Tracing the individual loci from which Alu transcripts originate is complicated by their highly repetitive nature. By exploiting RNA-Seq data sets and unique Alu DNA sequences, we devised a bioinformatic pipeline allowing us to identify Pol III-dependent transcripts of individual Alu elements. When applied to ENCODE transcriptomes of seven human cell lines, this search strategy identified ∼1300 Alu loci corresponding to detectable transcripts, with ∼120 of them expressed in at least three cell lines. In vitro transcription of selected Alus did not reflect their in vivo expression properties, and required the native 5′-flanking region in addition to internal promoter. We also identified a cluster of expressed AluYa5-derived transcription units, juxtaposed to snaR genes on chromosome 19, formed by a promoter-containing left monomer fused to an Alu-unrelated downstream moiety. Autonomous Pol III transcription was also revealed for Alus nested within Pol II-transcribed genes. The ability to investigate Alu transcriptomes at single-locus resolution will facilitate both the identification of novel biologically relevant Alu RNAs and the assessment of Alu expression alteration under pathological conditions. PMID:25550429

  12. Compositional modeling in porous media using constant volume flash and flux computation without the need for phase identification

    SciTech Connect

    Polívka, Ondřej; Mikyška, Jiří

    2014-09-01

    The paper deals with the numerical solution of a compositional model describing compressible two-phase flow of a mixture composed of several components in porous media with species transfer between the phases. The mathematical model is formulated by means of the extended Darcy's laws for all phases, components continuity equations, constitutive relations, and appropriate initial and boundary conditions. The splitting of components among the phases is described using a new formulation of the local thermodynamic equilibrium which uses volume, temperature, and moles as specification variables. The problem is solved numerically using a combination of the mixed-hybrid finite element method for the total flux discretization and the finite volume method for the discretization of transport equations. A new approach to numerical flux approximation is proposed, which does not require the phase identification and determination of correspondence between the phases on adjacent elements. The time discretization is carried out by the backward Euler method. The resulting large system of nonlinear algebraic equations is solved by the Newton–Raphson iterative method. We provide eight examples of different complexity to show reliability and robustness of our approach.

  13. Identification of stair climbing ability levels in community-dwelling older adults based on the geometric mean of stair ascent and descent speed: The GeMSS classifier.

    PubMed

    Mayagoitia, Ruth E; Harding, John; Kitchen, Sheila

    2017-01-01

    The aim was to develop a quantitative approach to identify three stair-climbing ability levels of older adults: no, somewhat, and considerable difficulty. The timed-up-and-go test, six-minute-walk test, and Berg balance scale were used for statistical comparison with a new stair-climbing ability classifier based on the geometric mean of stair ascent and descent speeds (GeMSS) on a flight of eight stairs with a 28° pitch in the housing unit where the participants, 28 urban older adults (16 women, 62-94 years), lived. Ordinal logistic regression revealed that the thresholds between the three ability levels for each functional test were more stringent than thresholds reported in the literature for classifying walking ability levels. Though this was a small study, the intermediate classifier shows promise for early identification of difficulties with stairs, enabling timely preventative interventions. Further studies are necessary to obtain scaling factors for stairs with other pitches.
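
    Because the GeMSS statistic is just the geometric mean of the two stair speeds, it is trivial to compute; the sketch below also shows how an ordinal three-level classification might be applied. The cut-off values and function names are placeholders, since the study-specific thresholds are not reproduced here.

    ```python
    import math

    def gemss(ascent_speed, descent_speed):
        """Geometric mean of stair ascent and descent speeds (same units, e.g. m/s)."""
        return math.sqrt(ascent_speed * descent_speed)

    def classify_ability(gemss_value, low_cut, high_cut):
        """Map a GeMSS value onto the three ability levels; the cut-offs are
        placeholders, not the published thresholds."""
        if gemss_value < low_cut:
            return "considerable difficulty"
        if gemss_value < high_cut:
            return "some difficulty"
        return "no difficulty"
    ```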

  14. Detection of Cryptosporidium and Identification to the Species Level by Nested PCR and Restriction Fragment Length Polymorphism

    PubMed Central

    Coupe, Stephane; Sarfati, Claudine; Hamane, Samia; Derouin, Francis

    2005-01-01

    Cryptosporidiosis is an emerging protozoan disease associated with large waterborne outbreaks. Diagnosis relies on microscopic examination of stools, but this method cannot identify the infecting species of Cryptosporidium. We have developed a test based on nested PCR and restriction fragment length polymorphism (RFLP) that offers simple identification of Cryptosporidium hominis, Cryptosporidium parvum, and most other human infective species in stool samples. Purified C. parvum oocysts were used for PCR development. Extracted DNA was amplified by nested PCR targeting a 214-bp fragment of the 18S RNA gene. Enzymatic restriction sites were identified by bioinformatic analysis of all published Cryptosporidium 18S rRNA sequences. Experiments with spiked stool samples gave an estimated PCR detection limit of one oocyst. Specificity was assessed by testing 68 stool samples from patients with microscopically proven cryptosporidiosis and 31 Cryptosporidium-negative stools. Sixty-seven (98.5%) of the 68 stool samples from patients with microscopically proven cryptosporidiosis and 2 of the other stool samples were positive by PCR and could be genotyped. RFLP analysis identified 36 C. hominis, 19 C. parvum, 8 Cryptosporidium meleagridis, and 6 Cryptosporidium felis or Cryptosporidium canis samples. Species determination in 26 PCR-positive cases was in full agreement with DNA sequencing of the 18S rRNA hypervariable region. The excellent sensitivity of PCR, coupled with the accuracy of RFLP for species identification, make this method a suitable tool for routine diagnosis and genotyping of Cryptosporidium in stools. PMID:15750054

  15. Species-level identification of the blowfly Chrysomya megacephala and other Diptera in China by DNA barcoding.

    PubMed

    Qiu, Deyi; Cook, Charles E; Yue, Qiaoyun; Hu, Jia; Wei, Xiaoya; Chen, Jian; Liu, Dexing; Wu, Keliang

    2017-02-01

    The blowfly Chrysomya megacephala, or oriental latrine fly, is the most common human-associated fly of the oriental and Australasian regions. Chrysomya megacephala is of particular interest for its use in forensic entomology and because it is a disease vector. The larvae are economically important as feed for livestock and in traditional Chinese medicine. Identification of adults is straightforward, but larvae and fragments of adults are difficult to identify. We collected C. megacephala, its allies Chrysomya pinguis and Protophormia terraenovae, as well as flies from 11 other species from 52 locations around China, then sequenced 658 base pairs of the COI barcode region from 645 flies of all 14 species, including 208 C. megacephala, as the basis of a COI barcode library for flies in China. While C. megacephala and its closest relative C. pinguis are closely related (mean K2P divergence of 0.022), these species are completely non-overlapping in their barcode divergences, thus demonstrating the utility of the COI barcode region for the identification of C. megacephala. We combined the 208 C. megacephala sequences from China with 98 others from public databases and show that worldwide COI barcode diversity is low, with 70% of all individuals belonging to one of three haplotypes that differ by one or two substitutions from each other, reflecting recent anthropogenic dispersal from its native range in Eurasia.
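
    The K2P divergence reported above separates transition and transversion changes; a self-contained sketch of the Kimura two-parameter distance for two aligned COI barcode sequences follows. Gapped and ambiguous positions are simply skipped, which is one common (but not the only) convention, and the function name is an assumption.

    ```python
    import math

    PURINES = {"A", "G"}

    def k2p_distance(seq1, seq2):
        """Kimura two-parameter distance between two aligned DNA sequences.
        P = proportion of transitions, Q = proportion of transversions;
        d = -0.5*ln(1 - 2P - Q) - 0.25*ln(1 - 2Q)."""
        pairs = [(a, b) for a, b in zip(seq1.upper(), seq2.upper())
                 if a in "ACGT" and b in "ACGT"]
        if not pairs:
            raise ValueError("no comparable sites")
        diffs = [(a, b) for a, b in pairs if a != b]
        # A difference is a transition when both bases are purines or both pyrimidines.
        transitions = sum(1 for a, b in diffs if (a in PURINES) == (b in PURINES))
        transversions = len(diffs) - transitions
        P, Q = transitions / len(pairs), transversions / len(pairs)
        return -0.5 * math.log(1 - 2 * P - Q) - 0.25 * math.log(1 - 2 * Q)
    ```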

  16. Identification of error making patterns in lesion detection on digital breast tomosynthesis using computer-extracted image features

    NASA Astrophysics Data System (ADS)

    Wang, Mengyu; Zhang, Jing; Grimm, Lars J.; Ghate, Sujata V.; Walsh, Ruth; Johnson, Karen S.; Lo, Joseph Y.; Mazurowski, Maciej A.

    2016-03-01

    Digital breast tomosynthesis (DBT) can improve lesion visibility by eliminating the issue of overlapping breast tissue present in mammography. However, this new modality likely requires new approaches to training. The issue of training in DBT is not well explored. We propose a computer-aided educational approach for DBT training. Our hypothesis is that the trainees' educational outcomes will improve if they are presented with cases individually selected to address their weaknesses. In this study, we focus on the question of how to select such cases. Specifically, we propose an algorithm that, based on previously acquired reading data, predicts which lesions will be missed by the trainee in future cases (i.e., we focus on false negative errors). A logistic regression classifier was used to predict the likelihood of trainee error, with computer-extracted features used as the predictors. Reader data from 3 expert breast imagers were used to establish the ground truth, and reader data from 5 radiology trainees were used to evaluate the algorithm's performance with repeated holdout cross-validation. Receiver operating characteristic (ROC) analysis was applied to measure the performance of the proposed individual trainee models. The preliminary experimental results for the 5 trainees showed that the individual trainee models were able to distinguish the lesions that would be detected from those that would be missed, with an average area under the ROC curve of 0.639 (95% CI, 0.580-0.698). The proposed algorithm can be used to identify difficult cases for individual trainees.
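
    A minimal sketch of the per-trainee model evaluation described above, using scikit-learn: a logistic regression on computer-extracted lesion features, scored by AUC under repeated hold-out splits. The split count, test fraction, and solver settings are assumptions, not the study's actual configuration.

    ```python
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import roc_auc_score
    from sklearn.model_selection import StratifiedShuffleSplit

    def evaluate_trainee_model(features, missed, n_splits=50, test_size=0.3, seed=0):
        """features: 2-D NumPy array of computer-extracted lesion features;
        missed: 1-D array, 1 if the trainee failed to detect the lesion, else 0.
        Returns mean and std of the AUC over repeated hold-out splits."""
        aucs = []
        splitter = StratifiedShuffleSplit(n_splits=n_splits, test_size=test_size,
                                          random_state=seed)
        for train_idx, test_idx in splitter.split(features, missed):
            model = LogisticRegression(max_iter=1000)
            model.fit(features[train_idx], missed[train_idx])
            scores = model.predict_proba(features[test_idx])[:, 1]
            aucs.append(roc_auc_score(missed[test_idx], scores))
        return float(np.mean(aucs)), float(np.std(aucs))
    ```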

  17. Computational Identification of Protein Pupylation Sites by Using Profile-Based Composition of k-Spaced Amino Acid Pairs.

    PubMed

    Hasan, Md Mehedi; Zhou, Yuan; Lu, Xiaotian; Li, Jinyan; Song, Jiangning; Zhang, Ziding

    2015-01-01

    Prokaryotic proteins are regulated by pupylation, a type of post-translational modification that contributes to cellular function in bacterial organisms. In the pupylation process, tagging with the prokaryotic ubiquitin-like protein (Pup) is functionally analogous to ubiquitination, marking target proteins for proteasomal degradation. To date, several experimental methods have been developed to identify pupylated proteins and their pupylation sites, but these experimental methods are generally laborious and costly. Therefore, computational methods that can accurately predict potential pupylation sites based on protein sequence information are highly desirable. In this paper, a novel predictor termed pbPUP has been developed for accurate prediction of pupylation sites. In particular, a sophisticated sequence encoding scheme [i.e. the profile-based composition of k-spaced amino acid pairs (pbCKSAAP)] is used to represent the sequence patterns and evolutionary information of the sequence fragments surrounding pupylation sites. Then, a Support Vector Machine (SVM) classifier is trained using the pbCKSAAP encoding scheme. The final pbPUP predictor achieves an AUC value of 0.849 in 10-fold cross-validation tests and outperforms other existing predictors on a comprehensive independent test dataset. The proposed method is anticipated to be a helpful computational resource for the prediction of pupylation sites. The web server and curated datasets in this study are freely available at http://protein.cau.edu.cn/pbPUP/.
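
    For orientation, the plain (non-profile) CKSAAP encoding underlying pbCKSAAP can be sketched as follows: for each spacing k, count every k-spaced amino-acid pair in the fragment around a candidate lysine and normalize. The published method additionally weights pairs with PSSM-derived profile information, which is omitted here; the value of k_max and the fragment handling are assumptions.

    ```python
    from itertools import product

    AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"
    PAIRS = ["".join(p) for p in product(AMINO_ACIDS, repeat=2)]  # 400 pairs

    def cksaap(fragment, k_max=4):
        """Plain CKSAAP features for a sequence fragment centred on a candidate
        lysine: for each spacing k (0..k_max), the normalized counts of all 400
        k-spaced amino-acid pairs. Non-standard residues are simply skipped."""
        features = []
        for k in range(k_max + 1):
            counts = dict.fromkeys(PAIRS, 0)
            n_pairs = 0
            for i in range(len(fragment) - k - 1):
                pair = fragment[i] + fragment[i + k + 1]
                if pair in counts:
                    counts[pair] += 1
                    n_pairs += 1
            features.extend(counts[p] / n_pairs if n_pairs else 0.0 for p in PAIRS)
        return features  # length (k_max + 1) * 400, e.g. 2000 for k_max = 4
    ```

    Vectors of this kind could then be passed to a standard SVM (e.g. sklearn.svm.SVC) as a stand-in for the profile-weighted encoding and classifier used by pbPUP.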

  18. Identification of novel adhesins of M. tuberculosis H37Rv using integrated approach of multiple computational algorithms and experimental analysis.

    PubMed

    Kumar, Sanjiv; Puniya, Bhanwar Lal; Parween, Shahila; Nahar, Pradip; Ramachandran, Srinivasan

    2013-01-01

    Pathogenic bacteria interacting with eukaryotic hosts express adhesins on their surface. These adhesins aid in bacterial attachment to host cell receptors during colonization. A few adhesins such as heparin-binding hemagglutinin adhesin (HBHA), Apa, and malate synthase of M. tuberculosis have been identified using specific experimental interaction models based on the biological knowledge of the pathogen. In the present work, we carried out computational screening for adhesins of M. tuberculosis. We used an integrated computational approach using SPAAN for predicting adhesins, PSORTb, SubLoc and LocTree for extracellular localization, and BLAST for verifying non-similarity to human proteins. These are among the first steps of reverse vaccinology. Multiple claims and attacks from different algorithms were processed through an argumentative approach. Additional filtration criteria included selection for proteins with low molecular weights and absence of literature reports. We examined the binding potential of the selected proteins using an image-based ELISA. The protein Rv2599 (membrane protein) binds to human fibronectin, laminin and collagen. Rv3717 (N-acetylmuramoyl-L-alanine amidase) and Rv0309 (L,D-transpeptidase) bind to fibronectin and laminin. We report Rv2599 (membrane protein), Rv0309 and Rv3717 as novel adhesins of M. tuberculosis H37Rv. Our results expand the number of known adhesins of M. tuberculosis and suggest their regulated expression in different stages.

  19. Computational Method for the Systematic Identification of Analog Series and Key Compounds Representing Series and Their Biological Activity Profiles.

    PubMed

    Stumpfe, Dagmar; Dimova, Dilyana; Bajorath, Jürgen

    2016-08-25

    A computational methodology is introduced for detecting all unique series of analogs in large compound data sets, regardless of chemical relationships between analogs. No prior knowledge of core structures or R-groups is required, which are automatically determined. The approach is based upon the generation of retrosynthetic matched molecular pairs and analog networks from which distinct series are isolated. The methodology was applied to systematically extract more than 17 000 distinct series from the ChEMBL database. For comparison, analog series were also isolated from screening compounds and drugs. Known biological activities were mapped to series from ChEMBL, and in more than 13 000 of these series, key compounds were identified that represented substitution sites of all analogs within a series and its complete activity profile. The analog series, key compounds, and activity profiles are made freely available as a resource for medicinal chemistry applications.

  20. Survey of computed tomography doses and establishment of national diagnostic reference levels in the Republic of Belarus.

    PubMed

    Kharuzhyk, S A; Matskevich, S A; Filjustin, A E; Bogushevich, E V; Ugolkova, S A

    2010-01-01

    Computed tomography dose index (CTDI) was measured on eight CT scanners at seven public hospitals in the Republic of Belarus. The effective dose was calculated using normalised values of effective dose per dose-length product (DLP) over various body regions. Considerable variations of the dose values were observed. Mean effective doses amounted to 1.4 +/- 0.4 mSv for brain, 2.6 +/- 1.0 mSv for neck, 6.9 +/- 2.2 mSv for thorax, 7.0 +/- 2.3 mSv for abdomen and 8.8 +/- 3.2 mSv for pelvis. Diagnostic reference levels (DRLs) were proposed by calculating the third quartiles of dose value distributions (body region/volume CTDI, mGy/DLP, mGy cm): brain/60/730, neck/55/640, thorax/20/500, abdomen/25/600 and pelvis/25/490. It is evident that the protocols need to be optimised on some of the CT scanners, in view of the fact that these are the first formulated DRLs for the Republic of Belarus.
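
    The two quantities in the survey above follow simple recipes: effective dose is DLP multiplied by a region-specific conversion coefficient, and the proposed DRL is the third quartile of the surveyed dose distribution. The sketch below illustrates both; the k-factor values shown are indicative adult coefficients of the kind tabulated in the CT dosimetry literature and should be treated as assumptions rather than the exact factors used in this study.

    ```python
    import numpy as np

    # Indicative adult conversion coefficients (mSv per mGy*cm); assumptions here.
    K_FACTORS = {"brain": 0.0021, "neck": 0.0059, "thorax": 0.014,
                 "abdomen": 0.015, "pelvis": 0.015}

    def effective_dose_msv(dlp_mgy_cm, region):
        """Effective dose (mSv) = DLP (mGy*cm) x region-specific k-factor."""
        return dlp_mgy_cm * K_FACTORS[region]

    def diagnostic_reference_level(values):
        """DRL proposed as the third quartile (75th percentile) of the surveyed
        CTDIvol or DLP values for one body region."""
        return float(np.percentile(values, 75))
    ```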

  1. Computational modeling of inhibition of voltage-gated Ca channels: identification of different effects on uterine and cardiac action potentials

    PubMed Central

    Tong, Wing-Chiu; Ghouri, Iffath; Taggart, Michael J.

    2014-01-01

    The uterus and heart share the important physiological feature whereby contractile activation of the muscle tissue is regulated by the generation of periodic, spontaneous electrical action potentials (APs). Preterm birth arising from premature uterine contractions is a major complication of pregnancy and there remains a need to pursue avenues of research that facilitate the use of drugs, tocolytics, to limit these inappropriate contractions without deleterious actions on cardiac electrical excitation. A novel approach is to make use of mathematical models of uterine and cardiac APs, which incorporate many ionic currents contributing to the AP forms, and test the cell-specific responses to interventions. We have used three such models—of uterine smooth muscle cells (USMC), cardiac sinoatrial node cells (SAN), and ventricular cells—to investigate the relative effects of reducing two important voltage-gated Ca currents—the L-type (ICaL) and T-type (ICaT) Ca currents. Reduction of ICaL (10%) alone, or ICaT (40%) alone, blunted USMC APs with little effect on ventricular APs and only mild effects on SAN activity. Larger reductions in either current further attenuated the USMC APs but with also greater effects on SAN APs. Encouragingly, a combination of ICaL and ICaT reduction did blunt USMC APs as intended with little detriment to APs of either cardiac cell type. Subsequent overlapping maps of ICaL and ICaT inhibition profiles from each model revealed a range of combined reductions of ICaL and ICaT over which an appreciable diminution of USMC APs could be achieved with no deleterious action on cardiac SAN or ventricular APs. This novel approach illustrates the potential for computational biology to inform us of possible uterine and cardiac cell-specific mechanisms. Incorporating such computational approaches in future studies directed at designing new, or repurposing existing, tocolytics will be beneficial for establishing a desired uterine specificity of action

  2. Frequency of Factors that Complicate the Identification of Mild Traumatic Brain Injury in Level I Trauma Center Patients

    PubMed Central

    Furger, Robyn E.; Nelson, Lindsay D.; Brooke Lerner, E.; McCrea, Michael A.

    2015-01-01

    Aim Determine the frequency of factors that complicate identification of mild traumatic brain injury (mTBI) in emergency department (ED) patients. Setting Chart review. Materials & Methods Records of 3,042 patients (age 18-45) exposed to a potential mechanism of mTBI were reviewed for five common complicating factors and signs of mTBI. Results Most patients (65.1%) had at least one complicating factor: given narcotics in the ED (43.7%), on psychotropic medication (18.4%), psychiatric diagnosis (15.3%), alcohol consumption near time of admission (14.2%), and pre-admission narcotic prescription (8.9%). Conclusion Our findings highlight the frequency of these confounding factors in this population. Future research should identify how these factors interact with performance on assessment measures to improve evidence-based mTBI assessment in this population. PMID:27134757

  3. SU-E-I-74: Image-Matching Technique of Computed Tomography Images for Personal Identification: A Preliminary Study Using Anthropomorphic Chest Phantoms

    SciTech Connect

    Matsunobu, Y; Shiotsuki, K; Morishita, J

    2015-06-15

    Purpose: Fingerprints, dental impressions, and DNA are used to identify unidentified bodies in forensic medicine. Cranial computed tomography (CT) images and/or dental radiographs are also used for identification. Radiological identification is important, particularly in the absence of comparative fingerprints, dental impressions, and DNA samples. The development of an automated radiological identification system for unidentified bodies is desirable. We investigated the potential usefulness of bone structure for matching chest CT images. Methods: CT images of three anthropomorphic chest phantoms were obtained on different days in various settings. One of the phantoms was assumed to be an unidentified body. The bone image and the bone image with soft tissue (BST image) were extracted from the CT images. To examine the usefulness of the bone image and/or the BST image, the similarities between the two-dimensional (2D) or three-dimensional (3D) images of the same and different phantoms were evaluated in terms of the normalized cross-correlation value (NCC). Results: For the 2D and 3D BST images, the NCCs obtained from the same phantom assumed to be an unidentified body (2D, 0.99; 3D, 0.93) were higher than those for the different phantoms (2D, 0.95 and 0.91; 3D, 0.89 and 0.80). For the bone image, the NCCs for the same phantom (2D, 0.95; 3D, 0.88) were greater than those of the different phantoms (2D, 0.61 and 0.25; 3D, 0.23 and 0.10). The difference in the NCCs between the same and different phantoms tended to be larger for the bone images than for the BST images. These findings suggest that the image-matching technique is more useful for distinguishing different people when the bone image is used than when the BST image is used. Conclusion: This preliminary study indicated that evaluating the similarity of bone structure in 2D and 3D images is potentially useful for identifying an unidentified body.
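
    The similarity measure used above, the normalized cross-correlation, works identically for 2D slices and 3D volumes of equal shape. The sketch below uses a zero-mean NCC together with a crude Hounsfield-unit threshold for bone extraction; the threshold value and function names are assumptions, not details reported in the abstract.

    ```python
    import numpy as np

    def normalized_cross_correlation(img_a, img_b):
        """Zero-mean NCC between two equally shaped 2D or 3D arrays."""
        a = img_a.astype(float) - img_a.mean()
        b = img_b.astype(float) - img_b.mean()
        denom = np.sqrt((a * a).sum() * (b * b).sum())
        return float((a * b).sum() / denom) if denom > 0 else 0.0

    def extract_bone(ct_volume_hu, threshold_hu=200):
        """Crude bone image: keep voxels above a Hounsfield-unit threshold,
        zero elsewhere. The threshold is an assumption, not a reported value."""
        bone = ct_volume_hu.astype(float).copy()
        bone[ct_volume_hu < threshold_hu] = 0.0
        return bone
    ```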

  4. New software for computer-assisted dental-data matching in Disaster Victim Identification and long-term missing persons investigations: "DAVID Web".

    PubMed

    Clement, J G; Winship, V; Ceddia, J; Al-Amad, S; Morales, A; Hill, A J

    2006-05-15

    In 1997 an internally supported but unfunded pilot project at the Victorian Institute of Forensic Medicine (VIFM) Australia led to the development of a computer system which closely mimicked Interpol paperwork for the storage, later retrieval and tentative matching of the many AM and PM dental records that are often needed for rapid Disaster Victim Identification. The program was called "DAVID" (Disaster And Victim IDentification). It combined the skills of the VIFM Information Technology systems manager (VW), an experienced odontologist (JGC) and an expert database designer (JC); all current authors on this paper. Students did much of the writing of software to prescription from Monash University. The student group involved won an Australian Information Industry Award in recognition of the contribution the new software could have made to the DVI process. Unfortunately, the potential of the software was never realized because paradoxically the federal nature of Australia frequently thwarts uniformity of systems across the entire country. As a consequence, the final development of DAVID never took place. Given the recent problems encountered post-tsunami by the odontologists who were obliged to use the Plass Data system (Plass Data Software, Holbaek, Denmark) and with the impending risks imposed upon Victoria by the decision to host the Commonwealth Games in Melbourne during March 2006, funding was sought and obtained from the state government to update counter disaster preparedness at the VIFM. Some of these funds have been made available to upgrade and complete the DAVID project. In the wake of discussions between leading expert odontologists from around the world held in Geneva during July 2003 at the invitation of the International Committee of the Red Cross significant alterations to the initial design parameters of DAVID were proposed. This was part of broader discussions directed towards developing instruments which could be used by the ICRC's "The Missing

  5. Gene expression levels in normal human lymphoblasts with variable sensitivities to arsenite: Identification of GGT1 and NFKBIE expression levels as possible biomarkers of susceptibility

    SciTech Connect

    Komissarova, Elena V.; Li Ping; Uddin, Ahmed N.; Chen, Xuyan; Nadas, Arthur; Rossman, Toby G.

    2008-01-15

    Drinking arsenic-contaminated water is associated with increased risk of neoplasias of the skin, lung, bladder and possibly other sites, as well as other diseases. Earlier, we showed that human lymphoblast lines from different normal unexposed donors showed variable sensitivities to the toxic effects of arsenite. In the present study, we used microarray analysis to compare the basal gene expression profiles between two arsenite-resistant (GM02707, GM00893) and two arsenite-sensitive lymphoblast lines (GM00546, GM00607). A number of genes were differentially expressed in arsenite-sensitive and arsenite-resistant cells. Among these, {gamma}-glutamyltranspeptidase 1 (GGT1) and NF{kappa}B inhibitor-epsilon (NFKBIE) showed higher expression levels in arsenite-resistant cells. RT-PCR analysis with gene-specific primers confirmed these results. Reduction of GGT1 expression level in arsenite-resistant lymphoblasts with GGT1-specific siRNA resulted in increased cell sensitivity to arsenite. In conclusion, we have demonstrated for the first time that expression levels of GGT1 and possibly NFKBIE might be useful as biomarkers of genetic susceptibility to arsenite. Expression microarrays can thus be exploited for identifying additional biomarkers of susceptibility to arsenite and to other toxicants.

  6. RANS computations for identification of 1-D cavitation model parameters: application to full load cavitation vortex rope

    NASA Astrophysics Data System (ADS)

    Alligné, S.; Decaix, J.; Müller, A.; Nicolet, C.; Avellan, F.; Münch, C.

    2016-11-01

    Due to the massive penetration of alternative renewable energies, hydropower is a key energy conversion technology for stabilizing the electrical power network by using hydraulic machines at off-design operating conditions. At full load, the axisymmetric cavitation vortex rope developing in Francis turbines acts as an internal source of energy, leading to an instability commonly referred to as self-excited surge. 1-D models are developed to predict this phenomenon and to define the range of safe operating points for a hydropower plant. These models involve several parameters that have to be calibrated using experimental and numerical data. The present work aims to identify these parameters with URANS computations, with a particular focus on the fluid damping arising when the cavitation volume oscillates. Two test cases have been investigated: a cavitating flow in a Venturi geometry without inlet swirl and a reduced scale model of a Francis turbine operating at full load conditions. The cavitation volume oscillation is forced by imposing an unsteady outlet pressure condition. By varying the frequency of the outlet pressure, the resonance frequency is determined. Then, the pressure amplitude and the resonance frequency are used as two objective functions for the optimization process aiming to derive the 1-D model parameters.

  7. Identification of MicroRNAs and transcript targets in Camelina sativa by deep sequencing and computational methods

    DOE PAGES

    Poudel, Saroj; Aryal, Niranjan; Lu, Chaofu; ...

    2015-03-31

    Camelina sativa is an annual oilseed crop that is under intensive development for renewable resources of biofuels and industrial oils. MicroRNAs, or miRNAs, are endogenously encoded small RNAs that play key roles in diverse plant biological processes. Here, we conducted deep sequencing on small RNA libraries prepared from camelina leaves, flower buds and two stages of developing seeds corresponding to initial and peak storage products accumulation. Computational analyses identified 207 known miRNAs belonging to 63 families, as well as 5 novel miRNAs. These miRNAs, especially members of the miRNA families, varied greatly in different tissues and developmental stages. The predicted miRNA target genes are involved in a broad range of physiological functions including lipid metabolism. This report is the first step toward elucidating roles of miRNAs in C. sativa and will provide additional tools to improve this oilseed crop for biofuels and biomaterials.

  8. A Combination of Screening and Computational Approaches for the Identification of Novel Compounds That Decrease Mast Cell Degranulation

    PubMed Central

    McShane, Marisa P.; Friedrichson, Tim; Giner, Angelika; Meyenhofer, Felix; Barsacchi, Rico; Bickle, Marc

    2015-01-01

    High-content screening of compound libraries poses various challenges in the early steps in drug discovery such as gaining insights into the mode of action of the selected compounds. Here, we addressed these challenges by integrating two biological screens through bioinformatics and computational analysis. We screened a small-molecule library enriched in amphiphilic compounds in a degranulation assay in rat basophilic leukemia 2H3 (RBL-2H3) cells. The same library was rescreened in a high-content image-based endocytosis assay in HeLa cells. This assay was previously applied to a genome-wide RNAi screen that produced quantitative multiparametric phenotypic profiles for genes that directly or indirectly affect endocytosis. By correlating the endocytic profiles of the compounds with the genome-wide siRNA profiles, we identified candidate pathways that may be inhibited by the compounds. Among these, we focused on the Akt pathway and validated its inhibition in HeLa and RBL-2H3 cells. We further showed that the compounds inhibited the translocation of the Akt-PH domain to the plasma membrane. The approach performed here can be used to integrate chemical and functional genomics screens for investigating the mechanism of action of compounds. PMID:25838434

  9. Identification of evolutionarily conserved Momordica charantia microRNAs using computational approach and its utility in phylogeny analysis.

    PubMed

    Thirugnanasambantham, Krishnaraj; Saravanan, Subramanian; Karikalan, Kulandaivelu; Bharanidharan, Rajaraman; Lalitha, Perumal; Ilango, S; HairulIslam, Villianur Ibrahim

    2015-10-01

    Momordica charantia (bitter gourd, bitter melon) is a monoecious Cucurbitaceae with anti-oxidant, anti-microbial, anti-viral and anti-diabetic potential. Molecular studies on this economically valuable plant are essential to understand its phylogeny and evolution. MicroRNAs (miRNAs) are conserved, small, non-coding RNAs that regulate gene expression by binding the 3' UTR of target mRNAs, and they have evolved at different rates in different plant species. In this study we used a homology-based computational approach to identify 27 mature miRNAs for the first time from this biomedically important plant. The phylogenetic tree developed from binary data on the presence/absence of the identified miRNAs was found to be uncertain and biased. Most of the identified miRNAs were highly conserved among plant species, and sequence-based phylogenetic analysis of the miRNAs resolved these difficulties of the miRNA-based phylogeny approach. Predicted gene targets of the identified miRNAs revealed their importance in the regulation of plant developmental processes. The reported mature miRNAs showed strong sequence conservation, and detailed phylogenetic analysis of the pre-miRNA sequences revealed genus-specific segregation of clusters.

  10. Identification of potential transmembrane protease serine 4 inhibitors as anti-cancer agents by integrated computational approach.

    PubMed

    Ilamathi, M; Hemanth, R; Nishanth, S; Sivaramakrishnan, V

    2016-01-21

    Transmembrane protease serine 4 is a well-known cell surface protease facilitating extracellular matrix degradation and epithelial-mesenchymal transition in hepatocellular carcinoma. Hence, targeting transmembrane protease serine 4 is strongly believed to provide therapeutic intervention against hepatocellular carcinoma. Owing to the lack of a crystal structure for human transmembrane protease serine 4, we predicted its three-dimensional structure for the first time in this study. Tyroserleutide (TSL), an experimentally proven inhibitor acting against hepatocellular carcinoma via transmembrane protease serine 4, was used as a benchmark to identify structurally similar candidates from the PubChem database to create the TSL library. Virtual screening of the TSL library against the modeled transmembrane protease serine 4 revealed the top four potential inhibitors. Further binding free energy (ΔGbind) analysis of the potential inhibitors revealed the best potential lead compound against transmembrane protease serine 4. The drug-likeness of the top four potential hits was additionally analyzed in comparison to TSL to confirm the best potential lead compound, the one with the highest predicted human oral absorption. Consequently, e-pharmacophore mapping of the best potential lead compound yielded a six-point feature, observed to contain four hydrogen bond donor sites (D), one positively ionizable site (P) and one aromatic ring (R). Such e-pharmacophore insight, obtained from structural determinants by integrated computational analysis, could serve as a framework for further advancement of the drug discovery process for new anti-cancer agents with low toxicity and high specificity targeting transmembrane protease serine 4 and hepatocellular carcinoma.

  11. Computational identification of new structured cis-regulatory elements in the 3'-untranslated region of human protein coding genes.

    PubMed

    Chen, Xiaowei Sylvia; Brown, Chris M

    2012-10-01

    Messenger ribonucleic acids (RNAs) contain a large number of cis-regulatory RNA elements that function in many types of post-transcriptional regulation. These cis-regulatory elements are often characterized by conserved structures and/or sequences. Although some classes are well known, given the wide range of RNA-interacting proteins in eukaryotes, it is likely that many new classes of cis-regulatory elements are yet to be discovered. One approach to this is to use computational methods, which have the advantage of analysing genomic data, particularly comparative data, on a large scale. In this study, a set of structural discovery algorithms was applied, followed by support vector machine (SVM) classification. We trained a new classification model (CisRNA-SVM) on a set of known structured cis-regulatory elements from 3'-untranslated regions (UTRs) and successfully distinguished these, as well as groups of cis-regulatory elements the model had not been trained on, from control genomic and shuffled sequences. The new method outperformed previous methods in the classification of cis-regulatory RNA elements. This model was then used to predict new elements from cross-species conserved regions of human 3'-UTRs. Clustering of these elements identified new classes of potential cis-regulatory elements. The model, training and testing sets, and novel human predictions are available at: http://mRNA.otago.ac.nz/CisRNA-SVM.
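
    The classification step can be pictured with a small scikit-learn sketch: candidate and control sequences are turned into feature vectors and an RBF-kernel SVM is cross-validated. The features here are simple 3-mer frequencies standing in for the structural features used by CisRNA-SVM, and the toy sequences are invented; this is an illustration of the approach, not the published model.

      # Sketch of an SVM separating putative cis-regulatory elements from
      # shuffled controls, using 3-mer frequencies as stand-in features.
      from itertools import product
      import numpy as np
      from sklearn.svm import SVC
      from sklearn.model_selection import cross_val_score

      KMERS = ["".join(p) for p in product("ACGU", repeat=3)]

      def kmer_features(seq):
          """Normalised 3-mer frequency vector for an RNA sequence."""
          counts = np.array([seq.count(k) for k in KMERS], dtype=float)
          total = counts.sum()
          return counts / total if total else counts

      # Hypothetical training data: positives (known elements) and negatives
      # (shuffled genomic sequences) would be loaded from files in practice.
      positives = ["UGUAAAUAUUUCGUAGCUACAU", "GUUCAAGUUCGUACGUGAAC"]
      negatives = ["AUGCAUGCAUGCAUGCAUGCAU", "CCGGAAUUCCGGAAUUCCGG"]

      X = np.array([kmer_features(s) for s in positives + negatives])
      y = np.array([1] * len(positives) + [0] * len(negatives))

      clf = SVC(kernel="rbf", C=1.0, gamma="scale")
      scores = cross_val_score(clf, X, y, cv=2)   # tiny toy set; real data needed
      print("cross-validated accuracy:", scores.mean())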

  12. LncFunNet: an integrated computational framework for identification of functional long noncoding RNAs in mouse skeletal muscle cells.

    PubMed

    Zhou, Jiajian; Zhang, Suyang; Wang, Huating; Sun, Hao

    2017-04-04

    Long noncoding RNAs (lncRNAs) are key regulators of diverse cellular processes. Recent advances in high-throughput sequencing have allowed for an unprecedented discovery of novel lncRNAs. Identifying functional lncRNAs from thousands of candidates for further functional validation remains a challenging task. Here, we present a novel computational framework, lncFunNet (lncRNA Functional inference through integrated Network), that integrates ChIP-seq, CLIP-seq and RNA-seq data to predict, prioritize and annotate lncRNA functions. In mouse embryonic stem cells (mESCs), using lncFunNet we not only recovered most of the functional lncRNAs known to maintain mESC pluripotency but also predicted a plethora of novel functional lncRNAs. Similarly, in mouse myoblast C2C12 cells, applying lncFunNet led to prediction of reservoirs of functional lncRNAs in both proliferating myoblasts (MBs) and differentiating myotubes (MTs). Further analyses demonstrated that these lncRNAs are frequently bound by key transcription factors, interact with miRNAs and constitute key nodes in biological network motifs. Further experiments validated their dynamic expression profiles and functionality during myoblast differentiation. Collectively, our studies demonstrate the use of lncFunNet to annotate and identify functional lncRNAs in a given biological system.
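
    The network-based inference idea can be illustrated with a guilt-by-association sketch: a lncRNA's likely function is scored by the annotations of its neighbours in an interaction graph. The graph, gene names and annotations below are invented placeholders; the actual lncFunNet integrates ChIP-seq, CLIP-seq and RNA-seq evidence into its network.

      # Guilt-by-association sketch: score a lncRNA's association with a function
      # by the fraction of its network neighbours annotated with that function.
      import networkx as nx

      G = nx.Graph()
      G.add_edges_from([
          ("lncRNA_1", "Pou5f1"), ("lncRNA_1", "Sox2"), ("lncRNA_1", "Gata4"),
          ("lncRNA_2", "Myod1"), ("lncRNA_2", "Myog"),
      ])
      annotation = {"Pou5f1": "pluripotency", "Sox2": "pluripotency",
                    "Gata4": "endoderm", "Myod1": "myogenesis", "Myog": "myogenesis"}

      def function_scores(graph, node):
          """Return {function: fraction of annotated neighbours}."""
          neigh = [annotation[n] for n in graph.neighbors(node) if n in annotation]
          return {f: neigh.count(f) / len(neigh) for f in set(neigh)} if neigh else {}

      print(function_scores(G, "lncRNA_1"))   # pluripotency dominates
      print(function_scores(G, "lncRNA_2"))   # myogenesis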

  13. MegaMiner: A Tool for Lead Identification Through Text Mining Using Chemoinformatics Tools and Cloud Computing Environment.

    PubMed

    Karthikeyan, Muthukumarasamy; Pandit, Yogesh; Pandit, Deepak; Vyas, Renu

    2015-01-01

    Virtual screening is an indispensable tool for coping with the massive amount of data generated by high-throughput omics technologies. With the objective of enhancing the automation of the virtual screening process, a robust portal termed MegaMiner has been built on a cloud computing platform, wherein the user submits a text query and directly accesses the proposed lead molecules along with their drug-like, lead-like and docking scores. Textual representation of chemical structural data is fraught with ambiguity in the absence of a global identifier. We used a combination of statistical models, a chemical dictionary and regular expressions to build a disease-specific dictionary. To demonstrate the effectiveness of this approach, a case study on malaria was carried out in the present work. MegaMiner offered superior results compared to other text mining search engines, as established by F score analysis. A single query term 'malaria' in the portlet led to retrieval of related PubMed records, protein classes, drug classes and 8000 scaffolds, which were internally processed and filtered to suggest new molecules as potential anti-malarials. The results obtained were validated by docking the virtual molecules into relevant protein targets. It is hoped that MegaMiner will serve as an indispensable tool not only for identifying hidden relationships between various biological and chemical entities but also for building better corpora and ontologies.
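
    The F score used to compare retrieval performance is the harmonic mean of precision and recall; a small helper with invented retrieval counts is shown below for reference.

      # F score from true positives (tp), false positives (fp), false negatives (fn).
      def f_score(tp, fp, fn, beta=1.0):
          precision = tp / (tp + fp)
          recall = tp / (tp + fn)
          b2 = beta ** 2
          return (1 + b2) * precision * recall / (b2 * precision + recall)

      # e.g. 80 relevant records retrieved, 20 irrelevant retrieved, 40 missed
      print(round(f_score(80, 20, 40), 3))   # 0.727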

  14. Acyl chain preference and inhibitor identification of Moraxella catarrhalis LpxA: Insight through crystal structure and computational studies.

    PubMed

    Pratap, Shivendra; Kesari, Pooja; Yadav, Ravi; Dev, Aditya; Narwal, Manju; Kumar, Pravindra

    2017-03-01

    Lipopolysaccharide (LPS) is an important surface component and a potential virulence factor in the pathogenesis of Gram-negative bacteria. The UDP-N-acetylglucosamine acyltransferase (LpxA) enzyme catalyzes the first reaction of LPS biosynthesis, the reversible transfer of an R-3-hydroxy-acyl moiety from donor R-3-hydroxy-acyl-acyl carrier protein to the 3' hydroxyl position of UDP-N-acetyl-glucosamine. The enzyme's essentiality for bacterial survival and the absence of any homologous protein in humans make LpxA a promising target for anti-bacterial drug development. Herein, we present the crystal structure of Moraxella catarrhalis LpxA (McLpxA). We propose that L171 is responsible for limiting the acyl chain length in McLpxA to 10C or 12C. The study reveals the plausible interactions between the highly conserved clusters of basic residues at the C-terminal end of McLpxA and acidic residues of acyl carrier protein (ACP). Furthermore, the crystal structure of McLpxA was used to screen potential inhibitors from the NCI open database using various computational approaches, viz. pharmacophore mapping, virtual screening and molecular docking. Molecules Mol212032, Mol609399 and Mol152546 showed the best binding affinity with McLpxA among all screened molecules. These molecules mimic the substrate-LpxA binding interactions.

  15. Computational Identification of Tissue-Specific Splicing Regulatory Elements in Human Genes from RNA-Seq Data

    PubMed Central

    Badr, Eman; ElHefnawi, Mahmoud; Heath, Lenwood S.

    2016-01-01

    Alternative splicing is a vital process for regulating gene expression and promoting proteomic diversity. It plays a key role in genes with tissue-specific expression. This specificity is mainly regulated by splicing factors that bind to specific sequences called splicing regulatory elements (SREs). Here, we report a genome-wide analysis to study alternative splicing in multiple tissues, including brain, heart, liver, and muscle. We propose a pipeline to identify differentially used exons across tissues and hence tissue-specific SREs. In our pipeline, we utilize the DEXSeq package along with our previously reported algorithms. Utilizing the publicly available RNA-Seq data set from the Human BodyMap project, we identified 28,100 differentially used exons across the four tissues. We identified tissue-specific exonic splicing enhancers that overlap with various previously published experimental and computational databases. A complex exonic enhancer regulatory network was revealed, in which multiple exonic enhancers were found across multiple tissues while some were found only in specific tissues. Putative combinatorial exonic enhancers and silencers were discovered as well, which may be responsible for exon inclusion or exclusion across tissues. Some of the exonic enhancers were found to co-occur with multiple exonic silencers and vice versa, which demonstrates a complicated relationship between tissue-specific exonic enhancers and silencers. PMID:27861625
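
    The differential exon usage idea can be illustrated with a toy per-exon test: an exon's read count is compared against the rest of its gene between two tissues with Fisher's exact test. The real pipeline relies on DEXSeq's negative-binomial model across replicates; the counts below are invented.

      # Toy per-exon test for differential usage between two tissues: compare the
      # exon's read count against the rest of its gene with Fisher's exact test.
      from scipy.stats import fisher_exact

      def exon_usage_test(exon_counts, gene_counts):
          """exon_counts/gene_counts: (tissue_A, tissue_B) read counts."""
          table = [
              [exon_counts[0], gene_counts[0] - exon_counts[0]],
              [exon_counts[1], gene_counts[1] - exon_counts[1]],
          ]
          odds, p = fisher_exact(table)
          return odds, p

      odds, p = exon_usage_test(exon_counts=(150, 20), gene_counts=(1000, 900))
      print(f"odds ratio={odds:.2f}, p={p:.2e}")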

  16. Computational identification of conserved microRNAs and their putative targets in the Hypericum perforatum L. flower transcriptome.

    PubMed

    Galla, Giulio; Volpato, Mirko; Sharbel, Timothy F; Barcaccia, Gianni

    2013-09-01

    MicroRNAs (miRNAs) have recently emerged as important regulators of gene expression in plants. Many miRNA families and their targets have been extensively studied in model species and major crops. We have characterized mature miRNAs along with their precursors and potential targets in Hypericum to generate a comprehensive list of conserved miRNA families and to investigate the regulatory role of selected miRNAs in biological processes that occur in the flower. St. John's wort (Hypericum perforatum L., 2n = 4x = 32), a medicinal plant that produces pharmaceutically important metabolites with therapeutic activities, was chosen because it is regarded as an attractive model system for the study of apomixis. A computational in silico prediction of structure, in combination with an in vitro validation, allowed us to identify 7 pre-miRNAs, including miR156, miR166, miR390, miR394, miR396, and miR414. We demonstrated that H. perforatum flowers share highly conserved miRNAs and that these miRNAs potentially target dozens of genes with a wide range of molecular functions, including metabolism, response to stress, flower development, and plant reproduction. Our analysis paves the way toward identifying flower-specific miRNAs that may differentiate the sexual and apomictic reproductive pathways.

  17. Comparison of ultrasonography, radiography and a single computed tomography slice for fluid identification within the feline tympanic bulla.

    PubMed

    King, A M; Weinrauch, S A; Doust, R; Hammond, G; Yam, P S; Sullivan, M

    2007-05-01

    Evaluation of the tympanic bulla (TB) in cases of acute feline otitis media can be a diagnostic challenge, although a feature often associated with this condition is the accumulation of fluid or material within the middle ear cavity. A technique is reported that allows optimum imaging of the feline TB using ultrasound (US) and recording of the appearance of gas- and fluid-filled TBs. A random number of bullae in 42 feline cadavers were filled with lubricant, and rostroventral-caudodorsal oblique radiographs, single-slice computed tomography (CT) images and US images were created and interpreted by blinded operators. The content (fluid or gas) of each TB was determined using each technique, and the cadavers were then frozen and sectioned for confirmation. CT remained the most accurate diagnostic method, but US produced better results than radiography. Given the advantages of US over other imaging techniques, these results suggest that further work is warranted to determine applications of this modality in the evaluation of clinical cases of feline otitis media.

  18. Computer-Based Writing and Paper-Based Writing: A Study of Beginning-Level and Intermediate-Level Chinese Learners' Writing

    ERIC Educational Resources Information Center

    Kang, Hana

    2011-01-01

    Chinese writing is one of the most difficult challenges for Chinese learners whose first language writing system is alphabetic letters. Chinese teachers have incorporated computer-based writing into their teaching in the attempt to reduce the difficulties of writing in Chinese, with a particular emphasis on composing (as opposed to simply writing…

  19. High School Students Learning University Level Computer Science on the Web: A Case Study of the "DASK"-Model

    ERIC Educational Resources Information Center

    Grandell, Linda

    2005-01-01

    Computer science is becoming increasingly important in our society. Meta skills, such as problem solving and logical and algorithmic thinking, are emphasized in every field, not only in the natural sciences. Still, largely due to gaps in tuition, common misunderstandings exist about the true nature of computer science. These are especially…

  20. Identification and Validation of Novel Hedgehog-Responsive Enhancers Predicted by Computational Analysis of Ci/Gli Binding Site Density

    PubMed Central

    Richards, Neil; Parker, David S.; Johnson, Lisa A.; Allen, Benjamin L.; Barolo, Scott; Gumucio, Deborah L.

    2015-01-01

    The Hedgehog (Hh) signaling pathway directs a multitude of cellular responses during embryogenesis and adult tissue homeostasis. Stimulation of the pathway results in activation of Hh target genes by the transcription factor Ci/Gli, which binds to specific motifs in genomic enhancers. In Drosophila, only a few enhancers (patched, decapentaplegic, wingless, stripe, knot, hairy, orthodenticle) have been shown by in vivo functional assays to depend on direct Ci/Gli regulation. All but one (orthodenticle) contain more than one Ci/Gli site, prompting us to directly test whether homotypic clustering of Ci/Gli binding sites is sufficient to define a Hh-regulated enhancer. We therefore developed a computational algorithm to identify Ci/Gli clusters that are enriched over random expectation, within a given region of the genome. Candidate genomic regions containing Ci/Gli clusters were functionally tested in chicken neural tube electroporation assays and in transgenic flies. Of the 22 Ci/Gli clusters tested, seven novel enhancers (and the previously known patched enhancer) were identified as Hh-responsive and Ci/Gli-dependent in one or both of these assays, including: Cuticular protein 100A (Cpr100A); invected (inv), which encodes an engrailed-related transcription factor expressed at the anterior/posterior wing disc boundary; roadkill (rdx), the fly homolog of vertebrate Spop; the segment polarity gene gooseberry (gsb); and two previously untested regions of the Hh receptor-encoding patched (ptc) gene. We conclude that homotypic Ci/Gli clustering is not sufficient information to ensure Hh-responsiveness; however, it can provide a clue for enhancer recognition within putative Hedgehog target gene loci. PMID:26710299
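
    The cluster-scanning idea can be sketched as a sliding-window count of motif matches compared with a Poisson background expectation. The consensus pattern, window size and background probability below are simplified assumptions, not the study's position-weight model or genome data.

      # Sketch of homotypic-cluster scanning: slide a window along a sequence,
      # count motif matches, and flag windows whose count exceeds expectation
      # under a simple Poisson background model.
      import math
      import re

      MOTIF = re.compile("TGGG[TA]GGTC")          # simplified Ci/Gli-like consensus
      WINDOW, STEP = 1000, 100

      def poisson_sf(k, lam):
          """P(X >= k) for a Poisson(lam) variable."""
          return 1.0 - sum(math.exp(-lam) * lam**i / math.factorial(i) for i in range(k))

      def find_clusters(seq, p_background, alpha=0.01):
          lam = p_background * WINDOW              # expected matches per window
          for start in range(0, max(1, len(seq) - WINDOW + 1), STEP):
              window = seq[start:start + WINDOW]
              k = len(MOTIF.findall(window))
              if k >= 2 and poisson_sf(k, lam) < alpha:
                  yield start, k

      # Hypothetical usage: genome_seq would be a chromosome or locus string and
      # p_background the genome-wide per-base match probability of the motif.
      genome_seq = "ATGGGTGGTCA" * 200
      for start, k in find_clusters(genome_seq, p_background=1e-4):
          print(f"candidate enhancer window at {start}: {k} Ci/Gli-like sites")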

  1. Identification of hemodynamically compromised regions by means of cerebral blood volume mapping utilizing computed tomography perfusion imaging.

    PubMed

    Takahashi, Satoshi; Tanizaki, Yoshio; Akaji, Kazunori; Kimura, Hiroaki; Katano, Takehiro; Suzuki, Kentaro; Mochizuki, Yoichi; Shidoh, Satoka; Nakazawa, Masaki; Yoshida, Kazunari; Mihara, Ban

    2017-04-01

    The aim of the study was to evaluate the potential role of computed tomography perfusion (CTP) imaging in identifying hemodynamically compromised regions in patients with occlusive cerebrovascular disease. Twelve patients diagnosed with either occlusion or severe stenosis of the internal carotid artery or the M1 portion of the middle cerebral artery underwent CTP imaging. The data were analyzed using automated ROI-determining software. Patients were classified into two subgroups: an asymptomatic group consisting of three patients in whom perfusion pressure distal to the site of occlusion/stenosis (PPdis) could be maintained in spite of the arterial occlusion/stenosis, and a symptomatic group consisting of nine patients in whom PPdis could not be maintained sufficiently to avoid watershed infarction. Four CTP-related parameters were independently compared between the two groups. Significant differences were determined using a two-sample t-test. When statistically significant differences were identified, cut-off points were calculated using ROC curves. Analysis revealed statistically significant differences between the asymptomatic and symptomatic subgroups only in the measure of relCBV (p=0.028). Higher relCBV values were observed in the symptomatic subgroup. ROC curve analysis revealed 1.059 to be the optimal relCBV cut-off value for distinguishing between the asymptomatic and symptomatic subgroups. The data revealed that, in patients whose PPdis is maintained, relCBV remains around 1.00. Conversely, in patients whose PPdis is decreased, relCBV is increased. From these findings, we conclude that elevation of relCBV as observed using CTP imaging accurately reflects the extent of compensatory vasodilatation and can identify hemodynamically compromised regions.
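
    A relCBV cut-off of the kind reported here is typically read off an ROC curve by maximising Youden's J (sensitivity + specificity - 1). The sketch below shows that step with invented relCBV values and group labels, not the patient data from this study.

      # Choose a cut-off from an ROC curve by maximising Youden's J statistic.
      import numpy as np
      from sklearn.metrics import roc_curve

      relcbv = np.array([0.98, 1.00, 1.02, 1.08, 1.12, 1.05, 1.20, 1.15, 1.10, 1.07, 1.09, 1.18])
      symptomatic = np.array([0, 0, 0, 1, 1, 1, 1, 1, 1, 1, 1, 1])   # 1 = symptomatic

      fpr, tpr, thresholds = roc_curve(symptomatic, relcbv)
      j = tpr - fpr
      best = np.argmax(j)
      print(f"optimal relCBV cut-off ~ {thresholds[best]:.3f} "
            f"(sensitivity={tpr[best]:.2f}, specificity={1 - fpr[best]:.2f})")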

  2. Computational identification of tumor anatomic location associated with survival in two large cohorts of human primary glioblastomas

    PubMed Central

    Liu, Tiffany T.; Achrol, Achal S.; Mitchell, Lex A.; Du, William A.; Loya, Joshua J.; Rodriguez, Scott A.; Feroze, Abdullah; Westbroek, Erick M.; Yeom, Kristen W.; Stuart, Joshua M.; Chang, Steven D.; Harsh, Griffith R.; Rubin, Daniel L.

    2015-01-01

    Background and Purpose Tumor location has been shown to be a significant prognostic factor in patients with glioblastoma (GBM). The purpose of this study is to characterize GBM lesions by identifying MRI voxel-based tumor location features that are associated with tumor molecular profiles, patient characteristics and clinical outcomes. Materials and Methods Preoperative T1 anatomic MR images of 384 GBM patients were obtained from two independent cohorts (N=253 from our local (name withheld to preserve anonymity) Medical Center for training and N=131 from the Cancer Genome Atlas (TCGA) for validation). An automated computational image analysis pipeline was developed to determine the anatomic locations of tumor in each patient. Voxel-based differences in tumor location between good (overall survival (OS) > 17 months) and poor (OS < 11 months) survival groups identified in the training cohort were used to classify patients in the TCGA cohort into two brain location groups, for which clinical features, mRNA expression, and copy number changes were compared to elucidate the biological basis of tumors located in different brain regions. Results Tumors in the right occipito-temporal periventricular white matter were significantly associated with poor survival in both training and test cohorts (both log-rank P < 0.05) and had larger tumor volume compared to tumors in other locations. Tumors in the right peri-atrial location were associated with hypoxia pathway enrichment and PDGFRA amplification, making them potential targets for subgroup-specific therapies. Conclusion Voxel-based location in GBM is associated with patient outcome and may have a potential role for guiding personalized treatment. PMID:26744442
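
    The survival comparison between location-defined groups rests on a log-rank test; a minimal sketch using the lifelines package is shown below, with invented survival times and event flags rather than the cohort data.

      # Log-rank comparison of survival between two tumour-location groups.
      from lifelines.statistics import logrank_test

      months_group1 = [20, 25, 18, 30, 22, 27]    # e.g. tumours outside the peri-atrial region
      events_group1 = [1, 1, 0, 1, 1, 0]          # 1 = death observed, 0 = censored
      months_group2 = [8, 10, 9, 12, 7, 11]       # e.g. right peri-atrial tumours
      events_group2 = [1, 1, 1, 1, 1, 0]

      result = logrank_test(months_group1, months_group2,
                            event_observed_A=events_group1,
                            event_observed_B=events_group2)
      print(f"log-rank p = {result.p_value:.4f}")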

  3. Computational Identification Raises a Riddle for Distribution of Putative NACHT NTPases in the Genome of Early Green Plants

    PubMed Central

    Arya, Preeti; Acharya, Vishal

    2016-01-01

    NACHT NTPases and AP-ATPases belong to the STAND (signal transduction ATPases with numerous domains) class of P-loop NTPases, which are known to be involved in defense signaling pathways and apoptosis regulation. The AP-ATPases (also known as NB-ARC) and NACHT NTPases are widely spread throughout all kingdoms of life except in plants, where only AP-ATPases have been extensively studied in the context of the plant defense response against pathogen invasion and the hypersensitive response (HR). In the present study, we have employed a genome-wide survey (using stringent computational analysis) of 67 diverse organisms, viz., archaebacteria, cyanobacteria, fungi, animalia and plantae, to revisit the evolutionary history of these two STAND P-loop NTPases. This analysis revealed the presence of NACHT NTPases in early green plants (green algae and the lycophyte), which had not been previously reported. These NACHT NTPases are known to be involved in diverse functional activities such as transcription regulation in addition to defense signaling cascades, depending on the domain association. In Chlamydomonas reinhardtii, a green alga, WD40 repeats found at the carboxyl-terminus of NACHT NTPases suggest a probable role in apoptosis regulation. Moreover, the genome of Selaginella moellendorffii, an extant lycophyte, intriguingly shows a considerable number of both AP-ATPases and NACHT NTPases, in contrast to the large repertoire of AP-ATPases in plants, and emerges as an important node in the evolutionary tree of life. The large complement of AP-ATPases may have overtaken the function of NACHT NTPases, a plausible reason for the absence of the latter in plant lineages. The presence of NACHT NTPases in early green plants and the phyletic patterns resulting from this study raise a quandary about the distribution of this STAND P-loop NTPase, with apparent horizontal gene transfer from cyanobacteria. PMID:26930396

  4. Building an electronic book on the Internet: "CSEP -- an interdisciplinary syllabus for teaching computational science at the graduate level"

    SciTech Connect

    Oliver, C.E.; Strayer, M.R.; Umar, V.M.

    1994-12-31

    The Computational Science Education Project (CSEP) was initiated in September 1991 by the Department of Energy to develop a syllabus for teaching interdisciplinary computational science. CSEP has two major activities: the writing and maintenance of an electronic book (e-book), and educational outreach to the computational science communities through presentations at professional society meetings, journal articles, and the training of educators. The interdisciplinary nature of the project is intended to contribute to national technological competitiveness by producing a body of graduates with the necessary skills to operate effectively in high performance computing environments. The educational outreach guides and supports instructors in developing computational science courses and curricula at their institutions. The CSEP e-book provides valuable teaching material around which educators have built courses. The outreach not only introduces new educators to CSEP, but also establishes a synergistic relationship between CSEP authors, reviewers and users.

  5. The validity of the Computer Science and Applications activity monitor for use in coronary artery disease patients during level walking.

    PubMed

    Ekelund, Ulf; Tingström, Pia; Kamwendo, Kitty; Krantz, Monica; Nylander, Eva; Sjöström, Michael; Bergdahl, Björn

    2002-07-01

    The principal aim of the present study was to examine the validity of the Computer Science and Applications (CSA) activity monitor during level walking in coronary artery disease (CAD) patients. As a secondary aim, we evaluated the usefulness of two previously published energy expenditure (EE) prediction equations. Thirty-four subjects (29 men and five women), all with diagnosed CAD, volunteered to participate. Oxygen uptake (VO2) was measured by indirect calorimetry during walking on a motorized treadmill at three different speeds (3.2, 4.8 and 6.4 km h-1). Physical activity was measured simultaneously using the CSA activity monitor, secured directly to the skin on the lower back (i.e. lumbar vertebrae 4-5) with an elastic belt. The mean (+/- SD) activity counts were 1208 +/- 429, 3258 +/- 753 and 5351 +/- 876 counts min-1, at the three speeds, respectively (P < 0.001). Activity counts were significantly correlated to speed (r = 0.92; P < 0.001), VO2 (ml kg-1 min-1; r = 0.87; P < 0.001) and EE (kcal min-1; r = 0.85, P < 0.001). A stepwise linear regression analysis showed that activity counts and body weight together explained 75% of the variation in EE. Predicted EE from previously published equations differed significantly when used in this group of CAD patients. In conclusion, the CSA activity monitor is a valid instrument for assessing the intensity of physical activity during treadmill walking in CAD patients. Energy expenditure can be predicted from body weight and activity counts.
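
    The reported two-predictor model (EE from activity counts and body weight) corresponds to an ordinary least-squares fit; the sketch below illustrates it with invented counts, weights and EE values, not the study's measurements.

      # Two-predictor linear regression: energy expenditure (kcal/min) from
      # activity counts and body weight. Values are illustrative placeholders.
      import numpy as np
      from sklearn.linear_model import LinearRegression

      counts = np.array([1200, 3250, 5350, 1100, 3300, 5400, 1250, 3200, 5300], dtype=float)
      weight = np.array([82, 82, 82, 95, 95, 95, 74, 74, 74], dtype=float)   # kg
      ee = np.array([3.1, 4.6, 6.2, 3.5, 5.2, 7.0, 2.8, 4.2, 5.7])           # kcal/min

      X = np.column_stack([counts, weight])
      model = LinearRegression().fit(X, ee)
      print("R^2:", round(model.score(X, ee), 2))
      print("predicted EE at 4000 counts/min, 85 kg:",
            round(float(model.predict([[4000, 85]])[0]), 2), "kcal/min")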

  6. Greater-than-Class C low-level radioactive waste shipping package/container identification and requirements study. National Low-Level Waste Management Program

    SciTech Connect

    Tyacke, M.

    1993-08-01

    This report identifies a variety of shipping packages (also referred to as casks) and waste containers currently available or being developed that could be used for greater-than-Class C (GTCC) low-level waste (LLW). Since GTCC LLW varies greatly in size, shape, and activity levels, the casks and waste containers that could be used range in size from small, to accommodate a single sealed radiation source, to very large-capacity casks/canisters used to transport or dry-store highly radioactive spent fuel. In some cases, the waste containers may serve directly as shipping packages, while in other cases, the containers would need to be placed in a transport cask. For the purpose of this report, it is assumed that the generator is responsible for transporting the waste to a Department of Energy (DOE) storage, treatment, or disposal facility. Unless DOE establishes specific acceptance criteria, the receiving facility would need the capability to accept any of the casks and waste containers identified in this report. In identifying potential casks and waste containers, no consideration was given to their adequacy relative to handling, storage, treatment, and disposal. Those considerations must be addressed separately as the capabilities of the receiving facility and the handling requirements and operations are better understood.

  7. High-level waste storage tank farms/242-A evaporator standards/requirements identification document (S/RID), Vol. 4

    SciTech Connect

    Not Available

    1994-04-01

    Radiation protection of personnel and the public is accomplished by establishing a well defined Radiation Protection Organization to ensure that appropriate controls on radioactive materials and radiation sources are implemented and documented. This Requirements Identification Document (RID) applies to the activities, personnel, structures, systems, components, and programs involved in executing the mission of the Tank Farms. The physical boundaries within which the requirements of this RID apply are the Single Shell Tank Farms, Double Shell Tank Farms, 242-A Evaporator-Crystallizer, 242-S, T Evaporators, Liquid Effluent Retention Facility (LERF), Purgewater Storage Facility (PWSF), and all interconnecting piping, valves, instrumentation, and controls. Also included is all piping, valves, instrumentation, and controls up to and including the most remote valve under Tank Farms control at any other Hanford Facility having an interconnection with Tank Farms. The boundary of the structures, systems, components, and programs to which this RID applies, is defined by those that are dedicated to and/or under the control of the Tank Farms Operations Department and are specifically implemented at the Tank Farms.

  8. High-level waste storage tank farms/242-A evaporator standards/requirements identification document (S/RID), Vol. 2

    SciTech Connect

    Not Available

    1994-04-01

    The Quality Assurance Functional Area Requirements Identification Document (RID) addresses the programmatic requirements that ensure risks and environmental impacts are minimized and that safety, reliability, and performance are maximized through the application of effective management systems commensurate with the risks posed by the Tank Farm Facility and its operation. This RID incorporates guidance intended to provide Tank Farms management with the necessary requirements information to develop, upgrade, or assess the effectiveness of a Quality Assurance Program in the performance of organizational and functional activities. Quality Assurance is defined as all those planned and systematic actions necessary to provide adequate confidence that a facility, structure, system, or component will perform satisfactorily and safely in service. This document provides the specific requirements to meet DNFSB recommendations and the guidance provided in DOE Order 5700.6C, utilizing industry codes, standards, regulatory guidelines, and industry good practices that have proven to be essential elements of an effective and efficient Quality Assurance Program as the nuclear industry has matured over the last thirty years.

  9. Using single-species toxicity tests, community-level responses, and toxicity identification evaluations to investigate effluent impacts

    SciTech Connect

    Maltby, L.; Clayton, S.A.; Yu, H.; McLoughlin, N.; Wood, R.M.; Yin, D.

    2000-01-01

    Whole effluent toxicity (WET) tests are increasingly used to monitor compliance of consented discharges, but few studies have related toxicity measured using WET tests to receiving water impacts. Here the authors adopt a four-stage procedure to investigate the toxicity and biological impact of a point source discharge and to identify the major toxicants. In stage 1, standard WET tests were employed to determine the toxicity of the effluent. This was then followed by an assessment of receiving water toxicity using in situ deployment of indigenous (Gammarus pulex) and standard (Daphnia magna) test species. The third stage involved the use of biological survey techniques to assess the impact of the discharge on the structure and functioning of the benthic macroinvertebrate community. In stage 4, toxicity identification evaluations (TIE) were used to identify toxic components in the effluent. Receiving-water toxicity and ecological impact detected downstream of the discharge were consistent with the results of WET tests performed on the effluent. Downstream of the discharge, there was a reduction in D. magna survival, in G. pulex survival and feeding rate, in detritus processing, and in biotic indices based on macroinvertebrate community structure. The TIE studies suggested that chlorine was the principal toxicant in the effluent.

  10. In situ strain-level detection and identification of Vibrio parahaemolyticus using surface-enhanced Raman spectroscopy.

    PubMed

    Xu, Jiajie; Turner, Jeffrey W; Idso, Matthew; Biryukov, Stanley V; Rognstad, Laurel; Gong, Heng; Trainer, Vera L; Wells, Mark L; Strom, Mark S; Yu, Qiuming

    2013-03-05

    The outer membrane of a bacterium is composed of chemical and biological components that carry specific molecular information related to strains, growth stages, expressions to stimulation, and maybe even geographic differences. In this work, we demonstrate that the biochemical information embedded in the outer membrane can be used for rapid detection and identification of pathogenic bacteria using surface-enhanced Raman spectroscopy (SERS). We used seven different strains of the marine pathogen Vibrio parahaemolyticus as a model system. The strains represent four genetically distinct clades isolated from clinical and environmental sources in Washington, U.S.A. The unique quasi-3D (Q3D) plasmonic nanostructure arrays, optimized using finite-difference time-domain (FDTD) calculations, were used as SERS-active substrates for sensitive and reproducible detection of these bacteria. SERS barcodes were generated on the basis of SERS spectra and were used to successfully detect individual strains in both blind samples and mixtures. The sensing and detection methods developed in this work could have broad applications in the areas of environmental monitoring, biomedical diagnostics, and homeland security.

  11. First-principles identification of charge-transition levels of native defects in BaF2

    NASA Astrophysics Data System (ADS)

    Ibraheem, A. M.; Eisa, M. H.; Adlan, W.; Amolo, George O.; Khalafalla, M. A. H.

    2017-03-01

    This paper reports on semilocal and hybrid density functional analysis of charge-transition levels of native defects in BaF2 structure. The transition level is defined as the Fermi level where two defect charge states have the same formation energy. The errors arising from the small supercell size effects have been relieved through extrapolating the formation energies to the limit of infinite supercell size. The level placement in the corrected band gap is achieved using a correction factor obtained from the difference between the valence band maxima in semilocal and hybrid calculations. The band gap size from hybrid calculation is validated using the full-potential, linearized augmented planewave method with the modified Becke-Johnson exchange potential. Our results are sufficiently accurate and, thus, significant for direct comparison with experiments.
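
    For reference, the thermodynamic charge-transition level used in such analyses is conventionally defined as the Fermi energy, measured from the valence-band maximum, at which the formation energies of two charge states are equal; a standard form of the expression is:

      \varepsilon(q_1/q_2) = \frac{E_f\left(D^{q_1};\, E_F = 0\right) - E_f\left(D^{q_2};\, E_F = 0\right)}{q_2 - q_1}

    where E_f(D^q; E_F = 0) is the formation energy of defect D in charge state q evaluated with the Fermi level at the valence-band maximum; levels computed this way are then placed within the corrected band gap as described in the record.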

  12. Identification of the primary compensating defect level responsible for determining blocking voltage of vertical GaN power diodes

    DOE PAGES

    King, M. P.; Kaplar, R. J.; Dickerson, J. R.; ...

    2016-10-31

    Electrical performance and characterization of deep levels in vertical GaN P-i-N diodes grown on low threading dislocation density (~10⁴–10⁶ cm⁻²) bulk GaN substrates are investigated. The lightly doped n drift region of these devices is observed to be highly compensated by several prominent deep levels detected using deep level optical spectroscopy at Ec-2.13, 2.92, and 3.2 eV. A combination of steady-state photocapacitance and lighted capacitance-voltage profiling indicates the concentrations of these deep levels to be Nt = 3 × 10¹², 2 × 10¹⁵, and 5 × 10¹⁴ cm⁻³, respectively. The Ec-2.92 eV level is observed to be the primary compensating defect in as-grown n-type metal-organic chemical vapor deposition GaN, indicating this level acts as a limiting factor for achieving controllably low doping. The device blocking voltage should increase if compensating defects reduce the free carrier concentration of the n drift region. Understanding the incorporation of as-grown and native defects in thick n-GaN is essential for enabling large VBD in the next-generation wide-bandgap power semiconductor devices. Furthermore, controlling the as-grown defects induced by epitaxial growth conditions is critical to achieve blocking voltage capability above 5 kV.

  13. Assessment of safety and interference issues of radio frequency identification devices in 0.3 Tesla magnetic resonance imaging and computed tomography.

    PubMed

    Periyasamy, M; Dhanasekaran, R

    2014-01-01

    The objective of this study was to evaluate two issues, device functionality and image artifacts, arising from the presence of radio frequency identification (RFID) devices during 0.3 Tesla, 12.7 MHz magnetic resonance imaging (MRI) and computed tomography (CT) scanning. Fifteen samples of RFID tags of two different sizes (wristband and ID card types) were tested. The tags were exposed to several MR-imaging conditions during MRI examination and to the X-rays of CT scanning. Throughout the tests, the tags were oriented in three different directions (axial, coronal, and sagittal) relative to the MRI system in order to cover all possible situations for a patient undergoing MRI and CT scanning while wearing an RFID tag on the wrist. We observed that the tags did not sustain physical damage, their functionality remained unaffected even after MRI and CT scanning, and there was no alteration of previously stored data. In addition, no evidence of either signal loss or artifact was seen in the acquired MR and CT images. Therefore, we conclude that the use of this passive RFID tag is safe for a patient undergoing MRI at 0.3 T/12.7 MHz and CT scanning.

  14. A Genetic-Based Feature Selection Approach in the Identification of Left/Right Hand Motor Imagery for a Brain-Computer Interface.

    PubMed

    Yaacoub, Charles; Mhanna, Georges; Rihana, Sandy

    2017-01-23

    Electroencephalography is a non-invasive measure of the brain electrical activity generated by millions of neurons. Feature extraction in electroencephalography analysis is a core issue that may lead to accurate brain mental state classification. This paper presents a new feature selection method that improves left/right hand movement identification of a motor imagery brain-computer interface, based on genetic algorithms and artificial neural networks used as classifiers. Raw electroencephalography signals are first preprocessed using appropriate filtering. Feature extraction is carried out afterwards, based on spectral and temporal signal components, and thus a feature vector is constructed. As various features might be inaccurate and mislead the classifier, thus degrading the overall system performance, the proposed approach identifies a subset of features from a large feature space, such that the classifier error rate is reduced. Experimental results show that the proposed method is able to reduce the number of features to as low as 0.5% (i.e., the number of ignored features can reach 99.5%) while improving the accuracy, sensitivity, specificity, and precision of the classifier.
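
    The wrapper-style selection described here can be sketched as a simple genetic algorithm: binary masks over the feature set evolve toward subsets that maximise cross-validated classifier accuracy. The feature matrix, labels and GA settings below are invented placeholders, and a small MLP stands in for the paper's neural-network classifier.

      # Minimal genetic-algorithm feature selection over random placeholder data.
      import numpy as np
      from sklearn.neural_network import MLPClassifier
      from sklearn.model_selection import cross_val_score

      rng = np.random.default_rng(0)
      X = rng.normal(size=(80, 30))           # 80 trials x 30 spectral/temporal features
      y = rng.integers(0, 2, size=80)         # left (0) / right (1) hand imagery labels

      def fitness(mask):
          """Cross-validated accuracy of the masked feature set, minus a size penalty."""
          if mask.sum() == 0:
              return 0.0
          clf = MLPClassifier(hidden_layer_sizes=(8,), max_iter=300, random_state=0)
          acc = cross_val_score(clf, X[:, mask.astype(bool)], y, cv=3).mean()
          return acc - 0.002 * mask.sum()

      def evolve(pop_size=10, n_gen=5, p_mut=0.05):
          pop = rng.integers(0, 2, size=(pop_size, X.shape[1]))
          for _ in range(n_gen):
              scores = np.array([fitness(ind) for ind in pop])
              parents = pop[np.argsort(scores)[-pop_size // 2:]]       # truncation selection
              children = []
              while len(children) < pop_size - len(parents):
                  a, b = parents[rng.integers(len(parents), size=2)]
                  cut = rng.integers(1, X.shape[1])                    # one-point crossover
                  child = np.concatenate([a[:cut], b[cut:]])
                  flip = rng.random(X.shape[1]) < p_mut                # bit-flip mutation
                  children.append(np.where(flip, 1 - child, child))
              pop = np.vstack([parents, children])
          scores = np.array([fitness(ind) for ind in pop])
          return pop[scores.argmax()], scores.max()

      best_mask, best_score = evolve()
      print("selected features:", int(best_mask.sum()), "fitness:", round(best_score, 3))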

  15. A Genetic-Based Feature Selection Approach in the Identification of Left/Right Hand Motor Imagery for a Brain-Computer Interface

    PubMed Central

    Yaacoub, Charles; Mhanna, Georges; Rihana, Sandy

    2017-01-01

    Electroencephalography is a non-invasive measure of the brain electrical activity generated by millions of neurons. Feature extraction in electroencephalography analysis is a core issue that may lead to accurate brain mental state classification. This paper presents a new feature selection method that improves left/right hand movement identification of a motor imagery brain-computer interface, based on genetic algorithms and artificial neural networks used as classifiers. Raw electroencephalography signals are first preprocessed using appropriate filtering. Feature extraction is carried out afterwards, based on spectral and temporal signal components, and thus a feature vector is constructed. As various features might be inaccurate and mislead the classifier, thus degrading the overall system performance, the proposed approach identifies a subset of features from a large feature space, such that the classifier error rate is reduced. Experimental results show that the proposed method is able to reduce the number of features to as low as 0.5% (i.e., the number of ignored features can reach 99.5%) while improving the accuracy, sensitivity, specificity, and precision of the classifier. PMID:28124985

  16. Identification and computational annotation of genes differentially expressed in pulp development of Cocos nucifera L. by suppression subtractive hybridization

    PubMed Central

    2014-01-01

    Background Coconut (Cocos nucifera L.) is one of the world’s most versatile, economically important tropical crops. Little is known about the physiological and molecular basis of coconut pulp (endosperm) development, and only a few coconut genes and gene product sequences are available in public databases. This study identified genes that were differentially expressed during development of coconut pulp and functionally annotated these identified genes using bioinformatics analysis. Results Pulp from three different coconut developmental stages was collected. Four suppression subtractive hybridization (SSH) libraries were constructed (forward and reverse libraries A and B between stages 1 and 2, and C and D between stages 2 and 3), and identified sequences were computationally annotated using Blast2GO software. A total of 1272 clones were obtained for analysis from the four SSH libraries, with 63% showing similarity to known proteins. Pairwise comparison of stage-specific gene ontology IDs from libraries B-D, A-C, B-C and A-D showed that 32 genes were continuously upregulated and seven downregulated; 28 were transiently upregulated and 23 downregulated. KEGG (Kyoto Encyclopedia of Genes and Genomes) analysis showed that 1-acyl-sn-glycerol-3-phosphate acyltransferase (LPAAT), phospholipase D, acetyl-CoA carboxylase carboxyltransferase beta subunit, 3-hydroxyisobutyryl-CoA hydrolase-like and pyruvate dehydrogenase E1 β subunit were associated with fatty acid biosynthesis or metabolism. Triose phosphate isomerase, cellulose synthase and glucan 1,3-β-glucosidase were related to carbohydrate metabolism, and phosphoenolpyruvate carboxylase was related to both fatty acid and carbohydrate metabolism. Of 737 unigenes, 103 encoded enzymes involved in fatty acid and carbohydrate biosynthesis and metabolism, and a number of transcription factors and other interesting genes with stage-specific expression were confirmed by real-time PCR, with validation of the SSH results as

  17. Identification of novel candidate drivers connecting different dysfunctional levels for lung adenocarcinoma using protein-protein interactions and a shortest path approach

    PubMed Central

    Chen, Lei; Huang, Tao; Zhang, Yu-Hang; Jiang, Yang; Zheng, Mingyue; Cai, Yu-Dong

    2016-01-01

    Tumors are formed by the abnormal proliferation of somatic cells with disordered growth regulation under the influence of tumorigenic factors. Recently, the theory of “cancer drivers” connects tumor initiation with several specific mutations in the so-called cancer driver genes. According to the differentiation of four basic levels between tumor and adjacent normal tissues, the cancer drivers can be divided into the following: (1) Methylation level, (2) microRNA level, (3) mutation level, and (4) mRNA level. In this study, a computational method is proposed to identify novel lung adenocarcinoma drivers based on dysfunctional genes on the methylation, microRNA, mutation and mRNA levels. First, a large network was constructed using protein-protein interactions. Next, we searched all of the shortest paths connecting dysfunctional genes on different levels and extracted new candidate genes lying on these paths. Finally, the obtained candidate genes were filtered by a permutation test and an additional strict selection procedure involving a betweenness ratio and an interaction score. Several candidate genes remained, which are deemed to be related to two different levels of cancer. The analyses confirmed our assertions that some have the potential to contribute to the tumorigenesis process on multiple levels. PMID:27412431
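
    The shortest-path step can be sketched with networkx: dysfunctional genes from two levels are connected through a weighted protein-protein interaction graph, and the interior genes of the shortest paths become candidates. The edges, weights and gene names below are hypothetical stand-ins for the real PPI network.

      # Collect interior genes on shortest paths between dysfunctional genes
      # from two different levels of a (hypothetical) PPI graph.
      import networkx as nx
      from itertools import product

      ppi = nx.Graph()
      ppi.add_weighted_edges_from([
          ("GENE_METH1", "HUB1", 1.0), ("HUB1", "CAND1", 1.0), ("CAND1", "GENE_MRNA1", 1.0),
          ("GENE_METH1", "HUB2", 2.5), ("HUB2", "GENE_MRNA1", 2.5),
      ])
      methylation_level = ["GENE_METH1"]
      mrna_level = ["GENE_MRNA1"]

      candidates = set()
      for src, dst in product(methylation_level, mrna_level):
          if nx.has_path(ppi, src, dst):
              path = nx.shortest_path(ppi, src, dst, weight="weight")
              candidates.update(path[1:-1])        # interior nodes only

      print("candidate driver genes on shortest paths:", sorted(candidates))

    In the published method these candidates are then filtered by a permutation test, a betweenness ratio and an interaction-score cut-off.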

  18. Identification of novel candidate drivers connecting different dysfunctional levels for lung adenocarcinoma using protein-protein interactions and a shortest path approach

    NASA Astrophysics Data System (ADS)

    Chen, Lei; Huang, Tao; Zhang, Yu-Hang; Jiang, Yang; Zheng, Mingyue; Cai, Yu-Dong

    2016-07-01

    Tumors are formed by the abnormal proliferation of somatic cells with disordered growth regulation under the influence of tumorigenic factors. Recently, the theory of “cancer drivers” connects tumor initiation with several specific mutations in the so-called cancer driver genes. According to the differentiation of four basic levels between tumor and adjacent normal tissues, the cancer drivers can be divided into the following: (1) Methylation level, (2) microRNA level, (3) mutation level, and (4) mRNA level. In this study, a computational method is proposed to identify novel lung adenocarcinoma drivers based on dysfunctional genes on the methylation, microRNA, mutation and mRNA levels. First, a large network was constructed using protein-protein interactions. Next, we searched all of the shortest paths connecting dysfunctional genes on different levels and extracted new candidate genes lying on these paths. Finally, the obtained candidate genes were filtered by a permutation test and an additional strict selection procedure involving a betweenness ratio and an interaction score. Several candidate genes remained, which are deemed to be related to two different levels of cancer. The analyses confirmed our assertions that some have the potential to contribute to the tumorigenesis process on multiple levels.

  19. Cis-trans isomerization in the S1 state of acetylene: identification of cis-well vibrational levels.

    PubMed

    Merer, Anthony J; Steeves, Adam H; Baraban, Joshua H; Bechtel, Hans A; Field, Robert W

    2011-06-28

    A systematic analysis of the S(1)-trans (Ã(1)A(u)) state of acetylene, using IR-UV double resonance along with one-photon fluorescence excitation spectra, has allowed assignment of at least part of every single vibrational state or polyad up to a vibrational energy of 4200 cm(-1). Four observed vibrational levels remain unassigned, for which no place can be found in the level structure of the trans-well. The most prominent of these lies at 46 175 cm(-1). Its (13)C isotope shift, exceptionally long radiative lifetime, unexpected rotational selection rules, and lack of significant Zeeman effect, combined with the fact that no other singlet electronic states are expected at this energy, indicate that it is a vibrational level of the S(1)-cis isomer (Ã(1)A(2)). Guided by ab initio calculations [J. H. Baraban, A. R. Beck, A. H. Steeves, J. F. Stanton, and R. W. Field, J. Chem. Phys. 134, 244311 (2011)] of the cis-well vibrational frequencies, the vibrational assignments of these four levels can be established from their vibrational symmetries together with the (13)C isotope shift of the 46 175 cm(-1) level (assigned here as cis-3(1)6(1)). The S(1)-cis zero-point level is deduced to lie near 44 900 cm(-1), and the ν(6) vibrational frequency of the S(1)-cis well is found to be roughly 565 cm(-1); these values are in remarkably good agreement with the results of recent ab initio calculations. The 46 175 cm(-1) vibrational level is found to have a 3.9 cm(-1) staggering of its K-rotational structure as a result of quantum mechanical tunneling through the isomerization barrier. Such tunneling does not give rise to ammonia-type inversion doubling, because the cis and trans isomers are not equivalent; instead the odd-K rotational levels of a given vibrational level are systematically shifted relative to the even-K rotational levels, leading to a staggering of the K-structure. These various observations represent the first definite assignment of an isomer of

  20. Dental opioid prescribing practices and risk mitigation strategy implementation: Identification of potential targets for provider-level intervention

    PubMed Central

    McCauley, Jenna L.; Leite, Renata S.; Melvin, Cathy L.; Fillingim, Roger B.; Brady, Kathleen T.

    2016-01-01

    Background Given the regular use of immediate release opioids for dental pain management, as well as documented opioid misuse among dental patients, the dental visit may provide a viable point of intervention to screen, identify, and educate patients regarding the risks associated with prescription opioid misuse and diversion. The aims of this statewide survey of dental practitioners were to assess: (a) awareness of the scope of prescription opioid misuse and diversion; (b) current opioid prescribing practices; (c) use of and opinions regarding risk mitigation strategies; and, (d) use and perceived utility of drug monitoring programs. Methods This cross-sectional study surveyed dentists (N=87) participating in statewide professional and alumni organizations. Dentists were invited via email and listserv announcement to participate in a one-time, online, 59-item, self-administered survey. Results A majority of respondents reported prescribing opioids (n=66; 75.8%). A minority of respondents (n=38; 44%) reported regularly screening for current prescription drug abuse. Dentists reported low rates of requesting prior medical records (n=5; 5.8%). Only 38% (n=33) of respondents had ever accessed a prescription drug-monitoring program (PDMP), and only 4 (4.7%) consistently used a PDMP. Dentists reporting prior training in drug diversion were significantly more likely to have accessed their PDMP, p<0.01. Interest in continuing education regarding assessment of prescription drug abuse/diversion and use of drug monitoring programs was high. Conclusions Although most dentists received training related to prescribing opioids, findings identified a gap in existing dental training in the assessment/identification of prescription opioid misuse and diversion. Findings also identified gaps in the implementation of recommended risk mitigation strategies, including screening for prescription drug abuse, consistent provision of patient education, and use of a PDMP prior to prescribing