Science.gov

Sample records for level computational identification

  1. Atomic level computational identification of ligand migration pathways between solvent and binding site in myoglobin.

    PubMed

    Ruscio, Jory Z; Kumar, Deept; Shukla, Maulik; Prisant, Michael G; Murali, T M; Onufriev, Alexey V

    2008-07-08

Myoglobin is a globular protein involved in oxygen storage and transport. No consensus yet exists on the atomic level mechanism by which oxygen and other small nonpolar ligands move between myoglobin's buried heme, which is the ligand binding site, and the surrounding solvent. This study uses room temperature molecular dynamics simulations to provide a complete atomic level picture of ligand migration in myoglobin. Multiple trajectories--providing a cumulative total of 7 μs of simulation--are analyzed. Our simulation results are consistent with and tie together previous experimental findings. Specifically, we characterize: (i) explicit full trajectories in which the CO ligand shuttles between the internal binding site and the solvent and (ii) the pattern and structural origins of transient voids available for ligand migration. The computations are performed both in wild-type sperm whale myoglobin and in the sperm whale V68F myoglobin mutant, which is experimentally known to slow ligand-binding kinetics. On the basis of these independent, but mutually consistent, ligand migration and transient void computations, we find that there are two discrete dynamical pathways for ligand migration in myoglobin. Trajectory hops between these pathways are limited to two bottleneck regions. The ligand enters and exits the protein matrix through common identifiable portals on the protein surface. The pathways are located in the "softer" regions of the protein matrix, running between its helices and through its loop regions. Localized structural fluctuations are the primary physical origin of the simulated CO migration pathways inside the protein.

  2. Model Identification and Computer Algebra.

    PubMed

    Bollen, Kenneth A; Bauldry, Shawn

    2010-10-07

    Multiequation models that contain observed or latent variables are common in the social sciences. To determine whether unique parameter values exist for such models, one needs to assess model identification. In practice, analysts rely on empirical checks that assess the singularity of the information matrix evaluated at sample estimates of the parameters. The discrepancy between estimates and population values, the limitations of numerical rank assessments, and the difference between local and global identification make this practice less than perfect. In this paper we outline how to use computer algebra systems (CAS) to determine the local and global identification of multiequation models with or without latent variables. We demonstrate a symbolic CAS approach to local identification and develop a CAS approach to obtain explicit algebraic solutions for each of the model parameters. We illustrate the procedures with several examples, including a new proof of the identification of a model for handling missing data using auxiliary variables. We present an identification procedure for structural equation models that makes use of CAS and that is a useful complement to current methods.
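
The local-versus-global distinction the abstract highlights can be illustrated with a deliberately tiny symbolic example (invented here, not taken from the paper): two parameters that enter only through the moments m1 = a + b and m2 = a·b. The Jacobian test certifies local identification almost everywhere, yet explicit solution reveals two parameter sets with identical moments.

```python
import sympy as sp

# Hypothetical two-parameter model: observed moments m1 = a + b, m2 = a*b.
a, b, m1, m2 = sp.symbols('a b m1 m2')
moments = sp.Matrix([a + b, a * b])

# Local identification: the Jacobian of the moment vector w.r.t. the
# parameters must have full column rank (generically).
J = moments.jacobian([a, b])
print(J.det())      # a - b: full rank wherever a != b -> locally identified

# Global identification: solve the moment equations explicitly.
sols = sp.solve([sp.Eq(a + b, m1), sp.Eq(a * b, m2)], [a, b], dict=True)
print(len(sols))    # 2 solutions (a and b swap) -> not globally identified
```

The explicit-solution step is the symbolic analogue of the paper's CAS procedure: a locally identified model can still admit multiple global solutions, which a numerical rank check at one point cannot detect.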

  3. Identification of Desired Computer Capabilities for Management of an Air Force Base Level Civil Engineering Design Office.

    DTIC Science & Technology

    1985-09-01

    subjective parameters. Those measures are hard to define and analyze. Kaneda and Wallett, in their AFIT thesis, Development of Productivity Measures for...Kaneda and Wallett found that there was a "present insufficiency of output measures...level and not formalized into a MAJCOM/USAF controlled program" (14:78). Kaneda and Wallett found that at many bases there was no data base for

  4. Computational model predictions of cues for concurrent vowel identification.

    PubMed

    Chintanpalli, Ananthakrishna; Ahlstrom, Jayne B; Dubno, Judy R

    2014-10-01

    Although differences in fundamental frequencies (F0s) between vowels are beneficial for their segregation and identification, listeners can still segregate and identify simultaneous vowels that have identical F0s, suggesting that additional cues contribute, including formant frequency differences. The current perception and computational modeling study was designed to assess the contribution of F0 and formant difference cues for concurrent vowel identification. Younger adults with normal hearing listened to concurrent vowels over a wide range of levels (25-85 dB SPL) for conditions in which F0 was the same or different between vowel pairs. Vowel identification scores were poorer at the lowest and highest levels for each F0 condition, and F0 benefit was reduced at the lowest level as compared to higher levels. To understand the neural correlates underlying level-dependent changes in vowel identification, a computational auditory-nerve model was used to estimate formant and F0 difference cues under the same listening conditions. Template contrast and average localized synchronized rate predicted level-dependent changes in the strength of phase locking to F0s and formants of concurrent vowels, respectively. At lower levels, poorer F0 benefit may be attributed to poorer phase locking to both F0s, which resulted from lower firing rates of auditory-nerve fibers. At higher levels, poorer identification scores may relate to poorer phase locking to the second formant, due to synchrony capture by lower formants. These findings suggest that concurrent vowel identification may be partly influenced by level-dependent changes in phase locking of auditory-nerve fibers to F0s and formants of both vowels.

  5. Computational methods for remote homolog identification.

    PubMed

    Wan, Xiu-Feng; Xu, Dong

    2005-12-01

    As more and more protein sequences become available, homolog identification becomes increasingly important for functional, structural, and evolutionary studies of proteins. Many homologous proteins were separated a very long time ago in their evolutionary history and thus their sequences share low sequence identity. These remote homologs have become a research focus in bioinformatics over the past decade, and some significant advances have been achieved. In this paper, we provide a comprehensive review of computational techniques used in remote homolog identification, organized by method: sequence-sequence comparison, sequence-structure comparison, and structure-structure comparison. Other miscellaneous approaches are also summarized. Pointers to the online resources of these methods and their related databases are provided. Comparisons among the different methods in terms of their technical approaches, their strengths, and their limitations then follow. Studies on proteins in SARS-CoV are shown as an example application of remote homolog identification.

  6. Multi-level damage identification with response reconstruction

    NASA Astrophysics Data System (ADS)

    Zhang, Chao-Dong; Xu, You-Lin

    2017-10-01

    Damage identification through finite element (FE) model updating usually forms an inverse problem. Solving the inverse identification problem for complex civil structures is very challenging since the dimension of potential damage parameters in a complex civil structure is often very large. Aside from the enormous computational effort needed in iterative updating, the ill-conditioning and non-global identifiability of the inverse problem hinder the realization of model updating based damage identification for large civil structures. Following a divide-and-conquer strategy, a multi-level damage identification method is proposed in this paper. The entire structure is decomposed into several manageable substructures and each substructure is further condensed as a macro element using the component mode synthesis (CMS) technique. The damage identification is performed at two levels: the first is at the macro element level to locate the potentially damaged region, and the second is over the suspicious substructures to further locate as well as quantify the damage severity. At each level of identification, the damage searching space over which model updating is performed is notably narrowed down, not only reducing the computational cost but also increasing the damage identifiability. Besides, Kalman filter-based response reconstruction is performed at the second level to reconstruct the response of the suspicious substructure for exact damage quantification. Numerical studies and laboratory tests are both conducted on a simply supported overhanging steel beam for conceptual verification. The results demonstrate that the proposed multi-level damage identification via response reconstruction considerably improves the accuracy of damage localization and quantification.
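
The Kalman filter-based response reconstruction step can be sketched in miniature. The two-state system, noise levels, and measurement setup below are invented for illustration (a 1-DOF discretized spring-mass-damper standing in for a condensed substructure): the filter estimates the full state from a noisy displacement measurement, so the unmeasured velocity response can be reconstructed.

```python
import numpy as np

np.random.seed(0)
A = np.array([[1.0, 0.01], [-0.5, 0.99]])    # discretized spring-mass-damper
C = np.array([[1.0, 0.0]])                   # only displacement is measured
Q, R = 1e-8 * np.eye(2), np.array([[1e-6]])  # process / measurement covariances

x_true = np.array([[1.0], [0.0]])
x_hat, P = np.zeros((2, 1)), np.eye(2)
errs = []
for _ in range(200):
    x_true = A @ x_true                        # true response (model-exact)
    y = C @ x_true + 1e-3 * np.random.randn(1, 1)
    x_hat = A @ x_hat                          # predict
    P = A @ P @ A.T + Q
    K = P @ C.T @ np.linalg.inv(C @ P @ C.T + R)
    x_hat = x_hat + K @ (y - C @ x_hat)        # update with measured response
    P = (np.eye(2) - K @ C) @ P
    errs.append(abs(x_true[1, 0] - x_hat[1, 0]))  # reconstructed-velocity error

print(f"final velocity reconstruction error: {errs[-1]:.4f}")
```

In the paper's setting the reconstructed substructure response, rather than a sparse set of measurements, feeds the second-level model updating; the sketch only shows why a filter can supply responses at unmeasured degrees of freedom.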

  7. Computing Health: Programing Problem 3, Computing Peak Blood Alcohol Levels.

    ERIC Educational Resources Information Center

    Gold, Robert S.

    1985-01-01

    The Alcohol Metabolism Program, a computer program used to compute peak blood alcohol levels, is expanded upon to include a cover page, brief introduction, and techniques for generalizing the program to calculate peak levels for any number of drinks. (DF)
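
The article's actual program listing is not reproduced in the abstract; a minimal sketch of such a calculation, using the standard Widmark formula with illustrative constants (distribution ratio `r`, grams of alcohol per drink, elimination rate `beta` are all assumptions), might look like this:

```python
# Peak blood alcohol estimate via the Widmark formula (illustrative only).
def peak_bac(drinks, weight_kg, r=0.68, grams_per_drink=14.0):
    """Peak BAC in g/100 mL, before any metabolism is subtracted."""
    alcohol_g = drinks * grams_per_drink
    return alcohol_g / (weight_kg * 1000 * r) * 100

def bac_after(drinks, weight_kg, hours, beta=0.015, **kw):
    """BAC after `hours`, subtracting a constant elimination rate beta."""
    return max(0.0, peak_bac(drinks, weight_kg, **kw) - beta * hours)

print(round(peak_bac(4, 80), 3))   # peak BAC for 4 drinks at 80 kg
```

Generalizing to "any number of drinks", as the article describes, amounts to parameterizing the drink count rather than hard-coding it.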

  8. System Level Applications of Adaptive Computing (SLAAC)

    DTIC Science & Technology

    2003-11-01

    AFRL-IF-RS-TR-2003-255 Final Technical Report November 2003 SYSTEM LEVEL APPLICATIONS OF ADAPTIVE COMPUTING (SLAAC...NOVEMBER 2003 3. REPORT TYPE AND DATES COVERED Final Jul 97 - May 03 4. TITLE AND SUBTITLE SYSTEM LEVEL APPLICATIONS OF ADAPTIVE COMPUTING (SLAAC...importance are described further in Sandia's final report. 6.2 Technical Approach The CDI algorithm requires significantly more computation power and

  9. Computer method for identification of boiler transfer functions

    NASA Technical Reports Server (NTRS)

    Miles, J. H.

    1972-01-01

    An iterative computer-aided procedure was developed that identifies boiler transfer functions from frequency response data. The method obtains satisfactory transfer functions for both high and low vapor exit quality data.
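
The report's iterative procedure is not given in the abstract. As a sketch of the underlying idea, a single linear least-squares pass (Levy's linearization, a standard starting point for such iterations) recovers a first-order transfer function K/(τs + 1) from noise-free frequency response data; the system below is invented:

```python
import numpy as np

# Synthetic frequency response of G(s) = K / (tau*s + 1).
K_true, tau_true = 2.0, 0.5
w = np.linspace(0.1, 20, 50)
H = K_true / (tau_true * 1j * w + 1)

# Rearrange H*(1 + tau*j*w) = K into K - tau*(j*w*H) = H,
# which is linear in the unknowns [K, tau]; stack real and imaginary parts.
A = np.column_stack([np.ones_like(w, dtype=complex), -1j * w * H])
A_ri = np.vstack([A.real, A.imag])
b_ri = np.concatenate([H.real, H.imag])
K_est, tau_est = np.linalg.lstsq(A_ri, b_ri, rcond=None)[0]
print(round(K_est, 3), round(tau_est, 3))   # recovers 2.0 and 0.5
```

With noisy data the linearization biases the fit, which is why methods of this kind iterate with frequency-dependent weights; this sketch shows only the noise-free core.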

  10. Identification of Protein–Excipient Interaction Hotspots Using Computational Approaches

    PubMed Central

    Barata, Teresa S.; Zhang, Cheng; Dalby, Paul A.; Brocchini, Steve; Zloh, Mire

    2016-01-01

    Protein formulation development relies on the selection of excipients that inhibit protein–protein interactions preventing aggregation. Empirical strategies involve screening many excipient and buffer combinations using forced degradation studies. Such methods do not readily provide information on intermolecular interactions responsible for the protective effects of excipients. This study describes a molecular docking approach to screen and rank interactions allowing for the identification of protein–excipient hotspots to aid in the selection of excipients to be experimentally screened. Previously published work with Drosophila Su(dx) was used to develop and validate the computational methodology, which was then used to determine the formulation hotspots for Fab A33. Commonly used excipients were examined and compared to the regions in Fab A33 prone to protein–protein interactions that could lead to aggregation. This approach could provide molecular-level information on the protective interactions of excipients in protein formulations to aid the more rational development of future formulations. PMID:27258262

  11. Multi-level Programming Paradigm for Extreme Computing

    NASA Astrophysics Data System (ADS)

    Petiton, S.; Sato, M.; Emad, N.; Calvin, C.; Tsuji, M.; Dandouna, M.

    2014-06-01

    In order to propose a framework and programming paradigms for post-petascale computing, on the road to exascale computing and beyond, we introduced new languages, associated with a hierarchical multi-level programming paradigm, that allow scientific end-users and developers to program highly hierarchical architectures designed for extreme computing. In this paper, we explain the interest of such a hierarchical multi-level programming paradigm for extreme computing and its suitability for several large computational science applications, such as linear algebra solvers used for reactor core physics. We describe the YML language and framework for describing graphs of parallel components, which may be developed using a PGAS-like language such as XMP and then scheduled and computed on supercomputers. We then report experiments on supercomputers (such as the "K" and "Hopper" machines) with the hybrid method MERAM (Multiple Explicitly Restarted Arnoldi Method) as a case study for iterative methods manipulating sparse matrices, and with the block Gauss-Jordan method as a case study for direct methods manipulating dense matrices. We conclude by proposing evolutions for this programming paradigm.

  12. Multi-level RF identification system

    DOEpatents

    Steele, Kerry D.; Anderson, Gordon A.; Gilbert, Ronald W.

    2004-07-20

    A radio frequency identification system having a radio frequency transceiver for generating a continuous wave RF interrogation signal that impinges upon an RF identification tag. An oscillation circuit in the RF identification tag modulates the interrogation signal with a subcarrier of a predetermined frequency and returns the modulated signal to the transmitting interrogator. The interrogator recovers and analyzes the subcarrier signal and determines its frequency. The interrogator generates an output indicative of the subcarrier frequency, thereby identifying the responding RFID tag as one of a "class" of RFID tags configured to respond with a subcarrier signal of a predetermined frequency.
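
The interrogator-side frequency determination can be sketched as follows. The class-to-frequency table, sample rate, and signal parameters are illustrative, not taken from the patent: an FFT locates the dominant subcarrier frequency of the recovered tag signal, which is then mapped to the nearest known tag class.

```python
import numpy as np

# Hypothetical mapping from tag class to subcarrier frequency (Hz).
CLASS_FREQS = {"class-A": 10e3, "class-B": 20e3, "class-C": 40e3}

def classify(signal, fs):
    """Return the tag class whose subcarrier is nearest the spectral peak."""
    spec = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), 1 / fs)
    peak = freqs[np.argmax(spec[1:]) + 1]     # skip the DC bin
    return min(CLASS_FREQS, key=lambda c: abs(CLASS_FREQS[c] - peak))

fs = 1e6
t = np.arange(0, 0.01, 1 / fs)
tag_signal = np.sin(2 * np.pi * 20e3 * t)     # a tag replying at 20 kHz
print(classify(tag_signal, fs))               # -> class-B
```

Because identification depends only on the subcarrier frequency, not on a modulated data payload, any tag of a given class is indistinguishable from another of the same class, which is exactly the "class" behavior the patent describes.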

  13. Enzyme optimization for next level molecular computing

    NASA Astrophysics Data System (ADS)

    Wąsiewicz, Piotr; Malinowski, Michal; Plucienniczak, Andrzej

    2006-10-01

    The main concept of molecular computing depends on DNA self-assembly abilities and on modifying DNA with the help of enzymes during genetic operations. In typical DNA computing, a sequence of operations executed on DNA strings in parallel is called an algorithm, which is also determined by a model of the DNA strings. This methodology resembles a soft, specialized hardware architecture driven by heating, cooling, and enzymes, especially the polymerases used for copying strings. As described in this paper, the properties of Taq polymerase are changed by modifying its DNA sequence in such a way that polymerase side activities, together with the peptide chains responsible for destroying amplified strings, are cut off. Thus, it introduces the next level of molecular computing. The succession of genetic operations and the given molecule model with designed nucleotide sequences produce computation results and additionally modify the enzymes, which directly influence the computation process: the information flow begins to circulate. Additionally, such optimized enzymes are more suitable for nanoconstruction because they have only the desired characteristics. An experiment was proposed to confirm the feasibility of the suggested implementation.

  14. SLIMM: species level identification of microorganisms from metagenomes.

    PubMed

    Dadi, Temesgen Hailemariam; Renard, Bernhard Y; Wieler, Lothar H; Semmler, Torsten; Reinert, Knut

    2017-01-01

    Identification and quantification of microorganisms is a significant step in studying the alpha and beta diversities within and between microbial communities, respectively. Both identification and quantification of a given microbial community can be carried out using whole genome shotgun sequences with less bias than when using 16S-rDNA sequences. However, shared regions of DNA among reference genomes and taxonomic units pose a significant challenge in assigning reads correctly to their true origins. Existing microbial community profiling tools commonly deal with this problem by either preparing signature-based unique references or assigning an ambiguous read to its least common ancestor in a taxonomic tree. The former method is limited to making use of the reads which can be mapped to the curated regions, while the latter suffers from a lack of uniquely mapped reads at lower (more specific) taxonomic ranks. Moreover, even if the tools exhibited good performance in calling the organisms present in a sample, there is still room for improvement in determining the correct relative abundance of the organisms. We present a new method, Species Level Identification of Microorganisms from Metagenomes (SLIMM), which addresses the above issues by using coverage information of reference genomes to remove unlikely genomes from the analysis and subsequently gain more uniquely mapped reads to assign at lower ranks of a taxonomic tree. SLIMM is based on a few, seemingly easy steps which when combined create a tool that outperforms state-of-the-art tools in run-time and memory usage while being on par or better in computing quantitative and qualitative information at species level.
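
SLIMM's central idea, pruning genomes with implausibly low coverage so that more reads become uniquely assignable, can be sketched with toy data (the mappings and breadth values below are invented; the real tool works on genome-scale alignments):

```python
from collections import defaultdict

# read -> set of genomes it maps to.
mappings = {
    "r1": {"g1", "g2"}, "r2": {"g1", "g2"}, "r3": {"g1"},
    "r4": {"g2", "g3"},
}
# genome -> fraction of its bases covered by reads (breadth of coverage).
breadth = {"g1": 0.80, "g2": 0.75, "g3": 0.02}  # g3: shared regions only

MIN_BREADTH = 0.10
kept = {g for g, b in breadth.items() if b >= MIN_BREADTH}

# After filtering, reads that mapped ambiguously only because of a
# spurious genome become uniquely assignable.
unique = defaultdict(int)
for read, genomes in mappings.items():
    hits = genomes & kept
    if len(hits) == 1:
        unique[hits.pop()] += 1

print(dict(unique))   # r4 now counts uniquely toward g2
```

In the toy data, genome g3 attracts reads only in regions it shares with g2; dropping it turns r4 from an ambiguous read into a unique one, which is the mechanism that lets SLIMM assign more reads at species rank.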

  16. Occupational risk identification using hand-held or laptop computers.

    PubMed

    Naumanen, Paula; Savolainen, Heikki; Liesivuori, Jyrki

    2008-01-01

    This paper describes the Work Environment Profile (WEP) program and its use in computer-based risk identification. It is installed on a hand-held computer or a laptop to be used in risk identification during work site visits. A 5-category system describes the identified risks in 7 groups: accidents, biological and physical hazards, ergonomic and psychosocial load, chemicals, and information technology hazards. Each group contains several qualifying factors. The 5 categories are colour-coded to aid visualization. Risk identification produces visual summary images whose interpretation is facilitated by the colours. The WEP program is a risk assessment tool that is easy to learn and use for both experts and nonprofessionals. It is well adapted for use in both small and larger enterprises. Considerable time is saved as no paper notes are needed.

  17. Computational Strategies for a System-Level Understanding of Metabolism

    PubMed Central

    Cazzaniga, Paolo; Damiani, Chiara; Besozzi, Daniela; Colombo, Riccardo; Nobile, Marco S.; Gaglio, Daniela; Pescini, Dario; Molinari, Sara; Mauri, Giancarlo; Alberghina, Lilia; Vanoni, Marco

    2014-01-01

    Cell metabolism is the biochemical machinery that provides energy and building blocks to sustain life. Understanding its fine regulation is of pivotal relevance in several fields, from metabolic engineering applications to the treatment of metabolic disorders and cancer. Sophisticated computational approaches are needed to unravel the complexity of metabolism. To this aim, a plethora of methods have been developed, yet it is generally hard to identify which computational strategy is most suited for the investigation of a specific aspect of metabolism. This review provides an up-to-date description of the computational methods available for the analysis of metabolic pathways, discussing their main advantages and drawbacks.  In particular, attention is devoted to the identification of the appropriate scale and level of accuracy in the reconstruction of metabolic networks, and to the inference of model structure and parameters, especially when dealing with a shortage of experimental measurements. The choice of the proper computational methods to derive in silico data is then addressed, including topological analyses, constraint-based modeling and simulation of the system dynamics. A description of some computational approaches to gain new biological knowledge or to formulate hypotheses is finally provided. PMID:25427076

  18. Human operator identification model and related computer programs

    NASA Technical Reports Server (NTRS)

    Kessler, K. M.; Mohr, J. N.

    1978-01-01

    Four computer programs which provide computational assistance in the analysis of man/machine systems are reported. The programs are: (1) the Modified Transfer Function Program (TF); (2) the Time Varying Response Program (TVSR); (3) the Optimal Simulation Program (TVOPT); and (4) the Linear Identification Program (SCIDNT). The TF program converts the time-domain state-variable system representation to a frequency-domain transfer-function representation. The TVSR program computes time histories of the input/output responses of the human operator model. The TVOPT program is an optimal simulation program and is similar to TVSR in that it produces time histories of system states associated with an operator-in-the-loop system. The differences between the two programs are presented. The SCIDNT program is an open-loop identification code which operates on simulated data from TVOPT (or TVSR) or real operator data from motion simulators.

  19. Computational identification of operons in microbial genomes.

    PubMed

    Zheng, Yu; Szustakowski, Joseph D; Fortnow, Lance; Roberts, Richard J; Kasif, Simon

    2002-08-01

    By applying graph representations to biochemical pathways, a new computational pipeline is proposed to find potential operons in microbial genomes. The algorithm relies on the fact that enzyme genes in operons tend to catalyze successive reactions in metabolic pathways. We applied this algorithm to 42 microbial genomes to identify putative operon structures. The predicted operons from Escherichia coli were compared with a selected metabolism-related operon dataset from the RegulonDB database, yielding a prediction sensitivity (89%) and specificity (87%) relative to this dataset. Several examples of detected operons are given and analyzed. Modular gene cluster transfer and operon fusion are observed. A further use of predicted operon data to assign function to putative genes was suggested and, as an example, a previous putative gene (MJ1604) from Methanococcus jannaschii is now annotated as a phosphofructokinase, which was regarded previously as a missing enzyme in this organism. GC content changes in the operon region and nonoperon region were examined. The results reveal a clear GC content transition at the boundaries of putative operons. We looked further into the conservation of operons across genomes. A trp operon alignment is analyzed in depth to show gene loss and rearrangement in different organisms during operon evolution.
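
One structural signal a pipeline of this kind can exploit, runs of same-strand genes with short intergenic gaps, can be sketched with invented coordinates (the actual algorithm additionally requires that the clustered enzyme genes catalyze successive reactions in a metabolic pathway):

```python
# Toy gene list: (name, start, end, strand); coordinates are made up.
genes = [
    ("b1", 100, 1000, "+"), ("b2", 1040, 2000, "+"), ("b3", 2050, 3000, "+"),
    ("b4", 3900, 4800, "-"), ("b5", 4850, 5700, "-"),
]

MAX_GAP = 200   # max intergenic distance (bp) within a candidate operon
operons, current = [], [genes[0]]
for prev, gene in zip(genes, genes[1:]):
    same_strand = gene[3] == prev[3]
    close = gene[1] - prev[2] <= MAX_GAP
    if same_strand and close:
        current.append(gene)        # extend the current candidate operon
    else:
        operons.append(current)     # close it and start a new one
        current = [gene]
operons.append(current)

print([[g[0] for g in op] for op in operons])   # [['b1','b2','b3'], ['b4','b5']]
```

Candidates produced this way would then be filtered by the pathway-adjacency criterion the abstract describes, and their boundaries cross-checked against signals such as the GC content transitions the authors observe.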

  20. Identification of Computational and Experimental Reduced-Order Models

    NASA Technical Reports Server (NTRS)

    Silva, Walter A.; Hong, Moeljo S.; Bartels, Robert E.; Piatak, David J.; Scott, Robert C.

    2003-01-01

    The identification of computational and experimental reduced-order models (ROMs) for the analysis of unsteady aerodynamic responses and for efficient aeroelastic analyses is presented. For the identification of a computational aeroelastic ROM, the CFL3Dv6.0 computational fluid dynamics (CFD) code is used. Flutter results for the AGARD 445.6 Wing and for a Rigid Semispan Model (RSM) computed using CFL3Dv6.0 are presented, including discussion of associated computational costs. Modal impulse responses of the unsteady aerodynamic system are computed using the CFL3Dv6.0 code and transformed into state-space form. The unsteady aerodynamic state-space ROM is then combined with a state-space model of the structure to create an aeroelastic simulation using the MATLAB/SIMULINK environment. The MATLAB/SIMULINK ROM is then used to rapidly compute aeroelastic transients, including flutter. The ROM shows excellent agreement with the aeroelastic analyses computed using the CFL3Dv6.0 code directly. For the identification of experimental unsteady pressure ROMs, results are presented for two configurations: the RSM and a Benchmark Supercritical Wing (BSCW). Both models were used to acquire unsteady pressure data due to pitching oscillations on the Oscillating Turntable (OTT) system at the Transonic Dynamics Tunnel (TDT). A deconvolution scheme involving a step input in pitch and the resultant step response in pressure, for several pressure transducers, is used to identify the unsteady pressure impulse responses. The identified impulse responses are then used to predict the pressure responses due to pitching oscillations at several frequencies. Comparisons with the experimental data are then presented.
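
The impulse-response-to-state-space step can be sketched with a minimal realization of the Eigensystem Realization Algorithm (ERA) kind for a scalar sequence of Markov parameters. The abstract does not name the specific transformation, and the CFD coupling is of course not reproduced; the two-state system below is invented so that the recovered poles can be checked.

```python
import numpy as np

def era(markov, order, m=20):
    """State-space (A, B, C) from scalar Markov parameters h0, h1, ..."""
    # Hankel matrices of the impulse response and its one-step shift.
    H0 = np.array([[markov[i + j] for j in range(m)] for i in range(m)])
    H1 = np.array([[markov[i + j + 1] for j in range(m)] for i in range(m)])
    U, s, Vt = np.linalg.svd(H0)
    U, s, Vt = U[:, :order], s[:order], Vt[:order]   # truncate to model order
    S_half, S_inv = np.diag(np.sqrt(s)), np.diag(1 / np.sqrt(s))
    A = S_inv @ U.T @ H1 @ Vt.T @ S_inv
    B = (S_half @ Vt)[:, :1]
    C = (U @ S_half)[:1, :]
    return A, B, C

# True system; its impulse response is h_k = C A^k B.
A_true = np.array([[0.9, 0.2], [0.0, 0.7]])
B_true = np.array([[1.0], [1.0]])
C_true = np.array([[1.0, 0.0]])
h = [(C_true @ np.linalg.matrix_power(A_true, k) @ B_true).item()
     for k in range(45)]

A, B, C = era(h, order=2)
print(np.sort(np.linalg.eigvals(A)).round(4))   # recovers poles 0.7 and 0.9
```

The same machinery scales to the multi-input, multi-output case by stacking Markov parameter blocks in the Hankel matrices; the identified (A, B, C) is then the unsteady aerodynamic ROM that gets coupled to a structural model.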

  1. Model identification in computational stochastic dynamics using experimental modal data

    NASA Astrophysics Data System (ADS)

    Batou, A.; Soize, C.; Audebert, S.

    2015-01-01

    This paper deals with the identification of a stochastic computational model using experimental eigenfrequencies and mode shapes. In the presence of randomness, it is difficult to construct a one-to-one correspondence between the results provided by the stochastic computational model and the experimental data because of the mode crossing and veering phenomena that may occur from one realization to another. In this paper, this correspondence is constructed by introducing an adapted transformation for the computed modal quantities. The transformed computed modal quantities can then be compared with the experimental data in order to identify the parameters of the stochastic computational model. The methodology is applied to a booster pump of thermal units for which experimental modal data have been measured on several sites.
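
The paper's adapted transformation is not described in the abstract. As a standard stand-in, the Modal Assurance Criterion (MAC) illustrates how computed and experimental mode shapes can be paired when mode crossing reorders them; the mode shapes below are invented:

```python
import numpy as np

def mac(phi_a, phi_b):
    """Modal Assurance Criterion between two real mode shape vectors."""
    return (phi_a @ phi_b) ** 2 / ((phi_a @ phi_a) * (phi_b @ phi_b))

exp_modes = [np.array([1.0, 2.0, 1.0]), np.array([1.0, 0.0, -1.0])]
# Computed modes came out in swapped order (a "crossing" in one realization).
comp_modes = [np.array([0.9, -0.1, -1.1]), np.array([1.1, 1.9, 0.9])]

# Pair each experimental mode with the computed mode of highest MAC.
pairing = [max(range(len(comp_modes)), key=lambda j: mac(e, comp_modes[j]))
           for e in exp_modes]
print(pairing)   # [1, 0]: experimental mode i matches computed mode pairing[i]
```

Repeating such a pairing for every realization of the stochastic model yields comparable modal quantities across realizations, which is the role the paper's transformation plays before parameter identification.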

  2. Computational Identification of Transcriptional Regulators in Human Endotoxemia

    PubMed Central

    Nguyen, Tung T.; Foteinou, Panagiota T.; Calvano, Steven E.; Lowry, Stephen F.; Androulakis, Ioannis P.

    2011-01-01

    One of the great challenges in the post-genomic era is to decipher the underlying principles governing the dynamics of biological responses. As modulating gene expression levels is among the key regulatory responses of an organism to changes in its environment, identifying biologically relevant transcriptional regulators and their putative regulatory interactions with target genes is an essential step towards studying the complex dynamics of transcriptional regulation. We present an analysis that integrates various computational and biological aspects to explore the transcriptional regulation of systemic inflammatory responses through a human endotoxemia model. Given a high-dimensional transcriptional profiling dataset from human blood leukocytes, an elementary set of temporal dynamic responses which capture the essence of a pro-inflammatory phase, a counter-regulatory response and a dysregulation in leukocyte bioenergetics has been extracted. Upon identification of these expression patterns, fourteen inflammation-specific gene batteries that represent groups of hypothetically 'coregulated' genes are proposed. Subsequently, statistically significant cis-regulatory modules (CRMs) are identified and decomposed into a list of 34 critical transcription factors that are validated largely against the primary literature. Finally, our analysis allows for the construction of a dynamic representation of the temporal transcriptional regulatory program across the host, deciphering possible combinatorial interactions among factors under which they might be active. Although much remains to be explored, this study has computationally identified key transcription factors and proposed a putative time-dependent transcriptional regulatory program associated with critical transcriptional inflammatory responses. These results provide a solid foundation for future investigations to elucidate the underlying transcriptional regulatory mechanisms of the host inflammatory response.

  3. 48 CFR 227.7203-10 - Contractor identification and marking of computer software or computer software documentation to...

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... and marking of computer software or computer software documentation to be furnished with restrictive... Rights in Computer Software and Computer Software Documentation 227.7203-10 Contractor identification and marking of computer software or computer software documentation to be furnished with restrictive...

  4. 48 CFR 227.7203-3 - Early identification of computer software or computer software documentation to be furnished to...

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... computer software or computer software documentation to be furnished to the Government with restrictions on..., DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7203-3 Early identification of computer software or computer software documentation to be furnished to the Government...

  5. 48 CFR 227.7203-10 - Contractor identification and marking of computer software or computer software documentation to...

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... and marking of computer software or computer software documentation to be furnished with restrictive... Rights in Computer Software and Computer Software Documentation 227.7203-10 Contractor identification and marking of computer software or computer software documentation to be furnished with restrictive markings...

  6. 48 CFR 227.7203-10 - Contractor identification and marking of computer software or computer software documentation to...

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... and marking of computer software or computer software documentation to be furnished with restrictive... Rights in Computer Software and Computer Software Documentation 227.7203-10 Contractor identification and marking of computer software or computer software documentation to be furnished with restrictive markings...

  7. 48 CFR 227.7203-10 - Contractor identification and marking of computer software or computer software documentation to...

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... and marking of computer software or computer software documentation to be furnished with restrictive... Rights in Computer Software and Computer Software Documentation 227.7203-10 Contractor identification and marking of computer software or computer software documentation to be furnished with restrictive markings...

  8. 48 CFR 227.7203-10 - Contractor identification and marking of computer software or computer software documentation to...

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... and marking of computer software or computer software documentation to be furnished with restrictive... Rights in Computer Software and Computer Software Documentation 227.7203-10 Contractor identification and marking of computer software or computer software documentation to be furnished with restrictive markings...

  9. Computed tomographic identification of calcified optic nerve drusen

    SciTech Connect

    Ramirez, H.; Blatt, E.S.; Hibri, N.S.

    1983-07-01

    Four cases of optic disk drusen were accurately diagnosed with orbital computed tomography (CT). The radiologist should be aware of the characteristic CT finding of discrete calcification within an otherwise normal optic disk. This benign process is easily differentiated from lesions such as calcific neoplastic processes of the posterior globe. CT identification of optic disk drusen is essential in the evaluation of visual field defects, migraine-like headaches, and pseudopapilledema.

  10. An experimental modal testing/identification technique for personal computers

    NASA Technical Reports Server (NTRS)

    Roemer, Michael J.; Schlonski, Steven T.; Mook, D. Joseph

    1990-01-01

    A PC-based system for mode shape identification is evaluated. A time-domain modal identification procedure is used to identify the mode shapes of a beam apparatus from discrete time-domain measurements. The apparatus includes a cantilevered aluminum beam, four accelerometers, four low-pass filters, and the computer. The method combines an identification algorithm, the Eigensystem Realization Algorithm (ERA), with an estimation algorithm, the Minimum Model Error (MME) method. The identification ability of this combined algorithm is compared with ERA alone, a frequency-response-function technique, and an Euler-Bernoulli beam model. Detection of modal parameters and mode shapes by the PC-based time-domain system is shown to be accurate in an application with an aluminum beam, whereas mode shapes identified by the frequency-domain technique are not as accurate as predicted. The new method is shown to be significantly less sensitive to noise and to poorly excited modes than other leading methods. The results support the use of time-domain identification systems for mode shape prediction.

  11. Multilevel Relaxation in Low Level Computer Vision.

    DTIC Science & Technology

    1982-01-01

    Lab, Cambridge, MA, June 1981. HR80 Hanson, A. and Riseman, E.M., Processing Cones: A Computational Structure for Image Analysis. In: ... Klinger, A. (Editors), Structured Computer Vision: Machine Perception through Hierarchical Computation Structures, Academic Press, New York, 1980. TP75

  12. Splign: algorithms for computing spliced alignments with identification of paralogs

    PubMed Central

    Kapustin, Yuri; Souvorov, Alexander; Tatusova, Tatiana; Lipman, David

    2008-01-01

    Background The computation of accurate alignments of cDNA sequences against a genome is at the foundation of modern genome annotation pipelines. Several factors, such as the presence of paralogs, small exons, non-consensus splice signals, sequencing errors, and polymorphic sites, pose recognized difficulties for existing spliced alignment algorithms. Results We describe the set of algorithms behind a tool called Splign for computing cDNA-to-genome alignments. The algorithms include a high-performance preliminary alignment, compartment identification based on a formally defined model of adjacent duplicated regions, and a refined sequence alignment. In a series of tests, Splign produced more accurate results than other tools commonly used to compute spliced alignments, in a reasonable amount of time. Conclusion Splign's ability to deal with the various issues complicating the spliced alignment problem makes it a helpful tool in eukaryotic genome annotation processes and alternative splicing studies. Its performance is sufficient to align the largest currently available pools of cDNA data, such as the human EST set, on a moderate-sized computing cluster in a matter of hours. The duplication identification (compartmentization) algorithm can be used independently in other areas, such as the study of pseudogenes. Reviewers This article was reviewed by Steven Salzberg, Arcady Mushegian, and Andrey Mironov (nominated by Mikhail Gelfand). PMID:18495041

  13. Tracking by Identification Using Computer Vision and Radio

    PubMed Central

    Mandeljc, Rok; Kovačič, Stanislav; Kristan, Matej; Perš, Janez

    2013-01-01

    We present a novel system for detection, localization and tracking of multiple people, which fuses a multi-view computer vision approach with a radio-based localization system. The proposed fusion combines the best of both worlds, excellent computer-vision-based localization and strong identity information provided by the radio system, and is therefore able to perform tracking by identification, which makes it impervious to propagated identity switches. We present a comprehensive methodology for evaluating systems that perform person localization in a world coordinate system and use it to evaluate the proposed system as well as its components. Experimental results on a challenging indoor dataset, which involves multiple people walking around a realistically cluttered room, confirm that the proposed fusion of both systems significantly outperforms its individual components. Compared to the radio-based system, it achieves better localization results, while at the same time it successfully prevents the propagation of identity switches that occur in pure computer-vision-based tracking. PMID:23262485

  14. Computational Issues in Damping Identification for Large Scale Problems

    NASA Technical Reports Server (NTRS)

    Pilkey, Deborah L.; Roe, Kevin P.; Inman, Daniel J.

    1997-01-01

    Two damping identification methods are tested for efficiency in large-scale applications. One is an iterative routine, and the other a least squares method. Numerical simulations have been performed on multiple degree-of-freedom models to test the effectiveness of the algorithms and the usefulness of parallel computation for these problems. High Performance Fortran is used to parallelize the algorithms. Tests were performed using the IBM SP2 at NASA Ames Research Center. The least squares method tested incurs high communication costs, which reduces the benefit of high-performance computing. Its memory requirement grows at a very rapid rate, meaning that larger problems can quickly exceed available computer memory. The iterative method's memory requirement grows at a much slower pace, and it is able to handle problems with 500+ degrees of freedom on a single processor. This method benefits from parallelization, and significant speedup can be seen for problems of 100+ degrees of freedom.

  15. Multi-level hot zone identification for pedestrian safety.

    PubMed

    Lee, Jaeyoung; Abdel-Aty, Mohamed; Choi, Keechoo; Huang, Helai

    2015-03-01

    According to the National Highway Traffic Safety Administration (NHTSA), while fatalities from traffic crashes have decreased, the proportion of pedestrian fatalities has steadily increased from 11% to 14% over the past decade. This study aims at identifying factors at two zonal levels: first, hot zones in which pedestrian crashes occur, and second, zones from which crash-involved pedestrians originate. A Bayesian Poisson lognormal simultaneous equation spatial error model (BPLSESEM) was estimated and revealed significant factors for the two target variables. Then, PSIs (potential for safety improvements) were computed using the model. Subsequently, a novel hot zone identification method was suggested to combine hot zones from which vulnerable pedestrians originated with hot zones in which many pedestrian crashes occur. For the former zones, targeted safety education and awareness campaigns can be provided as countermeasures, whereas area-wide engineering treatments and enforcement may be effective safety treatments for the latter ones. Thus, it is expected that practitioners will be able to suggest appropriate safety treatments for pedestrian crashes using the method and results from this study. Copyright © 2015 Elsevier Ltd. All rights reserved.

  16. Bone comparison identification method based on chest computed tomography imaging.

    PubMed

    Matsunobu, Yusuke; Morishita, Junji; Usumoto, Yosuke; Okumura, Miki; Ikeda, Noriaki

    2017-08-31

    The aim of this study is to examine the usefulness of bone structure data extracted from chest computed tomography (CT) images for personal identification. Eighteen autopsy cases (12 male and 6 female) with both ante-mortem (AM) and post-mortem (PM) CT images were used in this study. Two-dimensional (2D) and three-dimensional (3D) bone images were extracted from the chest CT images via a thresholding technique. The similarity between two thoracic bone images (consisting of vertebrae, ribs, and sternum) acquired from AM and PM CT images was calculated in terms of the normalized cross-correlation value (NCCV) in both 2D and 3D matching. The AM case with the highest NCCV for a given PM case among all of the AM cases studied was regarded as the same person. The accuracy of identification of the same person using our method was 100% (18/18) in both 2D and 3D matching. The NCCVs for the same person tended to be significantly higher than the average of NCCVs for different people in both 2D and 3D matching. The computation times of image similarity between two images were less than one second in 2D matching and approximately 10 min in 3D matching. Therefore, 2D matching, especially for thoracic bones, seems more advantageous than 3D matching with regard to computation time. We conclude that our proposed personal identification method using bone structure would be useful in forensic cases. Copyright © 2017 Elsevier B.V. All rights reserved.
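
The matching step above reduces to a normalized cross-correlation between two equally sized bone images, with the best-scoring AM case declared the match. A minimal sketch (not the authors' code; the case names and noise model are invented for illustration):

```python
import numpy as np

def nccv(img_a, img_b):
    """Normalized cross-correlation value between two same-size images.
    1.0 means identical up to brightness/contrast shifts."""
    a = np.asarray(img_a, dtype=float).ravel()
    b = np.asarray(img_b, dtype=float).ravel()
    a -= a.mean()
    b -= b.mean()
    denom = np.linalg.norm(a) * np.linalg.norm(b)
    return float(np.dot(a, b) / denom) if denom else 0.0

def identify(pm_image, am_images):
    """Return the AM case with the highest NCCV for a given PM image."""
    scores = {case: nccv(pm_image, img) for case, img in am_images.items()}
    return max(scores, key=scores.get), scores

rng = np.random.default_rng(0)
pm = rng.random((64, 64))
ams = {"case_A": pm + 0.05 * rng.random((64, 64)),  # same person, plus noise
       "case_B": rng.random((64, 64))}              # different person
best, scores = identify(pm, ams)
print(best)  # case_A
```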

  17. Computational identification of site-specific transcription factors in Drosophila.

    PubMed

    Adryan, Boris; Teichmann, Sarah A

    2007-01-01

    Site-specific transcription factors (TFs) recognise their target genes in a sequence- or conformation-dependent manner. In contrast to the basal TFs that are the general facilitators of gene expression, their site-specific interaction partners are tissue- or condition-specific. Thus, site-specific TFs constitute the major prerequisite for the modular expression programmes that drive metazoan development. This article deals with the computational identification of TFs (how to find them in genomes) and with online resources such as the FlyTF database of Drosophila site-specific TFs (how to find them online).

  18. Computer Literacy for Teachers: Level 1.

    ERIC Educational Resources Information Center

    Oliver, Marvin E.

    This brief, non-technical narrative for teachers addresses the following questions: (1) What are computers? (2) How do they work? (3) What can they do? (4) Why should we care? (5) What do they have to do with reading instruction? and (6) What is the future of computers for teachers and students? Specific topics discussed include the development of…

  19. A Teaching Exercise for the Identification of Bacteria Using An Interactive Computer Program.

    ERIC Educational Resources Information Center

    Bryant, Trevor N.; Smith, John E.

    1979-01-01

    Describes an interactive Fortran computer program which provides an exercise in the identification of bacteria. Provides a way of enhancing a student's approach to systematic bacteriology and numerical identification procedures. (Author/MA)

  20. Chip level simulation of fault tolerant computers

    NASA Technical Reports Server (NTRS)

    Armstrong, J. R.

    1983-01-01

    Chip level modeling techniques, functional fault simulation, simulation software development, a more efficient, high level version of GSP, and a parallel architecture for functional simulation are discussed.

  1. Identification of Cichlid Fishes from Lake Malawi Using Computer Vision

    PubMed Central

    Joo, Deokjin; Kwan, Ye-seul; Song, Jongwoo; Pinho, Catarina; Hey, Jody; Won, Yong-Jin

    2013-01-01

    Background The explosively radiating evolution of cichlid fishes in Lake Malawi has yielded an amazing number of haplochromine species, estimated at 500 to 800, with a surprising degree of diversity not only in color and stripe pattern but also in jaw and body shape. As these morphological diversities have been a central subject of adaptive speciation and taxonomic classification, such high diversity could serve as a foundation for automated species identification of cichlids. Methodology/Principal Findings Here we demonstrate a method for automatic classification of the Lake Malawi cichlids based on computer vision and geometric morphometrics. To this end we developed a pipeline that integrates multiple image processing tools to automatically extract informative features of color and stripe patterns from a large set of photographic images of wild cichlids. The extracted information was evaluated by the statistical classifiers Support Vector Machine and Random Forests. Both classifiers performed better when body shape information was added to the color and stripe features. Besides the coloration and stripe pattern, body shape variables boosted the accuracy of classification by about 10%. The programs were able to classify 594 live cichlid individuals belonging to 12 different classes (species and sexes) with an average accuracy of 78%, contrasting with a mere 42% success rate by human eyes. The variables that contributed most to the accuracy were body height and the hue of the most frequent color. Conclusions Computer vision showed notable performance in extracting information from the color and stripe patterns of Lake Malawi cichlids, although the information was not enough for error-free species identification. Our results indicate that there appears to be an unavoidable difficulty in automatic species identification of cichlid fishes, which may arise from short divergence times and gene flow between closely related species. PMID:24204918
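
The finding that body-shape variables boost accuracy beyond color features alone can be illustrated with a toy experiment. The sketch below uses a nearest-centroid classifier on synthetic two-feature data rather than the authors' SVM/Random Forest pipeline; the two "species", their hue and body-height distributions, and all numbers are invented:

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic features for two hypothetical species: hue (column 0) overlaps
# heavily between classes, body height (column 1) separates them well.
def sample(n, hue_mu, height_mu):
    return np.column_stack([rng.normal(hue_mu, 1.0, n),
                            rng.normal(height_mu, 0.3, n)])

train_a, train_b = sample(100, 0.0, 1.0), sample(100, 0.5, 3.0)
test_a, test_b = sample(100, 0.0, 1.0), sample(100, 0.5, 3.0)

def accuracy(cols):
    # Nearest-centroid classifier restricted to the chosen feature columns.
    ca = train_a[:, cols].mean(axis=0)
    cb = train_b[:, cols].mean(axis=0)
    def predict(x):
        return 0 if np.linalg.norm(x - ca) <= np.linalg.norm(x - cb) else 1
    hits = sum(predict(x) == 0 for x in test_a[:, cols])
    hits += sum(predict(x) == 1 for x in test_b[:, cols])
    return hits / 200.0

print(accuracy([0]) < accuracy([0, 1]))  # True: adding body shape helps
```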

  2. Factors Influencing Exemplary Science Teachers' Levels of Computer Use

    ERIC Educational Resources Information Center

    Hakverdi, Meral; Dana, Thomas M.; Swain, Colleen

    2011-01-01

    The purpose of this study was to examine exemplary science teachers' use of technology in science instruction, factors influencing their level of computer use, their level of knowledge/skills in using specific computer applications for science instruction, their use of computer-related applications/tools during their instruction, and their…

  4. Computational proteomics tools for identification and quality control.

    PubMed

    Kopczynski, Dominik; Sickmann, Albert; Ahrends, Robert

    2017-11-10

    Computational proteomics is a constantly growing field that supports end users with powerful and reliable tools for performing the computational steps of an analytical proteomics workflow. Typically, after capture with a mass spectrometer, the proteins have to be identified and quantified. After certain follow-up analyses, an optional targeted approach is suitable for validating the results. The de.NBI (German network for bioinformatics infrastructure) service center in Dortmund provides several software applications and platforms as services to meet these demands. In this work, we present our tools and services, in particular the combination of SearchGUI and PeptideShaker. SearchGUI is a managing tool for several search engines that finds peptide-spectrum matches for one or more complex MS(2) measurements. PeptideShaker combines all matches and creates a consensus list of identified proteins with statistical confidence measures. In a next step, we are planning to release a web service for protein identification containing both tools. This system will be designed for high scalability and distributed computing, using solutions such as the Docker container system among others. As an additional service, we offer a web-service-oriented database providing the high-quality and high-resolution data necessary for starting targeted proteomics analyses. The user can easily select proteins of interest, review the corresponding spectra, and download both protein sequences and spectral libraries. All systems are designed to be intuitive and user-friendly to operate. Copyright © 2017 Elsevier B.V. All rights reserved.

  5. Human identification through frontal sinus 3D superimposition: Pilot study with Cone Beam Computer Tomography.

    PubMed

    Beaini, Thiago Leite; Duailibi-Neto, Eduardo F; Chilvarquer, Israel; Melani, Rodolfo F H

    2015-11-01

    As a unique anatomical feature of the human body, frontal sinus morphology has been used for identification of unknown bodies with many techniques, mostly using 2D postero-anterior X-rays. With the increasing use of Cone-Beam Computed Tomography (CBCT), the availability of this exam as an ante-mortem record should be considered. The purpose of this study is to establish a new technique for frontal sinus identification through direct superimposition of 3D volumetric models obtained from CBCT exams, tested in two distinct situations. The first was a reproducibility test, in which two observers independently rendered models of the frontal sinus from a sample of 20 CBCT exams and identified them on each other's list. In the second, one observer tested the protocol on different exams of three individuals. Using the open-source DICOM viewer InVesalius(®) for rendering, MeshLab(®) for positioning the models, and CloudCompare for volumetric comparison, both observers matched cases with 100% accuracy and established the level of coincidence in an identification situation. The uniqueness of frontal sinus topography is remarkable, and through the described technique it can be used in forensics as an identification method whenever both the sinus structure and an ante-mortem computed tomography exam are available. Copyright © 2015 Elsevier Ltd and Faculty of Forensic and Legal Medicine. All rights reserved.

  6. Computational system identification of continuous-time nonlinear systems using approximate Bayesian computation

    NASA Astrophysics Data System (ADS)

    Krishnanathan, Kirubhakaran; Anderson, Sean R.; Billings, Stephen A.; Kadirkamanathan, Visakan

    2016-11-01

    In this paper, we derive a system identification framework for continuous-time nonlinear systems, for the first time using a simulation-focused computational Bayesian approach. Simulation approaches to nonlinear system identification have been shown to outperform regression methods under certain conditions, such as non-persistently exciting inputs and fast-sampling. We use the approximate Bayesian computation (ABC) algorithm to perform simulation-based inference of model parameters. The framework has the following main advantages: (1) parameter distributions are intrinsically generated, giving the user a clear description of uncertainty, (2) the simulation approach avoids the difficult problem of estimating signal derivatives as is common with other continuous-time methods, and (3) as noted above, the simulation approach improves identification under conditions of non-persistently exciting inputs and fast-sampling. Term selection is performed by judging parameter significance using parameter distributions that are intrinsically generated as part of the ABC procedure. The results from a numerical example demonstrate that the method performs well in noisy scenarios, especially in comparison to competing techniques that rely on signal derivative estimation.
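
The rejection-ABC idea described above can be sketched compactly: draw parameters from a prior, simulate the continuous-time model forward (so no signal derivatives are ever estimated from data), and keep only draws whose simulated output lies within a tolerance of the observations. The toy first-order system, prior, and tolerance below are invented for illustration and are not the paper's framework:

```python
import numpy as np

rng = np.random.default_rng(1)

def simulate(a, x0=1.0, dt=0.01, n=200):
    """Euler-integrated trajectory of the toy model x' = -a*x."""
    x = np.empty(n)
    x[0] = x0
    for k in range(n - 1):
        x[k + 1] = x[k] + dt * (-a * x[k])
    return x

true_a = 2.0
data = simulate(true_a) + 0.01 * rng.standard_normal(200)  # noisy observation

# Rejection ABC: sample the prior, keep draws whose simulation is close.
accepted = []
for _ in range(5000):
    a = rng.uniform(0.0, 5.0)                 # prior over the parameter
    dist = np.sqrt(np.mean((simulate(a) - data) ** 2))
    if dist < 0.02:                           # tolerance epsilon
        accepted.append(a)

posterior = np.array(accepted)
print(len(posterior), round(posterior.mean(), 2))  # posterior centered near 2
```

The accepted draws form an empirical posterior, so parameter uncertainty comes out of the procedure for free, which is the first advantage the abstract lists.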

  7. Computer program to predict aircraft noise levels

    NASA Technical Reports Server (NTRS)

    Clark, B. J.

    1981-01-01

    Methods developed at the NASA Lewis Research Center for predicting the noise contributions from various aircraft noise sources were programmed to predict aircraft noise levels either in flight or in ground tests. The noise sources include fan inlet and exhaust, jet, flap (for powered lift), core (combustor), turbine, and airframe. Noise propagation corrections are available for atmospheric attenuation, ground reflections, extra ground attenuation, and shielding. Outputs can include spectra, overall sound pressure level, perceived noise level, tone-weighted perceived noise level, and effective perceived noise level at locations specified by the user. Footprint contour coordinates and approximate footprint areas can also be calculated. Inputs and outputs can be in either System International or U.S. customary units. The subroutines for each noise source and propagation correction are described. A complete listing is given.
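
Combining per-band spectra into an overall sound pressure level, one of the outputs listed above, follows the standard decibel-summation rule. A minimal sketch of that rule (not the NASA program itself):

```python
import math

def oaspl(band_levels_db):
    """Overall SPL from band levels: OASPL = 10*log10(sum of 10^(L/10))."""
    return 10.0 * math.log10(sum(10.0 ** (l / 10.0) for l in band_levels_db))

# Three equal 90 dB bands combine to 90 + 10*log10(3) ≈ 94.8 dB.
print(round(oaspl([90.0, 90.0, 90.0]), 1))  # 94.8
```

The perceived-noise metrics (PNL, EPNL) apply frequency- and tone-dependent weightings before a summation of this kind.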

  8. Factors influencing exemplary science teachers' levels of computer use

    NASA Astrophysics Data System (ADS)

    Hakverdi, Meral

    This study examines exemplary science teachers' use of technology in science instruction, factors influencing their level of computer use, their level of knowledge/skills in using specific computer applications for science instruction, their use of computer-related applications/tools during their instruction, and their students' use of computer applications/tools in or for their science class. After a relevant review of the literature certain variables were selected for analysis. These variables included personal self-efficacy in teaching with computers, outcome expectancy, pupil-control ideology, level of computer use, age, gender, teaching experience, personal computer use, professional computer use and science teachers' level of knowledge/skills in using specific computer applications for science instruction. The sample for this study includes middle and high school science teachers who received the Presidential Award for Excellence in Science Teaching Award (sponsored by the White House and the National Science Foundation) between the years 1997 and 2003 from all 50 states and U.S. territories. Award-winning science teachers were contacted about the survey via e-mail or letter with an enclosed return envelope. Of the 334 award-winning science teachers, usable responses were received from 92 science teachers, which made a response rate of 27.5%. Analysis of the survey responses indicated that exemplary science teachers have a variety of knowledge/skills in using computer related applications/tools. The most commonly used computer applications/tools are information retrieval via the Internet, presentation tools, online communication, digital cameras, and data collection probes. Results of the study revealed that students' use of technology in their science classroom is highly correlated with the frequency of their science teachers' use of computer applications/tools. The results of the multiple regression analysis revealed that personal self-efficacy related to...

  9. Identification and levels of airborne fungi in Portuguese primary schools.

    PubMed

    Madureira, Joana; Pereira, Cristiana; Paciência, Inês; Teixeira, João Paulo; de Oliveira Fernandes, Eduardo

    2014-01-01

    Several studies have found associations between exposure to airborne fungi and allergy, infection, or irritation. This study aimed to characterize the airborne fungal populations present in public primary schools in Porto, Portugal, during winter through quantification and identification procedures. Fungal concentration levels and identifications were obtained in a total of 73 classrooms. The AirIdeal portable air sampler was used in combination with chloramphenicol malt extract agar. Results showed a wide range of indoor fungi levels, with indoor concentrations higher than outdoors. The most prevalent fungi found indoors were Penicillium sp. (>70%) and Cladosporium sp. As evidence indicates that indoor fungal exposure plays a role in asthma clinical status, these results may contribute to (1) promoting and implementing public health prevention programs and (2) formulating recommendations aimed at providing healthier school environments.

  10. Senior Nursing Students Comfort Levels with Computer Technology.

    ERIC Educational Resources Information Center

    Draude, Barbara J.; McKinney, Susan H.

    This paper describes the rationale for investigating the comfort levels of senior-level students in a baccalaureate nursing program in using computer technology. Students were surveyed before and after being exposed to various learning activities requiring interaction with computer technology. These structured learning activities included: use…

  11. Evaluation of new computer-enhanced identification program for microorganisms: adaptation of BioBASE for identification of members of the family Enterobacteriaceae.

    PubMed

    Miller, J M; Alachi, P

    1996-01-01

    We report the use of BioBASE, a computer-enhanced numerical identification software package, as a valuable aid for the rapid identification of unknown enteric bacilli when using conventional biochemicals. We compared BioBASE identification results with those of the Centers for Disease Control and Prevention's mainframe computer to determine the former's accuracy in identifying both common and rare unknown isolates of the family Enterobacteriaceae, using the same compiled data matrix. Of 293 enteric strains tested by BioBASE, 278 (94.9%) were correctly identified to the species level; 13 (4.4%) were assigned unacceptable or low-discrimination profiles, although 8 of these (2.7%) were listed as the first choice; and 2 (0.7%) were not identified correctly because of their highly unusual biochemical profiles. The software is user-friendly, rapid, and accurate and would be of value to any laboratory that uses conventional biochemicals.
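
Numerical identification of the kind BioBASE automates typically scores each taxon by multiplying per-test probabilities and normalizing (Willcox-style likelihoods). The sketch below uses an invented three-taxon, three-test probability matrix purely for illustration; it is not the BioBASE data matrix:

```python
# P(test positive | taxon) for each biochemical test (illustrative values).
matrix = {
    "Escherichia coli":      {"indole": 0.98, "citrate": 0.01, "urease": 0.01},
    "Klebsiella pneumoniae": {"indole": 0.01, "citrate": 0.95, "urease": 0.90},
    "Proteus mirabilis":     {"indole": 0.02, "citrate": 0.60, "urease": 0.98},
}

def identify(results):
    """results: {test: True/False}. Returns normalized identification scores."""
    likelihoods = {}
    for taxon, probs in matrix.items():
        l = 1.0
        for test, positive in results.items():
            p = probs[test]
            l *= p if positive else (1.0 - p)
        likelihoods[taxon] = l
    total = sum(likelihoods.values())
    return {t: l / total for t, l in likelihoods.items()}

# An indole-positive, citrate- and urease-negative unknown.
scores = identify({"indole": True, "citrate": False, "urease": False})
best = max(scores, key=scores.get)
print(best)
```

A "low discrimination" profile, as mentioned in the abstract, corresponds to no single taxon's normalized score clearing an acceptance threshold.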

  12. Sound source localization identification accuracy: Level and duration dependencies.

    PubMed

    Yost, William A

    2016-07-01

    Sound source localization accuracy for noises was measured for sources in the front azimuthal open field, mainly as a function of overall noise level and duration. An identification procedure was used in which listeners identified which loudspeaker presented a sound. Noises were filtered and differed in bandwidth and center frequency. Sound source localization accuracy depended on the bandwidth of the stimuli, and for the narrow bandwidths, accuracy depended on the filter's center frequency. Sound source localization accuracy did not depend on overall level or duration.

  13. 24 CFR 990.180 - Utilities expense level: Computation of the rolling base consumption level.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ...: Computation of the rolling base consumption level. 990.180 Section 990.180 Housing and Urban Development... Calculating Formula Expenses § 990.180 Utilities expense level: Computation of the rolling base consumption level. (a) General. (1) The rolling base consumption level (RBCL) shall be equal to the average of...

  14. 24 CFR 990.175 - Utilities expense level: Computation of the current consumption level.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ...: Computation of the current consumption level. 990.175 Section 990.175 Housing and Urban Development... Calculating Formula Expenses § 990.175 Utilities expense level: Computation of the current consumption level. The current consumption level shall be the actual amount of each utility consumed during the 12-month...

  15. Proficiency Level--A Fuzzy Variable in Computer Learner Corpora

    ERIC Educational Resources Information Center

    Carlsen, Cecilie

    2012-01-01

    This article focuses on the proficiency level of texts in Computer Learner Corpora (CLCs). A claim is made that proficiency levels are often poorly defined in CLC design, and that the methods used for level assignment of corpus texts are not always adequate. Proficiency level can therefore best be described as a fuzzy variable in CLCs,…

  17. Learning support assessment study of a computer simulation for the development of microbial identification strategies.

    PubMed

    Johnson, T E; Gedney, C

    2001-05-01

    This paper describes a study that examined how microbiology students construct knowledge of bacterial identification while using a computer simulation. The purpose of this study was to understand how the simulation affects the cognitive processing of students during thinking, problem solving, and learning about bacterial identification and to determine how the simulation facilitates the learning of a domain-specific problem-solving strategy. As part of an upper-division microbiology course, five students participated in several simulation assignments. The data were collected using think-aloud protocols and video action logs as the students used the simulation. The analysis revealed two major themes that determined the performance of the students: Simulation Usage (how the students used the software features) and Problem-Solving Strategy Development (the strategy level students started with and the skill level they achieved when they completed their use of the simulation). Several conclusions emerged from the analysis of the data: (i) The simulation affects various aspects of cognitive processing by creating an environment that makes it possible to practice the application of a problem-solving strategy. The simulation was used as an environment that allowed students to practice the cognitive skills required to solve an unknown. (ii) Identibacter (the computer simulation) may be considered a cognitive tool to facilitate the learning of a bacterial identification problem-solving strategy. (iii) The simulation characteristics did support student learning of a problem-solving strategy. (iv) Students demonstrated problem-solving strategy development specific to bacterial identification. (v) Participants demonstrated improved performance from their repeated use of the simulation.

  18. Computational Identification of Novel Genes: Current and Future Perspectives

    PubMed Central

    Klasberg, Steffen; Bitard-Feildel, Tristan; Mallet, Ludovic

    2016-01-01

    While it has long been thought that all genomic novelties are derived from existing material, many genes lacking homology to known genes were found in recent genome projects. Some of these novel genes were proposed to have evolved de novo, i.e., out of noncoding sequences, whereas others have been shown to follow a duplication-and-divergence process. Their discovery called for an extension of the historical hypotheses about gene origination. Besides the theoretical breakthrough, increasing evidence has accumulated that novel genes play important roles in evolutionary processes, including adaptation and speciation events. Different techniques are available to identify genes and classify them as novel. Their classification as novel is usually based on their similarity to known genes, or lack thereof, as detected by comparative genomics or searches against databases. Computational approaches are further prime methods, which can be based on existing models or leverage biological evidence from experiments. Identification of novel genes remains, however, a challenging task. With constant software and technology updates, no gold standard, and no available benchmark, the evaluation and characterization of genomic novelty is a vibrant field. In this review, the classical and state-of-the-art tools for gene prediction are introduced. The current methods for novel gene detection are presented; the methodological strategies and their limits are discussed along with perspective approaches for further studies. PMID:27493475

  19. Functions and Statistics with Computers at the College Level.

    ERIC Educational Resources Information Center

    Hollowell, Kathleen A.; Duch, Barbara J.

    An experimental college level course, Functions and Statistics with Computers, was designed using the textbook "Functions, Statistics, and Trigonometry with Computers" developed by the University of Chicago School Mathematics Project. Students in the experimental course were compared with students in the traditional course based on attitude toward…

  20. Computational identification of obligatorily autocatalytic replicators embedded in metabolic networks

    PubMed Central

    Kun, Ádám; Papp, Balázs; Szathmáry, Eörs

    2008-01-01

    Background If chemical A is necessary for the synthesis of more chemical A, then A has the power of replication (such systems are known as autocatalytic systems). We provide the first systems-level analysis searching for small-molecule autocatalytic components in the metabolisms of diverse organisms, including an inferred minimal metabolism. Results We find that intermediary metabolism is invariably autocatalytic for ATP. Furthermore, we provide evidence for the existence of additional, organism-specific autocatalytic metabolites in the form of coenzymes (NAD+, coenzyme A, tetrahydrofolate, quinones) and sugars. Although the enzymatic reactions of a number of autocatalytic cycles are present in most of the studied organisms, they display obligatorily autocatalytic behavior in only a few networks, hence demonstrating the need for a systems-level approach to identify metabolic replicators embedded in large networks. Conclusion Metabolic replicators are apparently common and potentially both universal and ancestral: without their presence, kick-starting metabolic networks is impossible, even if all enzymes and genes are present in the same cell. Identification of metabolic replicators is also important for attempts to create synthetic cells, as some of these autocatalytic molecules will presumably need to be added to the system; by definition, the system cannot synthesize them without their initial presence. PMID:18331628
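
    The core notion above (A is needed to make more A) can be illustrated with a minimal sketch. This is not the paper's genome-scale analysis; the three-reaction "glycolysis-flavoured" network and all names are invented for illustration, and `expand` is a simple network-expansion (scope) computation.

```python
# Toy sketch: a metabolite is an obligatorily autocatalytic replicator if the
# network can regenerate it only when some of it is already present.
# Reactions are (substrates, products) pairs of sets; seeds are nutrients.

def expand(reactions, seeds):
    """Repeatedly fire reactions whose substrates are all available.
    Returns (available metabolites, metabolites produced by some reaction)."""
    available, produced = set(seeds), set()
    changed = True
    while changed:
        changed = False
        for subs, prods in reactions:
            if subs <= available and not prods <= produced:
                produced |= prods
                available |= prods
                changed = True
    return available, produced

def obligately_autocatalytic(reactions, seeds, m):
    """m cannot be made from the seeds alone, but is regenerated once seeded."""
    _, without = expand(reactions, seeds)
    _, with_m = expand(reactions, set(seeds) | {m})
    return m not in without and m in with_m

# Invented glycolysis-like toy: ATP must be invested before the pathway
# pays back more ATP than it consumed.
reactions = [
    ({"glucose", "ATP"}, {"G6P", "ADP"}),
    ({"G6P"}, {"BPG"}),
    ({"BPG", "ADP"}, {"ATP", "pyruvate"}),
]
print(obligately_autocatalytic(reactions, {"glucose", "ADP"}, "ATP"))  # True
```

    Note that the test distinguishes regeneration from mere presence: a seed nutrient such as glucose is available but never produced, so it is not flagged as a replicator.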

  1. The Computational Complexity of Two-Level Morphology,

    DTIC Science & Technology

    1985-11-01

    The Computational Complexity of Two-Level Morphology. G. Edward Barton, Jr. Massachusetts Institute of Technology, Artificial Intelligence Laboratory, A.I. Memo No. 856, November 1985.

  2. Computational identification of genes modulating stem height-diameter allometry.

    PubMed

    Jiang, Libo; Ye, Meixia; Zhu, Sheng; Zhai, Yi; Xu, Meng; Huang, Minren; Wu, Rongling

    2016-12-01

    The developmental variation in stem height with respect to stem diameter is related to a broad range of ecological and evolutionary phenomena in trees, but the underlying genetic basis of this variation remains elusive. We implement a dynamic statistical model, functional mapping, to formulate a general procedure for the computational identification of quantitative trait loci (QTLs) that control stem height-diameter allometry during development. Functional mapping integrates the biological principles underlying trait formation and development into the association analysis of DNA genotype and endpoint phenotype, thus providing an incentive for understanding the mechanistic interplay between genes and development. Built on the basic tenet of functional mapping, we explore two core ecological scenarios of how stem height and stem diameter covary in response to environmental stimuli: (i) trees pioneer sunlit space by allocating more growth to stem height than to diameter and (ii) trees maintain their competitive advantage through an inverse pattern. The model is equipped to characterize 'pioneering' QTLs (piQTLs) and 'maintaining' QTLs (miQTLs), which modulate these two ecological scenarios, respectively. In a practical application to a mapping population of full-sib hybrids derived from two Populus species, the model has proven its versatility by identifying several piQTLs that promote height growth at a cost of diameter growth and several miQTLs that benefit radial growth at a cost of height growth. Judicious application of functional mapping may lead to improved strategies for studying the genetic control of the formation mechanisms underlying trade-offs among quantities of assimilates allocated to different growth parts. © 2016 The Authors. Plant Biotechnology Journal published by Society for Experimental Biology and The Association of Applied Biologists and John Wiley & Sons Ltd.
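
    The allometric relationship at the heart of such models is the power law H = a * D^b. The sketch below is only a loose illustration with invented data and ordinary least squares on the log-log scale, not the paper's functional-mapping likelihood machinery; it shows how a hypothetical 'pioneering' locus would surface as a larger height-diameter exponent in carriers.

```python
# Illustrative only: fit log H = log a + b log D by least squares and compare
# the exponent b between two invented genotype groups at one locus.
import numpy as np

rng = np.random.default_rng(0)
diam = rng.uniform(2.0, 20.0, 200)            # stem diameter (cm), invented

def fit_allometry(d, h):
    """Least-squares fit of log h = log a + b log d; returns (a, b)."""
    b, log_a = np.polyfit(np.log(d), np.log(h), 1)
    return np.exp(log_a), b

# Hypothetical groups: carriers allocate relatively more growth to height,
# i.e. a larger exponent b (the 'pioneering' pattern).
h_ref = 1.5 * diam**0.9 * np.exp(rng.normal(0, 0.05, diam.size))
h_carrier = 1.5 * diam**1.1 * np.exp(rng.normal(0, 0.05, diam.size))

a0, b0 = fit_allometry(diam, h_ref)
a1, b1 = fit_allometry(diam, h_carrier)
print(f"b(ref)={b0:.2f}  b(carrier)={b1:.2f}")  # carrier exponent is larger
```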

  3. A Model of Computation for Bit-Level Concurrent Computing and Programming: APEC

    NASA Astrophysics Data System (ADS)

    Ajiro, Takashi; Tsuchida, Kensei

    A concurrent model of computation, and a language based on it for bit-level operations, are useful for compositionally developing asynchronous and concurrent programs that make frequent use of bit-level operations; examples include programs for video games, hardware emulation (including virtual machines), and signal processing. However, few models and languages are optimized for and oriented to bit-level concurrent computation. We previously developed a visual programming language called A-BITS for bit-level concurrent programming. The language is based on a dataflow-like model that computes using processes providing serial bit-level operations and FIFO buffers connecting them. It expresses bit-level computation naturally and supports compositional development. We then devised a concurrent computation model called APEC (Asynchronous Program Elements Connection) for bit-level concurrent computation. This model enables precise and formal expression of the process of computation, and a notion of primitive program elements for control and operation can be expressed synthetically. Specifically, the model is based on a notion of uniform primitive processes, called primitives, that have at most three terminals and four ordered rules, as well as on bidirectional communication using vehicles called carriers. A new notion is that a carrier moving between two terminals can concisely express certain kinds of computation, such as synchronization and bidirectional communication. The model's properties make it well suited to compositional bit-level computation, since the uniform computational elements suffice to build components with practical functionality. Future application of the model may enable further research on a base model for fine-grain parallel computer architecture, since the model is suited to expressing massive concurrency through a network of primitives.
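
    The dataflow style described here (processes performing serial bit-level operations, connected by FIFO buffers) can be caricatured in a few lines. This generic sketch is neither APEC nor A-BITS; the two processes and all names are invented for illustration.

```python
# Generic dataflow-style sketch: bit-level processes communicate only
# through FIFO buffers (deques), so stages can be composed freely.
from collections import deque

def xor_process(in_a, in_b, out):
    """Consume one bit from each input FIFO, emit their XOR."""
    while in_a and in_b:
        out.append(in_a.popleft() ^ in_b.popleft())

def not_process(inp, out):
    """Invert each bit flowing through."""
    while inp:
        out.append(inp.popleft() ^ 1)

a = deque([0, 1, 1, 0])
b = deque([1, 1, 0, 0])
mid, result = deque(), deque()

xor_process(a, b, mid)    # 0^1, 1^1, 1^0, 0^0 -> 1 0 1 0
not_process(mid, result)  # invert each bit    -> 0 1 0 1
print(list(result))       # [0, 1, 0, 1]
```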

  4. Identification of natural images and computer-generated graphics based on statistical and textural features.

    PubMed

    Peng, Fei; Li, Jiao-ting; Long, Min

    2015-03-01

    To discriminate the acquisition pipelines of digital images, a novel scheme for the identification of natural images and computer-generated graphics is proposed based on statistical and textural features. First, the differences between the two classes are investigated from the viewpoint of statistics and texture, and a 31-dimensional feature vector is acquired for identification. Then, LIBSVM is used for classification. The experimental results show that the scheme achieves an identification accuracy of 97.89% for computer-generated graphics and 97.75% for natural images. The analyses also demonstrate that the proposed method has excellent performance compared with some existing methods based only on statistical features or other features. The method has great potential for the identification of natural images and computer-generated graphics.
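
    A hedged sketch of the pipeline shape only: the paper extracts 31 statistical and textural features and classifies with LIBSVM, whereas the toy below extracts a handful of first-order statistics plus a crude texture proxy from synthetic images and uses a nearest-centroid stand-in for the SVM. All feature choices and numbers are invented.

```python
# Feature extraction + classification sketch for natural vs. computer-
# generated images, on synthetic data (noisy "natural" vs. smooth "CG").
import numpy as np

def features(img):
    """Tiny feature vector: mean, std, a third-moment term, and a texture
    proxy (mean absolute horizontal gradient), which tends to differ between
    noisy natural images and smooth computer-generated gradients."""
    g = np.abs(np.diff(img, axis=1)).mean()
    return np.array([img.mean(), img.std(), ((img - img.mean())**3).mean(), g])

rng = np.random.default_rng(1)
natural = [rng.normal(0.5, 0.2, (32, 32)) for _ in range(20)]       # noisy
cg = [np.tile(np.linspace(0, 1, 32), (32, 1)) + rng.normal(0, 0.01, (32, 32))
      for _ in range(20)]                                           # smooth

X = np.array([features(i) for i in natural + cg])
y = np.array([0] * 20 + [1] * 20)                                   # 0=natural, 1=CG
centroids = np.array([X[y == c].mean(axis=0) for c in (0, 1)])

def classify(img):
    """Nearest-centroid stand-in for the SVM classifier."""
    d = np.linalg.norm(centroids - features(img), axis=1)
    return int(d.argmin())

print(classify(rng.normal(0.5, 0.2, (32, 32))))  # 0 (natural-like)
```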

  5. Identification of Learning Processes by Means of Computer Graphics.

    ERIC Educational Resources Information Center

    Sorensen, Birgitte Holm

    1993-01-01

    Describes a development project for the use of computer graphics and video in connection with an inservice training course for primary education teachers in Denmark. Topics addressed include research approaches to computers; computer graphics in learning processes; activities relating to computer graphics; the role of the teacher; and student…

  6. The algorithmic level is the bridge between computation and brain.

    PubMed

    Love, Bradley C

    2015-04-01

    Every scientist chooses a preferred level of analysis and this choice shapes the research program, even determining what counts as evidence. This contribution revisits Marr's (1982) three levels of analysis (implementation, algorithmic, and computational) and evaluates the prospect of making progress at each individual level. After reviewing limitations of theorizing within a level, two strategies for integration across levels are considered. One is top-down in that it attempts to build a bridge from the computational to algorithmic level. Limitations of this approach include insufficient theoretical constraint at the computational level to provide a foundation for integration, and that people are suboptimal for reasons other than capacity limitations. Instead, an inside-out approach is forwarded in which all three levels of analysis are integrated via the algorithmic level. This approach maximally leverages mutual data constraints at all levels. For example, algorithmic models can be used to interpret brain imaging data, and brain imaging data can be used to select among competing models. Examples of this approach to integration are provided. This merging of levels raises questions about the relevance of Marr's tripartite view.

  7. Teaching Mineral-Identification Skills Using an Expert System Computer Program Incorporating Digitized Video Images.

    ERIC Educational Resources Information Center

    Diemer, John Andrew; And Others

    1989-01-01

    Describes a computer program which represents the first merger of an expert system and imaging technology into a tool to assist in learning mineral identification. Presents a discussion of the system's development, evaluation of effectiveness, and future modifications. (RT)

  8. 7 CFR 1.427 - Filing; identification of parties of record; service; and computation of time.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... Review of Sourcing Areas Pursuant to the Forest Resources Conservation and Shortage Relief Act of 1990 (16 U.S.C. 620 et seq.) § 1.427 Filing; identification of parties of record; service; and computation...

  9. 7 CFR 1.427 - Filing; identification of parties of record; service; and computation of time.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... Review of Sourcing Areas Pursuant to the Forest Resources Conservation and Shortage Relief Act of 1990 (16 U.S.C. 620 et seq.) § 1.427 Filing; identification of parties of record; service; and computation...

  10. 7 CFR 1.427 - Filing; identification of parties of record; service; and computation of time.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... Review of Sourcing Areas Pursuant to the Forest Resources Conservation and Shortage Relief Act of 1990 (16 U.S.C. 620 et seq.) § 1.427 Filing; identification of parties of record; service; and computation...

  11. 7 CFR 1.427 - Filing; identification of parties of record; service; and computation of time.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... Review of Sourcing Areas Pursuant to the Forest Resources Conservation and Shortage Relief Act of 1990 (16 U.S.C. 620 et seq.) § 1.427 Filing; identification of parties of record; service; and computation...

  12. Investigating a Measure of Computer Technology Domain Identification: A Tool for Understanding Gender Differences and Stereotypes

    ERIC Educational Resources Information Center

    Smith, Jessi L.; Morgan, Carolyn L.; White, Paul H.

    2005-01-01

    The aim of this project is to further examine the construct of domain identification (i.e., a person's positive phenomenological experiences with, and perceived self-relevance of, a domain), specifically as it applies to computer technology (CT). The authors model a known measure of math identification to first develop a measure of CT…

  13. Computer ethics and tertiary level education in Hong Kong

    SciTech Connect

    Wong, E.Y.W.; Davison, R.M.; Wade, P.W.

    1994-12-31

    This paper seeks to highlight some ethical issues relating to the increasing proliferation of Information Technology into our everyday lives. The authors explain their understanding of computer ethics, and give some reasons why the study of computer ethics is becoming increasingly pertinent. The paper looks at some of the problems that arise in attempting to develop appropriate ethical concepts in a constantly changing environment, and explores some of the ethical dilemmas arising from the increasing use of computers. Some initial research undertaken to explore the ideas and understanding of tertiary level students in Hong Kong on a number of ethical issues of interest is described, and our findings discussed. We hope that presenting this paper and eliciting subsequent discussion will enable us to draw up more comprehensive guidelines for the teaching of computer related ethics to tertiary level students, as well as reveal some directions for future research.

  14. Computing NLTE Opacities -- Node Level Parallel Calculation

    SciTech Connect

    Holladay, Daniel

    2015-09-11

    Presentation. The goal: to produce a robust library capable of computing reasonably accurate opacities inline, with the assumption of LTE relaxed (i.e., non-LTE). Near term: demonstrate acceleration of non-LTE opacity computation. Far term (if funded): connect to application codes with inline capability and compute opacities. Study science problems. Use efficient algorithms that expose many levels of parallelism and utilize good memory access patterns for use on advanced architectures. Portability to multiple types of hardware, including multicore processors, manycore processors such as KNL, GPUs, etc. Easily coupled to radiation hydrodynamics and thermal radiative transfer codes.

  15. A Program for the Identification of the Enterobacteriaceae for Use in Teaching the Principles of Computer Identification of Bacteria.

    ERIC Educational Resources Information Center

    Hammonds, S. J.

    1990-01-01

    A technique for the numerical identification of bacteria using normalized likelihoods calculated from a probabilistic database is described, and the principles of the technique are explained. The listing of the computer program is included. Specimen results from the program, and examples of how they should be interpreted, are given. (KR)
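
    The normalized-likelihood idea described above can be sketched directly: for each taxon, multiply the database probabilities of the observed test results, then normalize across taxa. The probability matrix, taxa, and tests below are invented for illustration; they are not the program's database.

```python
# Willcox-style probabilistic identification sketch (invented data).
# P[taxon][test] = probability that the taxon gives a positive result.
P = {
    "E. coli":       {"indole": 0.99, "citrate": 0.01, "urease": 0.01},
    "K. pneumoniae": {"indole": 0.01, "citrate": 0.95, "urease": 0.90},
    "P. mirabilis":  {"indole": 0.02, "citrate": 0.60, "urease": 0.98},
}

def normalized_likelihoods(results):
    """results maps test name -> True (positive) / False (negative).
    Returns each taxon's likelihood divided by the sum over all taxa."""
    raw = {}
    for taxon, probs in P.items():
        L = 1.0
        for test, positive in results.items():
            p = probs[test]
            L *= p if positive else (1.0 - p)
        raw[taxon] = L
    total = sum(raw.values())
    return {t: L / total for t, L in raw.items()}

scores = normalized_likelihoods({"indole": False, "citrate": True, "urease": True})
best = max(scores, key=scores.get)
print(best, round(scores[best], 3))  # K. pneumoniae 0.595
```

    In schemes of this kind, the top normalized score is typically compared against a high acceptance threshold (0.999 in the classical Willcox formulation) before an identification is reported; a middling score like the one above would instead prompt further tests.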

  16. Identification of control targets in Boolean molecular network models via computational algebra.

    PubMed

    Murrugarra, David; Veliz-Cuba, Alan; Aguilar, Boris; Laubenbacher, Reinhard

    2016-09-23

    Many problems in biomedicine and other areas of the life sciences can be characterized as control problems, with the goal of finding strategies to change a disease or otherwise undesirable state of a biological system into another, more desirable, state through an intervention, such as a drug or other therapeutic treatment. The identification of such strategies is typically based on a mathematical model of the process to be altered through targeted control inputs. This paper focuses on processes at the molecular level that determine the state of an individual cell, involving signaling or gene regulation. The mathematical model type considered is that of Boolean networks. The potential control targets can be represented by a set of nodes and edges that can be manipulated to produce a desired effect on the system. This paper presents a method for the identification of potential intervention targets in Boolean molecular network models using algebraic techniques. The approach exploits an algebraic representation of Boolean networks to encode the control candidates in the network wiring diagram as the solutions of a system of polynomial equations, and then uses computational algebra techniques to find such controllers. The control methods in this paper are validated through the identification of combinatorial interventions in the signaling pathways of previously reported control targets in two well-studied systems, a p53-mdm2 network and a blood T cell lymphocyte granular leukemia survival signaling network. Supplementary data are available online and our code in Macaulay2 and Matlab is available via http://www.ms.uky.edu/~dmu228/ControlAlg . This paper presents a novel method for the identification of intervention targets in Boolean network models. The results in this paper show that the proposed methods are useful and efficient for moderately large networks.
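
    A minimal sketch of the underlying algebraic encoding (in Python, not the paper's Macaulay2 code): Boolean functions become polynomials over F2 (AND is x*y, OR is x+y+x*y, NOT is 1+x), a control variable u is added to an edge, and one searches for values of u that force a desired fixed point. The two-gene network and target state are invented for illustration.

```python
# Edge-control sketch on a toy 2-gene Boolean network, encoded over F2.
from itertools import product

AND = lambda x, y: x * y % 2           # Boolean AND as a polynomial over F2
OR  = lambda x, y: (x + y + x * y) % 2 # Boolean OR  (shown for completeness)
NOT = lambda x: (1 + x) % 2            # Boolean NOT

def step(x1, x2, u):
    """Toy network with one control variable u on the edge into gene 2."""
    f1 = NOT(x2)              # x1' = NOT x2
    f2 = AND(x1, NOT(u))      # x2' = x1 AND NOT u  (u = 1 severs the edge)
    return f1, f2

def reaches(u, start, target, steps=8):
    """Does iterating the network from 'start' land on 'target'?"""
    s = start
    for _ in range(steps):
        s = step(*s, u)
    return s == target

target = (1, 0)               # desired steady state
controls = [u for u in (0, 1)
            if step(*target, u) == target
            and all(reaches(u, s, target) for s in product((0, 1), repeat=2))]
print(controls)               # [1]: severing the edge stabilizes (1, 0)
```

    Without control (u = 0) the toy network cycles through all four states; setting u = 1 makes (1, 0) a globally attracting fixed point, which is the kind of candidate the polynomial-system solving identifies at scale.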

  17. Identification of proteins with increased levels in ameloblastic carcinoma.

    PubMed

    García-Muñoz, Alejandro; Bologna-Molina, Ronell; Aldape-Barrios, Beatriz; Licéaga-Escalera, Carlos; Montoya-Pérez, Luis A; Rodríguez, Mario A

    2014-06-01

    The comparative proteomic approach by a combination of 2-dimensional electrophoresis and matrix-assisted laser desorption-ionization-time of flight mass spectrometry (MS) analysis is an attractive strategy for the discovery of cancer biomarkers and therapeutic targets. The identification of protein biomarkers associated with ameloblastic carcinoma (AC), a malignant epithelial odontogenic tumor, will potentially improve the diagnostic and prognostic accuracy for this malignant neoplasm. The aim of the present study was to identify highly expressed proteins in AC that could be considered as potential biomarkers. The protein profile of an AC was compared with the protein profiles of 3 cases of benign ameloblastoma. Proteins that showed increased levels in AC were identified using MS, and the augmented amount of some of these proteins in the malignant lesion was confirmed by Western blot or immunohistochemistry. We detected a total of 782 spots in the protein profile of AC, and 19 of them, showing elevated levels compared with benign ameloblastoma, were identified using MS. These proteins have been implicated in several cellular functions, such as cell structure, metabolism, stress response, and signal transduction. The increased expression of the identified proteins and the minor expression of some proteins that might inhibit tumor progression could be involved in the evolution from a benign lesion to carcinoma. Copyright © 2014 American Association of Oral and Maxillofacial Surgeons. Published by Elsevier Inc. All rights reserved.

  18. Dysregulation in level of goal and action identification across psychological disorders

    PubMed Central

    Watkins, Edward

    2011-01-01

    Goals, events, and actions can be mentally represented within a hierarchical framework that ranges from more abstract to more concrete levels of identification. A more abstract level of identification involves general, superordinate, and decontextualized mental representations that convey the meaning of goals, events, and actions, “why” an action is performed, and its purpose, ends, and consequences. A more concrete level of identification involves specific and subordinate mental representations that include contextual details of goals, events, and actions, and the specific “how” details of an action. This review considers three lines of evidence for considering that dysregulation of level of goal/action identification may be a transdiagnostic process. First, there is evidence that different levels of identification have distinct functional consequences and that in non-clinical samples level of goal/action identification appears to be regulated in a flexible and adaptive way to match the level of goal/action identification to circumstances. Second, there is evidence that level of goal/action identification causally influences symptoms and processes involved in psychological disorders, including emotional response, repetitive thought, impulsivity, problem solving and procrastination. Third, there is evidence that the level of goal/action identification is biased and/or dysregulated in certain psychological disorders, with a bias towards more abstract identification for negative events in depression, GAD, PTSD, and social anxiety. PMID:20579789

  19. Computational simulation of drug delivery at molecular level.

    PubMed

    Li, Youyong; Hou, Tingjun

    2010-01-01

    The field of drug delivery is advancing rapidly. By controlling the precise level and/or location of a given drug in the body, side effects are reduced, doses are lowered, and new therapies are possible. Nonetheless, substantial challenges remain for delivering specific drugs into specific cells. Computational methods to predict the binding and dynamics between drug molecule and its carrier are increasingly desirable to minimize the investment in drug design and development. Significant progress in computational simulation is making it possible to understand the mechanism of drug delivery. This review summarizes the computational methods and progress of four categories of drug delivery systems: dendrimers, polymer micelle, liposome and carbon nanotubes. Computational simulations are particularly valuable in designing better drug carriers and addressing issues that are difficult to be explored by laboratory experiments, such as diffusion, dynamics, etc.

  20. Variance and bias computation for enhanced system identification

    NASA Technical Reports Server (NTRS)

    Bergmann, Martin; Longman, Richard W.; Juang, Jer-Nan

    1989-01-01

    A study is made of the use of a series of variance and bias confidence criteria recently developed for the eigensystem realization algorithm (ERA) identification technique. The criteria are shown to be very effective, not only for indicating the accuracy of the identification results (especially in terms of confidence intervals), but also for helping the ERA user to obtain better results. They help determine the best sample interval, the true system order, how much data to use and whether to introduce gaps in the data used, what dimension Hankel matrix to use, and how to limit the bias or correct for bias in the estimates.
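
    The ERA step that these confidence criteria attach to can be sketched compactly. This illustration uses an invented noise-free first-order system and shows only the Hankel-matrix/SVD realization itself, not the variance and bias criteria studied in the paper.

```python
# Eigensystem Realization Algorithm (ERA) core step on exact impulse data.
import numpy as np

# Impulse response (Markov parameters) of a known first-order SISO system
# with pole 0.8: y_k = 0.8**k for this choice of B and C.
Y = [0.8**k for k in range(1, 12)]

def era(Y, rows=4, cols=4, order=1):
    """Build shifted Hankel matrices, SVD-truncate, and realize A."""
    H0 = np.array([[Y[i + j] for j in range(cols)] for i in range(rows)])
    H1 = np.array([[Y[i + j + 1] for j in range(cols)] for i in range(rows)])
    U, s, Vt = np.linalg.svd(H0)
    Ur, Vr = U[:, :order], Vt[:order].T
    Si = np.diag(s[:order] ** -0.5)
    return Si @ Ur.T @ H1 @ Vr @ Si   # realized state matrix A

A = era(Y)
print(np.linalg.eigvals(A).real.round(3))  # recovers the pole at 0.8
```

    With noisy data the identified pole fluctuates, which is exactly where the variance and bias confidence criteria discussed above come in: they quantify how much to trust such an estimate for a given sample interval, model order, and Hankel dimension.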

  1. Contours identification of elements in a cone beam computed tomography for investigating maxillary cysts

    NASA Astrophysics Data System (ADS)

    Chioran, Doina; Nicoarǎ, Adrian; Roşu, Şerban; Cǎrligeriu, Virgil; Ianeş, Emilia

    2013-10-01

    Digital processing of two-dimensional cone beam computed tomography slices starts with the identification of the contour of elements within. This paper deals with the collective work of specialists in medicine and applied mathematics in computer science on the elaboration and implementation of algorithms in dental 2D imagery.

  2. The Reality of Computers at the Community College Level.

    ERIC Educational Resources Information Center

    Leone, Stephen J.

    Writing teachers at the community college level who teach using a computer have come to accept the fact that it is more than "just teaching" composition. Such teaching often requires instructors to be as knowledgeable as some of the technicians. Two-year college students and faculty are typically given little support in using computers…

  3. Three-level Teaching Material for Computer-Aided Lecturing.

    ERIC Educational Resources Information Center

    Sajaniemi, Jorma; Kuittinen, Marja

    1999-01-01

    Presents the concept of three-level teaching material and describes a computer-based system SHOW (Sandwich Hierarchy for Omnifarious Ware) that can be used to produce the different forms from a single source, and to show the presentation version in a lecturing situation. Describes an empirical experiment on text discrimination techniques; suggests…

  4. Mathematical Level Raising through Collaborative Investigations with the Computer

    ERIC Educational Resources Information Center

    Pijls, Monique; Dekker, Rijkje; van Hout-Wolters, Bernadette

    2003-01-01

    Investigations with the computer can have different functions in the mathematical learning process, such as to let students explore a subject domain, to guide the process of reinvention, or to give them the opportunity to apply what they have learned. Which function has most effect on mathematical level raising? We investigated that question in…

  5. OS friendly microprocessor architecture: Hardware level computer security

    NASA Astrophysics Data System (ADS)

    Jungwirth, Patrick; La Fratta, Patrick

    2016-05-01

    We present an introduction to the patented OS Friendly Microprocessor Architecture (OSFA) and hardware-level computer security. Conventional microprocessors have not tried to balance hardware performance and OS performance at the same time; they have depended on the operating system for computer security and information assurance. The goal of the OS Friendly Architecture is to provide a high-performance and secure microprocessor and OS system. We are interested in cyber security, information technology (IT), and SCADA control professionals reviewing the hardware-level security features. The OS Friendly Architecture is a switched set of cache memory banks in a pipeline configuration. For lightweight threads, the memory pipeline configuration provides near-instantaneous context switching times. The pipelining and parallelism provided by the cache memory pipeline allow background cache read and write operations while the microprocessor's execution pipeline is running instructions. The cache bank selection controllers provide arbitration to prevent the memory pipeline and the microprocessor's execution pipeline from accessing the same cache bank at the same time. This separation allows cache memory pages to transfer to and from level 1 (L1) caching while the microprocessor pipeline is executing instructions. Computer security operations are implemented in hardware. By extending Unix file permission bits to each cache memory bank and memory address, the OSFA provides hardware-level computer security.

  6. Computational model for supporting SHM systems design: Damage identification via numerical analyses

    NASA Astrophysics Data System (ADS)

    Sartorato, Murilo; de Medeiros, Ricardo; Vandepitte, Dirk; Tita, Volnei

    2017-02-01

    This work presents a computational model to simulate thin structures monitored by piezoelectric sensors in order to support the design of SHM systems that use vibration-based methods. A new shell finite element model was proposed and implemented via a User ELement subroutine (UEL) in the commercial package ABAQUS™. This model was based on a modified First Order Shear Theory (FOST) for piezoelectric composite laminates. Damaged cantilever beams with two piezoelectric sensors in different positions were then investigated using experimental analyses and the proposed computational model. A maximum difference in the magnitude of the FRFs between numerical and experimental analyses of 7.45% was found near the resonance regions. For damage identification, different levels of damage severity were evaluated by seven damage metrics, including one proposed by the present authors. Numerical and experimental damage metric values were compared, showing a good correlation in terms of tendency. Finally, based on comparisons of numerical and experimental results, the potential and limitations of the proposed computational model for supporting SHM systems design are discussed.

  7. Domain identification in impedance computed tomography by spline collocation method

    NASA Technical Reports Server (NTRS)

    Kojima, Fumio

    1990-01-01

    A method for estimating an unknown domain in elliptic boundary value problems is considered. The problem is formulated as an inverse problem of integral equations of the second kind. A computational method is developed using a spline collocation scheme. The results can be applied to the inverse problem of impedance computed tomography (ICT) for image reconstruction.

  8. Level of Identification as a Predictor of Attitude Change.

    ERIC Educational Resources Information Center

    Williams, Robert H.; Williams, Sharon Ann

    1987-01-01

    Discussion of conditions under which simulation games promote changes in attitudes focuses on identification theory as a predictor of attitude change. Incentive theory and cognitive dissonance theory are discussed, and a study of community college students is described that tested the role of identification in changing attitudes. (LRW)

  9. Identification of condition-specific regulatory modules through multi-level motif and mRNA expression analysis

    PubMed Central

    Chen, Li; Wang, Yue; Hoffman, Eric P.; Riggins, Rebecca B.; Clarke, Robert

    2013-01-01

    Many computational methods for the identification of transcription regulatory modules produce many false positives in practice, owing to noise in both binding information and gene expression profiling data. In this paper, we propose a multi-level strategy for condition-specific gene regulatory module identification that integrates motif binding information and gene expression data through support vector regression and significance analysis. We have demonstrated the feasibility of the proposed method on a yeast cell cycle data set. A study on a breast cancer microarray data set shows that it can successfully identify significant and reliable regulatory modules associated with breast cancer. PMID:20054984

  10. Logic as Marr's Computational Level: Four Case Studies.

    PubMed

    Baggio, Giosuè; van Lambalgen, Michiel; Hagoort, Peter

    2015-04-01

    We sketch four applications of Marr's levels-of-analysis methodology to the relations between logic and experimental data in the cognitive neuroscience of language and reasoning. The first part of the paper illustrates the explanatory power of computational level theories based on logic. We show that a Bayesian treatment of the suppression task in reasoning with conditionals is ruled out by EEG data, supporting instead an analysis based on defeasible logic. Further, we describe how results from an EEG study on temporal prepositions can be reanalyzed using formal semantics, addressing a potential confound. The second part of the article demonstrates the predictive power of logical theories drawing on EEG data on processing progressive constructions and on behavioral data on conditional reasoning in people with autism. Logical theories can constrain processing hypotheses all the way down to neurophysiology, and conversely neuroscience data can guide the selection of alternative computational level models of cognition.

  11. Root morphology and anatomical patterns in forensic dental identification: a comparison of computer-aided identification with traditional forensic dental identification.

    PubMed

    van der Meer, Dirk T; Brumit, Paula C; Schrader, Bruce A; Dove, Stephen B; Senn, David R

    2010-11-01

    An online forensic dental identification exercise was conducted involving 24 antemortem-postmortem (AM-PM) dental radiograph pairs from actual forensic identification cases. Images had been digitally cropped to remove coronal tooth structure and dental restorations. Volunteer forensic odontologists were passively recruited to compare the AM-PM dental radiographs online and conclude identification status using the guidelines for identification from the American Board of Forensic Odontology. The mean accuracy rate for identification was 86.0% (standard deviation 9.2%). The same radiograph pairs were compared using a digital imaging software algorithm, which generated a normalized coefficient of similarity for each pair. Twenty of the radiograph pairs generated a mean accuracy of 85.0%. Four of the pairs could not be used to generate a coefficient of similarity. Receiver operating characteristic (ROC) curve and area-under-the-curve (AUC) analysis confirmed the good discrimination abilities of both methods (online exercise = 0.978; UT-ID index = 0.923), and Spearman's rank correlation coefficient analysis (0.683) indicated good correlation between the results of both methods. Computer-aided dental identification allows for an objective comparison of AM-PM radiographs and can be a useful tool to support a forensic dental identification conclusion.
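
    The AUC figures quoted above have a simple rank-based reading: the probability that a true-match AM-PM pair scores a higher similarity coefficient than a non-match pair. The sketch below uses the Mann-Whitney formulation with invented similarity scores, not the study's data.

```python
# AUC from similarity coefficients via pairwise rank comparison (toy data).
match =    [0.91, 0.84, 0.88, 0.79, 0.95, 0.69]   # same individual
nonmatch = [0.55, 0.62, 0.48, 0.71, 0.66, 0.30]   # different individuals

def auc(pos, neg):
    """P(random match pair outscores random non-match pair); ties count 1/2."""
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

print(round(auc(match, nonmatch), 3))  # 0.972: one non-match outscores one match
```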

  12. Identification of the pleural fissures with computed tomography

    SciTech Connect

    Marks, B.W.; Kuhns, L.R.

    1982-04-01

The pleural fissures can be identified as avascular planes within the pulmonary parenchyma on CT scans. A retrospective analysis of 23 consecutive scans was conducted to assess identification of the fissures. On 21% of the axial images, a "ground glass" band was identified within the avascular plane, probably due to partial volume averaging of the pleural fissure with the adjacent lung. The pleural fissures could be identified in 84% of cases.

  13. Firearms Identification Using Pattern Analysis and Computational Modeling.

    DTIC Science & Technology

    1995-05-09

evidence in court to convict the perpetrators of the crime. There are many important areas in forensic science, such as firearms identification… significant impact on the world of forensic science. Dr. Hall had found time during his medical practice to do research on bullet wounds. The… more common. However, in forensic science, the advantages of modern technology are not fully exploited, and there is a tremendous potential for progress

  14. Data identification for improving gene network inference using computational algebra.

    PubMed

    Dimitrova, Elena; Stigler, Brandilyn

    2014-11-01

Identification of models of gene regulatory networks is sensitive to the amount of data used as input. Considering the substantial costs of conducting experiments, it is valuable to have an estimate of the amount of data required to infer the network structure. To minimize wasted resources, it is also beneficial to know which data are necessary to identify the network. In model identification, knowledge of the data and of the terms in polynomial models is often required a priori. In applications, the structure of a polynomial model is unlikely to be known, which may force data sets to be unnecessarily large in order to identify a model. Furthermore, none of the known results provides any strategy for constructing data sets that uniquely identify a model. We provide a specialization of an existing criterion for deciding when a set of data points identifies a minimal polynomial model once its monomial terms have been specified. We then relax the requirement of knowing the monomials and present results for model identification given only the data. Finally, we present a method for constructing data sets that identify minimal polynomial models.
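One way to picture the identifiability criterion described here: a data set determines a polynomial model with a specified monomial support exactly when the matrix of those monomials evaluated at the data points has full column rank. The sketch below is a toy linear-algebra illustration of that idea (the monomial set and data points are invented), not the authors' computational-algebra procedure.

```python
from fractions import Fraction

def rank(matrix):
    """Matrix rank via Gaussian elimination over the rationals."""
    m = [[Fraction(x) for x in row] for row in matrix]
    r = 0
    for col in range(len(m[0])):
        pivot = next((i for i in range(r, len(m)) if m[i][col] != 0), None)
        if pivot is None:
            continue
        m[r], m[pivot] = m[pivot], m[r]
        for i in range(len(m)):
            if i != r and m[i][col] != 0:
                f = m[i][col] / m[r][col]
                m[i] = [a - f * b for a, b in zip(m[i], m[r])]
        r += 1
    return r

def identifies(points, monomials):
    """True iff the data points uniquely determine the coefficients of a
    model supported on the given monomials (full column rank)."""
    evaluation_matrix = [[mono(*p) for mono in monomials] for p in points]
    return rank(evaluation_matrix) == len(monomials)

# Hypothetical model support {1, x, x*y} and two candidate data sets.
monomials = [lambda x, y: 1, lambda x, y: x, lambda x, y: x * y]
print(identifies([(0, 0), (1, 0), (1, 1)], monomials))  # True
print(identifies([(0, 0), (0, 1), (0, 2)], monomials))  # False: x = 0 everywhere
```

The second data set fails because every point has x = 0, so the monomials x and x·y are indistinguishable from the zero function on the data.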

  15. Use of a Computer-Assisted Identification System in the Identification of the Remains of Deceased USAF Personnel.

    DTIC Science & Technology

    1988-04-01

1987 he completed a Master of Arts degree in Computer Resource Management. Major Triplett also graduated from AFIP's Forensic Odontology Course and… Thomas, 1973. Articles and Periodicals: 4. Brannon, Lawrence S., Capt, USA. "Forensic Odontology: An Application for the Army Dentist." Military… Military Medicine, Vol. 146 (April 1981), pp. 262-264. 12. Kim, Ho Wohn. "The Role of Forensic Odontology in the Field of Human Identification

  16. Computing Bounds on Resource Levels for Flexible Plans

    NASA Technical Reports Server (NTRS)

Muscettola, Nicola; Rijsman, David

    2009-01-01

A new algorithm efficiently computes the tightest exact bound on the levels of resources induced by a flexible activity plan (see figure). Tightness of bounds is extremely important for computations involved in planning because tight bounds can save potentially exponential amounts of search (through early backtracking and detection of solutions), relative to looser bounds. The bound computed by the new algorithm, denoted the resource-level envelope, constitutes the measure of maximum and minimum consumption of resources at any time over all fixed-time schedules in the flexible plan. At each time, the envelope guarantees that there are two fixed-time instantiations: one that produces the minimum level and one that produces the maximum level. Therefore, the resource-level envelope is the tightest possible resource-level bound for a flexible plan, because any tighter bound would exclude the contribution of at least one fixed-time schedule. If the resource-level envelope can be computed efficiently, it could replace the looser bounds currently used in the inner cores of constraint-posting scheduling algorithms, with the potential for great improvements in performance. What is needed to reduce the cost of computation is an algorithm whose complexity is no greater than a low-degree polynomial in N (where N is the number of activities). The new algorithm satisfies this need. In this algorithm, the computation of resource-level envelopes is based on a novel combination of (1) the theory of shortest paths in the temporal-constraint network for the flexible plan and (2) the theory of maximum flows for a flow network derived from the temporal and resource constraints.
The asymptotic complexity of the algorithm is O(N · maxflow(N)), where maxflow(N) denotes the cost of a maximum-flow computation on a network derived from a plan with N activities.
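The maximum-flow subroutine the complexity bound refers to can be sketched with a standard Edmonds-Karp implementation. This is generic textbook code, not the authors' envelope algorithm, and the 4-node capacity network below is invented.

```python
from collections import deque

def max_flow(capacity, source, sink):
    """Edmonds-Karp: repeatedly augment along shortest residual paths."""
    n = len(capacity)
    flow = [[0] * n for _ in range(n)]
    total = 0
    while True:
        # BFS for a shortest augmenting path in the residual graph.
        parent = [-1] * n
        parent[source] = source
        queue = deque([source])
        while queue and parent[sink] == -1:
            u = queue.popleft()
            for v in range(n):
                if parent[v] == -1 and capacity[u][v] - flow[u][v] > 0:
                    parent[v] = u
                    queue.append(v)
        if parent[sink] == -1:          # no augmenting path left
            return total
        # Recover the path, find its bottleneck, and push flow along it.
        path, v = [], sink
        while v != source:
            path.append((parent[v], v))
            v = parent[v]
        bottleneck = min(capacity[u][v] - flow[u][v] for u, v in path)
        for u, v in path:
            flow[u][v] += bottleneck
            flow[v][u] -= bottleneck    # residual capacity bookkeeping
        total += bottleneck

# Toy 4-node network: 0 = source, 3 = sink.
cap = [[0, 3, 2, 0],
       [0, 0, 1, 2],
       [0, 0, 0, 2],
       [0, 0, 0, 0]]
print(max_flow(cap, 0, 3))  # 4
```

Edmonds-Karp runs in O(V · E²); the record's point is that embedding such a polynomial-cost max-flow inside an O(N)-iteration loop keeps the whole envelope computation polynomial.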

  17. Computer-guided drug repurposing: identification of trypanocidal activity of clofazimine, benidipine and saquinavir.

    PubMed

    Bellera, Carolina L; Balcazar, Darío E; Vanrell, M Cristina; Casassa, A Florencia; Palestro, Pablo H; Gavernet, Luciana; Labriola, Carlos A; Gálvez, Jorge; Bruno-Blanch, Luis E; Romano, Patricia S; Carrillo, Carolina; Talevi, Alan

    2015-03-26

In spite of remarkable advances in knowledge of Trypanosoma cruzi biology, no medications to treat Chagas disease have been approved in the last 40 years and almost 8 million people remain infected. Since the public sector and non-profit organizations play a significant role in research efforts on Chagas disease, it is important to implement research strategies that promote the translation of basic research into clinical practice. Recent international public-private initiatives address the potential of drug repositioning (i.e., finding second or further medical uses for known medications), which can substantially improve the success rate at clinical trials and innovation in the pharmaceutical field. In this work, we present the computer-aided identification of the approved drugs clofazimine, benidipine and saquinavir as potential trypanocidal compounds and test their effects at both the biochemical and the cellular level on different parasite stages. On the basis of the results obtained, we discuss the biopharmaceutical, toxicological and physiopathological criteria applied in deciding to move clofazimine and benidipine into the preclinical phase, in an acute model of infection. The article illustrates the potential of computer-guided drug repositioning to integrate and optimize drug discovery and preclinical development; it also proposes rational rules to select which repositioned candidates should advance to investigational drug status, and offers new insight on clofazimine and benidipine as candidate treatments for Chagas disease. One Sentence Summary: We present the computer-guided drug repositioning of three approved drugs as potential new treatments for Chagas disease, integrating computer-aided drug screening and biochemical, cellular and preclinical tests. Copyright © 2015 Elsevier Masson SAS. All rights reserved.

  18. Design of binary subwavelength multiphase level computer generated holograms.

    PubMed

    Freese, Wiebke; Kämpfe, Thomas; Kley, Ernst-Bernhard; Tünnermann, Andreas

    2010-03-01

The ability of subwavelength structures to create an artificial effective index opens up new perspectives in designing highly efficient diffractive optical elements. We demonstrate a design approach for binary multiphase-level computer-generated holograms based on effective medium theory. The phase pattern is formed by various subwavelength structures, each imparting a specific phase delay to an incident light wave. This binary structure approach leads to a significant cost reduction by simplifying the fabrication process. For demonstration, a three-phase-level element operating in the visible range is fabricated and experimentally evaluated.
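The "artificial effective index" of a binary subwavelength grating can be estimated with the standard zeroth-order effective-medium (Rytov) formulas, which average the permittivities of ridge and groove by fill factor. The material indices and fill factor below are assumed values for illustration, not parameters from this record.

```python
import math

def n_eff_te(fill, n_ridge, n_groove):
    """Zeroth-order EMT, TE polarization (E-field parallel to the grating
    lines): arithmetic mean of the permittivities."""
    return math.sqrt(fill * n_ridge**2 + (1 - fill) * n_groove**2)

def n_eff_tm(fill, n_ridge, n_groove):
    """Zeroth-order EMT, TM polarization: harmonic mean of the
    permittivities."""
    return math.sqrt(1.0 / (fill / n_ridge**2 + (1 - fill) / n_groove**2))

# Assumed values: fused-silica ridges (n = 1.46) in air, fill factor 0.5.
print(n_eff_te(0.5, 1.46, 1.0), n_eff_tm(0.5, 1.46, 1.0))
```

Sweeping the fill factor between 0 and 1 sweeps the effective index between the two material indices, which is what lets a binary structure emulate multiple phase levels.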

  19. A new computer-assisted technique to aid personal identification.

    PubMed

    De Angelis, Danilo; Sala, Remo; Cantatore, Angela; Grandi, Marco; Cattaneo, Cristina

    2009-07-01

The paper describes a procedure aimed at identification from two-dimensional (2D) images (video-surveillance tapes, for example) by comparison with a three-dimensional (3D) facial model of a suspect. The application is intended to provide a tool that can help in analyzing compatibility or incompatibility between a criminal's and a suspect's facial traits. The authors apply the concept of "geometrically compatible images": a scanner is used to reconstruct a 3D facial model of the suspect, which is compared to a frame extracted from the video-surveillance sequence showing the face of the perpetrator. After automatic resizing, the 3D model is manually repositioned and reoriented to match the suspect's face as framed in the crime-scene image. Repositioning and reorientation are performed using corresponding anthropometric landmarks, distinctive for that person and detected both on the 2D face and on the 3D model. In this way, a superimposition between the original two-dimensional facial image and the three-dimensional one is obtained, and a judgment is formulated by an expert on the basis of the fit between the anatomical facial districts of the two subjects. The procedure reduces the influence of face orientation and may be a useful tool in identification.
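The resizing-and-repositioning step on corresponding landmarks can be sketched as a least-squares 2D similarity transform (scale, rotation, translation). Representing landmarks as complex numbers reduces the fit to a one-line linear regression; the landmark coordinates below are invented, and this is only an in-plane simplification of the paper's 2D-3D superimposition.

```python
def fit_similarity(src, dst):
    """Least-squares 2D similarity transform mapping src landmarks onto
    dst, with each point encoded as a complex number z = x + y*1j."""
    n = len(src)
    src_mean = sum(src) / n
    dst_mean = sum(dst) / n
    s = [z - src_mean for z in src]          # centered source landmarks
    d = [z - dst_mean for z in dst]          # centered target landmarks
    # Complex regression coefficient a encodes scale * exp(i * rotation).
    a = sum(dz * sz.conjugate() for sz, dz in zip(s, d)) / \
        sum(abs(z) ** 2 for z in s)
    b = dst_mean - a * src_mean              # translation
    return lambda z: a * z + b

# Invented landmark sets: dst is src scaled by 2 and shifted by (1, 1).
src = [0 + 0j, 1 + 0j, 0 + 1j, 1 + 1j]
dst = [2 * z + (1 + 1j) for z in src]
align = fit_similarity(src, dst)
print(align(0.5 + 0.5j))  # (2+2j)
```

With noisy landmarks, the same formula returns the similarity transform minimizing the summed squared landmark misfit, which is exactly the "automatic resizing" role in the procedure.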

  20. GC/IR computer-aided identification of anaerobic bacteria

    NASA Astrophysics Data System (ADS)

    Ye, Hunian; Zhang, Feng S.; Yang, Hua; Li, Zhu; Ye, Song

    1993-09-01

A new method was developed to identify anaerobic bacteria by using pattern recognition. The method depends on GC/IR data. The system is intended for use as a precise, rapid and reproducible aid in the identification of unknown isolates. Key words: anaerobic bacteria; pattern recognition; computer-aided identification; GC/IR. 1. INTRODUCTION: A major problem in the field of anaerobic bacteriology is the difficulty in accurately, precisely and rapidly identifying unknown isolates. In the proceedings of the Third International Symposium on Rapid Methods and Automation in Microbiology, C. M. Moss said: "Chromatographic analysis is a new future for clinical microbiology." Twelve years have passed, and so far it seems that this is an idea whose time has not yet come, but it is close. Two major advances that have brought the technology forward, in terms of making it appropriate for use in the clinical laboratory, can now be cited. One is the development and implementation of fused silica capillary columns. In contrast to packed columns and those of greater width, these columns allow reproducible recovery of hydroxy fatty acids with the same carbon chain length. The second advance is the efficient data processing afforded by modern microcomputer systems. The practical steps for sample preparation are also an advance in the clinical laboratory. Chromatographic analysis here means mainly analysis of fatty acids. The most common…

  1. A Simple Computer Application for the Identification of Conifer Genera

    ERIC Educational Resources Information Center

    Strain, Steven R.; Chmielewski, Jerry G.

    2010-01-01

    The National Science Education Standards prescribe that an understanding of the importance of classifying organisms be one component of a student's educational experience in the life sciences. The use of a classification scheme to identify organisms is one way of addressing this goal. We describe Conifer ID, a computer application that assists…

  3. All-memristive neuromorphic computing with level-tuned neurons

    NASA Astrophysics Data System (ADS)

    Pantazi, Angeliki; Woźniak, Stanisław; Tuma, Tomas; Eleftheriou, Evangelos

    2016-09-01

In the new era of cognitive computing, systems will be able to learn and interact with the environment in ways that will drastically enhance the capabilities of current processors, especially in extracting knowledge from the vast amounts of data obtained from many sources. Brain-inspired neuromorphic computing systems increasingly attract research interest as an alternative to the classical von Neumann processor architecture, mainly because of the coexistence of memory and processing units. In these systems, the basic components are neurons interconnected by synapses. The neurons, based on their nonlinear dynamics, generate spikes that provide the main communication mechanism. The computational tasks are distributed across the neural network, where synapses implement both the memory and the computational units, by means of learning mechanisms such as spike-timing-dependent plasticity. In this work, we present an all-memristive neuromorphic architecture comprising neurons and synapses realized by using the physical properties and state dynamics of phase-change memristors. The architecture employs a novel concept of interconnecting the neurons in the same layer, resulting in level-tuned neuronal characteristics that preferentially process input information. We demonstrate the proposed architecture in the tasks of unsupervised learning and detection of multiple temporal correlations in parallel input streams. The efficiency of the neuromorphic architecture, along with the homogeneous neuro-synaptic dynamics implemented with nanoscale phase-change memristors, represents a significant step towards the development of ultrahigh-density neuromorphic co-processors.

  4. All-memristive neuromorphic computing with level-tuned neurons.

    PubMed

    Pantazi, Angeliki; Woźniak, Stanisław; Tuma, Tomas; Eleftheriou, Evangelos

    2016-09-02

In the new era of cognitive computing, systems will be able to learn and interact with the environment in ways that will drastically enhance the capabilities of current processors, especially in extracting knowledge from the vast amounts of data obtained from many sources. Brain-inspired neuromorphic computing systems increasingly attract research interest as an alternative to the classical von Neumann processor architecture, mainly because of the coexistence of memory and processing units. In these systems, the basic components are neurons interconnected by synapses. The neurons, based on their nonlinear dynamics, generate spikes that provide the main communication mechanism. The computational tasks are distributed across the neural network, where synapses implement both the memory and the computational units, by means of learning mechanisms such as spike-timing-dependent plasticity. In this work, we present an all-memristive neuromorphic architecture comprising neurons and synapses realized by using the physical properties and state dynamics of phase-change memristors. The architecture employs a novel concept of interconnecting the neurons in the same layer, resulting in level-tuned neuronal characteristics that preferentially process input information. We demonstrate the proposed architecture in the tasks of unsupervised learning and detection of multiple temporal correlations in parallel input streams. The efficiency of the neuromorphic architecture, along with the homogeneous neuro-synaptic dynamics implemented with nanoscale phase-change memristors, represents a significant step towards the development of ultrahigh-density neuromorphic co-processors.
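The spike-timing-dependent plasticity rule mentioned in these records can be sketched with the standard pair-based exponential form: a pre-before-post spike pair potentiates the synapse, a post-before-pre pair depresses it. The amplitudes and time constant below are generic illustrative values, not the phase-change device parameters from the papers.

```python
import math

def stdp_dw(dt, a_plus=0.1, a_minus=0.12, tau=20.0):
    """Weight change for a spike-time difference dt = t_post - t_pre (ms).
    dt > 0 (pre before post) potentiates; dt <= 0 depresses."""
    if dt > 0:
        return a_plus * math.exp(-dt / tau)
    return -a_minus * math.exp(dt / tau)

# Apply a few illustrative spike pairings to a synapse clipped to [0, 1].
w = 0.5
for dt in [5.0, 10.0, -5.0]:
    w = min(1.0, max(0.0, w + stdp_dw(dt)))
print(round(w, 4))
```

In the memristive implementation the same asymmetric update is realized by partial crystallization/amorphization of the phase-change cell rather than by an explicit exponential.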

  5. A computer-controlled olfactometer for a self-administered odor identification test.

    PubMed

    Schriever, Valentin Alexander; Körner, Johannes; Beyer, Robert; Viana, Samanta; Seo, Han-Seok

    2011-09-01

Odor identification tests are widely used for the general screening of olfactory function. However, the administration of odor identification tests is often limited by a lack of investigators' time. We therefore designed a computer-controlled olfactometer to present a self-administered odor identification test, and evaluated the results it produces in terms of validity and test-retest reliability. To test validity, participants' performance in the odor identification test using the olfactometer was compared with their performance in the validated odor identification assessment of the "Sniffin' Sticks" test. The ten-item odor identification test was performed twice using two different methods: (1) the self-administered test using the computer-controlled olfactometer and (2) the experimenter-administered test using the "Sniffin' Sticks." To examine test-retest reliability, 20 participants were asked to repeat these tests on a different day. Participants reached significantly higher scores on the experimenter-administered test using the "Sniffin' Sticks" than on the olfactometer-based test; however, this effect was driven by two odors that were identified less often in the olfactometer-based test, and the significant difference between the two methods' mean scores disappeared after excluding those two odors from the analysis. In addition, both methods showed no significant difference between scores obtained in the first and second sessions, indicating that results were consistent between sessions. In conclusion, our findings demonstrate that the computer-controlled olfactometer designed in this study can be used for a self-administered odor identification test.

  6. Computational identification of candidate nucleotide cyclases in higher plants.

    PubMed

    Wong, Aloysius; Gehring, Chris

    2013-01-01

In higher plants, guanylyl cyclases (GCs) and adenylyl cyclases (ACs) cannot be identified using BLAST homology searches based on annotated cyclic nucleotide cyclases (CNCs) of prokaryotes, lower eukaryotes, or animals. The reason is that CNCs are often part of complex multifunctional proteins with different domain organizations and biological functions that are not conserved in higher plants. For this reason, we have developed CNC search strategies based on functionally conserved amino acids in the catalytic center of annotated and/or experimentally confirmed CNCs. Here we detail this method, which has led to the identification of >25 novel candidate CNCs in Arabidopsis thaliana, several of which have been experimentally confirmed in vitro and in vivo. We foresee that this method can be applied to identify many more members of the growing family of CNCs in higher plants.
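A motif-based search of the kind described, keying on conserved catalytic-center residues rather than whole-sequence homology, can be sketched as a regular-expression scan over protein sequences. The PROSITE-style pattern and the toy sequences below are hypothetical placeholders, not the authors' actual GC/AC catalytic-center motif.

```python
import re

# Hypothetical catalytic-center motif: [RK], any 3 residues, [DE],
# any 5 residues, then C. PROSITE-style patterns map directly to regex.
MOTIF = re.compile(r"[RK].{3}[DE].{5}C")

sequences = {               # invented two-protein toy "proteome"
    "prot1": "MSTRAAADKLMNPCQW",
    "prot2": "MSTGAAAGKLMNPQQW",
}

# Record the start position of every motif hit in each sequence.
hits = {name: [m.start() for m in MOTIF.finditer(seq)]
        for name, seq in sequences.items()}
print(hits)  # {'prot1': [3], 'prot2': []}
```

In practice such hits are only candidates; as in the record, in vitro and in vivo confirmation is still needed because short motifs match by chance in large proteomes.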

  7. Computational identification of functional RNA homologs in metagenomic data

    PubMed Central

    Nawrocki, Eric P.; Eddy, Sean R.

    2013-01-01

    A key step toward understanding a metagenomics data set is the identification of functional sequence elements within it, such as protein coding genes and structural RNAs. Relative to protein coding genes, structural RNAs are more difficult to identify because of their reduced alphabet size, lack of open reading frames, and short length. Infernal is a software package that implements “covariance models” (CMs) for RNA homology search, which harness both sequence and structural conservation when searching for RNA homologs. Thanks to the added statistical signal inherent in the secondary structure conservation of many RNA families, Infernal is more powerful than sequence-only based methods such as BLAST and profile HMMs. Together with the Rfam database of CMs, Infernal is a useful tool for identifying RNAs in metagenomics data sets. PMID:23722291

  8. Computational Identification of Active Enhancers in Model Organisms

    PubMed Central

    Wang, Chengqi; Zhang, Michael Q.; Zhang, Zhihua

    2013-01-01

As a class of cis-regulatory elements, enhancers were first identified, nearly 30 years ago, as genomic regions able to markedly increase the transcription of genes. Enhancers can regulate gene expression in a cell-type-specific and developmental-stage-specific manner. Although experimental technologies have been developed to identify enhancers genome-wide, the design principles of these regulatory elements, and the way they rewire the transcriptional regulatory network across time and space, are far from clear. At present, developing predictive methods for enhancers, particularly for the cell-type-specific activity of enhancers, is central to computational biology. In this review, we survey current computational approaches for active enhancer prediction and discuss future directions. PMID:23685394

  9. Parallel computing-based sclera recognition for human identification

    NASA Astrophysics Data System (ADS)

    Lin, Yong; Du, Eliza Y.; Zhou, Zhi

    2012-06-01

Compared to iris recognition, sclera recognition using a line descriptor can achieve comparable recognition accuracy in visible wavelengths. However, the method is too time-consuming to be implemented in a real-time system. In this paper, we propose a GPU-based parallel computing approach to reduce sclera recognition time. We define a new descriptor that incorporates k-d tree structure and sclera-edge information. The registration and matching task is divided into subtasks of various sizes according to their computational complexity. Affine transform parameters are generated by searching the k-d tree. Texture memory, constant memory, and shared memory are used to store templates and transform matrices. The experimental results show that the proposed method executed on a GPU can improve sclera matching speed by a factor of several hundred without decreasing accuracy.

  10. Earth feature identification for onboard multispectral data editing: Computational experiments

    NASA Technical Reports Server (NTRS)

    Aherron, R. M.; Arduini, R. F.; Davis, R. E.; Huck, R. O.; Park, S. K.

    1981-01-01

    A computational model of the processes involved in multispectral remote sensing and data classification is developed as a tool for designing smart sensors which can process, edit, and classify the data that they acquire. An evaluation of sensor system performance and design tradeoffs involves classification rates and errors as a function of number and location of spectral channels, radiometric sensitivity and calibration accuracy, target discrimination assignments, and accuracy and frequency of compensation for imaging conditions. This model provides a link between the radiometric and statistical properties of the signals to be classified and the performance characteristics of electro-optical sensors and data processing devices. Preliminary computational results are presented which illustrate the editing performance of several remote sensing approaches.

  11. Construction, implementation and testing of an image identification system using computer vision methods for fruit flies with economic importance (Diptera: Tephritidae).

    PubMed

    Wang, Jiang-Ning; Chen, Xiao-Lin; Hou, Xin-Wen; Zhou, Li-Bing; Zhu, Chao-Dong; Ji, Li-Qiang

    2017-07-01

Many species of Tephritidae damage fruit, which can negatively impact international fruit trade. Automatic or semi-automatic identification of fruit flies is greatly needed for diagnosing causes of damage and for quarantine protocols for economically relevant insects. A fruit fly image identification system named AFIS1.0 has been developed using 74 species belonging to six genera, which include the majority of pests in the Tephritidae. The system combines automated image identification with manual verification, balancing operability and accuracy. AFIS1.0 integrates image analysis and an expert system into a content-based image retrieval framework. In the automatic identification module, AFIS1.0 proposes candidate identifications; users can then verify a result manually by comparing the unidentified image with a subset of images corresponding to the automatic identification result. The system uses Gabor surface features in automated identification and yielded an overall classification success rate of 87% at the species level in an independent multi-part image automatic identification test. The system is useful for users with or without specific expertise on Tephritidae in the task of rapid and effective identification of fruit flies, and it brings the application of computer vision technology to fruit fly recognition much closer to production level. © 2016 Society of Chemical Industry.

  12. Computationally inexpensive identification of noninformative model parameters by sequential screening

    NASA Astrophysics Data System (ADS)

    Mai, Juliane; Cuntz, Matthias; Zink, Matthias; Thober, Stephan; Kumar, Rohini; Schäfer, David; Schrön, Martin; Craven, John; Rakovec, Oldrich; Spieler, Diana; Prykhodko, Vladyslav; Dalmasso, Giovanni; Musuuza, Jude; Langenberg, Ben; Attinger, Sabine; Samaniego, Luis

    2016-04-01

    Environmental models tend to require increasing computational time and resources as physical process descriptions are improved or new descriptions are incorporated. Many-query applications such as sensitivity analysis or model calibration usually require a large number of model evaluations leading to high computational demand. This often limits the feasibility of rigorous analyses. Here we present a fully automated sequential screening method that selects only informative parameters for a given model output. The method requires a number of model evaluations that is approximately 10 times the number of model parameters. It was tested using the mesoscale hydrologic model mHM in three hydrologically unique European river catchments. It identified around 20 informative parameters out of 52, with different informative parameters in each catchment. The screening method was evaluated with subsequent analyses using all 52 as well as only the informative parameters. Subsequent Sobol's global sensitivity analysis led to almost identical results yet required 40% fewer model evaluations after screening. mHM was calibrated with all and with only informative parameters in the three catchments. Model performances for daily discharge were equally high in both cases with Nash-Sutcliffe efficiencies above 0.82. Calibration using only the informative parameters needed just one third of the number of model evaluations. The universality of the sequential screening method was demonstrated using several general test functions from the literature. We therefore recommend the use of the computationally inexpensive sequential screening method prior to rigorous analyses on complex environmental models.
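The idea of screening out noninformative parameters can be illustrated in miniature: perturb each parameter from a reference point and keep only those whose perturbation changes the model output beyond a tolerance. The toy model, step size, and threshold below are invented; the paper's actual procedure is an iterative, trajectory-based sequential screening, not this one-shot check.

```python
def informative_parameters(model, ref, step=0.1, tol=1e-9):
    """Return the indices of parameters whose perturbation by `step`
    changes the model output by more than `tol`."""
    base = model(ref)
    keep = []
    for i in range(len(ref)):
        perturbed = list(ref)
        perturbed[i] += step
        if abs(model(perturbed) - base) > tol:
            keep.append(i)
    return keep

# Toy model: output depends on p[0] and p[2] only; p[1] and p[3] are inert.
model = lambda p: p[0] ** 2 + 3 * p[2]
print(informative_parameters(model, [1.0, 5.0, 2.0, -4.0]))  # [0, 2]
```

Calibration or sensitivity analysis would then be run over the returned indices only, which is the source of the evaluation savings reported in the record.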

  13. Computational identification of Ciona intestinalis microRNAs.

    PubMed

    Keshavan, Raja; Virata, Michael; Keshavan, Anisha; Zeller, Robert W

    2010-02-01

    MicroRNAs (miRNAs) are conserved non-coding small RNAs with potent post-transcriptional gene regulatory functions. Recent computational approaches and sequencing of small RNAs had indicated the existence of about 80 Ciona intestinalis miRNAs, although it was not clear whether other miRNA genes were present in the genome. We undertook an alternative computational approach to look for Ciona miRNAs. Conserved non-coding sequences from the C. intestinalis genome were extracted and computationally folded to identify putative hairpin-like structures. After applying additional criteria, we obtained 458 miRNA candidates whose sequences were used to design a custom microarray. Over 100 of our predicted hairpins were identified in this array when probed with RNA from various Ciona stages. We also compared our predictions to recently deposited sequences of Ciona small RNAs and report that 170 of our predicted hairpins are represented in this data set. Altogether, about 250 of our 458 predicted miRNAs were represented in either our array data or the small-RNA sequence database. These results suggest that Ciona has a large number of genomically encoded miRNAs that play an important role in modulating gene activity in developing embryos and adults.

  14. Parallel Computation of the Topology of Level Sets

    SciTech Connect

    Pascucci, V; Cole-McLaughlin, K

    2004-12-16

This paper introduces two efficient algorithms that compute the Contour Tree of a 3D scalar field F and its augmented version with the Betti numbers of each isosurface. The Contour Tree is a fundamental data structure in scientific visualization that is used to preprocess the domain mesh to allow optimal computation of isosurfaces with minimal overhead storage. The Contour Tree can also be used to build user interfaces reporting the complete topological characterization of a scalar field, as shown in Figure 1. Data exploration time is reduced since the user understands the evolution of level set components with changing isovalue. The Augmented Contour Tree provides even more accurate information, segmenting the range space of the scalar field into portions of invariant topology. The exploration time for a single isosurface is also improved since its genus is known in advance. Our first new algorithm augments any given Contour Tree with the Betti numbers of all possible corresponding isocontours in time linear in the size of the tree. Moreover, we show how to extend the scheme introduced in [3] with the Betti number computation without increasing its complexity. Thus, we improve the time complexity of our previous approach [10] from O(m log m) to O(n log n + m), where m is the number of cells and n is the number of vertices in the domain of F. Our second contribution is a new divide-and-conquer algorithm that computes the Augmented Contour Tree with improved efficiency. The approach computes the output Contour Tree by merging two intermediate Contour Trees and is independent of the interpolant. In this way we confine any knowledge regarding a specific interpolant to an independent function that computes the tree for a single cell. We have implemented this function for the trilinear interpolant and plan to replace it with higher-order interpolants when needed. The time complexity is O(n + t log n), where t is the number of critical points of F. For the first time…
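The level-set components that the Contour Tree organizes can be counted directly for a single isovalue with a flood fill over the grid cells at or above the threshold. This brute-force baseline (invented toy field, 4-connectivity) shows what the tree encodes for all isovalues at once; it is not the paper's algorithm.

```python
from collections import deque

def count_components(field, isovalue):
    """Number of 4-connected components of {field >= isovalue} on a grid."""
    rows, cols = len(field), len(field[0])
    seen = [[False] * cols for _ in range(rows)]
    components = 0
    for r in range(rows):
        for c in range(cols):
            if field[r][c] >= isovalue and not seen[r][c]:
                components += 1                  # new component found
                queue = deque([(r, c)])
                seen[r][c] = True
                while queue:                     # BFS flood fill
                    y, x = queue.popleft()
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and field[ny][nx] >= isovalue
                                and not seen[ny][nx]):
                            seen[ny][nx] = True
                            queue.append((ny, nx))
    return components

# Two separate peaks at isovalue 5 that merge into one region at isovalue 1.
field = [[0, 6, 0, 0],
         [0, 2, 2, 0],
         [0, 0, 7, 0]]
print(count_components(field, 5), count_components(field, 1))  # 2 1
```

The isovalue at which the two components merge corresponds to an interior (saddle) node of the Contour Tree; the tree records all such merge/split events without re-scanning the field per isovalue.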

  15. Correlations of Electrophysiological Measurements with Identification Levels of Ancient Chinese Characters

    PubMed Central

    Qi, Zhengyang; Wang, Xiaolong; Hao, Shuang; Zhu, Chuanlin; He, Weiqi; Luo, Wenbo

    2016-01-01

    Studies of event-related potential (ERP) in the human brain have shown that the N170 component can reliably distinguish among different object categories. However, it is unclear whether this is true for different identifiable levels within a single category. In the present study, we used ERP recording to examine the neural response to different identification levels and orientations (upright vs. inverted) of Chinese characters. The results showed that P1, N170, and P250 were modulated by different identification levels of Chinese characters. Moreover, time frequency analysis showed similar results, indicating that identification levels were associated with object recognition, particularly during processing of a single categorical stimulus. PMID:26982215

  16. MAX--An Interactive Computer Program for Teaching Identification of Clay Minerals by X-ray Diffraction.

    ERIC Educational Resources Information Center

    Kohut, Connie K.; And Others

    1993-01-01

    Discusses MAX, an interactive computer program for teaching identification of clay minerals based on standard x-ray diffraction characteristics. The program provides tutorial-type exercises for identification of 16 clay standards, self-evaluation exercises, diffractograms of 28 soil clay minerals, and identification of nonclay minerals. (MDH)

  17. Cloud identification using genetic algorithms and massively parallel computation

    NASA Technical Reports Server (NTRS)

    Buckles, Bill P.; Petry, Frederick E.

    1996-01-01

    As a Guest Computational Investigator under the NASA administered component of the High Performance Computing and Communication Program, we implemented a massively parallel genetic algorithm on the MasPar SIMD computer. Experiments were conducted using Earth Science data in the domains of meteorology and oceanography. Results obtained in these domains are competitive with, and in most cases better than, similar problems solved using other methods. In the meteorological domain, we chose to identify clouds using AVHRR spectral data. Four cloud speciations were used although most researchers settle for three. Results were remarkably consistent across all tests (91% accuracy). Refinements of this method may lead to more timely and complete information for Global Circulation Models (GCMs) that are prevalent in weather forecasting and global environment studies. In the oceanographic domain, we chose to identify ocean currents from a spectrometer having similar characteristics to AVHRR. Here the results were mixed (60% to 80% accuracy). If one is willing to run the experiment several times (say, 10), it is acceptable to claim the higher accuracy rating. This problem has never been successfully automated. Therefore, these results are encouraging even though less impressive than the cloud experiment. Successful conclusion of an automated ocean current detection system would impact coastal fishing, naval tactics, and the study of micro-climates. Finally, we contributed to the basic knowledge of GA (genetic algorithm) behavior in parallel environments. We developed better knowledge of the use of subpopulations in the context of shared breeding pools and the migration of individuals. Rigorous experiments were conducted based on quantifiable performance criteria. While much of the work confirmed current wisdom, for the first time we were able to submit conclusive evidence. The software developed under this grant was placed in the public domain. An extensive user
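    The island-model ideas mentioned above (subpopulations with shared breeding pools and migration of individuals) can be sketched in a few lines. This is a generic toy GA maximizing a bit-count fitness, not the MasPar implementation; all parameter values are illustrative.

```python
import random

def evolve(fitness, n_bits=20, n_subpops=2, subpop_size=30,
           generations=40, migration_interval=10, seed=1):
    """Toy island-model GA: independent subpopulations evolve separately,
    and every few generations the best individual migrates to the next
    subpopulation in a ring."""
    rng = random.Random(seed)
    pops = [[[rng.randint(0, 1) for _ in range(n_bits)]
             for _ in range(subpop_size)] for _ in range(n_subpops)]

    def offspring(pop):
        # Tournament selection, one-point crossover, bit-flip mutation.
        a = max(rng.sample(pop, 3), key=fitness)
        b = max(rng.sample(pop, 3), key=fitness)
        cut = rng.randrange(1, n_bits)
        child = a[:cut] + b[cut:]
        if rng.random() < 0.2:
            child[rng.randrange(n_bits)] ^= 1
        return child

    for gen in range(generations):
        pops = [[offspring(pop) for _ in pop] for pop in pops]
        if (gen + 1) % migration_interval == 0:
            for k, pop in enumerate(pops):  # ring migration of the best
                best = max(pop, key=fitness)
                pops[(k + 1) % n_subpops][0] = best[:]
    return max((max(pop, key=fitness) for pop in pops), key=fitness)

best = evolve(fitness=sum)  # "onemax" fitness: count of 1-bits
print(sum(best))
```

    On a SIMD machine like the MasPar, each subpopulation (or each individual's fitness evaluation) would run on its own processing element; migration is the only inter-island communication step.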

  18. Computer versus physician identification of gastrointestinal alarm features

    PubMed Central

    Almario, Christopher V.; Chey, William D.; Iriana, Sentia; Dailey, Francis; Robbins, Karen; Patel, Anish V.; Reid, Mark; Whitman, Cynthia; Fuller, Garth; Bolus, Roger; Dennis, Buddy; Encarnacion, Rey; Martinez, Bibiana; Soares, Jennifer; Modi, Rushaba; Agarwal, Nikhil; Lee, Aaron; Kubomoto, Scott; Sharma, Gobind; Bolus, Sally; Chang, Lin; Spiegel, Brennan M.R.

    2015-01-01

    Objective It is important for clinicians to inquire about “alarm features” as it may identify those at risk for organic disease and who require additional diagnostic workup. We developed a computer algorithm called Automated Evaluation of Gastrointestinal Symptoms (AEGIS) that systematically collects patient gastrointestinal (GI) symptoms and alarm features, and then “translates” the information into a history of present illness (HPI). Our study's objective was to compare the number of alarms documented by physicians during usual care vs. that collected by AEGIS. Methods We performed a cross-sectional study with a paired sample design among patients visiting adult GI clinics. Participants first received usual care by their physicians and then completed AEGIS. Each individual thus contributed both a physician-documented and computer-generated HPI. Blinded physician reviewers enumerated the positive alarm features (hematochezia, melena, hematemesis, unintentional weight loss, decreased appetite, and fevers) mentioned in each HPI. We compared the number of documented alarms within patient using the Wilcoxon signed-rank test. Results Seventy-five patients had both physician and AEGIS HPIs. AEGIS identified more patients with positive alarm features compared to physicians (53% vs. 27%; p < .001). AEGIS also documented more positive alarms (median 1, interquartile range [IQR] 0–2) vs. physicians (median 0, IQR 0–1; p < .001). Moreover, clinicians documented only 30% of the positive alarms self-reported by patients through AEGIS. Conclusions Physicians documented less than one-third of red flags reported by patients through a computer algorithm. These data indicate that physicians may under report alarm features and that computerized “checklists” could complement standard HPIs to bolster clinical care. PMID:26254875
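    The paired comparison reported above uses the Wilcoxon signed-rank test. A minimal sketch of the test statistic (ranks of absolute paired differences, average ranks for ties, no p-value computation) on hypothetical alarm counts:

```python
def wilcoxon_signed_rank(x, y):
    """Wilcoxon signed-rank statistic for paired samples: drop zero
    differences, rank the remaining |differences| (average ranks for
    ties), and return (W+, W-), the rank sums of positive and negative
    differences."""
    diffs = [b - a for a, b in zip(x, y) if b - a != 0]
    ordered = sorted(diffs, key=abs)
    ranks = [0.0] * len(ordered)
    i = 0
    while i < len(ordered):  # average ranks over tied |d|
        j = i
        while j < len(ordered) and abs(ordered[j]) == abs(ordered[i]):
            j += 1
        avg = (i + 1 + j) / 2.0
        for k in range(i, j):
            ranks[k] = avg
        i = j
    w_plus = sum(r for d, r in zip(ordered, ranks) if d > 0)
    w_minus = sum(r for d, r in zip(ordered, ranks) if d < 0)
    return w_plus, w_minus

# Hypothetical per-patient alarm counts: physician HPI vs. AEGIS HPI.
physician = [0, 1, 0, 0, 2, 1, 0, 0]
computer  = [1, 2, 1, 0, 2, 2, 1, 1]
print(wilcoxon_signed_rank(physician, computer))  # -> (21.0, 0)
```

    A one-sided W- of zero, as in this toy example, means the computer HPI never recorded fewer alarms than the physician HPI for any patient; the study's actual analysis would also convert the statistic to a p-value.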

  19. Computer method for identification of boiler transfer functions

    NASA Technical Reports Server (NTRS)

    Miles, J. H.

    1971-01-01

    An iterative computer method is described for identifying boiler transfer functions using frequency response data. An objective penalized performance measure and a nonlinear minimization technique are used to cause the locus of points generated by a transfer function to resemble the locus of points obtained from frequency response measurements. Different transfer functions can be tried until a satisfactory empirical transfer function to the system is found. To illustrate the method, some examples and some results from a study of a set of data consisting of measurements of the inlet impedance of a single tube forced flow boiler with inserts are given.
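    The identification idea, adjusting transfer-function parameters until the model's frequency-response locus resembles the measured locus, can be sketched with a crude grid search standing in for the paper's penalized nonlinear minimization. The first-order model and all data below are illustrative assumptions, not the boiler system of the study.

```python
def first_order(K, tau, w):
    """Frequency response of G(s) = K / (tau*s + 1) at s = j*w."""
    return K / (tau * 1j * w + 1)

def fit_first_order(freqs, response, K_grid, tau_grid):
    """Pick (K, tau) minimizing the summed squared distance between the
    measured complex response and K/(j*w*tau + 1) -- a crude grid search
    in place of a proper nonlinear minimization."""
    def cost(K, tau):
        return sum(abs(first_order(K, tau, w) - h) ** 2
                   for w, h in zip(freqs, response))
    return min(((K, tau) for K in K_grid for tau in tau_grid),
               key=lambda p: cost(*p))

# Synthetic "measurements" from a known system K=2, tau=0.5.
freqs = [0.1, 0.5, 1.0, 2.0, 5.0, 10.0]
measured = [first_order(2.0, 0.5, w) for w in freqs]
grid = [i / 10 for i in range(1, 31)]
print(fit_first_order(freqs, measured, grid, grid))  # -> (2.0, 0.5)
```

    In practice one would try richer model structures (more poles and zeros), penalize implausible parameters, and iterate, which is exactly the trial-and-error workflow the abstract describes.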

  20. Novel computational identification of highly selective biomarkers of pollutant exposure.

    PubMed

    Weisman, David; Liu, Hong; Redfern, Jessica; Zhu, Liya; Colón-Carmona, Adán

    2011-06-15

    The use of in vivo biosensors to acquire environmental pollution data is an emerging and promising paradigm. One major challenge is the identification of highly specific biomarkers that selectively report exposure to a target pollutant, while remaining quiescent under a diverse set of other, often unknown, environmental conditions. This study hypothesized that a microarray data mining approach can identify highly specific biomarkers, and, that the robustness property can generalize to unforeseen environmental conditions. Starting with Arabidopsis thaliana microarray data measuring responses to a variety of treatments, the study used the top scoring pair (TSP) algorithm to identify mRNA transcripts that respond uniquely to phenanthrene, a model polycyclic aromatic hydrocarbon. Subsequent in silico analysis with a larger set of microarray data indicated that the biomarkers remained robust under new conditions. Finally, in vivo experiments were performed with unforeseen conditions that mimic phenanthrene stress, and the biomarkers were assayed using qRT-PCR. In these experiments, the biomarkers always responded positively to phenanthrene, and never responded to the unforeseen conditions, thereby supporting the hypotheses. This data mining approach requires only microarray or next-generation RNA-seq data, and, in principle, can be applied to arbitrary biomonitoring organisms and chemical exposures.
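    A minimal sketch of the top scoring pair (TSP) idea on toy data (illustrative only; the study applied TSP to Arabidopsis microarray measurements):

```python
def top_scoring_pair(expr, labels):
    """Top Scoring Pair (TSP): for every feature pair (i, j), score
    |P(X_i < X_j | class 0) - P(X_i < X_j | class 1)| and return the
    best-scoring pair.  `expr` is a samples x features matrix; the
    relative ordering of two features is the classification rule."""
    n_features = len(expr[0])
    classes = sorted(set(labels))
    best = (None, -1.0)
    for i in range(n_features):
        for j in range(i + 1, n_features):
            probs = []
            for c in classes:
                rows = [row for row, lab in zip(expr, labels) if lab == c]
                probs.append(sum(r[i] < r[j] for r in rows) / len(rows))
            score = abs(probs[0] - probs[1])
            if score > best[1]:
                best = ((i, j), score)
    return best

# Toy data: features 0 and 1 flip their order between classes,
# feature 2 is uninformative noise.
expr = [[1, 5, 3], [2, 6, 3], [1, 4, 9],   # class 0: X0 < X1
        [7, 2, 3], [8, 1, 9], [6, 3, 3]]   # class 1: X0 > X1
labels = [0, 0, 0, 1, 1, 1]
print(top_scoring_pair(expr, labels))  # -> ((0, 1), 1.0)
```

    Because the rule depends only on which of two transcripts is higher, not on absolute expression levels, it tends to be robust across platforms and conditions, which is the property the study exploits for biomarker selection.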

  1. Usability Studies in Virtual and Traditional Computer Aided Design Environments for Fault Identification

    DTIC Science & Technology

    2017-08-08

    Keywords: Benchmark; Virtual Reality; Virtual Environments; Competitive Comparison. This paper is an extension of the work done by Satter... (Usability Studies in Virtual and Traditional Computer Aided Design Environments for Fault Identification, Dr. Syed Adeel Ahmed, Xavier University). It compares a virtual environment with wand interfaces directly with a workstation non-stereoscopic traditional CAD interface with keyboard and mouse.

  2. Computer software for identification of noise source and automatic noise measurement

    NASA Astrophysics Data System (ADS)

    Fujii, Kenji; Sakurai, Masatsugu; Ando, Yoichi

    2004-10-01

    A new computational system for the environmental noise measurement and analysis has been developed. The system consists of binaural microphones, a laptop PC, and analysing software. A target noise is recorded automatically depending on the specified background noise level, and the acoustical parameters are calculated simultaneously. These functions allow for precise field measurements. The system is equipped with a template-matching algorithm for the identification of noise source. This function was implemented to avoid the effect of an interrupting sound such as voice and wind blowing during a measurement. Noise analyses in this system are based on the model of human auditory system. In addition to the time-series data of sound level, the important acoustical parameters of noise source are extracted from the running autocorrelation function (ACF) and the inter-aural cross-correlation function (IACF). It has been found that those parameters are strongly related to the auditory primary sensations and spatial sensations. Evaluation of the environmental noise based on these functions is another feature of this system. This paper describes the effectiveness of the ACF and the IACF analysis for analysing acoustical properties of noise and for evaluating the subjective response to noise.
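    The normalised autocorrelation function (ACF) underlying the analysis can be sketched as follows; the tone signal and the lag range are illustrative assumptions, not the system's actual parameters:

```python
import math

def autocorrelation(signal, max_lag):
    """Normalised autocorrelation phi(tau) = sum x[t]*x[t+tau] / sum x[t]^2
    for lags 0..max_lag, as used in short-time ACF noise analysis.
    phi(0) is 1 by construction; a strong secondary peak reveals the
    dominant periodicity (related to perceived pitch)."""
    energy = sum(x * x for x in signal)
    return [sum(signal[t] * signal[t + lag]
                for t in range(len(signal) - lag)) / energy
            for lag in range(max_lag + 1)]

# A pure tone: the ACF peaks again at the period of the tone.
period = 8
tone = [math.sin(2 * math.pi * t / period) for t in range(160)]
acf = autocorrelation(tone, 12)
print(round(acf[0], 2), round(max(acf[1:]), 2))  # -> 1.0 0.95
```

    In the measurement system described above, parameters such as the delay and amplitude of the first ACF peak are extracted from a running window of this function; the binaural IACF is computed analogously by correlating the left- and right-ear signals.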

  3. Apps for Angiosperms: The Usability of Mobile Computers and Printed Field Guides for UK Wild Flower and Winter Tree Identification

    ERIC Educational Resources Information Center

    Stagg, Bethan C.; Donkin, Maria E.

    2017-01-01

    We investigated usability of mobile computers and field guide books with adult botanical novices, for the identification of wildflowers and deciduous trees in winter. Identification accuracy was significantly higher for wildflowers using a mobile computer app than field guide books but significantly lower for deciduous trees. User preference…

  4. Diagnostic reference level of computed tomography (CT) in Japan.

    PubMed

    Fukushima, Yasuhiro; Tsushima, Yoshito; Takei, Hiroyuki; Taketomi-Takahashi, Ayako; Otake, Hidenori; Endo, Keigo

    2012-08-01

    Optimisation of computed tomography (CT) parameters is important in avoiding excess radiation exposure. The aim of this study is to establish the diagnostic reference levels (DRL) of CT in Japan by using dose-length product (DLP). Datasheets were sent to all hospitals/clinics which had CT scanner(s) in Gunma prefecture. Data were obtained for all patients who underwent CT during a single month (June 2010), and the distributions of DLP were evaluated for eight anatomical regions and five patient age groups. The DRL was defined as the 25th and 75th percentiles of DLP. Datasheets were collected from 80 of 192 hospitals/clinics (26 090 patients). DLP for head CT of paediatric patients tended to be higher in Japan compared with DRLs of paediatric head CTs reported from the EU or Syria. Although this study was performed with limited samples, DLP for adult patients were at comparable levels for all anatomical regions.

  5. Bridging Levels of Understanding in Schizophrenia Through Computational Modeling.

    PubMed

    Anticevic, Alan; Murray, John D; Barch, Deanna M

    2015-05-01

    Schizophrenia is an illness with a remarkably complex symptom presentation that has thus far been out of reach of neuroscientific explanation. This presents a fundamental problem for developing better treatments that target specific symptoms or root causes. One promising path forward is the incorporation of computational neuroscience, which provides a way to formalize experimental observations and, in turn, make theoretical predictions for subsequent studies. We review three complementary approaches: (a) biophysically based models developed to test cellular-level and synaptic hypotheses, (b) connectionist models that give insight into large-scale neural-system-level disturbances in schizophrenia, and (c) models that provide a formalism for observations of complex behavioral deficits, such as negative symptoms. We argue that harnessing all of these modeling approaches represents a productive approach for better understanding schizophrenia. We discuss how blending these approaches can allow the field to progress toward a more comprehensive understanding of schizophrenia and its treatment.

  7. FPGA based compute nodes for high level triggering in PANDA

    NASA Astrophysics Data System (ADS)

    Kühn, W.; Gilardi, C.; Kirschner, D.; Lang, J.; Lange, S.; Liu, M.; Perez, T.; Yang, S.; Schmitt, L.; Jin, D.; Li, L.; Liu, Z.; Lu, Y.; Wang, Q.; Wei, S.; Xu, H.; Zhao, D.; Korcyl, K.; Otwinowski, J. T.; Salabura, P.; Konorov, I.; Mann, A.

    2008-07-01

    PANDA is a new universal detector for antiproton physics at the HESR facility at FAIR/GSI. The PANDA data acquisition system has to handle interaction rates of the order of 10^7/s and data rates of several hundred Gb/s. FPGA based compute nodes with multi-Gb/s bandwidth capability using the ATCA architecture are designed to handle tasks such as event building, feature extraction and high level trigger processing. Data connectivity is provided via optical links as well as multiple Gb Ethernet ports. The boards will support trigger algorithms such as pattern recognition for RICH detectors, EM shower analysis, fast tracking algorithms and global event characterization. Besides VHDL, high level C-like hardware description languages will be considered to implement the firmware.

  8. Assessment of the Relationships between the Computer Attitudes and Computer Literacy Levels of Prospective Educators.

    ERIC Educational Resources Information Center

    Hignite, Michael A.; Echternacht, Lonnie J.

    1992-01-01

    Data gathered from 83 prospective business educators on computer anxiety, attitudes toward computers in education, attitudes toward computers as a teaching tool, and computer liking were correlated with data on subjects' computer systems and applications literacy. A relationship between computer attitude and computer system literacy was found, and…

  9. A color and texture based multi-level fusion scheme for ethnicity identification

    NASA Astrophysics Data System (ADS)

    Du, Hongbo; Salah, Sheerko Hma; Ahmed, Hawkar O.

    2014-05-01

    Ethnicity identification of face images is of interest in many areas of application. Different from face recognition of individuals, ethnicity identification classifies faces according to the common features of a specific ethnic group. This paper presents a multi-level fusion scheme for ethnicity identification that combines texture features of local areas of a face using local binary patterns with color features using HSV binning. The scheme fuses the decisions from a k-nearest neighbor classifier and a support vector machine classifier into a final identification decision. We have tested the scheme on a collection of face images from a number of publicly available databases. The results demonstrate the effectiveness of the combined features and improvements on accuracy of identification by the fusion scheme over identification using individual features and other state-of-the-art techniques.

  10. Computational identification and analysis of novel sugarcane microRNAs

    PubMed Central

    2012-01-01

    Background MicroRNA-regulation of gene expression plays a key role in the development and response to biotic and abiotic stresses. Deep sequencing analyses accelerate the process of small RNA discovery in many plants and expand our understanding of miRNA-regulated processes. We therefore undertook small RNA sequencing of sugarcane miRNAs in order to understand their complexity and to explore their role in sugarcane biology. Results A bioinformatics search was carried out to discover novel miRNAs that can be regulated in sugarcane plants subjected to drought and salt stresses, and under pathogen infection. By means of the presence of miRNA precursors in the related sorghum genome, we identified 623 candidates for new mature miRNAs in sugarcane. Of these, 44 were classified as high confidence miRNAs. The biological function of the new miRNA candidates was assessed by analyzing their putative targets. The set of bona fide sugarcane miRNAs includes those likely targeting serine/threonine kinases, Myb and zinc finger proteins. Additionally, a MADS-box transcription factor and an RPP2B protein, which act in development and disease resistance processes, could be regulated by cleavage (21-nt-species) and DNA methylation (24-nt-species), respectively. Conclusions A large scale investigation of sRNA in sugarcane using a computational approach has identified a substantial number of new miRNAs and provides detailed genotype-tissue-culture miRNA expression profiles. Comparative analysis between monocots was valuable to clarify aspects about conservation of miRNAs and their targets in a plant whose genome has not yet been sequenced. Our findings contribute to knowledge of miRNA roles in regulatory pathways in the complex, polyploid sugarcane genome. PMID:22747909

  11. When the ends outweigh the means: Mood and level of identification in depression

    PubMed Central

    Watkins, Edward R.; Moberly, Nicholas J.; Moulds, Michelle L.

    2011-01-01

    Research in healthy controls has found that mood influences cognitive processing via level of action identification: happy moods are associated with global and abstract processing; sad moods are associated with local and concrete processing. However, this pattern seems inconsistent with the high level of abstract processing observed in depressed patients, leading Watkins (2008, 2010) to hypothesise that the association between mood and level of goal/action identification is impaired in depression. We tested this hypothesis by measuring level of identification on the Behavioural Identification Form after happy and sad mood inductions in never-depressed controls and currently depressed patients. Participants used increasingly concrete action identifications as they became sadder and less happy, but this effect was moderated by depression status. Consistent with Watkins' (2008) hypothesis, increases in sad mood and decreases in happiness were associated with shifts towards the use of more concrete action identifications in never-depressed individuals, but not in depressed patients. These findings suggest that the putatively adaptive association between mood and level of identification is impaired in major depression. PMID:22017614

  12. Computationally Inexpensive Identification of Non-Informative Model Parameters

    NASA Astrophysics Data System (ADS)

    Mai, J.; Cuntz, M.; Kumar, R.; Zink, M.; Samaniego, L. E.; Schaefer, D.; Thober, S.; Rakovec, O.; Musuuza, J. L.; Craven, J. R.; Spieler, D.; Schrön, M.; Prykhodko, V.; Dalmasso, G.; Langenberg, B.; Attinger, S.

    2014-12-01

    Sensitivity analysis is used, for example, to identify parameters which induce the largest variability in model output and are thus informative during calibration. Variance-based techniques are employed for this purpose, but they unfortunately require a large number of model evaluations and are thus impractical for complex environmental models. We therefore developed a computationally inexpensive screening method, based on Elementary Effects, that automatically separates informative and non-informative model parameters. The method was tested using the mesoscale hydrologic model (mHM) with 52 parameters. The model was applied in three European catchments with different hydrological characteristics, i.e. the Neckar (Germany), Sava (Slovenia), and Guadalquivir (Spain). The method identified the same informative parameters as the standard Sobol' method but with less than 1% of the model runs. In Germany and Slovenia, 22 of 52 parameters were informative, mostly in the formulations of evapotranspiration, interflow and percolation. In Spain, 19 of 52 parameters were informative, with an increased importance of soil parameters. We further showed that Sobol' indexes calculated for the subset of informative parameters are practically the same as Sobol' indexes before the screening, while the number of model runs was reduced by more than 50%. The model mHM was then calibrated twice in the three test catchments. First, all 52 parameters were taken into account, and then only the informative parameters were calibrated while all others were kept fixed. The Nash-Sutcliffe efficiencies were 0.87 and 0.83 in Germany, 0.89 and 0.88 in Slovenia, and 0.86 and 0.85 in Spain, respectively. This minor loss of at most 4% in model performance comes along with a substantial decrease of at least 65% in model evaluations. In summary, we propose an efficient screening method to identify non-informative model parameters that can be discarded during further applications. We have shown that sensitivity
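    The Elementary Effects screening idea can be sketched as a generic Morris-style procedure on a toy linear model; this is not the authors' implementation for mHM, and the sampling scheme and model below are illustrative assumptions.

```python
import random

def elementary_effects(model, n_params, n_trajectories=20, delta=0.1, seed=0):
    """Morris-style screening: from random base points, perturb one
    parameter at a time by `delta` and record the elementary effect
    (change in output divided by delta).  The mean absolute effect
    mu* per parameter separates informative parameters (large mu*)
    from non-informative ones (mu* near zero)."""
    rng = random.Random(seed)
    effects = [[] for _ in range(n_params)]
    for _ in range(n_trajectories):
        x = [rng.random() for _ in range(n_params)]
        y0 = model(x)
        for i in range(n_params):
            xp = list(x)
            xp[i] += delta
            effects[i].append((model(xp) - y0) / delta)
    return [sum(abs(e) for e in es) / len(es) for es in effects]

# Toy model: parameter 0 strong, parameter 1 weak, parameter 2 inert.
def model(x):
    return 10.0 * x[0] + 0.5 * x[1] + 0.0 * x[2]

mu_star = elementary_effects(model, 3)
print([round(m, 2) for m in mu_star])  # -> [10.0, 0.5, 0.0]
```

    Each trajectory costs only n_params + 1 model runs, which is why this style of screening scales to models for which variance-based Sobol' analysis is out of reach.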

  13. Internship training in computer science: Exploring student satisfaction levels.

    PubMed

    Jaradat, Ghaith M

    2017-08-01

    The requirement of employability in the job market prompted universities to conduct internship training as part of their study plans. There is a need to train students in important academic and professional skills related to the workplace with an IT component. This article describes a statistical study that measures satisfaction levels among students in the faculty of Information Technology and Computer Science in Jordan. The objective of this study is to explore factors that influence student satisfaction with regard to enrolling in an internship training program. The study was conducted to gather student perceptions, opinions, preferences and satisfaction levels related to the program. Data were collected via a mixed-method survey (surveys and interviews) from student-respondents. The survey collects demographic and background information from students, including their perception of faculty performance in the training poised to prepare them for the job market. Findings from this study show that students expect internship training to improve their professional and personal skills as well as to increase their workplace-related satisfaction. It is concluded that improving internship training is crucial, as it is expected to enrich students' experiences, knowledge and skills in their personal and professional lives, and to increase their confidence when exploring future job opportunities in the Jordanian market. Copyright © 2017 Elsevier Ltd. All rights reserved.

  14. Australian diagnostic reference levels for multi detector computed tomography.

    PubMed

    Hayton, Anna; Wallace, Anthony; Marks, Paul; Edmonds, Keith; Tingey, David; Johnston, Peter

    2013-03-01

    The Australian Radiation Protection and Nuclear Safety Agency (ARPANSA) is undertaking web based surveys to obtain data to establish national diagnostic reference levels (DRLs) for diagnostic imaging. The first set of DRLs to be established are for multi detector computed tomography (MDCT). The survey samples MDCT dosimetry metrics: dose length product (DLP, mGy.cm) and volume computed tomography dose index (CTDIvol, mGy), for six common protocols/habitus: Head, Neck, Chest, AbdoPelvis, ChestAbdoPelvis and Lumbar Spine from individual radiology clinics and platforms. A practice reference level (PRL) for a given platform and protocol is calculated from a compliant survey containing data collected from at least ten patients. The PRL is defined as the median of the DLP/CTDIvol values for a single compliant survey. Australian National DRLs are defined as the 75th percentile of the distribution of the PRLs for each protocol and age group. Australian National DRLs for adult MDCT have been determined in terms of DLP and CTDIvol. In terms of DLP the national DRLs are 1,000 mGy cm, 600 mGy cm, 450 mGy cm, 700 mGy cm, 1,200 mGy cm, and 900 mGy cm for the protocols Head, Neck, Chest, AbdoPelvis, ChestAbdoPelvis and Lumbar Spine respectively. Average dose values obtained from the European survey Dose Datamed I reveal Australian doses to be higher by comparison for four out of the six protocols. The survey is ongoing, allowing practices to optimise dose delivery as well as allowing the periodic update of DRLs to reflect changes in technology and technique.
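    The survey's definitions, a PRL as the median DLP of a compliant survey of at least ten patients and the national DRL as the 75th percentile of the PRLs, can be sketched directly. The DLP values below are hypothetical, not survey data.

```python
def percentile(values, p):
    """Percentile with linear interpolation between order statistics
    (0 <= p <= 100)."""
    xs = sorted(values)
    k = (len(xs) - 1) * p / 100.0
    f = int(k)
    c = min(f + 1, len(xs) - 1)
    return xs[f] + (xs[c] - xs[f]) * (k - f)

def practice_reference_level(dlp_values):
    """PRL for one platform/protocol: the median DLP of a compliant
    survey (at least ten patients)."""
    if len(dlp_values) < 10:
        raise ValueError("survey not compliant: fewer than 10 patients")
    return percentile(dlp_values, 50)

def national_drl(prls):
    """National DRL: the 75th percentile of the practice reference levels."""
    return percentile(prls, 75)

# Hypothetical DLP values (mGy.cm) for one clinic's Head protocol,
# then hypothetical PRLs from several clinics.
survey = [820, 940, 1010, 760, 880, 990, 1150, 900, 860, 930]
prl = practice_reference_level(survey)
print(prl, national_drl([prl, 780, 1020, 950, 890]))  # -> 915.0 950.0
```

    A clinic whose PRL sits above the national DRL would be flagged to review and optimise its protocol, which is the feedback loop the ongoing survey is designed to support.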

  15. Computational Identification of MoRFs in Protein Sequences Using Hierarchical Application of Bayes Rule.

    PubMed

    Malhis, Nawar; Wong, Eric T C; Nassar, Roy; Gsponer, Jörg

    2015-01-01

    Intrinsically disordered regions of proteins play an essential role in the regulation of various biological processes. Key to their regulatory function is often the binding to globular protein domains via sequence elements known as molecular recognition features (MoRFs). Development of computational tools for the identification of candidate MoRF locations in amino acid sequences is an important task and an area of growing interest. Given the relative sparseness of MoRFs in protein sequences, the accuracy of the available MoRF predictors is often inadequate for practical usage, which leaves a significant need and room for improvement. In this work, we introduce MoRFCHiBi_Web, which predicts MoRF locations in protein sequences with higher accuracy compared to current MoRF predictors. Three distinct and largely independent property scores are computed with component predictors and then combined to generate the final MoRF propensity scores. The first score reflects the likelihood of sequence windows to harbour MoRFs and is based on amino acid composition and sequence similarity information. It is generated by MoRFCHiBi using small windows of up to 40 residues in size. The second score identifies long stretches of protein disorder and is generated by ESpritz with the DisProt option. Lastly, the third score reflects residue conservation and is assembled from PSSM files generated by PSI-BLAST. These propensity scores are processed and then hierarchically combined using Bayes rule to generate the final MoRFCHiBi_Web predictions. MoRFCHiBi_Web was tested on three datasets. Results show that MoRFCHiBi_Web outperforms previously developed predictors by generating less than half the false positive rate for the same true positive rate at practical threshold values. This level of accuracy paired with its relatively high processing speed makes MoRFCHiBi_Web a practical tool for MoRF prediction. http://morf.chibi.ubc.ca:8080/morf/.
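    The combination of component scores via Bayes' rule can be illustrated with a naive-Bayes-style sketch under an independence assumption; this is an illustrative toy, not the MoRFCHiBi_Web scheme itself, and the prior and likelihood ratios below are invented.

```python
def combine_bayes(prior, likelihood_ratios):
    """Combine independent component scores with Bayes' rule.  Each
    component contributes a likelihood ratio P(score | MoRF) /
    P(score | not MoRF); under independence the posterior odds equal
    the prior odds times the product of the ratios."""
    odds = prior / (1.0 - prior)
    for lr in likelihood_ratios:
        odds *= lr
    return odds / (1.0 + odds)

# MoRFs are sparse (hypothetical 5% prior), but three concordant
# component scores can still push the posterior well above chance.
posterior = combine_bayes(0.05, [4.0, 3.0, 2.5])
print(round(posterior, 3))  # -> 0.612
```

    The sparseness of MoRFs noted in the abstract is exactly why the prior matters: the same three likelihood ratios applied to a 50% prior would yield a posterior near 0.97, while the low prior keeps the combined score conservative.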

  16. Computational health economics for identification of unprofitable health care enrollees.

    PubMed

    Rose, Sherri; Bergquist, Savannah L; Layton, Timothy J

    2017-03-22

    Health insurers may attempt to design their health plans to attract profitable enrollees while deterring unprofitable ones. Such insurers would not be delivering socially efficient levels of care by providing health plans that maximize societal benefit, but rather intentionally distorting plan benefits to avoid high-cost enrollees, potentially to the detriment of health and efficiency. In this work, we focus on a specific component of health plan design at risk for health insurer distortion in the Health Insurance Marketplaces: the prescription drug formulary. We introduce an ensembled machine learning function to determine whether drug utilization variables are predictive of a new measure of enrollee unprofitability we derive, and thus vulnerable to distortions by insurers. Our implementation also contains a unique application-specific variable selection tool. This study demonstrates that super learning is effective in extracting the relevant signal for this prediction problem, and that a small number of drug variables can be used to identify unprofitable enrollees. The results are both encouraging and concerning. While risk adjustment appears to have been reasonably successful at weakening the relationship between therapeutic-class-specific drug utilization and unprofitability, some classes remain predictive of insurer losses. The vulnerable enrollees whose prescription drug regimens include drugs in these classes may need special protection from regulators in health insurance market design.

  17. Identification of Upper and Lower Level Yield Strength in Materials

    PubMed Central

    Valíček, Jan; Harničárová, Marta; Kopal, Ivan; Palková, Zuzana; Kušnerová, Milena; Panda, Anton; Šepelák, Vladimír

    2017-01-01

    This work evaluates the possibility of identifying mechanical parameters, especially upper and lower yield points, by the analytical processing of specific elements of the topography of surfaces generated with abrasive waterjet technology. We developed a new system of equations, which are connected with each other in such a way that the result of a calculation is a comprehensive mathematical–physical model, which describes numerically as well as graphically the deformation process of material cutting using an abrasive waterjet. The results of our model have been successfully checked against those obtained by means of a tensile test. The main prospect for future applications of the method presented in this article concerns the identification of mechanical parameters associated with the prediction of material behavior. The findings of this study can contribute to a more detailed understanding of the relationships: material properties—tool properties—deformation properties. PMID:28832526

  18. Frequency domain transfer function identification using the computer program SYSFIT

    SciTech Connect

    Trudnowski, D.J.

    1992-12-01

    Because the primary application of SYSFIT for BPA involves studying power system dynamics, this investigation was geared toward simulating the effects that might be encountered in studying electromechanical oscillations in power systems. Although the intended focus of this work is power system oscillations, the studies are sufficiently generic that the results can be applied to many types of oscillatory systems with closely-spaced modes. In general, there are two possible ways of solving the optimization problem. One is to use a least-squares optimization function and to write the system in such a form that the problem becomes one of linear least-squares. The solution can then be obtained using a standard least-squares technique. The other method involves using a search method to obtain the optimal model. This method allows considerably more freedom in forming the optimization function and model, but it requires an initial guess of the system parameters. SYSFIT employs this second approach. Detailed investigations were conducted into three main areas: (1) fitting to exact frequency response data of a linear system; (2) fitting to the discrete Fourier transformation of noisy data; and (3) fitting to multi-path systems. The first area consisted of investigating the effects of alternative optimization cost function options; using different optimization search methods; incorrect model order; missing response data; closely-spaced poles; and closely-spaced pole-zero pairs. Within the second area, different noise colorations and levels were studied. In the third area, methods were investigated for improving fitting results by incorporating more than one system path. The following is a list of guidelines and properties developed from the study for fitting a transfer function to the frequency response of a system using optimization search methods.

  19. A computational approach to studying ageing at the individual level

    PubMed Central

    Mourão, Márcio A.; Schnell, Santiago; Pletcher, Scott D.

    2016-01-01

The ageing process is actively regulated throughout an organism's life, but studying the rate of ageing in individuals is difficult with conventional methods. Consequently, ageing studies typically make biological inference based on population mortality rates, which often do not accurately reflect the probabilities of death at the individual level. To study the relationship between individual and population mortality rates, we integrated in vivo switch experiments with in silico stochastic simulations to elucidate how carefully designed experiments allow key aspects of individual ageing to be deduced from group mortality measurements. As our case study, we used the recent report demonstrating that pheromones of the opposite sex decrease lifespan in Drosophila melanogaster by reversibly increasing population mortality rates. We showed that the population mortality reversal following pheromone removal was almost surely occurring in individuals, albeit more slowly than suggested by population measures. Furthermore, heterogeneity among individuals due to the inherent stochasticity of behavioural interactions skewed population mortality rates in middle age away from the individual-level trajectories of which they are composed. This article exemplifies how computational models function as important predictive tools for designing wet-laboratory experiments to use population mortality rates to understand how genetic and environmental manipulations affect ageing in the individual. PMID:26865300
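A minimal sketch of the kind of stochastic switch simulation described above, with invented numbers: hypothetical per-day death hazards, cohort size, and switch day, and no behavioural heterogeneity (which the published model did include). In this homogeneous cohort the observed population mortality rate simply tracks the individual hazard before and after the switch:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000                 # hypothetical cohort size
days = 100
switch_day = 30             # hypothetical day on which pheromone exposure ends

# Hypothetical per-day death hazards: elevated during exposure, reverting after
hazard_exposed, hazard_baseline = 0.03, 0.01

alive = np.ones(n, dtype=bool)
deaths_per_day = np.zeros(days)
for t in range(days):
    h = hazard_exposed if t < switch_day else hazard_baseline
    dies = alive & (rng.random(n) < h)        # each survivor dies with prob. h
    deaths_per_day[t] = dies.sum()
    alive &= ~dies

# Observed population mortality rate: deaths divided by the number still at risk
at_risk = n - np.concatenate([[0.0], np.cumsum(deaths_per_day)[:-1]])
pop_mortality = deaths_per_day / at_risk
```

Adding individual heterogeneity (e.g., randomizing each fly's effective hazard) is what makes the population rate deviate from the individual trajectories, as the abstract notes.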

  20. Identification of a Carbonized Body Using Implanted Surgical Plates: The Importance of Computed Tomography.

    PubMed

    Andrade, Vanessa Moreira; Stibich, Christian Abreu; de Santa Martha, Paulo Maurício; de Almeida, Casimiro Abreu Possante; Vieira, Andrea de Castro Domingos

    2017-09-01

    In addition to clinical examination, forensic odontologists can use diagnostic imaging as an auxiliary method for identification. This paper reports a case where forensic odontologists from the Afrânio Peixoto Legal Medicine Institute in Rio de Janeiro (Brazil) positively identified a carbonized and partially calcined body using oral and maxillofacial imaging. The cadaver showed several metallic plates fixed with metallic screws on bones of the neurocranium and viscerocranium. Family members provided spiral computed tomography scans of the skull and a panoramic radiograph that were acquired after an accident that required surgical procedures. Comparative analysis between the clinical exam and the maxillofacial images demonstrated complete coincidence, confirming the victim's identity. Dactyloscopy, which is the most commonly used method of identification, was not possible because of the body carbonization. Thus, diagnostic imaging, especially computed tomography, was essential for elucidation of this case. © 2017 American Academy of Forensic Sciences.

  1. Identification of Restrictive Computer and Software Variables among Preoperational Users of a Computer Learning Center.

    ERIC Educational Resources Information Center

    Kozubal, Diane K.

    While manufacturers have produced a wide variety of software said to be easy for even the youngest child to use, there are conflicting perspectives on computer issues such as ease of use, influence on meeting educational objectives, effects on procedural learning, and rationale for use with young children. Addressing these concerns, this practicum…

  2. Blind source computer device identification from recorded VoIP calls for forensic investigation.

    PubMed

    Jahanirad, Mehdi; Anuar, Nor Badrul; Wahab, Ainuddin Wahid Abdul

    2017-03-01

VoIP services provide fertile ground for criminal activity, so identifying the transmitting computer device from a recorded VoIP call may help the forensic investigator reveal useful information. It also proves the authenticity of a call recording submitted to the court as evidence. This paper extends a previous study on the use of recorded VoIP calls for blind source computer device identification. Although the initial results were promising, a theoretical explanation for them had yet to be found. The study suggested computing the entropy of mel-frequency cepstrum coefficients (entropy-MFCC) from near-silent segments as an intrinsic feature set that captures the device response function due to the tolerances in the electronic components of individual computer devices. By applying the supervised learning techniques of naïve Bayesian, linear logistic regression, neural networks and support vector machines to the entropy-MFCC features, state-of-the-art identification accuracy of near 99.9% was achieved on different sets of computer devices for both call recording and microphone recording scenarios. Furthermore, unsupervised learning techniques, including simple k-means, expectation-maximization and density-based spatial clustering of applications with noise (DBSCAN), provided promising results for the call recording dataset by assigning the majority of instances to their correct clusters. Copyright © 2017 Elsevier Ireland Ltd. All rights reserved.
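A rough sketch of the entropy-of-cepstral-coefficients idea, with one simplification flagged loudly: the DCT here is applied to the plain log power spectrum, not to a mel-scaled filterbank, so these are cepstrum-like coefficients rather than true MFCCs; the frame length, coefficient count, and histogram bin count are invented:

```python
import numpy as np
from scipy.fft import dct

def entropy_cepstral_features(signal, frame_len=256, n_coeffs=13, n_bins=20):
    """Shannon entropy of cepstrum-like coefficients across frames.

    Simplified stand-in for entropy-MFCC: a real implementation would apply
    a mel-scaled triangular filterbank before the DCT.
    """
    n_frames = len(signal) // frame_len
    frames = signal[:n_frames * frame_len].reshape(n_frames, frame_len)
    spectrum = np.abs(np.fft.rfft(frames * np.hanning(frame_len), axis=1))
    coeffs = dct(np.log(spectrum + 1e-12), axis=1)[:, :n_coeffs]

    # Entropy of each coefficient's empirical distribution over frames
    ent = np.empty(n_coeffs)
    for k in range(n_coeffs):
        hist, _ = np.histogram(coeffs[:, k], bins=n_bins)
        p = hist / hist.sum()
        p = p[p > 0]
        ent[k] = -(p * np.log2(p)).sum()
    return ent

rng = np.random.default_rng(1)
near_silence = 1e-3 * rng.standard_normal(16000)   # stand-in for a near-silent segment
features = entropy_cepstral_features(near_silence)
```

The resulting fixed-length feature vector is what would be fed to the supervised or unsupervised learners listed in the abstract.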

  3. Computational Identification of Key Regulators in Two Different Colorectal Cancer Cell Lines

    PubMed Central

    Wlochowitz, Darius; Haubrock, Martin; Arackal, Jetcy; Bleckmann, Annalen; Wolff, Alexander; Beißbarth, Tim; Wingender, Edgar; Gültas, Mehmet

    2016-01-01

    Transcription factors (TFs) are gene regulatory proteins that are essential for an effective regulation of the transcriptional machinery. Today, it is known that their expression plays an important role in several types of cancer. Computational identification of key players in specific cancer cell lines is still an open challenge in cancer research. In this study, we present a systematic approach which combines colorectal cancer (CRC) cell lines, namely 1638N-T1 and CMT-93, and well-established computational methods in order to compare these cell lines on the level of transcriptional regulation as well as on a pathway level, i.e., the cancer cell-intrinsic pathway repertoire. For this purpose, we firstly applied the Trinity platform to detect signature genes, and then applied analyses of the geneXplain platform to these for detection of upstream transcriptional regulators and their regulatory networks. We created a CRC-specific position weight matrix (PWM) library based on the TRANSFAC database (release 2014.1) to minimize the rate of false predictions in the promoter analyses. Using our proposed workflow, we specifically focused on revealing the similarities and differences in transcriptional regulation between the two CRC cell lines, and report a number of well-known, cancer-associated TFs with significantly enriched binding sites in the promoter regions of the signature genes. We show that, although the signature genes of both cell lines show no overlap, they may still be regulated by common TFs in CRC. Based on our findings, we suggest that canonical Wnt signaling is activated in 1638N-T1, but inhibited in CMT-93 through cross-talks of Wnt signaling with the VDR signaling pathway and/or LXR-related pathways. Furthermore, our findings provide indication of several master regulators being present such as MLK3 and Mapk1 (ERK2) which might be important in cell proliferation, migration, and invasion of 1638N-T1 and CMT-93, respectively. Taken together, we provide
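A minimal sketch of promoter scanning with a position weight matrix (PWM), the core operation behind the binding-site enrichment analysis mentioned above. The 4-position count matrix, pseudocount, and promoter string are invented stand-ins, not TRANSFAC data:

```python
import numpy as np

# Hypothetical 4-position nucleotide count matrix (columns: A, C, G, T),
# standing in for a TRANSFAC-style matrix; the consensus here is "ACGT"
counts = np.array([
    [8, 1, 1, 0],    # position 0: mostly A
    [0, 9, 0, 1],    # position 1: mostly C
    [1, 0, 8, 1],    # position 2: mostly G
    [1, 0, 1, 8],    # position 3: mostly T
], dtype=float)
background = np.array([0.25, 0.25, 0.25, 0.25])

# Log-odds PWM with a small pseudocount to avoid log(0)
probs = (counts + 0.25) / (counts.sum(axis=1, keepdims=True) + 1.0)
pwm = np.log2(probs / background)

BASE = {"A": 0, "C": 1, "G": 2, "T": 3}

def scan(promoter, pwm):
    """Score every window of the promoter; higher = closer to the motif."""
    w = pwm.shape[0]
    return np.array([
        sum(pwm[i, BASE[promoter[j + i]]] for i in range(w))
        for j in range(len(promoter) - w + 1)
    ])

scores = scan("TTACGTTT", pwm)
best = int(scores.argmax())    # offset of the best-scoring window
```

Enrichment analyses then compare how often high-scoring windows occur in signature-gene promoters versus a background set.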

  4. Level-set surface segmentation and registration for computing intrasurgical deformations

    NASA Astrophysics Data System (ADS)

    Audette, Michel A.; Peters, Terence M.

    1999-05-01

We propose a method for estimating intrasurgical brain shift for image-guided surgery. This method consists of five stages: the identification of relevant anatomical surfaces within the MRI/CT volume, range-sensing of the skin and cortex in the OR, rigid registration of the skin range image with its MRI/CT homologue, non-rigid motion tracking over time of cortical range images, and lastly, interpolation of this surface displacement information over the whole brain volume via a realistically valued finite element model of the head. This paper focuses on the anatomical surface identification and cortical range surface tracking problems. The surface identification scheme implements a recent algorithm which embeds 3D surface segmentation as the level-set of a 4D moving front. A by-product of this stage is a Euclidean distance and closest point map which is later exploited to speed up the rigid and non-rigid surface registration. The range-sensor uses both laser-based triangulation and defocusing techniques to produce a 2D range profile, and is linearly swept across the skin or cortical surface to produce a 3D range image. The surface registration technique is of the iterative closest point type, where each iteration benefits from looking up, rather than searching for, explicit closest point pairs. These explicit point pairs in turn are used in conjunction with a closed-form SVD-based rigid transformation computation and with fast recursive splines to make each rigid and non-rigid registration iteration essentially instantaneous. Our method is validated with a novel deformable brain-shaped phantom, made of Polyvinyl Alcohol Cryogel.
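A minimal sketch of the rigid part of the registration machinery described above: the closed-form SVD-based rigid transform plus iterative-closest-point iterations that look up, rather than search for, closest points via a k-d tree. The point cloud, transform, and iteration count are invented; the non-rigid recursive-spline stage is omitted:

```python
import numpy as np
from scipy.spatial import cKDTree

def best_rigid_transform(P, Q):
    """Closed-form SVD (Kabsch) rigid transform mapping point set P onto Q."""
    cp, cq = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cp).T @ (Q - cq)
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))          # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    return R, cq - R @ cp

def icp(source, target, iters=20):
    tree = cKDTree(target)                          # closest points looked up, not searched
    src = source.copy()
    for _ in range(iters):
        _, idx = tree.query(src)                    # explicit closest-point pairs
        R, t = best_rigid_transform(src, target[idx])
        src = src @ R.T + t
    return src

rng = np.random.default_rng(0)
target = rng.standard_normal((200, 3))              # stand-in range-image points
theta = 0.1                                         # small known misalignment
Rz = np.array([[np.cos(theta), -np.sin(theta), 0],
               [np.sin(theta),  np.cos(theta), 0],
               [0, 0, 1]])
source = target @ Rz.T + np.array([0.1, -0.05, 0.1])
aligned = icp(source, target)
err = np.abs(aligned - target).max()
```

With a precomputed distance/closest-point map, as in the paper, the `tree.query` lookup becomes an array indexing operation, which is what makes each iteration essentially instantaneous.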

  5. Quantum One Go Computation and the Physical Computation Level of Biological Information Processing

    NASA Astrophysics Data System (ADS)

    Castagnoli, Giuseppe

    2010-02-01

    By extending the representation of quantum algorithms to problem-solution interdependence, the unitary evolution part of the algorithm entangles the register containing the problem with the register containing the solution. Entanglement becomes correlation, or mutual causality, between the two measurement outcomes: the string of bits encoding the problem and that encoding the solution. In former work, we showed that this is equivalent to the algorithm knowing in advance 50% of the bits of the solution it will find in the future, which explains the quantum speed up. Mutual causality between bits of information is also equivalent to seeing quantum measurement as a many body interaction between the parts of a perfect classical machine whose normalized coordinates represent the qubit populations. This “hidden machine” represents the problem to be solved. The many body interaction (measurement) satisfies all the constraints of a nonlinear Boolean network “together and at the same time”—in one go—thus producing the solution. Quantum one go computation can formalize the physical computation level of the theories that place consciousness in quantum measurement. In fact, in visual perception, we see, thus recognize, thus process, a significant amount of information “together and at the same time”. Identifying the fundamental mechanism of consciousness with that of the quantum speed up gives quantum consciousness, with respect to classical consciousness, a potentially enormous evolutionary advantage.

  6. Parallel Computers for Region-Level Image Processing.

    DTIC Science & Technology

    1980-11-01

"corner" processor in each region, we can use it as the region's representative in the adjacency-graph-structured computer; but we need to give it the...graph of bounded degree (≤ 5). We now briefly describe how to construct a quadtree-structured computer corresponding to (the quadtree of) a given

  7. Identifying the Computer Competency Levels of Recreation Department Undergraduates

    ERIC Educational Resources Information Center

    Zorba, Erdal

    2011-01-01

Computer-based and web-based applications serve as major instructional tools to increase undergraduates' motivation at school. In the recreation field, usage of computer- and internet-based recreational applications has become more prevalent as a means of presenting visual and interactive entertainment activities. Recreation department undergraduates…

  8. Evolutionary Computation for the Identification of Emergent Behavior in Autonomous Systems

    NASA Technical Reports Server (NTRS)

    Terrile, Richard J.; Guillaume, Alexandre

    2009-01-01

    Over the past several years the Center for Evolutionary Computation and Automated Design at the Jet Propulsion Laboratory has developed a technique based on Evolutionary Computational Methods (ECM) that allows for the automated optimization of complex computationally modeled systems. An important application of this technique is for the identification of emergent behaviors in autonomous systems. Mobility platforms such as rovers or airborne vehicles are now being designed with autonomous mission controllers that can find trajectories over a solution space that is larger than can reasonably be tested. It is critical to identify control behaviors that are not predicted and can have surprising results (both good and bad). These emergent behaviors need to be identified, characterized and either incorporated into or isolated from the acceptable range of control characteristics. We use cluster analysis of automatically retrieved solutions to identify isolated populations of solutions with divergent behaviors.
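A minimal sketch of the final step described above: clustering solution behaviors to isolate a divergent population. The two-dimensional behavior descriptors, the cluster positions, and the deterministic k-means initialization are invented stand-ins for real evolved solutions:

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical behavior descriptors for automatically retrieved solutions
# (e.g., path length vs. energy used); one small divergent mode is planted
behaviors = np.vstack([
    rng.normal([1.0, 1.0], 0.1, size=(40, 2)),   # expected behavior cluster
    rng.normal([4.0, 0.2], 0.1, size=(10, 2)),   # divergent ("emergent") cluster
])

def kmeans(X, iters=50):
    """Tiny 2-means with a deterministic init (first and last point)."""
    centers = X[[0, -1]].astype(float)
    labels = np.zeros(len(X), dtype=int)
    for _ in range(iters):
        labels = np.argmin(((X[:, None] - centers[None]) ** 2).sum(-1), axis=1)
        for j in range(2):
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(axis=0)
    return labels, centers

labels, centers = kmeans(behaviors)
sizes = np.bincount(labels, minlength=2)
minority = int(sizes.argmin())   # small, isolated population: candidate emergent behavior
```

In practice the descriptors come from simulated mission runs, and the small, well-separated cluster is the set of solutions flagged for human inspection.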

  9. Reliability of frontal sinus by cone beam-computed tomography (CBCT) for individual identification.

    PubMed

    Cossellu, Gianguido; De Luca, Stefano; Biagi, Roberto; Farronato, Giampietro; Cingolani, Mariano; Ferrante, Luigi; Cameriere, Roberto

    2015-12-01

Analysis of the frontal sinus is an important tool in personal identification. Cone beam-computed tomography (CBCT) is also progressively replacing conventional radiography and multi-slice computed tomography (MSCT) in human identification. The aim of this study is to develop a reproducible technique and measurements from 3D reconstructions obtained with CBCT, for use in human identification. CBCT from 150 patients (91 female, 59 male), aged between 15 and 78 years, was analysed with the specific software program MIMICS 11.11 (Materialise N.V., Leuven, Belgium). Corresponding 3D volumes were generated, and the maximal dimensions along the three directions (x, y, z), X_M, Y_M, Z_M (in mm), the total volume, V_t (in mm³), and the total surface, S_t (in mm²), were calculated. Correlation analysis showed that sinus surfaces were strongly correlated with their volume (r = 0.976). Frontal sinuses were separate in 21 subjects (14 %), fused in 67 (44.6 %) and found on only one side (unilateral) in 9 (6 %). A Prominent Middle of Fused Sinus (PMS) was found in 53 subjects (35.3 %). The intra- (0.963-0.999) and inter-observer variability (0.973-0.999) showed great agreement and substantial homogeneity of evaluation.
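A minimal sketch of how such measurements could be computed from a segmented voxel mask. The mask, its block shape, and the 1 mm isotropic voxel size are invented; MIMICS computes these quantities internally:

```python
import numpy as np

# Hypothetical binary voxel mask of a segmented frontal sinus (1 mm isotropic
# voxels), standing in for a 3D reconstruction exported from the CT software
mask = np.zeros((40, 40, 40), dtype=bool)
mask[5:25, 10:20, 8:20] = True           # a 20 x 10 x 12 voxel block

# Maximal extents X_M, Y_M, Z_M along x, y, z (in mm, with 1 mm voxels)
idx = np.argwhere(mask)
X_M, Y_M, Z_M = (idx.max(axis=0) - idx.min(axis=0) + 1)

# Total volume V_t (mm^3): one mm^3 per voxel
V_t = int(mask.sum())

# Total surface S_t (mm^2): count voxel faces exposed to the background
S_t = 0
for axis in range(3):
    for shift in (1, -1):
        neighbor = np.roll(mask, shift, axis=axis)
        # np.roll wraps around; safe here because the block avoids the borders
        S_t += int(np.logical_and(mask, ~neighbor).sum())
```

On real, irregular sinus shapes a mesh-based surface estimate (e.g., marching cubes) would be less blocky than this face-counting approximation.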

  10. Computer-Assisted Photo Identification Outperforms Visible Implant Elastomers in an Endangered Salamander, Eurycea tonkawae

    PubMed Central

    Bendik, Nathan F.; Morrison, Thomas A.; Gluesenkamp, Andrew G.; Sanders, Mark S.; O’Donnell, Lisa J.

    2013-01-01

    Despite recognition that nearly one-third of the 6300 amphibian species are threatened with extinction, our understanding of the general ecology and population status of many amphibians is relatively poor. A widely-used method for monitoring amphibians involves injecting captured individuals with unique combinations of colored visible implant elastomer (VIE). We compared VIE identification to a less-invasive method – computer-assisted photographic identification (photoID) – in endangered Jollyville Plateau salamanders (Eurycea tonkawae), a species with a known range limited to eight stream drainages in central Texas. We based photoID on the unique pigmentation patterns on the dorsal head region of 1215 individual salamanders using identification software Wild-ID. We compared the performance of photoID methods to VIEs using both ‘high-quality’ and ‘low-quality’ images, which were taken using two different camera types and technologies. For high-quality images, the photoID method had a false rejection rate of 0.76% compared to 1.90% for VIEs. Using a comparable dataset of lower-quality images, the false rejection rate was much higher (15.9%). Photo matching scores were negatively correlated with time between captures, suggesting that evolving natural marks could increase misidentification rates in longer term capture-recapture studies. Our study demonstrates the utility of large-scale capture-recapture using photo identification methods for Eurycea and other species with stable natural marks that can be reliably photographed. PMID:23555669

  11. Trust-level risk identification guidance in the NHS East of England.

    PubMed

    Simsekler, M C Emre; Card, Alan J; Ward, James R; Clarkson, P John

    2015-01-01

In healthcare, a range of methods are used to improve patient safety through risk identification within the scope of risk management. However, there is no evidence determining what trust-level guidance exists to support risk identification in healthcare organisations. This study therefore aimed to determine such methods through the content analysis of trust-level risk management documents. Through Freedom of Information Act requests, risk management documents were requested from each acute, mental health and ambulance trust in the East of England region of the NHS for content analysis. Received documents were also compared with guidance from other safety-critical industries, to capture differences from those industries' documents and learning points for the healthcare field. A total of forty-eight documents were received from twenty-one trusts. Incident reporting was found to be the main method for risk identification. The documents provided insufficient support for the use of prospective risk identification methods, such as Prospective Hazard Analysis (PHA) methods, while the guidance from other industries extensively promoted such methods. The documents provided significant insight into prescribed risk identification practice in the chosen region. Based on the content analysis and guidance from other safety-critical industries, a number of recommendations were made, such as introducing the use of PHA methods in the creation and revision of risk management documents, and providing individual guidance on risk identification to promote patient safety further.

  12. Application of superimposition-based personal identification using skull computed tomography images.

    PubMed

    Ishii, Masuko; Yayama, Kazuhiro; Motani, Hisako; Sakuma, Ayaka; Yasjima, Daisuke; Hayakawa, Mutumi; Yamamoto, Seiji; Iwase, Hirotaro

    2011-07-01

Superimposition has been applied to skulls of unidentified skeletonized corpses as a personal identification method. The current method involves layering of a skull and a facial image of a suspected person and thus requires a real skeletonized skull. In this study, we scanned skulls of skeletonized corpses by computed tomography (CT), reconstructed three-dimensional (3D) images of the skulls from the CT images, and superimposed the 3D images with facial images of the corresponding persons taken while they were alive. Superimposition using 3D-reconstructed skull images demonstrated, as did superimposition using real skulls, an adequate degree of morphological consistency between the 3D-reconstructed skulls and the persons in the facial images. Three-dimensional skull images reconstructed from CT images can be saved as data files, and the use of these images in superimposition is effective for personal identification of unidentified bodies.

  13. Computational Acoustic Beamforming for Noise Source Identification for Small Wind Turbines

    PubMed Central

    Lien, Fue-Sang

    2017-01-01

    This paper develops a computational acoustic beamforming (CAB) methodology for identification of sources of small wind turbine noise. This methodology is validated using the case of the NACA 0012 airfoil trailing edge noise. For this validation case, the predicted acoustic maps were in excellent conformance with the results of the measurements obtained from the acoustic beamforming experiment. Following this validation study, the CAB methodology was applied to the identification of noise sources generated by a commercial small wind turbine. The simulated acoustic maps revealed that the blade tower interaction and the wind turbine nacelle were the two primary mechanisms for sound generation for this small wind turbine at frequencies between 100 and 630 Hz. PMID:28378012

  14. Computational Acoustic Beamforming for Noise Source Identification for Small Wind Turbines.

    PubMed

    Ma, Ping; Lien, Fue-Sang; Yee, Eugene

    2017-01-01

    This paper develops a computational acoustic beamforming (CAB) methodology for identification of sources of small wind turbine noise. This methodology is validated using the case of the NACA 0012 airfoil trailing edge noise. For this validation case, the predicted acoustic maps were in excellent conformance with the results of the measurements obtained from the acoustic beamforming experiment. Following this validation study, the CAB methodology was applied to the identification of noise sources generated by a commercial small wind turbine. The simulated acoustic maps revealed that the blade tower interaction and the wind turbine nacelle were the two primary mechanisms for sound generation for this small wind turbine at frequencies between 100 and 630 Hz.
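A minimal sketch of narrowband delay-and-sum (steered response power) beamforming, the basic mechanism behind acoustic source maps such as those described above. The array geometry, frequency, and source position are invented, and the signals here are ideal free-field simulations rather than CAB's computed aeroacoustic fields:

```python
import numpy as np

c = 343.0                 # speed of sound, m/s
f = 500.0                 # narrowband frequency of interest, Hz
k = 2 * np.pi * f / c     # wavenumber

# Hypothetical linear microphone array and a single point source
mics = np.column_stack([np.linspace(-0.5, 0.5, 16), np.zeros(16)])
src = np.array([0.2, 1.0])

# Ideal narrowband signals received at each microphone (free field, no noise)
d_src = np.linalg.norm(mics - src, axis=1)
signals = np.exp(-1j * k * d_src) / d_src

# Steered response power over a grid of candidate source positions
xs = np.linspace(-0.5, 0.5, 41)
ys = np.linspace(0.5, 1.5, 41)
power = np.zeros((len(ys), len(xs)))
for iy, y in enumerate(ys):
    for ix, x in enumerate(xs):
        d = np.linalg.norm(mics - np.array([x, y]), axis=1)
        steer = np.exp(1j * k * d)            # undo the propagation phase
        power[iy, ix] = np.abs(np.sum(signals * steer)) ** 2

py, px = np.unravel_index(power.argmax(), power.shape)
est = (float(xs[px]), float(ys[py]))          # acoustic-map peak location
```

The `power` array is the acoustic map; its peak coincides with the true source, with range (y) resolution coarser than cross-range, as is typical for near-field focusing.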

  15. Identification of areas with high levels of untreated dental caries.

    PubMed

    Ellwood, R P; O'Mullane, D M

    1996-02-01

    In order to examine the geographical variation of dental health within 10 county districts in North Wales, 3538 children were examined. The associations between three demographic indicators, based on the 1981 OPCS census, and dental health outcomes were assessed for electoral wards within the county districts. The Townsend and Jarman indices were the first two indicators employed and the third was based on a mathematical model representing the variation in the mean number of untreated decayed surfaces per person for the wards. This model was developed using the children examined in the five most westerly county districts. Using the data derived from the five most easterly county districts, the three indicators were assessed. All three showed strong correlations (r > or = 0.88) with dental health. These results indicate that measures of dental health based on large administrative units may obscure variation within them. It is concluded that geographical methods of this type may be useful for targeting dental resources at small areas with high levels of need.

  16. Identification of Semiconductor Defects through Constant-Fermi-Level Ab Initio Molecular Dynamics: Application to GaAs

    NASA Astrophysics Data System (ADS)

    Bouzid, Assil; Pasquarello, Alfredo

    2017-07-01

We show that constant-Fermi-level ab initio molecular dynamics can be used as a computer-based tool to reveal and control relevant defects in semiconductor materials. In this scheme, the Fermi level can be set at any position within the band gap during the defect generation process, in analogy to experimental growth conditions in the presence of extra electrons or holes. The scheme is illustrated in the case of GaAs, for which we generate melt-quenched amorphous structures through molecular dynamics at various Fermi levels. By a combined analysis which involves both the atomic structure and a Wannier-function decomposition of the electronic structure, we achieve a detailed description of the generated defects as a function of the Fermi level. This leads to the identification of As-As homopolar bonds and Ga dangling bonds for Fermi levels set in the vicinity of the valence band. These defects convert into As dangling bonds and Ga-Ga homopolar bonds, as the Fermi level moves toward the conduction band. This demonstrates a computer-aided procedure to identify semiconductor defects in an unbiased way.

  17. Automatic Measurement of Water Levels by Using Image Identification Method in Open Channel

    NASA Astrophysics Data System (ADS)

    Chung Yang, Han; Xue Yang, Jia

    2014-05-01

Water level data are indispensable to hydrology research and are important information for hydraulic engineering and the overall utilization of water resources. Water level information can be transmitted to the management office over the network so that the office knows whether the river level is exceeding the warning line. Existing water level measurement methods present water levels only as numerical data, without images, so the data lack a sense of reality; scenes such as a rising or overflowing river cannot be captured by the existing methods at the same time. Therefore, this research employs a new, improved method for water level measurement. Through a video surveillance system that records images on site, an image of the water surface is snapped, and the snapped image is then pre-processed and compared with its altitude reference value to obtain a water level altitude value. With ever-growing technology, the application scope of image identification is widely increasing. This research attempts to use image identification technology to analyze water levels automatically. The image observation method used in this research is a non-contact water level gauge, but it is quite different from other ones: it is cheap, and the facilities can be set up beside a river embankment or near houses, so that the impact of external factors is significantly reduced and a real scene picture can be transmitted through wireless transmission. According to a dynamic water flow test held in an indoor experimental channel, all error levels of water level identification were less than 2%, meaning that the image identification could achieve identification at different water levels. This new measurement method can offer instant river level figures and on-site video so that a

  18. VALIDATION OF AN ALGORITHM FOR NONMETALLIC INTRAOCULAR FOREIGN BODIES' COMPOSITION IDENTIFICATION BASED ON COMPUTED TOMOGRAPHY AND MAGNETIC RESONANCE IMAGING.

    PubMed

    Moisseiev, Elad; Barequet, Dana; Zunz, Eran; Barak, Adiel; Mardor, Yael; Last, David; Goez, David; Segal, Zvi; Loewenstein, Anat

    2015-09-01

To validate and evaluate the accuracy of an algorithm for the identification of nonmetallic intraocular foreign body composition based on computed tomography and magnetic resonance imaging. An algorithm for the identification of 10 nonmetallic materials based on computed tomography and magnetic resonance imaging has been previously determined in an ex vivo porcine model. Materials were classified into 4 groups (plastic, glass, stone, and wood). The algorithm was tested by 40 ophthalmologists, who completed a questionnaire including 10 sets of computed tomography and magnetic resonance images of eyes with intraocular foreign bodies and were asked to use the algorithm to identify their compositions. Rates of exact material identification and group identification were measured. Exact material identification was achieved in 42.75% of the cases, and correct group identification in 65%. Using the algorithm, 6 of the materials were exactly identified by over 50% of the participants, and 7 were correctly classified according to their groups by over 75% of the participants. The algorithm was validated and was found to enable correct identification of nonmetallic intraocular foreign body composition in the majority of cases. This is the first study to report and validate a clinical tool allowing identification of intraocular foreign body composition based on appearance in computed tomography and magnetic resonance imaging, which was previously impossible.

  19. Particle Identification on an FPGA Accelerated Compute Platform for the LHCb Upgrade

    NASA Astrophysics Data System (ADS)

Färber, Christian; Schwemmer, Rainer; Machen, Jonathan; Neufeld, Niko

    2017-07-01

The current LHCb readout system will be upgraded in 2018 to a “triggerless” readout of the entire detector at the Large Hadron Collider collision rate of 40 MHz. The corresponding bandwidth from the detector down to the foreseen dedicated computing farm (event filter farm), which acts as the trigger, has to be increased by a factor of almost 100 from currently 500 Gb/s up to 40 Tb/s. The event filter farm will preanalyze the data and will select the events on an event-by-event basis. This will reduce the bandwidth down to a manageable size to write the interesting physics data to tape. The design of such a system is a challenging task, which is why different new technologies are considered and have to be investigated for the different parts of the system. For usage in the event building farm or in the event filter farm (trigger), an experimental field programmable gate array (FPGA) accelerated computing platform is considered and tested. FPGA compute accelerators are used more and more in standard servers, for example for Microsoft Bing search or Baidu search. The platform we use hosts a general Intel CPU and a high-performance FPGA linked via the high-speed Intel QuickPath Interconnect. An accelerator is implemented on the FPGA. It is very likely that these platforms, which are built, in general, for high-performance computing, are also very interesting for the high-energy physics community. First, the performance results of smaller test cases performed at the beginning are presented. Afterward, a part of the existing LHCb RICH particle identification is ported to the experimental FPGA accelerated platform and tested. We have compared the performance of the LHCb RICH particle identification running on a normal CPU with the performance of the same algorithm running on the Xeon-FPGA compute accelerator platform.

  20. Impact of PECS tablet computer app on receptive identification of pictures given a verbal stimulus.

    PubMed

    Ganz, Jennifer B; Hong, Ee Rea; Goodwyn, Fara; Kite, Elizabeth; Gilliland, Whitney

    2015-04-01

    The purpose of this brief report was to determine the effect on receptive identification of photos of a tablet computer-based augmentative and alternative communication (AAC) system with voice output. A multiple baseline single-case experimental design across vocabulary words was implemented. One participant, a preschool-aged boy with autism and little intelligible verbal language, was included in the study. Although a functional relation between the intervention and the dependent variable was not established, the intervention did appear to result in mild improvement for two of the three vocabulary words selected. The authors recommend further investigations of the collateral impacts of AAC on skills other than expressive language.

  1. Trusted Computing Exemplar: Low-Level Design Document Standards

    DTIC Science & Technology

    2014-12-12

for writing low-level design documents. Low-level design documents provide a detailed description of one or more modules. The level of detail should...further this goal, the NPS CAG and NPS CISR ask that any derivative products, code, writings, and/or other derivative materials, include an attribution...functionality, whether accidental or intentional. This document provides the standard format for writing low-level design documents. Low-level design

  2. Social Identification and Interpersonal Communication in Computer-Mediated Communication: What You Do versus Who You Are in Virtual Groups

    ERIC Educational Resources Information Center

    Wang, Zuoming; Walther, Joseph B.; Hancock, Jeffrey T.

    2009-01-01

    This study investigates the influence of interpersonal communication and intergroup identification on members' evaluations of computer-mediated groups. Participants (N= 256) in 64 four-person groups interacted through synchronous computer chat. Subgroup assignments to minimal groups instilled significantly greater in-group versus out-group…

  4. Computerized nipple identification for multiple image analysis in computer-aided diagnosis

    SciTech Connect

    Zhou, Chuan; Chan, Heang-Ping; Paramagul, Chintana; Roubidoux, Marilyn A.; Sahiner, Berkman; Hadjiiski, Lubomir M.; Petrick, Nicholas

    2004-10-01

    Correlation of information from multiple-view mammograms (e.g., MLO and CC views, bilateral views, or current and prior mammograms) can improve the performance of breast cancer diagnosis by radiologists or by computer. The nipple is a reliable and stable landmark on mammograms for the registration of multiple mammograms. However, accurate identification of nipple location on mammograms is challenging because of the variations in image quality and in the nipple projections, resulting in some nipples being nearly invisible on the mammograms. In this study, we developed a computerized method to automatically identify the nipple location on digitized mammograms. First, the breast boundary was obtained using a gradient-based boundary tracking algorithm, and then the gray level profiles along the inside and outside of the boundary were identified. A geometric convergence analysis was used to limit the nipple search to a region of the breast boundary. A two-stage nipple detection method was developed to identify the nipple location using the gray level information around the nipple, the geometric characteristics of nipple shapes, and the texture features of glandular tissue or ducts which converge toward the nipple. At the first stage, a rule-based method was designed to identify the nipple location by detecting significant changes of intensity along the gray level profiles inside and outside the breast boundary and the changes in the boundary direction. At the second stage, a texture orientation-field analysis was developed to estimate the nipple location based on the convergence of the texture pattern of glandular tissue or ducts towards the nipple. The nipple location was finally determined from the detected nipple candidates by a rule-based confidence analysis. In this study, 377 and 367 randomly selected digitized mammograms were used for training and testing the nipple detection algorithm, respectively. Two experienced radiologists identified the nipple locations

  5. Computational aspects of hot-wire identification of thermal conductivity and diffusivity under high temperature

    NASA Astrophysics Data System (ADS)

    Vala, Jiří; Jarošová, Petra

    2016-07-01

    Development of advanced materials resistant to high temperature, needed namely for the design of heat storage for low-energy and passive buildings, requires simple, inexpensive and reliable methods of identification of their temperature-sensitive thermal conductivity and diffusivity, covering both a well-designed experimental setup and the implementation of robust and efficient computational algorithms. Special geometrical configurations offer a possibility of quasi-analytical evaluation of temperature development for direct problems, whereas inverse problems of simultaneous evaluation of thermal conductivity and diffusivity must be handled carefully, using some least-squares (minimum variance) arguments. This paper demonstrates the proper mathematical and computational approach to such a model problem, exploiting the radial symmetry of hot-wire measurements, and includes its numerical implementation.
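
For the ideal line-source (hot-wire) model, the quasi-analytical evaluation reduces to a linear least-squares fit in ln t. The sketch below assumes that standard textbook model, not the authors' particular algorithm; the function name and parameters are illustrative:

```python
import math

def fit_hot_wire(times, dT, q, r):
    """Least-squares fit of the ideal hot-wire model
        dT(t) = q/(4*pi*k) * (ln(4*a*t/r**2) - gamma),
    which is linear in ln(t):  dT = s*ln(t) + b  with
        s = q/(4*pi*k),  b = s*(ln(4*a/r**2) - gamma).
    times: measurement times [s]; dT: temperature rises [K];
    q: heating power per unit wire length [W/m]; r: probe radius [m].
    Returns (thermal conductivity k, thermal diffusivity a)."""
    gamma = 0.5772156649015329          # Euler-Mascheroni constant
    x = [math.log(t) for t in times]
    n = len(x)
    mx, my = sum(x) / n, sum(dT) / n
    s = sum((xi - mx) * (yi - my) for xi, yi in zip(x, dT)) \
        / sum((xi - mx) ** 2 for xi in x)
    b = my - s * mx
    k = q / (4 * math.pi * s)           # conductivity from the slope
    a = (r ** 2 / 4) * math.exp(b / s + gamma)  # diffusivity from the intercept
    return k, a
```

This is the classical "slope/intercept" evaluation; the paper's inverse formulation additionally treats the temperature sensitivity of both parameters.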

  6. An effective computational tool for parametric studies and identification problems in materials mechanics

    NASA Astrophysics Data System (ADS)

    Bolzon, Gabriella; Buljak, Vladimir

    2011-12-01

    Parametric studies and identification problems require repeated analyses in which only a few of the input parameters defining the problem of interest are varied, often in connection with complex numerical simulations. In fact, physical phenomena relevant to several practical applications involve coupled material and geometry non-linearities. In these situations, accurate but expensive computations, usually carried out by the finite element method, may be replaced by numerical procedures based on proper orthogonal decomposition combined with radial basis function interpolation. Besides drastically reducing computing times and costs, this approach is capable of retaining the essential features of the considered system responses while filtering out most disturbances. These features are illustrated in this paper with specific reference to some elastic-plastic problems. The presented results can, however, be easily extended to other meaningful engineering situations.
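
The POD-plus-RBF surrogate idea can be sketched for a one-parameter problem as follows; the Gaussian kernel and its shape parameter are illustrative choices, not necessarily those of the paper:

```python
import numpy as np

def build_pod_rbf(params, snapshots, n_modes=3, eps=20.0):
    """params: (N,) training parameter values; snapshots: (m, N) matrix whose
    columns are precomputed full-order solutions. Returns a fast surrogate
    p -> approximate solution, obtained by proper orthogonal decomposition
    (truncated SVD) of the snapshot matrix and Gaussian radial basis function
    interpolation of the modal amplitudes over the parameter."""
    U, S, Vt = np.linalg.svd(snapshots, full_matrices=False)
    Phi = U[:, :n_modes]                  # POD basis (dominant left singular vectors)
    A = Phi.T @ snapshots                 # modal amplitudes at training parameters
    P = np.asarray(params, float)
    K = np.exp(-eps * (P[:, None] - P[None, :]) ** 2)   # RBF kernel matrix
    W = np.linalg.solve(K, A.T)           # interpolation weights, (N, n_modes)

    def surrogate(p):
        k = np.exp(-eps * (p - P) ** 2)
        return Phi @ (W.T @ k)            # interpolated amplitudes -> full field
    return surrogate
```

Each surrogate evaluation costs only a kernel evaluation and two small matrix-vector products, which is the source of the drastic speed-up over repeated finite element runs.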

  7. Three Computer Programs for Use in Introductory Level Physics Laboratories.

    ERIC Educational Resources Information Center

    Kagan, David T.

    1984-01-01

    Describes three computer programs which operate on Apple II+ microcomputers: (1) a menu-driven graph drawing program; (2) a simulation of the Millikan oil drop experiment; and (3) a program used to study the half-life of silver. (Instructions for obtaining the programs from the author are included.) (JN)

  9. Efficient Computation of the Topology of Level Sets

    SciTech Connect

    Pascucci, V; Cole-McLaughlin, K

    2002-07-19

    This paper introduces two efficient algorithms that compute the Contour Tree of a 3D scalar field F and its augmented version with the Betti numbers of each isosurface. The Contour Tree is a fundamental data structure in scientific visualization that is used to pre-process the domain mesh to allow optimal computation of isosurfaces with minimal storage overhead. The Contour Tree can also be used to build user interfaces reporting the complete topological characterization of a scalar field, as shown in Figure 1. In the first part of the paper we present a new scheme that augments the Contour Tree with the Betti numbers of each isocontour in linear time. We show how to extend the scheme introduced in [3] with the Betti number computation without increasing its complexity. Thus we improve on the time complexity of our previous approach [8] from O(m log m) to O(n log n + m), where m is the number of tetrahedra and n is the number of vertices in the domain of F. In the second part of the paper we introduce a new divide-and-conquer algorithm that computes the Augmented Contour Tree for scalar fields defined on rectilinear grids. The central part of the scheme computes the output contour tree by merging two intermediate contour trees and is independent of the interpolant. In this way we confine any knowledge regarding a specific interpolant to an oracle that computes the tree for a single cell. We have implemented this oracle for the trilinear interpolant and plan to replace it with higher-order interpolants when needed. The complexity of the scheme is O(n + t log n), where t is the number of critical points of F. This allows, for the first time, the Contour Tree to be computed in linear time in many practical cases, when t = O(n^(1-ε)). We report the running times for a parallel implementation of our algorithm, showing good scalability with the number of processors.
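
The vertex-sweep half of such computations (the join tree, which contour-tree algorithms later merge with a symmetric split tree) can be sketched with a union-find over vertices sorted by value. The graph encoding and arc representation below are illustrative, not the paper's implementation:

```python
from collections import defaultdict

def merge_tree_edges(values, edges):
    """Join tree of a scalar field sampled on a graph: sweep vertices from
    highest to lowest value, maintaining connected components of the
    superlevel set with a union-find, and record an arc each time a vertex
    attaches an existing component. values: scalar per vertex;
    edges: (u, v) index pairs. Returns arcs as (component_root, vertex)."""
    n = len(values)
    adj = defaultdict(list)
    for u, v in edges:
        adj[u].append(v)
        adj[v].append(u)
    parent = list(range(n))

    def find(i):                      # union-find with path halving
        while parent[i] != i:
            parent[i] = parent[parent[i]]
            i = parent[i]
        return i

    seen, arcs = set(), []
    for v in sorted(range(n), key=lambda i: -values[i]):
        seen.add(v)
        for w in adj[v]:
            if w in seen:
                rw, rv = find(w), find(v)
                if rw != rv:          # v joins (or merges) components
                    arcs.append((rw, v))
                    parent[rw] = rv
    return arcs
```

A vertex that appears as the second endpoint of two arcs in the same sweep step is a join saddle, where two maxima's components meet.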

  10. Computer-driven automatic identification of locomotion states in Caenorhabditis elegans.

    PubMed

    Hoshi, Katsunori; Shingai, Ryuzo

    2006-10-30

    We developed a computer-driven tracking system for the automated analysis of the locomotion of Caenorhabditis elegans. The algorithm for the identification of locomotion states on agar plates (forward movement, backward movement, rest, and curl) includes the identification of the worm's head and tail. The head and tail are first assigned using three criteria based on time-sequential binary images of the worm, and the final determination is made by majority vote among the three criteria, which improves robustness. The system allowed us to identify locomotion states and to reconstruct the path of a worm using more than 1 h of data. Based on 5-min image sequences from a total of 230 individual wild-type worms and 22 mutants, the average error of identification of the head/tail for all strains was 0.20%. The system was used to analyze 70 min of locomotion for wild-type and two mutant strains after a worm was transferred from a seeded plate to a bacteria-free assay plate. The error of identifying the state was less than 1%, which is sufficiently accurate for locomotion studies.
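
The majority-vote assignment and the per-frame state labeling can be sketched as below; the threshold value and the three stand-in criteria named in the comments are illustrative, not the paper's calibrated ones:

```python
def assign_head(criteria_say_A_is_head):
    """Majority vote over three boolean criteria (e.g. the head end moves
    more, is thinner, and leads the direction of travel -- stand-ins for
    the paper's actual criteria). Returns True if endpoint A is the head.
    A single misfiring criterion is outvoted, which is what improves
    robustness."""
    return sum(criteria_say_A_is_head) >= 2

def classify_state(speed, moving_toward_head, curled, v_rest=0.02):
    """Label one frame's locomotion state from simple kinematics.
    speed: centroid speed [mm/s]; the rest threshold is illustrative."""
    if curled:
        return "curl"
    if speed < v_rest:
        return "rest"
    return "forward" if moving_toward_head else "backward"
```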

  11. Automatic de-identification of electronic medical records using token-level and character-level conditional random fields.

    PubMed

    Liu, Zengjian; Chen, Yangxin; Tang, Buzhou; Wang, Xiaolong; Chen, Qingcai; Li, Haodi; Wang, Jingfeng; Deng, Qiwen; Zhu, Suisong

    2015-12-01

    De-identification, identifying and removing all protected health information (PHI) present in clinical data including electronic medical records (EMRs), is a critical step in making clinical data publicly available. The 2014 i2b2 (Center of Informatics for Integrating Biology and Bedside) clinical natural language processing (NLP) challenge sets up a track for de-identification (track 1). In this study, we propose a hybrid system based on both machine learning and rule approaches for the de-identification track. In our system, PHI instances are first identified by two (token-level and character-level) conditional random fields (CRFs) and a rule-based classifier, and then are merged by some rules. Experiments conducted on the i2b2 corpus show that our system submitted for the challenge achieves the highest micro F-scores of 94.64%, 91.24% and 91.63% under the "token", "strict" and "relaxed" criteria respectively, which is among the top-ranked systems in the 2014 i2b2 challenge. After integrating some refined localization dictionaries, our system is further improved with F-scores of 94.83%, 91.57% and 91.95% under the "token", "strict" and "relaxed" criteria respectively. Copyright © 2015 Elsevier Inc. All rights reserved.
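
The rule-based merging step can be sketched as follows. The span representation and the keep-the-longest-overlap rule are assumptions for illustration; the abstract does not spell out the submitted system's exact merging rules:

```python
def merge_spans(token_spans, char_spans, rule_spans):
    """Merge PHI predictions from several taggers (e.g. a token-level CRF,
    a character-level CRF and a rule-based classifier) by simple rules:
    take the union of all predicted spans, and when spans overlap keep the
    longest one. Spans are (start, end, label) with end exclusive."""
    spans = sorted(set(token_spans) | set(char_spans) | set(rule_spans),
                   key=lambda s: (s[0], -(s[1] - s[0])))  # by start, longer first
    merged = []
    for s in spans:
        if merged and s[0] < merged[-1][1]:        # overlaps the previous span
            if s[1] - s[0] > merged[-1][1] - merged[-1][0]:
                merged[-1] = s                     # keep the longer span
        else:
            merged.append(s)
    return merged
```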

  12. Precision of cephalometric landmark identification: Cone-beam computed tomography vs conventional cephalometric views

    PubMed Central

    Ludlow, John B.; Gubler, Maritzabel; Cevidanes, Lucia; Mol, André

    2009-01-01

    Introduction In this study, we compared the precision of landmark identification using displays of multi-planar cone-beam computed tomographic (CBCT) volumes and conventional lateral cephalograms (Ceph). Methods Twenty presurgical orthodontic patients were radiographed with conventional Ceph and CBCT techniques. Five observers plotted 24 landmarks using computer displays of multi-planar reconstruction (MPR) CBCT and Ceph views during separate sessions. Absolute differences between each observer’s plot and the mean of all observers were averaged as 1 measure of variability (ODM). The absolute difference of each observer from any other observer was averaged as a second measure of variability (DEO). ANOVA and paired t tests were used to analyze variability differences. Results Radiographic modality and landmark were significant at P <0.0001 for DEO and ODM calculations. DEO calculations of observer variability were consistently greater than ODM. The overall correlation of 1920 paired ODM and DEO measurements was excellent at 0.972. All bilateral landmarks had increased precision when identified in the MPR views. Mediolateral variability was statistically greater than anteroposterior or caudal-cranial variability for 5 landmarks in the MPR views. Conclusions The MPR displays of CBCT volume images provide generally more precise identification of traditional cephalometric landmarks. More precise location of condylion, gonion, and orbitale overcomes the problem of superimposition of these bilateral landmarks seen in Ceph. Greater variability of certain landmarks in the mediolateral direction is probably related to inadequate definition of the landmarks in the third dimension. PMID:19732656
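
The two variability measures can be computed directly from the observers' plotted coordinates; a minimal 2D sketch for one landmark (the study applies them per axis and per modality):

```python
import math

def odm_deo(plots):
    """plots: one landmark's coordinates as plotted by each observer,
    e.g. [(x1, y1), (x2, y2), ...]. Returns the study's two variability
    measures: ODM, the mean absolute distance of each observer's plot from
    the mean of all observers, and DEO, the mean absolute distance of each
    observer's plot from every other observer's plot."""
    n = len(plots)
    cx = sum(p[0] for p in plots) / n
    cy = sum(p[1] for p in plots) / n
    odm = sum(math.dist(p, (cx, cy)) for p in plots) / n
    deo = sum(math.dist(plots[i], plots[j])
              for i in range(n) for j in range(n) if i != j) / (n * (n - 1))
    return odm, deo
```

By the triangle inequality DEO can never be smaller than ODM, which matches the abstract's observation that DEO calculations were consistently greater.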

  13. Reliable identification at the species level of Brucella isolates with MALDI-TOF-MS

    PubMed Central

    2011-01-01

    Background The genus Brucella contains highly infectious species that are classified as biological threat agents. The timely detection and identification of the microorganism involved is essential for an effective response not only to biological warfare attacks but also to natural outbreaks. Matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-TOF-MS) is a rapid method for the analysis of biological samples. The advantages of this method, compared to conventional techniques, are rapidity, cost-effectiveness, accuracy and suitability for the high-throughput identification of bacteria. Discrepancies between taxonomy and genetic relatedness on the species and biovar level complicate the development of detection and identification assays. Results In this study, the accurate identification of Brucella species using MALDI-TOF-MS was achieved by constructing a Brucella reference library based on multilocus variable-number tandem repeat analysis (MLVA) data. By comparing MS-spectra from Brucella species against a custom-made MALDI-TOF-MS reference library, MALDI-TOF-MS could be used as a rapid identification method for Brucella species. In this way, 99.3% of the 152 isolates tested were identified at the species level, and B. suis biovar 1 and 2 were identified at the level of their biovar. This result demonstrates that for Brucella, even minimal genomic differences between these serovars translate to specific proteomic differences. Conclusions MALDI-TOF-MS can be developed into a fast and reliable identification method for genetically highly related species when potential taxonomic and genetic inconsistencies are taken into consideration during the generation of the reference library. PMID:22192890

  14. Toward a Computational Neuropsychology of High-Level Vision.

    DTIC Science & Technology

    1984-08-20

    known as 'visual agnosia' (also called 'mindblindness'); this patient failed to recognize her nurses, got lost frequently when travelling familiar routes...visual agnosia are not blind: these patients can compare two shapes reliably when both are visible, but they cannot...visually recognize what an object is (although many can recognize objects by touch). This sort of agnosia has been well-documented in the literature (see

  15. SLIDE: automatic spine level identification system using a deep convolutional neural network.

    PubMed

    Hetherington, Jorden; Lessoway, Victoria; Gunka, Vit; Abolmaesumi, Purang; Rohling, Robert

    2017-07-01

    Percutaneous spinal needle insertion procedures often require proper identification of the vertebral level to effectively and safely deliver analgesic agents. The current clinical method involves "blind" identification of the vertebral level through manual palpation of the spine, which has only 30% reported accuracy. Therefore, there is a need for better anatomical identification prior to needle insertion. A real-time system was developed to identify the vertebral level from a sequence of ultrasound images, following a clinical imaging protocol. The system uses a deep convolutional neural network (CNN) to classify transverse images of the lower spine. Several existing CNN architectures were implemented, utilizing transfer learning, and compared for adequacy in a real-time system. In the system, the CNN output is processed, using a novel state machine, to automatically identify vertebral levels as the transducer moves up the spine. Additionally, a graphical display was developed and integrated within 3D Slicer. Finally, an augmented reality display, projecting the level onto the patient's back, was also designed. A small feasibility study [Formula: see text] evaluated performance. The proposed CNN successfully discriminates ultrasound images of the sacrum, intervertebral gaps, and vertebral bones, achieving 88% 20-fold cross-validation accuracy. Seventeen of 20 test ultrasound scans had successful identification of all vertebral levels, processed at real-time speed (40 frames/s). A machine learning system is presented that successfully identifies lumbar vertebral levels. The small study on human subjects demonstrated real-time performance. A projection-based augmented reality display was used to show the vertebral level directly on the subject adjacent to the puncture site.
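
The post-processing state machine can be sketched roughly as below. The label set, the L5-to-L1 naming and the absence of frame debouncing are simplifications of the actual system:

```python
def track_levels(frames):
    """frames: per-frame CNN labels in temporal order, from a sweep that
    starts below the sacrum and moves up the spine; labels are 'sacrum',
    'bone' (vertebra) or 'gap' (intervertebral space). Counts vertebra
    episodes encountered after the sacrum and names them L5, L4, ...
    A simplified stand-in for the paper's state machine, which also
    handles noisy frame classifications."""
    state, level, identified = "pre", 0, []
    for f in frames:
        if state == "pre":
            if f == "sacrum":
                state = "after_sacrum"      # sweep has reached the sacrum
        elif f == "bone" and state != "in_bone":
            level += 1                      # new vertebra entered
            identified.append(f"L{6 - level}")
            state = "in_bone"
        elif f == "gap":
            state = "after_sacrum"          # left the current vertebra
    return identified
```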

  16. A Functional Level Preprocessor for Computer Aided Digital Design.

    DTIC Science & Technology

    1980-12-01

    the parsing of user input, is based on that for the computer language, PASCAL. The procedure is the author's original design. Each line of input...NIKLAUS WIRTH. PASCAL USER MANUAL AND REPORT. NEW YORK, NY: SPRINGER-VERLAG, 1978...LANCASTER, DON. CMOS COOKBOOK. INDIANAPOLIS, IND: HOWARD...messages generated by SISL during its last run. Each message is of the format: subroutine generating the message, format number, and

  17. A new experimental approach to computer-aided face/skull identification in forensic anthropology.

    PubMed

    Ricci, Alessio; Marella, Gian Luca; Apostol, Mario Alexandru

    2006-03-01

    The present study introduces a new approach to computer-assisted face/skull matching used for personal identification purposes in forensic anthropology. In this experiment, the authors formulated an algorithm able to identify the face of a person suspected to have disappeared, by comparing the respective person's facial image with the skull radiograph. A total of 14 subjects were selected for the study, from which a facial photograph and skull radiograph were taken and ultimately compiled into a database, saved to the hard drive of a computer. The photographs of the faces and corresponding skull radiographs were then drafted using common photographic software, taking caution not to alter the informational content of the images. Once computer generated, the facial images and menu were displayed on a color monitor. In the first phase, a few anatomic points of each photograph were selected and marked with a cross to facilitate and more accurately match the face with its corresponding skull. In the second phase, the above mentioned cross grid was superimposed on the radiographic image of the skull and brought to scale. In the third phase, the crosses were transferred to the cranial points of the radiograph. In the fourth phase, the algorithm calculated the distance of each transferred cross and the corresponding average. The smaller the mean value, the greater the index of similarity between the face and skull. A total of 196 cross-comparisons were conducted, with positive identification resulting in each case. Hence, the algorithm matched a facial photograph to the correct skull in 100% of the cases.
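
The core of the matching algorithm, as described, reduces to a mean point-to-point distance; the function names and database layout below are illustrative:

```python
import math

def mean_cross_distance(transferred, cranial):
    """Mean distance between the crosses transferred from the facial
    photograph and the corresponding cranial points on the radiograph;
    the smaller the mean, the greater the index of similarity."""
    return sum(math.dist(t, c)
               for t, c in zip(transferred, cranial)) / len(transferred)

def best_match(transferred, skull_db):
    """Pick the skull in the database whose cranial points give the
    smallest mean distance to the transferred facial crosses."""
    return min(skull_db,
               key=lambda sid: mean_cross_distance(transferred, skull_db[sid]))
```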

  18. Assigning unique identification numbers to new user accounts and groups in a computing environment with multiple registries

    DOEpatents

    DeRobertis, Christopher V.; Lu, Yantian T.

    2010-02-23

    A method, system, and program storage device for creating a new user account or user group with a unique identification number in a computing environment having multiple user registries is provided. In response to receiving a command to create a new user account or user group, an operating system of a clustered computing environment automatically checks multiple registries configured for the operating system to determine whether a candidate identification number for the new user account or user group has been assigned already to one or more existing user accounts or groups, respectively. The operating system automatically assigns the candidate identification number to the new user account or user group created in a target user registry if the checking indicates that the candidate identification number has not been assigned already to any of the existing user accounts or user groups, respectively.
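
The check-then-assign logic at the core of the patent might look like this in outline; it is a sketch of the idea only, not the clustered operating-system implementation, and the registry layout is hypothetical:

```python
def assign_unique_id(registries, create_in, name, start=1000):
    """registries: mapping registry_name -> {account_name: id} of existing
    accounts across all configured user registries. Scans candidate ids
    upward and accepts the first candidate that no account in ANY registry
    already uses, then creates the new account in the target registry."""
    used = {uid for reg in registries.values() for uid in reg.values()}
    candidate = start
    while candidate in used:          # candidate already assigned somewhere
        candidate += 1
    registries[create_in][name] = candidate
    return candidate
```

The same check applies to group ids; the point of the scheme is that the uniqueness test spans every registry, not just the one where the account is created.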

  19. Identification of Propionibacteria to the species level using Fourier transform infrared spectroscopy and artificial neural networks.

    PubMed

    Dziuba, B

    2013-01-01

    Fourier transform infrared spectroscopy (FTIR) and artificial neural networks (ANNs) were used to identify species of Propionibacteria strains. The aim of the study was to improve the methodology for identifying Propionibacteria at the species level, since the earlier approach, in which the differentiation index D (calculated from Pearson's correlation) and cluster analyses were used to describe the correlation between the Fourier transform infrared spectra and the bacteria as molecular systems, had brought unsatisfactory results. More advanced statistical methods of identification of the FTIR spectra, with application of artificial neural networks (ANNs), were therefore used. In this experiment, the FTIR spectra of Propionibacteria strains stored in the library were used to develop artificial neural networks for their identification. Several multilayer perceptrons (MLP) and probabilistic neural networks (PNN) were tested. The practical value of selected artificial neural networks was assessed based on identification results of spectra of 9 reference strains and 28 isolates. To verify the results of isolate identification, a PCR-based method with pairs of species-specific primers was used. The use of artificial neural networks in FTIR spectral analyses, as the most advanced chemometric method, supported correct identification of 93% of bacteria of the genus Propionibacterium to the species level.
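
For contrast, the Pearson-correlation library matching that the ANNs were introduced to improve upon can be sketched as below; the library contents are hypothetical:

```python
import math

def pearson_identify(spectrum, library):
    """Baseline spectral identification: assign the species whose reference
    FTIR spectrum has the highest Pearson correlation with the query
    spectrum. library: mapping species_name -> reference spectrum (list of
    absorbance values on a common wavenumber grid)."""
    def pearson(a, b):
        n = len(a)
        ma, mb = sum(a) / n, sum(b) / n
        cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
        return cov / math.sqrt(sum((x - ma) ** 2 for x in a)
                               * sum((y - mb) ** 2 for y in b))
    return max(library, key=lambda sp: pearson(spectrum, library[sp]))
```

An MLP or PNN replaces the single correlation score with a learned decision boundary over the whole spectrum, which is what raised the identification rate in the study.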

  20. The conclusive role of postmortem computed tomography (CT) of the skull and computer-assisted superimposition in identification of an unknown body.

    PubMed

    Lorkiewicz-Muszyńska, Dorota; Kociemba, Wojciech; Żaba, Czesław; Łabęcka, Marzena; Koralewska-Kordel, Małgorzata; Abreu-Głowacka, Monica; Przystańska, Agnieszka

    2013-05-01

    Computed tomography is commonly used in modern medicine, and thus, it is often helpful for medicolegal purposes, especially as part of the antemortem record. The application of postmortem computed tomography and 3D reconstruction of the skull in challenging cases is reported, and its valuable contribution to positive identification is discussed. This paper presents a case in which the body of an unknown individual is identified. Positive identification had not been possible despite a multidisciplinary examination. The postmortem use of computerized tomography and 3D reconstruction of the skull followed by the comparison of individual morphological characteristics of the viscerocranium showed the concordant points between the deceased and a missing person. Finally, superimposition using a 3D-reconstructed skull instead of the skeletonized skull demonstrated an adequate degree of morphological consistency in the facial images of the analyzed individuals that lead to positive identification. It was concluded that where other methods of personal identification had failed, the use of postmortem computed tomography had proved to be instrumental in the positive identification of the deceased.

  1. A conceptual framework of computations in mid-level vision

    PubMed Central

    Kubilius, Jonas; Wagemans, Johan; Op de Beeck, Hans P.

    2014-01-01

    If a picture is worth a thousand words, as an English idiom goes, what should those words—or, rather, descriptors—capture? What format of image representation would be sufficiently rich if we were to reconstruct the essence of images from their descriptors? In this paper, we set out to develop a conceptual framework that would be: (i) biologically plausible in order to provide a better mechanistic understanding of our visual system; (ii) sufficiently robust to apply in practice on realistic images; and (iii) able to tap into underlying structure of our visual world. We bring forward three key ideas. First, we argue that surface-based representations are constructed based on feature inference from the input in the intermediate processing layers of the visual system. Such representations are computed in a largely pre-semantic (prior to categorization) and pre-attentive manner using multiple cues (orientation, color, polarity, variation in orientation, and so on), and explicitly retain configural relations between features. The constructed surfaces may be partially overlapping to compensate for occlusions and are ordered in depth (figure-ground organization). Second, we propose that such intermediate representations could be formed by a hierarchical computation of similarity between features in local image patches and pooling of highly-similar units, and reestimated via recurrent loops according to the task demands. Finally, we suggest to use datasets composed of realistically rendered artificial objects and surfaces in order to better understand a model's behavior and its limitations. PMID:25566044

  2. A conceptual framework of computations in mid-level vision.

    PubMed

    Kubilius, Jonas; Wagemans, Johan; Op de Beeck, Hans P

    2014-01-01

    If a picture is worth a thousand words, as an English idiom goes, what should those words-or, rather, descriptors-capture? What format of image representation would be sufficiently rich if we were to reconstruct the essence of images from their descriptors? In this paper, we set out to develop a conceptual framework that would be: (i) biologically plausible in order to provide a better mechanistic understanding of our visual system; (ii) sufficiently robust to apply in practice on realistic images; and (iii) able to tap into underlying structure of our visual world. We bring forward three key ideas. First, we argue that surface-based representations are constructed based on feature inference from the input in the intermediate processing layers of the visual system. Such representations are computed in a largely pre-semantic (prior to categorization) and pre-attentive manner using multiple cues (orientation, color, polarity, variation in orientation, and so on), and explicitly retain configural relations between features. The constructed surfaces may be partially overlapping to compensate for occlusions and are ordered in depth (figure-ground organization). Second, we propose that such intermediate representations could be formed by a hierarchical computation of similarity between features in local image patches and pooling of highly-similar units, and reestimated via recurrent loops according to the task demands. Finally, we suggest to use datasets composed of realistically rendered artificial objects and surfaces in order to better understand a model's behavior and its limitations.

  3. Assessing Pre-Service Teachers' Computer Phobia Levels in Terms of Gender and Experience, Turkish Sample

    ERIC Educational Resources Information Center

    Ursavas, Omer Faruk; Karal, Hasan

    2009-01-01

    In this study it is aimed to determine the level of pre-service teachers' computer phobia. Whether or not computer phobia meaningfully varies statistically according to gender and computer experience has been tested in the study. The study was performed on 430 pre-service teachers at the Education Faculty in Rize/Turkey. Data in the study were…

  4. On parameters identification of computational models of vibrations during quiet standing of humans

    NASA Astrophysics Data System (ADS)

    Barauskas, R.; Krušinskienė, R.

    2007-12-01

    Vibration of the center of pressure (COP) of the human body on the base of support during quiet standing is a widely used clinical test, which provides useful information about the physical and health condition of an individual. In this work, vibrations of the COP of a human body in the forward-backward direction during still standing are generated using a controlled inverted pendulum (CIP) model with a single degree of freedom (dof), supplied with a proportional-integral-derivative (PID) controller, which represents the behavior of the central nervous system, and excited by a cumulative disturbance vibration generated within the body by breathing or other physiological processes. The identification of the model and disturbance parameters is an important stage in creating a close-to-reality computational model able to evaluate features of the disturbance. The aim of this study is to present a CIP model parameter identification approach based on the information captured by the time series of the COP signal. The identification procedure is based on an error function minimization. The error function is formulated in terms of the time laws of computed and experimentally measured COP vibrations. As an alternative, the error function is formulated in terms of the stabilogram diffusion function (SDF). The minimization of the error functions is carried out by employing methods based on sensitivity functions of the error with respect to model and excitation parameters. The sensitivity functions are obtained by using variational techniques. The inverse dynamic problem approach has been employed in order to establish the properties of the disturbance time laws ensuring satisfactory coincidence of the measured and computed COP vibration laws. The main difficulty of the investigated problem is encountered during the model validation stage. Generally, neither the PID controller parameter set nor the disturbance time law are known in advance. In this work, an error function
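
A minimal forward model of the kind being identified, i.e. a PID-controlled inverted pendulum integrated explicitly, can be sketched as follows; the anthropometric values, gains and linearized sway dynamics are illustrative assumptions, not the paper's identified parameters:

```python
def simulate_cip(kp, ki, kd, disturbance, dt=0.01, m=70.0, h=1.0, g=9.81):
    """Forward simulation of a single-dof controlled inverted pendulum with
    linearized sway dynamics
        m*h^2 * theta'' = m*g*h*theta - T_pid + d(t),
    where T_pid is a PID ankle torque acting on the sway angle theta and
    d(t) is the internal disturbance. Returns the angle trajectory; in an
    identification loop, the PID gains and the disturbance time law would
    be tuned until the computed COP matches the measured stabilogram."""
    theta, omega, integ, prev = 0.001, 0.0, 0.0, 0.001
    traj = []
    for d in disturbance:
        integ += theta * dt
        torque = kp * theta + ki * integ + kd * (theta - prev) / dt
        prev = theta
        alpha = (m * g * h * theta - torque + d) / (m * h * h)
        omega += alpha * dt            # explicit Euler integration
        theta += omega * dt
        traj.append(theta)
    return traj
```

With stabilizing gains (kp greater than m*g*h) and zero disturbance, the sway decays toward upright, mimicking quiet standing.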

  5. Multi-level adaptive computations in fluid dynamics

    NASA Technical Reports Server (NTRS)

    Brandt, A.

    1979-01-01

    The multi-level adaptive technique (MLAT) is a general strategy of solving continuous problems by cycling between coarser and finer levels of discretization. It provides very fast solvers together with adaptive, nearly optimal discretization schemes to general boundary-value problems in general domains. Here the state of the art is surveyed, emphasizing steady-state fluid dynamics applications, from slow viscous flows to transonic ones. Various new techniques are briefly discussed, including distributive relaxation schemes, the treatment of evolution problems, the combined use of upstream and central differencing, local truncation extrapolations, and other 'super-solver' techniques.
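
The coarse/fine cycling at the heart of MLAT can be illustrated with a basic two-level correction cycle for the 1D Poisson equation; this shows plain coarse-grid correction only, not MLAT's adaptive local refinement or the fluid-dynamics solvers surveyed:

```python
import numpy as np

def smooth(u, f, h, sweeps=3, w=2/3):
    """Weighted-Jacobi relaxation for u'' = f on a uniform grid with
    homogeneous Dirichlet boundary values (boundary entries untouched)."""
    for _ in range(sweeps):
        u[1:-1] = (1 - w) * u[1:-1] + w * 0.5 * (u[:-2] + u[2:] - h * h * f[1:-1])
    return u

def two_level_cycle(u, f, h):
    """One cycle between discretization levels: pre-smooth, restrict the
    residual to the grid of doubled spacing (full weighting), solve the
    coarse error equation directly, interpolate the correction back
    linearly, post-smooth. len(u) must be odd so the grids nest."""
    u = smooth(u, f, h)
    r = np.zeros_like(u)
    r[1:-1] = f[1:-1] - (u[:-2] - 2 * u[1:-1] + u[2:]) / h**2
    nc = (len(u) + 1) // 2
    rc = np.zeros(nc)
    rc[1:-1] = 0.25 * r[1:-2:2] + 0.5 * r[2:-1:2] + 0.25 * r[3::2]
    hc = 2 * h
    Ac = (np.diag(-2.0 * np.ones(nc - 2)) + np.diag(np.ones(nc - 3), 1)
          + np.diag(np.ones(nc - 3), -1)) / hc**2
    ec = np.zeros(nc)
    ec[1:-1] = np.linalg.solve(Ac, rc[1:-1])   # coarse error equation A*e = r
    e = np.interp(np.arange(len(u)), np.arange(0, len(u), 2), ec)
    return smooth(u + e, f, h)
```

Recursing on the coarse solve instead of solving it directly turns this into the usual multi-level V-cycle, whose cost stays linear in the number of unknowns.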

  6. Secondary School Students' Levels of Understanding in Computing Exponents

    ERIC Educational Resources Information Center

    Pitta-Pantazi, Demetra; Christou, Constantinos; Zachariades, Theodossios

    2007-01-01

    The aim of this study is to describe and analyze students' levels of understanding of exponents within the context of procedural and conceptual learning via the conceptual change and prototypes' theory. The study was conducted with 202 secondary school students with the use of a questionnaire and semi-structured interviews. The results suggest…

  7. Effects of portable computing devices on posture, muscle activation levels and efficiency.

    PubMed

    Werth, Abigail; Babski-Reeves, Kari

    2014-11-01

    Very little research exists on ergonomic exposures when using portable computing devices. This study quantified muscle activity (forearm and neck), posture (wrist, forearm and neck), and performance (gross typing speed and error rates) differences across three portable computing devices (laptop, netbook, and slate computer) and two work settings (desk and sofa) during data entry tasks. Twelve participants completed test sessions on a single computer using a test-rest-test protocol (30 min of work at one work setting, 15 min of rest, 30 min of work at the other work setting). The slate computer resulted in significantly more non-neutral wrist, elbow and neck postures, particularly when working on the sofa. Performance on the slate computer was roughly one quarter that on the other computers, though lower muscle activity levels were also found. Potential for injury or illness may be elevated when working on smaller, portable computers in non-traditional work settings.

  8. Identification of oxygen-related midgap level in GaAs

    NASA Technical Reports Server (NTRS)

    Lagowski, J.; Lin, D. G.; Gatos, H. C.; Aoyama, T.

    1984-01-01

    An oxygen-related deep level ELO was identified in GaAs employing Bridgman-grown crystals with controlled oxygen doping. The activation energy of ELO is almost the same as that of the dominant midgap level: EL2. This fact impedes the identification of ELO by standard deep level transient spectroscopy. However, it was found that the electron capture cross section of ELO is about four times greater than that of EL2. This characteristic served as the basis for the separation and quantitative investigation of ELO employing detailed capacitance transient measurements in conjunction with reference measurements on crystals grown without oxygen doping and containing only EL2.

  9. Fostering Girls' Computer Literacy through Laptop Learning: Can Mobile Computers Help To Level Out the Gender Difference?

    ERIC Educational Resources Information Center

    Schaumburg, Heike

    The goal of this study was to find out if the difference between boys and girls in computer literacy can be leveled out in a laptop program where each student has his/her own mobile computer to work with at home and at school. Ninth grade students (n=113) from laptop and non-laptop classes in a German high school were tested for their computer…

  10. 12 CFR 1750.13 - Risk-based capital level computation.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... paragraph (a)(2)(iii) of this section, whichever would require more capital in the stress test for the... level computation. (a) Risk-Based Capital Test—OFHEO shall compute a risk-based capital level for each Enterprise at least quarterly by applying the risk-based capital test described in appendix A to this...

  11. Analysis of Necessary Computer Skills for Entry-Level Office Workers.

    ERIC Educational Resources Information Center

    Arzy, Marsha R.

    The level of computer training needed by students seeking entry-level office jobs was the subject of a questionnaire mailed to 153 businesses in northern Wyoming. Usable responses were received from 117 businesses (for a response rate of 76.5 percent). Company size did not significantly affect computer use. Even though 51 percent of the firms…

  12. Computed Tomography (CT) Scanning Facilitates Early Identification of Neonatal Cystic Fibrosis Piglets

    PubMed Central

    Guillon, Antoine; Chevaleyre, Claire; Barc, Celine; Berri, Mustapha; Adriaensen, Hans; Lecompte, François; Villemagne, Thierry; Pezant, Jérémy; Delaunay, Rémi; Moënne-Loccoz, Joseph; Berthon, Patricia; Bähr, Andrea; Wolf, Eckhard; Klymiuk, Nikolai; Attucci, Sylvie; Ramphal, Reuben; Sarradin, Pierre; Buzoni-Gatel, Dominique; Si-Tahar, Mustapha; Caballero, Ignacio

    2015-01-01

    Background Cystic Fibrosis (CF) is the most prevalent autosomal recessive disease in the Caucasian population. A cystic fibrosis transmembrane conductance regulator knockout (CFTR-/-) pig that displays most of the features of the human CF disease has been recently developed. However, CFTR-/- pigs present a 100% prevalence of meconium ileus that leads to death in the first hours after birth, requiring a rapid diagnosis and surgical intervention to relieve intestinal obstruction. Identification of CFTR-/- piglets is usually performed by PCR genotyping, a procedure that takes 4 to 6 h. Here, we aimed to develop a procedure for rapid identification of CFTR-/- piglets that allows placing them under intensive care soon after birth and immediately proceeding with the surgical correction. Methods and Principal Findings Male and female CFTR+/- pigs were crossed and the progeny was examined by computed tomography (CT) scan to detect the presence of meconium ileus and facilitate a rapid post-natal surgical intervention. Genotype was confirmed by PCR. CT scanning showed 94.4% sensitivity for diagnosing CFTR-/- piglets. Diagnosis by CT scan reduced the birth-to-surgery time from a minimum of 10 h down to a minimum of 2.5 h and increased the survival of CFTR-/- piglets to a maximum of 13 days post-surgery, as opposed to just 66 h with later surgery. Conclusion CT scan imaging of meconium ileus is an accurate method for rapid identification of CFTR-/- piglets. Early CT detection of meconium ileus may help to extend the lifespan of CFTR-/- piglets and, thus, improve experimental research on CF, still an incurable disease. PMID:26600426

  13. Single-board computer based control system for a portable Raman device with integrated chemical identification

    NASA Astrophysics Data System (ADS)

    Mobley, Joel; Cullum, Brian M.; Wintenberg, Alan L.; Shane Frank, S.; Maples, Robert A.; Stokes, David L.; Vo-Dinh, Tuan

    2004-06-01

    We report the development of a battery-powered portable chemical identification device for field use consisting of an acousto-optic tunable filter (AOTF)-based Raman spectrometer with integrated data processing and analysis software. The various components and custom circuitry are integrated into a self-contained instrument by control software that runs on an embedded single-board computer (SBC), which communicates with the various instrument modules through a 48-line bidirectional TTL bus. The user interacts with the instrument via a touch-sensitive liquid crystal display unit (LCD) that provides soft buttons for user control as well as visual feedback (e.g., spectral plots, stored data, instrument settings, etc.) from the instrument. The control software manages all operational aspects of the instrument with the exception of the power management module that is run by embedded firmware. The SBC-based software includes both automated and manual library searching capabilities, permitting rapid identification of samples in the field. The use of the SBC in tandem with the LCD touchscreen for interfacing and control provides the instrument with a great deal of flexibility as its function can be customized to specific users or tasks via software modifications alone. The instrument, as currently configured, can be operated as a research-grade Raman spectrometer for scientific applications and as a "black-box" chemical identification system for field use. The instrument can acquire 198-point spectra over a spectral range of 238-1620 cm-1, perform a library search, and display the results in less than 14 s. The operating modes of the instrument are demonstrated illustrating the utility and flexibility afforded the system by the SBC-LCD control module.
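
A library search of the kind the instrument performs can be sketched as normalized spectral correlation over the 238-1620 cm-1, 198-point axis the abstract gives. The compound names and band positions below are hypothetical illustrations, not the device's actual library or matching algorithm:

```python
import numpy as np

def best_match(measured, library):
    # score the measured spectrum against each library entry by
    # Pearson correlation of the z-scored spectra
    m = (measured - measured.mean()) / measured.std()
    scores = {}
    for name, spec in library.items():
        s = (spec - spec.mean()) / spec.std()
        scores[name] = float(np.dot(m, s) / m.size)
    return max(scores, key=scores.get), scores

shift = np.linspace(238.0, 1620.0, 198)     # the instrument's 198-point axis

def peak(center, width=20.0):
    # synthetic Gaussian Raman band (illustrative)
    return np.exp(-((shift - center) / width) ** 2)

library = {"ethanol": peak(880) + 0.6 * peak(1450),
           "toluene": peak(1000) + 0.4 * peak(780)}
rng = np.random.default_rng(1)
measured = peak(1000) + 0.4 * peak(780) + 0.05 * rng.standard_normal(198)
name, scores = best_match(measured, library)
print(name)  # -> toluene
```

On the real instrument this comparison runs on the SBC against a stored library, which is what allows the under-14-second acquire/search/display cycle described above.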

  14. Computational methods for the identification of spatially varying stiffness and damping in beams

    NASA Technical Reports Server (NTRS)

    Banks, H. T.; Rosen, I. G.

    1986-01-01

    A numerical approximation scheme for the estimation of functional parameters in Euler-Bernoulli models for the transverse vibration of flexible beams with tip bodies is developed. The method permits the identification of spatially varying flexural stiffness and Voigt-Kelvin viscoelastic damping coefficients which appear in the hybrid system of ordinary and partial differential equations and boundary conditions describing the dynamics of such structures. An inverse problem is formulated as a least squares fit to data subject to constraints in the form of a vector system of abstract first order evolution equations. Spline-based finite element approximations are used to finite dimensionalize the problem. Theoretical convergence results are given and numerical studies carried out on both conventional (serial) and vector computers are discussed.

  15. Supervised neural computing solutions for fluorescence identification of benzimidazole fungicides. Data and decision fusion strategies.

    PubMed

    Suárez-Araujo, Carmen Paz; García Báez, Patricio; Sánchez Rodríguez, Álvaro; Santana-Rodríguez, José Juan

    2016-12-01

    Benzimidazole fungicides (BFs) are a type of pesticide of high environmental interest characterized by heavy fluorescence spectral overlap, which complicates their detection in mixtures. In this paper, we present a computational study based on supervised neural networks for a multi-label classification problem. Specifically, backpropagation networks (BPNs) with data fusion and ensemble schemes are used for the simultaneous resolution of difficult multi-fungicide mixtures. We designed, optimized and compared simple BPNs, BPNs with data fusion and BPN ensembles. The information environment used is made up of synchronous and conventional BF fluorescence spectra. The mixture spectra are used in neither the training nor the validation stage. This study allows us to determine the benefit of fusing the labels of carbendazim and benomyl for the identification of BFs in complex multi-fungicide mixtures.

  16. VISPA: a computational pipeline for the identification and analysis of genomic vector integration sites.

    PubMed

    Calabria, Andrea; Leo, Simone; Benedicenti, Fabrizio; Cesana, Daniela; Spinozzi, Giulio; Orsini, Massimiliano; Merella, Stefania; Stupka, Elia; Zanetti, Gianluigi; Montini, Eugenio

    2014-01-01

    The analysis of the genomic distribution of viral vector integration sites is a key step in hematopoietic stem cell-based gene therapy applications, allowing researchers to assess both the safety and the efficacy of the treatment and to study basic aspects of hematopoiesis and stem cell biology. Identifying vector integration sites requires ad hoc bioinformatics tools with stringent requirements in terms of computational efficiency, flexibility, and usability. We developed VISPA (Vector Integration Site Parallel Analysis), a pipeline for automated integration site identification and annotation based on a distributed environment with a simple Galaxy web interface. VISPA was successfully used for the bioinformatics analysis of the follow-up of two lentiviral vector-based hematopoietic stem-cell gene therapy clinical trials. Our pipeline provides a reliable and efficient tool to assess the safety and efficacy of integrating vectors in clinical settings.

  17. Computer game as a tool for training the identification of phonemic length.

    PubMed

    Pennala, Riitta; Richardson, Ulla; Ylinen, Sari; Lyytinen, Heikki; Martin, Maisa

    2014-12-01

    Computer-assisted training of Finnish phonemic length was conducted with 7-year-old Russian-speaking second-language learners of Finnish. Phonemic length plays a different role in these two languages. The training included game activities with two- and three-syllable word and pseudo-word minimal pairs with prototypical vowel durations. The lowest accuracy scores were recorded for two-syllable words. Accuracy scores were higher for the minimal pairs with larger rather than smaller differences in duration. Accuracy scores were lower for long duration than for short duration. The ability to identify quantity degree was generalized to stimuli used in the identification test in two of the children. Ideas for improving the game are introduced.

  18. Identification of intestinal wall abnormalities and ischemia by modeling spatial uncertainty in computed tomography imaging findings.

    PubMed

    Tsunoyama, Taichiro; Pham, Tuan D; Fujita, Takashi; Sakamoto, Tetsuya

    2014-10-01

    Intestinal abnormalities and ischemia are medical conditions in which inflammation and injury of the intestine are caused by inadequate blood supply. Acute ischemia of the small bowel can be life-threatening. Computed tomography (CT) is currently a gold standard for the diagnosis of acute intestinal ischemia in the emergency department. However, the assessment of the diagnostic performance of CT findings in the detection of intestinal abnormalities and ischemia has been a difficult task for both radiologists and surgeons. Little effort has been found in developing computerized systems for the automated identification of these types of complex gastrointestinal disorders. In this paper, a geostatistical mapping of spatial uncertainty in CT scans is introduced for medical image feature extraction, which can be effectively applied for diagnostic detection of intestinal abnormalities and ischemia from control patterns. Experimental results obtained from the analysis of clinical data suggest the usefulness of the proposed uncertainty mapping model.

  19. NLSCIDNT user's guide maximum likelihood parameter identification computer program with nonlinear rotorcraft model

    NASA Technical Reports Server (NTRS)

    1979-01-01

    A nonlinear, maximum likelihood, parameter identification computer program (NLSCIDNT) is described which evaluates rotorcraft stability and control coefficients from flight test data. The optimal estimates of the parameters (stability and control coefficients) are determined (identified) by minimizing the negative log likelihood cost function. The minimization technique is the Levenberg-Marquardt method, which behaves like the steepest descent method when it is far from the minimum and behaves like the modified Newton-Raphson method when it is nearer the minimum. Twenty-one states and 40 measurement variables are modeled, and any subset may be selected. States which are not integrated may be fixed at an input value, or time history data may be substituted for the state in the equations of motion. Any aerodynamic coefficient may be expressed as a nonlinear polynomial function of selected 'expansion variables'.
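
The Levenberg-Marquardt behavior the abstract describes, steepest-descent-like far from the minimum and Newton-like near it, can be sketched on a toy exponential model rather than the rotorcraft equations. Under Gaussian measurement noise, minimizing the negative log-likelihood reduces to the nonlinear least-squares iteration below (all model choices here are illustrative assumptions):

```python
import numpy as np

def levenberg_marquardt(residual, jacobian, p, iters=50):
    # the damping term lam blends steepest descent (large lam)
    # with Gauss-Newton (small lam)
    lam = 1e-3
    for _ in range(iters):
        r, J = residual(p), jacobian(p)
        A = J.T @ J + lam * np.eye(p.size)
        step = np.linalg.solve(A, J.T @ r)
        p_new = p - step
        if np.sum(residual(p_new)**2) < np.sum(r**2):
            p, lam = p_new, lam * 0.5   # accept: move toward Gauss-Newton
        else:
            lam *= 2.0                  # reject: move toward steepest descent
    return p

# toy data: y = a * exp(-b * t) with true parameters a=2, b=0.5
t = np.linspace(0.0, 5.0, 40)
y = 2.0 * np.exp(-0.5 * t)
residual = lambda p: p[0] * np.exp(-p[1] * t) - y
jacobian = lambda p: np.column_stack([np.exp(-p[1] * t),
                                      -p[0] * t * np.exp(-p[1] * t)])
p = levenberg_marquardt(residual, jacobian, np.array([1.0, 1.0]))
print(p)  # close to [2.0, 0.5]
```

NLSCIDNT applies the same accept/reject damping logic, but to a 21-state rotorcraft model with up to 40 measurement variables.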

  20. Features preferred in identification system based on computer mouse movements

    NASA Astrophysics Data System (ADS)

    Jašek, Roman; Kolařík, Martin

    2016-06-01

    Biometric identification systems build features that describe people from various data. Usually it is not known in advance which features should be chosen, and a dedicated process called feature selection is applied to resolve this uncertainty. The relevant features selected by this process then describe the people in the system as well as possible. It is likely that the relevance of the selected features also means that these features describe the important aspects of the measured behavior and partly reveal how the behavior works. This paper uses this idea and evaluates the results of many feature-selection runs in a system that identifies people by their computer mouse movements. It has been found that the most frequently selected features repeat between runs, and that these features describe similar aspects of the movements.
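
The counting idea behind the paper can be sketched directly. The feature names below are invented for illustration; the sketch tallies how often each feature survives repeated selection runs and keeps those above a frequency threshold:

```python
from collections import Counter

# hypothetical outcomes of four feature-selection runs over mouse-movement features
runs = [["speed_mean", "pause_len", "curvature"],
        ["speed_mean", "curvature", "click_gap"],
        ["speed_mean", "pause_len", "curvature"],
        ["pause_len", "speed_mean", "jerk"]]

# count how many runs selected each feature
counts = Counter(f for run in runs for f in run)

# keep features selected in at least 75% of runs as the "stable" set
stable = [f for f, c in counts.most_common() if c >= 0.75 * len(runs)]
print(stable)
```

Features that recur across independent runs, as `stable` does here, are the ones the paper interprets as describing the important aspects of the behavior.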

  1. A computer program for linear nonparametric and parametric identification of biological data.

    PubMed

    Werness, S A; Anderson, D J

    1984-01-01

    A computer program package for parametric and nonparametric linear system identification of both static and dynamic biological data, written for an LSI-11 minicomputer with 28 K of memory, is described. The program has 11 possible commands including an instructional help command. A user can perform nonparametric spectral analysis and estimation of autocorrelation and partial autocorrelation functions of univariate data and estimate nonparametrically the transfer function and possibly an associated noise series of bivariate data. In addition, the commands provide the user with the means to derive a parametric autoregressive moving average model for univariate data, to derive a parametric transfer function and noise model for bivariate data, and to perform several model evaluation tests such as pole-zero cancellation and examination of residual whiteness and uncorrelatedness with the input. The program, consisting of a main program and driver subroutine as well as six overlay segments, may be run interactively or automatically.
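
The package's parametric autoregressive modeling can be illustrated with a numpy-only Yule-Walker fit. This is a sketch of the general technique on simulated data, not the LSI-11 program's full ARMA and transfer-function implementation:

```python
import numpy as np

def yule_walker(x, order):
    # solve the Yule-Walker equations R a = r for the AR coefficients
    x = x - x.mean()
    n = len(x)
    # biased autocovariance estimates at lags 0..order
    acov = np.array([np.dot(x[:n - k], x[k:]) / n for k in range(order + 1)])
    R = np.array([[acov[abs(i - j)] for j in range(order)] for i in range(order)])
    return np.linalg.solve(R, acov[1:order + 1])

# simulate an AR(2) process: x[t] = 0.6 x[t-1] - 0.3 x[t-2] + noise
rng = np.random.default_rng(2)
x = np.zeros(20000)
for t in range(2, x.size):
    x[t] = 0.6 * x[t - 1] - 0.3 * x[t - 2] + rng.standard_normal()

coef = yule_walker(x, 2)
print(coef)  # close to [0.6, -0.3]
```

The residual-whiteness and pole-zero checks the abstract mentions would then be run on `x` minus the fitted model's predictions.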

  2. A hyperspectral X-ray computed tomography system for enhanced material identification.

    PubMed

    Wu, Xiaomei; Wang, Qian; Ma, Jinlei; Zhang, Wei; Li, Po; Fang, Zheng

    2017-08-01

    X-ray computed tomography (CT) can distinguish different materials according to their absorption characteristics. The hyperspectral X-ray CT (HXCT) system proposed in the present work reconstructs each voxel according to its X-ray absorption spectral characteristics. In contrast to a dual-energy or multi-energy CT system, HXCT employs cadmium telluride (CdTe) as the X-ray detector, which provides higher spectral resolution and separates spectral lines according to its photon-counting working principle. In this paper, a specimen containing ten different polymer materials randomly arranged was adopted for material identification by HXCT. The filtered back-projection algorithm was applied for image and spectral reconstruction. The first step was to sort the individual material components of the specimen according to their cross-sectional image intensity. The second step was to classify materials with similar intensities according to their reconstructed spectral characteristics. The results demonstrated the feasibility of the proposed material identification process and indicated that the proposed HXCT system has good prospects for a wide range of biomedical and industrial nondestructive testing applications.

  3. An integrated transcriptomic and computational analysis for biomarker identification in gastric cancer

    PubMed Central

    Cui, Juan; Chen, Yunbo; Chou, Wen-Chi; Sun, Liankun; Chen, Li; Suo, Jian; Ni, Zhaohui; Zhang, Ming; Kong, Xiaoxia; Hoffman, Lisabeth L.; Kang, Jinsong; Su, Yingying; Olman, Victor; Johnson, Darryl; Tench, Daniel W.; Amster, I. Jonathan; Orlando, Ron; Puett, David; Li, Fan; Xu, Ying

    2011-01-01

    This report describes an integrated study on identification of potential markers for gastric cancer in patients’ cancer tissues and sera based on: (i) genome-scale transcriptomic analyses of 80 paired gastric cancer/reference tissues and (ii) computational prediction of blood-secretory proteins supported by experimental validation. Our findings show that: (i) 715 and 150 genes exhibit significantly differential expressions in all cancers and early-stage cancers versus reference tissues, respectively; and a substantial percentage of the alteration is found to be influenced by age and/or by gender; (ii) 21 co-expressed gene clusters have been identified, some of which are specific to certain subtypes or stages of the cancer; (iii) the top-ranked gene signatures give better than 94% classification accuracy between cancer and the reference tissues, some of which are gender-specific; and (iv) 136 of the differentially expressed genes were predicted to have their proteins secreted into blood, 81 of which were detected experimentally in the sera of 13 validation samples and 29 found to have differential abundances in the sera of cancer patients versus controls. Overall, the novel information obtained in this study has led to identification of promising diagnostic markers for gastric cancer and can benefit further analyses of the key (early) abnormalities during its development. PMID:20965966

  4. A hyperspectral X-ray computed tomography system for enhanced material identification

    NASA Astrophysics Data System (ADS)

    Wu, Xiaomei; Wang, Qian; Ma, Jinlei; Zhang, Wei; Li, Po; Fang, Zheng

    2017-08-01

    X-ray computed tomography (CT) can distinguish different materials according to their absorption characteristics. The hyperspectral X-ray CT (HXCT) system proposed in the present work reconstructs each voxel according to its X-ray absorption spectral characteristics. In contrast to a dual-energy or multi-energy CT system, HXCT employs cadmium telluride (CdTe) as the X-ray detector, which provides higher spectral resolution and separates spectral lines according to its photon-counting working principle. In this paper, a specimen containing ten different polymer materials randomly arranged was adopted for material identification by HXCT. The filtered back-projection algorithm was applied for image and spectral reconstruction. The first step was to sort the individual material components of the specimen according to their cross-sectional image intensity. The second step was to classify materials with similar intensities according to their reconstructed spectral characteristics. The results demonstrated the feasibility of the proposed material identification process and indicated that the proposed HXCT system has good prospects for a wide range of biomedical and industrial nondestructive testing applications.

  5. Computer Literacy in Pennsylvania Community Colleges. Competencies in a Beginning Level College Computer Literacy Course.

    ERIC Educational Resources Information Center

    Tortorelli, Ann Eichorn

    A study was conducted at the 14 community colleges (17 campuses) in Pennsylvania to assess the perceptions of faculty about the relative importance of course content items in a beginning credit course in computer literacy, and to survey courses currently being offered. A detailed questionnaire consisting of 96 questions based on MECC (Minnesota…

  6. High level waste storage tanks 242-A evaporator standards/requirement identification document

    SciTech Connect

    Biebesheimer, E.

    1996-01-01

    This document, the Standards/Requirements Identification Document (S/RID) for the subject facility, represents the necessary and sufficient requirements to provide an adequate level of protection of the worker, public health and safety, and the environment. It lists those source documents from which requirements were extracted, and those documents considered but from which no requirements were taken. Documents considered as source documents included State and Federal Regulations, DOE Orders, and DOE Standards.

  7. Estimating groundwater levels using system identification models in Nzhelele and Luvuvhu areas, Limpopo Province, South Africa

    NASA Astrophysics Data System (ADS)

    Makungo, Rachel; Odiyo, John O.

    2017-08-01

    This study tested the ability of a coupled linear and non-linear system identification model to estimate groundwater levels. System identification provides an alternative approach for estimating groundwater levels in areas that lack the data required by physically-based models, and it avoids the approximations, assumptions and simplifications those models entail. Daily groundwater levels for 4 boreholes, rainfall and evaporation data covering the period 2005-2014 were used in the study. Seventy and thirty percent of the data were used to calibrate and validate the model, respectively. Correlation coefficient (R), coefficient of determination (R2), root mean square error (RMSE), percent bias (PBIAS), Nash-Sutcliffe coefficient of efficiency (NSE) and graphical fits were used to evaluate model performance. Values for R, R2, RMSE, PBIAS and NSE ranged from 0.8 to 0.99, 0.63 to 0.99, 0.01 to 2.06 m, -7.18 to 1.16 and 0.68 to 0.99, respectively. Comparisons of observed and simulated groundwater levels for calibration and validation runs showed close agreement, and model performance ranged from satisfactory to excellent. The calibrated models reasonably capture the relationship between input and output variables and can thus be used to estimate long-term groundwater levels.
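
The evaluation statistics the abstract lists (R, R2, RMSE, PBIAS, NSE) have standard definitions; the sketch below writes them out explicitly over a synthetic observed/simulated pair (the data are invented for illustration, and the PBIAS sign convention, positive for underestimation, is one common choice):

```python
import numpy as np

def fit_metrics(obs, sim):
    # standard goodness-of-fit statistics for hydrological model evaluation
    resid = obs - sim
    r = np.corrcoef(obs, sim)[0, 1]                      # correlation coefficient
    rmse = np.sqrt(np.mean(resid**2))                    # root mean square error
    pbias = 100.0 * resid.sum() / obs.sum()              # percent bias
    nse = 1.0 - np.sum(resid**2) / np.sum((obs - obs.mean())**2)  # Nash-Sutcliffe
    return {"R": r, "R2": r**2, "RMSE": rmse, "PBIAS": pbias, "NSE": nse}

# synthetic groundwater-level series: simulated tracks observed up to noise
rng = np.random.default_rng(3)
obs = 10.0 + np.sin(np.linspace(0, 6, 200)) + 0.1 * rng.standard_normal(200)
sim = 10.0 + np.sin(np.linspace(0, 6, 200))
m = fit_metrics(obs, sim)
print(m)
```

NSE near 1 and PBIAS near 0, as here, correspond to the "very good"/"excellent" end of the performance ratings the study reports.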

  8. Identify Skills and Proficiency Levels Necessary for Entry-Level Employment for All Vocational Programs Using Computers to Process Data. Final Report.

    ERIC Educational Resources Information Center

    Crowe, Jacquelyn

    This study investigated computer and word processing operator skills necessary for employment in today's high technology office. The study was comprised of seven major phases: (1) identification of existing community college computer operator programs in the state of Washington; (2) attendance at an information management seminar; (3) production…

  10. Bladed-shrouded-disc aeroelastic analyses: Computer program updates in NASTRAN level 17.7

    NASA Technical Reports Server (NTRS)

    Gallo, A. M.; Elchuri, V.; Skalski, S. C.

    1981-01-01

    In October 1979, a computer program based on state-of-the-art compressor and structural technologies applied to bladed-shrouded discs was developed. The program was made operational in NASTRAN Level 16. The bladed disc computer program was updated for operation in NASTRAN Level 17.7. The supersonic cascade unsteady aerodynamics routine UCAS, delivered as part of the NASTRAN Level 16 program, was recoded to improve its execution time. These improvements are presented.

  11. A computational model for the identification of biochemical pathways in the krebs cycle.

    PubMed

    Oliveira, Joseph S; Bailey, Colin G; Jones-Oliveira, Janet B; Dixon, David A; Gull, Dean W; Chandler, Mary L

    2003-01-01

    We have applied an algorithmic methodology which provably decomposes any complex network into a complete family of principal subcircuits to study the minimal circuits that describe the Krebs cycle. Every operational behavior that the network is capable of exhibiting can be represented by some combination of these principal subcircuits and this computational decomposition is linearly efficient. We have developed a computational model that can be applied to biochemical reaction systems which accurately renders pathways of such reactions via directed hypergraphs (Petri nets). We have applied the model to the citric acid cycle (Krebs cycle). The Krebs cycle, which oxidizes the acetyl group of acetyl CoA to CO(2) and reduces NAD and FAD to NADH and FADH(2), is a complex interacting set of nine subreaction networks. The Krebs cycle was selected because of its familiarity to the biological community and because it exhibits enough complexity to be interesting in order to introduce this novel analytic approach. This study validates the algorithmic methodology for the identification of significant biochemical signaling subcircuits, based solely upon the mathematical model and not upon prior biological knowledge. The utility of the algebraic-combinatorial model for identifying the complete set of biochemical subcircuits as a data set is demonstrated for this important metabolic process.
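
The directed-hypergraph (Petri net) representation of reactions can be sketched minimally: each transition consumes tokens from its input places and produces tokens in its output places. The places, transitions, and stoichiometries below are invented for illustration, not the paper's Krebs-cycle model:

```python
# each transition maps to (inputs, outputs) with stoichiometric coefficients
transitions = {
    "t1": ({"A": 1}, {"B": 1}),          # reaction A -> B
    "t2": ({"B": 1, "C": 1}, {"D": 1}),  # reaction B + C -> D
}
marking = {"A": 1, "B": 0, "C": 1, "D": 0}   # initial token counts per place

def enabled(t, m):
    # a transition may fire only if every input place holds enough tokens
    pre, _ = transitions[t]
    return all(m[p] >= n for p, n in pre.items())

def fire(t, m):
    # consume input tokens, produce output tokens; returns a new marking
    pre, post = transitions[t]
    m = dict(m)
    for p, n in pre.items():
        m[p] -= n
    for p, n in post.items():
        m[p] = m.get(p, 0) + n
    return m

m = fire("t1", marking)
assert enabled("t2", m)
m = fire("t2", m)
print(m)  # {'A': 0, 'B': 0, 'C': 0, 'D': 1}
```

The paper's contribution is the decomposition of such a net into its complete family of principal subcircuits; this sketch only shows the underlying firing semantics on which that analysis operates.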

  12. A Computational Model for the Identification of Biochemical Pathways in the Krebs Cycle

    SciTech Connect

    Oliveira, Joseph S.; Bailey, Colin G.; Jones-Oliveira, Janet B.; Dixon, David A.; Gull, Dean W.; Chandler, Mary L.

    2003-03-01

    We have applied an algorithmic methodology which provably decomposes any complex network into a complete family of principal subcircuits to study the minimal circuits that describe the Krebs cycle. Every operational behavior that the network is capable of exhibiting can be represented by some combination of these principal subcircuits and this computational decomposition is linearly efficient. We have developed a computational model that can be applied to biochemical reaction systems which accurately renders pathways of such reactions via directed hypergraphs (Petri nets). We have applied the model to the citric acid cycle (Krebs cycle). The Krebs cycle, which oxidizes the acetyl group of acetyl CoA to CO2 and reduces NAD and FAD to NADH and FADH2 is a complex interacting set of nine subreaction networks. The Krebs cycle was selected because of its familiarity to the biological community and because it exhibits enough complexity to be interesting in order to introduce this novel analytic approach. This study validates the algorithmic methodology for the identification of significant biochemical signaling subcircuits, based solely upon the mathematical model and not upon prior biological knowledge. The utility of the algebraic-combinatorial model for identifying the complete set of biochemical subcircuits as a data set is demonstrated for this important metabolic process.

  13. CaPSID: A bioinformatics platform for computational pathogen sequence identification in human genomes and transcriptomes

    PubMed Central

    2012-01-01

    Background It is now well established that nearly 20% of human cancers are caused by infectious agents, and the list of human oncogenic pathogens will grow in the future for a variety of cancer types. Whole tumor transcriptome and genome sequencing by next-generation sequencing technologies presents an unparalleled opportunity for pathogen detection and discovery in human tissues but requires development of new genome-wide bioinformatics tools. Results Here we present CaPSID (Computational Pathogen Sequence IDentification), a comprehensive bioinformatics platform for identifying, querying and visualizing both exogenous and endogenous pathogen nucleotide sequences in tumor genomes and transcriptomes. CaPSID includes a scalable, high performance database for data storage and a web application that integrates the genome browser JBrowse. CaPSID also provides useful metrics for sequence analysis of pre-aligned BAM files, such as gene and genome coverage, and is optimized to run efficiently on multiprocessor computers with low memory usage. Conclusions To demonstrate the usefulness and efficiency of CaPSID, we carried out a comprehensive analysis of both a simulated dataset and transcriptome samples from ovarian cancer. CaPSID correctly identified all of the human and pathogen sequences in the simulated dataset, while in the ovarian dataset CaPSID’s predictions were successfully validated in vitro. PMID:22901030

  14. Identification of rounded atelectasis in workers exposed to asbestos by contrast helical computed tomography.

    PubMed

    Terra-Filho, M; Kavakama, J; Bagatin, E; Capelozzi, V L; Nery, L E; Tavares, R

    2003-10-01

    Rounded atelectasis (RA) is a benign and unusual form of subpleural lung collapse that has been described mostly in asbestos-exposed workers. This form of atelectasis manifests as a lung nodule and can be confused with bronchogenic carcinoma on conventional radiologic examination. The objective of the present study was to evaluate the variation in contrast uptake on computed tomography for the identification of asbestos-related RA in Brazil. Between January 1998 and December 2000, high-resolution computed tomography (HRCT) was performed in 1658 asbestos-exposed workers. The diagnosis was made in nine patients based on a history of prior asbestos exposure, the presence of characteristic HRCT findings, and lesions unchanged in size over 2 years or more. In three of them the diagnosis was confirmed during surgery. The dynamic contrast enhancement study was modified to evaluate nodules and pulmonary masses. All nine patients with RA received iodide contrast according to weight. The average attenuation, reported in Hounsfield units (HU), increased from 62.5+/-9.7 before contrast infusion to 125.4+/-20.7 after (P < 0.05), a mean enhancement of 62.5+/-19.7 (range 40 to 89) with uniform dense opacification. In conclusion, all patients with RA in this study showed contrast enhancement with uniform dense opacification. The main clinical implication of this finding is that this procedure does not permit differentiation between RA and malignant pulmonary neoplasm.

  15. Crop species identification using machine vision of computer extracted individual leaves

    NASA Astrophysics Data System (ADS)

    Camargo Neto, João; Meyer, George E.

    2005-11-01

An unsupervised method for plant species identification was developed which uses computer extracted individual whole leaves from color images of crop canopies. Green canopies were isolated from soil/residue backgrounds using a modified Excess Green and Excess Red separation method. Connected components of isolated green regions of interest were changed into pixel fragments using the Gustafson-Kessel fuzzy clustering method. The fragments were reassembled as individual leaves using a genetic optimization algorithm and a fitness method. Pixels of whole leaves were then analyzed using the elliptic Fourier shape and Haralick's classical textural feature analyses. A binary template was constructed to represent each selected leaf region of interest. Elliptic Fourier descriptors were generated from a chain encoding of the leaf boundary. Leaf template orientation was corrected by rotating each extracted leaf to a standard horizontal position. This was done using information provided from the first harmonic set of coefficients. Textural features were computed from the grayscale co-occurrence matrix of the leaf pixel set. Standardized leaf orientation significantly improved the leaf textural venation results. Principal component analysis from SAS (R) was used to select the best Fourier descriptors and textural indices. Indices of local homogeneity and entropy were found to contribute to improved classification rates. A SAS classification model was developed and correctly classified 83% of redroot pigweed, 100% of sunflower, 83% of soybean, and 73% of velvetleaf species. An overall plant species correct classification rate of 86% was attained.
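
The Excess Green / Excess Red separation step can be sketched with the commonly used index definitions ExG = 2g - r - b and ExR = 1.4r - g over normalized chromatic coordinates; the paper's specific modification is not reproduced here, so treat the formulation and threshold below as a generic assumption:

```python
import numpy as np

def vegetation_mask(rgb):
    """Boolean mask of green-canopy pixels using ExG - ExR > 0.
    rgb: float array of shape (H, W, 3) with values in [0, 255]."""
    total = rgb.sum(axis=2)
    total[total == 0] = 1.0            # avoid division by zero on black pixels
    r, g, b = (rgb[..., i] / total for i in range(3))
    exg = 2.0 * g - r - b              # Excess Green index
    exr = 1.4 * r - g                  # Excess Red index
    return (exg - exr) > 0.0

# One green (canopy-like) and one reddish (soil-like) pixel.
img = np.array([[[40, 120, 30], [120, 40, 30]]], dtype=float)
print(vegetation_mask(img))  # [[ True False]]
```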

  16. Computer experiments on periodic systems identification using rotor blade transient flapping-torsion responses at high advance ratio

    NASA Technical Reports Server (NTRS)

    Hohenemser, K. H.; Prelewicz, D. A.

    1974-01-01

Systems identification methods have recently been applied to rotorcraft to estimate stability derivatives from transient flight control response data. While these applications assumed a linear constant coefficient representation of the rotorcraft, the computer experiments described in this paper used transient responses in flap-bending and torsion of a rotor blade at high advance ratio, which represents a rapidly time-varying periodic system.

  17. Multivariate Effects of Level of Education, Computer Ownership, and Computer Use on Female Students' Attitudes towards CALL

    ERIC Educational Resources Information Center

    Rahimi, Mehrak; Yadollahi, Samaneh

    2012-01-01

The aim of this study was to investigate Iranian female students' attitudes towards CALL and their relationship with level of education, computer ownership, and frequency of use. One hundred and forty-two female students (50 junior high-school students, 49 high-school students and 43 university students) participated in this study. They filled…

  18. Ubiquitin Ligase Substrate Identification through Quantitative Proteomics at Both the Protein and Peptide Levels

    PubMed Central

    Lee, Kimberly A.; Hammerle, Lisa P.; Andrews, Paul S.; Stokes, Matthew P.; Mustelin, Tomas; Silva, Jeffrey C.; Black, Roy A.; Doedens, John R.

    2011-01-01

    Protein ubiquitination is a key regulatory process essential to life at a cellular level; significant efforts have been made to identify ubiquitinated proteins through proteomics studies, but the level of success has not reached that of heavily studied post-translational modifications, such as phosphorylation. HRD1, an E3 ubiquitin ligase, has been implicated in rheumatoid arthritis, but no disease-relevant substrates have been identified. To identify these substrates, we have taken both peptide and protein level approaches to enrich for ubiquitinated proteins in the presence and absence of HRD1. At the protein level, a two-step strategy was taken using cells expressing His6-tagged ubiquitin, enriching proteins first based on their ubiquitination and second based on the His tag with protein identification by LC-MS/MS. Application of this method resulted in identification and quantification of more than 400 ubiquitinated proteins, a fraction of which were found to be sensitive to HRD1 and were therefore deemed candidate substrates. In a second approach, ubiquitinated peptides were enriched after tryptic digestion by peptide immunoprecipitation using an antibody specific for the diglycine-labeled internal lysine residue indicative of protein ubiquitination, with peptides and ubiquitination sites identified by LC-MS/MS. Peptide immunoprecipitation resulted in identification of over 1800 ubiquitinated peptides on over 900 proteins in each study, with several proteins emerging as sensitive to HRD1 levels. Notably, significant overlap exists between the HRD1 substrates identified by the protein-based and the peptide-based strategies, with clear cross-validation apparent both qualitatively and quantitatively, demonstrating the effectiveness of both strategies and furthering our understanding of HRD1 biology. PMID:21987572

  19. NEW Fe I LEVEL ENERGIES AND LINE IDENTIFICATIONS FROM STELLAR SPECTRA

    SciTech Connect

    Peterson, Ruth C.; Kurucz, Robert L.

    2015-01-01

    The spectrum of the Fe I atom is critical to many areas of astrophysics and beyond. Measurements of the energies of its high-lying levels remain woefully incomplete, however, despite extensive laboratory and solar analysis. In this work, we use high-resolution archival absorption-line ultraviolet and optical spectra of stars whose warm temperatures favor moderate Fe I excitation. We derive the energy for a particular upper level in Kurucz's semiempirical calculations by adopting a trial value that yields the same wavelength for a given line predicted to be about as strong as that of a strong unidentified spectral line observed in the stellar spectra, then checking the new wavelengths of other strong predicted transitions that share the same upper level for coincidence with other strong observed unidentified lines. To date, this analysis has provided the upper energies of 66 Fe I levels. Many new energy levels are higher than those accessible to laboratory experiments; several exceed the Fe I ionization energy. These levels provide new identifications for over 2000 potentially detectable lines. Almost all of the new levels of odd parity include UV lines that were detected but unclassified in laboratory Fe I absorption spectra, providing an external check on the energy values. We motivate and present the procedure, provide the resulting new energy levels and their uncertainties, list all the potentially detectable UV and optical new Fe I line identifications and their gf values, point out new lines of astrophysical interest, and discuss the prospects for additional Fe I energy level determinations.
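
The wavelength check at the heart of the method follows directly from the level energies: for energies given in wavenumbers (cm^-1), the vacuum wavelength of a transition is lambda(Angstrom) = 10^8 / delta-E. A small sketch with hypothetical level values (not actual Fe I energies from this work):

```python
def vacuum_wavelength_angstrom(e_upper_cm1, e_lower_cm1):
    """Vacuum wavelength (Angstrom) of a transition between two energy
    levels given in wavenumbers (cm^-1): lambda = 1e8 / (E_up - E_low)."""
    delta = e_upper_cm1 - e_lower_cm1
    if delta <= 0:
        raise ValueError("upper level must lie above lower level")
    return 1e8 / delta

# Hypothetical level pair: upper 40000 cm^-1, lower 8000 cm^-1.
print(vacuum_wavelength_angstrom(40000.0, 8000.0))  # 3125.0
```

Adjusting a trial upper-level energy until a predicted strong line coincides with an observed unidentified line is exactly a search over this relation.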

  20. New Fe I Level Energies and Line Identifications from Stellar Spectra

    NASA Astrophysics Data System (ADS)

    Peterson, Ruth C.; Kurucz, Robert L.

    2015-01-01

    The spectrum of the Fe I atom is critical to many areas of astrophysics and beyond. Measurements of the energies of its high-lying levels remain woefully incomplete, however, despite extensive laboratory and solar analysis. In this work, we use high-resolution archival absorption-line ultraviolet and optical spectra of stars whose warm temperatures favor moderate Fe I excitation. We derive the energy for a particular upper level in Kurucz's semiempirical calculations by adopting a trial value that yields the same wavelength for a given line predicted to be about as strong as that of a strong unidentified spectral line observed in the stellar spectra, then checking the new wavelengths of other strong predicted transitions that share the same upper level for coincidence with other strong observed unidentified lines. To date, this analysis has provided the upper energies of 66 Fe I levels. Many new energy levels are higher than those accessible to laboratory experiments; several exceed the Fe I ionization energy. These levels provide new identifications for over 2000 potentially detectable lines. Almost all of the new levels of odd parity include UV lines that were detected but unclassified in laboratory Fe I absorption spectra, providing an external check on the energy values. We motivate and present the procedure, provide the resulting new energy levels and their uncertainties, list all the potentially detectable UV and optical new Fe I line identifications and their gf values, point out new lines of astrophysical interest, and discuss the prospects for additional Fe I energy level determinations.

  1. Computer-Assisted Instruction in Elementary Logic at the University Level. Technical Report No. 239.

    ERIC Educational Resources Information Center

    Goldberg, Adele; Suppes, Patrick

    Earlier research by the authors in the design and use of computer-assisted instructional systems and curricula for teaching mathematical logic to gifted elementary school students has been extended to the teaching of university-level courses. This report is a description of the curriculum and problem types of a computer-based course offered at…

  2. Chemistry Problem-Solving Abilities: Gender, Reasoning Level and Computer-Simulated Experiments.

    ERIC Educational Resources Information Center

    Suits, Jerry P.; Lagowski, J. J.

    Two studies were conducted to determine the effects of gender, reasoning level, and inductive and deductive computer-simulated experiments (CSE) on problem-solving abilities in introductory general chemistry. In the pilot study, 254 subjects were randomly assigned to control (computer-assisted-instruction tutorials), inductive or deductive CSE…

  3. Self-Assessment and Student Improvement in an Introductory Computer Course at the Community College Level

    ERIC Educational Resources Information Center

    Spicer-Sutton, Jama; Lampley, James; Good, Donald W.

    2014-01-01

    The purpose of this study was to determine a student's computer knowledge upon course entry and if there was a difference in college students' improvement scores as measured by the difference in pretest and post-test scores of new or novice users, moderate users, and expert users at the end of a college level introductory computing class. This…

  4. Computational techniques in tribology and material science at the atomic level

    NASA Technical Reports Server (NTRS)

    Ferrante, J.; Bozzolo, G. H.

    1992-01-01

    Computations in tribology and material science at the atomic level present considerable difficulties. Computational techniques ranging from first-principles to semi-empirical and their limitations are discussed. Example calculations of metallic surface energies using semi-empirical techniques are presented. Finally, application of the methods to calculation of adhesion and friction are presented.

  5. A Study of Effectiveness of Computer Assisted Instruction (CAI) over Classroom Lecture (CRL) at ICS Level

    ERIC Educational Resources Information Center

    Kaousar, Tayyeba; Choudhry, Bushra Naoreen; Gujjar, Aijaz Ahmed

    2008-01-01

This study aimed to evaluate the effectiveness of CAI vs. classroom lecture for computer science at ICS level. The objectives were to compare the learning effects of two groups studying the same curriculum through classroom lecture and computer-assisted instruction, and the effects of CAI and CRL in terms of cognitive development. Hypotheses of…

  6. Computational identification of miRNAs and their targets in Phaseolus vulgaris.

    PubMed

    Han, J; Xie, H; Kong, M L; Sun, Q P; Li, R Z; Pan, J B

    2014-01-21

    MicroRNAs (miRNAs) are a class of non-coding small RNAs that negatively regulate gene expression at the post-transcriptional level. Although thousands of miRNAs have been identified in plants, limited information is available about miRNAs in Phaseolus vulgaris, despite it being an important food legume worldwide. The high conservation of plant miRNAs enables the identification of new miRNAs in P. vulgaris by homology analysis. Here, 1804 known and unique plant miRNAs from 37 plant species were blast-searched against expressed sequence tag and genomic survey sequence databases to identify novel miRNAs in P. vulgaris. All candidate sequences were screened by a series of miRNA filtering criteria. Finally, we identified 27 conserved miRNAs, belonging to 24 miRNA families. When compared against known miRNAs in P. vulgaris, we found that 24 of the 27 miRNAs were newly discovered. Further, we identified 92 potential target genes with known functions for these novel miRNAs. Most of these target genes were predicted to be involved in plant development, signal transduction, metabolic pathways, disease resistance, and environmental stress response. The identification of the novel miRNAs in P. vulgaris is anticipated to provide baseline information for further research about the biological functions and evolution of miRNAs in P. vulgaris.
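
The homology-filtering step can be sketched as a mismatch count against known mature miRNAs; the thresholds below (length 20-22 nt, at most 3 mismatches) are typical of such pipelines but are assumptions, not the paper's exact filtering criteria:

```python
def is_candidate(seq, known, max_mismatches=3):
    """Flag seq as a conserved-miRNA candidate if it has the same length
    as a known mature miRNA (within the typical 20-22 nt range) and
    differs at no more than max_mismatches positions."""
    if len(seq) != len(known) or not (20 <= len(seq) <= 22):
        return False
    mismatches = sum(a != b for a, b in zip(seq, known))
    return mismatches <= max_mismatches

known = "UGAGGUAGUAGGUUGUAUAGUU"   # let-7 family mature sequence (22 nt)
print(is_candidate("UGAGGUAGUAGGUUGUAUAGUA", known))  # True  (1 mismatch)
print(is_candidate("UGAGGCCGUAGGUUGACUAGUA", known))  # False (4 mismatches)
```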

  7. Groundwater contamination: identification of source signal by time-reverse mass transport computation and filtering

    NASA Astrophysics Data System (ADS)

    Koussis, A. S.; Mazi, K.; Lykoudis, S.; Argyriou, A.

    2003-04-01

Source signal identification is a forensic task, within regulatory and legal activities. Estimation of the contaminant's release history by reverse-solution (stepping back in time) of the mass transport equation, ∂C/∂t + u ∂C/∂x = D ∂²C/∂x², is an ill-posed problem (its solution is non-unique and unstable). For this reason we propose the recovery of the source signal from measured concentration profile data through a numerical technique that is based on the premise of advection-dominated transport. We derive an explicit numerical scheme by discretising the pure advection equation, ∂C/∂t + u ∂C/∂x = 0, such that it also models gradient-transport by matching numerical diffusion (the leading truncation error term) to physical dispersion. The match is achieved by appropriate choice of the scheme's spatial weighting coefficient θ as a function of the grid Peclet number P = uΔx/D: θ = 0.5 - 1/P. This is a novel and efficient direct solution approach for the signal identification problem at hand that can accommodate space-variable transport parameters as well. First, we perform numerical experiments to define proper grids (in terms of the Courant number C = uΔt/Δx and the grid Peclet number P) for control of spurious oscillations (instability). We then assess recovery of source signals, from perfect as well as from error-seeded field data, considering field data resulting from single- and double-peaked source signals. With perfect data, the scheme recovers source signals with very good accuracy. With imperfect data, however, additional data conditioning is required for control of signal noise. Alternating reverse profile computation with Savitzky-Golay low-pass filtering allows the recovery of well-timed and smooth source signals that satisfy mass conservation very well. Current research focuses on: a) optimising the performance of Savitzky-Golay filters, through selection of appropriate parameters (order of least
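
The Savitzky-Golay low-pass step used to condition the noisy profiles fits a low-order polynomial to a sliding window by least squares; the smoothing weights can be derived directly, as in this numpy sketch (window length and polynomial order chosen for illustration only):

```python
import numpy as np

def savgol_coeffs(window, order):
    """Savitzky-Golay smoothing weights: least-squares fit of a
    polynomial of the given order in each window, evaluated at the
    window center (x = 0)."""
    half = window // 2
    x = np.arange(-half, half + 1, dtype=float)
    A = np.vander(x, order + 1, increasing=True)   # A[i, j] = x_i**j
    # Row 0 of the pseudoinverse maps window samples to the fitted
    # constant term, i.e. the smoothed value at the center.
    return np.linalg.pinv(A)[0]

c = savgol_coeffs(5, 2)
print(np.round(c * 35))  # classic quadratic 5-point weights: [-3. 12. 17. 12. -3.]

# One smoothing pass over a noisy concentration profile (toy data).
noisy = np.array([0.0, 0.1, 0.9, 1.2, 0.8, 0.2, 0.0, 0.1, 0.05])
smooth = np.convolve(noisy, c, mode="same")
```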

  8. Computational Identification of Novel MicroRNAs and Their Targets in Vigna unguiculata.

    PubMed

    Lu, Yongzhong; Yang, Xiaoyun

    2010-01-01

    MicroRNAs (miRNAs) are a class of endogenous, noncoding, short RNAs directly involved in regulating gene expression at the posttranscriptional level. High conservation of miRNAs in plant provides the foundation for identification of new miRNAs in other plant species through homology alignment. Here, previous known plant miRNAs were BLASTed against the Expressed Sequence Tag (EST) and Genomic Survey Sequence (GSS) databases of Vigna unguiculata, and according to a series of filtering criteria, a total of 47 miRNAs belonging to 13 miRNA families were identified, and 30 potential target genes of them were subsequently predicted, most of which seemed to encode transcription factors or enzymes participating in regulation of development, growth, metabolism, and other physiological processes. Overall, our findings lay the foundation for further researches of miRNAs function in Vigna unguiculata.

  9. Identification of Nonlinear Micron-Level Mechanics for a Precision Deployable Joint

    NASA Technical Reports Server (NTRS)

    Bullock, S. J.; Peterson, L. D.

    1994-01-01

The experimental identification of micron-level nonlinear joint mechanics and dynamics for a pin-clevis joint used in a precision, adaptive, deployable space structure is investigated. The force-state mapping method is used to identify the behavior of the joint under a preload. The results of applying a single tension-compression cycle to the joint under a tensile preload are presented. The observed micron-level behavior is highly nonlinear and involves all six rigid-body motion degrees of freedom of the joint. It also suggests that at micron levels of motion, modelling of the joint mechanics and dynamics must include the interactions between all internal components, such as the pin, bushings, and the joint node.

  10. Identification of pumping influences in long-term water level fluctuations.

    PubMed

    Harp, Dylan R; Vesselinov, Velimir V

    2011-01-01

Identification of the pumping influences at monitoring wells caused by spatially and temporally variable water supply pumping can be a challenging yet important hydrogeological task. The information that can be obtained can be critical for conceptualization of the hydrogeological conditions and indication of the zone of influence of the individual pumping wells. However, the pumping influences are often intermittent and small in magnitude, with variable production rates from multiple pumping wells. While these difficulties may support an inclination to abandon the existing dataset and conduct a dedicated cross-hole pumping test, that option can be challenging and expensive to coordinate and execute. This paper presents a method for the analysis of a long-term water level record that combines a simple analytical model with an inverse modeling approach. The methodology allows the identification of the pumping wells influencing the water level fluctuations. Thus, the analysis provides an efficient and cost-effective alternative to designed and coordinated cross-hole pumping tests. We apply this method to a dataset from the Los Alamos National Laboratory site. Our analysis also provides (1) an evaluation of the information content of the transient water level data; (2) indications of potential structures of the aquifer heterogeneity inhibiting or promoting pressure propagation; and (3) guidance for the development of more complicated models requiring detailed specification of the aquifer heterogeneity.
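
The abstract does not specify which analytical model is used; a common choice for pumping-induced drawdown in a confined aquifer is the Theis solution s = Q·W(u)/(4πT) with u = r²S/(4Tt), sketched here under that assumption:

```python
import math

def well_function(u, terms=30):
    """Theis well function W(u) = -gamma - ln(u) + sum over k >= 1 of
    (-1)**(k+1) * u**k / (k * k!), via its convergent series
    (accurate for small u)."""
    gamma = 0.5772156649015329  # Euler-Mascheroni constant
    total = -gamma - math.log(u)
    for k in range(1, terms + 1):
        total += (-1) ** (k + 1) * u**k / (k * math.factorial(k))
    return total

def theis_drawdown(Q, T, S, r, t):
    """Drawdown at radius r (m) and time t (s) for pumping rate Q (m^3/s),
    transmissivity T (m^2/s) and storativity S (dimensionless)."""
    u = r**2 * S / (4.0 * T * t)
    return Q * well_function(u) / (4.0 * math.pi * T)

print(round(well_function(0.01), 3))  # 4.038 (matches tabulated W(u))
```

An inverse approach then fits pumping rates and aquifer parameters so that superposed Theis responses reproduce the observed water level record.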

  11. A riboprinting scheme for identification of unknown Acanthamoeba isolates at species level

    PubMed Central

    Kong, Hyun-Hee

    2002-01-01

We describe a riboprinting scheme for identification of unknown Acanthamoeba isolates at the species level. It involved the use of PCR-RFLP of the small subunit ribosomal RNA gene (riboprint) of 24 reference strains with 4 kinds of restriction enzymes. Seven strains in morphological groups I and III were identified at the species level by their unique PCR product sizes and Rsa I riboprint types. The unique RFLPs of 17 strains in group II by Dde I, Taq I and Hae III were classified into: (1) four taxa that were identifiable at the species level, (2) a subgroup of 4 taxa and a pair of 2 taxa that were identical with each other, and (3) a species complex of 7 taxa assigned to the A. castellanii complex that were closely related. These results were consistent with those obtained by 18S rDNA sequence analysis. This approach provides an alternative to rDNA sequencing for rapid identification of a new clinical isolate or a large number of environmental isolates of Acanthamoeba. PMID:11949210
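
A riboprint type is essentially the pattern of restriction-fragment sizes produced by each enzyme. An in-silico digest illustrates the idea (Rsa I recognises GTAC and cuts bluntly between GT and AC; the toy sequence is hypothetical, not an Acanthamoeba rRNA gene):

```python
def digest(seq, site, cut_offset):
    """Fragment lengths from digesting seq with an enzyme whose
    recognition site is `site`, cutting `cut_offset` bases into it.
    E.g. Rsa I recognises GTAC and cuts after GT (offset 2)."""
    cuts = []
    i = seq.find(site)
    while i != -1:
        cuts.append(i + cut_offset)
        i = seq.find(site, i + 1)
    bounds = [0] + cuts + [len(seq)]
    return [bounds[j + 1] - bounds[j] for j in range(len(bounds) - 1)]

seq = "AAGTACTTTTGTACAA"        # toy 16 bp sequence with two Rsa I sites
print(digest(seq, "GTAC", 2))   # [4, 8, 4]
```

Comparing such fragment-length patterns across enzymes is what groups isolates into the riboprint types described above.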

  12. Computational Analyses of Spectral Trees from Electrospray Multi-Stage Mass Spectrometry to Aid Metabolite Identification

    PubMed Central

    Cao, Mingshu; Fraser, Karl; Rasmussen, Susanne

    2013-01-01

    Mass spectrometry coupled with chromatography has become the major technical platform in metabolomics. Aided by peak detection algorithms, the detected signals are characterized by mass-over-charge ratio (m/z) and retention time. Chemical identities often remain elusive for the majority of the signals. Multi-stage mass spectrometry based on electrospray ionization (ESI) allows collision-induced dissociation (CID) fragmentation of selected precursor ions. These fragment ions can assist in structural inference for metabolites of low molecular weight. Computational investigations of fragmentation spectra have increasingly received attention in metabolomics and various public databases house such data. We have developed an R package “iontree” that can capture, store and analyze MS2 and MS3 mass spectral data from high throughput metabolomics experiments. The package includes functions for ion tree construction, an algorithm (distMS2) for MS2 spectral comparison, and tools for building platform-independent ion tree (MS2/MS3) libraries. We have demonstrated the utilization of the package for the systematic analysis and annotation of fragmentation spectra collected in various metabolomics platforms, including direct infusion mass spectrometry, and liquid chromatography coupled with either low resolution or high resolution mass spectrometry. Assisted by the developed computational tools, we have demonstrated that spectral trees can provide informative evidence complementary to retention time and accurate mass to aid with annotating unknown peaks. These experimental spectral trees once subjected to a quality control process, can be used for querying public MS2 databases or de novo interpretation. The putatively annotated spectral trees can be readily incorporated into reference libraries for routine identification of metabolites. PMID:24958264
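
The package's distMS2 algorithm is not described in the abstract; a widely used stand-in for MS2 spectral comparison is cosine similarity over binned peak lists, sketched here with hypothetical peak values:

```python
import numpy as np

def spectral_cosine(spec_a, spec_b, bin_width=1.0):
    """Cosine similarity between two MS2 spectra given as lists of
    (m/z, intensity) pairs, after rounding m/z onto a shared grid."""
    idx = lambda mz: int(round(mz / bin_width))
    n = max(idx(mz) for mz, _ in spec_a + spec_b) + 1
    a, b = np.zeros(n), np.zeros(n)
    for vec, spec in ((a, spec_a), (b, spec_b)):
        for mz, inten in spec:
            vec[idx(mz)] += inten
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

s1 = [(91.05, 100.0), (119.08, 40.0)]  # hypothetical fragment peaks
s2 = [(91.10, 80.0), (119.02, 50.0)]   # same pattern, slight m/z jitter
print(round(spectral_cosine(s1, s2), 3))  # 0.984
```

Applied recursively down an ion tree (MS2, then MS3 of selected fragments), a peak-list similarity of this kind supports matching experimental spectral trees against library trees.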

  13. The identification of a selective dopamine D2 partial agonist, D3 antagonist displaying high levels of brain exposure.

    PubMed

    Holmes, Ian P; Blunt, Richard J; Lorthioir, Olivier E; Blowers, Stephen M; Gribble, Andy; Payne, Andrew H; Stansfield, Ian G; Wood, Martyn; Woollard, Patrick M; Reavill, Charlie; Howes, Claire M; Micheli, Fabrizio; Di Fabio, Romano; Donati, Daniele; Terreni, Silvia; Hamprecht, Dieter; Arista, Luca; Worby, Angela; Watson, Steve P

    2010-03-15

    The identification of a highly selective D(2) partial agonist, D(3) antagonist tool molecule which demonstrates high levels of brain exposure and selectivity against an extensive range of dopamine, serotonin, adrenergic, histamine, and muscarinic receptors is described.

  14. Computer-aided identification of the pectoral muscle in digitized mammograms.

    PubMed

    Camilus, K Santle; Govindan, V K; Sathidevi, P S

    2010-10-01

Mammograms are X-ray images of human breast which are normally used to detect breast cancer. The presence of pectoral muscle in mammograms may disturb the detection of breast cancer as the pectoral muscle and mammographic parenchyma appear similar. So, the suppression or exclusion of the pectoral muscle from the mammograms is demanded for computer-aided analysis, which requires the identification of the pectoral muscle. The main objective of this study is to propose an automated method to efficiently identify the pectoral muscle in medio-lateral oblique-view mammograms. This method uses a proposed graph cut-based image segmentation technique for identifying the pectoral muscle edge. The identified pectoral muscle edge is found to be ragged. Hence, the pectoral muscle is smoothly represented using a Bezier curve which uses the control points obtained from the pectoral muscle edge. The proposed work was tested on a public dataset of medio-lateral oblique-view mammograms obtained from the Mammographic Image Analysis Society database, and its performance was compared with the state-of-the-art methods reported in the literature. The mean false positive and false negative rates of the proposed method over randomly chosen 84 mammograms were calculated, respectively, as 0.64% and 5.58%. Also, with respect to the number of results with small error, the proposed method outperforms existing methods. These results indicate that the proposed method can be used to accurately identify the pectoral muscle on medio-lateral oblique view mammograms.
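
Once control points are taken from the detected muscle edge, the smooth Bezier representation can be evaluated with De Casteljau's algorithm; a sketch with hypothetical control points (the paper's control-point selection is not reproduced here):

```python
def bezier_point(control_points, t):
    """Evaluate a Bezier curve at parameter t in [0, 1] using
    De Casteljau's algorithm (numerically stable repeated lerp)."""
    pts = [tuple(p) for p in control_points]
    while len(pts) > 1:
        pts = [((1 - t) * x0 + t * x1, (1 - t) * y0 + t * y1)
               for (x0, y0), (x1, y1) in zip(pts, pts[1:])]
    return pts[0]

# Hypothetical control points along a ragged edge.
ctrl = [(0.0, 0.0), (1.0, 2.0), (3.0, 2.0), (4.0, 0.0)]
print(bezier_point(ctrl, 0.0))  # (0.0, 0.0) -- curve starts at first point
print(bezier_point(ctrl, 0.5))  # (2.0, 1.5)
```

Sampling t over [0, 1] traces the smooth replacement for the ragged segmentation boundary.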

  15. A computational method for the identification of new candidate carcinogenic and non-carcinogenic chemicals.

    PubMed

    Chen, Lei; Chu, Chen; Lu, Jing; Kong, Xiangyin; Huang, Tao; Cai, Yu-Dong

    2015-09-01

    Cancer is one of the leading causes of human death. Based on current knowledge, one of the causes of cancer is exposure to toxic chemical compounds, including radioactive compounds, dioxin, and arsenic. The identification of new carcinogenic chemicals may warn us of potential danger and help to identify new ways to prevent cancer. In this study, a computational method was proposed to identify potential carcinogenic chemicals, as well as non-carcinogenic chemicals. According to the current validated carcinogenic and non-carcinogenic chemicals from the CPDB (Carcinogenic Potency Database), the candidate chemicals were searched in a weighted chemical network constructed according to chemical-chemical interactions. Then, the obtained candidate chemicals were further selected by a randomization test and information on chemical interactions and structures. The analyses identified several candidate carcinogenic chemicals, while those candidates identified as non-carcinogenic were supported by a literature search. In addition, several candidate carcinogenic/non-carcinogenic chemicals exhibit structural dissimilarity with validated carcinogenic/non-carcinogenic chemicals.

  16. Safety and reliability of Radio Frequency Identification Devices in Magnetic Resonance Imaging and Computed Tomography

    PubMed Central

    2010-01-01

Background Radio Frequency Identification (RFID) devices are becoming more and more essential for patient safety in hospitals. The purpose of this study was to determine patient safety, data reliability and signal loss for on-skin RFID devices worn during magnetic resonance imaging (MRI) and computed tomography (CT) scanning. Methods Sixty RFID tags of the type I-Code SLI, 13.56 MHz, ISO 18000-3.1 were tested: thirty of type 1, an RFID tag with a 76 × 45 mm aluminum-etched antenna, and thirty of type 2, a tag with a 31 × 14 mm copper-etched antenna. The signal loss, material movement and heat tests were performed in a 1.5 T and a 3 T MR system. For data integrity, the tags were additionally tested during CT scanning. Standardized function tests were performed with all transponders before and after all imaging studies. Results There was no memory loss or data alteration in the RFID tags after MRI and CT scanning. Neither heating (a maximum of 3.6°C) nor device movement (below 1 N/kg) had a relevant influence. Concerning signal loss (artifacts of 2 - 4 mm), interpretability of MR images was impaired when superficial structures such as skin, subcutaneous tissues or tendons were assessed. Conclusions Patients wearing RFID wristbands are safe in 1.5 T and 3 T MR scanners using normal operation mode for the RF field. The findings are specific to the RFID tags that underwent testing. PMID:20205829

  17. Computation of likelihood ratios in fingerprint identification for configurations of any number of minutiae.

    PubMed

    Neumann, Cédric; Champod, Christophe; Puch-Solis, Roberto; Egli, Nicole; Anthonioz, Alexandre; Bromage-Griffiths, Andie

    2007-01-01

    Recent court challenges have highlighted the need for statistical research on fingerprint identification. This paper proposes a model for computing likelihood ratios (LRs) to assess the evidential value of comparisons with any number of minutiae. The model considers minutiae type, direction and relative spatial relationships. It expands on previous work on three minutiae by adopting a spatial modeling using radial triangulation and a probabilistic distortion model for assessing the numerator of the LR. The model has been tested on a sample of 686 ulnar loops and 204 arches. Features vectors used for statistical analysis have been obtained following a preprocessing step based on Gabor filtering and image processing to extract minutiae data. The metric used to assess similarity between two feature vectors is based on an Euclidean distance measure. Tippett plots and rates of misleading evidence have been used as performance indicators of the model. The model has shown encouraging behavior with low rates of misleading evidence and a LR power of the model increasing significantly with the number of minutiae. The LRs that it provides are highly indicative of identity of source on a significant proportion of cases, even when considering configurations with few minutiae. In contrast with previous research, the model, in addition to minutia type and direction, incorporates spatial relationships of minutiae without introducing probabilistic independence assumptions. The model also accounts for finger distortion.

  18. Computational Tools for Allosteric Drug Discovery: Site Identification and Focus Library Design.

    PubMed

    Huang, Wenkang; Nussinov, Ruth; Zhang, Jian

    2017-01-01

    Allostery is an intrinsic phenomenon of biological macromolecules involving regulation and/or signal transduction induced by a ligand binding to an allosteric site distinct from a molecule's active site. Allosteric drugs are currently receiving increased attention in drug discovery because drugs that target allosteric sites can provide important advantages over the corresponding orthosteric drugs including specific subtype selectivity within receptor families. Consequently, targeting allosteric sites, instead of orthosteric sites, can reduce drug-related side effects and toxicity. On the down side, allosteric drug discovery can be more challenging than traditional orthosteric drug discovery due to difficulties associated with determining the locations of allosteric sites and designing drugs based on these sites and the need for the allosteric effects to propagate through the structure, reach the ligand binding site and elicit a conformational change. In this study, we present computational tools ranging from the identification of potential allosteric sites to the design of "allosteric-like" modulator libraries. These tools may be particularly useful for allosteric drug discovery.

  19. [Determination of antibiotype by a computer program. Epidemiological significance and antibiogram-identification correlates].

    PubMed

    Fosse, T; Macone, F; Laffont, C

    1988-06-01

By means of a computer program, disk diffusion diameters were analysed and an antibiotic susceptibility code (antibiotype) was determined for Enterobacteriaceae. This code was a six-figure number, each figure summarising susceptibility (susceptible or resistant) to three antibiotics, so a series of 18 antibiotics was needed to calculate the six-figure code. The following antibiotics, among others, were chosen for their characteristic behavior: amoxycillin, ticarcillin, amoxycillin + clavulanic acid, cephalothin, ticarcillin + clavulanic acid, cefotaxime, gentamycin, tobramycin, amikacin, nalidixic acid, pefloxacin, ciprofloxacin, fosfomycin and colistin. The code allowed three kinds of use: epidemiology, by comparing the biochemical and susceptibility patterns of isolates of the same species; laboratory control, where a database of the main antibiotic susceptibility patterns for each species allowed a rapid check of the compatibility of the biochemical identification with the antibiogram (an inconsistent result led either to re-checking of the biochemical and susceptibility tests or to recording a new code in a file to further enrich the database); and printing of a code-dependent message for therapeutic purposes.
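
Each figure of the six-figure code summarises susceptibility to three antibiotics, which maps naturally onto an octal digit. The bit convention below (resistant = 1, first antibiotic of each triplet most significant) is an assumption for illustration, since the paper's exact encoding is not given:

```python
def antibiotype(results):
    """Encode 18 ordered susceptibility results ('S' or 'R') as a
    six-figure antibiotype: each figure is the octal value of one
    triplet, with resistance encoded as 1 (assumed convention)."""
    if len(results) != 18:
        raise ValueError("exactly 18 antibiotics expected")
    bits = [1 if r == "R" else 0 for r in results]
    figures = [bits[i] * 4 + bits[i + 1] * 2 + bits[i + 2]
               for i in range(0, 18, 3)]
    return "".join(str(f) for f in figures)

# Fully susceptible strain vs. one resistant to the first triplet.
print(antibiotype(["S"] * 18))                    # 000000
print(antibiotype(["R", "R", "R"] + ["S"] * 15))  # 700000
```

Comparing such codes between isolates of the same species, or against a per-species table of expected codes, gives exactly the epidemiological and quality-control uses described above.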

  20. Absolute identification of muramic acid, at trace levels, in human septic synovial fluids in vivo and absence in aseptic fluids.

    PubMed

    Fox, A; Fox, K; Christensson, B; Harrelson, D; Krahmer, M

    1996-09-01

    This is the first report of a study employing the state-of-the-art technique of gas chromatography-tandem mass spectrometry for absolute identification of muramic acid (a marker for peptidoglycan) at trace levels in a human or animal body fluid or tissue. Daughter mass spectra of synovial fluid muramic acid peaks (> or = 30 ng/ml) were identical to those of pure muramic acid. Absolute chemical identification at this level represents a 1,000-fold increase in sensitivity over previous gas chromatography-mass spectrometry identifications. Muramic acid was positively identified in synovial fluids during infection and was eliminated over time but was absent from aseptic fluids.

  1. A unique automation platform for measuring low level radioactivity in metabolite identification studies.

    PubMed

    Krauser, Joel; Walles, Markus; Wolf, Thierry; Graf, Daniel; Swart, Piet

    2012-01-01

    Generation and interpretation of biotransformation data on drugs, i.e. identification of physiologically relevant metabolites, defining metabolic pathways and elucidation of metabolite structures, have become increasingly important to the drug development process. Profiling using (14)C or (3)H radiolabel is defined as the chromatographic separation and quantification of drug-related material in a given biological sample derived from an in vitro, preclinical in vivo or clinical study. Metabolite profiling is a very time intensive activity, particularly for preclinical in vivo or clinical studies which have defined limitations on radiation burden and exposure levels. A clear gap exists for certain studies which do not require specialized high volume automation technologies, yet these studies would still clearly benefit from automation. Use of radiolabeled compounds in preclinical and clinical ADME studies, specifically for metabolite profiling and identification are a very good example. The current lack of automation for measuring low level radioactivity in metabolite profiling requires substantial capacity, personal attention and resources from laboratory scientists. To help address these challenges and improve efficiency, we have innovated, developed and implemented a novel and flexible automation platform that integrates a robotic plate handling platform, HPLC or UPLC system, mass spectrometer and an automated fraction collector.

  2. A Unique Automation Platform for Measuring Low Level Radioactivity in Metabolite Identification Studies

    PubMed Central

    Krauser, Joel; Walles, Markus; Wolf, Thierry; Graf, Daniel; Swart, Piet

    2012-01-01

    Generation and interpretation of biotransformation data on drugs, i.e. identification of physiologically relevant metabolites, defining metabolic pathways and elucidation of metabolite structures, have become increasingly important to the drug development process. Profiling using 14C or 3H radiolabel is defined as the chromatographic separation and quantification of drug-related material in a given biological sample derived from an in vitro, preclinical in vivo or clinical study. Metabolite profiling is a very time intensive activity, particularly for preclinical in vivo or clinical studies which have defined limitations on radiation burden and exposure levels. A clear gap exists for certain studies which do not require specialized high volume automation technologies, yet these studies would still clearly benefit from automation. Use of radiolabeled compounds in preclinical and clinical ADME studies, specifically for metabolite profiling and identification are a very good example. The current lack of automation for measuring low level radioactivity in metabolite profiling requires substantial capacity, personal attention and resources from laboratory scientists. To help address these challenges and improve efficiency, we have innovated, developed and implemented a novel and flexible automation platform that integrates a robotic plate handling platform, HPLC or UPLC system, mass spectrometer and an automated fraction collector. PMID:22723932

  3. Ingroup identification and group-level narcissism as predictors of U.S. citizens' attitudes and behavior toward Arab immigrants.

    PubMed

    Lyons, Patricia A; Kenworthy, Jared B; Popan, Jason R

    2010-09-01

    In four studies, the authors explored factors contributing to negative attitudes and behavior toward Arab immigrants in the United States. In Study 1, Americans reported greater threat from Arabs, compared to other groups (e.g., Latino, Asian). In Study 2, they tested the effects of ingroup identification and group-level narcissism on attitudes toward Arab, Latino, Asian, and European immigrants. Identification interacted with group narcissism in predicting attitudes toward Arab (but not other) immigrants, such that identification predicted negative attitudes toward Arab immigrants only at mean and high levels of group narcissism. Study 3 explored the convergent and discriminant validity of the group narcissism construct. In Study 4, the authors added a behavioral dependent measure. Again, ingroup identification predicted negative behavior and attitudes toward an Arab immigrant group (but not comparison groups) only at mean and high levels of group narcissism. Theoretical and practical implications are discussed.

  4. Ontology-Based High-Level Context Inference for Human Behavior Identification

    PubMed Central

    Villalonga, Claudia; Razzaq, Muhammad Asif; Khan, Wajahat Ali; Pomares, Hector; Rojas, Ignacio; Lee, Sungyoung; Banos, Oresti

    2016-01-01

    Recent years have witnessed a huge progress in the automatic identification of individual primitives of human behavior, such as activities or locations. However, the complex nature of human behavior demands more abstract contextual information for its analysis. This work presents an ontology-based method that combines low-level primitives of behavior, namely activity, locations and emotions, unprecedented to date, to intelligently derive more meaningful high-level context information. The paper contributes with a new open ontology describing both low-level and high-level context information, as well as their relationships. Furthermore, a framework building on the developed ontology and reasoning models is presented and evaluated. The proposed method proves to be robust while identifying high-level contexts even in the event of erroneously-detected low-level contexts. Despite reasonable inference times being obtained for a relevant set of users and instances, additional work is required to scale to long-term scenarios with a large number of users. PMID:27690050
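
    The idea of deriving high-level context from low-level primitives can be illustrated with a toy rule table. The actual system uses an OWL ontology and a description-logic reasoner; the rules, labels, and wildcard matching below are illustrative assumptions only, including the tolerance of an erroneously detected (unknown) primitive.

```python
# Each rule maps (activity, location, emotion) to a high-level context.
# None in a rule is a wildcard; None in an observation marks a low-level
# primitive that was not (or wrongly) detected and is tolerated.
RULES = [
    (("running", "outdoors", None), "exercising"),
    (("sitting", "office", "bored"), "working (disengaged)"),
    (("sitting", "office", None), "working"),
    (("lying", "home", "calm"), "resting"),
]

def infer_context(activity, location, emotion):
    """Return the first matching high-level context label."""
    observation = (activity, location, emotion)
    for rule, label in RULES:
        if all(p is None or q is None or p == q
               for p, q in zip(rule, observation)):
            return label
    return "unknown"

# A missing location still resolves, mimicking robustness to
# erroneously detected low-level contexts.
print(infer_context("running", None, "happy"))     # -> exercising
print(infer_context("sitting", "office", "bored"))  # -> working (disengaged)
```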

  5. Development of a computer-assisted forensic radiographic identification method using the lateral cervical and lumbar spine.

    PubMed

    Derrick, Sharon M; Raxter, Michelle H; Hipp, John A; Goel, Priya; Chan, Elaine F; Love, Jennifer C; Wiersema, Jason M; Akella, N Shastry

    2015-01-01

    Medical examiners and coroners (ME/C) in the United States hold statutory responsibility to identify deceased individuals who fall under their jurisdiction. The computer-assisted decedent identification (CADI) project was designed to modify software used in diagnosis and treatment of spinal injuries into a mathematically validated tool for ME/C identification of fleshed decedents. CADI software analyzes the shapes of targeted vertebral bodies imaged in an array of standard radiographs and quantifies the likelihood that any two of the radiographs contain matching vertebral bodies. Six validation tests measured the repeatability, reliability, and sensitivity of the method, and the effects of age, sex, and number of radiographs in array composition. CADI returned a 92-100% success rate in identifying the true matching pair of vertebrae within arrays of five to 30 radiographs. Further development of CADI is expected to produce a novel identification method for use in ME/C offices that is reliable, timely, and cost-effective.

  6. Computational identification of surrogate genes for prostate cancer phases using machine learning and molecular network analysis

    PubMed Central

    2014-01-01

    Background Prostate cancer is one of the most common malignant diseases and is characterized by heterogeneity in the clinical course. To date, there are no efficient morphologic features or genomic biomarkers that can characterize the phenotypes of the cancer, especially with regard to metastasis – the most adverse outcome. Searching for effective surrogate genes out of large quantities of gene expression data is a key to cancer phenotyping and/or understanding molecular mechanisms underlying prostate cancer development. Results Using the maximum relevance minimum redundancy (mRMR) method on microarray data from normal tissues, primary tumors and metastatic tumors, we identified four genes that can optimally classify samples of different prostate cancer phases. Moreover, we constructed a molecular interaction network with existing bioinformatic resources and co-identified eight genes on the shortest-paths among the mRMR-identified genes, which are potential co-acting factors of prostate cancer. Functional analyses show that molecular functions involved in cell communication, hormone-receptor mediated signaling, and transcription regulation play important roles in the development of prostate cancer. Conclusion We conclude that the surrogate genes we have selected compose an effective classifier of prostate cancer phases, which corresponds to a minimum characterization of cancer phenotypes on the molecular level. Along with their molecular interaction partners, it is fair to assume that these genes may have important roles in prostate cancer development; particularly, the unreported genes may bring new insights for the understanding of the molecular mechanisms. Thus our results may serve as a candidate gene set for further functional studies. PMID:25151146
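
    A minimal greedy sketch of mRMR selection over discretized expression values, assuming the standard difference criterion (relevance minus mean redundancy, both as mutual information); the toy gene names and data are invented:

```python
import math
from collections import Counter

def mutual_information(xs, ys):
    """I(X;Y) in bits for two discrete sequences of equal length."""
    n = len(xs)
    px, py, pxy = Counter(xs), Counter(ys), Counter(zip(xs, ys))
    return sum((c / n) * math.log2((c / n) / ((px[x] / n) * (py[y] / n)))
               for (x, y), c in pxy.items())

def mrmr_select(features, labels, k):
    """Greedy mRMR: at each step pick the feature maximising relevance
    I(f; labels) minus mean redundancy with already-selected features."""
    selected = []
    while len(selected) < k:
        best, best_score = None, -math.inf
        for name, values in features.items():
            if name in selected:
                continue
            relevance = mutual_information(values, labels)
            redundancy = (sum(mutual_information(values, features[s])
                              for s in selected) / len(selected)
                          if selected else 0.0)
            if relevance - redundancy > best_score:
                best, best_score = name, relevance - redundancy
        selected.append(best)
    return selected

# Toy binarized expression: gene2 duplicates gene1, so mRMR skips it
# in favour of the informative but less redundant gene3.
features = {
    "gene1": [0, 0, 0, 0, 1, 1, 1, 0],
    "gene2": [0, 0, 0, 0, 1, 1, 1, 0],
    "gene3": [1, 0, 0, 0, 1, 1, 1, 1],
}
labels = [0, 0, 0, 0, 1, 1, 1, 1]
print(mrmr_select(features, labels, 2))  # -> ['gene1', 'gene3']
```

    The redundancy penalty is what distinguishes mRMR from simply ranking genes by relevance, which would have kept the duplicated gene.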

  7. The Potential Use of Radio Frequency Identification Devices for Active Monitoring of Blood Glucose Levels

    PubMed Central

    Moore, Bert

    2009-01-01

    Imagine a diabetes patient receiving a text message on his mobile phone warning him that his blood glucose level is too low or a patient's mobile phone calling an emergency number when the patient goes into diabetic shock. Both scenarios depend on automatic, continuous monitoring of blood glucose levels and transmission of that information to a phone. The development of advanced biological sensors and integration with passive radio frequency identification technologies are the key to this. These hold the promise of being able to free patients from finger stick sampling or externally worn devices while providing continuous blood glucose monitoring that allows patients to manage their health more actively. To achieve this promise, however, a number of technical issues need to be addressed. PMID:20046663

  8. Rapid identification of mycobacteria to the species level by polymerase chain reaction and restriction enzyme analysis.

    PubMed Central

    Telenti, A; Marchesi, F; Balz, M; Bally, F; Böttger, E C; Bodmer, T

    1993-01-01

    A method for the rapid identification of mycobacteria to the species level was developed on the basis of evaluation by the polymerase chain reaction (PCR) of the gene encoding for the 65-kDa protein. The method involves restriction enzyme analysis of PCR products obtained with primers common to all mycobacteria. Using two restriction enzymes, BstEII and HaeIII, medically relevant and other frequent laboratory isolates were differentiated to the species or subspecies level by PCR-restriction enzyme pattern analysis. PCR-restriction enzyme pattern analysis was performed on isolates (n = 330) from solid and fluid culture media, including BACTEC, or from frozen and lyophilized stocks. The procedure does not involve hybridization steps or the use of radioactivity and can be completed within 1 working day. Images PMID:8381805
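
    The core of PCR-restriction enzyme pattern analysis is an in silico digest: cut the amplicon at every recognition site and compare the resulting fragment-length pattern against species tables. BstEII (G^GTNACC) and HaeIII (GG^CC) are the enzymes named in the abstract; the toy sequence below is an illustrative assumption, not a real hsp65 amplicon.

```python
import re

ENZYMES = {
    "BstEII": ("GGTNACC", 1),  # cuts G^GTNACC, one base into the site
    "HaeIII": ("GGCC", 2),     # blunt cutter, GG^CC
}

def digest(seq, enzyme):
    """Return fragment lengths after cutting `seq` at every site."""
    pattern, offset = ENZYMES[enzyme]
    regex = pattern.replace("N", "[ACGT]")  # N matches any base
    cuts = [m.start() + offset for m in re.finditer(regex, seq)]
    bounds = [0] + cuts + [len(seq)]
    return [b - a for a, b in zip(bounds, bounds[1:])]

amplicon = "AAAGGTCACCAAAGGCCAAA"  # toy 20-bp amplicon
print(digest(amplicon, "BstEII"))  # -> [4, 16]
print(digest(amplicon, "HaeIII"))  # -> [15, 5]
```

    Two fragment-length lists, one per enzyme, form the pattern that is matched against the reference table for species assignment.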

  9. High level language for measurement complex control based on the computer E-100I

    NASA Technical Reports Server (NTRS)

    Zubkov, B. V.

    1980-01-01

    A high level language was designed to control the process of conducting an experiment using the computer "Elektronika-100I". Program examples are given for controlling the measuring and actuating devices. The procedure for including these programs in the suggested high level language is described.

  10. 48 CFR 227.7203-3 - Early identification of computer software or computer software documentation to be furnished to...

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... use, reproduction or disclosure. 227.7203-3 Section 227.7203-3 Federal Acquisition Regulations System... restrictions on use, reproduction or disclosure. (a) Use the provision at 252.227-7017, Identification and... for which restrictions, other than copyright, on use, modification, reproduction, release,...

  11. 48 CFR 227.7203-3 - Early identification of computer software or computer software documentation to be furnished to...

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... use, reproduction or disclosure. 227.7203-3 Section 227.7203-3 Federal Acquisition Regulations System... restrictions on use, reproduction or disclosure. (a) Use the provision at 252.227-7017, Identification and... for which restrictions, other than copyright, on use, modification, reproduction, release,...

  12. 48 CFR 227.7203-3 - Early identification of computer software or computer software documentation to be furnished to...

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... use, reproduction or disclosure. 227.7203-3 Section 227.7203-3 Federal Acquisition Regulations System... restrictions on use, reproduction or disclosure. (a) Use the provision at 252.227-7017, Identification and... for which restrictions, other than copyright, on use, modification, reproduction, release,...

  13. 48 CFR 227.7203-3 - Early identification of computer software or computer software documentation to be furnished to...

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... use, reproduction or disclosure. 227.7203-3 Section 227.7203-3 Federal Acquisition Regulations System... restrictions on use, reproduction or disclosure. (a) Use the provision at 252.227-7017, Identification and... for which restrictions, other than copyright, on use, modification, reproduction, release,...

  14. Computational medicinal chemistry for rational drug design: Identification of novel chemical structures with potential anti-tuberculosis activity.

    PubMed

    Koseki, Yuji; Aoki, Shunsuke

    2014-01-01

    Tuberculosis (TB) is caused by the bacterium Mycobacterium tuberculosis and is a common infectious disease with high mortality and morbidity. The increasing prevalence of drug-resistant strains of TB presents a major public health problem. Due to the lack of effective drugs to treat these drug-resistant strains, the discovery or development of novel anti-TB drugs is important. Computer-aided drug design has become an established strategy for the identification of novel active chemicals through a combination of several drug design tools. In this review, we summarise the current chemotherapy for TB, describe attractive target proteins for the development of antibiotics against TB, and detail several computational drug design strategies that may contribute to the further identification of active chemicals for the treatment of not only TB but also other diseases.

  15. Identification, Recovery, and Refinement of Hitherto Undescribed Population-Level Genomes from the Human Gastrointestinal Tract

    PubMed Central

    Laczny, Cedric C.; Muller, Emilie E. L.; Heintz-Buschart, Anna; Herold, Malte; Lebrun, Laura A.; Hogan, Angela; May, Patrick; de Beaufort, Carine; Wilmes, Paul

    2016-01-01

    Linking taxonomic identity and functional potential at the population-level is important for the study of mixed microbial communities and is greatly facilitated by the availability of microbial reference genomes. While the culture-independent recovery of population-level genomes from environmental samples using the binning of metagenomic data has expanded available reference genome catalogs, several microbial lineages remain underrepresented. Here, we present two reference-independent approaches for the identification, recovery, and refinement of hitherto undescribed population-level genomes. The first approach is aimed at genome recovery of varied taxa and involves multi-sample automated binning using CANOPY CLUSTERING complemented by visualization and human-augmented binning using VIZBIN post hoc. The second approach is particularly well-suited for the study of specific taxa and employs VIZBIN de novo. Using these approaches, we reconstructed a total of six population-level genomes of distinct and divergent representatives of the Alphaproteobacteria class, the Mollicutes class, the Clostridiales order, and the Melainabacteria class from human gastrointestinal tract-derived metagenomic data. Our results demonstrate that, while automated binning approaches provide great potential for large-scale studies of mixed microbial communities, these approaches should be complemented with informative visualizations because expert-driven inspection and refinements are critical for the recovery of high-quality population-level genomes. PMID:27445992

  16. Computational Identification and Comparative Analysis of Secreted and Transmembrane Proteins in Six Burkholderia Species

    PubMed Central

    Nguyen, Thao Thi; Lee, Hyun-Hee; Park, Jungwook; Park, Inmyoung; Seo, Young-Su

    2017-01-01

    As a step towards discovering novel pathogenesis-related proteins, we performed a genome scale computational identification and characterization of secreted and transmembrane (TM) proteins, which are mainly responsible for bacteria-host interactions and interactions with other bacteria, in the genomes of six representative Burkholderia species. The species comprised plant pathogens (B. glumae BGR1, B. gladioli BSR3), human pathogens (B. pseudomallei K96243, B. cepacia LO6), and plant-growth promoting endophytes (Burkholderia sp. KJ006, B. phytofirmans PsJN). The proportions of putative classically secreted proteins (CSPs) and TM proteins among the species were relatively high, up to approximately 20%. Lower proportions of putative type 3 non-classically secreted proteins (T3NCSPs) (~10%) and unclassified non-classically secreted proteins (NCSPs) (~5%) were observed. The numbers of TM proteins among the three clusters (plant pathogens, human pathogens, and endophytes) were different, while the distribution of these proteins according to the number of TM domains was conserved in which TM proteins possessing 1, 2, 4, or 12 TM domains were the dominant groups in all species. In addition, we observed conservation in the protein size distribution of the secreted protein groups among the species. There were species-specific differences in the functional characteristics of these proteins in the various groups of CSPs, T3NCSPs, and unclassified NCSPs. Furthermore, we assigned the complete sets of the conserved and unique NCSP candidates of the collected Burkholderia species using sequence similarity searching. This study could provide new insights into the relationship among plant-pathogenic, human-pathogenic, and endophytic bacteria. PMID:28381962

  17. Computational Identification and Comparative Analysis of Secreted and Transmembrane Proteins in Six Burkholderia Species.

    PubMed

    Nguyen, Thao Thi; Lee, Hyun-Hee; Park, Jungwook; Park, Inmyoung; Seo, Young-Su

    2017-04-01

    As a step towards discovering novel pathogenesis-related proteins, we performed a genome scale computational identification and characterization of secreted and transmembrane (TM) proteins, which are mainly responsible for bacteria-host interactions and interactions with other bacteria, in the genomes of six representative Burkholderia species. The species comprised plant pathogens (B. glumae BGR1, B. gladioli BSR3), human pathogens (B. pseudomallei K96243, B. cepacia LO6), and plant-growth promoting endophytes (Burkholderia sp. KJ006, B. phytofirmans PsJN). The proportions of putative classically secreted proteins (CSPs) and TM proteins among the species were relatively high, up to approximately 20%. Lower proportions of putative type 3 non-classically secreted proteins (T3NCSPs) (~10%) and unclassified non-classically secreted proteins (NCSPs) (~5%) were observed. The numbers of TM proteins among the three clusters (plant pathogens, human pathogens, and endophytes) were different, while the distribution of these proteins according to the number of TM domains was conserved in which TM proteins possessing 1, 2, 4, or 12 TM domains were the dominant groups in all species. In addition, we observed conservation in the protein size distribution of the secreted protein groups among the species. There were species-specific differences in the functional characteristics of these proteins in the various groups of CSPs, T3NCSPs, and unclassified NCSPs. Furthermore, we assigned the complete sets of the conserved and unique NCSP candidates of the collected Burkholderia species using sequence similarity searching. This study could provide new insights into the relationship among plant-pathogenic, human-pathogenic, and endophytic bacteria.

  18. Low-Bandwidth and Non-Compute Intensive Remote Identification of Microbes from Raw Sequencing Reads

    PubMed Central

    Gautier, Laurent; Lund, Ole

    2013-01-01

    Cheap DNA sequencing may soon become routine not only for human genomes but also for practically anything requiring the identification of living organisms from their DNA: tracking of infectious agents, control of food products, bioreactors, or environmental samples. We propose a novel general approach to the analysis of sequencing data where a reference genome does not have to be specified. Using a distributed architecture we are able to query a remote server for hints about what the reference might be, transferring a relatively small amount of data. Our system consists of a server with known reference DNA indexed, and a client with raw sequencing reads. The client sends a sample of unidentified reads, and in return receives a list of matching references. Sequences for the references can be retrieved and used for exhaustive computation on the reads, such as alignment. To demonstrate this approach we have implemented a web server, indexing tens of thousands of publicly available genomes and genomic regions from various organisms and returning lists of matching hits from query sequencing reads. We have also implemented two clients: one running in a web browser, and one as a python script. Both are able to handle a large number of sequencing reads, run on portable devices (the browser-based client on a tablet), perform their task within seconds, and consume an amount of bandwidth compatible with mobile broadband networks. Such client-server approaches could develop in the future, allowing a fully automated processing of sequencing data and routine instant quality check of sequencing runs from desktop sequencers. Web access is available at http://tapir.cbs.dtu.dk. The source code for a python command-line client, a server, and supplementary data are available at http://bit.ly/1aURxkc. PMID:24391826
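
    The client-server idea can be sketched as follows: the server keeps a k-mer index of known references, the client sends only a small sample of its reads, and the server ranks references by how many sampled reads share a k-mer with them. The k-mer size, sampling strategy, scoring, and reference sequences are illustrative assumptions, not the published implementation.

```python
import random
from collections import Counter

K = 8  # toy k-mer size; real tools use larger k

def kmers(seq):
    return {seq[i:i + K] for i in range(len(seq) - K + 1)}

# "Server" side: index the reference sequences once.
REFERENCES = {
    "ref_A": "ACGT" * 30,
    "ref_B": "TTGACCA" * 18,
}
INDEX = {name: kmers(seq) for name, seq in REFERENCES.items()}

def identify(sampled_reads, min_hits=1):
    """Rank references by the number of sampled reads sharing a k-mer."""
    hits = Counter()
    for read in sampled_reads:
        rk = kmers(read)
        for name, ref_kmers in INDEX.items():
            if rk & ref_kmers:
                hits[name] += 1
    return [name for name, n in hits.most_common() if n >= min_hits]

# "Client" side: sample a handful of reads instead of sending everything.
reads = ["ACGTACGTACGT"] * 50 + ["GGGGCCCC"] * 5
sample = random.sample(reads, 10)
print(identify(sample))  # the dominant organism surfaces from a tiny sample
```

    Because only the sampled reads cross the network, the bandwidth cost is independent of the full run size, which is the point of the paper's design.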

  19. Computational identification and structural analysis of deleterious functional SNPs in MLL gene causing acute leukemia.

    PubMed

    George Priya Doss, C; Rajasekaran, R; Sethumadhavan, Rao

    2010-09-01

    A promising application of the huge amounts of data from the Human Genome Project currently available offers new opportunities for identifying the genetic predisposition and developing a better understanding of complex diseases such as cancers. The main focus of cancer genetics is the study of mutations that are causally implicated in tumorigenesis. The identification of such causal mutations does not only provide insight into cancer biology but also presents anticancer therapeutic targets and diagnostic markers. In this study, we evaluated the Single Nucleotide Polymorphisms (SNPs) that can alter the expression and the function in the MLL gene through computational methods. We applied an evolutionary perspective to screen the SNPs using the sequence homology-based SIFT tool, which suggested that 10 non-synonymous SNPs (nsSNPs) (50%) were deleterious. The structure-based approach of the PolyPhen server suggested that 5 nsSNPs (25%) may disrupt protein function and structure. The PupaSuite tool predicted the phenotypic effect of SNPs on the structure and function of the affected protein. Structural analysis was carried out for the major mutations in the native protein coded by the MLL gene, at amino acid positions Q1198P and K1203Q. The solvent accessibility results showed that 7 residues changed from an exposed state in the native-type protein to a buried state in the Q1198P mutant protein and remained unchanged in the case of K1203Q. From the overall results obtained, the nsSNP with id rs1784246 at amino acid position Q1198P could be considered a deleterious mutation in the acute leukemia caused by the MLL gene.

  20. Enhanced fault-tolerant quantum computing in d-level systems.

    PubMed

    Campbell, Earl T

    2014-12-05

    Error-correcting codes protect quantum information and form the basis of fault-tolerant quantum computing. Leading proposals for fault-tolerant quantum computation require codes with an exceedingly rare property, a transversal non-Clifford gate. Codes with the desired property are presented for d-level qudit systems with prime d. The codes use n=d-1 qudits and can detect up to ∼d/3 errors. We quantify the performance of these codes for one approach to quantum computation known as magic-state distillation. Unlike prior work, we find performance is always enhanced by increasing d.

  1. Hydration level is an internal variable for computing motivation to obtain water rewards in monkeys.

    PubMed

    Minamimoto, Takafumi; Yamada, Hiroshi; Hori, Yukiko; Suhara, Tetsuya

    2012-05-01

    In the process of motivation to engage in a behavior, valuation of the expected outcome comprises not only external variables (i.e., incentives) but also internal variables (i.e., drive). However, the exact neural mechanism that integrates these variables for the computation of motivational value remains unclear. Moreover, the signal of physiological needs, which serves as the primary internal variable for this computation, remains to be identified. Concerning fluid rewards, the osmolality level, one of the physiological indices of the level of thirst, may be an internal variable for valuation, since an increase in the osmolality level induces drinking behavior. Here, to examine the relationship between osmolality and the motivational value of a water reward, we repeatedly measured the blood osmolality level while 2 monkeys continuously performed an instrumental task until they spontaneously stopped. We found that, as the total amount of water earned increased, the osmolality level progressively decreased (i.e., the hydration level increased) in an individual-dependent manner. There was a significant negative correlation between the error rate of the task (the proportion of trials with low motivation) and the osmolality level. We also found that the increase in the error rate with reward accumulation can be well explained by a formula describing the changes in the osmolality level. These results provide a biologically supported computational formula for the motivational value of a water reward that depends on the hydration level, enabling us to identify the neural mechanism that integrates internal and external variables.

  2. Level sequence and splitting identification of closely spaced energy levels by angle-resolved analysis of fluorescence light

    NASA Astrophysics Data System (ADS)

    Wu, Z. W.; Volotka, A. V.; Surzhykov, A.; Dong, C. Z.; Fritzsche, S.

    2016-06-01

    The angular distribution and linear polarization of the fluorescence light following resonant photoexcitation is investigated within the framework of density matrix and second-order perturbation theory. Emphasis has been placed on "signatures" for determining the level sequence and splitting of intermediate (partially) overlapping resonances, if analyzed as a function of the photon energy of the incident light. Detailed computations within the multiconfiguration Dirac-Fock method have been performed, especially for the 1s^2 2s^2 2p^6 3s, J_i = 1/2 + γ1 → (1s^2 2s 2p^6 3s)_1 3p_{3/2}, J = 1/2, 3/2 → 1s^2 2s^2 2p^6 3s, J_f = 1/2 + γ2 photoexcitation and subsequent fluorescence emission of atomic sodium. A remarkably strong dependence of the angular distribution and linear polarization of the γ2 fluorescence emission is found upon the level sequence and splitting of the intermediate (1s^2 2s 2p^6 3s)_1 3p_{3/2}, J = 1/2, 3/2 overlapping resonances owing to their finite lifetime (linewidth). We therefore suggest that accurate measurements of the angular distribution and linear polarization might help identify the sequence and small splittings of closely spaced energy levels, even if they cannot be spectroscopically resolved.

  3. [Multi-level identification and analysis about infrared spectroscopy of lophatheri herba].

    PubMed

    Shao, Ying; Wu, Qi-Nan; Gu, Wei; Yue, Wei; Wu, Da-Wei; Fan, Xiu-He

    2014-05-01

    Based on the infrared spectra of Lophatheri Herba and Commelinae Herba, one-dimensional infrared spectra, second-derivative spectra and two-dimensional correlation spectra were used to find the differences between Lophatheri Herba and its imitations. The common peak ratio and variant peak ratio dual-index sequences were calculated and established from the infrared spectra of eleven batches of herbs. Cluster analysis was applied to the infrared spectral data of Lophatheri Herba to explore the similarity between samples. The grouping trends of the dual-index sequence analysis and the cluster analysis were consistent. The results showed that the differences could be detected by multi-level identification, and that the source and quality of the herbs could be effectively distinguished by the two analysis methods. Infrared spectroscopy as used in the present work has the advantages of quick procedures, small sample requirements and reliable results, and could provide a new method for distinguishing traditional Chinese medicines from imitations and adulterants and for controlling quality and origin.
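
    The dual-index calculation can be sketched as peak-list matching: peaks from two spectra that fall within a tolerance are "common", the rest are "variant". The tolerance value and the toy peak positions (in cm^-1) below are assumptions; the paper's exact matching rule is not reproduced.

```python
def dual_index(peaks_a, peaks_b, tol=5.0):
    """Return (common peak ratio, variant peak ratio) as percentages,
    relative to the union of peaks from both spectra."""
    common = sum(1 for p in peaks_a
                 if any(abs(p - q) <= tol for q in peaks_b))
    total = len(peaks_a) + len(peaks_b) - common  # size of the peak union
    common_ratio = 100.0 * common / total
    return common_ratio, 100.0 - common_ratio

herb = [3400.0, 2920.0, 1650.0, 1430.0, 1050.0]       # reference batch
imitation = [3400.0, 2918.0, 1735.0, 1430.0, 1050.0]  # suspect batch
common, variant = dual_index(herb, imitation)
print(round(common, 1), round(variant, 1))  # -> 66.7 33.3
```

    Ordering batches by these two ratios gives the dual-index sequence that the abstract compares against the cluster-analysis grouping.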

  4. Analysis of Osteopontin Levels for the Identification of Asymptomatic Patients with Calcific Aortic Valve Disease

    PubMed Central

    Grau, Juan B.; Poggio, Paolo; Sainger, Rachana; Vernick, William J.; Seefried, William F.; Branchetti, Emanuela; Field, Benjamin C.; Bavaria, Joseph E.; Acker, Michael A.; Ferrari, Giovanni

    2012-01-01

    Background Calcific Aortic Valve Disease (CAVD) is the most common etiology of acquired valve disease. Initial phases of CAVD include thickening of the cusps, whereas advanced stages are associated with biomineralization and reduction of the aortic valve area. These conditions are known as Aortic Valve Sclerosis (AVSc) and Aortic Valve Stenosis (AVS), respectively. Due to its asymptomatic presentation, little is known about the molecular determinants of AVSc. The aim of this study is to correlate plasma and tissue Osteopontin (OPN) levels with echocardiographic evaluation for the identification of asymptomatic patients at risk of CAVD. In addition, we aim to analyze the differential expression and biological function of OPN splicing variants as biomarkers of early and late stages of CAVD. Methods From Jan 2010 to Feb 2011, 254 patients were enrolled in the study. Subjects were divided in three groups based on transesophageal echocardiographic (TEE) evaluation: Controls (56 subjects), AVSc (90), and AVS (164). Plasma and tissue OPN levels were measured by IHC, ELISA and qPCR. Results Patients with AVSc and AVS have higher OPN levels compared to controls. OPN levels are elevated in asymptomatic AVSc patients with no appearance of calcification during TEE evaluation. Osteopontin splicing variants -a, -b, and -c are differentially expressed during CAVD progression and are able to inhibit biomineralization in a cell-based biomineralization assay. Conclusions The analysis of the differential expression of OPN splicing variants during CAVD may help in developing diagnostic and risk stratification tools to follow the progression of asymptomatic AV degeneration. PMID:22093695

  5. An integrated transcriptomic and computational analysis for biomarker identification in human glioma.

    PubMed

    Xing, Wenli; Zeng, Chun

    2016-06-01

    Malignant glioma is one of the most common primary brain tumors and is among the deadliest of human cancers. The molecular mechanism of human glioma is poorly understood, and early prognosis and early treatment of this disease are vital. Thus, it is crucial to target the key genes controlling pathogenesis in the early stage of glioma. In this study, differentially expressed genes in human glioma and paired peritumoral tissues were detected by transcriptome microarray analysis. Following microarray analysis, the gene expression profiles in gliomas of different grades were further validated by bioinformatic analyses and co-expression network construction. Microarray analysis revealed that 1725 genes were differentially expressed and could be classified by glioma stage. The analysis revealed 14 genes significantly associated with survival under false discovery rate control. Among these genes, macrophage capping protein (CAPG), a member of the actin-regulatory protein family, was the key gene in a 20-gene network that modulates cell motility by interacting with the cytoskeleton. Furthermore, the prognostic impact of CAPG was validated using quantitative real-time polymerase chain reaction (qPCR) and immunohistochemistry on human glioma tissue. CAPG protein was significantly upregulated in clinical high-grade glioblastoma compared with normal brain tissues. Elevated CAPG levels also predicted shorter overall survival of glioma patients. These data demonstrate that CAPG expression in human glioma is associated with tumorigenesis and may be a biomarker for identification of the pathological grade of glioma.

  6. User-microprogrammable, local host computer with low-level parallelism

    SciTech Connect

    Tomita, S.; Shibayama, K.; Kitamura, T.; Nakata, T.; Hagiwara, H.

    1983-01-01

    This paper describes the architecture of a dynamically microprogrammable computer with low-level parallelism, called QA-2, which is designed as a high-performance, local host computer for laboratory use. The architectural principle of the QA-2 is the marriage of high-speed, parallel processing capability offered by four powerful arithmetic and logic units (ALUs) with architectural flexibility provided by large-scale, dynamic user-microprogramming. By changing its writable control storage dynamically, the QA-2 can be tailored to a wide spectrum of research-oriented applications covering high-level language processing and real-time processing. 11 references.

  7. Studying channelopathies at the functional level using a system identification approach

    NASA Astrophysics Data System (ADS)

    Faisal, A. Aldo

    2007-09-01

    The electrical activity of our brain's neurons is controlled by voltage-gated ion channels. Mutations in these ion channels have recently been associated with clinical conditions, so-called channelopathies. The involved ion channels have been well characterised at the molecular and biophysical level. However, the impact of these mutations on neuron function has so far been studied only in rudimentary fashion. It remains unclear how operation and performance (in terms of input-output characteristics and reliability) are affected. Here, I show how system identification techniques provide neuronal performance measures that make it possible to quantitatively assess the impact of channelopathies by comparing whole-cell input-output relationships. I illustrate the feasibility of this approach by comparing the effects on neuronal signalling of two human sodium channel mutations (NaV 1.1 W1204R, R1648H), linked to generalized epilepsy with febrile seizures, to the wild-type NaV 1.1 channel.

  8. Evaluation of the Wider System, a New Computer-Assisted Image-Processing Device for Bacterial Identification and Susceptibility Testing

    PubMed Central

    Cantón, Rafael; Pérez-Vázquez, María; Oliver, Antonio; Sánchez Del Saz, Begoña; Gutiérrez, M. Olga; Martínez-Ferrer, Manuel; Baquero, Fernando

    2000-01-01

    The Wider system is a newly developed computer-assisted image-processing device for both bacterial identification and antimicrobial susceptibility testing. It has been adapted to be able to read and interpret commercial MicroScan panels. Two hundred forty-four fresh consecutive clinical isolates (138 isolates of the family Enterobacteriaceae, 25 nonfermentative gram-negative rods [NFGNRs], and 81 gram-positive cocci) were tested. In addition, 100 enterobacterial strains with known β-lactam resistance mechanisms (22 strains with chromosomal AmpC β-lactamase, 8 strains with chromosomal class A β-lactamase, 21 broad-spectrum and IRT β-lactamase-producing strains, 41 extended-spectrum β-lactamase-producing strains, and 8 permeability mutants) were tested. API galleries and National Committee for Clinical Laboratory Standards (NCCLS) microdilution methods were used as reference methods. The Wider system correctly identified 97.5% of the clinical isolates at the species level. Overall essential agreement (±1 log2 dilution for 3,719 organism-antimicrobial drug combinations) was 95.6% (isolates of the family Enterobacteriaceae, 96.6%; NFGNRs, 88.0%; gram-positive cocci, 95.6%). The lowest essential agreement was observed with Enterobacteriaceae versus imipenem (84.0%), NFGNR versus piperacillin (88.0%) and cefepime (88.0%), and gram-positive isolates versus penicillin (80.4%). The category error rate (NCCLS criteria) was 4.2% (2.0% very major errors, 0.6% major errors, and 1.5% minor errors). Essential agreement and interpretive error rates for eight β-lactam antibiotics against isolates of the family Enterobacteriaceae with known β-lactam resistance mechanisms were 94.8 and 5.4%, respectively. Interestingly, the very major error rate was only 0.8%. Minor errors (3.6%) were mainly observed with amoxicillin-clavulanate and cefepime against extended-spectrum β-lactamase-producing isolates. The Wider system is a new reliable tool which applies the image
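
    The headline "essential agreement" figures above follow a standard definition: a test result agrees with the reference method if its MIC lies within plus or minus one two-fold (log2) dilution. A minimal sketch of that computation (the function name and example MIC values are mine, not from the paper):

    ```python
    from math import log2

    def essential_agreement(test_mics, ref_mics):
        """Fraction of organism-drug results whose test MIC is within
        +/- one two-fold (log2) dilution of the reference MIC."""
        assert len(test_mics) == len(ref_mics)
        within = sum(
            abs(log2(t) - log2(r)) <= 1.0
            for t, r in zip(test_mics, ref_mics)
        )
        return within / len(test_mics)

    # MICs in mg/L: three of four results agree within one dilution
    print(essential_agreement([0.5, 1.0, 8.0, 2.0], [0.5, 2.0, 2.0, 1.0]))  # → 0.75
    ```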

  9. [Tri-Level Infrared Spectroscopic Identification of Hot Melting Reflective Road Marking Paint].

    PubMed

    Li, Hao; Ma, Fang; Sun, Su-qin

    2015-12-01

    In order to detect road marking paint from trace evidence at traffic accident scenes, and to differentiate between brands, we used tri-level infrared spectroscopic identification, which combines Fourier transform infrared spectroscopy (FTIR), second-derivative infrared spectroscopy (SD-IR), and two-dimensional correlation infrared spectroscopy (2D-IR), to identify three selected domestic brands of hot-melting reflective road marking paint and the raw materials in their formulas. The experimental results show that the ATR and FTIR spectrograms of the three brands of coatings are very similar in shape and differ only in absorption peak wavenumbers: all have wide, strong absorption peaks near 1435 cm⁻¹ and strong absorption peaks near 879, 2955, 2919, and 2870 cm⁻¹. After enlarging partial areas of the spectrograms and comparing them with the spectrograms of each raw material in the formulas, the brands can be distinguished. In the regions 700-970 and 1370-1660 cm⁻¹ the spectrograms mainly reflect the different relative contents of heavy calcium carbonate in the three brands of paint, and in the region 2800-2960 cm⁻¹ those of polyethylene wax (PE wax), ethylene vinyl acetate resin (EVA), and dioctyl phthalate (DOP). The SD-IR spectra not only verify the results of the FTIR analysis but also magnify the microscopic differences, reflecting the different relative contents of quartz sand in the 512-799 cm⁻¹ region. Within the 1351-1525 cm⁻¹ range, the 2D-IR spectra show more significant differences in the positions and numbers of auto-peaks. Therefore, tri-level infrared spectroscopic identification, with its stepwise improvement in apparent resolution, is a fast and effective method for distinguishing hot-melting road marking paints.

  10. DNA Barcoding for Efficient Species- and Pathovar-Level Identification of the Quarantine Plant Pathogen Xanthomonas

    PubMed Central

    Tian, Qian; Zhao, Wenjun; Lu, Songyu; Zhu, Shuifang; Li, Shidong

    2016-01-01

    Genus Xanthomonas comprises many economically important plant pathogens that affect a wide range of hosts. Indeed, fourteen Xanthomonas species/pathovars have been regarded as official quarantine bacteria for imports in China. To date, however, a rapid and accurate method capable of identifying all of the quarantine species/pathovars has yet to be developed. In this study, we therefore evaluated the capacity of DNA barcoding as a digital identification method for discriminating quarantine species/pathovars of Xanthomonas. For these analyses, 327 isolates, representing 45 Xanthomonas species/pathovars, as well as five additional species/pathovars from GenBank (50 species/pathovars total), were utilized to test the efficacy of four DNA barcode candidate genes (16S rRNA gene, cpn60, gyrB, and avrBs2). Of these candidate genes, cpn60 displayed the highest rate of PCR amplification and sequencing success. The tree-building (Neighbor-joining), ‘best close match’, and barcode gap methods were subsequently employed to assess the species- and pathovar-level resolution of each gene. Notably, all isolates of each quarantine species/pathovars formed a monophyletic group in the neighbor-joining tree constructed using the cpn60 sequences. Moreover, cpn60 also demonstrated the most satisfactory results in both barcoding gap analysis and the ‘best close match’ test. Thus, compared with the other markers tested, cpn60 proved to be a powerful DNA barcode, providing a reliable and effective means for the species- and pathovar-level identification of the quarantine plant pathogen Xanthomonas. PMID:27861494
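
    The "best close match" test mentioned above assigns a query sequence to the species of its nearest reference, but only when that distance falls below a threshold. A simplified sketch (the p-distance metric, threshold, and toy sequences are illustrative, not the study's calibrated values):

    ```python
    def p_distance(a, b):
        """Proportion of differing sites between two aligned sequences."""
        assert len(a) == len(b)
        return sum(x != y for x, y in zip(a, b)) / len(a)

    def best_close_match(query, references, threshold=0.02):
        """references: list of (species, sequence) pairs. Returns the species
        of the closest reference, or None if even the best match exceeds
        the distance threshold."""
        species, dist = min(
            ((sp, p_distance(query, seq)) for sp, seq in references),
            key=lambda t: t[1],
        )
        return species if dist <= threshold else None

    refs = [("X. oryzae", "ATGGCTAAGT"), ("X. campestris", "ATGGTTACGT")]
    print(best_close_match("ATGGCTAAGT", refs))        # → X. oryzae
    print(best_close_match("CCCCCCCCCC", refs, 0.02))  # → None
    ```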

  11. Computational identification of CDR3 sequence archetypes among immunoglobulin sequences in chronic lymphocytic leukemia.

    PubMed

    Messmer, Bradley T; Raphael, Benjamin J; Aerni, Sarah J; Widhopf, George F; Rassenti, Laura Z; Gribben, John G; Kay, Neil E; Kipps, Thomas J

    2009-03-01

    The leukemia cells of unrelated patients with chronic lymphocytic leukemia (CLL) display a restricted repertoire of immunoglobulin (Ig) gene rearrangements with preferential usage of certain Ig gene segments. We developed a computational method to rigorously quantify biases in Ig sequence similarity in large patient databases and to identify groups of patients with unusual levels of sequence similarity. We applied our method to sequences from 1577 CLL patients through the CLL Research Consortium (CRC), and identified 67 similarity groups into which roughly 20% of all patients could be assigned. Immunoglobulin light chain class was highly correlated within all groups and light chain gene usage was similar within sets. Surprisingly, over 40% of the identified groups were composed of somatically mutated genes. This study significantly expands the evidence that antigen selection shapes the Ig repertoire in CLL.
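
    The CRC study's statistical machinery for quantifying similarity bias is not reproduced in the abstract; as a toy illustration of how sequences might be gathered into similarity groups, here is a generic single-linkage sketch over equal-length CDR3 strings (the identity cutoff and example sequences are invented):

    ```python
    def identity(a, b):
        """Fraction of identical positions; 0 if lengths differ."""
        if len(a) != len(b):
            return 0.0
        return sum(x == y for x, y in zip(a, b)) / len(a)

    def similarity_groups(seqs, cutoff=0.8):
        """Single-linkage grouping: a sequence joins a group if it shares
        >= cutoff identity with any member already in that group."""
        groups = []
        for s in seqs:
            for g in groups:
                if any(identity(s, m) >= cutoff for m in g):
                    g.append(s)
                    break
            else:
                groups.append([s])
        return groups

    cdr3s = ["CARDYYGSGSYFDYW", "CARDYYGSGSYFDVW", "CAKGTTVVAPFDYW"]
    print(similarity_groups(cdr3s))  # first two cluster together
    ```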

  12. Computational identification of conserved microRNAs and their targets from expression sequence tags of blueberry (Vaccinium corybosum).

    PubMed

    Li, Xuyan; Hou, Yanming; Zhang, Li; Zhang, Wenhao; Quan, Chen; Cui, Yuhai; Bian, Shaomin

    2014-01-01

    MicroRNAs (miRNAs) are a class of endogenous non-coding RNAs, approximately 21 nt in length, that mediate the expression of target genes primarily at the post-transcriptional level. miRNAs play critical roles in almost all plant cellular and metabolic processes. Although numerous miRNAs have been identified in the plant kingdom, the miRNAs of blueberry, an economically important small fruit crop, remain entirely unknown. In this study, we report a computational identification of miRNAs and their targets in blueberry. Using an EST-based comparative genomics approach, 9 potential vco-miRNAs were discovered from 22,402 blueberry ESTs according to a series of filtering criteria, designated as vco-miR156-5p, vco-miR156-3p, vco-miR1436, vco-miR1522, vco-miR4495, vco-miR5120, vco-miR5658, vco-miR5783, and vco-miR5986. Based on sequence complementarity between each miRNA and its target transcript, 34 target ESTs from blueberry and 70 targets from other species were identified for the vco-miRNAs. The targets were found to be involved in transcription, RNA splicing and binding, DNA duplication, signal transduction, transport and trafficking, stress response, as well as synthesis and metabolic processes. These findings will greatly contribute to future research on the functions and regulatory mechanisms of blueberry miRNAs.
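
    The "series of filtering criteria" typical of EST-based miRNA discovery (mature length near 21 nt, limited mismatches to a known mature miRNA, plausible GC content) can be sketched as follows; the thresholds and the miR156-like reference sequence are illustrative, not the authors' exact criteria:

    ```python
    def gc_content(seq):
        """Fraction of G and C bases in a sequence."""
        return sum(b in "GC" for b in seq.upper()) / len(seq)

    def mismatches(a, b):
        """Mismatched positions plus any length difference."""
        return sum(x != y for x, y in zip(a, b)) + abs(len(a) - len(b))

    def is_candidate(est_mature, known_mature, max_mismatch=3,
                     min_len=19, max_len=23, gc_range=(0.24, 0.71)):
        """Illustrative filters: length near 21 nt, few mismatches to a
        known plant miRNA, and GC content in a plausible range."""
        if not (min_len <= len(est_mature) <= max_len):
            return False
        if mismatches(est_mature, known_mature) > max_mismatch:
            return False
        lo, hi = gc_range
        return lo <= gc_content(est_mature) <= hi

    known = "UGACAGAAGAGAGUGAGCACA"  # a miR156-like sequence (illustrative)
    print(is_candidate("UGACAGAAGAGAGUGAGCACU", known))  # one mismatch → True
    ```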

  13. Computational identification of conserved microRNAs and their targets from expression sequence tags of blueberry (Vaccinium corybosum)

    PubMed Central

    Li, Xuyan; Hou, Yanming; Zhang, Li; Zhang, Wenhao; Quan, Chen; Cui, Yuhai; Bian, Shaomin

    2014-01-01

    MicroRNAs (miRNAs) are a class of endogenous non-coding RNAs, approximately 21 nt in length, that mediate the expression of target genes primarily at the post-transcriptional level. miRNAs play critical roles in almost all plant cellular and metabolic processes. Although numerous miRNAs have been identified in the plant kingdom, the miRNAs of blueberry, an economically important small fruit crop, remain entirely unknown. In this study, we report a computational identification of miRNAs and their targets in blueberry. Using an EST-based comparative genomics approach, 9 potential vco-miRNAs were discovered from 22,402 blueberry ESTs according to a series of filtering criteria, designated as vco-miR156-5p, vco-miR156-3p, vco-miR1436, vco-miR1522, vco-miR4495, vco-miR5120, vco-miR5658, vco-miR5783, and vco-miR5986. Based on sequence complementarity between each miRNA and its target transcript, 34 target ESTs from blueberry and 70 targets from other species were identified for the vco-miRNAs. The targets were found to be involved in transcription, RNA splicing and binding, DNA duplication, signal transduction, transport and trafficking, stress response, as well as synthesis and metabolic processes. These findings will greatly contribute to future research on the functions and regulatory mechanisms of blueberry miRNAs. PMID:25763692

  14. Neuromotor recovery from stroke: computational models at central, functional, and muscle synergy level

    PubMed Central

    Casadio, Maura; Tamagnone, Irene; Summa, Susanna; Sanguineti, Vittorio

    2013-01-01

    Computational models of neuromotor recovery after a stroke might help to unveil the underlying physiological mechanisms and might suggest how to make recovery faster and more effective. At least in principle, these models could serve: (i) To provide testable hypotheses on the nature of recovery; (ii) To predict the recovery of individual patients; (iii) To design patient-specific “optimal” therapy, by setting the treatment variables for maximizing the amount of recovery or for achieving a better generalization of the learned abilities across different tasks. Here we review the state of the art of computational models for neuromotor recovery through exercise, and their implications for treatment. We show that to properly account for the computational mechanisms of neuromotor recovery, multiple levels of description need to be taken into account. The review specifically covers models of recovery at central, functional and muscle synergy level. PMID:23986688

  15. Scalability of a Base Level Design for an On-Board-Computer for Scientific Missions

    NASA Astrophysics Data System (ADS)

    Treudler, Carl Johann; Schroder, Jan-Carsten; Greif, Fabian; Stohlmann, Kai; Aydos, Gokce; Fey, Gorschwin

    2014-08-01

    Facing a wide range of mission requirements and the integration of diverse payloads requires extreme flexibility in the on-board-computing infrastructure for scientific missions. We show that scalability is fundamentally difficult. We address this issue by proposing a base level design and show how adaptation to different needs is achieved. Inter-dependencies between scaling different aspects and their impact on different levels in the design are discussed.

  16. Can Synchronous Computer-Mediated Communication (CMC) Help Beginning-Level Foreign Language Learners Speak?

    ERIC Educational Resources Information Center

    Ko, Chao-Jung

    2012-01-01

    This study investigated the possibility that initial-level learners may acquire oral skills through synchronous computer-mediated communication (SCMC). Twelve Taiwanese French as a foreign language (FFL) students, divided into three groups, were required to conduct a variety of tasks in one of the three learning environments (video/audio, audio,…

  17. Comparing Infants' Preference for Correlated Audiovisual Speech with Signal-Level Computational Models

    ERIC Educational Resources Information Center

    Hollich, George; Prince, Christopher G.

    2009-01-01

    How much of infant behaviour can be accounted for by signal-level analyses of stimuli? The current paper directly compares the moment-by-moment behaviour of 8-month-old infants in an audiovisual preferential looking task with that of several computational models that use the same video stimuli as presented to the infants. One type of model…

  18. BICYCLE II: a computer code for calculating levelized life-cycle costs

    SciTech Connect

    Hardie, R.W.

    1981-11-01

    This report describes the BICYCLE computer code. BICYCLE was specifically designed to calculate levelized life-cycle costs for plants that produce electricity, heat, gaseous fuels, or liquid fuels. Included are (1) derivations of the equations used by BICYCLE, (2) input instructions, (3) sample case input, and (4) sample case output.
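
    BICYCLE's equations are not reproduced in this record, but a levelized life-cycle cost generally takes the form "present value of all costs divided by present value of all output." A minimal sketch under that standard definition (the discounting convention and the toy plant numbers are mine, not from the report):

    ```python
    def levelized_cost(costs, outputs, discount_rate):
        """Levelized life-cycle cost: present value of all costs divided by
        present value of all output (e.g. kWh), i.e. cost per unit output.
        costs[t] and outputs[t] are the totals for year t (t = 0, 1, ...)."""
        pv = lambda xs: sum(x / (1 + discount_rate) ** t for t, x in enumerate(xs))
        return pv(costs) / pv(outputs)

    # Hypothetical plant: $1000 capital in year 0, then $100/yr O&M and 500 kWh/yr
    costs = [1000, 100, 100, 100]
    energy = [0, 500, 500, 500]
    print(round(levelized_cost(costs, energy, 0.05), 4))  # ≈ 0.93 $/kWh
    ```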

  19. A Simple Gauss-Newton Procedure for Covariance Structure Analysis with High-Level Computer Languages.

    ERIC Educational Resources Information Center

    Cudeck, Robert; And Others

    1993-01-01

    An implementation of the Gauss-Newton algorithm for the analysis of covariance structure that is specifically adapted for high-level computer languages is reviewed. This simple method for estimating structural equation models is useful for a variety of standard models, as is illustrated. (SLD)

  20. Learning Density in Vanuatu High School with Computer Simulation: Influence of Different Levels of Guidance

    ERIC Educational Resources Information Center

    Moli, Lemuel; Delserieys, Alice Pedregosa; Impedovo, Maria Antonietta; Castera, Jeremy

    2017-01-01

    This paper presents a study on discovery learning of scientific concepts with the support of computer simulation. In particular, the paper will focus on the effect of the levels of guidance on students with a low degree of experience in informatics and educational technology. The first stage of this study was to identify the common misconceptions…

  1. Can Synchronous Computer-Mediated Communication (CMC) Help Beginning-Level Foreign Language Learners Speak?

    ERIC Educational Resources Information Center

    Ko, Chao-Jung

    2012-01-01

    This study investigated the possibility that initial-level learners may acquire oral skills through synchronous computer-mediated communication (SCMC). Twelve Taiwanese French as a foreign language (FFL) students, divided into three groups, were required to conduct a variety of tasks in one of the three learning environments (video/audio, audio,…

  2. BICYCLE: a computer code for calculating levelized life-cycle costs

    SciTech Connect

    Hardie, R.W.

    1980-08-01

    This report serves as a user's manual for the BICYCLE computer code. BICYCLE was specifically designed to calculate levelized life-cycle costs for plants that produce electricity, heat, gaseous fuels, or liquid fuels. Included in this report are (1) derivations of the equations used by BICYCLE, (2) input instructions, (3) sample case input, and (4) sample case output.

  3. A Simple Gauss-Newton Procedure for Covariance Structure Analysis with High-Level Computer Languages.

    ERIC Educational Resources Information Center

    Cudeck, Robert; And Others

    1993-01-01

    An implementation of the Gauss-Newton algorithm for the analysis of covariance structure that is specifically adapted for high-level computer languages is reviewed. This simple method for estimating structural equation models is useful for a variety of standard models, as is illustrated. (SLD)

  4. The Relationship between Internet and Computer Game Addiction Level and Shyness among High School Students

    ERIC Educational Resources Information Center

    Ayas, Tuncay

    2012-01-01

    This study is conducted to determine the relationship between the internet and computer games addiction level and the shyness among high school students. The participants of the study consist of 365 students attending high schools in Giresun city centre during 2009-2010 academic year. As a result of the study a positive, meaningful, and high…

  5. E-Predict: a computational strategy for species identification based on observed DNA microarray hybridization patterns.

    PubMed

    Urisman, Anatoly; Fischer, Kael F; Chiu, Charles Y; Kistler, Amy L; Beck, Shoshannah; Wang, David; DeRisi, Joseph L

    2005-01-01

    DNA microarrays may be used to identify microbial species present in environmental and clinical samples. However, automated tools for reliable species identification based on observed microarray hybridization patterns are lacking. We present an algorithm, E-Predict, for microarray-based species identification. E-Predict compares observed hybridization patterns with theoretical energy profiles representing different species. We demonstrate the application of the algorithm to viral detection in a set of clinical samples and discuss its relevance to other metagenomic applications.
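
    E-Predict's actual scoring compares observed intensities against theoretical hybridization energy profiles and attaches significance estimates; as a simplified stand-in for that matching step, the sketch below assigns an observed intensity vector to the best-correlated species profile (the species names and numbers are invented):

    ```python
    from math import sqrt

    def pearson(x, y):
        """Pearson correlation between two equal-length vectors."""
        n = len(x)
        mx, my = sum(x) / n, sum(y) / n
        cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
        sx = sqrt(sum((a - mx) ** 2 for a in x))
        sy = sqrt(sum((b - my) ** 2 for b in y))
        return cov / (sx * sy)

    def predict_species(observed, profiles):
        """profiles: dict mapping species -> theoretical intensity profile
        (same probe order as `observed`). Returns the best-correlated species."""
        return max(profiles, key=lambda sp: pearson(observed, profiles[sp]))

    profiles = {
        "rhinovirus":  [9.0, 1.0, 8.0, 0.5],
        "coronavirus": [1.0, 9.0, 0.5, 8.0],
    }
    print(predict_species([8.5, 0.8, 7.2, 1.0], profiles))  # → rhinovirus
    ```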

  6. Cluster chemical ionization for improved confidence level in sample identification by gas chromatography/mass spectrometry.

    PubMed

    Fialkov, Alexander B; Amirav, Aviv

    2003-01-01

    Upon the supersonic expansion of helium mixed with vapor from an organic solvent (e.g. methanol), various clusters of the solvent with the sample molecules can be formed. As a result of 70 eV electron ionization of these clusters, cluster chemical ionization (cluster CI) mass spectra are obtained. These spectra are characterized by the combination of EI mass spectra of vibrationally cold molecules in the supersonic molecular beam (cold EI) with CI-like appearance of abundant protonated molecules, together with satellite peaks corresponding to protonated or non-protonated clusters of sample compounds with 1-3 solvent molecules. Like CI, cluster CI preferably occurs for polar compounds with high proton affinity. However, in contrast to conventional CI, for non-polar compounds or those with reduced proton affinity the cluster CI mass spectrum converges to that of cold EI. The appearance of a protonated molecule and its solvent cluster peaks, plus the lack of protonation and cluster satellites for prominent EI fragments, enable the unambiguous identification of the molecular ion. In turn, the insertion of the proper molecular ion into the NIST library search of the cold EI mass spectra eliminates those candidates with incorrect molecular mass and thus significantly increases the confidence level in sample identification. Furthermore, molecular mass identification is of prime importance for the analysis of unknown compounds that are absent in the library. Examples are given with emphasis on the cluster CI analysis of carbamate pesticides, high explosives and unknown samples, to demonstrate the usefulness of Supersonic GC/MS (GC/MS with supersonic molecular beam) in the analysis of these thermally labile compounds. Cluster CI is shown to be a practical ionization method, due to its ease-of-use and fast instrumental conversion between EI and cluster CI, which involves the opening of only one valve located at the make-up gas path. The ease-of-use of cluster CI is analogous

  7. Assessment of Social Vulnerability Identification at Local Level around Merapi Volcano - A Self Organizing Map Approach

    NASA Astrophysics Data System (ADS)

    Lee, S.; Maharani, Y. N.; Ki, S. J.

    2015-12-01

    The application of the Self-Organizing Map (SOM) to analyze social vulnerability and recognize the resilience within sites is a challenging task. The aim of this study is to propose a computational method to identify sites according to their similarity and to determine the most relevant variables characterizing social vulnerability in each cluster. For this purpose, SOM is considered an effective platform for analysis of high-dimensional data. By considering the cluster structure, the social vulnerability characteristics of the identified sites can be fully understood. In this study, the social vulnerability variable is constructed from 17 variables: 12 independent variables representing socio-economic concepts and 5 dependent variables representing the damage and losses due to the Merapi eruption in 2010. These variables collectively represent the local situation of the study area, based on fieldwork conducted in September 2013. By using both independent and dependent variables, we can identify whether the social vulnerability is reflected in the actual situation, in this case the 2010 Merapi eruption. However, social vulnerability analysis in local communities involves a number of variables representing socio-economic condition, and some of the variables employed in this study might be more or less redundant. Therefore, SOM is used to reduce redundant variables by selecting representative variables, using the component planes and the correlation coefficients between variables, in order to find an effective sample size. The selected dataset was then clustered according to similarity. Finally, this approach can produce reliable estimates of clustering, recognize the most significant variables, and could be useful for social vulnerability assessment, especially for stakeholders as decision makers. This research was supported by a grant 'Development of Advanced Volcanic Disaster Response System considering
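
    A minimal 1-D SOM illustrating the clustering step can be written in pure Python; the grid size, decay schedule, and toy "site" profiles below are invented for illustration (real analyses typically use a 2-D grid and a dedicated library):

    ```python
    import math

    def bmu_index(nodes, x):
        """Index of the node whose weight vector is closest to sample x."""
        return min(range(len(nodes)),
                   key=lambda i: sum((w - v) ** 2 for w, v in zip(nodes[i], x)))

    def train_som(data, n_nodes=4, epochs=200, lr0=0.5):
        """Train a 1-D self-organizing map. Nodes start spread along the
        diagonal; each sample pulls its best-matching node (and, more weakly,
        that node's grid neighbours) toward itself, with learning rate and
        neighbourhood radius decaying over training."""
        dim = len(data[0])
        nodes = [[i / (n_nodes - 1)] * dim for i in range(n_nodes)]
        steps = epochs * len(data)
        t = 0
        for _ in range(epochs):
            for x in data:
                frac = 1 - t / steps
                lr = lr0 * frac
                radius = max(0.5, (n_nodes / 2.0) * frac)
                bmu = bmu_index(nodes, x)
                for i in range(n_nodes):
                    h = math.exp(-((i - bmu) ** 2) / (2 * radius ** 2))
                    nodes[i] = [w + lr * h * (v - w) for w, v in zip(nodes[i], x)]
                t += 1
        return nodes

    # Two well-separated socio-economic profiles should map to different nodes
    data = [[0.1, 0.2], [0.15, 0.25], [0.9, 0.8], [0.85, 0.9]]
    nodes = train_som(data)
    print(bmu_index(nodes, [0.1, 0.2]) != bmu_index(nodes, [0.9, 0.8]))
    ```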

  8. Identification of Student Misconceptions in Genetics Problem Solving via Computer Program.

    ERIC Educational Resources Information Center

    Browning, Mark E.; Lehman, James D.

    1988-01-01

    Describes a computer program presenting four genetics problems to monitor the problem solving process of college students. Identifies three main areas of difficulty: computational skills; determination of gametes; and application of previous learning to new situations. (Author/YP)

  9. Identification of "Streptococcus milleri" group isolates to the species level with a commercially available rapid test system.

    PubMed

    Flynn, C E; Ruoff, K L

    1995-10-01

    Clinical isolates of the "Streptococcus milleri" species group were examined by conventional methods and a rapid, commercially available method for the identification of these strains to the species level. The levels of agreement between the identifications obtained with the commercially available system (Fluo-Card Milleri; KEY Scientific, Round Rock, Tex.) and conventional methods were 98% for 50 Streptococcus anginosus strains, 97% for 31 Streptococcus constellatus strains, and 88% for 17 isolates identified as Streptococcus intermedius. Patient records were also studied in order to gain information on the frequency and sites of isolation of each of the three "S. milleri" group species.

  10. The Development of a Computer-Directed Training Subsystem and Computer Operator Training Material for the Air Force Phase II Base Level System. Final Report.

    ERIC Educational Resources Information Center

    System Development Corp., Santa Monica, CA.

    The design, development, and evaluation of an integrated Computer-Directed Training Subsystem (CDTS) for the Air Force Phase II Base Level System is described in this report. The development and evaluation of a course to train computer operators of the Air Force Phase II Base Level System under CDTS control is also described. Detailed test results…

  11. Identification of histamine receptors and reduction of squalene levels by an antihistamine in sebocytes.

    PubMed

    Pelle, Edward; McCarthy, James; Seltmann, Holger; Huang, Xi; Mammone, Thomas; Zouboulis, Christos C; Maes, Daniel

    2008-05-01

    Overproduction of sebum, especially during adolescence, is causally related to acne and inflammation. As a way to reduce sebum and its interference with the process of follicular keratinization in the pilosebaceous unit leading to inflammatory acne lesions, antihistamines were investigated for their effect on sebocytes, the major cell of the sebaceous gland responsible for producing sebum. Reverse transcriptase-PCR analysis and immunofluorescence of an immortalized sebocyte cell line (SZ95) revealed the presence of histamine-1 receptor (H-1 receptor), and thus indicated that histamines and, conversely, antihistamines could potentially modulate sebocyte function directly. When sebocytes were incubated with an H-1 receptor antagonist, diphenhydramine (DPH), at non-cytotoxic doses, a significant decrease in squalene levels, a biomarker for sebum, was observed. As determined by high-performance liquid chromatography, untreated sebocytes contained 6.27 (±0.73) nmol squalene per 10⁶ cells, whereas for DPH-treated cells, the levels were 2.37 (±0.24) and 2.03 (±0.97) nmol squalene per 10⁶ cells at 50 and 100 microM, respectively. These data were further substantiated by the identification of histamine receptors in human sebaceous glands. In conclusion, our data show the presence of histamine receptors on sebocytes, demonstrate how an antagonist to these receptors modulated cellular function, and may indicate a new paradigm for acne therapy involving an H-1 receptor-mediated pathway.

  12. Computational identification of altered metabolism using gene expression and metabolic pathways.

    PubMed

    Nam, Hojung; Lee, Jinwon; Lee, Doheon

    2009-07-01

    Understanding altered metabolism is an important issue because altered metabolism is often revealed as a cause or an effect in pathogenesis. It has also been shown to be an important factor in the manipulation of an organism's metabolism in metabolic engineering. Unfortunately, it is not yet possible to measure the concentration levels of all metabolites at the genome-wide scale of a metabolic network; consequently, a method that infers the alteration of metabolism is beneficial. The present study proposes a computational method that identifies genome-wide altered metabolism by analyzing functional units of KEGG pathways. As control of a metabolic pathway is accomplished by altering the activity of at least one rate-determining-step enzyme, not all gene expressions of enzymes in the pathway demonstrate significant changes even if the pathway is altered. Therefore, we measure the alteration level of a metabolic pathway by selectively observing the expression levels of significantly changed genes in that pathway. The proposed method was applied to gene expression profiles of two strains of Saccharomyces cerevisiae measured in very high-gravity (VHG) fermentation. The method identified altered metabolic pathways whose properties are related to the ethanol and osmotic stress responses known to occur in VHG fermentation because of the high sugar concentration in growth media and high ethanol concentration in fermentation products. With the identified altered pathways, the proposed method achieved the best accuracy and sensitivity rates for the Red Star (RS) strain compared to three other related methods (gene-set enrichment analysis (GSEA), significance analysis of microarray to gene set (SAM-GS), and reporter metabolite), while for the CEN.PK 113-7D (CEN) strain the proposed method and the GSEA method showed comparable performance.
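
    The key idea, scoring a pathway only by its most-changed genes so that rate-determining enzymes are not diluted by the unchanged majority, can be sketched as follows (the scoring rule and numbers are illustrative, not the paper's exact statistic):

    ```python
    def pathway_alteration_score(z_scores, top_fraction=0.5):
        """Score a pathway by the mean absolute expression-change z-score of
        its most-changed genes only. z_scores: one z per pathway gene."""
        ranked = sorted((abs(z) for z in z_scores), reverse=True)
        k = max(1, round(len(ranked) * top_fraction))
        return sum(ranked[:k]) / k

    # A pathway where only 2 of 6 enzymes respond strongly still scores high
    print(pathway_alteration_score([3.1, -2.8, 0.2, -0.1, 0.3, 0.0], 1 / 3))
    ```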

  13. Toward computational identification of multiscale "tipping points" in acute inflammation and multiple organ failure.

    PubMed

    An, Gary; Nieman, Gary; Vodovotz, Yoram

    2012-11-01

    Sepsis accounts annually for nearly 10% of total U.S. deaths, costing nearly $17 billion/year. Sepsis is a manifestation of disordered systemic inflammation. Properly regulated inflammation allows for timely recognition and effective reaction to injury or infection, but inadequate or overly robust inflammation can lead to Multiple Organ Dysfunction Syndrome (MODS). There is an incongruity between the systemic nature of disordered inflammation (as the target of inflammation-modulating therapies), and the regional manifestation of organ-specific failure (as the subject of organ support), that presents a therapeutic dilemma: systemic interventions can interfere with an individual organ system's appropriate response, yet organ-specific interventions may not help the overall system reorient itself. Based on a decade of systems and computational approaches to deciphering acute inflammation, along with translationally-motivated experimental studies in both small and large animals, we propose that MODS evolves due to the feed-forward cycle of inflammation → damage → inflammation. We hypothesize that inflammation proceeds at a given, "nested" level or scale until positive feedback exceeds a "tipping point." Below this tipping point, inflammation is contained and manageable; when this threshold is crossed, inflammation becomes disordered, and dysfunction propagates to a higher biological scale (e.g., progressing from cellular, to tissue/organ, to multiple organs, to the organism). Finally, we suggest that a combination of computational biology approaches involving data-driven and mechanistic mathematical modeling, in close association with studies in clinically relevant paradigms of sepsis/MODS, are necessary in order to define scale-specific "tipping points" and to suggest novel therapies for sepsis.
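    The hypothesized feed-forward loop (inflammation → damage → inflammation, with a containment threshold) can be illustrated with a toy iteration; every parameter value here is invented for illustration and not fitted to any data:

```python
def inflammation_trajectory(i0, feedback=1.5, clearance=0.2, threshold=0.5, steps=50):
    """Toy feed-forward loop: damage above a containment threshold feeds
    back into inflammation, while clearance removes a fixed fraction per
    step. All parameter values are invented for illustration."""
    i, traj = i0, [i0]
    for _ in range(steps):
        damage = max(i - threshold, 0.0)   # containment absorbs small insults
        i = max(i + feedback * damage - clearance * i, 0.0)
        traj.append(i)
    return traj

low = inflammation_trajectory(0.4)   # below the tipping point: resolves
high = inflammation_trajectory(0.7)  # above it: runaway inflammation
```

    Below the threshold the clearance term dominates and the insult resolves; above it the positive feedback outruns clearance, the toy analogue of crossing a "tipping point".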

  14. A novel computer-aided detection system for pulmonary nodule identification in CT images

    NASA Astrophysics Data System (ADS)

    Han, Hao; Li, Lihong; Wang, Huafeng; Zhang, Hao; Moore, William; Liang, Zhengrong

    2014-03-01

    Computer-aided detection (CADe) of pulmonary nodules from computed tomography (CT) scans is critical for assisting radiologists to identify lung lesions at an early stage. In this paper, we propose a novel approach for CADe of lung nodules using a two-stage vector quantization (VQ) scheme. The first-stage VQ aims to extract the lung from the chest volume, while the second-stage VQ is designed to extract initial nodule candidates (INCs) within the lung volume. Rule-based expert filtering is then employed to prune obvious false positives (FPs) from the INCs, and the commonly used support vector machine (SVM) classifier is adopted to further reduce the FPs. The proposed system was validated on 100 CT scans randomly selected from the 262 scans that have at least one juxta-pleural nodule annotation in the publicly available Lung Image Database Consortium and Image Database Resource Initiative (LIDC-IDRI) database. The two-stage VQ missed only 2 of the 207 nodules at agreement level 1, and INC detection took about 30 seconds per scan on average. Expert filtering reduced FPs by more than a factor of 18 while maintaining a sensitivity of 93.24%. As it is trivial to distinguish INCs attached to the pleural wall from those that are not, we investigated the feasibility of training different SVM classifiers to further reduce FPs from these two kinds of INCs. Experimental results indicated that SVM classification over the entire set of INCs was preferable, with the optimal operating point of our CADe system achieving a sensitivity of 89.4% at a specificity of 86.8%.
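    The first-stage VQ idea, partitioning intensities into codewords so the low-intensity codeword yields the lung mask, can be sketched with a minimal 1-D k-means; the Hounsfield values and the scalar (rather than vector) quantization are simplifying assumptions:

```python
import numpy as np

def kmeans_1d(values, k=2, iters=20):
    """Minimal 1-D k-means standing in for the vector-quantization stages
    (the actual system quantizes local feature vectors, not raw voxels)."""
    centers = np.linspace(values.min(), values.max(), k)   # spread codewords
    labels = np.zeros(values.size, dtype=int)
    for _ in range(iters):
        labels = np.argmin(np.abs(values[:, None] - centers[None, :]), axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = values[labels == j].mean()
    return labels, centers

# toy CT intensities: lung/air near -800 HU, soft tissue near 40 HU
rng = np.random.default_rng(1)
hu = np.concatenate([rng.normal(-800, 20, 100), rng.normal(40, 20, 100)])
labels, centers = kmeans_1d(hu)
lung_mask = labels == int(np.argmin(centers))   # low-intensity codeword = lung
```

    The second-stage VQ would repeat the same quantization inside the lung mask to isolate bright nodule candidates before the rule-based and SVM filtering stages.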

  15. Identification of Cognitive Processes of Effective and Ineffective Students during Computer Programming

    ERIC Educational Resources Information Center

    Renumol, V. G.; Janakiram, Dharanipragada; Jayaprakash, S.

    2010-01-01

    Identifying the set of cognitive processes (CPs) a student can go through during computer programming is an interesting research problem. It can provide a better understanding of the human aspects in computer programming process and can also contribute to the computer programming education in general. The study identified the presence of a set of…

  16. Reshaping Computer Literacy Teaching in Higher Education: Identification of Critical Success Factors

    ERIC Educational Resources Information Center

    Taylor, Estelle; Goede, Roelien; Steyn, Tjaart

    2011-01-01

    Purpose: Acquiring computer skills is more important today than ever before, especially in a developing country. Teaching of computer skills, however, has to adapt to new technology. This paper aims to model factors influencing the success of the learning of computer literacy by means of an e-learning environment. The research question for this…

  19. Computing converged free energy differences between levels of theory via nonequilibrium work methods: Challenges and opportunities.

    PubMed

    Kearns, Fiona L; Hudson, Phillip S; Woodcock, Henry L; Boresch, Stefan

    2017-03-08

    We demonstrate that Jarzynski's equation can be used to reliably compute free energy differences between low and high level representations of systems. The need for such a calculation arises when employing the so-called "indirect" approach to free energy simulations with mixed quantum mechanical/molecular mechanical (QM/MM) Hamiltonians, a popular technique for circumventing extensive simulations involving quantum chemical computations. We have applied this methodology to several small and medium-sized organic molecules, both in the gas phase and in explicit solvent. Test cases include several systems for which the standard approach, that is, free energy perturbation between the low and high level descriptions, fails to converge. Finally, we identify three major areas in which the difference between low and high level representations makes the calculation of ΔAlow→high difficult: bond stretching and angle bending, different preferred conformations, and the response of the MM region to the charge distribution of the QM region. © 2016 Wiley Periodicals, Inc.
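    The underlying estimator is Jarzynski's equality, ΔA = -kT ln⟨exp(-W/kT)⟩, averaged over nonequilibrium work values. A minimal sketch (the kT value and work samples are illustrative):

```python
import numpy as np

def jarzynski_free_energy(work, kT):
    """Exponential-average (Jarzynski) estimator of the free energy
    difference from nonequilibrium work values, written in a
    log-sum-exp form for numerical stability."""
    w = np.asarray(work, dtype=float) / kT
    m = w.min()
    return kT * (m - np.log(np.mean(np.exp(-(w - m)))))

# sanity check: with identical work values the estimate equals the work
dA = jarzynski_free_energy([2.0, 2.0, 2.0], kT=0.593)
```

    By Jensen's inequality the estimate never exceeds the mean work, and rare low-work trajectories dominate the average, which is why convergence of this estimator is the paper's central concern.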

  20. Energy Use and Power Levels in New Monitors and Personal Computers

    SciTech Connect

    Roberson, Judy A.; Homan, Gregory K.; Mahajan, Akshay; Nordman, Bruce; Webber, Carrie A.; Brown, Richard E.; McWhinney, Marla; Koomey, Jonathan G.

    2002-07-23

    Our research was conducted in support of the EPA ENERGY STAR Office Equipment program, whose goal is to reduce the amount of electricity consumed by office equipment in the U.S. The most energy-efficient models in each office equipment category are eligible for the ENERGY STAR label, which consumers can use to identify and select efficient products. As the efficiency of each category improves over time, the ENERGY STAR criteria need to be revised accordingly. The purpose of this study was to provide reliable data on the energy consumption of the newest personal computers and monitors that the EPA can use to evaluate revisions to current ENERGY STAR criteria, as well as to improve the accuracy of ENERGY STAR program savings estimates. We report the results of measuring the power consumption and power management capabilities of a sample of new monitors and computers. These results will be used to improve estimates of program energy savings and carbon emission reductions, and to inform revisions of the ENERGY STAR criteria for these products. Our sample consists of 35 monitors and 26 computers manufactured between July 2000 and October 2001; it includes cathode ray tube (CRT) and liquid crystal display (LCD) monitors, Macintosh and Intel-architecture computers, desktop and laptop computers, and integrated computer systems, in which power consumption of the computer and monitor cannot be measured separately. For each machine we measured power consumption when off, on, and in each low-power level. We identify trends in and opportunities to reduce power consumption in new personal computers and monitors. Our results include a trend among monitor manufacturers to provide a single very low low-power level, well below the current ENERGY STAR criteria for sleep power consumption. These very low sleep power results mean that energy consumed when monitors are off or in active use has become more important in terms of contribution to the overall unit energy consumption (UEC).

  1. Level set discrete element method for three-dimensional computations with triaxial case study

    NASA Astrophysics Data System (ADS)

    Kawamoto, Reid; Andò, Edward; Viggiani, Gioacchino; Andrade, José E.

    2016-06-01

    In this paper, we outline the level set discrete element method (LS-DEM) which is a discrete element method variant able to simulate systems of particles with arbitrary shape using level set functions as a geometric basis. This unique formulation allows seamless interfacing with level set-based characterization methods as well as computational ease in contact calculations. We then apply LS-DEM to simulate two virtual triaxial specimens generated from XRCT images of experiments and demonstrate LS-DEM's ability to quantitatively capture and predict stress-strain and volume-strain behavior observed in the experiments.
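    The core LS-DEM contact test, interpolating one particle's signed-distance (level set) grid at the nodes of another particle and flagging negative values as penetration, can be sketched in 2-D; the disc geometry and unit grid spacing are assumptions for illustration:

```python
import numpy as np

def bilinear(grid, x, y):
    """Interpolate a level set grid (unit spacing) at point (x, y)."""
    i, j = int(np.floor(x)), int(np.floor(y))
    fx, fy = x - i, y - j
    return ((1 - fx) * (1 - fy) * grid[i, j] + fx * (1 - fy) * grid[i + 1, j]
            + (1 - fx) * fy * grid[i, j + 1] + fx * fy * grid[i + 1, j + 1])

def penetration(phi, nodes):
    """LS-DEM style contact test: a node of the other particle with a
    negative interpolated phi value lies inside this particle."""
    return [bilinear(phi, x, y) for x, y in nodes]

# signed distance of a disc of radius 3 centred at (5, 5) on an 11x11 grid
xs, ys = np.meshgrid(np.arange(11.0), np.arange(11.0), indexing="ij")
phi = np.hypot(xs - 5.0, ys - 5.0) - 3.0

depths = penetration(phi, [(5.0, 5.0), (9.0, 5.0)])   # inside, outside
```

    Because the level set stores distance directly, the interpolated value doubles as the penetration depth used in the contact force law, which is what makes arbitrary particle shapes cheap at contact time.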

  2. Cellular Automata as a Computational Model for Low-Level Vision

    NASA Astrophysics Data System (ADS)

    Broggi, Alberto; D'Andrea, Vincenzo; Destri, Giulio

    In this paper we discuss the use of the Cellular Automata (CA) computational model in computer vision applications on massively parallel architectures. Motivations and guidelines of this approach to low-level vision in the frame of the PROMETHEUS project are discussed. The hard real-time requirements of the actual application can only be satisfied using an ad hoc VLSI massively parallel architecture (PAPRICA). The hardware solutions and the specific algorithms can be efficiently verified and tested only by using, as a simulator, a general-purpose machine with a parent architecture (CM-2). An example application related to feature extraction is discussed.
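    As an illustration of the CA model for low-level vision, here is a minimal synchronous update in which every cell applies the same local rule to its 3x3 neighbourhood (a majority rule used for noise cleaning; the actual PAPRICA algorithms are not reproduced here):

```python
import numpy as np

def ca_step(img):
    """One synchronous CA update on a binary image: every interior cell
    takes the majority value of its 3x3 neighbourhood, a classic
    low-level noise-cleaning rule (boundary cells are left unchanged)."""
    out = img.copy()
    h, w = img.shape
    for i in range(1, h - 1):
        for j in range(1, w - 1):
            out[i, j] = 1 if img[i-1:i+2, j-1:j+2].sum() >= 5 else 0
    return out

noisy = np.ones((7, 7), dtype=int)
noisy[3, 3] = 0                  # single-pixel noise inside a solid region
clean = ca_step(noisy)
```

    Every cell depends only on its local neighbourhood and all cells update simultaneously, which is exactly the property that maps the rule one-to-one onto a massively parallel array of simple processors.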

  3. Computer aided identification of a Hevein-like antimicrobial peptide of bell pepper leaves for biotechnological use.

    PubMed

    Games, Patrícia Dias; daSilva, Elói Quintas Gonçalves; Barbosa, Meire de Oliveira; Almeida-Souza, Hebréia Oliveira; Fontes, Patrícia Pereira; deMagalhães, Marcos Jorge; Pereira, Paulo Roberto Gomes; Prates, Maura Vianna; Franco, Gloria Regina; Faria-Campos, Alessandra; Campos, Sérgio Vale Aguiar; Baracat-Pereira, Maria Cristina

    2016-12-15

    Antimicrobial peptides from plants present mechanisms of action that are different from those of conventional defense agents. They are under-explored but have potential as commercial antimicrobials. Bell pepper leaves ('Magali R') are discarded after harvesting the fruit and are sources of bioactive peptides. This work reports the isolation, by peptidomics tools, and the identification and partial characterization, by computational tools, of an antimicrobial peptide from bell pepper leaves, and demonstrates the usefulness of database records and in silico analysis for the study of plant peptides aimed at biotechnological uses. Aqueous extracts from leaves were enriched in peptides by salt fractionation and ultrafiltration. An antimicrobial peptide was isolated by tandem chromatographic procedures. Mass spectrometry, automated peptide sequencing and bioinformatics tools were used alternately for identification and partial characterization of the Hevein-like peptide, named HEV-CANN. The computational tools that assisted the identification of the peptide included BlastP, PSI-Blast, ClustalOmega, PeptideCutter, and ProtParam; conventional protein databases (DBs) such as Mascot, Protein-DB, GenBank-DB, RefSeq, Swiss-Prot, and UniProtKB; peptide-specific DBs such as Amper, APD2, CAMP, LAMPs, and PhytAMP; and other tools included in ExPASy for Proteomics, The Bioactive Peptide Databases, and The Pepper Genome Database. The HEV-CANN sequence presented 40 amino acid residues, 4258.8 Da, a theoretical pI-value of 8.78, and four disulfide bonds. It was stable, and it inhibited the growth of phytopathogenic bacteria and a fungus. HEV-CANN presented a chitin-binding domain in its sequence. There was high identity and positive alignment of the HEV-CANN sequence in various databases, but not complete identity, suggesting that HEV-CANN may be produced by ribosomal synthesis, which is in accordance with its constitutive nature. Computational tools for proteomics and databases are

  4. An accurate and computationally efficient algorithm for ground peak identification in large footprint waveform LiDAR data

    NASA Astrophysics Data System (ADS)

    Zhuang, Wei; Mountrakis, Giorgos

    2014-09-01

    Large footprint waveform LiDAR sensors have been widely used for numerous airborne studies. Ground peak identification in a large footprint waveform is a significant bottleneck in exploring full usage of the waveform datasets. In the current study, an accurate and computationally efficient algorithm was developed for ground peak identification, called the Filtering and Clustering Algorithm (FICA). The method was evaluated on Land, Vegetation, and Ice Sensor (LVIS) waveform datasets acquired over Central NY. FICA incorporates a set of multi-scale second derivative filters and a k-means clustering algorithm in order to avoid detecting false ground peaks. FICA was tested in five different land cover types (deciduous trees, coniferous trees, shrub, grass and developed area) and showed more accurate results when compared to existing algorithms. More specifically, compared with Gaussian decomposition (GD), the RMSE of ground peak identification by FICA was 2.82 m (5.29 m for GD) in deciduous plots, 3.25 m (4.57 m for GD) in coniferous plots, 2.63 m (2.83 m for GD) in shrub plots, 0.82 m (0.93 m for GD) in grass plots, and 0.70 m (0.51 m for GD) in plots of developed areas. FICA performance was also relatively consistent under various slope and canopy coverage (CC) conditions. In addition, FICA showed better computational efficiency compared to existing methods. FICA's major computational and accuracy advantage is a result of the adopted multi-scale signal processing procedures that concentrate on local portions of the signal, as opposed to the Gaussian decomposition that uses a curve-fitting strategy applied to the entire signal. The FICA algorithm is a good candidate for large-scale implementation on future space-borne waveform LiDAR sensors.
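    The flavor of FICA's approach, second-derivative filtering of the waveform followed by selection among candidate peaks, can be sketched at a single scale; the smoothing window, peak test and synthetic waveform are illustrative assumptions rather than the published algorithm:

```python
import numpy as np

def ground_peak(waveform, smooth=3):
    """Single-scale sketch of ground-peak picking: smooth, take the
    discrete second derivative, keep concave-down local maxima as
    candidate peaks, and return the last one (the ground return
    arrives latest in a downward-looking waveform)."""
    s = np.convolve(waveform, np.ones(smooth) / smooth, mode="same")
    d2 = np.diff(s, 2)                           # second derivative
    peaks = [i + 1 for i in range(1, len(d2) - 1)
             if d2[i] < 0 and s[i + 1] > s[i] and s[i + 1] > s[i + 2]]
    return peaks[-1] if peaks else None

# synthetic waveform: canopy return near bin 60, ground return near bin 150
t = np.arange(200.0)
wave = np.exp(-(t - 60.0)**2 / 50.0) + 0.8 * np.exp(-(t - 150.0)**2 / 30.0)
idx = ground_peak(wave)
```

    The published method applies this kind of filter at multiple scales and clusters the responses with k-means, which is what suppresses false ground peaks under dense canopy.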

  5. iProphet: Multi-level Integrative Analysis of Shotgun Proteomic Data Improves Peptide and Protein Identification Rates and Error Estimates*

    PubMed Central

    Shteynberg, David; Deutsch, Eric W.; Lam, Henry; Eng, Jimmy K.; Sun, Zhi; Tasman, Natalie; Mendoza, Luis; Moritz, Robert L.; Aebersold, Ruedi; Nesvizhskii, Alexey I.

    2011-01-01

    The combination of tandem mass spectrometry and sequence database searching is the method of choice for the identification of peptides and the mapping of proteomes. Over the last several years, the volume of data generated in proteomic studies has increased dramatically, which challenges the computational approaches previously developed for these data. Furthermore, a multitude of search engines have been developed that identify different, overlapping subsets of the sample peptides from a particular set of tandem mass spectrometry spectra. We present iProphet, a new addition to the widely used open-source suite of proteomic data analysis tools, the Trans-Proteomics Pipeline. Applied in tandem with PeptideProphet, it provides a more accurate representation of the multilevel nature of shotgun proteomic data. iProphet combines the evidence from multiple identifications of the same peptide sequences across different spectra, experiments, precursor ion charge states, and modified states. It also allows accurate and effective integration of the results from multiple database search engines applied to the same data. The use of iProphet in the Trans-Proteomics Pipeline increases the number of correctly identified peptides at a constant false discovery rate as compared with both PeptideProphet and another state-of-the-art tool, Percolator. As the main outcome, iProphet permits the calculation of accurate posterior probabilities and false discovery rate estimates at the level of sequence-identical peptide identifications, which in turn leads to more accurate probability estimates at the protein level. Fully integrated with the Trans-Proteomics Pipeline, it supports all commonly used MS instruments, search engines, and computer platforms.
The performance of iProphet is demonstrated on two publicly available data sets: data from a human whole cell lysate proteome profiling experiment representative of typical proteomic data sets, and from a set of Streptococcus pyogenes experiments
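    The core idea of combining repeated identifications of the same peptide can be illustrated under an independence assumption (iProphet's actual model is more elaborate and corrects for correlated evidence):

```python
def combined_probability(psm_probs):
    """Combine the probabilities of independent identifications of the
    same peptide: the peptide is correct unless every observation is
    wrong. iProphet's real model also adjusts for correlated evidence,
    so this independence assumption is only the textbook simplification."""
    p_all_wrong = 1.0
    for p in psm_probs:
        p_all_wrong *= (1.0 - p)
    return 1.0 - p_all_wrong

# the same peptide sequence seen in three spectra / search engines
p = combined_probability([0.5, 0.5, 0.8])
```

    Two mediocre identifications and one good one already yield a high peptide-level probability, which is why pooling evidence across spectra, charge states and engines raises the yield at a fixed false discovery rate.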

  6. Cooperative gene regulation by microRNA pairs and their identification using a computational workflow

    PubMed Central

    Schmitz, Ulf; Lai, Xin; Winter, Felix; Wolkenhauer, Olaf; Vera, Julio; Gupta, Shailendra K.

    2014-01-01

    MicroRNAs (miRNAs) are an integral part of gene regulation at the post-transcriptional level. Recently, it has been shown that pairs of miRNAs can repress the translation of a target mRNA in a cooperative manner, which leads to enhanced effectiveness and specificity in target repression. However, it remains unclear which miRNA pairs can synergize and which genes are targets of cooperative miRNA regulation. In this paper, we present a computational workflow for the prediction and analysis of cooperating miRNAs and their mutual target genes, which we refer to as RNA triplexes. The workflow integrates methods of miRNA target prediction, triplex structure analysis, molecular dynamics simulations, and mathematical modeling for a reliable prediction of functional RNA triplexes and target repression efficiency. In a case study we analyzed the human genome and identified several thousand targets of cooperative gene regulation. Our results suggest that miRNA cooperativity is a frequent mechanism for enhanced target repression by pairs of miRNAs, facilitating distinctive and fine-tuned target gene expression patterns. Human RNA triplexes predicted and characterized in this study are organized in a web resource at www.sbi.uni-rostock.de/triplexrna/. PMID:24875477

  7. The Duke Personal Computer Project: A Strategy for Computing Literacy.

    ERIC Educational Resources Information Center

    Gallie, Thomas M.; And Others

    1981-01-01

    The introduction of an instructional computing strategy at Duke University that led to identification of three levels of user subsets and establishment of equipment clusters is described. Uses of the system in establishing computer literacy and special applications in chemistry, computer science, and the social sciences are reviewed. (MP)

  9. Are accurate computations of the 13C' shielding feasible at the DFT level of theory?

    PubMed

    Vila, Jorge A; Arnautova, Yelena A; Martin, Osvaldo A; Scheraga, Harold A

    2014-02-05

    The goal of this study is twofold. First, to investigate the relative influence of the main structural factors affecting the computation of the (13)C' shielding, namely, the conformation of the residue itself and the next nearest-neighbor effects. Second, to determine whether calculation of the (13)C' shielding at the density functional level of theory (DFT), with an accuracy similar to that of the (13)C(α) shielding, is feasible with existing computational resources. The DFT calculations, carried out for a large number of possible conformations of the tripeptide Ac-GXY-NMe with different combinations of X and Y residues, enable us to conclude that accurate computation of the (13)C' shielding for a given residue X depends on: (i) the (ϕ,ψ) backbone torsional angles of X; (ii) the side-chain conformation of X; (iii) the (ϕ,ψ) torsional angles of Y; and (iv) the identity of residue Y. Consequently, DFT-based quantum mechanical calculations of the (13)C' shielding, with all these factors taken into account, are two orders of magnitude more CPU-demanding than the computation, with similar accuracy, of the (13)C(α) shielding. Despite not considering the effect of a possible hydrogen bond interaction of the carbonyl oxygen, this work contributes to our general understanding of the main structural factors affecting the accurate computation of the (13)C' shielding in proteins and may spur significant progress in efforts to develop new validation methods for protein structures.

  10. High level waste storage tank farms/242-A evaporator Standards/Requirements Identification Document (S/RID), Volume 6

    SciTech Connect

    Not Available

    1994-04-01

    The High-Level Waste Storage Tank Farms/242-A Evaporator Standards/Requirements Identification Document (S/RID) is contained in multiple volumes. This document (Volume 6) outlines the standards and requirements for the sections on: Environmental Restoration and Waste Management, Research and Development and Experimental Activities, and Nuclear Safety.

  11. High-level waste storage tank farms/242-A evaporator Standards/Requirements Identification Document (S/RID), Volume 5

    SciTech Connect

    Not Available

    1994-04-01

    The High-Level Waste Storage Tank Farms/242-A Evaporator Standards/Requirements Identification Document (S/RID) is contained in multiple volumes. This document (Volume 5) outlines the standards and requirements for the Fire Protection and Packaging and Transportation sections.

  12. High-level waste storage tank farms/242-A evaporator Standards/Requirements Identification Document (S/RID), Volume 3

    SciTech Connect

    Not Available

    1994-04-01

    The High-Level Waste Storage Tank Farms/242-A Evaporator Standards/Requirements Identification Document (S/RID) is contained in multiple volumes. This document (Volume 3) presents the standards and requirements for the following sections: Safeguards and Security, Engineering Design, and Maintenance.

  13. High-level waste storage tank farms/242-A evaporator Standards/Requirements Identification Document (S/RID), Volume 4

    SciTech Connect

    Not Available

    1994-04-01

    The High-Level Waste Storage Tank Farms/242-A Evaporator Standards/Requirements Identification Document (S/RID) is contained in multiple volumes. This document (Volume 4) presents the standards and requirements for the following sections: Radiation Protection and Operations.

  14. Identification of masses in digital mammogram using gray level co-occurrence matrices

    PubMed Central

    Mohd. Khuzi, A; Besar, R; Wan Zaki, WMD; Ahmad, NN

    2009-01-01

    Digital mammography has become the most effective modality for early breast cancer detection. A digital mammogram takes an electronic image of the breast and stores it directly in a computer. The aim of this study is to develop an automated system for assisting the analysis of digital mammograms. Computer image processing techniques will be applied to enhance images, followed by segmentation of the region of interest (ROI). Subsequently, textural features will be extracted from the ROI and used to classify the ROIs as either masses or non-masses. In this study, normal breast images and breast images with masses used as the standard input to the proposed system are taken from the Mammographic Image Analysis Society (MIAS) digital mammogram database. In the MIAS database, masses are grouped as either spiculated, circumscribed or ill-defined. Additional information includes the locations of mass centres and the radii of masses. The extraction of the textural features of ROIs is done using gray level co-occurrence matrices (GLCM), constructed at four different directions for each ROI. The results show that the GLCM at 0°, 45°, 90° and 135° with a block size of 8×8 gives significant texture information to identify masses versus non-mass tissues. Analysis of GLCM properties, i.e. contrast, energy and homogeneity, resulted in a receiver operating characteristic (ROC) curve area of Az = 0.84 for Otsu's method, 0.82 for the thresholding method and Az = 0.7 for K-means clustering. An ROC curve area of 0.8-0.9 is rated as good. The proposed method contains no complicated algorithm. The detection is based on a decision tree with five criteria to be analysed. This simplicity leads to less computational time. Thus, this approach is suitable for an automated real-time breast cancer diagnosis system. PMID:21611053

  15. Identification of masses in digital mammogram using gray level co-occurrence matrices.

    PubMed

    Mohd Khuzi, A; Besar, R; Wan Zaki, Wmd; Ahmad, Nn

    2009-07-01

    Digital mammography has become the most effective modality for early breast cancer detection. A digital mammogram takes an electronic image of the breast and stores it directly in a computer. The aim of this study is to develop an automated system for assisting the analysis of digital mammograms. Computer image processing techniques will be applied to enhance images, followed by segmentation of the region of interest (ROI). Subsequently, textural features will be extracted from the ROI and used to classify the ROIs as either masses or non-masses. In this study, normal breast images and breast images with masses used as the standard input to the proposed system are taken from the Mammographic Image Analysis Society (MIAS) digital mammogram database. In the MIAS database, masses are grouped as either spiculated, circumscribed or ill-defined. Additional information includes the locations of mass centres and the radii of masses. The extraction of the textural features of ROIs is done using gray level co-occurrence matrices (GLCM), constructed at four different directions for each ROI. The results show that the GLCM at 0°, 45°, 90° and 135° with a block size of 8×8 gives significant texture information to identify masses versus non-mass tissues. Analysis of GLCM properties, i.e. contrast, energy and homogeneity, resulted in a receiver operating characteristic (ROC) curve area of Az = 0.84 for Otsu's method, 0.82 for the thresholding method and Az = 0.7 for K-means clustering. An ROC curve area of 0.8-0.9 is rated as good. The proposed method contains no complicated algorithm. The detection is based on a decision tree with five criteria to be analysed. This simplicity leads to less computational time. Thus, this approach is suitable for an automated real-time breast cancer diagnosis system.
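    A minimal sketch of the GLCM pipeline described above, building a co-occurrence matrix for one offset and computing the three analysed properties (contrast, energy, homogeneity); the toy image and the single 0-degree offset are illustrative:

```python
import numpy as np

def glcm(img, dx, dy, levels):
    """Gray-level co-occurrence matrix for one pixel offset (dx, dy),
    symmetrised and normalised to a joint probability table."""
    m = np.zeros((levels, levels))
    h, w = img.shape
    for i in range(h):
        for j in range(w):
            i2, j2 = i + dy, j + dx
            if 0 <= i2 < h and 0 <= j2 < w:
                m[img[i, j], img[i2, j2]] += 1
    m = m + m.T                              # symmetrise
    return m / m.sum()

def texture_features(p):
    """The three GLCM properties analysed in the paper."""
    row, col = np.indices(p.shape)
    d = row - col
    contrast = float((p * d**2).sum())
    energy = float((p**2).sum())
    homogeneity = float((p / (1.0 + np.abs(d))).sum())
    return contrast, energy, homogeneity

img = np.array([[0, 0, 1],
                [0, 0, 1],
                [0, 2, 2]])
p0 = glcm(img, dx=1, dy=0, levels=3)         # the 0 degree direction
contrast, energy, homogeneity = texture_features(p0)
```

    Repeating the computation with offsets (1, 1), (0, 1) and (-1, 1) gives the 45°, 90° and 135° matrices used in the study.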

  16. Computation of the intervals of uncertainties about the parameters found for identification

    NASA Technical Reports Server (NTRS)

    Mereau, P.; Raymond, J.

    1982-01-01

    A modeling method to calculate the intervals of uncertainty for parameters found by identification is described. The region of confidence and the general approach to the calculation of these intervals are discussed. The general subprograms for determining dimensions are described, including the organizational charts for the subprograms, the tests carried out, and the listings of the different subprograms.
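    A standard way to compute such uncertainty intervals, which may differ in detail from the report's method, is the linearised approximation using the model Jacobian and the residual variance:

```python
import numpy as np

def parameter_intervals(J, residuals, n_params, t_value=1.96):
    """Linearised uncertainty intervals for identified parameters:
    parameter covariance s^2 (J^T J)^-1 from the fit residuals and the
    model Jacobian, returned as +/- half-widths (t * standard error)."""
    r = np.asarray(residuals, dtype=float)
    dof = r.size - n_params                    # degrees of freedom
    s2 = (r @ r) / dof                         # residual variance estimate
    cov = s2 * np.linalg.inv(J.T @ J)          # parameter covariance
    return t_value * np.sqrt(np.diag(cov))     # interval half-widths

# identified straight line y = a*x + b: Jacobian columns are d(model)/d(a, b)
x = np.linspace(0.0, 1.0, 20)
J = np.column_stack([x, np.ones_like(x)])
residuals = 0.05 * np.sin(7.0 * x)             # illustrative fit residuals
half_widths = parameter_intervals(J, residuals, n_params=2)
```

    The off-diagonal entries of the covariance describe the elliptical confidence region the report discusses; the half-widths above are its axis-aligned projection.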

  17. Using the high-level based program interface to facilitate the large scale scientific computing.

    PubMed

    Shang, Yizi; Shang, Ling; Gao, Chuanchang; Lu, Guiming; Ye, Yuntao; Jia, Dongdong

    2014-01-01

    This paper furthers research on facilitating large-scale scientific computing on grid and desktop-grid platforms. The related issues include the programming method, the overhead of middleware based on a high-level program interface, and anticipatory data migration. The block-based Gauss-Jordan algorithm, as a real example of large-scale scientific computing, is used to evaluate the issues presented above. The results show that the high-level program interface makes complex scientific applications on large-scale platforms easier to build, though a little overhead is unavoidable. Also, the data-anticipation migration mechanism can improve the efficiency of platforms that need to process big-data-based scientific applications.

  18. Using the High-Level Based Program Interface to Facilitate the Large Scale Scientific Computing

    PubMed Central

    Shang, Yizi; Shang, Ling; Gao, Chuanchang; Lu, Guiming; Ye, Yuntao; Jia, Dongdong

    2014-01-01

    This paper furthers research on facilitating large-scale scientific computing on grid and desktop-grid platforms. The related issues include the programming method, the overhead of middleware based on a high-level program interface, and anticipatory data migration. The block-based Gauss-Jordan algorithm, as a real example of large-scale scientific computing, is used to evaluate the issues presented above. The results show that the high-level program interface makes complex scientific applications on large-scale platforms easier to build, though a little overhead is unavoidable. Also, the data-anticipation migration mechanism can improve the efficiency of platforms that need to process big-data-based scientific applications. PMID:24574931
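    The block-based Gauss-Jordan algorithm used as the benchmark can be sketched serially for matrix inversion; on the platform each b×b block update would be farmed out to a node, and the test matrix here is an invented example:

```python
import numpy as np

def block_gauss_jordan_inverse(A, b=2):
    """Invert A by Gauss-Jordan elimination over b x b blocks. Each block
    update is an independent matrix product, which is what lets the real
    implementation distribute the work across grid nodes."""
    n = A.shape[0]
    assert n % b == 0, "matrix size must be a multiple of the block size"
    M = np.hstack([A.astype(float), np.eye(n)])        # augmented [A | I]
    for k in range(0, n, b):
        # normalise the pivot block row, then eliminate it from all others
        M[k:k+b] = np.linalg.inv(M[k:k+b, k:k+b]) @ M[k:k+b]
        for i in range(0, n, b):
            if i != k:
                M[i:i+b] = M[i:i+b] - M[i:i+b, k:k+b] @ M[k:k+b]
    return M[:, n:]

# a small invented test matrix (any matrix with invertible pivot blocks works)
A = np.array([[4.0, 1, 1, 0],
              [1, 3, 0, 1],
              [1, 0, 2, 1],
              [0, 1, 1, 2]])
A_inv = block_gauss_jordan_inverse(A, b=2)
```

    All inner updates within one elimination step are independent of each other, so they can run concurrently; only the pivot-block inversion is sequential between steps.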

  19. Computational Psychiatry of ADHD: Neural Gain Impairments across Marrian Levels of Analysis.

    PubMed

    Hauser, Tobias U; Fiore, Vincenzo G; Moutoussis, Michael; Dolan, Raymond J

    2016-02-01

    Attention-deficit hyperactivity disorder (ADHD), one of the most common psychiatric disorders, is characterised by unstable response patterns across multiple cognitive domains. However, the neural mechanisms that explain these characteristic features remain unclear. Using a computational multilevel approach, we propose that ADHD is caused by impaired gain modulation in systems that generate this phenotypic increased behavioural variability. Using Marr's three levels of analysis as a heuristic framework, we focus on this variable behaviour, detail how it can be explained algorithmically, and how it might be implemented at a neural level through catecholamine influences on corticostriatal loops. This computational, multilevel, approach to ADHD provides a framework for bridging gaps between descriptions of neuronal activity and behaviour, and provides testable predictions about impaired mechanisms. Copyright © 2016 The Authors. Published by Elsevier Ltd. All rights reserved.

  20. Evolving tactics using levels of intelligence in computer-generated forces

    NASA Astrophysics Data System (ADS)

    Porto, Vincent W.; Hardt, Michael; Fogel, David B.; Kreutz-Delgado, Kenneth; Fogel, Lawrence J.

    1999-06-01

    Simulated evolution on a computer can provide a means for generating appropriate tactics in real-time combat scenarios. Individual units or higher-level organizations, such as tanks and platoons, can use evolutionary computation to adapt to the current and projected situations. We briefly review current knowledge in evolutionary algorithms and offer an example of applying these techniques to generate adaptive behavior in a platoon-level engagement of tanks where the mission of one platoon is changed on-the-fly. We also study the effects of increasing the intelligence of one side in a one-on-one tank engagement. The results indicate that measured performance increases with increased intelligence; however, this does not always come at the expense of the opposing side.

  1. Computational Psychiatry of ADHD: Neural Gain Impairments across Marrian Levels of Analysis

    PubMed Central

    Hauser, Tobias U.; Fiore, Vincenzo G.; Moutoussis, Michael; Dolan, Raymond J.

    2016-01-01

    Attention-deficit hyperactivity disorder (ADHD), one of the most common psychiatric disorders, is characterised by unstable response patterns across multiple cognitive domains. However, the neural mechanisms that explain these characteristic features remain unclear. Using a computational multilevel approach, we propose that ADHD is caused by impaired gain modulation in systems that generate this phenotypic increased behavioural variability. Using Marr's three levels of analysis as a heuristic framework, we focus on this variable behaviour, detail how it can be explained algorithmically, and how it might be implemented at a neural level through catecholamine influences on corticostriatal loops. This computational, multilevel, approach to ADHD provides a framework for bridging gaps between descriptions of neuronal activity and behaviour, and provides testable predictions about impaired mechanisms. PMID:26787097

  2. POPCYCLE: a computer code for calculating nuclear and fossil plant levelized life-cycle power costs

    SciTech Connect

    Hardie, R.W.

    1982-02-01

    POPCYCLE, a computer code designed to calculate levelized life-cycle power costs for nuclear and fossil electrical generating plants, is described. Included are (1) derivations of the equations and a discussion of the methodology used by POPCYCLE, (2) a description of the input required by the code, (3) a listing of the input for a sample case, and (4) the output for a sample case.
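
    The report's equations are not reproduced here, but the core idea behind any levelized life-cycle power cost is a ratio of discounted lifetime costs to discounted lifetime generation. A minimal sketch (the function name and the sample figures are illustrative, not POPCYCLE's actual data or methodology):

```python
def levelized_power_cost(costs, energies, discount_rate):
    """Levelized life-cycle power cost: present value of all yearly costs
    divided by present value of all yearly energy generation."""
    pv_cost = sum(c / (1.0 + discount_rate) ** t for t, c in enumerate(costs))
    pv_energy = sum(e / (1.0 + discount_rate) ** t for t, e in enumerate(energies))
    return pv_cost / pv_energy

# Hypothetical 3-year plant: capital outlay in year 0, fuel + O&M afterwards
costs = [1000.0, 100.0, 100.0]     # yearly costs (arbitrary currency units)
energies = [0.0, 500.0, 500.0]     # yearly generation (arbitrary energy units)
lcoe = levelized_power_cost(costs, energies, 0.05)  # ~1.276 cost/energy unit
```

    Discounting both the numerator and the denominator is what makes the figure a "levelized" cost: it is the constant price per unit of energy that would exactly recover all discounted costs over the plant's life.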

  3. Computer vision applied to herbarium specimens of German trees: testing the future utility of the millions of herbarium specimen images for automated identification.

    PubMed

    Unger, Jakob; Merhof, Dorit; Renner, Susanne

    2016-11-16

    Global Plants, a collaboration between JSTOR and some 300 herbaria, now contains about 2.48 million high-resolution images of plant specimens, a number that continues to grow, and collections that are digitizing their specimens at high resolution are allocating considerable resources to the maintenance of computer hardware (e.g., servers) and to acquiring digital storage space. We here apply machine learning, specifically the training of a Support Vector Machine, to classify specimen images into categories, ideally at the species level, using the 26 most common tree species in Germany as a test case. We designed an analysis pipeline and classification system consisting of segmentation, normalization, feature extraction, and classification steps and evaluated the system on two test sets, one with 26 species, the other with 17, in each case using 10 images per species of plants collected between 1820 and 1995, which simulates the empirical situation that most named species are represented in herbaria and databases, such as JSTOR, by few specimens. We achieved 73.21% accuracy of species assignments in the larger test set, and 84.88% in the smaller test set. The results of this first application of a computer vision algorithm trained on images of herbarium specimens show that despite the problem of overlapping leaves, leaf-architectural features can be used to categorize specimens to species with good accuracy. Computer vision is poised to play a significant role in future rapid identification, at least for frequently collected genera or species in the European flora.

  4. Vibrational energy levels of difluorodioxirane computed with variational and perturbative methods from a hybrid force field.

    PubMed

    Ramakrishnan, Raghunathan; Carrington, Tucker

    2014-02-05

    We have computed vibrational energy levels of difluorodioxirane (CF2O2). For the potential, a Taylor expansion in normal coordinates is used. The CCSD(T) and MP2 methods and correlation consistent basis sets of quadruple-zeta quality are used to determine the force constants. The vibrational Schrödinger equation was solved using both a variational method and second order perturbation theory. The Watson kinetic energy operator and a discrete variable representation were used with the DEWE (E. Mátyus, G. Czakó, B.T. Sutcliffe and A.G. Császár, J. Chem. Phys. 127 (2007) 084102) computer program to do the variational calculations. For the variational calculations, the average absolute deviation of fundamentals, with respect to experimental values, is less than 3 cm⁻¹. Perturbative results are almost as good. About 300 vibrational levels were computed. ¹⁶O→¹⁸O isotopic shifts have also been calculated variationally for the lowest 75 vibrational energy levels and are compared to experimental results. Copyright © 2012 Elsevier B.V. All rights reserved.
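
    The perturbative levels in a study like this follow from the standard second-order term formula, E(v) = Σᵢ ωᵢ(vᵢ + 1/2) + Σ_{i≤j} x_ij(vᵢ + 1/2)(vⱼ + 1/2). A sketch of that bookkeeping (the ω and x values below are hypothetical placeholders, not the CF2O2 force field):

```python
def vpt2_energy(v, omega, x):
    """Vibrational term value (cm^-1) from the second-order perturbation
    expression: harmonic frequencies omega plus anharmonicity constants x
    (x given as an upper-triangular matrix)."""
    n = len(v)
    e = sum(omega[i] * (v[i] + 0.5) for i in range(n))
    e += sum(x[i][j] * (v[i] + 0.5) * (v[j] + 0.5)
             for i in range(n) for j in range(i, n))
    return e

def fundamental(mode, omega, x):
    """Fundamental of one mode: one quantum in `mode` minus the zero point."""
    n = len(omega)
    ground = [0] * n
    excited = [1 if i == mode else 0 for i in range(n)]
    return vpt2_energy(excited, omega, x) - vpt2_energy(ground, omega, x)

omega = [1500.0, 1000.0]          # hypothetical harmonic frequencies (cm^-1)
x = [[-10.0, -5.0],
     [  0.0, -8.0]]               # hypothetical anharmonicity constants
print(fundamental(0, omega, x))   # → 1477.5
```

    The printed value matches the closed form ν_i = ω_i + 2x_ii + (1/2)Σ_{j≠i} x_ij, which is a convenient consistency check on the term-value bookkeeping.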

  5. Minimal marker: an algorithm and computer program for the identification of minimal sets of discriminating DNA markers for efficient variety identification.

    PubMed

    Fujii, Hiroshi; Ogata, Tatsushi; Shimada, Takehiko; Endo, Tomoko; Iketani, Hiroyuki; Shimizu, Tokurou; Yamamoto, Toshiya; Omura, Mitsuo

    2013-04-01

    DNA markers are frequently used to analyze crop varieties, with the coded marker data summarized in a computer-generated table. Such summary tables often provide extraneous data about individual crop genotypes, needlessly complicating and prolonging DNA-based differentiation between crop varieties. At present, it is difficult to identify minimal marker sets--the smallest sets that can distinguish between all crop varieties listed in a marker-summary table--due to the absence of algorithms capable of such characterization. Here, we describe the development of just such an algorithm and MinimalMarker, its accompanying Perl-based computer program. MinimalMarker has been validated in variety identification of fruit trees using published datasets and is available for use with both dominant and co-dominant markers, regardless of the number of alleles, including SSR markers with numeric notation. We expect that this program will prove useful not only to genomics researchers but also to government agencies that use DNA markers to support a variety of food-inspection and -labeling regulations.
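
    The core combinatorial task (finding the smallest subset of markers whose joint genotypes separate every variety in the summary table) can be sketched as an exhaustive search over subsets of increasing size. This is illustrative only: MinimalMarker itself is a Perl program with a more elaborate algorithm, and the genotype table below is hypothetical:

```python
from itertools import combinations

def minimal_marker_set(table):
    """Given {variety: tuple of marker genotypes}, return the smallest tuple
    of marker indices whose genotype combinations distinguish every variety,
    or None if even the full table cannot separate them."""
    varieties = list(table.values())
    n_markers = len(varieties[0])
    for size in range(1, n_markers + 1):
        for subset in combinations(range(n_markers), size):
            projected = {tuple(v[i] for i in subset) for v in varieties}
            if len(projected) == len(varieties):  # all varieties distinct
                return subset
    return None

# Hypothetical coded SSR calls for four varieties at four markers
table = {
    "A": (1, 2, 1, 3),
    "B": (1, 2, 2, 3),
    "C": (2, 2, 1, 1),
    "D": (2, 2, 2, 3),
}
print(minimal_marker_set(table))  # → (0, 2)
```

    Exhaustive search is exponential in the number of markers, which is exactly why a dedicated tool is needed for realistic tables; the sketch only makes the problem statement concrete.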

  6. Computer Based Instruction in the U.S. Army’s Entry Level Enlisted Training.

    DTIC Science & Technology

    1985-03-13

    [Scanned report; only fragments of the OCR text are legible.] Computer Based Instruction in the U.S. Army's Entry Level Enlisted Training. Captain James A. Eldredge, Student, HQDA, MILPERCEN (DAPC-OPA-E). A legible fragment notes that CAS3 is mandatory for all senior Captains and refers to training at battalion and brigade level.

  7. Zebra tape identification for the instantaneous angular speed computation and angular resampling of motorbike valve train measurements

    NASA Astrophysics Data System (ADS)

    Rivola, Alessandro; Troncossi, Marco

    2014-02-01

    An experimental test campaign was performed on the valve train of a racing motorbike engine in order to gain insight into the dynamics of the system. In particular, the valve motion was acquired in cold test conditions by means of a laser vibrometer able to acquire displacement and velocity signals. The valve time-dependent measurements needed to be referred to the camshaft angular position in order to analyse the data in the angular domain, as usually done for rotating machines. For this purpose the camshaft was fitted with a zebra tape whose dark and light stripes were tracked by means of an optical probe. Unfortunately, both manufacturing and mounting imperfections of the employed zebra tape, resulting in stripes with slightly different widths, precluded the possibility of directly obtaining the correct relationship between camshaft angular position and time. In order to overcome this problem, the identification of the zebra tape was performed by means of an original and practical procedure that is the focus of the present paper. The method consists of three main steps: an ad-hoc test corresponding to special operating conditions, the computation of the instantaneous angular speed, and the final association of the stripes with the corresponding shaft angular position. The results reported in the paper demonstrate the suitability of the simple procedure for zebra tape identification, performed with the final purpose of implementing a computed order tracking technique for the data analysis.
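
    The identification idea can be sketched in miniature: during a constant-speed run each stripe's angular width is proportional to the time between successive edges, and once those widths are identified, the instantaneous angular speed over any interval is the stripe angle divided by the measured edge-to-edge time. A minimal sketch (function names and sample times are illustrative, not the paper's procedure):

```python
import math

def calibrate_stripe_angles(edge_times):
    """Identify each stripe's angular width from a constant-speed run:
    widths are proportional to edge-to-edge times, normalised to one turn."""
    dts = [t2 - t1 for t1, t2 in zip(edge_times, edge_times[1:])]
    total = sum(dts)
    return [2.0 * math.pi * dt / total for dt in dts]

def instantaneous_speed(edge_times, stripe_angles):
    """Instantaneous angular speed (rad/s) over each stripe interval."""
    dts = [t2 - t1 for t1, t2 in zip(edge_times, edge_times[1:])]
    return [theta / dt for theta, dt in zip(stripe_angles, dts)]

# Calibration run at constant speed over three unevenly wide stripes
cal_times = [0.0, 0.05, 0.10, 0.20]
angles = calibrate_stripe_angles(cal_times)      # sums to 2*pi
speeds = instantaneous_speed(cal_times, angles)  # constant by construction
```

    With the identified stripe angles in hand, any later (non-constant-speed) run over the same tape yields the speed profile needed for angular resampling and order tracking.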

  8. Computer predictions of photochemical oxidant levels for initial precursor concentrations characteristic of southeastern Virginia

    NASA Technical Reports Server (NTRS)

    Wakelyn, N. T.; Mclain, A. G.

    1979-01-01

    A computer study was performed with a photochemical box model, using a contemporary chemical mechanism and procedure, and a range of initial input pollutant concentrations thought to encompass those characteristic of the Southeastern Virginia region before a photochemical oxidant episode. The model predictions are consistent with the expectation of high summer afternoon ozone levels when initial nonmethane hydrocarbon (NMHC) levels are in the range 0.30-0.40 ppmC and NOx levels are in the range 0.02-0.05 ppm. Calculations made with a Lagrangian model, for one of the previously calculated cases, which had produced intermediate afternoon ozone levels, suggest that urban source additions of NMHC and NOx exacerbate the photochemical oxidant condition.

  9. Tenth Grade Students' Time Using a Computer as a Predictor of the Highest Level of Education Attempted

    ERIC Educational Resources Information Center

    Gaffey, Adam John

    2014-01-01

    As computing technology continued to grow in the lives of secondary students from 2002 to 2006, researchers failed to identify the influence that using computers would have on the highest level of education students attempted. During the early part of the century schools moved towards increasing the usage of computers. Numerous stakeholders were unsure…

  11. Properly defining the targets of a transcription factor significantly improves the computational identification of cooperative transcription factor pairs in yeast

    PubMed Central

    2015-01-01

    Background Transcriptional regulation of gene expression in eukaryotes is usually accomplished by cooperative transcription factors (TFs). Computational identification of cooperative TF pairs has become a hot research topic and many algorithms have been proposed in the literature. A typical algorithm for predicting cooperative TF pairs has two steps. (Step 1) Define the targets of each TF under study. (Step 2) Design a measure for calculating the cooperativity of a TF pair based on the targets of these two TFs. While different algorithms have distinct sophisticated cooperativity measures, the targets of a TF are usually defined using ChIP-chip data. However, there is an inherent weakness in using ChIP-chip data to define the targets of a TF. ChIP-chip analysis can only identify the binding targets of a TF but it cannot distinguish the true regulatory from the binding but non-regulatory targets of a TF. Results This work is the first study which aims to investigate whether the performance of computational identification of cooperative TF pairs could be improved by using a more biologically relevant way to define the targets of a TF. For this purpose, we propose four simple algorithms, all of which consist of two steps. (Step 1) Define the targets of a TF using (i) ChIP-chip data in the first algorithm, (ii) TF binding data in the second algorithm, (iii) TF perturbation data in the third algorithm, and (iv) the intersection of TF binding and TF perturbation data in the fourth algorithm. Compared with the first three algorithms, the fourth algorithm uses a more biologically relevant way to define the targets of a TF. (Step 2) Measure the cooperativity of a TF pair by the statistical significance of the overlap of the targets of these two TFs using the hypergeometric test. By adopting four existing performance indices, we show that the fourth proposed algorithm (PA4) significantly outperforms the other three proposed algorithms. This suggests that the computational identification of cooperative TF pairs can indeed be improved by using a more biologically relevant way to define the targets of a TF.

  12. Properly defining the targets of a transcription factor significantly improves the computational identification of cooperative transcription factor pairs in yeast.

    PubMed

    Wu, Wei-Sheng; Lai, Fu-Jou

    2015-01-01

    Transcriptional regulation of gene expression in eukaryotes is usually accomplished by cooperative transcription factors (TFs). Computational identification of cooperative TF pairs has become a hot research topic and many algorithms have been proposed in the literature. A typical algorithm for predicting cooperative TF pairs has two steps. (Step 1) Define the targets of each TF under study. (Step 2) Design a measure for calculating the cooperativity of a TF pair based on the targets of these two TFs. While different algorithms have distinct sophisticated cooperativity measures, the targets of a TF are usually defined using ChIP-chip data. However, there is an inherent weakness in using ChIP-chip data to define the targets of a TF. ChIP-chip analysis can only identify the binding targets of a TF but it cannot distinguish the true regulatory from the binding but non-regulatory targets of a TF. This work is the first study which aims to investigate whether the performance of computational identification of cooperative TF pairs could be improved by using a more biologically relevant way to define the targets of a TF. For this purpose, we propose four simple algorithms, all of which consist of two steps. (Step 1) Define the targets of a TF using (i) ChIP-chip data in the first algorithm, (ii) TF binding data in the second algorithm, (iii) TF perturbation data in the third algorithm, and (iv) the intersection of TF binding and TF perturbation data in the fourth algorithm. Compared with the first three algorithms, the fourth algorithm uses a more biologically relevant way to define the targets of a TF. (Step 2) Measure the cooperativity of a TF pair by the statistical significance of the overlap of the targets of these two TFs using the hypergeometric test. By adopting four existing performance indices, we show that the fourth proposed algorithm (PA4) significantly outperforms the other three proposed algorithms. This suggests that the computational identification of cooperative TF pairs can indeed be improved by using a more biologically relevant way to define the targets of a TF.
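
    Step 2 of all four algorithms is the same overlap test: with N genes in total, a TF pair whose target sets of sizes a and b share k genes is scored by the hypergeometric tail probability P(overlap ≥ k). A self-contained sketch (variable names and the yeast-sized example are illustrative):

```python
from math import comb

def hypergeom_overlap_pvalue(n_genes, a, b, k):
    """P(overlap >= k) when a target set of size `a` and one of size `b`
    are drawn from `n_genes` genes; a small value means the two TFs share
    targets more often than chance, i.e. possible cooperativity."""
    total = comb(n_genes, b)
    upper = min(a, b)
    return sum(comb(a, i) * comb(n_genes - a, b - i)
               for i in range(k, upper + 1)) / total

# Hypothetical example: 6000 genes, TF1 has 100 targets, TF2 has 80,
# and the two sets share 10 genes (expected overlap by chance is ~1.3).
p = hypergeom_overlap_pvalue(6000, 100, 80, 10)
```

    An exact small case is easy to verify by hand: with 4 genes and two 2-gene target sets, the chance of a complete 2-gene overlap is 1/C(4,2) = 1/6.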

  13. Compiling high-level languages for configurable computers: applying lessons from heterogeneous processing

    NASA Astrophysics Data System (ADS)

    Weaver, Glen E.; Weems, Charles C.; McKinley, Kathryn S.

    1996-10-01

    Configurable systems offer increased performance by providing hardware that matches the computational structure of a problem. This hardware is currently programmed with CAD tools and explicit library calls. To attain widespread acceptance, configurable computing must become transparently accessible from high-level programming languages, but the changeable nature of the target hardware presents a major challenge to traditional compiler technology. A compiler for a configurable computer should optimize the use of functions embedded in hardware and schedule hardware reconfigurations. The hurdles to be overcome in achieving this capability are similar in some ways to those facing compilation for heterogeneous systems. For example, current traditional compilers have neither an interface to accept new primitive operators, nor a mechanism for applying optimizations to new operators. We are building a compiler for heterogeneous computing, called Scale, which replaces the traditional monolithic compiler architecture with a flexible framework. Scale has three main parts: a translation director, a compilation library, and a persistent store which holds our intermediate representation as well as other data structures. The translation director exploits the framework's flexibility by using architectural information to build a plan to direct each compilation. The compilation library serves as a toolkit for use by the translation director. Our compiler intermediate representation, Score, facilitates the addition of new IR nodes by distinguishing features used in defining nodes from properties on which transformations depend. In this paper, we present an overview of the Scale architecture and its capabilities for dealing with heterogeneity, followed by a discussion of how those capabilities apply to problems in configurable computing. We then address aspects of configurable computing that are likely to require extensions to our approach and propose some extensions.

  14. Human identification based on cranial computed tomography scan — a case report

    PubMed Central

    Silva, RF; Botelho, TL; Prado, FB; Kawagushi, JT; Daruge Júnior, E; Bérzin, F

    2011-01-01

    Today, there is increasing use of CT scanning on a clinical basis, aiding in the diagnosis of diseases or injuries. This exam also provides important information that allows identification of individuals. This paper reports the use of a CT scan on the skull, taken when the victim was alive, for the positive identification of a victim of a traffic accident in which the fingerprint analysis was impossible. The authors emphasize that the CT scan is a tool primarily used in clinical diagnosis and may contribute significantly to forensic purpose, allowing the exploration of virtual corpses before the classic autopsy. The use of CT scans might increase the quantity and quality of information involved in the death of the person examined. PMID:21493883

  15. Variance and bias computations for improved modal identification using ERA/DC

    NASA Technical Reports Server (NTRS)

    Longman, Richard W.; Lew, Jiann-Shiun; Tseng, Dong-Huei; Juang, Jer-Nan

    1991-01-01

    Variance and bias confidence criteria were recently developed for the eigensystem realization algorithm (ERA) identification technique. These criteria are extended for the modified version of ERA based on data correlation, ERA/DC, and also for the Q-Markov cover algorithm. The importance and usefulness of the variance and bias information are demonstrated in numerical studies. The criteria are shown to be very effective not only by indicating the accuracy of the identification results, especially in terms of confidence intervals, but also by helping the ERA user to obtain better results by seeing the effect of changing the sample time, adjusting the Hankel matrix dimension, choosing how many singular values to retain, deciding the model order, etc.

  16. A Comparative Study to Evaluate the Effectiveness of Computer Assisted Instruction (CAI) versus Class Room Lecture (CRL) for Computer Science at ICS Level

    ERIC Educational Resources Information Center

    Kausar, Tayyaba; Choudhry, Bushra Naoreen; Gujjar, Aijaz Ahmed

    2008-01-01

    This study was aimed to evaluate the effectiveness of CAI vs. classroom lecture for computer science at ICS level. The objectives were to compare the learning effects of two groups, with classroom lecture and computer assisted instruction, studying the same curriculum and the effects of CAI and CRL in terms of cognitive development. Hypothesis of…

  18. Comments and Criticism: Comment on "Identification of Student Misconceptions in Genetics Problem Solving via Computer Program."

    ERIC Educational Resources Information Center

    Smith, Mike U.

    1991-01-01

    Criticizes an article by Browning and Lehman (1988) for (1) using "gene" instead of allele, (2) misusing the word "misconception," and (3) the possible influences of the computer environment on the results of the study. (PR)

  19. Prediction of monthly regional groundwater levels through hybrid soft-computing techniques

    NASA Astrophysics Data System (ADS)

    Chang, Fi-John; Chang, Li-Chiu; Huang, Chien-Wei; Kao, I.-Feng

    2016-10-01

    Groundwater systems are intrinsically heterogeneous with dynamic temporal-spatial patterns, which cause great difficulty in quantifying their complex processes, while reliable predictions of regional groundwater levels are commonly needed for managing water resources to ensure proper service of water demands within a region. In this study, we proposed a novel and flexible soft-computing technique that could effectively extract the complex high-dimensional input-output patterns of basin-wide groundwater-aquifer systems in an adaptive manner. The soft-computing models combined the Self-Organizing Map (SOM) and the Nonlinear Autoregressive with Exogenous Inputs (NARX) network for predicting monthly regional groundwater levels based on hydrologic forcing data. The SOM could effectively classify the temporal-spatial patterns of regional groundwater levels, the NARX could accurately predict the mean of regional groundwater levels for adjusting the selected SOM, the Kriging was used to interpolate the predictions of the adjusted SOM into finer grids of locations, and consequently the prediction of a monthly regional groundwater level map could be obtained. The Zhuoshui River basin in Taiwan was the study case, and its monthly data sets collected from 203 groundwater stations, 32 rainfall stations and 6 flow stations between 2000 and 2013 were used for modelling purposes. The results demonstrated that the hybrid SOM-NARX model could reliably and suitably predict monthly basin-wide groundwater levels with high correlations (R2 > 0.9 in both training and testing cases). The proposed methodology presents a milestone in modelling regional environmental issues and offers an insightful and promising way to predict monthly basin-wide groundwater levels, which is beneficial to authorities for sustainable water resources management.
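
    The pattern-classification step rests on the standard SOM idea: each node holds a weight vector, the best-matching unit (BMU) for a sample is pulled toward it, and neighbouring nodes are pulled too, with learning rate and neighbourhood width decaying over training. A minimal pure-Python 1-D SOM, illustrative only (the paper's SOM-NARX pipeline and its data are far richer):

```python
import math
import random

def bmu_index(nodes, x):
    """Index of the best-matching unit (closest node) for sample x."""
    return min(range(len(nodes)),
               key=lambda i: sum((w - v) ** 2 for w, v in zip(nodes[i], x)))

def train_som(data, n_nodes, epochs=200, lr0=0.5, seed=0):
    """Minimal 1-D Self-Organizing Map: the BMU and its neighbours move
    toward each sample, with the learning rate and the Gaussian
    neighbourhood width shrinking as training proceeds."""
    rng = random.Random(seed)
    dim = len(data[0])
    nodes = [[rng.random() for _ in range(dim)] for _ in range(n_nodes)]
    sigma0 = max(n_nodes / 2.0, 1.0)
    for epoch in range(epochs):
        frac = 1.0 - epoch / epochs
        lr = lr0 * frac
        sigma = sigma0 * frac + 1e-9   # avoid division by zero at the end
        for x in data:
            bmu = bmu_index(nodes, x)
            for i in range(n_nodes):
                h = math.exp(-((i - bmu) ** 2) / (2.0 * sigma ** 2))
                nodes[i] = [w + lr * h * (v - w) for w, v in zip(nodes[i], x)]
    return nodes

# Two hypothetical clusters of 2-D "groundwater level patterns"
data = [[0.0, 0.1], [0.1, 0.0], [0.05, 0.05],
        [1.0, 0.9], [0.9, 1.0], [0.95, 0.95]]
nodes = train_som(data, n_nodes=2)
```

    After training, `bmu_index` assigns any new pattern to its nearest node, which is the classification the NARX correction and Kriging interpolation steps would then build on.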

  20. Systems Level Analysis and Identification of Pathways and Networks Associated with Liver Fibrosis

    PubMed Central

    AbdulHameed, Mohamed Diwan M.; Tawa, Gregory J.; Kumar, Kamal; Ippolito, Danielle L.; Lewis, John A.; Stallings, Jonathan D.; Wallqvist, Anders

    2014-01-01

    Toxic liver injury causes necrosis and fibrosis, which may lead to cirrhosis and liver failure. Despite recent progress in understanding the mechanism of liver fibrosis, our knowledge of the molecular-level details of this disease is still incomplete. The elucidation of networks and pathways associated with liver fibrosis can provide insight into the underlying molecular mechanisms of the disease, as well as identify potential diagnostic or prognostic biomarkers. Towards this end, we analyzed rat gene expression data from a range of chemical exposures that produced observable periportal liver fibrosis as documented in DrugMatrix, a publicly available toxicogenomics database. We identified genes relevant to liver fibrosis using standard differential expression and co-expression analyses, and then used these genes in pathway enrichment and protein-protein interaction (PPI) network analyses. We identified a PPI network module associated with liver fibrosis that includes known liver fibrosis-relevant genes, such as tissue inhibitor of metalloproteinase-1, galectin-3, connective tissue growth factor, and lipocalin-2. We also identified several new genes, such as perilipin-3, legumain, and myocilin, which were associated with liver fibrosis. We further analyzed the expression pattern of the genes in the PPI network module across a wide range of 640 chemical exposure conditions in DrugMatrix and identified early indications of liver fibrosis for carbon tetrachloride and lipopolysaccharide exposures. Although it is well known that carbon tetrachloride and lipopolysaccharide can cause liver fibrosis, our network analysis was able to link these compounds to potential fibrotic damage before histopathological changes associated with liver fibrosis appeared. These results demonstrated that our approach is capable of identifying early-stage indicators of liver fibrosis and underscore its potential to aid in predictive toxicity, biomarker identification, and to generally identify

  1. Survey of computed tomography scanners in Taiwan: Dose descriptors, dose guidance levels, and effective doses

    SciTech Connect

    Tsai, H. Y.; Tung, C. J.; Yu, C. C.; Tyan, Y. S.

    2007-04-15

    The IAEA and the ICRP recommended dose guidance levels for the most frequent computed tomography (CT) examinations to promote strategies for the optimization of radiation dose to CT patients. A national survey, including on-site measurements and questionnaires, was conducted in Taiwan in order to establish dose guidance levels and evaluate effective doses for CT. The beam quality and output and the phantom doses were measured for nine representative CT scanners. Questionnaire forms were completed by respondents from facilities of 146 CT scanners out of 285 total scanners. Information on patient, procedure, scanner, and technique for the head and body examinations was provided. The weighted computed tomography dose index (CTDIw), the dose-length product (DLP), organ doses and effective dose were calculated using measured data, questionnaire information and Monte Carlo simulation results. A cost-effective analysis was applied to derive the dose guidance levels on CTDIw and DLP for several CT examinations. The mean effective dose ± standard deviation ranges from 1.6±0.9 mSv for the routine head examination to 13±11 mSv for the examination of liver, spleen, and pancreas. The surveyed results and the dose guidance levels were provided to the national authorities to develop quality control standards and protocols for CT examinations.
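
    The surveyed dose descriptors chain together in a standard way: CTDIw weights centre and periphery phantom measurements, DLP scales CTDIw by the irradiated length, and effective dose follows from DLP via a region-specific conversion coefficient. A sketch (the sample numbers and the k coefficient are illustrative values of the usual magnitudes, not the survey's data):

```python
def ctdi_w(ctdi_center, ctdi_periphery):
    """Weighted CT dose index (mGy): 1/3 centre + 2/3 periphery measurements
    of the standard head or body dosimetry phantom."""
    return ctdi_center / 3.0 + 2.0 * ctdi_periphery / 3.0

def dose_length_product(ctdiw, scan_length_cm):
    """DLP (mGy*cm): CTDIw times the irradiated scan length."""
    return ctdiw * scan_length_cm

def effective_dose(dlp, k):
    """Effective dose (mSv): DLP times a region-specific conversion
    coefficient k (mSv per mGy*cm), roughly 0.0023 for head examinations."""
    return dlp * k

# Hypothetical routine head examination
w = ctdi_w(40.0, 44.0)                # ~42.7 mGy
dlp = dose_length_product(w, 14.0)    # ~597 mGy*cm
e = effective_dose(dlp, 0.0023)       # ~1.4 mSv
```

    The result lands near the 1.6 mSv mean the survey reports for routine head examinations, which is the kind of sanity check such conversion coefficients are meant to support.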

  2. Computational identification of freezing zones in building materials exposed to low temperatures

    NASA Astrophysics Data System (ADS)

    Kočí, Jan; Maděra, Jiří

    2017-07-01

    This paper demonstrates a newly developed computational model for determining the zones of a building material that are critical in terms of possible frost-induced damage. The model analyzes hygrothermal conditions in the investigated material together with its pore characteristics and returns, as output, the amount of moisture that can solidify. The capability of the model is demonstrated in a simple computational simulation of a wall corner made of sandstone.

  3. Comparative phyloinformatics of virus genes at micro and macro levels in a distributed computing environment

    PubMed Central

    Singh, Dadabhai T; Trehan, Rahul; Schmidt, Bertil; Bretschneider, Timo

    2008-01-01

    Background Preparedness for a possible global pandemic caused by viruses such as the highly pathogenic influenza A subtype H5N1 has become a global priority. In particular, it is critical to monitor the appearance of any new emerging subtypes. Comparative phyloinformatics can be used to monitor, analyze, and possibly predict the evolution of viruses. However, in order to utilize the full functionality of available analysis packages for large-scale phyloinformatics studies, a team of computer scientists, biostatisticians and virologists is needed – a requirement which cannot be fulfilled in many cases. Furthermore, the time complexities of many algorithms involved lead to prohibitive runtimes on sequential computer platforms. This has so far hindered the use of comparative phyloinformatics as a commonly applied tool in this area. Results In this paper the graphical-oriented workflow design system called Quascade and its efficient usage for comparative phyloinformatics are presented. In particular, we focus on how this task can be effectively performed in a distributed computing environment. As a proof of concept, the designed workflows are used for the phylogenetic analysis of neuraminidase of H5N1 isolates (micro level) and influenza viruses (macro level). The results of this paper are hence twofold. Firstly, this paper demonstrates the usefulness of a graphical user interface system to design and execute complex distributed workflows for large-scale phyloinformatics studies of virus genes. Secondly, the analysis of neuraminidase on different levels of complexity provides valuable insights of this virus's tendency for geographical based clustering in the phylogenetic tree and also shows the importance of glycan sites in its molecular evolution. Conclusion The current study demonstrates the efficiency and utility of workflow systems providing a biologist friendly approach to complex biological dataset analysis using high performance computing. In particular, the

  4. Comparative phyloinformatics of virus genes at micro and macro levels in a distributed computing environment.

    PubMed

    Singh, Dadabhai T; Trehan, Rahul; Schmidt, Bertil; Bretschneider, Timo

    2008-01-01

    Preparedness for a possible global pandemic caused by viruses such as the highly pathogenic influenza A subtype H5N1 has become a global priority. In particular, it is critical to monitor the appearance of any new emerging subtypes. Comparative phyloinformatics can be used to monitor, analyze, and possibly predict the evolution of viruses. However, in order to utilize the full functionality of available analysis packages for large-scale phyloinformatics studies, a team of computer scientists, biostatisticians and virologists is needed--a requirement which cannot be fulfilled in many cases. Furthermore, the time complexities of many algorithms involved lead to prohibitive runtimes on sequential computer platforms. This has so far hindered the use of comparative phyloinformatics as a commonly applied tool in this area. In this paper the graphical-oriented workflow design system called Quascade and its efficient usage for comparative phyloinformatics are presented. In particular, we focus on how this task can be effectively performed in a distributed computing environment. As a proof of concept, the designed workflows are used for the phylogenetic analysis of neuraminidase of H5N1 isolates (micro level) and influenza viruses (macro level). The results of this paper are hence twofold. Firstly, this paper demonstrates the usefulness of a graphical user interface system to design and execute complex distributed workflows for large-scale phyloinformatics studies of virus genes. Secondly, the analysis of neuraminidase on different levels of complexity provides valuable insights of this virus's tendency for geographical based clustering in the phylogenetic tree and also shows the importance of glycan sites in its molecular evolution. The current study demonstrates the efficiency and utility of workflow systems providing a biologist friendly approach to complex biological dataset analysis using high performance computing. In particular, the utility of the platform Quascade

  5. MGEScan-non-LTR: computational identification and classification of autonomous non-LTR retrotransposons in eukaryotic genomes

    PubMed Central

    Rho, Mina; Tang, Haixu

    2009-01-01

    Computational methods for genome-wide identification of mobile genetic elements (MGEs) have become increasingly necessary for both genome annotation and evolutionary studies. Non-long terminal repeat (non-LTR) retrotransposons are a class of MGEs that have been found in most eukaryotic genomes, sometimes in extremely high numbers. In this article, we present a computational tool, MGEScan-non-LTR, for the identification of non-LTR retrotransposons in genomic sequences, following a computational approach inspired by a generalized hidden Markov model (GHMM). Three different states represent two different protein domains and inter-domain linker regions encoded in the non-LTR retrotransposons, and their scores are evaluated by using profile hidden Markov models (for protein domains) and Gaussian Bayes classifiers (for linker regions), respectively. In order to classify the non-LTR retrotransposons into one of the 12 previously characterized clades using the same model, we defined separate states for different clades. MGEScan-non-LTR was tested on the genome sequences of four eukaryotic organisms, Drosophila melanogaster, Daphnia pulex, Ciona intestinalis and Strongylocentrotus purpuratus. For the D. melanogaster genome, MGEScan-non-LTR found all known ‘full-length’ elements and simultaneously classified them into the clades CR1, I, Jockey, LOA and R1. Notably, for the D. pulex genome, in which no non-LTR retrotransposon has been annotated, MGEScan-non-LTR found a significantly larger number of elements than did RepeatMasker, using the current version of the RepBase Update library. We also identified novel elements in the other two genomes, which have only been partially studied for non-LTR retrotransposons. PMID:19762481
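    The GHMM described above scores linker regions with Gaussian Bayes classifiers. A minimal sketch of such a maximum-posterior Gaussian decision rule is below; the class labels, priors, means and variances are all hypothetical, not taken from MGEScan-non-LTR:

    ```python
    import math

    def gaussian_loglik(x, mu, sigma):
        """Log-likelihood of a scalar feature under a normal distribution."""
        return -0.5 * math.log(2 * math.pi * sigma ** 2) - (x - mu) ** 2 / (2 * sigma ** 2)

    def classify(x, params):
        """Maximum-posterior class for feature x.
        params maps class label -> (prior, mu, sigma); all values here are hypothetical."""
        return max(params, key=lambda c: math.log(params[c][0])
                   + gaussian_loglik(x, params[c][1], params[c][2]))

    # Hypothetical linker vs. background score distributions
    params = {"linker": (0.5, 0.0, 1.0), "background": (0.5, 5.0, 1.0)}
    print(classify(0.2, params))  # closer to the linker mean
    ```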

  6. Contracted basis Lanczos methods for computing numerically exact rovibrational levels of methane

    NASA Astrophysics Data System (ADS)

    Wang, Xiao-Gang; Carrington, Tucker

    2004-08-01

    We present a numerically exact calculation of rovibrational levels of a five-atom molecule. Two contracted basis Lanczos strategies are proposed. The first and preferred strategy is a two-stage contraction. Products of eigenfunctions of a four-dimensional (4D) stretch problem and eigenfunctions of 5D bend-rotation problems, one for each K, are used as basis functions for computing eigenfunctions and eigenvalues (for each K) of the Hamiltonian without the Coriolis coupling term, denoted H0. Finally, energy levels of the full Hamiltonian are calculated in a basis of the eigenfunctions of H0. The second strategy is a one-stage contraction in which energy levels of the full Hamiltonian are computed in the product contracted basis (without first computing eigenfunctions of H0). The two-stage contraction strategy, albeit more complicated, has the crucial advantage that it is trivial to parallelize the calculation so that the CPU and memory costs are independent of J. For the one-stage contraction strategy the CPU and memory costs of the difficult part of the calculation scale linearly with J. We use the polar coordinates associated with orthogonal Radau vectors and spherical harmonic type rovibrational basis functions. A parity-adapted rovibrational basis suitable for a five-atom molecule is proposed and employed to obtain bend-rotation eigenfunctions in the first step of both contraction methods. The effectiveness of the two methods is demonstrated by calculating a large number of converged J=1 rovibrational levels of methane using a global potential energy surface.

  7. Contracted basis Lanczos methods for computing numerically exact rovibrational levels of methane.

    PubMed

    Wang, Xiao-Gang; Carrington, Tucker

    2004-08-15

    We present a numerically exact calculation of rovibrational levels of a five-atom molecule. Two contracted basis Lanczos strategies are proposed. The first and preferred strategy is a two-stage contraction. Products of eigenfunctions of a four-dimensional (4D) stretch problem and eigenfunctions of 5D bend-rotation problems, one for each K, are used as basis functions for computing eigenfunctions and eigenvalues (for each K) of the Hamiltonian without the Coriolis coupling term, denoted H0. Finally, energy levels of the full Hamiltonian are calculated in a basis of the eigenfunctions of H0. The second strategy is a one-stage contraction in which energy levels of the full Hamiltonian are computed in the product contracted basis (without first computing eigenfunctions of H0). The two-stage contraction strategy, albeit more complicated, has the crucial advantage that it is trivial to parallelize the calculation so that the CPU and memory costs are independent of J. For the one-stage contraction strategy the CPU and memory costs of the difficult part of the calculation scale linearly with J. We use the polar coordinates associated with orthogonal Radau vectors and spherical harmonic type rovibrational basis functions. A parity-adapted rovibrational basis suitable for a five-atom molecule is proposed and employed to obtain bend-rotation eigenfunctions in the first step of both contraction methods. The effectiveness of the two methods is demonstrated by calculating a large number of converged J = 1 rovibrational levels of methane using a global potential energy surface.
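    The workhorse of both contraction strategies is the Lanczos algorithm, which extracts extremal eigenvalues of a large symmetric matrix from a short three-term recursion. A generic NumPy sketch is below; a dense random symmetric matrix stands in for the actual rovibrational Hamiltonian, and full reorthogonalization is added for numerical stability:

    ```python
    import numpy as np

    def lanczos(A, v0, m):
        """Build the m-step Lanczos tridiagonal matrix T for a symmetric A."""
        n = A.shape[0]
        V = np.zeros((n, m))
        alpha = np.zeros(m)
        beta = np.zeros(m - 1)
        v = v0 / np.linalg.norm(v0)
        V[:, 0] = v
        w = A @ v
        alpha[0] = v @ w
        w = w - alpha[0] * v
        for j in range(1, m):
            beta[j - 1] = np.linalg.norm(w)
            v = w / beta[j - 1]
            V[:, j] = v
            w = A @ v
            alpha[j] = v @ w
            w = w - alpha[j] * v - beta[j - 1] * V[:, j - 1]
            # full reorthogonalization against all previous Lanczos vectors
            w = w - V[:, :j + 1] @ (V[:, :j + 1].T @ w)
        return np.diag(alpha) + np.diag(beta, 1) + np.diag(beta, -1)

    rng = np.random.default_rng(0)
    H = rng.standard_normal((200, 200))
    H = (H + H.T) / 2          # toy symmetric "Hamiltonian"
    T = lanczos(H, rng.standard_normal(200), 60)
    ritz = np.linalg.eigvalsh(T)
    exact = np.linalg.eigvalsh(H)
    print(abs(ritz[0] - exact[0]))  # extremal Ritz value converges quickly
    ```

    In practice only matrix-vector products with the Hamiltonian are needed, which is what makes the contracted-basis approach memory-efficient.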

  8. Effect of Computer Simulations at the Particulate and Macroscopic Levels on Students' Understanding of the Particulate Nature of Matter

    ERIC Educational Resources Information Center

    Tang, Hui; Abraham, Michael R.

    2016-01-01

    Computer-based simulations can help students visualize chemical representations and understand chemistry concepts, but simulations at different levels of representation may vary in effectiveness on student learning. This study investigated the influence of computer activities that simulate chemical reactions at different levels of representation…

  10. New Fe i Level Energies and Line Identifications from Stellar Spectra. II. Initial Results from New Ultraviolet Spectra of Metal-poor Stars

    NASA Astrophysics Data System (ADS)

    Peterson, Ruth C.; Kurucz, Robert L.; Ayres, Thomas R.

    2017-04-01

    The Fe i spectrum is critical to many areas of astrophysics, yet many of the high-lying levels remain uncharacterized. To remedy this deficiency, Peterson & Kurucz identified Fe i lines in archival ultraviolet and optical spectra of metal-poor stars, whose warm temperatures favor moderate Fe i excitation. Sixty-five new levels were recovered, with 1500 detectable lines, including several bound levels in the ionization continuum of Fe i. Here, we extend the previous work by identifying 59 additional levels, with 1400 detectable lines, by incorporating new high-resolution UV spectra of warm metal-poor stars recently obtained by the Hubble Space Telescope Imaging Spectrograph. We provide gf values for these transitions, both computed as well as adjusted to fit the stellar spectra. We also expand our spectral calculations to the infrared, confirming three levels by matching high-quality spectra of the Sun and two cool stars in the H-band. The predicted gf values suggest that an additional 3700 Fe i lines should be detectable in existing solar infrared spectra. Extending the empirical line identification work to the infrared would help confirm additional Fe i levels, as would new high-resolution UV spectra of metal-poor turnoff stars below 1900 Å.

  11. Dose Assessment in Computed Tomography Examination and Establishment of Local Diagnostic Reference Levels in Mazandaran, Iran

    PubMed Central

    Janbabanezhad Toori, A.; Shabestani-Monfared, A.; Deevband, M.R.; Abdi, R.; Nabahati, M.

    2015-01-01

    Background Medical X-rays are the largest man-made source of public exposure to ionizing radiation. While the benefits of Computed Tomography (CT) in accurate diagnosis are well known, those benefits are not risk-free: CT delivers a higher patient dose than other conventional radiographic procedures. Objective This study aimed to evaluate radiation dose to patients from Computed Tomography (CT) examinations in Mazandaran hospitals and to define diagnostic reference levels (DRLs). Methods Patient-related data on CT protocols for four common CT examinations, including brain, sinus, chest and abdomen & pelvis, were collected. In each center, Computed Tomography Dose Index (CTDI) measurements were performed using a pencil ionization chamber and a CT dosimetry phantom according to AAPM Report No. 96 for those techniques. Then, the Weighted Computed Tomography Dose Index (CTDIw), Volume Computed Tomography Dose Index (CTDIvol) and Dose Length Product (DLP) were calculated. Results The CTDIw for brain, sinus, chest and abdomen & pelvis ranged over (15.6-73), (3.8-25.8), (4.5-16.3) and (7-16.3), respectively. Values of DLP had a range of (197.4-981), (41.8-184), (131-342.3) and (283.6-486) for brain, sinus, chest and abdomen & pelvis, respectively. The 3rd quartile of CTDIw, derived from the dose distribution for each examination, is the proposed quantity for the DRL. The DRLs of brain, sinus, chest and abdomen & pelvis were measured as 59.5, 17, 7.8 and 11 mGy, respectively. Conclusion Results of this study demonstrated a wide variation in dose for the same examination among different centers. For all examinations, our values were lower than international reference doses. PMID:26688796
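    The dose quantities in this record follow the standard AAPM Report 96 definitions: CTDIw weights the centre and periphery measurements 1:2, CTDIvol divides by pitch, DLP multiplies by scan length, and the DRL is taken as the third quartile of the dose distribution across centres. A sketch with hypothetical survey numbers:

    ```python
    from statistics import quantiles

    def ctdi_w(center_mGy, periphery_mGy):
        """Weighted CTDI: 1/3 centre + 2/3 periphery (AAPM Report 96)."""
        return center_mGy / 3 + 2 * periphery_mGy / 3

    def ctdi_vol(ctdiw_mGy, pitch):
        """Volume CTDI for a helical scan."""
        return ctdiw_mGy / pitch

    def dlp(ctdivol_mGy, scan_length_cm):
        """Dose-length product, mGy*cm."""
        return ctdivol_mGy * scan_length_cm

    def drl(dose_values):
        """Diagnostic reference level: 3rd quartile of the dose distribution."""
        return quantiles(dose_values, n=4, method="inclusive")[2]

    # Hypothetical brain-CT CTDIw survey across six centres (mGy)
    survey = [15.6, 28.0, 41.2, 55.0, 60.3, 73.0]
    print(ctdi_w(10, 20), drl(survey))
    ```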

  12. A Framework for Different Levels of Integration of Computational Models Into Web-Based Virtual Patients

    PubMed Central

    Narracott, Andrew J; Manini, Simone; Bayley, Martin J; Lawford, Patricia V; McCormack, Keith; Zary, Nabil

    2014-01-01

    Background Virtual patients are increasingly common tools used in health care education to foster learning of clinical reasoning skills. One potential way to expand their functionality is to augment virtual patients’ interactivity by enriching them with computational models of physiological and pathological processes. Objective The primary goal of this paper was to propose a conceptual framework for the integration of computational models within virtual patients, with particular focus on (1) characteristics to be addressed while preparing the integration, (2) the extent of the integration, (3) strategies to achieve integration, and (4) methods for evaluating the feasibility of integration. An additional goal was to pilot the first investigation of changing framework variables on altering perceptions of integration. Methods The framework was constructed using an iterative process informed by Soft System Methodology. The Virtual Physiological Human (VPH) initiative has been used as a source of new computational models. The technical challenges associated with development of virtual patients enhanced by computational models are discussed from the perspectives of a number of different stakeholders. Concrete design and evaluation steps are discussed in the context of an exemplar virtual patient employing the results of the VPH ARCH project, as well as improvements for future iterations. Results The proposed framework consists of four main elements. The first element is a list of feasibility features characterizing the integration process from three perspectives: the computational modelling researcher, the health care educationalist, and the virtual patient system developer. The second element included three integration levels: basic, where a single set of simulation outcomes is generated for specific nodes in the activity graph; intermediate, involving pre-generation of simulation datasets over a range of input parameters; advanced, including dynamic solution of the

  13. A framework for different levels of integration of computational models into web-based virtual patients.

    PubMed

    Kononowicz, Andrzej A; Narracott, Andrew J; Manini, Simone; Bayley, Martin J; Lawford, Patricia V; McCormack, Keith; Zary, Nabil

    2014-01-23

    Virtual patients are increasingly common tools used in health care education to foster learning of clinical reasoning skills. One potential way to expand their functionality is to augment virtual patients' interactivity by enriching them with computational models of physiological and pathological processes. The primary goal of this paper was to propose a conceptual framework for the integration of computational models within virtual patients, with particular focus on (1) characteristics to be addressed while preparing the integration, (2) the extent of the integration, (3) strategies to achieve integration, and (4) methods for evaluating the feasibility of integration. An additional goal was to pilot the first investigation of changing framework variables on altering perceptions of integration. The framework was constructed using an iterative process informed by Soft System Methodology. The Virtual Physiological Human (VPH) initiative has been used as a source of new computational models. The technical challenges associated with development of virtual patients enhanced by computational models are discussed from the perspectives of a number of different stakeholders. Concrete design and evaluation steps are discussed in the context of an exemplar virtual patient employing the results of the VPH ARCH project, as well as improvements for future iterations. The proposed framework consists of four main elements. The first element is a list of feasibility features characterizing the integration process from three perspectives: the computational modelling researcher, the health care educationalist, and the virtual patient system developer. The second element included three integration levels: basic, where a single set of simulation outcomes is generated for specific nodes in the activity graph; intermediate, involving pre-generation of simulation datasets over a range of input parameters; advanced, including dynamic solution of the model. The third element is the

  14. The utility of including pathology reports in improving the computational identification of patients

    PubMed Central

    Chen, Wei; Huang, Yungui; Boyle, Brendan; Lin, Simon

    2016-01-01

    Background: Celiac disease (CD) is a common autoimmune disorder. Efficient identification of patients may improve chronic management of the disease. Prior studies have shown that searching International Classification of Diseases-9 (ICD-9) codes alone is inaccurate for identifying patients with CD. In this study, we developed automated classification algorithms leveraging pathology reports and other clinical data in Electronic Health Records (EHRs) to refine the subset population preselected using the ICD-9 code (579.0). Materials and Methods: EHRs were searched for the established ICD-9 code (579.0) suggesting CD, based on which an initial identification of cases was obtained. In addition, laboratory results for tissue transglutaminase were extracted. Using natural language processing, we analyzed pathology reports from upper endoscopy. Twelve machine learning classifiers using different combinations of variables related to ICD-9 CD status, laboratory result status, and pathology reports were evaluated to find the best possible CD classifier. Ten-fold cross-validation was used to assess the results. Results: A total of 1498 patient records were used, including 363 confirmed cases and 1135 false positive cases that served as controls. A logistic model based on both clinical and pathology report features produced the best results: Kappa of 0.78, F1 of 0.92, and area under the curve (AUC) of 0.94, whereas using ICD-9 alone generated poor results: Kappa of 0.28, F1 of 0.75, and AUC of 0.63. Conclusion: Our automated classification system presented an efficient and reliable way to improve the performance of CD patient identification. PMID:27994938
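    The reported metrics (Cohen's Kappa and F1) can be computed directly from a binary confusion matrix. A self-contained sketch follows; the counts are illustrative only, not the study's data:

    ```python
    def f1_score(tp, fp, fn):
        """Harmonic mean of precision and recall for the positive class."""
        precision = tp / (tp + fp)
        recall = tp / (tp + fn)
        return 2 * precision * recall / (precision + recall)

    def cohens_kappa(tp, fp, fn, tn):
        """Observed agreement corrected for chance agreement (2x2 matrix)."""
        n = tp + fp + fn + tn
        p_observed = (tp + tn) / n
        p_expected = ((tp + fp) * (tp + fn) + (fn + tn) * (fp + tn)) / n ** 2
        return (p_observed - p_expected) / (1 - p_expected)

    # Illustrative counts: 40 true positives, 10 false positives,
    # 10 false negatives, 40 true negatives
    print(f1_score(40, 10, 10), cohens_kappa(40, 10, 10, 40))
    ```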

  15. Systematic toxicological analysis: computer-assisted identification of poisons in biological materials.

    PubMed

    Stimpfl, Th; Demuth, W; Varmuza, K; Vycudilik, W

    2003-06-05

    New software was developed to improve the chances of identifying a "general unknown" in complex biological materials. To achieve this goal, the total ion current chromatogram was simplified by filtering the acquired mass spectra via an automated subtraction procedure, which removed mass spectra originating from the sample matrix, as well as interfering substances from the extraction procedure. It could be shown that this tool emphasizes mass spectra of exceptional compounds, and therefore provides the forensic toxicologist with further evidence, even in cases where mass spectral data of the unknown compound are not available in "standard" spectral libraries.

  16. A Feature Selection Algorithm to Compute Gene Centric Methylation from Probe Level Methylation Data

    PubMed Central

    Baur, Brittany; Bozdag, Serdar

    2016-01-01

    DNA methylation is an important epigenetic event that affects gene expression during development and various diseases such as cancer. Understanding the mechanism of action of DNA methylation is important for downstream analysis. In the Illumina Infinium HumanMethylation 450K array, there are tens of probes associated with each gene. Given methylation intensities of all these probes, it is necessary to compute which of these probes are most representative of the gene centric methylation level. In this study, we developed a feature selection algorithm based on sequential forward selection that utilized different classification methods to compute gene centric DNA methylation using probe level DNA methylation data. We compared our algorithm to other feature selection algorithms such as support vector machines with recursive feature elimination, genetic algorithms and ReliefF. We evaluated all methods based on the predictive power of selected probes on their mRNA expression levels and found that a K-Nearest Neighbors classification using the sequential forward selection algorithm performed better than other algorithms based on all metrics. We also observed that transcriptional activities of certain genes were more sensitive to DNA methylation changes than transcriptional activities of other genes. Our algorithm was able to predict the expression of those genes with high accuracy using only DNA methylation data. Our results also showed that those DNA methylation-sensitive genes were enriched in Gene Ontology terms related to the regulation of various biological processes. PMID:26872146
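    Sequential forward selection, the wrapper strategy at the heart of this algorithm, greedily grows the probe set while the classifier's score improves. A generic sketch follows; the toy scoring function stands in for the cross-validated KNN accuracy used in the paper, and the probe names are hypothetical:

    ```python
    def forward_select(features, score_fn, k_max):
        """Greedy sequential forward selection: repeatedly add the feature
        that most improves score_fn; stop when nothing improves or k_max
        features have been chosen."""
        selected, remaining = [], list(features)
        while remaining and len(selected) < k_max:
            best = max(remaining, key=lambda f: score_fn(selected + [f]))
            if selected and score_fn(selected + [best]) <= score_fn(selected):
                break  # no candidate improves the current score
            selected.append(best)
            remaining.remove(best)
        return selected

    # Toy score: overlap with a hypothetical set of informative probes
    informative = {"probe_a", "probe_c"}
    score = lambda subset: len(set(subset) & informative)
    print(forward_select(["probe_a", "probe_b", "probe_c"], score, 3))
    ```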

  17. A Feature Selection Algorithm to Compute Gene Centric Methylation from Probe Level Methylation Data.

    PubMed

    Baur, Brittany; Bozdag, Serdar

    2016-01-01

    DNA methylation is an important epigenetic event that affects gene expression during development and various diseases such as cancer. Understanding the mechanism of action of DNA methylation is important for downstream analysis. In the Illumina Infinium HumanMethylation 450K array, there are tens of probes associated with each gene. Given methylation intensities of all these probes, it is necessary to compute which of these probes are most representative of the gene centric methylation level. In this study, we developed a feature selection algorithm based on sequential forward selection that utilized different classification methods to compute gene centric DNA methylation using probe level DNA methylation data. We compared our algorithm to other feature selection algorithms such as support vector machines with recursive feature elimination, genetic algorithms and ReliefF. We evaluated all methods based on the predictive power of selected probes on their mRNA expression levels and found that a K-Nearest Neighbors classification using the sequential forward selection algorithm performed better than other algorithms based on all metrics. We also observed that transcriptional activities of certain genes were more sensitive to DNA methylation changes than transcriptional activities of other genes. Our algorithm was able to predict the expression of those genes with high accuracy using only DNA methylation data. Our results also showed that those DNA methylation-sensitive genes were enriched in Gene Ontology terms related to the regulation of various biological processes.

  18. EUDOC: a computer program for identification of drug interaction sites in macromolecules and drug leads from chemical databases.

    PubMed

    Pang, Yuan-Ping; Perola, Emanuele; Xu, Kun; Prendergast, Franklyn G.

    2001-11-30

    The completion of the Human Genome Project, the growing effort on proteomics, and the Structural Genomics Initiative have recently intensified the attention being paid to reliable computer docking programs able to identify molecules that can affect the function of a macromolecule through molecular complexation. We report herein an automated computer docking program, EUDOC, for prediction of ligand-receptor complexes from 3D receptor structures, including metalloproteins, and for identification of a subset enriched in drug leads from chemical databases. This program was evaluated from the standpoints of force field and sampling issues using 154 experimentally determined ligand-receptor complexes and four "real-life" applications of the EUDOC program. The results provide evidence for the reliability and accuracy of the EUDOC program. In addition, key principles underlying molecular recognition, and the effects of structural water molecules in the active site and different atomic charge models on docking results are discussed. Copyright 2001 John Wiley & Sons, Inc. J Comput Chem 22: 1750-1771, 2001

  19. K-nearest neighbors based methods for identification of different gear crack levels under different motor speeds and loads: Revisited

    NASA Astrophysics Data System (ADS)

    Wang, Dong

    2016-03-01

    Gears are the most commonly used components in mechanical transmission systems. Their failures may cause transmission system breakdown and result in economic loss. Identification of different gear crack levels is important to prevent any unexpected gear failure, because gear cracks lead to gear tooth breakage. Signal processing based methods mainly require expertise to interpret gear fault signatures, which is usually not easy for ordinary users. In order to automatically identify different gear crack levels, intelligent gear crack identification methods should be developed. Previous case studies experimentally proved that K-nearest neighbors based methods exhibit high prediction accuracies for identification of 3 different gear crack levels under different motor speeds and loads. In this short communication, to further enhance the prediction accuracies of existing K-nearest neighbors based methods and to extend identification from 3 to 5 different gear crack levels, redundant statistical features are constructed by using the Daubechies 44 (db44) binary wavelet packet transform at different wavelet decomposition levels, prior to the use of a K-nearest neighbors method. The dimensionality of the redundant statistical features is 620, which provides richer gear fault signatures. Since many of these statistical features are redundant and highly correlated with each other, dimensionality reduction of the redundant statistical features is conducted to obtain new significant statistical features. Finally, the K-nearest neighbors method is used to identify 5 different gear crack levels under different motor speeds and loads. A case study including 3 experiments is investigated to demonstrate that the developed method provides higher prediction accuracies than the existing K-nearest neighbors based methods for recognizing different gear crack levels under different motor speeds and loads. Based on the new significant statistical
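    The K-nearest neighbors vote underlying the method is straightforward: classify a sample by the majority label among its k nearest training samples. A minimal Euclidean-distance sketch follows, with toy two-dimensional feature vectors standing in for the paper's wavelet-packet statistical features; the labels are hypothetical:

    ```python
    from collections import Counter

    def knn_predict(train, x, k=3):
        """Majority vote among the k training points nearest to x.
        train is a list of (feature_vector, label) pairs."""
        sq_dist = lambda a, b: sum((ai - bi) ** 2 for ai, bi in zip(a, b))
        nearest = sorted(train, key=lambda pair: sq_dist(pair[0], x))[:k]
        return Counter(label for _, label in nearest).most_common(1)[0][0]

    # Toy two-class data standing in for gear-crack feature vectors
    train = [((0, 0), "crack_0"), ((0, 1), "crack_0"), ((0.5, 0), "crack_0"),
             ((5, 5), "crack_1"), ((5, 6), "crack_1"), ((6, 5), "crack_1")]
    print(knn_predict(train, (0.3, 0.4)))
    ```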

  20. 3D Multislice and Cone-beam Computed Tomography Systems for Dental Identification.

    PubMed

    Eliášová, Hana; Dostálová, Taťjana

    3D Multislice and Cone-beam computed tomography (CBCT) in forensic odontology has been shown to be useful not only in cases involving one or a few bodies but also in multiple fatality incidents. 3D Multislice and Cone-beam computed tomography and digital radiography were demonstrated in a forensic examination form. 3D images of the skull and teeth were analysed and validated for long ante mortem/post mortem intervals. The image acquisition was instantaneous; the images could be optically enlarged, measured, superimposed and compared prima vista or using special software, and exported as a file. Digital radiology and computed tomography have been shown to be important both in common criminalistic practice and in multiple fatality incidents. Our study demonstrated that CBCT imaging offers fewer image artifacts, low image reconstruction times, mobility of the unit and considerably lower equipment cost.

  1. TPASS: a gamma-ray spectrum analysis and isotope identification computer code

    SciTech Connect

    Dickens, J.K.

    1981-03-01

    The gamma-ray spectral data-reduction and analysis computer code TPASS is described. This computer code is used to analyze complex Ge(Li) gamma-ray spectra to obtain peak areas corrected for detector efficiencies, from which gamma-ray yields are determined. These yields are compared with an isotope gamma-ray data file to determine the contributions to the observed spectrum from the decay of specific radionuclides. A complete FORTRAN listing of the code and a complex test case are given.
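    The step from an efficiency-corrected peak area to a yield follows the standard counting relation: counts divided by detector efficiency, emission probability, and live time. A sketch of that single step (TPASS itself is FORTRAN; Python is used here for brevity, and all numbers are hypothetical):

    ```python
    def gamma_yield_bq(net_peak_area, efficiency, branching_ratio, live_time_s):
        """Decay rate (Bq) inferred from a net full-energy peak area:
        counts / (detector efficiency * gamma emission probability * live time)."""
        return net_peak_area / (efficiency * branching_ratio * live_time_s)

    # Hypothetical peak: 1000 net counts, 1% efficiency, 85% branching, 1 h live time
    print(gamma_yield_bq(1000, 0.01, 0.85, 3600))
    ```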

  2. Crowding in Cellular Environments at an Atomistic Level from Computer Simulations

    PubMed Central

    2017-01-01

    The effects of crowding in biological environments on biomolecular structure, dynamics, and function remain not well understood. Computer simulations of atomistic models of concentrated peptide and protein systems at different levels of complexity are beginning to provide new insights. Crowding, weak interactions with other macromolecules and metabolites, and altered solvent properties within cellular environments appear to remodel the energy landscape of peptides and proteins in significant ways including the possibility of native state destabilization. Crowding is also seen to affect dynamic properties, both conformational dynamics and diffusional properties of macromolecules. Recent simulations that address these questions are reviewed here and discussed in the context of relevant experiments. PMID:28666087

  3. Early Identification of Handicapped Children. Computer Assisted Remedial Education Report No. R-36.

    ERIC Educational Resources Information Center

    Cartwright, G. Phillip; Cartwright, Carol A.

    The handbook is intended to be part of a graduate course entitled "Introduction to Exceptional Children" which is taught via computer assisted instruction and emphasizes the social, psychological, and physiological characteristics of the mentally, visually, aurally, physically, emotionally, or neurologically handicapped primary grade child to…

  4. The Effects of Linear Microphone Array Changes on Computed Sound Exposure Level Footprints

    NASA Technical Reports Server (NTRS)

    Mueller, Arnold W.; Wilson, Mark R.

    1997-01-01

    Airport land planning commissions often are faced with determining how much area around an airport is affected by the sound exposure levels (SELs) associated with helicopter operations. This paper presents a study of the effects that changing the size and composition of a microphone array has on the computed SEL contour (ground footprint) areas used by such commissions. Descent flight acoustic data measured by a fifteen-microphone array were reprocessed for five different combinations of microphones within this array. This resulted in data for six different arrays, for which SEL contours were computed. The fifteen-microphone array was defined as the 'baseline' array since it contained the greatest amount of data. The computations used a newly developed technique, the Acoustic Re-propagation Technique (ART), which uses parts of the NASA noise prediction program ROTONET. After the areas of the SEL contours were calculated, the differences between the areas were determined. The area differences for the six arrays show that a five- and a three-microphone array (with spacing typical of that required by the FAA FAR Part 36 noise certification procedure) compare well with the fifteen-microphone array. All data were obtained from a database resulting from a joint project conducted by NASA and U.S. Army researchers at Langley and Ames Research Centers. A brief description of the joint project test design, microphone array set-up, and data reduction methodology associated with the database is given.

  5. Identification and analysis of evolutionary selection pressures acting at the molecular level in five forkhead subfamilies.

    PubMed

    Fetterman, Christina D; Rannala, Bruce; Walter, Michael A

    2008-09-24

    Members of the forkhead gene family act as transcription regulators in biological processes including development and metabolism. The evolution of forkhead genes has not been widely examined and selection pressures at the molecular level influencing subfamily evolution and differentiation have not been explored. Here, in silico methods were used to examine selection pressures acting on the coding sequence of five multi-species FOX protein subfamily clusters; FoxA, FoxD, FoxI, FoxO and FoxP. Application of site models, which estimate overall selection pressures on individual codons throughout the phylogeny, showed that the amino acid changes observed were either neutral or under negative selection. Branch-site models, which allow estimated selection pressures along specified lineages to vary as compared to the remaining phylogeny, identified positive selection along branches leading to the FoxA3 and Protostomia clades in the FoxA cluster and the branch leading to the FoxO3 clade in the FoxO cluster. Residues that may differentiate paralogs were identified in the FoxA and FoxO clusters and residues that differentiate orthologs were identified in the FoxA cluster. Neutral amino acid changes were identified in the forkhead domain of the FoxA, FoxD and FoxP clusters while positive selection was identified in the forkhead domain of the Protostomia lineage of the FoxA cluster. A series of residues under strong negative selection adjacent to the N- and C-termini of the forkhead domain were identified in all clusters analyzed suggesting a new method for refinement of domain boundaries. Extrapolation of domains among cluster members in conjunction with selection pressure information allowed prediction of residue function in the FoxA, FoxO and FoxP clusters and exclusion of known domain function in residues of the FoxA and FoxI clusters. Consideration of selection pressures observed in conjunction with known functional information allowed prediction of residue function and

  6. Identification and analysis of evolutionary selection pressures acting at the molecular level in five forkhead subfamilies

    PubMed Central

    2008-01-01

    Background Members of the forkhead gene family act as transcription regulators in biological processes including development and metabolism. The evolution of forkhead genes has not been widely examined and selection pressures at the molecular level influencing subfamily evolution and differentiation have not been explored. Here, in silico methods were used to examine selection pressures acting on the coding sequence of five multi-species FOX protein subfamily clusters; FoxA, FoxD, FoxI, FoxO and FoxP. Results Application of site models, which estimate overall selection pressures on individual codons throughout the phylogeny, showed that the amino acid changes observed were either neutral or under negative selection. Branch-site models, which allow estimated selection pressures along specified lineages to vary as compared to the remaining phylogeny, identified positive selection along branches leading to the FoxA3 and Protostomia clades in the FoxA cluster and the branch leading to the FoxO3 clade in the FoxO cluster. Residues that may differentiate paralogs were identified in the FoxA and FoxO clusters and residues that differentiate orthologs were identified in the FoxA cluster. Neutral amino acid changes were identified in the forkhead domain of the FoxA, FoxD and FoxP clusters while positive selection was identified in the forkhead domain of the Protostomia lineage of the FoxA cluster. A series of residues under strong negative selection adjacent to the N- and C-termini of the forkhead domain were identified in all clusters analyzed suggesting a new method for refinement of domain boundaries. Extrapolation of domains among cluster members in conjunction with selection pressure information allowed prediction of residue function in the FoxA, FoxO and FoxP clusters and exclusion of known domain function in residues of the FoxA and FoxI clusters. Conclusion Consideration of selection pressures observed in conjunction with known functional information allowed

  7. Drug target identification in sphingolipid metabolism by computational systems biology tools: metabolic control analysis and metabolic pathway analysis.

    PubMed

    Ozbayraktar, F Betül Kavun; Ulgen, Kutlu O

    2010-08-01

Sphingolipids regulate cellular processes that are critically important to a cell's fate and function in cancer development and progression, a fact that underlies this novel approach to cancer therapy. Pharmacological manipulation of sphingolipid metabolism in cancer therapeutics requires a detailed understanding of the pathway. Two computational systems biology tools are used to identify potential drug target enzymes in the sphingolipid pathway that can be further utilized in drug design studies for cancer therapy. The enzymes of the sphingolipid pathway were ranked according to their roles in controlling the metabolic network by metabolic control analysis. The physiologically connected reactions, i.e. the biologically significant and functional modules of the network, were identified by metabolic pathway analysis. The final set of candidate drug target enzymes is selected such that their manipulation leads to ceramide accumulation and depletion of long-chain base phosphates. The efficiency of the mathematical tools for drug target identification is validated against clinically available drugs. Copyright 2010 Elsevier Inc. All rights reserved.
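The metabolic control analysis step described above ranks enzymes by their flux control coefficients. A minimal sketch, assuming a toy two-enzyme linear pathway (not the sphingolipid network itself), estimates the coefficients by finite differences and checks the summation theorem:

```python
def steady_state_flux(e1, e2, s=10.0):
    # Toy linear pathway S -> X -> P with v1 = e1*(S - x) and v2 = e2*x;
    # at steady state v1 = v2 = J, giving x = e1*S/(e1 + e2)
    x = e1 * s / (e1 + e2)
    return e2 * x

def flux_control_coefficient(which, e1, e2, h=1e-6):
    # C_i = (e_i / J) * dJ/de_i, estimated by a central difference
    J = steady_state_flux(e1, e2)
    if which == 1:
        dJ = (steady_state_flux(e1 + h, e2) - steady_state_flux(e1 - h, e2)) / (2 * h)
        return e1 * dJ / J
    dJ = (steady_state_flux(e1, e2 + h) - steady_state_flux(e1, e2 - h)) / (2 * h)
    return e2 * dJ / J

c1 = flux_control_coefficient(1, e1=2.0, e2=1.0)
c2 = flux_control_coefficient(2, e1=2.0, e2=1.0)
```

For this model the coefficients are e2/(e1+e2) and e1/(e1+e2) and sum to one; enzymes carrying the largest coefficients are the natural drug-target candidates.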

  8. Computer-aided identification of the water diffusion coefficient for maize kernels dried in a thin layer

    NASA Astrophysics Data System (ADS)

    Kujawa, Sebastian; Weres, Jerzy; Olek, Wiesław

    2016-07-01

    Uncertainties in mathematical modelling of water transport in cereal grain kernels during drying and storage are mainly due to implementing unreliable values of the water diffusion coefficient and simplifying the geometry of kernels. In the present study an attempt was made to reduce the uncertainties by developing a method for computer-aided identification of the water diffusion coefficient and more accurate 3D geometry modelling for individual kernels using original inverse finite element algorithms. The approach was exemplified by identifying the water diffusion coefficient for maize kernels subjected to drying. On the basis of the developed method, values of the water diffusion coefficient were estimated, 3D geometry of a maize kernel was represented by isoparametric finite elements, and the moisture content inside maize kernels dried in a thin layer was predicted. Validation of the results against experimental data showed significantly lower error values than in the case of results obtained for the water diffusion coefficient values available in the literature.
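The identification idea, adjusting the diffusion coefficient until predicted moisture matches measurements, can be sketched with a one-dimensional analytical stand-in for the study's 3D inverse finite element procedure. The spherical first-term series solution, kernel radius, and "observed" data below are all illustrative assumptions:

```python
import math

def moisture_ratio(D, t, r=4e-3):
    # First term of the series solution of Fick's second law for a sphere
    # of radius r (a reasonable approximation at longer drying times)
    return (6 / math.pi ** 2) * math.exp(-math.pi ** 2 * D * t / r ** 2)

# Synthetic "observations" generated from a known coefficient
D_true = 1.2e-10                       # m^2/s, illustrative
times = [600, 1200, 1800, 2400, 3600]  # s
observed = [moisture_ratio(D_true, t) for t in times]

# Inverse identification: grid search for D minimizing the squared error
candidates = [i * 1e-12 for i in range(1, 501)]
D_best = min(candidates,
             key=lambda D: sum((moisture_ratio(D, t) - m) ** 2
                               for t, m in zip(times, observed)))
```

The study instead minimizes the mismatch over an isoparametric finite element model of the actual kernel geometry.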

  9. Evaluation of Staf-Sistem 18-R for identification of staphylococcal clinical isolates to the species level.

    PubMed Central

    Piccolomini, R; Catamo, G; Picciani, C; D'Antonio, D

    1994-01-01

The accuracy and efficiency of Staf-Sistem 18-R (Liofilchem s.r.l., Roseto degli Abruzzi, Teramo, Italy) were compared with those of conventional biochemical methods to identify 523 strains belonging to 16 different human Staphylococcus species. Overall, 491 strains (93.9%) were correctly identified (percentage of identification ≥90.0%), with 28 (5.4%) requiring supplementary tests for complete identification. For 14 isolates (2.8%), the strains did not correspond to any key in the codebook and could not be identified by the manufacturer's computer service. Only 18 isolates (3.4%) were misidentified. The system is simple to use, easy to handle, gives highly reproducible results, and is inexpensive. With the inclusion of more discriminating tests and adjustment of supplementary code numbers for some species, such as Staphylococcus lugdunensis and Staphylococcus schleiferi, Staf-Sistem 18-R is a suitable alternative for identification of human coagulase-positive and coagulase-negative Staphylococcus species in microbiological laboratories. PMID:8195373
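The "key in the codebook" mentioned above is a numerical profile derived from the battery of biochemical reactions. Staf-Sistem's exact encoding is not given in the abstract; the sketch below assumes the widely used API-strip convention of weighting each triplet of tests 1, 2, 4:

```python
def profile_number(results):
    # API-style code: positive/negative reactions grouped in triplets
    # weighted 1, 2, 4; each triplet's sum gives one digit of the profile
    assert len(results) % 3 == 0
    digits = []
    for i in range(0, len(results), 3):
        digits.append(sum(w for w, r in zip((1, 2, 4), results[i:i + 3]) if r))
    return "".join(str(d) for d in digits)

# 18 hypothetical reactions -> a 6-digit codebook key
reactions = [1, 0, 1,  1, 1, 0,  0, 0, 0,  1, 1, 1,  0, 1, 0,  0, 0, 1]
code = profile_number(reactions)
```

The resulting digit string is what gets looked up against the manufacturer's codebook or computer service.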

  10. COMPARISON OF THE PAIN LEVELS OF COMPUTER-CONTROLLED AND CONVENTIONAL ANESTHESIA TECHNIQUES IN PROSTHODONTIC TREATMENT

    PubMed Central

    Yenisey, Murat

    2009-01-01

Objective: The objective of this study was to compare the pain levels on opposite sides of the maxilla at needle insertion, during delivery of local anesthetic solution and at tooth preparation for both the conventional and the anterior middle superior alveolar (AMSA) technique with the Wand computer-controlled local anesthesia application. Material and methods: Pain scores of 16 patients were evaluated with a 5-point verbal rating scale (VRS) and the data were analyzed nonparametrically. Pain differences at needle insertion, during delivery of local anesthetic, and at tooth preparation, for the conventional versus the Wand technique, were analyzed using the Mann-Whitney U test (p=0.01). Results: The Wand technique had a lower pain level compared to conventional injection for needle insertion (p<0.01). In the anesthetic delivery phase, the pain level for the Wand technique was lower (p<0.01). However, there was no difference between the Wand and the conventional technique for pain level during tooth preparation (p>0.05). Conclusions: The AMSA technique using the Wand is recommended for prosthodontic treatment because it reduces pain during needle insertion and during delivery of local anesthetic. However, the two techniques produce the same pain levels during tooth preparation. PMID:19936518
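The Mann-Whitney U statistic used for the pain-score comparison above can be computed from joint ranks; a minimal sketch (the VRS scores below are hypothetical, not the study's data):

```python
def mann_whitney_u(x, y):
    # Rank all observations jointly (average ranks for ties), then form U
    # from the rank sum of the first sample
    combined = sorted((v, g) for g, grp in enumerate((x, y)) for v in grp)
    assigned = []                      # (group, rank) pairs
    i = 0
    while i < len(combined):
        j = i
        while j < len(combined) and combined[j][0] == combined[i][0]:
            j += 1
        avg_rank = (i + 1 + j) / 2     # mean of the 1-based ranks i+1..j
        assigned.extend((combined[k][1], avg_rank) for k in range(i, j))
        i = j
    r1 = sum(rank for g, rank in assigned if g == 0)
    u1 = r1 - len(x) * (len(x) + 1) / 2
    return min(u1, len(x) * len(y) - u1)

# Hypothetical 5-point VRS pain scores for two injection techniques
u = mann_whitney_u([0, 1, 1, 2], [2, 3, 3, 4])
```

The U value is then compared against the tabulated critical value for the chosen significance level.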

  11. Efficient Calibration of Computationally Intensive Groundwater Models through Surrogate Modelling with Lower Levels of Fidelity

    NASA Astrophysics Data System (ADS)

    Razavi, S.; Anderson, D.; Martin, P.; MacMillan, G.; Tolson, B.; Gabriel, C.; Zhang, B.

    2012-12-01

A computationally intensive groundwater modelling case study developed with the FEFLOW software is used to evaluate the proposed methodology. Multiple surrogates of this computationally intensive model with different levels of fidelity are created and applied. The dynamically dimensioned search (DDS) optimization algorithm is used as the search engine in the surrogate-enabled calibration framework. Results show that this framework can substantially reduce the number of original model evaluations required for calibration by intelligently utilizing faster-to-run surrogates in the course of optimization. Results also demonstrate that the compromise between efficiency (reduced run time) and fidelity of a surrogate model is critically important to the success of the framework, as a surrogate with unreasonably low fidelity, despite being fast, might be quite misleading in calibration of the original model of interest.
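The dynamically dimensioned search used as the calibration engine above can be sketched as follows; the three-parameter quadratic objective stands in for the expensive FEFLOW model, and all settings are illustrative:

```python
import math, random

def dds(objective, bounds, max_iter=300, r=0.2, seed=1):
    # Dynamically dimensioned search: greedily perturb a shrinking random
    # subset of dimensions; perturbation size scales with each range
    rng = random.Random(seed)
    lo = [b[0] for b in bounds]
    hi = [b[1] for b in bounds]
    best = [rng.uniform(l, h) for l, h in zip(lo, hi)]
    best_f = objective(best)
    for i in range(1, max_iter + 1):
        # Probability of perturbing each dimension decays with iteration
        p = max(1.0 - math.log(i) / math.log(max_iter), 1.0 / len(bounds))
        dims = [d for d in range(len(bounds)) if rng.random() < p]
        if not dims:
            dims = [rng.randrange(len(bounds))]
        cand = best[:]
        for d in dims:
            cand[d] += rng.gauss(0.0, r * (hi[d] - lo[d]))
            if cand[d] < lo[d]:      # reflect at the bounds
                cand[d] = lo[d] + (lo[d] - cand[d])
            if cand[d] > hi[d]:
                cand[d] = hi[d] - (cand[d] - hi[d])
            cand[d] = min(max(cand[d], lo[d]), hi[d])
        f = objective(cand)
        if f <= best_f:              # greedy acceptance
            best, best_f = cand, f
    return best, best_f

# Toy 3-parameter "calibration": squared error against known parameters
target = [1.0, -2.0, 0.5]
x, fx = dds(lambda v: sum((a - b) ** 2 for a, b in zip(v, target)),
            [(-5.0, 5.0)] * 3)
```

In the surrogate-enabled framework, `objective` would be the model-misfit evaluation of either a surrogate or the original model.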

  12. Parallel computation of level set method for 500 Hz visual servo control

    NASA Astrophysics Data System (ADS)

    Fei, Xianfeng; Igarashi, Yasunobu; Hashimoto, Koichi

    2008-11-01

We propose a 2D microorganism tracking system using a parallel level set method and a column parallel vision system (CPV). The system keeps a single microorganism in the middle of the visual field under a microscope by visual servoing of an automated stage. We propose a new energy function for the level set method that constrains the amount of light intensity inside the detected object contour in order to control the number of detected objects. The algorithm is implemented in the CPV system, and the computation time per frame is approximately 2 ms. A tracking experiment of about 25 s is demonstrated. We also demonstrate that a single paramecium can be kept in track even if other paramecia appear in the visual field and contact the tracked paramecium.

  13. French diagnostic reference levels in diagnostic radiology, computed tomography and nuclear medicine: 2004-2008 review.

    PubMed

    Roch, P; Aubert, B

    2013-04-01

After 5 y of collecting data on diagnostic reference levels (DRLs), the French Institute for Radiological Protection and Nuclear Safety (IRSN) presents analyses of these data. The analyses of the data collected for radiology, computed tomography (CT) and nuclear medicine allow IRSN to estimate the level of regulatory compliance among health professionals and the representativeness of the current DRLs in terms of relevant examinations, dosimetric quantities, numerical values and patient morphologies. Since 2004, the involvement of professionals has increased greatly, especially in nuclear medicine, followed by CT and then by radiology. The analyses show some discordance between regulatory examinations and clinical practice. Some of the dosimetric quantities used for DRL setting are insufficient or not relevant enough, and some numerical values should also be reviewed. On the basis of these findings, IRSN formulates recommendations to update the regulatory DRLs with current and relevant examination lists, dosimetric quantities and numerical values.

  14. Derivation of Australian diagnostic reference levels for paediatric multi detector computed tomography.

    PubMed

    Hayton, Anna; Wallace, Anthony

    2016-09-01

Australian National Diagnostic Reference Levels for paediatric multi detector computed tomography were established for three protocols, Head, Chest and AbdoPelvis, across two age groups, Baby/Infant 0-4 years and Child 5-14 years, by the Australian Radiation Protection and Nuclear Safety Agency (ARPANSA) in 2012. The establishment of Australian paediatric DRLs is an important step towards lowering patient CT doses on a national scale. While adult DRLs were calculated with data collected from the web-based Australian National Diagnostic Reference Level Service, no paediatric data were submitted in the first year of service operation. Data from an independent Royal Australian and New Zealand College of Radiologists Quality Use of Diagnostic Imaging paediatric optimisation survey were used instead. The paediatric DRLs were defined for CTDIvol (mGy) and DLP (mGy·cm) values that referenced the 16 cm PMMA phantom for the Head protocol and the 32 cm PMMA phantom for body protocols for both paediatric age groups. The Australian paediatric DRLs for multi detector computed tomography are, for the Head, Chest and AbdoPelvis protocols respectively, 470, 60 and 170 mGy·cm for the Baby/Infant age group, and 600, 110 and 390 mGy·cm for the Child age group. A comparison with published international paediatric DRLs for computed tomography reveals the Australian paediatric DRLs to be lower on average. However, the comparison is complicated by misalignment of defined age ranges. It is the intention of ARPANSA to review the paediatric DRLs in conjunction with a review of the adult DRLs, which should occur within 5 years of their publication.

  15. Establishment of multi-slice computed tomography (MSCT) reference level in Johor, Malaysia

    NASA Astrophysics Data System (ADS)

    Karim, M. K. A.; Hashim, S.; Bakar, K. A.; Muhammad, H.; Sabarudin, A.; Ang, W. C.; Bahruddin, N. A.

    2016-03-01

Radiation doses from computed tomography (CT) are the highest and most hazardous among the imaging modalities. This study aimed to evaluate radiation dose to patients in Johor, Malaysia during computed tomography examinations of the brain, chest and abdomen, and to establish local diagnostic reference levels (DRLs) for the current, state-of-the-art, multi-slice CT scanners. Survey forms were sent to five centres performing CT to obtain data regarding acquisition parameters as well as dose information from the CT consoles. CT-EXPO (Version 2.3.1, Germany) was used to validate the dose information. The proposed DRLs were indicated by rounding the third quartiles of the whole dose distributions, where mean values of CTDIw (mGy), CTDIvol (mGy) and DLP (mGy·cm) were comparable with other reference levels: 63, 63 and 1015, respectively, for CT brain; 15, 14 and 450, respectively, for CT thorax; and 16, 17 and 590, respectively, for CT abdomen. The study revealed that CT practice and dose output have changed markedly and must keep pace with newly introduced technology. We suggest that CTDIvol be included in the current national DRLs, as modern CT scanners are configured with a higher number of detectors and the quantity is independent of pitch factor.
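Setting a DRL by "rounding the third quartiles of whole dose distributions", as described above, can be sketched directly; the DLP survey values below are hypothetical:

```python
import statistics

def diagnostic_reference_level(doses, round_to=10):
    # DRLs are conventionally set at the rounded third quartile
    # (75th percentile) of the surveyed dose distribution
    q3 = statistics.quantiles(doses, n=4)[2]
    return round(q3 / round_to) * round_to

# Hypothetical DLP survey values (mGy·cm) for one protocol across centres
dlp_survey = [320, 355, 400, 410, 430, 450, 465, 480, 520, 560, 640]
drl = diagnostic_reference_level(dlp_survey)
```

Facilities whose typical doses exceed the resulting reference level are then expected to review their protocols.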

  16. Moving Particle Level-Set (MPLS) method for incompressible multiphase flow computation

    NASA Astrophysics Data System (ADS)

    Ng, K. C.; Hwang, Y. H.; Sheu, T. W. H.; Yu, C. H.

    2015-11-01

An implementation of a multiphase model in a recently developed Moving Particle Pressure Mesh (MPPM) particle-based solver is reported in the current work. By enforcing the divergence-free condition on the background mesh (pressure mesh), the moving particles are merely treated as observation points without intrinsic mass property, which surmounts several computational deficiencies of the existing Moving Particle Semi-implicit (MPS) method. To enhance the smoothness of the fluid interface and to simulate interfacial flow with large density ratios without the rigorous tuning of calibration parameters required by most existing particle methods, a density interpolation scheme based on the conservative level-set method is put forward to ensure mass conservation. Several multiphase flow cases are simulated and compared with existing numerical/theoretical solutions. It is encouraging to observe that the present solutions are more accurate than numerical solutions based on the existing MPS methods. The proposed Moving Particle Level-Set (MPLS) method thus provides a simple yet effective approach for computing incompressible multiphase flow within the numerical framework of the particle method.

  17. High-level ab initio computations of the absorption spectra of organic iridium complexes.

    PubMed

    Plasser, Felix; Dreuw, Andreas

    2015-02-12

The excited states of fac-tris(phenylpyridinato)iridium [Ir(ppy)3] and the smaller model complex Ir(C3H4N)3 are computed using a number of high-level ab initio methods, including the recently implemented third-order algebraic diagrammatic construction method, ADC(3). A detailed description of the states is provided through advanced analysis methods, which allow a quantification of different charge-transfer and orbital-relaxation effects and give extended insight into the many-body wave functions. Compared to the ADC(3) benchmark, a striking and unexpected deviation of ADC(2) is found for Ir(C3H4N)3, which derives from an overstabilization of charge-transfer effects. Time-dependent density functional theory (TDDFT) using the B3LYP functional shows an analogous but less severe error for charge-transfer states, whereas the ωB97 results are in good agreement with ADC(3). Multireference configuration interaction computations, which are in reasonable agreement with ADC(3), reveal that static correlation does not play a significant role. In the case of the larger Ir(ppy)3 complex, results at the TDDFT/B3LYP and TDDFT/ωB97 levels of theory are presented. Strong discrepancies between the two functionals with respect to the energies, characters, and density of the low-lying states are discussed in detail and compared to experiment.

  18. Identification and analysis of unsatisfactory psychosocial work situations: a participatory approach employing video-computer interaction.

    PubMed

    Hanse, J J; Forsman, M

    2001-02-01

    A method for psychosocial evaluation of potentially stressful or unsatisfactory situations in manual work was developed. It focuses on subjective responses regarding specific situations and is based on interactive worker assessment when viewing video recordings of oneself. The worker is first video-recorded during work. The video is then displayed on the computer terminal, and the filmed worker clicks on virtual controls on the screen whenever an unsatisfactory psychosocial situation appears; a window of questions regarding psychological demands, mental strain and job control is then opened. A library with pictorial information and comments on the selected situations is formed in the computer. The evaluation system, called PSIDAR, was applied in two case studies, one of manual materials handling in an automotive workshop and one of a group of workers producing and testing instrument panels. The findings indicate that PSIDAR can provide data that are useful in a participatory ergonomic process of change.

  19. Identification of Problems Hindering Logistics Support of Commercial- Off-The-Shelf Computer Equipment

    DTIC Science & Technology

    1993-09-01

    accelerating technology taking place in the computer industry . Market-Based Pricing. Because the Air Force is buying the COTS equipment "off-the-shelf...supporting, participating, and using commands. b. Visit Air Force and Private industry to get their perspective, approaches, and procedures on both effective...teams to collect data from industry , acquirers, and users/supporters. The industry team collected data from The Bank of Boston, Commonwealth of

  20. A Computational Wireless Network Backplane: Performance in a Distributed Speaker Identification Application Postprint

    DTIC Science & Technology

    2008-12-01

traffic patterns are intense but constrained to a local area. Examples include peer-to-peer applications or sensor data processing in the region. In such...vol. 30, no. 4, pp. 68–74, 1997. [7] J. Dean and S. Ghemawat, “MapReduce: simplified data processing on large clusters,” Commun. ACM, vol. 51, no. 1...DWARF, a general distributed application execution framework for wireless ad-hoc networks which dynamically allocates computation resources and manages

  1. Identification of phytophthora isolates to species level using restriction fragment length polymorphism analysis of a polymerase chain reaction-amplified region of mitochondrial DNA.

    PubMed

    Martin, Frank N; Tooley, Paul W

    2004-09-01

Polymerase chain reaction primers spanning the mitochondrially encoded coxI and II genes have been identified that were capable of amplifying target DNA from all 152 isolates of 31 species in the genus Phytophthora that were tested. Digestion of the amplicons with restriction enzymes generated species-specific restriction fragment length polymorphism banding profiles that were effective for isolate classification to a species level. Of the 24 species in which multiple isolates were examined, intraspecific polymorphisms were not observed for 16 species, while 5 species exhibited limited intraspecific polymorphism that could be explained by the addition/loss of a single restriction site. Intraspecific polymorphisms were observed for P. megakarya, P. megasperma, and P. syringae; however, these differences may be a reflection of the variation that exists in these species as reported in the literature. Although digestion with AluI alone could differentiate most species tested, single digests with a total of four restriction enzymes were used in this investigation to enhance the accuracy of the technique and minimize the effect of intraspecific variability on correct isolate identification. The use of the computer program BioNumerics simplified data analysis and identification of isolates. Successful template amplification was obtained with DNA recovered from hyphae using a boiling miniprep procedure, thereby reducing the time and materials needed for conducting this analysis.
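The digestion-and-banding step can be mimicked in silico; the toy amplicon below is invented, and only AluI's recognition site AG^CT is taken from the abstract:

```python
def digest_fragments(seq, site="AGCT", cut_offset=2):
    # In-silico restriction digest: AluI recognizes AGCT and cuts AG^CT,
    # so each cut lands cut_offset bases into the recognition site
    cuts = []
    pos = seq.find(site)
    while pos != -1:
        cuts.append(pos + cut_offset)
        pos = seq.find(site, pos + 1)
    bounds = [0] + cuts + [len(seq)]
    return [bounds[i + 1] - bounds[i] for i in range(len(bounds) - 1)]

# Invented 21-bp amplicon with two AluI sites -> three fragments
amplicon = "GGAT" + "AGCT" + "TTTTTT" + "AGCT" + "CCA"
fragments = digest_fragments(amplicon)
```

The resulting fragment-length list is the in-silico analogue of one lane of the RFLP banding profile.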

  2. Power levels in office equipment: Measurements of new monitors and personal computers

    SciTech Connect

    Roberson, Judy A.; Brown, Richard E.; Nordman, Bruce; Webber, Carrie A.; Homan, Gregory H.; Mahajan, Akshay; McWhinney, Marla; Koomey, Jonathan G.

    2002-05-14

Electronic office equipment has proliferated rapidly over the last twenty years and is projected to continue growing in the future. Efforts to reduce the growth in office equipment energy use have focused on power management to reduce power consumption of electronic devices when not being used for their primary purpose. The EPA ENERGY STAR® program has been instrumental in gaining widespread support for power management in office equipment, and accurate information about the energy used by office equipment in all power levels is important to improving program design and evaluation. This paper presents the results of a field study conducted during 2001 to measure the power levels of new monitors and personal computers. We measured off, on, and low-power levels in about 60 units manufactured since July 2000. The paper summarizes power data collected, explores differences within the sample (e.g., between CRT and LCD monitors), and discusses some issues that arise in metering office equipment. We also present conclusions to help improve the success of future power management programs. Our findings include a trend among monitor manufacturers to provide a single very low low-power level, and the need to standardize methods for measuring monitor on power, to more accurately estimate the annual energy consumption of office equipment, as well as actual and potential energy savings from power management.
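Measured power levels translate into annual energy estimates once a usage pattern is assumed; the monitor wattages and daily hours below are illustrative, not the study's measurements:

```python
def annual_energy_kwh(power_w, hours_per_day):
    # Annual energy (kWh) from measured power (W) per mode and an assumed
    # daily usage pattern; the modes must account for all 24 hours
    assert abs(sum(hours_per_day.values()) - 24.0) < 1e-9
    daily_wh = sum(power_w[mode] * hours_per_day[mode] for mode in power_w)
    return daily_wh * 365 / 1000

# Illustrative monitor power levels and duty cycle (not measured values)
power = {"on": 35.0, "low": 2.0, "off": 1.0}
usage = {"on": 6.0, "low": 2.0, "off": 16.0}
kwh = annual_energy_kwh(power, usage)
```

Potential savings from power management follow by recomputing with more hours shifted from "on" to "low".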

  3. Comparison of conventional ultrasonography and ultrasonography-computed tomography fusion imaging for target identification using digital/real hybrid phantoms: a preliminary study.

    PubMed

    Soyama, Takeshi; Sakuhara, Yusuke; Kudo, Kohsuke; Abo, Daisuke; Wang, Jeff; Ito, Yoichi M; Hasegawa, Yu; Shirato, Hiroki

    2016-07-01

    This preliminary study compared ultrasonography-computed tomography (US-CT) fusion imaging and conventional ultrasonography (US) for accuracy and time required for target identification using a combination of real phantoms and sets of digitally modified computed tomography (CT) images (digital/real hybrid phantoms). In this randomized prospective study, 27 spheres visible on B-mode US were placed at depths of 3.5, 8.5, and 13.5 cm (nine spheres each). All 27 spheres were digitally erased from the CT images, and a radiopaque sphere was digitally placed at each of the 27 locations to create 27 different sets of CT images. Twenty clinicians were instructed to identify the sphere target using US alone and fusion imaging. The accuracy of target identification of the two methods was compared using McNemar's test. The mean time required for target identification and error distances were compared using paired t tests. At all three depths, target identification was more accurate and the mean time required for target identification was significantly less with US-CT fusion imaging than with US alone, and the mean error distances were also shorter with US-CT fusion imaging. US-CT fusion imaging was superior to US alone in terms of accurate and rapid identification of target lesions.
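McNemar's test, used above to compare paired identification accuracy, depends only on the discordant pairs; a minimal sketch with hypothetical counts:

```python
def mcnemar_statistic(b, c, continuity=True):
    # b: fusion correct / US-alone wrong; c: US-alone correct / fusion wrong.
    # Concordant pairs carry no information; the statistic is approximately
    # chi-square with 1 degree of freedom under the null of equal accuracy.
    if continuity:
        return (abs(b - c) - 1) ** 2 / (b + c)
    return (b - c) ** 2 / (b + c)

# Hypothetical discordant counts from paired target-identification trials;
# compare against the 5% chi-square(1 df) critical value of 3.84
chi2 = mcnemar_statistic(b=15, c=4)
```

A statistic above the critical value would indicate a genuine accuracy difference between the two imaging methods.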

  4. Genotypic and phenotypic applications for the differentiation and species-level identification of achromobacter for clinical diagnoses.

    PubMed

    Gomila, Margarita; Prince-Manzano, Claudia; Svensson-Stadler, Liselott; Busquets, Antonio; Erhard, Marcel; Martínez, Deny L; Lalucat, Jorge; Moore, Edward R B

    2014-01-01

Achromobacter is a genus in the family Alcaligenaceae, comprising fifteen species isolated from different sources, including clinical samples. The ability to detect and correctly identify Achromobacter species, particularly A. xylosoxidans, and differentiate them from other phenotypically similar and genotypically related Gram-negative, aerobic, non-fermenting species is important for patients with cystic fibrosis (CF), as well as for nosocomial and other opportunistic infections. Traditional phenotypic profile-based analyses have been demonstrated to be inadequate for reliable identification of isolates of Achromobacter species, and genotypic assays relying upon comparative 16S rRNA gene sequence analyses are not able to ensure definitive identification of Achromobacter species, due to the inherently conserved nature of the gene. Alternative methodologies enabling high-resolution differentiation between the species of the genus are needed. A comparative multi-locus sequence analysis (MLSA) of four selected 'house-keeping' genes (atpD, gyrB, recA, and rpoB) assessed the individual gene sequences for their potential in developing a reliable, rapid and cost-effective diagnostic protocol for Achromobacter species identification. The analysis of the type strains of the species of the genus and 46 strains of Achromobacter species showed congruence between the cluster analyses derived from the individual genes. The MLSA gene sequences exhibited different levels of resolution in delineating the validly published Achromobacter species and elucidated strains that represent new genotypes and probable new species of the genus. Our results also suggested that the recently described A. spritinus is a later heterotypic synonym of A. marplatensis. Strains were analyzed, using whole-cell Matrix-Assisted Laser Desorption/Ionization Time-Of-Flight mass spectrometry (MALDI-TOF MS), as an alternative phenotypic profile-based method with the potential to

  5. Combining computer algorithms with experimental approaches permits the rapid and accurate identification of T cell epitopes from defined antigens.

    PubMed

    Schirle, M; Weinschenk, T; Stevanović, S

    2001-11-01

    The identification of T cell epitopes from immunologically relevant antigens remains a critical step in the development of vaccines and methods for monitoring of T cell responses. This review presents an overview of strategies that employ computer algorithms for the selection of candidate peptides from defined proteins and subsequent verification of their in vivo relevance by experimental approaches. Several computer algorithms are currently being used for epitope prediction of various major histocompatibility complex (MHC) class I and II molecules, based either on the analysis of natural MHC ligands or on the binding properties of synthetic peptides. Moreover, the analysis of proteasomal digests of peptides and whole proteins has led to the development of algorithms for the prediction of proteasomal cleavages. In order to verify the generation of the predicted peptides during antigen processing in vivo as well as their immunogenic potential, several experimental approaches have been pursued in the recent past. Mass spectrometry-based bioanalytical approaches have been used specifically to detect predicted peptides among isolated natural ligands. Other strategies employ various methods for the stimulation of primary T cell responses against the predicted peptides and subsequent testing of the recognition pattern towards target cells that express the antigen.
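Matrix-based epitope prediction of the kind reviewed above scores each candidate peptide position by position; the anchor weights below are an invented toy, not a real MHC binding matrix:

```python
# Invented anchor weights: positions 2 and 9 dominate, echoing typical
# MHC class I binding motifs (not a real prediction matrix)
ANCHOR_WEIGHTS = {
    2: {"L": 3.0, "M": 2.5, "I": 1.5},
    9: {"V": 3.0, "L": 2.5, "I": 2.0},
}

def score_peptide(peptide):
    # Sum position-specific weights over the peptide (1-based positions)
    return sum(ANCHOR_WEIGHTS.get(i + 1, {}).get(aa, 0.0)
               for i, aa in enumerate(peptide))

def rank_epitope_candidates(protein, length=9, top=3):
    # Slide a window over the antigen and return the best-scoring 9-mers
    peptides = [protein[i:i + length]
                for i in range(len(protein) - length + 1)]
    return sorted(peptides, key=score_peptide, reverse=True)[:top]
```

The top-ranked peptides would then be verified experimentally, e.g. by binding assays or T cell stimulation, as the review describes.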

  6. PathoScope 2.0: a complete computational framework for strain identification in environmental or clinical sequencing samples.

    PubMed

    Hong, Changjin; Manimaran, Solaiappan; Shen, Ying; Perez-Rogers, Joseph F; Byrd, Allyson L; Castro-Nallar, Eduardo; Crandall, Keith A; Johnson, William Evan

    2014-01-01

Recent innovations in sequencing technologies have provided researchers with the ability to rapidly characterize the microbial content of an environmental or clinical sample with unprecedented resolution. These approaches are producing a wealth of information that is providing novel insights into the microbial ecology of the environment and human health. However, these sequencing-based approaches produce large and complex datasets that require efficient and sensitive computational analysis workflows. Many tools for analyzing metagenomic sequencing data have emerged recently; however, these approaches often suffer from issues of specificity and efficiency, and typically do not include a complete metagenomic analysis framework. We present PathoScope 2.0, a complete bioinformatics framework for rapidly and accurately quantifying the proportions of reads from individual microbial strains present in metagenomic sequencing data from environmental or clinical samples. The pipeline performs all necessary computational analysis steps, including reference genome library extraction and indexing, read quality control and alignment, strain identification, and summarization and annotation of results. We rigorously evaluated PathoScope 2.0 using simulated data and data from the 2011 outbreak of Shiga-toxigenic Escherichia coli O104:H4. The results show that PathoScope 2.0 is a complete, highly sensitive, and efficient approach for metagenomic analysis that outperforms alternative approaches in scope, speed, and accuracy. The PathoScope 2.0 pipeline software is freely available for download at: http://sourceforge.net/projects/pathoscope/.

  7. [The role of the computed tomography in the identification of the syndrome of pelvic congestion].

    PubMed

    Motta-Ramírez, Gaspar Alberto; Ruiz-Castro, Eloise; Torres-Hernández, Verónica; Herrera-Avilés, Ricardo Arturo; Rodríguez-Treviño, Carlos

    2013-07-01

Pelvic congestion syndrome is a condition that is not yet fully understood and therefore remains controversial. It accounts for up to 40% of medical consultations, affecting women of reproductive age who experience non-specific symptoms such as characteristic pelvic pain of more than six months' evolution and difficult-to-treat dyspareunia for which even narcotics provide insufficient control. The aim was to recognize the vascular anatomy of the pelvic cavity and identify the features of pelvic congestion syndrome demonstrable by computed tomography. A descriptive, observational, cross-sectional and retrospective study was conducted in the Department of Radiology and Imaging of Hospital Angeles del Pedregal, reviewing reported imaging studies with key findings for recognizing pelvic congestion syndrome. All women with an incidental finding of abnormal dilation of the gonadal vein were included, allowing pelvic congestion syndrome to be suggested as a possible diagnosis. There were 17 cases (0.9%) among patients with abdominopelvic pain syndrome who underwent multislice computed tomography at 3 mm, with coverage from the lung bases to the pubic symphysis. Left-sided predominance is conditioned by the anatomical arrangement of the left gonadal vein. Opacification of the gonadal vein during the arterial phase was identified in 11 patients (65%), a circumstance that correlates with valve incompetence and retrograde venous flow. Computed tomography findings of pelvic congestion syndrome were also identified in 12 patients (70%) with abdominopelvic pain syndrome. Pelvic congestion syndrome is an under-recognized condition that radiologists seldom consider because it is unfamiliar to them and the accompanying clinical data do not suggest it; yet the literature attributes to it up to 40% of visits to the gynecologist, so there may be additional cases that would increase its prevalence.

  8. Computational Identification and Functional Predictions of Long Noncoding RNA in Zea mays

    PubMed Central

    Boerner, Susan; McGinnis, Karen M.

    2012-01-01

Background Computational analysis of cDNA sequences from multiple organisms suggests that a large portion of transcribed DNA does not code for a functional protein. In mammals, noncoding transcription is abundant, and often results in functional RNA molecules that do not appear to encode proteins. Many long noncoding RNAs (lncRNAs) appear to have epigenetic regulatory function in humans, including HOTAIR and XIST. While epigenetic gene regulation is clearly an essential mechanism in plants, relatively little is known about the presence or function of lncRNAs in plants. Methodology/Principal Findings To explore the connection between lncRNA and epigenetic regulation of gene expression in plants, a computational pipeline using the programming language Python has been developed and applied to maize full length cDNA sequences to identify, classify, and localize potential lncRNAs. The pipeline was used in parallel with an SVM-based ncRNA identification tool to maximize the number of ncRNAs identified in the dataset. Although the available library of sequences was small and potentially biased toward protein coding transcripts, 15% of the sequences were predicted to be noncoding. Approximately 60% of these sequences appear to act as precursors for small RNA molecules and may function to regulate gene expression via a small RNA dependent mechanism. ncRNAs were predicted to originate from both genic and intergenic loci. Of the lncRNAs that originated from genic loci, ∼20% were antisense to the host gene loci. Conclusions/Significance Consistent with similar studies in other organisms, noncoding transcription appears to be widespread in the maize genome. Computational predictions indicate that maize lncRNAs may function to regulate expression of other genes through multiple RNA mediated mechanisms. PMID:22916204
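A coding/noncoding filter of the sort used in such pipelines can be reduced to an ORF-length heuristic; the 300-nt cutoff (about 100 codons) is a common rule of thumb, not necessarily the paper's exact criterion:

```python
def longest_orf_length(seq):
    # Longest ATG..stop open reading frame (in nucleotides, stop included)
    # across the three forward reading frames
    stops = {"TAA", "TAG", "TGA"}
    best = 0
    for frame in range(3):
        codons = [seq[i:i + 3] for i in range(frame, len(seq) - 2, 3)]
        start = None
        for idx, codon in enumerate(codons):
            if codon == "ATG" and start is None:
                start = idx
            elif codon in stops and start is not None:
                best = max(best, (idx - start + 1) * 3)
                start = None
    return best

def is_putative_noncoding(seq, orf_cutoff=300):
    # Heuristic: no ORF of ~100 codons or more suggests a noncoding transcript
    return longest_orf_length(seq) < orf_cutoff
```

Real classifiers such as the SVM tool mentioned above combine ORF features with codon usage and sequence composition.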

  9. Identification of Cardiac and Aortic Injuries in Trauma with Multi-detector Computed Tomography.

    PubMed

    Shergill, Arvind K; Maraj, Tishan; Barszczyk, Mark S; Cheung, Helen; Singh, Navneet; Zavodni, Anna E

    2015-01-01

    Blunt and penetrating cardiovascular (CV) injuries are associated with a high morbidity and mortality. Rapid detection of these injuries in trauma is critical for patient survival. The advent of multi-detector computed tomography (MDCT) has led to increased detection of CV injuries during rapid comprehensive scanning of stabilized major trauma patients. MDCT has the ability to acquire images with a higher temporal and spatial resolution, as well as the capability to create multiplanar reformats. This pictorial review illustrates several common and life-threatening traumatic CV injuries from a regional trauma center.

  10. Identification of Cardiac and Aortic Injuries in Trauma with Multi-detector Computed Tomography

    PubMed Central

    Shergill, Arvind K; Maraj, Tishan; Barszczyk, Mark S; Cheung, Helen; Singh, Navneet; Zavodni, Anna E

    2015-01-01

    Blunt and penetrating cardiovascular (CV) injuries are associated with a high morbidity and mortality. Rapid detection of these injuries in trauma is critical for patient survival. The advent of multi-detector computed tomography (MDCT) has led to increased detection of CV injuries during rapid comprehensive scanning of stabilized major trauma patients. MDCT has the ability to acquire images with a higher temporal and spatial resolution, as well as the capability to create multiplanar reformats. This pictorial review illustrates several common and life-threatening traumatic CV injuries from a regional trauma center. PMID:26430541

  11. Computational identification of putative miRNAs and their target genes in pathogenic amoeba Naegleria fowleri.

    PubMed

    Padmashree, Dyavegowda; Swamy, Narayanaswamy Ramachandra

    2015-01-01

    Naegleria fowleri is a parasitic, unicellular, free-living eukaryotic amoeba. The parasite spreads through contaminated water and causes primary amoebic meningoencephalitis (PAM); it is therefore of interest to understand its molecular pathogenesis. Hence, we analyzed the parasite genome for miRNAs (microRNAs), which are non-coding, single-stranded RNA molecules. Using computational methods, we identified 245 miRNAs in N. fowleri, of which five are conserved. The predicted miRNA targets were analyzed using the miRanda target-prediction software, and their functions were further studied by annotation with AmiGO (a gene ontology web tool).
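
Target prediction of the kind miRanda performs begins with seed complementarity. The toy scan below checks Watson-Crick pairing of miRNA positions 2-8 against a candidate UTR; the sequences are invented for illustration (the miRNA is merely let-7-like), and real tools add alignment scoring and thermodynamic filters.

```python
# Watson-Crick complements for RNA bases
comp = {"A": "U", "U": "A", "G": "C", "C": "G"}

def seed_matches(mirna, utr):
    """Return 0-based UTR positions matching the reverse complement of the
    miRNA seed (nucleotides 2-8)."""
    seed = mirna[1:8]                                # positions 2-8, 0-based slice
    site = "".join(comp[b] for b in reversed(seed))  # antiparallel pairing
    return [i for i in range(len(utr) - 6) if utr[i:i + 7] == site]

mirna = "UGAGGUAGUAGGUUGUAUAGUU"   # a let-7-like sequence, for illustration
utr = "AAACUACCUCAAAA"            # invented UTR containing one seed site
hits = seed_matches(mirna, utr)
```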

  12. Scale Score Comparability across Two Levels of a Norm-Referenced Math Computation Test for Students with Learning Disabilities. Out-of-Level Testing Report.

    ERIC Educational Resources Information Center

    Bielinski, John; Thurlow, Martha; Minnema, Jane; Scott, Jim

    In this study, special education teachers identified students with learning disabilities who were working on math skills usually taught two grades below the grade in which the student was enrolled. Each student (n=33) took two levels of the MAT/7 math computation test, an on-grade test and an out-of-level test intended for students two grades…

  13. Computational Identification of Genomic Features That Influence 3D Chromatin Domain Formation

    PubMed Central

    Mourad, Raphaël; Cuvier, Olivier

    2016-01-01

    Recent advances in long-range Hi-C contact mapping have revealed the importance of the 3D structure of chromosomes in gene expression. A current challenge is to identify the key molecular drivers of this 3D structure. Several genomic features, such as architectural proteins and functional elements, were shown to be enriched at topological domain borders using classical enrichment tests. Here we propose multiple logistic regression to identify those genomic features that positively or negatively influence domain border establishment or maintenance. The model is flexible, and can account for statistical interactions among multiple genomic features. Using both simulated and real data, we show that our model outperforms enrichment test and non-parametric models, such as random forests, for the identification of genomic features that influence domain borders. Using Drosophila Hi-C data at a very high resolution of 1 kb, our model suggests that, among architectural proteins, BEAF-32 and CP190 are the main positive drivers of 3D domain borders. In humans, our model identifies well-known architectural proteins CTCF and cohesin, as well as ZNF143 and Polycomb group proteins as positive drivers of domain borders. The model also reveals the existence of several negative drivers that counteract the presence of domain borders including P300, RXRA, BCL11A and ELK1. PMID:27203237
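
The central computation here, a logistic regression of border presence on genomic features including an interaction column, can be sketched with synthetic data. The feature names only echo the proteins discussed; the labeling rule, gradient-ascent settings, and data are our invented assumptions, not the authors' model.

```python
import math, random

def fit_logistic(X, y, lr=1.0, iters=300):
    """Plain batch gradient ascent on the logistic log-likelihood."""
    w = [0.0] * (len(X[0]) + 1)                      # bias term first
    for _ in range(iters):
        grad = [0.0] * len(w)
        for xi, yi in zip(X, y):
            z = w[0] + sum(wj * xj for wj, xj in zip(w[1:], xi))
            p = 1.0 / (1.0 + math.exp(-z))
            err = yi - p
            grad[0] += err
            for j, xj in enumerate(xi):
                grad[j + 1] += err * xj
        w = [wj + lr * g / len(X) for wj, g in zip(w, grad)]
    return w

rng = random.Random(0)
X, y = [], []
for _ in range(200):
    beaf32, cp190 = rng.random(), rng.random()       # hypothetical binding signals
    X.append([beaf32, cp190, beaf32 * cp190])        # interaction column
    y.append(1 if beaf32 + 0.3 * cp190 > 0.8 else 0) # synthetic border labels
weights = fit_logistic(X, y)
```

A positive fitted weight on a feature plays the role of a "positive driver" in the paper's terminology.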

  14. Computational identification and analysis of MADS box genes in Camellia sinensis

    PubMed Central

    Gogoi, Madhurjya; Borchetia, Sangeeta; Bandyopadhyay, Tanoy

    2015-01-01

    MADS (Minichromosome Maintenance1 Agamous Deficiens Serum response factor) box genes encode transcription factors and they play a key role in growth and development of flowering plants. There are two types of MADS box genes- Type I (serum response factor (SRF)-like) and Type II (myocyte enhancer factor 2 (MEF2)-like). Type II MADS box genes have a conserved MIKC domain (MADS DNA-binding domain, intervening domain, keratin-like domain, and c-terminal domain) and these were extensively studied in plants. Compared to other plants very little is known about MADS box genes in Camellia sinensis. The present study aims at identifying and analyzing the MADS-box genes present in Camellia sinensis. A comparative bioinformatics and phylogenetic analysis of the Camellia sinensis sequences along with Arabidopsis thaliana MADS box sequences available in the public domain databases led to the identification of 16 genes which were orthologous to Type II MADS box gene family members. The protein sequences were classified into distinct clades which are associated with the conserved function of flower and seed development. The identified genes may be used for gene expression and gene manipulation studies to elucidate their role in the development and flowering of tea which may pave the way to improve the crop productivity. PMID:25914445

  15. Statistical material parameters identification based on artificial neural networks for stochastic computations

    NASA Astrophysics Data System (ADS)

    Novák, Drahomír; Lehký, David

    2017-07-01

    A general methodology to obtain statistical material model parameters is presented. The procedure is based on the coupling of stochastic simulation and an artificial neural network. The identification parameters play the role of basic random variables with a scatter reflecting the physical range of possible values. The efficient small-sample simulation method Latin Hypercube Sampling is used for the stochastic preparation of the training set utilized in training the neural network. Once trained, the network provides an approximation that is then used to obtain the best possible set of model parameters for the given experimental data. The paper focuses on the statistical inverse analysis of material model parameters, where statistical moments (usually means and standard deviations) of input parameters have to be identified from experimental data. A hierarchical database of statistical parameters within the framework of reliability software is presented. The efficiency of the approach is verified using a numerical example of the determination of fracture-mechanical parameters of fiber-reinforced and plain concretes.
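
Latin Hypercube Sampling, the small-sample design used above to build the network's training set, guarantees that each parameter's range is covered by exactly one sample per stratum. A minimal sketch, with invented parameter bounds:

```python
import random

def latin_hypercube(n_samples, bounds, seed=1):
    """bounds: list of (lo, hi) per parameter; draws one point per stratum
    of each parameter, then shuffles strata to decouple the dimensions."""
    rng = random.Random(seed)
    cols = []
    for lo, hi in bounds:
        strata = [lo + (hi - lo) * (k + rng.random()) / n_samples
                  for k in range(n_samples)]
        rng.shuffle(strata)
        cols.append(strata)
    return [list(point) for point in zip(*cols)]

# e.g. two material parameters with hypothetical physical ranges
samples = latin_hypercube(10, [(0.0, 1.0), (20.0, 40.0)])
```

Each forward-model run at one of these points then contributes one input/output pair for training the surrogate network.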

  16. Computational identification and analysis of MADS box genes in Camellia sinensis.

    PubMed

    Gogoi, Madhurjya; Borchetia, Sangeeta; Bandyopadhyay, Tanoy

    2015-01-01

    MADS (Minichromosome Maintenance1 Agamous Deficiens Serum response factor) box genes encode transcription factors and they play a key role in growth and development of flowering plants. There are two types of MADS box genes- Type I (serum response factor (SRF)-like) and Type II (myocyte enhancer factor 2 (MEF2)-like). Type II MADS box genes have a conserved MIKC domain (MADS DNA-binding domain, intervening domain, keratin-like domain, and c-terminal domain) and these were extensively studied in plants. Compared to other plants very little is known about MADS box genes in Camellia sinensis. The present study aims at identifying and analyzing the MADS-box genes present in Camellia sinensis. A comparative bioinformatics and phylogenetic analysis of the Camellia sinensis sequences along with Arabidopsis thaliana MADS box sequences available in the public domain databases led to the identification of 16 genes which were orthologous to Type II MADS box gene family members. The protein sequences were classified into distinct clades which are associated with the conserved function of flower and seed development. The identified genes may be used for gene expression and gene manipulation studies to elucidate their role in the development and flowering of tea which may pave the way to improve the crop productivity.

  17. Identification of Covalent Binding Sites Targeting Cysteines Based on Computational Approaches.

    PubMed

    Zhang, Yanmin; Zhang, Danfeng; Tian, Haozhong; Jiao, Yu; Shi, Zhihao; Ran, Ting; Liu, Haichun; Lu, Shuai; Xu, Anyang; Qiao, Xin; Pan, Jing; Yin, Lingfeng; Zhou, Weineng; Lu, Tao; Chen, Yadong

    2016-09-06

    Covalent drugs have attracted increasing attention in recent years due to good inhibitory activity and selectivity. Targeting noncatalytic cysteines with irreversible inhibitors is a powerful approach for enhancing pharmacological potency and selectivity because cysteines can form covalent bonds with inhibitors through their nucleophilic thiol groups. However, most human kinases have multiple noncatalytic cysteines within the active site; accurately predicting which cysteine is most likely to form a covalent bond is of great importance but remains a challenge when designing irreversible inhibitors. In this work, FTMap was first applied to check its ability to predict the covalent binding site, defined as the region where covalent bonds are formed between cysteines and irreversible inhibitors. Results show that it performs excellently in detecting hot spots within the binding pocket, and its hydrogen bond interaction frequency analysis offers useful guidance for the identification of covalent-binding cysteines. Furthermore, we proposed a simple but useful covalent fragment probing approach and showed that it successfully predicted the covalent binding site of seven targets. By adopting a distance-based method, we observed that the closer the nucleophiles of covalent warheads are to the thiol group of a cysteine, the higher the possibility that the cysteine will form a covalent bond. We believe that the combination of FTMap and our distance-based covalent fragment probing method can become a useful tool in detecting the covalent binding sites of these targets.
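
The distance criterion described above reduces, in its simplest form, to ranking cysteines by the distance between their thiol sulfur and the warhead's reactive atom. A toy version with invented coordinates (the residue labels are illustrative, not from the paper):

```python
import math

def nearest_cysteine(warhead_xyz, thiol_sg_xyz):
    """Return the residue whose thiol sulfur (SG atom) lies closest to the
    warhead atom; thiol_sg_xyz maps residue label -> (x, y, z)."""
    return min(thiol_sg_xyz,
               key=lambda res: math.dist(warhead_xyz, thiol_sg_xyz[res]))

# hypothetical coordinates for two candidate cysteines
thiols = {"CYS481": (1.0, 0.5, 0.2), "CYS506": (6.2, 3.1, 4.0)}
site = nearest_cysteine((1.5, 0.0, 0.0), thiols)
```

In practice this ranking would be combined with the FTMap hot-spot and hydrogen-bond analyses rather than used alone.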

  18. Computational identification of human long intergenic non-coding RNAs using a GA-SVM algorithm.

    PubMed

    Wang, Yanqiu; Li, Yang; Wang, Qi; Lv, Yingli; Wang, Shiyuan; Chen, Xi; Yu, Xuexin; Jiang, Wei; Li, Xia

    2014-01-01

    Long intergenic non-coding RNAs (lincRNAs) are a new type of non-coding RNAs and are closely related with the occurrence and development of diseases. In previous studies, most lincRNAs have been identified through next-generation sequencing. Because lincRNAs exhibit tissue-specific expression, the reproducibility of lincRNA discovery in different studies is very poor. In this study, we used sequence, structural, and protein-coding potential features, but not lincRNA expression, to construct a classifier that distinguishes lincRNAs from non-lincRNAs. The GA-SVM algorithm was performed to extract the optimized feature subset. Compared with several feature subsets, the five-fold cross validation results showed that this optimized feature subset exhibited the best performance for the identification of human lincRNAs. Moreover, the LincRNA Classifier based on Selected Features (linc-SF) was constructed by support vector machine (SVM) based on the optimized feature subset. The performance of this classifier was further evaluated by predicting lincRNAs from two independent lincRNA sets. The recognition rates for the two lincRNA sets, 100% and 99.8%, show that linc-SF is effective for the prediction of human lincRNAs.
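
The GA part of GA-SVM searches over feature-subset bitmasks. The bare-bones elitist genetic algorithm below shows the mechanics; to stay self-contained, the SVM cross-validation fitness is replaced by an invented toy fitness that rewards selecting two "informative" features (indices 0 and 3) and penalizes subset size.

```python
import random

def evolve(n_features, fitness, pop_size=20, gens=40, seed=2):
    """Elitist GA over feature-selection bitmasks."""
    rng = random.Random(seed)
    population = [[rng.randint(0, 1) for _ in range(n_features)]
                  for _ in range(pop_size)]
    for _ in range(gens):
        ranked = sorted(population, key=fitness, reverse=True)
        parents = ranked[:pop_size // 2]        # keep the fitter half
        children = []
        while len(parents) + len(children) < pop_size:
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, n_features)
            child = a[:cut] + b[cut:]           # one-point crossover
            if rng.random() < 0.2:              # occasional bit-flip mutation
                i = rng.randrange(n_features)
                child[i] = 1 - child[i]
            children.append(child)
        population = parents + children
    return max(population, key=fitness)

def toy_fitness(mask):
    # stand-in for SVM cross-validation accuracy: features 0 and 3 are
    # "informative", and every selected feature pays a small penalty
    return mask[0] + mask[3] - 0.2 * sum(mask)

best_subset = evolve(6, toy_fitness)
```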

  19. High-level waste storage tank farms/242-A evaporator standards/requirements identification document (S/RID), Vol. 7

    SciTech Connect

    Not Available

    1994-04-01

    This Requirements Identification Document (RID) describes an Occupational Health and Safety Program as defined through the Relevant DOE Orders, regulations, industry codes/standards, industry guidance documents and, as appropriate, good industry practice. The definition of an Occupational Health and Safety Program as specified by this document is intended to address Defense Nuclear Facilities Safety Board Recommendations 90-2 and 91-1, which call for the strengthening of DOE complex activities through the identification and application of relevant standards which supplement or exceed requirements mandated by DOE Orders. This RID applies to the activities, personnel, structures, systems, components, and programs involved in maintaining the facility and executing the mission of the High-Level Waste Storage Tank Farms.

  20. Self-Assessment and Student Improvement in an Introductory Computer Course at the Community College-Level

    ERIC Educational Resources Information Center

    Spicer-Sutton, Jama

    2013-01-01

    The purpose of this study was to determine a student's computer knowledge upon course entry and if there was a difference in college students' improvement scores as measured by the difference in pretest and posttest scores of new or novice users, moderate users, and expert users at the end of a college-level introductory computing class. This…

  2. Myocardial strain estimation from CT: towards computer-aided diagnosis on infarction identification

    NASA Astrophysics Data System (ADS)

    Wong, Ken C. L.; Tee, Michael; Chen, Marcus; Bluemke, David A.; Summers, Ronald M.; Yao, Jianhua

    2015-03-01

    Regional myocardial strains have the potential for early quantification and detection of cardiac dysfunctions. Although image modalities such as tagged and strain-encoded MRI can provide motion information of the myocardium, they are uncommon in clinical routine. In contrast, cardiac CT images are usually available, but they only provide motion information at salient features such as the cardiac boundaries. To estimate myocardial strains from a CT image sequence, we adopted a cardiac biomechanical model with hyperelastic material properties to relate the motion on the cardiac boundaries to the myocardial deformation. The frame-to-frame displacements of the cardiac boundaries are obtained using B-spline deformable image registration based on mutual information, which are enforced as boundary conditions to the biomechanical model. The system equation is solved by the finite element method to provide the dense displacement field of the myocardium, and the regional values of the three principal strains and the six strains in cylindrical coordinates are computed in terms of the American Heart Association nomenclature. To study the potential of the estimated regional strains on identifying myocardial infarction, experiments were performed on cardiac CT image sequences of ten canines with artificially induced myocardial infarctions. The leave-one-subject-out cross validations show that, by using the optimal strain magnitude thresholds computed from ROC curves, the radial strain and the first principal strain have the best performance.
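
The "optimal strain magnitude threshold computed from ROC curves" step can be sketched by sweeping candidate cutoffs and keeping the one maximizing Youden's J (sensitivity + specificity - 1), one common choice of ROC operating point; whether the authors used Youden's J specifically is our assumption, and the strain values below are invented.

```python
def best_threshold(values, labels):
    """Sweep cutoffs over observed values; label 1 = infarcted segment,
    expected to show low strain magnitude. Returns (threshold, Youden J)."""
    best_t, best_j = None, -1.0
    for t in sorted(set(values)):
        tp = sum(v <= t for v, l in zip(values, labels) if l == 1)
        fn = sum(v > t for v, l in zip(values, labels) if l == 1)
        tn = sum(v > t for v, l in zip(values, labels) if l == 0)
        fp = sum(v <= t for v, l in zip(values, labels) if l == 0)
        j = tp / (tp + fn) + tn / (tn + fp) - 1.0   # sensitivity + specificity - 1
        if j > best_j:
            best_t, best_j = t, j
    return best_t, best_j

# invented strain magnitudes: infarcted segments (label 1) deform less
strain = [0.05, 0.08, 0.10, 0.22, 0.25, 0.30]
label = [1, 1, 1, 0, 0, 0]
threshold, youden = best_threshold(strain, label)
```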

  3. Computational Identification of Mechanistic Factors That Determine the Timing and Intensity of the Inflammatory Response

    PubMed Central

    Nagaraja, Sridevi; Reifman, Jaques; Mitrophanov, Alexander Y.

    2015-01-01

    Timely resolution of inflammation is critical for the restoration of homeostasis in injured or infected tissue. Chronic inflammation is often characterized by a persistent increase in the concentrations of inflammatory cells and molecular mediators, whose distinct amount and timing characteristics offer an opportunity to identify effective therapeutic regulatory targets. Here, we used our recently developed computational model of local inflammation to identify potential targets for molecular interventions and to investigate the effects of individual and combined inhibition of such targets. This was accomplished via the development and application of computational strategies involving the simulation and analysis of thousands of inflammatory scenarios. We found that modulation of macrophage influx and efflux is an effective potential strategy to regulate the amount of inflammatory cells and molecular mediators in both normal and chronic inflammatory scenarios. We identified three molecular mediators, tumor necrosis factor-α (TNF-α), transforming growth factor-β (TGF-β), and the chemokine CXCL8, as potential molecular targets whose individual or combined inhibition may robustly regulate both the amount and timing properties of the kinetic trajectories for neutrophils and macrophages in chronic inflammation. Modulation of macrophage flux, as well as of the abundance of TNF-α, TGF-β, and CXCL8, may improve the resolution of chronic inflammation. PMID:26633296
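
The kind of in-silico intervention experiment described above, simulate, perturb one flux, compare trajectories, can be illustrated with a toy two-variable ODE (neutrophils N, macrophages M) integrated by forward Euler. All rates and the model form are invented for the sketch, not taken from the authors' model.

```python
def simulate(macrophage_influx, steps=2000, dt=0.01):
    """Forward-Euler integration of a toy neutrophil/macrophage model;
    returns the macrophage peak over the simulated window."""
    n = m = peak_m = 0.0
    for i in range(steps):
        stim = 1.0 if i * dt < 5.0 else 0.0     # transient insult
        dn = 2.0 * stim - 0.5 * n               # neutrophil influx and decay
        dm = macrophage_influx * n - 0.2 * m    # macrophages recruited by n
        n += dt * dn
        m += dt * dm
        peak_m = max(peak_m, m)
    return peak_m

baseline = simulate(macrophage_influx=0.3)
inhibited = simulate(macrophage_influx=0.15)    # simulated influx inhibition
```

Lowering the influx parameter lowers the macrophage peak, mirroring the paper's conclusion that modulating macrophage flux regulates mediator and cell amounts.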

  4. Identification of miRNAs Potentially Involved in Bronchiolitis Obliterans Syndrome: A Computational Study

    PubMed Central

    Politano, Gianfranco; Inghilleri, Simona; Morbini, Patrizia; Calabrese, Fiorella; Benso, Alfredo; Savino, Alessandro; Cova, Emanuela; Zampieri, Davide; Meloni, Federica

    2016-01-01

    The pathogenesis of Bronchiolitis Obliterans Syndrome (BOS), the main clinical phenotype of chronic lung allograft dysfunction, is poorly understood. Recent studies suggest that epigenetic regulation of microRNAs might play a role in its development. In this paper we present the application of a complex computational pipeline to perform enrichment analysis of miRNAs in pathways applied to the study of BOS. The analysis considered the full set of miRNAs annotated in miRBase (version 21), and applied a sequence of filtering approaches and statistical analyses to reduce this set and to score the candidate miRNAs according to their potential involvement in BOS development. Dysregulation of two of the selected candidate miRNAs, miR-34a and miR-21, was clearly shown by in-situ hybridization (ISH) on five explanted human BOS lungs and on a rat model of acute and chronic lung rejection, thus definitely identifying miR-34a and miR-21 as pathogenic factors in BOS and confirming the effectiveness of the computational pipeline. PMID:27564214
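
Enrichment steps in pipelines of this kind are often hypergeometric tests: given K pathway genes among N total, how surprising is it to see at least k of them among a miRNA's n predicted targets? A stdlib sketch with invented numbers (this is a generic enrichment test, not necessarily the exact statistic the authors used):

```python
from math import comb

def hypergeom_pval(N, K, n, k):
    """P(X >= k) when n targets are drawn from N genes of which K are in
    the pathway (hypergeometric upper tail)."""
    return sum(comb(K, i) * comb(N - K, n - i)
               for i in range(k, min(K, n) + 1)) / comb(N, n)

# invented example: 5 of a miRNA's 20 targets fall in a 50-gene pathway
p = hypergeom_pval(N=1000, K=50, n=20, k=5)
```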

  5. Facile identification of dual FLT3-Aurora A inhibitors: a computer-guided drug design approach.

    PubMed

    Chang Hsu, Yung; Ke, Yi-Yu; Shiao, Hui-Yi; Lee, Chieh-Chien; Lin, Wen-Hsing; Chen, Chun-Hwa; Yen, Kuei-Jung; Hsu, John T-A; Chang, Chungming; Hsieh, Hsing-Pang

    2014-05-01

    Computer-guided drug design is a powerful tool for drug discovery. Herein we disclose the use of this approach for the discovery of dual FMS-like receptor tyrosine kinase-3 (FLT3)-Aurora A inhibitors against cancer. An Aurora hit compound was selected as a starting point, from which 288 virtual molecules were screened. Subsequently, some of these were synthesized and evaluated for their capacity to inhibit FLT3 and Aurora kinase A. To further enhance FLT3 inhibition, structure-activity relationship studies of the lead compound were conducted through a simplification strategy and bioisosteric replacement, followed by the use of computer-guided drug design to prioritize molecules bearing a variety of different terminal groups in terms of favorable binding energy. Selected compounds were then synthesized, and their bioactivity was evaluated. Of these, one novel inhibitor was found to exhibit excellent inhibition of FLT3 and Aurora kinase A and exert a dramatic antiproliferative effect on MOLM-13 and MV4-11 cells, with an IC50 value of 7 nM. Accordingly, it is considered a highly promising candidate for further development.

  6. Identification of miRNAs Potentially Involved in Bronchiolitis Obliterans Syndrome: A Computational Study.

    PubMed

    Di Carlo, Stefano; Rossi, Elena; Politano, Gianfranco; Inghilleri, Simona; Morbini, Patrizia; Calabrese, Fiorella; Benso, Alfredo; Savino, Alessandro; Cova, Emanuela; Zampieri, Davide; Meloni, Federica

    2016-01-01

    The pathogenesis of Bronchiolitis Obliterans Syndrome (BOS), the main clinical phenotype of chronic lung allograft dysfunction, is poorly understood. Recent studies suggest that epigenetic regulation of microRNAs might play a role in its development. In this paper we present the application of a complex computational pipeline to perform enrichment analysis of miRNAs in pathways applied to the study of BOS. The analysis considered the full set of miRNAs annotated in miRBase (version 21), and applied a sequence of filtering approaches and statistical analyses to reduce this set and to score the candidate miRNAs according to their potential involvement in BOS development. Dysregulation of two of the selected candidate miRNAs, miR-34a and miR-21, was clearly shown by in-situ hybridization (ISH) on five explanted human BOS lungs and on a rat model of acute and chronic lung rejection, thus definitely identifying miR-34a and miR-21 as pathogenic factors in BOS and confirming the effectiveness of the computational pipeline.

  7. Identification of contaminant source architectures—A statistical inversion that emulates multiphase physics in a computationally practicable manner

    NASA Astrophysics Data System (ADS)

    Koch, J.; Nowak, W.

    2016-02-01

    The goal of this work is to improve the inference of nonaqueous-phase contaminated source zone architectures (CSA) from field data. We follow the idea that a physically motivated model for CSA formation helps in this inference by providing relevant relationships between observables and the unknown CSA. Typical multiphase models are computationally too expensive to be applied for inverse modeling; thus, state-of-the-art CSA identification techniques do not yet use physically based CSA formation models. To overcome this shortcoming, we apply a stochastic multiphase model with reduced computational effort that can be used to generate a large ensemble of possible CSA realizations. Further, we apply a reverse transport formulation in order to accelerate the inversion of transport-related data such as downgradient aqueous-phase concentrations. We combine these approaches within an inverse Bayesian methodology for joint inversion of CSA and aquifer parameters. Because we use multiphase physics to constrain and inform the inversion, (1) only physically meaningful CSAs are inferred; (2) each conditional realization is statistically meaningful; (3) we obtain physically meaningful spatial dependencies for interpolation and extrapolation of point-like observations between the different involved unknowns and observables, and (4) dependencies far beyond simple correlation; (5) the inversion yields meaningful uncertainty bounds. We illustrate our concept by inferring three-dimensional probability distributions of DNAPL residence, contaminant mass discharge, and of other CSA characteristics. In the inference example, we use synthetic numerical data on permeability, DNAPL saturation and downgradient aqueous-phase concentration, and we substantiate our claims about the advantages of emulating a multiphase flow model with reduced computational requirement in the inversion.
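
The ensemble-based Bayesian idea, draw many candidate source realizations from a prior, push each through a cheap forward model, and weight by misfit to the observation, can be heavily simplified to a one-parameter importance-weighting sketch. The forward model, prior range, and observation below are invented stand-ins for the multiphase/transport machinery.

```python
import math, random

def forward(source_mass):
    """Toy stand-in for the forward model: downgradient concentration
    proportional to source mass."""
    return 0.4 * source_mass

rng = random.Random(3)
obs, sigma = 2.0, 0.2                                   # observed concentration
prior = [rng.uniform(0.0, 20.0) for _ in range(5000)]   # prior ensemble
weights = [math.exp(-0.5 * ((forward(m) - obs) / sigma) ** 2) for m in prior]
posterior_mean = sum(m * w for m, w in zip(prior, weights)) / sum(weights)
```

Because the likelihood concentrates near source mass obs/0.4 = 5, the weighted posterior mean lands close to 5; the real inversion does the analogous conditioning over full three-dimensional CSA realizations.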

  8. Patient dose, gray level and exposure index with a computed radiography system

    NASA Astrophysics Data System (ADS)

    Silva, T. R.; Yoshimura, E. M.

    2014-02-01

    Computed radiography (CR) is gradually replacing conventional screen-film systems in Brazil. To assess image quality, manufacturers provide the calculation of an exposure index through the acquisition software of the CR system. The objective of this study is to verify whether the CR image can also be used to evaluate patient absorbed dose, through a relationship between the entrance skin dose and the exposure index or the gray level values obtained in the image. The CR system used for this study (Agfa model 30-X with NX acquisition software) calculates an exposure index called Log of the Median (lgM), related to the absorbed dose to the IP. The lgM value depends on the average gray level (called Scan Average Level (SAL)) of the segmented pixel value histogram of the whole image. A Rando male phantom was used to simulate a human body (chest and head), and was irradiated with X-ray equipment, using usual radiologic techniques for chest exams. Thermoluminescent dosimeters (LiF, TLD100) were used to evaluate entrance skin dose and exit dose. The results showed a logarithmic relation between entrance dose and SAL in the image center, regardless of the beam filtration. The exposure index varies linearly with the entrance dose, but the angular coefficient is beam quality dependent. We conclude that, with an adequate calibration, the CR system can be used to evaluate the patient absorbed dose.
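
The linear relation reported between exposure index and entrance dose amounts to fitting a line and reading off its angular coefficient (slope). A minimal least-squares sketch; the dose points and the coefficients 1.9 and 0.8 are fabricated to mimic an ideal detector, not measured values from the study.

```python
def linfit(xs, ys):
    """Ordinary least-squares line fit; returns (intercept, slope)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return my - slope * mx, slope

entrance_dose = [0.1, 0.2, 0.5, 1.0, 2.0]       # mGy, invented values
lgm = [1.9 + 0.8 * d for d in entrance_dose]    # idealized linear response
intercept, slope = linfit(entrance_dose, lgm)
```

Repeating the fit for different beam qualities would yield different slopes, which is the beam-quality dependence the abstract notes.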

  9. Computer models of possible physiological contributions to low-level auditory scene analysis

    NASA Astrophysics Data System (ADS)

    Meddis, Ray

    2004-05-01

    Auditory selective attention is a general term for a wide range of phenomena including grouping and streaming of sound sources. While we know a great deal about the circumstances in which these phenomena occur, we have little understanding of the physiological mechanisms that give rise to these effects. Because attention is sometimes under conscious control, it is tempting to conclude that attention is a high-level/cortical function and beyond our current understanding of brain physiology. However, a number of mechanisms operating at the level of the brainstem may well make an important contribution to auditory scene analysis. Because we know much more about the response of the brainstem to auditory stimulation, we can begin some speculative modeling concerning their possible involvement. Two mechanisms will be discussed in terms of their possible relevance: lateral and recurrent inhibition at the level of the cochlear nucleus. These are likely to contribute to the selection of auditory channels. A new approach to within-channel selection on the basis of pitch will also be discussed. These approaches will be illustrated using computer models of the underlying physiology and their response to stimuli used in psychophysical experiments.

  10. A Low-Rank Method for Characterizing High-Level Neural Computations

    PubMed Central

    Kaardal, Joel T.; Theunissen, Frédéric E.; Sharpee, Tatyana O.

    2017-01-01

    The signal transformations that take place in high-level sensory regions of the brain remain enigmatic because of the many nonlinear transformations that separate responses of these neurons from the input stimuli. One would like to have dimensionality reduction methods that can describe responses of such neurons in terms of operations on a large but still manageable set of relevant input features. A number of methods have been developed for this purpose, but often these methods rely on the expansion of the input space to capture as many relevant stimulus components as statistically possible. This expansion leads to a lower effective sampling thereby reducing the accuracy of the estimated components. Alternatively, so-called low-rank methods explicitly search for a small number of components in the hope of achieving higher estimation accuracy. Even with these methods, however, noise in the neural responses can force the models to estimate more components than necessary, again reducing the methods' accuracy. Here we describe how a flexible regularization procedure, together with an explicit rank constraint, can strongly improve the estimation accuracy compared to previous methods suitable for characterizing neural responses to natural stimuli. Applying the proposed low-rank method to responses of auditory neurons in the songbird brain, we find multiple relevant components making up the receptive field for each neuron and characterize their computations in terms of logical OR and AND computations. The results highlight potential differences in how invariances are constructed in visual and auditory systems. PMID:28824408
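
A rank-1 version of the low-rank idea can be sketched with power iteration: if the (time x frequency) filter is assumed rank 1, its dominant temporal and spectral profiles can be pulled out from a noisy matrix. This is a minimal stand-in for the regularized low-rank estimation in the paper; the filter below is built synthetically as an outer product plus small noise.

```python
import random

def matvec(M, v):
    return [sum(mij * vj for mij, vj in zip(row, v)) for row in M]

def transpose(M):
    return [list(col) for col in zip(*M)]

def rank1(M, iters=100):
    """Power iteration on M^T M: returns (temporal profile u, spectral
    profile v), with v scaled to unit max-magnitude."""
    v = [1.0] * len(M[0])
    for _ in range(iters):
        v = matvec(transpose(M), matvec(M, v))
        norm = max(abs(x) for x in v) or 1.0
        v = [x / norm for x in v]
    return matvec(M, v), v

rng = random.Random(4)
t_true = [0.0, 1.0, 2.0, 1.0]                   # synthetic temporal profile
f_true = [1.0, 0.5, 0.0]                        # synthetic spectral profile
M = [[ti * fj + 0.01 * rng.gauss(0, 1) for fj in f_true] for ti in t_true]
u, v = rank1(M)
```

The recovered v tracks the planted spectral profile despite the noise; the paper's method adds explicit rank constraints beyond 1 and flexible regularization.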

  11. Computational Approaches to Analyze and Predict Small Molecule Transport and Distribution at Cellular and Subcellular Levels

    PubMed Central

    Ah Min, Kyoung; Zhang, Xinyuan; Yu, Jing-yu; Rosania, Gus R.

    2013-01-01

    Quantitative structure-activity relationship (QSAR) studies and mechanistic mathematical modeling approaches have been independently employed for analyzing and predicting the transport and distribution of small molecule chemical agents in living organisms. Both of these computational approaches have been useful to interpret experiments measuring the transport properties of small molecule chemical agents, in vitro and in vivo. Nevertheless, mechanistic cell-based pharmacokinetic models have been especially useful to guide the design of experiments probing the molecular pathways underlying small molecule transport phenomena. Unlike QSAR models, mechanistic models can be integrated from microscopic to macroscopic levels, to analyze the spatiotemporal dynamics of small molecule chemical agents from intracellular organelles to whole organs, well beyond the experiments and training data sets upon which the models are based. Based on differential equations, mechanistic models can also be integrated with other differential equations-based systems biology models of biochemical networks or signaling pathways. Although the origin and evolution of mathematical modeling approaches aimed at predicting drug transport and distribution has occurred independently from systems biology, we propose that the incorporation of mechanistic cell-based computational models of drug transport and distribution into a systems biology modeling framework is a logical next-step for the advancement of systems pharmacology research. PMID:24218242

  12. Computational approaches to analyse and predict small molecule transport and distribution at cellular and subcellular levels.

    PubMed

    Min, Kyoung Ah; Zhang, Xinyuan; Yu, Jing-yu; Rosania, Gus R

    2014-01-01

    Quantitative structure-activity relationship (QSAR) studies and mechanistic mathematical modeling approaches have been independently employed for analysing and predicting the transport and distribution of small molecule chemical agents in living organisms. Both of these computational approaches have been useful for interpreting experiments measuring the transport properties of small molecule chemical agents, in vitro and in vivo. Nevertheless, mechanistic cell-based pharmacokinetic models have been especially useful to guide the design of experiments probing the molecular pathways underlying small molecule transport phenomena. Unlike QSAR models, mechanistic models can be integrated from microscopic to macroscopic levels, to analyse the spatiotemporal dynamics of small molecule chemical agents from intracellular organelles to whole organs, well beyond the experiments and training data sets upon which the models are based. Based on differential equations, mechanistic models can also be integrated with other differential equations-based systems biology models of biochemical networks or signaling pathways. Although the origin and evolution of mathematical modeling approaches aimed at predicting drug transport and distribution has occurred independently from systems biology, we propose that the incorporation of mechanistic cell-based computational models of drug transport and distribution into a systems biology modeling framework is a logical next step for the advancement of systems pharmacology research. Copyright © 2013 John Wiley & Sons, Ltd.

  13. Computer assisted surgery. Its usefulness in different levels of pre-operative deformities.

    PubMed

    Benavente, P; López Orosa, C; Oteo Maldonado, J A; Orois Codesal, A; García Lázaro, F J

    2015-01-01

To compare the results obtained with computer assisted surgery with those of conventional techniques, and to evaluate the influence of navigation at different levels of pre-operative deformity. A retrospective study was conducted on 100 cases of primary total knee arthroplasty performed with conventional or computer assisted surgery. A comparison was made of the post-operative mechanical axis of the lower limb between both groups and in terms of pre-operative deformity. Optimal alignment was obtained more often by using the navigation system (62%) than by the conventional technique (36%). Patients with deformities under 10° varus showed a mean post-operative alignment within the optimal range (0±3° deviation from the neutral mechanical axis), while those with more than 15° of varus showed an alignment out of range, regardless of the technique used (p=.002). In those with a deformity of between 10 and 15° of pre-operative varus alignment, values were found closer to the neutral axis in the navigation group (178.7°) than in the conventional technique group (175.5°), although these differences were not statistically significant (p=.127). Post-operative alignment obtained with navigation is better than with the conventional technique, with a smaller percentage of cases out of range, and greater accuracy in placing implants. A potential benefit was observed in navigation for cases with deformities of between 10 and 15° of varus. Copyright © 2014 SECOT. Published by Elsevier España. All rights reserved.

  14. Effectiveness of Computer Assisted Instructions (CAI) in Teaching of Mathematics at Secondary Level

    NASA Astrophysics Data System (ADS)

    Dhevakrishnan, R.; Devi, S.; Chinnaiyan, K.

    2012-09-01

The present study examined the effectiveness of computer assisted instructions (CAI) in the teaching of mathematics at the secondary level, adopting an experimental method and observing the difference between CAI and the traditional method. A sample of sixty (60) students of class IX in VVB Matriculation Higher Secondary School at Elayampalayam, Namakkal district, was selected and divided into two groups, namely an experimental and a control group. The experimental group consisted of 30 students who were taught 'Mensuration' by computer assisted instructions, and the control group comprising 30 students was taught by the conventional method of teaching. Data were analyzed using mean, S.D. and t-test. Findings of the study clearly point out that a significant increase in the mean gain scores was found in the post-test scores of the experimental group. Significant differences were found between the control group and the experimental group on post-test gain scores. The experimental group, which was taught by CAI, showed better learning. The conclusion is evident that CAI is an effective medium of instruction for teaching mathematics to secondary students.

  15. Two-Level Verification of Data Integrity for Data Storage in Cloud Computing

    NASA Astrophysics Data System (ADS)

    Xu, Guangwei; Chen, Chunlin; Wang, Hongya; Zang, Zhuping; Pang, Mugen; Jiang, Ping

Data storage in cloud computing can save capital expenditure and relieve the burden of storage management for users. As the loss or corruption of stored files may happen, many researchers focus on the verification of data integrity. However, massive numbers of users often bring large numbers of verifying tasks for the auditor. Moreover, users also need to pay an extra fee for these verifying tasks beyond the storage fee. Therefore, we propose a two-level verification of data integrity to alleviate these problems. The key idea is for users to routinely verify the data integrity themselves, and for the auditor to arbitrate any challenge between the user and the cloud provider according to the MACs and ϕ values. Extensive performance simulations show that the proposed scheme clearly decreases the auditor's verifying tasks and the ratio of wrong arbitrations.
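    The user-side routine check described above can be sketched with a keyed MAC over stored blocks. This is a generic HMAC-based sketch of the idea, not the paper's exact scheme (which additionally uses ϕ values for auditor arbitration):

```python
import hmac
import hashlib
import secrets

# The user keeps a secret key and a locally stored tag per file block.
key = secrets.token_bytes(32)

def tag(block: bytes) -> bytes:
    """MAC tag computed over a file block."""
    return hmac.new(key, block, hashlib.sha256).digest()

def verify(block: bytes, stored_tag: bytes) -> bool:
    """Routine user-level check: re-fetch the block, recompute, compare."""
    return hmac.compare_digest(tag(block), stored_tag)

block = b"file block #17"
t = tag(block)
ok = verify(block, t)                # intact block passes
tampered = verify(b"corrupted", t)   # corruption is detected
```

Only when such a user-level check fails would the dispute be escalated to the auditor, which is what keeps the auditor's workload low in the proposed scheme.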

  16. Computational methodology to predict satellite system-level effects from impacts of untrackable space debris

    NASA Astrophysics Data System (ADS)

    Welty, N.; Rudolph, M.; Schäfer, F.; Apeldoorn, J.; Janovsky, R.

    2013-07-01

    This paper presents a computational methodology to predict the satellite system-level effects resulting from impacts of untrackable space debris particles. This approach seeks to improve on traditional risk assessment practices by looking beyond the structural penetration of the satellite and predicting the physical damage to internal components and the associated functional impairment caused by untrackable debris impacts. The proposed method combines a debris flux model with the Schäfer-Ryan-Lambert ballistic limit equation (BLE), which accounts for the inherent shielding of components positioned behind the spacecraft structure wall. Individual debris particle impact trajectories and component shadowing effects are considered and the failure probabilities of individual satellite components as a function of mission time are calculated. These results are correlated to expected functional impairment using a Boolean logic model of the system functional architecture considering the functional dependencies and redundancies within the system.
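    The Boolean logic model of the functional architecture can be illustrated with a toy system: a hypothetical payload function that requires power AND at least one of two redundant transceivers. The component survival probabilities below are assumed values; in the methodology they would come from the debris flux model and the ballistic limit equation.

```python
# Assumed component survival probabilities over the mission time.
p_power = 0.99
p_tx_a = 0.90
p_tx_b = 0.90  # redundant twin of transceiver A

# Redundancy: the function survives if at least one transceiver survives
# (components treated as independent).
p_tx_any = 1.0 - (1.0 - p_tx_a) * (1.0 - p_tx_b)

# Functional dependency: payload needs power AND a working transceiver.
p_system = p_power * p_tx_any
```

Evaluating such Boolean expressions over component failure probabilities as a function of mission time is what maps component-level damage to expected system-level functional impairment.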

  17. Computational identification of the selenocysteine tRNA (tRNASec) in genomes

    PubMed Central

    2017-01-01

Selenocysteine (Sec) is known as the 21st amino acid, a cysteine analogue with selenium replacing sulphur. Sec is inserted co-translationally in a small fraction of proteins called selenoproteins. In selenoprotein genes, the Sec specific tRNA (tRNASec) drives the recoding of highly specific UGA codons from stop signals to Sec. Although found in organisms from the three domains of life, Sec is not universal. Many species are completely devoid of selenoprotein genes and lack the ability to synthesize Sec. Since tRNASec is a key component in selenoprotein biosynthesis, its efficient identification in genomes is instrumental to characterize the utilization of Sec across lineages. Available tRNA prediction methods fail to accurately predict tRNASec, due to its unusual structural fold. Here, we present Secmarker, a method based on manually curated covariance models capturing the specific tRNASec structure in archaea, bacteria and eukaryotes. We exploited the non-universality of Sec to build a proper benchmark set for tRNASec predictions, which is not possible for the predictions of other tRNAs. We show that Secmarker greatly improves the accuracy of previously existing methods, constituting a valuable tool to identify tRNASec genes and to efficiently determine whether a genome contains selenoproteins. We used Secmarker to analyze a large set of fully sequenced genomes, and the results revealed new insights into the biology of tRNASec, led to the discovery of a novel bacterial selenoprotein family, and shed additional light on the phylogenetic distribution of selenoprotein containing genomes. Secmarker is freely accessible for download, or online analysis through a web server at http://secmarker.crg.cat. PMID:28192430

  18. Computational identification and comparative analysis of miRNA precursors in three palm species.

    PubMed

    da Silva, Aline Cunha; Grativol, Clícia; Thiebaut, Flávia; Hemerly, Adriana Silva; Ferreira, Paulo Cavalcanti Gomes

    2016-05-01

In the present study, miRNA precursors in the genomes of three palm species were identified. Analyses of sequence conservation and of the biological function of their putative targets contribute to understanding the roles of miRNAs in palm biology. MicroRNAs are small RNAs of 20-25 nucleotides in length, with important functions in the regulation of gene expression. Recent genome sequencing of the palm species Elaeis guineensis, Elaeis oleifera and Phoenix dactylifera has enabled the discovery of miRNA genes, which can be used as biotechnological tools in palm tree breeding. The goal of this study is the identification of miRNA precursors in the genomes of these species and their possible biological roles suggested by the mature miRNA-based regulation of target genes. Mature miRNA sequences from Arabidopsis thaliana, Oryza sativa, and Zea mays available at miRBase were used to predict microRNA precursors in the palm genomes. Three hundred and thirty-eight precursors, ranging from 76 to 220 nucleotides (nt) in size and distributed in 33 families, were identified. Moreover, we also identified 266 miRNA precursors of Musa acuminata, which is phylogenetically close to palm species. To understand the biological function of palm miRNAs, 374 putative miRNA targets were identified. An enrichment analysis of target-gene function was carried out using the agriGO tool. The results showed that the targets are involved in plant developmental processes, mainly regulating root development. Our findings contribute to increasing the knowledge on microRNA roles in palm biology and could help breeding programs of palm trees.
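    The homology step, locating known mature miRNAs inside candidate genomic sequences, can be sketched as a naive mismatch-tolerant scan. The sequences below are toy examples; a real screen would query miRBase mature sequences against the palm genome assemblies.

```python
def find_with_mismatches(genome: str, mature: str, max_mm: int = 2):
    """Return (position, mismatches) for every window of the genomic
    sequence that matches the mature miRNA with at most max_mm mismatches."""
    hits = []
    for i in range(len(genome) - len(mature) + 1):
        window = genome[i:i + len(mature)]
        mm = sum(a != b for a, b in zip(window, mature))
        if mm <= max_mm:
            hits.append((i, mm))
    return hits

# Toy genomic window containing an exact copy of a toy mature miRNA.
hits = find_with_mismatches("GGCUAGCAUGCUUACGGAUCC", "AUGCUUACG")
```

Candidate loci found this way would then be folded and filtered for the characteristic hairpin secondary structure before being accepted as precursors.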

  19. Computational approaches for identification of conserved/unique binding pockets in the A chain of ricin

    SciTech Connect

    Ecale Zhou, C L; Zemla, A T; Roe, D; Young, M; Lam, M; Schoeniger, J; Balhorn, R

    2005-01-29

Specific and sensitive ligand-based protein detection assays that employ antibodies or small molecules such as peptides and aptamers require that the corresponding surface region of the protein be accessible and that there be minimal cross-reactivity with non-target proteins. To reduce the time and cost of laboratory screening efforts for diagnostic reagents, we developed new methods for evaluating and selecting protein surface regions for ligand targeting. We devised combined structure- and sequence-based methods for identifying 3D epitopes and binding pockets on the surface of the A chain of ricin that are conserved with respect to a set of ricin A chains and unique with respect to other proteins. We (1) used structure alignment software to detect structural deviations and extracted from this analysis the residue-residue correspondence, (2) devised a method to compare corresponding residues across sets of ricin structures and structures of closely related proteins, (3) devised a sequence-based approach to determine residue infrequency in local sequence context, and (4) modified a pocket-finding algorithm to identify surface crevices in close proximity to residues determined to be conserved/unique based on our structure- and sequence-based methods. In applying this combined informatics approach to ricin A, we identified a conserved/unique pocket in close proximity to (but not overlapping) the active site that is suitable for bi-dentate ligand development. These methods are generally applicable to the identification of surface epitopes and binding pockets for the development of diagnostic reagents, therapeutics, and vaccines.

  20. Computational Identification and Systematic Classification of Novel Cytochrome P450 Genes in Salvia miltiorrhiza

    PubMed Central

    Nelson, David R.; Wu, Kai; Liu, Chang

    2014-01-01

    Salvia miltiorrhiza is one of the most economically important medicinal plants. Cytochrome P450 (CYP450) genes have been implicated in the biosynthesis of its active components. However, only a dozen full-length CYP450 genes have been described, and there is no systematic classification of CYP450 genes in S. miltiorrhiza. We obtained 77,549 unigenes from three tissue types of S. miltiorrhiza using RNA-Seq technology. Combining our data with previously identified CYP450 sequences and scanning with the CYP450 model from Pfam resulted in the identification of 116 full-length and 135 partial-length CYP450 genes. The 116 genes were classified into 9 clans and 38 families using standard criteria. The RNA-Seq results showed that 35 CYP450 genes were co-expressed with CYP76AH1, a marker gene for tanshinone biosynthesis, using r≥0.9 as a cutoff. The expression profiles for 16 of 19 randomly selected CYP450 obtained from RNA-Seq were validated by qRT-PCR. Comparing against the KEGG database, 10 CYP450 genes were found to be associated with diterpenoid biosynthesis. Considering all the evidence, 3 CYP450 genes were identified to be potentially involved in terpenoid biosynthesis. Moreover, we found that 15 CYP450 genes were possibly regulated by antisense transcripts (r≥0.9 or r≤–0.9). Lastly, a web resource (SMCYP450, http://www.herbalgenomics.org/samicyp450) was set up, which allows users to browse, search, retrieve and compare CYP450 genes and can serve as a centralized resource. PMID:25493946
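    The co-expression screen described above (r ≥ 0.9 against the CYP76AH1 marker profile) can be sketched with a Pearson correlation filter. The expression vectors below are toy values standing in for per-tissue RNA-Seq levels:

```python
import numpy as np

# Toy marker profile (e.g. CYP76AH1 expression across three tissue types)
marker = np.array([1.0, 5.0, 2.0])

# Toy candidate gene profiles across the same tissues
candidates = {
    "geneA": np.array([2.0, 10.0, 4.0]),  # tracks the marker
    "geneB": np.array([5.0, 1.0, 4.0]),   # anti-correlated with it
}

# Keep candidates whose Pearson r with the marker is >= 0.9
coexpressed = [name for name, profile in candidates.items()
               if np.corrcoef(marker, profile)[0, 1] >= 0.9]
```

With only three tissue types per profile such correlations are noisy, which is why the study combines this cutoff with pathway mapping and qRT-PCR validation before nominating genes for tanshinone biosynthesis.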

  1. Goal-Directed Behavior and Instrumental Devaluation: A Neural System-Level Computational Model

    PubMed Central

    Mannella, Francesco; Mirolli, Marco; Baldassarre, Gianluca

    2016-01-01

    Devaluation is the key experimental paradigm used to demonstrate the presence of instrumental behaviors guided by goals in mammals. We propose a neural system-level computational model to address the question of which brain mechanisms allow the current value of rewards to control instrumental actions. The model pivots on and shows the computational soundness of the hypothesis for which the internal representation of instrumental manipulanda (e.g., levers) activate the representation of rewards (or “action-outcomes”, e.g., foods) while attributing to them a value which depends on the current internal state of the animal (e.g., satiation for some but not all foods). The model also proposes an initial hypothesis of the integrated system of key brain components supporting this process and allowing the recalled outcomes to bias action selection: (a) the sub-system formed by the basolateral amygdala and insular cortex acquiring the manipulanda-outcomes associations and attributing the current value to the outcomes; (b) three basal ganglia-cortical loops selecting respectively goals, associative sensory representations, and actions; (c) the cortico-cortical and striato-nigro-striatal neural pathways supporting the selection, and selection learning, of actions based on habits and goals. The model reproduces and explains the results of several devaluation experiments carried out with control rats and rats with pre- and post-training lesions of the basolateral amygdala, the nucleus accumbens core, the prelimbic cortex, and the dorso-medial striatum. The results support the soundness of the hypotheses of the model and show its capacity to integrate, at the system-level, the operations of the key brain structures underlying devaluation. Based on its hypotheses and predictions, the model also represents an operational framework to support the design and analysis of new experiments on the motivational aspects of goal-directed behavior. PMID:27803652

  2. Instruction-Level Characterization of Scientific Computing Applications Using Hardware Performance Counters

    SciTech Connect

    Luo, Y.; Cameron, K.W.

    1998-11-24

    Workload characterization has been proven an essential tool to architecture design and performance evaluation in both scientific and commercial computing areas. Traditional workload characterization techniques include FLOPS rate, cache miss ratios, CPI (cycles per instruction or IPC, instructions per cycle) etc. With the complexity of sophisticated modern superscalar microprocessors, these traditional characterization techniques are not powerful enough to pinpoint the performance bottleneck of an application on a specific microprocessor. They are also incapable of immediately demonstrating the potential performance benefit of any architectural or functional improvement in a new processor design. To solve these problems, many people rely on simulators, which have substantial constraints especially on large-scale scientific computing applications. This paper presents a new technique of characterizing applications at the instruction level using hardware performance counters. It has the advantage of collecting instruction-level characteristics in a few runs virtually without overhead or slowdown. A variety of instruction counts can be utilized to calculate some average abstract workload parameters corresponding to microprocessor pipelines or functional units. Based on the microprocessor architectural constraints and these calculated abstract parameters, the architectural performance bottleneck for a specific application can be estimated. In particular, the analysis results can provide some insight to the problem that only a small percentage of processor peak performance can be achieved even for many very cache-friendly codes. Meanwhile, the bottleneck estimation can provide suggestions about viable architectural/functional improvement for certain workloads. Eventually, these abstract parameters can lead to the creation of an analytical microprocessor pipeline model and memory hierarchy model.
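    Deriving abstract workload parameters such as CPI from raw counter values is simple arithmetic once the counts are collected; the sketch below uses invented counts in place of real hardware counter reads:

```python
# Hypothetical counter values collected over one run of an application.
counters = {
    "cycles":       2_000_000,
    "instructions": 1_000_000,
    "l1d_misses":       50_000,
    "loads":           300_000,
}

# Classic derived metrics used in workload characterization.
cpi = counters["cycles"] / counters["instructions"]   # cycles per instruction
ipc = 1.0 / cpi                                       # instructions per cycle
l1d_miss_ratio = counters["l1d_misses"] / counters["loads"]
```

The paper's point is that such per-instruction-class counts, compared against the pipeline's functional-unit constraints, let one estimate which unit is the bottleneck without resorting to slow full-system simulation.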

  3. Goal-Directed Behavior and Instrumental Devaluation: A Neural System-Level Computational Model.

    PubMed

    Mannella, Francesco; Mirolli, Marco; Baldassarre, Gianluca

    2016-01-01

    Devaluation is the key experimental paradigm used to demonstrate the presence of instrumental behaviors guided by goals in mammals. We propose a neural system-level computational model to address the question of which brain mechanisms allow the current value of rewards to control instrumental actions. The model pivots on and shows the computational soundness of the hypothesis for which the internal representation of instrumental manipulanda (e.g., levers) activate the representation of rewards (or "action-outcomes", e.g., foods) while attributing to them a value which depends on the current internal state of the animal (e.g., satiation for some but not all foods). The model also proposes an initial hypothesis of the integrated system of key brain components supporting this process and allowing the recalled outcomes to bias action selection: (a) the sub-system formed by the basolateral amygdala and insular cortex acquiring the manipulanda-outcomes associations and attributing the current value to the outcomes; (b) three basal ganglia-cortical loops selecting respectively goals, associative sensory representations, and actions; (c) the cortico-cortical and striato-nigro-striatal neural pathways supporting the selection, and selection learning, of actions based on habits and goals. The model reproduces and explains the results of several devaluation experiments carried out with control rats and rats with pre- and post-training lesions of the basolateral amygdala, the nucleus accumbens core, the prelimbic cortex, and the dorso-medial striatum. The results support the soundness of the hypotheses of the model and show its capacity to integrate, at the system-level, the operations of the key brain structures underlying devaluation. Based on its hypotheses and predictions, the model also represents an operational framework to support the design and analysis of new experiments on the motivational aspects of goal-directed behavior.

  4. Computational identification of gene over-expression targets for metabolic engineering of taxadiene production.

    PubMed

    Boghigian, Brett A; Armando, John; Salas, Daniel; Pfeifer, Blaine A

    2012-03-01

    Taxadiene is the first dedicated intermediate in the biosynthetic pathway of the anticancer compound Taxol. Recent studies have taken advantage of heterologous hosts to produce taxadiene and other isoprenoid compounds, and such ventures now offer research opportunities that take advantage of the engineering tools associated with the surrogate host. In this study, metabolic engineering was applied in the context of over-expression targets predicted to improve taxadiene production. Identified targets included genes both within and outside of the isoprenoid precursor pathway. These targets were then tested for experimental over-expression in a heterologous Escherichia coli host designed to support isoprenoid biosynthesis. Results confirmed the computationally predicted improvements and indicated a synergy between targets within the expected isoprenoid precursor pathway and those outside this pathway. The presented algorithm is broadly applicable to other host systems and/or product choices.

  5. Cutting edge: identification of novel T cell epitopes in Lol p5a by computational prediction.

    PubMed

    de Lalla, C; Sturniolo, T; Abbruzzese, L; Hammer, J; Sidoli, A; Sinigaglia, F; Panina-Bordignon, P

    1999-08-15

Although atopic allergy affects […], computational prediction of DR ligands might thus allow the design of T cell epitopes with potentially useful applications in novel immunotherapy strategies.

  6. Computational genomic identification and functional reconstitution of plant natural product biosynthetic pathways

    PubMed Central

    2016-01-01

Covering: 2003 to 2016. The last decade has seen the first major discoveries regarding the genomic basis of plant natural product biosynthetic pathways. Four key computationally driven strategies have been developed to identify such pathways, which make use of physical clustering, co-expression, evolutionary co-occurrence and epigenomic co-regulation of the genes involved in producing a plant natural product. Here, we discuss how these approaches can be used for the discovery of plant biosynthetic pathways encoded by both chromosomally clustered and non-clustered genes. Additionally, we will discuss opportunities to prioritize plant gene clusters for experimental characterization, and end with a forward-looking perspective on how synthetic biology technologies will allow effective functional reconstitution of candidate pathways using a variety of genetic systems. PMID:27321668

  7. Computational identification of promising thermoelectric materials among known quasi-2D binary compounds

    SciTech Connect

    Gorai, Prashun; Toberer, Eric S.; Stevanović, Vladan

    2016-01-01

    Quasi low-dimensional structures are abundant among known thermoelectric materials, primarily because of their low lattice thermal conductivities. In this work, we have computationally assessed the potential of 427 known binary quasi-2D structures in 272 different chemistries for thermoelectric performance. To assess the thermoelectric performance, we employ an improved version of our previously developed descriptor for thermoelectric performance [Yan et al., Energy Environ. Sci., 2015, 8, 983]. The improvement is in the explicit treatment of van der Waals interactions in quasi-2D materials, which leads to significantly better predictions of their crystal structures and lattice thermal conductivities. The improved methodology correctly identifies known binary quasi-2D thermoelectric materials such as Sb2Te3, Bi2Te3, SnSe, SnS, InSe, and In2Se3. As a result, we propose candidate quasi-2D binary materials, a number of which have not been previously considered for thermoelectric applications.

  8. Key for protein coding sequences identification: computer analysis of codon strategy.

    PubMed Central

    Rodier, F; Gabarro-Arpa, J; Ehrlich, R; Reiss, C

    1982-01-01

The signal qualifying an AUG or GUG as an initiator in mRNAs processed by E. coli ribosomes is not found to be a systematic, literal homology sequence. In contrast, stability analysis reveals that initiators always occur within nucleic acid domains of low stability, for which a high A/U content is observed. Since no amino acid selection pressure can be detected at the N-termini of the proteins, the A/U enrichment results from a biased usage of the code degeneracy. A computer analysis is presented which allows easy detection of the codon strategy. N-terminal codons carry rather systematically A or U in the third position, which suggests a mechanism for translation initiation and helps to detect protein coding sequences in sequenced DNA. PMID:7038623
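    The biased third-position usage can be quantified directly. The sketch below computes the fraction of codons with A or U in the third position over a toy RNA coding sequence; a scan of this statistic along sequenced DNA is the kind of analysis the abstract describes:

```python
def third_position_au_fraction(cds: str) -> float:
    """Fraction of codons whose third (wobble) position is A or U."""
    codons = [cds[i:i + 3] for i in range(0, len(cds) - 2, 3)]
    return sum(codon[2] in "AU" for codon in codons) / len(codons)

# Toy RNA coding sequence: AUG AAA GCU UUA GGC
frac = third_position_au_fraction("AUGAAAGCUUUAGGC")
```

An elevated value of this fraction near a putative start codon would flag the low-stability, A/U-rich domain associated with translation initiation.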

  9. Identification of interstitial-like defects in a computer model of glassy aluminum

    NASA Astrophysics Data System (ADS)

    Goncharova, E. V.; Konchakov, R. A.; Makarov, A. S.; Kobelev, N. P.; Khonik, V. A.

    2017-08-01

    Computer simulation shows that glassy aluminum produced by rapid melt quenching contains a significant number of ‘defects’ similar to dumbbell (split) interstitials in the crystalline state. Although these ‘defects’ do not have any clear topological pattern as opposed to the crystal, they can be uniquely identified with the same properties which are characteristic of these defects in the crystalline structure, i.e. strong sensitivity to applied shear stress, specific local shear strain fields and distinctive low-/high-frequency peculiarities in the vibration spectra of ‘defective’ atoms. This conclusion provides new support for the interstitialcy theory, which was found to give consistent and verifiable explanations for a number of relaxation phenomena in metallic glasses and their relationship with the maternal crystalline state.

  10. Identification of potential drug targets based on a computational biology algorithm for venous thromboembolism.

    PubMed

    Xie, Ruiqiang; Li, Lei; Chen, Lina; Li, Wan; Chen, Binbin; Jiang, Jing; Huang, Hao; Li, Yiran; He, Yuehan; Lv, Junjie; He, Weiming

    2017-02-01

    Venous thromboembolism (VTE) is a common, fatal and frequently recurrent disease. Changes in the activity of different coagulation factors serve as a pathophysiological basis for the recurrent risk of VTE. Systems biology approaches provide a better understanding of the pathological mechanisms responsible for recurrent VTE. In this study, a novel computational method was presented to identify the recurrent risk modules (RRMs) based on the integration of expression profiles and human signaling network, which hold promise for achieving new and deeper insights into the mechanisms responsible for VTE. The results revealed that the RRMs had good classification performance to discriminate patients with recurrent VTE. The functional annotation analysis demonstrated that the RRMs played a crucial role in the pathogenesis of VTE. Furthermore, a variety of approved drug targets in the RRM M5 were related to VTE. Thus, the M5 may be applied to select potential drug targets for combination therapy and the extended treatment of VTE.

  11. Identification and red blood cell automated counting from blood smear images using computer-aided system.

    PubMed

    Acharya, Vasundhara; Kumar, Preetham

    2017-08-17

Red blood cell count plays a vital role in identifying the overall health of the patient. Hospitals use the hemocytometer to count blood cells. The conventional method of placing the smear under a microscope and counting the cells manually leads to erroneous results and puts medical laboratory technicians under stress. A computer-aided system will help to attain precise results in less time. This research work proposes an image-processing technique for counting the number of red blood cells. It aims to examine and process the blood smear image, in order to support the counting of red blood cells and identify the number of normal and abnormal cells in the image automatically. The K-medoids algorithm, which is robust to external noise, is used to extract the WBCs from the image. Granulometric analysis is used to separate the red blood cells from the white blood cells. The red blood cells obtained are counted using the labeling algorithm and the circular Hough transform. The radius range for the circle-drawing algorithm is estimated by computing the distance of the pixels from the boundary, which automates the entire algorithm. A comparison is done between the counts obtained using the labeling algorithm and the circular Hough transform. Results of the work showed that the circular Hough transform was more accurate in counting the red blood cells than the labeling algorithm, as it was successful in identifying even the overlapping cells. The work also compares the results of cell counts done using the proposed methodology and the manual approach. The work is designed to address the drawbacks of previous research work, and can be extended to extract various texture and shape features of the abnormal cells identified, so that diseases like anemia of inflammation and chronic disease can be detected at the earliest.
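    The labeling step can be sketched as connected-component labeling on a binary mask of segmented cells (toy mask below). Note that, as the abstract reports, plain labeling merges overlapping cells into one component, which is why the circular Hough transform counts them more accurately:

```python
import numpy as np
from scipy import ndimage

# Toy binary mask of segmented cells (1 = cell pixel, 0 = background).
mask = np.array([
    [1, 1, 0, 0, 0],
    [1, 1, 0, 1, 1],
    [0, 0, 0, 1, 1],
    [0, 1, 0, 0, 0],
], dtype=bool)

# Label 4-connected components; n_cells is the labeling-based count.
labels, n_cells = ndimage.label(mask)
```

Here the mask contains three separated blobs, so the labeling count is three; two touching cells would collapse into a single blob and be undercounted.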

  12. Automated identification of miscoded and misclassified cases of diabetes from computer records.

    PubMed

    Sadek, A-R; van Vlymen, J; Khunti, K; de Lusignan, S

    2012-03-01

    To develop a computer processable algorithm, capable of running automated searches of routine data that flag miscoded and misclassified cases of diabetes for subsequent clinical review. Anonymized computer data from the Quality Improvement in Chronic Kidney Disease (QICKD) trial (n = 942,031) were analysed using a binary method to assess the accuracy of data on diabetes diagnosis. Diagnostic codes were processed and stratified into: definite, probable and possible diagnosis of Type 1 or Type 2 diabetes. Diagnostic accuracy was improved by using prescription compatibility and temporally sequenced anthropomorphic and biochemical data. Bayesian false detection rate analysis was used to compare findings with those of an entirely independent and more complex manual sort of the first round QICKD study data (n = 760,588). The prevalence of definite diagnosis of Type 1 diabetes and Type 2 diabetes were 0.32% and 3.27% respectively when using the binary search method. Up to 35% of Type 1 diabetes and 0.1% of Type 2 diabetes were miscoded or misclassified on the basis of age/BMI and coding. False detection rate analysis demonstrated a close correlation between the new method and the published hand-crafted sort. Both methods had the highest false detection rate values when coding, therapeutic, anthropomorphic and biochemical filters were used (up to 90% for the new and 75% for the hand-crafted search method). A simple computerized algorithm achieves very similar results to more complex search strategies to identify miscoded and misclassified cases of both Type 1 diabetes and Type 2 diabetes. It has the potential to be used as an automated audit instrument to improve quality of diabetes diagnosis. © 2011 The Authors. Diabetic Medicine © 2011 Diabetes UK.
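    A binary rule of the kind described, flagging coded diagnoses that are implausible given age at diagnosis and BMI, can be sketched as follows. The thresholds are illustrative assumptions, not those of the published algorithm:

```python
def flag_possible_miscoding(coded_type: int, age_at_diagnosis: float,
                            bmi: float) -> bool:
    """Return True if the coded diabetes type looks implausible
    (illustrative thresholds only)."""
    if coded_type == 1:
        # Late onset with high BMI is more consistent with Type 2.
        return age_at_diagnosis > 40 and bmi > 30
    if coded_type == 2:
        # Very early onset with low BMI is more consistent with Type 1.
        return age_at_diagnosis < 20 and bmi < 22
    return False

flags = [flag_possible_miscoding(1, 55, 34),   # suspicious Type 1 code
         flag_possible_miscoding(1, 12, 19)]   # plausible Type 1 code
```

The published method layers further filters (prescription compatibility and temporally sequenced biochemical data) on top of such coding rules before cases are passed to clinical review.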

  13. Computer Folding of RNA Tetraloops: Identification of Key Force Field Deficiencies.

    PubMed

    Kührová, Petra; Best, Robert B; Bottaro, Sandro; Bussi, Giovanni; Šponer, Jiří; Otyepka, Michal; Banáš, Pavel

    2016-09-13

    The computer-aided folding of biomolecules, particularly RNAs, is one of the most difficult challenges in computational structural biology. RNA tetraloops are fundamental RNA motifs playing key roles in RNA folding and RNA-RNA and RNA-protein interactions. Although state-of-the-art Molecular Dynamics (MD) force fields correctly describe the native state of these tetraloops as a stable free-energy basin on the microsecond time scale, enhanced sampling techniques reveal that the native state is not the global free energy minimum, suggesting yet unidentified significant imbalances in the force fields. Here, we tested our ability to fold the RNA tetraloops in various force fields and simulation settings. We employed three different enhanced sampling techniques, namely, temperature replica exchange MD (T-REMD), replica exchange with solute tempering (REST2), and well-tempered metadynamics (WT-MetaD). We aimed to separate problems caused by limited sampling from those due to force-field inaccuracies. We found that none of the contemporary force fields is able to correctly describe folding of the 5'-GAGA-3' tetraloop over a range of simulation conditions. We thus aimed to identify which terms of the force field are responsible for this poor description of TL folding. We showed that at least two different imbalances contribute to this behavior, namely, overstabilization of base-phosphate and/or sugar-phosphate interactions and underestimated stability of the hydrogen bonding interaction in base pairing. The first artifact stabilizes the unfolded ensemble, while the second one destabilizes the folded state. The former problem might be partially alleviated by reparametrization of the van der Waals parameters of the phosphate oxygens suggested by Case et al., while in order to overcome the latter effect we suggest local potentials to better capture hydrogen bonding interactions.

  14. Computational identification of transcriptionally co-regulated genes, validation with the four ANT isoform genes

    PubMed Central

    2012-01-01

    Background The analysis of gene promoters is essential to understand the mechanisms of transcriptional regulation operating under physiological processes, nutritional intake or pathologies. In higher eukaryotes, transcriptional regulation implies the recruitment of a set of regulatory proteins that bind to combinations of nucleotide motifs. We developed a computational analysis of promoter nucleotide sequences to identify co-regulated genes, combining several programs that allowed us to build regulatory models and perform a cross-analysis over several databases. This strategy was tested on a set of four human genes encoding isoforms 1 to 4 of the mitochondrial ADP/ATP carrier ANT. Each isoform has a specific tissue expression profile linked to its role in cellular bioenergetics. Results From their promoter sequences and from the phylogenetic evolution of these ANT genes in mammals, we constructed combinations of specific regulatory elements. These models were screened against the full human genome and against databases of promoter sequences from human and several other mammalian species. For each of the transcriptionally regulated ANT1, 2 and 4 genes, a set of co-regulated genes was identified and their over-expression was verified in microarray databases. Conclusions Most of the identified genes encode proteins whose cellular function and specificity agree with those of the corresponding ANT isoform. Our in silico study shows that tissue-specific gene expression is mainly driven by promoter regulatory sequences located up to about a thousand base pairs upstream of the transcription start site. Moreover, this computational strategy for studying regulatory pathways should provide, along with transcriptomics and metabolomics, data for constructing cellular metabolic networks. PMID:22978616

  15. Computational identification of potential multi-drug combinations for reduction of microglial inflammation in Alzheimer disease

    PubMed Central

    Anastasio, Thomas J.

    2015-01-01

    Like other neurodegenerative diseases, Alzheimer Disease (AD) has a prominent inflammatory component mediated by brain microglia. Reducing microglial inflammation could potentially halt or at least slow the neurodegenerative process. A major challenge in the development of treatments targeting brain inflammation is the sheer complexity of the molecular mechanisms that determine whether microglia become inflammatory or take on a more neuroprotective phenotype. The process is highly multifactorial, raising the possibility that a multi-target/multi-drug strategy could be more effective than conventional monotherapy. This study takes a computational approach in finding combinations of approved drugs that are potentially more effective than single drugs in reducing microglial inflammation in AD. This novel approach exploits the distinct advantages of two different computer programming languages, one imperative and the other declarative. Existing programs written in both languages implement the same model of microglial behavior, and the input/output relationships of both programs agree with each other and with data on microglia over an extensive test battery. Here the imperative program is used efficiently to screen the model for the most efficacious combinations of 10 drugs, while the declarative program is used to analyze in detail the mechanisms of action of the most efficacious combinations. Of the 1024 possible drug combinations, the simulated screen identifies only 7 that are able to move simulated microglia at least 50% of the way from a neurotoxic to a neuroprotective phenotype. Subsequent analysis shows that of the 7 most efficacious combinations, 2 stand out as superior both in strength and reliability. The model offers many experimentally testable and therapeutically relevant predictions concerning effective drug combinations and their mechanisms of action. PMID:26097457
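
    The exhaustive screen described above (2^10 = 1024 on/off combinations of 10 drugs, keeping those that move the simulated phenotype at least 50% of the way toward neuroprotective) can be sketched as follows. The drug names, effect sizes, and saturating response function are illustrative placeholders, not the study's actual microglia model.

```python
from itertools import product

# Hypothetical per-drug effect sizes on a 0-to-1 "neurotoxic -> neuroprotective"
# scale. These names and numbers are placeholders, NOT the study's parameters.
DRUG_EFFECTS = {
    "drugA": 0.20, "drugB": 0.15, "drugC": 0.10, "drugD": 0.08, "drugE": 0.05,
    "drugF": 0.12, "drugG": 0.07, "drugH": 0.04, "drugI": 0.18, "drugJ": 0.06,
}

def phenotype_shift(combo):
    """Toy stand-in for the simulated microglia: combined drug effect with
    diminishing returns, saturating below 1.0 (fully neuroprotective)."""
    total = sum(DRUG_EFFECTS[d] for d in combo)
    return total / (1.0 + total)

def screen(threshold=0.5):
    """Enumerate all 2**10 = 1024 on/off combinations of the 10 drugs and
    keep those that move the phenotype at least `threshold` of the way
    from neurotoxic to neuroprotective."""
    drugs = list(DRUG_EFFECTS)
    hits = []
    for mask in product([0, 1], repeat=len(drugs)):
        combo = tuple(d for d, on in zip(drugs, mask) if on)
        if phenotype_shift(combo) >= threshold:
            hits.append(combo)
    return hits

hits = screen()
```

    With these placeholder effect sizes only a few near-complete combinations pass the 50% cutoff; the study's model, of course, scores combinations mechanistically rather than additively.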

  16. Computationally Guided Identification of Novel Mycobacterium tuberculosis GlmU Inhibitory Leads, Their Optimization, and in Vitro Validation.

    PubMed

    Mehra, Rukmankesh; Rani, Chitra; Mahajan, Priya; Vishwakarma, Ram Ashrey; Khan, Inshad Ali; Nargotra, Amit

    2016-02-08

    Mycobacterium tuberculosis (Mtb) infections are causing serious health concerns worldwide. Antituberculosis drug resistance threatens the current therapies and underscores the need to develop new, effective antituberculosis drugs. GlmU represents an interesting target for developing novel Mtb drug candidates. It is a bifunctional acetyltransferase/uridyltransferase enzyme that catalyzes the biosynthesis of UDP-N-acetyl-glucosamine (UDP-GlcNAc) from glucosamine-1-phosphate (GlcN-1-P). UDP-GlcNAc is a substrate for the biosynthesis of lipopolysaccharide and peptidoglycan, which are constituents of the bacterial cell wall. In the current study, structure- and ligand-based computational models were developed and rationally applied to screen a drug-like repository of 20,000 compounds procured from the ChemBridge DIVERSet database for the identification of probable inhibitors of Mtb GlmU. In vitro evaluation of the in silico identified candidates resulted in 15 inhibitory leads against this target. A literature search on these leads through SciFinder and their similarity analysis against the PubChem training data set (AID 1376) revealed the structural novelty of these hits with respect to Mtb GlmU. The IC50 of the most potent identified inhibitory lead (5810599) was found to be 9.018 ± 0.04 μM. Molecular dynamics (MD) simulation of this lead (5810599) in complex with the protein confirms the stability of the lead within the binding pocket and highlights the key interacting residues for further design. Binding-site analysis of the acetyltransferase pocket with respect to the identified structural moieties provides a basis for lead optimization studies.
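
    For context on the reported IC50, a minimal sketch of the standard Hill (logistic) inhibition model follows. The Hill coefficient of 1.0 is an assumption; the abstract reports only the IC50 of lead 5810599.

```python
def fractional_inhibition(conc_um, ic50_um=9.018, hill=1.0):
    """Hill inhibition curve normalized to run from 0 (no inhibition) to 1
    (complete inhibition). The default ic50_um is the reported potency of
    lead 5810599; the Hill coefficient of 1.0 is an assumption."""
    return conc_um ** hill / (conc_um ** hill + ic50_um ** hill)

# By construction, inhibition is exactly 50% at the IC50 and approaches
# 100% as the inhibitor concentration increases.
half = fractional_inhibition(9.018)
```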

  17. Revisiting thoracic surface anatomy in an adult population: A computed tomography evaluation of vertebral level.

    PubMed

    Badshah, Masroor; Soames, Roger; Khan, Muhammad Jaffar; Ibrahim, Muhammad; Khan, Adnan

    2017-03-01

    To compare key thoracic anatomical surface landmarks between healthy and patient adult populations using computed tomography (CT). Sixteen-slice CT images of 250 age- and gender-matched healthy individuals and 99 patients with lung parenchymal disease were analyzed to determine the relationship between 17 thoracic structures and their vertebral levels using a 32-bit Radiant DICOM viewer. The structures studied were: aortic hiatus, azygos vein, brachiocephalic artery, gastroesophageal junction (GEJ), left and right common carotid arteries, left and right subclavian arteries, pulmonary trunk bifurcation, superior vena cava junction with the right atrium, carina, cardiac apex, manubriosternal junction, xiphisternal joint, inferior vena cava (IVC) crossing the diaphragm, aortic arch and junction of the brachiocephalic veins. The surface anatomy of all structures varied among individuals, with no significant effect of age. Binary logistic regression analysis showed a significant association between individual health status and vertebral level for the brachiocephalic artery (P = 0.049), GEJ (P = 0.020), right common carotid (P = 0.009) and subclavian arteries (P = 0.009), pulmonary trunk bifurcation (P = 0.049), carina (P = 0.004), and IVC crossing the diaphragm (P = 0.025). These observations differ from those reported in a healthy white Caucasian population and from the vertebral levels of the IVC, esophagus, and aorta crossing the diaphragm in an Iranian population. The differences observed in this study provide insight into the effect of lung pathology on specific thoracic structures and their vertebral levels. Further studies are needed to determine whether these are general changes or pathology-specific. Clin. Anat. 30:227-236, 2017. © 2017 Wiley Periodicals, Inc.

  18. Establishment of diagnostic reference levels in computed tomography for select procedures in Pudhuchery, India.

    PubMed

    Saravanakumar, A; Vaideki, K; Govindarajan, K N; Jayakumar, S

    2014-01-01

    Under operating conditions, the computed tomography (CT) scanner has become a major source of human exposure to diagnostic X-rays. In this context, the weighted CT dose index (CTDIw), volumetric CT dose index (CTDIv), and dose-length product (DLP) are important parameters for assessing CT procedures as surrogate dose quantities for patient dose optimization. The current work aims to estimate the existing dose levels of CT scanners for head, chest, and abdomen procedures in Pudhuchery in south India and to establish a diagnostic reference level (DRL) for the region. The study was carried out on six CT scanners in six different radiology departments using a 100 mm long pencil ionization chamber and a polymethylmethacrylate (PMMA) phantom. For each CT scanner, data pertaining to patient and machine details were collected for 50 head, 50 chest, and 50 abdomen procedures performed over a period of 1 year. The experimental work was carried out using the machine operating parameters used during the procedures. Initially, the dose received in the phantom at the center and periphery was measured by the five-point method. Using these values, CTDIw, CTDIv, and DLP were calculated. The DRL is established as the third-quartile value of CTDIv and DLP, which is 32 mGy and 925 mGy.cm for head, 12 mGy and 456 mGy.cm for chest, and 16 mGy and 482 mGy.cm for abdomen procedures. These values are well below the European Commission Dose Reference Level (EC DRL) and comparable with the third-quartile values reported for the Tamil Nadu region in India. The present study is the first of its kind to determine DRLs for scanners operating in the Pudhuchery region. Similar studies in other regions of India are necessary in order to establish a national diagnostic reference level.
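
    The DRLs above are third-quartile values of the per-facility dose distributions. A minimal sketch of that computation follows; the CTDIv values below are hypothetical, since the abstract does not list per-scanner data.

```python
def third_quartile(values):
    """Third quartile (75th percentile) with linear interpolation between
    order statistics, matching numpy.percentile's default method."""
    xs = sorted(values)
    rank = 0.75 * (len(xs) - 1)
    lo = int(rank)
    frac = rank - lo
    if lo + 1 < len(xs):
        return xs[lo] + frac * (xs[lo + 1] - xs[lo])
    return xs[lo]

# Illustrative head-scan CTDIv values (mGy) for 8 hypothetical scanners;
# the actual per-scanner measurements are not given in the abstract.
ctdiv_head = [24, 26, 28, 29, 30, 31, 33, 35]
drl_head = third_quartile(ctdiv_head)
```

    The DRL is then the value below which 75% of surveyed facilities fall; facilities above it are candidates for protocol optimization.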

  19. Establishment of diagnostic reference levels in computed tomography for select procedures in Pudhuchery, India

    PubMed Central

    Saravanakumar, A.; Vaideki, K.; Govindarajan, K. N.; Jayakumar, S.

    2014-01-01

    Under operating conditions, the computed tomography (CT) scanner has become a major source of human exposure to diagnostic X-rays. In this context, the weighted CT dose index (CTDIw), volumetric CT dose index (CTDIv), and dose-length product (DLP) are important parameters for assessing CT procedures as surrogate dose quantities for patient dose optimization. The current work aims to estimate the existing dose levels of CT scanners for head, chest, and abdomen procedures in Pudhuchery in south India and to establish a diagnostic reference level (DRL) for the region. The study was carried out on six CT scanners in six different radiology departments using a 100 mm long pencil ionization chamber and a polymethylmethacrylate (PMMA) phantom. For each CT scanner, data pertaining to patient and machine details were collected for 50 head, 50 chest, and 50 abdomen procedures performed over a period of 1 year. The experimental work was carried out using the machine operating parameters used during the procedures. Initially, the dose received in the phantom at the center and periphery was measured by the five-point method. Using these values, CTDIw, CTDIv, and DLP were calculated. The DRL is established as the third-quartile value of CTDIv and DLP, which is 32 mGy and 925 mGy.cm for head, 12 mGy and 456 mGy.cm for chest, and 16 mGy and 482 mGy.cm for abdomen procedures. These values are well below the European Commission Dose Reference Level (EC DRL) and comparable with the third-quartile values reported for the Tamil Nadu region in India. The present study is the first of its kind to determine DRLs for scanners operating in the Pudhuchery region. Similar studies in other regions of India are necessary in order to establish a national diagnostic reference level. PMID:24600173

  20. The Reliability of a Computer-Assisted Telephone Interview Version of the Ohio State University Traumatic Brain Injury Identification Method.

    PubMed

    Cuthbert, Jeffrey P; Whiteneck, Gale G; Corrigan, John D; Bogner, Jennifer

    2016-01-01

    Provide test-retest reliability (>5 months) of the Ohio State University Traumatic Brain Injury Identification Method modified for use as a computer-assisted telephone interview (CATI) to capture traumatic brain injury (TBI) and other substantial bodily injuries among a representative sample of noninstitutionalized adults living in Colorado. Four subsamples of 50 individuals, including people with no major lifetime injury, a major lifetime injury but no TBI, TBI with no loss of consciousness, and TBI with loss of consciousness, were interviewed using the CATI Ohio State University Traumatic Brain Injury Identification Method between 6 and 18 months after an initial interview. Stratified random sample of Coloradans (n = 200) selected from a larger study of TBI. Cumulative, Severity, and Age-related indices were assessed for long-term reliability. Cumulative indices summed the total number of TBIs of specific severities across the lifetime; Severity indices measured the most severe type of injury incurred throughout the lifetime; and Age-related indices assessed the timing of specific injury types across the lifespan. Test-retest reliabilities ranged from poor to excellent. The indices demonstrating the greatest reliability were the Severity measures, with intraclass correlations for ordinal indices ranging from 0.62 to 0.78 and Cohen κ ranging from 0.50 to 0.62. One Cumulative outcome demonstrated high reliability (0.70 for number of TBIs with loss of consciousness ≥30 minutes), while the remaining Cumulative outcomes demonstrated low reliability, ranging from 0.06 to 0.21. Age-related test-retest reliabilities were fair to poor, with intraclass correlations of 0.38 to 0.49 and Cohen κ of 0.32 and 0.34. The CATI-modified Ohio State University Traumatic Brain Injury Identification Method used in this study is an effective measure for evaluating the maximum TBI severity incurred throughout the lifetime within a general population survey.
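
    The Cohen κ statistics reported above compare categorical classifications from the initial and repeat interviews. A minimal stdlib sketch of Cohen's kappa for two ratings of the same subjects:

```python
from collections import Counter

def cohens_kappa(ratings1, ratings2):
    """Cohen's kappa: chance-corrected agreement between two categorical
    ratings of the same subjects (e.g., initial vs repeat interview
    severity categories). 1.0 = perfect, 0.0 = chance-level agreement."""
    n = len(ratings1)
    # Observed agreement: fraction of subjects rated identically.
    po = sum(1 for a, b in zip(ratings1, ratings2) if a == b) / n
    # Expected agreement under independence of the two marginals.
    c1, c2 = Counter(ratings1), Counter(ratings2)
    pe = sum(c1[k] * c2[k] for k in c1) / (n * n)
    return (po - pe) / (1 - pe)
```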

  1. Development of computer program ENAUDIBL for computation of the sensation levels of multiple, complex, intrusive sounds in the presence of residual environmental masking noise

    SciTech Connect

    Liebich, R. E.; Chang, Y.-S.; Chun, K. C.

    2000-03-31

    The relative audibility of multiple sounds occurs in separate, independent channels (frequency bands) termed critical bands or equivalent rectangular (filter-response) bandwidths (ERBs) of frequency. The true nature of human hearing is a function of a complex combination of subjective factors, both auditory and nonauditory. Assessment of the probability of individual annoyance, community-complaint reaction levels, speech intelligibility, and the most cost-effective mitigation actions requires sensation-level data; these data are one of the most important auditory factors. However, sensation levels cannot be calculated by using single-number, A-weighted sound level values. This paper describes specific steps to compute sensation levels. A unique, newly developed procedure is used, which simplifies and improves the accuracy of such computations by the use of maximum sensation levels that occur, for each intrusive-sound spectrum, within each ERB. The newly developed program ENAUDIBL makes use of ERB sensation-level values generated with some computational subroutines developed for the formerly documented program SPECTRAN.
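
    For reference, ERB widths are commonly computed with the Glasberg and Moore (1990) approximation; whether ENAUDIBL uses this exact formula is an assumption, as the program's internals are not given here.

```python
def erb_bandwidth_hz(center_freq_hz):
    """Equivalent rectangular bandwidth (Hz) of the auditory filter at a
    given center frequency, per the widely cited Glasberg & Moore (1990)
    approximation: ERB = 24.7 * (4.37 * f_kHz + 1). Whether ENAUDIBL uses
    this exact formula is an assumption."""
    return 24.7 * (4.37 * center_freq_hz / 1000.0 + 1.0)
```

    Sensation level within each band would then be the amount by which the band's intrusive-sound level exceeds the masked threshold set by the residual environmental noise in that same band.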

  2. Computational identification and quantification of trabecular microarchitecture classes by 3-D texture analysis-based clustering.

    PubMed

    Valentinitsch, Alexander; Patsch, Janina M; Burghardt, Andrew J; Link, Thomas M; Majumdar, Sharmila; Fischer, Lukas; Schueller-Weidekamm, Claudia; Resch, Heinrich; Kainberger, Franz; Langs, Georg

    2013-05-01

    High resolution peripheral quantitative computed tomography (HR-pQCT) permits the non-invasive assessment of cortical and trabecular bone density, geometry, and microarchitecture. Although researchers have developed various post-processing algorithms to quantify HR-pQCT image properties, few of these techniques capture image features beyond global structure-based metrics. While 3D texture analysis is a key approach in computer vision, it has been utilized only infrequently in HR-pQCT research. Motivated by the high isotropic spatial resolution and information density provided by HR-pQCT scans, we have developed and evaluated a post-processing algorithm that quantifies microarchitecture characteristics via texture features in HR-pQCT scans. During a training phase, in which clustering was applied to texture features extracted from each voxel of trabecular bone, three distinct clusters, or trabecular microarchitecture classes (TMACs), were identified. These TMACs represent trabecular bone regions with common texture characteristics. The TMACs were then used to automatically segment the voxels of new data into three regions corresponding to the trained cluster features. Regional trabecular bone texture was described by the histogram of the relative trabecular bone volume covered by each cluster. We evaluated the intra-scanner and inter-scanner reproducibility by assessing the precision errors (PE), intraclass correlation coefficients (ICC) and Dice coefficients (DC) of the method on 14 ultradistal radius samples scanned on two HR-pQCT systems. The DC showed good reproducibility in the intra-scanner set-up, with a mean of 0.870±0.027 (no unit). Even in the inter-scanner set-up, the ICC showed high reproducibility, ranging from 0.814 to 0.964. In a preliminary clinical test application, the TMAC histograms appear to be a good indicator when differentiating between postmenopausal women with (n=18) and without (n=18) prevalent fragility fractures.
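
    The Dice coefficient (DC) used above to quantify segmentation reproducibility can be sketched as follows for two voxel label sets:

```python
def dice_coefficient(voxels_a, voxels_b):
    """Dice overlap between two sets of voxel indices (e.g., the same
    TMAC region segmented from scans on two different HR-pQCT systems).
    Returns 1.0 for identical segmentations, 0.0 for disjoint ones."""
    a, b = set(voxels_a), set(voxels_b)
    if not a and not b:
        return 1.0  # two empty segmentations agree trivially
    return 2.0 * len(a & b) / (len(a) + len(b))
```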

  3. Computational identification of genetic subnetwork modules associated with maize defense response to Fusarium verticillioides

    PubMed Central

    2015-01-01

    Background Maize, a crop of global significance, is vulnerable to a variety of biotic stresses resulting in economic losses. Fusarium verticillioides (teleomorph Gibberella moniliformis) is one of the key fungal pathogens of maize, causing ear rots and stalk rots. To better understand the genetic mechanisms involved in maize defense as well as F. verticillioides virulence, a systematic investigation of the host-pathogen interaction is needed. The aim of this study was to computationally identify potential maize subnetwork modules associated with its defense response against F. verticillioides. Results We obtained time-course RNA-seq data from B73 maize inoculated with wild-type F. verticillioides and a loss-of-virulence mutant, and subsequently established a computational pipeline for network-based comparative analysis. Specifically, we first analyzed the RNA-seq data by a cointegration-correlation-expression approach, where maize genes were jointly analyzed with known F. verticillioides virulence genes to find candidate maize genes likely associated with the defense mechanism. We predicted maize co-expression networks around the selected maize candidate genes based on partial correlation, and subsequently searched for subnetwork modules that were differentially activated when inoculated with the two different fungal strains. Based on our analysis pipeline, we identified four potential maize defense subnetwork modules. Two were directly associated with the maize defense response, with significant GO terms such as GO:0009817 (defense response to fungus) and GO:0009620 (response to fungus). The other two predicted modules were indirectly involved in the defense response, where the most significant GO terms associated with these modules were GO:0046914 (transition metal ion binding) and GO:0046686 (response to cadmium ion). Conclusion Through our RNA-seq data analysis, we have shown that a network-based approach can enhance our understanding of the maize defense response.
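
    The co-expression networks above are built on partial correlation. A minimal sketch of the first-order partial correlation of two genes' expression controlling for a third, computed from the pairwise Pearson correlations (the study's exact estimator is not given in the abstract):

```python
from math import sqrt

def partial_correlation(r_xy, r_xz, r_yz):
    """First-order partial correlation of x and y controlling for z,
    from the three pairwise Pearson correlations. Values near zero
    suggest the x-y correlation is explained by the confounder z."""
    return (r_xy - r_xz * r_yz) / sqrt((1 - r_xz**2) * (1 - r_yz**2))
```

    In network inference, an edge between two genes is typically retained only if their partial correlation stays significantly nonzero after conditioning on other genes.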

  4. Dual-Energy Computed Tomography Virtual Monoenergetic Imaging of Lung Cancer: Assessment of Optimal Energy Levels.

    PubMed

    Kaup, Moritz; Scholtz, Jan-Erik; Engler, Alexander; Albrecht, Moritz H; Bauer, Ralf W; Kerl, J Matthias; Beeres, Martin; Lehnert, Thomas; Vogl, Thomas J; Wichmann, Julian L

    2016-01-01

    The aim of the study was to evaluate the objective and subjective image quality of virtual monoenergetic imaging (VMI) in dual-source dual-energy computed tomography (DECT) of lung cancer and to determine optimal kiloelectron-volt (keV) levels. Fifty-nine lung cancer patients underwent chest DECT. Images were reconstructed as VMI series at energy levels of 40, 60, 80, and 100 keV and with standard linear blending (M_0.3) for comparison. Objective and subjective image quality was assessed. Lesion contrast peaked in the 40-keV (2.5 ± 2.9) and 60-keV (1.9 ± 3.0) VMI reconstructions, both superior to M_0.3 (0.5 ± 2.7; P < 0.001). Compared with M_0.3, subjective ratings of the 60-keV VMI series were similar for general image quality (4.48 vs 4.52; P = 0.74) and higher for lesion demarcation (4.07 vs 4.84; P < 0.001), superior to all other VMI series (P < 0.001). Image sharpness was similar between both series. Image noise was rated best in the 80-keV and M_0.3 series, followed by 60 keV. VMI reconstructions at 60 keV provided the best combination of subjective and objective image quality in DECT of lung cancer.
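
    Lesion contrast in studies like this is often expressed as a contrast-to-noise ratio; whether the authors used exactly this definition is an assumption, so the sketch below is only illustrative.

```python
def contrast_to_noise(hu_lesion, hu_background, noise_sd):
    """Contrast-to-noise ratio: attenuation difference between lesion and
    background (in Hounsfield units) divided by image noise (the SD of HU
    in a homogeneous region). Whether the study's 'lesion contrast' is
    exactly this metric is an assumption."""
    return (hu_lesion - hu_background) / noise_sd
```

    Low-keV VMI raises iodine attenuation (boosting the numerator) but also raises noise, which is why an intermediate level such as 60 keV can win the overall trade-off.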

  5. National diagnostic reference level initiative for computed tomography examinations in Kenya

    PubMed Central

    Korir, Geoffrey K.; Wambani, Jeska S.; Korir, Ian K.; Tries, Mark A.; Boen, Patrick K.

    2016-01-01

    The purpose of this study was to estimate the computed tomography (CT) examination frequency, patient radiation exposure, effective doses and national diagnostic reference levels (NDRLs) associated with CT examinations in clinical practice. A structured questionnaire-type form was developed for recording examination frequency, scanning protocols and patient radiation exposure during CT procedures in fully equipped medical facilities across the country. The national annual number of CT examinations was estimated to be 3 procedures per 1000 people. The volume-weighted CT dose index, dose-length product, effective dose and NDRLs were determined for 20 types of adult and paediatric CT examinations. Additionally, the annual collective effective dose and effective dose per capita from CT were approximated. Radiation exposure during CT examinations varied widely among the facilities that took part in the study. This highlights the need to develop and implement diagnostic reference levels as a standardisation and optimisation tool for the radiological protection of patients at all CT facilities nationwide. PMID:25790825

  6. The sensitivity of nonlinear computational models of trabecular bone to tissue level constitutive model.

    PubMed

    Baumann, Andrew P; Shi, Xiutao; Roeder, Ryan K; Niebur, Glen L

    2016-01-01

    Microarchitectural finite element models have become a key tool in the analysis of trabecular bone. Robust, accurate, and validated constitutive models would enhance confidence in predictive applications of these models and in their usefulness as accurate assays of tissue properties. Human trabecular bone specimens from the femoral neck (n = 3), greater trochanter (n = 6), and lumbar vertebra (n = 1) of eight different donors were scanned by μ-CT and converted to voxel-based finite element models. Unconfined uniaxial compression and shear loading were simulated for each of three different constitutive models: a principal strain-based model, Drucker-Lode, and Drucker-Prager. The latter was applied with both infinitesimal and finite kinematics. Apparent yield strains exhibited minimal dependence on the constitutive model, differing by at most 16.1%, with the kinematic formulation being influential in compression loading. At the tissue level, the quantities and locations of yielded tissue were insensitive to the constitutive model, with the exception of the Drucker-Lode model, suggesting that correlation of microdamage with computational models does not improve the ability to discriminate between constitutive laws. Taken together, it is unlikely that a tissue constitutive model can be fully validated from apparent-level experiments alone, as the calculations are too insensitive to identify differences in the outcomes. Rather, any asymmetric criterion with a valid yield surface will likely be suitable for most trabecular bone models.
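
    For reference, one common form of the pressure-dependent Drucker-Prager yield criterion named above is given below; sign conventions vary by author, and the study's exact calibration is not given in the abstract.

```latex
f(\boldsymbol{\sigma}) = \sqrt{J_2} + \alpha\, I_1 - k \;\le\; 0
```

    Here $I_1 = \operatorname{tr}(\boldsymbol{\sigma})$ is the first invariant of the stress tensor, $J_2$ is the second invariant of its deviatoric part, and $\alpha$, $k$ are material constants, typically calibrated to the tension-compression strength asymmetry of the bone tissue. Setting $\alpha = 0$ recovers the pressure-independent von Mises surface.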

  7. The sensitivity of nonlinear computational models of trabecular bone to tissue level constitutive model

    PubMed Central

    Baumann, Andrew P.; Shi, Xiutao; Roeder, Ryan K.; Niebur, Glen L.

    2015-01-01

    Microarchitectural finite element models have become a key tool in analyses of trabecular bone. Robust, accurate, and validated constitutive models would enhance confidence in predictive applications of these models, and in their usefulness as accurate assays of tissue properties. Human trabecular bone specimens from the femoral neck (n = 3), greater trochanter (n = 6), and lumbar vertebra (n = 1) of eight different donors were scanned by μ-CT and converted to voxel-based finite element models. Unconfined uniaxial compression and shear loading were simulated for each of three different constitutive models: a principal strain based model, Drucker-Lode, and Drucker-Prager. The latter was applied with both infinitesimal and finite kinematics. Apparent yield strains exhibited minimal dependence on the constitutive model, differing by at most 16.1%, with the kinematic formulation being influential in compression loading. At the tissue level, the quantities and locations of yielded tissue were insensitive to the constitutive model, with the exception of the Drucker-Lode model, suggesting that correlation of microdamage with computational models does not improve the ability to discriminate between constitutive laws. Taken together, it is unlikely that a tissue constitutive model can be fully validated from apparent level experiments alone, as the calculations are too insensitive to identify differences in the outcomes. Rather, any asymmetric criterion with a valid yield surface will likely be suitable for most bone models. PMID:25959510

  8. Computational identification of a p38SAPK regulated transcription factor network required for tumor cell quiescence

    PubMed Central

    Adam, Alejandro P.; George, Ajish; Schewe, Denis; Bragado, Paloma; Iglesias, Bibiana V.; Ranganathan, Aparna C.; Kourtidis, Antonis; Conklin, Douglas S.; Aguirre-Ghiso, Julio A.

    2009-01-01

    The stress-activated kinase p38 plays key roles in tumor suppression and induction of tumor cell dormancy. However, the mechanisms behind these functions remain poorly understood. Using computational tools, we identified a transcription factor (TF) network regulated by p38α/β and required for human squamous carcinoma cell quiescence in vivo. We found that p38 transcriptionally regulates a core network of 46 genes that includes 16 TFs. Activation of p38 induced the expression of the TFs p53 and BHLHB3, while inhibiting c-Jun and FoxM1 expression. Further, induction of p53 by p38 was dependent on c-Jun downregulation. Accordingly, while RNAi downregulation of BHLHB3 or p53 interrupted tumor cell quiescence, downregulation of c-Jun or FoxM1 or overexpression of BHLHB3 in malignant cells mimicked the onset of quiescence. Our results identify components of the regulatory mechanisms driving p38-induced cancer cell quiescence. These may regulate dormancy of the residual disease that usually precedes the onset of metastasis in many cancers. PMID:19584293

  9. Computational identification of RNA functional determinants by three-dimensional quantitative structure–activity relationships

    PubMed Central

    Blanchet, Marc-Frédérick; St-Onge, Karine; Lisi, Véronique; Robitaille, Julie; Hamel, Sylvie; Major, François

    2014-01-01

    Anti-infection drugs target vital functions of infectious agents, including their ribosome and other essential non-coding RNAs. One of the reasons infectious agents become resistant to drugs is mutations that eliminate drug-binding affinity while maintaining vital elements. Identifying these elements relies on determining viable and lethal mutants and their associated structures. However, determining the structure of enough mutants at high resolution is not always possible. Here, we introduce a new computational method, MC-3DQSAR, to determine the vital elements of a target RNA structure from mutagenesis and available high-resolution data. We applied the method to further characterize the structural determinants of the bacterial 23S ribosomal RNA sarcin–ricin loop (SRL), as well as those of the lead-activated and hammerhead ribozymes. The method was accurate in confirming experimentally determined essential structural elements and in predicting the viability of new SRL variants, which were either observed in bacteria or validated in bacterial growth assays. Our results indicate that MC-3DQSAR could be used systematically to evaluate the drug-target potential of any RNA site using current high-resolution structural data. PMID:25200082

  10. Identification of natural allosteric inhibitor for Akt1 protein through computational approaches and in vitro evaluation.

    PubMed

    Pragna Lakshmi, T; Kumar, Amit; Vijaykumar, Veena; Natarajan, Sakthivel; Krishna, Ramadas

    2017-03-01

    Akt, a serine/threonine protein kinase, is often hyperactivated in breast and prostate cancers and is associated with poor prognosis. Allosteric inhibitors regulate aberrant kinase activity by stabilizing the protein in an inactive conformation. Several natural compounds have been reported as kinase inhibitors. In this study, to identify a potential natural allosteric inhibitor of Akt1, we generated a seven-point pharmacophore model and screened it against a natural compound library. Quercetin-7-O-β-d-glucopyranoside (Q7G) was found to be the best among the selected molecules based on its hydrogen-bond occupancy with key allosteric residues, persistent polar contacts and salt bridges that stabilize Akt1 in the inactive conformation, and minimum binding free energy during molecular dynamics simulation. Q7G induced dose-dependent inhibition of breast cancer cells (MDA MB-231) and arrested them in the G1 and sub-G1 phases. This was associated with down-regulation of the anti-apoptotic protein Bcl-2 and up-regulation of cleaved caspase-3 and PARP. Expression of p-Akt (Ser473) was also down-regulated, which might be due to inhibition of Akt1 in the inactive conformation. We further confirmed the Akt1-Q7G interaction, which was observed to have a dissociation constant (Kd) of 0.246 μM. On the basis of these computational, biological and thermodynamic studies, we suggest Q7G as a lead molecule and propose its further optimization.

  11. Computational Prediction of Human Salivary Proteins from Blood Circulation and Application to Diagnostic Biomarker Identification

    PubMed Central

    Wang, Jiaxin; Liang, Yanchun; Wang, Yan; Cui, Juan; Liu, Ming; Du, Wei; Xu, Ying

    2013-01-01

    Proteins can move from blood circulation into salivary glands through active transportation, passive diffusion or ultrafiltration, and some are then released into saliva; these can potentially serve as biomarkers for diseases if accurately identified. We present a novel computational method for predicting salivary proteins that come from circulation. The basis for the prediction is a set of physicochemical and sequence features we found to be discerning between human proteins known to be movable from circulation to saliva and proteins deemed to be not in saliva. A classifier was trained based on these features using a support vector machine to predict protein secretion into saliva. The classifier achieved 88.56% average recall and 90.76% average precision in 10-fold cross-validation on the training data, indicating that the selected features are informative. Considering the possibility that our negative training data (i.e., proteins deemed to be not in saliva) may not be highly reliable, we have also trained a ranking method, aiming to rank the known salivary proteins from circulation highest among the proteins in the general background, based on the same features. This prediction capability can be used to predict potential biomarker proteins for specific human diseases when coupled with information on differentially expressed proteins in diseased versus healthy control tissues and a prediction capability for blood-secretory proteins. Using such integrated information, we predicted 31 candidate biomarker proteins in saliva for breast cancer. PMID:24324552
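
    The evaluation protocol above (10-fold cross-validation scored by average precision and recall) can be sketched with stdlib helpers. The fold splitter below is a simplified stand-in (contiguous, no shuffling or stratification) for whatever splitting the authors used.

```python
def k_fold_indices(n, k=10):
    """Split n sample indices into k nearly equal contiguous folds.
    (A real protocol would shuffle, and typically stratify, first.)"""
    folds, start = [], 0
    for i in range(k):
        size = n // k + (1 if i < n % k else 0)
        folds.append(list(range(start, start + size)))
        start += size
    return folds

def precision_recall(y_true, y_pred):
    """Precision and recall for binary labels (1 = secreted into saliva)."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    return precision, recall
```

    In each round, one fold is held out for testing while the classifier trains on the other nine; the reported figures are the per-fold precision and recall averaged over the ten rounds.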

  12. Computational identification of microRNAs and their targets in cassava (Manihot esculenta Crantz.).

    PubMed

    Patanun, Onsaya; Lertpanyasampatha, Manassawe; Sojikul, Punchapat; Viboonjun, Unchera; Narangajavana, Jarunya

    2013-03-01

    MicroRNAs (miRNAs) are a newly discovered class of noncoding endogenous small RNAs involved in plant growth and development as well as in the response to environmental stresses. miRNAs have been extensively studied in various plant species; however, only limited information is available for cassava, which serves as a staple food crop, a biofuel crop, animal feed and a source of industrial raw materials. In this study, 169 potential cassava miRNAs belonging to 34 miRNA families were identified by a computational approach. Interestingly, mes-miR319b represents the first putative mirtron demonstrated in cassava. A total of 15 miRNA clusters involving 7 miRNA families, and 12 pairs of sense and antisense strand cassava miRNAs belonging to six different miRNA families, were discovered. Prediction of potential miRNA target genes revealed functions in various important plant biological processes. Cis-regulatory elements relevant to drought stress and plant hormone responses were identified in the promoter regions of these miRNA genes. The results provide a foundation for further investigation of the functional role of known transcription factors in the regulation of cassava miRNAs. A better understanding of the complexity of the miRNA-mediated gene network in cassava would help unravel cassava's complex biology in storage root development and in coping with environmental stresses, thus providing more insights for future exploitation in cassava improvement.
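    Computational miRNA identification pipelines typically pre-screen candidates with simple sequence criteria before secondary-structure analysis. The toy filter below is only a sketch of that first step; the length and GC-content thresholds are illustrative, not the criteria used in the cassava study:

```python
def gc_content(seq):
    """Fraction of G/C bases in a nucleotide sequence."""
    s = seq.upper()
    return (s.count('G') + s.count('C')) / len(s)

def passes_basic_filters(mature, precursor, min_len=20, max_len=24):
    """Toy pre-screen for a candidate mature miRNA: plausible length,
    containment in its putative precursor, and moderate GC content.
    Real pipelines add hairpin folding and minimum-free-energy checks.
    """
    return (min_len <= len(mature) <= max_len
            and mature in precursor
            and 0.3 <= gc_content(mature) <= 0.7)
```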

  13. Social Identification as a Determinant of Concerns about Individual-, Group-, and Inclusive-Level Justice

    ERIC Educational Resources Information Center

    Wenzel, Michael

    2004-01-01

    Extending concepts of micro- and macrojustice, three levels of justice are distinguished. Individual-, group-, and inclusive-level justice are defined in terms of the target of justice concerns: one's individual treatment, one's group's treatment, and the distribution in the collective (e.g., nation). Individual-level justice permits a more…

  14. Identification and Utilization of Employer Requirements for Entry-Level Health Occupations Workers. Final Report.

    ERIC Educational Resources Information Center

    Zukowski, James J.

    The purpose of a research project was to identify employer expectations regarding entry-level competency requirements for selected assistance-level health occupations. Physicians, dentists, and other health professionals reviewed lists of competencies associated with the performance of assistance-level employees in nursing, medical laboratory, and…

  16. Identification of a potential conformationally disordered mesophase in a small molecule: experimental and computational approaches.

    PubMed

    Chakravarty, Paroma; Bates, Simon; Thomas, Leonard

    2013-08-05

    GNE068, a small organic molecule, was obtained as an amorphous form (GNE068-A) after isolation from ethanol and as a partially disordered form (GNE068-PC) from ethyl acetate. On subsequent characterization, GNE068-PC exhibited a number of properties that were anomalous for a two-phase crystalline-amorphous system but consistent with the presence of a solid state phase having intermediate order (mesomorphous). Modulated DSC measurements of GNE068-PC revealed an overlapping endotherm and glass transition in the 135-145 °C range. ΔH of the endotherm showed a strong heating-rate dependence. Variable-temperature XRPD (25-160 °C) revealed structure loss in GNE068-PC, suggesting the endotherm to be an "apparent melt". In addition, gentle grinding of GNE068-PC in a mortar led to a marked decrease in XRPD peak intensities, indicating a "soft" crystalline lattice. Computational analysis of XRPD data revealed the presence of two noncrystalline contributions, one of which was associated with GNE068-A. The second was a variable component that could be modeled as diffuse scattering from local disorder within the associated crystal structure, suggesting a mesomorphous system. Owing to the dominance of the noncrystalline diffuse scattering in GNE068-PC and the observed lattice deformation, the mesomorphous phase exhibited properties consistent with a conformationally disordered mesophase. Because of the intimate association of the residual solvent (ethyl acetate) with the lattice long-range order, loss of solvent on heating through the glass transition temperature of the local disorder caused irrecoverable loss of the long-range order. This precluded the observation of characteristic thermodynamic mesophase behavior above the glass transition temperature.

  17. Identification of the distinguishing features of Crohn's disease and ischemic colitis using computed tomographic enterography.

    PubMed

    Chen, Min; Remer, Erick M; Liu, Xiuli; Lopez, Rocio; Shen, Bo

    2017-08-01

    Background and aims: The differential diagnosis between Crohn's disease (CD) and ischemic colitis (ISC) is important as their clinical management is different. ISC can easily be misdiagnosed as CD, especially in elderly populations. The distinctive radiographic features of the two disease entities have not been investigated. The aim of this study is to assess the utility of computed tomographic enterography (CTE) in the differential diagnosis between CD and ISC. Methods: Patients with confirmed CD and ISC were identified through an electronic medical record search of the Cleveland Clinic Foundation. Patients who had undergone CTE, with or without concurrent colonoscopy and histopathological specimens, were included in this study. CTE images were blindly re-reviewed by an expert gastrointestinal radiologist. The sensitivities, specificities, accuracies and positive and negative predictive values for each of the CTE findings in differentiating CD from ISC were estimated. Kappa coefficients (κ) were calculated to measure diagnosis agreement between CTE and the reference standard. Results: A total of 34 eligible patients were included in this study with 17 having CD and 17 having ISC. In differentiating CD from ISC, the presence of mucosal hyperenhancement and absence of the "target sign" on CTE showed a sensitivity of 100% each for CD, while the two radiographic features yielded a low specificity of 35.3% and 76.5%, respectively. The presence of stricture had a lower sensitivity of 64.7% for the detection of CD but had a high specificity of 100%. In distinguishing CD from ISC, the accuracies of the presence of mucosal hyperenhancement, the presence of stricture, and the absence of the target sign were 67.7%, 82.4% and 88.2%, respectively. The combination of the presence of mucosal hyperenhancement and the absence of the target sign achieved an accuracy of 100% for distinguishing CD from ISC. There was a good correlation between CTE and the reference standard for distinguishing CD from ISC (κ
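    The kappa coefficient used above to measure diagnostic agreement is Cohen's kappa: observed agreement corrected for the agreement expected by chance. A stdlib-only sketch (the rating lists in the example are made up, not the study's data):

```python
def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa: chance-corrected agreement between two raters
    giving parallel lists of categorical diagnoses."""
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    categories = set(rater_a) | set(rater_b)
    # chance agreement from each rater's marginal category frequencies
    expected = sum((rater_a.count(c) / n) * (rater_b.count(c) / n)
                   for c in categories)
    return (observed - expected) / (1 - expected)
```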

  18. Identification of novel microRNAs in Hevea brasiliensis and computational prediction of their targets

    PubMed Central

    2012-01-01

    Background Plants respond to external stimuli through fine regulation of gene expression partially ensured by small RNAs. Of these, microRNAs (miRNAs) play a crucial role. They negatively regulate gene expression by targeting the cleavage or translational inhibition of target messenger RNAs (mRNAs). In Hevea brasiliensis, environmental and harvesting stresses are known to affect natural rubber production. This study set out to identify abiotic stress-related miRNAs in Hevea using next-generation sequencing and bioinformatic analysis. Results Deep sequencing of small RNAs was carried out on plantlets subjected to severe abiotic stress using the Solexa technique. By combining the LeARN pipeline, data from the Plant microRNA database (PMRD) and Hevea EST sequences, we identified 48 conserved miRNA families already characterized in other plant species, and 10 putatively novel miRNA families. The results showed the most abundant size for miRNAs to be 24 nucleotides, except for seven families. Several MIR genes produced both 20-22 nucleotide and 23-27 nucleotide species. The two miRNA class sizes were detected for both conserved and putative novel miRNA families, suggesting their functional duality. The EST databases were scanned with conserved and novel miRNA sequences. MiRNA targets were computationally predicted and analysed. The predicted targets involved in "responses to stimuli" and in "antioxidant" and "transcription" activities are presented. Conclusions Deep sequencing of small RNAs combined with transcriptomic data is a powerful tool for identifying conserved and novel miRNAs when the complete genome is not yet available. Our study provided additional information for evolutionary studies and revealed potentially specific regulation of the control of redox status in Hevea. PMID:22330773

  19. Validation of a computer-aided diagnosis system for the automatic identification of carotid atherosclerosis.

    PubMed

    Bonanno, Lilla; Marino, Silvia; Bramanti, Placido; Sottile, Fabrizio

    2015-02-01

    Carotid atherosclerosis represents one of the most important causes of brain stroke. The degree of carotid stenosis is, up to now, considered one of the most important features for determining the risk of brain stroke. Ultrasound (US) is a non-invasive, relatively inexpensive, portable technique, which has an excellent temporal resolution. Computer-aided diagnosis (CAD) has become one of the major research fields in medical and diagnostic imaging. We studied US images of 44 patients, 22 patients with and 22 without carotid artery stenosis, by using US examination and applying a CAD system, an automatic prototype software to detect carotid plaques. We obtained 287 regions: 60 were classified as plaques, with an average signal echogenicity of 244.1 ± 20.0, and 227 were classified as non-plaques, with an average signal echogenicity of 193.8 ± 38.6, compared with the opinion of an expert neurologist (gold standard). The receiver operating characteristic (ROC) analysis revealed an area under the ROC curve significantly different from 0.5 (null hypothesis) in the discrimination between plaques and non-plaques; the diagnostic accuracy was 89% (95% CI: 0.85-0.92), with an appropriate cut-off value of 236.8, sensitivity was 83% and specificity reached a value of 85%. The experimental results showed that the proposed method is feasible and has a good agreement with the expert neurologist. Without the need of any user-interaction, this method generates a detection output that may be useful as a second opinion.
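    The sensitivity/specificity at a cutoff and the area under the ROC curve can both be computed directly from the two groups' echogenicity values. A stdlib-only sketch (the sample values in the usage example are made up; the AUC uses the rank/Mann-Whitney formulation):

```python
def sensitivity_specificity(plaques, non_plaques, cutoff):
    """Classify a region as 'plaque' when its echogenicity >= cutoff."""
    sens = sum(v >= cutoff for v in plaques) / len(plaques)
    spec = sum(v < cutoff for v in non_plaques) / len(non_plaques)
    return sens, spec

def auc(plaques, non_plaques):
    """Area under the ROC curve: the probability that a randomly chosen
    plaque region outscores a randomly chosen non-plaque region
    (ties count half)."""
    wins = sum((p > q) + 0.5 * (p == q)
               for p in plaques for q in non_plaques)
    return wins / (len(plaques) * len(non_plaques))

# illustrative values around the paper's 236.8 cutoff
sens, spec = sensitivity_specificity([240, 250, 230], [190, 200], 236.8)
```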

  20. A study to establish international diagnostic reference levels for paediatric computed tomography.

    PubMed

    Vassileva, J; Rehani, M; Kostova-Lefterova, D; Al-Naemi, H M; Al Suwaidi, J S; Arandjic, D; Bashier, E H O; Kodlulovich Renha, S; El-Nachef, L; Aguilar, J G; Gershan, V; Gershkevitsh, E; Gruppetta, E; Hustuc, A; Jauhari, A; Kharita, Mohammad Hassan; Khelassi-Toutaoui, N; Khosravi, H R; Khoury, H; Kralik, I; Mahere, S; Mazuoliene, J; Mora, P; Muhogora, W; Muthuvelu, P; Nikodemova, D; Novak, L; Pallewatte, A; Pekarovič, D; Shaaban, M; Shelly, E; Stepanyan, K; Thelsy, N; Visrutaratna, P; Zaman, A

    2015-07-01

    The article reports results from the largest international dose survey in paediatric computed tomography (CT) in 32 countries and proposes international diagnostic reference levels (DRLs) in terms of computed tomography dose index (CTDIvol) and dose length product (DLP). It also assesses whether mean or median values of individual facilities should be used. A total of 6115 individual patient data were recorded among four age groups: <1 y, >1-5 y, >5-10 y and >10-15 y. CTDIw, CTDIvol and DLP from the CT console were recorded in dedicated forms together with patient data and technical parameters. Statistical analysis was performed, and international DRLs were established at rounded 75th percentile values of the distribution of median values from all CT facilities. The study presents evidence in favour of using the median rather than the mean of patient dose indices as the representative of typical local dose in a facility, and for establishing DRLs as the third quartile of median values. International DRLs were established for paediatric CT examinations for routine head, chest and abdomen in the four age groups. DRLs for CTDIvol are similar to the reference values from other published reports, with some differences for chest and abdomen CT. Higher variations were observed between DLP values, based on a survey of whole multi-phase exams. It may be noted that other studies in the literature were based on a single phase only. DRLs reported in this article can be used in countries without sufficient medical physics support to identify non-optimised practice. Recommendations to improve the accuracy and importance of future surveys are provided.
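    The DRL construction described above (third quartile of per-facility median dose indices, rounded) can be sketched in a few lines of stdlib Python; the facility names and dose values in the example are made up, and the percentile uses simple linear interpolation between closest ranks:

```python
from statistics import median

def reference_level(facility_doses, percentile=75):
    """DRL: rounded percentile (default 75th) of per-facility median
    dose indices (e.g. CTDIvol or DLP).

    facility_doses maps a facility name to its list of patient dose values.
    """
    meds = sorted(median(v) for v in facility_doses.values())
    # percentile by linear interpolation between closest ranks
    k = (percentile / 100) * (len(meds) - 1)
    lo = int(k)
    hi = min(lo + 1, len(meds) - 1)
    return round(meds[lo] + (k - lo) * (meds[hi] - meds[lo]))

# illustrative survey of five facilities
drl = reference_level({'A': [2, 3, 4], 'B': [5, 6, 7], 'C': [1, 2, 3],
                       'D': [8, 9, 10], 'E': [4, 5, 6]})
```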

  1. Analysis of the Computer Anxiety Levels of Secondary Technical Education Teachers in West Virginia.

    ERIC Educational Resources Information Center

    Gordon, Howard R. D.

    1995-01-01

    Responses from 116 secondary technical education teachers (91%) who completed Oetting's Computer Anxiety Scale indicated that 46% experienced general computer anxiety. Development of computer and typing skills explained a large portion of the variance. More hands-on training was recommended. (SK)

  2. Efficacy of a Standardized Computer-Based Training Curriculum to Teach Echocardiographic Identification of Rheumatic Heart Disease to Nonexpert Users.

    PubMed

    Beaton, Andrea; Nascimento, Bruno R; Diamantino, Adriana C; Pereira, Gabriel T R; Lopes, Eduardo L V; Miri, Cassio O; Bruno, Kaciane K O; Chequer, Graziela; Ferreira, Camila G; Lafeta, Luciana C X; Richards, Hedda; Perlman, Lindsay; Webb, Catherine L; Ribeiro, Antonio L P; Sable, Craig; Nunes, Maria do Carmo P

    2016-06-01

    The ability to integrate echocardiographic screening for rheumatic heart disease (RHD) into RHD prevention programs is limited because of a lack of financial and expert human resources in endemic areas. Task shifting to nonexperts is promising, but investigations into workforce composition and training schemes are needed. The objective of this study was to test nonexperts' ability to interpret RHD screening echocardiograms after a brief, standardized, computer-based training course. Six nonexperts completed a 3-week curriculum on image interpretation. Participant performance was tested in a school-screening environment in comparison to the reference approach (cardiologists, standard portable echocardiography machines, and 2012 World Heart Federation criteria). All participants successfully completed the curriculum, and feedback was universally positive. Screening was performed in 1,381 children (5 to 18 years, 60% female), with 397 (47 borderline RHD, 6 definite RHD, 336 normal, and 8 other) referred for handheld echo. Overall sensitivity of the simplified approach was 83% (95% CI 76% to 89%), with an overall specificity of 85% (95% CI 82% to 87%). The most common reasons for false-negative screens (n = 16) were missed mitral regurgitation (MR; 44%) and MR ≤1.5 cm (29%). The most common reasons for false-positive screens (n = 179) included identification of erroneous color jets (25%), incorrect MR measurement (24%), and appropriate application of simplified guidelines (39.4%). In conclusion, a short, independent computer-based curriculum can be successfully used to train a heterogeneous group of nonexperts to interpret RHD screening echocardiograms. This approach helps address prohibitive financial and workforce barriers to widespread RHD screening. Copyright © 2016 Elsevier Inc. All rights reserved.

  3. Diagnostic accuracy of cone beam computed tomography in identification and postoperative evaluation of furcation defects

    PubMed Central

    Pajnigara, Natasha; Kolte, Abhay; Kolte, Rajashri; Pajnigara, Nilufer; Lathiya, Vrushali

    2016-01-01

    Background: Decision-making in periodontal therapeutics is critical and is influenced by accurate diagnosis of osseous defects, especially furcation involvement. Commonly used diagnostic methods such as clinical probing and conventional radiography have their own limitations. Hence, this study was planned to evaluate the dimensions of furcation defects clinically (pre- and post-surgery), intra-surgically, and by cone beam computed tomography (CBCT) (pre- and post-surgery). Materials and Methods: The study comprised a total of 200 Grade II furcation defects in forty patients, with a mean age of 38.05 ± 4.77 years, diagnosed with chronic periodontitis, which were evaluated clinically (pre- and post-surgically), by CBCT (pre- and post-surgically), and intrasurgically after flap reflection (40 defects in each). After the presurgical clinical and CBCT measurements, demineralized freeze-dried bone allograft was placed in the furcation defect and the flaps were sutured back. Six months later, these defects were evaluated by recording measurements clinically, i.e., postsurgery clinical measurements and also postsurgery CBCT measurements (40 defects each). Results: Presurgery clinical measurements (vertical 6.15 ± 1.71 mm and horizontal 3.05 ± 0.84 mm) and CBCT measurements (vertical 7.69 ± 1.67 mm and horizontal 4.62 ± 0.77 mm) underestimated intrasurgery measurements (vertical 8.025 ± 1.67 mm and horizontal 4.82 ± 0.67 mm) in both vertical and horizontal aspects, and the difference was not statistically significant (vertical P = 1.000, 95% confidence interval [CI], horizontal P = 0.867, 95% CI). Further, postsurgery clinical measurements (vertical 2.9 ± 0.74 mm and horizontal 1.52 ± 0.59 mm) underestimated CBCT measurements (vertical 3.67 ± 1.17 mm and horizontal 2.45 ± 0.48 mm). There was a statistically significant difference between presurgery clinical–presurgery CBCT (P < 0.0001, 95% CI) versus postsurgery clinical–postsurgery CBCT (P < 0.0001, 95% CI

  4. Computational Identification of Diverse Mechanisms Underlying Transcription Factor-DNA Occupancy

    PubMed Central

    Cheng, Qiong; Kazemian, Majid; Pham, Hannah; Blatti, Charles; Celniker, Susan E.; Wolfe, Scot A.; Brodsky, Michael H.; Sinha, Saurabh

    2013-01-01

    ChIP-based genome-wide assays of transcription factor (TF) occupancy have emerged as a powerful, high-throughput method to understand transcriptional regulation, especially on a global scale. This has led to great interest in the underlying biochemical mechanisms that direct TF-DNA binding, with the ultimate goal of computationally predicting a TF's occupancy profile in any cellular condition. In this study, we examined the influence of various potential determinants of TF-DNA binding on a much larger scale than previously undertaken. We used a thermodynamics-based model of TF-DNA binding, called "STAP," to analyze 45 TF-ChIP data sets from Drosophila embryonic development. We built a cross-validation framework that compares a baseline model, based on the ChIP'ed ("primary") TF's motif, to more complex models where binding by secondary TFs is hypothesized to influence the primary TF's occupancy. Candidate interacting TFs were chosen based on RNA-seq expression data from the time point of the ChIP experiment. We found widespread evidence of both cooperative and antagonistic effects by secondary TFs, and explicitly quantified these effects. We were able to identify multiple classes of interactions, including (1) long-range interactions between primary and secondary motifs (separated by ≤150 bp), suggestive of indirect effects such as chromatin remodeling, (2) short-range interactions with specific inter-site spacing biases, suggestive of direct physical interactions, and (3) overlapping binding sites suggesting competitive binding. Furthermore, by factoring out the previously reported strong correlation between TF occupancy and DNA accessibility, we were able to categorize the effects into those that are likely to be mediated by the secondary TF's effect on local accessibility and those that utilize accessibility-independent mechanisms. Finally, we conducted in vitro pull-down assays to test model-based predictions of short-range cooperative interactions
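    Thermodynamics-based occupancy models of this kind weight binding configurations by their Boltzmann factors. The heavily simplified single-site sketch below illustrates only that core idea (the functional form and the chemical-potential parameter `mu` are our illustration, not the STAP implementation):

```python
import math

def site_occupancy(binding_energies, mu=0.0):
    """Fermi/Boltzmann occupancy of independent sites (energies in kT units):
    sites with energy well below the chemical potential mu approach full
    occupancy, while high-energy (weak) sites approach zero."""
    return [1.0 / (1.0 + math.exp(e - mu)) for e in binding_energies]

# a site exactly at mu is half-occupied; strong/weak sites saturate
occ = site_occupancy([0.0, -10.0, 10.0])
```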

  5. Perceptions of teaching and learning automata theory in a college-level computer science course

    NASA Astrophysics Data System (ADS)

    Weidmann, Phoebe Kay

    This dissertation identifies and describes student and instructor perceptions that contribute to effective teaching and learning of Automata Theory in a competitive college-level Computer Science program. Effective teaching is the ability to create an appropriate learning environment in order to provide effective learning. We define effective learning as the ability of a student to meet instructor set learning objectives, demonstrating this by passing the course, while reporting a good learning experience. We conducted our investigation through a detailed qualitative case study of two sections (118 students) of Automata Theory (CS 341) at The University of Texas at Austin taught by Dr. Lily Quilt. Because Automata Theory has a fixed curriculum in the sense that many curricula and textbooks agree on what Automata Theory contains, differences being depth and amount of material to cover in a single course, a case study would allow for generalizable findings. Automata Theory is especially problematic in a Computer Science curriculum since students are not experienced in abstract thinking before taking this course, fail to understand the relevance of the theory, and prefer classes with more concrete activities such as programming. This creates a special challenge for any instructor of Automata Theory as motivation becomes critical for student learning. Through the use of student surveys, instructor interviews, classroom observation, material and course grade analysis we sought to understand what students perceived, what instructors expected of students, and how those perceptions played out in the classroom in terms of structure and instruction. Our goal was to create suggestions that would lead to a better designed course and thus a higher student success rate in Automata Theory. We created a unique theoretical basis, pedagogical positivism, on which to study college-level courses. 
Pedagogical positivism states that through examining instructor and student perceptions

  6. Computed Tomography Image Origin Identification based on Original Sensor Pattern Noise and 3D Image Reconstruction Algorithm Footprints.

    PubMed

    Duan, Yuping; Bouslimi, Dalel; Yang, Guanyu; Shu, Huazhong; Coatrieux, Gouenou

    2016-06-08

    In this paper, we focus on the "blind" identification of the Computed Tomography (CT) scanner that has produced a CT image. To do so, we propose a set of noise features derived from the image chain acquisition and which can be used as CT-Scanner footprint. Basically, we propose two approaches. The first one aims at identifying a CT-Scanner based on an Original Sensor Pattern Noise (OSPN) that is intrinsic to the X-ray detectors. The second one identifies an acquisition system based on the way this noise is modified by its 3D image reconstruction algorithm. As these reconstruction algorithms are manufacturer dependent and kept secret, our features are used as input to train an SVM based classifier so as to discriminate acquisition systems. Experiments conducted on images issued from 15 different CT-Scanner models of 4 distinct manufacturers demonstrate that our system identifies the origin of one CT image with a detection rate of at least 94% and that it achieves better performance than Sensor Pattern Noise (SPN) based strategy proposed for general public camera devices.
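    Sensor pattern noise is conventionally estimated as the residual between an image and a denoised version of itself, averaged over many images from the same device. The toy stdlib-only sketch below uses a 3x3 mean filter as a stand-in denoiser (the paper's actual filtering and feature pipeline is not reproduced here):

```python
def noise_residual(img):
    """Residual = image minus its 3x3 mean-filtered version; over many
    images from one scanner, the residual averages toward that device's
    sensor pattern noise."""
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for i in range(h):
        for j in range(w):
            # mean over the 3x3 neighbourhood, clipped at image borders
            neigh = [img[a][b]
                     for a in range(max(0, i - 1), min(h, i + 2))
                     for b in range(max(0, j - 1), min(w, j + 2))]
            out[i][j] = img[i][j] - sum(neigh) / len(neigh)
    return out
```

A perfectly uniform image has zero residual everywhere; real images leave a small structured residual that can feed an SVM classifier as in the study.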

  7. Automatic identification of watercourses in flat and engineered landscapes by computing the skeleton of a LiDAR point cloud

    NASA Astrophysics Data System (ADS)

    Broersen, Tom; Peters, Ravi; Ledoux, Hugo

    2017-09-01

    Drainage networks play a crucial role in protecting land against floods. It is therefore important to have an accurate map of the watercourses that form the drainage network. Previous work on the automatic identification of watercourses was typically based on grids, focused on natural landscapes, and used mostly the slope and curvature of the terrain. We focus in this paper on areas characterised by low-lying, flat, and engineered landscapes, such as those typical of the Netherlands. We propose a new methodology to identify watercourses automatically from elevation data; it uses solely a raw classified LiDAR point cloud as input. We show that by computing a skeleton of the point cloud twice (once in 2D and once in 3D) and by using the properties of the skeletons, we can identify most of the watercourses. We have implemented our methodology and tested it for three different soil types around Utrecht, the Netherlands. We were able to detect 98% of the watercourses for one soil type, and around 75% in the worst case, when compared to a reference dataset that was obtained semi-automatically.

  8. [Computer-aided personality identification by skull and life-time photography by POSKID 1.1 method].

    PubMed

    Zviagin, V N; Ivanov, N V; Narina, N V

    2000-01-01

    An improved method for computer-aided personality identification by the skull, based on the POSKID 1.1 software, consists in the investigation of enlarged images of the skull and a life-time photograph of the probable individual using the coordinates of 49 anatomical points; independent quantitative evaluation of the aspect of each of the compared objects along the X, Y, and Z axes; and formal evaluation of the results of the comparative skull-portrait study by multidimensional discriminant analysis models. The proposed version differs from the POSKID 1.0 software in the method for evaluating the spatial position of the head in the portrait and the corresponding orientation of the skull in space, which necessitates the use of a coordinate-regulated holder. The POSKID 1.1 method is based on multidimensional discriminant analysis and yields a virtually reliable solution in 76.13-80.65% of cases, a probable solution (positive or negative) in 11.61-18.06% of cases, and a motivated refusal to decide in 5.81-7.74% of cases. In the case of a probable or indefinite solution, further investigations using life-time photographs with different aspects are recommended.

  9. Cord blood triglycerides are associated with IGF-I levels and contribute to the identification of growth-restricted neonates.

    PubMed

    Sifianou, Popi; Zisis, Dimitris

    2012-12-01

    The aim of this study was to investigate whether readily available laboratory tests may aid in the identification of growth-restricted neonates. Cord serum levels of 15 chemical analytes, including insulin-like growth factor I (IGF-I) and insulin-like growth factor binding protein 3 (IGFBP-3) were measured in newborns ≥36 weeks gestational age (GA). Based on the number of anthropometric indices (out of four) with values ≤25th centile for GA, the babies were allocated into three groups, i.e., Group(25)0, Group(25)1 and Group(25)2 corresponding to neonates with 0, 1 and 2 or more indices, respectively, that were ≤25th centile for GA. Furthermore, two composite variables were developed: A25 (Group(25)0 and Group(25)1) and B25 (Group(25)0 and Group(25)2). The data were evaluated by the Mann-Whitney test and multiple regression analyses. Cord serum triglycerides and total cholesterol levels were significantly higher in Group(25)2 compared to Group(25)0 (p values 0.004 and 0.0009, respectively). The triglycerides almost doubled the power of the variable B25 for predicting IGF-I levels and were found to have a highly significant, negative association with the IGF-I levels (p<0.0001). The IGF-I along with the IGFBP-3 levels explained almost one third of the variation of triglycerides. Cord serum triglycerides can assist in the identification of growth-restricted neonates. The novel finding of the association of triglycerides with IGF-I calls for further research as this can illuminate unknown aspects of the fetal lipid metabolism. Copyright © 2012 Elsevier Ltd. All rights reserved.

  10. Identification of a Small Molecule That Selectively Inhibits Mouse PC2 over Mouse PC1/3: A Computational and Experimental Study

    PubMed Central

    Yongye, Austin B.; Vivoli, Mirella; Lindberg, Iris; Appel, Jon R.; Houghten, Richard A.; Martinez-Mayorga, Karina

    2013-01-01

    The calcium-dependent serine endoproteases prohormone convertase 1/3 (PC1/3) and prohormone convertase 2 (PC2) play important roles in the homeostatic regulation of blood glucose levels, hence implicated in diabetes mellitus. Specifically, the absence of PC2 has been associated with chronic hypoglycemia. Since there is reasonably good conservation of the catalytic domain between species, translation of inhibitory effects is likely. In fact, similar results have been found using both mouse and human recombinant enzymes. Here, we employed computational structure-based approaches to screen 14,400 compounds from the Maybridge small molecule library against mouse PC2. Our most remarkable finding was the identification of a potent and selective PC2 inhibitor. Kinetic data showed the compound to be an allosteric inhibitor. The compound identified is one of the few reported selective, small-molecule inhibitors of PC2. In addition, this new PC2 inhibitor is structurally different and of smaller size than those reported previously. This is advantageous for future studies where structural analogues can be built upon. PMID:23451118

  11. Genotypic and Phenotypic Applications for the Differentiation and Species-Level Identification of Achromobacter for Clinical Diagnoses

    PubMed Central

    Gomila, Margarita; Prince-Manzano, Claudia; Svensson-Stadler, Liselott; Busquets, Antonio; Erhard, Marcel; Martínez, Deny L.; Lalucat, Jorge; Moore, Edward R. B.

    2014-01-01

    Achromobacter is a genus in the family Alcaligenaceae, comprising fifteen species isolated from different sources, including clinical samples. The ability to detect and correctly identify Achromobacter species, particularly A. xylosoxidans, and differentiate them from other phenotypically similar and genotypically related Gram-negative, aerobic, non-fermenting species is important for patients with cystic fibrosis (CF), as well as for nosocomial and other opportunistic infections. Traditional phenotypic profile-based analyses have been demonstrated to be inadequate for reliable identification of isolates of Achromobacter species, and genotypic-based assays relying upon comparative 16S rRNA gene sequence analyses are not able to ensure definitive identification of Achromobacter species, due to the inherently conserved nature of the gene. Alternative methodologies enabling high-resolution differentiation between the species of the genus are needed. A comparative multi-locus sequence analysis (MLSA) of four selected 'house-keeping' genes (atpD, gyrB, recA, and rpoB) assessed the individual gene sequences for their potential in developing a reliable, rapid and cost-effective diagnostic protocol for Achromobacter species identification. The analysis of the type strains of the species of the genus and 46 strains of Achromobacter species showed congruence between the cluster analyses derived from the individual genes. The MLSA gene sequences exhibited different levels of resolution in delineating the validly published Achromobacter species and elucidated strains that represent new genotypes and probable new species of the genus. Our results also suggested that the recently described A. spiritinus is a later heterotypic synonym of A. marplatensis. Strains were analyzed, using whole-cell Matrix-Assisted Laser Desorption/Ionization Time-Of-Flight mass spectrometry (MALDI-TOF MS), as an alternative phenotypic profile-based method with the potential to

  12. High quality machine-robust image features: identification in nonsmall cell lung cancer computed tomography images.

    PubMed

    Hunter, Luke A; Krafft, Shane; Stingo, Francesco; Choi, Haesun; Martel, Mary K; Kry, Stephen F; Court, Laurence E

    2013-12-01

    For nonsmall cell lung cancer (NSCLC) patients, quantitative image features extracted from computed tomography (CT) images can be used to improve tumor diagnosis, staging, and response assessment. For these findings to be clinically applied, image features need to have high intra- and intermachine reproducibility. The objective of this study is to identify CT image features that are reproducible, nonredundant, and informative across multiple machines. Noncontrast-enhanced, test-retest CT image pairs were obtained from 56 NSCLC patients imaged on three CT machines from two institutions. Two machines ("M1" and "M2") used cine 4D-CT and one machine ("M3") used breath-hold helical 3D-CT. Gross tumor volumes (GTVs) were semiautonomously segmented and then pruned by removing voxels with CT numbers less than a prescribed Hounsfield unit (HU) cutoff. Three hundred and twenty-eight quantitative image features were extracted from each pruned GTV based on its geometry, intensity histogram, absolute gradient image, co-occurrence matrix, and run-length matrix. For each machine, features with concordance correlation coefficient values greater than 0.90 were considered reproducible. The Dice similarity coefficient (DSC) and the Jaccard index (JI) were used to quantify reproducible feature set agreement between machines. Multimachine reproducible feature sets were created by taking the intersection of individual machine reproducible feature sets. Redundant features were removed through hierarchical clustering based on the average correlation between features across multiple machines. For all image types, GTV pruning was found to negatively affect reproducibility (reported results use no HU cutoff). The reproducible feature percentage was highest for average images (M1 = 90.5%, M2 = 94.5%, M1∩M2 = 86.3%), intermediate for end-exhale images (M1 = 75.0%, M2 = 71.0%, M1∩M2 = 52.1%), and lowest for breath-hold images (M3 = 61.0%). Between M1 and M2, the reproducible feature sets
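    The screening and set-agreement steps described above can be sketched in a few lines; this is a minimal illustration of Lin's concordance correlation coefficient and the DSC/JI set metrics, not the authors' code, and the feature names below are placeholders.

```python
import numpy as np

def concordance_cc(x, y):
    """Lin's concordance correlation coefficient between test and retest values."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    mx, my = x.mean(), y.mean()
    vx, vy = x.var(), y.var()                 # population (biased) variances
    cov = ((x - mx) * (y - my)).mean()
    return 2.0 * cov / (vx + vy + (mx - my) ** 2)

def dice(a, b):
    """Dice similarity coefficient between two feature-name sets."""
    return 2.0 * len(a & b) / (len(a) + len(b))

def jaccard(a, b):
    """Jaccard index between two feature-name sets."""
    return len(a & b) / len(a | b)

# A feature is kept as "reproducible" on a machine when CCC > 0.90;
# the multimachine set is the intersection of per-machine sets.
m1 = {"energy", "contrast", "volume"}         # placeholder feature names
m2 = {"energy", "contrast", "entropy"}
print(dice(m1, m2), jaccard(m1, m2))
```

    The 0.90 CCC cutoff and the intersection step are the ones stated in the abstract; everything else here is illustrative.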

  14. A component-level failure detection and identification algorithm based on open-loop and closed-loop state estimators

    NASA Astrophysics Data System (ADS)

    You, Seung-Han; Cho, Young Man; Hahn, Jin-Oh

    2013-04-01

    This study presents a component-level failure detection and identification (FDI) algorithm for a cascade mechanical system subsuming a plant driven by an actuator unit. The novelty of the FDI algorithm presented in this study is that it can discriminate failures occurring in the actuator unit, in the sensor measuring the output of the actuator unit, and in the plant driven by the actuator unit. The proposed FDI algorithm exploits the measurement of the actuator unit output together with its estimates generated by open-loop (OL) and closed-loop (CL) estimators to enable FDI at the component level. In this study, the OL estimator is designed based on the system identification of the actuator unit. The CL estimator, which is guaranteed to be stable against variations in the plant, is synthesized based on the dynamics of the entire cascade system. The viability of the proposed algorithm is demonstrated using a hardware-in-the-loop simulation (HILS), which shows that it can detect and identify target failures reliably in the presence of plant uncertainties.
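    A minimal sketch of how two residuals can separate the three failure locations is given below. The residual definitions follow the abstract, but the decision table itself is an illustrative assumption, not the authors' published logic, and the threshold is arbitrary.

```python
# r_ol: measured actuator output minus the open-loop (actuator-model) estimate.
# r_cl: measured actuator output minus the closed-loop (cascade-observer) estimate.
def classify_failure(r_ol, r_cl, thresh=0.1):
    """Map the two residuals to a failed component (illustrative logic only)."""
    ol_bad, cl_bad = abs(r_ol) > thresh, abs(r_cl) > thresh
    if ol_bad and cl_bad:
        return "sensor"    # measurement disagrees with both estimators
    if ol_bad:
        return "actuator"  # actuator departs from its identified nominal model
    if cl_bad:
        return "plant"     # cascade observer thrown off by a plant change
    return "none"

print(classify_failure(0.5, 0.02))   # -> actuator
```

    In a real implementation the thresholds would be tuned against noise levels and the plant uncertainty bounds the CL estimator is designed for.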

  15. [Elevated stimulated thyroglobulin levels in the identification of persistent papillary thyroid carcinoma].

    PubMed

    Jervis, Paola; González, Baldomero; Vargas, Guadalupe; Mercado, Moisés

    2011-01-01

    Persistence of papillary thyroid carcinoma is usually associated with elevated stimulated thyroglobulin levels. To evaluate the association between moderately elevated stimulated thyroglobulin levels and the persistence of papillary thyroid carcinoma one year after thyroidectomy and radioiodine ablation. Out of a cohort of 97 patients with papillary thyroid carcinoma, we selected those with available stimulated thyroglobulin level measurements (in the absence of thyroglobulin antibodies) after one year of initial treatment with surgery and radioiodine. The subjects were stratified according to whether the stimulated thyroglobulin level was between 1-10 ng/ml or above 10 ng/ml. Twenty-seven patients were included in the study, 11 with a stimulated thyroglobulin level between 1-10 ng/ml, and 16 with values greater than 10 ng/ml. Median age and gender proportion were similar between the two groups. As expected, median stimulated thyroglobulin levels were significantly greater in the second group (5.1 vs. 42 ng/ml; p < 0.001). A stratified analysis aimed at associating stimulated thyroglobulin levels with disease persistence yielded an overall risk of 0.58 (95% CI: 0.1-3.09; p = 0.52) for those subjects with levels between 1-10 ng/ml, while for those with levels > 10 ng/ml the overall risk was 1.71 (95% CI: 0.32-9.1; p = 0.52). The positive predictive value for a stimulated thyroglobulin level between 1-10 ng/ml was 64%. A moderately elevated stimulated thyroglobulin level is an uncertain predictor of papillary thyroid carcinoma persistence.
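    The positive predictive value quoted above is the standard ratio of true positives to all test positives; a one-line sketch follows, with counts that are hypothetical and chosen only to illustrate a 64% PPV, not taken from the study.

```python
def positive_predictive_value(true_pos, false_pos):
    """PPV = TP / (TP + FP): the fraction of patients above the Tg cutoff
    who truly have persistent disease."""
    return true_pos / (true_pos + false_pos)

# Hypothetical counts: 16 persistent cases among 25 test-positive patients.
print(positive_predictive_value(16, 9))   # 16/25 = 0.64
```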

  16. Computer-assisted identification of novel small molecule inhibitors targeting GLUT1

    NASA Astrophysics Data System (ADS)

    Wan, Zhining; Li, Xin; Sun, Rong; Li, Yuanyuan; Wang, Xiaoyun; Li, Xinru; Rong, Li; Shi, Zheng; Bao, Jinku

    2015-12-01

    Glucose transporters (GLUTs) are the main carriers of glucose that facilitate the diffusion of glucose in mammalian cells, especially GLUT1. Notably, GLUT1 is a rate-limiting transporter for glucose uptake, and its overexpression is a common characteristic in most cancers. Thus, the inhibition of GLUT1 by novel small compounds to lower glucose levels for cancer cells has become an emerging strategy. Herein, we employed high-throughput screening approaches to identify potential inhibitors against the sugar-binding site of GLUT1. Firstly, molecular docking screening was launched against the Specs products, and three molecules (ZINC19909927, ZINC19908826, and ZINC19815451) were selected as candidate GLUT1 inhibitors for further analysis. Then, taking the initial ligand β-NG as a reference, molecular dynamics (MD) simulations and the molecular mechanics/generalized Born surface area (MM/GBSA) method were applied to evaluate the binding stability and affinity of the three candidates towards GLUT1. Finally, we found that ZINC19909927 might have the highest affinity to occupy the binding site of GLUT1. Meanwhile, energy decomposition analysis identified several residues located in the substrate-binding site that might provide clues for future inhibitor discovery towards GLUT1. Taken together, the results of our study may provide valuable information for identifying new inhibitors targeting GLUT1-mediated glucose transport and metabolism for cancer therapeutics.
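    The MM/GBSA estimate referred to above combines gas-phase molecular mechanics terms with solvation terms. A minimal arithmetic sketch is shown below; the component values are hypothetical (kcal/mol), not figures from this study.

```python
def mmgbsa_binding_energy(dE_vdw, dE_ele, dG_gb, dG_sa, minus_TdS=0.0):
    """ΔG_bind ≈ ΔE_vdW + ΔE_ele + ΔG_GB + ΔG_SA (- TΔS, often omitted),
    with each term an ensemble average over MD snapshots."""
    return dE_vdw + dE_ele + dG_gb + dG_sa + minus_TdS

# Hypothetical per-complex ensemble averages for one candidate ligand:
print(mmgbsa_binding_energy(-45.2, -12.7, 38.5, -5.1))
```

    Per-residue energy decomposition simply re-partitions the same terms over residues, which is how binding-site hot spots are usually identified.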

  17. Toward accurate tooth segmentation from computed tomography images using a hybrid level set model

    SciTech Connect

    Gan, Yangzhou; Zhao, Qunfei; Xia, Zeyang E-mail: jing.xiong@siat.ac.cn; Hu, Ying; Xiong, Jing E-mail: jing.xiong@siat.ac.cn; Zhang, Jianwei

    2015-01-15

    Purpose: A three-dimensional (3D) model of the teeth provides important information for orthodontic diagnosis and treatment planning. Tooth segmentation is an essential step in generating the 3D digital model from computed tomography (CT) images. The aim of this study is to develop an accurate and efficient tooth segmentation method from CT images. Methods: The 3D dental CT volumetric images are segmented slice by slice in a two-dimensional (2D) transverse plane. The 2D segmentation is composed of a manual initialization step and an automatic slice-by-slice segmentation step. In the manual initialization step, the user manually picks a starting slice and selects a seed point for each tooth in this slice. In the automatic slice segmentation step, the developed hybrid level set model is applied to segment tooth contours from each slice. A tooth contour propagation strategy is employed to initialize the level set function automatically. Cone beam CT (CBCT) images of two subjects were used to tune the parameters. Images of 16 additional subjects were used to validate the performance of the method. Volume overlap metrics and surface distance metrics were adopted to assess the segmentation accuracy quantitatively. The volume overlap metrics were volume difference (VD, mm³) and Dice similarity coefficient (DSC, %). The surface distance metrics were average symmetric surface distance (ASSD, mm), RMS (root mean square) symmetric surface distance (RMSSSD, mm), and maximum symmetric surface distance (MSSD, mm). Computation time was recorded to assess the efficiency. The performance of the proposed method has been compared with two state-of-the-art methods. Results: For the tested CBCT images, the VD, DSC, ASSD, RMSSSD, and MSSD for the incisor were 38.16 ± 12.94 mm³, 88.82 ± 2.14%, 0.29 ± 0.03 mm, 0.32 ± 0.08 mm, and 1.25 ± 0.58 mm, respectively; the VD, DSC, ASSD, RMSSSD, and MSSD for the canine were 49.12 ± 9.33 mm³, 91.57 ± 0.82%, 0.27 ± 0.02 mm, 0
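    Two of the overlap metrics above (VD and DSC) are straightforward to compute from boolean label volumes; a minimal NumPy sketch with toy masks follows (the surface-distance metrics require a surface extraction step and are not shown).

```python
import numpy as np

def dice_coefficient(a, b):
    """DSC = 2|A ∩ B| / (|A| + |B|), in percent, for boolean label volumes."""
    a, b = np.asarray(a, bool), np.asarray(b, bool)
    return 200.0 * np.logical_and(a, b).sum() / (a.sum() + b.sum())

def volume_difference(a, b, voxel_mm3=1.0):
    """VD: absolute difference in segmented volume, in mm³ (given voxel volume)."""
    return abs(int(np.count_nonzero(a)) - int(np.count_nonzero(b))) * voxel_mm3

# Toy 4x4x4 masks: an 8-voxel cube and a copy shifted by one voxel.
a = np.zeros((4, 4, 4), bool); a[1:3, 1:3, 1:3] = True
b = np.zeros((4, 4, 4), bool); b[1:3, 1:3, 2:4] = True
print(dice_coefficient(a, b), volume_difference(a, b))   # overlap 4 of 8+8 -> 50.0, VD 0
```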

  18. Computed Tomography-Related Radiation Exposure in Children Transferred to a Level 1 Pediatric Trauma Center

    PubMed Central

    Brinkman, Adam S.; Gill, Kara G.; Leys, Charles M.; Gosain, Ankush

    2015-01-01

    Purpose Pediatric trauma patients presenting to Referring Facilities (RF) often undergo computed tomography scans (CT) to identify injuries before transfer to a Level 1 Pediatric Trauma Center (PTC). The purpose of our study was to evaluate RF compliance with the American College of Radiology (ACR) guidelines to minimize ionizing radiation exposure in pediatric trauma patients and to determine the frequency of additional or repeat CT imaging after transfer to a PTC. Methods After IRB approval, a retrospective review of all pediatric trauma admissions from January 2010-December 2011 at our American College of Surgeons (ACS) Level 1 PTC was performed. Patient demographics, means of arrival, injury severity score and disposition were analyzed. Patients who underwent CT were grouped by means of arrival: those that were transferred from a RF versus those that presented primarily to the PTC. Compliance with ACR guidelines and need for additional or repeat CT scans were assessed for both groups. Results 697 children (<18yo) were identified with a mean age of 10.6 years. 321 (46%) patients presented primarily to the PTC. 376 (54%) were transferred from a RF, of which 90 (24%) patients underwent CT imaging prior to transfer. CT radiation dosing information was available for 79/90 patients (88%). After transfer, 8/90 (9%) of children imaged at a RF required additional CT scans. In comparison, 314/321 (98%) of patients who presented primarily to the PTC and underwent CT received appropriate pediatric radiation dosing. Mean radiation dose at PTC was approximately half of that at RF for CT scans of the head, chest and abdomen/pelvis (p<0.01). Conclusions Pediatric trauma patients transferred from RF often undergo CT scanning with higher than recommended radiation doses, potentially placing them at increased carcinogenic risk. Fortunately, few RF patients required additional CT scans after PTC transfer. Finally, compliance with ACR radiation dose limit guidelines is better

  19. LEVEL: A computer program for solving the radial Schrödinger equation for bound and quasibound levels

    NASA Astrophysics Data System (ADS)

    Le Roy, Robert J.

    2017-01-01

    This paper describes program LEVEL, which can solve the radial or one-dimensional Schrödinger equation and automatically locate either all of, or a selected number of, the bound and/or quasibound levels of any smooth single- or double-minimum potential, and calculate inertial rotation and centrifugal distortion constants and various expectation values for those levels. It can also calculate Franck-Condon factors and other off-diagonal matrix elements, either between levels of a single potential or between levels of two different potentials. The potential energy function may be defined by any one of a number of analytic functions, or by a set of input potential function values which the code will interpolate over and extrapolate beyond to span the desired range.
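    LEVEL itself is a Fortran program; as a minimal illustration of the underlying numerics only, here is a Numerov shooting solver for bound levels of a smooth potential, applied to a 1-D harmonic test potential in units ħ = m = 1 (exact ground level 0.5). This is a sketch of the standard technique, not Le Roy's implementation, and it omits quasibound levels, rotation, and matrix elements.

```python
import numpy as np

def shoot(E, x, V):
    """Integrate psi'' = 2(V - E) psi across the grid with the Numerov
    scheme and return psi at the right boundary."""
    h = x[1] - x[0]
    f = 2.0 * (V - E)
    w = 1.0 - (h * h / 12.0) * f
    psi = np.zeros_like(x)
    psi[1] = 1e-6                       # psi(left) = 0, small outward kick
    for n in range(1, len(x) - 1):
        psi[n + 1] = ((12.0 - 10.0 * w[n]) * psi[n] - w[n - 1] * psi[n - 1]) / w[n + 1]
    return psi[-1]

def bound_level(E_lo, E_hi, x, V, tol=1e-8):
    """Bisect on E: the sign of psi at the right boundary flips as E
    crosses an eigenvalue."""
    s_lo = shoot(E_lo, x, V)
    while E_hi - E_lo > tol:
        E_mid = 0.5 * (E_lo + E_hi)
        s_mid = shoot(E_mid, x, V)
        if s_mid * s_lo > 0.0:
            E_lo, s_lo = E_mid, s_mid
        else:
            E_hi = E_mid
    return 0.5 * (E_lo + E_hi)

x = np.linspace(-6.0, 6.0, 1201)
V = 0.5 * x * x                          # harmonic test potential
E0 = bound_level(0.3, 0.7, x, V)
print(E0)
```

    For the radial problem the same recursion runs on r ∈ (0, r_max] with the effective potential including the centrifugal term; quasibound levels need an outgoing-wave or stabilization treatment instead of the hard right-boundary condition used here.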

  20. Computation of dose rate at flight altitudes during ground level enhancements no. 69, 70 and 71

    NASA Astrophysics Data System (ADS)

    Mishev, A. L.; Adibpour, F.; Usoskin, I. G.; Felsberger, E.

    2015-01-01

    A new numerical model for estimating and monitoring the exposure of personnel due to secondary cosmic radiation onboard aircraft, in accordance with radiation safety standards as well as European and national regulations, has been developed. The model aims to calculate the effective dose at flight altitude (39,000 ft) due to secondary cosmic radiation of galactic and solar origin. In addition, the model allows the estimation of ambient dose equivalent at typical commercial airline altitudes in order to provide comparison with reference data. The basics, structure and function of the model are described. The model is based on a straightforward full Monte Carlo simulation of the cosmic ray induced atmospheric cascade. The cascade simulation is performed with the PLANETOCOSMICS code. The flux of secondary particles, namely neutrons, protons, gammas, electrons, positrons, muons and charged pions, is calculated. A subsequent conversion of the particle fluence into the effective dose or ambient dose equivalent is performed as well as a comparison with reference data. An application of the model is demonstrated, using a computation of the effective dose rate at flight altitude during the ground level enhancements of 20 January 2005, 13 December 2006 and 17 May 2012.
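    The final conversion step described above (particle fluence folded with fluence-to-dose coefficients and integrated over energy, then summed over particle species) can be sketched as below. The spectrum and coefficients here are made-up placeholder numbers, not PLANETOCOSMICS output or ICRP values.

```python
import numpy as np

def effective_dose_rate(E, flux, coeff):
    """Trapezoidal integral of flux(E) * conversion(E) over energy, for one
    particle species. E in MeV, flux in particles/(cm^2 s MeV),
    coeff in pSv cm^2; result in pSv/s."""
    y = flux * coeff
    return float(np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(E)))

# Placeholder neutron spectrum and conversion coefficients (illustrative only):
E = np.array([1.0, 10.0, 100.0, 1000.0])
flux = np.array([2.0e-3, 1.0e-3, 4.0e-4, 5.0e-5])
coeff = np.array([120.0, 250.0, 420.0, 520.0])
rate = effective_dose_rate(E, flux, coeff)
print(rate)
```

    The total dose rate is the sum of such integrals over all simulated species (neutrons, protons, gammas, electrons, positrons, muons, charged pions).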

  1. Radiation safety concerns and diagnostic reference levels for computed tomography scanners in Tamil Nadu.

    PubMed

    Livingstone, Roshan S; Dinakaran, Paul M

    2011-01-01

    Radiation safety in computed tomography (CT) scanners is of concern due to its widespread use in the field of radiological imaging. This study intends to evaluate radiation doses imparted to patients undergoing thorax, abdomen and pelvic CT examinations and to formulate regional diagnostic reference levels (DRL) in Tamil Nadu, South India. On-site CT dose measurement was performed in 127 CT scanners in Tamil Nadu over a period of 2 years as part of an Atomic Energy Regulatory Board (AERB)-funded project. Of the 127 CT scanners, 13 were conventional, 53 were single-slice helical scanners (SSHS), 44 were multislice CT (MSCT) scanners, and 17 were refurbished. CT dose index (CTDI) was measured using a 32-cm polymethyl methacrylate (PMMA) body phantom in each CT scanner. Dose length product (DLP) for different anatomical regions was generated using CTDI values. The regional DRLs for thorax, abdomen and pelvis examinations were 557, 521 and 294 mGy cm, respectively. The mean effective dose was estimated using the DLP values and was found to be 8.04, 6.69 and 4.79 mSv for thorax, abdomen and pelvic CT examinations, respectively. The establishment of DRLs in this study is the first step towards optimization of CT doses in the Indian context.
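    The DLP-to-effective-dose estimate mentioned above is conventionally a simple multiplication by a region-specific conversion coefficient k (mSv per mGy·cm). The k values below are commonly cited adult coefficients and are assumptions for illustration, not figures taken from this survey.

```python
# Region-specific DLP-to-effective-dose conversion coefficients, mSv/(mGy*cm).
# Commonly cited adult values; treat as assumptions, not this study's data.
K_FACTOR = {"thorax": 0.014, "abdomen": 0.015, "pelvis": 0.015}

def effective_dose_mSv(region, dlp_mGy_cm):
    """E ≈ k * DLP for a given anatomical region."""
    return K_FACTOR[region] * dlp_mGy_cm

# e.g. the thorax DRL of 557 mGy*cm maps to roughly 7.8 mSv,
# of the same order as the 8.04 mSv reported above.
print(effective_dose_mSv("thorax", 557))
```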

  2. Mechanical Behaviour of Light Metal Alloys at High Strain Rates. Computer Simulation on Mesoscale Levels

    NASA Astrophysics Data System (ADS)

    Skripnyak, Vladimir; Skripnyak, Evgeniya; Meyer, Lothar W.; Herzig, Norman; Skripnyak, Nataliya

    2012-02-01

    Research in recent years has established that the laws of deformation and fracture of bulk ultrafine-grained and coarse-grained materials differ under both static and dynamic loading conditions. The development of adequate constitutive equations for describing the mechanical behavior of bulk ultrafine-grained materials under intensive dynamic loading is complicated by insufficient knowledge of the general rules of inelastic deformation and of the nucleation and growth of cracks. A multi-scale computational model was used to investigate the deformation and fracture of bulk structured aluminum and magnesium alloys under stress pulse loading on the mesoscale level. The increment of plastic deformation is defined as the sum of the increments caused by the nucleation and gliding of dislocations, twinning, meso-block movement, and grain boundary sliding. The model takes into account the influence on the mechanical properties of the alloys of the average grain size, the grain size distribution, and the concentration of precipitates. The nucleation and gliding of dislocations were found to cause a higher attenuation rate of the elastic precursor in ultrafine-grained alloys than in their coarse-grained counterparts.

  3. A computational model to predict changes in breathiness resulting from variations in aspiration noise level

    PubMed Central

    Shrivastav, Rahul; Camacho, Arturo

    2009-01-01

    Perception of breathy voice quality is cued by a number of acoustic changes including an increase in aspiration noise level (AH) and spectral slope[1]. Changes in AH in a vowel may be evaluated through measures such as the harmonic-to-noise ratio (HNR), cepstral peak prominence (CPP) or via auditory measures such as the partial loudness of harmonic energy (PL) and loudness of aspiration noise (NL). Although a number of experiments have reported high correlation between such measures and ratings of perceived breathiness, a formal model to predict breathiness of a vowel has not been proposed. This research describes two computational models to predict changes in breathiness resulting from variations in AH. One model uses auditory measures while the other uses CPP as independent variables to predict breathiness. For both cases, a translated and truncated power function is required to predict breathiness. Some parameters in both of these models were observed to be pitch-dependent. The “unified” model based on auditory measures was observed to be more accurate than one based on CPP. PMID:19896328
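    The "translated and truncated power function" mentioned above has the general form B = a·max(0, x − x0)^b, where x is the predictor (CPP or an auditory measure). A minimal sketch follows; the parameter values are hypothetical, since the paper's fitted, pitch-dependent parameters are not reproduced here.

```python
def breathiness(x, a, b, x0):
    """Translated, truncated power transform of a breathiness predictor x:
    zero below the translation point x0, a power law above it."""
    t = x - x0
    return a * (t ** b) if t > 0 else 0.0

# Hypothetical parameters for illustration only:
print(breathiness(12.0, 0.5, 1.3, 4.0))
print(breathiness(3.0, 0.5, 1.3, 4.0))   # below x0 -> 0.0
```

    Making a, b, or x0 functions of pitch would capture the pitch dependence the authors report.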

  4. Extended gray level co-occurrence matrix computation for 3D image volume

    NASA Astrophysics Data System (ADS)

    Salih, Nurulazirah M.; Dewi, Dyah Ekashanti Octorina

    2017-02-01

    Gray Level Co-occurrence Matrix (GLCM) is one of the main techniques for texture analysis that has been widely used in many applications. Conventional GLCMs usually focus on two-dimensional (2D) image texture analysis only. However, a three-dimensional (3D) image volume requires specific texture analysis computation. In this paper, an extended 2D to 3D GLCM approach based on the concept of multiple 2D plane positions and pixel orientation directions in the 3D environment is proposed. The algorithm was implemented by breaking down the 3D image volume into 2D slices based on five different plane positions (coordinate axes and oblique axes), resulting in 13 independent directions, then calculating the GLCMs. The resulting GLCMs were averaged to obtain normalized values, then the 3D texture features were calculated. A preliminary examination was performed on a 3D image volume (64 x 64 x 64 voxels). Our analysis confirmed that the proposed technique is capable of extracting the 3D texture features from the extended GLCMs approach. It is a simple and comprehensive technique that can contribute to the 3D image analysis.
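    The 13-direction averaging described above can be sketched directly with array slicing; this is a minimal illustration of the general averaged-GLCM idea, not the authors' slice-based implementation (opposite displacements are covered here by symmetrizing each matrix).

```python
import numpy as np

# The 13 independent 3-D displacement directions: 3 axis-aligned plus 10 oblique.
OFFSETS = [(1, 0, 0), (0, 1, 0), (0, 0, 1),
           (1, 1, 0), (1, -1, 0), (1, 0, 1), (1, 0, -1), (0, 1, 1), (0, 1, -1),
           (1, 1, 1), (1, -1, 1), (1, 1, -1), (1, -1, -1)]

def glcm_3d(vol, levels):
    """Average of the 13 directional, symmetrized, normalized co-occurrence
    matrices of an integer-valued 3-D volume with gray values in [0, levels)."""
    acc = np.zeros((levels, levels))
    for off in OFFSETS:
        a_sl, b_sl = [], []
        for d, n in zip(off, vol.shape):
            a_sl.append(slice(0, n - d) if d >= 0 else slice(-d, n))
            b_sl.append(slice(d, n) if d >= 0 else slice(0, n + d))
        a = vol[tuple(a_sl)].ravel()          # voxel values at p
        b = vol[tuple(b_sl)].ravel()          # voxel values at p + offset
        m = np.zeros((levels, levels))
        np.add.at(m, (a, b), 1.0)
        m += m.T                              # count both displacement senses
        acc += m / m.sum()                    # normalize before averaging
    return acc / len(OFFSETS)

def texture_features(P):
    """Two classic Haralick-style features from a normalized GLCM."""
    i, j = np.indices(P.shape)
    return {"energy": float((P ** 2).sum()),
            "contrast": float((P * (i - j) ** 2).sum())}
```

    On a uniform volume the averaged matrix puts all mass on the diagonal, so energy is 1 and contrast is 0, which is a quick sanity check for any implementation.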

  5. Interactomes to Biological Phase Space: a call to begin thinking at a new level in computational biology.

    SciTech Connect

    Davidson, George S.; Brown, William Michael

    2007-09-01

    Techniques for high throughput determinations of interactomes, together with high-resolution protein colocalization maps within organelles and through membranes, will soon create a vast resource. With these data, biological descriptions, akin to the high dimensional phase spaces familiar to physicists, will become possible. These descriptions will capture sufficient information to make possible realistic, system-level models of cells. The descriptions and the computational models they enable will require powerful computing techniques. This report is offered as a call to the computational biology community to begin thinking at this scale and as a challenge to develop the required algorithms and codes to make use of the new data.

  6. Identification at biovar level of Brucella isolates causing abortion in small ruminants of Iran.

    PubMed

    Behroozikhah, Ali Mohammad; Bagheri Nejad, Ramin; Amiri, Karim; Bahonar, Ali Reza

    2012-01-01

    To determine the most prevalent biovar responsible for brucellosis in sheep and goat populations of Iran, a cross-sectional study was carried out over 2 years in six provinces selected based on geography and disease prevalence. Specimens obtained from referred aborted sheep and goat fetuses were cultured on Brucella selective media for microbiological isolation. Brucellae were isolated from 265 fetuses and examined for biovar identification using standard microbiological methods. Results showed that 246 isolates (92.8%) were B. melitensis biovar 1, 18 isolates (6.8%) were B. melitensis biovar 2, and, interestingly, one isolate (0.4%) obtained from Mazandaran province was B. abortus biovar 3. In this study, B. melitensis biovar 3 was isolated in none of the selected provinces, and all isolates from 3 provinces (i.e., Chehar-mahal Bakhtiari, Markazi, and Ilam) were identified only as B. melitensis biovar 1. In conclusion, we found that B. melitensis biovar 1 remains the most prevalent cause of small ruminant brucellosis in various provinces of Iran.

  7. ENHANCING THE ATOMIC-LEVEL UNDERSTANDING OF CO2 MINERAL SEQUESTRATION MECHANISMS VIA ADVANCED COMPUTATIONAL MODELING

    SciTech Connect

    A.V.G. Chizmeshya; M.J. McKelvy; G.H. Wolf; R.W. Carpenter; D.A. Gormley; J.R. Diefenbacher; R. Marzke

    2006-03-01

    significantly improved our understanding of mineral carbonation. Group members at the Albany Research Center have recently shown that carbonation of olivine and serpentine, which naturally occurs over geological time (i.e., 100,000s of years), can be accelerated to near completion in hours. Further process refinement will require a synergetic science/engineering approach that emphasizes simultaneous investigation of both thermodynamic processes and the detailed microscopic, atomic-level mechanisms that govern carbonation kinetics. Our previously funded Phase I Innovative Concepts project demonstrated the value of advanced quantum-mechanical modeling as a complementary tool in bridging important gaps in our understanding of the atomic/molecular structure and reaction mechanisms that govern CO2 mineral sequestration reaction processes for the model Mg-rich lamellar hydroxide feedstock material Mg(OH)2. In the present simulation project, improved techniques and more efficient computational schemes have allowed us to expand and augment these capabilities and explore more complex Mg-rich, lamellar hydroxide-based feedstock materials, including the serpentine-based minerals. These feedstock materials are being actively investigated due to their wide availability, and low-cost CO2 mineral sequestration potential. Cutting-edge first principles quantum chemical, computational solid-state and materials simulation methodology studies proposed herein, have been strategically integrated with our new DOE supported (ASU-Argonne National Laboratory) project to investigate the mechanisms that govern mineral feedstock heat-treatment and aqueous/fluid-phase serpentine mineral carbonation in situ. This unified, synergetic theoretical and experimental approach has provided a deeper understanding of the key reaction mechanisms than either individual approach can alone. 
We used ab initio techniques to significantly advance our understanding of atomic-level processes at the solid/solution interface by

  8. ENHANCING THE ATOMIC-LEVEL UNDERSTANDING OF CO2 MINERAL SEQUESTRATION MECHANISMS VIA ADVANCED COMPUTATIONAL MODELING

    SciTech Connect

    A.V.G. Chizmeshya

    2003-12-19

    /NETL managed National Mineral Sequestration Working Group we have already significantly improved our understanding of mineral carbonation. Group members at the Albany Research Center have recently shown that carbonation of olivine and serpentine, which naturally occurs over geological time (i.e., 100,000s of years), can be accelerated to near completion in hours. Further process refinement will require a synergetic science/engineering approach that emphasizes simultaneous investigation of both thermodynamic processes and the detailed microscopic, atomic-level mechanisms that govern carbonation kinetics. Our previously funded Phase I Innovative Concepts project demonstrated the value of advanced quantum-mechanical modeling as a complementary tool in bridging important gaps in our understanding of the atomic/molecular structure and reaction mechanisms that govern CO{sub 2} mineral sequestration reaction processes for the model Mg-rich lamellar hydroxide feedstock material Mg(OH){sub 2}. In the present simulation project, improved techniques and more efficient computational schemes have allowed us to expand and augment these capabilities and explore more complex Mg-rich, lamellar hydroxide-based feedstock materials, including the serpentine-based minerals. These feedstock materials are being actively investigated due to their wide availability, and low-cost CO{sub 2} mineral sequestration potential. Cutting-edge first principles quantum chemical, computational solid-state and materials simulation methodology studies proposed herein, have been strategically integrated with our new DOE supported (ASU-Argonne National Laboratory) project to investigate the mechanisms that govern mineral feedstock heat-treatment and aqueous/fluid-phase serpentine mineral carbonation in situ. This unified, synergetic theoretical and experimental approach will provide a deeper understanding of the key reaction mechanisms than either individual approach can alone. 
Ab initio techniques will also

  9. ENHANCING THE ATOMIC-LEVEL UNDERSTANDING OF CO2 MINERAL SEQUESTRATION MECHANISMS VIA ADVANCED COMPUTATIONAL MODELING

    SciTech Connect

    A.V.G. Chizmeshya

    2002-12-19

    /NETL managed National Mineral Sequestration Working Group we have already significantly improved our understanding of mineral carbonation. Group members at the Albany Research Center have recently shown that carbonation of olivine and serpentine, which naturally occurs over geological time (i.e., 100,000s of years), can be accelerated to near completion in hours. Further process refinement will require a synergetic science/engineering approach that emphasizes simultaneous investigation of both thermodynamic processes and the detailed microscopic, atomic-level mechanisms that govern carbonation kinetics. Our previously funded Phase I Innovative Concepts project demonstrated the value of advanced quantum-mechanical modeling as a complementary tool in bridging important gaps in our understanding of the atomic/molecular structure and reaction mechanisms that govern CO{sub 2} mineral sequestration reaction processes for the model Mg-rich lamellar hydroxide feedstock material Mg(OH){sub 2}. In the present simulation project, improved techniques and more efficient computational schemes have allowed us to expand and augment these capabilities and explore more complex Mg-rich, lamellar hydroxide-based feedstock materials, including the serpentine-based minerals. These feedstock materials are being actively investigated due to their wide availability, and low-cost CO{sub 2} mineral sequestration potential. Cutting-edge first principles quantum chemical, computational solid-state and materials simulation methodology studies proposed herein, have been strategically integrated with our new DOE supported (ASU-Argonne National Laboratory) project to investigate the mechanisms that govern mineral feedstock heat-treatment and aqueous/fluid-phase serpentine mineral carbonation in situ. This unified, synergetic theoretical and experimental approach will provide a deeper understanding of the key reaction mechanisms than either individual approach can alone. 
Ab initio techniques will also

  10. Three dimensional morphological studies of Larger Benthic Foraminifera at the population level using micro computed tomography

    NASA Astrophysics Data System (ADS)

    Kinoshita, Shunichi; Eder, Wolfgang; Woeger, Julia; Hohenegger, Johann; Briguglio, Antonino; Ferrandez-Canadell, Carles

    2015-04-01

    Symbiont-bearing larger benthic Foraminifera (LBF) are long-lived (at least 1 year), single-celled marine organisms with complex calcium carbonate shells. Their morphology has been studied intensively since the middle of the nineteenth century. This has led to a broad spectrum of taxonomic results, important from biostratigraphy to ecology in shallow-water tropical to warm-temperate marine palaeo-environments. However, traditional investigation methods required cutting or destroying specimens in order to analyse the taxonomically important inner structures. X-ray micro-computed tomography (microCT) is one of the newest techniques used in morphological studies. Its greatest advantage is the non-destructive acquisition of inner structures. Furthermore, ongoing improvements in microCT scanners' hardware and software provide high-resolution, short-duration scans well suited for LBF. Three-dimensional imaging techniques make it possible to select and extract each chamber and to measure easily its volume, surface and several form parameters used for morphometric analyses. Thus, 3-dimensional visualisation of LBF tests is a very big step forward from traditional morphology based on 2-dimensional data. The quantification of chamber form is a great opportunity to tackle LBF structures, architectures and bauplan geometry. Micrometric digital resolution is the only way to resolve many controversies in the phylogeny and evolutionary trends of LBF. For the present study we used micro-computed tomography to investigate the chamber number of every specimen from a statistically representative part of each population in order to estimate population dynamics. Samples of living individuals were collected at monthly intervals from fixed locations. Specific preparation allows scanning of up to 35 specimens per scan within 2 hours, yielding the complete digital dataset for each specimen of the population. MicroCT thus enables a fast and precise count of all chambers built by the foraminifer from its

  11. Identification of Xenologs and Their Characteristic Low Expression Levels in the Cyanobacterium Synechococcus elongatus.

    PubMed

    Álvarez-Canales, Gilberto; Arellano-Álvarez, Guadalupe; González-Domenech, Carmen M; de la Cruz, Fernando; Moya, Andrés; Delaye, Luis

    2015-06-01

    Horizontal gene transfer (HGT) is a central process in prokaryotic evolution. Once a gene is introduced into a genome by HGT, its contribution to the fitness of the recipient cell depends in part on its expression level. Here we show that in Synechococcus elongatus PCC 7942, xenologs derived from non-cyanobacterial sources exhibit lower expression levels than native genes in the genome. Consistent with this observation, the codon adaptation indexes of the xenologs also predict relatively low expression. These results are in agreement with previous reports suggesting the relative neutrality of most xenologs. However, we also show that some of the detected xenologs participate in cellular functions, including iron starvation acclimation and nitrate reduction, corroborating the role of HGT in bacterial adaptation. For example, the expression levels of some of the detected xenologs are known to increase under iron-limiting conditions. We interpret the overall pattern as an indication that there is selection pressure against high expression levels of xenologs. However, when a xenolog's protein product confers a selective advantage, natural selection can further modulate its expression level to meet the requirements of the recipient cell. In addition, we show that ORFans do not exhibit significantly lower expression levels than native genes in the genome, suggesting an origin other than xenology.
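    The codon adaptation index (CAI) used above as an expression proxy is the geometric mean of each codon's relative adaptiveness w, where w = 1.0 for the most-used synonymous codon in a reference set of highly expressed genes. A minimal sketch with purely illustrative weights (not values derived from S. elongatus):

```python
import math

def cai(sequence, weights):
    """Codon Adaptation Index: geometric mean of the relative
    adaptiveness w of each codon in a coding sequence. Codons
    absent from `weights` (e.g. single-codon amino acids) are skipped."""
    codons = [sequence[i:i + 3] for i in range(0, len(sequence) - 2, 3)]
    scored = [weights[c] for c in codons if c in weights]
    return math.exp(sum(math.log(w) for w in scored) / len(scored))

# Toy weights for the two lysine codons (illustrative values only).
w = {"AAA": 1.0, "AAG": 0.25}
print(round(cai("AAAAAGAAA", w), 3))  # geometric mean of 1.0, 0.25, 1.0 -> 0.63
```

    A gene whose codon usage matches the highly expressed reference set scores near 1.0; xenologs from a donor with different codon preferences score lower, which is why CAI can flag their expected low expression.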

  12. Identification of New cis Vibrational Levels in the S1 State of C2H2

    NASA Astrophysics Data System (ADS)

    Baraban, J. H.; Changala, P. B.; Shaver, R. G.; Field, R. W.; Stanton, J. F.; Merer, A. J.

    2012-06-01

    Although the S1 (Ã ¹A_u) state of the trans conformer of acetylene has been known for many years, the corresponding S1 (Ã ¹A_2) state of the cis conformer was only discovered recently. Transitions to it from the ground state are electronically forbidden, but its vibrational levels acquire intensity by tunneling through the isomerization barrier and interacting with levels of the trans conformer. We have recently identified two new vibrational levels (3² and 4¹6¹) of the cis conformer of S1 C2H2, bringing the total number of levels observed to six out of an expected ten up to the energies studied in this work. The appearance of these levels in IR-UV double resonance LIF spectra will be discussed, along with their vibrational assignments. Experimentally determined vibrational parameters and ab initio anharmonic force fields for both the trans and cis conformers will be presented as part of the evidence supporting these assignments. These results shed new light on the vibrational level structure of both conformers in this isomerizing system. A. J. Merer, A. H. Steeves, J. H. Baraban, H. A. Bechtel, and R. W. Field, J. Chem. Phys., 134(24):244310, 2011.

  13. Identification of Hemoglobin Levels Based on Anthropometric Indices in Elderly Koreans

    PubMed Central

    Kim, Jong Yeol

    2016-01-01

    Objectives Anemia is independently and strongly associated with an increased risk of mortality in older people and is also strongly associated with obesity. The objectives of the present study were to examine the associations between the hemoglobin level and various anthropometric indices, to predict low and normal hemoglobin levels using combined anthropometric indices, and to assess differences in the hemoglobin level and anthropometric indices between Korean men and women. Methods A total of 7,156 individuals ranging in age from 53 to 90 years participated in this retrospective cross-sectional study. Binary logistic regression (LR) and naïve Bayes (NB) models were used to identify significant differences in the anthropometric indices between subjects with low and normal hemoglobin levels and to assess the predictive power of these indices for the hemoglobin level. Results Among all of the variables, age displayed the strongest association with the hemoglobin level in both men (p < 0.0001, odds ratio [OR] = 0.487, area under the receiver operating characteristic curve based on the LR [LR-AUC] = 0.702, NB-AUC = 0.701) and women (p < 0.0001, OR = 0.636, LR-AUC = 0.625, NB-AUC = 0.624). Among the anthropometric indices, weight and body mass index (BMI) were the best predictors of the hemoglobin level. The predictive powers of all of the variables were higher in men than in women. The AUC values for the NB-Wrapper and LR-Wrapper predictive models generated using combined anthropometric indices were 0.734 and 0.723, respectively, for men and 0.649 and 0.652, respectively, for women. The use of combined anthropometric indices may improve the predictive power for the hemoglobin level. Discussion Among the various anthropometric indices, with the exception of age, we did not identify any indices that were better predictors than weight and BMI for low and normal hemoglobin levels. In addition, none of the ratios between pairs of indices were good indicators of the
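    The abstract's comparison of logistic regression and naïve Bayes by AUC can be illustrated on synthetic data. Everything below is an assumption made for the sketch (simulated predictors, arbitrary effect sizes, scikit-learn estimators); it shows only the evaluation pattern, not the study's data or results.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.naive_bayes import GaussianNB
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 2000
# Synthetic anthropometric-style predictors; ranges and effect
# sizes are illustrative, not taken from the study.
age = rng.uniform(53, 90, n)
weight = rng.normal(60, 10, n)
bmi = rng.normal(24, 3, n)
# Older age lowers the odds of a normal hemoglobin level,
# mirroring the direction of the association in the abstract.
logit = -0.06 * (age - 70) + 0.03 * (weight - 60) + 0.05 * (bmi - 24)
y = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

X = np.column_stack([age, weight, bmi])
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Fit each classifier and report AUC on held-out data,
# analogous to the paper's LR-AUC / NB-AUC comparison.
for name, model in [("LR", LogisticRegression(max_iter=1000)),
                    ("NB", GaussianNB())]:
    model.fit(X_tr, y_tr)
    auc = roc_auc_score(y_te, model.predict_proba(X_te)[:, 1])
    print(f"{name}-AUC = {auc:.3f}")
```

    On data generated from a logistic model the two classifiers give similar AUCs, which matches the near-identical LR-AUC and NB-AUC values the study reports.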

  14. Risk of node metastasis of sentinel lymph nodes detected in level II/III of the axilla by single-photon emission computed tomography/computed tomography

    PubMed Central

    SHIMA, HIROAKI; KUTOMI, GORO; SATOMI, FUKINO; MAEDA, HIDEKI; TAKAMARU, TOMOKO; KAMESHIMA, HIDEKAZU; OMURA, TOSEI; MORI, MITSURU; HATAKENAKA, MASAMITSU; HASEGAWA, TADASHI; HIRATA, KOICHI

    2014-01-01

    In breast cancer, single-photon emission computed tomography/computed tomography (SPECT/CT) shows the exact anatomical location of sentinel nodes (SNs). SPECT/CT mainly exposes the axilla and partly exposes atypical sites of extra-axillary lymphatic drainage. How the atypical hot nodes are involved in lymphatic metastasis was retrospectively investigated in the present study, particularly in the level II/III region. SPECT/CT was performed in 92 clinical stage 0-IIA breast cancer patients. Sentinel lymph nodes are depicted as hot nodes on SPECT/CT. Patients were divided into two groups: with or without a hot node in level II/III on SPECT/CT. The existence of metastasis in level II/III was investigated and the risk factors were identified. A total of 12 patients were sentinel lymph node biopsy metastasis positive, and axillary lymph node dissection (ALND) was performed. These patients were divided into two groups: with and without an SN in level II/III, and nodes in level II/III were pathologically proven. In 11 of the 92 patients, hot nodes were detected in level II/III. There was a significant difference in node metastasis depending on whether there were hot nodes in level II/III (P=0.0319). Multivariate analysis indicated that hot nodes in level II/III and lymphatic invasion were independent factors associated with node metastasis. There were 12 SN-positive patients followed by ALND. In four of the 12 patients, hot nodes were observed in level II/III. In two of the four patients with hot nodes depicted by SPECT/CT, metastatic nodes were pathologically evident in the same lesion. Therefore, the present study indicated that a hot node in level II/III as depicted by SPECT/CT may indicate a risk of SN metastasis, including deeper nodes. PMID:25289038

  15. Risk of node metastasis of sentinel lymph nodes detected in level II/III of the axilla by single-photon emission computed tomography/computed tomography.

    PubMed

    Shima, Hiroaki; Kutomi, Goro; Satomi, Fukino; Maeda, Hideki; Takamaru, Tomoko; Kameshima, Hidekazu; Omura, Tosei; Mori, Mitsuru; Hatakenaka, Masamitsu; Hasegawa, Tadashi; Hirata, Koichi

    2014-11-01

    In breast cancer, single-photon emission computed tomography/computed tomography (SPECT/CT) shows the exact anatomical location of sentinel nodes (SNs). SPECT/CT mainly exposes the axilla and partly exposes atypical sites of extra-axillary lymphatic drainage. How the atypical hot nodes are involved in lymphatic metastasis was retrospectively investigated in the present study, particularly in the level II/III region. SPECT/CT was performed in 92 clinical stage 0-IIA breast cancer patients. Sentinel lymph nodes are depicted as hot nodes on SPECT/CT. Patients were divided into two groups: with or without a hot node in level II/III on SPECT/CT. The existence of metastasis in level II/III was investigated and the risk factors were identified. A total of 12 patients were sentinel lymph node biopsy metastasis positive, and axillary lymph node dissection (ALND) was performed. These patients were divided into two groups: with and without an SN in level II/III, and nodes in level II/III were pathologically proven. In 11 of the 92 patients, hot nodes were detected in level II/III. There was a significant difference in node metastasis depending on whether there were hot nodes in level II/III (P=0.0319). Multivariate analysis indicated that hot nodes in level II/III and lymphatic invasion were independent factors associated with node metastasis. There were 12 SN-positive patients followed by ALND. In four of the 12 patients, hot nodes were observed in level II/III. In two of the four patients with hot nodes depicted by SPECT/CT, metastatic nodes were pathologically evident in the same lesion. Therefore, the present study indicated that a hot node in level II/III as depicted by SPECT/CT may indicate a risk of SN metastasis, including deeper nodes.

  16. An Observation Schedule for Assessing Computer Technology Environments at Second, Third, and Fourth Grade Levels.

    ERIC Educational Resources Information Center

    Cobbs, Henry L., Jr.; Wilmoth, James Noel

    An instrument used to assess educational environments for computers as new tools for teaching and learning was studied in grades 2, 3, and 4 of the Atlanta (Georgia) Public Schools (APS) in May 1990. The instrument contained 46 items indexing: (1) the immediate and expanded spaces for the computer stations; (2) the presence of software and…

  17. Integration of Computer Technology Into an Introductory-Level Neuroscience Laboratory

    ERIC Educational Resources Information Center

    Evert, Denise L.; Goodwin, Gregory; Stavnezer, Amy Jo

    2005-01-01

    We describe 3 computer-based neuroscience laboratories. In the first 2 labs, we used commercially available interactive software to enhance the study of functional and comparative neuroanatomy and neurophysiology. In the remaining lab, we used customized software and hardware in 2 psychophysiological experiments. With the use of the computer-based…

  18. Computing at the High School Level: Changing What Teachers and Students Know and Believe

    ERIC Educational Resources Information Center

    Munson, Ashlyn; Moskal, Barbara; Harriger, Alka; Lauriski-Karriker, Tonya; Heersink, Daniel

    2011-01-01

    Research indicates that students often opt out of computing majors due to a lack of prior experience in computing and a lack of knowledge of field-based job opportunities. In addition, it has been found that students respond positively to new subjects when teachers and counselors are enthusiastic and knowledgeable about the area. The summer…

  19. Analysis of the Computer Anxiety Levels of Secondary Technical Education Teachers in West Virginia.

    ERIC Educational Resources Information Center

    Gordon, Howard R. D.

    The computer anxiety of 116 randomly selected secondary technical education teachers from 8 area vocational-technical centers in West Virginia was the focus of a study. The mailed questionnaire consisted of two parts: Oetting's Computer Anxiety Scale (COMPAS) and closed-form questions to obtain general demographic information about the teachers…

  20. Determining the Effectiveness of the 3D Alice Programming Environment at the Computer Science I Level

    ERIC Educational Resources Information Center

    Sykes, Edward R.

    2007-01-01

    Student retention in Computer Science is becoming a serious concern among Educators in many colleges and universities. Most institutions currently face a significant drop in enrollment in Computer Science. A number of different tools and strategies have emerged to address this problem (e.g., BlueJ, Karel Robot, etc.). Although these tools help to…