Science.gov

Sample records for level computational identification

  1. MODEL IDENTIFICATION AND COMPUTER ALGEBRA.

    PubMed

    Bollen, Kenneth A; Bauldry, Shawn

    2010-10-01

    Multiequation models that contain observed or latent variables are common in the social sciences. To determine whether unique parameter values exist for such models, one needs to assess model identification. In practice analysts rely on empirical checks that evaluate the singularity of the information matrix evaluated at sample estimates of parameters. The discrepancy between estimates and population values, the limitations of numerical assessments of ranks, and the difference between local and global identification make this practice less than perfect. In this paper we outline how to use computer algebra systems (CAS) to determine the local and global identification of multiequation models with or without latent variables. We demonstrate a symbolic CAS approach to local identification and develop a CAS approach to obtain explicit algebraic solutions for each of the model parameters. We illustrate the procedures with several examples, including a new proof of the identification of a model for handling missing data using auxiliary variables. We present an identification procedure for Structural Equation Models that makes use of CAS and that is a useful complement to current methods. PMID:21769158

  2. [Identification of the Pseudomonas genus bacteria by computer analysis].

    PubMed

    Kotsofliak, O I; Reva, O N; Kiprianova, E A; Smirnov, V V

    2003-01-01

A computer program for simplified phenotypic identification of Pseudomonas has been developed. Information on the 66 species currently recognized in the genus Pseudomonas, characterized by 113 tests, was accumulated in a database. The identification key is available in interactive mode at http://www.imv.kiev.ua/PsmIK/default.htm. The program was used to identify 46 Pseudomonas strains isolated from the rhizosphere. For 23 further strains that conventional techniques had failed to identify, the level of similarity was 67-74%, suggesting that they may represent new Pseudomonas species. PMID:15077543

  3. A method for identification of vertebral level.

    PubMed

    Redfern, R M; Smith, E T

    1986-05-01

    A method of spinal level marking applicable particularly for use in thoracolumbar posterior spinal operations is described. The use of patent blue V dye in this procedure is discussed in a consecutive series of over 100 cases. No serious adverse effects were observed. The technique ensures accurate identification of spinal marking and helps to minimize anaesthetic time. PMID:3729267

  4. Computer-assisted identification of anaerobic bacteria.

    PubMed Central

    Kelley, R W; Kellogg, S T

    1978-01-01

    A computer program was developed to identify anaerobic bacteria by using simultaneous pattern recognition via a Bayesian probabilistic model. The system is intended for use as a rapid, precise, and reproducible aid in the identification of unknown isolates. The program operates on a data base of 28 genera comprising 238 species of anaerobic bacteria that can be separated by the program. Input to the program consists of biochemical and gas chromatographic test results in binary format. The system is flexible and yields outputs of: (i) most probable species, (ii) significant test results conflicting with established data, and (iii) differential tests of significance for missing test results. PMID:345970
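The Bayesian step the record describes can be sketched as follows. This is a minimal illustration of probabilistic identification from binary test results, in the style the abstract outlines; the taxa, tests, and probabilities below are invented, not drawn from the program's 238-species database.

```python
# Sketch of Bayesian identification from binary test results.
# Taxa and test probabilities are illustrative, not from the paper's database.

def identify(result, taxa):
    """result: dict test -> 0/1; taxa: dict name -> dict test -> P(positive)."""
    scores = {}
    for name, probs in taxa.items():
        likelihood = 1.0
        for test, outcome in result.items():
            p = probs[test]
            likelihood *= p if outcome == 1 else (1.0 - p)
        scores[name] = likelihood
    total = sum(scores.values())
    # Normalize to posterior probabilities (uniform prior over taxa)
    return {name: s / total for name, s in scores.items()}

taxa = {
    "Clostridium sp.": {"indole": 0.95, "catalase": 0.05, "glucose": 0.90},
    "Bacteroides sp.": {"indole": 0.30, "catalase": 0.10, "glucose": 0.85},
}
posterior = identify({"indole": 1, "catalase": 0, "glucose": 1}, taxa)
best = max(posterior, key=posterior.get)
```

The "most probable species" output in the abstract corresponds to `best`; the conflicting-test and differential-test outputs would be built on the same likelihood table.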

  5. Personalized identification of abdominal wall hernia meshes on computed tomography.

    PubMed

    Pham, Tuan D; Le, Dinh T P; Xu, Jinwei; Nguyen, Duc T; Martindale, Robert G; Deveney, Clifford W

    2014-01-01

An abdominal wall hernia is a protrusion of the intestine through an opening or area of weakness in the abdominal wall. Correct pre-operative identification of abdominal wall hernia meshes could help surgeons adjust the surgical plan to meet the expected difficulty and morbidity of operating through or removing the previous mesh. First, we present herein for the first time the application of image analysis for automated identification of hernia meshes. Second, we discuss the novel development of a new entropy-based image texture feature using geostatistics and indicator kriging. Third, we seek to enhance the hernia mesh identification by combining the new texture feature with the gray-level co-occurrence matrix feature of the image. The two features can characterize complementary information of anatomic details of the abdominal hernia wall and its mesh on computed tomography. Experimental results have demonstrated the effectiveness of the proposed approach. The new computational tool has potential for personalized mesh identification which can assist surgeons in the diagnosis and repair of complex abdominal wall hernias. PMID:24184112
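The gray-level co-occurrence matrix (GLCM) feature mentioned above can be sketched in a few lines. This is a minimal GLCM for a single horizontal offset plus the entropy of the normalized matrix; it is an illustration of the general technique, not the paper's geostatistics/indicator-kriging feature or its CT pipeline, and the tiny test image is invented.

```python
# Minimal gray-level co-occurrence matrix (GLCM) for a horizontal offset,
# plus the entropy of the normalized matrix. Illustrative only.
import math

def glcm(image, levels, dx=1, dy=0):
    """Count pairs (g1, g2) of gray levels at the given pixel offset."""
    counts = [[0] * levels for _ in range(levels)]
    rows, cols = len(image), len(image[0])
    for r in range(rows):
        for c in range(cols):
            r2, c2 = r + dy, c + dx
            if 0 <= r2 < rows and 0 <= c2 < cols:
                counts[image[r][c]][image[r2][c2]] += 1
    total = sum(map(sum, counts))
    return [[v / total for v in row] for row in counts]

def glcm_entropy(p):
    """Shannon entropy of the normalized co-occurrence matrix."""
    return -sum(v * math.log2(v) for row in p for v in row if v > 0)

img = [[0, 0, 1],
       [1, 2, 2],
       [2, 2, 2]]
P = glcm(img, levels=3)
H = glcm_entropy(P)
```

Entropy-like scalars of this kind, computed over an image patch, are what get combined into the feature vector used for mesh identification.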

  6. A new approach for fault identification in computer networks

    NASA Astrophysics Data System (ADS)

    Zhao, Dong; Wang, Tao

    2004-04-01

Effective management of computer networks has become increasingly difficult because of the rapid development of network systems. Fault identification aims to determine where in the network a problem lies and what it is. Data mining generally refers to the process of extracting models from large stores of data, and data mining techniques can assist in the fault identification task. Existing approaches to fault identification are reviewed and a new approach is proposed. The new approach improves on the MSDD algorithm but requires more computation, so additional techniques are used to increase its efficiency.

  7. Computer method for identification of boiler transfer functions

    NASA Technical Reports Server (NTRS)

    Miles, J. H.

    1972-01-01

An iterative computer-aided procedure was developed for the identification of boiler transfer functions from frequency response data. The method uses the frequency response data to obtain satisfactory transfer functions for both high and low vapor exit quality data.

  8. Chip level simulation of fault tolerant computers

    NASA Technical Reports Server (NTRS)

    Armstrong, J. R.

    1982-01-01

    Chip-level modeling techniques in the evaluation of fault tolerant systems were researched. A fault tolerant computer was modeled. An efficient approach to functional fault simulation was developed. Simulation software was also developed.

  9. Multi-level RF identification system

    DOEpatents

    Steele, Kerry D.; Anderson, Gordon A.; Gilbert, Ronald W.

    2004-07-20

A radio frequency identification system having a radio frequency transceiver for generating a continuous wave RF interrogation signal that impinges upon an RF identification tag. An oscillation circuit in the RF identification tag modulates the interrogation signal with a subcarrier of a predetermined frequency and reflects the modulated signal back to the transmitting interrogator. The interrogator recovers and analyzes the subcarrier signal and determines its frequency. The interrogator generates an output indicative of the subcarrier frequency, thereby identifying the responding RFID tag as one of a "class" of RFID tags configured to respond with a subcarrier signal of a predetermined frequency.
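The interrogator-side step, determining which "class" frequency the tag responded with, amounts to finding the dominant spectral component of the recovered signal. The sketch below does this with a plain discrete Fourier transform peak search; the signal, frame length, and bin-to-class mapping are illustrative assumptions, not details from the patent.

```python
# Sketch of subcarrier recovery: a DFT peak search identifies which
# "class" frequency the tag responded with. Signal values are illustrative.
import math

def dominant_bin(samples):
    """Return the DFT bin (cycles per frame) with the largest magnitude."""
    n = len(samples)
    best_k, best_mag = 0, -1.0
    for k in range(1, n // 2):  # skip DC, search positive-frequency bins
        re = sum(s * math.cos(2 * math.pi * k * t / n) for t, s in enumerate(samples))
        im = sum(s * math.sin(2 * math.pi * k * t / n) for t, s in enumerate(samples))
        mag = math.hypot(re, im)
        if mag > best_mag:
            best_k, best_mag = k, mag
    return best_k

n = 64
subcarrier_bin = 8  # hypothetical tag "class" encoded as 8 cycles per frame
signal = [math.cos(2 * math.pi * subcarrier_bin * t / n) for t in range(n)]
detected = dominant_bin(signal)
```

In practice an FFT would replace the brute-force DFT loop, but the identification logic is the same: the bin index maps back to the tag class.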

  10. Identification of Protein–Excipient Interaction Hotspots Using Computational Approaches

    PubMed Central

    Barata, Teresa S.; Zhang, Cheng; Dalby, Paul A.; Brocchini, Steve; Zloh, Mire

    2016-01-01

Protein formulation development relies on the selection of excipients that inhibit protein–protein interactions preventing aggregation. Empirical strategies involve screening many excipient and buffer combinations using forced degradation studies. Such methods do not readily provide information on intermolecular interactions responsible for the protective effects of excipients. This study describes a molecular docking approach to screen and rank interactions allowing for the identification of protein–excipient hotspots to aid in the selection of excipients to be experimentally screened. Previously published work with Drosophila Su(dx) was used to develop and validate the computational methodology, which was then used to determine the formulation hotspots for Fab A33. Commonly used excipients were examined and compared to the regions in Fab A33 prone to protein–protein interactions that could lead to aggregation. This approach could provide information on a molecular level about the protective interactions of excipients in protein formulations to aid the more rational development of future formulations. PMID:27258262

  11. Computational Methods for Protein Identification from Mass Spectrometry Data

    PubMed Central

    McHugh, Leo; Arthur, Jonathan W

    2008-01-01

    Protein identification using mass spectrometry is an indispensable computational tool in the life sciences. A dramatic increase in the use of proteomic strategies to understand the biology of living systems generates an ongoing need for more effective, efficient, and accurate computational methods for protein identification. A wide range of computational methods, each with various implementations, are available to complement different proteomic approaches. A solid knowledge of the range of algorithms available and, more critically, the accuracy and effectiveness of these techniques is essential to ensure as many of the proteins as possible, within any particular experiment, are correctly identified. Here, we undertake a systematic review of the currently available methods and algorithms for interpreting, managing, and analyzing biological data associated with protein identification. We summarize the advances in computational solutions as they have responded to corresponding advances in mass spectrometry hardware. The evolution of scoring algorithms and metrics for automated protein identification are also discussed with a focus on the relative performance of different techniques. We also consider the relative advantages and limitations of different techniques in particular biological contexts. Finally, we present our perspective on future developments in the area of computational protein identification by considering the most recent literature on new and promising approaches to the problem as well as identifying areas yet to be explored and the potential application of methods from other areas of computational biology. PMID:18463710
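At the core of the scoring algorithms this review surveys is peptide-spectrum matching: comparing theoretical fragment masses against observed peaks. The toy scorer below counts observed peaks within a mass tolerance of theoretical singly charged b-ions; the residue masses are standard monoisotopic values, but the peptide, spectrum, and simple count-based score are invented for illustration and are far simpler than production search engines.

```python
# Toy peptide-spectrum match: count observed peaks within a mass tolerance
# of theoretical b-ion m/z values. Peptide and spectrum are invented.
MONO = {"G": 57.02146, "A": 71.03711, "S": 87.03203, "P": 97.05276, "V": 99.06841}
PROTON = 1.00728

def b_ions(peptide):
    """Singly charged b-ion m/z values for a peptide string."""
    ions, mass = [], PROTON
    for aa in peptide[:-1]:  # b-ion series excludes the final residue
        mass += MONO[aa]
        ions.append(mass)
    return ions

def match_score(peptide, peaks, tol=0.5):
    """Number of observed peaks matching a theoretical b-ion within tol Da."""
    theory = b_ions(peptide)
    return sum(1 for p in peaks if any(abs(p - t) <= tol for t in theory))

peaks = [58.0, 129.1, 216.2, 400.0]  # hypothetical observed m/z peaks
score = match_score("GASV", peaks)
```

Real scoring functions add y-ions, intensity weighting, and statistical significance estimates, but this is the shared matching step they all build on.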

  12. Multi level programming Paradigm for Extreme Computing

    NASA Astrophysics Data System (ADS)

    Petiton, S.; Sato, M.; Emad, N.; Calvin, C.; Tsuji, M.; Dandouna, M.

    2014-06-01

In order to propose a framework and programming paradigms for post-petascale computing, on the road to exascale computing and beyond, we introduced new languages, associated with a hierarchical multi-level programming paradigm, that allow scientific end-users and developers to program the highly hierarchical architectures designed for extreme computing. In this paper, we explain the interest of such a hierarchical multi-level programming paradigm for extreme computing and its suitability for several large computational science applications, such as linear algebra solvers used for reactor core physics. We describe the YML language and framework, which allow describing graphs of parallel components that may be developed using PGAS-like languages such as XMP, then scheduled and computed on supercomputers. We then present experiments on supercomputers (such as the "K" and "Hooper" machines) with the hybrid method MERAM (Multiple Explicitly Restarted Arnoldi Method) as a case study for iterative methods manipulating sparse matrices, and with the block Gauss-Jordan method as a case study for direct methods manipulating dense matrices. We conclude by proposing evolutions of this programming paradigm.

  13. Computational phosphoproteomics: From identification to localization

    PubMed Central

    Lee, Dave C H; Jones, Andrew R; Hubbard, Simon J

    2015-01-01

    Analysis of the phosphoproteome by MS has become a key technology for the characterization of dynamic regulatory processes in the cell, since kinase and phosphatase action underlie many major biological functions. However, the addition of a phosphate group to a suitable side chain often confounds informatic analysis by generating product ion spectra that are more difficult to interpret (and consequently identify) relative to unmodified peptides. Collectively, these challenges have motivated bioinformaticians to create novel software tools and pipelines to assist in the identification of phosphopeptides in proteomic mixtures, and help pinpoint or “localize” the most likely site of modification in cases where there is ambiguity. Here we review the challenges to be met and the informatics solutions available to address them for phosphoproteomic analysis, as well as highlighting the difficulties associated with using them and the implications for data standards. PMID:25475148

  14. Crop identification and area estimation by computer-aided analysis of Landsat data

    NASA Technical Reports Server (NTRS)

    Bauer, M. E.; Hixson, M. M.; Davis, B. J.; Etheridge, J. B.

    1977-01-01

    This report describes the results of a study involving the use of computer-aided analysis techniques applied to Landsat MSS data for identification and area estimation of winter wheat in Kansas and corn and soybeans in Indiana. Key elements of the approach included use of aerial photography for classifier training, stratification of Landsat data and extension of training statistics to areas without training data, and classification of a systematic sample of pixels from each county. Major results and conclusions are: (1) Landsat data was adequate for accurate identification and area estimation of winter wheat in Kansas, but corn and soybean estimates for Indiana were less accurate; (2) computer-aided analysis techniques can be effectively used to extract crop identification information from Landsat MSS data, and (3) systematic sampling of entire counties made possible by computer classification methods resulted in very precise area estimates at county as well as district and state levels.
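The abstract does not name the classifier, but per-pixel Gaussian maximum-likelihood classification trained on labeled statistics was the standard computer-aided approach for Landsat MSS data of that era, so the sketch below uses it under that assumption. The band means and variances are invented.

```python
# Per-pixel Gaussian maximum-likelihood classification sketch (assumed
# classifier; band statistics are invented, bands treated as independent).
import math

def log_likelihood(pixel, mean, var):
    """Independent-band Gaussian log-likelihood of a pixel for one class."""
    return sum(-0.5 * math.log(2 * math.pi * v) - (p - m) ** 2 / (2 * v)
               for p, m, v in zip(pixel, mean, var))

def classify(pixel, classes):
    """Assign the pixel to the class with the highest log-likelihood."""
    return max(classes, key=lambda c: log_likelihood(pixel, *classes[c]))

classes = {  # class -> (per-band means, per-band variances), from training data
    "wheat": ([120.0, 80.0], [25.0, 16.0]),
    "other": ([60.0, 140.0], [30.0, 20.0]),
}
label = classify([118.0, 83.0], classes)
```

Classifying a systematic sample of pixels per county, as the study does, then reduces to running this decision rule over the sampled pixels and tallying class proportions.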

  15. Human operator identification model and related computer programs

    NASA Technical Reports Server (NTRS)

    Kessler, K. M.; Mohr, J. N.

    1978-01-01

Four computer programs which provide computational assistance in the analysis of man/machine systems are reported. The programs are: (1) Modified Transfer Function Program (TF); (2) Time Varying Response Program (TVSR); (3) Optimal Simulation Program (TVOPT); and (4) Linear Identification Program (SCIDNT). The TF program converts the time-domain state-variable system representation to a frequency-domain transfer-function representation. The TVSR program computes time histories of the input/output responses of the human operator model. The TVOPT program is an optimal simulation program and is similar to TVSR in that it produces time histories of system states associated with an operator-in-the-loop system. The differences between the two programs are presented. The SCIDNT program is an open-loop identification code which operates on simulated data from TVOPT (or TVSR) or real operator data from motion simulators.

  16. Computational identification of 69 retroposons in Arabidopsis.

    PubMed

    Zhang, Yujun; Wu, Yongrui; Liu, Yilei; Han, Bin

    2005-06-01

    Retroposition is a shot-gun strategy of the genome to achieve evolutionary diversities by mixing and matching coding sequences with novel regulatory elements. We have identified 69 retroposons in the Arabidopsis (Arabidopsis thaliana) genome by a computational approach. Most of them were derivatives of mature mRNAs, and 20 genes contained relics of the reverse transcription process, such as truncations, deletions, and extra sequence additions. Of them, 22 are processed pseudogenes, and 52 genes are likely to be actively transcribed, especially in tissues from apical meristems (roots and flowers). Functional compositions of these retroposon parental genes imply that not the mRNA itself but its expression in gamete cells defines a suitable template for retroposition. The presence/absence patterns of retroposons can be used as cladistic markers for biogeographic research. Effects of human and the Mediterranean Pleistocene refugia in Arabidopsis biogeographic distributions were revealed based on two recent retroposons (At1g61410 and At5g52090). An evolutionary rate of new gene creation by retroposition was calculated as 0.6 genes per million years. Retroposons can also be used as molecular fossils of the parental gene expressions in ancient time. Extensions of 3' untranslated regions for those expressed parental genes are revealed as a possible trend of plant transcriptome evolution. In addition, we reported the first plant functional chimeric gene that adapts to intercompartmental transport by capturing two additional exons after retroposition. PMID:15923328

  17. Identification of Computational and Experimental Reduced-Order Models

    NASA Technical Reports Server (NTRS)

    Silva, Walter A.; Hong, Moeljo S.; Bartels, Robert E.; Piatak, David J.; Scott, Robert C.

    2003-01-01

    The identification of computational and experimental reduced-order models (ROMs) for the analysis of unsteady aerodynamic responses and for efficient aeroelastic analyses is presented. For the identification of a computational aeroelastic ROM, the CFL3Dv6.0 computational fluid dynamics (CFD) code is used. Flutter results for the AGARD 445.6 Wing and for a Rigid Semispan Model (RSM) computed using CFL3Dv6.0 are presented, including discussion of associated computational costs. Modal impulse responses of the unsteady aerodynamic system are computed using the CFL3Dv6.0 code and transformed into state-space form. The unsteady aerodynamic state-space ROM is then combined with a state-space model of the structure to create an aeroelastic simulation using the MATLAB/SIMULINK environment. The MATLAB/SIMULINK ROM is then used to rapidly compute aeroelastic transients, including flutter. The ROM shows excellent agreement with the aeroelastic analyses computed using the CFL3Dv6.0 code directly. For the identification of experimental unsteady pressure ROMs, results are presented for two configurations: the RSM and a Benchmark Supercritical Wing (BSCW). Both models were used to acquire unsteady pressure data due to pitching oscillations on the Oscillating Turntable (OTT) system at the Transonic Dynamics Tunnel (TDT). A deconvolution scheme involving a step input in pitch and the resultant step response in pressure, for several pressure transducers, is used to identify the unsteady pressure impulse responses. The identified impulse responses are then used to predict the pressure responses due to pitching oscillations at several frequencies. Comparisons with the experimental data are then presented.

  18. Computational Strategies for a System-Level Understanding of Metabolism

    PubMed Central

    Cazzaniga, Paolo; Damiani, Chiara; Besozzi, Daniela; Colombo, Riccardo; Nobile, Marco S.; Gaglio, Daniela; Pescini, Dario; Molinari, Sara; Mauri, Giancarlo; Alberghina, Lilia; Vanoni, Marco

    2014-01-01

    Cell metabolism is the biochemical machinery that provides energy and building blocks to sustain life. Understanding its fine regulation is of pivotal relevance in several fields, from metabolic engineering applications to the treatment of metabolic disorders and cancer. Sophisticated computational approaches are needed to unravel the complexity of metabolism. To this aim, a plethora of methods have been developed, yet it is generally hard to identify which computational strategy is most suited for the investigation of a specific aspect of metabolism. This review provides an up-to-date description of the computational methods available for the analysis of metabolic pathways, discussing their main advantages and drawbacks.  In particular, attention is devoted to the identification of the appropriate scale and level of accuracy in the reconstruction of metabolic networks, and to the inference of model structure and parameters, especially when dealing with a shortage of experimental measurements. The choice of the proper computational methods to derive in silico data is then addressed, including topological analyses, constraint-based modeling and simulation of the system dynamics. A description of some computational approaches to gain new biological knowledge or to formulate hypotheses is finally provided. PMID:25427076

  19. An Efficient Algorithm for Stiffness Identification of Truss Structures Through Distributed Local Computation

    NASA Astrophysics Data System (ADS)

    Zhang, G.; Burgueño, R.; Elvin, N. G.

    2010-02-01

This paper presents an efficient stiffness identification technique for truss structures based on distributed local computation. Sensor nodes on each element are assumed to collect strain data and communicate only with sensors on neighboring elements. This can significantly reduce the energy demand for data transmission and the complexity of transmission protocols, thus enabling a simplified wireless implementation. Element stiffness parameters are identified by simple low order matrix inversion at a local level, which reduces the computational energy, allows for distributed computation and makes parallel data processing possible. The proposed method also permits addressing the problem of missing data or faulty sensors. Numerical examples, with and without missing data, are presented and the element stiffness parameters are accurately identified. The computation efficiency of the proposed method is n^2 times higher than previously proposed global damage identification methods.
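The element-level idea can be illustrated with the simplest possible case. For an axially loaded truss bar, F = EA·ε, so each sensor node can estimate its own element's stiffness from local force and strain samples by a tiny least-squares fit, with no global matrix assembly. This is a hedged sketch of the local-computation principle, not the paper's algorithm; the numbers are invented.

```python
# Element-level sketch: for a truss bar, F = EA * eps, so axial stiffness EA
# follows from a tiny local least-squares fit over (force, strain) samples.
# Values below are illustrative.

def element_stiffness(forces, strains):
    """Least-squares EA estimate from repeated (force, strain) samples."""
    num = sum(f * e for f, e in zip(forces, strains))
    den = sum(e * e for e in strains)
    return num / den

forces = [1000.0, 2000.0, 3000.0]   # member axial forces [N]
strains = [5e-5, 1e-4, 1.5e-4]      # measured strains (consistent with EA = 2e7 N)
EA = element_stiffness(forces, strains)
```

Because each such fit involves only one element's data, the estimates can run in parallel across sensor nodes, which is the source of the efficiency gain the abstract claims.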

  20. Biracial Identification, Familial Influence and Levels of Acculturation.

    ERIC Educational Resources Information Center

    Morales, Pamilla C.; Steward, Robbie

    The levels of acculturation of biracial and monoracial Hispanics were examined in college students to determine the level of family or community influence in defining racial identification for the biracial individual. The sample was composed of Hispanic undergraduate students currently enrolled at the University of Kansas. Survey packets were…

  1. An experimental modal testing/identification technique for personal computers

    NASA Technical Reports Server (NTRS)

    Roemer, Michael J.; Schlonski, Steven T.; Mook, D. Joseph

    1990-01-01

A PC-based system for mode shape identification is evaluated. A time-domain modal identification procedure is utilized to identify the mode shapes of a beam apparatus from discrete time-domain measurements. The apparatus includes a cantilevered aluminum beam, four accelerometers, four low-pass filters, and the computer. The method combines an identification algorithm, the Eigensystem Realization Algorithm (ERA), with an estimation algorithm, the Minimum Model Error (MME) method. The identification ability of this algorithm is compared with ERA alone, a frequency-response-function technique, and an Euler-Bernoulli beam model. Detection of modal parameters and mode shapes by the PC-based time-domain system is shown to be accurate in an application with an aluminum beam, while mode shapes identified by the frequency-domain technique are not as accurate as predicted. The new method is shown to be significantly less sensitive to noise and poorly excited modes than other leading methods. The results support the use of time-domain identification systems for mode shape prediction.
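The ERA step used in this record (and in record 17 above) admits a compact sketch: form Hankel matrices from impulse-response (Markov) parameters, take an SVD, and recover a discrete state matrix whose eigenvalues are the system poles. The example below runs ERA on a synthetic single-mode response h[k] = a^k and recovers a; it covers the ERA step only, not the MME estimation stage, and the data are synthetic.

```python
# Minimal Eigensystem Realization Algorithm (ERA) on a synthetic single-mode
# impulse response; the identified discrete eigenvalue should recover the
# true pole. Sketch of the ERA step only, with synthetic data.
import numpy as np

def era(markov, order):
    """Identify a discrete state matrix from scalar Markov parameters."""
    m = (len(markov) - 1) // 2
    H0 = np.array([[markov[i + j] for j in range(m)] for i in range(m)])
    H1 = np.array([[markov[i + j + 1] for j in range(m)] for i in range(m)])
    U, s, Vt = np.linalg.svd(H0)
    Ur, sr, Vr = U[:, :order], s[:order], Vt[:order, :].T
    S_inv_sqrt = np.diag(1.0 / np.sqrt(sr))
    A = S_inv_sqrt @ Ur.T @ H1 @ Vr @ S_inv_sqrt   # realized state matrix
    return np.linalg.eigvals(A)

a = 0.9                                # true discrete-time pole
markov = [a**k for k in range(1, 12)]  # impulse-response samples h[k]
pole = era(markov, order=1)[0]
```

With noisy multi-output data the same machinery yields mode shapes from the realized output matrix, which is where the MME-filtered estimates feed in.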

  2. Computed tomographic identification of calcified optic nerve drusen

    SciTech Connect

    Ramirez, H.; Blatt, E.S.; Hibri, N.S.

    1983-07-01

    Four cases of optic disk drusen were accurately diagnosed with orbital computed tomography (CT). The radiologist should be aware of the characteristic CT finding of discrete calcification within an otherwise normal optic disk. This benign process is easily differentiated from lesions such as calcific neoplastic processes of the posterior globe. CT identification of optic disk drusen is essential in the evaluation of visual field defects, migraine-like headaches, and pseudopapilledema.

  3. 48 CFR 227.7203-10 - Contractor identification and marking of computer software or computer software documentation to...

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... and marking of computer software or computer software documentation to be furnished with restrictive... Rights in Computer Software and Computer Software Documentation 227.7203-10 Contractor identification and marking of computer software or computer software documentation to be furnished with restrictive...

  4. 48 CFR 227.7203-10 - Contractor identification and marking of computer software or computer software documentation to...

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... and marking of computer software or computer software documentation to be furnished with restrictive... Rights in Computer Software and Computer Software Documentation 227.7203-10 Contractor identification and marking of computer software or computer software documentation to be furnished with restrictive...

  5. 48 CFR 227.7203-10 - Contractor identification and marking of computer software or computer software documentation to...

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... and marking of computer software or computer software documentation to be furnished with restrictive... Rights in Computer Software and Computer Software Documentation 227.7203-10 Contractor identification and marking of computer software or computer software documentation to be furnished with restrictive...

  6. 48 CFR 227.7203-10 - Contractor identification and marking of computer software or computer software documentation to...

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... and marking of computer software or computer software documentation to be furnished with restrictive... Rights in Computer Software and Computer Software Documentation 227.7203-10 Contractor identification and marking of computer software or computer software documentation to be furnished with restrictive...

  7. 48 CFR 227.7203-10 - Contractor identification and marking of computer software or computer software documentation to...

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... and marking of computer software or computer software documentation to be furnished with restrictive... Rights in Computer Software and Computer Software Documentation 227.7203-10 Contractor identification and marking of computer software or computer software documentation to be furnished with restrictive...

  8. Tracking by Identification Using Computer Vision and Radio

    PubMed Central

    Mandeljc, Rok; Kovačič, Stanislav; Kristan, Matej; Perš, Janez

    2013-01-01

We present a novel system for detection, localization and tracking of multiple people, which fuses a multi-view computer vision approach with a radio-based localization system. The proposed fusion combines the best of both worlds: excellent computer-vision-based localization, and strong identity information provided by the radio system. It is therefore able to perform tracking by identification, which makes it impervious to propagated identity switches. We present a comprehensive methodology for evaluating systems that perform person localization in a world coordinate system and use it to evaluate the proposed system as well as its components. Experimental results on a challenging indoor dataset, which involves multiple people walking around a realistically cluttered room, confirm that the proposed fusion significantly outperforms its individual components. Compared to the radio-based system, it achieves better localization results, while at the same time it successfully prevents propagation of the identity switches that occur in pure computer-vision-based tracking. PMID:23262485

  9. Computational Issues in Damping Identification for Large Scale Problems

    NASA Technical Reports Server (NTRS)

    Pilkey, Deborah L.; Roe, Kevin P.; Inman, Daniel J.

    1997-01-01

Two damping identification methods are tested for efficiency in large-scale applications. One is an iterative routine, and the other a least squares method. Numerical simulations have been performed on multiple degree-of-freedom models to test the effectiveness of the algorithm and the usefulness of parallel computation for the problems. High Performance Fortran is used to parallelize the algorithm. Tests were performed using the IBM-SP2 at NASA Ames Research Center. The least squares method tested incurs high communication costs, which reduces the benefit of high performance computing. This method's memory requirement grows at a very rapid rate, meaning that larger problems can quickly exceed available computer memory. The iterative method's memory requirement grows at a much slower pace and is able to handle problems with 500+ degrees of freedom on a single processor. This method benefits from parallelization, and significant speedup can be seen for problems of 100+ degrees of freedom.

  10. Phosphor-stimulated computed cephalometry: reliability of landmark identification.

    PubMed

    Lim, K F; Foong, K W

    1997-11-01

    The aim of this randomized, controlled, prospective study was to determine the reliability of computed lateral cephalometry (Fuji Medical Systems, Tokyo, Japan) in terms of landmark identification compared to conventional lateral cephalometry (CAWO, Schrobenhausen, Germany). To assess the reliability of landmark identification on lateral cephalographs, 20 computed images, taken at 30 per cent reduced radiation (70 kV, 15 mA, 0.35 s) were compared to 20 conventional images (70 kV, 15 mA, 0.5 s). The 40 lateral cephalographs were taken from 20 orthodontic patients at immediate post-treatment and 1 year after retention. The order and type of imaging was randomized. Five orthodontists identified eight skeletal, four dental and five soft tissue landmarks on each of the 40 films. The error of identification was analysed in the XY Cartesian co-ordinate following digitization. Skeletal landmarks exhibited characteristic dispersion with respect to the Cartesian co-ordinates. Root apices were more variable than crown tips. Soft tissue landmarks were more consistent in the X co-ordinate. Two-way ANOVA shows that there is no significant difference between the two imaging systems in both co-ordinates (P > 0.05). Moreover, the differences are generally small (< 0.5 mm), and are unlikely to be of clinical significance. Most of the variables attained statistical power of at least 0.8 in the X-co-ordinate while only the dental landmarks achieved statistical power of at least 0.78 in the Y-co-ordinate. Based on the results of the study: (1) computed lateral cephalographs can be taken at 30 per cent radiation reduction, compared to conventional lateral cephalograph; (2) each anatomical landmark exhibits its characteristic dispersion of error in both the Cartesian co-ordinates; (3) there is no trend between the two imaging systems, with equivocal result, and none of the landmarks attained statistical significance when both raters and imaging systems are considered as factorial

  11. Postmortem computed tomography (PMCT) and disaster victim identification.

    PubMed

    Brough, A L; Morgan, B; Rutty, G N

    2015-09-01

Radiography has been used for identification since 1927, and established a role in mass fatality investigations in 1949. More recently, postmortem computed tomography (PMCT) has been used for disaster victim identification (DVI). PMCT offers several advantages compared with fluoroscopy, plain film and dental X-rays, including speed and a reduction in the number of on-site personnel and imaging modalities required, making it potentially more efficient. However, there are limitations that inhibit the international adoption of PMCT into routine practice. One particular problem is that, because forensic radiology is a relatively new sub-speciality, there are no internationally established standards for image acquisition, image interpretation and archiving. This is reflected by the current INTERPOL DVI form, which does not contain a PMCT section. The DVI working group of the International Society of Forensic Radiology and Imaging supports the use of imaging in mass fatality response and has published positional statements in this area. This review will discuss forensic radiology, PMCT, and its role in disaster victim identification. PMID:26108152

  12. Computer Literacy for Teachers: Level 1.

    ERIC Educational Resources Information Center

    Oliver, Marvin E.

    This brief, non-technical narrative for teachers addresses the following questions: (1) What are computers? (2) How do they work? (3) What can they do? (4) Why should we care? (5) What do they have to do with reading instruction? and (6) What is the future of computers for teachers and students? Specific topics discussed include the development of…

  13. A Teaching Exercise for the Identification of Bacteria Using An Interactive Computer Program.

    ERIC Educational Resources Information Center

    Bryant, Trevor N.; Smith, John E.

    1979-01-01

    Describes an interactive Fortran computer program which provides an exercise in the identification of bacteria. Provides a way of enhancing a student's approach to systematic bacteriology and numerical identification procedures. (Author/MA)

  14. Identification of Cichlid Fishes from Lake Malawi Using Computer Vision

    PubMed Central

    Joo, Deokjin; Kwan, Ye-seul; Song, Jongwoo; Pinho, Catarina; Hey, Jody; Won, Yong-Jin

    2013-01-01

    Background The explosively radiating evolution of cichlid fishes of Lake Malawi has yielded an amazing number of haplochromine species, estimated at 500 to 800, with a surprising degree of diversity not only in color and stripe pattern but also in the shape of jaw and body. As these morphological diversities have been a central subject of adaptive speciation and taxonomic classification, such high diversity could serve as a foundation for automating species identification of cichlids. Methodology/Principal Findings Here we demonstrate a method for automatic classification of the Lake Malawi cichlids based on computer vision and geometric morphometrics. To this end we developed a pipeline that integrates multiple image processing tools to automatically extract informative features of color and stripe patterns from a large set of photographic images of wild cichlids. The extracted information was evaluated by two statistical classifiers, Support Vector Machine and Random Forests. Both classifiers performed better when body shape information was added to the color and stripe features. Besides the coloration and stripe pattern, body shape variables boosted the accuracy of classification by about 10%. The programs were able to classify 594 live cichlid individuals belonging to 12 different classes (species and sexes) with an average accuracy of 78%, in contrast to a mere 42% success rate by human eyes. The variables that contributed most to the accuracy were body height and the hue of the most frequent color. Conclusions Computer vision showed notable performance in extracting information from the color and stripe patterns of Lake Malawi cichlids, although the information was not enough for errorless species identification. Our results indicate that there appears to be an unavoidable difficulty in automatic species identification of cichlid fishes, which may arise from short divergence times and gene flow between closely related species. PMID:24204918
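
    The headline result above (shape features boosting accuracy beyond color/stripe features alone) can be illustrated with a toy sketch. This is not the authors' pipeline: it uses a simple nearest-centroid classifier on invented features (hue, stripe count, body height) for three hypothetical species, two of which are assumed to differ only in body shape.

```python
import random

random.seed(42)

# Invented species prototypes: (hue, stripe count, body height).
# "A" and "B" share color/stripe statistics and differ only in body shape.
SPECIES = {
    "A": (0.20, 3.0, 1.0),
    "B": (0.20, 3.0, 1.6),
    "C": (0.70, 6.0, 1.0),
}

def sample(species):
    # draw one noisy feature vector for an individual of the given species
    hue, stripes, height = SPECIES[species]
    return [random.gauss(hue, 0.03),
            random.gauss(stripes, 0.3),
            random.gauss(height, 0.05)]

def nearest_centroid(train, x, dims):
    # classify by squared distance to each species' mean over the chosen dims
    best, best_dist = None, float("inf")
    for species, samples in train.items():
        centroid = [sum(s[i] for s in samples) / len(samples) for i in dims]
        dist = sum((c - x[i]) ** 2 for c, i in zip(centroid, dims))
        if dist < best_dist:
            best, best_dist = species, dist
    return best

def accuracy(dims, n_train=30, n_test=30):
    train = {sp: [sample(sp) for _ in range(n_train)] for sp in SPECIES}
    tests = [(sp, sample(sp)) for sp in SPECIES for _ in range(n_test)]
    hits = sum(nearest_centroid(train, x, dims) == sp for sp, x in tests)
    return hits / len(tests)

acc_color = accuracy(dims=(0, 1))      # color/stripe features only
acc_full = accuracy(dims=(0, 1, 2))    # color/stripe plus body shape
```

    With these invented parameters, the color-only classifier cannot separate the two shape-differentiated species, so adding the body-shape dimension raises accuracy, mirroring the direction of the reported effect.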

  15. Chip level simulation of fault tolerant computers

    NASA Technical Reports Server (NTRS)

    Armstrong, J. R.

    1983-01-01

    Chip level modeling techniques, functional fault simulation, simulation software development, a more efficient, high level version of GSP, and a parallel architecture for functional simulation are discussed.

  16. Factors Influencing Exemplary Science Teachers' Levels of Computer Use

    ERIC Educational Resources Information Center

    Hakverdi, Meral; Dana, Thomas M.; Swain, Colleen

    2011-01-01

    The purpose of this study was to examine exemplary science teachers' use of technology in science instruction, factors influencing their level of computer use, their level of knowledge/skills in using specific computer applications for science instruction, their use of computer-related applications/tools during their instruction, and their…

  17. Sound source localization identification accuracy: Level and duration dependencies.

    PubMed

    Yost, William A

    2016-07-01

    Sound source localization accuracy for noises was measured for sources in the front azimuthal open field, mainly as a function of overall noise level and duration. An identification procedure was used in which listeners identified which loudspeaker presented a sound. Noises were filtered and differed in bandwidth and center frequency. Sound source localization accuracy depended on the bandwidth of the stimuli, and for the narrow bandwidths, accuracy depended on the filter's center frequency. Sound source localization accuracy did not depend on overall level or duration. PMID:27475204

  18. Disaster victim identification: new applications for postmortem computed tomography.

    PubMed

    Blau, Soren; Robertson, Shelley; Johnstone, Marnie

    2008-07-01

    Mass fatalities can present the forensic anthropologist and forensic pathologist with a different set of challenges to those presented by a single fatality. To date radiography has played an important role in the disaster victim identification (DVI) process. The aim of this paper is to highlight the benefits of applying computed tomography (CT) technology to the DVI process. The paper begins by reviewing the extent to which sophisticated imaging techniques, specifically CT, have been increasingly used to assist in the analysis of deceased individuals. A small-scale case study is then presented which describes aspects of the DVI process following a recent Australian aviation disaster involving two individuals. Having gridded the scene of the disaster, a total of 41 bags of heavily disrupted human remains were collected. A postmortem examination was subsequently undertaken. Analysis of the CT images of all body parts (n = 162) made it possible not only to identify and side differentially preserved skeletal elements which were anatomically unrecognizable in the heavily disrupted body masses, but also to observe and record useful identifying features such as surgical implants. In this case the role of the forensic anthropologist and CT technology were paramount in facilitating a quick identification, and subsequently, an effective and timely reconciliation, of body parts. Although this case study is small in scale, it illustrates the enormous potential for CT imaging to complement the existing DVI process. PMID:18547358

  19. Computer program to predict aircraft noise levels

    NASA Technical Reports Server (NTRS)

    Clark, B. J.

    1981-01-01

    Methods developed at the NASA Lewis Research Center for predicting the noise contributions from various aircraft noise sources were programmed to predict aircraft noise levels either in flight or in ground tests. The noise sources include fan inlet and exhaust, jet, flap (for powered lift), core (combustor), turbine, and airframe. Noise propagation corrections are available for atmospheric attenuation, ground reflections, extra ground attenuation, and shielding. Outputs can include spectra, overall sound pressure level, perceived noise level, tone-weighted perceived noise level, and effective perceived noise level at locations specified by the user. Footprint contour coordinates and approximate footprint areas can also be calculated. Inputs and outputs can be in either System International or U.S. customary units. The subroutines for each noise source and propagation correction are described. A complete listing is given.
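
    One of the outputs listed above, overall sound pressure level, is conventionally obtained by energy-summing the band levels of a spectrum. A minimal sketch of that standard acoustics calculation (this illustrates the general formula, not the NASA program's actual code):

```python
import math

def overall_spl(band_levels_db):
    # Overall sound pressure level: convert each band level (dB) to relative
    # acoustic energy, sum the energies, and convert back to decibels.
    return 10.0 * math.log10(sum(10.0 ** (level / 10.0)
                                 for level in band_levels_db))
```

    For example, two equal 90 dB bands combine to about 93 dB, since doubling the acoustic energy adds 10·log10(2) ≈ 3 dB.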

  20. Evaluation of new computer-enhanced identification program for microorganisms: adaptation of BioBASE for identification of members of the family Enterobacteriaceae.

    PubMed Central

    Miller, J M; Alachi, P

    1996-01-01

    We report the use of BioBASE, a computer-enhanced numerical identification software package, as a valuable aid for the rapid identification of unknown enteric bacilli when using conventional biochemicals. We compared BioBASE identification results with those of the Centers for Disease Control and Prevention's mainframe computer to determine the former's accuracy in identifying both common and rare unknown isolates of the family Enterobacteriaceae by using the same compiled data matrix. Of 293 enteric strains tested by BioBASE, 278 (94.9%) were correctly identified to the species level; 13 (4.4%) were assigned unacceptable or low discrimination profiles, but 8 of these (2.7%) were listed as the first choice; and 2 (0.7%) were not identified correctly because of their highly unusual biochemical profiles. The software is user friendly, rapid, and accurate and would be of value to any laboratory that uses conventional biochemicals. PMID:8748298

  1. Factors influencing exemplary science teachers' levels of computer use

    NASA Astrophysics Data System (ADS)

    Hakverdi, Meral

    This study examines exemplary science teachers' use of technology in science instruction, factors influencing their level of computer use, their level of knowledge/skills in using specific computer applications for science instruction, their use of computer-related applications/tools during their instruction, and their students' use of computer applications/tools in or for their science class. After a relevant review of the literature, certain variables were selected for analysis. These variables included personal self-efficacy in teaching with computers, outcome expectancy, pupil-control ideology, level of computer use, age, gender, teaching experience, personal computer use, professional computer use and science teachers' level of knowledge/skills in using specific computer applications for science instruction. The sample for this study includes middle and high school science teachers who received the Presidential Award for Excellence in Science Teaching (sponsored by the White House and the National Science Foundation) between the years 1997 and 2003 from all 50 states and U.S. territories. Award-winning science teachers were contacted about the survey via e-mail or letter with an enclosed return envelope. Of the 334 award-winning science teachers, usable responses were received from 92 science teachers, a response rate of 27.5%. Analysis of the survey responses indicated that exemplary science teachers have a variety of knowledge/skills in using computer related applications/tools. The most commonly used computer applications/tools are information retrieval via the Internet, presentation tools, online communication, digital cameras, and data collection probes. Results of the study revealed that students' use of technology in their science classroom is highly correlated with the frequency of their science teachers' use of computer applications/tools.
The results of the multiple regression analysis revealed that personal self-efficacy related to

  2. 24 CFR 990.175 - Utilities expense level: Computation of the current consumption level.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ...: Computation of the current consumption level. 990.175 Section 990.175 Housing and Urban Development... Calculating Formula Expenses § 990.175 Utilities expense level: Computation of the current consumption level. The current consumption level shall be the actual amount of each utility consumed during the...

  3. 24 CFR 990.175 - Utilities expense level: Computation of the current consumption level.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ...: Computation of the current consumption level. 990.175 Section 990.175 Housing and Urban Development... Calculating Formula Expenses § 990.175 Utilities expense level: Computation of the current consumption level. The current consumption level shall be the actual amount of each utility consumed during the...

  4. 24 CFR 990.175 - Utilities expense level: Computation of the current consumption level.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ...: Computation of the current consumption level. 990.175 Section 990.175 Housing and Urban Development... Calculating Formula Expenses § 990.175 Utilities expense level: Computation of the current consumption level. The current consumption level shall be the actual amount of each utility consumed during the...

  5. 24 CFR 990.180 - Utilities expense level: Computation of the rolling base consumption level.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ...: Computation of the rolling base consumption level. 990.180 Section 990.180 Housing and Urban Development... Calculating Formula Expenses § 990.180 Utilities expense level: Computation of the rolling base consumption level. (a) General. (1) The rolling base consumption level (RBCL) shall be equal to the average...

  6. 24 CFR 990.180 - Utilities expense level: Computation of the rolling base consumption level.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ...: Computation of the rolling base consumption level. 990.180 Section 990.180 Housing and Urban Development... Calculating Formula Expenses § 990.180 Utilities expense level: Computation of the rolling base consumption level. (a) General. (1) The rolling base consumption level (RBCL) shall be equal to the average...

  7. Computational Identification of Novel Genes: Current and Future Perspectives

    PubMed Central

    Klasberg, Steffen; Bitard-Feildel, Tristan; Mallet, Ludovic

    2016-01-01

    While it has long been thought that all genomic novelties are derived from existing material, many genes lacking homology to known genes were found in recent genome projects. Some of these novel genes were proposed to have evolved de novo, i.e., out of noncoding sequences, whereas some have been shown to follow a duplication and divergence process. Their discovery called for an extension of the historical hypotheses about gene origination. Besides the theoretical breakthrough, increasing evidence has accumulated that novel genes play important roles in evolutionary processes, including adaptation and speciation events. Different techniques are available to identify genes and classify them as novel. Their classification as novel is usually based on their similarity to known genes, or lack thereof, as detected by comparative genomics or database searches. Computational approaches are further prime methods, which can be based on existing models or leverage biological evidence from experiments. Identification of novel genes remains, however, a challenging task. With constant software and technology updates, no gold standard, and no available benchmark, evaluation and characterization of genomic novelty is a vibrant field. In this review, the classical and state-of-the-art tools for gene prediction are introduced. The current methods for novel gene detection are presented; the methodological strategies and their limits are discussed along with perspective approaches for further studies. PMID:27493475

  8. Computational identification of obligatorily autocatalytic replicators embedded in metabolic networks

    PubMed Central

    Kun, Ádám; Papp, Balázs; Szathmáry, Eörs

    2008-01-01

    Background If chemical A is necessary for the synthesis of more chemical A, then A has the power of replication (such systems are known as autocatalytic systems). We provide the first systems-level analysis searching for small-molecular autocatalytic components in the metabolisms of diverse organisms, including an inferred minimal metabolism. Results We find that intermediary metabolism is invariably autocatalytic for ATP. Furthermore, we provide evidence for the existence of additional, organism-specific autocatalytic metabolites in the forms of coenzymes (NAD+, coenzyme A, tetrahydrofolate, quinones) and sugars. Although the enzymatic reactions of a number of autocatalytic cycles are present in most of the studied organisms, they display obligatorily autocatalytic behavior in a few networks only, hence demonstrating the need for a systems-level approach to identify metabolic replicators embedded in large networks. Conclusion Metabolic replicators are apparently common and potentially both universal and ancestral: without their presence, kick-starting metabolic networks is impossible, even if all enzymes and genes are present in the same cell. Identification of metabolic replicators is also important for attempts to create synthetic cells, as some of these autocatalytic molecules will presumably be needed to be added to the system as, by definition, the system cannot synthesize them without their initial presence. PMID:18331628
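
    The obligate-autocatalysis idea above (A is needed to make more A) can be sketched as a reachability test over a reaction network. The network below is an invented, glycolysis-like toy in which ATP must be spent to prime glucose before the payoff step regenerates it; it illustrates the concept only, not the authors' systems-level analysis of real metabolic models.

```python
def producible(seeds, reactions):
    # forward closure: repeatedly fire any reaction whose substrates are all
    # available, until no new metabolite appears
    available = set(seeds)
    changed = True
    while changed:
        changed = False
        for substrates, products in reactions:
            if substrates <= available and not products <= available:
                available |= products
                changed = True
    return available

def obligately_autocatalytic(m, seeds, reactions):
    if m in producible(seeds, reactions):
        return False  # m can be made without itself, so not obligate
    # with one bootstrap copy of m, does some firable reaction regenerate m?
    available = producible(set(seeds) | {m}, reactions)
    return any(substrates <= available and m in products
               for substrates, products in reactions)

# Invented glycolysis-like network: ATP is spent to prime glucose, then
# regenerated by the payoff step (hypothetical species names).
reactions = [
    ({"glucose", "ATP"}, {"G6P", "ADP"}),
    ({"G6P", "ADP", "Pi"}, {"pyruvate", "ATP"}),
]
seeds = {"glucose", "ADP", "Pi"}
```

    In this toy network nothing can fire without ATP, yet seeding a single ATP lets the network regenerate it, so ATP tests as an obligate autocatalytic metabolite, echoing the kick-starting argument in the abstract.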

  9. 24 CFR 990.180 - Utilities expense level: Computation of the rolling base consumption level.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ...: Computation of the rolling base consumption level. 990.180 Section 990.180 Housing and Urban Development... Calculating Formula Expenses § 990.180 Utilities expense level: Computation of the rolling base consumption... RBCL not to be comparable to the current year's consumption level. (c) Financial incentives. The...

  10. Identification of natural images and computer-generated graphics based on statistical and textural features.

    PubMed

    Peng, Fei; Li, Jiao-ting; Long, Min

    2015-03-01

    To discriminate the acquisition pipelines of digital images, a novel scheme for the identification of natural images and computer-generated graphics is proposed based on statistical and textural features. First, the differences between them are investigated from the viewpoints of statistics and texture, and a 31-dimensional feature vector is acquired for identification. Then, LIBSVM is used for the classification. Finally, the experimental results are presented. The results show that the scheme can achieve an identification accuracy of 97.89% for computer-generated graphics and 97.75% for natural images. The analyses also demonstrate that the proposed method has excellent performance compared with some existing methods based only on statistical features or other features. The method has great potential to be implemented for the identification of natural images and computer-generated graphics. PMID:25537575

  11. Dysregulation in level of goal and action identification across psychological disorders

    PubMed Central

    Watkins, Edward

    2011-01-01

    Goals, events, and actions can be mentally represented within a hierarchical framework that ranges from more abstract to more concrete levels of identification. A more abstract level of identification involves general, superordinate, and decontextualized mental representations that convey the meaning of goals, events, and actions, “why” an action is performed, and its purpose, ends, and consequences. A more concrete level of identification involves specific and subordinate mental representations that include contextual details of goals, events, and actions, and the specific “how” details of an action. This review considers three lines of evidence for considering that dysregulation of level of goal/action identification may be a transdiagnostic process. First, there is evidence that different levels of identification have distinct functional consequences and that in non-clinical samples level of goal/action identification appears to be regulated in a flexible and adaptive way to match the level of goal/action identification to circumstances. Second, there is evidence that level of goal/action identification causally influences symptoms and processes involved in psychological disorders, including emotional response, repetitive thought, impulsivity, problem solving and procrastination. Third, there is evidence that the level of goal/action identification is biased and/or dysregulated in certain psychological disorders, with a bias towards more abstract identification for negative events in depression, GAD, PTSD, and social anxiety. PMID:20579789

  13. A novel computational method for the identification of plant alternative splice sites.

    PubMed

    Cui, Ying; Han, Jiuqiang; Zhong, Dexing; Liu, Ruiling

    2013-02-01

    Alternative splicing (AS) increases protein diversity by generating multiple transcript isoforms from a single gene in higher eukaryotes. Up to 48% of plant genes exhibit alternative splicing, which has proven to be involved in some important plant functions such as the stress response. A hybrid feature extraction approach combining the position weight matrix (PWM) with the increment of diversity (ID) was proposed to represent the base conservation level (BCL) near splice sites and the similarity level of two datasets, respectively. Using the extracted features, a support vector machine (SVM) was applied to classify alternative and constitutive splice sites. With the proposed algorithm, 80.8% of donor sites and 85.4% of acceptor sites were correctly classified. It is anticipated that this computational method is promising for the identification of AS sites in plants. PMID:23313482
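
    The PWM half of the feature extraction can be sketched as follows: each position of an aligned set of true splice sites yields a column of log-odds scores, so that a candidate site's summed score reflects the base conservation level near the site. The training sequences and parameter values below are invented for illustration and are not the authors' data or exact method.

```python
import math

def build_pwm(aligned_sites, pseudocount=1.0, background=0.25):
    # one column per position: log2(observed base frequency / background),
    # with a pseudocount to avoid log of zero
    pwm = []
    for i in range(len(aligned_sites[0])):
        counts = {base: pseudocount for base in "ACGT"}
        for site in aligned_sites:
            counts[site[i]] += 1
        total = sum(counts.values())
        pwm.append({base: math.log2(counts[base] / total / background)
                    for base in "ACGT"})
    return pwm

def pwm_score(pwm, site):
    # higher score = site better matches the conservation profile
    return sum(column[base] for column, base in zip(pwm, site))

# Hypothetical donor-site training set: the near-invariant "GT" dinucleotide
# sits at positions 3-4 of these invented sequences
donor_sites = ["AAGGTAAG", "CAGGTGAG", "AAGGTAAG", "TAGGTAAG"]
pwm = build_pwm(donor_sites)
```

    A consensus-like candidate then scores well above a sequence that ignores the conserved positions, which is the signal the SVM features build on.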

  14. The algorithmic level is the bridge between computation and brain.

    PubMed

    Love, Bradley C

    2015-04-01

    Every scientist chooses a preferred level of analysis and this choice shapes the research program, even determining what counts as evidence. This contribution revisits Marr's (1982) three levels of analysis (implementation, algorithmic, and computational) and evaluates the prospect of making progress at each individual level. After reviewing limitations of theorizing within a level, two strategies for integration across levels are considered. One is top-down in that it attempts to build a bridge from the computational to algorithmic level. Limitations of this approach include insufficient theoretical constraint at the computation level to provide a foundation for integration, and that people are suboptimal for reasons other than capacity limitations. Instead, an inside-out approach is forwarded in which all three levels of analysis are integrated via the algorithmic level. This approach maximally leverages mutual data constraints at all levels. For example, algorithmic models can be used to interpret brain imaging data, and brain imaging data can be used to select among competing models. Examples of this approach to integration are provided. This merging of levels raises questions about the relevance of Marr's tripartite view. PMID:25823496

  15. Levels of Evaluation for Computer-Based Instruction.

    ERIC Educational Resources Information Center

    Reeves, Thomas C.; Lent, Richard M.

    The uses and methods of four levels of evaluation which can be conducted during the development and implementation phases of computer-based instruction (CBI) programs are discussed in this paper. The levels of evaluation presented are: (1) documentation, (2) formative evaluation, (3) assessment of immediate learner effectiveness, and (4) impact…

  16. A Program for the Identification of the Enterobacteriaceae for Use in Teaching the Principles of Computer Identification of Bacteria.

    ERIC Educational Resources Information Center

    Hammonds, S. J.

    1990-01-01

    A technique for the numerical identification of bacteria using normalized likelihoods calculated from a probabilistic database is described, and the principles of the technique are explained. The listing of the computer program is included. Specimen results from the program, and examples of how they should be interpreted, are given. (KR)
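
    The normalized-likelihood calculation referred to above can be sketched as: for each candidate taxon, multiply the probabilities of the observed test results from the probabilistic database, then normalize the products so they sum to one across taxa. The mini probability matrix below is invented for illustration, not real database values.

```python
def normalized_likelihoods(test_results, prob_matrix):
    # prob_matrix: taxon -> {test: probability of a positive result}
    # test_results: {test: True/False observed result}
    raw = {}
    for taxon, probs in prob_matrix.items():
        likelihood = 1.0
        for test, positive in test_results.items():
            p = probs[test]
            likelihood *= p if positive else 1.0 - p
        raw[taxon] = likelihood
    total = sum(raw.values())
    # each taxon's share of the summed likelihoods; a share near 1.0 is
    # conventionally treated as a confident identification
    return {taxon: lik / total for taxon, lik in raw.items()}

# Hypothetical mini-matrix; the probabilities are invented for illustration
matrix = {
    "Escherichia coli":     {"indole": 0.98, "citrate": 0.02, "H2S": 0.01},
    "Citrobacter freundii": {"indole": 0.05, "citrate": 0.95, "H2S": 0.80},
}
scores = normalized_likelihoods(
    {"indole": True, "citrate": False, "H2S": False}, matrix)
```

    With this invented matrix, an indole-positive, citrate- and H2S-negative pattern gives nearly all of the normalized likelihood to the first taxon, the kind of result such teaching programs report to the student.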

  17. Computer ethics and tertiary level education in Hong Kong

    SciTech Connect

    Wong, E.Y.W.; Davison, R.M.; Wade, P.W.

    1994-12-31

    This paper seeks to highlight some ethical issues relating to the increasing proliferation of information technology into our everyday lives. The authors explain their understanding of computer ethics and give some reasons why the study of computer ethics is becoming increasingly pertinent. The paper looks at some of the problems that arise in attempting to develop appropriate ethical concepts in a constantly changing environment, and explores some of the ethical dilemmas arising from the increasing use of computers. Initial research undertaken to explore the ideas and understanding of tertiary level students in Hong Kong on a number of ethical issues is described, and the findings are discussed. We hope that presenting this paper and eliciting subsequent discussion will enable us to draw up more comprehensive guidelines for the teaching of computer-related ethics to tertiary level students, as well as reveal some directions for future research.

  18. Rugoscopy: Human identification by computer-assisted photographic superimposition technique

    PubMed Central

    Mohammed, Rezwana Begum; Patil, Rajendra G.; Pammi, V. R.; Sandya, M. Pavana; Kalyan, Siva V.; Anitha, A.

    2013-01-01

    Background: Human identification has been studied since the fourteenth century and has gradually advanced for forensic purposes. Traditional methods such as dental, fingerprint, and DNA comparisons are probably the most common techniques used in this context, allowing fast and secure identification processes. However, in circumstances where identification of an individual by fingerprint or dental record comparison is difficult, the palatal rugae may be considered as an alternative source of material. Aim: The present study was done to evaluate the individualistic nature and use of palatal rugae patterns for personal identification, and to test the efficiency of computerized software for forensic identification by photographic superimposition of palatal photographs obtained from casts. Materials and Methods: Two sets of alginate impressions were made from the upper arches of 100 individuals (50 males and 50 females) with a one-month interval in between, and the casts were poured. All the teeth except the incisors were removed to ensure that only the palate could be used in the identification process. In one set of casts, the palatal rugae were highlighted with a graphite pencil. All 200 casts were randomly numbered and then photographed with a 10.1 megapixel Kodak digital camera using a standardized method. Using computerized software, the digital photographs of the models without highlighted palatal rugae were superimposed over transparent images of the casts with highlighted palatal rugae, in order to identify the pairs. The incisors were retained and used as landmarks to determine the magnification required to bring the two sets of photographs to the same size, allowing perfect superimposition of the images. Results: Superimposing the digital photographs of the models with highlighted palatal rugae over the set without highlighting resulted in 100% positive identification.

  19. Computing NLTE Opacities -- Node Level Parallel Calculation

    SciTech Connect

    Holladay, Daniel

    2015-09-11

    Presentation. The goal: to produce a robust library capable of computing reasonably accurate opacities in-line, with the assumption of LTE relaxed (non-LTE). Near term: demonstrate acceleration of non-LTE opacity computation. Far term (if funded): connect to application codes with in-line capability and compute opacities. Study science problems. Use efficient algorithms that expose many levels of parallelism and utilize good memory access patterns for use on advanced architectures. Portability to multiple types of hardware, including multicore processors, manycore processors such as KNL, GPUs, etc. Easily coupled to radiation hydrodynamics and thermal radiative transfer codes.

  20. Enhanced Fault-Tolerant Quantum Computing in d-Level Systems

    NASA Astrophysics Data System (ADS)

    Campbell, Earl T.

    2014-12-01

    Error-correcting codes protect quantum information and form the basis of fault-tolerant quantum computing. Leading proposals for fault-tolerant quantum computation require codes with an exceedingly rare property, a transversal non-Clifford gate. Codes with the desired property are presented for d-level qudit systems with prime d. The codes use n = d - 1 qudits and can detect up to ~d/3 errors. We quantify the performance of these codes for one approach to quantum computation known as magic-state distillation. Unlike prior work, we find performance is always enhanced by increasing d.

  1. Variance and bias computation for enhanced system identification

    NASA Technical Reports Server (NTRS)

    Bergmann, Martin; Longman, Richard W.; Juang, Jer-Nan

    1989-01-01

    A study is made of the use of a series of variance and bias confidence criteria recently developed for the eigensystem realization algorithm (ERA) identification technique. The criteria are shown to be very effective, not only for indicating the accuracy of the identification results (especially in terms of confidence intervals), but also for helping the ERA user to obtain better results. They help determine the best sample interval, the true system order, how much data to use and whether to introduce gaps in the data used, what dimension Hankel matrix to use, and how to limit the bias or correct for bias in the estimates.

  2. Contours identification of elements in a cone beam computed tomography for investigating maxillary cysts

    NASA Astrophysics Data System (ADS)

    Chioran, Doina; Nicoarǎ, Adrian; Roşu, Şerban; Cǎrligeriu, Virgil; Ianeş, Emilia

    2013-10-01

    Digital processing of two-dimensional cone beam computed tomography slices starts with identification of the contours of the elements within. This paper presents the collective work of specialists in medicine and in applied mathematics and computer science on the elaboration and implementation of algorithms for dental 2D imagery.

  3. Teaching Higher Level Thinking Skills through Computer Courseware.

    ERIC Educational Resources Information Center

    Edwards, Lois

    A rationale is presented for teaching gifted students to gain computer literacy, learn programing, use utility software (e.g., word processing packages), and use interactive educational courseware containing drills, simulations, or educational strategy games to develop higher level and creative thinking skills. Evaluation of courseware for gifted…

  4. OS friendly microprocessor architecture: Hardware level computer security

    NASA Astrophysics Data System (ADS)

    Jungwirth, Patrick; La Fratta, Patrick

    2016-05-01

    We present an introduction to the patented OS Friendly Microprocessor Architecture (OSFA) and hardware level computer security. Conventional microprocessors have not tried to balance hardware performance and OS performance at the same time. Conventional microprocessors have depended on the Operating System for computer security and information assurance. The goal of the OS Friendly Architecture is to provide a high performance and secure microprocessor and OS system. We are interested in cyber security, information technology (IT), and SCADA control professionals reviewing the hardware level security features. The OS Friendly Architecture is a switched set of cache memory banks in a pipeline configuration. For light-weight threads, the memory pipeline configuration provides near instantaneous context switching times. The pipelining and parallelism provided by the cache memory pipeline provides for background cache read and write operations while the microprocessor's execution pipeline is running instructions. The cache bank selection controllers provide arbitration to prevent the memory pipeline and microprocessor's execution pipeline from accessing the same cache bank at the same time. This separation allows the cache memory pages to transfer to and from level 1 (L1) caching while the microprocessor pipeline is executing instructions. Computer security operations are implemented in hardware. By extending Unix file permissions bits to each cache memory bank and memory address, the OSFA provides hardware level computer security.

  5. Domain identification in impedance computed tomography by spline collocation method

    NASA Technical Reports Server (NTRS)

    Kojima, Fumio

    1990-01-01

    A method for estimating an unknown domain in elliptic boundary value problems is considered. The problem is formulated as an inverse problem of integral equations of the second kind. A computational method is developed using a spline collocation scheme. The results can be applied to the inverse problem of impedance computed tomography (ICT) for image reconstruction.
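
    The paper's spline collocation scheme is not reproduced here, but the underlying idea — discretizing a second-kind integral equation at collocation points so it becomes a linear system — can be illustrated with a simpler Nyström (quadrature collocation) sketch. The function name and the trapezoid rule are our assumptions, not the paper's method:

```python
import numpy as np

def fredholm2_nystrom(K, f, n=40):
    """Solve phi(x) - \int_0^1 K(x,t) phi(t) dt = f(x) on [0,1]
    by collocating at n equispaced nodes with trapezoid quadrature."""
    x = np.linspace(0.0, 1.0, n)
    w = np.full(n, 1.0 / (n - 1))          # trapezoid weights
    w[0] = w[-1] = 0.5 / (n - 1)
    # (I - K .* w) phi = f  at the collocation nodes
    A = np.eye(n) - K(x[:, None], x[None, :]) * w
    return x, np.linalg.solve(A, f(x))

# Known-solution check: with K = 0.5 and f(x) = x - 0.25, the exact
# solution is phi(x) = x, since \int_0^1 0.5*t dt = 0.25.
x, phi = fredholm2_nystrom(lambda xx, tt: 0.5 + 0.0 * (xx + tt),
                           lambda xx: xx - 0.25)
```

Because the trapezoid rule is exact for linear integrands, the sketch recovers the exact solution here; a spline collocation scheme replaces the point samples with a spline basis but leads to the same kind of linear system.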

  6. Logic as Marr's Computational Level: Four Case Studies.

    PubMed

    Baggio, Giosuè; van Lambalgen, Michiel; Hagoort, Peter

    2015-04-01

    We sketch four applications of Marr's levels-of-analysis methodology to the relations between logic and experimental data in the cognitive neuroscience of language and reasoning. The first part of the paper illustrates the explanatory power of computational level theories based on logic. We show that a Bayesian treatment of the suppression task in reasoning with conditionals is ruled out by EEG data, supporting instead an analysis based on defeasible logic. Further, we describe how results from an EEG study on temporal prepositions can be reanalyzed using formal semantics, addressing a potential confound. The second part of the article demonstrates the predictive power of logical theories drawing on EEG data on processing progressive constructions and on behavioral data on conditional reasoning in people with autism. Logical theories can constrain processing hypotheses all the way down to neurophysiology, and conversely neuroscience data can guide the selection of alternative computational level models of cognition. PMID:25417838

  7. Multi-level Hierarchical Poly Tree computer architectures

    NASA Technical Reports Server (NTRS)

    Padovan, Joe; Gute, Doug

    1990-01-01

    Based on the concept of hierarchical substructuring, this paper develops an optimal multi-level Hierarchical Poly Tree (HPT) parallel computer architecture scheme which is applicable to the solution of finite element and difference simulations. Emphasis is given to minimizing computational effort, in-core/out-of-core memory requirements, and the data transfer between processors. In addition, a simplified communications network that reduces the number of I/O channels between processors is presented. HPT configurations that yield optimal superlinearities are also demonstrated. Moreover, to generalize the scope of applicability, special attention is given to developing: (1) multi-level reduction trees which provide an orderly/optimal procedure by which model densification/simplification can be achieved, as well as (2) methodologies enabling processor grading that yields architectures with varying types of multi-level granularity.

  8. Correlations of Electrophysiological Measurements with Identification Levels of Ancient Chinese Characters

    PubMed Central

    Qi, Zhengyang; Wang, Xiaolong; Hao, Shuang; Zhu, Chuanlin; He, Weiqi; Luo, Wenbo

    2016-01-01

    Studies of event-related potential (ERP) in the human brain have shown that the N170 component can reliably distinguish among different object categories. However, it is unclear whether this is true for different identifiable levels within a single category. In the present study, we used ERP recording to examine the neural response to different identification levels and orientations (upright vs. inverted) of Chinese characters. The results showed that P1, N170, and P250 were modulated by different identification levels of Chinese characters. Moreover, time frequency analysis showed similar results, indicating that identification levels were associated with object recognition, particularly during processing of a single categorical stimulus. PMID:26982215

  9. 24 CFR 990.175 - Utilities expense level: Computation of the current consumption level.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 24 Housing and Urban Development 4 2010-04-01 2010-04-01 false Utilities expense level: Computation of the current consumption level. 990.175 Section 990.175 Housing and Urban Development Regulations Relating to Housing and Urban Development (Continued) OFFICE OF ASSISTANT SECRETARY FOR PUBLIC AND INDIAN HOUSING, DEPARTMENT OF HOUSING AND...

  10. 24 CFR 990.180 - Utilities expense level: Computation of the rolling base consumption level.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 24 Housing and Urban Development 4 2010-04-01 2010-04-01 false Utilities expense level: Computation of the rolling base consumption level. 990.180 Section 990.180 Housing and Urban Development Regulations Relating to Housing and Urban Development (Continued) OFFICE OF ASSISTANT SECRETARY FOR PUBLIC AND INDIAN HOUSING, DEPARTMENT OF HOUSING...

  11. 24 CFR 990.175 - Utilities expense level: Computation of the current consumption level.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 24 Housing and Urban Development 4 2011-04-01 2011-04-01 false Utilities expense level: Computation of the current consumption level. 990.175 Section 990.175 Housing and Urban Development REGULATIONS RELATING TO HOUSING AND URBAN DEVELOPMENT (CONTINUED) OFFICE OF ASSISTANT SECRETARY FOR PUBLIC AND INDIAN HOUSING, DEPARTMENT OF HOUSING AND...

  12. 24 CFR 990.180 - Utilities expense level: Computation of the rolling base consumption level.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 24 Housing and Urban Development 4 2011-04-01 2011-04-01 false Utilities expense level: Computation of the rolling base consumption level. 990.180 Section 990.180 Housing and Urban Development REGULATIONS RELATING TO HOUSING AND URBAN DEVELOPMENT (CONTINUED) OFFICE OF ASSISTANT SECRETARY FOR PUBLIC AND INDIAN HOUSING, DEPARTMENT OF HOUSING...

  13. Reservoir Computing approach to Great Lakes water level forecasting

    NASA Astrophysics Data System (ADS)

    Coulibaly, Paulin

    2010-02-01

    The use of echo state networks (ESN) for dynamical system modeling is known as Reservoir Computing and has been shown to be effective for a number of applications, including signal processing, learning grammatical structure, time series prediction and motor/system control. However, the performance of the Reservoir Computing approach on hydrological time series remains largely unexplored. This study investigates the potential of ESN, or Reservoir Computing, for long-term prediction of lake water levels. Great Lakes water levels from 1918 to 2005 are used to develop and evaluate the ESN models. The forecast performance of the ESN-based models is compared with the results obtained from two benchmark models, the conventional recurrent neural network (RNN) and the Bayesian neural network (BNN). The test results indicate a strong ability of ESN models to provide improved lake level forecasts up to 10 months ahead, suggesting that the inherent structure and innovative learning approach of the ESN is suitable for hydrological time series modeling. Another particular advantage of the ESN learning approach is that it simplifies network training and avoids the limitations inherent to gradient descent optimization. Overall, it is shown that the ESN can be a good alternative method for improved lake level forecasting, performing better than both the RNN and the BNN on the four selected Great Lakes time series, namely Lakes Erie, Huron-Michigan, Ontario, and Superior.
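
    A generic echo state network of the kind the study evaluates can be sketched in a few lines: a fixed random reservoir supplies the dynamics, and only a linear readout is trained (ridge regression), which is why ESN training sidesteps gradient-descent limitations. This is a toy sketch with our own parameter choices, not the paper's model:

```python
import numpy as np

def esn_fit(u, y, n_res=50, rho=0.9, ridge=1e-8, seed=0):
    """Echo state network sketch: drive a fixed random reservoir with the
    input series u, then fit a linear readout to the target series y."""
    rng = np.random.default_rng(seed)
    W_in = rng.uniform(-0.5, 0.5, n_res)
    W = rng.uniform(-0.5, 0.5, (n_res, n_res))
    W *= rho / np.max(np.abs(np.linalg.eigvals(W)))  # set spectral radius
    x = np.zeros(n_res)
    states = []
    for ut in u:
        x = np.tanh(W @ x + W_in * ut)  # reservoir update
        states.append(x.copy())
    X = np.array(states)
    # Ridge-regression readout: W_out = (X'X + lambda I)^{-1} X'y
    W_out = np.linalg.solve(X.T @ X + ridge * np.eye(n_res), X.T @ y)
    return X @ W_out  # fitted outputs on the training series

# Toy task: one-step-ahead prediction of an oscillating signal
t = np.arange(300)
u = np.sin(0.2 * t)
y = np.sin(0.2 * (t + 1))
y_hat = esn_fit(u, y)
```

After a short washout of the initial transient, the linear readout over the reservoir states tracks the one-step-ahead target closely.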

  14. GC/IR computer-aided identification of anaerobic bacteria

    NASA Astrophysics Data System (ADS)

    Ye, Hunian; Zhang, Feng S.; Yang, Hua; Li, Zhu; Ye, Song

    1993-09-01

    A new method was developed to identify anaerobic bacteria by using pattern recognition. The method is based on GC/IR data. The system is intended for use as a precise, rapid, and reproducible aid in the identification of unknown isolates. Key Words: Anaerobic bacteria, Pattern recognition, Computer-aided identification, GC/IR. 1. INTRODUCTION A major problem in the field of anaerobic bacteriology is the difficulty in accurately, precisely, and rapidly identifying unknown isolates. In the proceedings of the Third International Symposium on Rapid Methods and Automation in Microbiology, C. M. Moss said: "Chromatographic analysis is a new future for clinical microbiology". Twelve years have passed, and so far it seems that this is an idea whose time has not yet come, but it is close. Two major advances that have brought the technology forward, in terms of making it appropriate for use in the clinical laboratory, can also be cited. One is the development and implementation of fused silica capillary columns. In contrast to packed columns and those of greater width, these columns allow reproducible recovery of hydroxy fatty acids with the same carbon chain length. The second advance is the efficient data processing afforded by modern microcomputer systems. On the other hand, the practical steps for sample preparation are also an advance in the clinical laboratory. Chromatographic analysis means mainly analysis of fatty acids. The most common

  15. A new computer-assisted technique to aid personal identification.

    PubMed

    De Angelis, Danilo; Sala, Remo; Cantatore, Angela; Grandi, Marco; Cattaneo, Cristina

    2009-07-01

    The paper describes a procedure aimed at identification from two-dimensional (2D) images (video-surveillance tapes, for example) by comparison with a three-dimensional (3D) facial model of a suspect. The application is intended to provide a tool that can help in analyzing compatibility or incompatibility between a criminal's and a suspect's facial traits. The authors apply the concept of "geometrically compatible images". The idea is to use a scanner to reconstruct a 3D facial model of a suspect and to compare it to a frame extracted from the video-surveillance sequence that shows the face of the perpetrator. Repositioning and reorientation of the 3D model according to the subject's face framed in the crime scene photo are accomplished manually, after automatic resizing. Repositioning and reorientation are performed using anthropometric landmarks, distinctive for that person and detected both on the 2D face and on the 3D model. In this way, the superimposition between the original two-dimensional facial image and the three-dimensional one is obtained, and a judgment is formulated by an expert on the basis of the fit between the anatomical facial districts of the two subjects. The procedure reduces the influence of face orientation and may be a useful tool in identification. PMID:19082838

  16. Computer-guided drug repurposing: identification of trypanocidal activity of clofazimine, benidipine and saquinavir.

    PubMed

    Bellera, Carolina L; Balcazar, Darío E; Vanrell, M Cristina; Casassa, A Florencia; Palestro, Pablo H; Gavernet, Luciana; Labriola, Carlos A; Gálvez, Jorge; Bruno-Blanch, Luis E; Romano, Patricia S; Carrillo, Carolina; Talevi, Alan

    2015-03-26

    In spite of remarkable advances in the knowledge on Trypanosoma cruzi biology, no medications to treat Chagas disease have been approved in the last 40 years and almost 8 million people remain infected. Since the public sector and non-profit organizations play a significant role in the research efforts on Chagas disease, it is important to implement research strategies that promote translation of basic research into clinical practice. Recent international public-private initiatives address the potential of drug repositioning (i.e. finding second or further medical uses for known medications), which can substantially improve the success at clinical trials and the innovation in the pharmaceutical field. In this work, we present the computer-aided identification of the approved drugs clofazimine, benidipine and saquinavir as potential trypanocidal compounds and test their effects at both the biochemical and cellular levels on different parasite stages. According to the obtained results, we discuss the biopharmaceutical, toxicological and physiopathological criteria applied to decide to move clofazimine and benidipine into the preclinical phase, in an acute model of infection. The article illustrates the potential of computer-guided drug repositioning to integrate and optimize drug discovery and preclinical development; it also proposes rational rules to select which among repositioned candidates should advance to investigational drug status and offers a new insight on clofazimine and benidipine as candidate treatments for Chagas disease. One Sentence Summary: We present the computer-guided drug repositioning of three approved drugs as potential new treatments for Chagas disease, integrating computer-aided drug screening and biochemical, cellular and preclinical tests. PMID:25707014

  17. Computing Bounds on Resource Levels for Flexible Plans

    NASA Technical Reports Server (NTRS)

    Muscettola, Nicola; Rijsman, David

    2009-01-01

    A new algorithm efficiently computes the tightest exact bound on the levels of resources induced by a flexible activity plan (see figure). Tightness of bounds is extremely important for computations involved in planning because tight bounds can save potentially exponential amounts of search (through early backtracking and detection of solutions), relative to looser bounds. The bound computed by the new algorithm, denoted the resource-level envelope, constitutes the measure of maximum and minimum consumption of resources at any time for all fixed-time schedules in the flexible plan. At each time, the envelope guarantees that there are two fixed-time instantiations one that produces the minimum level and one that produces the maximum level. Therefore, the resource-level envelope is the tightest possible resource-level bound for a flexible plan because any tighter bound would exclude the contribution of at least one fixed-time schedule. If the resource- level envelope can be computed efficiently, one could substitute looser bounds that are currently used in the inner cores of constraint-posting scheduling algorithms, with the potential for great improvements in performance. What is needed to reduce the cost of computation is an algorithm, the measure of complexity of which is no greater than a low-degree polynomial in N (where N is the number of activities). The new algorithm satisfies this need. In this algorithm, the computation of resource-level envelopes is based on a novel combination of (1) the theory of shortest paths in the temporal-constraint network for the flexible plan and (2) the theory of maximum flows for a flow network derived from the temporal and resource constraints. 
The measure of asymptotic complexity of the algorithm is O(N · O(maxflow(N))), where O(x) denotes an amount of computing time or a number of arithmetic operations proportional to a number of the order of x, and O(maxflow(N)) is the measure of complexity (and thus of cost) of a maximum-flow

  18. DEVELOPMENT OF COMPUTATIONAL TOOLS FOR OPTIMAL IDENTIFICATION OF BIOLOGICAL NETWORKS

    EPA Science Inventory

    Following the theoretical analysis and computer simulations, the next step for the development of SNIP will be a proof-of-principle laboratory application. Specifically, we have obtained a synthetic transcriptional cascade (harbored in Escherichia coli...

  19. Bytes and Bugs: Integrating Computer Programming with Bacteria Identification.

    ERIC Educational Resources Information Center

    Danciger, Michael

    1986-01-01

    By using a computer program to identify bacteria, students sharpen their analytical skills and gain familiarity with procedures used in laboratories outside the university. Although it is ideal for identifying a bacterium, the program can be adapted to many other disciplines. (Author)

  20. A Simple Computer Application for the Identification of Conifer Genera

    ERIC Educational Resources Information Center

    Strain, Steven R.; Chmielewski, Jerry G.

    2010-01-01

    The National Science Education Standards prescribe that an understanding of the importance of classifying organisms be one component of a student's educational experience in the life sciences. The use of a classification scheme to identify organisms is one way of addressing this goal. We describe Conifer ID, a computer application that assists…

  1. A survey of computational methods and error rate estimation procedures for peptide and protein identification in shotgun proteomics

    PubMed Central

    Nesvizhskii, Alexey I.

    2010-01-01

    This manuscript provides a comprehensive review of the peptide and protein identification process using tandem mass spectrometry (MS/MS) data generated in shotgun proteomic experiments. The commonly used methods for assigning peptide sequences to MS/MS spectra are critically discussed and compared, from basic strategies to advanced multi-stage approaches. Particular attention is paid to the problem of false-positive identifications. Existing statistical approaches for assessing the significance of peptide to spectrum matches are surveyed, ranging from single-spectrum approaches such as expectation values to global error rate estimation procedures such as false discovery rates and posterior probabilities. The importance of using auxiliary discriminant information (mass accuracy, peptide separation coordinates, digestion properties, etc.) is discussed, and advanced computational approaches for joint modeling of multiple sources of information are presented. This review also includes a detailed analysis of the issues affecting the interpretation of data at the protein level, including the amplification of error rates when going from peptide to protein level, and the ambiguities in inferring the identities of sample proteins in the presence of shared peptides. Commonly used methods for computing protein-level confidence scores are discussed in detail. The review concludes with a discussion of several outstanding computational issues. PMID:20816881
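
    Among the global error-rate procedures such a review surveys, the target-decoy strategy is the simplest to illustrate: matches against a decoy database are assumed to model false positives, so the FDR at each score threshold is estimated as the ratio of accepted decoys to accepted targets. A toy sketch (function name ours, without the monotonization step of a full q-value computation):

```python
def decoy_fdr(scored_matches):
    """scored_matches: list of (score, is_decoy) pairs.
    Returns (score, estimated FDR) at each descending-score threshold,
    with FDR ~= accepted decoys / accepted targets."""
    ordered = sorted(scored_matches, key=lambda m: -m[0])
    targets = decoys = 0
    out = []
    for score, is_decoy in ordered:
        if is_decoy:
            decoys += 1
        else:
            targets += 1
        out.append((score, decoys / max(targets, 1)))
    return out

# Usage: three targets and two decoys, ranked by search score
out = decoy_fdr([(10, False), (9, False), (8, True), (7, False), (6, True)])
```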

  2. Computational Identification of Active Enhancers in Model Organisms

    PubMed Central

    Wang, Chengqi; Zhang, Michael Q.; Zhang, Zhihua

    2013-01-01

    As a class of cis-regulatory elements, enhancers were first identified as the genomic regions that are able to markedly increase the transcription of genes nearly 30 years ago. Enhancers can regulate gene expression in a cell-type specific and developmental stage specific manner. Although experimental technologies have been developed to identify enhancers genome-wide, the design principle of the regulatory elements and the way they rewire the transcriptional regulatory network tempo-spatially are far from clear. At present, developing predictive methods for enhancers, particularly for the cell-type specific activity of enhancers, is central to computational biology. In this review, we survey the current computational approaches for active enhancer prediction and discuss future directions. PMID:23685394

  3. Computed tomography identification of an exophytic colonic liposarcoma.

    PubMed

    Chou, Chung Kuao; Chen, Sung-Ting

    2016-09-01

    It may be difficult to ascertain the relationship between a large intra-abdominal tumor and the adjacent organs if they are close together. In the current case, a definitive preoperative diagnosis of an exophytic colonic tumor was obtained by the demonstration of obtuse angles between the tumor and colon and by distinct recognition of the mucosa-submucosa of the colonic wall on computed tomography; the accuracy of this preoperative diagnosis was subsequently confirmed by pathologic findings. PMID:27594941

  4. A color and texture based multi-level fusion scheme for ethnicity identification

    NASA Astrophysics Data System (ADS)

    Du, Hongbo; Salah, Sheerko Hma; Ahmed, Hawkar O.

    2014-05-01

    Ethnicity identification of face images is of interest in many areas of application. Different from face recognition of individuals, ethnicity identification classifies faces according to the common features of a specific ethnic group. This paper presents a multi-level fusion scheme for ethnicity identification that combines texture features of local areas of a face, using local binary patterns, with color features using HSV binning. The scheme fuses the decisions from a k-nearest neighbor classifier and a support vector machine classifier into a final identification decision. We have tested the scheme on a collection of face images from a number of publicly available databases. The results demonstrate the effectiveness of the combined features and the improvements in identification accuracy achieved by the fusion scheme over identification using individual features and other state-of-the-art techniques.
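
    The texture half of the fusion scheme rests on local binary patterns: each neighbour in a 3x3 window is thresholded against the centre pixel, and the resulting bits form an 8-bit code whose histogram over a face region describes its texture. A minimal sketch, with our own function name and a clockwise bit order (one of several common conventions):

```python
def lbp_value(patch):
    """8-neighbour local binary pattern code for the centre pixel of a
    3x3 patch (list of 3 rows of 3 intensities)."""
    c = patch[1][1]
    # neighbours in clockwise order starting at the top-left corner
    neigh = [patch[0][0], patch[0][1], patch[0][2], patch[1][2],
             patch[2][2], patch[2][1], patch[2][0], patch[1][0]]
    return sum(1 << i for i, p in enumerate(neigh) if p >= c)

# A flat patch sets every bit (all neighbours >= centre); a strict local
# maximum at the centre sets none.
flat = lbp_value([[5, 5, 5], [5, 5, 5], [5, 5, 5]])
peak = lbp_value([[0, 0, 0], [0, 9, 0], [0, 0, 0]])
```

In the full scheme, histograms of these codes from local face areas would be concatenated and fed to the classifiers alongside the HSV colour bins.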

  5. Computationally inexpensive identification of noninformative model parameters by sequential screening

    NASA Astrophysics Data System (ADS)

    Mai, Juliane; Cuntz, Matthias; Zink, Matthias; Thober, Stephan; Kumar, Rohini; Schäfer, David; Schrön, Martin; Craven, John; Rakovec, Oldrich; Spieler, Diana; Prykhodko, Vladyslav; Dalmasso, Giovanni; Musuuza, Jude; Langenberg, Ben; Attinger, Sabine; Samaniego, Luis

    2016-04-01

    Environmental models tend to require increasing computational time and resources as physical process descriptions are improved or new descriptions are incorporated. Many-query applications such as sensitivity analysis or model calibration usually require a large number of model evaluations leading to high computational demand. This often limits the feasibility of rigorous analyses. Here we present a fully automated sequential screening method that selects only informative parameters for a given model output. The method requires a number of model evaluations that is approximately 10 times the number of model parameters. It was tested using the mesoscale hydrologic model mHM in three hydrologically unique European river catchments. It identified around 20 informative parameters out of 52, with different informative parameters in each catchment. The screening method was evaluated with subsequent analyses using all 52 as well as only the informative parameters. Subsequent Sobol's global sensitivity analysis led to almost identical results yet required 40% fewer model evaluations after screening. mHM was calibrated with all and with only informative parameters in the three catchments. Model performances for daily discharge were equally high in both cases with Nash-Sutcliffe efficiencies above 0.82. Calibration using only the informative parameters needed just one third of the number of model evaluations. The universality of the sequential screening method was demonstrated using several general test functions from the literature. We therefore recommend the use of the computationally inexpensive sequential screening method prior to rigorous analyses on complex environmental models.
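
    The mHM screening method itself is more elaborate, but the notion of a "noninformative" parameter can be illustrated with a one-at-a-time perturbation sketch: a parameter is retained only if nudging it changes the model output noticeably. This toy version (names and tolerance ours) is not the paper's sequential algorithm:

```python
def sequential_screen(model, x0, deltas, tol):
    """Return indices of parameters whose perturbation by deltas[i]
    changes the model output by more than tol (i.e. 'informative')."""
    base = model(x0)
    informative = []
    for i, d in enumerate(deltas):
        x = list(x0)
        x[i] += d
        if abs(model(x) - base) > tol:
            informative.append(i)
    return informative

# Toy model: output is sensitive to parameter 0, insensitive to parameter 1
informative = sequential_screen(lambda p: 3.0 * p[0] + 1e-4 * p[1],
                                [1.0, 1.0], [0.1, 0.1], 0.01)
```

Subsequent analyses (sensitivity, calibration) would then vary only the returned parameters, which is where the reported savings in model evaluations come from.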

  6. Computationally inexpensive identification of noninformative model parameters by sequential screening

    NASA Astrophysics Data System (ADS)

    Cuntz, Matthias; Mai, Juliane; Zink, Matthias; Thober, Stephan; Kumar, Rohini; Schäfer, David; Schrön, Martin; Craven, John; Rakovec, Oldrich; Spieler, Diana; Prykhodko, Vladyslav; Dalmasso, Giovanni; Musuuza, Jude; Langenberg, Ben; Attinger, Sabine; Samaniego, Luis

    2015-08-01

    Environmental models tend to require increasing computational time and resources as physical process descriptions are improved or new descriptions are incorporated. Many-query applications such as sensitivity analysis or model calibration usually require a large number of model evaluations leading to high computational demand. This often limits the feasibility of rigorous analyses. Here we present a fully automated sequential screening method that selects only informative parameters for a given model output. The method requires a number of model evaluations that is approximately 10 times the number of model parameters. It was tested using the mesoscale hydrologic model mHM in three hydrologically unique European river catchments. It identified around 20 informative parameters out of 52, with different informative parameters in each catchment. The screening method was evaluated with subsequent analyses using all 52 as well as only the informative parameters. Subsequent Sobol's global sensitivity analysis led to almost identical results yet required 40% fewer model evaluations after screening. mHM was calibrated with all and with only informative parameters in the three catchments. Model performances for daily discharge were equally high in both cases with Nash-Sutcliffe efficiencies above 0.82. Calibration using only the informative parameters needed just one third of the number of model evaluations. The universality of the sequential screening method was demonstrated using several general test functions from the literature. We therefore recommend the use of the computationally inexpensive sequential screening method prior to rigorous analyses on complex environmental models.

  7. All-memristive neuromorphic computing with level-tuned neurons

    NASA Astrophysics Data System (ADS)

    Pantazi, Angeliki; Woźniak, Stanisław; Tuma, Tomas; Eleftheriou, Evangelos

    2016-09-01

    In the new era of cognitive computing, systems will be able to learn and interact with the environment in ways that will drastically enhance the capabilities of current processors, especially in extracting knowledge from vast amount of data obtained from many sources. Brain-inspired neuromorphic computing systems increasingly attract research interest as an alternative to the classical von Neumann processor architecture, mainly because of the coexistence of memory and processing units. In these systems, the basic components are neurons interconnected by synapses. The neurons, based on their nonlinear dynamics, generate spikes that provide the main communication mechanism. The computational tasks are distributed across the neural network, where synapses implement both the memory and the computational units, by means of learning mechanisms such as spike-timing-dependent plasticity. In this work, we present an all-memristive neuromorphic architecture comprising neurons and synapses realized by using the physical properties and state dynamics of phase-change memristors. The architecture employs a novel concept of interconnecting the neurons in the same layer, resulting in level-tuned neuronal characteristics that preferentially process input information. We demonstrate the proposed architecture in the tasks of unsupervised learning and detection of multiple temporal correlations in parallel input streams. The efficiency of the neuromorphic architecture along with the homogenous neuro-synaptic dynamics implemented with nanoscale phase-change memristors represent a significant step towards the development of ultrahigh-density neuromorphic co-processors.

  8. All-memristive neuromorphic computing with level-tuned neurons.

    PubMed

    Pantazi, Angeliki; Woźniak, Stanisław; Tuma, Tomas; Eleftheriou, Evangelos

    2016-09-01

    In the new era of cognitive computing, systems will be able to learn and interact with the environment in ways that will drastically enhance the capabilities of current processors, especially in extracting knowledge from vast amount of data obtained from many sources. Brain-inspired neuromorphic computing systems increasingly attract research interest as an alternative to the classical von Neumann processor architecture, mainly because of the coexistence of memory and processing units. In these systems, the basic components are neurons interconnected by synapses. The neurons, based on their nonlinear dynamics, generate spikes that provide the main communication mechanism. The computational tasks are distributed across the neural network, where synapses implement both the memory and the computational units, by means of learning mechanisms such as spike-timing-dependent plasticity. In this work, we present an all-memristive neuromorphic architecture comprising neurons and synapses realized by using the physical properties and state dynamics of phase-change memristors. The architecture employs a novel concept of interconnecting the neurons in the same layer, resulting in level-tuned neuronal characteristics that preferentially process input information. We demonstrate the proposed architecture in the tasks of unsupervised learning and detection of multiple temporal correlations in parallel input streams. The efficiency of the neuromorphic architecture along with the homogenous neuro-synaptic dynamics implemented with nanoscale phase-change memristors represent a significant step towards the development of ultrahigh-density neuromorphic co-processors. PMID:27455898

  9. MAX--An Interactive Computer Program for Teaching Identification of Clay Minerals by X-ray Diffraction.

    ERIC Educational Resources Information Center

    Kohut, Connie K.; And Others

    1993-01-01

    Discusses MAX, an interactive computer program for teaching identification of clay minerals based on standard x-ray diffraction characteristics. The program provides tutorial-type exercises for identification of 16 clay standards, self-evaluation exercises, diffractograms of 28 soil clay minerals, and identification of nonclay minerals. (MDH)

  10. Cloud identification using genetic algorithms and massively parallel computation

    NASA Technical Reports Server (NTRS)

    Buckles, Bill P.; Petry, Frederick E.

    1996-01-01

    As a Guest Computational Investigator under the NASA administered component of the High Performance Computing and Communication Program, we implemented a massively parallel genetic algorithm on the MasPar SIMD computer. Experiments were conducted using Earth Science data in the domains of meteorology and oceanography. Results obtained in these domains are competitive with, and in most cases better than, similar problems solved using other methods. In the meteorological domain, we chose to identify clouds using AVHRR spectral data. Four cloud speciations were used although most researchers settle for three. Results were remarkably consistent across all tests (91% accuracy). Refinements of this method may lead to more timely and complete information for Global Circulation Models (GCMs) that are prevalent in weather forecasting and global environment studies. In the oceanographic domain, we chose to identify ocean currents from a spectrometer having similar characteristics to AVHRR. Here the results were mixed (60% to 80% accuracy). If one is willing to run the experiment several times (say 10), it is acceptable to claim the higher accuracy rating. This problem has never been successfully automated. Therefore, these results are encouraging even though less impressive than the cloud experiment. Successful completion of an automated ocean current detection system would impact coastal fishing, naval tactics, and the study of micro-climates. Finally, we contributed to the basic knowledge of GA (genetic algorithm) behavior in parallel environments. We developed better knowledge of the use of subpopulations in the context of shared breeding pools and the migration of individuals. Rigorous experiments were conducted based on quantifiable performance criteria. While much of the work confirmed current wisdom, for the first time we were able to submit conclusive evidence. The software developed under this grant was placed in the public domain. An extensive user
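
    The genetic-algorithm machinery the abstract relies on (populations, breeding, mutation) can be illustrated with a minimal serial GA on the "one-max" toy problem; the massively parallel, subpopulation-and-migration aspects of the MasPar implementation are not modeled here:

```python
import random

random.seed(1)

def ga_maximize(fitness, n_bits, pop_size=30, gens=60):
    """Minimal genetic algorithm sketch: bit-string individuals,
    size-2 tournament selection, one-point crossover, bit-flip mutation."""
    pop = [[random.randint(0, 1) for _ in range(n_bits)]
           for _ in range(pop_size)]

    def tournament():
        a, b = random.sample(pop, 2)
        return a if fitness(a) >= fitness(b) else b

    for _ in range(gens):
        nxt = []
        while len(nxt) < pop_size:
            p1, p2 = tournament(), tournament()
            cut = random.randrange(1, n_bits)
            child = p1[:cut] + p2[cut:]       # one-point crossover
            if random.random() < 0.1:         # occasional mutation
                i = random.randrange(n_bits)
                child[i] ^= 1
            nxt.append(child)
        pop = nxt
    return max(pop, key=fitness)

# One-max: the all-ones string is optimal, so sum() is the fitness
best = ga_maximize(sum, 20)
```

In the cloud-identification setting, the bit string would instead encode classifier parameters and the fitness would score agreement with labelled AVHRR pixels.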

  11. Computational identification of MoRFs in protein sequences

    PubMed Central

    Malhis, Nawar; Gsponer, Jörg

    2015-01-01

    Motivation: Intrinsically disordered regions of proteins play an essential role in the regulation of various biological processes. Key to their regulatory function is the binding of molecular recognition features (MoRFs) to globular protein domains in a process known as a disorder-to-order transition. Predicting the location of MoRFs in protein sequences with high accuracy remains an important computational challenge. Method: In this study, we introduce MoRFCHiBi, a new computational approach for fast and accurate prediction of MoRFs in protein sequences. MoRFCHiBi combines the outcomes of two support vector machine (SVM) models that take advantage of two different kernels with high noise tolerance. The first, SVMS, is designed to extract maximal information from the general contrast in amino acid compositions between MoRFs, their surrounding regions (Flanks), and the remainders of the sequences. The second, SVMT, is used to identify similarities between regions in a query sequence and MoRFs of the training set. Results: We evaluated the performance of our predictor by comparing its results with those of two currently available MoRF predictors, MoRFpred and ANCHOR. Using three test sets that have previously been collected and used to evaluate MoRFpred and ANCHOR, we demonstrate that MoRFCHiBi outperforms the other predictors with respect to different evaluation metrics. In addition, MoRFCHiBi is downloadable and fast, which makes it useful as a component in other computational prediction tools. Availability and implementation: http://www.chibi.ubc.ca/morf/. Contact: gsponer@chibi.ubc.ca. Supplementary information: Supplementary data are available at Bioinformatics online. PMID:25637562
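    The first component score described above exploits the contrast in amino acid composition between MoRFs and their surroundings. A toy per-residue composition track, with the residue weights and window sizes invented (this is not the SVMS kernel, just the windowing idea), might look like:

```python
# Hypothetical residue weights: disorder-promoting positive, order-promoting
# negative (illustrative values only, not trained parameters).
WEIGHTS = {aa: 1.0 for aa in "PESQKRG"}
WEIGHTS.update({aa: -1.0 for aa in "WFYILVCM"})

def window_score(seq, w=5):
    """Mean composition score in a +/-w window around each residue."""
    scores = []
    for i in range(len(seq)):
        win = seq[max(0, i - w): i + w + 1]
        scores.append(sum(WEIGHTS.get(a, 0.0) for a in win) / len(win))
    return scores

def combine(s1, s2):
    """Average two component propensity tracks (a stand-in for an SVM blend)."""
    return [(a + b) / 2 for a, b in zip(s1, s2)]

seq = "MKKLLPTAWWFILVSSPESQKRGQES"
track = combine(window_score(seq, 3), window_score(seq, 7))
```

    A real predictor would replace the hand-set weights with learned kernels, but the sliding-window propensity output has the same shape.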

  12. Computer method for identification of boiler transfer functions

    NASA Technical Reports Server (NTRS)

    Miles, J. H.

    1971-01-01

    An iterative computer method is described for identifying boiler transfer functions using frequency response data. An objective penalized performance measure and a nonlinear minimization technique are used to make the locus of points generated by a transfer function resemble the locus of points obtained from frequency response measurements. Different transfer functions can be tried until a satisfactory empirical transfer function for the system is found. To illustrate the method, some examples and some results from a study of a set of data consisting of measurements of the inlet impedance of a single-tube forced-flow boiler with inserts are given.
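    The fitting loop described here, minimizing a penalized error between the model locus and the measured locus, can be sketched with a first-order lag and a crude coordinate-descent search (the model form, data, and search method are invented stand-ins, not the paper's algorithm):

```python
# Synthetic "measured" frequency response of a first-order lag (K=2, tau=0.5).
freqs = [0.1 * i for i in range(1, 40)]            # rad/s
def lag(K, tau, w):
    return K / (1 + 1j * tau * w)
measured = [lag(2.0, 0.5, w) for w in freqs]

def cost(K, tau):
    """Penalized squared error between model and measured response loci."""
    if K <= 0 or tau <= 0:                         # penalty keeps parameters feasible
        return 1e9
    return sum(abs(lag(K, tau, w) - m) ** 2 for w, m in zip(freqs, measured))

# Crude coordinate-descent search standing in for the nonlinear minimizer.
K, tau, step = 1.0, 1.0, 0.5
for _ in range(200):
    for dK, dtau in ((step, 0), (-step, 0), (0, step), (0, -step)):
        if cost(K + dK, tau + dtau) < cost(K, tau):
            K, tau = K + dK, tau + dtau
            break
    else:
        step *= 0.5                                # no move helped; refine the step
print(round(K, 3), round(tau, 3))
```

    Trying a different candidate transfer function amounts to swapping `lag` for another parametric form and rerunning the search.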

  13. Parallel Computation of the Topology of Level Sets

    SciTech Connect

    Pascucci, V; Cole-McLaughlin, K

    2004-12-16

    This paper introduces two efficient algorithms that compute the Contour Tree of a 3D scalar field F and its augmented version with the Betti numbers of each isosurface. The Contour Tree is a fundamental data structure in scientific visualization that is used to preprocess the domain mesh to allow optimal computation of isosurfaces with minimal overhead storage. The Contour Tree can also be used to build user interfaces reporting the complete topological characterization of a scalar field, as shown in Figure 1. Data exploration time is reduced since the user understands the evolution of level set components with changing isovalue. The Augmented Contour Tree provides even more accurate information, segmenting the range space of the scalar field into portions of invariant topology. The exploration time for a single isosurface is also improved since its genus is known in advance. Our first new algorithm augments any given Contour Tree with the Betti numbers of all possible corresponding isocontours in time linear in the size of the tree. Moreover, we show how to extend the scheme introduced in [3] with the Betti number computation without increasing its complexity. Thus, we improve the time complexity of our previous approach [10] from O(m log m) to O(n log n + m), where m is the number of cells and n is the number of vertices in the domain of F. Our second contribution is a new divide-and-conquer algorithm that computes the Augmented Contour Tree with improved efficiency. The approach computes the output Contour Tree by merging two intermediate Contour Trees and is independent of the interpolant. In this way we confine any knowledge regarding a specific interpolant to an independent function that computes the tree for a single cell. We have implemented this function for the trilinear interpolant and plan to replace it with higher order interpolants when needed. The time complexity is O(n + t log n), where t is the number of critical points of F. For the first time
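    The core sweep behind contour-tree construction, tracking how level set components appear and merge as the isovalue increases, can be illustrated with a union-find over a tiny graph (a toy join-tree sweep, not the paper's full algorithm):

```python
class DSU:
    """Disjoint-set union with path halving."""
    def __init__(self, n):
        self.p = list(range(n))
    def find(self, x):
        while self.p[x] != x:
            self.p[x] = self.p[self.p[x]]
            x = self.p[x]
        return x
    def union(self, a, b):
        ra, rb = self.find(a), self.find(b)
        if ra != rb:
            self.p[ra] = rb

def sublevel_components(values, edges):
    """Sweep vertices by increasing value; after each step record the number
    of connected components of the sublevel set, as a join tree would."""
    n = len(values)
    adj = {v: [] for v in range(n)}
    for a, b in edges:
        adj[a].append(b)
        adj[b].append(a)
    order = sorted(range(n), key=lambda v: values[v])
    dsu, active, counts = DSU(n), set(), []
    for v in order:
        active.add(v)
        for u in adj[v]:
            if u in active:           # merge with already-swept neighbors
                dsu.union(u, v)
        comps = len({dsu.find(u) for u in active})
        counts.append((values[v], comps))
    return counts

# Toy 1D field: two minima (values 0 and 1) that merge at the saddle value 3.
vals = [1, 3, 0, 5]
edges = [(0, 1), (1, 2), (2, 3)]
print(sublevel_components(vals, edges))
```

    The component count rises at minima and drops at saddles; recording which components merge at each saddle is exactly the information the join tree (one half of the Contour Tree) stores.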

  14. Bridging Levels of Understanding in Schizophrenia Through Computational Modeling

    PubMed Central

    Anticevic, Alan; Murray, John D.; Barch, Deanna M.

    2015-01-01

    Schizophrenia is an illness with a remarkably complex symptom presentation that has thus far been out of reach of neuroscientific explanation. This presents a fundamental problem for developing better treatments that target specific symptoms or root causes. One promising path forward is the incorporation of computational neuroscience, which provides a way to formalize experimental observations and, in turn, make theoretical predictions for subsequent studies. We review three complementary approaches: (a) biophysically based models developed to test cellular-level and synaptic hypotheses, (b) connectionist models that give insight into large-scale neural-system-level disturbances in schizophrenia, and (c) models that provide a formalism for observations of complex behavioral deficits, such as negative symptoms. We argue that harnessing all of these modeling approaches represents a productive approach for better understanding schizophrenia. We discuss how blending these approaches can allow the field to progress toward a more comprehensive understanding of schizophrenia and its treatment. PMID:25960938

  15. Application of replica plating and computer analysis for rapid identification of bacteria in some foods. I. Identification scheme.

    PubMed

    Corlett, D A; Lee, J S; Sinnhuber, R O

    1965-09-01

    A method was devised and tested for a quantitative identification of microbial flora in foods. The colonies developing on the initial isolation plates were picked with sterile toothpicks and inoculated on a master plate in prearranged spacing and order. The growth on the master plates was then replicated on a series of solid-agar plates containing differential or selective agents. The characteristic growth and physiological responses of microbial isolates to penicillin, tylosin, vancomycin, streptomycin, chloramphenicol, neomycin, colistin, and to S S Agar, Staphylococcus Medium No. 110, and Potato Dextrose Agar were recorded, together with Gram reaction and cell morphology. This information was then fed into an IBM 1410 digital computer which grouped and analyzed each isolate into 10 microbial genera, or groups, according to the identification key. The identification scheme was established by use of reference culture studies and from the literature. This system was used to analyze the microbial flora in dover sole (Microstomus pacificus) and ground beef. The method described in this article enables one to examine large numbers of microbial isolates with simplicity. PMID:5325942
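    The grouping step the computer performed, matching each isolate's response profile against an identification key, reduces to a best-match lookup. A minimal sketch follows; the key entries and test names are invented examples, not the study's actual key:

```python
# Hypothetical identification key: genus -> expected responses per test.
KEY = {
    "Pseudomonas":    {"penicillin": "-", "gram": "-", "rod": "+"},
    "Staphylococcus": {"penicillin": "+", "gram": "+", "rod": "-"},
    "Bacillus":       {"penicillin": "+", "gram": "+", "rod": "+"},
}

def identify(profile):
    """Return the best-matching genus and its fraction of matching tests."""
    def score(genus):
        ref = KEY[genus]
        hits = sum(profile.get(test) == resp for test, resp in ref.items())
        return hits / len(ref)
    best = max(KEY, key=score)
    return best, score(best)

isolate = {"penicillin": "-", "gram": "-", "rod": "+"}
print(identify(isolate))  # → ('Pseudomonas', 1.0)
```

    The 1965 study did essentially this over 10 genera and many more differential tests, with the IBM 1410 performing the bookkeeping.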

  16. Computationally Inexpensive Identification of Non-Informative Model Parameters

    NASA Astrophysics Data System (ADS)

    Mai, J.; Cuntz, M.; Kumar, R.; Zink, M.; Samaniego, L. E.; Schaefer, D.; Thober, S.; Rakovec, O.; Musuuza, J. L.; Craven, J. R.; Spieler, D.; Schrön, M.; Prykhodko, V.; Dalmasso, G.; Langenberg, B.; Attinger, S.

    2014-12-01

    Sensitivity analysis is used, for example, to identify parameters which induce the largest variability in model output and are thus informative during calibration. Variance-based techniques are employed for this purpose, but they unfortunately require a large number of model evaluations and are thus infeasible for complex environmental models. We therefore developed a computationally inexpensive screening method, based on Elementary Effects, that automatically separates informative and non-informative model parameters. The method was tested using the mesoscale hydrologic model (mHM) with 52 parameters. The model was applied in three European catchments with different hydrological characteristics, i.e., the Neckar (Germany), Sava (Slovenia), and Guadalquivir (Spain). The method identified the same informative parameters as the standard Sobol' method but with less than 1% of the model runs. In Germany and Slovenia, 22 of 52 parameters were informative, mostly in the formulations of evapotranspiration, interflow and percolation. In Spain, 19 of 52 parameters were informative, with an increased importance of soil parameters. We showed further that Sobol' indexes calculated for the subset of informative parameters are practically the same as those calculated before the screening, while the number of model runs was reduced by more than 50%. The model mHM was then calibrated twice in the three test catchments: first with all 52 parameters taken into account, and then with only the informative parameters calibrated while all others were kept fixed. The Nash-Sutcliffe efficiencies were 0.87 and 0.83 in Germany, 0.89 and 0.88 in Slovenia, and 0.86 and 0.85 in Spain, respectively. This minor loss of at most 4% in model performance comes along with a substantial decrease of at least 65% in model evaluations. In summary, we propose an efficient screening method to identify non-informative model parameters that can be discarded during further applications. We have shown that sensitivity
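    The Elementary Effects idea, perturbing one parameter at a time and averaging the absolute response, can be sketched on a toy model (the model, sample counts, and informativeness threshold are invented; the real screening uses Morris trajectories over a hydrologic model):

```python
import random

random.seed(1)

def model(x):
    # Toy model: x[0] and x[1] drive the output; x[2] is non-informative.
    return 3 * x[0] + x[1] ** 2 + 0.001 * x[2]

def elementary_effects(model, dim, trajectories=20, delta=0.1):
    """Morris-style one-at-a-time effects; returns mean |EE| per parameter."""
    mu_star = [0.0] * dim
    for _ in range(trajectories):
        x = [random.random() for _ in range(dim)]
        base = model(x)
        for i in range(dim):
            xi = list(x)
            xi[i] += delta                 # perturb one parameter
            mu_star[i] += abs(model(xi) - base) / delta
    return [m / trajectories for m in mu_star]

mu = elementary_effects(model, 3)
# Keep parameters whose mean effect is a non-negligible share of the largest.
informative = [i for i, m in enumerate(mu) if m > 0.05 * max(mu)]
print(informative)
```

    Parameters that fall below the threshold would be fixed at default values, and only the surviving subset passed on to the expensive Sobol' analysis and calibration.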

  17. Frequency domain transfer function identification using the computer program SYSFIT

    SciTech Connect

    Trudnowski, D.J.

    1992-12-01

    Because the primary application of SYSFIT for BPA involves studying power system dynamics, this investigation was geared toward simulating the effects that might be encountered in studying electromechanical oscillations in power systems. Although the intended focus of this work is power system oscillations, the studies are sufficiently generic that the results can be applied to many types of oscillatory systems with closely-spaced modes. In general, there are two possible ways of solving the optimization problem. One is to use a least-squares optimization function and to write the system in such a form that the problem becomes one of linear least-squares. The solution can then be obtained using a standard least-squares technique. The other method involves using a search method to obtain the optimal model. This method allows considerably more freedom in forming the optimization function and model, but it requires an initial guess of the system parameters. SYSFIT employs this second approach. Detailed investigations were conducted into three main areas: (1) fitting to exact frequency response data of a linear system; (2) fitting to the discrete Fourier transformation of noisy data; and (3) fitting to multi-path systems. The first area consisted of investigating the effects of alternative optimization cost function options; different optimization search methods; incorrect model order; missing response data; closely-spaced poles; and closely-spaced pole-zero pairs. Within the second area, different noise colorations and levels were studied. In the third area, methods were investigated for improving fitting results by incorporating more than one system path. The following is a list of guidelines and properties developed from the study for fitting a transfer function to the frequency response of a system using optimization search methods.

  18. Bouc-Wen model parameter identification for a MR fluid damper using computationally efficient GA.

    PubMed

    Kwok, N M; Ha, Q P; Nguyen, M T; Li, J; Samali, B

    2007-04-01

    A non-symmetrical Bouc-Wen model is proposed in this paper for magnetorheological (MR) fluid dampers. The model considers the effect of non-symmetrical hysteresis, which had not been taken into account in the original Bouc-Wen model. The model parameters are identified with a Genetic Algorithm (GA), exploiting its flexibility in identifying complex dynamics. The computational efficiency of the proposed GA is improved by absorbing the selection stage into the crossover and mutation operations. Crossover and mutation are also made adaptive to the fitness values so that their probabilities need not be user-specified. Instead of using a sufficient number of generations or a pre-determined fitness value, the algorithm termination criterion is formulated on the basis of a statistical hypothesis test, thus enhancing the performance of the parameter identification. Experimental test data of the damper displacement and force are used to verify the proposed approach, with satisfactory parameter identification results. PMID:17349644
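    For context, the forward model being identified, the symmetric Bouc-Wen hysteresis, can be simulated with simple Euler integration. The sketch below uses invented parameter values and the classic symmetric form (the paper's contribution is a non-symmetrical variant plus the GA identification, neither shown here):

```python
import math

def bouc_wen_force(t_end=2.0, dt=1e-3, A=1.0, beta=0.5, gamma=0.5, n=2,
                   c=1.0, k=2.0, alpha=5.0, amp=1.0, freq=1.0):
    """Euler-integrate the symmetric Bouc-Wen hysteresis state z under a
    sinusoidal displacement x(t) = amp*sin(2*pi*freq*t); return (x, force)."""
    z, out = 0.0, []
    w = 2 * math.pi * freq
    for s in range(int(t_end / dt)):
        t = s * dt
        x = amp * math.sin(w * t)
        xdot = amp * w * math.cos(w * t)
        # Evolution law of the hysteretic variable z.
        zdot = (A * xdot
                - beta * abs(xdot) * abs(z) ** (n - 1) * z
                - gamma * xdot * abs(z) ** n)
        z += zdot * dt
        # Damper force: viscous + stiffness + hysteretic contributions.
        out.append((x, c * xdot + k * x + alpha * z))
    return out

loop = bouc_wen_force()
```

    A GA-based identification would repeatedly run such a simulation with candidate parameters and score each candidate against the measured displacement-force loop.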

  19. Computer experiments in preparation of system identification from transient rotor model tests, part 2

    NASA Technical Reports Server (NTRS)

    Hohenemser, K. H.; Yin, S. K.

    1974-01-01

    System identification methods which can extract model rotor parameters with reasonable accuracy from noise-polluted blade flapping transient measurements were developed. Usually, parameter identification requires data on the state variables, that is, on deflections and on rates of deflection. The small size of rotor models, however, makes it difficult to measure more than the blade flapping deflections. For the computer experiments it was therefore assumed that only noisy deflection measurements are available. Parameter identifications were performed for one and two unknown parameters. Both rotating coordinates and multiblade coordinates were used. It was found that data processing with a digital filter allowed, by numerical differentiation, a sufficiently accurate determination of the rates of deflection and of the accelerations to obtain reasonable parameter estimates with a simple linear estimator.
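    The filter-then-differentiate step, recovering rates from noisy deflection measurements alone, can be sketched as follows (a moving-average filter and wide central difference are stand-ins for the actual digital filter; the signal and noise level are invented):

```python
import math
import random

random.seed(0)
dt = 0.01
t = [i * dt for i in range(500)]
# Noisy "deflection" measurements of a smooth signal sin(t).
noisy = [math.sin(x) + random.gauss(0, 0.05) for x in t]

def smooth(y, w=15):
    """Zero-phase moving average: a crude stand-in for the digital filter."""
    h = w // 2
    return [sum(y[max(0, i - h): i + h + 1]) / len(y[max(0, i - h): i + h + 1])
            for i in range(len(y))]

def wide_diff(y, dt, k=10):
    """Central difference over a wide stencil to limit noise amplification."""
    return [(y[i + k] - y[i - k]) / (2 * k * dt) for i in range(k, len(y) - k)]

rate = wide_diff(smooth(noisy), dt)
# rate[j] estimates the derivative at t[j + 10]; the true rate is cos(t).
err = max(abs(r - math.cos(t[j + 10])) for j, r in enumerate(rate))
```

    Differentiating the raw samples directly would amplify the noise by roughly 1/dt; filtering first is what makes the recovered rates usable in a linear estimator.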

  20. Assessing the precision of high-throughput computational and laboratory approaches for the genome-wide identification of protein subcellular localization in bacteria

    PubMed Central

    Rey, Sébastien; Gardy, Jennifer L; Brinkman, Fiona SL

    2005-01-01

    Background Identification of a bacterial protein's subcellular localization (SCL) is important for genome annotation, function prediction and drug or vaccine target identification. Subcellular fractionation techniques combined with recent proteomics technology permit the identification of large numbers of proteins from distinct bacterial compartments. However, the fractionation of a complex structure like the cell into several subcellular compartments is not a trivial task. Contamination from other compartments may occur, and some proteins may reside in multiple localizations. New computational methods have been reported over the past few years that now permit much more accurate, genome-wide analysis of the SCL of protein sequences deduced from genomes. There is a need to compare such computational methods with laboratory proteomics approaches to identify the most effective current approach for genome-wide localization characterization and annotation. Results In this study, ten subcellular proteome analyses of bacterial compartments were reviewed. PSORTb version 2.0 was used to computationally predict the localization of proteins reported in these publications, and these computational predictions were then compared to the localizations determined by the proteomics study. By using a combined approach, we were able to identify a number of contaminants and proteins with dual localizations, and were able to more accurately identify membrane subproteomes. Our results allowed us to estimate the precision level of laboratory subproteome studies and we show here that, on average, recent high-precision computational methods such as PSORTb now have a lower error rate than laboratory methods. Conclusion We have performed the first focused comparison of genome-wide proteomic and computational methods for subcellular localization identification, and show that computational methods have now attained a level of precision that is exceeding that of high-throughput laboratory

  1. The role of computational methods in the identification of bioactive compounds.

    PubMed

    Glick, Meir; Jacoby, Edgar

    2011-08-01

    Computational methods play an ever-increasing role in lead finding. A vast repertoire of molecular design and virtual screening methods emerged in the past two decades and are today routinely used. There is increasing awareness that there is no single best computational protocol and, correspondingly, a shift toward recommending the combination of complementary methods. A promising trend for the application of computational methods in lead finding is to take advantage of the vast amounts of HTS (High Throughput Screening) data to allow lead assessment by detailed systems-based data analysis, especially for phenotypic screens where the identification of compound-target pairs is the primary goal. Herein, we review trends and provide examples of successful applications of computational methods in lead finding. PMID:21411361

  2. Identification of Restrictive Computer and Software Variables among Preoperational Users of a Computer Learning Center.

    ERIC Educational Resources Information Center

    Kozubal, Diane K.

    While manufacturers have produced a wide variety of software said to be easy for even the youngest child to use, there are conflicting perspectives on computer issues such as ease of use, influence on meeting educational objectives, effects on procedural learning, and rationale for use with young children. Addressing these concerns, this practicum…

  3. Automatic Measurement of Water Levels by Using Image Identification Method in Open Channel

    NASA Astrophysics Data System (ADS)

    Chung Yang, Han; Xue Yang, Jia

    2014-05-01

    Water level data are indispensable to hydrology research and provide important information for hydraulic engineering and the overall utilization of water resources. Water level information can be transmitted over a network to the management office so that staff know whether the river level exceeds the warning line. Existing water level measurement methods present water levels only as numbers, without images: the data remain abstract and lack a sense of the actual scene, and events such as a rising or overflowing river cannot be captured at the same time. This research therefore employs a new, improved method for water level measurement. A video surveillance system records images on site; an image of the water surface is snapped, pre-processed, and compared with altitude reference values to obtain the water level altitude. With ever-growing technology, image identification is being applied more and more widely, and this research uses it to analyze water levels automatically. The image observation method used here is a non-contact water level gauge, but it differs considerably from other such gauges: it is cheap, the facilities can be set up beside a river embankment or near houses so that the impact of external factors is significantly reduced, and a real scene picture is transmitted wirelessly. In dynamic water flow tests held in an indoor experimental channel, the water level identification errors were all less than 2%, meaning that image identification achieved usable results at different water levels. This new measurement method can offer instant river level figures and on-site video so that a
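    The core image-to-altitude step, finding the waterline in a snapped image and mapping its pixel row to an altitude via surveyed reference marks, can be sketched on a synthetic brightness column (the brightness values, reference rows, and altitudes are invented):

```python
def water_level(column, row_a, alt_a, row_b, alt_b):
    """Locate the waterline as the largest brightness jump along a pixel
    column, then interpolate altitude from two surveyed reference rows."""
    jumps = [abs(column[r + 1] - column[r]) for r in range(len(column) - 1)]
    row = jumps.index(max(jumps))                 # waterline row index
    # Linear row -> altitude mapping fixed by the two reference marks.
    alt = alt_a + (row - row_a) * (alt_b - alt_a) / (row_b - row_a)
    return row, alt

# Synthetic column: bright sky/bank (200) above darker water (60).
column = [200] * 120 + [60] * 80
row, alt = water_level(column, row_a=0, alt_a=105.0, row_b=199, alt_b=95.0)
print(row, round(alt, 2))  # → 119 99.02
```

    A real system would pre-process the frame (de-noise, correct perspective) and average over many columns, but the row-to-altitude calibration is the same idea.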

  4. The Use of Computer-Assisted Identification of ARIMA Time-Series.

    ERIC Educational Resources Information Center

    Brown, Roger L.

    This study was conducted to determine the effects of using various levels of tutorial statistical software for the tentative identification of nonseasonal ARIMA models, a statistical technique proposed by Box and Jenkins for the interpretation of time-series data. The Box-Jenkins approach is an iterative process encompassing several stages of…

  5. Computational Identification of Key Regulators in Two Different Colorectal Cancer Cell Lines

    PubMed Central

    Wlochowitz, Darius; Haubrock, Martin; Arackal, Jetcy; Bleckmann, Annalen; Wolff, Alexander; Beißbarth, Tim; Wingender, Edgar; Gültas, Mehmet

    2016-01-01

    Transcription factors (TFs) are gene regulatory proteins that are essential for an effective regulation of the transcriptional machinery. Today, it is known that their expression plays an important role in several types of cancer. Computational identification of key players in specific cancer cell lines is still an open challenge in cancer research. In this study, we present a systematic approach which combines colorectal cancer (CRC) cell lines, namely 1638N-T1 and CMT-93, and well-established computational methods in order to compare these cell lines on the level of transcriptional regulation as well as on a pathway level, i.e., the cancer cell-intrinsic pathway repertoire. For this purpose, we firstly applied the Trinity platform to detect signature genes, and then applied analyses of the geneXplain platform to these for detection of upstream transcriptional regulators and their regulatory networks. We created a CRC-specific position weight matrix (PWM) library based on the TRANSFAC database (release 2014.1) to minimize the rate of false predictions in the promoter analyses. Using our proposed workflow, we specifically focused on revealing the similarities and differences in transcriptional regulation between the two CRC cell lines, and report a number of well-known, cancer-associated TFs with significantly enriched binding sites in the promoter regions of the signature genes. We show that, although the signature genes of both cell lines show no overlap, they may still be regulated by common TFs in CRC. Based on our findings, we suggest that canonical Wnt signaling is activated in 1638N-T1, but inhibited in CMT-93 through cross-talks of Wnt signaling with the VDR signaling pathway and/or LXR-related pathways. Furthermore, our findings provide indication of several master regulators being present such as MLK3 and Mapk1 (ERK2) which might be important in cell proliferation, migration, and invasion of 1638N-T1 and CMT-93, respectively. Taken together, we provide
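    The promoter analysis described above scans sequences with position weight matrices (PWMs) to find enriched binding sites. A minimal log-odds PWM scan looks like the following; the matrix counts and score threshold are invented, not TRANSFAC entries:

```python
import math

# Hypothetical 4-position count matrix (consensus ACGT), illustrative only.
COUNTS = [{"A": 8, "C": 1, "G": 1, "T": 0},
          {"A": 0, "C": 9, "G": 1, "T": 0},
          {"A": 1, "C": 0, "G": 9, "T": 0},
          {"A": 0, "C": 0, "G": 0, "T": 10}]
BG = 0.25          # uniform background base frequency
PSEUDO = 0.5       # pseudocount to avoid log(0)

def log_odds(col, base):
    total = sum(col.values()) + 4 * PSEUDO
    p = (col[base] + PSEUDO) / total
    return math.log2(p / BG)

def scan(seq, threshold=4.0):
    """Slide the PWM along seq; return (position, score) for hits."""
    hits = []
    for i in range(len(seq) - len(COUNTS) + 1):
        s = sum(log_odds(COUNTS[j], seq[i + j]) for j in range(len(COUNTS)))
        if s >= threshold:
            hits.append((i, round(s, 2)))
    return hits

print(scan("TTACGTACGTAA"))
```

    Enrichment analysis then asks whether such hits occur in the promoters of signature genes more often than expected from a background set.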

  6. Computational Identification of Key Regulators in Two Different Colorectal Cancer Cell Lines.

    PubMed

    Wlochowitz, Darius; Haubrock, Martin; Arackal, Jetcy; Bleckmann, Annalen; Wolff, Alexander; Beißbarth, Tim; Wingender, Edgar; Gültas, Mehmet

    2016-01-01

    Transcription factors (TFs) are gene regulatory proteins that are essential for an effective regulation of the transcriptional machinery. Today, it is known that their expression plays an important role in several types of cancer. Computational identification of key players in specific cancer cell lines is still an open challenge in cancer research. In this study, we present a systematic approach which combines colorectal cancer (CRC) cell lines, namely 1638N-T1 and CMT-93, and well-established computational methods in order to compare these cell lines on the level of transcriptional regulation as well as on a pathway level, i.e., the cancer cell-intrinsic pathway repertoire. For this purpose, we firstly applied the Trinity platform to detect signature genes, and then applied analyses of the geneXplain platform to these for detection of upstream transcriptional regulators and their regulatory networks. We created a CRC-specific position weight matrix (PWM) library based on the TRANSFAC database (release 2014.1) to minimize the rate of false predictions in the promoter analyses. Using our proposed workflow, we specifically focused on revealing the similarities and differences in transcriptional regulation between the two CRC cell lines, and report a number of well-known, cancer-associated TFs with significantly enriched binding sites in the promoter regions of the signature genes. We show that, although the signature genes of both cell lines show no overlap, they may still be regulated by common TFs in CRC. Based on our findings, we suggest that canonical Wnt signaling is activated in 1638N-T1, but inhibited in CMT-93 through cross-talks of Wnt signaling with the VDR signaling pathway and/or LXR-related pathways. Furthermore, our findings provide indication of several master regulators being present such as MLK3 and Mapk1 (ERK2) which might be important in cell proliferation, migration, and invasion of 1638N-T1 and CMT-93, respectively. Taken together, we provide

  7. Computer scoring of the Levels of Emotional Awareness Scale.

    PubMed

    Barchard, Kimberly A; Bajgar, Jane; Leaf, Duncan Ermini; Lane, Richard D

    2010-05-01

    The Levels of Emotional Awareness Scale (LEAS; Lane, Quinlan, Schwartz, Walker, & Zeitlan, 1990) is the most commonly used measure of differentiation and complexity in the use of emotion words and is associated with important clinical outcomes. Hand scoring the LEAS is time consuming. Existing programs for scoring open-ended responses cannot mimic LEAS hand scoring. Therefore, Leaf and Barchard (2006) developed the Program for Open-Ended Scoring (POES) to score the LEAS. In this article, we report a study in which the reliability and validity of POES scoring were examined. In the study, we used three participant types (adult community members, university students, children), three LEAS versions (paper based, computer based, and the LEAS for children), and a diverse set of criterion variables. Across this variety of conditions, the four POES scoring methods had internal consistencies and validities that were comparable to hand scoring, indicating that POES scoring can be used in clinical practice and other applied settings in which hand scoring is impractical. PMID:20479190

  8. Computer-assisted photo identification outperforms visible implant elastomers in an endangered salamander, Eurycea tonkawae.

    PubMed

    Bendik, Nathan F; Morrison, Thomas A; Gluesenkamp, Andrew G; Sanders, Mark S; O'Donnell, Lisa J

    2013-01-01

    Despite recognition that nearly one-third of the 6300 amphibian species are threatened with extinction, our understanding of the general ecology and population status of many amphibians is relatively poor. A widely-used method for monitoring amphibians involves injecting captured individuals with unique combinations of colored visible implant elastomer (VIE). We compared VIE identification to a less-invasive method - computer-assisted photographic identification (photoID) - in endangered Jollyville Plateau salamanders (Eurycea tonkawae), a species with a known range limited to eight stream drainages in central Texas. We based photoID on the unique pigmentation patterns on the dorsal head region of 1215 individual salamanders using identification software Wild-ID. We compared the performance of photoID methods to VIEs using both 'high-quality' and 'low-quality' images, which were taken using two different camera types and technologies. For high-quality images, the photoID method had a false rejection rate of 0.76% compared to 1.90% for VIEs. Using a comparable dataset of lower-quality images, the false rejection rate was much higher (15.9%). Photo matching scores were negatively correlated with time between captures, suggesting that evolving natural marks could increase misidentification rates in longer term capture-recapture studies. Our study demonstrates the utility of large-scale capture-recapture using photo identification methods for Eurycea and other species with stable natural marks that can be reliably photographed. PMID:23555669

  9. Computer-Assisted Photo Identification Outperforms Visible Implant Elastomers in an Endangered Salamander, Eurycea tonkawae

    PubMed Central

    Bendik, Nathan F.; Morrison, Thomas A.; Gluesenkamp, Andrew G.; Sanders, Mark S.; O’Donnell, Lisa J.

    2013-01-01

    Despite recognition that nearly one-third of the 6300 amphibian species are threatened with extinction, our understanding of the general ecology and population status of many amphibians is relatively poor. A widely-used method for monitoring amphibians involves injecting captured individuals with unique combinations of colored visible implant elastomer (VIE). We compared VIE identification to a less-invasive method – computer-assisted photographic identification (photoID) – in endangered Jollyville Plateau salamanders (Eurycea tonkawae), a species with a known range limited to eight stream drainages in central Texas. We based photoID on the unique pigmentation patterns on the dorsal head region of 1215 individual salamanders using identification software Wild-ID. We compared the performance of photoID methods to VIEs using both ‘high-quality’ and ‘low-quality’ images, which were taken using two different camera types and technologies. For high-quality images, the photoID method had a false rejection rate of 0.76% compared to 1.90% for VIEs. Using a comparable dataset of lower-quality images, the false rejection rate was much higher (15.9%). Photo matching scores were negatively correlated with time between captures, suggesting that evolving natural marks could increase misidentification rates in longer term capture-recapture studies. Our study demonstrates the utility of large-scale capture-recapture using photo identification methods for Eurycea and other species with stable natural marks that can be reliably photographed. PMID:23555669

  10. Evolutionary Computation for the Identification of Emergent Behavior in Autonomous Systems

    NASA Technical Reports Server (NTRS)

    Terrile, Richard J.; Guillaume, Alexandre

    2009-01-01

    Over the past several years the Center for Evolutionary Computation and Automated Design at the Jet Propulsion Laboratory has developed a technique based on Evolutionary Computational Methods (ECM) that allows for the automated optimization of complex computationally modeled systems. An important application of this technique is for the identification of emergent behaviors in autonomous systems. Mobility platforms such as rovers or airborne vehicles are now being designed with autonomous mission controllers that can find trajectories over a solution space that is larger than can reasonably be tested. It is critical to identify control behaviors that are not predicted and can have surprising results (both good and bad). These emergent behaviors need to be identified, characterized and either incorporated into or isolated from the acceptable range of control characteristics. We use cluster analysis of automatically retrieved solutions to identify isolated populations of solutions with divergent behaviors.
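    The final step mentioned, cluster analysis of retrieved solutions to isolate divergent behaviors, can be sketched with plain k-means on 2D behavior descriptors (the data, seeding rule, and cluster count are invented for illustration):

```python
import math
import random

random.seed(0)

def kmeans(points, k=2, iters=20):
    """Plain k-means over behavior descriptors; the minority cluster flags
    candidate emergent behaviors."""
    # Deterministic spread-out seeds (lexicographic extremes; k=2 only).
    centers = [min(points), max(points)]
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for p in points:
            j = min(range(k), key=lambda c: math.dist(p, centers[c]))
            groups[j].append(p)
        centers = [tuple(sum(c) / len(g) for c in zip(*g)) if g else centers[j]
                   for j, g in enumerate(groups)]
    return centers, groups

# 28 nominal solutions near (0, 0) plus 2 divergent ones near (8, 8).
pts = [(random.gauss(0, 0.5), random.gauss(0, 0.5)) for _ in range(28)]
pts += [(8 + random.gauss(0, 0.3), 8 + random.gauss(0, 0.3)) for _ in range(2)]
centers, groups = kmeans(pts, k=2)
sizes = sorted(len(g) for g in groups)
print(sizes)
```

    Small, isolated clusters are the ones worth inspecting by hand: they correspond to control behaviors the designers did not predict.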

  11. Levels of Conformity to Islamic Values and the Process of Identification.

    ERIC Educational Resources Information Center

    Nassir, Balkis

    This study was conducted to measure the conformity levels and the identification process among university women students in an Islamic culture. Identity/conformity tests and costume identity tests were administered to 129 undergraduate female students at King Abdulaziz University in Saudi Arabia. The Photographic Costume Identity Test and the…

  12. THE INFLUENCE OF AN INDIVIDUAL'S COGNITIVE STYLE UPON CONCEPT IDENTIFICATION AT VARYING LEVELS OF COMPLEXITY.

    ERIC Educational Resources Information Center

    DAVIS, J.K.

    This experiment examined the extent to which an individual's cognitive style influenced his performance on concept identification problems of varying levels of complexity. Cognitive style was operationally defined in terms of an individual's performance on the Hidden Figures Test (HFT). It was assumed that subjects (Ss) able to identify the hidden…

  13. Computational Identification of MoRFs in Protein Sequences Using Hierarchical Application of Bayes Rule

    PubMed Central

    Malhis, Nawar; Wong, Eric T. C.; Nassar, Roy; Gsponer, Jörg

    2015-01-01

    Motivation: Intrinsically disordered regions of proteins play an essential role in the regulation of various biological processes. Key to their regulatory function is often the binding to globular protein domains via sequence elements known as molecular recognition features (MoRFs). Development of computational tools for the identification of candidate MoRF locations in amino acid sequences is an important task and an area of growing interest. Given the relative sparseness of MoRFs in protein sequences, the accuracy of the available MoRF predictors is often inadequate for practical usage, which leaves a significant need and room for improvement. In this work, we introduce MoRFCHiBi_Web, which predicts MoRF locations in protein sequences with higher accuracy compared to current MoRF predictors. Methods: Three distinct and largely independent property scores are computed with component predictors and then combined to generate the final MoRF propensity scores. The first score reflects the likelihood of sequence windows to harbour MoRFs and is based on amino acid composition and sequence similarity information. It is generated by MoRFCHiBi using small windows of up to 40 residues in size. The second score identifies long stretches of protein disorder and is generated by ESpritz with the DisProt option. Lastly, the third score reflects residue conservation and is assembled from PSSM files generated by PSI-BLAST. These propensity scores are processed and then hierarchically combined using Bayes rule to generate the final MoRFCHiBi_Web predictions. Results: MoRFCHiBi_Web was tested on three datasets. Results show that MoRFCHiBi_Web outperforms previously developed predictors by generating less than half the false positive rate for the same true positive rate at practical threshold values. This level of accuracy paired with its relatively high processing speed makes MoRFCHiBi_Web a practical tool for MoRF prediction. Availability: http://morf.chibi.ubc.ca:8080/morf/. PMID
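As a rough illustration of combining component scores with Bayes rule, the sketch below treats the per-residue scores as conditionally independent likelihood ratios. That independence assumption and the function name are ours; the paper's actual hierarchical combination is more refined:

```python
# Naive-Bayes combination of per-residue propensity scores, assuming each
# component score acts as a likelihood ratio P(score | MoRF) / P(score | not MoRF).
def combine_bayes(prior, likelihood_ratios):
    """Posterior P(MoRF | scores) from a prior and per-predictor likelihood ratios."""
    odds = prior / (1.0 - prior)           # prior odds
    for lr in likelihood_ratios:
        odds *= lr                         # multiply in each predictor's evidence
    return odds / (1.0 + odds)             # back to a probability
```

With a prior of 0.05 (MoRFs are sparse), three predictors each reporting a likelihood ratio above 1 raise the posterior; ratios below 1 lower it.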

  14. Identifying the Computer Competency Levels of Recreation Department Undergraduates

    ERIC Educational Resources Information Center

    Zorba, Erdal

    2011-01-01

    Computer-based and web-based applications serve as major instructional tools to increase undergraduates' motivation at school. In the recreation field, usage of computer- and internet-based recreational applications has become more prevalent in order to present visual and interactive entertainment activities. Recreation department undergraduates…

  15. Automatic de-identification of electronic medical records using token-level and character-level conditional random fields.

    PubMed

    Liu, Zengjian; Chen, Yangxin; Tang, Buzhou; Wang, Xiaolong; Chen, Qingcai; Li, Haodi; Wang, Jingfeng; Deng, Qiwen; Zhu, Suisong

    2015-12-01

    De-identification, identifying and removing all protected health information (PHI) present in clinical data including electronic medical records (EMRs), is a critical step in making clinical data publicly available. The 2014 i2b2 (Center of Informatics for Integrating Biology and Bedside) clinical natural language processing (NLP) challenge set up a track for de-identification (track 1). In this study, we propose a hybrid system based on both machine learning and rule-based approaches for the de-identification track. In our system, PHI instances are first identified by two conditional random fields (CRFs), one token-level and one character-level, and a rule-based classifier, and are then merged by a set of rules. Experiments conducted on the i2b2 corpus show that the system we submitted for the challenge achieves the highest micro F-scores of 94.64%, 91.24% and 91.63% under the "token", "strict" and "relaxed" criteria respectively, placing it among the top-ranked systems of the 2014 i2b2 challenge. After integrating some refined localization dictionaries, our system is further improved, with F-scores of 94.83%, 91.57% and 91.95% under the "token", "strict" and "relaxed" criteria respectively. PMID:26122526
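The span-merging step can be illustrated with one toy rule, keep the longer of any two overlapping predicted PHI spans. The system's actual merge rules are more elaborate; all names here are illustrative:

```python
# Toy merge of PHI spans (start, end, label) predicted by token-level and
# character-level CRFs: on overlap, prefer the longer span.
def merge_spans(*span_lists):
    spans = sorted({s for lst in span_lists for s in lst},
                   key=lambda s: (s[0], -(s[1] - s[0])))  # by start, longest first
    merged = []
    for s in spans:
        if merged and s[0] < merged[-1][1]:               # overlaps last kept span
            if s[1] - s[0] > merged[-1][1] - merged[-1][0]:
                merged[-1] = s                            # keep the longer span
        else:
            merged.append(s)
    return merged
```

For example, if the token-level CRF marks characters 0-5 as NAME and the character-level CRF marks 0-7, the merged output keeps the wider 0-7 span.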

  16. Improving protein identification from peptide mass fingerprinting through a parameterized multi-level scoring algorithm and an optimized peak detection.

    PubMed

    Gras, R; Müller, M; Gasteiger, E; Gay, S; Binz, P A; Bienvenut, W; Hoogland, C; Sanchez, J C; Bairoch, A; Hochstrasser, D F; Appel, R D

    1999-12-01

    We have developed a new algorithm to identify proteins by means of peptide mass fingerprinting. Starting from the matrix-assisted laser desorption/ionization-time-of-flight (MALDI-TOF) spectra and environmental data such as species, isoelectric point and molecular weight, as well as chemical modifications or number of missed cleavages of a protein, the program performs a fully automated identification of the protein. The first step is a peak detection algorithm, which allows precise and fast determination of peptide masses, even if the peaks are of low intensity or they overlap. In the second step the masses and environmental data are used by the identification algorithm to search protein sequence databases (SWISS-PROT and/or TrEMBL) for protein entries that match the input data. A list of candidate proteins is then selected from the database, and a score calculation provides a ranking according to the quality of the match. To define the most discriminating scoring calculation, we analyzed the role of each parameter in two directions: the first concerns filtering and exploratory effects, while the second concerns the levels at which the parameters intervene in the identification process. According to our analysis, all input parameters contribute to the score, although with different weights. Since it is difficult to estimate the weights in advance, they were computed with a genetic algorithm, using a training set of 91 protein spectra with their environmental data. We tested the resulting scoring calculation on a test set of ten proteins and compared the identification results with those of other peptide mass fingerprinting programs. PMID:10612280
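A minimal sketch of a weighted peak-matching score of the kind described above, assuming a fixed mass tolerance and hand-picked weights rather than the trained values from the paper:

```python
# Sketch of a weighted peak-matching score for peptide mass fingerprinting.
# Tolerance and weights are illustrative placeholders.
def pmf_score(observed, theoretical, tol=0.2, w_match=1.0, w_miss=-0.1):
    """Reward observed masses that match a theoretical peptide mass within
    tol; penalize unmatched peaks; normalize by spectrum size."""
    score = 0.0
    for m in observed:
        if any(abs(m - t) <= tol for t in theoretical):
            score += w_match
        else:
            score += w_miss
    return score / max(len(observed), 1)
```

Candidate database entries would be ranked by this score; in the real program the weights themselves are optimized on training spectra.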

  17. Social Identification and Interpersonal Communication in Computer-Mediated Communication: What You Do versus Who You Are in Virtual Groups

    ERIC Educational Resources Information Center

    Wang, Zuoming; Walther, Joseph B.; Hancock, Jeffrey T.

    2009-01-01

    This study investigates the influence of interpersonal communication and intergroup identification on members' evaluations of computer-mediated groups. Participants (N= 256) in 64 four-person groups interacted through synchronous computer chat. Subgroup assignments to minimal groups instilled significantly greater in-group versus out-group…

  18. Computational aspects of hot-wire identification of thermal conductivity and diffusivity under high temperature

    NASA Astrophysics Data System (ADS)

    Vala, Jiří; Jarošová, Petra

    2016-07-01

    Development of advanced materials resistant to high temperature, needed in particular for the design of heat storage in low-energy and passive buildings, requires simple, inexpensive and reliable methods for identifying their temperature-sensitive thermal conductivity and diffusivity; this covers both a well-designed experimental setup and the implementation of robust, efficient computational algorithms. Special geometrical configurations offer the possibility of quasi-analytical evaluation of temperature development for direct problems, whereas inverse problems of simultaneous evaluation of thermal conductivity and diffusivity must be handled carefully, using least-squares (minimum-variance) arguments. This paper demonstrates a proper mathematical and computational approach to such a model problem, exploiting the radial symmetry of hot-wire measurements, including its numerical implementation.

  19. Identification of oxygen-related midgap level in GaAs

    NASA Technical Reports Server (NTRS)

    Lagowski, J.; Lin, D. G.; Gatos, H. C.; Aoyama, T.

    1984-01-01

    An oxygen-related deep level ELO was identified in GaAs employing Bridgman-grown crystals with controlled oxygen doping. The activation energy of ELO is almost the same as that of the dominant midgap level: EL2. This fact impedes the identification of ELO by standard deep level transient spectroscopy. However, it was found that the electron capture cross section of ELO is about four times greater than that of EL2. This characteristic served as the basis for the separation and quantitative investigation of ELO employing detailed capacitance transient measurements in conjunction with reference measurements on crystals grown without oxygen doping and containing only EL2.

  20. Matrix Transformations in Lower Level Computer Graphics Course.

    ERIC Educational Resources Information Center

    Ying, Dao-ning

    1982-01-01

    Presents computer programs (Applesoft Basic) for: (1) 2-D rotation about any point through any angle; (2) matrix transformation for 2-D rotation; (3) 3-D translation; (4) 3-D rotation; and (5) hyperboloid rotated in 2-D space. Includes background information and sample output for the matrix transformation subroutines. (JN)
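The first subroutine listed above, 2-D rotation about any point, can be expressed as a product of homogeneous 3x3 matrices: translate the center to the origin, rotate, translate back. A Python rendering (the original course programs were Applesoft BASIC):

```python
from math import cos, sin, radians

def matmul(A, B):
    """3x3 matrix product."""
    return [[sum(A[i][k] * B[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def rotate_about(cx, cy, angle_deg):
    """Homogeneous matrix rotating by angle_deg about the point (cx, cy)."""
    a = radians(angle_deg)
    T1 = [[1, 0, -cx], [0, 1, -cy], [0, 0, 1]]          # move center to origin
    R = [[cos(a), -sin(a), 0], [sin(a), cos(a), 0], [0, 0, 1]]
    T2 = [[1, 0, cx], [0, 1, cy], [0, 0, 1]]            # move center back
    return matmul(T2, matmul(R, T1))

def apply(M, x, y):
    """Apply a homogeneous transform to the point (x, y)."""
    p = (x, y, 1)
    return (sum(M[0][k] * p[k] for k in range(3)),
            sum(M[1][k] * p[k] for k in range(3)))
```

Rotating (2, 1) by 90 degrees about (1, 1), for instance, lands on (1, 2).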

  1. Computational Prediction of Electron Ionization Mass Spectra to Assist in GC/MS Compound Identification.

    PubMed

    Allen, Felicity; Pon, Allison; Greiner, Russ; Wishart, David

    2016-08-01

    We describe a tool, competitive fragmentation modeling for electron ionization (CFM-EI) that, given a chemical structure (e.g., in SMILES or InChI format), computationally predicts an electron ionization mass spectrum (EI-MS) (i.e., the type of mass spectrum commonly generated by gas chromatography mass spectrometry). The predicted spectra produced by this tool can be used for putative compound identification, complementing measured spectra in reference databases by expanding the range of compounds able to be considered when availability of measured spectra is limited. The tool extends CFM-ESI, a recently developed method for computational prediction of electrospray tandem mass spectra (ESI-MS/MS), but unlike CFM-ESI, CFM-EI can handle odd-electron ions and isotopes and incorporates an artificial neural network. Tests on EI-MS data from the NIST database demonstrate that CFM-EI is able to model fragmentation likelihoods in low-resolution EI-MS data, producing predicted spectra whose dot product scores are significantly better than full enumeration "bar-code" spectra. CFM-EI also outperformed previously reported results for MetFrag, MOLGEN-MS, and Mass Frontier on one compound identification task. It also outperformed MetFrag in a range of other compound identification tasks involving a much larger data set, containing both derivatized and nonderivatized compounds. While replicate EI-MS measurements of chemical standards are still a more accurate point of comparison, CFM-EI's predictions provide a much-needed alternative when no reference standard is available for measurement. CFM-EI is available at https://sourceforge.net/projects/cfm-id/ for download and http://cfmid.wishartlab.com as a web service. PMID:27381172

  2. Efficient Computation of the Topology of Level Sets

    SciTech Connect

    Pascucci, V; Cole-McLaughlin, K

    2002-07-19

    This paper introduces two efficient algorithms that compute the Contour Tree of a 3D scalar field F and its augmented version with the Betti numbers of each isosurface. The Contour Tree is a fundamental data structure in scientific visualization that is used to pre-process the domain mesh to allow optimal computation of isosurfaces with minimal storage overhead. The Contour Tree can also be used to build user interfaces reporting the complete topological characterization of a scalar field, as shown in Figure 1. In the first part of the paper we present a new scheme that augments the Contour Tree with the Betti numbers of each isocontour in linear time. We show how to extend the scheme introduced in [3] with the Betti number computation without increasing its complexity. Thus we improve the time complexity of our previous approach [8] from O(m log m) to O(n log n + m), where m is the number of tetrahedra and n is the number of vertices in the domain of F. In the second part of the paper we introduce a new divide-and-conquer algorithm that computes the Augmented Contour Tree for scalar fields defined on rectilinear grids. The central part of the scheme computes the output contour tree by merging two intermediate contour trees and is independent of the interpolant. In this way we confine any knowledge regarding a specific interpolant to an oracle that computes the tree for a single cell. We have implemented this oracle for the trilinear interpolant and plan to replace it with higher order interpolants when needed. The complexity of the scheme is O(n + t log n), where t is the number of critical points of F. For the first time, this allows computing the Contour Tree in linear time in many practical cases, when t = O(n^(1-ε)). We report the running times for a parallel implementation of our algorithm, showing good scalability with the number of processors.
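As a small, self-contained illustration of level-set topology, the union-find sketch below counts the connected components (the Betti-0 number) of a superlevel set on a 2-D grid. This is far simpler than the Contour Tree algorithms of the paper, but shows the kind of topological quantity being tracked per isovalue:

```python
# Count connected components (Betti-0) of the superlevel set {F >= iso}
# of a scalar field sampled on a 2-D grid, using union-find.
def betti0(grid, iso):
    rows, cols = len(grid), len(grid[0])
    parent = {}

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]   # path halving
            x = parent[x]
        return x

    def union(a, b):
        ra, rb = find(a), find(b)
        if ra != rb:
            parent[ra] = rb

    for i in range(rows):                   # cells inside the level set
        for j in range(cols):
            if grid[i][j] >= iso:
                parent[(i, j)] = (i, j)
    for (i, j) in list(parent):             # 4-connectivity unions
        for n in ((i + 1, j), (i, j + 1)):
            if n in parent:
                union((i, j), n)
    return len({find(p) for p in parent})
```

Sweeping iso across the field's value range and recording how this count changes is, in essence, what a (join half of a) contour tree encodes.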

  3. The Role of Language Comprehension and Computation in Mathematical Word Problem Solving among Students with Different Levels of Computation Achievement

    ERIC Educational Resources Information Center

    Guerriero, Tara Stringer

    2010-01-01

    The purpose of this study was to examine how selected linguistic components (including consistency of relational terms and extraneous information) impact performance at each stage of mathematical word problem solving (comprehension, equation construction, and computation accuracy) among students with different levels of computation achievement. …

  4. High level waste storage tanks 242-A evaporator standards/requirement identification document

    SciTech Connect

    Biebesheimer, E.

    1996-01-01

    This document, the Standards/Requirements Identification Document (S/RIDS) for the subject facility, represents the necessary and sufficient requirements to provide an adequate level of protection of the worker, public health and safety, and the environment. It lists those source documents from which requirements were extracted, and those requirements documents considered, but from which no requirements were taken. Documents considered as source documents included State and Federal Regulations, DOE Orders, and DOE Standards.

  5. Assigning unique identification numbers to new user accounts and groups in a computing environment with multiple registries

    DOEpatents

    DeRobertis, Christopher V.; Lu, Yantian T.

    2010-02-23

    A method, system, and program storage device for creating a new user account or user group with a unique identification number in a computing environment having multiple user registries is provided. In response to receiving a command to create a new user account or user group, an operating system of a clustered computing environment automatically checks multiple registries configured for the operating system to determine whether a candidate identification number for the new user account or user group has been assigned already to one or more existing user accounts or groups, respectively. The operating system automatically assigns the candidate identification number to the new user account or user group created in a target user registry if the checking indicates that the candidate identification number has not been assigned already to any of the existing user accounts or user groups, respectively.
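The candidate-ID check described above can be sketched by modeling each registry as a name-to-ID map; the data structure and function names are illustrative, not the patented implementation:

```python
# Assign a numeric ID in a target registry only if no configured registry
# already uses it; otherwise the caller must pick another candidate.
def assign_unique_id(registries, target, name, candidate):
    if any(candidate in reg.values() for reg in registries.values()):
        raise ValueError(f"ID {candidate} already in use")
    registries[target][name] = candidate
    return candidate

def next_free_id(registries, start=1000):
    """Smallest ID >= start not used by any registry."""
    used = {uid for reg in registries.values() for uid in reg.values()}
    uid = start
    while uid in used:
        uid += 1
    return uid
```

Checking every registry before assignment is what prevents the same UID or GID from being handed out twice across, say, a local file-based registry and a network directory.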

  6. An artificial-neural-network method for the identification of saturated turbogenerator parameters based on a coupled finite-element/state-space computational algorithm

    SciTech Connect

    Chaudhry, S.R.; Ahmed-Zaid, S.; Demerdash, N.A.

    1995-12-01

    An artificial neural network (ANN) is used in the identification of saturated synchronous machine parameters under diverse operating conditions. The training data base for the ANN is generated by a time-stepping coupled finite-element/state-space (CFE-SS) modeling technique which is used in the computation of the saturated parameters of a 20-kV, 733-MVA, 0.85 pf (lagging) turbogenerator at discrete load points in the P-Q capability plane for three different levels of terminal voltage. These computed parameters constitute a learning data base for a multilayer ANN structure which is successfully trained using the back-propagation algorithm. Results indicate that the trained ANN can identify saturated machine reactances for arbitrary load points in the P-Q plane with an error less than 2% of those values obtained directly from the CFE-SS algorithm. Thus, significant savings in computational time are obtained in such parameter computation tasks.

  7. A conceptual framework of computations in mid-level vision

    PubMed Central

    Kubilius, Jonas; Wagemans, Johan; Op de Beeck, Hans P.

    2014-01-01

    If a picture is worth a thousand words, as an English idiom goes, what should those words—or, rather, descriptors—capture? What format of image representation would be sufficiently rich if we were to reconstruct the essence of images from their descriptors? In this paper, we set out to develop a conceptual framework that would be: (i) biologically plausible in order to provide a better mechanistic understanding of our visual system; (ii) sufficiently robust to apply in practice on realistic images; and (iii) able to tap into underlying structure of our visual world. We bring forward three key ideas. First, we argue that surface-based representations are constructed based on feature inference from the input in the intermediate processing layers of the visual system. Such representations are computed in a largely pre-semantic (prior to categorization) and pre-attentive manner using multiple cues (orientation, color, polarity, variation in orientation, and so on), and explicitly retain configural relations between features. The constructed surfaces may be partially overlapping to compensate for occlusions and are ordered in depth (figure-ground organization). Second, we propose that such intermediate representations could be formed by a hierarchical computation of similarity between features in local image patches and pooling of highly-similar units, and reestimated via recurrent loops according to the task demands. Finally, we suggest to use datasets composed of realistically rendered artificial objects and surfaces in order to better understand a model's behavior and its limitations. PMID:25566044

  8. NEW Fe I LEVEL ENERGIES AND LINE IDENTIFICATIONS FROM STELLAR SPECTRA

    SciTech Connect

    Peterson, Ruth C.; Kurucz, Robert L.

    2015-01-01

    The spectrum of the Fe I atom is critical to many areas of astrophysics and beyond. Measurements of the energies of its high-lying levels remain woefully incomplete, however, despite extensive laboratory and solar analysis. In this work, we use high-resolution archival absorption-line ultraviolet and optical spectra of stars whose warm temperatures favor moderate Fe I excitation. We derive the energy for a particular upper level in Kurucz's semiempirical calculations by adopting a trial value that yields the same wavelength for a given line predicted to be about as strong as that of a strong unidentified spectral line observed in the stellar spectra, then checking the new wavelengths of other strong predicted transitions that share the same upper level for coincidence with other strong observed unidentified lines. To date, this analysis has provided the upper energies of 66 Fe I levels. Many new energy levels are higher than those accessible to laboratory experiments; several exceed the Fe I ionization energy. These levels provide new identifications for over 2000 potentially detectable lines. Almost all of the new levels of odd parity include UV lines that were detected but unclassified in laboratory Fe I absorption spectra, providing an external check on the energy values. We motivate and present the procedure, provide the resulting new energy levels and their uncertainties, list all the potentially detectable UV and optical new Fe I line identifications and their gf values, point out new lines of astrophysical interest, and discuss the prospects for additional Fe I energy level determinations.

  9. Assessing Pre-Service Teachers' Computer Phobia Levels in Terms of Gender and Experience, Turkish Sample

    ERIC Educational Resources Information Center

    Ursavas, Omer Faruk; Karal, Hasan

    2009-01-01

    In this study we aimed to determine pre-service teachers' level of computer phobia, and tested whether computer phobia varies significantly by gender and computer experience. The study was performed on 430 pre-service teachers at the Education Faculty in Rize/Turkey. Data in the study were…

  10. Single-board computer based control system for a portable Raman device with integrated chemical identification

    NASA Astrophysics Data System (ADS)

    Mobley, Joel; Cullum, Brian M.; Wintenberg, Alan L.; Shane Frank, S.; Maples, Robert A.; Stokes, David L.; Vo-Dinh, Tuan

    2004-06-01

    We report the development of a battery-powered portable chemical identification device for field use consisting of an acousto-optic tunable filter (AOTF)-based Raman spectrometer with integrated data processing and analysis software. The various components and custom circuitry are integrated into a self-contained instrument by control software that runs on an embedded single-board computer (SBC), which communicates with the various instrument modules through a 48-line bidirectional TTL bus. The user interacts with the instrument via a touch-sensitive liquid crystal display unit (LCD) that provides soft buttons for user control as well as visual feedback (e.g., spectral plots, stored data, instrument settings, etc.) from the instrument. The control software manages all operational aspects of the instrument with the exception of the power management module that is run by embedded firmware. The SBC-based software includes both automated and manual library searching capabilities, permitting rapid identification of samples in the field. The use of the SBC in tandem with the LCD touchscreen for interfacing and control provides the instrument with a great deal of flexibility as its function can be customized to specific users or tasks via software modifications alone. The instrument, as currently configured, can be operated as a research-grade Raman spectrometer for scientific applications and as a "black-box" chemical identification system for field use. The instrument can acquire 198-point spectra over a spectral range of 238-1620 cm-1, perform a library search, and display the results in less than 14 s. The operating modes of the instrument are demonstrated illustrating the utility and flexibility afforded the system by the SBC-LCD control module.

  11. Computed Tomography (CT) Scanning Facilitates Early Identification of Neonatal Cystic Fibrosis Piglets

    PubMed Central

    Guillon, Antoine; Chevaleyre, Claire; Barc, Celine; Berri, Mustapha; Adriaensen, Hans; Lecompte, François; Villemagne, Thierry; Pezant, Jérémy; Delaunay, Rémi; Moënne-Loccoz, Joseph; Berthon, Patricia; Bähr, Andrea; Wolf, Eckhard; Klymiuk, Nikolai; Attucci, Sylvie; Ramphal, Reuben; Sarradin, Pierre; Buzoni-Gatel, Dominique; Si-Tahar, Mustapha; Caballero, Ignacio

    2015-01-01

    Background: Cystic Fibrosis (CF) is the most prevalent autosomal recessive disease in the Caucasian population. A cystic fibrosis transmembrane conductance regulator knockout (CFTR-/-) pig that displays most of the features of the human CF disease has been recently developed. However, CFTR-/- pigs presents a 100% prevalence of meconium ileus that leads to death in the first hours after birth, requiring a rapid diagnosis and surgical intervention to relieve intestinal obstruction. Identification of CFTR-/- piglets is usually performed by PCR genotyping, a procedure that lasts between 4 to 6 h. Here, we aimed to develop a procedure for rapid identification of CFTR-/- piglets that will allow placing them under intensive care soon after birth and immediately proceeding with the surgical correction. Methods and Principal Findings: Male and female CFTR+/- pigs were crossed and the progeny was examined by computed tomography (CT) scan to detect the presence of meconium ileus and facilitate a rapid post-natal surgical intervention. Genotype was confirmed by PCR. CT scan presented a 94.4% sensitivity to diagnose CFTR-/- piglets. Diagnosis by CT scan reduced the birth-to-surgery time from a minimum of 10 h down to a minimum of 2.5 h and increased the survival of CFTR-/- piglets to a maximum of 13 days post-surgery as opposed to just 66 h after later surgery. Conclusion: CT scan imaging of meconium ileus is an accurate method for rapid identification of CFTR-/- piglets. Early CT detection of meconium ileus may help to extend the lifespan of CFTR-/- piglets and, thus, improve experimental research on CF, still an incurable disease. PMID:26600426

  12. Identification of Yersinia enterocolitica at the Species and Subspecies Levels by Fourier Transform Infrared Spectroscopy

    PubMed Central

    Kuhm, Andrea Elisabeth; Suter, Daniel; Felleisen, Richard; Rau, Jörg

    2009-01-01

    Yersinia enterocolitica and other Yersinia species, such as Y. pseudotuberculosis, Y. bercovieri, and Y. intermedia, were differentiated using Fourier transform infrared spectroscopy (FT-IR) combined with artificial neural network analysis. A set of well defined Yersinia strains from Switzerland and Germany was used to create a method for FT-IR-based differentiation of Yersinia isolates at the species level. The isolates of Y. enterocolitica were also differentiated by FT-IR into the main biotypes (biotypes 1A, 2, and 4) and serotypes (serotypes O:3, O:5, O:9, and “non-O:3, O:5, and O:9”). For external validation of the constructed methods, independently obtained isolates of different Yersinia species were used. A total of 79.9% of Y. enterocolitica sensu stricto isolates were identified correctly at the species level. The FT-IR analysis allowed the separation of all Y. bercovieri, Y. intermedia, and Y. rohdei strains from Y. enterocolitica, which could not be differentiated by the API 20E test system. The probability for correct biotype identification of Y. enterocolitica isolates was 98.3% (41 externally validated strains). For correct serotype identification, the probability was 92.5% (42 externally validated strains). In addition, the presence or absence of the ail gene, one of the main pathogenicity markers, was demonstrated using FT-IR. The probability for correct identification of isolates concerning the ail gene was 98.5% (51 externally validated strains). This indicates that it is possible to obtain information about genus, species, and in the case of Y. enterocolitica also subspecies type with a single measurement. Furthermore, this is the first example of the identification of specific pathogenicity using FT-IR. PMID:19617388

  14. Young Children's Perspectives on Immigration within the Italian School Context: An Analysis of the Identification Level of Integration

    ERIC Educational Resources Information Center

    Pellegrini, Lucia

    2010-01-01

    The research explores the level of integration of a group of young children attending a primary school in Italy, a new immigration country, focusing on their identification of racial, ethnic and cultural pluralism. Specifically the study takes into account two processes of identification: the mechanism of attributions and the expression of…

  15. An integrated transcriptomic and computational analysis for biomarker identification in gastric cancer

    PubMed Central

    Cui, Juan; Chen, Yunbo; Chou, Wen-Chi; Sun, Liankun; Chen, Li; Suo, Jian; Ni, Zhaohui; Zhang, Ming; Kong, Xiaoxia; Hoffman, Lisabeth L.; Kang, Jinsong; Su, Yingying; Olman, Victor; Johnson, Darryl; Tench, Daniel W.; Amster, I. Jonathan; Orlando, Ron; Puett, David; Li, Fan; Xu, Ying

    2011-01-01

    This report describes an integrated study on identification of potential markers for gastric cancer in patients’ cancer tissues and sera based on: (i) genome-scale transcriptomic analyses of 80 paired gastric cancer/reference tissues and (ii) computational prediction of blood-secretory proteins supported by experimental validation. Our findings show that: (i) 715 and 150 genes exhibit significantly differential expressions in all cancers and early-stage cancers versus reference tissues, respectively; and a substantial percentage of the alteration is found to be influenced by age and/or by gender; (ii) 21 co-expressed gene clusters have been identified, some of which are specific to certain subtypes or stages of the cancer; (iii) the top-ranked gene signatures give better than 94% classification accuracy between cancer and the reference tissues, some of which are gender-specific; and (iv) 136 of the differentially expressed genes were predicted to have their proteins secreted into blood, 81 of which were detected experimentally in the sera of 13 validation samples and 29 found to have differential abundances in the sera of cancer patients versus controls. Overall, the novel information obtained in this study has led to identification of promising diagnostic markers for gastric cancer and can benefit further analyses of the key (early) abnormalities during its development. PMID:20965966

  16. Features preferred in identification systems based on computer mouse movements

    NASA Astrophysics Data System (ADS)

    Jašek, Roman; Kolařík, Martin

    2016-06-01

    Biometric identification systems build features that describe people from various kinds of data. Usually it is not known in advance which features should be chosen, and a dedicated process called feature selection is applied to resolve this uncertainty. The relevant features selected by this process then describe the people in the system as well as possible. It is likely that the relevance of the selected features also means that they capture the important aspects of the measured behavior and partly reveal how that behavior works. This paper builds on this idea and evaluates the results of many feature selection runs in a system that identifies people by their computer mouse movements. It was found that the most frequently selected features recur across runs, and that these features describe similar aspects of the movements.
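
    The paper's core measurement can be sketched as a simple tally: count how often each feature survives repeated selection runs and keep those above a frequency threshold. The feature names and threshold below are hypothetical; this is a minimal illustration of the idea, not the authors' code.

```python
from collections import Counter

def stable_features(selection_runs, min_fraction=0.5):
    """Rank features by how often they survive repeated feature-selection
    runs; features selected in at least `min_fraction` of runs are treated
    as stable descriptors of the mouse-movement behaviour."""
    counts = Counter(f for run in selection_runs for f in set(run))
    n = len(selection_runs)
    ranked = counts.most_common()
    stable = [f for f, c in ranked if c / n >= min_fraction]
    return stable, ranked

# Three hypothetical selection runs over invented mouse-movement features.
runs = [
    {"mean_speed", "pause_count", "curvature"},
    {"mean_speed", "curvature", "click_latency"},
    {"mean_speed", "pause_count"},
]
stable, ranked = stable_features(runs, min_fraction=2/3)
print(stable)  # 'mean_speed' ranks first; pause_count and curvature are also stable
```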

  17. Computer game as a tool for training the identification of phonemic length.

    PubMed

    Pennala, Riitta; Richardson, Ulla; Ylinen, Sari; Lyytinen, Heikki; Martin, Maisa

    2014-12-01

    Computer-assisted training of Finnish phonemic length was conducted with 7-year-old Russian-speaking second-language learners of Finnish. Phonemic length plays a different role in these two languages. The training included game activities with two- and three-syllable word and pseudo-word minimal pairs with prototypical vowel durations. The lowest accuracy scores were recorded for two-syllable words. Accuracy scores were higher for the minimal pairs with larger rather than smaller differences in duration. Accuracy scores were lower for long duration than for short duration. The ability to identify quantity degree was generalized to stimuli used in the identification test in two of the children. Ideas for improving the game are introduced. PMID:23841573

  18. VISPA: a computational pipeline for the identification and analysis of genomic vector integration sites.

    PubMed

    Calabria, Andrea; Leo, Simone; Benedicenti, Fabrizio; Cesana, Daniela; Spinozzi, Giulio; Orsini, Massimilano; Merella, Stefania; Stupka, Elia; Zanetti, Gianluigi; Montini, Eugenio

    2014-01-01

    The analysis of the genomic distribution of viral vector genomic integration sites is a key step in hematopoietic stem cell-based gene therapy applications, making it possible to assess both the safety and the efficacy of the treatment and to study basic aspects of hematopoiesis and stem cell biology. Identifying vector integration sites requires ad hoc bioinformatics tools with stringent requirements in terms of computational efficiency, flexibility, and usability. We developed VISPA (Vector Integration Site Parallel Analysis), a pipeline for automated integration site identification and annotation based on a distributed environment with a simple Galaxy web interface. VISPA was successfully used for the bioinformatics analysis of the follow-up of two lentiviral vector-based hematopoietic stem-cell gene therapy clinical trials. Our pipeline provides a reliable and efficient tool to assess the safety and efficacy of integrating vectors in clinical settings. PMID:25342980

  19. Computational methods for the identification of spatially varying stiffness and damping in beams

    NASA Technical Reports Server (NTRS)

    Banks, H. T.; Rosen, I. G.

    1986-01-01

    A numerical approximation scheme for the estimation of functional parameters in Euler-Bernoulli models for the transverse vibration of flexible beams with tip bodies is developed. The method permits the identification of spatially varying flexural stiffness and Voigt-Kelvin viscoelastic damping coefficients which appear in the hybrid system of ordinary and partial differential equations and boundary conditions describing the dynamics of such structures. An inverse problem is formulated as a least squares fit to data subject to constraints in the form of a vector system of abstract first order evolution equations. Spline-based finite element approximations are used to finite dimensionalize the problem. Theoretical convergence results are given and numerical studies carried out on both conventional (serial) and vector computers are discussed.

  20. NLSCIDNT user's guide: maximum likelihood parameter identification computer program with nonlinear rotorcraft model

    NASA Technical Reports Server (NTRS)

    1979-01-01

    A nonlinear, maximum likelihood, parameter identification computer program (NLSCIDNT) is described which evaluates rotorcraft stability and control coefficients from flight test data. The optimal estimates of the parameters (stability and control coefficients) are determined (identified) by minimizing the negative log likelihood cost function. The minimization technique is the Levenberg-Marquardt method, which behaves like the steepest descent method when it is far from the minimum and behaves like the modified Newton-Raphson method when it is nearer the minimum. Twenty-one states and 40 measurement variables are modeled, and any subset may be selected. States which are not integrated may be fixed at an input value, or time history data may be substituted for the state in the equations of motion. Any aerodynamic coefficient may be expressed as a nonlinear polynomial function of selected 'expansion variables'.
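
    The damping idea described above, steepest descent far from the minimum and (modified) Newton behavior near it, can be sketched for a one-parameter model, where the Levenberg-Marquardt normal equations collapse to a scalar update. The exponential-decay model below is a stand-in for illustration, not the rotorcraft model of NLSCIDNT.

```python
import math

def levenberg_marquardt_1d(model, dmodel, ts, ys, theta, lam=1.0, iters=50):
    """Scalar Levenberg-Marquardt: for a one-parameter model the normal
    equations collapse to (JtJ + lam) * step = Jt r, so the damping term
    lam interpolates between steepest descent (large lam) and
    Gauss-Newton (lam -> 0)."""
    def sse(th):
        return sum((y - model(th, t)) ** 2 for t, y in zip(ts, ys))
    cost = sse(theta)
    for _ in range(iters):
        JtJ = sum(dmodel(theta, t) ** 2 for t in ts)
        Jtr = sum(dmodel(theta, t) * (y - model(theta, t)) for t, y in zip(ts, ys))
        step = Jtr / (JtJ + lam)
        new_cost = sse(theta + step)
        if new_cost < cost:          # accept the step: relax toward Gauss-Newton
            theta, cost, lam = theta + step, new_cost, lam * 0.5
        else:                        # reject the step: increase damping
            lam *= 2.0
    return theta

# Recover the decay rate of y = exp(-k t) from noise-free samples (true k = 0.8).
ts = [0.5 * i for i in range(10)]
ys = [math.exp(-0.8 * t) for t in ts]
model = lambda k, t: math.exp(-k * t)
dmodel = lambda k, t: -t * math.exp(-k * t)  # d(model)/dk
k_hat = levenberg_marquardt_1d(model, dmodel, ts, ys, theta=0.2)
print(round(k_hat, 3))  # ≈ 0.8
```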

  1. The identification of a selective dopamine D2 partial agonist, D3 antagonist displaying high levels of brain exposure.

    PubMed

    Holmes, Ian P; Blunt, Richard J; Lorthioir, Olivier E; Blowers, Stephen M; Gribble, Andy; Payne, Andrew H; Stansfield, Ian G; Wood, Martyn; Woollard, Patrick M; Reavill, Charlie; Howes, Claire M; Micheli, Fabrizio; Di Fabio, Romano; Donati, Daniele; Terreni, Silvia; Hamprecht, Dieter; Arista, Luca; Worby, Angela; Watson, Steve P

    2010-03-15

    The identification of a highly selective D(2) partial agonist, D(3) antagonist tool molecule which demonstrates high levels of brain exposure and selectivity against an extensive range of dopamine, serotonin, adrenergic, histamine, and muscarinic receptors is described. PMID:20153647

  2. Identification of Nonlinear Micron-Level Mechanics for a Precision Deployable Joint

    NASA Technical Reports Server (NTRS)

    Bullock, S. J.; Peterson, L. D.

    1994-01-01

    The experimental identification of micron-level nonlinear joint mechanics and dynamics for a pin-clevis joint used in a precision, adaptive, deployable space structure is investigated. The force-state mapping method is used to identify the behavior of the joint under a preload. The results of applying a single tension-compression cycle to the joint under a tensile preload are presented. The observed micron-level behavior is highly nonlinear and involves all six rigid body motion degrees-of-freedom of the joint. It also suggests that at micron levels of motion, modelling of the joint mechanics and dynamics must include the interactions between all internal components, such as the pin, bushings, and the joint node.

  3. Computer Literacy in Pennsylvania Community Colleges. Competencies in a Beginning Level College Computer Literacy Course.

    ERIC Educational Resources Information Center

    Tortorelli, Ann Eichorn

    A study was conducted at the 14 community colleges (17 campuses) in Pennsylvania to assess the perceptions of faculty about the relative importance of course content items in a beginning credit course in computer literacy, and to survey courses currently being offered. A detailed questionnaire consisting of 96 questions based on MECC (Minnesota…

  4. Crop species identification using machine vision of computer extracted individual leaves

    NASA Astrophysics Data System (ADS)

    Camargo Neto, João; Meyer, George E.

    2005-11-01

    An unsupervised method for plant species identification was developed which uses computer-extracted individual whole leaves from color images of crop canopies. Green canopies were isolated from soil/residue backgrounds using a modified Excess Green and Excess Red separation method. Connected components of isolated green regions of interest were changed into pixel fragments using the Gustafson-Kessel fuzzy clustering method. The fragments were reassembled as individual leaves using a genetic optimization algorithm and a fitness method. Pixels of whole leaves were then analyzed using elliptic Fourier shape and Haralick's classical textural feature analyses. A binary template was constructed to represent each selected leaf region of interest. Elliptic Fourier descriptors were generated from a chain encoding of the leaf boundary. Leaf template orientation was corrected by rotating each extracted leaf to a standard horizontal position, using information provided by the first harmonic set of coefficients. Textural features were computed from the grayscale co-occurrence matrix of the leaf pixel set. Standardized leaf orientation significantly improved the leaf textural venation results. Principal component analysis from SAS (R) was used to select the best Fourier descriptors and textural indices. Indices of local homogeneity and entropy were found to contribute to improved classification rates. A SAS classification model was developed and correctly classified 83% of redroot pigweed, 100% of sunflower, 83% of soybean, and 73% of velvetleaf species. An overall plant species correct classification rate of 86% was attained.
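
    Two of the Haralick indices the abstract singles out, local homogeneity and entropy, can be computed directly from a gray-level co-occurrence matrix. A minimal sketch for one pixel offset, on an invented toy patch:

```python
import math
from collections import Counter

def glcm_features(img, dx=1, dy=0):
    """Gray-level co-occurrence matrix for one pixel offset, plus two of
    Haralick's classical indices (local homogeneity and entropy);
    `img` is a list of equal-length rows of integer gray levels."""
    h, w = len(img), len(img[0])
    pairs = Counter()
    for y in range(h - dy):
        for x in range(w - dx):
            pairs[(img[y][x], img[y + dy][x + dx])] += 1
    total = sum(pairs.values())
    glcm = {k: v / total for k, v in pairs.items()}  # normalized co-occurrences
    homogeneity = sum(p / (1 + (i - j) ** 2) for (i, j), p in glcm.items())
    entropy = -sum(p * math.log2(p) for p in glcm.values())
    return homogeneity, entropy

# Toy 4x4 grayscale patch standing in for a leaf texture region.
patch = [
    [0, 0, 1, 1],
    [0, 0, 1, 1],
    [2, 2, 3, 3],
    [2, 2, 3, 3],
]
hom, ent = glcm_features(patch)
print(round(hom, 3), round(ent, 3))  # prints 0.833 2.585
```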

  5. A Computational Model for the Identification of Biochemical Pathways in the Krebs Cycle

    SciTech Connect

    Oliveira, Joseph S.; Bailey, Colin G.; Jones-Oliveira, Janet B.; Dixon, David A.; Gull, Dean W.; Chandler, Mary L.

    2003-03-01

    We have applied an algorithmic methodology which provably decomposes any complex network into a complete family of principal subcircuits to study the minimal circuits that describe the Krebs cycle. Every operational behavior that the network is capable of exhibiting can be represented by some combination of these principal subcircuits, and this computational decomposition is linearly efficient. We have developed a computational model that can be applied to biochemical reaction systems, which accurately renders pathways of such reactions via directed hypergraphs (Petri nets). We have applied the model to the citric acid cycle (Krebs cycle). The Krebs cycle, which oxidizes the acetyl group of acetyl CoA to CO2 and reduces NAD and FAD to NADH and FADH2, is a complex interacting set of nine subreaction networks. The Krebs cycle was selected because of its familiarity to the biological community and because it exhibits enough complexity to make it an interesting introduction to this novel analytic approach. This study validates the algorithmic methodology for the identification of significant biochemical signaling subcircuits, based solely upon the mathematical model and not upon prior biological knowledge. The utility of the algebraic-combinatorial model for identifying the complete set of biochemical subcircuits as a data set is demonstrated for this important metabolic process.

  6. Identify Skills and Proficiency Levels Necessary for Entry-Level Employment for All Vocational Programs Using Computers to Process Data. Final Report.

    ERIC Educational Resources Information Center

    Crowe, Jacquelyn

    This study investigated computer and word processing operator skills necessary for employment in today's high technology office. The study was comprised of seven major phases: (1) identification of existing community college computer operator programs in the state of Washington; (2) attendance at an information management seminar; (3) production…

  7. Bladed-shrouded-disc aeroelastic analyses: Computer program updates in NASTRAN level 17.7

    NASA Technical Reports Server (NTRS)

    Gallo, A. M.; Elchuri, V.; Skalski, S. C.

    1981-01-01

    In October 1979, a computer program based on state-of-the-art compressor and structural technologies applied to bladed-shrouded discs was developed. The program was made operational in NASTRAN Level 16. The bladed disc computer program was then updated for operation in NASTRAN Level 17.7. The supersonic cascade unsteady aerodynamics routine UCAS, delivered as part of the NASTRAN Level 16 program, was recoded to improve its execution time. These improvements are presented.

  8. Computer experiments on periodic systems identification using rotor blade transient flapping-torsion responses at high advance ratio

    NASA Technical Reports Server (NTRS)

    Hohenemser, K. H.; Prelewicz, D. A.

    1974-01-01

    Systems identification methods have recently been applied to rotorcraft to estimate stability derivatives from transient flight control response data. While these applications assumed a linear constant-coefficient representation of the rotorcraft, the computer experiments described in this paper used transient responses in flap-bending and torsion of a rotor blade at high advance ratio, which is a rapidly time-varying periodic system.

  9. [Evaluation of the GNF computer-coding system for the identification of non-fermentative Gram-negative bacilli].

    PubMed

    Tarng, C M; Chen, M M; Tsai, W C

    1983-05-01

    In order to evaluate the effectiveness of the GNF computer-coding system for the identification of glucose non-fermenting gram-negative bacilli, we employed 406 strains of bacteria, including 367 clinical isolates and 39 standard strains, for testing. These strains were inoculated into the following eleven conventional biochemical test media: Triple Sugar Iron Agar, Simmon's Citrate Agar, Christensen's Urea Agar, Sulfide-Indole-Motility Medium, Semisolid Voges-Proskauer Test Medium, Moeller's Ornithine Decarboxylase Test Medium, Pyocyanin Test Medium, Oxidation/Fermentation (O/F) Glucose, O/F Fructose, Nitrate Broth, and Moeller's Arginine Dihydrolase Test Medium. The results of these tests, plus those from the hanging drop motility test and the oxidase test, were converted into a bacterial code number and then checked against the GNF computer-coding system. It was found that the first preference of agreement was 75.6%, the second 15.3%, the third 5.9%, and the fourth or lower 3.2%. In regard to the speed of bacterial identification using the GNF system together with information from the hemolysis pattern and flagella stain, 84.7% of isolates would be correctly identified within 36-48 hours after isolation. If more confirmational tests were employed, the accurate identification rate would reach 98.7% after 4 days of isolation. In addition, the use of the GNF computer-coding system can standardize identification procedures, shorten the identification period, and save costs in terms of materials supply, inoculation time, media preparation, and media-storing space. Therefore, we conclude that the GNF computer-coding system is an effective tool in the identification of glucose non-fermenting gram-negative bacilli. PMID:6617315
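
    The conversion of test results into a bacterial code number can be sketched with the usual API-style triplet coding, in which each group of three tests yields one digit and positives contribute 1, 2, or 4 by position. The grouping and the example isolate below are assumptions for illustration, not the GNF system's actual tables.

```python
def code_number(results, group_size=3):
    """Convert an ordered list of +/- test results into a numeric profile:
    tests are taken in groups of three, a positive result contributes
    1, 2, or 4 by its position in the group, and each group yields one
    digit (0-7)."""
    digits = []
    for i in range(0, len(results), group_size):
        group = results[i:i + group_size]
        digit = sum(2 ** j for j, positive in enumerate(group) if positive)
        digits.append(str(digit))
    return "".join(digits)

# 13 results (11 biochemical media + motility + oxidase) for a hypothetical isolate.
results = [True, False, True, True, True, False, False, False, False,
           True, True, False, True]
print(code_number(results))  # prints 53031
```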

  10. A Unique Automation Platform for Measuring Low Level Radioactivity in Metabolite Identification Studies

    PubMed Central

    Krauser, Joel; Walles, Markus; Wolf, Thierry; Graf, Daniel; Swart, Piet

    2012-01-01

    Generation and interpretation of biotransformation data on drugs, i.e. identification of physiologically relevant metabolites, defining metabolic pathways and elucidation of metabolite structures, have become increasingly important to the drug development process. Profiling using 14C or 3H radiolabel is defined as the chromatographic separation and quantification of drug-related material in a given biological sample derived from an in vitro, preclinical in vivo or clinical study. Metabolite profiling is a very time intensive activity, particularly for preclinical in vivo or clinical studies which have defined limitations on radiation burden and exposure levels. A clear gap exists for certain studies which do not require specialized high volume automation technologies, yet these studies would still clearly benefit from automation. Use of radiolabeled compounds in preclinical and clinical ADME studies, specifically for metabolite profiling and identification are a very good example. The current lack of automation for measuring low level radioactivity in metabolite profiling requires substantial capacity, personal attention and resources from laboratory scientists. To help address these challenges and improve efficiency, we have innovated, developed and implemented a novel and flexible automation platform that integrates a robotic plate handling platform, HPLC or UPLC system, mass spectrometer and an automated fraction collector. PMID:22723932

  11. Multivariate Effects of Level of Education, Computer Ownership, and Computer Use on Female Students' Attitudes towards CALL

    ERIC Educational Resources Information Center

    Rahimi, Mehrak; Yadollahi, Samaneh

    2012-01-01

    The aim of this study was investigating Iranian female students' attitude towards CALL and its relationship with their level of education, computer ownership, and frequency of use. One hundred and forty-two female students (50 junior high-school students, 49 high-school students and 43 university students) participated in this study. They filled…

  12. The Potential Use of Radio Frequency Identification Devices for Active Monitoring of Blood Glucose Levels

    PubMed Central

    Moore, Bert

    2009-01-01

    Imagine a diabetes patient receiving a text message on his mobile phone warning him that his blood glucose level is too low or a patient's mobile phone calling an emergency number when the patient goes into diabetic shock. Both scenarios depend on automatic, continuous monitoring of blood glucose levels and transmission of that information to a phone. The development of advanced biological sensors and integration with passive radio frequency identification technologies are the key to this. These hold the promise of being able to free patients from finger stick sampling or externally worn devices while providing continuous blood glucose monitoring that allows patients to manage their health more actively. To achieve this promise, however, a number of technical issues need to be addressed. PMID:20046663

  13. Rapid identification of mycobacteria to the species level by polymerase chain reaction and restriction enzyme analysis.

    PubMed Central

    Telenti, A; Marchesi, F; Balz, M; Bally, F; Böttger, E C; Bodmer, T

    1993-01-01

    A method for the rapid identification of mycobacteria to the species level was developed on the basis of evaluation by the polymerase chain reaction (PCR) of the gene encoding for the 65-kDa protein. The method involves restriction enzyme analysis of PCR products obtained with primers common to all mycobacteria. Using two restriction enzymes, BstEII and HaeIII, medically relevant and other frequent laboratory isolates were differentiated to the species or subspecies level by PCR-restriction enzyme pattern analysis. PCR-restriction enzyme pattern analysis was performed on isolates (n = 330) from solid and fluid culture media, including BACTEC, or from frozen and lyophilized stocks. The procedure does not involve hybridization steps or the use of radioactivity and can be completed within 1 working day. Images PMID:8381805
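
    The in-silico counterpart of PCR-restriction enzyme pattern analysis is to locate each enzyme's recognition site in the amplicon sequence and report the resulting fragment lengths. A minimal sketch using the published HaeIII (GG^CC) and BstEII (G^GTNACC) sites on a hypothetical 60 bp amplicon (not the actual 65-kDa gene sequence):

```python
import re

def digest(seq, site_regex, cut_offset):
    """Return restriction-fragment lengths for one enzyme: find every
    recognition site and cut `cut_offset` bases into it.
    Site patterns: HaeIII = GG^CC, BstEII = G^GTNACC (N = any base)."""
    cuts = [m.start() + cut_offset for m in re.finditer(site_regex, seq)]
    bounds = [0] + cuts + [len(seq)]
    return [b - a for a, b in zip(bounds, bounds[1:])]

# Hypothetical 60 bp amplicon containing one HaeIII site and one BstEII site.
amplicon = ("ATGCATGCAT" "GGCCATGCAT" "ATGCATGCAT"
            "GGTAACCGCA" "ATGCATGCAT" "ATGCATGCAT")
print(digest(amplicon, "GGCC", 2))          # HaeIII fragment lengths
print(digest(amplicon, "GGT[ACGT]ACC", 1))  # BstEII fragment lengths
```

    The two fragment-length patterns together form the species-diagnostic fingerprint the method reads from gels.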

  14. Identification, Recovery, and Refinement of Hitherto Undescribed Population-Level Genomes from the Human Gastrointestinal Tract

    PubMed Central

    Laczny, Cedric C.; Muller, Emilie E. L.; Heintz-Buschart, Anna; Herold, Malte; Lebrun, Laura A.; Hogan, Angela; May, Patrick; de Beaufort, Carine; Wilmes, Paul

    2016-01-01

    Linking taxonomic identity and functional potential at the population-level is important for the study of mixed microbial communities and is greatly facilitated by the availability of microbial reference genomes. While the culture-independent recovery of population-level genomes from environmental samples using the binning of metagenomic data has expanded available reference genome catalogs, several microbial lineages remain underrepresented. Here, we present two reference-independent approaches for the identification, recovery, and refinement of hitherto undescribed population-level genomes. The first approach is aimed at genome recovery of varied taxa and involves multi-sample automated binning using CANOPY CLUSTERING complemented by visualization and human-augmented binning using VIZBIN post hoc. The second approach is particularly well-suited for the study of specific taxa and employs VIZBIN de novo. Using these approaches, we reconstructed a total of six population-level genomes of distinct and divergent representatives of the Alphaproteobacteria class, the Mollicutes class, the Clostridiales order, and the Melainabacteria class from human gastrointestinal tract-derived metagenomic data. Our results demonstrate that, while automated binning approaches provide great potential for large-scale studies of mixed microbial communities, these approaches should be complemented with informative visualizations because expert-driven inspection and refinements are critical for the recovery of high-quality population-level genomes. PMID:27445992

  15. Computational techniques in tribology and material science at the atomic level

    NASA Technical Reports Server (NTRS)

    Ferrante, J.; Bozzolo, G. H.

    1992-01-01

    Computations in tribology and material science at the atomic level present considerable difficulties. Computational techniques ranging from first-principles to semi-empirical and their limitations are discussed. Example calculations of metallic surface energies using semi-empirical techniques are presented. Finally, application of the methods to calculation of adhesion and friction are presented.

  16. An electrical device for computing theoretical draw-downs of ground-water levels

    USGS Publications Warehouse

    Remson, Irwin; Halstead, M.H.

    1955-01-01

    The construction, calibration and use of an electrical "slide rule" for computing theoretical drawdowns of ground-water levels are described. The instrument facilitates the computation of drawdowns under given conditions of discharge or recharge by means of the Theis nonequilibrium equation. It is simple to construct and use and can be a valuable aid in ground-water studies.   
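
    The quantity the device computes, drawdown via the Theis nonequilibrium equation s = Q/(4πT)·W(u) with u = r²S/(4Tt), is easy to evaluate numerically from the series expansion of the well function W(u). The parameter values below are illustrative only.

```python
import math

def well_function(u, terms=30):
    """Theis well function W(u) = -0.5772 - ln(u) + u - u^2/(2*2!) + ...,
    evaluated from its convergent series (valid for small u)."""
    s = -0.5772156649 - math.log(u)
    term = u
    s += term
    for n in range(2, terms + 1):
        term *= -u * (n - 1) / (n * n)  # gives (-1)^(n-1) u^n / (n * n!)
        s += term
    return s

def theis_drawdown(Q, T, S, r, t):
    """Drawdown s = Q/(4 pi T) * W(u), u = r^2 S / (4 T t).
    Units must be consistent (e.g. m, days, m^2/day, m^3/day)."""
    u = r * r * S / (4.0 * T * t)
    return Q / (4.0 * math.pi * T) * well_function(u)

# Illustrative case: Q = 1000 m^3/day, T = 100 m^2/day, S = 1e-4, r = 50 m, t = 1 day.
print(round(theis_drawdown(1000.0, 100.0, 1e-4, 50.0, 1.0), 2))  # prints 5.41
```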

  17. A Study of Effectiveness of Computer Assisted Instruction (CAI) over Classroom Lecture (CRL) at ICS Level

    ERIC Educational Resources Information Center

    Kaousar, Tayyeba; Choudhry, Bushra Naoreen; Gujjar, Aijaz Ahmed

    2008-01-01

    This study was aimed to evaluate the effectiveness of CAI vs. classroom lecture for computer science at ICS level. The objectives were to compare the learning effects of two groups with classroom lecture and computer-assisted instruction studying the same curriculum and the effects of CAI and CRL in terms of cognitive development. Hypotheses of…

  18. Computer-Assisted Instruction in Elementary Logic at the University Level. Technical Report No. 239.

    ERIC Educational Resources Information Center

    Goldberg, Adele; Suppes, Patrick

    Earlier research by the authors in the design and use of computer-assisted instructional systems and curricula for teaching mathematical logic to gifted elementary school students has been extended to the teaching of university-level courses. This report is a description of the curriculum and problem types of a computer-based course offered at…

  19. Computer-aided identification of polymorphism sets diagnostic for groups of bacterial and viral genetic variants

    PubMed Central

    Price, Erin P; Inman-Bamber, John; Thiruvenkataswamy, Venugopal; Huygens, Flavia; Giffard, Philip M

    2007-01-01

    Background Single nucleotide polymorphisms (SNPs) and genes that exhibit presence/absence variation have provided informative marker sets for bacterial and viral genotyping. Identification of marker sets optimised for these purposes has been based on maximal generalized discriminatory power as measured by Simpson's Index of Diversity, or on the ability to identify specific variants. Here we describe the Not-N algorithm, which is designed to identify small sets of genetic markers diagnostic for user-specified subsets of known genetic variants. The algorithm does not treat the user-specified subset and the remaining genetic variants equally. Rather Not-N analysis is designed to underpin assays that provide 0% false negatives, which is very important for e.g. diagnostic procedures for clinically significant subgroups within microbial species. Results The Not-N algorithm has been incorporated into the "Minimum SNPs" computer program and used to derive genetic markers diagnostic for multilocus sequence typing-defined clonal complexes, hepatitis C virus (HCV) subtypes, and phylogenetic clades defined by comparative genome hybridization (CGH) data for Campylobacter jejuni, Yersinia enterocolitica and Clostridium difficile. Conclusion Not-N analysis is effective for identifying small sets of genetic markers diagnostic for microbial sub-groups. The best results to date have been obtained with CGH data from several bacterial species, and HCV sequence data. PMID:17672919
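
    The 0%-false-negative constraint described above can be sketched as a greedy search: keep only loci at which the whole target group shares one state (so no group member can ever be missed), then add loci until no outside variant matches the signature. This is a simplified reading of the Not-N idea, not the Minimum SNPs implementation; the profiles are invented.

```python
def not_n_markers(profiles, in_group):
    """Greedy sketch of a Not-N-style search: candidate loci are those
    uniform within the target group (guaranteeing 0% false negatives);
    loci are then added greedily until no out-group isolate matches the
    in-group signature at all chosen loci."""
    in_profiles = [profiles[i] for i in in_group]
    out_profiles = [p for name, p in profiles.items() if name not in in_group]
    n_loci = len(next(iter(profiles.values())))
    # Candidate loci: one shared state across the entire target group.
    candidates = {j: in_profiles[0][j] for j in range(n_loci)
                  if len({p[j] for p in in_profiles}) == 1}
    chosen = {}
    remaining = out_profiles
    while remaining and candidates:
        # Pick the locus excluding the most remaining out-group isolates.
        j = max(candidates,
                key=lambda j: sum(p[j] != candidates[j] for p in remaining))
        chosen[j] = candidates.pop(j)
        remaining = [p for p in remaining if p[j] == chosen[j]]
    return chosen  # {locus_index: required_state}

# Invented SNP states at 5 loci for 5 isolates; target clonal complex {A, B}.
profiles = {
    "A": "GATTC", "B": "GATAC", "C": "GCTTC", "D": "AATAC", "E": "GCTAC",
}
markers = not_n_markers(profiles, in_group=["A", "B"])
print(markers)  # loci 1 and 0 suffice to separate {A, B} from the rest
```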

  20. Level sequence and splitting identification of closely spaced energy levels by angle-resolved analysis of fluorescence light

    NASA Astrophysics Data System (ADS)

    Wu, Z. W.; Volotka, A. V.; Surzhykov, A.; Dong, C. Z.; Fritzsche, S.

    2016-06-01

    The angular distribution and linear polarization of the fluorescence light following resonant photoexcitation is investigated within the framework of density matrix and second-order perturbation theory. Emphasis has been placed on "signatures" for determining the level sequence and splitting of intermediate (partially) overlapping resonances, if analyzed as a function of the photon energy of the incident light. Detailed computations within the multiconfiguration Dirac-Fock method have been performed, especially for the 1s²2s²2p⁶3s, J_i = 1/2 + γ₁ → (1s²2s2p⁶3s)₁ 3p_3/2, J = 1/2, 3/2 → 1s²2s²2p⁶3s, J_f = 1/2 + γ₂ photoexcitation and subsequent fluorescence emission of atomic sodium. A remarkably strong dependence of the angular distribution and linear polarization of the γ₂ fluorescence emission is found upon the level sequence and splitting of the intermediate (1s²2s2p⁶3s)₁ 3p_3/2, J = 1/2, 3/2 overlapping resonances owing to their finite lifetime (linewidth). We therefore suggest that accurate measurements of the angular distribution and linear polarization might help identify the sequence and small splittings of closely spaced energy levels, even if they cannot be spectroscopically resolved.

  1. Studying channelopathies at the functional level using a system identification approach

    NASA Astrophysics Data System (ADS)

    Faisal, A. Aldo

    2007-09-01

    The electrical activity of our brain's neurons is controlled by voltage-gated ion channels. Mutations in these ion channels have recently been associated with clinical conditions, so-called channelopathies. The involved ion channels have been well characterised at the molecular and biophysical level. However, the impact of these mutations on neuron function has been studied only rudimentarily. It remains unclear how operation and performance (in terms of input-output characteristics and reliability) are affected. Here, I show how system identification techniques provide neuronal performance measures which allow the impact of channelopathies to be quantitatively assessed by comparing whole-cell input-output relationships. I illustrate the feasibility of this approach by comparing the effects on neuronal signalling of two human sodium channel mutations (NaV 1.1 W1204R, R1648H), linked to generalized epilepsy with febrile seizures, to the wild-type NaV 1.1 channel.

  2. [Tri-Level Infrared Spectroscopic Identification of Hot Melting Reflective Road Marking Paint].

    PubMed

    Li, Hao; Ma, Fang; Sun, Su-qin

    2015-12-01

    In order to detect road marking paint from trace evidence at traffic accident scenes, and to differentiate between brands, we used tri-level infrared spectroscopic identification, which employs Fourier transform infrared spectroscopy (FTIR), second derivative infrared spectroscopy (SD-IR), and two-dimensional correlation infrared spectroscopy (2D-IR), to identify three selected domestic brands of hot melting reflective road marking paints and the raw materials in their formulas. The experimental results show that the ATR and FTIR spectrograms of the three brands of coatings are very similar in shape, differing only in absorption peak wavenumbers; all have wide and strong absorption peaks near 1435 cm⁻¹ and strong absorption peaks near 879, 2955, 2919, and 2870 cm⁻¹. After enlarging partial areas of the spectrograms and comparing them with the spectrograms of each raw material in the formulas, the brands can be distinguished. In the 700-970 and 1370-1660 cm⁻¹ regions the spectrograms mainly reflect the different relative contents of heavy calcium carbonate in the three brands of paints, and in the 2800-2960 cm⁻¹ region those of polyethylene wax (PE wax), ethylene vinyl acetate resin (EVA), and dioctyl phthalate (DOP). The SD-IR not only verifies the results of the FTIR analysis, but also further magnifies the microcosmic differences and reflects the different relative contents of quartz sand in the 512-799 cm⁻¹ region. Within the 1351-1525 cm⁻¹ range, the 2D-IR spectra show more significant differences in the positions and numbers of autopeaks. Therefore, tri-level infrared spectroscopic identification, with its stepwise improvement in apparent resolution, is a fast and effective method for distinguishing hot melting road marking paints. PMID:26964206

  3. 48 CFR 227.7203-3 - Early identification of computer software or computer software documentation to be furnished to...

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... use, reproduction or disclosure. 227.7203-3 Section 227.7203-3 Federal Acquisition Regulations System... restrictions on use, reproduction or disclosure. (a) Use the provision at 252.227-7017, Identification and... for which restrictions, other than copyright, on use, modification, reproduction, release,...

  4. 48 CFR 227.7203-3 - Early identification of computer software or computer software documentation to be furnished to...

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... use, reproduction or disclosure. 227.7203-3 Section 227.7203-3 Federal Acquisition Regulations System... restrictions on use, reproduction or disclosure. (a) Use the provision at 252.227-7017, Identification and... for which restrictions, other than copyright, on use, modification, reproduction, release,...

  5. 48 CFR 227.7203-3 - Early identification of computer software or computer software documentation to be furnished to...

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... use, reproduction or disclosure. 227.7203-3 Section 227.7203-3 Federal Acquisition Regulations System... restrictions on use, reproduction or disclosure. (a) Use the provision at 252.227-7017, Identification and... for which restrictions, other than copyright, on use, modification, reproduction, release,...

  6. 48 CFR 227.7203-3 - Early identification of computer software or computer software documentation to be furnished to...

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... use, reproduction or disclosure. 227.7203-3 Section 227.7203-3 Federal Acquisition Regulations System... restrictions on use, reproduction or disclosure. (a) Use the provision at 252.227-7017, Identification and... for which restrictions, other than copyright, on use, modification, reproduction, release,...

  7. 48 CFR 227.7203-3 - Early identification of computer software or computer software documentation to be furnished to...

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... use, reproduction or disclosure. 227.7203-3 Section 227.7203-3 Federal Acquisition Regulations System... restrictions on use, reproduction or disclosure. (a) Use the provision at 252.227-7017, Identification and... for which restrictions, other than copyright, on use, modification, reproduction, release,...

  8. Multi-level Bayesian safety analysis with unprocessed Automatic Vehicle Identification data for an urban expressway.

    PubMed

    Shi, Qi; Abdel-Aty, Mohamed; Yu, Rongjie

    2016-03-01

    In traffic safety studies, crash frequency modeling of total crashes is the cornerstone before proceeding to more detailed safety evaluation. The relationship between crash occurrence and factors such as traffic flow and roadway geometric characteristics has been extensively explored for a better understanding of crash mechanisms. In this study, a multi-level Bayesian framework was developed to identify the crash contributing factors on an urban expressway in the Central Florida area. Two types of traffic data from the Automatic Vehicle Identification system, the processed data capped at the speed limit and the unprocessed data retaining the original speeds, were incorporated in the analysis along with road geometric information. The model framework was proposed to account for the hierarchical data structure and the heterogeneity among the traffic and roadway geometric data. Multi-level and random parameters models were constructed and compared with the Negative Binomial model under the Bayesian inference framework. Results showed that the unprocessed traffic data were superior. Both multi-level models and random parameters models outperformed the Negative Binomial model, and the models with random parameters achieved the best model fit. The contributing factors identified imply that, on the urban expressway, lower speeds and higher speed variation could significantly increase the crash likelihood. Other significant geometric factors included auxiliary lanes and horizontal curvature. PMID:26722989
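
    Crash counts on road segments are typically overdispersed (variance well above the mean), which is why the Negative Binomial model is the usual baseline in this literature. The sketch below is a hedged illustration, not the authors' multi-level Bayesian model: it compares Poisson and Negative Binomial log-likelihoods on invented segment counts to show why the extra dispersion parameter matters.

```python
import math

def poisson_logpmf(y, mu):
    """Log-probability of count y under a Poisson(mu) model."""
    return y * math.log(mu) - mu - math.lgamma(y + 1)

def negbin_logpmf(y, mu, alpha):
    """Log-probability under the NB2 parameterization: Var = mu + alpha*mu^2."""
    r = 1.0 / alpha
    return (math.lgamma(y + r) - math.lgamma(r) - math.lgamma(y + 1)
            + r * math.log(r / (r + mu)) + y * math.log(mu / (r + mu)))

# Invented segment-level crash counts: variance (~14.6) far exceeds mean (3.2)
crashes = [0, 1, 0, 3, 8, 2, 0, 5, 1, 12]
mu = sum(crashes) / len(crashes)
ll_poisson = sum(poisson_logpmf(y, mu) for y in crashes)
ll_negbin = sum(negbin_logpmf(y, mu, alpha=1.0) for y in crashes)
# The NB log-likelihood is higher: its dispersion parameter absorbs the
# heterogeneity that the paper's random-parameters models capture more fully.
```

    The multi-level and random-parameters extensions in the paper go further by letting the regression coefficients themselves vary across segments, but the overdispersion shown here is the starting point.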

  9. High level language for measurement complex control based on the computer E-100I

    NASA Technical Reports Server (NTRS)

    Zubkov, B. V.

    1980-01-01

    A high level language was designed to control the process of conducting an experiment using the "Elektronika-100I" computer. Program examples are given for controlling the measuring and actuating devices. The procedure for including these programs in the suggested high level language is described.

  10. Factors Influencing the Integration of Computer Algebra Systems into University-Level Mathematics Education

    ERIC Educational Resources Information Center

    Lavicza, Zsolt

    2007-01-01

    Computer Algebra Systems (CAS) are increasing components of university-level mathematics education. However, little is known about the extent of CAS use and the factors influencing its integration into university curricula. Pre-university level studies suggest that beyond the availability of technology, teachers' conceptions and cultural elements…

  11. Identification and Endodontic Management of Middle Mesial Canal in Mandibular Second Molar Using Cone Beam Computed Tomography.

    PubMed

    Paul, Bonny; Dube, Kavita

    2015-01-01

    Endodontic treatments are routinely done with the help of radiographs. However, radiographs represent only a two-dimensional image of an object. Failure to identify aberrant anatomy can lead to endodontic failure. This case report presents the use of three-dimensional imaging with cone beam computed tomography (CBCT) as an adjunct to digital radiography in identification and management of mandibular second molar with three mesial canals. PMID:26664763

  13. Low-bandwidth and non-compute intensive remote identification of microbes from raw sequencing reads.

    PubMed

    Gautier, Laurent; Lund, Ole

    2013-01-01

    Cheap DNA sequencing may soon become routine not only for human genomes but also for practically anything requiring the identification of living organisms from their DNA: tracking of infectious agents, control of food products, bioreactors, or environmental samples. We propose a novel general approach to the analysis of sequencing data where a reference genome does not have to be specified. Using a distributed architecture, we are able to query a remote server for hints about what the reference might be, transferring a relatively small amount of data. Our system consists of a server with known reference DNA indexed, and a client with raw sequencing reads. The client sends a sample of unidentified reads, and in return receives a list of matching references. Sequences for the references can then be retrieved and used for exhaustive computation on the reads, such as alignment. To demonstrate this approach we have implemented a web server, indexing tens of thousands of publicly available genomes and genomic regions from various organisms and returning lists of matching hits from query sequencing reads. We have also implemented two clients: one running in a web browser, and one as a Python script. Both are able to handle a large number of sequencing reads, even on portable devices (the browser-based client running on a tablet), perform their task within seconds, and consume an amount of bandwidth compatible with mobile broadband networks. Such client-server approaches could develop in the future, allowing fully automated processing of sequencing data and routine instant quality checks of sequencing runs from desktop sequencers. Web access is available at http://tapir.cbs.dtu.dk. The source code for a python command-line client, a server, and supplementary data are available at http://bit.ly/1aURxkc. PMID:24391826
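
    The query step can be pictured as a k-mer index on the server and a read-sampling client. The following is a toy reconstruction of the idea, not the TAPIR implementation; the sequences, names, and the tiny k are all invented for illustration.

```python
import random

K = 8  # illustrative k-mer length; a real index would use a much larger k

def build_index(references):
    """Server side: map each k-mer to the set of references containing it."""
    index = {}
    for name, seq in references.items():
        for i in range(len(seq) - K + 1):
            index.setdefault(seq[i:i + K], set()).add(name)
    return index

def identify(index, reads, sample_size=10):
    """Client side: submit only a small sample of reads; rank references by hits."""
    hits = {}
    for read in random.sample(reads, min(sample_size, len(reads))):
        for i in range(len(read) - K + 1):
            for ref in index.get(read[i:i + K], ()):
                hits[ref] = hits.get(ref, 0) + 1
    return sorted(hits, key=hits.get, reverse=True)

references = {"genome_A": "ACGTTGCAACGGTTCCAAGGTTGCA" * 4,
              "genome_B": "TTTTAAAACCCCGGGGTTAACCGGA" * 4}
index = build_index(references)                  # lives on the server
reads = [references["genome_A"][i:i + 12] for i in range(0, 40, 5)]
random.seed(0)
ranking = identify(index, reads)                 # only the sample crosses the network
```

    Only the sampled reads travel to the server, which is what keeps the bandwidth compatible with mobile networks; the exhaustive alignment then happens locally against the returned reference.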

  15. Pixel-Level Analysis Techniques for False-Positive Identification in Kepler Data

    NASA Astrophysics Data System (ADS)

    Bryson, Steve; Jenkins, J.; Gilliland, R.; Batalha, N.; Gautier, T. N.; Rowe, J.; Dunham, E.; Latham, D.; Caldwell, D.; Twicken, J.; Tenenbaum, P.; Clarke, B.; Li, J.; Wu, H.; Quintana, E.; Ciardi, D.; Torres, G.; Dotson, J.; Still, M.

    2011-01-01

    The Kepler mission seeks to identify Earth-size exoplanets by detecting transits of their parent star. The resulting transit signature will be small (~100 ppm). Several astrophysical phenomena can mimic an Earth-size transit signature, most notably background eclipsing binaries (BGEBs). As part of a larger false-positive identification effort, pixel-level analysis of the Kepler data has proven crucial in identifying the likelihood of these confounding signals. Pixel-level analysis is primarily useful for the case of the transit being a BGEB. Several analysis techniques are presented, including:
    - measurement of centroid motion in and out of transit compared with detailed modeling of expected centroid motion, including an estimate of the transit source location
    - transit source location determination through a high-precision PSF fit of the difference between in- and out-of-transit pixels, directly measuring the location of the transit source
    - source location determination through fitting the observed summed flux time series (or the light curve derived from the transit model) to each pixel's time series data.
    These techniques have been automated and are being considered for inclusion in the Kepler Science Operations Center Data Analysis Pipeline. They are supplemented by various diagnostic plots of the Kepler data as well as comparison with background stars identified by the Kepler Follow-up Observing Program (FOP). The final determination of whether an observed transit is a false positive integrates several sources, including pixel-level analysis and FOP results. Pixel-level techniques can identify BGEBs that are separated from the Kepler target star by more than a certain radius, called the "radius of confusion". The determination of the radius of confusion, and the role it plays in assigning the probability of the transit being due to a planet, is briefly discussed. The statistics from the latest false-positive list are provided.
    Funding for this mission provided
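
    The difference-image technique in the abstract, subtracting in-transit from out-of-transit pixels to locate the transit source, can be illustrated with a flux-weighted centroid. This is a hedged toy version on invented 5x5 pixel grids, with no PSF fitting, not the pipeline code.

```python
def flux_weighted_centroid(image):
    """Flux-weighted centroid (column, row) of a pixel image given as rows."""
    total = sum(sum(row) for row in image)
    row_c = sum(i * sum(row) for i, row in enumerate(image)) / total
    col_c = sum(j * v for row in image for j, v in enumerate(row)) / total
    return col_c, row_c

# Invented scene: Kepler target at (col=2, row=2); a background eclipsing
# binary at (col=4, row=1) loses flux during the apparent "transit".
out_of_transit = [[0.0] * 5 for _ in range(5)]
out_of_transit[2][2] = 100.0
out_of_transit[1][4] = 50.0
in_transit = [row[:] for row in out_of_transit]
in_transit[1][4] = 40.0  # only the BGEB dims

difference = [[o - i for o, i in zip(orow, irow)]
              for orow, irow in zip(out_of_transit, in_transit)]
source = flux_weighted_centroid(difference)  # lands on the BGEB, not the target
```

    Because all the non-transiting flux cancels in the difference image, its centroid falls on the dimming source; an offset from the target larger than the "radius of confusion" flags a false positive.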

  16. Identification of Low-level Point Radioactive Sources using a sensor network

    SciTech Connect

    Chin, J. C.; Rao, Nageswara S.; Yao, David K. Y.; Shankar, Mallikarjun; Yang, Yong; Hou, J. C.; Srivathsan, Sri; Iyengar, S. Sitharama

    2010-09-01

    Identification of a low-level point radioactive source amidst background radiation is achieved by a network of radiation sensors using a two-step approach. Based on measurements from three or more sensors, a geometric difference triangulation method or an N-sensor localization method is used to estimate the location and strength of the source. Then a sequential probability ratio test based on current measurements and estimated parameters is employed to finally decide: (1) the presence of a source with the estimated parameters, or (2) the absence of the source, or (3) the insufficiency of measurements to make a decision. This method achieves specified levels of false alarm and missed detection probabilities, while ensuring a close-to-minimal number of measurements for reaching a decision. This method minimizes the ghost-source problem of current estimation methods, and achieves a lower false alarm rate compared with current detection methods. This method is tested and demonstrated using: (1) simulations, and (2) a test-bed that utilizes the scaling properties of point radioactive sources to emulate high intensity ones that cannot be easily and safely handled in laboratory experiments.
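
    The decision step described above can be sketched as a Wald sequential probability ratio test on Poisson counts. This is a simplified single-sensor illustration under assumed rates, not the paper's networked estimator; the rates and counts are invented.

```python
import math

def sprt_poisson(counts, bkg_rate, src_rate, alpha=0.01, beta=0.01):
    """Wald SPRT: H0 = background only (bkg_rate); H1 = background plus source
    (src_rate); alpha/beta are target false-alarm / missed-detection rates."""
    upper = math.log((1 - beta) / alpha)   # cross upward -> declare source
    lower = math.log(beta / (1 - alpha))   # cross downward -> declare background
    llr = 0.0
    for n, c in enumerate(counts, start=1):
        # log-likelihood ratio increment for one Poisson count observation
        llr += c * math.log(src_rate / bkg_rate) - (src_rate - bkg_rate)
        if llr >= upper:
            return "source present", n
        if llr <= lower:
            return "absence of source", n
    return "insufficient measurements", len(counts)
```

    The three possible return values mirror the three-way decision in the abstract, and the thresholds enforce the specified false-alarm and missed-detection probabilities while keeping the number of measurements close to minimal.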

  17. Pipasic: similarity and expression correction for strain-level identification and quantification in metaproteomics

    PubMed Central

    Penzlin, Anke; Lindner, Martin S.; Doellinger, Joerg; Dabrowski, Piotr Wojtek; Nitsche, Andreas; Renard, Bernhard Y.

    2014-01-01

    Motivation: Metaproteomic analysis allows studying the interplay of organisms or functional groups and has become increasingly popular for diagnostic purposes. However, difficulties arise owing to the high sequence similarity between related organisms. Further, the state of conservation of proteins between species can be correlated with their expression level, which can lead to significant bias in results and interpretation. These challenges are similar but not identical to those arising in the analysis of metagenomic samples and require specific solutions. Results: We introduce Pipasic (peptide intensity-weighted proteome abundance similarity correction) as a tool that corrects identification and spectral counting-based quantification results using peptide similarity estimation and expression level weighting within a non-negative lasso framework. Pipasic has distinct advantages over approaches that only consider unique peptides or aggregate results to the lowest common ancestor, as demonstrated on examples of viral diagnostics and an acid mine drainage dataset. Availability and implementation: Pipasic source code is freely available from https://sourceforge.net/projects/pipasic/. Contact: RenardB@rki.de Supplementary information: Supplementary data are available at Bioinformatics online. PMID:24931978
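
    The abstract's correction is posed as a non-negative lasso. A minimal projected-gradient sketch of that optimization, with an invented toy system rather than Pipasic's actual peptide-similarity weighting, looks like this:

```python
def nonneg_lasso(A, y, lam=0.1, lr=0.01, iters=3000):
    """Minimize 0.5*||A x - y||^2 + lam * sum(x) subject to x >= 0,
    by gradient descent with projection onto the non-negative orthant."""
    m, n = len(A), len(A[0])
    x = [0.0] * n
    for _ in range(iters):
        resid = [sum(A[i][j] * x[j] for j in range(n)) - y[i] for i in range(m)]
        grad = [sum(A[i][j] * resid[i] for i in range(m)) + lam for j in range(n)]
        x = [max(0.0, x[j] - lr * grad[j]) for j in range(n)]
    return x

# Toy two-organism system with unshared peptides (identity design matrix)
A = [[1.0, 0.0], [0.0, 1.0]]
y = [1.0, 0.05]           # observed spectral-count signal (invented)
abundances = nonneg_lasso(A, y)
# The L1 penalty shrinks the first abundance toward 0.9 and zeroes out the
# spurious second one, illustrating the sparsity-inducing correction.
```

    In the real tool the columns of the system encode inter-proteome peptide similarity, so shared peptides are explained by the truly present strain instead of inflating its relatives.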

  18. Simple distortion-invariant optical identification tag based on encrypted binary-phase computer-generated hologram for real-time vehicle identification and verification

    NASA Astrophysics Data System (ADS)

    Kim, Cheol-Su

    2010-11-01

    A simple distortion-invariant optical identification (ID) tag is presented for real-time vehicle identification and verification. The proposed scheme is composed of image encryption, ID tag creation, image decryption, and optical correlation for verification. To create the ID tag, a binary-phase computer-generated hologram (BPCGH) of a symbol image representing a vehicle is created using a simulated annealing algorithm. The BPCGH is then encrypted using an XOR operation and enlargement transformed into polar coordinates. The resulting ID tag is then attached to the vehicle. As the BPCGH consists of only binary phase values, it is robust to external distortions. To identify and verify the vehicle, several reverse processes are required, such as capturing the ID tag with a camera, extracting the ID tag from the captured image, transformation of the ID tag into rectangular coordinates, decryption, an inverse Fourier transform, and correlation. Computer simulation and experimental results confirm that the proposed optical ID tag is secure and robust to such distortions as scaling, rotation, cropping (scratches), and random noise. The ID tag can also be easily implemented, as it consists of only binary phase components.
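
    The encryption/decryption symmetry in this scheme rests on XOR being its own inverse on binary phase values (0 or π, represented here as bits 0/1). A hedged toy sketch follows, using a random bit pattern as a stand-in for the BPCGH rather than a hologram computed by simulated annealing, and omitting the polar-coordinate transform:

```python
import random

def xor_phase(bits, key):
    """XOR a binary-phase pattern with a key pattern of the same length."""
    return [b ^ k for b, k in zip(bits, key)]

random.seed(7)
hologram = [random.randint(0, 1) for _ in range(64)]  # stand-in for the BPCGH
key = [random.randint(0, 1) for _ in range(64)]

id_tag = xor_phase(hologram, key)       # encrypted ID tag attached to the vehicle
recovered = xor_phase(id_tag, key)      # reader XORs with the same key to decrypt
```

    Because decryption is the same XOR with the same key, the reader side needs no extra machinery, and the binary-only phase values are what make the physical tag tolerant of scaling, rotation, and scratches.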

  19. Realization of a holonomic quantum computer in a chain of three-level systems

    NASA Astrophysics Data System (ADS)

    Gürkan, Zeynep Nilhan; Sjöqvist, Erik

    2015-12-01

    Holonomic quantum computation is the idea to use non-Abelian geometric phases to implement universal quantum gates that are robust to fluctuations in control parameters. Here, we propose a compact design for a holonomic quantum computer based on coupled three-level systems. The scheme does not require adiabatic evolution and can be implemented in arrays of atoms or ions trapped in tailored standing wave potentials.

  20. Evaluation of the Wider System, a New Computer-Assisted Image-Processing Device for Bacterial Identification and Susceptibility Testing

    PubMed Central

    Cantón, Rafael; Pérez-Vázquez, María; Oliver, Antonio; Sánchez Del Saz, Begoña; Gutiérrez, M. Olga; Martínez-Ferrer, Manuel; Baquero, Fernando

    2000-01-01

    The Wider system is a newly developed computer-assisted image-processing device for both bacterial identification and antimicrobial susceptibility testing. It has been adapted to be able to read and interpret commercial MicroScan panels. Two hundred forty-four fresh consecutive clinical isolates (138 isolates of the family Enterobacteriaceae, 25 nonfermentative gram-negative rods [NFGNRs], and 81 gram-positive cocci) were tested. In addition, 100 enterobacterial strains with known β-lactam resistance mechanisms (22 strains with chromosomal AmpC β-lactamase, 8 strains with chromosomal class A β-lactamase, 21 broad-spectrum and IRT β-lactamase-producing strains, 41 extended-spectrum β-lactamase-producing strains, and 8 permeability mutants) were tested. API galleries and National Committee for Clinical Laboratory Standards (NCCLS) microdilution methods were used as reference methods. The Wider system correctly identified 97.5% of the clinical isolates at the species level. Overall essential agreement (±1 log2 dilution for 3,719 organism-antimicrobial drug combinations) was 95.6% (isolates of the family Enterobacteriaceae, 96.6%; NFGNRs, 88.0%; gram-positive cocci, 95.6%). The lowest essential agreement was observed with Enterobacteriaceae versus imipenem (84.0%), NFGNR versus piperacillin (88.0%) and cefepime (88.0%), and gram-positive isolates versus penicillin (80.4%). The category error rate (NCCLS criteria) was 4.2% (2.0% very major errors, 0.6% major errors, and 1.5% minor errors). Essential agreement and interpretive error rates for eight β-lactam antibiotics against isolates of the family Enterobacteriaceae with known β-lactam resistance mechanisms were 94.8 and 5.4%, respectively. Interestingly, the very major error rate was only 0.8%. Minor errors (3.6%) were mainly observed with amoxicillin-clavulanate and cefepime against extended-spectrum β-lactamase-producing isolates. The Wider system is a new reliable tool which applies the image
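
    "Essential agreement" above means the two MIC readings differ by at most one two-fold (log2) dilution. A small sketch of that computation on invented MIC pairs, not the study's data:

```python
import math

def essential_agreement(test_mics, ref_mics):
    """Percent of MIC pairs within +/- 1 log2 dilution of each other."""
    within = sum(1 for t, r in zip(test_mics, ref_mics)
                 if abs(math.log2(t) - math.log2(r)) <= 1.0 + 1e-9)
    return 100.0 * within / len(test_mics)

# Invented MICs (mg/L): three pairs within one dilution, one pair two off
wider = [0.5, 1.0, 4.0, 16.0]
nccls = [0.5, 2.0, 8.0, 4.0]
ea = essential_agreement(wider, nccls)  # 3 of 4 pairs agree
```

    Category errors (very major, major, minor) are judged separately, by whether the two methods place the isolate in different susceptible/intermediate/resistant classes rather than by dilution distance.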

  1. Assessment of Social Vulnerability Identification at Local Level around Merapi Volcano - A Self Organizing Map Approach

    NASA Astrophysics Data System (ADS)

    Lee, S.; Maharani, Y. N.; Ki, S. J.

    2015-12-01

    The application of the Self-Organizing Map (SOM) to analyze social vulnerability and recognize resilience within sites is a challenging task. The aim of this study is to propose a computational method to identify sites according to their similarity and to determine the most relevant variables characterizing social vulnerability in each cluster. For this purpose, SOM is considered an effective platform for the analysis of high-dimensional data. By considering the cluster structure, the characteristics of social vulnerability across the identified sites can be fully understood. In this study, the social vulnerability variable is constructed from 17 variables, i.e. 12 independent variables representing socio-economic concepts and 5 dependent variables representing the damage and losses due to the Merapi eruption in 2010. These variables collectively represent the local situation of the study area, based on fieldwork conducted in September 2013. By using both independent and dependent variables, we can identify whether the social vulnerability is reflected in the actual situation, in this case, the 2010 Merapi eruption. However, social vulnerability analysis in local communities involves a number of variables representing their socio-economic condition, and some of the variables employed in this study might be more or less redundant. Therefore, SOM is used to reduce the redundant variables by selecting representative variables using the component planes and the correlation coefficients between variables, in order to find an effective sample size. The selected dataset was then effectively clustered according to similarity. Finally, this approach can produce reliable clustering, recognize the most significant variables, and could be useful for social vulnerability assessment, especially for stakeholders as decision makers. This research was supported by a grant 'Development of Advanced Volcanic Disaster Response System considering
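
    A minimal SOM sketch in pure Python shows the clustering mechanism the study relies on: nodes self-organize so that similar sites map to the same node. The two-feature "sites" below are invented, not the study's 17-variable dataset.

```python
import math, random

def best_match(nodes, x):
    """Index of the node closest (squared Euclidean distance) to sample x."""
    return min(range(len(nodes)),
               key=lambda j: sum((xi - ni) ** 2 for xi, ni in zip(x, nodes[j])))

def train_som(data, n_nodes=2, epochs=60, lr0=0.5, sigma0=1.0):
    """Train a tiny 1-D SOM: the best-matching node and its neighbors are
    pulled toward each sample while learning rate and neighborhood shrink."""
    random.seed(1)
    dim = len(data[0])
    nodes = [[random.random() for _ in range(dim)] for _ in range(n_nodes)]
    for t in range(epochs):
        lr = lr0 * (1 - t / epochs)
        sigma = max(sigma0 * (1 - t / epochs), 1e-3)
        for x in data:
            bmu = best_match(nodes, x)
            for j, node in enumerate(nodes):
                h = math.exp(-((j - bmu) ** 2) / (2 * sigma ** 2))
                for d in range(dim):
                    node[d] += lr * h * (x[d] - node[d])
    return nodes

# Two invented 'site' clusters: low-vulnerability near (0,0), high near (1,1)
sites = [(0.0, 0.1), (0.1, 0.0), (0.05, 0.05), (1.0, 0.9), (0.9, 1.0), (0.95, 0.95)]
nodes = train_som(sites)
```

    After training, dissimilar sites fall on different nodes; in the study the same mechanism, plus inspection of the component planes, is what separates site clusters and exposes redundant variables.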

  2. Automatic Identification of the Repolarization Endpoint by Computing the Dominant T-wave on a Reduced Number of Leads

    PubMed Central

    Giuliani, C.; Agostinelli, A.; Di Nardo, F.; Fioretti, S.; Burattini, L.

    2016-01-01

    Electrocardiographic (ECG) T-wave endpoint (Tend) identification suffers a lack of reliability due to the presence of noise and variability among leads. Tend identification can be improved by using global repolarization waveforms obtained by combining several leads. The dominant T-wave (DTW) is a global repolarization waveform that has been shown to improve Tend identification when computed using the 15 leads (I to III, aVr, aVl, aVf, V1 to V6, X, Y, Z) usually available in clinics, of which only 8 (I, II, V1 to V6) are independent. The aim of the present study was to evaluate whether the 8 independent leads are sufficient to obtain a DTW that allows reliable Tend identification. To this aim, Tend measures automatically identified from 15-dependent-lead DTWs of 46 control healthy subjects (CHS) and 103 acute myocardial infarction patients (AMIP) were compared with those obtained from 8-independent-lead DTWs. Results indicate that the Tend distributions do not have statistically different median values (CHS: 340 ms vs. 340 ms; AMIP: 325 ms vs. 320 ms), besides being strongly correlated (CHS: ρ=0.97; AMIP: ρ=0.88; P<10^-27). Thus, measuring Tend from the 15-dependent-lead DTWs is statistically equivalent to measuring Tend from the 8-independent-lead DTWs. In conclusion, for the clinical purpose of automatic Tend identification from the DTW, the 8 independent leads can be used without a statistically significant loss of accuracy but with a significant decrease in computational effort. The dependence of 7 of the 15 leads does not introduce a significant bias in the Tend determination from 15-dependent-lead DTWs. PMID:27347218
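
    The DTW is a single waveform distilled from many leads. One common construction, hedged here because it is not necessarily the authors' exact algorithm, takes the leading principal component over time of the lead signals, which power iteration recovers; the "T-waves" below are invented scaled copies of one shape.

```python
import math

def dominant_waveform(leads, iters=100):
    """First principal component over time of a set of leads (rows = leads),
    found by power iteration on the time x time Gram matrix."""
    n_t = len(leads[0])
    gram = [[sum(lead[a] * lead[b] for lead in leads) for b in range(n_t)]
            for a in range(n_t)]
    v = [1.0] * n_t
    for _ in range(iters):
        w = [sum(gram[a][b] * v[b] for b in range(n_t)) for a in range(n_t)]
        norm = math.sqrt(sum(x * x for x in w))
        v = [x / norm for x in w]
    return v

# Invented 'T-waves': three leads that are scaled copies of one shape
base = [math.sin(math.pi * t / 15) for t in range(16)]
leads = [[k * b for b in base] for k in (1.0, 0.8, -0.5)]
dtw = dominant_waveform(leads)  # recovers the common shape up to scale/sign
```

    Because the combination averages lead-specific noise away, the endpoint of the single dominant waveform is steadier than any per-lead Tend measurement, which is the motivation for the comparison in the abstract.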

  4. User-microprogrammable, local host computer with low-level parallelism

    SciTech Connect

    Tomita, S.; Shibayama, K.; Kitamura, T.; Nakata, T.; Hagiwara, H.

    1983-01-01

    This paper describes the architecture of a dynamically microprogrammable computer with low-level parallelism, called QA-2, which is designed as a high-performance, local host computer for laboratory use. The architectural principle of the QA-2 is the marriage of the high-speed, parallel processing capability offered by four powerful arithmetic and logic units (ALUs) with the architectural flexibility provided by large-scale, dynamic user-microprogramming. By changing its writable control storage dynamically, the QA-2 can be tailored to a wide spectrum of research-oriented applications, covering high-level language processing and real-time processing. 11 references.

  5. Computational identification of conserved microRNAs and their targets from expression sequence tags of blueberry (Vaccinium corymbosum)

    PubMed Central

    Li, Xuyan; Hou, Yanming; Zhang, Li; Zhang, Wenhao; Quan, Chen; Cui, Yuhai; Bian, Shaomin

    2014-01-01

    MicroRNAs (miRNAs) are a class of endogenous non-coding RNAs, approximately 21 nt in length, which mediate the expression of target genes primarily at post-transcriptional levels. miRNAs play critical roles in almost all plant cellular and metabolic processes. Although numerous miRNAs have been identified in the plant kingdom, the miRNAs in blueberry, which is an economically important small fruit crop, still remain totally unknown. In this study, we report a computational identification of miRNAs and their targets in blueberry. By conducting an EST-based comparative genomics approach, 9 potential vco-miRNAs were discovered from 22,402 blueberry ESTs according to a series of filtering criteria, designated as vco-miR156-5p, vco-miR156-3p, vco-miR1436, vco-miR1522, vco-miR4495, vco-miR5120, vco-miR5658, vco-miR5783, and vco-miR5986. Based on sequence complementarity between a miRNA and its target transcript, 34 target ESTs from blueberry and 70 targets from other species were identified for the vco-miRNAs. The targets were found to be involved in transcription, RNA splicing and binding, DNA duplication, signal transduction, transport and trafficking, stress response, as well as synthesis and metabolic processes. These findings will greatly contribute to future research regarding the functions and regulatory mechanisms of blueberry miRNAs. PMID:25763692
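
    The target-prediction step rests on sequence complementarity between a miRNA and candidate transcripts. A hedged toy sketch follows, using the DNA alphabet, invented sequences, and a plain mismatch count in place of the paper's full filtering criteria:

```python
def revcomp(seq):
    """Reverse complement of a DNA sequence."""
    comp = {"A": "T", "T": "A", "G": "C", "C": "G"}
    return "".join(comp[b] for b in reversed(seq))

def find_targets(mirna, ests, max_mismatches=3):
    """Slide the miRNA's complement along each EST; report near-perfect sites."""
    site = revcomp(mirna)
    hits = []
    for name, est in ests.items():
        for i in range(len(est) - len(site) + 1):
            window = est[i:i + len(site)]
            mismatches = sum(1 for a, b in zip(site, window) if a != b)
            if mismatches <= max_mismatches:
                hits.append((name, i, mismatches))
    return hits

mirna = "TGACAGAAGAGAGTGAGCACA"                 # 21-nt, invented sequence
ests = {"EST_1": "GGCC" + revcomp(mirna) + "AATT",  # perfect site at offset 4
        "EST_2": "ACGT" * 12}                       # no complementary site
hits = find_targets(mirna, ests)
```

    Real pipelines additionally weight mismatch positions (e.g. the seed region) and check G:U wobble pairs, but the sliding complementarity scan above is the core of EST-based target identification.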

  6. Species-level identification of staphylococci isolated from bovine mastitis in Brazil using partial 16S rRNA sequencing.

    PubMed

    Lange, Carla C; Brito, Maria A V P; Reis, Daniele R L; Machado, Marco A; Guimarães, Alessandro S; Azevedo, Ana L S; Salles, Érica B; Alvim, Mariana C T; Silva, Fabiana S; Meurer, Igor R

    2015-04-17

    Staphylococci isolated from bovine milk and not classified as Staphylococcus aureus represent a heterogeneous group of microorganisms that are frequently associated with bovine mastitis. The identification of these microorganisms is important, although it is difficult and relatively costly. Genotypic methods add precision in the identification of Staphylococcus species. In the present study, partial 16S rRNA sequencing was used for the species identification of coagulase-positive and coagulase-negative staphylococci isolated from bovine mastitis. Two hundred and two (95%) of the 213 isolates were successfully identified at the species level. The assigning of an isolate to a particular species was based on ≥99% identity with 16S rRNA sequences deposited in GenBank. The identified isolates belonged to 13 different Staphylococcus species; Staphylococcus chromogenes, S. aureus and Staphylococcus epidermidis were the most frequently identified species. Eight isolates could not be assigned to a single species, as the obtained sequences showed 99% or 100% similarity to sequences from two or three different Staphylococcus species. The relatedness of these isolates with the other isolates and reference strains was visualized using a cladogram. In conclusion, 16S rRNA sequencing was an objective and accurate method for the proper identification of Staphylococcus species isolated from bovine mastitis. Additional target genes could be used in non-conclusive cases for the species-level identification of these microorganisms. PMID:25704228
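
    The ≥99% identity rule, and why eight isolates stayed ambiguous, can be made concrete with a tiny percent-identity function over already-aligned sequences. This is a simplification with invented toy sequences; real 16S identification aligns against GenBank entries.

```python
def percent_identity(query, reference):
    """Percent identity between two equal-length, pre-aligned sequences."""
    matches = sum(1 for q, r in zip(query, reference) if q == r)
    return 100.0 * matches / len(reference)

def assign_species(query, references, threshold=99.0):
    """Return every species meeting the identity threshold; more than one
    hit means the call is not conclusive at species level."""
    return [name for name, seq in references.items()
            if percent_identity(query, seq) >= threshold]

# Invented 200-base toy 'sequences': the two species differ by a single base,
# so a query matching one still clears the 99% threshold against the other
ref_a = "ACGT" * 50
ref_b = ref_a[:100] + "T" + ref_a[101:]
refs = {"S. chromogenes": ref_a, "S. epidermidis": ref_b}
calls = assign_species(ref_a, refs)   # two hits -> ambiguous at species level
```

    This is exactly the situation the abstract resolves with additional target genes: when 16S similarity exceeds the threshold for several species, another locus must break the tie.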

  7. High level waste storage tank farms/242-A evaporator Standards/Requirements Identification Document (S/RID), Volume 6

    SciTech Connect

    Not Available

    1994-04-01

    The High-Level Waste Storage Tank Farms/242-A Evaporator Standards/Requirements Identification Document (S/RID) is contained in multiple volumes. This document (Volume 6) outlines the standards and requirements for the sections on: Environmental Restoration and Waste Management, Research and Development and Experimental Activities, and Nuclear Safety.

  8. High-level waste storage tank farms/242-A evaporator Standards/Requirements Identification Document (S/RID)

    SciTech Connect

    Not Available

    1994-04-01

    The High-Level Waste Storage Tank Farms/242-A Evaporator Standards/Requirements Identification Document (S/RID) is contained in multiple volumes. This document (Volume 3) presents the standards and requirements for the following sections: Safeguards and Security, Engineering Design, and Maintenance.

  9. High-level waste storage tank farms/242-A evaporator Standards/Requirements Identification Document (S/RID), Volume 4

    SciTech Connect

    Not Available

    1994-04-01

    The High-Level Waste Storage Tank Farms/242-A Evaporator Standards/Requirements Identification Document (S/RID) is contained in multiple volumes. This document (Volume 4) presents the standards and requirements for the following sections: Radiation Protection and Operations.

  10. System-level identification of transcriptional circuits underlying mammalian circadian clocks.

    PubMed

    Ueda, Hiroki R; Hayashi, Satoko; Chen, Wenbin; Sano, Motoaki; Machida, Masayuki; Shigeyoshi, Yasufumi; Iino, Masamitsu; Hashimoto, Seiichi

    2005-02-01

    Mammalian circadian clocks consist of complexly integrated regulatory loops, making it difficult to elucidate them without both the accurate measurement of system dynamics and the comprehensive identification of network circuits. Toward a system-level understanding of this transcriptional circuitry, we identified clock-controlled elements on 16 clock and clock-controlled genes in a comprehensive surveillance of evolutionarily conserved cis elements and measurement of their transcriptional dynamics. Here we report the roles of E/E' boxes, DBP/E4BP4 binding elements and RevErbA/ROR binding elements in nine, seven and six genes, respectively. Our results indicate that circadian transcriptional circuits are governed by two design principles: regulation of E/E' boxes and RevErbA/ROR binding elements follows a repressor-precedes-activator pattern, resulting in delayed transcriptional activity, whereas regulation of DBP/E4BP4 binding elements follows a repressor-antiphasic-to-activator mechanism, which generates high-amplitude transcriptional activity. Our analysis further suggests that regulation of E/E' boxes is a topological vulnerability in mammalian circadian clocks, a concept that has been functionally verified using in vitro phenotype assay systems. PMID:15665827

  11. Scalability of a Base Level Design for an On-Board-Computer for Scientific Missions

    NASA Astrophysics Data System (ADS)

    Treudler, Carl Johann; Schroder, Jan-Carsten; Greif, Fabian; Stohlmann, Kai; Aydos, Gokce; Fey, Gorschwin

    2014-08-01

    A wide range of mission requirements and the integration of diverse payloads require extreme flexibility in the on-board-computing infrastructure for scientific missions. We show that scalability is difficult in principle. We address this issue by proposing a base level design and show how adaptation to different needs is achieved. Inter-dependencies between scaling different aspects and their impact on different levels in the design are discussed.

  12. Neuromotor recovery from stroke: computational models at central, functional, and muscle synergy level

    PubMed Central

    Casadio, Maura; Tamagnone, Irene; Summa, Susanna; Sanguineti, Vittorio

    2013-01-01

    Computational models of neuromotor recovery after a stroke might help to unveil the underlying physiological mechanisms and might suggest how to make recovery faster and more effective. At least in principle, these models could serve: (i) To provide testable hypotheses on the nature of recovery; (ii) To predict the recovery of individual patients; (iii) To design patient-specific “optimal” therapy, by setting the treatment variables for maximizing the amount of recovery or for achieving a better generalization of the learned abilities across different tasks. Here we review the state of the art of computational models for neuromotor recovery through exercise, and their implications for treatment. We show that to properly account for the computational mechanisms of neuromotor recovery, multiple levels of description need to be taken into account. The review specifically covers models of recovery at central, functional and muscle synergy level. PMID:23986688

  13. The Relationship between Internet and Computer Game Addiction Level and Shyness among High School Students

    ERIC Educational Resources Information Center

    Ayas, Tuncay

    2012-01-01

    This study is conducted to determine the relationship between the internet and computer games addiction level and the shyness among high school students. The participants of the study consist of 365 students attending high schools in Giresun city centre during 2009-2010 academic year. As a result of the study a positive, meaningful, and high…

  14. 24 CFR 990.165 - Computation of project expense level (PEL).

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... with 24 CFR 941.606 for a mixed-finance transaction, then the project covered by the mixed-finance... Computation of project expense level (PEL), 24 CFR 990.165, Housing and Urban Development Regulations Relating to Housing...

  15. Do Students Benefit Equally from Interactive Computer Simulations Regardless of Prior Knowledge Levels?

    ERIC Educational Resources Information Center

    Park, Seong Ik; Lee, Gyumin; Kim, Meekyoung

    2009-01-01

    The purposes of this study were to examine the effects of two types of interactive computer simulations and of prior knowledge levels on concept comprehension, cognitive load, and learning efficiency. Seventy-two 5th grade students were sampled from two elementary schools. They were divided into two groups (high and low) based on prior knowledge…

  16. BICYCLE: a computer code for calculating levelized life-cycle costs

    SciTech Connect

    Hardie, R.W.

    1980-08-01

    This report serves as a user's manual for the BICYCLE computer code. BICYCLE was specifically designed to calculate levelized life-cycle costs for plants that produce electricity, heat, gaseous fuels, or liquid fuels. Included in this report are (1) derivations of the equations used by BICYCLE, (2) input instructions, (3) sample case input, and (4) sample case output.
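The levelized-cost idea behind BICYCLE can be illustrated with a toy calculation: discount each year's costs and each year's output to present value, then take their ratio. This is a minimal sketch with made-up numbers, not BICYCLE's actual equations.

```python
# Minimal levelized life-cycle cost sketch (illustrative, not BICYCLE's equations):
# levelized cost = PV(annual costs) / PV(annual output).

def levelized_cost(annual_costs, annual_output, discount_rate):
    """Discounted total cost divided by discounted total output."""
    pv_cost = sum(c / (1 + discount_rate) ** t
                  for t, c in enumerate(annual_costs, start=1))
    pv_out = sum(q / (1 + discount_rate) ** t
                 for t, q in enumerate(annual_output, start=1))
    return pv_cost / pv_out

# 3-year plant with constant costs and output: the levelized cost
# collapses to the simple annual cost/output ratio (100 / 50 = 2.0)
lc = levelized_cost([100.0] * 3, [50.0] * 3, 0.05)
```

With constant annual cost and output, the discount factors cancel, which is a useful sanity check on any levelized-cost implementation.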

  17. BICYCLE II: a computer code for calculating levelized life-cycle costs

    SciTech Connect

    Hardie, R.W.

    1981-11-01

    This report describes the BICYCLE computer code. BICYCLE was specifically designed to calculate levelized life-cycle costs for plants that produce electricity, heat, gaseous fuels, or liquid fuels. Included are (1) derivations of the equations used by BICYCLE, (2) input instructions, (3) sample case input, and (4) sample case output.

  18. Can Synchronous Computer-Mediated Communication (CMC) Help Beginning-Level Foreign Language Learners Speak?

    ERIC Educational Resources Information Center

    Ko, Chao-Jung

    2012-01-01

    This study investigated the possibility that initial-level learners may acquire oral skills through synchronous computer-mediated communication (SCMC). Twelve Taiwanese French as a foreign language (FFL) students, divided into three groups, were required to conduct a variety of tasks in one of the three learning environments (video/audio, audio,…

  19. Computational identification of microRNAs in Anatid herpesvirus 1 genome

    PubMed Central

    2012-01-01

    Background MicroRNAs (miRNAs) are a group of short (~22 nt) noncoding RNAs that specifically regulate gene expression at the post-transcriptional level. miRNA precursors (pre-miRNAs), which are imperfect stem-loop structures of ~70 nt, are processed into mature miRNAs by cellular RNases III. To date, thousands of miRNAs have been identified in different organisms. Several viruses have been reported to encode miRNAs. Findings Here, we extended the analysis of miRNA-encoding potential to Anatid herpesvirus 1 (AHV-1). Using computational approaches, we found that AHV-1 putatively encodes 12 mature miRNAs. We then compared the 12 mature miRNA candidates with all known miRNAs of the herpesvirus family. Interestingly, the “seed sequences” (nt 2 to 8) of 2 miRNAs were predicted to be highly conserved, in position and/or sequence, with 2 miRNAs of Marek’s disease virus type 1 (MDV-1). Additionally, we searched for targets among viral mRNAs. Conclusions Using computational approaches, we found that AHV-1 putatively encodes 12 mature miRNAs, 2 of which are highly conserved with 2 miRNAs of MDV-1. This result suggests that AHV-1 and MDV-1 have a close evolutionary relationship, which provides valuable evidence for the classification of AHV-1. Additionally, seven viral gene targets were found, suggesting that AHV-1 miRNAs could affect the virus's own gene expression. PMID:22584005
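The seed-conservation check described above can be sketched in a few lines. The sequences below are hypothetical placeholders, not actual AHV-1 or MDV-1 miRNAs; only the seed definition (nt 2 to 8, 1-based) follows the abstract.

```python
# Seed-region comparison sketch. Sequences are made-up examples.

def seed(mirna):
    """Return the seed region, nucleotides 2-8 (1-based), of a mature miRNA."""
    return mirna[1:8]

def seeds_conserved(mirna_a, mirna_b):
    """True if two mature miRNAs share an identical seed region."""
    return seed(mirna_a) == seed(mirna_b)

a = "UGGAAUGUAAAGAAGUAUGUAU"   # hypothetical candidate miRNA
b = "AGGAAUGUAUUGAAGAAUGUAC"   # hypothetical homolog with the same seed
```

In practice such a comparison would be run between each candidate and every known herpesvirus miRNA, since seed identity is the strongest single indicator of shared targeting.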

  20. A novel computer-aided detection system for pulmonary nodule identification in CT images

    NASA Astrophysics Data System (ADS)

    Han, Hao; Li, Lihong; Wang, Huafeng; Zhang, Hao; Moore, William; Liang, Zhengrong

    2014-03-01

    Computer-aided detection (CADe) of pulmonary nodules from computed tomography (CT) scans is critical for assisting radiologists to identify lung lesions at an early stage. In this paper, we propose a novel approach for CADe of lung nodules using a two-stage vector quantization (VQ) scheme. The first-stage VQ aims to extract the lung from the chest volume, while the second-stage VQ is designed to extract initial nodule candidates (INCs) within the lung volume. Rule-based expert filtering is then employed to prune obvious false positives (FPs) from the INCs, and the commonly used support vector machine (SVM) classifier is adopted to further reduce the FPs. The proposed system was validated on 100 CT scans randomly selected from the 262 scans that have at least one juxta-pleural nodule annotation in the publicly available Lung Image Database Consortium and Image Database Resource Initiative (LIDC-IDRI) database. The two-stage VQ missed only 2 of the 207 nodules at agreement level 1, and INC detection for each scan took about 30 seconds on average. Expert filtering reduced FPs by a factor of more than 18 while maintaining a sensitivity of 93.24%. As it is trivial to distinguish INCs attached to the pleural wall from those that are not, we investigated the feasibility of training different SVM classifiers to further reduce FPs from these two kinds of INCs. Experimental results indicated that SVM classification over the entire set of INCs was favored; the optimal operating point of our CADe system achieved a sensitivity of 89.4% at a specificity of 86.8%.

  1. Identification of masses in digital mammogram using gray level co-occurrence matrices

    PubMed Central

    Mohd. Khuzi, A; Besar, R; Wan Zaki, WMD; Ahmad, NN

    2009-01-01

    Digital mammography has become the most effective technique for early breast cancer detection. A digital mammogram takes an electronic image of the breast and stores it directly in a computer. The aim of this study is to develop an automated system for assisting the analysis of digital mammograms. Computer image processing techniques are applied to enhance the images, followed by segmentation of the region of interest (ROI). Subsequently, textural features are extracted from the ROI and used to classify the ROIs as either masses or non-masses. In this study, the normal breast images and breast images with masses used as the standard input to the proposed system are taken from the Mammographic Image Analysis Society (MIAS) digital mammogram database. In the MIAS database, masses are grouped as spiculated, circumscribed or ill-defined; additional information includes the locations of mass centres and the radii of the masses. The textural features of the ROIs are extracted using gray level co-occurrence matrices (GLCM) constructed at four different directions for each ROI. The results show that the GLCM at 0º, 45º, 90º and 135º with a block size of 8×8 gives significant texture information for discriminating between mass and non-mass tissues. Analysis of the GLCM properties, i.e. contrast, energy and homogeneity, resulted in receiver operating characteristic (ROC) curve areas of Az = 0.84 for Otsu’s method, Az = 0.82 for the thresholding method and Az = 0.70 for K-means clustering; an ROC curve area of 0.8-0.9 is rated as good. The proposed method contains no complicated algorithms: detection is based on a decision tree with five criteria to be analysed. This simplicity leads to less computational time, so the approach is suitable for an automated real-time breast cancer diagnosis system. PMID:21611053
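A minimal sketch of the GLCM texture features named above (contrast, energy, homogeneity), assuming the standard Haralick definitions; this is an illustrative re-implementation, not the authors' code.

```python
import numpy as np

# Gray-level co-occurrence matrix for one pixel offset (dr, dc), normalized
# to a joint probability, followed by the three texture properties used in
# the study: contrast, energy, and homogeneity.

def glcm(img, levels, offset):
    """Normalized GLCM of an integer-valued image for offset (dr, dc)."""
    dr, dc = offset
    m = np.zeros((levels, levels))
    rows, cols = img.shape
    for r in range(rows):
        for c in range(cols):
            r2, c2 = r + dr, c + dc
            if 0 <= r2 < rows and 0 <= c2 < cols:
                m[img[r, c], img[r2, c2]] += 1
    return m / m.sum()

def texture_features(p):
    """Haralick contrast, energy, and homogeneity of a normalized GLCM."""
    i, j = np.indices(p.shape)
    contrast = np.sum(p * (i - j) ** 2)
    energy = np.sum(p ** 2)
    homogeneity = np.sum(p / (1.0 + np.abs(i - j)))
    return contrast, energy, homogeneity

img = np.array([[0, 0, 1], [0, 1, 1], [1, 1, 1]])   # toy 2-level "ROI"
p = glcm(img, levels=2, offset=(0, 1))              # 0-degree direction
```

The four directions used in the paper correspond to offsets (0, 1), (-1, 1), (-1, 0) and (-1, -1) for 0º, 45º, 90º and 135º respectively.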

  2. Identification of Cognitive Processes of Effective and Ineffective Students during Computer Programming

    ERIC Educational Resources Information Center

    Renumol, V. G.; Janakiram, Dharanipragada; Jayaprakash, S.

    2010-01-01

    Identifying the set of cognitive processes (CPs) a student can go through during computer programming is an interesting research problem. It can provide a better understanding of the human aspects in computer programming process and can also contribute to the computer programming education in general. The study identified the presence of a set of…

  3. Reshaping Computer Literacy Teaching in Higher Education: Identification of Critical Success Factors

    ERIC Educational Resources Information Center

    Taylor, Estelle; Goede, Roelien; Steyn, Tjaart

    2011-01-01

    Purpose: Acquiring computer skills is more important today than ever before, especially in a developing country. Teaching of computer skills, however, has to adapt to new technology. This paper aims to model factors influencing the success of the learning of computer literacy by means of an e-learning environment. The research question for this…

  4. Man-Computer Symbiosis Through Interactive Graphics: A Survey and Identification of Critical Research Areas.

    ERIC Educational Resources Information Center

    Knoop, Patricia A.

    The purpose of this report was to determine the research areas that appear most critical to achieving man-computer symbiosis. An operational definition of man-computer symbiosis was developed by: (1) reviewing and summarizing what others have said about it, and (2) attempting to distinguish it from other types of man-computer relationships. From…

  5. Computational identification of miRNAs in medicinal plant Senecio vulgaris (Groundsel).

    PubMed

    Sahu, Sarika; Khushwaha, Anjana; Dixit, Rekha

    2011-01-01

    RNA interference plays a very important role in gene silencing. In vitro identification of miRNAs is a slow process, as it is difficult to isolate them. Nucleotide sequences of miRNAs are highly conserved among plants, and this forms the key feature behind the identification of miRNAs in plant species by homology alignment. In silico identification of miRNAs from EST databases is emerging as a novel, faster and reliable approach. Here, EST sequences of Senecio vulgaris (Groundsel) were searched against known miRNA sequences using the BLASTN tool. A total of 10 miRNAs were identified from 1956 EST sequences and 115 GSS sequences. The most stable miRNA identified is svu-mir-1. This approach will accelerate research into the regulation of gene expression in Groundsel by interfering RNAs. PMID:22347777

  7. Level set discrete element method for three-dimensional computations with triaxial case study

    NASA Astrophysics Data System (ADS)

    Kawamoto, Reid; Andò, Edward; Viggiani, Gioacchino; Andrade, José E.

    2016-06-01

    In this paper, we outline the level set discrete element method (LS-DEM) which is a discrete element method variant able to simulate systems of particles with arbitrary shape using level set functions as a geometric basis. This unique formulation allows seamless interfacing with level set-based characterization methods as well as computational ease in contact calculations. We then apply LS-DEM to simulate two virtual triaxial specimens generated from XRCT images of experiments and demonstrate LS-DEM's ability to quantitatively capture and predict stress-strain and volume-strain behavior observed in the experiments.
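The level-set idea underlying LS-DEM can be sketched in a few lines: each particle is represented by a signed distance function (SDF), and contact is detected wherever a surface node of one particle evaluates negative in another particle's level set. The spherical particle below is purely illustrative; LS-DEM's arbitrary particle shapes and contact force resolution go far beyond this sketch.

```python
import numpy as np

# Level-set contact sketch: negative SDF values mean "inside the particle".

def sphere_sdf(center, radius):
    """Level set of a sphere: negative inside, zero on the surface, positive outside."""
    center = np.asarray(center, dtype=float)
    return lambda p: np.linalg.norm(np.asarray(p, dtype=float) - center) - radius

def in_contact(sdf_a, surface_points_b, tol=0.0):
    """Contact if any surface node of particle B penetrates particle A."""
    return any(sdf_a(p) < tol for p in surface_points_b)

a = sphere_sdf([0.0, 0.0, 0.0], 1.0)
# surface nodes of a second particle: one inside A, one well outside
touching = in_contact(a, [[0.5, 0.0, 0.0], [3.0, 0.0, 0.0]])
separated = in_contact(a, [[2.5, 0.0, 0.0], [3.0, 0.0, 0.0]])
```

The appeal of the formulation is that contact checks reduce to point-in-field lookups, which is why level-set particles interface so naturally with level set-based characterization of XRCT images.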

  8. Energy Use and Power Levels in New Monitors and Personal Computers

    SciTech Connect

    Roberson, Judy A.; Homan, Gregory K.; Mahajan, Akshay; Nordman, Bruce; Webber, Carrie A.; Brown, Richard E.; McWhinney, Marla; Koomey, Jonathan G.

    2002-07-23

    Our research was conducted in support of the EPA ENERGY STAR Office Equipment program, whose goal is to reduce the amount of electricity consumed by office equipment in the U.S. The most energy-efficient models in each office equipment category are eligible for the ENERGY STAR label, which consumers can use to identify and select efficient products. As the efficiency of each category improves over time, the ENERGY STAR criteria need to be revised accordingly. The purpose of this study was to provide reliable data on the energy consumption of the newest personal computers and monitors that the EPA can use to evaluate revisions to current ENERGY STAR criteria as well as to improve the accuracy of ENERGY STAR program savings estimates. We report the results of measuring the power consumption and power management capabilities of a sample of new monitors and computers. These results will be used to improve estimates of program energy savings and carbon emission reductions, and to inform revisions of the ENERGY STAR criteria for these products. Our sample consists of 35 monitors and 26 computers manufactured between July 2000 and October 2001; it includes cathode ray tube (CRT) and liquid crystal display (LCD) monitors, Macintosh and Intel-architecture computers, desktop and laptop computers, and integrated computer systems, in which power consumption of the computer and monitor cannot be measured separately. For each machine we measured power consumption when off, on, and in each low-power level. We identify trends in and opportunities to reduce power consumption in new personal computers and monitors. Our results include a trend among monitor manufacturers to provide a single very low low-power level, well below the current ENERGY STAR criteria for sleep power consumption. These very low sleep power results mean that energy consumed when monitors are off or in active use has become more important in terms of contribution to the overall unit energy consumption (UEC
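The unit energy consumption (UEC) bookkeeping behind such estimates amounts to weighting each power mode by its annual hours. A minimal sketch with made-up mode powers and duty cycles, not the study's measured data:

```python
# Annual UEC sketch: sum over power modes of (power draw x hours in mode).
# The wattages and hours below are illustrative, not measurements.

def annual_uec_kwh(mode_watts, mode_hours):
    """Annual unit energy consumption in kWh from per-mode power and hours."""
    assert mode_watts.keys() == mode_hours.keys()
    return sum(mode_watts[m] * mode_hours[m] for m in mode_watts) / 1000.0

watts = {"on": 60.0, "sleep": 2.0, "off": 1.0}
hours = {"on": 2000.0, "sleep": 2760.0, "off": 4000.0}   # 8760 h/year total
uec = annual_uec_kwh(watts, hours)
```

With a very low sleep power, the "on" and "off" terms dominate the sum, which is exactly the shift in relative importance the abstract describes.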

  9. Cooperative gene regulation by microRNA pairs and their identification using a computational workflow

    PubMed Central

    Schmitz, Ulf; Lai, Xin; Winter, Felix; Wolkenhauer, Olaf; Vera, Julio; Gupta, Shailendra K.

    2014-01-01

    MicroRNAs (miRNAs) are an integral part of gene regulation at the post-transcriptional level. Recently, it has been shown that pairs of miRNAs can repress the translation of a target mRNA in a cooperative manner, which leads to an enhanced effectiveness and specificity in target repression. However, it remains unclear which miRNA pairs can synergize and which genes are target of cooperative miRNA regulation. In this paper, we present a computational workflow for the prediction and analysis of cooperating miRNAs and their mutual target genes, which we refer to as RNA triplexes. The workflow integrates methods of miRNA target prediction; triplex structure analysis; molecular dynamics simulations and mathematical modeling for a reliable prediction of functional RNA triplexes and target repression efficiency. In a case study we analyzed the human genome and identified several thousand targets of cooperative gene regulation. Our results suggest that miRNA cooperativity is a frequent mechanism for an enhanced target repression by pairs of miRNAs facilitating distinctive and fine-tuned target gene expression patterns. Human RNA triplexes predicted and characterized in this study are organized in a web resource at www.sbi.uni-rostock.de/triplexrna/. PMID:24875477

  10. Integrating computational activities into the upper-level Paradigms in Physics curriculum at Oregon State University

    NASA Astrophysics Data System (ADS)

    McIntyre, David H.; Tate, Janet; Manogue, Corinne A.

    2008-04-01

    The Paradigms in Physics project at Oregon State University has reformed the entire upper-level physics curriculum. The reform has involved a rearrangement of content to better reflect the way physicists think about the field and the use of several new pedagogies that place responsibility for learning more firmly in the hands of the students. In particular, we employ a wide variety of computational examples and problems throughout the courses. Students use MAPLE, MATHEMATICA, JAVA, and other software packages to do calculations, visualizations, and simulations that develop their intuition and physical reasoning. These computational activities are indispensable to the success of the curriculum.

  11. Computation of the intervals of uncertainties about the parameters found for identification

    NASA Technical Reports Server (NTRS)

    Mereau, P.; Raymond, J.

    1982-01-01

    A modeling method for calculating the intervals of uncertainty for parameters found by identification is described. The region of confidence and the general approach to the calculation of these intervals are discussed. The general subprograms for determining the intervals are described, including their organizational charts, the tests carried out, and listings of the different subprograms.

  12. Parallel of low-level computer vision algorithms on a multi-DSP system

    NASA Astrophysics Data System (ADS)

    Liu, Huaida; Jia, Pingui; Li, Lijian; Yang, Yiping

    2011-06-01

    Parallel hardware has become a commonly used approach to satisfying the intensive computation demands of computer vision systems. A multiprocessor architecture based on hypercube-interconnected digital signal processors (DSPs) is described to exploit temporal and spatial parallelism. This paper presents a parallel implementation of low-level vision algorithms designed on a multi-DSP system. The convolution operation has been parallelized using redundant boundary partitioning. Performance of the parallel convolution operation is investigated by varying the image size, mask size and the number of processors. Experimental results show that the speedup is close to the ideal value. However, load imbalance among processors can significantly affect the computation time and speedup of the multi-DSP system.
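Redundant boundary partitioning can be sketched as follows: each horizontal strip of the image carries a halo of mask_size // 2 extra rows, so every strip can be convolved independently and the valid interiors concatenated with no inter-strip communication. A single-process illustrative sketch (the DSP mapping itself is not reproduced):

```python
import numpy as np

def conv2d_valid(img, mask):
    """Plain 'valid'-mode 2-D correlation (no kernel flipping, for simplicity)."""
    kr, kc = mask.shape
    out = np.zeros((img.shape[0] - kr + 1, img.shape[1] - kc + 1))
    for r in range(out.shape[0]):
        for c in range(out.shape[1]):
            out[r, c] = np.sum(img[r:r + kr, c:c + kc] * mask)
    return out

def conv2d_partitioned(img, mask, n_parts):
    """Convolve n_parts horizontal strips with redundant halo rows, then stack."""
    halo = mask.shape[0] // 2
    bounds = np.linspace(0, img.shape[0] - 2 * halo, n_parts + 1).astype(int)
    pieces = []
    for i in range(n_parts):
        lo, hi = bounds[i], bounds[i + 1]
        strip = img[lo:hi + 2 * halo, :]      # strip plus redundant boundary rows
        pieces.append(conv2d_valid(strip, mask))
    return np.vstack(pieces)

img = np.arange(80, dtype=float).reshape(10, 8)
mask = np.ones((3, 3))
parallel = conv2d_partitioned(img, mask, 3)
```

Because the halos duplicate a few boundary rows, total work grows slightly with the number of strips; that redundancy is the price paid to avoid communication, and uneven strip sizes are one source of the load imbalance noted in the abstract.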

  13. Selection of a computer code for Hanford low-level waste engineered-system performance assessment

    SciTech Connect

    McGrail, B.P.; Mahoney, L.A.

    1995-10-01

    Planned performance assessments for the proposed disposal of low-level waste (LLW) glass produced from remediation of wastes stored in underground tanks at Hanford, Washington will require calculations of radionuclide release rates from the subsurface disposal facility. These calculations will be done with the aid of computer codes. Currently available computer codes were ranked in terms of how well the feature sets implemented in each code match a set of physical, chemical, numerical, and functional capabilities needed to assess release rates from the engineered system. The needed capabilities were identified from an analysis of the important physical and chemical processes expected to affect LLW glass corrosion and the mobility of radionuclides. The highest-ranked computer code was found to be the ARES-CT code, developed at PNL for the US Department of Energy for the evaluation of land disposal sites.

  14. K-nearest neighbors based methods for identification of different gear crack levels under different motor speeds and loads: Revisited

    NASA Astrophysics Data System (ADS)

    Wang, Dong

    2016-03-01

    Gears are the most commonly used components in mechanical transmission systems. Their failures may cause transmission system breakdown and result in economic loss. Identification of different gear crack levels is important to prevent unexpected gear failure, because gear cracks lead to gear tooth breakage. Signal processing based methods mainly require expertise to interpret gear fault signatures, which is usually not easy for ordinary users. In order to automatically identify different gear crack levels, intelligent gear crack identification methods should be developed. Previous case studies experimentally proved that K-nearest neighbors based methods exhibit high prediction accuracies for identification of 3 different gear crack levels under different motor speeds and loads. In this short communication, to further enhance the prediction accuracies of existing K-nearest neighbors based methods and to extend identification from 3 to 5 different gear crack levels, redundant statistical features are constructed using the Daubechies 44 (db44) binary wavelet packet transform at different wavelet decomposition levels, prior to the use of a K-nearest neighbors method. The dimensionality of the redundant statistical features is 620, which provides richer gear fault signatures. Since many of these statistical features are redundant and highly correlated with each other, dimensionality reduction is conducted to obtain new significant statistical features. Finally, the K-nearest neighbors method is used to identify 5 different gear crack levels under different motor speeds and loads. A case study including 3 experiments is investigated to demonstrate that the developed method provides higher prediction accuracies than the existing K-nearest neighbors based methods for recognizing different gear crack levels under different motor speeds and loads. Based on the new significant statistical
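A minimal sketch of the reduce-then-classify pipeline, with synthetic two-class features standing in for the 620-dimensional wavelet-packet statistics (the paper's actual feature construction and reduction step are not reproduced):

```python
import numpy as np

# Dimensionality reduction via PCA (top-k principal directions from an SVD)
# followed by a plain K-nearest neighbors majority vote.

def reduce_dims(X, k):
    """Project centered features onto the top-k principal directions."""
    Xc = X - X.mean(axis=0)
    _, _, vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ vt[:k].T

def knn_predict(train_X, train_y, x, k=3):
    """Majority vote among the k nearest training samples (Euclidean distance)."""
    d = np.linalg.norm(train_X - x, axis=1)
    nearest = train_y[np.argsort(d)[:k]]
    return np.bincount(nearest).argmax()

rng = np.random.default_rng(0)
# two synthetic "crack levels": clusters around different feature means
X = np.vstack([rng.normal(0.0, 0.1, (20, 5)), rng.normal(1.0, 0.1, (20, 5))])
y = np.array([0] * 20 + [1] * 20)
Z = reduce_dims(X, 2)                       # 5 features -> 2 components
pred = knn_predict(Z, y, Z[35], k=3)        # a sample from the class-1 cluster
```

Reducing correlated features before the distance computation matters for KNN, since redundant dimensions inflate Euclidean distances without adding discriminative information.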

  15. Using the High-Level Based Program Interface to Facilitate the Large Scale Scientific Computing

    PubMed Central

    Shang, Yizi; Shang, Ling; Gao, Chuanchang; Lu, Guiming; Ye, Yuntao; Jia, Dongdong

    2014-01-01

    This paper extends research on facilitating large-scale scientific computing on grid and desktop grid platforms. The related issues include the programming method, the overhead of high-level program-interface based middleware, and data anticipation migration. The block-based Gauss-Jordan algorithm, as a real example of large-scale scientific computing, is used to evaluate the issues presented above. The results show that the high-level program interface makes complex scientific applications on a large-scale platform easier, though a little overhead is unavoidable. Also, the data anticipation migration mechanism can improve the efficiency of a platform that needs to process big-data based scientific applications. PMID:24574931
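A block-elimination step of the kind used in block Gauss-Jordan can be sketched via the Schur complement: partition a matrix into 2x2 blocks, eliminate the top-left block, and assemble the inverse blockwise. This is an illustrative single-machine sketch, not the paper's grid implementation.

```python
import numpy as np

# Invert M = [[A, B], [C, D]] blockwise using the Schur complement
# S = D - C A^-1 B, the key primitive behind block Gauss-Jordan elimination.

def block_inverse(A, B, C, D):
    Ainv = np.linalg.inv(A)
    S = D - C @ Ainv @ B                      # Schur complement of A
    Sinv = np.linalg.inv(S)
    top_left = Ainv + Ainv @ B @ Sinv @ C @ Ainv
    top_right = -Ainv @ B @ Sinv
    bot_left = -Sinv @ C @ Ainv
    return np.block([[top_left, top_right], [bot_left, Sinv]])

rng = np.random.default_rng(1)
M = rng.normal(size=(6, 6)) + 6.0 * np.eye(6)  # well-conditioned test matrix
A, B = M[:3, :3], M[:3, 3:]
C, D = M[3:, :3], M[3:, 3:]
Minv = block_inverse(A, B, C, D)
```

The attraction for grid computing is that each block operation (multiply, invert) is a coarse-grained task that can be distributed, with only block-sized data transfers between workers.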

  16. Computational Psychiatry of ADHD: Neural Gain Impairments across Marrian Levels of Analysis

    PubMed Central

    Hauser, Tobias U.; Fiore, Vincenzo G.; Moutoussis, Michael; Dolan, Raymond J.

    2016-01-01

    Attention-deficit hyperactivity disorder (ADHD), one of the most common psychiatric disorders, is characterised by unstable response patterns across multiple cognitive domains. However, the neural mechanisms that explain these characteristic features remain unclear. Using a computational multilevel approach, we propose that ADHD is caused by impaired gain modulation in systems that generate this phenotypic increased behavioural variability. Using Marr's three levels of analysis as a heuristic framework, we focus on this variable behaviour, detail how it can be explained algorithmically, and how it might be implemented at a neural level through catecholamine influences on corticostriatal loops. This computational, multilevel, approach to ADHD provides a framework for bridging gaps between descriptions of neuronal activity and behaviour, and provides testable predictions about impaired mechanisms. PMID:26787097

  17. System-level tools and reconfigurable computing for next-generation HWIL systems

    NASA Astrophysics Data System (ADS)

    Stark, Derek; McAulay, Derek; Cantle, Allan J.; Devlin, Malachy

    2001-08-01

    Previous work has been presented on the creation of computing architectures called DIME, which addressed the particular computing demands of hardware-in-the-loop systems. These demands include low latency, high data rates and interfacing. While it is essential to have a capable platform for handling and processing the data streams, the tools must complement this so that a systems engineer is able to construct the final system. This paper presents work on the integration of system-level design tools, such as MATLAB and SIMULINK, with a reconfigurable computing platform. It demonstrates how algorithms can be implemented and simulated in a familiar rapid application development environment before being automatically transposed for downloading directly to the computing platform. This complements the established control tools, which handle the configuration and control of the processing systems, leading to a tool suite for system development and implementation. As the development tools have evolved, the core processing platform has also been enhanced. These improved platforms are based on dynamically reconfigurable computing, utilizing FPGA technologies, and parallel processing methods that more than double the performance and data bandwidth capabilities. This offers support for processing images in Infrared Scene Projectors with 1024 × 1024 resolution at 400 Hz frame rates. The processing elements use the latest generation of FPGAs, which implies that the presented systems will be rated in terms of Tera (10^12) operations per second.

  18. Identification of tumor-associated cassette exons in human cancer through EST-based computational prediction and experimental validation

    PubMed Central

    2010-01-01

    Background Much evidence indicates that alternative splicing, the mechanism which produces mRNAs and proteins with different structures and functions from the same gene, is altered in cancer cells. Thus, the identification and characterization of cancer-specific splice variants may give great impetus to the discovery of novel diagnostic and prognostic tumour biomarkers, as well as of new targets for more selective and effective therapies. Results We present a genome-wide analysis of the alternative splicing pattern of human genes through a computational analysis of normal and cancer-specific ESTs from seventeen anatomical groups, using data available in AspicDB, a database resource for the analysis of alternative splicing in human. Using a statistical methodology, normal and cancer-specific genes, splice sites and cassette exons were predicted in silico. The condition association of some of the novel normal/tumoral cassette exons was experimentally verified by RT-qPCR assays in the same anatomical system where they were predicted. Remarkably, the in vivo presence of the predicted alternative transcripts, specific for the nervous system, was confirmed in patients affected by glioblastoma. Conclusion This study presents a novel computational methodology for the identification of tumor-associated transcript variants to be used as cancer molecular biomarkers, provides its experimental validation, and reports specific biomarkers for glioblastoma. PMID:20813049

  19. Zebra tape identification for the instantaneous angular speed computation and angular resampling of motorbike valve train measurements

    NASA Astrophysics Data System (ADS)

    Rivola, Alessandro; Troncossi, Marco

    2014-02-01

    An experimental test campaign was performed on the valve train of a racing motorbike engine in order to gain insight into the dynamics of the system. In particular, the valve motion was acquired in cold-test conditions by means of a laser vibrometer able to acquire displacement and velocity signals. The time-dependent valve measurements needed to be referred to the camshaft angular position in order to analyse the data in the angular domain, as is usually done for rotating machines. To this purpose the camshaft was fitted with a zebra tape whose dark and light stripes were tracked by means of an optical probe. Unfortunately, manufacturing and mounting imperfections of the zebra tape, resulting in stripes with slightly different widths, precluded the possibility of directly obtaining the correct relationship between camshaft angular position and time. To overcome this problem, the identification of the zebra tape was performed by means of an original and practical procedure that is the focus of the present paper. The method consists of three main steps: an ad-hoc test corresponding to special operating conditions, the computation of the instantaneous angular speed, and the final association of the stripes with the corresponding shaft angular positions. The results reported in the paper demonstrate the suitability of this simple procedure for zebra tape identification, performed with the final purpose of implementing a computed order tracking technique for the data analysis.
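The identification idea can be sketched numerically: under an assumed constant-speed pass, each stripe's angular width is proportional to its pulse duration (and the widths must sum to one revolution); thereafter, the instantaneous angular speed in any pass is the identified stripe width divided by that stripe's pulse duration. A toy sketch with made-up stripe timings, not the paper's measured data:

```python
import numpy as np

# Step 1: identify stripe widths from a constant-speed calibration pass.
# Step 2: use them to convert later pulse durations into angular speed.

def stripe_widths_rad(calib_pulse_durations):
    """Angular width of each stripe; at constant speed, width is
    proportional to pulse duration, and all widths sum to 2*pi."""
    t = np.asarray(calib_pulse_durations, dtype=float)
    return 2.0 * np.pi * t / t.sum()

def instantaneous_speed(widths, pulse_durations):
    """Per-stripe instantaneous angular speed (rad/s) in a measurement pass."""
    return widths / np.asarray(pulse_durations, dtype=float)

# 4 imperfect stripes: at constant speed their pulse durations are unequal
calib = [0.010, 0.012, 0.009, 0.011]
w = stripe_widths_rad(calib)
# a measurement pass at exactly twice the calibration speed
speed = instantaneous_speed(w, [t / 2 for t in calib])
```

With the identified widths, the stripe-width imperfections cancel and the computed speed comes out uniform for the uniform-speed pass, which is the property that makes angular resampling and order tracking reliable.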

  20. Handbook for the Identification and Assessment of Computer Courseware for the Adult Learner.

    ERIC Educational Resources Information Center

    Paul, Daniel M., Comp.

    This handbook provides evaluation guidelines, information on acquiring courseware, and evaluations and recommendations regarding available instructional computer software appropriate to the needs of adult learners enrolled in adult basic education or General Education Development. Section 1 addresses computer hardware problems and limitations,…

  1. POPCYCLE: a computer code for calculating nuclear and fossil plant levelized life-cycle power costs

    SciTech Connect

    Hardie, R.W.

    1982-02-01

    POPCYCLE, a computer code designed to calculate levelized life-cycle power costs for nuclear and fossil electrical generating plants, is described. Included are (1) derivations of the equations and a discussion of the methodology used by POPCYCLE, (2) a description of the input required by the code, (3) a listing of the input for a sample case, and (4) the output for a sample case.
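The levelization idea such a code implements can be sketched with the standard formula: discounted lifetime costs divided by discounted lifetime energy. The function below is a generic illustration under that textbook definition, not POPCYCLE's actual equations:

```python
def levelized_cost(costs, energies, discount_rate):
    """Levelized life-cycle power cost: present value of all costs divided
    by present value of all energy delivered (years indexed from 1)."""
    pv_cost = sum(c / (1 + discount_rate) ** t for t, c in enumerate(costs, 1))
    pv_energy = sum(e / (1 + discount_rate) ** t for t, e in enumerate(energies, 1))
    return pv_cost / pv_energy

# Toy 3-year plant: $100/yr in costs, 50 MWh/yr delivered, 5% discount rate.
lcoe = levelized_cost([100.0] * 3, [50.0] * 3, 0.05)   # $/MWh
```

With constant annual costs and output the discount factors cancel, so the result is simply 100/50 = 2 $/MWh; the levelization only matters when the cost and energy streams have different time profiles.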

  2. Real time computation of in vivo drug levels during drug self-administration experiments.

    PubMed

    Tsibulsky, Vladimir L; Norman, Andrew B

    2005-05-01

    A growing body of evidence suggests that the drug concentration in the effect compartment of the body is the major factor regulating self-administration behavior. A novel computer-based protocol was developed to facilitate studies on mechanisms of drug addiction by determining correlations between drug levels and behavior during multiple drug injections and infusions. The core of the system is a user's program written in Medstate Notation language (Med-Associates, Inc.), which runs the self-administration session (with MED-PC software and hardware, Med-Associates, Inc.) and calculates the levels of infused and/or injected drugs in real time during the session. From the comparison of classical exponential and simple linear models of first-order kinetics, it is concluded that exponential solutions for the appropriate differential equations may be replaced with linear equations if the cycle of computation is much shorter than the shortest half-life for the drug. The choice between particular computation equations depends on assumptions about the pharmacokinetics of the particular drug: (i) a one-, two-, or three-compartment model; (ii) a zero-, first-, or second-order process of elimination; (iii) whether the distribution and elimination half-lives of the drug are known or can be reasonably assumed; (iv) the dependence of the constants on the drug level; and (v) the temporal stability of all parameters during the session. This method of drug level computation can be employed not only for self-administration but also for other behavioral paradigms to advance pharmacokinetic/pharmacodynamic modeling. PMID:15878149
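The abstract's central point, that the exponential decay update may be replaced by a linear one when the computation cycle is much shorter than the half-life, can be illustrated with a hypothetical one-compartment, first-order model (this is a sketch of the principle, not the published protocol):

```python
import math

def simulate(half_life, dt, doses, n_steps, linear=False):
    """One-compartment, first-order elimination. `doses[i]` is the amount of
    drug added at step i; `dt` is the computation cycle in the same time
    units as `half_life`."""
    k = math.log(2) / half_life
    level, trace = 0.0, []
    for i in range(n_steps):
        level += doses.get(i, 0.0)
        if linear:
            level *= (1 - k * dt)          # linear approximation
        else:
            level *= math.exp(-k * dt)     # exact exponential decay
        trace.append(level)
    return trace

# dt (1 s) much shorter than the half-life (600 s): the two updates agree.
doses = {0: 10.0}
exact = simulate(600.0, 1.0, doses, 300)
approx = simulate(600.0, 1.0, doses, 300, linear=True)
```

After 300 s (half a half-life) the linear update deviates from the exact exponential by only about 0.01%, which is why the cheaper per-cycle multiplication is acceptable inside a real-time session loop.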

  3. Variance and bias computations for improved modal identification using ERA/DC

    NASA Technical Reports Server (NTRS)

    Longman, Richard W.; Lew, Jiann-Shiun; Tseng, Dong-Huei; Juang, Jer-Nan

    1991-01-01

    Variance and bias confidence criteria were recently developed for the eigensystem realization algorithm (ERA) identification technique. These criteria are extended for the modified version of ERA based on data correlation, ERA/DC, and also for the Q-Markov cover algorithm. The importance and usefulness of the variance and bias information are demonstrated in numerical studies. The criteria are shown to be very effective not only by indicating the accuracy of the identification results, especially in terms of confidence intervals, but also by helping the ERA user to obtain better results by seeing the effect of changing the sample time, adjusting the Hankel matrix dimension, choosing how many singular values to retain, deciding the model order, etc.

  4. Computational analysis of the curvature distribution and power losses of metal strip in tension levellers

    NASA Astrophysics Data System (ADS)

    Steinwender, L.; Kainz, A.; Krimpelstätter, K.; Zeman, K.

    2010-06-01

    Tension levelling is employed in strip processing lines to minimise residual stresses and to improve strip flatness by inducing small elasto-plastic deformations. To improve the design of such machines, precise calculation models are essential to reliably predict tension losses due to plastic dissipation, power requirements of the driven bridle rolls (located upstream and downstream), reaction forces on levelling rolls, as well as strains and stresses in the strip. FEM (Finite Element Method) simulations of the tension levelling process (based on Updated Lagrangian concepts) entail high computational costs due to the necessity of very fine meshes as well as the severely non-linear characteristics of contact, material and geometry. In an evaluation process of hierarchical models (models with different modelling levels), the reliability of both 3D and 2D modelling concepts (based on continuum and structural elements) was proven by extensive analyses as well as consistency checks against measurement data from an industrial tension leveller. To exploit the potential for computational cost savings, a customised modelling approach based on the principle of virtual work has been elaborated, which yields a drastic reduction in degrees of freedom compared to simulations with commercial FEM packages.

  5. Compiling high-level languages for configurable computers: applying lessons from heterogeneous processing

    NASA Astrophysics Data System (ADS)

    Weaver, Glen E.; Weems, Charles C.; McKinley, Kathryn S.

    1996-10-01

    Configurable systems offer increased performance by providing hardware that matches the computational structure of a problem. This hardware is currently programmed with CAD tools and explicit library calls. To attain widespread acceptance, configurable computing must become transparently accessible from high-level programming languages, but the changeable nature of the target hardware presents a major challenge to traditional compiler technology. A compiler for a configurable computer should optimize the use of functions embedded in hardware and schedule hardware reconfigurations. The hurdles to be overcome in achieving this capability are similar in some ways to those facing compilation for heterogeneous systems. For example, current traditional compilers have neither an interface to accept new primitive operators nor a mechanism for applying optimizations to new operators. We are building a compiler for heterogeneous computing, called Scale, which replaces the traditional monolithic compiler architecture with a flexible framework. Scale has three main parts: a translation director, a compilation library, and a persistent store which holds our intermediate representation as well as other data structures. The translation director exploits the framework's flexibility by using architectural information to build a plan to direct each compilation. The compilation library serves as a toolkit for use by the translation director. Our compiler intermediate representation, Score, facilitates the addition of new IR nodes by distinguishing features used in defining nodes from properties on which transformations depend. In this paper, we present an overview of the Scale architecture and its capabilities for dealing with heterogeneity, followed by a discussion of how those capabilities apply to problems in configurable computing. We then address aspects of configurable computing that are likely to require extensions to our approach and propose some extensions.

  6. Computer models for defining eustatic sea level fluctuations in carbonate rocks

    SciTech Connect

    Read, J.F.; Elrick, M.E.; Osleger, D.A. )

    1990-05-01

    One- and two-dimensional computer models of carbonate sequences help define the amplitudes and periods of 20,000 year to 1 m.y. and 1-3 m.y. sea level fluctuations that have affected carbonate rocks. The models show that with low-amplitude 20-100 k.y. sea level fluctuations, tidal flats are likely to extend across the platform during short-term regressions, and vadose diagenesis is limited because sea level rarely drops far below the platform surface. With high-amplitude 20-100 k.y. sea level fluctuations, tidal flats are confined to coastal locations, very deep-water facies are overlain by shallow-water beds, and during regression sea level falls far below the platform, leaving karstic surfaces. The models also allow testing of random vs. Milankovitch-driven sea level changes. The modeling shows that cyclic sedimentation due to autocyclic processes under static sea level is less likely than Milankovitch climatic forcing as an explanation for cyclic carbonate sequences. Models also help define the relative dominance of 100 k.y. vs. 20 or 40 k.y. sea level oscillations. The presence of shallow-ramp vs. deep-ramp upward-shallowing cycles, common on many platforms, provides valuable constraints on the modeling, in that the sea level fluctuations generating the shallow cycles also have to be able to generate the deeper ramp cycles. Sea level fluctuations of 1-3 m.y. are constrained by the modeling because overestimated amplitudes result in sequences and high-frequency cycles that are far too thick. Rates of long-term fall are constrained by the modeling because where the fall rate exceeds driving subsidence, the outer platform becomes unconformable, whereas it remains conformable where the fall rate is below the driving subsidence rate. Quantitative modeling techniques thus provide a means of constraining the amplitudes and frequencies of eustatic sea level fluctuations in ancient carbonate sequences.
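The 1D forward-modelling idea can be sketched as follows: eustatic sea level is a sum of Milankovitch-band sinusoids, the platform top subsides steadily, and sediment accumulates only while the top is flooded. All parameters below are illustrative, not the authors' calibrated values:

```python
import math

def sea_level(t_ky, amps=(10.0, 20.0), periods=(20.0, 100.0)):
    """Composite eustatic curve (m) from 20 k.y. and 100 k.y. terms."""
    return sum(a * math.sin(2 * math.pi * t_ky / p)
               for a, p in zip(amps, periods))

def run_model(n_ky, dt=1.0, subsidence=0.05, sed_rate=0.5):
    """Track platform-top elevation; deposit sediment only when submerged
    (subsidence and sed_rate in m/k.y.).  Records pre-deposition water
    depth each step; negative depth means subaerial exposure."""
    top, record = 0.0, []
    for step in range(int(n_ky / dt)):
        t = step * dt
        top -= subsidence * dt                # tectonic subsidence
        depth = sea_level(t) - top            # water depth (negative = exposed)
        if depth > 0:
            top += min(sed_rate * dt, depth)  # aggrade toward sea level
        record.append(depth)
    return record

depths = run_model(500)          # 500 k.y. of platform history
exposure = sum(d <= 0 for d in depths) / len(depths)
```

Because aggradation is capped at sea level while falls outpace subsidence, the top is repeatedly exposed and re-flooded, producing the stacked upward-shallowing cycles the abstract describes; varying the amplitudes is how such models discriminate low- from high-amplitude eustasy.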

  7. Evaluation of Staf-Sistem 18-R for identification of staphylococcal clinical isolates to the species level.

    PubMed Central

    Piccolomini, R; Catamo, G; Picciani, C; D'Antonio, D

    1994-01-01

    The accuracy and efficiency of Staf-Sistem 18-R (Liofilchem s.r.l., Roseto degli Abruzzi, Teramo, Italy) were compared with those of conventional biochemical methods to identify 523 strains belonging to 16 different human Staphylococcus species. Overall, 491 strains (93.9%) were correctly identified (percentage of identification, > or = 90.0), with 28 (5.4%) requiring supplementary tests for complete identification. For 14 isolates (2.8%), the strains did not correspond to any key in the codebook and could not be identified by the manufacturer's computer service. Only 18 isolates (3.4%) were misidentified. The system is simple to use, is easy to handle, gives highly reproducible results, and is inexpensive. With the inclusion of more discriminating tests and adjustment of supplementary code numbers for some species, such as Staphylococcus lugdunensis and Staphylococcus schleiferi, Staf-Sistem 18-R is a suitable alternative for identification of human coagulase-positive and coagulase-negative Staphylococcus species in microbiological laboratories. PMID:8195373

  8. Task analysis for computer-aided design (CAD) at a keystroke level.

    PubMed

    Chi, C F; Chung, K L

    1996-08-01

    The purpose of this research was to develop a new model to describe and predict a computerized task. AutoCAD was utilized as the experimental tool to collect operating procedure and time data at a keystroke level for a computer-aided design (CAD) task. Six undergraduate students participated in the experiment. They were required to complete one simple and one complex engineering drawing. A model which characterized the task performance by software commands and predicted task execution time using keystroke-level model operators was proposed and applied to the analysis of the dialogue data. This task parameter model adopted software commands, e.g. LINE and OFFSET in AutoCAD, to describe the function of a task unit, and used up to five parameters to indicate the number of keystrokes, the chosen function for a command, and the ways of starting and ending a command. Each task unit in the task parameter model can be replaced by a number of primitive operators as in the keystroke-level model to predict the task execution time. The observed task execution times of all task units were found to be highly correlated with the task execution times predicted by the keystroke-level model. Therefore, the task parameter model proved to be a usable analytical tool for evaluating the human-computer interface (HCI). PMID:15677066
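The keystroke-level model underlying this prediction simply sums published per-operator times. The sketch below uses the commonly cited Card, Moran and Newell operator estimates and a hypothetical AutoCAD LINE sequence; it illustrates the method, not the paper's task parameter model:

```python
# Typical keystroke-level model operator times (s), per Card, Moran & Newell.
KLM_TIMES = {
    "K": 0.28,   # keystroke (average skilled typist)
    "P": 1.10,   # point with mouse
    "B": 0.10,   # press or release mouse button
    "H": 0.40,   # home hands between keyboard and mouse
    "M": 1.35,   # mental preparation
}

def predict_time(operators):
    """Predicted execution time (s) for a string of KLM operators."""
    return sum(KLM_TIMES[op] for op in operators)

# Hypothetical LINE command: think, type "LINE"+Enter, home to mouse,
# point at the start position, click (press + release).
t = predict_time("M" + "K" * 5 + "H" + "P" + "B" * 2)
```

In the paper's terms, each task unit (a software command plus its parameters) expands into such an operator string, and the per-unit times are then compared against the observed dialogue timings.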

  9. A level set segmentation for computer-aided dental x-ray analysis

    NASA Astrophysics Data System (ADS)

    Li, Shuo; Fevens, Thomas; Krzyzak, Adam; Li, Song

    2005-04-01

    A level-set-based segmentation framework for Computer Aided Dental X-ray Analysis (CADXA) is proposed. In this framework, we first employ level set methods to segment the dental X-ray image into three regions: Normal Region (NR), Potential Abnormal Region (PAR), and Abnormal and Background Region (ABR). The segmentation results are then used to build uncertainty maps based on a proposed uncertainty measurement method, and an analysis scheme is applied. The level set segmentation method consists of two stages: a training stage and a segmentation stage. During the training stage, manually chosen representative images are segmented using hierarchical level set region detection. The segmentation results are used to train a support vector machine (SVM) classifier. During the segmentation stage, a dental X-ray image is first classified by the trained SVM. The classifier provides an initial contour, close to the correct boundary, for the coupled level set method, which is then used to further segment the image. Different dental X-ray images are used to test the framework. Experimental results show that the proposed framework achieves faster level set segmentation and provides more detailed information and indications of possible problems to the dentist. To the best of our knowledge, this is one of the first results on CADXA using level set methods.

  10. A Comparative Study to Evaluate the Effectiveness of Computer Assisted Instruction (CAI) versus Class Room Lecture (CRL) for Computer Science at ICS Level

    ERIC Educational Resources Information Center

    Kausar, Tayyaba; Choudhry, Bushra Naoreen; Gujjar, Aijaz Ahmed

    2008-01-01

    This study was aimed to evaluate the effectiveness of CAI vs. classroom lecture for computer science at ICS level. The objectives were to compare the learning effects of two groups with class room lecture and computer assisted instruction studying the same curriculum and the effects of CAI and CRL in terms of cognitive development. Hypothesis of…

  12. PCR detection, identification to species level, and fingerprinting of Campylobacter jejuni and Campylobacter coli direct from diarrheic samples.

    PubMed Central

    Linton, D; Lawson, A J; Owen, R J; Stanley, J

    1997-01-01

    Three sets of primers were designed for PCR detection and differentiation of Campylobacter jejuni and Campylobacter coli. The first PCR assay was designed to coidentify C. jejuni and C. coli based on their 16S rRNA gene sequences. The second PCR assay, based on the hippuricase gene sequence, identified all tested reference strains of C. jejuni, including strains of that species that lack detectable hippuricase activity. The third PCR assay, based on the sequence of a cloned (putative) aspartokinase gene and the downstream open reading frame, identified all tested reference strains of C. coli. The assays will find immediate application in the rapid identification of isolates to species level. Combined with a protocol for purification of total DNA from fecal samples, the assays allow reproducible PCR identification of campylobacters directly from stools. Of 20 clinical samples from which campylobacters had been cultured, we detected C. jejuni in 17, C. coli in 2, and coinfection of C. jejuni and Campylobacter hyointestinalis in 1. These results were concordant with culture and phenotypic identification to species level. Strain typing by PCR-restriction fragment length polymorphism of the flagellin (flaA) gene detected identical flaA types in fecal DNA and the corresponding campylobacter isolate. Twenty-five Campylobacter-negative stool samples gave no reaction with the PCR assays. These PCR assays can rapidly define the occurrence, species incidence, and flaA genotypes of enteropathogenic campylobacters. PMID:9316909

  13. Effect of Computer Simulations at the Particulate and Macroscopic Levels on Students' Understanding of the Particulate Nature of Matter

    ERIC Educational Resources Information Center

    Tang, Hui; Abraham, Michael R.

    2016-01-01

    Computer-based simulations can help students visualize chemical representations and understand chemistry concepts, but simulations at different levels of representation may vary in effectiveness on student learning. This study investigated the influence of computer activities that simulate chemical reactions at different levels of representation…

  14. Diagnostic reference levels and patient doses in computed tomography examinations in Greece.

    PubMed

    Simantirakis, G; Hourdakis, C J; Economides, S; Kaisas, I; Kalathaki, M; Koukorava, C; Manousaridis, G; Pafilis, C; Tritakis, P; Vogiatzi, S; Kamenopoulou, V; Dimitriou, P

    2015-02-01

    The purpose of this study is to present a national survey that was performed in Greece for the establishment of national Diagnostic Reference Levels (DRLs) for seven common adult Computed Tomography (CT) examinations. Volumetric computed tomography dose index and dose-length product values were collected from the post-data page of 65 'modern' systems that incorporate tube current modulation. Moreover, phantom dose measurements on 26 'older' systems were performed. Finally, the effective dose to the patient from a typical acquisition during these examinations was estimated. The suggested national DRLs are generally comparable with respective published values from similar European studies, with the exception of sinus CT, which presents significantly higher values. This fact, along with the large variation of the systems' dose values observed even for scanners of the same type, indicates a need for further patient protection optimisation without compromising the clinical outcome. PMID:24891405

  15. High-level waste storage tank farms/242-A evaporator standards/requirements identification document (S/RID), Vol. 7

    SciTech Connect

    Not Available

    1994-04-01

    This Requirements Identification Document (RID) describes an Occupational Health and Safety Program as defined through the Relevant DOE Orders, regulations, industry codes/standards, industry guidance documents and, as appropriate, good industry practice. The definition of an Occupational Health and Safety Program as specified by this document is intended to address Defense Nuclear Facilities Safety Board Recommendations 90-2 and 91-1, which call for the strengthening of DOE complex activities through the identification and application of relevant standards which supplement or exceed requirements mandated by DOE Orders. This RID applies to the activities, personnel, structures, systems, components, and programs involved in maintaining the facility and executing the mission of the High-Level Waste Storage Tank Farms.

  16. Dose Assessment in Computed Tomography Examination and Establishment of Local Diagnostic Reference Levels in Mazandaran, Iran

    PubMed Central

    Janbabanezhad Toori, A.; Shabestani-Monfared, A.; Deevband, M.R.; Abdi, R.; Nabahati, M.

    2015-01-01

    Background Medical X-rays are the largest man-made source of public exposure to ionizing radiation. While the benefits of Computed Tomography (CT) are well known for accurate diagnosis, those benefits are not risk-free. CT is a modality with a higher patient dose in comparison with other conventional radiation procedures. Objective This study is aimed at evaluating the radiation dose to patients from Computed Tomography (CT) examinations in Mazandaran hospitals and defining local diagnostic reference levels (DRLs). Methods Patient-related data on CT protocols for four common CT examinations including brain, sinus, chest and abdomen & pelvis were collected. In each center, Computed Tomography Dose Index (CTDI) measurements were performed using a pencil ionization chamber and a CT dosimetry phantom according to AAPM report No. 96 for those techniques. Then, the Weighted Computed Tomography Dose Index (CTDIw), Volume Computed Tomography Dose Index (CTDIvol) and Dose Length Product (DLP) were calculated. Results The CTDIw for brain, sinus, chest and abdomen & pelvis examinations ranged over 15.6-73, 3.8-25.8, 4.5-16.3 and 7-16.3 mGy, respectively. Values of DLP ranged over 197.4-981, 41.8-184, 131-342.3 and 283.6-486 mGy·cm for brain, sinus, chest and abdomen & pelvis, respectively. The 3rd quartile of the CTDIw distribution for each examination is the proposed quantity for the DRL. The DRLs of brain, sinus, chest and abdomen & pelvis were measured as 59.5, 17, 7.8 and 11 mGy, respectively. Conclusion The results of this study demonstrated a wide spread of doses for the same examination among different centers. For all examinations, our values were lower than international reference doses. PMID:26688796
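The DRL convention used in both CT surveys above, taking the 3rd quartile of the dose distribution across centres, is a one-liner with the standard library. The CTDIw values below are hypothetical, chosen only to span the brain range reported:

```python
import statistics

def diagnostic_reference_level(ctdi_w_values):
    """DRL as the 3rd quartile (75th percentile) of the CTDIw distribution
    across centres; values below that level are considered typical practice."""
    q = statistics.quantiles(ctdi_w_values, n=4, method="inclusive")
    return q[2]   # third quartile

# Hypothetical brain CTDIw readings (mGy) from six scanners.
drl = diagnostic_reference_level([15.6, 32.0, 45.5, 59.5, 61.0, 73.0])
```

Centres whose routine dose exceeds the DRL are then flagged for optimisation review, which is how the wide inter-centre spread noted in the conclusions is acted upon.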

  17. A Framework for Different Levels of Integration of Computational Models Into Web-Based Virtual Patients

    PubMed Central

    Narracott, Andrew J; Manini, Simone; Bayley, Martin J; Lawford, Patricia V; McCormack, Keith; Zary, Nabil

    2014-01-01

    Background Virtual patients are increasingly common tools used in health care education to foster learning of clinical reasoning skills. One potential way to expand their functionality is to augment virtual patients’ interactivity by enriching them with computational models of physiological and pathological processes. Objective The primary goal of this paper was to propose a conceptual framework for the integration of computational models within virtual patients, with particular focus on (1) characteristics to be addressed while preparing the integration, (2) the extent of the integration, (3) strategies to achieve integration, and (4) methods for evaluating the feasibility of integration. An additional goal was to pilot the first investigation of changing framework variables on altering perceptions of integration. Methods The framework was constructed using an iterative process informed by Soft System Methodology. The Virtual Physiological Human (VPH) initiative has been used as a source of new computational models. The technical challenges associated with development of virtual patients enhanced by computational models are discussed from the perspectives of a number of different stakeholders. Concrete design and evaluation steps are discussed in the context of an exemplar virtual patient employing the results of the VPH ARCH project, as well as improvements for future iterations. Results The proposed framework consists of four main elements. The first element is a list of feasibility features characterizing the integration process from three perspectives: the computational modelling researcher, the health care educationalist, and the virtual patient system developer. The second element included three integration levels: basic, where a single set of simulation outcomes is generated for specific nodes in the activity graph; intermediate, involving pre-generation of simulation datasets over a range of input parameters; advanced, including dynamic solution of the

  18. A Feature Selection Algorithm to Compute Gene Centric Methylation from Probe Level Methylation Data

    PubMed Central

    Baur, Brittany; Bozdag, Serdar

    2016-01-01

    DNA methylation is an important epigenetic event that affects gene expression during development and in various diseases such as cancer. Understanding the mechanism of action of DNA methylation is important for downstream analysis. In the Illumina Infinium HumanMethylation450K array, there are tens of probes associated with each gene. Given the methylation intensities of all these probes, it is necessary to compute which of them are most representative of the gene-centric methylation level. In this study, we developed a feature selection algorithm based on sequential forward selection that utilized different classification methods to compute gene-centric DNA methylation using probe-level DNA methylation data. We compared our algorithm to other feature selection algorithms such as support vector machines with recursive feature elimination, genetic algorithms, and ReliefF. We evaluated all methods based on the predictive power of the selected probes on their mRNA expression levels and found that K-Nearest Neighbors classification using the sequential forward selection algorithm performed better than the other algorithms on all metrics. We also observed that the transcriptional activities of certain genes were more sensitive to DNA methylation changes than those of other genes. Our algorithm was able to predict the expression of those genes with high accuracy using only DNA methylation data. Our results also showed that those DNA methylation-sensitive genes were enriched in Gene Ontology terms related to the regulation of various biological processes. PMID:26872146
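A minimal sketch of the sequential-forward-selection-plus-KNN idea: greedily add the probe whose inclusion most improves leave-one-out 1-NN accuracy at predicting an expression label. The data are synthetic (probe 1 tracks the label, the others are noise); the published algorithm and 450K data are far richer:

```python
def loo_1nn_accuracy(X, y, feats):
    """Leave-one-out 1-nearest-neighbour accuracy using only `feats`."""
    correct = 0
    for i in range(len(X)):
        best_d, best_j = None, None
        for j in range(len(X)):
            if j == i:
                continue
            d = sum((X[i][f] - X[j][f]) ** 2 for f in feats)
            if best_d is None or d < best_d:
                best_d, best_j = d, j
        correct += (y[best_j] == y[i])
    return correct / len(X)

def forward_select(X, y, k):
    """Greedily grow a probe subset of size k by best LOO 1-NN accuracy."""
    selected = []
    while len(selected) < k:
        remaining = [f for f in range(len(X[0])) if f not in selected]
        scores = {f: loo_1nn_accuracy(X, y, selected + [f]) for f in remaining}
        selected.append(max(scores, key=scores.get))
    return selected

# Synthetic beta-values for 3 probes over 6 samples; probe 1 carries signal.
X = [[0.9, 0.1, 0.5], [0.2, 0.2, 0.6], [0.8, 0.8, 0.4],
     [0.1, 0.9, 0.5], [0.7, 0.15, 0.55], [0.3, 0.85, 0.45]]
y = [0, 0, 1, 1, 0, 1]        # expression class per sample
probes = forward_select(X, y, 1)
```

The selected probe(s) then stand in for the gene-centric methylation level; in the paper the wrapper classifier and evaluation against measured mRNA levels play the role of this toy LOO score.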

  19. TPASS: a gamma-ray spectrum analysis and isotope identification computer code

    SciTech Connect

    Dickens, J.K.

    1981-03-01

    The gamma-ray spectral data-reduction and analysis computer code TPASS is described. This computer code is used to analyze complex Ge(Li) gamma-ray spectra to obtain peak areas corrected for detector efficiencies, from which are determined gamma-ray yields. These yields are compared with an isotope gamma-ray data file to determine the contributions to the observed spectrum from decay of specific radionuclides. A complete FORTRAN listing of the code and a complex test case are given.

  20. Computer-aided identification of the water diffusion coefficient for maize kernels dried in a thin layer

    NASA Astrophysics Data System (ADS)

    Kujawa, Sebastian; Weres, Jerzy; Olek, Wiesław

    2016-07-01

    Uncertainties in mathematical modelling of water transport in cereal grain kernels during drying and storage are mainly due to implementing unreliable values of the water diffusion coefficient and simplifying the geometry of kernels. In the present study an attempt was made to reduce the uncertainties by developing a method for computer-aided identification of the water diffusion coefficient and more accurate 3D geometry modelling for individual kernels using original inverse finite element algorithms. The approach was exemplified by identifying the water diffusion coefficient for maize kernels subjected to drying. On the basis of the developed method, values of the water diffusion coefficient were estimated, 3D geometry of a maize kernel was represented by isoparametric finite elements, and the moisture content inside maize kernels dried in a thin layer was predicted. Validation of the results against experimental data showed significantly lower error values than in the case of results obtained for the water diffusion coefficient values available in the literature.
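A highly simplified version of the inverse-identification idea: the paper uses inverse finite elements on true 3D kernel geometry, but the principle of choosing the diffusion coefficient D that best reproduces measured drying data can be shown with the first term of a 1D slab solution and a grid search (all numbers synthetic):

```python
import math

def moisture_ratio(D, t, L=2e-3):
    """First term of the analytic slab drying solution; L is the
    half-thickness (m), D the diffusion coefficient (m^2/s)."""
    return (8 / math.pi**2) * math.exp(-math.pi**2 * D * t / (4 * L**2))

def identify_D(times, data, candidates):
    """Inverse step: pick the candidate D minimising squared error
    against the measured moisture ratios."""
    def sse(D):
        return sum((moisture_ratio(D, t) - m) ** 2 for t, m in zip(times, data))
    return min(candidates, key=sse)

# Synthetic "measurements" generated with D_true, then recovered.
D_true = 5e-11
times = [0, 3600, 7200, 14400, 28800]            # s
data = [moisture_ratio(D_true, t) for t in times]
candidates = [i * 1e-11 for i in range(1, 11)]
D_hat = identify_D(times, data, candidates)
```

The paper's inverse finite element algorithm replaces both the analytic forward model (with a 3D isoparametric mesh of the kernel) and the grid search (with a proper optimisation), but the fit-forward-model-to-data structure is the same.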

  1. Computation of expectation values from vibrational coupled-cluster at the two-mode coupling level

    NASA Astrophysics Data System (ADS)

    Zoccante, Alberto; Seidler, Peter; Christiansen, Ove

    2011-04-01

    In this work we show how the vibrational coupled-cluster method at the two-mode coupling level can be used to calculate zero-point vibrational averages of properties. A technique is presented where any expectation value can be calculated using a single set of Lagrangian multipliers computed by iteratively solving a single linear set of equations. Sample calculations are presented which show that the resulting algorithm scales only with the third power of the number of modes, thereby making large systems accessible. Moreover, we present applications to water, pyrrole, and para-nitroaniline.

  2. ROLE OF COMPUTATIONAL CHEMISTRY IN SUPPORT OF HAZARD IDENTIFICATION (ID) MECHANISM-BASED SARS

    EPA Science Inventory

    A mechanism-based structure-activity relationship (SAR) study attempts to determine a structural basis for activity by targeting a single or a few stages in a postulated mechanism of action. Computational chemistry approaches provide a valuable complement to experiment for probing...

  3. IUE data reduction: Wavelength determinations and line identifications using a VAX/750 computer

    NASA Astrophysics Data System (ADS)

    Davidson, J. P.; Bord, D. J.

    A fully automated, interactive system for determining the wavelengths of features in extracted IUE spectra is described. Wavelengths are recorded from video displays of expanded plots of individual orders using a movable cursor and then corrected for IUE wavelength scale errors. The estimated accuracy of an individual wavelength in the final tabulation is 0.050 Å. Such lists are ideally suited for line identification work using the method of wavelength coincidence statistics (WCS). The results of WCS studies of the ultraviolet spectra of the chemically peculiar (CP) stars iota Coronae Borealis and kappa Cancri are presented. Aside from confirming a number of previously reported aspects of the abundance patterns in these stars, the searches produced some interesting new discoveries, notably the presence of Hf in the spectrum of kappa Cancri. The implications of this work for theories designed to account for anomalous abundances in chemically peculiar stars are discussed.

  4. Computer Aided Drug Design Approaches for Identification of Novel Autotaxin (ATX) Inhibitors.

    PubMed

    Vrontaki, Eleni; Melagraki, Georgia; Kaffe, Eleanna; Mavromoustakos, Thomas; Kokotos, George; Aidinis, Vassilis; Afantitis, Antreas

    2016-01-01

    Autotaxin (ATX) has become an attractive target of considerable pharmacological and pharmacochemical interest in LPA-related diseases, and to date many small organic molecules have been explored as potential ATX inhibitors. As a useful aid in the various efforts to identify novel effective ATX inhibitors, in silico methods can serve as an important and valuable tool. In particular, Virtual Screening (VS) has recently received increased attention due to the large datasets made available, the development of advanced VS techniques, and the encouraging fact that VS has contributed to the discovery of several compounds that have either reached the market or entered clinical trials. Different techniques and workflows have been reported in the literature with the goal of prioritizing possible potent hits. In this review article, several deployed virtual screening strategies for the identification of novel potent ATX inhibitors are described. PMID:26997151

  5. A comparative analysis of computational approaches and algorithms for protein subcomplex identification

    PubMed Central

    Zaki, Nazar; Mora, Antonio

    2014-01-01

    High-throughput AP-MS methods have allowed the identification of many protein complexes. However, most post-processing methods for this type of data have focused on the detection of protein complexes and not their subcomplexes. Here, we review the results of some existing methods that may allow subcomplex detection and propose alternative methods in order to detect subcomplexes from AP-MS data. We assessed and drew comparisons between the use of overlapping clustering methods, methods based on the core-attachment model, and our own prediction strategy (TRIBAL). The hypothesis behind TRIBAL is that subcomplex-building information may be concealed in the multiple edges generated by an interaction repeated in different contexts in raw data. The CACHET method offered the best results when the evaluation of the predicted subcomplexes was carried out using both the hypergeometric and geometric scores. TRIBAL offered the best performance when using a strict meet-min score. PMID:24584908

  6. The Effects of Linear Microphone Array Changes on Computed Sound Exposure Level Footprints

    NASA Technical Reports Server (NTRS)

    Mueller, Arnold W.; Wilson, Mark R.

    1997-01-01

    Airport land planning commissions often are faced with determining how much area around an airport is affected by the sound exposure levels (SELs) associated with helicopter operations. This paper presents a study of the effects that changing the size and composition of a microphone array has on the computed SEL contour (ground footprint) areas used by such commissions. Descent flight acoustic data measured by a fifteen-microphone array were reprocessed for five different combinations of microphones within this array, yielding data for six different arrays for which SEL contours were computed. The fifteen-microphone array was defined as the 'baseline' array since it contained the greatest amount of data. The computations used a newly developed technique, the Acoustic Re-propagation Technique (ART), which uses parts of the NASA noise prediction program ROTONET. After the areas of the SEL contours were calculated, the differences between the areas were determined. The area differences for the six arrays show that a five- and a three-microphone array (with spacing typical of that required by the FAA FAR Part 36 noise certification procedure) compare well with the fifteen-microphone array. All data were obtained from a database resulting from a joint project conducted by NASA and U.S. Army researchers at Langley and Ames Research Centers. The joint project's test design, microphone array set-up, and data reduction methodology are briefly described.
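    For readers unfamiliar with the metric behind the contours, a minimal sketch of how an SEL value is computed from a sampled pressure trace, assuming the standard 20 µPa reference pressure and 1 s reference duration (the ART/ROTONET processing in the study is of course far more involved):

```python
import math

P_REF = 20e-6  # reference pressure, 20 micropascals
T_REF = 1.0    # SEL normalizes event energy to a 1 s reference duration

def sound_exposure_level(pressures, dt):
    """SEL in dB for a sampled pressure trace p(t) in Pa, sample spacing dt in s."""
    energy = sum(p * p for p in pressures) * dt  # time-integrated squared pressure
    return 10.0 * math.log10(energy / (P_REF ** 2 * T_REF))
```

    A constant 20 µPa tone lasting exactly 1 s gives 0 dB; doubling the event duration at the same level adds about 3 dB, which is the sense in which SEL accumulates exposure rather than reporting an instantaneous level.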

  7. A cascadic monotonic time-discretized algorithm for finite-level quantum control computation

    NASA Astrophysics Data System (ADS)

    Ditz, P.; Borzì, A.

    2008-03-01

    A computer package (CNMS) is presented aimed at the solution of finite-level quantum optimal control problems. This package is based on a recently developed computational strategy known as monotonic schemes. Quantum optimal control problems arise in particular in quantum optics where the optimization of a control representing laser pulses is required. The purpose of the external control field is to channel the system's wavefunction between given states in its most efficient way. Physically motivated constraints, such as limited laser resources, are accommodated through appropriately chosen cost functionals.

    Program summary
    Program title: CNMS
    Catalogue identifier: ADEB_v1_0
    Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADEB_v1_0.html
    Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
    Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html
    No. of lines in distributed program, including test data, etc.: 770
    No. of bytes in distributed program, including test data, etc.: 7098
    Distribution format: tar.gz
    Programming language: MATLAB 6
    Computer: AMD Athlon 64 X2 Dual, 2.21 GHz, 1.5 GB RAM
    Operating system: Microsoft Windows XP
    Word size: 32
    Classification: 4.9
    Nature of problem: Quantum control
    Solution method: Iterative
    Running time: 60-600 sec

  8. Parallel computation of level set method for 500 Hz visual servo control

    NASA Astrophysics Data System (ADS)

    Fei, Xianfeng; Igarashi, Yasunobu; Hashimoto, Koichi

    2008-11-01

    We propose a 2D microorganism tracking system using a parallel level set method and a column parallel vision system (CPV). This system keeps a single microorganism in the middle of the visual field under a microscope by visually servoing an automated stage. We propose a new energy function for the level set method that constrains the amount of light intensity inside the detected object contour in order to control the number of detected objects. The algorithm is implemented in the CPV system, and the computation time per frame is approximately 2 ms. A tracking experiment of about 25 s is demonstrated. We also demonstrate that a single paramecium can be kept in track even when other paramecia appear in the visual field and contact the tracked paramecium.

  9. Identification of a unique cause of ring artifact seen in computed tomography trans-axial images.

    PubMed

    Jha, Ashish Kumar; Purandare, Nilendu C; Shah, Sneha; Agrawal, Archi; Puranik, Ameya D; Rangarajan, Venkatesh

    2013-10-01

    Artifacts present in a computed tomography (CT) image often degrade the image quality and, ultimately, the diagnostic outcome. A ring artifact in a trans-axial image is caused by either a miscalibrated or a defective detector element in the detector row, and is often categorized as a scanner-based artifact. Here, a ring artifact detected on a trans-axial CT image from a positron emission tomography/computed tomography (PET/CT) system was caused by contamination of the CT tube aperture by a droplet of injectable contrast medium. The artifact was corrected by removing the contrast droplet from the CT tube aperture. Ring artifacts are common and widely cited in the literature; our case puts forward an uncommon cause and its method of correction, neither of which has been reported previously. PMID:24379535

  10. Efficient Calibration of Computationally Intensive Groundwater Models through Surrogate Modelling with Lower Levels of Fidelity

    NASA Astrophysics Data System (ADS)

    Razavi, S.; Anderson, D.; Martin, P.; MacMillan, G.; Tolson, B.; Gabriel, C.; Zhang, B.

    2012-12-01

    A computationally intensive groundwater modelling case study developed with the FEFLOW software is used to evaluate the proposed methodology. Multiple surrogates of this computationally intensive model with different levels of fidelity are created and applied. The dynamically dimensioned search (DDS) optimization algorithm is used as the search engine in the surrogate-enabled calibration framework. Results show that this framework can substantially reduce the number of original model evaluations required for calibration by intelligently utilizing faster-to-run surrogates in the course of optimization. Results also demonstrate that the compromise between efficiency (reduced run time) and fidelity of a surrogate model is critically important to the success of the framework, as a surrogate with unreasonably low fidelity, despite being fast, might be quite misleading in calibration of the original model of interest.
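    As background on the search engine, the core DDS idea (after Tolson and Shoemaker) can be sketched in a few lines: at each iteration a dynamically shrinking random subset of decision variables is perturbed, and the candidate is kept only if it improves the objective. This is a generic illustration on a toy objective, not the FEFLOW calibration setup:

```python
import math, random

def dds(objective, lo, hi, max_iter=2000, r=0.2, seed=1):
    """Minimal sketch of dynamically dimensioned search (minimization)."""
    rng = random.Random(seed)
    n = len(lo)
    best = [rng.uniform(lo[j], hi[j]) for j in range(n)]
    best_f = objective(best)
    for i in range(1, max_iter + 1):
        # Probability of perturbing each dimension decays as the search matures.
        p = 1.0 - math.log(i) / math.log(max_iter)
        dims = [j for j in range(n) if rng.random() < p] or [rng.randrange(n)]
        cand = best[:]
        for j in dims:
            cand[j] += rng.gauss(0.0, r * (hi[j] - lo[j]))
            # Reflect at the bounds, then clamp as a safeguard.
            if cand[j] < lo[j]:
                cand[j] = lo[j] + (lo[j] - cand[j])
            if cand[j] > hi[j]:
                cand[j] = hi[j] - (cand[j] - hi[j])
            cand[j] = min(max(cand[j], lo[j]), hi[j])
        f = objective(cand)
        if f < best_f:  # greedy acceptance of improvements only
            best, best_f = cand, f
    return best, best_f

# Example: minimize a 2-D sphere function on [-5, 5]^2.
sphere = lambda x: sum(v * v for v in x)
best, best_f = dds(sphere, lo=[-5.0, -5.0], hi=[5.0, 5.0])
```

    The decaying perturbation probability is what scales the algorithm's behavior from global exploration to local refinement within a fixed evaluation budget, which is why it pairs well with expensive models and their surrogates.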

  11. Establishment of multi-slice computed tomography (MSCT) reference level in Johor, Malaysia

    NASA Astrophysics Data System (ADS)

    Karim, M. K. A.; Hashim, S.; Bakar, K. A.; Muhammad, H.; Sabarudin, A.; Ang, W. C.; Bahruddin, N. A.

    2016-03-01

    Radiation doses from computed tomography (CT) are among the highest and most hazardous of all imaging modalities. This study aimed to evaluate radiation doses to patients in Johor, Malaysia, during computed tomography examinations of the brain, chest and abdomen, and to establish local diagnostic reference levels (DRLs) for current state-of-the-art multi-slice CT scanners. Survey forms were sent to five centres performing CT to obtain data on acquisition parameters as well as dose information from the CT consoles. CT-EXPO (Version 2.3.1, Germany) was used to validate the dose information. The proposed DRLs were obtained by rounding the third quartiles of the whole dose distributions; mean values of CTDIw (mGy), CTDIvol (mGy) and DLP (mGy.cm) were comparable with other reference levels: 63, 63, and 1015 respectively for CT brain; 15, 14, and 450 respectively for CT thorax; and 16, 17, and 590 respectively for CT abdomen. The study revealed that CT practice and dose output have changed markedly and must keep pace with newly introduced technology. We suggest that CTDIvol be included in the current national DRLs, as modern CT scanners are configured with a higher number of detectors and are independent of pitch factors.
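    The third-quartile convention for setting a DRL can be sketched as follows; the dose values are illustrative rather than the survey's data, and the linear-interpolation percentile definition is one common choice among several:

```python
# DRLs are conventionally set by rounding the third quartile (75th
# percentile) of the surveyed dose distribution for one examination type.
def third_quartile_drl(doses):
    s = sorted(doses)
    k = 0.75 * (len(s) - 1)      # linear-interpolation percentile position
    f = int(k)
    q3 = s[f] + (k - f) * (s[min(f + 1, len(s) - 1)] - s[f])
    return round(q3)

# Hypothetical CTDIvol values (mGy) reported by five surveyed centres:
ctdivol_brain = [55, 60, 62, 64, 70]
drl = third_quartile_drl(ctdivol_brain)
```

    Using the third quartile rather than the mean means a centre exceeding the DRL is, by construction, in the top quarter of surveyed practice and should review its protocols.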

  12. Power levels in office equipment: Measurements of new monitors and personal computers

    SciTech Connect

    Roberson, Judy A.; Brown, Richard E.; Nordman, Bruce; Webber, Carrie A.; Homan, Gregory H.; Mahajan, Akshay; McWhinney, Marla; Koomey, Jonathan G.

    2002-05-14

    Electronic office equipment has proliferated rapidly over the last twenty years and is projected to continue growing in the future. Efforts to reduce the growth in office equipment energy use have focused on power management to reduce the power consumption of electronic devices when they are not being used for their primary purpose. The EPA ENERGY STAR[registered trademark] program has been instrumental in gaining widespread support for power management in office equipment, and accurate information about the energy used by office equipment at all power levels is important to improving program design and evaluation. This paper presents the results of a field study conducted during 2001 to measure the power levels of new monitors and personal computers. We measured off, on, and low-power levels in about 60 units manufactured since July 2000. The paper summarizes the power data collected, explores differences within the sample (e.g., between CRT and LCD monitors), and discusses some issues that arise in metering office equipment. We also present conclusions to help improve the success of future power management programs. Our findings include a trend among monitor manufacturers to provide a single, very low low-power level, and a need to standardize methods for measuring monitor on power in order to more accurately estimate the annual energy consumption of office equipment, as well as actual and potential energy savings from power management.

  13. Computational Identification of Genomic Features That Influence 3D Chromatin Domain Formation

    PubMed Central

    Mourad, Raphaël; Cuvier, Olivier

    2016-01-01

    Recent advances in long-range Hi-C contact mapping have revealed the importance of the 3D structure of chromosomes in gene expression. A current challenge is to identify the key molecular drivers of this 3D structure. Several genomic features, such as architectural proteins and functional elements, were shown to be enriched at topological domain borders using classical enrichment tests. Here we propose multiple logistic regression to identify those genomic features that positively or negatively influence domain border establishment or maintenance. The model is flexible, and can account for statistical interactions among multiple genomic features. Using both simulated and real data, we show that our model outperforms enrichment tests and non-parametric models, such as random forests, for the identification of genomic features that influence domain borders. Using Drosophila Hi-C data at a very high resolution of 1 kb, our model suggests that, among architectural proteins, BEAF-32 and CP190 are the main positive drivers of 3D domain borders. In humans, our model identifies well-known architectural proteins CTCF and cohesin, as well as ZNF143 and Polycomb group proteins as positive drivers of domain borders. The model also reveals the existence of several negative drivers that counteract the presence of domain borders including P300, RXRA, BCL11A and ELK1. PMID:27203237
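    A minimal sketch of the underlying idea, not the authors' implementation: fit a logistic regression of border presence on genomic features, and read the sign of each coefficient as a positive or negative driver (statistical interactions would enter as additional product features). The data here are synthetic, with feature 0 constructed as a positive driver and feature 1 as a negative one:

```python
import math, random

def fit_logistic(X, y, lr=0.1, epochs=800):
    """Plain gradient-descent logistic regression: a bias plus one weight per
    feature; a positive weight marks a feature that favours a domain border."""
    n, d = len(X), len(X[0])
    w, b = [0.0] * d, 0.0
    for _ in range(epochs):
        gw, gb = [0.0] * d, 0.0
        for xi, yi in zip(X, y):
            z = b + sum(wj * xj for wj, xj in zip(w, xi))
            err = 1.0 / (1.0 + math.exp(-z)) - yi   # predicted prob - label
            gb += err
            for j in range(d):
                gw[j] += err * xi[j]
        b -= lr * gb / n
        w = [wj - lr * gj / n for wj, gj in zip(w, gw)]
    return w, b

# Synthetic genomic windows: feature 0 (e.g. an architectural-protein signal)
# raises border probability, feature 1 (a hypothetical negative driver) lowers it.
rng = random.Random(0)
X = [[rng.random(), rng.random()] for _ in range(400)]
y = [1 if 2.5 * x0 - 2.5 * x1 + rng.gauss(0, 0.5) > 0 else 0 for x0, x1 in X]
w, b = fit_logistic(X, y)
```

    Unlike a per-feature enrichment test, the regression estimates each coefficient while holding the other features fixed, which is what lets correlated features be separated into positive and negative drivers.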

  14. Social Identification as a Determinant of Concerns about Individual-, Group-, and Inclusive-Level Justice

    ERIC Educational Resources Information Center

    Wenzel, Michael

    2004-01-01

    Extending concepts of micro- and macrojustice, three levels of justice are distinguished. Individual-, group-, and inclusive-level justice are defined in terms of the target of justice concerns: one's individual treatment, one's group's treatment, and the distribution in the collective (e.g., nation). Individual-level justice permits a more…

  15. Identification and Utilization of Employer Requirements for Entry-Level Health Occupations Workers. Final Report.

    ERIC Educational Resources Information Center

    Zukowski, James J.

    The purpose of a research project was to identify employer expectations regarding entry-level competency requirements for selected assistance-level health occupations. Physicians, dentists, and other health professionals reviewed lists of competencies associated with the performance of assistance-level employees in nursing, medical laboratory, and…

  16. Computational identification of putative miRNAs and their target genes in pathogenic amoeba Naegleria fowleri.

    PubMed

    Padmashree, Dyavegowda; Swamy, Narayanaswamy Ramachandra

    2015-01-01

    Naegleria fowleri is a parasitic, unicellular, free-living eukaryotic amoeba. The parasite spreads through contaminated water and causes primary amoebic meningoencephalitis (PAM). It is therefore of interest to understand its molecular pathogenesis. Hence, we analyzed the parasite genome for miRNAs (microRNAs), which are non-coding, single-stranded RNA molecules. Using computational methods we identified 245 miRNAs in N. fowleri, of which five are conserved. The predicted miRNA targets were analyzed using miRanda (software), and their functions were further studied by annotation with AmiGO (a gene ontology web tool). PMID:26770029

  17. Computational identification of putative miRNAs and their target genes in pathogenic amoeba Naegleria fowleri

    PubMed Central

    Padmashree, Dyavegowda; Swamy, Narayanaswamy Ramachandra

    2015-01-01

    Naegleria fowleri is a parasitic, unicellular, free-living eukaryotic amoeba. The parasite spreads through contaminated water and causes primary amoebic meningoencephalitis (PAM). It is therefore of interest to understand its molecular pathogenesis. Hence, we analyzed the parasite genome for miRNAs (microRNAs), which are non-coding, single-stranded RNA molecules. Using computational methods we identified 245 miRNAs in N. fowleri, of which five are conserved. The predicted miRNA targets were analyzed using miRanda (software), and their functions were further studied by annotation with AmiGO (a gene ontology web tool). PMID:26770029

  18. Identification of student misconceptions in genetics problem solving via computer program

    NASA Astrophysics Data System (ADS)

    Browning, Mark E.; Lehman, James D.

    A genetics problem practice program and tutor on microcomputer was used by 135 undergraduate education majors enrolled in an introductory biology course at Purdue University. The program presented four genetics problems, two monohybrid and two dihybrid, and required the users to predict the number and type of each class of offspring. Student responses were recorded on diskette and analyzed for evidence of misconceptions and difficulties in the genetics problem-solving process. Three main areas of difficulty were identified: difficulties with computational skills, difficulties in the determination of gametes, and inappropriate application of previous learning to new problem situations.

  19. Identification and area estimation of agricultural crops by computer classification of Landsat MSS data

    NASA Technical Reports Server (NTRS)

    Bauer, M. E.; Cipra, J. E.; Anuta, P. E.; Etheridge, J. B.

    1979-01-01

    Landsat Multispectral Scanner (MSS) data covering a three-county area in northern Illinois were classified using computer-aided techniques as corn, soybeans, or 'other.' Recognition of test fields was 80% accurate. County estimates of the area of corn and soybeans agreed closely with those made by the USDA. Results of the use of a priori information in classification, techniques to produce unbiased area estimates, and the use of temporal and spatial features for classification are discussed. The extendability, variability, and size of training sets, wavelength band selection, and spectral characteristics of crops were also investigated.

  20. Identification of miRNAs Potentially Involved in Bronchiolitis Obliterans Syndrome: A Computational Study

    PubMed Central

    Politano, Gianfranco; Inghilleri, Simona; Morbini, Patrizia; Calabrese, Fiorella; Benso, Alfredo; Savino, Alessandro; Cova, Emanuela; Zampieri, Davide; Meloni, Federica

    2016-01-01

    The pathogenesis of Bronchiolitis Obliterans Syndrome (BOS), the main clinical phenotype of chronic lung allograft dysfunction, is poorly understood. Recent studies suggest that epigenetic regulation of microRNAs might play a role in its development. In this paper we present the application of a complex computational pipeline to perform enrichment analysis of miRNAs in pathways applied to the study of BOS. The analysis considered the full set of miRNAs annotated in miRBase (version 21) and applied a sequence of filtering approaches and statistical analyses to reduce this set and to score the candidate miRNAs according to their potential involvement in BOS development. Dysregulation of two of the selected candidate miRNAs, miR-34a and miR-21, was clearly shown by in situ hybridization (ISH) on five explanted human BOS lungs and in a rat model of acute and chronic lung rejection, definitively identifying miR-34a and miR-21 as pathogenic factors in BOS and confirming the effectiveness of the computational pipeline. PMID:27564214

  1. Myocardial strain estimation from CT: towards computer-aided diagnosis on infarction identification

    NASA Astrophysics Data System (ADS)

    Wong, Ken C. L.; Tee, Michael; Chen, Marcus; Bluemke, David A.; Summers, Ronald M.; Yao, Jianhua

    2015-03-01

    Regional myocardial strains have the potential for early quantification and detection of cardiac dysfunction. Although image modalities such as tagged and strain-encoded MRI can provide motion information of the myocardium, they are uncommon in clinical routine. In contrast, cardiac CT images are usually available, but they only provide motion information at salient features such as the cardiac boundaries. To estimate myocardial strains from a CT image sequence, we adopted a cardiac biomechanical model with hyperelastic material properties to relate the motion of the cardiac boundaries to the myocardial deformation. The frame-to-frame displacements of the cardiac boundaries are obtained using B-spline deformable image registration based on mutual information, and are enforced as boundary conditions on the biomechanical model. The system equation is solved by the finite element method to provide the dense displacement field of the myocardium, and the regional values of the three principal strains and the six strains in cylindrical coordinates are computed in terms of the American Heart Association nomenclature. To study the potential of the estimated regional strains for identifying myocardial infarction, experiments were performed on cardiac CT image sequences of ten canines with artificially induced myocardial infarctions. Leave-one-subject-out cross validations show that, using the optimal strain magnitude thresholds computed from ROC curves, the radial strain and the first principal strain have the best performance.

  2. Computational Identification of Mechanistic Factors That Determine the Timing and Intensity of the Inflammatory Response

    PubMed Central

    Nagaraja, Sridevi; Reifman, Jaques; Mitrophanov, Alexander Y.

    2015-01-01

    Timely resolution of inflammation is critical for the restoration of homeostasis in injured or infected tissue. Chronic inflammation is often characterized by a persistent increase in the concentrations of inflammatory cells and molecular mediators, whose distinct amount and timing characteristics offer an opportunity to identify effective therapeutic regulatory targets. Here, we used our recently developed computational model of local inflammation to identify potential targets for molecular interventions and to investigate the effects of individual and combined inhibition of such targets. This was accomplished via the development and application of computational strategies involving the simulation and analysis of thousands of inflammatory scenarios. We found that modulation of macrophage influx and efflux is an effective potential strategy to regulate the amount of inflammatory cells and molecular mediators in both normal and chronic inflammatory scenarios. We identified three molecular mediators, tumor necrosis factor-α (TNF-α), transforming growth factor-β (TGF-β), and the chemokine CXCL8, as potential molecular targets whose individual or combined inhibition may robustly regulate both the amount and timing properties of the kinetic trajectories for neutrophils and macrophages in chronic inflammation. Modulation of macrophage flux, as well as of the abundance of TNF-α, TGF-β, and CXCL8, may improve the resolution of chronic inflammation. PMID:26633296

  3. Jimena: efficient computing and system state identification for genetic regulatory networks

    PubMed Central

    2013-01-01

    Background: Boolean networks capture the switching behavior of many naturally occurring regulatory networks. For semi-quantitative modeling, interpolation between ON and OFF states is necessary. The high-degree polynomial interpolation of Boolean genetic regulatory networks (GRNs) in cellular processes such as apoptosis or proliferation allows the modeling of a wider range of node interactions than continuous activator-inhibitor models, but suffers from scaling problems for networks containing nodes with more than ~10 inputs. Many GRNs from the literature or from new gene expression experiments exceed these limitations, so a new approach was developed. Results: (i) As part of our new GRN simulation framework Jimena we introduce and set up Boolean-tree-based data structures; (ii) the corresponding algorithms greatly expedite the calculation of the polynomial interpolation in almost all cases, thereby expanding the range of networks which can be simulated by this model in reasonable time; (iii) stable states for discrete models are efficiently counted and identified using binary decision diagrams. As an application example, we show how system states can now be sampled efficiently in small- to large-scale hormone and disease networks (Arabidopsis thaliana development and immunity, the pathogen Pseudomonas syringae, and modulation by cytokinins and plant hormones). Conclusions: Jimena simulates currently available GRNs about 10-100 times faster than the previous implementation of the polynomial interpolation model, and even greater gains are achieved for large scale-free networks. This speed-up also facilitates a much more thorough sampling of continuous state spaces, which may lead to the identification of new stable states. Mutants of large networks can be constructed and analyzed very quickly, enabling new insights into network robustness and behavior. PMID:24118878
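    On the discrete side, the stable states mentioned in the Results are fixed points of the Boolean update rules. A toy sketch using exhaustive enumeration (Jimena itself uses binary decision diagrams to scale; the three-node network below is invented purely for illustration):

```python
from itertools import product

# Toy Boolean GRN: each node's update rule is a function of the full state.
rules = {
    "A": lambda s: s["A"] and not s["C"],   # C inhibits A
    "B": lambda s: s["A"] or s["B"],        # A activates B; B self-sustains
    "C": lambda s: s["B"],                  # B activates C
}

def stable_states(rules):
    """Fixed points: states mapped to themselves by every update rule."""
    nodes = sorted(rules)
    fixed = []
    for bits in product([False, True], repeat=len(nodes)):
        s = dict(zip(nodes, bits))
        if all(rules[n](s) == s[n] for n in nodes):
            fixed.append(s)
    return fixed
```

    This toy network has exactly two fixed points, the all-off state and the state with B and C on; exhaustive enumeration is exponential in the number of nodes, which is precisely why a BDD-based representation is needed for genome-scale networks.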

  4. Computational approaches for identification of conserved/unique binding pockets in the A chain of ricin

    SciTech Connect

    Ecale Zhou, C L; Zemla, A T; Roe, D; Young, M; Lam, M; Schoeniger, J; Balhorn, R

    2005-01-29

    Specific and sensitive ligand-based protein detection assays that employ antibodies or small molecules such as peptides or aptamers require that the corresponding surface region of the protein be accessible and that there be minimal cross-reactivity with non-target proteins. To reduce the time and cost of laboratory screening efforts for diagnostic reagents, we developed new methods for evaluating and selecting protein surface regions for ligand targeting. We devised combined structure- and sequence-based methods for identifying 3D epitopes and binding pockets on the surface of the A chain of ricin that are conserved with respect to a set of ricin A chains and unique with respect to other proteins. We (1) used structure alignment software to detect structural deviations and extracted from this analysis the residue-residue correspondence, (2) devised a method to compare corresponding residues across sets of ricin structures and structures of closely related proteins, (3) devised a sequence-based approach to determine residue infrequency in local sequence context, and (4) modified a pocket-finding algorithm to identify surface crevices in close proximity to residues determined to be conserved/unique based on our structure- and sequence-based methods. In applying this combined informatics approach to ricin A we identified a conserved/unique pocket in close proximity to (but not overlapping) the active site that is suitable for bi-dentate ligand development. These methods are generally applicable to the identification of surface epitopes and binding pockets for the development of diagnostic reagents, therapeutics, and vaccines.

  5. Identification of irradiated pepper with the level of hydrogen gas as a probe

    SciTech Connect

    Dohmaru, T.; Furuta, M.; Katayama, T.; Toratani, H.; Takeda, A. )

    1989-12-01

    A novel method has been developed to detect whether a particular pepper has been irradiated, based on the fact that H2 is formed in organic substances exposed to ionizing radiation. Following gamma irradiation, black and white peppers were ground to powder in a gastight ceramic mill. By gas-chromatographic analysis of the gas in the mill, we observed that H2 had been released from the irradiated pepper grains. Curves of H2 content vs. storage time at storage temperatures of 7, 22, and 30 degrees C showed that the higher the temperature, the smaller the H2 content, and that identification of irradiated pepper was possible for 2-4 months after a 10 kGy irradiation.

  6. Identification of Low-Level Vancomycin Resistance in Staphylococcus aureus in the Era of Informatics.

    PubMed

    Ford, Bradley A

    2016-04-01

    Vancomycin-intermediate Staphylococcus aureus (VISA) and heteroresistant VISA (hVISA) are pathogens for which accurate antimicrobial susceptibility testing (AST) would rule out standard treatment with vancomycin. Unfortunately, AST for vancomycin is relatively slow, and standard methods are unable to reliably detect VISA and hVISA. An article in this issue (C. A. Mather, B. J. Werth, S. Sivagnanam, D. J. SenGupta, and S. M. Butler-Wu, J Clin Microbiol 54:883-890, 2016, doi:http://dx.doi.org/10.1128/JCM.02428-15) describes a rapid whole-cell matrix-assisted laser desorption ionization-time of flight proxy susceptibility method that highlights current innovations and challenges in rapid AST, VISA/hVISA identification, and clinical bioinformatics. PMID:26865694

  7. Noninvasive fractional flow reserve derived from coronary computed tomography angiography for identification of ischemic lesions: a systematic review and meta-analysis

    PubMed Central

    Wu, Wen; Pan, Dao-Rong; Foin, Nicolas; Pang, Si; Ye, Peng; Holm, Niels; Ren, Xiao-Min; Luo, Jie; Nanjundappa, Aravinda; Chen, Shao-Liang

    2016-01-01

    Detection of coronary ischemic lesions by fractional flow reserve (FFR) has been established as the gold standard. In recent years, novel computer-based methods have emerged that can simulate FFR using coronary artery images acquired from coronary computed tomography angiography (FFRCT). This meta-analysis aimed to evaluate the diagnostic performance of FFRCT using FFR as the reference standard. The databases PubMed, Cochrane Library, EMBASE, Medion and Web of Science were searched. Seven studies met the inclusion criteria, including 833 stable patients (1377 vessels or lesions) with suspected or known coronary artery disease (CAD). The patient-based analysis showed pooled estimates of sensitivity, specificity and diagnostic odds ratio (DOR) for detection of ischemic lesions of 0.89 [95% confidence interval (CI), 0.85–0.93], 0.76 (95% CI, 0.64–0.84) and 26.21 (95% CI, 13.14–52.28). At a per-vessel or per-lesion level, the pooled estimates were as follows: sensitivity 0.84 (95% CI, 0.80–0.87), specificity 0.76 (95% CI, 0.67–0.83) and DOR 16.87 (95% CI, 9.41–30.25). The area under the summary receiver operating curve was 0.90 (95% CI, 0.87–0.92) and 0.86 (95% CI, 0.83–0.89) at the two analysis levels, respectively. In conclusion, FFRCT technology achieves moderate diagnostic performance for noninvasive identification of ischemic lesions in stable patients with suspected or known CAD in comparison to invasive FFR measurement. PMID:27377422
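    Per-study inputs to such a meta-analysis are 2x2 tables; a minimal sketch of the per-study accuracy measures that get pooled (the counts below are illustrative and not taken from the included studies; a continuity correction would be needed when a cell is zero):

```python
# Diagnostic accuracy from a 2x2 table: TP/FP/FN/TN against the FFR reference.
def diagnostics(tp, fp, fn, tn):
    sens = tp / (tp + fn)          # sensitivity: ischemic lesions detected
    spec = tn / (tn + fp)          # specificity: non-ischemic correctly ruled out
    dor = (tp * tn) / (fp * fn)    # diagnostic odds ratio
    return sens, spec, dor

# Hypothetical single-study counts chosen to mirror the pooled estimates:
sens, spec, dor = diagnostics(tp=89, fp=24, fn=11, tn=76)
```

    The DOR condenses both error rates into one ratio, which is convenient for pooling across studies but hides the sensitivity/specificity trade-off, so meta-analyses typically report all three together with a summary ROC curve.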

  8. Noninvasive fractional flow reserve derived from coronary computed tomography angiography for identification of ischemic lesions: a systematic review and meta-analysis.

    PubMed

    Wu, Wen; Pan, Dao-Rong; Foin, Nicolas; Pang, Si; Ye, Peng; Holm, Niels; Ren, Xiao-Min; Luo, Jie; Nanjundappa, Aravinda; Chen, Shao-Liang

    2016-01-01

    Detection of coronary ischemic lesions by fractional flow reserve (FFR) has been established as the gold standard. In recent years, novel computer-based methods have emerged that can simulate FFR using coronary artery images acquired from coronary computed tomography angiography (FFRCT). This meta-analysis aimed to evaluate the diagnostic performance of FFRCT using FFR as the reference standard. The databases PubMed, Cochrane Library, EMBASE, Medion and Web of Science were searched. Seven studies met the inclusion criteria, including 833 stable patients (1377 vessels or lesions) with suspected or known coronary artery disease (CAD). The patient-based analysis showed pooled estimates of sensitivity, specificity and diagnostic odds ratio (DOR) for detection of ischemic lesions of 0.89 [95% confidence interval (CI), 0.85-0.93], 0.76 (95% CI, 0.64-0.84) and 26.21 (95% CI, 13.14-52.28). At a per-vessel or per-lesion level, the pooled estimates were as follows: sensitivity 0.84 (95% CI, 0.80-0.87), specificity 0.76 (95% CI, 0.67-0.83) and DOR 16.87 (95% CI, 9.41-30.25). The area under the summary receiver operating curve was 0.90 (95% CI, 0.87-0.92) and 0.86 (95% CI, 0.83-0.89) at the two analysis levels, respectively. In conclusion, FFRCT technology achieves moderate diagnostic performance for noninvasive identification of ischemic lesions in stable patients with suspected or known CAD in comparison to invasive FFR measurement. PMID:27377422

  9. Comparison of Traditional Phenotypic Identification Methods with Partial 5′ 16S rRNA Gene Sequencing for Species-Level Identification of Nonfermenting Gram-Negative Bacilli

    PubMed Central

    Cloud, Joann L.; Harmsen, Dag; Iwen, Peter C.; Dunn, James J.; Hall, Gerri; LaSala, Paul Rocco; Hoggan, Karen; Wilson, Deborah; Woods, Gail L.; Mellmann, Alexander

    2010-01-01

    Correct identification of nonfermenting Gram-negative bacilli (NFB) is crucial for patient management. We compared phenotypic identifications of 96 clinical NFB isolates with identifications obtained by 5′ 16S rRNA gene sequencing. Sequencing identified 88 isolates (91.7%) with >99% similarity to a sequence from the assigned species; 61.5% of sequencing results were concordant with phenotypic results, indicating the usability of sequencing to identify NFB. PMID:20164273

  10. Computer aided root lesion detection using level set and complex wavelets

    NASA Astrophysics Data System (ADS)

    Li, Shuo; Fevens, Thomas; Krzyżak, Adam; Jin, Chao; Li, Song

    2007-03-01

    A computer-aided root lesion detection method for digital dental X-rays is proposed using level sets and complex wavelets. The detection method consists of two stages: preprocessing and root lesion detection. During preprocessing, a level set segmentation is applied to separate the teeth from the background. Tailored for the dental clinical environment, a clinical acceleration scheme speeds up the segmentation by using a support vector machine (SVM) classifier and individual principal component analysis (PCA) to provide an initial contour. Then, based on the segmentation result, root lesion detection is performed. Firstly, the teeth are isolated by the average intensity profile. Secondly, a center-line zero-crossing based candidate generation is applied to generate possible root lesion areas. Thirdly, the Dual-Tree Complex Wavelet Transform (DT-CWT) is used to further remove false positives. Lastly, when a root lesion is detected, its area is automatically marked with a color indication representing the level of seriousness. 150 real dental X-rays with various degrees of root lesions were used to test the proposed method, and the results were validated by a dentist. Experimental results show that the proposed method is able to successfully detect root lesions and provide visual assistance to the dentist.

  11. Patient dose, gray level and exposure index with a computed radiography system

    NASA Astrophysics Data System (ADS)

    Silva, T. R.; Yoshimura, E. M.

    2014-02-01

    Computed radiography (CR) is gradually replacing conventional screen-film systems in Brazil. To assess image quality, manufacturers provide the calculation of an exposure index through the acquisition software of the CR system. The objective of this study is to verify whether the CR image can also be used to evaluate the patient absorbed dose, through a relationship between the entrance skin dose and the exposure index or the gray level values obtained in the image. The CR system used for this study (Agfa model 30-X with NX acquisition software) calculates an exposure index called Log of the Median (lgM), related to the absorbed dose to the IP. The lgM value depends on the average gray level (called Scan Average Level (SAL)) of the segmented pixel value histogram of the whole image. A Rando male phantom was used to simulate a human body (chest and head), and was irradiated with an X-ray equipment, using usual radiologic techniques for chest exams. Thermoluminescent dosimeters (LiF, TLD100) were used to evaluate entrance skin dose and exit dose. The results showed a logarithmic relation between entrance dose and SAL in the image center, regardless of the beam filtration. The exposure index varies linearly with the entrance dose, but the angular coefficient is beam quality dependent. We conclude that, with an adequate calibration, the CR system can be used to evaluate the patient absorbed dose.

  12. Toward automatic computer aided dental X-ray analysis using level set method.

    PubMed

    Li, Shuo; Fevens, Thomas; Krzyzak, Adam; Jin, Chao; Li, Song

    2005-01-01

    A Computer Aided Dental X-rays Analysis (CADXA) framework is proposed to semi-automatically detect areas of bone loss and root decay in digital dental X-rays. In this framework, a new competitive coupled level set method is first proposed to segment the image into three pathologically meaningful regions using two coupled level set functions. Tailored for the dental clinical environment, the segmentation stage uses a trained support vector machine (SVM) classifier to provide initial contours. Then, based on the segmentation results, an analysis scheme is applied. First, the scheme builds an uncertainty map from which those areas with bone loss will be automatically detected. Second, the scheme employs a method based on the SVM and the average intensity profile to isolate the teeth and detect root decay. Experimental results show that our proposed framework is able to automatically detect the areas of bone loss and, when given the orientation of the teeth, it is able to automatically detect the root decay with a seriousness level marked for diagnosis. PMID:16685904

  13. A component-level failure detection and identification algorithm based on open-loop and closed-loop state estimators

    NASA Astrophysics Data System (ADS)

    You, Seung-Han; Cho, Young Man; Hahn, Jin-Oh

    2013-04-01

    This study presents a component-level failure detection and identification (FDI) algorithm for a cascade mechanical system subsuming a plant driven by an actuator unit. The novelty of the FDI algorithm presented in this study is that it is able to discriminate failure occurring in the actuator unit, the sensor measuring the output of the actuator unit, and the plant driven by the actuator unit. The proposed FDI algorithm exploits the measurement of the actuator unit output together with its estimates generated by open-loop (OL) and closed-loop (CL) estimators to enable FDI at the component level. In this study, the OL estimator is designed based on the system identification of the actuator unit. The CL estimator, which is guaranteed to be stable against variations in the plant, is synthesized based on the dynamics of the entire cascade system. The viability of the proposed algorithm is demonstrated using a hardware-in-the-loop simulation (HILS), which shows that it can detect and identify target failures reliably in the presence of plant uncertainties.

  14. Genotypic and Phenotypic Applications for the Differentiation and Species-Level Identification of Achromobacter for Clinical Diagnoses

    PubMed Central

    Gomila, Margarita; Prince-Manzano, Claudia; Svensson-Stadler, Liselott; Busquets, Antonio; Erhard, Marcel; Martínez, Deny L.; Lalucat, Jorge; Moore, Edward R. B.

    2014-01-01

    Achromobacter is a genus in the family Alcaligenaceae, comprising fifteen species isolated from different sources, including clinical samples. The ability to detect and correctly identify Achromobacter species, particularly A. xylosoxidans, and differentiate them from other phenotypically similar and genotypically related Gram-negative, aerobic, non-fermenting species is important for patients with cystic fibrosis (CF), as well as for nosocomial and other opportunistic infections. Traditional phenotypic profile-based analyses have been demonstrated to be inadequate for reliable identification of isolates of Achromobacter species, and genotype-based assays relying upon comparative 16S rRNA gene sequence analyses are not able to ensure definitive identification of Achromobacter species, due to the inherently conserved nature of the gene. The use of alternative methodologies enabling high-resolution differentiation between the species of the genus is needed. A comparative multi-locus sequence analysis (MLSA) of four selected ‘house-keeping’ genes (atpD, gyrB, recA, and rpoB) assessed the individual gene sequences for their potential in developing a reliable, rapid and cost-effective diagnostic protocol for Achromobacter species identifications. The analysis of the type strains of the species of the genus and 46 strains of Achromobacter species showed congruence between the cluster analyses derived from the individual genes. The MLSA gene sequences exhibited different levels of resolution in delineating the validly published Achromobacter species and elucidated strains that represent new genotypes and probable new species of the genus. Our results also suggested that the recently described A. spritinus is a later heterotypic synonym of A. marplatensis. Strains were analyzed, using whole-cell Matrix-Assisted Laser Desorption/Ionization Time-Of-Flight mass spectrometry (MALDI-TOF MS), as an alternative phenotypic profile-based method with the potential to

  16. Computational genomic identification and functional reconstitution of plant natural product biosynthetic pathways.

    PubMed

    Medema, Marnix H; Osbourn, Anne

    2016-08-27

    Covering: 2003 to 2016. The last decade has seen the first major discoveries regarding the genomic basis of plant natural product biosynthetic pathways. Four key computationally driven strategies have been developed to identify such pathways, which make use of physical clustering, co-expression, evolutionary co-occurrence and epigenomic co-regulation of the genes involved in producing a plant natural product. Here, we discuss how these approaches can be used for the discovery of plant biosynthetic pathways encoded by both chromosomally clustered and non-clustered genes. Additionally, we will discuss opportunities to prioritize plant gene clusters for experimental characterization, and end with a forward-looking perspective on how synthetic biology technologies will allow effective functional reconstitution of candidate pathways using a variety of genetic systems. PMID:27321668

  17. Computational dissection of tissue contamination for identification of colon cancer-specific expression profiles.

    PubMed

    Türeci, Ozlem; Ding, Jiayi; Hilton, Holly; Bian, Hongjin; Ohkawa, Hitomi; Braxenthaler, Michael; Seitz, Gerhard; Raddrizzani, Laura; Friess, Helmut; Buchler, Markus; Sahin, Ugur; Hammer, Juergen

    2003-03-01

    Microarray profiles of bulk tumor tissues reflect gene expression corresponding to malignant cells as well as to many different types of contaminating normal cells. In this report, we assess the feasibility of querying baseline multitissue transcriptome databases to dissect disease-specific genes. Using colon cancer as a model tumor, we show that the application of Boolean operators (AND, OR, BUTNOT) for database searches leads to genes with expression patterns of interest. The BUTNOT operator for example allows the assignment of "expression signatures" to normal tissue specimens. These expression signatures were then used to computationally identify contaminating cells within conventionally dissected tissue specimens. The combination of several logic operators together with an expression database based on multiple human tissue specimens can resolve the problem of tissue contamination, revealing novel cancer-specific gene expression. Several markers, previously not known to be colon cancer associated, are provided. PMID:12631577
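
    The Boolean querying described above amounts to set algebra over expression profiles. A minimal sketch in Python, with hypothetical gene identifiers and tissue signatures standing in for the study's database:

```python
# Hedged sketch of the Boolean query idea; the gene sets are invented
# placeholders, not data from the study.
tumor_bulk = {"CEACAM5", "MUC2", "ALB", "HBB", "KRT20"}  # genes seen in bulk tumor tissue
normal_blood = {"ALB", "HBB"}                            # "expression signature" of blood
normal_colon = {"MUC2"}                                  # signature of normal colon mucosa

# OR merges the signatures of contaminating cell types; BUTNOT removes them
cancer_specific = tumor_bulk - (normal_blood | normal_colon)
print(sorted(cancer_specific))  # → ['CEACAM5', 'KRT20']
```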

  18. Key for protein coding sequences identification: computer analysis of codon strategy.

    PubMed Central

    Rodier, F; Gabarro-Arpa, J; Ehrlich, R; Reiss, C

    1982-01-01

    The signal qualifying an AUG or GUG as an initiator in mRNAs processed by E. coli ribosomes is not found to be a systematic, literal homology sequence. In contrast, stability analysis reveals that initiators always occur within nucleic acid domains of low stability, for which a high A/U content is observed. Since no amino acid selection pressure can be detected at the N-termini of the proteins, the A/U enrichment results from a biased usage of the code degeneracy. A computer analysis is presented which allows easy detection of the codon strategy. N-terminal codons carry rather systematically A or U in the third position, which suggests a mechanism for translation initiation and helps to detect protein coding sequences in sequenced DNA. PMID:7038623
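
    The codon-strategy check can be sketched in a few lines: count how often the third codon position carries A or U (T in DNA) near the start of an open reading frame. The sequence below is an invented placeholder, not one of the analyzed E. coli genes.

```python
# Hedged sketch; the ORF below is a hypothetical example sequence.
def third_position_au_fraction(orf, n_codons=10):
    """Fraction of the first n_codons whose third base is A or U/T."""
    codons = [orf[i:i + 3] for i in range(0, min(len(orf), 3 * n_codons), 3)]
    codons = [c for c in codons if len(c) == 3]
    return sum(c[2] in "ATU" for c in codons) / len(codons)

orf = "ATGAAATTTGAACGTCTTAGA"  # invented N-terminal coding sequence
print(round(third_position_au_fraction(orf), 2))  # → 0.86
```

A value well above the roughly 0.5 expected under unbiased codon usage would flag the A/U-rich codon strategy described above.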

  19. Identification of agricultural crops by computer processing of ERTS MSS data

    NASA Technical Reports Server (NTRS)

    Bauer, M. E.; Cipra, J. E.

    1973-01-01

    Quantitative evaluation of computer-processed ERTS MSS data classifications has shown that major crop species (corn and soybeans) can be accurately identified. The classifications of satellite data over a 2000 square mile area not only covered more than 100 times the area previously covered using aircraft, but also yielded improved results through the use of temporal and spatial data in addition to the spectral information. Furthermore, training sets could be extended over far larger areas than was ever possible with aircraft scanner data. And, preliminary comparisons of acreage estimates from ERTS data and ground-based systems agreed well. The results demonstrate the potential utility of this technology for obtaining crop production information.

  20. Computational identification of potential multi-drug combinations for reduction of microglial inflammation in Alzheimer disease

    PubMed Central

    Anastasio, Thomas J.

    2015-01-01

    Like other neurodegenerative diseases, Alzheimer Disease (AD) has a prominent inflammatory component mediated by brain microglia. Reducing microglial inflammation could potentially halt or at least slow the neurodegenerative process. A major challenge in the development of treatments targeting brain inflammation is the sheer complexity of the molecular mechanisms that determine whether microglia become inflammatory or take on a more neuroprotective phenotype. The process is highly multifactorial, raising the possibility that a multi-target/multi-drug strategy could be more effective than conventional monotherapy. This study takes a computational approach in finding combinations of approved drugs that are potentially more effective than single drugs in reducing microglial inflammation in AD. This novel approach exploits the distinct advantages of two different computer programming languages, one imperative and the other declarative. Existing programs written in both languages implement the same model of microglial behavior, and the input/output relationships of both programs agree with each other and with data on microglia over an extensive test battery. Here the imperative program is used efficiently to screen the model for the most efficacious combinations of 10 drugs, while the declarative program is used to analyze in detail the mechanisms of action of the most efficacious combinations. Of the 1024 possible drug combinations, the simulated screen identifies only 7 that are able to move simulated microglia at least 50% of the way from a neurotoxic to a neuroprotective phenotype. Subsequent analysis shows that of the 7 most efficacious combinations, 2 stand out as superior both in strength and reliability. The model offers many experimentally testable and therapeutically relevant predictions concerning effective drug combinations and their mechanisms of action. PMID:26097457
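
    The exhaustive screen over all 2^10 = 1024 on/off combinations of 10 drugs can be illustrated with a toy additive scoring model; the per-drug weights and the simple sum below are invented stand-ins for the study's mechanistic simulation of microglial signaling.

```python
# Hedged sketch of exhaustive combination screening; weights are invented.
from itertools import product

weights = [0.30, 0.25, 0.20, 0.15, 0.10, 0.05, -0.05, 0.08, 0.12, 0.18]

def phenotype_shift(combo):
    """Toy score: fraction of the way from neurotoxic (0) to neuroprotective (1)."""
    return min(1.0, sum(w for w, on in zip(weights, combo) if on))

combos = list(product([0, 1], repeat=10))                # all 1024 combinations
hits = [c for c in combos if phenotype_shift(c) >= 0.5]  # pass the 50% criterion
print(len(combos), "combinations screened;", len(hits), "pass the threshold")
```

In the actual study each combination drives a mechanistic simulation rather than an additive score, which is why only 7 of the 1024 combinations pass its 50% criterion.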

  1. Self-Assessment and Student Improvement in an Introductory Computer Course at the Community College-Level

    ERIC Educational Resources Information Center

    Spicer-Sutton, Jama

    2013-01-01

    The purpose of this study was to determine a student's computer knowledge upon course entry and if there was a difference in college students' improvement scores as measured by the difference in pretest and posttest scores of new or novice users, moderate users, and expert users at the end of a college-level introductory computing class.…

  2. Computationally Guided Identification of Novel Mycobacterium tuberculosis GlmU Inhibitory Leads, Their Optimization, and in Vitro Validation.

    PubMed

    Mehra, Rukmankesh; Rani, Chitra; Mahajan, Priya; Vishwakarma, Ram Ashrey; Khan, Inshad Ali; Nargotra, Amit

    2016-02-01

    Mycobacterium tuberculosis (Mtb) infections are causing serious health concerns worldwide. Antituberculosis drug resistance threatens current therapies and creates a further need to develop effective antituberculosis therapy. GlmU represents an interesting target for developing novel Mtb drug candidates. It is a bifunctional acetyltransferase/uridyltransferase enzyme that catalyzes the biosynthesis of UDP-N-acetyl-glucosamine (UDP-GlcNAc) from glucosamine-1-phosphate (GlcN-1-P). UDP-GlcNAc is a substrate for the biosynthesis of lipopolysaccharide and peptidoglycan that are constituents of the bacterial cell wall. In the current study, structure and ligand based computational models were developed and rationally applied to screen a drug-like compound repository of 20,000 compounds procured from the ChemBridge DIVERSet database for the identification of probable inhibitors of Mtb GlmU. The in vitro evaluation of the in silico identified inhibitor candidates resulted in the identification of 15 inhibitory leads of this target. A literature search of these leads through SciFinder and their similarity analysis with the PubChem training data set (AID 1376) revealed the structural novelty of these hits with respect to Mtb GlmU. The IC50 of the most potent identified inhibitory lead (5810599) was found to be 9.018 ± 0.04 μM. Molecular dynamics (MD) simulation of this inhibitory lead (5810599) in complex with the protein affirms the stability of the lead within the binding pocket and also highlights the key interactive residues for further design. Binding site analysis of the acetyltransferase pocket with respect to the identified structural moieties provides a thorough basis for carrying out lead optimization studies. PMID:26812086

  3. Computational Approaches to Analyze and Predict Small Molecule Transport and Distribution at Cellular and Subcellular Levels

    PubMed Central

    Ah Min, Kyoung; Zhang, Xinyuan; Yu, Jing-yu; Rosania, Gus R.

    2013-01-01

    Quantitative structure-activity relationship (QSAR) studies and mechanistic mathematical modeling approaches have been independently employed for analyzing and predicting the transport and distribution of small molecule chemical agents in living organisms. Both of these computational approaches have been useful to interpret experiments measuring the transport properties of small molecule chemical agents, in vitro and in vivo. Nevertheless, mechanistic cell-based pharmacokinetic models have been especially useful to guide the design of experiments probing the molecular pathways underlying small molecule transport phenomena. Unlike QSAR models, mechanistic models can be integrated from microscopic to macroscopic levels, to analyze the spatiotemporal dynamics of small molecule chemical agents from intracellular organelles to whole organs, well beyond the experiments and training data sets upon which the models are based. Based on differential equations, mechanistic models can also be integrated with other differential equations-based systems biology models of biochemical networks or signaling pathways. Although the origin and evolution of mathematical modeling approaches aimed at predicting drug transport and distribution has occurred independently from systems biology, we propose that the incorporation of mechanistic cell-based computational models of drug transport and distribution into a systems biology modeling framework is a logical next-step for the advancement of systems pharmacology research. PMID:24218242
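
    The mechanistic, differential-equations style of modeling described above can be sketched with a two-compartment transport model integrated by a simple Euler scheme; the rate constants are illustrative assumptions, not parameters from the paper.

```python
# Hedged sketch: extracellular <-> cytosolic transport of a small molecule.
# k_in and k_out are invented illustrative rate constants.
def simulate(k_in=0.5, k_out=0.2, c_ext0=1.0, dt=0.01, t_end=10.0):
    c_ext, c_cyt = c_ext0, 0.0
    for _ in range(round(t_end / dt)):
        flux = k_in * c_ext - k_out * c_cyt  # net inward flux
        c_ext -= flux * dt                   # Euler update, mass-conserving
        c_cyt += flux * dt
    return c_ext, c_cyt

c_ext, c_cyt = simulate()
# mass is conserved; near steady state c_cyt/c_ext approaches k_in/k_out = 2.5
print(round(c_ext + c_cyt, 6), round(c_cyt / c_ext, 2))
```

Coupling such compartment equations to further ODEs (organelles, whole organs, signaling networks) is what allows the multi-scale integration the abstract describes.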

  4. Establishment of diagnostic reference levels in computed tomography for paediatric patients in Sudan: a pilot study.

    PubMed

    Sulieman, A

    2015-07-01

    Paediatric patients are recognised to be at higher risk of developing radiation-induced cancer than adults. The purpose of this pilot study was to evaluate the radiation doses to paediatric patients during computed tomography (CT) procedures in order to propose local diagnostic reference levels (DRLs). A total of 296 patients (aged 6-10 y) were investigated in 8 hospitals equipped with 64-, 16- and dual-slice CT machines. The mean dose length product values were 772, 446 and 178 mGy cm for head, abdomen and chest, respectively. Imaging protocols were not adapted to the patient's weight in certain CT machines. The results confirmed that paediatric patients are exposed to an unnecessary radiation dose. The established DRLs were higher than those available in other countries. This study showed the need for harmonisation of the practice in CT departments and radiation dose optimisation. PMID:25836694

  5. Computational methodology to predict satellite system-level effects from impacts of untrackable space debris

    NASA Astrophysics Data System (ADS)

    Welty, N.; Rudolph, M.; Schäfer, F.; Apeldoorn, J.; Janovsky, R.

    2013-07-01

    This paper presents a computational methodology to predict the satellite system-level effects resulting from impacts of untrackable space debris particles. This approach seeks to improve on traditional risk assessment practices by looking beyond the structural penetration of the satellite and predicting the physical damage to internal components and the associated functional impairment caused by untrackable debris impacts. The proposed method combines a debris flux model with the Schäfer-Ryan-Lambert ballistic limit equation (BLE), which accounts for the inherent shielding of components positioned behind the spacecraft structure wall. Individual debris particle impact trajectories and component shadowing effects are considered and the failure probabilities of individual satellite components as a function of mission time are calculated. These results are correlated to expected functional impairment using a Boolean logic model of the system functional architecture considering the functional dependencies and redundancies within the system.
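
    Assuming impacts above the damaging size threshold follow a Poisson process with constant flux, the per-component bookkeeping can be sketched as below. The flux, exposed area, and penetration probability are invented illustrative numbers, not outputs of a real debris flux model or of the Schäfer-Ryan-Lambert BLE.

```python
# Hedged sketch of component failure probability under a Poisson impact model.
import math

def component_failure_probability(flux, exposed_area, p_penetrate, years):
    """flux: critical impacts per m^2 per year; p_penetrate: fraction of
    impacts exceeding the ballistic limit of the wall shielding the component."""
    expected_failures = flux * exposed_area * p_penetrate * years
    return 1.0 - math.exp(-expected_failures)

# invented example: 7-year mission, 0.8 m^2 shadow-corrected exposed area
p = component_failure_probability(flux=1e-4, exposed_area=0.8,
                                  p_penetrate=0.05, years=7.0)
print(f"component failure probability: {p:.2e}")
```

Per-component probabilities like this one would then be combined through the Boolean model of the functional architecture to estimate system-level impairment.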

  6. Two-Level Verification of Data Integrity for Data Storage in Cloud Computing

    NASA Astrophysics Data System (ADS)

    Xu, Guangwei; Chen, Chunlin; Wang, Hongya; Zang, Zhuping; Pang, Mugen; Jiang, Ping

    Data storage in cloud computing can save capital expenditure and relieve the burden of storage management for users. As loss or corruption of stored files may happen, many researchers focus on the verification of data integrity. However, massive numbers of users often bring large numbers of verifying tasks for the auditor. Moreover, users also need to pay an extra fee for these verifying tasks beyond the storage fee. Therefore, we propose a two-level verification of data integrity to alleviate these problems. The key idea is to routinely verify the data integrity by users and to arbitrate challenges between the user and the cloud provider by the auditor according to the MACs and ϕ values. Extensive performance simulations show that the proposed scheme markedly decreases the auditor's verifying tasks and the ratio of wrong arbitrations.
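
    The first (user-side) verification level can be sketched by assuming the integrity tags are HMACs computed at upload time; the auditor's arbitration step and the paper's ϕ values are not modeled here.

```python
# Hedged sketch of user-side integrity checking with HMAC tags.
import hmac, hashlib

def make_tag(key: bytes, block: bytes) -> bytes:
    return hmac.new(key, block, hashlib.sha256).digest()

def verify(key: bytes, block: bytes, tag: bytes) -> bool:
    # constant-time comparison avoids timing side channels
    return hmac.compare_digest(make_tag(key, block), tag)

key = b"user-secret-key"
block = b"file block stored in the cloud"
tag = make_tag(key, block)               # retained by the user at upload time

print(verify(key, block, tag))           # → True  (block intact)
print(verify(key, block + b"x", tag))    # → False (block corrupted)
```

Only when such a routine check fails would the auditor be asked to arbitrate between user and provider.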

  7. On the precision of neural computation with interaural level differences in the lateral superior olive.

    PubMed

    Bures, Zbynek; Marsalek, Petr

    2013-11-01

    Interaural level difference (ILD) is one of the basic binaural cues in the spatial localization of a sound source. Due to the acoustic shadow cast by the head, a sound source out of the medial plane results in an increased sound level at the nearer ear and a decreased level at the distant ear. In the mammalian auditory brainstem, the ILD is processed by a neuronal circuit of binaural neurons in the lateral superior olive (LSO). These neurons receive major excitatory projections from the ipsilateral side and major inhibitory projections from the contralateral side. As the sound level is encoded predominantly by the neuronal discharge rate, the principal function of LSO neurons is to estimate and encode the difference between the discharge rates of the excitatory and inhibitory inputs. Two general mechanisms of this operation are biologically plausible: (1) subtraction of firing rates integrated over longer time intervals, and (2) detection of coincidence of individual spikes within shorter time intervals. However, the exact mechanism of ILD evaluation is not known. Furthermore, given the stochastic nature of neuronal activity, it is not clear how the circuit achieves the remarkable precision of ILD assessment observed experimentally. We employ a probabilistic model and complementary computer simulations to investigate whether the two general mechanisms are capable of the desired performance. Introducing the concept of an ideal observer, we determine the theoretical ILD accuracy expressed by means of the just-noticeable difference (JND) in dependence on the statistics of the interacting spike trains, the overall firing rate, detection time, the number of converging fibers, and on the neural mechanism itself. We demonstrate that the JNDs rely on the precision of spike timing; however, with an appropriate parameter setting, the lowest theoretical values are similar or better than the experimental values. Furthermore, a mechanism based on excitatory and inhibitory
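
    The rate-subtraction mechanism can be illustrated with a toy Monte Carlo simulation: each input is approximated by Bernoulli spiking in 1 ms bins over a detection window, and the ILD estimate is the difference of the excitatory and inhibitory counts. The firing rates and window length are illustrative, not fitted to LSO data.

```python
# Hedged sketch; rates and window length are invented, not LSO measurements.
import random, statistics
random.seed(1)

def spike_count(rate_hz, window_ms):
    """Crude Poisson approximation: one Bernoulli draw per 1-ms bin."""
    p = rate_hz / 1000.0
    return sum(random.random() < p for _ in range(window_ms))

def ild_estimates(rate_exc, rate_inh, window_ms=50, trials=1000):
    return [spike_count(rate_exc, window_ms) - spike_count(rate_inh, window_ms)
            for _ in range(trials)]

est = ild_estimates(200.0, 150.0)    # 50 Hz excitatory-inhibitory rate difference
mean, sd = statistics.mean(est), statistics.stdev(est)
print(round(mean, 1), round(sd, 1))  # longer windows shrink sd, lowering the JND
```

The spread of the estimates around the true rate difference is what limits the just-noticeable difference in this simple subtraction scheme.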

  8. Establishment of diagnostic reference levels in computed tomography for select procedures in Pudhuchery, India

    PubMed Central

    Saravanakumar, A.; Vaideki, K.; Govindarajan, K. N.; Jayakumar, S.

    2014-01-01

    The computed tomography (CT) scanner under operating conditions has become a major source of human exposure to diagnostic X-rays. In this context, the weighted CT dose index (CTDIw), volumetric CT dose index (CTDIv), and dose length product (DLP) are important parameters for assessing CT procedures, serving as surrogate dose quantities for patient dose optimization. The current work aims to estimate the existing dose levels of CT scanners for head, chest, and abdomen procedures in Pudhuchery in south India and establish dose reference levels (DRLs) for the region. The study was carried out for six CT scanners in six different radiology departments using a 100 mm long pencil ionization chamber and polymethylmethacrylate (PMMA) phantoms. From each CT scanner, data pertaining to patient and machine details were collected for 50 head, 50 chest, and 50 abdomen procedures performed over a period of 1 year. The experimental work was carried out using the machine operating parameters used during the procedures. Initially, the dose received in the phantom at the center and periphery was measured by the five-point method. Using these values, CTDIw, CTDIv, and DLP were calculated. The DRLs are established based on the third-quartile values of CTDIv and DLP, which are 32 mGy and 925 mGy.cm for head, 12 mGy and 456 mGy.cm for chest, and 16 mGy and 482 mGy.cm for abdomen procedures. These values are well below the European Commission dose reference levels (EC DRLs) and comparable with the third-quartile values reported for the Tamil Nadu region in India. The present study is the first of its kind to determine DRLs for scanners operating in the Pudhuchery region. Similar studies in other regions of India are necessary in order to establish national dose reference levels. PMID:24600173
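
    The indices named above follow standard definitions: CTDIw weights the center and periphery phantom measurements 1/3 to 2/3, CTDIv divides CTDIw by pitch, and DLP multiplies CTDIv by scan length. The measurement values below are illustrative placeholders, not the study's data.

```python
# Hedged worked example; the measured values are invented placeholders.
def ctdi_w(ctdi_center, ctdi_periphery):
    return ctdi_center / 3.0 + 2.0 * ctdi_periphery / 3.0

def ctdi_vol(ctdi_w_value, pitch):
    return ctdi_w_value / pitch

def dlp(ctdi_vol_value, scan_length_cm):
    return ctdi_vol_value * scan_length_cm

w = ctdi_w(30.0, 33.0)     # mGy, head-phantom example values
v = ctdi_vol(w, 1.0)       # pitch 1.0
print(w, dlp(v, 15.0))     # → 32.0 480.0 (mGy, mGy.cm)
```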

  9. Blood and Sputum Eosinophil Levels in Asthma and Their Relationship to Sinus Computed Tomographic Findings

    PubMed Central

    Mehta, Vinay; Campeau, Norbert G.; Kita, Hirohito; Hagan, John B.

    2010-01-01

    OBJECTIVE To investigate the relationship among blood and sputum eosinophil levels, sinus mucosal thickening, and osteitis in patients with asthma. PATIENTS AND METHODS We conducted an observational study of 201 patients with asthma who underwent sinus computed tomographic (CT) imaging and induced sputum analysis at Mayo Clinic's site in Rochester, MN, from November 1, 2000, through December 31, 2005. Sinus CT scans were reviewed by an investigator blinded to patients' identity and chart information (J.B.H.) to assess for mucosal thickening. Each scan was assigned a CT score based on the Lund-Mackay staging scale. Approximately 20% of the scans were reviewed at random by a radiologist (N.G.C.) to ensure quality control. Bone changes consistent with osteitis were ascertained from radiology reports. Lung function was measured, and sputum was analyzed by conventional methods. RESULTS Sinus CT scans revealed abnormalities in 136 (68%) of the 201 study patients. Severe mucosal thickening (CT score, ≥12) was found in 60 patients (30%) and osteitis in 18 patients (9%). There was a positive correlation between CT scores and eosinophil levels in both peripheral blood (ρ=0.45; 95% confidence interval, 0.33–0.56; P<.001) and induced sputum (ρ=0.46; 95% confidence interval, 0.34–0.57; P<.001). Further, elevated blood and sputum eosinophil levels were associated with the presence of osteitis on CT scan and previous sinus surgery. CONCLUSION Blood and sputum eosinophil levels in patients with asthma are directly correlated with sinus mucosal thickening and are associated with osteitis, lending further support to the hypothesis that asthma and chronic rhinosinusitis are mediated by similar inflammatory processes. PMID:18533084

  10. Computational identification of genetic subnetwork modules associated with maize defense response to Fusarium verticillioides

    PubMed Central

    2015-01-01

    Background Maize, a crop of global significance, is vulnerable to a variety of biotic stresses resulting in economic losses. Fusarium verticillioides (teleomorph Gibberella moniliformis) is one of the key fungal pathogens of maize, causing ear rots and stalk rots. To better understand the genetic mechanisms involved in maize defense as well as F. verticillioides virulence, a systematic investigation of the host-pathogen interaction is needed. The aim of this study was to computationally identify potential maize subnetwork modules associated with its defense response against F. verticillioides. Results We obtained time-course RNA-seq data from B73 maize inoculated with wild type F. verticillioides and a loss-of-virulence mutant, and subsequently established a computational pipeline for network-based comparative analysis. Specifically, we first analyzed the RNA-seq data by a cointegration-correlation-expression approach, where maize genes were jointly analyzed with known F. verticillioides virulence genes to find candidate maize genes likely associated with the defense mechanism. We predicted maize co-expression networks around the selected maize candidate genes based on partial correlation, and subsequently searched for subnetwork modules that were differentially activated when inoculated with two different fungal strains. Based on our analysis pipeline, we identified four potential maize defense subnetwork modules. Two were directly associated with maize defense response and were associated with significant GO terms such as GO:0009817 (defense response to fungus) and GO:0009620 (response to fungus). The other two predicted modules were indirectly involved in the defense response, where the most significant GO terms associated with these modules were GO:0046914 (transition metal ion binding) and GO:0046686 (response to cadmium ion). Conclusion Through our RNA-seq data analysis, we have shown that a network-based approach can enhance our understanding of the
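
    The partial-correlation step behind the predicted co-expression networks can be illustrated for the simplest case, the correlation between two genes after controlling for a single third gene; the correlation values below are invented.

```python
# Hedged sketch of first-order partial correlation; r values are invented.
import math

def partial_corr(r_xy, r_xz, r_yz):
    """Correlation of x and y after removing the linear effect of z."""
    return (r_xy - r_xz * r_yz) / math.sqrt((1 - r_xz ** 2) * (1 - r_yz ** 2))

# x and y look strongly co-expressed (r = 0.8), but a shared regulator z
# explains much of that association
print(round(partial_corr(0.8, 0.7, 0.7), 2))  # → 0.61
```

Edges kept in a partial-correlation network reflect association that survives conditioning on the other genes, which is what distinguishes it from a plain correlation network.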

  11. A Japanese computer-assisted facial identification system successfully identifies non-Japanese faces.

    PubMed

    Fraser, Natalie L; Yoshino, Mineo; Imaizumi, Kazuhiko; Blackwell, Sherie A; Thomas, C David L; Clement, John G

    2003-08-12

    The method developed by Yoshino et al. in [Forensic Sci. Int. 109 (2000) 225 and Jpn. J. Sci. Tech. Iden. 5 (2000) 9] and already being applied in Japan utilizes a three-dimensional (3D) physiognomic rangefinder combined with a computer-assisted superimposition system. Facial outlines can be compared between two-dimensional (2D) surveillance images and data extracted from 3D images obtained from the rangefinder. Also, the loci of potentially concordant features can be compared and differences measured. The method is largely objective and gives statistics for false positive/false negative findings. This recently developed method by Yoshino et al. is currently being introduced to the Japanese courts. To enable courts outside Japan to assess the admissibility of this new method, studies of non-Japanese faces have been undertaken and shown to produce similar low error rates. The present authors, therefore, consider the Yoshino method to be applicable in a non-Japanese context. As part of this study a comparison of morphological features between two ethnic groups has been undertaken using 3D measurements for the first time and will serve as the foundation for an anthropological database in the future. PMID:12927413

  12. Computational prediction of human salivary proteins from blood circulation and application to diagnostic biomarker identification.

    PubMed

    Wang, Jiaxin; Liang, Yanchun; Wang, Yan; Cui, Juan; Liu, Ming; Du, Wei; Xu, Ying

    2013-01-01

Proteins can move from blood circulation into salivary glands through active transportation, passive diffusion or ultrafiltration, some of which are then released into saliva and hence can potentially serve as biomarkers for diseases if accurately identified. We present a novel computational method for predicting salivary proteins that come from circulation. The basis for the prediction is a set of physicochemical and sequence features we found to discriminate between human proteins known to be movable from circulation to saliva and proteins deemed not to be in saliva. A classifier was trained based on these features using a support-vector machine to predict protein secretion into saliva. The classifier achieved 88.56% average recall and 90.76% average precision in 10-fold cross-validation on the training data, indicating that the selected features are informative. Considering the possibility that our negative training data (i.e., proteins predicted not to be in saliva) may not be highly reliable, we have also trained a ranking method, aiming to rank the known salivary proteins from circulation highest among the proteins in the general background, based on the same features. This prediction capability can be used to predict potential biomarker proteins for specific human diseases when coupled with information on differentially expressed proteins in diseased versus healthy control tissues and a prediction capability for blood-secretory proteins. Using such integrated information, we predicted 31 candidate biomarker proteins in saliva for breast cancer. PMID:24324552
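The 10-fold cross-validation bookkeeping behind recall/precision figures like those reported above can be sketched as below; the toy one-feature threshold classifier stands in for the paper's SVM over physicochemical features, and all data are synthetic:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic stand-in for the real feature set: one informative score per
# "protein", higher on average for the (synthetic) salivary class.
n = 200
y = np.repeat([1, 0], n // 2)          # 1 = salivary, 0 = background
x = np.where(y == 1, rng.normal(1.0, 1.0, n), rng.normal(-1.0, 1.0, n))

def evaluate_10fold(x, y, k=10):
    """Average recall and precision over k folds.  The 'classifier' is a
    midpoint threshold between the class means; the paper used an SVM over
    many physicochemical/sequence features."""
    idx = rng.permutation(len(y))
    recalls, precisions = [], []
    for fold in np.array_split(idx, k):
        train = np.setdiff1d(idx, fold)
        thr = (x[train][y[train] == 1].mean() + x[train][y[train] == 0].mean()) / 2
        pred = (x[fold] > thr).astype(int)
        tp = int(np.sum((pred == 1) & (y[fold] == 1)))
        fp = int(np.sum((pred == 1) & (y[fold] == 0)))
        fn = int(np.sum((pred == 0) & (y[fold] == 1)))
        recalls.append(tp / (tp + fn) if tp + fn else 0.0)
        precisions.append(tp / (tp + fp) if tp + fp else 0.0)
    return float(np.mean(recalls)), float(np.mean(precisions))

recall, precision = evaluate_10fold(x, y)
```

Averaging the per-fold metrics, rather than pooling all folds' predictions, matches the "average recall/precision" phrasing in the abstract.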

  13. Computational identification of riboswitches based on RNA conserved functional sequences and conformations.

    PubMed

    Chang, Tzu-Hao; Huang, Hsien-Da; Wu, Li-Ching; Yeh, Chi-Ta; Liu, Baw-Jhiune; Horng, Jorng-Tzong

    2009-07-01

Riboswitches are cis-acting genetic regulatory elements within a specific mRNA that can regulate both transcription and translation by interacting with their corresponding metabolites. Recently, an increasing number of riboswitches have been identified in different species and investigated for their roles in regulatory functions. Both the sequence contexts and the structural conformations are important characteristics of riboswitches. None of the previously developed tools, such as covariance models (CMs), Riboswitch finder, and RibEx, provides a web server for efficiently searching homologous instances of known riboswitches, or considers both crucial characteristics of each riboswitch: the structural conformations and the sequence contexts of its functional regions. Therefore, we developed a systematic method for identifying 12 kinds of riboswitches. The method is implemented and provided as a web server, RiboSW, to efficiently and conveniently identify riboswitches within messenger RNA sequences. The predictive accuracy of the proposed method is comparable with that of previous tools. The efficiency of the method was improved to achieve a reasonable computational time for prediction, making RiboSW an accurate and convenient web server for biologists analyzing a given mRNA sequence. RiboSW is now available on the web at http://RiboSW.mbc.nctu.edu.tw/. PMID:19460868

  14. Levels, profiles and source identification of PCDD/Fs in farmland soils of Guiyu, China.

    PubMed

    Xu, Pengjun; Tao, Bu; Li, Nan; Qi, Li; Ren, Yue; Zhou, Zhiguang; Zhang, Lifei; Liu, Aimin; Huang, Yeru

    2013-05-01

The present study reports the first comprehensive survey of polychlorinated dibenzo-p-dioxins and dibenzofurans (PCDD/Fs) in farmland soils of Guiyu, China. Guiyu was a major electronic-waste (EW) dismantling area, where primitive and crude EW disposal methods have led to severe PCDD/F pollution. Twenty-three farmland soil samples covering the entire Guiyu region were analyzed. Toxic equivalent quantities (I-TEQs) of soils in EW disposal areas were 5.7-57 pg TEQ g(-1), and the total concentrations of tetra- to octa-homologues were 2816-17,738 pg g(-1). The SL district was a heavily contaminated area, and the neighboring SMP town was influenced by Guiyu. EW disposal might be the source of the PCDD/Fs. The homologue profiles were of three types, representing different EW disposal methods. Tetrachlorodibenzo-p-dioxins (TCDDs) and octachlorodibenzo-p-dioxin (OCDD) could be used as indicators for source identification: open thermal disposal of EWs tended to lead to the formation of TCDDs, whereas OCDD was a product of non-thermal processes. PMID:23466087

  15. Identification and Validation of Genetic Variants that Influence Transcription Factor and Cell Signaling Protein Levels

    PubMed Central

    Hause, Ronald J.; Stark, Amy L.; Antao, Nirav N.; Gorsic, Lidija K.; Chung, Sophie H.; Brown, Christopher D.; Wong, Shan S.; Gill, Daniel F.; Myers, Jamie L.; To, Lida Anita; White, Kevin P.; Dolan, M. Eileen; Jones, Richard Baker

    2014-01-01

    Many genetic variants associated with human disease have been found to be associated with alterations in mRNA expression. Although it is commonly assumed that mRNA expression changes will lead to consequent changes in protein levels, methodological challenges have limited our ability to test the degree to which this assumption holds true. Here, we further developed the micro-western array approach and globally examined relationships between human genetic variation and cellular protein levels. We collected more than 250,000 protein level measurements comprising 441 transcription factor and signaling protein isoforms across 68 Yoruba (YRI) HapMap lymphoblastoid cell lines (LCLs) and identified 12 cis and 160 trans protein level QTLs (pQTLs) at a false discovery rate (FDR) of 20%. Whereas up to two thirds of cis mRNA expression QTLs (eQTLs) were also pQTLs, many pQTLs were not associated with mRNA expression. Notably, we replicated and functionally validated a trans pQTL relationship between the KARS lysyl-tRNA synthetase locus and levels of the DIDO1 protein. This study demonstrates proof of concept in applying an antibody-based microarray approach to iteratively measure the levels of human proteins and relate these levels to human genome variation and other genomic data sets. Our results suggest that protein-based mechanisms might functionally buffer genetic alterations that influence mRNA expression levels and that pQTLs might contribute phenotypic diversity to a human population independently of influences on mRNA expression. PMID:25087611

  16. Identification of Xenologs and Their Characteristic Low Expression Levels in the Cyanobacterium Synechococcus elongatus.

    PubMed

    Álvarez-Canales, Gilberto; Arellano-Álvarez, Guadalupe; González-Domenech, Carmen M; de la Cruz, Fernando; Moya, Andrés; Delaye, Luis

    2015-06-01

    Horizontal gene transfer (HGT) is a central process in prokaryotic evolution. Once a gene is introduced into a genome by HGT, its contribution to the fitness of the recipient cell depends in part on its expression level. Here we show that in Synechococcus elongatus PCC 7942, xenologs derived from non-cyanobacterial sources exhibited lower expression levels than native genes in the genome. In accord with our observation, xenolog codon adaptation indexes also displayed relatively low expression values. These results are in agreement with previous reports that suggested the relative neutrality of most xenologs. However, we also demonstrated that some of the xenologs detected participated in cellular functions, including iron starvation acclimation and nitrate reduction, which corroborate the role of HGT in bacterial adaptation. For example, the expression levels of some of the xenologs detected are known to increase under iron-limiting conditions. We interpreted the overall pattern as an indication that there is a selection pressure against high expression levels of xenologs. However, when a xenolog protein product confers a selective advantage, natural selection can further modulate its expression level to meet the requirements of the recipient cell. In addition, we show that ORFans did not exhibit significantly lower expression levels than native genes in the genome, which suggested an origin other than xenology. PMID:26040248

  17. Development of computer program ENAUDIBL for computation of the sensation levels of multiple, complex, intrusive sounds in the presence of residual environmental masking noise

    SciTech Connect

    Liebich, R. E.; Chang, Y.-S.; Chun, K. C.

    2000-03-31

    The relative audibility of multiple sounds occurs in separate, independent channels (frequency bands) termed critical bands or equivalent rectangular (filter-response) bandwidths (ERBs) of frequency. The true nature of human hearing is a function of a complex combination of subjective factors, both auditory and nonauditory. Assessment of the probability of individual annoyance, community-complaint reaction levels, speech intelligibility, and the most cost-effective mitigation actions requires sensation-level data; these data are one of the most important auditory factors. However, sensation levels cannot be calculated by using single-number, A-weighted sound level values. This paper describes specific steps to compute sensation levels. A unique, newly developed procedure is used, which simplifies and improves the accuracy of such computations by the use of maximum sensation levels that occur, for each intrusive-sound spectrum, within each ERB. The newly developed program ENAUDIBL makes use of ERB sensation-level values generated with some computational subroutines developed for the formerly documented program SPECTRAN.
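A minimal sketch of a per-ERB sensation-level computation follows; the Glasberg & Moore (1990) ERB approximation and the example band levels and thresholds are assumptions, since the abstract does not give ENAUDIBL's exact formulas:

```python
def erb_bandwidth(f_hz):
    """Equivalent rectangular bandwidth (Hz) at centre frequency f_hz.
    Uses the Glasberg & Moore (1990) approximation; an assumption here,
    as the abstract does not state ENAUDIBL's exact band definitions."""
    return 24.7 * (4.37 * f_hz / 1000.0 + 1.0)

def sensation_level(band_level_db, threshold_db):
    """Sensation level (dB) in one band: band sound level minus the
    (masked) hearing threshold; positive values are audible."""
    return band_level_db - threshold_db

# Hypothetical per-band levels of one intrusive sound and the residual
# masked thresholds; the rule described above takes, per spectrum, the
# maximum sensation level over the ERBs.
band_levels = {250: 38.0, 500: 42.0, 1000: 47.0, 2000: 40.0}
thresholds = {250: 30.0, 500: 28.0, 1000: 26.0, 2000: 27.0}
max_sl = max(sensation_level(band_levels[f], thresholds[f]) for f in band_levels)
```

This illustrates why a single A-weighted level cannot substitute for the band-by-band computation: audibility is decided within each ERB against that band's masked threshold.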

  18. National diagnostic reference level initiative for computed tomography examinations in Kenya.

    PubMed

    Korir, Geoffrey K; Wambani, Jeska S; Korir, Ian K; Tries, Mark A; Boen, Patrick K

    2016-02-01

    The purpose of this study was to estimate the computed tomography (CT) examination frequency, patient radiation exposure, effective doses and national diagnostic reference levels (NDRLs) associated with CT examinations in clinical practice. A structured questionnaire-type form was developed for recording examination frequency, scanning protocols and patient radiation exposure during CT procedures in fully equipped medical facilities across the country. The national annual number of CT examinations per 1000 people was estimated to be 3 procedures. The volume-weighted CT dose index, dose length product, effective dose and NDRLs were determined for 20 types of adult and paediatric CT examinations. Additionally, the CT annual collective effective dose and effective dose per capita were approximated. The radiation exposure during CT examinations was broadly distributed between the facilities that took part in the study. This calls for a need to develop and implement diagnostic reference levels as a standardisation and optimisation tool for the radiological protection of patients at all the CT facilities nationwide. PMID:25790825

  19. Computer-vision-based weed identification of images acquired by 3CCD camera

    NASA Astrophysics Data System (ADS)

    Zhang, Yun; He, Yong; Fang, Hui

    2006-09-01

Selective application of herbicide to weeds at an early stage of crop growth is an important aspect of site-specific management of field crops. To develop more adaptive on-line weed-detection applications, many researchers have studied image-processing techniques for the intensive computation and feature-extraction tasks of distinguishing weeds from crops and the soil background. This paper investigated the potential of applying digital images acquired by the MegaPlus TM MS3100 3-CCD camera to segment the background soil from the plants in question and further recognize weeds among the crops, using the Matlab script language. The near-infrared waveband image (center 800 nm; width 65 nm) was selected principally for segmenting soil, and the cotton plants were distinguished from the thistles based on their respective relative areas (pixel counts) in the whole image. The results show adequate recognition: the pixel proportions of soil, cotton leaves, and thistle leaves were 78.24% (-0.20% deviation), 16.66% (+2.71% SD), and 4.68% (-4.19% SD), respectively. However, problems remain in separating and allocating individual plants because of their clustering in the images. The information in the images acquired via the other two channels, i.e., the green and red bands, needs to be extracted to aid crop/weed discrimination. More optical specimens should be acquired for calibration and validation to establish a weed-detection model that can be effectively applied in the field.

  20. Identification and analysis of potential targets in Streptococcus sanguinis using computer aided protein data analysis

    PubMed Central

    Chowdhury, Md Rabiul Hossain; Bhuiyan, Md IqbalKaiser; Saha, Ayan; Mosleh, Ivan MHAI; Mondol, Sobuj; Ahmed, C M Sabbir

    2014-01-01

Purpose Streptococcus sanguinis is a Gram-positive, facultatively aerobic bacterium that is a member of the viridans streptococcus group. It is found in human mouths in dental plaque, which accounts for both dental cavities and bacterial endocarditis, the latter of which entails a mortality rate of 25%. Although a range of remedial mediators has been found to control this organism, the effectiveness of agents such as penicillin, amoxicillin, trimethoprim–sulfamethoxazole, and erythromycin was observed. The emphasis of this investigation was on finding alternative and efficient remedial approaches for the total destruction of this bacterium. Materials and methods In this computational study, various databases and online software were used to ascertain specific targets of S. sanguinis. In particular, the Kyoto Encyclopedia of Genes and Genomes databases were applied to determine human-nonhomologous proteins, as well as the metabolic pathways involving those proteins. Software such as Phyre2, CastP, DoGSiteScorer, the Protein Function Predictor server, and STRING was utilized to evaluate the probable active drug-binding sites with their known functions and protein–protein interactions. Results In this study, among 218 essential proteins of this pathogenic bacterium, 81 nonhomologous proteins were identified, and 15 unique proteins involved in several metabolic pathways of S. sanguinis were selected through metabolic pathway analysis. Furthermore, four essential membrane-bound unique proteins involved in distinct metabolic pathways were revealed by this research. Active sites and druggable pockets of these selected proteins were investigated with bioinformatic techniques. In addition, this study also describes the activity of those proteins, as well as their interactions with other proteins. Conclusion Our findings helped to identify the type of protein to be considered as an efficient drug target. This study will pave the way for researchers to

  1. Identification of estrogenic compounds emitted from the combustion of computer printed circuit boards in electronic waste.

    PubMed

    Owens, Clyde V; Lambright, Christy; Bobseine, Kathy; Ryan, Bryce; Gray, L Earl; Gullett, Brian K; Wilson, Vickie S

    2007-12-15

Rapid changes in technology have brought about a surge in demand for electronic equipment. Many of these products contain brominated flame retardants (BFRs) as additives to decrease the rate of combustion, raising concerns about their toxicological risk. In our study, emissions from the combustion of computer printed circuit boards were evaluated in the T47D-KBluc estrogen-responsive cell line at a series of concentrations. There was significant activity from the emission extract when compared to the positive control, 0.1 nM estradiol. After HPLC fractionation, GC/MS identified ten chemicals, which included bisphenol A; its mono-, di-, and tribrominated derivatives; triphenyl phosphate; triphenyl phosphine oxide; 4'-bromo-[1,1'-biphenyl]-4-ol; 3,5-dibromo-4-hydroxybiphenyl; 3,5-dibromo-2-hydroxybiphenyl; and the oxygenated polyaromatic hydrocarbon benzanthrone. Commercially available samples of these ten compounds were tested. The compound 4'-bromo-[1,1'-biphenyl]-4-ol produced dose-dependent, significant increases in luciferase activity at concentrations ranging from 0.1 to 10 microM in the T47D-KBluc assay. The chemical also demonstrated an affinity for binding to the estrogen receptor (ER), with an IC50 of 2 x 10(-7) M. To determine uterotrophic activity, three doses (50, 100, and 200 mg/kg/day) of 4'-bromo-[1,1'-biphenyl]-4-ol were administered to adult ovariectomized Long-Evans rats for 3 days. Treatment of the animals with 200 mg/kg/day showed an increase in uterine weight. Hence, one new chemical, released by the burning of electrical wastes, was identified that displays estrogenic activity both in vitro and in vivo. However, it was about 1000-fold less potent than ethynyl estradiol. PMID:18200886

  2. Efficacy of a Standardized Computer-Based Training Curriculum to Teach Echocardiographic Identification of Rheumatic Heart Disease to Nonexpert Users.

    PubMed

    Beaton, Andrea; Nascimento, Bruno R; Diamantino, Adriana C; Pereira, Gabriel T R; Lopes, Eduardo L V; Miri, Cassio O; Bruno, Kaciane K O; Chequer, Graziela; Ferreira, Camila G; Lafeta, Luciana C X; Richards, Hedda; Perlman, Lindsay; Webb, Catherine L; Ribeiro, Antonio L P; Sable, Craig; Nunes, Maria do Carmo P

    2016-06-01

The ability to integrate echocardiographic screening for rheumatic heart disease (RHD) into RHD prevention programs is limited by the lack of financial and expert human resources in endemic areas. Task shifting to nonexperts is promising, but investigations into workforce composition and training schemes are needed. The objective of this study was to test nonexperts' ability to interpret RHD screening echocardiograms after a brief, standardized, computer-based training course. Six nonexperts completed a 3-week curriculum on image interpretation. Participant performance was tested in a school-screening environment in comparison to the reference approach (cardiologists, standard portable echocardiography machines, and 2012 World Heart Federation criteria). All participants successfully completed the curriculum, and feedback was universally positive. Screening was performed in 1,381 children (5 to 18 years, 60% female), with 397 (47 borderline RHD, 6 definite RHD, 336 normal, and 8 other) referred for handheld echocardiography. Overall sensitivity of the simplified approach was 83% (95% CI 76% to 89%), with an overall specificity of 85% (95% CI 82% to 87%). The most common reasons for false-negative screens (n = 16) were missed mitral regurgitation (MR; 44%) and MR ≤1.5 cm (29%). The most common reasons for false-positive screens (n = 179) included identification of erroneous color jets (25%), incorrect MR measurement (24%), and appropriate application of the simplified guidelines (39.4%). In conclusion, a short, independent, computer-based curriculum can successfully train a heterogeneous group of nonexperts to interpret RHD screening echocardiograms. This approach helps address prohibitive financial and workforce barriers to widespread RHD screening. PMID:27084054
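Metrics like the sensitivity and its 95% CI quoted above can be computed as follows; the Wald (normal-approximation) interval and the example counts are assumptions for illustration, not the study's actual data or interval construction:

```python
import math

def rate_with_ci(successes, total, z=1.96):
    """Proportion with a Wald (normal-approximation) 95% CI.  The study
    does not state which interval construction it used, so the Wald form
    here is an assumption; the counts below are illustrative only."""
    p = successes / total
    half = z * math.sqrt(p * (1.0 - p) / total)
    return p, (p - half, p + half)

# e.g. 83 of 100 reference-positive screens correctly flagged
sens, (lo, hi) = rate_with_ci(83, 100)
```

For small counts, an exact (Clopper-Pearson) or Wilson interval would be preferable to the Wald form.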

  3. Information-based system identification for predicting the groundwater-level fluctuations of hillslopes

    NASA Astrophysics Data System (ADS)

    Hong, Yao-Ming; Wan, Shiuan

    2011-09-01

The analysis of pre-existing landslides and landslide-prone hillslopes requires an estimation of maximum groundwater levels. A rapid increase in groundwater level may be a dominant factor in evaluating the occurrence of landslides. System identification—the use of mathematical tools and algorithms for building dynamic models from measured data—is adopted in this study. The fluid mass-balance equation is used to model groundwater-level fluctuations, and the model is solved numerically using the finite-difference method. Entropy-based classification (EBC) is used as a data-mining technique to identify the appropriate ranges of the influencing variables. The landslide area at Wushe Reservoir, Nantou County, Taiwan, is chosen as a field test site for verification. The study generated 65,535 parameter sets for the groundwater-level variables of the governing equation, judged by root-mean-square error. By applying cross-validation methods and EBC, limited numbers of validation samples are used to find the range of each parameter. Within these ranges, a heuristic method is employed to find the best value of each parameter for the groundwater-level prediction model. The ranges for the governing factors are evaluated and the resulting performance is examined.
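The entropy-based classification (EBC) step can be illustrated with a standard information-gain computation; the threshold candidates and labels below are toy values, not the Wushe Reservoir data:

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy (bits) of a label list."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def information_gain(values, labels, threshold):
    """Entropy reduction from splitting the samples at `threshold` on one
    candidate influencing variable, as in entropy-based classification."""
    left = [lab for v, lab in zip(values, labels) if v <= threshold]
    right = [lab for v, lab in zip(values, labels) if v > threshold]
    n = len(labels)
    remainder = (len(left) / n) * entropy(left) + (len(right) / n) * entropy(right)
    return entropy(labels) - remainder

# Toy data: a parameter value that separates "stable" from "landslide"
# cases perfectly at 0.3, so that threshold attains the maximal gain.
values = [0.1, 0.2, 0.3, 0.8, 0.9, 1.0]
labels = ["stable"] * 3 + ["landslide"] * 3
best_gain, best_t = max((information_gain(values, labels, t), t) for t in values[:-1])
```

Scanning candidate thresholds for the split with maximal gain is how an entropy-based method narrows each influencing variable to its informative range.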

  4. Detection and identification of sub-nanogram levels of protein in a nanoLC-trypsin-MS system.

    PubMed

    Slysz, Gordon W; Lewis, Darren F; Schriemer, David C

    2006-08-01

Proteomic workflows involving liquid-based protein separations are an alternative to gel-based protein analysis; however, the trypsin digestion procedure is usually difficult to implement, particularly when processing low-abundance proteins from capillary column effluent. To convert the protein to peptides for the purpose of identification, current protocols require several sample-handling steps, and sample losses become an issue. In this study, we present an improved system that conducts reversed-phase protein chromatography and rapid on-line tryptic digestion requiring sub-nanogram quantities of protein. This system employs a novel mirror-gradient concept that allows dynamic titration of the column effluent to create optimal conditions for real-time tryptic digestion. The purpose behind this development was to improve the limits of detection of the on-line concept, to support flow-based alternatives to gel-based proteomics, and to simplify the characterization of low-abundance proteins. Using test mixtures of proteins, we show that peptide mass fingerprinting with high sequence representation can be easily achieved at the 20 fmol level, with detection limits down to 5 fmol (85 pg myoglobin). Limits of identification using standard data-dependent MS/MS experiments are as low as 10 fmol. These results suggest that the nanoLC-trypsin-MS/MS system could represent an alternative to the conventional "1D-gel to MS" proteomic strategy. PMID:16889418

  5. Computational Identification of Diverse Mechanisms Underlying Transcription Factor-DNA Occupancy

    PubMed Central

    Cheng, Qiong; Kazemian, Majid; Pham, Hannah; Blatti, Charles; Celniker, Susan E.; Wolfe, Scot A.; Brodsky, Michael H.; Sinha, Saurabh

    2013-01-01

ChIP-based genome-wide assays of transcription factor (TF) occupancy have emerged as a powerful, high-throughput method to understand transcriptional regulation, especially on a global scale. This has led to great interest in the underlying biochemical mechanisms that direct TF-DNA binding, with the ultimate goal of computationally predicting a TF's occupancy profile in any cellular condition. In this study, we examined the influence of various potential determinants of TF-DNA binding on a much larger scale than previously undertaken. We used a thermodynamics-based model of TF-DNA binding, called “STAP,” to analyze 45 TF-ChIP data sets from Drosophila embryonic development. We built a cross-validation framework that compares a baseline model, based on the ChIP'ed (“primary”) TF's motif, to more complex models where binding by secondary TFs is hypothesized to influence the primary TF's occupancy. Candidate interacting TFs were chosen based on RNA-seq expression data from the time point of the ChIP experiment. We found widespread evidence of both cooperative and antagonistic effects by secondary TFs, and explicitly quantified these effects. We were able to identify multiple classes of interactions, including (1) long-range interactions between primary and secondary motifs (separated by ≤150 bp), suggestive of indirect effects such as chromatin remodeling, (2) short-range interactions with specific inter-site spacing biases, suggestive of direct physical interactions, and (3) overlapping binding sites, suggestive of competitive binding. Furthermore, by factoring out the previously reported strong correlation between TF occupancy and DNA accessibility, we were able to categorize the effects into those likely to be mediated by the secondary TF's effect on local accessibility and those that utilize accessibility-independent mechanisms. Finally, we conducted in vitro pull-down assays to test model-based predictions of short-range cooperative interactions.

  6. [Transfer characteristic and source identification of soil heavy metals from water-level-fluctuating zone along Xiangxi River, three-Gorges Reservoir area].

    PubMed

    Xu, Tao; Wang, Fei; Guo, Qiang; Nie, Xiao-Qian; Huang, Ying-Ping; Chen, Jun

    2014-04-01

Transfer characteristics and potential risks of soil heavy metals were evaluated based on measured heavy-metal concentrations in soils from the water-level-fluctuating zone (altitude: 145-175 m) and the bank (altitude: 175-185 m) along the Xiangxi River, Three Gorges Reservoir area. Factor analysis-multiple linear regression (FA-MLR) was employed for heavy-metal source identification and source apportionment. The results demonstrate that, during the exposed season, soil heavy-metal concentrations in both zones varied: in the water-level-fluctuation zone, concentrations decreased in shallow soil but increased in deep soil, whereas at the bank they decreased in both shallow and deep soil over the same period. According to the geoaccumulation index, the pollution extent of the heavy metals followed the order Cd > Pb > Cu > Cr, with Cd the primary pollutant. FA and FA-MLR reveal that in soils from the water-level-fluctuation zone, 75.60% of Pb originates from traffic, 62.03% of Cd from agriculture, and 64.71% of Cu and 75.36% of Cr from natural rock. In soils from the bank, 82.26% of Pb originates from traffic, 68.63% of Cd from agriculture, and 65.72% of Cu and 69.33% of Cr from natural rock. In conclusion, FA-MLR can successfully identify the sources of heavy metals and compute their source apportionment, while also revealing their transfer characteristics. This information can serve as a reference for heavy-metal pollution control. PMID:24946610
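The MLR half of the FA-MLR apportionment can be sketched as follows; here the factor scores are simulated directly rather than extracted by a factor analysis, and all coefficients and source names are synthetic:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 120

# Simulated "factor scores" for three hypothetical sources (traffic,
# agriculture, crustal); in the real workflow these come from the factor
# analysis of the full metal-concentration matrix.
S = rng.lognormal(mean=0.0, sigma=0.4, size=(n, 3))

# Synthetic Pb driven mostly by the traffic factor, a little by crustal.
pb = 5.0 * S[:, 0] + 1.0 * S[:, 2] + rng.normal(0.0, 0.2, n)

# MLR step: regress the metal on the factor scores (with intercept).
A = np.column_stack([np.ones(n), S])
beta, *_ = np.linalg.lstsq(A, pb, rcond=None)

# Apportionment: each source's share of the mean regressed concentration.
contrib = beta[1:] * S.mean(axis=0)
share = 100.0 * contrib / contrib.sum()
```

The percentage shares recovered this way are the kind of per-source figures (e.g. "75.60% of Pb from traffic") that the abstract reports.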

  7. Clinical Identification of the Vertebral Level at Which the Lumbar Sympathetic Ganglia Aggregate

    PubMed Central

    An, Ji Won; Koh, Jae Chul; Sun, Jong Min; Park, Ju Yeon; Choi, Jong Bum; Shin, Myung Ju

    2016-01-01

    Background The location and the number of lumbar sympathetic ganglia (LSG) vary between individuals. The aim of this study was to determine the appropriate level for a lumbar sympathetic ganglion block (LSGB), corresponding to the level at which the LSG principally aggregate. Methods Seventy-four consecutive subjects, including 31 women and 31 men, underwent LSGB either on the left (n = 31) or the right side (n = 43). The primary site of needle entry was randomly selected at the L3 or L4 vertebra. A total of less than 1 ml of radio opaque dye with 4% lidocaine was injected, taking caution not to traverse beyond the level of one vertebral body. The procedure was considered responsive when the skin temperature increased by more than 1℃ within 5 minutes. Results The median responsive level was significantly different between the left (lower third of the L4 body) and right (lower margin of the L3 body) sides (P = 0.021). However, there was no significant difference in the values between men and women. The overall median responsive level was the upper third of the L4 body. The mean responsive level did not correlate with height or BMI. There were no complications on short-term follow-up. Conclusions Selection of the primary target in the left lower third of the L4 vertebral body and the right lower margin of the L3 vertebral body may reduce the number of needle insertions and the volume of agents used in conventional or neurolytic LSGB and radiofrequency thermocoagulation. PMID:27103965

  8. Effects of orientation on the identification of rotated objects depend on the level of identity.

    PubMed

    Hamm, J P; McMullen, P A

    1998-04-01

Matching names to rotated line drawings of objects showed effects of object orientation that depended on name level. Large effects, in the same range as for object naming, were found for rotations between 0 degrees and 120 degrees from upright with subordinate names (e.g., collie), whereas nonsignificant effects were found with superordinate (e.g., animal) and basic names (e.g., dog). These results support image normalization occurring after contact with orientation-invariant representations that provide basic-level identity. They consequently fail to support theories of object recognition in which rotated object images are normalized to the upright position before contact with long-term object representations. PMID:9606109

  9. [Identification of high-lying odd energy levels of uranium by resonant ionization mass spectrometry].

    PubMed

    Du, H; Shi, G; Huang, M; Jin, C

    2000-06-01

Single-colour and two-colour multiphoton resonant ionization spectra of the uranium atom were studied extensively in our laboratory with an Nd:YAG laser-pumped dye laser, an atomic-beam apparatus, and a time-of-flight mass spectrometer. The energy locations of high-lying odd-parity levels in the region 33,003-34,264 cm-1, measured by a two-colour three-step ionization technique, are reported here. The angular momentum quantum number J was uniquely assigned for these levels by using angular momentum selection rules. PMID:12958925

  10. Computer-assisted identification and volumetric quantification of dynamic contrast enhancement in brain MRI: an interactive system

    NASA Astrophysics Data System (ADS)

    Wu, Shandong; Avgeropoulos, Nicholas G.; Rippe, David J.

    2013-03-01

We present a dedicated segmentation system for tumor identification and volumetric quantification in dynamic contrast-enhanced brain magnetic resonance (MR) scans. Our goal is to offer a practically useful tool to clinicians in order to boost volumetric tumor assessment. The system is designed to work in an interactive mode that maximizes the integration of computing capacity and clinical intelligence. We demonstrate the main functions of the system in terms of its functional flow and conduct preliminary validation using a representative pilot dataset. The system is inexpensive, user-friendly, easy to deploy and integrate with picture archiving and communication systems (PACS), and potentially open-source, which enables it to serve as a useful assistant for radiologists and oncologists. It is anticipated that in the future the system can be integrated into the clinical workflow so that it becomes routinely available to help clinicians make more objective interpretations of treatment interventions and the natural history of disease to best advocate for patient needs.

  11. A study to establish international diagnostic reference levels for paediatric computed tomography.

    PubMed

    Vassileva, J; Rehani, M; Kostova-Lefterova, D; Al-Naemi, H M; Al Suwaidi, J S; Arandjic, D; Bashier, E H O; Kodlulovich Renha, S; El-Nachef, L; Aguilar, J G; Gershan, V; Gershkevitsh, E; Gruppetta, E; Hustuc, A; Jauhari, A; Kharita, Mohammad Hassan; Khelassi-Toutaoui, N; Khosravi, H R; Khoury, H; Kralik, I; Mahere, S; Mazuoliene, J; Mora, P; Muhogora, W; Muthuvelu, P; Nikodemova, D; Novak, L; Pallewatte, A; Pekarovič, D; Shaaban, M; Shelly, E; Stepanyan, K; Thelsy, N; Visrutaratna, P; Zaman, A

    2015-07-01

    The article reports results from the largest international dose survey in paediatric computed tomography (CT), covering 32 countries, and proposes international diagnostic reference levels (DRLs) in terms of computed tomography dose index (CTDIvol) and dose length product (DLP). It also assesses whether mean or median values of individual facilities should be used. A total of 6115 individual patient data were recorded in four age groups: <1 y, >1-5 y, >5-10 y and >10-15 y. CTDIw, CTDIvol and DLP from the CT console were recorded in dedicated forms together with patient data and technical parameters. Statistical analysis was performed, and international DRLs were established at rounded 75th percentile values of the distribution of median values from all CT facilities. The study presents evidence in favour of using the median rather than the mean of patient dose indices as representative of the typical local dose in a facility, and of establishing DRLs as the third quartile of median values. International DRLs were established for routine paediatric head, chest and abdomen CT examinations in the four age groups. DRLs for CTDIvol are similar to the reference values from other published reports, with some differences for chest and abdomen CT. Higher variation was observed among DLP values because this survey covered whole multi-phase exams, whereas other studies in the literature were based on a single phase only. The DRLs reported in this article can be used in countries without sufficient medical physics support to identify non-optimised practice. Recommendations to improve the accuracy and usefulness of future surveys are provided. PMID:25836685
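    The DRL construction described above (the median dose index of each facility taken as its typical local dose, then the DRL set at the rounded 75th percentile of those medians) can be sketched in a few lines of Python; the facility names and DLP values below are invented for illustration.

```python
from statistics import median

# Hypothetical per-facility DLP values (mGy*cm) for one exam/age group;
# names and numbers are illustrative, not survey data.
facility_dlp = {
    "A": [320, 410, 290, 380],
    "B": [510, 470, 530],
    "C": [250, 300, 270, 260],
    "D": [420, 400, 440],
}

# Step 1: typical local dose = median (not mean) of each facility's values.
facility_medians = sorted(median(v) for v in facility_dlp.values())

# Step 2: DRL = 75th percentile of the facility medians (linear interpolation).
rank = 0.75 * (len(facility_medians) - 1)
lo, frac = int(rank), rank - int(rank)
hi = min(lo + 1, len(facility_medians) - 1)
p75 = facility_medians[lo] + frac * (facility_medians[hi] - facility_medians[lo])

drl = round(p75 / 10) * 10  # rounded value, as in the study
print(drl)  # → 440
```

With real survey data the same two steps would be applied separately per examination type and age group.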

  12. Living Systems are Dynamically Stable by Computing Themselves at the Quantum Level

    NASA Astrophysics Data System (ADS)

    Igamberdiev, Abir U.

    2003-06-01

    The smallest details of living systems are molecular devices that operate between the classical and quantum levels, i.e. between the potential dimension (microscale) and the actual three-dimensional space (macroscale). They realize non-demolition quantum measurements in which time appears as a mesoscale dimension separating contradictory statements in the course of actualization. These smaller devices form larger devices (macromolecular complexes), up to the living body. The quantum device possesses its own potential internal quantum state (IQS), which is maintained for a prolonged time via error correction, understood as a reflection over this state. The decoherence-free IQS can exhibit itself through the creative generation of iteration limits in the real world. To avoid a collapse of the quantum information in the process of correcting errors, it is possible to make a partial measurement that extracts only the error information and leaves the encoded state untouched. In natural quantum computers, which living systems are, the error correction is internal. It is a result of reflection, given as a sort of subjective process allotting optimal limits of iteration. The IQS resembles a quasi-particle that interacts with its surroundings, applying decoherence commands to them. In this framework, enzymes are molecular automata of the extremal quantum computer, the set of which maintains a stable, highly ordered coherent state, and the genome represents a concatenation of error-correcting codes into a single reflective set. Biological systems, being autopoietic in physical space, control quantum measurements in the physical universe. Biological evolution is really a functional evolution of measurement constraints in which limits of iteration are established, possessing criteria of perfection and having selective values.

  13. DIATOM INDICES OF STREAM ECOSYSTEM CONDITIONS: COMPARISON OF GENUS VS. SPECIES LEVEL IDENTIFICATIONS

    EPA Science Inventory

    Diatom assemblage data collected between 1993 and 1995 from 233 Mid-Appalachian streams were used to compare indices of biotic integrity based on genus vs. species level taxonomy. Thirty-seven genera and 197 species of diatoms were identified from these samples. Metrics included...

  14. A Composite Approach To The Identification Of High-Level Topological Features In A Histopathologic Image

    NASA Astrophysics Data System (ADS)

    Kuhn, W. P.; Bartels, H. G.; Bartels, P. H.; Richards, D. L.; Saffer, J. S.; Shoemaker, R. L.

    1988-06-01

    Analysis of the large amounts of image data obtainable from very-high-speed scanning laser microscopes places severe demands on computer software and hardware architectures. The automated calculation of features over entire images can provide quantitative data useful to a pathologist who must make a diagnosis. A program that identifies objects of diagnostic interest in an image must utilize a model of the image. An expert system is an effective method for building abstract models of object hierarchies and for utilizing heuristic information. In this paper we discuss a composite approach to image understanding and assessment that utilizes an expert system to control a set of image processing functions for the recognition of various objects in an image.

  15. Computer-assisted identification of novel small molecule inhibitors targeting GLUT1

    NASA Astrophysics Data System (ADS)

    Wan, Zhining; Li, Xin; Sun, Rong; Li, Yuanyuan; Wang, Xiaoyun; Li, Xinru; Rong, Li; Shi, Zheng; Bao, Jinku

    2015-12-01

    Glucose transporters (GLUTs) are the main carriers that facilitate the diffusion of glucose into mammalian cells, with GLUT1 the most prominent. Notably, GLUT1 is a rate-limiting transporter for glucose uptake, and its overexpression is a common characteristic of most cancers. The inhibition of GLUT1 by novel small compounds to lower the glucose supply of cancer cells has therefore become an emerging strategy. Herein, we employed high-throughput screening approaches to identify potential inhibitors against the sugar-binding site of GLUT1. First, molecular docking screening was run against the Specs compound database, and three molecules (ZINC19909927, ZINC19908826, and ZINC19815451) were selected as candidate GLUT1 inhibitors for further analysis. Then, taking the initial ligand β-NG as a reference, molecular dynamics (MD) simulations and the molecular mechanics/generalized Born surface area (MM/GBSA) method were applied to evaluate the binding stability and affinity of the three candidates towards GLUT1. Finally, we found that ZINC19909927 might have the highest affinity for the binding site of GLUT1. Meanwhile, energy decomposition analysis identified several residues in the substrate-binding site that might provide clues for future inhibitor discovery for GLUT1. Taken together, the results of our study may provide valuable information for identifying new inhibitors targeting GLUT1-mediated glucose transport and metabolism for cancer therapeutics.
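    The final ranking step (scoring each docked candidate by an MM/GBSA-style binding energy and keeping the most favourable) can be illustrated with a toy calculation. The energy terms below are invented placeholders, not values from the study, and the solute entropy contribution is omitted in this sketch.

```python
# Illustrative MM/GBSA-style ranking; all energies (kcal/mol) are invented.
candidates = {
    "ZINC19909927": {"E_MM": -52.1, "G_solv": 18.4},
    "ZINC19908826": {"E_MM": -47.3, "G_solv": 16.0},
    "ZINC19815451": {"E_MM": -44.8, "G_solv": 15.2},
}

def binding_energy(terms):
    # dG_bind ~ dE_MM + dG_solv (entropy term -T*dS omitted in this sketch)
    return terms["E_MM"] + terms["G_solv"]

# Sort ascending: the most negative total is the strongest predicted binder.
ranked = sorted(candidates, key=lambda c: binding_energy(candidates[c]))
print(ranked[0])
```

In practice each term would itself be an ensemble average over MD snapshots rather than a single number.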

  16. Computational identification of miRNAs that modulate the differentiation of mesenchymal stem cells to osteoblasts

    PubMed Central

    Seenprachawong, Kanokwan; Nuchnoi, Pornlada; Nantasenamat, Chanin; Prachayasittikul, Virapong

    2016-01-01

    MicroRNAs (miRNAs) are small endogenous noncoding RNAs that play an instrumental role in post-transcriptional modulation of gene expression. Genes related to osteogenesis (i.e., RUNX2, COL1A1 and OSX) are important in controlling the differentiation of mesenchymal stem cells (MSCs) to bone tissues. The regulated expression level of miRNAs is critically important for the differentiation of MSCs to preosteoblasts. An understanding of miRNA regulation in osteogenesis could be applied to the future treatment of bone defects. Therefore, this study aims to shed light on the mechanistic pathway underlying osteogenesis by predicting miRNAs that may modulate this pathway. This study investigates RUNX2, a major transcription factor for osteogenesis that drives MSCs into preosteoblasts. Three different prediction tools were employed for identifying miRNAs related to osteogenesis using the 3’UTR of RUNX2 as the target gene. Of the 1,023 miRNAs, 70 were found by at least two of the tools. Candidate miRNAs were then selected based on their free energy values, followed by assessing the probability of target accessibility. The results showed that miRNAs 23b, 23a, 30b, 143, 203, 217, and 221 could regulate the RUNX2 gene during the differentiation of MSCs to preosteoblasts. PMID:27168985
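    The consensus filter used above (keep a miRNA only if at least two of the three prediction tools report it, then screen by hybridization free energy) can be sketched as follows. The tool outputs and energies are hypothetical, and the -18 kcal/mol cutoff is an assumed threshold, not the study's.

```python
from collections import Counter

# Hypothetical per-tool predictions against the RUNX2 3'UTR (illustrative).
tool_hits = {
    "tool1": {"miR-23b", "miR-30b", "miR-143", "miR-99a"},
    "tool2": {"miR-23b", "miR-203", "miR-143"},
    "tool3": {"miR-30b", "miR-203", "miR-221"},
}
free_energy = {  # duplex hybridization energy, kcal/mol (illustrative)
    "miR-23b": -24.1, "miR-30b": -21.5, "miR-143": -19.8,
    "miR-203": -17.2, "miR-221": -23.0, "miR-99a": -12.0,
}

# Consensus: miRNA must be reported by at least two tools.
counts = Counter(m for hits in tool_hits.values() for m in hits)
consensus = {m for m, n in counts.items() if n >= 2}

# Energy screen: keep duplexes at or below an assumed cutoff.
candidates = sorted(m for m in consensus if free_energy[m] <= -18.0)
print(candidates)
```

A real pipeline would add the target-accessibility step described in the abstract after this filter.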

  17. Computational identification of miRNAs that modulate the differentiation of mesenchymal stem cells to osteoblasts.

    PubMed

    Seenprachawong, Kanokwan; Nuchnoi, Pornlada; Nantasenamat, Chanin; Prachayasittikul, Virapong; Supokawej, Aungkura

    2016-01-01

    MicroRNAs (miRNAs) are small endogenous noncoding RNAs that play an instrumental role in post-transcriptional modulation of gene expression. Genes related to osteogenesis (i.e., RUNX2, COL1A1 and OSX) are important in controlling the differentiation of mesenchymal stem cells (MSCs) to bone tissues. The regulated expression level of miRNAs is critically important for the differentiation of MSCs to preosteoblasts. An understanding of miRNA regulation in osteogenesis could be applied to the future treatment of bone defects. Therefore, this study aims to shed light on the mechanistic pathway underlying osteogenesis by predicting miRNAs that may modulate this pathway. This study investigates RUNX2, a major transcription factor for osteogenesis that drives MSCs into preosteoblasts. Three different prediction tools were employed for identifying miRNAs related to osteogenesis using the 3'UTR of RUNX2 as the target gene. Of the 1,023 miRNAs, 70 were found by at least two of the tools. Candidate miRNAs were then selected based on their free energy values, followed by assessing the probability of target accessibility. The results showed that miRNAs 23b, 23a, 30b, 143, 203, 217, and 221 could regulate the RUNX2 gene during the differentiation of MSCs to preosteoblasts. PMID:27168985

  18. Analysis of the Computer Anxiety Levels of Secondary Technical Education Teachers in West Virginia.

    ERIC Educational Resources Information Center

    Gordon, Howard R. D.

    1995-01-01

    Responses from 116 secondary technical education teachers (91%) who completed Oetting's Computer Anxiety Scale indicated that 46% experienced general computer anxiety. Development of computer and typing skills explained a large portion of the variance. More hands-on training was recommended. (SK)

  19. The Impact of Cognitive and Non-Cognitive Personality Traits on Computer Literacy Level

    ERIC Educational Resources Information Center

    Saparniene, Diana; Merkys, Gediminas; Saparnis, Gintaras

    2006-01-01

    Purpose: The paper deals with the study of students' computer literacy one of the purposes being demonstration the impact of the cognitive and non-cognitive personality traits (attention, verbal and non-verbal intelligence, emotional-motivational relationship with computer, learning strategies, etc.) on the quality of computer literacy.…

  20. MICA, Managed Instruction with Computer Assistance: Level Five. An Outline of the System's Capabilities.

    ERIC Educational Resources Information Center

    Lorenz, Thomas B.; And Others

    Computer technology has been used since 1972 in the Madison, Wisconsin, public schools to control the flow of information required to support individualized instruction. Madison's computer-managed instruction system, MICA (Managed Instruction with Computer Assistance), operates interactively within individualized instruction programs to provide…

  1. High-level waste storage tank farms/242-A evaporator standards/requirements identification document (S/RID), Vol. 3

    SciTech Connect

    Not Available

    1994-04-01

    The Safeguards and Security (S&S) Functional Area addresses the programmatic and technical requirements, controls, and standards which assure compliance with applicable S&S laws and regulations. Numerous S&S responsibilities are performed on behalf of the Tank Farm Facility by site-level organizations. Certain other responsibilities are shared, and the remainder are the sole responsibility of the Tank Farm Facility. This Requirements Identification Document describes a complete functional Safeguards and Security Program that is presumed to be the responsibility of the Tank Farm Facility. The following list identifies the programmatic elements in the S&S Functional Area: Program Management, Protection Program Scope and Evaluation, Personnel Security, Physical Security Systems, Protection Program Operations, Material Control and Accountability, Information Security, and Key Program Interfaces.

  2. Identification of interleukin 2, 6, and 8 levels around miniscrews during orthodontic tooth movement.

    PubMed

    Hamamcı, Nihal; Acun Kaya, Filiz; Uysal, Ersin; Yokuş, Beran

    2012-06-01

    The aim of this study was to identify the levels of interleukin (IL)-2, IL-6, and IL-8 around miniscrews used for anchorage during canine distalization. Sixteen patients (eight males and eight females; mean age, 16.6 ± 2.4 years) who were treated with bilateral upper first premolar extractions were included in the study. Thirty-two maxillary miniscrew implants were placed bilaterally in the alveolar bone between the maxillary second premolars and first molars as anchorage units for maxillary canine distalization. Three groups were constructed: the treatment, miniscrew, and control groups consisted of the upper canines, the miniscrew implants, and the upper second premolars, respectively. Peri-miniscrew implant crevicular fluid and gingival crevicular fluid (GCF) were obtained at baseline (T1) and at 1 (T2), 24 (T3), and 48 (T4) hours, 7 (T5) and 21 (T6) days, and 3 months (T7) after force application. Paired-sample t-tests were used to determine within-group changes, and Dunnett's t and Tukey's honestly significant difference tests were used for between-group multiple comparisons. During the 3 month period, IL-2 levels increased significantly (P < 0.01) only in the treatment group, after 24 hours. IL-6 levels were unchanged at all time points in the three groups. IL-8 levels increased significantly at 1 (P < 0.05), 24 (P < 0.01), and 48 (P < 0.01) hours in the treatment group and at 24 (P < 0.05) and 48 (P < 0.01) hours in the miniscrew group. It appears that miniscrews can be used for anchorage in orthodontics when correct physiological forces are applied. PMID:21474566
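    The within-group comparison described above is a paired-sample t-test. A minimal stdlib-only sketch follows; the cytokine concentrations are invented, not data from the study.

```python
from math import sqrt
from statistics import mean, stdev

def paired_t(before, after):
    """Paired-sample t statistic for a within-group change.

    Returns (t, degrees of freedom); the p-value would then be read
    from the t distribution with df = n - 1.
    """
    diffs = [a - b for a, b in zip(after, before)]
    n = len(diffs)
    t = mean(diffs) / (stdev(diffs) / sqrt(n))
    return t, n - 1

# Illustrative IL-2 concentrations (pg/mL) at baseline (T1) and 24 h (T3),
# one value per patient; the numbers are invented.
t1 = [4.1, 3.8, 5.0, 4.4, 3.9, 4.6, 4.2, 4.8]
t3 = [6.0, 5.1, 6.8, 5.9, 5.4, 6.3, 5.7, 6.5]
t, df = paired_t(t1, t3)
print(t, df)
```

A large positive t with df = 7 here would indicate a significant increase from T1 to T3, mirroring the IL-2 result in the abstract.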

  3. Identification of technical problems encountered in the shallow land burial of low-level radioactive wastes

    SciTech Connect

    Jacobs, D.G.; Epler, J.S.; Rose, R.R.

    1980-03-01

    A review of problems encountered in the shallow land burial of low-level radioactive wastes has been made in support of the technical aspects of the National Low-Level Waste (LLW) Management Research and Development Program being administered by the Low-Level Waste Management Program Office, Oak Ridge National Laboratory. The operating histories of burial sites at six major DOE and five commercial facilities in the US have been examined and several major problems identified. The problems experienced at the sites have been grouped into general categories dealing with site development, waste characterization, operation, and performance evaluation. Based on this grouping of the problems, a number of major technical issues have been identified which should be incorporated into program plans for further research and development. For each technical issue a discussion is presented relating the issue to a particular problem, identifying some recent or current related research, and suggesting further work necessary for resolving the issue. Major technical issues which have been identified include the need for improved water management, further understanding of the effect of chemical and physical parameters on radionuclide migration, more comprehensive waste records, improved programs for performance monitoring and evaluation, development of better predictive capabilities, evaluation of space utilization, and improved management control.

  4. Toward accurate tooth segmentation from computed tomography images using a hybrid level set model

    SciTech Connect

    Gan, Yangzhou; Zhao, Qunfei; Xia, Zeyang E-mail: jing.xiong@siat.ac.cn; Hu, Ying; Xiong, Jing E-mail: jing.xiong@siat.ac.cn; Zhang, Jianwei

    2015-01-15

    Purpose: A three-dimensional (3D) model of the teeth provides important information for orthodontic diagnosis and treatment planning. Tooth segmentation is an essential step in generating the 3D digital model from computed tomography (CT) images. The aim of this study is to develop an accurate and efficient tooth segmentation method from CT images. Methods: The 3D dental CT volumetric images are segmented slice by slice in a two-dimensional (2D) transverse plane. The 2D segmentation is composed of a manual initialization step and an automatic slice-by-slice segmentation step. In the manual initialization step, the user manually picks a starting slice and selects a seed point for each tooth in this slice. In the automatic slice segmentation step, a developed hybrid level set model is applied to segment tooth contours from each slice. A tooth contour propagation strategy is employed to initialize the level set function automatically. Cone beam CT (CBCT) images of two subjects were used to tune the parameters. Images of 16 additional subjects were used to validate the performance of the method. Volume overlap metrics and surface distance metrics were adopted to assess the segmentation accuracy quantitatively. The volume overlap metrics were volume difference (VD, mm³) and Dice similarity coefficient (DSC, %). The surface distance metrics were average symmetric surface distance (ASSD, mm), RMS (root mean square) symmetric surface distance (RMSSSD, mm), and maximum symmetric surface distance (MSSD, mm). Computation time was recorded to assess the efficiency. The performance of the proposed method has been compared with two state-of-the-art methods. Results: For the tested CBCT images, the VD, DSC, ASSD, RMSSSD, and MSSD for the incisor were 38.16 ± 12.94 mm³, 88.82 ± 2.14%, 0.29 ± 0.03 mm, 0.32 ± 0.08 mm, and 1.25 ± 0.58 mm, respectively; the VD, DSC, ASSD, RMSSSD, and MSSD for the canine were 49.12 ± 9.33 mm³, 91.57 ± 0.82%, 0.27 ± 0.02 mm, 0
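    Of the overlap metrics above, the Dice similarity coefficient is simple to state: DSC = 2|S ∩ R| / (|S| + |R|), expressed as a percentage. A minimal sketch on toy binary masks (not the study's data):

```python
# Dice similarity coefficient (DSC, %) between a segmentation and a
# reference; masks are flat lists of 0/1 voxels (illustrative only).
def dice(seg, ref):
    inter = sum(s and r for s, r in zip(seg, ref))  # voxels in both masks
    return 200.0 * inter / (sum(seg) + sum(ref))    # 2*|S&R|/(|S|+|R|) as %

seg = [1, 1, 1, 0, 0, 1, 0, 1]  # toy segmentation result
ref = [1, 1, 0, 0, 1, 1, 0, 1]  # toy ground truth
print(dice(seg, ref))  # → 80.0
```

The surface distance metrics (ASSD, RMSSSD, MSSD) require the mask boundaries and a nearest-neighbour search, so they are omitted from this sketch.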

  5. Computation of dose rate at flight altitudes during ground level enhancements no. 69, 70 and 71

    NASA Astrophysics Data System (ADS)

    Mishev, A. L.; Adibpour, F.; Usoskin, I. G.; Felsberger, E.

    2015-01-01

    A new numerical model of estimating and monitoring the exposure of personnel due to secondary cosmic radiation onboard aircraft, in accordance with radiation safety standards as well as European and national regulations, has been developed. The model aims to calculate the effective dose at flight altitude (39,000 ft) due to secondary cosmic radiation of galactic and solar origin. In addition, the model allows the estimation of ambient dose equivalent at typical commercial airline altitudes in order to provide comparison with reference data. The basics, structure and function of the model are described. The model is based on a straightforward full Monte Carlo simulation of the cosmic ray induced atmospheric cascade. The cascade simulation is performed with the PLANETOCOSMICS code. The flux of secondary particles, namely neutrons, protons, gammas, electrons, positrons, muons and charged pions is calculated. A subsequent conversion of the particle fluence into the effective dose or ambient dose equivalent is performed as well as a comparison with reference data. An application of the model is demonstrated, using a computation of the effective dose rate at flight altitude during the ground level enhancements of 20 January 2005, 13 December 2006 and 17 May 2012.
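    The last step of the model (folding each secondary-particle fluence into an effective dose rate via fluence-to-dose conversion coefficients) reduces to a weighted sum over particle species. The sketch below uses placeholder fluences and coefficients, not PLANETOCOSMICS output or ICRP values.

```python
# Placeholder secondary-particle fluence rates at ~39,000 ft
# (particles / cm^2 / h); values are illustrative only.
fluence = {
    "neutron": 1.2e4,
    "proton": 3.0e2,
    "gamma": 8.0e3,
    "muon": 2.5e3,
}
# Placeholder fluence-to-effective-dose conversion coefficients
# (uSv * cm^2 per particle); real work would use tabulated ICRP values.
coeff = {
    "neutron": 2.0e-4,
    "proton": 1.5e-3,
    "gamma": 2.0e-6,
    "muon": 3.0e-6,
}

# Effective dose rate = sum over species of fluence * conversion coefficient.
dose_rate = sum(fluence[p] * coeff[p] for p in fluence)  # uSv/h
print(dose_rate)
```

The same sum, evaluated with the enhanced fluences during a ground level enhancement, gives the elevated dose rates the paper reports for the three events.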

  6. Mechanical Behaviour of Light Metal Alloys at High Strain Rates. Computer Simulation on Mesoscale Levels

    NASA Astrophysics Data System (ADS)

    Skripnyak, Vladimir; Skripnyak, Evgeniya; Meyer, Lothar W.; Herzig, Norman; Skripnyak, Nataliya

    2012-02-01

    Research in recent years has established that the laws of deformation and fracture of bulk ultrafine-grained and coarse-grained materials differ under both static and dynamic loading conditions. Development of adequate constitutive equations for describing the mechanical behavior of bulk ultrafine-grained materials under intensive dynamic loading is complicated by insufficient knowledge of the general rules of inelastic deformation and of the nucleation and growth of cracks. A multi-scale computational model was used to investigate the deformation and fracture of bulk structured aluminum and magnesium alloys under stress pulse loading at the mesoscale level. The increment of plastic deformation is defined as the sum of the increments caused by the nucleation and gliding of dislocations, twinning, meso-block movement, and grain boundary sliding. The model takes into account the influence of average grain size, grain size distribution, and precipitate concentration on the mechanical properties of the alloys. It was found that the nucleation and gliding of dislocations cause a higher attenuation rate of the elastic precursor in ultrafine-grained alloys than in their coarse-grained counterparts.
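    The additive decomposition of the plastic strain increment described in the abstract can be written compactly; the subscript labels below are assumed for illustration, not taken from the paper.

```latex
\Delta\varepsilon^{p}
  = \Delta\varepsilon^{p}_{\mathrm{disl}}   % nucleation and gliding of dislocations
  + \Delta\varepsilon^{p}_{\mathrm{twin}}   % twinning
  + \Delta\varepsilon^{p}_{\mathrm{meso}}   % meso-block movement
  + \Delta\varepsilon^{p}_{\mathrm{gb}}     % grain boundary sliding
```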

  7. Identification of sites for the low-level waste disposal development and demonstration program

    SciTech Connect

    Ketelle, R.H.; Lee, D.W.

    1988-04-01

    This report presents the results of site selection studies for potential low-level radioactive waste disposal sites on the Oak Ridge Reservation (ORR). Summaries of the site selection procedures used and results of previous site selection studies on the ORR are included. This report includes recommendations of sites for demonstration of shallow land burial using engineered trench designs and demonstration of above-grade disposal using design concepts similar to those used in tumulus disposal. The site selection study, like its predecessor (ORNL/TM-9717, Use of DOE Site Selection Criteria for Screening Low-Level Waste Disposal Sites on the Oak Ridge Reservation), involved application of exclusionary site screening criteria to the region of interest to eliminate unacceptable areas from consideration. Also like the previous study, the region of interest for this study was limited to the Oak Ridge Department of Energy Reservation. Reconnaissance-level environmental data were used in the study, and field inspections of candidate sites were made to verify the available reconnaissance data. Five candidate sites, all underlain by Knox dolomite residuum and bedrock, were identified for possible development of shallow land burial facilities. Of the five candidate sites, the West Chestnut site was judged to be best suited for deployment of the shallow land burial technology. Three candidate sites, all underlain by the Conasauga Group in Bear Creek Valley, were identified for possible development of above-grade disposal technologies. Of the three sites identified, the Central Bear Creek Valley site lying between State Route 95 and Gum Hollow Road was ranked most favorable for deployment of the above-grade disposal technology.

  8. Identification of the double acceptor levels of the mercury vacancies in HgCdTe

    NASA Astrophysics Data System (ADS)

    Gemain, F.; Robin, I. C.; De Vita, M.; Brochen, S.; Lusson, A.

    2011-03-01

    Photoluminescence and temperature-dependent Hall measurements of non-intentionally doped HgCdTe epilayers were compared. The films were grown by liquid phase epitaxy and post-annealed under different conditions: p-type annealing was used to control the mercury vacancy concentration, and n-type annealing under a saturated Hg atmosphere was used to fill the mercury vacancies. The comparison of the photoluminescence measurements with the Hall effect measurements allows us to identify the two acceptor energy levels of the mercury vacancy and to evidence its "negative-U" property, corresponding to a stabilization of the ionized state V- of the mercury vacancy relative to its neutral state V0.

  9. Oil contamination of fish in the North Sea. Determination of levels and identification of sources

    SciTech Connect

    Johnsen, S.; Restucci, R.; Klungsoyr, J.

    1996-12-31

    Two fish species, cod and haddock, were sampled from five different regions in the Norwegian sector of the North Sea, the Haltenbanken and the Barents Sea. Three of the five sampling areas were located in regions with no local oil or gas production, while the remaining two represented regions with a high density of oil and gas production fields. A total of 25 specimens of each of the two fish species were collected, and liver (all samples) and muscle (10 samples from each group) were analyzed for the content of total hydrocarbons (THC), selected aromatic compounds (NPD and PAH) and bicyclic aliphatic decalins. The present paper outlines the results of liver sample analyses from four of the sampled regions: a northern North Sea region and the three reference regions Egersundbanken, Haltenbanken and the Barents Sea. In general, no significant difference was observed between the hydrocarbon levels within the sampled regions. The only observed exception was a moderate but significant increase in decalin levels in haddock liver from the northern North Sea region. The qualitative interpretation of the results showed that the sources of hydrocarbon contamination varied within the total sampling area. This observation indicates that local discharges are the sources of hydrocarbons in fish from areas with high petroleum production activity. However, it was not possible to identify single discharges as a contamination source from the present results.

  10. Identification of risk factors for Campylobacter contamination levels on broiler carcasses during the slaughter process.

    PubMed

    Seliwiorstow, Tomasz; Baré, Julie; Berkvens, Dirk; Van Damme, Inge; Uyttendaele, Mieke; De Zutter, Lieven

    2016-06-01

    Campylobacter carcass contamination was quantified across the slaughter line during processing of Campylobacter positive batches. These quantitative data were combined with information describing slaughterhouse and batch related characteristics in order to identify risk factors for Campylobacter contamination levels on broiler carcasses. The results revealed that Campylobacter counts are influenced by the contamination of incoming birds (both the initial external carcass contamination and the colonization level of the caeca) and by the duration of transport and holding time, which can be linked to the feed withdrawal period. In addition, technical aspects of the slaughter process such as a dump-based unloading system, electrical stunning, lower scalding temperature, and incorrect setting of the plucking, vent cutter and evisceration machines were identified as risk factors associated with increased Campylobacter counts on processed carcasses. As such, the study indicates possible improvements of the slaughter process that can result in better control of Campylobacter numbers under routine processing of Campylobacter positive batches without the use of chemical or physical decontamination. Moreover, all investigated factors were existing variations of routine processing practices, and the proposed interventions are therefore practically and economically achievable. PMID:27016637

  11. Identification of low level gamma-irradiation of meats by high sensitivity comet assay

    NASA Astrophysics Data System (ADS)

    Miyahara, Makoto; Saito, Akiko; Ito, Hitoshi; Toyoda, Masatake

    2002-03-01

    The detection of low levels of irradiation in meats (pork, beef, and chicken) using the new comet assay was investigated in order to assess the capability of the procedure. The new assay includes a process that improves its sensitivity to irradiation and a novel evaluation system for each slide (influence score and comet-type distribution). Samples were purchased at retailers and irradiated at 0.5 and 2 kGy at 0°C, then processed to obtain comets. Slides were evaluated by typing the comets, calculating the influence score, and analyzing the comet-type distribution chart of the slide. Influence scores of beef, pork, and chicken at 0 kGy were 287 (SD=8.0), 305 (SD=12.9), and 320 (SD=21.0), respectively; at 500 Gy they were 305 (SD=5.3), 347 (SD=10.6), and 364 (SD=12.6), respectively. Irradiation levels in food were successfully determined. Sensitivity to irradiation differed among samples (chicken > pork > beef).

  12. Rapid identification of Brucella isolates to the species level by real time PCR based single nucleotide polymorphism (SNP) analysis

    PubMed Central

    Gopaul, Krishna K; Koylass, Mark S; Smith, Catherine J; Whatmore, Adrian M

    2008-01-01

    Background: Brucellosis, caused by members of the genus Brucella, remains one of the world's major zoonotic diseases. Six species have classically been recognised within the genus, largely based on a combination of classical microbiology and host specificity, although additional novel Brucella have more recently been isolated from various marine mammals and voles. Classical identification to the species level is based on a biotyping approach that is lengthy, requires extensive and hazardous culturing, and can be difficult to interpret. Here we describe a simple and rapid approach to identification of Brucella isolates to the species level based on real-time PCR analysis of species-specific single nucleotide polymorphisms (SNPs) that were identified following a robust and extensive phylogenetic analysis of the genus. Results: Seven pairs of short-sequence Minor Groove Binding (MGB) probes were designed, corresponding to SNPs shown to possess an allele specific for each of the six classical Brucella spp. and the marine mammal Brucella. Assays were optimised to identical reaction parameters in order to give a multiple-outcome assay that can differentiate all the classical species and Brucella isolated from marine mammals. The scope of the assay was confirmed by testing of over 300 isolates of Brucella, all of which typed as predicted when compared to other phenotypic and genotypic approaches. The assay is sensitive, being capable of detecting and differentiating down to 15 genome equivalents. We further describe the design and testing of assays based on three additional SNPs located within the 16S rRNA gene that ensure positive discrimination of Brucella from close phylogenetic relatives on the same platform. Conclusion: The multiple-outcome assay described represents a new tool for the rapid, simple and unambiguous characterisation of Brucella to the species level. Furthermore, being based on a robust phylogenetic framework, the assay provides a platform
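    The multiple-outcome logic of the assay (each species resolved by its own allele-specific probe) can be mimicked with a small lookup table. The SNP positions and alleles below are invented for illustration; only the species names of the classical scheme plus the marine mammal group come from the abstract.

```python
# Toy version of the multiple-outcome SNP assay: each species-defining SNP
# position maps to the allele its MGB probe matches. Positions and alleles
# are invented; the real assay uses seven optimised probe pairs.
snp_signatures = {
    "B. abortus":    {"pos1": "A"},
    "B. melitensis": {"pos2": "G"},
    "B. suis":       {"pos3": "T"},
    "B. ovis":       {"pos4": "C"},
    "B. canis":      {"pos5": "A"},
    "B. neotomae":   {"pos6": "G"},
    "marine mammal Brucella": {"pos7": "T"},
}

def type_isolate(alleles):
    """Return the species whose defining SNP allele the isolate carries."""
    for species, signature in snp_signatures.items():
        if all(alleles.get(pos) == base for pos, base in signature.items()):
            return species
    return "not determined"

isolate = {"pos1": "G", "pos2": "G", "pos3": "C"}  # carries the pos2 'G' allele
print(type_isolate(isolate))  # → B. melitensis
```

In the real assay each probe pair reports in its own fluorescence channel, so the "lookup" happens in parallel across all seven reactions.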

  13. ENHANCING THE ATOMIC-LEVEL UNDERSTANDING OF CO2 MINERAL SEQUESTRATION MECHANISMS VIA ADVANCED COMPUTATIONAL MODELING

    SciTech Connect

    A.V.G. Chizmeshya

    2002-12-19

    /NETL managed National Mineral Sequestration Working Group we have already significantly improved our understanding of mineral carbonation. Group members at the Albany Research Center have recently shown that carbonation of olivine and serpentine, which naturally occurs over geological time (i.e., 100,000s of years), can be accelerated to near completion in hours. Further process refinement will require a synergetic science/engineering approach that emphasizes simultaneous investigation of both thermodynamic processes and the detailed microscopic, atomic-level mechanisms that govern carbonation kinetics. Our previously funded Phase I Innovative Concepts project demonstrated the value of advanced quantum-mechanical modeling as a complementary tool in bridging important gaps in our understanding of the atomic/molecular structure and reaction mechanisms that govern CO2 mineral sequestration reaction processes for the model Mg-rich lamellar hydroxide feedstock material Mg(OH)2. In the present simulation project, improved techniques and more efficient computational schemes have allowed us to expand and augment these capabilities and explore more complex Mg-rich, lamellar hydroxide-based feedstock materials, including the serpentine-based minerals. These feedstock materials are being actively investigated due to their wide availability, and low-cost CO2 mineral sequestration potential. Cutting-edge first principles quantum chemical, computational solid-state and materials simulation methodology studies proposed herein, have been strategically integrated with our new DOE supported (ASU-Argonne National Laboratory) project to investigate the mechanisms that govern mineral feedstock heat-treatment and aqueous/fluid-phase serpentine mineral carbonation in situ. This unified, synergetic theoretical and experimental approach will provide a deeper understanding of the key reaction mechanisms than either individual approach can alone. Ab initio techniques will also

  14. ENHANCING THE ATOMIC-LEVEL UNDERSTANDING OF CO2 MINERAL SEQUESTRATION MECHANISMS VIA ADVANCED COMPUTATIONAL MODELING

    SciTech Connect

    A.V.G. Chizmeshya; M.J. McKelvy; G.H. Wolf; R.W. Carpenter; D.A. Gormley; J.R. Diefenbacher; R. Marzke

    2006-03-01

    significantly improved our understanding of mineral carbonation. Group members at the Albany Research Center have recently shown that carbonation of olivine and serpentine, which naturally occurs over geological time (i.e., 100,000s of years), can be accelerated to near completion in hours. Further process refinement will require a synergistic science/engineering approach that emphasizes simultaneous investigation of both thermodynamic processes and the detailed microscopic, atomic-level mechanisms that govern carbonation kinetics. Our previously funded Phase I Innovative Concepts project demonstrated the value of advanced quantum-mechanical modeling as a complementary tool in bridging important gaps in our understanding of the atomic/molecular structure and reaction mechanisms that govern CO2 mineral sequestration reaction processes for the model Mg-rich lamellar hydroxide feedstock material Mg(OH)2. In the present simulation project, improved techniques and more efficient computational schemes have allowed us to expand and augment these capabilities and explore more complex Mg-rich, lamellar hydroxide-based feedstock materials, including the serpentine-based minerals. These feedstock materials are being actively investigated due to their wide availability and low-cost CO2 mineral sequestration potential. The cutting-edge first-principles quantum chemical, computational solid-state, and materials simulation studies proposed herein have been strategically integrated with our new DOE-supported (ASU-Argonne National Laboratory) project to investigate the mechanisms that govern mineral feedstock heat-treatment and aqueous/fluid-phase serpentine mineral carbonation in situ. This unified, synergistic theoretical and experimental approach has provided a deeper understanding of the key reaction mechanisms than either individual approach can alone. 
We used ab initio techniques to significantly advance our understanding of atomic-level processes at the solid/solution interface by

  15. ENHANCING THE ATOMIC-LEVEL UNDERSTANDING OF CO2 MINERAL SEQUESTRATION MECHANISMS VIA ADVANCED COMPUTATIONAL MODELING

    SciTech Connect

    A.V.G. Chizmeshya

    2003-12-19

    /NETL-managed National Mineral Sequestration Working Group we have already significantly improved our understanding of mineral carbonation. Group members at the Albany Research Center have recently shown that carbonation of olivine and serpentine, which naturally occurs over geological time (i.e., 100,000s of years), can be accelerated to near completion in hours. Further process refinement will require a synergistic science/engineering approach that emphasizes simultaneous investigation of both thermodynamic processes and the detailed microscopic, atomic-level mechanisms that govern carbonation kinetics. Our previously funded Phase I Innovative Concepts project demonstrated the value of advanced quantum-mechanical modeling as a complementary tool in bridging important gaps in our understanding of the atomic/molecular structure and reaction mechanisms that govern CO{sub 2} mineral sequestration reaction processes for the model Mg-rich lamellar hydroxide feedstock material Mg(OH){sub 2}. In the present simulation project, improved techniques and more efficient computational schemes have allowed us to expand and augment these capabilities and explore more complex Mg-rich, lamellar hydroxide-based feedstock materials, including the serpentine-based minerals. These feedstock materials are being actively investigated due to their wide availability and low-cost CO{sub 2} mineral sequestration potential. The cutting-edge first-principles quantum chemical, computational solid-state, and materials simulation studies proposed herein have been strategically integrated with our new DOE-supported (ASU-Argonne National Laboratory) project to investigate the mechanisms that govern mineral feedstock heat-treatment and aqueous/fluid-phase serpentine mineral carbonation in situ. This unified, synergistic theoretical and experimental approach will provide a deeper understanding of the key reaction mechanisms than either individual approach can alone. 
Ab initio techniques will also

  16. Three dimensional morphological studies of Larger Benthic Foraminifera at the population level using micro computed tomography

    NASA Astrophysics Data System (ADS)

    Kinoshita, Shunichi; Eder, Wolfgang; Woeger, Julia; Hohenegger, Johann; Briguglio, Antonino; Ferrandez-Canadell, Carles

    2015-04-01

    Symbiont-bearing larger benthic Foraminifera (LBF) are long-lived (at least 1 year), single-celled marine organisms with complex calcium carbonate shells. Their morphology has been intensively studied since the middle of the nineteenth century. This led to a broad spectrum of taxonomic results, important from biostratigraphy to ecology in shallow-water tropical to warm temperate marine palaeo-environments. However, traditional investigation methods required cutting or destroying specimens to analyse the taxonomically important inner structures. X-ray micro-computed tomography (microCT) is one of the newest techniques used in morphological studies. Its greatest advantage is the non-destructive acquisition of inner structures. Furthermore, the ongoing improvement of microCT scanner hardware and software provides high-resolution, short-duration scans well suited for LBF. Three-dimensional imaging techniques make it possible to select and extract each chamber and to easily measure its volume, surface and several form parameters used for morphometric analyses. Thus, 3-dimensional visualisation of LBF tests is a major step forward from traditional morphology based on 2-dimensional data. The quantification of chamber form is a great opportunity to tackle LBF structures, architectures and bauplan geometry. Micrometric digital resolution is the only way to resolve many controversies in the phylogeny and evolutionary trends of LBF. For the present study we used micro-computed tomography to investigate the chamber number of every specimen from a statistically representative part of populations in order to estimate population dynamics. Samples of living individuals are collected at monthly intervals from fixed locations. Specific preparation allows up to 35 specimens to be scanned in a single 2-hour scan, yielding the complete digital dataset for each specimen of the population. MicroCT thus enables a fast and precise count of all chambers built by the foraminifer from its
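    The chamber-volume measurement described above reduces, once the scan has been segmented, to counting voxels per chamber label and scaling by the voxel volume. A minimal sketch, assuming an already-labelled integer volume (the array, labels, and voxel size below are illustrative, not from the study):

    ```python
    import numpy as np

    def chamber_volumes(labels: np.ndarray, voxel_size_um: float) -> dict:
        """Volume of each labelled chamber in a segmented microCT stack.
        `labels` is a 3-D integer array: 0 is background, each positive
        integer marks one chamber."""
        voxel_vol = voxel_size_um ** 3
        ids, counts = np.unique(labels[labels > 0], return_counts=True)
        return {int(i): float(c) * voxel_vol for i, c in zip(ids, counts)}

    # Toy 3x3x3 stack: chamber 1 occupies 4 voxels, chamber 2 occupies 2.
    stack = np.zeros((3, 3, 3), dtype=int)
    stack[0, 0, :2] = 1
    stack[0, 1, :2] = 1
    stack[2, 2, :2] = 2
    vols = chamber_volumes(stack, voxel_size_um=2.0)  # 2 um isotropic voxels
    ```

    Surface area and other form parameters would need a mesh (e.g. marching cubes) rather than raw voxel counts; the volume step alone is shown here.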

  17. Identification of strong and weak interacting two-level systems in KBr:CN.

    PubMed

    Gaita-Ariño, Alejandro; Schechter, Moshe

    2011-09-01

    Tunneling two-level systems (TLSs) are believed to be the source of phenomena such as the universal low-temperature properties of disordered and amorphous solids, and 1/f noise. The existence of these phenomena in a large variety of dissimilar physical systems testifies to the universal nature of the TLSs, which, however, is not yet known. Following a recent suggestion that attributes the low-temperature TLSs to inversion pairs [M. Schechter and P. C. E. Stamp, arXiv:0910.1283] we calculate explicitly the TLS-phonon coupling of inversion-symmetric and asymmetric TLSs in a given disordered crystal. Our work (a) estimates parameters that support the theory in M. Schechter and P. C. E. Stamp, arXiv:0910.1283, in its general form, and (b) positively identifies, for the first time, the relevant TLSs in a given system. PMID:21981511

  18. Identification of Geotrichum candidum at the species and strain level: proposal for a standardized protocol.

    PubMed

    Gente, S; Sohier, D; Coton, E; Duhamel, C; Gueguen, M

    2006-12-01

    In this study, the M13 primer was used to distinguish Geotrichum candidum from the anamorphic and teleomorphic forms of other arthrospore-forming species (discriminatory power = 0.99). For intraspecific characterization, the GATA4 primer showed the highest level of discrimination for G. candidum among the 20 microsatellite primers tested. A molecular typing protocol (DNA concentration, hybridization temperature and type of PCR machine) was optimized through a series of intra- and interlaboratory trials. This protocol was validated using 75 strains of G. candidum, one strain of G. capitatum and one strain of G. fragrans, and exhibited a discrimination score of 0.87. This method could therefore be used in the agro-food industries to identify and to evaluate biodiversity and trace strains of G. candidum. The results show that the GATA4 primer might be used to differentiate strains according to their ecological niche. PMID:16855820
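    The discriminatory power and discrimination score quoted above are conventionally computed as Simpson's index of discrimination (the probability that two randomly drawn strains fall into different fingerprint types). A small sketch of that calculation; the strain counts are invented, not the study's data:

    ```python
    def discriminatory_power(type_counts):
        """Simpson's index of discrimination for a typing method.
        `type_counts` lists how many strains share each distinct
        fingerprint; the result is the probability that two randomly
        chosen strains are assigned different types."""
        n = sum(type_counts)
        return 1.0 - sum(c * (c - 1) for c in type_counts) / (n * (n - 1))

    # 4 strains, all with unique fingerprints: perfect discrimination.
    assert discriminatory_power([1, 1, 1, 1]) == 1.0
    ```

    A value of 0.87, as reported for the GATA4 protocol, means an 87% chance that two randomly picked strains are distinguished.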

  19. Identification of genetic variants of lecithin cholesterol acyltransferase in individuals with high HDL-C levels.

    PubMed

    Naseri, Mohsen; Hedayati, Mehdi; Daneshpour, Maryam Sadat; Bandarian, Fatemeh; Azizi, Fereidoun

    2014-07-01

    Among the most common lipid abnormalities, a low level of high-density lipoprotein-cholesterol (HDL-C) is one of the first risk factors identified for coronary heart disease. Lecithin cholesterol acyltransferase (LCAT) has a pivotal role in the formation and maturation of HDL-C and in reverse cholesterol transport. To identify genetic loci associated with low HDL-C in a population-based cohort in Tehran, the promoter, coding regions and exon/intron boundaries of LCAT were amplified and sequenced in consecutive individuals (n=150) who had extremely low or high HDL-C levels but no other major lipid abnormalities. A total of 14 single-nucleotide polymorphisms (SNPs) were identified, of which 10 were found to be novel; the L393L, S232T and 16:67977696 C>A polymorphisms have been previously reported in the SNP Database (as rs5923, rs4986970 and rs11860115, respectively) and the non-synonymous R47M mutation has been reported in the Catalogue of Somatic Mutations in Cancer (COSM972635). Three of the SNPs identified in the present study (position 6,531 in exon 5, position 6,696 in exon 5 and position 5,151 in exon 1) led to an amino acid substitution. The most common variants were L393L (4886C/T) in exon 6 and Q177E, a novel mutation, in exon 5, and the prevalence of the heterozygous genotype of these two SNPs was significantly higher in the low HDL-C groups. Univariate conditional logistic regression odds ratios (ORs) were nominally significant for Q177E (OR, 5.64; P=0.02; 95% confidence interval, 1.2-26.2). However, this finding was attenuated following adjustment for confounders. Further studies using a larger sample size may enhance the determination of the role of these SNPs. PMID:24789697
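    An unadjusted odds ratio with a Wald 95% confidence interval, of the kind reported for Q177E, comes straight from a 2x2 carrier-by-group table. A sketch of that standard calculation; the counts below are illustrative, not the study's data:

    ```python
    import math

    def odds_ratio_ci(a, b, c, d, z=1.96):
        """Odds ratio and Wald 95% CI from a 2x2 table:
        a = carriers among cases,    b = non-carriers among cases,
        c = carriers among controls, d = non-carriers among controls."""
        or_ = (a * d) / (b * c)
        se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
        lo = math.exp(math.log(or_) - z * se)
        hi = math.exp(math.log(or_) + z * se)
        return or_, lo, hi

    estimate = odds_ratio_ci(14, 61, 5, 70)  # toy genotype counts
    ```

    Conditional logistic regression, as used in the paper, additionally accounts for the matched design; the Wald 2x2 version is only the simplest approximation.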

  20. Analysis and identification of two reconstituted tobacco sheets by three-level infrared spectroscopy

    NASA Astrophysics Data System (ADS)

    Wu, Xian-xue; Xu, Chang-hua; Li, Ming; Sun, Su-qin; Li, Jin-ming; Dong, Wei

    2014-07-01

    Two kinds of reconstituted tobacco (RT) from France (RTF) and China (RTC) were analyzed and identified by a three-level infrared spectroscopy method: Fourier transform infrared spectroscopy (FT-IR) coupled with second-derivative infrared spectroscopy (SD-IR) and two-dimensional infrared correlation spectroscopy (2D-IR). The conventional IR spectra of RTF parallel samples were more consistent than those of RTC, according to their overlapped parallel spectra and IR spectra correlation coefficients. The FT-IR spectra of the two RTs were similar in overall spectral profile except for small differences around 1430 cm-1, indicating that they have similar chemical constituents. Analysis of the SD-IR spectra of RTFs and RTCs disclosed more distinct fingerprint features, especially peaks at 1106 (1110), 1054 (1059) and 877 (874) cm-1. The better reproducibility of the five SD-IR spectra of RTF in 1750-1400 cm-1 could be seen intuitively from their stacked spectra and was confirmed by further similarity evaluation of the SD-IR spectra. The presence of calcium carbonate and calcium oxalate could be easily observed in both RTs by comparing their spectra with references. Furthermore, the 2D-IR spectra revealed obvious and intuitive differences between RTF and RTC. Both RTs had a pair of strong positive auto-peaks in 1600-1400 cm-1. Specifically, the auto-peak at 1586 cm-1 in RTF was stronger than the one around 1421 cm-1, whereas the one at 1587 cm-1 in RTC was weaker than that at 1458 cm-1. Consequently, the two RT brands were thoroughly analyzed and identified, and RTF showed better homogeneity than RTC. The three-level infrared spectroscopy method thus proved to be a simple, convenient and efficient method for rapid discrimination and homogeneity estimation of RT.
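    The spectrum-to-spectrum correlation coefficients and second-derivative spectra used above can be sketched numerically. The spectra below are synthetic, and the derivative is a plain finite difference; real work would apply Savitzky-Golay smoothing before differentiating:

    ```python
    import numpy as np

    def second_derivative(spectrum):
        """Numerical second-derivative spectrum (np.gradient applied
        twice); a crude stand-in for Savitzky-Golay differentiation."""
        return np.gradient(np.gradient(spectrum))

    def spectral_similarity(s1, s2):
        """Pearson correlation coefficient between two absorbance
        spectra, the figure used to judge parallel-sample consistency."""
        return float(np.corrcoef(s1, s2)[0, 1])

    x = np.linspace(0.0, 1.0, 200)
    a = np.exp(-(x - 0.4) ** 2 / 0.01)   # one synthetic absorbance band
    b = a + 0.01 * np.sin(40 * x)        # a near-identical parallel sample
    ```

    High correlation between parallel samples (close to 1) is what the abstract reports as good reproducibility for RTF.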

  1. Interactomes to Biological Phase Space: a call to begin thinking at a new level in computational biology.

    SciTech Connect

    Davidson, George S.; Brown, William Michael

    2007-09-01

    Techniques for high-throughput determination of interactomes, together with high-resolution protein colocalization maps within organelles and through membranes, will soon create a vast resource. With these data, biological descriptions akin to the high-dimensional phase spaces familiar to physicists will become possible. These descriptions will capture sufficient information to make possible realistic, system-level models of cells. The descriptions and the computational models they enable will require powerful computing techniques. This report is offered as a call to the computational biology community to begin thinking at this scale and as a challenge to develop the required algorithms and codes to make use of the new data.

  2. Identification of 14-3-3 Proteins Phosphopeptide-Binding Specificity Using an Affinity-Based Computational Approach

    PubMed Central

    Li, Zhao; Tang, Jijun; Guo, Fei

    2016-01-01

    The 14-3-3 proteins are a highly conserved family of homodimeric and heterodimeric molecules, expressed in all eukaryotic cells. In human cells, this family consists of seven distinct but highly homologous 14-3-3 isoforms. 14-3-3σ is the only isoform directly linked to cancer in epithelial cells, which is regulated by major tumor suppressor genes. For each 14-3-3 isoform, we have 1,000 peptide motifs with experimental binding affinity values. In this paper, we present a novel method for identifying peptide motifs binding to the 14-3-3σ isoform. First, we propose a sampling criterion to build a predictor for each new peptide sequence. Then, we select nine physicochemical properties of amino acids to describe each peptide motif. We also use auto-cross covariance to extract correlative properties of amino acids in any two positions. Finally, we use an elastic net to predict affinity values of peptide motifs, based on ridge regression and the least absolute shrinkage and selection operator (LASSO). We tested our method on the 1,000 known peptide motifs binding to the seven 14-3-3 isoforms. On the 14-3-3σ isoform, our method has overall Pearson product-moment correlation coefficient (PCC) and root mean squared error (RMSE) values of 0.84 and 252.31 for the N-terminal sublibrary, and 0.77 and 269.13 for the C-terminal sublibrary. We predicted affinity values for 16,000 peptide sequences; the relative binding ability across six permuted positions was similar to the experimental values. We identify phosphopeptides that preferentially bind to 14-3-3σ over other isoforms. Several positions on the peptide motifs fall into the same amino acid category as the experimentally determined substrate specificity of phosphopeptides binding to 14-3-3σ. Our method is fast and reliable and is a general computational method that can be used in peptide-protein binding identification in proteomics research. PMID:26828594
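    The auto-cross covariance (ACC) step above turns a variable-position peptide into a fixed-length feature vector. A minimal sketch, using two invented property scales in place of the nine curated physicochemical indices the paper uses (peptide, scale names, and values are illustrative):

    ```python
    import numpy as np

    # Two illustrative property scales (hydrophobicity-like, volume-like).
    SCALES = {
        "hyd": {"A": 1.8, "R": -4.5, "S": -0.8, "P": -1.6, "K": -3.9},
        "vol": {"A": 88.6, "R": 173.4, "S": 89.0, "P": 112.7, "K": 168.6},
    }

    def acc_features(peptide, max_lag=2):
        """Auto-cross covariance descriptors: for each ordered pair of
        property scales and each lag, the mean-centred covariance between
        residue i and residue i+lag along the peptide."""
        props = {k: np.array([v[aa] for aa in peptide])
                 for k, v in SCALES.items()}
        n = len(peptide)
        feats = []
        for pj in props.values():
            for pk in props.values():
                mj, mk = pj.mean(), pk.mean()
                for lag in range(1, max_lag + 1):
                    feats.append(np.mean((pj[:n - lag] - mj) * (pk[lag:] - mk)))
        return np.array(feats)

    fv = acc_features("ARSPK")  # 2 scales x 2 scales x 2 lags = 8 features
    ```

    The fixed-length vector is what a regressor such as an elastic net can then be trained on.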

  3. Identification of new stress-induced microRNA and their targets in wheat using computational approach

    PubMed Central

    Pandey, Bharati; Gupta, Om Prakash; Pandey, Dev Mani; Sharma, Indu; Sharma, Pradeep

    2013-01-01

    MicroRNAs (miRNAs) are a class of short endogenous non-coding small RNA molecules of about 18–22 nucleotides in length. Their main function is to downregulate gene expression in different ways, such as translational repression, mRNA cleavage and epigenetic modification. Computational predictions have raised the number of miRNAs in wheat significantly using an EST-based approach. Hence, a combinatorial approach, an amalgamation of bioinformatics software and Perl scripts, was used to identify new miRNAs to add to the growing database of wheat miRNA. Identification of miRNAs was initiated by mining the EST (Expressed Sequence Tags) database available at the National Center for Biotechnology Information. In this investigation, 4677 mature microRNA sequences belonging to 50 miRNA families from different plant species were used to predict miRNA in wheat. A total of five abiotic stress-responsive new miRNAs were predicted and named Ta-miR5653, Ta-miR855, Ta-miR819k, Ta-miR3708 and Ta-miR5156. In addition, four previously identified miRNAs, i.e., Ta-miR1122, miR1117, Ta-miR1134 and Ta-miR1133, were predicted in newly identified EST sequences, and 14 potential target genes were subsequently predicted, most of which seem to encode ubiquitin carrier protein, serine/threonine protein kinase, 40S ribosomal protein, F-box/kelch-repeat protein, BTB/POZ domain-containing protein, and transcription factors which are involved in growth, development, metabolism and stress response. Our results have increased the number of miRNAs in wheat, which should be useful for further investigation into the biological functions and evolution of miRNAs in wheat and other plant species. PMID:23511197
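    The homology-search core of such an EST-mining pipeline is sliding each known mature miRNA along an EST and keeping near-exact matches. A sketch under simplified assumptions (the sequences and the mismatch cutoff are invented; real pipelines also fold the flanking sequence to confirm a hairpin precursor):

    ```python
    def mismatches(a, b):
        """Count position-wise substitutions between equal-length strings."""
        return sum(x != y for x, y in zip(a, b))

    def scan_est(est, mature_mirnas, max_mismatch=2):
        """Slide each known mature miRNA along an EST and report hits
        with at most `max_mismatch` substitutions, returning
        (miRNA name, offset, mismatch count) tuples."""
        hits = []
        for name, mir in mature_mirnas.items():
            for i in range(len(est) - len(mir) + 1):
                mm = mismatches(est[i:i + len(mir)], mir)
                if mm <= max_mismatch:
                    hits.append((name, i, mm))
        return hits

    known = {"mir-x": "ACGUACGUACGUACGUACGU"}        # hypothetical miRNA
    est = "GGG" + "ACGUACGUACGUACGUACGA" + "CCC"     # one terminal mismatch
    ```

    Candidate hits would then be filtered by precursor secondary structure and minimum free energy before being called new miRNAs.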

  4. MULTIPRED2: a computational system for large-scale identification of peptides predicted to bind to HLA supertypes and alleles.

    PubMed

    Zhang, Guang Lan; DeLuca, David S; Keskin, Derin B; Chitkushev, Lou; Zlateva, Tanya; Lund, Ole; Reinherz, Ellis L; Brusic, Vladimir

    2011-11-30

    MULTIPRED2 is a computational system for facile prediction of peptide binding to multiple alleles belonging to human leukocyte antigen (HLA) class I and class II DR molecules. It enables prediction of peptide binding to products of individual HLA alleles, combination of alleles, or HLA supertypes. NetMHCpan and NetMHCIIpan are used as prediction engines. The 13 HLA Class I supertypes are A1, A2, A3, A24, B7, B8, B27, B44, B58, B62, C1, and C4. The 13 HLA Class II DR supertypes are DR1, DR3, DR4, DR6, DR7, DR8, DR9, DR11, DR12, DR13, DR14, DR15, and DR16. In total, MULTIPRED2 enables prediction of peptide binding to 1077 variants representing 26 HLA supertypes. MULTIPRED2 has visualization modules for mapping promiscuous T-cell epitopes as well as those regions of high target concentration - referred to as T-cell epitope hotspots. Novel graphic representations are employed to display the predicted binding peptides and immunological hotspots in an intuitive manner and also to provide a global view of results as heat maps. Another function of MULTIPRED2, which has direct relevance to vaccine design, is the calculation of population coverage. Currently it calculates population coverage in five major groups in North America. MULTIPRED2 is an important tool to complement wet-lab experimental methods for identification of T-cell epitopes. It is available at http://cvc.dfci.harvard.edu/multipred2/. PMID:21130094
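    The population-coverage calculation mentioned above can be sketched in its simplest single-locus form: under Hardy-Weinberg proportions, the chance an individual carries at least one allele for which the peptide set has a predicted binder. This is an assumed simplification, not necessarily MULTIPRED2's exact formula, and the allele frequencies below are invented:

    ```python
    def population_coverage(allele_freqs, covered):
        """Fraction of a population expected to carry at least one
        covered HLA allele at a locus, assuming Hardy-Weinberg
        proportions (each person carries two alleles per locus)."""
        p_uncovered_chrom = 1.0 - sum(allele_freqs[a] for a in covered)
        return 1.0 - p_uncovered_chrom ** 2

    freqs = {"A*02:01": 0.25, "A*01:01": 0.15, "A*03:01": 0.10}  # toy values
    cov = population_coverage(freqs, covered={"A*02:01", "A*03:01"})
    ```

    Multi-locus coverage multiplies the per-locus uncovered probabilities; population-specific frequency tables are what make the five North American group estimates differ.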

  5. Identification of 14-3-3 Proteins Phosphopeptide-Binding Specificity Using an Affinity-Based Computational Approach.

    PubMed

    Li, Zhao; Tang, Jijun; Guo, Fei

    2016-01-01

    The 14-3-3 proteins are a highly conserved family of homodimeric and heterodimeric molecules, expressed in all eukaryotic cells. In human cells, this family consists of seven distinct but highly homologous 14-3-3 isoforms. 14-3-3σ is the only isoform directly linked to cancer in epithelial cells, which is regulated by major tumor suppressor genes. For each 14-3-3 isoform, we have 1,000 peptide motifs with experimental binding affinity values. In this paper, we present a novel method for identifying peptide motifs binding to the 14-3-3σ isoform. First, we propose a sampling criterion to build a predictor for each new peptide sequence. Then, we select nine physicochemical properties of amino acids to describe each peptide motif. We also use auto-cross covariance to extract correlative properties of amino acids in any two positions. Finally, we use an elastic net to predict affinity values of peptide motifs, based on ridge regression and the least absolute shrinkage and selection operator (LASSO). We tested our method on the 1,000 known peptide motifs binding to the seven 14-3-3 isoforms. On the 14-3-3σ isoform, our method has overall Pearson product-moment correlation coefficient (PCC) and root mean squared error (RMSE) values of 0.84 and 252.31 for the N-terminal sublibrary, and 0.77 and 269.13 for the C-terminal sublibrary. We predicted affinity values for 16,000 peptide sequences; the relative binding ability across six permuted positions was similar to the experimental values. We identify phosphopeptides that preferentially bind to 14-3-3σ over other isoforms. Several positions on the peptide motifs fall into the same amino acid category as the experimentally determined substrate specificity of phosphopeptides binding to 14-3-3σ. Our method is fast and reliable and is a general computational method that can be used in peptide-protein binding identification in proteomics research. PMID:26828594

  6. Identification of antipsychotic drug fluspirilene as a potential p53-MDM2 inhibitor: a combined computational and experimental study.

    PubMed

    Patil, Sachin P; Pacitti, Michael F; Gilroy, Kevin S; Ruggiero, John C; Griffin, Jonathan D; Butera, Joseph J; Notarfrancesco, Joseph M; Tran, Shawn; Stoddart, John W

    2015-02-01

    The inhibition of the tumor suppressor p53 protein through its direct interaction with the oncogenic murine double minute 2 (MDM2) protein plays a central role in almost 50% of all human tumor cells. Therefore, pharmacological inhibition of the p53-binding pocket on MDM2, leading to p53 activation, presents an important therapeutic target against cancers expressing wild-type p53. In this context, the present study used an integrated virtual and experimental screening approach to screen a database of approved drugs for potential p53-MDM2 interaction inhibitors. Specifically, using an ensemble rigid-receptor docking approach with four MDM2 protein crystal structures, six drug molecules were identified as possible p53-MDM2 inhibitors. These drug molecules were then subjected to further molecular modeling investigation through flexible-receptor docking followed by Prime/MM-GBSA binding energy analysis. These studies identified fluspirilene, an approved antipsychotic drug, as a top hit with an MDM2 binding mode and energy similar to those of a native MDM2 crystal ligand. Molecular dynamics simulations suggested stable binding of fluspirilene to the p53-binding pocket on the MDM2 protein. Experimental testing of fluspirilene showed significant growth inhibition of human colon tumor cells in a p53-dependent manner. Fluspirilene also inhibited growth of several other human tumor cell lines in the NCI60 cell line panel. Taken together, these computational and experimental data suggest a potentially novel role of fluspirilene in inhibiting the p53-MDM2 interaction. It is noteworthy here that fluspirilene has a long history of safe human use, thus presenting immediate clinical potential as a cancer therapeutic. Furthermore, fluspirilene could also serve as a structurally novel lead molecule for the development of more potent, small-molecule p53-MDM2 inhibitors against several types of cancer. Importantly, the combined computational and experimental screening protocol
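    The ensemble rigid-receptor step can be caricatured as consensus ranking of ligands over per-structure docking scores. A sketch with invented scores (the numbers below are illustrative, not the study's values, and real pipelines use docking engines rather than precomputed tables):

    ```python
    def consensus_rank(scores_per_receptor):
        """Rank ligands by their mean docking score over an ensemble of
        receptor crystal structures (lower score = better predicted
        binding), mimicking an ensemble rigid-receptor docking step."""
        ligands = scores_per_receptor[0].keys()
        mean = {lig: sum(s[lig] for s in scores_per_receptor)
                / len(scores_per_receptor) for lig in ligands}
        return sorted(ligands, key=lambda lig: mean[lig])

    # Toy scores (kcal/mol) for three drugs against two MDM2 structures.
    ens = [{"fluspirilene": -9.1, "drugB": -7.2, "drugC": -8.0},
           {"fluspirilene": -8.7, "drugB": -7.8, "drugC": -8.9}]
    ranking = consensus_rank(ens)
    ```

    Top-ranked candidates would then proceed to flexible-receptor docking and MM-GBSA rescoring, as the abstract describes.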

  7. Prevention of Secondary Conditions in Fetal Alcohol Spectrum Disorders: Identification of Systems-Level Barriers

    PubMed Central

    Petrenko, Christie L. M.; Tahir, Naira; Mahoney, Erin C.; Chin, Nancy P.

    2013-01-01

    Objective Fetal alcohol spectrum disorders (FASD) impact 2 to 5 percent of the U.S. population and are associated with life-long cognitive and behavioral impairments. Individuals with FASD have high rates of secondary conditions, including mental health problems, school disruptions, and trouble with the law. This study focuses on systems-level barriers that contribute to secondary conditions and interfere with prevention and treatment. Methods Using a phenomenological methodology, semi-structured interviews and focus groups were conducted with parents of children with FASD and service providers. Data were analyzed using a framework approach. Results Participants emphasized the pervasive lack of knowledge of FASD throughout multiple systems. This lack of knowledge contributes to multi-system barriers including delayed diagnosis, unavailability of services, and difficulty qualifying for, implementing, and maintaining services. Conclusions FASD is a major public health problem. Broad system changes using a public health approach are needed to increase awareness and understanding of FASD, improve access to diagnostic and therapeutic services, and create responsive institutional policies to prevent secondary conditions. These changes are essential to improve outcomes for individuals with FASD and their families and facilitate dissemination of empirically supported interventions. PMID:24178158

  8. Identification of APC mutations and evaluation of their expression level using a functional screening assay

    SciTech Connect

    Varesco, L.; Gismondi, V.; Bafico, A.

    1994-09-01

    A functional screen for chain-terminating mutations in the APC gene has recently been developed. It is based on PCR and cloning of a segment of the gene in-frame with a colorimetric marker gene (lacZ), followed by screening for the level of activity of the marker polypeptide (beta-galactosidase). This method scores colonies by the different blue colors produced by bacteria containing normal and mutant APC segments. In the present work this method was used to screen the entire APC coding region using eight primer pairs. DNA segments with known APC mutations at different positions in the gene were used as controls and were clearly identifiable with this assay. In addition, the entire APC coding region was examined in 21 APC patients in whom PCR-SSCP did not identify an APC mutation. Novel mutations (n=14) were identified by the blue/white assay and were all confirmed by sequence analysis. This method was also used to quantitate the expression of paternal and maternal APC alleles, taking advantage of an RsaI site polymorphism at position 1458, in a small number of informative individuals. Differential expression of some known mutant APC mRNAs was observed.

  9. Identification of oxidized phospholipids in bronchoalveolar lavage exposed to low ozone levels using multivariate analysis

    PubMed Central

    Almstrand, Ann-Charlotte; Voelker, Dennis; Murphy, Robert C

    2015-01-01

    Chemical reactions with unsaturated phospholipids in the respiratory tract lining fluid have been identified as one of the first important steps in the mechanisms mediating environmental ozone toxicity. As a consequence of these reactions, complex mixtures of oxidized lipids are generated in the presence of mixtures of non-oxidized naturally occurring phospholipid molecular species, which challenge methods of analysis. Untargeted mass spectrometry and statistical methods were employed to approach these complex spectra. Human bronchoalveolar lavage (BAL) was exposed to low levels of ozone and samples, with and without derivatization of aldehydes, were analyzed by liquid chromatography electrospray ionization tandem mass spectrometry. Data processing was carried out using principal component analysis (PCA). Resulting PCA score plots indicated an ozone dose-dependent increase, with apparent separation between BAL samples exposed to 60 ppb ozone and non-exposed BAL samples, and a clear separation between ozonized samples before and after derivatization. Corresponding loadings plots revealed that more than 30 phosphatidylcholine (PC) species decreased due to ozonation. A total of 13 PC and 6 phosphatidylglycerol oxidation products were identified with the majority being structurally characterized as chain-shortened aldehyde products. This method exemplifies an approach for comprehensive detection of low abundance, yet important, components in complex lipid samples. PMID:25575758
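    The PCA score plots described above can be computed directly from the SVD of the mean-centred samples-by-features data matrix. A sketch on synthetic "exposed vs. control" data; the simulated ozone shift and all sizes are illustrative, not the study's measurements:

    ```python
    import numpy as np

    def pca_scores(X, n_components=2):
        """Principal-component scores of a samples x features matrix,
        from the SVD of the mean-centred data."""
        Xc = X - X.mean(axis=0)
        U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
        return Xc @ Vt[:n_components].T

    rng = np.random.default_rng(0)
    control = rng.normal(size=(8, 30))            # 8 samples, 30 lipid features
    exposed = rng.normal(size=(8, 30)) + 1.5      # crude simulated ozone shift
    X = np.vstack([control, exposed])
    scores = pca_scores(X)                        # rows: samples; cols: PC1, PC2
    ```

    Separation of the two groups along PC1 in the score plot is the pattern the abstract describes; the loadings (rows of Vt) then point to which lipid species drive it.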

  10. Feature-level signal processing for near-real-time odor identification

    NASA Astrophysics Data System (ADS)

    Roppel, Thaddeus A.; Padgett, Mary Lou; Waldemark, Joakim T. A.; Wilson, Denise M.

    1998-09-01

    Rapid detection and classification of odor is of particular interest in applications such as manufacturing of consumer items, food processing, drug and explosives detection, and battlefield situation assessment. Various detection and classification techniques are under investigation so that end users can have access to useful information from odor sensor arrays in near-real-time. Feature-level data clustering and classification techniques are proposed that are (1) parallelizable, to permit efficient hardware implementation, (2) adaptable, to readily incorporate new data classes, (3) capable of gracefully handling outlier data points and failed sensor conditions, and (4) capable of providing confidence intervals and/or a traceable decision record with each classification to permit validation and verification. Results from using specific techniques will be presented and compared. The techniques studied include principal components analysis, automated outlier determination, radial basis functions (RBF), multi-layer perceptrons (MLP), and pulse-coupled neural networks (PCNN). The results reported here are based on data from a testbed in which a gas sensor array is exposed to odor samples on a continuous basis. We have reported previously that more detailed and faster discrimination can be obtained by using sensor transient response in addition to steady-state response. As the size of the data set grows we are able to more accurately model the performance of a sensor array under realistic conditions.
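    Of the classifiers listed, a radial-basis-function network is perhaps the simplest to sketch: Gaussian activations around prototype centers followed by least-squares output weights. All names and the toy two-odor data below are illustrative, not the testbed's:

    ```python
    import numpy as np

    def rbf_features(X, centers, gamma=1.0):
        """Gaussian radial-basis activations of each sample with respect
        to a set of prototype centers (the RBF hidden layer)."""
        d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
        return np.exp(-gamma * d2)

    def fit_rbf_classifier(X, y, centers, gamma=1.0):
        """Least-squares output weights mapping RBF activations to
        one-hot class targets; prediction is the argmax class score."""
        Phi = rbf_features(X, centers, gamma)
        Y = np.eye(y.max() + 1)[y]
        W, *_ = np.linalg.lstsq(Phi, Y, rcond=None)
        return lambda Xn: rbf_features(Xn, centers, gamma).dot(W).argmax(1)

    # Two synthetic 'odor' clusters in a 2-sensor feature space.
    X = np.array([[0.0, 0.0], [0.2, 0.1], [1.0, 1.0], [0.9, 1.1]])
    y = np.array([0, 0, 1, 1])
    predict = fit_rbf_classifier(X, y, centers=X, gamma=5.0)
    ```

    Because each output is a sum of localized bumps, a sample far from every center produces uniformly small activations, which is one way such networks flag outliers and failed-sensor conditions.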

  11. Cyclic siloxanes in air, including identification of high levels in Chicago and distinct diurnal variation

    PubMed Central

    Yucuis, Rachel A.; Stanier, Charles O.; Hornbuckle, Keri C.

    2014-01-01

    The organosilicon compounds octamethylcyclotetrasiloxane (D4), decamethylcyclopentasiloxane (D5), and dodecamethylcyclohexasiloxane (D6) are high production volume chemicals that are widely used in household goods and personal care products. Due to their prevalence and chemical characteristics, cyclic siloxanes are being assessed as possible persistent organic pollutants. D4, D5, and D6 were measured in indoor and outdoor air to quantify and compare siloxane concentrations and compound ratios depending on location type. Indoor air samples had a median concentration of 2200 ng m−3 for the sum of D4, D5, and D6. Outdoor sampling locations included downtown Chicago, Cedar Rapids, IA, and West Branch, IA, with median sum siloxane levels of 280, 73, and 29 ng m−3 respectively. A diurnal trend is apparent in the samples taken in downtown Chicago: nighttime samples had median concentrations 2.7 times higher than daytime samples, which is due, in part, to fluctuations of the planetary boundary layer. D5 was the dominant siloxane in both indoor and outdoor air. Ratios of D5 to D4 averaged 91 and 3.2 for indoor and outdoor air respectively. PMID:23541357

  12. Identification of high-level functional/system requirements for future civil transports

    NASA Technical Reports Server (NTRS)

    Swink, Jay R.; Goins, Richard T.

    1992-01-01

    In order to accommodate the rapid growth in commercial aviation throughout the remainder of this century, the Federal Aviation Administration (FAA) is faced with a formidable challenge to upgrade and/or modernize the National Airspace System (NAS) without compromising safety or efficiency. A recurring theme in both the Aviation System Capital Investment Plan (CIP), which has replaced the NAS Plan, and the new FAA Plan for Research, Engineering, and Development (RE&D) is the application of new technologies and a greater use of automation. Identifying the high-level functional and system impacts of such modernization efforts on future civil transport operational requirements, particularly in terms of cockpit functionality and information transfer, was the primary objective of this project. The FAA planning documents for the NAS of the 2005 era and beyond were surveyed; major aircraft functional capabilities and system components required for such an operating environment were identified. A hierarchical structured analysis of the information processing and flows emanating from such functional/system components was conducted and the results documented in graphical form depicting the relationships between functions and systems.

  13. Identification of a gene, FMP21, whose expression levels are involved in thermotolerance in Saccharomyces cerevisiae

    PubMed Central

    2014-01-01

    Elucidation of the mechanism of high temperature tolerance in yeasts is important for the molecular breeding of high temperature-tolerant yeasts that can be used in bioethanol production. We identified genes whose expression is correlated with the degree of thermotolerance in Saccharomyces cerevisiae by DNA microarray analysis. Gene expression profiles of three S. cerevisiae strains showing different levels of thermotolerance were compared, and three candidate genes were chosen. Among these genes, FMP21 was investigated as a thermotolerance-related gene in S. cerevisiae by comparing growth at high temperature with gene expression in eight strains. The expression ratio of FMP21 at 37°C was correlated with the doubling time ratio, with a coefficient of determination of 0.787. The potential involvement of Fmp21 in the thermotolerance of yeasts was evaluated. The FMP21 deletion variant showed a decreased respiratory growth rate and increased thermosensitivity. Furthermore, the overexpression of FMP21 improved thermotolerance in yeasts. In conclusion, the function of Fmp21 is important for thermotolerance in yeasts. PMID:25177541
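The correlation statistic the abstract reports (a coefficient of determination between FMP21 expression ratio and doubling-time ratio) can be sketched as below. The eight value pairs are hypothetical, chosen only to illustrate the computation; the study's actual data yield R² = 0.787:

```python
import numpy as np

# Hypothetical expression ratios and doubling-time ratios at 37 C for eight
# strains (illustrative numbers, not the study's measurements).
expr_ratio = np.array([0.6, 0.8, 1.0, 1.1, 1.3, 1.5, 1.7, 2.0])
dt_ratio   = np.array([1.9, 1.6, 1.45, 1.3, 1.2, 1.1, 1.0, 0.9])

# Coefficient of determination (R^2) of a least-squares linear fit.
slope, intercept = np.polyfit(expr_ratio, dt_ratio, 1)
pred = slope * expr_ratio + intercept
ss_res = np.sum((dt_ratio - pred) ** 2)         # residual sum of squares
ss_tot = np.sum((dt_ratio - dt_ratio.mean()) ** 2)  # total sum of squares
r2 = 1 - ss_res / ss_tot
print(f"R^2 = {r2:.3f}")
```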

  14. Analysis of lidar elevation data for improved identification and delineation of lands vulnerable to sea-level rise

    USGS Publications Warehouse

    Gesch, D.B.

    2009-01-01

    The importance of sea-level rise in shaping coastal landscapes is well recognized within the earth science community, but as with many natural hazards, communicating the risks associated with sea-level rise remains a challenge. Topography is a key parameter that influences many of the processes involved in coastal change, and thus, up-to-date, high-resolution, high-accuracy elevation data are required to model the coastal environment. Maps of areas subject to potential inundation have great utility to planners and managers concerned with the effects of sea-level rise. However, most of the maps produced to date are simplistic representations derived from older, coarse elevation data. In the last several years, vast amounts of high quality elevation data derived from lidar have become available. Because of their high vertical accuracy and spatial resolution, these lidar data are an excellent source of up-to-date information from which to improve identification and delineation of vulnerable lands. Four elevation datasets of varying resolution and accuracy were processed to demonstrate that the improved quality of lidar data leads to more precise delineation of coastal lands vulnerable to inundation. A key component of the comparison was to calculate and account for the vertical uncertainty of the elevation datasets. This comparison shows that lidar allows for a much more detailed delineation of the potential inundation zone when compared to other types of elevation models. It also shows how the certainty of the delineation of lands vulnerable to a given sea-level rise scenario is much improved when derived from higher resolution lidar data. © 2009 Coastal Education and Research Foundation.
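Accounting for vertical uncertainty when delineating an inundation zone can be sketched as follows. The toy elevation grid, the 1 m scenario, the 0.15 m RMSE, and the use of 1.96 × RMSE as the 95%-confidence linear error are all illustrative assumptions, not the study's datasets or exact procedure:

```python
import numpy as np

# Toy elevation grid in metres above local sea level (illustrative, not real lidar).
dem = np.array([[0.2, 0.5, 0.9],
                [1.1, 1.3, 1.6],
                [1.9, 2.4, 3.0]])

scenario = 1.0            # hypothetical sea-level-rise scenario (m)
rmse = 0.15               # assumed vertical RMSE of the elevation dataset (m)
le95 = 1.96 * rmse        # linear error at 95% confidence

inundated = dem <= scenario - le95          # below the scenario with 95% confidence
uncertain = np.abs(dem - scenario) <= le95  # within the vertical uncertainty band
dry = dem >= scenario + le95                # above the scenario with 95% confidence

print(inundated.sum(), uncertain.sum(), dry.sum())  # prints: 2 2 5
```

The "uncertain" band is what shrinks as dataset accuracy improves; with lidar-scale RMSE it is far narrower than with coarse national elevation models, which is the comparison the abstract describes.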

  15. The incidence and significance of fluid-fluid levels on computed tomography of osseous lesions.

    PubMed

    Davies, A M; Cassar-Pullicino, V N; Grimer, R J

    1992-03-01

    The demonstration of a fluid-fluid level (FFL) within an osseous lesion on computed tomography (CT) has been reported as suggestive of an aneurysmal bone cyst (ABC), although FFLs have also rarely been found in association with other lesions. This study was conducted to determine the frequency of FFLs on CT in a group of ABCs and a series of patients presenting to a major tertiary referral centre for the treatment of bone tumours. An FFL was present on CT in 21 (84%) of the 25 ABCs and in 17 of these was multiple. FFLs are typical of the mid ("blow-out") or late phase of development of an ABC and not the incipient ("permeative") stage or where the internal architecture of the tumour has been disrupted by biopsy or previous surgery. In a 3-year period, 16 ABCs were found in 491 bone lesions referred to a bone tumour treatment centre. CT of the ABCs revealed FFLs in 14 (87.5%) cases. Within the same period, 728 CTs of these and other bone lesions were performed and FFLs were identified in two further cases: a massive telangiectatic osteosarcoma and a conventional osteosarcoma following chemotherapy. The diagnostic significance of an FFL on CT for ABC is: sensitivity = 87.5%, specificity = 99.7%, positive predictive value = 87.5%, negative predictive value = 99.7%, accuracy = 99.4%. An FFL within a bone lesion on CT remains strongly suggestive of an ABC although the radiologist should be wary of a rare telangiectatic osteosarcoma. PMID:1547444
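The reported diagnostic statistics follow from a 2×2 confusion matrix. One reconstruction consistent with the abstract's counts: 14 of 16 ABC examinations showed an FFL (TP = 14, FN = 2), and 2 of the remaining 712 non-ABC examinations showed an FFL (FP = 2, TN = 710); the 99.4% accuracy then corresponds to 724/728 ≈ 99.45%, reported to one decimal place:

```python
# Confusion-matrix counts reconstructed from the abstract (an assumption
# consistent with the reported percentages, not stated explicitly there).
tp, fn, fp, tn = 14, 2, 2, 710

sensitivity = tp / (tp + fn)            # FFL present given ABC
specificity = tn / (tn + fp)            # FFL absent given non-ABC
ppv = tp / (tp + fp)                    # ABC given FFL present
npv = tn / (tn + fn)                    # non-ABC given FFL absent
accuracy = (tp + tn) / (tp + tn + fp + fn)

print(f"sensitivity={sensitivity:.2%} specificity={specificity:.2%} "
      f"PPV={ppv:.2%} NPV={npv:.2%} accuracy={accuracy:.2%}")
```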

  16. Differential Expression Levels of Integrin α6 Enable the Selective Identification and Isolation of Atrial and Ventricular Cardiomyocytes

    PubMed Central

    Wiencierz, Anne Maria; Kernbach, Manuel; Ecklebe, Josephine; Monnerat, Gustavo; Tomiuk, Stefan; Raulf, Alexandra; Christalla, Peter; Malan, Daniela; Hesse, Michael; Bosio, Andreas; Fleischmann, Bernd K.; Eckardt, Dominik

    2015-01-01

    Rationale Central questions such as cardiomyocyte subtype emergence during cardiogenesis or the availability of cardiomyocyte subtypes for cell replacement therapy require selective identification and purification of atrial and ventricular cardiomyocytes. However, current methodologies do not allow for a transgene-free selective isolation of atrial or ventricular cardiomyocytes due to the lack of subtype specific cell surface markers. Methods and Results In order to develop cell surface marker-based isolation procedures for cardiomyocyte subtypes, we performed an antibody-based screening on embryonic mouse hearts. Our data indicate that atrial and ventricular cardiomyocytes are characterized by differential expression of integrin α6 (ITGA6) throughout development and in the adult heart. We discovered that the expression level of this surface marker correlates with the intracellular subtype-specific expression of MLC-2a and MLC-2v on the single cell level and thereby enables the discrimination of cardiomyocyte subtypes by flow cytometry. Based on the differential expression of ITGA6 in atria and ventricles during cardiogenesis, we developed purification protocols for atrial and ventricular cardiomyocytes from mouse hearts. Atrial and ventricular identities of sorted cells were confirmed by expression profiling and patch clamp analysis. Conclusion Here, we introduce a non-genetic, antibody-based approach to specifically isolate highly pure and viable atrial and ventricular cardiomyocytes from mouse hearts of various developmental stages. This will facilitate in-depth characterization of the individual cellular subsets and support translational research applications. PMID:26618511

  17. Analysis of the Computer Anxiety Levels of Secondary Technical Education Teachers in West Virginia.

    ERIC Educational Resources Information Center

    Gordon, Howard R. D.

    The computer anxiety of 116 randomly selected secondary technical education teachers from 8 area vocational-technical centers in West Virginia was the focus of a study. The mailed questionnaire consisted of two parts: Oetting's Computer Anxiety Scale (COMPAS) and closed-form questions to obtain general demographic information about the teachers…

  18. Computing at the High School Level: Changing What Teachers and Students Know and Believe

    ERIC Educational Resources Information Center

    Munson, Ashlyn; Moskal, Barbara; Harriger, Alka; Lauriski-Karriker, Tonya; Heersink, Daniel

    2011-01-01

    Research indicates that students often opt out of computing majors due to a lack of prior experience in computing and a lack of knowledge of field-based job opportunities. In addition, it has been found that students respond positively to new subjects when teachers and counselors are enthusiastic and knowledgeable about the area. The summer…

  19. Integration of Computer Technology Into an Introductory-Level Neuroscience Laboratory

    ERIC Educational Resources Information Center

    Evert, Denise L.; Goodwin, Gregory; Stavnezer, Amy Jo

    2005-01-01

    We describe 3 computer-based neuroscience laboratories. In the first 2 labs, we used commercially available interactive software to enhance the study of functional and comparative neuroanatomy and neurophysiology. In the remaining lab, we used customized software and hardware in 2 psychophysiological experiments. With the use of the computer-based…

  20. Determining the Effectiveness of the 3D Alice Programming Environment at the Computer Science I Level

    ERIC Educational Resources Information Center

    Sykes, Edward R.

    2007-01-01

    Student retention in Computer Science is becoming a serious concern among Educators in many colleges and universities. Most institutions currently face a significant drop in enrollment in Computer Science. A number of different tools and strategies have emerged to address this problem (e.g., BlueJ, Karel Robot, etc.). Although these tools help to…

  1. The Role of Teacher Training on Student Computer Use in Illinois at the Third Grade Level.

    ERIC Educational Resources Information Center

    Bedard, Annette

    In a national survey, teachers who had more computer technology training used computers with their students in more ways and to a greater extent than teachers with fewer hours. Using the same questions, responses from third grade public school teachers in Illinois were compared to the responses in the national survey. Also, in this study, the…

  2. Identification of Nucleotide-Level Changes Impacting Gene Content and Genome Evolution in Orthopoxviruses

    PubMed Central

    Hatcher, Eneida L.; Hendrickson, Robert Curtis

    2014-01-01

    ABSTRACT Poxviruses are composed of large double-stranded DNA (dsDNA) genomes coding for several hundred genes whose variation has supported virus adaptation to a wide variety of hosts over their long evolutionary history. Comparative genomics has suggested that the Orthopoxvirus genus in particular has undergone reductive evolution, with the most recent common ancestor likely possessing a gene complement consisting of all genes present in any existing modern-day orthopoxvirus species, similar to the current Cowpox virus species. As orthopoxviruses adapt to new environments, the selection pressure on individual genes may be altered, driving sequence divergence and possible loss of function. This is evidenced by accumulation of mutations and loss of protein-coding open reading frames (ORFs) that progress from individual missense mutations to gene truncation through the introduction of early stop mutations (ESMs), gene fragmentation, and in some cases, a total loss of the ORF. In this study, we have constructed a whole-genome alignment for representative isolates from each Orthopoxvirus species and used it to identify the nucleotide-level changes that have led to gene content variation. By identifying the changes that have led to ESMs, we were able to determine that short indels were the major cause of gene truncations and that the genome length is inversely proportional to the number of ESMs present. We also identified the number and types of protein functional motifs still present in truncated genes to assess their functional significance. IMPORTANCE This work contributes to our understanding of reductive evolution in poxviruses by identifying genomic remnants such as single nucleotide polymorphisms (SNPs) and indels left behind by evolutionary processes. Our comprehensive analysis of the genomic changes leading to gene truncation and fragmentation was able to detect some of the remnants of these evolutionary processes still present in orthopoxvirus genomes and

  3. Protein level identification of the Listeria monocytogenes Sigma H, Sigma L, and Sigma C regulons

    PubMed Central

    2013-01-01

    Background Transcriptional regulation by alternative sigma (σ) factors represents an important mechanism that allows bacteria to rapidly regulate transcript and protein levels in response to changing environmental conditions. While the role of the alternative σ factor σB has been comparatively well characterized in L. monocytogenes, our understanding of the roles of the three other L. monocytogenes alternative σ factors is still limited. In this study, we employed a quantitative proteomics approach using Isobaric Tags for Relative and Absolute Quantitation (iTRAQ) to characterize the L. monocytogenes σL, σH, and σC protein regulons. Proteomic comparisons used a quadruple alternative σ factor mutant strain (ΔBCHL) and strains expressing a single alternative σ factor (i.e., σL, σH, and σC; strains ΔBCH, ΔBCL, and ΔBHL) to eliminate potential redundancies between σ factors. Results Among the three alternative σ factors studied here, σH provides positive regulation for the largest number of proteins, consistent with previous transcriptomic studies, while σL appears to contribute to negative regulation of a number of proteins. σC was found to regulate a small number of proteins in L. monocytogenes grown to stationary phase at 37°C. Proteins identified as being regulated by multiple alternative σ factors include MptA, which is a component of a PTS system with a potential role in regulation of PrfA activity. Conclusions This study provides initial insights into global regulation of protein production by the L. monocytogenes alternative σ factors σL, σH, and σC. While, among these σ factors, σH appears to positively regulate the largest number of proteins, we also identified PTS systems that appear to be co-regulated by multiple alternative σ factors. Future studies should not only explore potential roles of alternative σ factors in activating a “cascade” of PTS systems that potentially regulate PrfA, but also may want to explore the

  4. SU-E-P-10: Establishment of Local Diagnostic Reference Levels of Routine Exam in Computed Tomography

    SciTech Connect

    Yeh, M; Wang, Y; Weng, H

    2015-06-15

    Introduction: National diagnostic reference levels (NDRLs) serve as reference doses for radiological examinations and provide a basis for optimizing patient dose. Local diagnostic reference levels (LDRLs), maintained by periodically reviewing and checking doses, offer a more efficient way to improve examination practice; establishing a diagnostic reference level is therefore the important first step. Taiwan has already established radiation dose limit values for computed tomography, and many studies report that CT contributes most of the radiation dose in medical imaging. This study therefore reviews the international status of DRLs and establishes diagnostic reference levels for computed tomography in our hospital. Methods and Materials: Two clinical CT scanners (a Toshiba Aquilion and a Siemens Sensation) were included in this study. For CT examinations the basic recommended dosimetric quantity is the Computed Tomography Dose Index (CTDI). For each examination of each body part, we collected at least 10 patients. Routine examinations were carried out, all exposure parameters were collected, and the corresponding CTDIvol and DLP values were determined. Results: The majority of patients (75%) were between 60 and 70 kg of body weight. There were 25 examinations in this study. Table 1 shows the LDRL of each routine CT examination. Conclusions: This study reviews the international status of DRLs and establishes local diagnostic reference levels for computed tomography at our hospital, providing a radiation reference as a basis for optimizing patient dose.
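Deriving an LDRL from collected dose values can be sketched as below. The dose values are hypothetical, and the choice of the 75th percentile is an assumption (it is the convention commonly used for DRLs; the abstract does not state which statistic was used):

```python
import numpy as np

# Hypothetical CTDIvol (mGy) and DLP (mGy.cm) values from 10 routine
# examinations of one body part (illustrative numbers, not the study's data).
ctdi_vol = np.array([45.2, 50.1, 48.3, 52.7, 47.9, 49.5, 51.0, 46.4, 53.8, 44.9])
dlp = np.array([820, 910, 870, 960, 845, 890, 925, 835, 980, 815])

# LDRLs are commonly set at the 75th percentile of the local dose distribution.
ldrl_ctdi = np.percentile(ctdi_vol, 75)
ldrl_dlp = np.percentile(dlp, 75)
print(f"LDRL: CTDIvol={ldrl_ctdi:.1f} mGy, DLP={ldrl_dlp:.2f} mGy.cm")
```

Repeating this per examination type and comparing against national values is what the periodic LDRL review described above amounts to.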

  5. Tin-carbon clusters and the onset of microscopic level immiscibility: Experimental and computational study.

    PubMed

    Bernstein, J; Landau, A; Zemel, E; Kolodney, E

    2015-09-21

    We report the experimental observation and computational analysis of the binary tin-carbon gas phase species. These novel ionic compounds are generated by impact of C60(-) anions on a clean tin target at kinetic energies of a few kiloelectronvolts. Positive Sn(m)C(n)(+) (m = 1-12, 1 ≤ n ≤ 8) ions were detected mass spectrometrically following ejection from the surface. Impact induced shattering of the C60(-) ion followed by sub-surface penetration of the resulting atomic carbon flux forces efficient mixing between target and projectile atoms even though the two elements (Sn/C) are completely immiscible in the bulk. This approach of C60(-) ion beam induced synthesis can be considered as an effective way for producing novel metal-carbon species of the so-called non-carbide forming elements, thus exploring the possible onset of molecular level miscibility in these systems. Sn2C2(+) was found to be the most abundant carbide cluster ion. Its instantaneous formation kinetics and its measured kinetic energy distribution while exiting the surface demonstrate a single impact formation/emission event (on the sub-ps time scale). Optimal geometries were calculated for both neutral and positively charged species using Born-Oppenheimer molecular dynamics for identifying global minima, followed by density functional theory (DFT) structure optimization and energy calculations at the coupled cluster singles, doubles and perturbative triples [CCSD(T)] level. The calculated structures reflect two distinct binding tendencies. The carbon-rich species exhibit polyynic/cumulenic nature (tin end-capped carbon chains) while the more stoichiometrically balanced species have larger contributions of metal-metal bonding, sometimes resulting in distinct tin and carbon moieties attached to each other (segregated structures). The Sn2C(n) (n = 3-8) and Sn2C(n)(+) (n = 2-8) are polyynic/cumulenic while all neutral Sn(m)C(n) structures (m = 3-4) could be described as small tin clusters (dimer

  6. Tin-carbon clusters and the onset of microscopic level immiscibility: Experimental and computational study

    NASA Astrophysics Data System (ADS)

    Bernstein, J.; Landau, A.; Zemel, E.; Kolodney, E.

    2015-09-01

    We report the experimental observation and computational analysis of the binary tin-carbon gas phase species. These novel ionic compounds are generated by impact of C60 - anions on a clean tin target at kinetic energies of a few kiloelectronvolts. Positive SnmCn+ (m = 1-12, 1 ≤ n ≤ 8) ions were detected mass spectrometrically following ejection from the surface. Impact induced shattering of the C60 - ion followed by sub-surface penetration of the resulting atomic carbon flux forces efficient mixing between target and projectile atoms even though the two elements (Sn/C) are completely immiscible in the bulk. This approach of C60 - ion beam induced synthesis can be considered as an effective way for producing novel metal-carbon species of the so-called non-carbide forming elements, thus exploring the possible onset of molecular level miscibility in these systems. Sn2C2+ was found to be the most abundant carbide cluster ion. Its instantaneous formation kinetics and its measured kinetic energy distribution while exiting the surface demonstrate a single impact formation/emission event (on the sub-ps time scale). Optimal geometries were calculated for both neutral and positively charged species using Born-Oppenheimer molecular dynamics for identifying global minima, followed by density functional theory (DFT) structure optimization and energy calculations at the coupled cluster singles, doubles and perturbative triples [CCSD(T)] level. The calculated structures reflect two distinct binding tendencies. The carbon-rich species exhibit polyynic/cumulenic nature (tin end-capped carbon chains) while the more stoichiometrically balanced species have larger contributions of metal-metal bonding, sometimes resulting in distinct tin and carbon moieties attached to each other (segregated structures). The Sn2Cn (n = 3-8) and Sn2Cn+ (n = 2-8) are polyynic/cumulenic while all neutral SnmCn structures (m = 3-4) could be described as small tin clusters (dimer, trimer, and tetramer

  7. Experimental observation and computational identification of Sc@Cu16+, a stable dopant-encapsulated copper cage

    SciTech Connect

    Veldeman, Nele; Neukermans, Sven; Lievens, Peter; Hoeltzl, Tibor; Minh Tho Nguyen; Veszpremi, Tamas

    2007-07-15

    We report a combined experimental and computational study of scandium doped copper clusters. The clusters are studied with time-of-flight mass spectrometry after laser fragmentation. Enhanced stabilities for specific cluster sizes in the mass abundance spectra are discussed using both electronic (shell closing) and geometric (symmetry) arguments. The exceptional stability observed for Cu16Sc+ is investigated in detail computationally. Density functional geometry optimizations at the Becke-Perdew 1986-LANL 2-double-zeta (BP86/LANL2DZ) level result in a Frank-Kasper tetrahedron, encapsulating a scandium atom in a highly coordinated position. The high stability is therefore interpreted in terms of extremely stable dopant encapsulated structures featuring a closed electron shell. The thermodynamic stability, as indicated by the stable backbone and large binding energy per atom, the relatively small ionization energy, and the moderate electron affinity of the neutral Cu16Sc cluster show that it has a superatom character, chemically similar to the alkaline-metal atoms.

  8. Two-Level Weld-Material Homogenization for Efficient Computational Analysis of Welded Structure Blast-Survivability

    NASA Astrophysics Data System (ADS)

    Grujicic, M.; Arakere, G.; Hariharan, A.; Pandurangan, B.

    2012-06-01

    The introduction of newer joining technologies like the so-called friction-stir welding (FSW) into automotive engineering requires knowledge of the joint-material microstructure and properties. Since the development of vehicles (including military vehicles capable of surviving blast and ballistic impacts) nowadays involves extensive use of computational engineering analyses (CEA), robust high-fidelity material models are needed for FSW joints. A two-level material-homogenization procedure is proposed and utilized in this study to help manage computational cost and computer storage requirements for such CEAs. The method utilizes experimental (microstructure, microhardness, tensile testing, and x-ray diffraction) data to construct: (a) the material model for each weld zone and (b) the material model for the entire weld. The procedure is validated by comparing its predictions with the predictions of more detailed but more costly computational analyses.

  9. Demonstration of Cost-Effective, High-Performance Computing at Performance and Reliability Levels Equivalent to a 1994 Vector Supercomputer

    NASA Technical Reports Server (NTRS)

    Babrauckas, Theresa

    2000-01-01

    The Affordable High Performance Computing (AHPC) project demonstrated that high-performance computing based on a distributed network of computer workstations is a cost-effective alternative to vector supercomputers for running CPU and memory intensive design and analysis tools. The AHPC project created an integrated system called a Network Supercomputer. By connecting computer workstations through a network and utilizing the workstations when they are idle, the resulting distributed-workstation environment has the same performance and reliability levels as the Cray C90 vector Supercomputer at less than 25 percent of the C90 cost. In fact, the cost comparison between a Cray C90 Supercomputer and Sun workstations showed that the number of distributed networked workstations equivalent to a C90 costs approximately 8 percent of the C90.

  10. Modeling Cardiac Electrophysiology at the Organ Level in the Peta FLOPS Computing Age

    SciTech Connect

    Mitchell, Lawrence; Bishop, Martin; Hoetzl, Elena; Neic, Aurel; Liebmann, Manfred; Haase, Gundolf; Plank, Gernot

    2010-09-30

    Despite a steep increase in available compute power, in-silico experimentation with highly detailed models of the heart remains challenging due to the high computational cost involved. It is hoped that next-generation high performance computing (HPC) resources will lead to significant reductions in execution times to leverage a new class of in-silico applications. However, performance gains with these new platforms can only be achieved by engaging a much larger number of compute cores, necessitating strongly scalable numerical techniques. So far strong scalability has been demonstrated only for a moderate number of cores, orders of magnitude below the range required to achieve the desired performance boost. In this study, strong scalability of currently used techniques to solve the bidomain equations is investigated. Benchmark results suggest that scalability is limited to 512-4096 cores within the range of relevant problem sizes, even when systems are carefully load-balanced and advanced IO strategies are employed.

  11. [Home computer stabilography: technical level, functional potentialities and spheres of application].

    PubMed

    Sliva, S S

    2005-01-01

    The paper describes, compares, and analyzes data on stabilographic computer equipment manufactured serially by the leading foreign and Russian companies. Potential spheres of application of stabilographic equipment are discussed. PMID:15757091

  12. Student Conceptions about the DNA Structure within a Hierarchical Organizational Level: Improvement by Experiment- and Computer-Based Outreach Learning

    ERIC Educational Resources Information Center

    Langheinrich, Jessica; Bogner, Franz X.

    2015-01-01

    As non-scientific conceptions interfere with learning processes, teachers need both, to know about them and to address them in their classrooms. For our study, based on 182 eleventh graders, we analyzed the level of conceptual understanding by implementing the "draw and write" technique during a computer-supported gene technology module.…

  13. POOLMS: A computer program for fitting and model selection for two level factorial replication-free experiments

    NASA Technical Reports Server (NTRS)

    Amling, G. E.; Holms, A. G.

    1973-01-01

    A computer program is described that performs a statistical multiple-decision procedure called chain pooling. It assigns a number of mean squares to error variance, conditioned on the relative magnitudes of the mean squares. The model selection is done according to user-specified levels of type 1 or type 2 error probabilities.

  14. The Adoption Process for Personal Computers by Faculties in Business Administration and Teacher Education at the College and University Level.

    ERIC Educational Resources Information Center

    Kehr, George P.

    The adoption process for personal computers by faculties in business administration and teacher education at the graduate college and university level was studied. Diffusion Theory (Roberts, 1962) was tested as it related to the adoption of an innovation and the adoption process as the individual passed through the stages of the process…

  15. Technical Problems in Implementing University-Level Computer-Assisted Instruction in Mathematics and Science: Second Annual Report.

    ERIC Educational Resources Information Center

    Levine, Arvin; And Others

    Difficulties in implementing the EXCHECK/Voice Oriented Curriculum Author Language (VOCAL) System, a general program designed for university-level computer-assisted instruction in mathematics and science written in the VOCAL language, are presented in terms of informal mathematical procedures, audio and prosodic features, and a schedule of…

  16. A STUDY OF THE ACHIEVEMENT LEVELS IN READING AND COMPUTATION OF INCARCERATED ADULT MALES IN THE NORTH CAROLINA PRISON SYSTEM.

    ERIC Educational Resources Information Center

    BLAND, DAVID HORTON

    THIS STUDY INVESTIGATED THE LEVELS OF ACHIEVEMENT IN BOTH READING AND COMPUTATION AS THEY WERE ASSOCIATED WITH SELECTED INDEPENDENT VARIABLES--AGE, OFFENSE, AND GRADE COMPLETION IN SCHOOL. INMATES WERE SELECTED FROM ALL CLASSES OF AGE, TYPE OF CUSTODY AND TYPE OF OFFENSE. GROUP I INCLUDED 597 SUBJECTS WHO HAD NOT REACHED FOURTH GRADE. GROUP II…

  17. Computer Use Ethics among University Students and Staffs: The Influence of Gender, Religious Work Value and Organizational Level

    ERIC Educational Resources Information Center

    Mohamed, Norshidah; Karim, Nor Shahriza Abdul; Hussein, Ramlah

    2012-01-01

    Purpose: The purpose of this paper is to investigate the extent to which individual characteristics, which are gender, religious (Islamic) work value, and organization level (students and staff), are related to attitudes toward computer use ethics. This investigation is conducted in an academic setting in Malaysia, among those subscribing to the…

  18. Raising the Level of the Debate: The Effects of Computer Mediated Communication on Group Dynamics and Critical Thinking Skills.

    ERIC Educational Resources Information Center

    Hettinger, Gary

    This study investigated the use of computer mediated communication (CMC) to increase the cognitive level of student discourse by allowing students to reflect on difficult concepts on an as needed basis. The role of electronic mail (e-mail) interaction in producing a positive group feeling and closer personal relationships is also examined. Field…

  19. Emphasis: Identification.

    ERIC Educational Resources Information Center

    Clyne, Roger

    A proposed program for the identification of kindergarteners with potential learning disabilities is discussed. The subject is dealt with at the level of prediction. It is pointed out that as children learn in different ways, different methods of educating them must be devised. Early identification of disabilities lessens the chances…

  20. High-level waste storage tank farms/242-A evaporator Standards/Requirements Identification Document (S/RID), Volume 7. Revision 1

    SciTech Connect

    Burt, D.L.

    1994-04-01

    The High-Level Waste Storage Tank Farms/242-A Evaporator Standards/Requirements Identification Document (S/RID) is contained in multiple volumes. This document (Volume 7) presents the standards and requirements for the following sections: Occupational Safety and Health, and Environmental Protection.

  1. Identification of multi-level toxicity of liquid crystal display wastewater toward Daphnia magna and Moina macrocopa.

    PubMed

    Kim, Sae-Bom; Kim, Woong-Ki; Chounlamany, Vanseng; Seo, Jaehwan; Yoo, Jisu; Jo, Hun-Je; Jung, Jinho

    2012-08-15

Toxicity-based regulations of industrial effluent have been adopted to complement the conventional discharge limits based on chemical analyses. In this study, multi-level toxicity including acute toxicity, feeding rate inhibition and oxidative stress of effluent from a liquid crystal display (LCD) wastewater treatment plant (WWTP) to Daphnia magna (reference species) and Moina macrocopa (native species) was periodically monitored from April 2010 to April 2011. Raw wastewater was acutely toxic to both D. magna and M. macrocopa, but the toxicity reached less than 1 TU in the final effluent (FE) as treatment proceeded. Although acute toxicity was not observed in the FE, the feeding rate of daphnids was significantly inhibited. Additionally, the antioxidant enzyme activity of catalase, superoxide dismutase and glutathione peroxidase (GPx) in D. magna increased significantly when compared to the control, while only GPx activity was increased significantly in M. macrocopa (p<0.05). A toxicity identification evaluation using D. magna showed that Cu was the key toxicant in the FE, which was not effectively removed by the coagulation/flocculation process in the LCD WWTP. In addition, Al originating from the coagulant seemed to increase the toxicity of the FE. PMID:22677053

  2. Identification of nitrate long term trends in Loire-Brittany river district (France) in connection with hydrogeological contexts, agricultural practices and water table level variations

    NASA Astrophysics Data System (ADS)

    Lopez, B.; Baran, N.; Bourgine, B.; Ratheau, D.

    2009-04-01

The European Union (EU) has adopted directives requiring that Member States take measures to reach a "good" chemical status of water resources by the year 2015 (Water Framework Directive: WFD). Alongside, the Nitrates Directive (91/676/EEC) aims at controlling nitrogen pollution and requires Member States to identify groundwaters that contain more than 50 mg NO3 L-1 or could exceed this limit if preventive measures are not taken. In order to achieve these environmental objectives in the Loire-Brittany river basin, or to justify their non-achievement, a large dataset of nitrate concentrations (117,056 raw data points distributed over 7,341 time series) and water table level time series (1,371,655 data points distributed over 511 piezometers) is analysed from 1945 to 2007. The 156,700 sq km Loire-Brittany river basin shows various hydrogeological contexts, ranging from sedimentary aquifers to basement ones, with a few volcanic-rock aquifers. Knowledge of the evolution of agricultural practices is important in such a study and, even if this information is not locally available, agricultural practices have globally changed since the 1991 Nitrates Directive. The detailed dataset available for the Loire-Brittany basin aquifers is used to evaluate tools and to propose efficient methodologies for identifying and quantifying past and current trends in nitrate concentrations. The challenge of this study is therefore to propose a global and integrated approach that allows nitrate trend identification for the whole Loire-Brittany river basin. The temporal piezometric behaviour of each aquifer is defined using geostatistical analysis of water table level time series. This method requires the calculation of an experimental temporal variogram that can be fitted with a theoretical model valid for a large time range. Identification of contrasting behaviours (short-term, annual or pluriannual water table fluctuations) allows a systematic classification of the Loire
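
The experimental temporal variogram mentioned in the abstract can be sketched in a few lines. The series below is a synthetic stand-in for a piezometer record (an invented annual cycle plus noise, not the Loire-Brittany data); a periodic fluctuation shows up as a "hole effect", with the variogram peaking near the half-period lag and dipping again near the one-year lag:

```python
import numpy as np

def empirical_variogram(series, max_lag):
    """Experimental temporal variogram of a regularly sampled series:
    gamma(h) = 0.5 * mean((z(t+h) - z(t))^2) for each lag h."""
    return np.array([0.5 * np.mean((series[h:] - series[:-h]) ** 2)
                     for h in range(1, max_lag + 1)])

# Synthetic stand-in for a piezometer record: annual cycle plus noise
rng = np.random.default_rng(0)
t = np.arange(3 * 365)                                  # three years, daily
z = 10.0 + np.sin(2 * np.pi * t / 365) + 0.1 * rng.standard_normal(t.size)

gamma = empirical_variogram(z, max_lag=400)
# Hole effect: large variogram value at the half-period lag (~182 days),
# small value again near the full annual period (~365 days).
print(gamma[181], gamma[364])
```

A theoretical model (spherical, exponential, periodic, ...) would then be fitted to `gamma` to classify each aquifer's short-term, annual or pluriannual behaviour.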

  3. Student conceptions about the DNA structure within a hierarchical organizational level: Improvement by experiment- and computer-based outreach learning.

    PubMed

    Langheinrich, Jessica; Bogner, Franz X

    2015-01-01

As non-scientific conceptions interfere with learning processes, teachers need both to know about them and to address them in their classrooms. For our study, based on 182 eleventh graders, we analyzed the level of conceptual understanding by implementing the "draw and write" technique during a computer-supported gene technology module. A specific feature of our study was that participants were given the hierarchical organizational level at which they had to draw. We introduced two objective category systems for analyzing drawings and inscriptions. Our results indicated a long- as well as a short-term increase in the level of conceptual understanding and in the number of drawn elements and their grades concerning the DNA structure. Consequently, we regard the "draw and write" technique as a tool for teachers to get to know students' alternative conceptions. Furthermore, our study points to the modification potential of hands-on and computer-supported learning modules. PMID:26481196

  4. Identification of quasi-steady compressor characteristics from transient data

    NASA Technical Reports Server (NTRS)

    Nunes, K. B.; Rock, S. M.

    1984-01-01

    The principal goal was to demonstrate that nonlinear compressor map parameters, which govern an in-stall response, can be identified from test data using parameter identification techniques. The tasks included developing and then applying an identification procedure to data generated by NASA LeRC on a hybrid computer. Two levels of model detail were employed. First was a lumped compressor rig model; second was a simplified turbofan model. The main outputs are the tools and procedures generated to accomplish the identification.

  5. Computational identification and analysis of signaling subnetworks with distinct functional roles in the regulation of TNF production.

    PubMed

    Tomaiuolo, Maurizio; Kottke, Melissa; Matheny, Ronald W; Reifman, Jaques; Mitrophanov, Alexander Y

    2016-03-01

    Inflammation is a complex process driven by the coordinated action of a vast number of pro- and anti-inflammatory molecular mediators. While experimental studies have provided an abundance of information about the properties and mechanisms of action of individual mediators, essential system-level regulatory patterns that determine the time-course of inflammation are not sufficiently understood. In particular, it is not known how the contributions from distinct signaling pathways involved in cytokine regulation combine to shape the overall inflammatory response over different time scales. We investigated the kinetics of the intra- and extracellular signaling network controlling the production of the essential pro-inflammatory cytokine, tumor necrosis factor (TNF), and its anti-inflammatory counterpart, interleukin 10 (IL-10), in a macrophage culture. To tackle the intrinsic complexity of the network, we employed a computational modeling approach using the available literature data about specific molecular interactions. Our computational model successfully captured experimentally observed short- and long-term kinetics of key inflammatory mediators. Subsequent model analysis showed that distinct subnetworks regulate IL-10 production by impacting different temporal phases of the cAMP response element-binding protein (CREB) phosphorylation. Moreover, the model revealed that functionally similar inhibitory control circuits regulate the early and late activation phases of nuclear factor κB and CREB. Finally, we identified and investigated distinct signaling subnetworks that independently control the peak height and tail height of the TNF temporal trajectories. The knowledge of such subnetwork-specific regulatory effects may facilitate therapeutic interventions aimed at precise modulation of the inflammatory response. PMID:26751842

  6. Computers and Traditional Teaching Practices: Factors Influencing Middle Level Students' Science Achievement and Attitudes about Science

    ERIC Educational Resources Information Center

    Odom, Arthur Louis; Marszalek, Jacob M.; Stoddard, Elizabeth R.; Wrobel, Jerzy M.

    2011-01-01

    The purpose of this study was to examine the association of middle school student science achievement and attitudes toward science with student-reported frequency of using computers to learn science and other classroom practices. Baseline comparison data were collected on the frequency of student-centred teaching practices (e.g. the use of group…

  7. Assessing the Computational Literacy of Elementary Students on a National Level in Korea

    ERIC Educational Resources Information Center

    Jun, SooJin; Han, SunGwan; Kim, HyeonCheol; Lee, WonGyu

    2014-01-01

    Information and communication technology (ICT) literacy education has become an important issue, and the necessity of computational literacy (CL) has been increasing in our growing information society. CL is becoming an important element for future talents, and many countries, including the USA, are developing programs for CL education.…

  8. Introducing Creativity in a Design Laboratory for a Freshman Level Electrical and Computer Engineering Course

    ERIC Educational Resources Information Center

    Burkett, Susan L.; Kotru, Sushma; Lusth, John C.; McCallum, Debra; Dunlap, Sarah

    2014-01-01

In the electrical and computer engineering (ECE) curriculum at The University of Alabama, freshmen are introduced to fundamental electrical concepts and units, DC circuit analysis techniques, operational amplifiers, circuit simulation, design, and professional ethics. The two-credit course has both…

  9. Computer-Mediated Communication: Promoting Learner Autonomy and Intercultural Understanding at Secondary Level

    ERIC Educational Resources Information Center

    Fisher, Linda; Evans, Michael; Esch, Edith

    2004-01-01

    The use of Computer-Mediated Communication (CMC) has been hailed as a solution to the problem of access to native speakers for language learners. This project was devised to investigate whether regular and structured use of email, here via a bulletin board, might enhance learners' study of French, with regard to developing learner autonomy and…

  10. More Math, Please! Kid-Friendly Computation. Level 1: Numbers to 10.

    ERIC Educational Resources Information Center

    Morgan, Sarah Perry

    This book is about teaching computation in a way that accommodates children who do not learn best with traditional math teaching methods. The teaching approach draws on visual and kinesthetic learning strategies to help all students be math whizzes and provides a comprehensive program for elementary math skills. The book begins with counting and…

  11. Developing Instructional Applications at the Secondary Level. The Computer as a Tool.

    ERIC Educational Resources Information Center

    McManus, Jack; And Others

    Case studies are presented for seven Los Angeles area (California) high schools that worked with Pepperdine University in the IBM/ETS (International Business Machines/Educational Testing Service) Model Schools program, a project which provided training for selected secondary school teachers in the use of personal computers and selected software as…

  12. Teaching Computer Languages and Elementary Theory for Mixed Audiences at University Level

    ERIC Educational Resources Information Center

    Christiansen, Henning

    2004-01-01

    Theoretical issues of computer science are traditionally taught in a way that presupposes a solid mathematical background and are usually considered more or less inaccessible for students without this. An effective methodology is described which has been developed for a target group of university students with different backgrounds such as natural…

  13. Examining the Use of Computer Algebra Systems in University-Level Mathematics Teaching

    ERIC Educational Resources Information Center

    Lavicza, Zsolt

    2009-01-01

    The use of Computer Algebra Systems (CAS) is becoming increasingly important and widespread in mathematics research and teaching. In this paper, I will report on a questionnaire study enquiring about mathematicians' use of CAS in mathematics teaching in three countries; the United States, the United Kingdom, and Hungary. Based on the responses…

  14. Investigating the Relationship between Curiosity Level and Computer Self Efficacy Beliefs of Elementary Teachers Candidates

    ERIC Educational Resources Information Center

    Gulten, Dilek Cagirgan; Yaman, Yavuz; Deringol, Yasemin; Ozsari, Ismail

    2011-01-01

Nowadays, the concept of the "lifelong learning individual" is gaining importance, and curiosity is one important feature that such an individual should have as a requirement for learning. It is known that learning occurs naturally and spontaneously when the curiosity instinct is awakened during any learning-teaching process. Computer self-efficacy belief is…

  15. Computer Assisted Vocational Math. Written for TRS-80, Model I, Level II, 16K.

    ERIC Educational Resources Information Center

    Daly, Judith; And Others

    This computer-assisted curriculum is intended to be used to enhance a vocational mathematics/applied mathematics course. A total of 32 packets were produced to increase the basic mathematics skills of students in the following vocational programs: automotive trades, beauty culture, building trades, climate control, electrical trades,…

  16. The Effects of Computer-Assisted Material on Students' Cognitive Levels, Misconceptions and Attitudes Towards Science

    ERIC Educational Resources Information Center

    Cepni, Salih; Tas, Erol; Kose, Sacit

    2006-01-01

The purpose of this study was to investigate the effects of a Computer-Assisted Instruction Material (CAIM) related to the "photosynthesis" topic on students' cognitive development, misconceptions and attitudes. The study was conducted in the 2002-2003 academic year and was carried out in two different classes taught by the same teacher, in which there were…

  17. Photochemistry of Naphthopyrans and Derivatives: A Computational Experiment for Upper-Level Undergraduates or Graduate Students

    ERIC Educational Resources Information Center

    Castet, Frédéric; Méreau, Raphaël; Liotard, Daniel

    2014-01-01

    In this computational experiment, students use advanced quantum chemistry tools to simulate the photochromic reaction mechanism in naphthopyran derivatives. The first part aims to make students familiar with excited-state reaction mechanisms and addresses the photoisomerization of the benzopyran molecule by means of semiempirical quantum chemical…

  18. Non Invasive Water Level Monitoring on Boiling Water Reactors Using Internal Gamma Radiation: Application of Soft Computing Methods

    SciTech Connect

    Fleischer, Sebastian; Hampel, Rainer

    2006-07-01

Providing the best knowledge about safety-related water level values in boiling water reactors (BWR) is essential for the operational regime. Hydrostatic level measurement systems are almost exclusively applied for water level determination, because they have stood the test of time over many decades in conventional and nuclear power plants (NPP). Due to steam generation, a specific phenomenon occurs in BWRs that leads to a water-steam mixture level in the reactor annular space and reactor plenum. The mixture level is a highly transient value that cannot be measured by the hydrostatic water level measuring system, and it differs significantly from the measured collapsed water level. Monitoring these water levels is very important, particularly during operational and accidental transient processes such as fast negative pressure transients. In addition to the hydrostatic water level measurement system, a diverse water level measurement system should be used for BWRs. Real physical diversity is given by the gamma radiation distribution inside and outside the reactor pressure vessel, which correlates with the water level. The vertical gamma radiation distribution depends on the water level, but it is also a function of the neutron flux and the coolant recirculation pump speed. Special algorithms are required for water level monitoring. An analytical determination of the gamma radiation distribution outside the reactor pressure vessel is impossible due to the multitude of radiation physics processes, the complicated non-stationary radiation source distribution and the complex geometry of the fixtures. To create suitable algorithms, Soft Computing methods (fuzzy set theory, artificial neural networks, etc.) will be used. Therefore, a database containing input values (gamma radiation distribution) and output values (water levels) had to be built. Here, the database was established by experiments (data from a BWR and from a test setup) and simulation with the authorised thermo
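
As a rough illustration of the soft-computing step, the sketch below trains a small neural network (scikit-learn's MLPRegressor, standing in for the methods named above) to map a vertical profile of gamma-detector readings to a mixture water level. The detector geometry, forward model and noise figures are all invented for illustration; the real system also depends on neutron flux and pump speed:

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
heights = np.linspace(0.0, 10.0, 8)        # detector heights, m (invented)
levels = rng.uniform(2.0, 9.0, 400)        # true mixture levels, m

# Invented forward model: a detector's reading drops smoothly once it
# lies below the water-steam mixture level (water attenuates the gammas)
X = 1.0 / (1.0 + np.exp(2.0 * (levels[:, None] - heights[None, :])))
X += 0.01 * rng.standard_normal(X.shape)   # measurement noise

net = MLPRegressor(hidden_layer_sizes=(16,), max_iter=3000,
                   random_state=0).fit(X[:300], levels[:300])
mae = np.abs(net.predict(X[300:]) - levels[300:]).mean()
print(f"mean absolute error: {mae:.2f} m")
```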

  19. Interactions between Levels of Instructional Detail and Expertise When Learning with Computer Simulations

    ERIC Educational Resources Information Center

    Hsu, Yuling; Gao, Yuan; Liu, Tzu-Chien; Sweller, John

    2015-01-01

    Based on cognitive load theory, the effect of different levels of instructional detail and expertise in a simulation-based environment on learning about concepts of correlation was investigated. Separate versions of the learning environment were designed for the four experimental conditions which differed only with regard to the levels of written…

  20. Achieving production-level use of HEP software at the Argonne Leadership Computing Facility

    NASA Astrophysics Data System (ADS)

    Uram, T. D.; Childers, J. T.; LeCompte, T. J.; Papka, M. E.; Benjamin, D.

    2015-12-01

    HEP's demand for computing resources has grown beyond the capacity of the Grid, and these demands will accelerate with the higher energy and luminosity planned for Run II. Mira, the ten petaFLOPs supercomputer at the Argonne Leadership Computing Facility, is a potentially significant compute resource for HEP research. Through an award of fifty million hours on Mira, we have delivered millions of events to LHC experiments by establishing the means of marshaling jobs through serial stages on local clusters, and parallel stages on Mira. We are running several HEP applications, including Alpgen, Pythia, Sherpa, and Geant4. Event generators, such as Sherpa, typically have a split workload: a small scale integration phase, and a second, more scalable, event-generation phase. To accommodate this workload on Mira we have developed two Python-based Django applications, Balsam and ARGO. Balsam is a generalized scheduler interface which uses a plugin system for interacting with scheduler software such as HTCondor, Cobalt, and TORQUE. ARGO is a workflow manager that submits jobs to instances of Balsam. Through these mechanisms, the serial and parallel tasks within jobs are executed on the appropriate resources. This approach and its integration with the PanDA production system will be discussed.

  1. Computer-aided diagnosis method for MRI-guided prostate biopsy within the peripheral zone using grey level histograms

    NASA Astrophysics Data System (ADS)

    Rampun, Andrik; Malcolm, Paul; Zwiggelaar, Reyer

    2015-02-01

This paper describes a computer-aided diagnosis method for targeted prostate biopsies within the peripheral zone in T2-weighted MRI. We subdivided the peripheral zone into four regions, compared each sub-region's grey level histogram with malignant and normal histogram models, and used specific metrics to estimate the presence of abnormality. The initial evaluation was based on 200 MRI slices taken from 40 different patients, and we achieved an 87% correct classification rate with 89% sensitivity and 86% specificity. The main contribution of this paper is a novel computer-aided diagnosis approach based on grey level histogram analysis of sub-regions. From a clinical point of view, the developed method could assist clinicians in performing targeted biopsies, which are preferable to the random biopsies currently used.
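
The core step, matching a sub-region's grey level histogram against malignant and normal models, can be sketched as follows. The histogram-intersection metric and the toy intensity distributions are illustrative assumptions, not the paper's exact choices:

```python
import numpy as np

def grey_level_histogram(region, bins=64, lo=0.0, hi=255.0):
    counts, _ = np.histogram(region, bins=bins, range=(lo, hi))
    return counts / counts.sum()        # normalise to a distribution

def intersection(h1, h2):
    """Histogram intersection metric: 1.0 means identical histograms."""
    return np.minimum(h1, h2).sum()

def classify(region, normal_model, malignant_model):
    h = grey_level_histogram(region)
    # Assign the label whose model histogram the sub-region matches best
    return ("malignant"
            if intersection(h, malignant_model) > intersection(h, normal_model)
            else "normal")

# Toy models: malignant tissue assumed darker (lower T2 signal) than normal
rng = np.random.default_rng(1)
normal_model = grey_level_histogram(rng.normal(170, 20, 5000).clip(0, 255))
malignant_model = grey_level_histogram(rng.normal(90, 20, 5000).clip(0, 255))

sub_region = rng.normal(95, 20, 1000).clip(0, 255)
print(classify(sub_region, normal_model, malignant_model))
```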

  2. Ambient radiation levels in positron emission tomography/computed tomography (PET/CT) imaging center

    PubMed Central

    Santana, Priscila do Carmo; de Oliveira, Paulo Marcio Campos; Mamede, Marcelo; Silveira, Mariana de Castro; Aguiar, Polyanna; Real, Raphaela Vila; da Silva, Teógenes Augusto

    2015-01-01

Objective To evaluate the level of ambient radiation in a PET/CT center. Materials and Methods Previously selected and calibrated TLD-100H thermoluminescent dosimeters were utilized to measure room radiation levels. For 32 days, the detectors were placed at several strategically selected points inside the PET/CT center and in adjacent buildings. After the exposure period the dosimeters were collected and processed to determine the radiation level. Results At none of the points selected for measurement did the values exceed the radiation dose threshold for a controlled area (5 mSv/year) or a free area (0.5 mSv/year) as recommended by Brazilian regulations. Conclusion In the present study the authors demonstrated that the whole shielding system is appropriate and, consequently, the workers are exposed to doses below the threshold established by Brazilian standards, provided the radiation protection standards are followed. PMID:25798004

  3. Knowledge-based low-level image analysis for computer vision systems

    NASA Technical Reports Server (NTRS)

    Dhawan, Atam P.; Baxi, Himanshu; Ranganath, M. V.

    1988-01-01

    Two algorithms for entry-level image analysis and preliminary segmentation are proposed which are flexible enough to incorporate local properties of the image. The first algorithm involves pyramid-based multiresolution processing and a strategy to define and use interlevel and intralevel link strengths. The second algorithm, which is designed for selected window processing, extracts regions adaptively using local histograms. The preliminary segmentation and a set of features are employed as the input to an efficient rule-based low-level analysis system, resulting in suboptimal meaningful segmentation.

  4. A comparative analysis of multi-level computer-assisted decision making systems for traumatic injuries

    PubMed Central

    2009-01-01

    Background This paper focuses on the creation of a predictive computer-assisted decision making system for traumatic injury using machine learning algorithms. Trauma experts must make several difficult decisions based on a large number of patient attributes, usually in a short period of time. The aim is to compare the existing machine learning methods available for medical informatics, and develop reliable, rule-based computer-assisted decision-making systems that provide recommendations for the course of treatment for new patients, based on previously seen cases in trauma databases. Datasets of traumatic brain injury (TBI) patients are used to train and test the decision making algorithm. The work is also applicable to patients with traumatic pelvic injuries. Methods Decision-making rules are created by processing patterns discovered in the datasets, using machine learning techniques. More specifically, CART and C4.5 are used, as they provide grammatical expressions of knowledge extracted by applying logical operations to the available features. The resulting rule sets are tested against other machine learning methods, including AdaBoost and SVM. The rule creation algorithm is applied to multiple datasets, both with and without prior filtering to discover significant variables. This filtering is performed via logistic regression prior to the rule discovery process. Results For survival prediction using all variables, CART outperformed the other machine learning methods. When using only significant variables, neural networks performed best. A reliable rule-base was generated using combined C4.5/CART. The average predictive rule performance was 82% when using all variables, and approximately 84% when using significant variables only. The average performance of the combined C4.5 and CART system using significant variables was 89.7% in predicting the exact outcome (home or rehabilitation), and 93.1% in predicting the ICU length of stay for airlifted TBI patients
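
The rule-extraction idea behind the combined C4.5/CART system can be illustrated with scikit-learn's CART implementation. The features, thresholds and outcome rule below are synthetic stand-ins, not the actual TBI dataset:

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier, export_text

rng = np.random.default_rng(0)
# Hypothetical features: [GCS score, age, systolic blood pressure]
X = rng.uniform([3, 18, 80], [15, 90, 180], size=(300, 3))
# Toy outcome rule: discharge home (1) when GCS is high, else rehab (0)
y = (X[:, 0] > 9).astype(int)

tree = DecisionTreeClassifier(max_depth=2, random_state=0).fit(X, y)
# The fitted CART tree reads out as grammatical if-then rules:
print(export_text(tree, feature_names=["GCS", "age", "sys_BP"]))
```

Restricting the depth keeps the extracted rule set short enough for clinicians to inspect, which is the motivation for tree-based learners over black-box models in this setting.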

  5. Remote reading of coronary CTA exams using a tablet computer: utility for stenosis assessment and identification of coronary anomalies.

    PubMed

    Zimmerman, Stefan L; Lin, Cheng T; Chu, Linda C; Eng, John; Fishman, Elliot K

    2016-06-01

    The feasibility of remote reading of coronary CT examinations on tablet computers has not been evaluated. The purpose of this study is to evaluate the accuracy of coronary CT angiography reading using an iPad compared to standard 3D workstations. Fifty coronary CT angiography exams, including a spectrum of coronary artery disease and anatomic variants, were reviewed. Coronary CT angiography exams were interpreted by two readers independently on an iPad application (Siemens Webviewer) and a clinical 3D workstation at sessions 2 weeks apart. Studies were scored per vessel for severity of stenosis on a 0-3 scale (0 none, 1 <50 %, 2 ≥50-69 %, 3 ≥70 %). Coronary anomalies were recorded. A consensus read by two experienced cardiac imagers was used as the reference standard. Level of agreement with the reference for iPad and 3D workstations was compared. Multivariate logistic regression was used to analyze the relationship between agreement and display type and to adjust for inter-reader differences. For both readers, there was no significant difference in agreement with the reference standard for per-vessel stenosis scores using either the 3D workstation or the iPad. In a multivariable logistic regression analysis including reader, workstation, and vessel as co-variates, there was no significant association between workstation type or reader and agreement with the reference standard (p > 0.05). Both readers identified 100 % of coronary anomalies using each technique. Reading of coronary CT angiography examinations on the iPad had no influence on stenosis assessment compared to the standard clinical workstation. PMID:27085532

  6. Using RoboCup in University-Level Computer Science Education

    ERIC Educational Resources Information Center

    Sklar, Elizabeth; Parsons, Simon; Stone, Peter

    2004-01-01

    In the education literature, team-based projects have proven to be an effective pedagogical methodology. We have been using "RoboCup" challenges as the basis for class projects in undergraduate and masters level courses. This article discusses several independent efforts in this direction and presents our work in the development of shared…

  7. Tablet Computer Literacy Levels of the Physical Education and Sports Department Students

    ERIC Educational Resources Information Center

    Hergüner, Gülten

    2016-01-01

    Education systems are being affected in parallel by newly emerging hardware and new developments occurring in technology daily. Tablet usage especially is becoming ubiquitous in the teaching-learning processes in recent years. Therefore, using the tablets effectively, managing them and having a high level of tablet literacy play an important role…

  8. Predicting groundwater level fluctuations with meteorological effect implications—A comparative study among soft computing techniques

    NASA Astrophysics Data System (ADS)

    Shiri, Jalal; Kisi, Ozgur; Yoon, Heesung; Lee, Kang-Kun; Hossein Nazemi, Amir

    2013-07-01

The knowledge of groundwater table fluctuations is important in agricultural lands as well as in studies related to groundwater utilization and management. This paper investigates the abilities of Gene Expression Programming (GEP), Adaptive Neuro-Fuzzy Inference System (ANFIS), Artificial Neural Network (ANN) and Support Vector Machine (SVM) techniques for groundwater level forecasting at prediction intervals from the following day up to 7 days ahead. Several input combinations comprising water table level, rainfall and evapotranspiration values from Hongcheon Well station (South Korea), covering a period of eight years (2001-2008), were used to develop and test the applied models. The data from the first six years were used for developing (training) the applied models and the last two years of data were reserved for testing. A comparison was also made between the forecasts provided by these models and the Auto-Regressive Moving Average (ARMA) technique. Based on the comparisons, it was found that the GEP models could be employed successfully in forecasting water table level fluctuations up to 7 days beyond data records.
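
The lagged-input setup behind such forecasts can be sketched with scikit-learn's SVR standing in for the SVM model. The synthetic series and split below only mimic the paper's eight-year record and six/two-year train/test division; they are not the Hongcheon well data:

```python
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(0)
t = np.arange(8 * 365)                           # eight years, daily
level = (20.0 + 2.0 * np.sin(2 * np.pi * t / 365)
         + 0.05 * rng.standard_normal(t.size))   # synthetic water table, m

lags, horizon = 7, 1                             # 1-day-ahead forecast
n = level.size - lags - horizon + 1
X = np.array([level[i:i + lags] for i in range(n)])   # past 7 days
y = level[lags + horizon - 1:]                        # next-day level

split = 6 * 365 - lags                           # first six years: training
model = SVR(C=10.0, epsilon=0.01).fit(X[:split], y[:split])
rmse = np.sqrt(np.mean((model.predict(X[split:]) - y[split:]) ** 2))
print(f"test RMSE: {rmse:.3f} m")
```

Raising `horizon` to 7 reproduces the longest prediction interval studied; rainfall and evapotranspiration would enter as extra columns of `X`.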

  9. Developing "The Critic's Corner": Computer-Assisted Language Learning for Upper-Level Russian Students.

    ERIC Educational Resources Information Center

    Nicholas, Mary A.; Toporski, Neil

    1993-01-01

    A Lehigh University project to develop interactive video materials for use in upper-level Russian language courses is reported. Russian film clips on laser disc are used to improve writing and speaking skills, stimulate students' critical thinking, and encourage students' collaborative learning. (Contains 23 references.) (Author/LB)

  10. Deriving Hounsfield units using grey levels in cone beam computed tomography

    PubMed Central

    Mah, P; Reeves, T E; McDavid, W D

    2010-01-01

    Objectives An in vitro study was performed to investigate the relationship between grey levels in dental cone beam CT (CBCT) and Hounsfield units (HU) in CBCT scanners. Methods A phantom containing 8 different materials of known composition and density was imaged with 11 different dental CBCT scanners and 2 medical CT scanners. The phantom was scanned under three conditions: phantom alone and phantom in a small and large water container. The reconstructed data were exported as Digital Imaging and Communications in Medicine (DICOM) and analysed with On Demand 3D® by Cybermed, Seoul, Korea. The relationship between grey levels and linear attenuation coefficients was investigated. Results It was demonstrated that a linear relationship between the grey levels and the attenuation coefficients of each of the materials exists at some “effective” energy. From the linear regression equation of the reference materials, attenuation coefficients were obtained for each of the materials and CT numbers in HU were derived using the standard equation. Conclusions HU can be derived from the grey levels in dental CBCT scanners using linear attenuation coefficients as an intermediate step. PMID:20729181
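
The two-step conversion described above, a linear fit from grey levels to attenuation coefficients followed by the standard CT-number definition, can be sketched as follows. The reference grey levels and attenuation coefficients are invented for illustration:

```python
import numpy as np

# Hypothetical reference materials: measured mean grey levels and known
# linear attenuation coefficients mu (cm^-1) at some "effective" energy.
grey_ref = np.array([-1000.0, 0.0, 1000.0])      # air, water-like, dense
mu_ref = np.array([0.0002, 0.1900, 0.3798])

# Step 1: linear regression from grey level to attenuation coefficient
slope, intercept = np.polyfit(grey_ref, mu_ref, 1)

def grey_to_hu(grey, mu_water=0.19):
    mu = slope * grey + intercept
    # Step 2: the standard CT-number definition
    return 1000.0 * (mu - mu_water) / mu_water

print(grey_to_hu(0.0), grey_to_hu(1000.0))
```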

  11. 24 CFR 990.170 - Computation of utilities expense level (UEL): Overview.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... applicable inflation factor. The UEL for a given funding period is the product of the utility rate multiplied by the payable consumption level multiplied by the inflation factor. The UEL is expressed in terms of... of a utility, the PHA shall absorb 75 percent of this increase. (d) Inflation factor for...
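
The formula in the excerpt reduces to a simple product; the figures below are illustrative only, not actual PHA data:

```python
# UEL = utility rate x payable consumption level x inflation factor
rate = 0.12                 # $ per kWh (illustrative)
consumption = 1_500_000     # payable consumption level, kWh (illustrative)
inflation = 1.03            # applicable inflation factor (illustrative)

uel = rate * consumption * inflation
print(f"UEL: ${uel:,.2f}")
```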

  12. Computational Design of Self-Assembling Protein Nanomaterials with Atomic Level Accuracy

    SciTech Connect

    King, Neil P.; Sheffler, William; Sawaya, Michael R.; Vollmar, Breanna S.; Sumida, John P.; André, Ingemar; Gonen, Tamir; Yeates, Todd O.; Baker, David

    2015-09-17

    We describe a general computational method for designing proteins that self-assemble to a desired symmetric architecture. Protein building blocks are docked together symmetrically to identify complementary packing arrangements, and low-energy protein-protein interfaces are then designed between the building blocks in order to drive self-assembly. We used trimeric protein building blocks to design a 24-subunit, 13-nm diameter complex with octahedral symmetry and a 12-subunit, 11-nm diameter complex with tetrahedral symmetry. The designed proteins assembled to the desired oligomeric states in solution, and the crystal structures of the complexes revealed that the resulting materials closely match the design models. The method can be used to design a wide variety of self-assembling protein nanomaterials.

  13. Identification of light absorbing oligomers from glyoxal and methylglyoxal aqueous processing: a comparative study at the molecular level

    NASA Astrophysics Data System (ADS)

    Finessi, Emanuela; Hamilton, Jacqueline; Rickard, Andrew; Baeza-Romero, Maria; Healy, Robert; Peppe, Salvatore; Adams, Tom; Daniels, Mark; Ball, Stephen; Goodall, Iain; Monks, Paul; Borras, Esther; Munoz, Amalia

    2014-05-01

    Numerous studies point to the reactive uptake of gaseous low molecular weight carbonyls onto atmospheric waters (cloud/fog droplets and wet aerosols) as an important SOA formation route not yet included in current models. However, the evaluation of these processes is challenging because water provides a medium for a complex array of reactions to take place, such as self-oligomerization, aldol condensation and Maillard-type browning reactions in the presence of ammonium salts. In addition to adding to SOA mass, aqueous chemistry products have been shown to include light-absorbing, surface-active and high molecular weight oligomeric species, and can therefore affect climatically relevant aerosol properties such as light absorption and hygroscopicity. Glyoxal (GLY) and methylglyoxal (MGLY) are the gaseous carbonyls that have perhaps received the most attention to date owing to their ubiquity, abundance and reactivity in water, with the majority of studies focussing on bulk physical properties. However, very little is known at the molecular level, in particular for MGLY, and the relative potential of these species as aqueous SOA precursors in ambient air is still unclear. We have conducted experiments with both laboratory solutions and chamber-generated particles to simulate the aqueous processing of GLY and MGLY with ammonium sulphate (AS) under typical atmospheric conditions and investigated their respective aging products. Both high performance liquid chromatography coupled with UV-Vis detection and ion trap mass spectrometry (HPLC-DAD-MSn) and high resolution mass spectrometry (FTICR-MS) have been used for molecular identification purposes. Comprehensive gas chromatography with nitrogen chemiluminescence detection (GCxGC-NCD) has been applied for the first time to these systems, revealing a surprisingly high number of nitrogen-containing organics (ONs) spanning a wide range of polarities. GCxGC-NCD proved to be a valuable tool to determine overall amount and rates of

  14. A Full Dynamic Compound Inverse Method for output-only element-level system identification and input estimation from earthquake response signals

    NASA Astrophysics Data System (ADS)

    Pioldi, Fabio; Rizzi, Egidio

    2016-04-01

    This paper proposes a new output-only element-level system identification and input estimation technique, towards the simultaneous identification of modal parameters, input excitation time history and structural features at the element-level by adopting earthquake-induced structural response signals. The method, named Full Dynamic Compound Inverse Method (FDCIM), releases strong assumptions of earlier element-level techniques, by working with a two-stage iterative algorithm. Jointly, a Statistical Average technique, a modification process and a parameter projection strategy are adopted at each stage to achieve stronger convergence for the identified estimates. The proposed method works in a deterministic way and is completely developed in State-Space form. Further, it does not require continuous- to discrete-time transformations and does not depend on initialization conditions. Synthetic earthquake-induced response signals from different shear-type buildings are generated to validate the implemented procedure, also with noise-corrupted cases. The achieved results provide a necessary condition to demonstrate the effectiveness of the proposed identification method.

  16. Comparing levels of school performance to science teachers' reports on knowledge/skills, instructional use and student use of computers

    NASA Astrophysics Data System (ADS)

    Kerr, Rebecca

    The purpose of this descriptive quantitative and basic qualitative study was to examine fifth and eighth grade science teachers' responses, perceptions of the role of technology in the classroom, and how they felt that computer applications, tools, and the Internet influence student understanding. The purposeful sample included survey and interview responses from fifth grade and eighth grade general and physical science teachers. Even though they may not be generalizable to other teachers or classrooms due to a low response rate, findings from this study indicated teachers with fewer years of teaching science had a higher level of computer use but less computer access, especially for students, in the classroom. Furthermore, teachers' choice of professional development moderated the relationship between the level of school performance and teachers' knowledge/skills, with the most positive relationship being with workshops that occurred outside of the school. Eighteen interviews revealed that teachers perceived the role of technology in classroom instruction mainly as teacher-centered and supplemental, rather than student-centered activities.

  17. Simultaneous segmentation and reconstruction: A level set method approach for limited view computed tomography

    SciTech Connect

    Yoon, Sungwon; Pineda, Angel R.; Fahrig, Rebecca

    2010-05-15

    Purpose: An iterative tomographic reconstruction algorithm that simultaneously segments and reconstructs the reconstruction domain is proposed and applied to tomographic reconstructions from a sparse number of projection images. Methods: The proposed algorithm uses a two-phase level set method segmentation in conjunction with an iterative tomographic reconstruction to achieve simultaneous segmentation and reconstruction. The simultaneous segmentation and reconstruction is achieved by alternating between level set function evolutions and per-region intensity value updates. To deal with the limited number of projections, a priori information about the reconstruction is enforced via a penalized likelihood function. Specifically, the reconstruction is assumed to be smooth within each region (a piecewise smooth function) and to have bounded intensity values in each region. This a priori information is formulated into a quadratic objective function with linear bound constraints. The level set function evolutions are achieved by artificially time evolving the level set function in the negative gradient direction; the intensity value updates are achieved by using the gradient projection conjugate gradient algorithm. Results: The proposed simultaneous segmentation and reconstruction results were compared to "conventional" iterative reconstruction (with no segmentation), iterative reconstruction followed by segmentation, and filtered backprojection. Improvements of 6%-13% in the normalized root mean square error were observed when the proposed algorithm was applied to simulated projections of a numerical phantom and to real fan-beam projections of the Catphan phantom, both of which did not satisfy the a priori assumptions. Conclusions: The proposed simultaneous segmentation and reconstruction resulted in improved reconstruction image quality. The algorithm correctly segments the reconstruction space into regions, preserves sharp edges between different regions, and smoothes the noise.

  18. Disturbed Interplay between Mid- and High-Level Vision in ASD? Evidence from a Contour Identification Task with Everyday Objects

    ERIC Educational Resources Information Center

    Evers, Kris; Panis, Sven; Torfs, Katrien; Steyaert, Jean; Noens, Ilse; Wagemans, Johan

    2014-01-01

    Atypical visual processing in children with autism spectrum disorder (ASD) does not seem to reside in an isolated processing component, such as global or local processing. We therefore developed a paradigm that requires the interaction between different processes--an identification task with Gaborized object outlines--and applied this to two age…

  19. A peptide identification-free, genome sequence-independent shotgun proteomics workflow for strain-level bacterial differentiation

    PubMed Central

    Shao, Wenguang; Zhang, Min; Lam, Henry; Lau, Stanley C. K.

    2015-01-01

    Shotgun proteomics is an emerging tool for bacterial identification and differentiation. However, the matching of peptide mass spectra to genome-derived peptide sequences remains a key issue that limits the use of shotgun proteomics to bacteria with genome sequences available. In this proof-of-concept study, we report a novel bacterial fingerprinting method that enjoys the resolving power and accuracy of mass spectrometry without the burden of peptide identification (i.e. it is genome sequence-independent). This method uses a similarity-clustering algorithm to search for mass spectra that are derived from the same peptide and merge them into a unique consensus spectrum as the basis to generate proteomic fingerprints of bacterial isolates. In comparison to a traditional peptide identification-based shotgun proteomics workflow and a PCR-based DNA fingerprinting method targeting the repetitive extragenic palindromic elements in bacterial genomes, the novel method generated fingerprints that were richer in information and more discriminative in differentiating E. coli isolates by their animal sources. The novel method is readily deployable to any cultivable bacteria, and may be used in several fields of study such as environmental microbiology, applied microbiology, and clinical microbiology. PMID:26395646
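
    A toy sketch of the similarity-clustering idea (merging spectra derived from the same peptide into a consensus spectrum): spectra are represented as binned intensity vectors and clustered greedily by cosine similarity. The threshold, binning, and running-mean consensus rule are assumptions of this sketch, not the paper's exact algorithm.

```python
import numpy as np

def cosine(u, v):
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

def cluster_consensus(spectra, threshold=0.9):
    """Greedy similarity clustering: each spectrum joins the first cluster
    whose consensus it matches above `threshold`; otherwise it seeds a new
    cluster.  Consensus = running mean of the member spectra."""
    clusters = []  # list of (consensus_vector, member_count)
    for s in spectra:
        for i, (c, n) in enumerate(clusters):
            if cosine(s, c) >= threshold:
                clusters[i] = ((c * n + s) / (n + 1), n + 1)
                break
        else:
            clusters.append((s.astype(float), 1))
    return [c for c, _ in clusters]

# Three noisy replicates of one peptide spectrum plus one unrelated spectrum
rng = np.random.default_rng(0)
base = np.array([0, 5, 0, 9, 1, 0, 3], float)
spectra = [base + rng.normal(0, 0.1, base.size) for _ in range(3)]
spectra.append(np.array([7, 0, 2, 0, 0, 8, 0], float))
print(len(cluster_consensus(spectra)))  # -> 2
```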

  1. In pursuit of an accurate spatial and temporal model of biomolecules at the atomistic level: a perspective on computer simulation

    PubMed Central

    Gray, Alan; Harlen, Oliver G.; Harris, Sarah A.; Khalid, Syma; Leung, Yuk Ming; Lonsdale, Richard; Mulholland, Adrian J.; Pearson, Arwen R.; Read, Daniel J.; Richardson, Robin A.

    2015-01-01

    Despite huge advances in the computational techniques available for simulating biomolecules at the quantum-mechanical, atomistic and coarse-grained levels, there is still a widespread perception amongst the experimental community that these calculations are highly specialist and are not generally applicable by researchers outside the theoretical community. In this article, the successes and limitations of biomolecular simulation and the further developments that are likely in the near future are discussed. A brief overview is also provided of the experimental biophysical methods that are commonly used to probe biomolecular structure and dynamics, and the accuracy of the information that can be obtained from each is compared with that from modelling. It is concluded that progress towards an accurate spatial and temporal model of biomacromolecules requires a combination of all of these biophysical techniques, both experimental and computational. PMID:25615870

  2. Patient grouping for dose surveys and establishment of diagnostic reference levels in paediatric computed tomography.

    PubMed

    Vassileva, J; Rehani, M

    2015-07-01

    There has been confusion in the literature on whether paediatric patients should be grouped according to age, weight or other parameters when dealing with dose surveys. The present work aims to suggest a pragmatic approach to achieve reasonable accuracy for performing patient dose surveys in countries with limited resources. The analysis is based on a subset of data collected within the IAEA survey of paediatric computed tomography (CT) doses, involving 82 CT facilities from 32 countries in Asia, Europe, Africa and Latin America. Data for 6115 patients were collected, for 34.5 % of whom weight data were available. The present study suggests that using four age groups, <1, >1-5, >5-10 and >10-15 y, is realistic and pragmatic for dose surveys in less resourced countries and for the establishment of DRLs. To ensure sufficient accuracy of results, data for >30 patients in a particular age group should be collected if patient weight is not known. If a smaller sample is used, patient weight should be recorded and the median weight in the sample should be within 5-10 % of the median weight of the sample for which the DRLs were established. Comparison of results from different surveys should always be performed with caution, taking into consideration the way paediatric patients were grouped. Dose results can be corrected for differences in patient weight/age group. PMID:25836695
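
    The grouping rule and the small-sample weight check suggested above can be written out as short helpers; the handling of exact boundary ages is an assumption of this sketch.

```python
def age_group(age_years):
    """Assign a paediatric CT patient to one of the four survey age groups
    used in the study: <1, >1-5, >5-10 and >10-15 y."""
    if age_years < 1:
        return "<1"
    for upper, label in [(5, ">1-5"), (10, ">5-10"), (15, ">10-15")]:
        if age_years <= upper:
            return label
    raise ValueError("outside the paediatric range of this survey")

def weight_comparable(sample_median_kg, drl_median_kg, tolerance=0.10):
    """For samples smaller than 30 patients per group, the study recommends
    the sample's median weight be within 5-10 % of the median weight behind
    the DRLs; the upper 10 % bound is used here."""
    return abs(sample_median_kg - drl_median_kg) / drl_median_kg <= tolerance

print(age_group(7), weight_comparable(31.0, 33.0))
```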

  3. Phase modulation at the few-photon level for weak-nonlinearity-based quantum computing

    NASA Astrophysics Data System (ADS)

    Venkataraman, Vivek; Saha, Kasturi; Gaeta, Alexander L.

    2013-02-01

    The ability of a few-photon light field to impart an appreciable phase shift on another light field is critical for many quantum information applications. A recently proposed paradigm for quantum computation utilizes weak nonlinearities, where a strong field mediates such cross-phase shifts between single photons. Such a protocol promises to be feasible in terms of scalability to many qubits if a cross-phase shift of 10^-5 to 10^-2 radians per photon can be achieved. A promising platform to achieve such cross-phase shifts is the hollow-core photonic bandgap fibre, which can highly confine atomic vapours and light over distances much greater than the diffraction length. Here, we produce large cross-phase shifts of 0.3 mrad per photon with a fast response time (<5 ns) using rubidium atoms confined to a hollow-core photonic bandgap fibre, which represents, to our knowledge, the largest such nonlinear phase shift induced in a single pass through a room-temperature medium.

  4. Computational aspects of helicopter trim analysis and damping levels from Floquet theory

    NASA Technical Reports Server (NTRS)

    Gaonkar, Gopal H.; Achar, N. S.

    1992-01-01

    Helicopter trim settings of periodic initial state and control inputs are investigated for convergence of Newton iteration in computing the settings sequentially and in parallel. The trim analysis uses a shooting method and a weak version of two temporal finite element methods with displacement formulation and with mixed formulation of displacements and momenta. These three methods broadly represent two main approaches of trim analysis: adaptation of initial-value and finite element boundary-value codes to periodic boundary conditions, particularly for unstable and marginally stable systems. In each method, both the sequential and in-parallel schemes are used and the resulting nonlinear algebraic equations are solved by damped Newton iteration with an optimally selected damping parameter. The impact of damped Newton iteration, including earlier-observed divergence problems in trim analysis, is demonstrated by the maximum condition number of the Jacobian matrices of the iterative scheme and by virtual elimination of divergence. The advantages of the in-parallel scheme over the conventional sequential scheme are also demonstrated.
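
    The damped Newton iteration at the core of the trim solution can be sketched on a toy nonlinear system; the backtracking rule used here for the damping parameter is a simple stand-in for the optimal selection described in the paper.

```python
import numpy as np

def damped_newton(F, J, x0, lam=1.0, tol=1e-10, max_iter=50):
    """Damped Newton iteration x <- x - t * J(x)^-1 F(x), t <= lam.
    This sketch halves t whenever a step fails to reduce ||F||, which
    suppresses the divergence a full Newton step can produce."""
    x = np.asarray(x0, float)
    for _ in range(max_iter):
        f = F(x)
        if np.linalg.norm(f) < tol:
            return x
        step = np.linalg.solve(J(x), f)   # Newton direction
        t = lam
        while np.linalg.norm(F(x - t * step)) >= np.linalg.norm(f) and t > 1e-8:
            t *= 0.5                      # damp until the residual decreases
        x = x - t * step
    return x

# Toy nonlinear algebraic system standing in for the trim equations.
F = lambda x: np.array([x[0]**2 + x[1] - 3.0, x[0] - x[1]])
J = lambda x: np.array([[2.0 * x[0], 1.0], [1.0, -1.0]])
root = damped_newton(F, J, np.array([2.0, 0.0]))
print(np.round(root, 6))
```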

  5. Whole proteome identification of plant candidate G-protein coupled receptors in Arabidopsis, rice, and poplar: computational prediction and in-vivo protein coupling

    PubMed Central

    Gookin, Timothy E; Kim, Junhyong; Assmann, Sarah M

    2008-01-01

    Background The classic paradigm of heterotrimeric G-protein signaling describes a heptahelical, membrane-spanning G-protein coupled receptor that physically interacts with an intracellular Gα subunit of the G-protein heterotrimer to transduce signals. G-protein coupled receptors comprise the largest protein superfamily in metazoa and are physiologically important as they sense highly diverse stimuli and play key roles in human disease. The heterotrimeric G-protein signaling mechanism is conserved across metazoa, and also readily identifiable in plants, but the low sequence conservation of G-protein coupled receptors hampers the identification of novel ones. Using diverse computational methods, we performed whole-proteome analyses of the three dominant model plant species, the herbaceous dicot Arabidopsis thaliana (mouse-eared cress), the monocot Oryza sativa (rice), and the woody dicot Populus trichocarpa (poplar), to identify plant protein sequences most likely to be GPCRs. Results Our stringent bioinformatic pipeline allowed the high confidence identification of candidate G-protein coupled receptors within the Arabidopsis, Oryza, and Populus proteomes. We extended these computational results through actual wet-bench experiments where we tested over half of our highest ranking Arabidopsis candidate G-protein coupled receptors for the ability to physically couple with GPA1, the sole Gα in Arabidopsis. We found that seven out of eight tested candidate G-protein coupled receptors do in fact interact with GPA1. We show through G-protein coupled receptor classification and molecular evolutionary analyses that both individual G-protein coupled receptor candidates and candidate G-protein coupled receptor families are conserved across plant species and that, in some cases, this conservation extends to metazoans. Conclusion Our computational and wet-bench results provide the first step toward understanding the diversity, conservation, and functional roles of plant

  6. ALPHN: A computer program for calculating (α, n) neutron production in canisters of high-level waste

    SciTech Connect

    Salmon, R.; Hermann, O.W.

    1992-10-01

    The rate of neutron production from (α, n) reactions in canisters of immobilized high-level waste containing borosilicate glass or glass-ceramic compositions is significant and must be considered when estimating neutron shielding requirements. The personal computer program ALPHN calculates the (α, n) neutron production rate of a canister of vitrified high-level waste. The user supplies the chemical composition of the glass or glass-ceramic and the curies of the alpha-emitting actinides present. The output of the program gives the (α, n) neutron production of each actinide in neutrons per second and the total for the canister. The (α, n) neutron production rates are source terms only; that is, they are production rates within the glass and do not take into account the shielding effect of the glass. For a given glass composition, the user can calculate up to eight cases simultaneously; these cases are based on the same glass composition but contain different quantities of actinides per canister. In a typical application, these cases might represent the same canister of vitrified high-level waste at eight different decay times. Run time for a typical problem containing 20 chemical species, 24 actinides, and 8 decay times was 35 s on an IBM AT personal computer. Results of an example based on an expected canister composition at the Defense Waste Processing Facility are shown.
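
    The source-term bookkeeping the program performs (per-actinide rates and a canister total) reduces to a weighted sum of curies times a per-curie neutron yield; the yields below are hypothetical placeholders, since the real values depend on the glass composition.

```python
# Hypothetical (alpha, n) neutron yields in neutrons per second per curie
# for a fixed glass composition -- placeholder numbers, not program output.
yield_n_per_s_per_ci = {"Am-241": 2.6e3, "Cm-244": 6.0e3, "Pu-238": 2.2e3}

def canister_source(curies_by_actinide):
    """Return the (alpha, n) production of each actinide (n/s) and the
    canister total, mirroring the per-actinide + total output described."""
    per_actinide = {a: ci * yield_n_per_s_per_ci[a]
                    for a, ci in curies_by_actinide.items()}
    return per_actinide, sum(per_actinide.values())

per, total = canister_source({"Am-241": 100.0, "Cm-244": 50.0})
print(per, total)   # neutrons per second, source term only (no shielding)
```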

  8. A computational method for the detection of activation/deactivation patterns in biological signals with three levels of electric intensity.

    PubMed

    Guerrero, J A; Macías-Díaz, J E

    2014-02-01

    In the present work, we develop a computational technique to approximate the changes of phase in temporal series associated to electric signals of muscles which perform activities at three different levels of intensity. We suppose that the temporal series are samples of independent, normally distributed random variables with mean equal to zero, and variance equal to one of three possible values, each of them associated to a certain degree of electric intensity. For example, these intensity levels may represent a leg muscle at rest, or active during a light activity (walking), or active during a highly demanding performance (jogging). The model is presented as a maximum likelihood problem involving discrete variables. In turn, this problem is transformed into a continuous one via the introduction of continuous variables with penalization parameters, and it is solved recursively through an iterative numerical method. An a posteriori treatment of the results is used in order to avoid the detection of relatively short periods of silence or activity. We perform simulations with synthetic data in order to assess the validity of our technique. Our computational results show that the method approximates well the occurrence of the change points in synthetic temporal series, even in the presence of autocorrelated sequences. Along the way, we show that a generalization of a computational technique for the change-point detection of electric signals with two phases of activity (Esquivel-Frausto et al., 2010 [40]) may be inapplicable in cases of temporal series with three levels of intensity. In this sense, the method proposed in the present manuscript improves previous efforts of the authors. PMID:24418009
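
    A simplified sketch of the idea (three known variance levels, per-window maximum-likelihood labelling, plus the a posteriori removal of short periods); the windowing and the cleanup rule are assumptions of this sketch rather than the authors' exact recursive scheme.

```python
import numpy as np

def classify_levels(x, sigmas, win=50, min_run=3):
    """Label each window of a zero-mean signal with the intensity level (one
    of three known standard deviations) maximizing the Gaussian likelihood,
    then absorb runs shorter than `min_run` windows into the preceding run."""
    sig2 = np.asarray(sigmas, float) ** 2
    n_win = len(x) // win
    labels = []
    for k in range(n_win):
        w = x[k * win:(k + 1) * win]
        # log-likelihood of N(0, sig2) up to an additive constant
        ll = -0.5 * (win * np.log(sig2) + np.sum(w ** 2) / sig2)
        labels.append(int(np.argmax(ll)))
    # a posteriori cleanup: drop implausibly short silence/activity periods
    cleaned, i = labels[:], 0
    while i < n_win:
        j = i
        while j < n_win and labels[j] == labels[i]:
            j += 1
        if j - i < min_run and i > 0:
            cleaned[i:j] = [cleaned[i - 1]] * (j - i)
        i = j
    return cleaned

# Synthetic series: rest (sigma = 1) followed by demanding activity (sigma = 3)
rng = np.random.default_rng(1)
x = np.concatenate([rng.normal(0, 1, 500), rng.normal(0, 3, 500)])
print(classify_levels(x, sigmas=[1, 3, 6]))
```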

  9. Low-level waste disposal site performance assessment with the RQ/PQ computer program. Final report. [Shallow land burial

    SciTech Connect

    Rogers, V.C.; Grant, M.W.; Merrell, G.B.; Macbeth, P.J.

    1983-06-01

    The operation of the computer code RQ/PQ (Retention quotient/performance quotient) is presented. The code calculates the potential hazard of low-level radioactive waste as well as the characteristics of the natural and man-made barriers provided by the disposal facility. From these parameters a facility safety factor is determined as a function of time after facility operation. The program also calculates the dose to man for nine release pathways; both off-site transport and inadvertent intrusion pathways are considered.

  10. Identification of fentanyl derivatives at trace levels with nonaqueous capillary electrophoresis-electrospray-tandem mass spectrometry (MS(n), n = 2, 3): analytical method and forensic applications.

    PubMed

    Rittgen, Jan; Pütz, Michael; Zimmermann, Ralf

    2012-06-01

    The identification of fentanyl derivatives at trace levels employing capillary electrophoresis coupled to electrospray ionization tandem mass spectrometry (CE-ESI-MS(n), n = 2, 3) is presented. The studied synthetic opioid fentanyl and its derivatives have an exceedingly high analgesic potency, up to 8000 times that of morphine. Apart from their therapeutic applications, they are abused in the drug scene as heroin substitutes. The identification of these opioids at trace levels is of further significant forensic interest with respect to recent seizures of clandestine fentanyl laboratories in Germany. In this work, a nonaqueous capillary electrophoresis (NACE)-ESI-MS(n) procedure was developed for the separation and identification of six fentanyl derivatives: fentanyl, cis- and trans-methylfentanyl, sufentanil, alfentanil, and carfentanil. Their fragmentation patterns in MS(n) experiments were investigated, as well as the influence of the sheath-liquid mixture and of the inside diameter of the fused silica capillary on the peak shape and the signal-to-noise ratio. Method validation included determination of the detection limits (about 1-2 nmol/L) and the repeatability of migration time (at most 0.07% relative standard deviation). The NACE-MS procedure was successfully applied to the analysis of real samples from seizures in illegal fentanyl laboratories. PMID:22736362

  11. Computational Visual Stress Level Analysis of Calcareous Algae Exposed to Sedimentation

    PubMed Central

    Osterloff, Jonas; Nilssen, Ingunn; Eide, Ingvar; de Oliveira Figueiredo, Marcia Abreu; de Souza Tâmega, Frederico Tapajós; Nattkemper, Tim W.

    2016-01-01

    This paper presents a machine learning based approach for analyses of photos collected from laboratory experiments conducted to assess the potential impact of water-based drill cuttings on deep-water rhodolith-forming calcareous algae. This pilot study uses imaging technology to quantify and monitor the stress levels of the calcareous algae Mesophyllum engelhartii (Foslie) Adey caused by various degrees of light exposure, flow intensity and amount of sediment. A machine learning based algorithm was applied to assess the temporal variation of the calcareous algae size (∼ mass) and color automatically. Measured size and color were correlated to the photosynthetic efficiency (maximum quantum yield of charge separation in photosystem II, ΦPSIImax) and degree of sediment coverage using multivariate regression. The multivariate regression showed correlations between time and calcareous algae sizes, as well as correlations between fluorescence and calcareous algae colors. PMID:27285611
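
    The multivariate-regression step (size and colour regressed on time and photosynthetic yield) can be sketched with ordinary least squares on synthetic data; all numbers below are hypothetical, not measurements from the experiment.

```python
import numpy as np

# Synthetic stand-ins for the measured quantities (hypothetical values):
# predictors are elapsed time and photosynthetic yield; responses are the
# automatically measured algae size (~ mass proxy) and a colour channel.
rng = np.random.default_rng(2)
time = np.linspace(0.0, 30.0, 40)                      # days
yield_psii = 0.6 - 0.01 * time + rng.normal(0, 0.01, 40)
X = np.column_stack([np.ones_like(time), time, yield_psii])
Y = np.column_stack([
    120.0 - 0.8 * time + rng.normal(0, 1.0, 40),       # size
    40.0 + 90.0 * yield_psii + rng.normal(0, 1.0, 40), # colour value
])

# One least-squares coefficient column per response variable.
B, *_ = np.linalg.lstsq(X, Y, rcond=None)
print(B.shape)  # -> (3, 2)
```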

  12. COMGEN: A computer program for generating finite element models of composite materials at the micro level

    NASA Technical Reports Server (NTRS)

    Melis, Matthew E.

    1990-01-01

    COMGEN (Composite Model Generator) is an interactive FORTRAN program which can be used to create a wide variety of finite element models of continuous fiber composite materials at the micro level. It quickly generates batch or session files to be submitted to the finite element pre- and postprocessor PATRAN based on a few simple user inputs such as fiber diameter and percent fiber volume fraction of the composite to be analyzed. In addition, various mesh densities, boundary conditions, and loads can be assigned easily to the models within COMGEN. PATRAN uses a session file to generate finite element models and their associated loads which can then be translated to virtually any finite element analysis code such as NASTRAN or MARC.

  13. Multi-level security for computer networking: SAC digital network approach

    SciTech Connect

    Griess, W.; Poutre, D.L.

    1983-10-01

    For telecommunications systems simultaneously handling data of different security levels, multilevel secure (MLS) operation permits maximum use of resources by automatically providing protection to users with various clearances and needs-to-know. The Strategic Air Command (SAC) is upgrading the primary record data system used to command and control its strategic forces. The upgrade, called the SAC Digital Network (SACDIN), is designed to provide multilevel security to support users and external interfaces, with allowed accesses ranging from unclassified to top secret. SACDIN implements a security kernel based upon the Bell and LaPadula security model. This study presents an overview of the SACDIN security architecture and describes the basic message flow across the MLS network. 7 references.
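
    The Bell and LaPadula model underlying the SACDIN kernel can be illustrated by its two core rules, the simple security property ("no read up") and the *-property ("no write down"); the level lattice below is the unclassified-to-top-secret range named in the abstract.

```python
# Totally ordered security levels, lowest to highest.
LEVELS = {"unclassified": 0, "confidential": 1, "secret": 2, "top secret": 3}

def may_read(subject_level, object_level):
    """Simple security property: a subject may read only at or below its
    own level (no read up)."""
    return LEVELS[subject_level] >= LEVELS[object_level]

def may_write(subject_level, object_level):
    """*-property: a subject may write only at or above its own level, so
    information cannot leak downward (no write down)."""
    return LEVELS[subject_level] <= LEVELS[object_level]

print(may_read("secret", "top secret"), may_write("secret", "unclassified"))
# -> False False
```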

  15. A Well-Mixed Computational Model for Estimating Room Air Levels of Selected Constituents from E-Vapor Product Use

    PubMed Central

    Rostami, Ali A.; Pithawalla, Yezdi B.; Liu, Jianmin; Oldham, Michael J.; Wagner, Karl A.; Frost-Pineda, Kimberly; Sarkar, Mohamadi A.

    2016-01-01

    Concerns have been raised in the literature about the potential for secondhand exposure from e-vapor product (EVP) use. It would be difficult to experimentally determine the impact of various factors on secondhand exposure including, but not limited to, room characteristics (indoor space size, ventilation rate), device specifications (aerosol mass delivery, e-liquid composition), and use behavior (number of users and usage frequency). Therefore, a well-mixed computational model was developed to estimate the indoor levels of constituents from EVPs under a variety of conditions. The model is based on physical and thermodynamic interactions between aerosol, vapor, and air, similar to indoor air models referred to by the Environmental Protection Agency. The model results agree well with measured indoor air levels of nicotine from two sources: smoking machine-generated aerosol and aerosol exhaled from EVP use. Sensitivity analysis indicated that increasing the air exchange rate reduces the room air level of constituents, as more material is carried away. The effect of the amount of aerosol released into the space due to variability in exhalation was also evaluated. The model can estimate the room air level of constituents as a function of time, which may be used to assess the level of non-user exposure over time. PMID:27537903
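
    A minimal single-zone ("well-mixed") mass balance of the kind such models build on can be sketched as below. The emission rate, room volume, and air-exchange values are illustrative assumptions, not the paper's parameters:

```python
import math

def room_concentration(t_h, emission_mg_per_h, volume_m3, ach):
    """Well-mixed single-zone mass balance: dC/dt = E/V - ach*C, C(0) = 0.
    Returns the concentration in mg/m^3 after t_h hours. This is a generic
    model in the spirit of the paper, not the authors' implementation."""
    c_ss = emission_mg_per_h / (volume_m3 * ach)   # steady state E/(V*ach)
    return c_ss * (1.0 - math.exp(-ach * t_h))

# Example: 2 mg/h of a constituent released into a 30 m^3 room for 4 h
low_vent  = room_concentration(4.0, 2.0, 30.0, 0.5)   # 0.5 air changes/h
high_vent = room_concentration(4.0, 2.0, 30.0, 3.0)   # 3 air changes/h
print(round(low_vent, 4), round(high_vent, 4))
```

    The comparison reproduces the paper's sensitivity-analysis observation: a higher air exchange rate lowers the room air level because material is carried away faster.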

  17. Species level identification and antifungal susceptibility of yeasts isolated from various clinical specimens and evaluation of Integral System Yeasts Plus.

    PubMed

    Bicmen, Can; Doluca, Mine; Gulat, Sinem; Gunduz, Ayriz T; Tuksavul, Fevziye

    2012-07-01

    It is essential to use easy, standard, cost-effective and accurate methods for identification and susceptibility testing of yeasts in routine practice. This study aimed to establish the species distribution and antifungal susceptibility of yeast isolates and also to evaluate the performance of the colorimetric, commercially available Integral System Yeasts Plus (ISYP). Yeast isolates (n=116) were identified by conventional methods and ISYP. Antifungal susceptibility testing was performed by the microdilution method according to the standards of CLSI M27-A3 and by ISYP. Candida albicans (50%) was the most common species isolated, followed by C. parapsilosis (25%) (mostly in blood samples). According to the CLSI M27-S3 criteria, resistance rates for amphotericin B, flucytosine, fluconazole, itraconazole, and voriconazole were 0%, 0%, 4.6%, 4.5% and 1.8%, respectively. Resistance to miconazole (MIC >1 mg/L) was found in 17.9% of isolates. For 62 (53.4%) of the isolates analyzed by ISYP, the identification disagreed with that obtained by the conventional methods and the API ID 32C kit, or ISYP could not assign a specific identification code. The performance of ISYP was low for all antifungal drugs tested according to the ROC analysis (AUC: 0.28-0.56). As the current version of ISYP displays poor performance, other commercial systems whose results have been shown to be reliable and in agreement with the reference methods are recommended for identification and susceptibility testing of yeasts. PMID:22842602
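
    The ROC analysis used to grade ISYP's performance reduces to computing an AUC per drug. A minimal sketch via the Mann-Whitney rank statistic, with invented scores and labels, is:

```python
def roc_auc(scores, labels):
    """AUC as the Mann-Whitney U statistic: the probability that a randomly
    chosen positive outranks a randomly chosen negative (ties count 0.5).
    Illustrative only; the paper reports AUCs of 0.28-0.56 for ISYP."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Hypothetical test scores near chance level, like the poorer ISYP results:
scores = [0.9, 0.4, 0.6, 0.3, 0.8, 0.2]
labels = [1, 0, 1, 1, 0, 0]
print(roc_auc(scores, labels))
```

    An AUC near 0.5 means the test separates resistant from susceptible isolates no better than chance, which is why AUCs of 0.28-0.56 support the authors' negative conclusion.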

  18. Five-Test Simple Scheme for Species-Level Identification of Clinically Significant Coagulase-Negative Staphylococci

    PubMed Central

    De Paulis, Adriana N.; Predari, Silvia C.; Chazarreta, Carlos D.; Santoianni, Jorge E.

    2003-01-01

    A working scheme developed in our laboratory for identification (by species group and species) of coagulase-negative staphylococci (CNS) was evaluated with 201 consecutive isolates and then validated by using the reference method of Kloos and Schleifer (W. E. Kloos and K. H. Schleifer, J. Clin. Microbiol. 1:82-88, 1975). This five-test simple scheme (referred to here as the simple scheme) combines the novobiocin susceptibility test with tests for urease, pyrrolidonyl arylamidase, ornithine decarboxylase, and aerobic acid from mannose. The addition of one or two tests within a particular species group could then positively identify the isolate. Two commercial systems, Staph-Zym (Rosco) and API-Staph (bioMérieux), along with results obtained by using Rosco diagnostic tablets (nongrowth tests), were also compared with the reference method. One isolate could not be identified even by the reference method. Of the remaining 200 strains, 191 (95.5%) strains were correctly identified with Staph-Zym and 171 strains (85.5%) were correctly identified with API-Staph. The most frequent clinical CNS species isolated were Staphylococcus epidermidis (50.5%), S. haemolyticus (18.5%), S. saprophyticus subsp. saprophyticus (16.0%), S. lugdunensis (6.0%), and S. warneri (2.5%). The simple scheme validated with the reference method has demonstrated an excellent correlation in the identification of the three most frequent species isolated: S. epidermidis, S. haemolyticus, and S. saprophyticus subsp. saprophyticus. With the simple scheme, identification of CNS was possible within 24 h after the enzymatic tests were used, whereas up to 72 h is necessary for the growth tests. This methodology would be very useful in any clinical microbiology laboratory for the presumptive identification of CNS species groups and species. PMID:12624054
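
    The scheme's lookup logic can be sketched as a decision table keyed on the five reactions. The reaction patterns below are textbook approximations used for illustration, not the paper's validated table:

```python
# Illustrative decision table in the spirit of the five-test simple scheme
# (novobiocin susceptibility, urease, PYR, ODC, acid from mannose).
# NOTE: the patterns here are approximate and illustrative only.
PATTERNS = {
    # (novobiocin_susceptible, urease, pyr, odc, mannose): species
    (True,  True,  False, False, True):  "S. epidermidis",
    (True,  False, True,  False, True):  "S. haemolyticus",
    (False, True,  False, False, False): "S. saprophyticus",
    (True,  True,  True,  True,  True):  "S. lugdunensis",
}

def identify_cns(novo, urease, pyr, odc, mannose):
    """Return a presumptive species, or flag that more tests are needed."""
    return PATTERNS.get((novo, urease, pyr, odc, mannose),
                        "further tests required")

print(identify_cns(False, True, False, False, False))
```

    As in the paper, a pattern that does not resolve to a single species within its group would trigger one or two additional tests rather than a forced call.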

  19. High level waste storage tank farms/242-A evaporator standards/requirements identification document phase 1 assessment report

    SciTech Connect

    Biebesheimer, E., Westinghouse Hanford Co.

    1996-09-30

    This document, the Standards/Requirements Identification Document (S/RID) Phase I Assessment Report for the subject facility, represents the results of an Administrative Assessment to determine whether S/RID requirements are fully addressed by existing policies, plans or procedures. It contains; compliance status, remedial actions, and an implementing manuals report linking S/RID elements to requirement source to implementing manual and section.

  20. Identification of Novel Activators of Constitutive Androstane Receptor from FDA-approved Drugs by Integrated Computational and Biological Approaches

    PubMed Central

    Lynch, Caitlin; Pan, Yongmei; Li, Linhao; Ferguson, Stephen S.; Xia, Menghang; Swaan, Peter W.; Wang, Hongbing

    2012-01-01

    Purpose The constitutive androstane receptor (CAR, NR1I3) is a xenobiotic sensor governing the transcription of numerous hepatic genes associated with drug metabolism and clearance. Recent evidence suggests that CAR also modulates energy homeostasis and cancer development. Thus, identification of novel human (h) CAR activators is of both clinical importance and scientific interest. Methods Docking and ligand-based structure-activity models were used for virtual screening of a database containing over 2000 FDA-approved drugs. Identified lead compounds were evaluated in cell-based reporter assays to determine hCAR activation. Potential activators were further tested in human primary hepatocytes (HPHs) for the expression of the prototypical hCAR target gene CYP2B6. Results Nineteen lead compounds with optimal modeling parameters were selected for biological evaluation. Seven of the 19 leads exhibited moderate to potent activation of hCAR. Five out of the seven compounds translocated hCAR from the cytoplasm to the nucleus of HPHs in a concentration-dependent manner. These compounds also induced the expression of CYP2B6 in HPHs, with a rank order of efficacies closely resembling that of hCAR activation. Conclusion These results indicate that our strategically integrated approaches are effective in the identification of novel hCAR modulators, which may function as valuable research tools or potential therapeutic molecules. PMID:23090669

  1. In pursuit of an accurate spatial and temporal model of biomolecules at the atomistic level: a perspective on computer simulation

    SciTech Connect

    Gray, Alan; Harlen, Oliver G.; Harris, Sarah A.; Khalid, Syma; Leung, Yuk Ming; Lonsdale, Richard; Mulholland, Adrian J.; Pearson, Arwen R.; Read, Daniel J.; Richardson, Robin A.

    2015-01-01

    The current computational techniques available for biomolecular simulation are described, and the successes and limitations of each with reference to the experimental biophysical methods that they complement are presented. Despite huge advances in the computational techniques available for simulating biomolecules at the quantum-mechanical, atomistic and coarse-grained levels, there is still a widespread perception amongst the experimental community that these calculations are highly specialized and cannot generally be applied by researchers outside the theoretical community. In this article, the successes and limitations of biomolecular simulation and the further developments that are likely in the near future are discussed. A brief overview is also provided of the experimental biophysical methods that are commonly used to probe biomolecular structure and dynamics, and the accuracy of the information that can be obtained from each is compared with that from modelling. It is concluded that progress towards an accurate spatial and temporal model of biomacromolecules requires a combination of all of these biophysical techniques, both experimental and computational.

  2. A computational model of oxygen transport in the cerebrocapillary levels for normal and pathologic brain function

    PubMed Central

    Safaeian, Navid; David, Tim

    2013-01-01

    The oxygen exchange and correlation between the cerebral blood flow (CBF) and cerebral metabolic rate of oxygen consumption (CMRO2) in the cortical capillary levels for normal and pathologic brain functions remain the subject of debate. A 3D realistic mesoscale model of the cortical capillary network (non-tree like) is constructed using a random Voronoi tessellation in which each edge represents a capillary segment. The hemodynamics and oxygen transport are numerically simulated in the model, which involves rheological laws in the capillaries, oxygen diffusion, and non-linear binding of oxygen to hemoglobin. The findings show that cerebral hypoxia due to significantly decreased perfusion (as can occur in stroke) can be avoided by a moderate reduction in oxygen demand. Oxygen extraction fraction (OEF) can be an important indicator for the brain oxygen metabolism under normal perfusion and misery-perfusion syndrome (leading to ischemia). The results demonstrated that a disproportionately large increase in blood supply is required for a small increase in the oxygen demand, which, in turn, is strongly dependent on the resting OEF. The predicted flow-metabolism coupling in the model supports the experimental studies of spatiotemporal stimulations in humans by positron emission tomography and functional magnetic resonance imaging. PMID:23921901
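
    The claim that a small rise in oxygen demand requires a disproportionately large rise in flow can be illustrated with the classic diffusion-limited (Buxton-Frank) oxygen delivery model. This is a standard one-equation simplification, not the paper's 3D capillary-network simulation, and the resting OEF values are illustrative:

```python
def cmro2_ratio(f_ratio, e0):
    """Diffusion-limited delivery: OEF falls as flow rises,
    E = 1 - (1 - E0)**(1/f_ratio), so CMRO2/CMRO2_0 = f_ratio * E / E0."""
    e = 1.0 - (1.0 - e0) ** (1.0 / f_ratio)
    return f_ratio * e / e0

def flow_needed(demand_ratio, e0):
    """Bisection for the CBF ratio giving the requested CMRO2 ratio
    (cmro2_ratio is monotonically increasing in the flow ratio)."""
    lo, hi = 1.0, 50.0
    for _ in range(60):
        mid = 0.5 * (lo + hi)
        if cmro2_ratio(mid, e0) < demand_ratio:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# A 10% rise in oxygen demand needs a far larger rise in flow, and the
# required increase depends strongly on the resting OEF:
print(round(flow_needed(1.10, 0.30), 2), round(flow_needed(1.10, 0.45), 2))
```

    In both cases the flow must more than double or rise by roughly half for only a 10% metabolic increase, mirroring the flow-metabolism coupling the abstract describes.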

  3. A computer-based intervention for improving the appropriateness of antiepileptic drug level monitoring.

    PubMed

    Chen, Philip; Tanasijevic, Milenko J; Schoenenberger, Ronald A; Fiskio, Julie; Kuperman, Gilad J; Bates, David W

    2003-03-01

    We designed and implemented 2 automated, computerized screens for use at the time of antiepileptic drug (AED) test order entry to improve appropriateness by reminding physicians when a potentially redundant test was ordered and providing common indications for monitoring and pharmacokinetics of the specific AED. All computerized orders for inpatient serum AED levels during two 3-month periods were included in the study. During the 3-month period after implementation of the automated intervention, 13% of all AED tests ordered were canceled following computerized reminders. For orders appearing redundant, the cancellation rate was 27%. For nonredundant orders, 4% were canceled when information on specific AED monitoring and pharmacokinetics was provided. The cancellation rate was sustained after 4 years. There has been a 19.5% decrease in total AED testing volume since implementation of this intervention, despite a 19.3% increase in overall chemistry test volume. Inappropriateness owing to repeated testing before pharmacologic steady state was reached decreased from 54% of all AED orders to 14.6%. A simple, automated, activity-based intervention targeting a specific test-ordering behavior effectively reduced inappropriate laboratory testing. The sustained benefit supports the idea that computerized interventions may durably affect physician behavior. Computerized delivery of such evidence-based boundary guidelines can help narrow the gap between evidence and practice. PMID:12645347
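
    The redundancy screen can be sketched as a look-back check at order entry. The two-day window and test codes below are hypothetical, not the study's actual rules:

```python
from datetime import datetime, timedelta

# Hypothetical look-back window; the appropriate interval in practice
# depends on the drug's pharmacokinetics.
LOOKBACK = timedelta(days=2)

def is_redundant(prior_orders, patient_id, test_code, when):
    """Flag a new antiepileptic drug (AED) level order as potentially
    redundant if the same test was ordered for the same patient within
    the look-back window. prior_orders: (patient_id, test_code, datetime)."""
    return any(pid == patient_id and code == test_code
               and timedelta(0) <= when - t <= LOOKBACK
               for pid, code, t in prior_orders)

orders = [("p1", "phenytoin_level", datetime(2003, 3, 1, 8, 0))]
print(is_redundant(orders, "p1", "phenytoin_level", datetime(2003, 3, 2, 9, 0)))
print(is_redundant(orders, "p1", "phenytoin_level", datetime(2003, 3, 6, 9, 0)))
```

    In the deployed system, a positive check would trigger the reminder screen rather than block the order, leaving the cancellation decision to the physician.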

  4. Identification of transformation products of rosuvastatin in water during ZnO photocatalytic degradation through the use of associated LC-QTOF-MS to computational chemistry.

    PubMed

    Segalin, Jéferson; Sirtori, Carla; Jank, Louíse; Lima, Martha F S; Livotto, Paolo R; Machado, Tiele C; Lansarin, Marla A; Pizzolato, Tânia M

    2015-12-15

    Rosuvastatin (RST), a synthetic statin, is a 3-hydroxy-3-methylglutaryl coenzyme A reductase inhibitor with a number of pleiotropic properties, such as anti-inflammation, antioxidation and cardiac remodelling attenuation. According to IMS Health, rosuvastatin was the third best-selling drug in the United States in 2012. RST was recently found in European effluent samples at a detection frequency of 36%. In this study, we evaluate the identification process of major transformation products (TPs) of RST generated during the heterogeneous photocatalysis process with ZnO. The degradation of the parent molecule and the identification of the main TPs were studied in demineralised water. The TPs were monitored and identified by liquid chromatography-quadrupole time-of-flight mass spectrometry (LC-QTOF-MS/MS). Ten TPs were tentatively identified, some of them originating from hydroxylation of the aromatic ring during the initial stages of the process. Structural elucidation of some of the most abundant or persistent TPs was supported by computational analysis, which demonstrated that this approach can be used as a tool to help elucidate the structures of unknown molecules. The analysis of the parameters obtained from ab initio calculations for different isomers identified the most stable structures and, consequently, those most likely to be found. PMID:26093357
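
    Tentatively assigning a transformation product typically starts from the exact-mass shift between parent and product; hydroxylation, for example, adds one oxygen (+15.9949 Da). The sketch below uses the calculated monoisotopic mass of rosuvastatin (C22H28FN3O6S) and a small illustrative modification table, not the paper's full workflow:

```python
# Monoisotopic mass shifts of a few common transformations (Da).
MODS = {"hydroxylation (+O)": 15.9949,
        "demethylation (-CH2)": -14.0157,
        "dehydration (-H2O)": -18.0106}

ROSUVASTATIN = 481.1683  # calculated monoisotopic mass of C22H28FN3O6S

def assign_mod(parent_mass, tp_mass, tol=0.005):
    """Return the first modification whose mass shift matches the observed
    parent-to-TP difference within tolerance, else None."""
    for name, delta in MODS.items():
        if abs(tp_mass - (parent_mass + delta)) <= tol:
            return name
    return None

print(assign_mod(ROSUVASTATIN, 497.1632))
```

    A match narrows the candidate structures; the ab initio stability comparison described above then ranks which isomer of, say, a hydroxylated ring is most likely to be observed.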

  5. Computer-Aided Identification of Trypanosoma brucei Uridine Diphosphate Galactose 4′-Epimerase Inhibitors: Toward the Development of Novel Therapies for African Sleeping Sickness

    PubMed Central

    2010-01-01

    Trypanosoma brucei, the causative agent of human African trypanosomiasis, affects tens of thousands of sub-Saharan Africans. As current therapeutics are inadequate due to toxic side effects, drug resistance, and limited effectiveness, novel therapies are urgently needed. UDP-galactose 4′-epimerase (TbGalE), an enzyme of the Leloir pathway of galactose metabolism, is one promising T. brucei drug target. We here use the relaxed complex scheme, an advanced computer-docking methodology that accounts for full protein flexibility, to identify inhibitors of TbGalE. An initial hit rate of 62% was obtained at 100 μM, ultimately leading to the identification of 14 low-micromolar inhibitors. Thirteen of these inhibitors belong to a distinct series with a conserved binding motif that may prove useful in future drug design and optimization. PMID:20527952

  6. Identification of Students' Intuitive Mental Computational Strategies for 1, 2 and 3 Digits Addition and Subtraction: Pedagogical and Curricular Implications

    ERIC Educational Resources Information Center

    Ghazali, Munirah; Alias, Rohana; Ariffin, Noor Asrul Anuar; Ayub, Ayminsyadora

    2010-01-01

    This paper reports on a study to examine mental computation strategies used by Year 1, Year 2, and Year 3 students to solve addition and subtraction problems. The participants in this study were twenty five 7 to 9 year-old students identified as excellent, good and satisfactory in their mathematics performance from a school in Penang, Malaysia.…

  7. Identification of an intra-cranial intra-axial porcupine quill foreign body with computed tomography in a canine patient.

    PubMed

    Sauvé, Christopher P; Sereda, Nikki C; Sereda, Colin W

    2012-02-01

    An intra-cranial intra-axial foreign body was diagnosed in a golden retriever dog through the use of computed tomography (CT). Confirmed by necropsy, a porcupine quill had migrated to the patient's left cerebral hemisphere, likely through the oval foramen. This case study demonstrates the efficacy of CT in visualizing a quill in the canine brain. PMID:22851782

  8. Identification of Misconceptions in the Central Limit Theorem and Related Concepts and Evaluation of Computer Media as a Remedial Tool.

    ERIC Educational Resources Information Center

    Yu, Chong Ho; And Others

    Central limit theorem (CLT) is considered an important topic in statistics, because it serves as the basis for subsequent learning in other crucial concepts such as hypothesis testing and power analysis. There is an increasing popularity in using dynamic computer software for illustrating CLT. Graphical displays do not necessarily clear up…

  9. Developing an Educational Computer Game for Migratory Bird Identification Based on a Two-Tier Test Approach

    ERIC Educational Resources Information Center

    Chu, Hui-Chun; Chang, Shao-Chen

    2014-01-01

    Although educational computer games have been recognized as being a promising approach, previous studies have indicated that, without supportive models, students might only show temporary interest during the game-based learning process, and their learning performance is often not as good as expected. Therefore, in this paper, a two-tier test…

  11. High-level waste storage tank farms/242-A evaporator standards/requirements identification document (S/RID), Vol. 1

    SciTech Connect

    Not Available

    1994-04-01

    The purpose of this Requirements Identification Document (RID) section is to identify, in one location, all of the facility specific requirements and good industry practices which are necessary or important to establish an effective Issues Management Program for the Tank Farm Facility. The Management Systems Functional Area includes the site management commitment to environmental safety and health (ES&H) policies and controls, to compliance management, to development and management of policy and procedures, to occurrence reporting and corrective actions, resource and issue management, and to the self-assessment process.

  12. Identification of Potent Chemotypes Targeting Leishmania major Using a High-Throughput, Low-Stringency, Computationally Enhanced, Small Molecule Screen

    PubMed Central

    Sharlow, Elizabeth R.; Close, David; Shun, Tongying; Leimgruber, Stephanie; Reed, Robyn; Mustata, Gabriela; Wipf, Peter; Johnson, Jacob; O'Neil, Michael; Grögl, Max; Magill, Alan J.; Lazo, John S.

    2009-01-01

    Patients with clinical manifestations of leishmaniasis, including cutaneous leishmaniasis, have limited treatment options, and existing therapies frequently have significant untoward liabilities. Rapid expansion in the diversity of available cutaneous leishmanicidal chemotypes is the initial step in finding alternative efficacious treatments. To this end, we combined a low-stringency Leishmania major promastigote growth inhibition assay with a structural computational filtering algorithm. After a rigorous assay validation process, we interrogated ∼200,000 unique compounds for L. major promastigote growth inhibition. Using iterative computational filtering of the compounds exhibiting >50% inhibition, we identified 553 structural clusters and 640 compound singletons. Secondary confirmation assays yielded 93 compounds with EC50s ≤ 1 µM, with none of the identified chemotypes being structurally similar to known leishmanicidals and most having favorable in silico predicted bioavailability characteristics. The leishmanicidal activity of a representative subset of 15 chemotypes was confirmed in two independent assay formats, and L. major parasite specificity was demonstrated by assaying against a panel of human cell lines. Thirteen chemotypes inhibited the growth of a L. major axenic amastigote-like population. Murine in vivo efficacy studies using one of the new chemotypes document inhibition of footpad lesion development. These results authenticate that low stringency, large-scale compound screening combined with computational structure filtering can rapidly expand the chemotypes targeting in vitro and in vivo Leishmania growth and viability. PMID:19888337

  13. Computer-aided drug discovery

    PubMed Central

    Bajorath, Jürgen

    2015-01-01

    Computational approaches are an integral part of interdisciplinary drug discovery research. Understanding the science behind computational tools, their opportunities, and limitations is essential to make a true impact on drug discovery at different levels. If applied in a scientifically meaningful way, computational methods improve the ability to identify and evaluate potential drug molecules, but there remain weaknesses in the methods that preclude naïve applications. Herein, current trends in computer-aided drug discovery are reviewed, and selected computational areas are discussed. Approaches are highlighted that aid in the identification and optimization of new drug candidates. Emphasis is put on the presentation and discussion of computational concepts and methods, rather than case studies or application examples. As such, this contribution aims to provide an overview of the current methodological spectrum of computational drug discovery for a broad audience. PMID:26949519

  14. Performance/Design Requirements and Detailed Technical Description for a Computer-Directed Training Subsystem for Integration into the Air Force Phase II Base Level System.

    ERIC Educational Resources Information Center

    Butler, A. K.; And Others

    The performance/design requirements and a detailed technical description for a Computer-Directed Training Subsystem to be integrated into the Air Force Phase II Base Level System are described. The subsystem may be used for computer-assisted lesson construction and has presentation capability for on-the-job training for data automation, staff, and…

  15. Identification of RNA polymerase III-transcribed Alu loci by computational screening of RNA-Seq data

    PubMed Central

    Conti, Anastasia; Carnevali, Davide; Bollati, Valentina; Fustinoni, Silvia; Pellegrini, Matteo; Dieci, Giorgio

    2015-01-01

    Of the ∼1.3 million Alu elements in the human genome, only a tiny number are estimated to be active in transcription by RNA polymerase (Pol) III. Tracing the individual loci from which Alu transcripts originate is complicated by their highly repetitive nature. By exploiting RNA-Seq data sets and unique Alu DNA sequences, we devised a bioinformatic pipeline allowing us to identify Pol III-dependent transcripts of individual Alu elements. When applied to ENCODE transcriptomes of seven human cell lines, this search strategy identified ∼1300 Alu loci corresponding to detectable transcripts, with ∼120 of them expressed in at least three cell lines. In vitro transcription of selected Alus did not reflect their in vivo expression properties, and required the native 5′-flanking region in addition to internal promoter. We also identified a cluster of expressed AluYa5-derived transcription units, juxtaposed to snaR genes on chromosome 19, formed by a promoter-containing left monomer fused to an Alu-unrelated downstream moiety. Autonomous Pol III transcription was also revealed for Alus nested within Pol II-transcribed genes. The ability to investigate Alu transcriptomes at single-locus resolution will facilitate both the identification of novel biologically relevant Alu RNAs and the assessment of Alu expression alteration under pathological conditions. PMID:25550429

  16. Compositional modeling in porous media using constant volume flash and flux computation without the need for phase identification

    SciTech Connect

    Polívka, Ondřej; Mikyška, Jiří

    2014-09-01

    The paper deals with the numerical solution of a compositional model describing compressible two-phase flow of a mixture composed of several components in porous media with species transfer between the phases. The mathematical model is formulated by means of the extended Darcy's laws for all phases, components continuity equations, constitutive relations, and appropriate initial and boundary conditions. The splitting of components among the phases is described using a new formulation of the local thermodynamic equilibrium which uses volume, temperature, and moles as specification variables. The problem is solved numerically using a combination of the mixed-hybrid finite element method for the total flux discretization and the finite volume method for the discretization of transport equations. A new approach to numerical flux approximation is proposed, which does not require the phase identification and determination of correspondence between the phases on adjacent elements. The time discretization is carried out by the backward Euler method. The resulting large system of nonlinear algebraic equations is solved by the Newton–Raphson iterative method. We provide eight examples of different complexity to show reliability and robustness of our approach.
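
    The Newton-Raphson step used for the final nonlinear algebraic system can be sketched generically. The toy 2x2 residuals below stand in for the actual compositional equations, which are far larger:

```python
def newton(f, jac, x, tol=1e-10, max_iter=50):
    """Newton-Raphson for a 2x2 system: solve J*dx = -f(x) each step
    (Jacobian assumed nonsingular) and update until the residual is small."""
    for _ in range(max_iter):
        fx = f(x)
        if max(abs(v) for v in fx) < tol:
            return x
        j = jac(x)
        det = j[0][0] * j[1][1] - j[0][1] * j[1][0]
        dx0 = (-fx[0] * j[1][1] + fx[1] * j[0][1]) / det
        dx1 = (-fx[1] * j[0][0] + fx[0] * j[1][0]) / det
        x = [x[0] + dx0, x[1] + dx1]
    raise RuntimeError("did not converge")

# Toy residuals: x^2 + y^2 = 4 and x*y = 1
f = lambda x: [x[0] ** 2 + x[1] ** 2 - 4.0, x[0] * x[1] - 1.0]
jac = lambda x: [[2 * x[0], 2 * x[1]], [x[1], x[0]]]
root = newton(f, jac, [2.0, 0.0])
print([round(v, 6) for v in root])
```

    In the paper's setting the same iteration is applied to the much larger discretized system, with the Jacobian assembled from the mixed-hybrid finite element and finite volume discretizations.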

  17. Evaluation of a wireless wearable tongue-computer interface by individuals with high-level spinal cord injuries

    NASA Astrophysics Data System (ADS)

    Huo, Xueliang; Ghovanloo, Maysam

    2010-04-01

    The tongue drive system (TDS) is an unobtrusive, minimally invasive, wearable and wireless tongue-computer interface (TCI), which can infer its users' intentions, represented in their volitional tongue movements, by detecting the position of a small permanent magnetic tracer attached to the users' tongues. Any specific tongue movements can be translated into user-defined commands and used to access and control various devices in the users' environments. The latest external TDS (eTDS) prototype is built on a wireless headphone and interfaced to a laptop PC and a powered wheelchair. Using customized sensor signal processing algorithms and graphical user interface, the eTDS performance was evaluated by 13 naive subjects with high-level spinal cord injuries (C2-C5) at the Shepherd Center in Atlanta, GA. Results of the human trial show that an average information transfer rate of 95 bits/min was achieved for computer access with 82% accuracy. This information transfer rate is about two times higher than the EEG-based BCIs that are tested on human subjects. It was also demonstrated that the subjects had immediate and full control over the powered wheelchair to the extent that they were able to perform complex wheelchair navigation tasks, such as driving through an obstacle course.
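
    Information transfer rates like the 95 bits/min quoted above are conventionally computed with the Wolpaw formula. The command count and selection rate in this sketch are assumptions for illustration, not the study's settings:

```python
import math

def wolpaw_itr(n_targets, accuracy, selections_per_min):
    """Wolpaw information transfer rate in bits/min: bits per selection
    times selection rate, where bits/selection depends on the number of
    possible commands and the classification accuracy."""
    p = accuracy
    bits = math.log2(n_targets)
    if 0 < p < 1:
        bits += p * math.log2(p) + (1 - p) * math.log2((1 - p) / (n_targets - 1))
    return bits * selections_per_min

# e.g. 6 commands at 82% accuracy, one selection per second (hypothetical):
print(round(wolpaw_itr(6, 0.82, 60), 1))
```

    The formula makes the trade-off explicit: a larger command set raises the ceiling per selection, while misclassifications (here 18%) discount it.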

  19. Identification of the wave speed and the second viscosity of cavitation flows with 2D RANS computations - Part I

    NASA Astrophysics Data System (ADS)

    Decaix, J.; Alligné, S.; Nicolet, C.; Avellan, F.; Münch, C.

    2015-12-01

    1D hydro-electric models are useful to predict dynamic behaviour of hydro-power plants. Regarding vortex rope and cavitation surge in Francis turbines, the 1D models require some inputs that can be provided by numerical simulations. In this paper, a 2D cavitating Venturi is considered. URANS computations are performed to investigate the dynamic behaviour of the cavitation sheet depending on the frequency variation of the outlet pressure. The results are used to calibrate and to assess the reliability of the 1D models.

  20. Causal Attributions of Success and Failure Made by Undergraduate Students in an Introductory-Level Computer Programming Course

    ERIC Educational Resources Information Center

    Hawi, N.

    2010-01-01

    The purpose of this research is to identify the causal attributions of business computing students in an introductory computer programming course, in the computer science department at Notre Dame University, Louaize. Forty-five male and female undergraduates who completed the computer programming course that extended for a 13-week semester…

  1. Early Identification of Students Predicted to Enroll in Advanced, Upper-Level High School Courses: An Examination of Validity

    ERIC Educational Resources Information Center

    DeRose, Diego S.; Clement, Russell W.

    2011-01-01

    Broward County Public Schools' Research Services department uses logistic regression analysis to compute an indicator to predict student enrollment in advanced high school courses, for students entering ninth grade for the first time. This prediction indicator, along with other student characteristics, supports high school guidance staffs in…

  2. Computational Identification of Protein Pupylation Sites by Using Profile-Based Composition of k-Spaced Amino Acid Pairs.

    PubMed

    Hasan, Md Mehedi; Zhou, Yuan; Lu, Xiaotian; Li, Jinyan; Song, Jiangning; Zhang, Ziding

    2015-01-01

    Prokaryotic proteins are regulated by pupylation, a type of post-translational modification that contributes to cellular function in bacterial organisms. In the pupylation process, tagging with the prokaryotic ubiquitin-like protein (Pup) is functionally analogous to ubiquitination, marking target proteins for proteasomal degradation. To date, several experimental methods have been developed to identify pupylated proteins and their pupylation sites, but these experimental methods are generally laborious and costly. Therefore, computational methods that can accurately predict potential pupylation sites based on protein sequence information are highly desirable. In this paper, a novel predictor termed as pbPUP has been developed for accurate prediction of pupylation sites. In particular, a sophisticated sequence encoding scheme [i.e. the profile-based composition of k-spaced amino acid pairs (pbCKSAAP)] is used to represent the sequence patterns and evolutionary information of the sequence fragments surrounding pupylation sites. Then, a Support Vector Machine (SVM) classifier is trained using the pbCKSAAP encoding scheme. The final pbPUP predictor achieves an AUC value of 0.849 in 10-fold cross-validation tests and outperforms other existing predictors on a comprehensive independent test dataset. The proposed method is anticipated to be a helpful computational resource for the prediction of pupylation sites. The web server and curated datasets in this study are freely available at http://protein.cau.edu.cn/pbPUP/. PMID:26080082
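
    The plain CKSAAP encoding that pbCKSAAP builds on follows directly from its definition: for each spacing k, count how often each ordered amino acid pair occurs k residues apart in the fragment, then normalize. A minimal sketch (the dict-of-counts representation is an assumption; the paper's profile-based variant additionally weights pairs by PSSM-derived probabilities, which is omitted here):

```python
from itertools import product

AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"

def cksaap(fragment, k_max=2):
    """Composition of k-spaced amino acid pairs: for each gap size k,
    the normalized count of every ordered residue pair (a, b) with
    exactly k residues between them."""
    features = {}
    for k in range(k_max + 1):
        n_pairs = len(fragment) - k - 1
        counts = {pair: 0 for pair in product(AMINO_ACIDS, repeat=2)}
        for i in range(n_pairs):
            pair = (fragment[i], fragment[i + k + 1])
            if pair in counts:
                counts[pair] += 1
        for pair, c in counts.items():
            features[(k, pair)] = c / n_pairs if n_pairs > 0 else 0.0
    return features
```

    Each fragment thus maps to a fixed-length feature vector (400 dimensions per k value) suitable as SVM input.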

  3. Computational Identification of Protein Pupylation Sites by Using Profile-Based Composition of k-Spaced Amino Acid Pairs

    PubMed Central

    Lu, Xiaotian; Li, Jinyan; Song, Jiangning; Zhang, Ziding

    2015-01-01

    Prokaryotic proteins are regulated by pupylation, a type of post-translational modification that contributes to cellular function in bacterial organisms. In the pupylation process, tagging with the prokaryotic ubiquitin-like protein (Pup) is functionally analogous to ubiquitination, marking target proteins for proteasomal degradation. To date, several experimental methods have been developed to identify pupylated proteins and their pupylation sites, but these experimental methods are generally laborious and costly. Therefore, computational methods that can accurately predict potential pupylation sites based on protein sequence information are highly desirable. In this paper, a novel predictor termed as pbPUP has been developed for accurate prediction of pupylation sites. In particular, a sophisticated sequence encoding scheme [i.e. the profile-based composition of k-spaced amino acid pairs (pbCKSAAP)] is used to represent the sequence patterns and evolutionary information of the sequence fragments surrounding pupylation sites. Then, a Support Vector Machine (SVM) classifier is trained using the pbCKSAAP encoding scheme. The final pbPUP predictor achieves an AUC value of 0.849 in 10-fold cross-validation tests and outperforms other existing predictors on a comprehensive independent test dataset. The proposed method is anticipated to be a helpful computational resource for the prediction of pupylation sites. The web server and curated datasets in this study are freely available at http://protein.cau.edu.cn/pbPUP/. PMID:26080082

  4. Identification of error making patterns in lesion detection on digital breast tomosynthesis using computer-extracted image features

    NASA Astrophysics Data System (ADS)

    Wang, Mengyu; Zhang, Jing; Grimm, Lars J.; Ghate, Sujata V.; Walsh, Ruth; Johnson, Karen S.; Lo, Joseph Y.; Mazurowski, Maciej A.

    2016-03-01

    Digital breast tomosynthesis (DBT) can improve lesion visibility by eliminating the issue of overlapping breast tissue present in mammography. However, this new modality likely requires new approaches to training. The issue of training in DBT is not well explored. We propose a computer-aided educational approach for DBT training. Our hypothesis is that the trainees' educational outcomes will improve if they are presented with cases individually selected to address their weaknesses. In this study, we focus on the question of how to select such cases. Specifically, we propose an algorithm that, based on previously acquired reading data, predicts which lesions a trainee will miss in future cases (i.e., we focus on false negative errors). A logistic regression classifier was used to predict the likelihood of trainee error, with computer-extracted features as the predictors. Reader data from 3 expert breast imagers were used to establish the ground truth, and reader data from 5 radiology trainees were used to evaluate the algorithm's performance with repeated holdout cross-validation. Receiver operating characteristic (ROC) analysis was applied to measure the performance of the proposed individual trainee models. The preliminary experimental results for the 5 trainees showed that the individual trainee models were able to distinguish the lesions that would be detected from those that would be missed, with an average area under the ROC curve of 0.639 (95% CI, 0.580-0.698). The proposed algorithm can be used to identify difficult cases for individual trainees.
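
    A minimal sketch of the modeling setup described above, using toy features and labels: a logistic regression trained by stochastic gradient descent, and the ROC area computed via the rank-sum identity. All names, hyperparameters, and data here are illustrative, not from the paper.

```python
import math

def train_logreg(X, y, lr=0.1, epochs=1000):
    """Logistic regression via per-sample gradient descent; returns a
    function mapping a feature vector to P(error)."""
    w = [0.0] * len(X[0])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            z = b + sum(wj * xj for wj, xj in zip(w, xi))
            p = 1.0 / (1.0 + math.exp(-z))
            g = p - yi                       # gradient of log-loss w.r.t. z
            w = [wj - lr * g * xj for wj, xj in zip(w, xi)]
            b -= lr * g

    def predict(x):
        z = b + sum(wj * xj for wj, xj in zip(w, x))
        return 1.0 / (1.0 + math.exp(-z))
    return predict

def auc(scores, labels):
    """Area under the ROC curve: fraction of positive/negative score
    pairs ranked correctly (ties count half)."""
    pos = [s for s, l in zip(scores, labels) if l == 1]
    neg = [s for s, l in zip(scores, labels) if l == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))
```

    An AUC of 0.639, as reported, means a randomly chosen missed lesion outscores a randomly chosen detected one about 64% of the time.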

  5. Computer Music

    NASA Astrophysics Data System (ADS)

    Cook, Perry R.

    This chapter covers algorithms, technologies, computer languages, and systems for computer music. Computer music involves the application of computers and other digital/electronic technologies to music composition, performance, theory, history, and the study of perception. The field combines digital signal processing, computational algorithms, computer languages, hardware and software systems, acoustics, psychoacoustics (low-level perception of sounds from the raw acoustic signal), and music cognition (higher-level perception of musical style, form, emotion, etc.).
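
    The digital signal processing layer the chapter mentions starts from primitives such as a sampled sine oscillator; a minimal sketch (sample rate and amplitude are illustrative values, not from the chapter):

```python
import math

def sine_samples(freq_hz, dur_s, sr=44100, amp=0.5):
    """Generate raw PCM samples of a sine tone: amp * sin(2*pi*f*t)
    evaluated at discrete sample times t = i / sr."""
    n = int(sr * dur_s)
    return [amp * math.sin(2 * math.pi * freq_hz * i / sr) for i in range(n)]
```

    Summing several such oscillators at harmonic frequencies gives additive synthesis, one of the basic techniques of the field.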

  6. Level-set reconstruction algorithm for ultrafast limited-angle X-ray computed tomography of two-phase flows

    PubMed Central

    Bieberle, M.; Hampel, U.

    2015-01-01

    Tomographic image reconstruction is based on recovering an object distribution from its projections, which have been acquired from all angular views around the object. If the angular range is limited to less than 180° of parallel projections, typical reconstruction artefacts arise when using standard algorithms. To compensate for this, specialized algorithms using a priori information about the object need to be applied. The application behind this work is ultrafast limited-angle X-ray computed tomography of two-phase flows. Here, only a binary distribution of the two phases needs to be reconstructed, which reduces the complexity of the inverse problem. To solve it, a new reconstruction algorithm (LSR) based on the level-set method is proposed. It includes one force function term accounting for matching the projection data and one incorporating a curvature-dependent smoothing of the phase boundary. The algorithm has been validated using simulated as well as measured projections of known structures, and its performance has been compared to the algebraic reconstruction technique and a binary derivative of it. The validation as well as the application of the level-set reconstruction on a dynamic two-phase flow demonstrated its applicability and its advantages over other reconstruction algorithms. PMID:25939623

  7. Level-set reconstruction algorithm for ultrafast limited-angle X-ray computed tomography of two-phase flows.

    PubMed

    Bieberle, M; Hampel, U

    2015-06-13

    Tomographic image reconstruction is based on recovering an object distribution from its projections, which have been acquired from all angular views around the object. If the angular range is limited to less than 180° of parallel projections, typical reconstruction artefacts arise when using standard algorithms. To compensate for this, specialized algorithms using a priori information about the object need to be applied. The application behind this work is ultrafast limited-angle X-ray computed tomography of two-phase flows. Here, only a binary distribution of the two phases needs to be reconstructed, which reduces the complexity of the inverse problem. To solve it, a new reconstruction algorithm (LSR) based on the level-set method is proposed. It includes one force function term accounting for matching the projection data and one incorporating a curvature-dependent smoothing of the phase boundary. The algorithm has been validated using simulated as well as measured projections of known structures, and its performance has been compared to the algebraic reconstruction technique and a binary derivative of it. The validation as well as the application of the level-set reconstruction on a dynamic two-phase flow demonstrated its applicability and its advantages over other reconstruction algorithms. PMID:25939623

  8. Identification of MicroRNAs and Transcript Targets in Camelina sativa by Deep Sequencing and Computational Methods

    PubMed Central

    Poudel, Saroj; Aryal, Niranjan; Lu, Chaofu

    2015-01-01

    Camelina sativa is an annual oilseed crop that is under intensive development for renewable resources of biofuels and industrial oils. MicroRNAs, or miRNAs, are endogenously encoded small RNAs that play key roles in diverse plant biological processes. Here, we conducted deep sequencing on small RNA libraries prepared from camelina leaves, flower buds and two stages of developing seeds corresponding to initial and peak storage products accumulation. Computational analyses identified 207 known miRNAs belonging to 63 families, as well as 5 novel miRNAs. These miRNAs, especially members of the miRNA families, varied greatly in different tissues and developmental stages. The predicted miRNA target genes are involved in a broad range of physiological functions including lipid metabolism. This report is the first step toward elucidating roles of miRNAs in C. sativa and will provide additional tools to improve this oilseed crop for biofuels and biomaterials. PMID:25826400

  9. Identification of microRNAs and transcript targets in Camelina sativa by deep sequencing and computational methods.

    PubMed

    Poudel, Saroj; Aryal, Niranjan; Lu, Chaofu

    2015-01-01

    Camelina sativa is an annual oilseed crop that is under intensive development for renewable resources of biofuels and industrial oils. MicroRNAs, or miRNAs, are endogenously encoded small RNAs that play key roles in diverse plant biological processes. Here, we conducted deep sequencing on small RNA libraries prepared from camelina leaves, flower buds and two stages of developing seeds corresponding to initial and peak storage products accumulation. Computational analyses identified 207 known miRNAs belonging to 63 families, as well as 5 novel miRNAs. These miRNAs, especially members of the miRNA families, varied greatly in different tissues and developmental stages. The predicted miRNA target genes are involved in a broad range of physiological functions including lipid metabolism. This report is the first step toward elucidating roles of miRNAs in C. sativa and will provide additional tools to improve this oilseed crop for biofuels and biomaterials. PMID:25826400

  10. Computational Method for the Systematic Identification of Analog Series and Key Compounds Representing Series and Their Biological Activity Profiles.

    PubMed

    Stumpfe, Dagmar; Dimova, Dilyana; Bajorath, Jürgen

    2016-08-25

    A computational methodology is introduced for detecting all unique series of analogs in large compound data sets, regardless of chemical relationships between analogs. No prior knowledge of core structures or R-groups is required, which are automatically determined. The approach is based upon the generation of retrosynthetic matched molecular pairs and analog networks from which distinct series are isolated. The methodology was applied to systematically extract more than 17 000 distinct series from the ChEMBL database. For comparison, analog series were also isolated from screening compounds and drugs. Known biological activities were mapped to series from ChEMBL, and in more than 13 000 of these series, key compounds were identified that represented substitution sites of all analogs within a series and its complete activity profile. The analog series, key compounds, and activity profiles are made freely available as a resource for medicinal chemistry applications. PMID:27501131
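
    The grouping step the method rests on, isolating distinct series as connected components of an analog network built from matched molecular pairs, can be sketched with a union-find; real retrosynthetic MMP generation needs a cheminformatics toolkit, so the pairs are assumed as input here and compound identifiers are placeholders.

```python
def analog_series(mmp_pairs):
    """Group compounds into analog series: each matched molecular pair
    links two compounds differing at a single site, and connected
    components of the resulting network are the series."""
    parent = {}

    def find(x):
        parent.setdefault(x, x)
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving
            x = parent[x]
        return x

    def union(a, b):
        parent[find(a)] = find(b)

    for a, b in mmp_pairs:
        union(a, b)
    series = {}
    for c in parent:
        series.setdefault(find(c), set()).add(c)
    return list(series.values())
```

    A key compound, in the paper's sense, would then be a member of a series whose substituted positions cover all analogs in that component.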

  11. Computational modeling of inhibition of voltage-gated Ca channels: identification of different effects on uterine and cardiac action potentials

    PubMed Central

    Tong, Wing-Chiu; Ghouri, Iffath; Taggart, Michael J.

    2014-01-01

    The uterus and heart share the important physiological feature whereby contractile activation of the muscle tissue is regulated by the generation of periodic, spontaneous electrical action potentials (APs). Preterm birth arising from premature uterine contractions is a major complication of pregnancy and there remains a need to pursue avenues of research that facilitate the use of drugs, tocolytics, to limit these inappropriate contractions without deleterious actions on cardiac electrical excitation. A novel approach is to make use of mathematical models of uterine and cardiac APs, which incorporate many ionic currents contributing to the AP forms, and test the cell-specific responses to interventions. We have used three such models—of uterine smooth muscle cells (USMC), cardiac sinoatrial node cells (SAN), and ventricular cells—to investigate the relative effects of reducing two important voltage-gated Ca currents—the L-type (ICaL) and T-type (ICaT) Ca currents. Reduction of ICaL (10%) alone, or ICaT (40%) alone, blunted USMC APs with little effect on ventricular APs and only mild effects on SAN activity. Larger reductions in either current further attenuated the USMC APs but with also greater effects on SAN APs. Encouragingly, a combination of ICaL and ICaT reduction did blunt USMC APs as intended with little detriment to APs of either cardiac cell type. Subsequent overlapping maps of ICaL and ICaT inhibition profiles from each model revealed a range of combined reductions of ICaL and ICaT over which an appreciable diminution of USMC APs could be achieved with no deleterious action on cardiac SAN or ventricular APs. This novel approach illustrates the potential for computational biology to inform us of possible uterine and cardiac cell-specific mechanisms. Incorporating such computational approaches in future studies directed at designing new, or repurposing existing, tocolytics will be beneficial for establishing a desired uterine specificity of action

  12. The First Results of Testing Methods and Algorithms for Automatic Real Time Identification of Waveforms Introduction from Local Earthquakes in Increased Level of Man-induced Noises for the Purposes of Ultra-short-term Warning about an Occurred Earthquake

    NASA Astrophysics Data System (ADS)

    Gravirov, V. V.; Kislov, K. V.

    2009-12-01

    The chief hazard posed by earthquakes consists in their suddenness. The number of earthquakes annually recorded is in excess of 100,000; of these, over 1000 are strong ones. Great human losses usually occur because no devices exist for advance warning of earthquakes. It is therefore high time that automatic mobile information systems were developed for the analysis of seismic information at high levels of man-made noise. The systems should operate in real time with the minimum possible computational delays and be able to make fast decisions. The chief statement of the project is that sufficiently complete information about an earthquake can be obtained in real time by examining its first onset as recorded by a single seismic sensor or a local seismic array. The essential difference from existing systems consists in the following: analysis of local seismic data at high levels of man-made noise (that is, when the noise level may be above the seismic signal level), as well as self-contained operation. The algorithms developed during the execution of the project will be suitable for individual personal protection kits and for warning the population in earthquake-prone areas around the world. The system being developed in this project uses both P and S waves. The difference in the velocities of these seismic waves permits a technique to be developed for identifying a damaging earthquake. Real-time analysis of first onsets yields the time that remains before surface waves arrive and the damage potential of those waves. Estimates show that, when the distance between the earthquake epicenter and the monitored site is of order 200 km, the time difference between the arrivals of P waves and surface waves will be about 30 seconds, which is quite sufficient to evacuate people from potentially hazardous spaces, insert moderators at nuclear power stations, interlock pipelines, stop transportation, and issue warnings to rescue services
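
    The quoted ~30-second window follows from the travel-time difference between fast P waves and slower surface waves. A sketch of that arithmetic, assuming straight-line propagation at typical crustal speeds (the velocity values are assumptions, not from the abstract):

```python
def lead_time_s(distance_km, v_p=6.0, v_surf=3.0):
    """Seconds between P-wave arrival and surface-wave arrival at a
    site distance_km from the epicenter, given wave speeds in km/s."""
    return distance_km / v_surf - distance_km / v_p
```

    At 200 km this gives roughly 33 s of warning, consistent with the abstract's estimate; the window shrinks linearly as the site gets closer to the epicenter.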

  13. SU-E-I-74: Image-Matching Technique of Computed Tomography Images for Personal Identification: A Preliminary Study Using Anthropomorphic Chest Phantoms

    SciTech Connect

    Matsunobu, Y; Shiotsuki, K; Morishita, J

    2015-06-15

    Purpose: Fingerprints, dental impressions, and DNA are used to identify unidentified bodies in forensic medicine. Cranial computed tomography (CT) images and/or dental radiographs are also used for identification. Radiological identification is important, particularly in the absence of comparative fingerprints, dental impressions, and DNA samples. The development of an automated radiological identification system for unidentified bodies is desirable. We investigated the potential usefulness of bone structure for matching chest CT images. Methods: CT images of three anthropomorphic chest phantoms were obtained on different days in various settings. One of the phantoms was assumed to be an unidentified body. The bone image and the bone image with soft tissue (BST image) were extracted from the CT images. To examine the usefulness of the bone image and/or the BST image, the similarities between the two-dimensional (2D) or three-dimensional (3D) images of the same and different phantoms were evaluated in terms of the normalized cross-correlation value (NCC). Results: For the 2D and 3D BST images, the NCCs obtained from the same phantom assumed to be an unidentified body (2D, 0.99; 3D, 0.93) were higher than those for the different phantoms (2D, 0.95 and 0.91; 3D, 0.89 and 0.80). The NCCs for the same phantom (2D, 0.95; 3D, 0.88) were greater compared to those of the different phantoms (2D, 0.61 and 0.25; 3D, 0.23 and 0.10) for the bone image. The difference in the NCCs between the same and different phantoms tended to be larger for the bone images than for the BST images. These findings suggest that the image-matching technique is more useful when utilizing the bone image than when utilizing the BST image to identify different people. Conclusion: This preliminary study indicated that evaluating the similarity of bone structure in 2D and 3D images is potentially useful for identifying an unidentified body.
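
    The normalized cross-correlation used as the similarity score can be written directly from its definition; a minimal pure-Python sketch over same-sized images flattened to pixel lists (the flattened-list representation is an assumption for illustration):

```python
import math

def ncc(image_a, image_b):
    """Normalized cross-correlation of two same-sized images given as
    flattened pixel lists: covariance of the two images divided by the
    product of their standard deviations, in [-1, 1]."""
    n = len(image_a)
    mean_a = sum(image_a) / n
    mean_b = sum(image_b) / n
    da = [a - mean_a for a in image_a]
    db = [b - mean_b for b in image_b]
    num = sum(x * y for x, y in zip(da, db))
    den = math.sqrt(sum(x * x for x in da) * sum(y * y for y in db))
    return num / den if den else 0.0
```

    Identical images score 1.0; the study's matching decision amounts to checking whether the same-phantom NCC exceeds the different-phantom NCCs.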

  14. The use of isodose levels to interpret radiation induced lung injury: a quantitative analysis of computed tomography changes

    PubMed Central

    Knoll, Miriam A.; Sheu, Ren Dih; Knoll, Abraham D.; Kerns, Sarah L.; Lo, Yeh-Chi; Rosenzweig, Kenneth E.

    2016-01-01

    Background Patients treated with stereotactic body radiation therapy (SBRT) for lung cancer are often found to have radiation-induced lung injury (RILI) surrounding the treated tumor. We investigated whether treatment isodose levels could predict RILI. Methods Thirty-seven lung lesions in 32 patients were treated with SBRT and received post-treatment follow-up (FU) computed tomography (CT). Each CT was fused with the original simulation CT and treatment isodose levels were overlaid. The RILI surrounding the treated lesion was contoured. The RILI extension index [fibrosis extension index (FEI)] was defined as the volume of RILI extending outside a given isodose level relative to the total volume of RILI and was expressed as a percentage. Results Univariate analysis revealed that the planning target volume (PTV) was positively correlated with RILI volume at FU: correlation coefficient (CC) = 0.628 and P<0.0001 at 1st FU; CC = 0.401 and P=0.021 at 2nd FU; CC = 0.265 and P=0.306 at 3rd FU. FEI-40 Gy at 1st FU was significantly positively correlated with FEI-40 Gy at subsequent FUs (CC = 0.689 and P=6.5×10⁻⁵ comparing 1st and 2nd FU; CC = 0.901 and P=0.020 comparing 2nd and 3rd FU). Ninety-six percent of the RILI was found within the 20 Gy isodose line. Sixty-five percent of patients were found to have a decrease in RILI on the 2nd FU CT. Conclusions We have shown that RILI evolves over time and that the 1st FU CT correlates well with subsequent CTs. Ninety-six percent of the RILI occurs within the 20 Gy isodose lines, which may prove beneficial to radiologists attempting to distinguish recurrence vs. RILI. PMID:26981453
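
    The fibrosis extension index is defined in the abstract as the percentage of RILI volume lying outside a given isodose level; a sketch assuming both regions are represented as sets of voxel coordinates (the set representation is mine, for illustration):

```python
def fibrosis_extension_index(rili_voxels, isodose_voxels):
    """FEI for one isodose level: percentage of RILI voxels falling
    outside the region enclosed by that isodose line."""
    outside = rili_voxels - isodose_voxels
    return 100.0 * len(outside) / len(rili_voxels)
```

    Under this definition, the reported finding that 96% of RILI lies within the 20 Gy line corresponds to an FEI-20 Gy of about 4%.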

  15. Using single-species toxicity tests, community-level responses, and toxicity identification evaluations to investigate effluent impacts

    SciTech Connect

    Maltby, L.; Clayton, S.A.; Yu, H.; McLoughlin, N.; Wood, R.M.; Yin, D.

    2000-01-01

    Whole effluent toxicity (WET) tests are increasingly used to monitor compliance of consented discharges, but few studies have related toxicity measured using WET tests to receiving water impacts. Here the authors adopt a four-stage procedure to investigate the toxicity and biological impact of a point source discharge and to identify the major toxicants. In stage 1, standard WET tests were employed to determine the toxicity of the effluent. This was then followed by an assessment of receiving water toxicity using in situ deployment of indigenous (Gammarus pulex) and standard (Daphnia magna) test species. The third stage involved the use of biological survey techniques to assess the impact of the discharge on the structure and functioning of the benthic macroinvertebrate community. In stage 4, toxicity identification evaluations (TIE) were used to identify toxic components in the effluent. Receiving-water toxicity and ecological impact detected downstream of the discharge were consistent with the results of WET tests performed on the effluent. Downstream of the discharge, there was a reduction in D. magna survival, in G. pulex survival and feeding rate, in detritus processing, and in biotic indices based on macroinvertebrate community structure. The TIE studies suggested that chlorine was the principal toxicant in the effluent.

  16. High-level waste storage tank farms/242-A evaporator standards/requirements identification document (S/RID), Vol. 2

    SciTech Connect

    Not Available

    1994-04-01

    The Quality Assurance Functional Area Requirements Identification Document (RID) addresses the programmatic requirements that ensure risks and environmental impacts are minimized and that safety, reliability, and performance are maximized, through the application of effective management systems commensurate with the risks posed by the Tank Farm Facility and its operation. This RID incorporates guidance intended to provide Tank Farms management with the requirements information necessary to develop, upgrade, or assess the effectiveness of a Quality Assurance Program in the performance of organizational and functional activities. Quality Assurance is defined as all those planned and systematic actions necessary to provide adequate confidence that a facility, structure, system, or component will perform satisfactorily and safely in service. This document provides the specific requirements to meet DNFSB recommendations and the guidance provided in DOE Order 5700.6C, utilizing industry codes, standards, regulatory guidelines, and industry good practices that have proven to be essential elements of an effective and efficient Quality Assurance Program as the nuclear industry has matured over the last thirty years.

  17. High-level waste storage tank farms/242-A evaporator standards/requirements identification document (S/RID), Vol. 4

    SciTech Connect

    Not Available

    1994-04-01

    Radiation protection of personnel and the public is accomplished by establishing a well defined Radiation Protection Organization to ensure that appropriate controls on radioactive materials and radiation sources are implemented and documented. This Requirements Identification Document (RID) applies to the activities, personnel, structures, systems, components, and programs involved in executing the mission of the Tank Farms. The physical boundaries within which the requirements of this RID apply are the Single Shell Tank Farms, Double Shell Tank Farms, 242-A Evaporator-Crystallizer, 242-S, T Evaporators, Liquid Effluent Retention Facility (LERF), Purgewater Storage Facility (PWSF), and all interconnecting piping, valves, instrumentation, and controls. Also included is all piping, valves, instrumentation, and controls up to and including the most remote valve under Tank Farms control at any other Hanford Facility having an interconnection with Tank Farms. The boundary of the structures, systems, components, and programs to which this RID applies, is defined by those that are dedicated to and/or under the control of the Tank Farms Operations Department and are specifically implemented at the Tank Farms.

  18. Identification of a functional SNP in the 3'-UTR of caprine MTHFR gene that is associated with milk protein levels.

    PubMed

    An, Xiaopeng; Song, Yuxuan; Hou, Jinxing; Wang, Shan; Gao, Kexin; Cao, Binyun

    2016-08-01

    Xinong Saanen (n = 305) and Guanzhong (n = 317) dairy goats were used to detect SNPs in the caprine MTHFR 3'-UTR by DNA sequencing. One novel SNP (c.*2494G>A) was identified in the said region. Individuals with the AA genotype had greater milk protein levels than did those with the GG genotype at the c.*2494G>A locus in both dairy goat breeds (P < 0.05). Functional assays indicated that the MTHFR:c.*2494G>A substitution could increase the binding activity of bta-miR-370 with the MTHFR 3'-UTR. In addition, we observed a significant increase in the MTHFR protein level of AA carriers relative to that of GG carriers. These altered levels of MTHFR protein may account for the association of the SNP with milk protein level. PMID:27062401

  19. Computational identification of a transiently open L1/S3 pocket for reactivation of mutant p53

    PubMed Central

    Wassman, Christopher D.; Baronio, Roberta; Demir, Özlem; Wallentine, Brad D.; Chen, Chiung-Kuang; Hall, Linda V.; Salehi, Faezeh; Lin, Da-Wei; Chung, Benjamin P.; Wesley Hatfield, G.; Richard Chamberlin, A.; Luecke, Hartmut; Lathrop, Richard H.; Kaiser, Peter; Amaro, Rommie E.

    2013-01-01

    The tumour suppressor p53 is the most frequently mutated gene in human cancer. Reactivation of mutant p53 by small molecules is an exciting potential cancer therapy. Although several compounds restore wild-type function to mutant p53, their binding sites and mechanisms of action are elusive. Here computational methods identify a transiently open binding pocket between loop L1 and sheet S3 of the p53 core domain. Mutation of residue Cys124, located at the centre of the pocket, abolishes p53 reactivation of mutant R175H by PRIMA-1, a known reactivation compound. Ensemble-based virtual screening against this newly revealed pocket selects stictic acid as a potential p53 reactivation compound. In human osteosarcoma cells, stictic acid exhibits dose-dependent reactivation of p21 expression for mutant R175H more strongly than does PRIMA-1. These results indicate the L1/S3 pocket as a target for pharmaceutical reactivation of p53 mutants. PMID:23360998

  20. Identification of evolutionarily conserved Momordica charantia microRNAs using computational approach and its utility in phylogeny analysis.

    PubMed

    Thirugnanasambantham, Krishnaraj; Saravanan, Subramanian; Karikalan, Kulandaivelu; Bharanidharan, Rajaraman; Lalitha, Perumal; Ilango, S; HairulIslam, Villianur Ibrahim

    2015-10-01

    Momordica charantia (bitter gourd, bitter melon) is a monoecious Cucurbitaceae with anti-oxidant, anti-microbial, anti-viral and anti-diabetic potential. Molecular studies on this economically valuable plant are very essential to understand its phylogeny and evolution. MicroRNAs (miRNAs) are conserved, small, non-coding RNAs with the ability to regulate gene expression by binding the 3' UTR region of target mRNAs, and they evolve at different rates in different plant species. In this study we utilized a homology-based computational approach and identified 27 mature miRNAs for the first time from this bio-medically important plant. The phylogenetic tree developed from binary data on the presence/absence of the identified miRNAs was found to be uncertain and biased. Most of the identified miRNAs were highly conserved among plant species, and sequence-based phylogeny analysis of the miRNAs resolved the above difficulties in the miRNA phylogeny approach. Predicted gene targets of the identified miRNAs revealed their importance in the regulation of plant developmental processes. The reported miRNAs showed sequence conservation in their mature forms, and detailed phylogeny analysis of pre-miRNA sequences revealed genus-specific segregation of clusters. PMID:25988220

  1. Identification of MicroRNAs and transcript targets in Camelina sativa by deep sequencing and computational methods

    DOE PAGES Beta

    Poudel, Saroj; Aryal, Niranjan; Lu, Chaofu; Wang, Tai

    2015-03-31

    Camelina sativa is an annual oilseed crop that is under intensive development for renewable resources of biofuels and industrial oils. MicroRNAs, or miRNAs, are endogenously encoded small RNAs that play key roles in diverse plant biological processes. Here, we conducted deep sequencing on small RNA libraries prepared from camelina leaves, flower buds and two stages of developing seeds corresponding to initial and peak storage products accumulation. Computational analyses identified 207 known miRNAs belonging to 63 families, as well as 5 novel miRNAs. These miRNAs, especially members of the miRNA families, varied greatly in different tissues and developmental stages. The predicted miRNA target genes are involved in a broad range of physiological functions including lipid metabolism. This report is the first step toward elucidating roles of miRNAs in C. sativa and will provide additional tools to improve this oilseed crop for biofuels and biomaterials.

  2. Preoperative Identification of a Perforator Using Computed Tomography Angiography and Metal Clip Marking in Perforator Flap Reconstruction

    PubMed Central

    Lee, Jung Woo; Kim, Han Kyeol; Kim, Sin Rak; Han, Yea Sik

    2015-01-01

    In perforator flap reconstruction, vascular mapping using preoperative computed tomography (CT) angiography is widely used to confirm the existence and location of an appropriate perforator. This study proposes a rapid, accurate, and convenient method for marking the perforator location on the skin surface. For 12 patients who underwent perforator flap reconstruction between November 2011 and November 2013, metal clips were fixed on the skin surface at the anticipated perforator locations, which were decided using a handheld Doppler. CT angiography was used to compare the location between the metal clip and the actual perforator. The metal clip was moved and repositioned, if needed, on the basis of the CT images. The locations of the appropriate perforator and the metal clip, which were observed during the surgery, were then compared. In CT angiography, the mean distance between the metal clip and the perforator was 3±3.9 mm, and the mean distance that was measured during surgery was 0.8±0.8 mm. In conclusion, we report a simple, rapid, and precise technique to indicate the accurate location of the appropriate perforator on the skin surface. PMID:25606494

  3. Identification of a scaled-model riser dynamics through a combined computer vision and adaptive Kalman filter approach

    NASA Astrophysics Data System (ADS)

    Trigo, F. C.; Martins, F. P. R.; Fleury, A. T.; Silva, H. C.

    2014-02-01

    Aiming at overcoming the difficulties derived from the traditional camera calibration methods to record the underwater environment of a towing tank where experiments of scaled-model risers are carried on, a computer vision method, combining traditional image processing algorithms and a self-calibration technique was implemented. This method was used to identify the coordinates of control-points viewed on a scaled-model riser submitted to a periodic force applied to its fairlead attachment point. To study the observed motion, the riser was represented as a pseudo-rigid body model (PRBM) and the hypotheses of compliant mechanisms theory were assumed in order to cope with its elastic behavior. The derived Lagrangian equations of motion were linearized and expressed as a state-space model in which the state variables include the generalized coordinates and the unknown generalized forces. The state-vector thus assembled is estimated through a Kalman Filter. The estimation procedure allows the determination of both the generalized forces and the tension along the cable, with statistically proven convergence.
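The Kalman-filter estimation step described above can be illustrated with a minimal scalar example: estimating a slowly varying quantity from noisy measurements. The random-walk state model, noise variances, and data below are illustrative assumptions, not the riser dynamics from the paper.

```python
# Minimal scalar Kalman filter sketch: estimate a slowly varying state
# from noisy measurements. Model, noise variances and data are illustrative;
# the paper's filter runs on a multi-dimensional state-space model.

def kalman_1d(measurements, q=1e-4, r=0.25, x0=0.0, p0=1.0):
    """q: process noise variance, r: measurement noise variance."""
    x, p = x0, p0
    estimates = []
    for z in measurements:
        # Predict (random-walk state model: x_k = x_{k-1} + w_k)
        p = p + q
        # Update with measurement z
        k = p / (p + r)          # Kalman gain
        x = x + k * (z - x)
        p = (1 - k) * p
        estimates.append(x)
    return estimates

zs = [1.2, 0.9, 1.1, 1.05, 0.95, 1.0]   # noisy readings around a true value of 1.0
est = kalman_1d(zs)
print(est[-1])   # final estimate converges toward 1.0
```

Augmenting the state vector with unknown inputs, as in the paper, turns the same predict/update cycle into a joint state-and-force estimator.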

  4. A Combination of Screening and Computational Approaches for the Identification of Novel Compounds That Decrease Mast Cell Degranulation

    PubMed Central

    McShane, Marisa P.; Friedrichson, Tim; Giner, Angelika; Meyenhofer, Felix; Barsacchi, Rico; Bickle, Marc

    2015-01-01

    High-content screening of compound libraries poses various challenges in the early steps in drug discovery such as gaining insights into the mode of action of the selected compounds. Here, we addressed these challenges by integrating two biological screens through bioinformatics and computational analysis. We screened a small-molecule library enriched in amphiphilic compounds in a degranulation assay in rat basophilic leukemia 2H3 (RBL-2H3) cells. The same library was rescreened in a high-content image-based endocytosis assay in HeLa cells. This assay was previously applied to a genome-wide RNAi screen that produced quantitative multiparametric phenotypic profiles for genes that directly or indirectly affect endocytosis. By correlating the endocytic profiles of the compounds with the genome-wide siRNA profiles, we identified candidate pathways that may be inhibited by the compounds. Among these, we focused on the Akt pathway and validated its inhibition in HeLa and RBL-2H3 cells. We further showed that the compounds inhibited the translocation of the Akt-PH domain to the plasma membrane. The approach performed here can be used to integrate chemical and functional genomics screens for investigating the mechanism of action of compounds. PMID:25838434

  5. Computational identification of conserved microRNAs and their putative targets in the Hypericum perforatum L. flower transcriptome.

    PubMed

    Galla, Giulio; Volpato, Mirko; Sharbel, Timothy F; Barcaccia, Gianni

    2013-09-01

    MicroRNAs (miRNAs) have recently emerged as important regulators of gene expression in plants. Many miRNA families and their targets have been extensively studied in model species and major crops. We have characterized mature miRNAs along with their precursors and potential targets in Hypericum to generate a comprehensive list of conserved miRNA families and to investigate the regulatory role of selected miRNAs in biological processes that occur in the flower. St. John's wort (Hypericum perforatum L., 2n = 4x = 32), a medicinal plant that produces pharmaceutically important metabolites with therapeutic activities, was chosen because it is regarded as an attractive model system for the study of apomixis. A computational in silico prediction of structure, in combination with an in vitro validation, allowed us to identify 7 pre-miRNAs, including miR156, miR166, miR390, miR394, miR396, and miR414. We demonstrated that H. perforatum flowers share highly conserved miRNAs and that these miRNAs potentially target dozens of genes with a wide range of molecular functions, including metabolism, response to stress, flower development, and plant reproduction. Our analysis paves the way toward identifying flower-specific miRNAs that may differentiate the sexual and apomictic reproductive pathways. PMID:23846415
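Plant miRNA target prediction of the kind described above is commonly based on near-perfect complementarity between the miRNA and a site on the transcript. The scoring scheme below (mismatch counting with half-penalty G:U wobbles) and the sequences are illustrative, not the study's actual pipeline.

```python
# Sketch of miRNA target prediction by complementarity scoring: slide the
# miRNA along a transcript, score each site by counting mismatches, and
# allow G:U wobble pairs at half penalty. Sequences/thresholds illustrative.

PAIRS = {("A", "U"), ("U", "A"), ("G", "C"), ("C", "G")}
WOBBLE = {("G", "U"), ("U", "G")}

def site_score(mirna, site):
    """Mismatch count; G:U wobbles count as half a mismatch.
    The miRNA pairs with the site in antiparallel orientation, so the
    miRNA read 5'->3' is compared against the reversed site."""
    mismatches = 0.0
    for m, t in zip(mirna, reversed(site)):
        if (m, t) in PAIRS:
            continue
        if (m, t) in WOBBLE:
            mismatches += 0.5
        else:
            mismatches += 1.0
    return mismatches

def predict_targets(mirna, transcript, max_mismatches=3.0):
    """Return (offset, score) for candidate sites under the threshold."""
    k = len(mirna)
    return [(i, site_score(mirna, transcript[i:i + k]))
            for i in range(len(transcript) - k + 1)
            if site_score(mirna, transcript[i:i + k]) <= max_mismatches]

# "UUCUGUCA" is the reverse complement of the toy miRNA "UGACAGAA".
print(predict_targets("UGACAGAA", "GGUUCUGUCAAAC", max_mismatches=1.0))  # → [(2, 0.0)]
```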

  6. Greater-than-Class C low-level radioactive waste shipping package/container identification and requirements study. National Low-Level Waste Management Program

    SciTech Connect

    Tyacke, M.

    1993-08-01

    This report identifies a variety of shipping packages (also referred to as casks) and waste containers currently available or being developed that could be used for greater-than-Class C (GTCC) low-level waste (LLW). Since GTCC LLW varies greatly in size, shape, and activity levels, the casks and waste containers that could be used range in size from small, to accommodate a single sealed radiation source, to very large-capacity casks/canisters used to transport or dry-store highly radioactive spent fuel. In some cases, the waste containers may serve directly as shipping packages, while in other cases, the containers would need to be placed in a transport cask. For the purpose of this report, it is assumed that the generator is responsible for transporting the waste to a Department of Energy (DOE) storage, treatment, or disposal facility. Unless DOE establishes specific acceptance criteria, the receiving facility would need the capability to accept any of the casks and waste containers identified in this report. In identifying potential casks and waste containers, no consideration was given to their adequacy relative to handling, storage, treatment, and disposal. Those considerations must be addressed separately as the capabilities of the receiving facility and the handling requirements and operations are better understood.

  7. CAL-laborate: A Collaborative Publication on the Use of Computer Aided Learning for Tertiary Level Physical Sciences and Geosciences.

    ERIC Educational Resources Information Center

    Fernandez, Anne, Ed.; Sproats, Lee, Ed.; Sorensen, Stacey, Ed.

    2000-01-01

    The science community has been trying to use computers in teaching for many years. There has been much conformity in how this was to be achieved, and the wheel has been re-invented again and again as enthusiast after enthusiast has "done their bit" towards getting computers accepted. Computers are now used by science undergraduates (as well as…

  8. Identification of AGO3-Associated miRNAs and Computational Prediction of Their Targets in the Green Alga Chlamydomonas reinhardtii

    PubMed Central

    Voshall, Adam; Kim, Eun-Jeong; Ma, Xinrong; Moriyama, Etsuko N.; Cerutti, Heriberto

    2015-01-01

    The unicellular green alga Chlamydomonas reinhardtii harbors many types of small RNAs (sRNAs) but little is known about their role(s) in the regulation of endogenous genes and cellular processes. To define functional microRNAs (miRNAs) in Chlamydomonas, we characterized sRNAs associated with an argonaute protein, AGO3, by affinity purification and deep sequencing. Using a stringent set of criteria for canonical miRNA annotation, we identified 39 precursor miRNAs, which produce 45 unique, AGO3-associated miRNA sequences including 13 previously reported miRNAs and 32 novel ones. Potential miRNA targets were identified based on the complementarity of miRNAs with candidate binding sites on transcripts and classified, depending on the extent of complementarity, as being likely to be regulated through cleavage or translational repression. The search for cleavage targets identified 74 transcripts. However, only 6 of them showed an increase in messenger RNA (mRNA) levels in a mutant strain almost devoid of sRNAs. The search for translational repression targets, which used complementarity criteria more stringent than those empirically required for a reduction in target protein levels, identified 488 transcripts. However, unlike observations in metazoans, most predicted translation repression targets did not show appreciable changes in transcript abundance in the absence of sRNAs. Additionally, of three candidate targets examined at the protein level, only one showed a moderate variation in polypeptide amount in the mutant strain. Our results emphasize the difficulty in identifying genuine miRNA targets in Chlamydomonas and suggest that miRNAs, under standard laboratory conditions, might have mainly a modulatory role in endogenous gene regulation in this alga. PMID:25769981

  9. OPTICAL correlation identification technology applied in underwater laser imaging target identification

    NASA Astrophysics Data System (ADS)

    Yao, Guang-Tao; Zhang, Xiao-Hui; Ge, Wei-Long

    2011-11-01

    Underwater laser imaging detection is an effective method of detecting short-range targets underwater and an important complement to sonar detection. With the development of underwater laser imaging and underwater vehicle technology, underwater automatic target identification has received increasing attention and remains a research challenge in underwater optical imaging information processing. Today, underwater automatic target identification based on optical imaging is usually realized through digital software programming, which makes algorithm implementation and control very flexible. However, optical imaging yields 2D or even 3D images, and the amount of information to process is large, so purely digital electronic hardware requires long identification times and can hardly meet real-time demands. Parallel computer processing can improve identification speed, but it increases complexity, size and power consumption. This paper applies optical correlation identification technology to underwater automatic target identification. Optical correlation identification exploits the Fourier-transform property of a Fourier lens, which performs the Fourier transform of image information on a nanosecond timescale; optical free-space interconnection computing is parallel, fast, high-capacity and high-resolution, and combining it with the computational and control flexibility of digital circuits yields a hybrid optoelectronic identification mode. We derive the theoretical formulation of correlation identification, analyze its principle, and present MATLAB simulations. 
We identify single frames obtained by underwater range-gated laser imaging, and through identifying and locating the target at different positions, we can improve
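The correlator's principle, computing a cross-correlation via Fourier transforms and locating the target at the correlation peak, can be sketched numerically. This is a 1-D digital analogy of the optical process, with a deliberately naive DFT and illustrative signals; an optical correlator performs the transform step with lenses.

```python
# Digital sketch of correlation identification: cross-correlate a scene
# with a target template via the correlation theorem (DFT, conjugate
# multiply, inverse DFT). A naive O(n^2) DFT keeps the example
# dependency-free; signals and sizes are illustrative.
import cmath

def dft(x):
    n = len(x)
    return [sum(x[t] * cmath.exp(-2j * cmath.pi * k * t / n)
                for t in range(n)) for k in range(n)]

def idft(X):
    n = len(X)
    return [sum(X[k] * cmath.exp(2j * cmath.pi * k * t / n)
                for k in range(n)) / n for t in range(n)]

def circular_correlation(scene, template):
    """r[t] = sum_m scene[m + t] * template[m] (indices mod n)."""
    S, T = dft(scene), dft(template)
    return [c.real for c in idft([s * t.conjugate() for s, t in zip(S, T)])]

template = [0, 1, 1, 0, 1, 0, 0, 0]
scene = [0, 0, 0, 1, 1, 0, 1, 0]          # template circularly shifted right by 2
corr = circular_correlation(scene, template)
peak = max(range(len(corr)), key=lambda i: corr[i])
print(peak)  # correlation peak sits at the target's shift: 2
```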

  10. OPTICAL correlation identification technology applied in underwater laser imaging target identification

    NASA Astrophysics Data System (ADS)

    Yao, Guang-tao; Zhang, Xiao-hui; Ge, Wei-long

    2012-01-01

    Underwater laser imaging detection is an effective method of detecting short-range targets underwater and an important complement to sonar detection. With the development of underwater laser imaging and underwater vehicle technology, underwater automatic target identification has received increasing attention and remains a research challenge in underwater optical imaging information processing. Today, underwater automatic target identification based on optical imaging is usually realized through digital software programming, which makes algorithm implementation and control very flexible. However, optical imaging yields 2D or even 3D images, and the amount of information to process is large, so purely digital electronic hardware requires long identification times and can hardly meet real-time demands. Parallel computer processing can improve identification speed, but it increases complexity, size and power consumption. This paper applies optical correlation identification technology to underwater automatic target identification. Optical correlation identification exploits the Fourier-transform property of a Fourier lens, which performs the Fourier transform of image information on a nanosecond timescale; optical free-space interconnection computing is parallel, fast, high-capacity and high-resolution, and combining it with the computational and control flexibility of digital circuits yields a hybrid optoelectronic identification mode. We derive the theoretical formulation of correlation identification, analyze its principle, and present MATLAB simulations. 
We identify single frames obtained by underwater range-gated laser imaging, and through identifying and locating the target at different positions, we can improve

  11. Identification of novel candidate drivers connecting different dysfunctional levels for lung adenocarcinoma using protein-protein interactions and a shortest path approach

    PubMed Central

    Chen, Lei; Huang, Tao; Zhang, Yu-Hang; Jiang, Yang; Zheng, Mingyue; Cai, Yu-Dong

    2016-01-01

    Tumors are formed by the abnormal proliferation of somatic cells with disordered growth regulation under the influence of tumorigenic factors. Recently, the theory of “cancer drivers” connects tumor initiation with several specific mutations in the so-called cancer driver genes. According to the differentiation of four basic levels between tumor and adjacent normal tissues, the cancer drivers can be divided into the following: (1) Methylation level, (2) microRNA level, (3) mutation level, and (4) mRNA level. In this study, a computational method is proposed to identify novel lung adenocarcinoma drivers based on dysfunctional genes on the methylation, microRNA, mutation and mRNA levels. First, a large network was constructed using protein-protein interactions. Next, we searched all of the shortest paths connecting dysfunctional genes on different levels and extracted new candidate genes lying on these paths. Finally, the obtained candidate genes were filtered by a permutation test and an additional strict selection procedure involving a betweenness ratio and an interaction score. Several candidate genes remained, which are deemed to be related to two different levels of cancer. The analyses confirmed our assertions that some have the potential to contribute to the tumorigenesis process on multiple levels. PMID:27412431
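The shortest-path step described above can be sketched with breadth-first search over an unweighted interaction network, collecting the intermediate genes that lie on shortest paths between dysfunctional genes of two different levels. The toy network below is illustrative, not real protein-protein interaction data, and the paper's later filtering (permutation test, betweenness ratio, interaction score) is omitted.

```python
# Sketch of candidate-driver extraction: BFS shortest paths in an
# unweighted PPI graph between dysfunctional genes of two levels;
# genes strictly inside those paths become candidates. Toy data.
from collections import deque

def shortest_path(graph, src, dst):
    """BFS shortest path in an unweighted graph; node list or None."""
    prev, seen, q = {}, {src}, deque([src])
    while q:
        u = q.popleft()
        if u == dst:
            path = [dst]
            while path[-1] != src:
                path.append(prev[path[-1]])
            return path[::-1]
        for v in graph.get(u, ()):
            if v not in seen:
                seen.add(v)
                prev[v] = u
                q.append(v)
    return None

def candidate_genes(graph, level_a, level_b):
    """Genes lying strictly between dysfunctional genes of two levels."""
    cands = set()
    for a in level_a:
        for b in level_b:
            path = shortest_path(graph, a, b)
            if path:
                cands.update(path[1:-1])
    return cands - set(level_a) - set(level_b)

# G1 (e.g. methylation level) connects to G2 (e.g. mRNA level) via X and Y.
ppi = {"G1": ["X"], "X": ["G1", "Y"], "Y": ["X", "G2"], "G2": ["Y"]}
print(candidate_genes(ppi, ["G1"], ["G2"]))  # intermediate genes X and Y
```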

  12. Identification of novel candidate drivers connecting different dysfunctional levels for lung adenocarcinoma using protein-protein interactions and a shortest path approach.

    PubMed

    Chen, Lei; Huang, Tao; Zhang, Yu-Hang; Jiang, Yang; Zheng, Mingyue; Cai, Yu-Dong

    2016-01-01

    Tumors are formed by the abnormal proliferation of somatic cells with disordered growth regulation under the influence of tumorigenic factors. Recently, the theory of "cancer drivers" connects tumor initiation with several specific mutations in the so-called cancer driver genes. According to the differentiation of four basic levels between tumor and adjacent normal tissues, the cancer drivers can be divided into the following: (1) Methylation level, (2) microRNA level, (3) mutation level, and (4) mRNA level. In this study, a computational method is proposed to identify novel lung adenocarcinoma drivers based on dysfunctional genes on the methylation, microRNA, mutation and mRNA levels. First, a large network was constructed using protein-protein interactions. Next, we searched all of the shortest paths connecting dysfunctional genes on different levels and extracted new candidate genes lying on these paths. Finally, the obtained candidate genes were filtered by a permutation test and an additional strict selection procedure involving a betweenness ratio and an interaction score. Several candidate genes remained, which are deemed to be related to two different levels of cancer. The analyses confirmed our assertions that some have the potential to contribute to the tumorigenesis process on multiple levels. PMID:27412431

  13. Computer Music

    NASA Astrophysics Data System (ADS)

    Cook, Perry

    This chapter covers algorithms, technologies, computer languages, and systems for computer music. Computer music involves the application of computers and other digital/electronic technologies to music composition, performance, theory, history, and perception. The field combines digital signal processing, computational algorithms, computer languages, hardware and software systems, acoustics, psychoacoustics (low-level perception of sounds from the raw acoustic signal), and music cognition (higher-level perception of musical style, form, emotion, etc.). Although most people would think that analog synthesizers and electronic music substantially predate the use of computers in music, many experiments and complete computer music systems were being constructed and used as early as the 1950s.

  14. Identification of proteins whose synthesis is preferentially enhanced by polyamines at the level of translation in mammalian cells.

    PubMed

    Nishimura, Kazuhiro; Okudaira, Hiroyuki; Ochiai, Eriko; Higashi, Kyohei; Kaneko, Mayumi; Ishii, Itsuko; Nishimura, Tomoe; Dohmae, Naoshi; Kashiwagi, Keiko; Igarashi, Kazuei

    2009-11-01

    In Escherichia coli, several proteins whose synthesis is enhanced by polyamines at the level of translation have been identified. We looked for proteins that are similarly regulated in eukaryotes using a mouse mammary carcinoma FM3A cell culture system. Polyamine deficiency was induced by adding an inhibitor of ornithine decarboxylase, alpha-difluoromethylornithine, to the medium. Proteins enhanced by polyamines were determined by comparison of protein levels in control and polyamine-deficient cells using two-dimensional gel electrophoresis, and were identified by Edman degradation and/or LC/MALDI-TOF/TOF tandem mass spectrometry. Polyamine stimulation of the synthesis of these proteins at the level of translation was confirmed by measuring levels of the corresponding mRNAs and proteins, and levels of the [(35)S]methionine pulse-labeled proteins. The proteins identified in this way were T-complex protein 1, beta subunit (Cct2); heterogeneous nuclear ribonucleoprotein L (Hnrpl); and phosphoglycerate mutase 1 (Pgam1). Since Cct2 was most strongly enhanced by polyamines among three proteins, the mechanism of polyamine stimulation of Cct2 synthesis was studied using NIH3T3 cells transiently transfected with genes encoding Cct2-EGFP fusion mRNA with normal or mutated 5'-untranslated region (5'-UTR) of Cct2 mRNA. Polyamines most likely enhanced ribosome shunting on the 5'-UTR of Cct2 mRNA. PMID:19427401

  15. Identification of framework residues in a secreted recombinant antibody fragment that control production level and localization in Escherichia coli.

    PubMed

    Forsberg, G; Forsgren, M; Jaki, M; Norin, M; Sterky, C; Enhörning, A; Larsson, K; Ericsson, M; Björk, P

    1997-05-01

    The monoclonal antibody 5T4, directed against a human tumor-associated antigen, was expressed as a secreted Fab superantigen fusion protein in Escherichia coli. The product is a putative agent for immunotherapy of non-small cell lung cancer. During fermentation, most of the fusion protein leaked out from the periplasm to the growth medium at a level of approximately 40 mg/liter. This level was notably low compared with similar products containing identical CH1, CL, and superantigen moieties, and the Fv framework was therefore engineered. Using hybrid molecules, the light chain was found to limit high expression levels. Substituting five residues in VL increased the level almost 15 times, exceeding 500 mg/liter in the growth medium. Here, the substitutions Phe-10 --> Ser, Thr-45 --> Lys, Thr-77 --> Ser, and Leu-78 --> Val were most powerful. In addition, replacing four VH residues diminished cell lysis during fermentation. Thereby the product was preferentially located in the periplasm instead of the growth medium, and the total yield was more than 700 mg/liter. All engineered products retained a high affinity for the tumor-associated antigen. It is suggested that at least some of the identified framework residues generally have to be replaced to obtain high level production of recombinant Fab products in E. coli. PMID:9139690

  16. Computational Identification Raises a Riddle for Distribution of Putative NACHT NTPases in the Genome of Early Green Plants

    PubMed Central

    Arya, Preeti; Acharya, Vishal

    2016-01-01

    NACHT NTPases and AP-ATPases belong to the STAND (signal transduction ATPases with numerous domains) P-loop NTPase class, which is known to be involved in defense signaling pathways and apoptosis regulation. The AP-ATPases (also known as NB-ARC) and NACHT NTPases are widespread throughout all kingdoms of life except plants, where only AP-ATPases have been extensively studied, in the context of the plant defense response against pathogen invasion and the hypersensitive response (HR). In the present study, we employed a genome-wide survey (using stringent computational analysis) of 67 diverse organisms, viz., archaebacteria, cyanobacteria, fungi, animalia and plantae, to revisit the evolutionary history of these two STAND P-loop NTPases. This analysis revealed the presence of NACHT NTPases in early green plants (green algae and the lycophyte), which had not been previously reported. These NACHT NTPases are known to be involved in diverse functional activities, such as transcription regulation in addition to defense signaling cascades, depending on the domain association. In Chlamydomonas reinhardtii, a green alga, WD40 repeats found at the carboxyl terminus of NACHT NTPases suggest a probable role in apoptosis regulation. Moreover, the genome of Selaginella moellendorffii, an extant lycophyte, intriguingly shows considerable numbers of both AP-ATPases and NACHT NTPases, in contrast to the large repertoire of AP-ATPases in plants, and emerges as an important node in the evolutionary tree of life. The large complement of AP-ATPases may have overtaken the function of NACHT NTPases and is a plausible reason for the absence of the latter in the plant lineages. The presence of NACHT NTPases in the early green plants and the phyletic patterns resulting from this study raise a quandary about the distribution of this STAND P-loop NTPase, with apparent horizontal gene transfer from cyanobacteria. PMID:26930396

  17. Identification and Validation of Novel Hedgehog-Responsive Enhancers Predicted by Computational Analysis of Ci/Gli Binding Site Density

    PubMed Central

    Richards, Neil; Parker, David S.; Johnson, Lisa A.; Allen, Benjamin L.; Barolo, Scott; Gumucio, Deborah L.

    2015-01-01

    The Hedgehog (Hh) signaling pathway directs a multitude of cellular responses during embryogenesis and adult tissue homeostasis. Stimulation of the pathway results in activation of Hh target genes by the transcription factor Ci/Gli, which binds to specific motifs in genomic enhancers. In Drosophila, only a few enhancers (patched, decapentaplegic, wingless, stripe, knot, hairy, orthodenticle) have been shown by in vivo functional assays to depend on direct Ci/Gli regulation. All but one (orthodenticle) contain more than one Ci/Gli site, prompting us to directly test whether homotypic clustering of Ci/Gli binding sites is sufficient to define a Hh-regulated enhancer. We therefore developed a computational algorithm to identify Ci/Gli clusters that are enriched over random expectation, within a given region of the genome. Candidate genomic regions containing Ci/Gli clusters were functionally tested in chicken neural tube electroporation assays and in transgenic flies. Of the 22 Ci/Gli clusters tested, seven novel enhancers (and the previously known patched enhancer) were identified as Hh-responsive and Ci/Gli-dependent in one or both of these assays, including: Cuticular protein 100A (Cpr100A); invected (inv), which encodes an engrailed-related transcription factor expressed at the anterior/posterior wing disc boundary; roadkill (rdx), the fly homolog of vertebrate Spop; the segment polarity gene gooseberry (gsb); and two previously untested regions of the Hh receptor-encoding patched (ptc) gene. We conclude that homotypic Ci/Gli clustering is not sufficient information to ensure Hh-responsiveness; however, it can provide a clue for enhancer recognition within putative Hedgehog target gene loci. PMID:26710299
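The cluster-detection idea above, flagging genomic regions where binding sites occur more densely than expected, can be sketched with a sliding-window motif scan. The degenerate consensus, window size, and sequence below are illustrative assumptions, not the paper's actual algorithm parameters (a commonly cited Ci/Gli core consensus is TGGGTGGTC, the reverse complement of GACCACCCA).

```python
# Sketch of homotypic binding-site cluster detection: find motif matches,
# then report motif-anchored windows containing at least min_sites matches.
# Consensus pattern, window size and sequence are illustrative.
import re

# Degenerate Ci/Gli-like core consensus, for illustration only.
MOTIF = re.compile(r"TGGG[AT]GGTC")

def motif_clusters(seq, window=60, min_sites=2):
    """(window_start, site_count) for windows anchored at each motif hit."""
    starts = [m.start() for m in MOTIF.finditer(seq)]
    return [(s, sum(1 for t in starts if s <= t < s + window))
            for s in starts
            if sum(1 for t in starts if s <= t < s + window) >= min_sites]

# Two sites within 60 bp (a candidate cluster) plus one isolated site.
seq = "A" * 10 + "TGGGTGGTC" + "C" * 20 + "TGGGAGGTC" + "G" * 100 + "TGGGTGGTC" + "A" * 10
print(motif_clusters(seq))  # → [(10, 2)]
```

A real implementation would score clusters against a background model (e.g. shuffled sequence) rather than use a fixed count threshold, which is exactly where the enrichment-over-random-expectation test in the paper comes in.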

  18. Computational Identification Raises a Riddle for Distribution of Putative NACHT NTPases in the Genome of Early Green Plants.

    PubMed

    Arya, Preeti; Acharya, Vishal

    2016-01-01

    NACHT NTPases and AP-ATPases belongs to STAND (signal transduction ATPases with numerous domain) P-loop NTPase class, which are known to be involved in defense signaling pathways and apoptosis regulation. The AP-ATPases (also known as NB-ARC) and NACHT NTPases are widely spread throughout all kingdoms of life except in plants, where only AP-ATPases have been extensively studied in the scenario of plant defense response against pathogen invasion and in hypersensitive response (HR). In the present study, we have employed a genome-wide survey (using stringent computational analysis) of 67 diverse organisms viz., archaebacteria, cyanobacteria, fungi, animalia and plantae to revisit the evolutionary history of these two STAND P-loop NTPases. This analysis divulged the presence of NACHT NTPases in the early green plants (green algae and the lycophyte) which had not been previously reported. These NACHT NTPases were known to be involved in diverse functional activities such as transcription regulation in addition to the defense signaling cascades depending on the domain association. In Chalmydomonas reinhardtii, a green algae, WD40 repeats found to be at the carboxyl-terminus of NACHT NTPases suggest probable role in apoptosis regulation. Moreover, the genome of Selaginella moellendorffii, an extant lycophyte, intriguingly shows the considerable number of both AP-ATPases and NACHT NTPases in contrast to a large repertoire of AP-ATPases in plants and emerge as an important node in the evolutionary tree of life. The large complement of AP-ATPases overtakes the function of NACHT NTPases and plausible reason behind the absence of the later in the plant lineages. The presence of NACHT NTPases in the early green plants and phyletic patterns results from this study raises a quandary for the distribution of this STAND P-loop NTPase with the apparent horizontal gene transfer from cyanobacteria. PMID:26930396

  19. Computational identification of novel biochemical systems involved in oxidation, glycosylation and other complex modifications of bases in DNA

    PubMed Central

    Iyer, Lakshminarayan M.; Zhang, Dapeng; Maxwell Burroughs, A.; Aravind, L.

    2013-01-01

    Discovery of the TET/JBP family of dioxygenases that modify bases in DNA has sparked considerable interest in novel DNA base modifications and their biological roles. Using sensitive sequence and structure analyses combined with contextual information from comparative genomics, we computationally characterize over 12 novel biochemical systems for DNA modifications. We predict previously unidentified enzymes, such as the kinetoplastid J-base generating glycosyltransferase (and its homolog GREB1), the catalytic specificity of bacteriophage TET/JBP proteins and their role in complex DNA base modifications. We also predict the enzymes involved in synthesis of hypermodified bases such as alpha-glutamylthymine and alpha-putrescinylthymine that have remained enigmatic for several decades. Moreover, the current analysis suggests that bacteriophages and certain nucleo-cytoplasmic large DNA viruses contain an unexpectedly diverse range of DNA modification systems, in addition to those using previously characterized enzymes such as Dam, Dcm, TET/JBP, pyrimidine hydroxymethylases, Mom and glycosyltransferases. These include enzymes generating modified bases such as deazaguanines related to queuine and archaeosine, pyrimidines comparable with lysidine, those derived using modified S-adenosyl methionine derivatives and those using TET/JBP-generated hydroxymethyl pyrimidines as biosynthetic starting points. We present evidence that some of these modification systems are also widely dispersed across prokaryotes and certain eukaryotes such as basidiomycetes, chlorophyte and stramenopile alga, where they could serve as novel epigenetic marks for regulation or discrimination of self from non-self DNA. Our study extends the role of the PUA-like fold domains in recognition of modified nucleic acids and predicts versions of the ASCH and EVE domains to be novel ‘readers’ of modified bases in DNA. These results open opportunities for the investigation of the biology of these systems

  20. Computational identification of conserved transcription factor binding sites upstream of genes induced in rat brain by transient focal ischemic stroke.

    PubMed

    Pulliam, John V K; Xu, Zhenfeng; Ford, Gregory D; Liu, Cuimei; Li, Yonggang; Stovall, Kyndra C; Cannon, Virginetta S; Tewolde, Teclemichael; Moreno, Carlos S; Ford, Byron D

    2013-02-01

    Microarray analysis has been used to understand how gene regulation plays a critical role in neuronal injury, survival and repair following ischemic stroke. To identify the transcriptional regulatory elements responsible for ischemia-induced gene expression, we examined gene expression profiles of rat brains following focal ischemia and performed computational analysis of consensus transcription factor binding sites (TFBS) in the genes of the dataset. In this study, rats were sacrificed 24 h after middle cerebral artery occlusion (MCAO) stroke and gene transcription in brain tissues following ischemia/reperfusion was examined using Affymetrix GeneChip technology. The CONserved transcription FACtor binding site (CONFAC) software package was used to identify over-represented TFBS in the upstream promoter regions of ischemia-induced genes compared to control datasets. CONFAC identified 12 TFBS that were statistically over-represented from our dataset of ischemia-induced genes, including three members of the Ets-1 family of transcription factors (TFs). Microarray results showed that mRNA for Ets-1 was increased following tMCAO but not pMCAO. Immunohistochemical analysis of Ets-1 protein in rat brains following MCAO showed that Ets-1 was highly expressed in neurons in the brain of sham control animals. Ets-1 protein expression was virtually abolished in injured neurons of the ischemic brain but was unchanged in peri-infarct brain areas. These data indicate that TFs, including Ets-1, may influence neuronal injury following ischemia. These findings could provide important insights into the mechanisms that lead to brain injury and could provide avenues for the development of novel therapies. PMID:23246490
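The over-representation test underlying a CONFAC-style analysis can be sketched as a hypergeometric upper-tail p-value: given how many promoters carry a binding site in the induced set versus a background set, how surprising is the observed count? The counts below are illustrative, not the study's data.

```python
# Sketch of TFBS over-representation testing: hypergeometric upper-tail
# probability of seeing k or more site-carrying promoters in a sample of
# n drawn from N background promoters, K of which carry the site.
# All counts are illustrative.
from math import comb

def hypergeom_upper_tail(k, K, n, N):
    """P(X >= k) for a hypergeometric(N, K, n) draw."""
    return sum(comb(K, x) * comb(N - K, n - x)
               for x in range(k, min(K, n) + 1)) / comb(N, n)

# Example: 40 of 200 background promoters carry an Ets-1-like site;
# 10 of the 20 ischemia-induced promoters carry it (expected ~4).
p = hypergeom_upper_tail(k=10, K=40, n=20, N=200)
print(f"{p:.4g}")  # a small p-value suggests over-representation
```

Testing many TFBS this way requires a multiple-testing correction (e.g. Bonferroni or FDR) before declaring any site over-represented.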