Science.gov

Sample records for level computational identification

  1. MODEL IDENTIFICATION AND COMPUTER ALGEBRA

    PubMed Central

    Bollen, Kenneth A.; Bauldry, Shawn

    2011-01-01

    Multiequation models that contain observed or latent variables are common in the social sciences. To determine whether unique parameter values exist for such models, one needs to assess model identification. In practice analysts rely on empirical checks that evaluate the singularity of the information matrix evaluated at sample estimates of parameters. The discrepancy between estimates and population values, the limitations of numerical assessments of ranks, and the difference between local and global identification make this practice less than perfect. In this paper we outline how to use computer algebra systems (CAS) to determine the local and global identification of multiequation models with or without latent variables. We demonstrate a symbolic CAS approach to local identification and develop a CAS approach to obtain explicit algebraic solutions for each of the model parameters. We illustrate the procedures with several examples, including a new proof of the identification of a model for handling missing data using auxiliary variables. We present an identification procedure for Structural Equation Models that makes use of CAS and that is a useful complement to current methods. PMID:21769158
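
    As a hedged illustration of the symbolic approach described above (not the authors' code): local identification can be checked by testing whether the Jacobian of the model-implied moments with respect to the parameters has full column rank, and global identification by solving the moment equations explicitly. The sketch below does both with the open-source CAS sympy for an assumed one-factor, three-indicator model; the model and all symbol names are illustrative.

    ```python
    # Sketch of CAS-based identification checking for a toy one-factor,
    # three-indicator model (illustrative assumption, not the paper's example).
    import sympy as sp

    lam2, lam3, phi, th1, th2, th3 = sp.symbols(
        'lambda2 lambda3 phi theta1 theta2 theta3', positive=True)
    params = sp.Matrix([lam2, lam3, phi, th1, th2, th3])

    # Model-implied covariance moments, with the first factor loading fixed to 1.
    moments = sp.Matrix([
        phi + th1,              # var(y1)
        lam2**2 * phi + th2,    # var(y2)
        lam3**2 * phi + th3,    # var(y3)
        lam2 * phi,             # cov(y1, y2)
        lam3 * phi,             # cov(y1, y3)
        lam2 * lam3 * phi,      # cov(y2, y3)
    ])

    # Local identification: the Jacobian must have full column rank at generic points.
    J = moments.jacobian(params)
    print('Jacobian rank:', J.rank(), 'of', len(params))

    # Global identification: solve the moment equations explicitly for the parameters.
    s11, s22, s33, s12, s13, s23 = sp.symbols('s11 s22 s33 s12 s13 s23', positive=True)
    sample_moments = [s11, s22, s33, s12, s13, s23]
    print(sp.solve([sp.Eq(m, s) for m, s in zip(moments, sample_moments)],
                   list(params), dict=True))
    ```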

  2. MODEL IDENTIFICATION AND COMPUTER ALGEBRA.

    PubMed

    Bollen, Kenneth A; Bauldry, Shawn

    2010-10-01

    Multiequation models that contain observed or latent variables are common in the social sciences. To determine whether unique parameter values exist for such models, one needs to assess model identification. In practice analysts rely on empirical checks that evaluate the singularity of the information matrix evaluated at sample estimates of parameters. The discrepancy between estimates and population values, the limitations of numerical assessments of ranks, and the difference between local and global identification make this practice less than perfect. In this paper we outline how to use computer algebra systems (CAS) to determine the local and global identification of multiequation models with or without latent variables. We demonstrate a symbolic CAS approach to local identification and develop a CAS approach to obtain explicit algebraic solutions for each of the model parameters. We illustrate the procedures with several examples, including a new proof of the identification of a model for handling missing data using auxiliary variables. We present an identification procedure for Structural Equation Models that makes use of CAS and that is a useful complement to current methods.

  3. A method for identification of vertebral level.

    PubMed

    Redfern, R M; Smith, E T

    1986-05-01

    A method of spinal level marking, applicable particularly to thoracolumbar posterior spinal operations, is described. The use of patent blue V dye in this procedure is discussed in a consecutive series of over 100 cases. No serious adverse effects were observed. The technique ensures accurate identification of the marked spinal level and helps to minimize anaesthetic time. PMID:3729267

  4. Computer method for identification of boiler transfer functions

    NASA Technical Reports Server (NTRS)

    Miles, J. H.

    1972-01-01

    An iterative, computer-aided procedure was developed for identifying boiler transfer functions from frequency response data. The method uses the frequency response data to obtain a satisfactory transfer function for both high and low vapor exit quality data.
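
    As a hedged illustration of fitting a transfer function to frequency response data (not the report's actual algorithm), the sketch below performs a Levy-style linearized least-squares fit of an assumed first-order model K/(tau*s + 1) to synthetic frequency response measurements.

    ```python
    # Sketch: linearized least-squares fit of K/(tau*s + 1) to frequency
    # response data H(jw). Model order and data are illustrative assumptions.
    import numpy as np

    rng = np.random.default_rng(0)
    w = np.logspace(-1, 2, 40)                     # frequencies, rad/s
    K_true, tau_true = 2.0, 0.5
    H = K_true / (1j * w * tau_true + 1)           # synthetic "measured" response
    H += 0.01 * (rng.standard_normal(w.size) + 1j * rng.standard_normal(w.size))

    # Levy linearization: K - (jw*H) * tau = H is linear in the unknowns [K, tau].
    A = np.column_stack([np.ones_like(H), -1j * w * H])
    A_ri = np.vstack([A.real, A.imag])             # stack real and imaginary parts
    b_ri = np.concatenate([H.real, H.imag])
    (K_est, tau_est), *_ = np.linalg.lstsq(A_ri, b_ri, rcond=None)

    print(f"estimated K = {K_est:.3f}, tau = {tau_est:.3f} s")
    ```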

  5. Identification of Protein–Excipient Interaction Hotspots Using Computational Approaches

    PubMed Central

    Barata, Teresa S.; Zhang, Cheng; Dalby, Paul A.; Brocchini, Steve; Zloh, Mire

    2016-01-01

    Protein formulation development relies on the selection of excipients that inhibit protein–protein interactions preventing aggregation. Empirical strategies involve screening many excipient and buffer combinations using forced degradation studies. Such methods do not readily provide information on intermolecular interactions responsible for the protective effects of excipients. This study describes a molecular docking approach to screen and rank interactions allowing for the identification of protein–excipient hotspots to aid in the selection of excipients to be experimentally screened. Previously published work with Drosophila Su(dx) was used to develop and validate the computational methodology, which was then used to determine the formulation hotspots for Fab A33. Commonly used excipients were examined and compared to the regions in Fab A33 prone to protein–protein interactions that could lead to aggregation. This approach could provide information on a molecular level about the protective interactions of excipients in protein formulations to aid the more rational development of future formulations. PMID:27258262

  6. Multi-level Programming Paradigm for Extreme Computing

    NASA Astrophysics Data System (ADS)

    Petiton, S.; Sato, M.; Emad, N.; Calvin, C.; Tsuji, M.; Dandouna, M.

    2014-06-01

    Abstract: In order to propose a framework and programming paradigms for post-petascale computing, on the road to exascale computing and beyond, we introduced new languages, associated with a hierarchical multi-level programming paradigm, allowing scientific end-users and developers to program highly hierarchical architectures designed for extreme computing. In this paper, we explain the interest of such a hierarchical multi-level programming paradigm for extreme computing and its suitability for several large computational science applications, such as linear algebra solvers used for reactor core physics. We describe the YML language and framework, which allow graphs of parallel components to be described, developed using PGAS-like languages such as XMP, and scheduled and computed on supercomputers. Then, we report experiments on supercomputers (such as the "K" and "Hooper" machines) with the hybrid method MERAM (Multiple Explicitly Restarted Arnoldi Method) as a case study for iterative methods manipulating sparse matrices, and with the block Gauss-Jordan method as a case study for direct methods manipulating dense matrices. We conclude by proposing evolutions for this programming paradigm.

  7. Multi-level RF identification system

    DOEpatents

    Steele, Kerry D.; Anderson, Gordon A.; Gilbert, Ronald W.

    2004-07-20

    A radio frequency identification system having a radio frequency transceiver for generating a continuous-wave RF interrogation signal that impinges upon an RF identification tag. An oscillation circuit in the RF identification tag modulates the interrogation signal with a subcarrier of a predetermined frequency and returns the modulated signal to the transmitting interrogator. The interrogator recovers and analyzes the subcarrier signal and determines its frequency. The interrogator generates an output indicative of the subcarrier frequency, thereby identifying the responding RFID tag as one of a "class" of RFID tags configured to respond with a subcarrier signal of a predetermined frequency.

  8. New multiplatform computer program for numerical identification of microorganisms.

    PubMed

    Flores, Oscar; Belanche, Lluís A; Blanch, Anicet R

    2009-12-01

    The classification of bacteria by using genomic methods or expensive biochemical-based commercial kits is sometimes beyond the reach of many laboratories that need to perform numerous classifications of unknown bacterial strains in a fast, cheap, and reliable way. A new computer program, Identax, for the computer-assisted identification of microorganisms by using only results obtained from conventional biochemical tests is presented. Identax improves current microbial identification software and provides a multiplatform and user-friendly program. It can be executed from any operating system and can be downloaded without any cost from the Identax website (www.identax.org).
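
    Numerical identification schemes of this kind classically score each candidate taxon with a normalized likelihood computed from a probability matrix of expected biochemical test results. The sketch below illustrates that general scheme; it is not Identax's code, and the tiny probability matrix is invented for the example.

    ```python
    # Sketch of normalized-likelihood scoring for numerical identification of
    # bacteria from biochemical tests. The probability matrix (fraction of
    # strains positive per test) is invented, purely for illustration.
    POSITIVE_FRACTION = {
        #                         oxidase  lactose  motility
        "Escherichia coli":       (0.01,   0.95,    0.80),
        "Pseudomonas aeruginosa": (0.99,   0.02,    0.95),
        "Klebsiella pneumoniae":  (0.01,   0.98,    0.01),
    }

    def identify(results: tuple[bool, ...]) -> dict[str, float]:
        """Return normalized likelihood scores for each taxon given test results."""
        scores = {}
        for taxon, fractions in POSITIVE_FRACTION.items():
            likelihood = 1.0
            for positive, frac in zip(results, fractions):
                likelihood *= frac if positive else (1.0 - frac)
            scores[taxon] = likelihood
        total = sum(scores.values())
        return {taxon: value / total for taxon, value in scores.items()}

    # Unknown strain: oxidase negative, lactose positive, motile.
    for taxon, score in sorted(identify((False, True, True)).items(),
                               key=lambda kv: kv[1], reverse=True):
        print(f"{taxon:25s} {score:.3f}")
    ```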

  9. Computational phosphoproteomics: From identification to localization

    PubMed Central

    Lee, Dave C H; Jones, Andrew R; Hubbard, Simon J

    2015-01-01

    Analysis of the phosphoproteome by MS has become a key technology for the characterization of dynamic regulatory processes in the cell, since kinase and phosphatase action underlie many major biological functions. However, the addition of a phosphate group to a suitable side chain often confounds informatic analysis by generating product ion spectra that are more difficult to interpret (and consequently identify) relative to unmodified peptides. Collectively, these challenges have motivated bioinformaticians to create novel software tools and pipelines to assist in the identification of phosphopeptides in proteomic mixtures, and help pinpoint or “localize” the most likely site of modification in cases where there is ambiguity. Here we review the challenges to be met and the informatics solutions available to address them for phosphoproteomic analysis, as well as highlighting the difficulties associated with using them and the implications for data standards. PMID:25475148

  10. Computational requirements for on-orbit identification of space systems

    NASA Technical Reports Server (NTRS)

    Hadaegh, Fred Y.

    1988-01-01

    For future space systems, an on-orbit identification (ID) capability will be required to complement on-orbit control, because the dynamics of large space structures, spacecraft, and antennas will not be known sufficiently from ground modeling and testing. The computational requirements for ID of flexible structures such as the space station (SS) or the large deployable reflector (LDR) are, however, extensive due to the large number of modes, sensors, and actuators. For these systems the ID algorithm operations need not be computed in real time, only in near real time or within an appropriate mission time. Consequently, such space systems will need advanced processors and efficient parallel-processing algorithm designs and architectures to implement the identification algorithms in near real time. The MAX computer currently being developed may handle such computational requirements. The purpose here is to specify the on-board computational requirements for dynamic and static identification of large space structures. The computational requirements of six ID algorithms are presented in the context of three examples: the JPL/AFAL ground antenna facility, the space station (SS), and the large deployable reflector (LDR).

  11. Computational Strategies for a System-Level Understanding of Metabolism

    PubMed Central

    Cazzaniga, Paolo; Damiani, Chiara; Besozzi, Daniela; Colombo, Riccardo; Nobile, Marco S.; Gaglio, Daniela; Pescini, Dario; Molinari, Sara; Mauri, Giancarlo; Alberghina, Lilia; Vanoni, Marco

    2014-01-01

    Cell metabolism is the biochemical machinery that provides energy and building blocks to sustain life. Understanding its fine regulation is of pivotal relevance in several fields, from metabolic engineering applications to the treatment of metabolic disorders and cancer. Sophisticated computational approaches are needed to unravel the complexity of metabolism. To this aim, a plethora of methods have been developed, yet it is generally hard to identify which computational strategy is most suited for the investigation of a specific aspect of metabolism. This review provides an up-to-date description of the computational methods available for the analysis of metabolic pathways, discussing their main advantages and drawbacks.  In particular, attention is devoted to the identification of the appropriate scale and level of accuracy in the reconstruction of metabolic networks, and to the inference of model structure and parameters, especially when dealing with a shortage of experimental measurements. The choice of the proper computational methods to derive in silico data is then addressed, including topological analyses, constraint-based modeling and simulation of the system dynamics. A description of some computational approaches to gain new biological knowledge or to formulate hypotheses is finally provided. PMID:25427076

  12. Human operator identification model and related computer programs

    NASA Technical Reports Server (NTRS)

    Kessler, K. M.; Mohr, J. N.

    1978-01-01

    Four computer programs which provide computational assistance in the analysis of man/machine systems are reported. The programs are: (1) the Modified Transfer Function Program (TF); (2) the Time Varying Response Program (TVSR); (3) the Optimal Simulation Program (TVOPT); and (4) the Linear Identification Program (SCIDNT). The TF program converts the time-domain state-variable system representation to a frequency-domain transfer function representation. The TVSR program computes time histories of the input/output responses of the human operator model. The TVOPT program is an optimal simulation program and is similar to TVSR in that it produces time histories of system states associated with an operator-in-the-loop system. The differences between the two programs are presented. The SCIDNT program is an open-loop identification code which operates on the simulated data from TVOPT (or TVSR) or on real operator data from motion simulators.
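
    The conversion performed by the TF program corresponds to evaluating H(s) = C(sI - A)^{-1}B + D for a state-variable model. A minimal sketch using scipy.signal.ss2tf on an assumed second-order system (not one of the report's models) is shown below.

    ```python
    # Sketch: state-variable (state-space) model converted to a transfer
    # function, H(s) = C (sI - A)^{-1} B + D. Matrices are illustrative.
    import numpy as np
    from scipy.signal import ss2tf

    A = np.array([[0.0, 1.0],
                  [-2.0, -3.0]])
    B = np.array([[0.0],
                  [1.0]])
    C = np.array([[1.0, 0.0]])
    D = np.array([[0.0]])

    num, den = ss2tf(A, B, C, D)
    print("numerator coefficients:  ", num)   # descending powers of s
    print("denominator coefficients:", den)   # here: s^2 + 3 s + 2
    ```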

  13. Computational identification of 69 retroposons in Arabidopsis.

    PubMed

    Zhang, Yujun; Wu, Yongrui; Liu, Yilei; Han, Bin

    2005-06-01

    Retroposition is a shotgun strategy of the genome to achieve evolutionary diversity by mixing and matching coding sequences with novel regulatory elements. We have identified 69 retroposons in the Arabidopsis (Arabidopsis thaliana) genome by a computational approach. Most of them were derivatives of mature mRNAs, and 20 genes contained relics of the reverse transcription process, such as truncations, deletions, and extra sequence additions. Of them, 22 are processed pseudogenes, and 52 genes are likely to be actively transcribed, especially in tissues from apical meristems (roots and flowers). Functional compositions of these retroposon parental genes imply that not the mRNA itself but its expression in gamete cells defines a suitable template for retroposition. The presence/absence patterns of retroposons can be used as cladistic markers for biogeographic research. Effects of humans and of the Mediterranean Pleistocene refugia on Arabidopsis biogeographic distributions were revealed based on two recent retroposons (At1g61410 and At5g52090). An evolutionary rate of new gene creation by retroposition was calculated as 0.6 genes per million years. Retroposons can also be used as molecular fossils of parental gene expression in ancient times. Extensions of 3' untranslated regions for those expressed parental genes are revealed as a possible trend of plant transcriptome evolution. In addition, we report the first plant functional chimeric gene that adapts to intercompartmental transport by capturing two additional exons after retroposition. PMID:15923328

  14. 48 CFR 227.7203-10 - Contractor identification and marking of computer software or computer software documentation to...

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... and marking of computer software or computer software documentation to be furnished with restrictive... Rights in Computer Software and Computer Software Documentation 227.7203-10 Contractor identification and marking of computer software or computer software documentation to be furnished with restrictive...

  15. 48 CFR 227.7203-10 - Contractor identification and marking of computer software or computer software documentation to...

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... and marking of computer software or computer software documentation to be furnished with restrictive... Rights in Computer Software and Computer Software Documentation 227.7203-10 Contractor identification and marking of computer software or computer software documentation to be furnished with restrictive...

  16. 48 CFR 227.7203-3 - Early identification of computer software or computer software documentation to be furnished to...

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... computer software or computer software documentation to be furnished to the Government with restrictions on..., DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7203-3 Early identification of computer software or computer software documentation to be furnished to the Government...

  17. 48 CFR 227.7203-10 - Contractor identification and marking of computer software or computer software documentation to...

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... and marking of computer software or computer software documentation to be furnished with restrictive... Rights in Computer Software and Computer Software Documentation 227.7203-10 Contractor identification and marking of computer software or computer software documentation to be furnished with restrictive...

  18. 48 CFR 227.7203-3 - Early identification of computer software or computer software documentation to be furnished to...

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... computer software or computer software documentation to be furnished to the Government with restrictions on..., DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7203-3 Early identification of computer software or computer software documentation to be furnished to the Government...

  19. 48 CFR 227.7203-3 - Early identification of computer software or computer software documentation to be furnished to...

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... computer software or computer software documentation to be furnished to the Government with restrictions on..., DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7203-3 Early identification of computer software or computer software documentation to be furnished to the Government...

  20. 48 CFR 227.7203-3 - Early identification of computer software or computer software documentation to be furnished to...

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... computer software or computer software documentation to be furnished to the Government with restrictions on..., DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7203-3 Early identification of computer software or computer software documentation to be furnished to the Government...

  1. 48 CFR 227.7203-10 - Contractor identification and marking of computer software or computer software documentation to...

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... and marking of computer software or computer software documentation to be furnished with restrictive... Rights in Computer Software and Computer Software Documentation 227.7203-10 Contractor identification and marking of computer software or computer software documentation to be furnished with restrictive...

  2. 48 CFR 227.7203-3 - Early identification of computer software or computer software documentation to be furnished to...

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... computer software or computer software documentation to be furnished to the Government with restrictions on..., DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7203-3 Early identification of computer software or computer software documentation to be furnished to the Government...

  3. 48 CFR 227.7203-10 - Contractor identification and marking of computer software or computer software documentation to...

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... and marking of computer software or computer software documentation to be furnished with restrictive... Rights in Computer Software and Computer Software Documentation 227.7203-10 Contractor identification and marking of computer software or computer software documentation to be furnished with restrictive...

  4. Computed tomographic identification of calcified optic nerve drusen

    SciTech Connect

    Ramirez, H.; Blatt, E.S.; Hibri, N.S.

    1983-07-01

    Four cases of optic disk drusen were accurately diagnosed with orbital computed tomography (CT). The radiologist should be aware of the characteristic CT finding of discrete calcification within an otherwise normal optic disk. This benign process is easily differentiated from lesions such as calcific neoplastic processes of the posterior globe. CT identification of optic disk drusen is essential in the evaluation of visual field defects, migraine-like headaches, and pseudopapilledema.

  5. Computer simulation of optimal sensor locations in loading identification

    NASA Astrophysics Data System (ADS)

    Li, Dong-Sheng; Li, Hong-Nan; Guo, Xing L.

    2003-07-01

    A method is presented for the selection of a set of sensor locations from a larger candidate set for the purpose of structural loading identification. The method ranks the candidate sensor locations according to their effectiveness for identifying the given known loadings. Measurement locations that yield abnormal jumps in identification results or increase the condition number of the frequency response function are removed. The final sensor configuration tends to minimize the error of the loading identification results and the condition number of the frequency response function. The initial candidate set is selected based on the modal kinetic energy distribution, which gives a measure of the dynamic contribution of each physical degree of freedom to each of the target mode shapes of interest. In addition, the excitation location is considered when selecting appropriate response measurement locations. This method was successfully applied to the optimal sensor location selection and loading identification of a uniform cantilever beam in an experiment. It is shown that computer simulation is a good way to select the optimal sensor locations for loading identification.
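
    A hedged sketch of the pruning idea described above is given below: candidate measurement locations are removed greedily, at each step dropping the one whose removal yields the smallest condition number of the frequency response function (FRF) matrix. The random FRF matrix and the target sensor count are illustrative assumptions.

    ```python
    # Sketch: greedy sensor-location pruning guided by the condition number of
    # an FRF matrix (rows = candidate sensors, columns = excitation locations).
    # The matrix here is random, purely for illustration.
    import numpy as np

    rng = np.random.default_rng(0)
    n_candidates, n_loads = 12, 3
    frf = rng.standard_normal((n_candidates, n_loads))

    keep = list(range(n_candidates))
    target = 6                              # desired number of sensors
    while len(keep) > target:
        best_row, best_cond = None, np.inf
        for row in keep:
            trial = [r for r in keep if r != row]
            cond = np.linalg.cond(frf[trial, :])
            if cond < best_cond:
                best_row, best_cond = row, cond
        keep.remove(best_row)
        print(f"removed location {best_row}, condition number now {best_cond:.2f}")

    print("selected sensor locations:", keep)
    ```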

  6. Computer-assisted skull identification system using video superimposition.

    PubMed

    Yoshino, M; Matsuda, H; Kubota, S; Imaizumi, K; Miyasaka, S; Seta, S

    1997-12-01

    This system consists of two main units, namely a video superimposition system and a computer-assisted skull identification system. The video superimposition system is comprised of the following five parts: a skull-positioning box having a monochrome CCD camera, a photo-stand having a color CCD camera, a video image mixing device, a TV monitor and a videotape recorder. The computer-assisted skull identification system is composed of a host computer including our original application software, a film recorder and a color printer. After matching the orientation and size of the skull to those of the facial photograph using the video superimposition system, the skull and facial photograph images are digitized and stored within the computer, and then both digitized images are superimposed on the monitor. For the assessment of anatomical consistency between the digitized skull and face, the distance between the landmarks and the thickness of soft tissue at the anthropometrical points are semi-automatically measured on the monitor. The wipe images facilitate the comparison of positional relationships between the digitized skull and face. The software includes polynomial functions and Fourier harmonic analysis for evaluating the match of outlines, such as the forehead and mandibular line, in the two digitized images.

  7. Multi-level hot zone identification for pedestrian safety.

    PubMed

    Lee, Jaeyoung; Abdel-Aty, Mohamed; Choi, Keechoo; Huang, Helai

    2015-03-01

    According to the National Highway Traffic Safety Administration (NHTSA), while fatalities from traffic crashes have decreased, the proportion of pedestrian fatalities has steadily increased from 11% to 14% over the past decade. This study aims at identifying factors at two zonal levels: the first is to identify hot zones in which pedestrian crashes occur, while the second is to identify zones from which crash-involved pedestrians originate. A Bayesian Poisson lognormal simultaneous equation spatial error model (BPLSESEM) was estimated and revealed significant factors for the two target variables. Then, PSIs (potential for safety improvements) were computed using the model. Subsequently, a novel hot zone identification method was suggested that combines hot zones from which vulnerable pedestrians originate with hot zones where many pedestrian crashes occur. For the former zones, targeted safety education and awareness campaigns can be provided as countermeasures, whereas area-wide engineering treatments and enforcement may be effective safety treatments for the latter. Thus, it is expected that practitioners will be able to suggest appropriate safety treatments for pedestrian crashes using the method and results from this study.

  8. Tracking by Identification Using Computer Vision and Radio

    PubMed Central

    Mandeljc, Rok; Kovačič, Stanislav; Kristan, Matej; Perš, Janez

    2013-01-01

    We present a novel system for detection, localization and tracking of multiple people, which fuses a multi-view computer vision approach with a radio-based localization system. The proposed fusion combines the best of both worlds: excellent computer-vision-based localization and strong identity information provided by the radio system. It is therefore able to perform tracking by identification, which makes it impervious to propagated identity switches. We present a comprehensive methodology for the evaluation of systems that perform person localization in a world coordinate system and use it to evaluate the proposed system as well as its components. Experimental results on a challenging indoor dataset, which involves multiple people walking around a realistically cluttered room, confirm that the proposed fusion of both systems significantly outperforms its individual components. Compared to the radio-based system, it achieves better localization results, while at the same time it successfully prevents the propagation of identity switches that occur in pure computer-vision-based tracking. PMID:23262485

  9. New identification possibilities with postmortem multislice computed tomography.

    PubMed

    Dedouit, Fabrice; Telmon, Norbert; Costagliola, Rémi; Otal, Philippe; Florence, Loubes Lacroix; Joffre, Francis; Rougé, Daniel

    2007-11-01

    Historically, radiographical identification has been done by comparing conventional antemortem and postmortem X-ray images. The advent of new technologies such as multislice computed tomography (MSCT) is making traditional antemortem examination increasingly less frequent. The authors present the results of an MSCT study of 35 corpses, which demonstrated features potentially useful for identification purposes in ten cases. These features, which relate to abnormalities of postcranial bone as well as of the internal organs, are presented. Attempts were made to find antemortem X-rays or MSCTs for the cases described, in order to compare antemortem and postmortem images. Although antemortem imaging was recovered for only two cases (one with a skeletal abnormality and one with a visceral abnormality), it permitted a comparison of antemortem and postmortem MSCTs in both cases.

  10. Factors Influencing Exemplary Science Teachers' Levels of Computer Use

    ERIC Educational Resources Information Center

    Hakverdi, Meral; Dana, Thomas M.; Swain, Colleen

    2011-01-01

    The purpose of this study was to examine exemplary science teachers' use of technology in science instruction, factors influencing their level of computer use, their level of knowledge/skills in using specific computer applications for science instruction, their use of computer-related applications/tools during their instruction, and their…

  11. A Teaching Exercise for the Identification of Bacteria Using An Interactive Computer Program.

    ERIC Educational Resources Information Center

    Bryant, Trevor N.; Smith, John E.

    1979-01-01

    Describes an interactive Fortran computer program which provides an exercise in the identification of bacteria. Provides a way of enhancing a student's approach to systematic bacteriology and numerical identification procedures. (Author/MA)

  12. Symbolic computation at various levels of abstraction

    SciTech Connect

    Von Laven, S.A.

    1986-01-01

    Symbolic computation has already become a versatile tool in many fields. The applications with the highest payoff in terms of results versus effort expended are those for which the operations to be performed are straightforward but lengthy. Three such applications are presented here as examples. In the first, a laser resonator problem, the symbols being manipulated represent physical parameters. In the second, a software design problem, the symbols are more abstract; they represent algorithm structures and transformations and computer architectures. The third application is graphics in which the symbols are anything that might be associated with a tree graph.

  13. Computer program to predict aircraft noise levels

    NASA Technical Reports Server (NTRS)

    Clark, B. J.

    1981-01-01

    Methods developed at the NASA Lewis Research Center for predicting the noise contributions from various aircraft noise sources were programmed to predict aircraft noise levels either in flight or in ground tests. The noise sources include fan inlet and exhaust, jet, flap (for powered lift), core (combustor), turbine, and airframe. Noise propagation corrections are available for atmospheric attenuation, ground reflections, extra ground attenuation, and shielding. Outputs can include spectra, overall sound pressure level, perceived noise level, tone-weighted perceived noise level, and effective perceived noise level at locations specified by the user. Footprint contour coordinates and approximate footprint areas can also be calculated. Inputs and outputs can be in either System International or U.S. customary units. The subroutines for each noise source and propagation correction are described. A complete listing is given.

  14. Computational system identification of continuous-time nonlinear systems using approximate Bayesian computation

    NASA Astrophysics Data System (ADS)

    Krishnanathan, Kirubhakaran; Anderson, Sean R.; Billings, Stephen A.; Kadirkamanathan, Visakan

    2016-11-01

    In this paper, we derive a system identification framework for continuous-time nonlinear systems, for the first time using a simulation-focused computational Bayesian approach. Simulation approaches to nonlinear system identification have been shown to outperform regression methods under certain conditions, such as non-persistently exciting inputs and fast-sampling. We use the approximate Bayesian computation (ABC) algorithm to perform simulation-based inference of model parameters. The framework has the following main advantages: (1) parameter distributions are intrinsically generated, giving the user a clear description of uncertainty, (2) the simulation approach avoids the difficult problem of estimating signal derivatives as is common with other continuous-time methods, and (3) as noted above, the simulation approach improves identification under conditions of non-persistently exciting inputs and fast-sampling. Term selection is performed by judging parameter significance using parameter distributions that are intrinsically generated as part of the ABC procedure. The results from a numerical example demonstrate that the method performs well in noisy scenarios, especially in comparison to competing techniques that rely on signal derivative estimation.
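
    As a rough illustration of the ABC step (not the paper's model, distance measure, or term-selection procedure), the sketch below applies basic ABC rejection sampling to infer the rate parameter of a first-order ODE from noisy simulated output; the prior, tolerance, and model are assumptions.

    ```python
    # Sketch: ABC rejection sampling for a continuous-time model, in the spirit
    # of simulation-based identification. Model, prior and tolerance are
    # illustrative assumptions.
    import numpy as np
    from scipy.integrate import solve_ivp

    rng = np.random.default_rng(1)
    t_eval = np.linspace(0.0, 5.0, 50)

    def simulate(theta):
        """Simulate dx/dt = -theta * x with x(0) = 1 on the fixed time grid."""
        sol = solve_ivp(lambda t, x: -theta * x, (0.0, 5.0), [1.0], t_eval=t_eval)
        return sol.y[0]

    theta_true = 0.8
    y_obs = simulate(theta_true) + 0.02 * rng.standard_normal(t_eval.size)

    # Draw from the prior, simulate, keep draws whose output is close to the data.
    accepted, tolerance = [], 0.05
    for _ in range(2000):
        theta = rng.uniform(0.1, 2.0)              # prior over the rate parameter
        distance = np.sqrt(np.mean((simulate(theta) - y_obs) ** 2))
        if distance < tolerance:
            accepted.append(theta)

    accepted = np.array(accepted)
    print(f"accepted {accepted.size} draws; approximate posterior mean {accepted.mean():.3f}")
    ```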

  15. Factors influencing exemplary science teachers' levels of computer use

    NASA Astrophysics Data System (ADS)

    Hakverdi, Meral

    This study examines exemplary science teachers' use of technology in science instruction, factors influencing their level of computer use, their level of knowledge/skills in using specific computer applications for science instruction, their use of computer-related applications/tools during their instruction, and their students' use of computer applications/tools in or for their science class. After a relevant review of the literature certain variables were selected for analysis. These variables included personal self-efficacy in teaching with computers, outcome expectancy, pupil-control ideology, level of computer use, age, gender, teaching experience, personal computer use, professional computer use and science teachers' level of knowledge/skills in using specific computer applications for science instruction. The sample for this study includes middle and high school science teachers who received the Presidential Award for Excellence in Science Teaching Award (sponsored by the White House and the National Science Foundation) between the years 1997 and 2003 from all 50 states and U.S. territories. Award-winning science teachers were contacted about the survey via e-mail or letter with an enclosed return envelope. Of the 334 award-winning science teachers, usable responses were received from 92 science teachers, which made a response rate of 27.5%. Analysis of the survey responses indicated that exemplary science teachers have a variety of knowledge/skills in using computer related applications/tools. The most commonly used computer applications/tools are information retrieval via the Internet, presentation tools, online communication, digital cameras, and data collection probes. Results of the study revealed that students' use of technology in their science classroom is highly correlated with the frequency of their science teachers' use of computer applications/tools. The results of the multiple regression analysis revealed that personal self-efficacy related to

  16. Sound source localization identification accuracy: Level and duration dependencies.

    PubMed

    Yost, William A

    2016-07-01

    Sound source localization accuracy for noises was measured for sources in the front azimuthal open field mainly as a function of overall noise level and duration. An identification procedure was used in which listeners identify which loudspeakers presented a sound. Noises were filtered and differed in bandwidth and center frequency. Sound source localization accuracy depended on the bandwidth of the stimuli, and for the narrow bandwidths, accuracy depended on the filter's center frequency. Sound source localization accuracy did not depend on overall level or duration. PMID:27475204

  17. 24 CFR 990.175 - Utilities expense level: Computation of the current consumption level.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ...: Computation of the current consumption level. 990.175 Section 990.175 Housing and Urban Development... Calculating Formula Expenses § 990.175 Utilities expense level: Computation of the current consumption level. The current consumption level shall be the actual amount of each utility consumed during the...

  18. 24 CFR 990.180 - Utilities expense level: Computation of the rolling base consumption level.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ...: Computation of the rolling base consumption level. 990.180 Section 990.180 Housing and Urban Development... Calculating Formula Expenses § 990.180 Utilities expense level: Computation of the rolling base consumption level. (a) General. (1) The rolling base consumption level (RBCL) shall be equal to the average...

  19. Identification of cancer mechanisms through computational systems modeling

    PubMed Central

    Qi, Zhen; Voit, Eberhard O.

    2014-01-01

    Background: Colorectal cancer is one of the most prevalent causes of cancer death. It has been studied extensively for a long time, and numerous genetic and epigenetic events have been associated with the disease. However, its molecular mechanisms are still unclear. High-throughput metabolomics data, combined with customized computational systems modeling, can assist our understanding of some of these mechanisms by revealing connections between alterations in enzymatic activities and their consequences for a person’s metabolic profile. Of particular importance in this context is purine metabolism, as it provides the nucleotides needed for cell proliferation. Methods and findings: We employ a computational systems approach to infer molecular mechanisms associated with purine metabolism in colorectal carcinoma. The approach uses a dynamic model of purine metabolism as the simulation system and metabolomics data as input. The execution of large-scale Monte Carlo simulations and optimization with the model permits a step-wise reduction in possibly affected enzyme mechanisms, from which likely targets emerge. Conclusions: According to our results, some enzymes in the purine pathway system are very unlikely to be targets of colorectal carcinoma. In fact, only three enzymatic steps emerge with statistical confidence as most likely being affected, namely: amidophosphoribosyltransferase (ATASE), 5′-nucleotidase (5NUC), and the xanthine oxidase/dehydrogenase (XD) reactions. The first of these enzymes catalyzes the first committed step of de novo purine biosynthesis, while the other two enzymes are associated with critical purine salvage pathways. The identification of these enzymes is statistically significant and robust. In addition, the results suggest potential secondary targets. The computational method cannot discern whether the inferred mechanisms constitute symptoms of colorectal carcinoma, or whether they might be causative and critical components of the uncontrolled

  20. Computational identification of microRNAs and their targets.

    PubMed

    Zhang, Baohong; Pan, Xiaoping; Wang, Qinglian; Cobb, George P; Anderson, Todd A

    2006-12-01

    MicroRNAs (miRNAs) are one class of newly identified riboregulators of gene expression in many eukaryotic organisms. They play important roles in multiple biological and metabolic processes, including developmental timing, signal transduction, cell maintenance and differentiation, diseases and cancers. miRNAs regulate gene expression at the posttranscriptional level by directly cleaving targeted mRNAs or repressing translation. Although the founding members of miRNAs were discovered by genetic screening approaches, such experimental approaches were limited by their low efficiency, time consumption, and high cost. As an alternative, computational approaches were developed. Computational approaches for identifying miRNAs are based on the following major characteristics of miRNAs: hairpin-shaped secondary structures, high conservation for some miRNAs, and a high minimal folding free energy index (MFEI). Computational approaches also play an important role in identifying miRNA targets. A majority of known miRNAs and their targets were identified by computational approaches. Several web-based or non-web-based computer software programs are publicly available for predicting miRNAs and their targets.
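
    One of the screening characteristics mentioned above, the minimal folding free energy index (MFEI), is commonly computed as the adjusted MFE (MFE per 100 nucleotides) divided by the GC percentage; the sketch below illustrates that calculation. The example sequence is made up, and the 0.85 cut-off in the comment is a commonly cited rule of thumb rather than a value taken from this abstract.

    ```python
    # Sketch of the MFEI filter: MFEI = ((|MFE| / length) * 100) / GC%,
    # with MFE in kcal/mol. Example sequence and threshold are illustrative.
    def mfei(mfe_kcal_per_mol: float, sequence: str) -> float:
        seq = sequence.upper()
        gc_percent = 100.0 * (seq.count('G') + seq.count('C')) / len(seq)
        amfe = abs(mfe_kcal_per_mol) / len(seq) * 100.0   # adjusted MFE per 100 nt
        return amfe / gc_percent

    # Hypothetical 90-nt precursor (~44% GC) with a predicted MFE of -45 kcal/mol.
    candidate = 'GC' * 20 + 'AU' * 25
    print(f"MFEI = {mfei(-45.0, candidate):.2f}")
    # Candidates with a high MFEI (e.g. > 0.85) are usually favoured as miRNA precursors.
    ```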

  1. Proficiency Level--A Fuzzy Variable in Computer Learner Corpora

    ERIC Educational Resources Information Center

    Carlsen, Cecilie

    2012-01-01

    This article focuses on the proficiency level of texts in Computer Learner Corpora (CLCs). A claim is made that proficiency levels are often poorly defined in CLC design, and that the methods used for level assignment of corpus texts are not always adequate. Proficiency level can therefore, best be described as a fuzzy variable in CLCs,…

  2. Computational Identification of Novel Genes: Current and Future Perspectives.

    PubMed

    Klasberg, Steffen; Bitard-Feildel, Tristan; Mallet, Ludovic

    2016-01-01

    While it has long been thought that all genomic novelties are derived from existing material, many genes lacking homology to known genes were found in recent genome projects. Some of these novel genes were proposed to have evolved de novo, i.e., out of noncoding sequences, whereas some have been shown to follow a duplication and divergence process. Their discovery called for an extension of the historical hypotheses about gene origination. Besides the theoretical breakthrough, increasing evidence has accumulated that novel genes play important roles in evolutionary processes, including adaptation and speciation events. Different techniques are available to identify genes and classify them as novel. Their classification as novel is usually based on their similarity to known genes, or lack thereof, detected by comparative genomics or against databases. Computational approaches are further prime methods that can be based on existing models or can leverage biological evidence from experiments. Identification of novel genes remains, however, a challenging task. With constant software and technology updates, no gold standard, and no available benchmark, the evaluation and characterization of genomic novelty is a vibrant field. In this review, the classical and state-of-the-art tools for gene prediction are introduced. The current methods for novel gene detection are presented; the methodological strategies and their limits are discussed along with perspective approaches for further studies. PMID:27493475

  3. Computational Identification of Novel Genes: Current and Future Perspectives

    PubMed Central

    Klasberg, Steffen; Bitard-Feildel, Tristan; Mallet, Ludovic

    2016-01-01

    While it has long been thought that all genomic novelties are derived from existing material, many genes lacking homology to known genes were found in recent genome projects. Some of these novel genes were proposed to have evolved de novo, i.e., out of noncoding sequences, whereas some have been shown to follow a duplication and divergence process. Their discovery called for an extension of the historical hypotheses about gene origination. Besides the theoretical breakthrough, increasing evidence has accumulated that novel genes play important roles in evolutionary processes, including adaptation and speciation events. Different techniques are available to identify genes and classify them as novel. Their classification as novel is usually based on their similarity to known genes, or lack thereof, detected by comparative genomics or against databases. Computational approaches are further prime methods that can be based on existing models or can leverage biological evidence from experiments. Identification of novel genes remains, however, a challenging task. With constant software and technology updates, no gold standard, and no available benchmark, the evaluation and characterization of genomic novelty is a vibrant field. In this review, the classical and state-of-the-art tools for gene prediction are introduced. The current methods for novel gene detection are presented; the methodological strategies and their limits are discussed along with perspective approaches for further studies. PMID:27493475

  4. The Reality of Computers at the Community College Level.

    ERIC Educational Resources Information Center

    Leone, Stephen J.

    Writing teachers at the community college level who teach using a computer have come to accept the fact that it is more than "just teaching" composition. Such teaching often requires instructors to be as knowledgeable as some of the technicians. Two-year college students and faculty are typically given little support in using computers for…

  5. Computational identification of obligatorily autocatalytic replicators embedded in metabolic networks

    PubMed Central

    Kun, Ádám; Papp, Balázs; Szathmáry, Eörs

    2008-01-01

    Background: If chemical A is necessary for the synthesis of more chemical A, then A has the power of replication (such systems are known as autocatalytic systems). We provide the first systems-level analysis searching for small-molecular autocatalytic components in the metabolisms of diverse organisms, including an inferred minimal metabolism. Results: We find that intermediary metabolism is invariably autocatalytic for ATP. Furthermore, we provide evidence for the existence of additional, organism-specific autocatalytic metabolites in the forms of coenzymes (NAD+, coenzyme A, tetrahydrofolate, quinones) and sugars. Although the enzymatic reactions of a number of autocatalytic cycles are present in most of the studied organisms, they display obligatorily autocatalytic behavior in a few networks only, hence demonstrating the need for a systems-level approach to identify metabolic replicators embedded in large networks. Conclusion: Metabolic replicators are apparently common and potentially both universal and ancestral: without their presence, kick-starting metabolic networks is impossible, even if all enzymes and genes are present in the same cell. Identification of metabolic replicators is also important for attempts to create synthetic cells, as some of these autocatalytic molecules will presumably need to be added to the system since, by definition, the system cannot synthesize them without their initial presence. PMID:18331628

  6. Genetic and computational identification of a conserved bacterial metabolic module.

    PubMed

    Boutte, Cara C; Srinivasan, Balaji S; Flannick, Jason A; Novak, Antal F; Martens, Andrew T; Batzoglou, Serafim; Viollier, Patrick H; Crosson, Sean

    2008-12-01

    We have experimentally and computationally defined a set of genes that form a conserved metabolic module in the alpha-proteobacterium Caulobacter crescentus and used this module to illustrate a schema for the propagation of pathway-level annotation across bacterial genera. Applying comprehensive forward and reverse genetic methods and genome-wide transcriptional analysis, we (1) confirmed the presence of genes involved in catabolism of the abundant environmental sugar myo-inositol, (2) defined an operon encoding an ABC-family myo-inositol transmembrane transporter, and (3) identified a novel myo-inositol regulator protein and cis-acting regulatory motif that control expression of genes in this metabolic module. Despite being encoded from non-contiguous loci on the C. crescentus chromosome, these myo-inositol catabolic enzymes and transporter proteins form a tightly linked functional group in a computationally inferred network of protein associations. Primary sequence comparison was not sufficient to confidently extend annotation of all components of this novel metabolic module to related bacterial genera. Consequently, we implemented the Graemlin multiple-network alignment algorithm to generate cross-species predictions of genes involved in myo-inositol transport and catabolism in other alpha-proteobacteria. Although the chromosomal organization of genes in this functional module varied between species, the upstream regions of genes in this aligned network were enriched for the same palindromic cis-regulatory motif identified experimentally in C. crescentus. Transposon disruption of the operon encoding the computationally predicted ABC myo-inositol transporter of Sinorhizobium meliloti abolished growth on myo-inositol as the sole carbon source, confirming our cross-genera functional prediction. Thus, we have defined regulatory, transport, and catabolic genes and a cis-acting regulatory sequence that form a conserved module required for myo-inositol metabolism in

  7. Levels of Evaluation for Computer-Based Instruction.

    ERIC Educational Resources Information Center

    Reeves, Thomas C.; Lent, Richard M.

    The uses and methods of four levels of evaluation which can be conducted during the development and implementation phases of computer-based instruction (CBI) programs are discussed in this paper. The levels of evaluation presented are: (1) documentation, (2) formative evaluation, (3) assessment of immediate learner effectiveness, and (4) impact…

  8. Computer-aided dental identification: an objective method for assessment of radiographic image similarity.

    PubMed

    Flint, Diane J; Brent Dove, Stephen; Brumit, Paula C; White, Marea; Senn, David R

    2009-01-01

    A pilot study evaluated a computer-based method for comparing digital dental images, utilizing a registration algorithm to correct for variations in projection geometry between images prior to a subtraction analysis. A numerical assessment of similarity was generated for pairs of images. Using well-controlled laboratory settings, the method was evaluated as to its ability to identify the correct specimen with positive results. A subsequent clinical study examined longitudinal radiographic examinations of selected anatomical areas on 47 patients, analyzing the computer-based method in making the correct identification based upon a threshold level of similarity. The results showed that at a threshold of 0.855, there were two false negative and two false positive identifications out of 957 analyses. Based on these initial findings, 25 dental records having two sets of full mouth series of radiographs were selected. The radiographs were digitized and grouped into six anatomical regions. The more recent set of films served as postmortem images. Each postmortem image was analyzed against all other images within the region. Images were registered to correct for differences in projection geometry prior to analysis. An area of interest was selected to assess image similarity. Analysis of variance was used to determine that there was a significant difference between images from the same individual and those from different individuals. Results showed that the threshold level of concordance will vary with the anatomical region of the mouth examined. This method may provide the most objective and reliable method for postmortem dental identification using intra-oral images.
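
    As a hedged illustration of the subtraction-style similarity scoring described above (the projection-geometry registration step is omitted), the sketch below computes a normalized cross-correlation score between two already-aligned image regions. The synthetic arrays stand in for radiograph regions, and this score is not necessarily the study's 0.855-threshold metric.

    ```python
    # Sketch: similarity score between two registered grayscale image regions
    # via normalized cross-correlation. Images are synthetic stand-ins.
    import numpy as np

    def similarity(img_a: np.ndarray, img_b: np.ndarray) -> float:
        """Normalized cross-correlation of two equally sized grayscale images."""
        a = (img_a - img_a.mean()) / img_a.std()
        b = (img_b - img_b.mean()) / img_b.std()
        return float(np.mean(a * b))

    rng = np.random.default_rng(3)
    base = rng.random((64, 64))
    same = base + 0.05 * rng.standard_normal((64, 64))   # same source, small changes
    other = rng.random((64, 64))                         # unrelated image

    print(f"same-source score:  {similarity(base, same):.3f}")
    print(f"cross-source score: {similarity(base, other):.3f}")
    ```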

  9. Identification of natural images and computer-generated graphics based on statistical and textural features.

    PubMed

    Peng, Fei; Li, Jiao-ting; Long, Min

    2015-03-01

    To discriminate the acquisition pipelines of digital images, a novel scheme for the identification of natural images and computer-generated graphics is proposed based on statistical and textural features. First, the differences between them are investigated from the viewpoints of statistics and texture, and 31 feature dimensions are acquired for identification. Then, LIBSVM is used for the classification. Finally, the experimental results are presented. The results show that the scheme can achieve an identification accuracy of 97.89% for computer-generated graphics and an identification accuracy of 97.75% for natural images. The analyses also demonstrate that the proposed method has excellent performance compared with some existing methods based only on statistical features or other features. The method has a great potential to be implemented for the identification of natural images and computer-generated graphics.
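
    The classification stage described above, a fixed-length feature vector fed to an SVM, can be sketched with scikit-learn's LIBSVM-backed SVC as below; the random 31-dimensional features are stand-ins for the paper's statistical and textural features, and the kernel settings are assumptions.

    ```python
    # Sketch: SVM classification of 31-dimensional feature vectors as natural
    # image (1) vs. computer-generated graphic (0). Features are synthetic.
    import numpy as np
    from sklearn.model_selection import train_test_split
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVC  # scikit-learn's SVC wraps LIBSVM

    rng = np.random.default_rng(42)
    n_per_class, n_features = 200, 31
    X = np.vstack([rng.standard_normal((n_per_class, n_features)) + 0.5,
                   rng.standard_normal((n_per_class, n_features)) - 0.5])
    y = np.array([1] * n_per_class + [0] * n_per_class)

    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.3, random_state=0)
    model = make_pipeline(StandardScaler(), SVC(kernel='rbf', C=1.0, gamma='scale'))
    model.fit(X_train, y_train)
    print(f"test accuracy: {model.score(X_test, y_test):.3f}")
    ```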

  10. 24 CFR 990.175 - Utilities expense level: Computation of the current consumption level.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 24 Housing and Urban Development 4 2010-04-01 2010-04-01 false Utilities expense level... Calculating Formula Expenses § 990.175 Utilities expense level: Computation of the current consumption level. The current consumption level shall be the actual amount of each utility consumed during the...

  11. Identification of Learning Processes by Means of Computer Graphics.

    ERIC Educational Resources Information Center

    Sorensen, Birgitte Holm

    1993-01-01

    Describes a development project for the use of computer graphics and video in connection with an inservice training course for primary education teachers in Denmark. Topics addressed include research approaches to computers; computer graphics in learning processes; activities relating to computer graphics; the role of the teacher; and student…

  12. Computer ethics and tertiary level education in Hong Kong

    SciTech Connect

    Wong, E.Y.W.; Davison, R.M.; Wade, P.W.

    1994-12-31

    This paper seeks to highlight some ethical issues relating to the increasing proliferation of Information Technology into our everyday lives. The authors explain their understanding of computer ethics, and give some reasons why the study of computer ethics is becoming increasingly pertinent. The paper looks at some of the problems that arise in attempting to develop appropriate ethical concepts in a constantly changing environment, and explores some of the ethical dilemmas arising from the increasing use of computers. Some initial research undertaken to explore the ideas and understanding of tertiary level students in Hong Kong on a number of ethical issues of interest is described, and our findings discussed. We hope that presenting this paper and eliciting subsequent discussion will enable us to draw up more comprehensive guidelines for the teaching of computer related ethics to tertiary level students, as well as reveal some directions for future research.

  13. Computing NLTE Opacities -- Node Level Parallel Calculation

    SciTech Connect

    Holladay, Daniel

    2015-09-11

    Presentation. The goal: to produce a robust library capable of computing reasonably accurate opacities in-line, with the assumption of LTE relaxed (non-LTE). Near term: demonstrate acceleration of non-LTE opacity computation. Far term (if funded): connect to application codes with in-line capability, compute opacities, and study science problems. Use efficient algorithms that expose many levels of parallelism and utilize good memory access patterns for use on advanced architectures. Ensure portability to multiple types of hardware, including multicore processors, manycore processors such as KNL, GPUs, etc. The library should be easily coupled to radiation hydrodynamics and thermal radiative transfer codes.

  14. Dysregulation in level of goal and action identification across psychological disorders.

    PubMed

    Watkins, Edward

    2011-03-01

    Goals, events, and actions can be mentally represented within a hierarchical framework that ranges from more abstract to more concrete levels of identification. A more abstract level of identification involves general, superordinate, and decontextualized mental representations that convey the meaning of goals, events, and actions, "why" an action is performed, and its purpose, ends, and consequences. A more concrete level of identification involves specific and subordinate mental representations that include contextual details of goals, events, and actions, and the specific "how" details of an action. This review considers three lines of evidence for considering that dysregulation of level of goal/action identification may be a transdiagnostic process. First, there is evidence that different levels of identification have distinct functional consequences and that in non-clinical samples level of goal/action identification appears to be regulated in a flexible and adaptive way to match the level of goal/action identification to circumstances. Second, there is evidence that level of goal/action identification causally influences symptoms and processes involved in psychological disorders, including emotional response, repetitive thought, impulsivity, problem solving and procrastination. Third, there is evidence that the level of goal/action identification is biased and/or dysregulated in certain psychological disorders, with a bias towards more abstract identification for negative events in depression, GAD, PTSD, and social anxiety.

  15. Rugoscopy: Human identification by computer-assisted photographic superimposition technique

    PubMed Central

    Mohammed, Rezwana Begum; Patil, Rajendra G.; Pammi, V. R.; Sandya, M. Pavana; Kalyan, Siva V.; Anitha, A.

    2013-01-01

    Background: Human identification has been studied since the fourteenth century and has gradually advanced for forensic purposes. Traditional methods such as dental, fingerprint, and DNA comparisons are probably the most common techniques used in this context, allowing fast and secure identification processes. However, in circumstances where identification of an individual by fingerprint or dental record comparison is difficult, palatal rugae may be considered as an alternative source of material. Aim: The present study was done to evaluate the individualistic nature and use of palatal rugae patterns for personal identification and also to test the efficiency of computerized software for forensic identification by photographic superimposition of palatal photographs obtained from casts. Materials and Methods: Two sets of alginate impressions were made from the upper arches of 100 individuals (50 males and 50 females) with a one-month interval in between, and the casts were poured. All the teeth except the incisors were removed to ensure that only the palate could be used in the identification process. In one set of the casts, the palatal rugae were highlighted with a graphite pencil. All 200 casts were randomly numbered and then photographed with a 10.1-megapixel Kodak digital camera using a standardized method. Using computerized software, the digital photographs of the models without highlighted palatal rugae were overlaid on the transparent images of the models with highlighted palatal rugae, in order to identify the pairs by the superimposition technique. The incisors were retained and used as landmarks to determine the magnification required to bring the two sets of photographs to the same size, in order to make a perfect superimposition of the images. Results: Overlapping the digital photographs of the models with highlighted palatal rugae over the normal set of models without highlighted palatal rugae resulted in 100% positive identification. Conclusion

  16. Prioritization of putative metabolite identifications in LC-MS/MS experiments using a computational pipeline.

    PubMed

    Zhou, Bin; Xiao, Jun Feng; Ressom, Habtom W

    2013-01-01

    One of the major bottle-necks in current LC-MS-based metabolomic investigations is metabolite identification. An often-used approach is to first look up metabolites from databases through peak mass, followed by verification of the obtained putative identifications using MS/MS data. However, the mass-based search may provide inappropriate putative identifications when the observed peak is from isotopes, fragments, or adducts. In addition, a large fraction of peaks is often left with multiple putative identifications. To differentiate these putative identifications, manual verification of metabolites through comparison between biological samples and authentic compounds is necessary. However, such experiments are laborious, especially when multiple putative identifications are encountered. It is desirable to use computational approaches to obtain more reliable putative identifications and prioritize them before performing experimental verification of the metabolites. In this article, a computational pipeline is proposed to assist metabolite identification with improved metabolome coverage and prioritization capability. Multiple publicly available software tools and databases, along with in-house developed algorithms, are utilized to fully exploit the information acquired from LC-MS/MS experiments. The pipeline is successfully applied to identify metabolites on the basis of LC-MS as well as MS/MS data. Using accurate masses, retention time values, MS/MS spectra, and metabolic pathways/networks, more appropriate putative identifications are retrieved and prioritized to guide subsequent metabolite verification experiments. PMID:23307777
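
    As a reading aid only (not the authors' pipeline), the sketch below illustrates the two steps the abstract describes: a mass-based database lookup within a ppm tolerance, followed by a simple prioritization of the resulting putative identifications using additional evidence. The database entries, adduct handling, scoring weights, and function names are illustrative assumptions.

```python
# Minimal sketch (not the authors' pipeline): mass-based lookup of putative
# metabolite identifications with a ppm tolerance, followed by a crude
# prioritization that rewards retention-time and MS/MS agreement.
# Database entries and scoring weights are illustrative assumptions.

PROTON = 1.007276  # proton mass, used here for the [M+H]+ adduct

metabolite_db = [  # (name, monoisotopic neutral mass)
    ("glucose", 180.06339),
    ("fructose", 180.06339),
    ("citrate", 192.02700),
]

def putative_ids(peak_mz, adduct_shift=PROTON, tol_ppm=10.0):
    """Return database entries whose [M+H]+ m/z lies within tol_ppm of the peak."""
    hits = []
    for name, neutral_mass in metabolite_db:
        expected_mz = neutral_mass + adduct_shift
        ppm_error = abs(peak_mz - expected_mz) / expected_mz * 1e6
        if ppm_error <= tol_ppm:
            hits.append((name, ppm_error))
    return hits

def prioritize(hits, rt_match, msms_match):
    """Rank putative IDs by a toy score: smaller mass error and extra evidence win."""
    scored = []
    for name, ppm_error in hits:
        score = -ppm_error
        score += 5.0 if rt_match.get(name, False) else 0.0
        score += 10.0 if msms_match.get(name, False) else 0.0
        scored.append((score, name))
    return [name for score, name in sorted(scored, reverse=True)]

if __name__ == "__main__":
    hits = putative_ids(181.0707)                 # observed peak m/z
    ranking = prioritize(hits,
                         rt_match={"glucose": True},
                         msms_match={"glucose": True})
    print(hits, ranking)
```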

  18. OS friendly microprocessor architecture: Hardware level computer security

    NASA Astrophysics Data System (ADS)

    Jungwirth, Patrick; La Fratta, Patrick

    2016-05-01

    We present an introduction to the patented OS Friendly Microprocessor Architecture (OSFA) and hardware-level computer security. Conventional microprocessors have not tried to balance hardware performance and OS performance at the same time, and they have depended on the operating system for computer security and information assurance. The goal of the OS Friendly Architecture is to provide a microprocessor and OS system that is both high performance and secure. We invite cyber security, information technology (IT), and SCADA control professionals to review the hardware-level security features. The OS Friendly Architecture is a switched set of cache memory banks in a pipeline configuration. For light-weight threads, the memory pipeline configuration provides near-instantaneous context switching times. The pipelining and parallelism provided by the cache memory pipeline allow background cache read and write operations while the microprocessor's execution pipeline is running instructions. The cache bank selection controllers provide arbitration to prevent the memory pipeline and the microprocessor's execution pipeline from accessing the same cache bank at the same time. This separation allows cache memory pages to transfer to and from level 1 (L1) caching while the microprocessor pipeline is executing instructions. Computer security operations are implemented in hardware: by extending Unix file-permission bits to each cache memory bank and memory address, the OSFA provides hardware-level computer security.

  19. Mathematical Level Raising through Collaborative Investigations with the Computer

    ERIC Educational Resources Information Center

    Pijls, Monique; Dekker, Rijkje; van Hout-Wolters, Bernadette

    2003-01-01

    Investigations with the computer can have different functions in the mathematical learning process, such as to let students explore a subject domain, to guide the process of reinvention, or to give them the opportunity to apply what they have learned. Which function has most effect on mathematical level raising? We investigated that question in…

  20. Logic as Marr's Computational Level: Four Case Studies.

    PubMed

    Baggio, Giosuè; van Lambalgen, Michiel; Hagoort, Peter

    2015-04-01

    We sketch four applications of Marr's levels-of-analysis methodology to the relations between logic and experimental data in the cognitive neuroscience of language and reasoning. The first part of the paper illustrates the explanatory power of computational level theories based on logic. We show that a Bayesian treatment of the suppression task in reasoning with conditionals is ruled out by EEG data, supporting instead an analysis based on defeasible logic. Further, we describe how results from an EEG study on temporal prepositions can be reanalyzed using formal semantics, addressing a potential confound. The second part of the article demonstrates the predictive power of logical theories drawing on EEG data on processing progressive constructions and on behavioral data on conditional reasoning in people with autism. Logical theories can constrain processing hypotheses all the way down to neurophysiology, and conversely neuroscience data can guide the selection of alternative computational level models of cognition. PMID:25417838

  2. Multi-level Hierarchical Poly Tree computer architectures

    NASA Technical Reports Server (NTRS)

    Padovan, Joe; Gute, Doug

    1990-01-01

    Based on the concept of hierarchical substructuring, this paper develops an optimal multi-level Hierarchical Poly Tree (HPT) parallel computer architecture scheme which is applicable to the solution of finite element and difference simulations. Emphasis is given to minimizing computational effort, in-core/out-of-core memory requirements, and the data transfer between processors. In addition, a simplified communications network that reduces the number of I/O channels between processors is presented. HPT configurations that yield optimal superlinearities are also demonstrated. Moreover, to generalize the scope of applicability, special attention is given to developing: (1) multi-level reduction trees which provide an orderly/optimal procedure by which model densification/simplification can be achieved, as well as (2) methodologies enabling processor grading that yields architectures with varying types of multi-level granularity.

  3. Domain identification in impedance computed tomography by spline collocation method

    NASA Technical Reports Server (NTRS)

    Kojima, Fumio

    1990-01-01

    A method for estimating an unknown domain in elliptic boundary value problems is considered. The problem is formulated as an inverse problem of integral equations of the second kind. A computational method is developed using a spline collocation scheme. The results can be applied to the inverse problem of impedance computed tomography (ICT) for image reconstruction.
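
    The abstract casts domain estimation as an inverse problem for an integral equation of the second kind. For orientation, the generic (Fredholm) second-kind form that a spline collocation scheme discretizes is shown below; this is the standard textbook form, not the paper's specific operator.

```latex
% Fredholm integral equation of the second kind (generic textbook form).
% Spline collocation approximates \phi by a spline expansion
% \phi_h(x) = \sum_j c_j B_j(x) and enforces the equation at collocation
% points x_1, \dots, x_n, yielding a linear system for the coefficients c_j.
\phi(x) = f(x) + \lambda \int_a^b K(x,y)\,\phi(y)\,\mathrm{d}y ,
\qquad x \in [a,b].
```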

  4. 24 CFR 990.180 - Utilities expense level: Computation of the rolling base consumption level.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    24 CFR 990.180, Housing and Urban Development, Calculating Formula Expenses. Utilities expense level: Computation of the rolling base consumption level. ... each utility consumed during a 12-month period ending June 30th. For example, for the funding...

  5. Data identification for improving gene network inference using computational algebra.

    PubMed

    Dimitrova, Elena; Stigler, Brandilyn

    2014-11-01

    Identification of models of gene regulatory networks is sensitive to the amount of data used as input. Considering the substantial costs in conducting experiments, it is of value to have an estimate of the amount of data required to infer the network structure. To minimize wasted resources, it is also beneficial to know which data are necessary to identify the network. Knowledge of the data and knowledge of the terms in polynomial models are often required a priori in model identification. In applications, it is unlikely that the structure of a polynomial model will be known, which may force data sets to be unnecessarily large in order to identify a model. Furthermore, none of the known results provides any strategy for constructing data sets to uniquely identify a model. We provide a specialization of an existing criterion for deciding when a set of data points identifies a minimal polynomial model when its monomial terms have been specified. Then, we relax the requirement of the knowledge of the monomials and present results for model identification given only the data. Finally, we present a method for constructing data sets that identify minimal polynomial models.

  6. Computing Bounds on Resource Levels for Flexible Plans

    NASA Technical Reports Server (NTRS)

    Muscettola, Nicola; Rijsman, David

    2009-01-01

    A new algorithm efficiently computes the tightest exact bound on the levels of resources induced by a flexible activity plan (see figure). Tightness of bounds is extremely important for computations involved in planning because tight bounds can save potentially exponential amounts of search (through early backtracking and detection of solutions), relative to looser bounds. The bound computed by the new algorithm, denoted the resource-level envelope, constitutes the measure of maximum and minimum consumption of resources at any time for all fixed-time schedules in the flexible plan. At each time, the envelope guarantees that there are two fixed-time instantiations: one that produces the minimum level and one that produces the maximum level. Therefore, the resource-level envelope is the tightest possible resource-level bound for a flexible plan, because any tighter bound would exclude the contribution of at least one fixed-time schedule. If the resource-level envelope can be computed efficiently, one could substitute it for the looser bounds that are currently used in the inner cores of constraint-posting scheduling algorithms, with the potential for great improvements in performance. What is needed to reduce the cost of computation is an algorithm whose measure of complexity is no greater than a low-degree polynomial in N (where N is the number of activities). The new algorithm satisfies this need. In this algorithm, the computation of resource-level envelopes is based on a novel combination of (1) the theory of shortest paths in the temporal-constraint network for the flexible plan and (2) the theory of maximum flows for a flow network derived from the temporal and resource constraints. The measure of asymptotic complexity of the algorithm is O(N · O(maxflow(N))), where O(x) denotes an amount of computing time or a number of arithmetic operations proportional to a number of the order of x, and O(maxflow(N)) is the measure of complexity (and thus of cost) of a maximum-flow
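
    The paper's polynomial-time algorithm combines shortest paths with maximum flows, which is too involved to reproduce here. As a point of reference for what the envelope is, the brute-force sketch below enumerates every fixed-time instantiation of a tiny flexible plan and records the per-time minimum and maximum resource level; the activity windows and resource deltas are made up.

```python
# Brute-force illustration of a resource-level envelope (not the paper's
# polynomial-time algorithm): for a tiny flexible plan, enumerate every
# fixed-time instantiation and record, per time step, the min and max
# resource level over all instantiations. Activity data are made up, and
# production/consumption is modelled as instantaneous at the start time.
from itertools import product

# (earliest start, latest start, resource delta applied at the start time)
activities = [
    (0, 2, +3),   # produce 3 units, start anywhere in [0, 2]
    (1, 3, -2),   # consume 2 units, start anywhere in [1, 3]
]
HORIZON = 5

def level_profile(starts):
    """Cumulative resource level over time for one fixed-time schedule."""
    deltas = [0] * (HORIZON + 1)
    for (lo, hi, amount), t in zip(activities, starts):
        deltas[t] += amount
    profile, level = [], 0
    for t in range(HORIZON + 1):
        level += deltas[t]
        profile.append(level)
    return profile

def envelope():
    windows = [range(lo, hi + 1) for lo, hi, _ in activities]
    lower = [float("inf")] * (HORIZON + 1)
    upper = [float("-inf")] * (HORIZON + 1)
    for starts in product(*windows):          # every fixed-time instantiation
        for t, level in enumerate(level_profile(starts)):
            lower[t] = min(lower[t], level)
            upper[t] = max(upper[t], level)
    return lower, upper

if __name__ == "__main__":
    lo, hi = envelope()
    print("lower envelope:", lo)
    print("upper envelope:", hi)
```

    For realistic plan sizes this enumeration is exponential in the number of activities, which is exactly the cost the envelope algorithm described above is designed to avoid.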

  7. Computer-guided drug repurposing: identification of trypanocidal activity of clofazimine, benidipine and saquinavir.

    PubMed

    Bellera, Carolina L; Balcazar, Darío E; Vanrell, M Cristina; Casassa, A Florencia; Palestro, Pablo H; Gavernet, Luciana; Labriola, Carlos A; Gálvez, Jorge; Bruno-Blanch, Luis E; Romano, Patricia S; Carrillo, Carolina; Talevi, Alan

    2015-03-26

    In spite of remarkable advances in the knowledge of Trypanosoma cruzi biology, no medications to treat Chagas disease have been approved in the last 40 years and almost 8 million people remain infected. Since the public sector and non-profit organizations play a significant role in the research efforts on Chagas disease, it is important to implement research strategies that promote translation of basic research into clinical practice. Recent international public-private initiatives address the potential of drug repositioning (i.e. finding second or further medical uses for known medications), which can substantially improve success in clinical trials and innovation in the pharmaceutical field. In this work, we present the computer-aided identification of the approved drugs clofazimine, benidipine and saquinavir as potential trypanocidal compounds and test their effects at both the biochemical and the cellular level on different parasite stages. Based on the obtained results, we discuss the biopharmaceutical, toxicological and physiopathological criteria applied in deciding to move clofazimine and benidipine into the preclinical phase, in an acute model of infection. The article illustrates the potential of computer-guided drug repositioning to integrate and optimize drug discovery and preclinical development; it also proposes rational rules to select which among repositioned candidates should advance to investigational drug status and offers a new insight on clofazimine and benidipine as candidate treatments for Chagas disease. One Sentence Summary: We present the computer-guided drug repositioning of three approved drugs as potential new treatments for Chagas disease, integrating computer-aided drug screening and biochemical, cellular and preclinical tests. PMID:25707014

  9. GC/IR computer-aided identification of anaerobic bacteria

    NASA Astrophysics Data System (ADS)

    Ye, Hunian; Zhang, Feng S.; Yang, Hua; Li, Zhu; Ye, Song

    1993-09-01

    A new method was developed to identify anaerobic bacteria by using pattern recognition. The method depends on GC/IR data. The system is intended for use as a precise, rapid, and reproducible aid in the identification of unknown isolates. Key words: anaerobic bacteria, pattern recognition, computer-aided identification, GC/IR. 1. INTRODUCTION. A major problem in the field of anaerobic bacteriology is the difficulty in accurately, precisely, and rapidly identifying unknown isolates. In the proceedings of the Third International Symposium on Rapid Methods and Automation in Microbiology, C. M. Moss said: "Chromatographic analysis is a new future for clinical microbiology." Twelve years have passed, and so far it seems that this is an idea whose time has not yet come, but it is close. Two major advances that have brought the technology forward, in terms of making it appropriate for use in the clinical laboratory, can also be cited. One is the development and implementation of fused silica capillary columns. In contrast to packed columns and those of greater width, these columns allow reproducible recovery of hydroxy fatty acids with the same carbon chain length. The second advance is the efficient data processing afforded by modern microcomputer systems. In addition, the practical steps for sample preparation are also an advance in the clinical laboratory. Chromatographic analysis here means mainly the analysis of fatty acids. The most common

  10. A new computer-assisted technique to aid personal identification.

    PubMed

    De Angelis, Danilo; Sala, Remo; Cantatore, Angela; Grandi, Marco; Cattaneo, Cristina

    2009-07-01

    The paper describes a procedure aimed at identification from two-dimensional (2D) images (video-surveillance tapes, for example) by comparison with a three-dimensional (3D) facial model of a suspect. The application is intended to provide a tool which can help in analyzing compatibility or incompatibility between a criminal and a suspect's facial traits. The authors apply the concept of "geometrically compatible images". The idea is to use a scanner to reconstruct a 3D facial model of a suspect and to compare it to a frame extracted from the video-surveillance sequence which shows the face of the perpetrator. Repositioning and reorientation of the 3D model according to subject's face framed in the crime scene photo are manually accomplished, after automatic resizing. Repositioning and reorientation are performed in correspondence of anthropometric landmarks, distinctive for that person and detected both on the 2D face and on the 3D model. In this way, the superimposition between the original two-dimensional facial image and the three-dimensional one is obtained and a judgment is formulated by an expert on the basis of the fit between the anatomical facial districts of the two subjects. The procedure reduces the influence of face orientation and may be a useful tool in identification.
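
    The procedure above rescales and aligns the 3D model to the 2D frame using corresponding anthropometric landmarks. The sketch below shows one standard way to estimate a 2D similarity transform (scale, rotation, translation) from such landmark pairs in the least-squares sense (the Umeyama/Procrustes solution), assuming the 3D model has already been projected to the image plane; it illustrates the resizing/alignment step only, is not the authors' software, and uses made-up landmark coordinates.

```python
# Minimal sketch: estimate the similarity transform (scale, rotation,
# translation) that best aligns projected 3D-model landmarks to the 2D
# photo landmarks in the least-squares (Umeyama/Procrustes) sense.
# Illustration only; landmark coordinates are made up.
import numpy as np

def similarity_transform(src, dst):
    """Return (scale, R, t) minimizing sum ||s*R@src_i + t - dst_i||^2."""
    src, dst = np.asarray(src, float), np.asarray(dst, float)
    mu_s, mu_d = src.mean(axis=0), dst.mean(axis=0)
    src_c, dst_c = src - mu_s, dst - mu_d
    cov = dst_c.T @ src_c / len(src)
    U, S, Vt = np.linalg.svd(cov)
    d = np.sign(np.linalg.det(U @ Vt))      # guard against reflections
    D = np.diag([1.0, d])
    R = U @ D @ Vt
    var_src = (src_c ** 2).sum() / len(src)
    scale = np.trace(np.diag(S) @ D) / var_src
    t = mu_d - scale * R @ mu_s
    return scale, R, t

if __name__ == "__main__":
    model_pts = [(0, 0), (10, 0), (5, 8)]       # landmarks on the projected 3D model
    photo_pts = [(2, 1), (7, 1), (4.5, 5)]      # same landmarks on the 2D frame
    s, R, t = similarity_transform(model_pts, photo_pts)
    aligned = (s * (R @ np.asarray(model_pts, float).T)).T + t
    print(s, aligned.round(2))
```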

  11. All-memristive neuromorphic computing with level-tuned neurons

    NASA Astrophysics Data System (ADS)

    Pantazi, Angeliki; Woźniak, Stanisław; Tuma, Tomas; Eleftheriou, Evangelos

    2016-09-01

    In the new era of cognitive computing, systems will be able to learn and interact with the environment in ways that will drastically enhance the capabilities of current processors, especially in extracting knowledge from vast amount of data obtained from many sources. Brain-inspired neuromorphic computing systems increasingly attract research interest as an alternative to the classical von Neumann processor architecture, mainly because of the coexistence of memory and processing units. In these systems, the basic components are neurons interconnected by synapses. The neurons, based on their nonlinear dynamics, generate spikes that provide the main communication mechanism. The computational tasks are distributed across the neural network, where synapses implement both the memory and the computational units, by means of learning mechanisms such as spike-timing-dependent plasticity. In this work, we present an all-memristive neuromorphic architecture comprising neurons and synapses realized by using the physical properties and state dynamics of phase-change memristors. The architecture employs a novel concept of interconnecting the neurons in the same layer, resulting in level-tuned neuronal characteristics that preferentially process input information. We demonstrate the proposed architecture in the tasks of unsupervised learning and detection of multiple temporal correlations in parallel input streams. The efficiency of the neuromorphic architecture along with the homogenous neuro-synaptic dynamics implemented with nanoscale phase-change memristors represent a significant step towards the development of ultrahigh-density neuromorphic co-processors.

  13. All-memristive neuromorphic computing with level-tuned neurons.

    PubMed

    Pantazi, Angeliki; Woźniak, Stanisław; Tuma, Tomas; Eleftheriou, Evangelos

    2016-09-01

    In the new era of cognitive computing, systems will be able to learn and interact with the environment in ways that will drastically enhance the capabilities of current processors, especially in extracting knowledge from vast amount of data obtained from many sources. Brain-inspired neuromorphic computing systems increasingly attract research interest as an alternative to the classical von Neumann processor architecture, mainly because of the coexistence of memory and processing units. In these systems, the basic components are neurons interconnected by synapses. The neurons, based on their nonlinear dynamics, generate spikes that provide the main communication mechanism. The computational tasks are distributed across the neural network, where synapses implement both the memory and the computational units, by means of learning mechanisms such as spike-timing-dependent plasticity. In this work, we present an all-memristive neuromorphic architecture comprising neurons and synapses realized by using the physical properties and state dynamics of phase-change memristors. The architecture employs a novel concept of interconnecting the neurons in the same layer, resulting in level-tuned neuronal characteristics that preferentially process input information. We demonstrate the proposed architecture in the tasks of unsupervised learning and detection of multiple temporal correlations in parallel input streams. The efficiency of the neuromorphic architecture along with the homogenous neuro-synaptic dynamics implemented with nanoscale phase-change memristors represent a significant step towards the development of ultrahigh-density neuromorphic co-processors. PMID:27455898

  14. Bytes and Bugs: Integrating Computer Programming with Bacteria Identification.

    ERIC Educational Resources Information Center

    Danciger, Michael

    1986-01-01

    By using a computer program to identify bacteria, students sharpen their analytical skills and gain familiarity with procedures used in laboratories outside the university. Although it is ideal for identifying a bacterium, the program can be adapted to many other disciplines. (Author)

  15. A Simple Computer Application for the Identification of Conifer Genera

    ERIC Educational Resources Information Center

    Strain, Steven R.; Chmielewski, Jerry G.

    2010-01-01

    The National Science Education Standards prescribe that an understanding of the importance of classifying organisms be one component of a student's educational experience in the life sciences. The use of a classification scheme to identify organisms is one way of addressing this goal. We describe Conifer ID, a computer application that assists…

  16. Correlations of Electrophysiological Measurements with Identification Levels of Ancient Chinese Characters

    PubMed Central

    Qi, Zhengyang; Wang, Xiaolong; Hao, Shuang; Zhu, Chuanlin; He, Weiqi; Luo, Wenbo

    2016-01-01

    Studies of event-related potential (ERP) in the human brain have shown that the N170 component can reliably distinguish among different object categories. However, it is unclear whether this is true for different identifiable levels within a single category. In the present study, we used ERP recording to examine the neural response to different identification levels and orientations (upright vs. inverted) of Chinese characters. The results showed that P1, N170, and P250 were modulated by different identification levels of Chinese characters. Moreover, time frequency analysis showed similar results, indicating that identification levels were associated with object recognition, particularly during processing of a single categorical stimulus. PMID:26982215

  17. Correlations of Electrophysiological Measurements with Identification Levels of Ancient Chinese Characters.

    PubMed

    Qi, Zhengyang; Wang, Xiaolong; Hao, Shuang; Zhu, Chuanlin; He, Weiqi; Luo, Wenbo

    2016-01-01

    Studies of event-related potential (ERP) in the human brain have shown that the N170 component can reliably distinguish among different object categories. However, it is unclear whether this is true for different identifiable levels within a single category. In the present study, we used ERP recording to examine the neural response to different identification levels and orientations (upright vs. inverted) of Chinese characters. The results showed that P1, N170, and P250 were modulated by different identification levels of Chinese characters. Moreover, time frequency analysis showed similar results, indicating that identification levels were associated with object recognition, particularly during processing of a single categorical stimulus. PMID:26982215

  18. Parallel computing-based sclera recognition for human identification

    NASA Astrophysics Data System (ADS)

    Lin, Yong; Du, Eliza Y.; Zhou, Zhi

    2012-06-01

    Compared to iris recognition, sclera recognition, which uses a line descriptor, can achieve comparable recognition accuracy at visible wavelengths. However, this method is too time-consuming to be implemented in a real-time system. In this paper, we propose a GPU-based parallel computing approach to reduce the sclera recognition time. We define a new descriptor to which information on the KD-tree structure and the sclera edge is added. The registration and matching task is divided into subtasks of various sizes according to their computational complexities. The affine transform parameters are generated by searching the KD tree. Texture memory, constant memory, and shared memory are used to store templates and transform matrices. The experimental results show that the proposed method executed on a GPU can improve the sclera matching speed by hundreds of times without decreasing accuracy.
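
    The paper's contribution is the GPU partitioning of descriptor registration and matching, which is hardware-specific. The CPU-side sketch below only illustrates the KD-tree matching idea the descriptor relies on: build a KD tree over template points and score a query by its mean nearest-neighbour distance. The synthetic data, threshold, and scoring are illustrative assumptions, not the published line-descriptor pipeline.

```python
# Minimal CPU sketch of the descriptor-matching idea: build a KD tree over
# template descriptor points and, for each query point, look up its nearest
# neighbour; the mean distance serves as a crude match score. Not the
# paper's GPU/line-descriptor pipeline; data are synthetic.
import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(0)
template = rng.uniform(0, 1, size=(500, 2))              # template sclera-line points
query = template[:200] + rng.normal(0, 0.01, (200, 2))   # noisy view of the same eye

tree = cKDTree(template)
dist, _ = tree.query(query, k=1)     # distance to nearest template point
score = dist.mean()
print(f"mean nearest-neighbour distance: {score:.4f}")
print("match" if score < 0.05 else "non-match")
```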

  19. Earth feature identification for onboard multispectral data editing: Computational experiments

    NASA Technical Reports Server (NTRS)

    Aherron, R. M.; Arduini, R. F.; Davis, R. E.; Huck, R. O.; Park, S. K.

    1981-01-01

    A computational model of the processes involved in multispectral remote sensing and data classification is developed as a tool for designing smart sensors which can process, edit, and classify the data that they acquire. An evaluation of sensor system performance and design tradeoffs involves classification rates and errors as a function of number and location of spectral channels, radiometric sensitivity and calibration accuracy, target discrimination assignments, and accuracy and frequency of compensation for imaging conditions. This model provides a link between the radiometric and statistical properties of the signals to be classified and the performance characteristics of electro-optical sensors and data processing devices. Preliminary computational results are presented which illustrate the editing performance of several remote sensing approaches.

  20. Computed tomography identification of an exophytic colonic liposarcoma.

    PubMed

    Chou, Chung Kuao; Chen, Sung-Ting

    2016-09-01

    It may be difficult to ascertain the relationship between a large intra-abdominal tumor and the adjacent organs if they are close together. In the current case, a definitive preoperative diagnosis of an exophytic colonic tumor was obtained by the demonstration of obtuse angles between the tumor and colon and by distinct recognition of the mucosa-submucosa of the colonic wall on computed tomography; the accuracy of this preoperative diagnosis was subsequently confirmed by pathologic findings. PMID:27594941

  1. Parallel Computation of the Topology of Level Sets

    SciTech Connect

    Pascucci, V; Cole-McLaughlin, K

    2004-12-16

    This paper introduces two efficient algorithms that compute the Contour Tree of a 3D scalar field F and its augmented version with the Betti numbers of each isosurface. The Contour Tree is a fundamental data structure in scientific visualization that is used to preprocess the domain mesh to allow optimal computation of isosurfaces with minimal overhead storage. The Contour Tree can also be used to build user interfaces reporting the complete topological characterization of a scalar field, as shown in Figure 1. Data exploration time is reduced since the user understands the evolution of level set components with changing isovalue. The Augmented Contour Tree provides even more accurate information, segmenting the range space of the scalar field into portions of invariant topology. The exploration time for a single isosurface is also improved since its genus is known in advance. Our first new algorithm augments any given Contour Tree with the Betti numbers of all possible corresponding isocontours in linear time with the size of the tree. Moreover, we show how to extend the scheme introduced in [3] with the Betti number computation without increasing its complexity. Thus, we improve the time complexity of our previous approach [10] from O(m log m) to O(n log n + m), where m is the number of cells and n is the number of vertices in the domain of F. Our second contribution is a new divide-and-conquer algorithm that computes the Augmented Contour Tree with improved efficiency. The approach computes the output Contour Tree by merging two intermediate Contour Trees and is independent of the interpolant. In this way we confine any knowledge regarding a specific interpolant to an independent function that computes the tree for a single cell. We have implemented this function for the trilinear interpolant and plan to replace it with higher order interpolants when needed. The time complexity is O(n + t log n), where t is the number of critical points of F. For the first time

  2. MAX--An Interactive Computer Program for Teaching Identification of Clay Minerals by X-ray Diffraction.

    ERIC Educational Resources Information Center

    Kohut, Connie K.; And Others

    1993-01-01

    Discusses MAX, an interactive computer program for teaching identification of clay minerals based on standard x-ray diffraction characteristics. The program provides tutorial-type exercises for identification of 16 clay standards, self-evaluation exercises, diffractograms of 28 soil clay minerals, and identification of nonclay minerals. (MDH)

  3. Computational identification of MoRFs in protein sequences

    PubMed Central

    Malhis, Nawar; Gsponer, Jörg

    2015-01-01

    Motivation: Intrinsically disordered regions of proteins play an essential role in the regulation of various biological processes. Key to their regulatory function is the binding of molecular recognition features (MoRFs) to globular protein domains in a process known as a disorder-to-order transition. Predicting the location of MoRFs in protein sequences with high accuracy remains an important computational challenge. Method: In this study, we introduce MoRFCHiBi, a new computational approach for fast and accurate prediction of MoRFs in protein sequences. MoRFCHiBi combines the outcomes of two support vector machine (SVM) models that take advantage of two different kernels with high noise tolerance. The first, SVMS, is designed to extract maximal information from the general contrast in amino acid compositions between MoRFs, their surrounding regions (Flanks), and the remainders of the sequences. The second, SVMT, is used to identify similarities between regions in a query sequence and MoRFs of the training set. Results: We evaluated the performance of our predictor by comparing its results with those of two currently available MoRF predictors, MoRFpred and ANCHOR. Using three test sets that have previously been collected and used to evaluate MoRFpred and ANCHOR, we demonstrate that MoRFCHiBi outperforms the other predictors with respect to different evaluation metrics. In addition, MoRFCHiBi is downloadable and fast, which makes it useful as a component in other computational prediction tools. Availability and implementation: http://www.chibi.ubc.ca/morf/. Contact: gsponer@chibi.ubc.ca. Supplementary information: Supplementary data are available at Bioinformatics online. PMID:25637562
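
    The core design of MoRFCHiBi, as described above, is the combination of two SVM scores. The sketch below shows the general pattern with scikit-learn: train two SVMs with different kernels on (placeholder) per-residue window features and average their class probabilities. The features, kernels, and labels are toy assumptions, not the published SVMS/SVMT models.

```python
# Minimal sketch of the "combine two SVM scores" idea using scikit-learn:
# two SVMs with different kernels are trained on placeholder per-residue
# window features and their class probabilities are averaged. The features
# and kernels are illustrative, not the published SVMS/SVMT models.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(1)
X = rng.normal(size=(200, 10))                  # toy window features per residue
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)   # toy MoRF / non-MoRF labels

svm_a = SVC(kernel="rbf", probability=True).fit(X, y)      # stands in for SVMS
svm_b = SVC(kernel="sigmoid", probability=True).fit(X, y)  # stands in for SVMT

X_new = rng.normal(size=(5, 10))
p = 0.5 * (svm_a.predict_proba(X_new)[:, 1] + svm_b.predict_proba(X_new)[:, 1])
print(np.round(p, 3))   # combined MoRF propensity per residue window
```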

  4. Cloud identification using genetic algorithms and massively parallel computation

    NASA Technical Reports Server (NTRS)

    Buckles, Bill P.; Petry, Frederick E.

    1996-01-01

    As a Guest Computational Investigator under the NASA-administered component of the High Performance Computing and Communication Program, we implemented a massively parallel genetic algorithm on the MasPar SIMD computer. Experiments were conducted using Earth Science data in the domains of meteorology and oceanography. Results obtained in these domains are competitive with, and in most cases better than, similar problems solved using other methods. In the meteorological domain, we chose to identify clouds using AVHRR spectral data. Four cloud speciations were used, although most researchers settle for three. Results were remarkably consistent across all tests (91% accuracy). Refinements of this method may lead to more timely and complete information for the Global Circulation Models (GCMs) that are prevalent in weather forecasting and global environment studies. In the oceanographic domain, we chose to identify ocean currents from a spectrometer having similar characteristics to AVHRR. Here the results were mixed (60% to 80% accuracy). If one is willing to run the experiment several times (say, 10), it is acceptable to claim the higher accuracy rating. This problem has never been successfully automated; therefore, these results are encouraging even though less impressive than the cloud experiment. Successful conclusion of an automated ocean current detection system would impact coastal fishing, naval tactics, and the study of micro-climates. Finally, we contributed to the basic knowledge of GA (genetic algorithm) behavior in parallel environments. We developed better knowledge of the use of subpopulations in the context of shared breeding pools and the migration of individuals. Rigorous experiments were conducted based on quantifiable performance criteria. While much of the work confirmed current wisdom, for the first time we were able to submit conclusive evidence. The software developed under this grant was placed in the public domain. An extensive user
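
    For readers unfamiliar with the method, the sketch below shows the generic genetic-algorithm loop underlying this kind of classifier tuning: a population of candidate solutions is evaluated against labelled data and evolved through tournament selection, one-point crossover, and mutation. The encoding, fitness function, and data are toy stand-ins, not the massively parallel MasPar implementation used for AVHRR cloud identification.

```python
# Minimal serial sketch of a genetic-algorithm classifier search (not the
# MasPar SIMD implementation): individuals encode per-channel thresholds,
# fitness is agreement with labelled "pixels", and tournament selection,
# one-point crossover and mutation produce each new generation.
import random

random.seed(0)
N_GENES, POP_SIZE, GENERATIONS = 4, 30, 40

def true_label(pixel):              # hidden rule standing in for "cloud vs clear"
    return int(sum(pixel) > 2.0)

pixels = [[random.random() for _ in range(N_GENES)] for _ in range(100)]
labelled = [(px, true_label(px)) for px in pixels]

def classify(thresholds, pixel):    # candidate rule: 2+ channels above threshold
    return int(sum(p > t for p, t in zip(pixel, thresholds)) >= 2)

def fitness(thresholds):
    return sum(classify(thresholds, px) == lab for px, lab in labelled)

def tournament(pop):
    return max(random.sample(pop, 3), key=fitness)

population = [[random.random() for _ in range(N_GENES)] for _ in range(POP_SIZE)]
for _ in range(GENERATIONS):
    next_gen = []
    while len(next_gen) < POP_SIZE:
        parent_a, parent_b = tournament(population), tournament(population)
        cut = random.randrange(1, N_GENES)            # one-point crossover
        child = parent_a[:cut] + parent_b[cut:]
        if random.random() < 0.2:                     # mutation
            child[random.randrange(N_GENES)] = random.random()
        next_gen.append(child)
    population = next_gen

best = max(population, key=fitness)
print("best training accuracy:", fitness(best) / len(labelled))
```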

  5. Computer method for identification of boiler transfer functions

    NASA Technical Reports Server (NTRS)

    Miles, J. H.

    1971-01-01

    An iterative computer method is described for identifying boiler transfer functions using frequency response data. An objective penalized performance measure and a nonlinear minimization technique are used to cause the locus of points generated by a transfer function to resemble the locus of points obtained from frequency response measurements. Different transfer functions can be tried until a satisfactory empirical transfer function to the system is found. To illustrate the method, some examples and some results from a study of a set of data consisting of measurements of the inlet impedance of a single tube forced flow boiler with inserts are given.
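
    In the same spirit (though not the report's penalized performance measure), the sketch below fits a parametric transfer function to frequency-response data by nonlinear least squares: the model is evaluated at the measured frequencies and the complex residuals are minimized. The first-order-plus-dead-time model and the synthetic data are illustrative assumptions.

```python
# Minimal sketch of fitting a transfer function to frequency-response data:
# choose a parametric model, evaluate it at the measured frequencies, and let
# a nonlinear least-squares routine shrink the complex residuals. The model
# (first order plus dead time) and the synthetic data are illustrative.
import numpy as np
from scipy.optimize import least_squares

def model(params, w):
    K, tau, T = params
    s = 1j * w
    return K * np.exp(-T * s) / (tau * s + 1.0)

# Synthetic "measured" frequency response from a known system plus noise.
w = np.logspace(-2, 1, 40)
true = model([2.0, 3.0, 0.5], w)
rng = np.random.default_rng(0)
measured = true + 0.01 * (rng.normal(size=w.size) + 1j * rng.normal(size=w.size))

def residuals(params):
    err = model(params, w) - measured
    return np.concatenate([err.real, err.imag])   # least_squares needs real values

fit = least_squares(residuals, x0=[1.0, 1.0, 0.1])
print("estimated K, tau, T:", np.round(fit.x, 3))
```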

  6. System-Level Virtualization for High Performance Computing

    SciTech Connect

    Vallee, Geoffroy R; Naughton, III, Thomas J; Engelmann, Christian; Ong, Hong Hoe; Scott, Stephen L

    2008-01-01

    System-level virtualization has been a research topic since the 70's but regained popularity during the past few years because of the availability of efficient solutions such as Xen and the implementation of hardware support in commodity processors (e.g. Intel-VT, AMD-V). However, a majority of system-level virtualization projects are guided by the server consolidation market. As a result, current virtualization solutions appear not to be suitable for high performance computing (HPC), which is typically based on large-scale systems. On the other hand, there is significant interest in exploiting virtual machines (VMs) within HPC for a number of other reasons. By virtualizing the machine, one is able to run a variety of operating systems and environments as needed by the applications. Virtualization allows users to isolate workloads, improving security and reliability. It is also possible to support non-native environments and/or legacy operating environments through virtualization. In addition, it is possible to balance workloads, use migration techniques to relocate applications from failing machines, and isolate faulty systems for repair. This document presents the challenges for the implementation of a system-level virtualization solution for HPC. It also presents a brief survey of the different approaches and techniques to address these challenges.

  7. Application of replica plating and computer analysis for rapid identification of bacteria in some foods. I. Identification scheme.

    PubMed

    Corlett, D A; Lee, J S; Sinnhuber, R O

    1965-09-01

    A method was devised and tested for a quantitative identification of microbial flora in foods. The colonies developing on the initial isolation plates were picked with sterile toothpicks and inoculated on a master plate in prearranged spacing and order. The growth on the master plates was then replicated on a series of solid-agar plates containing differential or selective agents. The characteristic growth and physiological responses of microbial isolates to penicillin, tylosin, vancomycin, streptomycin, chloramphenicol, neomycin, colistin, and to S S Agar, Staphylococcus Medium No. 110, and Potato Dextrose Agar were recorded, together with Gram reaction and cell morphology. This information was then fed into an IBM 1410 digital computer which grouped and analyzed each isolate into 10 microbial genera, or groups, according to the identification key. The identification scheme was established by use of reference culture studies and from the literature. This system was used to analyze the microbial flora in dover sole (Microstomus pacificus) and ground beef. The method described in this article enables one to examine large numbers of microbial isolates with simplicity. PMID:5325942

  9. Bouc-Wen model parameter identification for a MR fluid damper using computationally efficient GA.

    PubMed

    Kwok, N M; Ha, Q P; Nguyen, M T; Li, J; Samali, B

    2007-04-01

    A non-symmetrical Bouc-Wen model is proposed in this paper for magnetorheological (MR) fluid dampers. The model considers the effect of non-symmetrical hysteresis, which was not taken into account in the original Bouc-Wen model. The model parameters are identified with a Genetic Algorithm (GA), exploiting its flexibility in the identification of complex dynamics. The computational efficiency of the proposed GA is improved by absorbing the selection stage into the crossover and mutation operations. Crossover and mutation are also made adaptive to the fitness values so that their probabilities need not be user-specified. Instead of using a sufficiently large number of generations or a pre-determined fitness value, the algorithm termination criterion is formulated on the basis of a statistical hypothesis test, thus enhancing the performance of the parameter identification. Experimental test data of the damper displacement and force are used to verify the proposed approach, with satisfactory parameter identification results. PMID:17349644
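
    For reference, a widely used symmetric Bouc-Wen form for an MR damper is shown below; the paper's contribution is a non-symmetric modification of this form (not reproduced here), with the GA searching the resulting parameter vector. Parameter naming and sign conventions vary across the literature.

```latex
% A commonly used (symmetric) Bouc-Wen form for an MR damper, relating the
% damper force F to displacement x and the evolutionary variable z.
% Parameters c_0, k_0, x_0, \alpha, A, \beta, \gamma, n form the vector the
% GA would search; conventions vary, and the paper's non-symmetric variant
% modifies this basic form.
F = c_0\,\dot{x} + k_0\,(x - x_0) + \alpha z,
\qquad
\dot{z} = A\,\dot{x} - \beta\,\lvert\dot{x}\rvert\,\lvert z\rvert^{\,n-1} z
          - \gamma\,\dot{x}\,\lvert z\rvert^{\,n}.
```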

  10. Computer-aided identification of prostatic adenocarcinoma: Segmentation of glandular structures

    PubMed Central

    Peng, Yahui; Jiang, Yulei; Eisengart, Laurie; Healy, Mark A; Straus, Francis H; Yang, Ximing J

    2011-01-01

    Background: Identification of individual prostatic glandular structures is an important prerequisite to quantitative histological analysis of prostate cancer with the aid of a computer. We have developed a computer method to segment individual glandular units and to extract quantitative image features, for computer identification of prostatic adenocarcinoma. Methods: Two sets of digital histology images were used: database I (n = 57) for developing and testing the computer technique, and database II (n = 116) for independent validation. The segmentation technique was based on a k-means clustering and a region-growing method. Computer segmentation results were evaluated subjectively and also compared quantitatively against manual gland outlines, using the Jaccard similarity measure. Quantitative features that were extracted from the computer segmentation results include average gland size, spatial gland density, and average gland circularity. Linear discriminant analysis (LDA) was used to combine quantitative image features. Classification performance was evaluated with receiver operating characteristic (ROC) analysis and the area under the ROC curve (AUC). Results: Jaccard similarity coefficients between computer segmentation and manual outlines of individual glands were between 0.63 and 0.72 for non-cancer and between 0.48 and 0.54 for malignant glands, respectively, similar to an interobserver agreement of 0.79 for non-cancer and 0.75 for malignant glands, respectively. The AUC value for the features of average gland size and gland density combined via LDA was 0.91 for database I and 0.96 for database II. Conclusions: Using a computer, we are able to delineate individual prostatic glands automatically and identify prostatic adenocarcinoma accurately, based on the quantitative image features extracted from computer-segmented glandular structures. PMID:21845231
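
    The gland-level agreement figures quoted above are Jaccard similarity coefficients between a computer-segmented region S and the corresponding manual outline M:

```latex
% Jaccard similarity coefficient between the computer-segmented gland region S
% and the manually outlined region M (1 = perfect overlap, 0 = disjoint).
J(S, M) = \frac{\lvert S \cap M \rvert}{\lvert S \cup M \rvert}.
```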

  11. Computational Identification of Key Regulators in Two Different Colorectal Cancer Cell Lines

    PubMed Central

    Wlochowitz, Darius; Haubrock, Martin; Arackal, Jetcy; Bleckmann, Annalen; Wolff, Alexander; Beißbarth, Tim; Wingender, Edgar; Gültas, Mehmet

    2016-01-01

    Transcription factors (TFs) are gene regulatory proteins that are essential for an effective regulation of the transcriptional machinery. Today, it is known that their expression plays an important role in several types of cancer. Computational identification of key players in specific cancer cell lines is still an open challenge in cancer research. In this study, we present a systematic approach which combines colorectal cancer (CRC) cell lines, namely 1638N-T1 and CMT-93, and well-established computational methods in order to compare these cell lines on the level of transcriptional regulation as well as on a pathway level, i.e., the cancer cell-intrinsic pathway repertoire. For this purpose, we firstly applied the Trinity platform to detect signature genes, and then applied analyses of the geneXplain platform to these for detection of upstream transcriptional regulators and their regulatory networks. We created a CRC-specific position weight matrix (PWM) library based on the TRANSFAC database (release 2014.1) to minimize the rate of false predictions in the promoter analyses. Using our proposed workflow, we specifically focused on revealing the similarities and differences in transcriptional regulation between the two CRC cell lines, and report a number of well-known, cancer-associated TFs with significantly enriched binding sites in the promoter regions of the signature genes. We show that, although the signature genes of both cell lines show no overlap, they may still be regulated by common TFs in CRC. Based on our findings, we suggest that canonical Wnt signaling is activated in 1638N-T1, but inhibited in CMT-93 through cross-talks of Wnt signaling with the VDR signaling pathway and/or LXR-related pathways. Furthermore, our findings provide indication of several master regulators being present such as MLK3 and Mapk1 (ERK2) which might be important in cell proliferation, migration, and invasion of 1638N-T1 and CMT-93, respectively. Taken together, we provide
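
    The promoter analysis described above rests on position weight matrices (PWMs). The sketch below illustrates the basic mechanics of PWM scanning: convert a position frequency matrix to log-odds scores against a uniform background and slide it along a promoter sequence. The matrix, pseudocounts, and threshold are made-up illustrations, not a TRANSFAC entry or the geneXplain workflow.

```python
# Minimal sketch of PWM-based binding-site scanning: convert a (made-up)
# position frequency matrix into log-odds scores against a uniform
# background and slide it along a promoter sequence, reporting windows
# above a threshold. Not a TRANSFAC matrix or the geneXplain workflow.
import math

ALPHABET = "ACGT"
pfm = [  # one entry per motif position, nucleotide counts at that position
    {"A": 8, "C": 1, "G": 1, "T": 0},
    {"A": 0, "C": 0, "G": 9, "T": 1},
    {"A": 1, "C": 8, "G": 1, "T": 0},
    {"A": 0, "C": 1, "G": 0, "T": 9},
]

def log_odds(pfm, pseudo=0.5, background=0.25):
    """Turn counts into log2-odds scores against a uniform background."""
    cols = []
    for col in pfm:
        total = sum(col.values()) + 4 * pseudo
        cols.append({b: math.log2((col[b] + pseudo) / total / background)
                     for b in ALPHABET})
    return cols

def scan(seq, pwm, threshold=4.0):
    """Return (position, score) for windows scoring at or above the threshold."""
    hits = []
    for i in range(len(seq) - len(pwm) + 1):
        score = sum(pwm[j][seq[i + j]] for j in range(len(pwm)))
        if score >= threshold:
            hits.append((i, round(score, 2)))
    return hits

pwm = log_odds(pfm)
print(scan("TTAGCTAGCTAACGCTAGCT", pwm))
```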

  12. The Use of Computer-Assisted Identification of ARIMA Time-Series.

    ERIC Educational Resources Information Center

    Brown, Roger L.

    This study was conducted to determine the effects of using various levels of tutorial statistical software for the tentative identification of nonseasonal ARIMA models, a statistical technique proposed by Box and Jenkins for the interpretation of time-series data. The Box-Jenkins approach is an iterative process encompassing several stages of…

  13. Identifying the Computer Competency Levels of Recreation Department Undergraduates

    ERIC Educational Resources Information Center

    Zorba, Erdal

    2011-01-01

    Computer-based and web-based applications are as major instructional tools to increase undergraduates' motivation at school. In the recreation field usage of, computer and the internet based recreational applications has become more prevalent in order to present visual and interactive entertainment activities. Recreation department undergraduates…

  14. Identification of critical sediment source areas at regional level

    NASA Astrophysics Data System (ADS)

    Fargas, D.; Casasnovas, J. A. Martínez; Poch, R.

    In order to identify critical sediment sources in large catchments using easily available terrain information at regional scale, a methodology has been developed to obtain the qualitative assessment necessary for further studies. The main objective of the model is to use basic terrain data related to the erosive processes which contribute to the production, transport and accumulation of sediments through the main water paths in the watershed. The model is based on the selection of homogeneous zones regarding drainage density and lithology, achieved by joining the spatial basic units through a rating system. The values of drainage density are rated according to an erosion class (Bucko & Mazurova, 1958). The lithology is rated by erosion indexes adapted from FAO (1977). The combination and reclassification of the results yields five qualitative classes of sediment emission risk. This methodology has been tested and validated for the watershed of the Joaquín Costa reservoir (NE Spain), with a surface area of 1500 km². The mapping scale was 1:100,000 and the model was implemented through a vector GIS (Arc/Info). The prediction was checked by means of photo-interpretation and field work, which gave an accuracy of 78.5%. The proposed methodology has proved useful as an initial approach for erosion assessment and soil conservation planning at the regional level, and also for selecting priority areas where further analyses can be developed.

  15. Evolutionary Computation for the Identification of Emergent Behavior in Autonomous Systems

    NASA Technical Reports Server (NTRS)

    Terrile, Richard J.; Guillaume, Alexandre

    2009-01-01

    Over the past several years the Center for Evolutionary Computation and Automated Design at the Jet Propulsion Laboratory has developed a technique based on Evolutionary Computational Methods (ECM) that allows for the automated optimization of complex computationally modeled systems. An important application of this technique is for the identification of emergent behaviors in autonomous systems. Mobility platforms such as rovers or airborne vehicles are now being designed with autonomous mission controllers that can find trajectories over a solution space that is larger than can reasonably be tested. It is critical to identify control behaviors that are not predicted and can have surprising results (both good and bad). These emergent behaviors need to be identified, characterized and either incorporated into or isolated from the acceptable range of control characteristics. We use cluster analysis of automatically retrieved solutions to identify isolated populations of solutions with divergent behaviors.

  16. Reliability of frontal sinus by cone beam-computed tomography (CBCT) for individual identification.

    PubMed

    Cossellu, Gianguido; De Luca, Stefano; Biagi, Roberto; Farronato, Giampietro; Cingolani, Mariano; Ferrante, Luigi; Cameriere, Roberto

    2015-12-01

    Analysis of the frontal sinus is an important tool in personal identification. Cone beam computed tomography (CBCT) is also progressively replacing conventional radiography and multi-slice computed tomography (MSCT) in human identification. The aim of this study is to develop a reproducible technique and measurements from 3D reconstructions obtained with CBCT for use in human identification. CBCT from 150 patients (91 female, 59 male), aged between 15 and 78 years, was analysed with the specific software program MIMICS 11.11 (Materialise N.V., Leuven, Belgium). Corresponding 3D volumes were generated, and the maximal dimensions along the three directions (x, y, z), X_M, Y_M, Z_M (in mm), the total volume, V_t (in mm³), and the total surface area, S_t (in mm²), were calculated. Correlation analysis showed that sinus surfaces were strongly correlated with their volumes (r = 0.976). Frontal sinuses were separate in 21 subjects (14%), fused in 67 (44.6%) and found on only one side (unilateral) in 9 (6%). A Prominent Middle of Fused Sinus (PMS) was found in 53 subjects (35.3%). The intra-observer (0.963-0.999) and inter-observer (0.973-0.999) variability showed great agreement and substantial homogeneity of evaluation.

  17. Computer-assisted photo identification outperforms visible implant elastomers in an endangered salamander, Eurycea tonkawae.

    PubMed

    Bendik, Nathan F; Morrison, Thomas A; Gluesenkamp, Andrew G; Sanders, Mark S; O'Donnell, Lisa J

    2013-01-01

    Despite recognition that nearly one-third of the 6300 amphibian species are threatened with extinction, our understanding of the general ecology and population status of many amphibians is relatively poor. A widely-used method for monitoring amphibians involves injecting captured individuals with unique combinations of colored visible implant elastomer (VIE). We compared VIE identification to a less-invasive method - computer-assisted photographic identification (photoID) - in endangered Jollyville Plateau salamanders (Eurycea tonkawae), a species with a known range limited to eight stream drainages in central Texas. We based photoID on the unique pigmentation patterns on the dorsal head region of 1215 individual salamanders using identification software Wild-ID. We compared the performance of photoID methods to VIEs using both 'high-quality' and 'low-quality' images, which were taken using two different camera types and technologies. For high-quality images, the photoID method had a false rejection rate of 0.76% compared to 1.90% for VIEs. Using a comparable dataset of lower-quality images, the false rejection rate was much higher (15.9%). Photo matching scores were negatively correlated with time between captures, suggesting that evolving natural marks could increase misidentification rates in longer term capture-recapture studies. Our study demonstrates the utility of large-scale capture-recapture using photo identification methods for Eurycea and other species with stable natural marks that can be reliably photographed. PMID:23555669

  18. Computer-Assisted Photo Identification Outperforms Visible Implant Elastomers in an Endangered Salamander, Eurycea tonkawae

    PubMed Central

    Bendik, Nathan F.; Morrison, Thomas A.; Gluesenkamp, Andrew G.; Sanders, Mark S.; O’Donnell, Lisa J.

    2013-01-01

    Despite recognition that nearly one-third of the 6300 amphibian species are threatened with extinction, our understanding of the general ecology and population status of many amphibians is relatively poor. A widely-used method for monitoring amphibians involves injecting captured individuals with unique combinations of colored visible implant elastomer (VIE). We compared VIE identification to a less-invasive method – computer-assisted photographic identification (photoID) – in endangered Jollyville Plateau salamanders (Eurycea tonkawae), a species with a known range limited to eight stream drainages in central Texas. We based photoID on the unique pigmentation patterns on the dorsal head region of 1215 individual salamanders using identification software Wild-ID. We compared the performance of photoID methods to VIEs using both ‘high-quality’ and ‘low-quality’ images, which were taken using two different camera types and technologies. For high-quality images, the photoID method had a false rejection rate of 0.76% compared to 1.90% for VIEs. Using a comparable dataset of lower-quality images, the false rejection rate was much higher (15.9%). Photo matching scores were negatively correlated with time between captures, suggesting that evolving natural marks could increase misidentification rates in longer term capture-recapture studies. Our study demonstrates the utility of large-scale capture-recapture using photo identification methods for Eurycea and other species with stable natural marks that can be reliably photographed. PMID:23555669

  19. Optical packet header identification utilizing an all-optical feedback chaotic reservoir computing

    NASA Astrophysics Data System (ADS)

    Qin, Jie; Zhao, Qingchun; Xu, Dongjiao; Yin, Hongxi; Chang, Ying; Huang, Degen

    2016-06-01

    In this paper, an all-optical reservoir computing (RC) setup is proposed for identifying the types of optical packet headers in an optical packet switching (OPS) network. In numerical simulations, the identification error rates for 3-bit and 32-bit optical headers at a bit rate of 10 Gbps are as low as 0.625% and 2.25%, respectively. The identification errors are examined as the feedback strength and feedback delay are varied, from which the optimal feedback parameters are obtained. The all-optical feedback RC setup is robust to white Gaussian noise. The recognition error is acceptable when the signal-to-noise ratio (SNR) is greater than 15 dB.
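
    As a rough software analogue only (the paper's reservoir is an all-optical, delayed-feedback setup that is not reproduced here), the echo-state sketch below mirrors the training principle: a fixed random reservoir driven by the header bits plus a linear readout trained by ridge regression. Reservoir size, scaling factors and the ridge parameter are assumptions.

    ```python
    # Hedged software analogue of reservoir-computing header classification.
    import numpy as np

    rng = np.random.default_rng(0)
    N_RES, N_BITS = 200, 3                              # reservoir size, header length (assumed)
    W_in = rng.normal(0.0, 1.0, N_RES)                  # input weights
    W = rng.normal(0.0, 1.0, (N_RES, N_RES))
    W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))     # keep spectral radius below 1

    def reservoir_state(bits):
        x = np.zeros(N_RES)
        for b in bits:                                  # drive the reservoir bit by bit
            x = np.tanh(W @ x + W_in * b)
        return x

    headers = [np.array(list(np.binary_repr(i, N_BITS)), dtype=float) for i in range(2 ** N_BITS)]
    X = np.stack([reservoir_state(h) for h in headers])
    Y = np.eye(len(headers))                            # one-hot targets, one class per header
    W_out = np.linalg.solve(X.T @ X + 1e-6 * np.eye(N_RES), X.T @ Y)   # ridge-regression readout
    predicted = np.argmax(X @ W_out, axis=1)            # should recover classes 0..7 on this toy set
    ```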

  20. Rational use of cognitive resources: levels of analysis between the computational and the algorithmic.

    PubMed

    Griffiths, Thomas L; Lieder, Falk; Goodman, Noah D

    2015-04-01

    Marr's levels of analysis-computational, algorithmic, and implementation-have served cognitive science well over the last 30 years. But the recent increase in the popularity of the computational level raises a new challenge: How do we begin to relate models at different levels of analysis? We propose that it is possible to define levels of analysis that lie between the computational and the algorithmic, providing a way to build a bridge between computational- and algorithmic-level models. The key idea is to push the notion of rationality, often used in defining computational-level models, deeper toward the algorithmic level. We offer a simple recipe for reverse-engineering the mind's cognitive strategies by deriving optimal algorithms for a series of increasingly more realistic abstract computational architectures, which we call "resource-rational analysis."

  1. Sequencing and Computational Approaches to Identification and Characterization of Microbial Organisms

    PubMed Central

    Yadav, Brijesh Singh; Ronda, Venkateswarlu; Vashista, Dinesh P; Sharma, Bhaskar

    2013-01-01

    The recent advances in sequencing technologies and computational approaches are propelling scientists ever closer towards complete understanding of human-microbial interactions. The powerful sequencing platforms are rapidly producing huge amounts of nucleotide sequence data which are compiled into huge databases. This sequence data can be retrieved, assembled, and analyzed for identification of microbial pathogens and diagnosis of diseases. In this article, we present a commentary on how the metagenomics incorporated with microarray and new sequencing techniques are helping microbial detection and characterization. PMID:25288901

  2. Three Computer Programs for Use in Introductory Level Physics Laboratories.

    ERIC Educational Resources Information Center

    Kagan, David T.

    1984-01-01

    Describes three computer programs which operate on Apple II+ microcomputers: (1) a menu-driven graph drawing program; (2) a simulation of the Millikan oil drop experiment; and (3) a program used to study the half-life of silver. (Instructions for obtaining the programs from the author are included.) (JN)

  3. Efficient Computation of the Topology of Level Sets

    SciTech Connect

    Pascucci, V; Cole-McLaughlin, K

    2002-07-19

    This paper introduces two efficient algorithms that compute the Contour Tree of a 3D scalar field F and its augmented version with the Betti numbers of each isosurface. The Contour Tree is a fundamental data structure in scientific visualization that is used to pre-process the domain mesh to allow optimal computation of isosurfaces with minimal storage overhead. The Contour Tree can also be used to build user interfaces reporting the complete topological characterization of a scalar field, as shown in Figure 1. In the first part of the paper we present a new scheme that augments the Contour Tree with the Betti numbers of each isocontour in linear time. We show how to extend the scheme introduced in [3] with the Betti number computation without increasing its complexity. Thus we improve the time complexity of our previous approach [8] from O(m log m) to O(n log n + m), where m is the number of tetrahedra and n is the number of vertices in the domain of F. In the second part of the paper we introduce a new divide-and-conquer algorithm that computes the Augmented Contour Tree for scalar fields defined on rectilinear grids. The central part of the scheme computes the output contour tree by merging two intermediate contour trees and is independent of the interpolant. In this way we confine any knowledge regarding a specific interpolant to an oracle that computes the tree for a single cell. We have implemented this oracle for the trilinear interpolant and plan to replace it with higher order interpolants when needed. The complexity of the scheme is O(n + t log n), where t is the number of critical points of F. This allows, for the first time, the Contour Tree to be computed in linear time in many practical cases, when t = O(n^(1-ε)). We report the running times for a parallel implementation of our algorithm, showing good scalability with the number of processors.
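
    For orientation, the sketch below shows only the join-sweep half of the standard contour-tree construction (sweep vertices from high to low value and track merging components with union-find); the algorithms in the paper additionally build the split tree, merge the two trees, augment the result with Betti numbers and handle rectilinear grids through an interpolant oracle, none of which is shown. The simple vertex/edge data layout is an assumption.

    ```python
    # Hedged sketch of a join sweep: records every union event as
    # (sweep vertex, highest vertex of the absorbed component).  A full
    # implementation would prune regular vertices to keep only critical points.
    def join_sweep(values, edges):
        """values: scalar value per vertex; edges: iterable of (u, v) vertex pairs."""
        n = len(values)
        parent = list(range(n))

        def find(i):                      # union-find with path halving
            while parent[i] != i:
                parent[i] = parent[parent[i]]
                i = parent[i]
            return i

        order = sorted(range(n), key=lambda i: values[i], reverse=True)
        rank_of = {v: r for r, v in enumerate(order)}
        neighbors = {i: [] for i in range(n)}
        for u, v in edges:
            neighbors[u].append(v)
            neighbors[v].append(u)

        merges = []                       # union events recorded during the sweep
        highest = {}
        for v in order:                   # sweep downward in function value
            highest[v] = v
            for u in neighbors[v]:
                if rank_of[u] < rank_of[v]:            # u was processed earlier (higher value)
                    ru, rv = find(u), find(v)
                    if ru != rv:
                        merges.append((v, highest[ru]))
                        parent[ru] = rv
                        highest[rv] = max(highest[rv], highest[ru], key=lambda i: values[i])
        return merges

    # tiny example: a 5-vertex path whose two maxima (vertices 1 and 3) join at vertex 2
    print(join_sweep([1.0, 5.0, 0.5, 4.0, 2.0], [(0, 1), (1, 2), (2, 3), (3, 4)]))
    ```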

  4. Computerized nipple identification for multiple image analysis in computer-aided diagnosis

    SciTech Connect

    Zhou Chuan; Chan Heangping; Paramagul, Chintana; Roubidoux, Marilyn A.; Sahiner, Berkman; Hadjiiski, Labomir M.; Petrick, Nicholas

    2004-10-01

    Correlation of information from multiple-view mammograms (e.g., MLO and CC views, bilateral views, or current and prior mammograms) can improve the performance of breast cancer diagnosis by radiologists or by computer. The nipple is a reliable and stable landmark on mammograms for the registration of multiple mammograms. However, accurate identification of nipple location on mammograms is challenging because of the variations in image quality and in the nipple projections, resulting in some nipples being nearly invisible on the mammograms. In this study, we developed a computerized method to automatically identify the nipple location on digitized mammograms. First, the breast boundary was obtained using a gradient-based boundary tracking algorithm, and then the gray level profiles along the inside and outside of the boundary were identified. A geometric convergence analysis was used to limit the nipple search to a region of the breast boundary. A two-stage nipple detection method was developed to identify the nipple location using the gray level information around the nipple, the geometric characteristics of nipple shapes, and the texture features of glandular tissue or ducts which converge toward the nipple. At the first stage, a rule-based method was designed to identify the nipple location by detecting significant changes of intensity along the gray level profiles inside and outside the breast boundary and the changes in the boundary direction. At the second stage, a texture orientation-field analysis was developed to estimate the nipple location based on the convergence of the texture pattern of glandular tissue or ducts towards the nipple. The nipple location was finally determined from the detected nipple candidates by a rule-based confidence analysis. In this study, 377 and 367 randomly selected digitized mammograms were used for training and testing the nipple detection algorithm, respectively. Two experienced radiologists identified the nipple locations

  5. Social Identification and Interpersonal Communication in Computer-Mediated Communication: What You Do versus Who You Are in Virtual Groups

    ERIC Educational Resources Information Center

    Wang, Zuoming; Walther, Joseph B.; Hancock, Jeffrey T.

    2009-01-01

    This study investigates the influence of interpersonal communication and intergroup identification on members' evaluations of computer-mediated groups. Participants (N= 256) in 64 four-person groups interacted through synchronous computer chat. Subgroup assignments to minimal groups instilled significantly greater in-group versus out-group…

  6. Computational aspects of hot-wire identification of thermal conductivity and diffusivity under high temperature

    NASA Astrophysics Data System (ADS)

    Vala, Jiří; Jarošová, Petra

    2016-07-01

    Development of advanced materials resistant to high temperature, needed in particular for the design of heat storage for low-energy and passive buildings, requires simple, inexpensive and reliable methods of identification of their temperature-sensitive thermal conductivity and diffusivity, covering both a well-considered experimental setup and the implementation of robust and effective computational algorithms. Special geometrical configurations offer a possibility of quasi-analytical evaluation of temperature development for direct problems, whereas inverse problems of simultaneous evaluation of thermal conductivity and diffusivity must be handled carefully, using some least-squares (minimum variance) arguments. This paper demonstrates the proper mathematical and computational approach to such a model problem, which exploits the radial symmetry of hot-wire measurements, including its numerical implementation.
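
    As a hedged illustration of the least-squares step only, the sketch below fits the classical line-source (hot-wire) temperature rise ΔT(t) = q/(4πλ)·E₁(r²/(4αt)) to synthetic data to recover the conductivity λ and diffusivity α. The heating power q and radius r are assumed known, every numerical value is illustrative, and this is not the authors' formulation.

    ```python
    # Hedged sketch of hot-wire identification by nonlinear least squares.
    import numpy as np
    from scipy.optimize import curve_fit
    from scipy.special import exp1

    q, r = 20.0, 1.0e-3                   # W/m and m (assumed measurement constants)

    def line_source_rise(t, lam, a):
        """Classical line-source temperature rise at radius r and time t."""
        return q / (4.0 * np.pi * lam) * exp1(r**2 / (4.0 * a * t))

    # synthetic "measurement" with noise, generated only to exercise the fit
    t = np.linspace(1.0, 120.0, 200)
    true_lam, true_a = 1.2, 6.0e-7
    dT = line_source_rise(t, true_lam, true_a) + np.random.default_rng(1).normal(0, 0.02, t.size)

    (lam_est, a_est), _ = curve_fit(line_source_rise, t, dT, p0=(1.0, 1.0e-6))
    ```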

  7. 3-D visualization and identification of biological microorganisms using partially temporal incoherent light in-line computational holographic imaging.

    PubMed

    Moon, Inkyu; Javidi, Bahram

    2008-12-01

    We present a new method for three-dimensional (3-D) visualization and identification of biological microorganisms using partially temporal incoherent light in-line (PTILI) computational holographic imaging and multivariate statistical methods. For 3-D data acquisition of biological microorganisms, band-pass filtered white light is used to illuminate a biological sample. The transversely and longitudinally diffracted pattern of the biological sample is magnified by a microscope objective (MO) and is optically recorded with an image sensor array interfaced with a computer. Three-dimensional reconstruction of the biological sample from the diffraction pattern is accomplished by using the computational Fresnel propagation method. Principal components analysis and nonparametric inference algorithms are applied to the 3-D complex amplitude of the biological sample for identification purposes. Experiments indicate that the proposed system can be useful for identifying biological microorganisms. To the best of our knowledge, this is the first report on using PTILI computational holographic microscopy for identification of biological microorganisms.
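
    For context, numerical Fresnel propagation of a recorded hologram can be sketched with the standard transfer-function form shown below; the grid size, pixel pitch, wavelength and propagation distance are illustrative assumptions, and the paper's full pipeline (partially coherent illumination, PCA and nonparametric inference) is not reproduced.

    ```python
    # Hedged sketch: refocus a complex field plane by plane with the Fresnel
    # transfer function evaluated in the Fourier domain.
    import numpy as np

    def fresnel_propagate(field, wavelength, pixel_pitch, z):
        """Propagate a complex field by distance z (transfer-function form)."""
        ny, nx = field.shape
        fx = np.fft.fftfreq(nx, d=pixel_pitch)
        fy = np.fft.fftfreq(ny, d=pixel_pitch)
        FX, FY = np.meshgrid(fx, fy)
        H = np.exp(1j * 2 * np.pi * z / wavelength) * \
            np.exp(-1j * np.pi * wavelength * z * (FX**2 + FY**2))
        return np.fft.ifft2(np.fft.fft2(field) * H)

    # example: refocus a recorded hologram (here just a dummy array) by 50 micrometres
    hologram = np.ones((512, 512), dtype=complex)
    refocused = fresnel_propagate(hologram, wavelength=532e-9, pixel_pitch=3.45e-6, z=50e-6)
    amplitude, phase = np.abs(refocused), np.angle(refocused)
    ```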

  8. Identification of Propionibacteria to the species level using Fourier transform infrared spectroscopy and artificial neural networks.

    PubMed

    Dziuba, B

    2013-01-01

    Fourier transform infrared spectroscopy (FTIR) and artificial neural networks (ANNs) were used to identify species of Propionibacteria strains. The aim of the study was to improve the methodology for identifying Propionibacteria strains at the species level, because the earlier approach, in which the differentiation index D (calculated from Pearson's correlation) and cluster analyses were used to describe the correlation between the Fourier transform infrared spectra of the bacteria as molecular systems, had brought unsatisfactory results. More advanced statistical methods for identification from the FTIR spectra, based on artificial neural networks (ANNs), were therefore used. In this experiment, the FTIR spectra of Propionibacteria strains stored in the library were used to develop artificial neural networks for their identification. Several multilayer perceptrons (MLP) and probabilistic neural networks (PNN) were tested. The practical value of selected artificial neural networks was assessed based on identification results for the spectra of 9 reference strains and 28 isolates. To verify the results of isolate identification, a PCR-based method with pairs of species-specific primers was used. The use of artificial neural networks in FTIR spectral analyses, as the most advanced chemometric method, supported correct identification of 93% of bacteria of the genus Propionibacterium to the species level.

  9. Identification of rickettsial isolates at the species level using multi-spacer typing

    PubMed Central

    Fournier, Pierre-Edouard; Raoult, Didier

    2007-01-01

    Background In order to estimate whether multi-spacer typing (MST), based on the sequencing of variable intergenic spacers, could serve for the identification of Rickettsia at the species level, we applied it to 108 rickettsial isolates or arthropod amplicons that include representatives of 23 valid Rickettsia species. Results MST combining the dksA-xerC, mppA-purC, and rpmE-tRNAfMet spacer sequences identified 61 genotypes, allowing the differentiation of each species by at least one distinct genotype. In addition, MST was discriminatory at the strain level in six species for which several isolates or arthropod amplicons were available. Conclusion MST proved to be a reproducible and high-resolution genotyping method allowing clear identification of rickettsial isolates at the species level and further additional differentiation of strains within some species. PMID:17662158

  10. A conceptual framework of computations in mid-level vision

    PubMed Central

    Kubilius, Jonas; Wagemans, Johan; Op de Beeck, Hans P.

    2014-01-01

    If a picture is worth a thousand words, as an English idiom goes, what should those words—or, rather, descriptors—capture? What format of image representation would be sufficiently rich if we were to reconstruct the essence of images from their descriptors? In this paper, we set out to develop a conceptual framework that would be: (i) biologically plausible in order to provide a better mechanistic understanding of our visual system; (ii) sufficiently robust to apply in practice on realistic images; and (iii) able to tap into underlying structure of our visual world. We bring forward three key ideas. First, we argue that surface-based representations are constructed based on feature inference from the input in the intermediate processing layers of the visual system. Such representations are computed in a largely pre-semantic (prior to categorization) and pre-attentive manner using multiple cues (orientation, color, polarity, variation in orientation, and so on), and explicitly retain configural relations between features. The constructed surfaces may be partially overlapping to compensate for occlusions and are ordered in depth (figure-ground organization). Second, we propose that such intermediate representations could be formed by a hierarchical computation of similarity between features in local image patches and pooling of highly-similar units, and reestimated via recurrent loops according to the task demands. Finally, we suggest to use datasets composed of realistically rendered artificial objects and surfaces in order to better understand a model's behavior and its limitations. PMID:25566044

  11. Assessing Pre-Service Teachers' Computer Phobia Levels in Terms of Gender and Experience, Turkish Sample

    ERIC Educational Resources Information Center

    Ursavas, Omer Faruk; Karal, Hasan

    2009-01-01

    In this study it is aimed to determine the level of pre-service teachers' computer phobia. Whether or not computer phobia meaningfully varies statistically according to gender and computer experience has been tested in the study. The study was performed on 430 pre-service teachers at the Education Faculty in Rize/Turkey. Data in the study were…

  12. 24 CFR 990.165 - Computation of project expense level (PEL).

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... with 24 CFR 941.606 for a mixed-finance transaction, then the project covered by the mixed-finance... 24 Housing and Urban Development 4 2010-04-01 2010-04-01 false Computation of project expense....165 Computation of project expense level (PEL). (a) Computation of PEL. The PEL is calculated in...

  13. 24 CFR 990.165 - Computation of project expense level (PEL).

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... with 24 CFR 941.606 for a mixed-finance transaction, then the project covered by the mixed-finance... 24 Housing and Urban Development 4 2014-04-01 2014-04-01 false Computation of project expense....165 Computation of project expense level (PEL). (a) Computation of PEL. The PEL is calculated in...

  14. Assigning unique identification numbers to new user accounts and groups in a computing environment with multiple registries

    DOEpatents

    DeRobertis, Christopher V.; Lu, Yantian T.

    2010-02-23

    A method, system, and program storage device for creating a new user account or user group with a unique identification number in a computing environment having multiple user registries is provided. In response to receiving a command to create a new user account or user group, an operating system of a clustered computing environment automatically checks multiple registries configured for the operating system to determine whether a candidate identification number for the new user account or user group has been assigned already to one or more existing user accounts or groups, respectively. The operating system automatically assigns the candidate identification number to the new user account or user group created in a target user registry if the checking indicates that the candidate identification number has not been assigned already to any of the existing user accounts or user groups, respectively.
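
    A plain-Python sketch of the idea described in the abstract follows: check a candidate numeric ID against every configured registry and assign it only if it is unused everywhere. The registry contents and starting candidate are illustrative assumptions, and the patented operating-system mechanism is of course more involved.

    ```python
    # Hedged sketch: pick a numeric ID that is unused in every configured registry.
    def assign_unique_id(registries, start=1000):
        """registries: iterable of sets of IDs already in use across the environment."""
        used = set().union(*registries)
        candidate = start
        while candidate in used:          # advance until the candidate is free everywhere
            candidate += 1
        return candidate

    # example with three hypothetical registries (local files, LDAP, NIS)
    local_registry = {1000, 1001, 1003}
    ldap_registry = {1002}
    nis_registry = set()
    new_uid = assign_unique_id([local_registry, ldap_registry, nis_registry])   # -> 1004
    ```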

  15. Identification of oxygen-related midgap level in GaAs

    NASA Technical Reports Server (NTRS)

    Lagowski, J.; Lin, D. G.; Gatos, H. C.; Aoyama, T.

    1984-01-01

    An oxygen-related deep level ELO was identified in GaAs employing Bridgman-grown crystals with controlled oxygen doping. The activation energy of ELO is almost the same as that of the dominant midgap level: EL2. This fact impedes the identification of ELO by standard deep level transient spectroscopy. However, it was found that the electron capture cross section of ELO is about four times greater than that of EL2. This characteristic served as the basis for the separation and quantitative investigation of ELO employing detailed capacitance transient measurements in conjunction with reference measurements on crystals grown without oxygen doping and containing only EL2.

  16. Secondary School Students' Levels of Understanding in Computing Exponents

    ERIC Educational Resources Information Center

    Pitta-Pantazi, Demetra; Christou, Constantinos; Zachariades, Theodossios

    2007-01-01

    The aim of this study is to describe and analyze students' levels of understanding of exponents within the context of procedural and conceptual learning via the conceptual change and prototypes' theory. The study was conducted with 202 secondary school students with the use of a questionnaire and semi-structured interviews. The results suggest…

  17. High level waste storage tanks 242-A evaporator standards/requirement identification document

    SciTech Connect

    Biebesheimer, E.

    1996-01-01

    This document, the Standards/Requirements Identification Document (S/RIDS) for the subject facility, represents the necessary and sufficient requirements to provide an adequate level of protection of the worker, public health and safety, and the environment. It lists those source documents from which requirements were extracted, and those requirements documents considered but from which no requirements were taken. Documents considered as source documents included State and Federal Regulations, DOE Orders, and DOE Standards.

  18. On parameters identification of computational models of vibrations during quiet standing of humans

    NASA Astrophysics Data System (ADS)

    Barauskas, R.; Krušinskienė, R.

    2007-12-01

    Vibration of the center of pressure (COP) of the human body on the base of support during quiet standing is a widely studied clinical measurement, which provides useful information about the physical and health condition of an individual. In this work, vibrations of the COP of a human body in the forward-backward direction during still standing are generated using a controlled inverted pendulum (CIP) model with a single degree of freedom (dof), supplied with a proportional, integral and differential (PID) controller, which represents the behavior of the central neural system of a human body, and excited by a cumulative disturbance vibration generated within the body due to breathing or any other physical condition. The identification of the model and disturbance parameters is an important stage in creating a close-to-reality computational model able to evaluate features of the disturbance. The aim of this study is to present a CIP model parameter identification approach based on the information captured by the time series of the COP signal. The identification procedure is based on error function minimization. The error function is formulated in terms of the time laws of computed and experimentally measured COP vibrations. As an alternative, the error function is formulated in terms of the stabilogram diffusion function (SDF). The minimization of the error functions is carried out by employing methods based on sensitivity functions of the error with respect to the model and excitation parameters. The sensitivity functions are obtained by using variational techniques. The inverse dynamic problem approach has been employed in order to establish the properties of the disturbance time laws ensuring satisfactory coincidence of measured and computed COP vibration laws. The main difficulty of the investigated problem is encountered during the model validation stage. Generally, neither the PID controller parameter set nor the disturbance time law is known in advance. In this work, an error function
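
    A toy sketch of the model class described above follows: a single-dof linearised inverted pendulum stabilised by a PID controller, driven by a random disturbance, with a squared-error function comparing simulated and "measured" COP traces. The dynamics, the COP approximation and every parameter value are simplifying assumptions, not the authors' model or identification procedure.

    ```python
    # Hedged sketch: PID-controlled inverted pendulum and a COP error function.
    import numpy as np

    def simulate_cop(kp, ki, kd, disturbance, dt=0.01, m=70.0, h=1.0, inertia=70.0, g=9.81):
        theta = dtheta = integral = 0.0
        cop = []
        for d in disturbance:
            torque = kp * theta + kd * dtheta + ki * integral      # PID ankle torque
            ddtheta = (m * g * h * theta - torque + d) / inertia    # linearised pendulum
            dtheta += ddtheta * dt
            theta += dtheta * dt
            integral += theta * dt
            cop.append(torque / (m * g))                            # crude COP estimate
        return np.array(cop)

    def error(params, measured_cop, disturbance):
        kp, ki, kd = params
        return np.sum((simulate_cop(kp, ki, kd, disturbance) - measured_cop) ** 2)

    rng = np.random.default_rng(0)
    disturbance = rng.normal(0.0, 1.0, 3000)                        # surrogate internal disturbance
    measured = simulate_cop(1200.0, 300.0, 350.0, disturbance)      # pretend this came from a force plate
    print(error((1000.0, 250.0, 300.0), measured, disturbance))     # value a fitting routine would minimise
    ```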

  19. Fostering Girls' Computer Literacy through Laptop Learning: Can Mobile Computers Help To Level Out the Gender Difference?

    ERIC Educational Resources Information Center

    Schaumburg, Heike

    The goal of this study was to find out if the difference between boys and girls in computer literacy can be leveled out in a laptop program where each student has his/her own mobile computer to work with at home and at school. Ninth grade students (n=113) from laptop and non-laptop classes in a German high school were tested for their computer…

  20. Computed Tomography (CT) Scanning Facilitates Early Identification of Neonatal Cystic Fibrosis Piglets

    PubMed Central

    Guillon, Antoine; Chevaleyre, Claire; Barc, Celine; Berri, Mustapha; Adriaensen, Hans; Lecompte, François; Villemagne, Thierry; Pezant, Jérémy; Delaunay, Rémi; Moënne-Loccoz, Joseph; Berthon, Patricia; Bähr, Andrea; Wolf, Eckhard; Klymiuk, Nikolai; Attucci, Sylvie; Ramphal, Reuben; Sarradin, Pierre; Buzoni-Gatel, Dominique; Si-Tahar, Mustapha; Caballero, Ignacio

    2015-01-01

    Background Cystic Fibrosis (CF) is the most prevalent autosomal recessive disease in the Caucasian population. A cystic fibrosis transmembrane conductance regulator knockout (CFTR-/-) pig that displays most of the features of the human CF disease has been recently developed. However, CFTR-/- pigs presents a 100% prevalence of meconium ileus that leads to death in the first hours after birth, requiring a rapid diagnosis and surgical intervention to relieve intestinal obstruction. Identification of CFTR-/- piglets is usually performed by PCR genotyping, a procedure that lasts between 4 to 6 h. Here, we aimed to develop a procedure for rapid identification of CFTR-/- piglets that will allow placing them under intensive care soon after birth and immediately proceeding with the surgical correction. Methods and Principal Findings Male and female CFTR+/- pigs were crossed and the progeny was examined by computed tomography (CT) scan to detect the presence of meconium ileus and facilitate a rapid post-natal surgical intervention. Genotype was confirmed by PCR. CT scan presented a 94.4% sensitivity to diagnose CFTR-/- piglets. Diagnosis by CT scan reduced the birth-to-surgery time from a minimum of 10 h down to a minimum of 2.5 h and increased the survival of CFTR-/- piglets to a maximum of 13 days post-surgery as opposed to just 66 h after later surgery. Conclusion CT scan imaging of meconium ileus is an accurate method for rapid identification of CFTR-/- piglets. Early CT detection of meconium ileus may help to extend the lifespan of CFTR-/- piglets and, thus, improve experimental research on CF, still an incurable disease. PMID:26600426

  1. VISPA: a computational pipeline for the identification and analysis of genomic vector integration sites.

    PubMed

    Calabria, Andrea; Leo, Simone; Benedicenti, Fabrizio; Cesana, Daniela; Spinozzi, Giulio; Orsini, Massimilano; Merella, Stefania; Stupka, Elia; Zanetti, Gianluigi; Montini, Eugenio

    2014-01-01

    The analysis of the genomic distribution of viral vector genomic integration sites is a key step in hematopoietic stem cell-based gene therapy applications, allowing assessment of both the safety and the efficacy of the treatment and study of the basic aspects of hematopoiesis and stem cell biology. Identifying vector integration sites requires ad hoc bioinformatics tools with stringent requirements in terms of computational efficiency, flexibility, and usability. We developed VISPA (Vector Integration Site Parallel Analysis), a pipeline for automated integration site identification and annotation based on a distributed environment with a simple Galaxy web interface. VISPA was successfully used for the bioinformatics analysis of the follow-up of two lentiviral vector-based hematopoietic stem-cell gene therapy clinical trials. Our pipeline provides a reliable and efficient tool to assess the safety and efficacy of integrating vectors in clinical settings. PMID:25342980

  2. Computational methods for the identification of spatially varying stiffness and damping in beams

    NASA Technical Reports Server (NTRS)

    Banks, H. T.; Rosen, I. G.

    1986-01-01

    A numerical approximation scheme for the estimation of functional parameters in Euler-Bernoulli models for the transverse vibration of flexible beams with tip bodies is developed. The method permits the identification of spatially varying flexural stiffness and Voigt-Kelvin viscoelastic damping coefficients which appear in the hybrid system of ordinary and partial differential equations and boundary conditions describing the dynamics of such structures. An inverse problem is formulated as a least squares fit to data subject to constraints in the form of a vector system of abstract first order evolution equations. Spline-based finite element approximations are used to finite dimensionalize the problem. Theoretical convergence results are given and numerical studies carried out on both conventional (serial) and vector computers are discussed.

  3. Identification of transposon insertion polymorphisms by computational comparative analysis of next generation personal genome data

    NASA Astrophysics Data System (ADS)

    Luo, Xuemei; Dehne, Frank; Liang, Ping

    2011-11-01

    Structural variations (SVs) in a genome are now recognized as a prominent and important type of genetic variation. Among all types of SVs, the identification of transposon insertion polymorphisms (TIPs) is more challenging due to the highly repetitive nature and extremely large copy numbers of transposon sequences. We developed a computational method, TIP-finder, to identify TIPs through analysis of next-generation personal genome data. We tested the efficiency of TIP-finder with simulated data and were able to detect about 88% of TIPs with a precision of ≥91%. Using TIP-finder to analyze Solexa paired-end sequence data at deep coverage for six genomes representing two trio families, we identified a total of 5569 TIPs, consisting of 4881, 456, 91, and 141 insertions from Alu, L1, SVA and HERV, respectively, representing the most comprehensive analysis of this type of genetic variation.

  4. NLSCIDNT user's guide maximum likelihood parameter identification computer program with nonlinear rotorcraft model

    NASA Technical Reports Server (NTRS)

    1979-01-01

    A nonlinear, maximum likelihood, parameter identification computer program (NLSCIDNT) is described which evaluates rotorcraft stability and control coefficients from flight test data. The optimal estimates of the parameters (stability and control coefficients) are determined (identified) by minimizing the negative log likelihood cost function. The minimization technique is the Levenberg-Marquardt method, which behaves like the steepest descent method when it is far from the minimum and behaves like the modified Newton-Raphson method when it is nearer the minimum. Twenty-one states and 40 measurement variables are modeled, and any subset may be selected. States which are not integrated may be fixed at an input value, or time history data may be substituted for the state in the equations of motion. Any aerodynamic coefficient may be expressed as a nonlinear polynomial function of selected 'expansion variables'.
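
    To illustrate only the numerical core (Levenberg-Marquardt minimisation of a residual vector), the sketch below fits two parameters of a simple first-order response with SciPy; NLSCIDNT's rotorcraft model, its 21 states and 40 measurement variables, and its negative log-likelihood cost function are not reproduced.

    ```python
    # Hedged sketch: Levenberg-Marquardt fit of a toy two-parameter model.
    import numpy as np
    from scipy.optimize import least_squares

    t = np.linspace(0.0, 5.0, 100)
    true_a, true_b = 2.0, 1.5
    measured = true_a * (1.0 - np.exp(-true_b * t)) + np.random.default_rng(2).normal(0, 0.02, t.size)

    def residuals(params):
        a, b = params
        return a * (1.0 - np.exp(-b * t)) - measured      # residual vector, not its square

    fit = least_squares(residuals, x0=[1.0, 1.0], method="lm")   # Levenberg-Marquardt
    a_hat, b_hat = fit.x
    ```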

  5. Features preferred in identification system based on computer mouse movements

    NASA Astrophysics Data System (ADS)

    Jašek, Roman; Kolařík, Martin

    2016-06-01

    Biometric identification systems build features, which describe people, using various data. Usually it is not known which features should be chosen, and a dedicated process called feature selection is applied to resolve this uncertainty. The relevant features selected by this process then describe the people in the system as well as possible. It is likely that the relevance of the selected features also means that these features describe the important aspects of the measured behavior and that they partly reveal how the behavior works. This paper uses this idea and evaluates the results of many feature selection runs in a system that identifies people by how they move a computer mouse. It has been found that the most frequently selected features repeat between runs, and that these features describe similar aspects of the movements.

  6. [Computational approaches for identification and classification of transposable elements in eukaryotic genomes].

    PubMed

    Xu, Hong-En; Zhang, Hua-Hao; Han, Min-Jin; Shen, Yi-Hong; Huang, Xian-Zhi; Xiang, Zhong-Huai; Zhang, Ze

    2012-08-01

    Repetitive sequences (repeats) represent a significant fraction of the eukaryotic genomes and can be divided into tandem repeats, segmental duplications, and interspersed repeats on the basis of their sequence characteristics and how they are formed. Most interspersed repeats are derived from transposable elements (TEs). Eukaryotic TEs have been subdivided into two major classes according to the intermediate they use to move. The transposition and amplification of TEs have a great impact on the evolution of genes and the stability of genomes. However, identification and classification of TEs are complex and difficult due to the fact that their structure and classification are complex and diverse compared with those of other types of repeats. Here, we briefly introduced the function and classification of TEs, and summarized three different steps for identification, classification and annotation of TEs in eukaryotic genomes: (1) assembly of a repeat library, (2) repeat correction and classification, and (3) genome annotation. The existing computational approaches for each step were summarized and the advantages and disadvantages of the approaches were also highlighted in this review. To accurately identify, classify, and annotate the TEs in eukaryotic genomes requires combined methods. This review provides useful information for biologists who are not familiar with these approaches to find their way through the forest of programs.

  7. Computer Literacy in Pennsylvania Community Colleges. Competencies in a Beginning Level College Computer Literacy Course.

    ERIC Educational Resources Information Center

    Tortorelli, Ann Eichorn

    A study was conducted at the 14 community colleges (17 campuses) in Pennsylvania to assess the perceptions of faculty about the relative importance of course content items in a beginning credit course in computer literacy, and to survey courses currently being offered. A detailed questionnaire consisting of 96 questions based on MECC (Minnesota…

  8. Bladed-shrouded-disc aeroelastic analyses: Computer program updates in NASTRAN level 17.7

    NASA Technical Reports Server (NTRS)

    Gallo, A. M.; Elchuri, V.; Skalski, S. C.

    1981-01-01

    In October 1979, a computer program based on the state-of-the-art compressor and structural technologies applied to bladed-shrouded-disc was developed. The program was made operational in NASTRAN Level 16. The bladed disc computer program was updated for operation in NASTRAN Level 17.7. The supersonic cascade unsteady aerodynamics routine UCAS, delivered as part of the NASTRAN Level 16 program, was recoded to improve its execution time. These improvements are presented.

  9. Computer-based study strategies for students with learning disabilities: individual differences associated with adoption level.

    PubMed

    Anderson-Inman, L; Knox-Quinn, C; Horney, M A

    1996-09-01

    This article reports results from a study of the use of technology to support students with learning disabilities in the use of effective study strategies. Thirty secondary students were given laptop computers and taught a variety of computer-based study strategies designed to facilitate information recording, organization, and manipulation. Results suggest that students adopted this innovation at three levels: (a) Power Users (skilled, independent users, integrating the computer into their schoolwork); (b) Prompted Users (skilled computer users, but requiring prompting); and (c) Reluctant Users (having limited knowledge and working only under supervision). Intelligence and reading test scores were associated with adoption levels in a statistically significant way.

  10. NEW Fe I LEVEL ENERGIES AND LINE IDENTIFICATIONS FROM STELLAR SPECTRA

    SciTech Connect

    Peterson, Ruth C.; Kurucz, Robert L.

    2015-01-01

    The spectrum of the Fe I atom is critical to many areas of astrophysics and beyond. Measurements of the energies of its high-lying levels remain woefully incomplete, however, despite extensive laboratory and solar analysis. In this work, we use high-resolution archival absorption-line ultraviolet and optical spectra of stars whose warm temperatures favor moderate Fe I excitation. We derive the energy for a particular upper level in Kurucz's semiempirical calculations by adopting a trial value that yields the same wavelength for a given line predicted to be about as strong as that of a strong unidentified spectral line observed in the stellar spectra, then checking the new wavelengths of other strong predicted transitions that share the same upper level for coincidence with other strong observed unidentified lines. To date, this analysis has provided the upper energies of 66 Fe I levels. Many new energy levels are higher than those accessible to laboratory experiments; several exceed the Fe I ionization energy. These levels provide new identifications for over 2000 potentially detectable lines. Almost all of the new levels of odd parity include UV lines that were detected but unclassified in laboratory Fe I absorption spectra, providing an external check on the energy values. We motivate and present the procedure, provide the resulting new energy levels and their uncertainties, list all the potentially detectable UV and optical new Fe I line identifications and their gf values, point out new lines of astrophysical interest, and discuss the prospects for additional Fe I energy level determinations.

  11. Identification of Yersinia enterocolitica at the Species and Subspecies Levels by Fourier Transform Infrared Spectroscopy ▿

    PubMed Central

    Kuhm, Andrea Elisabeth; Suter, Daniel; Felleisen, Richard; Rau, Jörg

    2009-01-01

    Yersinia enterocolitica and other Yersinia species, such as Y. pseudotuberculosis, Y. bercovieri, and Y. intermedia, were differentiated using Fourier transform infrared spectroscopy (FT-IR) combined with artificial neural network analysis. A set of well defined Yersinia strains from Switzerland and Germany was used to create a method for FT-IR-based differentiation of Yersinia isolates at the species level. The isolates of Y. enterocolitica were also differentiated by FT-IR into the main biotypes (biotypes 1A, 2, and 4) and serotypes (serotypes O:3, O:5, O:9, and “non-O:3, O:5, and O:9”). For external validation of the constructed methods, independently obtained isolates of different Yersinia species were used. A total of 79.9% of Y. enterocolitica sensu stricto isolates were identified correctly at the species level. The FT-IR analysis allowed the separation of all Y. bercovieri, Y. intermedia, and Y. rohdei strains from Y. enterocolitica, which could not be differentiated by the API 20E test system. The probability for correct biotype identification of Y. enterocolitica isolates was 98.3% (41 externally validated strains). For correct serotype identification, the probability was 92.5% (42 externally validated strains). In addition, the presence or absence of the ail gene, one of the main pathogenicity markers, was demonstrated using FT-IR. The probability for correct identification of isolates concerning the ail gene was 98.5% (51 externally validated strains). This indicates that it is possible to obtain information about genus, species, and in the case of Y. enterocolitica also subspecies type with a single measurement. Furthermore, this is the first example of the identification of specific pathogenicity using FT-IR. PMID:19617388

  12. CaPSID: A bioinformatics platform for computational pathogen sequence identification in human genomes and transcriptomes

    PubMed Central

    2012-01-01

    Background It is now well established that nearly 20% of human cancers are caused by infectious agents, and the list of human oncogenic pathogens will grow in the future for a variety of cancer types. Whole tumor transcriptome and genome sequencing by next-generation sequencing technologies presents an unparalleled opportunity for pathogen detection and discovery in human tissues but requires development of new genome-wide bioinformatics tools. Results Here we present CaPSID (Computational Pathogen Sequence IDentification), a comprehensive bioinformatics platform for identifying, querying and visualizing both exogenous and endogenous pathogen nucleotide sequences in tumor genomes and transcriptomes. CaPSID includes a scalable, high performance database for data storage and a web application that integrates the genome browser JBrowse. CaPSID also provides useful metrics for sequence analysis of pre-aligned BAM files, such as gene and genome coverage, and is optimized to run efficiently on multiprocessor computers with low memory usage. Conclusions To demonstrate the usefulness and efficiency of CaPSID, we carried out a comprehensive analysis of both a simulated dataset and transcriptome samples from ovarian cancer. CaPSID correctly identified all of the human and pathogen sequences in the simulated dataset, while in the ovarian dataset CaPSID’s predictions were successfully validated in vitro. PMID:22901030

  13. A Computational Model for the Identification of Biochemical Pathways in the Krebs Cycle

    SciTech Connect

    Oliveira, Joseph S.; Bailey, Colin G.; Jones-Oliveira, Janet B.; Dixon, David A.; Gull, Dean W.; Chandler, Mary L.

    2003-03-01

    We have applied an algorithmic methodology which provably decomposes any complex network into a complete family of principal subcircuits to study the minimal circuits that describe the Krebs cycle. Every operational behavior that the network is capable of exhibiting can be represented by some combination of these principal subcircuits and this computational decomposition is linearly efficient. We have developed a computational model that can be applied to biochemical reaction systems which accurately renders pathways of such reactions via directed hypergraphs (Petri nets). We have applied the model to the citric acid cycle (Krebs cycle). The Krebs cycle, which oxidizes the acetyl group of acetyl CoA to CO2 and reduces NAD and FAD to NADH and FADH2 is a complex interacting set of nine subreaction networks. The Krebs cycle was selected because of its familiarity to the biological community and because it exhibits enough complexity to be interesting in order to introduce this novel analytic approach. This study validates the algorithmic methodology for the identification of significant biochemical signaling subcircuits, based solely upon the mathematical model and not upon prior biological knowledge. The utility of the algebraic-combinatorial model for identifying the complete set of biochemical subcircuits as a data set is demonstrated for this important metabolic process.

  14. A computational model for the identification of biochemical pathways in the krebs cycle.

    PubMed

    Oliveira, Joseph S; Bailey, Colin G; Jones-Oliveira, Janet B; Dixon, David A; Gull, Dean W; Chandler, Mary L

    2003-01-01

    We have applied an algorithmic methodology which provably decomposes any complex network into a complete family of principal subcircuits to study the minimal circuits that describe the Krebs cycle. Every operational behavior that the network is capable of exhibiting can be represented by some combination of these principal subcircuits and this computational decomposition is linearly efficient. We have developed a computational model that can be applied to biochemical reaction systems which accurately renders pathways of such reactions via directed hypergraphs (Petri nets). We have applied the model to the citric acid cycle (Krebs cycle). The Krebs cycle, which oxidizes the acetyl group of acetyl CoA to CO2 and reduces NAD and FAD to NADH and FADH2, is a complex interacting set of nine subreaction networks. The Krebs cycle was selected because of its familiarity to the biological community and because it exhibits enough complexity to be interesting in order to introduce this novel analytic approach. This study validates the algorithmic methodology for the identification of significant biochemical signaling subcircuits, based solely upon the mathematical model and not upon prior biological knowledge. The utility of the algebraic-combinatorial model for identifying the complete set of biochemical subcircuits as a data set is demonstrated for this important metabolic process.

  15. Multivariate Effects of Level of Education, Computer Ownership, and Computer Use on Female Students' Attitudes towards CALL

    ERIC Educational Resources Information Center

    Rahimi, Mehrak; Yadollahi, Samaneh

    2012-01-01

    The aim of this study was investigating Iranian female students' attitude towards CALL and its relationship with their level of education, computer ownership, and frequency of use. One hundred and forty-two female students (50 junior high-school students, 49 high-school students and 43 university students) participated in this study. They filled…

  16. Computer experiments on periodic systems identification using rotor blade transient flapping-torsion responses at high advance ratio

    NASA Technical Reports Server (NTRS)

    Hohenemser, K. H.; Prelewicz, D. A.

    1974-01-01

    Systems identification methods have recently been applied to rotorcraft to estimate stability derivatives from transient flight control response data. While these applications assumed a linear constant coefficient representation of the rotorcraft, the computer experiments described in this paper used transient responses in flap-bending and torsion of a rotor blade at high advance ratio which is a rapidly time varying periodic system.

  17. Computer-Assisted Instruction in Elementary Logic at the University Level. Technical Report No. 239.

    ERIC Educational Resources Information Center

    Goldberg, Adele; Suppes, Patrick

    Earlier research by the authors in the design and use of computer-assisted instructional systems and curricula for teaching mathematical logic to gifted elementary school students has been extended to the teaching of university-level courses. This report is a description of the curriculum and problem types of a computer-based course offered at…

  18. An electrical device for computing theoretical draw-downs of ground-water levels

    USGS Publications Warehouse

    Remson, Irwin; Halstead, M.H.

    1955-01-01

    The construction, calibration and use of an electrical "slide rule" for computing theoretical drawdowns of ground-water levels are described. The instrument facilitates the computation of drawdowns under given conditions of discharge or recharge by means of the Theis nonequilibrium equation. It is simple to construct and use and can be a valuable aid in ground-water studies.   
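
    For reference, the Theis nonequilibrium equation evaluated by such a device is s = (Q/4πT)·W(u) with u = r²S/(4Tt), where W is the well function (the exponential integral). A minimal numerical sketch follows; all parameter values are illustrative assumptions.

    ```python
    # Hedged sketch: Theis nonequilibrium drawdown computed directly.
    import numpy as np
    from scipy.special import exp1

    def theis_drawdown(Q, T, S, r, t):
        """Drawdown s = (Q / 4*pi*T) * W(u), with u = r^2 * S / (4*T*t)."""
        u = r**2 * S / (4.0 * T * t)
        return Q / (4.0 * np.pi * T) * exp1(u)     # exp1 is the well function W(u)

    # e.g. Q = 1000 m^3/day, T = 500 m^2/day, S = 2e-4, r = 100 m, after 1 day
    s = theis_drawdown(Q=1000.0, T=500.0, S=2.0e-4, r=100.0, t=1.0)
    ```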

  19. A Study of Effectiveness of Computer Assisted Instruction (CAI) over Classroom Lecture (CRL) at ICS Level

    ERIC Educational Resources Information Center

    Kaousar, Tayyeba; Choudhry, Bushra Naoreen; Gujjar, Aijaz Ahmed

    2008-01-01

    This study was aimed to evaluate the effectiveness of CAI vs. classroom lecture for computer science at ICS level. The objectives were to compare the learning effects of two groups with classroom lecture and computer-assisted instruction studying the same curriculum and the effects of CAI and CRL in terms of cognitive development. Hypotheses of…

  20. Computational techniques in tribology and material science at the atomic level

    NASA Technical Reports Server (NTRS)

    Ferrante, J.; Bozzolo, G. H.

    1992-01-01

    Computations in tribology and material science at the atomic level present considerable difficulties. Computational techniques ranging from first-principles to semi-empirical and their limitations are discussed. Example calculations of metallic surface energies using semi-empirical techniques are presented. Finally, application of the methods to calculation of adhesion and friction are presented.

  1. Identification of Nonlinear Micron-Level Mechanics for a Precision Deployable Joint

    NASA Technical Reports Server (NTRS)

    Bullock, S. J.; Peterson, L. D.

    1994-01-01

    The experimental identification of micron-level nonlinear joint mechanics and dynamics for a pin-clevis joint used in a precision, adaptive, deployable space structure is investigated. The force-state mapping method is used to identify the behavior of the joint under a preload. The results of applying a single tension-compression cycle to the joint under a tensile preload are presented. The observed micron-level behavior is highly nonlinear and involves all six rigid body motion degrees-of-freedom of the joint. It also suggests that, at micron levels of motion, modelling of the joint mechanics and dynamics must include the interactions between all internal components, such as the pin, bushings, and the joint node.

  2. Groundwater contamination: identification of source signal by time-reverse mass transport computation and filtering

    NASA Astrophysics Data System (ADS)

    Koussis, A. S.; Mazi, K.; Lykoudis, S.; Argyriou, A.

    2003-04-01

    Source signal identification is a forensic task, within regulatory and legal activities. Estimation of the contaminant's release history by reverse-solution (stepping back in time) of the mass transport equation, ∂C/∂t + u ∂C/∂x = D ∂²C/∂x², is an ill-posed problem (its solution is non-unique and unstable). For this reason we propose the recovery of the source signal from measured concentration profile data through a numerical technique that is based on the premise of advection-dominated transport. We derive an explicit numerical scheme by discretising the pure advection equation, ∂C/∂t + u ∂C/∂x = 0, such that it also models gradient-transport by matching numerical diffusion (leading truncation error term) to physical dispersion. The match is achieved by appropriate choice of the scheme's spatial weighting coefficient θ as a function of the grid Peclet number P = uΔx/D: θ = 0.5 − P⁻¹. This is a novel and efficient direct solution approach for the signal identification problem at hand that can accommodate space-variable transport parameters as well. First, we perform numerical experiments to define proper grids (in terms of the Courant number C = uΔt/Δx and the grid Peclet number P) for control of spurious oscillations (instability). We then assess recovery of source signals, from perfect as well as from error-seeded field data, considering field data resulting from single- and double-peaked source signals. With perfect data, the scheme recovers source signals with very good accuracy. With imperfect data, however, additional data conditioning is required for control of signal noise. Alternating reverse profile computation with Savitzky-Golay low-pass filtering allows the recovery of well-timed and smooth source signals that satisfy mass conservation very well. Current research focuses on: a) optimising the performance of Savitzky-Golay filters, through selection of appropriate parameters (order of least
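
    Two of the ingredients named above can be sketched briefly: choosing the spatial weighting θ = 0.5 − 1/P from the grid Peclet number, and Savitzky-Golay smoothing of a noisy concentration profile between reverse steps. The reverse-time marching scheme itself is not reproduced, and the grid sizes, filter window and polynomial order below are illustrative assumptions.

    ```python
    # Hedged sketch: grid-design quantities and Savitzky-Golay conditioning.
    import numpy as np
    from scipy.signal import savgol_filter

    def grid_design(u, D, dx, dt):
        """Return the grid Peclet number, Courant number and matched spatial weighting."""
        P = u * dx / D                 # grid Peclet number
        Cr = u * dt / dx               # Courant number
        theta = 0.5 - 1.0 / P          # weighting that matches numerical diffusion to D
        return P, Cr, theta

    P, Cr, theta = grid_design(u=1.0, D=0.05, dx=0.2, dt=0.1)   # P = 4, Cr = 0.5, theta = 0.25

    # condition a noisy measured profile before (or between) reverse-transport steps
    x = np.linspace(0.0, 10.0, 201)
    noisy_profile = np.exp(-(x - 6.0) ** 2) + np.random.default_rng(3).normal(0, 0.02, x.size)
    smoothed = savgol_filter(noisy_profile, window_length=11, polyorder=3)
    ```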

  3. Computational Identification of Novel MicroRNAs and Their Targets in Vigna unguiculata.

    PubMed

    Lu, Yongzhong; Yang, Xiaoyun

    2010-01-01

    MicroRNAs (miRNAs) are a class of endogenous, noncoding, short RNAs directly involved in regulating gene expression at the posttranscriptional level. The high conservation of miRNAs in plants provides the foundation for identification of new miRNAs in other plant species through homology alignment. Here, previously known plant miRNAs were BLASTed against the Expressed Sequence Tag (EST) and Genomic Survey Sequence (GSS) databases of Vigna unguiculata, and according to a series of filtering criteria, a total of 47 miRNAs belonging to 13 miRNA families were identified, and 30 potential target genes of them were subsequently predicted, most of which seemed to encode transcription factors or enzymes participating in the regulation of development, growth, metabolism, and other physiological processes. Overall, our findings lay the foundation for further research on miRNA function in Vigna unguiculata.

  4. Why do we SLIP to the basic level? Computational constraints and their implementation.

    PubMed

    Gosselin, F; Schyns, P G

    2001-10-01

    The authors introduce a new measure of basic-level performance (strategy length and internal practicability; SLIP). SLIP implements 2 computational constraints on the organization of categories in a taxonomy: the minimum number of feature tests required to place the input in a category (strategy length) and the ease with which these tests are performed (internal practicability). The predictive power of SLIP is compared with that of 4 other basic-level measures: context model, category feature possession, category utility, and compression measure, drawing data from other empirical work, and 3 new experiments testing the validity of the computational constraints of SLIP using computer-synthesized 3-dimensional artificial objects.

  5. A unique automation platform for measuring low level radioactivity in metabolite identification studies.

    PubMed

    Krauser, Joel; Walles, Markus; Wolf, Thierry; Graf, Daniel; Swart, Piet

    2012-01-01

    Generation and interpretation of biotransformation data on drugs, i.e. identification of physiologically relevant metabolites, defining metabolic pathways and elucidation of metabolite structures, have become increasingly important to the drug development process. Profiling using ¹⁴C or ³H radiolabel is defined as the chromatographic separation and quantification of drug-related material in a given biological sample derived from an in vitro, preclinical in vivo or clinical study. Metabolite profiling is a very time-intensive activity, particularly for preclinical in vivo or clinical studies which have defined limitations on radiation burden and exposure levels. A clear gap exists for certain studies which do not require specialized high volume automation technologies, yet these studies would still clearly benefit from automation. Use of radiolabeled compounds in preclinical and clinical ADME studies, specifically for metabolite profiling and identification, is a very good example. The current lack of automation for measuring low level radioactivity in metabolite profiling requires substantial capacity, personal attention and resources from laboratory scientists. To help address these challenges and improve efficiency, we have innovated, developed and implemented a novel and flexible automation platform that integrates a robotic plate handling platform, HPLC or UPLC system, mass spectrometer and an automated fraction collector.

  6. A computational method for the identification of new candidate carcinogenic and non-carcinogenic chemicals.

    PubMed

    Chen, Lei; Chu, Chen; Lu, Jing; Kong, Xiangyin; Huang, Tao; Cai, Yu-Dong

    2015-09-01

    Cancer is one of the leading causes of human death. Based on current knowledge, one of the causes of cancer is exposure to toxic chemical compounds, including radioactive compounds, dioxin, and arsenic. The identification of new carcinogenic chemicals may warn us of potential danger and help to identify new ways to prevent cancer. In this study, a computational method was proposed to identify potential carcinogenic chemicals, as well as non-carcinogenic chemicals. According to the current validated carcinogenic and non-carcinogenic chemicals from the CPDB (Carcinogenic Potency Database), the candidate chemicals were searched in a weighted chemical network constructed according to chemical-chemical interactions. Then, the obtained candidate chemicals were further selected by a randomization test and information on chemical interactions and structures. The analyses identified several candidate carcinogenic chemicals, while those candidates identified as non-carcinogenic were supported by a literature search. In addition, several candidate carcinogenic/non-carcinogenic chemicals exhibit structural dissimilarity with validated carcinogenic/non-carcinogenic chemicals.

  7. Computer-aided identification of polymorphism sets diagnostic for groups of bacterial and viral genetic variants

    PubMed Central

    Price, Erin P; Inman-Bamber, John; Thiruvenkataswamy, Venugopal; Huygens, Flavia; Giffard, Philip M

    2007-01-01

    Background Single nucleotide polymorphisms (SNPs) and genes that exhibit presence/absence variation have provided informative marker sets for bacterial and viral genotyping. Identification of marker sets optimised for these purposes has been based on maximal generalized discriminatory power as measured by Simpson's Index of Diversity, or on the ability to identify specific variants. Here we describe the Not-N algorithm, which is designed to identify small sets of genetic markers diagnostic for user-specified subsets of known genetic variants. The algorithm does not treat the user-specified subset and the remaining genetic variants equally. Rather Not-N analysis is designed to underpin assays that provide 0% false negatives, which is very important for e.g. diagnostic procedures for clinically significant subgroups within microbial species. Results The Not-N algorithm has been incorporated into the "Minimum SNPs" computer program and used to derive genetic markers diagnostic for multilocus sequence typing-defined clonal complexes, hepatitis C virus (HCV) subtypes, and phylogenetic clades defined by comparative genome hybridization (CGH) data for Campylobacter jejuni, Yersinia enterocolitica and Clostridium difficile. Conclusion Not-N analysis is effective for identifying small sets of genetic markers diagnostic for microbial sub-groups. The best results to date have been obtained with CGH data from several bacterial species, and HCV sequence data. PMID:17672919
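
    The following is a minimal greedy sketch of the idea behind Not-N as described above: use only markers at which all members of the user-specified subset share one allele (so a positive call never misses a target, i.e. 0% false negatives), and add markers that exclude as many non-target variants as possible. The data layout and function names are assumptions; the actual "Minimum SNPs" implementation differs in detail.

```python
def not_n_markers(profiles, target):
    """Greedy selection of a small marker set whose combined allele signature
    matches every isolate in `target` (0% false negatives) and excludes as
    many non-target isolates as possible.

    profiles : dict isolate -> dict marker -> allele (e.g. SNP state)
    target   : set of isolate names defining the subgroup of interest
    """
    markers = set().union(*(set(p) for p in profiles.values()))
    non_target = set(profiles) - target

    # Only markers where all target isolates agree can give 0% false negatives.
    usable = {}
    for m in markers:
        alleles = {profiles[i].get(m) for i in target}
        if len(alleles) == 1 and None not in alleles:
            usable[m] = alleles.pop()

    chosen, remaining = {}, set(non_target)
    while remaining:
        best, best_excluded = None, set()
        for m, allele in usable.items():
            if m in chosen:
                continue
            excluded = {i for i in remaining if profiles[i].get(m) != allele}
            if len(excluded) > len(best_excluded):
                best, best_excluded = m, excluded
        if best is None or not best_excluded:
            break  # no marker improves discrimination further
        chosen[best] = usable[best]
        remaining -= best_excluded
    return chosen, remaining  # remaining = unavoidable false positives

# toy usage: markers diagnostic for the subgroup {"A", "B"}
profiles = {"A": {"snp1": "G", "snp2": "T"}, "B": {"snp1": "G", "snp2": "C"},
            "C": {"snp1": "A", "snp2": "T"}, "D": {"snp1": "G", "snp2": "A"}}
print(not_n_markers(profiles, target={"A", "B"}))
```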

  8. Computation of likelihood ratios in fingerprint identification for configurations of any number of minutiae.

    PubMed

    Neumann, Cédric; Champod, Christophe; Puch-Solis, Roberto; Egli, Nicole; Anthonioz, Alexandre; Bromage-Griffiths, Andie

    2007-01-01

    Recent court challenges have highlighted the need for statistical research on fingerprint identification. This paper proposes a model for computing likelihood ratios (LRs) to assess the evidential value of comparisons with any number of minutiae. The model considers minutiae type, direction and relative spatial relationships. It expands on previous work on three minutiae by adopting a spatial model using radial triangulation and a probabilistic distortion model for assessing the numerator of the LR. The model has been tested on a sample of 686 ulnar loops and 204 arches. Feature vectors used for statistical analysis have been obtained following a preprocessing step based on Gabor filtering and image processing to extract minutiae data. The metric used to assess similarity between two feature vectors is based on a Euclidean distance measure. Tippett plots and rates of misleading evidence have been used as performance indicators of the model. The model has shown encouraging behavior, with low rates of misleading evidence and an LR power that increases significantly with the number of minutiae. The LRs that it provides are highly indicative of identity of source on a significant proportion of cases, even when considering configurations with few minutiae. In contrast with previous research, the model, in addition to minutia type and direction, incorporates spatial relationships of minutiae without introducing probabilistic independence assumptions. The model also accounts for finger distortion.
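
    The sketch below shows only the generic last step of such a model: turning a similarity score between feature vectors (here a Euclidean distance) into a likelihood ratio by comparing within-source and between-source score densities. Modeling the densities as Gaussians is an assumption for illustration; the published model instead builds the LR from radial triangulation features and a probabilistic distortion model.

```python
import numpy as np
from scipy.stats import norm

def fit_score_model(within_scores, between_scores):
    """Fit simple Gaussian densities to similarity scores observed for
    same-source (within) and different-source (between) comparisons."""
    return (norm(np.mean(within_scores), np.std(within_scores, ddof=1)),
            norm(np.mean(between_scores), np.std(between_scores, ddof=1)))

def likelihood_ratio(score, within_pdf, between_pdf):
    """LR = P(score | same source) / P(score | different sources)."""
    return within_pdf.pdf(score) / between_pdf.pdf(score)

def euclidean_score(mark_vector, print_vector):
    """Similarity expressed as a negative Euclidean distance between
    feature vectors (larger = more similar)."""
    return -np.linalg.norm(np.asarray(mark_vector) - np.asarray(print_vector))

# toy usage with synthetic score distributions
rng = np.random.default_rng(1)
within = rng.normal(-1.0, 0.5, 500)     # same-source comparisons score high
between = rng.normal(-4.0, 1.0, 500)    # different-source comparisons score low
w_pdf, b_pdf = fit_score_model(within, between)
score = euclidean_score([0.1, 0.4, 0.9], [0.2, 0.5, 1.0])
print(f"score={score:.2f}  LR={likelihood_ratio(score, w_pdf, b_pdf):.1f}")
```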

  9. Computer-assisted structure identification (CASI)--an automated platform for high-throughput identification of small molecules by two-dimensional gas chromatography coupled to mass spectrometry.

    PubMed

    Knorr, Arno; Monge, Aurelien; Stueber, Markus; Stratmann, André; Arndt, Daniel; Martin, Elyette; Pospisil, Pavel

    2013-12-01

    Compound identification is widely recognized as a major bottleneck for modern metabolomic approaches and high-throughput nontargeted characterization of complex matrices. To tackle this challenge, an automated platform entitled computer-assisted structure identification (CASI) was designed and developed in order to accelerate and standardize the identification of compound structures. In the first step of the process, CASI automatically searches mass spectral libraries for matches using a NIST MS Search algorithm, which proposes structural candidates for experimental spectra from two-dimensional gas chromatography with time-of-flight mass spectrometry (GC × GC-TOF-MS) measurements, each with an associated match factor. Next, quantitative structure-property relationship (QSPR) models implemented in CASI predict three specific parameters to enhance the confidence for correct compound identification, which were Kovats Index (KI) for the first dimension (1D) separation, relative retention time for the second dimension separation (2DrelRT) and boiling point (BP). In order to reduce the impact of chromatographic variability on the second dimension retention time, a concept based upon hypothetical reference points from linear regressions of a deuterated n-alkanes reference system was introduced, providing a more stable relative retention time measurement. Predicted values for KI and 2DrelRT were calculated and matched with experimentally derived values. Boiling points derived from 1D separations were matched with predicted boiling points, calculated from the chemical structures of the candidates. As a last step, CASI combines the NIST MS Search match factors (NIST MF) with up to three predicted parameter matches from the QSPR models to generate a combined CASI Score representing the measure of confidence for the identification. Threshold values were applied to the CASI Scores assigned to proposed structures, which improved the accuracy for the classification of true
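
    A hedged sketch of the scoring idea is given below: the NIST match factor is rescaled and combined with agreement terms for up to three predicted parameters (KI, 2DrelRT, BP), and a threshold is applied to the combined score. The weights, tolerances, and threshold are illustrative assumptions, not the published CASI calibration.

```python
def parameter_agreement(predicted, observed, tolerance):
    """Map the |predicted - observed| deviation onto a 0..1 agreement score,
    reaching 0 once the deviation exceeds the stated tolerance."""
    return max(0.0, 1.0 - abs(predicted - observed) / tolerance)

def casi_like_score(nist_mf, ki=None, rel_rt2=None, bp=None,
                    weights=(0.4, 0.2, 0.2, 0.2)):
    """Combine the NIST match factor (0..1000) with up to three
    predicted-vs-observed parameter agreements into one confidence score.

    ki, rel_rt2, bp : optional (predicted, observed, tolerance) tuples
    """
    parts = [nist_mf / 1000.0]
    for triple in (ki, rel_rt2, bp):
        if triple is not None:
            parts.append(parameter_agreement(*triple))
        else:
            parts.append(0.0)  # missing QSPR prediction contributes nothing
    return sum(w * p for w, p in zip(weights, parts))

# toy usage: candidate with a good spectral match and consistent KI/BP
score = casi_like_score(nist_mf=850,
                        ki=(1180.0, 1192.0, 50.0),      # Kovats index
                        rel_rt2=(1.05, 1.12, 0.30),     # 2D relative RT
                        bp=(215.0, 224.0, 40.0))        # boiling point, deg C
accepted = score >= 0.6   # threshold is an assumption, not the published cutoff
print(f"combined score = {score:.2f}, accepted = {accepted}")
```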

  10. Developing Computer-Interactive Tape Exercises for Intermediate-Level Business French.

    ERIC Educational Resources Information Center

    Garnett, Mary Anne

    One college language teacher developed computer-interactive audiotape exercises for an intermediate-level class in business French. The project was undertaken because of a need for appropriate materials at that level. The use of authoring software permitted development of a variety of activity types, including multiple-choice, fill-in-the-blank,…

  11. Identification, Recovery, and Refinement of Hitherto Undescribed Population-Level Genomes from the Human Gastrointestinal Tract.

    PubMed

    Laczny, Cedric C; Muller, Emilie E L; Heintz-Buschart, Anna; Herold, Malte; Lebrun, Laura A; Hogan, Angela; May, Patrick; de Beaufort, Carine; Wilmes, Paul

    2016-01-01

    Linking taxonomic identity and functional potential at the population-level is important for the study of mixed microbial communities and is greatly facilitated by the availability of microbial reference genomes. While the culture-independent recovery of population-level genomes from environmental samples using the binning of metagenomic data has expanded available reference genome catalogs, several microbial lineages remain underrepresented. Here, we present two reference-independent approaches for the identification, recovery, and refinement of hitherto undescribed population-level genomes. The first approach is aimed at genome recovery of varied taxa and involves multi-sample automated binning using CANOPY CLUSTERING complemented by visualization and human-augmented binning using VIZBIN post hoc. The second approach is particularly well-suited for the study of specific taxa and employs VIZBIN de novo. Using these approaches, we reconstructed a total of six population-level genomes of distinct and divergent representatives of the Alphaproteobacteria class, the Mollicutes class, the Clostridiales order, and the Melainabacteria class from human gastrointestinal tract-derived metagenomic data. Our results demonstrate that, while automated binning approaches provide great potential for large-scale studies of mixed microbial communities, these approaches should be complemented with informative visualizations because expert-driven inspection and refinements are critical for the recovery of high-quality population-level genomes. PMID:27445992
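
    As a minimal, single-sample illustration of the composition-based binning that underlies such workflows, the sketch below computes a tetranucleotide-frequency signature per contig and clusters the signatures. It is an assumption-laden stand-in: the approaches described above use multi-sample CANOPY CLUSTERING and VIZBIN (k-mer signatures embedded with BH-SNE plus human curation), not plain k-means.

```python
from itertools import product
import numpy as np
from sklearn.cluster import KMeans

KMERS = ["".join(p) for p in product("ACGT", repeat=4)]
KMER_INDEX = {k: i for i, k in enumerate(KMERS)}

def tetranucleotide_signature(sequence):
    """Normalized 4-mer frequency vector of a contig (its composition signature)."""
    counts = np.zeros(len(KMERS))
    seq = sequence.upper()
    for i in range(len(seq) - 3):
        idx = KMER_INDEX.get(seq[i:i + 4])
        if idx is not None:          # skip windows containing N, etc.
            counts[idx] += 1
    total = counts.sum()
    return counts / total if total else counts

def bin_contigs(contigs, n_bins):
    """Group contigs into candidate population-level genome bins by
    clustering their composition signatures."""
    signatures = np.vstack([tetranucleotide_signature(c) for c in contigs])
    return KMeans(n_clusters=n_bins, n_init=10, random_state=0).fit_predict(signatures)

# toy usage with compositionally distinct mock "contigs"
contigs = ["ACGT" * 300, "GGCC" * 300, "ACGTACGA" * 150, "GGCCGGCA" * 150]
print(bin_contigs(contigs, n_bins=2))
```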

  12. Identification, Recovery, and Refinement of Hitherto Undescribed Population-Level Genomes from the Human Gastrointestinal Tract

    PubMed Central

    Laczny, Cedric C.; Muller, Emilie E. L.; Heintz-Buschart, Anna; Herold, Malte; Lebrun, Laura A.; Hogan, Angela; May, Patrick; de Beaufort, Carine; Wilmes, Paul

    2016-01-01

    Linking taxonomic identity and functional potential at the population-level is important for the study of mixed microbial communities and is greatly facilitated by the availability of microbial reference genomes. While the culture-independent recovery of population-level genomes from environmental samples using the binning of metagenomic data has expanded available reference genome catalogs, several microbial lineages remain underrepresented. Here, we present two reference-independent approaches for the identification, recovery, and refinement of hitherto undescribed population-level genomes. The first approach is aimed at genome recovery of varied taxa and involves multi-sample automated binning using CANOPY CLUSTERING complemented by visualization and human-augmented binning using VIZBIN post hoc. The second approach is particularly well-suited for the study of specific taxa and employs VIZBIN de novo. Using these approaches, we reconstructed a total of six population-level genomes of distinct and divergent representatives of the Alphaproteobacteria class, the Mollicutes class, the Clostridiales order, and the Melainabacteria class from human gastrointestinal tract-derived metagenomic data. Our results demonstrate that, while automated binning approaches provide great potential for large-scale studies of mixed microbial communities, these approaches should be complemented with informative visualizations because expert-driven inspection and refinements are critical for the recovery of high-quality population-level genomes. PMID:27445992

  13. Computational identification of surrogate genes for prostate cancer phases using machine learning and molecular network analysis

    PubMed Central

    2014-01-01

    Background Prostate cancer is one of the most common malignant diseases and is characterized by heterogeneity in the clinical course. To date, there are no efficient morphologic features or genomic biomarkers that can characterize the phenotypes of the cancer, especially with regard to metastasis – the most adverse outcome. Searching for effective surrogate genes out of large quantities of gene expression data is a key to cancer phenotyping and/or understanding molecular mechanisms underlying prostate cancer development. Results Using the maximum relevance minimum redundancy (mRMR) method on microarray data from normal tissues, primary tumors and metastatic tumors, we identified four genes that can optimally classify samples of different prostate cancer phases. Moreover, we constructed a molecular interaction network with existing bioinformatic resources and co-identified eight genes on the shortest paths among the mRMR-identified genes, which are potential co-acting factors of prostate cancer. Functional analyses show that molecular functions involved in cell communication, hormone-receptor mediated signaling, and transcription regulation play important roles in the development of prostate cancer. Conclusion We conclude that the surrogate genes we have selected compose an effective classifier of prostate cancer phases, which corresponds to a minimum characterization of cancer phenotypes at the molecular level. Along with their molecular interaction partners, it is fair to assume that these genes may have important roles in prostate cancer development; in particular, the previously unreported genes may bring new insights for the understanding of the molecular mechanisms. Thus our results may serve as a candidate gene set for further functional studies. PMID:25151146
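
    For readers unfamiliar with mRMR, the sketch below shows a greedy variant: relevance is the mutual information between each feature and the class label, and redundancy is approximated by the mean absolute correlation with already selected features (a common simplification; the original criterion uses feature-feature mutual information). The shortest-path network step of the study is not shown.

```python
import numpy as np
from sklearn.feature_selection import mutual_info_classif

def greedy_mrmr(X, y, n_select=4):
    """Greedy maximum-relevance minimum-redundancy feature selection.

    Relevance  : mutual information between each feature and the class label.
    Redundancy : mean absolute Pearson correlation with already selected
                 features (a simple stand-in for feature-feature MI).
    """
    X = np.asarray(X, dtype=float)
    relevance = mutual_info_classif(X, y, random_state=0)
    corr = np.abs(np.corrcoef(X, rowvar=False))

    selected = [int(np.argmax(relevance))]
    while len(selected) < n_select:
        best_j, best_score = None, -np.inf
        for j in range(X.shape[1]):
            if j in selected:
                continue
            redundancy = corr[j, selected].mean()
            score = relevance[j] - redundancy
            if score > best_score:
                best_j, best_score = j, score
        selected.append(best_j)
    return selected

# toy usage: 3 informative expression features + 7 noise features
rng = np.random.default_rng(0)
y = rng.integers(0, 3, 120)                      # normal / primary / metastatic
X = rng.normal(size=(120, 10))
X[:, 0] += y; X[:, 1] -= y; X[:, 2] += 2 * y     # make a few genes informative
print("selected feature indices:", greedy_mrmr(X, y, n_select=4))
```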

  14. Level sequence and splitting identification of closely spaced energy levels by angle-resolved analysis of fluorescence light

    NASA Astrophysics Data System (ADS)

    Wu, Z. W.; Volotka, A. V.; Surzhykov, A.; Dong, C. Z.; Fritzsche, S.

    2016-06-01

    The angular distribution and linear polarization of the fluorescence light following the resonant photoexcitation is investigated within the framework of density matrix and second-order perturbation theory. Emphasis has been placed on "signatures" for determining the level sequence and splitting of intermediate (partially) overlapping resonances, if analyzed as a function of photon energy of incident light. Detailed computations within the multiconfiguration Dirac-Fock method have been performed, especially for the 1s² 2s² 2p⁶ 3s, J_i = 1/2 + γ1 → (1s² 2s 2p⁶ 3s)_1 3p_3/2, J = 1/2, 3/2 → 1s² 2s² 2p⁶ 3s, J_f = 1/2 + γ2 photoexcitation and subsequent fluorescence emission of atomic sodium. A remarkably strong dependence of the angular distribution and linear polarization of the γ2 fluorescence emission is found upon the level sequence and splitting of the intermediate (1s² 2s 2p⁶ 3s)_1 3p_3/2, J = 1/2, 3/2 overlapping resonances owing to their finite lifetime (linewidth). We therefore suggest that accurate measurements of the angular distribution and linear polarization might help identify the sequence and small splittings of closely spaced energy levels, even if they cannot be spectroscopically resolved.

  15. Identification and Endodontic Management of Middle Mesial Canal in Mandibular Second Molar Using Cone Beam Computed Tomography.

    PubMed

    Paul, Bonny; Dube, Kavita

    2015-01-01

    Endodontic treatments are routinely done with the help of radiographs. However, radiographs represent only a two-dimensional image of an object. Failure to identify aberrant anatomy can lead to endodontic failure. This case report presents the use of three-dimensional imaging with cone beam computed tomography (CBCT) as an adjunct to digital radiography in identification and management of mandibular second molar with three mesial canals. PMID:26664763

  16. Enhanced fault-tolerant quantum computing in d-level systems.

    PubMed

    Campbell, Earl T

    2014-12-01

    Error-correcting codes protect quantum information and form the basis of fault-tolerant quantum computing. Leading proposals for fault-tolerant quantum computation require codes with an exceedingly rare property, a transversal non-Clifford gate. Codes with the desired property are presented for d-level qudit systems with prime d. The codes use n=d-1 qudits and can detect up to ∼d/3 errors. We quantify the performance of these codes for one approach to quantum computation known as magic-state distillation. Unlike prior work, we find performance is always enhanced by increasing d.

  17. Computational Methods for Decentralized Two-Level 0-1 Programming Problems through Distributed Genetic Algorithms

    NASA Astrophysics Data System (ADS)

    Niwa, Keiichi; Hayashida, Tomohiro; Sakawa, Masatoshi; Yang, Yishen

    2010-10-01

    We consider two-level programming problems in which there is one decision maker (the leader) at the upper level and two or more decision makers (the followers) at the lower level, and the decision variables of the leader and the followers are 0-1 variables. We assume that there is coordination among the followers, while between the leader and the group of all the followers there is no motivation to cooperate with each other; fuzzy goals for the objective functions of the leader and the followers are introduced so as to take the fuzziness of their judgments into consideration. The leader maximizes the degree of satisfaction (the value of the membership function) and the followers choose in concert in order to maximize the minimum among their degrees of satisfaction. We propose a modified computational method that addresses problems with the existing genetic-algorithm-based computational method for obtaining the Stackelberg solution. Specifically, a distributed genetic algorithm is introduced for the upper-level genetic algorithm, which handles the leader's decision variables, in order to shorten the computational time of the existing method. The lower-level genetic algorithm is also parallelized along with the upper-level genetic algorithm. In order to demonstrate the effectiveness of the proposed computational method, numerical experiments are carried out.

  18. Model-based identification of PEEP titrations during different volemic levels.

    PubMed

    Starfinger, C; Chase, J G; Hann, C E; Shaw, G M; Lambert, P; Smith, B W; Sloth, E; Larsson, A; Andreassen, S; Rees, S

    2008-08-01

    A cardiovascular system (CVS) model has previously been validated in simulated cardiac and circulatory disease states. It has also been shown to accurately capture all main hemodynamic trends in a porcine model of pulmonary embolism. In this research, a slightly extended CVS model and parameter identification process are presented and validated in a porcine experiment of positive end-expiratory pressure (PEEP) titrations at different volemic levels. The model is extended to more physiologically represent the separation of venous and arterial circulation. Errors for the identified model are within 5% when re-simulated and compared to clinical data. All identified parameter trends match clinically expected changes. This work represents another clinical validation of the underlying fundamental CVS model, and the methods and approach of using them for cardiovascular diagnosis in critical care.

  19. Studying channelopathies at the functional level using a system identification approach

    NASA Astrophysics Data System (ADS)

    Faisal, A. Aldo

    2007-09-01

    The electrical activity of our brain's neurons is controlled by voltage-gated ion channels. Mutations in these ion channels have recently been associated with clinical conditions, so-called channelopathies. The involved ion channels have been well characterised at a molecular and biophysical level. However, the impact of these mutations on neuron function has been studied only rudimentarily. It remains unclear how operation and performance (in terms of input-output characteristics and reliability) are affected. Here, I show how system identification techniques provide neuronal performance measures which allow the impact of channelopathies to be quantitatively assessed by comparing whole-cell input-output relationships. I illustrate the feasibility of this approach by comparing the effects on neuronal signalling of two human sodium channel mutations (NaV 1.1 W1204R, R1648H), linked to generalized epilepsy with febrile seizures, to the wild-type NaV 1.1 channel.

  20. Low-bandwidth and non-compute intensive remote identification of microbes from raw sequencing reads.

    PubMed

    Gautier, Laurent; Lund, Ole

    2013-01-01

    Cheap DNA sequencing may soon become routine not only for human genomes but also for practically anything requiring the identification of living organisms from their DNA: tracking of infectious agents, control of food products, bioreactors, or environmental samples. We propose a novel general approach to the analysis of sequencing data where a reference genome does not have to be specified. Using a distributed architecture we are able to query a remote server for hints about what the reference might be, transferring a relatively small amount of data. Our system consists of a server with known reference DNA indexed, and a client with raw sequencing reads. The client sends a sample of unidentified reads, and in return receives a list of matching references. Sequences for the references can be retrieved and used for exhaustive computation on the reads, such as alignment. To demonstrate this approach we have implemented a web server, indexing tens of thousands of publicly available genomes and genomic regions from various organisms and returning lists of matching hits from query sequencing reads. We have also implemented two clients: one running in a web browser, and one as a python script. Both are able to handle a large number of sequencing reads, can run from portable devices (the browser-based client running on a tablet), perform their task within seconds, and consume an amount of bandwidth compatible with mobile broadband networks. Such client-server approaches could develop in the future, allowing a fully automated processing of sequencing data and routine instant quality check of sequencing runs from desktop sequencers. A web access is available at http://tapir.cbs.dtu.dk. The source code for a python command-line client, a server, and supplementary data are available at http://bit.ly/1aURxkc. PMID:24391826
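
    A minimal client-side sketch of this idea follows: sample a small number of raw reads from a FASTQ file and send them to a remote identification service. The FASTQ path, endpoint URL, JSON payload, and response format are hypothetical placeholders and do not reproduce the actual protocol of the server mentioned above.

```python
import gzip
import json
import random
from itertools import islice
from urllib import request

def sample_reads(fastq_path, n_reads=200, seed=0):
    """Reservoir-sample a small number of raw reads from a (gzipped) FASTQ file."""
    rng = random.Random(seed)
    opener = gzip.open if fastq_path.endswith(".gz") else open
    sample = []
    with opener(fastq_path, "rt") as handle:
        # each FASTQ record spans 4 lines; iterate record by record
        for i, record in enumerate(iter(lambda: list(islice(handle, 4)), [])):
            seq = record[1].strip()
            if i < n_reads:
                sample.append(seq)
            elif rng.random() < n_reads / (i + 1):
                sample[rng.randrange(n_reads)] = seq
    return sample

def query_reference_hints(reads, endpoint):
    """POST the sampled reads and return the server's list of candidate references."""
    payload = json.dumps({"reads": reads}).encode()
    req = request.Request(endpoint, data=payload,
                          headers={"Content-Type": "application/json"})
    with request.urlopen(req, timeout=30) as resp:
        return json.loads(resp.read())

if __name__ == "__main__":
    # file name, endpoint URL and response keys are hypothetical placeholders
    reads = sample_reads("run42.fastq.gz", n_reads=200)
    hints = query_reference_hints(reads, "https://example.org/api/identify")
    for hit in hints.get("matches", []):
        print(hit)
```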

  1. Low-Bandwidth and Non-Compute Intensive Remote Identification of Microbes from Raw Sequencing Reads

    PubMed Central

    Gautier, Laurent; Lund, Ole

    2013-01-01

    Cheap DNA sequencing may soon become routine not only for human genomes but also for practically anything requiring the identification of living organisms from their DNA: tracking of infectious agents, control of food products, bioreactors, or environmental samples. We propose a novel general approach to the analysis of sequencing data where a reference genome does not have to be specified. Using a distributed architecture we are able to query a remote server for hints about what the reference might be, transferring a relatively small amount of data. Our system consists of a server with known reference DNA indexed, and a client with raw sequencing reads. The client sends a sample of unidentified reads, and in return receives a list of matching references. Sequences for the references can be retrieved and used for exhaustive computation on the reads, such as alignment. To demonstrate this approach we have implemented a web server, indexing tens of thousands of publicly available genomes and genomic regions from various organisms and returning lists of matching hits from query sequencing reads. We have also implemented two clients: one running in a web browser, and one as a python script. Both are able to handle a large number of sequencing reads, can run from portable devices (the browser-based client running on a tablet), perform their task within seconds, and consume an amount of bandwidth compatible with mobile broadband networks. Such client-server approaches could develop in the future, allowing a fully automated processing of sequencing data and routine instant quality check of sequencing runs from desktop sequencers. A web access is available at http://tapir.cbs.dtu.dk. The source code for a python command-line client, a server, and supplementary data are available at http://bit.ly/1aURxkc. PMID:24391826

  2. Low-bandwidth and non-compute intensive remote identification of microbes from raw sequencing reads.

    PubMed

    Gautier, Laurent; Lund, Ole

    2013-01-01

    Cheap DNA sequencing may soon become routine not only for human genomes but also for practically anything requiring the identification of living organisms from their DNA: tracking of infectious agents, control of food products, bioreactors, or environmental samples. We propose a novel general approach to the analysis of sequencing data where a reference genome does not have to be specified. Using a distributed architecture we are able to query a remote server for hints about what the reference might be, transferring a relatively small amount of data. Our system consists of a server with known reference DNA indexed, and a client with raw sequencing reads. The client sends a sample of unidentified reads, and in return receives a list of matching references. Sequences for the references can be retrieved and used for exhaustive computation on the reads, such as alignment. To demonstrate this approach we have implemented a web server, indexing tens of thousands of publicly available genomes and genomic regions from various organisms and returning lists of matching hits from query sequencing reads. We have also implemented two clients: one running in a web browser, and one as a python script. Both are able to handle a large number of sequencing reads, can run from portable devices (the browser-based client running on a tablet), perform their task within seconds, and consume an amount of bandwidth compatible with mobile broadband networks. Such client-server approaches could develop in the future, allowing a fully automated processing of sequencing data and routine instant quality check of sequencing runs from desktop sequencers. A web access is available at http://tapir.cbs.dtu.dk. The source code for a python command-line client, a server, and supplementary data are available at http://bit.ly/1aURxkc.

  3. [Tri-Level Infrared Spectroscopic Identification of Hot Melting Reflective Road Marking Paint].

    PubMed

    Li, Hao; Ma, Fang; Sun, Su-qin

    2015-12-01

    In order to detect road marking paint from trace evidence at traffic accident scenes, and to differentiate its brands, we use tri-level infrared spectroscopic identification, which employs Fourier transform infrared spectroscopy (FTIR), second-derivative infrared spectroscopy (SD-IR) and two-dimensional correlation infrared spectroscopy (2D-IR) to identify three different domestic brands of hot melting reflective road marking paints and the raw materials in their formulas. The experimental results show that the ATR and FTIR spectrograms of the three brands of coatings are very similar in shape and differ only in absorption peak wavenumbers; all have wide, strong absorption peaks near 1435 cm⁻¹ and strong absorption peaks near 879, 2955, 2919 and 2870 cm⁻¹. After enlarging partial areas of the spectrograms and comparing them with the spectrograms of each raw material in the formulas, the brands can be distinguished. In the regions 700-970 and 1370-1660 cm⁻¹ the spectrograms mainly reflect the different relative contents of heavy calcium carbonate in the three brands of paint, and in the region 2800-2960 cm⁻¹ those of polyethylene wax (PE wax), ethylene vinyl acetate resin (EVA) and dioctyl phthalate (DOP). The SD-IR spectra not only verify the results of the FTIR analysis but also further magnify the microscopic differences and reflect the different relative contents of quartz sand in the 512-799 cm⁻¹ region. Within the range 1351 to 1525 cm⁻¹, the 2D-IR spectra show more significant differences in the positions and numbers of auto-peaks. Therefore, tri-level infrared spectroscopic identification, with its stepwise improvement in apparent resolution, is a fast and effective method to distinguish hot melting road marking paints.

  4. [Tri-Level Infrared Spectroscopic Identification of Hot Melting Reflective Road Marking Paint].

    PubMed

    Li, Hao; Ma, Fang; Sun, Su-qin

    2015-12-01

    In order to detect road marking paint from trace evidence at traffic accident scenes, and to differentiate its brands, we use tri-level infrared spectroscopic identification, which employs Fourier transform infrared spectroscopy (FTIR), second-derivative infrared spectroscopy (SD-IR) and two-dimensional correlation infrared spectroscopy (2D-IR) to identify three different domestic brands of hot melting reflective road marking paints and the raw materials in their formulas. The experimental results show that the ATR and FTIR spectrograms of the three brands of coatings are very similar in shape and differ only in absorption peak wavenumbers; all have wide, strong absorption peaks near 1435 cm⁻¹ and strong absorption peaks near 879, 2955, 2919 and 2870 cm⁻¹. After enlarging partial areas of the spectrograms and comparing them with the spectrograms of each raw material in the formulas, the brands can be distinguished. In the regions 700-970 and 1370-1660 cm⁻¹ the spectrograms mainly reflect the different relative contents of heavy calcium carbonate in the three brands of paint, and in the region 2800-2960 cm⁻¹ those of polyethylene wax (PE wax), ethylene vinyl acetate resin (EVA) and dioctyl phthalate (DOP). The SD-IR spectra not only verify the results of the FTIR analysis but also further magnify the microscopic differences and reflect the different relative contents of quartz sand in the 512-799 cm⁻¹ region. Within the range 1351 to 1525 cm⁻¹, the 2D-IR spectra show more significant differences in the positions and numbers of auto-peaks. Therefore, tri-level infrared spectroscopic identification, with its stepwise improvement in apparent resolution, is a fast and effective method to distinguish hot melting road marking paints. PMID:26964206

  5. User-microprogrammable, local host computer with low-level parallelism

    SciTech Connect

    Tomita, S.; Shibayama, K.; Kitamura, T.; Nakata, T.; Hagiwara, H.

    1983-01-01

    This paper describes the architecture of a dynamically microprogrammable computer with low-level parallelism, called QA-2, which is designed as a high-performance, local host computer for laboratory use. The architectural principle of the QA-2 is the marriage of the high-speed, parallel processing capability offered by four powerful arithmetic and logic units (ALUs) with the architectural flexibility provided by large-scale, dynamic user-microprogramming. By changing its writable control storage dynamically, the QA-2 can be tailored to a wide spectrum of research-oriented applications covering high-level language processing and real-time processing. 11 references.

  6. Multi-level Bayesian safety analysis with unprocessed Automatic Vehicle Identification data for an urban expressway.

    PubMed

    Shi, Qi; Abdel-Aty, Mohamed; Yu, Rongjie

    2016-03-01

    In traffic safety studies, crash frequency modeling of total crashes is the cornerstone before proceeding to more detailed safety evaluation. The relationship between crash occurrence and factors such as traffic flow and roadway geometric characteristics has been extensively explored for a better understanding of crash mechanisms. In this study, a multi-level Bayesian framework has been developed in an effort to identify the crash contributing factors on an urban expressway in the Central Florida area. Two types of traffic data from the Automatic Vehicle Identification system, the processed data capped at the speed limit and the unprocessed data retaining the original speeds, were incorporated in the analysis along with road geometric information. The model framework was proposed to account for the hierarchical data structure and the heterogeneity among the traffic and roadway geometric data. Multi-level and random parameters models were constructed and compared with the Negative Binomial model under the Bayesian inference framework. Results showed that the unprocessed traffic data was superior. Both multi-level models and random parameters models outperformed the Negative Binomial model, and the models with random parameters achieved the best model fit. The contributing factors identified imply that on the urban expressway lower speed and higher speed variation could significantly increase the crash likelihood. Other significant geometric factors included auxiliary lanes and horizontal curvature.
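
    As a baseline for the comparison described above, the sketch below fits a Negative Binomial (NB2) crash-frequency model with a log link by maximum likelihood. The covariates and simulated data are illustrative; the multi-level and random-parameters Bayesian models of the study are not reproduced here.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.special import gammaln

def nb_negloglik(params, X, y):
    """Negative log-likelihood of an NB2 crash-frequency model with log link.
    params = [beta_0, ..., beta_p, log_alpha] where alpha is the overdispersion."""
    beta, alpha = params[:-1], np.exp(params[-1])
    mu = np.exp(X @ beta)
    r = 1.0 / alpha
    ll = (gammaln(y + r) - gammaln(r) - gammaln(y + 1)
          + r * np.log(r / (r + mu)) + y * np.log(mu / (r + mu)))
    return -ll.sum()

def fit_nb(X, y):
    x0 = np.zeros(X.shape[1] + 1)
    res = minimize(nb_negloglik, x0, args=(X, y), method="L-BFGS-B")
    return res.x[:-1], np.exp(res.x[-1])   # coefficients, overdispersion alpha

# toy usage: crashes ~ speed mean, speed variation, curvature (simulated)
rng = np.random.default_rng(3)
n = 400
X = np.column_stack([np.ones(n), rng.normal(size=(n, 3))])
true_mu = np.exp(X @ np.array([0.5, -0.3, 0.4, 0.2]))
y = rng.poisson(rng.gamma(shape=2.0, scale=true_mu / 2.0))   # NB via Poisson-gamma
beta, alpha = fit_nb(X, y)
print("coefficients:", np.round(beta, 2), " overdispersion:", round(alpha, 2))
```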

  7. Multi-level Bayesian safety analysis with unprocessed Automatic Vehicle Identification data for an urban expressway.

    PubMed

    Shi, Qi; Abdel-Aty, Mohamed; Yu, Rongjie

    2016-03-01

    In traffic safety studies, crash frequency modeling of total crashes is the cornerstone before proceeding to more detailed safety evaluation. The relationship between crash occurrence and factors such as traffic flow and roadway geometric characteristics has been extensively explored for a better understanding of crash mechanisms. In this study, a multi-level Bayesian framework has been developed in an effort to identify the crash contributing factors on an urban expressway in the Central Florida area. Two types of traffic data from the Automatic Vehicle Identification system, the processed data capped at the speed limit and the unprocessed data retaining the original speeds, were incorporated in the analysis along with road geometric information. The model framework was proposed to account for the hierarchical data structure and the heterogeneity among the traffic and roadway geometric data. Multi-level and random parameters models were constructed and compared with the Negative Binomial model under the Bayesian inference framework. Results showed that the unprocessed traffic data was superior. Both multi-level models and random parameters models outperformed the Negative Binomial model, and the models with random parameters achieved the best model fit. The contributing factors identified imply that on the urban expressway lower speed and higher speed variation could significantly increase the crash likelihood. Other significant geometric factors included auxiliary lanes and horizontal curvature. PMID:26722989

  8. Scalability of a Base Level Design for an On-Board-Computer for Scientific Missions

    NASA Astrophysics Data System (ADS)

    Treudler, Carl Johann; Schroder, Jan-Carsten; Greif, Fabian; Stohlmann, Kai; Aydos, Gokce; Fey, Gorschwin

    2014-08-01

    A wide range of mission requirements and the integration of diverse payloads require extreme flexibility in the on-board computing infrastructure for scientific missions. We show that scalability is difficult to achieve in principle. We address this issue by proposing a base-level design and show how adaptation to different needs is achieved. Inter-dependencies between scaling different aspects and their impact on different levels of the design are discussed.

  9. Can Synchronous Computer-Mediated Communication (CMC) Help Beginning-Level Foreign Language Learners Speak?

    ERIC Educational Resources Information Center

    Ko, Chao-Jung

    2012-01-01

    This study investigated the possibility that initial-level learners may acquire oral skills through synchronous computer-mediated communication (SCMC). Twelve Taiwanese French as a foreign language (FFL) students, divided into three groups, were required to conduct a variety of tasks in one of the three learning environments (video/audio, audio,…

  10. The Relationship between Internet and Computer Game Addiction Level and Shyness among High School Students

    ERIC Educational Resources Information Center

    Ayas, Tuncay

    2012-01-01

    This study is conducted to determine the relationship between the internet and computer games addiction level and the shyness among high school students. The participants of the study consist of 365 students attending high schools in Giresun city centre during 2009-2010 academic year. As a result of the study a positive, meaningful, and high…

  11. A Simple Gauss-Newton Procedure for Covariance Structure Analysis with High-Level Computer Languages.

    ERIC Educational Resources Information Center

    Cudeck, Robert; And Others

    1993-01-01

    An implementation of the Gauss-Newton algorithm for the analysis of covariance structure that is specifically adapted for high-level computer languages is reviewed. This simple method for estimating structural equation models is useful for a variety of standard models, as is illustrated. (SLD)
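
    A minimal sketch of such a Gauss-Newton fit, assuming a one-factor covariance structure and an unweighted least-squares discrepancy with a numerical Jacobian, is shown below; the reviewed implementation differs in details such as weight matrices and standard errors.

```python
import numpy as np

def vech(mat):
    """Stack the lower triangle (including diagonal) of a symmetric matrix."""
    return mat[np.tril_indices(mat.shape[0])]

def implied_cov(theta, p):
    """Model-implied covariance of a one-factor model:
    Sigma = lambda lambda' + diag(psi), theta = [lambda_1..p, psi_1..p]."""
    lam, psi = theta[:p], theta[p:]
    return np.outer(lam, lam) + np.diag(psi)

def gauss_newton(S, theta0, max_iter=50, tol=1e-8):
    """Unweighted least-squares fit of the covariance structure by Gauss-Newton."""
    p = S.shape[0]
    theta = np.array(theta0, dtype=float)
    target = vech(S)

    def residual(t):
        return target - vech(implied_cov(t, p))

    for _ in range(max_iter):
        r = residual(theta)
        # numerical Jacobian of the residual vector
        eps = 1e-6
        J = np.column_stack([
            (residual(theta + eps * np.eye(len(theta))[j]) - r) / eps
            for j in range(len(theta))])
        step = np.linalg.lstsq(J.T @ J, -J.T @ r, rcond=None)[0]
        theta += step
        if np.linalg.norm(step) < tol:
            break
    return theta

# toy usage: recover loadings/uniquenesses from a sample covariance matrix
rng = np.random.default_rng(0)
true_lam, true_psi = np.array([0.8, 0.7, 0.6, 0.5]), np.array([0.4, 0.5, 0.6, 0.7])
data = rng.normal(size=(2000, 1)) @ true_lam[None, :] + \
       rng.normal(size=(2000, 4)) * np.sqrt(true_psi)
S = np.cov(data, rowvar=False)
est = gauss_newton(S, theta0=[0.5] * 4 + [0.5] * 4)
print("loadings:", np.round(est[:4], 2), " uniquenesses:", np.round(est[4:], 2))
```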

  12. Comparing Infants' Preference for Correlated Audiovisual Speech with Signal-Level Computational Models

    ERIC Educational Resources Information Center

    Hollich, George; Prince, Christopher G.

    2009-01-01

    How much of infant behaviour can be accounted for by signal-level analyses of stimuli? The current paper directly compares the moment-by-moment behaviour of 8-month-old infants in an audiovisual preferential looking task with that of several computational models that use the same video stimuli as presented to the infants. One type of model…

  13. Identification of maize embryo-preferred promoters suitable for high-level heterologous protein production.

    PubMed

    Streatfield, Stephen J; Bray, Jeffrey; Love, Robert T; Horn, Michael E; Lane, Jeffrey R; Drees, Carol F; Egelkrout, Erin M; Howard, John A

    2010-01-01

    The production of heterologous proteins in plants at levels consistent with commercialization of protein products requires molecular tools to ensure high-level transgene expression. The identification of strong promoters, preferably specific to the target expression tissue, is a focus for improving foreign protein yields using transgenic cereals as a production system. Thus, there is a requirement for strong embryo-preferred monocot promoters. We obtained the sequences of 500 randomly selected maize cDNA clones to determine gene expression profiles in embryo tissues at multiple stages during development. Promoters corresponding to the most abundant clones were identified and isolated. These promoters were fused to the β-glucuronidase reporter and their tissue specificity and developmental expression characteristics assessed in transgenic maize. All of the isolated promoters tested drove transgene expression predominantly in the embryo and were most active late in embryogenesis during storage protein deposition. One of the most active promoters assessed by transgene expression was associated with the globulin-1 protein. Sequence identified here extended approximately 1.6 kb distal to the previously identified extent of the globulin-1 promoter, and this additional sequence boosted expression over two-fold. The extended globulin-1 promoter sequence isolated in this study has the potential for driving transgene expression at higher levels than those previously reported for cereals. Also, other highly active embryo promoters identified here offer opportunities to express multiple foreign proteins simultaneously at high levels in embryo tissues, while avoiding concerns over gene silencing due to the repeated use of a single promoter. PMID:21844671

  14. Computational identification of conserved microRNAs and their targets from expression sequence tags of blueberry (Vaccinium corybosum)

    PubMed Central

    Li, Xuyan; Hou, Yanming; Zhang, Li; Zhang, Wenhao; Quan, Chen; Cui, Yuhai; Bian, Shaomin

    2014-01-01

    MicroRNAs (miRNAs) are a class of endogenous, approximately 21nt in length, non-coding RNA, which mediate the expression of target genes primarily at post-transcriptional levels. miRNAs play critical roles in almost all plant cellular and metabolic processes. Although numerous miRNAs have been identified in the plant kingdom, the miRNAs in blueberry, which is an economically important small fruit crop, still remain totally unknown. In this study, we reported a computational identification of miRNAs and their targets in blueberry. By conducting an EST-based comparative genomics approach, 9 potential vco-miRNAs were discovered from 22,402 blueberry ESTs according to a series of filtering criteria, designated as vco-miR156–5p, vco-miR156–3p, vco-miR1436, vco-miR1522, vco-miR4495, vco-miR5120, vco-miR5658, vco-miR5783, and vco-miR5986. Based on sequence complementarity between miRNA and its target transcript, 34 target ESTs from blueberry and 70 targets from other species were identified for the vco-miRNAs. The targets were found to be involved in transcription, RNA splicing and binding, DNA duplication, signal transduction, transport and trafficking, stress response, as well as synthesis and metabolic process. These findings will greatly contribute to future research in regard to functions and regulatory mechanisms of blueberry miRNAs. PMID:25763692

  15. Computational identification of conserved microRNAs and their targets from expression sequence tags of blueberry (Vaccinium corybosum).

    PubMed

    Li, Xuyan; Hou, Yanming; Zhang, Li; Zhang, Wenhao; Quan, Chen; Cui, Yuhai; Bian, Shaomin

    2014-01-01

    MicroRNAs (miRNAs) are a class of endogenous, approximately 21nt in length, non-coding RNA, which mediate the expression of target genes primarily at post-transcriptional levels. miRNAs play critical roles in almost all plant cellular and metabolic processes. Although numerous miRNAs have been identified in the plant kingdom, the miRNAs in blueberry, which is an economically important small fruit crop, still remain totally unknown. In this study, we reported a computational identification of miRNAs and their targets in blueberry. By conducting an EST-based comparative genomics approach, 9 potential vco-miRNAs were discovered from 22,402 blueberry ESTs according to a series of filtering criteria, designated as vco-miR156-5p, vco-miR156-3p, vco-miR1436, vco-miR1522, vco-miR4495, vco-miR5120, vco-miR5658, vco-miR5783, and vco-miR5986. Based on sequence complementarity between miRNA and its target transcript, 34 target ESTs from blueberry and 70 targets from other species were identified for the vco-miRNAs. The targets were found to be involved in transcription, RNA splicing and binding, DNA duplication, signal transduction, transport and trafficking, stress response, as well as synthesis and metabolic process. These findings will greatly contribute to future research in regard to functions and regulatory mechanisms of blueberry miRNAs.

  16. Identification of Low-level Point Radioactive Sources using a sensor network

    SciTech Connect

    Chin, J. C.; Rao, Nageswara S.; Yao, David K. Y.; Shankar, Mallikarjun; Yang, Yong; Hou, J. C.; Srivathsan, Sri; Iyengar, S. Sitharama

    2010-09-01

    Identification of a low-level point radioactive source amidst background radiation is achieved by a network of radiation sensors using a two-step approach. Based on measurements from three or more sensors, a geometric difference triangulation method or an N-sensor localization method is used to estimate the location and strength of the source. Then a sequential probability ratio test based on current measurements and estimated parameters is employed to finally decide: (1) the presence of a source with the estimated parameters, or (2) the absence of the source, or (3) the insufficiency of measurements to make a decision. This method achieves specified levels of false alarm and missed detection probabilities, while ensuring a close-to-minimal number of measurements for reaching a decision. This method minimizes the ghost-source problem of current estimation methods, and achieves a lower false alarm rate compared with current detection methods. This method is tested and demonstrated using: (1) simulations, and (2) a test-bed that utilizes the scaling properties of point radioactive sources to emulate high intensity ones that cannot be easily and safely handled in laboratory experiments.
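
    The sketch below illustrates only the second (sequential decision) step: given background rates and the source contribution implied by the localization step, a sequential probability ratio test on Poisson counts decides between "source present", "source absent", and "insufficient measurements". The sensor counts and rates are illustrative assumptions.

```python
import numpy as np
from scipy.stats import poisson

def sprt_source_decision(counts, background, source_contrib,
                         alpha=0.01, beta=0.01):
    """Sequential probability ratio test on a stream of sensor count vectors.

    counts         : iterable of per-sensor measurement vectors (counts)
    background     : expected background counts per sensor
    source_contrib : additional expected counts per sensor if the estimated
                     source (location/strength from triangulation) is present
    Returns "source present", "source absent", or "undecided".
    """
    upper = np.log((1 - beta) / alpha)     # accept H1: source present
    lower = np.log(beta / (1 - alpha))     # accept H0: background only
    llr = 0.0
    for c in counts:
        c = np.asarray(c)
        llr += np.sum(poisson.logpmf(c, background + source_contrib)
                      - poisson.logpmf(c, background))
        if llr >= upper:
            return "source present"
        if llr <= lower:
            return "source absent"
    return "undecided"           # insufficient measurements so far

# toy usage: 3 sensors, weak source raising expected counts slightly
rng = np.random.default_rng(7)
background = np.array([20.0, 25.0, 18.0])
source_contrib = np.array([4.0, 2.5, 6.0])      # from the localization step
stream = (rng.poisson(background + source_contrib) for _ in range(100))
print(sprt_source_decision(stream, background, source_contrib))
```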

  17. Pipasic: similarity and expression correction for strain-level identification and quantification in metaproteomics

    PubMed Central

    Penzlin, Anke; Lindner, Martin S.; Doellinger, Joerg; Dabrowski, Piotr Wojtek; Nitsche, Andreas; Renard, Bernhard Y.

    2014-01-01

    Motivation: Metaproteomic analysis allows studying the interplay of organisms or functional groups and has become increasingly popular also for diagnostic purposes. However, difficulties arise owing to the high sequence similarity between related organisms. Further, the state of conservation of proteins between species can be correlated with their expression level, which can lead to significant bias in results and interpretation. These challenges are similar but not identical to the challenges arising in the analysis of metagenomic samples and require specific solutions. Results: We introduce Pipasic (peptide intensity-weighted proteome abundance similarity correction) as a tool that corrects identification and spectral counting-based quantification results using peptide similarity estimation and expression level weighting within a non-negative lasso framework. Pipasic has distinct advantages over approaches only regarding unique peptides or aggregating results to the lowest common ancestor, as demonstrated on examples of viral diagnostics and an acid mine drainage dataset. Availability and implementation: Pipasic source code is freely available from https://sourceforge.net/projects/pipasic/. Contact: RenardB@rki.de Supplementary information: Supplementary data are available at Bioinformatics online PMID:24931978
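
    The core correction step can be sketched as a non-negative regression of observed spectral counts on a proteome similarity matrix, here with scikit-learn's Lasso constrained to non-negative coefficients. This is a simplification under stated assumptions: Pipasic's actual formulation includes peptide-level similarity estimation and expression-level weighting that are not reproduced here.

```python
import numpy as np
from sklearn.linear_model import Lasso

def correct_abundances(similarity, observed_counts, alpha=0.01):
    """Correct spectral-count-based abundances for inter-proteome similarity.

    similarity      : (n_proteomes x n_proteomes) matrix; entry [i, j] is the
                      fraction of proteome j's detectable peptides that would
                      also be matched by proteome i (self-similarity = 1).
    observed_counts : spectral counts attributed to each proteome.
    Solves observed ~= similarity @ true_abundance with true_abundance >= 0.
    """
    model = Lasso(alpha=alpha, positive=True, fit_intercept=False, max_iter=10000)
    model.fit(similarity, observed_counts)
    corrected = model.coef_
    return corrected / corrected.sum()     # relative abundances

# toy usage: two closely related strains plus one distinct organism
similarity = np.array([[1.00, 0.85, 0.05],
                       [0.85, 1.00, 0.05],
                       [0.05, 0.05, 1.00]])
observed = np.array([120.0, 118.0, 40.0])  # strain counts inflated by shared peptides
print(np.round(correct_abundances(similarity, observed), 3))
```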

  18. Assessment of Social Vulnerability Identification at Local Level around Merapi Volcano - A Self Organizing Map Approach

    NASA Astrophysics Data System (ADS)

    Lee, S.; Maharani, Y. N.; Ki, S. J.

    2015-12-01

    Applying the Self-Organizing Map (SOM) to analyze social vulnerability and to recognize the resilience of sites is a challenging task. The aim of this study is to propose a computational method to identify sites according to their similarity and to determine the most relevant variables characterizing the social vulnerability in each cluster. For this purpose, SOM is considered an effective platform for the analysis of high-dimensional data. By considering the cluster structure, the social vulnerability characteristics of the identified sites can be fully understood. In this study, the social vulnerability measure is constructed from 17 variables, i.e. 12 independent variables representing socio-economic concepts and 5 dependent variables representing the damage and losses due to the Merapi eruption in 2010. These variables collectively represent the local situation of the study area, based on fieldwork conducted in September 2013. By using both independent and dependent variables, we can identify whether the social vulnerability is reflected in the actual situation, in this case the 2010 Merapi eruption. However, social vulnerability analysis of local communities involves a number of variables that represent their socio-economic condition, and some of the variables employed in this study might be more or less redundant. Therefore, SOM is used to reduce the redundant variable(s) by selecting representative variables using the component planes and the correlation coefficients between variables in order to find an effective sample size. The selected dataset was then effectively clustered according to similarity. Finally, this approach can produce reliable clustering, recognize the most significant variables and could be useful for social vulnerability assessment, especially for stakeholders as decision makers. This research was supported by a grant 'Development of Advanced Volcanic Disaster Response System considering
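
    A minimal hand-rolled SOM trainer on standardized indicators is sketched below to make the clustering step concrete; the grid size, learning-rate schedule, and synthetic site data are assumptions, and a real analysis would typically use a dedicated SOM package together with component-plane inspection as described above.

```python
import numpy as np

def train_som(data, grid=(6, 6), n_iter=2000, lr0=0.5, seed=0):
    """Train a small Self-Organizing Map and return the codebook plus the
    best-matching unit (BMU) of every sample, i.e. its cluster on the map."""
    rng = np.random.default_rng(seed)
    data = (data - data.mean(0)) / data.std(0)          # standardize indicators
    rows, cols = grid
    codebook = rng.normal(size=(rows * cols, data.shape[1]))
    coords = np.array([(r, c) for r in range(rows) for c in range(cols)], float)
    sigma0 = max(rows, cols) / 2.0

    for t in range(n_iter):
        x = data[rng.integers(len(data))]
        bmu = np.argmin(((codebook - x) ** 2).sum(1))
        frac = t / n_iter
        lr = lr0 * (1 - frac)                           # decaying learning rate
        sigma = sigma0 * (1 - frac) + 1e-3              # shrinking neighbourhood
        dist2 = ((coords - coords[bmu]) ** 2).sum(1)
        h = np.exp(-dist2 / (2 * sigma ** 2))           # neighbourhood function
        codebook += lr * h[:, None] * (x - codebook)

    bmus = np.array([np.argmin(((codebook - x) ** 2).sum(1)) for x in data])
    return codebook, bmus

# toy usage: 17 vulnerability indicators for 90 hypothetical sites
rng = np.random.default_rng(1)
sites = np.vstack([rng.normal(0, 1, (45, 17)), rng.normal(2, 1, (45, 17))])
codebook, bmus = train_som(sites)
print("distinct map units used:", len(set(bmus.tolist())))
```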

  19. Level set discrete element method for three-dimensional computations with triaxial case study

    NASA Astrophysics Data System (ADS)

    Kawamoto, Reid; Andò, Edward; Viggiani, Gioacchino; Andrade, José E.

    2016-06-01

    In this paper, we outline the level set discrete element method (LS-DEM) which is a discrete element method variant able to simulate systems of particles with arbitrary shape using level set functions as a geometric basis. This unique formulation allows seamless interfacing with level set-based characterization methods as well as computational ease in contact calculations. We then apply LS-DEM to simulate two virtual triaxial specimens generated from XRCT images of experiments and demonstrate LS-DEM's ability to quantitatively capture and predict stress-strain and volume-strain behavior observed in the experiments.
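
    The sketch below illustrates the core LS-DEM contact idea in 2D under simplifying assumptions: one particle is represented by boundary points, the other by a discretized level set (signed distance grid), and penetration depth and contact normals are obtained by interpolating the level set and its gradient at those points. The circular geometry and grid resolution are illustrative only.

```python
import numpy as np
from scipy.interpolate import RegularGridInterpolator

def circle_level_set(center, radius, grid_x, grid_y):
    """Signed distance grid of a circular particle (negative inside)."""
    X, Y = np.meshgrid(grid_x, grid_y, indexing="ij")
    return np.hypot(X - center[0], Y - center[1]) - radius

def contact_points(boundary_pts, level_set, grid_x, grid_y):
    """Return penetration depth and contact normal for every boundary point of
    particle A that lies inside particle B's level set (value < 0)."""
    phi = RegularGridInterpolator((grid_x, grid_y), level_set)
    values = phi(boundary_pts)
    contacts = []
    eps = 1e-4
    for p, v in zip(boundary_pts, values):
        if v < 0:                                   # point has penetrated B
            # normal = numerical gradient of the level set at p
            gx = (phi([p + [eps, 0]]) - phi([p - [eps, 0]]))[0] / (2 * eps)
            gy = (phi([p + [0, eps]]) - phi([p - [0, eps]]))[0] / (2 * eps)
            n = np.array([gx, gy])
            contacts.append((p, -v, n / np.linalg.norm(n)))  # (point, depth, normal)
    return contacts

# toy usage: particle B is a circle; particle A is sampled by boundary points
grid = np.linspace(-3, 3, 241)
phi_B = circle_level_set(center=(0.0, 0.0), radius=1.0, grid_x=grid, grid_y=grid)
theta = np.linspace(0, 2 * np.pi, 64, endpoint=False)
boundary_A = np.column_stack([1.6 + np.cos(theta), np.sin(theta)])  # circle at x=1.6
for point, depth, normal in contact_points(boundary_A, phi_B, grid, grid):
    print(np.round(point, 2), round(depth, 3), np.round(normal, 2))
```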

  20. Computational identification of altered metabolism using gene expression and metabolic pathways.

    PubMed

    Nam, Hojung; Lee, Jinwon; Lee, Doheon

    2009-07-01

    Understanding altered metabolism is an important issue because altered metabolism is often revealed as a cause or an effect in pathogenesis. It has also been shown to be an important factor in the manipulation of an organism's metabolism in metabolic engineering. Unfortunately, it is not yet possible to measure the concentration levels of all metabolites at the genome-wide scale of a metabolic network; consequently, a method that infers the alteration of metabolism is beneficial. The present study proposes a computational method that identifies genome-wide altered metabolism by analyzing functional units of KEGG pathways. As control of a metabolic pathway is accomplished by altering the activity of at least one rate-determining step enzyme, not all enzyme genes in a pathway show significant expression changes even if the pathway is altered. Therefore, we measure the alteration level of a metabolic pathway by selectively observing the expression levels of significantly changed genes in the pathway. The proposed method was applied to gene expression profiles of two Saccharomyces cerevisiae strains measured in very high-gravity (VHG) fermentation. The method identified altered metabolic pathways whose properties are related to the ethanol and osmotic stress responses known to occur in VHG fermentation because of the high sugar concentration in growth media and high ethanol concentration in fermentation products. With the identified altered pathways, the proposed method achieved the best accuracy and sensitivity rates for the Red Star (RS) strain compared to three other related approaches (gene-set enrichment analysis (GSEA), significance analysis of microarray to gene set (SAM-GS), reporter metabolite), and for the CEN.PK 113-7D (CEN) strain, the proposed method and the GSEA method showed comparable performance.
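
    A simplified stand-in for this pathway scoring is sketched below: genes are tested for differential expression between two conditions, and each pathway is scored by the mean absolute log2 fold change of only its significantly changed member genes. The test, cutoff, and toy data are assumptions; the published method works on KEGG functional units with its own statistics.

```python
import numpy as np
from scipy.stats import ttest_ind

def pathway_alteration_scores(expr_a, expr_b, gene_names, pathways, p_cutoff=0.05):
    """Score each metabolic pathway by the mean |log2 fold change| of only those
    member genes whose expression changes significantly between two conditions.

    expr_a, expr_b : (samples x genes) expression matrices for two conditions
    pathways       : dict pathway_name -> list of member gene names
    """
    t, p = ttest_ind(expr_a, expr_b, axis=0)
    log_fc = np.log2(expr_b.mean(0) + 1e-9) - np.log2(expr_a.mean(0) + 1e-9)
    index = {g: i for i, g in enumerate(gene_names)}

    scores = {}
    for name, members in pathways.items():
        idx = [index[g] for g in members if g in index]
        sig = [i for i in idx if p[i] < p_cutoff]
        # only significantly changed genes contribute (rate-determining steps)
        scores[name] = float(np.mean(np.abs(log_fc[sig]))) if sig else 0.0
    return scores

# toy usage: one pathway with a single strongly shifted enzyme gene
rng = np.random.default_rng(2)
genes = [f"g{i}" for i in range(6)]
cond_a = rng.normal(100, 10, (8, 6))
cond_b = cond_a + rng.normal(0, 5, (8, 6))
cond_b[:, 0] *= 3.0                            # enzyme g0 strongly up-regulated
pathways = {"glycolysis": ["g0", "g1", "g2"], "TCA cycle": ["g3", "g4", "g5"]}
print(pathway_alteration_scores(cond_a, cond_b, genes, pathways))
```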

  1. Energy Use and Power Levels in New Monitors and Personal Computers

    SciTech Connect

    Roberson, Judy A.; Homan, Gregory K.; Mahajan, Akshay; Nordman, Bruce; Webber, Carrie A.; Brown, Richard E.; McWhinney, Marla; Koomey, Jonathan G.

    2002-07-23

    Our research was conducted in support of the EPA ENERGY STAR Office Equipment program, whose goal is to reduce the amount of electricity consumed by office equipment in the U.S. The most energy-efficient models in each office equipment category are eligible for the ENERGY STAR label, which consumers can use to identify and select efficient products. As the efficiency of each category improves over time, the ENERGY STAR criteria need to be revised accordingly. The purpose of this study was to provide reliable data on the energy consumption of the newest personal computers and monitors that the EPA can use to evaluate revisions to current ENERGY STAR criteria as well as to improve the accuracy of ENERGY STAR program savings estimates. We report the results of measuring the power consumption and power management capabilities of a sample of new monitors and computers. These results will be used to improve estimates of program energy savings and carbon emission reductions, and to inform revisions of the ENERGY STAR criteria for these products. Our sample consists of 35 monitors and 26 computers manufactured between July 2000 and October 2001; it includes cathode ray tube (CRT) and liquid crystal display (LCD) monitors, Macintosh and Intel-architecture computers, desktop and laptop computers, and integrated computer systems, in which power consumption of the computer and monitor cannot be measured separately. For each machine we measured power consumption when off, on, and in each low-power level. We identify trends in and opportunities to reduce power consumption in new personal computers and monitors. Our results include a trend among monitor manufacturers to provide a single very low low-power level, well below the current ENERGY STAR criteria for sleep power consumption. These very low sleep power results mean that energy consumed when monitors are off or in active use has become more important in terms of contribution to the overall unit energy consumption (UEC

  2. A computer program to facilitate performance assessment of underground low-level waste concrete vaults

    SciTech Connect

    Snyder, K.A.; Clifton, J.R.; Pommersheim, J.

    1996-08-01

    A computer program (4SIGHT) to facilitate performance assessment of underground concrete vaults for low level waste (LLW) disposal facilities is being developed at the National Institute of Standards and Technology (NIST). Specifically, the program predicts the hydraulic conductivity and the service life of an underground concrete vault. The hydraulic conductivity estimate is based upon empirical relations. The service life is estimated from consideration of three major degradation processes: steel reinforcement corrosion, sulfate attack, and leaching. The performance prediction is based upon ion transport equations for both diffusion and advection. Most importantly, the computer program incorporates the synergistic degradation effects of all three processes, and their effect upon the transport coefficients.
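
    To make the transport part concrete, here is a minimal explicit finite-difference sketch of one-dimensional advection-diffusion of a single ionic species through a wall; the parameter values are illustrative and the scheme is far simpler than the coupled-species formulation used in 4SIGHT.

```python
import numpy as np

def advect_diffuse_1d(c0, D, v, dx, dt, n_steps):
    """Explicit finite-difference update of dc/dt = D d2c/dx2 - v dc/dx.

    c0 : initial concentration profile (boundary values held fixed)
    D  : effective diffusion coefficient (m^2/s)
    v  : advective (Darcy) velocity through the wall (m/s)
    """
    c = np.array(c0, dtype=float)
    # explicit scheme stability roughly requires dt <= dx^2 / (2 D)
    assert dt <= dx * dx / (2 * D), "time step too large for explicit scheme"
    for _ in range(n_steps):
        d2c = (c[2:] - 2 * c[1:-1] + c[:-2]) / dx ** 2
        dc = (c[2:] - c[:-2]) / (2 * dx)
        c[1:-1] = c[1:-1] + dt * (D * d2c - v * dc)
    return c

# toy usage: ion ingress through a 0.3 m concrete vault wall over ~10 years
nx, dx = 61, 0.3 / 60                      # 5 mm grid
c = np.zeros(nx); c[0] = 1.0               # normalized exterior concentration
D, v = 1e-12, 1e-11                        # illustrative transport coefficients
dt = 2.0e4                                 # seconds per step (stable for this D, dx)
profile = advect_diffuse_1d(c, D, v, dx, dt, n_steps=int(10 * 365 * 86400 / dt))
print("relative concentration 2 cm into the wall after ~10 years:",
      round(profile[int(0.02 / dx)], 3))
```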

  3. A novel computer-aided detection system for pulmonary nodule identification in CT images

    NASA Astrophysics Data System (ADS)

    Han, Hao; Li, Lihong; Wang, Huafeng; Zhang, Hao; Moore, William; Liang, Zhengrong

    2014-03-01

    Computer-aided detection (CADe) of pulmonary nodules from computed tomography (CT) scans is critical for assisting radiologists to identify lung lesions at an early stage. In this paper, we propose a novel approach for CADe of lung nodules using a two-stage vector quantization (VQ) scheme. The first-stage VQ aims to extract the lungs from the chest volume, while the second-stage VQ is designed to extract initial nodule candidates (INCs) within the lung volume. Rule-based expert filtering is then employed to prune obvious false positives (FPs) from the INCs, and the commonly used support vector machine (SVM) classifier is adopted to further reduce the FPs. The proposed system was validated on 100 CT scans randomly selected from the 262 scans that have at least one juxta-pleural nodule annotation in the publicly available database - Lung Image Database Consortium and Image Database Resource Initiative (LIDC-IDRI). The two-stage VQ only missed 2 out of the 207 nodules at agreement level 1, and the INC detection for each scan took about 30 seconds on average. Expert filtering reduced FPs by more than a factor of 18 while maintaining a sensitivity of 93.24%. As it is straightforward to distinguish INCs attached to the pleural wall from those that are not, we investigated the feasibility of training separate SVM classifiers to further reduce FPs from these two kinds of INCs. Experimental results indicated that SVM classification over the entire set of INCs was preferable, with the optimal operating point of our CADe system achieving a sensitivity of 89.4% at a specificity of 86.8%.

  4. Reshaping Computer Literacy Teaching in Higher Education: Identification of Critical Success Factors

    ERIC Educational Resources Information Center

    Taylor, Estelle; Goede, Roelien; Steyn, Tjaart

    2011-01-01

    Purpose: Acquiring computer skills is more important today than ever before, especially in a developing country. Teaching of computer skills, however, has to adapt to new technology. This paper aims to model factors influencing the success of the learning of computer literacy by means of an e-learning environment. The research question for this…

  5. Man-Computer Symbiosis Through Interactive Graphics: A Survey and Identification of Critical Research Areas.

    ERIC Educational Resources Information Center

    Knoop, Patricia A.

    The purpose of this report was to determine the research areas that appear most critical to achieving man-computer symbiosis. An operational definition of man-computer symbiosis was developed by: (1) reviewing and summarizing what others have said about it, and (2) attempting to distinguish it from other types of man-computer relationships. From…

  6. Identification of Cognitive Processes of Effective and Ineffective Students during Computer Programming

    ERIC Educational Resources Information Center

    Renumol, V. G.; Janakiram, Dharanipragada; Jayaprakash, S.

    2010-01-01

    Identifying the set of cognitive processes (CPs) a student can go through during computer programming is an interesting research problem. It can provide a better understanding of the human aspects in computer programming process and can also contribute to the computer programming education in general. The study identified the presence of a set of…

  7. Species-level identification of staphylococci isolated from bovine mastitis in Brazil using partial 16S rRNA sequencing.

    PubMed

    Lange, Carla C; Brito, Maria A V P; Reis, Daniele R L; Machado, Marco A; Guimarães, Alessandro S; Azevedo, Ana L S; Salles, Érica B; Alvim, Mariana C T; Silva, Fabiana S; Meurer, Igor R

    2015-04-17

    Staphylococci isolated from bovine milk and not classified as Staphylococcus aureus represent a heterogeneous group of microorganisms that are frequently associated with bovine mastitis. The identification of these microorganisms is important, although it is difficult and relatively costly. Genotypic methods add precision in the identification of Staphylococcus species. In the present study, partial 16S rRNA sequencing was used for the species identification of coagulase-positive and coagulase-negative staphylococci isolated from bovine mastitis. Two hundred and two (95%) of the 213 isolates were successfully identified at the species level. The assigning of an isolate to a particular species was based on ≥99% identity with 16S rRNA sequences deposited in GenBank. The identified isolates belonged to 13 different Staphylococcus species; Staphylococcus chromogenes, S. aureus and Staphylococcus epidermidis were the most frequently identified species. Eight isolates could not be assigned to a single species, as the obtained sequences showed 99% or 100% similarity to sequences from two or three different Staphylococcus species. The relatedness of these isolates with the other isolates and reference strains was visualized using a cladogram. In conclusion, 16S rRNA sequencing was an objective and accurate method for the proper identification of Staphylococcus species isolated from bovine mastitis. Additional target genes could be used in non-conclusive cases for the species-level identification of these microorganisms.
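
    The decision rule described above can be sketched as follows: compute the percent identity of the query 16S fragment against reference sequences, assign the species whose reference reaches at least 99% identity, and report ambiguity when two or more species tie above that threshold. The identity function below assumes pre-aligned, equal-length sequences (a simplification; real analyses use BLAST-style alignment against GenBank).

```python
def percent_identity(seq_a, seq_b):
    """Percent identity of two pre-aligned sequences of equal length
    (pairwise alignment is assumed to have been done upstream)."""
    assert len(seq_a) == len(seq_b)
    matches = sum(a == b for a, b in zip(seq_a.upper(), seq_b.upper()))
    return 100.0 * matches / len(seq_a)

def assign_species(query, references, threshold=99.0):
    """Assign a species if references reaching >= threshold identity agree;
    report ambiguity when two or more species tie above the threshold."""
    best = {}
    for species, ref_seq in references.items():
        ident = percent_identity(query, ref_seq)
        if ident >= threshold:
            best[species] = ident
    if not best:
        return "unidentified", best
    if len(best) == 1:
        return next(iter(best)), best
    return "ambiguous ({})".format(", ".join(sorted(best))), best

# toy usage with short mock 16S fragments (real analyses use ~500-1500 bp)
refs = {"Staphylococcus chromogenes": "ACGTACGTACGTACGTACGTACGTACGTACGTACGTACGT",
        "Staphylococcus epidermidis": "ACGTACGTACGTACGTACGTACGTACGTACGTACGTACGA",
        "Staphylococcus aureus":      "ACGTACGTTCGTACGAACGTACGTACGAACGTACTTACGA"}
query = "ACGTACGTACGTACGTACGTACGTACGTACGTACGTACGT"
print(assign_species(query, refs))
```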

  9. High-level waste storage tank farms/242-A evaporator Standards/Requirements Identification Document (S/RID)

    SciTech Connect

    Not Available

    1994-04-01

    The High-Level Waste Storage Tank Farms/242-A Evaporator Standards/Requirements Identification Document (S/RID) is contained in multiple volumes. This document (Volume 3) presents the standards and requirements for the following sections: Safeguards and Security, Engineering Design, and Maintenance.

  10. High-level waste storage tank farms/242-A evaporator Standards/Requirements Identification Document (S/RID), Volume 5

    SciTech Connect

    Not Available

    1994-04-01

    The High-Level Waste Storage Tank Farms/242-A Evaporator Standards/Requirements Identification Document (S/RID) is contained in multiple volumes. This document (Volume 5) outlines the standards and requirements for the Fire Protection and Packaging and Transportation sections.

  11. High-level waste storage tank farms/242-A evaporator Standards/Requirements Identification Document (S/RID), Volume 4

    SciTech Connect

    Not Available

    1994-04-01

    The High-Level Waste Storage Tank Farms/242-A Evaporator Standards/Requirements Identification Document (S/RID) is contained in multiple volumes. This document (Volume 4) presents the standards and requirements for the following sections: Radiation Protection and Operations.

  12. High-level waste storage tank farms/242-A evaporator Standards/Requirements Identification Document (S/RID), Volume 6

    SciTech Connect

    Not Available

    1994-04-01

    The High-Level Waste Storage Tank Farms/242-A Evaporator Standards/Requirements Identification Document (S/RID) is contained in multiple volumes. This document (Volume 6) outlines the standards and requirements for the sections on: Environmental Restoration and Waste Management, Research and Development and Experimental Activities, and Nuclear Safety.

  13. Identification of cis-elements conferring high levels of gene expression in non-green plastids.

    PubMed

    Zhang, Jiang; Ruf, Stephanie; Hasse, Claudia; Childs, Liam; Scharff, Lars B; Bock, Ralph

    2012-10-01

    Although our knowledge about the mechanisms of gene expression in chloroplasts has increased substantially over the past decades, next to nothing is known about the signals and factors that govern expression of the plastid genome in non-green tissues. Here we report the development of a quantitative method suitable for determining the activity of cis-acting elements for gene expression in non-green plastids. The in vivo assay is based on stable transformation of the plastid genome and the discovery that root length upon seedling growth in the presence of the plastid translational inhibitor kanamycin is directly proportional to the expression strength of the resistance gene nptII in transgenic tobacco plastids. By testing various combinations of promoters and translation initiation signals, we have used this experimental system to identify cis-elements that are highly active in non-green plastids. Surprisingly, heterologous expression elements from maize plastids were significantly more efficient in conferring high expression levels in root plastids than homologous expression elements from tobacco. Our work has established a quantitative method for characterization of gene expression in non-green plastid types, and has led to identification of cis-elements for efficient plastid transgene expression in non-green tissues, which are valuable tools for future transplastomic studies in basic and applied research.

  14. Selection of a computer code for Hanford low-level waste engineered-system performance assessment

    SciTech Connect

    McGrail, B.P.; Mahoney, L.A.

    1995-10-01

    Planned performance assessments for the proposed disposal of low-level waste (LLW) glass produced from remediation of wastes stored in underground tanks at Hanford, Washington, will require calculations of radionuclide release rates from the subsurface disposal facility. These calculations will be done with the aid of computer codes. Currently available computer codes were ranked in terms of the feature sets implemented in the code that match a set of physical, chemical, numerical, and functional capabilities needed to assess release rates from the engineered system. The needed capabilities were identified from an analysis of the important physical and chemical processes expected to affect LLW glass corrosion and the mobility of radionuclides. The highest ranked computer code was found to be the ARES-CT code developed at PNL for the US Department of Energy for evaluation of land disposal sites.

  15. Identification of masses in digital mammogram using gray level co-occurrence matrices

    PubMed Central

    Mohd. Khuzi, A; Besar, R; Wan Zaki, WMD; Ahmad, NN

    2009-01-01

    Digital mammography has become the most effective technique for early breast cancer detection. A digital mammogram takes an electronic image of the breast and stores it directly in a computer. The aim of this study is to develop an automated system for assisting the analysis of digital mammograms. Computer image processing techniques are applied to enhance the images, followed by segmentation of the region of interest (ROI). Subsequently, textural features are extracted from the ROI and used to classify the ROIs as either masses or non-masses. In this study, normal breast images and breast images with masses used as the standard input to the proposed system were taken from the Mammographic Image Analysis Society (MIAS) digital mammogram database. In the MIAS database, masses are grouped as spiculated, circumscribed or ill-defined. Additional information includes the location of mass centres and the radius of masses. The extraction of the textural features of ROIs is done by using gray level co-occurrence matrices (GLCM), which are constructed at four different directions for each ROI. The results show that the GLCM at 0°, 45°, 90° and 135° with a block size of 8×8 gives significant texture information to discriminate between mass and non-mass tissues. Analysis of GLCM properties, i.e. contrast, energy and homogeneity, resulted in receiver operating characteristic (ROC) curve areas of Az = 0.84 for Otsu’s method, Az = 0.82 for the thresholding method and Az = 0.7 for K-means clustering. An ROC curve area of 0.8-0.9 is rated as a good result. The proposed method contains no complicated algorithm. The detection is based on a decision tree with five criteria to be analysed. This simplicity leads to less computational time. Thus, this approach is suitable for an automated real-time breast cancer diagnosis system. PMID:21611053
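
    As a rough, hedged illustration of the texture measures named above, the Python sketch below builds a grey-level co-occurrence matrix for a single pixel offset with plain NumPy and derives contrast, energy and homogeneity from it; the 8×8 block handling, the four directions, the thresholding/clustering variants and the decision-tree classifier of the study are not reproduced, and the input block is random stand-in data.

      import numpy as np

      def glcm(block, dx, dy, levels=8):
          """Normalised grey-level co-occurrence matrix for pixel offset (dx, dy)."""
          h, w = block.shape
          m = np.zeros((levels, levels))
          for y in range(max(0, -dy), min(h, h - dy)):
              for x in range(max(0, -dx), min(w, w - dx)):
                  m[block[y, x], block[y + dy, x + dx]] += 1
          return m / m.sum()

      def texture_features(p):
          i, j = np.indices(p.shape)
          return {"contrast": np.sum((i - j) ** 2 * p),
                  "energy": np.sum(p ** 2),
                  "homogeneity": np.sum(p / (1.0 + np.abs(i - j)))}

      roi = np.random.randint(0, 8, size=(8, 8))      # stand-in for an 8x8 ROI block
      print(texture_features(glcm(roi, dx=1, dy=0)))  # 0 deg; repeat for 45/90/135 deg offsets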

  16. Cooperative gene regulation by microRNA pairs and their identification using a computational workflow

    PubMed Central

    Schmitz, Ulf; Lai, Xin; Winter, Felix; Wolkenhauer, Olaf; Vera, Julio; Gupta, Shailendra K.

    2014-01-01

    MicroRNAs (miRNAs) are an integral part of gene regulation at the post-transcriptional level. Recently, it has been shown that pairs of miRNAs can repress the translation of a target mRNA in a cooperative manner, which leads to an enhanced effectiveness and specificity in target repression. However, it remains unclear which miRNA pairs can synergize and which genes are targets of cooperative miRNA regulation. In this paper, we present a computational workflow for the prediction and analysis of cooperating miRNAs and their mutual target genes, which we refer to as RNA triplexes. The workflow integrates methods of miRNA target prediction, triplex structure analysis, molecular dynamics simulations and mathematical modeling for a reliable prediction of functional RNA triplexes and target repression efficiency. In a case study we analyzed the human genome and identified several thousand targets of cooperative gene regulation. Our results suggest that miRNA cooperativity is a frequent mechanism for an enhanced target repression by pairs of miRNAs, facilitating distinctive and fine-tuned target gene expression patterns. Human RNA triplexes predicted and characterized in this study are organized in a web resource at www.sbi.uni-rostock.de/triplexrna/. PMID:24875477

  17. Using the High-Level Based Program Interface to Facilitate the Large Scale Scientific Computing

    PubMed Central

    Shang, Yizi; Shang, Ling; Gao, Chuanchang; Lu, Guiming; Ye, Yuntao; Jia, Dongdong

    2014-01-01

    This paper presents further research on facilitating large-scale scientific computing on grid and desktop grid platforms. The related issues include the programming method, the overhead of middleware based on a high-level program interface, and data anticipation migration. The block-based Gauss-Jordan algorithm, as a real example of large-scale scientific computing, is used to evaluate the issues presented above. The results show that the high-level program interface makes complex scientific applications on a large-scale scientific platform easier, though a little overhead is unavoidable. Also, the data anticipation migration mechanism can improve the efficiency of the platform when it needs to process big-data-based scientific applications. PMID:24574931

  18. Computational Psychiatry of ADHD: Neural Gain Impairments across Marrian Levels of Analysis

    PubMed Central

    Hauser, Tobias U.; Fiore, Vincenzo G.; Moutoussis, Michael; Dolan, Raymond J.

    2016-01-01

    Attention-deficit hyperactivity disorder (ADHD), one of the most common psychiatric disorders, is characterised by unstable response patterns across multiple cognitive domains. However, the neural mechanisms that explain these characteristic features remain unclear. Using a computational multilevel approach, we propose that ADHD is caused by impaired gain modulation in systems that generate this phenotypic increased behavioural variability. Using Marr's three levels of analysis as a heuristic framework, we focus on this variable behaviour, detail how it can be explained algorithmically, and how it might be implemented at a neural level through catecholamine influences on corticostriatal loops. This computational, multilevel, approach to ADHD provides a framework for bridging gaps between descriptions of neuronal activity and behaviour, and provides testable predictions about impaired mechanisms. PMID:26787097

  19. Computation of the intervals of uncertainties about the parameters found for identification

    NASA Technical Reports Server (NTRS)

    Mereau, P.; Raymond, J.

    1982-01-01

    A modeling method to calculate the intervals of uncertainty for parameters found by identification is described. The region of confidence and the general approach to the calculation of these intervals are discussed. The general subprograms for determination of dimensions are described. They provide the organizational charts for the subprograms, the tests carried out and the listings of the different subprograms.

  20. Experimental Validation of the Constant Level Method for Identification of Non-Linear Multi-Degree Systems

    NASA Astrophysics Data System (ADS)

    Dimitriadis, G.

    2002-12-01

    System identification for non-linear dynamical systems could find use in many applications such as condition monitoring, finite element model validation and determination of stability. The effectiveness of existing non-linear system identification techniques is limited by various factors such as the complexity of the system under investigation and the type of non-linearities present. In this work, the constant level identification approach, which can identify multi-degree-of-freedom systems featuring any type of non-linear function, including discontinuous functions, is validated experimentally. The method is shown to identify accurately an experimental dynamical system featuring two types of stiffness non-linearity. The full equations of motion are also extracted accurately, even in the presence of a discontinuous non-linearity.

  1. Systems Level Analysis and Identification of Pathways and Networks Associated with Liver Fibrosis

    PubMed Central

    AbdulHameed, Mohamed Diwan M.; Tawa, Gregory J.; Kumar, Kamal; Ippolito, Danielle L.; Lewis, John A.; Stallings, Jonathan D.; Wallqvist, Anders

    2014-01-01

    Toxic liver injury causes necrosis and fibrosis, which may lead to cirrhosis and liver failure. Despite recent progress in understanding the mechanism of liver fibrosis, our knowledge of the molecular-level details of this disease is still incomplete. The elucidation of networks and pathways associated with liver fibrosis can provide insight into the underlying molecular mechanisms of the disease, as well as identify potential diagnostic or prognostic biomarkers. Towards this end, we analyzed rat gene expression data from a range of chemical exposures that produced observable periportal liver fibrosis as documented in DrugMatrix, a publicly available toxicogenomics database. We identified genes relevant to liver fibrosis using standard differential expression and co-expression analyses, and then used these genes in pathway enrichment and protein-protein interaction (PPI) network analyses. We identified a PPI network module associated with liver fibrosis that includes known liver fibrosis-relevant genes, such as tissue inhibitor of metalloproteinase-1, galectin-3, connective tissue growth factor, and lipocalin-2. We also identified several new genes, such as perilipin-3, legumain, and myocilin, which were associated with liver fibrosis. We further analyzed the expression pattern of the genes in the PPI network module across a wide range of 640 chemical exposure conditions in DrugMatrix and identified early indications of liver fibrosis for carbon tetrachloride and lipopolysaccharide exposures. Although it is well known that carbon tetrachloride and lipopolysaccharide can cause liver fibrosis, our network analysis was able to link these compounds to potential fibrotic damage before histopathological changes associated with liver fibrosis appeared. These results demonstrated that our approach is capable of identifying early-stage indicators of liver fibrosis and underscore its potential to aid in predictive toxicity, biomarker identification, and to generally identify

  2. Tenth Grade Students' Time Using a Computer as a Predictor of the Highest Level of Education Attempted

    ERIC Educational Resources Information Center

    Gaffey, Adam John

    2014-01-01

    As computing technology continued to grow in the lives of secondary students from 2002 to 2006, researchers failed to identify the influence that using computers would have on the highest level of education students attempted. During the early part of the century, schools moved towards increasing the usage of computers. Numerous stakeholders were unsure…

  3. Compiling high-level languages for configurable computers: applying lessons from heterogeneous processing

    NASA Astrophysics Data System (ADS)

    Weaver, Glen E.; Weems, Charles C.; McKinley, Kathryn S.

    1996-10-01

    Configurable systems offer increased performance by providing hardware that matches the computational structure of a problem. This hardware is currently programmed with CAD tools and explicit library calls. To attain widespread acceptance, configurable computing must become transparently accessible from high-level programming languages, but the changeable nature of the target hardware presents a major challenge to traditional compiler technology. A compiler for a configurable computer should optimize the use of functions embedded in hardware and schedule hardware reconfigurations. The hurdles to be overcome in achieving this capability are similar in some ways to those facing compilation for heterogeneous systems. For example, current traditional compilers have neither an interface to accept new primitive operators nor a mechanism for applying optimizations to new operators. We are building a compiler for heterogeneous computing, called Scale, which replaces the traditional monolithic compiler architecture with a flexible framework. Scale has three main parts: a translation director, a compilation library, and a persistent store which holds our intermediate representation as well as other data structures. The translation director exploits the framework's flexibility by using architectural information to build a plan to direct each compilation. The compilation library serves as a toolkit for use by the translation director. Our compiler intermediate representation, Score, facilitates the addition of new IR nodes by distinguishing features used in defining nodes from properties on which transformations depend. In this paper, we present an overview of the Scale architecture and its capabilities for dealing with heterogeneity, followed by a discussion of how those capabilities apply to problems in configurable computing. We then address aspects of configurable computing that are likely to require extensions to our approach and propose some extensions.

  4. Identification of tumor-associated cassette exons in human cancer through EST-based computational prediction and experimental validation

    PubMed Central

    2010-01-01

    Background Much evidence indicates that alternative splicing, the mechanism which produces mRNAs and proteins with different structures and functions from the same gene, is altered in cancer cells. Thus, the identification and characterization of cancer-specific splice variants may give strong impetus to the discovery of novel diagnostic and prognostic tumour biomarkers, as well as of new targets for more selective and effective therapies. Results We present here a genome-wide analysis of the alternative splicing pattern of human genes through a computational analysis of normal and cancer-specific ESTs from seventeen anatomical groups, using data available in AspicDB, a database resource for the analysis of alternative splicing in human. By using a statistical methodology, normal and cancer-specific genes, splice sites and cassette exons were predicted in silico. The condition association of some of the novel normal/tumoral cassette exons was experimentally verified by RT-qPCR assays in the same anatomical system where they were predicted. Remarkably, the presence in vivo of the predicted alternative transcripts, specific for the nervous system, was confirmed in patients affected by glioblastoma. Conclusion This study presents a novel computational methodology for the identification of tumor-associated transcript variants to be used as cancer molecular biomarkers, provides its experimental validation, and reports specific biomarkers for glioblastoma. PMID:20813049

  5. Zebra tape identification for the instantaneous angular speed computation and angular resampling of motorbike valve train measurements

    NASA Astrophysics Data System (ADS)

    Rivola, Alessandro; Troncossi, Marco

    2014-02-01

    An experimental test campaign was performed on the valve train of a racing motorbike engine in order to gain insight into the dynamics of the system. In particular, the valve motion was acquired under cold-test conditions by means of a laser vibrometer able to acquire displacement and velocity signals. The valve time-dependent measurements needed to be referred to the camshaft angular position in order to analyse the data in the angular domain, as usually done for rotating machines. For this purpose the camshaft was fitted with a zebra tape whose dark and light stripes were tracked by means of an optical probe. Unfortunately, both manufacturing and mounting imperfections of the employed zebra tape, resulting in stripes with slightly different widths, precluded the possibility of directly obtaining the correct relationship between camshaft angular position and time. In order to overcome this problem, the identification of the zebra tape was performed by means of the original and practical procedure that is the focus of the present paper. The method consists of three main steps: an ad-hoc test corresponding to special operating conditions, the computation of the instantaneous angular speed, and the final association of the stripes with the corresponding shaft angular position. The results reported in the paper demonstrate the suitability of the simple procedure for zebra tape identification, performed with the final purpose of implementing a computed order tracking technique for the data analysis.

  6. A Comparative Study to Evaluate the Effectiveness of Computer Assisted Instruction (CAI) versus Class Room Lecture (CRL) for Computer Science at ICS Level

    ERIC Educational Resources Information Center

    Kausar, Tayyaba; Choudhry, Bushra Naoreen; Gujjar, Aijaz Ahmed

    2008-01-01

    This study aimed to evaluate the effectiveness of CAI vs. classroom lecture for computer science at ICS level. The objectives were to compare the learning effects of two groups studying the same curriculum, one taught by classroom lecture and one by computer assisted instruction, and the effects of CAI and CRL in terms of cognitive development. Hypothesis of…

  8. Comparative phyloinformatics of virus genes at micro and macro levels in a distributed computing environment

    PubMed Central

    Singh, Dadabhai T; Trehan, Rahul; Schmidt, Bertil; Bretschneider, Timo

    2008-01-01

    Background Preparedness for a possible global pandemic caused by viruses such as the highly pathogenic influenza A subtype H5N1 has become a global priority. In particular, it is critical to monitor the appearance of any new emerging subtypes. Comparative phyloinformatics can be used to monitor, analyze, and possibly predict the evolution of viruses. However, in order to utilize the full functionality of available analysis packages for large-scale phyloinformatics studies, a team of computer scientists, biostatisticians and virologists is needed – a requirement which cannot be fulfilled in many cases. Furthermore, the time complexities of many of the algorithms involved lead to prohibitive runtimes on sequential computer platforms. This has so far hindered the use of comparative phyloinformatics as a commonly applied tool in this area. Results In this paper the graphically oriented workflow design system called Quascade and its efficient usage for comparative phyloinformatics are presented. In particular, we focus on how this task can be effectively performed in a distributed computing environment. As a proof of concept, the designed workflows are used for the phylogenetic analysis of neuraminidase of H5N1 isolates (micro level) and influenza viruses (macro level). The results of this paper are hence twofold. Firstly, this paper demonstrates the usefulness of a graphical user interface system to design and execute complex distributed workflows for large-scale phyloinformatics studies of virus genes. Secondly, the analysis of neuraminidase at different levels of complexity provides valuable insights into this virus's tendency for geographically based clustering in the phylogenetic tree and also shows the importance of glycan sites in its molecular evolution. Conclusion The current study demonstrates the efficiency and utility of workflow systems providing a biologist-friendly approach to complex biological dataset analysis using high performance computing. In particular, the

  9. Signal window minimum average error algorithm for multi-phase level computer-generated holograms

    NASA Astrophysics Data System (ADS)

    El Bouz, Marwa; Heggarty, Kevin

    2000-06-01

    This paper extends the article "Signal window minimum average error algorithm for computer-generated holograms" (JOSA A 1998) to multi-phase level CGHs. We show that using the same rule for calculating the complex error diffusion weights, iterative-algorithm-like low-error signal windows can be obtained for any window shape or position (on- or off-axis) and any number of CGH phase levels. Important algorithm parameters such as amplitude normalisation level and phase freedom diffusers are described and investigated to optimize the algorithm. We show that, combined with a suitable diffuser, the algorithm makes feasible the calculation of high performance CGHs far larger than currently practical with iterative algorithms yet now realisable with modern fabrication techniques. Preliminary experimental optical reconstructions are presented.
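
    To make the idea of diffusing quantisation error across neighbouring CGH pixels concrete, here is a hedged Python sketch of a generic complex error-diffusion quantiser to N phase levels. The scan order and weights are ordinary Floyd-Steinberg-style choices used only for illustration; the signal-window restriction, the rule for calculating the complex error diffusion weights, and the normalisation/diffuser optimisation of the actual algorithm are not reproduced.

      import numpy as np

      def quantise_phase(field, n_levels=4):
          """Quantise a complex field to unit-amplitude pixels on n_levels phases,
          diffusing the complex quantisation error to not-yet-visited neighbours."""
          f = field.astype(complex).copy()
          out = np.zeros_like(f)
          h, w = f.shape
          levels = np.exp(2j * np.pi * np.arange(n_levels) / n_levels)
          weights = [((0, 1), 7/16), ((1, -1), 3/16), ((1, 0), 5/16), ((1, 1), 1/16)]
          for y in range(h):
              for x in range(w):
                  out[y, x] = levels[np.argmin(np.abs(levels - f[y, x]))]
                  err = f[y, x] - out[y, x]
                  for (dy, dx), wgt in weights:
                      yy, xx = y + dy, x + dx
                      if 0 <= yy < h and 0 <= xx < w:
                          f[yy, xx] += wgt * err
          return out

      hologram_plane = np.exp(2j * np.pi * np.random.rand(64, 64))   # random complex field (stand-in)
      quantised = quantise_phase(hologram_plane)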

  10. K-nearest neighbors based methods for identification of different gear crack levels under different motor speeds and loads: Revisited

    NASA Astrophysics Data System (ADS)

    Wang, Dong

    2016-03-01

    Gears are the most commonly used components in mechanical transmission systems. Their failures may cause transmission system breakdown and result in economic loss. Identification of different gear crack levels is important to prevent any unexpected gear failure, because gear cracks lead to gear tooth breakage. Signal-processing-based methods mainly require expertise to explain gear fault signatures, which is usually not easy for ordinary users to achieve. In order to automatically identify different gear crack levels, intelligent gear crack identification methods should be developed. Previous case studies experimentally proved that K-nearest neighbors based methods exhibit high prediction accuracies for identification of 3 different gear crack levels under different motor speeds and loads. In this short communication, to further enhance the prediction accuracies of existing K-nearest neighbors based methods and extend identification of 3 different gear crack levels to identification of 5 different gear crack levels, redundant statistical features are constructed by using the Daubechies 44 (db44) binary wavelet packet transform at different wavelet decomposition levels, prior to the use of a K-nearest neighbors method. The dimensionality of the redundant statistical features is 620, which provides richer gear fault signatures. Since many of these statistical features are redundant and highly correlated with each other, dimensionality reduction of the redundant statistical features is conducted to obtain new significant statistical features. Finally, the K-nearest neighbors method is used to identify 5 different gear crack levels under different motor speeds and loads. A case study including 3 experiments is investigated to demonstrate that the developed method provides higher prediction accuracies than the existing K-nearest neighbors based methods for recognizing different gear crack levels under different motor speeds and loads. Based on the new significant statistical
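
    A hedged outline of the classification stage described above, assuming the 620 redundant wavelet-packet statistical features have already been extracted into a feature matrix: scikit-learn's PCA and KNeighborsClassifier stand in for the dimensionality-reduction and K-nearest neighbours steps, and the arrays below are random placeholders rather than the vibration records of the study.

      import numpy as np
      from sklearn.decomposition import PCA
      from sklearn.neighbors import KNeighborsClassifier
      from sklearn.pipeline import make_pipeline
      from sklearn.model_selection import cross_val_score

      rng = np.random.default_rng(0)
      X = rng.normal(size=(500, 620))      # 500 records x 620 statistical features (placeholder)
      y = rng.integers(0, 5, size=500)     # labels for five gear crack levels (placeholder)

      model = make_pipeline(PCA(n_components=20), KNeighborsClassifier(n_neighbors=5))
      print(cross_val_score(model, X, y, cv=5).mean())   # cross-validated prediction accuracy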

  11. Effect of Computer Simulations at the Particulate and Macroscopic Levels on Students' Understanding of the Particulate Nature of Matter

    ERIC Educational Resources Information Center

    Tang, Hui; Abraham, Michael R.

    2016-01-01

    Computer-based simulations can help students visualize chemical representations and understand chemistry concepts, but simulations at different levels of representation may vary in effectiveness on student learning. This study investigated the influence of computer activities that simulate chemical reactions at different levels of representation…

  12. Dose Assessment in Computed Tomography Examination and Establishment of Local Diagnostic Reference Levels in Mazandaran, Iran

    PubMed Central

    Janbabanezhad Toori, A.; Shabestani-Monfared, A.; Deevband, M.R.; Abdi, R.; Nabahati, M.

    2015-01-01

    Background Medical X-rays are the largest man-made source of public exposure to ionizing radiation. While the benefits of Computed Tomography (CT) are well known in accurate diagnosis, those benefits are not risk-free. CT is a modality with a higher patient dose in comparison with other conventional radiation procedures. Objective This study is aimed at evaluating the radiation dose to patients from Computed Tomography (CT) examinations in Mazandaran hospitals and defining the diagnostic reference level (DRL). Methods Patient-related data on CT protocols for four common CT examinations including brain, sinus, chest and abdomen & pelvis were collected. In each center, Computed Tomography Dose Index (CTDI) measurements were performed using a pencil ionization chamber and a CT dosimetry phantom according to AAPM report No. 96 for those techniques. Then, the Weighted Computed Tomography Dose Index (CTDIw), Volume Computed Tomography Dose Index (CTDIvol) and Dose Length Product (DLP) were calculated. Results The CTDIw for brain, sinus, chest and abdomen & pelvis ranged over (15.6-73), (3.8-25.8), (4.5-16.3) and (7-16.3) mGy, respectively. Values of DLP had ranges of (197.4-981), (41.8-184), (131-342.3) and (283.6-486) mGy·cm for brain, sinus, chest and abdomen & pelvis, respectively. The third quartile of CTDIw, derived from the dose distribution for each examination, is the proposed quantity for the DRL. The DRLs of brain, sinus, chest and abdomen & pelvis are measured as 59.5, 17, 7.8 and 11 mGy, respectively. Conclusion Results of this study demonstrated a wide range of doses for the same examination among different centers. For all examinations, our values were lower than international reference doses. PMID:26688796
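
    The DRL derivation itself comes down to taking the third quartile of the surveyed dose distribution for each examination. A minimal NumPy sketch, using made-up CTDIw values rather than the surveyed data, is shown below.

      import numpy as np

      # Hypothetical weighted CTDI values (mGy) collected across centres for one examination.
      ctdi_w_brain = np.array([15.6, 32.0, 41.5, 48.2, 55.0, 60.3, 73.0])

      drl = np.percentile(ctdi_w_brain, 75)   # third quartile proposed as the DRL
      print(f"Proposed brain DRL: {drl:.1f} mGy")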

  13. A Framework for Different Levels of Integration of Computational Models Into Web-Based Virtual Patients

    PubMed Central

    Narracott, Andrew J; Manini, Simone; Bayley, Martin J; Lawford, Patricia V; McCormack, Keith; Zary, Nabil

    2014-01-01

    Background Virtual patients are increasingly common tools used in health care education to foster learning of clinical reasoning skills. One potential way to expand their functionality is to augment virtual patients’ interactivity by enriching them with computational models of physiological and pathological processes. Objective The primary goal of this paper was to propose a conceptual framework for the integration of computational models within virtual patients, with particular focus on (1) characteristics to be addressed while preparing the integration, (2) the extent of the integration, (3) strategies to achieve integration, and (4) methods for evaluating the feasibility of integration. An additional goal was to pilot the first investigation of changing framework variables on altering perceptions of integration. Methods The framework was constructed using an iterative process informed by Soft System Methodology. The Virtual Physiological Human (VPH) initiative has been used as a source of new computational models. The technical challenges associated with development of virtual patients enhanced by computational models are discussed from the perspectives of a number of different stakeholders. Concrete design and evaluation steps are discussed in the context of an exemplar virtual patient employing the results of the VPH ARCH project, as well as improvements for future iterations. Results The proposed framework consists of four main elements. The first element is a list of feasibility features characterizing the integration process from three perspectives: the computational modelling researcher, the health care educationalist, and the virtual patient system developer. The second element included three integration levels: basic, where a single set of simulation outcomes is generated for specific nodes in the activity graph; intermediate, involving pre-generation of simulation datasets over a range of input parameters; advanced, including dynamic solution of the

  14. Level set methods to compute minimal surfaces in a medium with exclusions (voids).

    SciTech Connect

    Walsh, Timothy Francis; Chopp, David; Torres, Monica

    2003-06-01

    In [T1], periodic minimal surfaces in a medium with exclusions (voids) are constructed, and in this paper we present two algorithms for computing these minimal surfaces. The two algorithms use evolution of level sets by mean curvature. The first algorithm solves the governing nonlinear PDE directly and enforces numerically an orthogonality condition that the surfaces satisfy when they meet the boundaries of the exclusions. The second algorithm involves h-adaptive finite element approximations of a linear convection-diffusion equation, which has been shown to linearize the governing nonlinear PDE for weighted mean curvature flow.
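
    As a rough illustration of the core operation shared by both algorithms, evolving a level set by mean curvature, the Python sketch below performs a few explicit finite-difference updates of phi on a regular grid; periodicity, the exclusion boundaries, the orthogonality condition and the h-adaptive finite element variant are not modelled, and the initial surface is an arbitrary circle.

      import numpy as np

      def curvature_flow_step(phi, dt=0.1, eps=1e-8):
          """One explicit step of phi_t = kappa * |grad phi|, kappa = div(grad phi / |grad phi|)."""
          gy, gx = np.gradient(phi)
          norm = np.sqrt(gx**2 + gy**2) + eps
          ny, nx = gy / norm, gx / norm
          kappa = np.gradient(ny, axis=0) + np.gradient(nx, axis=1)   # divergence of the unit normal
          return phi + dt * kappa * norm

      yy, xx = np.mgrid[0:100, 0:100]
      phi = np.sqrt((xx - 50.0)**2 + (yy - 50.0)**2) - 20.0   # signed distance to a circle
      for _ in range(50):
          phi = curvature_flow_step(phi)    # the zero level set shrinks under mean curvature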

  15. The Effects of Linear Microphone Array Changes on Computed Sound Exposure Level Footprints

    NASA Technical Reports Server (NTRS)

    Mueller, Arnold W.; Wilson, Mark R.

    1997-01-01

    Airport land planning commissions often are faced with determining how much area around an airport is affected by the sound exposure levels (SELs) associated with helicopter operations. This paper presents a study of the effects that changing the size and composition of a microphone array has on the computed SEL contour (ground footprint) areas used by such commissions. Descent flight acoustic data measured by a fifteen-microphone array were reprocessed for five different combinations of microphones within this array. This resulted in data for six different arrays for which SEL contours were computed. The fifteen-microphone array was defined as the 'baseline' array since it contained the greatest amount of data. The computations used a newly developed technique, the Acoustic Re-propagation Technique (ART), which uses parts of the NASA noise prediction program ROTONET. After the areas of the SEL contours were calculated, the differences between the areas were determined. The area differences for the six arrays show that a five- and a three-microphone array (with spacing typical of that required by the FAA FAR Part 36 noise certification procedure) compare well with the fifteen-microphone array. All data were obtained from a database resulting from a joint project conducted by NASA and U.S. Army researchers at Langley and Ames Research Centers. A brief description of the joint project test design, microphone array set-up, and data reduction methodology associated with the database is given.

  16. A cascadic monotonic time-discretized algorithm for finite-level quantum control computation

    NASA Astrophysics Data System (ADS)

    Ditz, P.; Borzì, A.

    2008-03-01

    A computer package (CNMS) is presented aimed at the solution of finite-level quantum optimal control problems. This package is based on a recently developed computational strategy known as monotonic schemes. Quantum optimal control problems arise in particular in quantum optics where the optimization of a control representing laser pulses is required. The purpose of the external control field is to channel the system's wavefunction between given states in its most efficient way. Physically motivated constraints, such as limited laser resources, are accommodated through appropriately chosen cost functionals. Program summary: Program title: CNMS; Catalogue identifier: ADEB_v1_0; Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADEB_v1_0.html; Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland; Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html; No. of lines in distributed program, including test data, etc.: 770; No. of bytes in distributed program, including test data, etc.: 7098; Distribution format: tar.gz; Programming language: MATLAB 6; Computer: AMD Athlon 64 × 2 Dual, 2.21 GHz, 1.5 GB RAM; Operating system: Microsoft Windows XP; Word size: 32; Classification: 4.9; Nature of problem: Quantum control; Solution method: Iterative; Running time: 60-600 sec

  17. A computer toolbox for damage identification based on changes in vibration characteristics

    SciTech Connect

    Doebling, S.W.; Farrar, C.R.; Cornwell, P.J.

    1997-09-01

    This paper introduces a new toolbox of graphical-interface software algorithms for the numerical simulation of vibration tests, analysis of modal data, finite element model correlation, and the comparison of both linear and nonlinear damage identification techniques. This toolbox is unique because it contains several different vibration-based damage identification algorithms, categorized as those which use only measured response and sensor location information ("non-model-based" techniques) and those which use finite element model correlation ("model-based" techniques). Another unique feature of this toolbox is the wide range of algorithms for experimental modal analysis. The toolbox also contains a unique capability that utilizes the measured coherence functions and Monte Carlo analysis to perform statistical uncertainty analysis on the modal correlation capabilities of the toolbox. The paper also shows a sample application which uses the toolbox to analyze the statistical uncertainties in the results of a series of modal tests performed on a highway bridge.

  18. Recent computational advances in the identification of allosteric sites in proteins.

    PubMed

    Lu, Shaoyong; Huang, Wenkang; Zhang, Jian

    2014-10-01

    Allosteric modulators have the potential to fine-tune protein functional activity. Therefore, the targeting of allosteric sites, as a strategy in drug design, is gaining increasing attention. Currently, it is not trivial to find and characterize new allosteric sites by experimental approaches. Alternatively, computational approaches are useful in helping researchers analyze and select potential allosteric sites for drug discovery. Here, we review state-of-the-art computational approaches directed at predicting putative allosteric sites in proteins, along with examples of successes in identifying allosteric sites utilizing these methods. We also discuss the challenges in developing reliable methods for predicting allosteric sites and tactics to resolve demanding tasks. PMID:25107670

  19. TPASS: a gamma-ray spectrum analysis and isotope identification computer code

    SciTech Connect

    Dickens, J.K.

    1981-03-01

    The gamma-ray spectral data-reduction and analysis computer code TPASS is described. This computer code is used to analyze complex Ge(Li) gamma-ray spectra to obtain peak areas corrected for detector efficiencies, from which are determined gamma-ray yields. These yields are compared with an isotope gamma-ray data file to determine the contributions to the observed spectrum from decay of specific radionuclides. A complete FORTRAN listing of the code and a complex test case are given.

  20. Parallel computation of level set method for 500 Hz visual servo control

    NASA Astrophysics Data System (ADS)

    Fei, Xianfeng; Igarashi, Yasunobu; Hashimoto, Koichi

    2008-11-01

    We propose a 2D microorganism tracking system using a parallel level set method and a column parallel vision system (CPV). This system keeps a single microorganism in the middle of the visual field under a microscope by visual servoing of an automated stage. We propose a new energy function for the level set method. This function constrains the amount of light intensity inside the detected object contour to control the number of detected objects. The algorithm is implemented in the CPV system, and the computational time for each frame is approximately 2 ms. A tracking experiment of about 25 s is demonstrated. We also demonstrate that a single paramecium can be kept in track even if other paramecia appear in the visual field and come into contact with the tracked paramecium.

  1. Efficient Calibration of Computationally Intensive Groundwater Models through Surrogate Modelling with Lower Levels of Fidelity

    NASA Astrophysics Data System (ADS)

    Razavi, S.; Anderson, D.; Martin, P.; MacMillan, G.; Tolson, B.; Gabriel, C.; Zhang, B.

    2012-12-01

    A computationally intensive groundwater modelling case study developed with the FEFLOW software is used to evaluate the proposed methodology. Multiple surrogates of this computationally intensive model with different levels of fidelity are created and applied. The dynamically dimensioned search (DDS) optimization algorithm is used as the search engine in the calibration framework enabled with surrogate models. Results show that this framework can substantially reduce the number of original model evaluations required for calibration by intelligently utilizing faster-to-run surrogates in the course of optimization. Results also demonstrate that the compromise between efficiency (reduced run time) and fidelity of a surrogate model is critically important to the success of the framework, as a surrogate with unreasonably low fidelity, despite being fast, might be quite misleading in calibration of the original model of interest.

  2. Establishment of multi-slice computed tomography (MSCT) reference level in Johor, Malaysia

    NASA Astrophysics Data System (ADS)

    Karim, M. K. A.; Hashim, S.; Bakar, K. A.; Muhammad, H.; Sabarudin, A.; Ang, W. C.; Bahruddin, N. A.

    2016-03-01

    Radiation doses from computed tomography (CT) are the highest and most hazardous compared to other imaging modalities. This study aimed to evaluate the radiation dose to patients during computed tomography examinations of the brain, chest and abdomen in Johor, Malaysia, and to establish local diagnostic reference levels (DRLs) for the current, state-of-the-art multi-slice CT scanners. Survey forms were sent to five centres performing CT to obtain data regarding acquisition parameters as well as dose information from the CT consoles. CT-EXPO (Version 2.3.1, Germany) was used to validate the dose information. The proposed DRLs were indicated by rounding the third quartiles of the whole dose distributions, where the mean values of CTDIw (mGy), CTDIvol (mGy) and DLP (mGy·cm) were comparable with other reference levels: 63, 63, and 1015 respectively for CT brain; 15, 14, and 450 respectively for CT thorax; and 16, 17, and 590 respectively for CT abdomen. The study revealed that CT practice and dose output have been revolutionised and must keep up with the pace of newly introduced technology. We suggest that CTDIvol should be included in current national DRLs, as modern CTs are configured with a higher number of detectors and are independent of pitch factors.

  3. High-level ab initio computations of the absorption spectra of organic iridium complexes.

    PubMed

    Plasser, Felix; Dreuw, Andreas

    2015-02-12

    The excited states of fac-tris(phenylpyridinato)iridium [Ir(ppy)3] and the smaller model complex Ir(C3H4N)3 are computed using a number of high-level ab initio methods, including the recently implemented algebraic diagrammatic construction method through third order, ADC(3). A detailed description of the states is provided through advanced analysis methods, which allow a quantification of different charge transfer and orbital relaxation effects and give extended insight into the many-body wave functions. Compared to the ADC(3) benchmark, an unexpected, striking difference is found for ADC(2) applied to Ir(C3H4N)3, which derives from an overstabilization of charge transfer effects. Time-dependent density functional theory (TDDFT) using the B3LYP functional shows an analogous but less severe error for charge transfer states, whereas the ωB97 results are in good agreement with ADC(3). Multireference configuration interaction computations, which are in reasonable agreement with ADC(3), reveal that static correlation does not play a significant role. In the case of the larger Ir(ppy)3 complex, results at the TDDFT/B3LYP and TDDFT/ωB97 levels of theory are presented. Strong discrepancies between the two functionals, which are found with respect to the energies, characters, and density of the low-lying states, are discussed in detail and compared to experiment. PMID:25584785

  4. Derivation of Australian diagnostic reference levels for paediatric multi detector computed tomography.

    PubMed

    Hayton, Anna; Wallace, Anthony

    2016-09-01

    Australian National Diagnostic Reference Levels for paediatric multi detector computed tomography were established for three protocols, Head, Chest and AbdoPelvis, across two age groups, Baby/Infant 0-4 years and Child 5-14 years, by the Australian Radiation Protection and Nuclear Safety Agency in 2012. The establishment of Australian paediatric DRLs is an important step towards lowering patient CT doses on a national scale. While the adult DRLs were calculated with data collected from the web-based Australian National Diagnostic Reference Level Service, no paediatric data were submitted in the first year of service operation. Data from an independent Royal Australian and New Zealand College of Radiologists Quality Use of Diagnostic Imaging paediatric optimisation survey were used. The paediatric DRLs were defined for CTDIvol (mGy) and DLP (mGy·cm) values that referenced the 16 cm PMMA phantom for the Head protocol and the 32 cm PMMA phantom for body protocols for both paediatric age groups. The Australian paediatric DRLs for multi detector computed tomography are, for the Head, Chest and AbdoPelvis protocols respectively, 470, 60 and 170 mGy·cm for the Baby/Infant age group, and 600, 110 and 390 mGy·cm for the Child age group. A comparison with published international paediatric DRLs for computed tomography reveals the Australian paediatric DRLs to be lower on average. However, the comparison is complicated by misalignment of the defined age ranges. It is the intention of ARPANSA to review the paediatric DRLs in conjunction with a review of the adult DRLs, which should occur within 5 years of their publication. PMID:27350262

  5. Computer-aided identification of the water diffusion coefficient for maize kernels dried in a thin layer

    NASA Astrophysics Data System (ADS)

    Kujawa, Sebastian; Weres, Jerzy; Olek, Wiesław

    2016-07-01

    Uncertainties in mathematical modelling of water transport in cereal grain kernels during drying and storage are mainly due to implementing unreliable values of the water diffusion coefficient and simplifying the geometry of kernels. In the present study an attempt was made to reduce the uncertainties by developing a method for computer-aided identification of the water diffusion coefficient and more accurate 3D geometry modelling for individual kernels using original inverse finite element algorithms. The approach was exemplified by identifying the water diffusion coefficient for maize kernels subjected to drying. On the basis of the developed method, values of the water diffusion coefficient were estimated, 3D geometry of a maize kernel was represented by isoparametric finite elements, and the moisture content inside maize kernels dried in a thin layer was predicted. Validation of the results against experimental data showed significantly lower error values than in the case of results obtained for the water diffusion coefficient values available in the literature.
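
    As a much-simplified, hypothetical counterpart to the inverse finite-element identification described above, the Python sketch below fits an effective diffusion coefficient to thin-layer drying data using only the leading term of the analytical solution for a sphere; the 3D isoparametric kernel geometry of the actual method is not represented, and the radius and drying curve are made up.

      import numpy as np
      from scipy.optimize import curve_fit

      R = 4.0e-3   # assumed equivalent kernel radius, m (hypothetical)

      def moisture_ratio(t, D):
          """Leading term of Fick's solution for a sphere (valid for the later drying stage)."""
          return (6.0 / np.pi**2) * np.exp(-np.pi**2 * D * t / R**2)

      t  = np.array([600, 1200, 1800, 2400, 3600, 4800])     # drying time, s (hypothetical)
      mr = np.array([0.55, 0.41, 0.30, 0.22, 0.12, 0.07])    # dimensionless moisture ratio

      (D_fit,), _ = curve_fit(moisture_ratio, t, mr, p0=[1e-9])
      print(f"Identified effective diffusivity: {D_fit:.2e} m^2/s")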

  6. A comparative analysis of computational approaches and algorithms for protein subcomplex identification.

    PubMed

    Zaki, Nazar; Mora, Antonio

    2014-01-01

    High-throughput AP-MS methods have allowed the identification of many protein complexes. However, most post-processing methods for this type of data have focused on the detection of protein complexes and not their subcomplexes. Here, we review the results of some existing methods that may allow subcomplex detection and propose alternative methods in order to detect subcomplexes from AP-MS data. We assessed and drew comparisons between the use of overlapping clustering methods, methods based on the core-attachment model and our own prediction strategy (TRIBAL). The hypothesis behind TRIBAL is that subcomplex-building information may be concealed in the multiple edges generated by an interaction repeated in different contexts in the raw data. The CACHET method offered the best results when the evaluation of the predicted subcomplexes was carried out using both the hypergeometric and geometric scores. TRIBAL offered the best performance when using a strict meet-min score.

  7. Computer Aided Drug Design Approaches for Identification of Novel Autotaxin (ATX) Inhibitors.

    PubMed

    Vrontaki, Eleni; Melagraki, Georgia; Kaffe, Eleanna; Mavromoustakos, Thomas; Kokotos, George; Aidinis, Vassilis; Afantitis, Antreas

    2016-01-01

    Autotaxin (ATX) has become an attractive target of huge pharmacological and pharmacochemical interest in LPA-related diseases, and to date many small organic molecules have been explored as potential ATX inhibitors. In the various efforts to identify novel, effective ATX inhibitors, in silico methods can serve as an important and valuable tool. In particular, Virtual Screening (VS) has recently received increased attention due to the large datasets made available, the development of advanced VS techniques, and the encouraging fact that VS has contributed to the discovery of several compounds that have either reached the market or entered clinical trials. Different techniques and workflows have been reported in the literature with the goal of prioritizing possible potent hits. In this review article, several deployed virtual screening strategies for the identification of novel potent ATX inhibitors are described. PMID:26997151

  8. IUE data reduction: Wavelength determinations and line identifications using a VAX/750 computer

    NASA Astrophysics Data System (ADS)

    Davidson, J. P.; Bord, D. J.

    A fully automated, interactive system for determining the wavelengths of features in extracted IUE spectra is described. Wavelengths are recorded from video displays of expanded plots of individual orders using a movable cursor, and then corrected for IUE wavelength scale errors. The estimated accuracy of an individual wavelength in the final tabulation is 0.050 Å. Such lists are ideally suited for line identification work using the method of wavelength coincidence statistics (WCS). The results of WCS studies of the ultraviolet spectra of the chemically peculiar (CP) stars iota Coronae Borealis and kappa Cancri are presented. Aside from confirming a number of previously reported aspects of the abundance patterns in these stars, the searches produced some interesting new discoveries, notably the presence of Hf in the spectrum of kappa Cancri. The implications of this work for theories designed to account for anomalous abundances in chemically peculiar stars are discussed.

  9. Power levels in office equipment: Measurements of new monitors and personal computers

    SciTech Connect

    Roberson, Judy A.; Brown, Richard E.; Nordman, Bruce; Webber, Carrie A.; Homan, Gregory H.; Mahajan, Akshay; McWhinney, Marla; Koomey, Jonathan G.

    2002-05-14

    Electronic office equipment has proliferated rapidly over the last twenty years and is projected to continue growing in the future. Efforts to reduce the growth in office equipment energy use have focused on power management to reduce power consumption of electronic devices when not being used for their primary purpose. The EPA ENERGY STAR® program has been instrumental in gaining widespread support for power management in office equipment, and accurate information about the energy used by office equipment in all power levels is important to improving program design and evaluation. This paper presents the results of a field study conducted during 2001 to measure the power levels of new monitors and personal computers. We measured off, on, and low-power levels in about 60 units manufactured since July 2000. The paper summarizes power data collected, explores differences within the sample (e.g., between CRT and LCD monitors), and discusses some issues that arise in metering office equipment. We also present conclusions to help improve the success of future power management programs. Our findings include a trend among monitor manufacturers to provide a single very low low-power level, and the need to standardize methods for measuring monitor on power, to more accurately estimate the annual energy consumption of office equipment, as well as actual and potential energy savings from power management.

  10. Effect of Number of Graphic Symbols, Levels, and Listening Conditions on Symbol Identification and Latency in Persons with Aphasia.

    PubMed

    Petroi, Diana; Koul, Rajinder K; Corwin, Melinda

    2014-02-27

    This study investigated the ability of persons with aphasia to complete a series of experimental tasks involving single symbol and subject-verb-object sentence identification on a speech-generating device (SGD) in the presence/absence of competing stimuli. In all, 10 persons with Broca's aphasia and 10 persons in the control group were compared on accuracy and response latency of symbol identification across three listening conditions. Persons with aphasia identified fewer symbols accurately and had longer response latencies than persons in the control group. Number of symbols on the screen and location level had a significant effect on accuracy and latency for both groups. Persons with aphasia perceived tasks to be more difficult than persons in the control group. Results indicate that effective use of SGDs by persons with aphasia may depend on several message organization factors including location and number of symbols per screen. PMID:24575783

  11. High-level waste storage tank farms/242-A evaporator standards/requirements identification document (S/RID), Vol. 7

    SciTech Connect

    Not Available

    1994-04-01

    This Requirements Identification Document (RID) describes an Occupational Health and Safety Program as defined through the Relevant DOE Orders, regulations, industry codes/standards, industry guidance documents and, as appropriate, good industry practice. The definition of an Occupational Health and Safety Program as specified by this document is intended to address Defense Nuclear Facilities Safety Board Recommendations 90-2 and 91-1, which call for the strengthening of DOE complex activities through the identification and application of relevant standards which supplement or exceed requirements mandated by DOE Orders. This RID applies to the activities, personnel, structures, systems, components, and programs involved in maintaining the facility and executing the mission of the High-Level Waste Storage Tank Farms.

  12. Identification and analysis of unsatisfactory psychosocial work situations: a participatory approach employing video-computer interaction.

    PubMed

    Hanse, J J; Forsman, M

    2001-02-01

    A method for psychosocial evaluation of potentially stressful or unsatisfactory situations in manual work was developed. It focuses on subjective responses regarding specific situations and is based on interactive worker assessment when viewing video recordings of oneself. The worker is first video-recorded during work. The video is then displayed on the computer terminal, and the filmed worker clicks on virtual controls on the screen whenever an unsatisfactory psychosocial situation appears; a window of questions regarding psychological demands, mental strain and job control is then opened. A library with pictorial information and comments on the selected situations is formed in the computer. The evaluation system, called PSIDAR, was applied in two case studies, one of manual materials handling in an automotive workshop and one of a group of workers producing and testing instrument panels. The findings indicate that PSIDAR can provide data that are useful in a participatory ergonomic process of change.

  13. BETCO: A Computer Program for the Removal of Barometric and Earth Tide Effects From Water Levels

    NASA Astrophysics Data System (ADS)

    Toll, N.; Rasmussen, T. C.

    2005-12-01

    Barometric pressure effects in long-term water level measurements can mask drawdown responses to well tests and natural stimuli. Noise caused by barometric pressure and earth tide effects complicates analysis of pressure response data using diagnostic pressure derivative plots. A computer program has been developed to remove fluctuations in groundwater levels induced by changes in barometric pressure and earth tides. The program implements a regression deconvolution method to obtain a barometric response function and remove the barometric pressure and earth tide effects from the groundwater level data. Using the barometric response function yields a better residual or corrected head than using a constant barometric efficiency. The graphical response function can be used to diagnose aquifer type and well skin effects. A modification of the regression deconvolution has been implemented to simultaneously remove earth tide effects as well as barometric effects on water levels. The removal of the earth tide effects is provided as a beta feature. The software has been applied to 13 water level data sets at the Waste Isolation Pilot Plant in Carlsbad, NM. The results are compared to a constant barometric efficiency correction method. The freeware is available as an install wizard for Windows XP and 2000. As of submission, all results output from BETCO are considered preliminary; please do not cite. The code is under continued development and will be qualified per the Sandia National Laboratories WIPP Software QA Plan requirements. This research is funded by WIPP programs administered by the Office of Environmental Management (EM) of the U.S. Department of Energy. Sandia is a multiprogram laboratory operated by Sandia Corporation, a Lockheed Martin Company, for the United States Department of Energy's National Nuclear Security Administration under contract DE-AC04-94AL85000.
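
    The regression-deconvolution step described above reduces to an ordinary least-squares problem: regress changes in water level on current and lagged changes in barometric pressure, take the fitted coefficients as the barometric response function, and subtract the predicted barometric component from the record. The sketch below illustrates only that idea and is not the BETCO code; the array names and the 24-sample lag window are assumptions.

```python
# Minimal sketch of regression deconvolution for barometric correction
# (illustrative only; not the BETCO implementation). Inputs are numpy arrays.
import numpy as np

def barometric_response(head, baro, max_lag=24):
    """Estimate the barometric response function by regressing changes in
    water level on current and lagged changes in barometric pressure."""
    dh = np.diff(head)          # water-level changes
    db = np.diff(baro)          # barometric-pressure changes
    n = len(dh) - max_lag
    # Design matrix: column k holds db shifted by k samples
    X = np.column_stack([db[max_lag - k : max_lag - k + n] for k in range(max_lag + 1)])
    y = dh[max_lag:]
    coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coeffs               # impulse response; its cumulative sum is the step response

def corrected_head(head, baro, coeffs):
    """Remove the predicted barometric component from the measured head."""
    db = np.diff(baro)
    predicted = np.convolve(db, coeffs)[: len(db)]   # predicted head change per step
    corrected = head.copy()
    corrected[1:] -= np.cumsum(predicted)
    return corrected
```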

  14. Patient dose, gray level and exposure index with a computed radiography system

    NASA Astrophysics Data System (ADS)

    Silva, T. R.; Yoshimura, E. M.

    2014-02-01

    Computed radiography (CR) is gradually replacing conventional screen-film systems in Brazil. To assess image quality, manufacturers provide the calculation of an exposure index through the acquisition software of the CR system. The objective of this study is to verify whether the CR image can also be used as an evaluator of patient absorbed dose, through a relationship between the entrance skin dose and the exposure index or the gray level values obtained in the image. The CR system used for this study (Agfa model 30-X with NX acquisition software) calculates an exposure index called Log of the Median (lgM), related to the absorbed dose to the IP. The lgM value depends on the average gray level (called Scan Average Level (SAL)) of the segmented pixel value histogram of the whole image. A Rando male phantom was used to simulate a human body (chest and head), and was irradiated with X-ray equipment using usual radiologic techniques for chest exams. Thermoluminescent dosimeters (LiF, TLD100) were used to evaluate entrance skin dose and exit dose. The results showed a logarithmic relation between entrance dose and SAL at the image center, regardless of the beam filtration. The exposure index varies linearly with the entrance dose, but the angular coefficient is beam-quality dependent. We conclude that, with an adequate calibration, the CR system can be used to evaluate the patient absorbed dose.
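
    The reported relationships can be illustrated with a simple curve fit: a logarithmic model for SAL versus entrance dose and a linear model for the exposure index. The snippet below is purely illustrative; the dose, SAL and lgM arrays are hypothetical placeholders, not the study's measurements.

```python
# Illustrative fits for the relations reported above (hypothetical data).
import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import linregress

entrance_dose = np.array([0.5, 1.0, 2.0, 4.0, 8.0])     # mGy, placeholder values
sal = np.array([1200., 1650., 2100., 2550., 3000.])      # scan average level, placeholder
lgm = np.array([1.1, 1.2, 1.4, 1.8, 2.6])                # exposure index, placeholder

def log_model(dose, a, b):
    return a * np.log(dose) + b

(a, b), _ = curve_fit(log_model, entrance_dose, sal)      # SAL ~ a*ln(dose) + b
fit = linregress(entrance_dose, lgm)                      # lgM ~ slope*dose + intercept
print(f"SAL = {a:.1f}*ln(dose) + {b:.1f}")
print(f"lgM slope (beam-quality dependent): {fit.slope:.3f}")
```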

  15. Self-Assessment and Student Improvement in an Introductory Computer Course at the Community College-Level

    ERIC Educational Resources Information Center

    Spicer-Sutton, Jama

    2013-01-01

    The purpose of this study was to determine students' computer knowledge upon course entry and whether there was a difference in college students' improvement scores, as measured by the difference between pretest and posttest scores of new or novice users, moderate users, and expert users, at the end of a college-level introductory computing class.…

  16. Effectiveness of Computer Assisted Instructions (CAI) in Teaching of Mathematics at Secondary Level

    NASA Astrophysics Data System (ADS)

    Dhevakrishnan, R.; Devi, S.; Chinnaiyan, K.

    2012-09-01

    The present study examined the effectiveness of computer assisted instruction (CAI) in the teaching of mathematics at the secondary level, adopting an experimental method to observe the difference between CAI and the traditional method. A sample of sixty (60) students of IX class in VVB Matriculation Higher Secondary School at Elayampalayam, Namakkal district was selected and divided into two groups, namely an experimental and a control group. The experimental group consisted of 30 students who were taught 'Mensuration' by computer assisted instruction, and the control group comprising 30 students was taught by the conventional method of teaching. Data were analyzed using mean, S.D. and t-test. Findings of the study clearly point out that a significant increase in the mean gain scores was found in the post test scores of the experimental group. Significant differences were found between the control group and the experimental group on post test gain scores. The experimental group, which was taught by CAI, showed better learning. The conclusion is evident that CAI is an effective medium of instruction for teaching mathematics to secondary students.

  17. Computational identification and analysis of MADS box genes in Camellia sinensis.

    PubMed

    Gogoi, Madhurjya; Borchetia, Sangeeta; Bandyopadhyay, Tanoy

    2015-01-01

    MADS (Minichromosome Maintenance1 Agamous Deficiens Serum response factor) box genes encode transcription factors that play a key role in the growth and development of flowering plants. There are two types of MADS box genes: Type I (serum response factor (SRF)-like) and Type II (myocyte enhancer factor 2 (MEF2)-like). Type II MADS box genes have a conserved MIKC domain (MADS DNA-binding domain, intervening domain, keratin-like domain, and C-terminal domain) and have been extensively studied in plants. Compared to other plants, very little is known about MADS box genes in Camellia sinensis. The present study aims at identifying and analyzing the MADS-box genes present in Camellia sinensis. A comparative bioinformatics and phylogenetic analysis of the Camellia sinensis sequences along with Arabidopsis thaliana MADS box sequences available in the public domain databases led to the identification of 16 genes orthologous to Type II MADS box gene family members. The protein sequences were classified into distinct clades associated with the conserved functions of flower and seed development. The identified genes may be used in gene expression and gene manipulation studies to elucidate their role in the development and flowering of tea, which may pave the way to improved crop productivity.

  18. Computational identification of human long intergenic non-coding RNAs using a GA-SVM algorithm.

    PubMed

    Wang, Yanqiu; Li, Yang; Wang, Qi; Lv, Yingli; Wang, Shiyuan; Chen, Xi; Yu, Xuexin; Jiang, Wei; Li, Xia

    2014-01-01

    Long intergenic non-coding RNAs (lincRNAs) are a new type of non-coding RNA and are closely related to the occurrence and development of diseases. In previous studies, most lincRNAs have been identified through next-generation sequencing. Because lincRNAs exhibit tissue-specific expression, the reproducibility of lincRNA discovery across different studies is very poor. In this study, rather than relying on lincRNA expression, we used sequence, structural and protein-coding potential features to construct a classifier that can distinguish lincRNAs from non-lincRNAs. The GA-SVM algorithm was used to extract the optimized feature subset. Compared with several other feature subsets, five-fold cross validation showed that this optimized feature subset exhibited the best performance for the identification of human lincRNAs. Moreover, the LincRNA Classifier based on Selected Features (linc-SF) was constructed with a support vector machine (SVM) using the optimized feature subset. The performance of this classifier was further evaluated by predicting lincRNAs from two independent lincRNA sets. Because the recognition rates for the two lincRNA sets were 100% and 99.8%, linc-SF was found to be effective for the prediction of human lincRNAs.
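
    As a rough illustration of the GA-SVM wrapper idea described above, the sketch below evolves binary feature masks with a small genetic algorithm and scores each mask by the five-fold cross-validated accuracy of an SVM restricted to the selected features. It is a generic sketch rather than the linc-SF implementation; the feature matrix X and labels y are assumed inputs.

```python
# Generic GA-SVM feature selection sketch (not the linc-SF code).
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

def fitness(mask, X, y):
    """Five-fold cross-validated accuracy of an SVM on the selected features."""
    if mask.sum() == 0:
        return 0.0
    return cross_val_score(SVC(kernel="rbf"), X[:, mask.astype(bool)], y, cv=5).mean()

def ga_select(X, y, pop_size=16, generations=20, p_mut=0.05):
    """Evolve binary feature masks; return the best mask found."""
    n_feat = X.shape[1]
    pop = rng.integers(0, 2, size=(pop_size, n_feat))
    for _ in range(generations):
        scores = np.array([fitness(ind, X, y) for ind in pop])
        parents = pop[np.argsort(scores)[::-1]][: pop_size // 2]   # truncation selection
        children = []
        while len(parents) + len(children) < pop_size:
            a, b = parents[rng.integers(len(parents), size=2)]
            cut = int(rng.integers(1, n_feat))
            child = np.concatenate([a[:cut], b[cut:]])             # one-point crossover
            flip = rng.random(n_feat) < p_mut                      # bit-flip mutation
            child[flip] = 1 - child[flip]
            children.append(child)
        pop = np.vstack([parents, np.array(children)])
    best = max(pop, key=lambda ind: fitness(ind, X, y))
    return best.astype(bool)
```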

  19. Computational identification and analysis of MADS box genes in Camellia sinensis

    PubMed Central

    Gogoi, Madhurjya; Borchetia, Sangeeta; Bandyopadhyay, Tanoy

    2015-01-01

    MADS (Minichromosome Maintenance1 Agamous Deficiens Serum response factor) box genes encode transcription factors that play a key role in the growth and development of flowering plants. There are two types of MADS box genes: Type I (serum response factor (SRF)-like) and Type II (myocyte enhancer factor 2 (MEF2)-like). Type II MADS box genes have a conserved MIKC domain (MADS DNA-binding domain, intervening domain, keratin-like domain, and C-terminal domain) and have been extensively studied in plants. Compared to other plants, very little is known about MADS box genes in Camellia sinensis. The present study aims at identifying and analyzing the MADS-box genes present in Camellia sinensis. A comparative bioinformatics and phylogenetic analysis of the Camellia sinensis sequences along with Arabidopsis thaliana MADS box sequences available in the public domain databases led to the identification of 16 genes orthologous to Type II MADS box gene family members. The protein sequences were classified into distinct clades associated with the conserved functions of flower and seed development. The identified genes may be used in gene expression and gene manipulation studies to elucidate their role in the development and flowering of tea, which may pave the way to improved crop productivity. PMID:25914445

  20. Computational Identification of Genomic Features That Influence 3D Chromatin Domain Formation

    PubMed Central

    Mourad, Raphaël; Cuvier, Olivier

    2016-01-01

    Recent advances in long-range Hi-C contact mapping have revealed the importance of the 3D structure of chromosomes in gene expression. A current challenge is to identify the key molecular drivers of this 3D structure. Several genomic features, such as architectural proteins and functional elements, were shown to be enriched at topological domain borders using classical enrichment tests. Here we propose multiple logistic regression to identify those genomic features that positively or negatively influence domain border establishment or maintenance. The model is flexible, and can account for statistical interactions among multiple genomic features. Using both simulated and real data, we show that our model outperforms enrichment tests and non-parametric models, such as random forests, for the identification of genomic features that influence domain borders. Using Drosophila Hi-C data at a very high resolution of 1 kb, our model suggests that, among architectural proteins, BEAF-32 and CP190 are the main positive drivers of 3D domain borders. In humans, our model identifies the well-known architectural proteins CTCF and cohesin, as well as ZNF143 and Polycomb group proteins, as positive drivers of domain borders. The model also reveals the existence of several negative drivers that counteract the presence of domain borders, including P300, RXRA, BCL11A and ELK1. PMID:27203237
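
    The multiple logistic regression described above can be sketched with a standard regularized fit: the response is whether a genomic bin contains a domain border, the predictors are feature occupancies, and pairwise products supply the statistical interactions. The code below is a generic sketch with hypothetical feature values, not the authors' implementation.

```python
# Generic multiple logistic regression for border vs. non-border bins
# (hypothetical feature values; not the authors' code).
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.preprocessing import PolynomialFeatures

# Rows are genomic bins; columns are feature coverages (placeholder numbers).
X = np.array([
    [0.9, 0.8, 0.1],   # BEAF-32, CP190, H3K27me3
    [0.1, 0.2, 0.5],
    [0.2, 0.1, 0.6],
    [0.8, 0.9, 0.2],
    [0.0, 0.1, 0.7],
    [0.7, 0.6, 0.1],
    [0.3, 0.2, 0.4],
    [0.1, 0.0, 0.5],
])
y = np.array([1, 0, 0, 1, 0, 1, 0, 0])   # 1 = bin contains a domain border

# interaction_only adds pairwise products, mirroring statistical interactions.
design = PolynomialFeatures(degree=2, interaction_only=True, include_bias=False)
Xd = design.fit_transform(X)

clf = LogisticRegression(penalty="l2", C=1.0).fit(Xd, y)
names = design.get_feature_names_out(["BEAF32", "CP190", "H3K27me3"])
for name, coef in zip(names, clf.coef_[0]):
    print(f"{name:18s} {coef:+.2f}")   # sign suggests positive or negative driver
```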

  1. Computational identification of putative miRNAs and their target genes in pathogenic amoeba Naegleria fowleri.

    PubMed

    Padmashree, Dyavegowda; Swamy, Narayanaswamy Ramachandra

    2015-01-01

    Naegleria fowleri is a parasitic, unicellular, free-living eukaryotic amoeba. The parasite spreads through contaminated water and causes primary amoebic meningoencephalitis (PAM); it is therefore of interest to understand its molecular pathogenesis. Hence, we analyzed the parasite genome for miRNAs (microRNAs), which are non-coding, single-stranded RNA molecules. Using computational methods we identified 245 miRNAs in N. fowleri, of which five are conserved. The predicted miRNA targets were analyzed using the miRanda software, and their functions were further studied by annotation with AmiGO (a gene ontology web tool). PMID:26770029

  2. Identification of Cardiac and Aortic Injuries in Trauma with Multi-detector Computed Tomography

    PubMed Central

    Shergill, Arvind K; Maraj, Tishan; Barszczyk, Mark S; Cheung, Helen; Singh, Navneet; Zavodni, Anna E

    2015-01-01

    Blunt and penetrating cardiovascular (CV) injuries are associated with a high morbidity and mortality. Rapid detection of these injuries in trauma is critical for patient survival. The advent of multi-detector computed tomography (MDCT) has led to increased detection of CV injuries during rapid comprehensive scanning of stabilized major trauma patients. MDCT has the ability to acquire images with a higher temporal and spatial resolution, as well as the capability to create multiplanar reformats. This pictorial review illustrates several common and life-threatening traumatic CV injuries from a regional trauma center. PMID:26430541

  3. Computational identification of putative miRNAs and their target genes in pathogenic amoeba Naegleria fowleri.

    PubMed

    Padmashree, Dyavegowda; Swamy, Narayanaswamy Ramachandra

    2015-01-01

    Naegleria fowleri is a parasitic, unicellular, free-living eukaryotic amoeba. The parasite spreads through contaminated water and causes primary amoebic meningoencephalitis (PAM); it is therefore of interest to understand its molecular pathogenesis. Hence, we analyzed the parasite genome for miRNAs (microRNAs), which are non-coding, single-stranded RNA molecules. Using computational methods we identified 245 miRNAs in N. fowleri, of which five are conserved. The predicted miRNA targets were analyzed using the miRanda software, and their functions were further studied by annotation with AmiGO (a gene ontology web tool).

  4. Computational identification of putative miRNAs and their target genes in pathogenic amoeba Naegleria fowleri

    PubMed Central

    Padmashree, Dyavegowda; Swamy, Narayanaswamy Ramachandra

    2015-01-01

    Naegleria fowleri is a parasitic, unicellular, free-living eukaryotic amoeba. The parasite spreads through contaminated water and causes primary amoebic meningoencephalitis (PAM); it is therefore of interest to understand its molecular pathogenesis. Hence, we analyzed the parasite genome for miRNAs (microRNAs), which are non-coding, single-stranded RNA molecules. Using computational methods we identified 245 miRNAs in N. fowleri, of which five are conserved. The predicted miRNA targets were analyzed using the miRanda software, and their functions were further studied by annotation with AmiGO (a gene ontology web tool). PMID:26770029

  5. Two-Level Verification of Data Integrity for Data Storage in Cloud Computing

    NASA Astrophysics Data System (ADS)

    Xu, Guangwei; Chen, Chunlin; Wang, Hongya; Zang, Zhuping; Pang, Mugen; Jiang, Ping

    Data storage in cloud computing can save capital expenditure and relieve the burden of storage management for users. Because stored files may be lost or corrupted, many researchers focus on the verification of data integrity. However, massive numbers of users bring large numbers of verification tasks for the auditor. Moreover, users also need to pay an extra fee for these verification tasks beyond the storage fee. Therefore, we propose a two-level verification of data integrity to alleviate these problems. The key idea is that users routinely verify the data integrity themselves, while the auditor arbitrates challenges between the user and the cloud provider according to the MACs and ϕ values. Extensive performance simulations show that the proposed scheme markedly decreases the auditor's verification tasks and the ratio of wrong arbitrations.
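
    As a minimal sketch of the MAC-based checking idea, the code below computes a per-block HMAC tag at storage time and re-verifies the tags later; the split between the routine user-level check and the auditor-arbitrated challenge is only indicated in comments, and the block size and function names are assumptions rather than the scheme's actual parameters.

```python
# Minimal sketch of per-block MAC verification for stored data
# (illustrative of the idea only; not the paper's scheme).
import hmac, hashlib

BLOCK = 4096  # bytes per block, an arbitrary choice for the sketch

def tag_blocks(data: bytes, key: bytes):
    """User-side: compute an HMAC tag for every block before upload."""
    blocks = [data[i:i + BLOCK] for i in range(0, len(data), BLOCK)]
    return [hmac.new(key, b, hashlib.sha256).digest() for b in blocks]

def verify_blocks(data: bytes, key: bytes, tags):
    """Routine (first-level) check by the user; any mismatch would be
    escalated to the auditor as a challenge against the cloud provider."""
    blocks = [data[i:i + BLOCK] for i in range(0, len(data), BLOCK)]
    bad = [i for i, (b, t) in enumerate(zip(blocks, tags))
           if not hmac.compare_digest(hmac.new(key, b, hashlib.sha256).digest(), t)]
    return bad  # indices of corrupted blocks (empty list means intact)

key = b"user-secret-key"
stored = b"x" * 10000
tags = tag_blocks(stored, key)
print(verify_blocks(stored, key, tags))   # [] -> data intact
```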

  6. Instruction-Level Characterization of Scientific Computing Applications Using Hardware Performance Counters

    SciTech Connect

    Luo, Y.; Cameron, K.W.

    1998-11-24

    Workload characterization has proven to be an essential tool for architecture design and performance evaluation in both scientific and commercial computing. Traditional workload characterization metrics include FLOPS rate, cache miss ratios, and CPI (cycles per instruction) or IPC (instructions per cycle). Given the complexity of sophisticated modern superscalar microprocessors, these traditional characterization techniques are not powerful enough to pinpoint the performance bottleneck of an application on a specific microprocessor. They are also incapable of immediately demonstrating the potential performance benefit of any architectural or functional improvement in a new processor design. To solve these problems, many people rely on simulators, which have substantial constraints, especially for large-scale scientific computing applications. This paper presents a new technique for characterizing applications at the instruction level using hardware performance counters. It has the advantage of collecting instruction-level characteristics in a few runs with virtually no overhead or slowdown. A variety of instruction counts can be used to calculate average abstract workload parameters corresponding to microprocessor pipelines or functional units. Based on the microprocessor's architectural constraints and these calculated abstract parameters, the architectural performance bottleneck for a specific application can be estimated. In particular, the analysis results can provide some insight into why only a small percentage of processor peak performance is achieved even for many very cache-friendly codes. Meanwhile, the bottleneck estimation can suggest viable architectural or functional improvements for certain workloads. Eventually, these abstract parameters can lead to the creation of an analytical microprocessor pipeline model and memory hierarchy model.
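
    The abstract workload parameters mentioned above (CPI or IPC, per-instruction cache behaviour, floating-point mix) are simple ratios of raw counter values. The snippet below shows that arithmetic on hypothetical counter readings; the counter names are placeholders, not any specific processor's event set.

```python
# Derived instruction-level metrics from raw hardware counter values
# (hypothetical counter readings; names are placeholders).
counters = {
    "cycles":        4_000_000_000,
    "instructions":  3_200_000_000,
    "fp_ops":          900_000_000,
    "l1d_misses":        48_000_000,
    "loads_stores":   1_100_000_000,
}

cpi = counters["cycles"] / counters["instructions"]              # cycles per instruction
ipc = 1.0 / cpi                                                  # instructions per cycle
mpki = 1000 * counters["l1d_misses"] / counters["instructions"]  # L1D misses per kilo-instruction
l1_miss_ratio = counters["l1d_misses"] / counters["loads_stores"]
fp_fraction = counters["fp_ops"] / counters["instructions"]      # floating-point mix

print(f"CPI={cpi:.2f}  IPC={ipc:.2f}  L1D MPKI={mpki:.1f}  "
      f"L1D miss ratio={l1_miss_ratio:.3%}  FP fraction={fp_fraction:.1%}")
```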

  7. Goal-Directed Behavior and Instrumental Devaluation: A Neural System-Level Computational Model

    PubMed Central

    Mannella, Francesco; Mirolli, Marco; Baldassarre, Gianluca

    2016-01-01

    Devaluation is the key experimental paradigm used to demonstrate the presence of instrumental behaviors guided by goals in mammals. We propose a neural system-level computational model to address the question of which brain mechanisms allow the current value of rewards to control instrumental actions. The model pivots on, and shows the computational soundness of, the hypothesis according to which the internal representation of instrumental manipulanda (e.g., levers) activates the representation of rewards (or “action-outcomes”, e.g., foods) while attributing to them a value that depends on the current internal state of the animal (e.g., satiation for some but not all foods). The model also proposes an initial hypothesis of the integrated system of key brain components supporting this process and allowing the recalled outcomes to bias action selection: (a) the sub-system formed by the basolateral amygdala and insular cortex acquiring the manipulanda-outcomes associations and attributing the current value to the outcomes; (b) three basal ganglia-cortical loops selecting respectively goals, associative sensory representations, and actions; (c) the cortico-cortical and striato-nigro-striatal neural pathways supporting the selection, and selection learning, of actions based on habits and goals. The model reproduces and explains the results of several devaluation experiments carried out with control rats and rats with pre- and post-training lesions of the basolateral amygdala, the nucleus accumbens core, the prelimbic cortex, and the dorso-medial striatum. The results support the soundness of the hypotheses of the model and show its capacity to integrate, at the system level, the operations of the key brain structures underlying devaluation. Based on its hypotheses and predictions, the model also represents an operational framework to support the design and analysis of new experiments on the motivational aspects of goal-directed behavior. PMID:27803652

  8. Identification of miRNAs Potentially Involved in Bronchiolitis Obliterans Syndrome: A Computational Study.

    PubMed

    Di Carlo, Stefano; Rossi, Elena; Politano, Gianfranco; Inghilleri, Simona; Morbini, Patrizia; Calabrese, Fiorella; Benso, Alfredo; Savino, Alessandro; Cova, Emanuela; Zampieri, Davide; Meloni, Federica

    2016-01-01

    The pathogenesis of Bronchiolitis Obliterans Syndrome (BOS), the main clinical phenotype of chronic lung allograft dysfunction, is poorly understood. Recent studies suggest that epigenetic regulation of microRNAs might play a role in its development. In this paper we present the application of a complex computational pipeline to perform pathway-based enrichment analysis of miRNAs, applied to the study of BOS. The analysis considered the full set of miRNAs annotated in miRBase (version 21), and applied a sequence of filtering approaches and statistical analyses to reduce this set and to score the candidate miRNAs according to their potential involvement in BOS development. Dysregulation of two of the selected candidate miRNAs, miR-34a and miR-21, was clearly shown by in-situ hybridization (ISH) on five explanted human BOS lungs and on a rat model of acute and chronic lung rejection, thus definitively identifying miR-34a and miR-21 as pathogenic factors in BOS and confirming the effectiveness of the computational pipeline. PMID:27564214

  9. Myocardial strain estimation from CT: towards computer-aided diagnosis on infarction identification

    NASA Astrophysics Data System (ADS)

    Wong, Ken C. L.; Tee, Michael; Chen, Marcus; Bluemke, David A.; Summers, Ronald M.; Yao, Jianhua

    2015-03-01

    Regional myocardial strains have the potential for early quantification and detection of cardiac dysfunction. Although imaging modalities such as tagged and strain-encoded MRI can provide motion information for the myocardium, they are uncommon in clinical routine. In contrast, cardiac CT images are usually available, but they only provide motion information at salient features such as the cardiac boundaries. To estimate myocardial strains from a CT image sequence, we adopted a cardiac biomechanical model with hyperelastic material properties to relate the motion on the cardiac boundaries to the myocardial deformation. The frame-to-frame displacements of the cardiac boundaries are obtained using B-spline deformable image registration based on mutual information and are enforced as boundary conditions on the biomechanical model. The system equation is solved by the finite element method to provide the dense displacement field of the myocardium, and the regional values of the three principal strains and the six strains in cylindrical coordinates are computed in terms of the American Heart Association nomenclature. To study the potential of the estimated regional strains for identifying myocardial infarction, experiments were performed on cardiac CT image sequences of ten canines with artificially induced myocardial infarctions. The leave-one-subject-out cross validations show that, by using the optimal strain magnitude thresholds computed from ROC curves, the radial strain and the first principal strain have the best performance.
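
    Once a dense displacement field is available, regional strains follow from standard continuum mechanics: form the deformation gradient F from the displacement gradient, build the Green-Lagrange tensor E = 1/2 (F^T F - I), and take its eigenvalues as the principal strains. The sketch below shows that last step for a single material point; it is a generic formula, not the registration or finite-element pipeline of the paper.

```python
# Principal (Green-Lagrange) strains at one material point, given the
# local displacement gradient du/dX (generic formula, not the paper's pipeline).
import numpy as np

def principal_strains(grad_u):
    """grad_u: 3x3 displacement gradient du_i/dX_j at a point."""
    F = np.eye(3) + grad_u                        # deformation gradient
    E = 0.5 * (F.T @ F - np.eye(3))               # Green-Lagrange strain tensor
    return np.sort(np.linalg.eigvalsh(E))[::-1]   # first, second, third principal strain

# Example: ~10% stretch along x with slight shear in the x-y plane.
grad_u = np.array([[0.10, 0.02, 0.00],
                   [0.02, -0.03, 0.00],
                   [0.00, 0.00, -0.02]])
print(principal_strains(grad_u))
```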

  10. Identification of miRNAs Potentially Involved in Bronchiolitis Obliterans Syndrome: A Computational Study

    PubMed Central

    Politano, Gianfranco; Inghilleri, Simona; Morbini, Patrizia; Calabrese, Fiorella; Benso, Alfredo; Savino, Alessandro; Cova, Emanuela; Zampieri, Davide; Meloni, Federica

    2016-01-01

    The pathogenesis of Bronchiolitis Obliterans Syndrome (BOS), the main clinical phenotype of chronic lung allograft dysfunction, is poorly understood. Recent studies suggest that epigenetic regulation of microRNAs might play a role in its development. In this paper we present the application of a complex computational pipeline to perform pathway-based enrichment analysis of miRNAs, applied to the study of BOS. The analysis considered the full set of miRNAs annotated in miRBase (version 21), and applied a sequence of filtering approaches and statistical analyses to reduce this set and to score the candidate miRNAs according to their potential involvement in BOS development. Dysregulation of two of the selected candidate miRNAs, miR-34a and miR-21, was clearly shown by in-situ hybridization (ISH) on five explanted human BOS lungs and on a rat model of acute and chronic lung rejection, thus definitively identifying miR-34a and miR-21 as pathogenic factors in BOS and confirming the effectiveness of the computational pipeline. PMID:27564214

  11. Computer identification of snoRNA genes using a Mammalian Orthologous Intron Database

    PubMed Central

    Fedorov, Alexei; Stombaugh, Jesse; Harr, Michael W.; Yu, Saihua; Nasalean, Lorena; Shepelev, Valery

    2005-01-01

    Based on comparative genomics, we created a bioinformatic package for computer prediction of small nucleolar RNA (snoRNA) genes in mammalian introns. The core of our approach was the use of the Mammalian Orthologous Intron Database (MOID), which contains all known introns within the human, mouse and rat genomes. Introns from orthologous genes of these three species that have the same position relative to the reading frame are grouped in a special orthologous intron table. Our program SNO.pl searches for conserved snoRNA motifs within MOID and reports all cases in which characteristic snoRNA-like structures are present in all three orthologous introns of human, mouse and rat sequences. Here we report an example of SNO.pl usage, searching for a particular pattern of conserved C/D-box snoRNA motifs (canonical C and D boxes and a 6-nt-long terminal stem). In this computer analysis, we detected 57 triplets of snoRNA-like structures in three mammals. Among them were 15 triplets that represented known C/D-box snoRNA genes. Six triplets represented snoRNA genes that had only been partially characterized in the mouse genome. One case represented a novel snoRNA gene, and another three cases, putative snoRNAs. Our programs are publicly available and can be easily adapted and/or modified for searching any conserved motifs within mammalian introns. PMID:16093549
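
    The kind of search described above (canonical C and D boxes flanked by a short terminal stem) can be approximated with simple pattern matching. The sketch below uses the commonly cited consensus patterns RTGATGA (C box) and CTGA (D box) in the DNA alphabet and requires a perfectly complementary 6-nt terminal stem; these patterns, the length bounds and the stem rule are simplifying assumptions, not SNO.pl's exact criteria.

```python
# Simplified C/D-box snoRNA-like pattern scan in an intron sequence
# (assumed consensus patterns; a stand-in for SNO.pl, not its actual rules).
import re

C_BOX = re.compile(r"[AG]TGATGA")   # C box consensus RUGAUGA (DNA alphabet)
D_BOX = re.compile(r"CTGA")         # D box consensus CUGA
COMP = str.maketrans("ACGT", "TGCA")

def revcomp(seq):
    return seq.translate(COMP)[::-1]

def find_cd_box_candidates(intron, stem_len=6, min_len=60, max_len=200):
    """Yield (start, end) spans bounded by a C box and a downstream D box
    whose flanking stem_len nucleotides can form a terminal stem."""
    for c in C_BOX.finditer(intron):
        for d in D_BOX.finditer(intron, c.end()):
            start, end = c.start() - stem_len, d.end() + stem_len
            if start < 0 or end > len(intron):
                continue
            if not (min_len <= end - start <= max_len):
                continue
            left, right = intron[start:c.start()], intron[d.end():end]
            if left == revcomp(right):          # perfect 6-nt terminal stem
                yield start, end

intron = "GG" + "ACGTGA" + "ATGATGA" + "A" * 60 + "CTGA" + "TCACGT" + "GG"
print(list(find_cd_box_candidates(intron)))   # -> [(2, 85)]
```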

  12. Computational Identification of Mechanistic Factors That Determine the Timing and Intensity of the Inflammatory Response

    PubMed Central

    Nagaraja, Sridevi; Reifman, Jaques; Mitrophanov, Alexander Y.

    2015-01-01

    Timely resolution of inflammation is critical for the restoration of homeostasis in injured or infected tissue. Chronic inflammation is often characterized by a persistent increase in the concentrations of inflammatory cells and molecular mediators, whose distinct amount and timing characteristics offer an opportunity to identify effective therapeutic regulatory targets. Here, we used our recently developed computational model of local inflammation to identify potential targets for molecular interventions and to investigate the effects of individual and combined inhibition of such targets. This was accomplished via the development and application of computational strategies involving the simulation and analysis of thousands of inflammatory scenarios. We found that modulation of macrophage influx and efflux is an effective potential strategy to regulate the amount of inflammatory cells and molecular mediators in both normal and chronic inflammatory scenarios. We identified three molecular mediators (tumor necrosis factor-α (TNF-α), transforming growth factor-β (TGF-β), and the chemokine CXCL8) as potential molecular targets whose individual or combined inhibition may robustly regulate both the amount and timing properties of the kinetic trajectories for neutrophils and macrophages in chronic inflammation. Modulation of macrophage flux, as well as of the abundance of TNF-α, TGF-β, and CXCL8, may improve the resolution of chronic inflammation. PMID:26633296

  13. Detection of glycopeptide resistance genotypes and identification to the species level of clinically relevant enterococci by PCR.

    PubMed Central

    Dutka-Malen, S; Evers, S; Courvalin, P

    1995-01-01

    A PCR assay that allows simultaneous detection of glycopeptide resistance genotypes and identification to the species level of clinically relevant enterococci (Enterococcus faecium, E. faecalis, E. gallinarum, and E. casseliflavus) was developed. This assay was based on specific amplification of internal fragments of genes encoding D-alanine:D-alanine ligases and related glycopeptide resistance proteins. The specificity of the assay was tested on 5 well-characterized glycopeptide-resistant strains and on 15 susceptible enterococcal type strains. Clinical isolates of enterococci that could not be identified to the species level by conventional methods were identified by the PCR test. This assay offers a specific and rapid alternative to antibiotic susceptibility tests, in particular for detection of low-level vancomycin resistance. PMID:7699051

  14. Jimena: efficient computing and system state identification for genetic regulatory networks

    PubMed Central

    2013-01-01

    Background Boolean networks capture the switching behavior of many naturally occurring regulatory networks. For semi-quantitative modeling, interpolation between ON and OFF states is necessary. The high-degree polynomial interpolation of Boolean genetic regulatory networks (GRNs) in cellular processes such as apoptosis or proliferation allows for the modeling of a wider range of node interactions than continuous activator-inhibitor models, but suffers from scaling problems for networks that contain nodes with more than ~10 inputs. Many GRNs from the literature or from new gene expression experiments exceed these limitations, and a new approach was developed. Results (i) As part of our new GRN simulation framework Jimena we introduce and set up Boolean-tree-based data structures; (ii) the corresponding algorithms greatly expedite the calculation of the polynomial interpolation in almost all cases, thereby expanding the range of networks that can be simulated by this model in reasonable time; (iii) stable states for discrete models are efficiently counted and identified using binary decision diagrams. As an application example, we show how system states can now be sampled efficiently in small- to large-scale hormone and disease networks (Arabidopsis thaliana development and immunity, the pathogen Pseudomonas syringae, and modulation by cytokinins and plant hormones). Conclusions Jimena simulates currently available GRNs about 10-100 times faster than the previous implementation of the polynomial interpolation model, and even greater gains are achieved for large scale-free networks. This speed-up also facilitates a much more thorough sampling of continuous state spaces, which may lead to the identification of new stable states. Mutants of large networks can be constructed and analyzed very quickly, enabling new insights into network robustness and behavior. PMID:24118878
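
    The polynomial interpolation mentioned above extends a Boolean update rule to continuous activation levels in [0,1]: the multilinear interpolation of a Boolean function is a weighted sum of its values over all 0/1 input assignments, which is why naive evaluation scales as 2^n in the number of inputs. The sketch below spells out that naive formula to make the scaling problem concrete; Jimena's Boolean-tree data structures, which avoid the blow-up, are not reproduced here.

```python
# Naive multilinear (polynomial) interpolation of a Boolean update function.
# Cost grows as 2**n with the number of inputs, which is the scaling problem
# that tree-based representations such as Jimena's are designed to avoid.
from itertools import product

def interpolate(boolean_f, x):
    """boolean_f: function of a tuple of 0/1 values; x: continuous levels in [0,1]."""
    n = len(x)
    total = 0.0
    for corner in product((0, 1), repeat=n):        # all 2**n Boolean input assignments
        weight = 1.0
        for xi, bi in zip(x, corner):
            weight *= xi if bi else (1.0 - xi)      # probability-style weighting
        total += weight * boolean_f(corner)
    return total

# Example: a node activated by input 0 unless inhibited by input 1.
node_rule = lambda s: int(s[0] and not s[1])
print(interpolate(node_rule, (0.8, 0.3)))   # 0.8 * 0.7 = 0.56
```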

  15. Computational approaches for identification of conserved/unique binding pockets in the A chain of ricin

    SciTech Connect

    Ecale Zhou, C L; Zemla, A T; Roe, D; Young, M; Lam, M; Schoeniger, J; Balhorn, R

    2005-01-29

    Specific and sensitive ligand-based protein detection assays that employ antibodies or ligands such as peptides, aptamers, or other small molecules require that the corresponding surface region of the protein be accessible and that there be minimal cross-reactivity with non-target proteins. To reduce the time and cost of laboratory screening efforts for diagnostic reagents, we developed new methods for evaluating and selecting protein surface regions for ligand targeting. We devised combined structure- and sequence-based methods for identifying 3D epitopes and binding pockets on the surface of the A chain of ricin that are conserved with respect to a set of ricin A chains and unique with respect to other proteins. We (1) used structure alignment software to detect structural deviations and extracted from this analysis the residue-residue correspondence, (2) devised a method to compare corresponding residues across sets of ricin structures and structures of closely related proteins, (3) devised a sequence-based approach to determine residue infrequency in local sequence context, and (4) modified a pocket-finding algorithm to identify surface crevices in close proximity to residues determined to be conserved/unique based on our structure- and sequence-based methods. In applying this combined informatics approach to ricin A, we identified a conserved/unique pocket in close proximity to (but not overlapping) the active site that is suitable for bi-dentate ligand development. These methods are generally applicable to identification of surface epitopes and binding pockets for development of diagnostic reagents, therapeutics, and vaccines.
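
    Step (2) above, comparing corresponding residues across a set of aligned sequences or structures, amounts to a per-position conservation score. The sketch below computes a simple Shannon-entropy conservation score per alignment column; it is a generic illustration on toy fragments, not the specific structure- and sequence-based scoring used in the paper.

```python
# Per-column Shannon-entropy conservation for an aligned set of sequences
# (a generic illustration, not the paper's specific scoring scheme).
import math
from collections import Counter

def column_conservation(aligned_seqs):
    """Return one score per column: 1.0 = fully conserved, 0.0 = maximally variable."""
    n_cols = len(aligned_seqs[0])
    scores = []
    for j in range(n_cols):
        column = [s[j] for s in aligned_seqs if s[j] != "-"]
        counts = Counter(column)
        total = len(column)
        entropy = -sum((c / total) * math.log2(c / total) for c in counts.values())
        max_entropy = math.log2(min(20, total)) if total > 1 else 1.0
        scores.append(1.0 - entropy / max_entropy if max_entropy > 0 else 1.0)
    return scores

toy_alignment = ["MKPNQV", "MKPNHV", "MKANQV"]   # toy aligned fragments
print([round(s, 2) for s in column_conservation(toy_alignment)])
```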

  16. Noninvasive fractional flow reserve derived from coronary computed tomography angiography for identification of ischemic lesions: a systematic review and meta-analysis

    PubMed Central

    Wu, Wen; Pan, Dao-Rong; Foin, Nicolas; Pang, Si; Ye, Peng; Holm, Niels; Ren, Xiao-Min; Luo, Jie; Nanjundappa, Aravinda; Chen, Shao-Liang

    2016-01-01

    Detection of coronary ischemic lesions by fractional flow reserve (FFR) has been established as the gold standard. In recent years, novel computer based methods have emerged and they can provide simulation of FFR using coronary artery images acquired from coronary computed tomography angiography (FFRCT). This meta-analysis aimed to evaluate diagnostic performance of FFRCT using FFR as the reference standard. Databases of PubMed, Cochrane Library, EMBASE, Medion and Web of Science were searched. Seven studies met the inclusion criteria, including 833 stable patients (1377 vessels or lesions) with suspected or known coronary artery disease (CAD). The patient-based analysis showed that the pooled estimates of sensitivity, specificity and diagnostic odds ratio (DOR) for detection of ischemic lesions were 0.89 [95% confidence interval (CI), 0.85–0.93], 0.76 (95% CI, 0.64–0.84) and 26.21 (95% CI, 13.14–52.28). At a per-vessel or per-lesion level, the pooled estimates were as follows: sensitivity 0.84 (95% CI, 0.80–0.87), specificity 0.76 (95% CI, 0.67–0.83) and DOR 16.87 (95% CI, 9.41–30.25). Area under summary receiver operating curves was 0.90 (95% CI, 0.87–0.92) and 0.86 (95% CI, 0.83–0.89) at the two analysis levels, respectively. In conclusion, FFRCT technology achieves a moderate diagnostic performance for noninvasive identification of ischemic lesions in stable patients with suspected or known CAD in comparison to invasive FFR measurement. PMID:27377422
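
    For orientation, the diagnostic odds ratio reported above relates to sensitivity and specificity as DOR = [Se/(1-Se)] / [(1-Sp)/Sp], and a simple fixed-effect pooling can be done on log-DOR with inverse-variance weights. The sketch below shows that arithmetic on hypothetical 2x2 counts; it is a simplified stand-in, not necessarily the model used in the review.

```python
# Per-study diagnostic odds ratios and a simple inverse-variance pooled log-DOR
# (hypothetical 2x2 counts; not the formal meta-analytic model of the review).
import math

# (true positives, false positives, false negatives, true negatives), one tuple per study
studies = [(45, 10, 6, 70), (60, 14, 8, 95), (38, 9, 4, 66)]

log_dors, weights = [], []
for tp, fp, fn, tn in studies:
    dor = (tp * tn) / (fp * fn)          # equals [Se/(1-Se)] / [(1-Sp)/Sp]
    var = 1/tp + 1/fp + 1/fn + 1/tn      # Woolf variance of log(DOR)
    log_dors.append(math.log(dor))
    weights.append(1 / var)

pooled = sum(w * d for w, d in zip(weights, log_dors)) / sum(weights)
se = math.sqrt(1 / sum(weights))
print(f"pooled DOR = {math.exp(pooled):.1f} "
      f"(95% CI {math.exp(pooled - 1.96*se):.1f}-{math.exp(pooled + 1.96*se):.1f})")
```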

  17. Noninvasive fractional flow reserve derived from coronary computed tomography angiography for identification of ischemic lesions: a systematic review and meta-analysis.

    PubMed

    Wu, Wen; Pan, Dao-Rong; Foin, Nicolas; Pang, Si; Ye, Peng; Holm, Niels; Ren, Xiao-Min; Luo, Jie; Nanjundappa, Aravinda; Chen, Shao-Liang

    2016-01-01

    Detection of coronary ischemic lesions by fractional flow reserve (FFR) has been established as the gold standard. In recent years, novel computer based methods have emerged and they can provide simulation of FFR using coronary artery images acquired from coronary computed tomography angiography (FFRCT). This meta-analysis aimed to evaluate diagnostic performance of FFRCT using FFR as the reference standard. Databases of PubMed, Cochrane Library, EMBASE, Medion and Web of Science were searched. Seven studies met the inclusion criteria, including 833 stable patients (1377 vessels or lesions) with suspected or known coronary artery disease (CAD). The patient-based analysis showed that the pooled estimates of sensitivity, specificity and diagnostic odds ratio (DOR) for detection of ischemic lesions were 0.89 [95% confidence interval (CI), 0.85-0.93], 0.76 (95% CI, 0.64-0.84) and 26.21 (95% CI, 13.14-52.28). At a per-vessel or per-lesion level, the pooled estimates were as follows: sensitivity 0.84 (95% CI, 0.80-0.87), specificity 0.76 (95% CI, 0.67-0.83) and DOR 16.87 (95% CI, 9.41-30.25). Area under summary receiver operating curves was 0.90 (95% CI, 0.87-0.92) and 0.86 (95% CI, 0.83-0.89) at the two analysis levels, respectively. In conclusion, FFRCT technology achieves a moderate diagnostic performance for noninvasive identification of ischemic lesions in stable patients with suspected or known CAD in comparison to invasive FFR measurement. PMID:27377422

  18. Identification of water-conditioned Pseudomonas aeruginosa by Raman microspectroscopy on a single cell level.

    PubMed

    Silge, Anja; Schumacher, Wilm; Rösch, Petra; Da Costa Filho, Paulo A; Gérard, Cédric; Popp, Jürgen

    2014-07-01

    The identification of Pseudomonas aeruginosa from samples of bottled natural mineral water by the analysis of subcultures is time consuming and other species of the authentic Pseudomonas group can be a problem. Therefore, this study aimed to investigate the influence of different aquatic environmental conditions (pH, mineral content) and growth phases on the cultivation-free differentiation between water-conditioned Pseudomonas spp. by applying Raman microspectroscopy. The final dataset comprised over 7500 single-cell Raman spectra, including the species Pseudomonas aeruginosa, P. fluorescens and P. putida, in order to prove the feasibility of the introduced approach. The collection of spectra was standardized by automated measurements of viable stained bacterial cells. The discrimination was influenced by the growth phase at the beginning of the water adaptation period and by the type of mineral water. Different combinations of the parameters were tested and they resulted in accuracies of up to 85% for the identification of P. aeruginosa from independent samples by applying chemometric analysis.
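
    The chemometric step mentioned above is typically a dimensionality reduction followed by a supervised classifier trained on labelled single-cell spectra. The sketch below uses a PCA plus linear discriminant analysis pipeline with cross-validation as one common choice; it is a generic stand-in with placeholder data, not the study's dataset or its exact model.

```python
# Generic chemometric pipeline for single-cell Raman spectra:
# PCA for dimensionality reduction, LDA for species classification.
# (One common choice; not necessarily the model used in the study.)
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

# spectra: (n_cells, n_wavenumbers); labels: species of each cell (placeholders here)
rng = np.random.default_rng(1)
spectra = rng.normal(size=(300, 1200))
labels = rng.choice(["P. aeruginosa", "P. fluorescens", "P. putida"], size=300)

model = make_pipeline(PCA(n_components=20), LinearDiscriminantAnalysis())
accuracy = cross_val_score(model, spectra, labels, cv=5).mean()
print(f"cross-validated identification accuracy: {accuracy:.1%}")
```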

  19. Identification of water-conditioned Pseudomonas aeruginosa by Raman microspectroscopy on a single cell level.

    PubMed

    Silge, Anja; Schumacher, Wilm; Rösch, Petra; Da Costa Filho, Paulo A; Gérard, Cédric; Popp, Jürgen

    2014-07-01

    The identification of Pseudomonas aeruginosa from samples of bottled natural mineral water by the analysis of subcultures is time consuming and other species of the authentic Pseudomonas group can be a problem. Therefore, this study aimed to investigate the influence of different aquatic environmental conditions (pH, mineral content) and growth phases on the cultivation-free differentiation between water-conditioned Pseudomonas spp. by applying Raman microspectroscopy. The final dataset comprised over 7500 single-cell Raman spectra, including the species Pseudomonas aeruginosa, P. fluorescens and P. putida, in order to prove the feasibility of the introduced approach. The collection of spectra was standardized by automated measurements of viable stained bacterial cells. The discrimination was influenced by the growth phase at the beginning of the water adaptation period and by the type of mineral water. Different combinations of the parameters were tested and they resulted in accuracies of up to 85% for the identification of P. aeruginosa from independent samples by applying chemometric analysis. PMID:24958608

  20. Computational genomic identification and functional reconstitution of plant natural product biosynthetic pathways

    PubMed Central

    2016-01-01

    Covering: 2003 to 2016. The last decade has seen the first major discoveries regarding the genomic basis of plant natural product biosynthetic pathways. Four key computationally driven strategies have been developed to identify such pathways, which make use of physical clustering, co-expression, evolutionary co-occurrence and epigenomic co-regulation of the genes involved in producing a plant natural product. Here, we discuss how these approaches can be used for the discovery of plant biosynthetic pathways encoded by both chromosomally clustered and non-clustered genes. Additionally, we will discuss opportunities to prioritize plant gene clusters for experimental characterization, and end with a forward-looking perspective on how synthetic biology technologies will allow effective functional reconstitution of candidate pathways using a variety of genetic systems. PMID:27321668

  1. Cutting edge: identification of novel T cell epitopes in Lol p5a by computational prediction.

    PubMed

    de Lalla, C; Sturniolo, T; Abbruzzese, L; Hammer, J; Sidoli, A; Sinigaglia, F; Panina-Bordignon, P

    1999-08-15

    Although atopic allergy affects computational prediction of DR ligands might thus allow the design of T cell epitopes with potential useful application in novel immunotherapy strategies.

  2. Computer Folding of RNA Tetraloops: Identification of Key Force Field Deficiencies.

    PubMed

    Kührová, Petra; Best, Robert B; Bottaro, Sandro; Bussi, Giovanni; Šponer, Jiří; Otyepka, Michal; Banáš, Pavel

    2016-09-13

    The computer-aided folding of biomolecules, particularly RNAs, is one of the most difficult challenges in computational structural biology. RNA tetraloops are fundamental RNA motifs playing key roles in RNA folding and RNA-RNA and RNA-protein interactions. Although state-of-the-art Molecular Dynamics (MD) force fields correctly describe the native state of these tetraloops as a stable free-energy basin on the microsecond time scale, enhanced sampling techniques reveal that the native state is not the global free energy minimum, suggesting yet unidentified significant imbalances in the force fields. Here, we tested our ability to fold the RNA tetraloops in various force fields and simulation settings. We employed three different enhanced sampling techniques, namely, temperature replica exchange MD (T-REMD), replica exchange with solute tempering (REST2), and well-tempered metadynamics (WT-MetaD). We aimed to separate problems caused by limited sampling from those due to force-field inaccuracies. We found that none of the contemporary force fields is able to correctly describe folding of the 5'-GAGA-3' tetraloop over a range of simulation conditions. We thus aimed to identify which terms of the force field are responsible for this poor description of TL folding. We showed that at least two different imbalances contribute to this behavior, namely, overstabilization of base-phosphate and/or sugar-phosphate interactions and underestimated stability of the hydrogen bonding interaction in base pairing. The first artifact stabilizes the unfolded ensemble, while the second one destabilizes the folded state. The former problem might be partially alleviated by reparametrization of the van der Waals parameters of the phosphate oxygens suggested by Case et al., while in order to overcome the latter effect we suggest local potentials to better capture hydrogen bonding interactions.

  3. Identification of rupture locations in patient-specific abdominal aortic aneurysms using experimental and computational techniques.

    PubMed

    Doyle, Barry J; Cloonan, Aidan J; Walsh, Michael T; Vorp, David A; McGloughlin, Timothy M

    2010-05-01

    In the event of abdominal aortic aneurysm (AAA) rupture, the outcome is often death. This paper aims to experimentally identify the rupture locations of in vitro AAA models and validate these rupture sites using finite element analysis (FEA). Silicone rubber AAA models were manufactured using two different materials (Sylgard 160 and Sylgard 170, Dow Corning) and imaged using computed tomography (CT). Experimental models were inflated until rupture, with high-speed photography used to capture the site of rupture. 3D reconstructions from CT scans and subsequent FEA of these models enabled the wall stress and wall thickness to be determined for each of the geometries. Experimental models ruptured at regions of inflection, not at regions of maximum diameter. Rupture pressures (mean ± SD) for the Sylgard 160 and Sylgard 170 models were 650.6 ± 195.1 mmHg and 410.7 ± 159.9 mmHg, respectively. Computational models accurately predicted the locations of rupture. Peak wall stress for the Sylgard 160 and Sylgard 170 models was 2.15 ± 0.26 MPa at an internal pressure of 650 mmHg and 1.69 ± 0.38 MPa at an internal pressure of 410 mmHg, respectively. Mean wall thickness of all models was 2.19 ± 0.40 mm, with a mean wall thickness at the location of rupture of 1.85 ± 0.33 and 1.71 ± 0.29 mm for the Sylgard 160 and Sylgard 170 materials, respectively. Rupture occurred at the location of peak stress in 80% (16/20) of cases and at high-stress regions, but not peak stress, in 10% (2/20) of cases. A further 10% (2/20) of models had defects in the AAA wall which moved the rupture location away from regions of elevated stress. The results presented may further contribute to the understanding of AAA biomechanics and ultimately AAA rupture prediction.

  4. Development of computer program ENAUDIBL for computation of the sensation levels of multiple, complex, intrusive sounds in the presence of residual environmental masking noise

    SciTech Connect

    Liebich, R. E.; Chang, Y.-S.; Chun, K. C.

    2000-03-31

    The relative audibility of multiple sounds occurs in separate, independent channels (frequency bands) termed critical bands or equivalent rectangular (filter-response) bandwidths (ERBs) of frequency. The true nature of human hearing is a function of a complex combination of subjective factors, both auditory and nonauditory. Assessment of the probability of individual annoyance, community-complaint reaction levels, speech intelligibility, and the most cost-effective mitigation actions requires sensation-level data; these data are one of the most important auditory factors. However, sensation levels cannot be calculated by using single-number, A-weighted sound level values. This paper describes specific steps to compute sensation levels. A unique, newly developed procedure is used, which simplifies and improves the accuracy of such computations by the use of maximum sensation levels that occur, for each intrusive-sound spectrum, within each ERB. The newly developed program ENAUDIBL makes use of ERB sensation-level values generated with some computational subroutines developed for the formerly documented program SPECTRAN.
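
    As a rough numeric illustration of the per-band computation described above, the sketch below uses the Glasberg and Moore approximation ERB(f) = 24.7 (4.37 f/1000 + 1) Hz for the bandwidth and treats the sensation level in each band as the excess of the intrusive-sound band level over a masked threshold approximated by the residual-noise band level. Both simplifications are assumptions made for illustration; this is not the ENAUDIBL procedure.

```python
# Simplified per-band sensation-level arithmetic (illustrative assumptions only;
# this is not the ENAUDIBL procedure, just the shape of the computation).
def erb_hz(f_hz):
    """Glasberg & Moore approximation of the equivalent rectangular bandwidth."""
    return 24.7 * (4.37 * f_hz / 1000.0 + 1.0)

# Hypothetical inputs: (centre frequency Hz, intrusive-sound band level dB,
# residual masking-noise band level dB).
bands = [(250, 48.0, 35.0), (500, 52.0, 38.0), (1000, 47.0, 40.0), (2000, 41.0, 42.0)]

for fc, intrusive, masking in bands:
    # Crude assumption: the masked threshold in a band is approximated by the
    # residual-noise band level; sensation level is the excess above it.
    sensation = max(0.0, intrusive - masking)
    print(f"{fc:5d} Hz  ERB={erb_hz(fc):6.1f} Hz  sensation level={sensation:4.1f} dB")
```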

  5. Identification and Utilization of Employer Requirements for Entry-Level Health Occupations Workers. Final Report.

    ERIC Educational Resources Information Center

    Zukowski, James J.

    The purpose of a research project was to identify employer expectations regarding entry-level competency requirements for selected assistance-level health occupations. Physicians, dentists, and other health professionals reviewed lists of competencies associated with the performance of assistance-level employees in nursing, medical laboratory, and…

  6. Computationally Guided Identification of Novel Mycobacterium tuberculosis GlmU Inhibitory Leads, Their Optimization, and in Vitro Validation.

    PubMed

    Mehra, Rukmankesh; Rani, Chitra; Mahajan, Priya; Vishwakarma, Ram Ashrey; Khan, Inshad Ali; Nargotra, Amit

    2016-02-01

    Mycobacterium tuberculosis (Mtb) infections are causing serious health concerns worldwide. Antituberculosis drug resistance threatens the current therapies and creates a further need to develop effective antituberculosis therapy. GlmU represents an interesting target for developing novel Mtb drug candidates. It is a bifunctional acetyltransferase/uridyltransferase enzyme that catalyzes the biosynthesis of UDP-N-acetyl-glucosamine (UDP-GlcNAc) from glucosamine-1-phosphate (GlcN-1-P). UDP-GlcNAc is a substrate for the biosynthesis of lipopolysaccharide and peptidoglycan, which are constituents of the bacterial cell wall. In the current study, structure- and ligand-based computational models were developed and rationally applied to screen a drug-like repository of 20,000 compounds procured from the ChemBridge DIVERSet database for the identification of probable inhibitors of Mtb GlmU. In vitro evaluation of the in silico identified inhibitor candidates resulted in the identification of 15 inhibitory leads against this target. A literature search of these leads through SciFinder and their similarity analysis against the PubChem training data set (AID 1376) revealed the structural novelty of these hits with respect to Mtb GlmU. The IC50 of the most potent identified inhibitory lead (5810599) was found to be 9.018 ± 0.04 μM. Molecular dynamics (MD) simulation of this inhibitory lead (5810599) in complex with the protein confirms the stability of the lead within the binding pocket and highlights the key interacting residues for further design. Binding-site analysis of the acetyltransferase pocket with respect to the identified structural moieties provides a basis for subsequent lead optimization studies. PMID:26812086

  7. Computational identification of genetic subnetwork modules associated with maize defense response to Fusarium verticillioides

    PubMed Central

    2015-01-01

    Background Maize, a crop of global significance, is vulnerable to a variety of biotic stresses resulting in economic losses. Fusarium verticillioides (teleomorph Gibberella moniliformis) is one of the key fungal pathogens of maize, causing ear rots and stalk rots. To better understand the genetic mechanisms involved in maize defense as well as F. verticillioides virulence, a systematic investigation of the host-pathogen interaction is needed. The aim of this study was to computationally identify potential maize subnetwork modules associated with its defense response against F. verticillioides. Results We obtained time-course RNA-seq data from B73 maize inoculated with wild type F. verticillioides and a loss-of-virulence mutant, and subsequently established a computational pipeline for network-based comparative analysis. Specifically, we first analyzed the RNA-seq data by a cointegration-correlation-expression approach, where maize genes were jointly analyzed with known F. verticillioides virulence genes to find candidate maize genes likely associated with the defense mechanism. We predicted maize co-expression networks around the selected maize candidate genes based on partial correlation, and subsequently searched for subnetwork modules that were differentially activated when inoculated with two different fungal strains. Based on our analysis pipeline, we identified four potential maize defense subnetwork modules. Two were directly associated with maize defense response and were associated with significant GO terms such as GO:0009817 (defense response to fungus) and GO:0009620 (response to fungus). The other two predicted modules were indirectly involved in the defense response, where the most significant GO terms associated with these modules were GO:0046914 (transition metal ion binding) and GO:0046686 (response to cadmium ion). Conclusion Through our RNA-seq data analysis, we have shown that a network-based approach can enhance our understanding of the maize defense response against F. verticillioides.
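
    The partial-correlation step mentioned above has a compact closed form: invert the covariance matrix (or a regularized estimate of it) to obtain the precision matrix Ω, then the partial correlation between genes i and j given all others is -Ω_ij / sqrt(Ω_ii Ω_jj). The sketch below shows that computation on a placeholder expression matrix; it is a generic illustration, not the pipeline used in the study.

```python
# Partial correlations between genes from an expression matrix via the precision
# matrix (generic illustration on placeholder data, not the study's pipeline).
import numpy as np

rng = np.random.default_rng(2)
expr = rng.normal(size=(40, 8))               # 40 samples x 8 genes (placeholder)

cov = np.cov(expr, rowvar=False)
precision = np.linalg.pinv(cov)               # pseudo-inverse adds a little robustness

d = np.sqrt(np.diag(precision))
partial_corr = -precision / np.outer(d, d)    # rho_ij given all other genes
np.fill_diagonal(partial_corr, 1.0)

# Keep only edges with |partial correlation| above a threshold to form the network.
edges = np.argwhere(np.triu(np.abs(partial_corr) > 0.3, k=1))
print(edges)
```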

  8. Rapid and low-level toxic PCR-based method for routine identification of Flavobacterium psychrophilum.

    PubMed

    Cepeda, C; Santos, Y

    2000-12-01

    We describe a rapid, low-toxicity and simple method for the detection of the bacterial fish pathogen Flavobacterium psychrophilum. The method, based on the polymerase chain reaction (PCR), combined electrophoresis of PCR products in a vertical agarose gel with a modified methylene blue stain. DNA was amplified directly either from bacterial suspensions or from tissues experimentally infected with F. psychrophilum, using different non-toxic commercial DNA extraction kits. The protocol allowed the detection of 15 to 150 cells of the pathogen in bacterial suspension, without prior DNA extraction, and of 7500 to 75,000 cells in seeded spleen tissue and ovarian fluid using the Dynabeads DNA DIRECT extraction system. This method, which has the advantage of not using hazardous products, is proposed as a fast tool for routine identification of F. psychrophilum.

  9. Identification of irradiated pepper with the level of hydrogen gas as a probe

    SciTech Connect

    Dohmaru, T.; Furuta, M.; Katayama, T.; Toratani, H.; Takeda, A. )

    1989-12-01

    A novel method to detect whether or not a particular pepper has been irradiated has been developed, based on the fact that H2 is formed in organic substances exposed to ionizing radiation. Following gamma irradiation, black and white peppers were ground to powder in a gastight ceramic mill. Gas-chromatographic analysis of the gas in the mill showed that H2 had been released from the irradiated pepper grains. Curves plotting the H2 content vs storage time at storage temperatures of 7, 22, and 30 degrees C showed that the higher the temperature, the smaller the H2 content, and that identification of irradiated pepper was possible for 2-4 months after 10 kGy irradiation.

  10. Identification of causal genetic drivers of human disease through systems-level analysis of regulatory networks.

    PubMed

    Chen, James C; Alvarez, Mariano J; Talos, Flaminia; Dhruv, Harshil; Rieckhof, Gabrielle E; Iyer, Archana; Diefes, Kristin L; Aldape, Kenneth; Berens, Michael; Shen, Michael M; Califano, Andrea

    2014-10-01

    Identification of driver mutations in human diseases is often limited by cohort size and availability of appropriate statistical models. We propose a framework for the systematic discovery of genetic alterations that are causal determinants of disease, by prioritizing genes upstream of functional disease drivers, within regulatory networks inferred de novo from experimental data. We tested this framework by identifying the genetic determinants of the mesenchymal subtype of glioblastoma. Our analysis uncovered KLHL9 deletions as upstream activators of two previously established master regulators of the subtype, C/EBPβ and C/EBPδ. Rescue of KLHL9 expression induced proteasomal degradation of C/EBP proteins, abrogated the mesenchymal signature, and reduced tumor viability in vitro and in vivo. Deletions of KLHL9 were confirmed in > 50% of mesenchymal cases in an independent cohort, thus representing the most frequent genetic determinant of the subtype. The method generalized to study other human diseases, including breast cancer and Alzheimer's disease.

  11. Identification of Low-Level Vancomycin Resistance in Staphylococcus aureus in the Era of Informatics

    PubMed Central

    2016-01-01

    Vancomycin-intermediate Staphylococcus aureus (VISA) and heteroresistant VISA (hVISA) are pathogens for which accurate antimicrobial susceptibility testing (AST) would rule out standard treatment with vancomycin. Unfortunately, AST for vancomycin is relatively slow and standard methods are unable to reliably detect VISA and hVISA. An article in this issue (C. A. Mather, B. J. Werth, S. Sivagnanam, D. J. SenGupta, and S. M. Butler-Wu, J Clin Microbiol 54:883–890, 2016, doi:http://dx.doi.org/10.1128/JCM.02428-15) describes a rapid whole-cell matrix-assisted laser desorption ionization–time of flight proxy susceptibility method that highlights current innovations and challenges with rapid AST, VISA/hVISA identification, and clinical bioinformatics. PMID:26865694

  12. Identification of Low-Level Vancomycin Resistance in Staphylococcus aureus in the Era of Informatics.

    PubMed

    Ford, Bradley A

    2016-04-01

    Vancomycin-intermediate Staphylococcus aureus (VISA) and heteroresistant VISA (hVISA) are pathogens for which accurate antimicrobial susceptibility testing (AST) would rule out standard treatment with vancomycin. Unfortunately, AST for vancomycin is relatively slow and standard methods are unable to reliably detect VISA and hVISA. An article in this issue (C. A. Mather, B. J. Werth, S. Sivagnanam, D. J. SenGupta, and S. M. Butler-Wu, J Clin Microbiol 54:883-890, 2016, doi:http://dx.doi.org/10.1128/JCM.02428-15) describes a rapid whole-cell matrix-assisted laser desorption ionization-time of flight proxy susceptibility method that highlights current innovations and challenges with rapid AST, VISA/hVISA identification, and clinical bioinformatics.

  13. Proposal of a French health identification number interoperable at the European level.

    PubMed

    Quantin, Catherine; Allaert, François-André; Avillach, Paul; Riandey, Benoît; Fieschi, Marius; Fassa, Maniane; Cohen, Olivier

    2007-01-01

    The French Ministry of Health is setting up the Personal Medical Record (PMR). This innovative tool has long been awaited by the French health authorities, patients' associations, other health associations, organizations defending individual liberties, and the French National Data Protection Authority. The PMR will lead to improvements in many areas, such as diagnosis (research and monitoring), healthcare (management of emergencies and urgent situations, temporal health monitoring and evaluation), and therapy (cohorts of patients for clinical trials and epidemiological studies). The PMR will foster safe healthcare management, clinical research and epidemiological studies. Nevertheless, the currently planned identification process raises important questions regarding its ability to deal with potential duplicates and regarding the quality, precision and coherence of linkage with health data coming from other sources. In this article, using electronic health records, we develop and propose an identification process to improve the French PMR. Our proposed unique patient identifier will guarantee the security, confidentiality and privacy of personal data, and will prove particularly useful for health planning, health policies and research as well as clinical and epidemiological studies. Finally, it will be interoperable with other European health information systems. We propose an alternative identification procedure that would allow France to broaden the scope of its PMR project by making it possible to contribute to public health research and policy while increasing interoperability with European health information systems and preserving the confidentiality of the data.

  14. Computer-Assisted Identification of Protoplasts Responsible for Rare Division Events Reveals Guard-Cell Totipotency.

    PubMed Central

    Hall, R. D.; Verhoeven, H. A.; Krens, F. A.

    1995-01-01

    With the use of a computer-controlled microscope system to assist in the positioning and rapid relocation of large numbers of cultured cells, we were able to identify those protoplasts with the capacity to divide within a highly recalcitrant culture in which only a tiny fraction of the total population proceeds to produce viable microcalli. In the cultures used, comprising Beta vulgaris L. (sugar beet) leaf protoplasts, it was confirmed that these cells can be recognized solely on the basis of morphological characters. Therefore, a direct link exists between competence for cell division in vitro and cell type. Divergent callus morphologies and totipotent potential could also be ascribed to distinct protoplast types and hence to cells with a specific origin. The progenitors of the totipotent protoplasts in these cultures have been confirmed as being stomatal guard cells. Consequently, in plants even the most highly adapted living cells clearly retain and can reactivate all of the functional genetic information necessary to recreate the whole organism; an extreme degree of cytodifferentiation is, therefore, no hindrance to expressing totipotent potential. In addition to the considerable practical value of these findings, their implications concerning our understanding of both the control of gene expression and plant cell differentiation and its reversibility are of fundamental significance. PMID:12228442

  15. Computational identification of microRNAs and their targets in cassava (Manihot esculenta Crantz.).

    PubMed

    Patanun, Onsaya; Lertpanyasampatha, Manassawe; Sojikul, Punchapat; Viboonjun, Unchera; Narangajavana, Jarunya

    2013-03-01

    MicroRNAs (miRNAs) are a newly discovered class of noncoding endogenous small RNAs involved in plant growth and development as well as in responses to environmental stresses. miRNAs have been extensively studied in various plant species; however, only limited information is available for cassava, which serves as a staple food crop, a biofuel crop, animal feed and an industrial raw material. In this study, 169 potential cassava miRNAs belonging to 34 miRNA families were identified by a computational approach. Interestingly, mes-miR319b represents the first putative mirtron reported in cassava. A total of 15 miRNA clusters involving 7 miRNA families, and 12 pairs of sense and antisense strand cassava miRNAs belonging to six different miRNA families, were discovered. Prediction of potential miRNA target genes revealed functions involved in various important plant biological processes. Cis-regulatory elements relevant to drought stress and plant hormone responses were identified in the promoter regions of these miRNA genes. The results provide a foundation for further investigation of the functional role of known transcription factors in the regulation of cassava miRNAs. A better understanding of the complexity of the miRNA-mediated gene network in cassava would help unravel cassava's complex biology in storage root development and in coping with environmental stresses, thus providing more insights for future exploitation in cassava improvement.
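
    The clustering reported above (15 clusters across 7 families) can be derived from genomic coordinates alone: miRNAs on the same scaffold within a chosen distance are grouped together. In the sketch below the locus names, coordinates and the 10 kb window are hypothetical illustrations, not values from the study.

```python
# Hypothetical miRNA loci: (miRNA name, scaffold, start coordinate).
loci = [
    ("mes-miR156a", "scaffold0001", 10500),
    ("mes-miR156b", "scaffold0001", 14200),
    ("mes-miR319b", "scaffold0002", 88000),
    ("mes-miR395a", "scaffold0002", 90100),
    ("mes-miR395b", "scaffold0002", 93500),
]

MAX_GAP = 10_000  # assumed maximum distance (bp) between clustered miRNAs

def find_clusters(loci, max_gap=MAX_GAP):
    """Group miRNAs on the same scaffold whose neighbours lie within max_gap."""
    clusters, current = [], []
    for name, scaffold, start in sorted(loci, key=lambda x: (x[1], x[2])):
        if current and (scaffold != current[-1][1] or start - current[-1][2] > max_gap):
            if len(current) > 1:
                clusters.append([m[0] for m in current])
            current = []
        current.append((name, scaffold, start))
    if len(current) > 1:
        clusters.append([m[0] for m in current])
    return clusters

print(find_clusters(loci))
```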

  16. Computational identification of riboswitches based on RNA conserved functional sequences and conformations.

    PubMed

    Chang, Tzu-Hao; Huang, Hsien-Da; Wu, Li-Ching; Yeh, Chi-Ta; Liu, Baw-Jhiune; Horng, Jorng-Tzong

    2009-07-01

    Riboswitches are cis-acting genetic regulatory elements within a specific mRNA that can regulate both transcription and translation by interacting with their corresponding metabolites. Recently, an increasing number of riboswitches have been identified in different species and investigated for their roles in regulatory functions. Both the sequence contexts and the structural conformations are important characteristics of riboswitches. None of the previously developed tools, such as covariance models (CMs), Riboswitch finder, and RibEx, provides a web server for efficiently searching homologous instances of known riboswitches, nor do they consider two crucial characteristics of each riboswitch, namely the structural conformations and sequence contexts of its functional regions. We therefore developed a systematic method for identifying 12 kinds of riboswitches. The method is implemented and provided as a web server, RiboSW, to efficiently and conveniently identify riboswitches within messenger RNA sequences. The predictive accuracy of the proposed method is comparable with that of previous tools. The efficiency of the method was improved to achieve a reasonable computational time for prediction, making it possible to offer an accurate and convenient web server with which biologists can analyse a given mRNA sequence. RiboSW is now available on the web at http://RiboSW.mbc.nctu.edu.tw/. PMID:19460868

  17. The Impact of Cognitive and Non-Cognitive Personality Traits on Computer Literacy Level

    ERIC Educational Resources Information Center

    Saparniene, Diana; Merkys, Gediminas; Saparnis, Gintaras

    2006-01-01

    Purpose: The paper deals with the study of students' computer literacy, one of the purposes being to demonstrate the impact of cognitive and non-cognitive personality traits (attention, verbal and non-verbal intelligence, emotional-motivational relationship with the computer, learning strategies, etc.) on the quality of computer literacy.…

  18. Analysis of the Computer Anxiety Levels of Secondary Technical Education Teachers in West Virginia.

    ERIC Educational Resources Information Center

    Gordon, Howard R. D.

    1995-01-01

    Responses from 116 secondary technical education teachers (91%) who completed Oetting's Computer Anxiety Scale indicated that 46% experienced general computer anxiety. Development of computer and typing skills explained a large portion of the variance. More hands-on training was recommended. (SK)

  19. MICA, Managed Instruction with Computer Assistance: Level Five. An Outline of the System's Capabilities.

    ERIC Educational Resources Information Center

    Lorenz, Thomas B.; And Others

    Computer technology has been used since 1972 in the Madison, Wisconsin, public schools to control the flow of information required to support individualized instruction. Madison's computer-managed instruction system, MICA (Managed Instruction with Computer Assistance), operates interactively within individualized instruction programs to provide…

  20. Comparison of Traditional Phenotypic Identification Methods with Partial 5′ 16S rRNA Gene Sequencing for Species-Level Identification of Nonfermenting Gram-Negative Bacilli▿

    PubMed Central

    Cloud, Joann L.; Harmsen, Dag; Iwen, Peter C.; Dunn, James J.; Hall, Gerri; LaSala, Paul Rocco; Hoggan, Karen; Wilson, Deborah; Woods, Gail L.; Mellmann, Alexander

    2010-01-01

    Correct identification of nonfermenting Gram-negative bacilli (NFB) is crucial for patient management. We compared phenotypic identifications of 96 clinical NFB isolates with identifications obtained by 5′ 16S rRNA gene sequencing. Sequencing identified 88 isolates (91.7%) with >99% similarity to a sequence from the assigned species; 61.5% of sequencing results were concordant with phenotypic results, indicating the usability of sequencing to identify NFB. PMID:20164273

  1. Genotypic and Phenotypic Applications for the Differentiation and Species-Level Identification of Achromobacter for Clinical Diagnoses

    PubMed Central

    Gomila, Margarita; Prince-Manzano, Claudia; Svensson-Stadler, Liselott; Busquets, Antonio; Erhard, Marcel; Martínez, Deny L.; Lalucat, Jorge; Moore, Edward R. B.

    2014-01-01

    Achromobacter is a genus in the family Alcaligenaceae, comprising fifteen species isolated from different sources, including clinical samples. The ability to detect and correctly identify Achromobacter species, particularly A. xylosoxidans, and to differentiate them from other phenotypically similar and genotypically related Gram-negative, aerobic, non-fermenting species is important for patients with cystic fibrosis (CF), as well as for nosocomial and other opportunistic infections. Traditional phenotypic profile-based analyses have been demonstrated to be inadequate for reliable identification of isolates of Achromobacter species, and genotypic assays relying upon comparative 16S rRNA gene sequence analyses are not able to ensure definitive identification of Achromobacter species, due to the inherently conserved nature of the gene. Alternative methodologies enabling high-resolution differentiation between the species in the genus are needed. A comparative multi-locus sequence analysis (MLSA) of four selected ‘house-keeping’ genes (atpD, gyrB, recA, and rpoB) assessed the individual gene sequences for their potential in developing a reliable, rapid and cost-effective diagnostic protocol for Achromobacter species identification. The analysis of the type strains of the species of the genus and 46 strains of Achromobacter species showed congruence between the cluster analyses derived from the individual genes. The MLSA gene sequences exhibited different levels of resolution in delineating the validly published Achromobacter species and elucidated strains that represent new genotypes and probable new species of the genus. Our results also suggested that the recently described A. spritinus is a later heterotypic synonym of A. marplatensis. Strains were analyzed, using whole-cell Matrix-Assisted Laser Desorption/Ionization Time-Of-Flight mass spectrometry (MALDI-TOF MS), as an alternative phenotypic profile-based method with the potential to

  2. A component-level failure detection and identification algorithm based on open-loop and closed-loop state estimators

    NASA Astrophysics Data System (ADS)

    You, Seung-Han; Cho, Young Man; Hahn, Jin-Oh

    2013-04-01

    This study presents a component-level failure detection and identification (FDI) algorithm for a cascade mechanical system comprising a plant driven by an actuator unit. The novelty of the FDI algorithm presented in this study is that it is able to discriminate among failures occurring in the actuator unit, in the sensor measuring the output of the actuator unit, and in the plant driven by the actuator unit. The proposed FDI algorithm exploits the measurement of the actuator unit output together with its estimates generated by open-loop (OL) and closed-loop (CL) estimators to enable FDI at the component level. In this study, the OL estimator is designed based on system identification of the actuator unit. The CL estimator, which is guaranteed to be stable against variations in the plant, is synthesized based on the dynamics of the entire cascade system. The viability of the proposed algorithm is demonstrated using a hardware-in-the-loop simulation (HILS), which shows that it can detect and identify target failures reliably in the presence of plant uncertainties.
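
    A minimal sketch of the kind of residual-based decision logic that an OL/CL estimator pair enables is given below. The thresholds and the mapping from residual patterns to failure locations are assumptions made for illustration; the paper's actual decision rules and estimator designs may differ.

```python
def classify_failure(y_meas, y_ol, y_cl, thr_ol=0.05, thr_cl=0.05):
    """
    Crude component-level FDI decision logic (illustrative only).

    y_meas : measured actuator-unit output
    y_ol   : open-loop estimate (from the identified actuator model)
    y_cl   : closed-loop estimate (from the full cascade-system dynamics)
    """
    r_ol = abs(y_meas - y_ol)   # residual of the open-loop estimator
    r_cl = abs(y_meas - y_cl)   # residual of the closed-loop estimator

    if r_ol <= thr_ol and r_cl <= thr_cl:
        return "no failure detected"
    if r_ol > thr_ol and r_cl > thr_cl:
        # both estimators disagree with the sensor -> sensor fault is plausible
        return "suspect sensor failure"
    if r_ol > thr_ol:
        # only the actuator-model estimate is violated -> actuator-unit fault
        return "suspect actuator-unit failure"
    # only the cascade-level estimate is violated -> plant-side fault
    return "suspect plant failure"

print(classify_failure(1.00, 0.98, 0.99))  # nominal case
print(classify_failure(1.20, 0.95, 0.97))  # sensor-like discrepancy
```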

  3. Adult Learner Participation in an Online Degree Program: A Program-Level Study of Voluntary Computer-Mediated Communication

    ERIC Educational Resources Information Center

    Thompson, Emily W.; Savenye, Wilhelmina C.

    2007-01-01

    Several studies examining computer-mediated communications (CMC) in online courses have found low levels of participation under both voluntary (ungraded) and mandatory (graded) conditions. This is troubling since student participation is widely considered to have a positive impact on performance. Program-level data were analyzed to explore the…

  4. Identification of estrogenic compounds emitted from the combustion of computer printed circuit boards in electronic waste.

    PubMed

    Owens, Clyde V; Lambright, Christy; Bobseine, Kathy; Ryan, Bryce; Gray, L Earl; Gullett, Brian K; Wilson, Vickie S

    2007-12-15

    Rapid changes in technology have brought about a surge in demand for electronic equipment. Many of these products contain brominated flame retardants (BFRs) as additives to decrease the rate of combustion, raising concerns about their toxicological risk. In our study, emissions from the combustion of computer printed circuit boards were evaluated in the T47D-KBluc estrogen-responsive cell line at a series of concentrations. There was significant activity from the emission extract when compared to the positive control, 0.1 nM estradiol. After HPLC fractionation, GC/MS identified ten chemicals, which included bisphenol A; the brominated derivatives mono-, di-, and tribisphenol; triphenyl phosphate; triphenyl phosphine oxide; 4'-bromo-[1,1'-biphenyl]-4-ol; 3,5-dibromo-4-hydroxybiphenyl; 3,5-dibromo-2-hydroxybiphenyl; and the oxygenated polyaromatic hydrocarbon benzanthrone. Commercially available samples of these ten compounds were tested. The compound 4'-bromo-[1,1'-biphenyl]-4-ol produced dose-dependent, significant increases in luciferase activity at concentrations ranging from 0.1 to 10 microM in the T47D-KBluc assay. The chemical also demonstrated an affinity for binding to the estrogen receptor (ER), with an IC50 of 2 x 10(-7) M. To determine uterotrophic activity, three doses (50, 100, and 200 mg/kg/day) of 4'-bromo-[1,1'-biphenyl]-4-ol were administered to adult ovariectomized Long-Evans rats for 3 days. Treatment of the animals with 200 mg/kg/day showed an increase in uterine weight. Hence, one new chemical released by the burning of electrical wastes was identified which displays estrogenic activity both in vitro and in vivo. However, it was about 1000-fold less potent than ethynyl estradiol.

  5. Identification and analysis of potential targets in Streptococcus sanguinis using computer aided protein data analysis

    PubMed Central

    Chowdhury, Md Rabiul Hossain; Bhuiyan, Md IqbalKaiser; Saha, Ayan; Mosleh, Ivan MHAI; Mondol, Sobuj; Ahmed, C M Sabbir

    2014-01-01

    Purpose Streptococcus sanguinis is a Gram-positive, facultative aerobic bacterium that is a member of the viridans streptococcus group. It is found in human mouths in dental plaque, which accounts for both dental cavities and bacterial endocarditis, the latter entailing a mortality rate of 25%. Although a range of remedial mediators has been found to control this organism, and the effectiveness of agents such as penicillin, amoxicillin, trimethoprim–sulfamethoxazole, and erythromycin has been observed, the emphasis of this investigation was on finding alternative, efficient remedial approaches for the total destruction of this bacterium. Materials and methods In this computational study, various databases and online software were used to ascertain specific targets of S. sanguinis. In particular, the Kyoto Encyclopedia of Genes and Genomes databases were applied to determine human non-homologous proteins, as well as the metabolic pathways involving those proteins. Software such as Phyre2, CastP, DoGSiteScorer, the Protein Function Predictor server, and STRING was utilized to evaluate probable active drug-binding sites together with their known functions and protein–protein interactions. Results In this study, among 218 essential proteins of this pathogenic bacterium, 81 non-homologous proteins were obtained, and 15 proteins that are unique to several metabolic pathways of S. sanguinis were isolated through metabolic pathway analysis. Furthermore, four essential membrane-bound unique proteins involved in distinct metabolic pathways were revealed by this research. Active sites and druggable pockets of these selected proteins were investigated with bioinformatic techniques. In addition, this study also describes the activity of those proteins, as well as their interactions with other proteins. Conclusion Our findings helped to identify the type of protein to be considered as an efficient drug target. This study will pave the way for researchers to
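
    The subtractive filtering described above (essential proteins, then removal of human homologs, then restriction to pathogen-unique pathways) reduces to simple set operations. The sketch below uses made-up protein identifiers and pathway labels purely to illustrate the flow; it is not the study's data or pipeline.

```python
# Hypothetical protein identifiers; the real study started from 218 essential proteins.
essential = {"SSA_0001", "SSA_0002", "SSA_0003", "SSA_0004"}
human_homologs = {"SSA_0002"}                 # hits against the human proteome (e.g. BLASTp)
pathway_map = {                               # KEGG-style pathway membership (made up)
    "SSA_0001": {"peptidoglycan biosynthesis"},
    "SSA_0003": {"glycolysis"},               # pathway shared with the host
    "SSA_0004": {"D-alanine metabolism"},
}
host_pathways = {"glycolysis"}

# Step 1: drop proteins with human homologs (to limit off-target effects).
non_homologous = essential - human_homologs

# Step 2: keep proteins acting only in pathways absent from the host.
unique_targets = {p for p in non_homologous
                  if pathway_map.get(p) and not (pathway_map[p] & host_pathways)}

print(sorted(unique_targets))   # candidate drug targets
```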

  6. Identification of novel microRNAs in Hevea brasiliensis and computational prediction of their targets

    PubMed Central

    2012-01-01

    Background Plants respond to external stimuli through fine regulation of gene expression, partially ensured by small RNAs. Of these, microRNAs (miRNAs) play a crucial role. They negatively regulate gene expression by targeting the cleavage or translational inhibition of target messenger RNAs (mRNAs). In Hevea brasiliensis, environmental and harvesting stresses are known to affect natural rubber production. This study set out to identify abiotic stress-related miRNAs in Hevea using next-generation sequencing and bioinformatic analysis. Results Deep sequencing of small RNAs was carried out on plantlets subjected to severe abiotic stress using the Solexa technique. By combining the LeARN pipeline, data from the Plant microRNA database (PMRD) and Hevea EST sequences, we identified 48 conserved miRNA families already characterized in other plant species, and 10 putatively novel miRNA families. The most abundant size of miRNAs was 24 nucleotides, except for seven families. Several MIR genes produced both 20-22 nucleotide and 23-27 nucleotide miRNAs. The two miRNA size classes were detected for both conserved and putative novel miRNA families, suggesting their functional duality. The EST databases were scanned with conserved and novel miRNA sequences. MiRNA targets were computationally predicted and analysed. The predicted targets involved in "responses to stimuli" and in "antioxidant" and "transcription activities" are presented. Conclusions Deep sequencing of small RNAs combined with transcriptomic data is a powerful tool for identifying conserved and novel miRNAs when the complete genome is not yet available. Our study provides additional information for evolutionary studies and reveals potentially specific regulation of the control of redox status in Hevea. PMID:22330773

  7. Computer-vision-based weed identification of images acquired by 3CCD camera

    NASA Astrophysics Data System (ADS)

    Zhang, Yun; He, Yong; Fang, Hui

    2006-09-01

    Selective application of herbicide to weeds at an early stage of crop growth is an important aspect of site-specific management of field crops. To develop more adaptive on-line weed detection applications, many researchers have studied image processing techniques for the intensive computation and feature extraction tasks needed to identify weeds against other crops and the soil background. This paper investigated the potential of digital images acquired by the MegaPlus TM MS3100 3-CCD camera for segmenting the background soil from the plants in question and further recognizing weeds among the crops, using the Matlab script language. The near-infrared waveband image (center 800 nm; width 65 nm) was selected principally for segmenting soil, and identification of cotton versus thistles was achieved based on their respective relative areas (pixel counts) in the whole image. The results show adequate recognition: the pixel proportions of soil, cotton leaves and thistle leaves were 78.24% (-0.20% deviation), 16.66% (+2.71% SD) and 4.68% (-4.19% SD), respectively. However, problems remain in separating and locating single plants because of their clustering in the images. The information in the images acquired via the other two channels, i.e., the green and red bands, needs to be extracted to aid crop/weed discrimination. More optical specimens should be acquired for calibration and validation to establish a weed-detection model that can be applied effectively in the field.
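
    A minimal sketch of the threshold-based soil/vegetation segmentation and area-proportion computation described above is shown below, assuming a synthetic NIR image and an arbitrary grey-level cut-off; separating individual plants would additionally require a connected-component labelling step (for example scipy.ndimage.label), which is omitted here.

```python
import numpy as np

# Synthetic stand-in for a near-infrared band image (values 0-255); in the study
# this band came from the MS3100 3-CCD camera's NIR channel.
nir = np.random.randint(0, 256, size=(480, 640)).astype(np.uint8)

SOIL_THRESHOLD = 100          # assumed grey-level cut-off separating soil from plants
plant_mask = nir > SOIL_THRESHOLD

# Pixel proportions over the whole image, as used for crop/weed discrimination.
total = nir.size
plant_fraction = plant_mask.sum() / total
soil_fraction = 1.0 - plant_fraction
print(f"soil: {soil_fraction:.2%}, vegetation: {plant_fraction:.2%}")
```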

  8. A scalable computer-aided detection system for microcalcification cluster identification in a pan-European distributed database of mammograms

    NASA Astrophysics Data System (ADS)

    Retico, A.; Delogu, P.; Fantacci, M. E.; Preite Martinez, A.; Stefanini, A.; Tata, A.

    2006-12-01

    A computer-aided detection (CADe) system for microcalcification cluster identification in mammograms has been developed in the framework of the EU-funded MammoGrid project. The CADe software is mainly based on wavelet transforms and artificial neural networks. It is able to identify microcalcifications in different kinds of mammograms (i.e. acquired with different machines and settings, digitized with different pitch and bit depth, or directly digital). The CADe can be run remotely from GRID-connected acquisition and annotation stations, supporting clinicians in geographically distant locations in the interpretation of mammographic data. We report the FROC analyses of the CADe system performance on three different datasets of mammograms, i.e. images of the INFN-funded CALMA database collected in the Italian national screening program, the MIAS database and the MammoGrid images collected so far. Sensitivity values of 88% at a rate of 2.15 false positive findings per image (FP/im), 88% with 2.18 FP/im and 87% with 5.7 FP/im were obtained on the CALMA, MIAS and MammoGrid databases, respectively.
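
    Each FROC operating point quoted above pairs a sensitivity with a false-positive rate per image. The sketch below shows one way such a point can be computed from detections and ground-truth annotations; the matching tolerance and the coordinates are illustrative assumptions, not the evaluation code used for the CADe system.

```python
def froc_point(detections_per_image, truths_per_image, tol=10.0):
    """
    Compute one FROC operating point: sensitivity and false positives per image.

    detections_per_image / truths_per_image: lists (one entry per image) of
    (x, y) coordinates; a detection within `tol` pixels of an unmatched truth
    counts as a true positive.  All values here are illustrative.
    """
    tp = fp = n_truth = 0
    for dets, truths in zip(detections_per_image, truths_per_image):
        unmatched = list(truths)
        n_truth += len(truths)
        for dx, dy in dets:
            hit = next((t for t in unmatched
                        if (dx - t[0]) ** 2 + (dy - t[1]) ** 2 <= tol ** 2), None)
            if hit is not None:
                unmatched.remove(hit)
                tp += 1
            else:
                fp += 1
    sensitivity = tp / n_truth if n_truth else 0.0
    fp_per_image = fp / len(detections_per_image)
    return sensitivity, fp_per_image

dets = [[(10, 10), (200, 50)], [(30, 40)]]
gt   = [[(12, 11)], [(300, 300)]]
print(froc_point(dets, gt))   # -> (0.5, 1.0)
```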

  9. Computer-assisted identification and volumetric quantification of dynamic contrast enhancement in brain MRI: an interactive system

    NASA Astrophysics Data System (ADS)

    Wu, Shandong; Avgeropoulos, Nicholas G.; Rippe, David J.

    2013-03-01

    We present a dedicated segmentation system for tumor identification and volumetric quantification in dynamic contrast brain magnetic resonance (MR) scans. Our goal is to offer clinicians a practically useful tool to boost volumetric tumor assessment. The system is designed to work in an interactive mode that maximizes the integration of computing capacity and clinical intelligence. We demonstrate the main functions of the system in terms of its functional flow and conduct a preliminary validation using a representative pilot dataset. The system is inexpensive, user-friendly, easy to deploy and integrate with picture archiving and communication systems (PACS), and could be made open-source, which enables it to potentially serve as a useful assistant for radiologists and oncologists. It is anticipated that in the future the system can be integrated into the clinical workflow so that it becomes routinely available to help clinicians make more objective interpretations of treatment interventions and the natural history of disease to best advocate patient needs.

  10. Computational approach for identification of Anopheles gambiae miRNA involved in modulation of host immune response.

    PubMed

    Thirugnanasambantham, Krishnaraj; Hairul-Islam, Villianur Ibrahim; Saravanan, Subramanian; Subasri, Subramaniyan; Subastri, Ariraman

    2013-05-01

    MicroRNAs (miRNAs) are small, noncoding RNAs that play key roles in regulating gene expression in animals, plants, and viruses, and are involved in biological processes including development, cancer, immunity, and host-microorganism interactions. In the present study, we used a computational approach to identify potential miRNAs involved in the Anopheles gambiae immune response. Analysis of 217,261 A. gambiae ESTs and further study of RNA folding revealed six new miRNAs. The minimum free energy of the predicted miRNAs ranged from -27.2 to -62.63 kcal/mol, with an average of -49.38 kcal/mol, while their A + U content ranged from 50 to 65%, with an average of 57.37%. Phylogenetic analysis of the predicted miRNAs revealed that aga-miR-277 is evolutionarily highly conserved, showing greater similarity to other mosquito species. Target prediction further indicated that aga-miR-2304 and aga-miR-2390 are involved in modulation of the immune response by targeting the genes encoding suppressin and the protein prophenoloxidase. Further detailed studies of these miRNAs will help reveal their function in modulating the A. gambiae immune response with respect to its parasite.
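
    The two quantities used above to characterize the candidates, minimum free energy and A+U content, are easy to compute once candidate hairpins and their folding energies are available. The sketch below assumes precomputed MFE values and made-up sequences; the cut-offs are loosely based on the ranges in the abstract and are not the study's actual criteria.

```python
# Candidate hairpins with precomputed minimum free energies (kcal/mol); the
# sequences and MFE values here are made up for illustration.  In practice the
# MFE would come from an RNA secondary-structure folding program.
candidates = {
    "cand-1": ("UGAGGUAGUAGGUUGUAUAGUUAGGGUCU", -32.5),
    "cand-2": ("GCGCGCGGCCGCGGCCGGCGCCGGCGGCC", -18.0),
    "cand-3": ("AUUUAGUGUGCUGAUAGUGUAGUAGUUUA", -41.2),
}

MFE_CUTOFF = -25.0        # assumed thresholds, loosely in line with the
AU_RANGE = (50.0, 65.0)   # ranges reported in the abstract

def au_percent(seq):
    """A+U content of an RNA sequence, in percent."""
    seq = seq.upper()
    return 100.0 * sum(seq.count(b) for b in "AU") / len(seq)

for name, (seq, mfe) in candidates.items():
    au = au_percent(seq)
    keep = mfe <= MFE_CUTOFF and AU_RANGE[0] <= au <= AU_RANGE[1]
    print(f"{name}: MFE={mfe:.1f} kcal/mol, A+U={au:.1f}% -> {'keep' if keep else 'reject'}")
```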

  11. Toward accurate tooth segmentation from computed tomography images using a hybrid level set model

    SciTech Connect

    Gan, Yangzhou; Zhao, Qunfei; Xia, Zeyang E-mail: jing.xiong@siat.ac.cn; Hu, Ying; Xiong, Jing E-mail: jing.xiong@siat.ac.cn; Zhang, Jianwei

    2015-01-15

    Purpose: A three-dimensional (3D) model of the teeth provides important information for orthodontic diagnosis and treatment planning. Tooth segmentation is an essential step in generating the 3D digital model from computed tomography (CT) images. The aim of this study is to develop an accurate and efficient tooth segmentation method from CT images. Methods: The 3D dental CT volumetric images are segmented slice by slice in a two-dimensional (2D) transverse plane. The 2D segmentation is composed of a manual initialization step and an automatic slice by slice segmentation step. In the manual initialization step, the user manually picks a starting slice and selects a seed point for each tooth in this slice. In the automatic slice segmentation step, a developed hybrid level set model is applied to segment tooth contours from each slice. Tooth contour propagation strategy is employed to initialize the level set function automatically. Cone beam CT (CBCT) images of two subjects were used to tune the parameters. Images of 16 additional subjects were used to validate the performance of the method. Volume overlap metrics and surface distance metrics were adopted to assess the segmentation accuracy quantitatively. The volume overlap metrics were volume difference (VD, mm³) and Dice similarity coefficient (DSC, %). The surface distance metrics were average symmetric surface distance (ASSD, mm), RMS (root mean square) symmetric surface distance (RMSSSD, mm), and maximum symmetric surface distance (MSSD, mm). Computation time was recorded to assess the efficiency. The performance of the proposed method has been compared with two state-of-the-art methods. Results: For the tested CBCT images, the VD, DSC, ASSD, RMSSSD, and MSSD for the incisor were 38.16 ± 12.94 mm³, 88.82 ± 2.14%, 0.29 ± 0.03 mm, 0.32 ± 0.08 mm, and 1.25 ± 0.58 mm, respectively; the VD, DSC, ASSD, RMSSSD, and MSSD for the canine were 49.12 ± 9.33 mm³, 91.57 ± 0.82%, 0.27 ± 0.02 mm, 0
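
    Two of the reported metrics, the Dice similarity coefficient and the volume difference, can be computed directly from binary segmentation masks, as in the sketch below; the voxel size and the synthetic masks are assumptions used only for illustration.

```python
import numpy as np

VOXEL_VOLUME_MM3 = 0.25 * 0.25 * 0.25   # assumed isotropic 0.25 mm CBCT voxels

def dice_and_volume_difference(seg, ref, voxel_volume=VOXEL_VOLUME_MM3):
    """Dice similarity coefficient (%) and volume difference (mm^3) of two binary masks."""
    seg = seg.astype(bool)
    ref = ref.astype(bool)
    intersection = np.logical_and(seg, ref).sum()
    dsc = 100.0 * 2.0 * intersection / (seg.sum() + ref.sum())
    vd = abs(int(seg.sum()) - int(ref.sum())) * voxel_volume
    return dsc, vd

# Tiny synthetic example: two overlapping cubes standing in for a tooth mask.
seg = np.zeros((20, 20, 20), dtype=bool)
ref = np.zeros((20, 20, 20), dtype=bool)
seg[5:15, 5:15, 5:15] = True
ref[6:16, 6:16, 6:16] = True
print(dice_and_volume_difference(seg, ref))
```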

  12. An Organization for High-Level Interactive Programmatic Control of Computer-Generated Sound.

    NASA Astrophysics Data System (ADS)

    Das, Sumit

    The state of computer generated sound has advanced rapidly, and there exist many different ways of conceptualizing the abstract sound structures that comprise music and other complex organizations of sound. Many of these methods are radically different from one another, and so are not usually used within the same system. One problem that almost all methods share is one of control, as large amounts of data are needed to specify sounds. How do we create, examine, and modify these complex structures? The problem is exacerbated if we consider the realm of interactively controlled sound. This paper presents an organization which, rather than forcing a particular way of thinking about sound, allows multiple arbitrarily high-level views to coexist, all sharing a common interface. The methods or algorithms are abstracted into objects called auditory actors. This encapsulation allows different algorithms to be used concurrently. All communication with and between these actors is carried out through message-passing, which allows arbitrary types of information (such as other messages) to be easily communicated. This standardizes control without limiting it to a particular type of data. A prototype system was implemented using this model. This system was used by a number of different developers to create audio interfaces for interactive virtual reality applications, which were demonstrated at the SIGGRAPH 94 conference in Orlando, Florida. Compared to earlier systems, developers were able to create more complex audio interfaces in a shorter time.
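
    The message-passing encapsulation described above can be illustrated with a tiny actor object that receives control messages through a queue. The class name, message fields and use of Python threads below are illustrative assumptions and not part of the system described.

```python
import queue
import threading

class AuditoryActor:
    """
    Minimal message-passing actor in the spirit of the described organization:
    each actor encapsulates one sound-generation method and is controlled only
    through messages (plain dictionaries here).  Purely illustrative.
    """
    def __init__(self, name):
        self.name = name
        self.inbox = queue.Queue()
        self.thread = threading.Thread(target=self._run, daemon=True)
        self.thread.start()

    def send(self, message):
        self.inbox.put(message)

    def _run(self):
        while True:
            msg = self.inbox.get()
            if msg.get("type") == "stop":
                break
            # A real actor would synthesize or schedule audio here.
            print(f"{self.name} handling {msg}")

synth = AuditoryActor("granular-synth")
synth.send({"type": "note_on", "pitch": 60, "velocity": 0.8})
synth.send({"type": "set_param", "name": "grain_size", "value": 0.05})
synth.send({"type": "stop"})
synth.thread.join()
```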

  13. Computation of dose rate at flight altitudes during ground level enhancements no. 69, 70 and 71

    NASA Astrophysics Data System (ADS)

    Mishev, A. L.; Adibpour, F.; Usoskin, I. G.; Felsberger, E.

    2015-01-01

    A new numerical model for estimating and monitoring the exposure of personnel to secondary cosmic radiation onboard aircraft, in accordance with radiation safety standards as well as European and national regulations, has been developed. The model aims to calculate the effective dose at flight altitude (39,000 ft) due to secondary cosmic radiation of galactic and solar origin. In addition, the model allows the estimation of the ambient dose equivalent at typical commercial airline altitudes in order to provide a comparison with reference data. The basics, structure and function of the model are described. The model is based on a straightforward full Monte Carlo simulation of the cosmic ray induced atmospheric cascade. The cascade simulation is performed with the PLANETOCOSMICS code. The flux of secondary particles, namely neutrons, protons, gammas, electrons, positrons, muons and charged pions, is calculated. A subsequent conversion of the particle fluence into effective dose or ambient dose equivalent is performed, as well as a comparison with reference data. An application of the model is demonstrated through a computation of the effective dose rate at flight altitude during the ground level enhancements of 20 January 2005, 13 December 2006 and 17 May 2012.
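
    The final step, converting particle fluence to effective dose, is essentially a weighted sum over particle species (and, in practice, over energy bins as well). The sketch below illustrates that sum with placeholder fluences and conversion coefficients; real coefficients would be taken from the relevant ICRP/ICRU tables and applied per energy bin.

```python
# Secondary-particle fluences at flight altitude (particles per cm^2) and
# fluence-to-effective-dose conversion coefficients (pSv cm^2).  All numbers
# below are placeholders, not values from the model or from ICRP tables.
fluence = {"neutron": 1.2e3, "proton": 4.0e1, "gamma": 2.5e3,
           "electron": 8.0e2, "positron": 1.0e2, "muon": 3.0e2, "pion": 5.0e0}
conversion_pSv_cm2 = {"neutron": 300.0, "proton": 1200.0, "gamma": 4.0,
                      "electron": 5.0, "positron": 5.0, "muon": 300.0, "pion": 800.0}

# Effective dose = sum over particle species of fluence x conversion coefficient.
effective_dose_pSv = sum(fluence[p] * conversion_pSv_cm2[p] for p in fluence)
print(f"effective dose: {effective_dose_pSv / 1e6:.3f} uSv")
```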

  14. Identification of causal genetic drivers of human disease through systems-level analysis of regulatory networks

    PubMed Central

    Chen, James C.; Alvarez, Mariano J.; Talos, Flaminia; Dhruv, Harshil; Rieckhof, Gabrielle E.; Iyer, Archana; Diefes, Kristin L.; Aldape, Kenneth; Berens, Michael; Shen, Michael M.; Califano, Andrea

    2014-01-01

    Identification of driver mutations in human diseases is often limited by cohort size and availability of appropriate statistical models. We propose a novel framework for the systematic discovery of genetic alterations that are causal determinants of disease, by prioritizing genes upstream of functional disease drivers, within regulatory networks inferred de novo from experimental data. We tested this framework by identifying the genetic determinants of the mesenchymal subtype of glioblastoma. Our analysis uncovered KLHL9 deletions as upstream activators of two previously established master regulators of the subtype, C/EBPβ and C/EBPδ. Rescue of KLHL9 expression induced proteasomal degradation of C/EBP proteins, abrogated the mesenchymal signature, and reduced tumor viability in vitro and in vivo. Deletions of KLHL9 were confirmed in >50% of mesenchymal cases in an independent cohort, thus representing the most frequent genetic determinant of the subtype. The method generalized to study other human diseases, including breast cancer and Alzheimer’s disease. PMID:25303533

  15. Levels, profiles and source identification of PCDD/Fs in farmland soils of Guiyu, China.

    PubMed

    Xu, Pengjun; Tao, Bu; Li, Nan; Qi, Li; Ren, Yue; Zhou, Zhiguang; Zhang, Lifei; Liu, Aimin; Huang, Yeru

    2013-05-01

    The present study reports the first comprehensive survey of polychlorinated dibenzo-p-dioxins and dibenzofurans (PCDD/Fs) in farmland soils of Guiyu, China. Guiyu is a major electronic waste (EW) dismantling area, where primitive and crude EW disposal practices have led to severe PCDD/F pollution. Twenty-three farmland soil samples covering the entire Guiyu region were analyzed. Toxic equivalent quantities (I-TEQs) of soils in EW disposal areas were 5.7-57 pg TEQ g(-1), and the total concentrations of tetra- to octa-homologues were 2816-17,738 pg g(-1). The SL district was a heavily contaminated area, and the neighboring SMP town was influenced by Guiyu. EW disposal might be the source of the PCDD/Fs. The homologue profiles were of three types, representing different EW disposal practices. Tetrachlorodibenzo-p-dioxins (TCDDs) and octachlorodibenzo-p-dioxin (OCDD) could be used as indicators for source identification: open thermal disposal of EWs tended to lead to the formation of TCDDs, whereas OCDD was a product of non-thermal processes.

  16. Identification and validation of genetic variants that influence transcription factor and cell signaling protein levels.

    PubMed

    Hause, Ronald J; Stark, Amy L; Antao, Nirav N; Gorsic, Lidija K; Chung, Sophie H; Brown, Christopher D; Wong, Shan S; Gill, Daniel F; Myers, Jamie L; To, Lida Anita; White, Kevin P; Dolan, M Eileen; Jones, Richard Baker

    2014-08-01

    Many genetic variants associated with human disease have been found to be associated with alterations in mRNA expression. Although it is commonly assumed that mRNA expression changes will lead to consequent changes in protein levels, methodological challenges have limited our ability to test the degree to which this assumption holds true. Here, we further developed the micro-western array approach and globally examined relationships between human genetic variation and cellular protein levels. We collected more than 250,000 protein level measurements comprising 441 transcription factor and signaling protein isoforms across 68 Yoruba (YRI) HapMap lymphoblastoid cell lines (LCLs) and identified 12 cis and 160 trans protein level QTLs (pQTLs) at a false discovery rate (FDR) of 20%. Whereas up to two thirds of cis mRNA expression QTLs (eQTLs) were also pQTLs, many pQTLs were not associated with mRNA expression. Notably, we replicated and functionally validated a trans pQTL relationship between the KARS lysyl-tRNA synthetase locus and levels of the DIDO1 protein. This study demonstrates proof of concept in applying an antibody-based microarray approach to iteratively measure the levels of human proteins and relate these levels to human genome variation and other genomic data sets. Our results suggest that protein-based mechanisms might functionally buffer genetic alterations that influence mRNA expression levels and that pQTLs might contribute phenotypic diversity to a human population independently of influences on mRNA expression.
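
    The pQTLs above were declared at a 20% false discovery rate. The abstract does not state which FDR procedure was used; the sketch below shows the widely used Benjamini-Hochberg step-up procedure on made-up p-values, purely as an illustration of FDR control at that level.

```python
def benjamini_hochberg(p_values, fdr=0.20):
    """
    Return the indices of tests declared significant at the given FDR
    using the Benjamini-Hochberg step-up procedure.
    """
    m = len(p_values)
    order = sorted(range(m), key=lambda i: p_values[i])
    cutoff_rank = 0
    for rank, idx in enumerate(order, start=1):
        if p_values[idx] <= rank * fdr / m:
            cutoff_rank = rank
    return sorted(order[:cutoff_rank])

# Toy association p-values for candidate pQTLs (illustrative values only).
pvals = [0.001, 0.008, 0.039, 0.041, 0.09, 0.20, 0.51, 0.74]
print(benjamini_hochberg(pvals, fdr=0.20))
```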

  17. Computer-assisted identification of novel small molecule inhibitors targeting GLUT1

    NASA Astrophysics Data System (ADS)

    Wan, Zhining; Li, Xin; Sun, Rong; Li, Yuanyuan; Wang, Xiaoyun; Li, Xinru; Rong, Li; Shi, Zheng; Bao, Jinku

    2015-12-01

    Glucose transporters (GLUTs) are the main carriers of glucose that facilitate the diffusion of glucose into mammalian cells, especially GLUT1. Notably, GLUT1 is a rate-limiting transporter for glucose uptake, and its overexpression is a common characteristic of most cancers. Thus, the inhibition of GLUT1 by novel small compounds to lower glucose levels in cancer cells has become an emerging strategy. Herein, we employed high-throughput screening approaches to identify potential inhibitors against the sugar-binding site of GLUT1. First, molecular docking screening was performed against the Specs compound collection, and three molecules (ZINC19909927, ZINC19908826, and ZINC19815451) were selected as candidate GLUT1 inhibitors for further analysis. Then, taking the initial ligand β-NG as a reference, molecular dynamics (MD) simulations and the molecular mechanics/generalized Born surface area (MM/GBSA) method were applied to evaluate the binding stability and affinity of the three candidates towards GLUT1. Finally, we found that ZINC19909927 might have the highest affinity for the binding site of GLUT1. Meanwhile, energy decomposition analysis identified several residues located in the substrate-binding site that might provide clues for future inhibitor discovery towards GLUT1. Taken together, these results may provide valuable information for identifying new inhibitors targeting GLUT1-mediated glucose transport and metabolism for cancer therapeutics.

  18. Computational identification of miRNAs that modulate the differentiation of mesenchymal stem cells to osteoblasts

    PubMed Central

    Seenprachawong, Kanokwan; Nuchnoi, Pornlada; Nantasenamat, Chanin; Prachayasittikul, Virapong

    2016-01-01

    MicroRNAs (miRNAs) are small endogenous noncoding RNAs that play an instrumental role in post-transcriptional modulation of gene expression. Genes related to osteogenesis (i.e., RUNX2, COL1A1 and OSX) are important in controlling the differentiation of mesenchymal stem cells (MSCs) to bone tissues. The regulated expression level of miRNAs is critically important for the differentiation of MSCs to preosteoblasts. An understanding of miRNA regulation in osteogenesis could be applied in future treatment of bone defects. Therefore, this study aims to shed light on the mechanistic pathway underlying osteogenesis by predicting miRNAs that may modulate this pathway. This study investigates RUNX2, a major transcription factor for osteogenesis that drives MSCs into preosteoblasts. Three different prediction tools were employed for identifying miRNAs related to osteogenesis, using the 3'UTR of RUNX2 as the target gene. Of the 1,023 miRNAs, 70 miRNAs were found by at least two of the tools. Candidate miRNAs were then selected based on their free energy values, followed by assessing the probability of target accessibility. The results showed that miRNAs 23b, 23a, 30b, 143, 203, 217, and 221 could regulate the RUNX2 gene during the differentiation of MSCs to preosteoblasts. PMID:27168985
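
    Target prediction of this kind typically starts from complementarity between the miRNA seed region and the 3'UTR before free energy and accessibility are assessed. The sketch below scans a made-up UTR fragment for perfect seed matches; the miRNA sequences, the UTR fragment and the 7-nucleotide seed definition are illustrative assumptions rather than data from the study.

```python
# Hypothetical miRNA sequences and a made-up fragment of a 3'UTR; real work
# would use the annotated RUNX2 3'UTR and mature miRNA sequences (e.g. miRBase).
mirnas = {
    "miR-example-1": "UAAGGCACGCGGUGAAUGCCA",
    "miR-example-2": "AUCACAUUGCCAGGGAUUUCC",
}
utr_3prime = "AUGCCUGUGCCUUAACGUGUGAUGCAAUGUGAUCCCUAACGGUCCUGGCAAUGUGAUU"

COMPLEMENT = str.maketrans("AUGC", "UACG")

def seed_sites(mirna, utr, seed_len=7):
    """Positions in the UTR matching the reverse complement of the miRNA seed
    (nucleotides 2..8), the canonical determinant of target recognition."""
    seed = mirna[1:1 + seed_len]
    site = seed.translate(COMPLEMENT)[::-1]   # reverse complement of the seed
    return [i for i in range(len(utr) - seed_len + 1) if utr[i:i + seed_len] == site]

for name, seq in mirnas.items():
    print(name, seed_sites(seq, utr_3prime))
```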

  19. ENHANCING THE ATOMIC-LEVEL UNDERSTANDING OF CO2 MINERAL SEQUESTRATION MECHANISMS VIA ADVANCED COMPUTATIONAL MODELING

    SciTech Connect

    A.V.G. Chizmeshya

    2003-12-19

    /NETL managed National Mineral Sequestration Working Group we have already significantly improved our understanding of mineral carbonation. Group members at the Albany Research Center have recently shown that carbonation of olivine and serpentine, which naturally occurs over geological time (i.e., 100,000s of years), can be accelerated to near completion in hours. Further process refinement will require a synergetic science/engineering approach that emphasizes simultaneous investigation of both thermodynamic processes and the detailed microscopic, atomic-level mechanisms that govern carbonation kinetics. Our previously funded Phase I Innovative Concepts project demonstrated the value of advanced quantum-mechanical modeling as a complementary tool in bridging important gaps in our understanding of the atomic/molecular structure and reaction mechanisms that govern CO2 mineral sequestration reaction processes for the model Mg-rich lamellar hydroxide feedstock material Mg(OH)2. In the present simulation project, improved techniques and more efficient computational schemes have allowed us to expand and augment these capabilities and explore more complex Mg-rich, lamellar hydroxide-based feedstock materials, including the serpentine-based minerals. These feedstock materials are being actively investigated due to their wide availability, and low-cost CO2 mineral sequestration potential. Cutting-edge first principles quantum chemical, computational solid-state and materials simulation methodology studies proposed herein, have been strategically integrated with our new DOE supported (ASU-Argonne National Laboratory) project to investigate the mechanisms that govern mineral feedstock heat-treatment and aqueous/fluid-phase serpentine mineral carbonation in situ. This unified, synergetic theoretical and experimental approach will provide a deeper understanding of the key reaction mechanisms than either individual approach can alone. Ab initio techniques will also

  20. ENHANCING THE ATOMIC-LEVEL UNDERSTANDING OF CO2 MINERAL SEQUESTRATION MECHANISMS VIA ADVANCED COMPUTATIONAL MODELING

    SciTech Connect

    A.V.G. Chizmeshya; M.J. McKelvy; G.H. Wolf; R.W. Carpenter; D.A. Gormley; J.R. Diefenbacher; R. Marzke

    2006-03-01

    significantly improved our understanding of mineral carbonation. Group members at the Albany Research Center have recently shown that carbonation of olivine and serpentine, which naturally occurs over geological time (i.e., 100,000s of years), can be accelerated to near completion in hours. Further process refinement will require a synergetic science/engineering approach that emphasizes simultaneous investigation of both thermodynamic processes and the detailed microscopic, atomic-level mechanisms that govern carbonation kinetics. Our previously funded Phase I Innovative Concepts project demonstrated the value of advanced quantum-mechanical modeling as a complementary tool in bridging important gaps in our understanding of the atomic/molecular structure and reaction mechanisms that govern CO2 mineral sequestration reaction processes for the model Mg-rich lamellar hydroxide feedstock material Mg(OH)2. In the present simulation project, improved techniques and more efficient computational schemes have allowed us to expand and augment these capabilities and explore more complex Mg-rich, lamellar hydroxide-based feedstock materials, including the serpentine-based minerals. These feedstock materials are being actively investigated due to their wide availability, and low-cost CO2 mineral sequestration potential. Cutting-edge first principles quantum chemical, computational solid-state and materials simulation methodology studies proposed herein, have been strategically integrated with our new DOE supported (ASU-Argonne National Laboratory) project to investigate the mechanisms that govern mineral feedstock heat-treatment and aqueous/fluid-phase serpentine mineral carbonation in situ. This unified, synergetic theoretical and experimental approach has provided a deeper understanding of the key reaction mechanisms than either individual approach can alone. We used ab initio techniques to significantly advance our understanding of atomic-level processes at the solid/solution interface by

  1. ENHANCING THE ATOMIC-LEVEL UNDERSTANDING OF CO2 MINERAL SEQUESTRATION MECHANISMS VIA ADVANCED COMPUTATIONAL MODELING

    SciTech Connect

    A.V.G. Chizmeshya

    2002-12-19

    /NETL managed National Mineral Sequestration Working Group we have already significantly improved our understanding of mineral carbonation. Group members at the Albany Research Center have recently shown that carbonation of olivine and serpentine, which naturally occurs over geological time (i.e., 100,000s of years), can be accelerated to near completion in hours. Further process refinement will require a synergetic science/engineering approach that emphasizes simultaneous investigation of both thermodynamic processes and the detailed microscopic, atomic-level mechanisms that govern carbonation kinetics. Our previously funded Phase I Innovative Concepts project demonstrated the value of advanced quantum-mechanical modeling as a complementary tool in bridging important gaps in our understanding of the atomic/molecular structure and reaction mechanisms that govern CO2 mineral sequestration reaction processes for the model Mg-rich lamellar hydroxide feedstock material Mg(OH)2. In the present simulation project, improved techniques and more efficient computational schemes have allowed us to expand and augment these capabilities and explore more complex Mg-rich, lamellar hydroxide-based feedstock materials, including the serpentine-based minerals. These feedstock materials are being actively investigated due to their wide availability, and low-cost CO2 mineral sequestration potential. Cutting-edge first principles quantum chemical, computational solid-state and materials simulation methodology studies proposed herein, have been strategically integrated with our new DOE supported (ASU-Argonne National Laboratory) project to investigate the mechanisms that govern mineral feedstock heat-treatment and aqueous/fluid-phase serpentine mineral carbonation in situ. This unified, synergetic theoretical and experimental approach will provide a deeper understanding of the key reaction mechanisms than either individual approach can alone. Ab initio techniques will also

  2. Three dimensional morphological studies of Larger Benthic Foraminifera at the population level using micro computed tomography

    NASA Astrophysics Data System (ADS)

    Kinoshita, Shunichi; Eder, Wolfgang; Woeger, Julia; Hohenegger, Johann; Briguglio, Antonino; Ferrandez-Canadell, Carles

    2015-04-01

    Symbiont-bearing larger benthic Foraminifera (LBF) are long-living (at least 1 year), single-celled marine organisms with complex calcium carbonate shells. Their morphology has been intensively studied since the middle of the nineteenth century. This has led to a broad spectrum of taxonomic results, important for fields from biostratigraphy to ecology in shallow-water tropical to warm temperate marine palaeo-environments. However, traditional investigation methods required specimens to be cut or destroyed in order to analyse the taxonomically important inner structures. X-ray micro-computed tomography (microCT) is one of the newest techniques used in morphological studies. Its greatest advantage is the non-destructive acquisition of inner structures. Furthermore, the ongoing improvement of microCT scanners' hardware and software provides high-resolution, short-duration scans well suited to LBF. Three-dimensional imaging techniques make it possible to select and extract each chamber and to easily measure its volume, surface and several form parameters used for morphometric analyses. Thus, 3-dimensional visualisation of LBF tests is a very big step forward from traditional morphology based on 2-dimensional data. The quantification of chamber form is a great opportunity to tackle LBF structures, architectures and bauplan geometry. Micrometric digital resolution is the only way to resolve many controversies in the phylogeny and evolutionary trends of LBF. For the present study we used micro-computed tomography to investigate the chamber number of every specimen from a statistically representative part of each population in order to estimate population dynamics. Samples of living individuals are collected at monthly intervals from fixed locations. Specific preparation allows up to 35 specimens to be scanned per scan within 2 hours and the complete digital dataset to be obtained for each specimen of the population. MicroCT thus enables a fast and precise count of all chambers built by the foraminifer from its

  3. Interactomes to Biological Phase Space: a call to begin thinking at a new level in computational biology.

    SciTech Connect

    Davidson, George S.; Brown, William Michael

    2007-09-01

    Techniques for high-throughput determination of interactomes, together with high-resolution protein colocalization maps within organelles and through membranes, will soon create a vast resource. With these data, biological descriptions, akin to the high-dimensional phase spaces familiar to physicists, will become possible. These descriptions will capture sufficient information to make possible realistic, system-level models of cells. The descriptions and the computational models they enable will require powerful computing techniques. This report is offered as a call to the computational biology community to begin thinking at this scale and as a challenge to develop the required algorithms and codes to make use of the new data.

  4. Information-based system identification for predicting the groundwater-level fluctuations of hillslopes

    NASA Astrophysics Data System (ADS)

    Hong, Yao-Ming; Wan, Shiuan

    2011-09-01

    The analysis of pre-existing landslides and landslide-prone hillslopes requires an estimation of maximum groundwater levels. A rapid increase in groundwater levels may be a dominant factor in evaluating the occurrence of landslides. System identification, the use of mathematical tools and algorithms for building dynamic models from measured data, is adopted in this study. The fluid mass-balance equation is used to model groundwater-level fluctuations, and the model is solved numerically using the finite-difference method. Entropy-based classification (EBC) is used as a data-mining technique to identify the appropriate ranges of the influencing variables. The landslide area at Wushe Reservoir, Nantou County, Taiwan, is chosen as a field test site for verification. The study generated 65,535 sets of values for the groundwater-level variables of the governing equation, which were judged by root mean square error. By applying cross-validation methods and EBC, limited numbers of validation samples are used to find the range of each parameter. Within these ranges, a heuristic method is employed to find the best value of each parameter for the groundwater-level prediction model. The ranges of the governing factors are evaluated and the resulting performance is examined.
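
    As a rough illustration of the mass-balance-plus-finite-difference idea, the sketch below steps a simple lumped storage equation forward in time and scores the simulated levels against observations with the root mean square error; the linear-outflow form of the equation, the parameter values and the synthetic data are assumptions, not the governing equation or data used in the study.

```python
import math

def simulate_groundwater(h0, recharge, discharge_coeff, specific_yield, dt=1.0):
    """
    Explicit finite-difference sketch of a lumped mass-balance model:
        S_y * dh/dt = R(t) - k * h        (assumed linear-outflow form)
    h0              : initial groundwater level (m)
    recharge        : recharge rate R(t) per time step (m/day)
    discharge_coeff : linear outflow coefficient k (1/day)
    specific_yield  : storage coefficient S_y (dimensionless)
    """
    h, levels = h0, []
    for r in recharge:
        h = h + dt * (r - discharge_coeff * h) / specific_yield
        levels.append(h)
    return levels

def rmse(predicted, observed):
    return math.sqrt(sum((p - o) ** 2 for p, o in zip(predicted, observed)) / len(observed))

# Synthetic rainfall-driven recharge and "observed" levels, for illustration only.
recharge = [0.00, 0.02, 0.05, 0.01, 0.00, 0.00]
observed = [10.0, 10.1, 10.4, 10.4, 10.3, 10.2]
predicted = simulate_groundwater(h0=10.0, recharge=recharge,
                                 discharge_coeff=0.001, specific_yield=0.15)
print(predicted)
print("RMSE:", rmse(predicted, observed))
```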

  5. Identification of entry-level competencies for associate degree radiographers as perceived by primary role definers

    SciTech Connect

    Thorpe, R.L.

    1981-01-01

    The primary purpose of this study was to identify those competencies needed by Associate Degree Radiographers when they assume employment as entry-level practitioners. A second purpose of the study was to rank order the identified competencies within the role delineations recognized by the Essentials and Guidelines of an Accredited Educational Program for the Radiographer. These role delineations include: radiation protection, exercise discretion and judgment, emergency and life saving techniques, patient care and interpersonal communication, and role as professional member. A third purpose of the study was to examine the degree of consensus on role definition of entry-level competencies needed by Associate Degree Radiographers as perceived by primary role definers (such as employers, employees, and educators), and by other selected variables: age, sex, length of experience in radiologic technology, level of formal education, and place of employment. A major finding of this study was that respondents did not differ significantly in their ranking of entry-level competencies needed by Associate Degree Radiographers when the responses were analyzed according to position, age, sex, length of experience, level of education, or place of employment. Another important finding was that respondents considered all of the 63 competencies as important and needed by Associate Degree Radiographers upon initial employment. A major conclusion and recommendation of this study, in view of the high agreement on the rank ordering of competencies, was that these competencies should be included in a competency-based education model. It was further recommended that a three-way system of communication between employers, employees, and educators be considered in order to pool resources and to increase understanding of each position group's contribution and influence on entry-level Associate Degree Radiographers.

  6. Clinical Identification of the Vertebral Level at Which the Lumbar Sympathetic Ganglia Aggregate

    PubMed Central

    An, Ji Won; Koh, Jae Chul; Sun, Jong Min; Park, Ju Yeon; Choi, Jong Bum; Shin, Myung Ju

    2016-01-01

    Background The location and the number of lumbar sympathetic ganglia (LSG) vary between individuals. The aim of this study was to determine the appropriate level for a lumbar sympathetic ganglion block (LSGB), corresponding to the level at which the LSG principally aggregate. Methods Seventy-four consecutive subjects, including 31 women and 31 men, underwent LSGB either on the left (n = 31) or the right side (n = 43). The primary site of needle entry was randomly selected at the L3 or L4 vertebra. A total of less than 1 ml of radio opaque dye with 4% lidocaine was injected, taking caution not to traverse beyond the level of one vertebral body. The procedure was considered responsive when the skin temperature increased by more than 1℃ within 5 minutes. Results The median responsive level was significantly different between the left (lower third of the L4 body) and right (lower margin of the L3 body) sides (P = 0.021). However, there was no significant difference in the values between men and women. The overall median responsive level was the upper third of the L4 body. The mean responsive level did not correlate with height or BMI. There were no complications on short-term follow-up. Conclusions Selection of the primary target in the left lower third of the L4 vertebral body and the right lower margin of the L3 vertebral body may reduce the number of needle insertions and the volume of agents used in conventional or neurolytic LSGB and radiofrequency thermocoagulation. PMID:27103965

  7. Identification of the source of elevated hepatocyte growth factor levels in multiple myeloma patients

    PubMed Central

    2014-01-01

    Background Hepatocyte growth factor (HGF) is a pleiotropic cytokine which can lead to cancer cell proliferation, migration and metastasis. In multiple myeloma (MM) patients it is an abundant component of the bone marrow. HGF levels are elevated in 50% of patients and associated with poor prognosis. Here we aim to investigate its source in myeloma. Methods HGF mRNA levels in bone marrow core biopsies from healthy individuals and myeloma patients were quantified by real-time PCR. HGF gene expression profiling in CD138+ cells isolated from bone marrow aspirates of healthy individuals and MM patients was performed by microarray analysis. HGF protein concentrations present in peripheral blood of MM patients were measured by enzyme-linked immunosorbent assay (ELISA). Cytogenetic status of CD138+ cells was determined by fluorescence in situ hybridization (FISH) and DNA sequencing of the HGF gene promoter. HGF secretion in co-cultures of human myeloma cell lines and bone marrow stromal cells was measured by ELISA. Results HGF gene expression profiling in both bone marrow core biopsies and CD138+ cells showed elevated HGF mRNA levels in myeloma patients. HGF mRNA levels in biopsies and in myeloma cells correlated. Quantification of HGF protein levels in serum also correlated with HGF mRNA levels in CD138+ cells from corresponding patients. Cytogenetic analysis showed myeloma cell clones with HGF copy numbers between 1 and 3 copies. There was no correlation between HGF copy number and HGF mRNA levels. Co-cultivation of the human myeloma cell lines ANBL-6 and JJN3 with bone marrow stromal cells or the HS-5 cell line resulted in a significant increase in secreted HGF. Conclusions We here show that in myeloma patients HGF is primarily produced by malignant plasma cells, and that HGF production by these cells might be supported by the bone marrow microenvironment. Considering the fact that elevated HGF serum and plasma levels predict poor prognosis, these findings are of

  8. Risk of node metastasis of sentinel lymph nodes detected in level II/III of the axilla by single-photon emission computed tomography/computed tomography

    PubMed Central

    SHIMA, HIROAKI; KUTOMI, GORO; SATOMI, FUKINO; MAEDA, HIDEKI; TAKAMARU, TOMOKO; KAMESHIMA, HIDEKAZU; OMURA, TOSEI; MORI, MITSURU; HATAKENAKA, MASAMITSU; HASEGAWA, TADASHI; HIRATA, KOICHI

    2014-01-01

    In breast cancer, single-photon emission computed tomography/computed tomography (SPECT/CT) shows the exact anatomical location of sentinel nodes (SN). SPECT/CT mainly depicts the axilla and also reveals atypical sites of extra-axillary lymphatic drainage. The mechanism of how the atypical hot nodes are involved in lymphatic metastasis was retrospectively investigated in the present study, particularly at the level II/III region. SPECT/CT was performed in 92 clinical stage 0-IIA breast cancer patients. Sentinel lymph nodes are depicted as hot nodes in SPECT/CT. Patients were divided into two groups: with or without a hot node in level II/III on SPECT/CT. The existence of metastasis in level II/III was investigated and the risk factors were identified. A total of 12 patients were positive for metastasis on sentinel lymph node biopsy, and axillary lymph node dissection (ALND) was performed. These patients were divided into two groups: with and without SN in level II/III, and nodes in level II/III were pathologically proven. In 11 of the 92 patients, hot nodes were detected in level II/III. There was a significant difference in node metastasis depending on whether there were hot nodes in level II/III (P=0.0319). Multivariate analysis indicated that hot nodes in level II/III and lymphatic invasion were independent factors associated with node metastasis. There were 12 SN-positive patients followed by ALND. In four of the 12 patients, hot nodes were observed in level II/III. In two of the four patients, hot nodes depicted by SPECT/CT and pathologically evident metastatic nodes were found in the same lesion. Therefore, the present study indicated that a hot node in level II/III as depicted by SPECT/CT may indicate a risk of SN metastasis, including deeper nodes. PMID:25289038

  9. DIATOM INDICES OF STREAM ECOSYSTEM CONDITIONS: COMPARISON OF GENUS VS. SPECIES LEVEL IDENTIFICATIONS

    EPA Science Inventory

    Diatom assemblage data collected between 1993 and 1995 from 233 Mid-Appalachian streams were used to compare indices of biotic integrity based on genus vs. species level taxonomy. Thirty-seven genera and 197 species of diatoms were identified from these samples. Metrics included...

  10. Information and Computing Recommendations for a Course at the Secondary School Level.

    ERIC Educational Resources Information Center

    Education and Computing, 1986

    1986-01-01

    Presents recommendations for an interdisciplinary course called "Computers, Information, and Related Technologies" developed for 9th and 10th grade students by the American Federation of Information Processing Societies (AFIPS). Main topics presented in the course include how information is processed, using information, and how computing and…

  11. Computing at the High School Level: Changing What Teachers and Students Know and Believe

    ERIC Educational Resources Information Center

    Munson, Ashlyn; Moskal, Barbara; Harriger, Alka; Lauriski-Karriker, Tonya; Heersink, Daniel

    2011-01-01

    Research indicates that students often opt out of computing majors due to a lack of prior experience in computing and a lack of knowledge of field-based job opportunities. In addition, it has been found that students respond positively to new subjects when teachers and counselors are enthusiastic and knowledgeable about the area. The summer…

  12. Analysis of the Computer Anxiety Levels of Secondary Technical Education Teachers in West Virginia.

    ERIC Educational Resources Information Center

    Gordon, Howard R. D.

    The computer anxiety of 116 randomly selected secondary technical education teachers from 8 area vocational-technical centers in West Virginia was the focus of a study. The mailed questionnaire consisted of two parts: Oetting's Computer Anxiety Scale (COMPAS) and closed-form questions to obtain general demographic information about the teachers…

  13. Sample Computer Applications for the High School Mathematics Classroom. Selected Topics--Level One.

    ERIC Educational Resources Information Center

    New York City Board of Education, Brooklyn, NY. Div. of Computer Information Services.

    This manual provides a relevant resource for mathematics teachers who wish to take advantage of the extensive library of computer software materials available to expand and strengthen classroom instruction. Classroom management procedures, software duplication guidelines, procedures for copying diskettes for MS DOS and Apple II computers, and…

  14. All-optical reservoir computing system based on InGaAsP ring resonators for high-speed identification and optical routing in optical networks

    NASA Astrophysics Data System (ADS)

    Mesaritakis, Charis; Kapsalis, Alexandros; Syvridis, Dimitris

    2015-01-01

    In this paper an all-optical reservoir computing scheme is modeled that paves an alternative route to photonic high-bit-rate header identification in optical networks and allows direct processing in the analog domain. The system consists of randomly interconnected InGaAsP micro-ring resonators, and the computational efficiency of the scheme is based on the ultra-fast Kerr effect and two-photon absorption. The system's efficiency is validated through detailed numerical modeling and two application-oriented benchmark tests: the classification of 32-bit digital headers, encoded as NRZ optical pulses at a bit rate of 240 Gbps, and the identification of pseudo-analog patterns for real-time sensing applications in the analog domain.
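    The reservoir computing principle exploited optically by the micro-ring network can be illustrated with a conventional numerical analogue: a fixed random recurrent "reservoir" whose states are read out by a trained linear (ridge-regression) layer. The sketch below is a generic echo-state-style toy on a synthetic bit stream, not a model of the photonic system described in the paper.

```python
# Minimal echo-state-style reservoir: fixed random recurrent layer,
# trained linear readout. Conceptual analogue only, not the photonic system.
import numpy as np

rng = np.random.default_rng(0)
n_in, n_res = 1, 200

W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
W = rng.uniform(-0.5, 0.5, (n_res, n_res))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))   # scale spectral radius below 1

def run_reservoir(u):
    """Drive the reservoir with an input sequence u (T x n_in); return all states."""
    x = np.zeros(n_res)
    states = []
    for u_t in u:
        x = np.tanh(W @ x + W_in @ u_t)
        states.append(x.copy())
    return np.array(states)

# Toy memory task standing in for header recognition: predict the previous bit.
T = 500
u = rng.integers(0, 2, (T, 1)).astype(float)   # random bit stream
y = np.roll(u[:, 0], 1)                        # target: previous bit

X = run_reservoir(u)
ridge = 1e-2
W_out = np.linalg.solve(X.T @ X + ridge * np.eye(n_res), X.T @ y)  # ridge readout
print("training MSE:", np.mean((X @ W_out - y) ** 2))
```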

  15. Determining effective subject-specific strength levels for forward dives using computer simulations of recorded performances.

    PubMed

    King, Mark A; Kong, Pui W; Yeadon, Maurice R

    2009-12-11

    This study used optimisation procedures in conjunction with an 8-segment torque-driven computer simulation model of the takeoff phase in springboard diving to determine appropriate subject-specific strength parameters for use in the simulation of forward dives. Kinematic data were obtained using high-speed video recordings of performances of a forward dive pike (101B) and a forward 2 1/2 somersault pike dive (105B) by an elite diver. Nine parameters for each torque generator were taken from dynamometer measurements on an elite gymnast. The isometric torque parameter for each torque generator was then varied together with torque activation timings until the root mean squared (RMS) percentage difference between simulation and performance in terms of joint angles, orientation, linear momentum, angular momentum, and duration of springboard contact was minimised for each of the two dives. The two sets of isometric torque parameters were combined into a single set by choosing the larger value from the two sets for each parameter. Simulations using the combined set of isometric torque parameters matched the two performances closely with RMS percentage differences of 2.6% for 101B and 3.7% for 105B. Maximising the height reached by the mass centre during the flight phase for 101B using the combined set of isometric parameters and by varying torque generator activation timings during takeoff resulted in a credible height increase of 38 mm compared to the matching simulation. It is concluded that the procedure is able to determine appropriate effective strength levels suitable for use in the optimisation of simulated forward rotating dive performances.
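    The matching step described above can be pictured with a small optimisation sketch. The toy "simulator" and its two parameters below are hypothetical stand-ins for the 8-segment torque-driven model and its isometric torque and activation-timing parameters; only the idea of minimising a combined RMS percentage difference between simulation and recorded performance is illustrated.

```python
# Toy sketch of fitting simulation parameters to a recorded performance by
# minimising an RMS percentage difference (hypothetical simulator and data).
import numpy as np
from scipy.optimize import minimize

t = np.linspace(0.0, 0.4, 50)                 # takeoff phase, seconds
recorded = 1.8 * np.sin(6.0 * t) + 0.3 * t    # stand-in for a measured joint angle

def simulate(params):
    """Toy stand-in for the torque-driven simulation (amplitude, drift)."""
    a, b = params
    return a * np.sin(6.0 * t) + b * t

def objective(params):
    """RMS percentage difference between simulation and performance."""
    sim = simulate(params)
    scale = np.ptp(recorded)                  # normalise by the movement range
    return 100.0 * np.sqrt(np.mean(((sim - recorded) / scale) ** 2))

res = minimize(objective, x0=[1.0, 0.0], method="Nelder-Mead")
print("fitted parameters:", res.x, "RMS % difference:", res.fun)
```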

  16. High-level waste storage tank farms/242-A evaporator standards/requirements identification document (S/RID), Vol. 3

    SciTech Connect

    Not Available

    1994-04-01

    The Safeguards and Security (S&S) Functional Area addresses the programmatic and technical requirements, controls, and standards which assure compliance with applicable S&S laws and regulations. Numerous S&S responsibilities are performed on behalf of the Tank Farm Facility by site-level organizations. Certain other responsibilities are shared, and the remainder are the sole responsibility of the Tank Farm Facility. This Requirements Identification Document describes a complete functional Safeguards and Security Program that is presumed to be the responsibility of the Tank Farm Facility. The following list identifies the programmatic elements in the S&S Functional Area: Program Management, Protection Program Scope and Evaluation, Personnel Security, Physical Security Systems, Protection Program Operations, Material Control and Accountability, Information Security, and Key Program Interfaces.

  17. Identification of interleukin 2, 6, and 8 levels around miniscrews during orthodontic tooth movement.

    PubMed

    Hamamcı, Nihal; Acun Kaya, Filiz; Uysal, Ersin; Yokuş, Beran

    2012-06-01

    The aim of this study was to identify the levels of interleukin (IL)-2, IL-6, and IL-8 around miniscrews used for anchorage during canine distalization. Sixteen patients (eight males and eight females; mean age, 16.6 ± 2.4 years) who were treated with bilateral upper first premolar extractions were included in the study. Thirty-two maxillary miniscrew implants were placed bilaterally in the alveolar bone between the maxillary second premolars and first molars as anchorage units for maxillary canine distalization. Three groups were constructed. The treatment, miniscrew, and control groups consisted of upper canines, miniscrew implants, and upper second premolars, respectively. Peri-miniscrew implant crevicular fluid and gingival crevicular fluid (GCF) were obtained at baseline (T1) and at 1 (T2), 24 (T3), and 48 (T4) hours, 7 (T5) and 21 (T6) days, and 3 months (T7) after force application. Paired sample t-tests were used to determine within-group changes and Dunnett's t and Tukey's honestly significant difference tests for between-group multiple comparisons. During the 3 month period, IL-2 levels significantly increased (P < 0.01) but only in the treatment group after 24 hours. IL-6 levels were unchanged at all times points in the three groups. IL-8 levels increased significantly at 1 (P < 0.05), 24 (P < 0.01), and 48 (P < 0.01) hours in the treatment group and at 24 (P < 0.05) and 48 (P < 0.01) hours in the miniscrew group. It appears that miniscrews can be used for anchorage in orthodontics when correct physiological forces are applied.

  18. Identification of interleukin 2, 6, and 8 levels around miniscrews during orthodontic tooth movement.

    PubMed

    Hamamcı, Nihal; Acun Kaya, Filiz; Uysal, Ersin; Yokuş, Beran

    2012-06-01

    The aim of this study was to identify the levels of interleukin (IL)-2, IL-6, and IL-8 around miniscrews used for anchorage during canine distalization. Sixteen patients (eight males and eight females; mean age, 16.6 ± 2.4 years) who were treated with bilateral upper first premolar extractions were included in the study. Thirty-two maxillary miniscrew implants were placed bilaterally in the alveolar bone between the maxillary second premolars and first molars as anchorage units for maxillary canine distalization. Three groups were constructed. The treatment, miniscrew, and control groups consisted of upper canines, miniscrew implants, and upper second premolars, respectively. Peri-miniscrew implant crevicular fluid and gingival crevicular fluid (GCF) were obtained at baseline (T1) and at 1 (T2), 24 (T3), and 48 (T4) hours, 7 (T5) and 21 (T6) days, and 3 months (T7) after force application. Paired sample t-tests were used to determine within-group changes and Dunnett's t and Tukey's honestly significant difference tests for between-group multiple comparisons. During the 3 month period, IL-2 levels significantly increased (P < 0.01) but only in the treatment group after 24 hours. IL-6 levels were unchanged at all times points in the three groups. IL-8 levels increased significantly at 1 (P < 0.05), 24 (P < 0.01), and 48 (P < 0.01) hours in the treatment group and at 24 (P < 0.05) and 48 (P < 0.01) hours in the miniscrew group. It appears that miniscrews can be used for anchorage in orthodontics when correct physiological forces are applied. PMID:21474566

  19. SU-E-P-10: Establishment of Local Diagnostic Reference Levels of Routine Exam in Computed Tomography

    SciTech Connect

    Yeh, M; Wang, Y; Weng, H

    2015-06-15

    Introduction: National diagnostic reference levels (NDRLs) provide a reference dose for radiological examinations and a basis for patient dose optimization. Local diagnostic reference levels (LDRLs), maintained by periodically reviewing and checking doses, are a more efficient way to improve examination practice; establishing a diagnostic reference level is therefore the important first step. Taiwan has already established radiation dose limit values for computed tomography, and many studies report that CT scans contribute most of the radiation dose in medical imaging. This study therefore reviews the international status of DRLs and establishes local diagnostic reference levels for computed tomography in our hospital. Methods and Materials: Two clinical CT scanners (a Toshiba Aquilion and a Siemens Sensation) were included in this study. For CT examinations the basic recommended dosimetric quantity is the Computed Tomography Dose Index (CTDI). For each examination of each body part, at least 10 patients were collected. The routine examinations were carried out, all exposure parameters were collected, and the corresponding CTDIvol and DLP values were determined. Results: The majority of patients (75%) were between 60–70 kg of body weight. There were 25 examinations in this study. Table 1 shows the LDRL of each routine CT examination. Conclusions: This study reviews the international status of DRLs, establishes local diagnostic reference levels for computed tomography in our hospital, and provides a radiation dose reference as a basis for optimizing patient dose.
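    For illustration, a local reference value for each examination type is typically taken as a percentile of the per-patient dose distribution; the minimal sketch below assumes the 75th percentile (a common convention, not stated in the abstract) and uses invented CTDIvol/DLP records.

```python
# Illustrative sketch: derive a local reference value per exam type from
# per-patient dose records (75th percentile assumed; data invented).
import numpy as np

# hypothetical per-patient records: exam type -> (CTDIvol [mGy], DLP [mGy*cm])
records = {
    "head":    [(55.1, 910), (60.3, 980), (48.7, 850), (52.0, 890), (58.2, 940)],
    "abdomen": [(14.8, 720), (12.1, 650), (16.3, 800), (13.5, 690), (15.0, 740)],
}

for exam, values in records.items():
    ctdi = [v[0] for v in values]
    dlp = [v[1] for v in values]
    print(f"{exam}: LDRL CTDIvol = {np.percentile(ctdi, 75):.1f} mGy, "
          f"DLP = {np.percentile(dlp, 75):.0f} mGy*cm")
```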

  20. Identification of earthquake signals from groundwater level records using the HHT method

    NASA Astrophysics Data System (ADS)

    Chen, Chieh-Hung; Wang, Chung-Ho; Liu, Jann-Yenq; Liu, Chen; Liang, Wen-Tzong; Yen, Horng-Yuan; Yeh, Yih-Hsiung; Chia, Yee-Ping; Wang, Yetmen

    2010-03-01

    Coseismic signals of groundwater levels are generally obtained by subtracting the responses of atmospheric pressure, Earth tides and precipitation from the observed data. However, if the observations are conducted without nearby barometers, pluviometers or Earth tide data for correction, coseismic signals are often difficult to extract. In this case, the Hilbert-Huang transform (HHT) is used to obtain the instantaneous frequencies and amplitudes for every point of the decomposed intrinsic mode functions (IMFs) from groundwater level data for differentiating the related frequency-dependent responses without a further auxiliary input. The extracted coseismic signals show intense amplitude pulses that are clearly seen in the third IMF. In addition, two types of coseismic signals can be readily distinguished in the results from the HHT transform. One is an instantaneous short-time signal induced by the passing of seismic waves. Another coseismic signal is a sustained signal induced by the near-field earthquake occurring near the Hualien station, Taiwan and shows a positive correlation between the earthquake distance and magnitude. Our results generally show that using the HHT transform improves our understanding of automatic detection of the coseismic signals from the groundwater level.
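    The Hilbert step of the HHT can be sketched as follows: given one intrinsic mode function, the analytic signal yields instantaneous amplitude and frequency, and a simple amplitude threshold flags candidate coseismic pulses. The empirical mode decomposition itself is assumed to have been performed elsewhere, and the IMF below is synthetic.

```python
# Sketch of the Hilbert step of the HHT on a single synthetic IMF:
# instantaneous amplitude/frequency from the analytic signal, plus a
# crude amplitude-threshold detector for a coseismic-like pulse.
import numpy as np
from scipy.signal import hilbert

t = np.arange(0, 6 * 3600, 60.0)                    # six hours, one sample per minute
imf = 0.02 * np.sin(2 * np.pi * 0.002 * t)
imf += 0.3 * np.exp(-((t - 10800) / 300.0) ** 2) * np.sin(2 * np.pi * 0.01 * t)  # pulse

analytic = hilbert(imf)
amplitude = np.abs(analytic)                        # instantaneous amplitude
phase = np.unwrap(np.angle(analytic))
inst_freq = np.gradient(phase, t) / (2 * np.pi)     # instantaneous frequency [Hz]

threshold = 5 * np.median(amplitude)                # flag unusually large amplitudes
print("median instantaneous frequency [Hz]:", np.median(inst_freq))
print("possible coseismic samples:", np.where(amplitude > threshold)[0][:10])
```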

  1. Identification of technical problems encountered in the shallow land burial of low-level radioactive wastes

    SciTech Connect

    Jacobs, D.G.; Epler, J.S.; Rose, R.R.

    1980-03-01

    A review of problems encountered in the shallow land burial of low-level radioactive wastes has been made in support of the technical aspects of the National Low-Level Waste (LLW) Management Research and Development Program being administered by the Low-Level Waste Management Program Office, Oak Ridge National Laboratory. The operating histories of burial sites at six major DOE and five commercial facilities in the US have been examined and several major problems identified. The problems experienced at the sites have been grouped into general categories dealing with site development, waste characterization, operation, and performance evaluation. Based on this grouping of the problems, a number of major technical issues have been identified which should be incorporated into program plans for further research and development. For each technical issue a discussion is presented relating the issue to a particular problem, identifying some recent or current related research, and suggesting further work necessary for resolving the issue. Major technical issues which have been identified include the need for improved water management, further understanding of the effect of chemical and physical parameters on radionuclide migration, more comprehensive waste records, improved programs for performance monitoring and evaluation, development of better predictive capabilities, evaluation of space utilization, and improved management control.

  2. Identification of new stress-induced microRNA and their targets in wheat using computational approach

    PubMed Central

    Pandey, Bharati; Gupta, Om Prakash; Pandey, Dev Mani; Sharma, Indu; Sharma, Pradeep

    2013-01-01

    MicroRNAs (miRNAs) are a class of short endogenous non-coding small RNA molecules of about 18–22 nucleotides in length. Their main function is to downregulate gene expression in different ways, such as translational repression, mRNA cleavage and epigenetic modification. Computational predictions have raised the number of miRNAs in wheat significantly using an EST based approach. Hence, a combinatorial approach, which is an amalgamation of bioinformatics software and Perl scripts, was used to identify new miRNAs to add to the growing database of wheat miRNA. Identification of miRNAs was initiated by mining the EST (Expressed Sequence Tags) database available at the National Center for Biotechnology Information. In this investigation, 4677 mature microRNA sequences belonging to 50 miRNA families from different plant species were used to predict miRNA in wheat. A total of five abiotic stress-responsive new miRNAs were predicted and named Ta-miR5653, Ta-miR855, Ta-miR819k, Ta-miR3708 and Ta-miR5156. In addition, four previously identified miRNAs, i.e., Ta-miR1122, miR1117, Ta-miR1134 and Ta-miR1133, were predicted in newly identified EST sequences, and 14 potential target genes were subsequently predicted, most of which seem to encode ubiquitin carrier protein, serine/threonine protein kinase, 40S ribosomal protein, F-box/kelch-repeat protein, BTB/POZ domain-containing protein, and transcription factors which are involved in growth, development, metabolism and stress response. Our results have increased the number of miRNAs in wheat, which should be useful for further investigation into the biological functions and evolution of miRNAs in wheat and other plant species. PMID:23511197
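    The homology-based step of such EST mining can be illustrated with a toy scan for near-exact matches of a known mature miRNA within an EST sequence; the sequences below are invented, and a real pipeline would add secondary-structure (hairpin) and target-prediction checks.

```python
# Toy sketch of the homology step of EST-based miRNA prediction: scan an EST
# for regions matching a known mature miRNA with at most a few mismatches.
def count_mismatches(a, b):
    return sum(1 for x, y in zip(a, b) if x != y)

def find_mirna_hits(est, mirna, max_mismatch=3):
    """Return positions where the mature miRNA aligns to the EST with few mismatches."""
    hits = []
    for i in range(len(est) - len(mirna) + 1):
        if count_mismatches(est[i:i + len(mirna)], mirna) <= max_mismatch:
            hits.append(i)
    return hits

known_mirna = "UGACAGAAGAGAGUGAGCAC"              # invented example "mature miRNA"
est = "AUGCUGACAGAAGAGAGUGAGCACGGAUCCGUAAGCUA"   # invented EST fragment (as RNA)

print("candidate positions:", find_mirna_hits(est, known_mirna))
```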

  3. Identification of 14-3-3 Proteins Phosphopeptide-Binding Specificity Using an Affinity-Based Computational Approach.

    PubMed

    Li, Zhao; Tang, Jijun; Guo, Fei

    2016-01-01

    The 14-3-3 proteins are a highly conserved family of homodimeric and heterodimeric molecules, expressed in all eukaryotic cells. In human cells, this family consists of seven distinct but highly homologous 14-3-3 isoforms. 14-3-3σ is the only isoform directly linked to cancer in epithelial cells, which is regulated by major tumor suppressor genes. For each 14-3-3 isoform, we have 1,000 peptide motifs with experimental binding affinity values. In this paper, we present a novel method for identifying peptide motifs binding to 14-3-3σ isoform. First, we propose a sampling criteria to build a predictor for each new peptide sequence. Then, we select nine physicochemical properties of amino acids to describe each peptide motif. We also use auto-cross covariance to extract correlative properties of amino acids in any two positions. Finally, we consider elastic net to predict affinity values of peptide motifs, based on ridge regression and least absolute shrinkage and selection operator (LASSO). Our method tests on the 1,000 known peptide motifs binding to seven 14-3-3 isoforms. On the 14-3-3σ isoform, our method has overall pearson-product-moment correlation coefficient (PCC) and root mean squared error (RMSE) values of 0.84 and 252.31 for N-terminal sublibrary, and 0.77 and 269.13 for C-terminal sublibrary. We predict affinity values of 16,000 peptide sequences and relative binding ability across six permutated positions similar with experimental values. We identify phosphopeptides that preferentially bind to 14-3-3σ over other isoforms. Several positions on peptide motifs are in the same amino acid category with experimental substrate specificity of phosphopeptides binding to 14-3-3σ. Our method is fast and reliable and is a general computational method that can be used in peptide-protein binding identification in proteomics research. PMID:26828594
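    The regression step can be sketched with scikit-learn's elastic net on placeholder features standing in for the physicochemical and auto-cross covariance descriptors; the evaluation uses the same metrics reported in the paper (PCC and RMSE). The data below are synthetic, not the 14-3-3 peptide library.

```python
# Minimal elastic net (ridge + LASSO penalties) sketch on synthetic
# peptide-like feature vectors, evaluated with PCC and RMSE.
import numpy as np
from sklearn.linear_model import ElasticNet
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
X = rng.normal(size=(1000, 9 * 7))      # e.g. 9 properties x 7 positions (placeholder)
true_w = rng.normal(size=X.shape[1])
y = X @ true_w + rng.normal(scale=0.5, size=1000)   # synthetic affinity values

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
model = ElasticNet(alpha=0.1, l1_ratio=0.5, max_iter=5000).fit(X_tr, y_tr)
pred = model.predict(X_te)

pcc = np.corrcoef(pred, y_te)[0, 1]
rmse = np.sqrt(np.mean((pred - y_te) ** 2))
print(f"PCC = {pcc:.3f}, RMSE = {rmse:.3f}")
```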

  4. Tin-carbon clusters and the onset of microscopic level immiscibility: Experimental and computational study.

    PubMed

    Bernstein, J; Landau, A; Zemel, E; Kolodney, E

    2015-09-21

    We report the experimental observation and computational analysis of the binary tin-carbon gas phase species. These novel ionic compounds are generated by impact of C60(-) anions on a clean tin target at kinetic energies of a few kiloelectronvolts. Positive Sn(m)C(n)(+) (m = 1-12, 1 ≤ n ≤ 8) ions were detected mass spectrometrically following ejection from the surface. Impact-induced shattering of the C60(-) ion followed by sub-surface penetration of the resulting atomic carbon flux forces efficient mixing between target and projectile atoms even though the two elements (Sn/C) are completely immiscible in the bulk. This approach of C60(-) ion beam induced synthesis can be considered as an effective way for producing novel metal-carbon species of the so-called non-carbide forming elements, thus exploring the possible onset of molecular level miscibility in these systems. Sn2C2(+) was found to be the most abundant carbide cluster ion. Its instantaneous formation kinetics and its measured kinetic energy distribution while exiting the surface demonstrate a single impact formation/emission event (on the sub-ps time scale). Optimal geometries were calculated for both neutral and positively charged species using Born-Oppenheimer molecular dynamics for identifying global minima, followed by density functional theory (DFT) structure optimization and energy calculations at the coupled cluster singles, doubles and perturbative triples [CCSD(T)] level. The calculated structures reflect two distinct binding tendencies. The carbon-rich species exhibit polyynic/cumulenic nature (tin end capped carbon chains) while the more stoichiometrically balanced species have larger contributions of metal-metal bonding, sometimes resulting in distinct tin and carbon moieties attached to each other (segregated structures). The Sn2C(n) (n = 3-8) and Sn2C(n)(+) (n = 2-8) are polyynic/cumulenic while all neutral Sn(m)C(n) structures (m = 3-4) could be described as small tin clusters (dimer

  5. Demonstration of Cost-Effective, High-Performance Computing at Performance and Reliability Levels Equivalent to a 1994 Vector Supercomputer

    NASA Technical Reports Server (NTRS)

    Babrauckas, Theresa

    2000-01-01

    The Affordable High Performance Computing (AHPC) project demonstrated that high-performance computing based on a distributed network of computer workstations is a cost-effective alternative to vector supercomputers for running CPU and memory intensive design and analysis tools. The AHPC project created an integrated system called a Network Supercomputer. By connecting computer workstations through a network and utilizing the workstations when they are idle, the resulting distributed-workstation environment has the same performance and reliability levels as the Cray C90 vector supercomputer at less than 25 percent of the C90 cost. In fact, the cost comparison between a Cray C90 supercomputer and Sun workstations showed that the number of distributed networked workstations equivalent to a C90 costs approximately 8 percent of the C90.

  6. High-level GPU computing with jacket for MATLAB and C/C++

    NASA Astrophysics Data System (ADS)

    Pryor, Gallagher; Lucey, Brett; Maddipatla, Sandeep; McClanahan, Chris; Melonakos, John; Venugopalakrishnan, Vishwanath; Patel, Krunal; Yalamanchili, Pavan; Malcolm, James

    2011-06-01

    We describe a software platform for the rapid development of general purpose GPU (GPGPU) computing applications within the MATLAB computing environment, C, and C++: Jacket. Jacket provides thousands of GPU-tuned function syntaxes within MATLAB, C, and C++, including linear algebra, convolutions, reductions, and FFTs as well as signal, image, statistics, and graphics libraries. Additionally, Jacket includes a compiler that translates MATLAB and C++ code to CUDA PTX assembly and OpenGL shaders on demand at runtime. A facility is also included to compile a domain specific version of the MATLAB language to CUDA assembly at build time. Jacket includes the first parallel GPU FOR-loop construction and the first profiler for comparative analysis of CPU and GPU execution times. Jacket provides full GPU compute capability on CUDA hardware and limited, image processing focused compute on OpenGL/ES (2.0 and up) devices for mobile and embedded applications.

  7. Modeling Cardiac Electrophysiology at the Organ Level in the Peta FLOPS Computing Age

    SciTech Connect

    Mitchell, Lawrence; Bishop, Martin; Hoetzl, Elena; Neic, Aurel; Liebmann, Manfred; Haase, Gundolf; Plank, Gernot

    2010-09-30

    Despite a steep increase in available compute power, in-silico experimentation with highly detailed models of the heart remains challenging due to the high computational cost involved. It is hoped that next generation high performance computing (HPC) resources will lead to significant reductions in execution times to leverage a new class of in-silico applications. However, performance gains with these new platforms can only be achieved by engaging a much larger number of compute cores, necessitating strongly scalable numerical techniques. So far strong scalability has been demonstrated only for a moderate number of cores, orders of magnitude below the range required to achieve the desired performance boost. In this study, strong scalability of currently used techniques to solve the bidomain equations is investigated. Benchmark results suggest that scalability is limited to 512-4096 cores within the range of relevant problem sizes even when systems are carefully load-balanced and advanced IO strategies are employed.
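    The strong-scaling bookkeeping behind such benchmarks reduces to speedup and parallel efficiency relative to a baseline core count; the timings in the sketch below are invented for illustration.

```python
# Strong-scaling sketch: speedup and parallel efficiency relative to the
# smallest core count (hypothetical wall-clock times).
core_counts = [512, 1024, 2048, 4096, 8192]
runtimes_s = [100.0, 52.0, 28.0, 17.0, 14.0]

base_cores, base_time = core_counts[0], runtimes_s[0]
for cores, t in zip(core_counts, runtimes_s):
    speedup = base_time / t
    efficiency = speedup * base_cores / cores
    print(f"{cores:5d} cores: speedup x{speedup:5.2f}, efficiency {efficiency:6.1%}")
```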

  8. Identification of low level gamma-irradiation of meats by high sensitivity comet assay

    NASA Astrophysics Data System (ADS)

    Miyahara, Makoto; Saito, Akiko; Ito, Hitoshi; Toyoda, Masatake

    2002-03-01

    The detection of low levels of irradiation in meats (pork, beef, and chicken) using the new comet assay was investigated in order to assess the capability of the procedure. The new assay includes a process that improves its sensitivity to irradiation and a novel evaluation system for each slide (influence score and comet-type distribution). Samples used were purchased at retailers and were irradiated at 0.5 and 2 kGy at 0°C. The samples were processed to obtain comets. Slides were evaluated by typing comets, calculating the influence score and analyzing the comet-type distribution chart shown on the slide. Influence scores of beef, pork, and chicken at 0 kGy were 287 (SD=8.0), 305 (SD=12.9), and 320 (SD=21.0), respectively. Those at 500 Gy were 305 (SD=5.3), 347 (SD=10.6), and 364 (SD=12.6), respectively. Irradiation levels in food were successfully determined. Sensitivity to irradiation differed among samples (chicken>pork>beef).

  9. Oil contamination of fish in the North Sea. Determination of levels and identification of sources

    SciTech Connect

    Johnsen, S.; Restucci, R.; Klungsoyr, J.

    1996-12-31

    Two fish species, cod and haddock, have been sampled from five different regions in the Norwegian sector of the North Sea, the Haltenbanken and the Barents Sea. Three of the five sampling areas were located in regions with no local oil or gas production, while the remaining two areas represented regions with high density of oil and gas production fields. A total of 25 specimens of each of the two fish species were collected, and liver (all samples) and muscle (10 samples from each group) were analyzed for the content of total hydrocarbons (THC), selected aromatic compounds (NPD and PAH) and bicyclic aliphatic decalines. The present paper outlines the results of liver sample analyses from four of the sampled regions: the northern North Sea region and the three reference regions Egersundbanken, Haltenbanken and the Barents Sea. In general, no significant difference was observed between the hydrocarbon levels within the sampled regions. The only observed exception was a moderate, but significant increase in decaline levels in haddock liver from the northern North Sea region. The qualitative interpretation of the results showed that the sources of hydrocarbon contamination varied within the total sampling area. This observation indicates that the local discharge sources in areas with high petroleum production activity are the sources of hydrocarbons in fish from such areas. However, it was not possible to identify single discharges as a contamination source from the present results.

  10. Identification of risk factors for Campylobacter contamination levels on broiler carcasses during the slaughter process.

    PubMed

    Seliwiorstow, Tomasz; Baré, Julie; Berkvens, Dirk; Van Damme, Inge; Uyttendaele, Mieke; De Zutter, Lieven

    2016-06-01

    Campylobacter carcass contamination was quantified across the slaughter line during processing of Campylobacter positive batches. These quantitative data were combined together with information describing slaughterhouse and batch related characteristics in order to identify risk factors for Campylobacter contamination levels on broiler carcasses. The results revealed that Campylobacter counts are influenced by the contamination of incoming birds (both the initial external carcass contamination and the colonization level of caeca) and the duration of transport and holding time that can be linked with feed withdrawal period. In addition, technical aspects of the slaughter process such as a dump based unloading system, electrical stunning, lower scalding temperature, incorrect setting of plucking, vent cutter and evisceration machines were identified as risk factors associated with increased Campylobacter counts on processed carcasses. As such the study indicates possible improvements of the slaughter process that can result in better control of Campylobacter numbers under routine processing of Campylobacter positive batches without use of chemical or physical decontamination. Moreover, all investigated factors were existing variations of the routine processing practises and therefore proposed interventions are practically and economically achievable.

  11. Student Conceptions about the DNA Structure within a Hierarchical Organizational Level: Improvement by Experiment- and Computer-Based Outreach Learning

    ERIC Educational Resources Information Center

    Langheinrich, Jessica; Bogner, Franz X.

    2015-01-01

    As non-scientific conceptions interfere with learning processes, teachers need both, to know about them and to address them in their classrooms. For our study, based on 182 eleventh graders, we analyzed the level of conceptual understanding by implementing the "draw and write" technique during a computer-supported gene technology module.…

  12. The Influence of the Level of Free-Choice Learning Activities on the Use of an Educational Computer Game

    ERIC Educational Resources Information Center

    Barendregt, Wolmet; Bekker, Tilde M.

    2011-01-01

    Employing a mixed-method explorative approach, this study examined the in situ use of and opinions about an educational computer game for learning English introduced in three schools offering different levels of freedom to choose school activities. The results indicated that the general behaviour of the children with the game was very different…

  13. Computer Use Ethics among University Students and Staffs: The Influence of Gender, Religious Work Value and Organizational Level

    ERIC Educational Resources Information Center

    Mohamed, Norshidah; Karim, Nor Shahriza Abdul; Hussein, Ramlah

    2012-01-01

    Purpose: The purpose of this paper is to investigate the extent to which individual characteristics, which are gender, religious (Islamic) work value, and organization level (students and staff), are related to attitudes toward computer use ethics. This investigation is conducted in an academic setting in Malaysia, among those subscribing to the…

  14. POOLMS: A computer program for fitting and model selection for two level factorial replication-free experiments

    NASA Technical Reports Server (NTRS)

    Amling, G. E.; Holms, A. G.

    1973-01-01

    A computer program is described that performs a statistical multiple-decision procedure called chain pooling. It uses a number of mean squares assigned to error variance that is conditioned on the relative magnitudes of the mean squares. The model selection is done according to user-specified levels of type 1 or type 2 error probabilities.

  15. Analysis of the applicability of geophysical methods and computer modelling in determining groundwater level

    NASA Astrophysics Data System (ADS)

    Czaja, Klaudia; Matula, Rafal

    2014-05-01

    The paper presents an analysis of the possibilities of applying geophysical methods to the investigation of groundwater conditions. In this paper groundwater is defined as liquid water flowing through shallow aquifers. Groundwater conditions are described through the distribution of permeable layers (like sand, gravel, fractured rock) and impermeable or low-permeable layers (like clay, till, solid rock) in the subsurface. GPR (Ground Penetrating Radar), ERT (Electrical Resistivity Tomography), VES (Vertical Electric Soundings) and seismic reflection, refraction and MASW (Multichannel Analysis of Surface Waves) belong to non-invasive, surface geophysical methods. Due to differences in physical parameters like dielectric constant, resistivity, density and elastic properties between saturated and unsaturated zones it is possible to use geophysical techniques for groundwater investigations. A few programmes for GPR, ERT, VES and seismic modelling were applied in order to verify and compare results. The models differ in values of physical parameters such as dielectric constant, electrical conductivity, P- and S-wave velocity and density, layer thickness and the depth of occurrence of the groundwater level. Obtained results of computer modelling for GPR and seismic methods and the interpretation of test field measurements are presented. In all of these methods vertical resolution is the most important issue in groundwater investigations. This requires a proper measurement methodology, e.g. antennas with sufficiently high frequencies, a Wenner array in electrical surveys, and proper geometry for seismic studies. Seismic velocities of unconsolidated rocks like sand and gravel are strongly influenced by porosity and water saturation. No influence of water saturation degree on seismic velocities is observed below a value of about 90% water saturation. A further saturation increase leads to a strong increase of P-wave velocity and a slight decrease of S-wave velocity. But in case of few models only the

  16. Analysis and identification of two reconstituted tobacco sheets by three-level infrared spectroscopy

    NASA Astrophysics Data System (ADS)

    Wu, Xian-xue; Xu, Chang-hua; Li, Ming; Sun, Su-qin; Li, Jin-ming; Dong, Wei

    2014-07-01

    Two kinds of reconstituted tobacco (RT) from France (RTF) and China (RTC) were analyzed and identified by a three-level infrared spectroscopy method (Fourier-transform infrared spectroscopy (FT-IR) coupled with second derivative infrared spectroscopy (SD-IR) and two-dimensional infrared correlation spectroscopy (2D-IR)). The conventional IR spectra of RTF parallel samples were more consistent than those of RTC according to their overlapped parallel spectra and IR spectra correlation coefficients. FT-IR spectra of both RTs were similar in holistic spectral profile except for small differences around 1430 cm-1, indicating that they have similar chemical constituents. By analysis of SD-IR spectra of RTFs and RTCs, more distinct fingerprint features, especially peaks at 1106 (1110), 1054 (1059) and 877 (874) cm-1, were disclosed. Even better reproducibility of five SD-IR spectra of RTF in 1750-1400 cm-1 could be seen intuitively from their stacked spectra and could be confirmed by further similarity evaluation of SD-IR spectra. Existence of calcium carbonate and calcium oxalate could be easily observed in both RTs by comparing their spectra with references. Furthermore, the 2D-IR spectra provided obvious, vivid and intuitive differences between RTF and RTC. Both RTs had a pair of strong positive auto-peaks in 1600-1400 cm-1. Specifically, the autopeak at 1586 cm-1 in RTF was stronger than the one around 1421 cm-1, whereas the one at 1587 cm-1 in RTC was weaker than that at 1458 cm-1. Consequently, the RTs of two different brands were analyzed and identified thoroughly, and RTF had better homogeneity than RTC. As a result, the three-level infrared spectroscopy method has proved to be a simple, convenient and efficient method for rapid discrimination and homogeneity estimation of RT.
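    Two of the three levels can be sketched numerically: a Savitzky-Golay second derivative (SD-IR) to sharpen overlapped bands, and a synchronous 2D correlation map computed as the covariance of mean-centred spectra, following the generalized 2D correlation formalism. The spectra below are synthetic stand-ins for the RT measurements.

```python
# Sketch of SD-IR (Savitzky-Golay second derivative) and a synchronous
# 2D correlation map on a toy set of perturbation-dependent spectra.
import numpy as np
from scipy.signal import savgol_filter

wavenumber = np.linspace(1800, 800, 500)                    # cm^-1
spectra = np.array([
    np.exp(-((wavenumber - 1586) / 20) ** 2) * (1 + 0.1 * k)
    + np.exp(-((wavenumber - 1421) / 25) ** 2) * (1 - 0.05 * k)
    for k in range(5)
])

# second-derivative spectra sharpen overlapped bands (minimum at band centre)
sd_spectra = savgol_filter(spectra, window_length=15, polyorder=3, deriv=2, axis=1)
print("SD-IR minimum near:", wavenumber[np.argmin(sd_spectra[0])], "cm^-1")

# synchronous 2D correlation spectrum (covariance of the dynamic spectra)
dynamic = spectra - spectra.mean(axis=0)
synchronous = dynamic.T @ dynamic / (spectra.shape[0] - 1)
i1586 = np.argmin(np.abs(wavenumber - 1586))
print("auto-peak intensity near 1586 cm^-1:", synchronous[i1586, i1586])
```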

  17. Identification of genetic variants of lecithin cholesterol acyltransferase in individuals with high HDL‑C levels.

    PubMed

    Naseri, Mohsen; Hedayati, Mehdi; Daneshpour, Maryam Sadat; Bandarian, Fatemeh; Azizi, Fereidoun

    2014-07-01

    Among the most common lipid abnormalities, a low level of high-density lipoprotein-cholesterol (HDL‑C) is one of the first risk factors identified for coronary heart disease. Lecithin cholesterol acyltransferase (LCAT) has a pivotal role in the formation and maturation of HDL-C and in reverse cholesterol transport. To identify genetic loci associated with low HDL-C in a population-based cohort in Tehran, the promoter, coding regions and exon/intron boundaries of LCAT were amplified and sequenced in consecutive individuals (n=150) who had extremely low or high HDL-C levels but no other major lipid abnormalities. A total of 14 single-nucleotide polymorphisms (SNPs) were identified, of which 10 were found to be novel; the L393L, S232T and 16:67977696 C>A polymorphisms have been previously reported in the SNP Database (as rs5923, rs4986970 and rs11860115, respectively) and the non-synonymous R47M mutation has been reported in the Catalogue of Somatic Mutations in Cancer (COSM972635). Three of the SNPs identified in the present study (position 6,531 in exon 5, position 6,696 in exon 5 and position 5,151 in exon 1) led to an amino acid substitution. The most common variants were L393L (4886C/T) in exon 6 and Q177E, a novel mutation, in exon 5, and the prevalence of the heterozygous genotype of these two SNPs was significantly higher in the low HDL-C groups. Univariate conditional logistic regression odds ratios (ORs) were nominally significant for Q177E (OR, 5.64; P=0.02; 95% confidence interval, 1.2‑26.2). However, this finding was attenuated following adjustment for confounders. Further studies using a larger sample size may enhance the determination of the role of these SNPs. PMID:24789697

  18. Cyclic siloxanes in air, including identification of high levels in Chicago and distinct diurnal variation.

    PubMed

    Yucuis, Rachel A; Stanier, Charles O; Hornbuckle, Keri C

    2013-08-01

    The organosilicon compounds octamethylcyclotetrasiloxane (D4), decamethylcyclopentasiloxane (D5), and dodecamethylcyclohexasiloxane (D6) are high production volume chemicals that are widely used in household goods and personal care products. Due to their prevalence and chemical characteristics, cyclic siloxanes are being assessed as possible persistent organic pollutants. D4, D5, and D6 were measured in indoor and outdoor air to quantify and compare siloxane concentrations and compound ratios depending on location type. Indoor air samples had a median concentration of 2200 ng m(-3) for the sum of D4, D5, and D6. Outdoor sampling locations included downtown Chicago, Cedar Rapids, IA, and West Branch, IA, and had median sum siloxane levels of 280, 73, and 29 ng m(-3) respectively. A diurnal trend is apparent in the samples taken in downtown Chicago. Nighttime samples had a median 2.7 times higher on average than daytime samples, which is due, in part, to the fluctuations of the planetary boundary layer. D5 was the dominant siloxane in both indoor and outdoor air. Ratios of D5 to D4 averaged 91 and 3.2 for indoor and outdoor air respectively.

  19. A client-server software for the identification of groundwater vulnerability to pesticides at regional level.

    PubMed

    Di Guardo, Andrea; Finizio, Antonio

    2015-10-15

    The groundwater VULnerability to PESticide software system (VULPES) is a user-friendly, GIS-based and client-server software developed to identify areas vulnerable to pesticides at regional level making use of pesticide fate models. It is a Decision Support System aimed at assisting public policy makers to investigate areas sensitive to specific substances and to propose limitations of use or mitigation measures. VULPES identifies so-called Uniform Geographical Units (UGUs), which are areas characterised by the same agro-environmental conditions. In each UGU it applies the PELMO model, obtaining the 80th percentile of the substance concentration at 1 metre depth; then VULPES creates a vulnerability map in shapefile format which classifies the outputs by comparing them with the lower threshold, set to the legal limit concentration in groundwater (0.1 μg/l). This paper describes the software structure in detail and a case study with the application of the terbuthylazine herbicide on the Lombardy region territory. Three zones with different degrees of vulnerability have been identified and described. PMID:26047858
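    The classification step can be pictured with a short sketch: for each UGU, the 80th percentile of simulated concentrations at 1 m depth is compared with the 0.1 μg/l legal limit. The PELMO runs and the shapefile output are out of scope here, the concentrations are invented, and the class boundaries above the limit are assumptions, not VULPES's actual classes.

```python
# Sketch of classifying UGUs by the 80th percentile of simulated leaching
# concentrations against the 0.1 ug/l legal limit (data and classes invented).
import numpy as np

legal_limit = 0.1   # ug/l

ugu_concentrations = {
    "UGU-01": [0.01, 0.03, 0.02, 0.05, 0.04],
    "UGU-02": [0.08, 0.15, 0.12, 0.20, 0.09],
    "UGU-03": [0.30, 0.45, 0.25, 0.50, 0.40],
}

def vulnerability_class(p80):
    if p80 < legal_limit:
        return "not vulnerable"
    elif p80 < 10 * legal_limit:      # assumed intermediate boundary
        return "vulnerable"
    return "highly vulnerable"

for ugu, conc in ugu_concentrations.items():
    p80 = np.percentile(conc, 80)
    print(f"{ugu}: 80th percentile = {p80:.3f} ug/l -> {vulnerability_class(p80)}")
```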

  20. Electrical load models: Obtention and application to the identification of end uses at low aggregation levels

    NASA Astrophysics Data System (ADS)

    Fuentes Moreno, Juan Alvaro

    The knowledge of the load composition is of great interest to utilities as well as users and researchers. In this thesis a method that tries to obtain information about the load composition has been developed. Among its main characteristics are: it is non-intrusive, it can work continuously in real time and it can be used only at a low aggregation level. The load composition is obtained by means of measuring, for each phase, the voltage and total current waveform at the secondary of a distribution transformer. After this stage has been done, the total current absorbed, per phase, is decomposed into a linear combination of predicted currents of loads present in our database of single electrical loads. These single electrical loads have been previously measured in the laboratory and their behaviour, under a distorted voltage supply, modelled. The model used for these individual loads is a generalization of the crossed frequency admittance matrix, a black box type model which tries to capture the relationship between the harmonics of the supplied voltage and the absorbed harmonic currents. The performance of the method has been evaluated in laboratory tests as well as in a real application, showing that, if the individual loads are grouped into groups of loads with similar electrical behaviour, then the results obtained can be representative of these groups.
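    The decomposition step can be sketched as a non-negative least-squares fit of the aggregate current to a library of predicted per-load currents; the harmonic signatures below are made up and stand in for the outputs of the crossed frequency admittance models.

```python
# Sketch: express the measured aggregate current as a non-negative linear
# combination of predicted per-load current waveforms (invented signatures).
import numpy as np
from scipy.optimize import nnls

t = np.linspace(0, 0.02, 400, endpoint=False)        # one 50 Hz cycle
w = 2 * np.pi * 50

# predicted current waveforms of three library loads (columns)
library = np.column_stack([
    np.sin(w * t),                                    # nearly linear load
    np.sin(w * t) + 0.3 * np.sin(3 * w * t),          # load with 3rd harmonic
    np.sin(w * t) + 0.2 * np.sin(5 * w * t + 0.4),    # load with 5th harmonic
])

# aggregate current: 2 units of load 0 and 1.5 units of load 2, plus noise
rng = np.random.default_rng(3)
total = 2.0 * library[:, 0] + 1.5 * library[:, 2] + rng.normal(scale=0.02, size=t.size)

weights, residual = nnls(library, total)
print("estimated load weights:", np.round(weights, 2), "residual:", round(residual, 3))
```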

  1. Identification of a gene, FMP21, whose expression levels are involved in thermotolerance in Saccharomyces cerevisiae

    PubMed Central

    2014-01-01

    Elucidation of the mechanism of high temperature tolerance in yeasts is important for the molecular breeding of high temperature-tolerant yeasts that can be used in bioethanol production. We identified genes whose expression is correlated with the degree of thermotolerance in Saccharomyces cerevisiae by DNA microarray analysis. Gene expression profiles of three S. cerevisiae strains showing different levels of thermotolerance were compared, and three candidate genes were chosen. Among these genes, FMP21 was investigated as a thermotolerance-related gene in S. cerevisiae by comparing the growth at high temperature with the gene expression in eight strains. The expression ratio of FMP21 at 37°C was correlated with the doubling time ratio with a coefficient of determination of 0.787. The potential involvement of Fmp21 in the thermotolerance of yeasts was evaluated. The FMP21 deletion variant showed a decreased respiratory growth rate and increased thermosensitivity. Furthermore, the overexpression of FMP21 improved thermotolerance in yeasts. In conclusion, the function of Fmp21 is important for thermotolerance in yeasts. PMID:25177541
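    The reported correlation is a coefficient of determination between two ratios across strains; a minimal sketch with invented values for eight strains:

```python
# Coefficient of determination (R^2) between FMP21 expression ratios and
# doubling-time ratios across strains; the eight values below are invented.
import numpy as np

expression_ratio = np.array([1.0, 1.2, 1.5, 1.8, 2.0, 2.3, 2.6, 3.0])
doubling_time_ratio = np.array([1.0, 1.1, 1.4, 1.6, 2.1, 2.2, 2.4, 3.1])

r = np.corrcoef(expression_ratio, doubling_time_ratio)[0, 1]
print(f"R^2 = {r**2:.3f}")   # the study reports R^2 = 0.787 for its eight strains
```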

  2. Identification of APC mutations and evaluation of their expression level using a functional screening assay

    SciTech Connect

    Varesco, L.; Gismondi, V.; Bafico, A.

    1994-09-01

    A functional screen for chain-terminating mutations in the APC gene recently has been developed. It is based on the PCR and cloning of a segment of the gene in-frame with a colorimetric marker gene (lacz) followed by screening for the level of activity of the marker polypeptide (beta-galactosidase). This method scores colony number with different blue colors that are produced by bacteria containing normal and mutant APC segments. In the present work this method was used to screen the entire APC coding region by using eight primer pairs. DNA segments with known APC mutations at different positions in the gene were used as controls and were clearly identifiable with this assay. In addition, the entire APC coding region has been examined in 21 APC patients in whom PCR-SSCP did not identify an APC mutation. Novel mutations (n=14) were identified by the blue/white assay and were all confirmed by sequence analysis. This method also was used to quantitate the expression of paternal and maternal APC alleles taking advantage of an RsaI site polymorphism at position 1458 in a small number of informative individuals. Differential expression of some known mutant APC mRNAs was observed.

  3. Identification of the deep level defects in AlN single crystals by electron paramagnetic resonance

    NASA Astrophysics Data System (ADS)

    Soltamov, V. A.; Ilyin, I. V.; Soltamova, A. A.; Mokhov, E. N.; Baranov, P. G.

    2010-06-01

    Electron paramagnetic resonance (EPR) at 9.4 and 35 GHz was studied on two types of AlN single crystals grown by a sublimation sandwich method. These investigations revealed the presence of transition-metal impurities in the first sample: Fe2+ (S = 2) and some paramagnetic centers with S = 3/2, for which we suggest Cr3+ or Ni3+ as possible candidates. The EPR spectra of Fe2+ were observed up to room temperature. After sample illumination at 5 K with light (wavelength shorter than 700 nm), a strong EPR signal with the g factor of shallow donors (SDs) and a slightly anisotropic linewidth appears. This light-induced EPR signal, once excited at low temperature, still persists after switching off the light and remains roughly constant up to 30 K, then drops quickly. SDs show a negative correlation energy U, and oxygen in the N position (ON) is the most probable model. EPR spectra of a deep-donor center, assumed to be the nitrogen vacancy VN, were observed in the second sample. X-ray irradiation leads to a considerable enhancement of the deep-donor (VN) signal intensity. Annealing resulted in recombination thermoluminescence, and the deep-donor (VN) energy level was estimated to be about 0.5 eV. The models of shallow (ON) and deep (VN) donor centers were supported by comprehensive hyperfine structure analysis.

  4. Prevention of secondary conditions in fetal alcohol spectrum disorders: identification of systems-level barriers.

    PubMed

    Petrenko, Christie L M; Tahir, Naira; Mahoney, Erin C; Chin, Nancy P

    2014-08-01

    Fetal alcohol spectrum disorders (FASD) impact 2-5% of the US population and are associated with life-long cognitive and behavioral impairments. Individuals with FASD have high rates of secondary conditions, including mental health problems, school disruptions, and trouble with the law. This study focuses on systems-level barriers that contribute to secondary conditions and interfere with prevention and treatment. Using a phenomenological methodology, semi-structured interviews and focus groups were conducted with parents of children with FASD and service providers. Data were analyzed using a framework approach. Participants emphasized the pervasive lack of knowledge of FASD throughout multiple systems. This lack of knowledge contributes to multi-system barriers including delayed diagnosis, unavailability of services, and difficulty qualifying for, implementing, and maintaining services. FASD is a major public health problem. Broad system changes using a public health approach are needed to increase awareness and understanding of FASD, improve access to diagnostic and therapeutic services, and create responsive institutional policies to prevent secondary conditions. These changes are essential to improve outcomes for individuals with FASD and their families and facilitate dissemination of empirically supported interventions.

  5. A client-server software for the identification of groundwater vulnerability to pesticides at regional level.

    PubMed

    Di Guardo, Andrea; Finizio, Antonio

    2015-10-15

    The groundwater VULnerability to PESticide software system (VULPES) is a user-friendly, GIS-based and client-server software developed to identify areas vulnerable to pesticides at regional level making use of pesticide fate models. It is a Decision Support System aimed at assisting public policy makers to investigate areas sensitive to specific substances and to propose limitations of use or mitigation measures. VULPES identifies so-called Uniform Geographical Units (UGUs), which are areas characterised by the same agro-environmental conditions. In each UGU it applies the PELMO model, obtaining the 80th percentile of the substance concentration at 1 metre depth; then VULPES creates a vulnerability map in shapefile format which classifies the outputs by comparing them with the lower threshold, set to the legal limit concentration in groundwater (0.1 μg/l). This paper describes the software structure in detail and a case study with the application of the terbuthylazine herbicide on the Lombardy region territory. Three zones with different degrees of vulnerability have been identified and described.

  6. Feature-level signal processing for near-real-time odor identification

    NASA Astrophysics Data System (ADS)

    Roppel, Thaddeus A.; Padgett, Mary Lou; Waldemark, Joakim T. A.; Wilson, Denise M.

    1998-09-01

    Rapid detection and classification of odor is of particular interest in applications such as manufacturing of consumer items, food processing, drug and explosives detection, and battlefield situation assessment. Various detection and classification techniques are under investigation so that end users can have access to useful information from odor sensor arrays in near-real-time. Feature-level data clustering and classification techniques are proposed that are (1) parallelizable to permit efficient hardware implementation, (2) adaptable to readily incorporate new data classes, (3) capable of gracefully handling outlier data points and failed sensor conditions, and (4) can provide confidence intervals and/or a traceable decision record along with each classification to permit validation and verification. Results from using specific techniques will be presented and compared. The techniques studied include principal components analysis, automated outlier determination, radial basis functions (RBF), multi-layer perceptrons (MLP), and pulse-coupled neural networks (PCNN). The results reported here are based on data from a testbed in which a gas sensor array is exposed to odor samples on a continuous basis. We have reported previously that more detailed and faster discrimination can be obtained by using sensor transient response in addition to steady state response. As the size of the data set grows we are able to more accurately model performance of a sensor array under realistic conditions.
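    One of the listed feature-level pipelines, principal components analysis followed by a multi-layer perceptron, can be sketched on synthetic sensor-array responses; real steady-state and transient features from the testbed would replace the placeholder data.

```python
# Sketch of a PCA + MLP odor-classification pipeline on synthetic
# sensor-array response vectors (three odor classes, 16 sensors).
import numpy as np
from sklearn.decomposition import PCA
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(7)
n_sensors, n_per_class = 16, 100
centers = rng.normal(scale=2.0, size=(3, n_sensors))          # three odor classes
X = np.vstack([c + rng.normal(scale=0.5, size=(n_per_class, n_sensors)) for c in centers])
y = np.repeat([0, 1, 2], n_per_class)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = make_pipeline(PCA(n_components=5),
                    MLPClassifier(hidden_layer_sizes=(20,), max_iter=2000, random_state=0))
clf.fit(X_tr, y_tr)
print("held-out accuracy:", clf.score(X_te, y_te))
```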

  7. Identification of high-level functional/system requirements for future civil transports

    NASA Technical Reports Server (NTRS)

    Swink, Jay R.; Goins, Richard T.

    1992-01-01

    In order to accommodate the rapid growth in commercial aviation throughout the remainder of this century, the Federal Aviation Administration (FAA) is faced with a formidable challenge to upgrade and/or modernize the National Airspace System (NAS) without compromising safety or efficiency. A recurring theme in both the Aviation System Capital Investment Plan (CIP), which has replaced the NAS Plan, and the new FAA Plan for Research, Engineering, and Development (RE&D) is the reliance on the application of new technologies and a greater use of automation. Identifying the high-level functional and system impacts of such modernization efforts on future civil transport operational requirements, particularly in terms of cockpit functionality and information transfer, was the primary objective of this project. The FAA planning documents for the NAS of the 2005 era and beyond were surveyed; major aircraft functional capabilities and system components required for such an operating environment were identified. A hierarchical structured analysis of the information processing and flows emanating from such functional/system components was conducted and the results were documented in graphical form depicting the relationships between functions and systems.

  8. Cyclic siloxanes in air, including identification of high levels in Chicago and distinct diurnal variation

    PubMed Central

    Yucuis, Rachel A.; Stanier, Charles O.; Hornbuckle, Keri C.

    2014-01-01

    The organosilicon compounds octamethylcyclotetrasiloxane (D4), decamethylcyclopentasiloxane (D5), and dodecamethylcyclohexasiloxane (D6) are high production volume chemicals that are widely used in household goods and personal care products. Due to their prevalence and chemical characteristics, cyclic siloxanes are being assessed as possible persistent organic pollutants. D4, D5, and D6 were measured in indoor and outdoor air to quantify and compare siloxane concentrations and compound ratios depending on location type. Indoor air samples had a median concentration of 2200 ng m−3 for the sum of D4, D5, and D6. Outdoor sampling locations included downtown Chicago, Cedar Rapids, IA, and West Branch, IA, and had median sum siloxane levels of 280, 73, and 29 ng m−3 respectively. A diurnal trend is apparent in the samples taken in downtown Chicago. Nighttime samples had a median 2.7 times higher on average than daytime samples, which is due, in part, to the fluctuations of the planetary boundary layer. D5 was the dominant siloxane in both indoor and outdoor air. Ratios of D5 to D4 averaged 91 and 3.2 for indoor and outdoor air respectively. PMID:23541357

  9. Computer vision-based sorting of Atlantic salmon (Salmo salar) fillets according to their color level.

    PubMed

    Misimi, E; Mathiassen, J R; Erikson, U

    2007-01-01

    A computer vision method was used to evaluate the color of Atlantic salmon (Salmo salar) fillets. Computer vision-based sorting of fillets according to their color was studied on 2 separate groups of salmon fillets. The images of the fillets were captured using a high-resolution digital camera. Images of salmon fillets were then segmented into regions of interest and analyzed in red, green, and blue (RGB) and CIE lightness, redness, and yellowness (Lab) color spaces, and classified according to the Roche color card industrial standard. Comparisons between visual evaluations made by a panel of human inspectors according to the Roche SalmoFan lineal standard and the color scores generated by the computer vision algorithm showed that there were no significant differences between the methods. Overall, computer vision can be used as a powerful tool to sort fillets by color in a fast and nondestructive manner. The low cost of implementing computer vision solutions creates the potential to replace manual labor in fish processing plants with automation.
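    The color-scoring idea can be sketched by averaging a fillet region in CIE Lab space (here via scikit-image's rgb2lab) and assigning the nearest score from a reference table; the image patch and the reference Lab values below are synthetic placeholders, not the actual SalmoFan calibration.

```python
# Sketch: mean L*a*b* color of a (synthetic) fillet region, assigned to the
# nearest entry in a hypothetical color-card reference table.
import numpy as np
from skimage.color import rgb2lab

rng = np.random.default_rng(5)
patch = np.clip(np.array([0.95, 0.45, 0.30])
                + rng.normal(scale=0.03, size=(64, 64, 3)), 0, 1)   # salmon-like RGB patch

mean_lab = rgb2lab(patch).reshape(-1, 3).mean(axis=0)

# hypothetical reference Lab values for a few color-card scores
references = {
    26: np.array([62.0, 35.0, 30.0]),
    28: np.array([58.0, 40.0, 34.0]),
    30: np.array([54.0, 45.0, 38.0]),
}
score = min(references, key=lambda s: np.linalg.norm(mean_lab - references[s]))
print("mean L*a*b*:", np.round(mean_lab, 1), "-> assigned color score:", score)
```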

  10. Production control system: A resource management and load leveling system for mainframe computers

    SciTech Connect

    Wood, R.R.

    1994-03-01

    The Livermore Computing Production Control System, commonly called the PCS, is described in this paper. The intended audiences for this document are system administrators and resource managers of computer systems. In 1990, the Livermore Computing Center committed itself to converting its supercomputer operations from the New Livermore TimeSharing System (NLTSS) to UNIX-based systems. This was a radical change for Livermore Computing in that over thirty years had elapsed during which the only operating environments used on production platforms were LTSS and then later NLTSS. Elaborate, and sometimes obscure, paradigms (to others) were developed to help the lab's scientists productively use the machines and to accurately account for their use to government oversight agencies. Resource management is difficult in typical UNIX and related systems. The weakness involves the allocation, control and delivery of the usage of computer resources. This is a result of the origins and subsequent development of UNIX, which started as a system for small platforms in a collegial or departmental atmosphere without stringent or complex budgetary requirements. Accounting also is weak, being generally limited to reporting that a user used an amount of time on a process. A process accounting record is made only after the completion of a process, and then only if the process does not terminate as a result of a system "panic."

  11. Experimental observation and computational identification of Sc@Cu16+, a stable dopant-encapsulated copper cage

    SciTech Connect

    Veldeman, Nele; Neukermans, Sven; Lievens, Peter; Hoeltzl, Tibor; Minh Tho Nguyen; Veszpremi, Tamas

    2007-07-15

    We report a combined experimental and computational study of scandium doped copper clusters. The clusters are studied with time-of-flight mass spectrometry after laser fragmentation. Enhanced stabilities for specific cluster sizes in the mass abundance spectra are discussed using both electronic (shell closing) and geometric (symmetry) arguments. The exceptional stability observed for Cu16Sc+ is investigated in detail computationally. Density functional geometry optimizations at the Becke-Perdew 1986-LANL 2-double-zeta (BP86/LANL2DZ) level result in a Frank-Kasper tetrahedron, encapsulating a scandium atom in a highly coordinated position. The high stability is therefore interpreted in terms of extremely stable dopant encapsulated structures featuring a closed electron shell. The thermodynamic stability, as indicated by the stable backbone and large binding energy per atom, the relatively small ionization energy, and the moderate electron affinity of the neutral Cu16Sc cluster show that it has a superatom character, chemically similar to the alkaline-metal atoms.

  12. Analysis of lidar elevation data for improved identification and delineation of lands vulnerable to sea-level rise

    USGS Publications Warehouse

    Gesch, D.B.

    2009-01-01

    The importance of sea-level rise in shaping coastal landscapes is well recognized within the earth science community, but as with many natural hazards, communicating the risks associated with sea-level rise remains a challenge. Topography is a key parameter that influences many of the processes involved in coastal change, and thus, up-to-date, high-resolution, high-accuracy elevation data are required to model the coastal environment. Maps of areas subject to potential inundation have great utility to planners and managers concerned with the effects of sea-level rise. However, most of the maps produced to date are simplistic representations derived from older, coarse elevation data. In the last several years, vast amounts of high quality elevation data derived from lidar have become available. Because of their high vertical accuracy and spatial resolution, these lidar data are an excellent source of up-to-date information from which to improve identification and delineation of vulnerable lands. Four elevation datasets of varying resolution and accuracy were processed to demonstrate that the improved quality of lidar data leads to more precise delineation of coastal lands vulnerable to inundation. A key component of the comparison was to calculate and account for the vertical uncertainty of the elevation datasets. This comparison shows that lidar allows for a much more detailed delineation of the potential inundation zone when compared to other types of elevation models. It also shows how the certainty of the delineation of lands vulnerable to a given sea-level rise scenario is much improved when derived from higher resolution lidar data. © 2009 Coastal Education and Research Foundation.
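    The core calculation the abstract describes, flagging cells whose elevation may lie at or below a sea-level-rise scenario once vertical uncertainty is accounted for, can be sketched briefly. This is a hedged illustration only: the function name and numbers are ours, and it ignores hydrologic connectivity and datum transformations that a real delineation would need.

```python
# Minimal sketch, not the USGS workflow: flag DEM cells potentially inundated under a
# sea-level-rise scenario, widening the zone by the dataset's vertical RMSE.
import numpy as np

def inundation_mask(dem: np.ndarray, slr_m: float, rmse_m: float,
                    z: float = 1.96) -> np.ndarray:
    """Return True where a cell may be inundated at the chosen confidence level.

    dem    : elevation grid in metres above the tidal datum
    slr_m  : sea-level-rise scenario (m)
    rmse_m : vertical RMSE of the elevation dataset (m)
    z      : z-score for the confidence level (1.96 ~ 95 %)
    """
    # A cell is flagged if its elevation minus the uncertainty margin
    # falls at or below the projected water level.
    return (dem - z * rmse_m) <= slr_m

# Example: a coarse DEM (RMSE ~ 2.2 m) flags far more land than lidar (RMSE ~ 0.15 m).
dem = np.array([[0.5, 1.2], [2.8, 4.0]])
print(inundation_mask(dem, slr_m=1.0, rmse_m=0.15))
print(inundation_mask(dem, slr_m=1.0, rmse_m=2.2))
```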

  13. Differential Expression Levels of Integrin α6 Enable the Selective Identification and Isolation of Atrial and Ventricular Cardiomyocytes

    PubMed Central

    Wiencierz, Anne Maria; Kernbach, Manuel; Ecklebe, Josephine; Monnerat, Gustavo; Tomiuk, Stefan; Raulf, Alexandra; Christalla, Peter; Malan, Daniela; Hesse, Michael; Bosio, Andreas; Fleischmann, Bernd K.; Eckardt, Dominik

    2015-01-01

    Rationale Central questions such as cardiomyocyte subtype emergence during cardiogenesis or the availability of cardiomyocyte subtypes for cell replacement therapy require selective identification and purification of atrial and ventricular cardiomyocytes. However, current methodologies do not allow for a transgene-free selective isolation of atrial or ventricular cardiomyocytes due to the lack of subtype specific cell surface markers. Methods and Results In order to develop cell surface marker-based isolation procedures for cardiomyocyte subtypes, we performed an antibody-based screening on embryonic mouse hearts. Our data indicate that atrial and ventricular cardiomyocytes are characterized by differential expression of integrin α6 (ITGA6) throughout development and in the adult heart. We discovered that the expression level of this surface marker correlates with the intracellular subtype-specific expression of MLC-2a and MLC-2v on the single cell level and thereby enables the discrimination of cardiomyocyte subtypes by flow cytometry. Based on the differential expression of ITGA6 in atria and ventricles during cardiogenesis, we developed purification protocols for atrial and ventricular cardiomyocytes from mouse hearts. Atrial and ventricular identities of sorted cells were confirmed by expression profiling and patch clamp analysis. Conclusion Here, we introduce a non-genetic, antibody-based approach to specifically isolate highly pure and viable atrial and ventricular cardiomyocytes from mouse hearts of various developmental stages. This will facilitate in-depth characterization of the individual cellular subsets and support translational research applications. PMID:26618511

  14. Student conceptions about the DNA structure within a hierarchical organizational level: Improvement by experiment- and computer-based outreach learning.

    PubMed

    Langheinrich, Jessica; Bogner, Franz X

    2015-01-01

    As non-scientific conceptions interfere with learning processes, teachers need both to know about them and to address them in their classrooms. For our study, based on 182 eleventh graders, we analyzed the level of conceptual understanding by implementing the "draw and write" technique during a computer-supported gene technology module. A specific feature of our study was that participants were given the hierarchical organizational level that they had to draw. We introduced two objective category systems for analyzing the drawings and inscriptions. Our results indicated a long-term as well as a short-term increase in the level of conceptual understanding and in the number of drawn elements and their grades concerning the DNA structure. Consequently, we regard the "draw and write" technique as a tool that teachers can use to learn about students' alternative conceptions. Furthermore, our study points to the modification potential of hands-on and computer-supported learning modules.

  15. Developing Strategic and Reasoning Abilities with Computer Games at Primary School Level

    ERIC Educational Resources Information Center

    Bottino, R. M.; Ferlino, L.; Ott, M.; Tavella, M.

    2007-01-01

    The paper reports a small-scale, long-term pilot project designed to foster strategic and reasoning abilities in young primary school pupils by engaging them in a number of computer games, mainly those usually called mind games (brainteasers, puzzlers, etc.). In this paper, the objectives, work methodology, experimental setting, and tools used in…

  16. Interaction Between Conceptual Level and Training Method in Computer Based Problem Solving.

    ERIC Educational Resources Information Center

    Sustik, Joan M.; Brown, Bobby R.

    A study of 130 undergraduate students enrolled in a course on audiovisual techniques sought to determine whether heuristic or algorithmic computer-based problem solving training would be differentially effective for students varying in cognitive complexity as measured by the Educational Set Scale (ESS). The interaction was investigated between one…

  17. Success in Institutionalizing Basic Computer Skills Courses at a Community College Level.

    ERIC Educational Resources Information Center

    Dodge, Lucy

    This article outlines the development of basic computer literacy skills courses under the auspices of the Title III Grant awarded to San Jose City College (SJCC) of San Jose, California by the United States Department of Education (Grant no. PO31A980093, Strengthening Institutions, 1998-2003). The grant has been in effect for 3 years, and grant…

  18. Photochemistry of Naphthopyrans and Derivatives: A Computational Experiment for Upper-Level Undergraduates or Graduate Students

    ERIC Educational Resources Information Center

    Castet, Frédéric; Méreau, Raphaël; Liotard, Daniel

    2014-01-01

    In this computational experiment, students use advanced quantum chemistry tools to simulate the photochromic reaction mechanism in naphthopyran derivatives. The first part aims to make students familiar with excited-state reaction mechanisms and addresses the photoisomerization of the benzopyran molecule by means of semiempirical quantum chemical…

  19. Introducing Creativity in a Design Laboratory for a Freshman Level Electrical and Computer Engineering Course

    ERIC Educational Resources Information Center

    Burkett, Susan L.; Kotru, Sushma; Lusth, John C.; McCallum, Debra; Dunlap, Sarah

    2014-01-01

    In the electrical and computer engineering (ECE) curriculum at The University of Alabama, freshmen are introduced to fundamental electrical concepts and units, DC circuit analysis techniques, operational amplifiers, circuit simulation, design, and professional ethics. The two credit course has both…

  20. Computers and Traditional Teaching Practices: Factors Influencing Middle Level Students' Science Achievement and Attitudes about Science

    ERIC Educational Resources Information Center

    Odom, Arthur Louis; Marszalek, Jacob M.; Stoddard, Elizabeth R.; Wrobel, Jerzy M.

    2011-01-01

    The purpose of this study was to examine the association of middle school student science achievement and attitudes toward science with student-reported frequency of using computers to learn science and other classroom practices. Baseline comparison data were collected on the frequency of student-centred teaching practices (e.g. the use of group…

  1. Assessing the Computational Literacy of Elementary Students on a National Level in Korea

    ERIC Educational Resources Information Center

    Jun, SooJin; Han, SunGwan; Kim, HyeonCheol; Lee, WonGyu

    2014-01-01

    Information and communication technology (ICT) literacy education has become an important issue, and the necessity of computational literacy (CL) has been increasing in our growing information society. CL is becoming an important element for future talents, and many countries, including the USA, are developing programs for CL education.…

  2. Investigating the Relationship between Curiosity Level and Computer Self Efficacy Beliefs of Elementary Teachers Candidates

    ERIC Educational Resources Information Center

    Gulten, Dilek Cagirgan; Yaman, Yavuz; Deringol, Yasemin; Ozsari, Ismail

    2011-01-01

    Nowadays, the concept of the "lifelong learning individual" is gaining importance; curiosity is one important feature that such an individual should have as a requirement of learning. It is known that learning occurs naturally and spontaneously when the curiosity instinct is awakened during any learning-teaching process. Computer self-efficacy belief is…

  3. Learner Control over Full and Lean Computer-based Instruction under Differing Ability Levels.

    ERIC Educational Resources Information Center

    Schnackenberg, Heidi L.; Sullivan, Howard J.

    2000-01-01

    Examines the effects of type of instructional control (learner control versus program control) and program mode (full versus lean) on the achievement, option use, time in the program, and attitudes of higher-ability and lower-ability university students using a computer-delivered instructional program. Discusses implications for instructional…

  4. Learner Control over Full and Lean Computer-Based Instruction under Differing Ability Levels.

    ERIC Educational Resources Information Center

    Schnackenberg, Heidi L.

    Undergraduate education majors in a teacher preparation program completed a computer-assisted instructional program for a study designed to examine the effects of type of instructional control and program mode on the achievement, option use, time spent on program, and attitudes of higher and lower ability students. Students were assigned to high…

  5. Teaching Computer Languages and Elementary Theory for Mixed Audiences at University Level

    ERIC Educational Resources Information Center

    Christiansen, Henning

    2004-01-01

    Theoretical issues of computer science are traditionally taught in a way that presupposes a solid mathematical background and are usually considered more or less inaccessible for students without this. An effective methodology is described which has been developed for a target group of university students with different backgrounds such as natural…

  6. Examining the Use of Computer Algebra Systems in University-Level Mathematics Teaching

    ERIC Educational Resources Information Center

    Lavicza, Zsolt

    2009-01-01

    The use of Computer Algebra Systems (CAS) is becoming increasingly important and widespread in mathematics research and teaching. In this paper, I will report on a questionnaire study enquiring about mathematicians' use of CAS in mathematics teaching in three countries; the United States, the United Kingdom, and Hungary. Based on the responses…

  7. The Effects of Computer-Assisted Material on Students' Cognitive Levels, Misconceptions and Attitudes Towards Science

    ERIC Educational Resources Information Center

    Cepni, Salih; Tas, Erol; Kose, Sacit

    2006-01-01

    The purpose of this study was to investigate the effects of a Computer-Assisted Instruction Material (CAIM) related to the "photosynthesis" topic on students' cognitive development, misconceptions and attitudes. The study was conducted in the 2002-2003 academic year and was carried out in two different classes taught by the same teacher, in which there were…

  8. Computer-Mediated Communication: Promoting Learner Autonomy and Intercultural Understanding at Secondary Level

    ERIC Educational Resources Information Center

    Fisher, Linda; Evans, Michael; Esch, Edith

    2004-01-01

    The use of Computer-Mediated Communication (CMC) has been hailed as a solution to the problem of access to native speakers for language learners. This project was devised to investigate whether regular and structured use of email, here via a bulletin board, might enhance learners' study of French, with regard to developing learner autonomy and…

  9. Computer Assisted Vocational Math. Written for TRS-80, Model I, Level II, 16K.

    ERIC Educational Resources Information Center

    Daly, Judith; And Others

    This computer-assisted curriculum is intended to be used to enhance a vocational mathematics/applied mathematics course. A total of 32 packets were produced to increase the basic mathematics skills of students in the following vocational programs: automotive trades, beauty culture, building trades, climate control, electrical trades,…

  10. A COMPUTATIONALLY BASED IDENTIFICATION ALGORITHM FOR ESTROGEN RECEPTOR LIGANDS: PART 2. EVALUATION OF A HERA BINDING AFFINITY MODEL

    EPA Science Inventory

    The common reactivity pattern (COREPA) approach is a 3-dimensional, quantitative structure activity relationship (3-D QSAR) technique that permits identification and quantification of specific global and local stereoelectronic characteristics associated with a chemical's biologic...

  11. Myocardial signal density levels and beam-hardening artifact attenuation using dual-energy computed tomography.

    PubMed

    Rodriguez-Granillo, Gaston A; Carrascosa, Patricia; Cipriano, Silvina; de Zan, Macarena; Deviggiano, Alejandro; Capunay, Carlos; Cury, Ricardo C

    2015-01-01

    The assessment of myocardial perfusion using single-energy (SE) imaging is influenced by beam-hardening artifacts (BHA). We sought to explore the ability of dual-energy (DE) imaging to attenuate the presence of BHA. Myocardial signal density (SD) was evaluated in 2240 myocardial segments (112 for each energy level) and in 320 American Heart Association segments among the SE group. Compared to DE reconstructions at the best energy level, SE acquisitions showed no significant differences overall regarding myocardial SD or signal-to-noise ratio. The segments most commonly affected by BHA showed significantly lower myocardial SD at the lowest energy levels, progressively normalizing at higher energy levels.

  12. The effect on cadaver blood DNA identification by the use of targeted and whole body post-mortem computed tomography angiography.

    PubMed

    Rutty, Guy N; Barber, Jade; Amoroso, Jasmin; Morgan, Bruno; Graham, Eleanor A M

    2013-12-01

    Post-mortem computed tomography angiography (PMCTA) involves the injection of contrast agents. This could have both a dilution effect on biological fluid samples and could affect subsequent post-contrast analytical laboratory processes. We undertook a small sample study of 10 targeted and 10 whole body PMCTA cases to consider whether or not these two methods of PMCTA could affect post-PMCTA cadaver blood based DNA identification. We used standard methodology to examine DNA from blood samples obtained before and after the PMCTA procedure. We illustrate that neither of these PMCTA methods had an effect on the alleles called following short tandem repeat based DNA profiling, and therefore the ability to undertake post-PMCTA blood based DNA identification.

  13. Achieving production-level use of HEP software at the Argonne Leadership Computing Facility

    NASA Astrophysics Data System (ADS)

    Uram, T. D.; Childers, J. T.; LeCompte, T. J.; Papka, M. E.; Benjamin, D.

    2015-12-01

    HEP's demand for computing resources has grown beyond the capacity of the Grid, and these demands will accelerate with the higher energy and luminosity planned for Run II. Mira, the ten petaFLOPs supercomputer at the Argonne Leadership Computing Facility, is a potentially significant compute resource for HEP research. Through an award of fifty million hours on Mira, we have delivered millions of events to LHC experiments by establishing the means of marshaling jobs through serial stages on local clusters, and parallel stages on Mira. We are running several HEP applications, including Alpgen, Pythia, Sherpa, and Geant4. Event generators, such as Sherpa, typically have a split workload: a small scale integration phase, and a second, more scalable, event-generation phase. To accommodate this workload on Mira we have developed two Python-based Django applications, Balsam and ARGO. Balsam is a generalized scheduler interface which uses a plugin system for interacting with scheduler software such as HTCondor, Cobalt, and TORQUE. ARGO is a workflow manager that submits jobs to instances of Balsam. Through these mechanisms, the serial and parallel tasks within jobs are executed on the appropriate resources. This approach and its integration with the PanDA production system will be discussed.
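    The abstract describes Balsam as a generalized scheduler interface built around plugins for HTCondor, Cobalt and TORQUE. The sketch below is purely illustrative of that plugin pattern: every class, method and job id here is invented and does not reflect the actual Balsam or ARGO code base.

```python
# Illustrative plugin-style scheduler abstraction (assumed names, not the Balsam API).
from abc import ABC, abstractmethod

class SchedulerPlugin(ABC):
    """Common interface that per-scheduler plugins could implement."""

    @abstractmethod
    def submit(self, script_path: str, nodes: int, walltime_min: int) -> str:
        """Submit a job script and return the scheduler's job id."""

    @abstractmethod
    def status(self, job_id: str) -> str:
        """Return e.g. 'queued', 'running' or 'finished'."""

class CobaltPlugin(SchedulerPlugin):
    def submit(self, script_path, nodes, walltime_min):
        # A real plugin would shell out to the Cobalt submission command here.
        return "cobalt-12345"

    def status(self, job_id):
        return "queued"

PLUGINS = {"cobalt": CobaltPlugin}   # registry keyed by scheduler name

def submit_job(scheduler: str, script: str, nodes: int, minutes: int) -> str:
    """Workflow-manager side: dispatch a job through the named scheduler plugin."""
    return PLUGINS[scheduler]().submit(script, nodes, minutes)
```

    A workflow manager in the spirit of ARGO would then only ever talk to the `SchedulerPlugin` interface, so serial stages can be routed to a cluster scheduler and parallel stages to the supercomputer without changing the job description.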

  14. Computers and Traditional Teaching Practices: Factors influencing middle level students' science achievement and attitudes about science

    NASA Astrophysics Data System (ADS)

    Odom, Arthur Louis; Marszalek, Jacob M.; Stoddard, Elizabeth R.; Wrobel, Jerzy M.

    2011-11-01

    The purpose of this study was to examine the association of middle school student science achievement and attitudes toward science with student-reported frequency of using computers to learn science and other classroom practices. Baseline comparison data were collected on the frequency of student-centred teaching practices (e.g. the use of group experiments during science class) and traditional teaching practices (e.g. having students copy notes during science class) to learn science. The student sample was composed of 294 seventh-grade students enrolled in middle school science. Multiple regression was used to investigate the association of attitudes toward science, student-centred teaching practices, computer usage, and traditional teaching practices with science achievement. Both attitudes toward science and student-centred teaching practices were positively associated with science achievement, and student-centred teaching practice was positively associated with attitude toward science. Computer usage was found to have a negative association with student achievement, which was moderated by traditional teaching practices.

  15. Unbiased Species-Level Identification of Clinical Isolates of Coagulase-Negative Staphylococci: Does It Change the Perspective on Staphylococcus lugdunensis?

    PubMed Central

    Ball, David; Millar, Michael

    2014-01-01

    Unbiased species-level identification of coagulase-negative staphylococci (CoNS) using matrix-assisted laser desorption ionization–time of flight mass spectrometry (MALDI-TOF MS) identified Staphylococcus lugdunensis to be a more commonly isolated CoNS in our laboratory than previously observed. It has also highlighted the possibility of vertical transmission. PMID:25339392

  16. High-level waste storage tank farms/242-A evaporator Standards/Requirements Identification Document (S/RID), Volume 7. Revision 1

    SciTech Connect

    Burt, D.L.

    1994-04-01

    The High-Level Waste Storage Tank Farms/242-A Evaporator Standards/Requirements Identification Document (S/RID) is contained in multiple volumes. This document (Volume 7) presents the standards and requirements for the following sections: Occupational Safety and Health, and Environmental Protection.

  17. Unbiased species-level identification of clinical isolates of coagulase-negative Staphylococci: does it change the perspective on Staphylococcus lugdunensis?

    PubMed

    Elamin, Wael F; Ball, David; Millar, Michael

    2015-01-01

    Unbiased species-level identification of coagulase-negative staphylococci (CoNS) using matrix-assisted laser desorption ionization-time of flight mass spectrometry (MALDI-TOF MS) identified Staphylococcus lugdunensis to be a more commonly isolated CoNS in our laboratory than previously observed. It has also highlighted the possibility of vertical transmission.

  18. Interactions between Levels of Instructional Detail and Expertise When Learning with Computer Simulations

    ERIC Educational Resources Information Center

    Hsu, Yuling; Gao, Yuan; Liu, Tzu-Chien; Sweller, John

    2015-01-01

    Based on cognitive load theory, the effect of different levels of instructional detail and expertise in a simulation-based environment on learning about concepts of correlation was investigated. Separate versions of the learning environment were designed for the four experimental conditions which differed only with regard to the levels of written…

  19. New Opportunities for Encouraging Higher Level Mathematical Learning by Creative Use of Emerging Computer Aided Assessment

    ERIC Educational Resources Information Center

    Sangwin, C. J.

    2003-01-01

    This article defines "higher level mathematical skills" and details an important class: that of "constructing instances" of mathematical objects satisfying certain properties. Comment is made on the frequency of higher level tasks in undergraduate work. We explain how such questions may be assessed "in practice" without the imposition on staff of…

  20. Identification of multi-level toxicity of liquid crystal display wastewater toward Daphnia magna and Moina macrocopa.

    PubMed

    Kim, Sae-Bom; Kim, Woong-Ki; Chounlamany, Vanseng; Seo, Jaehwan; Yoo, Jisu; Jo, Hun-Je; Jung, Jinho

    2012-08-15

    Toxicity-based regulations of industrial effluent have been adopted to complement the conventional discharge limits based on chemical analyses. In this study, the multi-level toxicity, including acute toxicity, feeding rate inhibition and oxidative stress, of effluent from a liquid crystal display (LCD) wastewater treatment plant (WWTP) to Daphnia magna (reference species) and Moina macrocopa (native species) was periodically monitored from April 2010 to April 2011. Raw wastewater was acutely toxic to both D. magna and M. macrocopa, but the toxicity dropped to less than 1 TU in the final effluent (FE) as treatment proceeded. Although acute toxicity was not observed in the FE, the feeding rate of daphnids was significantly inhibited. Additionally, the antioxidant enzyme activities of catalase, superoxide dismutase and glutathione peroxidase (GPx) in D. magna increased significantly when compared to the control, while only GPx activity increased significantly in M. macrocopa (p<0.05). A toxicity identification evaluation using D. magna showed that Cu was the key toxicant in the FE, which was not effectively removed by the coagulation/flocculation process in the LCD WWTP. In addition, Al originating from the coagulant seemed to increase the toxicity of the FE. PMID:22677053

  1. Non Invasive Water Level Monitoring on Boiling Water Reactors Using Internal Gamma Radiation: Application of Soft Computing Methods

    SciTech Connect

    Fleischer, Sebastian; Hampel, Rainer

    2006-07-01

    Providing the best possible knowledge of safety-related water levels in boiling water reactors (BWR) is essential for the operational regime. For water level determination, hydrostatic level measurement systems are almost exclusively applied, because they have stood the test of time over many decades in conventional and nuclear power plants (NPP). Due to the steam generation, especially in BWR, a specific phenomenon occurs which leads to a water-steam mixture level in the reactor annular space and reactor plenum. The mixture level is a highly transient value that cannot be measured by the hydrostatic water level measuring system, and it differs significantly from the measured collapsed water level. In particular, during operational and accidental transient processes such as fast negative pressure transients, the monitoring of these water levels is very important. In addition to the hydrostatic water level measurement system, a diverse water level measurement system for BWR should be used. Real physical diversity is provided by the gamma radiation distribution inside and outside the reactor pressure vessel, which correlates with the water level. The vertical gamma radiation distribution depends on the water level, but it is also a function of the neutron flux and the coolant recirculation pump speed. For the water level monitoring, special algorithms are required. An analytical determination of the gamma radiation distribution outside the reactor pressure vessel is impossible due to the multitude of radiation physics processes, the complicated non-stationary radiation source distribution and the complex geometry of fixtures. To create suitable algorithms, Soft Computing methods (Fuzzy Sets Theory, Artificial Neural Networks, etc.) will be used. Therefore, a database containing input values (gamma radiation distribution) and output values (water levels) had to be built. Here, the database was established by experiments (data from a BWR and from a test setup) and simulation with the authorised thermo
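    One of the soft-computing options mentioned, an artificial neural network mapping the gamma-detector readings (plus neutron flux and pump speed) to a water level, can be sketched as follows. This is an assumption-laden illustration, not the authors' algorithm: the synthetic data, network size and variable names are all placeholders.

```python
# Minimal sketch of the ANN idea, assuming a database of vertical gamma-detector readings
# labelled with the true mixture level. Synthetic data stand in for the BWR/test-rig data.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
n_samples, n_detectors = 500, 8
X_gamma = rng.random((n_samples, n_detectors))        # gamma reading per detector position
X_aux = rng.random((n_samples, 2))                    # neutron flux, recirculation pump speed
X = np.hstack([X_gamma, X_aux])
y = 10.0 * X_gamma.mean(axis=1) + 0.5 * X_aux[:, 0]   # surrogate "water level" (m)

model = MLPRegressor(hidden_layer_sizes=(16, 8), max_iter=2000, random_state=0)
model.fit(X[:400], y[:400])                           # train on part of the database
rmse = np.sqrt(np.mean((model.predict(X[400:]) - y[400:]) ** 2))
print("held-out RMSE (m):", rmse)
```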

  2. Superimposition-based personal identification using skull computed tomographic images: application to skull with mouth fixed open by residual soft tissue.

    PubMed

    Ishii, Masuko; Saitoh, Hisako; Yasjima, Daisuke; Yohsuk, Makino; Sakuma, Ayaka; Yayama, Kazuhiro; Iwase, Hirotaro

    2013-09-01

    We previously reported that superimposition of 3-dimensional (3D) images reconstructed from computed tomographic images of skeletonized skulls on photographs of the actual skulls afforded a match of skull contours, thereby demonstrating that superimposition of 3D-reconstructed images provides results identical to those obtained with actual skulls. The current superimposition procedure requires a skeletonized skull with the mouth closed and thus is not applicable to personal identification using a skull with residual soft tissue or with the mouth fixed open, such as those found in mummified or burned bodies. In this study, we used computed tomography to scan the skulls of a mummified body and an immersed body whose mandibles were fixed open by residual soft tissue, created 3D-reconstructed skull images, which were digitally processed by computer software to close the mandible, and superimposed the images on antemortem facial photographs. The results demonstrated morphological consistency between the 3D-reconstructed skull images and the facial photographs, indicating the applicability of the method to personal identification.

  3. Identification of nitrate long term trends in Loire-Brittany river district (France) in connection with hydrogeological contexts, agricultural practices and water table level variations

    NASA Astrophysics Data System (ADS)

    Lopez, B.; Baran, N.; Bourgine, B.; Ratheau, D.

    2009-04-01

    The European Union (EU) has adopted directives requiring that Member States take measures to reach a "good" chemical status of water resources by the year 2015 (Water Framework Directive: WFD). Alongside, the Nitrates Directive (91/676/EEC) aims at controlling nitrogen pollution and requires Member States to identify groundwaters that contain more than 50 mg NO3 L-1 or could exceed this limit if preventive measures are not taken. In order to achieve these environmental objectives in the Loire-Brittany river basin, or to justify the non-achievement of these objectives, a large dataset of nitrate concentrations (117,056 raw data points distributed over 7,341 time series) and water table level time series (1,371,655 data points distributed over 511 piezometers) is analysed from 1945 to 2007. The 156,700 sq km Loire-Brittany river basin shows various hydrogeological contexts, ranging from sedimentary aquifers to basement ones, with a few volcanic-rock aquifers. Knowledge of the evolution of agricultural practices is important in such a study and, even if this information is not locally available, agricultural practices have globally changed since the 1991 Nitrates Directive. The detailed dataset available for the Loire-Brittany basin aquifers is used to evaluate tools and to propose efficient methodologies for identifying and quantifying past and current trends in nitrate concentrations. Therefore, the challenge of this study is to propose a global and integrated approach which allows nitrate trend identification for the whole Loire-Brittany river basin. The temporal piezometric behaviour of each aquifer is defined using geostatistical analysis of water table level time series. This method requires the calculation of an experimental temporal variogram that can be fitted with a theoretical model valid for a large time range. Identification of contrasted behaviours (short-term, annual or pluriannual water table fluctuations) allows a systematic classification of the Loire

  4. Identification of quasi-steady compressor characteristics from transient data

    NASA Technical Reports Server (NTRS)

    Nunes, K. B.; Rock, S. M.

    1984-01-01

    The principal goal was to demonstrate that nonlinear compressor map parameters, which govern an in-stall response, can be identified from test data using parameter identification techniques. The tasks included developing and then applying an identification procedure to data generated by NASA LeRC on a hybrid computer. Two levels of model detail were employed. First was a lumped compressor rig model; second was a simplified turbofan model. The main outputs are the tools and procedures generated to accomplish the identification.

  5. Computational Design of an Unnatural Amino Acid Dependent Metalloprotein with Atomic Level Accuracy

    PubMed Central

    Mills, Jeremy H.; Khare, Sagar D.; Bolduc, Jill M.; Forouhar, Farhad; Mulligan, Vikram Khipple; Lew, Scott; Seetharaman, Jayaraman; Tong, Liang; Stoddard, Barry L.; Baker, David

    2013-01-01

    Genetically encoded unnatural amino acids could facilitate the design of proteins and enzymes of novel function, but correctly specifying sites of incorporation, and the identities and orientations of surrounding residues represents a formidable challenge. Computational design methods have been used to identify optimal locations for functional sites in proteins and design the surrounding residues, but have not incorporated unnatural amino acids in this process. We extended the Rosetta design methodology to design metalloproteins in which the amino acid (2,2’-bipyridin-5yl)alanine (Bpy-Ala) is a primary ligand of a bound metal ion. Following initial results that indicated the importance of buttressing the Bpy-Ala amino acid, we designed a buried metal binding site with octahedral coordination geometry consisting of Bpy-Ala, two protein based metal ligands, and two metal bound water molecules. Experimental characterization revealed a Bpy-Ala mediated metalloprotein with the ability to bind divalent cations including Co2+, Zn2+, Fe2+, and Ni2+, with a Kd for Zn2+ of ~40 pM. X-ray crystallographic analysis of the designed protein shows only slight deviation from the computationally designed model. PMID:23924187

  6. Low-level fluoride trapping studies experimental work for computer modeling program

    SciTech Connect

    Russell, R.G.

    1988-11-21

    The material presented in this report involved experimental work performed to assist in determining the constants for a computer modeling program being developed by Production Engineering for use in trap design. Included in this study are bed distribution studies to define uranium loading on alumina (Al2O3) and sodium fluoride (NaF) with respect to bed zones. A limited amount of work was done on uranium penetration into NaF pellets. Only the experimental work is reported here; Production Engineering will use these data to develop constants for the computer model. Some of the significant conclusions are: NaF has more capacity to load UF6, but Al2O3 distributes the load more equally; velocity, system pressure, and operating temperature influence uranium loading; and in comparative tests NaF had a loading of 25%, while Al2O3 was 13%. 2 refs., 10 figs., 5 tabs.

  7. A comparative analysis of multi-level computer-assisted decision making systems for traumatic injuries

    PubMed Central

    2009-01-01

    Background This paper focuses on the creation of a predictive computer-assisted decision making system for traumatic injury using machine learning algorithms. Trauma experts must make several difficult decisions based on a large number of patient attributes, usually in a short period of time. The aim is to compare the existing machine learning methods available for medical informatics, and develop reliable, rule-based computer-assisted decision-making systems that provide recommendations for the course of treatment for new patients, based on previously seen cases in trauma databases. Datasets of traumatic brain injury (TBI) patients are used to train and test the decision making algorithm. The work is also applicable to patients with traumatic pelvic injuries. Methods Decision-making rules are created by processing patterns discovered in the datasets, using machine learning techniques. More specifically, CART and C4.5 are used, as they provide grammatical expressions of knowledge extracted by applying logical operations to the available features. The resulting rule sets are tested against other machine learning methods, including AdaBoost and SVM. The rule creation algorithm is applied to multiple datasets, both with and without prior filtering to discover significant variables. This filtering is performed via logistic regression prior to the rule discovery process. Results For survival prediction using all variables, CART outperformed the other machine learning methods. When using only significant variables, neural networks performed best. A reliable rule-base was generated using combined C4.5/CART. The average predictive rule performance was 82% when using all variables, and approximately 84% when using significant variables only. The average performance of the combined C4.5 and CART system using significant variables was 89.7% in predicting the exact outcome (home or rehabilitation), and 93.1% in predicting the ICU length of stay for airlifted TBI patients
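    The pipeline the abstract outlines (filter significant variables with logistic regression, grow a CART/C4.5-style tree, read its branches off as rules) can be illustrated with a short scikit-learn sketch. This is not the authors' code or data: the synthetic dataset, the crude coefficient-magnitude filter and the variable names are all assumptions for illustration.

```python
# Illustrative rule-extraction pipeline (synthetic data; not the published system):
# 1) logistic regression filters variables, 2) a CART-style tree is trained on them,
# 3) the tree is printed as human-readable decision rules.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.tree import DecisionTreeClassifier, export_text

X, y = make_classification(n_samples=400, n_features=10, n_informative=4,
                           random_state=0)
feature_names = [f"var_{i}" for i in range(X.shape[1])]

# Step 1: keep the strongest predictors (coefficient magnitude as a rough significance proxy).
lr = LogisticRegression(max_iter=1000).fit(X, y)
keep = np.argsort(np.abs(lr.coef_[0]))[-4:]

# Step 2: train a shallow CART-style decision tree on the retained variables.
tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X[:, keep], y)

# Step 3: express the fitted tree as readable if/then rules.
print(export_text(tree, feature_names=[feature_names[i] for i in keep]))
```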

  8. Ambient radiation levels in positron emission tomography/computed tomography (PET/CT) imaging center

    PubMed Central

    Santana, Priscila do Carmo; de Oliveira, Paulo Marcio Campos; Mamede, Marcelo; Silveira, Mariana de Castro; Aguiar, Polyanna; Real, Raphaela Vila; da Silva, Teógenes Augusto

    2015-01-01

    Objective To evaluate the level of ambient radiation in a PET/CT center. Materials and Methods Previously selected and calibrated TLD-100H thermoluminescent dosimeters were utilized to measure room radiation levels. During 32 days, the detectors were placed in several strategically selected points inside the PET/CT center and in adjacent buildings. After the exposure period the dosimeters were collected and processed to determine the radiation level. Results In none of the points selected for measurements the values exceeded the radiation dose threshold for controlled area (5 mSv/year) or free area (0.5 mSv/year) as recommended by the Brazilian regulations. Conclusion In the present study the authors demonstrated that the whole shielding system is appropriate and, consequently, the workers are exposed to doses below the threshold established by Brazilian standards, provided the radiation protection standards are followed. PMID:25798004

  9. Knowledge-based low-level image analysis for computer vision systems

    NASA Technical Reports Server (NTRS)

    Dhawan, Atam P.; Baxi, Himanshu; Ranganath, M. V.

    1988-01-01

    Two algorithms for entry-level image analysis and preliminary segmentation are proposed which are flexible enough to incorporate local properties of the image. The first algorithm involves pyramid-based multiresolution processing and a strategy to define and use interlevel and intralevel link strengths. The second algorithm, which is designed for selected window processing, extracts regions adaptively using local histograms. The preliminary segmentation and a set of features are employed as the input to an efficient rule-based low-level analysis system, resulting in suboptimal meaningful segmentation.
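    The second algorithm's idea, extracting regions window by window from local histograms, can be approximated with a brief sketch. This is only a rough stand-in under stated assumptions: Otsu's threshold per window replaces the paper's actual region-extraction rule, and the function name and window size are ours.

```python
# Rough sketch of window-wise adaptive segmentation driven by local histograms
# (Otsu per window stands in for the paper's rule; not the authors' algorithm).
import numpy as np
from skimage.filters import threshold_otsu

def segment_by_local_histograms(image: np.ndarray, win: int = 64) -> np.ndarray:
    """Binary mask built window by window from each window's own intensity histogram."""
    mask = np.zeros_like(image, dtype=bool)
    for r in range(0, image.shape[0], win):
        for c in range(0, image.shape[1], win):
            block = image[r:r + win, c:c + win]
            if np.ptp(block) > 0:                 # skip flat windows with no contrast
                mask[r:r + win, c:c + win] = block > threshold_otsu(block)
    return mask
```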

  10. Prediction of intramuscular fat levels in Texel lamb loins using X-ray computed tomography scanning.

    PubMed

    Clelland, N; Bunger, L; McLean, K A; Conington, J; Maltin, C; Knott, S; Lambe, N R

    2014-10-01

    For the consumer, tenderness, juiciness and flavour are often described as the most important factors for meat eating quality, all of which have a close association with intramuscular fat (IMF). X-ray computed tomography (CT) can measure fat, muscle and bone volumes and weights, in vivo in sheep and CT predictions of carcass composition have been used in UK sheep breeding programmes over the last few decades. This study aimed to determine the most accurate combination of CT variables to predict IMF percentage of M. longissimus lumborum in Texel lambs. As expected, predicted carcass fat alone accounted for a moderate amount of the variation (R(2)=0.51) in IMF. Prediction accuracies were significantly improved (Adj R(2)>0.65) using information on fat and muscle densities measured from three CT reference scans, showing that CT can provide an accurate prediction of IMF in the loin of purebred Texel sheep.
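    The prediction step described here is, in essence, a multiple regression of loin IMF on CT-derived variables. The sketch below shows that idea only; the synthetic predictor values, coefficients and sample size are placeholders and do not come from the Texel data.

```python
# Hedged sketch: ordinary least squares relating CT-derived predictors
# (carcass fat plus fat/muscle densities from reference scans) to loin IMF %.
import numpy as np

rng = np.random.default_rng(1)
n = 120
carcass_fat = rng.normal(25, 5, n)        # CT-predicted carcass fat (%)   -- synthetic
fat_density = rng.normal(-80, 6, n)       # mean CT fat density (HU)       -- synthetic
muscle_density = rng.normal(55, 4, n)     # mean CT muscle density (HU)    -- synthetic
imf = 4 + 0.1 * carcass_fat - 0.05 * muscle_density + rng.normal(0, 0.4, n)

X = np.column_stack([np.ones(n), carcass_fat, fat_density, muscle_density])
coef, *_ = np.linalg.lstsq(X, imf, rcond=None)
pred = X @ coef
r2 = 1 - np.sum((imf - pred) ** 2) / np.sum((imf - imf.mean()) ** 2)
print("coefficients:", coef, "R^2:", r2)
```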

  11. Computational Design of Self-Assembling Protein Nanomaterials with Atomic Level Accuracy

    SciTech Connect

    King, Neil P.; Sheffler, William; Sawaya, Michael R.; Vollmar, Breanna S.; Sumida, John P.; André, Ingemar; Gonen, Tamir; Yeates, Todd O.; Baker, David

    2015-09-17

    We describe a general computational method for designing proteins that self-assemble to a desired symmetric architecture. Protein building blocks are docked together symmetrically to identify complementary packing arrangements, and low-energy protein-protein interfaces are then designed between the building blocks in order to drive self-assembly. We used trimeric protein building blocks to design a 24-subunit, 13-nm diameter complex with octahedral symmetry and a 12-subunit, 11-nm diameter complex with tetrahedral symmetry. The designed proteins assembled to the desired oligomeric states in solution, and the crystal structures of the complexes revealed that the resulting materials closely match the design models. The method can be used to design a wide variety of self-assembling protein nanomaterials.

  12. Predicting groundwater level fluctuations with meteorological effect implications—A comparative study among soft computing techniques

    NASA Astrophysics Data System (ADS)

    Shiri, Jalal; Kisi, Ozgur; Yoon, Heesung; Lee, Kang-Kun; Hossein Nazemi, Amir

    2013-07-01

    Knowledge of groundwater table fluctuations is important for agricultural lands as well as for studies related to groundwater utilization and management. This paper investigates the abilities of Gene Expression Programming (GEP), Adaptive Neuro-Fuzzy Inference System (ANFIS), Artificial Neural Networks (ANN) and Support Vector Machine (SVM) techniques for groundwater level forecasting at lead times from the following day up to 7 days. Several input combinations comprising water table level, rainfall and evapotranspiration values from the Hongcheon Well station (South Korea), covering a period of eight years (2001-2008), were used to develop and test the applied models. The data from the first six years were used for developing (training) the applied models and the last two years of data were reserved for testing. A comparison was also made between the forecasts provided by these models and the Auto-Regressive Moving Average (ARMA) technique. Based on the comparisons, it was found that the GEP models could be employed successfully in forecasting water table level fluctuations up to 7 days beyond data records.
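    The general setup, lagged water-table, rainfall and evapotranspiration values used to forecast the level several days ahead, is shown below with one of the model families the study compares (SVM regression). The series are synthetic stand-ins for the Hongcheon data, and the lag length, lead time and hyperparameters are illustrative assumptions.

```python
# Minimal forecasting sketch: support vector regression on lagged inputs to predict
# the water-table level k days ahead (synthetic series, not the Hongcheon records).
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(2)
days = 2000
level = np.cumsum(rng.normal(0, 0.01, days)) + 10       # water-table level (m)
rain = rng.gamma(0.3, 5.0, days)                        # daily rainfall (mm)
et = 2 + np.sin(np.arange(days) * 2 * np.pi / 365)      # evapotranspiration (mm)

k, lags = 7, 3                                          # 7-day-ahead forecast, 3-day lags
rows = range(lags, days - k)
X = np.array([np.r_[level[t - lags:t], rain[t - lags:t], et[t - lags:t]] for t in rows])
y = np.array([level[t + k] for t in rows])

split = int(0.75 * len(y))                              # roughly: 6 years train, 2 years test
model = SVR(C=10.0, epsilon=0.01).fit(X[:split], y[:split])
rmse = np.sqrt(np.mean((model.predict(X[split:]) - y[split:]) ** 2))
print("test RMSE (m):", rmse)
```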

  13. Using RoboCup in University-Level Computer Science Education

    ERIC Educational Resources Information Center

    Sklar, Elizabeth; Parsons, Simon; Stone, Peter

    2004-01-01

    In the education literature, team-based projects have proven to be an effective pedagogical methodology. We have been using "RoboCup" challenges as the basis for class projects in undergraduate and masters level courses. This article discusses several independent efforts in this direction and presents our work in the development of shared…

  14. Tablet Computer Literacy Levels of the Physical Education and Sports Department Students

    ERIC Educational Resources Information Center

    Hergüner, Gülten

    2016-01-01

    Education systems are being affected in parallel by newly emerging hardware and new developments occurring in technology daily. Tablet usage especially is becoming ubiquitous in the teaching-learning processes in recent years. Therefore, using the tablets effectively, managing them and having a high level of tablet literacy play an important role…

  15. 24 CFR 990.170 - Computation of utilities expense level (UEL): Overview.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... applicable inflation factor. The UEL for a given funding period is the product of the utility rate multiplied by the payable consumption level multiplied by the inflation factor. The UEL is expressed in terms of... of a utility, the PHA shall absorb 75 percent of this increase. (d) Inflation factor for...
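    The computation the excerpt describes can be restated compactly; the symbols below are our shorthand, not regulatory notation.

```latex
% UEL = utility rate x payable consumption level x applicable inflation factor
\[
  \mathrm{UEL} \;=\; R_{\text{utility}} \times C_{\text{payable}} \times F_{\text{inflation}}
\]
```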

  16. 24 CFR 990.170 - Computation of utilities expense level (UEL): Overview.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... applicable inflation factor. The UEL for a given funding period is the product of the utility rate multiplied by the payable consumption level multiplied by the inflation factor. The UEL is expressed in terms of... of a utility, the PHA shall absorb 75 percent of this increase. (d) Inflation factor for...

  17. Comparing levels of school performance to science teachers' reports on knowledge/skills, instructional use and student use of computers

    NASA Astrophysics Data System (ADS)

    Kerr, Rebecca

    The purpose of this descriptive quantitative and basic qualitative study was to examine fifth and eighth grade science teachers' responses, perceptions of the role of technology in the classroom, and how they felt that computer applications, tools, and the Internet influence student understanding. The purposeful sample included survey and interview responses from fifth grade and eighth grade general and physical science teachers. Even though they may not be generalizable to other teachers or classrooms due to a low response rate, findings from this study indicated teachers with fewer years of teaching science had a higher level of computer use but less computer access, especially for students, in the classroom. Furthermore, teachers' choice of professional development moderated the relationship between the level of school performance and teachers' knowledge/skills, with the most positive relationship being with workshops that occurred outside of the school. Eighteen interviews revealed that teachers perceived the role of technology in classroom instruction mainly as teacher-centered and supplemental, rather than student-centered activities.

  18. A computational model for exploratory activity of rats with different anxiety levels in elevated plus-maze.

    PubMed

    Costa, Ariadne A; Morato, Silvio; Roque, Antonio C; Tinós, Renato

    2014-10-30

    The elevated plus-maze is an apparatus widely used to study the level of anxiety in rodents. The maze is plus-shaped, with two enclosed arms and two open arms, and elevated 50cm from the floor. During a test, which usually lasts for 5min, the animal is initially put at the center and is free to move and explore the entire maze. The level of anxiety is measured by variables such as the percentage of time spent and the number of entries in the enclosed arms. High percentage of time spent at and number of entries in the enclosed arms indicate anxiety. Here we propose a computational model of rat behavior in the elevated plus-maze based on an artificial neural network trained by a genetic algorithm. The fitness function of the genetic algorithm is composed of reward (positive) and punishment (negative) terms, which are incremented as the computational agent (virtual rat) moves in the maze. The punishment term is modulated by a parameter that simulates the effects of different drugs. Unlike other computational models, the virtual rat is built independently of prior known experimental data. The exploratory behaviors generated by the model for different simulated pharmacological conditions are in good agreement with data from real rats.
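    The model's core loop, a genetic algorithm evolving the weights of a small feed-forward network whose fitness accumulates rewards and subtracts a weighted punishment term (the weight standing in for a simulated drug effect), can be sketched in a much-simplified form. Everything below is a toy stand-in: the network size, the sensing vector, the reward/punishment rules and all parameter values are assumptions, not the published model.

```python
# Much-simplified sketch of a GA evolving ANN weights with fitness = reward - w * punishment.
import numpy as np

rng = np.random.default_rng(3)
N_IN, N_HID, N_OUT = 6, 8, 3              # sensory inputs, hidden units, movement choices
N_W = N_IN * N_HID + N_HID * N_OUT        # total number of evolved weights

def act(weights, obs):
    """Feed-forward pass: pick one of the movement choices."""
    w1 = weights[:N_IN * N_HID].reshape(N_IN, N_HID)
    w2 = weights[N_IN * N_HID:].reshape(N_HID, N_OUT)
    return int(np.argmax(np.tanh(obs @ w1) @ w2))

def fitness(weights, punishment_weight=1.0, steps=100):
    """Accumulated reward minus weighted punishment over a simulated session."""
    reward = punishment = 0.0
    for _ in range(steps):
        obs = rng.random(N_IN)            # toy stand-in for the virtual rat's sensing
        a = act(weights, obs)
        reward += 1.0 if a == 0 else 0.0      # e.g. moving toward an enclosed arm
        punishment += 1.0 if a == 2 else 0.0  # e.g. exposure on an open arm
    return reward - punishment_weight * punishment

pop = rng.normal(0, 1, (30, N_W))                         # initial population of weight vectors
for gen in range(20):                                     # GA: keep the best half, mutate copies
    scores = np.array([fitness(ind) for ind in pop])
    parents = pop[np.argsort(scores)[-15:]]
    pop = np.vstack([parents, parents + rng.normal(0, 0.1, parents.shape)])
```

    Varying `punishment_weight` is the knob that, in this toy version, plays the role the paper assigns to simulated pharmacological conditions.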

  19. A computational model for exploratory activity of rats with different anxiety levels in elevated plus-maze.

    PubMed

    Costa, Ariadne A; Morato, Silvio; Roque, Antonio C; Tinós, Renato

    2014-10-30

    The elevated plus-maze is an apparatus widely used to study the level of anxiety in rodents. The maze is plus-shaped, with two enclosed arms and two open arms, and elevated 50cm from the floor. During a test, which usually lasts for 5min, the animal is initially put at the center and is free to move and explore the entire maze. The level of anxiety is measured by variables such as the percentage of time spent and the number of entries in the enclosed arms. High percentage of time spent at and number of entries in the enclosed arms indicate anxiety. Here we propose a computational model of rat behavior in the elevated plus-maze based on an artificial neural network trained by a genetic algorithm. The fitness function of the genetic algorithm is composed of reward (positive) and punishment (negative) terms, which are incremented as the computational agent (virtual rat) moves in the maze. The punishment term is modulated by a parameter that simulates the effects of different drugs. Unlike other computational models, the virtual rat is built independently of prior known experimental data. The exploratory behaviors generated by the model for different simulated pharmacological conditions are in good agreement with data from real rats. PMID:25128721

  20. A distributed computational search strategy for the identification of diagnostics targets: application to finding aptamer targets for methicillin-resistant staphylococci.

    PubMed

    Flanagan, Keith; Cockell, Simon; Harwood, Colin; Hallinan, Jennifer; Nakjang, Sirintra; Lawry, Beth; Wipat, Anil

    2014-06-30

    The rapid and cost-effective identification of bacterial species is crucial, especially for clinical diagnosis and treatment. Peptide aptamers have been shown to be valuable for use as a component of novel, direct detection methods. These small peptides have a number of advantages over antibodies, including greater specificity and longer shelf life. These properties facilitate their use as the detector components of biosensor devices. However, the identification of suitable aptamer targets for particular groups of organisms is challenging. We present a semi-automated processing pipeline for the identification of candidate aptamer targets from whole bacterial genome sequences. The pipeline can be configured to search for protein sequence fragments that uniquely identify a set of strains of interest. The system is also capable of identifying additional organisms that may be of interest due to their possession of protein fragments in common with the initial set. Through the use of Cloud computing technology and distributed databases, our system is capable of scaling with the rapidly growing genome repositories, and consequently of keeping the resulting data sets up-to-date. The system described is also more generically applicable to the discovery of specific targets for other diagnostic approaches such as DNA probes, PCR primers and antibodies.
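    The central search the pipeline performs, protein sequence fragments shared by all target strains but absent from off-target organisms, can be shown in miniature. This is a sketch of the idea only, not the authors' distributed implementation; the toy sequences, fragment length and function names are ours.

```python
# Sketch of the core search: peptide k-mers present in every target proteome but in
# no off-target proteome are candidate diagnostic (e.g. aptamer) target fragments.
def kmers(seq: str, k: int) -> set:
    return {seq[i:i + k] for i in range(len(seq) - k + 1)}

def proteome_kmers(proteome: dict, k: int) -> set:
    frags = set()
    for seq in proteome.values():
        frags |= kmers(seq, k)
    return frags

def candidate_fragments(targets: list, off_targets: list, k: int = 8) -> set:
    shared = set.intersection(*(proteome_kmers(p, k) for p in targets))
    for p in off_targets:
        shared -= proteome_kmers(p, k)      # drop anything also seen off-target
    return shared

# Toy example with two "strains" of interest and one off-target organism.
targets = [{"p1": "MKLVINTWWPA", "p2": "AACDEFGHIK"},
           {"p1": "MKLVINTWWPV", "p2": "GGCDEFGHIK"}]
off_targets = [{"q1": "MKLVINTAAAA"}]
print(candidate_fragments(targets, off_targets, k=5))
```

    In the real system each proteome-level k-mer set would be built and compared in parallel across distributed databases, which is what lets the approach scale with growing genome repositories.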

  1. Dynamic-thresholding level set: a novel computer-aided volumetry method for liver tumors in hepatic CT images

    NASA Astrophysics Data System (ADS)

    Cai, Wenli; Yoshida, Hiroyuki; Harris, Gordon J.

    2007-03-01

    Measurement of the volume of focal liver tumors, called liver tumor volumetry, is indispensable for assessing the growth of tumors and for monitoring the response of tumors to oncology treatments. Traditional edge models, such as the maximum gradient and zero-crossing methods, often fail to detect the accurate boundary of a fuzzy object such as a liver tumor. As a result, the computerized volumetry based on these edge models tends to differ from manual segmentation results performed by physicians. In this study, we developed a novel computerized volumetry method for fuzzy objects, called dynamic-thresholding level set (DT level set). An optimal threshold value computed from a histogram tends to shift, relative to the theoretical threshold value obtained from a normal distribution model, toward a smaller region in the histogram. We thus designed a mobile shell structure, called a propagating shell, which is a thick region encompassing the level set front. The optimal threshold calculated from the histogram of the shell drives the level set front toward the boundary of a liver tumor. When the volume ratio between the object and the background in the shell approaches one, the optimal threshold value best fits the theoretical threshold value and the shell stops propagating. Application of the DT level set to 26 hepatic CT cases with 63 biopsy-confirmed hepatocellular carcinomas (HCCs) and metastases showed that the computer measured volumes were highly correlated with those of tumors measured manually by physicians. Our preliminary results showed that DT level set was effective and accurate in estimating the volumes of liver tumors detected in hepatic CT images.
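    The dynamic-thresholding idea, recomputing an optimal threshold from the histogram of a thick shell around the advancing front and stopping when object and background balance inside the shell, can be caricatured with simple morphology. This is a much-simplified, assumption-laden sketch (it uses Otsu's threshold, assumes the tumor is brighter than its surroundings, and is not the authors' level-set code).

```python
# Simplified stand-in for the DT level set: grow a mask iteratively, re-estimating the
# threshold from the histogram of a shell around the current front.
import numpy as np
from scipy.ndimage import binary_dilation, binary_erosion
from skimage.filters import threshold_otsu

def dt_region_grow(image, seed_mask, shell_width=3, max_iter=50):
    mask = seed_mask.copy()
    for _ in range(max_iter):
        outer = binary_dilation(mask, iterations=shell_width)
        inner = binary_erosion(mask, iterations=shell_width)
        shell = outer & ~inner                      # thick band around the front
        if shell.sum() == 0 or np.ptp(image[shell]) == 0:
            break
        thr = threshold_otsu(image[shell])          # threshold from the shell histogram only
        new_mask = outer & (image >= thr)           # assumes a bright object on dark background
        obj_fraction = (image[shell] >= thr).mean()
        if np.array_equal(new_mask, mask) or abs(obj_fraction - 0.5) < 0.05:
            break                                   # object/background roughly balanced: stop
        mask = new_mask
    return mask
```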

  2. Understanding the retina: a review of computational models of the retina from the single cell to the network level.

    PubMed

    Guo, Tianruo; Tsai, David; Bai, Siwei; Morley, John W; Suaning, Gregg J; Lovell, Nigel H; Dokos, Socrates

    2014-01-01

    The vertebrate retina is a clearly organized signal-processing system. It contains more than 60 different types of neurons, arranged in three distinct neural layers. Each cell type is believed to serve unique role(s) in encoding visual information. While we now have a relatively good understanding of the constituent cell types in the retina and some general ideas of their connectivity, with few exceptions, how the retinal circuitry performs computation remains poorly understood. Computational modeling has been commonly used to study the retina from the single cell to the network level. In this article, we begin by reviewing retinal modeling strategies and existing models. We then discuss in detail the significance and limitations of these models, and finally, we provide suggestions for the future development of retinal neural modeling.

  3. In pursuit of an accurate spatial and temporal model of biomolecules at the atomistic level: a perspective on computer simulation

    PubMed Central

    Gray, Alan; Harlen, Oliver G.; Harris, Sarah A.; Khalid, Syma; Leung, Yuk Ming; Lonsdale, Richard; Mulholland, Adrian J.; Pearson, Arwen R.; Read, Daniel J.; Richardson, Robin A.

    2015-01-01

    Despite huge advances in the computational techniques available for simulating biomolecules at the quantum-mechanical, atomistic and coarse-grained levels, there is still a widespread perception amongst the experimental community that these calculations are highly specialist and are not generally applicable by researchers outside the theoretical community. In this article, the successes and limitations of biomolecular simulation and the further developments that are likely in the near future are discussed. A brief overview is also provided of the experimental biophysical methods that are commonly used to probe biomolecular structure and dynamics, and the accuracy of the information that can be obtained from each is compared with that from modelling. It is concluded that progress towards an accurate spatial and temporal model of biomacromolecules requires a combination of all of these biophysical techniques, both experimental and computational. PMID:25615870

  4. Remote reading of coronary CTA exams using a tablet computer: utility for stenosis assessment and identification of coronary anomalies.

    PubMed

    Zimmerman, Stefan L; Lin, Cheng T; Chu, Linda C; Eng, John; Fishman, Elliot K

    2016-06-01

    The feasibility of remote reading of coronary CT examinations on tablet computers has not been evaluated. The purpose of this study is to evaluate the accuracy of coronary CT angiography reading using an iPad compared to standard 3D workstations. Fifty coronary CT angiography exams, including a spectrum of coronary artery disease and anatomic variants, were reviewed. Coronary CT angiography exams were interpreted by two readers independently on an iPad application (Siemens Webviewer) and a clinical 3D workstation at sessions 2 weeks apart. Studies were scored per vessel for severity of stenosis on a 0-3 scale (0 none, 1 <50 %, 2 ≥50-69 %, 3 ≥70 %). Coronary anomalies were recorded. A consensus read by two experienced cardiac imagers was used as the reference standard. Level of agreement with the reference for iPad and 3D workstations was compared. Multivariate logistic regression was used to analyze the relationship between agreement and display type and to adjust for inter-reader differences. For both readers, there was no significant difference in agreement with the reference standard for per-vessel stenosis scores using either the 3D workstation or the iPad. In a multivariable logistic regression analysis including reader, workstation, and vessel as co-variates, there was no significant association between workstation type or reader and agreement with the reference standard (p > 0.05). Both readers identified 100 % of coronary anomalies using each technique. Reading of coronary CT angiography examinations on the iPad had no influence on stenosis assessment compared to the standard clinical workstation. PMID:27085532

  6. Computational methodology for predicting the landscape of the human-microbial interactome region level influence.

    PubMed

    Coelho, Edgar D; Santiago, André M; Arrais, Joel P; Oliveira, José Luís

    2015-10-01

    Microbial communities thrive in close association among themselves and with the host, establishing protein-protein interactions (PPIs) with the latter and thus being able to benefit (positively impact) or disturb (negatively impact) biological events in the host. Despite major collaborative efforts to sequence the human microbiome, there is still a great lack of understanding of its impact on the host. We propose a computational methodology to predict the impact of microbial proteins on human biological events, taking into account the abundance of each microbial protein and its relation to all other microbial and human proteins. This alternative methodology is centered on an improved impact estimation algorithm that integrates PPIs between human and microbial proteins with Reactome pathway data. The methodology was applied to study the impact of 24 microbial phyla on different cellular events within 10 different human microbiomes. The results obtained confirm findings already described in the literature and reveal new ones. We believe the human microbiome can no longer be ignored: there is ample evidence correlating microbiome alterations with disease states, and with a return to healthy states once these alterations are reversed.
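    The abundance-weighted scoring idea described above can be illustrated with a short Python sketch. This is only a schematic reading of the abstract, not the authors' algorithm: the scoring rule, data structures and names (pathway_impact, the toy protein IDs) are assumptions made for illustration.

        # Schematic only: score a human pathway by summing, over the human proteins it
        # contains, the abundance-weighted human-microbial PPIs that target them.
        def pathway_impact(pathway_proteins, ppis, abundance):
            """pathway_proteins: set of human protein IDs in one Reactome pathway
            ppis: iterable of (microbial_protein, human_protein) interaction pairs
            abundance: dict mapping microbial protein ID -> relative abundance"""
            score = 0.0
            for microbial_prot, human_prot in ppis:
                if human_prot in pathway_proteins:
                    score += abundance.get(microbial_prot, 0.0)
            return score

        # Toy data
        ppis = [("mA", "hTLR4"), ("mB", "hTLR4"), ("mB", "hNFKB1")]
        abundance = {"mA": 0.7, "mB": 0.3}
        print(pathway_impact({"hTLR4", "hNFKB1"}, ppis, abundance))   # -> 1.3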

  7. Computational design of an unnatural amino acid dependent metalloprotein with atomic level accuracy.

    PubMed

    Mills, Jeremy H; Khare, Sagar D; Bolduc, Jill M; Forouhar, Farhad; Mulligan, Vikram Khipple; Lew, Scott; Seetharaman, Jayaraman; Tong, Liang; Stoddard, Barry L; Baker, David

    2013-09-11

    Genetically encoded unnatural amino acids could facilitate the design of proteins and enzymes of novel function, but correctly specifying sites of incorporation and the identities and orientations of surrounding residues represents a formidable challenge. Computational design methods have been used to identify optimal locations for functional sites in proteins and design the surrounding residues but have not incorporated unnatural amino acids in this process. We extended the Rosetta design methodology to design metalloproteins in which the amino acid (2,2'-bipyridin-5yl)alanine (Bpy-Ala) is a primary ligand of a bound metal ion. Following initial results that indicated the importance of buttressing the Bpy-Ala amino acid, we designed a buried metal binding site with octahedral coordination geometry consisting of Bpy-Ala, two protein-based metal ligands, and two metal-bound water molecules. Experimental characterization revealed a Bpy-Ala-mediated metalloprotein with the ability to bind divalent cations including Co(2+), Zn(2+), Fe(2+), and Ni(2+), with a Kd for Zn(2+) of ∼40 pM. X-ray crystal structures of the designed protein bound to Co(2+) and Ni(2+) have RMSDs to the design model of 0.9 and 1.0 Å respectively over all atoms in the binding site.
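    The reported agreement between the crystal structures and the design model is a root-mean-square deviation (RMSD) over binding-site atoms. The following minimal NumPy sketch shows the standard RMSD calculation for coordinates assumed to be already superimposed; the coordinates are placeholders, not atoms from the designed protein.

        import numpy as np

        def rmsd(coords_a, coords_b):
            """Root-mean-square deviation between two (N, 3) coordinate arrays
            that are assumed to be already superimposed."""
            diff = np.asarray(coords_a) - np.asarray(coords_b)
            return np.sqrt((diff ** 2).sum(axis=1).mean())

        # Placeholder coordinates for a handful of binding-site atoms (angstroms)
        design_model = np.array([[0.0, 0.0, 0.0], [1.5, 0.0, 0.0], [1.5, 1.5, 0.0]])
        crystal      = np.array([[0.1, 0.0, 0.0], [1.4, 0.1, 0.0], [1.6, 1.4, 0.1]])
        print(f"binding-site RMSD: {rmsd(design_model, crystal):.2f} Å")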

  8. Patient grouping for dose surveys and establishment of diagnostic reference levels in paediatric computed tomography.

    PubMed

    Vassileva, J; Rehani, M

    2015-07-01

    There has been confusion in the literature on whether paediatric patients should be grouped according to age, weight or other parameters when dealing with dose surveys. The present work aims to suggest a pragmatic approach that achieves reasonable accuracy for patient dose surveys in countries with limited resources. The analysis is based on a subset of data collected within the IAEA survey of paediatric computed tomography (CT) doses, involving 82 CT facilities from 32 countries in Asia, Europe, Africa and Latin America. Data for 6115 patients were collected, and weight data were available for 34.5 % of them. The present study suggests that using four age groups, <1, >1-5, >5-10 and >10-15 y, is realistic and pragmatic for dose surveys in less resourced countries and for the establishment of diagnostic reference levels (DRLs). To ensure adequate accuracy, data for >30 patients in a particular age group should be collected if patient weight is not known. If a smaller sample is used, patient weight should be recorded and the median weight of the sample should be within 5-10 % of the median weight of the sample for which the DRLs were established. Comparison of results from different surveys should always be performed with caution, taking into consideration how the paediatric patients were grouped. Dose results can be corrected for differences in patient weight/age group. PMID:25836695
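    A minimal Python sketch of the grouping and summary step suggested above is given below. The survey records are toy values, and setting the reference level at the 75th percentile of the dose distribution is an assumed (though common) convention rather than a prescription taken from this paper.

        import numpy as np

        # Hypothetical survey records: (age in years, CTDIvol in mGy) -- toy data, not IAEA data
        records = [(0.5, 20.0), (3.0, 24.0), (7.0, 30.0), (12.0, 38.0), (2.0, 22.0), (9.0, 33.0)]

        def age_band(age):
            if age < 1:
                return "<1 y"
            if age <= 5:
                return ">1-5 y"
            if age <= 10:
                return ">5-10 y"
            return ">10-15 y"

        groups = {}
        for age, dose in records:
            groups.setdefault(age_band(age), []).append(dose)

        for band in ("<1 y", ">1-5 y", ">5-10 y", ">10-15 y"):
            doses = groups.get(band, [])
            if doses:
                # Taking the 75th percentile as the DRL candidate is an assumed, common convention
                print(band, "n =", len(doses), "75th percentile =",
                      round(float(np.percentile(doses, 75)), 1), "mGy")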

  10. Computational aspects of helicopter trim analysis and damping levels from Floquet theory

    NASA Technical Reports Server (NTRS)

    Gaonkar, Gopal H.; Achar, N. S.

    1992-01-01

    Helicopter trim settings of periodic initial state and control inputs are investigated for convergence of Newton iteration in computing the settings sequentially and in parallel. The trim analysis uses a shooting method and a weak version of two temporal finite element methods with displacement formulation and with mixed formulation of displacements and momenta. These three methods broadly represent two main approaches to trim analysis: adaptation of initial-value and finite element boundary-value codes to periodic boundary conditions, particularly for unstable and marginally stable systems. In each method, both the sequential and in-parallel schemes are used and the resulting nonlinear algebraic equations are solved by damped Newton iteration with an optimally selected damping parameter. The impact of damped Newton iteration, including earlier-observed divergence problems in trim analysis, is demonstrated by the maximum condition number of the Jacobian matrices of the iterative scheme and by virtual elimination of divergence. The advantages of the in-parallel scheme over the conventional sequential scheme are also demonstrated.
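    The damped Newton iteration at the heart of the trim solution can be sketched generically in Python as follows. The residual system here is a toy stand-in rather than a rotorcraft trim model, and the fixed under-relaxation factor is a simplification of the optimally selected damping parameter described in the abstract.

        import numpy as np

        def damped_newton(F, J, x0, damping=0.8, tol=1e-10, max_iter=50):
            """Generic damped Newton iteration x <- x - damping * solve(J(x), F(x)).
            The Jacobian condition number is tracked, since the trim study uses it to
            explain convergence behaviour. A fixed damping factor is used here for
            simplicity; the paper selects the damping parameter optimally."""
            x = np.asarray(x0, dtype=float)
            for k in range(max_iter):
                Jx = J(x)
                x = x - damping * np.linalg.solve(Jx, F(x))
                residual = np.linalg.norm(F(x))
                print(f"iter {k}: |F| = {residual:.2e}, cond(J) = {np.linalg.cond(Jx):.2e}")
                if residual < tol:
                    break
            return x

        # Toy nonlinear system standing in for the periodic trim equations
        F = lambda x: np.array([x[0] ** 2 + x[1] - 1.0, x[0] - x[1] ** 2])
        J = lambda x: np.array([[2.0 * x[0], 1.0], [1.0, -2.0 * x[1]]])
        print(damped_newton(F, J, [0.8, 0.4]))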

  11. The PVM (Parallel Virtual Machine) system: Supercomputer level concurrent computation on a network of IBM RS/6000 power stations

    SciTech Connect

    Sunderam, V.S. (Dept. of Mathematics and Computer Science); Geist, G.A.

    1991-01-01

    The PVM (Parallel Virtual Machine) system enables supercomputer level concurrent computations to be performed on interconnected networks of heterogeneous computer systems. Specifically, a network of 13 IBM RS/6000 powerstations has been successfully used to execute production quality runs of superconductor modeling codes at more than 250 Mflops. This work demonstrates the effectiveness of cooperative concurrent processing for high performance applications, and shows that supercomputer level computations may be attained at a fraction of the cost on distributed computing platforms. This paper describes the PVM programming environment and user facilities, as they apply to hardware platforms comprising a network of IBM RS/6000 powerstations. The salient design features of PVM will be discussed, including heterogeneity, scalability, multilanguage support, provisions for fault tolerance, the use of multiprocessors and scalar machines, an interactive graphical front end, and support for profiling, tracing, and visual analysis. The PVM system has been used extensively, and a range of production quality concurrent applications have been successfully executed using PVM on a variety of networked platforms. The paper will mention representative examples, and discuss two in detail. The first is a material sciences problem that was originally developed on a Cray 2. This application code calculates the electronic structure of metallic alloys from first principles and is based on the KKR-CPA algorithm. The second is a molecular dynamics simulation for calculating materials properties. Performance results for both applications on networks of RS/6000 powerstations will be presented, and accompanied by discussions of the other advantages of PVM and its potential as a complement or alternative to conventional supercomputers.

  12. ALPHN: A computer program for calculating (α, n) neutron production in canisters of high-level waste

    SciTech Connect

    Salmon, R.; Hermann, O.W.

    1992-10-01

    The rate of neutron production from (α, n) reactions in canisters of immobilized high-level waste containing borosilicate glass or glass-ceramic compositions is significant and must be considered when estimating neutron shielding requirements. The personal computer program ALPHN calculates the (α, n) neutron production rate of a canister of vitrified high-level waste. The user supplies the chemical composition of the glass or glass-ceramic and the curies of the alpha-emitting actinides present. The output of the program gives the (α, n) neutron production of each actinide in neutrons per second and the total for the canister. The (α, n) neutron production rates are source terms only; that is, they are production rates within the glass and do not take into account the shielding effect of the glass. For a given glass composition, the user can calculate up to eight cases simultaneously; these cases are based on the same glass composition but contain different quantities of actinides per canister. In a typical application, these cases might represent the same canister of vitrified high-level waste at eight different decay times. Run time for a typical problem containing 20 chemical species, 24 actinides, and 8 decay times was 35 s on an IBM AT personal computer. Results of an example based on an expected canister composition at the Defense Waste Processing Facility are shown.
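    The bookkeeping performed by such a source-term calculation can be summarized schematically as a sum over actinides of activity multiplied by a per-curie (α, n) yield. In the Python sketch below the yields are arbitrary placeholders; in the actual program they depend on the alpha energies of each actinide and on the light-element content of the user-supplied glass composition.

        # Schematic bookkeeping only: total (alpha, n) source term as a sum over actinides.
        # The per-curie yields are placeholders, not values from ALPHN.
        def canister_neutron_rate(activities_ci, yields_n_per_s_per_ci):
            """Return the total (alpha, n) neutron production rate (n/s) for a canister."""
            total = 0.0
            for nuclide, curies in activities_ci.items():
                rate = curies * yields_n_per_s_per_ci[nuclide]
                print(f"{nuclide}: {rate:.3e} n/s")
                total += rate
            return total

        activities = {"Am-241": 120.0, "Cm-244": 45.0, "Pu-238": 300.0}   # Ci per canister (toy values)
        yields_ = {"Am-241": 2.5e3, "Cm-244": 3.0e3, "Pu-238": 1.5e3}     # n/s per Ci (placeholders)
        print(f"canister total: {canister_neutron_rate(activities, yields_):.3e} n/s")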

  14. Solving the scattering of N photons on a two-level atom without computation

    NASA Astrophysics Data System (ADS)

    Roulet, Alexandre; Scarani, Valerio

    2016-09-01

    We propose a novel approach for solving the scattering of light onto a two-level atom coupled to a one-dimensional waveguide. First, we express the physical quantity of interest in terms of Feynman diagrams and treat the atom as a non-saturable linear beamsplitter. By using the atomic response to our advantage, a relevant substitution is then made that captures the nonlinearity of the atom, and the final result is obtained in terms of simple integrals over the initial incoming wavepackets. The procedure is not limited to post-scattering quantities and allows one, for instance, to derive the atomic excitation during the scattering event.

  15. A two-level predictive event-related potential-based brain-computer interface.

    PubMed

    Xu, Yaming; Nakajima, Yoshikazu

    2013-10-01

    Increasing the freedom of communication of the conventional row/column (RC) P300 paradigm in a naive way (by increasing the matrix size) may worsen the inherent distraction effect and slow interaction. In this paper, we propose a two-level predictive (TLP) paradigm by integrating a 3×3 two-level matrix paradigm with a statistical language model. The TLP paradigm is evaluated using offline and online data from ten healthy subjects. Significantly larger event-related potentials (ERPs) are evoked by the TLP paradigm compared with the classical 6×6 RC. During an online task (correctly spelling an English sentence with 57 characters), accuracy and information transfer rate for the TLP are increased by 14.45% and 29.29%, respectively, when compared with the 6×6 RC. Time to complete the task is also decreased by 24.61% using TLP. In sharp contrast, an 8×8 RC (a naive extension of the 6×6 RC) consumed 19.18% more time than the classical 6×6 RC. Furthermore, the statistical language model is also exploited to improve classification accuracy in a Bayesian approach. The proposed Bayesian fusion method is tested offline on data from the online spelling tasks. The results show its potential to improve single-trial ERP classification.
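    The Bayesian fusion idea, combining a language-model prior over the next character with evidence from the ERP classifier, can be sketched in Python as follows. The toy priors, classifier scores and the softmax conversion of scores to likelihoods are assumptions for illustration, not the paper's exact formulation.

        import numpy as np

        def fuse(lm_prior, erp_scores):
            """Combine a language-model prior P(char | typed context) with ERP classifier
            scores via Bayes' rule: posterior is proportional to prior * likelihood.
            Converting classifier scores to likelihoods with a softmax is an assumption."""
            chars = list(lm_prior)
            scores = np.array([erp_scores[c] for c in chars])
            likelihood = np.exp(scores - scores.max())
            likelihood /= likelihood.sum()
            prior = np.array([lm_prior[c] for c in chars])
            posterior = prior * likelihood
            return dict(zip(chars, posterior / posterior.sum()))

        lm_prior = {"E": 0.5, "A": 0.3, "Q": 0.2}        # toy P(next character | context)
        erp_scores = {"E": 0.9, "A": 1.1, "Q": -0.4}     # toy classifier outputs per candidate
        posterior = fuse(lm_prior, erp_scores)
        print(posterior, "->", max(posterior, key=posterior.get))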

  16. Computer-aided generation of stimulation data and model identification for functional electrical stimulation (FES) control of lower extremities.

    PubMed

    Eom, G M; Watanabe, T; Futami, R; Hoshimiy, N; Handa, Y

    2000-01-01

    Standard stimulation data for unassisted standing up of paraplegic patients were generated by dynamic optimization linked with model simulation, to overcome the difficulties in the present electromyogram (EMG)-based method. The generated stimulation data were roughly in agreement with the normal subjects' EMG. These results suggest that the 'model-based' method is a useful alternative to the 'EMG-based' method. The same technique can be applied to the generation of patient-specific stimulation data once the musculoskeletal system of a patient is properly identified. The musculoskeletal system must be identified from data taken from simple and noninvasive experiments for the identification method to be practically acceptable. We developed a musculoskeletal model and systematic identification protocols for this purpose. They were validated for the vastus lateralis muscle at the knee joint. The identification was successful and the predicted joint angle trajectories closely matched the experimental data. This implies that model-based generation of patient-specific stimulation data is possible.

  17. Low-level waste disposal site performance assessment with the RQ/PQ computer program. Final report. [Shallow land burial]

    SciTech Connect

    Rogers, V.C.; Grant, M.W.; Merrell, G.B.; Macbeth, P.J.

    1983-06-01

    The operation of the computer code RQ/PQ (retention quotient/performance quotient) is presented. The code calculates the potential hazard of low-level radioactive waste as well as the characteristics of the natural and man-made barriers provided by the disposal facility. From these parameters a facility safety factor is determined as a function of time after facility operation. The program also calculates the dose to man for the nine release pathways. Both off-site transport and inadvertent intrusion pathways are considered.

  18. COMGEN: A computer program for generating finite element models of composite materials at the micro level

    NASA Technical Reports Server (NTRS)

    Melis, Matthew E.

    1990-01-01

    COMGEN (Composite Model Generator) is an interactive FORTRAN program which can be used to create a wide variety of finite element models of continuous fiber composite materials at the micro level. It quickly generates batch or session files to be submitted to the finite element pre- and postprocessor PATRAN based on a few simple user inputs such as fiber diameter and percent fiber volume fraction of the composite to be analyzed. In addition, various mesh densities, boundary conditions, and loads can be assigned easily to the models within COMGEN. PATRAN uses a session file to generate finite element models and their associated loads which can then be translated to virtually any finite element analysis code such as NASTRAN or MARC.
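    A key geometric step in building such micro-level models is converting fiber diameter and fiber volume fraction into a unit-cell spacing. For a square fiber array, Vf = πd²/(4s²), so s = d·sqrt(π/(4Vf)). The Python sketch below applies this relation; the square-packing assumption and the numerical inputs are illustrative and are not taken from COMGEN's documentation.

        import math

        def square_array_spacing(fiber_diameter, fiber_volume_fraction):
            """Center-to-center fiber spacing s for a square array, from
            Vf = pi * d^2 / (4 * s^2)  =>  s = d * sqrt(pi / (4 * Vf)).
            The square packing assumption is illustrative only."""
            return fiber_diameter * math.sqrt(math.pi / (4.0 * fiber_volume_fraction))

        d = 10.0e-6   # fiber diameter, m (about 10 microns, assumed)
        vf = 0.60     # 60 % fiber volume fraction (assumed)
        print(f"unit-cell spacing: {square_array_spacing(d, vf) * 1e6:.2f} microns")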

  19. Computational Visual Stress Level Analysis of Calcareous Algae Exposed to Sedimentation

    PubMed Central

    Nilssen, Ingunn; Eide, Ingvar; de Oliveira Figueiredo, Marcia Abreu; de Souza Tâmega, Frederico Tapajós; Nattkemper, Tim W.

    2016-01-01

    This paper presents a machine learning based approach for analyses of photos collected from laboratory experiments conducted to assess the potential impact of water-based drill cuttings on deep-water rhodolith-forming calcareous algae. This pilot study uses imaging technology to quantify and monitor the stress levels of the calcareous algae Mesophyllum engelhartii (Foslie) Adey caused by various degrees of light exposure, flow intensity and amount of sediment. A machine learning based algorithm was applied to assess the temporal variation of the calcareous algae size (∼ mass) and color automatically. Measured size and color were correlated to the photosynthetic efficiency (maximum quantum yield of charge separation in photosystem II, ΦPSIImax) and degree of sediment coverage using multivariate regression. The multivariate regression showed correlations between time and calcareous algae sizes, as well as correlations between fluorescence and calcareous algae colors. PMID:27285611
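    The multivariate regression step relating measured size and colour to photosynthetic efficiency and sediment coverage can be sketched as an ordinary least-squares fit in Python. The data below are random placeholders generated for illustration, not measurements from the experiment.

        import numpy as np

        rng = np.random.default_rng(0)

        # Placeholder predictors: photosynthetic efficiency (PhiPSIImax) and sediment coverage
        n = 40
        phi = rng.uniform(0.3, 0.7, n)
        sediment = rng.uniform(0.0, 1.0, n)
        colour_index = 2.0 * phi - 0.8 * sediment + rng.normal(0.0, 0.05, n)   # synthetic response

        # Ordinary least squares with an intercept column
        X = np.column_stack([np.ones(n), phi, sediment])
        coef, *_ = np.linalg.lstsq(X, colour_index, rcond=None)
        print("intercept, phi and sediment coefficients:", np.round(coef, 3))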

  20. Multi-level security for computer networking: SAC digital network approach

    SciTech Connect

    Griess, W.; Poutre, D.L.

    1983-10-01

    For telecommunications systems simultaneously handling data of different security levels, multilevel secure (MLS) operation permits maximum use of resources by automatically providing protection to users with various clearances and needs-to-know. The Strategic Air Command (SAC) is upgrading the primary record data system used to command and control its strategic forces. The upgrade, called the SAC Digital Network (SACDIN), is designed to provide multilevel security to support users and external interfaces, with allowed accesses ranging from unclassified to top secret. SACDIN implements a security kernel based upon the Bell and LaPadula security model. This study presents an overview of the SACDIN security architecture and describes the basic message flow across the MLS network. 7 references.

  1. Multi-level security for computer networking - SAC digital network approach

    NASA Astrophysics Data System (ADS)

    Griess, W.; Poutre, D. L.

    The functional features and architecture of the SACDIN (SAC digital network) are detailed. SACDIN is the new data transmission segment for directing SAC's strategic forces. The system has 135 processor nodes at 32 locations and processes, distributes and stores data of any level of security classification. The sophistication of the access nodes depends on the location. A reference monitor mediates the multilevel security by implementing the multi-state machine concept, i.e., the Bell-LaPadula model (1973, 1974), in which a secure state can never lead to an insecure state. The monitor is controlled by the internal access control mechanism, which resides in PROM. Details of the access process are provided, including message flow on trusted paths appropriate to the security clearance of the user.

  2. Computational Visual Stress Level Analysis of Calcareous Algae Exposed to Sedimentation.

    PubMed

    Osterloff, Jonas; Nilssen, Ingunn; Eide, Ingvar; de Oliveira Figueiredo, Marcia Abreu; de Souza Tâmega, Frederico Tapajós; Nattkemper, Tim W

    2016-01-01

    This paper presents a machine learning based approach for analyses of photos collected from laboratory experiments conducted to assess the potential impact of water-based drill cuttings on deep-water rhodolith-forming calcareous algae. This pilot study uses imaging technology to quantify and monitor the stress levels of the calcareous algae Mesophyllum engelhartii (Foslie) Adey caused by various degrees of light exposure, flow intensity and amount of sediment. A machine learning based algorithm was applied to assess the temporal variation of the calcareous algae size (∼ mass) and color automatically. Measured size and color were correlated to the photosynthetic efficiency (maximum quantum yield of charge separation in photosystem II, ΦPSIImax) and degree of sediment coverage using multivariate regression. The multivariate regression showed correlations between time and calcareous algae sizes, as well as correlations between fluorescence and calcareous algae colors.

  3. A Well-Mixed Computational Model for Estimating Room Air Levels of Selected Constituents from E-Vapor Product Use

    PubMed Central

    Rostami, Ali A.; Pithawalla, Yezdi B.; Liu, Jianmin; Oldham, Michael J.; Wagner, Karl A.; Frost-Pineda, Kimberly; Sarkar, Mohamadi A.

    2016-01-01

    Concerns have been raised in the literature for the potential of secondhand exposure from e-vapor product (EVP) use. It would be difficult to experimentally determine the impact of various factors on secondhand exposure including, but not limited to, room characteristics (indoor space size, ventilation rate), device specifications (aerosol mass delivery, e-liquid composition), and use behavior (number of users and usage frequency). Therefore, a well-mixed computational model was developed to estimate the indoor levels of constituents from EVPs under a variety of conditions. The model is based on physical and thermodynamic interactions between aerosol, vapor, and air, similar to indoor air models referred to by the Environmental Protection Agency. The model results agree well with measured indoor air levels of nicotine from two sources: smoking machine-generated aerosol and aerosol exhaled from EVP use. Sensitivity analysis indicated that increasing air exchange rate reduces room air level of constituents, as more material is carried away. The effect of the amount of aerosol released into the space due to variability in exhalation was also evaluated. The model can estimate the room air level of constituents as a function of time, which may be used to assess the level of non-user exposure over time. PMID:27537903
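    The well-mixed assumption reduces the room to a single mass balance, dC/dt = S/V − (ACH + k)·C, with emission rate S, room volume V, air exchange rate ACH and a first-order loss term k. The Python sketch below integrates this balance with assumed parameter values that are not taken from the study.

        # Minimal well-mixed room mass balance: dC/dt = S/V - (ach + k) * C
        # All parameter values below are assumptions for illustration, not the study's inputs.
        V = 30.0            # room volume, m^3
        S = 50.0 / 3600.0   # constituent emission rate, ug/s (50 ug per hour)
        ach = 1.0 / 3600.0  # air exchange rate, 1/s (1 air change per hour)
        k = 0.2 / 3600.0    # first-order deposition/loss rate, 1/s

        dt, t_end = 1.0, 4 * 3600        # 1 s steps over 4 hours
        C = 0.0                          # concentration, ug/m^3, room starts clean
        for _ in range(int(t_end / dt)):
            C += dt * (S / V - (ach + k) * C)   # forward Euler integration

        print(f"after 4 h: {C:.2f} ug/m^3  (steady state: {S / (V * (ach + k)):.2f} ug/m^3)")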

  4. A Well-Mixed Computational Model for Estimating Room Air Levels of Selected Constituents from E-Vapor Product Use.

    PubMed

    Rostami, Ali A; Pithawalla, Yezdi B; Liu, Jianmin; Oldham, Michael J; Wagner, Karl A; Frost-Pineda, Kimberly; Sarkar, Mohamadi A

    2016-01-01

    Concerns have been raised in the literature for the potential of secondhand exposure from e-vapor product (EVP) use. It would be difficult to experimentally determine the impact of various factors on secondhand exposure including, but not limited to, room characteristics (indoor space size, ventilation rate), device specifications (aerosol mass delivery, e-liquid composition), and use behavior (number of users and usage frequency). Therefore, a well-mixed computational model was developed to estimate the indoor levels of constituents from EVPs under a variety of conditions. The model is based on physical and thermodynamic interactions between aerosol, vapor, and air, similar to indoor air models referred to by the Environmental Protection Agency. The model results agree well with measured indoor air levels of nicotine from two sources: smoking machine-generated aerosol and aerosol exhaled from EVP use. Sensitivity analysis indicated that increasing air exchange rate reduces room air level of constituents, as more material is carried away. The effect of the amount of aerosol released into the space due to variability in exhalation was also evaluated. The model can estimate the room air level of constituents as a function of time, which may be used to assess the level of non-user exposure over time. PMID:27537903

  5. In pursuit of an accurate spatial and temporal model of biomolecules at the atomistic level: a perspective on computer simulation

    SciTech Connect

    Gray, Alan; Harlen, Oliver G.; Harris, Sarah A.; Khalid, Syma; Leung, Yuk Ming; Lonsdale, Richard; Mulholland, Adrian J.; Pearson, Arwen R.; Read, Daniel J.; Richardson, Robin A.

    2015-01-01

    The current computational techniques available for biomolecular simulation are described, and the successes and limitations of each with reference to the experimental biophysical methods that they complement are presented. Despite huge advances in the computational techniques available for simulating biomolecules at the quantum-mechanical, atomistic and coarse-grained levels, there is still a widespread perception amongst the experimental community that these calculations are highly specialist and are not generally applicable by researchers outside the theoretical community. In this article, the successes and limitations of biomolecular simulation and the further developments that are likely in the near future are discussed. A brief overview is also provided of the experimental biophysical methods that are commonly used to probe biomolecular structure and dynamics, and the accuracy of the information that can be obtained from each is compared with that from modelling. It is concluded that progress towards an accurate spatial and temporal model of biomacromolecules requires a combination of all of these biophysical techniques, both experimental and computational.

  6. Identification of light absorbing oligomers from glyoxal and methylglyoxal aqueous processing: a comparative study at the molecular level

    NASA Astrophysics Data System (ADS)

    Finessi, Emanuela; Hamilton, Jacqueline; Rickard, Andrew; Baeza-Romero, Maria; Healy, Robert; Peppe, Salvatore; Adams, Tom; Daniels, Mark; Ball, Stephen; Goodall, Iain; Monks, Paul; Borras, Esther; Munoz, Amalia

    2014-05-01

    Numerous studies point to the reactive uptake of gaseous low molecular weight carbonyls onto atmospheric waters (clouds/fog droplets and wet aerosols) as an important SOA formation route not yet included in current models. However, the evaluation of these processes is challenging because water provides a medium for a complex array of reactions to take place such as self-oligomerization, aldol condensation and Maillard-type browning reactions in the presence of ammonium salts. In addition to adding to SOA mass, aqueous chemistry products have been shown to include light absorbing, surface-active and high molecular weight oligomeric species, and can therefore affect climatically relevant aerosol properties such as light absorption and hygroscopicity. Glyoxal (GLY) and methylglyoxal (MGLY) are the gaseous carbonyls that have perhaps received the most attention to date owing to their ubiquity, abundance and reactivity in water, with the majority of studies focussing on bulk physical properties. However, very little is known at the molecular level, in particular for MGLY, and the relative potential of these species as aqueous SOA precursors in ambient air is still unclear. We have conducted experiments with both laboratory solutions and chamber-generated particles to simulate the aqueous processing of GLY and MGLY with ammonium sulphate (AS) under typical atmospheric conditions and investigated their respective aging products. Both high performance liquid chromatography coupled with UV-Vis detection and ion trap mass spectrometry (HPLC-DAD-MSn) and high resolution mass spectrometry (FTICRMS) have been used for molecular identification purposes. Comprehensive gas chromatography with nitrogen chemiluminescence detection (GCxGC-NCD) has been applied for the first time to these systems, revealing a surprisingly high number of nitrogen-containing organics (ONs), with a large extent of polarities. GCxGC-NCD proved to be a valuable tool to determine overall amount and rates of

  7. A computational model of oxygen transport in the cerebrocapillary levels for normal and pathologic brain function.

    PubMed

    Safaeian, Navid; David, Tim

    2013-10-01

    The oxygen exchange and correlation between the cerebral blood flow (CBF) and cerebral metabolic rate of oxygen consumption (CMRO2) in the cortical capillary levels for normal and pathologic brain functions remain the subject of debate. A 3D realistic mesoscale model of the cortical capillary network (non-tree like) is constructed using a random Voronoi tessellation in which each edge represents a capillary segment. The hemodynamics and oxygen transport are numerically simulated in the model, which involves rheological laws in the capillaries, oxygen diffusion, and non-linear binding of oxygen to hemoglobin, respectively. The findings show that the cerebral hypoxia due to a significant decreased perfusion (as can occur in stroke) can be avoided by a moderate reduction in oxygen demand. Oxygen extraction fraction (OEF) can be an important indicator for the brain oxygen metabolism under normal perfusion and misery-perfusion syndrome (leading to ischemia). The results demonstrated that a disproportionately large increase in blood supply is required for a small increase in the oxygen demand, which, in turn, is strongly dependent on the resting OEF. The predicted flow-metabolism coupling in the model supports the experimental studies of spatiotemporal stimulations in humans by positron emission tomography and functional magnetic resonance imaging.
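    The quantities reported by such models are tied together by the Fick principle: OEF = (CaO2 − CvO2)/CaO2 and CMRO2 = CBF·CaO2·OEF. The short Python sketch below works through these relations with typical textbook-style values, which are assumptions rather than outputs of the model.

        # Fick-principle relations used when reporting oxygen extraction and metabolism:
        #   OEF   = (CaO2 - CvO2) / CaO2
        #   CMRO2 = CBF * CaO2 * OEF
        # Values below are typical textbook-style numbers, not outputs of the model.

        CaO2 = 8.0    # arterial O2 content, umol O2 per mL blood (assumed)
        CvO2 = 4.8    # venous O2 content, umol O2 per mL blood (assumed)
        CBF  = 0.50   # cerebral blood flow, mL blood per g tissue per min (assumed)

        OEF = (CaO2 - CvO2) / CaO2
        CMRO2 = CBF * CaO2 * OEF
        print(f"OEF = {OEF:.2f}, CMRO2 = {CMRO2:.2f} umol O2/g/min")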

  8. A Full Dynamic Compound Inverse Method for output-only element-level system identification and input estimation from earthquake response signals

    NASA Astrophysics Data System (ADS)

    Pioldi, Fabio; Rizzi, Egidio

    2016-08-01

    This paper proposes a new output-only element-level system identification and input estimation technique, towards the simultaneous identification of modal parameters, input excitation time history and structural features at the element-level by adopting earthquake-induced structural response signals. The method, named Full Dynamic Compound Inverse Method (FDCIM), releases strong assumptions of earlier element-level techniques, by working with a two-stage iterative algorithm. Jointly, a Statistical Average technique, a modification process and a parameter projection strategy are adopted at each stage to achieve stronger convergence for the identified estimates. The proposed method works in a deterministic way and is completely developed in State-Space form. Further, it does not require continuous- to discrete-time transformations and does not depend on initialization conditions. Synthetic earthquake-induced response signals from different shear-type buildings are generated to validate the implemented procedure, also with noise-corrupted cases. The achieved results provide a necessary condition to demonstrate the effectiveness of the proposed identification method.

  9. Refined multi-level methodology in parallel computing environment for BWR RIA analysis

    NASA Astrophysics Data System (ADS)

    Solis-Rodarte, Jorge

    2000-12-01

    Best-estimate methodologies in boiling water reactors can reduce the traditional conservative thermal margins imposed on the designs and during the operation of this type of nuclear reactor. Traditional operating thermal margins are obtained from simplified modeling techniques that are supplemented with the required dose of conservatism. For instance, much more realistic transient pin peaking distributions can be predicted by applying a dehomogenization algorithm, based on a flux reconstruction scheme which uses nodal results during both steady-state and transient calculations at each time step. A subchannel analysis module for obtaining thermal margins supplements the calculation approach used. A multi-level methodology to extend the TRAC-BF1/NEM coupled code capability to obtain the transient fuel rod response has been implemented. To fulfill the development requirements, some improved neutronic models were implemented into the NEM solution algorithm, namely the pin power reconstruction capability and the simulation of a dynamic scram. The obtained results were coupled to a subchannel analysis module, the COBRA-TF T-H subchannel analysis code. The objective of the pin power reconstruction capability of NEM is to locate the most limiting node (axial region of an assembly/channel) within the core. The power information obtained from the NEM 3D neutronic calculation is used by the hot channel analysis module (COBRA-TF). COBRA-TF also needs the T-H conditions at the boundary nodes. This information is provided by the TRAC-BF1 T-H system analysis code. The subchannel analysis module uses this information to re-calculate the fluid, thermal and hydraulic conditions in the most limiting node (axial region of an assembly/channel) within the core.

  10. Disturbed Interplay between Mid- and High-Level Vision in ASD? Evidence from a Contour Identification Task with Everyday Objects

    ERIC Educational Resources Information Center

    Evers, Kris; Panis, Sven; Torfs, Katrien; Steyaert, Jean; Noens, Ilse; Wagemans, Johan

    2014-01-01

    Atypical visual processing in children with autism spectrum disorder (ASD) does not seem to reside in an isolated processing component, such as global or local processing. We therefore developed a paradigm that requires the interaction between different processes--an identification task with Gaborized object outlines--and applied this to two age…

  11. In vivo changes in plasma coenzyme Q10, carotenoid, tocopherol, and retinol levels in children after computer tomography

    PubMed Central

    Halm, Brunhild M.; Lai, Jennifer F.; Morrison, Cynthia M.; Pagano, Ian; Custer, Laurie J.; Cooney, Robert V.; Franke, Adrian A.

    2015-01-01

    Background Low dose X-irradiation (IR) from computer tomography (CT) can generate free radicals, which can damage biologically relevant molecules and ultimately lead to cancer. These effects are especially concerning for children owing to their higher radiosensitivity and longer life expectancy than adults. The lipid phase micronutrients (LPM) coenzyme Q10, carotenoids, E vitamers, and vitamin A are potent radical scavengers that can act as intracellular antioxidants. Methods We investigated changes in circulating levels of these LPM in 17 children (0.25–6 y) undergoing medically indicated CT scans involving relatively low IR doses. Blood was drawn before and 1 h after CT scans and analyzed using HPLC with electrochemical and UV/VIS detection. Results We found significant decreases (p < 0.05) in post-CT plasma levels in several LPM which suggests that these LPM can serve as biodosimeters and may protect against damage from IR during clinical procedures such as CT. The strongest predictors for pre- to post-CT changes for many LPM were their baseline levels. Conclusion Future larger studies are warranted to confirm our findings and to test whether high circulating antioxidant levels protect against IR damage in vivo with an ultimate goal of establishing prophylactic modalities for CT-induced IR damage. PMID:24583267

  12. Identification of Novel Activators of Constitutive Androstane Receptor from FDA-approved Drugs by Integrated Computational and Biological Approaches

    PubMed Central

    Lynch, Caitlin; Pan, Yongmei; Li, Linhao; Ferguson, Stephen S.; Xia, Menghang; Swaan, Peter W.; Wang, Hongbing

    2012-01-01

    Purpose The constitutive androstane receptor (CAR, NR1I3) is a xenobiotic sensor governing the transcription of numerous hepatic genes associated with drug metabolism and clearance. Recent evidence suggests that CAR also modulates energy homeostasis and cancer development. Thus, identification of novel human (h) CAR activators is of both clinical importance and scientific interest. Methods Docking and ligand-based structure-activity models were used for virtual screening of a database containing over 2000 FDA-approved drugs. Identified lead compounds were evaluated in cell-based reporter assays to determine hCAR activation. Potential activators were further tested in human primary hepatocytes (HPHs) for the expression of the prototypical hCAR target gene CYP2B6. Results Nineteen lead compounds with optimal modeling parameters were selected for biological evaluation. Seven of the 19 leads exhibited moderate to potent activation of hCAR. Five out of the seven compounds translocated hCAR from the cytoplasm to the nucleus of HPHs in a concentration-dependent manner. These compounds also induce the expression of CYP2B6 in HPHs with rank-order of efficacies closely resembling that of hCAR activation. Conclusion These results indicate that our strategically integrated approaches are effective in the identification of novel hCAR modulators, which may function as valuable research tools or potential therapeutic molecules. PMID:23090669

  13. Computer-aided drug discovery

    PubMed Central

    Bajorath, Jürgen

    2015-01-01

    Computational approaches are an integral part of interdisciplinary drug discovery research. Understanding the science behind computational tools, their opportunities, and limitations is essential to make a true impact on drug discovery at different levels. If applied in a scientifically meaningful way, computational methods improve the ability to identify and evaluate potential drug molecules, but there remain weaknesses in the methods that preclude naïve applications. Herein, current trends in computer-aided drug discovery are reviewed, and selected computational areas are discussed. Approaches are highlighted that aid in the identification and optimization of new drug candidates. Emphasis is put on the presentation and discussion of computational concepts and methods, rather than case studies or application examples. As such, this contribution aims to provide an overview of the current methodological spectrum of computational drug discovery for a broad audience. PMID:26949519

  14. Computational identification and characterization of conserved miRNAs and their target genes in garlic (Allium sativum L.) expressed sequence tags.

    PubMed

    Panda, Debashis; Dehury, Budheswar; Sahu, Jagajjit; Barooah, Madhumita; Sen, Priyabrata; Modi, Mahendra K

    2014-03-10

    Endogenous small non-coding microRNAs (miRNAs), ~21 to 24 nucleotides in length, play a pivotal role in regulating gene expression in plants and animals by silencing genes, either by degrading homologous mRNA or by blocking its translation. Although various high-throughput, time-consuming and expensive techniques such as forward genetics and direct cloning are employed to detect miRNAs in plants, comparative genomics complemented with bioinformatic tools paves the way for efficient and cost-effective identification of miRNAs through homology searches against previously known miRNAs. In this study, an attempt was made to identify and characterize conserved miRNAs in garlic expressed sequence tags (ESTs) through computational means. For identification of novel miRNAs in garlic, a total of 3227 known mature miRNAs of the plant kingdom Viridiplantae were searched for homology against 21,637 EST sequences, resulting in the identification of 6 potential miRNA candidates belonging to 6 different miRNA families. The psRNATarget server predicted 33 potential target genes and their probable functions for the six identified miRNA families in garlic. Most of the garlic miRNA target genes seem to encode transcription factors as well as proteins involved in stress response, metabolism, plant growth and development. The results from the present study shed more light on the molecular mechanisms of miRNA action in garlic and may aid in the development of precise techniques for studying post-transcriptional gene silencing in response to stress.
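    The homology-search step can be illustrated with a small Python filtering sketch: candidate windows extracted from ESTs are retained when they match a known mature miRNA with at most a couple of mismatches. The threshold, the example miRNA sequence and the toy EST windows are illustrative; real pipelines also check precursor secondary structure (hairpin folding) before accepting a candidate.

        def mismatches(a, b):
            """Count mismatches between two equal-length sequences."""
            return sum(1 for x, y in zip(a, b) if x != y)

        def screen_candidates(known_mirnas, est_windows, max_mismatch=2):
            """Keep EST windows of miRNA length that match a known mature miRNA
            with at most `max_mismatch` substitutions (no gaps, for simplicity)."""
            hits = []
            for name, mirna in known_mirnas.items():
                for est_id, window in est_windows:
                    if len(window) == len(mirna) and mismatches(mirna, window) <= max_mismatch:
                        hits.append((name, est_id, mismatches(mirna, window)))
            return hits

        known = {"miR-172": "AGAAUCUUGAUGAUGCUGCAU"}        # illustrative known mature miRNA
        windows = [("EST_0001", "AGAAUCUUGAUGAUGCUGCAU"),
                   ("EST_0002", "AGACUCUUGAUGAUGCUGCAU")]    # toy EST-derived windows
        print(screen_candidates(known, windows))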

  15. Identification of an intra-cranial intra-axial porcupine quill foreign body with computed tomography in a canine patient.

    PubMed

    Sauvé, Christopher P; Sereda, Nikki C; Sereda, Colin W

    2012-02-01

    An intra-cranial intra-axial foreign body was diagnosed in a golden retriever dog through the use of computed tomography (CT). Confirmed by necropsy, a porcupine quill had migrated to the patient's left cerebral hemisphere, likely through the oval foramen. This case study demonstrates the efficacy of CT in visualizing a quill in the canine brain.

  16. Developing an Educational Computer Game for Migratory Bird Identification Based on a Two-Tier Test Approach

    ERIC Educational Resources Information Center

    Chu, Hui-Chun; Chang, Shao-Chen

    2014-01-01

    Although educational computer games have been recognized as being a promising approach, previous studies have indicated that, without supportive models, students might only show temporary interest during the game-based learning process, and their learning performance is often not as good as expected. Therefore, in this paper, a two-tier test…

  18. Identification of an intra-cranial intra-axial porcupine quill foreign body with computed tomography in a canine patient

    PubMed Central

    Sauvé, Christopher P.; Sereda, Nikki C.; Sereda, Colin W.

    2012-01-01

    An intra-cranial intra-axial foreign body was diagnosed in a golden retriever dog through the use of computed tomography (CT). Confirmed by necropsy, a porcupine quill had migrated to the patient’s left cerebral hemisphere, likely through the oval foramen. This case study demonstrates the efficacy of CT in visualizing a quill in the canine brain. PMID:22851782

  19. Identification of Students' Intuitive Mental Computational Strategies for 1, 2 and 3 Digits Addition and Subtraction: Pedagogical and Curricular Implications

    ERIC Educational Resources Information Center

    Ghazali, Munirah; Alias, Rohana; Ariffin, Noor Asrul Anuar; Ayub, Ayminsyadora

    2010-01-01

    This paper reports on a study to examine mental computation strategies used by Year 1, Year 2, and Year 3 students to solve addition and subtraction problems. The participants in this study were twenty five 7 to 9 year-old students identified as excellent, good and satisfactory in their mathematics performance from a school in Penang, Malaysia.…

  20. Identification of transformation products of rosuvastatin in water during ZnO photocatalytic degradation through the use of associated LC-QTOF-MS to computational chemistry.

    PubMed

    Segalin, Jéferson; Sirtori, Carla; Jank, Louíse; Lima, Martha F S; Livotto, Paolo R; Machado, Tiele C; Lansarin, Marla A; Pizzolato, Tânia M

    2015-12-15

    Rosuvastatin (RST), a synthetic statin, is a 3-hydroxy-3-methylglutaryl coenzyme A reductase inhibitor with a number of pleiotropic properties, such as anti-inflammation, antioxidation and cardiac remodelling attenuation. According to IMS Health, rosuvastatin was the third best-selling drug in the United States in 2012. RST was recently found in European effluent samples at a detection frequency of 36%. In this study, we evaluate the identification of the major transformation products (TPs) of RST generated during heterogeneous photocatalysis with ZnO. The degradation of the parent molecule and the identification of the main TPs were studied in demineralised water. The TPs were monitored and identified by liquid chromatography-quadrupole time-of-flight mass spectrometry (LC-QTOF-MS/MS). Ten TPs were tentatively identified, some of them originating from hydroxylation of the aromatic ring during the initial stages of the process. Structural elucidation of some of the most abundant or persistent TPs was supported by computational analysis, which demonstrated that this approach can be used as a tool to help elucidate the structures of unknown molecules. Analysis of the parameters obtained from ab initio calculations for different isomers showed which structures are the most stable and, consequently, the most likely to be found.
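    The ab initio comparison of candidate isomers ultimately rests on their relative energies, which can be converted into Boltzmann populations to indicate which structures are most likely. The Python sketch below shows that conversion; the isomer labels and energies are hypothetical and do not correspond to the identified TPs.

        import math

        def boltzmann_populations(rel_energies_kj_mol, temperature_k=298.15):
            """Relative populations of candidate isomers from their relative
            energies (kJ/mol), assuming a simple Boltzmann distribution."""
            R = 8.314462618e-3          # gas constant, kJ/(mol*K)
            weights = {name: math.exp(-e / (R * temperature_k))
                       for name, e in rel_energies_kj_mol.items()}
            total = sum(weights.values())
            return {name: w / total for name, w in weights.items()}

        # Hypothetical relative energies of three candidate hydroxylation isomers
        energies = {"TP-a (ring OH, ortho)": 0.0, "TP-b (ring OH, meta)": 3.5, "TP-c (chain OH)": 12.0}
        for name, p in boltzmann_populations(energies).items():
            print(f"{name}: {100 * p:.1f} %")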

  1. Computer Music

    NASA Astrophysics Data System (ADS)

    Cook, Perry R.

    This chapter covers algorithms, technologies, computer languages, and systems for computer music. Computer music involves the application of computers and other digital/electronic technologies to music composition, performance, theory, history, and the study of perception. The field combines digital signal processing, computational algorithms, computer languages, hardware and software systems, acoustics, psychoacoustics (low-level perception of sounds from the raw acoustic signal), and music cognition (higher-level perception of musical style, form, emotion, etc.).

  2. High level waste storage tank farms/242-A evaporator standards/requirements identification document phase 1 assessment report

    SciTech Connect

    Biebesheimer, E., Westinghouse Hanford Co.

    1996-09-30

    This document, the Standards/Requirements Identification Document (S/RID) Phase I Assessment Report for the subject facility, represents the results of an Administrative Assessment to determine whether S/RID requirements are fully addressed by existing policies, plans or procedures. It contains: compliance status, remedial actions, and an implementing-manuals report linking each S/RID element to its requirement source and to the implementing manual and section.

  3. Species level identification and antifungal susceptibility of yeasts isolated from various clinical specimens and evaluation of Integral System Yeasts Plus.

    PubMed

    Bicmen, Can; Doluca, Mine; Gulat, Sinem; Gunduz, Ayriz T; Tuksavul, Fevziye

    2012-07-01

    It is essential to use easy, standard, cost-effective and accurate methods for identification and susceptibility testing of yeasts in routine practice. This study aimed to establish the species distribution and antifungal susceptibility of yeast isolates and also to evaluate the performance of the colorimetric and commercially available Integral System Yeasts Plus (ISYP). Yeast isolates (n=116) were identified by conventional methods and ISYP. Antifungal susceptibility testing was performed by the microdilution method according to the standards of CLSI M27-A3 and ISYP. Candida albicans (50%) was the most common species isolated, followed by C. parapsilosis (25%) (mostly in blood samples). According to the CLSI M27-S3 criteria, resistance rates for amphotericin B, flucytosine, fluconazole, itraconazole, and voriconazole were 0%, 0%, 4.6%, 4.5% and 1.8%, respectively. Resistance to miconazole (MIC >1 mg/L) was found in 17.9% of isolates. For 62 (53.4%) of the isolates analyzed by ISYP, the identification either disagreed with that obtained by the conventional methods and the API ID 32C identification kit or no specific identification code could be assigned by ISYP. The performance of ISYP was low for all antifungal drugs tested according to the ROC analysis (AUC: 0.28-0.56). As the current version of ISYP performs poorly, it is recommended to use other commercial systems whose results are confirmed to be reliable and in agreement with those of the reference methods for identification and susceptibility testing of yeasts. PMID:22842602

  4. Performance/Design Requirements and Detailed Technical Description for a Computer-Directed Training Subsystem for Integration into the Air Force Phase II Base Level System.

    ERIC Educational Resources Information Center

    Butler, A. K.; And Others

    The performance/design requirements and a detailed technical description for a Computer-Directed Training Subsystem to be integrated into the Air Force Phase II Base Level System are described. The subsystem may be used for computer-assisted lesson construction and has presentation capability for on-the-job training for data automation, staff, and…

  5. Utility of indels for species-level identification of a biologically complex plant group: a study with intergenic spacer in Citrus.

    PubMed

    Mahadani, Pradosh; Ghosh, Sankar Kumar

    2014-11-01

    The Consortium for the Barcode of Life plant working group proposed using defined portions of the plastid genes rbcL and matK, either singly or in combination, as the standard DNA barcode for plants. However, DNA barcode-based identification of biologically complex plant groups is always a challenging task due to the occurrence of natural hybridization. Here, we examined the use of indel polymorphisms in trnH-psbA and trnL-trnF sequences for rapid species identification of citrus. DNA from young leaves of selected citrus species was isolated, and the matK gene (~800 bp) and trnH-psbA spacer (~450 bp) of chloroplast DNA were amplified for species-level identification. The sequences within the Citrus group taxa were aligned using the ClustalX program, and a few obvious misalignments were corrected manually using the similarity criterion. We identified a 54 bp inverted repeat (palindrome) sequence spanning positions 27-80 and six multi-residue indel regions. Large inverted repeats in cpDNA provided authentication at higher taxonomic levels. These diagnostic indel markers from trnH-psbA successfully identified 5 of the 7 studied Citrus species, the exceptions being Citrus limon and Citrus medica; these two closely related species were distinguished through a 6 bp deletion in trnL-trnF. This study demonstrates that the indel-polymorphism-based approach readily characterizes Citrus species and may be applied to other complex groups. Likewise, indels occurring in other chloroplast intergenic spacer regions may be tested for rapid identification of other secondary citrus species.
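    The kind of inverted repeat (palindrome) reported in trnH-psbA can be located by scanning a sequence for segments whose reverse complement reappears downstream, as in the short Python sketch below. The arm length and the toy sequence are illustrative values, not the Citrus data.

        def reverse_complement(seq):
            comp = {"A": "T", "T": "A", "G": "C", "C": "G"}
            return "".join(comp[base] for base in reversed(seq))

        def find_inverted_repeats(seq, arm_length=6):
            """Report (start_of_arm, start_of_inverted_copy) pairs where a segment of
            `arm_length` bases reappears later as its reverse complement."""
            hits = []
            for i in range(len(seq) - 2 * arm_length + 1):
                arm = seq[i:i + arm_length]
                j = seq.find(reverse_complement(arm), i + arm_length)
                if j != -1:
                    hits.append((i, j))
            return hits

        toy = "ATGCCGTTAGCAAATTTGCTAACGGCAT"   # toy sequence containing a perfect inverted repeat
        print(find_inverted_repeats(toy))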

  6. Evaluation of a wireless wearable tongue-computer interface by individuals with high-level spinal cord injuries.

    PubMed

    Huo, Xueliang; Ghovanloo, Maysam

    2010-04-01

    The tongue drive system (TDS) is an unobtrusive, minimally invasive, wearable and wireless tongue-computer interface (TCI), which can infer its users' intentions, represented in their volitional tongue movements, by detecting the position of a small permanent magnetic tracer attached to the users' tongues. Any specific tongue movements can be translated into user-defined commands and used to access and control various devices in the users' environments. The latest external TDS (eTDS) prototype is built on a wireless headphone and interfaced to a laptop PC and a powered wheelchair. Using customized sensor signal processing algorithms and graphical user interface, the eTDS performance was evaluated by 13 naive subjects with high-level spinal cord injuries (C2-C5) at the Shepherd Center in Atlanta, GA. Results of the human trial show that an average information transfer rate of 95 bits/min was achieved for computer access with 82% accuracy. This information transfer rate is about two times higher than the EEG-based BCIs that are tested on human subjects. It was also demonstrated that the subjects had immediate and full control over the powered wheelchair to the extent that they were able to perform complex wheelchair navigation tasks, such as driving through an obstacle course.
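    The information transfer rate quoted above is conventionally computed with the Wolpaw formula, ITR = rate × [log2 N + P·log2 P + (1 − P)·log2((1 − P)/(N − 1))]. The Python sketch below implements that formula; the number of commands and the selection rate are assumed inputs, since the abstract reports only the accuracy and the final bits/min figure.

        import math

        def wolpaw_itr(num_targets, accuracy, selections_per_min):
            """Information transfer rate (bits/min) from the standard Wolpaw formula."""
            n, p = num_targets, accuracy
            bits = math.log2(n)
            if 0 < p < 1:
                bits += p * math.log2(p) + (1 - p) * math.log2((1 - p) / (n - 1))
            return bits * selections_per_min

        # Example: 6 commands at 82 % accuracy, 60 selections per minute (assumed values)
        print(f"{wolpaw_itr(6, 0.82, 60):.1f} bits/min")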

  7. Evaluation of a wireless wearable tongue–computer interface by individuals with high-level spinal cord injuries

    PubMed Central

    Huo, Xueliang; Ghovanloo, Maysam

    2010-01-01

    The tongue drive system (TDS) is an unobtrusive, minimally invasive, wearable and wireless tongue–computer interface (TCI), which can infer its users' intentions, represented in their volitional tongue movements, by detecting the position of a small permanent magnetic tracer attached to the users' tongues. Any specific tongue movements can be translated into user-defined commands and used to access and control various devices in the users' environments. The latest external TDS (eTDS) prototype is built on a wireless headphone and interfaced to a laptop PC and a powered wheelchair. Using customized sensor signal processing algorithms and graphical user interface, the eTDS performance was evaluated by 13 naive subjects with high-level spinal cord injuries (C2–C5) at the Shepherd Center in Atlanta, GA. Results of the human trial show that an average information transfer rate of 95 bits/min was achieved for computer access with 82% accuracy. This information transfer rate is about two times higher than the EEG-based BCIs that are tested on human subjects. It was also demonstrated that the subjects had immediate and full control over the powered wheelchair to the extent that they were able to perform complex wheelchair navigation tasks, such as driving through an obstacle course. PMID:20332552

  8. Evaluation of a wireless wearable tongue-computer interface by individuals with high-level spinal cord injuries

    NASA Astrophysics Data System (ADS)

    Huo, Xueliang; Ghovanloo, Maysam

    2010-04-01

    The tongue drive system (TDS) is an unobtrusive, minimally invasive, wearable and wireless tongue-computer interface (TCI), which can infer its users' intentions, represented in their volitional tongue movements, by detecting the position of a small permanent magnetic tracer attached to the users' tongues. Any specific tongue movements can be translated into user-defined commands and used to access and control various devices in the users' environments. The latest external TDS (eTDS) prototype is built on a wireless headphone and interfaced to a laptop PC and a powered wheelchair. Using customized sensor signal processing algorithms and graphical user interface, the eTDS performance was evaluated by 13 naive subjects with high-level spinal cord injuries (C2-C5) at the Shepherd Center in Atlanta, GA. Results of the human trial show that an average information transfer rate of 95 bits/min was achieved for computer access with 82% accuracy. This information transfer rate is about two times higher than the EEG-based BCIs that are tested on human subjects. It was also demonstrated that the subjects had immediate and full control over the powered wheelchair to the extent that they were able to perform complex wheelchair navigation tasks, such as driving through an obstacle course.

  9. Causal Attributions of Success and Failure Made by Undergraduate Students in an Introductory-Level Computer Programming Course

    ERIC Educational Resources Information Center

    Hawi, N.

    2010-01-01

    The purpose of this research is to identify the causal attributions of business computing students in an introductory computer programming course, in the computer science department at Notre Dame University, Louaize. Forty-five male and female undergraduates who completed the computer programming course that extended for a 13-week semester…

  10. Identification of Potent Chemotypes Targeting Leishmania major Using a High-Throughput, Low-Stringency, Computationally Enhanced, Small Molecule Screen

    PubMed Central

    Sharlow, Elizabeth R.; Close, David; Shun, Tongying; Leimgruber, Stephanie; Reed, Robyn; Mustata, Gabriela; Wipf, Peter; Johnson, Jacob; O'Neil, Michael; Grögl, Max; Magill, Alan J.; Lazo, John S.

    2009-01-01

    Patients with clinical manifestations of leishmaniasis, including cutaneous leishmaniasis, have limited treatment options, and existing therapies frequently have significant untoward liabilities. Rapid expansion in the diversity of available cutaneous leishmanicidal chemotypes is the initial step in finding alternative efficacious treatments. To this end, we combined a low-stringency Leishmania major promastigote growth inhibition assay with a structural computational filtering algorithm. After a rigorous assay validation process, we interrogated ∼200,000 unique compounds for L. major promastigote growth inhibition. Using iterative computational filtering of the compounds exhibiting >50% inhibition, we identified 553 structural clusters and 640 compound singletons. Secondary confirmation assays yielded 93 compounds with EC50s ≤ 1 µM, with none of the identified chemotypes being structurally similar to known leishmanicidals and most having favorable in silico predicted bioavailability characteristics. The leishmanicidal activity of a representative subset of 15 chemotypes was confirmed in two independent assay formats, and L. major parasite specificity was demonstrated by assaying against a panel of human cell lines. Thirteen chemotypes inhibited the growth of a L. major axenic amastigote-like population. Murine in vivo efficacy studies using one of the new chemotypes document inhibition of footpad lesion development. These results authenticate that low stringency, large-scale compound screening combined with computational structure filtering can rapidly expand the chemotypes targeting in vitro and in vivo Leishmania growth and viability. PMID:19888337
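
    The paper's structural filtering algorithm is not described in the abstract, so the sketch below shows one generic way to group screening hits into structural clusters (Butina clustering on Morgan fingerprints with RDKit); the SMILES strings and similarity cutoff are placeholders, not the screening data.

    ```python
    # Illustrative clustering of hit compounds by fingerprint similarity;
    # not the authors' computational filtering algorithm.
    from rdkit import Chem, DataStructs
    from rdkit.Chem import AllChem
    from rdkit.ML.Cluster import Butina

    def cluster_hits(smiles_list, cutoff=0.4):
        mols = [Chem.MolFromSmiles(s) for s in smiles_list]
        fps = [AllChem.GetMorganFingerprintAsBitVect(m, 2, nBits=2048) for m in mols]
        # Lower-triangle distance list (1 - Tanimoto similarity).
        dists = []
        for i in range(1, len(fps)):
            sims = DataStructs.BulkTanimotoSimilarity(fps[i], fps[:i])
            dists.extend(1.0 - s for s in sims)
        return Butina.ClusterData(dists, len(fps), cutoff, isDistData=True)

    if __name__ == "__main__":
        hits = ["c1ccccc1O", "c1ccccc1N", "CCOC(=O)C", "CCOC(=O)CC"]  # hypothetical hits
        for k, cluster in enumerate(cluster_hits(hits)):
            print(f"cluster {k}: member indices {cluster}")
    ```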

  11. Three-dimensional cephalometry: a method for the identification and for the orientation of the skull after cone-beam computed tomographic scan.

    PubMed

    Frongia, Gianluigi; Bracco, Pietro; Piancino, Maria Grazia

    2013-05-01

    The aims of this work were (1) to describe a method to identify new skeletal landmarks useful to define the reference system to orient the skull in a new position after cone-beam computed tomographic scan and (2) to demonstrate the reliability of this new method. Ten orthognathic patients (5 male, 5 female; mean [SD] age, 18.9 [1.2] years) underwent cone-beam computed tomographic scanning before surgery. Seven 3-dimensional skeletal measurements derived from 4 skeletal points of construction (C) (right, left, and median orbital C, and sella C) were used for this study. Reliability was calculated using Pearson correlation coefficient tests. Intraobserver reliability was 0.9999 for operator A (T1-T2) and 0.9999 for operator B (T1-T2); interobserver reliability was 0.9999 between the first (T1-T1) measurements and 0.9999 between the second (T2-T2). The original method is able to reduce the variability of landmark identification due to the variability of human anatomy and the influence of human error in cephalometric analysis. The innovation of this new method is the real possibility of using the anatomical structures in a 3-dimensional way, enhancing the reliability of the reference points.

  12. High-level waste storage tank farms/242-A evaporator standards/requirements identification document (S/RID), Vol. 1

    SciTech Connect

    Not Available

    1994-04-01

    The purpose of this Requirements Identification Document (RID) section is to identify, in one location, all of the facility specific requirements and good industry practices which are necessary or important to establish an effective Issues Management Program for the Tank Farm Facility. The Management Systems Functional Area includes the site management commitment to environmental safety and health (ES&H) policies and controls, to compliance management, to development and management of policy and procedures, to occurrence reporting and corrective actions, resource and issue management, and to the self-assessment process.

  13. Computational identification and characterization of novel microRNA in the mammary gland of dairy goat (Capra hircus).

    PubMed

    Qu, Bo; Qiu, Youwen; Zhen, Zhen; Zhao, Feng; Wang, Chunmei; Cui, Yingjun; Li, Qizhang; Zhang, Li

    2016-09-01

    Many studies have indicated that microRNAs (miRNAs) influence the development of the mammary gland by posttranscriptionally affecting their target genes. The objective of this research was to identify novel miRNAs in the mammary gland of dairy goats with a bioinformatics approach based on expressed sequence tag (EST) and genome survey sequence (GSS) analyses. We applied all known miRNAs of major mammals to search against the goat EST and GSS databases for the first time to identify new miRNAs. We then validated these newly predicted miRNAs with stem-loop reverse transcription followed by a SYBR Green polymerase chain reaction assay. Finally, 29 mature miRNAs were identified and verified; of these, 14 were grouped into 13 families based on seed sequence identity, and 85 potential target genes of the newly verified miRNAs were subsequently predicted, most of which appear to encode proteins participating in the regulation of metabolism, signal transduction, growth and development. The prediction accuracy for the new miRNAs was 70.37%, which confirmed that the methods used in this study were efficient and reliable. Detailed analyses of the sequence characteristics of the novel miRNAs of the goat mammary gland were performed. In conclusion, these results provide a reference for further identification of miRNAs in animals without a complete genome and thus improve the understanding of miRNAs in the caprine mammary gland. PMID:27659334
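
    A minimal sketch of the family-grouping step mentioned above (grouping mature miRNAs by identity of the seed, nucleotides 2-8); the sequences and names below are hypothetical placeholders, not the study's miRNAs.

    ```python
    # Group mature miRNA sequences into families by seed (positions 2-8).
    from collections import defaultdict

    def group_by_seed(mirnas: dict[str, str]) -> dict[str, list[str]]:
        families = defaultdict(list)
        for name, seq in mirnas.items():
            seed = seq[1:8].upper().replace("T", "U")  # nucleotides 2-8
            families[seed].append(name)
        return dict(families)

    if __name__ == "__main__":
        demo = {  # hypothetical mature miRNAs
            "chi-miR-A": "UGAGGUAGUAGGUUGUAUAGUU",
            "chi-miR-B": "UGAGGUAGUAGGUUGUGUGGUU",
            "chi-miR-C": "UUCAAGUAAUCCAGGAUAGGCU",
        }
        for seed, members in group_by_seed(demo).items():
            print(seed, members)
    ```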

  14. A new computational scheme on quantitative inner pipe boundary identification based on the estimation of effective thermal conductivity

    NASA Astrophysics Data System (ADS)

    Fan, Chunli; Sun, Fengrui; Yang, Li

    2008-10-01

    In this paper, the irregular configuration of the inner pipe boundary is identified based on the estimation of the circumferential distribution of the effective thermal conductivity of the pipe wall. In order to simulate true temperature measurements in the numerical examples, the finite element method is used to calculate the temperature distribution at the outer pipe surface based on the irregularly shaped inner pipe boundary to be determined. Then, based on this simulated temperature distribution, the inverse identification is conducted by employing the modified one-dimensional correction method, along with the finite volume method, to estimate the circumferential distribution of the effective thermal conductivity of the pipe wall. Thereafter, the inner pipe boundary shape is calculated from the conductivity estimate. A series of numerical experiments with different temperature measurement errors and different thermal conductivities of the pipe wall has verified the effectiveness of the method, showing it to be a simple, fast and accurate approach to this inverse heat conduction problem.

  15. Compositional modeling in porous media using constant volume flash and flux computation without the need for phase identification

    SciTech Connect

    Polívka, Ondřej; Mikyška, Jiří

    2014-09-01

    The paper deals with the numerical solution of a compositional model describing compressible two-phase flow of a mixture composed of several components in porous media with species transfer between the phases. The mathematical model is formulated by means of the extended Darcy's laws for all phases, components continuity equations, constitutive relations, and appropriate initial and boundary conditions. The splitting of components among the phases is described using a new formulation of the local thermodynamic equilibrium which uses volume, temperature, and moles as specification variables. The problem is solved numerically using a combination of the mixed-hybrid finite element method for the total flux discretization and the finite volume method for the discretization of transport equations. A new approach to numerical flux approximation is proposed, which does not require the phase identification and determination of correspondence between the phases on adjacent elements. The time discretization is carried out by the backward Euler method. The resulting large system of nonlinear algebraic equations is solved by the Newton–Raphson iterative method. We provide eight examples of different complexity to show reliability and robustness of our approach.

  16. Identification of the wave speed and the second viscosity of cavitation flows with 2D RANS computations - Part I

    NASA Astrophysics Data System (ADS)

    Decaix, J.; Alligné, S.; Nicolet, C.; Avellan, F.; Münch, C.

    2015-12-01

    1D hydro-electric models are useful to predict dynamic behaviour of hydro-power plants. Regarding vortex rope and cavitation surge in Francis turbines, the 1D models require some inputs that can be provided by numerical simulations. In this paper, a 2D cavitating Venturi is considered. URANS computations are performed to investigate the dynamic behaviour of the cavitation sheet depending on the frequency variation of the outlet pressure. The results are used to calibrate and to assess the reliability of the 1D models.

  17. Level-set reconstruction algorithm for ultrafast limited-angle X-ray computed tomography of two-phase flows

    PubMed Central

    Bieberle, M.; Hampel, U.

    2015-01-01

    Tomographic image reconstruction is based on recovering an object distribution from its projections, which have been acquired from all angular views around the object. If the angular range is limited to less than 180° of parallel projections, typical reconstruction artefacts arise when using standard algorithms. To compensate for this, specialized algorithms using a priori information about the object need to be applied. The application behind this work is ultrafast limited-angle X-ray computed tomography of two-phase flows. Here, only a binary distribution of the two phases needs to be reconstructed, which reduces the complexity of the inverse problem. To solve it, a new reconstruction algorithm (LSR) based on the level-set method is proposed. It includes one force function term accounting for matching the projection data and one incorporating a curvature-dependent smoothing of the phase boundary. The algorithm has been validated using simulated as well as measured projections of known structures, and its performance has been compared to the algebraic reconstruction technique and a binary derivative of it. The validation as well as the application of the level-set reconstruction on a dynamic two-phase flow demonstrated its applicability and its advantages over other reconstruction algorithms. PMID:25939623
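
    The sketch below illustrates only the curvature-dependent smoothing force term of a level-set update on a 2D grid; the projection-matching (data fidelity) term of the LSR algorithm is omitted, so this is a conceptual fragment under simplifying assumptions rather than the published method.

    ```python
    # Curvature-driven smoothing of a level-set function phi (2D), the second
    # force term described in the abstract; the data term is omitted here.
    import numpy as np

    def curvature(phi: np.ndarray, eps: float = 1e-8) -> np.ndarray:
        """Mean curvature kappa = div( grad(phi) / |grad(phi)| )."""
        gy, gx = np.gradient(phi)
        norm = np.sqrt(gx**2 + gy**2) + eps
        nyy, _ = np.gradient(gy / norm)
        _, nxx = np.gradient(gx / norm)
        return nxx + nyy

    def smooth_step(phi: np.ndarray, alpha: float = 0.2, dt: float = 0.1) -> np.ndarray:
        gy, gx = np.gradient(phi)
        grad_mag = np.sqrt(gx**2 + gy**2)
        return phi + dt * alpha * curvature(phi) * grad_mag

    if __name__ == "__main__":
        y, x = np.mgrid[-1:1:128j, -1:1:128j]
        phi = np.sqrt(x**2 + y**2) - 0.5  # signed distance to a circular phase boundary
        phi += 0.05 * np.random.default_rng(0).standard_normal(phi.shape)  # noisy init
        for _ in range(50):
            phi = smooth_step(phi)
        print("phase fraction (phi < 0):", float((phi < 0).mean()))
    ```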

  18. Identification of stair climbing ability levels in community-dwelling older adults based on the geometric mean of stair ascent and descent speed: The GeMSS classifier.

    PubMed

    Mayagoitia, Ruth E; Harding, John; Kitchen, Sheila

    2017-01-01

    The aim was to develop a quantitative approach to identify three stair-climbing ability levels of older adults: no, somewhat and considerable difficulty. Timed-up-and-go test, six-minute-walk test, and Berg balance scale were used for statistical comparison to a new stair climbing ability classifier based on the geometric mean of stair speeds (GeMSS) in ascent and descent on a flight of eight stairs with a 28° pitch in the housing unit where the participants, 28 (16 women) urban older adults (62-94 years), lived. Ordinal logistic regression revealed that the thresholds between the three ability levels for each functional test were more stringent than thresholds found in the literature to classify walking ability levels. Though this was a small study, the intermediate classifier shows promise for early identification of difficulties with stairs, allowing timely preventative interventions to be made. Further studies are necessary to obtain scaling factors for stairs with other pitches.
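
    A minimal sketch of the GeMSS score itself (the geometric mean of ascent and descent speeds); the ability-level thresholds in the example are hypothetical, whereas the paper derives its thresholds via ordinal logistic regression.

    ```python
    # GeMSS score and a toy threshold rule; thresholds and speeds are assumptions.
    import math

    def gemss_score(ascent_speed: float, descent_speed: float) -> float:
        """Geometric mean of stair ascent and descent speed."""
        return math.sqrt(ascent_speed * descent_speed)

    def ability_level(score: float, cut_low: float = 0.55, cut_high: float = 0.85) -> str:
        if score >= cut_high:
            return "no difficulty"
        if score >= cut_low:
            return "some difficulty"
        return "considerable difficulty"

    if __name__ == "__main__":
        ascent, descent = 0.9, 1.1  # hypothetical speeds (stairs/second) on an 8-stair flight
        s = gemss_score(ascent, descent)
        print(f"GeMSS score {s:.2f} -> {ability_level(s)}")
    ```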

  19. Identification of stair climbing ability levels in community-dwelling older adults based on the geometric mean of stair ascent and descent speed: The GeMSS classifier.

    PubMed

    Mayagoitia, Ruth E; Harding, John; Kitchen, Sheila

    2017-01-01

    The aim was to develop a quantitative approach to identify three stair-climbing ability levels of older adults: no, somewhat and considerable difficulty. Timed-up-and-go test, six-minute-walk test, and Berg balance scale were used for statistical comparison to a new stair climbing ability classifier based on the geometric mean of stair speeds (GeMSS) in ascent and descent on a flight of eight stairs with a 28° pitch in the housing unit where the participants, 28 (16 women) urban older adults (62-94 years), lived. Ordinal logistic regression revealed that the thresholds between the three ability levels for each functional test were more stringent than thresholds found in the literature to classify walking ability levels. Though this was a small study, the intermediate classifier shows promise for early identification of difficulties with stairs, allowing timely preventative interventions to be made. Further studies are necessary to obtain scaling factors for stairs with other pitches. PMID:27633200

  20. Identification of novel adhesins of M. tuberculosis H37Rv using integrated approach of multiple computational algorithms and experimental analysis.

    PubMed

    Kumar, Sanjiv; Puniya, Bhanwar Lal; Parween, Shahila; Nahar, Pradip; Ramachandran, Srinivasan

    2013-01-01

    Pathogenic bacteria interacting with a eukaryotic host express adhesins on their surface. These adhesins aid in bacterial attachment to host cell receptors during colonization. A few adhesins, such as heparin-binding hemagglutinin adhesin (HBHA), Apa, and malate synthase of M. tuberculosis, have been identified using specific experimental interaction models based on the biological knowledge of the pathogen. In the present work, we carried out computational screening for adhesins of M. tuberculosis. We used an integrated computational approach using SPAAN for predicting adhesins, PSORTb, SubLoc and LocTree for extracellular localization, and BLAST for verifying non-similarity to human proteins. These steps are among the first of reverse vaccinology. Multiple claims and attacks from different algorithms were processed through an argumentative approach. Additional filtration criteria included selection for proteins with low molecular weights and absence of literature reports. We examined the binding potential of the selected proteins using an image-based ELISA. The protein Rv2599 (membrane protein) binds to human fibronectin, laminin and collagen. Rv3717 (N-acetylmuramoyl-L-alanine amidase) and Rv0309 (L,D-transpeptidase) bind to fibronectin and laminin. We report Rv2599 (membrane protein), Rv0309 and Rv3717 as novel adhesins of M. tuberculosis H37Rv. Our results expand the number of known adhesins of M. tuberculosis and suggest their regulated expression in different stages.

  1. Computational Identification of Protein Pupylation Sites by Using Profile-Based Composition of k-Spaced Amino Acid Pairs.

    PubMed

    Hasan, Md Mehedi; Zhou, Yuan; Lu, Xiaotian; Li, Jinyan; Song, Jiangning; Zhang, Ziding

    2015-01-01

    Prokaryotic proteins are regulated by pupylation, a type of post-translational modification that contributes to cellular function in bacterial organisms. In pupylation process, the prokaryotic ubiquitin-like protein (Pup) tagging is functionally analogous to ubiquitination in order to tag target proteins for proteasomal degradation. To date, several experimental methods have been developed to identify pupylated proteins and their pupylation sites, but these experimental methods are generally laborious and costly. Therefore, computational methods that can accurately predict potential pupylation sites based on protein sequence information are highly desirable. In this paper, a novel predictor termed as pbPUP has been developed for accurate prediction of pupylation sites. In particular, a sophisticated sequence encoding scheme [i.e. the profile-based composition of k-spaced amino acid pairs (pbCKSAAP)] is used to represent the sequence patterns and evolutionary information of the sequence fragments surrounding pupylation sites. Then, a Support Vector Machine (SVM) classifier is trained using the pbCKSAAP encoding scheme. The final pbPUP predictor achieves an AUC value of 0.849 in 10-fold cross-validation tests and outperforms other existing predictors on a comprehensive independent test dataset. The proposed method is anticipated to be a helpful computational resource for the prediction of pupylation sites. The web server and curated datasets in this study are freely available at http://protein.cau.edu.cn/pbPUP/.
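
    The sketch below shows a simplified, non-profile CKSAAP encoding of a sequence fragment, to make the feature scheme concrete; the published pbCKSAAP additionally weights pairs with PSSM profile information, which is omitted here, and the fragment is hypothetical.

    ```python
    # Simplified CKSAAP encoding (composition of k-spaced amino acid pairs).
    from itertools import product

    AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"
    PAIRS = ["".join(p) for p in product(AMINO_ACIDS, repeat=2)]

    def cksaap(fragment: str, k_max: int = 3) -> list[float]:
        """Concatenated pair frequencies for gap sizes k = 0..k_max."""
        fragment = fragment.upper()
        features = []
        for k in range(k_max + 1):
            counts = dict.fromkeys(PAIRS, 0)
            n_pairs = max(len(fragment) - k - 1, 1)
            for i in range(len(fragment) - k - 1):
                pair = fragment[i] + fragment[i + k + 1]
                if pair in counts:  # skip non-standard residues
                    counts[pair] += 1
            features.extend(counts[p] / n_pairs for p in PAIRS)
        return features

    if __name__ == "__main__":
        frag = "MKTAYIAKQRQISFVKSHFSRQ"  # hypothetical window around a candidate site
        vec = cksaap(frag, k_max=3)
        print(len(vec), "features; first five:", [round(v, 3) for v in vec[:5]])
    ```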

  2. Identification of error making patterns in lesion detection on digital breast tomosynthesis using computer-extracted image features

    NASA Astrophysics Data System (ADS)

    Wang, Mengyu; Zhang, Jing; Grimm, Lars J.; Ghate, Sujata V.; Walsh, Ruth; Johnson, Karen S.; Lo, Joseph Y.; Mazurowski, Maciej A.

    2016-03-01

    Digital breast tomosynthesis (DBT) can improve lesion visibility by eliminating the issue of overlapping breast tissue present in mammography. However, this new modality likely requires new approaches to training. The issue of training in DBT is not well explored. We propose a computer-aided educational approach for DBT training. Our hypothesis is that the trainees' educational outcomes will improve if they are presented with cases individually selected to address their weaknesses. In this study, we focus on the question of how to select such cases. Specifically, we propose an algorithm that, based on previously acquired reading data, predicts which lesions will be missed by the trainee in future cases (i.e., we focus on false negative errors). A logistic regression classifier was used to predict the likelihood of trainee error, and computer-extracted features were used as the predictors. Reader data from 3 expert breast imagers were used to establish the ground truth, and reader data from 5 radiology trainees were used to evaluate the algorithm performance with repeated holdout cross validation. Receiver operating characteristic (ROC) analysis was applied to measure the performance of the proposed individual trainee models. The preliminary experimental results for 5 trainees showed that the individual trainee models were able to distinguish the lesions that would be detected from those that would be missed, with an average area under the ROC curve of 0.639 (95% CI, 0.580-0.698). The proposed algorithm can be used to identify difficult cases for individual trainees.
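
    A generic sketch of the modelling step described above, assuming a logistic regression over computer-extracted features evaluated by ROC AUC; the synthetic data stands in for real reader data and is not taken from the study.

    ```python
    # Logistic regression predicting "lesion missed" (1) vs "detected" (0)
    # from computer-extracted features, evaluated with ROC AUC on synthetic data.
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import roc_auc_score
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)
    X = rng.standard_normal((200, 6))  # hypothetical computer-extracted features
    y = (X[:, 0] + 0.5 * X[:, 1] + rng.standard_normal(200) > 0).astype(int)  # 1 = missed

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
    clf = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
    auc = roc_auc_score(y_te, clf.predict_proba(X_te)[:, 1])
    print(f"AUC for predicting missed lesions: {auc:.3f}")
    ```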

  3. Computer Music

    NASA Astrophysics Data System (ADS)

    Cook, Perry

    This chapter covers algorithms, technologies, computer languages, and systems for computer music. Computer music involves the application of computers and other digital/electronic technologies to music composition, performance, theory, history, and perception. The field combines digital signal processing, computational algorithms, computer languages, hardware and software systems, acoustics, psychoacoustics (low-level perception of sounds from the raw acoustic signal), and music cognition (higher-level perception of musical style, form, emotion, etc.). Although most people would think that analog synthesizers and electronic music substantially predate the use of computers in music, many experiments and complete computer music systems were being constructed and used as early as the 1950s.

  4. Identification of Lygus hesperus by DNA Barcoding Reveals Insignificant Levels of Genetic Structure among Distant and Habitat Diverse Populations

    PubMed Central

    Zhou, Changqing; Kandemir, Irfan; Walsh, Douglas B.; Zalom, Frank G.; Lavine, Laura Corley

    2012-01-01

    Background The western tarnished plant bug Lygus hesperus is an economically important pest that belongs to a complex of morphologically similar species that makes identification problematic. The present study provides evidence for the use of DNA barcodes from populations of L. hesperus from the western United States of America for accurate identification. Methodology/Principal Findings This study reports DNA barcodes for 134 individuals of the western tarnished plant bug from alfalfa and strawberry agricultural fields in the western United States of America. Sequence divergence estimates of <3% reveal that morphologically variable individuals presumed to be L. hesperus were accurately identified. Paired estimates of Fst and subsequent estimates of gene flow show that geographically distinct populations of L. hesperus are genetically similar. Therefore, our results support and reinforce the relatively recent (<100 years) migration of the western tarnished plant bug into agricultural habitats across the western United States. Conclusions/Significance This study reveals that despite wide host plant usage and phenotypically plastic morphological traits, the commonly recognized western tarnished plant bug belongs to a single species, Lygus hesperus. In addition, no significant genetic structure was found for the geographically diverse populations of western tarnished plant bug used in this study. PMID:22479640

  5. The use of isodose levels to interpret radiation induced lung injury: a quantitative analysis of computed tomography changes

    PubMed Central

    Knoll, Miriam A.; Sheu, Ren Dih; Knoll, Abraham D.; Kerns, Sarah L.; Lo, Yeh-Chi; Rosenzweig, Kenneth E.

    2016-01-01

    Background Patients treated with stereotactic body radiation therapy (SBRT) for lung cancer are often found to have radiation-induced lung injury (RILI) surrounding the treated tumor. We investigated whether treatment isodose levels could predict RILI. Methods Thirty-seven lung lesions in 32 patients were treated with SBRT and received post-treatment follow up (FU) computed tomography (CT). Each CT was fused with the original simulation CT and treatment isodose levels were overlaid. The RILI surrounding the treated lesion was contoured. The RILI extension index [fibrosis extension index (FEI)] was defined as the volume of RILI extending outside a given isodose level relative to the total volume of RILI and was expressed as a percentage. Results Univariate analysis revealed that the planning target volume (PTV) was positively correlated with RILI volume at FU: correlation coefficient (CC) =0.628 and P<0.0001 at 1st FU; CC =0.401 and P=0.021 at 2nd FU; CC =0.265 and P=0.306 at 3rd FU. FEI-40 Gy at 1st FU was significantly positively correlated with FEI-40 Gy at subsequent FUs (CC =0.689 and P=6.5×10−5 comparing 1st and 2nd FU; CC =0.901 and P=0.020 comparing 2nd and 3rd FU). Ninety-six percent of the RILI was found within the 20 Gy isodose line. Sixty-five percent of patients were found to have a decrease in RILI on the 2nd CT. Conclusions We have shown that RILI evolves over time and that the 1st CT correlates well with subsequent CTs. Ninety-six percent of the RILI occurs within the 20 Gy isodose lines, which may prove beneficial to radiologists attempting to distinguish recurrence vs. RILI. PMID:26981453
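
    A minimal sketch of the FEI definition given above, computed from binary masks; the random masks are placeholders for contoured RILI and isodose volumes.

    ```python
    # FEI = volume of RILI outside a given isodose region / total RILI volume.
    import numpy as np

    def fibrosis_extension_index(rili_mask: np.ndarray, isodose_mask: np.ndarray) -> float:
        total = rili_mask.sum()
        if total == 0:
            raise ValueError("empty RILI contour")
        outside = np.logical_and(rili_mask, ~isodose_mask).sum()
        return float(outside) / float(total)

    if __name__ == "__main__":
        rng = np.random.default_rng(1)
        rili = rng.random((64, 64, 32)) > 0.9          # hypothetical RILI voxels
        isodose_40gy = rng.random((64, 64, 32)) > 0.5  # hypothetical 40 Gy region
        print(f"FEI-40Gy = {100 * fibrosis_extension_index(rili, isodose_40gy):.1f}%")
    ```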

  6. Computational investigation of carbon-in-ash levels for a wall-fired boiler after low-NOx combustion modifications

    SciTech Connect

    Davis, K.A.; Eddings, E.G.; Heap, M.P.; Facchiano, A.; Hardman, R.R.

    1998-12-31

    Many coal-fired boilers retrofitted with low-NOx firing systems are experiencing significant operational difficulties due primarily to (1) increased carbon in the fly ash or (2) increased water wall wastage. This paper presents the results of a computational investigation of a wall-fired boiler that has recently been retrofitted with low-NOx burners and overfire air. The focus of this paper is the effect of the retrofit on unburned carbon/NOx and the potential for applying operational and design modifications to minimize unburned carbon without adversely impacting NOx emissions. Although the magnitude of an increase in unburned carbon after a low-NOx retrofit is system and coal dependent, it is often the case that a reduction in emissions of nitrogen oxides is accompanied by a corresponding increase in the amount of unburned carbon in fly ash. Since low-NOx firing systems increase the residence time of coal/char particles in a fuel-rich environment, it might be expected that there is insufficient time under high-temperature, oxidizing conditions to ensure complete carbon oxidation in a low-NOx firing system. A relatively straightforward consideration of the effect of temperature and oxygen concentration on coal particle pyrolysis/oxidation can be used to provide a qualitative understanding of this effect. However, the complex flow patterns and highly nonlinear physical and chemical phenomena in a boiler make it difficult to predict carbon-in-ash (c-i-a) levels without the use of advanced computational tools. The sensitivity of c-i-a to burnout is illustrated in a figure for a coal with 10% ash. Although c-i-a changes slowly at low burnout, it changes very rapidly at the extent of burnout typical of a boiler. When the extent of burnout drops only slightly, from 99.5 to 99 percent for example, the c-i-a doubles, from 4 to 8%, which in many situations would be unacceptable. The importance of fuel efficiency and ash disposal/recycle emphasizes the need for
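
    The quoted sensitivity can be reproduced with simple mass-balance arithmetic, assuming a coal that is 10% ash and 90% combustible on a dry basis; the sketch below is a back-of-the-envelope check, not part of the paper's CFD modelling.

    ```python
    # Carbon-in-ash as a function of burnout for an assumed 10%-ash coal.
    def carbon_in_ash(ash_fraction: float, burnout: float) -> float:
        """Mass fraction of unburned carbon in the residual ash."""
        combustible = 1.0 - ash_fraction
        unburned = combustible * (1.0 - burnout)
        return unburned / (unburned + ash_fraction)

    for burnout in (0.995, 0.99):
        print(f"burnout {burnout:.1%} -> carbon-in-ash {carbon_in_ash(0.10, burnout):.1%}")
    # ~4.3% c-i-a at 99.5% burnout vs ~8.3% at 99.0%, i.e. roughly the 4% -> 8% doubling quoted.
    ```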

  7. CAL-laborate: A Collaborative Publication on the Use of Computer Aided Learning for Tertiary Level Physical Sciences and Geosciences.

    ERIC Educational Resources Information Center

    Fernandez, Anne, Ed.; Sproats, Lee, Ed.; Sorensen, Stacey, Ed.

    2000-01-01

    The science community has been trying to use computers in teaching for many years. There has been much conformity in how this was to be achieved, and the wheel has been re-invented again and again as enthusiast after enthusiast has "done their bit" towards getting computers accepted. Computers are now used by science undergraduates (as well as…

  8. Early Identification of Students Predicted to Enroll in Advanced, Upper-Level High School Courses: An Examination of Validity

    ERIC Educational Resources Information Center

    DeRose, Diego S.; Clement, Russell W.

    2011-01-01

    Broward County Public Schools' Research Services department uses logistic regression analysis to compute an indicator to predict student enrollment in advanced high school courses, for students entering ninth grade for the first time. This prediction indicator, along with other student characteristics, supports high school guidance staffs in…

  9. Computational Method for the Systematic Identification of Analog Series and Key Compounds Representing Series and Their Biological Activity Profiles.

    PubMed

    Stumpfe, Dagmar; Dimova, Dilyana; Bajorath, Jürgen

    2016-08-25

    A computational methodology is introduced for detecting all unique series of analogs in large compound data sets, regardless of chemical relationships between analogs. No prior knowledge of core structures or R-groups is required, which are automatically determined. The approach is based upon the generation of retrosynthetic matched molecular pairs and analog networks from which distinct series are isolated. The methodology was applied to systematically extract more than 17 000 distinct series from the ChEMBL database. For comparison, analog series were also isolated from screening compounds and drugs. Known biological activities were mapped to series from ChEMBL, and in more than 13 000 of these series, key compounds were identified that represented substitution sites of all analogs within a series and its complete activity profile. The analog series, key compounds, and activity profiles are made freely available as a resource for medicinal chemistry applications.

  10. Computational modeling of inhibition of voltage-gated Ca channels: identification of different effects on uterine and cardiac action potentials

    PubMed Central

    Tong, Wing-Chiu; Ghouri, Iffath; Taggart, Michael J.

    2014-01-01

    The uterus and heart share the important physiological feature whereby contractile activation of the muscle tissue is regulated by the generation of periodic, spontaneous electrical action potentials (APs). Preterm birth arising from premature uterine contractions is a major complication of pregnancy and there remains a need to pursue avenues of research that facilitate the use of drugs, tocolytics, to limit these inappropriate contractions without deleterious actions on cardiac electrical excitation. A novel approach is to make use of mathematical models of uterine and cardiac APs, which incorporate many ionic currents contributing to the AP forms, and test the cell-specific responses to interventions. We have used three such models—of uterine smooth muscle cells (USMC), cardiac sinoatrial node cells (SAN), and ventricular cells—to investigate the relative effects of reducing two important voltage-gated Ca currents—the L-type (ICaL) and T-type (ICaT) Ca currents. Reduction of ICaL (10%) alone, or ICaT (40%) alone, blunted USMC APs with little effect on ventricular APs and only mild effects on SAN activity. Larger reductions in either current further attenuated the USMC APs but with also greater effects on SAN APs. Encouragingly, a combination of ICaL and ICaT reduction did blunt USMC APs as intended with little detriment to APs of either cardiac cell type. Subsequent overlapping maps of ICaL and ICaT inhibition profiles from each model revealed a range of combined reductions of ICaL and ICaT over which an appreciable diminution of USMC APs could be achieved with no deleterious action on cardiac SAN or ventricular APs. This novel approach illustrates the potential for computational biology to inform us of possible uterine and cardiac cell-specific mechanisms. Incorporating such computational approaches in future studies directed at designing new, or repurposing existing, tocolytics will be beneficial for establishing a desired uterine specificity of action

  11. Adult granulosa cell tumor presenting with massive ascites, elevated CA-125 level, and low 18F-fluorodeoxyglucose uptake on positron emission tomography/computed tomography

    PubMed Central

    Tak, Ji Young; Park, Ji Y.; Lee, Seung Jeong; Lee, Yoon Hee; Hong, Dae Gy

    2015-01-01

    Adult granulosa cell tumors (AGCTs) presenting with massive ascites and elevated serum CA-125 levels have rarely been described in the literature. An ovarian mass, massive ascites, and elevated serum CA-125 levels in postmenopausal women generally suggest a malignant ovarian tumor, particularly advanced epithelial ovarian cancer. AGCT has low 18F-fluorodeoxyglucose uptake on positron emission tomography/computed tomography due to its low metabolic activity. In the present report, we describe a case of an AGCT with massive ascites, elevated serum CA-125 level, and low 18F-fluorodeoxyglucose uptake on positron emission tomography/computed tomography. PMID:26430671

  12. SU-E-I-74: Image-Matching Technique of Computed Tomography Images for Personal Identification: A Preliminary Study Using Anthropomorphic Chest Phantoms

    SciTech Connect

    Matsunobu, Y; Shiotsuki, K; Morishita, J

    2015-06-15

    Purpose: Fingerprints, dental impressions, and DNA are used to identify unidentified bodies in forensic medicine. Cranial computed tomography (CT) images and/or dental radiographs are also used for identification. Radiological identification is important, particularly in the absence of comparative fingerprints, dental impressions, and DNA samples. The development of an automated radiological identification system for unidentified bodies is desirable. We investigated the potential usefulness of bone structure for matching chest CT images. Methods: CT images of three anthropomorphic chest phantoms were obtained on different days in various settings. One of the phantoms was assumed to be an unidentified body. The bone image and the bone image with soft tissue (BST image) were extracted from the CT images. To examine the usefulness of the bone image and/or the BST image, the similarities between the two-dimensional (2D) or three-dimensional (3D) images of the same and different phantoms were evaluated in terms of the normalized cross-correlation value (NCC). Results: For the 2D and 3D BST images, the NCCs obtained from the same phantom assumed to be an unidentified body (2D, 0.99; 3D, 0.93) were higher than those for the different phantoms (2D, 0.95 and 0.91; 3D, 0.89 and 0.80). The NCCs for the same phantom (2D, 0.95; 3D, 0.88) were greater compared to those of the different phantoms (2D, 0.61 and 0.25; 3D, 0.23 and 0.10) for the bone image. The difference in the NCCs between the same and different phantoms tended to be larger for the bone images than for the BST images. These findings suggest that the image-matching technique is more useful when utilizing the bone image than when utilizing the BST image to identify different people. Conclusion: This preliminary study indicated that evaluating the similarity of bone structure in 2D and 3D images is potentially useful for identifying an unidentified body.
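
    A minimal sketch of the normalized cross-correlation measure referred to above, shown on random arrays standing in for CT-derived bone/BST images.

    ```python
    # Normalized cross-correlation (NCC) between two same-sized images/volumes.
    import numpy as np

    def normalized_cross_correlation(a: np.ndarray, b: np.ndarray) -> float:
        a = a.astype(float).ravel()
        b = b.astype(float).ravel()
        a -= a.mean()
        b -= b.mean()
        denom = np.linalg.norm(a) * np.linalg.norm(b)
        return float(np.dot(a, b) / denom) if denom else 0.0

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        reference = rng.random((128, 128))                      # "AM" bone image
        same_body = reference + 0.05 * rng.random((128, 128))   # "PM" scan, same phantom
        other_body = rng.random((128, 128))                     # different phantom
        print("same phantom NCC:", round(normalized_cross_correlation(reference, same_body), 3))
        print("different phantom NCC:", round(normalized_cross_correlation(reference, other_body), 3))
    ```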

  13. Identification of chromosomal location of mupA gene, encoding low-level mupirocin resistance in staphylococcal isolates.

    PubMed Central

    Ramsey, M A; Bradley, S F; Kauffman, C A; Morton, T M

    1996-01-01

    Low- and high-level mupirocin resistance have been reported in Staphylococcus aureus. The expression of plasmid-encoded mupA is responsible for high-level mupirocin resistance. Low-level mupirocin-resistant strains do not contain plasmid-encoded mupA, and a chromosomal location for this gene has not previously been reported. We examined high- and low-level mupirocin-resistant S. aureus strains to determine if mupA was present on the chromosome of low-level-resistant isolates. Southern blot analysis of DNA from four mupirocin-resistant strains identified mupA in both high- and low-level mupirocin-resistant strains. Low-level mupirocin-resistant strains contained a copy of mupA on the chromosome, while the high-level mupirocin-resistant isolate contained a copy of the gene on the plasmid. PCR amplification of genomic DNA from each mupirocin-resistant strain resulted in a 1.65-kb fragment, the predicted product from the intragenic mupA primers. This is the first report of a chromosomal location for the mupA gene conferring low-level mupirocin resistance. PMID:9124848

  14. New software for computer-assisted dental-data matching in Disaster Victim Identification and long-term missing persons investigations: "DAVID Web".

    PubMed

    Clement, J G; Winship, V; Ceddia, J; Al-Amad, S; Morales, A; Hill, A J

    2006-05-15

    In 1997 an internally supported but unfunded pilot project at the Victorian Institute of Forensic Medicine (VIFM) Australia led to the development of a computer system which closely mimicked Interpol paperwork for the storage, later retrieval and tentative matching of the many AM and PM dental records that are often needed for rapid Disaster Victim Identification. The program was called "DAVID" (Disaster And Victim IDentification). It combined the skills of the VIFM Information Technology systems manager (VW), an experienced odontologist (JGC) and an expert database designer (JC); all current authors on this paper. Students did much of the writing of software to prescription from Monash University. The student group involved won an Australian Information Industry Award in recognition of the contribution the new software could have made to the DVI process. Unfortunately, the potential of the software was never realized because paradoxically the federal nature of Australia frequently thwarts uniformity of systems across the entire country. As a consequence, the final development of DAVID never took place. Given the recent problems encountered post-tsunami by the odontologists who were obliged to use the Plass Data system (Plass Data Software, Holbaek, Denmark) and with the impending risks imposed upon Victoria by the decision to host the Commonwealth Games in Melbourne during March 2006, funding was sought and obtained from the state government to update counter disaster preparedness at the VIFM. Some of these funds have been made available to upgrade and complete the DAVID project. In the wake of discussions between leading expert odontologists from around the world held in Geneva during July 2003 at the invitation of the International Committee of the Red Cross significant alterations to the initial design parameters of DAVID were proposed. This was part of broader discussions directed towards developing instruments which could be used by the ICRC's "The Missing

  15. Survey of computed tomography doses and establishment of national diagnostic reference levels in the Republic of Belarus.

    PubMed

    Kharuzhyk, S A; Matskevich, S A; Filjustin, A E; Bogushevich, E V; Ugolkova, S A

    2010-01-01

    Computed tomography dose index (CTDI) was measured on eight CT scanners at seven public hospitals in the Republic of Belarus. The effective dose was calculated using normalised values of effective dose per dose-length product (DLP) over various body regions. Considerable variations of the dose values were observed. Mean effective doses amounted to 1.4 +/- 0.4 mSv for brain, 2.6 +/- 1.0 mSv for neck, 6.9 +/- 2.2 mSv for thorax, 7.0 +/- 2.3 mSv for abdomen and 8.8 +/- 3.2 mSv for pelvis. Diagnostic reference levels (DRLs) were proposed by calculating the third quartiles of dose value distributions (body region/volume CTDI, mGy/DLP, mGy cm): brain/60/730, neck/55/640, thorax/20/500, abdomen/25/600 and pelvis/25/490. It is evident that the protocols need to be optimised on some of the CT scanners, in view of the fact that these are the first formulated DRLs for the Republic of Belarus.
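
    The two calculations described above can be sketched as follows; the DLP-to-effective-dose conversion coefficients and the surveyed DLP values in the example are illustrative assumptions, not the Belarusian survey data.

    ```python
    # Effective dose = DLP x region-specific k-factor; DRL = third quartile of survey.
    import numpy as np

    K_FACTORS_mSv_PER_mGy_cm = {  # assumed, commonly cited order-of-magnitude values
        "brain": 0.0021, "neck": 0.0059, "thorax": 0.014, "abdomen": 0.015, "pelvis": 0.015,
    }

    def effective_dose(region: str, dlp_mgy_cm: float) -> float:
        return K_FACTORS_mSv_PER_mGy_cm[region] * dlp_mgy_cm

    def diagnostic_reference_level(dlp_samples_mgy_cm) -> float:
        """DRL proposed as the 75th percentile (third quartile) of the survey."""
        return float(np.percentile(dlp_samples_mgy_cm, 75))

    if __name__ == "__main__":
        surveyed_thorax_dlp = [350, 420, 480, 510, 530, 600, 650, 700]  # hypothetical
        drl = diagnostic_reference_level(surveyed_thorax_dlp)
        print("thorax DRL (DLP):", drl, "mGy cm")
        print("effective dose at DRL:", round(effective_dose("thorax", drl), 1), "mSv")
    ```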

  16. On the computation of viscous terms for incompressible two-phase flows with Level Set/Ghost Fluid Method

    NASA Astrophysics Data System (ADS)

    Lalanne, Benjamin; Villegas, Lucia Rueda; Tanguy, Sébastien; Risso, Frédéric

    2015-11-01

    In this paper, we present a detailed analysis of the computation of the viscous terms for the simulation of incompressible two-phase flows in the framework of Level Set/Ghost Fluid Method when viscosity is discontinuous across the interface. Two pioneering papers on the topic, Kang et al. [10] and Sussman et al. [26], proposed two different approaches to deal with viscous terms. However, a definitive assessment of their respective efficiency is currently not available. In this paper, we demonstrate from theoretical arguments and confirm from numerical simulations that these two approaches are equivalent from a continuous point of view and we compare their accuracies in relevant test-cases. We also propose a new intermediate method which uses the properties of the two previous methods. This new method enables a simple implementation for an implicit temporal discretization of the viscous terms. In addition, the efficiency of the Delta Function method [24] is also assessed and compared to the three previous ones, which allow us to propose a general overview of the accuracy of all available methods. The selected test-cases involve configurations wherein viscosity plays a major role and for which either theoretical results or experimental data are available as reference solutions: simulations of spherical rising bubbles, shape-oscillating bubbles and deformed rising bubbles at low Reynolds numbers.

  17. Identification of evolutionarily conserved Momordica charantia microRNAs using computational approach and its utility in phylogeny analysis.

    PubMed

    Thirugnanasambantham, Krishnaraj; Saravanan, Subramanian; Karikalan, Kulandaivelu; Bharanidharan, Rajaraman; Lalitha, Perumal; Ilango, S; HairulIslam, Villianur Ibrahim

    2015-10-01

    Momordica charantia (bitter gourd, bitter melon) is a monoecious Cucurbitaceae with anti-oxidant, anti-microbial, anti-viral and anti-diabetic potential. Molecular studies on this economically valuable plant are essential to understand its phylogeny and evolution. MicroRNAs (miRNAs) are conserved, small, non-coding RNAs with the ability to regulate gene expression by binding the 3' UTR region of target mRNAs, and they evolve at different rates in different plant species. In this study we utilized a homology-based computational approach and identified 27 mature miRNAs for the first time from this biomedically important plant. The phylogenetic tree developed from binary presence/absence data for the identified miRNAs was found to be uncertain and biased. Most of the identified miRNAs were highly conserved among plant species, and sequence-based phylogeny analysis of the miRNAs resolved the above difficulties in the miRNA-based phylogeny approach. Predicted gene targets of the identified miRNAs revealed their importance in the regulation of plant developmental processes. The reported mature miRNAs showed sequence conservation, and detailed phylogeny analysis of the pre-miRNA sequences revealed genus-specific segregation of clusters. PMID:25988220

  18. Computer image analysis: an additional tool for the identification of processed poultry and mammal protein containing bones.

    PubMed

    Pinotti, L; Fearn, T; Gulalp, S; Campagnoli, A; Ottoboni, M; Baldi, A; Cheli, F; Savoini, G; Dell'Orto, V

    2013-01-01

    The aims of this study were (1) to evaluate the potential of image analysis measurements, in combination with the official analytical methods for the detection of constituents of animal origin in feedstuffs, to distinguish poultry from mammals; and (2) to identify possible markers that can be used in routine analysis. For this purpose, 14 mammal and seven poultry samples and a total of 1081 bone fragment lacunae were analysed by combining the microscopic methods with computer image analysis. The distributions of 30 different measured size and shape variables of bone lacunae were studied both within and between the two zoological classes. In all cases a considerable overlap between classes meant that classification of individual lacunae was problematic, though a clear separation in the means did allow successful classification of samples on the basis of averages. The variables most useful for classification were those related to size, lacuna area for example. The approach shows considerable promise but will need further study using a larger number of samples with a wider range.

  19. Positron emission tomography-computed tomography in the diagnostic evaluation of smoldering multiple myeloma: identification of patients needing therapy

    PubMed Central

    Siontis, B; Kumar, S; Dispenzieri, A; Drake, M T; Lacy, M Q; Buadi, F; Dingli, D; Kapoor, P; Gonsalves, W; Gertz, M A; Rajkumar, S V

    2015-01-01

    We studied 188 patients with a suspected smoldering multiple myeloma (MM) who had undergone a positron emission tomography-computed tomography (PET-CT) scan as part of their clinical evaluation. PET-CT was positive (clinical radiologist interpretation of increased bone uptake and/or evidence of lytic bone destruction) in 74 patients and negative in 114 patients. Of these, 25 patients with a positive PET-CT and 97 patients with a negative PET-CT were observed without therapy and formed the study cohort (n=122). The probability of progression to MM within 2 years was 75% in patients with a positive PET-CT observed without therapy compared with 30% in patients with a negative PET-CT; median time to progression was 21 months versus 60 months, respectively, P=0.0008. Of 25 patients with a positive PET-CT, the probability of progression was 87% at 2 years in those with evidence of underlying osteolysis (n=16) and 61% in patients with abnormal PET-CT uptake but no evidence of osteolysis (n=9). Patients with positive PET-CT and evidence of underlying osteolysis have a high risk of progression to MM within 2 years when observed without therapy. These observations support recent changes to imaging requirements in the International Myeloma Working Group updated diagnostic criteria for MM. PMID:26495861

  20. MegaMiner: A Tool for Lead Identification Through Text Mining Using Chemoinformatics Tools and Cloud Computing Environment.

    PubMed

    Karthikeyan, Muthukumarasamy; Pandit, Yogesh; Pandit, Deepak; Vyas, Renu

    2015-01-01

    Virtual screening is an indispensable tool to cope with the massive amount of data being tossed by the high throughput omics technologies. With the objective of enhancing the automation capability of virtual screening process a robust portal termed MegaMiner has been built using the cloud computing platform wherein the user submits a text query and directly accesses the proposed lead molecules along with their drug-like, lead-like and docking scores. Textual chemical structural data representation is fraught with ambiguity in the absence of a global identifier. We have used a combination of statistical models, chemical dictionary and regular expression for building a disease specific dictionary. To demonstrate the effectiveness of this approach, a case study on malaria has been carried out in the present work. MegaMiner offered superior results compared to other text mining search engines, as established by F score analysis. A single query term 'malaria' in the portlet led to retrieval of related PubMed records, protein classes, drug classes and 8000 scaffolds which were internally processed and filtered to suggest new molecules as potential anti-malarials. The results obtained were validated by docking the virtual molecules into relevant protein targets. It is hoped that MegaMiner will serve as an indispensable tool for not only identifying hidden relationships between various biological and chemical entities but also for building better corpus and ontologies. PMID:26138567
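
    The sketch below is an illustration of dictionary-plus-regular-expression matching in the spirit described above; it is not the MegaMiner implementation, and the dictionary terms, pattern and example text are invented.

    ```python
    # Toy disease dictionary + regex for chemical-looking tokens in free text.
    import re

    DISEASE_DICTIONARY = {"malaria", "plasmodium falciparum", "antimalarial"}
    # Crude, assumed pattern for drug-like tokens (suffix-based, illustration only).
    CHEMICAL_PATTERN = re.compile(r"\b[a-z]*(?:quine|mycin|cillin|azole|artemisinin)\b", re.I)

    def mine(text: str):
        text_lower = text.lower()
        diseases = sorted(t for t in DISEASE_DICTIONARY if t in text_lower)
        chemicals = sorted({m.group(0).lower() for m in CHEMICAL_PATTERN.finditer(text)})
        return diseases, chemicals

    if __name__ == "__main__":
        abstract = ("Chloroquine resistance in Plasmodium falciparum malaria has renewed "
                    "interest in artemisinin combination therapy.")
        print(mine(abstract))
    ```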

  1. A computational method for identification of vaccine targets from protein regions of conserved human leukocyte antigen binding

    PubMed Central

    2015-01-01

    Background Computational methods for T cell-based vaccine target discovery focus on selection of highly conserved peptides identified across pathogen variants, followed by prediction of their binding of human leukocyte antigen molecules. However, experimental studies have shown that T cells often target diverse regions in highly variable viral pathogens and this diversity may need to be addressed through redefinition of suitable peptide targets. Methods We have developed a method for antigen assessment and target selection for polyvalent vaccines, with which we identified immune epitopes from variable regions, where all variants bind HLA. These regions, although variable, can thus be considered stable in terms of HLA binding and represent valuable vaccine targets. Results We applied this method to predict CD8+ T-cell targets in influenza A H7N9 hemagglutinin and significantly increased the number of potential vaccine targets compared to the number of targets discovered using the traditional approach where low-frequency peptides are excluded. Conclusions We developed a webserver with an intuitive visualization scheme for summarizing the T cell-based antigenic potential of any given protein or proteome using human leukocyte antigen binding predictions and made a web-accessible software implementation freely available at http://met-hilab.cbs.dtu.dk/blockcons/. PMID:26679766
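
    A conceptual sketch of the selection rule described above: a region is kept as a target only if every observed peptide variant at that position is predicted to bind HLA. The binding scores below are placeholders standing in for a real HLA-binding predictor, and all names and peptides are hypothetical.

    ```python
    # Keep positions where all peptide variants meet an assumed binding threshold.
    def conserved_binding_targets(variant_peptides: dict[int, list[str]],
                                  binding_score: dict[str, float],
                                  threshold: float = 0.5) -> list[int]:
        """Return start positions where every variant scores >= threshold."""
        targets = []
        for pos, variants in variant_peptides.items():
            if variants and all(binding_score.get(p, 0.0) >= threshold for p in variants):
                targets.append(pos)
        return sorted(targets)

    if __name__ == "__main__":
        variants = {10: ["KLNEPVLLL", "KLNEPVMLL"], 42: ["SSYQRTRAL", "SSYQKTRAL"]}
        scores = {"KLNEPVLLL": 0.9, "KLNEPVMLL": 0.7, "SSYQRTRAL": 0.8, "SSYQKTRAL": 0.3}
        print("stable HLA-binding regions start at:", conserved_binding_targets(variants, scores))
    ```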

  2. MegaMiner: A Tool for Lead Identification Through Text Mining Using Chemoinformatics Tools and Cloud Computing Environment.

    PubMed

    Karthikeyan, Muthukumarasamy; Pandit, Yogesh; Pandit, Deepak; Vyas, Renu

    2015-01-01

    Virtual screening is an indispensable tool to cope with the massive amount of data being tossed by the high throughput omics technologies. With the objective of enhancing the automation capability of virtual screening process a robust portal termed MegaMiner has been built using the cloud computing platform wherein the user submits a text query and directly accesses the proposed lead molecules along with their drug-like, lead-like and docking scores. Textual chemical structural data representation is fraught with ambiguity in the absence of a global identifier. We have used a combination of statistical models, chemical dictionary and regular expression for building a disease specific dictionary. To demonstrate the effectiveness of this approach, a case study on malaria has been carried out in the present work. MegaMiner offered superior results compared to other text mining search engines, as established by F score analysis. A single query term 'malaria' in the portlet led to retrieval of related PubMed records, protein classes, drug classes and 8000 scaffolds which were internally processed and filtered to suggest new molecules as potential anti-malarials. The results obtained were validated by docking the virtual molecules into relevant protein targets. It is hoped that MegaMiner will serve as an indispensable tool for not only identifying hidden relationships between various biological and chemical entities but also for building better corpus and ontologies.

  3. Functional MRI-based identification of brain areas involved in motor imagery for implantable brain-computer interfaces.

    PubMed

    Hermes, D; Vansteensel, M J; Albers, A M; Bleichner, M G; Benedictus, M R; Mendez Orellana, C; Aarnoutse, E J; Ramsey, N F

    2011-04-01

    For the development of minimally invasive brain-computer interfaces (BCIs), it is important to accurately localize the area of implantation. Using fMRI, we investigated which brain areas are involved in motor imagery. Twelve healthy subjects performed a motor execution and imagery task during separate fMRI and EEG measurements. fMRI results showed that during imagery, premotor and parietal areas were most robustly activated in individual subjects, but surprisingly, no activation was found in the primary motor cortex. EEG results showed that spectral power decreases in contralateral sensorimotor rhythms (8-24 Hz) during both movement and imagery. To further verify the involvement of the motor imagery areas found with fMRI, one epilepsy patient performed the same task during both fMRI and ECoG recordings. Significant ECoG low (8-24 Hz) and high (65-95 Hz) frequency power changes were observed selectively on premotor cortex and these co-localized with fMRI. During a subsequent BCI task, excellent performance (91%) was obtained based on ECoG power changes from the localized premotor area. These results indicate that other areas than the primary motor area may be more reliably activated during motor imagery. Specifically, the premotor cortex may be a better area to implant an invasive BCI.
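
    To make the spectral-power observation concrete, the sketch below computes the relative 8-24 Hz band-power change between two conditions with Welch's method on a synthetic signal; it is not the study's EEG/ECoG processing pipeline, and the sampling rate and signal model are assumptions.

    ```python
    # Relative 8-24 Hz band-power change between "rest" and "imagery" conditions.
    import numpy as np
    from scipy.signal import welch

    def band_power(signal: np.ndarray, fs: float, lo: float, hi: float) -> float:
        freqs, psd = welch(signal, fs=fs, nperseg=int(fs))
        band = (freqs >= lo) & (freqs <= hi)
        return float(np.trapz(psd[band], freqs[band]))

    if __name__ == "__main__":
        fs = 512.0
        t = np.arange(0, 10, 1 / fs)
        rng = np.random.default_rng(0)
        rest = np.sin(2 * np.pi * 12 * t) + 0.5 * rng.standard_normal(t.size)           # strong rhythm
        imagery = 0.4 * np.sin(2 * np.pi * 12 * t) + 0.5 * rng.standard_normal(t.size)  # desynchronized
        p_rest = band_power(rest, fs, 8, 24)
        p_img = band_power(imagery, fs, 8, 24)
        print(f"8-24 Hz power change during imagery: {100 * (p_img - p_rest) / p_rest:.0f}%")
    ```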

  4. Functional MRI-based identification of brain areas involved in motor imagery for implantable brain-computer interfaces

    NASA Astrophysics Data System (ADS)

    Hermes, D.; Vansteensel, M. J.; Albers, A. M.; Bleichner, M. G.; Benedictus, M. R.; Mendez Orellana, C.; Aarnoutse, E. J.; Ramsey, N. F.

    2011-04-01

    For the development of minimally invasive brain-computer interfaces (BCIs), it is important to accurately localize the area of implantation. Using fMRI, we investigated which brain areas are involved in motor imagery. Twelve healthy subjects performed a motor execution and imagery task during separate fMRI and EEG measurements. fMRI results showed that during imagery, premotor and parietal areas were most robustly activated in individual subjects, but surprisingly, no activation was found in the primary motor cortex. EEG results showed that spectral power decreases in contralateral sensorimotor rhythms (8-24 Hz) during both movement and imagery. To further verify the involvement of the motor imagery areas found with fMRI, one epilepsy patient performed the same task during both fMRI and ECoG recordings. Significant ECoG low (8-24 Hz) and high (65-95 Hz) frequency power changes were observed selectively on premotor cortex and these co-localized with fMRI. During a subsequent BCI task, excellent performance (91%) was obtained based on ECoG power changes from the localized premotor area. These results indicate that other areas than the primary motor area may be more reliably activated during motor imagery. Specifically, the premotor cortex may be a better area to implant an invasive BCI.

  5. Identification of MicroRNAs and transcript targets in Camelina sativa by deep sequencing and computational methods

    DOE PAGES

    Poudel, Saroj; Aryal, Niranjan; Lu, Chaofu; Wang, Tai

    2015-03-31

    Camelina sativa is an annual oilseed crop that is under intensive development for renewable resources of biofuels and industrial oils. MicroRNAs, or miRNAs, are endogenously encoded small RNAs that play key roles in diverse plant biological processes. Here, we conducted deep sequencing on small RNA libraries prepared from camelina leaves, flower buds and two stages of developing seeds corresponding to initial and peak storage products accumulation. Computational analyses identified 207 known miRNAs belonging to 63 families, as well as 5 novel miRNAs. These miRNAs, especially members of the miRNA families, varied greatly in different tissues and developmental stages. The predicted miRNA target genes are involved in a broad range of physiological functions including lipid metabolism. This report is the first step toward elucidating roles of miRNAs in C. sativa and will provide additional tools to improve this oilseed crop for biofuels and biomaterials.

  6. Rapid identification of mycobacteria to the species level by polymerase chain reaction and restriction enzyme analysis--a case report of corneal ulcer.

    PubMed

    Chen, K H; Sheu, M M; Lin, S R

    1997-09-01

    We report a case of corneal ulcer caused by nontuberculous mycobacteria, confirmed by smear and culture. We attempted a new method for the rapid identification of mycobacteria to the species level, based on polymerase chain reaction (PCR) followed by restriction enzyme analysis of the PCR product obtained with primers common to all mycobacteria. Using the restriction enzymes Bst EII and Hae III, clinically relevant and other frequent laboratory isolates were differentiated to the species or subspecies level by PCR-restriction enzyme analysis. Pattern analysis identified Mycobacterium chelonae subsp. abscessus in this case. The outcome suggests that PCR-restriction enzyme analysis is a useful method for the early diagnosis of nontuberculous mycobacterial keratitis. PMID:9348738

  7. Comparison of 16S rRNA sequencing with biochemical testing for species-level identification of clinical isolates of Neisseria spp.

    PubMed

    Mechergui, Arij; Achour, Wafa; Ben Hassen, Assia

    2014-08-01

    We aimed to compare the accuracy of genus- and species-level identification of Neisseria spp. using biochemical testing and 16S rRNA sequence analysis. These methods were evaluated using 85 Neisseria spp. clinical isolates initially identified to the genus level by conventional biochemical tests and the API NH system (bioMérieux®). In 34% (29/85), more than one possibility was given by 16S rRNA sequence analysis. In 6% (5/85), one of the possibilities offered by 16S rRNA gene sequencing agreed with the result given by biochemical testing. In 4% (3/85), the same species was given by both methods. 16S rRNA gene sequencing results did not correlate well with biochemical tests.
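
    The reported proportions follow directly from the isolate counts; the short check below reproduces them (category labels are paraphrased from the abstract):

```python
# Simple check of the concordance figures reported above (counts out of 85 isolates).
counts = {
    "multiple species suggested by 16S rRNA": 29,
    "one 16S rRNA possibility agreed with biochemistry": 5,
    "same single species by both methods": 3,
}
total = 85
for label, n in counts.items():
    print(f"{label}: {n}/{total} = {100 * n / total:.0f}%")
```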

  8. The First Results of Testing Methods and Algorithms for Automatic Real Time Identification of Waveforms Introduction from Local Earthquakes in Increased Level of Man-induced Noises for the Purposes of Ultra-short-term Warning about an Occurred Earthquake

    NASA Astrophysics Data System (ADS)

    Gravirov, V. V.; Kislov, K. V.

    2009-12-01

    The chief hazard posed by earthquakes lies in their suddenness. The number of earthquakes annually recorded is in excess of 100,000; of these, over 1000 are strong ones. Great human losses usually occur because no devices exist for advance warning of earthquakes. Mobile automatic information systems are therefore needed for the analysis of seismic information at high levels of man-made noise. The systems should operate in real time with the minimum possible computational delays and be able to make fast decisions. The central premise of the project is that sufficiently complete information about an earthquake can be obtained in real time by examining its first onset as recorded by a single seismic sensor or a local seismic array. The essential difference from existing systems consists in the following: analysis of local seismic data at high levels of man-made noise (that is, when the noise level may be above the seismic signal level), as well as self-contained operation. The algorithms developed during the project can also be used for individual personal protection kits and for warning the population in earthquake-prone areas around the world. The system being developed for this project uses both P and S waves. The difference in the velocities of these seismic waves permits a technique to be developed for identifying a damaging earthquake. Real-time analysis of first onsets yields the time that remains before surface waves arrive and the damage potential of these waves. Estimates show that, when the distance between the earthquake epicenter and the monitored site is of order 200 km, the time difference between the arrivals of P waves and surface waves will be about 30 seconds, which is quite sufficient to evacuate people from potentially hazardous spaces, insert moderators at nuclear power stations, interlock pipelines, stop transportation, and issue warnings to rescue services.
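
    The ~30-second warning window quoted above can be reproduced with a simple travel-time difference; the wave speeds below are typical crustal values assumed for illustration, not figures from the record:

```python
# Back-of-the-envelope check of the warning-time estimate in the abstract,
# using typical crustal wave speeds (assumptions, not values from the record).
V_P = 6.5        # km/s, P-wave speed (assumption)
V_S = 3.7        # km/s, S-wave speed (assumption)
V_SURFACE = 3.0  # km/s, surface-wave speed (assumption)

def warning_time(distance_km, v_fast=V_P, v_slow=V_SURFACE):
    """Seconds between the fast-phase arrival and the damaging slow phase."""
    return distance_km / v_slow - distance_km / v_fast

if __name__ == "__main__":
    d = 200.0  # km, epicentral distance used in the abstract
    print(f"P arrival: {d / V_P:.1f} s")
    print(f"S arrival: {d / V_S:.1f} s")
    print(f"Surface-wave arrival: {d / V_SURFACE:.1f} s")
    print(f"Warning window after P: ~{warning_time(d):.0f} s")  # on the order of 30 s
```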

  9. Gene expression levels in normal human lymphoblasts with variable sensitivities to arsenite: Identification of GGT1 and NFKBIE expression levels as possible biomarkers of susceptibility

    SciTech Connect

    Komissarova, Elena V.; Li Ping; Uddin, Ahmed N.; Chen, Xuyan; Nadas, Arthur; Rossman, Toby G.

    2008-01-15

    Drinking arsenic-contaminated water is associated with increased risk of neoplasias of the skin, lung, bladder and possibly other sites, as well as other diseases. Earlier, we showed that human lymphoblast lines from different normal unexposed donors showed variable sensitivities to the toxic effects of arsenite. In the present study, we used microarray analysis to compare the basal gene expression profiles between two arsenite-resistant (GM02707, GM00893) and two arsenite-sensitive lymphoblast lines (GM00546, GM00607). A number of genes were differentially expressed in arsenite-sensitive and arsenite-resistant cells. Among these, γ-glutamyltranspeptidase 1 (GGT1) and NFκB inhibitor-epsilon (NFKBIE) showed higher expression levels in arsenite-resistant cells. RT-PCR analysis with gene-specific primers confirmed these results. Reduction of GGT1 expression level in arsenite-resistant lymphoblasts with GGT1-specific siRNA resulted in increased cell sensitivity to arsenite. In conclusion, we have demonstrated for the first time that expression levels of GGT1 and possibly NFKBIE might be useful as biomarkers of genetic susceptibility to arsenite. Expression microarrays can thus be exploited for identifying additional biomarkers of susceptibility to arsenite and to other toxicants.
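
    The comparison described above amounts to contrasting basal expression between two resistant and two sensitive lines. The sketch below shows that contrast as a simple fold-change filter on a toy expression table (the values are invented for illustration only; the study used microarray analysis followed by RT-PCR confirmation):

```python
# Illustrative sketch only: flag genes with higher basal expression in
# arsenite-resistant versus arsenite-sensitive lines (toy data, hypothetical cutoff).
import numpy as np

def higher_in_resistant(expr, resistant_cols, sensitive_cols, min_fold=2.0):
    """Return gene -> fold change for genes elevated in resistant lines."""
    hits = {}
    for gene, values in expr.items():
        res = np.mean([values[c] for c in resistant_cols])
        sen = np.mean([values[c] for c in sensitive_cols])
        if sen > 0 and res / sen >= min_fold:
            hits[gene] = res / sen
    return hits

if __name__ == "__main__":
    # toy expression values keyed by cell line (line names from the abstract)
    expr = {
        "GGT1":   {"GM02707": 9.1, "GM00893": 8.7, "GM00546": 3.2, "GM00607": 2.9},
        "NFKBIE": {"GM02707": 6.4, "GM00893": 5.9, "GM00546": 2.8, "GM00607": 3.0},
    }
    print(higher_in_resistant(expr, ["GM02707", "GM00893"], ["GM00546", "GM00607"]))
```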

  10. Computational identification of novel biochemical systems involved in oxidation, glycosylation and other complex modifications of bases in DNA.

    PubMed

    Iyer, Lakshminarayan M; Zhang, Dapeng; Burroughs, A Maxwell; Aravind, L

    2013-09-01

    Discovery of the TET/JBP family of dioxygenases that modify bases in DNA has sparked considerable interest in novel DNA base modifications and their biological roles. Using sensitive sequence and structure analyses combined with contextual information from comparative genomics, we computationally characterize over 12 novel biochemical systems for DNA modifications. We predict previously unidentified enzymes, such as the kinetoplastid J-base generating glycosyltransferase (and its homolog GREB1), the catalytic specificity of bacteriophage TET/JBP proteins and their role in complex DNA base modifications. We also predict the enzymes involved in synthesis of hypermodified bases such as alpha-glutamylthymine and alpha-putrescinylthymine that have remained enigmatic for several decades. Moreover, the current analysis suggests that bacteriophages and certain nucleo-cytoplasmic large DNA viruses contain an unexpectedly diverse range of DNA modification systems, in addition to those using previously characterized enzymes such as Dam, Dcm, TET/JBP, pyrimidine hydroxymethylases, Mom and glycosyltransferases. These include enzymes generating modified bases such as deazaguanines related to queuine and archaeosine, pyrimidines comparable with lysidine, those derived using modified S-adenosyl methionine derivatives and those using TET/JBP-generated hydroxymethyl pyrimidines as biosynthetic starting points. We present evidence that some of these modification systems are also widely dispersed across prokaryotes and certain eukaryotes such as basidiomycetes, chlorophyte and stramenopile alga, where they could serve as novel epigenetic marks for regulation or discrimination of self from non-self DNA. Our study extends the role of the PUA-like fold domains in recognition of modified nucleic acids and predicts versions of the ASCH and EVE domains to be novel 'readers' of modified bases in DNA. These results open opportunities for the investigation of the biology of these systems.

  11. Identification and Validation of Novel Hedgehog-Responsive Enhancers Predicted by Computational Analysis of Ci/Gli Binding Site Density

    PubMed Central

    Richards, Neil; Parker, David S.; Johnson, Lisa A.; Allen, Benjamin L.; Barolo, Scott; Gumucio, Deborah L.

    2015-01-01

    The Hedgehog (Hh) signaling pathway directs a multitude of cellular responses during embryogenesis and adult tissue homeostasis. Stimulation of the pathway results in activation of Hh target genes by the transcription factor Ci/Gli, which binds to specific motifs in genomic enhancers. In Drosophila, only a few enhancers (patched, decapentaplegic, wingless, stripe, knot, hairy, orthodenticle) have been shown by in vivo functional assays to depend on direct Ci/Gli regulation. All but one (orthodenticle) contain more than one Ci/Gli site, prompting us to directly test whether homotypic clustering of Ci/Gli binding sites is sufficient to define a Hh-regulated enhancer. We therefore developed a computational algorithm to identify Ci/Gli clusters that are enriched over random expectation, within a given region of the genome. Candidate genomic regions containing Ci/Gli clusters were functionally tested in chicken neural tube electroporation assays and in transgenic flies. Of the 22 Ci/Gli clusters tested, seven novel enhancers (and the previously known patched enhancer) were identified as Hh-responsive and Ci/Gli-dependent in one or both of these assays, including: Cuticular protein 100A (Cpr100A); invected (inv), which encodes an engrailed-related transcription factor expressed at the anterior/posterior wing disc boundary; roadkill (rdx), the fly homolog of vertebrate Spop; the segment polarity gene gooseberry (gsb); and two previously untested regions of the Hh receptor-encoding patched (ptc) gene. We conclude that homotypic Ci/Gli clustering is not sufficient information to ensure Hh-responsiveness; however, it can provide a clue for enhancer recognition within putative Hedgehog target gene loci. PMID:26710299
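
    The core computational idea, scoring genomic windows for Ci/Gli site clusters enriched over random expectation, can be sketched as below. This is not the authors' algorithm: the motif pattern is a placeholder consensus, the window and background parameters are assumptions, and only the forward strand is scanned:

```python
# Sketch of homotypic-cluster scoring: count motif matches per window and test
# enrichment against a background match rate with a binomial tail probability.
import re
from scipy.stats import binom

MOTIF = re.compile(r"TGGGTGG[TC][CG]")  # placeholder Ci/Gli-like consensus (9-mer)

def cluster_pvalue(window_seq, background_rate):
    """Binomial tail probability of seeing >= k motif starts in this window."""
    k = len(MOTIF.findall(window_seq.upper()))
    n = max(len(window_seq) - 8, 1)  # possible start positions for a 9-mer
    return k, binom.sf(k - 1, n, background_rate)

def scan(sequence, window=1000, step=250, background_rate=1e-4, alpha=1e-3):
    """Yield (start, match_count, p) for windows enriched in motif matches."""
    for start in range(0, max(len(sequence) - window, 0) + 1, step):
        k, p = cluster_pvalue(sequence[start:start + window], background_rate)
        if k >= 2 and p < alpha:
            yield start, k, p
```

    As the abstract itself concludes, passing such a statistical filter is a clue rather than a guarantee of Hh responsiveness; functional assays are still required.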

  12. Computational Identification Raises a Riddle for Distribution of Putative NACHT NTPases in the Genome of Early Green Plants.

    PubMed

    Arya, Preeti; Acharya, Vishal

    2016-01-01

    NACHT NTPases and AP-ATPases belong to the STAND (signal transduction ATPases with numerous domains) class of P-loop NTPases, which are known to be involved in defense signaling pathways and apoptosis regulation. The AP-ATPases (also known as NB-ARC) and NACHT NTPases are widespread throughout all kingdoms of life, but in plants only AP-ATPases have been extensively studied, in the context of plant defense responses against pathogen invasion and the hypersensitive response (HR). In the present study, we employed a stringent computational genome-wide survey of 67 diverse organisms, spanning archaebacteria, cyanobacteria, fungi, animals and plants, to revisit the evolutionary history of these two STAND P-loop NTPases. This analysis revealed the presence of NACHT NTPases in early green plants (green algae and the lycophyte), which had not been previously reported. Depending on their domain associations, these NACHT NTPases are implicated in diverse functional activities such as transcription regulation in addition to defense signaling cascades. In Chlamydomonas reinhardtii, a green alga, WD40 repeats found at the carboxyl terminus of NACHT NTPases suggest a probable role in apoptosis regulation. Moreover, the genome of Selaginella moellendorffii, an extant lycophyte, intriguingly contains a considerable number of both AP-ATPases and NACHT NTPases, in contrast to the large repertoire of AP-ATPases in other plants, and thus emerges as an important node in the evolutionary tree of life. The large complement of AP-ATPases may have taken over the function of NACHT NTPases, a plausible reason for the absence of the latter in plant lineages. The presence of NACHT NTPases in early green plants and the phyletic patterns resulting from this study raise a quandary about the distribution of this STAND P-loop NTPase, with apparent horizontal gene transfer from cyanobacteria.
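
    A survey of this kind typically reduces to tallying, per organism, how many proteins carry NACHT versus NB-ARC (AP-ATPase) domains. The hedged sketch below shows only that tallying step over an assumed pre-computed domain annotation table; the upstream profile searches against NACHT/NB-ARC models are not shown:

```python
# Hedged sketch of the counting step of a genome-wide STAND NTPase survey.
# Input format is an assumption: a TSV of "organism<TAB>protein<TAB>domain".
import csv
from collections import defaultdict

def tally_stand_ntpases(domain_table, domains=("NACHT", "NB-ARC")):
    """Return {organism: {domain: number of distinct proteins with a hit}}."""
    hits = defaultdict(lambda: defaultdict(set))
    with open(domain_table, newline="") as fh:
        for row in csv.reader(fh, delimiter="\t"):
            organism, protein, domain = row[0], row[1], row[2]
            if domain in domains:
                hits[organism][domain].add(protein)
    return {org: {d: len(prots) for d, prots in doms.items()}
            for org, doms in hits.items()}
```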

  13. Computational identification of novel biochemical systems involved in oxidation, glycosylation and other complex modifications of bases in DNA

    PubMed Central

    Iyer, Lakshminarayan M.; Zhang, Dapeng; Maxwell Burroughs, A.; Aravind, L.

    2013-01-01

    Discovery of the TET/JBP family of dioxygenases that modify bases in DNA has sparked considerable interest in novel DNA base modifications and their biological roles. Using sensitive sequence and structure analyses combined with contextual information from comparative genomics, we computationally characterize over 12 novel biochemical systems for DNA modifications. We predict previously unidentified enzymes, such as the kinetoplastid J-base generating glycosyltransferase (and its homolog GREB1), the catalytic specificity of bacteriophage TET/JBP proteins and their role in complex DNA base modifications. We also predict the enzymes involved in synthesis of hypermodified bases such as alpha-glutamylthymine and alpha-putrescinylthymine that have remained enigmatic for several decades. Moreover, the current analysis suggests that bacteriophages and certain nucleo-cytoplasmic large DNA viruses contain an unexpectedly diverse range of DNA modification systems, in addition to those using previously characterized enzymes such as Dam, Dcm, TET/JBP, pyrimidine hydroxymethylases, Mom and glycosyltransferases. These include enzymes generating modified bases such as deazaguanines related to queuine and archaeosine, pyrimidines comparable with lysidine, those derived using modified S-adenosyl methionine derivatives and those using TET/JBP-generated hydroxymethyl pyrimidines as biosynthetic starting points. We present evidence that some of these modification systems are also widely dispersed across prokaryotes and certain eukaryotes such as basidiomycetes, chlorophyte and stramenopile alga, where they could serve as novel epigenetic marks for regulation or discrimination of self from non-self DNA. Our study extends the role of the PUA-like fold domains in recognition of modified nucleic acids and predicts versions of the ASCH and EVE domains to be novel ‘readers’ of modified bases in DNA. These results open opportunities for the investigation of the biology of these systems

  14. Computational Identification Raises a Riddle for Distribution of Putative NACHT NTPases in the Genome of Early Green Plants

    PubMed Central

    Arya, Preeti; Acharya, Vishal

    2016-01-01

    NACHT NTPases and AP-ATPases belong to the STAND (signal transduction ATPases with numerous domains) class of P-loop NTPases, which are known to be involved in defense signaling pathways and apoptosis regulation. The AP-ATPases (also known as NB-ARC) and NACHT NTPases are widespread throughout all kingdoms of life, but in plants only AP-ATPases have been extensively studied, in the context of plant defense responses against pathogen invasion and the hypersensitive response (HR). In the present study, we employed a stringent computational genome-wide survey of 67 diverse organisms, spanning archaebacteria, cyanobacteria, fungi, animals and plants, to revisit the evolutionary history of these two STAND P-loop NTPases. This analysis revealed the presence of NACHT NTPases in early green plants (green algae and the lycophyte), which had not been previously reported. Depending on their domain associations, these NACHT NTPases are implicated in diverse functional activities such as transcription regulation in addition to defense signaling cascades. In Chlamydomonas reinhardtii, a green alga, WD40 repeats found at the carboxyl terminus of NACHT NTPases suggest a probable role in apoptosis regulation. Moreover, the genome of Selaginella moellendorffii, an extant lycophyte, intriguingly contains a considerable number of both AP-ATPases and NACHT NTPases, in contrast to the large repertoire of AP-ATPases in other plants, and thus emerges as an important node in the evolutionary tree of life. The large complement of AP-ATPases may have taken over the function of NACHT NTPases, a plausible reason for the absence of the latter in plant lineages. The presence of NACHT NTPases in early green plants and the phyletic patterns resulting from this study raise a quandary about the distribution of this STAND P-loop NTPase, with apparent horizontal gene transfer from cyanobacteria. PMID:26930396

  15. Computational identification of conserved transcription factor binding sites upstream of genes induced in rat brain by transient focal ischemic stroke

    PubMed Central

    Pulliam, John V.K.; Xu, Zhenfeng; Ford, Gregory D.; Liu, Cuimei; Li, Yonggang; Stovall, Kyndra; Cannon, Virginetta S.; Tewolde, Teclemichael; Moreno, Carlos S.; Ford, Byron D.

    2013-01-01

    Microarray analysis has been used to understand how gene regulation plays a critical role in neuronal injury, survival and repair following ischemic stroke. To identify the transcriptional regulatory elements responsible for ischemia-induced gene expression, we examined gene expression profiles of rat brains following focal ischemia and performed computational analysis of consensus transcription factor binding sites (TFBS) in the genes of the dataset. In this study, rats were sacrificed 24 h after middle cerebral artery occlusion (MCAO) stroke and gene transcription in brain tissues following ischemia/reperfusion was examined using Affymetrix GeneChip technology. The CONserved transcription FACtor binding site (CONFAC) software package was used to identify over-represented TFBS in the upstream promoter regions of ischemia-induced genes compared to control datasets. CONFAC identified 12 TFBS that were statistically over-represented from our dataset of ischemia-induced genes, including three members of the Ets-1 family of transcription factors (TFs). Microarray results showed that mRNA for Ets-1 was increased following tMCAO but not pMCAO. Immunohistochemical analysis of Ets-1 protein in rat brains following MCAO showed that Ets-1 was highly expressed in neurons in the brain of sham control animals. Ets-1 protein expression was virtually abolished in injured neurons of the ischemic brain but was unchanged in peri-infarct brain areas. These data indicate that TFs, including Ets-1, may influence neuronal injury following ischemia. These findings could provide important insights into the mechanisms that lead to brain injury and could provide avenues for the development of novel therapies. PMID:23246490
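
    The over-representation idea behind this analysis can be illustrated with a simplified promoter-count comparison (this is not the CONFAC software itself, and the counts in the example are hypothetical):

```python
# Simplified TFBS over-representation test in the spirit of the analysis above:
# compare how many promoters in the ischemia-induced gene set contain a given
# TFBS versus a control gene set, using a one-sided Fisher's exact test.
from scipy.stats import fisher_exact

def tfbs_enrichment(with_site_induced, total_induced,
                    with_site_control, total_control):
    """One-sided Fisher's exact test for TFBS over-representation."""
    table = [
        [with_site_induced, total_induced - with_site_induced],
        [with_site_control, total_control - with_site_control],
    ]
    odds_ratio, p = fisher_exact(table, alternative="greater")
    return odds_ratio, p

if __name__ == "__main__":
    # hypothetical counts, for illustration only
    print(tfbs_enrichment(42, 120, 310, 2000))
```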

  16. A Modular Probe Strategy for Drug Localization, Target Identification and Target Occupancy Measurement on Single Cell Level.

    PubMed

    Rutkowska, Anna; Thomson, Douglas W; Vappiani, Johanna; Werner, Thilo; Mueller, Katrin M; Dittus, Lars; Krause, Jana; Muelbaier, Marcel; Bergamini, Giovanna; Bantscheff, Marcus

    2016-09-16

    Late stage failures of candidate drug molecules are frequently caused by off-target effects or inefficient target engagement in vivo. In order to address these fundamental challenges in drug discovery, we developed a modular probe strategy based on bioorthogonal chemistry that enables the attachment of multiple reporters to the same probe in cell extracts and live cells. In a systematic evaluation, we identified the inverse electron demand Diels-Alder reaction between trans-cyclooctene labeled probe molecules and tetrazine-tagged reporters to be the most efficient bioorthogonal reaction for this strategy. Bioorthogonal biotinylation of the probe allows the identification of drug targets in a chemoproteomics competition binding assay using quantitative mass spectrometry. Attachment of a fluorescent reporter enables monitoring of spatial localization of probes as well as drug-target colocalization studies. Finally, direct target occupancy of unlabeled drugs can be determined at single cell resolution by competitive binding with fluorescently labeled probe molecules. The feasibility of the modular probe strategy is demonstrated with noncovalent PARP inhibitors.
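
    The single-cell occupancy readout described at the end of this abstract can be expressed as simple competition arithmetic. The sketch below is an assumed formulation, not the published analysis: occupancy is estimated as one minus the residual probe signal after competition with the unlabeled drug.

```python
# Minimal sketch (assumed arithmetic): per-cell target occupancy of an
# unlabeled drug estimated from residual binding of the fluorescent probe.
import numpy as np

def target_occupancy(probe_only, probe_plus_drug, background=0.0):
    """Occupancy = 1 - residual probe signal after competition (per cell)."""
    probe_only = np.asarray(probe_only, dtype=float) - background
    probe_plus_drug = np.asarray(probe_plus_drug, dtype=float) - background
    residual = np.clip(probe_plus_drug / probe_only, 0.0, 1.0)
    return 1.0 - residual

if __name__ == "__main__":
    # toy per-cell fluorescence intensities
    print(target_occupancy([1000, 950, 1100], [220, 180, 260], background=50))
```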

  17. High-level waste storage tank farms/242-A evaporator standards/requirements identification document (S/RID), Vol. 2

    SciTech Connect

    Not Available

    1994-04-01

    The Quality Assurance Functional Area Requirements Identification Document (RID) addresses the programmatic requirements that ensure risks and environmental impacts are minimized and that safety, reliability, and performance are maximized through the application of effective management systems commensurate with the risks posed by the Tank Farm Facility and its operation. This RID incorporates guidance intended to provide Tank Farms management with the necessary requirements information to develop, upgrade, or assess the effectiveness of a Quality Assurance Program in the performance of organizational and functional activities. Quality Assurance is defined as all those planned and systematic actions necessary to provide adequate confidence that a facility, structure, system, or component will perform satisfactorily and safely in service. This document provides the specific requirements to meet DNFSB recommendations and the guidance provided in DOE Order 5700.6C, utilizing industry codes, standards, regulatory guidelines, and industry good practices that have proven to be essential elements for an effective and efficient Quality Assurance Program as the nuclear industry has matured over the last thirty years.

  18. Using single-species toxicity tests, community-level responses, and toxicity identification evaluations to investigate effluent impacts

    SciTech Connect

    Maltby, L.; Clayton, S.A.; Yu, H.; McLoughlin, N.; Wood, R.M.; Yin, D.

    2000-01-01

    Whole effluent toxicity (WET) tests are increasingly used to monitor compliance of consented discharges, but few studies have related toxicity measured using WET tests to receiving water impacts. Here the authors adopt a four-stage procedure to investigate the toxicity and biological impact of a point source discharge and to identify the major toxicants. In stage 1, standard WET tests were employed to determine the toxicity of the effluent. This was then followed by an assessment of receiving water toxicity using in situ deployment of indigenous (Gammarus pulex) and standard (Daphnia magna) test species. The third stage involved the use of biological survey techniques to assess the impact of the discharge on the structure and functioning of the benthic macroinvertebrate community. In stage 4, toxicity identification evaluations (TIE) were used to identify toxic components in the effluent. Receiving-water toxicity and ecological impact detected downstream of the discharge were consistent with the results of WET tests performed on the effluent. Downstream of the discharge, there was a reduction in D. magna survival, in G. pulex survival and feeding rate, in detritus processing, and in biotic indices based on macroinvertebrate community structure. The TIE studies suggested that chlorine was the principal toxicant in the effluent.

  19. High-level waste storage tank farms/242-A evaporator standards/requirements identification document (S/RID), Vol. 4

    SciTech Connect

    Not Available

    1994-04-01

    Radiation protection of personnel and the public is accomplished by establishing a well defined Radiation Protection Organization to ensure that appropriate controls on radioactive materials and radiation sources are implemented and documented. This Requirements Identification Document (RID) applies to the activities, personnel, structures, systems, components, and programs involved in executing the mission of the Tank Farms. The physical boundaries within which the requirements of this RID apply are the Single Shell Tank Farms, Double Shell Tank Farms, 242-A Evaporator-Crystallizer, 242-S, T Evaporators, Liquid Effluent Retention Facility (LERF), Purgewater Storage Facility (PWSF), and all interconnecting piping, valves, instrumentation, and controls. Also included is all piping, valves, instrumentation, and controls up to and including the most remote valve under Tank Farms control at any other Hanford Facility having an interconnection with Tank Farms. The boundary of the structures, systems, components, and programs to which this RID applies, is defined by those that are dedicated to and/or under the control of the Tank Farms Operations Department and are specifically implemented at the Tank Farms.

  20. A Modular Probe Strategy for Drug Localization, Target Identification and Target Occupancy Measurement on Single Cell Level.

    PubMed

    Rutkowska, Anna; Thomson, Douglas W; Vappiani, Johanna; Werner, Thilo; Mueller, Katrin M; Dittus, Lars; Krause, Jana; Muelbaier, Marcel; Bergamini, Giovanna; Bantscheff, Marcus

    2016-09-16

    Late stage failures of candidate drug molecules are frequently caused by off-target effects or inefficient target engagement in vivo. In order to address these fundamental challenges in drug discovery, we developed a modular probe strategy based on bioorthogonal chemistry that enables the attachment of multiple reporters to the same probe in cell extracts and live cells. In a systematic evaluation, we identified the inverse electron demand Diels-Alder reaction between trans-cyclooctene labeled probe molecules and tetrazine-tagged reporters to be the most efficient bioorthogonal reaction for this strategy. Bioorthogonal biotinylation of the probe allows the identification of drug targets in a chemoproteomics competition binding assay using quantitative mass spectrometry. Attachment of a fluorescent reporter enables monitoring of spatial localization of probes as well as drug-target colocalization studies. Finally, direct target occupancy of unlabeled drugs can be determined at single cell resolution by competitive binding with fluorescently labeled probe molecules. The feasibility of the modular probe strategy is demonstrated with noncovalent PARP inhibitors. PMID:27384741