Sample records for DNA-based computation solving

  1. Solving satisfiability problems using a novel microarray-based DNA computer.

    PubMed

    Lin, Che-Hsin; Cheng, Hsiao-Ping; Yang, Chang-Biau; Yang, Chia-Ning

    2007-01-01

    An algorithm based on a modified sticker model, combined with an advanced MEMS-based microarray technology, is demonstrated to solve the SAT problem, which has long served as a benchmark in DNA computing. Unlike conventional DNA computing algorithms, which need an initial data pool covering both correct and incorrect answers and then execute a series of separation procedures to destroy the unwanted ones, we build solutions in parts, satisfying one clause per step, and eventually solve the entire Boolean formula step by step. No time-consuming sample preparation procedures or delicate sample-applying equipment were required for the computing process. Moreover, experimental results show that the bound DNA sequences can withstand the chemical solutions used during computing, so the proposed method should be useful for large-scale problems.
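The clause-by-clause construction described above can be sketched in conventional code. This is a toy simulation of the idea only, not the microarray protocol; the function name and literal encoding are illustrative.

```python
from itertools import product

def solve_sat(num_vars, clauses):
    """Literal i > 0 means x_i is True; i < 0 means x_i is False (1-indexed)."""
    pool = [dict()]                          # partial assignments built so far
    for clause in clauses:                   # satisfy one clause per step
        clause_vars = {abs(lit) for lit in clause}
        new_pool = []
        for partial in pool:
            unset = sorted(clause_vars - partial.keys())
            for bits in product([False, True], repeat=len(unset)):
                cand = {**partial, **dict(zip(unset, bits))}
                # keep only extensions that satisfy the current clause
                if any(cand[abs(lit)] == (lit > 0) for lit in clause):
                    new_pool.append(cand)
        pool = new_pool
    results = set()
    for partial in pool:                     # complete untouched variables with False
        results.add(tuple(partial.get(v, False) for v in range(1, num_vars + 1)))
    return sorted(results)
```

Unlike pool-filtering approaches, the candidate set here never contains an assignment that violates an already-processed clause.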

  2. A new parallel DNA algorithm to solve the task scheduling problem based on inspired computational model.

    PubMed

    Wang, Zhaocai; Ji, Zuwen; Wang, Xiaoming; Wu, Tunhua; Huang, Wei

    2017-12-01

    As a promising approach to computationally intractable problems, DNA computing is an emerging research area spanning mathematics, computer science, and molecular biology. The task scheduling problem, a well-known NP-complete problem, assigns n jobs to m individuals and seeks to minimize the completion time of the last individual to finish. In this paper, we use a biologically inspired computational model and describe a new parallel algorithm that solves the task scheduling problem with basic DNA molecular operations. We design flexible-length DNA strands to represent the elements of the allocation matrix, apply appropriate biological operations, and obtain solutions of the task scheduling problem in the proper length range with less than O(n^2) time complexity. Copyright © 2017. Published by Elsevier B.V.
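As a conventional reference point for the objective the DNA algorithm targets, a brute-force makespan minimizer can be sketched. It is exponential in the number of jobs, which illustrates why massively parallel molecular search is attractive; names are illustrative.

```python
from itertools import product

def min_makespan(times, m):
    """times[j] = execution time of job j. Assign each job to one of m
    individuals, minimizing the load of the busiest individual (makespan)."""
    best = float("inf")
    for assign in product(range(m), repeat=len(times)):  # m**n assignments
        loads = [0] * m
        for job_time, person in zip(times, assign):
            loads[person] += job_time
        best = min(best, max(loads))
    return best
```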

  3. Solving probability reasoning based on DNA strand displacement and probability modules.

    PubMed

    Zhang, Qiang; Wang, Xiaobiao; Wang, Xiaojun; Zhou, Changjun

    2017-12-01

    In computational biology, DNA strand displacement technology is used to simulate computation and has shown strong computing ability. Most researchers use it to solve logic problems; it is only rarely used for probabilistic reasoning. To perform probabilistic reasoning, a conditional probability derivation model and a total probability model based on DNA strand displacement were established in this paper. The models were assessed through the game "read your mind," and they have been shown to enable the application of probabilistic reasoning in genetic diagnosis. Copyright © 2017 Elsevier Ltd. All rights reserved.
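The two probability modules compute standard identities. Assuming the usual definitions over a partition of events, the arithmetic they implement chemically looks like:

```python
def total_probability(priors, conditionals):
    """Law of total probability: P(A) = sum_i P(B_i) * P(A|B_i),
    where the B_i form a partition of the sample space."""
    assert abs(sum(priors) - 1.0) < 1e-9   # partition must cover everything
    return sum(p * c for p, c in zip(priors, conditionals))

def posterior(priors, conditionals, i):
    """Conditional probability derivation (Bayes):
    P(B_i|A) = P(B_i) * P(A|B_i) / P(A)."""
    return priors[i] * conditionals[i] / total_probability(priors, conditionals)
```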

  4. Solving traveling salesman problems with DNA molecules encoding numerical values.

    PubMed

    Lee, Ji Youn; Shin, Soo-Yong; Park, Tai Hyun; Zhang, Byoung-Tak

    2004-12-01

    We introduce a DNA encoding method to represent numerical values and a biased molecular algorithm based on the thermodynamic properties of DNA. DNA strands are designed to encode real values by variation of their melting temperatures. The thermodynamic properties of DNA are used for effective local search of optimal solutions using biochemical techniques, such as denaturation temperature gradient polymerase chain reaction and temperature gradient gel electrophoresis. The proposed method was successfully applied to the traveling salesman problem, an instance of optimization problems on weighted graphs. This work extends the capability of DNA computing to solving numerical optimization problems, which is contrasted with other DNA computing methods focusing on logical problem solving.
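Independent of the thermodynamic encoding, the optimization target can be stated as exhaustive tour search. This sketch enumerates what the paper's biased molecular search explores in parallel; the function name is illustrative.

```python
from itertools import permutations

def shortest_tour(dist):
    """dist: symmetric distance matrix. Returns (length, tour) of the
    shortest closed tour starting and ending at city 0."""
    n = len(dist)
    best = (float("inf"), None)
    for perm in permutations(range(1, n)):       # fix city 0 as the start
        tour = (0,) + perm
        length = sum(dist[tour[i]][tour[(i + 1) % n]] for i in range(n))
        best = min(best, (length, tour))
    return best
```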

  5. A Parallel Biological Optimization Algorithm to Solve the Unbalanced Assignment Problem Based on DNA Molecular Computing.

    PubMed

    Wang, Zhaocai; Pu, Jun; Cao, Liling; Tan, Jian

    2015-10-23

    The unbalanced assignment problem (UAP) is to optimally assign n jobs to m individuals (m < n) such that the minimum cost or maximum profit is obtained. It is a vitally important NP-complete problem in operations management and applied mathematics, with numerous real-life applications. In this paper, we present a new parallel DNA algorithm for solving the unbalanced assignment problem using DNA molecular operations. We design flexible-length DNA strands representing the different jobs and individuals, take appropriate steps, and obtain the solutions of the UAP in the proper length range in O(mn) time. We extend the application of DNA molecular operations, exploiting their parallelism to reduce the complexity of the computation.
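For the unconstrained reading of the UAP stated above (any individual may take several jobs), a brute-force silicon sketch follows. Note that in this variant the optimum reduces to picking each job's cheapest individual; side constraints (e.g., every individual must receive at least one job) would change that. Names are illustrative.

```python
from itertools import product

def min_cost_assignment(cost):
    """cost[j][i] = cost of giving job j to individual i (n jobs, m
    individuals, m < n). Every job goes to exactly one individual."""
    n, m = len(cost), len(cost[0])
    best = float("inf")
    for assign in product(range(m), repeat=n):   # m**n total assignments
        best = min(best, sum(cost[j][assign[j]] for j in range(n)))
    return best
```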

  6. A Parallel Biological Optimization Algorithm to Solve the Unbalanced Assignment Problem Based on DNA Molecular Computing

    PubMed Central

    Wang, Zhaocai; Pu, Jun; Cao, Liling; Tan, Jian

    2015-01-01

    The unbalanced assignment problem (UAP) is to optimally assign n jobs to m individuals (m < n) such that the minimum cost or maximum profit is obtained. It is a vitally important NP-complete problem in operations management and applied mathematics, with numerous real-life applications. In this paper, we present a new parallel DNA algorithm for solving the unbalanced assignment problem using DNA molecular operations. We design flexible-length DNA strands representing the different jobs and individuals, take appropriate steps, and obtain the solutions of the UAP in the proper length range in O(mn) time. We extend the application of DNA molecular operations, exploiting their parallelism to reduce the complexity of the computation. PMID:26512650

  7. A new fast algorithm for solving the minimum spanning tree problem based on DNA molecules computation.

    PubMed

    Wang, Zhaocai; Huang, Dongmei; Meng, Huajun; Tang, Chengpei

    2013-10-01

    The minimum spanning tree (MST) problem is to find a minimum-weight set of edges connecting all the vertices of a given undirected graph. It is a vitally important problem in graph theory and applied mathematics, with numerous real-life applications. Moreover, in previous studies DNA molecular operations were usually used to solve head-to-tail path-search problems, and rarely for problems whose solutions are multi-lateral path structures, such as the minimum spanning tree problem. In this paper, we present a new fast DNA algorithm for solving the MST problem using DNA molecular operations. For an undirected graph with n vertices and m edges, we design flexible-length DNA strands representing the vertices and edges, take appropriate steps, and obtain the solutions of the MST problem in the proper length range with O(3m+n) time complexity. We extend the application of DNA molecular operations and simultaneously simplify the complexity of the computation. Results of computer simulation experiments show that the proposed method updates some of the best known values in very short time and provides better solution accuracy than existing algorithms. Copyright © 2013 The Authors. Published by Elsevier Ireland Ltd. All rights reserved.
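For comparison, the MST problem is solvable classically in near-linear time. A Kruskal sketch over the same input (n vertices, m edges) looks like this; the edge encoding is illustrative.

```python
def kruskal_mst_weight(n, edges):
    """edges: list of (weight, u, v). Returns the total weight of a
    minimum spanning tree via Kruskal's algorithm with union-find."""
    parent = list(range(n))

    def find(x):                       # path-halving union-find
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    total = 0
    for w, u, v in sorted(edges):      # consider edges lightest-first
        ru, rv = find(u), find(v)
        if ru != rv:                   # keep edge only if it joins two components
            parent[ru] = rv
            total += w
    return total
```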

  8. Approaching mathematical model of the immune network based DNA Strand Displacement system.

    PubMed

    Mardian, Rizki; Sekiyama, Kosuke; Fukuda, Toshio

    2013-12-01

    One of the biggest obstacles in molecular programming is that there is still no direct method to compile an existing mathematical model into biochemical reactions in order to solve a computational problem. In this paper, the implementation of a DNA strand displacement system based on nature-inspired computation is investigated. Using Immune Network Theory and Chemical Reaction Networks, the compilation of DNA-based operations is defined and the formulation of its mathematical model is derived. Furthermore, the implementation on this system is compared with a conventional silicon-based implementation. The obtained results show a positive correlation between the two. One possible application of this DNA-based model is a decision-making scheme for an intelligent computer or molecular robot. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.

  9. Fast parallel molecular algorithms for DNA-based computation: solving the elliptic curve discrete logarithm problem over GF(2^n).

    PubMed

    Li, Kenli; Zou, Shuting; Xv, Jin

    2008-01-01

    Elliptic curve cryptographic algorithms convert input data to unrecognizable encryption and the unrecognizable data back again into its original decrypted form. The security of this form of encryption hinges on the enormous difficulty of solving the elliptic curve discrete logarithm problem (ECDLP), especially over GF(2^n), n ∈ Z+. This paper describes an effective method to find solutions to the ECDLP by means of a molecular computer. We propose that this research accomplishment would represent a breakthrough for applied biological computation, and this paper demonstrates that in principle this is possible. Three DNA-based algorithms, a parallel adder, a parallel multiplier, and a parallel inverse over GF(2^n), are described. The biological operation time of all of these algorithms is polynomial with respect to n. Considering this analysis, cryptography using a public key might be less secure. In this respect, a principal contribution of this paper is to provide enhanced evidence of the potential of molecular computing to tackle such ambitious computations.
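The underlying field arithmetic that the parallel adder and multiplier implement can be sketched conventionally. Assuming the small field GF(2^3) with the irreducible polynomial x^3 + x + 1 (bit pattern 0b1011) as an example:

```python
def gf2n_add(a, b):
    """Addition in GF(2^n) is carry-free: bitwise XOR of the bit-polynomials."""
    return a ^ b

def gf2n_mul(a, b, mod, n):
    """Multiply bit-polynomials a and b, reducing modulo the irreducible
    polynomial `mod` (e.g. x^3 + x + 1 -> 0b1011 for GF(2^3))."""
    result = 0
    while b:
        if b & 1:              # add current shift of a when b has this bit set
            result ^= a
        b >>= 1
        a <<= 1
        if a & (1 << n):       # reduce as soon as the degree reaches n
            a ^= mod
    return result
```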

  10. Fast Parallel Molecular Algorithms for DNA-Based Computation: Solving the Elliptic Curve Discrete Logarithm Problem over GF(2^n)

    PubMed Central

    Li, Kenli; Zou, Shuting; Xv, Jin

    2008-01-01

    Elliptic curve cryptographic algorithms convert input data to unrecognizable encryption and the unrecognizable data back again into its original decrypted form. The security of this form of encryption hinges on the enormous difficulty of solving the elliptic curve discrete logarithm problem (ECDLP), especially over GF(2^n), n ∈ Z+. This paper describes an effective method to find solutions to the ECDLP by means of a molecular computer. We propose that this research accomplishment would represent a breakthrough for applied biological computation, and this paper demonstrates that in principle this is possible. Three DNA-based algorithms, a parallel adder, a parallel multiplier, and a parallel inverse over GF(2^n), are described. The biological operation time of all of these algorithms is polynomial with respect to n. Considering this analysis, cryptography using a public key might be less secure. In this respect, a principal contribution of this paper is to provide enhanced evidence of the potential of molecular computing to tackle such ambitious computations. PMID:18431451

  11. Molecular Sticker Model Simulation on Silicon for a Maximum Clique Problem

    PubMed Central

    Ning, Jianguo; Li, Yanmei; Yu, Wen

    2015-01-01

    Molecular computers (also called DNA computers), as an alternative to traditional electronic computers, are smaller in size but more energy efficient, and have massive parallel processing capacity. However, DNA computers may not outperform electronic computers owing to their higher error rates and some limitations of the biological laboratory. The stickers model, as a typical DNA-based computer, is computationally complete and universal, and can be viewed as a bit-vertically operating machine. This makes it attractive for silicon implementation. Inspired by the information processing method on the stickers computer, we propose a novel parallel computing model called DEM (DNA Electronic Computing Model) on System-on-a-Programmable-Chip (SOPC) architecture. Except for the significant difference in the computing medium—transistor chips rather than bio-molecules—the DEM works similarly to DNA computers in immense parallel information processing. Additionally, a plasma display panel (PDP) is used to show the change of solutions, and helps us directly see the distribution of assignments. The feasibility of the DEM is tested by applying it to compute a maximum clique problem (MCP) with eight vertices. Owing to the limited computing sources on SOPC architecture, the DEM could solve moderate-size problems in polynomial time. PMID:26075867
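The MCP computation the DEM performs can be sketched as exhaustive search over vertex subsets, mirroring the exponential solution space a sticker machine encodes in parallel. This is a toy reference implementation, not the SOPC design.

```python
from itertools import combinations

def max_clique_size(n, edges):
    """Returns the size of the largest clique in an n-vertex graph.
    Exhaustive (exponential) search, largest subsets first."""
    adj = {frozenset(e) for e in edges}
    for size in range(n, 0, -1):
        for verts in combinations(range(n), size):
            # a clique needs every pair of its vertices to be adjacent
            if all(frozenset((u, v)) in adj for u, v in combinations(verts, 2)):
                return size
    return 0
```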

  12. In vitro molecular machine learning algorithm via symmetric internal loops of DNA.

    PubMed

    Lee, Ji-Hoon; Lee, Seung Hwan; Baek, Christina; Chun, Hyosun; Ryu, Je-Hwan; Kim, Jin-Woo; Deaton, Russell; Zhang, Byoung-Tak

    2017-08-01

    Programmable biomolecules, such as DNA strands, deoxyribozymes, and restriction enzymes, have been used to solve computational problems, construct large-scale logic circuits, and program simple molecular games. Although studies have shown the potential of molecular computing, the capability of computational learning with DNA molecules, i.e., molecular machine learning, has yet to be experimentally verified. Here, we present a novel molecular learning in vitro model in which symmetric internal loops of double-stranded DNA are exploited to measure the differences between training instances, thus enabling the molecules to learn from small errors. The model was evaluated on a data set of twenty dialogue sentences obtained from the television shows Friends and Prison Break. The wet DNA-computing experiments confirmed that the molecular learning machine was able to generalize the dialogue patterns of each show and successfully identify the show from which the sentences originated. The molecular machine learning model described here opens the way for solving machine learning problems in computer science and biology using in vitro molecular computing with the data encoded in DNA molecules. Copyright © 2017. Published by Elsevier B.V.

  13. Fast parallel DNA-based algorithms for molecular computation: quadratic congruence and factoring integers.

    PubMed

    Chang, Weng-Long

    2012-03-01

    Assume that n is a positive integer. If there is an integer M such that M^2 ≡ C (mod n), i.e., the congruence has a solution, then C is said to be a quadratic congruence (mod n). If the congruence has no solution, then C is said to be a quadratic noncongruence (mod n). Solving this problem is central to many important applications, the most obvious being cryptography. In this article, we describe a DNA-based algorithm for solving quadratic congruences and factoring integers. In addition to this novel contribution, we also show the utility of our encoding scheme and of the algorithm's submodules. We demonstrate how a variety of arithmetic, shift, and comparison operations, namely bitwise and full addition, subtraction, left shift, and comparison, can be performed using strands of DNA.
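Stripped of the DNA encoding, the quadratic-congruence test itself is a simple exhaustive check (the molecular algorithm explores the candidate values of M in parallel):

```python
def is_quadratic_congruence(C, n):
    """True iff some M with 0 <= M < n satisfies M^2 ≡ C (mod n)."""
    return any((M * M) % n == C % n for M in range(n))
```

For example, the quadratic residues mod 7 are {0, 1, 2, 4}, so 2 is a quadratic congruence (mod 7) while 3 is not.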

  14. DNA strand displacement system running logic programs.

    PubMed

    Rodríguez-Patón, Alfonso; Sainz de Murieta, Iñaki; Sosík, Petr

    2014-01-01

    The paper presents a DNA-based computing model that is enzyme-free and autonomous, requiring no human intervention during the computation. The model is able to perform iterated resolution steps with logical formulae in conjunctive normal form. The implementation is based on the technique of DNA strand displacement, with each clause encoded in a separate DNA molecule. Propositions are encoded by assigning a strand to each proposition p and its complementary strand to the proposition ¬p; clauses are encoded by combining different propositions in the same strand. The model allows logic programs composed of Horn clauses to be run by cascading resolution steps. The potential of the model is also demonstrated by its theoretical capability of solving SAT. The resulting SAT algorithm has linear time complexity in the number of resolution steps, whereas its spatial complexity is exponential in the number of variables of the formula. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
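A silicon sketch of the cascaded resolution the model performs on Horn-clause programs is forward chaining: repeatedly fire any rule whose body is already derived. The rule encoding below is illustrative, not the strand-level representation.

```python
def horn_consequences(facts, rules):
    """Forward chaining over Horn clauses. rules is a list of (body, head)
    pairs, e.g. ({'p', 'q'}, 'r') encodes p AND q -> r.
    Returns the set of all derivable propositions."""
    derived = set(facts)
    changed = True
    while changed:                      # iterate to a fixed point
        changed = False
        for body, head in rules:
            if body <= derived and head not in derived:
                derived.add(head)       # one resolution step
                changed = True
    return derived
```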

  15. Manipulation of oligonucleotides immobilized on solid supports - DNA computations on surfaces

    NASA Astrophysics Data System (ADS)

    Liu, Qinghua

    The manipulation of DNA oligonucleotides immobilized on various solid supports has been studied intensively, especially in the area of surface hybridization. Recently, surface-based biotechnology has been applied to the area of molecular computing. These surface-based methods have advantages with regard to ease of handling, facile purification, and less interference when compared to solution methodologies. This dissertation describes the investigation of molecular approaches to DNA computing. The feasibility of encoding a bit (0 or 1) of information for DNA-based computations at the single-nucleotide level was studied, particularly with regard to the efficiency and specificity of hybridization discrimination. Both gold and glass surfaces, with addressed arrays of 32 oligonucleotides, were employed, with similar hybridization results. Although single-base discrimination may be achieved in the system, it comes at the cost of a severe decrease in the efficiency of hybridization to perfectly matched sequences. This compromises the utility of single-nucleotide encoding for DNA computing applications in the absence of some additional mechanism for increasing specificity. Several methods are suggested, including a multiple-base encoding strategy. The multiple-base encoding strategy was employed to develop a prototype DNA computer. The approach was demonstrated by solving a small example of the Satisfiability (SAT) problem, an NP-complete problem in Boolean logic. Sixteen distinct DNA oligonucleotides, encoding all candidate solutions to the 4-variable-4-clause 3-SAT problem, were immobilized on a gold surface in a non-addressed format. Four cycles of MARK (hybridization), DESTROY (enzymatic destruction), and UNMARK (denaturation) were performed, which identified and eliminated members of the set that were not solutions to the problem. Determination of the answer was accomplished in the READOUT (sequence identification) operation by PCR amplification of the remaining molecules and hybridization to an addressed array. Four answers were determined, and the S/N ratio between correct and incorrect solutions ranged from 10 to 777, making discrimination between correct and incorrect solutions straightforward. Additionally, studies of enzymatic manipulations of DNA molecules on surfaces suggested the use of E. coli Exonuclease I (Exo I), and perhaps EarI, in the DESTROY operation.
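The MARK/DESTROY/UNMARK filtering can be simulated in a few lines: immobilize the full candidate pool up front, then destroy the strands that violate each clause in turn. This is a toy model of the logic, not the surface chemistry.

```python
from itertools import product

def surface_sat(num_vars, clauses):
    """One MARK/DESTROY/UNMARK cycle per clause: start from all 2^n
    candidates and destroy those failing the current clause.
    Literal i > 0 means x_i is True; i < 0 means x_i is False."""
    pool = set(product([False, True], repeat=num_vars))   # full initial pool
    for clause in clauses:
        pool = {a for a in pool
                if any(a[abs(lit) - 1] == (lit > 0) for lit in clause)}
    return sorted(pool)   # READOUT: whatever survives is a solution
```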

  16. Non-linear molecular pattern classification using molecular beacons with multiple targets.

    PubMed

    Lee, In-Hee; Lee, Seung Hwan; Park, Tai Hyun; Zhang, Byoung-Tak

    2013-12-01

    In vitro pattern classification has been highlighted as an important future application of DNA computing. Previous work has demonstrated the feasibility of linear classifiers using DNA-based molecular computing. However, complex tasks require non-linear classification capability. Here we design a molecular beacon that can interact with multiple targets and experimentally shows that its fluorescent signals form a complex radial-basis function, enabling it to be used as a building block for non-linear molecular classification in vitro. The proposed method was successfully applied to solving artificial and real-world classification problems: XOR and microRNA expression patterns. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
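A minimal sketch of the radial-basis idea follows, assuming Gaussian responses and an arbitrary width parameter (both illustrative, not fitted to the beacons' measured fluorescence). Two RBF units centred on the positive XOR patterns separate a problem that no single linear unit can.

```python
import math

def rbf(x, center, gamma=4.0):
    """Gaussian radial-basis response, analogous to a beacon's signal
    peaking near its target pattern. gamma is an assumed width parameter."""
    d2 = sum((a - b) ** 2 for a, b in zip(x, center))
    return math.exp(-gamma * d2)

def xor_classifier(x):
    """Sum two RBF units centred on the positive XOR inputs, then threshold."""
    score = rbf(x, (0, 1)) + rbf(x, (1, 0))
    return 1 if score > 0.5 else 0
```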

  17. Parallel computation with molecular-motor-propelled agents in nanofabricated networks.

    PubMed

    Nicolau, Dan V; Lard, Mercy; Korten, Till; van Delft, Falco C M J M; Persson, Malin; Bengtsson, Elina; Månsson, Alf; Diez, Stefan; Linke, Heiner; Nicolau, Dan V

    2016-03-08

    The combinatorial nature of many important mathematical problems, including nondeterministic-polynomial-time (NP)-complete problems, places a severe limitation on the problem size that can be solved with conventional, sequentially operating electronic computers. There have been significant efforts in conceiving parallel-computation approaches in the past, for example: DNA computation, quantum computation, and microfluidics-based computation. However, these approaches have not proven, so far, to be scalable and practical from a fabrication and operational perspective. Here, we report the foundations of an alternative parallel-computation system in which a given combinatorial problem is encoded into a graphical, modular network that is embedded in a nanofabricated planar device. Exploring the network in a parallel fashion using a large number of independent, molecular-motor-propelled agents then solves the mathematical problem. This approach uses orders of magnitude less energy than conventional computers, thus addressing issues related to power consumption and heat dissipation. We provide a proof-of-concept demonstration of such a device by solving, in a parallel fashion, the small instance {2, 5, 9} of the subset sum problem, which is a benchmark NP-complete problem. Finally, we discuss the technical advances necessary to make our system scalable with presently available technology.
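The benchmark instance {2, 5, 9} is small enough to enumerate directly; in the device, each motor-propelled agent's path through the network corresponds to one of these subsets. A sketch of the enumeration:

```python
from itertools import combinations

def subset_sums(values):
    """Enumerate every achievable subset sum of the given values."""
    sums = set()
    for r in range(len(values) + 1):
        for combo in combinations(values, r):   # 2^n subsets in total
            sums.add(sum(combo))
    return sorted(sums)
```

For {2, 5, 9} the reachable sums are 0, 2, 5, 7, 9, 11, 14, and 16, which is what the device reads out at its exits.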

  18. Engineering bacteria to solve the Burnt Pancake Problem

    PubMed Central

    Haynes, Karmella A; Broderick, Marian L; Brown, Adam D; Butner, Trevor L; Dickson, James O; Harden, W Lance; Heard, Lane H; Jessen, Eric L; Malloy, Kelly J; Ogden, Brad J; Rosemond, Sabriya; Simpson, Samantha; Zwack, Erin; Campbell, A Malcolm; Eckdahl, Todd T; Heyer, Laurie J; Poet, Jeffrey L

    2008-01-01

    Background We investigated the possibility of executing DNA-based computation in living cells by engineering Escherichia coli to address a classic mathematical puzzle called the Burnt Pancake Problem (BPP). The BPP is solved by sorting a stack of distinct objects (pancakes) into proper order and orientation using the minimum number of manipulations. Each manipulation reverses the order and orientation of one or more adjacent objects in the stack. We have designed a system that uses site-specific DNA recombination to mediate inversions of genetic elements that represent pancakes within plasmid DNA. Results Inversions (or "flips") of the DNA fragment pancakes are driven by the Salmonella typhimurium Hin/hix DNA recombinase system that we reconstituted as a collection of modular genetic elements for use in E. coli. Our system sorts DNA segments by inversions to produce different permutations of a promoter and a tetracycline resistance coding region; E. coli cells become antibiotic resistant when the segments are properly sorted. Hin recombinase can mediate all possible inversion operations on adjacent flippable DNA fragments. Mathematical modeling predicts that the system reaches equilibrium after very few flips, where equal numbers of permutations are randomly sorted and unsorted. Semiquantitative PCR analysis of in vivo flipping suggests that inversion products accumulate on a time scale of hours or days rather than minutes. Conclusion The Hin/hix system is a proof-of-concept demonstration of in vivo computation with the potential to be scaled up to accommodate larger and more challenging problems. Hin/hix may provide a flexible new tool for manipulating transgenic DNA in vivo. PMID:18492232
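The BPP itself can be stated and solved exactly for tiny stacks by breadth-first search over signed permutations. This is a sketch of the mathematical puzzle, not the Hin/hix biochemistry: a pancake is a signed integer, and each flip reverses a prefix and inverts its orientations.

```python
from collections import deque

def min_flips(stack):
    """Minimum prefix reversals (each also flipping orientation signs)
    needed to sort a signed stack into (1, 2, ..., n), all positive."""
    goal = tuple(range(1, len(stack) + 1))
    start = tuple(stack)
    seen = {start}
    queue = deque([(start, 0)])
    while queue:
        state, depth = queue.popleft()
        if state == goal:
            return depth
        for k in range(1, len(state) + 1):
            # flip the top k pancakes: reverse order, negate orientations
            flipped = tuple(-x for x in reversed(state[:k])) + state[k:]
            if flipped not in seen:
                seen.add(flipped)
                queue.append((flipped, depth + 1))
```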

  19. Exploring the Feasibility of a DNA Computer: Design of an ALU Using Sticker-Based DNA Model.

    PubMed

    Sarkar, Mayukh; Ghosal, Prasun; Mohanty, Saraju P

    2017-09-01

    Since its inception, DNA computing has advanced to offer an extremely powerful, energy-efficient emerging technology for solving hard computational problems, with its inherent massive parallelism and extremely high data density. It would be much more powerful and general-purpose when combined with the well-known algorithmic solutions that exist for conventional computing architectures, via a suitable ALU. Thus, a specifically designed DNA Arithmetic and Logic Unit (ALU) that can address operations suitable for both domains can bridge the gap between the two. An ALU must be able to perform all possible logic operations, including NOT, OR, AND, XOR, NOR, NAND, and XNOR, as well as compare and shift operations, and integer and floating-point arithmetic operations (addition, subtraction, multiplication, and division). In this paper, the design of an ALU using a sticker-based DNA model is proposed, with an experimental feasibility analysis. The novelties of this paper are manifold. First, the integer arithmetic performed here is 2's-complement arithmetic, and the floating-point operations follow the IEEE 754 floating-point format, closely resembling a conventional ALU. Also, the output of each operation can be reused for any subsequent operation, so any algorithm or program logic that users can think of can be implemented directly on the DNA computer without modification. Second, once the basic operations of the sticker model are automated, the implementations proposed in this paper become highly suitable for designing a fully automated ALU. Third, the proposed approaches are easy to implement. Finally, these approaches can work on sufficiently large binary numbers.
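A toy fixed-width model of the integer operations listed above, using 2's-complement wrap-around, illustrates the arithmetic conventions involved (it is not the sticker-model implementation; the operation names are illustrative):

```python
def alu(op, a, b=0, bits=8):
    """Tiny 2's-complement ALU model: results wrap to the word size,
    as they would on fixed-width strands or registers."""
    mask = (1 << bits) - 1
    ops = {
        "ADD": (a + b) & mask,
        "SUB": (a - b) & mask,     # subtraction via 2's-complement wrap
        "AND": a & b, "OR": a | b, "XOR": a ^ b,
        "NOT": (~a) & mask,
        "SHL": (a << 1) & mask,    # left shift by one
    }
    return ops[op]

def to_signed(x, bits=8):
    """Reinterpret an unsigned word as a 2's-complement value."""
    return x - (1 << bits) if x & (1 << (bits - 1)) else x
```

For example, 3 - 5 in an 8-bit word yields the pattern 254, which reads back as -2 in 2's complement.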

  20. Games that Enlist Collective Intelligence to Solve Complex Scientific Problems.

    PubMed

    Burnett, Stephen; Furlong, Michelle; Melvin, Paul Guy; Singiser, Richard

    2016-03-01

    There is great value in employing the collective problem-solving power of large groups of people. Technological advances have allowed computer games to be utilized by a diverse population to solve problems. Science games are becoming more popular and cover various areas such as sequence alignments, DNA base-pairing, and protein and RNA folding. While these tools have been developed for the general population, they can also be used effectively in the classroom to teach students about various topics. Many games also employ a social component that entices students to continue playing and thereby to continue learning. The basic functions of game play and the potential of game play as a tool in the classroom are discussed in this article.

  1. Games that Enlist Collective Intelligence to Solve Complex Scientific Problems

    PubMed Central

    Burnett, Stephen; Furlong, Michelle; Melvin, Paul Guy; Singiser, Richard

    2016-01-01

    There is great value in employing the collective problem-solving power of large groups of people. Technological advances have allowed computer games to be utilized by a diverse population to solve problems. Science games are becoming more popular and cover various areas such as sequence alignments, DNA base-pairing, and protein and RNA folding. While these tools have been developed for the general population, they can also be used effectively in the classroom to teach students about various topics. Many games also employ a social component that entices students to continue playing and thereby to continue learning. The basic functions of game play and the potential of game play as a tool in the classroom are discussed in this article. PMID:27047610

  2. High-resolution mapping of bifurcations in nonlinear biochemical circuits

    NASA Astrophysics Data System (ADS)

    Genot, A. J.; Baccouche, A.; Sieskind, R.; Aubert-Kato, N.; Bredeche, N.; Bartolo, J. F.; Taly, V.; Fujii, T.; Rondelez, Y.

    2016-08-01

    Analog molecular circuits can exploit the nonlinear nature of biochemical reaction networks to compute low-precision outputs with fewer resources than digital circuits. This analog computation is similar to that employed by gene-regulation networks. Although digital systems have a tractable link between structure and function, the nonlinear and continuous nature of analog circuits yields an intricate functional landscape, which makes their design counter-intuitive, their characterization laborious and their analysis delicate. Here, using droplet-based microfluidics, we map with high resolution and dimensionality the bifurcation diagrams of two synthetic, out-of-equilibrium and nonlinear programs: a bistable DNA switch and a predator-prey DNA oscillator. The diagrams delineate where function is optimal, dynamics bifurcates and models fail. Inverse problem solving on these large-scale data sets indicates interference from enzymatic coupling. Additionally, data mining exposes the presence of rare, stochastically bursting oscillators near deterministic bifurcations.

  3. Software Reviews.

    ERIC Educational Resources Information Center

    Science Software Quarterly, 1984

    1984-01-01

    Provides extensive reviews of computer software, examining documentation, ease of use, performance, error handling, special features, and system requirements. Includes statistics, problem-solving (TK Solver), label printing, database management, experimental psychology, Encyclopedia Britannica biology, and DNA-sequencing programs. A program for…

  4. Analysis of problem solving on project based learning with resource based learning approach computer-aided program

    NASA Astrophysics Data System (ADS)

    Kuncoro, K. S.; Junaedi, I.; Dwijanto

    2018-03-01

    This study aimed to reveal the effectiveness of Project Based Learning with a Resource Based Learning approach in a computer-aided program, and analyzed problem-solving abilities in terms of problem-solving steps based on Polya's stages. The research method used was a mixed method with a sequential explanatory design. The subjects of this research were fourth-semester mathematics students. The results showed that the S-TPS (Strong Top Problem Solving) and W-TPS (Weak Top Problem Solving) subjects had good problem-solving abilities on each problem-solving indicator. The problem-solving ability of the S-MPS (Strong Middle Problem Solving) and W-MPS (Weak Middle Problem Solving) subjects on each indicator was also good. The S-BPS (Strong Bottom Problem Solving) subject had difficulty solving the problem with the computer program, was imprecise in writing the final conclusion, and could not reflect on the problem-solving process using Polya's steps. The W-BPS (Weak Bottom Problem Solving) subject failed to meet almost all of the problem-solving indicators and could not correctly construct the initial completion table, so the completion phase following Polya's steps was constrained.

  5. Stochastic modelling, Bayesian inference, and new in vivo measurements elucidate the debated mtDNA bottleneck mechanism

    PubMed Central

    Johnston, Iain G; Burgstaller, Joerg P; Havlicek, Vitezslav; Kolbe, Thomas; Rülicke, Thomas; Brem, Gottfried; Poulton, Jo; Jones, Nick S

    2015-01-01

    Dangerous damage to mitochondrial DNA (mtDNA) can be ameliorated during mammalian development through a highly debated mechanism called the mtDNA bottleneck. Uncertainty surrounding this process limits our ability to address inherited mtDNA diseases. We produce a new, physically motivated, generalisable theoretical model for mtDNA populations during development, allowing the first statistical comparison of proposed bottleneck mechanisms. Using approximate Bayesian computation and mouse data, we find most statistical support for a combination of binomial partitioning of mtDNAs at cell divisions and random mtDNA turnover, meaning that the debated exact magnitude of mtDNA copy number depletion is flexible. New experimental measurements from a wild-derived mtDNA pairing in mice confirm the theoretical predictions of this model. We analytically solve a mathematical description of this mechanism, computing probabilities of mtDNA disease onset, efficacy of clinical sampling strategies, and effects of potential dynamic interventions, thus developing a quantitative and experimentally-supported stochastic theory of the bottleneck. DOI: http://dx.doi.org/10.7554/eLife.07464.001 PMID:26035426
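The binomial-partitioning component of the supported mechanism can be illustrated with a small stochastic simulation: variance of the mutant fraction across cells grows with successive divisions, which is the statistical core of a bottleneck. Parameters are arbitrary, and random turnover is omitted for brevity.

```python
import random

def divide(mutant, wildtype, rng):
    """Binomial partitioning: each mtDNA copy independently goes to one
    daughter cell with probability 1/2."""
    m = sum(rng.random() < 0.5 for _ in range(mutant))
    w = sum(rng.random() < 0.5 for _ in range(wildtype))
    return m, w

def heteroplasmy_variance(m0, w0, divisions, cells=2000, seed=1):
    """Variance of the mutant fraction across a population of cells after
    the given number of binomial partitions (with re-doubling back to the
    original copy number between divisions)."""
    rng = random.Random(seed)
    fracs = []
    for _ in range(cells):
        m, w = m0, w0
        for _ in range(divisions):
            m, w = divide(m, w, rng)
            m, w = 2 * m, 2 * w          # deterministic replication step
        fracs.append(m / (m + w) if m + w else 0.0)
    mean = sum(fracs) / len(fracs)
    return sum((f - mean) ** 2 for f in fracs) / len(fracs)
```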

  6. Automated property optimization via ab initio O(N) elongation method: Application to (hyper-)polarizability in DNA

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Orimoto, Yuuichi, E-mail: orimoto.yuuichi.888@m.kyushu-u.ac.jp; Aoki, Yuriko; Japan Science and Technology Agency, CREST, 4-1-8 Hon-chou, Kawaguchi, Saitama 332-0012

    An automated property optimization method was developed based on the ab initio O(N) elongation (ELG) method and applied to the optimization of nonlinear optical (NLO) properties in DNA as a first test. The ELG method mimics a polymerization reaction on a computer, and the reaction terminal of a starting cluster is attacked by monomers sequentially to elongate the electronic structure of the system by solving in each step a limited space including the terminal (localized molecular orbitals at the terminal) and monomer. The ELG-finite field (ELG-FF) method for calculating (hyper-)polarizabilities was used as the engine program of the optimization method, and it was found to show linear scaling efficiency while maintaining high computational accuracy for a random sequenced DNA model. Furthermore, the self-consistent field convergence was significantly improved by using the ELG-FF method compared with a conventional method, and it can lead to more feasible NLO property values in the FF treatment. The automated optimization method successfully chose an appropriate base pair from four base pairs (A, T, G, and C) for each elongation step according to an evaluation function. From test optimizations for the first order hyper-polarizability (β) in DNA, a substantial difference was observed depending on optimization conditions between “choose-maximum” (choose a base pair giving the maximum β for each step) and “choose-minimum” (choose a base pair giving the minimum β). In contrast, there was an ambiguous difference between these conditions for optimizing the second order hyper-polarizability (γ) because of the small absolute value of γ and the limitation of numerical differential calculations in the FF method. It can be concluded that the ab initio level property optimization method introduced here can be an effective step towards an advanced computer aided material design method as long as the numerical limitation of the FF method is taken into account.

  7. Automated property optimization via ab initio O(N) elongation method: Application to (hyper-)polarizability in DNA.

    PubMed

    Orimoto, Yuuichi; Aoki, Yuriko

    2016-07-14

    An automated property optimization method was developed based on the ab initio O(N) elongation (ELG) method and applied to the optimization of nonlinear optical (NLO) properties in DNA as a first test. The ELG method mimics a polymerization reaction on a computer, and the reaction terminal of a starting cluster is attacked by monomers sequentially to elongate the electronic structure of the system by solving in each step a limited space including the terminal (localized molecular orbitals at the terminal) and monomer. The ELG-finite field (ELG-FF) method for calculating (hyper-)polarizabilities was used as the engine program of the optimization method, and it was found to show linear scaling efficiency while maintaining high computational accuracy for a random sequenced DNA model. Furthermore, the self-consistent field convergence was significantly improved by using the ELG-FF method compared with a conventional method, and it can lead to more feasible NLO property values in the FF treatment. The automated optimization method successfully chose an appropriate base pair from four base pairs (A, T, G, and C) for each elongation step according to an evaluation function. From test optimizations for the first order hyper-polarizability (β) in DNA, a substantial difference was observed depending on optimization conditions between "choose-maximum" (choose a base pair giving the maximum β for each step) and "choose-minimum" (choose a base pair giving the minimum β). In contrast, there was an ambiguous difference between these conditions for optimizing the second order hyper-polarizability (γ) because of the small absolute value of γ and the limitation of numerical differential calculations in the FF method. It can be concluded that the ab initio level property optimization method introduced here can be an effective step towards an advanced computer aided material design method as long as the numerical limitation of the FF method is taken into account.

  8. The semantic system is involved in mathematical problem solving.

    PubMed

    Zhou, Xinlin; Li, Mengyi; Li, Leinian; Zhang, Yiyun; Cui, Jiaxin; Liu, Jie; Chen, Chuansheng

    2018-02-01

    Numerous studies have shown that the brain regions around bilateral intraparietal cortex are critical for number processing and arithmetical computation. However, the neural circuits for more advanced mathematics such as mathematical problem solving (with little routine arithmetical computation) remain unclear. Using functional magnetic resonance imaging (fMRI), this study (N = 24 undergraduate students) compared neural bases of mathematical problem solving (i.e., number series completion, mathematical word problem solving, and geometric problem solving) and arithmetical computation. Direct subject- and item-wise comparisons revealed that mathematical problem solving typically had greater activation than arithmetical computation in all 7 regions of the semantic system (which was based on a meta-analysis of 120 functional neuroimaging studies on semantic processing). Arithmetical computation typically had greater activation in the supplementary motor area and left precentral gyrus. The results suggest that the semantic system in the brain supports mathematical problem solving. Copyright © 2017 Elsevier Inc. All rights reserved.

  9. Three-input majority logic gate and multiple input logic circuit based on DNA strand displacement.

    PubMed

    Li, Wei; Yang, Yang; Yan, Hao; Liu, Yan

    2013-06-12

    In biomolecular programming, the properties of biomolecules such as proteins and nucleic acids are harnessed for computational purposes. The field has gained considerable attention due to the possibility of exploiting the massive parallelism that is inherent in natural systems to solve computational problems. DNA has already been used to build complex molecular circuits, where the basic building blocks are logic gates that produce single outputs from one or more logical inputs. We designed and experimentally realized a three-input majority gate based on DNA strand displacement. One of the key features of a three-input majority gate is that the three inputs have equal priority, and the output will be true if at least two of the inputs are true. Our design consists of a central, circular DNA strand with three unique domains between which are identical joint sequences. Before inputs are introduced to the system, each domain and half of each joint is protected by one complementary ssDNA that displays a toehold for subsequent displacement by the corresponding input. With this design the relationship between any two domains is analogous to the relationship between inputs in a majority gate. Displacing two or more of the protection strands will expose at least one complete joint and return a true output; displacing none or only one of the protection strands will not expose a complete joint and will return a false output. Further, we designed and realized a complex five-input logic gate based on the majority gate described here. By controlling two of the five inputs the complex gate can realize every combination of OR and AND gates of the other three inputs.
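
    The Boolean behaviour of the molecular majority gate can be sketched in ordinary software (illustrative only; the paper's gate is implemented with strand displacement, and the function name is ours):

```python
def majority3(a: bool, b: bool, c: bool) -> bool:
    # True when at least two of the three inputs are true, mirroring the
    # strand-displacement design: exposing >= 1 complete joint -> true output.
    return (a and b) or (b and c) or (a and c)

# Enumerate all eight input combinations (the gate's full truth table).
table = {(a, b, c): majority3(a, b, c)
         for a in (False, True) for b in (False, True) for c in (False, True)}
```

    Exactly half of the eight input combinations yield a true output, reflecting the balanced, equal-priority nature of the majority function.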

  10. APBSmem: A Graphical Interface for Electrostatic Calculations at the Membrane

    PubMed Central

    Callenberg, Keith M.; Choudhary, Om P.; de Forest, Gabriel L.; Gohara, David W.; Baker, Nathan A.; Grabe, Michael

    2010-01-01

    Electrostatic forces are one of the primary determinants of molecular interactions. They help guide the folding of proteins, increase the binding of one protein to another and facilitate protein-DNA and protein-ligand binding. A popular method for computing the electrostatic properties of biological systems is to numerically solve the Poisson-Boltzmann (PB) equation, and there are several easy-to-use software packages available that solve the PB equation for soluble proteins. Here we present a freely available program, called APBSmem, for carrying out these calculations in the presence of a membrane. The Adaptive Poisson-Boltzmann Solver (APBS) is used as a back-end for solving the PB equation, and a Java-based graphical user interface (GUI) coordinates a set of routines that introduce the influence of the membrane, determine its placement relative to the protein, and set the membrane potential. The software Jmol is embedded in the GUI to visualize the protein inserted in the membrane before the calculation and the electrostatic potential after completing the computation. We expect that the ease with which the GUI allows one to carry out these calculations will make this software a useful resource for experimenters and computational researchers alike. Three examples of membrane protein electrostatic calculations are carried out to illustrate how to use APBSmem and to highlight the different quantities of interest that can be calculated. PMID:20949122

  11. APBSmem: a graphical interface for electrostatic calculations at the membrane.

    PubMed

    Callenberg, Keith M; Choudhary, Om P; de Forest, Gabriel L; Gohara, David W; Baker, Nathan A; Grabe, Michael

    2010-09-29

    Electrostatic forces are one of the primary determinants of molecular interactions. They help guide the folding of proteins, increase the binding of one protein to another and facilitate protein-DNA and protein-ligand binding. A popular method for computing the electrostatic properties of biological systems is to numerically solve the Poisson-Boltzmann (PB) equation, and there are several easy-to-use software packages available that solve the PB equation for soluble proteins. Here we present a freely available program, called APBSmem, for carrying out these calculations in the presence of a membrane. The Adaptive Poisson-Boltzmann Solver (APBS) is used as a back-end for solving the PB equation, and a Java-based graphical user interface (GUI) coordinates a set of routines that introduce the influence of the membrane, determine its placement relative to the protein, and set the membrane potential. The software Jmol is embedded in the GUI to visualize the protein inserted in the membrane before the calculation and the electrostatic potential after completing the computation. We expect that the ease with which the GUI allows one to carry out these calculations will make this software a useful resource for experimenters and computational researchers alike. Three examples of membrane protein electrostatic calculations are carried out to illustrate how to use APBSmem and to highlight the different quantities of interest that can be calculated.
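
    The kind of problem APBSmem delegates to APBS can be illustrated in one dimension: in the linearized (Debye-Hückel) regime the PB equation reduces to φ'' = κ²φ, which a simple finite-difference scheme solves directly. This is an illustrative toy of our own, not APBS itself, which solves the full nonlinear problem adaptively in three dimensions:

```python
import math

def linear_pb_1d(phi0: float, kappa: float, length: float, n: int = 200) -> list[float]:
    """Finite-difference solution of the 1-D linearized PB equation
    phi'' = kappa**2 * phi with phi(0) = phi0 and phi(L) = 0."""
    h = length / (n + 1)
    # Tridiagonal system for the n interior nodes:
    # -phi[i-1] + (2 + (kappa*h)**2) * phi[i] - phi[i+1] = 0
    a = [-1.0] * n                       # sub-diagonal
    b = [2.0 + (kappa * h) ** 2] * n     # diagonal
    c = [-1.0] * n                       # super-diagonal
    d = [0.0] * n
    d[0] = phi0                          # left boundary folded into the RHS
    # Thomas algorithm: forward elimination, then back substitution.
    for i in range(1, n):
        w = a[i] / b[i - 1]
        b[i] -= w * c[i - 1]
        d[i] -= w * d[i - 1]
    phi = [0.0] * n
    phi[-1] = d[-1] / b[-1]
    for i in range(n - 2, -1, -1):
        phi[i] = (d[i] - c[i] * phi[i + 1]) / b[i]
    return phi
```

    The numerical result can be checked against the analytic solution φ(x) = φ0·sinh(κ(L−x))/sinh(κL), the screened-decay profile characteristic of Debye-Hückel electrostatics.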

  12. QPSO-Based Adaptive DNA Computing Algorithm

    PubMed Central

    Karakose, Mehmet; Cigdem, Ugur

    2013-01-01

    DNA (deoxyribonucleic acid) computing, a new computation model that uses DNA molecules for information storage, has been increasingly applied to optimization and data analysis in recent years. However, the DNA computing algorithm has limitations in terms of convergence speed, adaptability, and effectiveness. In this paper, a new approach for improving DNA computing is proposed. The approach runs the DNA computing algorithm with parameters tuned adaptively toward the desired goal using quantum-behaved particle swarm optimization (QPSO). The contributions of the proposed QPSO-based adaptive DNA computing algorithm are as follows: (1) the population size, crossover rate, maximum number of operations, enzyme and virus mutation rates, and fitness function of the DNA computing algorithm are tuned simultaneously for the adaptive process; (2) the adaptive algorithm is driven by QPSO for goal-directed progress, faster operation, and flexibility with respect to the data; and (3) a numerical realization of the DNA computing algorithm with the proposed approach is implemented for system identification. Two experiments with different systems were carried out to evaluate the performance of the proposed approach, with comparative results. Experimental results obtained with Matlab and an FPGA demonstrate effective optimization, considerable convergence speed, and high accuracy relative to the standard DNA computing algorithm. PMID:23935409
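
    The QPSO optimizer at the heart of this approach can be sketched as follows. This is a minimal, generic QPSO minimizer of our own (function names and the test objective are illustrative), not the paper's parameter-tuning implementation:

```python
import math
import random

def qpso(f, dim: int, n_particles: int = 20, iters: int = 200,
         lo: float = -5.0, hi: float = 5.0, beta: float = 0.75, seed: int = 0):
    """Quantum-behaved PSO: particles jump around a stochastic attractor
    built from personal bests and the global best, with step sizes scaled
    by the distance to the mean-best position (mbest)."""
    rng = random.Random(seed)
    xs = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    pbest = [x[:] for x in xs]
    pval = [f(x) for x in xs]
    g = min(range(n_particles), key=lambda i: pval[i])
    gbest, gval = pbest[g][:], pval[g]
    for _ in range(iters):
        mbest = [sum(p[d] for p in pbest) / n_particles for d in range(dim)]
        for i in range(n_particles):
            for d in range(dim):
                phi = rng.random()
                attractor = phi * pbest[i][d] + (1 - phi) * gbest[d]
                u = max(rng.random(), 1e-12)   # avoid log(1/0)
                step = beta * abs(mbest[d] - xs[i][d]) * math.log(1 / u)
                xs[i][d] = attractor + step if rng.random() < 0.5 else attractor - step
            v = f(xs[i])
            if v < pval[i]:
                pbest[i], pval[i] = xs[i][:], v
                if v < gval:
                    gbest, gval = xs[i][:], v
    return gbest, gval
```

    In the paper's setting, `f` would score a full DNA computing run under a candidate parameter vector (population size, mutation rates, and so on) rather than a simple analytic function.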

  13. Students' Mathematics Word Problem-Solving Achievement in a Computer-Based Story

    ERIC Educational Resources Information Center

    Gunbas, N.

    2015-01-01

    The purpose of this study was to investigate the effect of a computer-based story, which was designed in anchored instruction framework, on sixth-grade students' mathematics word problem-solving achievement. Problems were embedded in a story presented on a computer as computer story, and then compared with the paper-based version of the same story…

  14. A Comparison of the Effects of Lego TC Logo and Problem Solving Software on Elementary Students' Problem Solving Skills.

    ERIC Educational Resources Information Center

    Palumbo, Debra L.; Palumbo, David B.

    1993-01-01

    Computer-based problem-solving software exposure was compared to Lego TC LOGO instruction. Thirty fifth graders received either Lego LOGO instruction, which couples Lego building block activities with LOGO computer programming, or instruction with various problem-solving computer programs. Although both groups showed significant progress, the Lego…

  15. New Trends of Digital Data Storage in DNA

    PubMed Central

    2016-01-01

    With the exponential growth in the amount of information generated and the emerging need for data to be stored for prolonged periods of time, there is a need for a storage medium with high capacity, high storage density, and the ability to withstand extreme environmental conditions. DNA emerges as a prospective medium for data storage with its striking features. Diverse encoding models for reading and writing data onto DNA, codes for encrypting data that address issues of error generation, and approaches for developing codons and storage styles have been developed over the recent past. DNA has been identified as a potential medium for secret writing, which paves the way toward DNA cryptography and steganography. DNA utilized as an organic memory device, along with big data storage and analytics in DNA, has paved the way toward DNA computing for solving computational problems. This paper critically analyzes the various methods used for encoding and encrypting data onto DNA, identifying the advantages of every scheme and its capability to overcome the drawbacks identified previously. Cryptography and steganography techniques are critically analyzed, identifying the limitations of each method. The paper also identifies the advantages and limitations of DNA as a memory device and in memory applications. PMID:27689089
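
    The simplest encoding model mentioned here maps two bits to each nucleotide. The sketch below uses one common assignment (00→A, 01→C, 10→G, 11→T); real schemes add error-correction codes and avoid homopolymer runs, which this toy ignores:

```python
# Two bits per nucleotide; this particular assignment is one common choice.
ENC = {'00': 'A', '01': 'C', '10': 'G', '11': 'T'}
DEC = {v: k for k, v in ENC.items()}

def bytes_to_dna(data: bytes) -> str:
    """Write arbitrary bytes as a nucleotide string (4 nt per byte)."""
    bits = ''.join(f'{b:08b}' for b in data)
    return ''.join(ENC[bits[i:i + 2]] for i in range(0, len(bits), 2))

def dna_to_bytes(seq: str) -> bytes:
    """Read the nucleotide string back into the original bytes."""
    bits = ''.join(DEC[n] for n in seq)
    return bytes(int(bits[i:i + 8], 2) for i in range(0, len(bits), 8))
```

    At 2 bits per base this gives the theoretical density ceiling for four-letter encoding; practical schemes trade some of it away for robustness to synthesis and sequencing errors.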

  16. New Trends of Digital Data Storage in DNA.

    PubMed

    De Silva, Pavani Yashodha; Ganegoda, Gamage Upeksha

    With the exponential growth in the amount of information generated and the emerging need for data to be stored for prolonged periods of time, there is a need for a storage medium with high capacity, high storage density, and the ability to withstand extreme environmental conditions. DNA emerges as a prospective medium for data storage with its striking features. Diverse encoding models for reading and writing data onto DNA, codes for encrypting data that address issues of error generation, and approaches for developing codons and storage styles have been developed over the recent past. DNA has been identified as a potential medium for secret writing, which paves the way toward DNA cryptography and steganography. DNA utilized as an organic memory device, along with big data storage and analytics in DNA, has paved the way toward DNA computing for solving computational problems. This paper critically analyzes the various methods used for encoding and encrypting data onto DNA, identifying the advantages of every scheme and its capability to overcome the drawbacks identified previously. Cryptography and steganography techniques are critically analyzed, identifying the limitations of each method. The paper also identifies the advantages and limitations of DNA as a memory device and in memory applications.

  17. Programmable and autonomous computing machine made of biomolecules

    PubMed Central

    Benenson, Yaakov; Paz-Elizur, Tamar; Adar, Rivka; Keinan, Ehud; Livneh, Zvi; Shapiro, Ehud

    2013-01-01

    Devices that convert information from one form into another according to a definite procedure are known as automata. One such hypothetical device is the universal Turing machine [1], which stimulated work leading to the development of modern computers. The Turing machine and its special cases [2], including finite automata [3], operate by scanning a data tape, whose striking analogy to information-encoding biopolymers inspired several designs for molecular DNA computers [4-8]. Laboratory-scale computing using DNA and human-assisted protocols has been demonstrated [9-15], but the realization of computing devices operating autonomously on the molecular scale remains rare [16-20]. Here we describe a programmable finite automaton comprising DNA and DNA-manipulating enzymes that solves computational problems autonomously. The automaton's hardware consists of a restriction nuclease and ligase, the software and input are encoded by double-stranded DNA, and programming amounts to choosing appropriate software molecules. Upon mixing solutions containing these components, the automaton processes the input molecule via a cascade of restriction, hybridization and ligation cycles, producing a detectable output molecule that encodes the automaton's final state, and thus the computational result. In our implementation, 10^12 automata sharing the same software run independently and in parallel on inputs (which could, in principle, be distinct) in 120 μl of solution at room temperature, at a combined rate of 10^9 transitions per second with a transition fidelity greater than 99.8%, consuming less than 10^-10 W. PMID:11719800
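
    The software analogue of such a molecular finite automaton is a plain table-driven DFA: each restriction/hybridization/ligation cycle consumes one input symbol and updates the state. The sketch below is ours; the example program (accept inputs with an even number of 'b' symbols) is one classic two-state automaton from this line of work, though exact programs vary:

```python
def run_automaton(transitions, start, accept, tape) -> bool:
    """Run a deterministic finite automaton: each step consumes one symbol,
    analogous to one restriction/ligation cycle of the molecular device."""
    state = start
    for symbol in tape:
        state = transitions[(state, symbol)]
    return state in accept

# Two-state program over the alphabet {a, b}:
# accept exactly the inputs containing an even number of 'b' symbols.
even_b = {('S0', 'a'): 'S0', ('S0', 'b'): 'S1',
          ('S1', 'a'): 'S1', ('S1', 'b'): 'S0'}
```

    In the molecular implementation, the "transition table" is a set of software DNA molecules, and the final state is read off as a detectable output strand rather than returned by a function.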

  18. Problem-Solving Models for Computer Literacy: Getting Smarter at Solving Problems. Student Lessons.

    ERIC Educational Resources Information Center

    Moursund, David

    This book is intended for use as a student guide. It is about human problem solving and provides information on how the mind works, placing a major emphasis on the role of computers as an aid in problem solving. The book is written with the underlying philosophy of discovery-based learning based on two premises: first, through the appropriate…

  19. Tyramine Hydrochloride Based Label-Free System for Operating Various DNA Logic Gates and a DNA Caliper for Base Number Measurements.

    PubMed

    Fan, Daoqing; Zhu, Xiaoqing; Dong, Shaojun; Wang, Erkang

    2017-07-05

    DNA is believed to be a promising candidate for molecular logic computation, and the fluorogenic/colorimetric substrates of G-quadruplex DNAzyme (G4zyme) are broadly used as label-free output reporters of DNA logic circuits. Herein, for the first time, tyramine-HCl (a fluorogenic substrate of G4zyme) is applied to DNA logic computation and a series of label-free DNA-input logic gates, including elementary AND, OR, and INHIBIT logic gates, as well as a two to one encoder, are constructed. Furthermore, a DNA caliper that can measure the base number of target DNA as low as three bases is also fabricated. This DNA caliper can also perform concatenated AND-AND logic computation to fulfil the requirements of sophisticated logic computing. © 2017 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.
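
    The elementary gates realized label-free here have simple Boolean definitions, sketched below in software for reference (our own function names; the paper's gates are molecular, with G4zyme/tyramine-HCl fluorescence as the output):

```python
def AND(a: bool, b: bool) -> bool:
    return a and b

def OR(a: bool, b: bool) -> bool:
    return a or b

def INHIBIT(a: bool, b: bool) -> bool:
    # True only when the data input a is on and the inhibiting input b is off.
    return a and not b
```

    Concatenation, as in the caliper's AND-AND circuit, simply feeds one gate's output into the next, e.g. `AND(AND(a, b), c)`.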

  20. Computer-Based Science Inquiry: How Components of Metacognitive Self-Regulation Affect Problem-Solving.

    ERIC Educational Resources Information Center

    Howard, Bruce C.; McGee, Steven; Shia, Regina; Hong, Namsoo Shin

    This study sought to examine the effects of meta cognitive self-regulation on problem solving across three conditions: (1) an interactive, computer-based treatment condition; (2) a noninteractive computer-based alternative treatment condition; and (3) a control condition. Also investigated was which of five components of metacognitive…

  1. Integrating Computers into the Problem-Solving Process.

    ERIC Educational Resources Information Center

    Lowther, Deborah L.; Morrison, Gary R.

    2003-01-01

    Asserts that within the context of problem-based learning environments, professors can encourage students to use computers as problem-solving tools. The ten-step Integrating Technology for InQuiry (NteQ) model guides professors through the process of integrating computers into problem-based learning activities. (SWM)

  2. Reversible Data Hiding Based on DNA Computing

    PubMed Central

    Xie, Yingjie

    2017-01-01

    Biocomputing, especially DNA computing, has developed considerably and is widely used in information security. In this paper, a novel algorithm for reversible data hiding based on DNA computing is proposed. Inspired by histogram modification, a classical algorithm for reversible data hiding, we combine it with DNA computing to realize the algorithm with biological technology. Compared with previous results, our experimental results significantly improve the ER (embedding rate), and the PSNR (peak signal-to-noise ratio) of some test images is also improved. Experimental results show that the method is suitable for protecting the copyright of cover images in DNA-based information security. PMID:28280504
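
    The histogram-modification baseline that inspired this work can be sketched as follows. This is the classical pixel-domain scheme (Ni et al. style), simplified and written by us as a reference point, not the paper's DNA-based variant; it assumes an empty histogram bin exists above the peak bin:

```python
from collections import Counter

def embed(pixels, bits):
    """Histogram-shift embedding: shift bins between the peak and a zero bin
    up by one to free the bin at peak+1, then encode each payload bit at a
    peak pixel (0 -> stays at peak, 1 -> moves to peak+1). Fully reversible."""
    hist = Counter(pixels)
    peak = max(hist, key=hist.get)
    zero = next(v for v in range(peak + 1, 256) if hist.get(v, 0) == 0)
    out, it = [], iter(bits)
    for p in pixels:
        if peak < p < zero:
            out.append(p + 1)            # shift to create room at peak + 1
        elif p == peak:
            out.append(p + next(it, 0))  # embed one payload bit
        else:
            out.append(p)
    return out, peak, zero

def extract(stego, peak, zero):
    """Recover the payload bits and restore the original pixels exactly."""
    bits, pixels = [], []
    for p in stego:
        if p == peak:
            bits.append(0); pixels.append(p)
        elif p == peak + 1:
            bits.append(1); pixels.append(peak)
        elif peak + 1 < p <= zero:
            pixels.append(p - 1)         # undo the shift
        else:
            pixels.append(p)
    return pixels, bits
```

    Capacity equals the count of peak-bin pixels, which is why maximizing the ER is about how the histogram is shaped, the lever the DNA-computing variant targets.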

  3. Modeling Students' Problem Solving Performance in the Computer-Based Mathematics Learning Environment

    ERIC Educational Resources Information Center

    Lee, Young-Jin

    2017-01-01

    Purpose: The purpose of this paper is to develop a quantitative model of problem solving performance of students in the computer-based mathematics learning environment. Design/methodology/approach: Regularized logistic regression was used to create a quantitative model of problem solving performance of students that predicts whether students can…

  4. Classical versus Computer Algebra Methods in Elementary Geometry

    ERIC Educational Resources Information Center

    Pech, Pavel

    2005-01-01

    Computer algebra methods based on results of commutative algebra like Groebner bases of ideals and elimination of variables make it possible to solve complex, elementary and non elementary problems of geometry, which are difficult to solve using a classical approach. Computer algebra methods permit the proof of geometric theorems, automatic…

  5. Genetic network inference as a series of discrimination tasks.

    PubMed

    Kimura, Shuhei; Nakayama, Satoshi; Hatakeyama, Mariko

    2009-04-01

    Genetic network inference methods based on sets of differential equations generally require a great deal of time, as the equations must be solved many times. To reduce the computational cost, researchers have proposed other methods for inferring genetic networks by solving sets of differential equations only a few times, or even without solving them at all. When we try to obtain reasonable network models using these methods, however, we must estimate the time derivatives of the gene expression levels with great precision. In this study, we propose a new method to overcome the drawbacks of inference methods based on sets of differential equations. Our method infers genetic networks by obtaining classifiers capable of predicting the signs of the derivatives of the gene expression levels. For this purpose, we defined a genetic network inference problem as a series of discrimination tasks, then solved the defined series of discrimination tasks with a linear programming machine. Our experimental results demonstrated that the proposed method is capable of correctly inferring genetic networks, and doing so more than 500 times faster than the other inference methods based on sets of differential equations. Next, we applied our method to actual expression data of the bacterial SOS DNA repair system. And finally, we demonstrated that our approach relates to the inference method based on the S-system model. Though our method provides no estimation of the kinetic parameters, it should be useful for researchers interested only in the network structure of a target system. Supplementary data are available at Bioinformatics online.
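
    One of the paper's discrimination tasks, predicting the sign of a gene's derivative from expression levels, can be sketched with a simple linear classifier. We use a perceptron here purely for illustration; the authors use a linear programming machine, and the data below are made up:

```python
def perceptron_signs(X, signs, epochs: int = 100, lr: float = 0.1):
    """Train a linear classifier to predict whether one gene's expression
    is rising (+1) or falling (-1) given the expression levels of all genes
    at the same time point."""
    w = [0.0] * len(X[0])
    b = 0.0
    for _ in range(epochs):
        for x, s in zip(X, signs):
            pred = 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else -1
            if pred != s:  # standard perceptron update on mistakes
                w = [wi + lr * s * xi for wi, xi in zip(w, x)]
                b += lr * s
    return w, b
```

    The learned weights then play the role of the network structure: a strongly nonzero weight from gene j suggests gene j influences the target gene's dynamics, without ever integrating differential equations.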

  6. Interactive Computer Based Assessment Tasks: How Problem-Solving Process Data Can Inform Instruction

    ERIC Educational Resources Information Center

    Zoanetti, Nathan

    2010-01-01

    This article presents key steps in the design and analysis of a computer based problem-solving assessment featuring interactive tasks. The purpose of the assessment is to support targeted instruction for students by diagnosing strengths and weaknesses at different stages of problem-solving. The first focus of this article is the task piloting…

  7. Forensic DNA methylation profiling from evidence material for investigative leads

    PubMed Central

    Lee, Hwan Young; Lee, Soong Deok; Shin, Kyoung-Jin

    2016-01-01

    DNA methylation is emerging as an attractive marker providing investigative leads for solving crimes in forensic genetics. Body-fluid identification that utilizes tissue-specific DNA methylation can contribute to solving crimes by predicting the activity that produced the evidence material. Age estimation based on DNA methylation is expected to reduce the number of potential suspects when the DNA profile from the evidence does not match any known person, including those stored in forensic databases. Moreover, variation in DNA methylation reflects environmental exposures such as cigarette smoking and alcohol consumption, suggesting its possible use as a marker for predicting the lifestyle of a potential suspect. In this review, we describe recent advances in our understanding of DNA methylation variation and the utility of DNA methylation as a forensic marker for advanced investigative leads from evidence materials. [BMB Reports 2016; 49(7): 359-369] PMID:27099236

  8. From Numerical Problem Solving to Model-Based Experimentation Incorporating Computer-Based Tools of Various Scales into the ChE Curriculum

    ERIC Educational Resources Information Center

    Shacham, Mordechai; Cutlip, Michael B.; Brauner, Neima

    2009-01-01

    A continuing challenge to the undergraduate chemical engineering curriculum is the time-effective incorporation and use of computer-based tools throughout the educational program. Computing skills in academia and industry require some proficiency in programming and effective use of software packages for solving 1) single-model, single-algorithm…

  9. Analyzing Log Files to Predict Students' Problem Solving Performance in a Computer-Based Physics Tutor

    ERIC Educational Resources Information Center

    Lee, Young-Jin

    2015-01-01

    This study investigates whether information saved in the log files of a computer-based tutor can be used to predict the problem solving performance of students. The log files of a computer-based physics tutoring environment called Andes Physics Tutor was analyzed to build a logistic regression model that predicted success and failure of students'…

  10. NMR and computational methods applied to the 3- dimensional structure determination of DNA and ligand-DNA complexes in solution

    NASA Astrophysics Data System (ADS)

    Smith, Jarrod Anson

    2D homonuclear 1H NMR methods and restrained molecular dynamics (rMD) calculations have been applied to determining the three-dimensional structures of DNA and minor-groove-binding ligand-DNA complexes in solution. The structure of the DNA decamer sequence d(GCGTTAACGC)2 has been solved both with a distance-based rMD protocol and with an NOE relaxation matrix back-calculation-based protocol in order to probe the relative merits of the different refinement methods. In addition, three minor-groove-binding ligand-DNA complexes have been examined. The solution structure of the oligosaccharide moiety of the antitumor DNA scission agent calicheamicin γ1I has been determined in complex with a decamer duplex containing its high-affinity 5'-TCCT-3' binding sequence. The structure of the complex reinforces the belief that the oligosaccharide moiety is responsible for the sequence-selective minor-groove binding activity of the agent, and critical intermolecular contacts are revealed. The solution structures of both the (+) and (-) enantiomers of the minor-groove-binding DNA alkylating agent duocarmycin SA have been determined in covalent complex with the undecamer DNA duplex d(GACTAATTGTC).d(GACAATTAGTC). The results support the proposal that the alkylation activity of the duocarmycin antitumor antibiotics is catalyzed by a binding-induced conformational change in the ligand which activates the cyclopropyl group for reaction with the DNA. Comparisons between the structures of the two enantiomers covalently bound to the same DNA sequence at the same 5'-AATTA-3' site have provided insight into the binding orientation and site selectivity, as well as the relative rates of reactivity of these two agents.

  11. The Role of the Goal in Solving Hard Computational Problems: Do People Really Optimize?

    ERIC Educational Resources Information Center

    Carruthers, Sarah; Stege, Ulrike; Masson, Michael E. J.

    2018-01-01

    The role that the mental, or internal, representation plays when people are solving hard computational problems has largely been overlooked to date, despite the reality that this internal representation drives problem solving. In this work we investigate how performance on versions of two hard computational problems differs based on what internal…

  12. Enhancing Digital Fluency through a Training Program for Creative Problem Solving Using Computer Programming

    ERIC Educational Resources Information Center

    Kim, SugHee; Chung, KwangSik; Yu, HeonChang

    2013-01-01

    The purpose of this paper is to propose a training program for creative problem solving based on computer programming. The proposed program will encourage students to solve real-life problems through a creative thinking spiral related to cognitive skills with computer programming. With the goal of enhancing digital fluency through this proposed…

  13. High performance transcription factor-DNA docking with GPU computing

    PubMed Central

    2012-01-01

    Background Protein-DNA docking is a very challenging problem in structural bioinformatics and has important implications in a number of applications, such as structure-based prediction of transcription factor binding sites and rational drug design. Protein-DNA docking is very computational demanding due to the high cost of energy calculation and the statistical nature of conformational sampling algorithms. More importantly, experiments show that the docking quality depends on the coverage of the conformational sampling space. It is therefore desirable to accelerate the computation of the docking algorithm, not only to reduce computing time, but also to improve docking quality. Methods In an attempt to accelerate the sampling process and to improve the docking performance, we developed a graphics processing unit (GPU)-based protein-DNA docking algorithm. The algorithm employs a potential-based energy function to describe the binding affinity of a protein-DNA pair, and integrates Monte-Carlo simulation and a simulated annealing method to search through the conformational space. Algorithmic techniques were developed to improve the computation efficiency and scalability on GPU-based high performance computing systems. Results The effectiveness of our approach is tested on a non-redundant set of 75 TF-DNA complexes and a newly developed TF-DNA docking benchmark. We demonstrated that the GPU-based docking algorithm can significantly accelerate the simulation process and thereby improving the chance of finding near-native TF-DNA complex structures. This study also suggests that further improvement in protein-DNA docking research would require efforts from two integral aspects: improvement in computation efficiency and energy function design. Conclusions We present a high performance computing approach for improving the prediction accuracy of protein-DNA docking. 
The GPU-based docking algorithm accelerates the search of the conformational space and thus increases the chance of finding more near-native structures. To the best of our knowledge, this is the first ad hoc effort of applying GPU or GPU clusters to the protein-DNA docking problem. PMID:22759575
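
    The Metropolis-style search the abstract describes (Monte-Carlo moves with simulated-annealing cooling) can be sketched in a few lines; the energy function and perturbation below are toy stand-ins, not the paper's potential-based TF-DNA scoring function or GPU kernels:

```python
import math
import random

def simulated_annealing(energy, perturb, state, t_start=10.0, t_end=0.01,
                        alpha=0.95, steps_per_temp=100, seed=0):
    """Generic Metropolis simulated-annealing search over a conformational space."""
    rng = random.Random(seed)
    best, e = state, energy(state)
    e_best = e
    t = t_start
    while t > t_end:
        for _ in range(steps_per_temp):
            cand = perturb(state, rng)
            e_cand = energy(cand)
            # Metropolis criterion: always accept downhill moves,
            # accept uphill moves with probability exp(-dE / T).
            if e_cand <= e or rng.random() < math.exp(-(e_cand - e) / t):
                state, e = cand, e_cand
                if e < e_best:
                    best, e_best = state, e
        t *= alpha  # geometric cooling schedule
    return best, e_best

# Toy 1-D "energy landscape" standing in for a protein-DNA scoring function.
f = lambda x: (x - 3.0) ** 2
step = lambda x, rng: x + rng.uniform(-0.5, 0.5)
x_min, e_min = simulated_annealing(f, step, state=0.0)
```

    The GPU version described above runs many such chains in parallel; the acceptance rule itself is unchanged.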

  14. Effects of computer-based graphic organizers to solve one-step word problems for middle school students with mild intellectual disability: A preliminary study.

    PubMed

    Sheriff, Kelli A; Boon, Richard T

    2014-08-01

    The purpose of this study was to examine the effects of computer-based graphic organizers, using Kidspiration 3© software, to solve one-step word problems. Participants included three students with mild intellectual disability enrolled in a functional academic skills curriculum in a self-contained classroom. A multiple probe single-subject research design (Horner & Baer, 1978) was used to evaluate the effectiveness of computer-based graphic organizers in solving mathematical one-step word problems. During the baseline phase, the students completed a teacher-generated worksheet that consisted of nine functional word problems in a traditional format using a pencil, paper, and a calculator. In the intervention and maintenance phases, the students were instructed to complete the word problems using a computer-based graphic organizer. Results indicated that all three of the students improved in their ability to solve the one-step word problems using computer-based graphic organizers compared to traditional instructional practices. Limitations of the study and recommendations for future research directions are discussed. Copyright © 2014 Elsevier Ltd. All rights reserved.

  15. Identification of unique repeated patterns, location of mutation in DNA finger printing using artificial intelligence technique.

    PubMed

    Mukunthan, B; Nagaveni, N

    2014-01-01

    In genetic engineering, the conventional techniques and algorithms employed by forensic scientists to identify individuals on the basis of their DNA profiles involve complex computational steps and mathematical formulae, and identifying the location of a mutation in a genomic sequence remains an exigent laboratory task. This novel approach provides the ability to solve problems that have no algorithmic solution, or whose available solutions are too complex to apply. The blend of bioinformatics and neural network techniques results in an efficient DNA pattern analysis algorithm with high prediction accuracy.

  16. Solving a Hamiltonian Path Problem with a bacterial computer

    PubMed Central

    Baumgardner, Jordan; Acker, Karen; Adefuye, Oyinade; Crowley, Samuel Thomas; DeLoache, Will; Dickson, James O; Heard, Lane; Martens, Andrew T; Morton, Nickolaus; Ritter, Michelle; Shoecraft, Amber; Treece, Jessica; Unzicker, Matthew; Valencia, Amanda; Waters, Mike; Campbell, A Malcolm; Heyer, Laurie J; Poet, Jeffrey L; Eckdahl, Todd T

    2009-01-01

    Background The Hamiltonian Path Problem asks whether there is a route in a directed graph from a beginning node to an ending node, visiting each node exactly once. The Hamiltonian Path Problem is NP complete, achieving surprising computational complexity with modest increases in size. This challenge has inspired researchers to broaden the definition of a computer. DNA computers have been developed that solve NP complete problems. Bacterial computers can be programmed by constructing genetic circuits to execute an algorithm that is responsive to the environment and whose result can be observed. Each bacterium can examine a solution to a mathematical problem and billions of them can explore billions of possible solutions. Bacterial computers can be automated, made responsive to selection, and reproduce themselves so that more processing capacity is applied to problems over time. Results We programmed bacteria with a genetic circuit that enables them to evaluate all possible paths in a directed graph in order to find a Hamiltonian path. We encoded a three node directed graph as DNA segments that were autonomously shuffled randomly inside bacteria by a Hin/hixC recombination system we previously adapted from Salmonella typhimurium for use in Escherichia coli. We represented nodes in the graph as linked halves of two different genes encoding red or green fluorescent proteins. Bacterial populations displayed phenotypes that reflected random ordering of edges in the graph. Individual bacterial clones that found a Hamiltonian path reported their success by fluorescing both red and green, resulting in yellow colonies. We used DNA sequencing to verify that the yellow phenotype resulted from genotypes that represented Hamiltonian path solutions, demonstrating that our bacterial computer functioned as expected. Conclusion We successfully designed, constructed, and tested a bacterial computer capable of finding a Hamiltonian path in a three node directed graph. 
This proof-of-concept experiment demonstrates that bacterial computing is a new way to address NP-complete problems using the inherent advantages of genetic systems. The results of our experiments also validate synthetic biology as a valuable approach to biological engineering. We designed and constructed basic parts, devices, and systems using synthetic biology principles of standardization and abstraction. PMID:19630940
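
    A silicon analogue of what the bacterial population computes, enumerating all node orderings of a small directed graph and keeping those joined edge-to-edge, can be sketched as follows; the three-node graph here is hypothetical, since the paper's exact topology is not reproduced in the abstract:

```python
from itertools import permutations

def hamiltonian_paths(nodes, edges):
    """Return every ordering of nodes in which each consecutive pair
    is joined by a directed edge (i.e. every Hamiltonian path)."""
    return [p for p in permutations(nodes)
            if all((a, b) in edges for a, b in zip(p, p[1:]))]

# Hypothetical three-node directed graph.
nodes = ["A", "B", "C"]
edges = {("A", "B"), ("B", "C"), ("A", "C")}
paths = hamiltonian_paths(nodes, edges)
# Only A -> B -> C uses a directed edge at every step.
```

    Each bacterium effectively evaluates one random ordering; a yellow (red plus green) colony corresponds to an ordering this function would accept.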

  17. Use of Computer-Based Case Studies in a Problem-Solving Curriculum.

    ERIC Educational Resources Information Center

    Haworth, Ian S.; And Others

    1997-01-01

    Describes the use of three case studies, on computer, to enhance problem solving and critical thinking among doctoral pharmacy students in a physical chemistry course. Students are expected to use specific computer programs, spreadsheets, electronic mail, molecular graphics, word processing, online literature searching, and other computer-based…

  18. EPA, Notre Dame researchers discuss challenges in adopting DNA-based methods for monitoring invasive species in U.S. water bodies

    EPA Pesticide Factsheets

    DNA-based technology helps people solve problems. It can be used to correctly match organ donors with recipients, identify victims of natural and man-made disasters, and detect bacteria and other organisms that may pollute air, soil, food, or water.

  19. Solving the Curriculum Sequencing Problem with DNA Computing Approach

    ERIC Educational Resources Information Center

    Debbah, Amina; Ben Ali, Yamina Mohamed

    2014-01-01

    In the e-learning systems, a learning path is known as a sequence of learning materials linked to each others to help learners achieving their learning goals. As it is impossible to have the same learning path that suits different learners, the Curriculum Sequencing problem (CS) consists of the generation of a personalized learning path for each…

  20. Vibrational Spectroscopy and Astrobiology

    NASA Technical Reports Server (NTRS)

    Chaban, Galina M.; Kwak, D. (Technical Monitor)

    2001-01-01

    The role of vibrational spectroscopy in solving problems related to astrobiology will be discussed. Vibrational (infrared) spectroscopy is a very sensitive tool for identifying molecules. The theoretical approach used in this work is based on direct computation of anharmonic vibrational frequencies and intensities from electronic structure codes. One application of this computational technique is the possible identification of biological building blocks (amino acids, small peptides, DNA bases) in the interstellar medium (ISM). Identifying small biological molecules in the ISM is very important from the point of view of the origin of life. Hybrid (quantum mechanics/molecular mechanics) theoretical techniques will be discussed that may make it possible to obtain accurate vibrational spectra of biomolecular building blocks and to create a database of spectroscopic signatures that can assist observations of these molecules in space. Another application of the direct computational spectroscopy technique is to help design and analyze experimental observations of the ice surface of one of Jupiter's moons, Europa, which possibly contains hydrated salts. The presence of hydrated salts on the surface can be an indication of a subsurface ocean and the possible existence of life forms inhabiting such an ocean.

  1. Inference of Vohradský's Models of Genetic Networks by Solving Two-Dimensional Function Optimization Problems

    PubMed Central

    Kimura, Shuhei; Sato, Masanao; Okada-Hatakeyama, Mariko

    2013-01-01

    The inference of a genetic network is a problem in which mutual interactions among genes are inferred from time-series of gene expression levels. While a number of models have been proposed to describe genetic networks, this study focuses on a mathematical model proposed by Vohradský. Because of its advantageous features, several researchers have proposed inference methods based on Vohradský's model. When trying to analyze large-scale networks consisting of dozens of genes, however, these methods must solve high-dimensional non-linear function optimization problems. In order to resolve the difficulty of estimating the parameters of Vohradský's model, this study proposes a new method that defines the problem as several two-dimensional function optimization problems. Through numerical experiments on artificial genetic network inference problems, we showed that, although the computation time of the proposed method is not the shortest, the method has the ability to estimate parameters of Vohradský's models more effectively with sufficiently short computation times. This study then applied the proposed method to an actual inference problem of the bacterial SOS DNA repair system, and succeeded in finding several reasonable regulations. PMID:24386175

  2. Computational Approaches to Nucleic Acid Origami.

    PubMed

    Jabbari, Hosna; Aminpour, Maral; Montemagno, Carlo

    2015-10-12

    Recent advances in experimental DNA origami have dramatically expanded the horizon of DNA nanotechnology. Complex 3D suprastructures have been designed and developed using DNA origami with applications in biomaterial science, nanomedicine, nanorobotics, and molecular computation. Ribonucleic acid (RNA) origami has recently been realized as a new approach. Similar to DNA, RNA molecules can be designed to form complex 3D structures through complementary base pairings. RNA origami structures are, however, more compact and more thermodynamically stable due to RNA's non-canonical base pairing and tertiary interactions. Despite these advantages, the development of RNA origami lags behind DNA origami by a large gap. Furthermore, although computational methods have proven to be effective in designing DNA and RNA origami structures and in their evaluation, advances in computational nucleic acid origami are even more limited. In this paper, we review major milestones in experimental and computational DNA and RNA origami and present current challenges in these fields. We believe collaboration between experimental nanotechnologists and computer scientists is critical for advancing these new research paradigms.

  3. Design and Diagnosis Problem Solving with Multifunctional Technical Knowledge Bases

    DTIC Science & Technology

    1992-09-29

    Design problem solving is a complex activity involving a number of subtasks, with a number of alternative methods potentially available for each…

  4. Evaluating Preclinical Medical Students by Using Computer-Based Problem-Solving Examinations.

    ERIC Educational Resources Information Center

    Stevens, Ronald H.; And Others

    1989-01-01

    A study to determine the feasibility of creating and administering computer-based problem-solving examinations for evaluating second-year medical students in immunology and to determine how students would perform on these tests relative to their performances on concurrently administered objective and essay examinations is described. (Author/MLW)

  5. A Survey of Computational Intelligence Techniques in Protein Function Prediction

    PubMed Central

    Tiwari, Arvind Kumar; Srivastava, Rajeev

    2014-01-01

    In recent years, there has been massive growth in knowledge of previously unknown proteins with the advancement of high-throughput microarray technologies. Protein function prediction is the most challenging problem in bioinformatics. In the past, homology-based approaches were used to predict protein function, but they fail when a new protein differs substantially from previously characterized ones. Therefore, to alleviate the problems associated with traditional homology-based approaches, numerous computational intelligence techniques have been proposed in the recent past. This paper presents a state-of-the-art comprehensive review of various computational intelligence techniques for protein function prediction using sequence, structure, protein-protein interaction network, and gene expression data, applied in wide areas such as prediction of DNA and RNA binding sites, subcellular localization, enzyme functions, signal peptides, catalytic residues, nuclear/G-protein coupled receptors, membrane proteins, and pathway analysis from gene expression datasets. This paper also summarizes the results obtained by many researchers who have solved these problems using computational intelligence techniques with appropriate datasets to improve prediction performance. The summary shows that ensemble classifiers and integration of multiple heterogeneous data are useful for protein function prediction. PMID:25574395

  6. Reinforcement learning in computer vision

    NASA Astrophysics Data System (ADS)

    Bernstein, A. V.; Burnaev, E. V.

    2018-04-01

    Nowadays, machine learning has become one of the basic technologies used in solving various computer vision tasks such as feature detection, image segmentation, object recognition and tracking. In many applications, complex systems such as robots are equipped with visual sensors from which they learn the state of the surrounding environment by solving the corresponding computer vision tasks. Solutions of these tasks are used for making decisions about possible future actions. It is not surprising that when solving computer vision tasks we should take into account special aspects of their subsequent application in model-based predictive control. Reinforcement learning is one of the modern machine learning technologies in which learning is carried out through interaction with the environment. In recent years, reinforcement learning has been used both for solving applied tasks such as processing and analysis of visual information, and for solving specific computer vision problems such as filtering, extracting image features, localizing objects in scenes, and many others. The paper briefly describes the reinforcement learning technology and its use for solving computer vision problems.

  7. Solving a four-destination traveling salesman problem using Escherichia coli cells as biocomputers.

    PubMed

    Esau, Michael; Rozema, Mark; Zhang, Tuo Huang; Zeng, Dawson; Chiu, Stephanie; Kwan, Rachel; Moorhouse, Cadence; Murray, Cameron; Tseng, Nien-Tsu; Ridgway, Doug; Sauvageau, Dominic; Ellison, Michael

    2014-12-19

    The Traveling Salesman Problem involves finding the shortest possible route visiting all destinations on a map only once before returning to the point of origin. The present study demonstrates a strategy for solving Traveling Salesman Problems using modified E. coli cells as processors for massively parallel computing. Sequential, combinatorial DNA assembly was used to generate routes, in the form of plasmids made up of marker genes, each representing a path between destinations, and short connecting linkers, each representing a given destination. Upon growth of the population of modified E. coli, phenotypic selection was used to eliminate invalid routes, and statistical analysis was performed to successfully identify the optimal solution. The strategy was successfully employed to solve a four-destination test problem.
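
    The exhaustive route generation and phenotypic selection that the E. coli population performs in parallel can be mimicked serially by brute force; the four-destination distance matrix below is illustrative, not the one used in the study:

```python
from itertools import permutations

def shortest_tour(dist):
    """Exhaustively score every tour that starts and ends at city 0,
    mirroring the exhaustive route generation by the bacterial population."""
    cities = list(range(1, len(dist)))
    best_tour, best_len = None, float("inf")
    for perm in permutations(cities):
        tour = (0, *perm, 0)
        length = sum(dist[a][b] for a, b in zip(tour, tour[1:]))
        if length < best_len:
            best_tour, best_len = tour, length
    return best_tour, best_len

# Hypothetical symmetric distance matrix for four destinations.
dist = [
    [0, 2, 9, 10],
    [2, 0, 6, 4],
    [9, 6, 0, 8],
    [10, 4, 8, 0],
]
tour, length = shortest_tour(dist)
```

    In the biological implementation, each plasmid assembly plays the role of one `tour`, and growth plus statistical analysis replaces the explicit minimum search.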

  8. Problem-Solving Test: Conditional Gene Targeting Using the Cre/loxP Recombination System

    ERIC Educational Resources Information Center

    Szeberényi, József

    2013-01-01

    Terms to be familiar with before you start to solve the test: gene targeting, knock-out mutation, bacteriophage, complementary base-pairing, homologous recombination, deletion, transgenic organisms, promoter, polyadenylation element, transgene, DNA replication, RNA polymerase, Shine-Dalgarno sequence, restriction endonuclease, polymerase chain…

  9. Kinetics of DSB rejoining and formation of simple chromosome exchange aberrations

    NASA Technical Reports Server (NTRS)

    Cucinotta, F. A.; Nikjoo, H.; O'Neill, P.; Goodhead, D. T.

    2000-01-01

    PURPOSE: To investigate the role of kinetics in the processing of DNA double strand breaks (DSB), and the formation of simple chromosome exchange aberrations following X-ray exposures to mammalian cells based on an enzymatic approach. METHODS: Using computer simulations based on a biochemical approach, rate-equations that describe the processing of DSB through the formation of a DNA-enzyme complex were formulated. A second model that allows for competition between two processing pathways was also formulated. The formation of simple exchange aberrations was modelled as misrepair during the recombination of single DSB with undamaged DNA. Non-linear coupled differential equations corresponding to biochemical pathways were solved numerically by fitting to experimental data. RESULTS: When mediated by a DSB repair enzyme complex, the processing of single DSB showed a complex behaviour that gives the appearance of fast and slow components of rejoining. This is due to the time-delay caused by the action time of enzymes in biomolecular reactions. It is shown that the kinetic- and dose-responses of simple chromosome exchange aberrations are well described by a recombination model of DSB interacting with undamaged DNA when aberration formation increases with linear dose-dependence. Competition between two or more recombination processes is shown to lead to the formation of simple exchange aberrations with a dose-dependence similar to that of a linear quadratic model. CONCLUSIONS: Using a minimal number of assumptions, the kinetics and dose response observed experimentally for DSB rejoining and the formation of simple chromosome exchange aberrations are shown to be consistent with kinetic models based on enzymatic reaction approaches. A non-linear dose response for simple exchange aberrations is possible in a model of recombination of DNA containing a DSB with undamaged DNA when two or more pathways compete for DSB repair.
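
    A minimal sketch of the kind of coupled rate equations described, for a generic DSB-plus-repair-enzyme scheme B + E <-> C -> R + E, integrated by forward Euler (the rate constants are illustrative, not the paper's fitted values):

```python
def integrate_dsb_repair(b0=100.0, e0=10.0, k_on=0.01, k_off=0.05, k_cat=0.2,
                         dt=0.01, t_end=200.0):
    """Forward-Euler integration of a DSB repair scheme:
        B + E <-> C -> R + E
    B = unrejoined breaks, C = DNA-enzyme complex, R = rejoined breaks."""
    b, c, r = b0, 0.0, 0.0
    t = 0.0
    while t < t_end:
        e = e0 - c                      # free enzyme (total enzyme is conserved)
        db = -k_on * b * e + k_off * c
        dc = k_on * b * e - (k_off + k_cat) * c
        dr = k_cat * c
        b, c, r = b + db * dt, c + dc * dt, r + dr * dt
        t += dt
    return b, c, r

b, c, r = integrate_dsb_repair()
# Breaks are conserved across the three pools: b + c + r stays at b0.
```

    Because the enzyme pool saturates early and frees up later, the decay of B is not a single exponential, which is the "apparent fast and slow components" behaviour the abstract attributes to complex formation.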

  10. cuRRBS: simple and robust evaluation of enzyme combinations for reduced representation approaches.

    PubMed

    Martin-Herranz, Daniel E; Ribeiro, António J M; Krueger, Felix; Thornton, Janet M; Reik, Wolf; Stubbs, Thomas M

    2017-11-16

    DNA methylation is an important epigenetic modification in many species that is critical for development, and implicated in ageing and many complex diseases, such as cancer. Many cost-effective genome-wide analyses of DNA modifications rely on restriction enzymes capable of digesting genomic DNA at defined sequence motifs. There are hundreds of restriction enzyme families but few are used to date, because no tool is available for the systematic evaluation of restriction enzyme combinations that can enrich for certain sites of interest in a genome. Herein, we present customised Reduced Representation Bisulfite Sequencing (cuRRBS), a novel and easy-to-use computational method that solves this problem. By computing the optimal enzymatic digestions and size selection steps required, cuRRBS generalises the traditional MspI-based Reduced Representation Bisulfite Sequencing (RRBS) protocol to all restriction enzyme combinations. In addition, cuRRBS estimates the fold-reduction in sequencing costs and provides a robustness value for the personalised RRBS protocol, allowing users to tailor the protocol to their experimental needs. Moreover, we show in silico that cuRRBS-defined restriction enzymes consistently out-perform MspI digestion in many biological systems, considering both CpG and CHG contexts. Finally, we have validated the accuracy of cuRRBS predictions for single and double enzyme digestions using two independent experimental datasets. © The Author(s) 2017. Published by Oxford University Press on behalf of Nucleic Acids Research.
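
    The core operations cuRRBS automates, in silico digestion at a recognition motif followed by fragment size selection, can be sketched for the single-enzyme MspI case (C^CGG); this single-strand toy ignores the double-stranded and bisulfite steps of the real protocol:

```python
def digest(seq, site="CCGG", cut_offset=1):
    """Cut a sequence at every occurrence of a restriction site.
    MspI recognises CCGG and cuts after the first base (C^CGG)."""
    fragments, start = [], 0
    pos = seq.find(site)
    while pos != -1:
        cut = pos + cut_offset
        fragments.append(seq[start:cut])
        start = cut
        pos = seq.find(site, pos + 1)
    fragments.append(seq[start:])
    return fragments

def size_select(fragments, lo, hi):
    """Keep only fragments inside the size-selection window (in bp)."""
    return [f for f in fragments if lo <= len(f) <= hi]

# Toy genome with two MspI sites (illustrative only).
genome = "AAACCGG" + "T" * 10 + "CCGGAA"
frags = digest(genome)
selected = size_select(frags, 5, 20)
```

    cuRRBS effectively scores such digests for every enzyme combination and size window against the user's sites of interest, which is what makes the systematic evaluation tractable.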

  11. Curriculum-Based Measurement of Mathematics Competence: From Computation to Concepts and Applications to Real-Life Problem Solving

    ERIC Educational Resources Information Center

    Fuchs, Lynn S.; Fuchs, Douglas; Courey, Susan J.

    2005-01-01

    In this article, the authors explain how curriculum-based measurement (CBM) differs from other forms of classroom-based assessment. The development of CBM is traced from computation to concepts and applications to real-life problem solving, with examples of the assessments and illustrations of research to document technical features and utility…

  12. All-optical computation system for solving differential equations based on optical intensity differentiator.

    PubMed

    Tan, Sisi; Wu, Zhao; Lei, Lei; Hu, Shoujin; Dong, Jianji; Zhang, Xinliang

    2013-03-25

    We propose and experimentally demonstrate an all-optical differentiator-based computation system used for solving constant-coefficient first-order linear ordinary differential equations. It consists of an all-optical intensity differentiator and a wavelength converter, both based on a semiconductor optical amplifier (SOA) and an optical filter (OF). The equation is solved for various values of the constant-coefficient and two considered input waveforms, namely, super-Gaussian and Gaussian signals. An excellent agreement between the numerical simulation and the experimental results is obtained.
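
    The class of equations the system solves, constant-coefficient first-order linear ODEs, has a standard closed-form solution that is useful as a reference when comparing the optical output with simulation (the experiment's specific coefficient values and waveforms are not reproduced here):

```latex
\frac{dy(t)}{dt} + a\,y(t) = x(t), \qquad a \ \text{constant}
\quad\Longrightarrow\quad
y(t) = e^{-a t}\left[\, y(0) + \int_{0}^{t} e^{a\tau}\, x(\tau)\, d\tau \right]
```

    The SOA-based differentiator supplies the dy/dt term optically, so the loop enforces the equation directly on the intensity waveform rather than evaluating the integral numerically.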

  13. Environmental Sensing of Expert Knowledge in a Computational Evolution System for Complex Problem Solving in Human Genetics

    NASA Astrophysics Data System (ADS)

    Greene, Casey S.; Hill, Douglas P.; Moore, Jason H.

    The relationship between interindividual variation in our genomes and variation in our susceptibility to common diseases is expected to be complex with multiple interacting genetic factors. A central goal of human genetics is to identify which DNA sequence variations predict disease risk in human populations. Our success in this endeavour will depend critically on the development and implementation of computational intelligence methods that are able to embrace, rather than ignore, the complexity of the genotype to phenotype relationship. To this end, we have developed a computational evolution system (CES) to discover genetic models of disease susceptibility involving complex relationships between DNA sequence variations. The CES approach is hierarchically organized and is capable of evolving operators of any arbitrary complexity. The ability to evolve operators distinguishes this approach from artificial evolution approaches using fixed operators such as mutation and recombination. Our previous studies have shown that a CES that can utilize expert knowledge about the problem in evolved operators significantly outperforms a CES unable to use this knowledge. This environmental sensing of external sources of biological or statistical knowledge is important when the search space is both rugged and large as in the genetic analysis of complex diseases. We show here that the CES is also capable of evolving operators which exploit one of several sources of expert knowledge to solve the problem. This is important for both the discovery of highly fit genetic models and because the particular source of expert knowledge used by evolved operators may provide additional information about the problem itself. This study brings us a step closer to a CES that can solve complex problems in human genetics in addition to discovering genetic models of disease.

  14. Investigation of a Sybr-Green-Based Method to Validate DNA Sequences for DNA Computing

    DTIC Science & Technology

    2005-05-01

    Pogozelski, Wendy; Priore, Salvatore; Bernard, Matthew

  15. Improving the learning of clinical reasoning through computer-based cognitive representation.

    PubMed

    Wu, Bian; Wang, Minhong; Johnson, Janice M; Grotzer, Tina A

    2014-01-01

    Objective Clinical reasoning is usually taught using a problem-solving approach, which is widely adopted in medical education. However, learning through problem solving is difficult as a result of the contextualization and dynamic aspects of actual problems. Moreover, knowledge acquired from problem-solving practice tends to be inert and fragmented. This study proposed a computer-based cognitive representation approach that externalizes and facilitates the complex processes in learning clinical reasoning. The approach is operationalized in a computer-based cognitive representation tool that involves argument mapping to externalize the problem-solving process and concept mapping to reveal the knowledge constructed from the problems. Methods Twenty-nine Year 3 or higher students from a medical school in east China participated in the study. Participants used the proposed approach implemented in an e-learning system to complete four learning cases in 4 weeks on an individual basis. For each case, students interacted with the problem to capture critical data, generate and justify hypotheses, make a diagnosis, recall relevant knowledge, and update their conceptual understanding of the problem domain. Meanwhile, students used the computer-based cognitive representation tool to articulate and represent the key elements and their interactions in the learning process. Results A significant improvement was found in students' learning products from the beginning to the end of the study, consistent with students' report of close-to-moderate progress in developing problem-solving and knowledge-construction abilities. No significant differences were found between the pretest and posttest scores within the 4-week period. The cognitive representation approach was found to provide more formative assessment. Conclusions The computer-based cognitive representation approach improved the learning of clinical reasoning in both problem solving and knowledge construction.

  16. Improving the learning of clinical reasoning through computer-based cognitive representation

    PubMed Central

    Wu, Bian; Wang, Minhong; Johnson, Janice M.; Grotzer, Tina A.

    2014-01-01

    Objective Clinical reasoning is usually taught using a problem-solving approach, which is widely adopted in medical education. However, learning through problem solving is difficult as a result of the contextualization and dynamic aspects of actual problems. Moreover, knowledge acquired from problem-solving practice tends to be inert and fragmented. This study proposed a computer-based cognitive representation approach that externalizes and facilitates the complex processes in learning clinical reasoning. The approach is operationalized in a computer-based cognitive representation tool that involves argument mapping to externalize the problem-solving process and concept mapping to reveal the knowledge constructed from the problems. Methods Twenty-nine Year 3 or higher students from a medical school in east China participated in the study. Participants used the proposed approach implemented in an e-learning system to complete four learning cases in 4 weeks on an individual basis. For each case, students interacted with the problem to capture critical data, generate and justify hypotheses, make a diagnosis, recall relevant knowledge, and update their conceptual understanding of the problem domain. Meanwhile, students used the computer-based cognitive representation tool to articulate and represent the key elements and their interactions in the learning process. Results A significant improvement was found in students’ learning products from the beginning to the end of the study, consistent with students’ report of close-to-moderate progress in developing problem-solving and knowledge-construction abilities. No significant differences were found between the pretest and posttest scores within the 4-week period. The cognitive representation approach was found to provide more formative assessment. Conclusions The computer-based cognitive representation approach improved the learning of clinical reasoning in both problem solving and knowledge construction. PMID:25518871

  18. Research on Image Encryption Based on DNA Sequence and Chaos Theory

    NASA Astrophysics Data System (ADS)

    Tian Zhang, Tian; Yan, Shan Jun; Gu, Cheng Yan; Ren, Ran; Liao, Kai Xin

    2018-04-01

    Nowadays encryption is a common technique to protect image data from unauthorized access. In recent years, many scientists have proposed various encryption algorithms based on DNA sequences, providing a new idea for the design of image encryption algorithms. Therefore, a new method of image encryption based on DNA computing technology is proposed in this paper, in which the original image is encrypted by DNA coding and 1-D logistic chaotic mapping. First, the algorithm uses two modules as the encryption key: the first module uses a real DNA sequence, and the second is generated by one-dimensional logistic chaotic mapping. Second, the algorithm uses DNA complementary rules to encode the original image, and uses the key and DNA computing technology to transform each pixel value of the original image, thereby encrypting the whole image. Simulation results show that the algorithm has a good encryption effect and good security.
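
    A simplified sketch of the general scheme, a logistic-map keystream combined with a 2-bit-per-base DNA coding of pixel values, is shown below; the coding rule and key construction are common textbook choices, not necessarily the paper's exact modules:

```python
# One common DNA coding rule: 00->A, 01->C, 10->G, 11->T (one of eight valid rules).
ENC = {0b00: "A", 0b01: "C", 0b10: "G", 0b11: "T"}
DEC = {v: k for k, v in ENC.items()}

def dna_encode(byte):
    """Encode one 8-bit pixel value as four DNA bases (2 bits per base)."""
    return "".join(ENC[(byte >> shift) & 0b11] for shift in (6, 4, 2, 0))

def dna_decode(bases):
    value = 0
    for base in bases:
        value = (value << 2) | DEC[base]
    return value

def logistic_keystream(x0, mu, n):
    """1-D logistic map x -> mu*x*(1-x); each iterate yields one key byte."""
    x, keys = x0, []
    for _ in range(n):
        x = mu * x * (1 - x)
        keys.append(int(x * 256) & 0xFF)
    return keys

def encrypt(pixels, x0=0.3567, mu=3.99):
    key = logistic_keystream(x0, mu, len(pixels))
    return [dna_encode(p ^ k) for p, k in zip(pixels, key)]

def decrypt(cipher, x0=0.3567, mu=3.99):
    key = logistic_keystream(x0, mu, len(cipher))
    return [dna_decode(c) ^ k for c, k in zip(cipher, key)]

pixels = [0, 17, 128, 255]
cipher = encrypt(pixels)
```

    Decryption only succeeds with the same (x0, mu) pair, which is what makes the chaotic map's sensitivity to initial conditions usable as a key.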

  19. Computer-Based Graphical Displays for Enhancing Mental Animation and Improving Reasoning in Novice Learning of Probability

    ERIC Educational Resources Information Center

    Kaplan, Danielle E.; Wu, Erin Chia-ling

    2006-01-01

    Our research suggests static and animated graphics can lead to more animated thinking and more correct problem solving in computer-based probability learning. Pilot software modules were developed for graduate online statistics courses and representation research. A study with novice graduate student statisticians compared problem solving in five…

  20. Computer-Based Assessment of Collaborative Problem Solving: Exploring the Feasibility of Human-to-Agent Approach

    ERIC Educational Resources Information Center

    Rosen, Yigal

    2015-01-01

    How can activities in which collaborative skills of an individual are measured be standardized? In order to understand how students perform on collaborative problem solving (CPS) computer-based assessment, it is necessary to examine empirically the multi-faceted performance that may be distributed across collaboration methods. The aim of this…

  1. Analog Processor To Solve Optimization Problems

    NASA Technical Reports Server (NTRS)

    Duong, Tuan A.; Eberhardt, Silvio P.; Thakoor, Anil P.

    1993-01-01

    Proposed analog processor solves "traveling-salesman" problem, considered paradigm of global-optimization problems involving routing or allocation of resources. Includes electronic neural network and auxiliary circuitry based partly on concepts described in "Neural-Network Processor Would Allocate Resources" (NPO-17781) and "Neural Network Solves 'Traveling-Salesman' Problem" (NPO-17807). Processor based on highly parallel computing solves problem in significantly less time.

  2. Simultaneous G-Quadruplex DNA Logic.

    PubMed

    Bader, Antoine; Cockroft, Scott L

    2018-04-03

    A fundamental principle of digital computer operation is Boolean logic, where inputs and outputs are described by binary integer voltages. Similarly, inputs and outputs may be processed on the molecular level as exemplified by synthetic circuits that exploit the programmability of DNA base-pairing. Unlike modern computers, which execute large numbers of logic gates in parallel, most implementations of molecular logic have been limited to single computing tasks, or sensing applications. This work reports three G-quadruplex-based logic gates that operate simultaneously in a single reaction vessel. The gates respond to unique Boolean DNA inputs by undergoing topological conversion from duplex to G-quadruplex states that were resolved using a thioflavin T dye and gel electrophoresis. The modular, addressable, and label-free approach could be incorporated into DNA-based sensors, or used for resolving and debugging parallel processes in DNA computing applications. © 2018 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.

  3. Formaldehyde substitute fixatives: effects on nucleic acid preservation.

    PubMed

    Moelans, Cathy B; Oostenrijk, Daphne; Moons, Michiel J; van Diest, Paul J

    2011-11-01

    In surgical pathology, formalin-fixed paraffin-embedded tissues are increasingly being used as a source of DNA and RNA for molecular assays in addition to histopathological evaluation. However, the commonly used formalin fixative is carcinogenic, and its crosslinking impairs DNA and RNA quality. The suitability of three new presumably less toxic, crosslinking (F-Solv) and non-crosslinking (FineFIX, RCL2) alcohol-based fixatives was tested for routine molecular pathology in comparison with neutral buffered formalin (NBF) as gold standard. Size ladder PCR, epidermal growth factor receptor sequence analysis, microsatellite instability (MSI), chromogenic (CISH), fluorescence in situ hybridisation (FISH) and qPCR were performed. The alcohol-based non-crosslinking fixatives (FineFIX and RCL2) resulted in a higher DNA yield and quality compared with crosslinking fixatives (NBF and F-Solv). Size ladder PCR resulted in a shorter amplicon size (300 bp) for both crosslinking fixatives compared with the non-crosslinking fixatives (400 bp). All four fixatives were directly applicable for MSI and epidermal growth factor receptor sequence analysis. All fixatives except F-Solv showed clear signals in CISH and FISH. RNA yield and quality were superior after non-crosslinking fixation. qPCR resulted in lower Ct values for RCL2 and FineFIX. The alcohol-based non-crosslinking fixatives performed better than crosslinking fixatives with regard to DNA and RNA yield, quality and applicability in molecular diagnostics. Given the higher yield, less starting material may be necessary, thereby increasing the applicability of biopsies for molecular studies.

  4. Comparative Biochemistry and Metabolism

    DTIC Science & Technology

    1978-12-01

    pyrimidines). When interest includes labile pyrimidine derivatives, the DNA is hydrolyzed enzymatically; 5 mg DNA is dissolved in water containing 20 j... Individual labeled pyrimidine nucleosides from animals so treated have been isolated but not yet identified. The DNA is hydrolyzed enzymatically to... hydrolyzed and chromatographically separated into pyrimidine oligonucleotides and free purine bases. At a dose of 60 mg hydrazine/kg body weight (LDO.0O

  5. P-Hint-Hunt: a deep parallelized whole genome DNA methylation detection tool.

    PubMed

    Peng, Shaoliang; Yang, Shunyun; Gao, Ming; Liao, Xiangke; Liu, Jie; Yang, Canqun; Wu, Chengkun; Yu, Wenqiang

    2017-03-14

    A growing number of studies use whole-genome DNA methylation detection, one of the most important parts of epigenetics research, to find significant relationships between DNA methylation and typical diseases such as cancers and diabetes. In many of those studies, mapping bisulfite-treated sequences to the whole genome has been the main method for studying DNA cytosine methylation. However, today's tools suffer from inaccuracy and long running times. In our study, we designed a new DNA methylation prediction tool ("Hint-Hunt") to solve these problems. By combining an optimal complex alignment computation with Smith-Waterman dynamic programming, Hint-Hunt can analyze and predict DNA methylation status. However, when Hint-Hunt predicts methylation status on large-scale datasets, speed and temporal-spatial efficiency remain problematic. To address the cost of Smith-Waterman dynamic programming and the low temporal-spatial efficiency, we further designed a deeply parallelized whole-genome DNA methylation detection tool ("P-Hint-Hunt") for the Tianhe-2 (TH-2) supercomputer. To the best of our knowledge, P-Hint-Hunt is the first parallel DNA methylation detection tool with high speed-up on large-scale datasets, and it runs on both CPUs and Intel Xeon Phi coprocessors. We deploy and evaluate Hint-Hunt and P-Hint-Hunt on the TH-2 supercomputer at different scales. The experimental results show that our tools eliminate the deviation caused by bisulfite treatment in the mapping procedure and that the multi-level parallel program yields a 48-fold speed-up with 64 threads. P-Hint-Hunt achieves deep acceleration on the heterogeneous CPU/Intel Xeon Phi platform, exploiting the advantages of both multi-core CPUs and many-core Phis.
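
    The Smith-Waterman dynamic programming at the core of alignment tools like the one above can be sketched minimally. The scoring values below are illustrative assumptions, not Hint-Hunt's parameters, and this scores only the best local alignment without traceback.

```python
# Minimal Smith-Waterman local alignment score (illustrative scoring values).

def smith_waterman(a, b, match=2, mismatch=-1, gap=-2):
    rows, cols = len(a) + 1, len(b) + 1
    H = [[0] * cols for _ in range(rows)]   # DP matrix, clamped at 0
    best = 0
    for i in range(1, rows):
        for j in range(1, cols):
            diag = H[i-1][j-1] + (match if a[i-1] == b[j-1] else mismatch)
            H[i][j] = max(0, diag, H[i-1][j] + gap, H[i][j-1] + gap)
            best = max(best, H[i][j])
    return best  # score of the best local alignment

print(smith_waterman("ACACACTA", "AGCACACA"))
```

The clamp at 0 is what makes the alignment local: a poorly matching prefix never penalizes a later high-scoring region.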

  6. Problem-based test: replication of mitochondrial DNA during the cell cycle.

    PubMed

    Sétáló, György

    2013-01-01

    Terms to be familiar with before you start to solve the test: cell cycle, generation time, S-phase, cell culture synchronization, isotopic pulse-chase labeling, density labeling, equilibrium density-gradient centrifugation, buoyant density, rate-zonal centrifugation, nucleoside, nucleotide, kinase enzymes, polymerization of nucleic acids, re-replication block, cell fractionation, Svedberg (sedimentation constant, S), nuclear DNA, mitochondrial DNA, heavy and light mitochondrial DNA chains, heteroplasmy, mitochondrial diseases. Copyright © 2013 Wiley Periodicals, Inc.

  7. The Effects of the Computer-Based Instruction on the Achievement and Problem Solving Skills of the Science and Technology Students

    ERIC Educational Resources Information Center

    Serin, Oguz

    2011-01-01

    This study aims to investigate the effects of the computer-based instruction on the achievements and problem solving skills of the science and technology students. This is a study based on the pre-test/post-test control group design. The participants of the study consist of 52 students; 26 in the experimental group, 26 in the control group. The…

  8. Biomolecular computers with multiple restriction enzymes.

    PubMed

    Sakowski, Sebastian; Krasinski, Tadeusz; Waldmajer, Jacek; Sarnik, Joanna; Blasiak, Janusz; Poplawski, Tomasz

    2017-01-01

    The development of conventional, silicon-based computers has several limitations, including some related to the Heisenberg uncertainty principle and the von Neumann "bottleneck". Biomolecular computers based on DNA and proteins are largely free of these disadvantages and, along with quantum computers, are reasonable alternatives to their conventional counterparts in some applications. The idea of a DNA computer proposed by Ehud Shapiro's group at the Weizmann Institute of Science was developed using one restriction enzyme as hardware and DNA fragments (the transition molecules) as software and input/output signals. This computer represented a two-state two-symbol finite automaton that was subsequently extended by using two restriction enzymes. In this paper, we propose the idea of a multistate biomolecular computer with multiple commercially available restriction enzymes as hardware. Additionally, an algorithmic method for the construction of transition molecules in the DNA computer based on the use of multiple restriction enzymes is presented. We use this method to construct multistate, biomolecular, nondeterministic finite automata with four commercially available restriction enzymes as hardware. We also describe an experimental application of this theoretical model to a biomolecular finite automaton made of four endonucleases.
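
    The two-state two-symbol finite automaton abstraction that the Shapiro-style DNA computer realizes in molecules can be sketched in software. The example automaton below (accepting strings with an even number of 'b' symbols) is an illustrative choice, not the specific automaton from the paper.

```python
# Software sketch of a two-state two-symbol finite automaton, the abstraction
# realized molecularly by restriction-enzyme DNA computers. The even-'b'
# automaton here is an illustrative example.

TRANSITIONS = {            # (state, symbol) -> next state
    ("S0", "a"): "S0",
    ("S0", "b"): "S1",
    ("S1", "a"): "S1",
    ("S1", "b"): "S0",
}

def run(word, start="S0", accepting={"S0"}):
    state = start
    for symbol in word:
        # each step corresponds to one enzymatic cleavage of the input molecule
        state = TRANSITIONS[(state, symbol)]
    return state in accepting

assert run("abba")      # two 'b's -> accepted
assert not run("ab")    # one 'b'  -> rejected
```

In the molecular implementation, the transition table is encoded by the transition molecules and the state/symbol pair by the sticky end exposed after each cleavage.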

  9. Mathematics Word Problem Solving: An Investigation into Schema-Based Instruction in a Computer-Mediated Setting and a Teacher-Mediated Setting with Mathematically Low-Performing Students

    ERIC Educational Resources Information Center

    Leh, Jayne

    2011-01-01

    Substantial evidence indicates that teacher-delivered schema-based instruction (SBI) facilitates significant increases in mathematics word problem solving (WPS) skills for diverse students; however research is unclear whether technology affordances facilitate superior gains in computer-mediated (CM) instruction in mathematics WPS when compared to…

  10. The Effects of Varied Visual Organizational Strategies within Computer-Based Instruction on Factual, Conceptual and Problem Solving Learning.

    ERIC Educational Resources Information Center

    Haag, Brenda Bannan; Grabowski, Barbara L.

    The purpose of this exploratory study was to examine the effectiveness of learner manipulation of visuals with and without organizing cues in computer-based instruction on adults' factual, conceptual, and problem-solving learning. An instructional unit involving the physiology and the anatomy of the heart was used. A post-test only control group…

  11. Examining Information Problem-Solving, Knowledge, and Application Gains within Two Instructional Methods: Problem-Based and Computer-Mediated Participatory Simulation

    ERIC Educational Resources Information Center

    Newell, Terrance S.

    2008-01-01

    This study compared the effectiveness of two instructional methods--problem-based instruction within a face-to-face context and computer-mediated participatory simulation--in increasing students' content knowledge and application gains in the area of information problem-solving. The instructional methods were implemented over a four-week period. A…

  12. Computational method and system for modeling, analyzing, and optimizing DNA amplification and synthesis

    DOEpatents

    Vandersall, Jennifer A.; Gardner, Shea N.; Clague, David S.

    2010-05-04

    A computational method and computer-based system of modeling DNA synthesis for the design and interpretation of PCR amplification, parallel DNA synthesis, and microarray chip analysis. The method and system include modules that address the bioinformatics, kinetics, and thermodynamics of DNA amplification and synthesis. Specifically, the steps of DNA selection, as well as the kinetics and thermodynamics of DNA hybridization and extensions, are addressed, which enable the optimization of the processing and the prediction of the products as a function of DNA sequence, mixing protocol, time, temperature and concentration of species.
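
    The hybridization thermodynamics such a system must model can be illustrated at its very simplest by the Wallace rule of thumb for short oligonucleotides, Tm ≈ 2·(A+T) + 4·(G+C) °C. This is only a toy stand-in; real PCR-design systems like the one described use detailed nearest-neighbor thermodynamic models.

```python
# Toy melting-temperature estimate via the Wallace rule (short oligos only).
# Illustrative stand-in for the nearest-neighbor models real tools use.

def wallace_tm(primer):
    at = primer.count("A") + primer.count("T")
    gc = primer.count("G") + primer.count("C")
    return 2 * at + 4 * gc   # degrees Celsius, rule of thumb

print(wallace_tm("ATGCATGCATGC"))  # 6 AT + 6 GC -> prints 36
```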

  13. Problem-Solving Test: Expression Cloning of the Erythropoietin Receptor

    ERIC Educational Resources Information Center

    Szeberenyi, Jozsef

    2008-01-01

    Terms to be familiar with before you start to solve the test: cytokines, cytokine receptors, cDNA library, cDNA synthesis, poly(A)[superscript +] RNA, primer, template, reverse transcriptase, restriction endonucleases, cohesive ends, expression vector, promoter, Shine-Dalgarno sequence, poly(A) signal, DNA helicase, DNA ligase, topoisomerases,…

  14. DENA: A Configurable Microarchitecture and Design Flow for Biomedical DNA-Based Logic Design.

    PubMed

    Beiki, Zohre; Jahanian, Ali

    2017-10-01

    DNA is known as the building block for storing life's code and transferring genetic features through the generations. However, DNA strands can also be used for a new type of computation that opens fascinating horizons in computational medicine. Significant contributions have addressed the design of DNA-based logic gates for medical and computational applications, but serious challenges remain in designing medium- and large-scale DNA circuits. In this paper, a new microarchitecture and corresponding design flow are proposed to facilitate the design of multistage, large-scale DNA logic systems. The feasibility and efficiency of the proposed microarchitecture are evaluated by implementing a full adder, and its cascadability is then demonstrated by implementing a multistage 8-bit adder. Simulation results highlight the advantages of the proposed design style and microarchitecture in terms of the scalability, implementation cost, and signal integrity of the DNA-based logic system compared with traditional approaches.
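
    The cascadability the paper demonstrates, a full adder chained into an 8-bit adder, can be sketched at the Boolean level. This models the logic only; the paper implements the gates in DNA reactions.

```python
# Boolean sketch of a full adder and an 8-bit ripple-carry adder built by
# cascading eight of them (logic-level model of the paper's demonstration).

def full_adder(a, b, cin):
    s = a ^ b ^ cin
    cout = (a & b) | (cin & (a ^ b))
    return s, cout

def ripple_add_8bit(x, y):
    carry, result = 0, 0
    for i in range(8):   # stage i consumes stage i-1's carry-out
        s, carry = full_adder((x >> i) & 1, (y >> i) & 1, carry)
        result |= s << i
    return result, carry

assert ripple_add_8bit(100, 55) == (155, 0)
assert ripple_add_8bit(200, 100) == (44, 1)   # 300 mod 256, with carry-out
```

Each stage only needs its neighbor's carry signal, which is exactly the property that makes the multistage DNA design scalable.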

  16. Weighted Watson-Crick automata

    NASA Astrophysics Data System (ADS)

    Tamrin, Mohd Izzuddin Mohd; Turaev, Sherzod; Sembok, Tengku Mohd Tengku

    2014-07-01

    There is a tremendous body of work in biotechnology, especially in the area of DNA molecules. The computing community is attempting to develop smaller computing devices through computational models based on operations performed on DNA molecules. A Watson-Crick automaton, a theoretical model for DNA-based computation, has two reading heads and works on double-stranded sequences whose strands are related by a complementarity relation similar to the Watson-Crick complementarity of DNA nucleotides. Over time, several variants of Watson-Crick automata have been introduced and investigated. However, they cannot serve as suitable DNA-based computational models for molecular stochastic and fuzzy processes, which arise in important practical problems such as molecular parsing, gene disease detection, and food authentication. In this paper we define new variants of Watson-Crick automata, called weighted Watson-Crick automata, as theoretical models for molecular stochastic and fuzzy processes. We define weighted Watson-Crick automata by adapting weight-restriction mechanisms associated with formal grammars and automata. We also study the generative capacity of weighted Watson-Crick automata, including probabilistic and fuzzy variants, and show that the weighted variants increase generative power.
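
    The complementarity relation underlying Watson-Crick automata can be made concrete in a few lines: the two strands of a well-formed input are related base-by-base as reversed complements. This checks only the input's well-formedness, not the automaton's runs.

```python
# The Watson-Crick complementarity relation on double-stranded inputs,
# sketched in code (input well-formedness check only).

COMPLEMENT = {"A": "T", "T": "A", "C": "G", "G": "C"}

def reverse_complement(strand):
    return "".join(COMPLEMENT[base] for base in reversed(strand))

def is_wk_double_strand(upper, lower):
    """True when the two strands can pair in the Watson-Crick sense."""
    return lower == reverse_complement(upper)

assert is_wk_double_strand("ACGTT", "AACGT")
assert not is_wk_double_strand("AAAA", "AAAA")
```

A Watson-Crick automaton's two heads read the upper and lower strands independently, which is what gives it more power than a one-head finite automaton.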

  17. Regressive Imagery in Creative Problem-Solving: Comparing Verbal Protocols of Expert and Novice Visual Artists and Computer Programmers

    ERIC Educational Resources Information Center

    Kozbelt, Aaron; Dexter, Scott; Dolese, Melissa; Meredith, Daniel; Ostrofsky, Justin

    2015-01-01

    We applied computer-based text analyses of regressive imagery to verbal protocols of individuals engaged in creative problem-solving in two domains: visual art (23 experts, 23 novices) and computer programming (14 experts, 14 novices). Percentages of words involving primary process and secondary process thought, plus emotion-related words, were…

  18. An Automated Pipeline for Engineering Many-Enzyme Pathways: Computational Sequence Design, Pathway Expression-Flux Mapping, and Scalable Pathway Optimization.

    PubMed

    Halper, Sean M; Cetnar, Daniel P; Salis, Howard M

    2018-01-01

    Engineering many-enzyme metabolic pathways suffers from the design curse of dimensionality. There are an astronomical number of synonymous DNA sequence choices, though relatively few will express an evolutionary robust, maximally productive pathway without metabolic bottlenecks. To solve this challenge, we have developed an integrated, automated computational-experimental pipeline that identifies a pathway's optimal DNA sequence without high-throughput screening or many cycles of design-build-test. The first step applies our Operon Calculator algorithm to design a host-specific evolutionary robust bacterial operon sequence with maximally tunable enzyme expression levels. The second step applies our RBS Library Calculator algorithm to systematically vary enzyme expression levels with the smallest-sized library. After characterizing a small number of constructed pathway variants, measurements are supplied to our Pathway Map Calculator algorithm, which then parameterizes a kinetic metabolic model that ultimately predicts the pathway's optimal enzyme expression levels and DNA sequences. Altogether, our algorithms provide the ability to efficiently map the pathway's sequence-expression-activity space and predict DNA sequences with desired metabolic fluxes. Here, we provide a step-by-step guide to applying the Pathway Optimization Pipeline on a desired multi-enzyme pathway in a bacterial host.

  19. Antibacterial Drug Leads: DNA and Enzyme Multitargeting

    DOE PAGES

    Zhu, Wei; Wang, Yang; Li, Kai; ...

    2015-01-09

    Here, we report the results of an investigation of the activity of a series of amidine and bisamidine compounds against Staphylococcus aureus and Escherichia coli. The most active compounds bound to an AT-rich DNA dodecamer, (CGCGAATTCGCG)2, and using DSC were found to increase the melting transition by up to 24 °C. Several compounds also inhibited undecaprenyl diphosphate synthase (UPPS) with IC50 values of 100–500 nM, and we found good correlations (R2 = 0.89, S. aureus; R2 = 0.79, E. coli) between experimental and predicted cell growth inhibition by using the DNA ΔTm and UPPS IC50 experimental results together with one computed descriptor. Finally, we also solved the structures of three bisamidines binding to DNA as well as three UPPS structures. Overall, the results are of general interest in the context of the development of resistance-resistant antibiotics that involve multitargeting.

  20. Application of virtual screening and molecular dynamics for the analysis of selectivity of inhibitors of HU proteins targeted to the DNA-recognition site

    NASA Astrophysics Data System (ADS)

    Talyzina, A. A.; Agapova, Yu. K.; Podshivalov, D. D.; Timofeev, V. I.; Sidorov-Biryukov, D. D.; Rakitina, T. V.

    2017-11-01

    DNA-Binding HU proteins are essential for the maintenance of genomic DNA supercoiling and compaction in prokaryotic cells and are promising pharmacological targets for the design of new antibacterial agents. The virtual screening for low-molecular-weight compounds capable of specifically interacting with the DNA-recognition loop of the HU protein from the mycoplasma Spiroplasma melliferum was performed. The ability of the initially selected ligands to form stable complexes with the protein target was assessed by molecular dynamics simulation. One compound, which forms an unstable complex, was eliminated by means of a combination of computational methods, resulting in a decrease in the number of compounds that will pass to the experimental test phase. This approach can be used to solve a wide range of problems related to the search for and validation of low-molecular-weight inhibitors specific for a particular protein target.

  1. A Volunteer Computing Project for Solving Geoacoustic Inversion Problems

    NASA Astrophysics Data System (ADS)

    Zaikin, Oleg; Petrov, Pavel; Posypkin, Mikhail; Bulavintsev, Vadim; Kurochkin, Ilya

    2017-12-01

    A volunteer computing project aimed at solving computationally hard inverse problems in underwater acoustics is described. This project was used to study the possibilities of sound speed profile reconstruction in a shallow-water waveguide using a dispersion-based geoacoustic inversion scheme. The computational capabilities provided by the project allowed us to investigate the accuracy of the inversion for different mesh sizes of the sound speed profile discretization grid. This problem is well suited to volunteer computing because it can be easily decomposed into independent, simpler subproblems.

  2. Implementing and Assessing Computational Modeling in Introductory Mechanics

    ERIC Educational Resources Information Center

    Caballero, Marcos D.; Kohlmyer, Matthew A.; Schatz, Michael F.

    2012-01-01

    Students taking introductory physics are rarely exposed to computational modeling. In a one-semester large lecture introductory calculus-based mechanics course at Georgia Tech, students learned to solve physics problems using the VPython programming environment. During the term, 1357 students in this course solved a suite of 14 computational…

  3. Problem-Solving in the Pre-Clinical Curriculum: The Uses of Computer Simulations.

    ERIC Educational Resources Information Center

    Michael, Joel A.; Rovick, Allen A.

    1986-01-01

    Promotes the use of computer-based simulations in the pre-clinical medical curriculum as a means of providing students with opportunities for problem solving. Describes simple simulations of skeletal muscle loads, complex simulations of major organ systems and comprehensive simulation models of the entire human body. (TW)

  5. Nanofluidic Device with Embedded Nanopore

    NASA Astrophysics Data System (ADS)

    Zhang, Yuning; Reisner, Walter

    2014-03-01

    Nanofluidic devices are robust tools for biomolecular sensing and single-DNA manipulation. Nanopore-based DNA sensing has attractive features that make it a leading candidate as a single-molecule DNA sequencing technology. Nanochannel-based extension of DNA, combined with enzymatic or denaturation-based barcoding schemes, is already a powerful approach for genome analysis. We believe that there is revolutionary potential in devices that combine nanochannels with nanopore detectors. In particular, due to the fast translocation of a DNA molecule through a standard nanopore configuration, there is an unfavorable trade-off between signal and sequence resolution. With a combined nanochannel-nanopore device, based on embedding a nanopore inside a nanochannel, we can in principle gain independent control over both DNA translocation speed and sensing signal, solving the key drawback of the standard nanopore configuration. We demonstrate that we can detect, using fluorescence microscopy, successful translocation of DNA from the nanochannel out through the nanopore, a possible method to 'select' a given barcode for further analysis. We also show that in equilibrium DNA will not escape through an embedded sub-persistence-length nanopore until a certain voltage bias is applied.

  6. Analog Computation by DNA Strand Displacement Circuits.

    PubMed

    Song, Tianqi; Garg, Sudhanshu; Mokhtar, Reem; Bui, Hieu; Reif, John

    2016-08-19

    DNA circuits have been widely used to develop biological computing devices because of their high programmability and versatility. Here, we propose an architecture for the systematic construction of DNA circuits for analog computation based on DNA strand displacement. The elementary gates in our architecture include addition, subtraction, and multiplication gates. The input and output of these gates are analog, which means that they are directly represented by the concentrations of the input and output DNA strands, respectively, without requiring a threshold for converting to Boolean signals. We provide detailed domain designs and kinetic simulations of the gates to demonstrate their expected performance. On the basis of these gates, we describe how DNA circuits to compute polynomial functions of inputs can be built. Using Taylor Series and Newton Iteration methods, functions beyond the scope of polynomials can also be computed by DNA circuits built upon our architecture.
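
    What the three elementary analog gates (addition, subtraction, multiplication) can compose is easy to illustrate in ordinary arithmetic rather than strand concentrations: a polynomial via Horner's scheme, and a function beyond polynomials, the reciprocal, via Newton iteration x_{n+1} = x_n·(2 − a·x_n), which uses only subtraction and multiplication. The numbers below are illustrative stand-ins for concentrations.

```python
# Sketch of computations composable from add/subtract/multiply gates alone,
# mirroring the analog DNA strand displacement architecture in plain arithmetic.

def horner(coeffs, x):
    """Evaluate c0 + c1*x + c2*x^2 + ... using only add and multiply steps."""
    acc = 0.0
    for c in reversed(coeffs):
        acc = acc * x + c
    return acc

def reciprocal(a, x0=0.1, steps=20):
    """Newton iteration for 1/a: each step is a subtract gate feeding a
    multiply gate, so no division is ever needed."""
    x = x0
    for _ in range(steps):
        x = x * (2.0 - a * x)
    return x

assert abs(horner([1, 2, 3], 2.0) - 17.0) < 1e-9   # 1 + 2*2 + 3*4
assert abs(reciprocal(4.0) - 0.25) < 1e-9
```

This is the same pattern the paper exploits: polynomials directly, and Taylor-series or Newton-iteration schemes for functions outside the polynomial scope.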

  7. Reconstructing evolutionary trees in parallel for massive sequences.

    PubMed

    Zou, Quan; Wan, Shixiang; Zeng, Xiangxiang; Ma, Zhanshan Sam

    2017-12-14

    Building evolutionary trees for massive numbers of unaligned DNA sequences is crucial but challenging, and reconstructing evolutionary trees for ultra-large sequence sets is hard. Massive multiple sequence alignment is also challenging and time/space-consuming. Hadoop and Spark, developed recently, shed new light on classical computational biology problems. In this paper, we solve multiple sequence alignment and evolutionary reconstruction in parallel. HPTree, developed in this paper, can process big DNA sequence files quickly. It works well on files larger than 1 GB and achieves better performance than other evolutionary reconstruction tools. Users can run HPTree on computer clusters or cloud platforms (e.g., Amazon Cloud) for tree reconstruction. HPTree can support population evolution research and metagenomics analysis. We employ the Hadoop and Spark platforms and design an evolutionary-tree reconstruction software tool for massive unaligned DNA sequences. Clustering and multiple sequence alignment are done in parallel, and the neighbour-joining model is employed for evolutionary tree building. We release our software together with source code via http://lab.malab.cn/soft/HPtree/ .
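
    The neighbour-joining model mentioned above repeatedly picks the pair of taxa minimizing the criterion Q(i, j) = (n − 2)·d(i, j) − Σk d(i, k) − Σk d(j, k) and joins them. The sketch below shows only that pair-selection step on a toy distance matrix; it is not HPTree's parallel implementation.

```python
# Minimal neighbour-joining pair selection on a toy distance matrix
# (illustrative sketch of the tree-building criterion, not HPTree's code).

def nj_pick_pair(D):
    """Return the pair (i, j) minimizing the NJ criterion Q."""
    n = len(D)
    totals = [sum(row) for row in D]            # row sums of the distance matrix
    best, pair = float("inf"), None
    for i in range(n):
        for j in range(i + 1, n):
            q = (n - 2) * D[i][j] - totals[i] - totals[j]
            if q < best:
                best, pair = q, (i, j)
    return pair

# Classic 5-taxon example: taxa 0 and 1 are joined first.
D = [[0, 5, 9, 9, 8],
     [5, 0, 10, 10, 9],
     [9, 10, 0, 8, 7],
     [9, 10, 8, 0, 3],
     [8, 9, 7, 3, 0]]
print(nj_pick_pair(D))  # -> (0, 1)
```

After each join, the matrix is collapsed and the step repeats, which is the loop a parallel implementation distributes over workers.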

  8. Effect of Worked Examples on Mental Model Progression in a Computer-Based Simulation Learning Environment

    ERIC Educational Resources Information Center

    Darabi, Aubteen; Nelson, David W.; Meeker, Richard; Liang, Xinya; Boulware, Wilma

    2010-01-01

    In a diagnostic problem solving operation of a computer-simulated chemical plant, chemical engineering students were randomly assigned to two groups: one studying product-oriented worked examples, the other practicing conventional problem solving. Effects of these instructional strategies on the progression of learners' mental models were examined…

  9. Application of forensic DNA testing in the legal system.

    PubMed

    Primorac, D; Schanfield, M S

    2000-03-01

    DNA technology has taken an irreplaceable position in the field of the forensic sciences. Since 1985, when Peter Gill and Alex Jeffreys first applied DNA technology to forensic problems, to the present, more than 50,000 cases worldwide have been solved through the use of DNA based technology. Although the development of DNA typing in forensic science has been extremely rapid, today we are witnessing a new era of DNA technology including automation and miniaturization. In forensic science, DNA analysis has become "the new form of scientific evidence" and has come under public scrutiny and the demand to show competence. More and more courts admit the DNA based evidence. We believe that in the near future this technology will be generally accepted in the legal system. There are two main applications of DNA analysis in forensic medicine: criminal investigation and paternity testing. In this article we present background information on DNA, human genetics, and the application of DNA analysis to legal problems, as well as the commonly applied respective mathematics.

  10. Molecular docking studies of curcumin natural derivatives with DNA topoisomerase I and II-DNA complexes.

    PubMed

    Kumar, Anil; Bora, Utpal

    2014-12-01

    DNA topoisomerase I (topo I) and II (topo II) are essential enzymes that solve the topological problems of DNA by allowing DNA strands or double helices to pass through each other during cellular processes such as replication, transcription, recombination, and chromatin remodeling. Their critical roles make topoisomerases an attractive drug target against cancer. The present molecular docking study provides insights into the inhibition of topo I and II by curcumin natural derivatives. The binding modes suggested that curcumin natural derivatives docked at the site of DNA cleavage parallel to the axis of DNA base pairing. Cyclocurcumin and curcumin sulphate were predicted to be the most potent inhibitors amongst all the curcumin natural derivatives docked. The binding modes of cyclocurcumin and curcumin sulphate were similar to known inhibitors of topo I and II. Residues like Arg364, Asn722 and base A113 (when docked to topo I-DNA complex) and residues Asp479, Gln778 and base T9 (when docked to topo II-DNA complex) seem to play important role in the binding of curcumin natural derivatives at the site of DNA cleavage.

  11. Superimposed Code Theoretic Analysis of DNA Codes and DNA Computing

    DTIC Science & Technology

    2008-01-01

    AFRL-RI-RS-TR-2007-288, Final Technical Report, January 2008. Indexed excerpts: "…complements of one another and the DNA duplex formed is a Watson-Crick (WC) duplex. However, there are many instances when the formation of non-WC…"; "…that the user's requirements for probe selection are met based on the Watson-Crick probe locality within a target. The second type, called…"

  12. Encrypted Objects and Decryption Processes: Problem-Solving with Functions in a Learning Environment Based on Cryptography

    ERIC Educational Resources Information Center

    White, Tobin

    2009-01-01

    This paper introduces an applied problem-solving task, set in the context of cryptography and embedded in a network of computer-based tools. This designed learning environment engaged students in a series of collaborative problem-solving activities intended to introduce the topic of functions through a set of linked representations. In a…

  13. Introduction to bioinformatics.

    PubMed

    Can, Tolga

    2014-01-01

    Bioinformatics is an interdisciplinary field mainly involving molecular biology and genetics, computer science, mathematics, and statistics. Data-intensive, large-scale biological problems are addressed from a computational point of view. The most common problems are modeling biological processes at the molecular level and making inferences from collected data. A bioinformatics solution usually involves the following steps: collect statistics from biological data; build a computational model; solve a computational modeling problem; test and evaluate a computational algorithm. This chapter gives a brief introduction to bioinformatics by first providing an introduction to biological terminology and then discussing some classical bioinformatics problems organized by the types of data sources. Sequence analysis is the analysis of DNA and protein sequences for clues regarding function and includes subproblems such as identification of homologs, multiple sequence alignment, searching sequence patterns, and evolutionary analyses. Protein structures are three-dimensional data and the associated problems are structure prediction (secondary and tertiary), analysis of protein structures for clues regarding function, and structural alignment. Gene expression data is usually represented as matrices, and analysis of microarray data mostly involves statistical analysis, classification, and clustering approaches. Biological networks such as gene regulatory networks, metabolic pathways, and protein-protein interaction networks are usually modeled as graphs, and graph-theoretic approaches are used to solve associated problems such as construction and analysis of large-scale networks.
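
    As a tiny illustration of the first step listed above, collecting statistics from biological data, one can count overlapping k-mers in a DNA sequence. This is a minimal sketch; the function name kmer_counts is ours, not the chapter's:

```python
from collections import Counter

def kmer_counts(seq, k=3):
    """Count overlapping k-mers in a DNA sequence -- a basic
    'collect statistics' step in a bioinformatics pipeline."""
    return Counter(seq[i:i + k] for i in range(len(seq) - k + 1))

counts = kmer_counts("ATGATGCAT", k=3)
print(counts["ATG"])  # the 3-mer ATG occurs twice -> prints 2
```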

  14. Comparison of the Effects of Computer-Based Practice and Conceptual Understanding Interventions on Mathematics Fact Retention and Generalization

    ERIC Educational Resources Information Center

    Kanive, Rebecca; Nelson, Peter M.; Burns, Matthew K.; Ysseldyke, James

    2014-01-01

    The authors' purpose was to determine the effects of computer-based practice and conceptual interventions on computational fluency and word-problem solving of fourth- and fifth-grade students with mathematics difficulties. A randomized pretest-posttest control group design found that students assigned to the computer-based practice intervention…

  15. Problem-Solving Test: Real-Time Polymerase Chain Reaction

    ERIC Educational Resources Information Center

    Szeberenyi, Jozsef

    2009-01-01

    Terms to be familiar with before you start to solve the test: polymerase chain reaction, DNA amplification, electrophoresis, breast cancer, "HER2" gene, genomic DNA, "in vitro" DNA synthesis, template, primer, Taq polymerase, 5′→3′ elongation activity, 5′→3′ exonuclease activity, deoxyribonucleoside…

  16. Reasoning by analogy as an aid to heuristic theorem proving.

    NASA Technical Reports Server (NTRS)

    Kling, R. E.

    1972-01-01

    When heuristic problem-solving programs are faced with large data bases that contain numbers of facts far in excess of those needed to solve any particular problem, their performance rapidly deteriorates. In this paper, the correspondence between a new unsolved problem and a previously solved analogous problem is computed and invoked to tailor large data bases to manageable sizes. This paper outlines the design of an algorithm for generating and exploiting analogies between theorems posed to a resolution-logic system. These algorithms are believed to be the first computationally feasible development of reasoning by analogy to be applied to heuristic theorem proving.

  17. Computer-Aided Drug Discovery: Molecular Docking of Diminazene Ligands to DNA Minor Groove

    ERIC Educational Resources Information Center

    Kholod, Yana; Hoag, Erin; Muratore, Katlynn; Kosenkov, Dmytro

    2018-01-01

    The reported project-based laboratory unit introduces upper-division undergraduate students to the basics of computer-aided drug discovery as a part of a computational chemistry laboratory course. The students learn to perform model binding of organic molecules (ligands) to the DNA minor groove with computer-aided drug discovery (CADD) tools. The…

  18. Trusted computing strengthens cloud authentication.

    PubMed

    Ghazizadeh, Eghbal; Zamani, Mazdak; Ab Manan, Jamalul-lail; Alizadeh, Mojtaba

    2014-01-01

    Cloud computing is a new generation of technology designed to provide for commercial necessities, solve IT management issues, and run the appropriate applications. Another entry on the list of cloud functions that has traditionally been handled internally is Identity Access Management (IAM). Companies encounter IAM security challenges as they adopt more technologies. Trust, multi-tenancy, and trusted computing based on a Trusted Platform Module (TPM) are promising technologies for solving the trust and security concerns in the cloud identity environment. Single sign-on (SSO) and OpenID have been released to solve security and privacy problems for cloud identity. This paper proposes the use of trusted computing, Federated Identity Management, and OpenID Web SSO to solve identity theft in the cloud. The proposed model has also been simulated in the .NET environment. Security analysis, simulation, and the BLP confidentiality model are the three ways used to evaluate and analyze the proposed model.

  19. Trusted Computing Strengthens Cloud Authentication

    PubMed Central

    2014-01-01

    Cloud computing is a new generation of technology designed to provide for commercial necessities, solve IT management issues, and run the appropriate applications. Another entry on the list of cloud functions that has traditionally been handled internally is Identity Access Management (IAM). Companies encounter IAM security challenges as they adopt more technologies. Trust, multi-tenancy, and trusted computing based on a Trusted Platform Module (TPM) are promising technologies for solving the trust and security concerns in the cloud identity environment. Single sign-on (SSO) and OpenID have been released to solve security and privacy problems for cloud identity. This paper proposes the use of trusted computing, Federated Identity Management, and OpenID Web SSO to solve identity theft in the cloud. The proposed model has also been simulated in the .NET environment. Security analysis, simulation, and the BLP confidentiality model are the three ways used to evaluate and analyze the proposed model. PMID:24701149

  20. MENDEL: An Intelligent Computer Tutoring System for Genetics Problem-Solving, Conjecturing, and Understanding.

    ERIC Educational Resources Information Center

    Streibel, Michael; And Others

    1987-01-01

    Describes an advice-giving computer system being developed for genetics education called MENDEL that is based on research in learning, genetics problem solving, and expert systems. The value of MENDEL as a design tool and the tutorial function are stressed. Hypothesis testing, graphics, and experiential learning are also discussed. (Author/LRW)

  1. Disentangling DNA molecules

    NASA Astrophysics Data System (ADS)

    Vologodskii, Alexander

    2016-09-01

    The widespread circular form of DNA molecules inside cells creates very serious topological problems during replication. Due to the helical structure of the double helix, the parental strands of circular DNA form a link of very high order, and yet they have to be unlinked before the cell division. DNA topoisomerases, the enzymes that catalyze passing of one DNA segment through another, solve this problem in principle. However, it is very difficult to remove all entanglements between the replicated DNA molecules due to the huge length of DNA compared to the cell size. One strategy that nature uses to overcome this problem is to create topoisomerases that can dramatically reduce the fraction of linked circular DNA molecules relative to the corresponding fraction at thermodynamic equilibrium. This striking property means that enzymes that interact with DNA only locally can assess its topology, a global property of circular DNA molecules. This review considers the experimental studies of the phenomenon and analyzes the theoretical models that have been suggested in attempts to explain it. We describe here how various models of enzyme action can be investigated computationally. There is no doubt at the moment that we understand the basic principles governing enzyme action. Still, there are essential quantitative discrepancies between the experimental data and the theoretical predictions. We consider how these discrepancies can be overcome.

  2. AI tools in computer based problem solving

    NASA Technical Reports Server (NTRS)

    Beane, Arthur J.

    1988-01-01

    The use of computers to solve value-oriented, deterministic, algorithmic problems has evolved a structured life-cycle model of the software process. The symbolic processing techniques used, primarily in research, for solving nondeterministic problems, and those for which an algorithmic solution is unknown, have evolved a different, much less structured model. Traditionally, the two approaches have been used completely independently. With the advent of low-cost, high-performance 32-bit workstations executing the same software as large minicomputers and mainframes, it became possible to begin to merge both models into a single extended model of computer problem solving. The implementation of such an extended model on a VAX family of micro/mini/mainframe systems is described. Examples in both development and deployment of applications involving a blending of AI and traditional techniques are given.

  3. Phylo: A Citizen Science Approach for Improving Multiple Sequence Alignment

    PubMed Central

    Kam, Alfred; Kwak, Daniel; Leung, Clarence; Wu, Chu; Zarour, Eleyine; Sarmenta, Luis; Blanchette, Mathieu; Waldispühl, Jérôme

    2012-01-01

    Background Comparative genomics, or the study of the relationships of genome structure and function across different species, offers a powerful tool for studying evolution, annotating genomes, and understanding the causes of various genetic disorders. However, aligning multiple sequences of DNA, an essential intermediate step for most types of analyses, is a difficult computational task. In parallel, citizen science, an approach that takes advantage of the fact that the human brain is exquisitely tuned to solving specific types of problems, is becoming increasingly popular. There, instances of hard computational problems are dispatched to a crowd of non-expert human game players and solutions are sent back to a central server. Methodology/Principal Findings We introduce Phylo, a human-based computing framework applying "crowd sourcing" techniques to solve the Multiple Sequence Alignment (MSA) problem. The key idea of Phylo is to convert the MSA problem into a casual game that can be played by ordinary web users with minimal prior knowledge of the biological context. We applied this strategy to improve the alignment of the promoters of disease-related genes from up to 44 vertebrate species. Since the launch in November 2010, we have received more than 350,000 solutions submitted from more than 12,000 registered users. Our results show that the solutions submitted contributed to improving the accuracy of up to 70% of the alignment blocks considered. Conclusions/Significance We demonstrate that, combined with classical algorithms, crowd computing techniques can be successfully used to help improve the accuracy of MSA. More importantly, we show that an NP-hard computational problem can be embedded in a casual game that can be easily played by people without significant scientific training. This suggests that citizen science approaches can be used to exploit the billions of "human-brain peta-flops" of computation that are spent every day playing games.
Phylo is available at: http://phylo.cs.mcgill.ca. PMID:22412834
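
    The objective that alignment games like this implicitly optimize can be illustrated with a conventional sum-of-pairs score. Phylo's actual scoring scheme is not specified in the abstract, so the function name and the weights below are illustrative only:

```python
def alignment_score(rows, match=1, mismatch=-1, gap=-2):
    """Sum-of-pairs score of a gapped multiple alignment: every pair
    of characters in each column contributes match, mismatch, or gap
    points. Higher totals correspond to better alignments."""
    score = 0
    for col in zip(*rows):
        for i in range(len(col)):
            for j in range(i + 1, len(col)):
                a, b = col[i], col[j]
                if a == "-" or b == "-":
                    score += gap
                elif a == b:
                    score += match
                else:
                    score += mismatch
    return score

# Three matching columns (+1 each) and one gapped column (-2):
print(alignment_score(["ACGT", "AC-T"]))  # prints 1
```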

  4. Constructing Smart Protocells with Built-In DNA Computational Core to Eliminate Exogenous Challenge.

    PubMed

    Lyu, Yifan; Wu, Cuichen; Heinke, Charles; Han, Da; Cai, Ren; Teng, I-Ting; Liu, Yuan; Liu, Hui; Zhang, Xiaobing; Liu, Qiaoling; Tan, Weihong

    2018-06-06

    A DNA reaction network is like a biological algorithm that can respond to "molecular input signals", such as biological molecules, while the artificial cell is like a microrobot whose function is powered by the encapsulated DNA reaction network. In this work, we describe the feasibility of using a DNA reaction network as the computational core of a protocell, which will perform an artificial immune response in a concise way to eliminate a mimicked pathogenic challenge. Such a DNA reaction network (RN)-powered protocell can realize the connection of logical computation and biological recognition due to the natural programmability and biological properties of DNA. Thus, the biological input molecules can be easily involved in the molecular computation, and the computation process can be spatially isolated and protected by an artificial bilayer membrane. We believe the strategy proposed in the current paper, i.e., using DNA RNs to power artificial cells, will lay the groundwork for understanding the basic design principles of DNA algorithm-based nanodevices which will, in turn, inspire the construction of artificial cells, or protocells, that will find a place in future biomedical research.

  5. Identification of Forensic Samples via Mitochondrial DNA in the Undergraduate Biochemistry Laboratory

    NASA Astrophysics Data System (ADS)

    Millard, Julie T.; Pilon, André M.

    2003-04-01

    A recent forensic approach for identification of unknown biological samples is mitochondrial DNA (mtDNA) sequencing. We describe a laboratory exercise suitable for an undergraduate biochemistry course in which the polymerase chain reaction is used to amplify a 440 base pair hypervariable region of human mtDNA from a variety of "crime scene" samples (e.g., teeth, hair, nails, cigarettes, envelope flaps, toothbrushes, and chewing gum). Amplification is verified via agarose gel electrophoresis and then samples are subjected to cycle sequencing. Sequence alignments are made via the program CLUSTAL W, allowing students to compare samples and solve the "crime."
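
    The students' final comparison step can be approximated in a few lines: given two equal-length aligned fragments, compute their percent identity. The sequences and the helper name below are invented for illustration:

```python
def percent_identity(a, b):
    """Percentage of identical positions between two equal-length
    aligned sequences, as students would read off a CLUSTAL W
    alignment when matching a sample to a suspect."""
    assert len(a) == len(b), "sequences must already be aligned"
    matches = sum(x == y for x, y in zip(a, b))
    return 100.0 * matches / len(a)

# Hypothetical hypervariable-region fragments from two samples:
print(percent_identity("ACGTTACA", "ACGTCACA"))  # 7/8 identical -> 87.5
```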

  6. Easy design of colorimetric logic gates based on nonnatural base pairing and controlled assembly of gold nanoparticles.

    PubMed

    Zhang, Li; Wang, Zhong-Xia; Liang, Ru-Ping; Qiu, Jian-Ding

    2013-07-16

    Utilizing the principles of metal-ion-mediated base pairs (C-Ag-C and T-Hg-T), the pH-sensitive conformational transition of C-rich DNA strand, and the ligand-exchange process triggered by DL-dithiothreitol (DTT), a system of colorimetric logic gates (YES, AND, INHIBIT, and XOR) can be rationally constructed based on the aggregation of the DNA-modified Au NPs. The proposed logic operation system is simple, which consists of only T-/C-rich DNA-modified Au NPs, and it is unnecessary to exquisitely design and alter the DNA sequence for different multiple molecular logic operations. The nonnatural base pairing combined with unique optical properties of Au NPs promises great potential in multiplexed ion sensing, molecular-scale computers, and other computational logic devices.
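
    The AND gate in this system can be modeled as a toy truth table in which output 1 stands for ion-induced aggregation of the DNA-modified Au NPs (red-to-blue color change). This is a deliberate simplification of the chemistry, for illustration only:

```python
def and_gate(ag_present, hg_present):
    """Toy model of the colorimetric AND gate: the Au-NP solution
    aggregates (output 1, blue) only when both C-Ag-C and T-Hg-T
    base pairs can form, i.e. both metal ions are present."""
    return int(ag_present and hg_present)

for ag in (0, 1):
    for hg in (0, 1):
        print(ag, hg, "->", "blue" if and_gate(ag, hg) else "red")
```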

  7. Toward Theory-Based Instruction in Scientific Problem Solving.

    ERIC Educational Resources Information Center

    Heller, Joan I.; And Others

    Several empirical and theoretical analyses related to scientific problem-solving are reviewed, including: detailed studies of individuals at different levels of expertise, and computer models simulating some aspects of human information processing during problem solving. Analysis of these studies has revealed many facets about the nature of the…

  8. Problem Solving Under Time-Constraints.

    ERIC Educational Resources Information Center

    Richardson, Michael; Hunt, Earl

    A model of how automated and controlled processing can be mixed in computer simulations of problem solving is proposed. It is based on previous work by Hunt and Lansman (1983), who developed a model of problem solving that could reproduce the data obtained with several attention and performance paradigms, extending production-system notation to…

  9. Computer problem-solving coaches for introductory physics: Design and usability studies

    NASA Astrophysics Data System (ADS)

    Ryan, Qing X.; Frodermann, Evan; Heller, Kenneth; Hsu, Leonardo; Mason, Andrew

    2016-06-01

    The combination of modern computing power, the interactivity of web applications, and the flexibility of object-oriented programming may finally be sufficient to create computer coaches that can help students develop metacognitive problem-solving skills, an important competence in our rapidly changing technological society. However, no matter how effective such coaches might be, they will only be useful if they are attractive to students. We describe the design and testing of a set of web-based computer programs that act as personal coaches to students while they practice solving problems from introductory physics. The coaches are designed to supplement regular human instruction, giving students access to effective forms of practice outside class. We present results from large-scale usability tests of the computer coaches and discuss their implications for future versions of the coaches.

  10. HMMBinder: DNA-Binding Protein Prediction Using HMM Profile Based Features.

    PubMed

    Zaman, Rianon; Chowdhury, Shahana Yasmin; Rashid, Mahmood A; Sharma, Alok; Dehzangi, Abdollah; Shatabda, Swakkhar

    2017-01-01

    DNA-binding proteins often play important roles in various processes within the cell. Over the last decade, a wide range of classification algorithms and feature extraction techniques have been used to solve the DNA-binding protein prediction problem. In this paper, we propose a novel DNA-binding protein prediction method called HMMBinder. HMMBinder uses monogram and bigram features extracted from the HMM profiles of the protein sequences. To the best of our knowledge, this is the first application of HMM profile based features for the DNA-binding protein prediction problem. We applied Support Vector Machines (SVM) as a classification technique in HMMBinder. Our method was tested on standard benchmark datasets. We experimentally show that our method outperforms the state-of-the-art methods found in the literature.
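
    The exact monogram/bigram formulation used by HMMBinder is not given in the abstract; one common way to derive such features from a profile matrix (rows = residue positions, columns = profile states) is sketched below, with all names and normalisations being our assumptions:

```python
def monogram_bigram(profile):
    """Illustrative profile features: monograms average each column
    over the sequence length; bigrams average products of state
    scores at consecutive positions, giving K and K*K features
    regardless of sequence length (so an SVM can consume them)."""
    L, K = len(profile), len(profile[0])
    mono = [sum(row[j] for row in profile) / L for j in range(K)]
    bi = [sum(profile[i][j] * profile[i + 1][k] for i in range(L - 1)) / (L - 1)
          for j in range(K) for k in range(K)]
    return mono, bi

# A toy 3-position, 2-state profile:
profile = [[0.9, 0.1], [0.2, 0.8], [0.7, 0.3]]
mono, bi = monogram_bigram(profile)
print(len(mono), len(bi))  # 2 monogram and 4 bigram features
```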

  11. Applying Agrep to r-NSA to solve multiple sequences approximate matching.

    PubMed

    Ni, Bing; Wong, Man-Hon; Lam, Chi-Fai David; Leung, Kwong-Sak

    2014-01-01

    This paper addresses the approximate matching problem in a database consisting of multiple DNA sequences, where the proposed approach applies Agrep to a new truncated suffix array, r-NSA. The construction time of the structure is linear in the database size, and indexing a substring in the structure takes constant time. The number of characters processed in applying Agrep is analysed theoretically, and the theoretical upper bound closely approximates the empirical number of characters obtained by enumerating the characters in the actual structure built. Experiments are carried out using (synthetic) random DNA sequences, as well as (real) genome sequences including Hepatitis-B Virus and X-chromosome. Experimental results show that, compared to the straightforward approach that applies Agrep to multiple sequences individually, the proposed approach solves the matching problem in much shorter time. The speed-up of our approach depends on the sequence patterns, and for highly similar homologous genome sequences, which are the common cases in real-life genomes, it can be up to several orders of magnitude.
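
    The query being indexed can be stated concretely with a naive baseline: report every position where a pattern matches the text with at most k mismatches (Hamming distance). Agrep over the r-NSA structure answers such queries far faster; the sketch below only defines the task:

```python
def k_mismatch_positions(text, pattern, k):
    """Naive approximate matching: every start position where the
    pattern matches the text with at most k mismatches. Runs in
    O(len(text) * len(pattern)); index-based methods avoid scanning
    every window."""
    m = len(pattern)
    hits = []
    for i in range(len(text) - m + 1):
        errs = sum(text[i + j] != pattern[j] for j in range(m))
        if errs <= k:
            hits.append(i)
    return hits

print(k_mismatch_positions("ACGTACGT", "ACGA", 1))  # prints [0, 4]
```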

  12. The effect on cadaver blood DNA identification by the use of targeted and whole body post-mortem computed tomography angiography.

    PubMed

    Rutty, Guy N; Barber, Jade; Amoroso, Jasmin; Morgan, Bruno; Graham, Eleanor A M

    2013-12-01

    Post-mortem computed tomography angiography (PMCTA) involves the injection of contrast agents. This could both have a dilution effect on biological fluid samples and affect subsequent post-contrast analytical laboratory processes. We undertook a small sample study of 10 targeted and 10 whole-body PMCTA cases to consider whether or not these two methods of PMCTA could affect post-PMCTA cadaver blood-based DNA identification. We used standard methodology to examine DNA from blood samples obtained before and after the PMCTA procedure. We illustrate that neither of these PMCTA methods had an effect on the alleles called following short tandem repeat-based DNA profiling, and therefore the ability to undertake post-PMCTA blood-based DNA identification.

  13. A Software Laboratory Environment for Computer-Based Problem Solving.

    ERIC Educational Resources Information Center

    Kurtz, Barry L.; O'Neal, Micheal B.

    This paper describes a National Science Foundation-sponsored project at Louisiana Technological University to develop computer-based laboratories for "hands-on" introductions to major topics of computer science. The underlying strategy is to develop structured laboratory environments that present abstract concepts through the use of…

  14. Smell Detection Agent Based Optimization Algorithm

    NASA Astrophysics Data System (ADS)

    Vinod Chandra, S. S.

    2016-09-01

    In this paper, a novel nature-inspired optimization algorithm is employed: the trained behaviour of dogs in detecting smell trails is adapted into computational agents for problem solving. The algorithm involves the creation of a surface with smell trails and subsequent iteration of the agents in resolving a path. It can be applied to computational problems that involve finding paths, and its implementation can be treated as a shortest-path problem for a variety of datasets. The simulated agents have been used to evolve the shortest path between two nodes in a graph. The algorithm is useful for solving NP-hard problems related to path discovery, as well as many practical optimization problems, and its derivation extends naturally to shortest-path problems in general.
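
    The baseline the smell-trail agents are compared against is the classical shortest-path computation; a standard Dijkstra sketch (the graph encoding and names are ours, not the paper's) is:

```python
import heapq

def dijkstra(graph, src, dst):
    """Classical shortest-path baseline for the same problem the
    smell-trail agents evolve a path for. graph maps each node to a
    list of (neighbour, edge_weight) pairs."""
    dist = {src: 0}
    pq = [(0, src)]
    while pq:
        d, u = heapq.heappop(pq)
        if u == dst:
            return d
        if d > dist.get(u, float("inf")):
            continue  # stale queue entry
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(pq, (nd, v))
    return float("inf")

g = {"A": [("B", 1), ("C", 4)], "B": [("C", 2)], "C": []}
print(dijkstra(g, "A", "C"))  # A->B->C costs 1+2 -> prints 3
```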

  15. Problem-Solving Test: Analysis of DNA Damage Recognizing Proteins in Yeast and Human Cells

    ERIC Educational Resources Information Center

    Szeberenyi, Jozsef

    2013-01-01

    The experiment described in this test was aimed at identifying DNA repair proteins in human and yeast cells. Terms to be familiar with before you start to solve the test: DNA repair, germline mutation, somatic mutation, inherited disease, cancer, restriction endonuclease, radioactive labeling, [α-³²P]ATP, [γ-…

  16. PBEQ-Solver for online visualization of electrostatic potential of biomolecules.

    PubMed

    Jo, Sunhwan; Vargyas, Miklos; Vasko-Szedlar, Judit; Roux, Benoît; Im, Wonpil

    2008-07-01

    PBEQ-Solver provides a web-based graphical user interface to read biomolecular structures, solve the Poisson-Boltzmann (PB) equations and interactively visualize the electrostatic potential. PBEQ-Solver calculates (i) electrostatic potential and solvation free energy, (ii) protein-protein (DNA or RNA) electrostatic interaction energy and (iii) pKa of a selected titratable residue. All the calculations can be performed in both aqueous solvent and membrane environments (with a cylindrical pore in the case of membrane). PBEQ-Solver uses the PBEQ module in the biomolecular simulation program CHARMM to solve the finite-difference PB equation of molecules specified by users. Users can interactively inspect the calculated electrostatic potential on the solvent-accessible surface as well as iso-electrostatic potential contours using a novel online visualization tool based on MarvinSpace molecular visualization software, a Java applet integrated within CHARMM-GUI (http://www.charmm-gui.org). To reduce the computational time on the server, and to increase the efficiency in visualization, all the PB calculations are performed with coarse grid spacing (1.5 Å before and 1.0 Å after focusing). PBEQ-Solver suggests various physical parameters for PB calculations and users can modify them if necessary. PBEQ-Solver is available at http://www.charmm-gui.org/input/pbeqsolver.
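
    The finite-difference relaxation at the heart of such a solver can be illustrated on the simplest relative of the PB equation, the 1-D Laplace equation with fixed boundary potentials (the full PB solve adds charge sources, dielectric maps, and ionic screening on a 3-D grid). This sketch is illustrative only and is not the PBEQ algorithm:

```python
def solve_laplace_1d(phi_left, phi_right, n=101, iters=20000):
    """Jacobi relaxation for the 1-D Laplace equation on a uniform
    grid: repeatedly replace each interior value with the average of
    its neighbours until the potential profile converges."""
    phi = [0.0] * n
    phi[0], phi[-1] = phi_left, phi_right
    for _ in range(iters):
        # The comprehension reads the old values before the slice
        # assignment, so this is a true Jacobi (not Gauss-Seidel) sweep.
        phi[1:-1] = [0.5 * (phi[i - 1] + phi[i + 1]) for i in range(1, n - 1)]
    return phi

phi = solve_laplace_1d(1.0, 0.0)
print(round(phi[50], 3))  # midpoint of the linear profile -> 0.5
```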

  17. Genomic signal processing methods for computation of alignment-free distances from DNA sequences.

    PubMed

    Borrayo, Ernesto; Mendizabal-Ruiz, E Gerardo; Vélez-Pérez, Hugo; Romo-Vázquez, Rebeca; Mendizabal, Adriana P; Morales, J Alejandro

    2014-01-01

    Genomic signal processing (GSP) refers to the use of digital signal processing (DSP) tools for analyzing genomic data such as DNA sequences. A possible application of GSP that has not been fully explored is the computation of the distance between a pair of sequences. In this work we present GAFD, a novel GSP alignment-free distance computation method. We introduce a DNA sequence-to-signal mapping function based on the employment of doublet values, which increases the number of possible amplitude values for the generated signal. Additionally, we explore the use of three DSP distance metrics as descriptors for categorizing DNA signal fragments. Our results indicate the feasibility of employing GAFD for computing sequence distances and the use of descriptors for characterizing DNA fragments.
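
    The sequence-to-signal idea can be sketched as follows; the doublet-to-amplitude map below is a placeholder (the paper's actual doublet values are not reproduced here), and Euclidean distance stands in for the DSP metrics explored:

```python
import math
from itertools import product

# Hypothetical doublet-to-amplitude map: each of the 16 dinucleotides
# gets a distinct value, giving more amplitude levels than a
# single-base mapping would.
DOUBLET_VALUE = {"".join(p): i for i, p in enumerate(product("ACGT", repeat=2))}

def to_signal(seq):
    """Map a DNA sequence to a numeric signal via overlapping doublets."""
    return [DOUBLET_VALUE[seq[i:i + 2]] for i in range(len(seq) - 1)]

def euclidean(a, b):
    """One simple DSP-style distance between two equal-length signals."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

s1, s2 = to_signal("ACGTA"), to_signal("ACGTT")
print(euclidean(s1, s2))  # signals differ only in the final doublet
```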

  18. Genomic Signal Processing Methods for Computation of Alignment-Free Distances from DNA Sequences

    PubMed Central

    Borrayo, Ernesto; Mendizabal-Ruiz, E. Gerardo; Vélez-Pérez, Hugo; Romo-Vázquez, Rebeca; Mendizabal, Adriana P.; Morales, J. Alejandro

    2014-01-01

    Genomic signal processing (GSP) refers to the use of digital signal processing (DSP) tools for analyzing genomic data such as DNA sequences. A possible application of GSP that has not been fully explored is the computation of the distance between a pair of sequences. In this work we present GAFD, a novel GSP alignment-free distance computation method. We introduce a DNA sequence-to-signal mapping function based on the employment of doublet values, which increases the number of possible amplitude values for the generated signal. Additionally, we explore the use of three DSP distance metrics as descriptors for categorizing DNA signal fragments. Our results indicate the feasibility of employing GAFD for computing sequence distances and the use of descriptors for characterizing DNA fragments. PMID:25393409

  19. Problem-Solving Test: Pyrosequencing

    ERIC Educational Resources Information Center

    Szeberenyi, Jozsef

    2013-01-01

    Terms to be familiar with before you start to solve the test: Maxam-Gilbert sequencing, Sanger sequencing, gel electrophoresis, DNA synthesis reaction, polymerase chain reaction, template, primer, DNA polymerase, deoxyribonucleoside triphosphates, orthophosphate, pyrophosphate, nucleoside monophosphates, luminescence, acid anhydride bond,…

  20. Computers in medical education 1: evaluation of a problem-orientated learning package.

    PubMed

    Devitt, P; Palmer, E

    1998-04-01

    A computer-based learning package has been developed, aimed at expanding students' knowledge base, as well as improving data-handling abilities and clinical problem-solving skills. The program was evaluated by monitoring its use by students, canvassing users' opinions and measuring its effectiveness as a learning tool compared to tutorials on the same material. Evaluation was undertaken using three methods: initially, by a questionnaire on computers as a learning tool and the applicability of the content; second, through monitoring by the computer of student use, decisions and performance; finally, through pre- and post-test assessment of fifth-year students who either used a computer package or attended a tutorial on equivalent material. Most students provided positive comments on the learning material and expressed a willingness to see computer-aided learning (CAL) introduced into the curriculum. Over a 3-month period, 26 modules in the program were used on 1246 occasions. Objective measurement showed a significant gain in knowledge, data handling and problem-solving skills. Computer-aided learning is a valuable learning resource that deserves better attention in medical education. When used appropriately, the computer can be an effective learning resource, not only for the delivery of knowledge, but also to help students develop their problem-solving skills.

  1. Nanochannel Device with Embedded Nanopore: a New Approach for Single-Molecule DNA Analysis and Manipulation

    NASA Astrophysics Data System (ADS)

    Zhang, Yuning; Reisner, Walter

    2013-03-01

    Nanopore- and nanochannel-based devices are robust methods for biomolecular sensing and single-DNA manipulation. Nanopore-based DNA sensing has attractive features that make it a leading candidate as a single-molecule DNA sequencing technology. Nanochannel-based extension of DNA, combined with enzymatic or denaturation-based barcoding schemes, is already a powerful approach for genome analysis. We believe that there is revolutionary potential in devices that combine nanochannels with embedded pore detectors. In particular, due to the fast translocation of a DNA molecule through a standard nanopore configuration, there is an unfavorable trade-off between signal and sequence resolution. With a combined nanochannel-nanopore device, based on embedding a pore inside a nanochannel, we can in principle gain independent control over both DNA translocation speed and sensing signal, solving the key drawback of the standard nanopore configuration. We demonstrate that we can optically detect successful translocation of DNA from the nanochannel out through the nanopore, a possible method to 'select' a given barcode for further analysis. In particular, we show that in equilibrium DNA will not escape through an embedded sub-persistence-length nanopore, suggesting that the pore could be used as a nanoscale window through which to interrogate a nanochannel-extended DNA molecule. Furthermore, electrical measurements through the nanopore are performed, indicating that DNA sensing is feasible using the nanochannel-nanopore device.

  2. Computer-Mediated Assessment of Higher-Order Thinking Development

    ERIC Educational Resources Information Center

    Tilchin, Oleg; Raiyn, Jamal

    2015-01-01

    Solving complicated problems in a contemporary knowledge-based society requires higher-order thinking (HOT). The most productive way to encourage development of HOT in students is through use of the Problem-based Learning (PBL) model. This model organizes learning by solving corresponding problems relative to study courses. Students are directed…

  3. Using a Computer-Adapted, Conceptually Based History Text to Increase Comprehension and Problem-Solving Skills of Students with Disabilities

    ERIC Educational Resources Information Center

    Twyman, Todd; Tindal, Gerald

    2006-01-01

    The purpose of this study was to improve the comprehension and problem-solving skills of students with disabilities in social studies using a conceptually framed, computer-adapted history text. Participants were 11th and 12th grade students identified with learning disabilities in reading and writing from two intact, self-contained social studies…

  4. Proceedings of the Conference on Joint Problem Solving and Microcomputers (San Diego, California, March 31 - April 2, 1983). Technical Report No. 1.

    ERIC Educational Resources Information Center

    Cole, Michael; And Others

    A group of American and Japanese psychologists, anthropologists, linguists, and computer scientists gathered at the University of California, San Diego, to exchange ideas on models of joint problem solving and their special relevance to the design and implementation of computer-based systems of instruction. Much of the discussion focused on…

  5. An Analysis of Collaborative Problem-Solving Activities Mediated by Individual-Based and Collaborative Computer Simulations

    ERIC Educational Resources Information Center

    Chang, C.-J.; Chang, M.-H.; Liu, C.-C.; Chiu, B.-C.; Fan Chiang, S.-H.; Wen, C.-T.; Hwang, F.-K.; Chao, P.-Y.; Chen, Y.-L.; Chai, C.-S.

    2017-01-01

    Researchers have indicated that the collaborative problem-solving space afforded by collaborative systems significantly impacts the problem-solving process. However, recent investigations into collaborative simulations, which allow a group of students to jointly manipulate a problem in a shared problem space, have yielded divergent results…

  6. Backtrack Programming: A Computer-Based Approach to Group Problem Solving.

    ERIC Educational Resources Information Center

    Scott, Michael D.; Bodaken, Edward M.

    Backtrack problem-solving appears to be a viable alternative to current problem-solving methodologies. It appears to have considerable heuristic potential as a conceptual and operational framework for small group communication research, as well as functional utility for the student group in the small group class or the management team in the…

  7. Logical NAND and NOR Operations Using Algorithmic Self-assembly of DNA Molecules

    NASA Astrophysics Data System (ADS)

    Wang, Yanfeng; Cui, Guangzhao; Zhang, Xuncai; Zheng, Yan

    DNA self-assembly is the most advanced and versatile system that has been experimentally demonstrated for programmable construction of patterned systems on the molecular scale. It has been demonstrated that simple binary arithmetic and logical operations can be computed by the process of self-assembly of DNA tiles. Here we report a one-dimensional algorithmic self-assembly of DNA triple-crossover molecules that can be used to execute five steps of logical NAND and NOR operations on a string of binary bits. To achieve this, abstract tiles were translated into DNA tiles based on triple-crossover motifs. Serving as input for the computation, long single-stranded DNA molecules were used to nucleate growth of tiles into algorithmic crystals. Our method shows that engineered DNA self-assembly can be treated as a bottom-up design technique capable of realizing DNA computer organization and architecture.
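
    The gate logic computed by such an assembly can be mirrored in software. The sketch below is only an analogy of the five-step bit-string computation, with the folding rule chosen for illustration; it does not model the DNA tile chemistry itself.

```python
# Software analogy of the one-dimensional algorithmic self-assembly:
# each "tile addition" step combines the running value with the next
# input bit via a NAND (or NOR) gate. Illustration only, not a model
# of the triple-crossover DNA chemistry.

def nand(a: int, b: int) -> int:
    return 1 - (a & b)

def nor(a: int, b: int) -> int:
    return 1 - (a | b)

def assemble(bits, op):
    """Fold a binary string through successive two-input gate steps,
    analogous to tiles accreting one at a time along the input strand."""
    acc = bits[0]
    for b in bits[1:]:
        acc = op(acc, b)
    return acc

# Five NAND gate steps over a six-bit input string:
print(assemble([1, 0, 1, 1, 0, 1], nand))  # 0
```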

  8. Orchestration of Molecular Information through Higher Order Chemical Recognition

    NASA Astrophysics Data System (ADS)

    Frezza, Brian M.

    Broadly defined, higher-order chemical recognition is the process whereby discrete chemical building blocks capable of specifically binding to cognate moieties are covalently linked into oligomeric chains. These chains, or sequences, are then able to recognize and bind to their cognate sequences with a high degree of cooperativity. Principally speaking, DNA and RNA are the most readily obtained examples of this chemical phenomenon, and function via Watson-Crick cognate pairing: guanine pairs with cytosine and adenine with thymine (DNA) or uracil (RNA), in an anti-parallel manner. While the theoretical principles, techniques, and equations derived herein apply generally to any higher-order chemical recognition system, in practice we utilize DNA oligomers as a model building material to experimentally investigate and validate our hypotheses. Historically, general-purpose information processing has been a task limited to semiconductor electronics. Molecular computing, on the other hand, has been limited to ad hoc approaches designed to solve highly specific and unique computation problems, often involving components or techniques that cannot be applied generally in a manner suitable for precise and predictable engineering. Herein, we provide a fundamental framework for harnessing higher-order recognition in a modular and programmable fashion to synthesize molecular information processing networks of arbitrary construction and complexity. This document provides a solid foundation for routinely embedding computational capability into chemical and biological systems where semiconductor electronics are unsuitable for practical application.

  9. [Applications of DNA methylation markers in forensic medicine].

    PubMed

    Zhao, Gui-sen; Yang, Qing-en

    2005-02-01

    DNA methylation is a post-replication modification that is predominantly found in cytosines of the dinucleotide sequence CpG. Epigenetic information is stored in the distribution of the modified base 5-methylcytosine. DNA methylation profiles represent a more chemically and biologically stable source of molecular diagnostic information than RNA or most proteins. Recent advances attest to the great promise of DNA methylation markers as powerful future tools in the clinic. In the past decade, DNA methylation analysis has been revolutionized by two technological advances--bisulphite modification of DNA and methylation-specific polymerase chain reaction (MSP). The methylation pattern of the human genome is space-time specific, sex-specific, parent-of-origin specific and disease specific, providing an alternative way to solve forensic problems.

  10. Mechanism of degradation of 2'-deoxycytidine by formamide: implications for chemical DNA sequencing procedures.

    PubMed

    Saladino, R; Crestini, C; Mincione, E; Costanzo, G; Di Mauro, E; Negri, R

    1997-11-01

    We describe the reaction of formamide with 2'-deoxycytidine to give pyrimidine ring opening by nucleophilic addition at the electrophilic C(6) and C(4) positions. This information is confirmed by analysis of the products of formamide attack on 2'-deoxycytidine, 5-methyl-2'-deoxycytidine, and 5-bromo-2'-deoxycytidine residues when the latter are incorporated into oligonucleotides by DNA polymerase-driven polymerization and the solid-phase phosphoramidite procedure. The increased sensitivity of 5-bromo-2'-deoxycytidine relative to that of 2'-deoxycytidine is pivotal for the improvement of the one-lane chemical DNA sequencing procedure based on the base-selective reaction of formamide with DNA. In many DNA sequencing cases it will in fact be possible to incorporate this base analogue into the DNA to be sequenced, thus providing complete discrimination between its UV absorption signal and that of the thymidine residues. The wide spectrum of different sensitivities to formamide displayed by the 2'-deoxycytidine analogues removes, in the DNA single-lane chemical sequencing procedure, a possible source of errors due to low discrimination between C and T residues.

  11. Nanochannel Device with Embedded Nanopore: a New Approach for Single-Molecule DNA Analysis and Manipulation

    NASA Astrophysics Data System (ADS)

    Zhang, Yuning; Reisner, Walter

    2012-02-01

    Nanopore- and nanochannel-based devices are robust platforms for biomolecular sensing and single-DNA manipulation. Nanopore-based DNA sensing has attractive features that make it a leading candidate as a single-molecule DNA sequencing technology. Nanochannel-based extension of DNA, combined with enzymatic or denaturation-based barcoding schemes, is already a powerful approach for genome analysis. We believe that there is revolutionary potential in devices that combine nanochannels with nanopore detectors. In particular, due to the fast translocation of a DNA molecule through a standard nanopore configuration, there is an unfavorable trade-off between signal and sequence resolution. With a combined nanochannel-nanopore device, based on embedding a nanopore inside a nanochannel, we can in principle gain independent control over both DNA translocation speed and sensing signal, solving the key drawback of the standard nanopore configuration. We will discuss our recent progress on device fabrication and characterization. In particular, we demonstrate that we can detect - using fluorescence microscopy - successful translocation of DNA from the nanochannel out through the nanopore, a possible method to 'select' a given barcode for further analysis. In particular, we show that in equilibrium DNA will not escape through an embedded sub-persistence length nanopore, suggesting that the embedded pore could be used as a nanoscale window through which to interrogate a nanochannel-extended DNA molecule.

  12. Fast Legendre moment computation for template matching

    NASA Astrophysics Data System (ADS)

    Li, Bing C.

    2017-05-01

    Normalized cross correlation (NCC) based template matching is insensitive to intensity changes and has many applications in image processing, object detection, video tracking and pattern recognition. However, normalized cross correlation is computationally expensive since it involves both correlation computation and normalization. In this paper, we propose a Legendre moment approach for fast normalized cross correlation and show that the computational cost of the proposed approach is independent of template mask size, making it significantly faster than traditional mask-size-dependent approaches, especially for large mask templates. Legendre polynomials have been widely used in solving the Laplace equation in electrodynamics in spherical coordinate systems, and in solving the Schrödinger equation in quantum mechanics. In this paper, we extend Legendre polynomials from physics to the computer vision and pattern recognition fields, and demonstrate that they can significantly reduce the computational cost of NCC-based template matching.
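
    To make the cost argument concrete, the sketch below computes NCC at a single offset in the direct, mask-size-dependent way; the per-offset work grows with template area, which is exactly what a moment-based formulation avoids. This is a minimal illustration, not the paper's Legendre-moment algorithm.

```python
import numpy as np

# Direct normalized cross correlation of two equal-sized arrays.
# Per-offset cost is O(template area), i.e. mask-size dependent.

def ncc(patch: np.ndarray, template: np.ndarray) -> float:
    p = patch - patch.mean()
    t = template - template.mean()
    denom = np.sqrt((p * p).sum() * (t * t).sum())
    return float((p * t).sum() / denom) if denom > 0 else 0.0

rng = np.random.default_rng(0)
t = rng.random((8, 8))
print(round(ncc(t, t), 6))        # identical patch -> 1.0
print(round(ncc(t, 1.0 - t), 6))  # inverted patch  -> -1.0
```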

  13. Crystal structure of a four-stranded intercalated DNA: d(C4)

    NASA Technical Reports Server (NTRS)

    Chen, L.; Cai, L.; Zhang, X.; Rich, A.

    1994-01-01

    The crystal structure of d(C4) solved at 2.3-A resolution reveals a four-stranded molecule composed of two interdigitated or intercalated duplexes. The duplexes are held together by hemiprotonated cytosine-cytosine base pairs and are parallel stranded, but the two duplexes point in opposite directions. The molecule has a slow right-handed twist of 12.4 degrees between covalently linked cytosine base pairs, and the base stacking distance is 3.1 A. This is in general agreement with the NMR studies. A biological role for DNA in this conformation is suggested.

  14. Commentary: Crowdsourcing, Foldit, and Scientific Discovery Games

    ERIC Educational Resources Information Center

    Parslow, Graham R.

    2013-01-01

    The web has created new possibilities for collaboration that fit under the terms crowdsourcing and human-based computation. Crowdsourcing applies when a task or problem is outsourced to an undefined public rather than a specific body. Human-based computation refers to ways that humans and computers can work together to solve problems. These two…

  15. A knowledge-based system with learning for computer communication network design

    NASA Technical Reports Server (NTRS)

    Pierre, Samuel; Hoang, Hai Hoc; Tropper-Hausen, Evelyne

    1990-01-01

    Computer communication network design is well known to be a complex and hard problem. For that reason, the most effective methods used to solve it are heuristic. Weaknesses of these techniques are listed and a new approach based on artificial intelligence for solving this problem is presented. This approach is particularly recommended for large packet-switched communication networks, in the sense that it permits a high degree of reliability and offers a very flexible environment dealing with many relevant design parameters such as link cost, link capacity, and message delay.

  16. An Examination of the Relationship between Computation, Problem Solving, and Reading

    ERIC Educational Resources Information Center

    Cormier, Damien C.; Yeo, Seungsoo; Christ, Theodore J.; Offrey, Laura D.; Pratt, Katherine

    2016-01-01

    The purpose of this study is to evaluate the relationship of mathematics calculation rate (curriculum-based measurement of mathematics; CBM-M), reading rate (curriculum-based measurement of reading; CBM-R), and mathematics application and problem solving skills (mathematics screener) among students at four levels of proficiency on a statewide…

  17. Interactive computer graphics applications for compressible aerodynamics

    NASA Technical Reports Server (NTRS)

    Benson, Thomas J.

    1994-01-01

    Three computer applications have been developed to solve inviscid compressible fluids problems using interactive computer graphics. The first application is a compressible flow calculator which solves for isentropic flow, normal shocks, and oblique shocks or centered expansions produced by two dimensional ramps. The second application couples the solutions generated by the first application to a more graphical presentation of the results to produce a desk top simulator of three compressible flow problems: 1) flow past a single compression ramp; 2) flow past two ramps in series; and 3) flow past two opposed ramps. The third application extends the results of the second to produce a design tool which solves for the flow through supersonic external or mixed compression inlets. The applications were originally developed to run on SGI or IBM workstations running GL graphics. They are currently being extended to solve additional types of flow problems and modified to operate on any X-based workstation.
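
    The isentropic-flow part of such a calculator reduces to standard stagnation-ratio formulas in Mach number M and specific-heat ratio gamma. A minimal sketch (the function name is our own, not from the NASA applications):

```python
# Standard isentropic flow relations: stagnation-to-static temperature
# and pressure ratios as functions of Mach number M and gamma.

def stagnation_ratios(M: float, gamma: float = 1.4):
    T0_over_T = 1.0 + 0.5 * (gamma - 1.0) * M * M
    p0_over_p = T0_over_T ** (gamma / (gamma - 1.0))
    return T0_over_T, p0_over_p

T_ratio, p_ratio = stagnation_ratios(2.0)  # Mach 2 air
print(round(T_ratio, 3))  # 1.8
print(round(p_ratio, 3))  # 7.824
```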

  18. Examining the Effects of Field Dependence-Independence on Learners' Problem-Solving Performance and Interaction with a Computer Modeling Tool: Implications for the Design of Joint Cognitive Systems

    ERIC Educational Resources Information Center

    Angeli, Charoula

    2013-01-01

    An investigation was carried out to examine the effects of cognitive style on learners' performance and interaction during complex problem solving with a computer modeling tool. One hundred and nineteen undergraduates volunteered to participate in the study. Participants were first administered a test, and based on their test scores they were…

  19. Rotman Lens Sidewall Design and Optimization with Hybrid Hardware/Software Based Programming

    DTIC Science & Technology

    2015-01-09

    conventional MoM and stored in memory. The components of Zfar are computed as needed through a fast matrix-vector multiplication (MVM), which...V vector. Iterative methods, e.g. BiCGSTAB, are employed for solving the linear equation. The matrix-vector multiplications (MVMs), which dominate...most of the computation in the solving phase, consist of calculating near and far MVMs. The far MVM comprises aggregation, translation, and

  20. Effects of the Multiple Solutions and Question Prompts on Generalization and Justification for Non-Routine Mathematical Problem Solving in a Computer Game Context

    ERIC Educational Resources Information Center

    Lee, Chun-Yi; Chen, Ming-Jang; Chang, Wen-Long

    2014-01-01

    The aim of this study is to investigate the effects of solution methods and question prompts on generalization and justification of non-routine problem solving for Grade 9 students. The learning activities are based on the context of the frog jumping game. In addition, related computer tools were used to support generalization and justification of…

  1. Experiences with explicit finite-difference schemes for complex fluid dynamics problems on STAR-100 and CYBER-203 computers

    NASA Technical Reports Server (NTRS)

    Kumar, A.; Rudy, D. H.; Drummond, J. P.; Harris, J. E.

    1982-01-01

    Several two- and three-dimensional external and internal flow problems solved on the STAR-100 and CYBER-203 vector processing computers are described. The flow field was described by the full Navier-Stokes equations, which were solved by explicit finite-difference algorithms. Problem results and computer system requirements are presented. Program organization and database structure for three-dimensional computer codes, which will eliminate or reduce page faulting, are discussed. Storage requirements for three-dimensional codes are reduced by calculating transformation metric data at each step. As a result, the number of in-core grid points was increased by 50% to 150,000, with a 10% increase in execution time. An assessment of current and future machine requirements shows that even on the CYBER-205 computer only a few problems can be solved realistically. Estimates reveal that the present situation is more storage-limited than compute-rate-limited, but advancements in both storage and speed are essential to realistically calculate three-dimensional flow.

  2. Solving large mixed linear models using preconditioned conjugate gradient iteration.

    PubMed

    Strandén, I; Lidauer, M

    1999-12-01

    Continuous evaluation of dairy cattle with a random regression test-day model requires a fast solving method and algorithm. A new computing technique feasible in Jacobi and conjugate gradient based iterative methods using iteration on data is presented. In the new computing technique, the calculations in multiplication of a vector by a matrix were reordered into three steps instead of the commonly used two steps. The three-step method was implemented in a general mixed linear model program that used preconditioned conjugate gradient iteration. Performance of this program in comparison to other general solving programs was assessed via estimation of breeding values using univariate, multivariate, and random regression test-day models. Central processing unit time per iteration with the new three-step technique was, at best, one-third that needed with the old technique. Performance was best with the test-day model, which was the largest and most complex model used. The new program did well in comparison to other general software. Programs keeping the mixed model equations in random access memory required at least 20 and 435% more time to solve the univariate and multivariate animal models, respectively. Computations with the second-best iteration-on-data program took approximately three and five times longer for the animal and test-day models, respectively, than did the new program. Good performance was due to fast computing time per iteration and quick convergence to the final solutions. Use of preconditioned conjugate gradient based methods in solving large breeding value problems is supported by our findings.
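
    The underlying iteration can be sketched in a few lines. Below is a minimal Jacobi-preconditioned conjugate gradient solver for a small symmetric positive-definite system; the paper's contribution, a three-step reordering of the matrix-vector product for iteration on data, is not reproduced here.

```python
import numpy as np

# Minimal Jacobi-preconditioned conjugate gradient (PCG) solver.
# The matrix-vector product A @ p is the dominant per-iteration cost,
# which is what the abstract's three-step technique reorganizes.

def pcg(A, b, tol=1e-10, max_iter=1000):
    M_inv = 1.0 / np.diag(A)          # Jacobi preconditioner
    x = np.zeros_like(b)
    r = b - A @ x
    z = M_inv * r
    p = z.copy()
    rz = r @ z
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rz / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        if np.linalg.norm(r) < tol:
            break
        z = M_inv * r
        rz_new = r @ z
        p = z + (rz_new / rz) * p
        rz = rz_new
    return x

A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
x = pcg(A, b)
print(np.allclose(A @ x, b))  # True
```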

  3. A universal concept based on cellular neural networks for ultrafast and flexible solving of differential equations.

    PubMed

    Chedjou, Jean Chamberlain; Kyamakya, Kyandoghere

    2015-04-01

    This paper develops and validates a comprehensive and universally applicable computational concept for solving nonlinear differential equations (NDEs) through a neurocomputing concept based on cellular neural networks (CNNs). High precision, stability, convergence, and lowest-possible memory requirements are ensured by the CNN processor architecture. A significant challenge solved in this paper is that all these cited computing features are ensured in all system states (regular or chaotic ones) and in all bifurcation conditions that may be experienced by NDEs. One particular quintessence of this paper is to develop and demonstrate a solver concept that shows and ensures that CNN processors (realized either in hardware or in software) are universal solvers of NDE models. The solving logic or algorithm of given NDEs (possible examples are: Duffing, Mathieu, Van der Pol, Jerk, Chua, Rössler, Lorenz, Burgers, and the transport equations) through a CNN processor system is provided by a set of templates that are computed by our comprehensive template calculation technique, which we call nonlinear adaptive optimization. This paper therefore represents a significant contribution and a cutting-edge real-time computational engineering approach, especially considering the various scientific and engineering applications of this ultrafast, energy- and memory-efficient, and high-precision NDE solver concept. For illustration purposes, three NDE models are demonstratively solved, and related CNN templates are derived and used: the periodically excited Duffing equation, the Mathieu equation, and the transport equation.
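
    As a point of reference for the Duffing example mentioned above, the sketch below integrates the periodically excited Duffing equation with a conventional fixed-step RK4 scheme rather than a CNN processor; parameter values are illustrative only, not taken from the paper.

```python
import math

# Conventional reference solver for the periodically excited Duffing
# equation x'' + d*x' + a*x + b*x**3 = g*cos(w*t), integrated with
# fixed-step RK4. This is a standard numerical scheme, not the paper's
# CNN processor concept; all parameter values are illustrative.

def duffing_rk4(x0, v0, t_end, dt=1e-3, d=0.3, a=-1.0, b=1.0, g=0.37, w=1.2):
    def f(t, x, v):
        return v, g * math.cos(w * t) - d * v - a * x - b * x ** 3
    x, v, t = x0, v0, 0.0
    for _ in range(int(round(t_end / dt))):
        k1x, k1v = f(t, x, v)
        k2x, k2v = f(t + dt / 2, x + dt / 2 * k1x, v + dt / 2 * k1v)
        k3x, k3v = f(t + dt / 2, x + dt / 2 * k2x, v + dt / 2 * k2v)
        k4x, k4v = f(t + dt, x + dt * k3x, v + dt * k3v)
        x += dt / 6 * (k1x + 2 * k2x + 2 * k3x + k4x)
        v += dt / 6 * (k1v + 2 * k2v + 2 * k3v + k4v)
        t += dt
    return x, v

# Sanity check: the unforced, linear, damped case decays toward rest.
x, v = duffing_rk4(1.0, 0.0, 10.0, g=0.0, a=1.0, b=0.0)
print(abs(x) < 1.0)  # True
```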

  4. Numerical Simulation of Flow Through an Artificial Heart

    NASA Technical Reports Server (NTRS)

    Rogers, Stuart E.; Kutler, Paul; Kwak, Dochan; Kiris, Cetin

    1989-01-01

    A solution procedure was developed that solves the unsteady, incompressible Navier-Stokes equations, and was used to numerically simulate viscous incompressible flow through a model of the Pennsylvania State artificial heart. The solution algorithm is based on the artificial compressibility method, and uses flux-difference splitting to upwind the convective terms; a line-relaxation scheme is used to solve the equations. The time-accuracy of the method is obtained by iteratively solving the equations at each physical time step. The artificial heart geometry involves a piston-type action with a moving solid wall. A single H-grid is fit inside the heart chamber. The grid is continuously compressed and expanded with a constant number of grid points to accommodate the moving piston. The computational domain ends at the valve openings, where nonreflective boundary conditions based on the method of characteristics are applied. Although a number of simplifying assumptions were made regarding the geometry, the computational results agreed reasonably well with an experimental picture. The computer time requirements for this flow simulation, however, are quite extensive. Computational study of this type of geometry would benefit greatly from improvements in computer hardware speed and algorithm efficiency enhancements.

  5. The Application of Web-based Computer-assisted Instruction Courseware within Health Assessment

    NASA Astrophysics Data System (ADS)

    Xiuyan, Guo

    Health assessment is a clinical nursing course that places emphasis on clinical skills. The application of computer-assisted instruction in nursing teaching solved the problems of the traditional lecture class. This article reports teaching experience with web-based computer-assisted instruction, based upon a two-year study of courseware use within the health assessment course. The computer-assisted instruction courseware could improve the teaching structure, simulate clinical situations, create teaching contexts and facilitate student study.

  6. Hardware-based Artificial Neural Networks for Size, Weight, and Power Constrained Platforms (Preprint)

    DTIC Science & Technology

    2012-11-01

    few sensors/complex computations, and many sensors/simple computation. II. CHALLENGES WITH NANO-ENABLED NEUROMORPHIC CHIPS A wide variety of...scenarios. Neuromorphic processors, which are based on the highly parallelized computing architecture of the mammalian brain, show great promise in...in the brain. This fundamentally different approach, frequently referred to as neuromorphic computing, is thought to be better able to solve fuzzy

  7. A strand graph semantics for DNA-based computation

    PubMed Central

    Petersen, Rasmus L.; Lakin, Matthew R.; Phillips, Andrew

    2015-01-01

    DNA nanotechnology is a promising approach for engineering computation at the nanoscale, with potential applications in biofabrication and intelligent nanomedicine. DNA strand displacement is a general strategy for implementing a broad range of nanoscale computations, including any computation that can be expressed as a chemical reaction network. Modelling and analysis of DNA strand displacement systems is an important part of the design process, prior to experimental realisation. As experimental techniques improve, it is important for modelling languages to keep pace with the complexity of structures that can be realised experimentally. In this paper we present a process calculus for modelling DNA strand displacement computations involving rich secondary structures, including DNA branches and loops. We prove that our calculus is also sufficiently expressive to model previous work on non-branching structures, and propose a mapping from our calculus to a canonical strand graph representation, in which vertices represent DNA strands, ordered sites represent domains, and edges between sites represent bonds between domains. We define interactions between strands by means of strand graph rewriting, and prove the correspondence between the process calculus and strand graph behaviours. Finally, we propose a mapping from strand graphs to an efficient implementation, which we use to perform modelling and simulation of DNA strand displacement systems with rich secondary structure. PMID:27293306
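
    The strand graph idea can be illustrated with a small data structure: vertices are strands carrying ordered domain sites, and bonds pair sites across strands. Class and field names below are our own shorthand, not the paper's formal calculus.

```python
from dataclasses import dataclass, field

# Toy strand graph: strands are vertices with ordered domain sites;
# bonds are edges pairing a site on one strand with a site on another.

@dataclass
class Strand:
    name: str
    domains: list          # ordered sites, e.g. ["t", "x"]

@dataclass
class StrandGraph:
    strands: list = field(default_factory=list)
    bonds: set = field(default_factory=set)   # {((strand_i, site_i), (strand_j, site_j))}

    def bind(self, s1, i, s2, j):
        """Record a bond between site i of strand s1 and site j of strand s2."""
        self.bonds.add(((s1, i), (s2, j)))

g = StrandGraph()
g.strands = [Strand("input", ["t", "x"]), Strand("gate", ["t*", "x*"])]
g.bind(0, 0, 1, 0)   # toehold "t" binds its complement "t*"
g.bind(0, 1, 1, 1)   # domain "x" binds "x*"
print(len(g.bonds))  # 2
```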

  8. Developing an Efficient Computational Method that Estimates the Ability of Students in a Web-Based Learning Environment

    ERIC Educational Resources Information Center

    Lee, Young-Jin

    2012-01-01

    This paper presents a computational method that can efficiently estimate the ability of students from the log files of a Web-based learning environment capturing their problem solving processes. The computational method developed in this study approximates the posterior distribution of the student's ability obtained from the conventional Bayes…

  9. A detailed experimental study of a DNA computer with two endonucleases.

    PubMed

    Sakowski, Sebastian; Krasiński, Tadeusz; Sarnik, Joanna; Blasiak, Janusz; Waldmajer, Jacek; Poplawski, Tomasz

    2017-07-14

    Great advances in biotechnology have allowed the construction of a computer from DNA. One of the proposed solutions is a biomolecular finite automaton, a simple two-state DNA computer without memory, which was presented by Ehud Shapiro's group at the Weizmann Institute of Science. The main problem with this computer, in which biomolecules carry out logical operations, is scaling up its complexity, i.e. increasing the number of states of biomolecular automata. In this study, we constructed (in laboratory conditions) a six-state DNA computer that uses two endonucleases (e.g. AcuI and BbvI) and a ligase. We have presented a detailed experimental verification of its feasibility. We described the effect of the number of states, the length of input data, and nondeterminism on the computing process. We also tested different automata (with three, four, and six states) running on various accepted input words of different lengths such as ab, aab, aaab, ababa, and on an unaccepted word ba. Moreover, this article presents the reaction optimization and the methods of eliminating certain biochemical problems occurring in the implementation of a biomolecular DNA automaton based on two endonucleases.
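
    The behaviour being verified can be sketched in software as an ordinary finite automaton. The transition table below is hypothetical, chosen only so that the listed words behave as described; the laboratory automata encode their transitions in endonuclease recognition and cut sites.

```python
# Software sketch of a finite automaton like the one implemented in DNA.
# The transition table T is hypothetical, picked so that ab, aab, aaab
# and ababa are accepted while ba is rejected.

def run(word, transitions, start, accepting):
    state = start
    for symbol in word:
        if (state, symbol) not in transitions:
            return False  # no applicable transition: reject
        state = transitions[(state, symbol)]
    return state in accepting

# Two states; any word beginning with 'a' is accepted.
T = {("S0", "a"): "S1", ("S1", "a"): "S1", ("S1", "b"): "S1"}

for w in ["ab", "aab", "aaab", "ababa", "ba"]:
    print(w, run(w, T, "S0", {"S1"}))
```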

  10. ASKI: A modular toolbox for scattering-integral-based seismic full waveform inversion and sensitivity analysis utilizing external forward codes

    NASA Astrophysics Data System (ADS)

    Schumacher, Florian; Friederich, Wolfgang

    Due to increasing computational resources, the development of new numerically demanding methods and software for imaging Earth's interior remains of high interest in Earth sciences. Here, we give a description from a user's and programmer's perspective of the highly modular, flexible and extendable software package ASKI-Analysis of Sensitivity and Kernel Inversion-recently developed for iterative scattering-integral-based seismic full waveform inversion. In ASKI, the three fundamental steps of solving the seismic forward problem, computing waveform sensitivity kernels and deriving a model update are solved by independent software programs that interact via file output/input only. Furthermore, the spatial discretizations of the model space used for solving the seismic forward problem and for deriving model updates, respectively, are kept completely independent. For this reason, ASKI does not contain a specific forward solver but instead provides a general interface to established community wave propagation codes. Moreover, the third fundamental step of deriving a model update can be repeated at relatively low costs applying different kinds of model regularization or re-selecting/weighting the inverted dataset without need to re-solve the forward problem or re-compute the kernels. Additionally, ASKI offers the user sensitivity and resolution analysis tools based on the full sensitivity matrix and allows to compose customized workflows in a consistent computational environment. ASKI is written in modern Fortran and Python, it is well documented and freely available under terms of the GNU General Public License (http://www.rub.de/aski).

  11. Research for the Fluid Field of the Centrifugal Compressor Impeller in Accelerating Startup

    NASA Astrophysics Data System (ADS)

    Li, Xiaozhu; Chen, Gang; Zhu, Changyun; Qin, Guoliang

    2013-03-01

    In order to study the flow field in the impeller during the accelerating start-up process of a centrifugal compressor, the 3-D and 1-D transient accelerating-flow governing equations along a streamline in the impeller are derived in detail, an assumption on the pressure-gradient distribution is presented, and a solving method for the 1-D transient accelerating flow field is given based on this assumption. The solving method was implemented in a program and computed results were obtained. Comparison shows that the computed results agree with the test data, proving the feasibility and effectiveness of the proposed method for solving the accelerating start-up problem of a centrifugal compressor.

  12. Diagnostic Problem-Solving Process in Professional Contexts: Theory and Empirical Investigation in the Context of Car Mechatronics Using Computer-Generated Log-Files

    ERIC Educational Resources Information Center

    Abele, Stephan

    2018-01-01

    This article deals with a theory-based investigation of the diagnostic problem-solving process in professional contexts. To begin with, a theory of the diagnostic problem-solving process was developed drawing on findings from different professional contexts. The theory distinguishes between four sub-processes of the diagnostic problem-solving…

  13. Using Educational Data Mining Methods to Assess Field-Dependent and Field-Independent Learners' Complex Problem Solving

    ERIC Educational Resources Information Center

    Angeli, Charoula; Valanides, Nicos

    2013-01-01

    The present study investigated the problem-solving performance of 101 university students and their interactions with a computer modeling tool in order to solve a complex problem. Based on their performance on the hidden figures test, students were assigned to three groups of field-dependent (FD), field-mixed (FM), and field-independent (FI)…

  14. Programmable energy landscapes for kinetic control of DNA strand displacement.

    PubMed

    Machinek, Robert R F; Ouldridge, Thomas E; Haley, Natalie E C; Bath, Jonathan; Turberfield, Andrew J

    2014-11-10

    DNA is used to construct synthetic systems that sense, actuate, move and compute. The operation of many dynamic DNA devices depends on toehold-mediated strand displacement, by which one DNA strand displaces another from a duplex. Kinetic control of strand displacement is particularly important in autonomous molecular machinery and molecular computation, in which non-equilibrium systems are controlled through rates of competing processes. Here, we introduce a new method based on the creation of mismatched base pairs as kinetic barriers to strand displacement. Reaction rate constants can be tuned across three orders of magnitude by altering the position of such a defect without significantly changing the stabilities of reactants or products. By modelling reaction free-energy landscapes, we explore the mechanistic basis of this control mechanism. We also demonstrate that oxDNA, a coarse-grained model of DNA, is capable of accurately predicting and explaining the impact of mismatches on displacement kinetics.

  15. Modelling of DNA-protein recognition

    NASA Technical Reports Server (NTRS)

    Rein, R.; Garduno, R.; Colombano, S.; Nir, S.; Haydock, K.; Macelroy, R. D.

    1980-01-01

    Computer model-building procedures using stereochemical principles together with theoretical energy calculations appear to be, at this stage, the most promising route toward the elucidation of DNA-protein binding schemes and recognition principles. A review of models and bonding principles is conducted and approaches to modeling are considered, taking into account possible di-hydrogen-bonding schemes between a peptide and a base (or a base pair) of a double-stranded nucleic acid in the major groove, aspects of computer graphic modeling, and a search for isogeometric helices. The energetics of recognition complexes is discussed and several models for peptide DNA recognition are presented.

  16. Organization of the secure distributed computing based on multi-agent system

    NASA Astrophysics Data System (ADS)

    Khovanskov, Sergey; Rumyantsev, Konstantin; Khovanskova, Vera

    2018-04-01

    Nowadays, developing methods for distributed computing receives much attention. One such method is the use of multi-agent systems. Distributed computing organized over conventional networked computers can experience security threats posed by the computational processes themselves. The authors have developed a unified agent algorithm for a control system governing the operation of computing-network nodes, with networked PCs used as the computing nodes. The proposed multi-agent control system makes it possible, in a short time, to harness the processing power of the computers of any existing network to solve a large task by creating a distributed computing system. Agents deployed on a computer network can configure the distributed computing system, distribute the computational load among the computers they operate, and optimize the distributed computing system according to the computing power of the computers on the network. The number of computers connected to the network can be increased by connecting new computers to the system, which increases the overall processing power. Adding a central agent to the multi-agent system increases the security of the distributed computing. This organization of the distributed computing system reduces the problem-solving time and increases the fault tolerance (vitality) of computing processes in a changing computing environment (dynamic change of the number of computers on the network). The developed multi-agent system detects cases of falsification of the results of the distributed system, which could otherwise lead to wrong decisions. In addition, the system checks and corrects wrong results.

  17. A novel image encryption algorithm based on the chaotic system and DNA computing

    NASA Astrophysics Data System (ADS)

    Chai, Xiuli; Gan, Zhihua; Lu, Yang; Chen, Yiran; Han, Daojun

    A novel image encryption algorithm using the chaotic system and deoxyribonucleic acid (DNA) computing is presented. Unlike traditional encryption methods, the permutation and diffusion of our method operate on a 3D DNA matrix. Firstly, a 3D DNA matrix is obtained through bit-plane splitting, bit-plane recombination, and DNA encoding of the plain image. Secondly, 3D DNA level permutation based on position sequence group (3DDNALPBPSG) is introduced, and chaotic sequences generated by the chaotic system are employed to permute the positions of the elements of the 3D DNA matrix. Thirdly, 3D DNA level diffusion (3DDNALD) is given: the confused 3D DNA matrix is split into sub-blocks, and a block-wise XOR operation is applied between the sub-DNA matrices and the key DNA matrix derived from the chaotic system. Finally, by decoding the diffused DNA matrix, we obtain the cipher image. The SHA-256 hash of the plain image is employed to calculate the initial values of the chaotic system to resist chosen-plaintext attacks. Experimental results and security analyses show that our scheme is secure against several known attacks and can effectively protect the security of images.
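    As an illustration of the DNA-level operations the abstract describes, the sketch below encodes bytes as DNA bases under one common encoding rule (A=00, C=01, G=10, T=11 is assumed here for concreteness; the paper's actual rule selection is not specified in the abstract) and applies a base-wise XOR, the diffusion primitive:

```python
# One common DNA encoding rule (an assumption for illustration):
# A=00, C=01, G=10, T=11.
ENC = {0b00: 'A', 0b01: 'C', 0b10: 'G', 0b11: 'T'}
DEC = {v: k for k, v in ENC.items()}

def dna_encode(byte):
    """Encode one byte as four DNA bases, two bits per base (MSB first)."""
    return ''.join(ENC[(byte >> shift) & 0b11] for shift in (6, 4, 2, 0))

def dna_decode(bases):
    """Decode four DNA bases back into one byte."""
    byte = 0
    for b in bases:
        byte = (byte << 2) | DEC[b]
    return byte

def dna_xor(s1, s2):
    """Base-wise XOR of two equal-length DNA strings under the rule above."""
    return ''.join(ENC[DEC[a] ^ DEC[b]] for a, b in zip(s1, s2))
```

    Because XOR is its own inverse, decoding `dna_xor(cipher, key)` recovers the plain bits, which is what makes the block-wise diffusion step reversible at decryption time.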

  18. [Computer-assisted education in problem-solving in neurology; a randomized educational study].

    PubMed

    Weverling, G J; Stam, J; ten Cate, T J; van Crevel, H

    1996-02-24

    To determine the effect of computer-based medical teaching (CBMT) as a supplementary method to teach clinical problem-solving during the clerkship in neurology. Randomized controlled blinded study. Academic Medical Centre, Amsterdam, the Netherlands. 103 Students were assigned at random to a group with access to CBMT and a control group. CBMT consisted of 20 computer-simulated patients with neurological diseases, and was permanently available during five weeks to students in the CBMT group. The ability to recognize and solve neurological problems was assessed with two free-response tests, scored by two blinded observers. The CBMT students scored significantly better on the test related to the CBMT cases (mean score 7.5 on a zero to 10 point scale; control group 6.2; p < 0.001). There was no significant difference on the control test not related to the problems practised with CBMT. CBMT can be an effective method for teaching clinical problem-solving, when used as a supplementary teaching facility during a clinical clerkship. The increased ability to solve problems learned by CBMT had no demonstrable effect on the performance with other neurological problems.

  19. Discovery of a new motion mechanism of biomotors similar to the earth revolving around the sun without rotation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Guo, Peixuan, E-mail: peixuan.guo@uky.edu; Schwartz, Chad; Haak, Jeannie

    Biomotors have been classified into linear and rotational motors. For 35 years, it has been popularly believed that viral dsDNA-packaging apparatuses are pentameric rotation motors. Recently, a third class of hexameric motor has been found in bacteriophage phi29 that utilizes a mechanism of revolution without rotation, friction, coiling, or torque. This review addresses how packaging motors control dsDNA one-way traffic; how four electropositive layers in the channel interact with the electronegative phosphate backbone to generate four steps in translocating one dsDNA helix; how motors resolve the mismatch between 10.5 bases and 12 connector subunits per cycle of revolution; and how ATP regulates sequential action of motor ATPase. Since motors with all numbers of subunits can utilize the revolution mechanism, this finding helps resolve puzzles and debates concerning the oligomeric nature of packaging motors in many phage systems. This revolution mechanism helps to solve the undesirable dsDNA supercoiling issue involved in rotation. - Highlights: • New motion mechanism of revolution without rotation found for phi29 DNA packaging. • Revolution motor finding expands classical linear and rotation biomotor classes. • Revolution motors transport dsDNA unidirectionally without supercoiling. • New mechanism solves many puzzles, mysteries, and debates in biomotor studies. • Motors with all numbers of subunits can utilize the revolution mechanism.

  20. DnaSAM: Software to perform neutrality testing for large datasets with complex null models.

    PubMed

    Eckert, Andrew J; Liechty, John D; Tearse, Brandon R; Pande, Barnaly; Neale, David B

    2010-05-01

    Patterns of DNA sequence polymorphisms can be used to understand the processes of demography and adaptation within natural populations. High-throughput generation of DNA sequence data has historically been the bottleneck with respect to data processing and experimental inference. Advances in marker technologies have largely solved this problem. Currently, the limiting step is computational, with most molecular population genetic software allowing only a gene-by-gene analysis through a graphical user interface. An easy-to-use analysis program that allows both high-throughput processing of multiple sequence alignments and the flexibility to simulate data under complex demographic scenarios is currently lacking. We introduce a new program, named DnaSAM, which allows high-throughput estimation of DNA sequence diversity and neutrality statistics from experimental data, along with the ability to test those statistics via Monte Carlo coalescent simulations. These simulations are conducted using the ms program, which is able to incorporate several genetic parameters (e.g. recombination) and demographic scenarios (e.g. population bottlenecks). The output is a set of diversity and neutrality statistics with associated probability values under a user-specified null model, stored in an easy-to-manipulate text file. © 2009 Blackwell Publishing Ltd.

  1. Research on Computers in Mathematics Education, IV. The Use of Computers in Mathematics Education Resource Series.

    ERIC Educational Resources Information Center

    Kieren, Thomas E.

    This last paper in a set of four reviews research on a wide variety of computer applications in the mathematics classroom. It covers computer-based instruction, especially drill-and-practice and tutorial modes; computer-managed instruction; and computer-augmented problem-solving. Analytical comments on the findings and status of the research are…

  2. An Expert System Shell to Teach Problem Solving.

    ERIC Educational Resources Information Center

    Lippert, Renate C.

    1988-01-01

    Discusses the use of expert systems to teach problem-solving skills to students from grade 6 to college level. The role of computer technology in the future of education is considered, and the construction of knowledge bases is described, including an example for physics. (LRW)

  3. The Meselson-Stahl Experiment

    ERIC Educational Resources Information Center

    Szeberenyi, Jozsef

    2012-01-01

    Terms to be familiar with before you start to solve the test: DNA replication, nitrogen isotopes, density labeling, cesium chloride density gradient centrifugation, ultraviolet absorption, DNA denaturation, circular and linear DNA, superspiralization, superhelical DNA, and template.

  4. Tutoring Mathematical Word Problems Using Solution Trees: Text Comprehension, Situation Comprehension, and Mathematization in Solving Story Problems. Research Report No. 8.

    ERIC Educational Resources Information Center

    Reusser, Kurt; And Others

    The main concern of this paper is on the psychological processes of how students understand and solve mathematical word problems, and on how this knowledge can be applied to computer-based tutoring. It is argued that only a better understanding of the psychological requirements for understanding and solving those problems will lead to…

  5. An efficient parallel algorithm for the solution of a tridiagonal linear system of equations

    NASA Technical Reports Server (NTRS)

    Stone, H. S.

    1971-01-01

    Tridiagonal linear systems of equations are solved on conventional serial machines in a time proportional to N, where N is the number of equations. The conventional algorithms do not lend themselves directly to parallel computations on computers of the ILLIAC IV class, in the sense that they appear to be inherently serial. An efficient parallel algorithm is presented in which computation time grows as log₂ N. The algorithm is based on recursive doubling solutions of linear recurrence relations, and can be used to solve recurrence relations of all orders.
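    The recursive-doubling idea can be illustrated on the simpler first-order recurrence x_i = a_i·x_{i-1} + b_i (a minimal sketch of the underlying technique, not Stone's full tridiagonal algorithm): coefficient pairs are composed across doubling distances, and each of the log₂ N sweeps is independent across i, hence parallelizable:

```python
def recurrence_serial(a, b, x0):
    """Reference O(N) serial evaluation of x_i = a_i * x_{i-1} + b_i."""
    xs, x = [], x0
    for ai, bi in zip(a, b):
        x = ai * x + bi
        xs.append(x)
    return xs

def recurrence_doubling(a, b, x0):
    """Recursive doubling: after log2(N) sweeps, the pair (A[i], B[i])
    maps the initial value x0 directly to x_i.  Each sweep's inner loop
    has no dependence across i, so it can run fully in parallel."""
    n = len(a)
    A, B = list(a), list(b)
    d = 1
    while d < n:
        # Compose the affine map at i with the map ending d steps earlier:
        # x -> A[i] * (A[i-d] * x + B[i-d]) + B[i]
        A2, B2 = A[:], B[:]
        for i in range(d, n):
            A2[i] = A[i] * A[i - d]
            B2[i] = A[i] * B[i - d] + B[i]
        A, B, d = A2, B2, 2 * d
    return [A[i] * x0 + B[i] for i in range(n)]
```

    Both routines produce identical results; the doubling version trades O(N) serial steps for O(log N) parallel sweeps, which is the speedup the abstract claims for the tridiagonal case.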

  6. An Efficient Rank Based Approach for Closest String and Closest Substring

    PubMed Central

    2012-01-01

    This paper aims to present a new genetic approach that uses rank distance for solving two known NP-hard problems, and to compare rank distance with other distance measures for strings. The two NP-hard problems we are trying to solve are closest string and closest substring. For each problem we build a genetic algorithm and we describe the genetic operations involved. Both genetic algorithms use a fitness function based on rank distance. We compare our algorithms with other genetic algorithms that use different distance measures, such as Hamming distance or Levenshtein distance, on real DNA sequences. Our experiments show that the genetic algorithms based on rank distance have the best results. PMID:22675483
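    Rank distance (as defined by Dinu) indexes each occurrence of a symbol and sums the absolute differences of its positions in the two strings, with unmatched occurrences contributing their own positions. A minimal sketch of the measure used as the fitness basis above:

```python
from collections import defaultdict

def ranked(s):
    """Map each indexed occurrence, e.g. ('A', 2) for the second 'A',
    to its 1-based position in the string."""
    count = defaultdict(int)
    positions = {}
    for pos, ch in enumerate(s, start=1):
        count[ch] += 1
        positions[(ch, count[ch])] = pos
    return positions

def rank_distance(s, t):
    """Sum of |position difference| over all indexed occurrences;
    occurrences present in only one string contribute their position."""
    a, b = ranked(s), ranked(t)
    dist = 0
    for key in a.keys() | b.keys():
        if key in a and key in b:
            dist += abs(a[key] - b[key])
        else:
            dist += a.get(key, 0) + b.get(key, 0)
    return dist
```

    Unlike Hamming distance, rank distance is defined for strings of different lengths, which matters when comparing candidate strings against DNA sequences of varying size.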

  7. Chromatin Computation

    PubMed Central

    Bryant, Barbara

    2012-01-01

    In living cells, DNA is packaged along with protein and RNA into chromatin. Chemical modifications to nucleotides and histone proteins are added, removed and recognized by multi-functional molecular complexes. Here I define a new computational model, in which chromatin modifications are information units that can be written onto a one-dimensional string of nucleosomes, analogous to the symbols written onto cells of a Turing machine tape, and chromatin-modifying complexes are modeled as read-write rules that operate on a finite set of adjacent nucleosomes. I illustrate the use of this “chromatin computer” to solve an instance of the Hamiltonian path problem. I prove that chromatin computers are computationally universal – and therefore more powerful than the logic circuits often used to model transcription factor control of gene expression. Features of biological chromatin provide a rich instruction set for efficient computation of nontrivial algorithms in biological time scales. Modeling chromatin as a computer shifts how we think about chromatin function, suggests new approaches to medical intervention, and lays the groundwork for the engineering of a new class of biological computing machines. PMID:22567109

  8. Network Community Detection based on the Physarum-inspired Computational Framework.

    PubMed

    Gao, Chao; Liang, Mingxin; Li, Xianghua; Zhang, Zili; Wang, Zhen; Zhou, Zhili

    2016-12-13

    Community detection is a crucial and essential problem in the structure analytics of complex networks, which can help us understand and predict the characteristics and functions of complex networks. Many methods, ranging from optimization-based algorithms to heuristic-based algorithms, have been proposed for solving such a problem. Due to the inherent complexity of identifying network structure, how to design an effective algorithm with higher accuracy and lower computational cost still remains an open problem. Inspired by the computational capability and positive feedback mechanism of the foraging process of Physarum, a large amoeba-like cell consisting of a dendritic network of tube-like pseudopodia, a general Physarum-based computational framework for community detection is proposed in this paper. Based on the proposed framework, the inter-community edges can be distinguished from the intra-community edges in a network, and the positive feedback of the solving process in an algorithm can be further enhanced; these are used to improve the efficiency of original optimization-based and heuristic-based community detection algorithms, respectively. Some typical algorithms (e.g., genetic algorithm, ant colony optimization algorithm, and Markov clustering algorithm) and real-world datasets have been used to evaluate the efficiency of our proposed computational framework. Experiments show that the algorithms optimized by the Physarum-inspired computational framework perform better than the original ones, in terms of accuracy and computational cost. Moreover, a computational complexity analysis verifies the scalability of our framework.

  9. Inverse Problem in Self-assembly

    NASA Astrophysics Data System (ADS)

    Tkachenko, Alexei

    2012-02-01

    By decorating colloids and nanoparticles with DNA, one can introduce highly selective key-lock interactions between them. This leads to a new class of systems and problems in soft condensed matter physics. In particular, it opens a possibility to solve the inverse problem in self-assembly: how to build an arbitrary desired structure with the bottom-up approach? I will present a theoretical and computational analysis of the hierarchical strategy for attacking this problem. It involves self-assembly of particular building blocks (``octopus particles''), which in turn would assemble into the target structure. On a conceptual level, our approach combines elements of three different brands of programmable self-assembly: DNA nanotechnology, nanoparticle-DNA assemblies and patchy colloids. I will discuss the general design principles, theoretical and practical limitations of this approach, and illustrate them with our simulation results. Our crucial result is that not only is it possible to design a system that has a given nanostructure as its ground state, but one can also program and optimize the kinetic pathway for its self-assembly.

  10. GUI to Facilitate Research on Biological Damage from Radiation

    NASA Technical Reports Server (NTRS)

    Cucinotta, Frances A.; Ponomarev, Artem Lvovich

    2010-01-01

    A graphical-user-interface (GUI) computer program has been developed to facilitate research on the damage caused by highly energetic particles and photons impinging on living organisms. The program brings together, into one computational workspace, computer codes that have been developed over the years, plus codes that will be developed during the foreseeable future, to address diverse aspects of radiation damage. These include codes that implement radiation-track models, codes for biophysical models of breakage of deoxyribonucleic acid (DNA) by radiation, pattern-recognition programs for extracting quantitative information from biological assays, and image-processing programs that aid visualization of DNA breaks. The radiation-track models are based on transport models of interactions of radiation with matter and solution of the Boltzmann transport equation by use of both theoretical and numerical models. The biophysical models of breakage of DNA by radiation include biopolymer coarse-grained and atomistic models of DNA, stochastic- process models of deposition of energy, and Markov-based probabilistic models of placement of double-strand breaks in DNA. The program is designed for use in the NT, 95, 98, 2000, ME, and XP variants of the Windows operating system.

  11. Comparative Analysis of Palm and Wearable Computers for Participatory Simulations

    ERIC Educational Resources Information Center

    Klopfer, Eric; Yoon, Susan; Rivas, Luz

    2004-01-01

    Recent educational computer-based technologies have offered promising lines of research that promote social constructivist learning goals, develop skills required to operate in a knowledge-based economy (Roschelle et al. 2000), and enable more authentic science-like problem-solving. In our research programme, we have been interested in combining…

  12. Pedagogy and/or technology: Making difference in improving students' problem solving skills

    NASA Astrophysics Data System (ADS)

    Hrepic, Zdeslav; Lodder, Katherine; Shaw, Kimberly A.

    2013-01-01

    Pen input computers combined with interactive software may have substantial potential for promoting active instructional methodologies and for facilitating students' problem solving ability. An excellent example is a study in which introductory physics students improved retention, conceptual understanding and problem solving abilities when one of three weekly lectures was replaced with group problem solving sessions facilitated with Tablet PCs and DyKnow software [1,2]. The research goal of the present study was to isolate the effect of the methodology itself (using additional time to teach problem solving) from that of the involved technology. In Fall 2011 we compared the performance of students taking the same introductory physics lecture course while enrolled in two separate problem-solving sections. One section used pen-based computing to facilitate group problem solving while the other section used low-tech methods for one third of the semester (covering Kinematics), and then traded technologies for the middle third of the term (covering Dynamics). Analysis of quiz, exam and standardized pre-post test results indicated no significant difference in scores of the two groups. Combining this result with those of previous studies implies primacy of pedagogy (collaborative problem solving itself) over technology for student learning in problem solving recitations.

  13. DNA-informed breeding of rosaceous crops: promises, progress and prospects

    PubMed Central

    Peace, Cameron P

    2017-01-01

    Crops of the Rosaceae family provide valuable contributions to rural economies and human health and enjoyment. Sustained solutions to production challenges and market demands can be met with genetically improved new cultivars. Traditional rosaceous crop breeding is expensive and time-consuming and would benefit from improvements in efficiency and accuracy. Use of DNA information is becoming conventional in rosaceous crop breeding, contributing to many decisions and operations, but only after past decades of solved challenges and generation of sufficient resources. Successes in deployment of DNA-based knowledge and tools have arisen when the ‘chasm’ between genomics discoveries and practical application is bridged systematically. Key steps are establishing breeder desire for use of DNA information, adapting tools to local breeding utility, identifying efficient application schemes, accessing effective services in DNA-based diagnostics and gaining experience in integrating DNA information into breeding operations and decisions. DNA-informed germplasm characterization for revealing identity and relatedness has benefitted many programs and provides a compelling entry point to reaping benefits of genomics research. DNA-informed germplasm evaluation for predicting trait performance has enabled effective reallocation of breeding resources when applied in pioneering programs. DNA-based diagnostics is now expanding from specific loci to genome-wide considerations. Realizing the full potential of this expansion will require improved accuracy of predictions, multi-trait DNA profiling capabilities, streamlined breeding information management systems, strategies that overcome plant-based features that limit breeding progress and widespread training of current and future breeding personnel and allied scientists. PMID:28326185

  14. A Feature Selection Algorithm to Compute Gene Centric Methylation from Probe Level Methylation Data.

    PubMed

    Baur, Brittany; Bozdag, Serdar

    2016-01-01

    DNA methylation is an important epigenetic event that affects gene expression during development and in various diseases such as cancer. Understanding its mechanism of action is important for downstream analysis. In the Illumina Infinium HumanMethylation 450K array, there are tens of probes associated with each gene. Given the methylation intensities of all these probes, it is necessary to compute which of them are most representative of the gene centric methylation level. In this study, we developed a feature selection algorithm based on sequential forward selection that utilized different classification methods to compute gene centric DNA methylation using probe level DNA methylation data. We compared our algorithm to other feature selection algorithms, such as support vector machines with recursive feature elimination, genetic algorithms and ReliefF. We evaluated all methods based on the predictive power of the selected probes on their mRNA expression levels and found that a K-Nearest Neighbors classification using the sequential forward selection algorithm performed better than the other algorithms on all metrics. We also observed that the transcriptional activities of certain genes were more sensitive to DNA methylation changes than those of other genes. Our algorithm was able to predict the expression of those genes with high accuracy using only DNA methylation data. Our results also showed that those DNA methylation-sensitive genes were enriched in Gene Ontology terms related to the regulation of various biological processes.
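    The greedy core of sequential forward selection is simple to sketch: repeatedly add whichever candidate probe most improves a scoring function, stopping when no addition helps. In the sketch below a user-supplied `score` callable stands in for the paper's cross-validated K-Nearest Neighbors classifier; all names are illustrative:

```python
def sequential_forward_selection(candidates, score, max_features=None):
    """Greedy SFS: repeatedly add the candidate feature that maximizes
    score(selected + [candidate]); stop when no addition improves the
    score or the optional feature budget is exhausted."""
    selected, remaining = [], list(candidates)
    best_score = score(selected)
    while remaining and (max_features is None or len(selected) < max_features):
        trial = max(remaining, key=lambda f: score(selected + [f]))
        trial_score = score(selected + [trial])
        if trial_score <= best_score:
            break  # no remaining feature improves the score
        selected.append(trial)
        remaining.remove(trial)
        best_score = trial_score
    return selected
```

    Plugging in a classifier-accuracy score penalized by feature count, as one would for probe selection, makes the stopping rule favor a small, predictive probe subset.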

  15. Manage Your Life Online (MYLO): a pilot trial of a conversational computer-based intervention for problem solving in a student sample.

    PubMed

    Gaffney, Hannah; Mansell, Warren; Edwards, Rachel; Wright, Jason

    2014-11-01

    Computerized self-help that has an interactive, conversational format holds several advantages, such as flexibility across presenting problems and ease of use. We designed a new program called MYLO that utilizes the principles of Method of Levels (MOL) therapy, based upon Perceptual Control Theory (PCT). We tested the efficacy of MYLO, tested whether the psychological change mechanisms described by PCT mediated its efficacy, and evaluated the effects of client expectancy. Forty-eight student participants were randomly assigned to MYLO or a comparison program, ELIZA. Participants discussed a problem they were currently experiencing with their assigned program and completed measures of distress, resolution and expectancy preintervention, postintervention and at 2-week follow-up. MYLO and ELIZA were associated with reductions in distress, depression, anxiety and stress. MYLO was considered more helpful and led to greater problem resolution. The psychological change processes predicted higher ratings of MYLO's helpfulness and reductions in distress. Positive expectancies towards computer-based problem solving correlated with MYLO's perceived helpfulness and greater problem resolution, and this was partly mediated by the psychological change processes identified. The findings provide provisional support for the acceptability of the MYLO program in a non-clinical sample, although its efficacy as an innovative computer-based aid to problem solving remains unclear. Nevertheless, the findings provide tentative early support for the mechanisms of psychological change identified within PCT and highlight the importance of client expectations in predicting engagement in computer-based self-help.

  16. Testing the effectiveness of problem-based learning with learning-disabled students in biology

    NASA Astrophysics Data System (ADS)

    Guerrera, Claudia Patrizia

    The purpose of the present study was to investigate the effects of problem-based learning (PBL) with learning-disabled (LD) students. Twenty-four students (12 dyads) classified as LD and attending a school for the learning-disabled participated in the study. Students engaged in either a computer-based environment involving BioWorld, a hospital simulation designed to teach biology students problem-solving skills, or a paper-and-pencil version based on the computer program. A hybrid model of learning was adopted whereby students were provided with direct instruction on the digestive system prior to participating in a problem-solving activity. Students worked in dyads and solved three problems involving the digestive system in either a computerized or a paper-and-pencil condition. The experimenter acted as a coach to assist students throughout the problem-solving process. A follow-up study was conducted, one month later, to measure the long-term learning gains. Quantitative and qualitative methods were used to analyze three types of data: process data, outcome data, and follow-up data. Results from the process data showed that all students engaged in effective collaboration and became more systematic in their problem solving over time. Findings from the outcome and follow-up data showed that students in both treatment conditions, made both learning and motivational gains and that these benefits were still evident one month later. Overall, results demonstrated that the computer facilitated students' problem solving and scientific reasoning skills. Some differences were noted in students' collaboration and the amount of assistance required from the coach in both conditions. Thus, PBL is an effective learning approach with LD students in science, regardless of the type of learning environment. These results have implications for teaching science to LD students, as well as for future designs of educational software for this population.

  17. Cloud-based MOTIFSIM: Detecting Similarity in Large DNA Motif Data Sets.

    PubMed

    Tran, Ngoc Tam L; Huang, Chun-Hsi

    2017-05-01

    We developed the cloud-based MOTIFSIM on the Amazon Web Services (AWS) cloud. The tool is an extended version of our web-based tool, version 2.0, which was developed based on a novel algorithm for detecting similarity in multiple DNA motif data sets. This cloud-based version further allows researchers to exploit the computing resources available from AWS to detect similarity in multiple large-scale DNA motif data sets resulting from next-generation sequencing technology. The tool is highly scalable with the expandable computing resources of AWS.

  18. Flood inundation extent mapping based on block compressed tracing

    NASA Astrophysics Data System (ADS)

    Shen, Dingtao; Rui, Yikang; Wang, Jiechen; Zhang, Yu; Cheng, Liang

    2015-07-01

    Flood inundation extent, depth, and duration are important factors affecting flood hazard evaluation. At present, flood inundation analysis is based mainly on a seeded region-growing algorithm, which is an inefficient process because it requires excessive recursive computations and is incapable of processing massive datasets. To address this problem, we propose a block compressed tracing algorithm for mapping the flood inundation extent, which reads the DEM data in blocks before transferring them to raster compression storage. This allows a smaller computer memory to process a larger amount of data, which solves the dataset-size problem of the regular seeded region-growing algorithm. In addition, the use of a raster boundary tracing technique allows the algorithm to avoid the time-consuming computations required by seeded region-growing. Finally, we conduct a comparative evaluation in the Chin-sha River basin; the results show that the proposed method solves the problem of flood inundation extent mapping from massive DEM datasets with higher computational efficiency than the original method, making it suitable for practical applications.
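    For contrast with the proposed method, the baseline seeded region-growing fill can be written iteratively with an explicit stack (a minimal sketch on a toy in-memory DEM grid; it sidesteps recursion-depth limits but still visits every flooded cell and holds the whole DEM in memory, which are the costs the block compressed tracing approach targets):

```python
def inundation_extent(dem, seed, water_level):
    """Return the set of cells 4-connected to `seed` whose elevation is
    below `water_level` (iterative seeded region growing)."""
    rows, cols = len(dem), len(dem[0])
    flooded, stack = set(), [seed]
    while stack:
        r, c = stack.pop()
        if (r, c) in flooded or not (0 <= r < rows and 0 <= c < cols):
            continue
        if dem[r][c] >= water_level:
            continue  # dry cell: elevation at or above the water level
        flooded.add((r, c))
        stack.extend([(r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)])
    return flooded
```

    On a real DEM, every flooded cell is pushed and popped once, so runtime and memory both scale with the inundated area; boundary tracing instead walks only the perimeter of that area.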

  19. Assessing Cognitive Learning of Analytical Problem Solving

    NASA Astrophysics Data System (ADS)

    Billionniere, Elodie V.

    Introductory programming courses, also known as CS1, have a specific set of expected outcomes related to the learning of the most basic and essential computational concepts in computer science (CS). However, two of the most often heard complaints about such courses are that (1) they are divorced from the reality of application and (2) they make the learning of the basic concepts tedious. The concepts introduced in CS1 courses are highly abstract and not easily comprehensible. In general, the difficulty is intrinsic to the field of computing, often described as "too mathematical or too abstract." This dissertation presents a small-scale mixed method study conducted during the fall 2009 semester of CS1 courses at Arizona State University. This study explored and assessed students' comprehension of three core computational concepts---abstraction, arrays of objects, and inheritance---in both algorithm design and problem solving. Through this investigation, students' profiles were categorized based on their scores, and their mistakes were categorized into instances of five computational thinking concepts: abstraction, algorithm, scalability, linguistics, and reasoning. It was shown that even though the notion of computational thinking is not explicit in the curriculum, participants possessed and/or developed this skill through the learning and application of the CS1 core concepts. Furthermore, problem-solving experiences had a direct impact on participants' knowledge skills, explanation skills, and confidence. Implications for teaching CS1 and for future research are also considered.

  20. The Ins and Outs of DNA Fingerprinting the Infectious Fungi

    PubMed Central

    Soll, David R.

    2000-01-01

    DNA fingerprinting methods have evolved as major tools in fungal epidemiology. However, no single method has emerged as the method of choice, and some methods perform better than others at different levels of resolution. In this review, requirements for an effective DNA fingerprinting method are proposed and procedures are described for testing the efficacy of a method. In light of the proposed requirements, the most common methods now being used to DNA fingerprint the infectious fungi are described and assessed. These methods include restriction fragment length polymorphisms (RFLP), RFLP with hybridization probes, randomly amplified polymorphic DNA and other PCR-based methods, electrophoretic karyotyping, and sequencing-based methods. Procedures for computing similarity coefficients, generating phylogenetic trees, and testing the stability of clusters are then described. To facilitate the analysis of DNA fingerprinting data, computer-assisted methods are described. Finally, the problems inherent in the collection of test and control isolates are considered, and DNA fingerprinting studies of strain maintenance during persistent or recurrent infections, microevolution in infecting strains, and the origin of nosocomial infections are assessed in light of the preceding discussion of the ins and outs of DNA fingerprinting. The intent of this review is to generate an awareness of the need to verify the efficacy of each DNA fingerprinting method for the level of genetic relatedness necessary to answer the epidemiological question posed, to use quantitative methods to analyze DNA fingerprint data, to use computer-assisted DNA fingerprint analysis systems to analyze data, and to file data in a form that can be used in the future for retrospective and comparative studies. PMID:10756003
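
    As one concrete example of the similarity coefficients the review discusses, a Dice-type band-sharing coefficient, S_AB = 2 n_AB / (n_A + n_B), can be computed directly from two band patterns (a generic illustration; the review covers several alternative coefficients, and the function name is an assumption):

```python
def band_sharing_coefficient(bands_a, bands_b):
    """Dice-type similarity between two fingerprint band patterns:
    S_AB = 2 * n_AB / (n_A + n_B), where n_AB counts shared bands."""
    set_a, set_b = set(bands_a), set(bands_b)
    shared = len(set_a & set_b)
    return 2.0 * shared / (len(set_a) + len(set_b))
```

    A matrix of such pairwise coefficients is the usual input to the clustering and tree-building steps described next in the review.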

  1. A Research Study of Computer-Based Tutoring of Mathematical and Scientific Knowledge. Final Technical Report.

    ERIC Educational Resources Information Center

    Goldstein, Ira

    Computer coaching of students as an aid in problem-solving instruction is discussed. This report describes an advanced form of computer-assisted instruction that must not only present the material to be taught, but also analyze the student's responses. The program must decide whether to intervene and how much to say to a pupil based on its…

  2. Computational identification of novel biochemical systems involved in oxidation, glycosylation and other complex modifications of bases in DNA.

    PubMed

    Iyer, Lakshminarayan M; Zhang, Dapeng; Burroughs, A Maxwell; Aravind, L

    2013-09-01

    Discovery of the TET/JBP family of dioxygenases that modify bases in DNA has sparked considerable interest in novel DNA base modifications and their biological roles. Using sensitive sequence and structure analyses combined with contextual information from comparative genomics, we computationally characterize over 12 novel biochemical systems for DNA modifications. We predict previously unidentified enzymes, such as the kinetoplastid J-base generating glycosyltransferase (and its homolog GREB1), the catalytic specificity of bacteriophage TET/JBP proteins and their role in complex DNA base modifications. We also predict the enzymes involved in synthesis of hypermodified bases such as alpha-glutamylthymine and alpha-putrescinylthymine that have remained enigmatic for several decades. Moreover, the current analysis suggests that bacteriophages and certain nucleo-cytoplasmic large DNA viruses contain an unexpectedly diverse range of DNA modification systems, in addition to those using previously characterized enzymes such as Dam, Dcm, TET/JBP, pyrimidine hydroxymethylases, Mom and glycosyltransferases. These include enzymes generating modified bases such as deazaguanines related to queuine and archaeosine, pyrimidines comparable with lysidine, those derived using modified S-adenosyl methionine derivatives and those using TET/JBP-generated hydroxymethyl pyrimidines as biosynthetic starting points. We present evidence that some of these modification systems are also widely dispersed across prokaryotes and certain eukaryotes such as basidiomycetes, chlorophyte and stramenopile alga, where they could serve as novel epigenetic marks for regulation or discrimination of self from non-self DNA. Our study extends the role of the PUA-like fold domains in recognition of modified nucleic acids and predicts versions of the ASCH and EVE domains to be novel 'readers' of modified bases in DNA. 
These results open opportunities for the investigation of the biology of these systems and their use in biotechnology.

  3. Computational identification of novel biochemical systems involved in oxidation, glycosylation and other complex modifications of bases in DNA

    PubMed Central

    Iyer, Lakshminarayan M.; Zhang, Dapeng; Maxwell Burroughs, A.; Aravind, L.

    2013-01-01

    Discovery of the TET/JBP family of dioxygenases that modify bases in DNA has sparked considerable interest in novel DNA base modifications and their biological roles. Using sensitive sequence and structure analyses combined with contextual information from comparative genomics, we computationally characterize over 12 novel biochemical systems for DNA modifications. We predict previously unidentified enzymes, such as the kinetoplastid J-base generating glycosyltransferase (and its homolog GREB1), the catalytic specificity of bacteriophage TET/JBP proteins and their role in complex DNA base modifications. We also predict the enzymes involved in synthesis of hypermodified bases such as alpha-glutamylthymine and alpha-putrescinylthymine that have remained enigmatic for several decades. Moreover, the current analysis suggests that bacteriophages and certain nucleo-cytoplasmic large DNA viruses contain an unexpectedly diverse range of DNA modification systems, in addition to those using previously characterized enzymes such as Dam, Dcm, TET/JBP, pyrimidine hydroxymethylases, Mom and glycosyltransferases. These include enzymes generating modified bases such as deazaguanines related to queuine and archaeosine, pyrimidines comparable with lysidine, those derived using modified S-adenosyl methionine derivatives and those using TET/JBP-generated hydroxymethyl pyrimidines as biosynthetic starting points. We present evidence that some of these modification systems are also widely dispersed across prokaryotes and certain eukaryotes such as basidiomycetes, chlorophyte and stramenopile alga, where they could serve as novel epigenetic marks for regulation or discrimination of self from non-self DNA. Our study extends the role of the PUA-like fold domains in recognition of modified nucleic acids and predicts versions of the ASCH and EVE domains to be novel ‘readers’ of modified bases in DNA. 
These results open opportunities for the investigation of the biology of these systems and their use in biotechnology. PMID:23814188

  4. Beyond input-output computings: error-driven emergence with parallel non-distributed slime mold computer.

    PubMed

    Aono, Masashi; Gunji, Yukio-Pegio

    2003-10-01

    Emergence derived from errors is of key importance for both novel computing and novel uses of the computer. In this paper, we propose an implementable experimental plan for biological computing that elicits the emergent properties of complex systems. An individual plasmodium of the true slime mold Physarum polycephalum acts as the slime mold computer. Modifying the Elementary Cellular Automaton so that it entails the global synchronization problem of parallel computing yields an NP-complete problem for the slime mold computer to solve. We discuss the possibility of solving the problem while supplying neither all possible results nor an explicit prescription for seeking solutions. In slime mold computing, the distributivity of the local computing logic can change dynamically, and its parallel non-distributed computing cannot be reduced to the spatial addition of multiple serial computations. A computing system based on the exhaustive absence of a super-system may produce something more than filling the vacancy.

  5. Chess games: a model for RNA based computation.

    PubMed

    Cukras, A R; Faulhammer, D; Lipton, R J; Landweber, L F

    1999-10-01

    Here we develop the theory of RNA computing and a method for solving the 'knight problem' as an instance of a satisfiability (SAT) problem. Using only biological molecules and enzymes as tools, we developed an algorithm for solving the knight problem (3 x 3 chess board) using a 10-bit combinatorial pool and sequential RNase H digestions. The results of preliminary experiments presented here reveal that the protocol recovers far more correct solutions than expected at random, but the persistence of errors still presents the greatest challenge.
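
    The size of the search space involved can be made concrete in silico: a brute-force enumeration of all knight placements on the 3 x 3 board checks each candidate against the knight-attack rule, much as the combinatorial RNA pool encodes every candidate at once (a conventional-computing sketch for intuition only, not the molecular protocol):

```python
from itertools import product

def knight_attacks(a, b):
    """True if squares a and b (row, col) are a knight's move apart."""
    dr, dc = abs(a[0] - b[0]), abs(a[1] - b[1])
    return sorted((dr, dc)) == [1, 2]

def valid_boards(n=3):
    """Enumerate every placement of knights on an n x n board in which
    no knight attacks another -- the space a combinatorial pool encodes."""
    squares = [(r, c) for r in range(n) for c in range(n)]
    boards = []
    for bits in product([0, 1], repeat=len(squares)):
        placed = [sq for sq, b in zip(squares, bits) if b]
        if all(not knight_attacks(p, q)
               for i, p in enumerate(placed) for q in placed[i + 1:]):
            boards.append(bits)
    return boards
```

    On the 3 x 3 board the attack graph is an 8-cycle around an isolated centre square, so the valid placements are exactly the independent sets of that small graph; the molecular protocol must fish the same configurations out of the pool by digesting the invalid ones.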

  6. Utilizing Molecular Dynamics' Multipotent Methodologies to Measure Microscopic Motions of DNA Molecules: A Magniloquent Manuscript On DNA's Means and Mannerisms

    NASA Astrophysics Data System (ADS)

    Kingsland, Addie

    DNA is an amazing molecule which is the basic template for all genetics. It is the primary molecule for storing biological information, and has many applications in nanotechnology. Double-stranded DNA may contain mismatched base pairs beyond the Watson-Crick pairs guanine-cytosine and adenine-thymine. To date, no one has found a physical property of base pair mismatches which describes the behavior of naturally occurring mismatch repair enzymes. Many materials properties of DNA are also unknown; for instance, when pulling DNA in different configurations, different energy differences are observed with no obvious explanation. DNA mismatches also affect their local environment, for instance changing the quantum yield of nearby azobenzene moieties. We utilize molecular dynamics computer simulations to study the structure and dynamics of both matched and mismatched base pairs, within both biological and materials contexts, and in both equilibrium and biased dynamics. We show that mismatched pairs shift further in the plane normal to the DNA strand and are more likely to exhibit non-canonical structures, including the e-motif. Base pair mismatches alter their local environment, affecting the trans- to cis- photoisomerization quantum yield of azobenzene, as well as increasing the likelihood of observing the e-motif. We also show that by using simulated data, we can give new insights on theoretical models to calculate the energetics of pulling DNA strands apart. These results, all relatively inexpensive on modern computer hardware, can help guide the design of DNA-based nanotechnologies, as well as give new insights into the functioning of mismatch repair systems in cancer prevention.

  7. Computational DNA hole spectroscopy: A new tool to predict mutation hotspots, critical base pairs, and disease ‘driver’ mutations

    PubMed Central

    Villagrán, Martha Y. Suárez; Miller, John H.

    2015-01-01

    We report on a new technique, computational DNA hole spectroscopy, which creates spectra of electron hole probabilities vs. nucleotide position. A hole is a site of positive charge created when an electron is removed. Peaks in the hole spectrum depict sites where holes tend to localize and potentially trigger a base pair mismatch during replication. Our studies of mitochondrial DNA reveal a correlation between L-strand hole spectrum peaks and spikes in the human mutation spectrum. Importantly, we also find that hole peak positions that do not coincide with large variant frequencies often coincide with disease-implicated mutations and/or (for coding DNA) encoded conserved amino acids. This enables combining hole spectra with variant data to identify critical base pairs and potential disease ‘driver’ mutations. Such integration of DNA hole and variance spectra could ultimately prove invaluable for pinpointing critical regions of the vast non-protein-coding genome. An observed asymmetry in correlations, between the spectrum of human mtDNA variations and the L- and H-strand hole spectra, is attributed to asymmetric DNA replication processes that occur for the leading and lagging strands. PMID:26310834

  8. Computational DNA hole spectroscopy: A new tool to predict mutation hotspots, critical base pairs, and disease 'driver' mutations.

    PubMed

    Villagrán, Martha Y Suárez; Miller, John H

    2015-08-27

    We report on a new technique, computational DNA hole spectroscopy, which creates spectra of electron hole probabilities vs. nucleotide position. A hole is a site of positive charge created when an electron is removed. Peaks in the hole spectrum depict sites where holes tend to localize and potentially trigger a base pair mismatch during replication. Our studies of mitochondrial DNA reveal a correlation between L-strand hole spectrum peaks and spikes in the human mutation spectrum. Importantly, we also find that hole peak positions that do not coincide with large variant frequencies often coincide with disease-implicated mutations and/or (for coding DNA) encoded conserved amino acids. This enables combining hole spectra with variant data to identify critical base pairs and potential disease 'driver' mutations. Such integration of DNA hole and variance spectra could ultimately prove invaluable for pinpointing critical regions of the vast non-protein-coding genome. An observed asymmetry in correlations, between the spectrum of human mtDNA variations and the L- and H-strand hole spectra, is attributed to asymmetric DNA replication processes that occur for the leading and lagging strands.

  9. A Game Based e-Learning System to Teach Artificial Intelligence in the Computer Sciences Degree

    ERIC Educational Resources Information Center

    de Castro-Santos, Amable; Fajardo, Waldo; Molina-Solana, Miguel

    2017-01-01

    Our students taking the Artificial Intelligence and Knowledge Engineering courses often encounter a large number of problems to solve which are not directly related to the subject to be learned. To solve this problem, we have developed a game based e-learning system. The elected game, that has been implemented as an e-learning system, allows to…

  10. A Benders based rolling horizon algorithm for a dynamic facility location problem

    DOE PAGES

    Marufuzzaman, Mohammad; Gedik, Ridvan; Roni, Mohammad S.

    2016-06-28

    This study presents a well-known capacitated dynamic facility location problem (DFLP) that satisfies the customer demand at a minimum cost by determining the time period for opening, closing, or retaining an existing facility in a given location. To solve this challenging NP-hard problem, this paper develops a unique hybrid solution algorithm that combines a rolling horizon algorithm with an accelerated Benders decomposition algorithm. Extensive computational experiments are performed on benchmark test instances to evaluate the hybrid algorithm's efficiency and robustness in solving the DFLP. Computational results indicate that the hybrid Benders based rolling horizon algorithm consistently offers high quality feasible solutions in a much shorter computational time than the standalone rolling horizon and accelerated Benders decomposition algorithms in the experimental range.
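
    The rolling horizon idea, solve a short look-ahead window exactly, commit only its first-period decision, then roll the window forward, can be sketched on a toy two-facility instance (all data and function names below are illustrative assumptions, and the window subproblem is solved by brute force rather than by Benders decomposition):

```python
from itertools import product

# Toy dynamic facility location instance (all numbers illustrative)
OPEN_COST = [10, 14]           # one-time cost to open each facility
RUN_COST = [3, 2]              # per-period operating cost while open
CAPACITY = [5, 7]
DEMAND = [4, 4, 9, 9, 4, 4]    # per-period demand to cover

def period_cost(prev_open, now_open):
    cost = 0
    for f, (was, is_) in enumerate(zip(prev_open, now_open)):
        if is_:
            cost += RUN_COST[f]
            if not was:
                cost += OPEN_COST[f]  # charge on a closed -> open move
    return cost

def best_window(prev_open, demands):
    """Enumerate all open/close sequences over the window; return the
    cheapest feasible one (open capacity must cover demand each period)."""
    best, n = None, len(OPEN_COST)
    for seq in product(product([0, 1], repeat=n), repeat=len(demands)):
        state, cost, ok = prev_open, 0, True
        for open_now, d in zip(seq, demands):
            if sum(c for c, o in zip(CAPACITY, open_now) if o) < d:
                ok = False
                break
            cost += period_cost(state, open_now)
            state = open_now
        if ok and (best is None or cost < best[0]):
            best = (cost, seq)
    return best

def rolling_horizon(window=3):
    """Solve each short window exactly, commit its first period, roll on."""
    state, total = (0,) * len(OPEN_COST), 0
    for t in range(len(DEMAND)):
        _, seq = best_window(state, DEMAND[t:t + window])
        total += period_cost(state, seq[0])
        state = seq[0]
    return total
```

    A rolling horizon with a short window trades optimality for speed; the hybrid in the abstract replaces the brute-force window solve with an accelerated Benders decomposition to scale to realistic instances.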

  11. Search Path Mapping: A Versatile Approach for Visualizing Problem-Solving Behavior.

    ERIC Educational Resources Information Center

    Stevens, Ronald H.

    1991-01-01

    Computer-based problem-solving examinations in immunology generate graphic representations of students' search paths, allowing evaluation of how organized and focused their knowledge is, how well their organization relates to critical concepts in immunology, where major misconceptions exist, and whether proper knowledge links exist between content…

  12. Improving Problem-Solving Techniques for Students in Low-Performing Schools

    ERIC Educational Resources Information Center

    Hobbs, Robert Maurice

    2012-01-01

    Teachers can use culturally relevant pedagogical strategies and technologies as emerging tools to improve students' problem-solving skills. The purpose of this study was to investigate and assess the effectiveness of culturally specific computer-based instructional tasks on ninth-grade African American mathematics students. This study tried to…

  13. An Intelligent Model for Pairs Trading Using Genetic Algorithms.

    PubMed

    Huang, Chien-Feng; Hsu, Chi-Jen; Chen, Chi-Chung; Chang, Bao Rong; Li, Chen-An

    2015-01-01

    Pairs trading is an important and challenging research area in computational finance, in which pairs of stocks are bought and sold in pair combinations for arbitrage opportunities. Traditional methods that solve this set of problems mostly rely on statistical methods such as regression. In contrast to the statistical approaches, recent advances in computational intelligence (CI) are leading to promising opportunities for solving problems in the financial applications more effectively. In this paper, we present a novel methodology for pairs trading using genetic algorithms (GA). Our results showed that the GA-based models are able to significantly outperform the benchmark and our proposed method is capable of generating robust models to tackle the dynamic characteristics in the financial application studied. Based upon the promising results obtained, we expect this GA-based method to advance the research in computational intelligence for finance and provide an effective solution to pairs trading for investment in practice.
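
    A minimal GA of the kind described, here evolving entry/exit thresholds for a naive spread-trading rule on synthetic mean-reverting data, might look as follows (an illustrative sketch with assumed parameters and fitness, not the authors' model):

```python
import random

random.seed(7)

# Synthetic mean-reverting spread (an illustrative stand-in for pair data)
spread = [0.0]
for _ in range(499):
    spread.append(0.9 * spread[-1] + random.gauss(0, 1))

def fitness(entry, exit_):
    """Profit of a naive rule: short the spread above +entry, long below
    -entry, and close the position once |spread| falls under exit_."""
    pos, pnl, last = 0, 0.0, 0.0
    for s in spread:
        if pos == 0 and abs(s) > entry:
            pos, last = (-1 if s > 0 else 1), s
        elif pos != 0 and abs(s) < exit_:
            pnl += pos * (s - last)
            pos = 0
    return pnl

def evolve(pop_size=30, generations=40):
    """Evolve (entry, exit) threshold genomes with elitism,
    averaging crossover, and Gaussian mutation."""
    pop = [(random.uniform(0.5, 3), random.uniform(0, 0.5))
           for _ in range(pop_size)]
    for _ in range(generations):
        scored = sorted(pop, key=lambda g: fitness(*g), reverse=True)
        pop = scored[:5]                          # elitism keeps the best
        while len(pop) < pop_size:
            a, b = random.sample(scored[:15], 2)  # select from the top half
            child = ((a[0] + b[0]) / 2, (a[1] + b[1]) / 2)   # crossover
            child = (child[0] + random.gauss(0, 0.1),        # mutation
                     max(0.0, child[1] + random.gauss(0, 0.05)))
            pop.append(child)
    return max(pop, key=lambda g: fitness(*g))
```

    Elitism guarantees the best genome is never lost between generations; in a realistic setting the fitness function would backtest on actual pair prices with transaction costs.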

  14. An Intelligent Model for Pairs Trading Using Genetic Algorithms

    PubMed Central

    Huang, Chien-Feng; Hsu, Chi-Jen; Chen, Chi-Chung; Chang, Bao Rong; Li, Chen-An

    2015-01-01

    Pairs trading is an important and challenging research area in computational finance, in which pairs of stocks are bought and sold in pair combinations for arbitrage opportunities. Traditional methods that solve this set of problems mostly rely on statistical methods such as regression. In contrast to the statistical approaches, recent advances in computational intelligence (CI) are leading to promising opportunities for solving problems in the financial applications more effectively. In this paper, we present a novel methodology for pairs trading using genetic algorithms (GA). Our results showed that the GA-based models are able to significantly outperform the benchmark and our proposed method is capable of generating robust models to tackle the dynamic characteristics in the financial application studied. Based upon the promising results obtained, we expect this GA-based method to advance the research in computational intelligence for finance and provide an effective solution to pairs trading for investment in practice. PMID:26339236

  15. An Ethnographic Study of the Computational Strategies of a Group of Young Street Vendors in Beirut.

    ERIC Educational Resources Information Center

    Jurdak, Murad; Shahin, Iman

    1999-01-01

    Examines the computational strategies of 10 young street vendors in Beirut by describing, comparing, and analyzing computational strategies used in solving three types of problems: (1) transactions in the workplace; (2) word problems; and (3) computation exercises in a school-like setting. Indicates that vendors' use of semantically-based mental…

  16. Computing exponentially faster: implementing a non-deterministic universal Turing machine using DNA

    PubMed Central

    Currin, Andrew; Korovin, Konstantin; Ababi, Maria; Roper, Katherine; Kell, Douglas B.; Day, Philip J.

    2017-01-01

    The theory of computer science is based around universal Turing machines (UTMs): abstract machines able to execute all possible algorithms. Modern digital computers are physical embodiments of classical UTMs. For the most important class of problem in computer science, non-deterministic polynomial complete problems, non-deterministic UTMs (NUTMs) are theoretically exponentially faster than both classical UTMs and quantum mechanical UTMs (QUTMs). However, no attempt has previously been made to build an NUTM, and their construction has been regarded as impossible. Here, we demonstrate the first physical design of an NUTM. This design is based on Thue string rewriting systems, and thereby avoids the limitations of most previous DNA computing schemes: all the computation is local (simple edits to strings) so there is no need for communication, and there is no need to order operations. The design exploits DNA's ability to replicate to execute an exponential number of computational paths in P time. Each Thue rewriting step is embodied in a DNA edit implemented using a novel combination of polymerase chain reactions and site-directed mutagenesis. We demonstrate that the design works using both computational modelling and in vitro molecular biology experimentation: the design is thermodynamically favourable, microprogramming can be used to encode arbitrary Thue rules, all classes of Thue rule can be implemented, and rules can be applied non-deterministically. In an NUTM, the resource limitation is space, which contrasts with classical UTMs and QUTMs where it is time. This fundamental difference enables an NUTM to trade space for time, which is significant for both theoretical computer science and physics. It is also of practical importance, for to quote Richard Feynman 'there's plenty of room at the bottom'. 
This means that a desktop DNA NUTM could potentially utilize more processors than all the electronic computers in the world combined, and thereby outperform the world's current fastest supercomputer, while consuming a tiny fraction of its energy. PMID:28250099
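
    The non-deterministic branching that the DNA design realises in parallel can be emulated sequentially by exploring every applicable Thue rewrite at every position (a toy breadth-first sketch for intuition; the rule set and depth bound are assumptions, and a real NUTM explores all branches simultaneously):

```python
from collections import deque

def reachable(start, target, rules, max_steps=20):
    """Breadth-first exploration of a string rewriting (semi-Thue) system:
    every applicable rule at every position spawns a new computational
    path, mimicking the branching an NUTM realises in replicating DNA."""
    frontier, seen = deque([(start, 0)]), {start}
    while frontier:
        s, depth = frontier.popleft()
        if s == target:
            return True
        if depth >= max_steps:
            continue
        for lhs, rhs in rules:
            i = s.find(lhs)
            while i != -1:  # apply the rule at every occurrence of lhs
                t = s[:i] + rhs + s[i + len(lhs):]
                if t not in seen:
                    seen.add(t)
                    frontier.append((t, depth + 1))
                i = s.find(lhs, i + 1)
    return False
```

    On a classical machine the frontier grows exponentially with depth, which is exactly the cost the DNA design pays in physical space (number of molecules) rather than in time.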

  17. Bicriteria Network Optimization Problem using Priority-based Genetic Algorithm

    NASA Astrophysics Data System (ADS)

    Gen, Mitsuo; Lin, Lin; Cheng, Runwei

    Network optimization is an increasingly important and fundamental issue in fields such as engineering, computer science, operations research, transportation, telecommunication, decision support systems, manufacturing, and airline scheduling. In many applications, however, there are several criteria associated with traversing each edge of a network; for example, cost and flow measures are both important. As a result, there has been recent interest in solving the Bicriteria Network Optimization Problem, which is known to be NP-hard. The efficient set of paths may be very large, possibly exponential in size, so the computational effort required to solve the problem can increase exponentially with the problem size in the worst case. In this paper, we propose a genetic algorithm (GA) approach that uses a priority-based chromosome for solving the bicriteria network optimization problem, including the maximum flow (MXF) model and the minimum cost flow (MCF) model. The objective is to find the set of Pareto optimal solutions that give the maximum possible flow at minimum cost. This paper also applies an Adaptive Weight Approach (AWA) that utilizes information from the current population to readjust weights, producing search pressure toward a positive ideal point. Computer simulations on several difficult-to-solve network design problems demonstrate the effectiveness of the proposed method.
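
    The decoding step of a priority-based chromosome, repeatedly stepping from the current node to its highest-priority unvisited neighbour until the sink is reached, can be sketched as follows (a simplified illustration of the encoding, not the authors' full GA):

```python
def decode_path(priorities, adjacency, source, sink):
    """Decode a priority-based chromosome into a source-sink path: from
    the current node, always step to the unvisited neighbour with the
    highest priority value assigned by the chromosome."""
    path, node, visited = [source], source, {source}
    while node != sink:
        candidates = [n for n in adjacency[node] if n not in visited]
        if not candidates:
            return None  # dead end: this chromosome decodes to no path
        node = max(candidates, key=lambda n: priorities[n])
        visited.add(node)
        path.append(node)
    return path
```

    Because any assignment of node priorities decodes to at most one path, standard crossover and mutation on the priority vector always yield decodable offspring, which is the main attraction of this encoding.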

  18. Learning Probabilities in Computer Engineering by Using a Competency- and Problem-Based Approach

    ERIC Educational Resources Information Center

    Khoumsi, Ahmed; Hadjou, Brahim

    2005-01-01

    Our department has redesigned its electrical and computer engineering programs by adopting a learning methodology based on competence development, problem solving, and the realization of design projects. In this article, we show how this pedagogical approach has been successfully used for learning probabilities and their application to computer…

  19. The Role of Guidance in Computer-Based Problem Solving for the Development of Concepts of Logic.

    ERIC Educational Resources Information Center

    Eysink, Tessa H. S.; Dijkstra, Sanne; Kuper, Jan

    2002-01-01

    Describes a study at the University of Twente (Netherlands) that investigated the effect of two instructional variables, manipulation of objects and guidance, in learning to use the logical connective, conditional with a computer-based learning environment, Tarski's World, designed to teach first-order logic. Discusses results of…

  20. Calculation of Ceramic Phase Diagrams

    DTIC Science & Technology

    1979-11-30

    Recent examples of the use of data bases and computer techniques in solving problems associated with in-situ eutectic composite formation in columbium, nickel, and cobalt base superalloys, and sigma phase formation in high temperature alloys, are covered.

  1. Petri-net-based 2D design of DNA walker circuits.

    PubMed

    Gilbert, David; Heiner, Monika; Rohr, Christian

    2018-01-01

    We consider localised DNA computation, where a DNA strand walks along a binary decision graph to compute a binary function. One of the challenges for the design of reliable walker circuits consists in leakage transitions, which occur when a walker jumps into another branch of the decision graph. We automatically identify leakage transitions, which allows for a detailed qualitative and quantitative assessment of circuit designs, design comparison, and design optimisation. The ability to identify leakage transitions is an important step in the process of optimising DNA circuit layouts where the aim is to minimise the computational error inherent in a circuit while minimising the area of the circuit. Our 2D modelling approach of DNA walker circuits relies on coloured stochastic Petri nets which enable functionality, topology and dimensionality all to be integrated in one two-dimensional model. Our modelling and analysis approach can be easily extended to 3-dimensional walker systems.
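
    A drastically simplified way to see how a per-junction leakage probability translates into circuit-level computational error is a Monte Carlo simulation (this is a caricature for intuition only, not the paper's coloured stochastic Petri-net analysis; the parameters and model are assumptions):

```python
import random

def walker_error_rate(intended, leak, trials=10000, seed=1):
    """Monte Carlo estimate of a walker circuit's computational error:
    at each junction the walker takes the intended branch with
    probability 1 - leak and leaks into the wrong branch otherwise."""
    rng = random.Random(seed)
    wrong = 0
    for _ in range(trials):
        out = [step if rng.random() >= leak else 1 - step
               for step in intended]
        if out != list(intended):
            wrong += 1
    return wrong / trials
```

    Even this toy model shows why error grows with circuit depth: a depth-k path succeeds with probability roughly (1 - leak)**k, motivating layouts that minimise depth as well as area.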

  2. Structure of the sporulation histidine kinase inhibitor Sda from Bacillus subtilis and insights into its solution state

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jacques, David A.; Streamer, Margaret; Rowland, Susan L.

    2009-06-01

    The crystal structure of Sda, a DNA-replication/damage checkpoint inhibitor of sporulation in B. subtilis, has been solved via the MAD method. The subunit arrangement in the crystal has enabled a reappraisal of previous biophysical data, resulting in a new model for the behaviour of the protein in solution. The crystal structure of the DNA-damage checkpoint inhibitor of sporulation, Sda, from Bacillus subtilis, has been solved by the MAD technique using selenomethionine-substituted protein. The structure closely resembles that previously solved by NMR, as well as the structure of a homologue from Geobacillus stearothermophilus solved in complex with the histidine kinase KinB. The structure contains three molecules in the asymmetric unit. The unusual trimeric arrangement, which lacks simple internal symmetry, appears to be preserved in solution based on an essentially ideal fit to previously acquired scattering data for Sda in solution. This interpretation contradicts previous findings that Sda was monomeric or dimeric in solution. This study demonstrates the difficulties that can be associated with the characterization of small proteins and the value of combining multiple biophysical techniques. It also emphasizes the importance of understanding the physical principles behind these techniques and therefore their limitations.

  3. Residues of E. coli topoisomerase I conserved for interaction with a specific cytosine base to facilitate DNA cleavage

    PubMed Central

    Narula, Gagandeep; Tse-Dinh, Yuk-Ching

    2012-01-01

    Bacterial and archaeal topoisomerase I display selectivity for a cytosine base 4 nt upstream from the DNA cleavage site. The recently solved crystal structure of Escherichia coli topoisomerase I covalently linked to a single-stranded oligonucleotide revealed that R169 and R173 interact with the cytosine base at the −4 position via hydrogen bonds while the phenol ring of Y177 wedges between the bases at the −4 and the −5 position. Substituting R169 with alanine changed the selectivity of the enzyme for the base at the −4 position from a cytosine to an adenine. The R173A mutant displayed similar sequence selectivity to the wild-type enzyme, but weaker cleavage and relaxation activity. Mutation of Y177 to serine or alanine rendered the enzyme inactive. Although mutation of each of these residues led to different outcomes, R169, R173 and Y177 work together to interact with a cytosine base at the −4 position to facilitate DNA cleavage. These strictly conserved residues might act after initial substrate binding as a molecular ruler, forming a protein–DNA complex with the scissile phosphate positioned at the active site for optimal DNA cleavage by the tyrosine hydroxyl nucleophile in the reaction pathway. PMID:22833607

  4. DNA-Based Dynamic Reaction Networks.

    PubMed

    Fu, Ting; Lyu, Yifan; Liu, Hui; Peng, Ruizi; Zhang, Xiaobing; Ye, Mao; Tan, Weihong

    2018-05-21

    Deriving from logical and mechanical interactions between DNA strands and complexes, DNA-based artificial reaction networks (RNs) are attractive for their high programmability, as well as their cascading and fan-out abilities, which resemble the basic principles of electronic logic gates. Arising from the dream of creating novel computing mechanisms, researchers have placed high hopes on the development of DNA-based dynamic RNs and have strived to establish the basic theories and operative strategies of these networks. This review starts by looking back on the evolution of DNA dynamic RNs, in particular the most significant applications in biochemistry of recent years. Finally, we discuss the perspectives of DNA dynamic RNs and give a possible direction for the development of DNA circuits. Copyright © 2018. Published by Elsevier Ltd.

  5. Problem-Based Test: Replication of Mitochondrial DNA during the Cell Cycle

    ERIC Educational Resources Information Center

    Setalo, Gyorgy, Jr.

    2013-01-01

    Terms to be familiar with before you start to solve the test: cell cycle, generation time, S-phase, cell culture synchronization, isotopic pulse-chase labeling, density labeling, equilibrium density-gradient centrifugation, buoyant density, rate-zonal centrifugation, nucleoside, nucleotide, kinase enzymes, polymerization of nucleic acids,…

  6. Combination of DNA-based and conventional methods to detect human leukocyte antigen polymorphism and its use for paternity testing.

    PubMed

    Keresztury, László; Rajczy, Katalin; Lászik, András; Gyódi, Eva; Pénzes, Mária; Falus, András; Petrányi, Győző G

    2002-03-01

    In cases of disputed paternity, the scientific goal is to promote either the exclusion of a falsely accused man or the affiliation of the alleged father. Until now, in addition to anthropologic characteristics, the genetic markers used for this purpose included human leukocyte antigen gene variants, erythrocyte antigens, and serum proteins. Recombinant DNA techniques provided a new set of highly variable genetic markers based on DNA nucleotide sequence polymorphism. From the practical standpoint, the application of these techniques to paternity testing provides greater versatility than do conventional genetic marker systems. The use of methods to detect the polymorphism of human leukocyte antigen loci significantly increases the chance of validation of ambiguous results in paternity testing. The outcome of 2384 paternity cases investigated by serologic and/or DNA-based human leukocyte antigen typing was statistically analyzed. Different cases solved by DNA typing are presented, involving cases with one or two accused men, exclusions and nonexclusions, and tests of the paternity of a deceased man. The results provide evidence for the advantage of the combined application of various techniques in forensic diagnostics and emphasize the outstanding possibilities of DNA-based assays. Representative examples demonstrate the strength of combined techniques in paternity testing.

  7. Market-based control strategy for long-span structures considering the multi-time delay issue

    NASA Astrophysics Data System (ADS)

    Li, Hongnan; Song, Jianzhu; Li, Gang

    2017-01-01

To address the different time delays that exist in control devices installed on spatial structures, this study adopts discrete analysis using the 2^N precise algorithm to solve the multi-time-delay issue for long-span structures based on the market-based control (MBC) method. The concept of interval mixed energy was introduced from computational structural mechanics and optimal control research areas, and it translates the design of the MBC multi-time-delay controller into a solution for the segment matrix. This approach transforms the serial algorithm in time to parallel computing in space, greatly improving the solving efficiency and numerical stability. The designed controller handles time delays through a linear combination of control forces and is especially effective under large time-delay conditions. A numerical example of a long-span structure was selected to demonstrate the effectiveness of the presented controller, and the time delay was found to have a significant impact on the results.

  8. Discovering Motifs in Biological Sequences Using the Micron Automata Processor.

    PubMed

    Roy, Indranil; Aluru, Srinivas

    2016-01-01

Finding approximately conserved sequences, called motifs, across multiple DNA or protein sequences is an important problem in computational biology. In this paper, we consider the (l, d) motif search problem of identifying one or more motifs of length l present in at least q of the n given sequences, with each occurrence differing from the motif in at most d substitutions. The problem is known to be NP-complete, and the largest solved instance reported to date is (26, 11). We propose a novel algorithm for the (l, d) motif search problem using streaming execution over a large set of non-deterministic finite automata (NFA). This solution is designed to take advantage of the Micron Automata Processor, a new technology close to deployment that can simultaneously execute multiple NFA in parallel. We demonstrate the capability for solving much larger instances of the (l, d) motif search problem using the resources available within a single Automata Processor board, by estimating run-times for problem instances (39, 18) and (40, 17). The paper serves as a useful guide to solving problems using this new accelerator technology.
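The (l, d) problem itself is easy to state in code. The sketch below is a naive Hamming-distance check of whether a candidate motif occurs, within d substitutions, in at least q of the sequences; it illustrates only the problem definition, not the paper's NFA streaming algorithm or the automata-processor programming interface:

```python
def hamming(a, b):
    """Number of mismatching positions between two equal-length strings."""
    return sum(x != y for x, y in zip(a, b))

def occurs(motif, seq, d):
    """True if some length-l window of seq is within d substitutions of motif."""
    l = len(motif)
    return any(hamming(motif, seq[i:i + l]) <= d for i in range(len(seq) - l + 1))

def is_planted_motif(motif, seqs, q, d):
    """True if motif occurs (within d substitutions) in at least q sequences."""
    return sum(occurs(motif, s, d) for s in seqs) >= q

# Toy instance: "ACGT" occurs within 1 substitution in 3 of these 4 sequences.
seqs = ["ACGTTGCA", "AAGTCGCA", "TTACGTAG", "GGGGCCCC"]
```

An exact solver must consider all 4^l candidate motifs, which is what makes the problem hard and the massively parallel NFA approach attractive.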

  9. How does the brain solve visual object recognition?

    PubMed Central

    Zoccolan, Davide; Rust, Nicole C.

    2012-01-01

    Mounting evidence suggests that “core object recognition,” the ability to rapidly recognize objects despite substantial appearance variation, is solved in the brain via a cascade of reflexive, largely feedforward computations that culminate in a powerful neuronal representation in the inferior temporal cortex. However, the algorithm that produces this solution remains little-understood. Here we review evidence ranging from individual neurons, to neuronal populations, to behavior, to computational models. We propose that understanding this algorithm will require using neuronal and psychophysical data to sift through many computational models, each based on building blocks of small, canonical sub-networks with a common functional goal. PMID:22325196

  10. A new method for enhancer prediction based on deep belief network.

    PubMed

    Bu, Hongda; Gan, Yanglan; Wang, Yang; Zhou, Shuigeng; Guan, Jihong

    2017-10-16

Studies have shown that enhancers are significant regulatory elements that play crucial roles in gene expression regulation. Since enhancer activity is independent of the orientation of, and distance to, target genes, accurately predicting distal enhancers remains a challenging task for researchers. In recent years, with the development of high-throughput ChIP-seq technologies, several computational techniques have emerged that predict enhancers using epigenetic or genomic features. Nevertheless, the inconsistency of computational models across different cell lines and the unsatisfactory prediction performance call for further research in this area. Here, we propose a new Deep Belief Network (DBN) based computational method for enhancer prediction, called EnhancerDBN. This method combines diverse features, composed of DNA sequence compositional features, DNA methylation and histone modifications. Our computational results indicate that 1) EnhancerDBN outperforms 13 existing methods in prediction, and 2) GC content and DNA methylation can serve as relevant features for enhancer prediction. Deep learning is effective in boosting the performance of enhancer prediction.

  11. Discrete Ramanujan transform for distinguishing the protein coding regions from other regions.

    PubMed

    Hua, Wei; Wang, Jiasong; Zhao, Jian

    2014-01-01

Based on the study of the Ramanujan sum and the Ramanujan coefficient, this paper introduces the concepts of the discrete Ramanujan transform and spectrum. Using the Voss numerical representation, one maps a symbolic DNA strand to a numerical DNA sequence and then derives the discrete Ramanujan spectrum of that sequence. It is well known that the discrete Fourier power spectrum of a protein coding sequence exhibits 3-base periodicity, a feature widely used for DNA sequence analysis via the discrete Fourier transform; the analysis tests the signal-to-noise ratio at frequency N/3, where N is the length of the sequence. The results presented in this paper show that the property of 3-base periodicity appears as a prominent spike of the discrete Ramanujan spectrum at period 3 only for protein coding regions. A signal-to-noise ratio for the discrete Ramanujan spectrum is defined for numerical measurement. Therefore, the discrete Ramanujan spectrum and the signal-to-noise ratio of a DNA sequence can be used to distinguish protein coding regions from noncoding regions. All the exon and intron sequences in whole chromosomes 1, 2, 3 and 4 of Caenorhabditis elegans have been tested, and the histograms and tables from the computational results illustrate the reliability of our method. In addition, we show theoretically that the algorithm for computing the discrete Ramanujan spectrum has lower computational complexity and higher computational accuracy. The computational experiments show that classifying DNA sequences by the discrete Ramanujan spectrum is a fast and effective technique. Copyright © 2014 Elsevier Ltd. All rights reserved.
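The Fourier-based 3-base periodicity test that this work compares against can be sketched directly. Below is a hedged illustration of the Voss representation and the discrete Fourier power spectrum, with a signal-to-noise ratio at frequency N/3; the noise term is taken here as the average power over non-zero frequencies, one common convention (the paper's Ramanujan spectrum is not reproduced):

```python
import cmath

def voss(seq):
    """Voss representation: four binary indicator sequences, one per base."""
    return {b: [1 if c == b else 0 for c in seq] for b in "ACGT"}

def power_spectrum(seq):
    """S[k] = sum over bases of |DFT of indicator sequence|^2 (naive O(N^2) DFT)."""
    n = len(seq)
    ind = voss(seq)
    spec = []
    for k in range(n):
        s = 0.0
        for x in ind.values():
            u = sum(x[j] * cmath.exp(-2j * cmath.pi * k * j / n) for j in range(n))
            s += abs(u) ** 2
        spec.append(s)
    return spec

def snr_at_third(seq):
    """Signal-to-noise ratio at frequency N/3 (assumes N divisible by 3);
    noise is the mean power over all non-zero frequencies."""
    spec = power_spectrum(seq)
    n = len(seq)
    noise = sum(spec[1:]) / (n - 1)
    return spec[n // 3] / noise

coding_like = "ATG" * 20  # perfectly 3-periodic toy sequence, N = 60
```

For this toy sequence the spike at k = N/3 dominates the spectrum, which is exactly the coding-region signature the record describes; real coding regions show a weaker but still detectable version of the same peak.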

  12. An evolution based biosensor receptor DNA sequence generation algorithm.

    PubMed

    Kim, Eungyeong; Lee, Malrey; Gatton, Thomas M; Lee, Jaewan; Zang, Yupeng

    2010-01-01

A biosensor is composed of a bioreceptor, an associated recognition molecule, and a signal transducer that can selectively detect target substances for analysis. DNA based biosensors utilize receptor molecules that allow hybridization with the target analyte. However, most DNA biosensor research uses oligonucleotides as the target analytes and does not address the potential problems of real samples. The identification of recognition molecules suitable for real target analyte samples is an important step towards further development of DNA biosensors. This study examines the characteristics of DNA used as bioreceptors and proposes a hybrid evolution-based DNA sequence generating algorithm, based on DNA computing, to identify suitable DNA bioreceptor recognition molecules for stable hybridization with real target substances. The Traveling Salesman Problem (TSP) approach is applied in the proposed algorithm to evaluate the safety and fitness of the generated DNA sequences. This approach improves the efficiency and stability of DNA sequence generation and allows extension to variable-length DNA sequences with diverse receptor recognition requirements.

  13. Single-Molecule Methods for Nucleotide Excision Repair: Building a System to Watch Repair in Real Time.

    PubMed

    Kong, Muwen; Beckwitt, Emily C; Springall, Luke; Kad, Neil M; Van Houten, Bennett

    2017-01-01

    Single-molecule approaches to solving biophysical problems are powerful tools that allow static and dynamic real-time observations of specific molecular interactions of interest in the absence of ensemble-averaging effects. Here, we provide detailed protocols for building an experimental system that employs atomic force microscopy and a single-molecule DNA tightrope assay based on oblique angle illumination fluorescence microscopy. Together with approaches for engineering site-specific lesions into DNA substrates, these complementary biophysical techniques are well suited for investigating protein-DNA interactions that involve target-specific DNA-binding proteins, such as those engaged in a variety of DNA repair pathways. In this chapter, we demonstrate the utility of the platform by applying these techniques in the studies of proteins participating in nucleotide excision repair. © 2017 Elsevier Inc. All rights reserved.

  14. Quantum-mechanical analysis of the energetic contributions to π stacking in nucleic acids versus rise, twist, and slide.

    PubMed

    Parker, Trent M; Hohenstein, Edward G; Parrish, Robert M; Hud, Nicholas V; Sherrill, C David

    2013-01-30

Symmetry-adapted perturbation theory (SAPT) is applied to pairs of hydrogen-bonded nucleobases to obtain the energetic components of base stacking (electrostatic, exchange-repulsion, induction/polarization, and London dispersion interactions) and how they vary as a function of the helical parameters Rise, Twist, and Slide. Computed average values of Rise and Twist agree well with experimental data for B-form DNA from the Nucleic Acids Database, even though the model computations omitted the backbone atoms (suggesting that the backbone in B-form DNA is compatible with having the bases adopt their ideal stacking geometries). London dispersion forces are the most important attractive component in base stacking, followed by electrostatic interactions. At values of Rise typical of those in DNA (3.36 Å), the electrostatic contribution is nearly always attractive, providing further evidence for the importance of charge-penetration effects in π-π interactions (a term neglected in classical force fields). Comparison of the computed stacking energies with those from model complexes made of the "parent" nucleobases purine and 2-pyrimidone indicates that chemical substituents in DNA and RNA account for 20-40% of the base-stacking energy. A lack of correspondence between the SAPT results and experiment for Slide in RNA base-pair steps suggests that the backbone plays a larger role in determining stacking geometries in RNA than in B-form DNA. In comparisons of base-pair steps with thymine versus uracil, the thymine methyl group tends to enhance the strength of the stacking interaction through a combination of dispersion and electrostatic interactions.

  15. DNA-programmed dynamic assembly of quantum dots for molecular computation.

    PubMed

    He, Xuewen; Li, Zhi; Chen, Muzi; Ma, Nan

    2014-12-22

    Despite the widespread use of quantum dots (QDs) for biosensing and bioimaging, QD-based bio-interfaceable and reconfigurable molecular computing systems have not yet been realized. DNA-programmed dynamic assembly of multi-color QDs is presented for the construction of a new class of fluorescence resonance energy transfer (FRET)-based QD computing systems. A complete set of seven elementary logic gates (OR, AND, NOR, NAND, INH, XOR, XNOR) are realized using a series of binary and ternary QD complexes operated by strand displacement reactions. The integration of different logic gates into a half-adder circuit for molecular computation is also demonstrated. This strategy is quite versatile and straightforward for logical operations and would pave the way for QD-biocomputing-based intelligent molecular diagnostics. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  16. Flowfield computation of entry vehicles

    NASA Technical Reports Server (NTRS)

    Prabhu, Dinesh K.

    1990-01-01

The equations governing the multidimensional flow of a reacting mixture of thermally perfect gases were derived. The modeling procedures for the various terms of the conservation laws are discussed. A numerical algorithm, based on the finite-volume approach, to solve these conservation equations was developed. The advantages and disadvantages of the present numerical scheme are discussed from the point of view of accuracy, computer time, and memory requirements. A simple one-dimensional model problem was solved to prove the feasibility and accuracy of the algorithm. A computer code implementing the above algorithm was developed and is presently being applied to simple geometries and conditions. Once the code is completely debugged and validated, it will be used to compute the complete unsteady flow field around the Aeroassist Flight Experiment (AFE) body.

  17. Quantum computation with coherent spin states and the close Hadamard problem

    NASA Astrophysics Data System (ADS)

    Adcock, Mark R. A.; Høyer, Peter; Sanders, Barry C.

    2016-04-01

    We study a model of quantum computation based on the continuously parameterized yet finite-dimensional Hilbert space of a spin system. We explore the computational powers of this model by analyzing a pilot problem we refer to as the close Hadamard problem. We prove that the close Hadamard problem can be solved in the spin system model with arbitrarily small error probability in a constant number of oracle queries. We conclude that this model of quantum computation is suitable for solving certain types of problems. The model is effective for problems where symmetries between the structure of the information associated with the problem and the structure of the unitary operators employed in the quantum algorithm can be exploited.

  18. Gener: a minimal programming module for chemical controllers based on DNA strand displacement

    PubMed Central

    Kahramanoğulları, Ozan; Cardelli, Luca

    2015-01-01

    Summary: Gener is a development module for programming chemical controllers based on DNA strand displacement. Gener is developed with the aim of providing a simple interface that minimizes the opportunities for programming errors: Gener allows the user to test the computations of the DNA programs based on a simple two-domain strand displacement algebra, the minimal available so far. The tool allows the user to perform stepwise computations with respect to the rules of the algebra as well as exhaustive search of the computation space with different options for exploration and visualization. Gener can be used in combination with existing tools, and in particular, its programs can be exported to Microsoft Research’s DSD tool as well as to LaTeX. Availability and implementation: Gener is available for download at the Cosbi website at http://www.cosbi.eu/research/prototypes/gener as a windows executable that can be run on Mac OS X and Linux by using Mono. Contact: ozan@cosbi.eu PMID:25957353

  19. Gener: a minimal programming module for chemical controllers based on DNA strand displacement.

    PubMed

    Kahramanoğulları, Ozan; Cardelli, Luca

    2015-09-01

    : Gener is a development module for programming chemical controllers based on DNA strand displacement. Gener is developed with the aim of providing a simple interface that minimizes the opportunities for programming errors: Gener allows the user to test the computations of the DNA programs based on a simple two-domain strand displacement algebra, the minimal available so far. The tool allows the user to perform stepwise computations with respect to the rules of the algebra as well as exhaustive search of the computation space with different options for exploration and visualization. Gener can be used in combination with existing tools, and in particular, its programs can be exported to Microsoft Research's DSD tool as well as to LaTeX. Gener is available for download at the Cosbi website at http://www.cosbi.eu/research/prototypes/gener as a windows executable that can be run on Mac OS X and Linux by using Mono. ozan@cosbi.eu. © The Author 2015. Published by Oxford University Press.

  20. Computer program determines chemical equilibria in complex systems

    NASA Technical Reports Server (NTRS)

    Gordon, S.; Zeleznik, F. J.

    1966-01-01

    Computer program numerically solves nonlinear algebraic equations for chemical equilibrium based on iteration equations independent of choice of components. This program calculates theoretical performance for frozen and equilibrium composition during expansion and Chapman-Jouguet flame properties, studies combustion, and designs hardware.
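The underlying numerical idea, iterating on nonlinear algebraic equations until the equilibrium composition is found, can be illustrated with a toy example. The sketch below applies Newton-Raphson to a single dissociation equilibrium with constant K = x²/(1 − x); the reaction and numbers are invented for illustration and are far simpler than the program's component-independent iteration equations:

```python
def newton(f, fprime, x0, tol=1e-12, max_iter=100):
    """Scalar Newton-Raphson iteration: x <- x - f(x)/f'(x)."""
    x = x0
    for _ in range(max_iter):
        step = f(x) / fprime(x)
        x -= step
        if abs(step) < tol:
            return x
    raise RuntimeError("Newton iteration did not converge")

# Toy equilibrium A <=> B + C at extent x (0 < x < 1):
# K = x**2 / (1 - x), i.e. solve f(x) = x**2 + K*x - K = 0.
K = 0.5
x = newton(lambda v: v * v + K * v - K,
           lambda v: 2.0 * v + K,
           0.9)  # initial guess
```

A real equilibrium program solves a coupled system of such equations (one per species plus element-balance constraints) with a multivariate Newton scheme, but the convergence behavior is the same in miniature.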

  1. Mistaking Identities: Challenging Representations of Language, Gender, and Race in High Tech Television Programs.

    ERIC Educational Resources Information Center

    Voithofer, R. J.

    Television programs are increasingly featuring information technologies like computers as significant narrative devices, including the use of computer-based technologies as virtual worlds or environments in which characters interact, the use of computers as tools in problem solving and confronting conflict, and characters that are part human, part…

  2. The Role of Context-Related Parameters in Adults' Mental Computational Acts

    ERIC Educational Resources Information Center

    Naresh, Nirmala; Presmeg, Norma

    2012-01-01

    Researchers who have carried out studies pertaining to mental computation and everyday mathematics point out that adults and children reason intuitively based upon experiences within specific contexts; they use invented strategies of their own to solve real-life problems. We draw upon research areas of mental computation and everyday mathematics…

  3. Enhancing Student Writing and Computer Programming with LATEX and MATLAB in Multivariable Calculus

    ERIC Educational Resources Information Center

    Sullivan, Eric; Melvin, Timothy

    2016-01-01

    Written communication and computer programming are foundational components of an undergraduate degree in the mathematical sciences. All lower-division mathematics courses at our institution are paired with computer-based writing, coding, and problem-solving activities. In multivariable calculus we utilize MATLAB and LATEX to have students explore…

  4. Loci-STREAM Version 0.9

    NASA Technical Reports Server (NTRS)

    Wright, Jeffrey; Thakur, Siddharth

    2006-01-01

Loci-STREAM is an evolving computational fluid dynamics (CFD) software tool for simulating possibly chemically reacting, possibly unsteady flows in diverse settings, including rocket engines, turbomachines, oil refineries, etc. Loci-STREAM implements a pressure-based flow-solving algorithm that utilizes unstructured grids. (The benefit of low memory usage by pressure-based algorithms is well recognized by experts in the field.) The algorithm is robust for flows at all speeds from zero to hypersonic. The flexibility of arbitrary polyhedral grids enables accurate, efficient simulation of flows in complex geometries, including those of plume-impingement problems. The present version - Loci-STREAM version 0.9 - includes an interface with the Portable, Extensible Toolkit for Scientific Computation (PETSc) library for access to enhanced linear-equation-solving programs therein that accelerate convergence toward a solution. The name "Loci" reflects the creation of this software within the Loci computational framework, which was developed at Mississippi State University for the primary purpose of simplifying the writing of complex multidisciplinary application programs to run in distributed-memory computing environments including clusters of personal computers. Loci has been designed to relieve application programmers of the details of programming for distributed-memory computers.

  5. Research Advances: DNA Computing Targets West Nile Virus, Other Deadly Diseases, and Tic-Tac-Toe; Marijuana Component May Offer Hope for Alzheimer's Disease Treatment; New Wound Dressing May Lead to Maggot Therapy--Without the Maggots

    ERIC Educational Resources Information Center

    King, Angela G.

    2007-01-01

    This article presents three reports of research advances. The first report describes a deoxyribonucleic acid (DNA)-based computer that could lead to faster, more accurate tests for diagnosing West Nile Virus and bird flu. Representing the first "medium-scale integrated molecular circuit," it is the most powerful computing device of its type to…

  6. Response Mode Effects on Computer Based Problem Solving. Report Series 1979.

    ERIC Educational Resources Information Center

    Brown, Bobby R.; Sustik, Joan M.

This response mode study attempts to determine whether different response modes are helpful or not in facilitating the thought process in a given problem solving situation. The Luchins Water Jar Test (WJT) used in this study illustrates the phenomenon "Einstellung" (mechanization of response) because it does not require any specialized content…

  7. Gas Permeation Computations with Mathematica

    ERIC Educational Resources Information Center

    Binous, Housam

    2006-01-01

    We show a new approach, based on the utilization of Mathematica, to solve gas permeation problems using membranes. We start with the design of a membrane unit for the separation of a multicomponent mixture. The built-in Mathematica function, FindRoot, allows one to solve seven simultaneous equations instead of using the iterative approach of…

  8. An Auto-Scoring Mechanism for Evaluating Problem-Solving Ability in a Web-Based Learning Environment

    ERIC Educational Resources Information Center

    Chiou, Chuang-Kai; Hwang, Gwo-Jen; Tseng, Judy C. R.

    2009-01-01

    The rapid development of computer and network technologies has attracted researchers to investigate strategies for and the effects of applying information technologies in learning activities; simultaneously, learning environments have been developed to record the learning portfolios of students seeking web information for problem-solving. Although…

  9. Computer-Based Assessment of Complex Problem Solving: Concept, Implementation, and Application

    ERIC Educational Resources Information Center

    Greiff, Samuel; Wustenberg, Sascha; Holt, Daniel V.; Goldhammer, Frank; Funke, Joachim

    2013-01-01

    Complex Problem Solving (CPS) skills are essential to successfully deal with environments that change dynamically and involve a large number of interconnected and partially unknown causal influences. The increasing importance of such skills in the 21st century requires appropriate assessment and intervention methods, which in turn rely on adequate…

  10. Unifying Computer-Based Assessment across Conceptual Instruction, Problem-Solving, and Digital Games

    ERIC Educational Resources Information Center

    Miller, William L.; Baker, Ryan S.; Rossi, Lisa M.

    2014-01-01

    As students work through online learning systems such as the Reasoning Mind blended learning system, they often are not confined to working within a single educational activity; instead, they work through various different activities such as conceptual instruction, problem-solving items, and fluency-building games. However, most work on assessing…

  11. Augmented neural networks and problem structure-based heuristics for the bin-packing problem

    NASA Astrophysics Data System (ADS)

    Kasap, Nihat; Agarwal, Anurag

    2012-08-01

    In this article, we report on a research project where we applied augmented-neural-networks (AugNNs) approach for solving the classical bin-packing problem (BPP). AugNN is a metaheuristic that combines a priority rule heuristic with the iterative search approach of neural networks to generate good solutions fast. This is the first time this approach has been applied to the BPP. We also propose a decomposition approach for solving harder BPP, in which subproblems are solved using a combination of AugNN approach and heuristics that exploit the problem structure. We discuss the characteristics of problems on which such problem structure-based heuristics could be applied. We empirically show the effectiveness of the AugNN and the decomposition approach on many benchmark problems in the literature. For the 1210 benchmark problems tested, 917 problems were solved to optimality and the average gap between the obtained solution and the upper bound for all the problems was reduced to under 0.66% and computation time averaged below 33 s per problem. We also discuss the computational complexity of our approach.
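A classic priority-rule heuristic for the BPP, of the general kind that AugNN couples with neural iterative search, is first-fit decreasing. The sketch below is the textbook heuristic only, not the paper's AugNN metaheuristic or its decomposition scheme:

```python
def first_fit_decreasing(items, capacity):
    """Sort items largest-first and place each into the first open bin with
    enough remaining room, opening a new bin when none fits."""
    remaining = []  # remaining capacity of each open bin
    packing = []    # items assigned to each bin
    for item in sorted(items, reverse=True):
        for i, rem in enumerate(remaining):
            if item <= rem:
                remaining[i] -= item
                packing[i].append(item)
                break
        else:  # no open bin fits: open a new one
            remaining.append(capacity - item)
            packing.append([item])
    return packing

# e.g. items [4, 8, 1, 4, 2, 1] with capacity 10 pack into two full bins
packing = first_fit_decreasing([4, 8, 1, 4, 2, 1], 10)
```

First-fit decreasing is guaranteed to use at most roughly 11/9 of the optimal number of bins; metaheuristics such as AugNN aim to close the remaining gap on hard instances.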

  12. Kinetics of Mismatch Formation opposite Lesions by the Replicative DNA Polymerase from Bacteriophage RB69

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hogg, Matthew; Rudnicki, Jean; Midkiff, John

    2010-04-12

The fidelity of DNA replication is under constant threat from the formation of lesions within the genome. Oxidation of DNA bases leads to the formation of altered DNA bases such as 8-oxo-7,8-dihydroguanine, commonly called 8-oxoG, and 2-hydroxyadenine, or 2-OHA. In this work we have examined the incorporation kinetics opposite these two oxidatively derived lesions as well as an abasic site analogue by the replicative DNA polymerase from bacteriophage RB69. We compared the kinetic parameters for both wild type and the low fidelity L561A variant. While nucleotide incorporation rates (k_pol) were generally higher for the variant, the presence of a lesion in the templating position reduced the ability of both the wild-type and variant DNA polymerases to form ternary enzyme-DNA-dNTP complexes. Thus, the L561A substitution does not significantly affect the ability of the RB69 DNA polymerase to recognize damaged DNA; instead, the mutation increases the probability that nucleotide incorporation will occur. We have also solved the crystal structure of the L561A variant forming an 8-oxoG·dATP mispair and show that the propensity for forming this mispair depends on an enlarged polymerase active site.

  13. Inverse design of an isotropic suspended Kirchhoff rod: theoretical and numerical results on the uniqueness of the natural shape.

    PubMed

    Bertails-Descoubes, Florence; Derouet-Jourdan, Alexandre; Romero, Victor; Lazarus, Arnaud

    2018-04-01

    Solving the equations for Kirchhoff elastic rods has been widely explored for decades in mathematics, physics and computer science, with significant applications in the modelling of thin flexible structures such as DNA, hair or climbing plants. As demonstrated in previous experimental and theoretical studies, the natural curvature plays an important role in the equilibrium shape of a Kirchhoff rod, even in the simple case where the rod is isotropic and suspended under gravity. In this paper, we investigate the reverse problem: can we characterize the natural curvature of a suspended isotropic rod, given an equilibrium curve? We prove that although there exists an infinite number of natural curvatures that are compatible with the prescribed equilibrium, they are all equivalent in the sense that they correspond to a unique natural shape for the rod. This natural shape can be computed efficiently by solving in sequence three linear initial value problems, starting from any framing of the input curve. We provide several numerical experiments to illustrate this uniqueness result, and finally discuss its potential impact on non-invasive parameter estimation and inverse design of thin elastic rods.

  14. Inverse design of an isotropic suspended Kirchhoff rod: theoretical and numerical results on the uniqueness of the natural shape

    NASA Astrophysics Data System (ADS)

    Bertails-Descoubes, Florence; Derouet-Jourdan, Alexandre; Romero, Victor; Lazarus, Arnaud

    2018-04-01

    Solving the equations for Kirchhoff elastic rods has been widely explored for decades in mathematics, physics and computer science, with significant applications in the modelling of thin flexible structures such as DNA, hair or climbing plants. As demonstrated in previous experimental and theoretical studies, the natural curvature plays an important role in the equilibrium shape of a Kirchhoff rod, even in the simple case where the rod is isotropic and suspended under gravity. In this paper, we investigate the reverse problem: can we characterize the natural curvature of a suspended isotropic rod, given an equilibrium curve? We prove that although there exists an infinite number of natural curvatures that are compatible with the prescribed equilibrium, they are all equivalent in the sense that they correspond to a unique natural shape for the rod. This natural shape can be computed efficiently by solving in sequence three linear initial value problems, starting from any framing of the input curve. We provide several numerical experiments to illustrate this uniqueness result, and finally discuss its potential impact on non-invasive parameter estimation and inverse design of thin elastic rods.

  15. Assessment of DNA extracted from FTA® cards for use on the Illumina iSelect BeadChip

    PubMed Central

    McClure, Matthew C; McKay, Stephanie D; Schnabel, Robert D; Taylor, Jeremy F

    2009-01-01

Background: As FTA® cards provide an ideal medium for the field collection of DNA we sought to assess the quality of genomic DNA extracted from this source for use on the Illumina BovineSNP50 iSelect BeadChip which requires unbound, relatively intact (fragment sizes ≥ 2 kb), and high-quality DNA. Bovine blood and nasal swab samples collected on FTA cards were extracted using the commercially available GenSolve kit with a minor modification. The call rate and concordance of genotypes from each sample were compared to those obtained from whole blood samples extracted by standard PCI extraction. Findings: An ANOVA analysis indicated no significant difference (P > 0.72) in BovineSNP50 genotype call rate between DNA extracted from FTA cards by the GenSolve kit or extracted from whole blood by PCI. Two sample t-tests demonstrated that the DNA extracted from the FTA cards produced genotype call and concordance rates that were not different to those produced by assaying DNA samples extracted by PCI from whole blood. Conclusion: We conclude that DNA extracted from FTA cards by the GenSolve kit is of sufficiently high quality to produce results comparable to those obtained from DNA extracted from whole blood when assayed by the Illumina iSelect technology. Additionally, we validate the use of nasal swabs as an alternative to venous blood or buccal samples from animal subjects for reliably producing high quality genotypes on this platform. PMID:19531223

  16. Assessment of DNA extracted from FTA cards for use on the Illumina iSelect BeadChip.

    PubMed

    McClure, Matthew C; McKay, Stephanie D; Schnabel, Robert D; Taylor, Jeremy F

    2009-06-16

As FTA cards provide an ideal medium for the field collection of DNA we sought to assess the quality of genomic DNA extracted from this source for use on the Illumina BovineSNP50 iSelect BeadChip which requires unbound, relatively intact (fragment sizes ≥ 2 kb), and high-quality DNA. Bovine blood and nasal swab samples collected on FTA cards were extracted using the commercially available GenSolve kit with a minor modification. The call rate and concordance of genotypes from each sample were compared to those obtained from whole blood samples extracted by standard PCI extraction. An ANOVA analysis indicated no significant difference (P > 0.72) in BovineSNP50 genotype call rate between DNA extracted from FTA cards by the GenSolve kit or extracted from whole blood by PCI. Two sample t-tests demonstrated that the DNA extracted from the FTA cards produced genotype call and concordance rates that were not different to those produced by assaying DNA samples extracted by PCI from whole blood. We conclude that DNA extracted from FTA cards by the GenSolve kit is of sufficiently high quality to produce results comparable to those obtained from DNA extracted from whole blood when assayed by the Illumina iSelect technology. Additionally, we validate the use of nasal swabs as an alternative to venous blood or buccal samples from animal subjects for reliably producing high quality genotypes on this platform.
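The two quality metrics compared in this study, genotype call rate and concordance, are straightforward to compute. A minimal sketch with hypothetical genotype calls, where None marks a no-call:

```python
def call_rate(genotypes):
    """Fraction of markers with a non-missing genotype call."""
    return sum(g is not None for g in genotypes) / len(genotypes)

def concordance(a, b):
    """Fraction of markers, called in both samples, where the calls agree."""
    pairs = [(x, y) for x, y in zip(a, b) if x is not None and y is not None]
    return sum(x == y for x, y in pairs) / len(pairs)

# Hypothetical calls for the same animal from an FTA-card extraction
# and a whole-blood (PCI) extraction, at five markers.
fta   = ["AA", "AB", None, "BB", "AB"]
blood = ["AA", "AB", "AA", "BB", "AA"]
```

Here call_rate(fta) is 0.8 (one no-call in five) and concordance is computed over the four markers called in both samples.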

  17. Numerical Modeling of Saturated Boiling in a Heated Tube

    NASA Technical Reports Server (NTRS)

    Majumdar, Alok; LeClair, Andre; Hartwig, Jason

    2017-01-01

    This paper describes a mathematical formulation and numerical solution of boiling in a heated tube. The mathematical formulation involves a discretization of the tube into a flow network consisting of fluid nodes and branches and a thermal network consisting of solid nodes and conductors. In the fluid network, the mass, momentum and energy conservation equations are solved and in the thermal network, the energy conservation equation of solids is solved. A pressure-based, finite-volume formulation has been used to solve the equations in the fluid network. The system of equations is solved by a hybrid numerical scheme which solves the mass and momentum conservation equations by a simultaneous Newton-Raphson method and the energy conservation equation by a successive substitution method. The fluid network and thermal network are coupled through heat transfer between the solid and fluid nodes which is computed by Chen's correlation of saturated boiling heat transfer. The computer model is developed using the Generalized Fluid System Simulation Program and the numerical predictions are compared with test data.
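    The hybrid numerical scheme described above (Newton-Raphson for the mass and momentum equations, successive substitution for energy) can be illustrated on a single heated branch. This is a minimal sketch, assuming a quadratic friction law dP = k·Q·|Q| and illustrative constants; it is not GFSSP's actual formulation:

```python
# Hypothetical single-branch analogue of the hybrid scheme: Newton-Raphson
# for the momentum balance, successive substitution for the energy balance.
# The friction law dP = k*Q*|Q| and all constants are illustrative assumptions.

def solve_flow(dp, k, q0=1.0, tol=1e-10, max_iter=50):
    """Newton-Raphson on the residual f(Q) = k*Q*|Q| - dp."""
    q = q0
    for _ in range(max_iter):
        f = k * q * abs(q) - dp
        fprime = 2.0 * k * abs(q)
        step = f / fprime
        q -= step
        if abs(step) < tol:
            break
    return q

def solve_outlet_temp(t_in, q, heat_in, cp, tol=1e-10, max_iter=200):
    """Successive substitution for T_out = T_in + heat_in / (m_dot * cp),
    with under-relaxation, as a stand-in for the thermal-network update."""
    t = t_in
    for _ in range(max_iter):
        t_new = t_in + heat_in / (q * cp)
        if abs(t_new - t) < tol:
            return t_new
        t = 0.5 * t + 0.5 * t_new  # under-relaxation
    return t

q = solve_flow(dp=200.0, k=2.0)      # 2 * 10 * 10 = 200, so Q = 10
t_out = solve_outlet_temp(t_in=300.0, q=q, heat_in=5000.0, cp=1.0)
print(q, t_out)
```

    In the full model the two networks are coupled: the heat-transfer term computed from the thermal network feeds back into the fluid-side energy equation at each iteration.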

  18. A randomized approach to speed up the analysis of large-scale read-count data in the application of CNV detection.

    PubMed

    Wang, WeiBo; Sun, Wei; Wang, Wei; Szatkiewicz, Jin

    2018-03-01

    The application of high-throughput sequencing in a broad range of quantitative genomic assays (e.g., DNA-seq, ChIP-seq) has created a high demand for the analysis of large-scale read-count data. Typically, the genome is divided into tiling windows and windowed read-count data is generated for the entire genome, from which genomic signals are detected (e.g., copy number changes in DNA-seq, enrichment peaks in ChIP-seq). For accurate analysis of read-count data, many state-of-the-art statistical methods use generalized linear models (GLM) coupled with the negative-binomial (NB) distribution, leveraging their ability for simultaneous bias correction and signal detection. However, although statistically powerful, the GLM+NB method has a quadratic computational complexity and therefore suffers from slow running time when applied to large-scale windowed read-count data. In this study, we aimed to substantially speed up the GLM+NB method by using a randomized algorithm, and we demonstrate the utility of our approach in the application of detecting copy number variants (CNVs) using a real example. We propose an efficient estimator, the randomized GLM+NB coefficients estimator (RGE), for speeding up the GLM+NB method. RGE samples the read-count data and solves the estimation problem on a smaller scale. We first theoretically validated the consistency and the variance properties of RGE. We then applied RGE to GENSENG, a GLM+NB based method for detecting CNVs, and named the resulting method "R-GENSENG". Based on extensive evaluation using both simulated and empirical data, we concluded that R-GENSENG is ten times faster than the original GENSENG while maintaining GENSENG's accuracy in CNV detection. Our results suggest that the RGE strategy developed here could be applied to other GLM+NB based read-count analyses, e.g., ChIP-seq data analysis, to substantially improve their computational efficiency while preserving the analytic power.
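    The core RGE idea, fitting the regression on a random subsample of windows rather than the whole genome, can be sketched with a plain Poisson log-linear model standing in for the NB GLM; the simulated bias covariate and all parameters below are illustrative assumptions:

```python
import math, random

random.seed(7)

# Simulate windowed read counts with one bias covariate x (e.g. GC content):
# y_i ~ Poisson(exp(b0 + b1 * x_i)). A Poisson GLM stands in here for the
# paper's NB GLM; all parameters are illustrative assumptions.
B0, B1, N = 2.0, 0.8, 20000
xs = [random.uniform(-1, 1) for _ in range(N)]

def poisson(lam):
    # Knuth's multiplication method (adequate for the modest rates used here)
    limit, k, p = math.exp(-lam), 0, 1.0
    while True:
        k += 1
        p *= random.random()
        if p <= limit:
            return k - 1

ys = [poisson(math.exp(B0 + B1 * x)) for x in xs]

def fit_poisson_glm(x, y, iters=25):
    """Newton-Raphson (IRLS) for a 2-parameter Poisson log-linear model."""
    b0, b1 = math.log(sum(y) / len(y) + 1e-9), 0.0
    for _ in range(iters):
        g0 = g1 = h00 = h01 = h11 = 0.0
        for xi, yi in zip(x, y):
            mu = math.exp(b0 + b1 * xi)
            g0 += yi - mu            # gradient: X^T (y - mu)
            g1 += (yi - mu) * xi
            h00 += mu                # Fisher information: X^T diag(mu) X
            h01 += mu * xi
            h11 += mu * xi * xi
        det = h00 * h11 - h01 * h01
        b0 += (h11 * g0 - h01 * g1) / det
        b1 += (h00 * g1 - h01 * g0) / det
    return b0, b1

# Full-data fit versus the randomized estimator: fit on a 10% subsample.
full = fit_poisson_glm(xs, ys)
idx = random.sample(range(N), N // 10)
sub = fit_poisson_glm([xs[i] for i in idx], [ys[i] for i in idx])
print(full, sub)  # both should land close to (2.0, 0.8)
```

    The subsample fit costs a tenth of the work per iteration while its coefficients stay within sampling error of the full-data fit, which is the consistency property the paper validates for RGE.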

  19. The Goal Specificity Effect on Strategy Use and Instructional Efficiency during Computer-Based Scientific Discovery Learning

    ERIC Educational Resources Information Center

    Kunsting, Josef; Wirth, Joachim; Paas, Fred

    2011-01-01

    Using a computer-based scientific discovery learning environment on buoyancy in fluids we investigated the "effects of goal specificity" (nonspecific goals vs. specific goals) for two goal types (problem solving goals vs. learning goals) on "strategy use" and "instructional efficiency". Our empirical findings close an important research gap,…

  20. A Problem Posing-Based Practicing Strategy for Facilitating Students' Computer Programming Skills in the Team-Based Learning Mode

    ERIC Educational Resources Information Center

    Wang, Xiao-Ming; Hwang, Gwo-Jen

    2017-01-01

    Computer programming is a subject that requires problem-solving strategies and involves a great number of programming logic activities which pose challenges for learners. Therefore, providing learning support and guidance is important. Collaborative learning is widely believed to be an effective teaching approach; it can enhance learners' social…

  1. An Improved Quantitative Real-Time PCR Assay for the Enumeration of Heterosigma akashiwo (Raphidophyceae) Cysts Using a DNA Debris Removal Method and a Cyst-Based Standard Curve.

    PubMed

    Kim, Joo-Hwan; Kim, Jin Ho; Wang, Pengbin; Park, Bum Soo; Han, Myung-Soo

    2016-01-01

    The identification and quantification of Heterosigma akashiwo cysts in sediments by light microscopy can be difficult due to the small size and morphology of the cysts, which are often indistinguishable from those of other types of algae. Quantitative real-time PCR (qPCR) based assays represent a potentially efficient method for quantifying the abundance of H. akashiwo cysts, although standard curves must be based on cyst DNA rather than on vegetative cell DNA due to differences in gene copy number and DNA extraction yield between these two cell types. Furthermore, qPCR on sediment samples can be complicated by the presence of extracellular DNA debris. To solve these problems, we constructed a cyst-based standard curve and developed a simple method for removing DNA debris from sediment samples. This cyst-based standard curve was compared with a standard curve based on vegetative cells, as vegetative cells may have twice the gene copy number of cysts. To remove DNA debris from the sediment, we developed a simple method involving dilution with distilled water and heating at 75°C. A total of 18 sediment samples were used to evaluate this method. Cyst abundance determined using the qPCR assay without DNA debris removal yielded results up to 51-fold greater than with direct counting. By contrast, a highly significant correlation was observed between cyst abundance determined by direct counting and the qPCR assay in conjunction with DNA debris removal (r2 = 0.72, slope = 1.07, p < 0.001). Therefore, this improved qPCR method should be a powerful tool for the accurate quantification of H. akashiwo cysts in sediment samples.
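    Quantification against a cyst-based standard curve reduces to regressing Ct on log10(cyst count) over a dilution series and inverting the fit for unknown samples. A minimal sketch with invented Ct values (not data from this study):

```python
import math

# Hypothetical cyst-based standard curve: Ct versus log10(cyst count).
# The dilution series and Ct values are invented for illustration only.
standards = [(10, 30.1), (100, 26.8), (1000, 23.4), (10000, 20.0)]
xs = [math.log10(n) for n, _ in standards]
ys = [ct for _, ct in standards]

n = len(xs)
xbar, ybar = sum(xs) / n, sum(ys) / n
slope = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys)) / \
        sum((x - xbar) ** 2 for x in xs)
intercept = ybar - slope * xbar

# Amplification efficiency implied by the slope: E = 10**(-1/slope) - 1
efficiency = 10 ** (-1.0 / slope) - 1.0

def cysts_from_ct(ct):
    """Invert the standard curve to estimate cyst abundance."""
    return 10 ** ((ct - intercept) / slope)

print(round(slope, 3), round(efficiency, 3), round(cysts_from_ct(25.0), 1))
```

    Building the curve from known numbers of cysts, rather than vegetative cells, bakes the cyst-specific gene copy number and extraction yield into the slope and intercept, which is the correction the study makes.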

  2. The Time on Task Effect in Reading and Problem Solving Is Moderated by Task Difficulty and Skill: Insights from a Computer-Based Large-Scale Assessment

    ERIC Educational Resources Information Center

    Goldhammer, Frank; Naumann, Johannes; Stelter, Annette; Tóth, Krisztina; Rölke, Heiko; Klieme, Eckhard

    2014-01-01

    Computer-based assessment can provide new insights into behavioral processes of task completion that cannot be uncovered by paper-based instruments. Time presents a major characteristic of the task completion process. Psychologically, time on task has 2 different interpretations, suggesting opposing associations with task outcome: Spending more…

  3. DNA Origami-Graphene Hybrid Nanopore for DNA Detection.

    PubMed

    Barati Farimani, Amir; Dibaeinia, Payam; Aluru, Narayana R

    2017-01-11

    DNA origami nanostructures can be used to functionalize solid-state nanopores for single molecule studies. In this study, we characterized a nanopore in a DNA origami-graphene heterostructure for DNA detection. The DNA origami nanopore is functionalized with a specific nucleotide type at the edge of the pore. Using extensive molecular dynamics (MD) simulations, we computed and analyzed the ionic conductivity of nanopores in heterostructures carpeted with one or two layers of DNA origami on graphene. We demonstrate that a nanopore in DNA origami-graphene gives rise to distinguishable dwell times for the four DNA base types, whereas for a nanopore in bare graphene, the dwell time is almost the same for all types of bases. The specific interactions (hydrogen bonds) between DNA origami and the translocating DNA strand yield different residence times and ionic currents. We also conclude that the speed of DNA translocation decreases due to the friction between the dangling bases at the pore mouth and the sequencing DNA strands.

  4. [Forensic evidence-based medicine in computer communication networks].

    PubMed

    Qiu, Yun-Liang; Peng, Ming-Qi

    2013-12-01

    As an important component of judicial expertise, forensic science is broad and highly specialized. With the development of network technology, the increase of information resources, and the improvement of people's legal consciousness, forensic scientists encounter many new problems and have been required to meet higher evidentiary standards in litigation. In view of this, an evidence-based concept should be established in forensic medicine. We should find the most suitable methods in the forensic science field and other related areas to solve specific problems in the evidence-based mode. Evidence-based practice can solve the problems in the legal medical field, and it will play a great role in promoting the progress and development of forensic science. This article reviews the basic theory of evidence-based medicine and its effect, way, method, and evaluation in forensic medicine in order to discuss the application value of forensic evidence-based medicine in computer communication networks.

  5. A review on economic emission dispatch problems using quantum computational intelligence

    NASA Astrophysics Data System (ADS)

    Mahdi, Fahad Parvez; Vasant, Pandian; Kallimani, Vish; Abdullah-Al-Wadud, M.

    2016-11-01

    Economic emission dispatch (EED) problems are among the most crucial problems in power systems. Growing energy demand, the limitation of natural resources and global warming place this topic at the center of discussion and research. This paper reviews the use of Quantum Computational Intelligence (QCI) in solving economic emission dispatch problems. QCI techniques like the Quantum Genetic Algorithm (QGA) and the Quantum Particle Swarm Optimization (QPSO) algorithm are discussed here. This paper should encourage researchers to use more QCI-based algorithms to obtain better optimal results for solving EED problems.

  6. An efficient computer based wavelets approximation method to solve Fuzzy boundary value differential equations

    NASA Astrophysics Data System (ADS)

    Alam Khan, Najeeb; Razzaq, Oyoon Abdul

    2016-03-01

    In the present work a wavelets approximation method is employed to solve fuzzy boundary value differential equations (FBVDEs). Essentially, a truncated Legendre wavelets series together with the Legendre wavelets operational matrix of derivative is utilized to convert an FBVDE into a simple computational problem by reducing it to a system of fuzzy algebraic linear equations. The capability of the scheme is investigated on a second-order FBVDE considered under generalized H-differentiability. Solutions are represented graphically, showing the competency and accuracy of this method.

  7. Subspace projection method for unstructured searches with noisy quantum oracles using a signal-based quantum emulation device

    NASA Astrophysics Data System (ADS)

    La Cour, Brian R.; Ostrove, Corey I.

    2017-01-01

    This paper describes a novel approach to solving unstructured search problems using a classical, signal-based emulation of a quantum computer. The classical nature of the representation allows one to perform subspace projections in addition to the usual unitary gate operations. Although bandwidth requirements will limit the scale of problems that can be solved by this method, it can nevertheless provide a significant computational advantage for problems of limited size. In particular, we find that, for the same number of noisy oracle calls, the proposed subspace projection method provides a higher probability of success for finding a solution than does a single application of Grover's algorithm on the same device.

  8. Experimental realization of a one-way quantum computer algorithm solving Simon's problem.

    PubMed

    Tame, M S; Bell, B A; Di Franco, C; Wadsworth, W J; Rarity, J G

    2014-11-14

    We report an experimental demonstration of a one-way implementation of a quantum algorithm solving Simon's problem-a black-box period-finding problem that has an exponential gap between the classical and quantum runtime. Using an all-optical setup and modifying the bases of single-qubit measurements on a five-qubit cluster state, key representative functions of the logical two-qubit version's black box can be queried and solved. To the best of our knowledge, this work represents the first experimental realization of the quantum algorithm solving Simon's problem. The experimental results are in excellent agreement with the theoretical model, demonstrating the successful performance of the algorithm. With a view to scaling up to larger numbers of qubits, we analyze the resource requirements for an n-qubit version. This work helps highlight how one-way quantum computing provides a practical route to experimentally investigating the quantum-classical gap in the query complexity model.
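    For context, Simon's problem hides a bitstring s in a black box f satisfying f(x) = f(x XOR s); classically, s can only be recovered by collision search, which is the source of the exponential quantum-classical gap the experiment probes. A minimal classical sketch, using one standard illustrative oracle construction (f(x) = min(x, x XOR s)):

```python
# Simon's problem: f(x) = f(x ^ s) for a hidden bitstring s, with f otherwise
# 2-to-1. Classically s is found only via a collision f(x) == f(y), whence
# s = x ^ y. The oracle f(x) = min(x, x ^ s) is an illustrative choice.
def make_oracle(s):
    return lambda x: min(x, x ^ s)

def classical_simon(f, n):
    """Collision search over n-bit inputs; exponential in n on average."""
    seen = {}
    for x in range(2 ** n):
        y = f(x)
        if y in seen and seen[y] != x:
            return seen[y] ^ x      # the hidden period s
        seen[y] = x
    return 0                        # f was injective, so s = 0

n, s = 5, 0b10110
print(bin(classical_simon(make_oracle(s), n)))  # prints 0b10110
```

    The quantum algorithm instead extracts, per query, a random y with y·s = 0 (mod 2) and solves the resulting linear system, needing only about n queries.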

  9. A System for Generating Instructional Computer Graphics.

    ERIC Educational Resources Information Center

    Nygard, Kendall E.; Ranganathan, Babusankar

    1983-01-01

    Description of the Tektronix-Based Interactive Graphics System for Instruction (TIGSI), which was developed for generating graphics displays in computer-assisted instruction materials, discusses several applications (e.g., reinforcing learning of concepts, principles, rules, and problem-solving techniques) and presents advantages of the TIGSI…

  10. Programmable hardware for reconfigurable computing systems

    NASA Astrophysics Data System (ADS)

    Smith, Stephen

    1996-10-01

    In 1945 the work of J. von Neumann and H. Goldstine created the principal architecture for electronic computation that has now lasted fifty years. Nevertheless, alternative architectures have been created that have computational capability, for special tasks, far beyond that feasible with von Neumann machines. The emergence of high-capacity programmable logic devices has made the realization of these architectures practical. The original ENIAC and EDVAC machines were conceived to solve special mathematical problems that were far from today's concept of 'killer applications.' In a similar vein, programmable hardware computation is being used today to solve unique mathematical problems. Our programmable hardware activity is focused on the research and development of novel computational systems based upon the reconfigurability of our programmable logic devices. We explore our programmable logic architectures and their implications for programmable hardware. One programmable hardware board implementation is detailed.

  11. DNA Cryptography and Deep Learning using Genetic Algorithm with NW algorithm for Key Generation.

    PubMed

    Kalsi, Shruti; Kaur, Harleen; Chang, Victor

    2017-12-05

    Cryptography is not only the science of applying complex mathematics and logic to design strong methods to hide data, called encryption, but also of retrieving the original data back, called decryption. The purpose of cryptography is to transmit a message between a sender and receiver such that an eavesdropper is unable to comprehend it. To accomplish this, we need not only a strong algorithm, but also a strong key and a strong concept for the encryption and decryption process. We have introduced a concept of DNA Deep Learning Cryptography, which is defined as a technique of concealing data in terms of a DNA sequence and deep learning. In the cryptographic technique, each letter of the alphabet is converted into a different combination of the four bases, namely Adenine (A), Cytosine (C), Guanine (G) and Thymine (T), which make up human deoxyribonucleic acid (DNA). Actual implementations with DNA do not go beyond the laboratory level and are expensive. To bring DNA computing to a digital level, easy and effective algorithms are proposed in this paper. In the proposed work we have introduced, first, a method and its implementation for key generation based on the theory of natural selection, using a Genetic Algorithm with the Needleman-Wunsch (NW) algorithm, and second, a method for the implementation of encryption and decryption based on DNA computing, using the biological operations of transcription, translation, DNA sequencing and deep learning.
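    The letter-to-bases mapping described above can be sketched as a 2-bits-per-base encoding, so one byte becomes four bases; the particular bit-to-base table is an illustrative assumption, not the authors' actual scheme:

```python
# Map 2 bits to one nucleotide; one byte becomes four bases. The particular
# table below is an illustrative assumption, not the paper's actual coding.
BITS_TO_BASE = {0b00: 'A', 0b01: 'C', 0b10: 'G', 0b11: 'T'}
BASE_TO_BITS = {b: v for v, b in BITS_TO_BASE.items()}

def encode_dna(text):
    bases = []
    for byte in text.encode('utf-8'):
        for shift in (6, 4, 2, 0):          # most significant bit pair first
            bases.append(BITS_TO_BASE[(byte >> shift) & 0b11])
    return ''.join(bases)

def decode_dna(seq):
    data = bytearray()
    for i in range(0, len(seq), 4):
        byte = 0
        for base in seq[i:i + 4]:
            byte = (byte << 2) | BASE_TO_BITS[base]
        data.append(byte)
    return data.decode('utf-8')

seq = encode_dna("key")
print(seq, decode_dna(seq))  # round-trips back to "key"
```

    Such an encoding only moves data into the DNA alphabet; the security in the paper's scheme comes from the GA/NW-generated key and the subsequent biological-operation transformations, not from the mapping itself.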

  12. Programmable chemical controllers made from DNA.

    PubMed

    Chen, Yuan-Jyue; Dalchau, Neil; Srinivas, Niranjan; Phillips, Andrew; Cardelli, Luca; Soloveichik, David; Seelig, Georg

    2013-10-01

    Biological organisms use complex molecular networks to navigate their environment and regulate their internal state. The development of synthetic systems with similar capabilities could lead to applications such as smart therapeutics or fabrication methods based on self-organization. To achieve this, molecular control circuits need to be engineered to perform integrated sensing, computation and actuation. Here we report a DNA-based technology for implementing the computational core of such controllers. We use the formalism of chemical reaction networks as a 'programming language' and our DNA architecture can, in principle, implement any behaviour that can be mathematically expressed as such. Unlike logic circuits, our formulation naturally allows complex signal processing of intrinsically analogue biological and chemical inputs. Controller components can be derived from biologically synthesized (plasmid) DNA, which reduces errors associated with chemically synthesized DNA. We implement several building-block reaction types and then combine them into a network that realizes, at the molecular level, an algorithm used in distributed control systems for achieving consensus between multiple agents.
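    The consensus algorithm realized at the molecular level is commonly formulated as the approximate-majority chemical reaction network X + Y → 2B, X + B → 2X, Y + B → 2Y, in which the initial majority species converts the whole population. A stochastic (Gillespie-style) sketch, with unit rate constants as an assumption:

```python
import random

random.seed(3)

# Approximate-majority consensus as a chemical reaction network:
#   X + Y -> B + B,   X + B -> X + X,   Y + B -> Y + Y
# simulated with a simple Gillespie algorithm. Unit rate constants are an
# illustrative assumption.
def simulate(x, y, b=0):
    while x * y + x * b + y * b > 0:          # stop at an absorbing state
        a1, a2, a3 = x * y, x * b, y * b      # propensities (all k = 1)
        r = random.uniform(0, a1 + a2 + a3)
        if r < a1:
            x, y, b = x - 1, y - 1, b + 2     # X + Y -> 2B
        elif r < a1 + a2:
            x, b = x + 1, b - 1               # X + B -> 2X
        else:
            y, b = y + 1, b - 1               # Y + B -> 2Y
    return x, y, b

x, y, b = simulate(60, 40)
print(x, y, b)  # the initial majority (X) usually takes over the population
```

    Each reaction conserves the total molecule count, and the DNA strand-displacement architecture in the paper compiles each formal reaction into a set of designed strand-exchange steps.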

  13. Programmable chemical controllers made from DNA

    NASA Astrophysics Data System (ADS)

    Chen, Yuan-Jyue; Dalchau, Neil; Srinivas, Niranjan; Phillips, Andrew; Cardelli, Luca; Soloveichik, David; Seelig, Georg

    2013-10-01

    Biological organisms use complex molecular networks to navigate their environment and regulate their internal state. The development of synthetic systems with similar capabilities could lead to applications such as smart therapeutics or fabrication methods based on self-organization. To achieve this, molecular control circuits need to be engineered to perform integrated sensing, computation and actuation. Here we report a DNA-based technology for implementing the computational core of such controllers. We use the formalism of chemical reaction networks as a 'programming language' and our DNA architecture can, in principle, implement any behaviour that can be mathematically expressed as such. Unlike logic circuits, our formulation naturally allows complex signal processing of intrinsically analogue biological and chemical inputs. Controller components can be derived from biologically synthesized (plasmid) DNA, which reduces errors associated with chemically synthesized DNA. We implement several building-block reaction types and then combine them into a network that realizes, at the molecular level, an algorithm used in distributed control systems for achieving consensus between multiple agents.

  14. Programmable chemical controllers made from DNA

    PubMed Central

    Chen, Yuan-Jyue; Dalchau, Neil; Srinivas, Niranjan; Phillips, Andrew; Cardelli, Luca; Soloveichik, David; Seelig, Georg

    2014-01-01

    Biological organisms use complex molecular networks to navigate their environment and regulate their internal state. The development of synthetic systems with similar capabilities could lead to applications such as smart therapeutics or fabrication methods based on self-organization. To achieve this, molecular control circuits need to be engineered to perform integrated sensing, computation and actuation. Here we report a DNA-based technology for implementing the computational core of such controllers. We use the formalism of chemical reaction networks as a 'programming language', and our DNA architecture can, in principle, implement any behaviour that can be mathematically expressed as such. Unlike logic circuits, our formulation naturally allows complex signal processing of intrinsically analogue biological and chemical inputs. Controller components can be derived from biologically synthesized (plasmid) DNA, which reduces errors associated with chemically synthesized DNA. We implement several building-block reaction types and then combine them into a network that realizes, at the molecular level, an algorithm used in distributed control systems for achieving consensus between multiple agents. PMID:24077029

  15. Distributed intrusion detection system based on grid security model

    NASA Astrophysics Data System (ADS)

    Su, Jie; Liu, Yahui

    2008-03-01

    Grid computing has developed rapidly with the development of network technology, and it can solve the problem of large-scale complex computing by sharing large-scale computing resources. In a grid environment, we can realize a distributed and load-balanced intrusion detection system. This paper first discusses the security mechanism in grid computing and the function of PKI/CA in the grid security system, then describes how the characteristics of grid computing are applied in a distributed intrusion detection system (IDS) based on an Artificial Immune System. Finally, it presents a distributed intrusion detection system based on the grid security system that can reduce the processing delay and assure the detection rate.

  16. Naturally selecting solutions: the use of genetic algorithms in bioinformatics.

    PubMed

    Manning, Timmy; Sleator, Roy D; Walsh, Paul

    2013-01-01

    For decades, computer scientists have looked to nature for biologically inspired solutions to computational problems, ranging from robotic control to scheduling optimization. Paradoxically, as we move deeper into the post-genomics era, the reverse is occurring, as biologists and bioinformaticians look to computational techniques to solve a variety of biological problems. One of the most common biologically inspired techniques is the genetic algorithm (GA), which takes the Darwinian concept of natural selection as the driving force behind systems for solving real-world problems, including those in the bioinformatics domain. Herein, we provide an overview of genetic algorithms and survey some of the most recent applications of this approach to bioinformatics-based problems.
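    The GA loop of selection, crossover and mutation can be sketched in a few lines; the toy fitness function (matching a target DNA string) and all parameters are illustrative assumptions:

```python
import random

random.seed(1)

# Minimal genetic algorithm: evolve random DNA strings toward a target
# sequence. Population size, rates and the toy fitness are illustrative.
TARGET = "ACGTACGTACGT"
ALPHABET = "ACGT"

def fitness(ind):
    return sum(a == b for a, b in zip(ind, TARGET))

def crossover(a, b):
    cut = random.randrange(1, len(TARGET))    # single-point crossover
    return a[:cut] + b[cut:]

def mutate(ind, rate=0.05):
    return ''.join(c if random.random() > rate else random.choice(ALPHABET)
                   for c in ind)

pop = [''.join(random.choice(ALPHABET) for _ in TARGET) for _ in range(60)]
for gen in range(400):
    pop.sort(key=fitness, reverse=True)
    if fitness(pop[0]) == len(TARGET):
        break
    parents = pop[:20]                        # truncation selection + elitism
    pop = parents + [mutate(crossover(*random.sample(parents, 2)))
                     for _ in range(40)]

best = max(pop, key=fitness)
print(gen, best)
```

    Real bioinformatics applications replace the toy fitness with an objective such as alignment score or model likelihood; the selection-variation loop is unchanged.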

  17. Computer Graphics-aided systems analysis: application to well completion design

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Detamore, J.E.; Sarma, M.P.

    1985-03-01

    The development of an engineering tool (in the form of a computer model) for solving design and analysis problems related to oil and gas well production operations is discussed. The development of the method is based on integrating the concepts of "Systems Analysis" with the techniques of "Computer Graphics". The concepts behind the method are very general in nature. This paper, however, illustrates the application of the method in solving gas well completion design problems. The use of the method will save time and improve the efficiency of such design and analysis problems. The method can be extended to other design and analysis aspects of oil and gas wells.

  18. Identification of DNA-binding proteins by combining auto-cross covariance transformation and ensemble learning.

    PubMed

    Liu, Bin; Wang, Shanyi; Dong, Qiwen; Li, Shumin; Liu, Xuan

    2016-04-20

    DNA-binding proteins play a pivotal role in various intra- and extra-cellular activities ranging from DNA replication to gene expression control. With the rapid development of next-generation sequencing techniques, the number of protein sequences is increasing at an unprecedented rate. Thus it is necessary to develop computational methods to identify DNA-binding proteins based only on the protein sequence information. In this study, a novel method called iDNA-KACC is presented, which combines the Support Vector Machine (SVM) and the auto-cross covariance transformation. The protein sequences are first converted into a profile-based protein representation, and then converted into a series of fixed-length vectors by the auto-cross covariance transformation with Kmer composition. The sequence order effect can be effectively captured by this scheme. These vectors are then fed into the SVM to discriminate the DNA-binding proteins from the non-DNA-binding ones. iDNA-KACC achieves an overall accuracy of 75.16% and a Matthews correlation coefficient of 0.5 by a rigorous jackknife test. Its performance is further improved by employing an ensemble learning approach, and the improved predictor is called iDNA-KACC-EL. Experimental results on an independent dataset show that iDNA-KACC-EL outperforms all the other state-of-the-art predictors, indicating that it would be a useful computational tool for DNA-binding protein identification.
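    An auto-covariance transformation of the kind described maps a variable-length sequence onto a fixed-length feature vector indexed by lag, which is what makes SVM input possible. The per-residue property values below are toy assumptions, not the profile-based representation used by iDNA-KACC:

```python
# Auto-covariance (AC) features: for each lag g, the covariance between a
# numeric per-residue property at positions i and i+g. The hydrophobicity-like
# property table here is a toy assumption for illustration.
PROPERTY = {'A': 1.8, 'C': 2.5, 'D': -3.5, 'E': -3.5, 'G': -0.4,
            'K': -3.9, 'L': 3.8, 'R': -4.5, 'S': -0.8, 'V': 4.2}

def auto_covariance(seq, max_lag=4):
    vals = [PROPERTY[a] for a in seq]
    n = len(vals)
    mean = sum(vals) / n
    feats = []
    for g in range(1, max_lag + 1):
        cov = sum((vals[i] - mean) * (vals[i + g] - mean)
                  for i in range(n - g)) / (n - g)
        feats.append(cov)
    return feats    # always length max_lag, whatever the sequence length

f1 = auto_covariance("ACDEGKLRSV")
f2 = auto_covariance("ACDEGKLRSVACDEG")
print(len(f1), len(f2))  # both 4: fixed length regardless of sequence length
```

    The cross-covariance variant pairs two different properties at the two positions; stacking AC and CC terms over several lags and properties yields the full fixed-length vector fed to the classifier.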

  19. Design and Application of Interactive Simulations in Problem-Solving in University-Level Physics Education

    ERIC Educational Resources Information Center

    Ceberio, Mikel; Almudí, José Manuel; Franco, Ángel

    2016-01-01

    In recent years, interactive computer simulations have been progressively integrated in the teaching of the sciences and have contributed significant improvements in the teaching-learning process. Practicing problem-solving is a key factor in science and engineering education. The aim of this study was to design simulation-based problem-solving…

  20. Real-Time Assessment of Problem-Solving of Physics Students Using Computer-Based Technology

    ERIC Educational Resources Information Center

    Gok, Tolga

    2012-01-01

    The change in students' problem-solving ability in an upper-level course through the application of a technological interactive environment--a Tablet PC running InkSurvey--was investigated in the present study. Tablet PC/InkSurvey interactive technology allowing the instructor to receive real-time formative assessment as the class works through the problem…

  1. Instructional Applications of Artificial Intelligence.

    ERIC Educational Resources Information Center

    Halff, Henry M.

    1986-01-01

    Surveys artificial intelligence and the development of computer-based tutors and speculates on the future of artificial intelligence in education. Includes discussion of the definitions of knowledge, expert systems (computer systems that solve tough technical problems), intelligent tutoring systems (ITS), and specific ITSs such as GUIDON, MYCIN,…

  2. Structure-based cleavage mechanism of Thermus thermophilus Argonaute DNA guide strand-mediated DNA target cleavage

    PubMed Central

    Sheng, Gang; Zhao, Hongtu; Wang, Jiuyu; Rao, Yu; Tian, Wenwen; Swarts, Daan C.; van der Oost, John; Patel, Dinshaw J.; Wang, Yanli

    2014-01-01

    We report on crystal structures of ternary Thermus thermophilus Argonaute (TtAgo) complexes with 5′-phosphorylated guide DNA and a series of DNA targets. These ternary complex structures of cleavage-incompatible, cleavage-compatible, and postcleavage states solved at improved resolution up to 2.2 Å have provided molecular insights into the orchestrated positioning of catalytic residues, a pair of Mg2+ cations, and the putative water nucleophile positioned for in-line attack on the cleavable phosphate for TtAgo-mediated target cleavage by a RNase H-type mechanism. In addition, these ternary complex structures have provided insights into protein and DNA conformational changes that facilitate transition between cleavage-incompatible and cleavage-compatible states, including the role of a Glu finger in generating a cleavage-competent catalytic Asp-Glu-Asp-Asp tetrad. Following cleavage, the seed segment forms a stable duplex with the complementary segment of the target strand. PMID:24374628

  3. A Scheduling Algorithm for Cloud Computing System Based on the Driver of Dynamic Essential Path.

    PubMed

    Xie, Zhiqiang; Shao, Xia; Xin, Yu

    2016-01-01

    To solve the problem of task scheduling in the cloud computing system, this paper proposes a scheduling algorithm for cloud computing based on the driver of dynamic essential path (DDEP). This algorithm applies a predecessor-task layer priority strategy to solve the problem of constraint relations among task nodes. The strategy assigns different priority values to every task node based on the scheduling order of task node as affected by the constraint relations among task nodes, and the task node list is generated by the different priority value. To address the scheduling order problem in which task nodes have the same priority value, the dynamic essential long path strategy is proposed. This strategy computes the dynamic essential path of the pre-scheduling task nodes based on the actual computation cost and communication cost of task node in the scheduling process. The task node that has the longest dynamic essential path is scheduled first as the completion time of task graph is indirectly influenced by the finishing time of task nodes in the longest dynamic essential path. Finally, we demonstrate the proposed algorithm via simulation experiments using Matlab tools. The experimental results indicate that the proposed algorithm can effectively reduce the task Makespan in most cases and meet a high quality performance objective.
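    The essential-path priority can be illustrated with a longest computation-plus-communication path over a toy task DAG; the graph and costs below are assumptions, and the paper's dynamic recomputation of these paths during scheduling is omitted:

```python
from functools import lru_cache

# Toy task DAG: per-task computation cost and per-edge communication cost.
# The graph and all costs are illustrative assumptions; the full DDEP
# algorithm recomputes essential paths dynamically while scheduling.
comp = {'a': 3, 'b': 2, 'c': 4, 'd': 1, 'e': 2}
succ = {'a': {'b': 1, 'c': 2}, 'b': {'d': 3}, 'c': {'d': 1},
        'd': {'e': 2}, 'e': {}}

@lru_cache(maxsize=None)
def essential_path(task):
    """Longest computation + communication path from task to an exit node."""
    tails = [comm + essential_path(nxt) for nxt, comm in succ[task].items()]
    return comp[task] + (max(tails) if tails else 0)

# Schedule ready tasks longest-essential-path first, since the tasks on the
# longest path bound the makespan of the whole task graph.
order = sorted(comp, key=essential_path, reverse=True)
print({t: essential_path(t) for t in comp}, order)
```

    In the full algorithm this priority is combined with the predecessor-task layer ordering, so a task is only considered once its constraint relations are satisfied.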

  4. A Scheduling Algorithm for Cloud Computing System Based on the Driver of Dynamic Essential Path

    PubMed Central

    Xie, Zhiqiang; Shao, Xia; Xin, Yu

    2016-01-01

    To solve the problem of task scheduling in the cloud computing system, this paper proposes a scheduling algorithm for cloud computing based on the driver of dynamic essential path (DDEP). This algorithm applies a predecessor-task layer priority strategy to solve the problem of constraint relations among task nodes. The strategy assigns different priority values to every task node based on the scheduling order of task node as affected by the constraint relations among task nodes, and the task node list is generated by the different priority value. To address the scheduling order problem in which task nodes have the same priority value, the dynamic essential long path strategy is proposed. This strategy computes the dynamic essential path of the pre-scheduling task nodes based on the actual computation cost and communication cost of task node in the scheduling process. The task node that has the longest dynamic essential path is scheduled first as the completion time of task graph is indirectly influenced by the finishing time of task nodes in the longest dynamic essential path. Finally, we demonstrate the proposed algorithm via simulation experiments using Matlab tools. The experimental results indicate that the proposed algorithm can effectively reduce the task Makespan in most cases and meet a high quality performance objective. PMID:27490901

  5. Use of edge-based finite elements for solving three dimensional scattering problems

    NASA Technical Reports Server (NTRS)

    Chatterjee, A.; Jin, J. M.; Volakis, John L.

    1991-01-01

    Edge based finite elements are free from drawbacks associated with node based vectorial finite elements and are, therefore, ideal for solving 3-D scattering problems. The finite element discretization using edge elements is checked by solving for the resonant frequencies of a closed inhomogeneously filled metallic cavity. Great improvements in accuracy are observed when compared to the classical node based approach with no penalty in terms of computational time and with the expected absence of spurious modes. A performance comparison between the edge based tetrahedra and rectangular brick elements is carried out and tetrahedral elements are found to be more accurate than rectangular bricks for a given storage intensity. A detailed formulation for the scattering problem with various approaches for terminating the finite element mesh is also presented.

  6. A DNA-based molecular motor that can navigate a network of tracks

    NASA Astrophysics Data System (ADS)

    Wickham, Shelley F. J.; Bath, Jonathan; Katsuda, Yousuke; Endo, Masayuki; Hidaka, Kumi; Sugiyama, Hiroshi; Turberfield, Andrew J.

    2012-03-01

    Synthetic molecular motors can be fuelled by the hydrolysis or hybridization of DNA. Such motors can move autonomously and programmably, and long-range transport has been observed on linear tracks. It has also been shown that DNA systems can compute. Here, we report a synthetic DNA-based system that integrates long-range transport and information processing. We show that the path of a motor through a network of tracks containing four possible routes can be programmed using instructions that are added externally or carried by the motor itself. When external control is used we find that 87% of the motors follow the correct path, and when internal control is used 71% of the motors follow the correct path. Programmable motion will allow the development of computing networks, molecular systems that can sort and process cargoes according to instructions that they carry, and assembly lines that can be reconfigured dynamically in response to changing demands.

  7. Antibody-controlled actuation of DNA-based molecular circuits.

    PubMed

    Engelen, Wouter; Meijer, Lenny H H; Somers, Bram; de Greef, Tom F A; Merkx, Maarten

    2017-02-17

    DNA-based molecular circuits allow autonomous signal processing, but their actuation has relied mostly on RNA/DNA-based inputs, limiting their application in synthetic biology, biomedicine and molecular diagnostics. Here we introduce a generic method to translate the presence of an antibody into a unique DNA strand, enabling the use of antibodies as specific inputs for DNA-based molecular computing. Our approach, antibody-templated strand exchange (ATSE), uses the characteristic bivalent architecture of antibodies to promote DNA-strand exchange reactions both thermodynamically and kinetically. Detailed characterization of the ATSE reaction allowed the establishment of a comprehensive model that describes the kinetics and thermodynamics of ATSE as a function of toehold length, antibody-epitope affinity and concentration. ATSE enables the introduction of complex signal processing in antibody-based diagnostics, as demonstrated here by constructing molecular circuits for multiplex antibody detection, integration of multiple antibody inputs using logic gates and actuation of enzymes and DNAzymes for signal amplification.

  8. Antibody-controlled actuation of DNA-based molecular circuits

    NASA Astrophysics Data System (ADS)

    Engelen, Wouter; Meijer, Lenny H. H.; Somers, Bram; de Greef, Tom F. A.; Merkx, Maarten

    2017-02-01

    DNA-based molecular circuits allow autonomous signal processing, but their actuation has relied mostly on RNA/DNA-based inputs, limiting their application in synthetic biology, biomedicine and molecular diagnostics. Here we introduce a generic method to translate the presence of an antibody into a unique DNA strand, enabling the use of antibodies as specific inputs for DNA-based molecular computing. Our approach, antibody-templated strand exchange (ATSE), uses the characteristic bivalent architecture of antibodies to promote DNA-strand exchange reactions both thermodynamically and kinetically. Detailed characterization of the ATSE reaction allowed the establishment of a comprehensive model that describes the kinetics and thermodynamics of ATSE as a function of toehold length, antibody-epitope affinity and concentration. ATSE enables the introduction of complex signal processing in antibody-based diagnostics, as demonstrated here by constructing molecular circuits for multiplex antibody detection, integration of multiple antibody inputs using logic gates and actuation of enzymes and DNAzymes for signal amplification.

  9. A metallo-DNA nanowire with uninterrupted one-dimensional silver array

    NASA Astrophysics Data System (ADS)

    Kondo, Jiro; Tada, Yoshinari; Dairaku, Takenori; Hattori, Yoshikazu; Saneyoshi, Hisao; Ono, Akira; Tanaka, Yoshiyuki

    2017-10-01

    The double-helix structure of DNA, in which complementary strands reversibly hybridize to each other, not only explains how genetic information is stored and replicated, but also has proved very attractive for the development of nanomaterials. The discovery of metal-mediated base pairs has prompted the generation of short metal-DNA hybrid duplexes by a bottom-up approach. Here we describe a metallo-DNA nanowire—whose structure was solved by high-resolution X-ray crystallography—that consists of dodecamer duplexes held together by four different metal-mediated base pairs (the previously observed C-Ag-C, as well as G-Ag-G, G-Ag-C and T-Ag-T) and linked to each other through G overhangs involved in interduplex G-Ag-G. The resulting hybrid nanowires are 2 nm wide with a length of the order of micrometres to millimetres, and hold the silver ions in uninterrupted one-dimensional arrays along the DNA helical axis. The hybrid nanowires are further assembled into three-dimensional lattices by interactions between adenine residues, fully bulged out of the double helix.

  10. Balancing Expression and Structure in Game Design: Developing Computational Participation Using Studio-Based Design Pedagogy

    ERIC Educational Resources Information Center

    DeVane, Ben; Steward, Cody; Tran, Kelly M.

    2016-01-01

    This article reports on a project that used a game-creation tool to introduce middle-school students ages 10 to 13 to problem-solving strategies similar to those in computer science through the lens of studio-based design arts. Drawing on historic paradigms in design pedagogy and contemporary educational approaches in the digital arts to teach…

  11. Computing Gröbner and Involutive Bases for Linear Systems of Difference Equations

    NASA Astrophysics Data System (ADS)

    Yanovich, Denis

    2018-02-01

    The problem of computing involutive bases and Gröbner bases for linear systems of difference equations is solved, and its importance for physical and mathematical problems is discussed. The algorithm and issues concerning its implementation in C are presented, and calculation times are compared with those of competing programs. The paper ends with considerations on the parallel version of this implementation and its scalability.

  12. The 'Biologically-Inspired Computing' Column

    NASA Technical Reports Server (NTRS)

    Hinchey, Mike

    2006-01-01

    The field of Biology changed dramatically in 1953, with the determination by Francis Crick and James Dewey Watson of the double helix structure of DNA. This discovery changed Biology for ever, allowing the sequencing of the human genome, and the emergence of a "new Biology" focused on DNA, genes, proteins, data, and search. Computational Biology and Bioinformatics heavily rely on computing to facilitate research into life and development. Simultaneously, an understanding of the biology of living organisms indicates a parallel with computing systems: molecules in living cells interact, grow, and transform according to the "program" dictated by DNA. Moreover, paradigms of Computing are emerging based on modelling and developing computer-based systems exploiting ideas that are observed in nature. This includes building into computer systems self-management and self-governance mechanisms that are inspired by the human body's autonomic nervous system, modelling evolutionary systems analogous to colonies of ants or other insects, and developing highly-efficient and highly-complex distributed systems from large numbers of (often quite simple) largely homogeneous components to reflect the behaviour of flocks of birds, swarms of bees, herds of animals, or schools of fish. This new field of "Biologically-Inspired Computing", often known in other incarnations by other names, such as: Autonomic Computing, Pervasive Computing, Organic Computing, Biomimetics, and Artificial Life, amongst others, is poised at the intersection of Computer Science, Engineering, Mathematics, and the Life Sciences. Successes have been reported in the fields of drug discovery, data communications, computer animation, control and command, exploration systems for space, undersea, and harsh environments, to name but a few, and augur much promise for future progress.

  13. Performance of parallel computation using CUDA for solving the one-dimensional elasticity equations

    NASA Astrophysics Data System (ADS)

    Darmawan, J. B. B.; Mungkasi, S.

    2017-01-01

    In this paper, we investigate the performance of parallel computation in solving the one-dimensional elasticity equations. Elasticity equations are widely used in engineering science, and solving them quickly and efficiently is desirable. Therefore, we propose the use of parallel computation. Our parallel computation uses NVIDIA's CUDA. Our research results show that parallel computation using CUDA has a great advantage and is powerful when the computation is of large scale.

  14. An optimization-based approach for solving a time-harmonic multiphysical wave problem with higher-order schemes

    NASA Astrophysics Data System (ADS)

    Mönkölä, Sanna

    2013-06-01

    This study considers developing numerical solution techniques for the computer simulations of time-harmonic fluid-structure interaction between acoustic and elastic waves. The focus is on the efficiency of an iterative solution method based on a controllability approach and spectral elements. We concentrate on the model, in which the acoustic waves in the fluid domain are modeled by using the velocity potential and the elastic waves in the structure domain are modeled by using displacement. Traditionally, the complex-valued time-harmonic equations are used for solving the time-harmonic problems. Instead of that, we focus on finding periodic solutions without solving the time-harmonic problems directly. The time-dependent equations can be simulated with respect to time until a time-harmonic solution is reached, but the approach suffers from poor convergence. To overcome this challenge, we follow the approach first suggested and developed for the acoustic wave equations by Bristeau, Glowinski, and Périaux. Thus, we accelerate the convergence rate by employing a controllability method. The problem is formulated as a least-squares optimization problem, which is solved with the conjugate gradient (CG) algorithm. Computation of the gradient of the functional is done directly for the discretized problem. A graph-based multigrid method is used for preconditioning the CG algorithm.

  15. ELECTRONIC DIGITAL COMPUTER

    DOEpatents

    Stone, J.J. Jr.; Bettis, E.S.; Mann, E.R.

    1957-10-01

    The electronic digital computer is designed to solve systems involving a plurality of simultaneous linear equations. The computer can solve a system that converges rather rapidly under the Gauss-Seidel method of approximation, performing the summations required to solve for the unknown terms by a method of successive approximations.
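    A successive-approximation (Gauss-Seidel) solver of the kind this record describes can be sketched in a few lines; the matrix, right-hand side, and iteration count below are illustrative only, and the scheme is assumed to converge (e.g. for diagonally dominant systems).

```python
def gauss_seidel(A, b, iters=50):
    """Sweep repeatedly, updating each unknown in place from the latest values."""
    n = len(b)
    x = [0.0] * n
    for _ in range(iters):
        for i in range(n):
            s = sum(A[i][j] * x[j] for j in range(n) if j != i)
            x[i] = (b[i] - s) / A[i][i]
    return x

# 4x + y = 9, 2x + 3y = 13 has the exact solution x = 1.4, y = 3.4
A = [[4.0, 1.0], [2.0, 3.0]]
b = [9.0, 13.0]
sol = gauss_seidel(A, b)
print(sol)  # converges to approximately [1.4, 3.4]
```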

  16. Numerical simulation of the vortical flow around a pitching airfoil

    NASA Astrophysics Data System (ADS)

    Fu, Xiang; Li, Gaohua; Wang, Fuxin

    2017-04-01

    In order to study the dynamic behaviors of the flapping wing, the vortical flow around a pitching NACA0012 airfoil is investigated. The unsteady flow field is obtained by a very efficient zonal procedure based on the velocity-vorticity formulation, and the Reynolds number based on the chord length of the airfoil is set to 1 million. The zonal procedure divides the whole computational domain into three zones: a potential flow zone, a boundary layer zone and a Navier-Stokes zone. Since vorticity is absent in the potential flow zone, the vorticity transport equation needs to be solved only in the boundary layer and Navier-Stokes zones. Moreover, the boundary layer equations are solved in the boundary layer zone. This arrangement drastically reduces the computation time compared with traditional numerical methods. After the flow field computation, the evolution of the vortices around the airfoil is analyzed in detail.

  17. Assessment regarding the use of the computer aided analytical models in the calculus of the general strength of a ship hull

    NASA Astrophysics Data System (ADS)

    Hreniuc, V.; Hreniuc, A.; Pescaru, A.

    2017-08-01

    A general strength problem of a ship hull may be solved using analytical approaches, which are useful for deducing the distribution of buoyancy forces, the distribution of weight forces along the hull, and the geometrical characteristics of the sections. These data are used to draw the free-body diagrams and to compute the stresses. General strength problems require a large amount of calculation, so it is worth examining how a computer may be used to solve them. Using computer programming, an engineer may build software instruments based on analytical approaches. Before developing the computer code, however, the research topic must be thoroughly analysed, thereby reaching a meta-level of understanding of the problem. The following stage is to devise an appropriate development strategy for the original software instruments, useful for the rapid development of computer-aided analytical models. The geometrical characteristics of the sections may be computed using a Boolean algebra that operates on ‘simple’ geometrical shapes; by ‘simple’ we mean shapes for which direct calculation formulas are available. The set of ‘simple’ shapes also includes geometrical entities bounded by curves approximated as spline functions or as polygons. To conclude, computer programming offers the necessary support for solving general strength ship hull problems using analytical methods.

  18. Solving constant-coefficient differential equations with dielectric metamaterials

    NASA Astrophysics Data System (ADS)

    Zhang, Weixuan; Qu, Che; Zhang, Xiangdong

    2016-07-01

    Recently, the concept of metamaterial analog computing has been proposed (Silva et al 2014 Science 343 160-3). Some mathematical operations such as spatial differentiation, integration, and convolution, have been performed by using designed metamaterial blocks. Motivated by this work, we propose a practical approach based on dielectric metamaterial to solve differential equations. The ordinary differential equation can be solved accurately by the correctly designed metamaterial system. The numerical simulations using well-established numerical routines have been performed to successfully verify all theoretical analyses.

  19. Physical Principle for Generation of Randomness

    NASA Technical Reports Server (NTRS)

    Zak, Michail

    2009-01-01

    A physical principle (more precisely, a principle that incorporates mathematical models used in physics) has been conceived as the basis of a method of generating randomness in Monte Carlo simulations. The principle eliminates the need for conventional random-number generators. The Monte Carlo simulation method is among the most powerful computational methods for solving high-dimensional problems in physics, chemistry, economics, and information processing. The Monte Carlo simulation method is especially effective for solving problems in which computational complexity increases exponentially with dimensionality. The main advantage of the Monte Carlo simulation method over other methods is that the demand on computational resources becomes independent of dimensionality. As augmented by the present principle, the Monte Carlo simulation method becomes an even more powerful computational method that is especially useful for solving problems associated with dynamics of fluids, planning, scheduling, and combinatorial optimization. The present principle is based on coupling of dynamical equations with the corresponding Liouville equation. The randomness is generated by non-Lipschitz instability of dynamics triggered and controlled by feedback from the Liouville equation. (In non-Lipschitz dynamics, the derivatives of solutions of the dynamical equations are not required to be bounded.)
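    The dimension-independence claimed for Monte Carlo above can be illustrated with a plain pseudo-random estimator (this uses an ordinary software generator, not the non-Lipschitz randomness mechanism the record proposes); the integrand, dimension, and sample count are arbitrary choices.

```python
import random

def mc_estimate(f, dim, n, seed=0):
    """Plain Monte Carlo estimate of the mean of f over the dim-dimensional unit cube."""
    rng = random.Random(seed)
    return sum(f([rng.random() for _ in range(dim)]) for _ in range(n)) / n

# The integral of sum(x_i) over [0,1]^10 is exactly 5.0; the statistical error
# scales as 1/sqrt(n) regardless of the dimension
est = mc_estimate(lambda x: sum(x), dim=10, n=20000)
print(est)  # close to 5.0
```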

  20. A spatially localized architecture for fast and modular DNA computing

    NASA Astrophysics Data System (ADS)

    Chatterjee, Gourab; Dalchau, Neil; Muscat, Richard A.; Phillips, Andrew; Seelig, Georg

    2017-09-01

    Cells use spatial constraints to control and accelerate the flow of information in enzyme cascades and signalling networks. Synthetic silicon-based circuitry similarly relies on spatial constraints to process information. Here, we show that spatial organization can be a similarly powerful design principle for overcoming limitations of speed and modularity in engineered molecular circuits. We create logic gates and signal transmission lines by spatially arranging reactive DNA hairpins on a DNA origami. Signal propagation is demonstrated across transmission lines of different lengths and orientations and logic gates are modularly combined into circuits that establish the universality of our approach. Because reactions preferentially occur between neighbours, identical DNA hairpins can be reused across circuits. Co-localization of circuit elements decreases computation time from hours to minutes compared to circuits with diffusible components. Detailed computational models enable predictive circuit design. We anticipate our approach will motivate using spatial constraints for future molecular control circuit designs.

  1. Structural DNA Nanotechnology: State of the Art and Future Perspective

    PubMed Central

    2015-01-01

    Over the past three decades DNA has emerged as an exceptional molecular building block for nanoconstruction due to its predictable conformation and programmable intra- and intermolecular Watson–Crick base-pairing interactions. A variety of convenient design rules and reliable assembly methods have been developed to engineer DNA nanostructures of increasing complexity. The ability to create designer DNA architectures with accurate spatial control has allowed researchers to explore novel applications in many directions, such as directed material assembly, structural biology, biocatalysis, DNA computing, nanorobotics, disease diagnosis, and drug delivery. This Perspective discusses the state of the art in the field of structural DNA nanotechnology and presents some of the challenges and opportunities that exist in DNA-based molecular design and programming. PMID:25029570

  2. Evaluating the role of coherent delocalized phonon-like modes in DNA cyclization

    DOE PAGES

    Alexandrov, Ludmil B.; Rasmussen, Kim Ø.; Bishop, Alan R.; ...

    2017-08-29

    The innate flexibility of a DNA sequence is quantified by the Jacobson-Stockmayer J-factor, which measures the propensity for DNA loop formation. Recent studies of ultra-short DNA sequences revealed a discrepancy of up to six orders of magnitude between experimentally measured and theoretically predicted J-factors. These large differences suggest that, in addition to the elastic moduli of the double helix, other factors contribute to loop formation. We develop a new theoretical model that explores how coherent delocalized phonon-like modes in DNA provide single-stranded “flexible hinges” to assist in loop formation. We also combine the Czapla-Swigon-Olson structural model of DNA with our extended Peyrard-Bishop-Dauxois model and, without changing any of the parameters of the two models, apply this new computational framework to 86 experimentally characterized DNA sequences. Our results demonstrate that the new computational framework can predict J-factors within an order of magnitude of experimental measurements for most ultra-short DNA sequences, while continuing to accurately describe the J-factors of longer sequences. Furthermore, we demonstrate that our computational framework can be used to describe the cyclization of DNA sequences that contain a base pair mismatch. Overall, our results support the conclusion that coherent delocalized phonon-like modes play an important role in DNA cyclization.

  3. Evaluating the role of coherent delocalized phonon-like modes in DNA cyclization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Alexandrov, Ludmil B.; Rasmussen, Kim Ø.; Bishop, Alan R.

    The innate flexibility of a DNA sequence is quantified by the Jacobson-Stockmayer J-factor, which measures the propensity for DNA loop formation. Recent studies of ultra-short DNA sequences revealed a discrepancy of up to six orders of magnitude between experimentally measured and theoretically predicted J-factors. These large differences suggest that, in addition to the elastic moduli of the double helix, other factors contribute to loop formation. We develop a new theoretical model that explores how coherent delocalized phonon-like modes in DNA provide single-stranded “flexible hinges” to assist in loop formation. We also combine the Czapla-Swigon-Olson structural model of DNA with our extended Peyrard-Bishop-Dauxois model and, without changing any of the parameters of the two models, apply this new computational framework to 86 experimentally characterized DNA sequences. Our results demonstrate that the new computational framework can predict J-factors within an order of magnitude of experimental measurements for most ultra-short DNA sequences, while continuing to accurately describe the J-factors of longer sequences. Furthermore, we demonstrate that our computational framework can be used to describe the cyclization of DNA sequences that contain a base pair mismatch. Overall, our results support the conclusion that coherent delocalized phonon-like modes play an important role in DNA cyclization.

  4. Testing the Use of Implicit Solvent in the Molecular Dynamics Modelling of DNA Flexibility

    NASA Astrophysics Data System (ADS)

    Mitchell, J.; Harris, S.

    DNA flexibility controls packaging, looping and in some cases sequence specific protein binding. Molecular dynamics simulations carried out with a computationally efficient implicit solvent model are potentially a powerful tool for studying larger DNA molecules than can be currently simulated when water and counterions are represented explicitly. In this work we compare DNA flexibility at the base pair step level modelled using an implicit solvent model to that previously determined from explicit solvent simulations and database analysis. Although much of the sequence dependent behaviour is preserved in implicit solvent, the DNA is considerably more flexible when the approximate model is used. In addition we test the ability of the implicit solvent to model stress induced DNA disruptions by simulating a series of DNA minicircle topoisomers which vary in size and superhelical density. When compared with previously run explicit solvent simulations, we find that while the levels of DNA denaturation are similar using both computational methodologies, the specific structural form of the disruptions is different.

  5. Analyzing Team Based Engineering Design Process in Computer Supported Collaborative Learning

    ERIC Educational Resources Information Center

    Lee, Dong-Kuk; Lee, Eun-Sang

    2016-01-01

    The engineering design process has been largely implemented in a collaborative project format. Recently, technological advancement has helped collaborative problem solving processes such as engineering design to have efficient implementation using computers or online technology. In this study, we investigated college students' interaction and…

  6. Teaching Cardiovascular Integrations with Computer Laboratories.

    ERIC Educational Resources Information Center

    Peterson, Nils S.; Campbell, Kenneth B.

    1985-01-01

    Describes a computer-based instructional unit in cardiovascular physiology. The program (which employs simulated laboratory experimental techniques with a problem-solving format is designed to supplement an animal laboratory and to offer students an integrative approach to physiology through use of microcomputers. Also presents an overview of the…

  7. A numerical scheme based on radial basis function finite difference (RBF-FD) technique for solving the high-dimensional nonlinear Schrödinger equations using an explicit time discretization: Runge-Kutta method

    NASA Astrophysics Data System (ADS)

    Dehghan, Mehdi; Mohammadi, Vahid

    2017-08-01

    In this research, we investigate the numerical solution of nonlinear Schrödinger equations in two and three dimensions. The numerical meshless method used here is the RBF-FD technique. The main advantage of this method is that the required derivatives are approximated by a finite difference technique on each local support domain Ωi. On each Ωi, we must solve a small linear system of algebraic equations with a conditionally positive definite matrix of order 1 (the interpolation matrix). This scheme is efficient and its computational cost is the same as that of the moving least squares (MLS) approximation. A challenging issue is choosing a suitable shape parameter for the interpolation matrix. To overcome this, an algorithm established by Sarra (2012) is applied. This algorithm computes the condition number of the local interpolation matrix using the singular value decomposition (SVD) to obtain the smallest and largest singular values of that matrix. Moreover, an explicit method based on the fourth-order Runge-Kutta formula is applied to approximate the time variable; it also decreases the computational cost at each time step since no nonlinear system has to be solved. On the other hand, to compare the RBF-FD method with another meshless technique, the moving kriging least squares (MKLS) approximation is considered for the studied model. Our results demonstrate the ability of the present approach for solving the applicable model investigated in the current research work.
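    The classical fourth-order Runge-Kutta step used for the time discretization can be sketched generically (this is the textbook scheme, not the authors' RBF-FD code; the test equation dy/dt = -y and the step size are illustrative).

```python
def rk4_step(f, t, y, h):
    """One classical fourth-order Runge-Kutta step."""
    k1 = f(t, y)
    k2 = f(t + h / 2, y + h / 2 * k1)
    k3 = f(t + h / 2, y + h / 2 * k2)
    k4 = f(t + h, y + h * k3)
    return y + h / 6 * (k1 + 2 * k2 + 2 * k3 + k4)

# Test equation dy/dt = -y with y(0) = 1; the exact solution is exp(-t)
f = lambda t, y: -y
y, t, h = 1.0, 0.0, 0.1
for _ in range(10):
    y = rk4_step(f, t, y, h)
    t += h
print(y)  # close to exp(-1) ~ 0.3678794
```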

  8. Computation of marginal distributions of peak-heights in electropherograms for analysing single source and mixture STR DNA samples.

    PubMed

    Cowell, Robert G

    2018-05-04

    Current models for single source and mixture samples, and probabilistic genotyping software based on them used for analysing STR electropherogram data, assume simple probability distributions, such as the gamma distribution, to model the allelic peak height variability given the initial amount of DNA prior to PCR amplification. Here we illustrate how amplicon number distributions, for a model of the process of sample DNA collection and PCR amplification, may be efficiently computed by evaluating probability generating functions using discrete Fourier transforms. Copyright © 2018 Elsevier B.V. All rights reserved.
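    The idea of evaluating a probability generating function at roots of unity and inverting with a discrete Fourier transform can be sketched as follows (a generic illustration, not the paper's model; the binomial PGF is a stand-in example, and N must be at least the support size).

```python
import cmath

def pmf_from_pgf(G, N):
    """Recover P(X = k), k = 0..N-1, by sampling the probability generating
    function G at the N-th roots of unity and applying an inverse DFT."""
    vals = [G(cmath.exp(2j * cmath.pi * k / N)) for k in range(N)]
    return [abs(sum(vals[k] * cmath.exp(-2j * cmath.pi * k * n / N)
                    for k in range(N)) / N) for n in range(N)]

# Stand-in example: Binomial(4, 0.5) has G(z) = (0.5 + 0.5 z)**4
pmf = pmf_from_pgf(lambda z: (0.5 + 0.5 * z) ** 4, N=8)
print([round(p, 4) for p in pmf[:5]])  # [0.0625, 0.25, 0.375, 0.25, 0.0625]
```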

  9. Analysis of rotary engine combustion processes based on unsteady, three-dimensional computations

    NASA Technical Reports Server (NTRS)

    Raju, M. S.; Willis, E. A.

    1990-01-01

    A new computer code was developed for predicting the turbulent and chemically reacting flows with sprays occurring inside a stratified charge rotary engine. The solution procedure is based on an Eulerian-Lagrangian approach where the unsteady, three-dimensional Navier-Stokes equations for a perfect gas mixture with variable properties are solved in generalized, Eulerian coordinates on a moving grid by making use of an implicit finite volume, Steger-Warming flux vector splitting scheme, and the liquid phase equations are solved in Lagrangian coordinates. Both the details of the numerical algorithm and the finite difference predictions of the combustor flow field during the opening of exhaust and/or intake, and also during fuel vaporization and combustion, are presented.

  10. Analysis of rotary engine combustion processes based on unsteady, three-dimensional computations

    NASA Technical Reports Server (NTRS)

    Raju, M. S.; Willis, E. A.

    1989-01-01

    A new computer code was developed for predicting the turbulent and chemically reacting flows with sprays occurring inside a stratified charge rotary engine. The solution procedure is based on an Eulerian-Lagrangian approach where the unsteady, 3-D Navier-Stokes equations for a perfect gas mixture with variable properties are solved in generalized, Eulerian coordinates on a moving grid by making use of an implicit finite volume, Steger-Warming flux vector splitting scheme, and the liquid phase equations are solved in Lagrangian coordinates. Both the details of the numerical algorithm and the finite difference predictions of the combustor flow field during the opening of exhaust and/or intake, and also during fuel vaporization and combustion, are presented.

  11. A low-complexity geometric bilateration method for localization in Wireless Sensor Networks and its comparison with Least-Squares methods.

    PubMed

    Cota-Ruiz, Juan; Rosiles, Jose-Gerardo; Sifuentes, Ernesto; Rivas-Perea, Pablo

    2012-01-01

    This research presents a distributed, formula-based bilateration algorithm that can be used to provide an initial set of locations. In this scheme each node uses distance estimates to anchors to solve a set of circle-circle intersection (CCI) problems through a purely geometric formulation. The resulting CCIs are processed to pick those that cluster together, and their average is taken to produce an initial node location. The algorithm is compared in terms of accuracy and computational complexity with a Least-Squares localization algorithm based on the Levenberg-Marquardt methodology. Results on accuracy versus computational performance show that the bilateration algorithm is competitive with well-known optimized localization algorithms.
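    The circle-circle intersection step at the heart of such a bilateration scheme has a standard closed-form solution, which might look like the sketch below (a generic geometric formula, not the authors' implementation; the anchor positions and range estimates are made up).

```python
import math

def circle_intersect(x0, y0, r0, x1, y1, r1):
    """Intersection points of two circles (anchor positions with range estimates)."""
    d = math.hypot(x1 - x0, y1 - y0)
    if d == 0 or d > r0 + r1 or d < abs(r0 - r1):
        return []  # coincident centres, or circles too far apart / nested
    a = (r0 ** 2 - r1 ** 2 + d ** 2) / (2 * d)  # distance from centre 0 to the chord
    h = math.sqrt(max(r0 ** 2 - a ** 2, 0.0))   # half the chord length
    xm = x0 + a * (x1 - x0) / d                 # chord midpoint
    ym = y0 + a * (y1 - y0) / d
    return [(xm + h * (y1 - y0) / d, ym - h * (x1 - x0) / d),
            (xm - h * (y1 - y0) / d, ym + h * (x1 - x0) / d)]

# Anchors at (0,0) and (2,0), each measuring a range of 2 to the unknown node
pts = circle_intersect(0.0, 0.0, 2.0, 2.0, 0.0, 2.0)
print(pts)  # two candidate locations, (1, -sqrt(3)) and (1, sqrt(3))
```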

  12. An approach for heterogeneous and loosely coupled geospatial data distributed computing

    NASA Astrophysics Data System (ADS)

    Chen, Bin; Huang, Fengru; Fang, Yu; Huang, Zhou; Lin, Hui

    2010-07-01

    Most GIS (Geographic Information System) applications tend to have heterogeneous and autonomous geospatial information resources, and the availability of these local resources is unpredictable and dynamic in a distributed computing environment. In order to use these local resources together to solve larger geospatial information processing problems related to an overall situation, in this paper, with the support of peer-to-peer computing technologies, we propose a geospatial data distributed computing mechanism that involves loosely coupled geospatial resource directories and a concept termed the Equivalent Distributed Program of global geospatial queries to solve geospatial distributed computing problems in heterogeneous GIS environments. First, a geospatial query process schema for distributed computing, as well as a method for the equivalent transformation from a global geospatial query to distributed local queries at the SQL (Structured Query Language) level to solve the coordination problem among heterogeneous resources, are presented. Second, peer-to-peer technologies are used to maintain a loosely coupled network environment consisting of autonomous geospatial information resources, so as to achieve decentralized and consistent synchronization among global geospatial resource directories and to carry out distributed transaction management of local queries. Finally, based on the developed prototype system, example applications of simple and complex geospatial data distributed queries are presented to illustrate the procedure of global geospatial information processing.

  13. A comparison of acceleration methods for solving the neutron transport k-eigenvalue problem

    NASA Astrophysics Data System (ADS)

    Willert, Jeffrey; Park, H.; Knoll, D. A.

    2014-10-01

Over the past several years a number of papers have been written describing modern techniques for numerically computing the dominant eigenvalue of the neutron transport criticality problem. These methods fall into two distinct categories. The first category of methods rewrites the multi-group k-eigenvalue problem as a nonlinear system of equations and solves the resulting system using either a Jacobian-Free Newton-Krylov (JFNK) method or Nonlinear Krylov Acceleration (NKA), a variant of Anderson Acceleration. These methods are generally successful in significantly reducing the number of transport sweeps required to compute the dominant eigenvalue. The second category of methods utilizes Moment-Based Acceleration (or High-Order/Low-Order (HOLO) Acceleration). These methods solve a sequence of modified diffusion eigenvalue problems whose solutions converge to the solution of the original transport eigenvalue problem. This second class of methods is, in our experience, always superior to the first, as most of the computational work is eliminated by the acceleration from the LO diffusion system. In this paper, we review each of these methods. Our computational results support our claim that the choice of which nonlinear solver to use, JFNK or NKA, should be secondary. The primary computational savings result from the implementation of a HOLO algorithm. We display computational results for a series of challenging multi-dimensional test problems.
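
    The baseline both categories of methods accelerate is, at bottom, a power iteration for the dominant eigenvalue. A minimal unaccelerated sketch, with a small generic matrix standing in for the transport operator (not the paper's multi-group system):

```python
import numpy as np

def power_iteration(A, tol=1e-10, max_iter=1000):
    """Unaccelerated power iteration for the dominant eigenvalue.

    Each matrix-vector product stands in for one transport "sweep";
    JFNK, NKA, and HOLO methods all aim to cut the number of such sweeps.
    """
    x = np.zeros(A.shape[0])
    x[0] = 1.0                    # arbitrary starting vector
    k_old = 0.0
    for _ in range(max_iter):
        y = A @ x                 # one "sweep" of the operator
        k = np.linalg.norm(y)     # current eigenvalue estimate
        x = y / k                 # renormalize the iterate
        if abs(k - k_old) < tol:
            break
        k_old = k
    return k, x

A = np.array([[4.0, 1.0], [2.0, 3.0]])  # eigenvalues are 5 and 2
k, x = power_iteration(A)
print(round(k, 6))  # 5.0
```

    Convergence of this plain iteration is governed by the ratio of the two largest eigenvalues, which is exactly why the acceleration schemes reviewed in the paper matter.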

  14. A computational method for solving stochastic Itô–Volterra integral equations based on stochastic operational matrix for generalized hat basis functions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Heydari, M.H., E-mail: heydari@stu.yazd.ac.ir; The Laboratory of Quantum Information Processing, Yazd University, Yazd; Hooshmandasl, M.R., E-mail: hooshmandasl@yazd.ac.ir

    2014-08-01

In this paper, a new computational method based on the generalized hat basis functions is proposed for solving stochastic Itô–Volterra integral equations. In this way, a new stochastic operational matrix for generalized hat functions on the finite interval [0,T] is obtained. By using these basis functions and their stochastic operational matrix, such problems can be transformed into linear lower triangular systems of algebraic equations which can be directly solved by forward substitution. Also, the rate of convergence of the proposed method is considered and shown to be O(1/n^2). Further, in order to show the accuracy and reliability of the proposed method, the new approach is compared with the block pulse functions method by some examples. The obtained results reveal that the proposed method is more accurate and efficient in comparison with the block pulse functions method.
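
    The direct step the abstract refers to, forward substitution on a lower triangular system, can be sketched as follows; the matrix and right-hand side are made up for illustration and do not come from a specific integral equation.

```python
import numpy as np

def forward_substitution(L, b):
    """Solve L x = b for a lower-triangular L, row by row."""
    n = len(b)
    x = np.zeros(n)
    for i in range(n):
        # Subtract the already-known contributions, then divide by L[i, i].
        x[i] = (b[i] - L[i, :i] @ x[:i]) / L[i, i]
    return x

L = np.array([[2.0, 0.0, 0.0],
              [1.0, 3.0, 0.0],
              [4.0, 1.0, 5.0]])
b = np.array([2.0, 7.0, 16.0])
print(forward_substitution(L, b))  # [1. 2. 2.]
```

    Because each unknown depends only on earlier ones, the whole solve costs O(n^2) operations with no matrix factorization, which is what makes the triangular reduction attractive.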

  15. Application of Quaternion in improving the quality of global sequence alignment scores for an ambiguous sequence target in Streptococcus pneumoniae DNA

    NASA Astrophysics Data System (ADS)

    Lestari, D.; Bustamam, A.; Novianti, T.; Ardaneswari, G.

    2017-07-01

A DNA sequence can be defined as a succession of letters representing the order of nucleotides within DNA, using a permutation of the four DNA base codes: adenine (A), guanine (G), cytosine (C), and thymine (T). The precise code of a sequence is determined using DNA sequencing methods and technologies, which have been developed since the 1970s and have now become highly advanced, high-throughput technologies. So far, DNA sequencing has greatly accelerated biological and medical research and discovery. In some cases, however, DNA sequencing can produce ambiguous results that make it difficult to determine whether a given code is A, T, G, or C. To solve this problem, in this study we introduce another representation of DNA codes, namely the Quaternion Q = (PA, PT, PG, PC), where PA, PT, PG, PC are the probabilities that the bases A, T, G, C appear in Q and PA + PT + PG + PC = 1. Using Quaternion representations, we construct an improved scoring matrix for global sequence alignment by applying a dot product method. This scoring matrix produces higher-quality match and mismatch scores between two DNA base codes. In implementation, we applied the Needleman-Wunsch global sequence alignment algorithm using Octave to analyze our target sequence, which contains some ambiguous sequence data. The subject sequences are DNA sequences of the Streptococcus pneumoniae family obtained from GenBank, while the target DNA sequence was received from our collaborator's database. As a result, we found that the Quaternion representation improves the quality of the sequence alignment score, and we conclude that the target DNA sequence has maximum similarity with Streptococcus pneumoniae.
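
    The dot-product scoring idea can be sketched directly: treat every base as a probability 4-vector over (A, T, G, C) and score a pair of positions by the dot product, so an ambiguous read earns partial credit. This only illustrates the representation; the paper's actual scoring matrix may differ, and the ambiguity code 'R' below is a standard IUPAC example rather than one taken from the paper.

```python
import numpy as np

# Quaternion representation: each base is (PA, PT, PG, PC) summing to 1.
BASES = {
    "A": np.array([1.0, 0.0, 0.0, 0.0]),
    "T": np.array([0.0, 1.0, 0.0, 0.0]),
    "G": np.array([0.0, 0.0, 1.0, 0.0]),
    "C": np.array([0.0, 0.0, 0.0, 1.0]),
    "R": np.array([0.5, 0.0, 0.5, 0.0]),  # purine: A or G, equally likely
}

def score(x, y):
    """Match score between two base codes as a dot product."""
    return float(BASES[x] @ BASES[y])

print(score("A", "A"))  # 1.0 -- exact match
print(score("A", "T"))  # 0.0 -- mismatch
print(score("R", "A"))  # 0.5 -- partial credit for an ambiguous read
```

    Plugging such scores into the Needleman-Wunsch substitution matrix lets ambiguous positions contribute graded similarity instead of a hard mismatch.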

  16. A Drawing and Multi-Representational Computer Environment for Beginners' Learning of Programming Using C: Design and Pilot Formative Evaluation

    ERIC Educational Resources Information Center

    Kordaki, Maria

    2010-01-01

    This paper presents both the design and the pilot formative evaluation study of a computer-based problem-solving environment (named LECGO: Learning Environment for programming using C using Geometrical Objects) for the learning of computer programming using C by beginners. In its design, constructivist and social learning theories were taken into…

  17. Wusor II: A Computer Aided Instruction Program with Student Modelling Capabilities. AI Memo 417.

    ERIC Educational Resources Information Center

    Carr, Brian

    Wusor II is the second intelligent computer aided instruction (ICAI) program that has been developed to monitor the progress of, and offer suggestions to, students playing Wumpus, a computer game designed to teach logical thinking and problem solving. From the earlier efforts with Wusor I, it was possible to produce a rule-based expert which…

  18. Computational and experimental analysis of DNA shuffling

    PubMed Central

    Maheshri, Narendra; Schaffer, David V.

    2003-01-01

    We describe a computational model of DNA shuffling based on the thermodynamics and kinetics of this process. The model independently tracks a representative ensemble of DNA molecules and records their states at every stage of a shuffling reaction. These data can subsequently be analyzed to yield information on any relevant metric, including reassembly efficiency, crossover number, type and distribution, and DNA sequence length distributions. The predictive ability of the model was validated by comparison to three independent sets of experimental data, and analysis of the simulation results led to several unique insights into the DNA shuffling process. We examine a tradeoff between crossover frequency and reassembly efficiency and illustrate the effects of experimental parameters on this relationship. Furthermore, we discuss conditions that promote the formation of useless “junk” DNA sequences or multimeric sequences containing multiple copies of the reassembled product. This model will therefore aid in the design of optimal shuffling reaction conditions. PMID:12626764

  19. Integrating DNA barcodes and morphology for species delimitation in the Corynoneura group (Diptera: Chironomidae: Orthocladiinae).

    PubMed

    Silva, F L; Wiedenbrug, S

    2014-02-01

    In this study, we use DNA barcodes for species delimitation to solve taxonomic conflicts in 86 specimens of 14 species belonging to the Corynoneura group (Diptera: Chironomidae: Orthocladiinae), from the Atlantic Forest, Brazil. Molecular analysis of cytochrome c-oxidase subunit I (COI) gene sequences supported 14 cohesive species groups, of which two similar groups were subsequently associated with morphological variation at the pupal stage. Eleven species previously described based on morphological criteria were linked to DNA markers. Furthermore, there is the possibility that there may be cryptic species within the Corynoneura group, since one group of species presented internal grouping, although no morphological divergence was observed. Our results support DNA-barcoding as an excellent tool for species delimitation in groups where taxonomy by means of morphology is difficult or even impossible.

  20. The Computer-Based Assessment of Complex Problem Solving and How It Is Influenced by Students' Information and Communication Technology Literacy

    ERIC Educational Resources Information Center

    Greiff, Samuel; Kretzschmar, André; Müller, Jonas C.; Spinath, Birgit; Martin, Romain

    2014-01-01

    The 21st-century work environment places strong emphasis on nonroutine transversal skills. In an educational context, complex problem solving (CPS) is generally considered an important transversal skill that includes knowledge acquisition and its application in new and interactive situations. The dynamic and interactive nature of CPS requires a…

  1. Problem Solving in Technology-Rich Environments. A Report from the NAEP Technology-Based Assessment Project, Research and Development Series. NCES 2007-466

    ERIC Educational Resources Information Center

    Bennett, Randy Elliot; Persky, Hilary; Weiss, Andrew R.; Jenkins, Frank

    2007-01-01

    The Problem Solving in Technology-Rich Environments (TRE) study was designed to demonstrate and explore innovative use of computers for developing, administering, scoring, and analyzing the results of National Assessment of Educational Progress (NAEP) assessments. Two scenarios (Search and Simulation) were created for measuring problem solving…

  2. Knowledge-based control for robot self-localization

    NASA Technical Reports Server (NTRS)

    Bennett, Bonnie Kathleen Holte

    1993-01-01

    Autonomous robot systems are being proposed for a variety of missions including the Mars rover/sample return mission. Prior to any other mission objectives being met, an autonomous robot must be able to determine its own location. This will be especially challenging because location sensors like GPS, which are available on Earth, will not be useful, nor will INS sensors because their drift is too large. Another approach to self-localization is required. In this paper, we describe a novel approach to localization by applying a problem solving methodology. The term 'problem solving' implies a computational technique based on logical representational and control steps. In this research, these steps are derived from observing experts solving localization problems. The objective is not specifically to simulate human expertise but rather to apply its techniques where appropriate for computational systems. In doing this, we describe a model for solving the problem and a system built on that model, called localization control and logic expert (LOCALE), which is a demonstration of concept for the approach and the model. The results of this work represent the first successful solution to high-level control aspects of the localization problem.

  3. A fast solver for the Helmholtz equation based on the generalized multiscale finite-element method

    NASA Astrophysics Data System (ADS)

    Fu, Shubin; Gao, Kai

    2017-11-01

Conventional finite-element methods for solving the acoustic-wave Helmholtz equation in highly heterogeneous media usually require a finely discretized mesh to represent the medium property variations with sufficient accuracy. Computational costs for solving the Helmholtz equation can therefore be considerable for complicated and large geological models. Based on the generalized multiscale finite-element theory, we develop a novel continuous Galerkin method to solve the Helmholtz equation in acoustic media with spatially variable velocity and mass density. Instead of using conventional polynomial basis functions, we use multiscale basis functions to form the approximation space on the coarse mesh. The multiscale basis functions are obtained by multiplying the eigenfunctions of a carefully designed local spectral problem with an appropriate multiscale partition of unity. These multiscale basis functions can effectively incorporate the characteristics of the heterogeneous medium's fine-scale variations, thus enabling us to obtain an accurate solution to the Helmholtz equation without directly solving the large discrete system formed on the fine mesh. Numerical results show that our new solver can significantly reduce the dimension of the discrete Helmholtz equation system and can also markedly reduce the computational time.

  4. Helix Unwinding and Base Flipping Enable Human MTERF1 to Terminate Mitochondrial Transcription

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yakubovskaya, E.; Mejia, E; Byrnes, J

    2010-01-01

Defects in mitochondrial gene expression are associated with aging and disease. Mterf proteins have been implicated in modulating transcription, replication and protein synthesis. We have solved the structure of a member of this family, the human mitochondrial transcriptional terminator MTERF1, bound to dsDNA containing the termination sequence. The structure indicates that upon sequence recognition MTERF1 unwinds the DNA molecule, promoting eversion of three nucleotides. Base flipping is critical for stable binding and transcriptional termination. Additional structural and biochemical results provide insight into the DNA binding mechanism and explain how MTERF1 recognizes its target sequence. Finally, we have demonstrated that the mitochondrial pathogenic G3249A and G3244A mutations interfere with key interactions for sequence recognition, eliminating termination. Our results provide insight into the role of Mterf proteins and suggest a link between mitochondrial disease and the regulation of mitochondrial transcription.

  5. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Changela, Anita; DiGate, Russell J.; Mondragon, Alfonso

Escherichia coli DNA topoisomerase III belongs to the type IA family of DNA topoisomerases, which transiently cleave single-stranded DNA (ssDNA) via a 5′-phosphotyrosine intermediate. We have solved crystal structures of wild-type E. coli topoisomerase III bound to an eight-base ssDNA molecule in three different pH environments. The structures reveal the enzyme in three distinct conformational states while bound to DNA. One conformation resembles the one observed previously with a DNA-bound, catalytically inactive mutant of topoisomerase III, where DNA binding realigns catalytic residues to form a functional active site. Another conformation represents a novel intermediate in which DNA is bound along the ssDNA-binding groove but does not enter the active site, which remains in a catalytically inactive, closed state. A third conformation shows an intermediate state where the enzyme is still in a closed state, but the ssDNA is starting to invade the active site. For the first time, the active site region in the presence of both the catalytic tyrosine and the ssDNA substrate is revealed for a type IA DNA topoisomerase, although there is no evidence of ssDNA cleavage. Comparative analysis of the various conformational states suggests a sequence of domain movements undertaken by the enzyme upon substrate binding.

  6. RDNAnalyzer: A tool for DNA secondary structure prediction and sequence analysis.

    PubMed

    Afzal, Muhammad; Shahid, Ahmad Ali; Shehzadi, Abida; Nadeem, Shahid; Husnain, Tayyab

    2012-01-01

RDNAnalyzer is an innovative computer-based tool designed for DNA secondary structure prediction and sequence analysis. It can randomly generate a DNA sequence, or the user can upload sequences of interest in RAW format. It uses and extends the Nussinov dynamic programming algorithm and has various applications for sequence analysis. It predicts the DNA secondary structure and base pairings. It also provides tools for sequence analyses routinely performed by biological scientists, such as DNA replication, reverse complement generation, transcription, translation, sequence-specific information such as the total number of nucleotide bases and ATGC base contents along with their respective percentages, and a sequence cleaner. RDNAnalyzer is a unique tool developed in Microsoft Visual Studio 2008 using Microsoft Visual C# and Windows Presentation Foundation, and provides a user-friendly environment for sequence analysis. It is freely available at http://www.cemb.edu.pk/sw.html. RDNAnalyzer - Random DNA Analyser; GUI - Graphical user interface; XAML - Extensible Application Markup Language.
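
    The core of the Nussinov algorithm that RDNAnalyzer builds on is a dynamic programme maximizing the number of non-crossing complementary base pairs. A minimal sketch (Watson-Crick pairs only, with no minimum-loop-length constraint, unlike production folding tools):

```python
def nussinov(seq):
    """Maximum number of non-crossing complementary base pairs in seq."""
    pairs = {("A", "T"), ("T", "A"), ("G", "C"), ("C", "G")}
    n = len(seq)
    dp = [[0] * n for _ in range(n)]
    for span in range(1, n):                       # widen the subsequence
        for i in range(n - span):
            j = i + span
            best = dp[i + 1][j]                    # base i left unpaired
            for k in range(i + 1, j + 1):          # or base i paired with k
                if (seq[i], seq[k]) in pairs:
                    left = dp[i + 1][k - 1] if k > i + 1 else 0
                    right = dp[k + 1][j] if k < j else 0
                    best = max(best, 1 + left + right)
            dp[i][j] = best
    return dp[0][n - 1]

print(nussinov("GGGAAACCC"))  # 3 (three G-C pairs around the AAA loop)
```

    A traceback over the same table recovers the pairing itself; this sketch returns only the optimal count.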

  7. An efficient computational method for solving nonlinear stochastic Itô integral equations: Application for stochastic problems in physics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Heydari, M.H., E-mail: heydari@stu.yazd.ac.ir; The Laboratory of Quantum Information Processing, Yazd University, Yazd; Hooshmandasl, M.R., E-mail: hooshmandasl@yazd.ac.ir

Because of the nonlinearity, closed-form solutions of many important stochastic functional equations are virtually impossible to obtain. Thus, numerical solutions are a viable alternative. In this paper, a new computational method based on the generalized hat basis functions together with their stochastic operational matrix of Itô-integration is proposed for solving nonlinear stochastic Itô integral equations in large intervals. In the proposed method, a new technique for computing nonlinear terms in such problems is presented. The main advantage of the proposed method is that it transforms problems under consideration into nonlinear systems of algebraic equations which can be simply solved. Error analysis of the proposed method is investigated and the efficiency of this method is shown on some concrete examples. The obtained results reveal that the proposed method is very accurate and efficient. As two useful applications, the proposed method is applied to obtain approximate solutions of stochastic population growth models and the stochastic pendulum problem.

  8. Distributed Algorithms for Probabilistic Solution of Computational Vision Problems.

    DTIC Science & Technology

    1988-03-01

34 targets. Legters and Young (1982) developed an operator-based approach using foreground and background models and solved a least-squares minimiza… (1960), "Finite Markov Chains", Van Nostrand, New York. Legters, G.R., and Young, T.Y. (1982), "A Mathematical Model for Computer Image Tracking

  9. Design Rationale for a Complex Performance Assessment

    ERIC Educational Resources Information Center

    Williamson, David M.; Bauer, Malcolm; Steinberg, Linda S.; Mislevy, Robert J.; Behrens, John T.; DeMark, Sarah F.

    2004-01-01

    In computer-based interactive environments meant to support learning, students must bring a wide range of relevant knowledge, skills, and abilities to bear jointly as they solve meaningful problems in a learning domain. To function effectively as an assessment, a computer system must additionally be able to evoke and interpret observable evidence…

  10. Computer-Based Physics: An Anthology.

    ERIC Educational Resources Information Center

    Blum, Ronald, Ed.

    Designed to serve as a guide for integrating interactive problem-solving or simulating computers into a college-level physics course, this anthology contains nine articles each of which includes an introduction, a student manual, and a teacher's guide. Among areas covered in the articles are the computerized reduction of data to a Gaussian…

  11. A Model for Intelligent Computer-Aided Education Systems.

    ERIC Educational Resources Information Center

    Du Plessis, Johan P.; And Others

    1995-01-01

    Proposes a model for intelligent computer-aided education systems that is based on cooperative learning, constructive problem-solving, object-oriented programming, interactive user interfaces, and expert system techniques. Future research is discussed, and a prototype for teaching mathematics to 10- to 12-year-old students is appended. (LRW)

  12. Learning and Teaching Information Technology--Computer Skills in Context. ERIC Digest.

    ERIC Educational Resources Information Center

    Eisenberg, Michael B.; Johnson, Doug

    This digest describes an integrated approach to teaching computer skills in K-12 schools. The introductory section discusses the importance of integrating information skills into the curriculum. "Technology Skills for Information Problem Solving: A Curriculum Based on the Big6 Skills Approach" (Michael B. Eisenberg, Doug Johnson, and…

  13. Enhancing Student Performance Using Tablet Computers

    ERIC Educational Resources Information Center

    Enriquez, Amelito G.

    2010-01-01

    Tablet PCs have the potential to change the dynamics of classroom interaction through wireless communication coupled with pen-based computing technology that is suited for analyzing and solving engineering problems. This study focuses on how tablet PCs and wireless technology can be used during classroom instruction to create an Interactive…

  14. Computation of non-monotonic Lyapunov functions for continuous-time systems

    NASA Astrophysics Data System (ADS)

    Li, Huijuan; Liu, AnPing

    2017-09-01

    In this paper, we propose two methods to compute non-monotonic Lyapunov functions for continuous-time systems which are asymptotically stable. The first method is to solve a linear optimization problem on a compact and bounded set. The proposed linear programming based algorithm delivers a CPA1

  15. MPSalsa Version 1.5: A Finite Element Computer Program for Reacting Flow Problems: Part 1 - Theoretical Development

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Devine, K.D.; Hennigan, G.L.; Hutchinson, S.A.

    1999-01-01

The theoretical background for the finite element computer program, MPSalsa Version 1.5, is presented in detail. MPSalsa is designed to solve laminar or turbulent low Mach number, two- or three-dimensional incompressible and variable density reacting fluid flows on massively parallel computers, using a Petrov-Galerkin finite element formulation. The code has the capability to solve coupled fluid flow (with auxiliary turbulence equations), heat transport, multicomponent species transport, and finite-rate chemical reactions, and to solve coupled multiple Poisson or advection-diffusion-reaction equations. The program employs the CHEMKIN library to provide a rigorous treatment of multicomponent ideal gas kinetics and transport. Chemical reactions occurring in the gas phase and on surfaces are treated by calls to CHEMKIN and SURFACE CHEMKIN, respectively. The code employs unstructured meshes, using the EXODUS II finite element database suite of programs for its input and output files. MPSalsa solves both transient and steady flows by using fully implicit time integration, an inexact Newton method, and iterative solvers based on preconditioned Krylov methods as implemented in the Aztec solver library.

  16. Mobile code security

    NASA Astrophysics Data System (ADS)

    Ramalingam, Srikumar

    2001-11-01

A highly secure mobile agent system is very important for a mobile computing environment. The security issues in a mobile agent system comprise protecting mobile hosts from malicious agents, protecting agents from other malicious agents, protecting hosts from other malicious hosts, and protecting agents from malicious hosts. The first three security problems can be solved using traditional security mechanisms. Apart from using trusted hardware, very few approaches exist to protect mobile code from malicious hosts. Some of the approaches to solve this problem are the use of trusted computing, computing with encrypted functions, steganography, cryptographic traces, the Seal Calculus, etc. This paper focuses on the simulation of some of these existing techniques in the designed mobile language. Some new approaches to solve the malicious-network and agent-tampering problems are developed using a public key encryption system and steganographic concepts. The approaches are based on encrypting and hiding the partial solutions of the mobile agents. The partial results are stored, and the address of the storage is destroyed as the agent moves from one host to another. This allows only the originator to make use of the partial results. Through these approaches some of the existing problems are solved.
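
    The "encrypt and hide partial solutions" idea can be sketched as follows: each host's partial result is sealed under a key only the originator holds, so intermediate hosts cannot read it. This toy uses a SHA-256-derived XOR keystream purely for illustration; the paper itself uses public-key encryption and steganography, and a real system would use an authenticated cipher.

```python
import hashlib
import secrets

def keystream(key, n):
    """Derive n keystream bytes from SHA-256 (toy construction, not secure)."""
    out = b""
    counter = 0
    while len(out) < n:
        out += hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:n]

def seal(data, key):
    """XOR data with the keystream; applying seal() twice recovers the data."""
    return bytes(a ^ b for a, b in zip(data, keystream(key, len(data))))

originator_key = secrets.token_bytes(16)   # never leaves the originator
# Each visited host's partial result is sealed before the agent moves on.
trace = [seal(r, originator_key) for r in (b"r1@hostA", b"r2@hostB")]
# Back home, only the key holder -- the originator -- can unseal them.
recovered = [seal(c, originator_key) for c in trace]
print(recovered)  # [b'r1@hostA', b'r2@hostB']
```

    Destroying the storage address as the agent hops, as the abstract describes, would be layered on top of this sealing step.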

  17. A Cognitive Simulator for Learning the Nature of Human Problem Solving

    NASA Astrophysics Data System (ADS)

    Miwa, Kazuhisa

    Problem solving is understood as a process through which states of problem solving are transferred from the initial state to the goal state by applying adequate operators. Within this framework, knowledge and strategies are given as operators for the search. One of the most important points of researchers' interest in the domain of problem solving is to explain the performance of problem solving behavior based on the knowledge and strategies that the problem solver has. We call the interplay between problem solvers' knowledge/strategies and their behavior the causal relation between mental operations and behavior. It is crucially important, we believe, for novice learners in this domain to understand the causal relation between mental operations and behavior. Based on this insight, we have constructed a learning system in which learners can control mental operations of a computational agent that solves a task, such as knowledge, heuristics, and cognitive capacity, and can observe its behavior. We also introduce this system to a university class, and discuss which findings were discovered by the participants.

  18. Criminal genomic pragmatism: prisoners' representations of DNA technology and biosecurity.

    PubMed

    Machado, Helena; Silva, Susana

    2012-01-01

    Within the context of the use of DNA technology in crime investigation, biosecurity is perceived by different stakeholders according to their particular rationalities and interests. Very little is known about prisoners' perceptions and assessments of the uses of DNA technology in solving crime. To propose a conceptual model that serves to analyse and interpret prisoners' representations of DNA technology and biosecurity. A qualitative study using an interpretative approach based on 31 semi-structured tape-recorded interviews was carried out between May and September 2009, involving male inmates in three prisons located in the north of Portugal. The content analysis focused on the following topics: the meanings attributed to DNA and assessments of the risks and benefits of the uses of DNA technology and databasing in forensic applications. DNA was described as a record of identity, an exceptional material, and a powerful biometric identifier. The interviewees believed that DNA can be planted to incriminate suspects. Convicted offenders argued for the need to extend the criteria for the inclusion of DNA profiles in forensic databases and to restrict the removal of profiles. The conceptual model entitled criminal genomic pragmatism allows for an understanding of the views of prison inmates regarding DNA technology and biosecurity.

  19. Quasi-static earthquake cycle simulation based on nonlinear viscoelastic finite element analyses

    NASA Astrophysics Data System (ADS)

    Agata, R.; Ichimura, T.; Hyodo, M.; Barbot, S.; Hori, T.

    2017-12-01

    To explain earthquake generation processes, simulation methods of earthquake cycles have been studied. For such simulations, the combination of the rate- and state-dependent friction law at the fault plane and the boundary integral method based on Green's function in an elastic half space is widely used (e.g. Hori 2009; Barbot et al. 2012). In this approach, stress change around the fault plane due to crustal deformation can be computed analytically, while the effects of complex physics such as mantle rheology and gravity are generally not taken into account. To consider such effects, we seek to develop an earthquake cycle simulation combining crustal deformation computation based on the finite element (FE) method with the rate- and state-dependent friction law. Since the drawback of this approach is the computational cost associated with obtaining numerical solutions, we adopt a recently developed fast and scalable FE solver (Ichimura et al. 2016), which assumes use of supercomputers, to solve the problem in a realistic time. As in the previous approach, we solve the governing equations consisting of the rate- and state-dependent friction law. In solving the equations, we compute stress changes along the fault plane due to crustal deformation using FE simulation, instead of computing them by superimposing slip response function as in the previous approach. In stress change computation, we take into account nonlinear viscoelastic deformation in the asthenosphere. In the presentation, we will show simulation results in a normative three-dimensional problem, where a circular-shaped velocity-weakening area is set in a square-shaped fault plane. The results with and without nonlinear viscosity in the asthenosphere will be compared. We also plan to apply the developed code to simulate the post-earthquake deformation of a megathrust earthquake, such as the 2011 Tohoku earthquake. 
Acknowledgment: The results were obtained using the K computer at the RIKEN (Proposal number hp160221).

  20. Unstructured Finite Volume Computational Thermo-Fluid Dynamic Method for Multi-Disciplinary Analysis and Design Optimization

    NASA Technical Reports Server (NTRS)

    Majumdar, Alok; Schallhorn, Paul

    1998-01-01

This paper describes a finite volume computational thermo-fluid dynamics method for solving the Navier-Stokes equations in conjunction with the energy equation and a thermodynamic equation of state in an unstructured coordinate system. The system of equations has been solved by a simultaneous Newton-Raphson method and compared with several benchmark solutions. Excellent agreement has been obtained in each case, and the method has been found to be significantly faster than conventional Computational Fluid Dynamics (CFD) methods; it therefore has the potential for implementation in multi-disciplinary analysis and design optimization in fluid and thermal systems. The paper also describes an algorithm for design optimization based on the Newton-Raphson method which has recently been tested in a turbomachinery application.
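
    The simultaneous Newton-Raphson strategy can be illustrated on a small coupled system; the sketch below is generic (a circle-line intersection), not the paper's thermo-fluid equations.

```python
import numpy as np

def newton_system(F, J, x0, tol=1e-12, max_iter=50):
    """Solve F(x) = 0 by simultaneous Newton-Raphson with Jacobian J."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        dx = np.linalg.solve(J(x), -F(x))  # linearize, solve for the update
        x = x + dx
        if np.linalg.norm(dx) < tol:       # converged when the step is tiny
            break
    return x

# Coupled system: the circle x^2 + y^2 = 4 intersected with the line y = x.
F = lambda v: np.array([v[0] ** 2 + v[1] ** 2 - 4.0, v[1] - v[0]])
J = lambda v: np.array([[2.0 * v[0], 2.0 * v[1]], [-1.0, 1.0]])

root = newton_system(F, J, [1.0, 2.0])
print(root)  # approximately [sqrt(2), sqrt(2)]
```

    Solving all residual equations simultaneously through one Jacobian, rather than iterating between segregated equations, is what gives the method its quadratic convergence near the solution.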

  1. Alkylpurine glycosylase D employs DNA sculpting as a strategy to extrude and excise damaged bases.

    PubMed

    Kossmann, Bradley; Ivanov, Ivaylo

    2014-07-01

    Alkylpurine glycosylase D (AlkD) exhibits a unique base excision strategy. Instead of interacting directly with the lesion, the enzyme engages the non-lesion DNA strand. AlkD induces flipping of the alkylated and opposing base accompanied by DNA stack compression. Since this strategy leaves the alkylated base solvent exposed, the means to achieve enzymatic cleavage had remained unclear. We determined a minimum energy path for flipping out a 3-methyl adenine by AlkD and computed a potential of mean force along this path to delineate the energetics of base extrusion. We show that AlkD acts as a scaffold to stabilize three distinct DNA conformations, including the final extruded state. These states are almost equivalent in free energy and separated by low barriers. Thus, AlkD acts by sculpting the global DNA conformation to achieve lesion expulsion from DNA. N-glycosidic bond scission is then facilitated by a backbone phosphate group proximal to the alkylated base.

  2. Computer Programming: A Medium for Teaching Problem Solving.

    ERIC Educational Resources Information Center

    Casey, Patrick J.

    1997-01-01

    Argues that including computer programming in the curriculum as a medium for instruction is a feasible alternative for teaching problem solving. Discusses the nature of problem solving; the problem-solving elements of discovery, motivation, practical learning situations and flexibility which are inherent in programming; capabilities of computer…

  3. In silico studies to explore the mutagenic ability of 5-halo/oxy/li-oxy-uracil bases with guanine of DNA base pairs.

    PubMed

    Jana, Kalyanashis; Ganguly, Bishwajit

    2014-10-16

    DNA nucleobases are reactive in nature and undergo modifications by deamination, oxidation, alkylation, or hydrolysis processes. Many such modified bases are susceptible to mutagenesis when formed in cellular DNA. The mutagenesis can occur by mispairing with DNA nucleobases by a DNA polymerase during replication. We have computationally studied the mispairing of DNA bases with unnatural bases. 5-Halo uracils have been studied as mispairs in mutagenesis; however, reports on their different forms are scarce in the literature. The stability of mispairs with the keto, enol, and ionized forms of 5-halo-uracil has been computed at the M06-2X/6-31+G** level of theory. The enol form of 5-halo-uracil showed remarkable stability toward DNA mispairing compared to the corresponding keto and ionized forms. The (F)U-G mispair showed the highest stability in the series, and the (Halo)U(enol/ionized)-G mispair interaction energies are more stable than that of the natural G-C base pair of DNA. To enhance the stability of DNA mispairs, we have introduced a hydroxyl group in place of the halogen atoms, which provides additional hydrogen-bonding interactions in the system while forming a 5-membered ring. The study has been further extended with lithiated 5-hydroxymethyl-uracil to stabilize the DNA mispair. The (CH2OLi)U(ionized)-G mispair has shown the highest stability (ΔG = -32.4 kcal/mol) with multiple O-Li interactions. AIM (atoms in molecules) and EDA (energy decomposition analysis) analyses have been performed to examine the nature of the noncovalent interactions in such mispairs. EDA analysis has shown that electrostatic energy contributes most to the interaction energy of the mispairs. The higher stability achieved in these studied mispairs can play a pivotal role in mutagenesis and can help attain mutations for many desired biological processes.

  4. Fast parallel molecular algorithms for DNA-based computation: factoring integers.

    PubMed

    Chang, Weng-Long; Guo, Minyi; Ho, Michael Shan-Hui

    2005-06-01

    The RSA public-key cryptosystem is an algorithm that converts input data to an unrecognizable encryption and converts the unrecognizable data back into its original decryption form. The security of the RSA public-key cryptosystem is based on the difficulty of factoring the product of two large prime numbers. This paper demonstrates how to factor the product of two large prime numbers, a breakthrough in basic biological operations using a molecular computer. To achieve this, we propose three DNA-based algorithms (a parallel subtractor, a parallel comparator, and parallel modular arithmetic) and formally verify our designed molecular solutions for factoring the product of two large prime numbers. Furthermore, this work indicates that cryptosystems using public keys are perhaps insecure, and it also presents clear evidence of the ability of molecular computing to perform complicated mathematical operations.
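
    The bit-level arithmetic that the proposed DNA algorithms implement with strand operations can be sketched in conventional code; the ripple-borrow subtractor below is only a silicon analogue of the molecular parallel subtractor (the bit width and operand values are arbitrary examples):

```python
# Conceptual sketch only: a bit-level ripple-borrow subtractor, the
# arithmetic the paper's DNA algorithms carry out with strand
# operations (one tube per bit, massively in parallel).

def bits(n, width):
    """Little-endian bit list of n."""
    return [(n >> i) & 1 for i in range(width)]

def subtract(a_bits, b_bits):
    """Ripple-borrow subtraction a - b on little-endian bit lists."""
    result, borrow = [], 0
    for a, b in zip(a_bits, b_bits):
        d = a - b - borrow
        result.append(d & 1)
        borrow = 1 if d < 0 else 0
    return result, borrow

a, b = bits(45, 8), bits(18, 8)
diff, borrow = subtract(a, b)
value = sum(bit << i for i, bit in enumerate(diff))  # 27
```

    The comparator of the paper can be viewed the same way: subtract and inspect the final borrow bit to decide which operand is larger.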

  5. DNA nanotechnology: a future perspective

    PubMed Central

    2013-01-01

    In addition to its genetic function, DNA is one of the most distinct and smart self-assembling nanomaterials. DNA nanotechnology exploits the predictable self-assembly of DNA oligonucleotides to design and assemble innovative and highly discrete nanostructures. Highly ordered DNA motifs are capable of providing an ultra-fine framework for the next generation of nanofabrications. The majority of these applications are based upon the complementarity of DNA base pairing: adenine with thymine, and guanine with cytosine. DNA provides an intelligent route for the creation of nanoarchitectures with programmable and predictable patterns. DNA strands twist along one helix for a number of bases before switching to the other helix by passing through a crossover junction. The association of two crossovers keeps the helices parallel and holds them tightly together, allowing the assembly of bigger structures. Because of the DNA molecule's unique and novel characteristics, it can easily be applied in a vast variety of multidisciplinary research areas like biomedicine, computer science, nano/optoelectronics, and bionanotechnology. PMID:23497147

  6. ENGAGE: A Game Based Learning and Problem Solving Framework

    DTIC Science & Technology

    2012-08-15

    The multiplayer card game Creature Capture now supports an offline multiplayer mode (sharing a single computer), in response to feedback from teachers... The Planetopia overworld will be ready for use by a number of physical schools as well as integrated into multiple online teaching resources. The games will be...

  7. DNA barcoding for molecular identification of Demodex based on mitochondrial genes.

    PubMed

    Hu, Li; Yang, YuanJun; Zhao, YaE; Niu, DongLing; Yang, Rui; Wang, RuiLing; Lu, Zhaohui; Li, XiaoQi

    2017-12-01

    There has been no widely accepted DNA barcode for species identification of Demodex. In this study, we attempted to solve this issue. First, mitochondrial cox1-5' and 12S gene fragments of Demodex folliculorum, D. brevis, D. canis, and D. caprae were amplified, cloned, and sequenced for the first time; intra/interspecific divergences were computed and phylogenetic trees were reconstructed. Then, divergence frequency distribution plots of those two gene fragments were drawn together with the mtDNA cox1-middle region and 16S obtained in previous studies. Finally, their identification efficiency was evaluated by comparing barcoding gaps. Results indicated that 12S had the higher identification efficiency. Specifically, for the cox1-5' region of the four Demodex species, intraspecific divergences were less than 2.0%, and interspecific divergences were 21.1-31.0%; for 12S, intraspecific divergences were less than 1.4%, and interspecific divergences were 20.8-26.9%. The phylogenetic trees demonstrated that the four Demodex species clustered separately, and the divergence frequency distribution plot showed that the largest intraspecific divergence of 12S (1.4%) was less than that of the cox1-5' region (2.0%), cox1-middle region (3.1%), and 16S (2.8%). The barcoding gap of 12S was 19.4%, larger than that of the cox1-5' region (19.1%), cox1-middle region (11.3%), and 16S (13.0%); the interspecific divergence span of 12S was 6.2%, smaller than that of the cox1-5' region (10.0%), cox1-middle region (14.1%), and 16S (11.4%). Moreover, 12S has a moderate length (517 bp) that can be sequenced in a single run. Therefore, we propose that mtDNA 12S is more suitable than cox1 and 16S as a DNA barcode for classification and identification of Demodex at the lower category level.
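
    The divergence and barcoding-gap quantities compared above can be illustrated with a minimal sketch (the toy sequences and species labels below are invented, not the study's Demodex data):

```python
# Hedged illustration with made-up toy sequences: uncorrected
# p-distance divergences and the "barcoding gap", i.e. the difference
# between the smallest interspecific and the largest intraspecific
# divergence.

def p_distance(s1, s2):
    """Fraction of differing sites between two aligned sequences."""
    return sum(a != b for a, b in zip(s1, s2)) / len(s1)

# Two toy "species", two individuals each (aligned, equal length).
species = {
    "sp1": ["ACGTACGTAC", "ACGTACGTAT"],
    "sp2": ["TGCATGCAAC", "TGCATGCAAT"],
}

intra = [p_distance(a, b) for seqs in species.values()
         for i, a in enumerate(seqs) for b in seqs[i + 1:]]
inter = [p_distance(a, b) for a in species["sp1"] for b in species["sp2"]]

barcoding_gap = min(inter) - max(intra)  # positive gap = usable barcode
```

    A marker is a usable barcode when this gap is clearly positive, which is the criterion the study applies when ranking 12S above cox1 and 16S.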

  8. Internet computer coaches for introductory physics problem solving

    NASA Astrophysics Data System (ADS)

    Xu Ryan, Qing

    The ability to solve problems in a variety of contexts is becoming increasingly important in our rapidly changing technological society. Problem-solving is a complex process that is important for everyday life and crucial for learning physics. Although there is a great deal of effort to improve student problem-solving skills throughout the educational system, national studies have shown that the majority of students emerge from such courses having made little progress toward developing good problem-solving skills. The Physics Education Research Group at the University of Minnesota has been developing Internet computer coaches to help students become more expert-like problem solvers. During the Fall 2011 and Spring 2013 semesters, the coaches were introduced into large sections (200+ students) of the calculus-based introductory mechanics course at the University of Minnesota. This dissertation will address the research background of the project, including the pedagogical design of the coaches and the assessment of problem solving. The methodological framework for conducting the experiments will be explained. The data collected from the large-scale experimental studies will be discussed from the following aspects: the usage and usability of these coaches; the usefulness perceived by students; and the usefulness measured by the final exam and a problem-solving rubric. It will also address the implications drawn from this study, including using the data to direct future coach design, and the difficulties in conducting authentic assessment of problem-solving.

  9. Evidence of pervasive biologically functional secondary structures within the genomes of eukaryotic single-stranded DNA viruses.

    PubMed

    Muhire, Brejnev Muhizi; Golden, Michael; Murrell, Ben; Lefeuvre, Pierre; Lett, Jean-Michel; Gray, Alistair; Poon, Art Y F; Ngandu, Nobubelo Kwanele; Semegni, Yves; Tanov, Emil Pavlov; Monjane, Adérito Luis; Harkins, Gordon William; Varsani, Arvind; Shepherd, Dionne Natalie; Martin, Darren Patrick

    2014-02-01

    Single-stranded DNA (ssDNA) viruses have genomes that are potentially capable of forming complex secondary structures through Watson-Crick base pairing between their constituent nucleotides. A few of the structural elements formed by such base pairings are, in fact, known to have important functions during the replication of many ssDNA viruses. Unknown, however, are (i) whether numerous additional ssDNA virus genomic structural elements predicted to exist by computational DNA folding methods actually exist and (ii) whether those structures that do exist have any biological relevance. We therefore computationally inferred lists of the most evolutionarily conserved structures within a diverse selection of animal- and plant-infecting ssDNA viruses drawn from the families Circoviridae, Anelloviridae, Parvoviridae, Nanoviridae, and Geminiviridae and analyzed these for evidence of natural selection favoring the maintenance of these structures. While we find evidence that is consistent with purifying selection being stronger at nucleotide sites that are predicted to be base paired than at sites predicted to be unpaired, we also find strong associations between sites that are predicted to pair with one another and site pairs that are apparently coevolving in a complementary fashion. Collectively, these results indicate that natural selection actively preserves much of the pervasive secondary structure that is evident within eukaryote-infecting ssDNA virus genomes and, therefore, that much of this structure is biologically functional. Lastly, we provide examples of various highly conserved but completely uncharacterized structural elements that likely have important functions within some of the ssDNA virus genomes analyzed here.

  10. Evidence of Pervasive Biologically Functional Secondary Structures within the Genomes of Eukaryotic Single-Stranded DNA Viruses

    PubMed Central

    Muhire, Brejnev Muhizi; Golden, Michael; Murrell, Ben; Lefeuvre, Pierre; Lett, Jean-Michel; Gray, Alistair; Poon, Art Y. F.; Ngandu, Nobubelo Kwanele; Semegni, Yves; Tanov, Emil Pavlov; Monjane, Adérito Luis; Harkins, Gordon William; Varsani, Arvind; Shepherd, Dionne Natalie

    2014-01-01

    Single-stranded DNA (ssDNA) viruses have genomes that are potentially capable of forming complex secondary structures through Watson-Crick base pairing between their constituent nucleotides. A few of the structural elements formed by such base pairings are, in fact, known to have important functions during the replication of many ssDNA viruses. Unknown, however, are (i) whether numerous additional ssDNA virus genomic structural elements predicted to exist by computational DNA folding methods actually exist and (ii) whether those structures that do exist have any biological relevance. We therefore computationally inferred lists of the most evolutionarily conserved structures within a diverse selection of animal- and plant-infecting ssDNA viruses drawn from the families Circoviridae, Anelloviridae, Parvoviridae, Nanoviridae, and Geminiviridae and analyzed these for evidence of natural selection favoring the maintenance of these structures. While we find evidence that is consistent with purifying selection being stronger at nucleotide sites that are predicted to be base paired than at sites predicted to be unpaired, we also find strong associations between sites that are predicted to pair with one another and site pairs that are apparently coevolving in a complementary fashion. Collectively, these results indicate that natural selection actively preserves much of the pervasive secondary structure that is evident within eukaryote-infecting ssDNA virus genomes and, therefore, that much of this structure is biologically functional. Lastly, we provide examples of various highly conserved but completely uncharacterized structural elements that likely have important functions within some of the ssDNA virus genomes analyzed here. PMID:24284329

  11. Design and Application of Interactive Simulations in Problem-Solving in University-Level Physics Education

    NASA Astrophysics Data System (ADS)

    Ceberio, Mikel; Almudí, José Manuel; Franco, Ángel

    2016-08-01

    In recent years, interactive computer simulations have been progressively integrated into the teaching of the sciences and have contributed to significant improvements in the teaching-learning process. Practicing problem-solving is a key factor in science and engineering education. The aim of this study was to design simulation-based problem-solving teaching materials and assess their effectiveness in improving students' ability to solve problems in university-level physics. Firstly, we analyze the effect of using simulation-based materials on the development of students' skills in employing procedures that are typically used in the scientific method of problem-solving. We found that a significant percentage of the experimental students used expert-type scientific procedures such as qualitative analysis of the problem, making hypotheses, and analysis of results. At the end of the course, only a minority of the students persisted with habits based solely on mathematical equations. Secondly, we compare the problem-solving effectiveness of the experimental group students with that of students taught conventionally. We found that the implementation of the problem-solving strategy improved the experimental students' results in obtaining academically correct solutions to standard textbook problems. Thirdly, we explore students' satisfaction with the simulation-based problem-solving teaching materials and found that the majority appear to be satisfied with the proposed methodology and adopted a favorable attitude toward learning problem-solving. The research was carried out among first-year Engineering Degree students.

  12. A Fast Method for Embattling Optimization of Ground-Based Radar Surveillance Network

    NASA Astrophysics Data System (ADS)

    Jiang, H.; Cheng, H.; Zhang, Y.; Liu, J.

    A growing number of space activities have created an orbital debris environment that poses increasing impact risks to existing space systems and human space flight. For the safety of in-orbit spacecraft, many observation facilities are needed to catalog space objects, especially in low Earth orbit. Surveillance of low-Earth-orbit objects relies mainly on ground-based radar; because of the limitations of existing radar facilities, a large number of ground-based radars will need to be built in the next few years to meet current space surveillance demands. How to optimize the embattling (station layout) of a ground-based radar surveillance network is therefore a problem that needs to be solved. The traditional method is to simulate detection for all possible stations using cataloged data, comparatively analyze the various simulation results with a combinational method, and then select the best result as the station layout scheme. Each single simulation is time consuming and the combinational analysis has high computational complexity; as the number of stations increases, the complexity of the optimization problem grows exponentially, and the problem can no longer be solved with the traditional method. In this paper, the target detection procedure is simplified. First, the space coverage of a ground-based radar is simplified into a projection model of the radar's coverage at different orbit altitudes; then a simplified model of objects crossing the radar coverage is established according to the characteristics of orbital motion. After these two simplifications, the computational complexity of target detection is greatly reduced, and simulation results confirm the correctness of the simplified model.
    In addition, the detection areas of the ground-based radar network can be easily computed with the simplified model, and the embattling of the network can then be optimized with an artificial intelligence algorithm, which greatly reduces the computational complexity. Compared with the traditional method, the proposed method greatly improves computational efficiency.
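
    Once each candidate station's detection area has been reduced, via a simplified coverage model, to a set of covered orbit cells, station layout becomes a set-cover-style selection problem. The sketch below uses a simple greedy heuristic with hypothetical station/cell data; it illustrates the optimization step only and is not the paper's artificial intelligence algorithm:

```python
# Minimal sketch (all station/orbit-cell data hypothetical): greedy
# selection of radar stations maximizing the union of covered cells,
# avoiding exhaustive combinational simulation.

def greedy_embattle(candidates, n_stations):
    """Pick n_stations maximizing the union of covered orbit cells."""
    chosen, covered = [], set()
    remaining = dict(candidates)
    for _ in range(n_stations):
        # Pick the station adding the most not-yet-covered cells.
        best = max(remaining, key=lambda s: len(remaining[s] - covered))
        chosen.append(best)
        covered |= remaining.pop(best)
    return chosen, covered

candidates = {                      # station -> covered orbit cells
    "A": {1, 2, 3, 4},
    "B": {3, 4, 5},
    "C": {5, 6, 7, 8},
    "D": {1, 8},
}
chosen, covered = greedy_embattle(candidates, 2)   # picks A then C
```

    Greedy selection runs in polynomial time in the number of stations, whereas evaluating every station combination grows exponentially, which is the bottleneck the paper's simplification targets.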

  13. 'DNA Strider': a 'C' program for the fast analysis of DNA and protein sequences on the Apple Macintosh family of computers.

    PubMed Central

    Marck, C

    1988-01-01

    DNA Strider is a new integrated DNA and protein sequence analysis program written in the C language for the Macintosh Plus, SE and II computers. It has been designed as an easy-to-learn and easy-to-use program as well as a fast and efficient tool for day-to-day sequence analysis work. The program consists of a multi-window sequence editor and of various DNA and protein analysis functions. The editor may use 4 different types of sequences (DNA, degenerate DNA, RNA and one-letter coded protein) and can handle simultaneously 6 sequences of any type up to 32.5 kb each. Negative numbering of the bases is allowed for DNA sequences. All classical restriction and translation analysis functions are present and can be performed in any order on any open sequence or part of a sequence. The main feature of the program is that the same analysis function can be repeated several times on different sequences, thus generating multiple windows on the screen. Many graphic capabilities have been incorporated, such as the graphic restriction map, the hydrophobicity profile and the CAI plot (codon adaptation index according to Sharp and Li). The restriction sites search uses a newly designed fast hexamer look-ahead algorithm. Typical runtime for the search of all sites with a library of 130 restriction endonucleases is 1 second per 10,000 bases. The circular graphic restriction map of the pBR322 plasmid can therefore be computed from its sequence and displayed on the Macintosh Plus screen within 2 seconds, and its multiline restriction map obtained in a scrolling window within 5 seconds. PMID:2832831
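
    A restriction-site search can be sketched as a straightforward scan (the program's own hexamer look-ahead algorithm is faster; the enzyme recognition sequences below are the standard ones, while the input sequence is an arbitrary example):

```python
# Simplified sketch of a restriction-site scan. Positions are 0-based;
# the real program uses a faster hexamer look-ahead algorithm.

ENZYMES = {"EcoRI": "GAATTC", "BamHI": "GGATCC", "HindIII": "AAGCTT"}

def find_sites(sequence, enzymes):
    """Return {enzyme: [positions]} for every recognition-site match."""
    hits = {}
    for name, site in enzymes.items():
        positions, start = [], sequence.find(site)
        while start != -1:
            positions.append(start)
            # Restart one base later so overlapping sites are found too.
            start = sequence.find(site, start + 1)
        if positions:
            hits[name] = positions
    return hits

seq = "TTGAATTCAAGGATCCTTGAATTC"
sites = find_sites(seq, ENZYMES)   # {'EcoRI': [2, 18], 'BamHI': [10]}
```

    With a library of 130 enzymes, the naive scan does 130 passes over the sequence; a hexamer look-ahead instead indexes each 6-mer once, which is how the program reaches its quoted 1 second per 10,000 bases on 1988 hardware.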

  14. Discrete persistent-chain model for protein binding on DNA.

    PubMed

    Lam, Pui-Man; Zhen, Yi

    2011-04-01

    We describe and solve a discrete persistent-chain model of protein binding on DNA, involving an extra variable σ(i) at each site i of the DNA. This variable takes the value 1 or 0, depending on whether or not the site is occupied by a protein. In addition, if the site is occupied by a protein, there is an extra energy cost ɛ. For a small force, we obtain analytic expressions for the force-extension curve and the fraction of bound protein on the DNA. For higher forces, the model can be solved numerically to obtain force-extension curves and the average fraction of bound proteins as a function of applied force. Our model can be used to analyze experimental force-extension curves of protein binding on DNA, and hence to deduce the number of bound proteins in the case of nonspecific binding. ©2011 American Physical Society
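
    In the simplest independent-site reading of the occupancy variable, ignoring the chain elasticity that the full model couples it to, the average bound fraction follows from the two Boltzmann weights of σ(i) = 0, 1. This sketch is not the paper's solution; the chemical-potential parameter mu is an added illustrative assumption:

```python
# Hedged sketch, not the paper's full solution: with a binding energy
# cost eps per occupied site and chemical potential mu (both in units
# of kT), independent sites give an average occupancy from the
# two-state Boltzmann weights for sigma_i = 0, 1.
import math

def bound_fraction(eps, mu):
    """Average fraction of DNA sites occupied by protein."""
    w_bound = math.exp(-(eps - mu))   # weight of sigma_i = 1
    return w_bound / (1.0 + w_bound)  # weight of sigma_i = 0 is 1

# eps = mu: binding cost exactly offset, half the sites occupied.
f = bound_fraction(eps=2.0, mu=2.0)   # 0.5
```

    The full model replaces this site-by-site average with a sum over chain conformations, which is what couples the bound fraction to the applied force.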

  15. Computational Psychometrics for the Measurement of Collaborative Problem Solving Skills

    PubMed Central

    Polyak, Stephen T.; von Davier, Alina A.; Peterschmidt, Kurt

    2017-01-01

    This paper describes a psychometrically-based approach to the measurement of collaborative problem solving skills, by mining and classifying behavioral data both in real-time and in post-game analyses. The data were collected from a sample of middle school children who interacted with a game-like, online simulation of collaborative problem solving tasks. In this simulation, a user is required to collaborate with a virtual agent to solve a series of tasks within a first-person maze environment. The tasks were developed following the psychometric principles of Evidence Centered Design (ECD) and are aligned with the Holistic Framework developed by ACT. The analyses presented in this paper are an application of an emerging discipline called computational psychometrics, which is growing out of traditional psychometrics and incorporates techniques from educational data mining, machine learning and other computer/cognitive science fields. In the real-time analysis, our aim was to start with limited knowledge of skill mastery, and then demonstrate a form of continuous Bayesian evidence tracing that updates sub-skill level probabilities as new conversation flow event evidence is presented. This is performed using Bayes' rule and conversation item conditional probability tables. The items are polytomous and each response option has been tagged with a skill at a performance level. In our post-game analysis, our goal was to discover unique gameplay profiles by performing a cluster analysis of users' sub-skill performance scores based on their patterns of selected dialog responses. PMID:29238314
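
    The continuous Bayesian evidence-tracing step can be sketched with Bayes' rule and a conditional probability table; the skill states, response labels, and probabilities below are illustrative assumptions, not the paper's calibrated values:

```python
# Illustrative sketch with made-up numbers: a Bayesian evidence-tracing
# update, where a conditional probability table (CPT) maps skill
# mastery to the chance of choosing each dialog response.

def bayes_update(p_mastery, cpt, response):
    """Update P(mastery) after observing one dialog response.

    cpt[state][response] = P(response | mastery state),
    with hypothetical states "mastered" / "unmastered".
    """
    like_m = cpt["mastered"][response]
    like_u = cpt["unmastered"][response]
    numer = like_m * p_mastery
    return numer / (numer + like_u * (1.0 - p_mastery))

cpt = {  # hypothetical item-level conditional probability table
    "mastered":   {"good": 0.8, "poor": 0.2},
    "unmastered": {"good": 0.3, "poor": 0.7},
}

p = 0.5                      # limited prior knowledge of mastery
for r in ["good", "good", "poor"]:
    p = bayes_update(p, cpt, r)
```

    Each observed response shifts the mastery probability up or down, which is the "continuous evidence tracing" behavior the abstract describes.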

  16. Computational Psychometrics for the Measurement of Collaborative Problem Solving Skills.

    PubMed

    Polyak, Stephen T; von Davier, Alina A; Peterschmidt, Kurt

    2017-01-01

    This paper describes a psychometrically-based approach to the measurement of collaborative problem solving skills, by mining and classifying behavioral data both in real-time and in post-game analyses. The data were collected from a sample of middle school children who interacted with a game-like, online simulation of collaborative problem solving tasks. In this simulation, a user is required to collaborate with a virtual agent to solve a series of tasks within a first-person maze environment. The tasks were developed following the psychometric principles of Evidence Centered Design (ECD) and are aligned with the Holistic Framework developed by ACT. The analyses presented in this paper are an application of an emerging discipline called computational psychometrics, which is growing out of traditional psychometrics and incorporates techniques from educational data mining, machine learning and other computer/cognitive science fields. In the real-time analysis, our aim was to start with limited knowledge of skill mastery, and then demonstrate a form of continuous Bayesian evidence tracing that updates sub-skill level probabilities as new conversation flow event evidence is presented. This is performed using Bayes' rule and conversation item conditional probability tables. The items are polytomous and each response option has been tagged with a skill at a performance level. In our post-game analysis, our goal was to discover unique gameplay profiles by performing a cluster analysis of users' sub-skill performance scores based on their patterns of selected dialog responses.

  17. Osmylated DNA, a novel concept for sequencing DNA using nanopores

    NASA Astrophysics Data System (ADS)

    Kanavarioti, Anastassia

    2015-03-01

    Sanger sequencing has led the advances in molecular biology, while faster and cheaper next-generation technologies are urgently needed. A newer approach exploits nanopores, natural or solid-state, set in an electrical field, and obtains base-sequence information from current variations due to the passage of a ssDNA molecule through the pore. A hurdle in this approach is the fact that the four bases are chemically comparable to each other, which leads to small differences in current obstruction. ‘Base calling’ becomes even more challenging because most nanopores sense a short sequence and not individual bases. Perhaps sequencing DNA via nanopores would be more manageable if only the bases were two, and chemically very different from each other; a sequence of 1s and 0s comes to mind. Osmylated DNA comes close to such a sequence of 1s and 0s. Osmylation is the addition of osmium tetroxide bipyridine across the C5-C6 double bond of the pyrimidines. Osmylation adds almost 400% mass to the reactive base and creates a sterically and electronically notably different molecule, labeled 1, compared to the unreactive purines, labeled 0. If osmylated DNA were successfully sequenced, the result would be a sequence of osmylated pyrimidines (1) and purines (0), and not of the actual nucleobases. To solve this problem we studied the osmylation reaction with short oligos and with M13mp18, a long ssDNA, developed a UV-vis assay to measure the extent of osmylation, and designed two protocols. Protocol A uses mild conditions and yields osmylated thymidines (1), while leaving the other three bases (0) practically intact. Protocol B uses harsher conditions and effectively osmylates both pyrimidines, but not the purines. Applying these two protocols also to the complement of the target polynucleotide yields a total of four osmylated strands that collectively could define the actual base sequence of the target DNA.
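
    The two protocols define a simple base-to-bit mapping that can be sketched directly (the input sequence below is an arbitrary example, not one of the study's oligos):

```python
# Sketch of the concept: encode a DNA strand as the 1/0 pattern a
# nanopore would read after osmylation, under the two protocols
# described (A labels thymidines only; B labels both pyrimidines).

def osmylation_pattern(seq, protocol):
    """Return '1' for osmylated (labeled) bases, '0' for intact ones."""
    reactive = {"A": "T", "B": "TC"}[protocol]
    return "".join("1" if base in reactive else "0" for base in seq)

seq = "GATTACA"
pattern_a = osmylation_pattern(seq, "A")   # "0011000"
pattern_b = osmylation_pattern(seq, "B")   # "0011010"
```

    Pattern B minus pattern A locates the cytosines, and repeating both protocols on the complement strand locates A and G, which is how the four osmylated strands together recover the full base sequence.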

  18. Structure and DNA-binding of meiosis-specific protein Hop2

    NASA Astrophysics Data System (ADS)

    Zhou, Donghua; Moktan, Hem; Pezza, Roberto

    2014-03-01

    Here we report the structure elucidation of the DNA-binding domain of homologous pairing protein 2 (Hop2), which is important for genetic diversity when sperm and eggs are produced. Together with another protein, Mnd1, Hop2 enhances the strand invasion activity of the recombinase Dmc1 by over 30-fold, facilitating proper synapsis of homologous chromosomes. However, the structural and biochemical bases for the function of Hop2 and Mnd1 have not been well understood. As a first step toward such understanding, we recently solved the structure of the N-terminus of Hop2 (residues 1-84) using solution NMR. This fragment shows a typical winged-head conformation with recognized DNA-binding activity. DNA-interacting sites were then investigated by chemical shift perturbations in a titration experiment. Information about these sites was used to guide protein-DNA docking with MD simulation, revealing that helix 3 is stably lodged in the DNA major groove and that wing 1 (connecting strands 2 and 3) transiently contacts the minor groove on a nanosecond time scale. Mutagenesis analysis further confirmed the DNA-binding sites in this fragment of the protein.

  19. Structure-Based Mutational Analysis of the C-Terminal DNA-Binding Domain of Human Immunodeficiency Virus Type 1 Integrase: Critical Residues for Protein Oligomerization and DNA Binding

    PubMed Central

    Lutzke, Ramon A. Puras; Plasterk, Ronald H. A.

    1998-01-01

    The C-terminal domain of human immunodeficiency virus type 1 (HIV-1) integrase (IN) is a dimer that binds to DNA in a nonspecific manner. The structure of the minimal region required for DNA binding (IN220–270) has been solved by nuclear magnetic resonance spectroscopy. The overall fold of the C-terminal domain of HIV-1 IN is similar to those of Src homology region 3 domains. Based on the structure of IN220–270, we studied the role of 15 amino acid residues potentially involved in DNA binding and oligomerization by mutational analysis. We found that two amino acid residues, arginine 262 and leucine 234, contribute to DNA binding in the context of IN220–270, as indicated by protein-DNA UV cross-link analysis. We also analyzed mutant proteins representing portions of the full-length IN protein. Amino acid substitution of residues located in the hydrophobic dimer interface, such as L241A and L242A, results in the loss of oligomerization of IN; consequently, the levels of 3′ processing, DNA strand transfer, and intramolecular disintegration are strongly reduced. These results suggest that dimerization of the C-terminal domain of IN is important for correct multimerization of IN. PMID:9573250

  20. Experimental quantum computing to solve systems of linear equations.

    PubMed

    Cai, X-D; Weedbrook, C; Su, Z-E; Chen, M-C; Gu, Mile; Zhu, M-J; Li, Li; Liu, Nai-Le; Lu, Chao-Yang; Pan, Jian-Wei

    2013-06-07

    Solving linear systems of equations is ubiquitous in all areas of science and engineering. With rapidly growing data sets, such a task can be intractable for classical computers, as the best known classical algorithms require a time proportional to the number of variables N. A recently proposed quantum algorithm shows that quantum computers could solve linear systems in a time scale of order log(N), giving an exponential speedup over classical computers. Here we realize the simplest instance of this algorithm, solving 2×2 linear equations for various input vectors on a quantum computer. We use four quantum bits and four controlled logic gates to implement every subroutine required, demonstrating the working principle of this algorithm.
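
    For scale, the instance class realized in the experiment is classically trivial; a 2x2 system can be solved directly by Cramer's rule (the matrix and right-hand side below are arbitrary example values, not the experiment's inputs):

```python
# For comparison with the quantum demonstration: the same 2x2 instance
# class solved classically by Cramer's rule.

def solve_2x2(A, b):
    """Solve A x = b for a 2x2 matrix A via Cramer's rule."""
    (a11, a12), (a21, a22) = A
    det = a11 * a22 - a12 * a21
    # Replace each column of A by b in turn and take determinant ratios.
    x1 = (b[0] * a22 - a12 * b[1]) / det
    x2 = (a11 * b[1] - b[0] * a21) / det
    return [x1, x2]

# Toy system: 2x + y = 5, x + 3y = 10.
x = solve_2x2([[2.0, 1.0], [1.0, 3.0]], [5.0, 10.0])   # [1.0, 3.0]
```

    The point of the quantum algorithm is not this small case but the claimed O(log N) scaling for very large N, which no classical method matches.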

  1. Hybrid cloud: bridging of private and public cloud computing

    NASA Astrophysics Data System (ADS)

    Aryotejo, Guruh; Kristiyanto, Daniel Y.; Mufadhol

    2018-05-01

    Cloud computing has quickly emerged as a promising paradigm in recent years, especially for the business sector. Through cloud service providers, cloud computing is widely used by Information Technology (IT) based startup companies to grow their business. However, most businesses' awareness of data security issues is low, even though some Cloud Service Providers (CSPs) could decrypt their data. The Hybrid Cloud Deployment Model (HCDM) is open source in character, which makes it one of the more secure cloud computing models; thus HCDM may solve these data security issues. The objective of this study is to design, deploy and evaluate an HCDM as Infrastructure as a Service (IaaS). In the implementation process, the Metal as a Service (MAAS) engine was used as a base to build the actual server and nodes, followed by installing the vsftpd application, which serves as the FTP server. For comparison with HCDM, a public cloud was adopted through a public cloud interface. As a result, the design and deployment of HCDM were conducted successfully; in addition to good security, HCDM was able to transfer data significantly faster than the public cloud. To the best of our knowledge, the Hybrid Cloud Deployment Model is one of the more secure cloud computing models due to its open-source character. Furthermore, this study will serve as a base for future studies of the Hybrid Cloud Deployment Model, which may be relevant for solving the major security issues of IT-based startup companies, especially in Indonesia.

  2. Computer-aided design of nano-filter construction using DNA self-assembly

    NASA Astrophysics Data System (ADS)

    Mohammadzadegan, Reza; Mohabatkar, Hassan

    2007-01-01

    Computer-aided design plays a fundamental role in both top-down and bottom-up nano-system fabrication. This paper presents a bottom-up nano-filter patterning process based on DNA self-assembly. In this study we designed a new method to construct fully designed nano-filters with pores between 5 nm and 9 nm in diameter. Our calculations illustrate that by constructing such a nano-filter we would be able to separate many kinds of molecules.

  3. geneGIS: Computational Tools for Spatial Analyses of DNA Profiles with Associated Photo-Identification and Telemetry Records of Marine Mammals

    DTIC Science & Technology

    2012-09-30

    computational tools provide the ability to display, browse, select, filter and summarize spatio-temporal relationships of these individual-based...her research assistant at Esri, Shaun Walbridge, and members of the Marine Mammal Institute ( MMI ), including Tomas Follet and Debbie Steel. This...Genomics Laboratory, MMI , OSU. 4 As part of the geneGIS initiative, these SPLASH photo-identification records and the geneSPLASH DNA profiles

  4. Aggregation Pheromone System: A Real-parameter Optimization Algorithm using Aggregation Pheromones as the Base Metaphor

    NASA Astrophysics Data System (ADS)

    Tsutsui, Shigeyosi

    This paper proposes an aggregation pheromone system (APS) for solving real-parameter optimization problems using the collective behavior of individuals which communicate via aggregation pheromones. APS was tested on several test functions used in evolutionary computation. The results showed that APS could solve real-parameter optimization problems fairly well. A sensitivity analysis of the control parameters of APS is also presented.

  5. Mapping an Experiment-Based Assessment of Collaborative Behavior onto Collaborative Problem Solving in PISA 2015: A Cluster Analysis Approach for Collaborator Profiles

    ERIC Educational Resources Information Center

    Herborn, Katharina; Mustafic, Maida; Greiff, Samuel

    2017-01-01

    Collaborative problem solving (CPS) assessment is a new academic research field with a number of educational implications. In 2015, the Programme for International Student Assessment (PISA) assessed CPS with a computer-simulated human-agent (H-A) approach that claimed to measure 12 individual CPS skills for the first time. After reviewing the…

  6. Engineering and Computing Portal to Solve Environmental Problems

    NASA Astrophysics Data System (ADS)

    Gudov, A. M.; Zavozkin, S. Y.; Sotnikov, I. Y.

    2018-01-01

    This paper describes the architecture and services of the Engineering and Computing Portal, a complex solution that provides access to high-performance computing resources and makes it possible to carry out computational experiments, teach parallel technologies, and solve computing tasks, including technogenic safety ones.

  7. International Symposium on Special Topics in Chemical Propulsion: Base Bleed Held in Athens, Greece on November 23, 24, 25, 1988

    DTIC Science & Technology

    1988-11-01

    Title & Author Number • "Base Bleed Technology in Perspective" Talk G. V. Bull SESSION I Chemistry and Ignition / Combustion Behaviour of Base...Base Bleed Projectiles: Bleed... Review of the Fluid Dynamic Aspect of the Effect of Base Bleed, J. Sahu, and W. L. Chow. H-2 "Navier-Stokes Computations ...numerical computations of the flow by solving the Navier-Stokes Equations becoming available, the effect of base bleed can be illustrated by providing

  8. Decomposition of timed automata for solving scheduling problems

    NASA Astrophysics Data System (ADS)

    Nishi, Tatsushi; Wakatake, Masato

    2014-03-01

    A decomposition algorithm for scheduling problems based on a timed automata (TA) model is proposed. The problem is represented as an optimal state transition problem for TA. The model comprises the parallel composition of submodels such as jobs and resources. The procedure of the proposed methodology can be divided into two steps. The first step is to decompose the TA model into several submodels by using a decomposability condition. The second step is to combine the individual solutions of the subproblems for the decomposed submodels by the penalty function method. A feasible solution for the entire model is derived through iterated computation, solving the subproblem for each submodel. The proposed methodology is applied to solve flowshop and jobshop scheduling problems. Computational experiments demonstrate the effectiveness of the proposed algorithm compared with a conventional TA scheduling algorithm without decomposition.

  9. Application of artificial intelligence to pharmacy and medicine.

    PubMed

    Dasta, J F

    1992-04-01

    Artificial intelligence (AI) is a branch of computer science dealing with solving problems using symbolic programming. It has evolved into a problem-solving science with applications in business, engineering, and health care. One application of AI is expert system development. An expert system consists of a knowledge base and an inference engine, coupled with a user interface. A crucial aspect of expert system development is knowledge acquisition and implementing computable ways to solve problems. Several expert systems have been developed in medicine to assist physicians with medical diagnosis. Recently, several programs focusing on drug therapy have been described. They provide guidance on drug interactions, drug therapy monitoring, and drug formulary selection. There are many aspects of pharmacy on which AI can have an impact, and the reader is challenged to consider these possibilities because they may some day become a reality in pharmacy.

  10. Computer Based Collaborative Problem Solving for Introductory Courses in Physics

    NASA Astrophysics Data System (ADS)

    Ilie, Carolina; Lee, Kevin

    2010-03-01

    We discuss computer-based collaborative problem solving in a recitation-style setting. The course is designed by Lee [1], and the idea was proposed earlier by Christian, Belloni and Titus [2,3]. The students find the problems on a web page containing simulations (physlets) and write the solutions on an accompanying worksheet after discussing them with a classmate. Physlets have the advantage of being much more like real-world problems than textbook problems. We also compare two protocols for web-based instruction using simulations in an introductory physics class [1]. The inquiry protocol allowed students to control input parameters while the worked example protocol did not. We will discuss which of the two methods is more efficient in relation to Scientific Discovery Learning and Cognitive Load Theory. 1. Lee, Kevin M., Nicoll, Gayle and Brooks, Dave W. (2004). ``A Comparison of Inquiry and Worked Example Web-Based Instruction Using Physlets'', Journal of Science Education and Technology 13, No. 1: 81-88. 2. Christian, W., and Belloni, M. (2001). Physlets: Teaching Physics With Interactive Curricular Material, Prentice Hall, Englewood Cliffs, NJ. 3. Christian, W., and Titus, A. (1998). ``Developing web-based curricula using Java Physlets.'' Computers in Physics 12: 227-232.

  11. CHEMEX; Understanding and Solving Problems in Chemistry. A Computer-Assisted Instruction Program for General Chemistry.

    ERIC Educational Resources Information Center

    Lower, Stephen K.

    A brief overview of CHEMEX--a problem-solving, tutorial style computer-assisted instructional course--is provided and sample problems are offered. In CHEMEX, students receive problems in advance and attempt to solve them before moving through the computer program, which assists them in overcoming difficulties and serves as a review mechanism.…

  12. Structural Studies of E. coli Topoisomerase III-DNA Complexes Reveal A Novel Type IA Topoisomerase-DNA Conformational Intermediate

    PubMed Central

    Changela, Anita; DiGate, Russell J.; Mondragón, Alfonso

    2007-01-01

    Summary E. coli DNA topoisomerase III belongs to the type IA family of DNA topoisomerases, which transiently cleave single-stranded DNA (ssDNA) via a 5′ phosphotyrosine intermediate. We have solved crystal structures of wild-type E. coli topoisomerase III bound to an 8-base ssDNA molecule in three different pH environments. The structures reveal the enzyme in three distinct conformational states while bound to DNA. One conformation resembles the one observed previously with a DNA-bound, catalytically inactive mutant of topoisomerase III where DNA binding realigns catalytic residues to form a functional active site. Another conformation represents a novel intermediate in which DNA is bound along the ssDNA-binding groove but does not enter the active site, which remains in a catalytically inactive, closed state. A third conformation shows an intermediate state where the enzyme is still in a closed state, but the ssDNA is starting to invade the active site. For the first time, the active site region in the presence of both the catalytic tyrosine and ssDNA substrate is revealed for a type IA DNA topoisomerase, although there is no evidence of ssDNA cleavage. Comparative analysis of the various conformational states suggests a sequence of domain movements undertaken by the enzyme upon substrate binding. PMID:17331537

  13. Convergence acceleration of the Proteus computer code with multigrid methods

    NASA Technical Reports Server (NTRS)

    Demuren, A. O.; Ibraheem, S. O.

    1992-01-01

    Presented here is the first part of a study to implement convergence acceleration techniques based on the multigrid concept in the Proteus computer code. A review is given of previous studies on the implementation of multigrid methods in computer codes for compressible flow analysis. Also presented is a detailed stability analysis of upwind and central-difference based numerical schemes for solving the Euler and Navier-Stokes equations. Results are given of a convergence study of the Proteus code on computational grids of different sizes. The results presented here form the foundation for the implementation of multigrid methods in the Proteus code.

  14. Robust Decision Making: The Cognitive and Computational Modeling of Team Problem Solving for Decision Making under Complex and Dynamic Conditions

    DTIC Science & Technology

    2015-07-14

    AFRL-OSR-VA-TR-2015-0202 Robust Decision Making: The Cognitive and Computational Modeling of Team Problem Solving for Decision Making under Complex...Computational Modeling of Team Problem Solving for Decision Making Under Complex and Dynamic Conditions 5a. CONTRACT NUMBER 5b. GRANT NUMBER FA9550-12-1...functioning as they solve complex problems, and propose the means to improve the performance of teams, under changing or adversarial conditions. By

  15. Computer-Based Cognitive Tools in Teacher Training: The COG-TECH Projects

    ERIC Educational Resources Information Center

    Orhun, Emrah

    2003-01-01

    The COG-TECH (Cognitive Technologies for Problem Solving and Learning) Network conducted three international projects between 1994 and 2001 under the auspices of the European Commission. The main purpose of these projects was to train teacher educators in the Mediterranean countries to use computers as effective pedagogical tools. The summer…

  16. Improving the Fraction Word Problem Solving of Students with Mathematics Learning Disabilities: Interactive Computer Application

    ERIC Educational Resources Information Center

    Shin, Mikyung; Bryant, Diane P.

    2017-01-01

    Students with mathematics learning disabilities (MLD) have a weak understanding of fraction concepts and skills, which are foundations of algebra. Such students might benefit from computer-assisted instruction that utilizes evidence-based instructional components (cognitive strategies, feedback, virtual manipulatives). As a pilot study using a…

  17. Observations in the Computer Room: L2 Output and Learner Behaviour

    ERIC Educational Resources Information Center

    Leahy, Christine

    2004-01-01

    This article draws on second language theory, particularly output theory as defined by Swain (1995), in order to conceptualise observations made in a computer-assisted language learning setting. It investigates second language output and learner behaviour within an electronic role-play setting, based on a subject-specific problem solving task and…

  18. Design and Analysis of Compact DNA Strand Displacement Circuits for Analog Computation Using Autocatalytic Amplifiers.

    PubMed

    Song, Tianqi; Garg, Sudhanshu; Mokhtar, Reem; Bui, Hieu; Reif, John

    2018-01-19

    A main goal in DNA computing is to build DNA circuits to compute designated functions using a minimal number of DNA strands. Here, we propose a novel architecture to build compact DNA strand displacement circuits to compute a broad scope of functions in an analog fashion. A circuit in this architecture is composed of three autocatalytic amplifiers, and the amplifiers interact to perform computation. We present DNA circuits that compute the functions sqrt(x), ln(x) and exp(x) for x in tunable ranges, together with simulation results. A key innovation in our architecture, inspired by Napier's use of logarithm transforms to compute square roots on a slide rule, is to make use of autocatalytic amplifiers to do logarithmic and exponential transforms between concentration and time. In particular, we convert from the input, encoded by the initial concentration of the input DNA strand, to time, and then back again to the output, encoded by the concentration of the output DNA strand at equilibrium. This combined use of strand-concentration and time encoding of computational values may have impact on other forms of molecular computation.
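
    The slide-rule principle described above (concentration → time → concentration via log and exp transforms) can be sketched numerically. This is a toy kinetic model of an idealized autocatalytic amplifier, not the paper's actual DNA chemistry; the rate constant `K` and detection threshold `THR` are arbitrary assumptions:

    ```python
    import math

    K = 1.0    # assumed amplifier rate constant (arbitrary units)
    THR = 1.0  # assumed detection threshold concentration

    def time_to_threshold(x0, k=K, thr=THR, dt=1e-4):
        """Euler-integrate the autocatalytic growth dX/dt = k*X until X >= thr.
        The hitting time is ln(thr/x0)/k, so it encodes the logarithm of the
        input concentration: this is the concentration-to-time conversion."""
        x, t = x0, 0.0
        while x < thr:
            x += k * x * dt
            t += dt
        return t

    def analog_sqrt(x):
        # log stage: measured time t satisfies ln(x) = ln(THR) - K*t
        t = time_to_threshold(x)
        ln_x = math.log(THR) - K * t
        # exp stage: an amplifier run for time ln_x/2 starting at 1.0 would
        # reach exp(ln_x/2) = sqrt(x); here we evaluate that growth directly
        return math.exp(0.5 * ln_x)
    ```

    For example, `analog_sqrt(0.25)` recovers a value close to 0.5, with the small error coming from the Euler discretization of the amplifier kinetics.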

  19. Reproducing the Ensemble Average Polar Solvation Energy of a Protein from a Single Structure: Gaussian-Based Smooth Dielectric Function for Macromolecular Modeling.

    PubMed

    Chakravorty, Arghya; Jia, Zhe; Li, Lin; Zhao, Shan; Alexov, Emil

    2018-02-13

    Typically, the ensemble average polar component of solvation energy (ΔG_polar,solv) of a macromolecule is computed using molecular dynamics (MD) or Monte Carlo (MC) simulations to generate a conformational ensemble, and then a single/rigid-conformation solvation energy calculation is performed on each snapshot. The primary objective of this work is to demonstrate that a Poisson-Boltzmann (PB)-based approach using a Gaussian-based smooth dielectric function for macromolecular modeling previously developed by us (Li et al. J. Chem. Theory Comput. 2013, 9 (4), 2126-2136) can reproduce that ensemble average ΔG_polar,solv of a protein from a single structure. We show that the Gaussian-based dielectric model reproduces the ensemble average ⟨ΔG_polar,solv⟩ from an energy-minimized structure of a protein regardless of the minimization environment (structure minimized in vacuo, in implicit or explicit waters, or the crystal structure); the best case, however, is when it is paired with an in vacuo-minimized structure. In other minimization environments (implicit or explicit waters or the crystal structure), the traditional two-dielectric model can still be selected, with which the model produces correct solvation energies. Our observations from this work reflect how the ability to appropriately mimic the motion of residues, especially the salt bridge residues, influences a dielectric model's ability to reproduce the ensemble average value of polar solvation free energy from a single in vacuo-minimized structure.

  20. Parallel Finite Element Domain Decomposition for Structural/Acoustic Analysis

    NASA Technical Reports Server (NTRS)

    Nguyen, Duc T.; Tungkahotara, Siroj; Watson, Willie R.; Rajan, Subramaniam D.

    2005-01-01

    A domain decomposition (DD) formulation for solving sparse linear systems of equations resulting from finite element analysis is presented. The formulation incorporates mixed direct and iterative equation solving strategies and other novel algorithmic ideas that are optimized to take advantage of sparsity and exploit modern computer architecture, such as memory and parallel computing. The most time-consuming part of the formulation is identified, and the critical roles of direct sparse and iterative solvers within the framework of the formulation are discussed. Experiments on several computer platforms using several complex test matrices are conducted using software based on the formulation. Small-scale structural examples are used to validate the steps in the formulation, and large-scale (1,000,000+ unknowns) duct acoustic examples are used to evaluate performance on ORIGIN 2000 processors and a cluster of 6 PCs (running under the Windows environment). Statistics show that the formulation is efficient in both sequential and parallel computing environments and that the formulation is significantly faster and consumes less memory than one based on one of the best available commercial parallel sparse solvers.

  1. Problem Solving and Computational Skill: Are They Shared or Distinct Aspects of Mathematical Cognition?

    PubMed Central

    Fuchs, Lynn S.; Fuchs, Douglas; Hamlett, Carol L.; Lambert, Warren; Stuebing, Karla; Fletcher, Jack M.

    2009-01-01

    The purpose of this study was to explore patterns of difficulty in 2 domains of mathematical cognition: computation and problem solving. Third graders (n = 924; 47.3% male) were representatively sampled from 89 classrooms; assessed on computation and problem solving; classified as having difficulty with computation, problem solving, both domains, or neither domain; and measured on 9 cognitive dimensions. Difficulty occurred across domains with the same prevalence as difficulty with a single domain; specific difficulty was distributed similarly across domains. Multivariate profile analysis on cognitive dimensions and chi-square tests on demographics showed that specific computational difficulty was associated with strength in language and weaknesses in attentive behavior and processing speed; problem-solving difficulty was associated with deficient language as well as race and poverty. Implications for understanding mathematics competence and for the identification and treatment of mathematics difficulties are discussed. PMID:20057912

  2. Similarity network fusion for aggregating data types on a genomic scale.

    PubMed

    Wang, Bo; Mezlini, Aziz M; Demir, Feyyaz; Fiume, Marc; Tu, Zhuowen; Brudno, Michael; Haibe-Kains, Benjamin; Goldenberg, Anna

    2014-03-01

    Recent technologies have made it cost-effective to collect diverse types of genome-wide data. Computational methods are needed to combine these data to create a comprehensive view of a given disease or a biological process. Similarity network fusion (SNF) solves this problem by constructing networks of samples (e.g., patients) for each available data type and then efficiently fusing these into one network that represents the full spectrum of underlying data. For example, to create a comprehensive view of a disease given a cohort of patients, SNF computes and fuses patient similarity networks obtained from each of their data types separately, taking advantage of the complementarity in the data. We used SNF to combine mRNA expression, DNA methylation and microRNA (miRNA) expression data for five cancer data sets. SNF substantially outperforms single data type analysis and established integrative approaches when identifying cancer subtypes and is effective for predicting survival.
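
    The cross-diffusion idea behind SNF can be sketched for two data types. This is a simplified, dense-matrix version (the published algorithm also uses kNN-sparsified local kernels and similarity matrices derived from distances), and the toy two-cluster similarity matrices below are invented for illustration:

    ```python
    import numpy as np

    def row_normalize(W):
        """Turn a nonnegative similarity matrix into a row-stochastic one."""
        return W / W.sum(axis=1, keepdims=True)

    def snf_two_views(W1, W2, iters=20):
        """Simplified cross-diffusion in the spirit of SNF: each view's network
        is diffused through the other view's, then the two are averaged into a
        single fused network representing both data types."""
        P1, P2 = row_normalize(W1), row_normalize(W2)
        S1, S2 = P1.copy(), P2.copy()  # fixed kernels (no kNN sparsification here)
        for _ in range(iters):
            P1_new = S1 @ P2 @ S1.T   # view 1 updated through view 2
            P2_new = S2 @ P1 @ S2.T   # view 2 updated through view 1
            P1, P2 = row_normalize(P1_new), row_normalize(P2_new)
        return (P1 + P2) / 2

    # toy demo: two views of the same 6 samples, two clusters of 3
    blocks = np.kron(np.eye(2), np.ones((3, 3)))
    W1 = 0.9 * blocks + 0.1 * (1 - blocks)   # view 1: strong cluster signal
    W2 = 0.8 * blocks + 0.2 * (1 - blocks)   # view 2: weaker cluster signal
    fused = snf_two_views(W1, W2)
    ```

    In the fused network, within-cluster similarities (e.g. `fused[0, 1]`) remain larger than between-cluster ones (e.g. `fused[0, 3]`), which is what downstream subtype clustering exploits.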

  3. The intrinsic combinatorial organization and information theoretic content of a sequence are correlated to the DNA encoded nucleosome organization of eukaryotic genomes.

    PubMed

    Utro, Filippo; Di Benedetto, Valeria; Corona, Davide F V; Giancarlo, Raffaele

    2016-03-15

    Thanks to research spanning nearly 30 years, two major models have emerged that account for nucleosome organization in chromatin: statistical and sequence specific. The first is based on elegant, easy to compute, closed-form mathematical formulas that make no assumptions about the physical and chemical properties of the underlying DNA sequence; moreover, they need no training on the data for their computation. The latter is based on some sequence regularities but, as opposed to the statistical model, it lacks the same type of closed-form formulas that, in this case, should be based on the DNA sequence only. We contribute to closing this important methodological gap between the two models by providing three very simple formulas for the sequence-specific one. They are all based on well-known formulas in computer science and bioinformatics, and they give different quantifications of how complex a sequence is. In view of how remarkably well they perform, it is very surprising that measures of sequence complexity have not even been considered as candidates to close the mentioned gap. We provide experimental evidence that the intrinsic level of combinatorial organization and information-theoretic content of subsequences within a genome are strongly correlated to the level of DNA encoded nucleosome organization discovered by Kaplan et al. Our results establish an important connection between the intrinsic complexity of subsequences in a genome and the intrinsic, i.e. DNA encoded, nucleosome organization of eukaryotic genomes. It is a first step towards a mathematical characterization of this latter 'encoding'. Supplementary data are available at Bioinformatics online.

  4. Unveiling Stability Criteria of DNA-Carbon Nanotubes Constructs by Scanning Tunneling Microscopy and Computational Modeling

    DOE PAGES

    Kilina, Svetlana; Yarotski, Dzmitry A.; Talin, A. Alec; ...

    2011-01-01

    We present a combined approach that relies on computational simulations and scanning tunneling microscopy (STM) measurements to reveal morphological properties and stability criteria of carbon nanotube-DNA (CNT-DNA) constructs. Application of STM allows direct observation of very stable CNT-DNA hybrid structures with a well-defined DNA wrapping angle of 63.4° and a coiling period of 3.3 nm. Using force field simulations, we determine how the DNA-CNT binding energy depends on the sequence and binding geometry of a single-strand DNA. This dependence allows us to quantitatively characterize the stability of a hybrid structure with an optimal π-stacking between DNA nucleotides and the tube surface and better interpret STM data. Our simulations clearly demonstrate the existence of a very stable DNA binding geometry for (6,5) CNT, as evidenced by the presence of a well-defined minimum in the binding energy as a function of the angle between the DNA strand and the nanotube chiral vector. This novel approach demonstrates the feasibility of CNT-DNA geometry studies with subnanometer resolution and paves the way towards complete characterization of the structural and electronic properties of drug-delivering systems based on DNA-CNT hybrids as a function of DNA sequence and nanotube chirality.

  5. Improved teaching-learning-based and JAYA optimization algorithms for solving flexible flow shop scheduling problems

    NASA Astrophysics Data System (ADS)

    Buddala, Raviteja; Mahapatra, Siba Sankar

    2017-11-01

    Flexible flow shop (or hybrid flow shop) scheduling is an extension of the classical flow shop scheduling problem. In a simple flow shop configuration, a job having `g' operations is performed on `g' operation centres (stages), with each stage having only one machine. If any stage contains more than one machine providing an alternate processing facility, the problem becomes a flexible flow shop problem (FFSP). FFSP, which contains all the complexities involved in simple flow shop and parallel machine scheduling problems, is a well-known NP-hard (non-deterministic polynomial time) problem. Owing to the high computational complexity involved in solving these problems, it is not always possible to obtain an optimal solution in a reasonable computation time. To obtain near-optimal solutions in a reasonable computation time, a large variety of meta-heuristics have been proposed in the past. However, tuning algorithm-specific parameters for solving FFSP is rather tricky and time consuming. To address this limitation, teaching-learning-based optimization (TLBO) and the JAYA algorithm are chosen for the study, because they are not only recent meta-heuristics but also require no tuning of algorithm-specific parameters. Although these algorithms seem to be elegant, they lose solution diversity after a few iterations and get trapped at local optima. To alleviate this drawback, a new local search procedure is proposed in this paper to improve the solution quality. Further, a mutation strategy (inspired by genetic algorithms) is incorporated in the basic algorithm to maintain solution diversity in the population. Computational experiments have been conducted on standard benchmark problems to calculate makespan and computational time. It is found that the rate of convergence of TLBO is superior to JAYA. From the results, it is found that TLBO and JAYA outperform many algorithms reported in the literature and can be treated as efficient methods for solving the FFSP.
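
    Since TLBO's appeal is the absence of algorithm-specific parameters, a minimal sketch of the canonical teacher and learner phases may be useful. This is the basic continuous-variable TLBO, not the authors' improved variant with local search and mutation, and the sphere test function is an assumed example:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def sphere(x):
        """Assumed test function: global minimum 0 at the origin."""
        return float(np.sum(x ** 2))

    def tlbo(f, dim=5, pop=20, iters=100, lo=-5.0, hi=5.0):
        """Canonical TLBO: only population size and iteration count to set."""
        X = rng.uniform(lo, hi, size=(pop, dim))
        fit = np.array([f(x) for x in X])
        for _ in range(iters):
            # teacher phase: pull learners toward the best, away from the mean
            teacher = X[np.argmin(fit)]
            TF = rng.integers(1, 3)                  # teaching factor, 1 or 2
            r = rng.random((pop, dim))
            Xnew = np.clip(X + r * (teacher - TF * X.mean(axis=0)), lo, hi)
            fnew = np.array([f(x) for x in Xnew])
            improved = fnew < fit
            X[improved], fit[improved] = Xnew[improved], fnew[improved]
            # learner phase: each learner moves relative to a random peer
            for i in range(pop):
                j = rng.integers(pop)
                if j == i:
                    continue
                r = rng.random(dim)
                step = (X[i] - X[j]) if fit[i] < fit[j] else (X[j] - X[i])
                cand = np.clip(X[i] + r * step, lo, hi)
                fc = f(cand)
                if fc < fit[i]:
                    X[i], fit[i] = cand, fc
        return X[np.argmin(fit)], float(fit.min())

    best_x, best_f = tlbo(sphere)
    ```

    Greedy acceptance in both phases means the best fitness never worsens; on the sphere function this sketch converges rapidly toward zero.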

  6. Computation of the Genetic Code

    NASA Astrophysics Data System (ADS)

    Kozlov, Nicolay N.; Kozlova, Olga N.

    2018-03-01

    One of the problems in the development of a mathematical theory of the genetic code (a summary is presented in [1], the details in [2]) is the problem of calculating the genetic code. No similar problem was previously known, and it could be posed only in the 21st century. This work is devoted to one approach to solving it. For the first time, a detailed description of the method of calculating the genetic code is provided (the idea of which was first published in [3]), and the choice of one of the most important sets for the calculation was based on [4]. Such a set of amino acids corresponds to a complete set of representations of the plurality of overlapping triple genes belonging to the same DNA strand. A separate issue was the initial point that triggers the iterative search over all codes consistent with the initial data. Mathematical analysis has shown that the said set contains some ambiguities, which were resolved by means of our proposed compressed representation of the set. As a result, the developed method of calculation was limited to two main stages of research, where at the first stage only part of the set was used in the calculations. The proposed approach will significantly reduce the amount of computation at each step in this complex discrete structure.

  7. Calculating pH-dependent free energy of proteins by using Monte Carlo protonation probabilities of ionizable residues.

    PubMed

    Huang, Qiang; Herrmann, Andreas

    2012-03-01

    Protein folding, stability, and function are usually influenced by pH, and free energy plays a fundamental role in the analysis of such pH-dependent properties. An electrostatics-based theoretical framework, using a dielectric continuum solvent model and solving the Poisson-Boltzmann equation numerically, has been shown to be very successful in understanding pH-dependent properties. However, in this approach the exact computation of pH-dependent free energy becomes impractical for proteins possessing more than several tens of ionizable sites (e.g. > 30), because exact evaluation of the partition function requires a summation over a vast number of possible protonation microstates. Here we present a method which computes the free energy using the average energy and the protonation probabilities of ionizable sites obtained by the well-established Monte Carlo sampling procedure. The key feature is to calculate the entropy by using the protonation probabilities. We used this method to examine a well-studied protein (lysozyme) and produced results which agree very well with the exact calculations. Applications to the optimum pH of maximal stability of proteins and protein-DNA interactions have also resulted in good agreement with experimental data. These examples recommend our method for application to the elucidation of the pH-dependent properties of proteins.
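
    The entropy-from-probabilities idea can be sketched under an independent-sites assumption: each ionizable site contributes a two-state mixing entropy from its protonation probability, and the free energy combines the Monte Carlo average energy with this entropy term. This is a simplified sketch, not the paper's exact treatment (which handles correlated sites), and the unit choice for kB is an assumption:

    ```python
    import math

    KB = 0.0019872  # assumed: Boltzmann constant in kcal/(mol*K)

    def protonation_entropy(probs):
        """Mixing entropy from per-site protonation probabilities, assuming
        independent sites: S = -kB * sum_i [p ln p + (1-p) ln(1-p)]."""
        s = 0.0
        for p in probs:
            for q in (p, 1.0 - p):
                if q > 0.0:           # 0*ln(0) contributes nothing
                    s -= KB * q * math.log(q)
        return s

    def free_energy(avg_energy, probs, T=300.0):
        """F = <E> - T*S, with <E> taken from the Monte Carlo sampling."""
        return avg_energy - T * protonation_entropy(probs)
    ```

    A half-protonated site contributes the maximal kB ln 2, while a fully protonated or deprotonated site contributes nothing, so the entropy term only matters for sites with uncertain protonation states.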

  8. Fast multi-core based multimodal registration of 2D cross-sections and 3D datasets.

    PubMed

    Scharfe, Michael; Pielot, Rainer; Schreiber, Falk

    2010-01-11

    Solving bioinformatics tasks often requires extensive computational power. Recent trends in processor architecture combine multiple cores into a single chip to improve overall performance. The Cell Broadband Engine (CBE), a heterogeneous multi-core processor, provides power-efficient and cost-effective high-performance computing. One application area is image analysis and visualisation, in particular registration of 2D cross-sections into 3D image datasets. Such techniques can be used to put different image modalities into spatial correspondence, for example, 2D images of histological cuts into morphological 3D frameworks. We evaluate the CBE-driven PlayStation 3 as a high performance, cost-effective computing platform by adapting a multimodal alignment procedure to several characteristic hardware properties. The optimisations are based on partitioning, vectorisation, branch reducing and loop unrolling techniques with special attention to 32-bit multiplies and limited local storage on the computing units. We show how a typical image analysis and visualisation problem, the multimodal registration of 2D cross-sections and 3D datasets, benefits from the multi-core based implementation of the alignment algorithm. We discuss several CBE-based optimisation methods and compare our results to standard solutions. More information and the source code are available from http://cbe.ipk-gatersleben.de. The results demonstrate that the CBE processor in a PlayStation 3 accelerates computational intensive multimodal registration, which is of great importance in biological/medical image processing. The PlayStation 3 as a low cost CBE-based platform offers an efficient option to conventional hardware to solve computational problems in image processing and bioinformatics.

  9. A finite element solver for 3-D compressible viscous flows

    NASA Technical Reports Server (NTRS)

    Reddy, K. C.; Reddy, J. N.; Nayani, S.

    1990-01-01

    Computation of the flow field inside a space shuttle main engine (SSME) requires the application of state-of-the-art computational fluid dynamics (CFD) technology. Several computer codes are under development to solve 3-D flow through the hot gas manifold. Some algorithms were designed to solve the unsteady compressible Navier-Stokes equations, either by implicit or explicit factorization methods, using several hundred or thousands of time steps to reach a steady state solution. A new iterative algorithm is being developed for the solution of the implicit finite element equations without assembling global matrices. It is an efficient iteration scheme based on a modified nonlinear Gauss-Seidel iteration with symmetric sweeps. The algorithm is analyzed for a model equation and is shown to be unconditionally stable. Results from a series of test problems are presented. The finite element code was tested for Couette flow, which is flow under a pressure gradient between two parallel plates in relative motion. Another problem that was solved is viscous laminar flow over a flat plate. The general 3-D finite element code was used to compute the flow in an axisymmetric turnaround duct at low Mach numbers.
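
    The linear core of a Gauss-Seidel iteration with symmetric sweeps (one forward pass over the unknowns followed by one backward pass) can be sketched as follows. The small SPD system is an invented example, not the SSME flow equations, and the sketch omits the nonlinear modification described in the abstract:

    ```python
    import numpy as np

    def sym_gauss_seidel(A, b, x0=None, sweeps=50):
        """Gauss-Seidel with symmetric sweeps: each outer sweep updates the
        unknowns in forward order, then again in backward order, always using
        the most recently computed values."""
        n = len(b)
        x = np.zeros(n) if x0 is None else x0.astype(float).copy()
        for _ in range(sweeps):
            for order in (range(n), reversed(range(n))):
                for i in order:
                    sigma = A[i] @ x - A[i, i] * x[i]   # off-diagonal part
                    x[i] = (b[i] - sigma) / A[i, i]
        return x

    # assumed demo: a small diagonally dominant SPD system
    A = np.array([[4.0, 1.0, 0.0],
                  [1.0, 4.0, 1.0],
                  [0.0, 1.0, 4.0]])
    b = np.array([1.0, 2.0, 3.0])
    x = sym_gauss_seidel(A, b)
    ```

    Note that only matrix rows `A[i]` are touched inside the loop, which mirrors the abstract's point that the iteration needs no assembled global matrix: in a finite element code the row products can be accumulated element by element.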

  10. Electron Nuclear Dynamics Simulations of Proton Cancer Therapy Reactions: Water Radiolysis and Proton- and Electron-Induced DNA Damage in Computational Prototypes.

    PubMed

    Teixeira, Erico S; Uppulury, Karthik; Privett, Austin J; Stopera, Christopher; McLaurin, Patrick M; Morales, Jorge A

    2018-05-06

    Proton cancer therapy (PCT) utilizes high-energy proton projectiles to obliterate cancerous tumors with low damage to healthy tissues and without the side effects of X-ray therapy. The healing action of the protons results from the damage they cause to cancerous cell DNA. Despite established clinical use, the chemical mechanisms of PCT reactions at the molecular level remain elusive. This situation prevents a rational design of PCT that can maximize its therapeutic power and minimize its side effects. The incomplete characterization of PCT reactions is partially due to the health risks associated with experimental/clinical techniques applied to human subjects. To overcome this situation, we are conducting time-dependent and non-adiabatic computer simulations of PCT reactions with the electron nuclear dynamics (END) method. Herein, we present a review of our previous and new END research on three fundamental types of PCT reactions: water radiolysis reactions, proton-induced DNA damage and electron-induced DNA damage. These studies are performed on the computational prototypes: proton + H₂O clusters, proton + DNA/RNA bases, proton + cytosine nucleotide, and electron + cytosine nucleotide + H₂O. These simulations provide chemical mechanisms and dynamical properties of the selected PCT reactions in comparison with available experimental and alternative computational results.

  11. Accurate D-bar Reconstructions of Conductivity Images Based on a Method of Moment with Sinc Basis.

    PubMed

    Abbasi, Mahdi

    2014-01-01

    The planar D-bar integral equation is one of the inverse-scattering solution methods for complex problems, including the inverse conductivity problem considered in applications such as electrical impedance tomography (EIT). Recently, two different methodologies have been considered for the numerical solution of the D-bar integral equation, namely product integrals and multigrid. The first involves a high computational burden, and the second suffers from a low convergence rate (CR). In this paper, a novel high-speed method of moments using the sinc basis is introduced to solve the two-dimensional D-bar integral equation. In this method, all functions within the D-bar integral equation are first expanded using the sinc basis functions. Then, the orthogonal properties of their products dissolve the integral operator of the D-bar equation and yield a discrete convolution equation. That is, the new moment method leads to the equation's solution without direct computation of the D-bar integral. The resulting discrete convolution equation may be adapted to a suitable structure to be solved using the fast Fourier transform. This allows us to reduce the order of computational complexity to as low as O(N² log N). Simulation results on solving D-bar equations arising in the EIT problem show that the proposed method is accurate with an ultra-linear CR.
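    The central trick in this abstract, turning an integral operator into a discrete convolution that the FFT evaluates in O(N log N) per transform, rests on the convolution theorem. Below is a hedged sketch of that generic identity, not the D-bar solver itself; the function names and test data are assumptions.

    ```python
    import numpy as np

    def fft_circular_convolution(f, g):
        """Circular convolution via the convolution theorem: one multiply in
        Fourier space replaces an O(N^2) direct sum."""
        return np.real(np.fft.ifft(np.fft.fft(f) * np.fft.fft(g)))

    def direct_circular_convolution(f, g):
        """Reference O(N^2) direct evaluation, for comparison."""
        n = len(f)
        return np.array([sum(f[j] * g[(i - j) % n] for j in range(n))
                         for i in range(n)])

    rng = np.random.default_rng(0)
    f = rng.standard_normal(16)
    g = rng.standard_normal(16)
    fast = fft_circular_convolution(f, g)
    slow = direct_circular_convolution(f, g)
    ```

    In the 2-D D-bar setting the same idea applies per grid dimension, which is how the overall complexity drops to O(N² log N).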

  12. Didactical determinants of the use of information and communication technology in the process of training future specialists.

    PubMed

    Palamar, Borys I; Vaskivska, Halyna O; Palamar, Svitlana P

    In the article the authors touch upon the significance of computer equipment for organizing cooperation between professors and future specialists. Such subject-subject interaction may be directed toward forming the professional skills of future specialists. By using information and communication technologies in the education system, a range of didactic tasks can be solved: improving the teaching of subjects in higher education, supporting the self-learning of future specialists, motivating learning and self-learning, and developing reflection in the learning process. The authors consider computer equipment an instrument for developing the intellectual skills, potential, and willingness of future specialists to solve communicative and communication tasks and problems on a creative basis. Based on the results of their research, the authors come to certain conclusions about the effectiveness of using computer technologies in teaching future specialists and in their self-learning. Inadequate provision of computer equipment in higher education institutions, a lack of appropriate educational programs, and professors' poor knowledge and use of computers have a negative impact on the organization of teaching in higher education. Computer equipment and ICT in general are instruments for developing the intellectual skills, potential, and willingness of future specialists to solve communicative and communication tasks and problems. Thus, the formation of a psychosocial environment for the development of future specialists is a multifaceted, complex, and didactically important issue.

  13. Big data mining analysis method based on cloud computing

    NASA Astrophysics Data System (ADS)

    Cai, Qing Qiu; Cui, Hong Gang; Tang, Hao

    2017-08-01

    In the era of information explosion, the super-large scale, discreteness, and unstructured or semi-structured nature of big data have gone far beyond what traditional data management methods can handle. With the arrival of the cloud computing era, cloud computing provides a new technical approach to massive data mining, which can effectively solve the problem that traditional data mining methods cannot adapt to massive data. This paper introduces the meaning and characteristics of cloud computing, analyzes the advantages of using cloud computing technology to realize data mining, designs an association rule mining algorithm based on the MapReduce parallel processing architecture, and carries out experimental verification. The parallel association rule mining algorithm based on a cloud computing platform can greatly improve the execution speed of data mining.
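    The map and reduce phases of association-rule support counting described above can be sketched in plain Python. This is an illustrative single-machine emulation of the MapReduce pattern, not a cluster deployment; the transaction data, function names, and support threshold are assumptions.

    ```python
    from collections import Counter
    from itertools import combinations

    def map_phase(transaction, k=2):
        """Map: emit (itemset, 1) for every k-item subset of a transaction."""
        return [(frozenset(c), 1) for c in combinations(sorted(transaction), k)]

    def reduce_phase(mapped, min_support):
        """Reduce: sum the counts per itemset, keep those meeting min_support."""
        counts = Counter()
        for itemset, one in mapped:
            counts[itemset] += one
        return {s: c for s, c in counts.items() if c >= min_support}

    transactions = [
        {"bread", "milk"},
        {"bread", "milk", "eggs"},
        {"milk", "eggs"},
        {"bread", "milk"},
    ]
    # In real MapReduce the map calls run in parallel across workers and the
    # framework groups emitted pairs by key before the reduce calls.
    mapped = [pair for t in transactions for pair in map_phase(t)]
    frequent = reduce_phase(mapped, min_support=3)
    ```

    The speedup in the paper comes from distributing the map calls over many nodes; the per-record logic is exactly this simple.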

  14. Key Technology Research on Open Architecture for The Sharing of Heterogeneous Geographic Analysis Models

    NASA Astrophysics Data System (ADS)

    Yue, S. S.; Wen, Y. N.; Lv, G. N.; Hu, D.

    2013-10-01

    In recent years, the increasing development of cloud computing technologies has laid a critical foundation for efficiently solving complicated geographic issues. However, it is still difficult to realize the cooperative operation of massive heterogeneous geographical models. Traditional cloud architecture is apt to provide a centralized solution to end users, while all the required resources are often offered by large enterprises or special agencies; thus, it is a closed framework from the perspective of resource utilization. Solving comprehensive geographic issues requires integrating multifarious heterogeneous geographical models and data. In this case, an open computing platform is needed, with which model owners can conveniently package and deploy their models into the cloud, while model users can search, access, and utilize those models with cloud facilities. Based on this concept, open cloud service strategies for the sharing of heterogeneous geographic analysis models are studied in this article. The key technologies (a unified cloud interface strategy, a sharing platform based on cloud services, and a computing platform based on cloud services) are discussed in detail, and related experiments are conducted for further verification.

  15. Predicting protein structures with a multiplayer online game.

    PubMed

    Cooper, Seth; Khatib, Firas; Treuille, Adrien; Barbero, Janos; Lee, Jeehyung; Beenen, Michael; Leaver-Fay, Andrew; Baker, David; Popović, Zoran; Players, Foldit

    2010-08-05

    People exert large amounts of problem-solving effort playing computer games. Simple image- and text-recognition tasks have been successfully 'crowd-sourced' through games, but it is not clear if more complex scientific problems can be solved with human-directed computing. Protein structure prediction is one such problem: locating the biologically relevant native conformation of a protein is a formidable computational challenge given the very large size of the search space. Here we describe Foldit, a multiplayer online game that engages non-scientists in solving hard prediction problems. Foldit players interact with protein structures using direct manipulation tools and user-friendly versions of algorithms from the Rosetta structure prediction methodology, while they compete and collaborate to optimize the computed energy. We show that top-ranked Foldit players excel at solving challenging structure refinement problems in which substantial backbone rearrangements are necessary to achieve the burial of hydrophobic residues. Players working collaboratively develop a rich assortment of new strategies and algorithms; unlike computational approaches, they explore not only the conformational space but also the space of possible search strategies. The integration of human visual problem-solving and strategy development capabilities with traditional computational algorithms through interactive multiplayer games is a powerful new approach to solving computationally-limited scientific problems.

  16. Emerging critical roles of Fe-S clusters in DNA replication and repair

    PubMed Central

    Fuss, Jill O.; Tsai, Chi-Lin; Ishida, Justin P.; Tainer, John A.

    2015-01-01

    Fe-S clusters are partners in the origin of life that predate cells, acetyl-CoA metabolism, DNA, and the RNA world. The double helix solved the mystery of DNA replication by base pairing for accurate copying. Yet, for genome stability necessary to life, the double helix has equally important implications for damage repair. Here we examine striking advances that uncover Fe-S cluster roles both in copying the genetic sequence by DNA polymerases and in crucial repair processes for genome maintenance, as mutational defects cause cancer and degenerative disease. Moreover, we examine an exciting, controversial role for Fe-S clusters in a third element required for life – the long-range coordination and regulation of replication and repair events. By their ability to delocalize electrons over both Fe and S centers, Fe-S clusters have unbeatable features for protein conformational control and charge transfer via double-stranded DNA that may fundamentally transform our understanding of life, replication, and repair. PMID:25655665

  17. Making DNA Fingerprints.

    ERIC Educational Resources Information Center

    Nunley, Kathie F.

    1996-01-01

    Presents an activity to simulate electrophoresis using everyday items. Uses adding machine paper to construct a set of DNA fingerprints that can be used to solve crime cases designed by students in any biology class. (JRH)

  18. Energy barriers and rates of tautomeric transitions in DNA bases: ab initio quantum chemical study.

    PubMed

    Basu, Soumalee; Majumdar, Rabi; Das, Gourab K; Bhattacharyya, Dhananjay

    2005-12-01

    Tautomeric transitions of DNA bases are proton transfer reactions, which are important in biology; these reactions are involved in spontaneous point mutations of the genetic material. In the present study, intrinsic reaction coordinate (IRC) analyses through ab initio quantum chemical calculations have been carried out for the individual DNA bases A, T, G, and C, and also for the A:T and G:C base pairs, to estimate the kinetic and thermodynamic barriers of tautomeric transitions using the MP2/6-31G** method. Relatively high kinetic barriers (about 50-60 kcal/mol) have been observed for the single bases, indicating that tautomeric alterations of isolated single bases are quite unlikely. On the other hand, relatively low kinetic barriers (about 20-25 kcal/mol) for the DNA base pairs A:T and G:C clearly suggest that tautomeric shifts are much more favorable in DNA base pairs than in isolated single bases. The unusual base pairing A':C, T':G, C':A or G':T in the daughter DNA molecule, resulting from a parent DNA molecule with tautomeric shifts, is found to be stable enough to result in a mutation. The transition rate constants for the single DNA bases, as well as for the base pairs, are also calculated by computing the free energy differences between the transition states and the reactants.
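    The final step, converting a free energy difference between transition state and reactant into a rate constant, follows transition state theory (the Eyring equation). The sketch below is an illustration using the barrier ranges quoted in the abstract; the exact temperature and barrier values used in the paper are not reproduced here.

    ```python
    import math

    K_B = 1.380649e-23    # Boltzmann constant, J/K
    H = 6.62607015e-34    # Planck constant, J*s
    R = 1.987204e-3       # gas constant, kcal/(mol*K)

    def eyring_rate(delta_g_kcal, temperature=298.15):
        """Transition-state-theory rate constant (1/s) from a free energy
        barrier in kcal/mol: k = (k_B T / h) * exp(-dG / RT)."""
        return (K_B * temperature / H) * math.exp(
            -delta_g_kcal / (R * temperature))

    # Representative barriers: ~20-25 kcal/mol for base pairs versus
    # ~50-60 kcal/mol for isolated single bases
    k_pair = eyring_rate(22.0)
    k_single = eyring_rate(55.0)
    ```

    Because the barrier enters exponentially, the roughly 30 kcal/mol gap between the two cases translates into a rate difference of more than twenty orders of magnitude, which is why tautomeric shifts in isolated bases are deemed so unlikely.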

  19. PowerPlay: Training an Increasingly General Problem Solver by Continually Searching for the Simplest Still Unsolvable Problem

    PubMed Central

    Schmidhuber, Jürgen

    2013-01-01

    Most of computer science focuses on automatically solving given computational problems. I focus on automatically inventing or discovering problems in a way inspired by the playful behavior of animals and humans, to train a more and more general problem solver from scratch in an unsupervised fashion. Consider the infinite set of all computable descriptions of tasks with possibly computable solutions. Given a general problem-solving architecture, at any given time, the novel algorithmic framework PowerPlay (Schmidhuber, 2011) searches the space of possible pairs of new tasks and modifications of the current problem solver, until it finds a more powerful problem solver that provably solves all previously learned tasks plus the new one, while the unmodified predecessor does not. Newly invented tasks may require achieving a wow-effect by making previously learned skills more efficient such that they require less time and space. New skills may (partially) re-use previously learned skills. The greedy search of typical PowerPlay variants uses time-optimal program search to order candidate pairs of tasks and solver modifications by their conditional computational (time and space) complexity, given the stored experience so far. The new task and its corresponding task-solving skill are those first found and validated. This biases the search toward pairs that can be described compactly and validated quickly. The computational costs of validating new tasks need not grow with task repertoire size. Standard problem solver architectures of personal computers or neural networks tend to generalize by solving numerous tasks outside the self-invented training set; PowerPlay’s ongoing search for novelty keeps breaking the generalization abilities of its present solver. This is related to Gödel’s sequence of increasingly powerful formal theories based on adding formerly unprovable statements to the axioms without affecting previously provable theorems. 
The continually increasing repertoire of problem-solving procedures can be exploited by a parallel search for solutions to additional externally posed tasks. PowerPlay may be viewed as a greedy but practical implementation of basic principles of creativity (Schmidhuber, 2006a, 2010). A first experimental analysis can be found in separate papers (Srivastava et al., 2012a,b, 2013). PMID:23761771

  20. The effect of introducing computers into an introductory physics problem-solving laboratory

    NASA Astrophysics Data System (ADS)

    McCullough, Laura Ellen

    2000-10-01

    Computers are appearing in every type of classroom across the country. Yet they often appear without benefit of studying their effects. The research that is available on computer use in classrooms has found mixed results, and often ignores the theoretical and instructional contexts of the computer in the classroom. The University of Minnesota's physics department employs a cooperative-group problem solving pedagogy, based on a cognitive apprenticeship instructional model, in its calculus-based introductory physics course. This study was designed to determine possible negative effects of introducing a computerized data-acquisition and analysis tool into this pedagogy as a problem-solving tool for students to use in laboratory. To determine the effects of the computer tool, two quasi-experimental treatment groups were selected. The computer-tool group (N = 170) used a tool, designed for this study (VideoTool), to collect and analyze motion data in the laboratory. The control group (N = 170) used traditional non-computer equipment (spark tapes and Polaroid(TM) film). The curriculum was kept as similar as possible for the two groups. During the ten week academic quarter, groups were examined for effects on performance on conceptual tests and grades, attitudes towards the laboratory and the laboratory tools, and behaviors within cooperative groups. Possible interactions with gender were also examined. Few differences were found between the control and computer-tool groups. The control group received slightly higher scores on one conceptual test, but this difference was not educationally significant. The computer-tool group had slightly more positive attitudes towards using the computer tool than their counterparts had towards the traditional tools. The computer-tool group also perceived that they spoke more frequently about physics misunderstandings, while the control group felt that they discussed equipment difficulties more often. 
This perceptual difference interacted with gender, with the men in the control group more likely to discuss equipment difficulties than any other group. Overall, the differences between the control and quasi-experimental groups were minimal. It was concluded that carefully replacing traditional data collection and analysis tools with a computer tool had no negative effects on achievement, attitude, or group behavior, and did not interact with gender.

  1. Public Perceptions and Expectations of the Forensic Use of DNA: Results of a Preliminary Study

    ERIC Educational Resources Information Center

    Curtis, Cate

    2009-01-01

    The forensic use of Deoxyribonucleic Acid (DNA) is demonstrating significant success as a crime-solving tool. However, numerous concerns have been raised regarding the potential for DNA use to contravene cultural, ethical, and legal codes. In this article the expectations and level of knowledge of the New Zealand public of the DNA data-bank and…

  2. Normal-Mode Analysis of Circular DNA at the Base-Pair Level. 2. Large-Scale Configurational Transformation of a Naturally Curved Molecule.

    PubMed

    Matsumoto, Atsushi; Tobias, Irwin; Olson, Wilma K

    2005-01-01

    Fine structural and energetic details embedded in the DNA base sequence, such as intrinsic curvature, are important to the packaging and processing of the genetic material. Here we investigate the internal dynamics of a 200 bp closed circular molecule with natural curvature using a newly developed normal-mode treatment of DNA in terms of neighboring base-pair "step" parameters. The intrinsic curvature of the DNA is described by a 10 bp repeating pattern of bending distortions at successive base-pair steps. We vary the degree of intrinsic curvature and the superhelical stress on the molecule and consider the normal-mode fluctuations of both the circle and the stable figure-8 configuration under conditions where the energies of the two states are similar. To extract the properties due solely to curvature, we ignore other important features of the double helix, such as the extensibility of the chain, the anisotropy of local bending, and the coupling of step parameters. We compare the computed normal modes of the curved DNA model with the corresponding dynamical features of a covalently closed duplex of the same chain length constructed from naturally straight DNA and with the theoretically predicted dynamical properties of a naturally circular, inextensible elastic rod, i.e., an O-ring. The cyclic molecules with intrinsic curvature are found to be more deformable under superhelical stress than rings formed from naturally straight DNA. As superhelical stress is accumulated in the DNA, the frequency, i.e., energy, of the dominant bending mode decreases in value, and if the imposed stress is sufficiently large, a global configurational rearrangement of the circle to the figure-8 form takes place. We combine energy minimization with normal-mode calculations of the two states to decipher the configurational pathway between the two states. 
We also describe and make use of a general analytical treatment of the thermal fluctuations of an elastic rod to characterize the motions of the minicircle as a whole from knowledge of the full set of normal modes. The remarkable agreement between computed and theoretically predicted values of the average deviation and dispersion of the writhe of the circular configuration adds to the reliability in the computational approach. Application of the new formalism to the computed modes of the figure-8 provides insights into macromolecular motions which are beyond the scope of current theoretical treatments.

  3. Educating Laboratory Science Learners at a Distance Using Interactive Television

    ERIC Educational Resources Information Center

    Reddy, Christopher

    2014-01-01

    Laboratory science classes offered to students learning at a distance require a methodology that allows for the completion of tactile activities. Literature describes three different methods of solving the distance laboratory dilemma: kit-based laboratory experience, computer-based laboratory experience, and campus-based laboratory experience,…

  4. Multiscale QM/MM molecular dynamics study on the first steps of guanine damage by free hydroxyl radicals in solution.

    PubMed

    Abolfath, Ramin M; Biswas, P K; Rajnarayanam, R; Brabec, Thomas; Kodym, Reinhard; Papiez, Lech

    2012-04-19

    Understanding the damage of DNA bases from hydrogen abstraction by free OH radicals is of particular importance to understanding the indirect effect of ionizing radiation. Previous studies address the problem with truncated DNA bases as ab initio quantum simulations required to study such electronic-spin-dependent processes are computationally expensive. Here, for the first time, we employ a multiscale and hybrid quantum mechanical-molecular mechanical simulation to study the interaction of OH radicals with a guanine-deoxyribose-phosphate DNA molecular unit in the presence of water, where all of the water molecules and the deoxyribose-phosphate fragment are treated with the simplistic classical molecular mechanical scheme. Our result illustrates that the presence of water strongly alters the hydrogen-abstraction reaction as the hydrogen bonding of OH radicals with water restricts the relative orientation of the OH radicals with respect to the DNA base (here, guanine). This results in an angular anisotropy in the chemical pathway and a lower efficiency in the hydrogen-abstraction mechanisms than previously anticipated for identical systems in vacuum. The method can easily be extended to single- and double-stranded DNA without any appreciable computational cost as these molecular units can be treated in the classical subsystem, as has been demonstrated here. © 2012 American Chemical Society

  5. A novel model for DNA sequence similarity analysis based on graph theory.

    PubMed

    Qi, Xingqin; Wu, Qin; Zhang, Yusen; Fuller, Eddie; Zhang, Cun-Quan

    2011-01-01

    Determination of sequence similarity is one of the major steps in computational phylogenetic studies. As we know, during evolutionary history, not only DNA mutations of individual nucleotides but also subsequent rearrangements occurred. It has been one of the major tasks of computational biologists to develop novel mathematical descriptors for similarity analysis such that information about various mutation phenomena is incorporated simultaneously. In this paper, differing from traditional methods (e.g., nucleotide frequency, geometric representations) as bases for constructing mathematical descriptors, we construct novel mathematical descriptors based on graph theory. In particular, for each DNA sequence we set up a weighted directed graph. The adjacency matrix of the directed graph is then used to induce a representative vector for the DNA sequence. This new approach measures similarity based on both the ordering and the frequency of nucleotides, so that much more information is involved. As an application, the method is tested on a set of 0.9-kb mtDNA sequences from twelve different primate species. All output phylogenetic trees with various distance estimations have the same topology and are generally consistent with reported results from earlier studies, which demonstrates the new method's efficiency; we also test the new method on a simulated data set, which shows that it performs better than the traditional global alignment method when subsequent rearrangements happen frequently during evolutionary history.
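    A much-simplified variant of the pipeline described above can be sketched as follows: build a weighted digraph over {A, C, G, T}, flatten its adjacency matrix into a representative vector, and compare sequences by vector distance. The weighting scheme here (normalised adjacent-transition counts) is an illustrative assumption; the paper's actual construction encodes richer ordering information.

    ```python
    import numpy as np

    BASES = "ACGT"
    IDX = {b: i for i, b in enumerate(BASES)}

    def descriptor(seq):
        """Representative vector from the adjacency matrix of a weighted
        digraph over {A, C, G, T}: edge (u, v) is weighted by the count of
        adjacent occurrences u -> v, normalised by sequence length, so both
        nucleotide ordering and frequency enter the descriptor."""
        m = np.zeros((4, 4))
        for u, v in zip(seq, seq[1:]):
            m[IDX[u], IDX[v]] += 1
        return m.flatten() / max(len(seq) - 1, 1)

    def distance(s1, s2):
        """Euclidean distance between descriptors; smaller means more similar."""
        return float(np.linalg.norm(descriptor(s1) - descriptor(s2)))

    a = "ATGCGATACGCTTGA"
    b = "ATGCGATACGCTTGC"   # near-identical variant of a
    c = "GGGGGGGGCCCCCCCC"  # very different composition and ordering
    d_close = distance(a, b)
    d_far = distance(a, c)
    ```

    A distance matrix built this way can be fed directly into standard tree-building methods such as neighbor joining.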

  6. LASL/USDA computer applications annual progress report, October 1, 1978-September 30, 1979. [Data Base Management activities regarding agricultural problems in southwestern USA

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sanders, W.M.; Campbell, C.L.; Pickerill, P.A.

    1980-10-01

    The Los Alamos Scientific Laboratory is funded by the US Department of Agriculture to apply scientific and computer technology to solve agricultural problems. This report summarizes work during the period October 1, 1978 through September 30, 1979 on the application of computer technology to four areas: (1) Texas brucellosis calfhood-vaccination studies, (2) brucellosis data-entry system in New Mexico, (3) Idaho adult vaccination data base, and (4) surveillance of slaughterplants in Texas.

  7. Recent progress of quantum annealing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Suzuki, Sei

    2015-03-10

    We review the recent progress of quantum annealing. Quantum annealing was proposed as a method to solve generic optimization problems. Recently a Canadian company has drawn a great deal of attention, as it has commercialized a quantum computer based on quantum annealing. Although the performance of quantum annealing is not sufficiently understood, it is likely that quantum annealing will be a practical method both on a conventional computer and on a quantum computer.

  8. The Effect of Scratch- and Lego Mindstorms Ev3-Based Programming Activities on Academic Achievement, Problem-Solving Skills and Logical-Mathematical Thinking Skills of Students

    ERIC Educational Resources Information Center

    Korkmaz, Özgen

    2016-01-01

    The aim of this study was to investigate the effect of the Scratch and Lego Mindstorms Ev3 programming activities on academic achievement with respect to computer programming, and on the problem-solving and logical-mathematical thinking skills of students. This study was a semi-experimental, pretest-posttest study with two experimental groups and…

  9. Viewing Human DNA Polymerase β Faithfully and Unfaithfully Bypass an Oxidative Lesion by Time-Dependent Crystallography

    DOE PAGES

    Vyas, Rajan; Reed, Andrew J.; Tokarsky, E. John; ...

    2015-03-31

    One common oxidative DNA lesion, 8-oxo-7,8-dihydro-2'-deoxyguanine (8-oxoG), is highly mutagenic in vivo due to its anti-conformation forming a Watson–Crick base pair with correct deoxycytidine 5'-triphosphate (dCTP) and its syn-conformation forming a Hoogsteen base pair with incorrect deoxyadenosine 5'-triphosphate (dATP). Here, we utilized time-resolved X-ray crystallography to follow 8-oxoG bypass by human DNA polymerase β (hPolβ). In the 12 solved structures, both Watson–Crick (anti-8-oxoG:anti-dCTP) and Hoogsteen (syn-8-oxoG:anti-dATP) base pairing were clearly visible and were maintained throughout the chemical reaction. Additionally, a third Mg2+ appeared during the process of phosphodiester bond formation and was located between the reacting α- and β-phosphates of the dNTP, suggesting its role in stabilizing reaction intermediates. After phosphodiester bond formation, hPolβ reopened its conformation, pyrophosphate was released, and the newly incorporated primer 3'-terminal nucleotide stacked, rather than base paired, with 8-oxoG. These structures provide the first real-time pictures, to our knowledge, of how a polymerase correctly and incorrectly bypasses a DNA lesion.

  10. Viewing Human DNA Polymerase β Faithfully and Unfaithfully Bypass an Oxidative Lesion by Time-Dependent Crystallography

    PubMed Central

    Vyas, Rajan; Reed, Andrew J.; Tokarsky, E. John; Suo, Zucai

    2015-01-01

    One common oxidative DNA lesion, 8-oxo-7,8-dihydro-2′-deoxyguanine (8-oxoG), is highly mutagenic in vivo due to its anti-conformation forming a Watson–Crick base pair with correct deoxycytidine 5′-triphosphate (dCTP) and its syn-conformation forming a Hoogsteen base pair with incorrect deoxyadenosine 5′-triphosphate (dATP). Here, we utilized time-resolved X-ray crystallography to follow 8-oxoG bypass by human DNA polymerase β (hPolβ). In the 12 solved structures, both Watson–Crick (anti-8-oxoG:anti-dCTP) and Hoogsteen (syn-8-oxoG:anti-dATP) base pairing were clearly visible and were maintained throughout the chemical reaction. Additionally, a third Mg2+ appeared during the process of phosphodiester bond formation and was located between the reacting α- and β-phosphates of the dNTP, suggesting its role in stabilizing reaction intermediates. After phosphodiester bond formation, hPolβ reopened its conformation, pyrophosphate was released, and the newly incorporated primer 3′-terminal nucleotide stacked, rather than base paired, with 8-oxoG. These structures provide the first real-time pictures, to our knowledge, of how a polymerase correctly and incorrectly bypasses a DNA lesion. PMID:25825995

  11. RDNAnalyzer: A tool for DNA secondary structure prediction and sequence analysis

    PubMed Central

    Afzal, Muhammad; Shahid, Ahmad Ali; Shehzadi, Abida; Nadeem, Shahid; Husnain, Tayyab

    2012-01-01

    RDNAnalyzer is an innovative computer-based tool designed for DNA secondary structure prediction and sequence analysis. It can randomly generate a DNA sequence, or users can upload sequences of their own interest in RAW format. It uses and extends the Nussinov dynamic programming algorithm and has various applications for sequence analysis. It predicts DNA secondary structure and base pairings. It also provides tools for sequence analyses routinely performed by biological scientists, such as DNA replication, reverse complement generation, transcription, translation, sequence-specific information (total number of nucleotide bases and ATGC base contents along with their respective percentages), and a sequence cleaner. RDNAnalyzer is a unique tool developed in Microsoft Visual Studio 2008 using Microsoft Visual C# and Windows Presentation Foundation and provides a user-friendly environment for sequence analysis. It is freely available. Availability: http://www.cemb.edu.pk/sw.html Abbreviations: RDNAnalyzer - Random DNA Analyser; GUI - graphical user interface; XAML - Extensible Application Markup Language. PMID:23055611
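    The Nussinov dynamic programming algorithm that RDNAnalyzer builds on maximises the number of non-crossing complementary base pairs in a sequence. A minimal textbook version (not RDNAnalyzer's extended implementation) looks like this:

    ```python
    def can_pair(x, y):
        """Watson-Crick complementarity for DNA: A-T and G-C."""
        return {x, y} in ({"A", "T"}, {"G", "C"})

    def nussinov_max_pairs(seq, min_loop=1):
        """Maximum number of non-crossing base pairs, requiring at least
        min_loop unpaired bases inside any hairpin loop."""
        n = len(seq)
        dp = [[0] * n for _ in range(n)]
        for span in range(min_loop + 1, n):
            for i in range(n - span):
                j = i + span
                best = dp[i][j - 1]                  # case 1: j left unpaired
                for k in range(i, j - min_loop):     # case 2: j pairs with k
                    if can_pair(seq[k], seq[j]):
                        left = dp[i][k - 1] if k > i else 0
                        best = max(best, left + 1 + dp[k + 1][j - 1])
                dp[i][j] = best
        return dp[0][n - 1] if n else 0
    ```

    For example, a stem-loop such as GGGAAACCC yields three G-C pairs around an unpaired AAA loop. Traceback through the same table recovers the pairing itself, which is how a predicted secondary structure is reported.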

  12. High-speed DNA-based rolling motors powered by RNase H

    PubMed Central

    Yehl, Kevin; Mugler, Andrew; Vivek, Skanda; Liu, Yang; Zhang, Yun; Fan, Mengzhen; Weeks, Eric R.

    2016-01-01

    DNA-based machines that walk by converting chemical energy into controlled motion could be of use in applications such as next-generation sensors, drug delivery platforms, and biological computing. Despite their exquisite programmability, DNA-based walkers are, however, challenging to work with due to their low fidelity and slow rates (~1 nm/min). Here, we report DNA-based machines that roll rather than walk, and consequently have a maximum speed and processivity three orders of magnitude greater than conventional DNA motors. The motors are made from DNA-coated spherical particles that hybridise to a surface modified with complementary RNA; motion is achieved through the addition of RNase H, which selectively hydrolyses hybridised RNA. Spherical motors move in a self-avoiding manner, whereas anisotropic particles, such as dimerised or rod-shaped particles, travel linearly without a track or external force. Finally, we demonstrate detection of a single-nucleotide polymorphism by measuring particle displacement using a smartphone camera. PMID:26619152

  13. Solving Fractional Programming Problems based on Swarm Intelligence

    NASA Astrophysics Data System (ADS)

    Raouf, Osama Abdel; Hezam, Ibrahim M.

    2014-04-01

    This paper presents a new approach to solving Fractional Programming Problems (FPPs) based on two different Swarm Intelligence (SI) algorithms: Particle Swarm Optimization and the Firefly Algorithm. The two algorithms are tested using several FPP benchmark examples and two selected industrial applications. The test aims to prove the capability of the SI algorithms to solve any type of FPP. The solution results employing the SI algorithms are compared with a number of exact and metaheuristic solution methods used for handling FPPs. Swarm intelligence can be regarded as an effective technique for solving linear or nonlinear, non-differentiable fractional objective functions. Problems with an optimal solution at a finite point and an unbounded constraint set can be solved using the proposed approach. Numerical examples are given to show the feasibility, effectiveness, and robustness of the proposed algorithm. The results obtained using the two SI algorithms revealed the superiority of the proposed technique over others in computational time. Markedly better accuracy was also observed in the solution results for the industrial application problems.
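    One of the two SI algorithms, particle swarm optimization, can be sketched on a toy linear fractional objective. This is a minimal sketch: the parameter values, bounds, and test function below are illustrative assumptions, not those of the paper's benchmarks.

    ```python
    import random

    def pso_minimize(f, bounds, n_particles=30, iters=200,
                     w=0.7, c1=1.5, c2=1.5, seed=0):
        """Minimal particle swarm optimizer: each particle is pulled toward
        its personal best and the swarm's global best; positions are clamped
        to the box constraints."""
        rng = random.Random(seed)
        dim = len(bounds)
        pos = [[rng.uniform(lo, hi) for lo, hi in bounds]
               for _ in range(n_particles)]
        vel = [[0.0] * dim for _ in range(n_particles)]
        pbest = [p[:] for p in pos]
        pbest_val = [f(p) for p in pos]
        g = min(range(n_particles), key=lambda i: pbest_val[i])
        gbest, gbest_val = pbest[g][:], pbest_val[g]
        for _ in range(iters):
            for i in range(n_particles):
                for d in range(dim):
                    r1, r2 = rng.random(), rng.random()
                    vel[i][d] = (w * vel[i][d]
                                 + c1 * r1 * (pbest[i][d] - pos[i][d])
                                 + c2 * r2 * (gbest[d] - pos[i][d]))
                    lo, hi = bounds[d]
                    pos[i][d] = min(max(pos[i][d] + vel[i][d], lo), hi)
                val = f(pos[i])
                if val < pbest_val[i]:
                    pbest[i], pbest_val[i] = pos[i][:], val
                    if val < gbest_val:
                        gbest, gbest_val = pos[i][:], val
        return gbest, gbest_val

    # Toy linear fractional programme: minimize (x + 2) / (2x + 1) on [0, 4].
    # The objective is monotonically decreasing, so the minimum 2/3 lies at x = 4.
    frac = lambda p: (p[0] + 2.0) / (2.0 * p[0] + 1.0)
    best_x, best_val = pso_minimize(frac, bounds=[(0.0, 4.0)])
    ```

    No gradient of the fractional objective is needed, which is exactly the property that makes SI methods attractive for non-differentiable FPPs.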

  14. Integrating Retraction Modeling Into an Atlas-Based Framework for Brain Shift Prediction

    PubMed Central

    Chen, Ishita; Ong, Rowena E.; Simpson, Amber L.; Sun, Kay; Thompson, Reid C.

    2015-01-01

    In recent work, an atlas-based statistical model for brain shift prediction, which accounts for uncertainty in the intraoperative environment, has been proposed. Previous work reported in the literature using this technique did not account for local deformation caused by surgical retraction. It is challenging to precisely localize the retractor location prior to surgery and the retractor is often moved in the course of the procedure. This paper proposes a technique that involves computing the retractor-induced brain deformation in the operating room through an active model solve and linearly superposing the solution with the precomputed deformation atlas. As a result, the new method takes advantage of the atlas-based framework’s accounting for uncertainties while also incorporating the effects of retraction with minimal intraoperative computing. This new approach was tested using simulation and phantom experiments. The results showed an improvement in average shift correction from 50% (ranging from 14 to 81%) for gravity atlas alone to 80% using the active solve retraction component (ranging from 73 to 85%). This paper presents a novel yet simple way to integrate retraction into the atlas-based brain shift computation framework. PMID:23864146

  15. Artificial intelligence, expert systems, computer vision, and natural language processing

    NASA Technical Reports Server (NTRS)

    Gevarter, W. B.

    1984-01-01

    An overview of artificial intelligence (AI), its core ingredients, and its applications is presented. The knowledge representation, logic, problem solving approaches, languages, and computers pertaining to AI are examined, and the state of the art in AI is reviewed. The use of AI in expert systems, computer vision, natural language processing, speech recognition and understanding, speech synthesis, problem solving, and planning is examined. Basic AI topics, including automation, search-oriented problem solving, knowledge representation, and computational logic, are discussed.

  16. Potential of DNA barcoding for detecting quarantine fungi.

    PubMed

    Gao, Ruifang; Zhang, Guiming

    2013-11-01

    The detection of live quarantine pathogenic fungi plays an important role in guaranteeing regional biological safety. DNA barcoding, an emerging species identification technology, holds promise for the reliable, quick, and accurate detection of quarantine fungi. International standards for phytosanitary guidelines are urgently needed. The varieties of quarantine fungi listed for seven countries/regions, the currently applied detection methods, and the status of DNA barcoding for detecting quarantine fungi are summarized in this study. Two approaches have been proposed to apply DNA barcoding to fungal quarantine procedures: (i) to verify the reliability of known internal transcribed spacer (ITS)/cytochrome c oxidase subunit I (COI) data for use as barcodes, and (ii) to determine other barcodes for species that cannot be identified by ITS/COI. As a unique, standardizable, and universal species identification tool, DNA barcoding offers great potential for integrating detection methods used in various countries/regions and establishing international detection standards based on accepted DNA barcodes. Through international collaboration, interstate disputes can be eased and many problems related to routine quarantine detection methods can be solved for global trade.

  17. Probing Conformational Changes in Human DNA Topoisomerase IIα by Pulsed Alkylation Mass Spectrometry*

    PubMed Central

    Chen, Yu-tsung; Collins, Tammy R. L.; Guan, Ziqiang; Chen, Vincent B.; Hsieh, Tao-Shih

    2012-01-01

    Type II topoisomerases are essential enzymes for solving DNA topological problems by passing one segment of DNA duplex through a transient double-strand break in a second segment. The reaction requires the enzyme to precisely control DNA cleavage and gate opening coupled with ATP hydrolysis. Using pulsed alkylation mass spectrometry, we were able to monitor the solvent accessibilities around 13 cysteines distributed throughout human topoisomerase IIα by measuring the thiol reactivities with monobromobimane. Most of the measured reactivities are in accordance with the predicted ones based on a homology structural model generated from available crystal structures. However, these results reveal new information for both the residues not covered in the structural model and potential differences between the modeled and solution holoenzyme structures. Furthermore, on the basis of the reactivity changes of several cysteines located at the N-gate and DNA gate, we could monitor the movement of topoisomerase II in the presence of cofactors and detect differences in the DNA gate between two closed clamp enzyme conformations locked by either 5′-adenylyl β,γ-imidodiphosphate or the anticancer drug ICRF-193. PMID:22679013

  18. Statistical Significance of Optical Map Alignments

    PubMed Central

    Sarkar, Deepayan; Goldstein, Steve; Schwartz, David C.

    2012-01-01

    Abstract The Optical Mapping System constructs ordered restriction maps spanning entire genomes through the assembly and analysis of large datasets comprising individually analyzed genomic DNA molecules. Such restriction maps uniquely reveal mammalian genome structure and variation, but also raise computational and statistical questions beyond those that have been solved in the analysis of smaller, microbial genomes. We address the problem of how to filter maps that align poorly to a reference genome. We obtain map-specific thresholds that control errors and improve iterative assembly. We also show how an optimal self-alignment score provides an accurate approximation to the probability of alignment, which is useful in applications seeking to identify structural genomic abnormalities. PMID:22506568

  19. [Definition of the specificity of DNA-methyltransferase M.Bsc4I in cell lysate by blocking of restriction endonucleases and computer modeling].

    PubMed

    Dedkov, V S

    2009-01-01

    The specificity of DNA-methyltransferase M.Bsc4I was determined in cell lysate of Bacillus schlegelii 4. For this purpose, we used the methylation sensitivity of restriction endonucleases, together with computer modeling of methylation. The modeling consisted of editing DNA sequences by replacing methylated bases and their complementary bases. Substrate DNAs processed by M.Bsc4I were also used to study the sensitivity of several restriction endonucleases to methylation. It was thus shown that M.Bsc4I methylates 5'-Cm4CNNNNNNNGG-3' and that overlapping dcm-methylation blocks its activity. The proposed approach may prove sufficiently universal and simple for determining the specificity of DNA-methyltransferases.

  20. Simulations Using Random-Generated DNA and RNA Sequences

    ERIC Educational Resources Information Center

    Bryce, C. F. A.

    1977-01-01

    Using a very simple computer program written in BASIC, a very large number of random-generated DNA or RNA sequences are obtained. Students use these sequences to predict complementary sequences and translational products, evaluate base compositions, determine frequencies of particular triplet codons, and suggest possible secondary structures.…
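
    The exercise is easy to reproduce today; the following Python sketch stands in for the original BASIC program (function names and the toy sequence length are illustrative):

```python
import random
import collections

COMPLEMENT = {"A": "T", "T": "A", "G": "C", "C": "G"}

def random_dna(n, seed=None):
    """Generate a random DNA sequence of length n."""
    rng = random.Random(seed)
    return "".join(rng.choice("ACGT") for _ in range(n))

def complement(seq):
    """Base-pair complement of the strand."""
    return "".join(COMPLEMENT[b] for b in seq)

def base_composition(seq):
    """Fraction of each base in the sequence."""
    counts = collections.Counter(seq)
    return {b: counts[b] / len(seq) for b in "ACGT"}

def codon_frequencies(seq):
    """Frequencies of non-overlapping triplets read in frame 0."""
    return collections.Counter(seq[i:i + 3] for i in range(0, len(seq) - 2, 3))

seq = random_dna(300, seed=42)
comp = complement(seq)
composition = base_composition(seq)
codons = codon_frequencies(seq)
```

    Students can then compare observed codon frequencies against the 1/64 expected under a uniform base model, just as in the original exercise.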

  1. Solving Equations of Multibody Dynamics

    NASA Technical Reports Server (NTRS)

    Jain, Abhinandan; Lim, Christopher

    2007-01-01

    Darts++ is a computer program for solving the equations of motion of a multibody system or of a multibody model of a dynamic system. It is intended especially for use in dynamical simulations performed in designing and analyzing, and developing software for the control of, complex mechanical systems. Darts++ is based on the Spatial-Operator- Algebra formulation for multibody dynamics. This software reads a description of a multibody system from a model data file, then constructs and implements an efficient algorithm that solves the dynamical equations of the system. The efficiency and, hence, the computational speed is sufficient to make Darts++ suitable for use in realtime closed-loop simulations. Darts++ features an object-oriented software architecture that enables reconfiguration of system topology at run time; in contrast, in related prior software, system topology is fixed during initialization. Darts++ provides an interface to scripting languages, including Tcl and Python, that enable the user to configure and interact with simulation objects at run time.

  2. Physics Learning with a Computer Algebra System: Towards a Learning Environment That Promotes Enhanced Problem Representations.

    ERIC Educational Resources Information Center

    Savelsbergh, Elwin R.; Ferguson-Hessler, Monica G. M.; de Jong, Ton

    An approach to teaching problem-solving based on using the computer software Mathematica is applied to the study of electrostatics and is compared with the normal approach to the module. Learning outcomes for both approaches were not significantly different. The experimental course successfully addressed a number of misconceptions. Students in the…

  3. Cultural Commonalities and Differences in Spatial Problem-Solving: A Computational Analysis

    ERIC Educational Resources Information Center

    Lovett, Andrew; Forbus, Kenneth

    2011-01-01

    A fundamental question in human cognition is how people reason about space. We use a computational model to explore cross-cultural commonalities and differences in spatial cognition. Our model is based upon two hypotheses: (1) the structure-mapping model of analogy can explain the visual comparisons used in spatial reasoning; and (2) qualitative,…

  4. An inverse problem strategy based on forward model evaluations: Gradient-based optimization without adjoint solves

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Aguilo Valentin, Miguel Alejandro

    2016-07-01

    This study presents a new nonlinear programming formulation for the solution of inverse problems. First, a general inverse problem formulation based on the compliance error functional is presented. The proposed error functional enables the computation of the Lagrange multipliers, and thus the first-order derivative information, at the expense of just one model evaluation. Therefore, the calculation of the Lagrange multipliers does not require the solution of the computationally intensive adjoint problem. This leads to significant speedups for large-scale, gradient-based inverse problems.
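
    The general idea of gradient-based inversion driven purely by forward evaluations can be sketched on a toy problem. Note that this uses plain finite differences on a made-up two-parameter model, not the paper's compliance error functional, which obtains the derivatives far more cheaply:

```python
def forward(m):
    """Hypothetical two-parameter forward model (stands in for a PDE solve)."""
    return [m[0] + m[1], m[0] * m[1], m[0] - 2.0 * m[1]]

def misfit(m, data):
    """Least-squares error functional between prediction and observations."""
    return sum((p - d) ** 2 for p, d in zip(forward(m), data))

def fd_gradient(m, data, h=1e-6):
    """Gradient of the misfit built from forward evaluations only (no adjoint)."""
    base = misfit(m, data)
    grad = []
    for i in range(len(m)):
        mp = list(m)
        mp[i] += h
        grad.append((misfit(mp, data) - base) / h)
    return grad

data = forward([2.0, 1.0])   # synthetic observations; true parameters (2, 1)
m = [0.5, 0.5]               # initial guess
step = 0.05
for _ in range(2000):        # plain gradient descent
    g = fd_gradient(m, data)
    m = [mi - step * gi for mi, gi in zip(m, g)]
```

    Finite differencing costs one extra forward solve per parameter, which is precisely the expense the paper's single-evaluation Lagrange multiplier construction avoids.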

  5. Design and Implementation of a Tool for Teaching Programming.

    ERIC Educational Resources Information Center

    Goktepe, Mesut; And Others

    1989-01-01

    Discussion of the use of computers in education focuses on a graphics-based system for teaching the Pascal programming language for problem solving. Topics discussed include user interface; notification-based systems; communication processes; object-oriented programming; workstations; graphics architecture; and flowcharts. (18 references) (LRW)

  6. A compact two-wave dichrometer of an optical biosensor analytical system for medicine

    NASA Astrophysics Data System (ADS)

    Chulkov, D. P.; Gusev, V. M.; Kompanets, O. N.; Vereschagin, F. V.; Skuridin, S. G.; Yevdokimov, Yu. M.

    2017-01-01

    An experimental model of a compact two-wave LED-based dichrometer has been developed that is well suited to working with "liquid" DNA nanoconstructions as biosensing units. The mobile and inexpensive device is intended for use in a biosensor analytical system for the rapid determination of biologically active compounds in liquids, to solve practical problems of clinical medicine and pharmacology.

  7. HTSFinder: Powerful Pipeline of DNA Signature Discovery by Parallel and Distributed Computing

    PubMed Central

    Karimi, Ramin; Hajdu, Andras

    2016-01-01

    Comprehensive effort for low-cost sequencing in the past few years has led to the growth of complete genome databases. In parallel with this effort, fast and cost-effective methods and applications have been developed to meet the strong need to accelerate sequence analysis. Identification is the very first step of this task. Due to the difficulties, high costs, and computational challenges of alignment-based approaches, an alternative universal identification method is urgently needed. As an alignment-free approach, DNA signatures provide new opportunities for the rapid identification of species. In this paper, we present an effective pipeline, HTSFinder (high-throughput signature finder), with a corresponding k-mer generator, GkmerG (genome k-mers generator). Using this pipeline, we determine the frequency of k-mers from the available complete genome databases for the detection of extensive DNA signatures in a reasonably short time. Our application can detect both unique and common signatures in arbitrarily selected target and nontarget databases. Hadoop and MapReduce, as parallel and distributed computing tools running on commodity hardware, are used in this pipeline. This approach brings the power of high-performance computing to ordinary desktop personal computers for discovering DNA signatures in large databases such as bacterial genomes. The considerable number of unique and common DNA signatures detected in the target database brings opportunities to improve the identification process not only for polymerase chain reaction and microarray assays but also for more complex scenarios such as metagenomics and next-generation sequencing analysis. PMID:26884678
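
    The map/reduce structure of k-mer counting and signature detection can be sketched serially (a stand-in for the Hadoop/MapReduce pipeline; the toy sequences and function names are illustrative, not HTSFinder's API):

```python
import collections

def kmers(seq, k):
    """'Map' step: emit every k-mer of a sequence (GkmerG's role, simplified)."""
    for i in range(len(seq) - k + 1):
        yield seq[i:i + k]

def kmer_counts(genomes, k):
    """'Reduce' step: aggregate k-mer frequencies across a genome collection."""
    counts = collections.Counter()
    for seq in genomes:
        counts.update(kmers(seq, k))
    return counts

def unique_signatures(target, nontarget, k):
    """k-mers shared by every target genome and absent from all non-targets."""
    shared = set(kmers(target[0], k))
    for seq in target[1:]:
        shared &= set(kmers(seq, k))
    for seq in nontarget:
        shared -= set(kmers(seq, k))
    return shared

# Toy "databases"; real inputs would be complete genomes.
target = ["ATGCGTACGT", "TTATGCGTAA"]
nontarget = ["GGGTACGTCC"]
counts = kmer_counts(target, 5)
sigs = unique_signatures(target, nontarget, 5)
```

    In the distributed setting, the map step is sharded across genomes and the counter merge becomes the reduce phase; the set algebra is unchanged.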

  9. Self-Directed Student Research through Analysis of Microarray Datasets: A Computer-Based Functional Genomics Practical Class for Masters-Level Students

    ERIC Educational Resources Information Center

    Grenville-Briggs, Laura J.; Stansfield, Ian

    2011-01-01

    This report describes a linked series of Masters-level computer practical workshops. They comprise an advanced functional genomics investigation, based upon analysis of a microarray dataset probing yeast DNA damage responses. The workshops require the students to analyse highly complex transcriptomics datasets, and were designed to stimulate…

  10. A Cognitive Model for Problem Solving in Computer Science

    ERIC Educational Resources Information Center

    Parham, Jennifer R.

    2009-01-01

    According to industry representatives, computer science education needs to emphasize the processes involved in solving computing problems rather than their solutions. Most of the current assessment tools used by universities and computer science departments analyze student answers to problems rather than investigating the processes involved in…

  11. Fast-Solving Quasi-Optimal LS-S3VM Based on an Extended Candidate Set.

    PubMed

    Ma, Yuefeng; Liang, Xun; Kwok, James T; Li, Jianping; Zhou, Xiaoping; Zhang, Haiyan

    2018-04-01

    The semisupervised least squares support vector machine (LS-S3VM) is an important enhancement of least squares support vector machines in semisupervised learning. Given that most data collected from the real world are without labels, semisupervised approaches are more applicable than standard supervised approaches. Although a few training methods for LS-S3VM exist, the problem of deriving the optimal decision hyperplane efficiently and effectively has not been solved. In this paper, a fully weighted model of LS-S3VM is proposed, and a simple integer programming (IP) model is introduced through an equivalent transformation to solve the model. Based on the distances between the unlabeled data and the decision hyperplane, a new indicator is designed to represent the possibility that the label of an unlabeled datum should be reversed in each iteration during training. Using the indicator, we construct an extended candidate set consisting of the indices of unlabeled data with high possibilities, which integrates more information from unlabeled data. Our algorithm degenerates into a special case of the previous algorithm when the extended candidate set is reduced to a set with only one element. Two strategies are utilized to determine the descent directions based on the extended candidate set. Furthermore, we developed a novel method for locating a good starting point based on the properties of the equivalent IP model. Combined with the extended candidate set and the carefully computed starting point, a fast algorithm to solve LS-S3VM quasi-optimally is proposed. The choice of quasi-optimal solutions results in low computational cost and avoidance of overfitting. Experiments show that our algorithm equipped with the two designed strategies is more effective than other algorithms in at least one of the following three aspects: 1) computational complexity; 2) generalization ability; and 3) flexibility. In the remaining aspects, our algorithm and the others perform at similar levels.

  12. Efficient Monte Carlo sampling of inverse problems using a neural network-based forward—applied to GPR crosshole traveltime inversion

    NASA Astrophysics Data System (ADS)

    Hansen, T. M.; Cordua, K. S.

    2017-12-01

    Probabilistically formulated inverse problems can be solved using Monte Carlo-based sampling methods. In principle, both advanced prior information (based on, for example, complex geostatistical models) and non-linear forward models can be considered using such methods. However, Monte Carlo methods may be associated with huge computational costs that, in practice, limit their application. This is not least due to the computational requirements related to solving the forward problem, where the physical forward response of some earth model has to be evaluated. Here, it is suggested to replace a numerically complex evaluation of the forward problem with a trained neural network that can be evaluated very fast. This introduces a modeling error that is quantified probabilistically such that it can be accounted for during inversion. This allows a very fast and efficient Monte Carlo sampling of the solution to an inverse problem. We demonstrate the methodology for first-arrival traveltime inversion of crosshole ground penetrating radar data. An accurate forward model, based on 2-D full-waveform modeling followed by automatic traveltime picking, is replaced by a fast neural network. This provides a sampling algorithm three orders of magnitude faster than using the accurate and computationally expensive forward model, and also considerably faster and more accurate (i.e. with better resolution) than commonly used approximate forward models. The methodology has the potential to dramatically change the complexity of non-linear and non-Gaussian inverse problems that have to be solved using Monte Carlo sampling techniques.
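
    The workflow (train a cheap surrogate on the expensive forward, quantify its modeling error, and fold that error into the likelihood during Metropolis sampling) can be reproduced in miniature. Here a 1-D toy with a lookup-table surrogate stands in for the trained neural network, and all numerical values are assumptions:

```python
import bisect
import math
import random

def forward_true(m):
    """Stands in for an expensive forward simulation (e.g. full-waveform
    modeling plus traveltime picking)."""
    return math.sin(m) + 0.1 * m

# "Training": evaluate the expensive forward once on a coarse grid of the prior.
grid = [-2.0 + 4.0 * i / 40 for i in range(41)]
table = [forward_true(x) for x in grid]

def forward_fast(m):
    """Cheap surrogate: linear interpolation in the lookup table. The paper
    trains a neural network instead; the role in the sampler is the same."""
    j = min(max(bisect.bisect(grid, m), 1), len(grid) - 1)
    t = (m - grid[j - 1]) / (grid[j] - grid[j - 1])
    return table[j - 1] + t * (table[j] - table[j - 1])

# Quantify the surrogate's modeling error and fold it into the data variance.
sigma_model = max(abs(forward_fast(x) - forward_true(x))
                  for x in [-2.0 + 4.0 * i / 400 for i in range(401)])
sigma_obs = 0.05
var = sigma_obs ** 2 + sigma_model ** 2

# Random-walk Metropolis on p(m | d) with a uniform prior on [-2, 2].
rng = random.Random(1)
m_true = 0.8
d_obs = forward_true(m_true)
m, samples = 0.0, []
ll = -(forward_fast(m) - d_obs) ** 2 / (2.0 * var)
for _ in range(20000):
    m_new = m + rng.gauss(0.0, 0.3)
    if -2.0 <= m_new <= 2.0:                      # prior support
        ll_new = -(forward_fast(m_new) - d_obs) ** 2 / (2.0 * var)
        if math.log(rng.random()) < ll_new - ll:  # accept/reject
            m, ll = m_new, ll_new
    samples.append(m)
post_mean = sum(samples[5000:]) / len(samples[5000:])
```

    After the one-off training cost, every likelihood evaluation in the chain touches only the surrogate, which is where the reported three-orders-of-magnitude speedup comes from.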

  13. MOOSE: A PARALLEL COMPUTATIONAL FRAMEWORK FOR COUPLED SYSTEMS OF NONLINEAR EQUATIONS.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    G. Hansen; C. Newman; D. Gaston

    Systems of coupled, nonlinear partial differential equations often arise in simulation of nuclear processes. MOOSE: Multiphysics Object Oriented Simulation Environment, a parallel computational framework targeted at solving these systems, is presented. As opposed to traditional data-flow oriented computational frameworks, MOOSE is instead founded on mathematics based on Jacobian-free Newton Krylov (JFNK). Utilizing the mathematical structure present in JFNK, physics are modularized into "Kernels," allowing for rapid production of new simulation tools. In addition, systems are solved fully coupled and fully implicit, employing physics-based preconditioning, allowing for a large amount of flexibility even with large variance in time scales. Background on the mathematics, an inspection of the structure of MOOSE, and several representative solutions from applications built on the framework are presented.
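
    The JFNK idea, Newton iterations whose Krylov linear solves need only Jacobian-vector products obtained from finite differences of the residual, can be sketched on a toy system (this is not MOOSE code; conjugate gradients serves as the Krylov solver because the toy Jacobian is symmetric positive definite):

```python
import math

# Toy nonlinear system F(u) = A u + u^3 - b = 0 with a symmetric positive
# definite A, so the Jacobian A + diag(3 u^2) is SPD and conjugate gradients
# can serve as the Krylov solver. Only F itself is ever coded: Jacobian-vector
# products come from finite differences, which is the essence of JFNK.
A = [[4.0, 1.0, 0.0],
     [1.0, 4.0, 1.0],
     [0.0, 1.0, 4.0]]
b = [1.0, 2.0, 3.0]

def F(u):
    return [sum(A[i][j] * u[j] for j in range(3)) + u[i] ** 3 - b[i]
            for i in range(3)]

def jv(u, v, Fu, eps=1e-7):
    """Matrix-free Jacobian-vector product J(u) v via finite differences."""
    Fp = F([ui + eps * vi for ui, vi in zip(u, v)])
    return [(fp - f) / eps for fp, f in zip(Fp, Fu)]

def cg(u, Fu, rhs, iters=20, tol=1e-12):
    """Conjugate gradients on J(u) x = rhs using only jv products."""
    x, r = [0.0] * 3, rhs[:]
    p = r[:]
    rs = sum(ri * ri for ri in r)
    for _ in range(iters):
        if rs < tol:
            break
        Jp = jv(u, p, Fu)
        alpha = rs / sum(pi * qi for pi, qi in zip(p, Jp))
        x = [xi + alpha * pi for xi, pi in zip(x, p)]
        r = [ri - alpha * qi for ri, qi in zip(r, Jp)]
        rs_new = sum(ri * ri for ri in r)
        p = [ri + (rs_new / rs) * pi for ri, pi in zip(r, p)]
        rs = rs_new
    return x

u = [0.0, 0.0, 0.0]
for _ in range(10):                        # Newton iterations
    Fu = F(u)
    du = cg(u, Fu, [-f for f in Fu])       # solve J(u) du = -F(u)
    u = [ui + di for ui, di in zip(u, du)]
residual = math.sqrt(sum(f * f for f in F(u)))
```

    Because the Jacobian is never formed, adding a new coupled "Kernel" only means adding terms to F, which is the modularity the framework exploits.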

  14. A generalized Condat's algorithm of 1D total variation regularization

    NASA Astrophysics Data System (ADS)

    Makovetskii, Artyom; Voronin, Sergei; Kober, Vitaly

    2017-09-01

    A common way of solving the denoising problem is to utilize total variation (TV) regularization. Many efficient numerical algorithms have been developed for solving the TV regularization problem. Condat described a fast direct algorithm to compute the processed 1D signal. There also exists a direct linear-time algorithm for 1D TV denoising, referred to as the taut string algorithm. Condat's algorithm is based on a dual problem to the 1D TV regularization. In this paper, we propose a variant of Condat's algorithm based on the direct 1D TV regularization problem. Using Condat's algorithm with the taut string approach leads to a clear geometric description of the extremal function. Computer simulation results are provided to illustrate the performance of the proposed algorithm for restoration of degraded signals.
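
    The objective that both algorithms minimize can be made concrete with a brute-force stand-in that is exact only on a quantized grid of output levels (neither Condat's direct algorithm nor the taut string method, just the TV functional spelled out):

```python
def tv_denoise_dp(y, lam, levels):
    """Exact minimizer of 0.5*sum((x - y)**2) + lam*sum(|x[i+1] - x[i]|) when
    the outputs are restricted to the given quantized levels. A brute-force
    O(n * L**2) dynamic program, not Condat's fast direct algorithm."""
    n, L = len(y), len(levels)
    cost = [0.5 * (v - y[0]) ** 2 for v in levels]
    back = []
    for i in range(1, n):
        new, arg = [], []
        for a in range(L):
            # Best predecessor level for output level a at position i.
            bb = min(range(L),
                     key=lambda c: cost[c] + lam * abs(levels[a] - levels[c]))
            arg.append(bb)
            new.append(cost[bb] + lam * abs(levels[a] - levels[bb])
                       + 0.5 * (levels[a] - y[i]) ** 2)
        back.append(arg)
        cost = new
    # Trace back the optimal level sequence.
    k = min(range(L), key=lambda a: cost[a])
    path = [k]
    for arg in reversed(back):
        k = arg[k]
        path.append(k)
    return [levels[k] for k in reversed(path)]

y = [0.0, 0.1, 1.2, 1.0, 1.1, 0.2]
levels = [i / 10 for i in range(13)]   # candidate outputs 0.0, 0.1, ..., 1.2
x = tv_denoise_dp(y, 0.3, levels)
```

    With lam = 0 the output snaps to the data, and for very large lam it collapses to a single constant, the two limiting behaviors of TV regularization.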

  15. A Crystallographic Study of the Role of Sequence Context in Thymine Glycol Bypass by a Replicative DNA Polymerase Serendipitously Sheds Light on the Exonuclease Complex

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Aller, Pierre; Duclos, Stéphanie; Wallace, Susan S.

    2012-06-27

    Thymine glycol (Tg) is the most common oxidation product of thymine and is known to be a strong block to replicative DNA polymerases. A previously solved structure of the bacteriophage RB69 DNA polymerase (RB69 gp43) in complex with Tg in the sequence context 5'-G-Tg-G shed light on how Tg blocks primer elongation: The protruding methyl group of the oxidized thymine displaces the adjacent 5'-G, which can no longer serve as a template for primer elongation [Aller, P., Rould, M. A., Hogg, M., Wallace, S. S. and Doublie, S. (2007). A structural rationale for stalling of a replicative DNA polymerase at the most common oxidative thymine lesion, thymine glycol. Proc. Natl. Acad. Sci. USA, 104, 814-818]. Several studies showed that in the sequence context 5'-C-Tg-purine, Tg is more likely to be bypassed by Klenow fragment, an A-family DNA polymerase. We set out to investigate the role of sequence context in Tg bypass in a B-family polymerase and to solve the crystal structures of the bacteriophage RB69 DNA polymerase in complex with Tg-containing DNA in the three remaining sequence contexts: 5'-A-Tg-G, 5'-T-Tg-G, and 5'-C-Tg-G. A combination of several factors - including the associated exonuclease activity, the nature of the 3' and 5' bases surrounding Tg, and the cis-trans interconversion of Tg - influences Tg bypass. We also visualized for the first time the structure of a well-ordered exonuclease complex, allowing us to identify and confirm the role of key residues (Phe123, Met256, and Tyr257) in strand separation and in the stabilization of the primer strand in the exonuclease site.

  16. A computer aided thermodynamic approach for predicting the formation of Z-DNA in naturally occurring sequences

    NASA Technical Reports Server (NTRS)

    Ho, P. S.; Ellison, M. J.; Quigley, G. J.; Rich, A.

    1986-01-01

    The ease with which a particular DNA segment adopts the left-handed Z-conformation depends largely on the sequence and on the degree of negative supercoiling to which it is subjected. We describe a computer program (Z-hunt) that is designed to search long sequences of naturally occurring DNA and retrieve those nucleotide combinations of up to 24 bp in length which show a strong propensity for Z-DNA formation. Incorporated into Z-hunt is a statistical mechanical model based on empirically determined energetic parameters for the B to Z transition accumulated to date. The Z-forming potential of a sequence is assessed by ranking its behavior as a function of negative superhelicity relative to the behavior of similar sized randomly generated nucleotide sequences assembled from over 80,000 combinations. The program makes it possible to compare directly the Z-forming potential of sequences with different base compositions and different sequence lengths. Using Z-hunt, we have analyzed the DNA sequences of the bacteriophage phi X174, plasmid pBR322, the animal virus SV40 and the replicative form of the eukaryotic adenovirus-2. The results are compared with those previously obtained by others from experiments designed to locate Z-DNA forming regions in these sequences using probes which show specificity for the left-handed DNA conformation.
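
    A sliding-window scan of the kind Z-hunt performs can be mimicked with a toy propensity score (alternating purine/pyrimidine steps weighted ad hoc; Z-hunt's actual statistical mechanical model and energetic parameters are not reproduced here):

```python
PURINES = set("AG")

def z_propensity(window):
    """Toy Z-propensity: count alternating purine/pyrimidine dinucleotide
    steps, weighting CG/GC steps double. Illustrative weights only; Z-hunt
    uses empirically determined B-to-Z transition energetics instead."""
    score = 0.0
    for a, b in zip(window, window[1:]):
        if (a in PURINES) != (b in PURINES):   # strict alternation, as in Z-DNA
            score += 2.0 if {a, b} == {"C", "G"} else 1.0
    return score

def scan(seq, width=12):
    """Slide a window across the sequence and return the best-scoring one."""
    pos = max(range(len(seq) - width + 1),
              key=lambda i: z_propensity(seq[i:i + width]))
    return pos, seq[pos:pos + width]

seq = "ATTTAAA" + "CG" * 6 + "TTTAAAT"   # made-up sequence with a (CG)6 island
pos, win = scan(seq)
```

    Z-hunt additionally ranks each window against scores from randomly generated sequences, which is what lets segments of different lengths and compositions be compared directly.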

  17. Advanced Computer Aids in the Planning and Execution of Air Warfare and Ground Strike Operations: Conference Proceedings, Meeting of the Avionics Panels of AGARD (51st) Held in Kongsberg, Norway on 12-16 May 1986

    DTIC Science & Technology

    1986-02-01

    the area of Artificial Intelligence (AI). DARPA's Strategic Computing Program is developing an AI technology base upon which several applications...technologies with the Strategic Computing Program. In late 1983 the Strategic Computing Program (SCP) was announced. The program was organized to develop...solving a resource allocation problem. The remainder of this paper will discuss the TEMPLAR program as it relates to the Strategic Computing Program

  18. Electron interaction with phosphate cytidine oligomer dCpdC: base-centered radical anions and their electronic spectra.

    PubMed

    Gu, Jiande; Wang, Jing; Leszczynski, Jerzy

    2014-01-30

    A computational chemistry approach was applied to explore the nature of electron attachment to cytosine-rich DNA single strands. An oligomer dinucleoside phosphate, deoxycytidylyl-3',5'-deoxycytidine (dCpdC), was selected as a model system for investigation by density functional theory. Electron distribution patterns for the radical anions of dCpdC in aqueous solution were explored. The excess electron may reside on the nucleobase at the 5' position (dC(•-)pdC) or at the 3' position (dCpdC(•-)). From comparison with electron attachment to related cytosine-containing DNA fragments, the electron affinity for the formation of the cytosine-centered radical anion in DNA is estimated to be around 2.2 eV. Electron attachment to cytosine sites in DNA single strands might cause perturbations of local structural characteristics. Visible absorption spectroscopy may be applied to validate the computational results and determine experimentally the existence of the base-centered radical anion. The time-dependent DFT study shows absorption around 550-600 nm for the cytosine-centered radical anions of DNA oligomers. This indicates that if such species are detected experimentally, they would be characterized by a distinctive color.

  19. Multi-pedal DNA walker biosensors based on catalyzed hairpin assembly and isothermal strand-displacement polymerase reaction for the chemiluminescent detection of proteins.

    PubMed

    Li, Ningxing; Du, Mingyuan; Liu, Yucheng; Ji, Xinghu; He, Zhike

    2018-06-25

    Two kinds of sensitive biosensors, based on a multi-pedal DNA walker moving along a 3-D track of DNA-functionalized magnetic particles, are constructed and compared for the chemiluminescent detection of streptavidin (SA) in this study. In the presence of SA, the multi-pedal DNA walker is assembled from a biotin-modified catalyst, which terminal protection shields from digestion by exonuclease I. A toehold of CHA-H1 conjugated to magnetic microparticles (MMPs) then interacts with a 'leg' of the multi-pedal DNA walker to open the hairpin via toehold-mediated strand exchange catalysis. A newly exposed DNA segment in CHA-H1 hybridizes with a toehold of biotin-labeled H2. Through the strand displacement process, H2 displaces one 'leg' of the walker, while the other 'leg' can still hybridize with a neighboring H1 to initiate the next cycle. To reduce the high background caused by hybridization between CHA-H1 and H2 in the absence of the CHA catalyst, a second model was designed. The principle of this model (ISDPR DNA walker) is similar to the first: after terminal protection by SA, a 'leg' of the multi-pedal DNA walker triggers opening of the ISDPR-H1 hairpin conjugated to MMPs. A biotin-modified primer then hybridizes with the open stem, triggering the polymerization reaction in the presence of dNTPs/polymerase. As the primer extends, the walker 'leg' is displaced so that the other 'leg' can engage a proximal H1 for the next cycle. Owing to its lower background and stronger signal, the multi-pedal DNA walker based on ISDPR achieves a lower limit of detection (LOD) for SA, of 6.5 pM. Moreover, these DNA walker methods have been applied successfully in complex samples.

  20. Implementation of Arithmetic and Nonarithmetic Functions on a Label-free and DNA-based Platform

    NASA Astrophysics Data System (ADS)

    Wang, Kun; He, Mengqi; Wang, Jin; He, Ronghuan; Wang, Jianhua

    2016-10-01

    A series of complex logic gates were constructed based on graphene oxide and DNA-templated silver nanoclusters to perform both arithmetic and nonarithmetic functions. For the purpose of satisfying the requirements of progressive computational complexity and cost-effectiveness, a label-free and universal platform was developed by integration of various functions, including half adder, half subtractor, multiplexer and demultiplexer. The label-free system avoided laborious modification of biomolecules. The designed DNA-based logic gates can be implemented with readout of near-infrared fluorescence, and exhibit great potential applications in the field of bioimaging as well as disease diagnosis.
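
    The Boolean functions these molecular gates implement are the standard ones; a plain-software rendering of the four integrated functions makes the expected truth tables explicit:

```python
def half_adder(a, b):
    """Sum is XOR, carry is AND: the truth table the fluorescence readout encodes."""
    return a ^ b, a & b

def half_subtractor(a, b):
    """Difference is XOR, borrow is (NOT a) AND b."""
    return a ^ b, (1 - a) & b

def multiplexer(d0, d1, sel):
    """2-to-1 MUX: route d1 when sel is 1, otherwise d0."""
    return d1 if sel else d0

def demultiplexer(d, sel):
    """1-to-2 DEMUX: deliver d to the output chosen by sel."""
    return (0, d) if sel else (d, 0)

# Full truth table of the two arithmetic functions.
table = [(a, b, half_adder(a, b), half_subtractor(a, b))
         for a in (0, 1) for b in (0, 1)]
```

    In the reported platform, each row of this table corresponds to a combination of DNA inputs, with the near-infrared fluorescence of the silver nanoclusters serving as the output bit.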

  1. A multidisciplinary approach to solving computer related vision problems.

    PubMed

    Long, Jennifer; Helland, Magne

    2012-09-01

    This paper proposes a multidisciplinary approach to solving computer-related vision issues by including optometry as a part of the problem-solving team. Computer workstation design is increasing in complexity. There are at least ten different professions that contribute to workstation design or provide advice to improve worker comfort, safety and efficiency. Optometrists have a role identifying and solving computer-related vision issues and in prescribing appropriate optical devices. However, it is possible that advice given by optometrists to improve visual comfort may conflict with other requirements and demands within the workplace. A multidisciplinary approach has been advocated for solving computer-related vision issues. There are opportunities for optometrists to collaborate with ergonomists, who coordinate information from physical, cognitive and organisational disciplines to enact holistic solutions to problems. This paper proposes a model of collaboration and examples of successful partnerships at a number of professional levels, including individual relationships between optometrists and ergonomists when they have mutual clients/patients, in undergraduate and postgraduate education and in research. There is also scope for dialogue between optometry and ergonomics professional associations. A multidisciplinary approach offers the opportunity to solve vision-related computer issues in a cohesive, rather than fragmented, way. Further exploration is required to understand the barriers to these professional relationships. © 2012 The College of Optometrists.

  2. A cross-disciplinary introduction to quantum annealing-based algorithms

    NASA Astrophysics Data System (ADS)

    Venegas-Andraca, Salvador E.; Cruz-Santos, William; McGeoch, Catherine; Lanzagorta, Marco

    2018-04-01

    A central goal in quantum computing is the development of quantum hardware and quantum algorithms in order to analyse challenging scientific and engineering problems. Research in quantum computation involves contributions from both physics and computer science; hence this article presents a concise introduction to basic concepts from both fields that are used in annealing-based quantum computation, an alternative to the more familiar quantum gate model. We introduce some concepts from computer science required to define difficult computational problems and to realise the potential relevance of quantum algorithms for finding novel solutions to those problems. We introduce the structure of quantum annealing-based algorithms as well as two example algorithms of this kind for solving instances of the max-SAT and Minimum Multicut problems. An overview of the quantum annealing systems manufactured by D-Wave Systems is also presented.
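As a purely classical point of comparison (not the quantum annealing algorithm the article describes), the max-SAT objective that annealing minimizes can be sketched with simulated annealing; all names and the tiny instance below are illustrative:

```python
import math
import random

def num_satisfied(clauses, assignment):
    # A clause is a list of literals: i means "variable i is True",
    # -i means "variable i is False".
    return sum(any((lit > 0) == assignment[abs(lit)] for lit in c) for c in clauses)

def anneal_max_sat(clauses, n_vars, steps=5000, t0=2.0, seed=0):
    """Classical simulated annealing on the number of satisfied clauses."""
    rng = random.Random(seed)
    assign = {v: rng.random() < 0.5 for v in range(1, n_vars + 1)}
    best = dict(assign)
    for step in range(steps):
        t = t0 * (1 - step / steps) + 1e-3          # linear cooling schedule
        v = rng.randint(1, n_vars)
        before = num_satisfied(clauses, assign)
        assign[v] = not assign[v]                   # propose a single-bit flip
        after = num_satisfied(clauses, assign)
        if after < before and rng.random() > math.exp((after - before) / t):
            assign[v] = not assign[v]               # reject the downhill move
        if num_satisfied(clauses, assign) > num_satisfied(clauses, best):
            best = dict(assign)
    return best

# (x1 or not x2) and (x2 or x3) and (not x1 or not x3)
clauses = [[1, -2], [2, 3], [-1, -3]]
best = anneal_max_sat(clauses, 3)
print(num_satisfied(clauses, best))
```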

  3. Assessing collaborative computing: development of the Collaborative-Computing Observation Instrument (C-COI)

    NASA Astrophysics Data System (ADS)

    Israel, Maya; Wherfel, Quentin M.; Shehab, Saadeddine; Ramos, Evan A.; Metzger, Adam; Reese, George C.

    2016-07-01

    This paper describes the development, validation, and uses of the Collaborative Computing Observation Instrument (C-COI), a web-based analysis instrument that classifies individual and/or collaborative behaviors of students during computing problem-solving (e.g. coding, programming). The C-COI analyzes data gathered through video and audio screen recording software that captures students' computer screens as they program, and their conversations with their peers or adults. The instrument allows researchers to organize and quantify these data to track behavioral patterns that could be further analyzed for deeper understanding of persistence and/or collaborative interactions. The article provides a rationale for the C-COI including the development of a theoretical framework for measuring collaborative interactions in computer-mediated environments. This theoretical framework relied on the computer-supported collaborative learning literature related to adaptive help seeking, the joint problem-solving space in which collaborative computing occurs, and conversations related to outcomes and products of computational activities. Instrument development and validation also included ongoing advisory board feedback from experts in computer science, collaborative learning, and K-12 computing as well as classroom observations to test out the constructs in the C-COI. These processes resulted in an instrument with rigorous validation procedures and a high inter-rater reliability.

  4. Parallel discontinuous Galerkin FEM for computing hyperbolic conservation law on unstructured grids

    NASA Astrophysics Data System (ADS)

    Ma, Xinrong; Duan, Zhijian

    2018-04-01

    High-order resolution discontinuous Galerkin finite element methods (DGFEM) are well regarded for solving the Euler and Navier-Stokes equations on unstructured grids, but they demand substantial computational resources. An efficient parallel algorithm is presented for solving the compressible Euler equations. Moreover, a multigrid strategy based on a three-stage, third-order TVD Runge-Kutta scheme was used in order to improve the computational efficiency of DGFEM and accelerate the convergence of the solution of the unsteady compressible Euler equations. In order to keep the load balanced across processors, the domain decomposition method was employed. Numerical experiments were performed for the inviscid transonic flow problems around the NACA0012 airfoil and the M6 wing. The results indicate that our parallel algorithm improves speedup and efficiency significantly and is suitable for calculating complex flow fields.
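The three-stage, third-order TVD Runge-Kutta scheme mentioned above is commonly written in the Shu-Osher convex-combination form. A minimal sketch on a scalar test equation (the test problem and names are illustrative, not the authors' DGFEM code):

```python
import math

def tvd_rk3_step(u, dt, L):
    """One step of the three-stage, third-order TVD Runge-Kutta scheme
    (Shu-Osher form) for the semi-discrete system du/dt = L(u)."""
    u1 = u + dt * L(u)
    u2 = 0.75 * u + 0.25 * (u1 + dt * L(u1))
    return u / 3.0 + 2.0 / 3.0 * (u2 + dt * L(u2))

# Illustrative scalar problem du/dt = -u, u(0) = 1, integrated to t = 1.
u, dt = 1.0, 0.01
for _ in range(100):
    u = tvd_rk3_step(u, dt, lambda v: -v)
print(abs(u - math.exp(-1.0)))  # global error scales like dt**3
```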

  5. Parallel processing for scientific computations

    NASA Technical Reports Server (NTRS)

    Alkhatib, Hasan S.

    1991-01-01

    The main contribution of the effort in the last two years is the introduction of the MOPPS system. After an extensive literature search, we introduced the system, which is described next. MOPPS employs a new solution to the problem of managing programs that solve scientific and engineering applications in a distributed processing environment. With this solution, autonomous computers cooperate efficiently in solving large scientific problems. MOPPS has the advantage of not assuming the presence of any particular network topology or configuration, computer architecture, or operating system. It imposes little overhead on network and processor resources while efficiently managing programs concurrently. The core of MOPPS is an intelligent program manager that builds a knowledge base of the execution performance of the parallel programs it is managing under various conditions. The manager applies this knowledge to improve the performance of future runs. The program manager learns from experience.

  6. An accelerated non-Gaussianity based multichannel predictive deconvolution method with the limited supporting region of filters

    NASA Astrophysics Data System (ADS)

    Li, Zhong-xiao; Li, Zhen-chun

    2016-09-01

    The multichannel predictive deconvolution can be conducted in overlapping temporal and spatial data windows to solve the 2D predictive filter for multiple removal. Generally, the 2D predictive filter can better remove multiples at the cost of more computation time compared with the 1D predictive filter. In this paper we first use the cross-correlation strategy to determine the limited supporting region of filters, where the coefficients play a major role for multiple removal in the filter coefficient space. To solve the 2D predictive filter, the traditional multichannel predictive deconvolution uses the least squares (LS) algorithm, which requires that primaries and multiples be orthogonal. To relax the orthogonality assumption, the iterative reweighted least squares (IRLS) algorithm and the fast iterative shrinkage thresholding (FIST) algorithm have been used to solve the 2D predictive filter in the multichannel predictive deconvolution with the non-Gaussian maximization (L1 norm minimization) constraint of primaries. The FIST algorithm has been demonstrated to be a faster alternative to the IRLS algorithm. In this paper we introduce the FIST algorithm to solve the filter coefficients in the limited supporting region of filters. Compared with the FIST-based multichannel predictive deconvolution without the limited supporting region of filters, the proposed method can reduce the computation burden effectively while achieving a similar accuracy. Additionally, the proposed method can better balance multiple removal and primary preservation than the traditional LS-based multichannel predictive deconvolution and FIST-based single-channel predictive deconvolution. Synthetic and field data sets demonstrate the effectiveness of the proposed method.
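The fast iterative shrinkage thresholding algorithm referenced here is widely known as FISTA. A generic sketch of FISTA for an L1-regularized least-squares problem (a stand-in for the authors' 2D predictive-filter estimation; the synthetic problem and all names are illustrative):

```python
import numpy as np

def soft_threshold(x, t):
    """Proximal operator of the L1 norm."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def fista(A, b, lam, n_iter=200):
    """FISTA for min_x 0.5*||A x - b||^2 + lam*||x||_1."""
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    y, t = x.copy(), 1.0
    for _ in range(n_iter):
        grad = A.T @ (A @ y - b)
        x_new = soft_threshold(y - grad / L, lam / L)
        t_new = (1 + np.sqrt(1 + 4 * t * t)) / 2
        y = x_new + (t - 1) / t_new * (x_new - x)   # momentum extrapolation
        x, t = x_new, t_new
    return x

# Illustrative sparse recovery problem.
rng = np.random.default_rng(0)
A = rng.standard_normal((30, 10))
x_true = np.zeros(10)
x_true[[2, 7]] = [1.5, -2.0]
b = A @ x_true
x_hat = fista(A, b, lam=0.1)
print(np.argsort(np.abs(x_hat))[-2:])  # indices of the two dominant coefficients
```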

  7. Fast multi-core based multimodal registration of 2D cross-sections and 3D datasets

    PubMed Central

    2010-01-01

    Background Solving bioinformatics tasks often requires extensive computational power. Recent trends in processor architecture combine multiple cores into a single chip to improve overall performance. The Cell Broadband Engine (CBE), a heterogeneous multi-core processor, provides power-efficient and cost-effective high-performance computing. One application area is image analysis and visualisation, in particular registration of 2D cross-sections into 3D image datasets. Such techniques can be used to put different image modalities into spatial correspondence, for example, 2D images of histological cuts into morphological 3D frameworks. Results We evaluate the CBE-driven PlayStation 3 as a high-performance, cost-effective computing platform by adapting a multimodal alignment procedure to several characteristic hardware properties. The optimisations are based on partitioning, vectorisation, branch reducing and loop unrolling techniques with special attention to 32-bit multiplies and limited local storage on the computing units. We show how a typical image analysis and visualisation problem, the multimodal registration of 2D cross-sections and 3D datasets, benefits from the multi-core based implementation of the alignment algorithm. We discuss several CBE-based optimisation methods and compare our results to standard solutions. More information and the source code are available from http://cbe.ipk-gatersleben.de. Conclusions The results demonstrate that the CBE processor in a PlayStation 3 accelerates computationally intensive multimodal registration, which is of great importance in biological/medical image processing. The PlayStation 3 as a low-cost CBE-based platform offers an efficient alternative to conventional hardware for solving computational problems in image processing and bioinformatics. PMID:20064262

  8. Forensic DNA testing.

    PubMed

    Butler, John M

    2011-12-01

    Forensic DNA testing has a number of applications, including parentage testing, identifying human remains from natural or man-made disasters or terrorist attacks, and solving crimes. This article provides background information followed by an overview of the process of forensic DNA testing, including sample collection, DNA extraction, PCR amplification, short tandem repeat (STR) allele separation and sizing, typing and profile interpretation, statistical analysis, and quality assurance. The article concludes with discussions of possible problems with the data and other forensic DNA testing techniques.
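The statistical-analysis step mentioned above commonly applies the product rule across independent STR loci: per-locus genotype frequencies (2pq for heterozygotes, p² for homozygotes under Hardy-Weinberg assumptions) are multiplied to give a random match probability. A minimal sketch with made-up allele frequencies:

```python
def genotype_freq(p, q=None):
    """Hardy-Weinberg genotype frequency: p^2 (homozygote) or 2pq (heterozygote)."""
    return p * p if q is None else 2 * p * q

def random_match_probability(loci):
    """Product rule across independent loci."""
    rmp = 1.0
    for alleles in loci:
        rmp *= genotype_freq(*alleles)
    return rmp

# Three illustrative loci with hypothetical allele frequencies:
# heterozygote (0.1, 0.2), homozygote (0.05,), heterozygote (0.3, 0.15).
profile = [(0.1, 0.2), (0.05,), (0.3, 0.15)]
print(random_match_probability(profile))  # ~9e-06
```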

  9. Problem-Solving Test: Southwestern Blotting

    ERIC Educational Resources Information Center

    Szeberényi, József

    2014-01-01

    Terms to be familiar with before you start to solve the test: Southern blotting, Western blotting, restriction endonucleases, agarose gel electrophoresis, nitrocellulose filter, molecular hybridization, polyacrylamide gel electrophoresis, proto-oncogene, c-abl, Src-homology domains, tyrosine protein kinase, nuclear localization signal, cDNA,…

  10. Self-Assembling Molecular Logic Gates Based on DNA Crossover Tiles.

    PubMed

    Campbell, Eleanor A; Peterson, Evan; Kolpashchikov, Dmitry M

    2017-07-05

    DNA-based computational hardware has attracted ever-growing attention due to its potential to be useful in the analysis of complex mixtures of biological markers. Here we report the design of self-assembling logic gates that recognize DNA inputs and assemble into crossover tiles when the output signal is high; the crossover structures disassemble to form separate DNA stands when the output is low. The output signal can be conveniently detected by fluorescence using a molecular beacon probe as a reporter. AND, NOT, and OR logic gates were designed. We demonstrate that the gates can connect to each other to produce other logic functions. © 2017 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.

  11. A novel chaotic based image encryption using a hybrid model of deoxyribonucleic acid and cellular automata

    NASA Astrophysics Data System (ADS)

    Enayatifar, Rasul; Sadaei, Hossein Javedani; Abdullah, Abdul Hanan; Lee, Malrey; Isnin, Ismail Fauzi

    2015-08-01

    Many studies have been conducted on strengthening the security of digital images in order to protect such data while they are transmitted over the internet. This work proposes a new approach based on a hybrid model of the Tinkerbell chaotic map, deoxyribonucleic acid (DNA) and cellular automata (CA). DNA rules, a DNA-sequence XOR operator and CA rules are used simultaneously to encrypt the plain-image pixels. To determine the rule number in the DNA sequence and in the CA, a 2-dimensional Tinkerbell chaotic map is employed. Experimental results and computer simulations both confirm that the proposed scheme not only demonstrates outstanding encryption, but also resists various typical attacks.
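A minimal sketch of the chaotic-keystream idea, assuming the commonly used Tinkerbell map parameters and substituting plain byte-wise XOR for the paper's DNA-sequence XOR operator (byte XOR is the binary equivalent of XOR over 2-bit DNA symbols); all names and constants here are illustrative:

```python
def tinkerbell_stream(n, x=-0.72, y=-0.64, a=0.9, b=-0.6013, c=2.0, d=0.5):
    """Keystream bytes derived from the 2-D Tinkerbell chaotic map
    x' = x^2 - y^2 + a*x + b*y,  y' = 2*x*y + c*x + d*y."""
    out = []
    for _ in range(n):
        x, y = x * x - y * y + a * x + b * y, 2 * x * y + c * x + d * y
        out.append(int(abs(x) * 1e6) % 256)   # quantize a chaotic state to a byte
    return out

def xor_cipher(data, key):
    # XOR is its own inverse, so the same call encrypts and decrypts.
    return bytes(p ^ k for p, k in zip(data, key))

plain = bytes(range(16))                      # toy "image" pixels
key = tinkerbell_stream(len(plain))
cipher = xor_cipher(plain, key)
assert xor_cipher(cipher, key) == plain       # decryption recovers the plaintext
```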

  12. Materials Data on BaSe (SG:225) by Materials Project

    DOE Data Explorer

    Kristin Persson

    2014-11-02

    Computed materials data using density functional theory calculations. These calculations determine the electronic structure of bulk materials by solving approximations to the Schrodinger equation. For more information, see https://materialsproject.org/docs/calculations

  13. Materials Data on BaSe (SG:221) by Materials Project

    DOE Data Explorer

    Kristin Persson

    2014-11-02

    Computed materials data using density functional theory calculations. These calculations determine the electronic structure of bulk materials by solving approximations to the Schrodinger equation. For more information, see https://materialsproject.org/docs/calculations

  14. SETI@home

    Science.gov Websites

    SETI@home is a scientific experiment, based at UC Berkeley, that uses Internet-connected computers in the Search for Extraterrestrial Intelligence (SETI).

  15. OSG-GEM: Gene Expression Matrix Construction Using the Open Science Grid.

    PubMed

    Poehlman, William L; Rynge, Mats; Branton, Chris; Balamurugan, D; Feltus, Frank A

    2016-01-01

    High-throughput DNA sequencing technology has revolutionized the study of gene expression while introducing significant computational challenges for biologists. These computational challenges include access to sufficient computer hardware and functional data processing workflows. Both these challenges are addressed with our scalable, open-source Pegasus workflow for processing high-throughput DNA sequence datasets into a gene expression matrix (GEM) using computational resources available to U.S.-based researchers on the Open Science Grid (OSG). We describe the usage of the workflow (OSG-GEM), discuss workflow design, inspect performance data, and assess accuracy in mapping paired-end sequencing reads to a reference genome. A target OSG-GEM user is proficient with the Linux command line and possesses basic bioinformatics experience. The user may run this workflow directly on the OSG or adapt it to novel computing environments.

  16. OSG-GEM: Gene Expression Matrix Construction Using the Open Science Grid

    PubMed Central

    Poehlman, William L.; Rynge, Mats; Branton, Chris; Balamurugan, D.; Feltus, Frank A.

    2016-01-01

    High-throughput DNA sequencing technology has revolutionized the study of gene expression while introducing significant computational challenges for biologists. These computational challenges include access to sufficient computer hardware and functional data processing workflows. Both these challenges are addressed with our scalable, open-source Pegasus workflow for processing high-throughput DNA sequence datasets into a gene expression matrix (GEM) using computational resources available to U.S.-based researchers on the Open Science Grid (OSG). We describe the usage of the workflow (OSG-GEM), discuss workflow design, inspect performance data, and assess accuracy in mapping paired-end sequencing reads to a reference genome. A target OSG-GEM user is proficient with the Linux command line and possesses basic bioinformatics experience. The user may run this workflow directly on the OSG or adapt it to novel computing environments. PMID:27499617

  17. Insights into peptide nucleic acid (PNA) structural features: The crystal structure of a d-lysine-based chiral PNA–DNA duplex

    PubMed Central

    Menchise, Valeria; De Simone, Giuseppina; Tedeschi, Tullia; Corradini, Roberto; Sforza, Stefano; Marchelli, Rosangela; Capasso, Domenica; Saviano, Michele; Pedone, Carlo

    2003-01-01

    Peptide nucleic acids (PNAs) are oligonucleotide analogues in which the sugar-phosphate backbone has been replaced by a pseudopeptide skeleton. They bind DNA and RNA with high specificity and selectivity, leading to PNA–RNA and PNA–DNA hybrids more stable than the corresponding nucleic acid complexes. The binding affinity and selectivity of PNAs for nucleic acids can be modified by the introduction of stereogenic centers (such as d-Lys-based units) into the PNA backbone. To investigate the structural features of chiral PNAs, the structure of a PNA decamer containing three d-Lys-based monomers (namely H-GpnTpnApnGpnAdlTdlCdlApnCpnTpn-NH2, in which pn represents a pseudopeptide link and dl represents a d-Lys analogue) hybridized with its complementary antiparallel DNA has been solved at a 1.66-Å resolution by means of a single-wavelength anomalous diffraction experiment on a brominated derivative. The d-Lys-based chiral PNA–DNA (LPD) heteroduplex adopts the so-called P-helix conformation. From the substantial similarity between the PNA conformation in LPD and the conformations observed in other PNA structures, it can be concluded that PNAs possess intrinsic conformational preferences for the P-helix, and that their flexibility is rather restricted. The conformational rigidity of PNAs is enhanced by the presence of the chiral centers, limiting the ability of PNA strands to adopt other conformations and, ultimately, increasing the selectivity in molecular recognition. PMID:14512516

  18. Amoeba-inspired nanoarchitectonic computing: solving intractable computational problems using nanoscale photoexcitation transfer dynamics.

    PubMed

    Aono, Masashi; Naruse, Makoto; Kim, Song-Ju; Wakabayashi, Masamitsu; Hori, Hirokazu; Ohtsu, Motoichi; Hara, Masahiko

    2013-06-18

    Biologically inspired computing devices and architectures are expected to overcome the limitations of conventional technologies in terms of solving computationally demanding problems, adapting to complex environments, reducing energy consumption, and so on. We previously demonstrated that a primitive single-celled amoeba (a plasmodial slime mold), which exhibits complex spatiotemporal oscillatory dynamics and sophisticated computing capabilities, can be used to search for a solution to a very hard combinatorial optimization problem. We successfully extracted the essential spatiotemporal dynamics by which the amoeba solves the problem. This amoeba-inspired computing paradigm can be implemented by various physical systems that exhibit suitable spatiotemporal dynamics resembling the amoeba's problem-solving process. In this Article, we demonstrate that photoexcitation transfer phenomena in certain quantum nanostructures mediated by optical near-field interactions generate the amoebalike spatiotemporal dynamics and can be used to solve the satisfiability problem (SAT), which is the problem of judging whether a given logical proposition (a Boolean formula) is self-consistent. SAT is related to diverse application problems in artificial intelligence, information security, and bioinformatics and is a crucially important nondeterministic polynomial time (NP)-complete problem, which is believed to become intractable for conventional digital computers when the problem size increases. We show that our amoeba-inspired computing paradigm dramatically outperforms a conventional stochastic search method. These results indicate the potential for developing highly versatile nanoarchitectonic computers that realize powerful solution searching with low energy consumption.
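The "conventional stochastic search method" that the amoeba-inspired dynamics are benchmarked against can be illustrated with a WalkSAT-style local search, the textbook stochastic approach to SAT (a generic sketch, not the paper's actual baseline; the instance is illustrative):

```python
import random

def count_sat(clauses, assign):
    """Number of satisfied clauses; literal i means var i True, -i means False."""
    return sum(any((l > 0) == assign[abs(l)] for l in c) for c in clauses)

def walksat(clauses, n_vars, p=0.5, max_flips=10000, seed=1):
    """WalkSAT-style stochastic local search: repeatedly pick an unsatisfied
    clause and flip either a random variable in it or the best greedy flip."""
    rng = random.Random(seed)
    assign = {v: rng.random() < 0.5 for v in range(1, n_vars + 1)}
    for _ in range(max_flips):
        unsat = [c for c in clauses
                 if not any((l > 0) == assign[abs(l)] for l in c)]
        if not unsat:
            return assign                      # all clauses satisfied
        clause = rng.choice(unsat)
        if rng.random() < p:
            var = abs(rng.choice(clause))      # random-walk move
        else:                                  # greedy move: best resulting score
            def gain(v):
                assign[v] = not assign[v]
                score = count_sat(clauses, assign)
                assign[v] = not assign[v]
                return score
            var = max((abs(l) for l in clause), key=gain)
        assign[var] = not assign[var]
    return None                                # no model found within the budget

clauses = [[1, 2], [-1, 3], [-2, -3], [1, -3]]
model = walksat(clauses, 3)
print(model is not None)
```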

  19. Information Power Grid: Distributed High-Performance Computing and Large-Scale Data Management for Science and Engineering

    NASA Technical Reports Server (NTRS)

    Johnston, William E.; Gannon, Dennis; Nitzberg, Bill; Feiereisen, William (Technical Monitor)

    2000-01-01

    The term "Grid" refers to distributed, high performance computing and data handling infrastructure that incorporates geographically and organizationally dispersed, heterogeneous resources that are persistent and supported. The vision for NASA's Information Power Grid - a computing and data Grid - is that it will provide significant new capabilities to scientists and engineers by facilitating routine construction of information based problem solving environments / frameworks that will knit together widely distributed computing, data, instrument, and human resources into just-in-time systems that can address complex and large-scale computing and data analysis problems. IPG development and deployment is addressing requirements obtained by analyzing a number of different application areas, in particular from the NASA Aero-Space Technology Enterprise. This analysis has focused primarily on two types of users: The scientist / design engineer whose primary interest is problem solving (e.g., determining wing aerodynamic characteristics in many different operating environments), and whose primary interface to IPG will be through various sorts of problem solving frameworks. The second type of user is the tool designer: the computational scientists who convert physics and mathematics into code that can simulate the physical world. These are the two primary users of IPG, and they have rather different requirements. This paper describes the current state of IPG (the operational testbed), the set of capabilities being put into place for the operational prototype IPG, as well as some of the longer term R&D tasks.

  20. Parallel Algorithm Solves Coupled Differential Equations

    NASA Technical Reports Server (NTRS)

    Hayashi, A.

    1987-01-01

    Numerical methods adapted to concurrent processing. Algorithm solves set of coupled partial differential equations by numerical integration. Adapted to run on hypercube computer, algorithm separates problem into smaller problems solved concurrently. Increase in computing speed with concurrent processing over that achievable with conventional sequential processing appreciable, especially for large problems.

  1. Modeling photoionization of aqueous DNA and its components.

    PubMed

    Pluhařová, Eva; Slavíček, Petr; Jungwirth, Pavel

    2015-05-19

    Radiation damage to DNA is usually considered in terms of UVA and UVB radiation. These ultraviolet rays, which are part of the solar spectrum, can indeed cause chemical lesions in DNA, triggered by photoexcitation particularly in the UVB range. Damage can, however, be also caused by higher energy radiation, which can ionize directly the DNA or its immediate surroundings, leading to indirect damage. Thanks to absorption in the atmosphere, the intensity of such ionizing radiation is negligible in the solar spectrum at the surface of Earth. Nevertheless, such an ionizing scenario can become dangerously plausible for astronauts or flight personnel, as well as for persons present at nuclear power plant accidents. On the beneficial side, ionizing radiation is employed as means for destroying the DNA of cancer cells during radiation therapy. Quantitative information about ionization of DNA and its components is important not only for DNA radiation damage, but also for understanding redox properties of DNA in redox sensing or labeling, as well as charge migration along the double helix in nanoelectronics applications. Until recently, the vast majority of experimental and computational data on DNA ionization was pertinent to its components in the gas phase, which is far from its native aqueous environment. The situation has, however, changed for the better due to the advent of photoelectron spectroscopy in liquid microjets and its most recent application to photoionization of aqueous nucleosides, nucleotides, and larger DNA fragments. Here, we present a consistent and efficient computational methodology, which allows one to accurately evaluate ionization energies and model photoelectron spectra of aqueous DNA and its individual components. After careful benchmarking, the method based on density functional theory and its time-dependent variant with properly chosen hybrid functionals and polarizable continuum solvent model provides ionization energies with an accuracy of 0.2-0.3 eV, allowing for faithful modeling and interpretation of DNA photoionization. The key finding is that the aqueous medium is remarkably efficient in screening the interactions within DNA such that, unlike in the gas phase, ionization of a base, nucleoside, or nucleotide depends only very weakly on the particular DNA context. An exception is the electronic interaction between neighboring bases, which can lead to sequence-specific effects, such as a partial delocalization of the cationic hole upon ionization enabled by the presence of adjacent bases of the same type.

  2. Acceleration of image-based resolution modelling reconstruction using an expectation maximization nested algorithm.

    PubMed

    Angelis, G I; Reader, A J; Markiewicz, P J; Kotasidis, F A; Lionheart, W R; Matthews, J C

    2013-08-07

    Recent studies have demonstrated the benefits of a resolution model within iterative reconstruction algorithms in an attempt to account for effects that degrade the spatial resolution of the reconstructed images. However, these algorithms suffer from slower convergence rates, compared to algorithms where no resolution model is used, due to the additional need to solve an image deconvolution problem. In this paper, a recently proposed algorithm, which decouples the tomographic and image deconvolution problems within an image-based expectation maximization (EM) framework, was evaluated. This separation is convenient, because more computational effort can be placed on the image deconvolution problem and therefore accelerate convergence. Since the computational cost of solving the image deconvolution problem is relatively small, multiple image-based EM iterations do not significantly increase the overall reconstruction time. The proposed algorithm was evaluated using 2D simulations, as well as measured 3D data acquired on the high-resolution research tomograph. Results showed that bias reduction can be accelerated by interleaving multiple iterations of the image-based EM algorithm solving the resolution model problem, with a single EM iteration solving the tomographic problem. Significant improvements were observed particularly for voxels that were located on the boundaries between regions of high contrast within the object being imaged and for small regions of interest, where resolution recovery is usually more challenging. Minor differences were observed using the proposed nested algorithm, compared to the single iteration normally performed, when an optimal number of iterations are performed for each algorithm. However, using the proposed nested approach convergence is significantly accelerated enabling reconstruction using far fewer tomographic iterations (up to 70% fewer iterations for small regions). 
Nevertheless, the optimal number of nested image-based EM iterations is difficult to define and should be selected according to the given application.
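The image-based EM step at the heart of such nested schemes is, for a shift-invariant resolution model, the familiar Richardson-Lucy deconvolution update. A 1-D toy sketch of that inner iteration (illustrative only, not the authors' tomographic reconstruction code):

```python
import numpy as np

def em_deconvolve(blurred, psf, n_iter=50):
    """Image-space EM (Richardson-Lucy) deconvolution: the inner update that
    the nested algorithm applies several times per tomographic iteration."""
    estimate = np.full_like(blurred, blurred.mean())
    psf_flipped = psf[::-1]                       # correlation = convolution with flipped PSF
    for _ in range(n_iter):
        predicted = np.convolve(estimate, psf, mode="same")
        ratio = blurred / np.maximum(predicted, 1e-12)
        estimate *= np.convolve(ratio, psf_flipped, mode="same")
    return estimate

# 1-D toy example: two point sources blurred by a small normalized PSF.
psf = np.array([0.25, 0.5, 0.25])
truth = np.zeros(32)
truth[10], truth[20] = 4.0, 2.0
blurred = np.convolve(truth, psf, mode="same")
recovered = em_deconvolve(blurred, psf)
print(int(np.argmax(recovered)))
```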

  3. Solving Constraint Satisfaction Problems with Networks of Spiking Neurons

    PubMed Central

    Jonke, Zeno; Habenschuss, Stefan; Maass, Wolfgang

    2016-01-01

    Networks of neurons in the brain apply—unlike processors in our current generation of computer hardware—an event-based processing strategy, where short pulses (spikes) are emitted sparsely by neurons to signal the occurrence of an event at a particular point in time. Such spike-based computations promise to be substantially more power-efficient than traditional clocked processing schemes. However, it turns out to be surprisingly difficult to design networks of spiking neurons that can solve difficult computational problems on the level of single spikes, rather than rates of spikes. We present here a new method for designing networks of spiking neurons via an energy function. Furthermore, we show how the energy function of a network of stochastically firing neurons can be shaped in a transparent manner by composing the networks of simple stereotypical network motifs. We show that this design approach enables networks of spiking neurons to produce approximate solutions to difficult (NP-hard) constraint satisfaction problems from the domains of planning/optimization and verification/logical inference. The resulting networks employ noise as a computational resource. Nevertheless, the timing of spikes plays an essential role in their computations. Furthermore, networks of spiking neurons carry out a more efficient stochastic search for good solutions to the Traveling Salesman Problem compared with stochastic artificial neural networks (Boltzmann machines) and Gibbs sampling. PMID:27065785

  4. In Silico Design and Characterization of DNA Nanomaterials

    NASA Astrophysics Data System (ADS)

    Nash, Jessica A.

    Deoxyribonucleic acid (DNA) and ribonucleic acid (RNA) function biologically as carriers of genetic information. However, due to their ability to self-assemble via base pairing, nucleic acid molecules have become widely used in nanotechnology. In this dissertation, in silico techniques are used to probe the structure-property relationships of nucleic acid based nanomaterials. In Part 1, computational methods are employed to formulate nanoparticle design rules for applications in nucleic acid packaging and delivery. Nanoparticles (NPs) play increasingly important roles in nanomedicine, where the surface chemistry allows for control over interactions with biomolecules. Understanding how DNA and RNA compaction occurs is relevant to biological systems and systems in nanotechnology, and critical for the development of more efficient and effective nanoparticle carriers. Computational modeling allows for the description of bio-nano systems and processes with unprecedented detail, and can provide insights and guidelines for the creation of new nanomaterials. Using all-atom molecular dynamics simulations, the effect of nanoparticle surface chemistry, size, and solvent ionic strength on interactions with DNA and RNA are reported. In Chapter 2, a systematic study of the effect of nanoparticle charge on ability to bend and wrap short sequences of DNA and RNA is presented. To cause bending of DNA, a nanoparticle charge of at least +30 is required. Higher nanoparticle charges cause a greater degree of compaction. For RNA, however, charged ligand end-groups bind internally and prevent RNA bending. Nanoparticles were designed to test the influence of NP ligand shell shape and length on RNA binding using these results. In Chapter 3, all-atom simulation of NPs with long double stranded RNA are reported. Simulations show that by shortening NP ligand length, double stranded RNA can be wrapped. In Chapter 4, we consider compaction of long DNA by nanoparticles. 
NPs with +120 charge can fully compact DNA, but the wrapping is unordered on the surface. Chapter 5 reports the influence of NPs on the structure of single stranded DNA and RNA, showing that NPs have a greater influence on poly-pyrimidine strands than poly-purine strands, and can interrupt hydrogen bonds and pi-pi stacking. In Part II of this dissertation, computational techniques are applied to study DNA tiles and origami. Due to base-pairing DNA can be used to place objects with nanoscale precision, with applications in nanoscience and nanomedicine. Chapter 6 presents the development of anticoagulants using DNA weave tiles and aptamers. More effective anticoagulants can be created by varying the DNA aptamer used, and increasing local concentration by attaching aptamers to a DNA tile. Molecular dynamics simulations show that increasing the number of helices on a DNA weave tile increases tile flexibility. Chapter 7 introduces a tool developed for visualization of DNA origami design. We develop circle map visualizations for DNA origami and maps of the base composition, allowing for visualizations of DNA origami that were not previously available. This tool is currently available online via nanohub (open source) for users around the world. The results reported here provide a fundamental understanding of the behavior of DNA systems in nanotechnology. Results are expected to aid in the development of more effective NP compaction agents, DNA delivery vehicles, and DNA origami design.

  5. Mechanistic Basis for the Bypass of a Bulky DNA Adduct Catalyzed by a Y-Family DNA Polymerase

    PubMed Central

    Vyas, Rajan; Efthimiopoulos, Georgia; Tokarsky, E. John; Malik, Chanchal K.; Basu, Ashis K.; Suo, Zucai

    2015-01-01

    1-Nitropyrene (1-NP), an environmental pollutant, induces DNA damage in vivo and is considered to be carcinogenic. The DNA adducts formed by the 1-NP metabolites stall replicative DNA polymerases but are presumably bypassed by error-prone Y-family DNA polymerases at the expense of replication fidelity and efficiency in vivo. Our running start assays confirmed that a site-specifically placed 8-(deoxyguanosin-N2-yl)-1-aminopyrene (dG1,8), one of the DNA adducts derived from 1-NP, can be bypassed by Sulfolobus solfataricus DNA polymerase IV (Dpo4), although this representative Y-family enzyme was paused strongly by the lesion. Pre-steady-state kinetic assays were employed to determine the low nucleotide incorporation fidelity and establish a minimal kinetic mechanism for the dG1,8 bypass by Dpo4. To reveal a structural basis for dCTP incorporation opposite dG1,8, we solved the crystal structures of the complexes of Dpo4 and DNA containing a templating dG1,8 lesion in the absence or presence of dCTP. The Dpo4·DNA-dG1,8 binary structure shows that the aminopyrene moiety of the lesion stacks against the primer/template junction pair, while its dG moiety projects into the cleft between the Finger and Little Finger domains of Dpo4. In the Dpo4·DNA-dG1,8·dCTP ternary structure, the aminopyrene moiety of the dG1,8 lesion is sandwiched between the nascent and junction base pairs, while its base is present in the major groove. Moreover, dCTP forms a Watson–Crick base pair with dG, two nucleotides upstream from the dG1,8 site, creating a complex for “-2” frameshift mutation. Mechanistically, these crystal structures provide additional insight into the aforementioned minimal kinetic mechanism. PMID:26327169

  6. Improving Learning, Retention of Knowledge, and Attitude of Students in a Vocational-Technical College through Interactive Computer Technology.

    ERIC Educational Resources Information Center

    Hitchcock, A. Allen

    The problem that this practicum attempted to solve was that students in a vocational-technical college tended to underachieve in courses that were mainly cognitive in nature, as evidenced by low overall grade-point course averages and other measures. The researcher designed computer-based simulation/gaming instruction that aimed to increase…

  7. The Use of Computer-Based Data Banks in the Teaching of Clinical Problem-Solving and Medical Case Management to Veterinary Students

    ERIC Educational Resources Information Center

    Conzelman, Gaylord M.; And Others

    1975-01-01

    Describes a program at the School of Veterinary Medicine, University of California at Davis, to give students experience in the diagnosis and management of urinary tract diseases. Students request from computer data banks the laboratory information they deem most useful in the medical management of each clinical problem. (JT)

  8. A high-order multiscale finite-element method for time-domain acoustic-wave modeling

    NASA Astrophysics Data System (ADS)

    Gao, Kai; Fu, Shubin; Chung, Eric T.

    2018-05-01

    Accurate and efficient wave equation modeling is vital for many applications in fields such as acoustics, electromagnetics, and seismology. However, solving the wave equation in large-scale and highly heterogeneous models is usually computationally expensive because the computational cost is directly proportional to the number of grid points in the model. We develop a novel high-order multiscale finite-element method to reduce the computational cost of time-domain acoustic-wave equation numerical modeling by solving the wave equation on a coarse mesh based on the multiscale finite-element theory. In contrast to existing multiscale finite-element methods that use only first-order multiscale basis functions, our new method constructs high-order multiscale basis functions from local elliptic problems which are closely related to the Gauss-Lobatto-Legendre quadrature points in a coarse element. Essentially, these basis functions are not only determined by the order of Legendre polynomials, but also by local medium properties, and therefore can effectively convey the fine-scale information to the coarse-scale solution with high-order accuracy. Numerical tests show that our method can significantly reduce the computation time while maintaining high accuracy for wave equation modeling in highly heterogeneous media by solving the corresponding discrete system only on the coarse mesh with the new high-order multiscale basis functions.
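    The coarse-element nodes the abstract refers to are the Gauss-Lobatto-Legendre (GLL) points: the interval endpoints plus the roots of the derivative of a Legendre polynomial. As a small illustrative sketch (standard numerics, not the authors' code), they can be computed with NumPy's polynomial module:

    ```python
    # Sketch: compute the n GLL points on [-1, 1] as the endpoints plus the
    # roots of P'_{n-1}, where P_{n-1} is the (n-1)-th Legendre polynomial.
    import numpy as np
    from numpy.polynomial import legendre

    def gll_points(n):
        """Return the n Gauss-Lobatto-Legendre points on [-1, 1] (n >= 2)."""
        c = np.zeros(n)
        c[-1] = 1.0  # coefficient vector selecting P_{n-1}
        interior = np.sort(legendre.legroots(legendre.legder(c)))  # roots of P'_{n-1}
        return np.concatenate(([-1.0], interior, [1.0]))
    ```

    For n = 3 this yields {-1, 0, 1}, the familiar Lobatto nodes of a quadratic element.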

  9. A high-order multiscale finite-element method for time-domain acoustic-wave modeling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gao, Kai; Fu, Shubin; Chung, Eric T.

    Accurate and efficient wave equation modeling is vital for many applications in fields such as acoustics, electromagnetics, and seismology. However, solving the wave equation in large-scale and highly heterogeneous models is usually computationally expensive because the computational cost is directly proportional to the number of grid points in the model. We develop a novel high-order multiscale finite-element method to reduce the computational cost of time-domain acoustic-wave equation numerical modeling by solving the wave equation on a coarse mesh based on the multiscale finite-element theory. In contrast to existing multiscale finite-element methods that use only first-order multiscale basis functions, our new method constructs high-order multiscale basis functions from local elliptic problems which are closely related to the Gauss–Lobatto–Legendre quadrature points in a coarse element. Essentially, these basis functions are not only determined by the order of Legendre polynomials, but also by local medium properties, and therefore can effectively convey the fine-scale information to the coarse-scale solution with high-order accuracy. Numerical tests show that our method can significantly reduce the computation time while maintaining high accuracy for wave equation modeling in highly heterogeneous media by solving the corresponding discrete system only on the coarse mesh with the new high-order multiscale basis functions.

  10. A high-order multiscale finite-element method for time-domain acoustic-wave modeling

    DOE PAGES

    Gao, Kai; Fu, Shubin; Chung, Eric T.

    2018-02-04

    Accurate and efficient wave equation modeling is vital for many applications in fields such as acoustics, electromagnetics, and seismology. However, solving the wave equation in large-scale and highly heterogeneous models is usually computationally expensive because the computational cost is directly proportional to the number of grid points in the model. We develop a novel high-order multiscale finite-element method to reduce the computational cost of time-domain acoustic-wave equation numerical modeling by solving the wave equation on a coarse mesh based on the multiscale finite-element theory. In contrast to existing multiscale finite-element methods that use only first-order multiscale basis functions, our new method constructs high-order multiscale basis functions from local elliptic problems which are closely related to the Gauss–Lobatto–Legendre quadrature points in a coarse element. Essentially, these basis functions are not only determined by the order of Legendre polynomials, but also by local medium properties, and therefore can effectively convey the fine-scale information to the coarse-scale solution with high-order accuracy. Numerical tests show that our method can significantly reduce the computation time while maintaining high accuracy for wave equation modeling in highly heterogeneous media by solving the corresponding discrete system only on the coarse mesh with the new high-order multiscale basis functions.

  11. Student Conceptions about the DNA Structure within a Hierarchical Organizational Level: Improvement by Experiment- and Computer-Based Outreach Learning

    ERIC Educational Resources Information Center

    Langheinrich, Jessica; Bogner, Franz X.

    2015-01-01

    As non-scientific conceptions interfere with learning processes, teachers need both to know about them and to address them in their classrooms. For our study, based on 182 eleventh graders, we analyzed the level of conceptual understanding by implementing the "draw and write" technique during a computer-supported gene technology module.…

  12. Integrating Micro-computers with a Centralized DBMS: ORACLE, SEED AND INGRES

    NASA Technical Reports Server (NTRS)

    Hoerger, J.

    1984-01-01

    Users of ADABAS, a relational-like database management system with its database programming language NATURAL, are acquiring microcomputers in hopes of solving their individual word processing, office automation, decision support, and simple data processing problems. As processor speeds, memory sizes, and disk storage capacities increase, individual departments begin to maintain "their own" database on "their own" microcomputer. This situation can adversely affect several of the primary goals set for implementing a centralized DBMS. In order to avoid this potential problem, these microcomputers must be integrated with the centralized DBMS. An easy-to-use and flexible means for transferring logical database files between the central database machine and microcomputers must be provided. Some of the problems encountered in an effort to accomplish this integration, and possible solutions, are discussed.

  13. System design and implementation of digital-image processing using computational grids

    NASA Astrophysics Data System (ADS)

    Shen, Zhanfeng; Luo, Jiancheng; Zhou, Chenghu; Huang, Guangyu; Ma, Weifeng; Ming, Dongping

    2005-06-01

    As a special type of digital image, remotely sensed images are playing increasingly important roles in our daily lives. Because of the enormous amounts of data involved, and the difficulties of data processing and transfer, an important issue for current computer and geo-science experts is developing internet technology to implement rapid remotely sensed image processing. Computational grids are able to solve this problem effectively. These networks of computer workstations enable the sharing of data and resources, and are used by computer experts to correct imbalances in network resources and lopsided usage. In China, computational grids combined with spatial-information-processing technology have formed a new technology: namely, spatial-information grids. In the field of remotely sensed images, spatial-information grids work more effectively for network computing, data processing, resource sharing, task cooperation and so on. This paper focuses mainly on the application of computational grids to digital-image processing. First, we describe the architecture of digital-image processing on the basis of computational grids; its implementation is then discussed in detail with respect to the technology of middleware. The whole network-based intelligent image-processing system is evaluated on the basis of the experimental analysis of remotely sensed image-processing tasks; the results confirm the feasibility of the application of computational grids to digital-image processing.

  14. Integrating Numerical Computation into the Modeling Instruction Curriculum

    ERIC Educational Resources Information Center

    Caballero, Marcos D.; Burk, John B.; Aiken, John M.; Thoms, Brian D.; Douglas, Scott S.; Scanlon, Erin M.; Schatz, Michael F.

    2014-01-01

    Numerical computation (the use of a computer to solve, simulate, or visualize a physical problem) has fundamentally changed the way scientific research is done. Systems that are too difficult to solve in closed form are probed using computation. Experiments that are impossible to perform in the laboratory are studied numerically. Consequently, in…

  15. Retrieving and Indexing Spatial Data in the Cloud Computing Environment

    NASA Astrophysics Data System (ADS)

    Wang, Yonggang; Wang, Sheng; Zhou, Daliang

    In order to solve the drawbacks of spatial data storage in common Cloud Computing platforms, we design and present a framework for retrieving, indexing, accessing and managing spatial data in the Cloud environment. An interoperable spatial data object model is provided based on the Simple Feature Coding Rules from the OGC, such as Well Known Binary (WKB) and Well Known Text (WKT). The classic spatial indexing algorithms, such as Quad-Tree and R-Tree, are re-designed for the Cloud Computing environment. Finally, we develop prototype software based on Google App Engine to implement the proposed model.
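    The record names Quad-Tree indexing but gives no code. As a rough, hypothetical illustration of the indexing idea (the class name, node capacity of 4, and half-open bounds convention are all assumptions, not the paper's design), a minimal point quadtree with insert and rectangular range query might look like this:

    ```python
    # Minimal point quadtree sketch: leaves hold up to CAPACITY points,
    # then split into four equal sub-quadrants.
    class QuadTree:
        CAPACITY = 4  # max points per leaf before splitting (assumed)

        def __init__(self, x0, y0, x1, y1):
            self.bounds = (x0, y0, x1, y1)  # half-open region [x0,x1) x [y0,y1)
            self.points = []
            self.children = None  # four sub-quadrants once split

        def insert(self, x, y):
            x0, y0, x1, y1 = self.bounds
            if not (x0 <= x < x1 and y0 <= y < y1):
                return False  # point lies outside this node
            if self.children is None:
                if len(self.points) < self.CAPACITY:
                    self.points.append((x, y))
                    return True
                self._split()
            return any(c.insert(x, y) for c in self.children)

        def _split(self):
            x0, y0, x1, y1 = self.bounds
            mx, my = (x0 + x1) / 2, (y0 + y1) / 2
            self.children = [QuadTree(x0, y0, mx, my), QuadTree(mx, y0, x1, my),
                             QuadTree(x0, my, mx, y1), QuadTree(mx, my, x1, y1)]
            for p in self.points:  # push existing points down one level
                any(c.insert(*p) for c in self.children)
            self.points = []

        def query(self, qx0, qy0, qx1, qy1):
            """Return all indexed points inside the query rectangle."""
            x0, y0, x1, y1 = self.bounds
            if qx1 <= x0 or qx0 >= x1 or qy1 <= y0 or qy0 >= y1:
                return []  # query rectangle misses this node entirely
            hits = [(x, y) for x, y in self.points
                    if qx0 <= x < qx1 and qy0 <= y < qy1]
            if self.children:
                for c in self.children:
                    hits += c.query(qx0, qy0, qx1, qy1)
            return hits
    ```

    A cloud deployment would additionally shard nodes across storage entities; that layer is omitted here.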

  16. Computing border bases using mutant strategies

    NASA Astrophysics Data System (ADS)

    Ullah, E.; Abbas Khan, S.

    2014-01-01

    Border bases, a generalization of Gröbner bases, have actively been addressed during recent years due to their applicability to industrial problems. In cryptography and coding theory, a useful application of border bases is to solve zero-dimensional systems of polynomial equations over finite fields, which motivates us to develop optimizations of the algorithms that compute border bases. In 2006, Kehrein and Kreuzer formulated the Border Basis Algorithm (BBA), an algorithm which allows the computation of border bases that relate to a degree-compatible term ordering. In 2007, J. Ding et al. introduced mutant strategies based on finding special lower-degree polynomials in the ideal. The mutant strategies aim to distinguish special lower-degree polynomials (mutants) from the other polynomials and give them priority in the process of generating new polynomials in the ideal. In this paper we develop hybrid algorithms that use the ideas of J. Ding et al. involving the concept of mutants to optimize the Border Basis Algorithm for solving systems of polynomial equations over finite fields. In particular, we recall a version of the Border Basis Algorithm known as the Improved Border Basis Algorithm and propose two hybrid algorithms, called MBBA and IMBBA. The new mutant variants provide us space efficiency as well as time efficiency. The efficiency of these newly developed hybrid algorithms is discussed using standard cryptographic examples.

  17. The Essential Component in DNA-Based Information Storage System: Robust Error-Tolerating Module

    PubMed Central

    Yim, Aldrin Kay-Yuen; Yu, Allen Chi-Shing; Li, Jing-Woei; Wong, Ada In-Chun; Loo, Jacky F. C.; Chan, King Ming; Kong, S. K.; Yip, Kevin Y.; Chan, Ting-Fung

    2014-01-01

    The size of digital data is ever increasing and is expected to grow to 40,000 EB by 2020, yet the estimated global information storage capacity in 2011 is <300 EB, indicating that most of the data are transient. DNA, as a very stable nano-molecule, is an ideal massive storage device for long-term data archiving. The two most notable illustrations are from Church et al. and Goldman et al., whose approaches are well-optimized for most sequencing platforms – short synthesized DNA fragments without homopolymers. Here, we suggest improvements on error handling methodology that could enable the integration of DNA-based computational processes, e.g., algorithms based on self-assembly of DNA. As a proof of concept, a picture of size 438 bytes was encoded to DNA with low-density parity-check error-correction code. We salvaged a significant portion of sequencing reads with mutations generated during DNA synthesis and sequencing and successfully reconstructed the entire picture. A modular programming framework, DNAcodec, with an eXtensible Markup Language-based data format was also introduced. Our experiments demonstrated the practicability of long DNA message recovery with high error tolerance, which opens the field to biocomputing and synthetic biology. PMID:25414846
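    To make the encoding step concrete, here is a hypothetical sketch of the basic byte-to-nucleotide mapping (2 bits per base). The LDPC error-correction layer that the record actually emphasizes is omitted, and the 00→A … 11→T convention is an assumption for illustration, not DNAcodec's actual scheme:

    ```python
    # Sketch: map bytes to DNA bases at 2 bits per base and back.
    BASES = "ACGT"  # 00 -> A, 01 -> C, 10 -> G, 11 -> T (assumed convention)

    def encode(data: bytes) -> str:
        """Map each byte to four nucleotides, most significant bits first."""
        return "".join(BASES[(b >> s) & 0b11] for b in data for s in (6, 4, 2, 0))

    def decode(seq: str) -> bytes:
        """Invert encode(); assumes len(seq) is a multiple of 4."""
        out = bytearray()
        for i in range(0, len(seq), 4):
            b = 0
            for ch in seq[i:i + 4]:
                b = (b << 2) | BASES.index(ch)
            out.append(b)
        return bytes(out)
    ```

    A production scheme would additionally rotate the mapping to avoid homopolymer runs, as the Goldman et al. approach does.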

  18. [The application of new technologies to solving maths problems for students with learning disabilities: the 'underwater school'].

    PubMed

    Miranda-Casas, A; Marco-Taverner, R; Soriano-Ferrer, M; Melià de Alba, A; Simó-Casañ, P

    2008-01-01

    Different procedures have demonstrated efficacy in teaching cognitive and metacognitive strategies for problem solving in mathematics, and some studies have used computer-based problem-solving instructional programs. Our aim was to analyze, in students with learning disabilities, the efficacy of cognitive strategies training for problem solving with three instructional delivery formats: a teacher-directed (T-D) program, a computer-assisted instructional (CAI) program, and a combined program (T-D + CAI). Forty-four children with mathematics learning disabilities, between 8 and 10 years old, participated in this study. The children were randomly assigned to one of the three instructional formats or to a control group without cognitive strategies training. In the three instructional conditions that were compared, all the students learned linguistic and visual cognitive strategies for problem solving through the self-instructional procedure. Several types of measurements were used for analyzing the possible differential efficacy of the three instructional methods implemented: problem-solving tests, marks in mathematics, an internal achievement responsibility scale, and teacher ratings of school behaviors. Our findings show that the T-D training group and the T-D + CAI group improved significantly on math word problem solving and on marks in mathematics from pre- to post-testing. In addition, the results indicated that the students of the T-D + CAI group solved more real-life problems and developed more internal attributions compared to both the control and CAI groups. Finally, with regard to school behaviors, improvements in school adjustment and learning problems were observed in the students of the group with the combined instructional format (T-D + CAI).

  19. Computer-Based Instruction Research: Implications for Design.

    ERIC Educational Resources Information Center

    Ross, Steven M.; And Others

    The development and evaluation of several microcomputer-based strategies designed to facilitate learning how to solve mathematics word problems by personalizing examples in accord with individuals' background and interests are described in this paper. The first of two studies conducted with fifth and sixth grade students to evaluate these…

  20. Problem-Solving Test: Catalytic Activities of a Human Nuclear Enzyme

    ERIC Educational Resources Information Center

    Szeberenyi, Jozsef

    2011-01-01

    Terms to be familiar with before you start to solve the test: ion exchange chromatography, polynucleotides, oligonucleotides, radioactive labeling, template, primer, DNA polymerase, reverse transcriptase, helicase, nucleoside triphosphates, nucleoside diphosphates, nucleoside monophosphates, nucleosides, 5'-end and 3'-end, bacteriophage,…

  1. Fast algorithms for computing phylogenetic divergence time.

    PubMed

    Crosby, Ralph W; Williams, Tiffani L

    2017-12-06

    The inference of species divergence time is a key step in most phylogenetic studies. Methods have been available for the last ten years to perform the inference, but the performance of the methods does not yet scale well to studies with hundreds of taxa and thousands of DNA base pairs. For example, a study of 349 primate taxa was estimated to require over 9 months of processing time. In this work, we present a new algorithm, AncestralAge, that significantly improves the performance of the divergence time process. As part of AncestralAge, we demonstrate a new method for the computation of phylogenetic likelihood, and our experiments show a 90% improvement in likelihood computation time on the aforementioned dataset of 349 primate taxa with over 60,000 DNA base pairs. Additionally, we show that our new method for the computation of the Bayesian prior on node ages reduces the running time for this computation on the 349-taxa dataset by 99%. Through the use of these new algorithms we open up the ability to perform divergence time inference on large phylogenetic studies.

  2. Holes influence the mutation spectrum of human mitochondrial DNA

    NASA Astrophysics Data System (ADS)

    Villagran, Martha; Miller, John

    Mutations drive evolution and disease, showing highly non-random patterns of variant frequency vs. nucleotide position. We use computational DNA hole spectroscopy [M.Y. Suarez-Villagran & J.H. Miller, Sci. Rep. 5, 13571 (2015)] to reveal sites of enhanced hole probability in selected regions of human mitochondrial DNA. A hole is a mobile site of positive charge created when an electron is removed, for example by radiation or contact with a mutagenic agent. The hole spectra are quantum mechanically computed using a two-stranded tight binding model of DNA. We observe significant correlation between spectra of hole probabilities and of genetic variation frequencies from the MITOMAP database. These results suggest that hole-enhanced mutation mechanisms exert a substantial, perhaps dominant, influence on mutation patterns in DNA. One example is where a trapped hole induces a hydrogen bond shift, known as tautomerization, which then triggers a base-pair mismatch during replication. Our results deepen overall understanding of sequence specific mutation rates, encompassing both hotspots and cold spots, which drive molecular evolution.
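    The flavor of such a calculation can be sketched with a toy single-strand tight-binding model (the authors use a two-stranded model, so this is an illustration, not their method): diagonalize a Hamiltonian with per-base on-site energies and nearest-neighbor hopping, and read the hole density off the highest-energy orbital. The on-site values and the hopping strength below are assumed round numbers, not fitted parameters:

    ```python
    # Toy 1D tight-binding sketch: the hole localizes on the least-bound
    # (highest-energy) orbital, which guanine runs tend to dominate.
    import numpy as np

    # Approximate ionization energies per base in eV (assumed illustrative values).
    ONSITE = {"G": -7.75, "A": -8.24, "C": -8.87, "T": -9.14}

    def hole_probabilities(seq, t=0.2):
        """Per-site hole probability from the highest-energy eigenstate."""
        n = len(seq)
        H = np.diag([ONSITE[b] for b in seq]).astype(float)
        for i in range(n - 1):
            H[i, i + 1] = H[i + 1, i] = t  # uniform nearest-neighbor hopping (assumed)
        vals, vecs = np.linalg.eigh(H)
        homo = vecs[:, np.argmax(vals)]  # orbital hosting the hole
        return homo ** 2  # normalized site-by-site probability
    ```

    Running this on a sequence with a GGG run concentrates the hole probability on the guanine triplet, consistent with guanine runs acting as hole traps.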

  3. Time-dependent jet flow and noise computations

    NASA Technical Reports Server (NTRS)

    Berman, C. H.; Ramos, J. I.; Karniadakis, G. E.; Orszag, S. A.

    1990-01-01

    Methods for computing jet turbulence noise based on the time-dependent solution of Lighthill's (1952) differential equation are demonstrated. A key element in this approach is a flow code for solving the time-dependent Navier-Stokes equations at relatively high Reynolds numbers. Jet flow results at Re = 10,000 are presented here. This code combines a computationally efficient spectral element technique and a new self-consistent turbulence subgrid model to supply values for Lighthill's turbulence noise source tensor.

  4. Science modelling in pre-calculus: how to make mathematics problems contextually meaningful

    NASA Astrophysics Data System (ADS)

    Sokolowski, Andrzej; Yalvac, Bugrahan; Loving, Cathleen

    2011-04-01

    'Use of mathematical representations to model and interpret physical phenomena and solve problems is one of the major teaching objectives in high school math curriculum' (National Council of Teachers of Mathematics (NCTM), Principles and Standards for School Mathematics, NCTM, Reston, VA, 2000). Commonly used pre-calculus textbooks provide a wide range of application problems. However, these problems focus students' attention on evaluating or solving pre-arranged formulas for given values. The role of scientific content is reduced to provide a background for these problems instead of being sources of data gathering for inducing mathematical tools. Students are neither required to construct mathematical models based on the contexts nor are they asked to validate or discuss the limitations of applied formulas. Using these contexts, the instructor may think that he/she is teaching problem solving, where in reality he/she is teaching algorithms of the mathematical operations (G. Kulm (ed.), New directions for mathematics assessment, in Assessing Higher Order Thinking in Mathematics, Erlbaum, Hillsdale, NJ, 1994, pp. 221-240). Without a thorough representation of the physical phenomena and the mathematical modelling processes undertaken, problem solving unintentionally appears as simple algorithmic operations. In this article, we deconstruct the representations of mathematics problems from selected pre-calculus textbooks and explicate their limitations. We argue that the structure and content of those problems limits students' coherent understanding of mathematical modelling, and this could result in weak student problem-solving skills. Simultaneously, we explore the ways to enhance representations of those mathematical problems, which we have characterized as lacking a meaningful physical context and limiting coherent student understanding. 
In light of our discussion, we recommend an alternative to strengthen the process of teaching mathematical modelling - utilization of computer-based science simulations. Although there are several exceptional computer-based science simulations designed for mathematics classes (see, e.g. Kinetic Book (http://www.kineticbooks.com/) or Gizmos (http://www.explorelearning.com/)), we concentrate mainly on the PhET Interactive Simulations developed at the University of Colorado at Boulder (http://phet.colorado.edu/) in generating our argument that computer simulations more accurately represent the contextual characteristics of scientific phenomena than their textual descriptions.

  5. Multiprotein DNA Looping

    NASA Astrophysics Data System (ADS)

    Vilar, Jose M. G.; Saiz, Leonor

    2006-06-01

    DNA looping plays a fundamental role in a wide variety of biological processes, providing the backbone for long range interactions on DNA. Here we develop the first model for DNA looping by an arbitrarily large number of proteins and solve it analytically in the case of identical binding. We uncover a switchlike transition between looped and unlooped phases and identify the key parameters that control this transition. Our results establish the basis for the quantitative understanding of fundamental cellular processes like DNA recombination, gene silencing, and telomere maintenance.
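    The switch-like behavior can be illustrated with a generic two-state statistical-weight calculation (a toy model under assumed parameters, not the paper's analytical solution): the looped state requires all n identical sites to be occupied and pays a looping free-energy cost, while the unlooped state lets each site be free or bound independently:

    ```python
    # Toy two-state looping model: P(loop) from statistical weights.
    import math

    def loop_probability(n_sites, dG_bind, dG_loop, concentration, kT=0.593):
        """Looped-state probability; dG_* in kcal/mol, kT at room temperature.

        All parameter values are assumptions for illustration only.
        """
        # Weight of one occupied site: concentration times the Boltzmann factor.
        w_site = concentration * math.exp(-dG_bind / kT)
        # Looped state: all n sites bound, plus the loop-closure penalty.
        w_looped = (w_site ** n_sites) * math.exp(-dG_loop / kT)
        # Unlooped state: each site independently free (1) or bound (w_site).
        w_unlooped = (1 + w_site) ** n_sites
        return w_looped / (w_looped + w_unlooped)
    ```

    Increasing protein concentration raises the looped-state probability, and larger n sharpens the transition, which is the qualitative switch-like behavior the abstract describes.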

  6. Fast global image smoothing based on weighted least squares.

    PubMed

    Min, Dongbo; Choi, Sunghwan; Lu, Jiangbo; Ham, Bumsub; Sohn, Kwanghoon; Do, Minh N

    2014-12-01

    This paper presents an efficient technique for performing spatially inhomogeneous edge-preserving image smoothing, called the fast global smoother. Focusing on sparse Laplacian matrices consisting of a data term and a prior term (typically defined using four or eight neighbors for a 2D image), our approach efficiently solves such global objective functions. In particular, we approximate the solution of the memory- and computation-intensive large linear system, defined over a d-dimensional spatial domain, by solving a sequence of 1D subsystems. Our separable implementation enables applying a linear-time tridiagonal matrix algorithm to solve d three-point Laplacian matrices iteratively. Our approach combines the best of two paradigms, i.e., efficient edge-preserving filters and optimization-based smoothing. Our method has a runtime comparable to the fast edge-preserving filters, but its global optimization formulation overcomes many limitations of the local filtering approaches. Our method also achieves results of quality comparable to the state-of-the-art optimization-based techniques, but runs ∼10-30 times faster. Besides, considering the flexibility in defining an objective function, we further propose generalized fast algorithms that perform Lγ norm smoothing (0 < γ < 2) and support an aggregated (robust) data term for handling imprecise data constraints. We demonstrate the effectiveness and efficiency of our techniques in a range of image processing and computer graphics applications.
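    The 1D building block the method iterates can be sketched as follows: solve one three-point (tridiagonal) weighted-least-squares system per scanline with the linear-time Thomas algorithm. The weight convention and parameter names here are illustrative assumptions, not the paper's implementation:

    ```python
    # Sketch of one 1D WLS subsystem: solve (I + lam * L_w) u = f with the
    # Thomas algorithm, where L_w is a weighted three-point Laplacian.
    def smooth_1d(f, w, lam):
        """f: input samples; w[i]: affinity between samples i and i+1
        (small w across edges preserves them); lam: smoothing strength."""
        n = len(f)
        # Tridiagonal coefficients of the WLS normal equations; each row
        # sums to 1, so constant signals are reproduced exactly.
        lower = [0.0] + [-lam * w[i - 1] for i in range(1, n)]
        upper = [-lam * w[i] for i in range(n - 1)] + [0.0]
        diag = [1.0 - lower[i] - upper[i] for i in range(n)]
        # Thomas algorithm: forward elimination ...
        c, d = [0.0] * n, [0.0] * n
        c[0], d[0] = upper[0] / diag[0], f[0] / diag[0]
        for i in range(1, n):
            m = diag[i] - lower[i] * c[i - 1]
            c[i] = upper[i] / m
            d[i] = (f[i] - lower[i] * d[i - 1]) / m
        # ... then back substitution.
        u = [0.0] * n
        u[-1] = d[-1]
        for i in range(n - 2, -1, -1):
            u[i] = d[i] - c[i] * u[i + 1]
        return u
    ```

    Setting a weight to zero across an edge decouples the system there, which is exactly the edge-preserving behavior: the two sides smooth independently.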

  7. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vyas, Rajan; Reed, Andrew J.; Tokarsky, E. John

    One common oxidative DNA lesion, 8-oxo-7,8-dihydro-2'-deoxyguanine (8-oxoG), is highly mutagenic in vivo due to its anti-conformation forming a Watson–Crick base pair with correct deoxycytidine 5'-triphosphate (dCTP) and its syn-conformation forming a Hoogsteen base pair with incorrect deoxyadenosine 5'-triphosphate (dATP). In this article, we utilized time-resolved X-ray crystallography to follow 8-oxoG bypass by human DNA polymerase β (hPolβ). In the 12 solved structures, both Watson–Crick (anti-8-oxoG:anti-dCTP) and Hoogsteen (syn-8-oxoG:anti-dATP) base pairing were clearly visible and were maintained throughout the chemical reaction. Additionally, a third Mg 2+ appeared during the process of phosphodiester bond formation and was located between the reacting α- and β-phosphates of the dNTP, suggesting its role in stabilizing reaction intermediates. After phosphodiester bond formation, hPolβ reopened its conformation, pyrophosphate was released, and the newly incorporated primer 3'-terminal nucleotide stacked, rather than base paired, with 8-oxoG. These structures provide the first real-time pictures, to our knowledge, of how a polymerase correctly and incorrectly bypasses a DNA lesion.

  8. Students’ difficulties in probabilistic problem-solving

    NASA Astrophysics Data System (ADS)

    Arum, D. P.; Kusmayadi, T. A.; Pramudya, I.

    2018-03-01

    Many errors can be identified when students solve mathematics problems, particularly probabilistic problems. This study aims to investigate students’ difficulties in solving probabilistic problems, focusing on analyzing and describing students’ errors during problem solving. The research used the qualitative method with a case study strategy. The subjects were ten 9th-grade students selected by purposive sampling. The data comprise the students’ probabilistic problem-solving results and recorded interviews regarding their difficulties in solving the problems, analyzed descriptively using Miles and Huberman's steps. The results show that students’ difficulties in solving probabilistic problems fall into three categories: first, difficulties in understanding the probabilistic problem; second, difficulties in choosing and using appropriate strategies for solving the problem; and third, difficulties with the computational process in solving the problem. Based on these results, it seems that students still have difficulties in solving probabilistic problems and have not yet been able to apply their knowledge and abilities to such problems. Therefore, it is important for mathematics teachers to plan probabilistic learning that could optimize students’ probabilistic thinking ability.

  9. Feasibility Study of an Interactive Multimedia Electronic Problem Solving Treatment Program for Depression: A Preliminary Uncontrolled Trial

    PubMed Central

    Berman, Margit I.; Buckey, Jay C., Jr.; Hull, Jay G.; Linardatos, Eftihia; Song, Sueyoung L.; McLellan, Robert K.; Hegel, Mark T.

    2014-01-01

    Computer-based depression interventions lacking live therapist support have difficulty engaging users. This study evaluated the usability, acceptability, credibility, therapeutic alliance and efficacy of a stand-alone multimedia, interactive, computer-based Problem Solving Treatment program (ePST™) for depression. The program simulated live treatment from an expert PST therapist, and delivered 6 ePST™ sessions over 9 weeks. Twenty-nine participants with moderate-severe symptoms received the intervention; 23 completed a minimally adequate dose of ePST™ (at least 4 sessions). Program usability, acceptability, credibility, and therapeutic alliance were assessed at treatment midpoint and endpoint. Depressive symptoms and health-related functioning were assessed at baseline, treatment midpoint (4 weeks), and study endpoint (10 weeks). Depression outcomes and therapeutic alliance ratings were also compared to previously published research on live PST and computer-based depression therapy. Participants rated the program as highly usable, acceptable, and credible, and reported a therapeutic alliance with the program comparable to that observed in live therapy. Depressive symptoms improved significantly over time. These findings also provide preliminary evidence that ePST™ may be effective as a depression treatment. Larger clinical trials with diverse samples are indicated. PMID:24680231

  10. Combinatorial Optimization by Amoeba-Based Neurocomputer with Chaotic Dynamics

    NASA Astrophysics Data System (ADS)

    Aono, Masashi; Hirata, Yoshito; Hara, Masahiko; Aihara, Kazuyuki

    We demonstrate a computing system based on an amoeba of a true slime mold Physarum capable of producing rich spatiotemporal oscillatory behavior. Our system operates as a neurocomputer because an optical feedback control in accordance with a recurrent neural network algorithm leads the amoeba's photosensitive branches to search for a stable configuration concurrently. We show our system's capability of solving the traveling salesman problem. Furthermore, we apply various types of nonlinear time series analysis to the amoeba's oscillatory behavior in the problem-solving process. The results suggest that an individual amoeba might be characterized as a set of coupled chaotic oscillators.

  11. Accelerate quasi Monte Carlo method for solving systems of linear algebraic equations through shared memory

    NASA Astrophysics Data System (ADS)

    Lai, Siyan; Xu, Ying; Shao, Bo; Guo, Menghan; Lin, Xiaola

    2017-04-01

    In this paper we study the Monte Carlo method for solving systems of linear algebraic equations (SLAE) based on shared memory. Prior research demonstrated that GPUs can effectively speed up computations for this problem. Our purpose is to optimize the Monte Carlo simulation specifically for the GPU memory architecture. Random numbers are organized to be stored in shared memory, which accelerates the parallel algorithm, and bank conflicts are avoided by our Collaborative Thread Array (CTA) scheme. Experimental results show that the shared-memory-based strategy speeds up the computations by as much as 3X.
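
    The sequential kernel underlying such GPU implementations can be sketched without any CUDA. The following pure-Python example (all parameter values and the diagonally dominant test system are illustrative assumptions, not taken from the paper) estimates one component of the solution of A x = b by averaging weighted random walks over the Jacobi splitting x = T x + f:

```python
import random

def mc_solve_component(A, b, i, n_walks=20000, walk_len=25, seed=1):
    """Estimate x[i] of A x = b by Monte Carlo random walks.

    Uses the Jacobi splitting x = T x + f with T = I - D^-1 A and
    f = D^-1 b; requires the spectral radius of T to be below 1
    (e.g. a strictly diagonally dominant A).
    """
    n = len(A)
    T = [[(1.0 if r == c else 0.0) - A[r][c] / A[r][r] for c in range(n)]
         for r in range(n)]
    f = [b[r] / A[r][r] for r in range(n)]
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_walks):
        state, weight, value = i, 1.0, f[i]
        for _ in range(walk_len):
            nxt = rng.randrange(n)          # uniform transition probability 1/n
            weight *= T[state][nxt] * n     # importance-weight correction
            value += weight * f[nxt]
            state = nxt
        total += value
    return total / n_walks

# small diagonally dominant test system (illustrative)
A = [[4.0, 1.0, 0.0], [1.0, 4.0, 1.0], [0.0, 1.0, 4.0]]
b = [1.0, 2.0, 3.0]
x1 = mc_solve_component(A, b, 1)   # exact solution component: x[1] = 2/7
```

    Each walk is an unbiased sample of the truncated Neumann series, which is why independent walks (and hence GPU threads) can be averaged without synchronization.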

  12. FPGA-based distributed computing microarchitecture for complex physical dynamics investigation.

    PubMed

    Borgese, Gianluca; Pace, Calogero; Pantano, Pietro; Bilotta, Eleonora

    2013-09-01

    In this paper, we present a distributed computing system, called DCMARK, aimed at solving partial differential equations at the basis of many investigation fields, such as solid state physics, nuclear physics, and plasma physics. This distributed architecture is based on the cellular neural network paradigm, which allows us to divide the differential equation system solving into many parallel integration operations to be executed by a custom multiprocessor system. We push the number of processors to the limit of one processor for each equation. In order to test the present idea, we chose to implement DCMARK on a single FPGA, designing the single processor to minimize its hardware requirements and to obtain a large number of easily interconnected processors. This approach is particularly suited to studying the properties of 1-, 2- and 3-D locally interconnected dynamical systems. In order to test the computing platform, we implement a 200-cell Korteweg-de Vries (KdV) equation solver and perform a comparison between simulations conducted on a high performance PC and on our system. Since our distributed architecture takes a constant computing time to solve the equation system, independently of the number of dynamical elements (cells) of the CNN array, it reduces the processing time more than other similar systems in the literature. To ensure a high level of reconfigurability, we design a compact system on programmable chip managed by a softcore processor, which controls the fast data/control communication between our system and a PC host. An intuitive graphical user interface allows the user to change the calculation parameters and plot the results.
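
    The per-cell update that such a locally interconnected architecture parallelizes can be sketched in software. The following Python example (grid size, time step, and the forward-Euler startup are illustrative assumptions; the paper's solver runs one cell per FPGA processor) advances the KdV equation u_t + u u_x + δ² u_xxx = 0 with the classic Zabusky-Kruskal leapfrog scheme, in which each grid point reads only its nearest neighbours:

```python
import math

def kdv_step(u_prev, u_curr, dt, dx, delta=0.022):
    """One leapfrog step of the Zabusky-Kruskal scheme for
    u_t + u*u_x + delta^2 * u_xxx = 0 on a periodic grid."""
    n = len(u_curr)
    u_next = [0.0] * n
    c1 = dt / (3.0 * dx)
    c2 = delta * delta * dt / dx ** 3
    for i in range(n):
        im2, im1 = u_curr[(i - 2) % n], u_curr[(i - 1) % n]
        ip1, ip2 = u_curr[(i + 1) % n], u_curr[(i + 2) % n]
        nonlinear = c1 * (ip1 + u_curr[i] + im1) * (ip1 - im1)
        dispersive = c2 * (ip2 - 2.0 * ip1 + 2.0 * im1 - im2)
        u_next[i] = u_prev[i] - nonlinear - dispersive
    return u_next

# cosine initial condition on [0, 2), as in Zabusky & Kruskal (1965)
n, dt = 128, 1e-4
dx = 2.0 / n
u_prev = [math.cos(math.pi * i * dx) for i in range(n)]
u_curr = kdv_step(u_prev, u_prev, dt / 2.0, dx)  # forward-Euler bootstrap
for _ in range(200):
    u_prev, u_curr = u_curr, kdv_step(u_prev, u_curr, dt, dx)
```

    Because the stencil telescopes on a periodic grid, the scheme conserves the sum of u exactly, a convenient sanity check for any hardware implementation of the same update.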

  13. Measurement of inelastic cross sections for low-energy electron scattering from DNA bases.

    PubMed

    Michaud, Marc; Bazin, Marc; Sanche, Léon

    2012-01-01

    To determine experimentally the absolute cross sections (CS) to deposit various amounts of energy into DNA bases by low-energy electron (LEE) impact. Electron energy loss (EEL) spectra of DNA bases were recorded for different LEE impact energies on the molecules deposited at very low coverage on an inert argon (Ar) substrate. Following their normalisation to the effective incident electron current and molecular surface number density, the EEL spectra were then fitted with multiple Gaussian functions in order to delimit the various excitation energy regions. The CS to excite a molecule into its various excitation modes were finally obtained by computing the area under the corresponding Gaussians. The EEL spectra and absolute CS for the electronic excitations of pyrimidine and the DNA bases thymine, adenine, and cytosine by electron impacts below 18 eV were reported for the molecules deposited at about monolayer coverage on a solid Ar substrate. The CS for electronic excitations of DNA bases by LEE impact were found to lie within the 10⁻¹⁶ to 10⁻¹⁸ cm² range. The large value of the total ionisation CS indicated that ionisation of DNA bases by LEE is an important dissipative process via which ionising radiation degrades and is absorbed in DNA.

  14. Measurement of inelastic cross sections for low-energy electron scattering from DNA bases

    PubMed Central

    Michaud, Marc; Bazin, Marc; Sanche, Léon

    2013-01-01

    Purpose Determine experimentally the absolute cross sections (CS) to deposit various amounts of energy into DNA bases by low-energy electron (LEE) impact. Materials and methods Electron energy loss (EEL) spectra of DNA bases are recorded for different LEE impact energies on the molecules deposited at very low coverage on an inert argon (Ar) substrate. Following their normalisation to the effective incident electron current and molecular surface number density, the EEL spectra are then fitted with multiple Gaussian functions in order to delimit the various excitation energy regions. The CS to excite a molecule into its various excitation modes are finally obtained by computing the area under the corresponding Gaussians. Results The EEL spectra and absolute CS for the electronic excitations of pyrimidine and the DNA bases thymine, adenine, and cytosine by electron impacts below 18 eV are reported for the molecules deposited at about monolayer coverage on a solid Ar substrate. Conclusions The CS for electronic excitations of DNA bases by LEE impact are found to lie within the 10⁻¹⁶ to 10⁻¹⁸ cm² range. The large value of the total ionisation CS indicates that ionisation of DNA bases by LEE is an important dissipative process via which ionising radiation degrades and is absorbed in DNA. PMID:21615242
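
    The final step of both studies, converting a fitted Gaussian peak into a cross section, reduces to computing the area under the Gaussian. A minimal Python sketch (peak parameters are purely illustrative, not measured values) checks the closed-form area amp·σ·√(2π) against numerical integration:

```python
import math

def gaussian(x, amp, mu, sigma):
    """Gaussian peak model used to delimit one excitation-energy region."""
    return amp * math.exp(-((x - mu) ** 2) / (2.0 * sigma ** 2))

def gaussian_area(amp, sigma):
    """Closed-form area under the Gaussian: amp * sigma * sqrt(2*pi)."""
    return amp * sigma * math.sqrt(2.0 * math.pi)

# trapezoid check on an illustrative fitted peak (arb. amplitude, eV scale)
amp, mu, sigma = 2.0, 4.5, 0.3
xs = [mu - 5.0 + i * 0.001 for i in range(10001)]
numeric = sum(0.001 * 0.5 * (gaussian(xs[i], amp, mu, sigma) +
                             gaussian(xs[i + 1], amp, mu, sigma))
              for i in range(len(xs) - 1))
analytic = gaussian_area(amp, sigma)
```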

  15. Genome Partitioner: A web tool for multi-level partitioning of large-scale DNA constructs for synthetic biology applications.

    PubMed

    Christen, Matthias; Del Medico, Luca; Christen, Heinz; Christen, Beat

    2017-01-01

    Recent advances in lower-cost DNA synthesis techniques have enabled new innovations in the field of synthetic biology. Still, efficient design and higher-order assembly of genome-scale DNA constructs remains a labor-intensive process. Given the complexity, computer-assisted design tools that fragment large DNA sequences into fabricable DNA blocks are needed to pave the way towards streamlined assembly of biological systems. Here, we present the Genome Partitioner software implemented as a web-based interface that permits multi-level partitioning of genome-scale DNA designs. Without the need for specialized computing skills, biologists can submit their DNA designs to a fully automated pipeline that generates the optimal retrosynthetic route for higher-order DNA assembly. To test the algorithm, we partitioned a 783 kb Caulobacter crescentus genome design. We validated the partitioning strategy by assembling a 20 kb test segment encompassing a difficult-to-synthesize DNA sequence. Successful assembly from 1 kb subblocks into the 20 kb segment highlights the effectiveness of the Genome Partitioner for reducing synthesis costs and timelines for higher-order DNA assembly. The Genome Partitioner is broadly applicable to translate DNA designs into ready-to-order sequences that can be assembled with standardized protocols, thus offering new opportunities to harness the diversity of microbial genomes for synthetic biology applications. The Genome Partitioner web tool can be accessed at https://christenlab.ethz.ch/GenomePartitioner.
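
    The core fragmentation step can be illustrated in a few lines. The following Python sketch (block and overlap sizes and all names are illustrative assumptions; the actual Genome Partitioner additionally optimizes junction sequences and builds a multi-level retrosynthetic route) splits a design into roughly 1 kb blocks that share overlap regions, so that neighbouring blocks can be rejoined by homology:

```python
def partition(seq, block_size=1000, overlap=40):
    """Split seq into blocks of at most block_size + overlap characters,
    where consecutive blocks share `overlap` characters for assembly."""
    blocks, pos = [], 0
    while pos < len(seq):
        end = min(pos + block_size + overlap, len(seq))
        blocks.append(seq[pos:end])
        if end == len(seq):
            break
        pos += block_size
    return blocks

def reassemble(blocks, overlap=40):
    """Join blocks back together, trimming the shared overlaps."""
    seq = blocks[0]
    for blk in blocks[1:]:
        assert seq[-overlap:] == blk[:overlap], "overlap mismatch"
        seq += blk[overlap:]
    return seq

design = "ACGT" * 5000          # a toy 20 kb design (illustrative)
blocks = partition(design)
assert reassemble(blocks) == design   # blocks losslessly cover the design
```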

  16. Parallelization of combinatorial search when solving knapsack optimization problem on computing systems based on multicore processors

    NASA Astrophysics Data System (ADS)

    Rahman, P. A.

    2018-05-01

    This scientific paper deals with a model of the knapsack optimization problem and a method for solving it based on directed combinatorial search in the Boolean space. The author's specialized mathematical model for decomposing the search zone into separate search spheres, and the algorithm for distributing the search spheres to the different cores of a multi-core processor, are also discussed. The paper also provides an example of decomposing the search zone into several search spheres and distributing them to the different cores of a quad-core processor. Finally, the author offers a formula for estimating the theoretical maximum computational acceleration that can be achieved by parallelizing the search zone into search spheres on an unlimited number of processor cores.
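
    The decomposition idea can be illustrated in miniature: fixing the first k bits of the Boolean selection vector partitions the search space into 2^k disjoint search spheres that can be explored independently (here sequentially; on a multi-core processor each sphere would go to its own core). All names and the toy instance below are illustrative, not the author's notation:

```python
from itertools import product

def search_sphere(prefix, weights, values, capacity):
    """Exhaustively search the subspace where the first len(prefix)
    item-selection bits are fixed to `prefix`; return the best value."""
    n, k = len(weights), len(prefix)
    best = -1
    for tail in product((0, 1), repeat=n - k):
        bits = prefix + tail
        w = sum(wi for wi, b in zip(weights, bits) if b)
        if w <= capacity:
            best = max(best, sum(vi for vi, b in zip(values, bits) if b))
    return best

def knapsack_decomposed(weights, values, capacity, k=2):
    """Split the Boolean search space into 2**k spheres by fixing the
    first k bits, solve each sphere, and combine the results."""
    spheres = product((0, 1), repeat=k)
    return max(search_sphere(p, weights, values, capacity) for p in spheres)

weights = [3, 5, 2, 7, 4, 6]
values = [9, 11, 5, 16, 8, 13]
best = knapsack_decomposed(weights, values, capacity=12)  # optimum is 30
```

    Since the spheres are disjoint and jointly cover the whole space, the maximum over the spheres equals the global optimum, which is what makes the per-core distribution safe.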

  17. A new method for solving the quantum hydrodynamic equations of motion: application to two-dimensional reactive scattering.

    PubMed

    Pauler, Denise K; Kendrick, Brian K

    2004-01-08

    The de Broglie-Bohm hydrodynamic equations of motion are solved using a meshless method based on a moving least squares approach and an arbitrary Lagrangian-Eulerian frame of reference. A regridding algorithm adds and deletes computational points as needed in order to maintain a uniform interparticle spacing, and unitary time evolution is obtained by propagating the wave packet using averaged fields. The numerical instabilities associated with the formation of nodes in the reflected portion of the wave packet are avoided by adding artificial viscosity to the equations of motion. The methodology is applied to a two-dimensional model collinear reaction with an activation barrier. Reaction probabilities are computed as a function of both time and energy, and are in excellent agreement with those based on the quantum trajectory method. (c) 2004 American Institute of Physics
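
    The moving least squares ingredient can be shown in one dimension. The sketch below (entirely illustrative; the paper applies the approach to the quantum hydrodynamic fields in an arbitrary Lagrangian-Eulerian frame) fits a weighted local quadratic on scattered nodes and reads the derivative off the fitted coefficient; because the fit reproduces quadratics exactly, the derivative of f(x) = x² is recovered to machine precision:

```python
import math
import random

def solve3(M, r):
    """Solve a 3x3 linear system by Gaussian elimination with pivoting."""
    M = [row[:] + [ri] for row, ri in zip(M, r)]   # augmented copy
    for c in range(3):
        p = max(range(c, 3), key=lambda k: abs(M[k][c]))
        M[c], M[p] = M[p], M[c]
        for k in range(c + 1, 3):
            f = M[k][c] / M[c][c]
            for j in range(c, 4):
                M[k][j] -= f * M[c][j]
    x = [0.0] * 3
    for c in (2, 1, 0):
        x[c] = (M[c][3] - sum(M[c][j] * x[j] for j in range(c + 1, 3))) / M[c][c]
    return x

def mls_derivative(xs, fs, x0, h=0.3):
    """Moving-least-squares estimate of f'(x0): fit a weighted local
    quadratic a0 + a1*(x-x0) + a2*(x-x0)^2 and return a1."""
    A = [[0.0] * 3 for _ in range(3)]
    r = [0.0] * 3
    for x, f in zip(xs, fs):
        d = x - x0
        w = math.exp(-(d / h) ** 2)        # Gaussian weight kernel
        basis = (1.0, d, d * d)
        for i in range(3):
            r[i] += w * basis[i] * f
            for j in range(3):
                A[i][j] += w * basis[i] * basis[j]
    return solve3(A, r)[1]                 # a1 is the derivative estimate

rng = random.Random(0)
xs = [rng.uniform(0.0, 1.0) for _ in range(30)]   # scattered nodes
fs = [x * x for x in xs]
d = mls_derivative(xs, fs, 0.5)   # exact derivative of x^2 at 0.5 is 1.0
```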

  18. Free energy landscape and transition pathways from Watson–Crick to Hoogsteen base pairing in free duplex DNA

    PubMed Central

    Yang, Changwon; Kim, Eunae; Pak, Youngshang

    2015-01-01

    Hoogsteen (HG) base pairing plays a central role in the DNA binding of proteins and small ligands. Probing the detailed transition mechanism from Watson–Crick (WC) to HG base pair (bp) formation in duplex DNAs is of fundamental importance in terms of revealing intrinsic functions of double helical DNAs beyond their sequence-determined functions. We investigated the free energy landscape of a free B-DNA with an adenine–thymine (A–T) rich sequence to probe its conformational transition pathways from WC to HG base pairing. The free energy landscape was computed with a state-of-the-art two-dimensional umbrella molecular dynamics simulation at the all-atom level. The present simulation showed that in an isolated duplex DNA, the spontaneous transition from WC to HG bp takes place via multiple pathways. Notably, base flipping into the major and minor grooves was found to play an important role in forming these multiple transition pathways. This finding suggests that naked B-DNA under normal conditions has an inherent ability to form HG bps via spontaneous base opening events. PMID:26250116

  19. PLANT - An experimental task for the study of human problem solving in process control. [Production Levels and Network Troubleshooting

    NASA Technical Reports Server (NTRS)

    Morris, N. M.; Rouse, W. B.; Fath, J. L.

    1985-01-01

    An experimental tool for the investigation of human problem-solving behavior is introduced. Production Levels and Network Troubleshooting (PLANT) is a computer-based process-control task which may be used to provide opportunities for subjects to control a dynamic system and diagnose, repair, and compensate for system failures. The task is described in detail, and experiments which have been conducted using PLANT are briefly discussed.

  20. Collaborative Systems and Multi-user Interfaces: Computer-based Tools for Cooperative Problem Solving

    DTIC Science & Technology

    1986-10-31

    To study collaborative tools and their uses, the Colab system and the Cognoter presentation tool were implemented and used for both real and posed idea-organization tasks. To test the system design and its effect on structured problem solving, many early Colab/Cognoter meetings were monitored.
