Sample records for computational studies reveal

  1. Computing rank-revealing QR factorizations of dense matrices.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bischof, C. H.; Quintana-Orti, G.; Mathematics and Computer Science

    1998-06-01

    We develop algorithms and implementations for computing rank-revealing QR (RRQR) factorizations of dense matrices. First, we develop an efficient block algorithm for approximating an RRQR factorization, employing a windowed version of the commonly used Golub pivoting strategy, aided by incremental condition estimation. Second, we develop efficiently implementable variants of guaranteed reliable RRQR algorithms for triangular matrices originally suggested by Chandrasekaran and Ipsen and by Pan and Tang. We suggest algorithmic improvements with respect to condition estimation, termination criteria, and Givens updating. By combining the block algorithm with one of the triangular postprocessing steps, we arrive at an efficient and reliable algorithm for computing an RRQR factorization of a dense matrix. Experimental results on IBM RS/6000 and SGI R8000 platforms show that this approach performs up to three times faster than the less reliable QR factorization with column pivoting as currently implemented in LAPACK, and comes within 15% of the performance of the LAPACK block algorithm for computing a QR factorization without any column exchanges. Thus, we expect this routine to be useful in many circumstances where numerical rank deficiency cannot be ruled out but has so far been ignored because of the computational cost of dealing with it.
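
A minimal sketch of the underlying idea (using SciPy's generic column-pivoted QR, not the authors' block algorithm; the test matrix and tolerance are made up): the diagonal of R in a pivoted QR factorization decays, and counting its large entries estimates the numerical rank.

```python
import numpy as np
from scipy.linalg import qr

rng = np.random.default_rng(0)
# Build a 60x40 matrix of numerical rank 25 (hypothetical test case).
A = rng.standard_normal((60, 25)) @ rng.standard_normal((25, 40))

# Column-pivoted QR: A[:, piv] = Q @ R, with |R[0,0]| >= |R[1,1]| >= ...
Q, R, piv = qr(A, pivoting=True)

# The numerical rank is the number of diagonal entries of R that are
# large relative to the first one.
tol = 1e-10 * abs(R[0, 0])
rank = int(np.sum(np.abs(np.diag(R)) > tol))
print(rank)  # 25
```

With a well-behaved matrix like this one, plain column pivoting already reveals the rank; the RRQR algorithms above exist precisely for the adversarial cases where it fails.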

  2. Integration of computational modeling with membrane transport studies reveals new insights into amino acid exchange transport mechanisms

    PubMed Central

    Widdows, Kate L.; Panitchob, Nuttanont; Crocker, Ian P.; Please, Colin P.; Hanson, Mark A.; Sibley, Colin P.; Johnstone, Edward D.; Sengers, Bram G.; Lewis, Rohan M.; Glazier, Jocelyn D.

    2015-01-01

    Uptake of system L amino acid substrates into isolated placental plasma membrane vesicles in the absence of opposing side amino acid (zero-trans uptake) is incompatible with the concept of obligatory exchange, where influx of amino acid is coupled to efflux. We therefore hypothesized that system L amino acid exchange transporters are not fully obligatory and/or that amino acids are initially present inside the vesicles. To address this, we combined computational modeling with vesicle transport assays and transporter localization studies to investigate the mechanisms mediating [14C]l-serine (a system L substrate) transport into human placental microvillous plasma membrane (MVM) vesicles. The carrier model provided a quantitative framework to test the 2 hypotheses that l-serine transport occurs by either obligate exchange or nonobligate exchange coupled with facilitated transport (mixed transport model). The computational model could only account for experimental [14C]l-serine uptake data when the transporter was not exclusively in exchange mode, best described by the mixed transport model. MVM vesicle isolates contained endogenous amino acids allowing for potential contribution to zero-trans uptake. Both L-type amino acid transporter (LAT)1 and LAT2 subtypes of system L were distributed to MVM, with l-serine transport attributed to LAT2. These findings suggest that exchange transporters do not function exclusively as obligate exchangers.—Widdows, K. L., Panitchob, N., Crocker, I. P., Please, C. P., Hanson, M. A., Sibley, C. P., Johnstone, E. D., Sengers, B. G., Lewis, R. M., Glazier, J. D. Integration of computational modeling with membrane transport studies reveals new insights into amino acid exchange transport mechanisms. PMID:25761365
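
The contrast between the two hypotheses can be caricatured with a toy ODE (all rate constants and the saturating form are invented for illustration, not taken from the paper's carrier model): under purely obligate exchange, zero-trans uptake never starts from an empty vesicle, while a mixed model with a facilitated leg does.

```python
from scipy.integrate import solve_ivp

S_OUT = 1.0  # external labelled substrate, held constant

def uptake(t, y, k_ex, k_fac):
    s_in = y[0]
    # exchange needs internal substrate to counter-transport
    exchange = k_ex * S_OUT * s_in / (1.0 + s_in)
    # a facilitated component proceeds even when the vesicle is empty
    facilitated = k_fac * S_OUT
    return [exchange + facilitated]

span = (0.0, 10.0)
obligate = solve_ivp(uptake, span, [0.0], args=(1.0, 0.0)).y[0, -1]
mixed = solve_ivp(uptake, span, [0.0], args=(1.0, 0.2)).y[0, -1]
print(obligate)     # 0.0: no zero-trans uptake if exchange is obligatory
print(mixed > 0.1)  # True: the facilitated leg seeds uptake
```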

  3. Insights into enzymatic halogenation from computational studies

    PubMed Central

    Senn, Hans M.

    2014-01-01

    The halogenases are a group of enzymes that have only come to the fore over the last 10 years thanks to the discovery and characterization of several novel representatives. They have revealed the fascinating variety of distinct chemical mechanisms that nature utilizes to activate halogens and introduce them into organic substrates. Computational studies using a range of approaches have already elucidated many details of the mechanisms of these enzymes, often in synergistic combination with experiment. This Review summarizes the main insights gained from these studies. It also seeks to identify open questions that are amenable to computational investigations. The studies discussed herein serve to illustrate some of the limitations of the current computational approaches and the challenges encountered in computational mechanistic enzymology. PMID:25426489

  4. Spontaneous Movements of a Computer Mouse Reveal Egoism and In-group Favoritism.

    PubMed

    Maliszewski, Norbert; Wojciechowski, Łukasz; Suszek, Hubert

    2017-01-01

    The purpose of the project was to assess whether the first spontaneous movements of a computer mouse, when making an assessment on a scale presented on the screen, may express a respondent's implicit attitudes. In Study 1, the altruistic behaviors of 66 students were assessed. The students were led to believe that the task they were performing was also being performed by another person and they were asked to distribute earnings between themselves and the partner. The participants performed the tasks under conditions with and without distractors. With the distractors, in the first few seconds spontaneous mouse movements on the scale expressed a selfish distribution of money, while later the movements gravitated toward more altruism. In Study 2, 77 Polish students evaluated a painting by a Polish/Jewish painter on a scale. They evaluated it under conditions of full or distracted cognitive abilities. Spontaneous movements of the mouse on the scale were analyzed. In addition, implicit attitudes toward both Poles and Jews were measured with the Implicit Association Test (IAT). A significant association between implicit attitudes (IAT) and spontaneous evaluation of images using a computer mouse was observed in the group with the distractor. The participants with strong implicit in-group favoritism of Poles revealed stronger preference for the Polish painter's work in the first few seconds of mouse movement. Taken together, these results suggest that spontaneous mouse movements may reveal egoism (in-group favoritism), i.e., processes that were not observed in the participants' final decisions (clicking on the scale).

  5. Spontaneous Movements of a Computer Mouse Reveal Egoism and In-group Favoritism

    PubMed Central

    Maliszewski, Norbert; Wojciechowski, Łukasz; Suszek, Hubert

    2017-01-01

    The purpose of the project was to assess whether the first spontaneous movements of a computer mouse, when making an assessment on a scale presented on the screen, may express a respondent’s implicit attitudes. In Study 1, the altruistic behaviors of 66 students were assessed. The students were led to believe that the task they were performing was also being performed by another person and they were asked to distribute earnings between themselves and the partner. The participants performed the tasks under conditions with and without distractors. With the distractors, in the first few seconds spontaneous mouse movements on the scale expressed a selfish distribution of money, while later the movements gravitated toward more altruism. In Study 2, 77 Polish students evaluated a painting by a Polish/Jewish painter on a scale. They evaluated it under conditions of full or distracted cognitive abilities. Spontaneous movements of the mouse on the scale were analyzed. In addition, implicit attitudes toward both Poles and Jews were measured with the Implicit Association Test (IAT). A significant association between implicit attitudes (IAT) and spontaneous evaluation of images using a computer mouse was observed in the group with the distractor. The participants with strong implicit in-group favoritism of Poles revealed stronger preference for the Polish painter’s work in the first few seconds of mouse movement. Taken together, these results suggest that spontaneous mouse movements may reveal egoism (in-group favoritism), i.e., processes that were not observed in the participants’ final decisions (clicking on the scale). PMID:28163689

  6. Efficient algorithms for computing a strong rank-revealing QR factorization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gu, M.; Eisenstat, S.C.

    1996-07-01

    Given an m × n matrix M with m ≥ n, it is shown that there exists a permutation Π and an integer k such that the QR factorization given by equation (1) reveals the numerical rank of M: the k × k upper-triangular matrix A_k is well conditioned, the 2-norm of C_k is small, and B_k is linearly dependent on A_k with coefficients bounded by a low-degree polynomial in n. Existing rank-revealing QR (RRQR) algorithms are related to such factorizations, and two algorithms are presented for computing them. The new algorithms are nearly as efficient as QR with column pivoting for most problems and take O(mn²) floating-point operations in the worst case.
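
A hedged illustration of the block structure described above (a generic pivoted QR on a synthetic rank-k matrix, not Gu and Eisenstat's algorithm): partitioning R after k columns exposes a well-conditioned A_k and a trailing C_k of negligible norm.

```python
import numpy as np
from scipy.linalg import qr

rng = np.random.default_rng(1)
M = rng.standard_normal((50, 20)) @ rng.standard_normal((20, 30))  # rank 20
_, R, piv = qr(M, pivoting=True)

k = 20
Ak = R[:k, :k]   # k x k upper-triangular block, well conditioned
Bk = R[:k, k:]   # coupling block B_k
Ck = R[k:, k:]   # trailing block, tiny norm when rank(M) = k

print(np.linalg.cond(Ak) < 1e6)   # True: A_k well conditioned
print(np.linalg.norm(Ck) < 1e-8)  # True: ||C_k|| is negligible
```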

  7. Integrative Computational Network Analysis Reveals Site-Specific Mediators of Inflammation in Alzheimer's Disease.

    PubMed

    Ravichandran, Srikanth; Michelucci, Alessandro; Del Sol, Antonio

    2018-01-01

    Alzheimer's disease (AD) is a major neurodegenerative disease and one of the most common causes of dementia in older adults. Among several factors, neuroinflammation is known to play a critical role in the pathogenesis of chronic neurodegenerative diseases. In particular, studies of brains affected by AD show a clear involvement of several inflammatory pathways. Furthermore, depending on the brain regions affected by the disease, the nature and the effect of inflammation can vary. Here, in order to shed more light on distinct and common features of inflammation in different brain regions affected by AD, we employed a computational approach to analyze gene expression data of six site-specific neuronal populations from AD patients. Our network-based computational approach is driven by the concept that a sustained inflammatory environment could result in neurotoxicity leading to the disease. Thus, our method aims to infer intracellular signaling pathways/networks that are likely to be constantly activated or inhibited due to persistent inflammatory conditions. The computational analysis identified several inflammatory mediators, such as the tumor necrosis factor alpha (TNF-α)-associated pathway, as key upstream receptors/ligands that are likely to transmit sustained inflammatory signals. Further, the analysis revealed that several inflammatory mediators were mainly region-specific, with few commonalities across different brain regions. Taken together, our results show that our integrative approach aids the identification of inflammation-related signaling pathways that could be responsible for the onset or the progression of AD and can be applied to study other neurodegenerative diseases. Furthermore, such computational approaches can enable the translation of clinical omics data toward the development of novel therapeutic strategies for neurodegenerative diseases.

  8. Integrative Computational Network Analysis Reveals Site-Specific Mediators of Inflammation in Alzheimer's Disease

    PubMed Central

    Ravichandran, Srikanth; Michelucci, Alessandro; del Sol, Antonio

    2018-01-01

    Alzheimer's disease (AD) is a major neurodegenerative disease and one of the most common causes of dementia in older adults. Among several factors, neuroinflammation is known to play a critical role in the pathogenesis of chronic neurodegenerative diseases. In particular, studies of brains affected by AD show a clear involvement of several inflammatory pathways. Furthermore, depending on the brain regions affected by the disease, the nature and the effect of inflammation can vary. Here, in order to shed more light on distinct and common features of inflammation in different brain regions affected by AD, we employed a computational approach to analyze gene expression data of six site-specific neuronal populations from AD patients. Our network-based computational approach is driven by the concept that a sustained inflammatory environment could result in neurotoxicity leading to the disease. Thus, our method aims to infer intracellular signaling pathways/networks that are likely to be constantly activated or inhibited due to persistent inflammatory conditions. The computational analysis identified several inflammatory mediators, such as the tumor necrosis factor alpha (TNF-α)-associated pathway, as key upstream receptors/ligands that are likely to transmit sustained inflammatory signals. Further, the analysis revealed that several inflammatory mediators were mainly region-specific, with few commonalities across different brain regions. Taken together, our results show that our integrative approach aids the identification of inflammation-related signaling pathways that could be responsible for the onset or the progression of AD and can be applied to study other neurodegenerative diseases. Furthermore, such computational approaches can enable the translation of clinical omics data toward the development of novel therapeutic strategies for neurodegenerative diseases. PMID:29551980

  9. Fractional charge revealed in computer simulations of resonant tunneling in the fractional quantum Hall regime.

    PubMed

    Tsiper, E V

    2006-08-18

    The concept of fractional charge is central to the theory of the fractional quantum Hall effect. Here I use exact diagonalization as well as configuration space renormalization to study finite clusters which are large enough to contain two independent edges. I analyze the conditions of resonant tunneling between the two edges. The "computer experiment" reveals a periodic sequence of resonant tunneling events consistent with the experimentally observed fractional quantization of electric charge in units of e/3 and e/5.

  10. In vitro protease cleavage and computer simulations reveal the HIV-1 capsid maturation pathway

    NASA Astrophysics Data System (ADS)

    Ning, Jiying; Erdemci-Tandogan, Gonca; Yufenyuy, Ernest L.; Wagner, Jef; Himes, Benjamin A.; Zhao, Gongpu; Aiken, Christopher; Zandi, Roya; Zhang, Peijun

    2016-12-01

    HIV-1 virions assemble as immature particles containing Gag polyproteins that are processed by the viral protease into individual components, resulting in the formation of mature infectious particles. There are two competing models for the process of forming the mature HIV-1 core: the disassembly and de novo reassembly model and the non-diffusional displacive model. To study the maturation pathway, we simulate HIV-1 maturation in vitro by digesting immature particles and assembled virus-like particles with recombinant HIV-1 protease and monitor the process with biochemical assays and cryoEM structural analysis in parallel. Processing of Gag in vitro is accurate and efficient and results in both soluble capsid protein and conical or tubular capsid assemblies, seemingly converted from immature Gag particles. Computer simulations further reveal probable assembly pathways of HIV-1 capsid formation. Combining the experimental data and computer simulations, our results suggest a sequential combination of both displacive and disassembly/reassembly processes for HIV-1 maturation.

  11. Computational Approaches for Revealing the Structure of Membrane Transporters: Case Study on Bilitranslocase.

    PubMed

    Venko, Katja; Roy Choudhury, A; Novič, Marjana

    2017-01-01

    The structural and functional details of transmembrane proteins are vastly underexplored, mostly due to experimental difficulties regarding their solubility and stability. Currently, the majority of transmembrane protein structures are still unknown, and this presents a huge experimental and computational challenge. Nowadays, thanks to X-ray crystallography and NMR spectroscopy, over 3000 structures of membrane proteins have been solved, among them only a few hundred unique ones. Due to the vast biological and pharmaceutical interest in the elucidation of the structure and the functional mechanisms of transmembrane proteins, several computational methods have been developed to overcome the experimental gap. If combined with experimental data, the computational information enables rapid, low-cost, and successful predictions of the molecular structure of unsolved proteins. The reliability of the predictions depends on the availability and accuracy of experimental data associated with structural information. In this review, the following methods are proposed for in silico structure elucidation: sequence-dependent predictions of transmembrane regions, predictions of transmembrane helix-helix interactions, helix arrangements in membrane models, and testing their stability with molecular dynamics simulations. We also demonstrate the usage of the computational methods listed above by proposing a model for the molecular structure of the transmembrane protein bilitranslocase. Bilitranslocase is a bilirubin membrane transporter, which shares similar tissue distribution and functional properties with some of the members of the Organic Anion Transporter family and is the only member classified in the Bilirubin Transporter Family. Regarding its unique properties, bilitranslocase is a potentially interesting drug target.
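
As a concrete, deliberately simplified example of the sequence-dependent prediction step, here is a Kyte-Doolittle sliding-window hydropathy scan; real transmembrane predictors are far more sophisticated, and the sequence below is a made-up toy, not bilitranslocase.

```python
# Kyte-Doolittle hydropathy values per amino acid (standard scale).
KD = {'A': 1.8, 'R': -4.5, 'N': -3.5, 'D': -3.5, 'C': 2.5, 'Q': -3.5,
      'E': -3.5, 'G': -0.4, 'H': -3.2, 'I': 4.5, 'L': 3.8, 'K': -3.9,
      'M': 1.9, 'F': 2.8, 'P': -1.6, 'S': -0.8, 'T': -0.7, 'W': -0.9,
      'Y': -1.3, 'V': 4.2}

def hydropathy(seq, window=19):
    # Average hydropathy over a sliding window of typical TM-helix length.
    half = window // 2
    return [sum(KD[a] for a in seq[i - half:i + half + 1]) / window
            for i in range(half, len(seq) - half)]

# Toy sequence: a hydrophobic helix-like run followed by a polar stretch.
seq = "MKT" + "LIVALLF" * 3 + "DDRKE" * 4 + "AG"
scores = hydropathy(seq)

# Windows averaging above ~1.6 are classic candidate transmembrane stretches.
print(max(scores) > 1.6, min(scores) < 0)  # True True
```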

  12. Unethical Computer Using Behavior Scale: A Study of Reliability and Validity on Turkish University Students

    ERIC Educational Resources Information Center

    Namlu, Aysen Gurcan; Odabasi, Hatice Ferhan

    2007-01-01

    This study was carried out in a Turkish university with 216 undergraduate students of computer technology as respondents. The study aimed to develop a scale (UECUBS) to determine unethical computer use behavior. A factor analysis of the related items revealed that the factors can be divided under five headings: intellectual property,…

  13. Study of basic computer competence among public health nurses in Taiwan.

    PubMed

    Yang, Kuei-Feng; Yu, Shu; Lin, Ming-Sheng; Hsu, Chia-Ling

    2004-03-01

    Rapid advances in information technology and media have made distance learning on the Internet possible. This new model of learning allows greater efficiency and flexibility in knowledge acquisition. Since basic computer competence is a prerequisite for this new learning model, this study was conducted to examine the basic computer competence of public health nurses in Taiwan and explore factors influencing computer competence. A national cross-sectional randomized study was conducted with 329 public health nurses. A questionnaire was used to collect data and was delivered by mail. Results indicate that the basic computer competence of public health nurses in Taiwan still needs to be improved (mean = 57.57 ± 2.83, total score range 26–130). Among the five most frequently used software programs, nurses were most knowledgeable about Word and least knowledgeable about PowerPoint. Stepwise multiple regression analysis revealed eight variables (weekly number of hours spent online at home, weekly amount of time spent online at work, weekly frequency of computer use at work, previous computer training, computer and Internet access at the workplace, job position, education level, and age) that significantly influenced computer competence, together accounting for 39.0% of the variance. In conclusion, greater computer competence, broader educational programs regarding computer technology, and a greater emphasis on computers at work are necessary to increase the usefulness of distance learning via the Internet in Taiwan. Building a user-friendly environment is important in developing this new media model of learning for the future.
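
The variable-selection step can be sketched with a forward-stepwise ordinary-least-squares loop on synthetic data (the predictors and seed are invented; the survey's actual variables and software are not reproduced here): at each step, add the predictor that most increases R².

```python
import numpy as np

rng = np.random.default_rng(3)
n, p = 200, 6
X = rng.standard_normal((n, p))
# Only columns 0 and 3 actually drive the response in this toy setup.
y = 2.0 * X[:, 0] - 1.5 * X[:, 3] + rng.standard_normal(n)

def r2(cols):
    # R^2 of an OLS fit with intercept on the chosen columns.
    A = np.column_stack([np.ones(n)] + [X[:, c] for c in cols])
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    resid = y - A @ beta
    return 1.0 - resid @ resid / ((y - y.mean()) @ (y - y.mean()))

selected, remaining = [], set(range(p))
for _ in range(2):  # two forward steps for the sketch
    best = max(remaining, key=lambda c: r2(selected + [c]))
    selected.append(best)
    remaining.discard(best)

print(sorted(selected))  # [0, 3]: the two informative predictors
```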

  14. Computational study of hydroxyapatite structures, properties and defects

    NASA Astrophysics Data System (ADS)

    Bystrov, V. S.; Coutinho, J.; Bystrova, A. V.; Dekhtyar, Yu D.; Pullar, R. C.; Poronin, A.; Palcevskis, E.; Dindune, A.; Alkan, B.; Durucan, C.; Paramonova, E. V.

    2015-03-01

    Hydroxyapatite (HAp) was studied from a first-principles approach using the local density approximation (LDA) method in the AIMPRO code, in combination with various quantum mechanical (QM) and molecular mechanical (MM) methods from HyperChem 7.5/8.0. The data obtained were used for studies of HAp structures, the physical properties of HAp (density of electronic states (DOS), bulk modulus, etc.) and defects in HAp. Computed data confirmed that HAp can co-exist in different phases, hexagonal and monoclinic. Ordered monoclinic structures, which could reveal piezoelectric properties, are of special interest. The data obtained allow us to characterize the properties of the following defects in HAp: O, H and OH vacancies; H and OH interstitials; substitutions of Ca by Mg, Sr, Mn or Se, and of P by Si. These properties reveal the appearance of additional energy levels inside the forbidden zone, shifts of the top of the valence band or the bottom of the conduction band, and subsequent changes in the width of the forbidden zone. The data computed are compared with other known data, both calculated and experimental, such as the alteration of electron work functions under the influence of various defects and treatments, obtained by photoelectron emission. Such analysis is valuable for understanding the interactions of modified HAp with living cells and tissues, improving implant techniques, and developing new nanomedical applications.

  15. Computers in Social Studies.

    ERIC Educational Resources Information Center

    Kendall, Diane S.; Budin, Howard

    1987-01-01

    Provides an introduction to the articles in this issue. Maintains that social studies teachers and textbooks will not be replaced by computers. States that the relationship between the social studies and computers is limited only by our imaginations. (JDH)

  16. A review of Computer Science resources for learning and teaching with K-12 computing curricula: an Australian case study

    NASA Astrophysics Data System (ADS)

    Falkner, Katrina; Vivian, Rebecca

    2015-10-01

    To support teachers to implement Computer Science curricula into classrooms from the very first year of school, teachers, schools and organisations seek quality curriculum resources to support implementation and teacher professional development. Until now, many Computer Science resources and outreach initiatives have targeted K-12 school-age children, with the intention to engage children and increase interest, rather than to formally teach concepts and skills. What is the educational quality of existing Computer Science resources and to what extent are they suitable for classroom learning and teaching? In this paper, an assessment framework is presented to evaluate the quality of online Computer Science resources. Further, a semi-systematic review of available online Computer Science resources was conducted to evaluate resources available for classroom learning and teaching and to identify gaps in resource availability, using the Australian curriculum as a case study analysis. The findings reveal a predominance of quality resources; however, a number of critical gaps were identified. This paper provides recommendations and guidance for the development of new and supplementary resources and future research.

  17. Computational dissection of human episodic memory reveals mental process-specific genetic profiles

    PubMed Central

    Luksys, Gediminas; Fastenrath, Matthias; Coynel, David; Freytag, Virginie; Gschwind, Leo; Heck, Angela; Jessen, Frank; Maier, Wolfgang; Milnik, Annette; Riedel-Heller, Steffi G.; Scherer, Martin; Spalek, Klara; Vogler, Christian; Wagner, Michael; Wolfsgruber, Steffen; Papassotiropoulos, Andreas; de Quervain, Dominique J.-F.

    2015-01-01

    Episodic memory performance is the result of distinct mental processes, such as learning, memory maintenance, and emotional modulation of memory strength. Such processes can be effectively dissociated using computational models. Here we performed gene set enrichment analyses of model parameters estimated from the episodic memory performance of 1,765 healthy young adults. We report robust and replicated associations of the amine compound SLC (solute-carrier) transporters gene set with the learning rate, of the collagen formation and transmembrane receptor protein tyrosine kinase activity gene sets with the modulation of memory strength by negative emotional arousal, and of the L1 cell adhesion molecule (L1CAM) interactions gene set with the repetition-based memory improvement. Furthermore, in a large functional MRI sample of 795 subjects we found that the association between L1CAM interactions and memory maintenance revealed large clusters of differences in brain activity in frontal cortical areas. Our findings provide converging evidence that distinct genetic profiles underlie specific mental processes of human episodic memory. They also provide empirical support to previous theoretical and neurobiological studies linking specific neuromodulators to the learning rate and linking neural cell adhesion molecules to memory maintenance. Furthermore, our study suggests additional memory-related genetic pathways, which may contribute to a better understanding of the neurobiology of human memory. PMID:26261317
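
The statistic at the core of such gene-set enrichment analyses can be illustrated with a one-line over-representation test (all numbers are hypothetical, and the paper's actual enrichment method may differ): is a 40-gene set over-represented among 300 parameter-associated genes drawn from a 20,000-gene background?

```python
from scipy.stats import hypergeom

M, n, N, k = 20_000, 40, 300, 8  # background, set size, hits, observed overlap
# Expected overlap is n*N/M = 0.6; compute P(overlap >= k) under
# random sampling without replacement.
p = hypergeom.sf(k - 1, M, n, N)
print(p < 0.001)  # True: an overlap of 8 is very unlikely by chance
```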

  18. Computational dissection of human episodic memory reveals mental process-specific genetic profiles.

    PubMed

    Luksys, Gediminas; Fastenrath, Matthias; Coynel, David; Freytag, Virginie; Gschwind, Leo; Heck, Angela; Jessen, Frank; Maier, Wolfgang; Milnik, Annette; Riedel-Heller, Steffi G; Scherer, Martin; Spalek, Klara; Vogler, Christian; Wagner, Michael; Wolfsgruber, Steffen; Papassotiropoulos, Andreas; de Quervain, Dominique J-F

    2015-09-01

    Episodic memory performance is the result of distinct mental processes, such as learning, memory maintenance, and emotional modulation of memory strength. Such processes can be effectively dissociated using computational models. Here we performed gene set enrichment analyses of model parameters estimated from the episodic memory performance of 1,765 healthy young adults. We report robust and replicated associations of the amine compound SLC (solute-carrier) transporters gene set with the learning rate, of the collagen formation and transmembrane receptor protein tyrosine kinase activity gene sets with the modulation of memory strength by negative emotional arousal, and of the L1 cell adhesion molecule (L1CAM) interactions gene set with the repetition-based memory improvement. Furthermore, in a large functional MRI sample of 795 subjects we found that the association between L1CAM interactions and memory maintenance revealed large clusters of differences in brain activity in frontal cortical areas. Our findings provide converging evidence that distinct genetic profiles underlie specific mental processes of human episodic memory. They also provide empirical support to previous theoretical and neurobiological studies linking specific neuromodulators to the learning rate and linking neural cell adhesion molecules to memory maintenance. Furthermore, our study suggests additional memory-related genetic pathways, which may contribute to a better understanding of the neurobiology of human memory.

  19. Melanoma Cell Colony Expansion Parameters Revealed by Approximate Bayesian Computation

    PubMed Central

    Vo, Brenda N.; Drovandi, Christopher C.; Pettitt, Anthony N.; Pettet, Graeme J.

    2015-01-01

    In vitro studies and mathematical models are now being widely used to study the underlying mechanisms driving the expansion of cell colonies. This can improve our understanding of cancer formation and progression. Although much progress has been made in terms of developing and analysing mathematical models, far less progress has been made in terms of understanding how to estimate model parameters using experimental in vitro image-based data. To address this issue, a new approximate Bayesian computation (ABC) algorithm is proposed to estimate key parameters governing the expansion of melanoma cell (MM127) colonies, including cell diffusivity, D, cell proliferation rate, λ, and cell-to-cell adhesion, q, in two experimental scenarios, namely with and without a chemical treatment to suppress cell proliferation. Even when little prior biological knowledge about the parameters is assumed, all parameters are precisely inferred with a small posterior coefficient of variation, approximately 2–12%. The ABC analyses reveal that the posterior distributions of D and q depend on the experimental elapsed time, whereas the posterior distribution of λ does not. The posterior mean values of D are in the ranges 226–268 µm²h⁻¹ and 311–351 µm²h⁻¹, and those of q in the ranges 0.23–0.39 and 0.32–0.61, for the experimental periods of 0–24 h and 24–48 h, respectively. Furthermore, we found that the posterior distribution of q also depends on the initial cell density, whereas the posterior distributions of D and λ do not. The ABC approach also enables information from the two experiments to be combined, resulting in greater precision for all estimates of D and λ. PMID:26642072
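
The ABC idea itself fits in a few lines: a rejection sampler on a toy exponential-growth model (all numbers invented, and far simpler than the paper's lattice model and summary statistics) keeps only the prior draws whose simulated data land within a tolerance of the observation.

```python
import numpy as np

rng = np.random.default_rng(42)

def simulate(lam, n0=100, t=1.0):
    # Exponential colony growth observed with Poisson counting noise.
    return rng.poisson(n0 * np.exp(lam * t))

observed = simulate(0.8)                     # pretend this is the experiment

draws = rng.uniform(0.0, 2.0, size=200_000)  # uniform prior on the rate
sims = simulate(draws)                       # vectorised: one dataset per draw
accepted = draws[np.abs(sims - observed) <= 5]  # tolerance epsilon = 5

posterior_mean = accepted.mean()
print(posterior_mean)  # lands near the true rate 0.8
```

Rejection ABC wastes most draws; the sequential Monte Carlo variants used in work like this reach the same posterior far more efficiently.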

  20. Triazine-Substituted and Acyl Hydrazones: Experiment and Computation Reveal a Stability Inversion at Low pH.

    PubMed

    Ji, Kun; Lee, Changsuk; Janesko, Benjamin G; Simanek, Eric E

    2015-08-03

    Condensation of a hydrazine-substituted s-triazine with an aldehyde or ketone yields an equivalent to the widely used, acid-labile acyl hydrazone. Hydrolysis of these hydrazones, monitored by HPLC using a formaldehyde trap, reveals that triazine-substituted hydrazones are more labile than acetyl hydrazones at pH > 5. The reactivity trends mirror those of the corresponding acetyl hydrazones, with hydrolysis rates increasing along the series (aromatic aldehyde < …). Computational and experimental studies indicate a reversal in stability around the triazine pKa (pH ~5). Protonation of the triazine moiety retards acid-catalyzed hydrolysis of triazinyl hydrazones in comparison to acetyl hydrazone analogues. This behavior supports mechanistic interpretations suggesting that resistance to protonation of the hydrazone N1 is the critical factor affecting the reaction rate.
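
The reported crossover can be mimicked with a toy Henderson-Hasselbalch rate model (the rate constants and the 100-fold retardation on protonation are invented; only the inversion logic reflects the abstract):

```python
def frac_protonated(pH, pKa=5.0):
    # Henderson-Hasselbalch: fraction of triazine protonated at a given pH.
    return 1.0 / (1.0 + 10 ** (pH - pKa))

def k_triazinyl(pH):
    # Assumed: protonation slows acid-catalysed hydrolysis 100-fold.
    f = frac_protonated(pH)
    return 1.0 * (1 - f) + 0.01 * f

k_acetyl = 0.1  # assumed roughly pH-independent over this window

print(k_triazinyl(7.0) > k_acetyl)  # True: triazinyl more labile above the pKa
print(k_triazinyl(3.0) < k_acetyl)  # True: stability inversion at low pH
```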

  1. Novel Nipah virus immune-antagonism strategy revealed by experimental and computational study.

    PubMed

    Seto, Jeremy; Qiao, Liang; Guenzel, Carolin A; Xiao, Sa; Shaw, Megan L; Hayot, Fernand; Sealfon, Stuart C

    2010-11-01

    Nipah virus is an emerging pathogen that causes severe disease in humans. It expresses several antagonist proteins that subvert the immune response and that may contribute to its pathogenicity. Studies of its biology are difficult due to its high pathogenicity and requirement for biosafety level 4 containment. We integrated experimental and computational methods to elucidate the effects of Nipah virus immune antagonists. Individual Nipah virus immune antagonists (phosphoprotein and V and W proteins) were expressed from recombinant Newcastle disease viruses, and the responses of infected human monocyte-derived dendritic cells were determined. We developed an ordinary differential equation model of the infectious process that produced results with a high degree of correlation with these experimental results. In order to simulate the effects of wild-type virus, the model was extended to incorporate published experimental data on the time trajectories of immune-antagonist production. These data showed that the RNA-editing mechanism utilized by the wild-type Nipah virus to produce immune antagonists leads to a delay in the production of the most effective immune antagonists, V and W. Model simulations indicated that this delay caused a disconnection between attenuation of the antiviral response and suppression of inflammation. While the antiviral cytokines were efficiently suppressed at early time points, some early inflammatory cytokine production occurred, which would be expected to increase vascular permeability and promote virus spread and pathogenesis. These results suggest that Nipah virus has evolved a unique immune-antagonist strategy that benefits from controlled expression of multiple antagonist proteins with various potencies.

  2. Turkish and English Language Teacher Candidates' Perceived Computer Self-Efficacy and Attitudes toward Computer

    ERIC Educational Resources Information Center

    Adalier, Ahmet

    2012-01-01

    The aim of this study is to reveal the relation between the Turkish and English language teacher candidates' social demographic characteristics and their perceived computer self-efficacy and attitudes toward computer. The population of the study consists of the teacher candidates in the Turkish and English language departments at the universities…

  3. Examining the architecture of cellular computing through a comparative study with a computer

    PubMed Central

    Wang, Degeng; Gribskov, Michael

    2005-01-01

    The computer and the cell both use information embedded in simple coding, the binary software code and the quadruple genomic code, respectively, to support system operations. A comparative examination of their system architecture as well as their information storage and utilization schemes is performed. On top of the code, both systems display a modular, multi-layered architecture, which, in the case of a computer, arises from human engineering efforts through a combination of hardware implementation and software abstraction. Using the computer as a reference system, a simplistic mapping of the architectural components between the two is easily detected. This comparison also reveals that a cell abolishes the software–hardware barrier through genomic encoding for the constituents of the biochemical network, a cell's ‘hardware’ equivalent to the computer central processing unit (CPU). The information loading (gene expression) process acts as a major determinant of the encoded constituent's abundance, which, in turn, often determines the ‘bandwidth’ of a biochemical pathway. Cellular processes are implemented in biochemical pathways in parallel manners. In a computer, on the other hand, the software provides only instructions and data for the CPU. A process represents just sequentially ordered actions by the CPU and only virtual parallelism can be implemented through CPU time-sharing. Whereas process management in a computer may simply mean job scheduling, coordinating pathway bandwidth through the gene expression machinery represents a major process management scheme in a cell. In summary, a cell can be viewed as a super-parallel computer, which computes through controlled hardware composition. While we have, at best, a very fragmented understanding of cellular operation, we have a thorough understanding of the computer throughout the engineering process. The potential utilization of this knowledge to the benefit of systems biology is discussed.

  4. Examining the architecture of cellular computing through a comparative study with a computer.

    PubMed

    Wang, Degeng; Gribskov, Michael

    2005-06-22

    The computer and the cell both use information embedded in simple coding, the binary software code and the quadruple genomic code, respectively, to support system operations. A comparative examination of their system architecture as well as their information storage and utilization schemes is performed. On top of the code, both systems display a modular, multi-layered architecture, which, in the case of a computer, arises from human engineering efforts through a combination of hardware implementation and software abstraction. Using the computer as a reference system, a simplistic mapping of the architectural components between the two is easily detected. This comparison also reveals that a cell abolishes the software-hardware barrier through genomic encoding for the constituents of the biochemical network, a cell's "hardware" equivalent to the computer central processing unit (CPU). The information loading (gene expression) process acts as a major determinant of the encoded constituent's abundance, which, in turn, often determines the "bandwidth" of a biochemical pathway. Cellular processes are implemented in biochemical pathways in parallel manners. In a computer, on the other hand, the software provides only instructions and data for the CPU. A process represents just sequentially ordered actions by the CPU and only virtual parallelism can be implemented through CPU time-sharing. Whereas process management in a computer may simply mean job scheduling, coordinating pathway bandwidth through the gene expression machinery represents a major process management scheme in a cell. In summary, a cell can be viewed as a super-parallel computer, which computes through controlled hardware composition. While we have, at best, a very fragmented understanding of cellular operation, we have a thorough understanding of the computer throughout the engineering process. The potential utilization of this knowledge to the benefit of systems biology is discussed.

  5. A preliminary study on the short-term efficacy of chairside computer-aided design/computer-assisted manufacturing-generated posterior lithium disilicate crowns.

    PubMed

    Reich, Sven; Fischer, Sören; Sobotta, Bernhard; Klapper, Horst-Uwe; Gozdowski, Stephan

    2010-01-01

    The purpose of this preliminary study was to evaluate the clinical performance of chairside-generated crowns over a 24-month period. Forty-one posterior crowns made of a machinable lithium disilicate ceramic for full-contour crowns were inserted in 34 patients using a chairside computer-aided design/computer-assisted manufacturing technique. The crowns were evaluated at baseline and after 6, 12, and 24 months according to modified United States Public Health Service criteria. After 2 years, all reexamined crowns (n = 39) were in situ; one abutment exhibited secondary caries and two abutments received root canal treatment. Within the limited observation period, the crowns showed clinically satisfactory results.

  6. Bridging the digital divide through the integration of computer and information technology in science education: An action research study

    NASA Astrophysics Data System (ADS)

    Brown, Gail Laverne

    The presence of a digital divide, computer and information technology integration effectiveness, and barriers to continued usage of computer and information technology were investigated. Thirty-four African American and Caucasian American students (17 males and 17 females) in grades 9-11 from 2 Georgia high school science classes were exposed to 30 hours of hands-on computer and information technology skills instruction. The purpose of the exposure was to improve students' computer and information technology skills. Pre-study and post-study skills surveys and structured interviews were used to compare race, gender, income, grade-level, and age differences with respect to computer usage. A paired t-test and McNemar test determined mean differences between student pre-study and post-study perceived skills levels. The results were consistent with findings of the National Telecommunications and Information Administration (2000) that indicated the presence of a digital divide and digital inclusion. Caucasian American participants were found to have more at-home computer and Internet access than African American participants, indicating that there is a digital divide by ethnicity. Caucasian American females were found to have more computer and Internet access, which was an indication of digital inclusion. Sophomores had more at-home computer access and Internet access than other grade levels, indicating digital inclusion. Students receiving regular meals had more computer and Internet access than students receiving free/reduced meals. Older students had more computer and Internet access than younger students. African American males had been using computer and information technology the longest, which is an indication of inclusion. The paired t-test and McNemar test revealed significant increases in students' perceived skill levels. Interviews did not reveal any barriers to continued usage of the computer and information technology skills.

  7. Academic computer science and gender: A naturalistic study investigating the causes of attrition

    NASA Astrophysics Data System (ADS)

    Declue, Timothy Hall

    Far fewer women than men take computer science classes in high school, enroll in computer science programs in college, or complete advanced degrees in computer science. The computer science pipeline begins to shrink for women even before entering college, but it is at the college level that the "brain drain" is the most evident numerically, especially in the first class taken by most computer science majors, called "Computer Science 1" or CS-I. The result is a pronounced gender disparity in both academic and industrial computer science. The study revealed the existence of several factors influencing success in CS-I. First, and most clearly, the effect of attribution processes seemed to be quite strong. These processes tend to work against success for females and in favor of success for males. Likewise, evidence was discovered that strengthens theories related to prior experience and the perception that computer science has a culture that is hostile to females. Two unanticipated themes emerged relating to the motivation and persistence of successful computer science majors. The findings did not support the belief that females have greater logistical problems in computer science than males, or that females tend to have a different programming style than males, which adversely affects the females' ability to succeed in CS-I.

  8. A computer program for sample size computations for banding studies

    USGS Publications Warehouse

    Wilson, K.R.; Nichols, J.D.; Hines, J.E.

    1989-01-01

    Sample sizes necessary for estimating survival rates of banded birds, adults and young, are derived based on specified levels of precision. The banding study can be new or ongoing. The desired coefficient of variation (CV) for annual survival estimates, the CV for mean annual survival estimates, and the length of the study must be specified to compute sample sizes. A computer program is available for computation of the sample sizes, and a description of the input and output is provided.
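
    A CV-based sample-size calculation of this kind can be sketched as follows. This is a simplified binomial approximation for illustration only; the published program's formulas also account for band-recovery structure and study length.

```python
import math

def banding_sample_size(survival, cv_target):
    """Bands needed so the annual survival estimate s_hat reaches a
    desired coefficient of variation.

    Illustrative binomial approximation: Var(s_hat) = s(1 - s)/n,
    hence CV = sqrt((1 - s)/(n*s)) and n = (1 - s)/(s * CV^2).
    Not the actual algorithm of the program described above.
    """
    return math.ceil((1.0 - survival) / (survival * cv_target ** 2))
```

    For example, estimating an annual survival rate of 0.6 to within a 5% CV gives `banding_sample_size(0.6, 0.05)`, i.e. 267 birds under this approximation; tighter precision targets drive the required sample up quadratically.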

  9. Pupillary dynamics reveal computational cost in sentence planning.

    PubMed

    Sevilla, Yamila; Maldonado, Mora; Shalóm, Diego E

    2014-01-01

    This study investigated the computational cost associated with grammatical planning in sentence production. We measured people's pupillary responses as they produced spoken descriptions of depicted events. We manipulated the syntactic structure of the target by training subjects to use different types of sentences following a colour cue. The results showed a greater increase in pupil size for the production of passive and object-dislocated sentences than for active canonical subject-verb-object sentences, indicating that more cognitive effort is associated with more complex noncanonical thematic order. We also manipulated the time at which the cue that triggered structure-building processes was presented. The differential increase in pupil diameter for more complex sentences rose earlier as the colour cue was presented earlier, suggesting that the observed pupillary changes are due to differential demands in relatively independent structure-building processes during grammatical planning. Task-evoked pupillary responses provide a reliable measure to study the cognitive processes involved in sentence production.

  10. Wrist Hypothermia Related to Continuous Work with a Computer Mouse: A Digital Infrared Imaging Pilot Study

    PubMed Central

    Reste, Jelena; Zvagule, Tija; Kurjane, Natalja; Martinsone, Zanna; Martinsone, Inese; Seile, Anita; Vanadzins, Ivars

    2015-01-01

    Computer work is characterized by sedentary static workload with low-intensity energy metabolism. The aim of our study was to evaluate the dynamics of skin surface temperature in the hand during prolonged computer mouse work under different ergonomic setups. Digital infrared imaging of the right forearm and wrist was performed during three hours of continuous computer work (measured at the start and every 15 minutes thereafter) in a laboratory with controlled ambient conditions. Four people participated in the study. Three different ergonomic computer mouse setups were tested on three different days (horizontal computer mouse without mouse pad; horizontal computer mouse with mouse pad and padded wrist support; vertical computer mouse without mouse pad). The study revealed a significantly strong negative correlation between the temperature of the dorsal surface of the wrist and time spent working with a computer mouse. Hand skin temperature decreased markedly after one hour of continuous computer mouse work. Vertical computer mouse work preserved more stable and higher temperatures of the wrist (>30 °C), while continuous use of a horizontal mouse for more than two hours caused an extremely low temperature (<28 °C) in distal parts of the hand. The preliminary observational findings indicate the significant effect of the duration and ergonomics of computer mouse work on the development of hand hypothermia. PMID:26262633

  11. Wrist Hypothermia Related to Continuous Work with a Computer Mouse: A Digital Infrared Imaging Pilot Study.

    PubMed

    Reste, Jelena; Zvagule, Tija; Kurjane, Natalja; Martinsone, Zanna; Martinsone, Inese; Seile, Anita; Vanadzins, Ivars

    2015-08-07

    Computer work is characterized by sedentary static workload with low-intensity energy metabolism. The aim of our study was to evaluate the dynamics of skin surface temperature in the hand during prolonged computer mouse work under different ergonomic setups. Digital infrared imaging of the right forearm and wrist was performed during three hours of continuous computer work (measured at the start and every 15 minutes thereafter) in a laboratory with controlled ambient conditions. Four people participated in the study. Three different ergonomic computer mouse setups were tested on three different days (horizontal computer mouse without mouse pad; horizontal computer mouse with mouse pad and padded wrist support; vertical computer mouse without mouse pad). The study revealed a significantly strong negative correlation between the temperature of the dorsal surface of the wrist and time spent working with a computer mouse. Hand skin temperature decreased markedly after one hour of continuous computer mouse work. Vertical computer mouse work preserved more stable and higher temperatures of the wrist (>30 °C), while continuous use of a horizontal mouse for more than two hours caused an extremely low temperature (<28 °C) in distal parts of the hand. The preliminary observational findings indicate the significant effect of the duration and ergonomics of computer mouse work on the development of hand hypothermia.

  12. Computational study of noise in a large signal transduction network.

    PubMed

    Intosalmi, Jukka; Manninen, Tiina; Ruohonen, Keijo; Linne, Marja-Leena

    2011-06-21

    Biochemical systems are inherently noisy due to the discrete reaction events that occur in a random manner. Although noise is often perceived as a disturbing factor, the system might actually benefit from it. In order to understand the role of noise better, its quality must be studied in a quantitative manner. Computational analysis and modeling play an essential role in this demanding endeavor. We implemented a large nonlinear signal transduction network combining protein kinase C, mitogen-activated protein kinase, phospholipase A2, and β isoform of phospholipase C networks. We simulated the network in 300 different cellular volumes using the exact Gillespie stochastic simulation algorithm and analyzed the results in both the time and frequency domain. In order to perform simulations in a reasonable time, we used modern parallel computing techniques. The analysis revealed that time and frequency domain characteristics depend on the system volume. The simulation results also indicated that there are several kinds of noise processes in the network, all of them representing different kinds of low-frequency fluctuations. In the simulations, the power of noise decreased on all frequencies when the system volume was increased. We concluded that basic frequency domain techniques can be applied to the analysis of simulation results produced by the Gillespie stochastic simulation algorithm. This approach is suited not only to the study of fluctuations but also to the study of pure noise processes. Noise seems to have an important role in biochemical systems and its properties can be numerically studied by simulating the reacting system in different cellular volumes. Parallel computing techniques make it possible to run massive simulations in hundreds of volumes and, as a result, accurate statistics can be obtained from computational studies. © 2011 Intosalmi et al; licensee BioMed Central Ltd.
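
    The volume dependence of noise reported above can be reproduced in miniature with the Gillespie algorithm on a toy birth-death process. This is a sketch under assumed rate constants, not the paper's full kinase/phospholipase network.

```python
import random

def gillespie_birth_death(k_prod, k_deg, volume, n_events, seed=0):
    """Gillespie stochastic simulation of a birth-death process:
    0 -> X with propensity k_prod * volume, X -> 0 with propensity
    k_deg * x. Starts at the deterministic steady state and records
    the copy number after each of n_events reaction events.
    Rates are illustrative assumptions.
    """
    rng = random.Random(seed)
    x = int(k_prod / k_deg * volume)       # steady-state copy number
    t, counts = 0.0, []
    for _ in range(n_events):
        a_prod = k_prod * volume           # zeroth-order production
        a_deg = k_deg * x                  # first-order degradation
        t += rng.expovariate(a_prod + a_deg)   # exponential waiting time
        if rng.random() * (a_prod + a_deg) < a_prod:
            x += 1
        else:
            x -= 1
        counts.append(x)
    return counts

def coefficient_of_variation(samples):
    """Relative noise strength of a trajectory (event-sampled, for brevity)."""
    mean = sum(samples) / len(samples)
    var = sum((s - mean) ** 2 for s in samples) / len(samples)
    return var ** 0.5 / mean
```

    Comparing `coefficient_of_variation(gillespie_birth_death(10.0, 1.0, v, 20000))` for increasing `v` shows the relative fluctuations shrinking roughly as 1/sqrt(v), the same trend the study observed across its 300 simulated volumes.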

  13. A computational study of the topology of vortex breakdown

    NASA Technical Reports Server (NTRS)

    Spall, Robert E.; Gatski, Thomas B.

    1991-01-01

    A fully three-dimensional numerical simulation of vortex breakdown using the unsteady, incompressible Navier-Stokes equations has been performed. Solutions to four distinct types of breakdown are identified and compared with experimental results. The computed solutions include weak helical, double helix, spiral, and bubble-type breakdowns. The topological structure of the various breakdowns as well as their interrelationship are studied. The data reveal that the asymmetric modes of breakdown may be subject to additional breakdowns as the vortex core evolves in the streamwise direction. The solutions also show that the freestream axial velocity distribution has a significant effect on the position and type of vortex breakdown.

  14. Case Study: Creation of a Degree Program in Computer Security. White Paper.

    ERIC Educational Resources Information Center

    Belon, Barbara; Wright, Marie

    This paper reports on research into the field of computer security, and undergraduate degrees offered in that field. Research described in the paper reveals only one computer security program at the associate's degree level in the entire country. That program, at Texas State Technical College in Waco, is a 71-credit-hour program leading to an…

  15. Web-based continuing medical education. (II): Evaluation study of computer-mediated continuing medical education.

    PubMed

    Curran, V R; Hoekman, T; Gulliver, W; Landells, I; Hatcher, L

    2000-01-01

    Over the years, various distance learning technologies and methods have been applied to the continuing medical education needs of rural and remote physicians. They have included audio teleconferencing, slow scan imaging, correspondence study, and compressed videoconferencing. The recent emergence and growth of Internet, World Wide Web (Web), and compact disk read-only-memory (CD-ROM) technologies have introduced new opportunities for providing continuing education to the rural medical practitioner. This evaluation study assessed the instructional effectiveness of a hybrid computer-mediated courseware delivery system on dermatologic office procedures. A hybrid delivery system merges Web documents, multimedia, computer-mediated communications, and CD-ROMs to enable self-paced instruction and collaborative learning. Using a modified pretest to post-test control group study design, several evaluative criteria (participant reaction, learning achievement, self-reported performance change, and instructional transactions) were assessed by various qualitative and quantitative data collection methods. This evaluation revealed that a hybrid computer-mediated courseware system was an effective means for increasing knowledge (p < .05) and improving self-reported competency (p < .05) in dermatologic office procedures, and that participants were very satisfied with the self-paced instruction and use of asynchronous computer conferencing for collaborative information sharing among colleagues.

  16. The Effects of Study Tasks in a Computer-Based Chemistry Learning Environment

    NASA Astrophysics Data System (ADS)

    Urhahne, Detlef; Nick, Sabine; Poepping, Anna Christin; Schulz, Sarah Jayne

    2013-12-01

    The present study examines the effects of different study tasks on the acquisition of knowledge about acids and bases in a computer-based learning environment. Three different task formats were selected to create three treatment conditions: learning with gap-fill and matching tasks, learning with multiple-choice tasks, and learning only from text and figures without any additional tasks. Participants were 196 ninth-grade students who learned with a self-developed multimedia program in a pretest-posttest control group design. Research results reveal that gap-fill and matching tasks were most effective in promoting knowledge acquisition, followed by multiple-choice tasks, and no tasks at all. The findings are in line with previous research on this topic. The effects can possibly be explained by the generation-recognition model, which predicts that gap-fill and matching tasks trigger more encompassing learning processes than multiple-choice tasks. It is concluded that instructional designers should incorporate more challenging study tasks for enhancing the effectiveness of computer-based learning environments.

  17. An oculomotor and computational study of a patient with diagonistic dyspraxia.

    PubMed

    Pouget, Pierre; Pradat-Diehl, Pascale; Rivaud-Péchoux, Sophie; Wattiez, Nicolas; Gaymard, Bertrand

    2011-04-01

    Diagonistic dyspraxia (DD) is a behavioural disorder encountered in split-brain subjects in which the left arm acts against the subject's will, deliberately counteracting what the right arm does. We report here an oculomotor and computational study of a patient with a long-lasting form of DD. A first series of oculomotor paradigms revealed marked and unprecedented saccade impairments. We used a computational model in order to provide information about the impaired decision-making process: the analysis of saccade latencies revealed that variations of decision times were explained by adjustments of the response criterion. This result and paradoxical impairments observed in additional oculomotor paradigms allowed us to propose that this adjustment of the criterion level resulted from the co-existence of counteracting oculomotor programs, consistent with the existence of antagonist programs in homotopic cortical areas. In the intact brain, trans-hemispheric inhibition would allow suppression of these counter programs. Depending on the topography of the disconnected areas, various motor and/or behavioural impairments would arise in split-brain subjects. In motor systems, such conflict would result in increased criteria for desired movement execution (oculomotor system) or in simultaneous execution of counteracting movements (skeletal motor system). At higher cognitive levels, it may result in conflict of intentions. Copyright © 2010 Elsevier Srl. All rights reserved.

  18. Anti-diarrheal activity of (-)-epicatechin from Chiranthodendron pentadactylon Larreat: experimental and computational studies.

    PubMed

    Velázquez, Claudia; Correa-Basurto, José; Garcia-Hernandez, Normand; Barbosa, Elizabeth; Tesoro-Cruz, Emiliano; Calzada, Samuel; Calzada, Fernando

    2012-09-28

    Chiranthodendron pentadactylon Larreat is frequently used in Mexican and Guatemalan traditional medicine for several medicinal purposes, including the control of diarrhea. This work was undertaken to obtain additional information supporting the traditional use of Chiranthodendron pentadactylon Larreat on a pharmacological basis, using the major isolated antisecretory compound in computational, in vitro and in vivo experiments. (-)-Epicatechin was isolated from the ethyl acetate fraction of the plant crude extract. In vivo toxin (Vibrio cholerae or Escherichia coli)-induced intestinal secretion in rat jejunal loop models and sodium dodecyl sulphate-polyacrylamide gel electrophoresis (SDS-PAGE) analysis of Vibrio cholerae toxin were used in the experimental studies, while the molecular docking technique was used to conduct the computational study. The antisecretory activity of epicatechin was tested against Vibrio cholerae and Escherichia coli toxins at an oral dose of 10 mg/kg in the rat model. It exhibited the most potent activity against Vibrio cholerae toxin (56.9% inhibition). In the case of Escherichia coli toxin its effect was moderate (24.1% inhibition). SDS-PAGE analysis revealed that both (-)-epicatechin and Chiranthodendron pentadactylon extract interacted with the Vibrio cholerae toxin at concentrations of 80 μg/mL and 300 μg/mL, respectively. Computational molecular docking showed that epicatechin interacted with four amino acid residues (Asn 103, Phe 31, Phe 223 and Thr 78) in the catalytic site of Vibrio cholerae toxin, revealing its potential binding mode at the molecular level. The results derived from computational, in vitro and in vivo experiments on Vibrio cholerae and Escherichia coli toxins confirm the potential of epicatechin as a new antisecretory compound and give additional scientific support to the anecdotal use of Chiranthodendron pentadactylon Larreat in Mexican traditional medicine to treat gastrointestinal disorders such as diarrhea.

  19. Computer-Based Education in the Social Studies.

    ERIC Educational Resources Information Center

    Ehman, Lee H.; Glenn, Allen D.

    Computers have not revolutionized social studies curricula because so few teachers use them. But research does indicate that computers are flexible instructional tools that can assist in the development of attitudes, intellectual motivation, and inquiry skills. Social studies educators need to consider expanded computer use in their classrooms…

  20. CARE+ user study: usability and attitudes towards a tablet pc computer counseling tool for HIV+ men and women.

    PubMed

    Skeels, Meredith M; Kurth, Ann; Clausen, Marc; Severynen, Anneleen; Garcia-Smith, Hal

    2006-01-01

    CARE+ is a tablet PC-based computer counseling tool designed to support medication adherence and secondary HIV prevention for people living with HIV. Thirty HIV+ men and women participated in our user study to assess usability and attitudes towards CARE+. We observed them using CARE+ for the first time and conducted a semi-structured interview afterwards. Our findings suggest computer counseling may reduce social bias and encourage participants to answer questions honestly. Participants felt that discussing sensitive subjects with a computer instead of a person reduced feelings of embarrassment and being judged, and promoted privacy. Results also confirm that potential users think computers can provide helpful counseling, and that many also want human counseling interaction. Our study also revealed that tablet PC-based applications are usable by our population of mixed experience computer users. Computer counseling holds great potential for providing assessment and health promotion to individuals with chronic conditions such as HIV.

  1. A study of computer-related upper limb discomfort and computer vision syndrome.

    PubMed

    Sen, A; Richardson, Stanley

    2007-12-01

    Personal computers are one of the commonest office tools in Malaysia today. Their usage, even for three hours per day, leads to a health risk of developing Occupational Overuse Syndrome (OOS), Computer Vision Syndrome (CVS), low back pain, tension headaches and psychosocial stress. The study was conducted to investigate how a multiethnic society in Malaysia is coping with these problems that are increasing at a phenomenal rate in the west. This study investigated computer usage, awareness of ergonomic modifications of computer furniture and peripherals, symptoms of CVS and risk of developing OOS. A cross-sectional questionnaire study of 136 computer users was conducted on a sample population of university students and office staff. A 'Modified Rapid Upper Limb Assessment (RULA) for office work' technique was used for evaluation of OOS. The prevalence of CVS was surveyed incorporating a 10-point scoring system for each of its various symptoms. It was found that many were using a standard keyboard and mouse without any ergonomic modifications. Around 50% of those with some low back pain did not have an adjustable backrest. Many users had higher RULA scores for the wrist and neck, suggesting an increased risk of developing OOS, which needed further intervention. Many (64%) were using refractive corrections and still had high CVS scores, commonly including eye fatigue, headache and burning sensation. The increase in CVS scores (suggesting more subjective symptoms) correlated with longer spells of computer usage. It was concluded that further onsite studies are needed to follow up this survey and decrease the risks of developing CVS and OOS amongst young computer users.

  2. Personality Types and Affinity for Computers

    DTIC Science & Technology

    1991-03-01

    This study examined differences on personality dimensions between the respondents and explored the relationship between these differences and computer affinity. The results revealed no significant differences relating personality type to this measure of computer affinity.

  3. Case Study Discussion Experiences of Computer Education and Instructional Technologies Students about Instructional Design on an Asynchronous Environment

    ERIC Educational Resources Information Center

    Baran, Bahar; Keles, Esra

    2011-01-01

    The aim of this study is to reveal opinions and experiences of two Computer Education and Instructional Technologies Departments' students about case study discussion method after they discussed in online asynchronous environment about Instructional Design (ID). Totally, 80 second year students, 40 from Dokuz Eylul University and 40 from Karadeniz…

  4. Pd-Catalyzed N-Arylation of Secondary Acyclic Amides: Catalyst Development, Scope, and Computational Study

    PubMed Central

    Hicks, Jacqueline D.; Hyde, Alan M.; Cuezva, Alberto Martinez; Buchwald, Stephen L.

    2009-01-01

    We report the efficient N-arylation of acyclic secondary amides and related nucleophiles with aryl nonaflates, triflates, and chlorides. This method allows for easy variation of the aromatic component in tertiary aryl amides. A new biaryl phosphine with P-bound 3,5-(bis)trifluoromethylphenyl groups was found to be uniquely effective for this amidation. The critical aspects of the ligand were explored through synthetic, mechanistic, and computational studies. Systematic variation of the ligand revealed the importance of (1) a methoxy group on the aromatic carbon of the “top ring” ortho to the phosphorus and (2) two highly electron-withdrawing P-bound 3,5-(bis)trifluoromethylphenyl groups. Computational studies suggest the electron-deficient nature of the ligand is important in facilitating amide binding to the LPd(II)(Ph)(X) intermediate. PMID:19886610

  5. Combining H/D exchange mass spectroscopy and computational docking reveals extended DNA-binding surface on uracil-DNA glycosylase

    PubMed Central

    Roberts, Victoria A.; Pique, Michael E.; Hsu, Simon; Li, Sheng; Slupphaug, Geir; Rambo, Robert P.; Jamison, Jonathan W.; Liu, Tong; Lee, Jun H.; Tainer, John A.; Ten Eyck, Lynn F.; Woods, Virgil L.

    2012-01-01

    X-ray crystallography provides excellent structural data on protein–DNA interfaces, but crystallographic complexes typically contain only small fragments of large DNA molecules. We present a new approach that can use longer DNA substrates and reveal new protein–DNA interactions even in extensively studied systems. Our approach combines rigid-body computational docking with hydrogen/deuterium exchange mass spectrometry (DXMS). DXMS identifies solvent-exposed protein surfaces; docking is used to create a 3-dimensional model of the protein–DNA interaction. We investigated the enzyme uracil-DNA glycosylase (UNG), which detects and cleaves uracil from DNA. UNG was incubated with a 30 bp DNA fragment containing a single uracil, giving the complex with the abasic DNA product. Compared with free UNG, the UNG–DNA complex showed increased solvent protection at the UNG active site and at two regions outside the active site: residues 210–220 and 251–264. Computational docking also identified these two DNA-binding surfaces, but neither shows DNA contact in UNG–DNA crystallographic structures. Our results can be explained by separation of the two DNA strands on one side of the active site. These non-sequence-specific DNA-binding surfaces may aid local uracil search, contribute to binding the abasic DNA product and help present the DNA product to APE-1, the next enzyme on the DNA-repair pathway. PMID:22492624

  6. Reviewing and Critiquing Computer Learning and Usage among Older Adults

    ERIC Educational Resources Information Center

    Kim, Young Sek

    2008-01-01

    By searching the keywords of "older adult" and "computer" in ERIC, Academic Search Premier, and PsycINFO, this study reviewed 70 studies published after 1990 that address older adults' computer learning and usage. This study revealed 5 prominent themes among reviewed literature: (a) motivations and barriers of older adults' usage of computers, (b)…

  7. Computed tomography: Will the slices reveal the truth

    PubMed Central

    Haridas, Harish; Mohan, Abarajithan; Papisetti, Sravanthi; Ealla, Kranti K. R.

    2016-01-01

    With the advances in the field of imaging sciences, new methods have been developed in dental radiology. These include digital radiography, density analyzing methods, cone beam computed tomography (CBCT), magnetic resonance imaging, ultrasound, and nuclear imaging techniques, which provide high-resolution detailed images of oral structures. The current review aims to critically elaborate the use of CBCT in endodontics. PMID:27652253

  8. Computer-Assisted Learning in UK Engineering Degree Programmes: Lessons Learned from an Extensive Case Study Programme

    ERIC Educational Resources Information Center

    Rothberg, S. J.; Lamb, F. M.; Willis, L.

    2006-01-01

    This paper gives a synopsis of an extensive programme of case studies on real uses of computer-assisted learning (CAL) materials within UK engineering degree programmes. The programme was conducted between 2000 and 2003 and followed a questionnaire-based survey looking at CAL use in the UK and in Australia. The synopsis reveals a number of key…

  9. Need Assessment of Computer Science and Engineering Graduates

    ERIC Educational Resources Information Center

    Surakka, Sami; Malmi, Lauri

    2005-01-01

    This case study considered the syllabus of the first and second year studies in computer science. The aim of the study was to reveal which topics covered in the syllabi were really needed during the following years of study or in working life. The program that was assessed in the study was a Master's program in computer science and engineering at a…

  10. Linguistic Analysis of Natural Language Communication with Computers.

    ERIC Educational Resources Information Center

    Thompson, Bozena Henisz

    Interaction with computers in natural language requires a language that is flexible and suited to the task. This study of natural dialogue was undertaken to reveal those characteristics which can make computer English more natural. Experiments were made in three modes of communication: face-to-face, terminal-to-terminal, and human-to-computer,…

  11. Revealed Preference Methods for Studying Bicycle Route Choice-A Systematic Review.

    PubMed

    Pritchard, Ray

    2018-03-07

    One fundamental aspect of promoting utilitarian bicycle use involves making modifications to the built environment to improve the safety, efficiency and enjoyability of cycling. Revealed preference data on bicycle route choice can assist greatly in understanding the actual behaviour of a highly heterogeneous group of users, which in turn assists the prioritisation of infrastructure or other built environment initiatives. This systematic review seeks to compare the relative strengths and weaknesses of the empirical approaches for evaluating whole journey route choices of bicyclists. Two electronic databases were systematically searched for a selection of keywords pertaining to bicycle and route choice. In total seven families of methods are identified: GPS devices, smartphone applications, crowdsourcing, participant-recalled routes, accompanied journeys, egocentric cameras and virtual reality. The study illustrates a trade-off in the quality of data obtainable and the average number of participants. Future additional methods could include dockless bikeshare, multiple camera solutions using computer vision and immersive bicycle simulator environments.

  12. Revealed Preference Methods for Studying Bicycle Route Choice—A Systematic Review

    PubMed Central

    2018-01-01

    One fundamental aspect of promoting utilitarian bicycle use involves making modifications to the built environment to improve the safety, efficiency and enjoyability of cycling. Revealed preference data on bicycle route choice can assist greatly in understanding the actual behaviour of a highly heterogeneous group of users, which in turn assists the prioritisation of infrastructure or other built environment initiatives. This systematic review seeks to compare the relative strengths and weaknesses of the empirical approaches for evaluating whole journey route choices of bicyclists. Two electronic databases were systematically searched for a selection of keywords pertaining to bicycle and route choice. In total seven families of methods are identified: GPS devices, smartphone applications, crowdsourcing, participant-recalled routes, accompanied journeys, egocentric cameras and virtual reality. The study illustrates a trade-off in the quality of data obtainable and the average number of participants. Future additional methods could include dockless bikeshare, multiple camera solutions using computer vision and immersive bicycle simulator environments. PMID:29518991

  13. Computer Anxiety: Relationship to Math Anxiety and Holland Types.

    ERIC Educational Resources Information Center

    Bellando, Jayne; Winer, Jane L.

    Although the number of computers in the school system is increasing, many schools are not using computers to their capacity. One reason for this may be computer anxiety on the part of the teacher. A review of the computer anxiety literature reveals little information on the subject, and findings from previous studies suggest that basic controlled…

  14. Educational NASA Computational and Scientific Studies (enCOMPASS)

    NASA Technical Reports Server (NTRS)

    Memarsadeghi, Nargess

    2013-01-01

    Educational NASA Computational and Scientific Studies (enCOMPASS) is an educational project of NASA Goddard Space Flight Center aimed at bridging the gap between the computational objectives and needs of NASA's scientific research, missions, and projects, and academia's latest advances in applied mathematics and computer science. enCOMPASS achieves this goal via bidirectional collaboration and communication between NASA and academia. Using the developed NASA Computational Case Studies in university computer science/engineering and applied mathematics classes is one way of addressing NASA's goal of contributing to the Science, Technology, Engineering, and Math (STEM) National Objective. The enCOMPASS Web site at http://encompass.gsfc.nasa.gov provides additional information. There are currently nine enCOMPASS case studies developed in the areas of earth sciences, planetary sciences, and astrophysics. Some of these case studies have been published in AIP and IEEE's Computing in Science and Engineering magazines. A few university professors have used enCOMPASS case studies in their computational classes and contributed their findings to NASA scientists. In these case studies, after an introduction to the science area, the specific problem, and related NASA missions, students are first asked to solve a known problem using NASA data and past approaches, often published in a scientific/research paper. Then, after learning about the NASA application and the related computational tools and approaches for solving the proposed problem, students are given a harder problem as a challenge to research and develop solutions for. This project provides a model for NASA scientists and engineers on one side, and university students, faculty, and researchers in computer science and applied mathematics on the other, to learn from each other's areas of work, computational needs and solutions, and the latest advances in research and development. This innovation takes NASA science and…

  15. Study of USGS/NASA land use classification system. [computer analysis from LANDSAT data

    NASA Technical Reports Server (NTRS)

    Spann, G. W.

    1975-01-01

    The results of a computer mapping project using LANDSAT data and the USGS/NASA land use classification system are summarized. During the computer mapping portion of the project, accuracies of 67 percent to 79 percent were achieved using Level II of the classification system and a 4,000 acre test site centered on Douglasville, Georgia. Analysis of responses to a questionnaire circulated to actual and potential LANDSAT data users reveals several important findings: (1) there is a substantial desire for additional information related to LANDSAT capabilities; (2) a majority of the respondents feel computer mapping from LANDSAT data could aid present or future projects; and (3) the costs of computer mapping are substantially less than those of other methods.
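    The accuracies above are overall per-pixel figures. As a hedged illustration (the class labels and pixel values below are invented, not data from the Douglasville test site), overall classification accuracy is simply the fraction of pixels whose mapped Level II class matches the reference class:

```python
# Illustrative only: reference vs. classified Level II labels for six pixels.
# The class names and values are invented, not data from the study.
reference  = ["urban", "forest", "water", "forest", "agric", "urban"]
classified = ["urban", "forest", "water", "agric",  "agric", "forest"]

# Overall accuracy = correctly classified pixels / total pixels
correct = sum(r == c for r, c in zip(reference, classified))
accuracy = correct / len(reference)
print(f"overall accuracy: {accuracy:.0%}")  # 4 of 6 pixels agree
```

    A full accuracy assessment would tabulate these counts per class in a confusion matrix, but the overall figure is the one quoted in summaries like the abstract above.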

  16. Computer technology forecast study for general aviation

    NASA Technical Reports Server (NTRS)

    Seacord, C. L.; Vaughn, D.

    1976-01-01

    A multi-year, multi-faceted program is underway to investigate and develop potential improvements in airframes, engines, and avionics for general aviation aircraft. The objective of this study was to assemble information that will allow the government to assess the trends in computer and computer/operator interface technology that may have application to general aviation in the 1980's and beyond. The current state of the art of computer hardware is assessed, technical developments in computer hardware are predicted, and nonaviation large volume users of computer hardware are identified.

  17. Network-based study reveals potential infection pathways of hepatitis-C leading to various diseases.

    PubMed

    Mukhopadhyay, Anirban; Maulik, Ujjwal

    2014-01-01

    Protein-protein interaction network-based study of viral pathogenesis has been gaining popularity among computational biologists in recent years. In the present study we attempt to investigate the possible pathways of hepatitis-C virus (HCV) infection by integrating three networks: the HCV-human interaction network, the human protein interactome, and the human genetic disease association network. We propose quasi-biclique and quasi-clique mining algorithms to integrate these three networks and identify infection gateway host proteins and possible pathways of HCV pathogenesis leading to various diseases, including cancers. The gateway proteins are found to be biologically coherent and to have high degrees in the human interactome compared with the other virus-targeted proteins. The analyses done in this study provide possible targets for more effective anti-hepatitis-C therapeutic intervention.
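    As a minimal sketch of the "gateway" idea described above (toy protein names and edges, not the paper's data or its quasi-clique algorithms), HCV-targeted host proteins can be ranked by their degree in the human interactome:

```python
from collections import defaultdict

# Toy human protein-protein interaction edges (illustrative only)
ppi_edges = [
    ("TP53", "MDM2"), ("TP53", "EP300"), ("EP300", "CREBBP"),
    ("MDM2", "UBE2D1"), ("TP53", "BRCA1"), ("BRCA1", "BARD1"),
]
degree = defaultdict(int)
for a, b in ppi_edges:
    degree[a] += 1
    degree[b] += 1

# Host proteins assumed (hypothetically) to be HCV interaction partners
hcv_targets = {"TP53", "EP300", "BARD1"}

# Gateway candidates: targeted proteins ranked by interactome degree
gateways = sorted(hcv_targets, key=lambda p: degree[p], reverse=True)
print(gateways)  # highest-degree targeted protein first
```

    The actual method additionally mines quasi-cliques across the disease-association network; this sketch only shows the degree-centrality observation the abstract reports.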

  18. Network-Based Study Reveals Potential Infection Pathways of Hepatitis-C Leading to Various Diseases

    PubMed Central

    Mukhopadhyay, Anirban; Maulik, Ujjwal

    2014-01-01

    Protein-protein interaction network-based study of viral pathogenesis has been gaining popularity among computational biologists in recent years. In the present study we attempt to investigate the possible pathways of hepatitis-C virus (HCV) infection by integrating three networks: the HCV-human interaction network, the human protein interactome, and the human genetic disease association network. We propose quasi-biclique and quasi-clique mining algorithms to integrate these three networks and identify infection gateway host proteins and possible pathways of HCV pathogenesis leading to various diseases, including cancers. The gateway proteins are found to be biologically coherent and to have high degrees in the human interactome compared with the other virus-targeted proteins. The analyses done in this study provide possible targets for more effective anti-hepatitis-C therapeutic intervention. PMID:24743187

  19. Quantitative Study on Computer Self-Efficacy and Computer Anxiety Differences in Academic Major and Residential Status

    ERIC Educational Resources Information Center

    Binkley, Zachary Wayne McClellan

    2017-01-01

    This study investigates computer self-efficacy and computer anxiety within 61 students across two academic majors, Aviation and Sports and Exercise Science, while investigating the impact residential status, age, and gender has on those two psychological constructs. The purpose of the study is to find if computer self-efficacy and computer anxiety…

  20. Pacific Educational Computer Network Study. Final Report.

    ERIC Educational Resources Information Center

    Hawaii Univ., Honolulu. ALOHA System.

    The Pacific Educational Computer Network Feasibility Study examined technical and non-technical aspects of the formation of an international Pacific Area computer network for higher education. The technical study covered the assessment of the feasibility of a packet-switched satellite and radio ground distribution network for data transmission…

  1. A Review Study on Cloud Computing Issues

    NASA Astrophysics Data System (ADS)

    Kanaan Kadhim, Qusay; Yusof, Robiah; Sadeq Mahdi, Hamid; Al-shami, Sayed Samer Ali; Rahayu Selamat, Siti

    2018-05-01

    Cloud computing is the most promising current implementation of utility computing in the business world, because it provides some key features over classic utility computing, such as elasticity, which allows clients to dynamically scale resources up and down at execution time. Nevertheless, cloud computing is still at an early stage and suffers from a lack of standardization. Security issues are the main challenge to cloud computing adoption. Critical industries such as government organizations (ministries) are therefore reluctant to trust cloud computing: they fear losing sensitive data that resides on the cloud with no knowledge of its location, and Cloud Service Providers (CSPs) offer little transparency about the mechanisms used to secure data and applications. This has created a barrier against adopting this agile computing paradigm. This study aims to review and classify the issues that surround the implementation of cloud computing, a hot area that needs to be addressed by future research.

  2. Synchrotron-radiation-based X-ray micro-computed tomography reveals dental bur debris under dental composite restorations.

    PubMed

    Hedayat, Assem; Nagy, Nicole; Packota, Garnet; Monteith, Judy; Allen, Darcy; Wysokinski, Tomasz; Zhu, Ning

    2016-05-01

    Dental burs are used extensively in dentistry to mechanically prepare tooth structures for restorations (fillings), yet little has been reported on the bur debris left behind in the teeth, and whether it poses potential health risks to patients. Here we aim to image dental bur debris under dental fillings, and to highlight the potential health hazards that can be caused by this debris when left in direct contact with the biological surroundings, specifically when the debris is made of a non-biocompatible material. Non-destructive micro-computed tomography using the BioMedical Imaging & Therapy facility 05ID-2 beamline at the Canadian Light Source was pursued at 50 keV and at a pixel size of 4 µm to image dental bur fragments under a composite resin dental filling. The bur's cutting edges that produced the fragments were also chemically analyzed. The technique revealed dental bur fragments of different sizes at different locations on the floor of the prepared surface of the teeth and under the filling, which places them in direct contact with the dentinal tubules and the dentinal fluid circulating within them. Energy-dispersive X-ray spectroscopy elemental analysis of the dental bur edges revealed that the fragments are made of tungsten carbide-cobalt, which is bio-incompatible.

  3. Single-Photon Emission Computed Tomography/Computed Tomography Imaging in a Rabbit Model of Emphysema Reveals Ongoing Apoptosis In Vivo

    PubMed Central

    Goldklang, Monica P.; Tekabe, Yared; Zelonina, Tina; Trischler, Jordis; Xiao, Rui; Stearns, Kyle; Romanov, Alexander; Muzio, Valeria; Shiomi, Takayuki; Johnson, Lynne L.

    2016-01-01

    Evaluation of lung disease is limited by the inability to visualize ongoing pathological processes. Molecular imaging that targets cellular processes related to disease pathogenesis has the potential to assess disease activity over time to allow intervention before lung destruction. Because apoptosis is a critical component of lung damage in emphysema, a functional imaging approach was taken to determine if targeting apoptosis in a smoke exposure model would allow the quantification of early lung damage in vivo. Rabbits were exposed to cigarette smoke for 4 or 16 weeks and underwent single-photon emission computed tomography/computed tomography scanning using technetium-99m–rhAnnexin V-128. Imaging results were correlated with ex vivo tissue analysis to validate the presence of lung destruction and apoptosis. Lung computed tomography scans of long-term smoke–exposed rabbits exhibit anatomical similarities to human emphysema, with increased lung volumes compared with controls. Morphometry on lung tissue confirmed increased mean linear intercept and destructive index at 16 weeks of smoke exposure and compliance measurements documented physiological changes of emphysema. Tissue and lavage analysis displayed the hallmarks of smoke exposure, including increased tissue cellularity and protease activity. Technetium-99m–rhAnnexin V-128 single-photon emission computed tomography signal was increased after smoke exposure at 4 and 16 weeks, with confirmation of increased apoptosis through terminal deoxynucleotidyl transferase dUTP nick end labeling staining and increased tissue neutral sphingomyelinase activity in the tissue. These studies not only describe a novel emphysema model for use with future therapeutic applications, but, most importantly, also characterize a promising imaging modality that identifies ongoing destructive cellular processes within the lung. PMID:27483341

  4. Structural determinants of species-selective substrate recognition in human and Drosophila serotonin transporters revealed through computational docking studies

    PubMed Central

    Kaufmann, Kristian W.; Dawson, Eric S.; Henry, L. Keith; Field, Julie R.; Blakely, Randy D.; Meiler, Jens

    2009-01-01

    To identify potential determinants of substrate selectivity in serotonin (5-HT) transporters (SERT), models of human and Drosophila serotonin transporters (hSERT, dSERT) were built based on the leucine transporter (LeuTAa) structure reported by Yamashita et al. (Nature 2005;437:215–223), PDB ID 2A65. Although the overall amino acid identity between SERTs and the LeuTAa is only 17%, it increases to above 50% in the first shell of the putative 5-HT binding site, allowing de novo computational docking of tryptamine derivatives in atomic detail. Comparison of hSERT and dSERT complexed with substrates pinpoints likely structural determinants for substrate binding. Forgoing the use of experimental transport and binding data of tryptamine derivatives for construction of these models enables us to critically assess and validate their predictive power: A single 5-HT binding mode was identified that retains the amine placement observed in the LeuTAa structure, matches site-directed mutagenesis and substituted cysteine accessibility method (SCAM) data, complies with support vector machine derived structure-activity relations, and predicts computational binding energies for 5-HT analogs with a significant correlation coefficient (R = 0.72). This binding mode places 5-HT deep in the binding pocket of the SERT with the 5-position near residue hSERT A169/dSERT D164 in transmembrane helix 3, the indole nitrogen next to residue Y176/Y171, and the ethylamine tail under residues F335/F327 and S336/S328 within 4 Å of residue D98. Our studies identify a number of potential contacts whose contribution to substrate binding and transport was previously unsuspected. PMID:18704946
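    The R = 0.72 figure is a Pearson correlation between computed and experimental binding data. A hedged sketch of that final validation step, with invented energy values standing in for the 5-HT analog data:

```python
import math

# Invented binding energies (kcal/mol); placeholders for the paper's data
computed     = [-7.1, -6.4, -8.0, -5.9, -6.8]
experimental = [-7.4, -6.1, -7.8, -5.5, -7.0]

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

r = pearson_r(computed, experimental)
print(f"R = {r:.2f}")
```

    A correlation near 1 on held-out analogs is what lends the docking models predictive, rather than merely descriptive, value.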

  5. Computed tomography angiography reveals stenosis and aneurysmal dilation of an aberrant right subclavian artery causing systemic blood pressure misreading in an old Pekinese dog

    PubMed Central

    KIM, Jaehwan; EOM, Kidong; YOON, Hakyoung

    2017-01-01

    A 14-year-old dog weighing 4 kg presented with hypotension only in the right forelimb. Thoracic radiography revealed a round soft tissue opacity near the aortic arch and below the second thoracic vertebra on a lateral view. Three-dimensional computed tomography angiography clearly revealed stenosis and aneurysmal dilation of an aberrant right subclavian artery. Stenosis and aneurysm of an aberrant subclavian artery should be included as a differential diagnosis in dogs showing a round soft tissue opacity near the aortic arch and below the thoracic vertebra on the lateral thoracic radiograph. PMID:28496026

  6. Computed tomography angiography reveals stenosis and aneurysmal dilation of an aberrant right subclavian artery causing systemic blood pressure misreading in an old Pekinese dog.

    PubMed

    Kim, Jaehwan; Eom, Kidong; Yoon, Hakyoung

    2017-06-16

    A 14-year-old dog weighing 4 kg presented with hypotension only in the right forelimb. Thoracic radiography revealed a round soft tissue opacity near the aortic arch and below the second thoracic vertebra on a lateral view. Three-dimensional computed tomography angiography clearly revealed stenosis and aneurysmal dilation of an aberrant right subclavian artery. Stenosis and aneurysm of an aberrant subclavian artery should be included as a differential diagnosis in dogs showing a round soft tissue opacity near the aortic arch and below the thoracic vertebra on the lateral thoracic radiograph.

  7. On the Rhetorical Contract in Human-Computer Interaction.

    ERIC Educational Resources Information Center

    Wenger, Michael J.

    1991-01-01

    An exploration of the rhetorical contract--i.e., the expectations for appropriate interaction--as it develops in human-computer interaction revealed that direct manipulation interfaces were more likely to establish social expectations. Study results suggest that the social nature of human-computer interactions can be examined with reference to the…

  8. A review of evaluative studies of computer-based learning in nursing education.

    PubMed

    Lewis, M J; Davies, R; Jenkins, D; Tait, M I

    2001-01-01

    Although there have been numerous attempts to evaluate the learning benefits of computer-based learning (CBL) packages in nursing education, the results obtained have been equivocal. A literature search conducted for this review found 25 reports of the evaluation of nursing CBL packages since 1966. Detailed analysis of the evaluation methods used in these reports revealed that most had significant design flaws, including the use of too small a sample group, the lack of a control group, etc. Because of this, the conclusions reached were not always valid. More effort is required in the design of future evaluation studies of nursing CBL packages. Copyright 2001 Harcourt Publishers Ltd.

  9. Verification, Validation and Sensitivity Studies in Computational Biomechanics

    PubMed Central

    Anderson, Andrew E.; Ellis, Benjamin J.; Weiss, Jeffrey A.

    2012-01-01

    Computational techniques and software for the analysis of problems in mechanics have naturally moved from their origins in the traditional engineering disciplines to the study of cell, tissue and organ biomechanics. Increasingly complex models have been developed to describe and predict the mechanical behavior of such biological systems. While the availability of advanced computational tools has led to exciting research advances in the field, the utility of these models is often the subject of criticism due to inadequate model verification and validation. The objective of this review is to present the concepts of verification, validation and sensitivity studies with regard to the construction, analysis and interpretation of models in computational biomechanics. Specific examples from the field are discussed. It is hoped that this review will serve as a guide to the use of verification and validation principles in the field of computational biomechanics, thereby improving the peer acceptance of studies that use computational modeling techniques. PMID:17558646
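    One concrete verification activity such reviews advocate is checking grid convergence. A minimal sketch (illustrative values, not from any particular biomechanics model) estimates the observed order of accuracy from solutions on three systematically refined meshes using Richardson extrapolation:

```python
import math

# Some scalar output (e.g., a peak stress) computed on three meshes,
# each refined by a factor of 2; the values are invented for illustration.
refinement_ratio = 2.0
f_coarse, f_medium, f_fine = 1.9700, 1.9925, 1.9981

# Observed order of convergence p from the three solutions
p = math.log((f_medium - f_coarse) / (f_fine - f_medium)) \
    / math.log(refinement_ratio)

# Richardson-extrapolated estimate of the mesh-converged value
f_est = f_fine + (f_fine - f_medium) / (refinement_ratio ** p - 1)
print(f"observed order ~ {p:.2f}, extrapolated value ~ {f_est:.4f}")
```

    An observed order close to the discretization scheme's formal order is evidence that the code is solving the equations correctly on these meshes; validation against experimental data is a separate, subsequent step.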

  10. Shared-Ride Taxi Computer Control System Requirements Study

    DOT National Transportation Integrated Search

    1977-08-01

    The technical problem of scheduling and routing shared-ride taxi service is so great that only computers can handle it efficiently. This study is concerned with defining the requirements of such a computer system. The major objective of this study is...

  11. Sources of computer self-efficacy: The relationship to outcome expectations, computer anxiety, and intention to use computers

    NASA Astrophysics Data System (ADS)

    Antoine, Marilyn V.

    2011-12-01

    The purpose of this research was to extend earlier research on sources of self-efficacy (Lent, Lopez, & Biechke, 1991; Usher & Pajares, 2009) to the information technology domain. The principal investigator examined how Bandura's (1977) sources of self-efficacy information (mastery experience, vicarious experience, verbal persuasion, and physiological states) shape computer self-efficacy beliefs and influence the decision to use or not use computers. The study took place at a mid-sized Historically Black College or University in the South. A convenience sample of 105 undergraduates was drawn from students enrolled in multiple sections of two introductory computer courses. There were 67 females and 38 males. This research was a correlational study of the following variables: sources of computer self-efficacy, general computer self-efficacy, outcome expectations, computer anxiety, and intention to use computers. The principal investigator administered a survey questionnaire containing 52 Likert items to measure the major study variables. Additionally, the survey instrument collected demographic variables such as gender, age, race, intended major, classification, technology use, technology adoption category, and whether the student owns a computer. The results reveal the following: (1) Mastery experience and verbal persuasion had statistically significant relationships to general computer self-efficacy, while vicarious experience and physiological states had non-significant relationships. Mastery experience had the strongest correlation to general computer self-efficacy. (2) All of the sources of computer self-efficacy had statistically significant relationships to personal outcome expectations. Vicarious experience had the strongest correlation to personal outcome expectations. (3) All of the sources of self-efficacy had statistically significant relationships to performance outcome expectations. Vicarious experience had the strongest correlation to performance…

  12. Binding of indomethacin methyl ester to cyclooxygenase-2. A computational study.

    PubMed

    Sárosi, Menyhárt-Botond

    2018-06-05

    Inhibitors selective towards the second isoform of prostaglandin synthase (cyclooxygenase, COX-2) are promising nonsteroidal anti-inflammatory drugs and antitumor medications. Methylation of the carboxylate group in the relatively nonselective COX inhibitor indomethacin confers significant COX-2 selectivity. Several other modifications converting indomethacin into a COX-2 selective inhibitor have been reported. Earlier experimental and computational studies on neutral indomethacin derivatives suggest that the methyl ester derivative likely binds to COX-2 with a similar binding mode as that observed for the parent indomethacin. However, docking studies followed by molecular dynamics simulations revealed two possible binding modes in COX-2 for indomethacin methyl ester, which differ from the experimental binding mode found for indomethacin. Both alternative binding modes might explain the observed COX-2 selectivity of indomethacin methyl ester.

  13. Ten Years toward Equity: Preliminary Results from a Follow-Up Case Study of Academic Computing Culture

    PubMed Central

    Crenshaw, Tanya L.; Chambers, Erin W.; Heeren, Cinda; Metcalf, Heather E.

    2017-01-01

    Just over 10 years ago, we conducted a culture study of the Computer Science Department at the flagship University of Illinois at Urbana-Champaign, one of the top five computing departments in the country. The study found that while the department placed an emphasis on research, it did so in a way that, in conjunction with a lack of communication and transparency, devalued teaching and mentoring, and negatively impacted the professional development, education, and sense of belonging of the students. As one part of a multi-phase case study spanning over a decade, this manuscript presents preliminary findings from our latest work at the university. We detail early comparisons between data gathered at the Department of Computer Science at the University of Illinois at Urbana-Champaign in 2005 and our most recent pilot case study, a follow-up research project completed in 2016. Though we have not yet completed the full data collection, we find it worthwhile to reflect on the pilot case study data we have collected thus far. Our data reveals improvements in the perceptions of undergraduate teaching quality and undergraduate peer mentoring networks. However, we also found evidence of continuing feelings of isolation, incidents of bias, policy opacity, and uneven policy implementation that are areas of concern, particularly with respect to historically underrepresented groups. We discuss these preliminary follow-up findings, offer research and methodological reflections, and share next steps for applied research that aims to create positive cultural change in computing. PMID:28579969

  14. Mg co-ordination with potential carcinogenic molecule acrylamide: Spectroscopic, computational and cytotoxicity studies

    NASA Astrophysics Data System (ADS)

    Singh, Ranjana; Mishra, Vijay K.; Singh, Hemant K.; Sharma, Gunjan; Koch, Biplob; Singh, Bachcha; Singh, Ranjan K.

    2018-03-01

    Acrylamide (acr) is a potentially toxic molecule produced in thermally processed food stuff. The acr-Mg complex has been synthesized chemically and characterized by spectroscopic techniques. The binding sites of acr with Mg were identified by experimental and computational methods. Both experimental and theoretical results suggest that Mg coordinates with the oxygen atom of the C=O group of acr. In-vitro cytotoxicity studies revealed a significant decrease in the toxicity of the acr-Mg complex as compared to pure acr. The decrease in toxicity on complexation with Mg may be a useful step for future research to reduce the toxicity of acr.

  15. Secure Cloud Computing Implementation Study For Singapore Military Operations

    DTIC Science & Technology

    2016-09-01

    Secure Cloud Computing Implementation Study for Singapore Military Operations. Master's thesis by Lai Guoquan, September 2016; thesis advisor: John D. Fulp. In addition, from the military perspective, the benefits of cloud computing were analyzed from a study of the U.S. Department of Defense. Then, using…

  16. Investigating College and Graduate Students' Multivariable Reasoning in Computational Modeling

    ERIC Educational Resources Information Center

    Wu, Hsin-Kai; Wu, Pai-Hsing; Zhang, Wen-Xin; Hsu, Ying-Shao

    2013-01-01

    Drawing upon the literature in computational modeling, multivariable reasoning, and causal attribution, this study aims at characterizing multivariable reasoning practices in computational modeling and revealing the nature of understanding about multivariable causality. We recruited two freshmen, two sophomores, two juniors, two seniors, four…

  17. Advanced flight computer. Special study

    NASA Technical Reports Server (NTRS)

    Coo, Dennis

    1995-01-01

    This report documents a special study to define a 32-bit radiation-hardened, SEU-tolerant flight computer architecture, and to investigate current or near-term technologies and development efforts that contribute to the Advanced Flight Computer (AFC) design and development. An AFC processing node architecture is defined. Each node may consist of a multi-chip processor as needed. The modular, building-block approach uses VLSI technology and packaging methods that demonstrate a feasible AFC module in 1998 that meets the AFC goals. The defined architecture and approach demonstrate a clear low-risk, low-cost path to the 1998 production goal, with intermediate prototypes in 1996.

  18. Aerodynamic optimization studies on advanced architecture computers

    NASA Technical Reports Server (NTRS)

    Chawla, Kalpana

    1995-01-01

    The approach to carrying out multi-discipline aerospace design studies in the future, especially in massively parallel computing environments, comprises choosing (1) suitable solvers to compute solutions to equations characterizing a discipline, and (2) efficient optimization methods. In addition, for aerodynamic optimization problems, (3) smart methodologies must be selected to modify the surface shape. In this research effort, a 'direct' optimization method is implemented on the Cray C-90 to improve aerodynamic design. It is coupled with an existing implicit Navier-Stokes solver, OVERFLOW, to compute flow solutions. The optimization method is chosen such that it can accommodate multi-discipline optimization in future computations. In this work, however, only single-discipline aerodynamic optimization is included.

  19. Computational studies of Ras and PI3K

    NASA Technical Reports Server (NTRS)

    Ren, Lei; Cucinotta, Francis A.

    2004-01-01

    Until recently, experimental techniques in molecular cell biology have been the primary means to investigate biological risk from space radiation. However, computational modeling provides an alternative theoretical approach, which utilizes various computational tools to simulate proteins, nucleotides, and their interactions. In this study, we focus on using molecular mechanics (MM) and molecular dynamics (MD) to study the mechanism of protein-protein binding and to estimate the binding free energy between proteins. Ras is a key element in a variety of cell processes, and its activation of phosphoinositide 3-kinase (PI3K) is important for survival of transformed cells. Different computational approaches for this particular study are presented to calculate the solvation energies and binding free energies of H-Ras and PI3K. The goal of this study is to establish computational methods to investigate the roles different proteins play in the cellular responses to space radiation, including modification of protein function through gene mutation, and to support the studies in molecular cell biology and theoretical kinetics models for our risk assessment project.

  20. How Do Students Experience Testing on the University Computer?

    ERIC Educational Resources Information Center

    Whittington, Dale; And Others

    1995-01-01

    Reports a study of the administration mode, scores, and testing experiences of students taking the PreProfessional Skills Test (PPST) under differing conditions (computer based and paper and pencil). PPST scores and surveys of the students revealed varied test-taking strategies and computer-related alterations in test difficulty, construct,…

  1. Exact Dispersion Study of an Asymmetric Thin Planar Slab Dielectric Waveguide without Computing d^2β/dk^2 Numerically

    NASA Astrophysics Data System (ADS)

    Raghuwanshi, Sanjeev Kumar; Palodiya, Vikram

    2017-08-01

    Waveguide dispersion can be tailored, but material dispersion cannot. Hence, the total dispersion can be shifted to any desired band by adjusting the waveguide dispersion. Waveguide dispersion is proportional to d^2β/dk^2, which normally needs to be computed numerically. In this paper, we derive an analytical expression for d^2β/dk^2 in terms of dβ/dk, avoiding a numerical differentiation that must otherwise be carried out to roughly 10^{-5} precision, a constraint that can introduce error into the calculation of waveguide dispersion. To formulate the problem we use a graphical method. Our study reveals that the waveguide dispersion can be computed accurately enough for various modes from knowledge of β alone.
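    The numerical baseline the paper avoids, evaluating d^2β/dk^2 by finite differences on a sampled β(k) curve, can be sketched in a few lines. The quadratic β(k) below is purely illustrative, not a slab-waveguide dispersion relation:

    ```python
    import numpy as np

    # Toy propagation-constant curve beta(k); a real solver would supply
    # beta from the slab waveguide eigenvalue equation.
    k = np.linspace(1.0, 2.0, 101)
    beta = 0.5 * k**2 + 3.0 * k + 1.0  # illustrative smooth beta(k)

    # Central finite difference for d^2(beta)/dk^2 -- the quantity whose
    # numerical evaluation the paper's analytical expression replaces.
    h = k[1] - k[0]
    d2beta_dk2 = (beta[2:] - 2.0 * beta[1:-1] + beta[:-2]) / h**2
    ```

    For this quadratic the stencil reproduces the exact second derivative of 1.0 to rounding error; for a real mode curve, the stencil's truncation and rounding errors are precisely the precision constraint the paper's analytical route removes.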

  2. Non-Determinism: An Abstract Concept in Computer Science Studies

    ERIC Educational Resources Information Center

    Armoni, Michal; Gal-Ezer, Judith

    2007-01-01

    Non-determinism is one of the most important, yet abstract, recurring concepts of Computer Science. It plays an important role in Computer Science areas such as formal language theory, computability theory, distributed computing, and operating systems. We conducted a series of studies on the perception of non-determinism. In the current research,…

  3. Computer Simulation in the Social Sciences/Social Studies.

    ERIC Educational Resources Information Center

    Klassen, Daniel L.

    Computers are beginning to be used more frequently as instructional tools in secondary school social studies. This is especially true of "new social studies" programs; i.e., programs which subordinate mere mastery of factual content to the recognition of and ability to deal with the social imperatives of the future. Computer-assisted…

  4. International Computer and Information Literacy Study: Assessment Framework

    ERIC Educational Resources Information Center

    Fraillon, Julian; Schulz, Wolfram; Ainley, John

    2013-01-01

    The purpose of the International Computer and Information Literacy Study 2013 (ICILS 2013) is to investigate, in a range of countries, the ways in which young people are developing "computer and information literacy" (CIL) to support their capacity to participate in the digital age. To achieve this aim, the study will assess student…

  5. Ergonomics in the computer workstation.

    PubMed

    Karoney, M J; Mburu, S K; Ndegwa, D W; Nyaichowa, A G; Odera, E B

    2010-09-01

    Awareness of the effects of long-term computer use and the application of ergonomics in the computer workstation are important for preventing musculoskeletal disorders, eyestrain and psychosocial effects. The aim was to determine the awareness of physical and psychological effects of prolonged computer usage and the application of ergonomics in the workstation. One hundred and eighty-one people were interviewed from tertiary educational institutions, telecommunications and media houses within Nairobi, Kenya, in a descriptive cross-sectional study. The majority (89.8%) of the respondents felt that prolonged computer use had an adverse effect on their health, with only 12.4% having received formal training on the same. Assessment of their workstations revealed the most applied ergonomic measure to be feet placement on the floor: 100% (181), followed by correct monitor placement, with 94.4% (171) fulfilling the requirements. The least applied ergonomic measures were non-reflecting wall paint: 5% (9), and adjustable desks: 9.9% (18). There is awareness among computer users of the effects of prolonged computer use, but there is limited application of ergonomic measures.

  6. REVEAL: An Extensible Reduced Order Model Builder for Simulation and Modeling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Agarwal, Khushbu; Sharma, Poorva; Ma, Jinliang

    2013-04-30

    Many science domains need to build computationally efficient and accurate representations of high fidelity, computationally expensive simulations. These computationally efficient versions are known as reduced-order models. This paper presents the design and implementation of a novel reduced-order model (ROM) builder, the REVEAL toolset. This toolset generates ROMs based on science- and engineering-domain specific simulations executed on high performance computing (HPC) platforms. The toolset encompasses a range of sampling and regression methods that can be used to generate a ROM, automatically quantifies the ROM accuracy, and provides support for an iterative approach to improve ROM accuracy. REVEAL is designed to be extensible in order to utilize the core functionality with any simulator that has published input and output formats. It also defines programmatic interfaces to include new sampling and regression techniques so that users can ‘mix and match’ mathematical techniques to best suit the characteristics of their model. In this paper, we describe the architecture of REVEAL and demonstrate its usage with a computational fluid dynamics model used in carbon capture.
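    The sample/regress/verify loop that such a ROM builder automates can be caricatured as follows; `expensive_sim`, the polynomial surrogate, the tolerance, and the sample counts are invented stand-ins, not REVEAL's actual interfaces or methods:

    ```python
    import numpy as np

    def expensive_sim(x):
        """Stand-in for a high-fidelity HPC simulation."""
        return np.sin(3.0 * x) + 0.5 * x

    rng = np.random.default_rng(0)
    train_x = rng.uniform(0.0, 1.0, 8)       # initial sample of the input space

    for _ in range(5):                        # iterative accuracy improvement
        # Regression step: fit a cheap surrogate to the sampled runs.
        coeffs = np.polyfit(train_x, expensive_sim(train_x), deg=5)
        # Verification step: quantify ROM error on fresh points.
        test_x = rng.uniform(0.0, 1.0, 50)
        rmse = np.sqrt(np.mean((np.polyval(coeffs, test_x)
                                - expensive_sim(test_x)) ** 2))
        if rmse < 1e-2:                       # ROM accurate enough: stop sampling
            break
        # Otherwise request more expensive samples and refit.
        train_x = np.concatenate([train_x, rng.uniform(0.0, 1.0, 8)])
    ```

    The design point is that the expensive simulator is only ever called on the sampled inputs, while accuracy is tracked automatically so the user can trade simulation budget against ROM error.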

  7. How does ytterbium chloride interact with DMPC bilayers? A computational and experimental study.

    PubMed

    Gonzalez, Miguel A; Barriga, Hanna M G; Richens, Joanna L; Law, Robert V; O'Shea, Paul; Bresme, Fernando

    2017-03-29

    Lanthanide salts have been studied for many years, primarily in Nuclear Magnetic Resonance (NMR) experiments of mixed lipid-protein systems and more recently to study lipid flip-flop in model membrane systems. It is well recognised that lanthanide salts can influence the behaviour of both lipid and protein systems; however, a full molecular level description of lipid-lanthanide interactions is still outstanding. Here we present a study of lanthanide-bilayer interactions, using molecular dynamics computer simulations, fluorescence electrostatic potential experiments and nuclear magnetic resonance. Computer simulations reveal the microscopic structure of DMPC lipid bilayers in the presence of Yb3+, and a surprising ability of the membranes to adsorb significant concentrations of Yb3+ without disrupting the overall membrane structure. At concentrations commonly used in NMR experiments, Yb3+ ions bind strongly to 5 lipids, inducing a small decrease of the area per lipid and a slight increase of the ordering of the aliphatic chains and the bilayer thickness. The area compressibility modulus increases by a factor of two with respect to the salt-free case, showing that Yb3+ ions make the bilayer more rigid. These modifications of the bilayer properties should be taken into account in the interpretation of NMR experiments.

  8. Product placement of computer games in cyberspace.

    PubMed

    Yang, Heng-Li; Wang, Cheng-Shu

    2008-08-01

    Computer games are considered an emerging media and are even regarded as an advertising channel. By a three-phase experiment, this study investigated the advertising effectiveness of computer games for different product placement forms, product types, and their combinations. As the statistical results revealed, computer games are appropriate for placement advertising. Additionally, different product types and placement forms produced different advertising effectiveness. Optimum combinations of product types and placement forms existed. An advertisement design model is proposed for use in game design environments. Some suggestions are given for advertisers and game companies respectively.

  9. Associations among temporomandibular disorders, chronic neck pain and neck pain disability in computer office workers: a pilot study.

    PubMed

    Bragatto, M M; Bevilaqua-Grossi, D; Regalo, S C H; Sousa, J D; Chaves, T C

    2016-05-01

    Neck pain is the most common musculoskeletal complaint among computer office workers. There are several reports about the coexistence of neck pain and temporomandibular disorders (TMD). However, there are no studies investigating this association in the context of work involving computers. The purpose of this study was to verify the association between TMD and neck pain in computer office workers. Fifty-two female computer workers, divided into two groups, (i) those with self-reported chronic neck pain and disability (WNP) (n = 26) and (ii) those without self-reported neck pain (WONP) (n = 26), participated in this study together with a control group (CG) of 26 women who did not work with computers. Clinical assessments were performed to establish a diagnosis of TMD, and craniocervical mechanical pain was assessed using manual palpation and pressure pain threshold (PPT). The results of this study showed that the WNP group had a higher percentage of participants with TMD than the WONP group (42·30% vs. 23·07%, χ(2) = 5·70, P = 0·02). PPTs in all cervical sites were significantly lower in the groups WNP and WONP compared to the CG. Regression analysis revealed TMD, neck pain and work-related factors to be good predictors of disability (R(2) = 0·93, P < 0·001). These results highlight the importance of considering the work conditions of patients with TMD, as neck disability in computer workers is explained by the association among neck pain, TMD and unfavourable workplace conditions. Consequently, this study emphasises the importance of considering work activity for minimising neck pain-related disability. © 2016 John Wiley & Sons Ltd.

  10. Brain-Computer Interface with Inhibitory Neurons Reveals Subtype-Specific Strategies.

    PubMed

    Mitani, Akinori; Dong, Mingyuan; Komiyama, Takaki

    2018-01-08

    Brain-computer interfaces have seen an increase in popularity due to their potential for direct neuroprosthetic applications for amputees and disabled individuals. Supporting this promise, animals, including humans, can learn even arbitrary mappings between the activity of cortical neurons and movement of prosthetic devices [1-4]. However, the performance of neuroprosthetic device control has been nowhere near that of limb control in healthy individuals, presenting a dire need to improve performance. One potential limitation is the fact that previous work has not distinguished diverse cell types in the neocortex, even though different cell types possess distinct functions in cortical computations [5-7] and likely distinct capacities to control brain-computer interfaces. Here, we made a first step in addressing this issue by tracking the plastic changes of three major types of cortical inhibitory neurons (INs) during a neuron-pair operant conditioning task using two-photon imaging of IN subtypes expressing GCaMP6f. Mice were rewarded when the activity of the positive target neuron (N+) exceeded that of the negative target neuron (N-) beyond a set threshold. Mice improved performance with all subtypes, but the strategies were subtype specific. When parvalbumin (PV)-expressing INs were targeted, the activity of N- decreased. However, targeting of somatostatin (SOM)- and vasoactive intestinal peptide (VIP)-expressing INs led to an increase of the N+ activity. These results demonstrate that INs can be individually modulated in a subtype-specific manner and highlight the versatility of neural circuits in adapting to new demands by using cell-type-specific strategies. Copyright © 2017 Elsevier Ltd. All rights reserved.

  11. Cooperative combinatorial optimization: evolutionary computation case study.

    PubMed

    Burgin, Mark; Eberbach, Eugene

    2008-01-01

    This paper presents a formalization of the notion of cooperation and competition of multiple systems that work toward a common optimization goal of the population using evolutionary computation techniques. It is proved that evolutionary algorithms are more expressive than conventional recursive algorithms, such as Turing machines. Three classes of evolutionary computations are introduced and studied: bounded finite, unbounded finite, and infinite computations. Universal evolutionary algorithms are constructed. Such properties of evolutionary algorithms as completeness, optimality, and search decidability are examined. A natural extension of evolutionary Turing machine (ETM) model is proposed to properly reflect phenomena of cooperation and competition in the whole population.
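    As a concrete, minimal instance of the bounded finite evolutionary computations discussed above, a (1+λ) evolution strategy on an invented one-dimensional fitness function can be sketched as follows (all parameters are illustrative, not from the paper's formalism):

    ```python
    import random

    def fitness(x):
        return -(x - 3.0) ** 2           # single optimum at x = 3

    random.seed(42)
    best = 0.0                            # initial individual
    for generation in range(200):         # bounded finite computation
        # Generate lambda = 10 offspring by Gaussian mutation of the parent.
        offspring = [best + random.gauss(0.0, 0.5) for _ in range(10)]
        # (1+lambda) elitist selection: keep the best of parent and offspring,
        # so fitness never decreases across generations.
        best = max([best] + offspring, key=fitness)
    ```

    The fixed generation bound makes this a bounded finite computation in the paper's classification; removing the bound and running until an external halting condition would make it unbounded.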

  12. Writing Apprehension, Computer Anxiety and Telecomputing: A Pilot Study.

    ERIC Educational Resources Information Center

    Harris, Judith; Grandgenett, Neal

    1992-01-01

    A study measured graduate students' writing apprehension and computer anxiety levels before and after using electronic mail, computer conferencing, and remote database searching facilities during an educational technology course. Results indicated that postcourse computer anxiety levels were significantly related to usage statistics. Precourse writing…

  13. Gender Digital Divide and Challenges in Undergraduate Computer Science Programs

    ERIC Educational Resources Information Center

    Stoilescu, Dorian; McDougall, Douglas

    2011-01-01

    Previous research revealed a reduced number of female students registered in computer science studies. In addition, the female students feel isolated, have reduced confidence, and underperform. This article explores differences between female and male students in undergraduate computer science programs in a mid-size university in Ontario. Based on…

  14. Technology Leadership and Supervision: An Analysis Based on Turkish Computer Teachers' Professional Memories

    ERIC Educational Resources Information Center

    Deryakulu, Deniz; Olkun, Sinan

    2009-01-01

    This study examined Turkish computer teachers' professional memories telling of their experiences with school administrators and supervisors. Seventy-four computer teachers participated in the study. Content analysis of the memories revealed that the most frequently mentioned themes concerning school administrators were "unsupportive…

  15. Letting the ‘cat’ out of the bag: pouch young development of the extinct Tasmanian tiger revealed by X-ray computed tomography

    PubMed Central

    Spoutil, Frantisek; Prochazka, Jan; Black, Jay R.; Medlock, Kathryn; Paddle, Robert N.; Knitlova, Marketa; Hipsley, Christy A.

    2018-01-01

    The Tasmanian tiger or thylacine (Thylacinus cynocephalus) was an iconic Australian marsupial predator that was hunted to extinction in the early 1900s. Despite sharing striking similarities with canids, they failed to evolve many of the specialized anatomical features that characterize carnivorous placental mammals. These evolutionary limitations are thought to arise from functional constraints associated with the marsupial mode of reproduction, in which otherwise highly altricial young use their well-developed forelimbs to climb to the pouch and mouth to suckle. Here we present the first three-dimensional digital developmental series of the thylacine throughout its pouch life using X-ray computed tomography on all known ethanol-preserved specimens. Based on detailed skeletal measurements, we refine the species growth curve to improve age estimates for the individuals. Comparison of allometric growth trends in the appendicular skeleton (fore- and hindlimbs) with that of other placental and marsupial mammals revealed that despite their unique adult morphologies, thylacines retained a generalized early marsupial ontogeny. Our approach also revealed mislabelled specimens that possessed large epipubic bones (vestigial in thylacine) and differing vertebral numbers. All of our generated CT models are publicly available, preserving their developmental morphology and providing a novel digital resource for future studies of this unique marsupial. PMID:29515893

  16. Computer Proficiency Questionnaire: Assessing Low and High Computer Proficient Seniors

    PubMed Central

    Boot, Walter R.; Charness, Neil; Czaja, Sara J.; Sharit, Joseph; Rogers, Wendy A.; Fisk, Arthur D.; Mitzner, Tracy; Lee, Chin Chin; Nair, Sankaran

    2015-01-01

    Purpose of the Study: Computers and the Internet have the potential to enrich the lives of seniors and aid in the performance of important tasks required for independent living. A prerequisite for reaping these benefits is having the skills needed to use these systems, which is highly dependent on proper training. One prerequisite for efficient and effective training is being able to gauge current levels of proficiency. We developed a new measure (the Computer Proficiency Questionnaire, or CPQ) to measure computer proficiency in the domains of computer basics, printing, communication, Internet, calendaring software, and multimedia use. Our aim was to develop a measure appropriate for individuals with a wide range of proficiencies from noncomputer users to extremely skilled users. Design and Methods: To assess the reliability and validity of the CPQ, a diverse sample of older adults, including 276 older adults with no or minimal computer experience, was recruited and asked to complete the CPQ. Results: The CPQ demonstrated excellent reliability (Cronbach’s α = .98), with subscale reliabilities ranging from .86 to .97. Age, computer use, and general technology use all predicted CPQ scores. Factor analysis revealed three main factors of proficiency related to Internet and e-mail use; communication and calendaring; and computer basics. Based on our findings, we also developed a short-form CPQ (CPQ-12) with similar properties but 21 fewer questions. Implications: The CPQ and CPQ-12 are useful tools to gauge computer proficiency for training and research purposes, even among low computer proficient older adults. PMID:24107443
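    The reliability figure reported for the CPQ can be reproduced for any item-score matrix with the standard Cronbach's alpha formula; the tiny response matrix below is fabricated for illustration and is not CPQ data:

    ```python
    import numpy as np

    def cronbach_alpha(scores):
        """Cronbach's alpha for a (respondents x items) score matrix."""
        scores = np.asarray(scores, dtype=float)
        k = scores.shape[1]                    # number of items
        item_vars = scores.var(axis=0)         # per-item variance
        total_var = scores.sum(axis=1).var()   # variance of total scores
        return (k / (k - 1)) * (1.0 - item_vars.sum() / total_var)

    # Fabricated responses: 4 respondents x 3 items, items differing
    # only by a constant offset (perfectly consistent scale).
    responses = [[1, 2, 3], [2, 3, 4], [3, 4, 5], [4, 5, 6]]
    alpha = cronbach_alpha(responses)
    ```

    Because the fabricated items differ only by constant offsets they are perfectly consistent and alpha reaches its maximum of 1.0; real questionnaire data would give values like the subscale reliabilities of .86 to .97 reported above.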

  17. Trial-by-Trial Modulation of Associative Memory Formation by Reward Prediction Error and Reward Anticipation as Revealed by a Biologically Plausible Computational Model.

    PubMed

    Aberg, Kristoffer C; Müller, Julia; Schwartz, Sophie

    2017-01-01

    Anticipation and delivery of rewards improves memory formation, but little effort has been made to disentangle their respective contributions to memory enhancement. Moreover, it has been suggested that the effects of reward on memory are mediated by dopaminergic influences on hippocampal plasticity. Yet, evidence linking memory improvements to actual reward computations reflected in the activity of the dopaminergic system, i.e., prediction errors and expected values, is scarce and inconclusive. For example, different previous studies reported that the magnitude of prediction errors during a reinforcement learning task was a positive, negative, or non-significant predictor of successfully encoding simultaneously presented images. Individual sensitivities to reward and punishment have been found to influence the activation of the dopaminergic reward system and could therefore help explain these seemingly discrepant results. Here, we used a novel associative memory task combined with computational modeling and showed independent effects of reward-delivery and reward-anticipation on memory. Strikingly, the computational approach revealed positive influences from both reward delivery, as mediated by prediction error magnitude, and reward anticipation, as mediated by magnitude of expected value, even in the absence of behavioral effects when analyzed using standard methods, i.e., by collapsing memory performance across trials within conditions. We additionally measured trait estimates of reward and punishment sensitivity and found that individuals with increased reward (vs. punishment) sensitivity had better memory for associations encoded during positive (vs. negative) prediction errors when tested after 20 min, but a negative trend when tested after 24 h. 
In conclusion, modeling trial-by-trial fluctuations in the magnitude of reward, as we did here for prediction errors and expected value computations, provides a comprehensive and biologically plausible description of
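    The trial-by-trial quantities such a model tracks, the expected value and the reward prediction error, follow the standard delta-rule update. The learning rate and the deterministic 80% reward schedule below are arbitrary stand-ins for the study's fitted parameters and task:

    ```python
    alpha = 0.1           # learning rate (illustrative value)
    value = 0.0           # expected value V of the cue
    rewards = [1.0, 1.0, 1.0, 1.0, 0.0] * 100   # deterministic 80% reward schedule
    errors = []
    for reward in rewards:
        prediction_error = reward - value        # delta = r - V
        errors.append(prediction_error)
        value += alpha * prediction_error        # V <- V + alpha * delta
    ```

    Early trials produce large prediction errors while the expected value settles near the long-run reward rate; these per-trial magnitudes are exactly the signals the modeling approach correlates with memory formation.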

  18. Studying an Eulerian Computer Model on Different High-performance Computer Platforms and Some Applications

    NASA Astrophysics Data System (ADS)

    Georgiev, K.; Zlatev, Z.

    2010-11-01

    The Danish Eulerian Model (DEM) is an Eulerian model for studying the transport of air pollutants on a large scale. Originally, the model was developed at the National Environmental Research Institute of Denmark. The model's computational domain covers Europe and neighbouring parts of the Atlantic Ocean, Asia and Africa. If the DEM is to be applied on fine grids, its discretization leads to a huge computational problem, which implies that a model such as DEM must be run only on high-performance computer architectures. The implementation and tuning of such a complex large-scale model on each different computer is a non-trivial task. Here, comparison results are presented from running the model on different kinds of vector computers (CRAY C92A, Fujitsu, etc.), parallel computers with distributed memory (IBM SP, CRAY T3E, Beowulf clusters, Macintosh G4 clusters, etc.), parallel computers with shared memory (SGI Origin, SUN, etc.), and parallel computers with two levels of parallelism (IBM SMP, IBM BlueGene/P, clusters of multiprocessor nodes, etc.). The main idea in the parallel version of DEM is a domain-partitioning approach. The effective use of the cache and hierarchical memories of modern computers is discussed, together with the performance, speed-ups and efficiency achieved. The parallel code of DEM, created using the MPI standard library, appears to be highly portable and shows good efficiency and scalability on different kinds of vector and parallel computers. Some important applications of the computer model output are briefly presented.
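    The domain-partitioning idea behind DEM's parallel version can be mimicked without MPI: split the grid among imaginary nodes, exchange one ghost cell per boundary, and verify that the partitioned update matches a serial step. The diffusion stencil and grid size are illustrative only, not DEM's actual chemistry-transport scheme:

    ```python
    import numpy as np

    def serial_step(u, c=0.1):
        """One explicit diffusion step on the full periodic grid."""
        return u + c * (np.roll(u, 1) - 2.0 * u + np.roll(u, -1))

    def partitioned_step(u, nparts=4, c=0.1):
        """Same step, with the grid split among 'nodes' that exchange
        one ghost cell with each neighbour, as in domain partitioning."""
        parts = np.array_split(u, nparts)
        out = []
        for i, p in enumerate(parts):
            left = parts[(i - 1) % nparts][-1]    # ghost cell from left neighbour
            right = parts[(i + 1) % nparts][0]    # ghost cell from right neighbour
            ext = np.concatenate([[left], p, [right]])
            out.append(ext[1:-1] + c * (ext[:-2] - 2.0 * ext[1:-1] + ext[2:]))
        return np.concatenate(out)

    u = np.random.default_rng(3).normal(size=16)  # toy pollutant field
    ```

    In a real MPI run each part lives on a different process and the ghost-cell reads become point-to-point messages, but the correctness check, partitioned result equals serial result, is the same.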

  19. Computational predictions of zinc oxide hollow structures

    NASA Astrophysics Data System (ADS)

    Tuoc, Vu Ngoc; Huan, Tran Doan; Thao, Nguyen Thi

    2018-03-01

    Nanoporous materials are emerging as potential candidates for a wide range of technological applications in environment, electronics, and optoelectronics, to name just a few. Within this active research area, experimental work predominates, while theoretical/computational prediction and study of these materials face some intrinsic challenges, one of which is how to predict porous structures. We propose a computationally and technically feasible approach for predicting zinc oxide structures with hollows at the nano scale. The designed zinc oxide hollow structures are studied with computations using the density functional tight binding and conventional density functional theory methods, revealing a variety of promising mechanical and electronic properties, which can potentially find future realistic applications.

  20. Reveal genes functionally associated with ACADS by a network study.

    PubMed

    Chen, Yulong; Su, Zhiguang

    2015-09-15

    Establishing a systematic network is aimed at finding essential human gene-gene/gene-disease pathways by means of network inter-connecting patterns and functional annotation analysis. In the present study, we have analyzed functional gene interactions of the short-chain acyl-coenzyme A dehydrogenase gene (ACADS). ACADS plays a vital role in free fatty acid β-oxidation and regulates energy homeostasis. Modules of highly inter-connected genes in the disease-specific ACADS network are derived by integrating gene function and protein interaction data. Among the 8 genes in the ACADS web retrieved from both STRING and GeneMANIA, ACADS is effectively conjoined with 4 genes: HADHA, HADHB, ECHS1 and ACAT1. The functional analysis is done via ontological briefing and candidate disease identification. We observed that the most highly inter-linked genes connected with ACADS are HADHA, HADHB, ECHS1 and ACAT1. Interestingly, the ontological analysis of genes in the ACADS network reveals that ACADS, HADHA and HADHB play equally vital roles in fatty acid metabolism. The gene ACAT1, together with ACADS, participates in ketone metabolism. Our computational gene web analysis also predicts potential candidate diseases, indicating the involvement of ACADS, HADHA, HADHB, ECHS1 and ACAT1 not only in lipid metabolism but also in infant death syndrome, skeletal myopathy, acute hepatic encephalopathy, Reye-like syndrome, episodic ketosis, and metabolic acidosis. The current study presents a comprehensible layout of the ACADS network, its functional strategies and the candidate disease approach associated with the network. Copyright © 2015 Elsevier B.V. All rights reserved.
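    The module described above can be encoded as a small adjacency map; the edge list merely restates the four ACADS links named in the abstract (using the standard gene symbol HADHA), so this is a data sketch rather than the authors' STRING/GeneMANIA pipeline:

    ```python
    # Gene-gene functional links named in the abstract (undirected edges).
    edges = [("ACADS", "HADHA"), ("ACADS", "HADHB"),
             ("ACADS", "ECHS1"), ("ACADS", "ACAT1")]

    # Build an adjacency map and read off the genes directly linked to ACADS.
    network = {}
    for a, b in edges:
        network.setdefault(a, set()).add(b)
        network.setdefault(b, set()).add(a)

    acads_partners = network["ACADS"]
    ```

    Module detection in the study then amounts to finding densely inter-connected neighbourhoods in such a map once edges from the interaction databases are added.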

  1. Accuracy of Computed Tomographic Perfusion in Diagnosis of Brain Death: A Prospective Cohort Study.

    PubMed

    Sawicki, Marcin; Sołek-Pastuszka, Joanna; Chamier-Ciemińska, Katarzyna; Walecka, Anna; Bohatyrewicz, Romuald

    2018-05-04

    BACKGROUND This study was designed to determine diagnostic accuracy of computed tomographic perfusion (CTP) compared to computed tomographic angiography (CTA) for the diagnosis of brain death (BD). MATERIAL AND METHODS Whole-brain CTP was performed in patients diagnosed with BD and in patients with devastating brain injury with preserved brainstem reflexes. CTA was derived from CTP datasets. Cerebral blood flow (CBF) and volume (CBV) were calculated in all brain regions. CTP findings were interpreted as confirming diagnosis of BD (positive) when CBF and CBV in all ROIs were below 10 mL/100 g/min and 1.0 mL/100 g, respectively. CTA findings were interpreted using a 4-point system. RESULTS Fifty brain-dead patients and 5 controls were included. In brain-dead patients, CTP results revealed CBF 0.00-9.98 mL/100 g/min and CBV 0.00-0.99 mL/100 g, and were thus interpreted as positive in all patients. CTA results suggested 7 negative cases, providing 86% sensitivity. In the non-brain-dead group, CTP results revealed CBF 2.37-37.59 mL/100 g/min and CBV 0.73-2.34 mL/100 g. The difference between values of CBF and CBV in the brain-dead and non-brain-dead groups was statistically significant (p=0.002 for CBF and p=0.001 for CBV). CTP findings in all non-brain-dead patients were interpreted as negative. This resulted in a specificity of 100% (95% CI, 0.31-1.00) for CTP in the diagnosis of BD. In all non-brain-dead patients, CTA revealed preserved intracranial filling and was interpreted as negative. This resulted in a specificity of 100% (95% CI, 0.31-1.00) for CTA in diagnosis of BD. CONCLUSIONS Whole-brain CTP seems to be a highly sensitive and specific method in diagnosis of BD.
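    The study's decision rule (every region below both perfusion cut-offs) and its accuracy measures can be expressed directly; the per-patient values below are fabricated to show the computation and are not the study's data:

    ```python
    def ctp_positive(cbf_values, cbv_values):
        """BD-confirming CTP finding: every ROI below both cut-offs
        (CBF < 10 mL/100 g/min and CBV < 1.0 mL/100 g)."""
        return (all(f < 10.0 for f in cbf_values)
                and all(v < 1.0 for v in cbv_values))

    # (regional CBF list, regional CBV list, clinically brain-dead?)
    patients = [
        ([0.0, 4.2, 9.9],    [0.0, 0.4, 0.99], True),
        ([1.3, 7.7, 8.8],    [0.1, 0.5, 0.8],  True),
        ([12.4, 30.1, 37.6], [1.1, 1.9, 2.3],  False),
        ([2.4, 20.0, 25.0],  [0.7, 1.5, 2.0],  False),
    ]

    tp = sum(ctp_positive(f, v) for f, v, bd in patients if bd)
    fn = sum(not ctp_positive(f, v) for f, v, bd in patients if bd)
    tn = sum(not ctp_positive(f, v) for f, v, bd in patients if not bd)
    fp = sum(ctp_positive(f, v) for f, v, bd in patients if not bd)

    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    ```

    On this toy cohort the rule classifies everyone correctly; in the study the CTP rule was positive in all 50 brain-dead patients and negative in all controls, whereas CTA missed 7 brain-dead patients (86% sensitivity).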

  2. Integrating Laptop Computers into Classroom: Attitudes, Needs, and Professional Development of Science Teachers—A Case Study

    NASA Astrophysics Data System (ADS)

    Klieger, Aviva; Ben-Hur, Yehuda; Bar-Yossef, Nurit

    2010-04-01

    The study examines the professional development of junior-high-school teachers participating in the Israeli "Katom" (Computer for Every Class, Student and Teacher) Program, begun in 2004. A three-circle support and training model was developed for teachers' professional development. The first circle applies to all teachers in the program; the second, to all teachers at individual schools; the third, to teachers of specific disciplines. The study reveals and describes the attitudes of science teachers to the integration of laptop computers and to the accompanying professional development model. Semi-structured interviews were conducted with eight science teachers from the four schools participating in the program. The interviews were analyzed using a relational framework derived from the information that emerged from them. Two factors influenced science teachers' professional development: (1) the introduction of laptops to teachers and students, and (2) the support and training system. Interview analysis shows that the disciplinary training is most relevant to teachers and that they are very interested in belonging to the professional science teachers' community. They also prefer face-to-face meetings in their school. Among the difficulties they noted were the new learning environment, including control of student computers, computer integration in laboratory work, and technical problems. Laptop computers contributed significantly to teachers' professional and personal development and to a shift from teacher-centered to student-centered teaching. One-to-one laptops also changed the schools' digital culture. The findings are important for designing concepts and models for professional development when introducing technological innovation into the educational system.

  3. Open-Source Software in Computational Research: A Case Study

    DOE PAGES

    Syamlal, Madhava; O'Brien, Thomas J.; Benyahia, Sofiane; ...

    2008-01-01

A case study of open-source (OS) development of the computational research software MFIX, used for multiphase computational fluid dynamics simulations, is presented here. The verification and validation steps required for constructing modern computational software and the advantages of OS development in those steps are discussed. The infrastructure used for enabling the OS development of MFIX is described. The impact of OS development on computational research and education in gas-solids flow, as well as the dissemination of information to other areas such as geophysical and volcanology research, is demonstrated. This study shows that the advantages of OS development were realized in the case of MFIX: verification by many users, which enhances software quality; the use of software as a means for accumulating and exchanging information; and the facilitation of peer review of the results of computational research.

  4. Case Studies in Library Computer Systems.

    ERIC Educational Resources Information Center

    Palmer, Richard Phillips

    Twenty descriptive case studies of computer applications in a variety of libraries are presented in this book. Computerized circulation, serial and acquisition systems in public, high school, college, university and business libraries are included. Each of the studies discusses: 1) the environment in which the system operates, 2) the objectives of…

  5. Computer analysis of lighting style in fine art: steps towards inter-artist studies

    NASA Astrophysics Data System (ADS)

    Stork, David G.

    2011-03-01

Stylometry in visual art, the mathematical description of artists' styles, has been based on a number of properties of works, such as color, brush stroke shape, visual texture, and measures of contours' curvatures. We introduce the concept of quantitative measures of lighting, such as statistical descriptions of spatial coherence, diffuseness, and so forth, as properties of artistic style. Some artists of the high Renaissance, such as Leonardo, worked from nature and strove to render illumination "faithfully"; photorealists, such as Richard Estes, worked from photographs and duplicated the "physics-based" lighting accurately. As such, each had different motivations, methodologies, stagings, and "accuracies" in rendering lighting cues. Perceptual studies show that observers are poor judges of properties of lighting in photographs, such as consistency (and thus, by extension, in paintings as well); computer methods such as rigorous cast-shadow analysis, occluding-contour analysis, and spherical-harmonic-based estimation of light fields can be quite accurate. For these reasons, computer lighting analysis can provide new tools for art historical studies. We review lighting analysis in paintings such as Vermeer's Girl with a pearl earring, de la Tour's Christ in the carpenter's studio, and Caravaggio's Magdalen with the smoking flame and Calling of St. Matthew, and extend our corpus to works where lighting coherence is of interest to art historians, such as Caravaggio's Adoration of the Shepherds or Nativity (1609) in the Capuchin church of Santa Maria degli Angeli. Our measure of lighting coherence may help reveal the working methods of some artists and aid diachronic studies of individual artists. We speculate on artists and art historical questions that may ultimately profit from future refinements to these new computational tools.

  6. Hispanic women overcoming deterrents to computer science: A phenomenological study

    NASA Astrophysics Data System (ADS)

    Herling, Lourdes

The products of computer science are important to all aspects of society and are tools in the solution of the world's problems. It is, therefore, troubling that the United States faces a shortage of qualified graduates in computer science. The number of women and minorities in computer science is significantly lower than the percentage of the U.S. population that they represent. Overall enrollment in computer science programs has continued to decline, with the enrollment of women declining at a higher rate than that of men. This study addressed three aspects of underrepresentation on which there has been little previous research: computing disciplines considered specifically rather than embedded within the STEM disciplines, what attracts women and minorities to computer science, and race/ethnicity and gender examined in conjunction rather than in isolation. Since women of underrepresented ethnicities are more severely underrepresented than women in general, it is important to consider whether race and ethnicity play a role in addition to gender, as has been suggested by previous research. Therefore, this study examined what attracted Hispanic women to computer science specifically, and determined whether being subjected to multiple marginalizations, as female and Hispanic, played a role in the experiences of Hispanic women currently in computer science. The study found five emergent themes within the experiences of Hispanic women in computer science. Encouragement and role models strongly influenced not only the participants' choice to major in the field, but their persistence as well. Most of the participants experienced a negative atmosphere and feelings of not fitting in while in college and industry. The interdisciplinary nature of computer science was the aspect that most commonly attracted the participants to the field. The aptitudes participants commonly believed are needed for success in computer science are the Twenty

  7. [A computer-aided image diagnosis and study system].

    PubMed

    Li, Zhangyong; Xie, Zhengxiang

    2004-08-01

The revolution in information processing, particularly the digitization of medicine, has changed medical study, work, and management. This paper reports a method to design a system for computer-aided image diagnosis and study. Combining ideas from graph-text systems and the picture archiving and communication system (PACS), the system was realized and used for "prescription through computer," "managing images," and "reading images under computer and helping the diagnosis." Typical examples were also constructed in a database and used to teach beginners. The system was developed with visual development tools based on object-oriented programming (OOP) and runs on the Windows 9X platform. The system provides a friendly man-machine interface.

  8. A computational NMR study on zigzag aluminum nitride nanotubes

    NASA Astrophysics Data System (ADS)

    Bodaghi, Ali; Mirzaei, Mahmoud; Seif, Ahmad; Giahi, Masoud

    2008-12-01

A computational nuclear magnetic resonance (NMR) study is performed to investigate the electronic structure properties of single-walled zigzag aluminum nitride nanotubes (AlNNTs). The chemical-shielding (CS) tensors are calculated at the sites of Al-27 and N-15 nuclei in three structural forms of AlNNT: H-saturated, Al-terminated, and N-terminated. The structures are first optimized, and the CS tensors calculated for the optimized structures are then converted to chemical-shielding isotropic (CSI) and chemical-shielding anisotropic (CSA) parameters. The calculated parameters reveal that the various Al-27 and N-15 nuclei are divided into layers with equivalent electrostatic properties; furthermore, Al and N can act as Lewis acid and base, respectively. In the Al-terminated and N-terminated forms of AlNNT, in which one mouth of the nanotube is terminated by aluminum or nitrogen nuclei, respectively, only the CS tensors of the nuclei nearest the mouth of the nanotube change significantly upon removal of the saturating hydrogen atoms. Density functional theory (DFT) calculations are performed using the GAUSSIAN 98 program package.
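The CSI and CSA parameters referred to above are simple functions of the shielding tensor's principal components. A minimal sketch under the common convention (sigma11 <= sigma22 <= sigma33), with CSI as the mean of the components and CSA as sigma33 minus the mean of the other two; the numeric inputs are illustrative, not values from the paper.

```python
# Sketch: converting principal components of a chemical-shielding (CS) tensor
# (in ppm) into the isotropic (CSI) and anisotropic (CSA) parameters, using
# the common conventions CSI = (s11 + s22 + s33) / 3 and
# CSA = s33 - (s11 + s22) / 2, with s11 <= s22 <= s33.
# The input values below are illustrative, not results from the paper.

def cs_parameters(s11, s22, s33):
    """Return (CSI, CSA) from the principal components, sorted internally."""
    s11, s22, s33 = sorted((s11, s22, s33))
    csi = (s11 + s22 + s33) / 3.0
    csa = s33 - (s11 + s22) / 2.0
    return csi, csa

print(cs_parameters(250.0, 300.0, 380.0))  # (310.0, 105.0)
```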

  9. Outcomes from a pilot study using computer-based rehabilitative tools in a military population.

    PubMed

    Sullivan, Katherine W; Quinn, Julia E; Pramuka, Michael; Sharkey, Laura A; French, Louis M

    2012-01-01

Novel therapeutic approaches and outcome data are needed for cognitive rehabilitation of patients with traumatic brain injury; computer-based programs may play a critical role in filling existing knowledge gaps. Brain-fitness computer programs can complement existing therapies, maximize neuroplasticity, provide treatment beyond the clinic, and deliver objective efficacy data. However, these approaches have not been extensively studied in the military and traumatic brain injury population. Walter Reed National Military Medical Center established its Brain Fitness Center (BFC) in 2008 as an adjunct to traditional cognitive therapies for wounded warriors. The BFC offers commercially available "brain-training" products for military Service Members to use in a supportive, structured environment. Over 250 Service Members have utilized this therapeutic intervention. Each patient completes subjective assessments before and after BFC participation, including the Mayo-Portland Adaptability Inventory-4 (MPAI-4), the Neurobehavioral Symptom Inventory (NBSI), and the Satisfaction with Life Scale (SWLS). A review of the first 29 BFC participants who finished initial and repeat measures was completed to determine the effectiveness of the BFC program. Two of the three questionnaires of self-reported symptom change completed before and after participation in the BFC revealed a statistically significant reduction in symptom severity based on MPAI and NBSI total scores (p < .05). There were no significant differences in the SWLS score. Despite the typical limitations of a retrospective chart review, such as variation in treatment procedures, preliminary results reveal a trend towards improved self-reported cognitive and functional symptoms.

  10. A Computing Infrastructure for Supporting Climate Studies

    NASA Astrophysics Data System (ADS)

    Yang, C.; Bambacus, M.; Freeman, S. M.; Huang, Q.; Li, J.; Sun, M.; Xu, C.; Wojcik, G. S.; Cahalan, R. F.; NASA Climate @ Home Project Team

    2011-12-01

Climate change is one of the major challenges facing the planet in the 21st century. Scientists build many models to simulate past climate and predict climate change over the coming decades or century. Most of the models run at low resolution, with some targeting high resolution in linkage to practical climate change preparedness. To calibrate and validate the models, millions of model runs are needed to find the best simulation and configuration. This paper introduces the NASA Climate@Home project, an effort to build a supercomputer based on advanced computing technologies such as cloud computing and grid computing. The Climate@Home computing infrastructure includes several aspects: 1) a cloud computing platform is utilized to manage potential spikes in access to the centralized components, such as the grid computing server for dispatching model runs and collecting results; 2) a grid computing engine is developed based on MapReduce to dispatch models and model configurations and to collect simulation results and contribution statistics; 3) a portal serves as the entry point to the project, providing management, sharing, and data exploration for end users; 4) scientists can access customized tools to configure model runs and visualize model results; 5) the public can follow Twitter and Facebook to get the latest news about the project. This paper introduces the latest progress of the project, and the operational system will be demonstrated during the AGU Fall Meeting. It also discusses how this technology can become a trailblazer for other climate studies and relevant sciences, and shares how the challenges in computation and software integration were solved.
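The MapReduce-based dispatch/collect engine described in item 2 can be pictured with a toy example: "map" runs one model configuration and "reduce" aggregates the returned results. Everything here (the model function, the configuration fields, the scoring) is a hypothetical stand-in, not Climate@Home code.

```python
# Minimal sketch of the dispatch/collect pattern: "map" runs one model
# configuration, "reduce" aggregates the results. run_model and the config
# fields are hypothetical placeholders for a real climate-model run.
from functools import reduce

def run_model(config):
    # Stand-in for a model run: returns a (config_id, score) pair.
    return (config["id"], config["resolution"] * 0.5)

def collect(best, result):
    # Keep the run with the highest score seen so far.
    return result if result[1] > best[1] else best

configs = [{"id": i, "resolution": r} for i, r in enumerate([1.0, 2.0, 0.5])]
results = map(run_model, configs)                       # dispatch phase
best = reduce(collect, results, (None, float("-inf")))  # collect phase
print(best)  # (1, 1.0)
```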

  11. Computational mechanics needs study

    NASA Technical Reports Server (NTRS)

    Griffin, O. Hayden, Jr.

    1993-01-01

In order to assess the needs in computational mechanics over the next decade, we formulated a questionnaire and contacted computational mechanics researchers and users in industry, government, and academia. As expected, we found a wide variety of computational mechanics usage and research. This report outlines the activity discussed with those contacts, as well as that in our own organizations. It should be noted that most of the contacts were made before the recent decline of the defense industry; therefore, areas which are strongly defense-oriented may decrease in relative importance. In order to facilitate updating of this study, names of a few key researchers in each area are included as starting points for future literature surveys. These lists of names are not intended to represent those persons doing the best research in that area, nor are they intended to be comprehensive. They are, as previously stated, offered as starting points for future literature searches. Overall, there is currently broad activity in computational mechanics in this country, with the breadth and depth increasing as more sophisticated software and faster computers become more available. The needs and desires of the workers in this field are as diverse as their backgrounds and organizational products. There seems to be some degree of software development in any organization which has any research component in its mission (although the level of activity is highly variable from one organization to another). It seems, however, that there is considerable use of commercial software in almost all organizations. In most industrial research organizations, it appears that very little actual software development is contracted out; most is done in-house, using a mixture of funding sources. Government agencies vary widely in the ratio of in-house to contracted-out development. There is a considerable amount of experimental verification in most, but not all, organizations. Generally, the amount of

  12. Study of Airline Computer Reservation Systems

    DOT National Transportation Integrated Search

    1988-05-01

    The study addresses possible competitive issues concerning the five airline-owned computer reservation systems (SABRE, APOLLO, SYSTEMONE, PARS and DATAS II). The relationship of the fees charged by the vendor airlines to participating airlines and tr...

  13. A Study in Computer Abuse.

    ERIC Educational Resources Information Center

    Caulfield Inst. of Technology (Australia).

    Computer abuse is examined as both a general issue and as a specific problem. A statistical profile of computer crime lists distribution by country of reported cases, by industry of occurrence, and by amount of monetary loss. The characteristics of computer abuse are described along with the important categories of such crimes. Factors inhibiting…

  14. Study on the Computational Estimation Performance and Computational Estimation Attitude of Elementary School Fifth Graders in Taiwan

    ERIC Educational Resources Information Center

    Tsao, Yea-Ling; Pan, Ting-Rung

    2011-01-01

The main purposes of this study are to investigate the level of computational estimation performance possessed by fifth graders and to explore fifth graders' attitudes toward computational estimation. Two hundred and thirty-five Grade-5 students from four elementary schools in Taipei City were selected for the "Computational Estimation Test" and…

  15. Computer Science and Engineering Students Addressing Critical Issues Regarding Gender Differences in Computing: A Case Study

    ERIC Educational Resources Information Center

    Tsagala, Evrikleia; Kordaki, Maria

    2008-01-01

This study focuses on how Computer Science and Engineering Students (CSESs) of both genders address certain critical issues for gender differences in the field of Computer Science and Engineering (CSE). This case study is based on research conducted on a sample of 99 Greek CSESs, 43 of whom were women. More specifically, these students were asked…

  16. Investigation of the computer experiences and attitudes of pre-service mathematics teachers: new evidence from Turkey.

    PubMed

    Birgin, Osman; Catlioğlu, Hakan; Gürbüz, Ramazan; Aydin, Serhat

    2010-10-01

This study aimed to investigate the experiences of pre-service mathematics (PSM) teachers with computers and their attitudes toward them. The Computer Attitude Scale, Computer Competency Survey, and Computer Use Information Form were administered to 180 Turkish PSM teachers. Results revealed that most PSM teachers used computers at home and at Internet cafes, and that their competency was generally at the intermediate or upper level. The study concludes that PSM teachers' attitudes about computers differ according to their years of study, computer ownership, level of computer competency, frequency of computer use, computer experience, and whether they had attended a computer-aided instruction course. However, computer attitudes were not affected by gender.

  17. An Examination of Computer Engineering Students' Perceptions about Asynchronous Discussion Forums

    ERIC Educational Resources Information Center

    Ozyurt, Ozcan; Ozyurt, Hacer

    2013-01-01

    This study was conducted in order to reveal the usage profiles and perceptions of Asynchronous Discussion Forums (ADFs) of 126 computer engineering students from the Computer Engineering Department in a university in Turkey. By using a mixed methods research design both quantitative and qualitative data were collected and analyzed. Research…

  18. Nurses' attitudes towards computers: cross sectional questionnaire study.

    PubMed

    Brumini, Gordan; Kovic, Ivor; Zombori, Dejvid; Lulic, Ileana; Petrovecki, Mladen

    2005-02-01

To estimate the attitudes of hospital nurses towards computers and the influence of gender, age, education, and computer usage on these attitudes. The study was conducted in two Croatian hospitals where an integrated hospital information system was being implemented. A total of 1,081 nurses were surveyed with an anonymous questionnaire consisting of 8 questions about demographic data, education, and computer usage, and 30 statements on attitudes towards computers. The statements were adapted to a Likert-type scale. Differences in attitudes towards computers were compared using one-way ANOVA and the Tukey-b post-hoc test. The total score was 120+/-15 (mean+/-standard deviation) out of a maximum of 150. Nurses younger than 30 years had a higher total score than those older than 30 years (124+/-13 vs 119+/-16 for the 30-39 age group and 117+/-15 for the >39 age group, P<0.001). Nurses with a bachelor's degree (119+/-16 vs 122+/-14, P=0.002) and nurses who had attended computer science courses had a higher total score compared to the others (124+/-13 vs 118+/-16, P<0.001). Nurses using computers more than 5 hours per week had a higher total score than those who used computers less (127+/-13 vs 124+/-12 for 1-5 hours and 119+/-14 for <1 hour per week, P<0.001, post-hoc test). Nurses in general have positive attitudes towards computers. These results are important for planning and implementing an integrated hospital information system.
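The group comparisons above use one-way ANOVA. As a rough illustration of the statistic involved, here is a self-contained computation of the one-way ANOVA F value on made-up Likert total scores; the sample data are invented, not the study's.

```python
# Illustrative one-way ANOVA F statistic (between-group mean square divided
# by within-group mean square), the test used to compare mean attitude
# scores between nurse groups. The data below are made up for illustration.

def anova_f(groups):
    """Return the one-way ANOVA F statistic for a list of samples."""
    all_vals = [x for g in groups for x in g]
    n, k = len(all_vals), len(groups)
    grand = sum(all_vals) / n
    ss_between = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
    ss_within = sum((x - sum(g) / len(g)) ** 2 for g in groups for x in g)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

young = [124, 126, 122, 125]   # hypothetical total scores, age < 30
older = [118, 117, 120, 116]   # hypothetical total scores, age >= 30
print(round(anova_f([young, older]), 2))  # 28.97
```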

  19. The Effect of Home Computer Use on Jordanian Children: A Parental Perspective

    ERIC Educational Resources Information Center

    Khasawneh, Omar M.; Al-Awidi, Hamed M.

    2008-01-01

    The purpose of this study is to examine the effects of computer technology on Jordanian children from the perspectives of their parents. The sample of the study consisted of 127 participants. Each participant is a parent of a child or children who owned a personal computer. Our findings revealed some of the positive as well as negative changes…

  20. Low-Cost Computer-Aided Instruction/Computer-Managed Instruction (CAI/CMI) System: Feasibility Study. Final Report.

    ERIC Educational Resources Information Center

    Lintz, Larry M.; And Others

    This study investigated the feasibility of a low cost computer-aided instruction/computer-managed instruction (CAI/CMI) system. Air Force instructors and training supervisors were surveyed to determine the potential payoffs of various CAI and CMI functions. Results indicated that a wide range of capabilities had potential for resident technical…

  1. Children as Educational Computer Game Designers: An Exploratory Study

    ERIC Educational Resources Information Center

    Baytak, Ahmet; Land, Susan M.; Smith, Brian K.

    2011-01-01

    This study investigated how children designed computer games as artifacts that reflected their understanding of nutrition. Ten 5th grade students were asked to design computer games with the software "Game Maker" for the purpose of teaching 1st graders about nutrition. The results from the case study show that students were able to…

  2. Computer-assisted abdominal surgery: new technologies.

    PubMed

    Kenngott, H G; Wagner, M; Nickel, F; Wekerle, A L; Preukschas, A; Apitz, M; Schulte, T; Rempel, R; Mietkowski, P; Wagner, F; Termer, A; Müller-Stich, Beat P

    2015-04-01

    Computer-assisted surgery is a wide field of technologies with the potential to enable the surgeon to improve efficiency and efficacy of diagnosis, treatment, and clinical management. This review provides an overview of the most important new technologies and their applications. A MEDLINE database search was performed revealing a total of 1702 references. All references were considered for information on six main topics, namely image guidance and navigation, robot-assisted surgery, human-machine interface, surgical processes and clinical pathways, computer-assisted surgical training, and clinical decision support. Further references were obtained through cross-referencing the bibliography cited in each work. Based on their respective field of expertise, the authors chose 64 publications relevant for the purpose of this review. Computer-assisted systems are increasingly used not only in experimental studies but also in clinical studies. Although computer-assisted abdominal surgery is still in its infancy, the number of studies is constantly increasing, and clinical studies start showing the benefits of computers used not only as tools of documentation and accounting but also for directly assisting surgeons during diagnosis and treatment of patients. Further developments in the field of clinical decision support even have the potential of causing a paradigm shift in how patients are diagnosed and treated.

  3. Factors influencing exemplary science teachers' levels of computer use

    NASA Astrophysics Data System (ADS)

    Hakverdi, Meral

This study examines exemplary science teachers' use of technology in science instruction, factors influencing their level of computer use, their level of knowledge/skills in using specific computer applications for science instruction, their use of computer-related applications/tools during their instruction, and their students' use of computer applications/tools in or for their science class. After a relevant review of the literature, certain variables were selected for analysis. These variables included personal self-efficacy in teaching with computers, outcome expectancy, pupil-control ideology, level of computer use, age, gender, teaching experience, personal computer use, professional computer use, and science teachers' level of knowledge/skills in using specific computer applications for science instruction. The sample for this study includes middle and high school science teachers who received the Presidential Award for Excellence in Science Teaching (sponsored by the White House and the National Science Foundation) between the years 1997 and 2003 from all 50 states and U.S. territories. Award-winning science teachers were contacted about the survey via e-mail or letter with an enclosed return envelope. Of the 334 award-winning science teachers, usable responses were received from 92, a response rate of 27.5%. Analysis of the survey responses indicated that exemplary science teachers have a variety of knowledge/skills in using computer-related applications/tools. The most commonly used computer applications/tools are information retrieval via the Internet, presentation tools, online communication, digital cameras, and data collection probes. Results of the study revealed that students' use of technology in their science classroom is highly correlated with the frequency of their science teachers' use of computer applications/tools. The results of the multiple regression analysis revealed that personal self-efficacy related to

  4. Could offset cluster reveal strong earthquake pattern?——case study from Haiyuan Fault

    NASA Astrophysics Data System (ADS)

    Ren, Z.; Zhang, Z.; Chen, T.; Yin, J.; Zhang, P. Z.; Zheng, W.; Zhang, H.; Li, C.

    2016-12-01

Since the 1990s, researchers have tried to use offset clusters to study strong earthquake patterns. However, because of the limited quantity of offset data, the approach was not widely used until recent years, with the rapid development of high-resolution topographic data such as remote sensing images and LiDAR. In this study, we use airborne LiDAR data to re-evaluate the cumulative offsets and the co-seismic offset of the 1920 Haiyuan Ms 8.5 earthquake along the western and middle segments of the co-seismic surface rupture zone. Our LiDAR data indicate that the offset observations along both the western and middle segments fall into five groups. The group with the minimum slip amount is associated with the 1920 Haiyuan Ms 8.5 earthquake, which ruptured both the western and middle segments. Our research highlights two new interpretations: first, the previously reported maximum displacement of the 1920 earthquake was likely produced by at least two earthquakes; second, the Cumulative Offset Probability Density (COPD) peaks of the same offset amount on the western and middle segments do not correspond to each other one to one. The ages of the paleoearthquakes indicate the offsets were not accumulated during the same period. We suggest that any discussion of the rupture pattern of a fault based on offset data should also consider fault segmentation and paleoseismological data; therefore, when using COPD peaks to study the number of palaeo-events and their rupture patterns, the peaks should be computed and analyzed on fault sub-sections rather than on entire fault zones. Our results reveal that the rupture patterns on the western and middle segments of the Haiyuan Fault differ from each other, which provides new data for regional seismic potential analysis.
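A Cumulative Offset Probability Density of the kind discussed above is commonly built by summing one Gaussian per offset measurement and reading off the local maxima as COPD peaks. A minimal sketch with invented measurements, not Haiyuan Fault data:

```python
# Sketch of a Cumulative Offset Probability Density (COPD): each offset
# measurement (mean, sigma) contributes a unit-area Gaussian, and the summed
# curve's local maxima are the COPD peaks. The measurements below are
# illustrative, not data from the Haiyuan Fault.
import math

def copd(offsets, xs):
    """Summed Gaussian density, one term per (mean, sigma) measurement."""
    return [sum(math.exp(-0.5 * ((x - m) / s) ** 2) / (s * math.sqrt(2 * math.pi))
                for m, s in offsets) for x in xs]

def peaks(xs, ys):
    """x positions of strict local maxima of the density curve."""
    return [xs[i] for i in range(1, len(ys) - 1) if ys[i - 1] < ys[i] > ys[i + 1]]

measurements = [(4.0, 0.3), (4.2, 0.3), (8.1, 0.4)]  # offsets in meters
xs = [i * 0.1 for i in range(121)]                   # 0 to 12 m grid
density = copd(measurements, xs)
print([round(p, 2) for p in peaks(xs, density)])     # peaks near 4.1 and 8.1 m
```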

  5. Computational Study of a Functionally Graded Ceramic-Metallic Armor

    DTIC Science & Technology

    2006-12-15

UNCLAS - Dist. A - Approved for public release. Computational Study of a Functionally Graded Ceramic-Metallic Armor. Douglas W. Templeton, Tara J… …efficiency of a postulated FGM ceramic-metallic armor system composed of aluminum nitride (AlN) and aluminum. The study had two primary…

  6. Efficient universal blind quantum computation.

    PubMed

    Giovannetti, Vittorio; Maccone, Lorenzo; Morimae, Tomoyuki; Rudolph, Terry G

    2013-12-06

We give a cheat-sensitive protocol for blind universal quantum computation that is efficient in terms of computational and communication resources: it allows one party to perform an arbitrary computation on a second party's quantum computer without revealing either which computation is performed, or its input and output. The first party's computational capabilities can be extremely limited: she must only be able to create and measure single-qubit superposition states. The second party is not required to use measurement-based quantum computation. The protocol requires the (optimal) exchange of O(J log2(N)) single-qubit states, where J is the computational depth and N is the number of qubits needed for the computation.
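The stated communication cost, O(J log2(N)) single-qubit exchanges, scales as sketched below; this is only a toy for the asymptotic count, with the constant factor c = 1 assumed for illustration.

```python
# Toy illustration of the protocol's communication scaling: on the order of
# J * log2(N) single-qubit exchanges, where J is the computational depth and
# N the number of qubits. The constant factor c = 1 is an assumption.
import math

def qubit_exchanges(j_depth, n_qubits, c=1):
    """Order-of-magnitude count of single-qubit states exchanged."""
    return c * j_depth * math.ceil(math.log2(n_qubits))

print(qubit_exchanges(100, 1024))  # 100 * 10 = 1000
```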

  7. Exemplary Social Studies Teachers Use of Computer-Supported Instruction in the Classroom

    ERIC Educational Resources Information Center

    Acikalin, Mehmet

    2010-01-01

Educators increasingly support the use of computer-supported instruction in social studies education. However, few studies have been conducted on teachers' use of computer-supported instruction in social studies education. This study was therefore designed to examine exemplary social studies teachers' use of computer-supported instruction…

  8. The effect of Vaccinium uliginosum extract on tablet computer-induced asthenopia: randomized placebo-controlled study.

    PubMed

    Park, Choul Yong; Gu, Namyi; Lim, Chi-Yeon; Oh, Jong-Hyun; Chang, Minwook; Kim, Martha; Rhee, Moo-Yong

    2016-08-18

To investigate the alleviating effect of Vaccinium uliginosum extract (DA9301) on tablet computer-induced asthenopia. This was a randomized, placebo-controlled, double-blind, parallel study (trial registration number: 2013-95). A total of 60 volunteers were randomized into DA9301 (n = 30) and control (n = 30) groups. The DA9301 group received DA9301 oral pills (1000 mg/day) for 4 weeks and the control group received placebo. Asthenopia was evaluated by administering a questionnaire containing 10 questions (responses scored on a scale of 0-6; total score: 60) regarding ocular symptoms before (baseline) and 4 weeks after receiving the pills (DA9301 or placebo). The participants completed the questionnaire before and after tablet computer (iPad Air, Apple Inc.) viewing at each visit. The change in total asthenopia score (TAS) was calculated and compared between the groups. At baseline, TAS increased significantly after tablet computer viewing in the DA9301 group (from 20.35 to 23.88; p = 0.031). However, after receiving DA9301 for 4 weeks, TAS remained stable after tablet computer viewing. In the control group, TAS changes induced by tablet computer viewing were not significant either at baseline or at 4 weeks after receiving placebo. Further analysis revealed that the scores for "tired eyes" (p = 0.001), "sore/aching eyes" (p = 0.038), "irritated eyes" (p = 0.010), "watery eyes" (p = 0.005), "dry eyes" (p = 0.003), "eye strain" (p = 0.006), "blurred vision" (p = 0.034), and "visual discomfort" (p = 0.018) significantly improved in the DA9301 group. We found that oral intake of DA9301 (1000 mg/day for 4 weeks) was effective in alleviating asthenopia symptoms induced by tablet computer viewing. The study is registered at www.clinicaltrials.gov (registration number: NCT02641470, date of registration December 30, 2015).

  9. Computational Investigation of Fluidic Counterflow Thrust Vectoring

    NASA Technical Reports Server (NTRS)

    Hunter, Craig A.; Deere, Karen A.

    1999-01-01

    A computational study of fluidic counterflow thrust vectoring has been conducted. Two-dimensional numerical simulations were run using the computational fluid dynamics code PAB3D with two-equation turbulence closure and linear Reynolds stress modeling. For validation, computational results were compared to experimental data obtained at the NASA Langley Jet Exit Test Facility. In general, computational results were in good agreement with experimental performance data, indicating that efficient thrust vectoring can be obtained with low secondary flow requirements (less than 1% of the primary flow). An examination of the computational flowfield has revealed new details about the generation of a countercurrent shear layer, its relation to secondary suction, and its role in thrust vectoring. In addition to providing new information about the physics of counterflow thrust vectoring, this work appears to be the first documented attempt to simulate the counterflow thrust vectoring problem using computational fluid dynamics.

  10. A Study of Student-Teachers' Readiness to Use Computers in Teaching: An Empirical Study

    ERIC Educational Resources Information Center

    Padmavathi, M.

    2016-01-01

    This study attempts to analyze student-teachers' attitude towards the use of computers for classroom teaching. Four dimensions of computer attitude on a Likert-type five-point scale were used: Affect (liking), Perceived usefulness, Perceived Control, and Behaviour Intention to use computers. The effect of student-teachers' subject area, years of…

  11. A novel quantum scheme for secure two-party distance computation

    NASA Astrophysics Data System (ADS)

    Peng, Zhen-wan; Shi, Run-hua; Zhong, Hong; Cui, Jie; Zhang, Shun

    2017-12-01

    Secure multiparty computational geometry is an essential branch of secure multiparty computation, in which a computational geometry problem is solved without revealing any private information of either party. Secure two-party distance computation is a primitive of secure multiparty computational geometry: it computes the distance between two points without revealing either point's location (i.e., its coordinates). Secure two-party distance computation has potential applications with high security requirements in military, business, engineering, and other domains. In this paper, we present a quantum solution to secure two-party distance computation by subtly using quantum private query. Compared to related classical protocols, our quantum protocol ensures higher security and better privacy protection because of the physical principles of quantum mechanics.
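    The primitive being protected here is the ordinary Euclidean distance between the two parties' points. As a point of reference, a minimal non-private (cleartext) version of the computation, with illustrative coordinates, might look like:

```python
import math

def distance(p: tuple[float, float], q: tuple[float, float]) -> float:
    """Two-party distance in the clear: both points are fully revealed."""
    return math.hypot(p[0] - q[0], p[1] - q[1])

# Illustrative coordinates for the two parties (the very data a secure
# protocol such as the one in this paper would keep hidden).
alice, bob = (0.0, 0.0), (3.0, 4.0)
print(distance(alice, bob))  # → 5.0
```

    A secure two-party protocol must output this same value while revealing neither `alice` nor `bob` to the other party or to an eavesdropper.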

  12. Spacelab experiment computer study. Volume 1: Executive summary (presentation)

    NASA Technical Reports Server (NTRS)

    Lewis, J. L.; Hodges, B. C.; Christy, J. O.

    1976-01-01

    A quantitative cost for various Spacelab flight hardware configurations is provided, along with varied software development options. A cost analysis of Spacelab computer hardware and software is presented, based on utilization of a central experiment computer with optional auxiliary equipment. Groundrules and assumptions used in deriving the costing methods for all options in the Spacelab experiment study are presented and analyzed, and the options, along with their cost considerations, are discussed. It is concluded that Spacelab program cost for software development and maintenance is independent of experimental hardware and software options, that the distributed standard computer concept simplifies software integration without a significant increase in cost, and that decisions on flight computer hardware configurations should not be made until payload selection for a given mission and a detailed analysis of the mission requirements are completed.

  13. Experimental Realization of High-Efficiency Counterfactual Computation.

    PubMed

    Kong, Fei; Ju, Chenyong; Huang, Pu; Wang, Pengfei; Kong, Xi; Shi, Fazhan; Jiang, Liang; Du, Jiangfeng

    2015-08-21

    Counterfactual computation (CFC) exemplifies the fascinating quantum process by which the result of a computation may be learned without actually running the computer. In previous experimental studies, the counterfactual efficiency was limited to below 50%. Here we report an experimental realization of the generalized CFC protocol, in which the counterfactual efficiency can break the 50% limit and even approach unity in principle. The experiment is performed with the spins of a negatively charged nitrogen-vacancy color center in diamond. Taking advantage of the quantum Zeno effect, the computer can remain in the not-running subspace owing to frequent projection by the environment, while the computation result can be revealed by final detection. A counterfactual efficiency of up to 85% has been demonstrated in our experiment, which opens the possibility of many exciting applications of CFC, such as high-efficiency quantum integration and imaging.

  14. Experimental Realization of High-Efficiency Counterfactual Computation

    NASA Astrophysics Data System (ADS)

    Kong, Fei; Ju, Chenyong; Huang, Pu; Wang, Pengfei; Kong, Xi; Shi, Fazhan; Jiang, Liang; Du, Jiangfeng

    2015-08-01

    Counterfactual computation (CFC) exemplifies the fascinating quantum process by which the result of a computation may be learned without actually running the computer. In previous experimental studies, the counterfactual efficiency was limited to below 50%. Here we report an experimental realization of the generalized CFC protocol, in which the counterfactual efficiency can break the 50% limit and even approach unity in principle. The experiment is performed with the spins of a negatively charged nitrogen-vacancy color center in diamond. Taking advantage of the quantum Zeno effect, the computer can remain in the not-running subspace owing to frequent projection by the environment, while the computation result can be revealed by final detection. A counterfactual efficiency of up to 85% has been demonstrated in our experiment, which opens the possibility of many exciting applications of CFC, such as high-efficiency quantum integration and imaging.

  15. The effect of psychosocial stress on muscle activity during computer work: Comparative study between desktop computer and mobile computing products.

    PubMed

    Taib, Mohd Firdaus Mohd; Bahn, Sangwoo; Yun, Myung Hwan

    2016-06-27

    The popularity of mobile computing products is well known. It is therefore crucial to evaluate their contribution to musculoskeletal disorders during computer usage in both comfortable and stressful environments. This study explores the effect on muscle activity of using different computer products with different tasks used to induce psychosocial stress. Fourteen male subjects performed computer tasks: sixteen combinations of four different computer products with four different tasks used to induce stress. Electromyography for four muscles in the forearm, shoulder, and neck regions and task performance were recorded. The increase in trapezius muscle activity depended on the task used to induce stress: a higher level of stress produced a greater increase. However, this relationship was not found in the other three muscles. In addition, compared to desktop and laptop use, the lowest activity for all muscles was obtained during the use of a tablet or smartphone. The best net performance was obtained in a comfortable environment. However, during stressful conditions, the best performance was obtained using the device that a user is most comfortable with or has the most experience with. Different computer products and different levels of stress play a large role in muscle activity during computer work. Both of these factors must be taken into account in order to reduce the occurrence of musculoskeletal disorders or problems.

  16. Inequities in Computer Education Due to Gender, Race, and Socioeconomic Status.

    ERIC Educational Resources Information Center

    Urban, Cynthia M.

    Recent reports have revealed that inequalities exist between males and females, racial minorities and whites, and rich and poor in accessibility to and use of computers. This study reviews the research in the field of computer-based education to determine the extent of and reasons for these inequities. The annotated research articles are arranged…

  17. Preferred computer activities among individuals with dementia: a pilot study.

    PubMed

    Tak, Sunghee H; Zhang, Hongmei; Hong, Song Hee

    2015-03-01

    Computers offer new activities that are easily accessible, cognitively stimulating, and enjoyable for individuals with dementia. The current descriptive study examined preferred computer activities among nursing home residents with different severity levels of dementia. A secondary data analysis was conducted using activity observation logs from 15 study participants with dementia (severe = 115 logs, moderate = 234 logs, and mild = 124 logs) who participated in a computer activity program. Significant differences existed in preferred computer activities among groups with different severity levels of dementia. Participants with severe dementia spent significantly more time watching slide shows with music than those with both mild and moderate dementia (F [2,12] = 9.72, p = 0.003). Preference in playing games also differed significantly across the three groups. It is critical to consider individuals' interests and functional abilities when computer activities are provided for individuals with dementia. A practice guideline for tailoring computer activities is detailed. Copyright 2015, SLACK Incorporated.

  18. Preparation, characterization, drug release and computational modelling studies of antibiotics loaded amorphous chitin nanoparticles.

    PubMed

    Gayathri, N K; Aparna, V; Maya, S; Biswas, Raja; Jayakumar, R; Mohan, C Gopi

    2017-12-01

    We present a computational investigation of the binding affinity of different types of drugs with chitin nanocarriers. Understanding the chitin polymer-drug interaction is important for designing and optimizing chitin-based drug delivery systems. The binding affinities of three different anti-bacterial drugs, Ethionamide (ETA), Methacycline (MET), and Rifampicin (RIF), with amorphous chitin nanoparticles (AC-NPs) were studied by integrating computational and experimental techniques. Molecular docking studies gave binding energies (BE) with respect to AC-NPs of -7.3 kcal/mol for hydrophobic ETA, -5.1 kcal/mol for hydrophilic MET, and -8.1 kcal/mol for hydrophobic RIF. This theoretical result was in good correlation with the experimental drug loading and drug entrapment efficiencies on AC-NPs of MET (3.5±0.1 and 25±2%), ETA (5.6±0.02 and 45±4%), and RIF (8.9±0.20 and 53±5%), respectively. Stability studies of the drug-encapsulated nanoparticles showed stable size, zeta potential, and polydispersity index at 6°C. The correlation between computational BE and experimental drug entrapment efficiency for RIF, ETA, and MET with the four AC-NP strands was 0.999, while that for drug loading efficiency was 0.854. Further, the molecular docking results predict the atomic-level details, derived from the electrostatic, hydrogen bonding, and hydrophobic interactions of the drug and nanoparticle, that govern encapsulation and loading in these chitin-based host-guest nanosystems. The present results thus reveal insights into drug loading and drug delivery and have the potential to reduce the time and cost of optimizing, developing, and discovering new antibiotic drug delivery nanosystems. Copyright © 2017 Elsevier Ltd. All rights reserved.
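    The reported 0.999 agreement between docking energies and entrapment efficiencies can be checked directly from the numbers quoted in the abstract. A minimal sketch using only those three BE/entrapment pairs (the correlation comes out negative because a more negative binding energy accompanies higher entrapment):

```python
import math

# Binding energies (kcal/mol) and entrapment efficiencies (%) as quoted above.
be = {"ETA": -7.3, "MET": -5.1, "RIF": -8.1}
ee = {"ETA": 45.0, "MET": 25.0, "RIF": 53.0}

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = math.sqrt(sum((x - mx) ** 2 for x in xs)
                    * sum((y - my) ** 2 for y in ys))
    return num / den

drugs = ["ETA", "MET", "RIF"]
r = pearson([be[d] for d in drugs], [ee[d] for d in drugs])
print(f"|r| = {abs(r):.4f}")  # magnitude consistent with the reported 0.999
```

    With only three data points such a correlation is easy to achieve, which is worth keeping in mind when reading the 0.999 and 0.854 figures.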

  19. Teaching of Real Numbers by Using the Archimedes-Cantor Approach and Computer Algebra Systems

    ERIC Educational Resources Information Center

    Vorob'ev, Evgenii M.

    2015-01-01

    Computer technologies and especially computer algebra systems (CAS) allow students to overcome some of the difficulties they encounter in the study of real numbers. The teaching of calculus can be considerably more effective with the use of CAS provided the didactics of the discipline makes it possible to reveal the full computational potential of…

  20. Enhanced delegated computing using coherence

    NASA Astrophysics Data System (ADS)

    Barz, Stefanie; Dunjko, Vedran; Schlederer, Florian; Moore, Merritt; Kashefi, Elham; Walmsley, Ian A.

    2016-03-01

    A longstanding question is whether it is possible to delegate computational tasks securely—such that neither the computation nor the data is revealed to the server. Recently, both a classical and a quantum solution to this problem were found [C. Gentry, in Proceedings of the 41st Annual ACM Symposium on the Theory of Computing (Association for Computing Machinery, New York, 2009), pp. 167-178; A. Broadbent, J. Fitzsimons, and E. Kashefi, in Proceedings of the 50th Annual Symposium on Foundations of Computer Science (IEEE Computer Society, Los Alamitos, CA, 2009), pp. 517-526]. Here, we study the first step towards the interplay between classical and quantum approaches and show how coherence can be used as a tool for secure delegated classical computation. We show that a client with limited computational capacity—restricted to an XOR gate—can perform universal classical computation by manipulating information carriers that may occupy superpositions of two states. Using single photonic qubits or coherent light, we experimentally implement secure delegated classical computations between an independent client and a server, which are installed in two different laboratories and separated by 50 m. The server has access to the light sources and measurement devices, whereas the client may use only a restricted set of passive optical devices to manipulate the information-carrying light beams. Thus, our work highlights how minimal quantum and classical resources can be combined and exploited for classical computing.
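    The classical fact such schemes lean on is that XOR masking with a random bit hides data perfectly yet is reversible with a second XOR, so a client limited to XOR gates can still blind its inputs. A minimal classical sketch of that idea (an illustration only, not the paper's photonic protocol):

```python
import secrets

def mask(bits):
    """One-time-pad style blinding: XOR each data bit with a fresh random bit."""
    pad = [secrets.randbelow(2) for _ in bits]
    return [b ^ k for b, k in zip(bits, pad)], pad

def unmask(masked, pad):
    """Undo the blinding with the same XOR operation."""
    return [m ^ k for m, k in zip(masked, pad)]

data = [1, 0, 1, 1, 0]
hidden, key = mask(data)            # the server only ever sees `hidden`
assert unmask(hidden, key) == data  # the client recovers data with XOR alone
```

    In the experiment the blinding is carried by superposition states of light rather than classical random bits, which is what lets the authors combine a quantum carrier with this XOR-limited client.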

  1. Linking Different Cultures by Computers: A Study of Computer-Assisted Music Notation Instruction.

    ERIC Educational Resources Information Center

    Chen, Steve Shihong; Dennis, J. Richard

    1993-01-01

    Describes a study that investigated the feasibility of using computers to teach music notation systems to Chinese students, as well as to help Western educators study Chinese music and its number notation system. Topics discussed include students' learning sequences; HyperCard software; hypermedia and graphic hypertext indexing; and the…

  2. [Hand motor dysfunctions in computer users].

    PubMed

    Shavlovskaia, O A; Shvarkov, S B; Posokhov, S I

    2010-01-01

    A total of 239 female typists aged 16 to 62 years (mean age 20.1±7.8 years) were studied, using the authors' questionnaire for computer typists, to assess hand function and to develop preventive measures for the disturbances revealed. Indirect signs of tunnel hand neuropathy (27.2%), focal hand dystonia (21.4%), and muscular-tonic syndromes of various localizations (18%) were found. Typists are a risk group for fine hand motor dysfunction. As preventive measures, the authors recommend using auxiliary computer devices, changing the motor stereotype during the day, taking hand "motor holidays", and organizing the working place properly.

  3. Case Studies of Auditing in a Computer-Based Systems Environment.

    ERIC Educational Resources Information Center

    General Accounting Office, Washington, DC.

    In response to a growing need for effective and efficient means for auditing computer-based systems, a number of studies dealing primarily with batch-processing type computer operations have been conducted to explore the impact of computers on auditing activities in the Federal Government. This report first presents some statistical data on…

  4. Lower airway dimensions in pediatric patients-A computed tomography study.

    PubMed

    Szelloe, Patricia; Weiss, Markus; Schraner, Thomas; Dave, Mital H

    2017-10-01

    The aim of this study was to obtain lower airway dimensions in children by means of computed tomography (CT). Chest CT scans from 195 pediatric patients (118 boys/77 girls) aged 0.04-15.99 years were analyzed. Tracheal and bronchial lengths, anterior-posterior and lateral diameters, as well as cross-sectional area were assessed at the following levels: mid trachea, right proximal and distal bronchus, proximal bronchus intermedius, and left proximal and distal bronchus. Mediastinal angles of tracheal bifurcation were measured. Data were analyzed by means of linear and polynomial regression plots. The strongest correlations were found between tracheal and bronchial diameters and age as well as between tracheal and bronchial lengths and body length. All measured airway parameters correlated poorly to body weight. Bronchial angles revealed no association with patient's age, body length, or weight. This comprehensive anatomical database of lower airway dimensions demonstrates that tracheal and bronchial diameters correlate better to age, and that tracheal and bronchial length correlate better to body length. All measured airway parameters correlated poorly to body weight. © 2017 John Wiley & Sons Ltd.

  5. The Impact of a Professional Learning Intervention Designed to Enhance Year Six Students' Computational Estimation Performance

    ERIC Educational Resources Information Center

    Mildenhall, Paula; Hackling, Mark

    2012-01-01

    This paper reports on the analysis of a study of a professional learning intervention focussing on computational estimation. Using a multiple case study design, it was possible to describe the impact of the intervention on students' beliefs and computational estimation performance. The study revealed some noteworthy impacts on computational…

  6. A case study of the effects of random errors in rawinsonde data on computations of ageostrophic winds

    NASA Technical Reports Server (NTRS)

    Moore, J. T.

    1985-01-01

    Data input for the AVE-SESAME I experiment are utilized to describe the effects of random errors in rawinsonde data on the computation of ageostrophic winds. Computer-generated random errors for wind direction and speed and temperature are introduced into the station soundings at 25 mb intervals from which isentropic data sets are created. Except for the isallobaric and the local wind tendency, all winds are computed for Apr. 10, 1979 at 2000 GMT. Divergence fields reveal that the isallobaric and inertial-geostrophic-advective divergences are less affected by rawinsonde random errors than the divergence of the local wind tendency or inertial-advective winds.
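    The error-propagation step can be sketched as follows: perturb each sounding level's wind speed and direction with Gaussian random errors, convert to u/v components, and measure the resulting vector-wind deviation. The error magnitudes and sounding values below are illustrative assumptions, not the AVE-SESAME I values:

```python
import math
import random

random.seed(42)  # reproducible illustrative run

def to_uv(speed, direction_deg):
    """u, v components; meteorological direction is where the wind blows FROM."""
    th = math.radians(direction_deg)
    return -speed * math.sin(th), -speed * math.cos(th)

# (speed m/s, direction deg) at successive levels; values are illustrative.
sounding = [(12.0, 270.0), (18.0, 250.0), (25.0, 240.0), (31.0, 235.0)]
speed_err, dir_err = 1.5, 5.0  # assumed RMS random errors, not from the study

sq = []
for s, d in sounding:
    u0, v0 = to_uv(s, d)
    u1, v1 = to_uv(s + random.gauss(0.0, speed_err),
                   d + random.gauss(0.0, dir_err))
    sq.append((u1 - u0) ** 2 + (v1 - v0) ** 2)

rms = math.sqrt(sum(sq) / len(sq))
print(f"RMS vector-wind error: {rms:.2f} m/s")
```

    Derived fields such as divergence difference these per-station winds and so amplify the errors, which is why the study finds some ageostrophic wind components much more error-sensitive than others.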

  7. Computer Self-Efficacy, Computer Anxiety, and Attitudes toward the Internet: A Study among Undergraduates in Unimas

    ERIC Educational Resources Information Center

    Sam, Hong Kian; Othman, Abang Ekhsan Abang; Nordin, Zaimuarifuddin Shukri

    2005-01-01

    Eighty-one female and sixty-seven male undergraduates at a Malaysian university, from seven faculties and a Center for Language Studies completed a Computer Self-Efficacy Scale, Computer Anxiety Scale, and an Attitudes toward the Internet Scale and give information about their use of the Internet. This survey research investigated undergraduates'…

  8. Need Assessment of Computer Science and Engineering Graduates

    NASA Astrophysics Data System (ADS)

    Surakka, Sami; Malmi, Lauri

    2005-06-01

    This case study considered the syllabus of the first and second year studies in computer science. The aim of the study was to reveal which topics covered in the syllabi were really needed during the following years of study or in working life. The program assessed in the study was a Master's program in computer science and engineering at a university of technology in Finland. The necessity of different subjects for the advanced studies (years 3-5) and for working life was assessed using four content analyses: (a) the course catalog of the institution where this study was carried out, (b) employment reports that were attached to the applications for internship credits, (c) master's theses, and (d) job advertisements in a newspaper. The results of the study imply that the necessity of physics for advanced study and work was very low compared to the extent to which it was studied. On the other hand, the necessity for mathematics was moderate, and it had remained quite steady during the period 1989-2002. The most necessary computer science topic was programming. Telecommunications and networking were also needed often, whereas theoretical computer science was needed quite rarely.

  9. Mechanical Influences on Morphogenesis of the Knee Joint Revealed through Morphological, Molecular and Computational Analysis of Immobilised Embryos

    PubMed Central

    Roddy, Karen A.; Prendergast, Patrick J.; Murphy, Paula

    2011-01-01

    Very little is known about the regulation of morphogenesis in synovial joints. Mechanical forces generated from muscle contractions are required for normal development of several aspects of normal skeletogenesis. Here we show that biophysical stimuli generated by muscle contractions impact multiple events during chick knee joint morphogenesis influencing differential growth of the skeletal rudiment epiphyses and patterning of the emerging tissues in the joint interzone. Immobilisation of chick embryos was achieved through treatment with the neuromuscular blocking agent Decamethonium Bromide. The effects on development of the knee joint were examined using a combination of computational modelling to predict alterations in biophysical stimuli, detailed morphometric analysis of 3D digital representations, cell proliferation assays and in situ hybridisation to examine the expression of a selected panel of genes known to regulate joint development. This work revealed the precise changes to shape, particularly in the distal femur, that occur in an altered mechanical environment, corresponding to predicted changes in the spatial and dynamic patterns of mechanical stimuli and region specific changes in cell proliferation rates. In addition, we show altered patterning of the emerging tissues of the joint interzone with the loss of clearly defined and organised cell territories revealed by loss of characteristic interzone gene expression and abnormal expression of cartilage markers. This work shows that local dynamic patterns of biophysical stimuli generated from muscle contractions in the embryo act as a source of positional information guiding patterning and morphogenesis of the developing knee joint. PMID:21386908

  10. NASA Computational Case Study: The Flight of Friendship 7

    NASA Technical Reports Server (NTRS)

    Simpson, David G.

    2012-01-01

    In this case study, we learn how to compute the position of an Earth-orbiting spacecraft as a function of time. As an exercise, we compute the position of John Glenn's Mercury spacecraft Friendship 7 as it orbited the Earth during the third flight of NASA's Mercury program.
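    For a quick sense of the exercise, a circular-orbit approximation already gives the in-plane position as a function of time via Kepler's third law. The altitude below is an assumption for illustration (Friendship 7's actual orbit was slightly elliptical):

```python
import math

MU = 3.986004418e14   # Earth's gravitational parameter, m^3 s^-2
R_EARTH = 6.371e6     # mean Earth radius, m

def circular_orbit_position(altitude_m, t_s):
    """(x, y) in the orbital plane at time t, assuming a circular orbit."""
    r = R_EARTH + altitude_m
    omega = math.sqrt(MU / r**3)  # mean motion from Kepler's third law
    return r * math.cos(omega * t_s), r * math.sin(omega * t_s)

# Orbital period at an assumed ~250 km altitude: a bit under 90 minutes.
r = R_EARTH + 250e3
period_min = 2.0 * math.pi * math.sqrt(r**3 / MU) / 60.0
print(f"period = {period_min:.1f} min")
```

    A more faithful treatment of an eccentric orbit would solve Kepler's equation M = E - e sin E for the eccentric anomaly instead of advancing a uniform angle.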

  11. Metabolism and development – integration of micro computed tomography data and metabolite profiling reveals metabolic reprogramming from floral initiation to silique development

    PubMed Central

    Bellaire, Anke; Ischebeck, Till; Staedler, Yannick; Weinhaeuser, Isabell; Mair, Andrea; Parameswaran, Sriram; Ito, Toshiro; Schönenberger, Jürg; Weckwerth, Wolfram

    2014-01-01

    The interrelationship of morphogenesis and metabolism is a poorly studied phenomenon. The main paradigm is that development is controlled by gene expression. The aim of the present study was to correlate metabolism to early and late stages of flower and fruit development in order to provide the basis for the identification of metabolic adjustment and limitations. A highly detailed picture of morphogenesis is achieved using nondestructive micro computed tomography. This technique was used to quantify morphometric parameters of early and late flower development in an Arabidopsis thaliana mutant with synchronized flower initiation. The synchronized flower phenotype made it possible to sample enough early floral tissue otherwise not accessible for metabolomic analysis. The integration of metabolomic and morphometric data enabled the correlation of metabolic signatures with the process of flower morphogenesis. These signatures changed significantly during development, indicating a pronounced metabolic reprogramming in the tissue. Distinct sets of metabolites involved in these processes were identified and were linked to the findings of previous gene expression studies of flower development. High correlations with basic leucine zipper (bZIP) transcription factors and nitrogen metabolism genes involved in the control of metabolic carbon : nitrogen partitioning were revealed. Based on these observations a model for metabolic adjustment during flower development is proposed. PMID:24350948

  12. Crew/computer communications study. Volume 2: Appendixes

    NASA Technical Reports Server (NTRS)

    Johannes, J. D.

    1974-01-01

    The software routines developed during the crew/computer communications study are described to provide the user with an understanding of each routine, any restrictions in use, the required input data, and expected results after executing the routines. The combination of routines to generate a crew/computer communications application is also explained. The programmable keyboard and display used by the program is described, and an experiment scenario is provided to illustrate the relationship between the program frames when they are grouped into activity phases. Program descriptions and a user's guide are also presented. For Vol. 1, see N74-18843.

  13. Acting without seeing: eye movements reveal visual processing without awareness.

    PubMed

    Spering, Miriam; Carrasco, Marisa

    2015-04-01

    Visual perception and eye movements are considered to be tightly linked. Diverse fields, ranging from developmental psychology to computer science, utilize eye tracking to measure visual perception. However, this prevailing view has been challenged by recent behavioral studies. Here, we review converging evidence revealing dissociations between the contents of perceptual awareness and different types of eye movement. Such dissociations reveal situations in which eye movements are sensitive to particular visual features that fail to modulate perceptual reports. We also discuss neurophysiological, neuroimaging, and clinical studies supporting the role of subcortical pathways for visual processing without awareness. Our review links awareness to perceptual-eye movement dissociations and furthers our understanding of the brain pathways underlying vision and movement with and without awareness. Copyright © 2015 Elsevier Ltd. All rights reserved.

  14. A mechanistic study and computational prediction of iron, cobalt and manganese cyclopentadienone complexes for hydrogenation of carbon dioxide.

    PubMed

    Ge, Hongyu; Chen, Xiangyang; Yang, Xinzheng

    2016-10-13

    A series of cobalt and manganese cyclopentadienone complexes are proposed and examined computationally as promising catalysts for hydrogenation of CO2 to formic acid, with total free energies as low as 20.0 kcal/mol in aqueous solution. A density functional theory study of the newly designed cobalt and manganese complexes and of experimentally reported iron cyclopentadienone complexes reveals a stepwise hydride transfer mechanism, with a water- or methanol-assisted proton transfer for the cleavage of H2 as the rate-determining step.

  15. Using NCLab-karel to improve computational thinking skill of junior high school students

    NASA Astrophysics Data System (ADS)

    Kusnendar, J.; Prabawa, H. W.

    2018-05-01

    Increasing human interaction with technology and the increasingly complex development of the digital world make computer science education an interesting theme to study. Previous studies on computer literacy and competency reveal that Indonesian teachers in general have fairly high computational skill, but they apply that skill to only a few applications, which results in limited and minimal computer-related learning for their students. On the other hand, computer science education is often considered unrelated to real-world solutions. This paper addresses the use of NCLab-Karel in shaping computational thinking in students; such computational thinking is believed to help students learn about technology. Implementation of Karel shows that it is able to increase student interest in studying computational material, especially algorithms. Observations made during the learning process also indicate the growth and development of a computational mindset in students.

  16. A visual study of computers on doctors' desks.

    PubMed

    Pearce, Christopher; Walker, Hannah; O'Shea, Carolyn

    2008-01-01

    General practice has rapidly computerised over the past ten years, thereby changing the nature of general practice rooms. Most general practice consulting rooms were designed and created in an era without computer hardware, establishing a pattern of work around maximising the doctor-patient relationship. General practitioners (GPs) and patients have had to integrate the computer into this environment. Twenty GPs allowed access to their rooms and consultations as part of a larger study. The results are based on an analysis of still shots of the consulting rooms. Analysis used dramaturgical methodology; thus the room is described as though it is the setting for a play. First, several desk areas were identified: a shared or patient area, a working area, a clinical area and an administrative area. Then, within that framework, we were able to identify two broad categories of setting, one inclusive of the patient and one exclusive. With the increasing significance of the computer in the three-way doctor-patient-computer relationship, an understanding of the social milieu in which the three players in the consultation interact (the staging) will inform further analysis of the interaction, and allow a framework for assessing the effects of different computer placements.

  17. Dynamical Approach Study of Spurious Numerics in Nonlinear Computations

    NASA Technical Reports Server (NTRS)

    Yee, H. C.; Mansour, Nagi (Technical Monitor)

    2002-01-01

    The last two decades have been an era when computation is ahead of analysis and when very large scale practical computations are increasingly used in poorly understood multiscale complex nonlinear physical problems and non-traditional fields. Ensuring a higher level of confidence in the predictability and reliability (PAR) of these numerical simulations could play a major role in furthering the design, understanding, affordability and safety of our next generation air and space transportation systems, and systems for planetary and atmospheric sciences, and in understanding the evolution and origin of life. The need to guarantee PAR becomes acute when computations offer the ONLY way of solving these types of data-limited problems. Employing theory from nonlinear dynamical systems, some building blocks to ensure a higher level of confidence in PAR of numerical simulations have been revealed by the author and world expert collaborators in relevant fields. Five building blocks with supporting numerical examples were discussed. The next step is to utilize knowledge gained by including nonlinear dynamics, bifurcation and chaos theories as an integral part of the numerical process. The third step is to design integrated criteria for reliable and accurate algorithms that cater to the different multiscale nonlinear physics. This includes but is not limited to the construction of appropriate adaptive spatial and temporal discretizations that are suitable for the underlying governing equations. In addition, a multiresolution wavelets approach for adaptive numerical dissipation/filter controls for high speed turbulence, acoustics and combustion simulations will be sought. These steps are cornerstones for guarding against spurious numerical solutions that are solutions of the discretized counterparts but are not solutions of the underlying governing equations.

  18. Academic Computing at Florida A&M University. A Case Study.

    ERIC Educational Resources Information Center

    Hunter, Beverly; Kearsley, Greg

    This case study is one of a series on academic computing at minority institutions which is designed to assist educators at other such institutions in identifying academic computing needs, establishing realistic goals, organizing a staff, and selecting materials. Following a brief description of the purpose and background of the overall study, the…

  19. High Performance Computer Cluster for Theoretical Studies of Roaming in Chemical Reactions

    DTIC Science & Technology

    2016-08-30

    A dedicated high-performance computer cluster was established for theoretical studies of roaming in chemical reactions. Sponsoring/monitoring agency: U.S. Army Research Office, P.O. Box 12211, Research Triangle Park, NC 27709-2211.

  20. Computer-Tutors and a Freshman Writer: A Protocol Study.

    ERIC Educational Resources Information Center

    Strickland, James

    Although there are many retrospective accounts from teachers and professional writers concerning the effect of computers on their writing, there are few real-time accounts of students struggling to simultaneously develop as writers and cope with computers. To fill this void in "testimonial data," a study examining talking-aloud protocols from a…

  1. Active medulloblastoma enhancers reveal subgroup-specific cellular origins

    PubMed Central

    Lin, Charles Y.; Erkek, Serap; Tong, Yiai; Yin, Linlin; Federation, Alexander J.; Zapatka, Marc; Haldipur, Parthiv; Kawauchi, Daisuke; Risch, Thomas; Warnatz, Hans-Jörg; Worst, Barbara C.; Ju, Bensheng; Orr, Brent A.; Zeid, Rhamy; Polaski, Donald R.; Segura-Wang, Maia; Waszak, Sebastian M.; Jones, David T.W.; Kool, Marcel; Hovestadt, Volker; Buchhalter, Ivo; Sieber, Laura; Johann, Pascal; Chavez, Lukas; Gröschel, Stefan; Ryzhova, Marina; Korshunov, Andrey; Chen, Wenbiao; Chizhikov, Victor V.; Millen, Kathleen J.; Amstislavskiy, Vyacheslav; Lehrach, Hans; Yaspo, Marie-Laure; Eils, Roland; Lichter, Peter; Korbel, Jan O.; Pfister, Stefan M.; Bradner, James E.; Northcott, Paul A.

    2016-01-01

    Summary Medulloblastoma is a highly malignant paediatric brain tumour, often inflicting devastating consequences on the developing child. Genomic studies have revealed four distinct molecular subgroups with divergent biology and clinical behaviour. An understanding of the regulatory circuitry governing the transcriptional landscapes of medulloblastoma subgroups, and how this relates to their respective developmental origins, is lacking. Using H3K27ac and BRD4 ChIP-Seq, coupled with tissue-matched DNA methylation and transcriptome data, we describe the active cis-regulatory landscape across 28 primary medulloblastoma specimens. Analysis of differentially regulated enhancers and super-enhancers reinforced inter-subgroup heterogeneity and revealed novel, clinically relevant insights into medulloblastoma biology. Computational reconstruction of core regulatory circuitry identified a master set of transcription factors, validated by ChIP-Seq, that are responsible for subgroup divergence and implicate candidate cells-of-origin for Group 4. Our integrated analysis of enhancer elements in a large series of primary tumour samples reveals insights into cis-regulatory architecture, unrecognized dependencies, and cellular origins. PMID:26814967

  2. Computers in Public Education Study.

    ERIC Educational Resources Information Center

    HBJ Enterprises, Highland Park, NJ.

    This survey conducted for the National Institute of Education reports the use of computers in U.S. public schools in the areas of instructional computing, student accounting, management of educational resources, research, guidance, testing, and library applications. From a stratified random sample of 1800 schools in varying geographic areas and…

  3. Self-Concept, Computer Anxiety, Gender and Attitude towards Interactive Computer Technologies: A Predictive Study among Nigerian Teachers

    ERIC Educational Resources Information Center

    Agbatogun, Alaba Olaoluwakotansibe

    2010-01-01

    Interactive Computer Technologies (ICTs) have crept into the education industry, dramatically transforming the instructional process. This study examined the relative and combined contributions of computer anxiety, self-concept and gender to teachers' attitude towards the use of ICT(s). 454 Nigerian teachers constituted the sample. Three…

  4. Study of the TRAC Airfoil Table Computational System

    NASA Technical Reports Server (NTRS)

    Hu, Hong

    1999-01-01

    The report documents the study of the application of the TRAC airfoil table computational package (TRACFOIL) to the prediction of 2D airfoil force and moment data over a wide range of angles of attack and Mach numbers. TRACFOIL generates the standard C-81 airfoil table for input into rotorcraft comprehensive codes such as CAMRAD. The existing TRACFOIL computer package is successfully modified to run on Digital Alpha workstations and on Cray C90 supercomputers. Step-by-step instructions for using the package on both computer platforms are provided. The newer version of TRACFOIL is applied to two airfoil sections. The C-81 data obtained using the TRACFOIL method are compared with wind-tunnel data, and results are presented.

  5. Spatial-temporal filter effect in a computer model study of ventricular fibrillation.

    PubMed

    Nowak, Claudia N; Fischer, Gerald; Wieser, Leonhard; Tilg, Bernhard; Neurauter, Andreas; Strohmenger, Hans U

    2008-08-01

    Prediction of countershock success from ventricular fibrillation (VF) ECG is a major challenge in critical care medicine. Recent findings indicate that stable, high frequency mother rotors are one possible mechanism maintaining VF. A computer model study was performed to investigate how epicardiac sources are reflected in the ECG. In the cardiac tissues of two computer models - a model with cubic geometry and a simplified torso model with a left ventricle - a mother rotor was induced by increasing the potassium rectifier current. On the epicardium, the dominant frequency (DF) map revealed a constant DF of 23 Hz (cubic model) and 24.4 Hz (torso model) in the region of the mother rotor, respectively. A sharp drop of frequency (3-18 Hz in the cubic model and 12.4-18 Hz in the torso model) occurred in the surrounding epicardial tissue of chaotic fibrillatory conduction. While no organized pattern was observable on the body surface of the cubic model, the mother rotor frequency can be identified in the anterior surface of the torso model because of the chosen position of the mother rotor in the ventricle (shortest distance to the body surface). Nevertheless, the DFs were damped on the body surfaces of both models (4.6-8.5 Hz in the cubic model and 14.4-16.4 Hz in the torso model). Thus, it was shown in this computer model study that wave propagation transforms the spatial low pass filtering of the thorax into a temporal low pass. In contrast to the resistive-capacitive low pass filter formed by the tissue, this spatial-temporal low pass filter becomes effective at low frequencies (tens of Hertz). This effect damps the high frequency components arising from the heart and it hampers a direct observation of rapid, organized sources of VF in the ECGs, when in an emergency case an artifact-free recording is not possible.
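Dominant-frequency maps like those described above are typically obtained by locating the largest spectral peak of each signal. A minimal numpy sketch (the sampling rate and the synthetic two-component signal are illustrative assumptions, not values from the study):

```python
import numpy as np

def dominant_frequency(signal, fs):
    """Return the frequency (Hz) of the largest spectral peak, DC excluded."""
    n = len(signal)
    spectrum = np.abs(np.fft.rfft(signal - np.mean(signal)))  # mean removal kills DC
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    return freqs[np.argmax(spectrum)]

fs = 1000.0                          # Hz, assumed sampling rate
t = np.arange(0, 2.0, 1.0 / fs)
# synthetic "mother rotor" component at 23 Hz plus a weaker 8 Hz far-field component
ecg = np.sin(2 * np.pi * 23 * t) + 0.4 * np.sin(2 * np.pi * 8 * t)
print(dominant_frequency(ecg, fs))   # prints 23.0
```

In the study's terms, the spatial-temporal low-pass effect means the 23 Hz rotor component would be strongly attenuated by the time the signal reaches the body surface, so the surface DF map can miss it.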

  6. High Performance Computing (HPC)-Enabled Computational Study on the Feasibility of using Shape Memory Alloys for Gas Turbine Blade Actuation

    DTIC Science & Technology

    2016-11-01

    A High Performance Computing (HPC)-enabled computational study on the feasibility of using shape memory alloys for gas turbine blade actuation, by Kathryn Esham, Luis Bravo, Anindya Ghoshal, Muthuvel Murugan, and Michael…

  7. Ambient belonging: how stereotypical cues impact gender participation in computer science.

    PubMed

    Cheryan, Sapna; Plaut, Victoria C; Davies, Paul G; Steele, Claude M

    2009-12-01

    People can make decisions to join a group based solely on exposure to that group's physical environment. Four studies demonstrate that the gender difference in interest in computer science is influenced by exposure to environments associated with computer scientists. In Study 1, simply changing the objects in a computer science classroom from those considered stereotypical of computer science (e.g., Star Trek poster, video games) to objects not considered stereotypical of computer science (e.g., nature poster, phone books) was sufficient to boost female undergraduates' interest in computer science to the level of their male peers. Further investigation revealed that the stereotypical objects broadcast a masculine stereotype that discouraged women's sense of ambient belonging and subsequent interest in the environment (Studies 2, 3, and 4) but had no similar effect on men (Studies 3, 4). This masculine stereotype prevented women's interest from developing even in environments entirely populated by other women (Study 2). Objects can thus come to broadcast stereotypes of a group, which in turn can deter people who do not identify with these stereotypes from joining that group.

  8. Mind the gap: an attempt to bridge computational and neuroscientific approaches to study creativity

    PubMed Central

    Wiggins, Geraint A.; Bhattacharya, Joydeep

    2014-01-01

    Creativity is the hallmark of human cognition and is behind every innovation, scientific discovery, piece of music, artwork, and idea that has shaped our lives, from ancient times till today. Yet scientific understanding of creative processes is quite limited, mostly due to the traditional belief that considers creativity a mysterious puzzle, a paradox, defying empirical enquiry. Recently, there has been an increasing interest in revealing the neural correlates of human creativity. Though many of these studies, pioneering in nature, help demystify creativity, the field is still dominated by popular beliefs associating creativity with “right brain thinking”, “divergent thinking”, “altered states” and so on (Dietrich and Kanso, 2010). In this article, we discuss a computational framework for creativity based on Baars’ Global Workspace Theory (GWT; Baars, 1988) enhanced with mechanisms based on information theory. Next we propose a neurocognitive architecture of creativity with a strong focus on various facets (i.e., unconscious thought theory, mind wandering, spontaneous brain states) of un/pre-conscious brain responses. Our principal argument is that pre-conscious creativity happens prior to conscious creativity and the proposed computational model may provide a mechanism by which this transition is managed. This integrative approach, albeit unconventional, will hopefully stimulate future neuroscientific studies of the inscrutable phenomenon of creativity. PMID:25104930

  9. Use of Computer-Based Case Studies in a Problem-Solving Curriculum.

    ERIC Educational Resources Information Center

    Haworth, Ian S.; And Others

    1997-01-01

    Describes the use of three case studies, on computer, to enhance problem solving and critical thinking among doctoral pharmacy students in a physical chemistry course. Students are expected to use specific computer programs, spreadsheets, electronic mail, molecular graphics, word processing, online literature searching, and other computer-based…

  10. Fluid Dynamics of Competitive Swimming: A Computational Study

    NASA Astrophysics Data System (ADS)

    Mittal, Rajat; Loebbeck, Alfred; Singh, Hersh; Mark, Russell; Wei, Timothy

    2004-11-01

    The dolphin kick is an important component in competitive swimming and is used extensively by swimmers immediately following the starting dive as well as after turns. In this stroke, the swimmer swims about three feet under the water surface and the stroke is executed by performing an undulating wave-like motion of the body that is quite similar to the anguilliform propulsion mode in fish. Despite the relatively simple kinematics of this stroke, considerable variability in style and performance is observed even among Olympic level swimmers. Motivated by this, a joint experimental-numerical study has been initiated to examine the fluid dynamics of this stroke. The current presentation will describe the computational portion of this study. The computations employ a sharp interface immersed boundary method (IBM) which allows us to simulate flows with complex moving boundaries on stationary Cartesian grids. 3D body scans of male and female Olympic swimmers have been obtained and these are used in conjunction with high speed videos to recreate a realistic dolphin kick for the IBM solver. Preliminary results from these computations will be presented.
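The anguilliform, undulating body motion mentioned above is commonly prescribed kinematically as a traveling wave whose lateral amplitude grows toward the tail. A minimal sketch of such a prescribed kinematic input (the amplitude envelope, wavenumber and frequency values here are assumed for illustration, not taken from the study):

```python
import numpy as np

def body_wave(x, t, L=1.0, A_max=0.1, k=2 * np.pi, omega=2 * np.pi):
    """Lateral displacement y(x, t) of an anguilliform traveling wave.

    x runs from head (0) to tail (L); the linear amplitude envelope
    (a common kinematic assumption) makes the undulation grow tailward.
    """
    A = A_max * (0.2 + 0.8 * x / L)   # assumed linear amplitude envelope
    return A * np.sin(k * x - omega * t)

x = np.linspace(0.0, 1.0, 101)
y0 = body_wave(x, t=0.0)   # body centerline shape at t = 0
```

A solver such as the IBM code described above would take snapshots of this centerline (mapped onto the scanned 3D body) as moving-boundary input at each time step.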

  11. Computer Game Theories for Designing Motivating Educational Software: A Survey Study

    ERIC Educational Resources Information Center

    Ang, Chee Siang; Rao, G. S. V. Radha Krishna

    2008-01-01

    The purpose of this study is to evaluate computer game theories for educational software. We propose a framework for designing engaging educational games based on contemporary game studies which includes ludology and narratology. Ludology focuses on the study of computer games as play and game activities, while narratology revolves around the…

  12. Computer Simulation for Pain Management Education: A Pilot Study.

    PubMed

    Allred, Kelly; Gerardi, Nicole

    2017-10-01

    Effective pain management is an elusive concept in acute care. Inadequate knowledge has been identified as a barrier to providing optimal pain management. This study aimed to determine student perceptions of an interactive computer simulation as a potential method for learning pain management, as a motivator to read and learn more about pain management, preference over traditional lecture, and its potential to change nursing practice. A post-computer simulation survey with a mixed-methods descriptive design was used in this study. A college of nursing in a large metropolitan university in the Southeast United States. A convenience sample of 30 nursing students in a Bachelor of Science nursing program. An interactive computer simulation was developed as a potential alternative method of teaching pain management to nursing students. Increases in educational gain as well as its potential to change practice were explored. Each participant was asked to complete a survey consisting of 10 standard 5-point Likert scale items and 5 open-ended questions. The survey was used to evaluate the students' perception of the simulation, specifically related to educational benefit, preference compared with traditional teaching methods, and perceived potential to change nursing practice. Data provided descriptive statistics for initial evaluation of the computer simulation. The responses on the survey suggest nursing students perceive the computer simulation to be entertaining, fun, educational, occasionally preferred over regular lecture, and with potential to change practice. Preliminary data support the use of computer simulation in educating nursing students about pain management.

  13. Computing by physical interaction in neurons.

    PubMed

    Aur, Dorian; Jog, Mandar; Poznanski, Roman R

    2011-12-01

    The electrodynamics of action potentials represents the fundamental level where information is integrated and processed in neurons. The Hodgkin-Huxley model cannot explain the non-stereotyped spatial charge density dynamics that occur during action potential propagation. Revealed in experiments as spike directivity, the non-uniform charge density dynamics within neurons carry meaningful information and suggest that fragments of information regarding our memories are endogenously stored in structural patterns at a molecular level and are revealed only during spiking activity. The main conceptual idea is that under the influence of electric fields, efficient computation by interaction occurs between charge densities embedded within molecular structures and the transient developed flow of electrical charges. This process of computation underlying electrical interactions and molecular mechanisms at the subcellular level is dissimilar from spiking neuron models that are completely devoid of physical interactions. Computation by interaction describes a more powerful continuous model of computation than the one that consists of discrete steps as represented in Turing machines.

  14. Application of CT-PSF-based computer-simulated lung nodules for evaluating the accuracy of computer-aided volumetry.

    PubMed

    Funaki, Ayumu; Ohkubo, Masaki; Wada, Shinichi; Murao, Kohei; Matsumoto, Toru; Niizuma, Shinji

    2012-07-01

    With the wide dissemination of computed tomography (CT) screening for lung cancer, measuring the nodule volume accurately with computer-aided volumetry software is increasingly important. Many studies for determining the accuracy of volumetry software have been performed using a phantom with artificial nodules. These phantom studies are limited, however, in their ability to reproduce the nodules both accurately and in the variety of sizes and densities required. Therefore, we propose a new approach of using computer-simulated nodules based on the point spread function measured in a CT system. The validity of the proposed method was confirmed by the excellent agreement obtained between computer-simulated nodules and phantom nodules regarding the volume measurements. A practical clinical evaluation of the accuracy of volumetry software was achieved by adding simulated nodules onto clinical lung images, including noise and artifacts. The tested volumetry software was revealed to be accurate to within 20 % for nodules >5 mm when the CT-value difference between nodule density and background (lung) was 400-600 HU. Such a detailed analysis can provide clinically useful information on the use of volumetry software in CT screening for lung cancer. We concluded that the proposed method is effective for evaluating the performance of computer-aided volumetry software.
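The idea of PSF-based simulated nodules can be sketched as follows: an ideal sphere of known density is blurred by the system PSF and then handed to the volumetry step. In this toy version an isotropic Gaussian stands in for the measured CT PSF, and all sizes, densities, and the half-maximum thresholding rule are illustrative assumptions rather than the study's actual procedure:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def simulate_nodule(shape=(64, 64, 64), radius_mm=4.0, voxel_mm=0.5,
                    density_hu=100.0, background_hu=-800.0, psf_sigma_mm=0.6):
    """Ideal spherical nodule blurred by an isotropic Gaussian stand-in for the CT PSF."""
    z, y, x = np.indices(shape)
    c = (np.array(shape) - 1) / 2.0
    r = np.sqrt((z - c[0])**2 + (y - c[1])**2 + (x - c[2])**2) * voxel_mm
    ideal = np.where(r <= radius_mm, density_hu, background_hu)
    return gaussian_filter(ideal.astype(float), sigma=psf_sigma_mm / voxel_mm)

vol = simulate_nodule()
# naive volumetry: threshold halfway between nodule and background density
thresh = 0.5 * (100.0 + (-800.0))
measured = np.count_nonzero(vol > thresh) * 0.5**3    # mm^3
true = 4.0 / 3.0 * np.pi * 4.0**3                     # ideal sphere volume, mm^3
```

Comparing `measured` against `true` for nodules of varying size, density, and blur is exactly the kind of controlled accuracy test the simulated-nodule approach enables; in practice the measured PSF, plus noise and artifacts from real scans, would replace the Gaussian and the clean background used here.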

  15. Developing Computer Model-Based Assessment of Chemical Reasoning: A Feasibility Study

    ERIC Educational Resources Information Center

    Liu, Xiufeng; Waight, Noemi; Gregorius, Roberto; Smith, Erica; Park, Mihwa

    2012-01-01

    This paper reports a feasibility study on developing computer model-based assessments of chemical reasoning at the high school level. Computer models are flash and NetLogo environments to make simultaneously available three domains in chemistry: macroscopic, submicroscopic, and symbolic. Students interact with computer models to answer assessment…

  16. GeoBrain Computational Cyber-laboratory for Earth Science Studies

    NASA Astrophysics Data System (ADS)

    Deng, M.; di, L.

    2009-12-01

    Computational approaches (e.g., computer-based data visualization, analysis and modeling) are critical for conducting increasingly data-intensive Earth science (ES) studies to understand functions and changes of the Earth system. However, currently Earth scientists, educators, and students have met two major barriers that prevent them from effectively using computational approaches in their learning, research and application activities. The two barriers are: 1) difficulties in finding, obtaining, and using multi-source ES data; and 2) lack of analytic functions and computing resources (e.g., analysis software, computing models, and high performance computing systems) to analyze the data. Taking advantage of recent advances in cyberinfrastructure, Web service, and geospatial interoperability technologies, GeoBrain, a project funded by NASA, has developed a prototype computational cyber-laboratory to effectively remove the two barriers. The cyber-laboratory makes ES data and computational resources at large organizations in distributed locations available to and easily usable by the Earth science community through 1) enabling seamless discovery, access and retrieval of distributed data, 2) federating and enhancing data discovery with a catalogue federation service and a semantically-augmented catalogue service, 3) customizing data access and retrieval at user request with interoperable, personalized, and on-demand data access and services, 4) automating or semi-automating multi-source geospatial data integration, 5) developing a large number of analytic functions as value-added, interoperable, and dynamically chainable geospatial Web services and deploying them in high-performance computing facilities, 6) enabling the online geospatial process modeling and execution, and 7) building a user-friendly extensible web portal for users to access the cyber-laboratory resources.
Users can interactively discover the needed data and perform on-demand data analysis and…

  17. Predicting Computer Science Ph.D. Completion: A Case Study

    ERIC Educational Resources Information Center

    Cox, G. W.; Hughes, W. E., Jr.; Etzkorn, L. H.; Weisskopf, M. E.

    2009-01-01

    This paper presents the results of an analysis of indicators that can be used to predict whether a student will succeed in a Computer Science Ph.D. program. The analysis was conducted by studying the records of 75 students who have been in the Computer Science Ph.D. program of the University of Alabama in Huntsville. Seventy-seven variables were…

  18. Changes in bone macro- and microstructure in diabetic obese mice revealed by high resolution microfocus X-ray computed tomography

    PubMed Central

    Kerckhofs, G.; Durand, M.; Vangoitsenhoven, R.; Marin, C.; Van der Schueren, B.; Carmeliet, G.; Luyten, F. P.; Geris, L.; Vandamme, K.

    2016-01-01

    High resolution microfocus X-ray computed tomography (HR-microCT) was employed to characterize the structural alterations of the cortical and trabecular bone in a mouse model of obesity-driven type 2 diabetes (T2DM). C57Bl/6J mice were randomly assigned for 14 weeks to either a control diet-fed (CTRL) or a high fat diet (HFD)-fed group developing obesity, hyperglycaemia and insulin resistance. The HFD group showed an increased trabecular thickness and a decreased trabecular number compared to CTRL animals. Midshaft tibia intracortical porosity was assessed at two spatial image resolutions. At 2 μm scale, no change was observed in the intracortical structure. At 1 μm scale, a decrease in the cortical vascular porosity of the HFD bone was evidenced. The study of a group of 8-week-old animals corresponding to animals at the start of the diet challenge revealed that the decreased vascular porosity was T2DM-dependent and not related to the ageing process. Our results offer an unprecedented ultra-characterization of the T2DM compromised skeletal micro-architecture and highlight a previously unreported T2DM-related decrease in the cortical vascular porosity, potentially affecting the bone health and fragility. Additionally, it provides some insights into the technical challenge facing the assessment of the rodent bone structure using HR-microCT imaging. PMID:27759061

  19. Changes in bone macro- and microstructure in diabetic obese mice revealed by high resolution microfocus X-ray computed tomography

    NASA Astrophysics Data System (ADS)

    Kerckhofs, G.; Durand, M.; Vangoitsenhoven, R.; Marin, C.; van der Schueren, B.; Carmeliet, G.; Luyten, F. P.; Geris, L.; Vandamme, K.

    2016-10-01

    High resolution microfocus X-ray computed tomography (HR-microCT) was employed to characterize the structural alterations of the cortical and trabecular bone in a mouse model of obesity-driven type 2 diabetes (T2DM). C57Bl/6J mice were randomly assigned for 14 weeks to either a control diet-fed (CTRL) or a high fat diet (HFD)-fed group developing obesity, hyperglycaemia and insulin resistance. The HFD group showed an increased trabecular thickness and a decreased trabecular number compared to CTRL animals. Midshaft tibia intracortical porosity was assessed at two spatial image resolutions. At 2 μm scale, no change was observed in the intracortical structure. At 1 μm scale, a decrease in the cortical vascular porosity of the HFD bone was evidenced. The study of a group of 8-week-old animals corresponding to animals at the start of the diet challenge revealed that the decreased vascular porosity was T2DM-dependent and not related to the ageing process. Our results offer an unprecedented ultra-characterization of the T2DM compromised skeletal micro-architecture and highlight a previously unreported T2DM-related decrease in the cortical vascular porosity, potentially affecting the bone health and fragility. Additionally, it provides some insights into the technical challenge facing the assessment of the rodent bone structure using HR-microCT imaging.

  20. Adaptation to High Ethanol Reveals Complex Evolutionary Pathways

    PubMed Central

    Das, Anupam; Espinosa-Cantú, Adriana; De Maeyer, Dries; Arslan, Ahmed; Van Pee, Michiel; van der Zande, Elisa; Meert, Wim; Yang, Yudi; Zhu, Bo; Marchal, Kathleen; DeLuna, Alexander; Van Noort, Vera; Jelier, Rob; Verstrepen, Kevin J.

    2015-01-01

    Tolerance to high levels of ethanol is an ecologically and industrially relevant phenotype of microbes, but the molecular mechanisms underlying this complex trait remain largely unknown. Here, we use long-term experimental evolution of isogenic yeast populations of different initial ploidy to study adaptation to increasing levels of ethanol. Whole-genome sequencing of more than 30 evolved populations and over 100 adapted clones isolated throughout this two-year evolution experiment revealed how a complex interplay of de novo single nucleotide mutations, copy number variation, ploidy changes, mutator phenotypes, and clonal interference led to a significant increase in ethanol tolerance. Although the specific mutations differ between different evolved lineages, application of a novel computational pipeline, PheNetic, revealed that many mutations target functional modules involved in stress response, cell cycle regulation, DNA repair and respiration. Measuring the fitness effects of selected mutations introduced in non-evolved ethanol-sensitive cells revealed several adaptive mutations that had previously not been implicated in ethanol tolerance, including mutations in PRT1, VPS70 and MEX67. Interestingly, variation in VPS70 was recently identified as a QTL for ethanol tolerance in an industrial bio-ethanol strain. Taken together, our results show how, in contrast to adaptation to some other stresses, adaptation to a continuous complex and severe stress involves interplay of different evolutionary mechanisms. In addition, our study reveals functional modules involved in ethanol resistance and identifies several mutations that could help to improve the ethanol tolerance of industrial yeasts. PMID:26545090

  1. Phosphate-Catalyzed Succinimide Formation from Asp Residues: A Computational Study of the Mechanism.

    PubMed

    Kirikoshi, Ryota; Manabe, Noriyoshi; Takahashi, Ohgi

    2018-02-24

    Aspartic acid (Asp) residues in proteins and peptides are prone to non-enzymatic reactions that give biologically uncommon l-β-Asp, d-Asp, and d-β-Asp residues via the cyclic succinimide intermediate (aminosuccinyl residue, Suc). These abnormal Asp residues are known to be relevant to aging and pathologies. Despite being non-enzymatic, the Suc formation is thought to require a catalyst under physiological conditions. In this study, we computationally investigated the mechanism of the Suc formation from Asp residues catalyzed by the dihydrogen phosphate ion, H₂PO₄⁻. We used Ac-l-Asp-NHMe (Ac = acetyl, NHMe = methylamino) as a model compound. The H₂PO₄⁻ ion (as a catalyst) and two explicit water molecules (as solvent molecules stabilizing the negative charge) were included in the calculations. All of the calculations were performed by density functional theory with the B3LYP functional. We revealed a phosphate-catalyzed two-step mechanism (cyclization-dehydration) of the Suc formation, where the first step is predicted to be rate-determining. In both steps, the reaction involves a proton relay mediated by the H₂PO₄⁻ ion. The calculated activation barrier for this mechanism (100.3 kJ mol⁻¹) is in reasonable agreement with an experimental activation energy (107 kJ mol⁻¹) for the Suc formation from an Asp-containing peptide in a phosphate buffer, supporting the catalytic mechanism of the H₂PO₄⁻ ion revealed in this study.
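To get a feel for what a 6.7 kJ mol⁻¹ barrier discrepancy means kinetically, one can compare the rate constants the two values imply. The quick estimate below uses the Eyring equation at an assumed physiological temperature; treating the experimental Arrhenius activation energy as if it were a free-energy barrier is itself a rough approximation, so this is an order-of-magnitude check, not part of the study:

```python
import math

R = 8.314462618e-3        # gas constant, kJ mol^-1 K^-1
KB_OVER_H = 2.0836619e10  # Boltzmann/Planck constant ratio, s^-1 K^-1

def eyring_rate(barrier_kj_mol, temp_k):
    """Eyring rate constant k = (k_B T / h) exp(-dG/(R T)), transmission coeff. = 1."""
    return KB_OVER_H * temp_k * math.exp(-barrier_kj_mol / (R * temp_k))

T = 310.15  # assumed physiological temperature, K
ratio = eyring_rate(100.3, T) / eyring_rate(107.0, T)
# the 6.7 kJ/mol difference translates to roughly a factor of 13 in rate at 310 K
```

A factor of about 13 is well within the usual tolerance for comparing B3LYP barriers against experiment, which is why the 100.3 vs. 107 kJ mol⁻¹ agreement is described as reasonable.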

  2. Effect of Computer-Based Video Games on Children: An Experimental Study

    ERIC Educational Resources Information Center

    Chuang, Tsung-Yen; Chen, Wei-Fan

    2009-01-01

    This experimental study investigated whether computer-based video games facilitate children's cognitive learning. In comparison to traditional computer-assisted instruction (CAI), this study explored the impact of the varied types of instructional delivery strategies on children's learning achievement. One major research null hypothesis was…

  3. Novel Polyurethane Matrix Systems Reveal a Particular Sustained Release Behavior Studied by Imaging and Computational Modeling.

    PubMed

    Campiñez, María Dolores; Caraballo, Isidoro; Puchkov, Maxim; Kuentz, Martin

    2017-07-01

    The aim of the present work was to better understand the drug-release mechanism from sustained release matrices prepared with two new polyurethanes, using a novel in silico formulation tool based on 3-dimensional cellular automata. For this purpose, two polymers and theophylline as model drug were used to prepare binary matrix tablets. Each formulation was simulated in silico, and its release behavior was compared to the experimental drug release profiles. Furthermore, the polymer distributions in the tablets were imaged by scanning electron microscopy (SEM) and the changes produced by the tortuosity were quantified and verified using experimental data. The obtained results showed that the polymers exhibited a surprisingly high ability for controlling drug release at low excipient concentrations (only 10% w/w of excipient controlled the release of drug during almost 8 h). The mesoscopic in silico model helped to reveal how the novel biopolymers were controlling drug release. The mechanism was found to be a special geometrical arrangement of the excipient particles, creating an almost continuous barrier surrounding the drug in a very effective way, comparable to lipid or waxy excipients but with the advantages of a much higher compactability, stability, and absence of excipient polymorphism.
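The cellular-automata view of release can be illustrated with a deliberately simplified 2-D toy model (not the 3-dimensional mesoscopic model used in the study, and all parameters here are assumed): drug cells dissolve on contact with solvent, insoluble excipient cells never do, and even a modest excipient fraction lengthens the solvent paths (tortuosity) and delays release.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_release(n=60, excipient_frac=0.10, steps=200):
    """Toy 2-D cellular automaton: drug (1) dissolves when it touches solvent (0);
    insoluble excipient (2) never dissolves, so it slows the advancing solvent front."""
    grid = np.ones((n, n), dtype=int)
    grid[rng.random((n, n)) < excipient_frac] = 2
    total_drug = np.count_nonzero(grid == 1)
    released = []
    for _ in range(steps):
        # pad with solvent: the tablet is immersed, so the border always dissolves
        padded = np.pad(grid, 1, constant_values=0)
        near_solvent = ((padded[:-2, 1:-1] == 0) | (padded[2:, 1:-1] == 0) |
                        (padded[1:-1, :-2] == 0) | (padded[1:-1, 2:] == 0))
        grid[(grid == 1) & near_solvent] = 0
        released.append(1 - np.count_nonzero(grid == 1) / total_drug)
    return np.array(released)

fast = simulate_release(excipient_frac=0.0)   # no excipient: front sweeps straight in
slow = simulate_release(excipient_frac=0.25)  # excipient forces detours, delaying release
```

Even this crude rule reproduces the qualitative finding above: the insoluble excipient particles act as a geometric barrier around the drug, so a small excipient fraction disproportionately retards the cumulative release curve.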

  4. A Trans-omics Mathematical Analysis Reveals Novel Functions of the Ornithine Metabolic Pathway in Cancer Stem Cells

    NASA Astrophysics Data System (ADS)

    Koseki, Jun; Matsui, Hidetoshi; Konno, Masamitsu; Nishida, Naohiro; Kawamoto, Koichi; Kano, Yoshihiro; Mori, Masaki; Doki, Yuichiro; Ishii, Hideshi

    2016-02-01

    Bioinformatics and computational modelling are expected to offer innovative approaches in human medical science. In the present study, we performed computational analyses and made predictions using transcriptome and metabolome datasets obtained from fluorescence-based visualisations of chemotherapy-resistant cancer stem cells (CSCs) in the human oesophagus. This approach revealed an uncharacterized role for the ornithine metabolic pathway in the survival of chemotherapy-resistant CSCs. The present study strengthens the rationale for further characterisation that may lead to the discovery of innovative drugs against robust CSCs.

  5. Handheld computers in critical care.

    PubMed

    Lapinsky, S E; Weshler, J; Mehta, S; Varkul, M; Hallett, D; Stewart, T E

    2001-08-01

    Computing technology has the potential to improve health care management but is often underutilized. Handheld computers are versatile and relatively inexpensive, bringing the benefits of computers to the bedside. We evaluated the role of this technology for managing patient data and accessing medical reference information, in an academic intensive-care unit (ICU). Palm III series handheld devices were given to the ICU team, each installed with medical reference information, schedules, and contact numbers. Users underwent a 1-hour training session introducing the hardware and software. Various patient data management applications were assessed during the study period. Qualitative assessment of the benefits, drawbacks, and suggestions was performed by an independent company, using focus groups. An objective comparison between a paper and electronic handheld textbook was achieved using clinical scenario tests. During the 6-month study period, the 20 physicians and 6 paramedical staff who used the handheld devices found them convenient and functional but suggested more comprehensive training and improved search facilities. Comparison of the handheld computer with the conventional paper text revealed equivalence. Access to computerized patient information improved communication, particularly with regard to long-stay patients, but changes to the software and the process were suggested. The introduction of this technology was well received despite differences in users' familiarity with the devices. Handheld computers have potential in the ICU, but systems need to be developed specifically for the critical-care environment.

  6. Handheld computers in critical care

    PubMed Central

    Lapinsky, Stephen E; Weshler, Jason; Mehta, Sangeeta; Varkul, Mark; Hallett, Dave; Stewart, Thomas E

    2001-01-01

    Background Computing technology has the potential to improve health care management but is often underutilized. Handheld computers are versatile and relatively inexpensive, bringing the benefits of computers to the bedside. We evaluated the role of this technology for managing patient data and accessing medical reference information, in an academic intensive-care unit (ICU). Methods Palm III series handheld devices were given to the ICU team, each installed with medical reference information, schedules, and contact numbers. Users underwent a 1-hour training session introducing the hardware and software. Various patient data management applications were assessed during the study period. Qualitative assessment of the benefits, drawbacks, and suggestions was performed by an independent company, using focus groups. An objective comparison between a paper and electronic handheld textbook was achieved using clinical scenario tests. Results During the 6-month study period, the 20 physicians and 6 paramedical staff who used the handheld devices found them convenient and functional but suggested more comprehensive training and improved search facilities. Comparison of the handheld computer with the conventional paper text revealed equivalence. Access to computerized patient information improved communication, particularly with regard to long-stay patients, but changes to the software and the process were suggested. Conclusions The introduction of this technology was well received despite differences in users' familiarity with the devices. Handheld computers have potential in the ICU, but systems need to be developed specifically for the critical-care environment. PMID:11511337

  7. [Results of the marketing research study "Acceptance of physician's office computer systems"].

    PubMed

    Steinhausen, D; Brinkmann, F; Engelhard, A

    1998-01-01

    We report on a market research study of the acceptance of computer systems in physicians' offices. We analysed 11,000 returned questionnaires from physicians, both users and nonusers. Most physicians used their computers in a limited way, i.e. as a device for accounting. The level of utilisation differed between men and women, West and East, and young and old. We also analysed the computer usage behaviour of gynaecologic physicians. Two thirds of all nonusers do not intend to use a computer in the future.

  8. Computer simulation for integrated pest management of spruce budworms

    Treesearch

    Carroll B. Williams; Patrick J. Shea

    1982-01-01

    Some field studies of the effects of various insecticides on the spruce budworm (Choristoneura sp.) and their parasites have shown severe suppression of host (budworm) populations and increased parasitism after treatment. Computer simulation using hypothetical models of spruce budworm-parasite systems based on these field data revealed that (1)...

  9. Structure-preserving and rank-revealing QR-factorizations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bischof, C.H.; Hansen, P.C.

    1991-11-01

    The rank-revealing QR-factorization (RRQR-factorization) is a special QR-factorization that is guaranteed to reveal the numerical rank of the matrix under consideration. This makes the RRQR-factorization a useful tool in the numerical treatment of many rank-deficient problems in numerical linear algebra. In this paper, a framework is presented for the efficient implementation of RRQR algorithms, in particular, for sparse matrices. A sparse RRQR-algorithm should seek to preserve the structure and sparsity of the matrix as much as possible while retaining the ability to capture safely the numerical rank. To this end, the paper proposes to compute an initial QR-factorization using a restricted pivoting strategy guarded by incremental condition estimation (ICE), and then applies the algorithm suggested by Chan and Foster to this QR-factorization. The column exchange strategy used in the initial QR factorization will exploit the fact that certain column exchanges do not change the sparsity structure, and compute a sparse QR-factorization that is a good approximation of the sought-after RRQR-factorization. Due to quantities produced by ICE, the Chan/Foster RRQR algorithm can be implemented very cheaply, thus verifying that the sought-after RRQR-factorization has indeed been computed. Experimental results on a model problem show that the initial QR-factorization is indeed very likely to produce an RRQR-factorization.
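
    The starting point for such algorithms is the column-pivoted QR factorization, whose R diagonal exposes the numerical rank. A minimal sketch in Python with SciPy (the basic pivoted QR, not the restricted-pivoting/ICE scheme of the paper):

```python
import numpy as np
from scipy.linalg import qr

rng = np.random.default_rng(0)
# A 50x40 matrix of numerical rank 10 (product of thin random factors).
A = rng.standard_normal((50, 10)) @ rng.standard_normal((10, 40))

# QR with column pivoting: A[:, piv] = Q @ R, with a non-increasing
# diagonal of R.
Q, R, piv = qr(A, pivoting=True)

# The numerical rank is revealed by the sharp drop in R's diagonal.
diag = np.abs(np.diag(R))
rank = int(np.sum(diag > diag[0] * 1e-10))
print(rank)  # 10
```

    A guaranteed RRQR method additionally post-processes this factorization (as in the Chan/Foster step above) to certify that no rank-revealing column exchange was missed.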

  10. Evaluating Computer Screen Time and Its Possible Link to Psychopathology in the Context of Age: A Cross-Sectional Study of Parents and Children

    PubMed Central

    Ross, Sharon; Silman, Zmira; Maoz, Hagai; Bloch, Yuval

    2015-01-01

    Background Several studies have suggested that high levels of computer use are linked to psychopathology. However, there is ambiguity about what should be considered normal or over-use of computers. Furthermore, the nature of the link between computer usage and psychopathology is controversial. The current study utilized the context of age to address these questions. Our hypothesis was that the context of age will be paramount for differentiating normal from excessive use, and that this context will allow a better understanding of the link to psychopathology. Methods In a cross-sectional study, 185 parents and children aged 3–18 years were recruited in clinical and community settings. They were asked to fill out questionnaires regarding demographics, functional and academic variables, computer use as well as psychiatric screening questionnaires. Using a regression model, we identified 3 groups of normal-use, over-use and under-use and examined known factors as putative differentiators between the over-users and the other groups. Results After modeling computer screen time according to age, factors linked to over-use were: decreased socialization (OR 3.24, Confidence interval [CI] 1.23–8.55, p = 0.018), difficulty to disengage from the computer (OR 1.56, CI 1.07–2.28, p = 0.022) and age, though borderline-significant (OR 1.1 each year, CI 0.99–1.22, p = 0.058). While psychopathology was not linked to over-use, post-hoc analysis revealed that the link between increased computer screen time and psychopathology was age-dependent and solidified as age progressed (p = 0.007). Unlike computer usage, the use of small-screens and smartphones was not associated with psychopathology. Conclusions The results suggest that computer screen time follows an age-based course. We conclude that differentiating normal from over-use as well as defining over-use as a possible marker for psychiatric difficulties must be performed within the context of age. If verified by

  11. Evaluating Computer Screen Time and Its Possible Link to Psychopathology in the Context of Age: A Cross-Sectional Study of Parents and Children.

    PubMed

    Segev, Aviv; Mimouni-Bloch, Aviva; Ross, Sharon; Silman, Zmira; Maoz, Hagai; Bloch, Yuval

    2015-01-01

    Several studies have suggested that high levels of computer use are linked to psychopathology. However, there is ambiguity about what should be considered normal or over-use of computers. Furthermore, the nature of the link between computer usage and psychopathology is controversial. The current study utilized the context of age to address these questions. Our hypothesis was that the context of age will be paramount for differentiating normal from excessive use, and that this context will allow a better understanding of the link to psychopathology. In a cross-sectional study, 185 parents and children aged 3-18 years were recruited in clinical and community settings. They were asked to fill out questionnaires regarding demographics, functional and academic variables, computer use as well as psychiatric screening questionnaires. Using a regression model, we identified 3 groups of normal-use, over-use and under-use and examined known factors as putative differentiators between the over-users and the other groups. After modeling computer screen time according to age, factors linked to over-use were: decreased socialization (OR 3.24, Confidence interval [CI] 1.23-8.55, p = 0.018), difficulty to disengage from the computer (OR 1.56, CI 1.07-2.28, p = 0.022) and age, though borderline-significant (OR 1.1 each year, CI 0.99-1.22, p = 0.058). While psychopathology was not linked to over-use, post-hoc analysis revealed that the link between increased computer screen time and psychopathology was age-dependent and solidified as age progressed (p = 0.007). Unlike computer usage, the use of small-screens and smartphones was not associated with psychopathology. The results suggest that computer screen time follows an age-based course. We conclude that differentiating normal from over-use as well as defining over-use as a possible marker for psychiatric difficulties must be performed within the context of age. If verified by additional studies, future research should integrate
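
    For readers unfamiliar with the reported statistics, an odds ratio and its Wald confidence interval follow directly from a logistic-regression coefficient and its standard error. The beta and se below are back-calculated from the reported OR of 3.24 (CI 1.23-8.55), purely for illustration:

```python
import math

# Hypothetical logistic-regression output for "decreased socialization"
# predicting over-use: beta is the log odds ratio, se its standard error
# (values back-calculated from the reported OR, not taken from the data).
beta, se = 1.176, 0.495

odds_ratio = math.exp(beta)
ci_low = math.exp(beta - 1.96 * se)
ci_high = math.exp(beta + 1.96 * se)
print(f"OR {odds_ratio:.2f}, 95% CI {ci_low:.2f}-{ci_high:.2f}")
```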

  12. Trifluoperazine Regulation of Calmodulin Binding to Fas: A Computational Study

    PubMed Central

    Pan, Di; Yan, Qi; Chen, Yabing; McDonald, Jay M; Song, Yuhua

    2011-01-01

    Death-inducing signaling complex (DISC) formation is a critical step in Fas-mediated signaling for apoptosis. Previous experiments have demonstrated that the calmodulin (CaM) antagonist, trifluoperazine (TFP) regulates CaM-Fas binding and affects Fas-mediated DISC formation. In this study, we investigated the anti-cooperative characteristics of TFP binding to CaM and the effect of TFP on the CaM-Fas interaction from both structural and thermodynamic perspectives using combined molecular dynamics simulations and binding free energy analyses. We studied the interactions of different numbers of TFP molecules with CaM and explored the effects of the resulting conformational changes in CaM on CaM-Fas binding. Results from these analyses showed that the number of TFP molecules bound to CaM directly influenced α-helix formation and hydrogen bond occupancy within the α-helices of CaM, contributing to the conformational and motion changes in CaM. These changes affected CaM binding to Fas, resulting in secondary structural changes in Fas and conformational and motion changes of Fas in CaM-Fas complexes, potentially perturbing the recruitment of Fas-associated death domain (FADD) for DISC formation. The computational results from this study reveal the structural and molecular mechanisms that underlie the role of the CaM antagonist, TFP, in regulation of CaM-Fas binding and Fas-mediated DISC formation in a concentration-dependent manner. PMID:21656570

  13. Junior High Computer Studies: Teacher Resource Manual.

    ERIC Educational Resources Information Center

    Alberta Dept. of Education, Edmonton. Curriculum Branch.

    This manual is designed to help classroom teachers in Alberta, Canada implement the Junior High Computer Studies Program. The first eight sections cover the following material: (1) introduction to the teacher resource manual; (2) program rationale and philosophy; (3) general learner expectations; (4) program framework and flexibility; (5) program…

  14. Spectroscopic and computational studies of ionic clusters as models of solvation and atmospheric reactions

    NASA Astrophysics Data System (ADS)

    Kuwata, Keith T.

    Ionic clusters are useful as model systems for the study of fundamental processes in solution and in the atmosphere. Their structure and reactivity can be studied in detail using vibrational predissociation spectroscopy, in conjunction with high level ab initio calculations. This thesis presents the applications of infrared spectroscopy and computation to a variety of gas-phase cluster systems. A crucial component of the process of stratospheric ozone depletion is the action of polar stratospheric clouds (PSCs) to convert the reservoir species HCl and chlorine nitrate (ClONO2) to photochemically labile compounds. Quantum chemistry was used to explore one possible mechanism by which this activation is effected: Cl- + ClONO2 → Cl2 + NO3- (1). Correlated ab initio calculations predicted that the direct reaction of chloride ion with ClONO2 is facile, which was confirmed in an experimental kinetics study. In the reaction a weakly bound intermediate Cl2-NO3- is formed, with ~70% of the charge localized on the nitrate moiety. This enables the Cl2-NO3- cluster to be well solvated even in bulk solution, allowing (1) to be facile on PSCs. Quantum chemistry was also applied to the hydration of nitrosonium ion (NO+), an important process in the ionosphere. The calculations, in conjunction with an infrared spectroscopy experiment, revealed the structure of the gas-phase clusters NO+(H2O)n. The large degree of covalent interaction between NO+ and the lone pairs of the H2O ligands is contrasted with the weak electrostatic bonding between iodide ion and H2O. Finally, the competition between ion solvation and solvent self-association is explored for the gas-phase clusters Cl-(H2O)n and Cl-(NH3)n. For the case of water, vibrational predissociation spectroscopy reveals less hydrogen bonding among H2O ligands than predicted by ab initio calculations. Nevertheless, for n ≥ 5, cluster structure is dominated by water-water interactions, with Cl- only partially solvated by the

  15. Computer-Assisted Spanish-Composition Survey--1986.

    ERIC Educational Resources Information Center

    Harvey, T. Edward

    1986-01-01

    A survey of high school and higher education teachers' (N=208) attitudes regarding the use of computers for Spanish-composition instruction revealed that: the lack of foreign-character support remains the major frustration; most teachers used Apple or IBM computers; and there was mixed opinion regarding the real versus the expected benefits of…

  16. Detection of the posterior superior alveolar artery in the lateral sinus wall using computed tomography/cone beam computed tomography: a prevalence meta-analysis study and systematic review.

    PubMed

    Varela-Centelles, P; Loira-Gago, M; Seoane-Romero, J M; Takkouche, B; Monteiro, L; Seoane, J

    2015-11-01

    A systematic search of MEDLINE, Embase, and Proceedings Web of Science was undertaken to assess the prevalence of the posterior superior alveolar artery (PSAA) in the lateral sinus wall in sinus lift patients, as identified using computed tomography (CT)/cone beam computed tomography (CBCT). For inclusion, the article had to report PSAA detection in the bony wall using CT and/or CBCT in patients with subsinus edentulism. Studies on post-mortem findings, mixed samples (living and cadaveric), those presenting pooled results only, or studies performed for a sinus pathology were excluded. Heterogeneity was checked using an adapted version of the DerSimonian and Laird Q test, and quantified by calculating the proportion of the total variance due to between-study variance (Ri statistic). Eight hundred and eleven single papers were reviewed and filtered according to the inclusion/exclusion criteria. Ten studies were selected (1647 patients and 2740 maxillary sinuses (study unit)). The pooled prevalence of PSAA was 62.02 (95% confidence interval (CI) 46.33-77.71). CBCT studies detected PSAA more frequently (78.12, 95% CI 61.25-94.98) than CT studies (51.19, 95% CI 42.33-60.05). Conventional CT revealed thicker arteries than CBCT. It is concluded that PSAA detection is more frequent when CBCT explorations are used. Additional comparative studies controlling for potential confounding factors are needed to ascertain the actual diagnostic value of radiographic explorations for assessing the PSAA prior to sinus floor elevation procedures. Copyright © 2015 International Association of Oral and Maxillofacial Surgeons. Published by Elsevier Ltd. All rights reserved.
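
    The DerSimonian and Laird random-effects pooling used in such meta-analyses can be sketched as follows (the per-study prevalences and sample sizes below are invented for illustration, not the ten included studies):

```python
import numpy as np

# Invented per-study PSAA detection prevalences (%) and sample sizes.
prev = np.array([60.0, 78.0, 52.0, 47.0, 70.0])
n = np.array([120, 200, 150, 300, 90])

p = prev / 100.0
var = p * (1 - p) / n          # within-study variance of each proportion
w = 1.0 / var                  # inverse-variance (fixed-effect) weights

# DerSimonian-Laird between-study variance tau^2 from Cochran's Q.
p_fixed = np.sum(w * p) / np.sum(w)
Q = np.sum(w * (p - p_fixed) ** 2)
c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
tau2 = max(0.0, (Q - (len(p) - 1)) / c)

# Random-effects pooled prevalence with a 95% confidence interval.
w_re = 1.0 / (var + tau2)
p_re = np.sum(w_re * p) / np.sum(w_re)
se = np.sqrt(1.0 / np.sum(w_re))
lo, hi = p_re - 1.96 * se, p_re + 1.96 * se
print(f"pooled prevalence {100*p_re:.1f}% (95% CI {100*lo:.1f}-{100*hi:.1f})")
```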

  17. Using tablet computers to teach evidence-based medicine to pediatrics residents: a prospective study.

    PubMed

    Soma, David B; Homme, Jason H; Jacobson, Robert M

    2013-01-01

    We sought to determine if tablet computers-supported by a laboratory experience focused upon skill-development-would improve not only evidence-based medicine (EBM) knowledge but also skills and behavior. We conducted a prospective cohort study where we provided tablet computers to our pediatric residents and then held a series of laboratory sessions focused on speed and efficiency in performing EBM at the bedside. We evaluated the intervention with pre- and postintervention tests and surveys based on a validated tool available for use on MedEdPORTAL. The attending pediatric hospitalists also completed surveys regarding their observations of the residents' behavior. All 38 pediatric residents completed the preintervention test and the pre- and postintervention surveys. All but one completed the posttest. All 7 attending pediatric hospitalists completed their surveys. The testing, targeted to assess EBM knowledge, revealed a median increase of 16 points out of a possible 60 points (P < .0001). We found substantial increases in individual resident's test scores across all 3 years of residency. Resident responses demonstrated statistically significant improvements in self-reported comfort with 6 out of 6 EBM skills and statistically significant increases in self-reported frequencies for 4 out of 7 EBM behaviors. Attending pediatric hospitalists reported improvements in 5 of 7 resident behaviors. This novel approach for teaching EBM to pediatric residents improved knowledge, skills, and behavior through the introduction of a tablet computer and laboratory sessions designed to teach the quick and efficient application of EBM at the bedside. Copyright © 2013 Academic Pediatric Association. Published by Elsevier Inc. All rights reserved.

  18. Computers for the Faculty: How on a Limited Budget.

    ERIC Educational Resources Information Center

    Arman, Hal; Kostoff, John

    An informal investigation of the use of computers at Delta College (DC) in Michigan revealed reasonable use of computers by faculty in disciplines such as mathematics, business, and technology, but very limited use in the humanities and social sciences. In an effort to increase faculty computer usage, DC decided to make computers available to any…

  19. Inquiry-Based Learning Case Studies for Computing and Computing Forensic Students

    ERIC Educational Resources Information Center

    Campbell, Jackie

    2012-01-01

    Purpose: The purpose of this paper is to describe and discuss the use of specifically-developed, inquiry-based learning materials for Computing and Forensic Computing students. Small applications have been developed which require investigation in order to de-bug code, analyse data issues and discover "illegal" behaviour. The applications…

  20. Validity of computed tomography in predicting scaphoid screw prominence: a cadaveric study.

    PubMed

    Griffis, Clare E; Olsen, Cara; Nesti, Leon; Gould, C Frank; Frew, Michael; McKay, Patricia

    2017-04-01

    Studies of hardware protrusion into joint spaces following fracture fixation have been performed to address whether or not there is discrepancy between the actual and radiographic appearance of screw prominence. The purpose of our study was to prove that, with respect to the scaphoid, prominence as visualized on CT scan is real and not a result of metal artifact. Forty-two cadaveric wrists were separated into four allotted groups with 21 control specimens and 21 study specimens. All specimens were radiographically screened to exclude those with inherent carpal abnormalities. Acutrak® headless compression screws were placed into all specimens using an open dorsal approach. Cartilage was removed from screw insertion site at the convex surface of the scaphoid proximal pole. Control specimens had 0 mm screw head prominence. The studied specimens had 1, 2, and 3 mm head prominence measured with a digital caliper. Computed tomography, with direct sagittal acquisition and metal suppression technique, was then performed on all specimens following screw placement. Two staff radiologists blinded to the study groups interpreted the images. Results revealed that only one of 21 control specimens was interpreted as prominent. Comparatively, in the studied groups, 90% were accurately interpreted as prominent. CT provides an accurate assessment of scaphoid screw head prominence. When a screw appears prominent on CT scan, it is likely to be truly prominent without contribution from metallic artifact.

  1. Using Computer Assisted Instruction in a Reading and Study Skills Course.

    ERIC Educational Resources Information Center

    Rauch, Margaret

    Test wiseness programs and computer assisted study skills instruction (CASSI) were found to be valuable resources for college reading and study skills instructors and students at St. Cloud State University (Minnesota). Two booklets on test wiseness cues were reorganized and used as computer programs to allow the information to be presented outside…

  2. Handheld Computers in the Classroom: Integration Strategies for Social Studies Educators.

    ERIC Educational Resources Information Center

    Ray, Beverly

    Handheld computers have gone beyond the world of business and are now finding their way into the hands of social studies teachers and students. This paper discusses how social studies teachers can use handheld computers to aid anytime/ anywhere course management. The integration of handheld technology into the classroom provides social studies…

  3. Measurement and Evidence of Computer-Based Task Switching and Multitasking by "Net Generation" Students

    ERIC Educational Resources Information Center

    Judd, Terry; Kennedy, Gregor

    2011-01-01

    Logs of on-campus computer and Internet usage were used to conduct a study of computer-based task switching and multitasking by undergraduate medical students. A detailed analysis of over 6000 individual sessions revealed that while a majority of students engaged in both task switching and multitasking behaviours, they did so less frequently than…

  4. Computational studies of protegrin antimicrobial peptides: a review

    PubMed Central

    Bolintineanu, Dan S.; Kaznessis, Yiannis N.

    2010-01-01

    Antimicrobial peptides (AMPs) are small, naturally-occurring peptides that exhibit strong antibacterial properties generally believed to be a result of selective bacterial membrane disruption. As a result, there has been significant interest in the development of therapeutic antibiotics based on AMPs; however, the poor understanding of the fundamental mechanism of action of these peptides has largely hampered such efforts. We present a summary of computational and theoretical investigations of protegrin, a particularly potent peptide that is both an excellent model for the mechanism of action of AMPs and a promising therapeutic candidate. Experimental investigations have shed light on many of the key steps in the action of protegrin: protegrin monomers are known to dimerize in various lipid environments; protegrin peptides interact strongly with lipid bilayer membranes, particularly anionic lipids; protegrins have been shown to form pores in lipid bilayers, which results in uncontrolled ion transport and may be a key factor in bacterial death. In this work, we present a comprehensive review of the computational and theoretical studies that have complemented and extended the information obtained from experimental work with protegrins, as well as a brief survey of the experimental biophysical studies that are most pertinent to such computational work. We show that a consistent, mechanistic description of the bactericidal mechanism of action of protegrins is emerging, and briefly outline areas where the current understanding is deficient. We hope that the research reviewed herein offers compelling evidence of the benefits of computational investigations of protegrins and other AMPs, as well as providing a useful guide to future work in this area. PMID:20946928

  5. Computer Assisted Language Learning. Routledge Studies in Computer Assisted Language Learning

    ERIC Educational Resources Information Center

    Pennington, Martha

    2011-01-01

    Computer-assisted language learning (CALL) is an approach to language teaching and learning in which computer technology is used as an aid to the presentation, reinforcement and assessment of material to be learned, usually including a substantial interactive element. This books provides an up-to date and comprehensive overview of…

  6. Methods Used in a Recent Computer Selection Study.

    ERIC Educational Resources Information Center

    Botten, LeRoy H.

    A study was conducted at Andrews University, Berrien Springs, Michigan to determine selection of a computer for both academic and administrative purposes. The university has a total enrollment of 2,100 students and includes a college, graduate school and seminary. An initial feasibility study delineated criteria and desirable components of the…

  7. Computational foundations of the visual number sense.

    PubMed

    Stoianov, Ivilin Peev; Zorzi, Marco

    2017-01-01

    We provide an emergentist perspective on the computational mechanism underlying numerosity perception, its development, and the role of inhibition, based on our deep neural network model. We argue that the influence of continuous visual properties does not challenge the notion of number sense, but reveals limit conditions for the computation that yields invariance in numerosity perception. Alternative accounts should be formalized in a computational model.

  8. Student Perceptions in the Design of a Computer Card Game for Learning Computer Literacy Issues: A Case Study

    ERIC Educational Resources Information Center

    Kordaki, Maria; Papastergiou, Marina; Psomos, Panagiotis

    2016-01-01

    The aim of this work was twofold. First, an empirical study was designed aimed at investigating the perceptions that entry-level non-computing majors--namely Physical Education and Sport Science (PESS) undergraduate students--hold about basic Computer Literacy (CL) issues. The participants were 90 first-year PESS students, and their perceptions…

  9. Computer and photogrammetric general land use study of central north Alabama

    NASA Technical Reports Server (NTRS)

    Jayroe, R. R.; Larsen, P. A.; Campbell, C. W.

    1974-01-01

    The object of this report is to acquaint potential users with two computer programs, developed at NASA, Marshall Space Flight Center. They were used in producing a land use survey and maps of central north Alabama from Earth Resources Technology Satellite (ERTS) digital data. The report describes in detail the thought processes and analysis procedures used from the initiation of the land use study to its completion, as well as a photogrammetric study that was used in conjunction with the computer analysis to produce similar land use maps. The results of the land use demonstration indicate that, with respect to computer time and cost, such a study may be economically and realistically feasible on a statewide basis.

  10. Computational Study of Nonadiabatic Effects in Atom-Molecule Reactive Scattering.

    DTIC Science & Technology

    1982-11-15

    ...a similar interpretation to those in Fig. 4-a, with the rotational effects most evident in the reactant tube (due to the mixing of the two open rotor ...). AD-A125 135: Computational Study of Nonadiabatic Effects in Atom-Molecule Reactive Scattering (U), Chemical Dynamics Corp., Columbus, OH; B. C. Garrett. Prepared for the Air Force Office of Scientific Research, Contract No. F49620-81.

  11. Defragging Computer/Videogame Implementation and Assessment in the Social Studies

    ERIC Educational Resources Information Center

    McBride, Holly

    2014-01-01

    Students in this post-industrial technological age require opportunities for the acquisition of new skills, especially in the marketplace of innovation. A pedagogical strategy that is becoming more and more popular within social studies classrooms is the use of computer and video games as enhancements to everyday lesson plans. Computer/video games…

  12. Cyanobacterial biofuels: new insights and strain design strategies revealed by computational modeling.

    PubMed

    Erdrich, Philipp; Knoop, Henning; Steuer, Ralf; Klamt, Steffen

    2014-09-19

    Cyanobacteria are increasingly recognized as promising cell factories for the production of renewable biofuels and chemical feedstocks from sunlight, CO2, and water. However, most biotechnological applications of these organisms are still characterized by low yields. Increasing the production performance of cyanobacteria remains therefore a crucial step. In this work we use a stoichiometric network model of Synechocystis sp. PCC 6803 in combination with CASOP and minimal cut set analysis to systematically identify and characterize suitable strain design strategies for biofuel synthesis, specifically for ethanol and isobutanol. As a key result, improving upon other works, we demonstrate that higher-order knockout strategies exist in the model that lead to coupling of growth with high-yield biofuel synthesis under phototrophic conditions. Enumerating all potential knockout strategies (cut sets) reveals a unifying principle behind the identified strain designs, namely to reduce the ratio of ATP to NADPH produced by the photosynthetic electron transport chain. Accordingly, suitable knockout strategies seek to block cyclic and other alternate electron flows, such that ATP and NADPH are exclusively synthesized via the linear electron flow whose ATP/NADPH ratio is below that required for biomass synthesis. The products of interest are then utilized by the cell as sinks for reduction equivalents in excess. Importantly, the calculated intervention strategies do not rely on the assumption of optimal growth and they ensure that maintenance metabolism in the absence of light remains feasible. Our analyses furthermore suggest that a moderately increased ATP turnover, realized, for example, by ATP futile cycles or other ATP wasting mechanisms, represents a promising target to achieve increased biofuel yields. Our study reveals key principles of rational metabolic engineering strategies in cyanobacteria towards biofuel production. The results clearly show that achieving
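
    Strain-design methods such as CASOP and minimal cut set analysis operate on a stoichiometric model whose core computation is a flux-balance linear program: maximize a target flux subject to steady-state mass balance S·v = 0 and flux bounds. A minimal sketch with an invented two-metabolite toy network (not the Synechocystis sp. PCC 6803 model):

```python
import numpy as np
from scipy.optimize import linprog

# Toy stoichiometric matrix (rows: metabolites A, B; cols: reactions).
#   R1: -> A (uptake)    R2: A -> B    R3: B -> (product export)
#   R4: A -> (biomass)
S = np.array([[1.0, -1.0,  0.0, -1.0],
              [0.0,  1.0, -1.0,  0.0]])

c = [0, 0, -1, 0]   # linprog minimizes, so -R3 maximizes product flux
bounds = [(0, 10),      # uptake capped at 10
          (0, None),
          (0, None),
          (1, None)]    # growth coupling: biomass flux must stay >= 1

res = linprog(c, A_eq=S, b_eq=np.zeros(2), bounds=bounds)
print(res.x)  # optimal fluxes [10, 9, 9, 1]: product 9 at forced growth 1
```

    Knockout strategies (cut sets) correspond to forcing selected fluxes to zero and checking which feasible flux distributions survive; the paper's contribution is finding cuts under which growth necessarily drags biofuel synthesis along.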

  13. [Design and study of parallel computing environment of Monte Carlo simulation for particle therapy planning using a public cloud-computing infrastructure].

    PubMed

    Yokohama, Noriya

    2013-07-01

    This report aimed to structure the architectural design and measure the performance of a parallel computing environment for Monte Carlo particle-therapy simulation, using a high performance computing (HPC) instance within a public cloud-computing infrastructure. Performance measurements showed an approximately 28 times faster speed than seen with single-thread architecture, combined with improved stability. A study of methods for optimizing the system operations also indicated lower cost.
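
    The decomposition such a run relies on can be sketched in a few lines. This is purely illustrative: a hit-or-miss estimate of pi stands in for particle-transport dose tallies, and the thread pool only shows the work-splitting pattern; real speedups like the reported ~28x require separate processes or cloud HPC nodes, since Python threads do not parallelize CPU-bound work.

```python
import random
from concurrent.futures import ThreadPoolExecutor

def mc_chunk(args):
    """One worker's share of samples: hit-or-miss estimate of pi.
    Each chunk gets its own seed so runs are reproducible and the
    workers' random streams are independent."""
    seed, n = args
    rng = random.Random(seed)
    hits = sum(1 for _ in range(n) if rng.random() ** 2 + rng.random() ** 2 <= 1.0)
    return hits

def parallel_pi(total_samples, workers):
    """Split the sample budget evenly across workers and merge the tallies."""
    n = total_samples // workers
    with ThreadPoolExecutor(max_workers=workers) as pool:
        hits = sum(pool.map(mc_chunk, [(seed, n) for seed in range(workers)]))
    return 4.0 * hits / (n * workers)

print(parallel_pi(200_000, 4))  # close to pi
```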

  14. LiPISC: A Lightweight and Flexible Method for Privacy-Aware Intersection Set Computation.

    PubMed

    Ren, Wei; Huang, Shiyong; Ren, Yi; Choo, Kim-Kwang Raymond

    2016-01-01

    Privacy-aware intersection set computation (PISC) can be modeled as secure multi-party computation. The basic idea is to compute the intersection of input sets without leaking privacy. Furthermore, PISC should be sufficiently flexible to recommend approximate intersection items. In this paper, we reveal two previously unpublished attacks against PISC, which can be used to reveal and link one input set to another input set, resulting in privacy leakage. We coin these the Set Linkage Attack and the Set Reveal Attack. We then present a lightweight and flexible PISC scheme (LiPISC) and prove its security (including against the Set Linkage Attack and Set Reveal Attack).
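
    The basic PISC idea, computing an intersection without exchanging raw values, can be sketched with salted hashing. This is not the LiPISC scheme; a construction this naive is exactly what a reveal-style attack exploits (low-entropy items can be dictionary-tested against the hashes), which is why stronger primitives are needed in practice.

```python
import hashlib

def blind(items, salt):
    """Map each item to a salted hash so raw values need not be exchanged.
    (Illustrative only: with low-entropy inputs an adversary can rebuild
    this mapping by brute force, the essence of a reveal attack.)"""
    return {hashlib.sha256((salt + item).encode()).hexdigest(): item
            for item in items}

def intersect(my_items, their_blinded_hashes, salt):
    """Recover the intersection by matching our blinded values against
    the hashes received from the other party."""
    mine = blind(my_items, salt)
    return sorted(mine[h] for h in mine.keys() & their_blinded_hashes)

salt = "shared-session-salt"          # assumed agreed upon beforehand
alice = ["apple", "banana", "cherry"]
bob_sends = set(blind(["banana", "cherry", "durian"], salt))  # hashes only
print(intersect(alice, bob_sends, salt))  # ['banana', 'cherry']
```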

  15. LiPISC: A Lightweight and Flexible Method for Privacy-Aware Intersection Set Computation

    PubMed Central

    Huang, Shiyong; Ren, Yi; Choo, Kim-Kwang Raymond

    2016-01-01

    Privacy-aware intersection set computation (PISC) can be modeled as secure multi-party computation. The basic idea is to compute the intersection of input sets without leaking privacy. Furthermore, PISC should be sufficiently flexible to recommend approximate intersection items. In this paper, we reveal two previously unpublished attacks against PISC, which can be used to reveal and link one input set to another input set, resulting in privacy leakage. We coin these the Set Linkage Attack and the Set Reveal Attack. We then present a lightweight and flexible PISC scheme (LiPISC) and prove its security (including against the Set Linkage Attack and Set Reveal Attack). PMID:27326763

  16. Recurrent laryngeal nerve paralysis: a laryngographic and computed tomographic study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Agha, F.P.

    Vocal cord paralysis is a relatively common entity, usually resulting from a pathologic process of the vagus nerve or its recurrent laryngeal branch. It is rarely caused by intralaryngeal lesions. Fourteen patients with recurrent laryngeal nerve paralysis (RLNP) were evaluated by laryngography, computed tomography (CT), or both. In the evaluation of the paramedian cord, CT was limited in its ability to differentiate between tumor and RLNP as the cause of the fixed cord, but it yielded more information than laryngography on the structural abnormalities of the larynx and pre-epiglottic and paralaryngeal spaces. Laryngography revealed distinct features of RLNP and is the procedure of choice for evaluation of functional abnormalities of the larynx until further experience with faster CT scanners and dynamic scanning of the larynx is gained.

  17. A study of workstation computational performance for real-time flight simulation

    NASA Technical Reports Server (NTRS)

    Maddalon, Jeffrey M.; Cleveland, Jeff I., II

    1995-01-01

    With recent advances in microprocessor technology, some have suggested that modern workstations provide enough computational power to properly operate a real-time simulation. This paper presents the results of a computational benchmark, based on actual real-time flight simulation code used at Langley Research Center, which was executed on various workstation-class machines. The benchmark was executed on different machines from several companies including: CONVEX Computer Corporation, Cray Research, Digital Equipment Corporation, Hewlett-Packard, Intel, International Business Machines, Silicon Graphics, and Sun Microsystems. The machines are compared by their execution speed, computational accuracy, and porting effort. The results of this study show that the raw computational power needed for real-time simulation is now offered by workstations.

  18. CFD Vision 2030 Study: A Path to Revolutionary Computational Aerosciences

    NASA Technical Reports Server (NTRS)

    Slotnick, Jeffrey; Khodadoust, Abdollah; Alonso, Juan; Darmofal, David; Gropp, William; Lurie, Elizabeth; Mavriplis, Dimitri

    2014-01-01

    This report documents the results of a study to address the long range, strategic planning required by NASA's Revolutionary Computational Aerosciences (RCA) program in the area of computational fluid dynamics (CFD), including future software and hardware requirements for High Performance Computing (HPC). Specifically, the "Vision 2030" CFD study is to provide a knowledge-based forecast of the future computational capabilities required for turbulent, transitional, and reacting flow simulations across a broad Mach number regime, and to lay the foundation for the development of a future framework and/or environment where physics-based, accurate predictions of complex turbulent flows, including flow separation, can be accomplished routinely and efficiently in cooperation with other physics-based simulations to enable multi-physics analysis and design. Specific technical requirements from the aerospace industrial and scientific communities were obtained to determine critical capability gaps, anticipated technical challenges, and impediments to achieving the target CFD capability in 2030. A preliminary development plan and roadmap were created to help focus investments in technology development to help achieve the CFD vision in 2030.

  19. A Computational and Experimental Study of Slit Resonators

    NASA Technical Reports Server (NTRS)

    Tam, C. K. W.; Ju, H.; Jones, M. G.; Watson, W. R.; Parrott, T. L.

    2003-01-01

    Computational and experimental studies are carried out to offer validation of the results obtained from direct numerical simulation (DNS) of the flow and acoustic fields of slit resonators. The test cases include slits with 90-degree corners and slits with a 45-degree bevel angle housed inside an acoustic impedance tube. Three slit widths are used. Six frequencies from 0.5 to 3.0 kHz are chosen. Good agreement is found between computed and measured reflection factors. In addition, incident sound waves having a white noise spectrum and a prescribed pseudo-random noise spectrum are used in a subsequent series of tests. The computed broadband results are again found to agree well with experimental data. It is believed the present results provide strong support that DNS can eventually be a useful and accurate prediction tool for liner aeroacoustics. The use of DNS as a design tool is discussed and illustrated by a simple example.

  20. A Case for Ubiquitous, Integrated Computing in Teacher Education

    ERIC Educational Resources Information Center

    Kay, Robin H.; Knaack, Liesel

    2005-01-01

    The purpose of this study was to evaluate the effect of an integrated, laptop-based approach on pre-service teachers' computer attitudes, ability and use. Pre-post program analysis revealed significant differences in behavioural attitudes and perceived control (self-efficacy), but not in affective and cognitive attitudes. In addition, there was a…

  1. A comparative study: use of a Brain-computer Interface (BCI) device by people with cerebral palsy in interaction with computers.

    PubMed

    Heidrich, Regina O; Jensen, Emely; Rebelo, Francisco; Oliveira, Tiago

    2015-01-01

    This article presents a comparative study among people with cerebral palsy and healthy controls, of various ages, using a Brain-computer Interface (BCI) device. The research is qualitative in its approach. Researchers worked with observational case studies. People with cerebral palsy and healthy controls were evaluated in Portugal and in Brazil. The study aimed to carry out a product evaluation in order to determine whether people with cerebral palsy could interact with the computer, and to compare whether their performance is similar to that of healthy controls when using the Brain-computer Interface. Ultimately, it was found that there are no significant differences between people with cerebral palsy in the two countries, as well as between populations without cerebral palsy (healthy controls).

  2. Influence of computer work under time pressure on cardiac activity.

    PubMed

    Shi, Ping; Hu, Sijung; Yu, Hongliu

    2015-03-01

    Computer users are often under stress when required to complete computer work within a required time. Work stress has repeatedly been associated with an increased risk for cardiovascular disease. The present study examined the effects of time pressure workload during computer tasks on cardiac activity in 20 healthy subjects. Heart rate, time domain and frequency domain indices of heart rate variability (HRV) and Poincaré plot parameters were compared among five computer tasks and two rest periods. Faster heart rate and decreased standard deviation of R-R interval were noted in response to computer tasks under time pressure. The Poincaré plot parameters showed significant differences between different levels of time pressure workload during computer tasks, and between computer tasks and the rest periods. In contrast, no significant differences were identified for the frequency domain indices of HRV. The results suggest that the quantitative Poincaré plot analysis used in this study was able to reveal the intrinsic nonlinear nature of the autonomically regulated cardiac rhythm. Specifically, heightened vagal tone occurred during the relaxation computer tasks without time pressure. In contrast, the stressful computer tasks with added time pressure stimulated cardiac sympathetic activity. Copyright © 2015 Elsevier Ltd. All rights reserved.
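
    The Poincaré plot parameters the study contrasts with frequency-domain HRV indices are the standard SD1/SD2 descriptors, computed from successive R-R interval pairs. A minimal sketch with made-up interval data (the paper's recording and preprocessing pipeline is not reproduced here):

```python
import math

def poincare_sd1_sd2(rr_ms):
    """Standard Poincare-plot descriptors from successive R-R intervals.
    SD1 is the spread perpendicular to the line of identity (short-term,
    vagally mediated variability); SD2 is the spread along it (long-term
    variability). Population-variance form."""
    x, y = rr_ms[:-1], rr_ms[1:]
    n = len(x)
    perp = [(yi - xi) / math.sqrt(2) for xi, yi in zip(x, y)]
    along = [(yi + xi) / math.sqrt(2) for xi, yi in zip(x, y)]
    def sd(v):
        m = sum(v) / n
        return math.sqrt(sum((vi - m) ** 2 for vi in v) / n)
    return sd(perp), sd(along)

rr = [812, 790, 805, 820, 798, 811, 803]  # hypothetical R-R series (ms)
sd1, sd2 = poincare_sd1_sd2(rr)
print(round(sd1, 2), round(sd2, 2))
```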

  3. Space station Simulation Computer System (SCS) study for NASA/MSFC. Volume 6: Study issues report

    NASA Technical Reports Server (NTRS)

    1989-01-01

    The Simulation Computer System (SCS) is the computer hardware, software, and workstations that will support the Payload Training Complex (PTC) at the Marshall Space Flight Center (MSFC). The PTC will train the space station payload specialists and mission specialists to operate the wide variety of experiments that will be on board the Freedom Space Station. This SCS study issues report summarizes the analysis and study done as Task 1 (identify and analyze the SCS study issues) of the SCS study contract. This work was performed over the first three months of the SCS study, which began in August 1988. First, issues were identified from all sources. These included the NASA SOW, the TRW proposal, and working groups which focused the experience of NASA and the contractor team performing the study (TRW, Essex, and Grumman). The final list is organized into training-related issues and SCS-associated development issues. To begin the analysis of the issues, a list of all the functions for which the SCS could be used was created, i.e., when the computer is turned on, what will it be doing. Analysis was continued by creating an operational functions matrix of SCS users vs. SCS functions to ensure all the functions considered were valid, and to aid in identification of users as the analysis progressed. The functions will form the basis for the requirements, which are currently being developed under Task 3 of the SCS study.

  4. A Study of the Correlation between Computer Games and Adolescent Behavioral Problems

    PubMed Central

    Shokouhi-Moqhaddam, Solmaz; Khezri-Moghadam, Noshiravan; Javanmard, Zeinab; Sarmadi-Ansar, Hassan; Aminaee, Mehran; Shokouhi-Moqhaddam, Majid; Zivari-Rahman, Mahmoud

    2013-01-01

    Background Today, due to developing communicative technologies, computer games and other audio-visual media, as social phenomena, are very attractive and have a great effect on children and adolescents. The increasing popularity of these games among children and adolescents results in public uncertainty about the plausible harmful effects of these games. This study aimed to investigate the correlation between computer games and behavioral problems among male guidance-school students. Methods This was a descriptive-correlative study of 384 randomly chosen male guidance-school students. They were asked to answer the researcher's questionnaire about computer games and Achenbach's Youth Self-Report (YSR). Findings The results of this study indicated that there was about a 95% direct significant correlation between the amount of playing games among adolescents and anxiety/depression, withdrawn/depression, rule-breaking behaviors, aggression, and social problems. However, there was no statistically significant correlation between the amount of computer game usage and physical complaints, thinking problems, and attention problems. In addition, there was a significant correlation between the students' place of living and their parents' job, and using computer games. Conclusion Computer games lead to anxiety, depression, withdrawal, rule-breaking behavior, aggression, and social problems in adolescents. PMID:24494157

  5. A Study of the Correlation between Computer Games and Adolescent Behavioral Problems.

    PubMed

    Shokouhi-Moqhaddam, Solmaz; Khezri-Moghadam, Noshiravan; Javanmard, Zeinab; Sarmadi-Ansar, Hassan; Aminaee, Mehran; Shokouhi-Moqhaddam, Majid; Zivari-Rahman, Mahmoud

    2013-01-01

    Today, due to developing communicative technologies, computer games and other audio-visual media, as social phenomena, are very attractive and have a great effect on children and adolescents. The increasing popularity of these games among children and adolescents results in public uncertainty about the plausible harmful effects of these games. This study aimed to investigate the correlation between computer games and behavioral problems among male guidance-school students. This was a descriptive-correlative study of 384 randomly chosen male guidance-school students. They were asked to answer the researcher's questionnaire about computer games and Achenbach's Youth Self-Report (YSR). The results of this study indicated that there was about a 95% direct significant correlation between the amount of playing games among adolescents and anxiety/depression, withdrawn/depression, rule-breaking behaviors, aggression, and social problems. However, there was no statistically significant correlation between the amount of computer game usage and physical complaints, thinking problems, and attention problems. In addition, there was a significant correlation between the students' place of living and their parents' job, and using computer games. Computer games lead to anxiety, depression, withdrawal, rule-breaking behavior, aggression, and social problems in adolescents.

  6. A Study of Computer Techniques for Music Research. Final Report.

    ERIC Educational Resources Information Center

    Lincoln, Harry B.

    Work in three areas comprised this study of computer use in thematic indexing for music research: (1) acquisition, encoding, and keypunching of data--themes of which now number about 50,000 (primarily 16th Century Italian vocal music) and serve as a test base for program development; (2) development of computer programs to process this data; and…

  7. Computer use and addiction in Romanian children and teenagers--an observational study.

    PubMed

    Chiriţă, V; Chiriţă, Roxana; Stefănescu, C; Chele, Gabriela; Ilinca, M

    2006-01-01

    The computer has provided some wonderful opportunities for our children. Although research on the effects of children's use of computers is still ambiguous, some initial indications of positive and negative effects are beginning to emerge. Children commonly use computers for playing games, completing school assignments, email, and connecting to the Internet. This may sometimes come at the expense of other activities such as homework or normal social interchange. Although most children seem to naturally correct the problem, parents and educators must monitor the signs of misuse. Studies of general computer users suggest that some children may experience psychological problems such as social isolation, depression, loneliness, and time mismanagement related to their computer use and failure at school. The purpose of this study is to investigate issues related to computer use by school students from 11 to 18 years old. The survey included a representative sample of 439 school students aged 11 to 18. All of the students came from 3 gymnasium schools and 5 high schools of Iaşi, Romania. The students answered a questionnaire comprising 34 questions related to computer activities. The children's parents answered a second questionnaire on the same subject. Most questions asked respondents to rate on a scale the frequency of occurrence of a certain event or issue; some questions solicited an open answer or a choice from a list. These were aimed at highlighting: (1) the frequency of computer use by the students; (2) the interference of excessive use with school performance and social life; (3) the identification of a possible computer addiction. The data were processed using the SPSS statistics software, version 11.0. Results show that the school students prefer to spend a considerable amount of time with their computers, over 3 hours/day. More than 65.7% of the students have a computer at home. More than 70% of the parents admit they do not or only occasionally

  8. Studies in Mathematics, Volume 22. Studies in Computer Science.

    ERIC Educational Resources Information Center

    Pollack, Seymour V., Ed.

    The nine articles in this collection were selected because they represent concerns central to computer science, emphasize topics of particular interest to mathematicians, and underscore the wide range of areas deeply and continually affected by computer science. The contents consist of: "Introduction" (S. V. Pollack), "The…

  9. Compute as Fast as the Engineers Can Think! ULTRAFAST COMPUTING TEAM FINAL REPORT

    NASA Technical Reports Server (NTRS)

    Biedron, R. T.; Mehrotra, P.; Nelson, M. L.; Preston, M. L.; Rehder, J. J.; Rogers, J. L.; Rudy, D. H.; Sobieski, J.; Storaasli, O. O.

    1999-01-01

    This report documents findings and recommendations by the Ultrafast Computing Team (UCT). In the period 10-12/98, UCT reviewed design case scenarios for a supersonic transport and a reusable launch vehicle to derive computing requirements necessary for support of a design process with efficiency so radically improved that human thought rather than the computer paces the process. Assessment of the present computing capability against the above requirements indicated a need for further improvement in computing speed by several orders of magnitude to reduce time to solution from tens of hours to seconds in major applications. Evaluation of the trends in computer technology revealed a potential to attain the postulated improvement by further increases of single processor performance combined with massively parallel processing in a heterogeneous environment. However, utilization of massively parallel processing to its full capability will require redevelopment of the engineering analysis and optimization methods, including invention of new paradigms. To that end UCT recommends initiation of a new activity at LaRC called Computational Engineering for development of new methods and tools geared to the new computer architectures in disciplines, their coordination, and validation and benefit demonstration through applications.

  10. Computed tomographic findings of trichuriasis

    PubMed Central

    Tokmak, Naime; Koc, Zafer; Ulusan, Serife; Koltas, Ismail Soner; Bal, Nebil

    2006-01-01

    In this report, we present computed tomographic findings of colonic trichuriasis. The patient was a 75-year-old man who complained of abdominal pain and weight loss. Diagnosis was achieved by colonoscopic biopsy. Abdominal computed tomography showed irregular and nodular thickening of the wall of the cecum and ascending colon. Although these findings are nonspecific, they may be one of the findings of trichuriasis. These findings, confirmed by pathologic analysis of the biopsied tissue and the Kato-Katz parasitological stool flotation technique, revealed adult Trichuris. To our knowledge, this is the first report of colonic trichuriasis indicated by computed tomography. PMID:16830393

  11. Experimental and Computational Study of Sonic and Supersonic Jet Plumes

    NASA Technical Reports Server (NTRS)

    Venkatapathy, E.; Naughton, J. W.; Fletcher, D. G.; Edwards, Thomas A. (Technical Monitor)

    1994-01-01

    Studies of sonic and supersonic jet plumes are relevant to understanding such phenomena as jet noise, plume signatures, and rocket base heating and radiation. Jet plumes are simple to simulate and yet have complex flow structures such as Mach disks, triple points, shear layers, barrel shocks, shock/shear-layer interaction, etc. Experimental and computational simulations of sonic and supersonic jet plumes have been performed for under- and over-expanded, axisymmetric plume conditions. The computational simulations compare very well with the experimental schlieren observations. Experimental data such as temperature measurements with hot-wire probes are yet to be obtained and will be compared with computed values. Extensive analysis of the computational simulations presents a clear picture of how the complex flow structure develops and the conditions under which self-similar flow structures evolve. From the computations, the plume structure can be further classified into many sub-groups. In the proposed paper, detailed results from the experimental and computational simulations for single, axisymmetric, under- and over-expanded, sonic and supersonic plumes will be compared and the fluid dynamic aspects of flow structures will be discussed.

  12. A Validation Study of Student Differentiation between Computing Disciplines

    ERIC Educational Resources Information Center

    Battig, Michael; Shariq, Muhammad

    2011-01-01

    Using a previously published study of how students differentiate between computing disciplines, this study attempts to validate the original research and add additional hypotheses regarding the type of institution that the student resides. Using the identical survey instrument from the original study, students in smaller colleges and in different…

  13. Conventional versus computer-navigated TKA: a prospective randomized study.

    PubMed

    Todesca, Alessandro; Garro, Luca; Penna, Massimo; Bejui-Hugues, Jacques

    2017-06-01

    The purpose of this study was to assess the midterm results of total knee arthroplasty (TKA) implanted with a specific computer navigation system in a group of patients (NAV) and to assess the same prosthesis implanted with the conventional technique in another group (CON); we hypothesized that computer navigation surgery would improve implant alignment, functional scores, and survival of the implant compared to the conventional technique. From 2008 to 2009, 225 patients were enrolled in the study and randomly assigned to the CON and NAV groups; 240 consecutive mobile-bearing ultra-congruent score (Amplitude, Valence, France) TKAs were performed by a single surgeon, 117 using the conventional method and 123 using the computer-navigated approach. Clinical outcome assessment was based on the Knee Society Score (KSS), the Hospital for Special Surgery Knee Score, and the Western Ontario and McMaster Universities Index score. Component survival was calculated by Kaplan-Meier analysis. Median follow-up was 6.4 years (range 6-7 years). Two patients were lost to follow-up. No differences were seen between the two groups in age, sex, BMI, and side of implantation. Three patients of the CON group reported feelings of instability during walking, but clinical tests were all negative. The NAV group showed statistically significantly better KSS scores, a wider ROM, and fewer outliers from the neutral mechanical axis, lateral distal femoral angle, medial proximal tibial angle, and tibial slope in the post-operative radiographic assessment. There was one case of early post-operative superficial infection (caused by Staph. aureus) successfully treated with antibiotics. No mechanical loosening, mobile-bearing dislocation, or patellofemoral complication was seen. At 7 years of follow-up, component survival in relation to the risk of aseptic loosening or other complications was 100%. There were no implant revisions. This study demonstrates superior accuracy in implant positioning and statistically significant

  14. Using the Computer in Evolution Studies

    ERIC Educational Resources Information Center

    Mariner, James L.

    1973-01-01

    Describes a high school biology exercise in which a computer greatly reduces time spent on calculations. Genetic equilibrium demonstrated by the Hardy-Weinberg principle and the subsequent effects of violating any of its premises are more readily understood when frequencies of alleles through many generations are calculated by the computer. (JR)
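
    The exercise described is straightforward to reproduce: iterate the standard selection recursion for allele frequency and watch what happens when the equal-fitness premise of Hardy-Weinberg equilibrium is violated. A sketch using the generic textbook recursion, not the article's original program:

```python
def allele_trajectory(p0, fitness, n_gen):
    """Frequency of allele A across n_gen generations under selection.
    fitness = (wAA, wAa, waa). With equal fitnesses the Hardy-Weinberg
    premises hold and p stays put; unequal fitnesses violate them and
    p drifts generation by generation."""
    wAA, wAa, waa = fitness
    p, history = p0, [round(p0, 4)]
    for _ in range(n_gen):
        q = 1 - p
        w_bar = p * p * wAA + 2 * p * q * wAa + q * q * waa  # mean fitness
        p = (p * p * wAA + p * q * wAa) / w_bar              # selection recursion
        history.append(round(p, 4))
    return history

print(allele_trajectory(0.5, (1.0, 1.0, 1.0), 3))  # [0.5, 0.5, 0.5, 0.5]: equilibrium
print(allele_trajectory(0.5, (1.0, 1.0, 0.8), 3))  # p rises: selection against aa
```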

  15. Studies to reveal the nature of interactions between catalase and curcumin using computational methods and optical techniques.

    PubMed

    Mofidi Najjar, Fayezeh; Ghadari, Rahim; Yousefi, Reza; Safari, Naser; Sheikhhasani, Vahid; Sheibani, Nader; Moosavi-Movahedi, Ali Akbar

    2017-02-01

    Curcumin is an important antioxidant compound, and is widely reported as an effective component for reducing complications of many diseases. However, the detailed mechanisms of its activity remain poorly understood. We found that curcumin can significantly increase the catalase activity of BLC (bovine liver catalase). The mechanism of curcumin action was investigated using a computational method. We suggest that curcumin may activate BLC by modifying the bottleneck of its narrow channel. The molecular dynamics simulation data showed that placing curcumin on the structure of the enzyme can increase the size of the bottleneck in the narrow channel of BLC and readily allow the access of substrate to the active site. Because of the increase in the distance between the amino acids of the bottleneck in the presence of curcumin, the entrance space of the substrate increased from 250 Å³ to 440 Å³. In addition, the increase in the intrinsic fluorescence emission of BLC in the presence of curcumin demonstrated changes in the tertiary structure of catalase and the possibility of less quenching. We also used circular dichroism (CD) spectropolarimetry to determine how curcumin may alter the enzyme's secondary structure. Catalase spectra in the presence of various concentrations of curcumin showed an increase in the amount of α-helix content. Copyright © 2016 Elsevier B.V. All rights reserved.

  16. Supporting High School Student Accomplishment of Biology Content Using Interactive Computer-Based Curricular Case Studies

    NASA Astrophysics Data System (ADS)

    Oliver, Joseph Steve; Hodges, Georgia W.; Moore, James N.; Cohen, Allan; Jang, Yoonsun; Brown, Scott A.; Kwon, Kyung A.; Jeong, Sophia; Raven, Sara P.; Jurkiewicz, Melissa; Robertson, Tom P.

    2017-11-01

    Research into the efficacy of modules featuring dynamic visualizations, case studies, and interactive learning environments is reported here. This quasi-experimental 2-year study examined the implementation of three interactive computer-based instructional modules within a curricular unit covering cellular biology concepts in an introductory high school biology course. The modules featured dynamic visualizations and focused on three processes that underlie much of cellular biology: diffusion, osmosis, and filtration. Pre-tests and post-tests were used to assess knowledge growth across the unit. A mixture Rasch model analysis of the post-test data revealed two groups of students. In both years of the study, a large proportion of the students were classified as low-achieving based on their pre-test scores. The use of the modules in the Cell Unit in year 2 was associated with a much larger proportion of the students having transitioned to the high-achieving group than in year 1. In year 2, the same teachers taught the same concepts as year 1 but incorporated the interactive computer-based modules into the cell biology unit of the curriculum. In year 2, 67% of students initially classified as low-achieving were classified as high-achieving at the end of the unit. Examination of responses to assessments embedded within the modules as well as post-test items linked transition to the high-achieving group with correct responses to items that both referenced the visualization and the contextualization of that visualization within the module. This study points to the importance of dynamic visualization within contextualized case studies as a means to support student knowledge acquisition in biology.

  17. Using Geographic Information Systems (GIS) at Schools without a Computer Laboratory

    ERIC Educational Resources Information Center

    Demirci, Ali

    2011-01-01

    This article reports the results of a study that explored the applicability and effectiveness of a GIS-based exercise implemented by a teacher on a single computer in an ordinary classroom. The GIS-based exercise was implemented in two different environments with two different groups of students. The study reveals that implementing GIS exercises…

  18. Computational Assay of H7N9 Influenza Neuraminidase Reveals R292K Mutation Reduces Drug Binding Affinity

    NASA Astrophysics Data System (ADS)

    Woods, Christopher J.; Malaisree, Maturos; Long, Ben; McIntosh-Smith, Simon; Mulholland, Adrian J.

    2013-12-01

    The emergence of a novel H7N9 avian influenza that infects humans is a serious cause for concern. Of the genome sequences of H7N9 neuraminidase available, one contains a substitution of arginine to lysine at position 292, suggesting a potential for reduced drug binding efficacy. We have performed molecular dynamics simulations of oseltamivir, zanamivir and peramivir bound to H7N9, H7N9-R292K, and a structurally related H11N9 neuraminidase. They show that H7N9 neuraminidase is structurally homologous to H11N9, binding the drugs in identical modes. The simulations reveal that the R292K mutation disrupts drug binding in H7N9 in a comparable manner to that observed experimentally for H11N9-R292K. Absolute binding free energy calculations with the WaterSwap method confirm a reduction in binding affinity. This indicates that the efficacy of antiviral drugs against H7N9-R292K will be reduced. Simulations can assist in predicting disruption of binding caused by mutations in neuraminidase, thereby providing a computational `assay.'

  19. Dynamics of Numerics & Spurious Behaviors in CFD Computations. Revised

    NASA Technical Reports Server (NTRS)

    Yee, Helen C.; Sweby, Peter K.

    1997-01-01

    The global nonlinear behavior of finite discretizations for constant time steps and fixed or adaptive grid spacings is studied using tools from dynamical systems theory. Detailed analysis of commonly used temporal and spatial discretizations for simple model problems is presented. The role of dynamics in the understanding of long time behavior of numerical integration and the nonlinear stability, convergence, and reliability of using time-marching approaches for obtaining steady-state numerical solutions in computational fluid dynamics (CFD) is explored. The study is complemented with examples of spurious behavior observed in steady and unsteady CFD computations. The CFD examples were chosen to illustrate non-apparent spurious behavior that was difficult to detect without extensive grid and temporal refinement studies and some knowledge from dynamical systems theory. Studies revealed the various possible dangers of misinterpreting numerical simulation of realistic complex flows that are constrained by available computing power. In large scale computations where the physics of the problem under study is not well understood and numerical simulations are the only viable means of solution, extreme care must be taken in both computation and interpretation of the numerical data. The goal of this paper is to explore the important role that dynamical systems theory can play in the understanding of the global nonlinear behavior of numerical algorithms and to aid the identification of the sources of numerical uncertainties in CFD.
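
    A minimal example of the kind of spurious asymptote the paper analyzes with dynamical-systems tools: forward-Euler time marching for the model problem u' = u(1 - u). This is a generic textbook illustration in the spirit of the study, not one of its CFD cases.

```python
def euler_logistic(u0, h, steps):
    """Forward-Euler time marching for u' = u(1 - u), whose only stable
    steady state is u = 1. The discrete map u <- u + h*u*(1 - u) has
    dynamics of its own: the fixed point u = 1 is stable only for
    h < 2, so a too-large step size settles into a spurious oscillation
    that no solution of the underlying ODE possesses."""
    u = u0
    for _ in range(steps):
        u = u + h * u * (1.0 - u)
    return u

print(euler_logistic(0.2, 0.5, 100))  # converges to the true steady state, 1.0
a = euler_logistic(0.2, 2.2, 1000)
b = euler_logistic(0.2, 2.2, 1001)
print(a, b)  # two distinct values: a spurious period-2 cycle, not a converged state
```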

  20. The Relationship between Computational Fluency and Student Success in General Studies Mathematics

    ERIC Educational Resources Information Center

    Hegeman, Jennifer; Waters, Gavin

    2012-01-01

    Many developmental mathematics programs emphasize computational fluency with the assumption that this is a necessary contributor to student success in general studies mathematics. In an effort to determine which skills are most essential, scores on a computational fluency test were correlated with student success in general studies mathematics at…

  1. Computational Fluid Dynamic (CFD) Study of an Articulating Turbine Blade Cascade

    DTIC Science & Technology

    2016-11-01

    turbine blades to have fluid run through them during use—a feature which many newer engines include. A cutaway view of a typical rotorcraft engine... ARL-TR-7871 ● NOV 2016 ● US Army Research Laboratory: Computational Fluid Dynamic (CFD) Study of an Articulating Turbine Blade Cascade, by Luis

  2. Computational Models Reveal a Passive Mechanism for Cell Migration in the Crypt

    PubMed Central

    Dunn, Sara-Jane; Näthke, Inke S.; Osborne, James M.

    2013-01-01

    Cell migration in the intestinal crypt is essential for the regular renewal of the epithelium, and the continued upward movement of cells is a key characteristic of healthy crypt dynamics. However, the driving force behind this migration is unknown. Possibilities include mitotic pressure, active movement driven by motility cues, or negative pressure arising from cell loss at the crypt collar. It is possible that a combination of factors together coordinate migration. Here, three different computational models are used to provide insight into the mechanisms that underpin cell movement in the crypt, by examining the consequence of eliminating cell division on cell movement. Computational simulations agree with existing experimental results, confirming that migration can continue in the absence of mitosis. Importantly, however, simulations allow us to infer mechanisms that are sufficient to generate cell movement, which is not possible through experimental observation alone. The results produced by the three models agree and suggest that cell loss due to apoptosis and extrusion at the crypt collar relieves cell compression below, allowing cells to expand and move upwards. This finding suggests that future experiments should focus on the role of apoptosis and cell extrusion in controlling cell migration in the crypt. PMID:24260407

  3. Mirror neurons and imitation: a computationally guided review.

    PubMed

    Oztop, Erhan; Kawato, Mitsuo; Arbib, Michael

    2006-04-01

    Neurophysiology reveals the properties of individual mirror neurons in the macaque while brain imaging reveals the presence of 'mirror systems' (not individual neurons) in the human. Current conceptual models attribute high level functions such as action understanding, imitation, and language to mirror neurons. However, only the first of these three functions is well-developed in monkeys. We thus distinguish current opinions (conceptual models) on mirror neuron function from more detailed computational models. We assess the strengths and weaknesses of current computational models in addressing the data and speculations on mirror neurons (macaque) and mirror systems (human). In particular, our mirror neuron system (MNS), mental state inference (MSI) and modular selection and identification for control (MOSAIC) models are analyzed in more detail. Conceptual models often overlook the computational requirements for posited functions, while too many computational models adopt the erroneous hypothesis that mirror neurons are interchangeable with imitation ability. Our meta-analysis underlines the gap between conceptual and computational models and points out the research effort required from both sides to reduce this gap.

  4. Influence of the Pixel Sizes of Reference Computed Tomography on Single-photon Emission Computed Tomography Image Reconstruction Using Conjugate-gradient Algorithm.

    PubMed

    Okuda, Kyohei; Sakimoto, Shota; Fujii, Susumu; Ida, Tomonobu; Moriyama, Shigeru

    The frame-of-reference based on the computed tomography (CT) coordinate system in single-photon emission computed tomography (SPECT) reconstruction is one of the advanced characteristics of the xSPECT reconstruction system. The aim of this study was to reveal the influence of this high-resolution frame-of-reference on xSPECT reconstruction. A 99mTc line-source phantom and a National Electrical Manufacturers Association (NEMA) image-quality phantom were scanned using the SPECT/CT system. xSPECT reconstructions were performed with reference CT images at different sizes of the display field-of-view (DFOV) and pixel. The pixel sizes of the reconstructed xSPECT images were close to 2.4 mm, the size of the originally acquired projection data, even when the reference CT resolution was varied. The full width at half maximum (FWHM) of the line source, the absolute recovery coefficient, and the background variability of the image-quality phantom were independent of the DFOV size of the reference CT images. The results of this study revealed that the image quality of reconstructed xSPECT images is not influenced by the resolution of the frame-of-reference used in SPECT reconstruction.
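    The FWHM measurement mentioned above can be sketched generically by locating the half-maximum crossings of a sampled line profile with linear interpolation (an illustrative approach; the study's actual analysis software is not specified):

```python
def fwhm(xs, ys):
    """Full width at half maximum of a single-peaked sampled profile."""
    peak = max(ys)
    half = peak / 2.0
    i_peak = ys.index(peak)

    def crossing(step):
        # walk away from the peak while samples stay above the half maximum
        i = i_peak
        while 0 < i + step < len(ys) and ys[i + step] >= half:
            i += step
        j = i + step  # first sample below the half maximum
        # linear interpolation between the straddling samples
        frac = (ys[i] - half) / (ys[i] - ys[j])
        return xs[i] + frac * (xs[j] - xs[i])

    return abs(crossing(1) - crossing(-1))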

  5. Using Volunteer Computing to Study Some Features of Diagonal Latin Squares

    NASA Astrophysics Data System (ADS)

    Vatutin, Eduard; Zaikin, Oleg; Kochemazov, Stepan; Valyaev, Sergey

    2017-12-01

    This research concerns several features of diagonal Latin squares (DLSs) of small order. The authors suggest an algorithm for computing the minimal and maximal numbers of transversals of DLSs. According to this algorithm, all DLSs of a particular order are generated, and for each square all of its transversals and diagonal transversals are constructed. The algorithm was implemented and applied to DLSs of order at most 7 on a personal computer. The experiment for order 8 was performed in the volunteer computing project Gerasim@home. In addition, the problem of finding pairs of orthogonal DLSs of order 10 was considered and reduced to the Boolean satisfiability problem. The resulting problem turned out to be very hard, so it was decomposed into a family of subproblems. To solve it, the volunteer computing project SAT@home was used. As a result, several dozen pairs of the described kind were found.
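    For small orders the transversal-counting step described above can be done by brute force over column permutations; the order-4 square below is an illustrative example of my own (the project's production algorithm is more elaborate):

```python
from itertools import permutations

def is_diagonal_latin(sq):
    """Check rows, columns, and both main diagonals each contain every symbol once."""
    n = len(sq)
    full = set(range(n))
    rows = all(set(r) == full for r in sq)
    cols = all({sq[i][j] for i in range(n)} == full for j in range(n))
    diag = {sq[i][i] for i in range(n)} == full
    anti = {sq[i][n - 1 - i] for i in range(n)} == full
    return rows and cols and diag and anti

def count_transversals(sq):
    """A transversal picks one cell per row and column with all symbols distinct."""
    n = len(sq)
    return sum(
        1 for cols in permutations(range(n))
        if len({sq[i][cols[i]] for i in range(n)}) == n
    )

# An order-4 diagonal Latin square (a row-permuted XOR / Z2 x Z2 Cayley table)
dls4 = [[0, 1, 2, 3],
        [2, 3, 0, 1],
        [3, 2, 1, 0],
        [1, 0, 3, 2]]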

  6. Computational Study of Separating Flow in a Planar Subsonic Diffuser

    NASA Technical Reports Server (NTRS)

    DalBello, Teryn; Dippold, Vance, III; Georgiadis, Nicholas J.

    2005-01-01

    A computational study of the separated flow through a 2-D asymmetric subsonic diffuser has been performed. The Wind Computational Fluid Dynamics code is used to predict the separation and reattachment behavior for an incompressible diffuser flow. The diffuser inlet flow is a two-dimensional, turbulent, and fully-developed channel flow with a Reynolds number of 20,000 based on the centerline velocity and the channel height. Wind solutions computed with the Menter SST, Chien k-epsilon, Spalart-Allmaras, and Explicit Algebraic Reynolds Stress turbulence models are compared with experimentally measured velocity profiles and skin friction along the upper and lower walls. In addition to the turbulence model study, the effects of grid resolution and the use of wall functions were investigated. The grid studies varied the number of grid points across the diffuser and varied the initial wall spacing from y+ = 0.2 to 60. The wall function study assessed the applicability of wall functions for analysis of separated flow. The SST and Explicit Algebraic Stress models provide the best agreement with experimental data, and it is recommended that wall functions be used only with a high level of caution.

  7. Computed tomography angiography reveals the crime instrument – case report

    PubMed Central

    Banaszek, Anna; Guziński, Maciej; Sąsiadek, Marek

    2010-01-01

    Summary Background: The development of multislice CT technology enabled imaging of post-traumatic brain lesions with isotropic resolution, which led to unexpected results in the presented case. Case Report: An unconscious, 49-year-old male with suspected trauma underwent a routine CT examination of the head, which revealed unusual intracerebral bleeding and was therefore followed by CT angiography (CTA). Thorough analysis of the CTA source scans led to detection of the cause of the bleeding. Conclusions: The presented case showed that careful analysis of a CT scan not only defines the extent of pathological lesions in the intracranial space but can also help to detect the crime instrument, which is of medico-legal significance. PMID:22802784

  8. Vertebral Pneumaticity in the Ornithomimosaur Archaeornithomimus (Dinosauria: Theropoda) Revealed by Computed Tomography Imaging and Reappraisal of Axial Pneumaticity in Ornithomimosauria

    PubMed Central

    Watanabe, Akinobu; Eugenia Leone Gold, Maria; Brusatte, Stephen L.; Benson, Roger B. J.; Choiniere, Jonah; Davidson, Amy; Norell, Mark A.

    2015-01-01

    Among extant vertebrates, pneumatization of postcranial bones is unique to birds, with few known exceptions in other groups. Through reduction in bone mass, this feature is thought to benefit flight capacity in modern birds, but its prevalence in non-avian dinosaurs of variable sizes has generated competing hypotheses on the initial adaptive significance of postcranial pneumaticity. To better understand the evolutionary history of postcranial pneumaticity, studies have surveyed its distribution among non-avian dinosaurs. Nevertheless, the degree of pneumaticity in the basal coelurosaurian group Ornithomimosauria remains poorly known, despite their potential to greatly enhance our understanding of the early evolution of pneumatic bones along the lineage leading to birds. Historically, the identification of postcranial pneumaticity in non-avian dinosaurs has been based on examination of external morphology, and few studies thus far have focused on the internal architecture of pneumatic structures inside the bones. Here, we describe the vertebral pneumaticity of the ornithomimosaur Archaeornithomimus with the aid of X-ray computed tomography (CT) imaging. Complementary examination of external and internal osteology reveals (1) highly pneumatized cervical vertebrae with an elaborate configuration of interconnected chambers within the neural arch and the centrum; (2) anterior dorsal vertebrae with pneumatic chambers inside the neural arch; (3) apneumatic sacral vertebrae; and (4) a subset of proximal caudal vertebrae with limited pneumatic invasion into the neural arch. Comparisons with other theropod dinosaurs suggest that ornithomimosaurs primitively exhibited a plesiomorphic theropod condition for axial pneumaticity that was extended among later taxa, such as Archaeornithomimus and large bodied Deinocheirus. This finding corroborates the notion that evolutionary increases in vertebral pneumaticity occurred in parallel among independent lineages of bird

  9. Advanced Computed-Tomography Inspection System

    NASA Technical Reports Server (NTRS)

    Harris, Lowell D.; Gupta, Nand K.; Smith, Charles R.; Bernardi, Richard T.; Moore, John F.; Hediger, Lisa

    1993-01-01

    Advanced Computed Tomography Inspection System (ACTIS) is computed-tomography x-ray apparatus revealing internal structures of objects in wide range of sizes and materials. Three x-ray sources and adjustable scan geometry give system unprecedented versatility. Gantry contains translation and rotation mechanisms scanning x-ray beam through object inspected. Distance between source and detector towers varied to suit object. System used in such diverse applications as development of new materials, refinement of manufacturing processes, and inspection of components.

  10. A Qualitative Study of Students' Computational Thinking Skills in a Data-Driven Computing Class

    ERIC Educational Resources Information Center

    Yuen, Timothy T.; Robbins, Kay A.

    2014-01-01

    Critical thinking, problem solving, the use of tools, and the ability to consume and analyze information are important skills for the 21st century workforce. This article presents a qualitative case study that follows five undergraduate biology majors in a computer science course (CS0). This CS0 course teaches programming within a data-driven…

  11. Computer proficiency questionnaire: assessing low and high computer proficient seniors.

    PubMed

    Boot, Walter R; Charness, Neil; Czaja, Sara J; Sharit, Joseph; Rogers, Wendy A; Fisk, Arthur D; Mitzner, Tracy; Lee, Chin Chin; Nair, Sankaran

    2015-06-01

    Computers and the Internet have the potential to enrich the lives of seniors and aid in the performance of important tasks required for independent living. A prerequisite for reaping these benefits is having the skills needed to use these systems, which is highly dependent on proper training. One prerequisite for efficient and effective training is being able to gauge current levels of proficiency. We developed a new measure (the Computer Proficiency Questionnaire, or CPQ) to measure computer proficiency in the domains of computer basics, printing, communication, Internet, calendaring software, and multimedia use. Our aim was to develop a measure appropriate for individuals with a wide range of proficiencies from noncomputer users to extremely skilled users. To assess the reliability and validity of the CPQ, a diverse sample of older adults, including 276 older adults with no or minimal computer experience, was recruited and asked to complete the CPQ. The CPQ demonstrated excellent reliability (Cronbach's α = .98), with subscale reliabilities ranging from .86 to .97. Age, computer use, and general technology use all predicted CPQ scores. Factor analysis revealed three main factors of proficiency related to Internet and e-mail use; communication and calendaring; and computer basics. Based on our findings, we also developed a short-form CPQ (CPQ-12) with similar properties but 21 fewer questions. The CPQ and CPQ-12 are useful tools to gauge computer proficiency for training and research purposes, even among low computer proficient older adults. © The Author 2013. Published by Oxford University Press on behalf of The Gerontological Society of America. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
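    The reliability figures quoted above (Cronbach's α = .98) come from a standard formula that is easy to compute directly; a minimal sketch with synthetic response data of my own (using the population-variance convention, ddof = 0):

```python
def cronbach_alpha(scores):
    """Cronbach's alpha for scores given as rows of respondents x k item scores."""
    k = len(scores[0])

    def var(xs):  # population variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    item_vars = [var([row[i] for row in scores]) for i in range(k)]
    total_var = var([sum(row) for row in scores])
    return (k / (k - 1)) * (1.0 - sum(item_vars) / total_var)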

  12. Does Cloud Computing in the Atmospheric Sciences Make Sense? A case study of hybrid cloud computing at NASA Langley Research Center

    NASA Astrophysics Data System (ADS)

    Nguyen, L.; Chee, T.; Minnis, P.; Spangenberg, D.; Ayers, J. K.; Palikonda, R.; Vakhnin, A.; Dubois, R.; Murphy, P. R.

    2014-12-01

    The processing, storage and dissemination of satellite cloud and radiation products produced at NASA Langley Research Center are key activities for the Climate Science Branch. A constellation of systems operates in sync to accomplish these goals. Because of the complexity involved with operating such intricate systems, there are both high failure rates and high costs for hardware and system maintenance. Cloud computing has the potential to ameliorate cost and complexity issues. Over time, the cloud computing model has evolved and hybrid systems comprising off-site as well as on-site resources are now common. Towards our mission of providing the highest quality research products to the widest audience, we have explored the use of the Amazon Web Services (AWS) Cloud and Storage and present a case study of our results and efforts. This project builds upon NASA Langley Cloud and Radiation Group's experience with operating large and complex computing infrastructures in a reliable and cost effective manner to explore novel ways to leverage cloud computing resources in the atmospheric science environment. Our case study presents the project requirements and then examines the fit of AWS with the LaRC computing model. We also discuss the evaluation metrics, feasibility, and outcomes and close the case study with the lessons we learned that would apply to others interested in exploring the implementation of the AWS system in their own atmospheric science computing environments.

  13. Computer vs. Workbook Instruction in Second Language Acquisition.

    ERIC Educational Resources Information Center

    Nagata, Noriko

    1996-01-01

    Compares the effectiveness of Nihongo-CALI (Japanese Computer Assisted Language Instruction) with non-CALI workbook instruction. Findings reveal that given the same grammar notes and exercises, ongoing intelligent computer feedback is more effective than simple workbook answer sheets for developing learners' grammatical skill in producing Japanese…

  14. The Effect of Computer Assisted and Computer Based Teaching Methods on Computer Course Success and Computer Using Attitudes of Students

    ERIC Educational Resources Information Center

    Tosun, Nilgün; Suçsuz, Nursen; Yigit, Birol

    2006-01-01

    The purpose of this research was to investigate the effects of computer-assisted and computer-based instructional methods on students' achievement in computer classes and on their attitudes toward using computers. The study, which was completed in 6 weeks, was carried out with 94 sophomores studying in the formal education program of Primary…

  15. Understanding initial undergraduate expectations and identity in computing studies

    NASA Astrophysics Data System (ADS)

    Kinnunen, Päivi; Butler, Matthew; Morgan, Michael; Nylen, Aletta; Peters, Anne-Kathrin; Sinclair, Jane; Kalvala, Sara; Pesonen, Erkki

    2018-03-01

    There is growing appreciation of the importance of understanding the student perspective in Higher Education (HE) at both institutional and international levels. This is particularly important in Science, Technology, Engineering and Mathematics subjects such as Computer Science (CS) and Engineering, in which industry needs are high but so are student dropout rates. An important factor to consider is the management of students' initial expectations of university study and career. This paper reports on a study of CS first-year students' expectations across three European countries using qualitative data from student surveys and essays. Expectation is examined from both short-term (topics to be studied) and long-term (career goals) perspectives. Tackling these issues will help paint a picture of computing education through students' eyes and explore their vision of computing's role, and their own role, in society. It will also help educators prepare students more effectively for university study and to improve the student experience.

  16. Quantitative proteomic study of Aspergillus Fumigatus secretome revealed deamidation of secretory enzymes.

    PubMed

    Adav, Sunil S; Ravindran, Anita; Sze, Siu Kwan

    2015-04-24

    Aspergillus sp. plays an essential role in lignocellulosic biomass recycling and is also exploited as a cell factory for the production of industrial enzymes. This study profiled the secretome of Aspergillus fumigatus when grown with cellulose, xylan and starch by high-throughput quantitative proteomics using isobaric tags for relative and absolute quantification (iTRAQ). Post-translational modifications (PTMs) of proteins play a critical role in protein functions; however, our understanding of the PTMs in secretory proteins is limited. Here, we present the identification of PTMs such as deamidation of secreted proteins of A. fumigatus. This study quantified diverse groups of extracellular secreted enzymes, and their functional classification revealed cellulases and glycoside hydrolases (32.9%), amylases (0.9%), hemicellulases (16.2%), lignin-degrading enzymes (8.1%), peptidases and proteases (11.7%), chitinases, lipases and phosphatases (7.6%), and proteins with unknown function (22.5%). The comparison of quantitative iTRAQ results revealed that cellulose and xylan stimulate expression of specific cellulases and hemicellulases, and that their abundance level is a function of substrate. In-depth data analysis revealed deamidation as a major PTM of key cellulose-hydrolyzing enzymes like endoglucanases, cellobiohydrolases and glucosidases. Hemicellulose-degrading endo-1,4-beta-xylanase, monosidases, xylosidases, lignin-degrading laccase, isoamyl alcohol oxidase and oxidoreductases were also found to be deamidated. Filamentous fungi play an essential role in lignocellulosic biomass recycling, and fungal strains belonging to Aspergillus have also been exploited as cell factories for the production of organic acids, pharmaceuticals, and industrially important enzymes. In this study, extracellular proteins secreted by thermophilic A. fumigatus when grown with cellulose, xylan and starch were profiled using isobaric tags for relative and absolute quantification (iTRAQ) by

  17. Computational studies of the 2D self-assembly of bacterial microcompartment shell proteins

    NASA Astrophysics Data System (ADS)

    Mahalik, Jyoti; Brown, Kirsten; Cheng, Xiaolin; Fuentes-Cabrera, Miguel

    Bacterial microcompartments (BMCs) are subcellular organelles that exist within a wide variety of bacteria and function like nano-reactors. Among the different types of BMCs known, the carboxysome has been studied the most. The carboxysome plays an important role in the transport of metabolites across its outer proteinaceous shell. Plenty of studies have investigated the structure of this shell, yet little is known about its self-assembly. Understanding the self-assembly process of the BMC shell might allow disrupting its functioning and designing new synthetic nano-reactors. We have investigated the self-assembly process of a major protein component of the carboxysome's shell using a Monte Carlo technique that employed a coarse-grained protein model calibrated with the all-atomistic potential of mean force. The simulations reveal that this protein self-assembles into clusters that resemble those seen experimentally in 2D layers. Further analysis of the simulation results suggests that the 2D self-assembly of the carboxysome's facets is driven by a nucleation-growth process, which in turn could play an important role in the hierarchical self-assembly of BMC shells in general. Acknowledgments: 1. Science Undergraduate Laboratory Internships, ORNL. 2. Oak Ridge Leadership Computing Facility, ORNL.
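    The paper's coarse-grained Monte Carlo model is calibrated against an all-atom potential of mean force; as a much simpler illustration of the same mechanism (an assumption-laden toy of my own, not the authors' model), a 2D lattice gas with nearest-neighbour attraction already shows Metropolis moves driving dispersed particles into clusters:

```python
import math
import random

def neighbors(site, n):
    x, y = site
    return [((x + 1) % n, y), ((x - 1) % n, y), (x, (y + 1) % n), (x, (y - 1) % n)]

def bond_count(occ, n):
    """Number of occupied nearest-neighbour pairs (periodic lattice, each pair once)."""
    return sum(1 for s in occ
               for t in (((s[0] + 1) % n, s[1]), (s[0], (s[1] + 1) % n))
               if t in occ)

def simulate(n=12, m=40, steps=50000, J=1.0, T=0.3, seed=0):
    """Metropolis moves with energy E = -J * (number of neighbour bonds)."""
    rng = random.Random(seed)
    sites = [(x, y) for x in range(n) for y in range(n)]
    occ = set(rng.sample(sites, m))
    start_bonds = bond_count(occ, n)
    for _ in range(steps):
        old = rng.choice(sorted(occ))  # sorted for determinism given the seed
        new = rng.choice(neighbors(old, n))
        if new in occ:
            continue  # target site occupied: move rejected
        occ.remove(old)
        d_e = -J * (sum(1 for t in neighbors(new, n) if t in occ)
                    - sum(1 for t in neighbors(old, n) if t in occ))
        if d_e <= 0 or rng.random() < math.exp(-d_e / T):
            occ.add(new)   # accept the move
        else:
            occ.add(old)   # reject: restore the particle
    return start_bonds, bond_count(occ, n), len(occ)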

  18. A Study of Young Children's Metaknowing Talk: Learning Experiences with Computers

    ERIC Educational Resources Information Center

    Choi, Ji-Young

    2010-01-01

    This research project was undertaken in a time of increasing emphasis on the exploration of young children's learning and thinking at computers. The purpose of this study was to describe and interpret the characteristics of metaknowing talk that occurred during learning experiences with computers in a kindergarten community of learners. This…

  19. Task Selection, Task Switching and Multitasking during Computer-Based Independent Study

    ERIC Educational Resources Information Center

    Judd, Terry

    2015-01-01

    Detailed logs of students' computer use, during independent study sessions, were captured in an open-access computer laboratory. Each log consisted of a chronological sequence of tasks representing either the application or the Internet domain displayed in the workstation's active window. Each task was classified using a three-tier schema…

  20. Analysis of Climatic and Environmental Changes Using CLEARS Web-GIS Information-Computational System: Siberia Case Study

    NASA Astrophysics Data System (ADS)

    Titov, A. G.; Gordov, E. P.; Okladnikov, I.; Shulgina, T. M.

    2011-12-01

    Analysis of recent climatic and environmental changes in Siberia performed on the basis of the CLEARS (CLimate and Environment Analysis and Research System) information-computational system is presented. The system was developed using the specialized software framework for rapid development of thematic information-computational systems based on Web-GIS technologies. It comprises structured environmental datasets, computational kernel, specialized web portal implementing web mapping application logic, and graphical user interface. Functional capabilities of the system include a number of procedures for mathematical and statistical analysis, data processing and visualization. At present a number of georeferenced datasets is available for processing including two editions of NCEP/NCAR Reanalysis, JMA/CRIEPI JRA-25 Reanalysis, ECMWF ERA-40 and ERA Interim Reanalysis, meteorological observation data for the territory of the former USSR, and others. Firstly, using functionality of the computational kernel employing approved statistical methods it was shown that the most reliable spatio-temporal characteristics of surface temperature and precipitation in Siberia in the second half of 20th and beginning of 21st centuries are provided by ERA-40/ERA Interim Reanalysis and APHRODITE JMA Reanalysis, respectively. Namely those Reanalyses are statistically consistent with reliable in situ meteorological observations. Analysis of surface temperature and precipitation dynamics for the territory of Siberia performed on the base of the developed information-computational system reveals fine spatial and temporal details in heterogeneous patterns obtained for the region earlier. Dynamics of bioclimatic indices determining climate change impact on structure and functioning of regional vegetation cover was investigated as well. Analysis shows significant positive trends of growing season length accompanied by statistically significant increase of sum of growing degree days and total

  1. Acting without seeing: Eye movements reveal visual processing without awareness


    PubMed Central

    Spering, Miriam; Carrasco, Marisa

    2015-01-01

    Visual perception and eye movements are considered to be tightly linked. Diverse fields, ranging from developmental psychology to computer science, utilize eye tracking to measure visual perception. However, this prevailing view has been challenged by recent behavioral studies. We review converging evidence revealing dissociations between the contents of perceptual awareness and different types of eye movements. Such dissociations reveal situations in which eye movements are sensitive to particular visual features that fail to modulate perceptual reports. We also discuss neurophysiological, neuroimaging and clinical studies supporting the role of subcortical pathways for visual processing without awareness. Our review links awareness to perceptual-eye movement dissociations and furthers our understanding of the brain pathways underlying vision and movement with and without awareness. PMID:25765322

  2. Two Studies Examining Argumentation in Asynchronous Computer Mediated Communication

    ERIC Educational Resources Information Center

    Joiner, Richard; Jones, Sarah; Doherty, John

    2008-01-01

    Asynchronous computer mediated communication (CMC) would seem to be an ideal medium for supporting development in student argumentation. This paper investigates this assumption through two studies. The first study compared asynchronous CMC with face-to-face discussions. The transactional and strategic level of the argumentation (i.e. measures of…

  3. Computational Study of Scenarios Regarding Explosion Risk Mitigation

    NASA Astrophysics Data System (ADS)

    Vlasin, Nicolae-Ioan; Mihai Pasculescu, Vlad; Florea, Gheorghe-Daniel; Cornel Suvar, Marius

    2016-10-01

    Exploration to discover new deposits of natural gas, improved techniques to exploit these resources, and new ways to convert the heat capacity of these gases into industrially usable energy are research areas of great interest around the globe. But all activities involving the handling of natural gas (exploitation, transport, combustion) are subject to the same type of risk: the risk of explosion. Experiments on physical scenarios to determine ways to reduce this risk can be extremely costly, requiring suitable premises, equipment and apparatus, manpower and time, and, not least, presenting a risk of personnel injury. Taking the above into account, the present paper deals with the possibility of studying scenarios of gas-explosion-type events in the virtual domain, exemplified by a computer simulation of a stoichiometric air-methane explosion (methane is the main component of natural gas). The advantages of computer assistance include the possibility of using complex virtual geometries of any form as the area where the phenomenon unfolds, the reuse of the same geometry for an unlimited number of initial-parameter settings, total elimination of the risk of personnel injury, and decreased execution time. Although computer simulations consume hardware resources and require personnel specialized in CFD (Computational Fluid Dynamics) techniques, the costs and risks associated with these methods are greatly diminished, while offering a major benefit in terms of execution time.

  4. An Inviscid Computational Study of the Space Shuttle Orbiter and Several Damaged Configurations

    NASA Technical Reports Server (NTRS)

    Prabhu, Ramadas K.; Merski, N. Ronald (Technical Monitor)

    2004-01-01

    Inviscid aerodynamic characteristics of the Space Shuttle Orbiter were computed in support of the Columbia Accident Investigation. The unstructured grid software FELISA was used and computations were done using freestream conditions corresponding to those in the NASA Langley 20-Inch Mach 6 CF4 tunnel test section. The angle of attack was held constant at 40 degrees. The baseline (undamaged) configuration and a large number of damaged configurations of the Orbiter were studied. Most of the computations were done on a half model. However, one set of computations was done using the full model to study the effect of sideslip. The differences in the aerodynamic coefficients for the damaged and the baseline configurations were computed. Simultaneously with the computations reported here, tests were being done on a scale model of the Orbiter in the 20-Inch Mach 6 CF4 tunnel to measure the deltas. The present computations complemented the CF4 tunnel test, and provided aerodynamic coefficients of the Orbiter as well as its components. Further, they also provided details of the flow field.

  5. Errors in finite-difference computations on curvilinear coordinate systems

    NASA Technical Reports Server (NTRS)

    Mastin, C. W.; Thompson, J. F.

    1980-01-01

    Curvilinear coordinate systems were used extensively to solve partial differential equations on arbitrary regions. An analysis of truncation error in the computation of derivatives revealed why numerical results may be erroneous. A more accurate method of computing derivatives is presented.
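    The kind of accuracy loss this truncation-error analysis explains can be seen directly: the central difference is second-order accurate on a uniform grid, but on a stretched grid with unequal spacings (as arises under a curvilinear mapping) it degrades to first order. A minimal sketch of my own, using sin(x) as the test function:

```python
import math

def central_diff(f, x, h_left, h_right):
    """Two-point difference with one-sided spacings; classic central formula when equal.

    Taylor expansion gives error (h_right - h_left)/2 * f'' + O(h^2),
    so the leading term vanishes only when h_left == h_right.
    """
    return (f(x + h_right) - f(x - h_left)) / (h_left + h_right)

def errors(h):
    exact = math.cos(1.0)  # derivative of sin at x = 1
    uniform = abs(central_diff(math.sin, 1.0, h, h) - exact)
    stretched = abs(central_diff(math.sin, 1.0, h, 1.5 * h) - exact)  # unequal spacing
    return uniform, stretched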

  6. Computer-Related Success and Failure: A Longitudinal Field Study of the Factors Influencing Computer-Related Performance.

    ERIC Educational Resources Information Center

    Rozell, E. J.; Gardner, W. L., III

    1999-01-01

    A model of the intrapersonal processes impacting computer-related performance was tested using data from 75 manufacturing employees in a computer training course. Gender, computer experience, and attributional style were predictive of computer attitudes, which were in turn related to computer efficacy, task-specific performance expectations, and…

  7. Computer Page: Computer Studies for All--A Wider Approach

    ERIC Educational Resources Information Center

    Edens, A. J.

    1975-01-01

    An approach to teaching children aged 12 through 14 a substantial course about computers is described. Topics covered include simple algorithms, information and communication, man-machine communication, the concept of a system, the definition of a system, and the use of files. (SD)

  8. Student Engagement with Computer-Generated Feedback: A Case Study

    ERIC Educational Resources Information Center

    Zhang, Zhe

    2017-01-01

    In order to benefit from feedback on their writing, students need to engage effectively with it. This article reports a case study on student engagement with computer-generated feedback, known as automated writing evaluation (AWE) feedback, in an EFL context. Differing from previous studies that explored commercially available AWE programs, this…

  9. Computed Tomography Studies of Lung Mechanics

    PubMed Central

    Simon, Brett A.; Christensen, Gary E.; Low, Daniel A.; Reinhardt, Joseph M.

    2005-01-01

    The study of lung mechanics has progressed from global descriptions of lung pressure and volume relationships to the high-resolution, three-dimensional, quantitative measurement of dynamic regional mechanical properties and displacements. X-ray computed tomography (CT) imaging is ideally suited to the study of regional lung mechanics in intact subjects because of its high spatial and temporal resolution, correlation of functional data with anatomic detail, increasing volumetric data acquisition, and the unique relationship between CT density and lung air content. This review presents an overview of CT measurement principles and limitations for the study of regional mechanics, reviews some of the early work that set the stage for modern imaging approaches and impacted the understanding and management of patients with acute lung injury, and presents evolving novel approaches for the analysis and application of dynamic volumetric lung image data. PMID:16352757
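    The density-air content relationship the review refers to is commonly exploited through a simple linear mixture model; the sketch below assumes lung tissue at ~0 HU and air at -1000 HU, a common approximation rather than the review's exact formulation.

```python
def air_fraction_from_hu(hu):
    """Fractional air content of a voxel from its CT number, assuming a
    linear air/tissue mixture with tissue ~ 0 HU and air ~ -1000 HU."""
    frac = hu / -1000.0
    return max(0.0, min(1.0, frac))  # clamp to the physical range [0, 1]

# A voxel at -700 HU is ~70% air; applying this voxel-wise across
# registered CT volumes tracks regional aeration changes over a breath.
aeration = air_fraction_from_hu(-700.0)
```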

  10. Using Computational and Mechanical Models to Study Animal Locomotion

    PubMed Central

    Miller, Laura A.; Goldman, Daniel I.; Hedrick, Tyson L.; Tytell, Eric D.; Wang, Z. Jane; Yen, Jeannette; Alben, Silas

    2012-01-01

    Recent advances in computational methods have made realistic large-scale simulations of animal locomotion possible. This has resulted in numerous mathematical and computational studies of animal movement through fluids and over substrates with the purpose of better understanding organisms’ performance and improving the design of vehicles moving through air and water and on land. This work has also motivated the development of improved numerical methods and modeling techniques for animal locomotion that is characterized by the interactions of fluids, substrates, and structures. Despite the large body of recent work in this area, the application of mathematical and numerical methods to improve our understanding of organisms in the context of their environment and physiology has remained relatively unexplored. Nature has evolved a wide variety of fascinating mechanisms of locomotion that exploit the properties of complex materials and fluids, but only recently are the mathematical, computational, and robotic tools available to rigorously compare the relative advantages and disadvantages of different methods of locomotion in variable environments. Similarly, advances in computational physiology have only recently allowed investigators to explore how changes at the molecular, cellular, and tissue levels might lead to changes in performance at the organismal level. In this article, we highlight recent examples of how computational, mathematical, and experimental tools can be combined to ultimately answer the questions posed in one of the grand challenges in organismal biology: “Integrating living and physical systems.” PMID:22988026

  11. Doctors' experience with handheld computers in clinical practice: qualitative study.

    PubMed

    McAlearney, Ann Scheck; Schweikhart, Sharon B; Medow, Mitchell A

    2004-05-15

    To examine doctors' perspectives about their experiences with handheld computers in clinical practice. Qualitative study of eight focus groups consisting of doctors with diverse training and practice patterns. Six practice settings across the United States and two additional focus group sessions held at a national meeting of general internists. 54 doctors who did or did not use handheld computers. Doctors who used handheld computers in clinical practice seemed generally satisfied with them and reported diverse patterns of use. Users perceived that the devices helped them increase productivity and improve patient care. Barriers to use concerned the device itself and personal and perceptual constraints, with perceptual factors such as comfort with technology, preference for paper, and the impression that the devices are not easy to use somewhat difficult to overcome. Participants suggested that organisations can help promote handheld computers by providing advice on purchase, usage, training, and user support. Participants expressed concern about reliability and security of the device but were particularly concerned about dependency on the device and over-reliance as a substitute for clinical thinking. Doctors expect handheld computers to become more useful, and most seem interested in leveraging (getting the most value from) their use. Key opportunities with handheld computers included their use as a stepping stone to build doctors' comfort with other information technology and ehealth initiatives and providing point of care support that helps improve patient care.

  12. Ethiopian Population Dermatoglyphic Study Reveals Linguistic Stratification of Diversity

    PubMed Central

    2015-01-01

    The manifestation of ethnic, blood type, and gender-wise population variation in dermatoglyphic traits is of interest for assessing intra-group diversity and differentiation. The present study reports on the analysis of qualitative and quantitative finger dermatoglyphic traits of 382 individuals cross-sectionally sampled from an administrative region of Ethiopia, comprising five ethnic cohorts of Afro-Asiatic and Nilo-Saharan affiliation. These dermatoglyphic parameters were then applied in the assessment of diversity and differentiation, including heterozygosity, fixation, panmixia, Wahlund's variance, Nei's measure of genetic diversity, and thumb and finger pattern genotypes, which were in turn used in homology inferences as summarized by a neighbour-joining tree constructed from Nei's standard genetic distance. Results revealed significant correlations between dermatoglyphic and population parameters that were further found to be in concordance with the historical accounts of the ethnic groups. Inferences such as the ancient north-eastern presence and subsequent admixture events of the Oromos (PII = 15.01), the high diversity of the Amharas (H = 0.1978, F = 0.6453, and P = 0.4144), and the Nilo-Saharan origin of the Berta group (PII = 10.66) are evidence of this. The study further tested the possibility of applying dermatoglyphics in population genetic and anthropological research, highlighting the prospect of developing a method to trace population origins and ancient movement patterns. Additionally, linguistic clustering was found to be significant for the Ethiopian population, coinciding with recent genome-wide studies that have found linguistic clustering to be more informative than geographical patterning in the Ethiopian context. Finally, dermatoglyphic markers were shown to have strong potential as non-invasive preliminary tools applicable prior to genetic studies to analyze ethnically sub
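    Two of the diversity measures named in the abstract have standard textbook forms that can be sketched directly; the allele frequencies below are made up for illustration and are not the study's data.

```python
import math

def expected_heterozygosity(freqs):
    """Nei's expected heterozygosity for one locus: H = 1 - sum(p_i^2),
    where p_i are allele (here, pattern-genotype) frequencies."""
    return 1.0 - sum(p * p for p in freqs)

def nei_standard_distance(px, py):
    """Nei's standard genetic distance for one locus:
    D = -ln( Jxy / sqrt(Jx * Jy) ), with Jxy the mean cross-product of
    frequencies and Jx, Jy the mean squared frequencies in each group."""
    jxy = sum(a * b for a, b in zip(px, py))
    jx = sum(a * a for a in px)
    jy = sum(b * b for b in py)
    return -math.log(jxy / math.sqrt(jx * jy))

pop_a = [0.6, 0.3, 0.1]   # hypothetical frequencies, group A
pop_b = [0.2, 0.5, 0.3]   # hypothetical frequencies, group B
h_a = expected_heterozygosity(pop_a)     # 1 - 0.46 = 0.54
d_ab = nei_standard_distance(pop_a, pop_b)
```

    A matrix of such pairwise distances is what feeds a neighbour-joining tree of the kind the study constructs; identical frequency profiles give D = 0, and D grows as the profiles diverge.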

  13. Computational study of the heterodimerization between μ and δ receptors

    NASA Astrophysics Data System (ADS)

    Liu, Xin; Kai, Ming; Jin, Lian; Wang, Rui

    2009-06-01

    A growing body of evidence indicates that G protein-coupled receptors exist as homo- or heterodimers in the living cell. The heterodimerization between μ and δ opioid receptors has attracted particular interest, as it is reported to display novel pharmacological and signalling regulation properties. In this study, we constructed full-length 3D models of the μ and δ opioid receptors using homology modelling. A threading program was used to predict possible templates for the N- and C-terminal domains. Then, a 30 ns molecular dynamics simulation was performed with each receptor embedded in an explicit membrane-water environment to refine and explore the conformational space. Based on the structures extracted from the molecular dynamics, the likely interface of the μ-δ heterodimer was investigated through analysis of protein-protein docking, clustering, shape complementarity, and interaction energy. The computational modelling revealed that the most likely interface of the heterodimer is formed between the transmembrane (TM) 1 and 7 domains of the μ receptor and the TM4 and TM5 domains of the δ receptor, with emphasis on μ-TM1 and δ-TM4; the next most likely interface was μ(TM6,7)-δ(TM4,5), with emphasis on μ-TM6 and δ-TM4. Our results are consistent with previous reports.

  14. Postural dynamism during computer mouse and keyboard use: A pilot study.

    PubMed

    Van Niekerk, S M; Fourie, S M; Louw, Q A

    2015-09-01

    Prolonged sedentary computer use is a risk factor for musculoskeletal pain. The aim of this study was to explore postural dynamism during two common computer tasks, namely mouse use and keyboard typing. Postural dynamism was defined as the total number of postural changes that occurred during the data capture period. Twelve participants were recruited to perform a mouse and a typing task; the data of only eight participants could be analysed. A 3D motion analysis system measured the number of cervical and thoracic postural changes as well as the range in which the postural changes occurred. The study findings illustrate that there is less postural dynamism of the cervical and thoracic spinal regions during computer mouse use than during keyboard typing. Copyright © 2015 Elsevier Ltd and The Ergonomics Society. All rights reserved.

  15. Experiences of Computer Science Curriculum Design: A Phenomenological Study

    ERIC Educational Resources Information Center

    Sloan, Arthur; Bowe, Brian

    2015-01-01

    This paper presents a qualitative study of 12 computer science lecturers' experiences of curriculum design of several degree programmes during a time of transition from year-long to semesterised courses, due to institutional policy change. The background to the study is outlined, as are the reasons for choosing the research methodology. The main…

  16. Computer vision syndrome: a study of knowledge and practices in university students.

    PubMed

    Reddy, S C; Low, C K; Lim, Y P; Low, L L; Mardina, F; Nursaleha, M P

    2013-01-01

    Computer vision syndrome (CVS) is a condition in which a person experiences one or more eye symptoms as a result of prolonged work on a computer. The aims were to determine the prevalence of CVS symptoms and the knowledge and practices of computer use among students at different universities in Malaysia, and to evaluate the association of various factors in computer use with the occurrence of symptoms. In a cross-sectional questionnaire survey, data were collected from college students on demography, use of spectacles, duration of daily continuous computer use, symptoms of CVS, preventive measures taken to reduce the symptoms, use of a radiation filter on the computer screen, and lighting in the room. A total of 795 students, aged between 18 and 25 years, from five universities in Malaysia were surveyed. The prevalence of CVS symptoms (one or more) was found to be 89.9%; the most disturbing symptom was headache (19.7%), followed by eye strain (16.4%). Students who used a computer for more than 2 hours per day experienced significantly more symptoms of CVS (p=0.0001). Looking at far objects in between work was significantly (p=0.0008) associated with a lower frequency of CVS symptoms. The use of a radiation filter on the screen (p=0.6777) did not help in reducing CVS symptoms. Ninety percent of university students in Malaysia experienced symptoms related to CVS, seen more often in those who used a computer for more than 2 hours continuously per day. © NEPjOPH.

  17. Excessive computer game playing: evidence for addiction and aggression?

    PubMed

    Grüsser, S M; Thalemann, R; Griffiths, M D

    2007-04-01

    Computer games have become an ever-increasing part of many adolescents' day-to-day lives. Coupled with this phenomenon, reports of excessive gaming (computer game playing), described as "computer/video game addiction", have been discussed in the popular press as well as in recent scientific research. The aim of the present study was to investigate the addictive potential of gaming as well as the relationship between excessive gaming and aggressive attitudes and behavior. A sample comprising 7069 gamers answered two questionnaires online. The data revealed that 11.9% of participants (840 gamers) fulfilled diagnostic criteria of addiction concerning their gaming behavior, while there was only weak evidence for the assumption that aggressive behavior is interrelated with excessive gaming in general. The results of this study support the assumption that playing games even without monetary reward can meet criteria of addiction. Hence, the addictive potential of gaming should be taken into consideration in prevention and intervention.

  18. Computational Study of Inlet Active Flow Control

    DTIC Science & Technology

    2007-05-01

    AFRL-VA-WP-TR-2007-3077: Computational Study of Inlet Active Flow Control, Delivery Order 0005. Authors: Dr. Sonya T. Smith (Howard University); Dr. Angela Scribben and Matthew Goettke (AFRL/VAAI).

  19. Brain single-photon emission computed tomography in fetal alcohol syndrome: a case report and study implications.

    PubMed

    Codreanu, Ion; Yang, JiGang; Zhuang, Hongming

    2012-12-01

    The indications of brain single-photon emission computed tomography (SPECT) in fetal alcohol syndrome are not clearly defined, even though the condition is recognized as one of the most common causes of mental retardation. This article reports a case of a 9-year-old adopted girl with developmental delay, mildly dysmorphic facial features, and behavioral and cognitive abnormalities. Extensive investigations including genetic studies and brain magnetic resonance imaging (MRI) revealed no abnormalities, and a diagnosis of fetal alcohol syndrome was considered since official diagnostic criteria were met. A brain SPECT was requested and showed severely decreased tracer activity in the thalami, basal ganglia, and temporal lobes on both sides, the overall findings being consistent with the established diagnosis of fetal alcohol syndrome. With increasing availability of functional brain imaging, the study indications and possible ethical implications in suspected prenatal alcohol exposure or even before adoption need further consideration. In this patient, SPECT was the only test to yield positive results.

  20. Computer-associated health complaints and sources of ergonomic instructions in computer-related issues among Finnish adolescents: a cross-sectional study.

    PubMed

    Hakala, Paula T; Saarni, Lea A; Ketola, Ritva L; Rahkola, Erja T; Salminen, Jouko J; Rimpelä, Arja H

    2010-01-11

    The use of computers has increased among adolescents, as have musculoskeletal symptoms. There is evidence that these symptoms can be reduced through an ergonomics approach and through education. The purpose of this study was to examine where adolescents had received ergonomic instructions related to computer use, and whether receiving these instructions was associated with a reduced prevalence of computer-associated health complaints. Mailed survey with nationally representative sample of 12 to 18-year-old Finns in 2001 (n = 7292, response rate 70%). In total, 6961 youths reported using a computer. We tested the associations of computer use time and received ergonomic instructions (predictor variables) with computer-associated health complaints (outcome variables) using logistic regression analysis. To prevent computer-associated complaints, 61.2% reported having been instructed to arrange their desk/chair/screen in the right position, 71.5% to take rest breaks. The older age group (16-18 years) reported receiving instructions or being self-instructed more often than the 12- to 14-year-olds (p < 0.001). Among both age groups the sources of instructions included school (33.1%), family (28.6%), self (self-instructed) (12.5%), ICT-related (8.6%), friends (1.5%) and health professionals (0.8%). Receiving instructions was not related to lower prevalence of computer-associated health complaints. This report shows that ergonomic instructions on how to prevent computer-related musculoskeletal problems fail to reach a substantial number of children. Furthermore, the reported sources of instructions vary greatly in terms of reliability.

  1. Synthesis, characterization and computational study of the newly synthetized sulfonamide molecule

    NASA Astrophysics Data System (ADS)

    Murthy, P. Krishna; Suneetha, V.; Armaković, Stevan; Armaković, Sanja J.; Suchetan, P. A.; Giri, L.; Rao, R. Sreenivasa

    2018-02-01

    A new compound, N-(2,5-dimethyl-4-nitrophenyl)-4-methylbenzenesulfonamide (NDMPMBS), was derived from 2,5-dimethyl-4-nitroaniline and 4-methylbenzene-1-sulfonyl chloride. The structure was characterized by single-crystal X-ray diffraction (SCXRD) studies and spectroscopic tools. The compound crystallized in the monoclinic crystal system, space group P21/c, with a = 10.0549, b = 18.967, c = 8.3087, β = 103.18 and Z = 4. The type and nature of the intermolecular interactions in the crystal state, investigated by 3D Hirshfeld surfaces and 2D fingerprint plots, revealed that the title compound is stabilized by several interactions. The structural and electronic properties of the title compound were calculated at the DFT/B3LYP/6-311++G(d,p) level of theory. Computationally obtained spectral data were compared with experimental results, showing excellent mutual agreement. Assignment of each vibrational wave number was done on the basis of the potential energy distribution (PED). Investigation of local reactivity descriptors encompassed visualization of molecular electrostatic potential (MEP) and average local ionization energy (ALIE) surfaces, visualization of Fukui functions, natural bond orbital (NBO) analysis, bond dissociation energies for hydrogen abstraction (H-BDE), and radial distribution functions (RDF) from molecular dynamics (MD) simulations. MD simulations were also used to investigate the interaction of the NDMPMBS molecule with the 1WKR and 3ETT proteins.

  2. Computational and experimental studies of LEBUs at high device Reynolds numbers

    NASA Technical Reports Server (NTRS)

    Bertelrud, Arild; Watson, R. D.

    1988-01-01

    The present paper summarizes computational and experimental studies for large-eddy breakup devices (LEBUs). LEBU optimization (using a computational approach considering compressibility, Reynolds number, and the unsteadiness of the flow) and experiments with LEBUs at high Reynolds numbers in flight are discussed. The measurements include streamwise as well as spanwise distributions of local skin friction. The unsteady flows around the LEBU devices and far downstream are characterized by strain-gage measurements on the devices and hot-wire readings downstream. Computations are made with available time-averaged and quasi-stationary techniques to find suitable device profiles with minimum drag.

  3. Computer ethics and tertiary level education in Hong Kong

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wong, E.Y.W.; Davison, R.M.; Wade, P.W.

    1994-12-31

    This paper seeks to highlight some ethical issues relating to the increasing proliferation of Information Technology into our everyday lives. The authors explain their understanding of computer ethics, and give some reasons why the study of computer ethics is becoming increasingly pertinent. The paper looks at some of the problems that arise in attempting to develop appropriate ethical concepts in a constantly changing environment, and explores some of the ethical dilemmas arising from the increasing use of computers. Some initial research undertaken to explore the ideas and understanding of tertiary level students in Hong Kong on a number of ethical issues of interest is described, and our findings discussed. We hope that presenting this paper and eliciting subsequent discussion will enable us to draw up more comprehensive guidelines for the teaching of computer related ethics to tertiary level students, as well as reveal some directions for future research.

  4. Computer-assisted photogrammetric mapping systems for geologic studies-A progress report

    USGS Publications Warehouse

    Pillmore, C.L.; Dueholm, K.S.; Jepsen, H.S.; Schuch, C.H.

    1981-01-01

    Photogrammetry has played an important role in geologic mapping for many years; however, only recently have attempts been made to automate mapping functions for geology. Computer-assisted photogrammetric mapping systems for geologic studies have been developed and are currently in use in offices of the Geological Survey of Greenland at Copenhagen, Denmark, and the U.S. Geological Survey at Denver, Colorado. Though differing somewhat, the systems are similar in that they integrate Kern PG-2 photogrammetric plotting instruments and small desk-top computers that are programmed to perform special geologic functions and operate flat-bed plotters by means of specially designed hardware and software. A z-drive capability, in which stepping motors control the z-motions of the PG-2 plotters, is an integral part of both systems. This feature enables the computer to automatically position the floating mark on computer-calculated, previously defined geologic planes, such as contacts or the base of coal beds, throughout the stereoscopic model in order to improve the mapping capabilities of the instrument and to aid in correlation and tracing of geologic units. The common goal is to enhance the capabilities of the PG-2 plotter and provide a means by which geologists can make conventional geologic maps more efficiently and explore ways to apply computer technology to geologic studies. ?? 1981.

  5. Performance Evaluation in Network-Based Parallel Computing

    NASA Technical Reports Server (NTRS)

    Dezhgosha, Kamyar

    1996-01-01

    Network-based parallel computing is emerging as a cost-effective alternative for solving many problems which require use of supercomputers or massively parallel computers. The primary objective of this project has been to conduct experimental research on performance evaluation for clustered parallel computing. First, a testbed was established by augmenting our existing SUNSPARCs' network with PVM (Parallel Virtual Machine) which is a software system for linking clusters of machines. Second, a set of three basic applications were selected. The applications consist of a parallel search, a parallel sort, a parallel matrix multiplication. These application programs were implemented in C programming language under PVM. Third, we conducted performance evaluation under various configurations and problem sizes. Alternative parallel computing models and workload allocations for application programs were explored. The performance metric was limited to elapsed time or response time which in the context of parallel computing can be expressed in terms of speedup. The results reveal that the overhead of communication latency between processes in many cases is the restricting factor to performance. That is, coarse-grain parallelism which requires less frequent communication between processes will result in higher performance in network-based computing. Finally, we are in the final stages of installing an Asynchronous Transfer Mode (ATM) switch and four ATM interfaces (each 155 Mbps) which will allow us to extend our study to newer applications, performance metrics, and configurations.
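    The speedup metric and the communication-latency effect this record describes can be illustrated with a toy cost model; the worker counts and timings below are hypothetical, not measurements from the PVM testbed.

```python
def speedup(t_serial, t_parallel):
    """Speedup = elapsed serial time / elapsed parallel time."""
    return t_serial / t_parallel

def parallel_time(t_serial, n_workers, comm_overhead):
    """Idealized model: perfectly divisible work plus a fixed
    communication cost. Coarse-grain parallelism corresponds to a
    small comm_overhead relative to the work per process."""
    return t_serial / n_workers + comm_overhead

t1 = 100.0  # hypothetical serial elapsed time (s)
fine_grain = speedup(t1, parallel_time(t1, 8, 30.0))   # chatty processes
coarse_grain = speedup(t1, parallel_time(t1, 8, 2.0))  # infrequent messages
```

    With eight workers the coarse-grain case approaches the ideal speedup of 8, while the communication-heavy case stalls well below it, matching the report's conclusion that latency between processes is often the restricting factor.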

  6. Comparative study of viruses with the computer-aided phase microscope AIRYSCAN

    NASA Astrophysics Data System (ADS)

    Tychinsky, Vladimir P.; Koufal, Georgy E.; Perevedentseva, Elena V.; Vyshenskaia, Tatiana V.

    1996-12-01

    Traditionally, viruses are studied with scanning electron microscopy (SEM) after a complicated sample-preparation procedure, without the possibility of studying them under natural conditions. We obtained images of viruses (Vaccinia virus, Rotavirus) and rickettsias (Rickettsia provazekii, Coxiella burnetti) in the native state with the computer-aided phase microscope AIRYSCAN: an interference microscope of Linnik layout with phase modulation of the reference wave, a dissector image tube as coordinate-sensitive photodetector, and computer processing of the phase image. The light source was a He-Ne laser. The main result is the coincidence of the dimensions and shape of the phase images with available information on virus morphology obtained by SEM and other methods. The fine structure of surface and nuclei is observed. This method may be applied for virus recognition and express identification, investigation of virus structure, and the analysis of cell-virus interaction.

  7. Fault-tolerant building-block computer study

    NASA Technical Reports Server (NTRS)

    Rennels, D. A.

    1978-01-01

    Ultra-reliable core computers are required for improving the reliability of complex military systems. Such computers can provide reliable fault diagnosis, failure circumvention, and, in some cases, serve as an automated repairman for their host systems. A small set of building-block circuits, each of which can be implemented as a single very large scale integration device and used with off-the-shelf microprocessors and memories to build self-checking computer modules (SCCMs), is described. Each SCCM is a microcomputer capable of detecting its own faults during normal operation and designed to communicate with other identical modules over one or more Mil Standard 1553A buses. Several SCCMs can be connected into a network with backup spares to provide fault-tolerant operation, i.e., automated recovery from faults. Alternative fault-tolerant SCCM configurations are discussed, along with the cost and reliability associated with their implementation.

  8. Doctors' experience with handheld computers in clinical practice: qualitative study

    PubMed Central

    McAlearney, Ann Scheck; Schweikhart, Sharon B; Medow, Mitchell A

    2004-01-01

    Objective To examine doctors' perspectives about their experiences with handheld computers in clinical practice. Design Qualitative study of eight focus groups consisting of doctors with diverse training and practice patterns. Setting Six practice settings across the United States and two additional focus group sessions held at a national meeting of general internists. Participants 54 doctors who did or did not use handheld computers. Results Doctors who used handheld computers in clinical practice seemed generally satisfied with them and reported diverse patterns of use. Users perceived that the devices helped them increase productivity and improve patient care. Barriers to use concerned the device itself and personal and perceptual constraints, with perceptual factors such as comfort with technology, preference for paper, and the impression that the devices are not easy to use somewhat difficult to overcome. Participants suggested that organisations can help promote handheld computers by providing advice on purchase, usage, training, and user support. Participants expressed concern about reliability and security of the device but were particularly concerned about dependency on the device and over-reliance as a substitute for clinical thinking. Conclusions Doctors expect handheld computers to become more useful, and most seem interested in leveraging (getting the most value from) their use. Key opportunities with handheld computers included their use as a stepping stone to build doctors' comfort with other information technology and ehealth initiatives and providing point of care support that helps improve patient care. PMID:15142920

  9. A Computational Method to Determine Glucose Infusion Rates for Isoglycemic Intravenous Glucose Infusion Study.

    PubMed

    Choi, Karam; Lee, Jung Chan; Oh, Tae Jung; Kim, Myeungseon; Kim, Hee Chan; Cho, Young Min; Kim, Sungwan

    2016-01-01

    The results of the isoglycemic intravenous glucose infusion (IIGI) study need to mimic the dynamic glucose profiles during the oral glucose tolerance test (OGTT) to accurately calculate the incretin effect. The glucose infusion rates during IIGI studies have historically been determined by experienced research personnel using the manual ad-hoc method. In this study, a computational method was developed to automatically determine the infusion rates for IIGI study based on a glucose-dynamics model. To evaluate the computational method, 18 subjects with normal glucose tolerance underwent a 75 g OGTT. One-week later, Group 1 (n = 9) and Group 2 (n = 9) underwent IIGI studies using the ad-hoc method and the computational method, respectively. Both methods were evaluated using correlation coefficient, mean absolute relative difference (MARD), and root mean square error (RMSE) between the glucose profiles from the OGTT and the IIGI study. The computational method exhibited significantly higher correlation (0.95 ± 0.03 versus 0.86 ± 0.10, P = 0.019), lower MARD (8.72 ± 1.83% versus 13.11 ± 3.66%, P = 0.002), and lower RMSE (10.33 ± 1.99 mg/dL versus 16.84 ± 4.43 mg/dL, P = 0.002) than the ad-hoc method. The computational method can facilitate IIGI study, and enhance its accuracy and stability. Using this computational method, a high-quality IIGI study can be accomplished without the need for experienced personnel.
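    The three evaluation metrics the study uses can be computed directly from paired profiles; the glucose values below are hypothetical stand-ins, not data from the study.

```python
import math

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def mard(ref, test):
    """Mean absolute relative difference (%), relative to the reference."""
    return 100.0 * sum(abs(t - r) / r for r, t in zip(ref, test)) / len(ref)

def rmse(ref, test):
    """Root mean square error in the reference's units."""
    return math.sqrt(sum((t - r) ** 2 for r, t in zip(ref, test)) / len(ref))

# Hypothetical glucose profiles (mg/dL): OGTT reference vs IIGI reproduction
ogtt = [90.0, 130.0, 160.0, 140.0, 110.0]
iigi = [92.0, 125.0, 155.0, 138.0, 112.0]
r = pearson_r(ogtt, iigi)
m = mard(ogtt, iigi)   # percent
e = rmse(ogtt, iigi)   # mg/dL
```

    Higher r and lower MARD/RMSE indicate that the IIGI glucose profile tracks the OGTT profile more closely, which is the sense in which the computational method outperformed the ad-hoc method.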

  10. Applying standardized uptake values in gallium-67-citrate single-photon emission computed tomography/computed tomography studies and their correlation with blood test results in representative organs.

    PubMed

    Toriihara, Akira; Daisaki, Hiromitsu; Yamaguchi, Akihiro; Yoshida, Katsuya; Isogai, Jun; Tateishi, Ukihide

    2018-05-21

    Recently, semiquantitative analysis using the standardized uptake value (SUV) has been introduced in bone single-photon emission computed tomography/computed tomography (SPECT/CT). Our purposes were to apply an SUV-based semiquantitative analytic method to gallium-67 (67Ga) citrate SPECT/CT and to evaluate the correlation between the SUV of physiological uptake and blood test results in representative organs. The accuracy of the semiquantitative method was validated using a National Electrical Manufacturers Association body phantom study (radioactivity ratio of sphere : background = 4 : 1). Thereafter, 59 patients (34 male and 25 female; mean age, 66.9 years) who had undergone 67Ga-citrate SPECT/CT were retrospectively enrolled in the study. The mean SUV of physiological uptake was calculated for the following organs: the lungs, right atrium, liver, kidneys, spleen, gluteal muscles, and bone marrow. The correlation between physiological uptake and blood test results was evaluated using Pearson's correlation coefficient. The phantom study revealed only 1% error between theoretical and actual SUVs in the background, suggesting sufficient accuracy of the scatter and attenuation corrections. However, a partial volume effect could not be overlooked, particularly in small spheres with a diameter of less than 28 mm. The highest mean SUV was observed in the liver (range: 0.44-4.64), followed by bone marrow (range: 0.33-3.60), spleen (range: 0.52-2.12), and kidneys (range: 0.42-1.45). There was no significant correlation between hepatic uptake and liver function, renal uptake and renal function, or bone marrow uptake and blood cell count (P>0.05). Physiological uptake in 67Ga-citrate SPECT/CT can be represented as SUVs, which are not significantly correlated with the corresponding blood test results.
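    The body-weight-normalized SUV used in such semiquantitative analyses has a standard definition; the sketch below uses hypothetical values and does not reproduce the paper's scatter-, attenuation-, or decay-correction chain.

```python
def suv_bw(activity_conc_kbq_per_ml, injected_kbq, body_weight_g):
    """Body-weight-normalized SUV: tissue activity concentration divided
    by injected activity per gram of body weight (assumes tissue density
    of ~1 g/mL, so a value of 1.0 means uniform dilution of the dose)."""
    return activity_conc_kbq_per_ml / (injected_kbq / body_weight_g)

# Hypothetical decay-corrected values for a liver region of interest:
# 4.2 kBq/mL measured, 185 MBq injected, 70 kg patient
liver_suv = suv_bw(4.2, 185000.0, 70000.0)
```

    Reporting organ uptake as a mean SUV over a region of interest, as the study does, makes uptake comparable across patients with different injected doses and body weights.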

  11. DFT computations on: Crystal structure, vibrational studies and optical investigations of a luminescent self-assembled material

    NASA Astrophysics Data System (ADS)

    Kessentini, A.; Ben Ahmed, A.; Dammak, T.; Belhouchet, M.

    2018-02-01

    The current work reports the growth and physicochemical properties of a novel green-yellow luminescent semi-organic material, 3-picolylammonium bromide, abbreviated (Pico-Br). X-ray diffraction measurements show that the crystal lattice consists of distinct 3-picolylammonium cations and free bromide anions connected via N–H⋯Br and N–H⋯N hydrogen bonds, forming a two-dimensional framework. Comparison of the molecular geometry with its optimized counterpart shows that quantum chemical calculations carried out with the density functional theory (DFT) method reproduce well the structure determined by X-ray diffraction. To provide further insight into the spectroscopic properties, additional characterization of this material was performed with Raman and infrared studies at room temperature. Theoretical computations were performed using the DFT method at the B3LYP/LanL2DZ level of theory, as implemented in the Gaussian 03 program, to study the vibrational spectra of the investigated molecule in the ground state. The optical absorption spectrum inspected by UV-visible absorption reveals a sharp optical gap at 280 nm (4.42 eV), and a strong green photoluminescence emission at 550 nm (2.25 eV) is detected in the photoluminescence (PL) spectrum at room temperature. Using the TD-DFT method, the HOMO-LUMO energy gap and the Mulliken atomic charges were calculated to gain further insight into the material. Good agreement between the theoretical and experimental results was found.

  12. Study of a computer-controlled integrated optical gas-concentration sensor

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Egorov, A A; Egorov, M A; Chekhlova, T K

    2008-08-31

    A computer-controlled integrated optical waveguide sensor based on a diffusion-type optical waveguide with a low attenuation coefficient is developed and studied. It is shown that the response time of the sensor is ~0.15 s. According to tests and computer simulations, the sensor can detect gaseous ammonia in air with a limiting theoretical concentration of ~0.1 ppm for a signal-to-noise ratio of no less than 20. (laser applications and other topics in quantum electronics)

  13. Study Time: Temporal Orientations of Freshmen Students and Computing.

    ERIC Educational Resources Information Center

    Anderson, Kenneth T.; McClard, Anne Page

    1993-01-01

    Examines student domains of study and time and how these relate to use of innovative computing facilities in a dormitory for 61 first-year college students at Brown University in Providence (Rhode Island). This ethnographic study points out how student conceptions of time differ from those of others and how this affects their use of personal…

  14. A Computer Game-Based Method for Studying Bullying and Cyberbullying

    ERIC Educational Resources Information Center

    Mancilla-Caceres, Juan F.; Espelage, Dorothy; Amir, Eyal

    2015-01-01

    Even though previous studies have addressed the relation between face-to-face bullying and cyberbullying, none have studied both phenomena simultaneously. In this article, we present a computer game-based method to study both types of peer aggression among youth. Study participants included fifth graders (N = 93) in two U.S. Midwestern middle…

  15. Biomechanical effects of mobile computer location in a vehicle cab.

    PubMed

    Saginus, Kyle A; Marklin, Richard W; Seeley, Patricia; Simoneau, Guy G; Freier, Stephen

    2011-10-01

    The objective of this research is to determine the best location to place a conventional mobile computer supported by a commercially available mount in a light truck cab. U.S. and Canadian electric utility companies are in the process of integrating mobile computers into their fleet vehicle cabs. There are no publications on the effect of mobile computer location in a vehicle cab on biomechanical loading, performance, and subjective assessment. The authors tested four locations of mobile computers in a light truck cab in a laboratory study to determine how location affected muscle activity of the lower back and shoulders; joint angles of the shoulders, elbows, and wrist; user performance; and subjective assessment. A total of 22 participants were tested in this study. Placing the mobile computer closer to the steering wheel reduced low back and shoulder muscle activity. Joint angles of the shoulders, elbows, and wrists were also closer to neutral angle. Biomechanical modeling revealed substantially less spinal compression and trunk muscle force. In general, there were no practical differences in performance between the locations. Subjective assessment indicated that users preferred the mobile computer to be as close as possible to the steering wheel. Locating the mobile computer close to the steering wheel reduces risk of injuries, such as low back pain and shoulder tendonitis. Results from the study can guide electric utility companies in the installation of mobile computers into vehicle cabs. Results may also be generalized to other industries that use trucklike vehicles, such as construction.

  16. Computational Electromagnetic Studies for Low-Frequency Compensation of the Reflector Impulse-radiating Antenna

    DTIC Science & Technology

    2015-03-26

    Computational Electromagnetic Studies for Low-Frequency Compensation of the Reflector Impulse-Radiating Antenna. Thesis by Casey E. Fillmore, Capt, USAF; presented to the Faculty, Department of Electrical and… (AFIT-ENG-MS-15-M-011, 2015). Distribution Statement A: approved for public release; distribution unlimited.

  17. Reverse logistics system planning for recycling computers hardware: A case study

    NASA Astrophysics Data System (ADS)

    Januri, Siti Sarah; Zulkipli, Faridah; Zahari, Siti Meriam; Shamsuri, Siti Hajar

    2014-09-01

    This paper describes the modeling and simulation of reverse logistics networks for the collection of used computers at a company in Selangor. The study focuses on the design of a reverse logistics network for a used-computer recycling operation. Simulation modeling presented in this work allows the user to analyze the future performance of the network and to understand the complex relationships between the parties involved. The findings from the simulation suggest that the model calculates processing time and resource utilization in a predictable manner. In this study, the simulation model was developed using the Arena simulation package.

  18. The Nimrod computational workbench: a case study in desktop metacomputing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Abramson, D.; Sosic, R.; Foster, I.

    The coordinated use of geographically distributed computers, or metacomputing, can in principle provide more accessible and cost-effective supercomputing than conventional high-performance systems. However, we lack evidence that metacomputing systems can be made easily usable, or that there exist large numbers of applications able to exploit metacomputing resources. In this paper, we present work that addresses both these concerns. The basis for this work is a system called Nimrod that provides a desktop problem-solving environment for parametric experiments. We describe how Nimrod has been extended to support the scheduling of computational resources located in a wide-area environment, and report on an experiment in which Nimrod was used to schedule a large parametric study across the Australian Internet. The experiment provided both new scientific results and insights into Nimrod capabilities. We relate the results of this experiment to lessons learned from the I-WAY distributed computing experiment, and draw conclusions as to how Nimrod and I-WAY-like computing environments should be developed to support desktop metacomputing.

  19. Finite-data-size study on practical universal blind quantum computation

    NASA Astrophysics Data System (ADS)

    Zhao, Qiang; Li, Qiong

    2018-07-01

    The universal blind quantum computation with weak coherent pulses protocol is a practical scheme that allows a client to delegate a computation to a remote server while keeping the computation hidden. However, in the practical protocol, a finite data size will influence the preparation efficiency of the remote blind qubit state preparation (RBSP). In this paper, a modified RBSP protocol with two decoy states is studied for finite data sizes, and the issue of statistical fluctuations is analyzed thoroughly. The theoretical analysis and simulation results show that the two-decoy-state case with statistical fluctuation is closer to the asymptotic case than the one-decoy-state case with statistical fluctuation. In particular, the two-decoy-state protocol can achieve a longer communication distance than the one-decoy-state case in this statistical fluctuation situation.

  20. Computer Science Majors: Sex Role Orientation, Academic Achievement, and Social Cognitive Factors

    ERIC Educational Resources Information Center

    Brown, Chris; Garavalia, Linda S.; Fritts, Mary Lou Hines; Olson, Elizabeth A.

    2006-01-01

    This study examined the sex role orientations endorsed by 188 male and female students majoring in computer science, a male-dominated college degree program. The relations among sex role orientation and academic achievement and social cognitive factors influential in career decision-making self-efficacy were explored. Findings revealed that…

  1. Emotor control: computations underlying bodily resource allocation, emotions, and confidence.

    PubMed

    Kepecs, Adam; Mensh, Brett D

    2015-12-01

    Emotional processes are central to behavior, yet their deeply subjective nature has been a challenge for neuroscientific study as well as for psychiatric diagnosis. Here we explore the relationships between subjective feelings and their underlying brain circuits from a computational perspective. We apply recent insights from systems neuroscience-approaching subjective behavior as the result of mental computations instantiated in the brain-to the study of emotions. We develop the hypothesis that emotions are the product of neural computations whose motor role is to reallocate bodily resources mostly gated by smooth muscles. This "emotor" control system is analogous to the more familiar motor control computations that coordinate skeletal muscle movements. To illustrate this framework, we review recent research on "confidence." Although familiar as a feeling, confidence is also an objective statistical quantity: an estimate of the probability that a hypothesis is correct. This model-based approach helped reveal the neural basis of decision confidence in mammals and provides a bridge to the subjective feeling of confidence in humans. These results have important implications for psychiatry, since disorders of confidence computations appear to contribute to a number of psychopathologies. More broadly, this computational approach to emotions resonates with the emerging view that psychiatric nosology may be best parameterized in terms of disorders of the cognitive computations underlying complex behavior.

  2. Emotor control: computations underlying bodily resource allocation, emotions, and confidence

    PubMed Central

    Kepecs, Adam; Mensh, Brett D.

    2015-01-01

    Emotional processes are central to behavior, yet their deeply subjective nature has been a challenge for neuroscientific study as well as for psychiatric diagnosis. Here we explore the relationships between subjective feelings and their underlying brain circuits from a computational perspective. We apply recent insights from systems neuroscience—approaching subjective behavior as the result of mental computations instantiated in the brain—to the study of emotions. We develop the hypothesis that emotions are the product of neural computations whose motor role is to reallocate bodily resources mostly gated by smooth muscles. This “emotor” control system is analogous to the more familiar motor control computations that coordinate skeletal muscle movements. To illustrate this framework, we review recent research on “confidence.” Although familiar as a feeling, confidence is also an objective statistical quantity: an estimate of the probability that a hypothesis is correct. This model-based approach helped reveal the neural basis of decision confidence in mammals and provides a bridge to the subjective feeling of confidence in humans. These results have important implications for psychiatry, since disorders of confidence computations appear to contribute to a number of psychopathologies. More broadly, this computational approach to emotions resonates with the emerging view that psychiatric nosology may be best parameterized in terms of disorders of the cognitive computations underlying complex behavior. PMID:26869840

  3. A combined computational-experimental analyses of selected metabolic enzymes in Pseudomonas species.

    PubMed

    Perumal, Deepak; Lim, Chu Sing; Chow, Vincent T K; Sakharkar, Kishore R; Sakharkar, Meena K

    2008-09-10

    Comparative genomic analysis has revolutionized our ability to predict the metabolic subsystems that occur in newly sequenced genomes, and to explore the functional roles of the set of genes within each subsystem. These computational predictions can considerably reduce the volume of experimental studies required to assess basic metabolic properties of multiple bacterial species. However, experimental validations are still required to resolve the apparent inconsistencies in the predictions by multiple resources. Here, we present combined computational-experimental analyses on eight completely sequenced Pseudomonas species. Comparative pathway analyses reveal that several pathways within the Pseudomonas species show high plasticity and versatility. Potential bypasses in 11 metabolic pathways were identified. We further confirmed the presence of the enzyme O-acetyl homoserine (thiol) lyase (EC: 2.5.1.49) in P. syringae pv. tomato that revealed inconsistent annotations in KEGG and in the recently published SYSTOMONAS database. These analyses connect and integrate systematic data generation, computational data interpretation, and experimental validation and represent a synergistic and powerful means for conducting biological research.

  4. Computer-associated health complaints and sources of ergonomic instructions in computer-related issues among Finnish adolescents: A cross-sectional study

    PubMed Central

    2010-01-01

    Background The use of computers has increased among adolescents, as have musculoskeletal symptoms. There is evidence that these symptoms can be reduced through an ergonomics approach and through education. The purpose of this study was to examine where adolescents had received ergonomic instructions related to computer use, and whether receiving these instructions was associated with a reduced prevalence of computer-associated health complaints. Methods Mailed survey with nationally representative sample of 12 to 18-year-old Finns in 2001 (n = 7292, response rate 70%). In total, 6961 youths reported using a computer. We tested the associations of computer use time and received ergonomic instructions (predictor variables) with computer-associated health complaints (outcome variables) using logistic regression analysis. Results To prevent computer-associated complaints, 61.2% reported having been instructed to arrange their desk/chair/screen in the right position, 71.5% to take rest breaks. The older age group (16-18 years) reported receiving instructions or being self-instructed more often than the 12- to 14-year-olds (p < 0.001). Among both age groups the sources of instructions included school (33.1%), family (28.6%), self (self-instructed) (12.5%), ICT-related (8.6%), friends (1.5%) and health professionals (0.8%). Receiving instructions was not related to lower prevalence of computer-associated health complaints. Conclusions This report shows that ergonomic instructions on how to prevent computer-related musculoskeletal problems fail to reach a substantial number of children. Furthermore, the reported sources of instructions vary greatly in terms of reliability. PMID:20064250
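
    The association tests described above can be illustrated with a minimal sketch: an odds ratio with a Wald confidence interval computed from a 2×2 table. The paper used logistic regression, of which a single-predictor 2×2 analysis is the simplest special case; the counts below are hypothetical, not the survey's data:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Wald confidence interval for a 2x2 table:

        a = instructed users with complaints
        b = instructed users without complaints
        c = non-instructed users with complaints
        d = non-instructed users without complaints
    """
    odds_ratio = (a * d) / (b * c)
    se_log_or = math.sqrt(1/a + 1/b + 1/c + 1/d)  # standard error of log(OR)
    lo = math.exp(math.log(odds_ratio) - z * se_log_or)
    hi = math.exp(math.log(odds_ratio) + z * se_log_or)
    return odds_ratio, lo, hi

# Hypothetical counts (not from the study):
or_, lo, hi = odds_ratio_ci(30, 120, 35, 105)
```

    A confidence interval that contains 1.0, as with these invented counts, corresponds to the paper's null finding: receiving instructions was not associated with a lower prevalence of complaints.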

  5. Computing Cosmic Cataclysms

    NASA Technical Reports Server (NTRS)

    Centrella, Joan M.

    2010-01-01

    The final merger of two black holes releases a tremendous amount of energy, more than the combined light from all the stars in the visible universe. This energy is emitted in the form of gravitational waves, and observing these sources with gravitational wave detectors requires that we know the pattern or fingerprint of the radiation emitted. Since black hole mergers take place in regions of extreme gravitational fields, we need to solve Einstein's equations of general relativity on a computer in order to calculate these wave patterns. For more than 30 years, scientists have tried to compute these wave patterns. However, their computer codes have been plagued by problems that caused them to crash. This situation has changed dramatically in the past few years, with a series of amazing breakthroughs. This talk will take you on this quest for these gravitational wave patterns, showing how a spacetime is constructed on a computer to build a simulation laboratory for binary black hole mergers. We will focus on the recent advances that are revealing these waveforms, and the dramatic new potential for discoveries that arises when these sources will be observed.

  6. Revealing the Strong Functional Association of adipor2 and cdh13 with adipoq: A Gene Network Study.

    PubMed

    Bag, Susmita; Anbarasu, Anand

    2015-04-01

    In the present study, we have analyzed the functional gene interactions of the adiponectin gene (adipoq). The key role of adipoq is in regulating energy homeostasis, and it functions as a novel signaling molecule for adipose tissue. Modules of highly interconnected genes in the disease-specific adipoq network are derived by integrating gene function and protein interaction data. Among the twenty genes in the adipoq web, adipoq is most effectively connected with two genes: adiponectin receptor 2 (adipor2) and cadherin 13 (cdh13). The functional analysis is done via ontological briefing and candidate disease identification. We observed that the most efficiently interlinked genes connected with adipoq are adipor2 and cdh13. Interestingly, the ontological analysis of adipor2 and cdh13 in the adipoq network reveals that adipoq and adipor2 are involved mostly in glucose and lipid metabolic processes, while cdh13 participates in cell adhesion processes together with adipoq and adipor2. Our computational gene web analysis also predicts potential candidate diseases, indicating the involvement of adipoq, adipor2, and cdh13 not only with obesity but also with breast cancer, leukemia, renal cancer, lung cancer, and cervical cancer. The current study provides researchers a comprehensible layout of the adipoq network, its functional organization, and the candidate diseases associated with it.

  7. To What Degree Are Undergraduate Students Using Their Personal Computers to Support Their Daily Study Practices?

    ERIC Educational Resources Information Center

    Sim, KwongNui; Butson, Russell

    2014-01-01

    This scoping study examines the degree to which twenty-two undergraduate students used their personal computers to support their academic study. The students were selected based on their responses to a questionnaire aimed at gauging their degree of computer skill. Computer activity data was harvested from the personal computers of eighteen…

  8. Space station Simulation Computer System (SCS) study for NASA/MSFC. Volume 5: Study analysis report

    NASA Technical Reports Server (NTRS)

    1989-01-01

    The Simulation Computer System (SCS) is the computer hardware, software, and workstations that will support the Payload Training Complex (PTC) at the Marshall Space Flight Center (MSFC). The PTC will train the space station payload scientists, station scientists, and ground controllers to operate the wide variety of experiments that will be on-board the Freedom Space Station. The further analysis performed on the SCS study as part of task 2-Perform Studies and Parametric Analysis-of the SCS study contract is summarized. These analyses were performed to resolve open issues remaining after the completion of task 1, and the publishing of the SCS study issues report. The results of these studies provide inputs into SCS task 3-Develop and present SCS requirements, and SCS task 4-develop SCS conceptual designs. The purpose of these studies is to resolve the issues into usable requirements given the best available information at the time of the study. A list of all the SCS study issues is given.

  9. Overcoming computational uncertainties to reveal chemical sensitivity in single molecule conduction calculations.

    PubMed

    Solomon, Gemma C; Reimers, Jeffrey R; Hush, Noel S

    2005-06-08

    In the calculation of conduction through single molecules, approximations about the geometry and electronic structure of the system are usually made in order to simplify the problem. Previously [G. C. Solomon, J. R. Reimers, and N. S. Hush, J. Chem. Phys. 121, 6615 (2004)], we have shown that, in calculations employing cluster models for the electrodes, proper treatment of the open-shell nature of the clusters is the most important computational feature required to make the results sensitive to variations in the structural and chemical features of the system. Here, we expand this and establish a general hierarchy of requirements involving treatment of geometrical approximations. These approximations are categorized into two classes: those associated with finite-dimensional methods for representing the semi-infinite electrodes, and those associated with the chemisorption topology. We show that ca. 100 unique atoms are required in order to properly characterize each electrode: using fewer atoms leads to nonsystematic variations in conductivity that can overwhelm the subtler changes. The choice of binding site is shown to be the next most important feature, while some effects that are difficult to control experimentally concerning the orientations at each binding site are actually shown to be insignificant. Verification of this result provides a general test for the precision of computational procedures for molecular conductivity. Predictions concerning the dependence of conduction on substituent and other effects on the central molecule are found to be meaningful only when they exceed the uncertainties of the effects associated with binding-site variation.

  10. Overcoming computational uncertainties to reveal chemical sensitivity in single molecule conduction calculations

    NASA Astrophysics Data System (ADS)

    Solomon, Gemma C.; Reimers, Jeffrey R.; Hush, Noel S.

    2005-06-01

    In the calculation of conduction through single molecules, approximations about the geometry and electronic structure of the system are usually made in order to simplify the problem. Previously [G. C. Solomon, J. R. Reimers, and N. S. Hush, J. Chem. Phys. 121, 6615 (2004)], we have shown that, in calculations employing cluster models for the electrodes, proper treatment of the open-shell nature of the clusters is the most important computational feature required to make the results sensitive to variations in the structural and chemical features of the system. Here, we expand this and establish a general hierarchy of requirements involving treatment of geometrical approximations. These approximations are categorized into two classes: those associated with finite-dimensional methods for representing the semi-infinite electrodes, and those associated with the chemisorption topology. We show that ca. 100 unique atoms are required in order to properly characterize each electrode: using fewer atoms leads to nonsystematic variations in conductivity that can overwhelm the subtler changes. The choice of binding site is shown to be the next most important feature, while some effects that are difficult to control experimentally concerning the orientations at each binding site are actually shown to be insignificant. Verification of this result provides a general test for the precision of computational procedures for molecular conductivity. Predictions concerning the dependence of conduction on substituent and other effects on the central molecule are found to be meaningful only when they exceed the uncertainties of the effects associated with binding-site variation.

  11. A Reflective Study into Children's Cognition When Making Computer Games

    ERIC Educational Resources Information Center

    Allsop, Yasemin

    2016-01-01

    In this paper, children's mental activities when making digital games are explored. Where previous studies have mainly focused on children's learning, this study aimed to unfold the children's thinking process for learning when making computer games. As part of an ongoing larger scale study, which adopts an ethnographic approach, this research…

  12. Studying Parental Decision Making with Micro-Computers: The CPSI Technique.

    ERIC Educational Resources Information Center

    Holden, George W.

    A technique for studying how parents think, make decisions, and solve childrearing problems, Computer-Presented Social Interactions (CPSI), is described. Two studies involving CPSI are presented. The first study concerns a common parental cognitive task: causal analysis of an undesired behavior. The task was to diagnose the cause of non-contingent…

  13. LOKI WIND CORRECTION COMPUTER AND WIND STUDIES FOR LOKI

    DTIC Science & Technology

    which relates burnout deviation of flight path with the distributed wind along the boost trajectory. The wind influence function was applied to...electrical outputs. A complete wind correction computer system based on the influence function and the results of wind studies was designed.

  14. Computer users' ergonomics and quality of life - evidence from a developing country.

    PubMed

    Ahmed, Ishfaq; Shaukat, Muhammad Zeeshan

    2018-06-01

    This study is aimed at investigating the quality of workplace ergonomics at various Pakistani organizations and the quality of life of computer users working in these organizations. Two hundred and thirty-five computer users (only those employees who have to do most of their job tasks on a computer or laptop, and at their office) responded by filling in a questionnaire covering workplace ergonomics and quality of life. The findings revealed that the ergonomics at those organizations were poor and unfavourable. The quality of life (both physical and mental health) was poor for employees who had an unfavourable ergonomic environment. The findings thus highlight an important issue prevalent in Pakistani work settings.

  15. Computational vibrational study on coordinated nicotinamide

    NASA Astrophysics Data System (ADS)

    Bolukbasi, Olcay; Akyuz, Sevim

    2005-06-01

    The molecular structure and vibrational spectra of zinc (II) halide complexes of nicotinamide (ZnX 2(NIA) 2; X=Cl or Br; NIA=Nicotinamide) were investigated by computational vibrational study and scaled quantum mechanical (SQM) analysis. The geometry optimisation and vibrational wavenumber calculations of zinc halide complexes of nicotinamide were carried out by using the DFT/RB3LYP level of theory with 6-31G(d,p) basis set. The calculated wavenumbers were scaled by using scaled quantum mechanical (SQM) force field method. The fundamental vibrational modes were characterised by their total energy distribution. The coordination effects on nicotinamide through the ring nitrogen were discussed.

  16. Combining computer and manual overlays—Willamette River Greenway Study

    Treesearch

    Asa Hanamoto; Lucille Biesbroeck

    1979-01-01

    We will present a method of combining computer mapping with manual overlays. An example of its use is the Willamette River Greenway Study produced for the State of Oregon Department of Transportation in 1974. This one year planning study included analysis of data relevant to a 286-mile river system. The product is a "wise use" plan which conserves the basic...

  17. A literature review of neck pain associated with computer use: public health implications

    PubMed Central

    Green, Bart N

    2008-01-01

    Prolonged use of computers during daily work activities and recreation is often cited as a cause of neck pain. This review of the literature identifies public health aspects of neck pain as associated with computer use. While some retrospective studies support the hypothesis that frequent computer operation is associated with neck pain, few prospective studies reveal causal relationships. Many risk factors are identified in the literature. Primary prevention strategies have largely been confined to addressing environmental exposure to ergonomic risk factors, since to date, no clear cause for this work-related neck pain has been acknowledged. Future research should include identifying causes of work related neck pain so that appropriate primary prevention strategies may be developed and to make policy recommendations pertaining to prevention. PMID:18769599

  18. Impaired associative learning in schizophrenia: behavioral and computational studies

    PubMed Central

    Diwadkar, Vaibhav A.; Flaugher, Brad; Jones, Trevor; Zalányi, László; Ujfalussy, Balázs; Keshavan, Matcheri S.

    2008-01-01

    Associative learning is a central building block of human cognition and in large part depends on mechanisms of synaptic plasticity, memory capacity and fronto–hippocampal interactions. A disorder like schizophrenia is thought to be characterized by altered plasticity, and impaired frontal and hippocampal function. Understanding the expression of this dysfunction through appropriate experimental studies, and understanding the processes that may give rise to impaired behavior through biologically plausible computational models will help clarify the nature of these deficits. We present a preliminary computational model designed to capture learning dynamics in healthy control and schizophrenia subjects. Experimental data was collected on a spatial-object paired-associate learning task. The task evinces classic patterns of negatively accelerated learning in both healthy control subjects and patients, with patients demonstrating lower rates of learning than controls. Our rudimentary computational model of the task was based on biologically plausible assumptions, including the separation of dorsal/spatial and ventral/object visual streams, implementation of rules of learning, the explicit parameterization of learning rates (a plausible surrogate for synaptic plasticity), and learning capacity (a plausible surrogate for memory capacity). Reductions in learning dynamics in schizophrenia were well-modeled by reductions in learning rate and learning capacity. The synergy between experimental research and a detailed computational model of performance provides a framework within which to infer plausible biological bases of impaired learning dynamics in schizophrenia. PMID:19003486
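
    The negatively accelerated learning dynamics described above are commonly modeled with a saturating exponential. The sketch below is a generic illustration under that assumption, with `rate` and `capacity` standing in for the paper's learning-rate and learning-capacity parameters; the numerical values are hypothetical, not fitted to the study's data:

```python
import math

def learning_curve(trial, rate, capacity):
    """Negatively accelerated learning: performance approaches `capacity`
    (a surrogate for memory capacity) at a speed set by `rate`
    (a surrogate for synaptic plasticity):

        P(n) = capacity * (1 - exp(-rate * n))
    """
    return capacity * (1.0 - math.exp(-rate * trial))

# Illustrative parameters: lowering both rate and capacity flattens the
# curve, qualitatively mimicking the patient group's slower learning.
control = [learning_curve(n, rate=0.5, capacity=1.0) for n in range(1, 9)]
patient = [learning_curve(n, rate=0.25, capacity=0.8) for n in range(1, 9)]
```

    Both curves rise steeply at first and flatten with further trials; the "patient" curve sits below the "control" curve at every trial, which is the qualitative pattern the model was built to capture.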

  19. A Meta-Analysis of Effectiveness Studies on Computer Technology-Supported Language Learning

    ERIC Educational Resources Information Center

    Grgurovic, Maja; Chapelle, Carol A.; Shelley, Mack C.

    2013-01-01

    With the aim of summarizing years of research comparing pedagogies for second/foreign language teaching supported with computer technology and pedagogy not-supported by computer technology, a meta-analysis was conducted of empirical research investigating language outcomes. Thirty-seven studies yielding 52 effect sizes were included, following a…

  20. Computational study of Drucker-Prager plasticity of rock using microtomography

    NASA Astrophysics Data System (ADS)

    Liu, J.; Sarout, J.; Zhang, M.; Dautriat, J.; Veveakis, M.; Regenauer-Lieb, K.

    2016-12-01

    Understanding the physics of rocks is essential for the mining and petroleum industries. Microtomography provides a new way to quantify the relationship between microstructure and mechanical and transport properties. Transport and elastic properties have been studied widely, while plastic properties are still poorly understood. In this study, we analyse a synthetic sandstone sample to up-scale its plastic properties from the micro-scale. The computations are based on a representative volume element (RVE). The mechanical RVE was determined by upper- and lower-bound finite element computations of elasticity. By comparison with experimental curves, the parameters of the matrix (the solid part, consisting of calcite-cemented quartz grains) were investigated and accurate values obtained. The analyses yielded the bulk yield stress, cohesion and angle of friction of the porous rock. Computations on a series of models with volume sizes from 240-cube to 400-cube produced nearly overlapping stress-strain curves, suggesting that the mechanical RVE determined from elastic computations is also valid for plastic yielding. Furthermore, a series of derivative models was created with similar structure but different porosity values. Analyses of these models showed that the yield stress, cohesion and angle of friction decrease linearly as porosity increases over the range from 8% to 28%. The angle of friction decreases fastest, while cohesion is the most stable with respect to porosity.
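    The reported linear dependence of the plastic parameters on porosity can be sketched as an ordinary least-squares line fit. The porosity/cohesion pairs below are purely illustrative, not the paper's data:

```python
# Hypothetical (porosity, cohesion) pairs spanning the 8%-28% range the
# abstract reports; the numbers are illustrative, not the paper's data.
porosity = [0.08, 0.13, 0.18, 0.23, 0.28]
cohesion = [42.0, 36.5, 31.0, 25.5, 20.0]  # MPa

# Closed-form least-squares fit of cohesion = intercept + slope * porosity
n = len(porosity)
mx = sum(porosity) / n
my = sum(cohesion) / n
slope = (sum((x - mx) * (y - my) for x, y in zip(porosity, cohesion))
         / sum((x - mx) ** 2 for x in porosity))
intercept = my - slope * mx
print(f"cohesion ~ {intercept:.1f} {slope:+.1f} * porosity (MPa)")
```

The same fit applied to yield stress and friction angle would give the per-parameter sensitivities the abstract compares (friction angle steepest, cohesion flattest).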

  1. A Computer-Supported Method to Reveal and Assess Personal Professional Theories in Vocational Education

    ERIC Educational Resources Information Center

    van den Bogaart, Antoine C. M.; Bilderbeek, Richel J. C.; Schaap, Harmen; Hummel, Hans G. K.; Kirschner, Paul A.

    2016-01-01

    This article introduces a dedicated, computer-supported method to construct and formatively assess open, annotated concept maps of Personal Professional Theories (PPTs). These theories are internalised, personal bodies of formal and practical knowledge, values, norms and convictions that professionals use as a reference to interpret and acquire…

  2. Implications of Ubiquitous Computing for the Social Studies Curriculum

    ERIC Educational Resources Information Center

    van Hover, Stephanie D.; Berson, Michael J.; Bolick, Cheryl Mason; Swan, Kathleen Owings

    2004-01-01

    In March 2002, members of the National Technology Leadership Initiative (NTLI) met in Charlottesville, Virginia to discuss the potential effects of ubiquitous computing on the field of education. Ubiquitous computing, or "on-demand availability of task-necessary computing power," involves providing every student with a handheld computer--a…

  3. Central Computer Science Concepts to Research-Based Teacher Training in Computer Science: An Experimental Study

    ERIC Educational Resources Information Center

    Zendler, Andreas; Klaudt, Dieter

    2012-01-01

    The significance of computer science for economics and society is undisputed. In particular, computer science is acknowledged to play a key role in schools (e.g., by opening multiple career paths). The provision of effective computer science education in schools is dependent on teachers who are able to properly represent the discipline and whose…

  4. Sleep problems and computer use during work and leisure: Cross-sectional study among 7800 adults.

    PubMed

    Andersen, Lars Louis; Garde, Anne Helene

    2015-01-01

    Previous studies linked heavy computer use to disturbed sleep. This study investigates the association between computer use during work and leisure and sleep problems in working adults. From the 2010 round of the Danish Work Environment Cohort Study, currently employed wage earners on daytime schedule (N = 7883) replied to the Bergen insomnia scale and questions on weekly duration of computer use. Results showed that sleep problems for three or more days per week (average of six questions) were experienced by 14.9% of the respondents. Logistic regression analyses, controlled for gender, age, physical and psychosocial work factors, lifestyle, chronic disease and mental health showed that computer use during leisure for 30 or more hours per week (reference 0-10 hours per week) was associated with increased odds of sleep problems (OR 1.83 [95% CI 1.06-3.17]). Computer use during work and shorter duration of computer use during leisure were not associated with sleep problems. In conclusion, excessive computer use during leisure - but not work - is associated with sleep problems in adults working on daytime schedule.
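    The reported OR 1.83 [95% CI 1.06-3.17] is obtained by exponentiating a logistic-regression coefficient and the endpoints of its Wald interval. A minimal sketch, using a hypothetical coefficient and standard error chosen to reproduce roughly those numbers:

```python
import math

def odds_ratio_ci(beta, se, z=1.96):
    """Convert a logistic-regression coefficient and its standard error
    into an odds ratio with a 95% Wald confidence interval."""
    return (math.exp(beta),
            math.exp(beta - z * se),
            math.exp(beta + z * se))

# Hypothetical values: exp(0.604) ~ 1.83, as in the abstract
or_, lo, hi = odds_ratio_ci(beta=0.604, se=0.28)
```

A CI whose lower bound exceeds 1.0, as here, is what makes the leisure-time association statistically significant while the work-time association was not.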

  5. Measuring coherence of computer-assisted likelihood ratio methods.

    PubMed

    Haraksim, Rudolf; Ramos, Daniel; Meuwly, Didier; Berger, Charles E H

    2015-04-01

    Measuring the performance of forensic evaluation methods that compute likelihood ratios (LRs) is relevant for both the development and the validation of such methods. A framework of performance characteristics categorized as primary and secondary is introduced in this study to help achieve such development and validation. Ground-truth labelled fingerprint data is used to assess the performance of an example likelihood ratio method in terms of those performance characteristics. Discrimination, calibration, and especially the coherence of this LR method are assessed as a function of the quantity and quality of the trace fingerprint specimen. Assessment of the coherence revealed a weakness of the comparison algorithm in the computer-assisted likelihood ratio method used. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
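    One standard performance characteristic for forensic LR methods of this kind is the log-likelihood-ratio cost (Cllr), which jointly penalizes poor discrimination and poor calibration. A minimal sketch (the LR values below are hypothetical):

```python
import math

def cllr(lrs_same, lrs_diff):
    """Log-likelihood-ratio cost: average log2 penalty over same-source
    LRs (should be large) and different-source LRs (should be small).
    Lower is better; 1.0 is the uninformative reference."""
    p_same = sum(math.log2(1 + 1.0 / lr) for lr in lrs_same) / len(lrs_same)
    p_diff = sum(math.log2(1 + lr) for lr in lrs_diff) / len(lrs_diff)
    return 0.5 * (p_same + p_diff)

# Hypothetical LRs from same-source and different-source fingerprint pairs
print(cllr([10.0, 50.0, 5.0], [0.1, 0.02, 0.3]))
```

Computing Cllr separately per quality/quantity stratum of the trace specimens is one way to quantify the coherence property the abstract assesses.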

  6. Usability Studies in Virtual and Traditional Computer Aided Design Environments for Fault Identification

    DTIC Science & Technology

    2017-08-08

    Usability Studies In Virtual And Traditional Computer Aided Design Environments For Fault Identification Dr. Syed Adeel Ahmed, Xavier University...virtual environment with wand interfaces compared directly with a workstation non-stereoscopic traditional CAD interface with keyboard and mouse. In...the differences in interaction when compared with traditional human computer interfaces. This paper provides analysis via usability study methods

  7. Impact of Netbook Computers on One District's Social Studies Curriculum

    ERIC Educational Resources Information Center

    Schleicher, Joel L.

    2011-01-01

    The purpose of this study was to collect and analyze quantitative and qualitative data to determine the overall impact of a pilot netbook initiative in five social studies classrooms. The researcher explored the impact on teaching and learning social studies with the primary source of curriculum delivery through one-to-one netbook computer access…

  8. A computational framework for the study of confidence in humans and animals

    PubMed Central

    Kepecs, Adam; Mainen, Zachary F.

    2012-01-01

    Confidence judgements, self-assessments about the quality of a subject's knowledge, are considered a central example of metacognition. Prima facie, introspection and self-report appear the only way to access the subjective sense of confidence or uncertainty. Contrary to this notion, overt behavioural measures can be used to study confidence judgements by animals trained in decision-making tasks with perceptual or mnemonic uncertainty. Here, we suggest that a computational approach can clarify the issues involved in interpreting these tasks and provide a much needed springboard for advancing the scientific understanding of confidence. We first review relevant theories of probabilistic inference and decision-making. We then critically discuss behavioural tasks employed to measure confidence in animals and show how quantitative models can help to constrain the computational strategies underlying confidence-reporting behaviours. In our view, post-decision wagering tasks with continuous measures of confidence appear to offer the best available metrics of confidence. Since behavioural reports alone provide a limited window into mechanism, we argue that progress calls for measuring the neural representations and identifying the computations underlying confidence reports. We present a case study using such a computational approach to study the neural correlates of decision confidence in rats. This work shows that confidence assessments may be considered higher order, but can be generated using elementary neural computations that are available to a wide range of species. Finally, we discuss the relationship of confidence judgements to the wider behavioural uses of confidence and uncertainty. PMID:22492750
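    The probabilistic-inference view of confidence can be made concrete: in a two-choice task with Gaussian evidence, statistical decision confidence is the posterior probability that the chosen option is correct. A minimal sketch under those (assumed) model choices:

```python
import math

def confidence(evidence, mu=1.0, sigma=1.0):
    """Statistical decision confidence for a two-choice task: posterior
    probability that the chosen option is correct, given scalar evidence,
    equal priors, and Gaussian likelihoods centred at +mu and -mu."""
    llr = 2 * mu * evidence / sigma ** 2          # log-likelihood ratio
    p_right = 1.0 / (1.0 + math.exp(-llr))        # posterior for "+mu"
    return max(p_right, 1.0 - p_right)            # confidence in the choice

assert confidence(0.0) == 0.5                     # ambiguous evidence: chance
assert confidence(2.0) > confidence(0.5)          # stronger evidence: surer
```

This is the kind of quantitative model the authors argue is needed to constrain interpretations of post-decision wagering and related tasks.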

  9. Computational Analysis Reveals a Key Regulator of Cryptococcal Virulence and Determinant of Host Response

    PubMed Central

    Gish, Stacey R.; Maier, Ezekiel J.; Haynes, Brian C.; Santiago-Tirado, Felipe H.; Srikanta, Deepa L.; Ma, Cynthia Z.; Li, Lucy X.; Williams, Matthew; Crouch, Erika C.; Khader, Shabaana A.

    2016-01-01

    ABSTRACT Cryptococcus neoformans is a ubiquitous, opportunistic fungal pathogen that kills over 600,000 people annually. Here, we report integrated computational and experimental investigations of the role and mechanisms of transcriptional regulation in cryptococcal infection. Major cryptococcal virulence traits include melanin production and the development of a large polysaccharide capsule upon host entry; shed capsule polysaccharides also impair host defenses. We found that both transcription and translation are required for capsule growth and that Usv101 is a master regulator of pathogenesis, regulating melanin production, capsule growth, and capsule shedding. It does this by directly regulating genes encoding glycoactive enzymes and genes encoding three other transcription factors that are essential for capsule growth: GAT201, RIM101, and SP1. Murine infection with cryptococci lacking Usv101 significantly alters the kinetics and pathogenesis of disease, with extended survival and, unexpectedly, death by pneumonia rather than meningitis. Our approaches and findings will inform studies of other pathogenic microbes. PMID:27094327

  10. Cortical Neural Computation by Discrete Results Hypothesis

    PubMed Central

    Castejon, Carlos; Nuñez, Angel

    2016-01-01

    One of the most challenging problems we face in neuroscience is to understand how the cortex performs computations. There is increasing evidence that the power of the cortical processing is produced by populations of neurons forming dynamic neuronal ensembles. Theoretical proposals and multineuronal experimental studies have revealed that ensembles of neurons can form emergent functional units. However, how these ensembles are implicated in cortical computations is still a mystery. Although cell ensembles have been associated with brain rhythms, the functional interaction remains largely unclear. It is still unknown how spatially distributed neuronal activity can be temporally integrated to contribute to cortical computations. A theoretical explanation integrating spatial and temporal aspects of cortical processing is still lacking. In this Hypothesis and Theory article, we propose a new functional theoretical framework to explain the computational roles of these ensembles in cortical processing. We suggest that complex neural computations underlying cortical processing could be temporally discrete and that sensory information would need to be quantized to be computed by the cerebral cortex. Accordingly, we propose that cortical processing is produced by the computation of discrete spatio-temporal functional units that we have called “Discrete Results” (Discrete Results Hypothesis). This hypothesis represents a novel functional mechanism by which information processing is computed in the cortex. Furthermore, we propose that precise dynamic sequences of “Discrete Results” is the mechanism used by the cortex to extract, code, memorize and transmit neural information. The novel “Discrete Results” concept has the ability to match the spatial and temporal aspects of cortical processing. We discuss the possible neural underpinnings of these functional computational units and describe the empirical evidence supporting our hypothesis. We propose that fast

  11. Cortical Neural Computation by Discrete Results Hypothesis.

    PubMed

    Castejon, Carlos; Nuñez, Angel

    2016-01-01

    One of the most challenging problems we face in neuroscience is to understand how the cortex performs computations. There is increasing evidence that the power of the cortical processing is produced by populations of neurons forming dynamic neuronal ensembles. Theoretical proposals and multineuronal experimental studies have revealed that ensembles of neurons can form emergent functional units. However, how these ensembles are implicated in cortical computations is still a mystery. Although cell ensembles have been associated with brain rhythms, the functional interaction remains largely unclear. It is still unknown how spatially distributed neuronal activity can be temporally integrated to contribute to cortical computations. A theoretical explanation integrating spatial and temporal aspects of cortical processing is still lacking. In this Hypothesis and Theory article, we propose a new functional theoretical framework to explain the computational roles of these ensembles in cortical processing. We suggest that complex neural computations underlying cortical processing could be temporally discrete and that sensory information would need to be quantized to be computed by the cerebral cortex. Accordingly, we propose that cortical processing is produced by the computation of discrete spatio-temporal functional units that we have called "Discrete Results" (Discrete Results Hypothesis). This hypothesis represents a novel functional mechanism by which information processing is computed in the cortex. Furthermore, we propose that precise dynamic sequences of "Discrete Results" is the mechanism used by the cortex to extract, code, memorize and transmit neural information. The novel "Discrete Results" concept has the ability to match the spatial and temporal aspects of cortical processing. We discuss the possible neural underpinnings of these functional computational units and describe the empirical evidence supporting our hypothesis. We propose that fast-spiking (FS

  12. Computer literacy among first year medical students in a developing country: a cross sectional study.

    PubMed

    Ranasinghe, Priyanga; Wickramasinghe, Sashimali A; Pieris, Wa Rasanga; Karunathilake, Indika; Constantine, Godwin R

    2012-09-14

    The use of computer-assisted learning (CAL) has enhanced undergraduate medical education. CAL improves performance at examinations, develops problem-solving skills and increases student satisfaction. This study evaluates computer literacy among first-year medical students in Sri Lanka. The study was conducted at the Faculty of Medicine, University of Colombo, Sri Lanka between August and September 2008. First-year medical students (n = 190) were invited for the study. Data on computer literacy and associated factors were collected by an expert-validated, pre-tested, self-administered questionnaire. Computer literacy was evaluated by testing knowledge in 6 domains: common software packages, operating systems, database management, and the usage of internet and E-mail. A linear regression was conducted using the total score for computer literacy as the continuous dependent variable, with other factors as independent covariates. The sample size was 181 (response rate 95.3%); 49.7% were male. The majority of students (77.3%) owned a computer (males 74.4%, females 80.2%). Students had gained their present computer knowledge by a formal training programme (64.1%), self-learning (63.0%) or peer learning (49.2%). Students used computers predominantly for word processing (95.6%), entertainment (95.0%), web browsing (80.1%) and preparing presentations (76.8%). The majority of students (75.7%) expressed willingness to attend a formal computer training programme at the faculty. The mean score on the computer literacy questionnaire was 48.4 ± 20.3, with no significant gender difference (males 47.8 ± 21.1, females 48.9 ± 19.6). Of the students, 47.9% scored less than 50% on the computer literacy questionnaire. Students from Colombo district, Western Province, and students owning a computer had significantly higher mean scores than other students (p < 0.001). In the linear regression analysis, formal computer training was the strongest predictor of computer literacy (β = 13

  13. Computer literacy among first year medical students in a developing country: A cross sectional study

    PubMed Central

    2012-01-01

    Background The use of computer-assisted learning (CAL) has enhanced undergraduate medical education. CAL improves performance at examinations, develops problem-solving skills and increases student satisfaction. This study evaluates computer literacy among first-year medical students in Sri Lanka. Methods The study was conducted at the Faculty of Medicine, University of Colombo, Sri Lanka between August and September 2008. First-year medical students (n = 190) were invited for the study. Data on computer literacy and associated factors were collected by an expert-validated, pre-tested, self-administered questionnaire. Computer literacy was evaluated by testing knowledge in 6 domains: common software packages, operating systems, database management, and the usage of internet and E-mail. A linear regression was conducted using the total score for computer literacy as the continuous dependent variable, with other factors as independent covariates. Results The sample size was 181 (response rate 95.3%); 49.7% were male. The majority of students (77.3%) owned a computer (males 74.4%, females 80.2%). Students had gained their present computer knowledge by a formal training programme (64.1%), self-learning (63.0%) or peer learning (49.2%). Students used computers predominantly for word processing (95.6%), entertainment (95.0%), web browsing (80.1%) and preparing presentations (76.8%). The majority of students (75.7%) expressed willingness to attend a formal computer training programme at the faculty. The mean score on the computer literacy questionnaire was 48.4 ± 20.3, with no significant gender difference (males 47.8 ± 21.1, females 48.9 ± 19.6). Of the students, 47.9% scored less than 50% on the computer literacy questionnaire. Students from Colombo district, Western Province, and students owning a computer had significantly higher mean scores than other students (p < 0.001). In the linear regression analysis, formal computer training was the strongest predictor of

  14. Enhanced limonene production in cyanobacteria reveals photosynthesis limitations.

    PubMed

    Wang, Xin; Liu, Wei; Xin, Changpeng; Zheng, Yi; Cheng, Yanbing; Sun, Su; Li, Runze; Zhu, Xin-Guang; Dai, Susie Y; Rentzepis, Peter M; Yuan, Joshua S

    2016-12-13

    Terpenes are the major secondary metabolites produced by plants, and have diverse industrial applications as pharmaceuticals, fragrances, solvents, and biofuels. Cyanobacteria are equipped with an efficient carbon-fixation mechanism and are ideal cell factories for producing various fuel and chemical products. Past efforts to produce terpenes in photosynthetic organisms have achieved only limited success. Here we engineered the cyanobacterium Synechococcus elongatus PCC 7942 to efficiently produce limonene through a modeling-guided study. Computational modeling of limonene flux in response to photosynthetic output revealed the downstream terpene synthase as a key metabolic flux-controlling node in MEP (2-C-methyl-d-erythritol 4-phosphate) pathway-derived terpene biosynthesis. By enhancing the downstream limonene carbon sink, we achieved an over 100-fold increase in limonene productivity, in contrast to the marginal increase achieved through stepwise metabolic engineering. The establishment of a strong limonene flux revealed potential synergy between photosynthate output and terpene biosynthesis, leading to enhanced carbon flux into the MEP pathway. Moreover, we show that enhanced limonene flux leads to NADPH accumulation and slows photosynthetic electron flow. Fine-tuning the ATP/NADPH balance toward terpene biosynthesis could be a key parameter in adapting photosynthesis to support biofuel/bioproduct production in cyanobacteria.

  15. The Study of Surface Computer Supported Cooperative Work and Its Design, Efficiency, and Challenges

    ERIC Educational Resources Information Center

    Hwang, Wu-Yuin; Su, Jia-Han

    2012-01-01

    In this study, a Surface Computer Supported Cooperative Work paradigm is proposed. Recently, multitouch technology has become widely available for human-computer interaction. We found it has great potential to facilitate more awareness of human-to-human interaction than personal computers (PCs) in colocated collaborative work. However, other…

  16. Computational study of chain transfer to monomer reactions in high-temperature polymerization of alkyl acrylates.

    PubMed

    Moghadam, Nazanin; Liu, Shi; Srinivasan, Sriraj; Grady, Michael C; Soroush, Masoud; Rappe, Andrew M

    2013-03-28

    This article presents a computational study of chain transfer to monomer (CTM) reactions in self-initiated high-temperature homopolymerization of alkyl acrylates (methyl, ethyl, and n-butyl acrylate). Several mechanisms of CTM are studied. The effects of the length of live polymer chains and the type of monoradical that initiated the live polymer chains on the energy barriers and rate constants of the involved reaction steps are investigated theoretically. All calculations are carried out using density functional theory. Three types of hybrid functionals (B3LYP, X3LYP, and M06-2X) and four basis sets (6-31G(d), 6-31G(d,p), 6-311G(d), and 6-311G(d,p)) are applied to predict the molecular geometries of the reactants, products, and transition states, and the energy barriers. Transition state theory is used to estimate rate constants. The results indicate that abstraction of a hydrogen atom (by live polymer chains) from the methyl group in methyl acrylate, the methylene group in ethyl acrylate, and methylene groups in n-butyl acrylate are the most likely mechanisms of CTM. Also, the rate constants of CTM reactions calculated using M06-2X are in good agreement with those estimated from polymer sample measurements using macroscopic mechanistic models. The rate constant values do not change significantly with the length of live polymer chains. Abstraction of a hydrogen atom by a tertiary radical has a higher energy barrier than abstraction by a secondary radical, which agrees with experimental findings. The calculated and experimental NMR spectra of dead polymer chains produced by CTM reactions are comparable. This theoretical/computational study reveals that CTM occurs most likely via hydrogen abstraction by live polymer chains from the methyl group of methyl acrylate and methylene group(s) of ethyl (n-butyl) acrylate.
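    Rate constants from transition state theory, as used above, follow the Eyring equation k = (k_B·T/h)·exp(-ΔG‡/RT). A sketch with an illustrative, hypothetical barrier (not a value from the paper):

```python
import math

# Physical constants (SI)
KB = 1.380649e-23    # Boltzmann constant, J/K
H = 6.62607015e-34   # Planck constant, J*s
R = 8.314462618      # gas constant, J/(mol*K)

def eyring_rate(dG_act_kJ_mol, T):
    """Transition-state-theory (Eyring) rate constant for a free-energy
    barrier dG_act (kJ/mol) at temperature T (K), transmission coefficient 1."""
    return (KB * T / H) * math.exp(-dG_act_kJ_mol * 1e3 / (R * T))

# Hypothetical barrier at a high-temperature polymerization condition (~140 C)
k = eyring_rate(dG_act_kJ_mol=120.0, T=413.0)
```

The exponential dependence is why the choice of functional (B3LYP vs. M06-2X) matters: a few kJ/mol of barrier error changes k by an order of magnitude.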

  17. Computer-Assisted Instruction: A Case Study of Two Charter Schools

    ERIC Educational Resources Information Center

    Keengwe, Jared; Hussein, Farhan

    2013-01-01

    The purpose of this study was to examine the relationship in achievement gap between English language learners (ELLs) utilizing computer-assisted instruction (CAI) in the classroom, and ELLs relying solely on traditional classroom instruction. The study findings showed that students using CAI to supplement traditional lectures performed better…

  18. Plasticity in single neuron and circuit computations

    NASA Astrophysics Data System (ADS)

    Destexhe, Alain; Marder, Eve

    2004-10-01

    Plasticity in neural circuits can result from alterations in synaptic strength or connectivity, as well as from changes in the excitability of the neurons themselves. To better understand the role of plasticity in the brain, we need to establish how brain circuits work and the kinds of computations that different circuit structures achieve. By linking theoretical and experimental studies, we are beginning to reveal the consequences of plasticity mechanisms for network dynamics, in both simple invertebrate circuits and the complex circuits of mammalian cerebral cortex.

  19. Increasing Elementary School Teachers' Awareness of Gender Inequity in Student Computer Usage

    ERIC Educational Resources Information Center

    Luongo, Nicole

    2012-01-01

    This study was designed to increase gender equity awareness in elementary school teachers with respect to student computer and technology usage. Using professional development methods with a group of teachers, the writer attempted to help them become more aware of gender bias in technology instruction. An analysis of the data revealed that…

  20. Dropping Out of Computer Science: A Phenomenological Study of Student Lived Experiences in Community College Computer Science

    NASA Astrophysics Data System (ADS)

    Gilbert-Valencia, Daniel H.

    California community colleges contribute alarmingly few computer science degree or certificate earners. While the literature shows clear K-12 impediments to CS matriculation in higher education, very little is known about the experiences of those who overcome initial impediments to CS yet do not persist through to program completion. This phenomenological study explores insights into that specific experience by interviewing underrepresented, low income, first-generation college students who began community college intending to transfer to 4-year institutions majoring in CS but switched to another field and remain enrolled or graduated. This study explores the lived experiences of students facing barriers, their avenues for developing interest in CS, and the persistence support systems they encountered, specifically looking at how students constructed their academic choice from these experiences. The growing diversity within California's population necessitates that experiences specific to underrepresented students be considered as part of this exploration. Ten semi-structured interviews and observations were conducted, transcribed and coded. Artifacts supporting student experiences were also collected. Data was analyzed through a social-constructivist lens to provide insight into experiences and how they can be navigated to create actionable strategies for community college computer science departments wishing to increase student success. Three major themes emerged from this research: (1) students shared pre-college characteristics; (2) faced similar challenges in college CS courses; and (3) shared similar reactions to the "work" of computer science. Results of the study included (1) CS interest development hinged on computer ownership in the home; (2) participants shared characteristics that were ideal for college success but not CS success; and (3) encounters in CS departments produced unique challenges for participants. Though CS interest was and remains

  1. A Comparative Study to Evaluate the Effectiveness of Computer Assisted Instruction (CAI) versus Class Room Lecture (CRL) for Computer Science at ICS Level

    ERIC Educational Resources Information Center

    Kausar, Tayyaba; Choudhry, Bushra Naoreen; Gujjar, Aijaz Ahmed

    2008-01-01

    This study was aimed to evaluate the effectiveness of CAI vs. classroom lecture for computer science at ICS level. The objectives were to compare the learning effects of two groups with class room lecture and computer assisted instruction studying the same curriculum and the effects of CAI and CRL in terms of cognitive development. Hypothesis of…

  2. A Comparative Study to Evaluate the Effectiveness of Computer Assisted Instruction (CAI) versus Class Room Lecture (CRL) for Computer Science at ICS Level

    ERIC Educational Resources Information Center

    Kausar, Tayyaba; Choudhry, Bushra Naoreen; Gujjar, Aijaz Ahmed

    2008-01-01

    This study was aimed to evaluate the effectiveness of CAI vs. classroom lecture for computer science at ICS level. The objectives were to compare the learning effects of two groups with class room lecture and computer assisted instruction studying the same curriculum and the effects of CAI and CRL in terms of cognitive development. Hypothesis of…

  3. A Study of Computer-Aided Geometric Optical Design.

    DTIC Science & Technology

    1982-10-01

    short programs on tape. A computer account number and Cyber computer manuals were obtained. A familiarity with the use and maintenance of computer files...in the interpretation of the information. Ray fans, spot diagrams, wavefront variance, Strehl ratio, vignetting diagrams and optical transfer...other surface begins to cut off these rays (20:113). This is characterized by a loss of intensity at the outside of the image. A known manual

  4. Reducing Foreign Language Communication Apprehension with Computer-Mediated Communication: A Preliminary Study

    ERIC Educational Resources Information Center

    Arnold, Nike

    2007-01-01

    Many studies (e.g., [Beauvois, M.H., 1998. "E-talk: Computer-assisted classroom discussion--attitudes and motivation." In: Swaffar, J., Romano, S., Markley, P., Arens, K. (Eds.), "Language learning online: Theory and practice in the ESL and L2 computer classroom." Labyrinth Publications, Austin, TX, pp. 99-120; Bump, J., 1990. "Radical changes in…

  5. A detailed experimental study of a DNA computer with two endonucleases.

    PubMed

    Sakowski, Sebastian; Krasiński, Tadeusz; Sarnik, Joanna; Blasiak, Janusz; Waldmajer, Jacek; Poplawski, Tomasz

    2017-07-14

    Great advances in biotechnology have allowed the construction of a computer from DNA. One of the proposed solutions is a biomolecular finite automaton, a simple two-state DNA computer without memory, which was presented by Ehud Shapiro's group at the Weizmann Institute of Science. The main problem with this computer, in which biomolecules carry out logical operations, is scaling up its complexity, i.e., increasing the number of states of biomolecular automata. In this study, we constructed (in laboratory conditions) a six-state DNA computer that uses two endonucleases (e.g. AcuI and BbvI) and a ligase. We present a detailed experimental verification of its feasibility. We describe the effect of the number of states, the length of input data, and nondeterminism on the computing process. We also tested different automata (with three, four, and six states) running on various accepted input words of different lengths, such as ab, aab, aaab, ababa, and on an unaccepted word ba. Moreover, this article presents the reaction optimization and the methods of eliminating certain biochemical problems occurring in the implementation of a biomolecular DNA automaton based on two endonucleases.
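    The words such an automaton accepts or rejects can be checked in software with an ordinary finite-automaton simulator. The transition table below is a hypothetical minimal acceptor consistent with the listed examples, not the paper's six-state design:

```python
def run_dfa(transitions, start, accepting, word):
    """Simulate a deterministic finite automaton; returns True if `word`
    is accepted. This mirrors, abstractly, the state transitions the DNA
    automaton realizes with restriction enzymes and a ligase."""
    state = start
    for symbol in word:
        state = transitions[(state, symbol)]
    return state in accepting

# Hypothetical acceptor consistent with the abstract's examples
# (accepts ab, aab, aaab, ababa; rejects ba); NOT the paper's table.
T = {("s0", "a"): "s1", ("s0", "b"): "s2",
     ("s1", "a"): "s1", ("s1", "b"): "s1",
     ("s2", "a"): "s2", ("s2", "b"): "s2"}

for w in ["ab", "aab", "aaab", "ababa"]:
    assert run_dfa(T, "s0", {"s1"}, w)
assert not run_dfa(T, "s0", {"s1"}, "ba")
```

In the laboratory implementation each transition is a restriction-digest/ligation step, so word length directly determines the number of reaction cycles.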

  6. A computer-assisted study of pulse dynamics in anisotropic media

    NASA Astrophysics Data System (ADS)

    Krishnan, J.; Engelborghs, K.; Bär, M.; Lust, K.; Roose, D.; Kevrekidis, I. G.

    2001-06-01

    This study focuses on the computer-assisted stability analysis of travelling pulse-like structures in spatially periodic heterogeneous reaction-diffusion media. The physical motivation comes from pulse propagation in thin annular domains on a diffusionally anisotropic catalytic surface. The study was performed by computing the travelling pulse-like structures as limit cycles of the spatially discretized PDE, in two ways: a Newton method based on a pseudospectral discretization of the PDE, and a Newton-Picard method based on a finite difference discretization. Details about the spectra of these modulated pulse-like structures are discussed, including how they may be compared with the spectra of pulses in homogeneous media. The effects of anisotropy on the dynamics of pulses and pulse pairs are studied. Beyond shifting the location of bifurcations present in homogeneous media, anisotropy can also introduce certain new instabilities.

  7. SYRIT Computer School Systems Report.

    ERIC Educational Resources Information Center

    Maldonado, Carmen

    The 1991-92 and 1993-94 audits of SYRIT Computer School Systems revealed noncompliance with applicable laws and regulations in certifying students for Tuition Assistance Program (TAP) awards. SYRIT was overpaid $2,817,394 because school officials incorrectly certified student eligibility. The audit also discovered that students graduated and were…

  8. Population coding and decoding in a neural field: a computational study.

    PubMed

    Wu, Si; Amari, Shun-Ichi; Nakahara, Hiroyuki

    2002-05-01

    This study uses a neural field model to investigate computational aspects of population coding and decoding when the stimulus is a single variable. A general prototype model for the encoding process is proposed, in which neural responses are correlated, with strength specified by a gaussian function of their difference in preferred stimuli. Based on the model, we study the effect of correlation on the Fisher information, compare the performances of three decoding methods that differ in the amount of encoding information being used, and investigate the implementation of the three methods by a recurrent network. This study not only rediscovers the main results in the existing literature in a unified way, but also reveals important new features, especially when the neural correlation is strong. As the neural correlation of firing becomes larger, the Fisher information decreases drastically. We confirm that as the width of correlation increases, the Fisher information saturates and no longer increases in proportion to the number of neurons. However, we prove that as the width increases further--wider than √2 times the effective width of the tuning function--the Fisher information increases again, and it increases without limit in proportion to the number of neurons. Furthermore, we clarify the asymptotic efficiency of maximum likelihood inference (MLI) type decoding methods for correlated neural signals. We show that when the correlation covers a nonlocal range of the population (excepting the case of uniform correlation and extremely small noise), MLI-type methods, whose decoding error follows a Cauchy-type distribution, are not asymptotically efficient. This implies that the variance is no longer adequate to measure decoding accuracy.
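
    The central quantity here can be made concrete with a small numerical sketch. For Gaussian tuning curves with additive Gaussian noise whose correlation falls off with the difference in preferred stimuli, the Fisher information is I(x) = f'(x)^T C^(-1) f'(x). The parameter values below are illustrative, not taken from the paper:

```python
import numpy as np

def fisher_info(x, prefs, width, b, a, sigma=1.0):
    """Fisher information I(x) = f'(x)^T C^{-1} f'(x) for Gaussian tuning
    curves of the given width, with limited-range noise correlations of
    strength b and correlation width a."""
    d = x - prefs
    f = np.exp(-d**2 / (2 * width**2))
    fprime = -d / width**2 * f                    # df_i/dx at the stimulus x
    diff = prefs[:, None] - prefs[None, :]
    # correlation matrix: 1 on the diagonal, b * exp(-diff^2 / 2a^2) off it
    R = b * np.exp(-diff**2 / (2 * a**2))
    np.fill_diagonal(R, 1.0)
    C = sigma**2 * R
    return fprime @ np.linalg.solve(C, fprime)

prefs = np.linspace(-5.0, 5.0, 41)                   # preferred stimuli
I_indep = fisher_info(0.0, prefs, 1.0, 0.0, 1.0)     # independent noise
I_corr = fisher_info(0.0, prefs, 1.0, 0.5, 1.0)      # strong limited-range correlation
```

    Consistent with the result quoted above, strengthening the limited-range correlation (b = 0.5 versus independent noise) lowers the Fisher information at the same population size.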

  9. Hybrid computational and experimental approach for the study and optimization of mechanical components

    NASA Astrophysics Data System (ADS)

    Furlong, Cosme; Pryputniewicz, Ryszard J.

    1998-05-01

    Increased demands on the performance and efficiency of mechanical components impose challenges on their engineering design and optimization, especially when new and more demanding applications must be developed in relatively short periods of time while satisfying design objectives as well as cost and manufacturability constraints. In addition, reliability and durability must be taken into consideration. As a consequence, effective quantitative methodologies, computational and experimental, should be applied to the study and optimization of mechanical components. Computational investigations enable parametric studies and the determination of critical engineering design conditions, while experimental investigations, especially those using optical techniques, provide qualitative and quantitative information on the actual response of the structure of interest to the applied load and boundary conditions. We discuss a hybrid experimental and computational approach for the investigation and optimization of mechanical components. The approach combines analytical and computational methodologies with noninvasive optical measurement techniques and fringe-prediction analysis tools. Practical application of the hybrid approach is illustrated with representative examples that demonstrate its viability as an effective engineering tool for analysis and optimization.

  10. Screening for cognitive impairment in older individuals. Validation study of a computer-based test.

    PubMed

    Green, R C; Green, J; Harrison, J M; Kutner, M H

    1994-08-01

    This study examined the validity of a computer-based cognitive test that was recently designed to screen the elderly for cognitive impairment. Criterion-related validity was examined by comparing the test scores of impaired patients and normal control subjects. Construct-related validity was assessed through correlations between computer-based subtests and related conventional neuropsychological subtests. The setting was a university center for memory disorders. Participants were 52 patients with mild cognitive impairment by strict clinical criteria and 50 unimpaired, age- and education-matched control subjects. Control subjects were rigorously screened by neurological, neuropsychological, imaging, and electrophysiological criteria to identify and exclude individuals with occult abnormalities. Using a cut-off total score of 126, this computer-based instrument had a sensitivity of 0.83 and a specificity of 0.96. Using a prevalence estimate of 10%, the positive and negative predictive values were 0.70 and 0.96, respectively. Computer-based subtests correlated significantly with conventional neuropsychological tests measuring similar cognitive domains. Thirteen (17.8%) of 73 volunteers with normal medical histories were excluded from the control group because of unsuspected abnormalities on standard neuropsychological tests, electroencephalograms, or magnetic resonance imaging scans. Computer-based testing is a valid screening methodology for the detection of mild cognitive impairment in the elderly, although this particular test has important limitations. Broader applications of computer-based testing will require extensive population-based validation. Future studies should recognize that the normal control subjects without a history of disease who are typically used in validation studies may have a high incidence of unsuspected abnormalities on neurodiagnostic studies.
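
    The reported predictive values follow from Bayes' rule applied to the stated sensitivity, specificity, and prevalence. A quick check (a generic calculation, not code from the study):

```python
def predictive_values(sensitivity, specificity, prevalence):
    """Positive and negative predictive values via Bayes' rule."""
    tp = sensitivity * prevalence               # true-positive fraction
    fp = (1 - specificity) * (1 - prevalence)   # false-positive fraction
    tn = specificity * (1 - prevalence)         # true-negative fraction
    fn = (1 - sensitivity) * prevalence         # false-negative fraction
    return tp / (tp + fp), tn / (tn + fn)

ppv, npv = predictive_values(0.83, 0.96, 0.10)  # ppv comes out near 0.70
```

    This reproduces the reported positive predictive value of 0.70; the negative predictive value computed this way comes out slightly above the reported figure, a difference plausibly due to rounding in the underlying counts.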

  11. Computational study of the fibril organization of polyglutamine repeats reveals a common motif identified in beta-helices.

    PubMed

    Zanuy, David; Gunasekaran, Kannan; Lesk, Arthur M; Nussinov, Ruth

    2006-04-21

    The formation of fibril aggregates by long polyglutamine sequences is assumed to play a major role in neurodegenerative diseases such as Huntington's disease. Here, we model peptides rich in glutamine through a series of molecular dynamics simulations. Starting from a rigid nanotube-like conformation, we have obtained a new conformational template that shares structural features of a tubular helix and of a beta-helix conformational organization. Our new model can be described as a super-helical arrangement of flat beta-sheet segments linked by planar turns or bends. Interestingly, our comprehensive analysis of the Protein Data Bank reveals that this is a common motif in beta-helices (termed beta-bend), although it had not previously been identified as such. The motif is based on the alternation of beta-sheet and helical conformation as the protein sequence is followed from the N terminus to the C terminus (beta-alpha(R)-beta-polyPro-beta). We further identify this motif in the ssNMR structure of the protofibril of the amyloidogenic peptide Abeta(1-40). The recurrence of the beta-bend suggests a general mode of connecting long parallel beta-sheet segments that would allow the growth of partially ordered fibril structures. The design allows the peptide backbone to change direction with a minimal loss of main chain hydrogen bonds. The identification of a coherent organization beyond that of the beta-sheet segments in different folds rich in parallel beta-sheets suggests a higher degree of ordered structure in protein fibrils, in agreement with their low solubility and dense molecular packing.

  12. Attitudes to Technology, Perceived Computer Self-Efficacy and Computer Anxiety as Predictors of Computer Supported Education

    ERIC Educational Resources Information Center

    Celik, Vehbi; Yesilyurt, Etem

    2013-01-01

    There is a large body of research regarding computer supported education, perceptions of computer self-efficacy, computer anxiety and the technological attitudes of teachers and teacher candidates. However, no study has been conducted on the correlation between and effect of computer supported education, perceived computer self-efficacy, computer…

  13. Computational predictions of the new Gallium nitride nanoporous structures

    NASA Astrophysics Data System (ADS)

    Lien, Le Thi Hong; Tuoc, Vu Ngoc; Duong, Do Thi; Thu Huyen, Nguyen

    2018-05-01

    Nanoporous structure prediction is an emerging area of research because such materials offer advantages for a wide range of materials science and technology applications in opto-electronics, the environment, sensors, shape-selective catalysis, and bio-catalysis, to name just a few. We propose a computationally and technically feasible approach for predicting Gallium nitride nanoporous structures with hollows at the nanoscale. The designed porous structures are studied computationally using the density functional tight binding (DFTB) and conventional density functional theory methods, revealing a variety of promising mechanical and electronic properties that can potentially find realistic future applications. Their stability is discussed by means of the free energy computed within the lattice-dynamics approach. Our calculations also indicate that all the reported hollow structures are wide band gap semiconductors, like their stable parent bulk phase. The electronic band structures of these nanoporous structures are finally examined in detail.

  14. Blood Flow in Idealized Vascular Access for Hemodialysis: A Review of Computational Studies.

    PubMed

    Ene-Iordache, Bogdan; Remuzzi, Andrea

    2017-09-01

    Although our understanding of the failure mechanism of vascular access for hemodialysis has increased substantially, this knowledge has not translated into successful therapies. Despite advances in technology, it is recognized that vascular access is difficult to maintain due to complications such as intimal hyperplasia. Computational studies have been used to estimate hemodynamic changes induced by vascular access creation. Due to the heterogeneity of patient-specific geometries and the difficulty of obtaining reliable models of access vessels, idealized models have often been employed. In this review we analyze the knowledge gained from the use of such simplified computational models. A review of the literature was conducted, considering studies employing a computational fluid dynamics approach to gain insights into the flow field phenotype that develops in idealized models of vascular access. Several important discoveries have originated from idealized model studies, including the detrimental role of disturbed and turbulent flow, and the beneficial role of spiral flow, in intimal hyperplasia. The general flow phenotype was consistent among studies, but findings were not treated homogeneously, since they paralleled achievements in cardiovascular biomechanics spanning the last two decades. Computational studies in idealized models are important for studying local blood flow features and for evaluating new concepts that may improve the patency of vascular access for hemodialysis. For future studies we strongly recommend numerical modelling targeted at accurately characterizing turbulent flows and multidirectional wall shear disturbances.
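
    Many of the idealized-model findings summarized above are expressed through disturbed-flow metrics computed from the wall shear stress (WSS). As one example, a minimal implementation of the oscillatory shear index (OSI), a standard measure of oscillatory WSS, evaluated on synthetic flow histories invented for illustration:

```python
import numpy as np

def oscillatory_shear_index(wss):
    """OSI = 0.5 * (1 - |time-averaged WSS vector| / time-average of |WSS|)
    for wall-shear-stress vectors sampled uniformly over one cycle (T x 3).
    0 indicates unidirectional flow; 0.5 indicates fully reversing flow."""
    mean_vector_magnitude = np.linalg.norm(wss.mean(axis=0))
    mean_magnitude = np.linalg.norm(wss, axis=1).mean()
    return 0.5 * (1.0 - mean_vector_magnitude / mean_magnitude)

steady = np.tile([1.0, 0.0, 0.0], (100, 1))                      # unidirectional
reversing = np.array([[1.0, 0.0, 0.0], [-1.0, 0.0, 0.0]] * 50)   # fully oscillatory
```

    Regions of a model anastomosis with OSI near 0.5 are the "disturbed flow" zones these studies associate with intimal hyperplasia.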

  15. CBSS Outreach Project: Computer-Based Study Strategies for Students with Learning Disabilities. Final Report.

    ERIC Educational Resources Information Center

    Anderson-Inman, Lynne; Ditson, Mary

    This final report describes activities and accomplishments of the four-year Computer-Based Study Strategies (CBSS) Outreach Project at the University of Oregon. This project disseminated information about using computer-based study strategies as an intervention for students with learning disabilities and provided teachers in participating outreach…

  16. Utility of screening computed tomography of chest, abdomen and pelvis in patients after heart transplantation.

    PubMed

    Dasari, Tarun W; Pavlovic-Surjancev, Biljana; Dusek, Linda; Patel, Nilamkumar; Heroux, Alain L

    2011-12-01

    Malignancy is a late cause of mortality in heart transplant recipients. It is unknown whether screening computed tomography scans would lead to early detection of such malignancies or of serious vascular anomalies after heart transplantation. This is a single-center observational study of patients undergoing surveillance computed tomography of the chest, abdomen and pelvis at least 5 years after transplantation. Abnormal findings included pulmonary nodules, lymphadenopathy, intra-thoracic and intra-abdominal masses, and vascular anomalies such as abdominal aortic aneurysm. The clinical follow-up of each of these major abnormal findings is summarized. A total of 63 patients underwent computed tomography scans of the chest, abdomen and pelvis at least 5 years after transplantation. Of these, 54 (86%) were male and 9 (14%) were female. Mean age was 52±9.2 years. Computed tomography revealed only 1 lung cancer (squamous cell). Nonspecific pulmonary nodules were seen in 6 patients (9.5%). The most common incidental finding was abdominal aortic aneurysm (N=6 (9.5%)), which necessitated follow-up computed tomography (N=5) or surgery (N=1). Mean time from transplantation to detection of abdominal aortic aneurysm was 14.6±4.2 years. Mean age at the time of detection of abdominal aortic aneurysm was 74.5±3.2 years. Screening computed tomography in patients 5 years from transplantation revealed only one malignancy but led to increased detection of abdominal aortic aneurysms. Its utility for detecting malignancy is thus low. Based on this study, we do not recommend routine computed tomography after heart transplantation. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.

  17. Experimental and Computational Study of Ductile Fracture in Small Punch Tests

    PubMed Central

    Bargmann, Swantje; Hähner, Peter

    2017-01-01

    A unified experimental-computational study on ductile fracture initiation and propagation during small punch testing is presented. Tests are carried out at room temperature with unnotched disks of different thicknesses where large-scale yielding prevails. In thinner specimens, fracture occurs with severe necking under membrane tension, whereas in thicker ones a through-thickness shearing mode prevails, changing the crack orientation relative to the loading direction. Computational studies involve finite element simulations using a shear-modified Gurson-Tvergaard-Needleman porous plasticity model with an integral-type nonlocal formulation. The predicted punch load-displacement curves and deformed profiles are in good agreement with the experimental results. PMID:29039748

  18. Experimental and Computational Study of Ductile Fracture in Small Punch Tests.

    PubMed

    Gülçimen Çakan, Betül; Soyarslan, Celal; Bargmann, Swantje; Hähner, Peter

    2017-10-17

    A unified experimental-computational study on ductile fracture initiation and propagation during small punch testing is presented. Tests are carried out at room temperature with unnotched disks of different thicknesses where large-scale yielding prevails. In thinner specimens, fracture occurs with severe necking under membrane tension, whereas in thicker ones a through-thickness shearing mode prevails, changing the crack orientation relative to the loading direction. Computational studies involve finite element simulations using a shear-modified Gurson-Tvergaard-Needleman porous plasticity model with an integral-type nonlocal formulation. The predicted punch load-displacement curves and deformed profiles are in good agreement with the experimental results.

  19. Logic as Marr's Computational Level: Four Case Studies.

    PubMed

    Baggio, Giosuè; van Lambalgen, Michiel; Hagoort, Peter

    2015-04-01

    We sketch four applications of Marr's levels-of-analysis methodology to the relations between logic and experimental data in the cognitive neuroscience of language and reasoning. The first part of the paper illustrates the explanatory power of computational level theories based on logic. We show that a Bayesian treatment of the suppression task in reasoning with conditionals is ruled out by EEG data, supporting instead an analysis based on defeasible logic. Further, we describe how results from an EEG study on temporal prepositions can be reanalyzed using formal semantics, addressing a potential confound. The second part of the article demonstrates the predictive power of logical theories drawing on EEG data on processing progressive constructions and on behavioral data on conditional reasoning in people with autism. Logical theories can constrain processing hypotheses all the way down to neurophysiology, and conversely neuroscience data can guide the selection of alternative computational level models of cognition. Copyright © 2014 Cognitive Science Society, Inc.

  20. Contextuality and Wigner-function negativity in qubit quantum computation

    NASA Astrophysics Data System (ADS)

    Raussendorf, Robert; Browne, Dan E.; Delfosse, Nicolas; Okay, Cihan; Bermejo-Vega, Juan

    2017-05-01

    We describe schemes of quantum computation with magic states on qubits for which contextuality and negativity of the Wigner function are necessary resources possessed by the magic states. These schemes satisfy a constraint: the non-negativity of Wigner functions must be preserved under all available measurement operations. Furthermore, we identify stringent consistency conditions on such computational schemes, revealing the general structure by which negativity of Wigner functions, hardness of classical simulation of the computation, and contextuality are connected.

  1. The Effects of Integrating Service Learning into Computer Science: An Inter-Institutional Longitudinal Study

    ERIC Educational Resources Information Center

    Payton, Jamie; Barnes, Tiffany; Buch, Kim; Rorrer, Audrey; Zuo, Huifang

    2015-01-01

    This study is a follow-up to one published in computer science education in 2010 that reported preliminary results showing a positive impact of service learning on student attitudes associated with success and retention in computer science. That paper described how service learning was incorporated into a computer science course in the context of…

  2. Reciprocal Questioning and Computer-based Instruction in Introductory Auditing: Student Perceptions.

    ERIC Educational Resources Information Center

    Watters, Mike

    2000-01-01

    An auditing course used reciprocal questioning (Socratic method) and computer-based instruction. Separate evaluations by 67 students revealed a strong aversion to the Socratic method; students expected professors to lecture. They showed a strong preference for the computer-based assignment. (SK)

  3. Handheld computers for self-administered sensitive data collection: A comparative study in Peru

    PubMed Central

    Bernabe-Ortiz, Antonio; Curioso, Walter H; Gonzales, Marco A; Evangelista, Wilfredo; Castagnetto, Jesus M; Carcamo, Cesar P; Hughes, James P; Garcia, Patricia J; Garnett, Geoffrey P; Holmes, King K

    2008-01-01

    Background: Low-cost handheld computers (PDAs) potentially represent an efficient tool for collecting sensitive data in surveys. The goal of this study is to evaluate the quality of sexual behavior data collected with handheld computers in comparison with paper-based questionnaires. Methods: A PDA-based program for data collection was developed using Open-Source tools. In two cross-sectional studies, we compared data concerning sexual behavior collected with paper forms to data collected with PDA-based forms in Ancon (Lima). Results: The first study enrolled 200 participants (18–29 years). General agreement between data collected with paper forms and handheld computers was 86%. Agreement for categorical variables was between 70.5% and 98.5% (Kappa: 0.43–0.86), while agreement for numeric variables was between 57.1% and 79.8% (Spearman: 0.76–0.95). Agreement and correlation were higher in those who had completed at least high school than in those with less education. The second study enrolled 198 participants. Rates of response to sensitive questions were similar between the two kinds of questionnaires. However, the numbers of inconsistencies (p = 0.0001) and missing values (p = 0.001) were significantly higher in paper questionnaires. Conclusion: This study showed the value of using handheld computers for collecting sensitive data, since a high level of agreement between paper and PDA responses was reached. In addition, fewer inconsistencies and missing values were found with the PDA-based system. This study has demonstrated that it is feasible to develop a low-cost application for handheld computers, and that PDAs are a feasible alternative for collecting field data in a developing country. PMID:18366687
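
    The kappa values quoted in the Results correct raw percent agreement for the agreement expected by chance. A minimal implementation of Cohen's kappa, evaluated on a hypothetical 2 x 2 paper-versus-PDA agreement table (the counts are invented, not the study's data):

```python
import numpy as np

def cohens_kappa(table):
    """Cohen's kappa from an n x n agreement table (method A x method B)."""
    t = np.asarray(table, dtype=float)
    n = t.sum()
    p_observed = np.trace(t) / n                           # raw agreement
    p_chance = (t.sum(axis=0) * t.sum(axis=1)).sum() / n**2  # chance agreement
    return (p_observed - p_chance) / (1 - p_chance)

# Hypothetical table: 90 of 100 participants gave the same answer both ways
kappa = cohens_kappa([[45, 5], [5, 45]])   # 0.90 raw agreement -> kappa 0.8
```

    The gap between raw agreement (0.90) and kappa (0.80) in this toy table illustrates why the study reports both percent agreement and kappa.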

  4. Oxidative damage in DNA bases revealed by UV resonant Raman spectroscopy.

    PubMed

    D'Amico, Francesco; Cammisuli, Francesca; Addobbati, Riccardo; Rizzardi, Clara; Gessini, Alessandro; Masciovecchio, Claudio; Rossi, Barbara; Pascolo, Lorella

    2015-03-07

    We report on the use of the UV Raman technique to monitor the oxidative damage of deoxynucleotide triphosphates (dATP, dGTP, dCTP and dTTP) and DNA (plasmid vector) solutions. Nucleotide and DNA aqueous solutions were exposed to hydrogen peroxide (H2O2) and iron-containing carbon nanotubes (CNTs) to produce Fenton's reaction and induce oxidative damage. UV Raman spectroscopy is shown to be highly effective in revealing changes in the nitrogenous bases during the oxidative processes occurring in these molecules. The analysis of the Raman spectra, supported by numerical computations, revealed that Fenton's reaction causes oxidation of the nitrogenous bases in dATP, dGTP and dCTP solutions, leading to the production of 2-hydroxyadenine, 8-hydroxyguanine and 5-hydroxycytosine. No change in thymine was revealed in the dTTP solution under the same conditions. Compared to single-nucleotide solutions, plasmid DNA oxidation resulted in more severe damage, which breaks the adenine and guanine aromatic rings. Our study demonstrates the advantage of using UV Raman spectroscopy for rapidly monitoring oxidation changes in DNA aqueous solutions that can be assigned to specific nitrogenous bases.

  5. Quantum Optical Implementations of Current Quantum Computing Paradigms

    DTIC Science & Technology

    2005-05-01

    Conferences and Proceedings: The results were presented at several conferences. These include: 1. M. O. Scully, "Foundations of Quantum Mechanics", in ... applications have revealed a strong connection between the fundamental aspects of quantum mechanics that govern physical systems and the informational ... could be solved in polynomial time using quantum computers. Another set of problems where quantum mechanics can carry out computations substantially

  6. Heavy Lift Vehicle (HLV) Avionics Flight Computing Architecture Study

    NASA Technical Reports Server (NTRS)

    Hodson, Robert F.; Chen, Yuan; Morgan, Dwayne R.; Butler, A. Marc; Sdhuh, Joseph M.; Petelle, Jennifer K.; Gwaltney, David A.; Coe, Lisa D.; Koelbl, Terry G.; Nguyen, Hai D.

    2011-01-01

    A NASA multi-Center study team was assembled from LaRC, MSFC, KSC, JSC and WFF to examine potential flight computing architectures for a Heavy Lift Vehicle (HLV), to better understand avionics drivers. The study examined Design Reference Missions (DRMs) and vehicle requirements that could impact the vehicle's avionics. The study considered multiple self-checking and voting architectural variants and examined reliability, fault tolerance, mass, power, and redundancy management impacts. Furthermore, a goal of the study was to develop the skills and tools needed to rapidly assess additional architectures should requirements or assumptions change.
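
    A basic building block of the voting architectures such a study weighs is the 2-of-3 majority voter used in triple-modular-redundant (TMR) computer strings. A minimal bitwise sketch, illustrative only and not the study's design:

```python
def majority_vote(a: int, b: int, c: int) -> int:
    """Bitwise 2-of-3 majority: each output bit matches at least two of the
    three input channels, masking any single faulty channel."""
    return (a & b) | (a & c) | (b & c)

# A single-channel fault (third argument) is outvoted by the other two
voted = majority_vote(0b1010, 0b1010, 0b0110)   # recovers 0b1010
```

    If each channel survives a mission with reliability R, the voted system survives when at least two channels do, giving R_TMR = 3R^2 - 2R^3, which exceeds R only for R > 0.5; trading that reliability gain against the mass and power of extra strings is exactly the kind of comparison such a study makes.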

  7. Using minimal human-computer interfaces for studying the interactive development of social awareness

    PubMed Central

    Froese, Tom; Iizuka, Hiroyuki; Ikegami, Takashi

    2014-01-01

    According to the enactive approach to cognitive science, perception is essentially a skillful engagement with the world. Learning how to engage via a human-computer interface (HCI) can therefore be taken as an instance of developing a new mode of experiencing. Similarly, social perception is theorized to be primarily constituted by skillful engagement between people, which implies that it is possible to investigate the origins and development of social awareness using multi-user HCIs. We analyzed the trial-by-trial objective and subjective changes in sociality that took place during a perceptual crossing experiment in which embodied interaction between pairs of adults was mediated over a minimalist haptic HCI. Since that study required participants to implicitly relearn how to mutually engage so as to perceive each other's presence, we hypothesized that there would be indications that the initial developmental stages of social awareness were recapitulated. Preliminary results reveal that, despite the lack of explicit feedback about task performance, there was a trend for the clarity of social awareness to increase over time. We discuss the methodological challenges involved in evaluating whether this trend was characterized by distinct developmental stages of objective behavior and subjective experience. PMID:25309490

  8. The effects of integrating service learning into computer science: an inter-institutional longitudinal study

    NASA Astrophysics Data System (ADS)

    Payton, Jamie; Barnes, Tiffany; Buch, Kim; Rorrer, Audrey; Zuo, Huifang

    2015-07-01

    This study is a follow-up to one published in computer science education in 2010 that reported preliminary results showing a positive impact of service learning on student attitudes associated with success and retention in computer science. That paper described how service learning was incorporated into a computer science course in the context of the Students & Technology in Academia, Research, and Service (STARS) Alliance, an NSF-supported broadening participation in computing initiative that aims to diversify the computer science pipeline through innovative pedagogy and inter-institutional partnerships. The current paper describes how the STARS Alliance has expanded to diverse institutions, all using service learning as a vehicle for broadening participation in computing and enhancing attitudes and behaviors associated with student success. Results supported the STARS model of service learning for enhancing computing efficacy and computing commitment and for providing diverse students with many personal and professional development benefits.

  9. Study on the application of mobile internet cloud computing platform

    NASA Astrophysics Data System (ADS)

    Gong, Songchun; Fu, Songyin; Chen, Zheng

    2012-04-01

    The innovative development of computer technology has promoted the adoption of the cloud computing platform, which is in essence a new model of resource services that meets users' needs for different kinds of resources. Cloud computing offers advantages in many respects: it reduces the difficulty of operating the system and makes it easy for users to search for, acquire and process resources. Accordingly, the author takes the management of digital libraries as the research focus of this paper and analyzes the key technologies of the mobile internet cloud computing platform in operation. The popularization of computer technology has driven the creation of digital library models, whose core idea is to strengthen the management of library resource information through computers and to construct a high-performance inquiry and search platform that allows users to access the necessary information resources at any time. Cloud computing distributes computations across a large number of distributed computers, thereby implementing a connected service over multiple machines. Digital libraries, as a typical representative of cloud computing applications, can thus be used to analyze the key technologies of cloud computing.

  10. An Inviscid Computational Study of an X-33 Configuration at Hypersonic Speeds

    NASA Technical Reports Server (NTRS)

    Prabhu, Ramadas K.

    1999-01-01

    This report documents the results of a study conducted to compute the inviscid longitudinal aerodynamic characteristics of a simplified X-33 configuration. The major components of the X-33 vehicle, namely the body, the canted fin, the vertical fin, and the body flap, were simulated in the CFD (Computational Fluid Dynamics) model. The rearward-facing surfaces at the base, including the aerospike engine surfaces, were not simulated. The FELISA software package, consisting of an unstructured surface and volume grid generator and two inviscid flow solvers, was used for this study. Computations were made for Mach 4.96, 6.0, and 10.0 with the perfect-gas air option, and for Mach 10 with the equilibrium-air option at the flow conditions of a typical point on the X-33 flight trajectory. Computations were also made with the CF4 gas option at Mach 6.0 to simulate the CF4 tunnel flow conditions. An angle-of-attack range of 12 to 48 deg was covered. The CFD results were compared with available wind tunnel data. Agreement was good at low angles of attack; at higher angles of attack (beyond 25 deg) some differences were found in the pitching moment. These differences progressively increased with angle of attack and are attributed to viscous effects. Nevertheless, the computed results showed the trends exhibited by the wind tunnel data.

  11. Genome-wide association study reveals sex-specific selection signals against autosomal nucleotide variants.

    PubMed

    Ryu, Dongchan; Ryu, Jihye; Lee, Chaeyoung

    2016-05-01

    A genome-wide association study (GWAS) was conducted to examine genetic associations of common autosomal nucleotide variants with sex in a Korean population of 4183 males and 4659 females. Nine genetic association signals were identified in four intragenic and five intergenic regions (P<5 × 10(-8)). Further analysis with an independent data set confirmed two intragenic association signals in the genes encoding protein phosphatase 1, regulatory subunit 12B (PPP1R12B, intron 12, rs1819043) and dynein, axonemal, heavy chain 11 (DNAH11, intron 61, rs10255013), which are directly involved in the reproductive system. This study is the first to reveal, by GWAS, autosomal genetic variants associated with sex ratio. This implies that genetic variants in proximity to the association signals may influence sex-specific selection and contribute to sex ratio variation. Further studies are required to reveal the mechanisms underlying sex-specific selection.
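
    The genome-wide threshold quoted above (P < 5 × 10(-8)) is applied to a per-variant association test. A minimal 1-degree-of-freedom allelic chi-square test on a 2 × 2 allele-count table, with invented counts (not the study's data):

```python
from math import erfc, sqrt

def allelic_chi2(a1_group1, a2_group1, a1_group2, a2_group2):
    """Chi-square test (1 df) on a 2x2 allele-count table (allele x group).

    Returns (chi2, p), using the identity P(chi2_1 > x) = erfc(sqrt(x / 2)).
    """
    n = a1_group1 + a2_group1 + a1_group2 + a2_group2
    r1 = a1_group1 + a2_group1          # group totals
    r2 = a1_group2 + a2_group2
    c1 = a1_group1 + a1_group2          # allele totals
    c2 = a2_group1 + a2_group2
    chi2 = n * (a1_group1 * a2_group2 - a2_group1 * a1_group2) ** 2 / (r1 * r2 * c1 * c2)
    return chi2, erfc(sqrt(chi2 / 2.0))

# Invented allele counts: males contribute 2 x 4183 alleles, females 2 x 4659
chi2, p = allelic_chi2(3000, 5366, 2600, 6718)
genome_wide_significant = p < 5e-8
```

    The 5 × 10(-8) threshold corresponds roughly to a Bonferroni correction for about one million independent common-variant tests.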

  12. Mononuclear nickel (II) and copper (II) coordination complexes supported by bispicen ligand derivatives: Experimental and computational studies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Singh, Nirupama; Niklas, Jens; Poluektov, Oleg

    2017-01-01

    The synthesis, characterization, and density functional theory calculations of mononuclear Ni and Cu complexes supported by the N,N’-Dimethyl-N,N’-bis-(pyridine-2-ylmethyl)-1,2-diaminoethane ligand and its derivatives are reported. The complexes were characterized by X-ray crystallography as well as by UV-visible absorption spectroscopy and EPR spectroscopy. The solid-state structures of these coordination complexes revealed that the geometry of each complex depended on the identity of the metal center. Solution-phase characterization data are in accord with the solid-phase structures, indicating minimal structural changes in solution. Optical spectroscopy revealed that all of the complexes exhibit color owing to d-d transition bands in the visible region. Magnetic parameters obtained from EPR spectroscopy, together with other structural data, suggest that the Ni(II) complexes adopt a pseudo-octahedral geometry and the Cu(II) complexes a distorted square-pyramidal geometry. In order to understand in detail how ligand sterics and electronics affect complex topology, detailed computational studies were performed. The series of complexes reported in this article adds significant value to the field of coordination chemistry, as Ni(II) and Cu(II) complexes supported by tetradentate pyridyl-based ligands are rather scarce.

  13. Computational Study of Hypersonic Boundary Layer Stability on Cones

    NASA Astrophysics Data System (ADS)

    Gronvall, Joel Edwin

Due to the complex nature of boundary layer laminar-turbulent transition in hypersonic flows and its effect on the design of re-entry vehicles, there remains considerable interest in developing a deeper understanding of the underlying physics. To that end, using experimental observations and computational analysis in a complementary manner provides the greatest insight. It is the intent of this work to provide such an analysis for two ongoing experimental investigations. The first focuses on the hypersonic boundary layer transition experiments for a slender cone being conducted at JAXA's free-piston shock tunnel HIEST facility. Of particular interest are the measurements of disturbance frequencies associated with transition at high enthalpies. The computational analysis for these cases included two-dimensional CFD mean flow solutions for use in boundary layer stability analyses. The disturbances in the boundary layer were calculated using the linear parabolized stability equations. Estimates of transition locations, comparisons of measured and computed disturbance frequencies, and a determination of the type of disturbances present were made. It was found that, for cases where the disturbances were measured at locations where the flow was still laminar but nearly transitional, the highly amplified disturbances showed reasonable agreement with the computations. Additionally, an investigation of the effects of finite-rate chemistry and vibrational excitation on flows over cones was conducted for a set of theoretical operational conditions at the HIEST facility. The second study focuses on transition in three-dimensional hypersonic boundary layers; for this, the cone-at-angle-of-attack experiments being conducted at the Boeing/AFOSR Mach-6 quiet tunnel at Purdue University were examined. Specifically, the effect of surface roughness on the development of the stationary crossflow instability is investigated.

  14. A computational study of liposome logic: towards cellular computing from the bottom up

    PubMed Central

    Smaldon, James; Romero-Campero, Francisco J.; Fernández Trillo, Francisco; Gheorghe, Marian; Alexander, Cameron

    2010-01-01

In this paper we propose a new bottom-up approach to cellular computing, in which computational chemical processes are encapsulated within liposomes. This “liposome logic” approach (also called vesicle computing) makes use of supra-molecular chemistry constructs, e.g. protocells and chells, as minimal cellular platforms to which logical functionality can be added. Modeling and simulations feature prominently in “top-down” synthetic biology, particularly in the specification, design and implementation of logic circuits through bacterial genome reengineering. The second contribution of this paper is the demonstration of a novel set of tools for the specification, modelling and analysis of “bottom-up” liposome logic. In particular, simulation and modelling techniques are used to analyse some example liposome logic designs, ranging from relatively simple NOT and NAND gates to SR latches and D flip-flops, all the way to 3-bit ripple counters. The approach we propose consists of specifying, by means of P systems, gene regulatory network-like systems operating inside proto-membranes. This P systems specification can be automatically translated and executed through a multiscale pipeline composed of a dissipative particle dynamics (DPD) simulator and Gillespie’s stochastic simulation algorithm (SSA). Finally, model selection and analysis can be performed through a model checking phase. This is the first paper we are aware of that brings formal specifications, DPD, SSA and model checking to bear on the problem of modeling target computational functionality in protocells. Potential chemical routes for the laboratory implementation of these simulations are also discussed, suggesting for the first time a potentially realistic physicochemical implementation for membrane computing from the bottom up. PMID:21886681
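The abstract above leans on Gillespie's stochastic simulation algorithm (SSA) for executing the chemistry inside each compartment. As a point of reference only, here is a minimal generic SSA sketch (not the authors' P-system pipeline); the birth-death reaction system at the bottom, with rates `k` and `g`, is a hypothetical example chosen for illustration:

```python
import math
import random

def gillespie(rates, stoich, x0, t_max, seed=0):
    """Minimal Gillespie SSA. `rates(x)` returns the propensity of each
    reaction in state x; `stoich[i]` is the state change of reaction i."""
    rng = random.Random(seed)
    t, x = 0.0, list(x0)
    traj = [(t, list(x))]
    while t < t_max:
        a = rates(x)
        a0 = sum(a)
        if a0 == 0:          # no reaction can fire
            break
        # exponentially distributed waiting time to the next reaction
        t += -math.log(1.0 - rng.random()) / a0
        # choose a reaction with probability proportional to its propensity
        r = rng.random() * a0
        i = 0
        while r > a[i]:
            r -= a[i]
            i += 1
        x = [xi + d for xi, d in zip(x, stoich[i])]
        traj.append((t, list(x)))
    return traj

# Hypothetical birth-death process for a single protein species:
# production at constant rate k, degradation at rate g * x.
k, g = 5.0, 0.1
traj = gillespie(lambda x: [k, g * x[0]], [[1], [-1]], [0], t_max=100.0)
print(len(traj), traj[-1])
```

The trajectory fluctuates around the steady-state mean k/g; in a real liposome-logic model the reaction list would come from the translated P-system specification.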

  15. Computing UV/vis spectra using a combined molecular dynamics and quantum chemistry approach: bis-triazin-pyridine (BTP) ligands studied in solution.

    PubMed

    Höfener, Sebastian; Trumm, Michael; Koke, Carsten; Heuser, Johannes; Ekström, Ulf; Skerencak-Frech, Andrej; Schimmelpfennig, Bernd; Panak, Petra J

    2016-03-21

We report a combined computational and experimental study of the UV/vis spectra of 2,6-bis(5,6-dialkyl-1,2,4-triazin-3-yl)pyridine (BTP) ligands in solution. In order to study molecules in solution using theoretical methods, force-field parameters for the ligand-water interaction are adjusted to ab initio quantum chemical calculations. Based on these parameters, molecular dynamics (MD) simulations are carried out, from which snapshots are extracted as input to quantum chemical excitation-energy calculations to obtain UV/vis spectra of BTP ligands in solution using time-dependent density functional theory (TDDFT) within the Tamm-Dancoff approximation (TDA). The range-separated CAM-B3LYP functional is used to avoid large errors for the charge-transfer states occurring in the electronic spectra. In order to study environment effects with theoretical methods, the frozen-density embedding scheme is applied. This computational procedure yields electronic spectra calculated at the (range-separated) DFT level of theory in solution, revealing solvatochromic shifts upon solvation of up to about 0.6 eV. Comparison to experimental data shows significantly improved agreement relative to vacuum calculations and enables the analysis of the excitations relevant to the line shape in solution.
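A shift of 0.6 eV can be translated into a wavelength shift via λ = hc/E ≈ 1239.84 eV·nm / E. The 4.0 eV vacuum excitation below is a hypothetical value chosen only to illustrate the size of the effect, not a number from the study:

```python
HC_EV_NM = 1239.84193  # Planck constant times speed of light, in eV*nm

def ev_to_nm(energy_ev):
    """Convert an excitation energy in eV to a wavelength in nm."""
    return HC_EV_NM / energy_ev

# Hypothetical example: a 4.0 eV vacuum excitation red-shifted by 0.6 eV
vacuum_nm = ev_to_nm(4.0)
solution_nm = ev_to_nm(4.0 - 0.6)
print(round(vacuum_nm, 1), round(solution_nm, 1))  # 310.0 364.7
```

A fixed shift in eV thus corresponds to a larger wavelength shift the lower the excitation energy is.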

  16. Designing a Versatile Dedicated Computing Lab to Support Computer Network Courses: Insights from a Case Study

    ERIC Educational Resources Information Center

    Gercek, Gokhan; Saleem, Naveed

    2006-01-01

    Providing adequate computing lab support for Management Information Systems (MIS) and Computer Science (CS) programs is a perennial challenge for most academic institutions in the US and abroad. Factors, such as lack of physical space, budgetary constraints, conflicting needs of different courses, and rapid obsolescence of computing technology,…

  17. Computational and Experimental Study of Supersonic Nozzle Flow and Shock Interactions

    NASA Technical Reports Server (NTRS)

    Carter, Melissa B.; Elmiligui, Alaa A.; Nayani, Sudheer N.; Castner, Ray; Bruce, Walter E., IV; Inskeep, Jacob

    2015-01-01

This study focused on the capability of the NASA Tetrahedral Unstructured Software System's CFD code USM3D to predict the interaction between a shock and a supersonic plume flow. Previous studies, published in 2004, 2009 and 2013, compared USM3D's supersonic plume flow results against historical experimental data. The current study builds on that research by utilizing the best practices from the earlier papers for properly capturing the plume flow and then adding a wedge acting as a shock generator. This computational study is in conjunction with experimental tests conducted at the Glenn Research Center 1'x1' Supersonic Wind Tunnel. The comparison of the computational and experimental data shows good agreement for the location and strength of the shocks, although there are vertical shifts between the data sets that may be due to the measurement technique.

  18. Computer Music

    NASA Astrophysics Data System (ADS)

    Cook, Perry R.

    This chapter covers algorithms, technologies, computer languages, and systems for computer music. Computer music involves the application of computers and other digital/electronic technologies to music composition, performance, theory, history, and the study of perception. The field combines digital signal processing, computational algorithms, computer languages, hardware and software systems, acoustics, psychoacoustics (low-level perception of sounds from the raw acoustic signal), and music cognition (higher-level perception of musical style, form, emotion, etc.).

  19. Effectiveness of Computer-Assisted Mathematics Education (CAME) over Academic Achievement: A Meta-Analysis Study

    ERIC Educational Resources Information Center

    Demir, Seda; Basol, Gülsah

    2014-01-01

    The aim of the current study is to determine the overall effects of Computer-Assisted Mathematics Education (CAME) on academic achievement. After an extensive review of the literature, studies using Turkish samples and observing the effects of Computer-Assisted Education (CAE) on mathematics achievement were examined. As a result of this…

  20. Audio-visual perception of 3D cinematography: an fMRI study using condition-based and computation-based analyses.

    PubMed

    Ogawa, Akitoshi; Bordier, Cecile; Macaluso, Emiliano

    2013-01-01

The use of naturalistic stimuli to probe sensory functions in the human brain is gaining increasing interest. Previous imaging studies examined brain activity associated with the processing of cinematographic material using both standard "condition-based" designs and "computational" methods based on the extraction of time-varying features of the stimuli (e.g. motion). Here, we exploited both approaches to investigate the neural correlates of complex visual and auditory spatial signals in cinematography. In the first experiment, the participants watched a piece of a commercial movie presented in four blocked conditions: 3D vision with surround sound (3D-Surround), 3D with monaural sound (3D-Mono), 2D-Surround, and 2D-Mono. In the second experiment, they watched two different segments of the movie, both presented continuously in 3D-Surround. The blocked presentation served for standard condition-based analyses, while all datasets were submitted to computation-based analyses. The latter assessed where activity co-varied with visual disparity signals and with the complexity of auditory multi-source signals. The blocked analyses associated 3D viewing with activation of the dorsal and lateral occipital cortex and superior parietal lobule, while the surround sounds activated the superior and middle temporal gyri (S/MTG). The computation-based analyses revealed effects of absolute disparity in dorsal occipital and posterior parietal cortices and of disparity gradients in the posterior middle temporal gyrus plus the inferior frontal gyrus. The complexity of the surround sounds was associated with activity in specific sub-regions of S/MTG, even after accounting for changes in sound intensity. These results demonstrate that the processing of naturalistic audio-visual signals entails an extensive set of visual and auditory areas, and that computation-based analyses can track the contribution of complex spatial aspects characterizing such life-like stimuli.

  1. An Integrated Multi-Omics Study Revealed Metabolic Alterations Underlying the Effects of Coffee Consumption

    PubMed Central

    Takahashi, Shoko; Saito, Kenji; Jia, Huijuan; Kato, Hisanori

    2014-01-01

Many epidemiological studies have indicated that coffee consumption may reduce the risks of developing obesity and diabetes, but the underlying mechanisms of these effects are poorly understood. Our previous study revealed changes in gene expression profiles in the livers of C57BL/6J mice fed a high-fat diet containing three types of coffee (caffeinated, decaffeinated and green unroasted coffee), using DNA microarrays. The results revealed remarkable alterations in lipid metabolism-related molecules which may be involved in the anti-obesity effects of coffee. We conducted the present study to further elucidate the metabolic alterations underlying the effects of coffee consumption through comprehensive proteomic and metabolomic analyses. Proteomics revealed an up-regulation of isocitrate dehydrogenase (a key enzyme in the TCA cycle) and its related proteins, suggesting increased energy generation. Metabolomics showed an up-regulation of metabolites involved in the urea cycle, with which the transcriptome data were highly consistent, indicating accelerated energy expenditure. The TCA cycle and the urea cycle are likely to be accelerated in a concerted manner, since they are directly connected by mutually providing each other's intermediates. The up-regulation of these pathways might result in a metabolic shift causing increased ATP turnover, which is related to the alterations of lipid metabolism. This mechanism may play an important part in the suppressive effects of coffee consumption on obesity, inflammation, and hepatosteatosis. This study newly revealed global metabolic alterations induced by coffee intake, providing significant insights into the association between coffee intake and the prevention of type 2 diabetes, utilizing the benefits of multi-omics analyses. PMID:24618914

  2. Computational Study of the CC3 Impeller and Vaneless Diffuser Experiment

    NASA Technical Reports Server (NTRS)

    Kulkarni, Sameer; Beach, Timothy A.; Skoch, Gary J.

    2013-01-01

    Centrifugal compressors are compatible with the low exit corrected flows found in the high pressure compressor of turboshaft engines and may play an increasing role in turbofan engines as engine overall pressure ratios increase. Centrifugal compressor stages are difficult to model accurately with RANS CFD solvers. A computational study of the CC3 centrifugal impeller in its vaneless diffuser configuration was undertaken as part of an effort to understand potential causes of RANS CFD mis-prediction in these types of geometries. Three steady, periodic cases of the impeller and diffuser were modeled using the TURBO Parallel Version 4 code: 1) a k-epsilon turbulence model computation on a 6.8 million point grid using wall functions, 2) a k-epsilon turbulence model computation on a 14 million point grid integrating to the wall, and 3) a k-omega turbulence model computation on the 14 million point grid integrating to the wall. It was found that all three cases compared favorably to data from inlet to impeller trailing edge, but the k-epsilon and k-omega computations had disparate results beyond the trailing edge and into the vaneless diffuser. A large region of reversed flow was observed in the k-epsilon computations which extended from 70% to 100% span at the exit rating plane, whereas the k-omega computation had reversed flow from 95% to 100% span. Compared to experimental data at near-peak-efficiency, the reversed flow region in the k-epsilon case resulted in an under-prediction in adiabatic efficiency of 8.3 points, whereas the k-omega case was 1.2 points lower in efficiency.

  4. Nursing Professionals' Evaluation in Integrating the Computers in English for Nursing Purposes (ENP) Instruction and Learning

    ERIC Educational Resources Information Center

    Yu, Wei-Chieh Wayne

    2013-01-01

    This study was designed to examine the pre- and in-service nursing professionals' perceptions of using computers to facilitate foreign language learning as consideration for future English for nursing purposes instruction. One hundred and ninety seven Taiwanese nursing students participated in the study. Findings revealed that (1) the participants…

  5. Computational Modeling and Treatment Identification in the Myelodysplastic Syndromes.

    PubMed

    Drusbosky, Leylah M; Cogle, Christopher R

    2017-10-01

This review discusses the need for computational modeling in the myelodysplastic syndromes (MDS) and early test results. As our evolving understanding of MDS reveals a molecularly complicated disease, sophisticated computational analytics are required to keep track of the number of molecular abnormalities and the complex interplay among them. Computational modeling and digital drug simulations using whole-exome sequencing data as input have produced early results showing high accuracy in predicting treatment response to standard-of-care drugs. Furthermore, the computational MDS models serve as clinically relevant MDS cell lines for preclinical assays of investigational agents. MDS is an ideal disease for computational modeling and digital drug simulations. Current research is focused on establishing the predictive value of computational modeling. Future research will test the clinical advantage of computer-informed therapy in MDS.

  6. Computational Modeling for the Flow Over a Multi-Element Airfoil

    NASA Technical Reports Server (NTRS)

    Liou, William W.; Liu, Feng-Jun

    1999-01-01

The flow over a multi-element airfoil is computed using two two-equation turbulence models. The computations are performed using the INS2D Navier-Stokes code for two angles of attack. Overset grids are used for the three-element airfoil. The computed results are compared with experimental data for the surface pressure, skin friction coefficient, and velocity magnitude. The computed surface quantities generally agree well with the measurements. The computed results reveal the possible existence of a mixing-layer-like region of flow next to the suction surface of the slat for both angles of attack.

  7. Basic concepts and development of an all-purpose computer interface for ROC/FROC observer study.

    PubMed

    Shiraishi, Junji; Fukuoka, Daisuke; Hara, Takeshi; Abe, Hiroyuki

    2013-01-01

In this study, we first investigated various aspects of the requirements for a computer interface employed in receiver operating characteristic (ROC) and free-response ROC (FROC) observer studies, which involve digital images and ratings obtained by observers (radiologists). Second, taking these aspects into account, we developed an all-purpose computer interface for these observer performance studies. Observer studies can be classified into three paradigms: one rating per case without identification of a signal location, one rating per case with identification of a signal location, and multiple ratings per case with identification of signal locations. For these paradigms, display modes on the computer interface can be used for single/multiple views of a static image, continuous viewing of cascade images (i.e., CT, MRI), and dynamic viewing of movies (i.e., DSA, ultrasound). Various functions in these display modes, including windowing (contrast/level), magnification, and annotation, need to be selected by the experimenter according to the purpose of the research. In addition, the rules of judgment for distinguishing between true positives and false positives are an important factor in estimating diagnostic accuracy in an observer study. We developed a computer interface that runs on the Windows operating system and takes into account all aspects required for various observer studies. The interface requires experimenters to have sufficient knowledge of ROC/FROC observer studies, but allows its use for observer studies of any design. This computer interface will be distributed publicly in the near future.
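The ratings such an interface collects are ultimately summarized as an ROC area. As a minimal sketch (not the authors' software), the empirical area under the ROC curve can be computed directly from confidence ratings via the Mann-Whitney statistic; the 5-point ratings below are hypothetical:

```python
def auc_from_ratings(signal, noise):
    """Empirical ROC area via the Mann-Whitney statistic: the fraction of
    (signal, noise) rating pairs ranked correctly, counting ties as half."""
    wins = sum((s > n) + 0.5 * (s == n) for s in signal for n in noise)
    return wins / (len(signal) * len(noise))

# Hypothetical 5-point confidence ratings from one reader
signal_ratings = [5, 4, 4, 3, 2]   # signal-present cases
noise_ratings = [3, 2, 2, 1, 1]    # signal-absent cases
print(auc_from_ratings(signal_ratings, noise_ratings))  # 0.9
```

An area of 0.5 corresponds to chance performance and 1.0 to perfect discrimination; FROC analysis additionally requires the location-matching rules discussed in the abstract.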

  8. An Exploratory Study of the Implementation of Computer Technology in an American Islamic Private School

    ERIC Educational Resources Information Center

    Saleem, Mohammed M.

    2009-01-01

    This exploratory study of the implementation of computer technology in an American Islamic private school leveraged the case study methodology and ethnographic methods informed by symbolic interactionism and the framework of the Muslim Diaspora. The study focused on describing the implementation of computer technology and identifying the…

  9. X-Ray Computed Tomography Reveals the Response of Root System Architecture to Soil Texture.

    PubMed

    Rogers, Eric D; Monaenkova, Daria; Mijar, Medhavinee; Nori, Apoorva; Goldman, Daniel I; Benfey, Philip N

    2016-07-01

    Root system architecture (RSA) impacts plant fitness and crop yield by facilitating efficient nutrient and water uptake from the soil. A better understanding of the effects of soil on RSA could improve crop productivity by matching roots to their soil environment. We used x-ray computed tomography to perform a detailed three-dimensional quantification of changes in rice (Oryza sativa) RSA in response to the physical properties of a granular substrate. We characterized the RSA of eight rice cultivars in five different growth substrates and determined that RSA is the result of interactions between genotype and growth environment. We identified cultivar-specific changes in RSA in response to changing growth substrate texture. The cultivar Azucena exhibited low RSA plasticity in all growth substrates, whereas cultivar Bala root depth was a function of soil hardness. Our imaging techniques provide a framework to study RSA in different growth environments, the results of which can be used to improve root traits with agronomic potential. © 2016 American Society of Plant Biologists. All Rights Reserved.

  10. Religious Studies as a Test-Case For Computer-Assisted Instruction In The Humanities.

    ERIC Educational Resources Information Center

    Jones, Bruce William

    Experiences with computer-assisted instructional (CAI) programs written for religious studies indicate that CAI has contributions to offer the humanities and social sciences. The usefulness of the computer for presentation, drill and review of factual material and its applicability to quantifiable data is well accepted. There now exist…

  11. Chinese EFL Teachers' Social Interaction and Socio-Cognitive Presence in Synchronous Computer-Mediated Communication

    ERIC Educational Resources Information Center

    Wu, Heping; Gao, Junde; Zhang, Weimin

    2014-01-01

    The present study examines the professional growth of three Chinese English teachers by analyzing their interactional patterns and their social and cognitive presence in an online community. The data from social network analysis (SNA) and content analysis revealed that computer-mediated communication (CMC) created new opportunities for teachers to…

  12. Teaching of real numbers by using the Archimedes-Cantor approach and computer algebra systems

    NASA Astrophysics Data System (ADS)

    Vorob'ev, Evgenii M.

    2015-11-01

Computer technologies, and especially computer algebra systems (CAS), allow students to overcome some of the difficulties they encounter in the study of real numbers. The teaching of calculus can be considerably more effective with the use of CAS, provided the didactics of the discipline makes it possible to reveal the full computational potential of CAS. In the case of real numbers, the Archimedes-Cantor approach satisfies this requirement. The name of Archimedes recalls the method of exhaustion. Cantor's name reminds us of the use of Cauchy sequences of rationals to represent real numbers. Using CAS with the Archimedes-Cantor approach enables the discussion of various representations of real numbers, such as graphical, decimal, approximate decimal with precision estimates, and representation as points on a straight line. Exercises with numbers such as e, π, the golden ratio ϕ, and algebraic irrational numbers can help students better understand the real numbers. The Archimedes-Cantor approach also reveals a deep and close relationship between real numbers and continuity, in particular the continuity of functions.
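The exhaustion-style idea of bracketing a real number by rationals, with an explicit precision estimate, can be demonstrated in any CAS; here is a small illustrative sketch in Python using exact rational arithmetic (not tied to any particular CAS mentioned in the abstract):

```python
from fractions import Fraction

def sqrt2_bracket(steps):
    """Bisection in exact rational arithmetic: after n steps the interval
    [lo, hi] contains sqrt(2) and has width exactly 2**-n, giving the
    'approximate decimal with precision estimate' representation."""
    lo, hi = Fraction(1), Fraction(2)
    for _ in range(steps):
        mid = (lo + hi) / 2
        if mid * mid < 2:
            lo = mid
        else:
            hi = mid
    return lo, hi

lo, hi = sqrt2_bracket(20)
print(float(lo), float(hi))  # both agree with sqrt(2) to within 2**-20
```

The successive endpoints form a Cauchy sequence of rationals in Cantor's sense, while the shrinking intervals mirror Archimedean exhaustion.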

  13. A cost-utility analysis of the use of preoperative computed tomographic angiography in abdomen-based perforator flap breast reconstruction.

    PubMed

    Offodile, Anaeze C; Chatterjee, Abhishek; Vallejo, Sergio; Fisher, Carla S; Tchou, Julia C; Guo, Lifei

    2015-04-01

Computed tomographic angiography is a diagnostic tool increasingly used for preoperative vascular mapping in abdomen-based perforator flap breast reconstruction. This study compared the use of computed tomographic angiography and the conventional practice of Doppler ultrasonography only in postmastectomy reconstruction using a cost-utility model. Following a comprehensive literature review, a decision analytic model was created using the three most clinically relevant health outcomes in free autologous breast reconstruction with computed tomographic angiography versus Doppler ultrasonography only. Cost and utility estimates for each health outcome were used to derive the quality-adjusted life-years and incremental cost-utility ratio. One-way sensitivity analysis was performed to scrutinize the robustness of the authors' results. Six studies and 782 patients were identified. Cost-utility analysis revealed a baseline cost savings of $3179 and a gain in quality-adjusted life-years of 0.25. This yielded an incremental cost-utility ratio of -$12,716, implying a dominant choice favoring preoperative computed tomographic angiography. Sensitivity analysis revealed that computed tomographic angiography was costlier when the operative time difference between the two techniques was less than 21.3 minutes. However, the clinical advantage of computed tomographic angiography over Doppler ultrasonography only showed that computed tomographic angiography would remain the cost-effective option even if it offered no additional operating-time advantage. The authors' results show that computed tomographic angiography is a cost-effective technology for identifying lower abdominal perforators for autologous breast reconstruction. Although the perfect study would be a randomized controlled trial of the two approaches with true cost accrual, the authors' results represent the best available evidence.
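The headline figure follows directly from the definition of the incremental cost-utility ratio, ICUR = ΔC/ΔQALY, applied to the numbers quoted in the abstract:

```python
def icur(delta_cost, delta_qaly):
    """Incremental cost-utility ratio: incremental cost per QALY gained."""
    return delta_cost / delta_qaly

# Figures quoted in the abstract: a $3179 cost saving and a 0.25 QALY gain
print(icur(-3179, 0.25))  # -12716.0
```

A negative ratio with lower cost and higher utility places the intervention in the dominant quadrant, which is why the abstract calls computed tomographic angiography the dominant choice.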

  14. Business Demands for Web-Related Skills as Compared to Other Computer Skills.

    ERIC Educational Resources Information Center

    Groneman, Nancy

    2000-01-01

    Analysis of 23,704 want ads for computer-related jobs revealed that the most frequently mentioned skills were UNIX, SQL programming, and computer security. Curriculum implications were derived from the most desired and less frequently mentioned skills. (SK)

  15. Educational Computer Use in Leisure Contexts: A Phenomenological Study of Adolescents' Experiences at Internet Cafes

    ERIC Educational Resources Information Center

    Cilesiz, Sebnem

    2009-01-01

    Computer use is a widespread leisure activity for adolescents. Leisure contexts, such as Internet cafes, constitute specific social environments for computer use and may hold significant educational potential. This article reports a phenomenological study of adolescents' experiences of educational computer use at Internet cafes in Turkey. The…

  16. Computation material science of structural-phase transformation in casting aluminium alloys

    NASA Astrophysics Data System (ADS)

    Golod, V. M.; Dobosh, L. Yu

    2017-04-01

Successive stages of computer simulation of casting microstructure formation under non-equilibrium crystallization conditions of multicomponent aluminum alloys are presented. On the basis of computational thermodynamics and of heat transfer during solidification of macroscale shaped castings, the boundary conditions of local heat exchange are specified for mesoscale modeling of non-equilibrium solid-phase formation and of component redistribution between phases during coalescence of secondary dendrite branches. Computer analysis of structural-phase transitions is based on the principle of the additive physico-chemical effect of the alloy components in the process of diffusional-capillary morphological evolution of the dendrite structure and of the local dendrite heterogeneity, whose stochastic nature and extent are revealed by metallographic study and by Monte Carlo modeling. The integrated computational materials science tools for alloy research are focused on, and implemented in, the analysis of the multiple-factor system of casting processes and the prediction of casting microstructure.

  17. Faculty of Education Students' Computer Self-Efficacy Beliefs and Their Attitudes towards Computers and Implementing Computer Supported Education

    ERIC Educational Resources Information Center

    Berkant, Hasan Güner

    2016-01-01

    This study investigates faculty of education students' computer self-efficacy beliefs and their attitudes towards computers and implementing computer supported education. This study is descriptive and based on a correlational survey model. The final sample consisted of 414 students studying in the faculty of education of a Turkish university. The…

  18. Case Study: Audio-Guided Learning, with Computer Graphics.

    ERIC Educational Resources Information Center

    Koumi, Jack; Daniels, Judith

    1994-01-01

    Describes teaching packages which involve the use of audiotape recordings with personal computers in Open University (United Kingdom) mathematics courses. Topics addressed include software development; computer graphics; pedagogic principles for distance education; feedback, including course evaluations and student surveys; and future plans.…

  19. Analyzing student conceptual understanding of resistor networks using binary, descriptive, and computational questions

    NASA Astrophysics Data System (ADS)

    Mujtaba, Abid H.

    2018-02-01

This paper presents a case study assessing and analyzing student engagement with and responses to binary, descriptive, and computational questions testing the concepts underlying resistor networks (series and parallel combinations). The participants were undergraduate students enrolled in a university in Pakistan. The majority of students struggled with the descriptive question, and while they successfully answered the binary and computational ones, they failed to build an expectation for the answer and betrayed a significant lack of conceptual understanding in the process. The data collected were also used to analyze the relative efficacy of the three questions as a means of assessing conceptual understanding. The three questions were revealed to be uncorrelated and unlikely to be testing the same construct. The ability to answer the binary or computational question was observed to be divorced from a deeper understanding of the concepts involved.
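The claim that the three question types are uncorrelated is the kind of check a Pearson correlation over per-student scores makes concrete. As a minimal sketch (the scores below are hypothetical, constructed so two columns come out uncorrelated, and are not the study's data):

```python
def pearson(xs, ys):
    """Sample Pearson correlation between two equal-length score lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

# Hypothetical per-student scores (1 = correct, 0 = incorrect)
binary_q = [1, 0, 1, 0]
descriptive_q = [1, 1, 0, 0]
print(pearson(binary_q, descriptive_q))  # 0.0
```

A correlation near zero between two question types, as here, suggests they are not measuring the same construct.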

  20. [Comparison study between biological vision and computer vision].

    PubMed

    Liu, W; Yuan, X G; Yang, C X; Liu, Z Q; Wang, R

    2001-08-01

    The structure and mechanisms of biological vision were discussed, in particular the anatomical structure of the visual system, a tentative classification of receptive fields, the parallel processing of visual information, and feedback and integration effects in the visual cortex. New advances in the field, drawn from studies of the morphology of biological vision, were introduced. In addition, biological vision and computer vision were compared, and their similarities and differences were pointed out.

  1. Serendipity? Are There Gender Differences in the Adoption of Computers? A Case Study.

    ERIC Educational Resources Information Center

    Vernon-Gerstenfeld, Susan

    1989-01-01

    Discusses a study about the effect of learning styles of patent examiners on adoption of computers. Subjects' amount of computer use was a function of learning style, age, comfort after training, and gender. Findings indicate that women showed a greater propensity to adopt than men. Discusses implications for further research. (JS)

  2. Longitudinal Study of Factors Impacting the Implementation of Notebook Computer Based CAD Instruction

    ERIC Educational Resources Information Center

    Goosen, Richard F.

    2009-01-01

    This study provides information for higher education leaders who conduct, or are considering, Computer Aided Design (CAD) instruction using student-owned notebook computers. Survey data were collected during the first 8 years of a pilot program requiring engineering technology students at a four-year public university to acquire a notebook…

  3. Costs of cloud computing for a biometry department. A case study.

    PubMed

    Knaus, J; Hieke, S; Binder, H; Schwarzer, G

    2013-01-01

    "Cloud" computing providers, such as the Amazon Web Services (AWS), offer stable and scalable computational resources based on hardware virtualization, with short, usually hourly, billing periods. The idea of pay-as-you-use seems appealing for biometry research units which have only limited access to university or corporate data center resources or grids. This case study compares the costs of an existing heterogeneous on-site hardware pool in a Medical Biometry and Statistics department to a comparable AWS offer. The "total cost of ownership", including all direct costs, is determined for the on-site hardware, and hourly prices are derived, based on actual system utilization during the year 2011. Indirect costs, which are difficult to quantify are not included in this comparison, but nevertheless some rough guidance from our experience is given. To indicate the scale of costs for a methodological research project, a simulation study of a permutation-based statistical approach is performed using AWS and on-site hardware. In the presented case, with a system utilization of 25-30 percent and 3-5-year amortization, on-site hardware can result in smaller costs, compared to hourly rental in the cloud dependent on the instance chosen. Renting cloud instances with sufficient main memory is a deciding factor in this comparison. Costs for on-site hardware may vary, depending on the specific infrastructure at a research unit, but have only moderate impact on the overall comparison and subsequent decision for obtaining affordable scientific computing resources. Overall utilization has a much stronger impact as it determines the actual computing hours needed per year. Taking this into ac count, cloud computing might still be a viable option for projects with limited maturity, or as a supplement for short peaks in demand.

  4. A computer-based time study system for timber harvesting operations

    Treesearch

    Jingxin Wang; Joe McNeel; John Baumgras

    2003-01-01

    A computer-based time study system was developed for timber harvesting operations. Object-oriented techniques were used to model and design the system. The front-end of the time study system runs on MS Windows CE and the back-end is supported by MS Access. The system consists of three major components: a handheld system, data transfer interface, and data storage...

  5. Cost-effective cloud computing: a case study using the comparative genomics tool, roundup.

    PubMed

    Kudtarkar, Parul; Deluca, Todd F; Fusaro, Vincent A; Tonellato, Peter J; Wall, Dennis P

    2010-12-22

    Comparative genomics resources, such as ortholog detection tools and repositories are rapidly increasing in scale and complexity. Cloud computing is an emerging technological paradigm that enables researchers to dynamically build a dedicated virtual cluster and may represent a valuable alternative for large computational tools in bioinformatics. In the present manuscript, we optimize the computation of a large-scale comparative genomics resource-Roundup-using cloud computing, describe the proper operating principles required to achieve computational efficiency on the cloud, and detail important procedures for improving cost-effectiveness to ensure maximal computation at minimal costs. Utilizing the comparative genomics tool, Roundup, as a case study, we computed orthologs among 902 fully sequenced genomes on Amazon's Elastic Compute Cloud. For managing the ortholog processes, we designed a strategy to deploy the web service, Elastic MapReduce, and maximize the use of the cloud while simultaneously minimizing costs. Specifically, we created a model to estimate cloud runtime based on the size and complexity of the genomes being compared that determines in advance the optimal order of the jobs to be submitted. We computed orthologous relationships for 245,323 genome-to-genome comparisons on Amazon's computing cloud, a computation that required just over 200 hours and cost $8,000 USD, at least 40% less than expected under a strategy in which genome comparisons were submitted to the cloud randomly with respect to runtime. Our cost savings projections were based on a model that not only demonstrates the optimal strategy for deploying RSD to the cloud, but also finds the optimal cluster size to minimize waste and maximize usage. Our cost-reduction model is readily adaptable for other comparative genomics tools and potentially of significant benefit to labs seeking to take advantage of the cloud as an alternative to local computing infrastructure.
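    The scheduling idea in this abstract (estimate each job's runtime in advance from input size, then choose the submission order so that rented cluster hours are not wasted) can be sketched with a longest-processing-time heuristic. The runtime model, constants, and data below are illustrative assumptions, not Roundup's actual model:

```python
# Hedged sketch: estimate comparison runtimes from genome sizes, then
# assign the longest jobs first (classic LPT heuristic) so workers finish
# at similar times and idle, billed-but-unused hours are minimized.
import heapq

def estimate_runtime(size_a, size_b, k=1e-6):
    """Assumed model: runtime grows with the product of genome sizes."""
    return k * size_a * size_b

def lpt_schedule(runtimes, n_workers):
    """Assign runtimes to workers longest-first; return the makespan."""
    loads = [0.0] * n_workers          # min-heap of per-worker total load
    heapq.heapify(loads)
    for rt in sorted(runtimes, reverse=True):
        lightest = heapq.heappop(loads)
        heapq.heappush(loads, lightest + rt)
    return max(loads)

pairs = [(4000, 5000), (2000, 9000), (1000, 1500), (8000, 8000), (3000, 2500)]
runtimes = [estimate_runtime(a, b) for a, b in pairs]
print(lpt_schedule(runtimes, n_workers=2))
```

    Submitting jobs in a runtime-aware order, as opposed to randomly, is exactly the lever the abstract credits with at least 40% of the cost savings.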

  6. Cost-Effective Cloud Computing: A Case Study Using the Comparative Genomics Tool, Roundup

    PubMed Central

    Kudtarkar, Parul; DeLuca, Todd F.; Fusaro, Vincent A.; Tonellato, Peter J.; Wall, Dennis P.

    2010-01-01

    Background Comparative genomics resources, such as ortholog detection tools and repositories are rapidly increasing in scale and complexity. Cloud computing is an emerging technological paradigm that enables researchers to dynamically build a dedicated virtual cluster and may represent a valuable alternative for large computational tools in bioinformatics. In the present manuscript, we optimize the computation of a large-scale comparative genomics resource—Roundup—using cloud computing, describe the proper operating principles required to achieve computational efficiency on the cloud, and detail important procedures for improving cost-effectiveness to ensure maximal computation at minimal costs. Methods Utilizing the comparative genomics tool, Roundup, as a case study, we computed orthologs among 902 fully sequenced genomes on Amazon’s Elastic Compute Cloud. For managing the ortholog processes, we designed a strategy to deploy the web service, Elastic MapReduce, and maximize the use of the cloud while simultaneously minimizing costs. Specifically, we created a model to estimate cloud runtime based on the size and complexity of the genomes being compared that determines in advance the optimal order of the jobs to be submitted. Results We computed orthologous relationships for 245,323 genome-to-genome comparisons on Amazon’s computing cloud, a computation that required just over 200 hours and cost $8,000 USD, at least 40% less than expected under a strategy in which genome comparisons were submitted to the cloud randomly with respect to runtime. Our cost savings projections were based on a model that not only demonstrates the optimal strategy for deploying RSD to the cloud, but also finds the optimal cluster size to minimize waste and maximize usage. Our cost-reduction model is readily adaptable for other comparative genomics tools and potentially of significant benefit to labs seeking to take advantage of the cloud as an alternative to local computing infrastructure.

  7. Volcano Monitoring: A Case Study in Pervasive Computing

    NASA Astrophysics Data System (ADS)

    Peterson, Nina; Anusuya-Rangappa, Lohith; Shirazi, Behrooz A.; Song, Wenzhan; Huang, Renjie; Tran, Daniel; Chien, Steve; Lahusen, Rick

    Recent advances in wireless sensor network technology have provided robust and reliable solutions for sophisticated pervasive computing applications such as inhospitable terrain environmental monitoring. We present a case study for developing a real-time pervasive computing system, called OASIS for optimized autonomous space in situ sensor-web, which combines ground assets (a sensor network) and space assets (NASA’s earth observing (EO-1) satellite) to monitor volcanic activities at Mount St. Helens. OASIS’s primary goals are: to integrate complementary space and in situ ground sensors into an interactive and autonomous sensorweb, to optimize power and communication resource management of the sensorweb and to provide mechanisms for seamless and scalable fusion of future space and in situ components. The OASIS in situ ground sensor network development addresses issues related to power management, bandwidth management, quality of service management, topology and routing management, and test-bed design. The space segment development consists of EO-1 architectural enhancements, feedback of EO-1 data into the in situ component, command and control integration, data ingestion and dissemination and field demonstrations.

  8. Enhancing computer self-efficacy and attitudes in multi-ethnic older adults: a randomised controlled study

    PubMed Central

    Laganà, Luciana; Oliver, Taylor; Ainsworth, Andrew; Edwards, Marc

    2014-01-01

    Several studies have documented the health-related benefits of older adults' use of computer technology, but before they can be realised, older individuals must be positively inclined and confident in their ability to engage in computer-based environments. To facilitate the assessment of computer technology attitudes, one aim of the longitudinal study reported in this paper was to test and refine a new 22-item measure of computer technology attitudes designed specifically for older adults, as none were available. Another aim was to replicate, on a much larger scale, the successful findings of a preliminary study that tested a computer technology training programme for older adults (Laganà 2008). Ninety-six older men and women, mainly from non-European-American backgrounds, were randomly assigned to the waitlist/control or the experimental group. The same six-week one-on-one training was administered to the control subjects at the completion of their post-test. The revised (17-item) version of the Older Adults' Computer Technology Attitudes Scale (OACTAS) showed strong reliability: the results of a factor analysis were robust, and two analyses of covariance demonstrated that the training programme induced significant changes in attitudes and self-efficacy. Such results encourage the recruitment of older persons into training programmes aimed at improving computer technology attitudes and self-efficacy. PMID:25512679

  9. Enhanced limonene production in cyanobacteria reveals photosynthesis limitations

    PubMed Central

    Wang, Xin; Liu, Wei; Xin, Changpeng; Zheng, Yi; Cheng, Yanbing; Sun, Su; Li, Runze; Zhu, Xin-Guang; Dai, Susie Y.; Rentzepis, Peter M.; Yuan, Joshua S.

    2016-01-01

    Terpenes are the major secondary metabolites produced by plants, and have diverse industrial applications as pharmaceuticals, fragrance, solvents, and biofuels. Cyanobacteria are equipped with an efficient carbon fixation mechanism, and are ideal cell factories to produce various fuel and chemical products. Past efforts to produce terpenes in photosynthetic organisms have gained only limited success. Here we engineered the cyanobacterium Synechococcus elongatus PCC 7942 to efficiently produce limonene through a modeling-guided study. Computational modeling of limonene flux in response to photosynthetic output has revealed the downstream terpene synthase as a key metabolic flux-controlling node in the MEP (2-C-methyl-d-erythritol 4-phosphate) pathway-derived terpene biosynthesis. By enhancing the downstream limonene carbon sink, we achieved over 100-fold increase in limonene productivity, in contrast to the marginal increase achieved through stepwise metabolic engineering. The establishment of a strong limonene flux revealed potential synergy between photosynthate output and terpene biosynthesis, leading to enhanced carbon flux into the MEP pathway. Moreover, we show that enhanced limonene flux would lead to NADPH accumulation and slow down photosynthetic electron flow. Fine-tuning ATP/NADPH toward terpene biosynthesis could be a key parameter in adapting photosynthesis to support biofuel/bioproduct production in cyanobacteria. PMID:27911807

  10. The rise of machine consciousness: studying consciousness with computational models.

    PubMed

    Reggia, James A

    2013-08-01

    Efforts to create computational models of consciousness have accelerated over the last two decades, creating a field that has become known as artificial consciousness. There have been two main motivations for this controversial work: to develop a better scientific understanding of the nature of human/animal consciousness and to produce machines that genuinely exhibit conscious awareness. This review begins by briefly explaining some of the concepts and terminology used by investigators working on machine consciousness, and summarizes key neurobiological correlates of human consciousness that are particularly relevant to past computational studies. Models of consciousness developed over the last twenty years are then surveyed. These models are largely found to fall into five categories based on the fundamental issue that their developers have selected as being most central to consciousness: a global workspace, information integration, an internal self-model, higher-level representations, or attention mechanisms. For each of these five categories, an overview of past work is given, a representative example is presented in some detail to illustrate the approach, and comments are provided on the contributions and limitations of the methodology. Three conclusions are offered about the state of the field based on this review: (1) computational modeling has become an effective and accepted methodology for the scientific study of consciousness, (2) existing computational models have successfully captured a number of neurobiological, cognitive, and behavioral correlates of conscious information processing as machine simulations, and (3) no existing approach to artificial consciousness has presented a compelling demonstration of phenomenal machine consciousness, or even clear evidence that artificial phenomenal consciousness will eventually be possible. The paper concludes by discussing the importance of continuing work in this area, considering the ethical issues it raises.

  11. High-Throughput Computing on High-Performance Platforms: A Case Study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Oleynik, D; Panitkin, S; Matteo, Turilli

    The computing systems used by LHC experiments have historically consisted of the federation of hundreds to thousands of distributed resources, ranging from small to mid-size. In spite of the impressive scale of the existing distributed computing solutions, the federation of small to mid-size resources will be insufficient to meet projected future demands. This paper is a case study of how the ATLAS experiment has embraced Titan, a DOE leadership computing facility, in conjunction with traditional distributed high-throughput computing to reach sustained production scales of approximately 52M core-hours a year. The three main contributions of this paper are: (i) a critical evaluation of design and operational considerations to support the sustained, scalable and production usage of Titan; (ii) a preliminary characterization of a next generation executor for PanDA to support new workloads and advanced execution modes; and (iii) early lessons for how current and future experimental and observational systems can be integrated with production supercomputers and other platforms in a general and extensible manner.

  12. Computation of Relative Magnetic Helicity in Spherical Coordinates

    NASA Astrophysics Data System (ADS)

    Moraitis, Kostas; Pariat, Étienne; Savcheva, Antonia; Valori, Gherardo

    2018-06-01

    Magnetic helicity is a quantity of great importance in solar studies because it is conserved in ideal magnetohydrodynamics. While many methods for computing magnetic helicity in Cartesian finite volumes exist, in spherical coordinates, the natural coordinate system for solar applications, helicity is only treated approximately. We present here a method for properly computing the relative magnetic helicity in spherical geometry. The volumes considered are finite, of shell or wedge shape, and the three-dimensional magnetic field is considered to be fully known throughout the studied domain. Testing of the method with well-known, semi-analytic, force-free magnetic-field models reveals that it has excellent accuracy. Further application to a set of nonlinear force-free reconstructions of the magnetic field of solar active regions and comparison with an approximate method used in the past indicates that the proposed method can be significantly more accurate, thus making our method a promising tool in helicity studies that employ spherical geometry. Additionally, we determine and discuss the applicability range of the approximate method.
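
    The abstract does not quote the quantity being computed; for context, the standard gauge-invariant definition of relative magnetic helicity that finite-volume methods of this kind evaluate (the Finn-Antonsen form) is:

```latex
H_R = \int_V \left( \mathbf{A} + \mathbf{A}_p \right) \cdot
      \left( \mathbf{B} - \mathbf{B}_p \right) \, \mathrm{d}V ,
\qquad
\mathbf{B} = \nabla \times \mathbf{A}, \quad
\mathbf{B}_p = \nabla \times \mathbf{A}_p ,
```

    where the reference field \(\mathbf{B}_p\) is the potential field satisfying \(\mathbf{B}_p \cdot \hat{n} = \mathbf{B} \cdot \hat{n}\) on the boundary of the volume \(V\); matching the normal components makes \(H_R\) independent of the gauges chosen for the two vector potentials.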

  13. Cognitive Computational Neuroscience: A New Conference for an Emerging Discipline.

    PubMed

    Naselaris, Thomas; Bassett, Danielle S; Fletcher, Alyson K; Kording, Konrad; Kriegeskorte, Nikolaus; Nienborg, Hendrikje; Poldrack, Russell A; Shohamy, Daphna; Kay, Kendrick

    2018-05-01

    Understanding the computational principles that underlie complex behavior is a central goal in cognitive science, artificial intelligence, and neuroscience. In an attempt to unify these disconnected communities, we created a new conference called Cognitive Computational Neuroscience (CCN). The inaugural meeting revealed considerable enthusiasm but significant obstacles remain. Copyright © 2018 Elsevier Ltd. All rights reserved.

  14. Comparative transcriptomics reveals similarities and differences between astrocytoma grades.

    PubMed

    Seifert, Michael; Garbe, Martin; Friedrich, Betty; Mittelbronn, Michel; Klink, Barbara

    2015-12-16

    than the tumor biology itself. We further inferred a transcriptional regulatory network associated with specific expression differences distinguishing PA I from AS II, AS III and GBM IV. Major central transcriptional regulators were involved in brain development, cell cycle control, proliferation, apoptosis, chromatin remodeling or DNA methylation. Many of these regulators showed directly underlying DNA methylation changes in PA I or gene copy number mutations in AS II, AS III and GBM IV. This computational study characterizes similarities and differences between all four astrocytoma grades confirming known and revealing novel insights into astrocytoma biology. Our findings represent a valuable resource for future computational and experimental studies.

  15. Computational imaging of sperm locomotion.

    PubMed

    Daloglu, Mustafa Ugur; Ozcan, Aydogan

    2017-08-01

    Not only essential for scientific research, but also in the analysis of male fertility and for animal husbandry, sperm tracking and characterization techniques have been greatly benefiting from computational imaging. Digital image sensors, in combination with optical microscopy tools and powerful computers, have enabled the use of advanced detection and tracking algorithms that automatically map sperm trajectories and calculate various motility parameters across large data sets. Computational techniques are driving the field even further, facilitating the development of unconventional sperm imaging and tracking methods that do not rely on standard optical microscopes and objective lenses, which limit the field of view and volume of the semen sample that can be imaged. As an example, a holographic on-chip sperm imaging platform, only composed of a light-emitting diode and an opto-electronic image sensor, has emerged as a high-throughput, low-cost and portable alternative to lens-based traditional sperm imaging and tracking methods. In this approach, the sample is placed very close to the image sensor chip, which captures lensfree holograms generated by the interference of the background illumination with the light scattered from sperm cells. These holographic patterns are then digitally processed to extract both the amplitude and phase information of the spermatozoa, effectively replacing the microscope objective lens with computation. This platform has further enabled high-throughput 3D imaging of spermatozoa with submicron 3D positioning accuracy in large sample volumes, revealing various rare locomotion patterns. We believe that computational chip-scale sperm imaging and 3D tracking techniques will find numerous opportunities in both sperm related research and commercial applications. © The Authors 2017. Published by Oxford University Press on behalf of Society for the Study of Reproduction. All rights reserved. 

  16. Computational analysis of aspartic protease plasmepsin II complexed with EH58 inhibitor: a QM/MM MD study.

    PubMed

    de Farias Silva, Natália; Lameira, Jerônimo; Alves, Cláudio Nahum

    2011-10-01

    Plasmepsin (PM) II is one of four enzymes in the food vacuole of Plasmodium falciparum. It has become an attractive target for combating malaria through research regarding its importance in the P. falciparum metabolism and life cycle, making it the target of choice for structure-based drug design. This paper reports the results of hybrid quantum mechanics/molecular mechanics (QM/MM) molecular dynamics (MD) simulations employed to study the details of the interactions established between PM II and N-(3-{(2-benzo[1,3]dioxol-5-yl-ethyl)[3-(1-methyl-3-oxo-1,3-dihydro-isoindol-2-yl)propionyl]-amino}-1-benzyl-2-(hydroxyl-propyl)-4-benzyloxy-3,5-dimethoxy-benzamide (EH58), a well-known potent inhibitor of this enzyme. Electrostatic binding free energy and energy terms decomposition have been computed for PM II complexed with the EH58 inhibitor. The results reveal that there is a strong interaction between the Asp34, Val78, Ser79, Tyr192 and Asp214 residues and the EH58 inhibitor. In addition, we have computed the potential of mean force (PMF) profile in order to assign the protonation state of the two catalytic aspartates in the PM II-EH58 complex. The results indicate that the protonation of Asp214 favors a stable active site structure, which is consistent with our electrostatic binding free energy calculation and with previously published work.

  17. Determination of Perceptions of the Teacher Candidates Studying in the Computer and Instructional Technology Department towards Human-Computer Interaction and Related Basic Concepts

    ERIC Educational Resources Information Center

    Kiyici, Mubin

    2011-01-01

    HCI is a field whose popularity keeps increasing with the spread of computers and the internet, and which, with contributions from scientists of different disciplines, is gradually leading to more user-friendly software and hardware. Teacher candidates studying at the computer and instructional technologies department…

  18. Computational study of single-expansion-ramp nozzles with external burning

    NASA Astrophysics Data System (ADS)

    Yungster, Shaye; Trefny, Charles J.

    1992-04-01

    A computational investigation of the effects of external burning on the performance of single expansion ramp nozzles (SERN) operating at transonic speeds is presented. The study focuses on the effects of external heat addition and introduces a simplified injection and mixing model based on a control volume analysis. This simplified model permits parametric and scaling studies that would have been impossible to conduct with a detailed CFD analysis. The CFD model is validated by comparing the computed pressure distribution and thrust forces, for several nozzle configurations, with experimental data. Specific impulse calculations are also presented which indicate that external burning performance can be superior to other methods of thrust augmentation at transonic speeds. The effects of injection fuel pressure and nozzle pressure ratio on the performance of SERN nozzles with external burning are described. The results show trends similar to those reported in the experimental study, and provide additional information that complements the experimental data, improving our understanding of external burning flowfields. A study of the effect of scale is also presented. The results indicate that combustion kinetics do not make the flowfield sensitive to scale.

  19. Computational study of single-expansion-ramp nozzles with external burning

    NASA Technical Reports Server (NTRS)

    Yungster, Shaye; Trefny, Charles J.

    1992-01-01

    A computational investigation of the effects of external burning on the performance of single expansion ramp nozzles (SERN) operating at transonic speeds is presented. The study focuses on the effects of external heat addition and introduces a simplified injection and mixing model based on a control volume analysis. This simplified model permits parametric and scaling studies that would have been impossible to conduct with a detailed CFD analysis. The CFD model is validated by comparing the computed pressure distribution and thrust forces, for several nozzle configurations, with experimental data. Specific impulse calculations are also presented which indicate that external burning performance can be superior to other methods of thrust augmentation at transonic speeds. The effects of injection fuel pressure and nozzle pressure ratio on the performance of SERN nozzles with external burning are described. The results show trends similar to those reported in the experimental study, and provide additional information that complements the experimental data, improving our understanding of external burning flowfields. A study of the effect of scale is also presented. The results indicate that combustion kinetics do not make the flowfield sensitive to scale.

  20. Neuroimaging Study Designs, Computational Analyses and Data Provenance Using the LONI Pipeline

    PubMed Central

    Dinov, Ivo; Lozev, Kamen; Petrosyan, Petros; Liu, Zhizhong; Eggert, Paul; Pierce, Jonathan; Zamanyan, Alen; Chakrapani, Shruthi; Van Horn, John; Parker, D. Stott; Magsipoc, Rico; Leung, Kelvin; Gutman, Boris; Woods, Roger; Toga, Arthur

    2010-01-01

    Modern computational neuroscience employs diverse software tools and multidisciplinary expertise to analyze heterogeneous brain data. The classical problems of gathering meaningful data, fitting specific models, and discovering appropriate analysis and visualization tools give way to a new class of computational challenges—management of large and incongruous data, integration and interoperability of computational resources, and data provenance. We designed, implemented and validated a new paradigm for addressing these challenges in the neuroimaging field. Our solution is based on the LONI Pipeline environment [3], [4], a graphical workflow environment for constructing and executing complex data processing protocols. We developed study-design, database and visual language programming functionalities within the LONI Pipeline that enable the construction of complete, elaborate and robust graphical workflows for analyzing neuroimaging and other data. These workflows facilitate open sharing and communication of data and metadata, concrete processing protocols, result validation, and study replication among different investigators and research groups. The LONI Pipeline features include distributed grid-enabled infrastructure, virtualized execution environment, efficient integration, data provenance, validation and distribution of new computational tools, automated data format conversion, and an intuitive graphical user interface. We demonstrate the new LONI Pipeline features using large scale neuroimaging studies based on data from the International Consortium for Brain Mapping [5] and the Alzheimer's Disease Neuroimaging Initiative [6]. User guides, forums, instructions and downloads of the LONI Pipeline environment are available at http://pipeline.loni.ucla.edu. PMID:20927408

  1. Computational models of the pulmonary circulation: Insights and the move towards clinically directed studies

    PubMed Central

    Tawhai, Merryn H.; Clark, Alys R.; Burrowes, Kelly S.

    2011-01-01

    Biophysically-based computational models provide a tool for integrating and explaining experimental data, observations, and hypotheses. Computational models of the pulmonary circulation have evolved from minimal and efficient constructs that have been used to study individual mechanisms that contribute to lung perfusion, to sophisticated multi-scale and -physics structure-based models that predict integrated structure-function relationships within a heterogeneous organ. This review considers the utility of computational models in providing new insights into the function of the pulmonary circulation, and their application in clinically motivated studies. We review mathematical and computational models of the pulmonary circulation based on their application; we begin with models that seek to answer questions in basic science and physiology and progress to models that aim to have clinical application. In looking forward, we discuss the relative merits and clinical relevance of computational models: what important features are still lacking; and how these models may ultimately be applied to further increasing our understanding of the mechanisms occurring in disease of the pulmonary circulation. PMID:22034608

  2. Real-time data-intensive computing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Parkinson, Dilworth Y., E-mail: dyparkinson@lbl.gov; Chen, Xian; Hexemer, Alexander

    2016-07-27

    Today users visit synchrotrons as sources of understanding and discovery—not as sources of just light, and not as sources of data. To achieve this, the synchrotron facilities frequently provide not just light but often the entire end station and increasingly, advanced computational facilities that can reduce terabytes of data into a form that can reveal a new key insight. The Advanced Light Source (ALS) has partnered with high performance computing, fast networking, and applied mathematics groups to create a “super-facility”, giving users simultaneous access to the experimental, computational, and algorithmic resources to make this possible. This combination forms an efficient closed loop, where data—despite its high rate and volume—is transferred and processed immediately and automatically on appropriate computing resources, and results are extracted, visualized, and presented to users or to the experimental control system, both to provide immediate insight and to guide decisions about subsequent experiments during beamtime. We will describe our work at the ALS ptychography, scattering, micro-diffraction, and micro-tomography beamlines.

  3. Computational studies on nonlinear optical property of novel Wittig-based Schiff-base ligands and copper(II) complex

    NASA Astrophysics Data System (ADS)

    Rajasekhar, Bathula; Patowary, Nidarshana; K. Z., Danish; Swu, Toka

    2018-07-01

    One hundred and forty-five novel molecules of Wittig-based Schiff-base (WSB), including a copper(II) complex and precursors, were computationally screened for nonlinear optical (NLO) properties. WSB ligands were derived from various categories of amines and aldehydes. The Wittig-based precursor aldehydes (E)-2-hydroxy-5-(4-nitrostyryl)benzaldehyde (f) and 2-hydroxy-5-((1Z,3E)-4-phenylbuta-1,3-dien-1-yl)benzaldehyde (g) were synthesised and spectroscopically confirmed. The Schiff-base ligands and copper(II) complex were designed and optimised, and their NLO properties were studied using the GAUSSIAN09 computer program. For both optimisation and hyperpolarisability (finite-field approach) calculations, the Density Functional Theory (DFT)-based B3LYP method was applied with the LANL2DZ basis set for the metal ion and the 6-31G* basis set for C, H, N, O and Cl atoms. This is the first report of the structure-activity relationship between hyperpolarisability (β) and WSB ligands containing a mono-imine group. The study reveals that Schiff-base ligands of category N-2, those derived from the precursor aldehyde 2-hydroxy-5-(4-nitrostyryl)benzaldehyde, and the pre-polarised WSB coordinated with Cu(II), encoded as Complex-1 (β = 14.671 × 10⁻³⁰ e.s.u.), showed higher β values than the other categories, N-1 and N-3, i.e. WSB derived from the precursor aldehydes 2-hydroxy-5-styrylbenzaldehyde and 2-hydroxy-5-((1Z,3E)-4-phenylbuta-1,3-dien-1-yl)benzaldehyde, respectively. We also report, for the first time, the effect of geometrical isomerism on the β value.
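
    The finite-field approach named above obtains β from numerical derivatives of the molecular energy with respect to an applied electric field. The Python sketch below illustrates the idea in one dimension on a model energy expansion; the values of mu, alpha, and beta are hypothetical, and a real calculation would take the energies from DFT runs at each field strength rather than from a closed-form function.

```python
def beta_finite_field(energy, f):
    # One-dimensional finite-field estimate of the first hyperpolarisability:
    # beta = -d^3 E / dF^3, approximated with the central-difference stencil
    # f'''(0) ~ (f(2h) - 2 f(h) + 2 f(-h) - f(-2h)) / (2 h^3)
    return -(energy(2 * f) - 2 * energy(f)
             + 2 * energy(-f) - energy(-2 * f)) / (2 * f ** 3)

# Model energy expansion E(F) = -mu F - (1/2) alpha F^2 - (1/6) beta F^3
# (hypothetical coefficients, illustrative units)
mu, alpha, beta = 1.2, 8.0, 14.671
energy = lambda F: -mu * F - 0.5 * alpha * F ** 2 - beta * F ** 3 / 6.0

print(beta_finite_field(energy, 1e-3))  # recovers beta, ~14.671
```

    The linear and quadratic terms of the expansion cancel exactly in the stencil, which is why the third derivative can be isolated from only four energy evaluations.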

  4. Assessment of gene order computing methods for Alzheimer's disease

    PubMed Central

    2013-01-01

    Background Computational genomics of Alzheimer disease (AD), the most common form of senile dementia, is a nascent field in AD research. The field includes AD gene clustering by computing gene order, which generates higher quality gene clustering patterns than most other clustering methods. However, few gene order computing methods are available, such as the Genetic Algorithm (GA) and Ant Colony Optimization (ACO), and their performance in gene order computation using AD microarray data is not known. We thus set forth to evaluate the performance of current gene order computing methods with different distance formulas, and to identify additional features associated with gene order computation. Methods Using different distance formulas (Pearson distance, Euclidean distance, and the squared Euclidean distance) and other conditions, gene orders were calculated by the ACO and GA (standard GA and improved GA) methods, respectively. The qualities of the gene orders were compared, and new features from the calculated gene orders were identified. Results Compared to the GA methods tested in this study, ACO fits the AD microarray data the best when calculating gene order. In addition, the following features were revealed: different distance formulas generated gene orders of different quality, and the commonly used Pearson distance was not the best distance formula when used with either the GA or the ACO method for AD microarray data. Conclusion Compared with Pearson distance and Euclidean distance, the squared Euclidean distance generated the best quality gene order computed by the GA and ACO methods. PMID:23369541
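
    The distance formulas compared in this record are easy to state concretely. A minimal Python sketch (the gene expression vectors are hypothetical, purely for illustration) shows why Pearson distance and squared Euclidean distance can judge the same pair of genes very differently: Pearson distance compares the shape of expression profiles, while squared Euclidean distance compares their absolute levels.

```python
import math

def pearson_distance(x, y):
    # Pearson distance = 1 - Pearson correlation coefficient.
    # 0 for perfectly correlated profiles, 2 for perfectly anti-correlated.
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return 1.0 - cov / (sx * sy)

def squared_euclidean_distance(x, y):
    # Sum of squared coordinate-wise differences (no square root).
    return sum((a - b) ** 2 for a, b in zip(x, y))

# Hypothetical expression profiles across four conditions
g1 = [0.1, 0.5, 0.9, 1.3]
g2 = [0.2, 0.6, 1.0, 1.4]   # same shape as g1, shifted up
g3 = [1.3, 0.9, 0.5, 0.1]   # g1 reversed (anti-correlated)

print(pearson_distance(g1, g2))            # ~0.0: identical shape
print(pearson_distance(g1, g3))            # ~2.0: opposite shape
print(squared_euclidean_distance(g1, g2))  # ~0.04: close in absolute level
```

    A gene order method fed the Pearson distance would treat g1 and g2 as essentially the same gene, while the squared Euclidean distance would also keep them adjacent but for a different reason, which is why the choice of formula changes the resulting gene order.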

  5. Gender and stereotypes in motivation to study computer programming for careers in multimedia

    NASA Astrophysics Data System (ADS)

    Doubé, Wendy; Lang, Catherine

    2012-03-01

    A multimedia university programme with relatively equal numbers of male and female students in elective programming subjects provided a rare opportunity to investigate female motivation to study and pursue computer programming as a career. The Motivated Strategies for Learning Questionnaire (MSLQ) was used to survey 85 participants. In common with research into the deterrence of females from STEM domains, females displayed significantly lower self-efficacy and expectancy for success. In contrast to that research, both genders placed similarly high value on computer programming and shared high extrinsic and intrinsic goal orientation. The authors propose that the stereotype associated with a creative multimedia career could attract female participation in computer programming, whereas the stereotype associated with computer science could be a deterrent.

  6. Preverbal and verbal counting and computation.

    PubMed

    Gallistel, C R; Gelman, R

    1992-08-01

    We describe the preverbal system of counting and arithmetic reasoning revealed by experiments on numerical representations in animals. In this system, numerosities are represented by magnitudes, which are rapidly but inaccurately generated by the Meck and Church (1983) preverbal counting mechanism. We suggest the following. (1) The preverbal counting mechanism is the source of the implicit principles that guide the acquisition of verbal counting. (2) The preverbal system of arithmetic computation provides the framework for the assimilation of the verbal system. (3) Learning to count involves, in part, learning a mapping from the preverbal numerical magnitudes to the verbal and written number symbols and the inverse mappings from these symbols to the preverbal magnitudes. (4) Subitizing is the use of the preverbal counting process and the mapping from the resulting magnitudes to number words in order to generate rapidly the number words for small numerosities. (5) The retrieval of the number facts, which plays a central role in verbal computation, is mediated via the inverse mappings from verbal and written numbers to the preverbal magnitudes and the use of these magnitudes to find the appropriate cells in tabular arrangements of the answers. (6) This model of the fact retrieval process accounts for the salient features of the reaction time differences and error patterns revealed by experiments on mental arithmetic. (7) The application of verbal and written computational algorithms goes on in parallel with, and is to some extent guided by, preverbal computations, both in the child and in the adult.

  7. Computational Studies of Snake Venom Toxins

    PubMed Central

    Ojeda, Paola G.; Caballero, Julio; Kaas, Quentin; González, Wendy

    2017-01-01

    Most snake venom toxins are proteins and contribute to envenomation through a diverse array of bioactivities, such as bleeding, inflammation, pain, and cytotoxic, cardiotoxic or neurotoxic effects. The venom of a single snake species contains hundreds of toxins, and the venoms of the 725 species of venomous snakes represent a large pool of potentially bioactive proteins. Despite considerable discovery efforts, most snake venom toxins are still uncharacterized. Modern bioinformatics tools have recently been developed to mine snake venoms, helping to focus experimental research on the most potentially interesting toxins. Some computational techniques predict toxin molecular targets and the binding mode to these targets. This review gives an overview of current knowledge on the ~2200 sequences and more than 400 three-dimensional structures of snake toxins deposited in public repositories, as well as of molecular modeling studies of the interaction between these toxins and their molecular targets. We also describe how modern bioinformatics has been used to study the snake venom protein phospholipase A2, the small basic myotoxin Crotamine, and the three-finger peptide Mambalgin. PMID:29271884

  8. A substrate selectivity and inhibitor design lesson from the PDE10-cAMP crystal structure: a computational study.

    PubMed

    Lau, Justin Kai-Chi; Li, Xiao-Bo; Cheng, Yuen-Kit

    2010-04-22

    Phosphodiesterases (PDEs) catalyze the hydrolysis of the second messengers cAMP and cGMP, regulating many important cellular signals, and have been recognized as important drug targets. Experimentally, a range of specificity/selectivity toward cAMP and cGMP is well known for the individual PDE families. The study reported here reveals that PDEs might also exhibit selectivity toward conformations of the endogenous substrates cAMP and cGMP. Molecular dynamics simulations and a free energy study have been applied to study the binding of the cAMP torsional conformers about the glycosyl bond in PDE10A2. The computational results elucidate that PDE10A2 is energetically more favorable in complex with the syn cAMP conformer (as reported in the crystal structure), and that binding of anti cAMP to PDE10A2 would lead to either a nonreactive configuration or a significant perturbation of the catalytic pocket of the enzyme. This experimentally inaccessible information provides important molecular insights for the development of effective PDE10 ligands.

  9. Computational study of scattering of a zero-order Bessel beam by large nonspherical homogeneous particles with the multilevel fast multipole algorithm

    NASA Astrophysics Data System (ADS)

    Yang, Minglin; Wu, Yueqian; Sheng, Xinqing; Ren, Kuan Fang

    2017-12-01

    Computation of the scattering of shaped beams by large nonspherical particles is a challenge in both the optics and electromagnetics domains, since it concerns many research fields. In this paper, we report our new progress in the numerical computation of scattering diagrams. Our algorithm permits calculation of the scattering of a particle as large as 110 wavelengths, or about 700 in size parameter. The particle can be transparent or absorbing, of arbitrary shape, smooth or with a sharp surface, such as Chebyshev particles or ice crystals. To illustrate the capacity of the algorithm, a zero-order Bessel beam is taken as the incident beam, and scattering by ellipsoidal and Chebyshev particles is taken as an example. Some special phenomena have been revealed and examined. The scattering problem is formulated with the combined tangential formulation and solved iteratively with the aid of the multilevel fast multipole algorithm, which is well parallelized with the message passing interface on a distributed-memory computer platform using a hybrid partitioning strategy. The numerical predictions are compared with the results of a rigorous method for a spherical particle to validate the accuracy of the approach. The scattering diagrams of large ellipsoidal particles with various parameters are examined. The effects of aspect ratio, of the half-cone angle of the incident zero-order Bessel beam, and of the off-axis distance on the scattered intensity are studied. Scattering by an asymmetric Chebyshev particle with size parameter larger than 700 is also given to show the capability of the method for computing scattering by arbitrarily shaped particles.
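
    For reference, the ideal scalar zero-order Bessel beam used as the incident beam above has a transverse intensity profile proportional to J0(k sin(α0) ρ)², where α0 is the half-cone angle; this profile is invariant along the propagation axis. A small self-contained Python sketch (the wavelength and half-cone angle are illustrative values, not the paper's parameters):

```python
import math

def j0(x, terms=30):
    # Power series for the Bessel function of the first kind, order zero:
    # J0(x) = sum_{m>=0} (-1)^m (x/2)^(2m) / (m!)^2
    return sum((-1) ** m * (x / 2.0) ** (2 * m) / math.factorial(m) ** 2
               for m in range(terms))

def bessel_beam_intensity(rho, k, half_cone_angle):
    # Transverse intensity of an ideal scalar zero-order Bessel beam,
    # normalized to 1 on the beam axis; independent of z.
    kt = k * math.sin(half_cone_angle)   # transverse wavenumber
    return j0(kt * rho) ** 2

k = 2.0 * math.pi / 0.5e-6    # wavenumber for a 500 nm wavelength (illustrative)
alpha0 = math.radians(5.0)    # half-cone angle (illustrative)

print(bessel_beam_intensity(0.0, k, alpha0))  # 1.0, the central maximum
```

    Varying the half-cone angle changes the transverse wavenumber and hence the width of the central lobe, which is one of the parameters whose effect on the scattered intensity the paper studies.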

  10. Pilot Study of Bovine Interdigital Cassetteless Computed Radiography

    PubMed Central

    EL-SHAFAEY, El-Sayed Ahmed Awad; AOKI, Takahiro; ISHII, Mitsuo; YAMADA, Kazutaka

    2013-01-01

    ABSTRACT Twenty-one limbs of bovine cadavers (42 digits) were imaged with an interdigital cassetteless imaging plate using computed radiography. The radiographic findings included exostosis, a rough plantar surface, osteolysis of the apex of the distal phalanx, and widening of the laminar zone between the distal phalanx and the hoof wall. All these findings were confirmed by computed tomography. The hindlimbs (19 digits) showed more changes than the forelimbs (10 digits), particularly in the lateral distal phalanx. The cassetteless computed radiography technique is expected to be a more easily applicable method for imaging the distal phalanx than conventional cassette-plate and/or film-screen cassetteless methods. PMID:23782542

  11. Predictive simulation of gait at low gravity reveals skipping as the preferred locomotion strategy

    PubMed Central

    Ackermann, Marko; van den Bogert, Antonie J.

    2012-01-01

    The investigation of gait strategies at low gravity environments gained momentum recently as manned missions to the Moon and to Mars are reconsidered. Although reports by astronauts of the Apollo missions indicate alternative gait strategies might be favored on the Moon, computational simulations and experimental investigations have been almost exclusively limited to the study of either walking or running, the locomotion modes preferred under Earth's gravity. In order to investigate the gait strategies likely to be favored at low gravity a series of predictive, computational simulations of gait are performed using a physiological model of the musculoskeletal system, without assuming any particular type of gait. A computationally efficient optimization strategy is utilized allowing for multiple simulations. The results reveal skipping as more efficient and less fatiguing than walking or running and suggest the existence of a walk-skip rather than a walk-run transition at low gravity. The results are expected to serve as a background to the design of experimental investigations of gait under simulated low gravity. PMID:22365845

  12. Predictive simulation of gait at low gravity reveals skipping as the preferred locomotion strategy.

    PubMed

    Ackermann, Marko; van den Bogert, Antonie J

    2012-04-30

    The investigation of gait strategies at low gravity environments gained momentum recently as manned missions to the Moon and to Mars are reconsidered. Although reports by astronauts of the Apollo missions indicate alternative gait strategies might be favored on the Moon, computational simulations and experimental investigations have been almost exclusively limited to the study of either walking or running, the locomotion modes preferred under Earth's gravity. In order to investigate the gait strategies likely to be favored at low gravity a series of predictive, computational simulations of gait are performed using a physiological model of the musculoskeletal system, without assuming any particular type of gait. A computationally efficient optimization strategy is utilized allowing for multiple simulations. The results reveal skipping as more efficient and less fatiguing than walking or running and suggest the existence of a walk-skip rather than a walk-run transition at low gravity. The results are expected to serve as a background to the design of experimental investigations of gait under simulated low gravity. Copyright © 2012 Elsevier Ltd. All rights reserved.

  13. Case Studies of Liberal Arts Computer Science Programs

    ERIC Educational Resources Information Center

    Baldwin, D.; Brady, A.; Danyluk, A.; Adams, J.; Lawrence, A.

    2010-01-01

    Many undergraduate liberal arts institutions offer computer science majors. This article illustrates how quality computer science programs can be realized in a wide variety of liberal arts settings by describing and contrasting the actual programs at five liberal arts colleges: Williams College, Kalamazoo College, the State University of New York…

  14. Computer Vision Syndrome and Associated Factors Among Medical and Engineering Students in Chennai

    PubMed Central

    Logaraj, M; Madhupriya, V; Hegde, SK

    2014-01-01

    Background: Almost all institutions, colleges, universities, and homes today use computers regularly. Very little research has been carried out among Indian users, especially college students, on the effects of computer use on eye- and vision-related problems. Aim: The aim of this study was to assess the prevalence of computer vision syndrome (CVS) among medical and engineering students and the factors associated with it. Subjects and Methods: A cross-sectional study was conducted among medical and engineering college students of a university situated in the suburban area of Chennai. Students who used a computer in the month preceding the date of the study were included. The participants were surveyed using a pre-tested structured questionnaire. Results: Among engineering students, the prevalence of CVS was found to be 81.9% (176/215), while among medical students it was found to be 78.6% (158/201). A significantly higher proportion of engineering students, 40.9% (88/215), used computers for 4-6 h/day as compared to medical students, 10% (20/201) (P < 0.001). The reported symptoms of CVS were higher among engineering students compared with medical students. Students who used a computer for 4-6 h were at significantly higher risk of developing redness (OR = 1.2, 95% CI = 1.0-3.1, P = 0.04), burning sensation (OR = 2.1, 95% CI = 1.3-3.1, P < 0.01) and dry eyes (OR = 1.8, 95% CI = 1.1-2.9, P = 0.02) compared to those who used a computer for less than 4 h. Significant correlation was found between increased hours of computer use and the symptoms of redness, burning sensation, blurred vision and dry eyes. Conclusion: The present study revealed that more than three-fourths of the students complained of at least one symptom of CVS while working on the computer. PMID:24761234

  15. Computer vision syndrome and associated factors among medical and engineering students in chennai.

    PubMed

    Logaraj, M; Madhupriya, V; Hegde, Sk

    2014-03-01

    Almost all institutions, colleges, universities, and homes today use computers regularly. Very little research has been carried out among Indian users, especially college students, on the effects of computer use on eye- and vision-related problems. The aim of this study was to assess the prevalence of computer vision syndrome (CVS) among medical and engineering students and the factors associated with it. A cross-sectional study was conducted among medical and engineering college students of a university situated in the suburban area of Chennai. Students who used a computer in the month preceding the date of the study were included. The participants were surveyed using a pre-tested structured questionnaire. Among engineering students, the prevalence of CVS was found to be 81.9% (176/215), while among medical students it was found to be 78.6% (158/201). A significantly higher proportion of engineering students, 40.9% (88/215), used computers for 4-6 h/day as compared to medical students, 10% (20/201) (P < 0.001). The reported symptoms of CVS were higher among engineering students compared with medical students. Students who used a computer for 4-6 h were at significantly higher risk of developing redness (OR = 1.2, 95% CI = 1.0-3.1, P = 0.04), burning sensation (OR = 2.1, 95% CI = 1.3-3.1, P < 0.01) and dry eyes (OR = 1.8, 95% CI = 1.1-2.9, P = 0.02) compared to those who used a computer for less than 4 h. Significant correlation was found between increased hours of computer use and the symptoms of redness, burning sensation, blurred vision and dry eyes. The present study revealed that more than three-fourths of the students complained of at least one symptom of CVS while working on the computer.
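
    The odds ratios and 95% confidence intervals quoted in this record follow the standard 2x2-table calculation with a normal approximation on the log scale. A Python sketch with hypothetical counts (not the study's raw data):

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and 95% CI for a 2x2 table:
       exposed:   a with symptom, b without
       unexposed: c with symptom, d without
    """
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of ln(OR)
    lo = math.exp(math.log(or_) - z * se_log)
    hi = math.exp(math.log(or_) + z * se_log)
    return or_, lo, hi

# Hypothetical counts: 10/100 heavy users vs. 5/100 light users with dry eyes
or_, lo, hi = odds_ratio_ci(10, 90, 5, 95)
print(round(or_, 2), round(lo, 2), round(hi, 2))
```

    A confidence interval whose lower bound exceeds 1, as for the burning sensation and dry eye results above, is what supports calling the association significant at the 5% level.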

  16. A Study of the Behavior of Children in a Preschool Equipped with Computers.

    ERIC Educational Resources Information Center

    Klinzing, Dene G.

    A study was conducted: (1) to compare the popularity of computer stations with nine other activity stations; (2) to determine the differences in the type of play displayed by the children in preschool and note the type of play displayed at the computer stations versus the other activity stations; (3) to determine whether the preschool activities,…

  17. High-Productivity Computing in Computational Physics Education

    NASA Astrophysics Data System (ADS)

    Tel-Zur, Guy

    2011-03-01

    We describe the development of a new course in Computational Physics at Ben-Gurion University. This elective course for 3rd-year undergraduates and MSc students is taught over one semester. Computational Physics is by now well accepted as the Third Pillar of Science. This paper's claim is that modern Computational Physics education should also deal with High-Productivity Computing. The traditional approach to teaching Computational Physics emphasizes ``Correctness'' and then ``Accuracy''; we add ``Performance.'' Along with topics in Mathematical Methods and case studies in Physics, the course devotes a significant amount of time to ``Mini-Courses'' on topics such as: High-Throughput Computing with Condor, Parallel Programming with MPI and OpenMP, How to Build a Beowulf, Visualization, and Grid and Cloud Computing. The course does not intend to teach new physics or new mathematics; it is focused on an integrated approach to solving problems, starting from the physics problem, through the corresponding mathematical solution and the numerical scheme, to writing an efficient computer code and finally analysis and visualization.

  18. Payload/orbiter contamination control requirement study: Computer interface

    NASA Technical Reports Server (NTRS)

    Bareiss, L. E.; Hooper, V. W.; Ress, E. B.

    1976-01-01

    The MSFC computer facilities, and future plans for them are described relative to characteristics of the various computers as to availability and suitability for processing the contamination program. A listing of the CDC 6000 series and UNIVAC 1108 characteristics is presented so that programming requirements can be compared directly and differences noted.

  19. Use of videotape for off-line viewing of computer-assisted radionuclide cardiology studies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Thrall, J.H.; Pitt, B.; Marx, R.S.

    1978-02-01

    Videotape offers an inexpensive method for off-line viewing of dynamic radionuclide cardiac studies. Two approaches to videotaping have been explored and demonstrated to be feasible. In the first, a video camera in conjunction with a cassette-type recorder is used to record from the computer display scope. Alternatively, for computer systems already linked to video display units, the video signal can be routed directly to the recorder. Acceptance and use of tracer cardiology studies will be enhanced by increased availability of the studies for clinical review. Videotape offers an inexpensive flexible means of achieving this.

  20. Computational Modeling of Inflammation and Wound Healing

    PubMed Central

    Ziraldo, Cordelia; Mi, Qi; An, Gary; Vodovotz, Yoram

    2013-01-01

    Objective Inflammation is both central to proper wound healing and a key driver of chronic tissue injury via a positive-feedback loop incited by incidental cell damage. We seek to derive actionable insights into the role of inflammation in wound healing in order to improve outcomes for individual patients. Approach To date, dynamic computational models have been used to study the time evolution of inflammation in wound healing. Emerging clinical data on histopathological and macroscopic images of evolving wounds, as well as noninvasive measures of blood flow, suggested the need for tissue-realistic, agent-based, and hybrid mechanistic computational simulations of inflammation and wound healing. Innovation We developed a computational modeling system, the Simple Platform for Agent-based Representation of Knowledge, to facilitate the construction of tissue-realistic models. Results A hybrid equation-agent-based model (ABM) of pressure ulcer formation in both spinal cord-injured and uninjured patients was used to identify control points that reduce stress caused by tissue ischemia/reperfusion. An ABM of arterial restenosis revealed new dynamics of cell migration during neointimal hyperplasia that match histological features but contradict the currently prevailing mechanistic hypothesis. ABMs of vocal fold inflammation were used to predict inflammatory trajectories in individuals, possibly allowing for personalized treatment. Conclusions The intertwined inflammatory and wound healing responses can be modeled computationally to make predictions in individuals, simulate therapies, and gain mechanistic insights. PMID:24527362

  1. Inhibition of thrombin by functionalized C60 nanoparticles revealed via in vitro assays and in silico studies.

    PubMed

    Liu, Yanyan; Fu, Jianjie; Pan, Wenxiao; Xue, Qiao; Liu, Xian; Zhang, Aiqian

    2018-01-01

    Studies on the human toxicity of nanoparticles (NPs) lag far behind the rapid development of engineered functionalized NPs. Fullerene has been widely used as a drug carrier skeleton due to its reported low risk. However, unlike other kinds of NPs, fullerene-based NPs (C60 NPs) have been found to have an anticoagulation effect, although the potential target is still unknown. In this study, both experimental and computational methods were adopted to gain mechanistic insight into the modulation of thrombin activity by nine kinds of C60 NPs with diverse surface chemistry properties. In vitro enzyme activity assays showed that all tested surface-modified C60 NPs exhibited thrombin inhibition ability. Kinetic studies coupled with competitive testing using three known inhibitors indicated that six of the C60 NPs, those of greater hydrophobicity and hydrogen bond (HB) donor acidity or acceptor basicity, acted as competitive inhibitors of thrombin by directly interacting with its active site. A simple quantitative nanostructure-activity relationship model relating the surface substituent properties to the inhibition potential was then established for the six competitive inhibitors. Molecular docking analysis revealed that intermolecular HB interactions are important for the specific binding of C60 NPs to the active site canyon, while the additional stability provided by the surface groups through van der Waals interactions also plays a key role in the thrombin binding affinity of the NPs. Our results suggest that thrombin is a possible target of surface-functionalized C60 NPs, relevant to their anticoagulation effect. Copyright © 2017. Published by Elsevier B.V.

  2. Using Robotics and Game Design to Enhance Children's Self-Efficacy, STEM Attitudes, and Computational Thinking Skills

    ERIC Educational Resources Information Center

    Leonard, Jacqueline; Buss, Alan; Gamboa, Ruben; Mitchell, Monica; Fashola, Olatokunbo S.; Hubert, Tarcia; Almughyirah, Sultan

    2016-01-01

    This paper describes the findings of a pilot study that used robotics and game design to develop middle school students' computational thinking strategies. One hundred and twenty-four students engaged in LEGO® EV3 robotics and created games using Scalable Game Design software. The results of the study revealed students' pre-post self-efficacy…

  3. Computational Study of Axisymmetric Off-Design Nozzle Flows

    NASA Technical Reports Server (NTRS)

    DalBello, Teryn; Georgiadis, Nicholas; Yoder, Dennis; Keith, Theo

    2003-01-01

    Computational Fluid Dynamics (CFD) analyses of axisymmetric circular-arc boattail nozzles operating off-design at transonic Mach numbers have been completed. These computations span the very difficult transonic flight regime, with shock-induced separations and strong adverse pressure gradients. External afterbody and internal nozzle pressure distributions computed with the Wind code are compared with experimental data. A range of turbulence models was examined, including the Explicit Algebraic Stress model. Computations have been completed at freestream Mach numbers of 0.9 and 1.2, and nozzle pressure ratios (NPR) of 4 and 6. Calculations completed with variable time-stepping (steady-state) did not converge to a true steady-state solution. Calculations obtained using constant time-stepping (time-accurate) show smaller variations in flow properties than the steady-state solutions. The failure to converge to a steady-state solution was the result of using variable time-stepping with large-scale separations present in the flow. Nevertheless, time-averaged boattail surface pressure coefficients and internal nozzle pressures show reasonable agreement with experimental data. The SST turbulence model demonstrates the best overall agreement with experimental data.

  4. A Feasibility Study of Synthesizing Subsurfaces Modeled with Computational Neural Networks

    NASA Technical Reports Server (NTRS)

    Wang, John T.; Housner, Jerrold M.; Szewczyk, Z. Peter

    1998-01-01

    This paper investigates the feasibility of synthesizing substructures modeled with computational neural networks. Substructures are modeled individually with computational neural networks and the response of the assembled structure is predicted by synthesizing the neural networks. A superposition approach is applied to synthesize models for statically determinate substructures while an interface displacement collocation approach is used to synthesize statically indeterminate substructure models. Beam and plate substructures along with components of a complicated Next Generation Space Telescope (NGST) model are used in this feasibility study. In this paper, the limitations and difficulties of synthesizing substructures modeled with neural networks are also discussed.

  5. A case study on support for students' thinking through computer-mediated communication.

    PubMed

    Sannomiya, M; Kawaguchi, A

    2000-08-01

    This is a case study on support for thinking through computer-mediated communication. Two graduate students were supervised in their research using computer-mediated communication, which was asynchronous and written; the supervisor was not present. The students' reports pointed out that there was more planning and editing, and lower interactivity, in this approach relative to face-to-face communication. These attributes were confirmed by their supervisor's report. The students also suggested that face-to-face communication was effective in supporting the production stage of thinking in research, while computer-mediated communication was effective in supporting the examination of thinking. For distance education to be successful, an appropriate combination of communication media must consider students' thinking stages. Finally, transient and permanent effects should be discriminated in computer-mediated communication.

  6. Computational study of duct and pipe flows using the method of pseudocompressibility

    NASA Technical Reports Server (NTRS)

    Williams, Robert W.

    1991-01-01

    A viscous, three-dimensional, incompressible Navier-Stokes Computational Fluid Dynamics code employing pseudocompressibility is used for the prediction of laminar primary and secondary flows in two 90-degree bends of constant cross section. Under study are a square-cross-section duct bend with a radius ratio of 2.3 and a round-cross-section pipe bend with a radius ratio of 2.8. The sensitivity of the predicted primary and secondary flows to inlet boundary conditions, grid resolution, and code convergence is investigated. Contour plots, and plots of velocity components versus the spanwise coordinate comparing predictions to experimental data, are shown at several streamwise stations before, within, and after the duct and pipe bends. The discussion includes secondary flow physics, the computational method, computational requirements, grid dependence, and convergence rates.
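
    The method of pseudocompressibility (artificial compressibility) referred to above is usually written by augmenting the incompressible continuity equation with a pseudo-time pressure derivative; the equation below is the standard formulation, and the code's exact discretization may differ:

```latex
\frac{\partial p}{\partial \tau} + \beta \, \nabla \cdot \mathbf{V} = 0
```

    Here τ is pseudo-time and β is the artificial compressibility parameter. Marching the coupled pressure-velocity system in τ until ∂p/∂τ → 0 recovers the incompressible constraint ∇·V = 0, so the converged pseudo-steady solution satisfies the original incompressible equations.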

  7. Modeling an Excitable Biosynthetic Tissue with Inherent Variability for Paired Computational-Experimental Studies.

    PubMed

    Gokhale, Tanmay A; Kim, Jong M; Kirkton, Robert D; Bursac, Nenad; Henriquez, Craig S

    2017-01-01

    To understand how excitable tissues give rise to arrhythmias, it is crucial to understand the electrical dynamics of cells in the context of their environment. Multicellular monolayer cultures have proven useful for investigating arrhythmias and other conduction anomalies, and because of their relatively simple structure, these constructs lend themselves to paired computational studies that often help elucidate mechanisms of the observed behavior. However, tissue cultures of cardiomyocyte monolayers currently require the use of neonatal cells with ionic properties that change rapidly during development and have thus been poorly characterized and modeled to date. Recently, Kirkton and Bursac demonstrated the ability to create biosynthetic excitable tissues from genetically engineered and immortalized HEK293 cells with well-characterized electrical properties and the ability to propagate action potentials. In this study, we developed and validated a computational model of these excitable HEK293 cells (called "Ex293" cells) using existing electrophysiological data and a genetic search algorithm. In order to reproduce not only the mean but also the variability of experimental observations, we examined what sources of variation were required in the computational model. Random cell-to-cell and inter-monolayer variation in both ionic conductances and tissue conductivity was necessary to explain the experimentally observed variability in action potential shape and macroscopic conduction, and the spatial organization of cell-to-cell conductance variation was found not to impact macroscopic behavior; the resulting model accurately reproduces both normal and drug-modified conduction behavior. The development of a computational Ex293 cell and tissue model provides a novel framework to perform paired computational-experimental studies to study normal and abnormal conduction in multidimensional excitable tissue, and the methodology of modeling variation can be

  8. Use or abuse of computers in the workplace.

    PubMed

    Gregg, Robert E

    2007-01-01

    Is your computer system about to become your next liability through the misuse of computers in the workplace? "E-discovery" reveals evidence of harassment, discrimination, defamation, and more. Yet employees also sue when the employer improperly intercepts electronic messages the employees claim were "private." Employers need to be aware of the issues of use and misuse, and of their rights to properly monitor and control the electronic system. Learn the current issues, legal trends, and practical pointers for your electronic operations.

  9. Biomaterial science meets computational biology.

    PubMed

    Hutmacher, Dietmar W; Little, J Paige; Pettet, Graeme J; Loessner, Daniela

    2015-05-01

    There is a pressing need for a predictive tool capable of revealing a holistic understanding of fundamental elements in the normal and pathological cell physiology of organoids in order to decipher the mechanoresponse of cells. Therefore, the integration of a systems bioengineering approach into a validated mathematical model is necessary to develop a new simulation tool. This tool can only be innovative by combining biomaterials science with computational biology. Systems-level and multi-scale experimental data are incorporated into a single framework, thus representing both single cells and collective cell behaviour. Such a computational platform needs to be validated in order to discover key mechano-biological factors associated with cell-cell and cell-niche interactions.

  10. Computer Use and Computer Anxiety in Older Korean Americans.

    PubMed

    Yoon, Hyunwoo; Jang, Yuri; Xie, Bo

    2016-09-01

    Responding to the limited literature on computer use in ethnic minority older populations, the present study examined predictors of computer use and computer anxiety in older Korean Americans. Separate regression models were estimated for computer use and computer anxiety with a common set of predictors: (a) demographic variables (age, gender, marital status, and education), (b) physical health indicators (chronic conditions, functional disability, and self-rated health), and (c) sociocultural factors (acculturation and attitudes toward aging). Approximately 60% of the participants were computer users, and they had significantly lower levels of computer anxiety than non-users. A higher likelihood of computer use and lower levels of computer anxiety were commonly observed among individuals with younger age, male gender, advanced education, more positive ratings of health, and higher levels of acculturation. In addition, positive attitudes toward aging were found to reduce computer anxiety. Findings provide implications for developing computer training and education programs for the target population. © The Author(s) 2015.

  11. Computation of viscous blast wave flowfields

    NASA Technical Reports Server (NTRS)

    Atwood, Christopher A.

    1991-01-01

    A method to determine unsteady solutions of the Navier-Stokes equations was developed and applied. The structured finite-volume, approximately factored implicit scheme uses Newton subiterations to obtain the spatially and temporally second-order accurate time history of the interaction of blast waves with stationary targets. The inviscid flux is evaluated using MacCormack's modified Steger-Warming flux or Roe flux difference splittings with total variation diminishing limiters, while the viscous flux is computed using central differences. The use of implicit boundary conditions in conjunction with a method that telescopes in time and space permitted solutions to this strongly unsteady class of problems. Comparisons of numerical, analytical, and experimental results were made in two and three dimensions. These comparisons revealed accurate wave speed resolution with nonoscillatory discontinuity capturing. The purpose of this effort was to address the three-dimensional, viscous blast-wave problem. Test cases were undertaken to reveal these methods' weaknesses in three regimes: (1) viscous-dominated flow; (2) complex unsteady flow; and (3) three-dimensional flow. Comparisons of these computations to analytic and experimental results provided initial validation of the resultant code. Additional details on the numerical method and on the validation can be found in the appendix. Presently, the code is capable of single-zone computations with selection of any permutation of solid-wall or flow-through boundaries.
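
    As one concrete instance of the total-variation-diminishing machinery mentioned above, a minmod slope limiter clips the reconstruction slope to zero at extrema and jumps. This is a generic textbook sketch, not the code described in the report:

```python
def minmod(a, b):
    """Return the smaller-magnitude slope when a and b agree in sign,
    else 0 -- this is what keeps the reconstruction non-oscillatory."""
    if a * b <= 0.0:
        return 0.0
    return a if abs(a) < abs(b) else b

def limited_slopes(u):
    """Cell-wise limited slopes from the two one-sided differences."""
    s = [0.0] * len(u)
    for i in range(1, len(u) - 1):
        s[i] = minmod(u[i] - u[i - 1], u[i + 1] - u[i])
    return s

# A smooth ramp followed by a jump: the limiter keeps the ramp slopes
# but flattens the reconstruction next to the discontinuity.
slopes = limited_slopes([0.0, 1.0, 2.0, 3.0, 0.0])
```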

  12. Engagement, Persistence, and Gender in Computer Science: Results of a Smartphone ESM Study

    PubMed Central

    Milesi, Carolina; Perez-Felkner, Lara; Brown, Kevin; Schneider, Barbara

    2017-01-01

    While the underrepresentation of women in the fast-growing STEM field of computer science (CS) has been much studied, no consensus exists on the key factors influencing this widening gender gap. Possible suspects include gender differences in aptitude, interest, and academic environment. Our study contributes to this literature by applying student engagement research to study the experiences of college students studying CS, to assess the degree to which differences in men and women's engagement may help account for gender inequity in the field. Specifically, we use the Experience Sampling Method (ESM) to evaluate in real-time the engagement of college students during varied activities and environments. Over the course of a full week in fall semester and a full week in spring semester, 165 students majoring in CS at two Research I universities were “beeped” several times a day via a smartphone app prompting them to fill out a short questionnaire including open-ended and scaled items. These responses were paired with administrative and over 2 years of transcript data provided by their institutions. We used mean comparisons and logistic regression analysis to compare enrollment and persistence patterns among CS men and women. Results suggest that despite the obstacles associated with women's underrepresentation in computer science, women are more likely to continue taking computer science courses when they felt challenged and skilled in their initial computer science classes. We discuss implications for further research. PMID:28487664

  13. Engagement, Persistence, and Gender in Computer Science: Results of a Smartphone ESM Study.

    PubMed

    Milesi, Carolina; Perez-Felkner, Lara; Brown, Kevin; Schneider, Barbara

    2017-01-01

    While the underrepresentation of women in the fast-growing STEM field of computer science (CS) has been much studied, no consensus exists on the key factors influencing this widening gender gap. Possible suspects include gender differences in aptitude, interest, and academic environment. Our study contributes to this literature by applying student engagement research to study the experiences of college students studying CS, to assess the degree to which differences in men and women's engagement may help account for gender inequity in the field. Specifically, we use the Experience Sampling Method (ESM) to evaluate in real-time the engagement of college students during varied activities and environments. Over the course of a full week in fall semester and a full week in spring semester, 165 students majoring in CS at two Research I universities were "beeped" several times a day via a smartphone app prompting them to fill out a short questionnaire including open-ended and scaled items. These responses were paired with administrative and over 2 years of transcript data provided by their institutions. We used mean comparisons and logistic regression analysis to compare enrollment and persistence patterns among CS men and women. Results suggest that despite the obstacles associated with women's underrepresentation in computer science, women are more likely to continue taking computer science courses when they felt challenged and skilled in their initial computer science classes. We discuss implications for further research.

  14. Railroad classification yard technology : computer system methodology : case study : Potomac Yard

    DOT National Transportation Integrated Search

    1981-08-01

    This report documents the application of the railroad classification yard computer system methodology to Potomac Yard of the Richmond, Fredericksburg, and Potomac Railroad Company (RF&P). This case study entailed evaluation of the yard traffic capaci...

  15. It Pays to Compare: An Experimental Study on Computational Estimation

    ERIC Educational Resources Information Center

    Star, Jon R.; Rittle-Johnson, Bethany

    2009-01-01

    Comparing and contrasting examples is a core cognitive process that supports learning in children and adults across a variety of topics. In this experimental study, we evaluated the benefits of supporting comparison in a classroom context for children learning about computational estimation. Fifth- and sixth-grade students (N = 157) learned about…

  16. Mechanical unfolding reveals stable 3-helix intermediates in talin and α-catenin

    PubMed Central

    2018-01-01

    Mechanical stability is a key feature in the regulation of structural scaffolding proteins and their functions. Despite the abundance of α-helical structures among the human proteome and their undisputed importance in health and disease, the fundamental principles of their behavior under mechanical load are poorly understood. Talin and α-catenin are two key molecules in focal adhesions and adherens junctions, respectively. In this study, we used a combination of atomistic steered molecular dynamics (SMD) simulations, polyprotein engineering, and single-molecule atomic force microscopy (smAFM) to investigate unfolding of these proteins. SMD simulations revealed that talin rod α-helix bundles as well as α-catenin α-helix domains unfold through stable 3-helix intermediates. While the 5-helix bundles were found to be mechanically stable, a second stable conformation corresponding to the 3-helix state was revealed. Mechanically weaker 4-helix bundles easily unfolded into a stable 3-helix conformation. The results of smAFM experiments were in agreement with the findings of the computational simulations. The disulfide clamp mutants, designed to protect the stable state, support the 3-helix intermediate model in both experimental and computational setups. As a result, multiple discrete unfolding intermediate states in the talin and α-catenin unfolding pathway were discovered. Better understanding of the mechanical unfolding mechanism of α-helix proteins is a key step towards comprehensive models describing the mechanoregulation of proteins. PMID:29698481

  17. Computational study of arc discharges: Spark plug and railplug ignitors

    NASA Astrophysics Data System (ADS)

    Ekici, Ozgur

    A theoretical study of electrical arc discharges that focuses on the discharge processes in spark plug and railplug ignitors is presented. The aim of the study is to gain a better understanding of the dynamics of electrical discharges, more specifically the transfer of electrical energy into the gas and the effect of this energy transfer on the flow physics. Different levels of computational models are presented to investigate the types of arc discharges seen in spark plugs and railplugs (i.e., stationary and moving arc discharges). Better understanding of discharge physics is important for a number of applications. For example, improved fuel economy under the constraint of stricter emissions standards and improved plug durability are important objectives of current internal combustion engine designs. These goals can be achieved by improving the existing systems (spark plug) and introducing more sophisticated ignition systems (railplug). Although spark plug and railplug ignitors are the focus of this work, the methods presented here can be extended to study the discharges found in other applications such as plasma torches, laser sparks, and circuit breakers. The system of equations describing the physical processes in an air plasma is solved using computational fluid dynamics codes to simulate thermal and flow fields. The evolution of the shock front, temperature, pressure, density, and flow of a plasma kernel were investigated for both stationary and moving arcs. Arc propagation between the electrodes under the effects of gas dynamics and electromagnetic processes was studied for moving arcs. The air plasma is regarded as a continuum, single-substance material in local thermal equilibrium. Thermophysical properties of high-temperature air are used to account for important processes such as dissociation and ionization. The different mechanisms and the relative importance of several assumptions in gas discharges and thermal plasma

  18. A research program in empirical computer science

    NASA Technical Reports Server (NTRS)

    Knight, J. C.

    1991-01-01

    During the grant reporting period our primary activities have been to begin preparation for the establishment of a research program in experimental computer science. The focus of research in this program will be safety-critical systems. Many questions that arise in the effort to improve software dependability can only be addressed empirically. For example, there is no way to predict the performance of the various proposed approaches to building fault-tolerant software. Performance models, though valuable, are parameterized and cannot be used to make quantitative predictions without experimental determination of underlying distributions. In the past, experimentation has been able to shed some light on the practical benefits and limitations of software fault tolerance. It is common, also, for experimentation to reveal new questions or new aspects of problems that were previously unknown. A good example is the Consistent Comparison Problem that was revealed by experimentation and subsequently studied in depth. The result was a clear understanding of a previously unknown problem with software fault tolerance. The purpose of a research program in empirical computer science is to perform controlled experiments in the area of real-time, embedded control systems. The goal of the various experiments will be to determine better approaches to the construction of the software for computing systems that have to be relied upon. As such it will validate research concepts from other sources, provide new research results, and facilitate the transition of research results from concepts to practical procedures that can be applied with low risk to NASA flight projects. The target of experimentation will be the production software development activities undertaken by any organization prepared to contribute to the research program. Experimental goals, procedures, data analysis and result reporting will be performed for the most part by the University of Virginia.
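
    The Consistent Comparison Problem mentioned above is easy to reproduce: two arithmetically equivalent program versions can round differently and land on opposite sides of the same threshold, so an N-version vote disagrees even though neither version is faulty. A minimal sketch, with sensor values and a threshold invented for illustration:

```python
THRESHOLD = 0.1  # hypothetical alarm threshold

def version_a(readings):
    # Sums the readings in the order they arrive.
    return sum(readings) / len(readings)

def version_b(readings):
    # Arithmetically identical, but sums in sorted order.
    return sum(sorted(readings)) / len(readings)

# Floating-point addition is not associative, so the two "equivalent"
# versions can disagree about which side of the threshold they are on.
readings = [1e16, -1e16, 1.0]
alarm_a = version_a(readings) > THRESHOLD   # True:  mean computed as 1/3
alarm_b = version_b(readings) > THRESHOLD   # False: mean computed as 0.0
```

    Both versions satisfy the specification to within rounding error, yet a voter comparing their boolean outputs reports a disagreement; this is exactly the class of problem that only experimentation revealed.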

  19. Saudi high school students' attitudes and barriers toward the use of computer technologies in learning English.

    PubMed

    Sabti, Ahmed Abdulateef; Chaichan, Rasha Sami

    2014-01-01

    This study examines the attitudes of Saudi Arabian high school students toward the use of computer technologies in learning English. The study also discusses the possible barriers that affect and limit the actual usage of computers. A quantitative approach was applied in this research, which involved 30 Saudi Arabian students at a high school in Kuala Lumpur, Malaysia. The respondents comprised 15 males and 15 females aged between 16 and 18 years. Two instruments, namely the Scale of Attitude toward Computer Technologies (SACT) and Barriers affecting Students' Attitudes and Use (BSAU), were used to collect data. The Technology Acceptance Model (TAM) of Davis (1989) was utilized. The analysis revealed gender differences in attitudes toward the use of computer technologies in learning English: female students showed more positive attitudes toward their use than males. Both male and female participants demonstrated high and positive perceptions of the Usefulness and Ease of Use of computer technologies in learning English. Three barriers that affected and limited the use of computer technologies in learning English were identified by the participants: skill, equipment, and motivation. Among these, skill had the greatest effect, whereas motivation had the least.

  20. Evidence for phosphorus bonding in phosphorus trichloride-methanol adduct: a matrix isolation infrared and ab initio computational study.

    PubMed

    Joshi, Prasad Ramesh; Ramanathan, N; Sundararajan, K; Sankaran, K

    2015-04-09

    The weak interaction between PCl3 and CH3OH was investigated using matrix isolation infrared spectroscopy and ab initio computations. In a nitrogen matrix at low temperature, the noncovalent adduct was generated and characterized using Fourier transform infrared spectroscopy. Computations were performed at the B3LYP/6-311++G(d,p), B3LYP/aug-cc-pVDZ, and MP2/6-311++G(d,p) levels of theory to optimize the possible geometries of PCl3-CH3OH adducts. Computations revealed two minima on the potential energy surface, of which the global minimum is stabilized by a noncovalent P···O interaction known as pnictogen bonding (phosphorus bonding or P-bonding). The local minimum corresponded to a cyclic adduct stabilized by conventional hydrogen bonding (Cl···H-O and Cl···H-C interactions). Experimentally, the 1:1 P-bonded PCl3-CH3OH adduct was identified in the nitrogen matrix through shifts in the P-Cl modes of the PCl3 submolecule and the O-C and O-H modes of the CH3OH submolecule. The observed vibrational frequencies of the P-bonded adduct in a nitrogen matrix agreed well with the computed frequencies. Furthermore, computations also predicted that the P-bonded adduct is stronger than the H-bonded adduct by ∼1.56 kcal/mol. Atoms-in-molecules and natural bond orbital analyses were performed to understand the nature of the interactions and the effect of charge transfer on the stability of the adducts.

  1. Democratizing Computer Science

    ERIC Educational Resources Information Center

    Margolis, Jane; Goode, Joanna; Ryoo, Jean J.

    2015-01-01

    Computer science programs are too often identified with a narrow stratum of the student population, often white or Asian boys who have access to computers at home. But because computers play such a huge role in our world today, all students can benefit from the study of computer science and the opportunity to build skills related to computing. The…

  2. Short Stories via Computers in EFL Classrooms: An Empirical Study for Reading and Writing Skills

    ERIC Educational Resources Information Center

    Yilmaz, Adnan

    2015-01-01

    The present empirical study scrutinizes the use of short stories via computer technologies in teaching and learning English language. The objective of the study is two-fold: to examine how short stories could be used through computer programs in teaching and learning English and to collect data about students' perceptions of this technique via…

  3. The Challenges of Improving the Teaching-Learning Process in Computer Studies.

    ERIC Educational Resources Information Center

    Siegel, Michael Eric

    In spring 1984, a study was conducted at the University of Maryland's University College to evaluate the effectiveness of computer studies faculty. The student ratings of these faculty members were compared with the ratings given to mathematics faculty and to all other faculty teaching that term. On all three evaluation criteria (i.e., encourages…

  4. A Case Study of Educational Computer Game Design by Middle School Students

    ERIC Educational Resources Information Center

    An, Yun-Jo

    2016-01-01

    Only a limited number of research studies have investigated how students design educational computer games and its impact on student learning. In addition, most studies on educational game design by students were conducted in the areas of mathematics and science. Using the qualitative case study approach, this study explored how seventh graders…

  5. Use of high-speed cinematography and computer generated gait diagrams for the study of equine hindlimb kinematics.

    PubMed

    Kobluk, C N; Schnurr, D; Horney, F D; Sumner-Smith, G; Willoughby, R A; Dekleer, V; Hearn, T C

    1989-01-01

    High-speed cinematography with computer-aided analysis was used to study equine hindlimb kinematics. Eight horses were filmed at the trot or the pace. Filming was done from the side (lateral) and the back (caudal). Parameters measured from the lateral filming included the heights of the tuber coxae and tailhead, protraction and retraction of the hoof, and angular changes of the tarsus and stifle. Abduction and adduction of the limb and tarsal height changes were measured from the caudal filming. The maximum and minimum values plus the standard deviations and coefficients of variation are presented in tabular form. Three gait diagrams were constructed to represent stifle angle versus tarsal angle, metatarsophalangeal height versus protraction-retraction (fetlock height diagram), and tuber coxae and tailhead height versus stride (pelvic height diagram). Application of the technique to the group of horses revealed good repeatability of the gait diagrams within a limb, and the diagrams appeared to be sensitive indicators of left/right asymmetries.

  6. Computation and projection of spiral wave trajectories during atrial fibrillation: a computational study.

    PubMed

    Pashaei, Ali; Bayer, Jason; Meillet, Valentin; Dubois, Rémi; Vigmond, Edward

    2015-03-01

    To show how atrial fibrillation rotor activity on the heart surface manifests as phase on the torso, fibrillation was induced on a geometrically accurate computer model of the human atria. The Hilbert transform, time embedding, and filament detection were compared. Electrical activity on the epicardium was used to compute potentials on different surfaces from the atria to the torso. The Hilbert transform produces erroneous phase when pacing for longer than the action potential duration. The number of phase singularities, frequency content, and the dominant frequency decreased with distance from the heart, except for the convex hull. Copyright © 2015 Elsevier Inc. All rights reserved.
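
    Of the phase definitions compared above, time-delay embedding is the simplest to sketch: each sample is paired with a delayed copy of the signal, and the phase is the angle of that point about a reference value. The signal and delay below are synthetic stand-ins, not the study's atrial data:

```python
import math

def time_embedded_phase(v, tau, v_ref=0.0):
    """Phase via time-delay embedding: the angle of the point
    (v[t] - v_ref, v[t - tau] - v_ref) for each sample t >= tau."""
    return [math.atan2(v[t - tau] - v_ref, v[t] - v_ref)
            for t in range(tau, len(v))]

# Synthetic 1 Hz "activation" signal sampled at 100 Hz
fs = 100
v = [math.sin(2 * math.pi * t / fs) for t in range(300)]
phase = time_embedded_phase(v, tau=fs // 4)  # quarter-period delay
```

    A phase singularity (rotor tip) is then a point around which a closed loop accumulates a full ±2π of this phase.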

  7. Validation of computer simulation training for esophagogastroduodenoscopy: Pilot study.

    PubMed

    Sedlack, Robert E

    2007-08-01

    Little is known regarding the value of esophagogastroduodenoscopy (EGD) simulators in education. The purpose of the present paper was to validate the use of computer simulation in novice EGD training. In phase 1, expert endoscopists evaluated various aspects of simulation fidelity as compared to live endoscopy. Additionally, computer-recorded performance metrics were assessed by comparing the recorded scores from users of three different experience levels. In phase 2, the transfer of simulation-acquired skills to the clinical setting was assessed in a two-group, randomized pilot study. The setting was a large gastroenterology (GI) Fellowship training program; in phase 1, 21 subjects (seven each of expert, intermediate, and novice endoscopists) made up the three experience groups. In phase 2, eight novice GI fellows were involved in the two-group, randomized portion of the study examining the transfer of simulation skills to the clinical setting. During the initial validation phase, each of the 21 subjects completed two standardized EGD scenarios on a computer simulator and their performance scores were recorded for seven parameters. Following this, staff participants completed a questionnaire evaluating various aspects of the simulator's fidelity. Finally, four novice GI fellows were randomly assigned to receive 6 h of simulator-augmented training (SAT group) in EGD prior to beginning 1 month of patient-based EGD training. The remaining fellows experienced 1 month of patient-based training alone (PBT group). Results of the seven measured performance parameters were compared among the three groups of varying experience using a Wilcoxon rank sum test. The staff's simulator fidelity survey used a 7-point Likert scale (1, very unrealistic; 4, neutral; 7, very realistic) for each of the parameters examined. During the second phase of this study, supervising staff rated both SAT and PBT fellows' patient-based performance daily. Scoring in each skill was completed using a 7-point

  8. Combining on-chip synthesis of a focused combinatorial library with computational target prediction reveals imidazopyridine GPCR ligands.

    PubMed

    Reutlinger, Michael; Rodrigues, Tiago; Schneider, Petra; Schneider, Gisbert

    2014-01-07

    Using the example of the Ugi three-component reaction we report a fast and efficient microfluidic-assisted entry into the imidazopyridine scaffold, where building block prioritization was coupled to a new computational method for predicting ligand-target associations. We identified an innovative GPCR-modulating combinatorial chemotype featuring ligand-efficient adenosine A1/2B and adrenergic α1A/B receptor antagonists. Our results suggest the tight integration of microfluidics-assisted synthesis with computer-based target prediction as a viable approach to rapidly generate bioactivity-focused combinatorial compound libraries with high success rates. Copyright © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  9. The International Computer and Information Literacy Study from a European Perspective: Introduction to the Special Issue

    ERIC Educational Resources Information Center

    Gerick, Julia; Eickelmann, Birgit; Bos, Wilfried

    2017-01-01

    The "International Computer and Information Literacy Study" (ICILS 2013) provides, for the first time, information about students' computer and information literacy (CIL), as well as its acquisition, based on a computer-based test for students and background questionnaires. Among the 21 education systems that participated in ICILS 2013,…

  10. Attitudes towards Computer and Computer Self-Efficacy as Predictors of Preservice Mathematics Teachers' Computer Anxiety

    ERIC Educational Resources Information Center

    Awofala, Adeneye O. A.; Akinoso, Sabainah O.; Fatade, Alfred O.

    2017-01-01

    The study investigated attitudes towards computer and computer self-efficacy as predictors of computer anxiety among 310 preservice mathematics teachers from five higher institutions of learning in Lagos and Ogun States of Nigeria using the quantitative research method within the blueprint of the descriptive survey design. Data collected were…

  11. Labels, cognomes, and cyclic computation: an ethological perspective.

    PubMed

    Murphy, Elliot

    2015-01-01

    For the past two decades, it has widely been assumed by linguists that there is a single computational operation, Merge, which is unique to language, distinguishing it from other cognitive domains. The intention of this paper is to progress the discussion of language evolution in two ways: (i) survey what the ethological record reveals about the uniqueness of the human computational system, and (ii) explore how syntactic theories account for what ethology may determine to be human-specific. It is shown that the operation Label, not Merge, constitutes the evolutionary novelty which distinguishes human language from non-human computational systems; a proposal lending weight to a Weak Continuity Hypothesis and leading to the formation of what is termed Computational Ethology. Some directions for future ethological research are suggested.

  12. Labels, cognomes, and cyclic computation: an ethological perspective

    PubMed Central

    Murphy, Elliot

    2015-01-01

    For the past two decades, it has widely been assumed by linguists that there is a single computational operation, Merge, which is unique to language, distinguishing it from other cognitive domains. The intention of this paper is to progress the discussion of language evolution in two ways: (i) survey what the ethological record reveals about the uniqueness of the human computational system, and (ii) explore how syntactic theories account for what ethology may determine to be human-specific. It is shown that the operation Label, not Merge, constitutes the evolutionary novelty which distinguishes human language from non-human computational systems; a proposal lending weight to a Weak Continuity Hypothesis and leading to the formation of what is termed Computational Ethology. Some directions for future ethological research are suggested. PMID:26089809

  13. Computer-aided diagnosis of contrast-enhanced spectral mammography: A feasibility study.

    PubMed

    Patel, Bhavika K; Ranjbar, Sara; Wu, Teresa; Pockaj, Barbara A; Li, Jing; Zhang, Nan; Lobbes, Mark; Zhang, Bin; Mitchell, J Ross

    2018-01-01

    To evaluate whether the use of a computer-aided diagnosis-contrast-enhanced spectral mammography (CAD-CESM) tool can further increase the diagnostic performance of CESM compared with that of experienced radiologists. This IRB-approved retrospective study analyzed 50 lesions described on CESM from August 2014 to December 2015. Histopathologic analyses, used as the criterion standard, revealed 24 benign and 26 malignant lesions. An expert breast radiologist manually outlined lesion boundaries on the different views. A set of morphologic and textural features were then extracted from the low-energy and recombined images. Machine-learning algorithms with feature selection were used along with statistical analysis to reduce, select, and combine features. Selected features were then used to construct a predictive model using a support vector machine (SVM) classification method in a leave-one-out-cross-validation approach. The classification performance was compared against the diagnostic predictions of 2 breast radiologists with access to the same CESM cases. Based on the SVM classification, CAD-CESM correctly identified 45 of 50 lesions in the cohort, resulting in an overall accuracy of 90%. The detection rate for the malignant group was 88% (3 false-negative cases) and 92% for the benign group (2 false-positive cases). Compared with the model, radiologist 1 had an overall accuracy of 78% and a detection rate of 92% (2 false-negative cases) for the malignant group and 62% (10 false-positive cases) for the benign group. Radiologist 2 had an overall accuracy of 86% and a detection rate of 100% for the malignant group and 71% (8 false-positive cases) for the benign group. The results of our feasibility study suggest that a CAD-CESM tool can provide complementary information to radiologists, mainly by reducing the number of false-positive findings. Copyright © 2017 Elsevier B.V. All rights reserved.
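
    The leave-one-out cross-validation used above is simple to sketch: every case is scored by a model trained on all the others. In the sketch below a nearest-centroid classifier stands in for the paper's SVM, and the two-feature "lesions" are toy values, not CESM data:

```python
def nearest_centroid(train, x):
    """Predict the label whose training-feature mean is closest to x."""
    groups = {}
    for feats, label in train:
        groups.setdefault(label, []).append(feats)
    centroids = {label: [sum(col) / len(rows) for col in zip(*rows)]
                 for label, rows in groups.items()}
    dist2 = lambda a, b: sum((ai - bi) ** 2 for ai, bi in zip(a, b))
    return min(centroids, key=lambda label: dist2(centroids[label], x))

def loo_accuracy(data, predict):
    """Train on all-but-one, test on the held-out case, for every case."""
    hits = sum(predict(data[:i] + data[i + 1:], x) == y
               for i, (x, y) in enumerate(data))
    return hits / len(data)

# Toy two-feature "lesions": two well-separated clusters
data = [([0.1, 0.2], "benign"), ([0.2, 0.1], "benign"), ([0.0, 0.3], "benign"),
        ([1.0, 1.1], "malignant"), ([1.2, 0.9], "malignant"), ([0.9, 1.0], "malignant")]
accuracy = loo_accuracy(data, nearest_centroid)
```

    The leave-one-out scheme matters here because, with only 50 lesions, holding out a fixed test set would leave too little data to train on.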

  14. Faculty Instructional Computer Use Model: Differentiating Instructional and Mainstream Computer Uses

    ERIC Educational Resources Information Center

    Sahin, Ismail

    2008-01-01

    The purpose of this research study was to explore predictor variables for faculty instructional computer use. Analysis of data collected from 198 college of education (COE) faculty members indicated that instructional computer use and mainstream computer use form two strong and distinct variables. This study also proposes a faculty instructional…

  15. Methodical Approaches to Teaching of Computer Modeling in Computer Science Course

    ERIC Educational Resources Information Center

    Rakhimzhanova, B. Lyazzat; Issabayeva, N. Darazha; Khakimova, Tiyshtik; Bolyskhanova, J. Madina

    2015-01-01

    The purpose of this study was to justify of the formation technique of representation of modeling methodology at computer science lessons. The necessity of studying computer modeling is that the current trends of strengthening of general education and worldview functions of computer science define the necessity of additional research of the…

  16. Home-Based Computer Gaming in Vestibular Rehabilitation of Gaze and Balance Impairment.

    PubMed

    Szturm, Tony; Reimer, Karen M; Hochman, Jordan

    2015-06-01

    Disease or damage of the vestibular sense organs causes a range of distressing symptoms and functional problems that can include loss of balance, gaze instability, disorientation, and dizziness. A novel computer-based rehabilitation system with a therapeutic gaming application has been developed. This method allows different gaze and head movement exercises to be coupled to a wide range of inexpensive, commercial computer games. It can be used while standing, so graded balance demands (e.g., standing on a sponge pad) can be incorporated into the program. A case series pre- and postintervention study was conducted of nine adults diagnosed with peripheral vestibular dysfunction who received a 12-week home rehabilitation program. The feasibility and usability of the home computer-based therapeutic program were established. Study findings revealed that using head rotation to interact with computer games, when coupled to demanding balance conditions, resulted in significant improvements in standing balance, dynamic visual acuity, gaze control, and walking performance. Perception of dizziness as measured by the Dizziness Handicap Inventory also decreased significantly. These preliminary findings suggest that a low-cost home game-based exercise program is well suited to training standing balance and gaze control (with active and passive head motion).

  17. Combined computational and biochemical study reveals the importance of electrostatic interactions between the "pH sensor" and the cation binding site of the sodium/proton antiporter NhaA of Escherichia coli.

    PubMed

    Olkhova, Elena; Kozachkov, Lena; Padan, Etana; Michel, Hartmut

    2009-08-15

    Sodium proton antiporters are essential enzymes that catalyze the exchange of sodium ions for protons across biological membranes. The crystal structure of NhaA has provided a basis to explore the mechanism of ion exchange and its unique regulation by pH. Here, the mechanism of the pH activation of the antiporter is investigated through functional and computational studies of several variants with mutations in the ion-binding site (D163, D164). The most significant difference found computationally between the wild-type antiporter and the active-site variants D163E and D164N is the low pKa value of Glu78, which makes them insensitive to pH. Although in the variant D163N the pKa of Glu78 is comparable to the physiological one, this variant does not exhibit the long-range electrostatic effect of Glu78 on the pH-dependent structural reorganization of transmembrane helix X and, hence, is proposed to be inactive. In marked contrast, variant D164E remains sensitive to pH and can be activated by an alkaline pH shift. Remarkably, as expected computationally and discovered here biochemically, D164E is viable and active in Na(+)/H(+) exchange, albeit with an increased apparent K(M). Our results unravel the unique electrostatic network of NhaA that connects the coupled clusters of the "pH sensor" with the binding site, which is crucial for pH activation of NhaA. 2009 Wiley-Liss, Inc.

  18. Computer-assisted intraosseous anaesthesia for molar and incisor hypomineralisation teeth. A preliminary study.

    PubMed

    Cabasse, C; Marie-Cousin, A; Huet, A; Sixou, J L

    2015-03-01

    Anesthetizing MIH (Molar and Incisor Hypomineralisation) teeth is one of the major challenges in paediatric dentistry. Computer-assisted IO injection (CAIO) of 4% articaine with 1:200,000 epinephrine (Alphacaine, Septodont) has been shown to be an efficient way to anesthetize teeth in children. The aim of this study was to assess the efficacy of this method with MIH teeth. This preliminary study was performed using the Quick Sleeper system (Dental Hi Tec, Cholet, France), which allows computer-controlled rotation of the needle to penetrate the bone and computer-controlled injection of the anaesthetic solution. Thirty-nine patients of the department of Paediatric Dentistry were included, allowing 46 sessions (including 32 mandibular first permanent molars) to be assessed. CAIO showed efficacy in 93.5% (43/46) of cases. Failures (3) were due to the inability to reach the spongy bone (1) or to achieve anaesthesia (2). This prospective study confirms that CAIO anaesthesia is a promising method to anesthetize teeth with MIH and could therefore be routinely used by trained practitioners.

  19. a Study of the Reconstruction of Accidents and Crime Scenes Through Computational Experiments

    NASA Astrophysics Data System (ADS)

    Park, S. J.; Chae, S. W.; Kim, S. H.; Yang, K. M.; Chung, H. S.

    Recently, with an increase in the number of studies of the safety of both pedestrians and passengers, computer software, such as MADYMO, Pam-crash, and LS-dyna, has been providing human models for computer simulation. Although such programs have been applied to make machines beneficial for humans, studies that analyze the reconstruction of accidents or crime scenes are rare. Therefore, through computational experiments, the present study presents reconstructions of two questionable accidents. In the first case, a car fell off the road and the driver was separated from it. The accident investigator was very confused because some circumstantial evidence suggested the possibility that the driver was murdered. In the second case, a woman died in her house and the police suspected foul play with her boyfriend as a suspect. These two cases were reconstructed using the human model in MADYMO software. The first case was eventually confirmed as a traffic accident in which the driver bounced out of the car when the car fell off, and the second case was proved to be suicide rather than homicide.

  20. A Computational and Experimental Study of Resonators in Three Dimensions

    NASA Technical Reports Server (NTRS)

    Tam, C. K. W.; Ju, H.; Jones, Michael G.; Watson, Willie R.; Parrott, Tony L.

    2009-01-01

    In a previous work by the present authors, a computational and experimental investigation of the acoustic properties of two-dimensional slit resonators was carried out. The present paper reports the results of a study extending the previous work to three dimensions. This investigation has two basic objectives. The first is to validate the computed results from direct numerical simulations of the flow and acoustic fields of slit resonators in three dimensions by comparing with experimental measurements in a normal incidence impedance tube. The second objective is to study the flow physics of resonant liners responsible for sound wave dissipation. Extensive comparisons are provided between computed and measured acoustic liner properties with both discrete frequency and broadband sound sources. Good agreements are found over a wide range of frequencies and sound pressure levels. Direct numerical simulation confirms the previous finding in two dimensions that vortex shedding is the dominant dissipation mechanism at high sound pressure intensity. However, it is observed that the behavior of the shed vortices in three dimensions is quite different from those of two dimensions. In three dimensions, the shed vortices tend to evolve into ring (circular in plan form) vortices, even though the slit resonator opening from which the vortices are shed has an aspect ratio of 2.5. Under the excitation of discrete frequency sound, the shed vortices align themselves into two regularly spaced vortex trains moving away from the resonator opening in opposite directions. This is different from the chaotic shedding of vortices found in two-dimensional simulations. The effect of slit aspect ratio at a fixed porosity is briefly studied. For the range of liners considered in this investigation, it is found that the absorption coefficient of a liner increases when the open area of the single slit is subdivided into multiple, smaller slits.

  1. Exploring the Perceptions of College Instructors towards Computer Simulation Software Programs: A Quantitative Study

    ERIC Educational Resources Information Center

    Punch, Raymond J.

    2012-01-01

    The purpose of the quantitative regression study was to explore and to identify relationships between attitudes toward use and perceptions of value of computer-based simulation programs, of college instructors, toward computer based simulation programs. A relationship has been reported between attitudes toward use and perceptions of the value of…

  2. NASA Computational Case Study SAR Data Processing: Ground-Range Projection

    NASA Technical Reports Server (NTRS)

    Memarsadeghi, Nargess; Rincon, Rafael

    2013-01-01

    Radar technology is used extensively by NASA for remote sensing of the Earth and other planetary bodies. In this case study, we learn about different computational concepts for processing radar data. In particular, we learn how to correct a slanted radar image by projecting it onto the surface that was sensed by a radar instrument.

  3. Computational analysis of stochastic heterogeneity in PCR amplification efficiency revealed by single molecule barcoding

    PubMed Central

    Best, Katharine; Oakes, Theres; Heather, James M.; Shawe-Taylor, John; Chain, Benny

    2015-01-01

    The polymerase chain reaction (PCR) is one of the most widely used techniques in molecular biology. In combination with High Throughput Sequencing (HTS), PCR is widely used to quantify transcript abundance for RNA-seq, and in the analysis of T and B cell receptor repertoires. In this study, we combine DNA barcoding with HTS to quantify PCR output from individual target molecules. We develop computational tools that simulate both the PCR branching process itself, and the subsequent subsampling which typically occurs during HTS. We explore the influence of different types of heterogeneity on sequencing output, and compare them to experimental results where the efficiency of amplification is measured by barcodes uniquely identifying each molecule of starting template. Our results demonstrate that the PCR process introduces substantial amplification heterogeneity, independent of primer sequence and bulk experimental conditions. This heterogeneity can be attributed both to inherited differences between different template DNA molecules, and to the inherent stochasticity of the PCR process. The results demonstrate that PCR heterogeneity arises even when reaction and substrate conditions are kept as constant as possible, and therefore single molecule barcoding is essential in order to derive reproducible quantitative results from any protocol combining PCR with HTS. PMID:26459131
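    The branching-process-plus-subsampling model described above can be sketched in a few lines of Python. The efficiency values, cycle counts, and read depth below are illustrative assumptions, not the paper's fitted parameters.

```python
import random
import statistics

def simulate_pcr(n_templates, n_cycles, efficiency, seed=1):
    """Galton-Watson-style PCR: each copy of a molecule duplicates with
    probability `efficiency` in every cycle. A per-template efficiency is
    drawn once, modelling inherited differences between template molecules."""
    rng = random.Random(seed)
    efficiencies = [min(1.0, max(0.0, rng.gauss(efficiency, 0.05)))
                    for _ in range(n_templates)]
    counts = [1] * n_templates
    for _ in range(n_cycles):
        counts = [c + sum(rng.random() < e for _ in range(c))
                  for c, e in zip(counts, efficiencies)]
    return counts

def subsample_reads(counts, n_reads, seed=2):
    """HTS subsampling: draw reads from the pooled amplicons, then count
    how many reads each barcode (starting molecule) received."""
    rng = random.Random(seed)
    pool = [i for i, c in enumerate(counts) for _ in range(c)]
    reads = [rng.choice(pool) for _ in range(n_reads)]
    return [reads.count(i) for i in range(len(counts))]

copies = simulate_pcr(n_templates=50, n_cycles=15, efficiency=0.8)
reads = subsample_reads(copies, n_reads=5000)
cv = statistics.stdev(copies) / statistics.mean(copies)
print(f"mean copies: {statistics.mean(copies):.0f}, coefficient of variation: {cv:.2f}")
```

    Even with identical bulk conditions, the per-barcode read counts spread widely, which is the amplification heterogeneity the study quantifies.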

  4. Circular RNA profile in gliomas revealed by identification tool UROBORUS.

    PubMed

    Song, Xiaofeng; Zhang, Naibo; Han, Ping; Moon, Byoung-San; Lai, Rose K; Wang, Kai; Lu, Wange

    2016-05-19

    Recent evidence suggests that many endogenous circular RNAs (circRNAs) may play roles in biological processes. However, the expression patterns and functions of circRNAs in human diseases are not well understood. Computationally identifying circRNAs from total RNA-seq data is a primary step in studying their expression pattern and biological roles. In this work, we have developed a computational pipeline named UROBORUS to detect circRNAs in total RNA-seq data. By applying UROBORUS to RNA-seq data from 46 gliomas and normal brain samples, we detected thousands of circRNAs supported by at least two read counts, followed by successful experimental validation of 24 of 27 randomly selected circRNAs. UROBORUS is an efficient tool that can detect circRNAs with low expression levels in total RNA-seq without RNase R treatment. The circRNA expression profiling revealed more than 476 circular RNAs differentially expressed in control brain tissues and gliomas. Together with parental gene expression, we found that circRNAs and their parental genes have diversified expression patterns in gliomas and control brain tissues. This study establishes an efficient and sensitive approach for predicting circRNAs using total RNA-seq data. The UROBORUS pipeline can be accessed freely for non-commercial purposes at http://uroborus.openbioinformatics.org/. © The Author(s) 2016. Published by Oxford University Press on behalf of Nucleic Acids Research.
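    The core signal such circRNA detectors look for can be illustrated with a minimal check; this is not the UROBORUS algorithm itself (which adds anchor extension and filtering steps), only the defining geometry of a back-splice junction: a split read supports a circular transcript when its 5' half maps downstream of its 3' half on the same chromosome and strand.

```python
def is_backsplice_candidate(seg5, seg3):
    """A read split into two segments supports a back-splice junction when
    both halves map to the same chromosome and strand, but the half from
    the read's 5' end maps *downstream* of the half from its 3' end - the
    hallmark of a circular rather than linear transcript. Segments are
    (chrom, strand, start, end) tuples. Illustrative check only."""
    c5, s5, start5, _ = seg5
    c3, s3, _, end3 = seg3
    return c5 == c3 and s5 == s3 and start5 > end3

# Linear read: 5' half at 1,000-1,100, 3' half further downstream.
linear = is_backsplice_candidate(("chr1", "+", 1000, 1100), ("chr1", "+", 5200, 5300))
# Circular candidate: 5' half maps downstream of the 3' half.
circular = is_backsplice_candidate(("chr1", "+", 5200, 5300), ("chr1", "+", 1000, 1100))
print(linear, circular)  # False True
```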

  5. Quantitative computational infrared imaging of buoyant diffusion flames

    NASA Astrophysics Data System (ADS)

    Newale, Ashish S.

    Studies of infrared radiation from turbulent buoyant diffusion flames impinging on structural elements have applications to the development of fire models. A numerical and experimental study of radiation from buoyant diffusion flames with and without impingement on a flat plate is reported. Quantitative images of the radiation intensity from the flames are acquired using a high-speed infrared camera. Large eddy simulations are performed using the Fire Dynamics Simulator (FDS version 6). The species concentrations and temperatures from the simulations are used in conjunction with a narrow-band radiation model (RADCAL) to solve the radiative transfer equation. The computed infrared radiation intensities are rendered in the form of images and compared with the measurements. The measured and computed radiation intensities reveal necking and bulging with a characteristic frequency of 7.1 Hz, which is in agreement with previous empirical correlations. The results demonstrate the effects of the stagnation point boundary layer on the upstream buoyant shear layer. The coupling between these two shear layers presents a model problem for the sub-grid scale modeling necessary for future large eddy simulations.
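    The line-of-sight step described above (combining simulated temperature and species fields with a band model to solve the radiative transfer equation) can be sketched for the no-scattering, gray-gas case. The cell values below are invented for illustration; a narrow-band model such as RADCAL would supply spectral absorption coefficients in place of the single coefficient used here.

```python
import math

def line_of_sight_intensity(cells):
    """March the no-scattering radiative transfer equation along one ray:
        I_out = I_in * exp(-k * ds) + B(T) * (1 - exp(-k * ds)),
    where each cell attenuates the incoming intensity and adds its own
    emission. `cells` holds (absorption_coeff_per_m, blackbody_intensity,
    path_length_m) tuples ordered from the far side of the flame toward
    the camera. Gray-gas sketch only."""
    intensity = 0.0
    for k, planck, ds in cells:
        tau = math.exp(-k * ds)  # transmissivity of this cell
        intensity = intensity * tau + planck * (1.0 - tau)
    return intensity

# Hypothetical three-cell ray through a flame (illustrative values only).
ray = [(0.4, 12.0e3, 0.05), (0.9, 30.0e3, 0.05), (0.3, 8.0e3, 0.05)]
print(f"camera-plane intensity: {line_of_sight_intensity(ray):.0f} W/(m^2 sr)")
```

    Repeating this march for every camera pixel yields the synthetic infrared image that is compared against the measured one.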

  6. A Computational Study of the Flow Physics of Acoustic Liners

    NASA Technical Reports Server (NTRS)

    Tam, Christopher

    2006-01-01

    The present investigation is a continuation of a previous joint project between Florida State University and the NASA Langley Research Center Liner Physics Team. In the previous project, a study of acoustic liners, in two dimensions, inside a normal incidence impedance tube was carried out. The study consisted of two parts. The NASA team was responsible for the experimental part of the project. This involved performing measurements in an impedance tube with a large-aspect-ratio slit resonator. The FSU team was responsible for the computational part of the project. This involved performing direct numerical simulation (DNS) of the NASA experiment in two dimensions using CAA methodology. It was agreed that upon completion of the numerical simulation, the computed values of the liner impedance were to be sent to NASA for validation against experimental results. Following this procedure, good agreement was found between numerical results and experimental measurements over a wide range of frequencies and sound pressure levels. Broadband incident sound waves were also simulated numerically and measured experimentally. Overall, good agreement was also found.

  7. Elucidation of the Chromatographic Enantiomer Elution Order Through Computational Studies.

    PubMed

    Sardella, Roccaldo; Ianni, Federica; Macchiarulo, Antonio; Pucciarini, Lucia; Carotti, Andrea; Natalini, Benedetto

    2018-01-01

    During the last twenty years, interest in the development of chiral compounds has increased exponentially. Indeed, the set-up of suitable asymmetric enantioselective synthesis protocols is currently one of the focuses of many pharmaceutical research projects. In this scenario, chiral HPLC separations have gained great importance as well, both for analytical- and preparative-scale applications, the latter devoted to the quantitative isolation of enantiopure compounds. Molecular modelling and quantum chemistry methods can be fruitfully applied to solve chirality-related problems, especially when enantiomerically pure reference standards are missing. In this framework, with the aim to explain the molecular basis of the enantioselective retention, we performed computational studies to rationalize the enantiomer elution order with both low- and high-molecular-weight chiral selectors. Semi-empirical and quantum mechanical computational procedures were successfully applied in the domains of chiral ligand-exchange and chiral ion-exchange chromatography, as well as in studies dealing with the use of polysaccharide-based enantioresolving materials. Copyright© Bentham Science Publishers; For any queries, please email at epub@benthamscience.org.

  8. Large Scale Computing and Storage Requirements for High Energy Physics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gerber, Richard A.; Wasserman, Harvey

    2010-11-24

    The National Energy Research Scientific Computing Center (NERSC) is the leading scientific computing facility for the Department of Energy's Office of Science, providing high-performance computing (HPC) resources to more than 3,000 researchers working on about 400 projects. NERSC provides large-scale computing resources and, crucially, the support and expertise needed for scientists to make effective use of them. In November 2009, NERSC, DOE's Office of Advanced Scientific Computing Research (ASCR), and DOE's Office of High Energy Physics (HEP) held a workshop to characterize the HPC resources needed at NERSC to support HEP research through the next three to five years. The effort is part of NERSC's legacy of anticipating users' needs and deploying resources to meet those demands. The workshop revealed several key points, in addition to achieving its goal of collecting and characterizing computing requirements. The chief findings: (1) Science teams need access to a significant increase in computational resources to meet their research goals; (2) Research teams need to be able to read, write, transfer, store online, archive, analyze, and share huge volumes of data; (3) Science teams need guidance and support to implement their codes on future architectures; and (4) Projects need predictable, rapid turnaround of their computational jobs to meet mission-critical time constraints. This report expands upon these key points and includes others. It also presents a number of case studies as representative of the research conducted within HEP. Workshop participants were asked to codify their requirements in this case study format, summarizing their science goals, methods of solution, current and three-to-five year computing requirements, and software and support needs. Participants were also asked to describe their strategy for computing in the highly parallel, multi-core environment that is expected to dominate HPC architectures over the next few years. The report

  9. Computer Science Lesson Study: Building Computing Skills among Elementary School Teachers

    ERIC Educational Resources Information Center

    Newman, Thomas R.

    2017-01-01

    The lack of diversity in the technology workforce in the United States has proven to be a stubborn problem, resisting even the most well-funded reform efforts. With the absence of computer science education in the mainstream K-12 curriculum, only a narrow band of students in public schools go on to careers in technology. The problem persists…

  10. The Effect of Computer Literacy Course on Students' Attitudes toward Computer Applications

    ERIC Educational Resources Information Center

    Erlich, Zippy; Gadot, Rivka; Shahak, Daphna

    2009-01-01

    Studies indicate that the use of technologies as teaching aids and tools for self-study is influenced by students' attitudes toward computers and their applications. The purpose of this study is to determine whether taking a Computer Literacy and Applications (CLA) course has an impact on students' attitudes toward computer applications, across…

  11. "Happiness Inventors": Informing Positive Computing Technologies Through Participatory Design With Children.

    PubMed

    Yarosh, Svetlana; Schueller, Stephen Matthew

    2017-01-17

    Positive psychological interventions for children have typically focused on direct adaptations of interventions developed for adults. As the community moves toward designing positive computing technologies to support child well-being, it is important to use a more participatory process that directly engages children's voices. Our objectives were, through a participatory design study, to understand children's interpretations of positive psychology concepts, as well as their perspectives on technologies that are best suited to enhance their engagement with practice of well-being skills. We addressed these questions through a content analysis of 434 design ideas, 51 sketches, and 8 prototypes and videos, which emerged from a 14-session cooperative inquiry study with 12 child "happiness inventors." The study was part of a summer learning camp held at the children's middle school, which focused on teaching the invention process, teaching well-being skills drawn from positive psychology and related areas (gratitude, mindfulness, and problem solving), and iterating design ideas for technologies to support these skills. The children's ideas and prototypes revealed specific facets of how they interpreted gratitude (as thanking, being positive, and doing good things), mindfulness (as externally representing thought and emotions, controlling those thoughts and emotions, getting through unpleasant things, and avoiding forgetting something), and problem solving (as preventing bad decisions, seeking alternative solutions, and not dwelling on unproductive thoughts). This process also revealed that children emphasized particular technologies in their solutions. While desktop or laptop solutions were notably lacking, other ideas were roughly evenly distributed between mobile apps and embodied computing technologies (toys, wearables, etc). We also report on desired functionalities and approaches to engagement in the children's ideas, such as a notable emphasis on representing and

  12. Evolutionary Meta-Analysis of Association Studies Reveals Ancient Constraints Affecting Disease Marker Discovery

    PubMed Central

    Dudley, Joel T.; Chen, Rong; Sanderford, Maxwell; Butte, Atul J.; Kumar, Sudhir

    2012-01-01

    Genome-wide disease association studies contrast genetic variation between disease cohorts and healthy populations to discover single nucleotide polymorphisms (SNPs) and other genetic markers revealing underlying genetic architectures of human diseases. Despite scores of efforts over the past decade, many reproducible genetic variants that explain substantial proportions of the heritable risk of common human diseases remain undiscovered. We have conducted a multispecies genomic analysis of 5,831 putative human risk variants for more than 230 disease phenotypes reported in 2,021 studies. We find that current approaches show a propensity for discovering disease-associated SNPs (dSNPs) at conserved genomic positions, because the effect size (odds ratio) and allelic P value of genetic association of an SNP relate strongly to the evolutionary conservation of its genomic position. We propose a new measure for ranking SNPs that integrates evolutionary conservation scores and the P value (E-rank). Using published data from a large case-control study, we demonstrate that the E-rank method prioritizes SNPs with a greater likelihood of bona fide and reproducible genetic disease associations, many of which may explain greater proportions of genetic variance. Therefore, long-term evolutionary histories of genomic positions offer key practical utility in reassessing data from existing disease association studies, and in the design and analysis of future studies aimed at revealing the genetic basis of common human diseases. PMID:22389448
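    The idea of blending association strength with positional conservation can be sketched as below. The scoring function is a hypothetical illustration, not the published E-rank formula, and the SNP records are invented.

```python
import math

def combined_score(p_value, conservation, weight=1.0):
    """Illustrative combined score (not the published E-rank formula):
    association evidence (-log10 p) scaled up by the evolutionary
    conservation (0..1) of the SNP's genomic position."""
    return -math.log10(p_value) * (1.0 + weight * conservation)

# Hypothetical SNPs: (id, association p-value, conservation score).
snps = [
    ("rsA", 1e-6, 0.90),  # strong association at a highly conserved position
    ("rsB", 1e-7, 0.10),  # stronger p-value, but poorly conserved position
    ("rsC", 1e-3, 0.95),
]
ranked = sorted(snps, key=lambda s: combined_score(s[1], s[2]), reverse=True)
for rsid, p, cons in ranked:
    print(rsid, f"score={combined_score(p, cons):.1f}")
```

    Note how rsA outranks rsB despite its weaker p-value, mirroring the abstract's point that conservation reorders candidates relative to a pure p-value ranking.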

  13. Parametric Study of a YAV-8B Harrier in Ground Effect Using Time-Dependent Navier-Stokes Computations

    NASA Technical Reports Server (NTRS)

    Pandya, Shishir; Chaderjian, Neal; Ahmad, Jasim; Kwak, Dochan (Technical Monitor)

    2001-01-01

    Flow simulations using the time-dependent Navier-Stokes equations remain a challenge for several reasons. Principal among them are the difficulty of accurately modeling complex flows and the time needed to perform the computations. A parametric study of such complex problems is not considered practical due to the large cost associated with computing many time-dependent solutions. The computation time for each solution must be reduced in order to make a parametric study possible. With a successful reduction of computation time, the issues of accuracy and appropriateness of turbulence models will become more tractable.

  14. Evaluation of the setup margins for cone beam computed tomography–guided cranial radiosurgery: A phantom study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Calvo Ortega, Juan Francisco, E-mail: jfcdrr@yahoo.es; Wunderink, Wouter; Delgado, David

    The aim of this study is to evaluate the setup margins from the clinical target volume (CTV) to the planning target volume (PTV) for cranial stereotactic radiosurgery (SRS) treatments guided by cone beam computed tomography (CBCT). We designed an end-to-end (E2E) test using a skull phantom with an embedded 6-mm tungsten ball (target). A noncoplanar plan was computed (E2E plan) to irradiate the target. The CBCT-guided positioning of the skull phantom on the linac was performed. Megavoltage portal images were acquired after 15 independent deliveries of the E2E plan. The 2-dimensional (2D) displacement vector between the centers of the square field and the ball target on each portal image was used to quantify the isocenter accuracy. Geometrical margins in each patient direction (left-right or LR, anterior-posterior or AP, superior-inferior or SI) were calculated. Dosimetric validation of the margins was performed in 5 real SRS cases: 3-dimensional (3D) isocenter deviations were mimicked, and changes in CTV dose coverage and organs-at-risk (OAR) dosage were analyzed. CTV-PTV margins of 1.1 mm in the LR direction and 0.7 mm in the AP and SI directions were derived from the E2E tests. The dosimetric analysis revealed that a 1-mm uniform margin was sufficient to ensure CTV dose coverage without compromising the OAR dose tolerances. The effect of isocenter uncertainty has been estimated to be 1 mm in our CBCT-guided SRS approach.
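    A margin recipe of the kind described above can be sketched from repeated displacement measurements. Both the displacement data and the systematic-plus-two-standard-deviations formula below are illustrative assumptions; the abstract does not state the study's exact margin calculation.

```python
import statistics

def axis_margin(displacements_mm):
    """One illustrative per-axis margin recipe: the magnitude of the
    systematic (mean) offset plus two standard deviations of the random
    component. (Assumption - the study's exact formula is not given.)"""
    systematic = abs(statistics.mean(displacements_mm))
    random_sd = statistics.stdev(displacements_mm)
    return systematic + 2.0 * random_sd

# Hypothetical LR-axis target displacements (mm) from 15 repeated E2E deliveries.
lr = [0.3, -0.2, 0.4, 0.1, -0.3, 0.2, 0.0, 0.3,
      -0.1, 0.2, 0.4, -0.2, 0.1, 0.0, 0.3]
print(f"LR margin: {axis_margin(lr):.1f} mm")
```

    Repeating the calculation for the AP and SI displacement series would give the per-direction margins reported in the study.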

  15. Studying Computer Science in a Multidisciplinary Degree Programme: Freshman Students' Orientation, Knowledge, and Background

    ERIC Educational Resources Information Center

    Kautz, Karlheinz; Kofoed, Uffe

    2004-01-01

    Teachers at universities are facing an increasing disparity in students' prior IT knowledge and, at the same time, experience a growing disengagement of the students with regard to involvement in study activities. As computer science teachers in a joint programme in computer science and business administration, we made a number of similar…

  16. Thyroid Hormone Indices in Computer Workers with Emphasis on the Role of Zinc Supplementation.

    PubMed

    Amin, Ahmed Ibrahim; Hegazy, Noha Mohamed; Ibrahim, Khadiga Salah; Mahdy-Abdallah, Heba; Hammouda, Hamdy A A; Shaban, Eman Essam

    2016-06-15

    This study aimed to investigate the effects of computer monitor-emitted radiation on thyroid hormones and the possible protective role of zinc supplementation. The study included three groups. The first group (Group B) consisted of 42 computer workers. This group was given zinc supplementation in the form of one tablet daily for eight weeks. The second group (Group A) comprised the same 42 computer workers after zinc supplementation. A group of 63 subjects whose jobs do not entail computer use was recruited as a control group (Group C). All participants completed a questionnaire including detailed medical and occupational histories, and were subjected to a full clinical examination. Thyroid stimulating hormone (TSH), free triiodothyronine (FT3), free thyroxine (FT4) and zinc levels were measured in all participants. TSH, FT3, FT4 and zinc concentrations were significantly decreased in Group B relative to Group C. In Group A, all tested parameters improved when compared with Group B. The obtained results revealed that radiation emitted from computers led to changes in TSH and thyroid hormones (FT3 and FT4) in the workers. The improvement after supplementation suggests that zinc can ameliorate the hazards of such radiation on thyroid hormone indices.

  17. Comprehensive Materials and Morphologies Study of Ion Traps (COMMIT) for Scalable Quantum Computation

    DTIC Science & Technology

    2012-04-21

    … the photoelectric effect. The typical shortest wavelengths needed for ion traps range from 194 nm for Hg+ to 493 nm for Ba+, corresponding to 6.4-2.5… Trapped ion systems are extremely promising for large-scale quantum computation, but face a vexing problem with motional quantum…

  18. LHC@Home: a BOINC-based volunteer computing infrastructure for physics studies at CERN

    NASA Astrophysics Data System (ADS)

    Barranco, Javier; Cai, Yunhai; Cameron, David; Crouch, Matthew; Maria, Riccardo De; Field, Laurence; Giovannozzi, Massimo; Hermes, Pascal; Høimyr, Nils; Kaltchev, Dobrin; Karastathis, Nikos; Luzzi, Cinzia; Maclean, Ewen; McIntosh, Eric; Mereghetti, Alessio; Molson, James; Nosochkov, Yuri; Pieloni, Tatiana; Reid, Ivan D.; Rivkin, Lenny; Segal, Ben; Sjobak, Kyrre; Skands, Peter; Tambasco, Claudia; Veken, Frederik Van der; Zacharov, Igor

    2017-12-01

    The LHC@Home BOINC project has provided computing capacity for numerical simulations to researchers at CERN since 2004, and has since 2011 been expanded with a wider range of applications. The traditional CERN accelerator physics simulation code SixTrack enjoys continuing volunteer support, and, thanks to virtualisation, a number of applications from the LHC experiment collaborations and particle theory groups have joined the consolidated LHC@Home BOINC project. This paper addresses the challenges related to traditional and virtualized applications in the BOINC environment, and how volunteer computing has been integrated into the overall computing strategy of the laboratory through the consolidated LHC@Home service. Thanks to the computing power provided by volunteers joining LHC@Home, numerous accelerator beam physics studies have been carried out, yielding an improved understanding of charged particle dynamics in the CERN Large Hadron Collider (LHC) and its future upgrades. The main results are highlighted in this paper.

  19. An esthetics rehabilitation with computer-aided design/ computer-aided manufacturing technology.

    PubMed

    Mazaro, José Vitor Quinelli; de Mello, Caroline Cantieri; Zavanelli, Adriana Cristina; Santiago, Joel Ferreira; Amoroso, Andressa Paschoal; Pellizzer, Eduardo Piza

    2014-07-01

    This paper describes a case of rehabilitation involving the Computer Aided Design/Computer Aided Manufacturing (CAD-CAM) system in implant-supported and dental-supported prostheses using zirconia as the framework. The CAD-CAM technology has developed considerably over the last few years, becoming a reality in dental practice. Among the widely used systems are those based on zirconia, which demonstrate important physical and mechanical properties of high strength, adequate fracture toughness, biocompatibility and esthetics, and are indicated for unitary prosthetic restorations and posterior and anterior frameworks. All the modeling was performed using the CAD-CAM system, and the prostheses were cemented using the resin cement best suited for each situation. The rehabilitation of the maxillary arch using a zirconia framework demonstrated satisfactory esthetic and functional results after a 12-month follow-up and revealed no biological or technical complications. This article shows the importance of using CAD/CAM technology in the manufacture of dental and implant-supported prostheses.

  20. A Study To Increase Computer Applications in Social Work Management.

    ERIC Educational Resources Information Center

    Lucero, John A.

    The purpose of this study was to address the use of computers in social work practice and to survey the field for tools, concepts, and trends that could assist social workers in their practice. In addition to a review of the relevant literature, information was requested from the Social Work Service and Ambulatory Care Database Section at Walter…