Sample records for previous computational studies

  1. Computer games: a double-edged sword?

    PubMed

    Sun, De-Lin; Ma, Ning; Bao, Min; Chen, Xiang-Chuan; Zhang, Da-Ren

    2008-10-01

    Excessive computer game playing (ECGP) has already become a serious social problem. However, limited data from experimental lab studies are available about the negative consequences of ECGP on players' cognitive characteristics. In the present study, we compared three groups of participants (current ECGP participants, previous ECGP participants, and control participants) on a Multiple Object Tracking (MOT) task. The previous ECGP participants performed significantly better than the control participants, which suggested a facilitation effect of computer games on visuospatial abilities. Moreover, the current ECGP participants performed significantly worse than the previous ECGP participants. More importantly, this finding indicates that ECGP may be related to cognitive deficits. Implications of this study are discussed.

  2. Computational Fluid Dynamics at ICMA (Institute for Computational Mathematics and Applications)

    DTIC Science & Technology

    1988-10-18

    PERSONAL AUTHOR(S): Charles A. Hall and Thomas A. Porsching...of ten ICMA (Institute for Computational Mathematics and Applications) personnel, relating to the general area of computational fluid mechanics...questions raised in the previous subsection. Our previous work in this area concentrated on a study of the differential geometric aspects of the problem

  3. Approaching Gender Parity: Women in Computer Science at Afghanistan's Kabul University

    ERIC Educational Resources Information Center

    Plane, Jandelyn

    2010-01-01

    This study explores the representation of women in computer science at the tertiary level through data collected about undergraduate computer science education at Kabul University in Afghanistan. Previous studies have theorized reasons for underrepresentation of women in computer science, and while many of these reasons are indeed present in…

  4. Subjective Norms Predicting Computer Use.

    ERIC Educational Resources Information Center

    Marcinkiewicz, Henryk R.

    This study was part of a series of studies examining the relationship of teacher variables to teachers' adoption of computer use. Previous studies have considered computer use as a process of the adoption of innovation and as a result of the influence of the internal variables of the person. This study adds the variable of subjective norms because…

  5. Coupled Aerodynamic and Structural Sensitivity Analysis of a High-Speed Civil Transport

    NASA Technical Reports Server (NTRS)

    Mason, B. H.; Walsh, J. L.

    2001-01-01

    An objective of the High Performance Computing and Communication Program at the NASA Langley Research Center is to demonstrate multidisciplinary shape and sizing optimization of a complete aerospace vehicle configuration by using high-fidelity, finite-element structural analysis and computational fluid dynamics aerodynamic analysis. In a previous study, a multi-disciplinary analysis system for a high-speed civil transport was formulated to integrate a set of existing discipline analysis codes, some of them computationally intensive. This paper is an extension of the previous study, in which the sensitivity analysis for the coupled aerodynamic and structural analysis problem is formulated and implemented. Uncoupled stress sensitivities computed with a constant load vector in a commercial finite element analysis code are compared to coupled aeroelastic sensitivities computed by finite differences. The computational expense of these sensitivity calculation methods is discussed.
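The record above compares analytic sensitivities against sensitivities computed by finite differences. As a minimal sketch of the finite-difference approach (with a hypothetical scalar response standing in for a full coupled aerostructural analysis):

```python
# Sketch of finite-difference sensitivity analysis. The response function is
# a hypothetical stand-in for a coupled analysis; it is not the paper's model.

def response(x):
    """Hypothetical coupled response (e.g., a stress) as a function of a
    design variable x (e.g., a skin thickness)."""
    return 1.0 / x + 0.5 * x**2

def forward_difference(f, x, h=1e-6):
    """Approximate df/dx with a forward difference. Each design variable
    costs one extra full analysis, which is why the record highlights the
    computational expense of this method."""
    return (f(x + h) - f(x)) / h

x0 = 2.0
fd = forward_difference(response, x0)
exact = -1.0 / x0**2 + x0      # analytic derivative, for comparison
print(fd, exact)               # the two agree to roughly the step size h
```

The trade-off the paper examines is exactly this: the finite-difference estimate needs only repeated calls to the (expensive) analysis, while analytic sensitivities need code-level access but avoid the extra solves.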

  6. Effect of computer game playing on baseline laparoscopic simulator skills.

    PubMed

    Halvorsen, Fredrik H; Cvancarova, Milada; Fosse, Erik; Mjåland, Odd

    2013-08-01

    Studies examining the possible association between computer game playing and laparoscopic performance in general have yielded conflicting results, and a relationship between computer game playing and baseline performance on laparoscopic simulators has not been established. The aim of this study was to examine the possible association between previous and present computer game playing and baseline performance on a virtual reality laparoscopic simulator in a sample of potential future medical students. The participating students completed a questionnaire covering the weekly amount and type of computer game playing activity during the previous year and 3 years ago. They then performed 2 repetitions of 2 tasks ("gallbladder dissection" and "traverse tube") on a virtual reality laparoscopic simulator. Performance on the simulator was then analyzed for association with their computer game experience. Local high school, Norway. Forty-eight students from 2 high school classes volunteered to participate in the study. No association between prior and present computer game playing and baseline performance was found. The results were similar both for prior and present action game playing and prior and present computer game playing in general. Our results indicate that prior and present computer game playing may not affect baseline performance in a virtual reality simulator.

  7. Computer Anxiety: Relationship to Math Anxiety and Holland Types.

    ERIC Educational Resources Information Center

    Bellando, Jayne; Winer, Jane L.

    Although the number of computers in the school system is increasing, many schools are not using computers to their capacity. One reason for this may be computer anxiety on the part of the teacher. A review of the computer anxiety literature reveals little information on the subject, and findings from previous studies suggest that basic controlled…

  8. Innovation and Integration: Case Studies of Effective Teacher Practices in the Use of Handheld Computers

    ERIC Educational Resources Information Center

    Chavez, Raymond Anthony

    2010-01-01

    Previous research conducted on the use of handheld computers in K-12 education has focused on how handheld computer use affects student motivation, engagement, and productivity. These four case studies sought to identify effective teacher practices in the integration of handhelds into the curriculum and the factors that affect those practices. The…

  9. A Validation Study of Student Differentiation between Computing Disciplines

    ERIC Educational Resources Information Center

    Battig, Michael; Shariq, Muhammad

    2011-01-01

    Using a previously published study of how students differentiate between computing disciplines, this study attempts to validate the original research and add further hypotheses regarding the type of institution in which the student resides. Using the identical survey instrument from the original study, students in smaller colleges and in different…

  10. A Computer Game-Based Method for Studying Bullying and Cyberbullying

    ERIC Educational Resources Information Center

    Mancilla-Caceres, Juan F.; Espelage, Dorothy; Amir, Eyal

    2015-01-01

    Even though previous studies have addressed the relation between face-to-face bullying and cyberbullying, none have studied both phenomena simultaneously. In this article, we present a computer game-based method to study both types of peer aggression among youth. Study participants included fifth graders (N = 93) in two U.S. Midwestern middle…

  11. Correcting Spellings in Second Language Learners' Computer-Assisted Collaborative Writing

    ERIC Educational Resources Information Center

    Musk, Nigel

    2016-01-01

    The present study uses multimodal conversation analysis to examine how pupils studying English as a foreign language make spelling corrections in real time while doing collaborative computer-assisted project work. Unlike most previous related investigations, this study focuses on the "process" rather than evaluating the final…

  12. 48 CFR 252.227-7028 - Technical data or computer software previously delivered to the government.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... software previously delivered to the government. 252.227-7028 Section 252.227-7028 Federal Acquisition... computer software previously delivered to the government. As prescribed in 227.7103-6(d), 227.7104(f)(2), or 227.7203-6(e), use the following provision: Technical Data or Computer Software Previously...

  13. 48 CFR 252.227-7028 - Technical data or computer software previously delivered to the government.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... software previously delivered to the government. 252.227-7028 Section 252.227-7028 Federal Acquisition... computer software previously delivered to the government. As prescribed in 227.7103-6(d), 227.7104(f)(2), or 227.7203-6(e), use the following provision: Technical Data or Computer Software Previously...

  14. 48 CFR 252.227-7028 - Technical data or computer software previously delivered to the government.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... software previously delivered to the government. 252.227-7028 Section 252.227-7028 Federal Acquisition... computer software previously delivered to the government. As prescribed in 227.7103-6(d), 227.7104(f)(2), or 227.7203-6(e), use the following provision: Technical Data or Computer Software Previously...

  15. 48 CFR 252.227-7028 - Technical data or computer software previously delivered to the government.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... software previously delivered to the government. 252.227-7028 Section 252.227-7028 Federal Acquisition... computer software previously delivered to the government. As prescribed in 227.7103-6(d), 227.7104(f)(2), or 227.7203-6(e), use the following provision: Technical Data or Computer Software Previously...

  16. 48 CFR 252.227-7028 - Technical data or computer software previously delivered to the government.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... software previously delivered to the government. 252.227-7028 Section 252.227-7028 Federal Acquisition... computer software previously delivered to the government. As prescribed in 227.7103-6(d), 227.7104(f)(2), or 227.7203-6(e), use the following provision: Technical Data or Computer Software Previously...

  17. A Reflective Study into Children's Cognition When Making Computer Games

    ERIC Educational Resources Information Center

    Allsop, Yasemin

    2016-01-01

    In this paper, children's mental activities when making digital games are explored. Where previous studies have mainly focused on children's learning, this study aimed to unfold the children's thinking process for learning when making computer games. As part of an ongoing larger scale study, which adopts an ethnographic approach, this research…

  18. Gender Digital Divide and Challenges in Undergraduate Computer Science Programs

    ERIC Educational Resources Information Center

    Stoilescu, Dorian; McDougall, Douglas

    2011-01-01

    Previous research revealed a reduced number of female students registered in computer science studies. In addition, the female students feel isolated, have reduced confidence, and underperform. This article explores differences between female and male students in undergraduate computer science programs in a mid-size university in Ontario. Based on…

  19. Chinese Herbal Medicine Meets Biological Networks of Complex Diseases: A Computational Perspective

    PubMed Central

    Gu, Shuo

    2017-01-01

    With the rapid development of cheminformatics, computational biology, and systems biology, great progress has been made recently in the computational research of Chinese herbal medicine with in-depth understanding towards pharmacognosy. This paper summarized these studies in the aspects of computational methods, traditional Chinese medicine (TCM) compound databases, and TCM network pharmacology. Furthermore, we chose arachidonic acid metabolic network as a case study to demonstrate the regulatory function of herbal medicine in the treatment of inflammation at network level. Finally, a computational workflow for the network-based TCM study, derived from our previous successful applications, was proposed. PMID:28690664

  20. Chinese Herbal Medicine Meets Biological Networks of Complex Diseases: A Computational Perspective.

    PubMed

    Gu, Shuo; Pei, Jianfeng

    2017-01-01

    With the rapid development of cheminformatics, computational biology, and systems biology, great progress has been made recently in the computational research of Chinese herbal medicine with in-depth understanding towards pharmacognosy. This paper summarized these studies in the aspects of computational methods, traditional Chinese medicine (TCM) compound databases, and TCM network pharmacology. Furthermore, we chose arachidonic acid metabolic network as a case study to demonstrate the regulatory function of herbal medicine in the treatment of inflammation at network level. Finally, a computational workflow for the network-based TCM study, derived from our previous successful applications, was proposed.

  1. Teachers' Attitudes to Using iPads or Tablet Computers; Implications for Developing New Skills, Pedagogies and School-Provided Support

    ERIC Educational Resources Information Center

    Young, Keith

    2016-01-01

    This study examined the attitudes of teachers towards using tablet computers, predominantly Apple's iPad, across 22 post primary-schools in Ireland. The study also questions some previous research and assumptions on the educational use of tablet computers. The majority of schools were using devices with students and teachers; the combined size of…

  2. The use of instant medical history in a rural clinic. Case study of the use of computers in an Arkansas physician's office.

    PubMed

    Pierce, B

    2000-05-01

    This study evaluated the acceptance of using computers to take a medical history by rural Arkansas patients. Sex, age, race, education, previous computer experience and owning a computer were used as variables. Patients were asked a series of questions to rate their comfort level with using a computer to take their medical history. Comfort ratings ranged from 30 to 45, with a mean of 36.8 (SEM = 0.67). Neither sex, race, age, education, owning a personal computer, nor prior computer experience had a significant effect on the comfort rating. This study helps alleviate one of the concerns--patient acceptance--about the increasing use of computers in practicing medicine.

  3. 5 CFR 839.1002 - Will OPM compute the lost earnings if my qualifying retirement coverage error was previously...

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 5 Administrative Personnel 2 2010-01-01 2010-01-01 false Will OPM compute the lost earnings if my... compute the lost earnings if my qualifying retirement coverage error was previously corrected and I made... coverage error was previously corrected, OPM will compute the lost earnings on your make-up contributions...

  4. Student Engagement with Computer-Generated Feedback: A Case Study

    ERIC Educational Resources Information Center

    Zhang, Zhe

    2017-01-01

    In order to benefit from feedback on their writing, students need to engage effectively with it. This article reports a case study on student engagement with computer-generated feedback, known as automated writing evaluation (AWE) feedback, in an EFL context. Differing from previous studies that explored commercially available AWE programs, this…

  5. Developing Computerized Tests for Classroom Teachers: A Pilot Study.

    ERIC Educational Resources Information Center

    Glowacki, Margaret L.; And Others

    Two types of computerized testing have been defined: (1) computer-based testing, using a computer to administer conventional tests in which all examinees take the same set of items; and (2) adaptive tests, in which items are selected for administration by the computer, based on the examinee's previous responses. This paper discusses an option for…

  6. An International Literature Review of 1:1 Computing in Schools

    ERIC Educational Resources Information Center

    Islam, M. Sirajul; Grönlund, Åke

    2016-01-01

    This paper is based on a systematic literature review relevant to classroom integration of computer technologies in schools. The purpose of this review is to gain an accumulated view of uses, impacts and implementations of 1:1 computing initiatives for school children. Unlike previous reviews this study is not limited to certain countries or…

  7. Do All Roads Lead to Rome? ("or" Reductions for Dummy Travelers)

    ERIC Educational Resources Information Center

    Kilpelainen, Pekka

    2010-01-01

    Reduction is a central ingredient of computational thinking, and an important tool in algorithm design, in computability theory, and in complexity theory. Reduction has been recognized to be a difficult topic for students to learn. Previous studies on teaching reduction have concentrated on its use in special courses on the theory of computing. As…

  8. Computer Literacy Education

    DTIC Science & Technology

    1989-01-01

    of the 33,000 schools that had not previously used computers began to do so. The proportion of elementary schools with 5 or more computers jumped...scale studies of primary and secondary education throughout the country, for the Federal government. In 1980, they found 15% of elementary schools and 50...of secondary schools offering instruction in the use of computers. By 1985, these figures climbed to 82% of elementary schools and 93% of secondary

  9. Supporting Students' Learning in the Domain of Computer Science

    ERIC Educational Resources Information Center

    Gasparinatou, Alexandra; Grigoriadou, Maria

    2011-01-01

    Previous studies have shown that students with low knowledge understand and learn better from more cohesive texts, whereas high-knowledge students have been shown to learn better from texts of lower cohesion. This study examines whether high-knowledge readers in computer science benefit from a text of low cohesion. Undergraduate students (n = 65)…

  10. Students' Technology Use and Its Effects on Peer Relationships, Academic Involvement, and Healthy Lifestyles

    ERIC Educational Resources Information Center

    Lloyd, Jan M.; Dean, Laura A.; Cooper, Diane L.

    2007-01-01

    The purpose of this study was to explore students' technology use and its relationship with their psychosocial development. Previous research explored students' computer use in conjunction with their cognitive development. This study examined the effects of computer use and other technologies, such as instant messaging, handheld gaming devices,…

  11. Software applications to three-dimensional visualization of forest landscapes -- A case study demonstrating the use of Visual Nature Studio (VNS) in visualizing fire spread in forest landscapes

    Treesearch

    Brian J. Williams; Bo Song; Chou Chiao-Ying; Thomas M. Williams; John Hom

    2010-01-01

    Three-dimensional (3D) visualization is a useful tool that depicts virtual forest landscapes on a computer. Previous studies in visualization have required high-end computer hardware and specialized technical skills. A virtual forest landscape can be used to show different effects of disturbances and management scenarios on a computer, which allows observation of forest...

  12. Convergence acceleration of the Proteus computer code with multigrid methods

    NASA Technical Reports Server (NTRS)

    Demuren, A. O.; Ibraheem, S. O.

    1992-01-01

    Presented here is the first part of a study to implement convergence acceleration techniques based on the multigrid concept in the Proteus computer code. A review is given of previous studies on the implementation of multigrid methods in computer codes for compressible flow analysis. Also presented is a detailed stability analysis of upwind and central-difference based numerical schemes for solving the Euler and Navier-Stokes equations. Results are given of a convergence study of the Proteus code on computational grids of different sizes. The results presented here form the foundation for the implementation of multigrid methods in the Proteus code.
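The multigrid concept the record describes can be illustrated with a textbook two-grid correction scheme for the 1D Poisson problem -u'' = f: smooth on the fine grid, solve the residual equation on a coarser grid, and interpolate the correction back. This is a generic sketch, not the Proteus implementation:

```python
import numpy as np

def jacobi_smooth(u, f, h, iters=3, w=2/3):
    """Weighted Jacobi smoothing for -u'' = f on the interior points."""
    for _ in range(iters):
        u[1:-1] = (1 - w) * u[1:-1] + w * 0.5 * (u[:-2] + u[2:] + h*h*f[1:-1])
    return u

def residual(u, f, h):
    r = np.zeros_like(u)
    r[1:-1] = f[1:-1] - (2*u[1:-1] - u[:-2] - u[2:]) / (h*h)
    return r

def two_grid(u, f, h):
    u = jacobi_smooth(u, f, h)          # pre-smooth: damp high frequencies
    r = residual(u, f, h)
    rc = r[::2].copy()                  # restrict residual by injection
    # Solve the coarse error equation A_c e_c = r_c directly (small system).
    n_c, hc = rc.size, 2*h
    A = (np.diag(2*np.ones(n_c-2)) - np.diag(np.ones(n_c-3), 1)
         - np.diag(np.ones(n_c-3), -1)) / (hc*hc)
    ec = np.zeros(n_c)
    ec[1:-1] = np.linalg.solve(A, rc[1:-1])
    # Prolongate the coarse correction by linear interpolation and apply it.
    e = np.interp(np.arange(u.size), np.arange(0, u.size, 2), ec)
    return jacobi_smooth(u + e, f, h)   # post-smooth

n = 65
h = 1.0 / (n - 1)
x = np.linspace(0, 1, n)
f = np.pi**2 * np.sin(np.pi * x)        # exact solution: sin(pi x)
u = np.zeros(n)
for _ in range(10):
    u = two_grid(u, f, h)
print(np.max(np.abs(u - np.sin(np.pi * x))))   # down at the discretization error
```

Recursing on the coarse solve instead of solving it directly turns this two-grid cycle into the full multigrid V-cycle whose convergence-acceleration properties the study investigates.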

  13. Using NCLab-karel to improve computational thinking skill of junior high school students

    NASA Astrophysics Data System (ADS)

    Kusnendar, J.; Prabawa, H. W.

    2018-05-01

    Increasing human interaction with technology and the growing complexity of the digital world make computer science education an interesting theme to study. Previous studies on Computer Literacy and Competency reveal that Indonesian teachers in general have fairly high computational skill, but their use of those skills is limited to a few applications, which results in minimal computer-related learning for their students. On the other hand, computer science education is often considered unrelated to real-world solutions. This paper addresses the utilization of NCLab-Karel in shaping computational thinking in students, which is believed to help students learn about technology. Implementation shows that Karel is able to increase student interest in studying computational material, especially algorithms. Observations made during the learning process also indicate the growth and development of a computing mindset in students.

  14. Can Designing Self-Representations through Creative Computing Promote an Incremental View of Intelligence and Enhance Creativity among At-Risk Youth?

    ERIC Educational Resources Information Center

    Blau, Ina; Benolol, Nurit

    2016-01-01

    Creative computing is one of the rapidly growing educational trends around the world. Previous studies have shown that creative computing can empower disadvantaged children and youth. At-risk youth tend to hold a negative view of self and perceive their abilities as inferior compared to "normative" pupils. The Implicit Theories of…

  15. Narrating Data Structures: The Role of Context in CS2

    ERIC Educational Resources Information Center

    Yarosh, Svetlana; Guzdial, Mark

    2008-01-01

    Learning computing with respect to the context of its use has been linked in previous reports to student motivation in introductory Computer Science (CS) courses. In this report, we consider the role of context in a second course. We present a case study of a CS2 data structures class that uses a media computation context. In this course, students…

  16. Applied Computational Electromagnetics Society Journal. Volume 7, Number 1, Summer 1992

    DTIC Science & Technology

    1992-01-01

    previously-solved computational problem in electrical engineering, physics, or related fields of study. The technical activities promoted by this...in solution technique or in data input/output; identification of new applications for electromagnetics modeling codes and techniques; integration of...papers will represent the computational electromagnetics aspects of research in electrical engineering, physics, or related disciplines. However, papers

  17. The Foundation and Development of Computer Assisted Instruction in the Field of Reading from Its Inception to the Present.

    ERIC Educational Resources Information Center

    Zuberman, Lea K.

    This critical review and evaluation of the literature covers the field of computer assisted instruction (CAI) and reading from its inception to the present day. Seventeen research studies are discussed as well as four surveys of previous research in this field. Major issues addressed include the effectiveness of CAI and computer managed…

  18. A Knowledge Engineering Approach to Developing Educational Computer Games for Improving Students' Differentiating Knowledge

    ERIC Educational Resources Information Center

    Hwang, Gwo-Jen; Sung, Han-Yu; Hung, Chun-Ming; Yang, Li-Hsueh; Huang, Iwen

    2013-01-01

    Educational computer games have been recognized as being a promising approach for motivating students to learn. Nevertheless, previous studies have shown that without proper learning strategies or supportive models, the learning achievement of students might not be as good as expected. In this study, a knowledge engineering approach is proposed…

  19. Mechanisms of Reference Frame Selection in Spatial Term Use: Computational and Empirical Studies

    ERIC Educational Resources Information Center

    Schultheis, Holger; Carlson, Laura A.

    2017-01-01

    Previous studies have shown that multiple reference frames are available and compete for selection during the use of spatial terms such as "above." However, the mechanisms that underlie the selection process are poorly understood. In the current paper we present two experiments and a comparison of three computational models of selection…

  20. Student Preferences toward Microcomputer User Interfaces.

    ERIC Educational Resources Information Center

    Hazari, Sunil I.; Reaves, Rita R.

    1994-01-01

    Describes a study of undergraduates that was conducted to determine students' preferences toward Graphical User Interface versus Command Line Interface during computer-assisted instruction. Previous experience, comfort level, performance scores, and student attitudes are examined and compared, and the computer use survey is appended. (Contains 13…

  1. Transient Three-Dimensional Side Load Analysis of Out-of-Round Film Cooled Nozzles

    NASA Technical Reports Server (NTRS)

    Wang, Ten-See; Lin, Jeff; Ruf, Joe; Guidos, Mike

    2010-01-01

    The objective of this study is to investigate the effect of nozzle out-of-roundness on the transient startup side loads. The out-of-roundness could be the result of asymmetric loads induced by hardware attached to the nozzle, asymmetric internal stresses induced by previous tests and/or deformation, such as creep, from previous tests. The rocket engine studied encompasses a regeneratively cooled thrust chamber and a film cooled nozzle extension with film coolant distributed from a turbine exhaust manifold. The computational methodology is based on an unstructured-grid, pressure-based computational fluid dynamics formulation, and a transient inlet history based on an engine system simulation. Transient startup computations were performed with the out-of-roundness achieved by four degrees of ovalization of the nozzle: one perfectly round, one slightly out-of-round, one more out-of-round, and one significantly out-of-round. The computed side load physics caused by the nozzle out-of-roundness and its effect on nozzle side load are reported and discussed.

  2. The study on the parallel processing based time series correlation analysis of RBC membrane flickering in quantitative phase imaging

    NASA Astrophysics Data System (ADS)

    Lee, Minsuk; Won, Youngjae; Park, Byungjun; Lee, Seungrag

    2017-02-01

    Not only static characteristics but also dynamic characteristics of the red blood cell (RBC) contain useful information for blood diagnosis. Quantitative phase imaging (QPI) can capture sample images with subnanometer-scale depth resolution and millisecond-scale temporal resolution. Various studies have used QPI for RBC diagnosis, and recently many have sought to reduce the processing time of RBC information extraction from QPI data with parallel computing algorithms; however, previous studies focused on static parameters such as cell morphology or simple dynamic parameters such as the root mean square (RMS) of the membrane fluctuations. Previously, we presented a practical blood test method using a time series correlation analysis of RBC membrane flickering with QPI. However, that method proved limited for clinical application because of its long computation time. In this study, we present an accelerated time series correlation analysis of RBC membrane flickering using a parallel computing algorithm. This method produced fractal scaling exponents for the surrounding medium and normal RBCs consistent with our previous research.
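A time-series correlation analysis of the kind the record describes can be sketched as a normalized autocorrelation of the membrane-height signal, computed for every pixel at once by vectorizing over the pixel axis. This is a generic stand-in for the authors' parallel implementation; the synthetic data and names below are illustrative:

```python
import numpy as np

# Synthetic flicker signal: one height time series per pixel (illustrative
# stand-in for QPI membrane-height maps).
rng = np.random.default_rng(0)
frames, pixels = 512, 100
heights = rng.normal(size=(frames, pixels))

def autocorrelation(x, max_lag):
    """Normalized autocorrelation along axis 0, computed for every column
    (pixel) of x simultaneously via NumPy vectorization."""
    x = x - x.mean(axis=0)
    var = (x * x).mean(axis=0)
    return np.array([(x[:len(x) - k] * x[k:]).mean(axis=0) / var
                     for k in range(max_lag)])

ac = autocorrelation(heights, max_lag=20)
print(ac.shape)    # (20, 100): one correlation curve per pixel
print(ac[0])       # lag-0 correlation is exactly 1 for every pixel
```

Vectorizing over pixels (or distributing pixel columns across worker processes or a GPU) is the usual way such per-pixel correlation analyses are accelerated, which matches the parallel-processing motivation stated in the record.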

  3. Hispanic women overcoming deterrents to computer science: A phenomenological study

    NASA Astrophysics Data System (ADS)

    Herling, Lourdes

    The products of computer science are important to all aspects of society and are tools in the solution of the world's problems. It is, therefore, troubling that the United States faces a shortage in qualified graduates in computer science. The number of women and minorities in computer science is significantly lower than the percentage of the U.S. population which they represent. The overall enrollment in computer science programs has continued to decline with the enrollment of women declining at a higher rate than that of men. This study addressed three aspects of underrepresentation about which there has been little previous research: addressing computing disciplines specifically rather than embedding them within the STEM disciplines, what attracts women and minorities to computer science, and addressing the issues of race/ethnicity and gender in conjunction rather than in isolation. Since women of underrepresented ethnicities are more severely underrepresented than women in general, it is important to consider whether race and ethnicity play a role in addition to gender as has been suggested by previous research. Therefore, this study examined what attracted Hispanic women to computer science specifically. The study determines whether being subjected to multiple marginalizations---female and Hispanic---played a role in the experiences of Hispanic women currently in computer science. The study found five emergent themes within the experiences of Hispanic women in computer science. Encouragement and role models strongly influenced not only the participants' choice to major in the field, but to persist as well. Most of the participants experienced a negative atmosphere and feelings of not fitting in while in college and industry. The interdisciplinary nature of computer science was the most common aspect that attracted the participants to computer science. 
The aptitudes participants commonly believed are needed for success in computer science are the twenty-first-century skills of problem solving, creativity, and critical thinking. While not all the participants had experience with computers or programming prior to attending college, experience played a role in the self-confidence of those who did.

  4. Shock compression response of cold-rolled Ni/Al multilayer composites

    DOE PAGES

    Specht, Paul E.; Weihs, Timothy P.; Thadhani, Naresh N.

    2017-01-06

    Uniaxial strain, plate-on-plate impact experiments were performed on cold-rolled Ni/Al multilayer composites and the resulting Hugoniot was determined through time-resolved measurements combined with impedance matching. The experimental Hugoniot agreed with that previously predicted by two dimensional (2D) meso-scale calculations. Additional 2D meso-scale simulations were performed using the same computational method as the prior study to reproduce the experimentally measured free surface velocities and stress profiles. Finally, these simulations accurately replicated the experimental profiles, providing additional validation for the previous computational work.
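The impedance-matching construction mentioned in the record can be sketched numerically: intersect the sample's Hugoniot, written via the momentum jump condition P = rho0*Us*up with a linear Us-up fit Us = c0 + s*up, against the flyer's mirrored Hugoniot. All material constants below are illustrative round numbers, not the paper's measured values:

```python
# Hedged sketch of the mirror-image impedance-match approximation; the
# material parameters are illustrative, not the study's fitted Hugoniots.

def hugoniot_pressure(rho0, c0, s, up):
    """Rankine-Hugoniot momentum jump with a linear Us-up fit: Us = c0 + s*up.
    Pressure comes out in GPa when rho0 is in g/cc and velocities in km/s."""
    return rho0 * (c0 + s * up) * up

def impedance_match(flyer, sample, v_impact, tol=1e-9):
    """Bisect for the interface particle velocity where the sample Hugoniot
    P(up) meets the flyer Hugoniot mirrored about v_impact, P(v_impact - up)."""
    lo, hi = 0.0, v_impact
    while hi - lo > tol:
        up = 0.5 * (lo + hi)
        if hugoniot_pressure(*sample, up) < hugoniot_pressure(*flyer, v_impact - up):
            lo = up
        else:
            hi = up
    return up, hugoniot_pressure(*sample, up)

copper = (8.9, 3.9, 1.5)   # (rho0 [g/cc], c0 [km/s], s) -- illustrative
target = (5.0, 4.0, 1.4)   # hypothetical composite values, not Ni/Al data
up, p = impedance_match(copper, target, v_impact=1.0)
print(up, p)               # shocked-state particle velocity [km/s], pressure [GPa]
```

A quick sanity check on the construction: for a symmetric impact (flyer and sample identical), the matched particle velocity is exactly half the impact velocity.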

  5. Remediating Physics Misconceptions Using an Analogy-Based Computer Tutor. Draft.

    ERIC Educational Resources Information Center

    Murray, Tom; And Others

    Described is a computer tutor designed to help students gain a qualitative understanding of important physics concepts. The tutor simulates a teaching strategy called "bridging analogies" that previous research has demonstrated to be successful in one-on-one tutoring and written explanation studies. The strategy is designed to remedy…

  6. Development of a Personalized Educational Computer Game Based on Students' Learning Styles

    ERIC Educational Resources Information Center

    Hwang, Gwo-Jen; Sung, Han-Yu; Hung, Chun-Ming; Huang, Iwen; Tsai, Chin-Chung

    2012-01-01

    In recent years, many researchers have been engaged in the development of educational computer games; however, previous studies have indicated that, without supportive models that take individual students' learning needs or difficulties into consideration, students might only show temporary interest during the learning process, and their learning…

  7. Toward an Understanding of How Threads Die in Asynchronous Computer Conferences

    ERIC Educational Resources Information Center

    Hewitt, Jim

    2005-01-01

    Previous computer conferencing research has been concerned with the organizational, technical, social, and motivational factors that support and sustain online interaction. This article studies online interaction from a different perspective. Rather than analyze the processes that sustain discourse, the following research examines how and why…

  8. Using Computer Simulations in Chemistry Problem Solving

    ERIC Educational Resources Information Center

    Avramiotis, Spyridon; Tsaparlis, Georgios

    2013-01-01

    This study is concerned with the effects of computer simulations of two novel chemistry problems on the problem solving ability of students. A control-experimental group, equalized by pair groups (n[subscript Exp] = n[subscript Ctrl] = 78), research design was used. The students had no previous experience of chemical practical work. Student…

  9. Effects of Belongingness and Synchronicity on Face-to-Face and Computer-Mediated Online Cooperative Pedagogy

    ERIC Educational Resources Information Center

    Saltarelli, Andrew John

    2012-01-01

    Previous research suggests asynchronous online computer-mediated communication (CMC) has deleterious effects on certain cooperative learning pedagogies (e.g., constructive controversy), but the processes underlying this effect and how it may be ameliorated remain unclear. This study tests whether asynchronous CMC thwarts belongingness needs…

  10. An Analysis of Graduate Nursing Students' Innovation-Decision Process

    PubMed Central

    Kacynski, Kathryn A.; Roy, Katrina D.

    1984-01-01

    This study's purpose was to examine the innovation-decision process used by graduate nursing students when deciding to use computer applications. Graduate nursing students enrolled in a mandatory research class were surveyed before and after their use of a mainframe computer for beginning data analysis about their general attitudes towards computers, individual characteristics such as “cosmopoliteness”, and their desire to learn more about a computer application. It was expected that an experimental intervention, a videotaped demonstration of interactive video instruction of cardiopulmonary resuscitation (CPR); previous computer experience; and the subject's “cosmopoliteness” would influence attitudes towards computers and the desire to learn more about a computer application.

  11. Factors affecting and affected by user acceptance of computer-based nursing documentation: results of a two-year study.

    PubMed

    Ammenwerth, Elske; Mansmann, Ulrich; Iller, Carola; Eichstädter, Ronald

    2003-01-01

    The documentation of the nursing process is an important but often neglected part of clinical documentation. Paper-based systems have been introduced to support nursing process documentation. Frequently, however, problems such as low quality of documentation are reported. It is unclear whether computer-based documentation systems can reduce these problems and which factors influence their acceptance by users. We introduced a computer-based nursing documentation system on four wards of the University Hospitals of Heidelberg and systematically evaluated its preconditions and its effects in a pretest-posttest intervention study. For the analysis of user acceptance, we concentrated on subjective data drawn from questionnaires and interviews. A questionnaire was developed using items from published questionnaires and items that had to be developed for the special purpose of this study. The quantitative results point to two factors influencing the acceptance of a new computer-based documentation system: the previous acceptance of the nursing process and the previous amount of self-confidence when using computers. On one ward, the various acceptance scores declined sharply after the introduction of the nursing documentation system. Exploratory qualitative analysis on this ward points to further success factors of computer-based nursing documentation systems. Our results can be used to assist the planning and introduction of computer-based nursing documentation systems. They demonstrate the importance of computer experience and acceptance of the nursing process on a ward but also point to other factors such as the fit between nursing workflow and the functionality of a nursing documentation system.

  12. Fixed-Base Comb with Window-Non-Adjacent Form (NAF) Method for Scalar Multiplication

    PubMed Central

    Seo, Hwajeong; Kim, Hyunjin; Park, Taehwan; Lee, Yeoncheol; Liu, Zhe; Kim, Howon

    2013-01-01

    Elliptic curve cryptography (ECC) is one of the most promising public-key techniques in terms of short key size and various crypto protocols. For this reason, many studies on the implementation of ECC on resource-constrained devices within a practical execution time have been conducted. To this end, we must focus on scalar multiplication, which is the most expensive operation in ECC. A number of studies have proposed pre-computation and advanced scalar multiplication using a non-adjacent form (NAF) representation, and more sophisticated approaches have employed a width-w NAF representation and a modified pre-computation table. In this paper, we propose a new pre-computation method in which zero occurrences are much more frequent than in previous methods. This method can be applied to ordinary group scalar multiplication, but it requires a large pre-computation table, so we combined the previous method with ours for practical purposes. This novel structure establishes a new feature that finely adjusts the trade-off between speed and table size, so we can customize the pre-computation table for our own purposes. Finally, we can establish a customized look-up table for embedded microprocessors. PMID:23881143
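
    As a concrete illustration of the NAF representation discussed above (a generic textbook sketch, not the paper's proposed pre-computation method), the following computes the NAF digits of a scalar and checks the defining property that no two adjacent digits are nonzero:

```python
def naf(k):
    """Non-adjacent form of a positive integer k.

    Returns digits in {-1, 0, 1}, least-significant first. No two
    adjacent digits are nonzero, so on average only ~1/3 of digits
    are nonzero (vs ~1/2 for plain binary), which reduces the number
    of point additions in double-and-add scalar multiplication."""
    digits = []
    while k > 0:
        if k & 1:              # k odd: choose d = +/-1 so that (k - d) % 4 == 0
            d = 2 - (k % 4)
            k -= d
        else:
            d = 0
        digits.append(d)
        k >>= 1
    return digits

def from_digits(digits):
    """Reconstruct the scalar from NAF digits (sanity check)."""
    return sum(d << i for i, d in enumerate(digits))

print(naf(7))                        # [-1, 0, 0, 1], i.e. 7 = -1 + 8
print(from_digits(naf(123456789)))   # 123456789
```

    The same recoding generalizes to the width-w NAF mentioned in the record by reducing k modulo 2^w instead of 4 and allowing odd digits up to 2^(w-1) - 1.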

  13. Image restoration for three-dimensional fluorescence microscopy using an orthonormal basis for efficient representation of depth-variant point-spread functions

    PubMed Central

    Patwary, Nurmohammed; Preza, Chrysanthe

    2015-01-01

    A depth-variant (DV) image restoration algorithm for wide field fluorescence microscopy, using an orthonormal basis decomposition of DV point-spread functions (PSFs), is investigated in this study. The efficient PSF representation is based on a previously developed principal component analysis (PCA), which is computationally intensive. We present an approach developed to reduce the number of DV PSFs required for the PCA computation, thereby making the PCA-based approach computationally tractable for thick samples. Restoration results from both synthetic and experimental images show consistency and that the proposed algorithm addresses efficiently depth-induced aberration using a small number of principal components. Comparison of the PCA-based algorithm with a previously-developed strata-based DV restoration algorithm demonstrates that the proposed method improves performance by 50% in terms of accuracy and simultaneously reduces the processing time by 64% using comparable computational resources. PMID:26504634
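
    The orthonormal-basis idea can be sketched with ordinary PCA via SVD. The 1-D Gaussian "PSFs" below are illustrative stand-ins for measured 3-D depth-variant PSFs; the point is that a smooth depth-variant family is captured by a handful of principal components:

```python
import numpy as np

# Illustrative stand-ins for depth-variant PSFs: 1-D Gaussians whose
# width grows with depth (real DV PSFs are 3-D and measured/modeled).
x = np.linspace(-5, 5, 101)
depths = np.linspace(1.0, 2.0, 40)
psfs = np.stack([np.exp(-x**2 / (2 * s**2)) for s in depths])  # (40, 101)
psfs /= psfs.sum(axis=1, keepdims=True)          # normalize each PSF

# PCA via SVD of the mean-centered stack: rows of Vt form an
# orthonormal basis; each PSF is the mean plus a few coefficients.
mean = psfs.mean(axis=0)
U, S, Vt = np.linalg.svd(psfs - mean, full_matrices=False)

k = 3                                            # keep 3 principal components
coeffs = (psfs - mean) @ Vt[:k].T                # (40, 3) per-depth coefficients
approx = mean + coeffs @ Vt[:k]                  # rank-3 reconstruction

rel_err = np.linalg.norm(approx - psfs) / np.linalg.norm(psfs)
print(f"relative reconstruction error with {k} components: {rel_err:.2e}")
```

    In the restoration setting, storing only the mean, a few basis functions, and per-depth coefficients is what makes the depth-variant deconvolution computationally tractable.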

  14. Exploring the experience of clients with tetraplegia utilizing assistive technology for computer access.

    PubMed

    Folan, Alyce; Barclay, Linda; Cooper, Cathy; Robinson, Merren

    2015-01-01

    Assistive technology for computer access can be used to facilitate people with a spinal cord injury to utilize mainstream computer applications, thereby enabling participation in a variety of meaningful occupations. The aim of this study was to gain an understanding of the experiences of clients with tetraplegia trialing assistive technologies for computer access during different stages in a public rehabilitation service. In order to explore the experiences of clients with tetraplegia trialing assistive technologies for computer use, qualitative methodology was selected. Data were collected from seven participants using semi-structured interviews, which were audio-taped, transcribed and analyzed thematically. Three main themes were identified. These were: getting back into life, assisting in adjusting to injury and learning new skills. The findings from this study demonstrated that people with tetraplegia can be assisted to return to previous life roles or engage in new roles, through developing skills in the use of assistive technology for computer access. Being able to use computers for meaningful activities contributed to the participants gaining an enhanced sense of self-efficacy, and thereby quality of life. Implications for Rehabilitation: Findings from this pilot study indicate that people with tetraplegia can be assisted to return to previous life roles, and develop new roles that have meaning to them through the use of assistive technologies for computer use. Being able to use the internet to socialize, and complete daily tasks, contributed to the participants gaining a sense of control over their lives. Early introduction to assistive technology is important to ensure sufficient time for newly injured people to feel comfortable enough with the assistive technology to use the computers productively by the time of discharge. Further research into this important and expanding area is indicated.

  15. Hypnotic Enhancement of Cognitive-Behavioral Weight Loss Treatments--Another Meta-reanalysis.

    ERIC Educational Resources Information Center

    Kirsch, Irving

    1996-01-01

    In a meta-analysis of the effect of adding hypnosis to cognitive-behavioral treatments for weight reduction, additional data were obtained from authors of two previous studies, and computational inaccuracies in the previous meta-analyses were corrected. Discusses findings. Correlational analyses indicated that the benefits of hypnosis increased…

  16. Expedited patient-specific assessment of contact stress exposure in the ankle joint following definitive articular fracture reduction.

    PubMed

    Kern, Andrew M; Anderson, Donald D

    2015-09-18

    Acute injury severity, altered joint kinematics, and joint incongruity are three important mechanical factors linked to post-traumatic osteoarthritis (PTOA). Finite element analysis (FEA) was previously used to assess the influence of increased contact stress due to joint incongruity on PTOA development. While promising agreement with PTOA development was seen, the inherent complexities of contact FEA limited the numbers of subjects that could be analyzed. Discrete element analysis (DEA) is a simplified methodology for contact stress computation, which idealizes contact surfaces as a bed of independent linear springs. In this study, DEA was explored as an expedited alternative to FEA contact stress exposure computation. DEA was compared to FEA using results from a previously completed validation study of two cadaveric human ankles, as well as a previous study of post-operative contact stress exposure in 11 patients with tibial plafond fracture. DEA-computed maximum contact stresses were within 19% of those experimentally measured, with 90% of the contact area having computed contact stress values within 1 MPa of those measured. In the 11 fractured ankles, maximum contact stress and contact area differences between DEA and FEA were 0.85 ± 0.64 MPa and 22.5 ± 11.5 mm². As a predictive measure for PTOA development, both DEA and FEA had 100% concordance with presence of OA (KL grade ≥ 2) and >95% concordance with KL grade at 2 years. These results support DEA as a reasonable alternative to FEA for computing contact stress exposures following surgical reduction of a tibial plafond fracture. Copyright © 2015 Elsevier Ltd. All rights reserved.
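
    The "bed of independent linear springs" idealization behind DEA can be sketched in a few lines. The layer-stiffness expression below is the usual confined-elastic-layer relation adopted in DEA work, but the modulus, thickness, and penetration field here are invented illustrative values, not the paper's ankle model:

```python
import numpy as np

# Cartilage-like constants (illustrative, not the study's values):
E, nu, h = 12.0, 0.42, 2.0   # modulus (MPa), Poisson ratio, layer thickness (mm)

# Spring stiffness per unit area for a confined elastic layer,
# giving contact stress in MPa per mm of surface penetration.
k = E * (1 - nu) / ((1 + nu) * (1 - 2 * nu) * h)

# Illustrative surface overlap (penetration) field along a 1-D strip (mm);
# negative values mean the surfaces are separated there.
xs = np.linspace(-5, 5, 201)
overlap = 0.15 - 0.01 * xs**2

# Each spring acts independently: stress = k * penetration where the
# surfaces interpenetrate, and zero where they are apart. No coupling
# between neighboring springs is what makes DEA so much cheaper than FEA.
stress = k * np.clip(overlap, 0.0, None)
print(f"peak contact stress: {stress.max():.2f} MPa")
```

    Summing stress times element area over the contact patch then yields the total contact force, and repeating over the gait cycle builds the contact stress exposure used in the study.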

  17. Collaborative Dialogue in Synchronous Computer-Mediated Communication and Face-to-Face Communication

    ERIC Educational Resources Information Center

    Zeng, Gang

    2017-01-01

    Previous research has documented that collaborative dialogue promotes L2 learning in both face-to-face (F2F) and synchronous computer-mediated communication (SCMC) modalities. However, relatively little research has explored modality effects on collaborative dialogue. Thus, motivated by sociocultual theory, this study examines how F2F compares…

  18. Critical Emergency Medicine Procedural Skills: A Comparative Study of Methods for Teaching and Assessment.

    ERIC Educational Resources Information Center

    Chapman, Dane M.; And Others

    Three critical procedural skills in emergency medicine were evaluated using three assessment modalities--written, computer, and animal model. The effects of computer practice and previous procedure experience on skill competence were also examined in an experimental sequential assessment design. Subjects were six medical students, six residents,…

  19. Computational Modelling and Children's Expressions of Signal and Noise

    ERIC Educational Resources Information Center

    Ainley, Janet; Pratt, Dave

    2017-01-01

    Previous research has demonstrated how young children can identify the signal in data. In this exploratory study we considered how they might also express meanings for noise when creating computational models using recent developments in software tools. We conducted extended clinical interviews with four groups of 11-year-olds and analysed the…

  20. Deconstructing the Discourse of Opportunity: Computer-Assisted Credit Recovery in Alternative Education

    ERIC Educational Resources Information Center

    Miller, Elizabeth R.

    2013-01-01

    Alternative schools educate students who have previously been unsuccessful in the traditional school setting. Many alternative school students are behind on high school credits, and the schools provide options for credit recovery. Computer-assisted instruction is often used for this purpose. Using case study methodology and a critical theoretical…

  1. The study of human venous system dynamics using hybrid computer modeling

    NASA Technical Reports Server (NTRS)

    Snyder, M. F.; Rideout, V. C.

    1972-01-01

    A computer-based model of the cardiovascular system was created emphasizing effects on the systemic venous system. Certain physiological aspects were emphasized: effects of heart rate, tilting, changes in respiration, and leg muscular contractions. The results from the model showed close correlation with findings previously reported in the literature.

  2. Quantum Monte Carlo for atoms and molecules

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Barnett, R.N.

    1989-11-01

    The diffusion quantum Monte Carlo with fixed nodes (QMC) approach has been employed in studying energy eigenstates for 1--4 electron systems. Previous work employing the diffusion QMC technique yielded energies of high quality for H₂, LiH, Li₂, and H₂O. Here, the range of calculations with this new approach has been extended to include additional first-row atoms and molecules. In addition, improvements in the previously computed fixed-node energies of LiH, Li₂, and H₂O have been obtained using more accurate trial functions. All computations were performed within, but are not limited to, the Born-Oppenheimer approximation. In our computations, the effects of variation of Monte Carlo parameters on the QMC solution of the Schroedinger equation were studied extensively. These parameters include the time step, renormalization time and nodal structure. These studies have been very useful in determining which choices of such parameters will yield accurate QMC energies most efficiently. Generally, very accurate energies (90--100% of the correlation energy is obtained) have been computed with single-determinant trial functions multiplied by simple correlation functions. Improvements in accuracy should be readily obtained using more complex trial functions.
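
    The branching random walk at the heart of diffusion QMC can be illustrated on a toy problem. The sketch below (with invented parameters, and none of the trial functions, drift terms, or nodal constraints used in the real atomic and molecular calculations above) applies the simplest DMC to the 1-D harmonic oscillator, whose exact ground-state energy is 0.5 in these units:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy diffusion Monte Carlo for V(x) = x^2 / 2: walkers diffuse freely,
# then are replicated or killed with weight exp(-dt * (V - E_ref)).
# A feedback on E_ref holds the population near its target size, and
# the stabilized E_ref is the growth estimator of the ground-state energy.
dt, nsteps, ntarget = 0.01, 4000, 1000
x = rng.normal(size=ntarget)          # walker positions
e_ref = 0.5                           # reference (trial) energy
e_hist = []

for step in range(nsteps):
    x = x + np.sqrt(dt) * rng.normal(size=x.size)      # free diffusion
    w = np.exp(-dt * (0.5 * x**2 - e_ref))             # branching weights
    copies = (w + rng.random(x.size)).astype(int)      # stochastic rounding
    x = np.repeat(x, copies)                           # walker birth/death
    e_ref += 0.1 * np.log(ntarget / max(x.size, 1))    # population control
    e_hist.append(e_ref)

e_est = float(np.mean(e_hist[nsteps // 2:]))
print(f"DMC ground-state energy ~ {e_est:.3f} (exact 0.5)")
```

    The residual bias in the estimate depends on the time step and the population-control gain, echoing the parameter studies (time step, renormalization time, nodal structure) described in the record.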

  3. Facilitating higher-fidelity simulations of axial compressor instability and other turbomachinery flow conditions

    NASA Astrophysics Data System (ADS)

    Herrick, Gregory Paul

    The quest to accurately capture flow phenomena with length-scales both short and long and to accurately represent complex flow phenomena within disparately sized geometry inspires a need for an efficient, high-fidelity, multi-block structured computational fluid dynamics (CFD) parallel computational scheme. This research presents and demonstrates a more efficient computational method by which to perform multi-block structured CFD parallel computational simulations, thus facilitating higher-fidelity solutions of complicated geometries (due to the inclusion of grids for "small" flow areas which are often merely modeled) and their associated flows. This computational framework offers greater flexibility and user-control in allocating the resource balance between process count and wall-clock computation time. The principal modifications implemented in this revision consist of a "multiple grid block per processing core" software infrastructure and an analytic computation of viscous flux Jacobians. The development of this scheme is largely motivated by the desire to simulate axial compressor stall inception with more complete gridding of the flow passages (including rotor tip clearance regions) than has been previously done while maintaining high computational efficiency (i.e., minimal consumption of computational resources), and thus this paradigm shall be demonstrated with an examination of instability in a transonic axial compressor. However, the paradigm presented herein facilitates CFD simulation of myriad previously impractical geometries and flows and is not limited to detailed analyses of axial compressor flows.
While the simulations presented herein were technically possible under the previous structure of the subject software, they were much less computationally efficient and thus not pragmatically feasible; the previous research using this software to perform three-dimensional, full-annulus, time-accurate, unsteady, full-stage (with sliding-interface) simulations of rotating stall inception in axial compressors utilized tip clearance periodic models, while the scheme here is demonstrated by a simulation of axial compressor stall inception utilizing gridded rotor tip clearance regions. As will be discussed, much previous research---experimental, theoretical, and computational---has suggested that understanding clearance flow behavior is critical to understanding stall inception, and previous computational research efforts which have used tip clearance models have begged the question, "What about the clearance flows?". This research begins to address that question.

  4. Shock compression response of cold-rolled Ni/Al multilayer composites

    NASA Astrophysics Data System (ADS)

    Specht, Paul E.; Weihs, Timothy P.; Thadhani, Naresh N.

    2017-01-01

    Uniaxial strain, plate-on-plate impact experiments were performed on cold-rolled Ni/Al multilayer composites and the resulting Hugoniot was determined through time-resolved measurements combined with impedance matching. The experimental Hugoniot agreed with that previously predicted by two dimensional (2D) meso-scale calculations [Specht et al., J. Appl. Phys. 111, 073527 (2012)]. Additional 2D meso-scale simulations were performed using the same computational method as the prior study to reproduce the experimentally measured free surface velocities and stress profiles. These simulations accurately replicated the experimental profiles, providing additional validation for the previous computational work.
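
    Impedance matching, as used above to place Hugoniot points, intersects the target Hugoniot with the flyer Hugoniot reflected about the impact velocity. A minimal sketch with linear Us = c0 + s·up fits follows; the copper constants are textbook-like values and the "composite" constants are invented for illustration, not the paper's measured Ni/Al Hugoniot:

```python
# Sketch of the impedance-matching construction for a symmetric-geometry
# plate impact: find the interface particle velocity where the target
# Hugoniot pressure equals the reflected flyer Hugoniot pressure.

def hugoniot_p(rho0, c0, s, up):
    """Pressure on a linear-Us Hugoniot at particle velocity up.
    With rho0 in g/cc and velocities in km/s, P comes out in GPa."""
    return rho0 * (c0 + s * up) * up

def impedance_match(flyer, target, u_imp, tol=1e-10):
    """Bisect on [0, u_imp] for the particle velocity u at which the
    flyer Hugoniot, reflected about u_imp, meets the target Hugoniot."""
    rf, cf, sf = flyer
    rt, ct, st = target
    lo, hi = 0.0, u_imp
    while hi - lo > tol:
        u = 0.5 * (lo + hi)
        diff = hugoniot_p(rf, cf, sf, u_imp - u) - hugoniot_p(rt, ct, st, u)
        if diff > 0:
            lo = u          # flyer pressure still higher: move right
        else:
            hi = u
    u = 0.5 * (lo + hi)
    return u, hugoniot_p(rt, ct, st, u)

copper = (8.93, 3.94, 1.49)      # rho0 (g/cc), c0 (km/s), s -- typical values
composite = (5.40, 4.10, 1.60)   # illustrative multilayer-like constants

up, P = impedance_match(copper, composite, u_imp=1.0)
print(f"interface state: up = {up:.3f} km/s, P = {P:.2f} GPa")
```

    Each measured impact velocity yields one such (up, P) intersection, and the collection of intersections traces out the experimental Hugoniot compared against the meso-scale predictions.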

  5. Comparison of Methods for Determining Boundary Layer Edge Conditions for Transition Correlations

    NASA Technical Reports Server (NTRS)

    Liechty, Derek S.; Berry, Scott A.; Hollis, Brian R.; Horvath, Thomas J.

    2003-01-01

    Data previously obtained for the X-33 in the NASA Langley Research Center 20-Inch Mach 6 Air Tunnel have been reanalyzed to compare methods for determining boundary layer edge conditions for use in transition correlations. The experimental results were previously obtained utilizing the phosphor thermography technique to monitor the status of the boundary layer downstream of discrete roughness elements via global heat transfer images of the X-33 windward surface. A boundary layer transition correlation was previously developed for this data set using boundary layer edge conditions calculated using an inviscid/integral boundary layer approach. An algorithm was written in the present study to extract boundary layer edge quantities from higher fidelity viscous computational fluid dynamic solutions to develop transition correlations that account for viscous effects on vehicles of arbitrary complexity. The boundary layer transition correlations developed for the X-33 from the viscous solutions are compared to the previous boundary layer transition correlations. It is shown that the boundary layer edge conditions calculated using an inviscid/integral boundary layer approach are significantly different than those extracted from viscous computational fluid dynamic solutions. The present results demonstrate the differences obtained in correlating transition data using different computational methods.

  6. Uranium dioxide fuel cladding strain investigation with the use of CYGRO-2 computer program

    NASA Technical Reports Server (NTRS)

    Smith, J. R.

    1973-01-01

    Previously irradiated UO2 thermionic fuel pins in which gross fuel-cladding strain occurred were modeled with the use of a computer program to define controlling parameters which may contribute to cladding strain. The computed strain was compared with measured strain, and the computer input data were studied in an attempt to get agreement with measured strain. Because of the limitations of the program and uncertainties in input data, good agreement with measured cladding strain was not attained. A discussion of these limitations is presented.

  7. Acoustic environmental accuracy requirements for response determination

    NASA Technical Reports Server (NTRS)

    Pettitt, M. R.

    1983-01-01

    A general purpose computer program was developed for the prediction of vehicle interior noise. This program, named VIN, has both modal and statistical energy analysis capabilities for structural/acoustic interaction analysis. The analytic models and their computer implementation were verified through simple test cases with well-defined experimental results. The model was also applied in a space shuttle payload bay launch acoustics prediction study. The computer program processes large and small problems with equal efficiency because all arrays are dynamically sized by program input variables at run time. A data base is built and easily accessed for design studies. The data base significantly reduces the computational costs of such studies by allowing the reuse of the still-valid calculated parameters of previous iterations.

  8. Sleep problems and computer use during work and leisure: Cross-sectional study among 7800 adults.

    PubMed

    Andersen, Lars Louis; Garde, Anne Helene

    2015-01-01

    Previous studies linked heavy computer use to disturbed sleep. This study investigates the association between computer use during work and leisure and sleep problems in working adults. From the 2010 round of the Danish Work Environment Cohort Study, currently employed wage earners on daytime schedule (N = 7883) replied to the Bergen insomnia scale and questions on weekly duration of computer use. Results showed that sleep problems for three or more days per week (average of six questions) were experienced by 14.9% of the respondents. Logistic regression analyses, controlled for gender, age, physical and psychosocial work factors, lifestyle, chronic disease and mental health showed that computer use during leisure for 30 or more hours per week (reference 0-10 hours per week) was associated with increased odds of sleep problems (OR 1.83 [95% CI 1.06-3.17]). Computer use during work and shorter duration of computer use during leisure were not associated with sleep problems. In conclusion, excessive computer use during leisure - but not work - is associated with sleep problems in adults working on daytime schedule.
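
    As a reminder of the arithmetic behind the reported odds ratio (illustrative only, not a re-analysis of the study's data), a logistic-regression coefficient b maps to OR = exp(b), with 95% Wald confidence interval exp(b ± 1.96·se):

```python
import math

# Working backwards from the reported OR 1.83 [95% CI 1.06-3.17]:
or_, lo, hi = 1.83, 1.06, 3.17
b = math.log(or_)                                  # coefficient on the log-odds scale
se = (math.log(hi) - math.log(lo)) / (2 * 1.96)    # implied standard error

print(f"b = {b:.3f}, se = {se:.3f}")
print(f"reconstructed CI: [{math.exp(b - 1.96 * se):.2f}, "
      f"{math.exp(b + 1.96 * se):.2f}]")
```

    Because the CI is symmetric on the log-odds scale, the reported interval [1.06, 3.17] is asymmetric around 1.83 on the odds-ratio scale, which is expected rather than a rounding error.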

  9. Developing an Educational Computer Game for Migratory Bird Identification Based on a Two-Tier Test Approach

    ERIC Educational Resources Information Center

    Chu, Hui-Chun; Chang, Shao-Chen

    2014-01-01

    Although educational computer games have been recognized as being a promising approach, previous studies have indicated that, without supportive models, students might only show temporary interest during the game-based learning process, and their learning performance is often not as good as expected. Therefore, in this paper, a two-tier test…

  10. Collaborative Learning with Screen-Based Simulation in Health Care Education: An Empirical Study of Collaborative Patterns and Proficiency Development

    ERIC Educational Resources Information Center

    Hall, L. O.; Soderstrom, T.; Ahlqvist, J.; Nilsson, T.

    2011-01-01

    This article is about collaborative learning with educational computer-assisted simulation (ECAS) in health care education. Previous research on training with a radiological virtual reality simulator has indicated positive effects on learning when compared to a more conventional alternative. Drawing upon the field of Computer-Supported…

  11. 3-D Computer Animation vs. Live-Action Video: Differences in Viewers' Response to Instructional Vignettes

    ERIC Educational Resources Information Center

    Smith, Dennie; McLaughlin, Tim; Brown, Irving

    2012-01-01

    This study explored computer animation vignettes as a replacement for live-action video scenarios of classroom behavior situations previously used as an instructional resource in teacher education courses in classroom management strategies. The focus of the research was to determine if the embedded behavioral information perceived in a live-action…

  12. Computational Thinking in Mathematics Teacher Education

    ERIC Educational Resources Information Center

    Gadanidis, George; Cendros, Rosa; Floyd, Lisa; Namukasa, Immaculate

    2017-01-01

    As computational thinking (CT) is increasing in focus in K-12 education, it is important to consider how teacher education programs may better prepare teacher candidates (TCs). Previous studies have found that TCs do not always have a firm understanding of what CT involves, and they might not have clear ideas about how to develop CT in their…

  13. Enhancing Students' Computer Programming Performances, Critical Thinking Awareness and Attitudes towards Programming: An Online Peer-Assessment Attempt

    ERIC Educational Resources Information Center

    Wang, Xiao-Ming; Hwang, Gwo-Jen; Liang, Zi-Yun; Wang, Hsiu-Ying

    2017-01-01

    It has become an important and challenging issue to foster students' concepts and skills of computer programming. Scholars believe that programming training could promote students' higher order thinking performance; however, many school teachers have reported the difficulty of teaching programming courses. Although several previous studies have…

  14. Hydride Transfer in DHFR by Transition Path Sampling, Kinetic Isotope Effects, and Heavy Enzyme Studies

    PubMed Central

    Wang, Zhen; Antoniou, Dimitri; Schwartz, Steven D.; Schramm, Vern L.

    2016-01-01

    Escherichia coli dihydrofolate reductase (ecDHFR) is used to study fundamental principles of enzyme catalysis. It remains controversial whether fast protein motions are coupled to the hydride transfer catalyzed by ecDHFR. Previous studies with heavy ecDHFR proteins labeled with 13C, 15N, and nonexchangeable 2H reported enzyme mass-dependent hydride transfer kinetics for ecDHFR. Here, we report refined experimental and computational studies to establish that hydride transfer is independent of protein mass. Instead, we found the rate constant for substrate dissociation to be faster for heavy DHFR. Previously reported kinetic differences between light and heavy DHFRs likely arise from kinetic steps other than the chemical step. This study confirms that fast (femtosecond to picosecond) protein motions in ecDHFR are not coupled to hydride transfer and provides an integrative computational and experimental approach to resolve fast dynamics coupled to chemical steps in enzyme catalysis. PMID:26652185

  15. A Computational and Experimental Study of Resonators in Three Dimensions

    NASA Technical Reports Server (NTRS)

    Tam, C. K. W.; Ju, H.; Jones, Michael G.; Watson, Willie R.; Parrott, Tony L.

    2009-01-01

    In a previous work by the present authors, a computational and experimental investigation of the acoustic properties of two-dimensional slit resonators was carried out. The present paper reports the results of a study extending the previous work to three dimensions. This investigation has two basic objectives. The first is to validate the computed results from direct numerical simulations of the flow and acoustic fields of slit resonators in three dimensions by comparing with experimental measurements in a normal incidence impedance tube. The second objective is to study the flow physics of resonant liners responsible for sound wave dissipation. Extensive comparisons are provided between computed and measured acoustic liner properties with both discrete frequency and broadband sound sources. Good agreements are found over a wide range of frequencies and sound pressure levels. Direct numerical simulation confirms the previous finding in two dimensions that vortex shedding is the dominant dissipation mechanism at high sound pressure intensity. However, it is observed that the behavior of the shed vortices in three dimensions is quite different from those of two dimensions. In three dimensions, the shed vortices tend to evolve into ring (circular in plan form) vortices, even though the slit resonator opening from which the vortices are shed has an aspect ratio of 2.5. Under the excitation of discrete frequency sound, the shed vortices align themselves into two regularly spaced vortex trains moving away from the resonator opening in opposite directions. This is different from the chaotic shedding of vortices found in two-dimensional simulations. The effect of slit aspect ratio at a fixed porosity is briefly studied. For the range of liners considered in this investigation, it is found that the absorption coefficient of a liner increases when the open area of the single slit is subdivided into multiple, smaller slits.

  16. Developing and validating an instrument for measuring mobile computing self-efficacy.

    PubMed

    Wang, Yi-Shun; Wang, Hsiu-Yuan

    2008-08-01

    IT-related self-efficacy has been found to have a critical influence on system use. However, traditional measures of computer self-efficacy and Internet-related self-efficacy are perceived to be inapplicable in the context of mobile computing and commerce because they are targeted primarily at either desktop computer or wire-based technology contexts. Based on previous research, this study develops and validates a multidimensional instrument for measuring mobile computing self-efficacy (MCSE). This empirically validated instrument will be useful to researchers in developing and testing the theories of mobile user behavior, and to practitioners in assessing the mobile computing self-efficacy of users and promoting the use of mobile commerce systems.

  17. The Differential Effects of Two Types of Task Repetition on the Complexity, Accuracy, and Fluency in Computer-Mediated L2 Written Production: A Focus on Computer Anxiety

    ERIC Educational Resources Information Center

    Amiryousefi, Mohammad

    2016-01-01

    Previous task repetition studies have primarily focused on how task repetition characteristics affect the complexity, accuracy, and fluency in L2 oral production with little attention to L2 written production. The main purpose of the study reported in this paper was to examine the effects of task repetition versus procedural repetition on the…

  18. Computer-task testing of rhesus monkeys (Macaca mulatta) in the social milieu.

    PubMed

    Washburn, D A; Harper, S; Rumbaugh, D M

    1994-07-01

    Previous research has demonstrated that a behavior and performance testing paradigm, in which rhesus monkeys (Macaca mulatta) manipulate a joystick to respond to computer-generated stimuli, provides environmental enrichment and supports the psychological well-being of captive research animals. The present study was designed to determine whether computer-task activity would be affected by pair-housing animals that had previously been tested only in their single-animal home cages. No differences were observed in productivity or performance levels as a function of housing condition, even when the animals were required to "self-identify" prior to performing each trial. The data indicate that cognitive challenge and control are as preferred by the animals as social opportunities, and that, together with comfort/health considerations, each must be addressed for the assurance of psychological well-being.

  19. Neural Network Training by Integration of Adjoint Systems of Equations Forward in Time

    NASA Technical Reports Server (NTRS)

    Toomarian, Nikzad (Inventor); Barhen, Jacob (Inventor)

    1999-01-01

A method and apparatus for supervised neural learning of time dependent trajectories exploits the concepts of adjoint operators to enable computation of the gradient of an objective functional with respect to the various parameters of the network architecture in a highly efficient manner. Specifically, it combines the advantage of dramatic reductions in computational complexity inherent in adjoint methods with the ability to solve two adjoint systems of equations together forward in time. Not only is a large amount of computation and storage saved, but the handling of real-time applications also becomes possible. The invention has been applied to two examples of representative complexity which have recently been analyzed in the open literature, demonstrating that a circular trajectory can be learned in approximately 200 iterations compared to the 12,000 reported in the literature. A figure eight trajectory was achieved in under 500 iterations compared to 20,000 previously required. The trajectories computed using our new method are much closer to the target trajectories than was reported in previous studies.
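The core adjoint idea in this record — computing the gradient of a trajectory objective with one extra pass through the dynamics, rather than perturbing each parameter separately — can be illustrated with a small discrete-time sketch. Note this is the standard backward-in-time adjoint recursion (backpropagation through time), not the patent's forward-in-time variant; the tanh network and quadratic tracking loss are illustrative assumptions, not taken from the patent.

```python
import numpy as np

def rollout(W, x0, T):
    """Run the discrete dynamics x_{t+1} = tanh(W x_t) for T steps."""
    xs = [x0]
    for _ in range(T):
        xs.append(np.tanh(W @ xs[-1]))
    return xs

def loss(W, x0, targets):
    """Quadratic tracking objective: 0.5 * sum_t ||x_t - target_t||^2."""
    xs = rollout(W, x0, len(targets))
    return 0.5 * sum(np.sum((x - tgt) ** 2) for x, tgt in zip(xs[1:], targets))

def adjoint_grad(W, x0, targets):
    """Gradient dL/dW via the adjoint (costate) recursion.

    One forward rollout plus one backward sweep, regardless of how
    many entries W has -- the efficiency the adjoint method buys.
    """
    T = len(targets)
    xs = rollout(W, x0, T)
    grad = np.zeros_like(W)
    lam = np.zeros_like(x0)              # adjoint variable (costate)
    for t in range(T, 0, -1):
        lam = lam + (xs[t] - targets[t - 1])   # inject loss term at step t
        delta = lam * (1.0 - xs[t] ** 2)       # back through tanh nonlinearity
        grad += np.outer(delta, xs[t - 1])     # accumulate dL/dW contribution
        lam = W.T @ delta                      # propagate adjoint one step back
    return grad
```

The backward sweep reuses the stored forward states, so the cost of the full gradient is comparable to a single extra trajectory integration; a finite-difference check confirms the recursion.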

  20. Neural network training by integration of adjoint systems of equations forward in time

    NASA Technical Reports Server (NTRS)

    Toomarian, Nikzad (Inventor); Barhen, Jacob (Inventor)

    1992-01-01

A method and apparatus for supervised neural learning of time dependent trajectories exploits the concepts of adjoint operators to enable computation of the gradient of an objective functional with respect to the various parameters of the network architecture in a highly efficient manner. Specifically, it combines the advantage of dramatic reductions in computational complexity inherent in adjoint methods with the ability to solve two adjoint systems of equations together forward in time. Not only is a large amount of computation and storage saved, but the handling of real-time applications also becomes possible. The invention has been applied to two examples of representative complexity which have recently been analyzed in the open literature, demonstrating that a circular trajectory can be learned in approximately 200 iterations compared to the 12,000 reported in the literature. A figure eight trajectory was achieved in under 500 iterations compared to 20,000 previously required. The trajectories computed using our new method are much closer to the target trajectories than was reported in previous studies.

  1. Validation of a computer case definition for sudden cardiac death in opioid users.

    PubMed

    Kawai, Vivian K; Murray, Katherine T; Stein, C Michael; Cooper, William O; Graham, David J; Hall, Kathi; Ray, Wayne A

    2012-08-31

    To facilitate the use of automated databases for studies of sudden cardiac death, we previously developed a computerized case definition that had a positive predictive value between 86% and 88%. However, the definition has not been specifically validated for prescription opioid users, for whom out-of-hospital overdose deaths may be difficult to distinguish from sudden cardiac death. We assembled a cohort of persons 30-74 years of age prescribed propoxyphene or hydrocodone who had no life-threatening non-cardiovascular illness, diagnosed drug abuse, residence in a nursing home in the past year, or hospital stay within the past 30 days. Medical records were sought for a sample of 140 cohort deaths within 30 days of a prescription fill meeting the computer case definition. Of the 140 sampled deaths, 81 were adjudicated; 73 (90%) were sudden cardiac deaths. Two deaths had possible opioid overdose; after removing these two the positive predictive value was 88%. These findings are consistent with our previous validation studies and suggest the computer case definition of sudden cardiac death is a useful tool for pharmacoepidemiologic studies of opioid analgesics.

  2. A Computational Study of the Flow Physics of Acoustic Liners

    NASA Technical Reports Server (NTRS)

    Tam, Christopher

    2006-01-01

The present investigation is a continuation of a previous joint project between Florida State University and the NASA Langley Research Center Liner Physics Team. In the previous project, a study of acoustic liners, in two dimensions, inside a normal incidence impedance tube was carried out. The study consisted of two parts. The NASA team was responsible for the experimental part of the project, which involved performing measurements in an impedance tube with a large aspect ratio slit resonator. The FSU team was responsible for the computational part, which involved performing direct numerical simulation (DNS) of the NASA experiment in two dimensions using CAA methodology. It was agreed that upon completion of the numerical simulation, the computed values of the liner impedance were to be sent to NASA for validation against experimental results. Following this procedure, good agreement was found between numerical results and experimental measurements over a wide range of frequencies and sound pressure levels. Broadband incident sound waves were also simulated numerically and measured experimentally, with good agreement overall.

  3. Pulsatile flow in ventricular catheters for hydrocephalus

    NASA Astrophysics Data System (ADS)

    Giménez, Á.; Galarza, M.; Thomale, U.; Schuhmann, M. U.; Valero, J.; Amigó, J. M.

    2017-05-01

The obstruction of ventricular catheters (VCs) is a major problem in the standard treatment of hydrocephalus, the flow pattern of the cerebrospinal fluid (CSF) being one important factor thereof. As a first approach to this problem, some of the authors previously studied the CSF flow through VCs under time-independent boundary conditions by means of computational fluid dynamics in three-dimensional models. This allowed us to derive a few basic principles which led to designs with improved flow patterns regarding the obstruction problem. However, the flow of the CSF actually has a pulsatile nature because of the heartbeat and blood flow. To address this fact, here we extend our previous computational study to models with oscillatory boundary conditions. The new results will be compared with the results for constant flows and discussed. It turns out that the corrections due to the pulsatility of the CSF are quantitatively small, which reinforces our previous findings and conclusions. This article is part of the themed issue `Mathematical methods in medicine: neuroscience, cardiology and pathology'.

  4. Does Seeing One Another's Gaze Affect Group Dialogue? A Computational Approach

    ERIC Educational Resources Information Center

    Schneider, Bertrand; Pea, Roy

    2015-01-01

    In a previous study, we found that real-time mutual gaze perception (i.e., being able to see the gaze of your partner in real time on a computer screen while solving a learning task) had a positive effect on student collaboration and learning (Schneider & Pea, 2013). The goals of this paper are (1) to explore a variety of computational…

  5. Teaching Musical Expression: Effects of Production and Delivery of Feedback by Teacher vs. Computer on Rated Feedback Quality

    ERIC Educational Resources Information Center

    Karlsson, Jessika; Liljestrom, Simon; Juslin, Patrik N.

    2009-01-01

    Previous research has shown that a computer program may improve performers' abilities to express emotions through their performance. Yet, performers seem reluctant to embrace this novel technology. In this study we explored possible reasons for these negative impressions. Eighty guitarists performed a piece of music to express various emotions,…

  6. Hegemony and Assessment: The Student Experience of Being in a Male Homogenous Higher Education Computing Course

    ERIC Educational Resources Information Center

    Sheedy, Caroline

    2018-01-01

    This work emanates from a previous study examining the experiences of male final year students in computing degree programmes that focused on their perceptions as students where they had few, if any, female classmates. This empirical work consisted of focus groups, with the findings outlined here drawn from two groups that were homogeneous with…

  7. Fluid-Structure Interaction in Composite Structures

    DTIC Science & Technology

    2014-03-01

Some previous experimental observations of polymer composite structures were confirmed using results from computer simulations, which also enhanced understanding of the effect of FSI on dynamic responses of composite structures. A great amount of research has applied the FEM to study and simulate cases in which structures are surrounded by…

  8. REPORT ON COMPUTER ASSISTED INSTRUCTION, PROVIDENCE COLLEGE, PROVIDENCE, RHODE ISLAND, OCTOBER 1, 1965--JUNE 30, 1966.

    ERIC Educational Resources Information Center

    REYNOLDS, ROBERT R.; AND OTHERS

Participants in a project to train vocational education teachers in the use of computer-assisted instruction wrote course sections as an exercise in the use of the "Coursewriter" language and the application of the basic principles of psychology that had been studied during a previous course in the summer of 1965. Upon completion of the…

  9. The Interactive Effects of the Availability of Objectives and/or Rules on Computer-Based Learning: A Replication.

    ERIC Educational Resources Information Center

    Merrill, Paul F.; And Others

    To replicate and extend the results of a previous study, this project investigated the effects of behavioral objectives and/or rules on computer-based learning task performance. The 133 subjects were randomly assigned to an example-only, objective-example, rule example, or objective-rule example group. The availability of rules and/or objectives…

  10. College students and computers: assessment of usage patterns and musculoskeletal discomfort.

    PubMed

    Noack-Cooper, Karen L; Sommerich, Carolyn M; Mirka, Gary A

    2009-01-01

A limited number of studies have focused on computer-use-related MSDs in college students, though their risk factor exposure may be similar to that of workers who use computers. This study examined the computer use patterns of college students and made comparisons to a group of previously studied computer-using professionals. 234 students completed a web-based questionnaire concerning computer use habits and physical discomfort that respondents specifically associated with computer use. As a group, students reported their computer use to be at least 'Somewhat likely' 18 out of 24 h/day, compared to 12 h for the professionals. Students reported more uninterrupted work behaviours than the professionals. Younger graduate students reported 33.7 average weekly computing hours, similar to hours reported by younger professionals. Students generally reported more frequent upper extremity discomfort than the professionals. Frequent assumption of awkward postures was associated with frequent discomfort. The findings signal a need for intervention, including training and education, prior to entry into the workforce. Students are future workers, so it is important to determine whether their increasing exposure to computers before entering the workforce may mean that they enter it already injured, or do not enter their chosen profession at all because of upper extremity MSDs.

  11. Quantum Sheaf Cohomology on Grassmannians

    NASA Astrophysics Data System (ADS)

    Guo, Jirui; Lu, Zhentao; Sharpe, Eric

    2017-05-01

    In this paper we study the quantum sheaf cohomology of Grassmannians with deformations of the tangent bundle. Quantum sheaf cohomology is a (0,2) deformation of the ordinary quantum cohomology ring, realized as the OPE ring in A/2-twisted theories. Quantum sheaf cohomology has previously been computed for abelian gauged linear sigma models (GLSMs); here, we study (0,2) deformations of nonabelian GLSMs, for which previous methods have been intractable. Combined with the classical result, the quantum ring structure is derived from the one-loop effective potential. We also utilize recent advances in supersymmetric localization to compute A/2 correlation functions and check the general result in examples. In this paper we focus on physics derivations and examples; in a companion paper, we will provide a mathematically rigorous derivation of the classical sheaf cohomology ring.

  12. Theoretical studies of Resonance Enhanced Stimulated Raman Scattering (RESRS) of frequency doubled Alexandrite laser wavelength in cesium vapor

    NASA Technical Reports Server (NTRS)

    Lawandy, Nabil M.

    1987-01-01

The third phase of research will focus on the propagation and energy extraction of the pump and SERS beams in a variety of configurations, including oscillator structures. In order to address these questions, a numerical code capable of allowing for saturation and full transverse beam evolution is required. The method proposed is based on a discretized propagation energy extraction model which uses a Kirchhoff integral propagator coupled to the three-level Raman model already developed. The model will have the resolution required by diffraction limits and will use the previous density matrix results in the adiabatic following limit. Owing to its large computational requirements, such a code must be implemented on a vector array processor. One code on the Cyber is being tested by using previously understood two-level laser models as guidelines for interpreting the results. Two tests were implemented: the evolution of modes in a passive resonator and the evolution of a stable state of the adiabatically eliminated laser equations. These results show mode shapes and diffraction losses for the first case and relaxation oscillations for the second. Finally, in order to clarify the computing methodology used to exploit the Cyber's computational speed, the time required to run both of the computations mentioned previously on the Cyber and on the VAX 730 must be measured. Also included is a short description of the current laser model (CAVITY.FOR) and a flow chart of the test computations.

  13. Impact of computer use on children's vision.

    PubMed

    Kozeis, N

    2009-10-01

    Today, millions of children use computers on a daily basis. Extensive viewing of the computer screen can lead to eye discomfort, fatigue, blurred vision and headaches, dry eyes and other symptoms of eyestrain. These symptoms may be caused by poor lighting, glare, an improper work station set-up, vision problems of which the person was not previously aware, or a combination of these factors. Children can experience many of the same symptoms related to computer use as adults. However, some unique aspects of how children use computers may make them more susceptible than adults to the development of these problems. In this study, the most common eye symptoms related to computer use in childhood, the possible causes and ways to avoid them are reviewed.

  14. Simple, efficient allocation of modelling runs on heterogeneous clusters with MPI

    USGS Publications Warehouse

    Donato, David I.

    2017-01-01

    In scientific modelling and computation, the choice of an appropriate method for allocating tasks for parallel processing depends on the computational setting and on the nature of the computation. The allocation of independent but similar computational tasks, such as modelling runs or Monte Carlo trials, among the nodes of a heterogeneous computational cluster is a special case that has not been specifically evaluated previously. A simulation study shows that a method of on-demand (that is, worker-initiated) pulling from a bag of tasks in this case leads to reliably short makespans for computational jobs despite heterogeneity both within and between cluster nodes. A simple reference implementation in the C programming language with the Message Passing Interface (MPI) is provided.
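The on-demand (worker-initiated) allocation this record describes can be contrasted with static pre-assignment in a small scheduling simulation. This is a toy Python sketch, not the paper's C/MPI reference implementation; the task costs and node speeds are illustrative assumptions.

```python
import heapq

def makespan_static(task_costs, speeds):
    """Makespan when tasks are pre-assigned round-robin, ignoring node speed."""
    n = len(speeds)
    loads = [0.0] * n
    for i, cost in enumerate(task_costs):
        loads[i % n] += cost / speeds[i % n]
    return max(loads)

def makespan_on_demand(task_costs, speeds):
    """Makespan when idle workers pull the next task from a shared bag.

    A min-heap of (time_free, worker_id) models workers requesting work
    as soon as they finish, so fast nodes automatically take more tasks.
    """
    heap = [(0.0, w) for w in range(len(speeds))]
    heapq.heapify(heap)
    finish = 0.0
    for cost in task_costs:
        free_at, w = heapq.heappop(heap)
        free_at += cost / speeds[w]        # this worker runs the task
        finish = max(finish, free_at)
        heapq.heappush(heap, (free_at, w))
    return finish
```

For example, with 12 unit-cost tasks on two nodes of speeds 1 and 4, round-robin pre-assignment leaves the slow node the bottleneck (makespan 6.0), while bag-of-tasks pulling finishes in 3.0 — the kind of heterogeneity-tolerance the paper's simulation study reports.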

  15. Clinical Pilot Study and Computational Modeling of Bitemporal Transcranial Direct Current Stimulation, and Safety of Repeated Courses of Treatment, in Major Depression.

    PubMed

    Ho, Kerrie-Anne; Bai, Siwei; Martin, Donel; Alonzo, Angelo; Dokos, Socrates; Loo, Colleen K

    2015-12-01

    This study aimed to examine a bitemporal (BT) transcranial direct current stimulation (tDCS) electrode montage for the treatment of depression through a clinical pilot study and computational modeling. The safety of repeated courses of stimulation was also examined. Four participants with depression who had previously received multiple courses of tDCS received a 4-week course of BT tDCS. Mood and neuropsychological function were assessed. The results were compared with previous courses of tDCS given to the same participants using different electrode montages. Computational modeling examined the electric field maps produced by the different montages. Three participants showed clinical improvement with BT tDCS (mean [SD] improvement, 49.6% [33.7%]). There were no adverse neuropsychological effects. Computational modeling showed that the BT montage activates the anterior cingulate cortices and brainstem, which are deep brain regions that are important for depression. However, a fronto-extracephalic montage stimulated these areas more effectively. No adverse effects were found in participants receiving up to 6 courses of tDCS. Bitemporal tDCS was safe and led to clinically meaningful efficacy in 3 of 4 participants. However, computational modeling suggests that the BT montage may not activate key brain regions in depression more effectively than another novel montage--fronto-extracephalic tDCS. There is also preliminary evidence to support the safety of up to 6 repeated courses of tDCS.

  16. Can virtual reality improve anatomy education? A randomised controlled study of a computer-generated three-dimensional anatomical ear model.

    PubMed

    Nicholson, Daren T; Chalk, Colin; Funnell, W Robert J; Daniel, Sam J

    2006-11-01

    The use of computer-generated 3-dimensional (3-D) anatomical models to teach anatomy has proliferated. However, there is little evidence that these models are educationally effective. The purpose of this study was to test the educational effectiveness of a computer-generated 3-D model of the middle and inner ear. We reconstructed a fully interactive model of the middle and inner ear from a magnetic resonance imaging scan of a human cadaver ear. To test the model's educational usefulness, we conducted a randomised controlled study in which 28 medical students completed a Web-based tutorial on ear anatomy that included the interactive model, while a control group of 29 students took the tutorial without exposure to the model. At the end of the tutorials, both groups were asked a series of 15 quiz questions to evaluate their knowledge of 3-D relationships within the ear. The intervention group's mean score on the quiz was 83%, while that of the control group was 65%. This difference in means was highly significant (P < 0.001). Our findings stand in contrast to the handful of previous randomised controlled trials that evaluated the effects of computer-generated 3-D anatomical models on learning. The equivocal and negative results of these previous studies may be due to the limitations of these studies (such as small sample size) as well as the limitations of the models that were studied (such as a lack of full interactivity). Given our positive results, we believe that further research is warranted concerning the educational effectiveness of computer-generated anatomical models.

  17. Impact of singular excessive computer game and television exposure on sleep patterns and memory performance of school-aged children.

    PubMed

    Dworak, Markus; Schierl, Thomas; Bruns, Thomas; Strüder, Heiko Klaus

    2007-11-01

    Television and computer game consumption are a powerful influence in the lives of most children. Previous evidence has supported the notion that media exposure could impair a variety of behavioral characteristics. Excessive television viewing and computer game playing have been associated with many psychiatric symptoms, especially emotional and behavioral symptoms, somatic complaints, attention problems such as hyperactivity, and family interaction problems. Nevertheless, there is insufficient knowledge about the relationship between singular excessive media consumption on sleep patterns and linked implications on children. The aim of this study was to investigate the effects of singular excessive television and computer game consumption on sleep patterns and memory performance of children. Eleven school-aged children were recruited for this polysomnographic study. Children were exposed to voluntary excessive television and computer game consumption. In the subsequent night, polysomnographic measurements were conducted to measure sleep-architecture and sleep-continuity parameters. In addition, a visual and verbal memory test was conducted before media stimulation and after the subsequent sleeping period to determine visuospatial and verbal memory performance. Only computer game playing resulted in significant reduced amounts of slow-wave sleep as well as significant declines in verbal memory performance. Prolonged sleep-onset latency and more stage 2 sleep were also detected after previous computer game consumption. No effects on rapid eye movement sleep were observed. Television viewing reduced sleep efficiency significantly but did not affect sleep patterns. The results suggest that television and computer game exposure affect children's sleep and deteriorate verbal cognitive performance, which supports the hypothesis of the negative influence of media consumption on children's sleep, learning, and memory.

  18. Hexagonalization of correlation functions II: two-particle contributions

    NASA Astrophysics Data System (ADS)

    Fleury, Thiago; Komatsu, Shota

    2018-02-01

    In this work, we compute one-loop planar five-point functions in N=4 super-Yang-Mills using integrability. As in the previous work, we decompose the correlation functions into hexagon form factors and glue them using the weight factors which depend on the cross-ratios. The main new ingredient in the computation, as compared to the four-point functions studied in the previous paper, is the two-particle mirror contribution. We develop techniques to evaluate it and find agreement with the perturbative results in all the cases we analyzed. In addition, we consider next-to-extremal four-point functions, which are known to be protected, and show that the sum of one-particle and two-particle contributions at one loop adds up to zero as expected. The tools developed in this work would be useful for computing higher-particle contributions which would be relevant for more complicated quantities such as higher-loop corrections and non-planar correlators.

  19. A Determination of the Minimum Frequency Requirements for a PATRIOT Battalion UHF Communication System.

    DTIC Science & Technology

    1982-12-01

…a computer program which simulates the PATRIOT battalion UHF communication system. The detailed description of how the model performs this… Thesis by Gregory H. Swanson, Captain, USA, submitted toward the degree of Master of Science. Table-of-contents entries include Model Application, Thesis Overview, and Previous Studies.

  20. The Development, Implementation, and Evaluation of a Computer-Assisted Branched Test for a Program of Individually Prescribed Instruction.

    ERIC Educational Resources Information Center

    Ferguson, Richard L.

    The focus of this study was upon the development and evaluation of a computer-assisted branched test to be used in making instructional decisions for individuals in the program of Individually Prescribed Instruction. A Branched Test is one in which the presentation of test items is contingent upon the previous responses of the examinee. The…

  1. The Effect of a Computer-Based Cartooning Tool on Children's Cartoons and Written Stories

    ERIC Educational Resources Information Center

    Madden, M.; Chung, P. W. H.; Dawson, C. W.

    2008-01-01

    This paper reports a study assessing a new computer tool for cartoon storytelling, created by the authors for a target audience in the upper half of the English and Welsh Key Stage 2 (years 5 and 6, covering ages 9-11 years). The tool attempts to provide users with more opportunities for expressive visualisation than previous educational software;…

  2. An Examination of Alternate Assessment Durations when Assessing Multiple-Skill Computational Fluency: The Generalizability and Dependability of Curriculum-Based Outcomes within the Context of Educational Decisions

    ERIC Educational Resources Information Center

    Christ, Theodore J.; Johnson-Gros, Kristin H.

    2005-01-01

    The current study extended previous research on curriculum-based measurement in mathematics (M-CBM) assessments. The purpose was to examine the generalizability and dependability of multiple-skill M-CBM computation assessments across various assessment durations (1, 2, 3, 4, 5, and 6 minutes). Results of generalizability and dependability studies…

  3. Functional Analysis and Preliminary Specifications for a Single Integrated Central Computer System for Secondary Schools and Junior Colleges. Interim Report.

    ERIC Educational Resources Information Center

    1968

    The present report proposes a central computing facility and presents the preliminary specifications for such a system. It is based, in part, on the results of earlier studies by two previous contractors on behalf of the U.S. Office of Education. The recommendations are based upon the present contractors considered evaluation of the earlier…

  4. Lake Erie Water Level Study. Appendix E. Power. Annex D. Computer Programs.

    DTIC Science & Technology

    1981-07-01

[OCR-garbled FORTRAN program listing; the only legible diagnostic reads: "IDENTIFICATION FOR NEW CASE IS THE SAME AS THAT FOR THE PREVIOUS ONE; PREVIOUS CASE HAS BEEN DELETED FROM TAPE."]

  5. A new graph-based method for pairwise global network alignment

    PubMed Central

    Klau, Gunnar W

    2009-01-01

    Background In addition to component-based comparative approaches, network alignments provide the means to study conserved network topology such as common pathways and more complex network motifs. Yet, unlike in classical sequence alignment, the comparison of networks becomes computationally more challenging, as most meaningful assumptions instantly lead to NP-hard problems. Most previous algorithmic work on network alignments is heuristic in nature. Results We introduce the graph-based maximum structural matching formulation for pairwise global network alignment. We relate the formulation to previous work and prove NP-hardness of the problem. Based on the new formulation we build upon recent results in computational structural biology and present a novel Lagrangian relaxation approach that, in combination with a branch-and-bound method, computes provably optimal network alignments. The Lagrangian algorithm alone is a powerful heuristic method, which produces solutions that are often near-optimal and – unlike those computed by pure heuristics – come with a quality guarantee. Conclusion Computational experiments on the alignment of protein-protein interaction networks and on the classification of metabolic subnetworks demonstrate that the new method is reasonably fast and has advantages over pure heuristics. Our software tool is freely available as part of the LISA library. PMID:19208162

  6. Deep brain stimulation with a pre-existing cochlear implant: Surgical technique and outcome.

    PubMed

    Eddelman, Daniel; Wewel, Joshua; Wiet, R Mark; Metman, Leo V; Sani, Sepehr

    2017-01-01

    Patients with previously implanted cranial devices pose a special challenge in deep brain stimulation (DBS) surgery. We report the implantation of bilateral DBS leads in a patient with a cochlear implant. Technical nuances and long-term interdevice functionality are presented. A 70-year-old patient with advancing Parkinson's disease and a previously placed cochlear implant for sensorineural hearing loss was referred for placement of bilateral DBS in the subthalamic nucleus (STN). Prior to DBS, the patient underwent surgical removal of the subgaleal cochlear magnet, followed by stereotactic MRI, frame placement, stereotactic computed tomography (CT), and merging of imaging studies. This technique allowed for successful computational merging, MRI-guided targeting, and lead implantation with acceptable accuracy. Formal testing and programming of both the devices were successful without electrical interference. Successful DBS implantation with high resolution MRI-guided targeting is technically feasible in patients with previously implanted cochlear implants by following proper precautions.

  7. Sediment and Hydraulic Measurements with Computed Bed Load on the Missouri River, Sioux City to Hermann, 2014

    DTIC Science & Technology

    2017-05-01

large sand bed river, with seven sites representing increasingly larger flows along the river length. The data set will be very useful for additional…quantity, quality, and types of data that can be obtained for the study of natural phenomena. The study of riverine sedimentation is no exception…detail than in previous years. Additionally, new methodologies have been developed that allow the computation of bed-load transport in large sand bed

  8. Mycotoxin Management Studies by USDA-"Ag Lab" in 2008

    USDA-ARS's Scientific Manuscript database

    Studies again included several popcorn fields in 2008, in order to continue gathering data for modification of the previously developed management strategies for mycotoxins in field corn (the mycotoxin predictive computer program). Weather conditions were generally good for growing corn, but excess...

  9. Mycotoxin Management Studies by USDA-ARS, NCAUR in 2009

    USDA-ARS's Scientific Manuscript database

    Studies again included several popcorn fields in 2009 in order to continue gathering data for modification of the previously developed management strategies for mycotoxins in field corn (including the mycotoxin predictive computer program). Without an attempt for optimization, the field corn model ...

  10. An automated method to find reaction mechanisms and solve the kinetics in organometallic catalysis.

    PubMed

    Varela, J A; Vázquez, S A; Martínez-Núñez, E

    2017-05-01

    A novel computational method is proposed in this work for use in discovering reaction mechanisms and solving the kinetics of transition metal-catalyzed reactions. The method does not rely on either chemical intuition or assumed a priori mechanisms, and it works in a fully automated fashion. Its core is a procedure, recently developed by one of the authors, that combines accelerated direct dynamics with an efficient geometry-based post-processing algorithm to find transition states (Martinez-Nunez, E., J. Comput. Chem. 2015, 36, 222-234). In the present work, several auxiliary tools have been added to deal with the specific features of transition metal catalytic reactions. As a test case, we chose the cobalt-catalyzed hydroformylation of ethylene because of its well-established mechanism, and the fact that it has already been used in previous automated computational studies. Besides the generally accepted mechanism of Heck and Breslow, several side reactions, such as hydrogenation of the alkene, emerged from our calculations. Additionally, the calculated rate law for the hydroformylation reaction agrees reasonably well with those obtained in previous experimental and theoretical studies.

  11. Informed public choices for low-carbon electricity portfolios using a computer decision tool.

    PubMed

    Mayer, Lauren A Fleishman; Bruine de Bruin, Wändi; Morgan, M Granger

    2014-04-01

    Reducing CO2 emissions from the electricity sector will likely require policies that encourage the widespread deployment of a diverse mix of low-carbon electricity generation technologies. Public discourse informs such policies. To make informed decisions and to productively engage in public discourse, citizens need to understand the trade-offs between electricity technologies proposed for widespread deployment. Building on previous paper-and-pencil studies, we developed a computer tool that aimed to help nonexperts make informed decisions about the challenges faced in achieving a low-carbon energy future. We report on an initial usability study of this interactive computer tool. After providing participants with comparative and balanced information about 10 electricity technologies, we asked them to design a low-carbon electricity portfolio. Participants used the interactive computer tool, which constrained portfolio designs to be realistic and yield low CO2 emissions. As they changed their portfolios, the tool updated information about projected CO2 emissions, electricity costs, and specific environmental impacts. As in the previous paper-and-pencil studies, most participants designed diverse portfolios that included energy efficiency, nuclear, coal with carbon capture and sequestration, natural gas, and wind. Our results suggest that participants understood the tool and used it consistently. The tool may be downloaded from http://cedmcenter.org/tools-for-cedm/informing-the-public-about-low-carbon-technologies/ .
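The core bookkeeping such a portfolio tool performs is a constrained weighted average: projected emissions are the sum of each technology's share times its emission intensity. The sketch below illustrates that arithmetic; the technology names, shares, and intensity values are hypothetical placeholders, not figures from the tool described above.

```python
# Hypothetical emission intensities, kg CO2 per MWh generated.
INTENSITY_KG_PER_MWH = {
    "coal_ccs": 150.0, "natural_gas": 450.0, "nuclear": 12.0,
    "wind": 11.0, "efficiency": 0.0,
}

def portfolio_emissions(shares):
    """Weighted-average emission intensity of a generation portfolio.

    `shares` maps technology name -> fraction of demand met; the fractions
    must sum to 1, mirroring the 'realistic portfolio' constraint in the tool.
    """
    assert abs(sum(shares.values()) - 1.0) < 1e-9, "shares must sum to 1"
    return sum(share * INTENSITY_KG_PER_MWH[tech]
               for tech, share in shares.items())

# A diverse mix like those most participants designed.
mix = {"coal_ccs": 0.2, "natural_gas": 0.2, "nuclear": 0.2,
       "wind": 0.3, "efficiency": 0.1}
```

Recomputing this sum after every slider change is what lets the tool give immediate feedback on projected CO2 emissions.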

  12. Autumn Algorithm-Computation of Hybridization Networks for Realistic Phylogenetic Trees.

    PubMed

    Huson, Daniel H; Linz, Simone

    2018-01-01

    A minimum hybridization network is a rooted phylogenetic network that displays two given rooted phylogenetic trees using a minimum number of reticulations. Previous mathematical work on their calculation has usually assumed the input trees to be bifurcating, correctly rooted, or that they both contain the same taxa. These assumptions do not hold in biological studies and "realistic" trees have multifurcations, are difficult to root, and rarely contain the same taxa. We present a new algorithm for computing minimum hybridization networks for a given pair of "realistic" rooted phylogenetic trees. We also describe how the algorithm might be used to improve the rooting of the input trees. We introduce the concept of "autumn trees", a nice framework for the formulation of algorithms based on the mathematics of "maximum acyclic agreement forests". While the main computational problem is hard, the run-time depends mainly on how different the given input trees are. In biological studies, where the trees are reasonably similar, our parallel implementation performs well in practice. The algorithm is available in our open source program Dendroscope 3, providing a platform for biologists to explore rooted phylogenetic networks. We demonstrate the utility of the algorithm using several previously studied data sets.

  13. Validation of a computer case definition for sudden cardiac death in opioid users

    PubMed Central

    2012-01-01

    Background To facilitate the use of automated databases for studies of sudden cardiac death, we previously developed a computerized case definition that had a positive predictive value between 86% and 88%. However, the definition has not been specifically validated for prescription opioid users, for whom out-of-hospital overdose deaths may be difficult to distinguish from sudden cardiac death. Findings We assembled a cohort of persons 30-74 years of age prescribed propoxyphene or hydrocodone who had no life-threatening non-cardiovascular illness, diagnosed drug abuse, residence in a nursing home in the past year, or hospital stay within the past 30 days. Medical records were sought for a sample of 140 cohort deaths within 30 days of a prescription fill meeting the computer case definition. Of the 140 sampled deaths, 81 were adjudicated; 73 (90%) were sudden cardiac deaths. Two deaths had possible opioid overdose; after removing these two the positive predictive value was 88%. Conclusions These findings are consistent with our previous validation studies and suggest the computer case definition of sudden cardiac death is a useful tool for pharmacoepidemiologic studies of opioid analgesics. PMID:22938531

  14. Computational open-channel hydraulics for movable-bed problems

    USGS Publications Warehouse

    Lai, Chintu; ,

    1990-01-01

    As a major branch of computational hydraulics, notable advances have been made in numerical modeling of unsteady open-channel flow since the beginning of the computer age. According to the broader definition and scope of 'computational hydraulics,' the basic concepts and technology of modeling unsteady open-channel flow have been systematically studied previously. As a natural extension, computational open-channel hydraulics for movable-bed problems are addressed in this paper. The introduction of the multimode method of characteristics (MMOC) has made the modeling of this class of unsteady flows both practical and effective. New modeling techniques are developed, thereby shedding light on several aspects of computational hydraulics. Some special features of movable-bed channel-flow simulation are discussed here in the same order as given by the author in the fixed-bed case.

  15. Reciprocity in computer-human interaction: source-based, norm-based, and affect-based explanations.

    PubMed

    Lee, Seungcheol Austin; Liang, Yuhua Jake

    2015-04-01

    Individuals often apply social rules when they interact with computers, and this is known as the Computers Are Social Actors (CASA) effect. Following previous work, one approach to understand the mechanism responsible for CASA is to utilize computer agents and have the agents attempt to gain human compliance (e.g., completing a pattern recognition task). The current study focuses on three key factors frequently cited to influence traditional notions of compliance: evaluations toward the source (competence and warmth), normative influence (reciprocity), and affective influence (mood). Structural equation modeling assessed the effects of these factors on human compliance with a computer agent's request. The final model shows that norm-based influence (reciprocity) increased the likelihood of compliance, while evaluations toward the computer agent did not significantly influence compliance.

  16. Mutual potential between two rigid bodies with arbitrary shapes and mass distributions

    NASA Astrophysics Data System (ADS)

    Hou, Xiyun; Scheeres, Daniel J.; Xin, Xiaosheng

    2017-03-01

    Formulae to compute the mutual potential, force, and torque between two rigid bodies are given. These formulae are expressed in Cartesian coordinates using inertia integrals. They are valid for rigid bodies with arbitrary shapes and mass distributions. By using recursive relations, these formulae can be easily implemented on computers. Comparisons with previous studies show their superiority in computation speed. Using the algorithm as a tool, the planar problem of two ellipsoids is studied. Generally, potential truncated at the second order is good enough for a qualitative description of the mutual dynamics. However, for ellipsoids with very large non-spherical terms, higher order terms of the potential should be considered, at the cost of a higher computational cost. Explicit formulae of the potential truncated to the fourth order are given.
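The quantity being computed can be illustrated by brute force: discretize each rigid body into point masses and evaluate the double sum directly. This O(N·M) reference calculation (function names invented here) is what the paper's recursive inertia-integral formulae replace with a far faster closed form.

```python
from math import dist

G = 6.674e-11  # gravitational constant, m^3 kg^-1 s^-2

def mutual_potential(masses_a, points_a, masses_b, points_b):
    """Mutual gravitational potential between two rigid bodies, each
    discretized into point masses:
        U = -G * sum_i sum_j m_i * m_j / |r_i - r_j|.
    A brute-force reference, not the recursive inertia-integral scheme."""
    return -G * sum(
        ma * mb / dist(pa, pb)
        for ma, pa in zip(masses_a, points_a)
        for mb, pb in zip(masses_b, points_b))

# Two unit point masses 1 m apart recover the two-body formula U = -G.
u = mutual_potential([1.0], [(0, 0, 0)], [1.0], [(1, 0, 0)])
```

For realistic shape models with many elements per body, the double sum becomes prohibitively slow inside a dynamics integrator, which is the speed gap the inertia-integral recursions close.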

  17. A computational study suggests that replacing PEG with PMOZ may increase exposure of hydrophobic targeting moiety.

    PubMed

    Magarkar, Aniket; Róg, Tomasz; Bunker, Alex

    2017-05-30

    In a previous study we showed that the cause of failure of a new, proposed, targeting ligand, the AETP moiety, when attached to a PEGylated liposome, was occlusion by the poly(ethylene glycol) (PEG) layer due to its hydrophobic nature, given that PEG is not entirely hydrophilic. At the time we proposed that replacement with a more hydrophilic protective polymer could alleviate this problem. In this study we have used computational molecular dynamics modelling, using a model with all atom resolution, to suggest that a specific alternative protective polymer, poly(2-methyloxazoline) (PMOZ), would perform exactly this function. Our results show that when PEG is replaced by PMOZ the relative exposure to the solvent of AETP is increased to a level even greater than what we found in previous simulations for the RGD peptide, a targeting moiety that has previously been used successfully in PEGylated liposome based therapies. While the AETP moiety itself is no longer under consideration, the results of this computational study have broader significance: the use of PMOZ as an alternative polymer coating to PEG could be efficacious in the context of more hydrophobic targeting ligands. In addition to PMOZ we studied another polyoxazoline, poly(2-ethyloxazoline) (PEOZ), that has also been mooted as a possible alternate protective polymer. It was also found that the RGD peptide occlusion was significantly greater for the case of both oxazolines as opposed to PEG and that, unlike PEG, neither oxazoline entered the membrane. As far as we are aware this is the first time that polyoxazolines have been studied using molecular dynamics simulation with all atom resolution. Copyright © 2017 Elsevier B.V. All rights reserved.

  18. Computational Analysis on Stent Geometries in Carotid Artery: A Review

    NASA Astrophysics Data System (ADS)

    Paisal, Muhammad Sufyan Amir; Taib, Ishkrizat; Ismail, Al Emran

    2017-01-01

    This paper reviews the work of previous researchers in order to gather information for the current study, which concerns the computational analysis of stent geometries in the carotid artery. Stent implantation in the carotid artery has become a popular treatment for arterial diseases associated with hypertension, such as stenosis, thrombosis, atherosclerosis and embolization, reducing the rates of mortality and morbidity. For arterial stenting, previous researchers developed many types of mathematical models in which the physiological variables of the artery are analogized to electrical variables. Computational fluid dynamics (CFD) analysis of the artery can then be performed, a method also used by previous researchers. This leads to the current study's aim of finding the hemodynamic characteristics of artery stenting, such as wall shear stress (WSS) and wall shear stress gradient (WSSG). Another objective of this study is to evaluate present-day stent configurations for full optimization in reducing arterial side effects such as the restenosis rate in the first weeks after stenting. Stents are evaluated on the basis of decreased strut-strut intersection, decreased strut width, and increased strut-strut spacing. Existing stent configurations are adequate for widening the narrowed arterial wall, but diseases such as thrombosis still occur in both the early and late stages after stent implantation. The expected outcome of this study is therefore a prediction of the reduction in restenosis rate, with the WSS distribution expected to allow classification of which stent configuration is best.

  19. Cortico-striatal language pathways dynamically adjust for syntactic complexity: A computational study.

    PubMed

    Szalisznyó, Krisztina; Silverstein, David; Teichmann, Marc; Duffau, Hugues; Smits, Anja

    2017-01-01

    A growing body of literature supports a key role of fronto-striatal circuits in language perception. It is now known that the striatum plays a role in engaging attentional resources and linguistic rule computation while also serving phonological short-term memory capabilities. The ventral semantic and the dorsal phonological stream dichotomy assumed for spoken language processing also seems to play a role in cortico-striatal perception. Based on recent studies that correlate deep Broca-striatal pathways with complex syntax performance, we used a previously developed computational model of frontal-striatal syntax circuits and hypothesized that different parallel language pathways may contribute separately to canonical and non-canonical sentence comprehension. We modified and further analyzed a thematic role assignment task and corresponding reservoir computing model of language circuits, as previously developed by Dominey and coworkers. We examined the model's performance under various parameter regimes, by influencing how fast the presented language input decays and altering the temporal dynamics of activated word representations. This enabled us to quantify canonical and non-canonical sentence comprehension abilities. The modeling results suggest that separate cortico-cortical and cortico-striatal circuits may be recruited differently for processing syntactically more difficult and less complicated sentences. Alternatively, a single circuit would need to dynamically and adaptively adjust to syntactic complexity. Copyright © 2016. Published by Elsevier Inc.
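The "input decay" manipulation mentioned above can be illustrated with a minimal leaky-memory sketch (a toy for that one parameter, not the reservoir model itself; the function name and sentences are invented): each word's activation shrinks by a constant factor for every subsequent word, so a slower decay preserves more of the sentence-initial words that non-canonical word orders depend on.

```python
def word_traces(sentence, decay):
    """Leaky word memory: every stored activation is multiplied by `decay`
    each time a new word arrives, then the new word is activated at 1.0.
    Returns the surviving activations after the full sentence."""
    traces = {}
    for word in sentence:
        for w in traces:          # age all existing traces
            traces[w] *= decay
        traces[word] = traces.get(word, 0.0) + 1.0
    return traces

# With slow decay the first word is still strongly active at sentence end;
# with fast decay it has largely faded.
slow = word_traces(["the", "dog", "bit", "man"], decay=0.9)
fast = word_traces(["the", "dog", "bit", "man"], decay=0.5)
```

Sweeping `decay` in a model like this is one concrete way to operationalize "how fast the presented language input decays" when quantifying comprehension of harder sentences.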

  20. Controversial electronic structures and energies of Fe₂, Fe₂⁺, and Fe₂⁻ resolved by RASPT2 calculations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hoyer, Chad E.; Manni, Giovanni Li; Truhlar, Donald G., E-mail: truhlar@umn.edu, E-mail: gagliard@umn.edu

    2014-11-28

    The diatomic molecule Fe₂ was investigated using restricted active space second-order perturbation theory (RASPT2). This molecule is very challenging to study computationally because predictions about the ground state and excited states depend sensitively on the choice of the quantum chemical method. For Fe₂ we show that one needs to go beyond a full-valence active space in order to achieve even qualitative agreement with experiment for the dissociation energy, and we also obtain a smooth ground-state potential curve. In addition we report the first multireference study of Fe₂⁺, for which we predict an ⁸Σᵤ⁻ ground state, which was not predicted by previous computational studies. By using an active space large enough to remove the most serious deficiencies of previous theoretical work and by explicitly investigating the interpretations of previous experimental results, this study elucidates previous difficulties and provides – for the first time – a qualitatively correct treatment of Fe₂, Fe₂⁺, and Fe₂⁻. Moreover, this study represents a record in terms of the number of active electrons and active orbitals in the active space, namely 16 electrons in 28 orbitals. Conventional CASPT2 calculations can be performed with at most 16 electrons in 16 orbitals. We were able to overcome this limit by using the RASPT2 formalism.

  1. Computer-generated predictions of the structure and of the IR and Raman spectra of VX. Final report, May-August 1992

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hameka, H.F.; Jensen, J.O.

    1993-05-01

    This report presents the computed optimized geometry and vibrational IR and Raman frequencies of the V-agent VX. The computations are performed with the Gaussian 90 Program Package using 6-31G* basis sets. We assign the vibrational frequencies and correct each frequency by multiplying it with a previously derived 6-31G* correction factor. The result is a computer-generated prediction of the IR and Raman spectra of VX. This study was intended as a blind test of the utility of IR spectral prediction. Therefore, we intentionally did not look at experimental data on the IR and Raman spectra of VX. Keywords: IR spectra, VX, Raman spectra, computer predictions.

  2. Assessment in health care education - modelling and implementation of a computer supported scoring process.

    PubMed

    Alfredsson, Jayne; Plichart, Patrick; Zary, Nabil

    2012-01-01

    Research on computer supported scoring of assessments in health care education has mainly focused on automated scoring. Little attention has been given to how informatics can support the currently predominant human-based grading approach. This paper reports the steps taken to develop a model for a computer supported scoring process that focuses on optimizing a task previously undertaken without computer support. The model was also implemented in the open source assessment platform TAO in order to study its benefits. The ability to score test takers anonymously, analytics on the graders' reliability, and a more time-efficient process are examples of the observed benefits. Computer supported scoring will increase the quality of assessment results.

  3. Effects of computer-based training on procedural modifications to standard functional analyses.

    PubMed

    Schnell, Lauren K; Sidener, Tina M; DeBar, Ruth M; Vladescu, Jason C; Kahng, SungWoo

    2018-01-01

    Few studies have evaluated methods for training decision-making when functional analysis data are undifferentiated. The current study evaluated computer-based training to teach 20 graduate students to arrange functional analysis conditions, analyze functional analysis data, and implement procedural modifications. Participants were exposed to training materials using interactive software during a 1-day session. Following the training, mean scores on the posttest, novel cases probe, and maintenance probe increased for all participants. These results replicate previous findings during a 1-day session and include a measure of participant acceptability of the training. Recommendations for future research on computer-based training and functional analysis are discussed. © 2017 Society for the Experimental Analysis of Behavior.

  4. Design of a modular digital computer system, CDRL no. D001, final design plan

    NASA Technical Reports Server (NTRS)

    Easton, R. A.

    1975-01-01

    The engineering breadboard implementation for the CDRL no. D001 modular digital computer system developed during design of the logic system was documented. This effort followed the architecture study completed and documented previously, and was intended to verify the concepts of a fault tolerant, automatically reconfigurable, modular version of the computer system conceived during the architecture study. The system has a microprogrammed 32 bit word length, general register architecture and an instruction set consisting of a subset of the IBM System 360 instruction set plus additional fault tolerance firmware. The following areas were covered: breadboard packaging, central control element, central processing element, memory, input/output processor, and maintenance/status panel and electronics.

  5. A computational study of coherent structures in the wakes of two-dimensional bluff bodies

    NASA Astrophysics Data System (ADS)

    Pearce, Jeffrey Alan

    1988-08-01

    The periodic shedding of vortices from bluff bodies was first recognized in the late 1800's. Currently, there is great interest concerning the effect of vortex shedding on structures and on vehicle stability. In the design of bluff structures which will be exposed to a flow, knowledge of the shedding frequency and the amplitude of the aerodynamic forces is critical. The ability to computationally predict parameters associated with periodic vortex shedding is thus a valuable tool. In this study, the periodic shedding of vortices from several bluff body geometries is predicted. The study is conducted with a two-dimensional finite-difference code employed on various grid sizes. The effects of the grid size and time step on the accuracy of the solution are addressed. Strouhal numbers and aerodynamic force coefficients are computed for all of the bodies considered and compared with previous experimental results. Results indicate that the finite-difference code is capable of predicting periodic vortex shedding for all of the geometries tested. Refinement of the finite-difference grid was found to give little improvement in the prediction; however, the choice of time step size was shown to be critical. Predictions of Strouhal numbers were generally accurate, and the calculated aerodynamic forces generally exhibited behavior consistent with previous studies.
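The Strouhal number mentioned above nondimensionalizes the shedding frequency as St = f·D/U, so a predicted or measured St converts directly to a physical frequency. A minimal sketch (function name and the numbers in the example are illustrative; St ≈ 0.2 is the commonly quoted textbook value for a circular cylinder at moderate Reynolds numbers):

```python
def shedding_frequency(strouhal, velocity, length):
    """Vortex-shedding frequency f from the Strouhal relation St = f * D / U.

    strouhal -- Strouhal number (dimensionless)
    velocity -- freestream speed U, m/s
    length   -- characteristic body dimension D, m
    """
    return strouhal * velocity / length

# Example: a D = 0.1 m cylinder in a U = 10 m/s stream with St = 0.2
# sheds vortices at about 20 Hz.
f = shedding_frequency(0.2, 10.0, 0.1)
```

This is why accurate Strouhal-number prediction matters for design: the computed frequency is what must be kept away from a structure's natural frequencies.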

  6. Computing nucleon EDM on a lattice

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Abramczyk, Michael; Izubuchi, Taku

    I will discuss briefly recent changes in the methodology of computing the baryon EDM on a lattice. The associated correction substantially reduces presently existing lattice values for the proton and neutron theta-induced EDMs, so that even the most precise previous lattice results become consistent with zero. On one hand, this change removes previous disagreements between these lattice results and the phenomenological estimates of the nucleon EDM. On the other hand, the nucleon EDM becomes much harder to compute on a lattice. In addition, I will review the progress in computing quark chromo-EDM-induced nucleon EDM using chiral quark action.

  7. The discovery of the causes of leprosy: A computational analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Corruble, V.; Ganascia, J.G.

    1996-12-31

    The role played by inductive inference has been studied extensively in the field of scientific discovery. The work presented here tackles the problem of induction in medical research. The discovery of the causes of leprosy is analyzed and simulated using computational means. An inductive algorithm is proposed, which is successful in simulating some essential steps in the progress of the understanding of the disease. It also allows us to simulate the false reasoning of previous centuries through the introduction of some medical a priori inherited from archaic medicine. Corroborating previous research, this problem illustrates the importance of the social and cultural environment on the way inductive inference is performed in medicine.

  8. Validation of a computational knee joint model using an alignment method for the knee laxity test and computed tomography.

    PubMed

    Kang, Kyoung-Tak; Kim, Sung-Hwan; Son, Juhyun; Lee, Young Han; Koh, Yong-Gon

    2017-01-01

    Computational models have been identified as efficient techniques in the clinical decision-making process. However, in most previous studies the computational model was validated using published data, and the kinematic validation of such models remains a challenge. Recently, studies using medical imaging have provided a more accurate visualization of knee joint kinematics. The purpose of the present study was to perform kinematic validation of a subject-specific computational knee joint model by comparison with the subject's medical imaging under identical laxity conditions. The laxity test was applied to the anterior-posterior drawer under 90° flexion and the varus-valgus under 20° flexion with a series of stress radiographs, a Telos device, and computed tomography. The loading condition in the computational subject-specific knee joint model was identical to the laxity test condition in the medical image. Our computational model showed knee laxity kinematic trends that were consistent with the computed tomography images, except for negligible differences because of the indirect application of the subject's in vivo material properties. Medical imaging based on computed tomography with the laxity test allowed us to measure not only the precise translation but also the rotation of the knee joint. This methodology will be beneficial in the validation of laxity tests for subject- or patient-specific computational models.

  9. Genetic mapping of 15 human X chromosomal forensic short tandem repeat (STR) loci by means of multi-core parallelization.

    PubMed

    Diegoli, Toni Marie; Rohde, Heinrich; Borowski, Stefan; Krawczak, Michael; Coble, Michael D; Nothnagel, Michael

    2016-11-01

    Typing of X chromosomal short tandem repeat (X STR) markers has become a standard element of human forensic genetic analysis. Joint consideration of many X STR markers at a time increases their discriminatory power but, owing to physical linkage, requires inter-marker recombination rates to be accurately known. We estimated the recombination rates between 15 well established X STR markers using genotype data from 158 families (1041 individuals) and following a previously proposed likelihood-based approach that allows for single-step mutations. To meet the computational requirements of this family-based type of analysis, we modified a previous implementation so as to allow multi-core parallelization on a high-performance computing system. While we obtained recombination rate estimates larger than zero for all but one pair of adjacent markers within the four previously proposed linkage groups, none of the three X STR pairs defining the junctions of these groups yielded a recombination rate estimate of 0.50. Corroborating previous studies, our results therefore argue against a simple model of independent X chromosomal linkage groups. Moreover, the refined recombination fraction estimates obtained in our study will facilitate the appropriate joint consideration of all 15 investigated markers in forensic analysis. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
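The core of any such estimate is a likelihood in the recombination fraction θ. The study's family-based method additionally models single-step mutations and pedigree structure, which is what made parallelization necessary; the sketch below shows only the classical phase-known two-point case (function names are mine), where the likelihood is θ^R·(1−θ)^N over R recombinant and N non-recombinant informative meioses and the maximizer has a closed form.

```python
from math import log

def log_likelihood(theta, recomb, non_recomb):
    """Log-likelihood of recombination fraction theta given counts of
    recombinant and non-recombinant informative meioses (phase known):
        log L = R*log(theta) + N*log(1 - theta)."""
    return recomb * log(theta) + non_recomb * log(1.0 - theta)

def theta_mle(recomb, non_recomb):
    """Closed-form maximizer of the likelihood above: theta_hat = R/(R+N)."""
    return recomb / (recomb + non_recomb)

# 3 recombinants among 60 informative meioses -> theta_hat = 0.05.
theta_hat = theta_mle(3, 57)
```

A recombination rate estimate near 0.50 between two markers, as tested at the linkage-group junctions above, would indicate effectively free recombination; the study's estimates below 0.50 are what argue against fully independent linkage groups.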

  10. Computational and Experimental Study of Supersonic Nozzle Flow and Shock Interactions

    NASA Technical Reports Server (NTRS)

    Carter, Melissa B.; Elmiligui, Alaa A.; Nayani, Sudheer N.; Castner, Ray; Bruce, Walter E., IV; Inskeep, Jacob

    2015-01-01

    This study focused on the capability of the NASA Tetrahedral Unstructured Software System CFD code USM3D to predict the interaction between a shock and a supersonic plume flow. Previous studies, published in 2004, 2009 and 2013, investigated USM3D's supersonic plume flow results against historical experimental data. The current study builds on that research by utilizing the best practices from the earlier papers for properly capturing the plume flow and then adding a wedge acting as a shock generator. This computational study is in conjunction with experimental tests conducted at the Glenn Research Center 1'x1' Supersonic Wind Tunnel. The comparison of the computational and experimental data shows good agreement for the location and strength of the shocks, although there are vertical shifts between the data sets that may be due to the measurement technique.

  11. Training Maneuver Evaluation for Reduced Order Modeling of Stability & Control Properties Using Computational Fluid Dynamics

    DTIC Science & Technology

    2013-03-01

    reduced order model is created. Finally, previous research in this area of study will be examined, and its application to this research will be...

  12. From the front line, report from a near paperless hospital: mixed reception among health care professionals.

    PubMed

    Lium, Jan-Tore; Laerum, Hallvard; Schulz, Tom; Faxvaag, Arild

    2006-01-01

    Many Norwegian hospitals that are equipped with an electronic medical record (EMR) system have now proceeded to withdraw the paper-based medical record from clinical workflow. In two previous survey-based studies on the effect of removing the paper-based medical record on the work of physicians, nurses and medical secretaries, we concluded that scanning and eliminating the paper-based record was feasible, but that the medical secretaries were the group that reported benefiting the most from the change. To further explore the effects of removing the paper-based record, especially with regard to medical personnel, we have now conducted a follow-up study of a hospital that has scanned and eliminated its paper-based record. A survey of 27 physicians, 60 nurses and 30 medical secretaries was conducted. The results were compared with those from a previous study conducted three years earlier at the same department. The questionnaire (see online Appendix) covered the frequency of use of the EMR system for specific tasks by physicians, nurses and medical secretaries, the ease of performing these tasks compared to previous routines, user satisfaction and computer literacy. Both physicians and nurses displayed increased use of the EMR compared to the previous study, while medical secretaries reported generally unchanged but high use. The increase in use was not accompanied by a similar change in factors such as computer literacy or technical changes, suggesting that these typical success factors are necessary but not sufficient.

  13. Experimental Realization of High-Efficiency Counterfactual Computation.

    PubMed

    Kong, Fei; Ju, Chenyong; Huang, Pu; Wang, Pengfei; Kong, Xi; Shi, Fazhan; Jiang, Liang; Du, Jiangfeng

    2015-08-21

    Counterfactual computation (CFC) exemplifies the fascinating quantum process by which the result of a computation may be learned without actually running the computer. In previous experimental studies, the counterfactual efficiency is limited to below 50%. Here we report an experimental realization of the generalized CFC protocol, in which the counterfactual efficiency can break the 50% limit and even approach unity in principle. The experiment is performed with the spins of a negatively charged nitrogen-vacancy color center in diamond. Taking advantage of the quantum Zeno effect, the computer can remain in the not-running subspace due to the frequent projection by the environment, while the computation result can be revealed by final detection. The counterfactual efficiency up to 85% has been demonstrated in our experiment, which opens the possibility of many exciting applications of CFC, such as high-efficiency quantum integration and imaging.
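The "frequent projection" mechanism invoked above follows the textbook quantum Zeno scaling (a generic sketch, not the NV-center protocol itself): if the probe is rotated by π/(2n) before each of n projective measurements, the probability of remaining in the initial ("not running") subspace is cos(π/(2n))^(2n), which approaches 1 as n grows, which is what allows the counterfactual efficiency to approach unity in principle.

```python
from math import cos, pi

def zeno_success_probability(n_cycles):
    """Probability of staying in the 'not running' subspace after n quantum
    Zeno cycles, each a small rotation by pi/(2n) followed by a projection:
        P(n) = cos(pi / (2n)) ** (2n)  ->  1 as n -> infinity."""
    return cos(pi / (2 * n_cycles)) ** (2 * n_cycles)

# With a single cycle the probe rotates fully out of the subspace (P ~ 0);
# with many cycles the projections pin it in place and P approaches 1.
```

The roughly 85% efficiency reported in the experiment corresponds to operating at a finite number of cycles with realistic imperfections, rather than the ideal large-n limit.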

  14. Experimental Realization of High-Efficiency Counterfactual Computation

    NASA Astrophysics Data System (ADS)

    Kong, Fei; Ju, Chenyong; Huang, Pu; Wang, Pengfei; Kong, Xi; Shi, Fazhan; Jiang, Liang; Du, Jiangfeng

    2015-08-01

    Counterfactual computation (CFC) exemplifies the fascinating quantum process by which the result of a computation may be learned without actually running the computer. In previous experimental studies, the counterfactual efficiency was limited to below 50%. Here we report an experimental realization of the generalized CFC protocol, in which the counterfactual efficiency can break the 50% limit and even approach unity in principle. The experiment is performed with the spins of a negatively charged nitrogen-vacancy color center in diamond. Taking advantage of the quantum Zeno effect, the computer can remain in the not-running subspace due to the frequent projection by the environment, while the computation result can be revealed by final detection. A counterfactual efficiency of up to 85% has been demonstrated in our experiment, which opens the possibility of many exciting applications of CFC, such as high-efficiency quantum integration and imaging.

  15. Segmentation of cortical bone using fast level sets

    NASA Astrophysics Data System (ADS)

    Chowdhury, Manish; Jörgens, Daniel; Wang, Chunliang; Smedby, Örjan; Moreno, Rodrigo

    2017-02-01

    Cortical bone plays a major role in the mechanical competence of bone. The analysis of cortical bone requires accurate segmentation methods. Level set methods are among the state of the art for segmenting medical images. However, traditional implementations of this method are computationally expensive. This drawback was recently tackled through the so-called coherent propagation extension of the classical algorithm, which has decreased computation times dramatically. In this study, we assess the potential of this technique for segmenting cortical bone in interactive time in 3D images acquired through High Resolution peripheral Quantitative Computed Tomography (HR-pQCT). The obtained segmentations are used to estimate cortical thickness and cortical porosity of the investigated images. Cortical thickness and cortical porosity are computed using sphere fitting and mathematical morphology operations, respectively. Qualitative comparison between the segmentations of our proposed algorithm and a previously published approach on six image volumes reveals superior smoothness properties of the level set approach. While the proposed method yields similar results to previous approaches in regions where the boundary between trabecular and cortical bone is well defined, it yields more stable segmentations in challenging regions. This results in more stable estimation of parameters of cortical bone. The proposed technique takes a few seconds to compute, which makes it suitable for clinical settings.
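
    As an illustration of the porosity step, intracortical voids can be found with standard morphological operations: close the binary bone mask to approximate the full cortical envelope, then count enclosed non-bone voxels. This is a sketch with an assumed closing radius, not the authors' implementation:

```python
import numpy as np
from scipy import ndimage

def cortical_porosity(bone_mask, closing_radius=2):
    """Fraction of intracortical void voxels: close the binary bone mask to
    approximate the full cortical envelope, then count enclosed non-bone
    voxels. closing_radius is an assumed parameter."""
    struct = ndimage.generate_binary_structure(3, 1)  # 6-connectivity
    envelope = ndimage.binary_closing(bone_mask, structure=struct,
                                      iterations=closing_radius)
    pores = envelope & ~bone_mask
    return pores.sum() / envelope.sum()
```

    In practice the closing radius must exceed the largest pore diameter (in voxels) for the envelope to cover all voids.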

  16. Study of Geometric Porosity on Static Stability and Drag Using Computational Fluid Dynamics for Rigid Parachute Shapes

    NASA Technical Reports Server (NTRS)

    Greathouse, James S.; Schwing, Alan M.

    2015-01-01

    This paper explores the use of computational fluid dynamics to study the effect of geometric porosity on static stability and drag for NASA's Multi-Purpose Crew Vehicle main parachute. Both of these aerodynamic characteristics are of interest in parachute design, and computational methods promise designers the ability to perform detailed parametric studies and other design iterations with a level of control previously unobtainable using ground or flight testing. The approach presented here uses a canopy structural analysis code to define the inflated parachute shapes on which structured computational grids are generated. These grids are used by the computational fluid dynamics code OVERFLOW and are modeled as rigid, impermeable bodies for this analysis. Comparisons to Apollo drop test data are shown as preliminary validation of the technique. Results include several parametric sweeps through design variables in order to better understand the trade between static stability and drag. Finally, designs that maximize static stability with a minimal loss in drag are suggested for further study in subscale ground and flight testing.

  17. An investigation of the effects of touchpad location within a notebook computer.

    PubMed

    Kelaher, D; Nay, T; Lawrence, B; Lamar, S; Sommerich, C M

    2001-02-01

    This study evaluated effects of the location of a notebook computer's integrated touchpad, complementing previous work on desktop mouse location effects. Integrated touchpads are most often located in the computer's wrist rest, centered on the keyboard. This study characterized effects of this bottom center location and four alternatives (top center, top right, right side, and bottom right) upon upper extremity posture, discomfort, preference, and performance. Touchpad location was found to significantly impact each of those measures. The top center location was particularly poor, in that it elicited more ulnar deviation, more shoulder flexion, more discomfort, and perceptions of performance impedance. In general, the bottom center, bottom right, and right side locations fared better, though subjects' wrists were more extended in the bottom locations. Suggestions for notebook computer design are provided.

  18. Supplemental computational phantoms to estimate out-of-field absorbed dose in photon radiotherapy

    NASA Astrophysics Data System (ADS)

    Gallagher, Kyle J.; Tannous, Jaad; Nabha, Racile; Feghali, Joelle Ann; Ayoub, Zeina; Jalbout, Wassim; Youssef, Bassem; Taddei, Phillip J.

    2018-01-01

    The purpose of this study was to develop a straightforward method of supplementing patient anatomy and estimating out-of-field absorbed dose for a cohort of pediatric radiotherapy patients with limited recorded anatomy. A cohort of nine children, aged 2-14 years, who received 3D conformal radiotherapy for low-grade localized brain tumors (LBTs), were randomly selected for this study. The extent of these patients’ computed tomography simulation image sets was cranial only. To approximate their missing anatomy, we supplemented the LBT patients’ image sets with computed tomography images of patients in a previous study with larger extents of matched sex, height, and mass and for whom contours of organs at risk for radiogenic cancer had already been delineated. Rigid fusion was performed between the LBT patients’ data and that of the supplemental computational phantoms using commercial software and in-house codes. In-field dose was calculated with a clinically commissioned treatment planning system, and out-of-field dose was estimated with a previously developed analytical model that was re-fit with parameters based on new measurements for intracranial radiotherapy. Mean doses greater than 1 Gy were found in the red bone marrow, remainder, thyroid, and skin of the patients in this study. Mean organ doses between 150 mGy and 1 Gy were observed in the breast tissue of the girls and lungs of all patients. Distant organs, i.e. prostate, bladder, uterus, and colon, received mean organ doses less than 150 mGy. The mean organ doses of the younger, smaller LBT patients (0-4 years old) were a factor of 2.4 greater than those of the older, larger patients (8-12 years old). Our findings demonstrated the feasibility of a straightforward method of applying supplemental computational phantoms and dose-calculation models to estimate absorbed dose for a set of children of various ages who received radiotherapy and for whom anatomies were largely missing in their original computed tomography simulations.
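
    The analytical out-of-field model itself is not specified in this record; one common functional form in the literature is an exponential fall-off with distance from the field edge plus a constant leakage term. The function and every parameter value below are illustrative assumptions, not the re-fit parameters of the study:

```python
import math

def out_of_field_dose_per_gy(distance_cm, a=0.02, mu=0.08, leak=0.001):
    """Illustrative analytical out-of-field model: exponential fall-off with
    distance from the field edge plus a constant leakage term. Returns
    absorbed dose per Gy of prescribed dose; all parameters are hypothetical."""
    return a * math.exp(-mu * distance_cm) + leak
```

    Such a model is typically fit to thermoluminescent dosimeter or ion-chamber measurements taken at increasing distances from the field edge.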

  19. Optical signal processing using photonic reservoir computing

    NASA Astrophysics Data System (ADS)

    Salehi, Mohammad Reza; Dehyadegari, Louiza

    2014-10-01

    As a new approach to recognition and classification problems, photonic reservoir computing offers advantages such as parallel information processing, power efficiency and high speed. In this paper, a photonic structure is proposed for reservoir computing and investigated using a simple, yet nontrivial, noisy time series prediction task. This study includes the application of a suitable topology with self-feedbacks in a network of SOAs (semiconductor optical amplifiers) - which lends the system a strong memory - and, with adequately tuned parameters, results in perfect recognition accuracy (100%) for noise-free time series, a 3% improvement over previous results. For the classification of noisy time series, accuracy increased by 4%, to 96%. Furthermore, an analytical approach is suggested for solving the rate equations, which leads to a substantial decrease in simulation time - an important parameter in the classification of large signals such as speech - and yields better results than previous works.
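
    The reservoir idea in this record can be illustrated with a software analogue; the photonic SOA network itself is not reproduced here. The sketch below is a generic leaky echo-state reservoir with assumed sizes and parameters, whose state sequence would feed a trained linear readout:

```python
import numpy as np

rng = np.random.default_rng(0)

def esn_features(u, n_res=50, rho=0.9, leak=0.5):
    """Run a leaky echo-state reservoir over a 1-D input series u and return
    the sequence of reservoir states (one row per time step); a linear
    readout trained on these states performs the prediction/classification."""
    w_in = rng.uniform(-1.0, 1.0, n_res)
    w = rng.uniform(-1.0, 1.0, (n_res, n_res))
    w *= rho / np.max(np.abs(np.linalg.eigvals(w)))  # set spectral radius
    x = np.zeros(n_res)
    states = []
    for u_t in u:
        # leaky integration gives the reservoir its fading memory
        x = (1.0 - leak) * x + leak * np.tanh(w @ x + w_in * u_t)
        states.append(x.copy())
    return np.asarray(states)
```

    The self-feedback (recurrent matrix `w`) plays the role of the SOA self-feedback loops in the photonic network: it is what gives the system memory of past inputs.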

  20. Semiautomated skeletonization of the pulmonary arterial tree in micro-CT images

    NASA Astrophysics Data System (ADS)

    Hanger, Christopher C.; Haworth, Steven T.; Molthen, Robert C.; Dawson, Christopher A.

    2001-05-01

    We present a simple and robust approach that utilizes planar images at different angular rotations combined with unfiltered back-projection to locate the central axes of the pulmonary arterial tree. Three-dimensional points are selected interactively by the user. The computer calculates a sub-volume unfiltered back-projection orthogonal to the vector connecting the two points and centered on the first point. Because more x-rays are absorbed at the thickest portion of the vessel, in the unfiltered back-projection, the darkest pixel is assumed to be the center of the vessel. The computer replaces this point with the newly computer-calculated point. A second back-projection is calculated around the original point orthogonal to a vector connecting the newly-calculated first point and the user-determined second point. The darkest pixel within the reconstruction is determined. The computer then replaces the second point with the XYZ coordinates of the darkest pixel within this second reconstruction. Following a vector based on a moving average of previously determined 3-dimensional points along the vessel's axis, the computer continues this skeletonization process until stopped by the user. The computer estimates the vessel diameter along the set of previously determined points using a method similar to the full-width half-max algorithm. On all subsequent vessels, the process works the same way except that at each point, distances between the current point and all previously determined points along different vessels are determined. If the distance is less than the previously estimated diameter, the vessels are assumed to branch. This user/computer interaction continues until the vascular tree has been skeletonized.
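
    Two steps of this procedure lend themselves to a compact sketch: the moving-average stepping direction along the vessel axis and the branch test against previously skeletonized vessels. The helper names and the averaging window below are illustrative, not the authors' code:

```python
import numpy as np

def step_direction(points, window=5):
    """Unit direction vector from a moving average of the most recent
    centerline points (the tracking vector in the procedure above)."""
    pts = np.asarray(points[-window:], dtype=float)
    d = pts[-1] - pts[0]
    n = np.linalg.norm(d)
    return d / n if n > 0 else d

def is_branch(point, other_vessel_points, est_diameter):
    """Declare a branch when the current point comes within one estimated
    vessel diameter of any previously skeletonized centerline point."""
    pts = np.asarray(other_vessel_points, dtype=float)
    dists = np.linalg.norm(pts - np.asarray(point, dtype=float), axis=1)
    return bool(dists.min() < est_diameter)
```
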

  1. Spaceborne computer executive routine functional design specification. Volume 2: Computer executive design for space station/base

    NASA Technical Reports Server (NTRS)

    Kennedy, J. R.; Fitzpatrick, W. S.

    1971-01-01

    The computer executive functional system design concepts derived from study of the Space Station/Base are presented. The Information Management System hardware configuration, as it directly influences the executive design, is reviewed. The hardware configuration and generic executive design requirements are considered in detail in a previous report (System Configuration and Executive Requirements Specifications for Reusable Shuttle and Space Station/Base, 9/25/70). This report defines basic system primitives and delineates processes and process control. Supervisor states are considered for describing basic multiprogramming and multiprocessing systems. A high-level computer executive including control of scheduling, allocation of resources, system interactions, and real-time supervisory functions is defined. The description is oriented to provide a baseline for a functional simulation of the computer executive system.

  2. Trusted measurement model based on multitenant behaviors.

    PubMed

    Ning, Zhen-Hu; Shen, Chang-Xiang; Zhao, Yong; Liang, Peng

    2014-01-01

    With the rapid growth of pervasive computing, especially cloud computing, behaviour measurement is at the core and plays a vital role. A new behaviour measurement tailored to multitenants in cloud computing is urgently needed to establish a fundamental trust relationship. Based on our previous research, we propose an improved trust relationship scheme which captures the world of cloud computing, where multitenants share the same physical computing platform. Here, we first present the related work on multitenant behaviour; secondly, we give the scheme of behaviour measurement, where the decoupling of multitenants is taken into account; thirdly, we explicitly explain our decoupling algorithm for multitenants; fourthly, we introduce a new way of calculating similarity for deviation control, which fits the coupled multitenants under study well; lastly, we design experiments to test our scheme.
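
    The similarity calculation used for deviation control is not specified in this record; a minimal sketch, assuming a simple cosine similarity between behaviour feature vectors and an assumed threshold, might look like:

```python
import numpy as np

def behaviour_similarity(observed, baseline):
    """Cosine similarity between an observed behaviour feature vector and a
    trusted baseline; values near 1 indicate conforming behaviour."""
    a = np.asarray(observed, dtype=float)
    b = np.asarray(baseline, dtype=float)
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def deviates(observed, baseline, threshold=0.9):
    """Flag a tenant whose behaviour falls below the similarity threshold."""
    return behaviour_similarity(observed, baseline) < threshold
```
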

  3. Trusted Measurement Model Based on Multitenant Behaviors

    PubMed Central

    Ning, Zhen-Hu; Shen, Chang-Xiang; Zhao, Yong; Liang, Peng

    2014-01-01

    With the rapid growth of pervasive computing, especially cloud computing, behaviour measurement is at the core and plays a vital role. A new behaviour measurement tailored to multitenants in cloud computing is urgently needed to establish a fundamental trust relationship. Based on our previous research, we propose an improved trust relationship scheme which captures the world of cloud computing, where multitenants share the same physical computing platform. Here, we first present the related work on multitenant behaviour; secondly, we give the scheme of behaviour measurement, where the decoupling of multitenants is taken into account; thirdly, we explicitly explain our decoupling algorithm for multitenants; fourthly, we introduce a new way of calculating similarity for deviation control, which fits the coupled multitenants under study well; lastly, we design experiments to test our scheme. PMID:24987731

  4. Improving Student Content Knowledge in Inclusive Social Studies Classrooms Using Technology-Based Cognitive Organizers: A Systematic Replication

    ERIC Educational Resources Information Center

    Boon, Richard T.; Burke, Mack D.; Fore, Cecil, III; Hagan-Burke, Shanna

    2006-01-01

    The purpose of this study was to conduct a systematic replication of a previous study (Boon, Burke, Fore, & Spencer, 2006) on the effects of computer-generated cognitive organizers using Inspiration 6 software versus a traditional textbook instruction format on students' ability to comprehend social studies content information in high school…

  5. Structural Optimization Methodology for Rotating Disks of Aircraft Engines

    NASA Technical Reports Server (NTRS)

    Armand, Sasan C.

    1995-01-01

    In support of the preliminary evaluation of various engine technologies, a methodology has been developed for structurally designing the rotating disks of an aircraft engine. The structural design methodology, along with a previously derived methodology for predicting low-cycle fatigue life, was implemented in a computer program. An interface computer program was also developed that gathers the required data from a flowpath analysis program (WATE) being used at NASA Lewis. The computer program developed for this study requires minimum interaction with the user, thus allowing engineers with varying backgrounds in aeropropulsion to successfully execute it. The stress analysis portion of the methodology and the computer program were verified by employing the finite element analysis method. The 10th-stage, high-pressure-compressor disk of the Energy Efficient Engine Program (E3) engine was used to verify the stress analysis; the differences between the stresses and displacements obtained from the computer program developed for this study and from the finite element analysis were all below 3 percent for the problem solved. The computer program developed for this study was employed to structurally optimize the rotating disks of the E3 high-pressure compressor. The rotating disks designed by the computer program in this study were approximately 26 percent lighter than calculated from the E3 drawings. The methodology is presented herein.

  6. Nutrition Education for School-Aged Children: A Review of Research.

    ERIC Educational Resources Information Center

    Lytle, Leslie A.

    This review of research on nutrition education for school-aged children includes 17 articles published since 1980 and not included in two previous reviews (13 school-based and 4 outside of school). School-based studies included families and home environments, program institutionalization, using computer systems, knowledge-based studies, and…

  7. Hierarchical Bayesian Models of Subtask Learning

    ERIC Educational Resources Information Center

    Anglim, Jeromy; Wynton, Sarah K. A.

    2015-01-01

    The current study used Bayesian hierarchical methods to challenge and extend previous work on subtask learning consistency. A general model of individual-level subtask learning was proposed focusing on power and exponential functions with constraints to test for inconsistency. To study subtask learning, we developed a novel computer-based booking…

  8. Aesthetics, Usefulness and Performance in User--Search-Engine Interaction

    ERIC Educational Resources Information Center

    Katz, Adi

    2010-01-01

    Issues of visual appeal have become an integral part of designing interactive systems. Interface aesthetics may form users' attitudes towards computer applications and information technology. Aesthetics can affect user satisfaction, and influence their willingness to buy or adopt a system. This study follows previous studies that found that users…

  9. PREDICTORS OF COMPUTER USE IN COMMUNITY-DWELLING ETHNICALLY DIVERSE OLDER ADULTS

    PubMed Central

    Werner, Julie M.; Carlson, Mike; Jordan-Marsh, Maryalice; Clark, Florence

    2011-01-01

    Objective In this study we analyzed self-reported computer use, demographic variables, psychosocial variables, and health and well-being variables collected from 460 ethnically diverse, community-dwelling elders in order to investigate the relationship computer use has with demographics, well-being and other key psychosocial variables in older adults. Background Although younger elders with more education, those who employ active coping strategies, or those who are low in anxiety levels are thought to use computers at higher rates than others, previous research has produced mixed or inconclusive results regarding ethnic, gender, and psychological factors, or has concentrated on computer-specific psychological factors only (e.g., computer anxiety). Few such studies have employed large sample sizes or have focused on ethnically diverse populations of community-dwelling elders. Method With a large number of overlapping predictors, zero-order analysis alone is poorly equipped to identify variables that are independently associated with computer use. Accordingly, both zero-order and stepwise logistic regression analyses were conducted to determine the correlates of two types of computer use: email and general computer use. Results Results indicate that younger age, greater level of education, non-Hispanic ethnicity, behaviorally active coping style, general physical health, and role-related emotional health each independently predicted computer usage. Conclusion Study findings highlight differences in computer usage, especially in regard to Hispanic ethnicity and specific health and well-being factors. Application Potential applications of this research include future intervention studies, individualized computer-based activity programming, or customizable software and user interface design for older adults responsive to a variety of personal characteristics and capabilities. PMID:22046718

  10. Predictors of computer use in community-dwelling, ethnically diverse older adults.

    PubMed

    Werner, Julie M; Carlson, Mike; Jordan-Marsh, Maryalice; Clark, Florence

    2011-10-01

    In this study, we analyzed self-reported computer use, demographic variables, psychosocial variables, and health and well-being variables collected from 460 ethnically diverse, community-dwelling elders to investigate the relationship computer use has with demographics, well-being, and other key psychosocial variables in older adults. Although younger elders with more education, those who employ active coping strategies, or those who are low in anxiety levels are thought to use computers at higher rates than do others, previous research has produced mixed or inconclusive results regarding ethnic, gender, and psychological factors or has concentrated on computer-specific psychological factors only (e.g., computer anxiety). Few such studies have employed large sample sizes or have focused on ethnically diverse populations of community-dwelling elders. With a large number of overlapping predictors, zero-order analysis alone is poorly equipped to identify variables that are independently associated with computer use. Accordingly, both zero-order and stepwise logistic regression analyses were conducted to determine the correlates of two types of computer use: e-mail and general computer use. Results indicate that younger age, greater level of education, non-Hispanic ethnicity, behaviorally active coping style, general physical health, and role-related emotional health each independently predicted computer usage. Study findings highlight differences in computer usage, especially in regard to Hispanic ethnicity and specific health and well-being factors. Potential applications of this research include future intervention studies, individualized computer-based activity programming, or customizable software and user interface design for older adults responsive to a variety of personal characteristics and capabilities.

  11. A comparison of African-American and Caucasian college students' attitudes toward computers

    NASA Astrophysics Data System (ADS)

    Luckett, Pamela Gail

    1997-09-01

    As computer usage becomes mandatory on college campuses across the world, the issue of examining students' attitudes toward computers becomes very important. The major goal of this study was to examine the relationship between gender and ethnicity and African American and Caucasian college students' attitudes toward computers. The Computer Attitude Scale instrument was used to measure the students' attitudes. During the summer of the 1996 academic year, a university in the southeastern United States was selected to participate in this study. A total of 230 African American and Caucasian undergraduate students participated in the study. The students were pre-tested during the first week of the semester to assess their initial computer attitudes. The students were enrolled in one of the mandatory computer literacy courses (a Computer Literacy Awareness course or a C, Pascal, or FORTRAN programming course) for 12 weeks. There were a total of seven different instructors for the courses. During the 12th week of class, the students were post-tested to assess their computer attitudes after completing one of the computer literacy courses. Results were analyzed using ANCOVA. While both African American and Caucasian students showed a slight increase in their attitudes toward computers after completing the course, no significant difference between the groups was found. However, all groups were found to have positive attitudes toward computers in general. Data analysis also indicated no significant gender difference among African American and Caucasian undergraduate students. This confirmed findings of previous studies in which no significant gender difference was found to exist among college students.

  12. Study of basic computer competence among public health nurses in Taiwan.

    PubMed

    Yang, Kuei-Feng; Yu, Shu; Lin, Ming-Sheng; Hsu, Chia-Ling

    2004-03-01

    Rapid advances in information technology and media have made distance learning on the Internet possible. This new model of learning allows greater efficiency and flexibility in knowledge acquisition. Since basic computer competence is a prerequisite for this new learning model, this study was conducted to examine the basic computer competence of public health nurses in Taiwan and explore factors influencing computer competence. A national cross-sectional randomized study was conducted with 329 public health nurses. A questionnaire was used to collect data and was delivered by mail. Results indicate that the basic computer competence of public health nurses in Taiwan still needs to be improved (mean = 57.57 ± 2.83, total score range 26-130). Among the five most frequently used software programs, nurses were most knowledgeable about Word and least knowledgeable about PowerPoint. Stepwise multiple regression analysis revealed eight variables (weekly number of hours spent online at home, weekly amount of time spent online at work, weekly frequency of computer use at work, previous computer training, a computer and Internet access at the workplace, job position, education level, and age) that significantly influenced computer competence, which accounted for 39.0% of the variance. In conclusion, greater computer competence, broader educational programs regarding computer technology, and a greater emphasis on computers at work are necessary to increase the usefulness of distance learning via the Internet in Taiwan. Building a user-friendly environment is important in developing this new model of learning for the future.

  13. EX VIVO MODEL FOR THE CHARACTERIZATION AND IDENTIFICATION OF DRYWALL INTRAOCULAR FOREIGN BODIES ON COMPUTED TOMOGRAPHY.

    PubMed

    Syed, Reema; Kim, Sung-Hye; Palacio, Agustina; Nunery, William R; Schaal, Shlomit

    2017-06-06

    This study was inspired by the authors' encounter with a patient with a penetrating globe injury due to drywall, who had a retained intraocular drywall foreign body. Computed tomography (CT) was read as normal in this patient. Open globe injury with drywall has never been reported previously in the literature and there are no previous studies describing its radiographic features. The case report is described in detail elsewhere. This was an experimental study. An ex vivo model of 15 porcine eyes with 1 mm to 5 mm fragments of implanted drywall, two vitreous-only samples with drywall, and three control eyes was used. Eyes and vitreous samples were CT scanned on Days 0, 1, and 3 postimplantation. The CT ocular images were analyzed by masked observers. Size and radiodensity of intraocular drywall were measured in Hounsfield units (HU) over time. Intraocular drywall was hyperdense on CT. All sizes studied were detectable on Day 0 of scanning. Mean intraocular drywall foreign body density was 171 ± 52 HU (range 70-237), depending on fragment size. The intraocular drywall foreign body decreased in size whereas its HU intensity increased over time. Drywall dissolves in the eye and becomes denser over time as air in the drywall is replaced by fluid. This study identified Hounsfield unit values specific to intraocular drywall foreign bodies over time.
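
    For reference, the Hounsfield scale used for these radiodensity measurements maps linear attenuation coefficients so that water reads 0 HU and air reads -1000 HU:

```python
def hounsfield_units(mu, mu_water):
    """CT number in Hounsfield units: HU = 1000 * (mu - mu_water) / mu_water,
    where mu is the linear attenuation coefficient of the voxel."""
    return 1000.0 * (mu - mu_water) / mu_water
```

    The reported increase in HU as air-filled pores in the drywall take up fluid follows directly from this definition: fluid attenuates far more than air, so the effective mu of the fragment rises.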

  14. Exploration of cloud computing late start LDRD #149630 : Raincoat. v. 2.1.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Echeverria, Victor T.; Metral, Michael David; Leger, Michelle A.

    This report contains documentation from an interoperability study conducted under the Late Start LDRD 149630, Exploration of Cloud Computing. A small late-start LDRD from last year resulted in a study (Raincoat) on using Virtual Private Networks (VPNs) to enhance security in a hybrid cloud environment. Raincoat initially explored the use of OpenVPN on IPv4 and demonstrates that it is possible to secure the communication channel between two small 'test' clouds (a few nodes each) at New Mexico Tech and Sandia. We extended the Raincoat study to add IPSec support via Vyatta routers, to interface with a public cloud (Amazon Elastic Compute Cloud (EC2)), and to be significantly more scalable than the previous iteration. The study contributed to our understanding of interoperability in a hybrid cloud.

  15. Computational modeling for prediction of the shear stress of three-dimensional isotropic and aligned fiber networks.

    PubMed

    Park, Seungman

    2017-09-01

    Interstitial flow (IF) is a creeping flow through the interstitial space of the extracellular matrix (ECM). IF plays a key role in diverse biological functions, such as tissue homeostasis, cell function and behavior. Currently, most studies that have characterized IF have focused on the permeability of ECM or shear stress distribution on the cells, but less is known about the prediction of shear stress on individual fibers or fiber networks, despite its significance in the alignment of matrix fibers and cells observed in fibrotic or wound tissues. In this study, I developed a computational model to predict shear stress for differently structured fibrous networks. To generate isotropic models, a random growth algorithm and a second-order orientation tensor were employed. Then, a three-dimensional (3D) solid model was created using computer-aided design (CAD) software for the aligned models (i.e., parallel, perpendicular and cubic models). Subsequently, a tetrahedral unstructured mesh was generated and flow solutions were calculated by solving equations for mass and momentum conservation for all models. Through the flow solutions, I estimated permeability using Darcy's law. Average shear stress (ASS) on the fibers was calculated by averaging the wall shear stress of the fibers. By using nonlinear surface fitting of permeability, viscosity, velocity, porosity and ASS, I devised new computational models. Overall, the developed models showed that higher porosity induced higher permeability, as previous empirical and theoretical models have shown. For comparison of the permeability, the present computational models matched well with previous models, which justifies our computational approach. ASS tended to increase linearly with respect to inlet velocity and dynamic viscosity, whereas permeability was almost the same. Finally, the developed model nicely predicted the ASS values that had been directly estimated from computational fluid dynamics (CFD). The present computational models will provide new tools for predicting accurate functional properties and designing fibrous porous materials, thereby significantly advancing tissue engineering. Copyright © 2017 Elsevier B.V. All rights reserved.
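
    The permeability estimate via Darcy's law mentioned in this record follows directly from Q = k·A·ΔP/(μ·L); a minimal helper (illustrative variable names, SI units assumed):

```python
def darcy_permeability(flow_rate, viscosity, length, area, delta_p):
    """Apparent permeability from Darcy's law Q = k*A*dP/(mu*L), solved for
    k = Q*mu*L / (A*dP). SI units: m^3/s, Pa*s, m, m^2, Pa -> k in m^2."""
    return flow_rate * viscosity * length / (area * delta_p)
```

    Given the computed flow rate through a fiber-network domain of known length and cross-section at a prescribed pressure drop, this yields the apparent permeability used to compare models.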

  16. Ergonomics standards and guidelines for computer workstation design and the impact on users' health - a review.

    PubMed

    Woo, E H C; White, P; Lai, C W K

    2016-03-01

    This paper presents an overview of global ergonomics standards and guidelines for the design of computer workstations, with particular focus on their inconsistency and the associated health risk impact. Overall, considerable disagreements were found in the design specifications of computer workstations globally, particularly between the results of previous ergonomics research and the recommendations of current ergonomics standards and guidelines. To cope with the rapid advancement in computer technology, this article provides justifications and suggestions for modifications of the current ergonomics standards and guidelines for the design of computer workstations. Practitioner Summary: A research gap exists in ergonomics standards and guidelines for computer workstations. We explore the validity and generalisability of ergonomics recommendations by comparing previous ergonomics research with the recommendations and outcomes of current ergonomics standards and guidelines.

  17. Numerical studies of the reversed-field pinch at high aspect ratio

    NASA Astrophysics Data System (ADS)

    Sätherblom, H.-E.; Drake, J. R.

    1998-10-01

    The reversed field pinch (RFP) configuration at an aspect ratio of 8.8 is studied numerically by means of the three-dimensional magnetohydrodynamic code DEBS [D. D. Schnack et al., J. Comput. Phys. 70, 330 (1987)]. This aspect ratio is equal to that of the Extrap T1 experiment [S. Mazur et al., Nucl. Fusion 34, 427 (1994)]. A numerical study of an RFP at such a high aspect ratio requires extensive computational resources and had hitherto not been performed. The results are compared with previous studies [Y. L. Ho et al., Phys. Plasmas 2, 3407 (1995)] of lower aspect ratio RFP configurations. In particular, an evaluation of the extrapolation to an aspect ratio of 8.8 made in that previous study shows that the extrapolation of the spectral spread, as well as most of the other findings, is confirmed. An important exception, however, is the magnetic diffusion coefficient, which is found to decrease with aspect ratio. Furthermore, an aspect ratio dependence of the magnetic energy and of the helicity of the RFP is found.

  18. Quantum information processing by a continuous Maxwell demon

    NASA Astrophysics Data System (ADS)

    Stevens, Josey; Deffner, Sebastian

    Quantum computing is believed to be fundamentally superior to classical computing; however, quantifying the specific thermodynamic advantage has been elusive. Experimentally motivated, we generalize previous minimal models of discrete demons to a continuous state space. Analyzing our model allows one to quantify the thermodynamic resources necessary to process quantum information. By further invoking the semi-classical limit, we compare the quantum demon with its classical analogue. Finally, this model also serves as a starting point for studying open quantum systems.

  19. Hierarchical neural network model of the visual system determining figure/ground relation

    NASA Astrophysics Data System (ADS)

    Kikuchi, Masayuki

    2017-07-01

    One of the most important functions of visual perception in the brain is figure/ground interpretation of input images. Figural regions in a 2D image, corresponding to objects in 3D space, are distinguished from the background region extending behind the objects. Previously, the author proposed a neural network model of figure/ground separation built on the principle that local geometric features, such as curvatures and outer angles at corners, are extracted and propagated along the input contour in a single-layer network (Kikuchi & Akashi, 2001). However, this processing principle has the defect that signal propagation requires many iterations, despite the fact that the actual visual system determines the figure/ground relation within a short period (Zhou et al., 2000). To speed up figure/ground determination, this study incorporates a hierarchical architecture into the previous model. Simulations confirmed the effect of hierarchization on computation time: as the number of layers increased, the required computation time decreased. However, this speed-up effect saturated once the number of layers reached a certain point. This study explains the saturation effect using the notion of average distance between vertices from the field of complex networks, and succeeded in mimicking the effect by computer simulation.

  20. Automated Measurement of Patient-Specific Tibial Slopes from MRI

    PubMed Central

    Amerinatanzi, Amirhesam; Summers, Rodney K.; Ahmadi, Kaveh; Goel, Vijay K.; Hewett, Timothy E.; Nyman, Edward

    2017-01-01

    Background: Multi-planar proximal tibial slopes may be associated with an increased likelihood of osteoarthritis and anterior cruciate ligament injury, due in part to their role in checking the anterior-posterior stability of the knee. Established methods suffer from repeatability limitations and lack the computational efficiency needed for intuitive clinical adoption. The aims of this study were to develop a novel automated approach and to compare its repeatability and computational efficiency against previously established methods. Methods: Tibial slope geometries were obtained via MRI and measured using an automated Matlab-based approach. Data were compared for repeatability and evaluated for computational efficiency. Results: Mean lateral tibial slope (LTS) for females (7.2°) was greater than for males (1.66°). Mean LTS in the lateral concavity zone was greater for females (7.8° for females, 4.2° for males). Mean medial tibial slope (MTS) for females was greater (9.3° vs. 4.6°). Along the medial concavity zone, female subjects demonstrated greater MTS. Conclusion: The automated method was more repeatable and computationally efficient than previously identified methods and may aid in the clinical assessment of knee injury risk, inform surgical planning, and guide implant design efforts. PMID:28952547
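    A slope measurement of this kind ultimately reduces to the angle between two sagittal landmarks. The sketch below is a generic illustration, not the authors' Matlab pipeline; the coordinate convention and landmark values are invented.

```python
import math

def slope_angle_deg(anterior_pt, posterior_pt):
    """Tibial slope as the angle (degrees) of the plateau line relative
    to a plane perpendicular to the tibial long axis. Landmarks are
    (x, y) in mm on a sagittal slice, x pointing posteriorly and y
    pointing proximally; a lower posterior landmark gives a positive
    (posterior) slope."""
    run = posterior_pt[0] - anterior_pt[0]
    drop = anterior_pt[1] - posterior_pt[1]
    return math.degrees(math.atan2(drop, run))

# Hypothetical landmarks: 50 mm apart, posterior edge 6.3 mm lower.
print(round(slope_angle_deg((0.0, 0.0), (50.0, -6.3)), 1))  # 7.2
```

    An automated pipeline would extract many such landmark pairs per slice and fit them, which is where the repeatability gain over manual measurement comes from.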

  1. Evaluation of a computer-aided detection algorithm for timely diagnosis of small acute intracranial hemorrhage on computed tomography in a critical care environment

    NASA Astrophysics Data System (ADS)

    Lee, Joon K.; Chan, Tao; Liu, Brent J.; Huang, H. K.

    2009-02-01

    Detection of acute intracranial hemorrhage (AIH) is a primary task in the interpretation of computed tomography (CT) brain scans of patients suffering from acute neurological disturbances or after head trauma. Interpretation can be difficult, especially when the lesion is inconspicuous or the reader is inexperienced. We have previously developed a computer-aided detection (CAD) algorithm to detect small AIH. One hundred and thirty-five small AIH CT studies from the Los Angeles County (LAC) + USC Hospital were identified and matched by age and sex with one hundred and thirty-five normal studies. These cases were then processed using our AIH CAD system to evaluate the efficacy and constraints of the algorithm.

  2. Effective electronic-only Kohn–Sham equations for the muonic molecules

    NASA Astrophysics Data System (ADS)

    Rayka, Milad; Goli, Mohammad; Shahbazian, Shant

    A set of effective electronic-only Kohn-Sham (EKS) equations are derived for the muonic molecules (containing a positively charged muon), which are completely equivalent to the coupled electronic-muonic Kohn-Sham equations derived previously within the framework of the Nuclear-Electronic Orbital density functional theory (NEO-DFT). The EKS equations contain effective non-coulombic external potentials depending on parameters describing the muon's vibration, which are optimized during the solution of the EKS equations making the muon's KS orbital reproducible. It is demonstrated that the EKS equations are derivable from a certain class of effective electronic Hamiltonians through applying the usual Hohenberg-Kohn theorems, revealing a duality between the NEO-DFT and the effective electronic-only DFT methodologies. The EKS equations are computationally applied to a small set of muoniated organic radicals and it is demonstrated that a mean effective potential may be derived for this class of muonic species, while an electronic basis set is also designed for the muon. These computational ingredients are then applied to muoniated ferrocenyl radicals, which had been previously detected experimentally through adding a muonium atom to ferrocene. In line with previous computational studies, from the six possible species, the staggered conformer, where the muon is attached to the exo position of the cyclopentadienyl ring, is deduced to be the most stable ferrocenyl radical.

  3. Effective electronic-only Kohn-Sham equations for the muonic molecules.

    PubMed

    Rayka, Milad; Goli, Mohammad; Shahbazian, Shant

    2018-03-28

    A set of effective electronic-only Kohn-Sham (EKS) equations are derived for the muonic molecules (containing a positively charged muon), which are completely equivalent to the coupled electronic-muonic Kohn-Sham equations derived previously within the framework of the nuclear-electronic orbital density functional theory (NEO-DFT). The EKS equations contain effective non-coulombic external potentials depending on parameters describing the muon's vibration, which are optimized during the solution of the EKS equations making the muon's KS orbital reproducible. It is demonstrated that the EKS equations are derivable from a certain class of effective electronic Hamiltonians through applying the usual Hohenberg-Kohn theorems revealing a "duality" between the NEO-DFT and the effective electronic-only DFT methodologies. The EKS equations are computationally applied to a small set of muoniated organic radicals and it is demonstrated that a mean effective potential may be derived for this class of muonic species while an electronic basis set is also designed for the muon. These computational ingredients are then applied to muoniated ferrocenyl radicals, which had been previously detected experimentally through adding a muonium atom to ferrocene. In line with previous computational studies, from the six possible species, the staggered conformer, where the muon is attached to the exo position of the cyclopentadienyl ring, is deduced to be the most stable ferrocenyl radical.

  4. Identification of genes related to proliferative diabetic retinopathy through RWR algorithm based on protein-protein interaction network.

    PubMed

    Zhang, Jian; Suo, Yan; Liu, Min; Xu, Xun

    2018-06-01

    Proliferative diabetic retinopathy (PDR) is one of the most common complications of diabetes and can lead to blindness. Proteomic studies have provided insight into the pathogenesis of PDR, and a series of PDR-related genes has been identified; however, these genes are far from fully characterized because the experimental methods are expensive and time-consuming. In our previous study, we successfully identified 35 candidate PDR-related genes through the shortest-path algorithm. In the current study, we developed a computational method using the random walk with restart (RWR) algorithm and the protein-protein interaction (PPI) network to identify potential PDR-related genes. After candidate genes were obtained by the RWR algorithm, a three-stage filtration strategy, comprising a permutation test, an interaction test and an enrichment test, was applied to exclude potential false positives caused by the structure of the PPI network, poor interaction strength, or limited similarity in gene ontology (GO) terms and biological pathways. As a result, 36 candidate genes were discovered by this method, different from the 35 genes reported in our previous study. A literature review showed that 21 of these 36 genes are supported by previous experiments. These findings suggest the robustness and complementarity of our efforts using different computational methods, thus providing an alternative route to studying PDR pathogenesis. Copyright © 2017 Elsevier B.V. All rights reserved.

  5. Improving learning with science and social studies text using computer-based concept maps for students with disabilities.

    PubMed

    Ciullo, Stephen; Falcomata, Terry S; Pfannenstiel, Kathleen; Billingsley, Glenna

    2015-01-01

    Concept maps have been used to help students with learning disabilities (LD) improve literacy skills and content learning, predominantly in secondary school. However, despite increased access to classroom technology, no previous studies have examined the efficacy of computer-based concept maps to improve learning from informational text for students with LD in elementary school. In this study, we used a concurrent delayed multiple probe design to evaluate the interactive use of computer-based concept maps on content acquisition with science and social studies texts for Hispanic students with LD in Grades 4 and 5. Findings from this study suggest that students improved content knowledge during intervention relative to a traditional instruction baseline condition. Learning outcomes and social validity information are considered to inform recommendations for future research and the feasibility of classroom implementation. © The Author(s) 2014.

  6. Evaluation of computer usage in healthcare among private practitioners of NCT Delhi.

    PubMed

    Ganeshkumar, P; Arun Kumar, Sharma; Rajoura, O P

    2011-01-01

    The aims were: (1) to evaluate the usage and knowledge of computers and information and communication technology in health care delivery by private practitioners; and (2) to understand the determinants of their computer usage. A cross-sectional study was conducted among private practitioners practising in three districts of the NCT of Delhi between November 2007 and December 2008, selected by a stratified random sampling method; knowledge and usage of computers in health care, and the determinants of computer usage, were evaluated with a pre-coded, semi-open-ended questionnaire. About 77% of the practitioners reported having a computer with internet access. Computer availability and internet accessibility were highest among super-speciality practitioners. Practitioners who had attended a computer course were 13.8 times [OR: 13.8 (7.3 - 25.8)] more likely to have installed an EHR in the clinic. Technical issues were the major perceived barrier to installing a computer in the clinic. Practice speciality, previous attendance of a computer course, and the age at which practitioners started using a computer influenced knowledge about computers. Speciality of the practice, the presence of a computer professional, and gender were the determinants of computer usage.
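    An odds ratio of this kind comes from a 2x2 table with the standard log-odds confidence interval. The counts below are hypothetical, chosen only so the point estimate matches 13.8; they are not the study's raw data, and the resulting CI differs from the published one.

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio with 95% CI from a 2x2 table:
         a = exposed with outcome      b = exposed without outcome
         c = unexposed with outcome    d = unexposed without outcome
    The CI uses the normal approximation on the log odds."""
    or_point = (a * d) / (b * c)
    se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lower = math.exp(math.log(or_point) - z * se_log)
    upper = math.exp(math.log(or_point) + z * se_log)
    return or_point, lower, upper

# Hypothetical counts (NOT the study's data), giving OR = 13.8.
or_point, lower, upper = odds_ratio_ci(40, 10, 20, 69)
print(f"OR = {or_point:.1f}, 95% CI ({lower:.1f}, {upper:.1f})")
```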

  7. Computers and the orthopaedic office.

    PubMed

    Berumen, Edmundo; Barllow, Fidel Dobarganes; Fong, Fransisco Javier; Lopez, Jorge Arturo

    2002-01-01

    The advance of medicine over the last twenty years is closely linked to the history of computers. In the beginning, computers were built to help with mathematical calculations. This has changed: computers are now linked to x-ray machines, CT scanners, and MRIs. Being able to share information is one of the goals for the future. Today's computer technology has done a great deal to allow orthopaedic surgeons from around the world to consult on a difficult case or to become part of a large database. Obtaining treatment results from a multicentre study can now be done on a regular basis. In the future, computers will help us retrieve information from a patient's clinical history directly from a hospital database, or from portable memory cards that will carry every radiograph or video from previous surgeries.

  8. Initial experience with custom-fit total knee replacement: intra-operative events and long-leg coronal alignment.

    PubMed

    Spencer, Brian A; Mont, Michael A; McGrath, Mike S; Boyd, Bradley; Mitrick, Michael F

    2009-12-01

    New technology using magnetic resonance imaging (MRI) allows the surgeon to place total knee replacement components into each patient's pre-arthritic natural alignment. This study evaluated the initial intra-operative experience using this technique. Twenty-one patients had a sagittal MRI of their arthritic knee to determine component placement for a total knee replacement. Cutting guides were machined to control all intra-operative cuts. Intra-operative events were recorded, and these knees were compared to a matched cohort of the senior surgeon's previous 30 conventional total knee replacements. Post-operative scanograms were obtained from each patient, and coronal alignment was compared to previous studies using conventional and computer-assisted techniques. There were no intra-operative or acute post-operative complications. There were no differences in blood loss, and there was a mean decrease in operative time of 14% compared to the cohort of patients with conventional knee replacements. The average deviation from the mechanical axis was 1.2 degrees of varus, which was comparable to previously reported conventional and computer-assisted techniques. Custom-fit total knee replacement appeared to be a safe procedure for uncomplicated cases of osteoarthritis.

  9. Computation of temperature elevation in rabbit eye irradiated by 2.45-GHz microwaves with different field configurations.

    PubMed

    Hirata, Akimasa; Watanabe, Soichi; Taki, Masao; Fujiwara, Osamu; Kojima, Masami; Sasaki, Kazuyuki

    2008-02-01

    This study calculated the temperature elevation in the rabbit eye caused by 2.45-GHz near-field exposure systems. First, we calculated specific absorption rate (SAR) distributions in the eye for different antennas and compared them with those observed in previous studies. Next, we re-examined the temperature elevation in the rabbit eye due to a horizontally-polarized dipole antenna with a C-shaped director, which was used in a previous study. From our computational results, we found that the decisive factors for the SAR distribution in the rabbit eye were the polarization of the electromagnetic wave and the antenna aperture. We then quantified the eye-averaged specific absorption rate as 67 W kg(-1) for the dipole antenna at an input power density at the eye surface of 150 mW cm(-2), which was specified in the previous work as the minimum cataractogenic power density. The effect of administering anesthesia on the temperature elevation was about 30% in this case. Additionally, the position of the maximum temperature in the lens is discussed for the different 2.45-GHz microwave systems. This position was found to be around the posterior of the lens regardless of the exposure condition, which indicates that the original temperature distribution in the eye was the dominant factor.

  10. Approaches to Classroom-Based Computational Science.

    ERIC Educational Resources Information Center

    Guzdial, Mark

    Computational science includes the use of computer-based modeling and simulation to define and test theories about scientific phenomena. The challenge for educators is to develop techniques for implementing computational science in the classroom. This paper reviews some previous work on the use of simulation alone (without modeling), modeling…

  11. Characterization of Unsteady Flow Structures Near Leading-Edge Slat. Part 2; 2D Computations

    NASA Technical Reports Server (NTRS)

    Khorrami, Mehdi; Choudhari, Meelan M.; Jenkins, Luther N.

    2004-01-01

    In our previous computational studies of a generic high-lift configuration, quasi-laminar (as opposed to fully turbulent) treatment of the slat cove region proved to be an effective approach for capturing the unsteady dynamics of the cove flow field. Combined with acoustic propagation via the Ffowcs Williams and Hawkings formulation, the quasi-laminar simulations captured some important features of the slat cove noise measured with microphone array techniques. However, a direct assessment of the computed cove flow field was not feasible due to the unavailability of off-surface flow measurements. To remedy this shortcoming, we have undertaken a combined experimental and computational study aimed at characterizing the flow structures and fluid mechanical processes within the slat cove region. Part I of this paper outlines the experimental aspects of this investigation, focused on the 30P30N high-lift configuration; the present paper describes the accompanying computational results, including a comparison between computation and experiment at various angles of attack. Even though predictions of the time-averaged flow field agree well with the measured data, the study indicates the need for further refinement of the zonal turbulence approach in order to capture the full dynamics of the cove's fluctuating flow field.

  12. Using the Criterion-Predictor Factor Model to Compute the Probability of Detecting Prediction Bias with Ordinary Least Squares Regression

    ERIC Educational Resources Information Center

    Culpepper, Steven Andrew

    2012-01-01

    The study of prediction bias is important and the last five decades include research studies that examined whether test scores differentially predict academic or employment performance. Previous studies used ordinary least squares (OLS) to assess whether groups differ in intercepts and slopes. This study shows that OLS yields inaccurate inferences…

  13. Critical phenomena in communication/computation networks with various topologies and suboptimal to optimal resource allocation

    NASA Astrophysics Data System (ADS)

    Cogoni, Marco; Busonera, Giovanni; Anedda, Paolo; Zanetti, Gianluigi

    2015-01-01

    We generalize previous studies on critical phenomena in communication networks [1,2] by adding computational capabilities to the nodes. In our model, a set of tasks with random origin, destination and computational structure is distributed on a computational network, modeled as a graph. By varying the temperature of a Metropolis Monte Carlo procedure, we explore the global latency for optimal to suboptimal resource assignments at a given time instant. By computing the two-point correlation function for the local overload, we study the behavior of the correlation distance (both for links and nodes) while approaching the congested phase: a transition from a peaked to a spread g(r) is seen above a critical (Monte Carlo) temperature Tc. The average latency trend of the system is predicted by averaging over several network traffic realizations while maintaining spatially detailed information for each node: a sharp decrease in performance is found above Tc independently of the workload. The globally optimized computational resource allocation and network routing defines a baseline for a future comparison of the transition behavior with existing routing strategies [3,4] for different network topologies.
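    A Metropolis Monte Carlo exploration of assignments like the one described can be sketched as below. The cost function, the parameters, and the task/node counts are invented for illustration; they are not the paper's latency model.

```python
import math
import random

def metropolis_assign(cost, n_tasks, n_nodes, temperature, steps=20000, seed=1):
    """Metropolis Monte Carlo over task-to-node assignments.

    Proposes single-task reassignments; downhill moves are always
    accepted, uphill moves with probability exp(-dE / T)."""
    rng = random.Random(seed)
    assign = [rng.randrange(n_nodes) for _ in range(n_tasks)]
    energy = cost(assign)
    for _ in range(steps):
        t = rng.randrange(n_tasks)
        old_node = assign[t]
        assign[t] = rng.randrange(n_nodes)
        new_energy = cost(assign)
        delta = new_energy - energy
        if delta <= 0 or rng.random() < math.exp(-delta / temperature):
            energy = new_energy       # accept the move
        else:
            assign[t] = old_node      # reject: restore previous state
    return assign, energy

def overload(assign, n_nodes=4):
    """Toy congestion cost (an assumption, not the paper's model):
    quadratic penalty on per-node load."""
    loads = [0] * n_nodes
    for node in assign:
        loads[node] += 1
    return sum(load * load for load in loads)

assign, energy = metropolis_assign(overload, n_tasks=16, n_nodes=4, temperature=0.1)
print("low-T energy:", energy)  # a balanced load (4 tasks per node) gives 64
```

    Raising the temperature makes uphill moves common, which is the suboptimal-assignment regime the paper probes near Tc.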

  14. Reducing the latency of the Fractal Iterative Method to half an iteration

    NASA Astrophysics Data System (ADS)

    Béchet, Clémentine; Tallon, Michel

    2013-12-01

    The fractal iterative method for atmospheric tomography (FRiM-3D) was introduced to solve wavefront reconstruction at the dimensions of an ELT with a low computational cost. Previous studies reported that only 3 iterations of the algorithm are required to provide the best adaptive optics (AO) performance. Nevertheless, any iterative method in adaptive optics suffers from the intrinsic latency induced by the fact that one iteration can start only once the previous one is completed; iterations hardly match the low-latency requirement of the AO real-time computer. We present here a new approach that avoids iterations in the computation of the commands with FRiM-3D, thus allowing a low-latency AO response even at the scale of the European ELT (E-ELT). The method highlights the importance of a "warm-start" strategy in adaptive optics. To our knowledge, this particular way of using the "warm-start" has not been reported before. Furthermore, by removing the requirement to iterate when computing the commands, the computational cost of the reconstruction with FRiM-3D can be simplified and reduced to at most half the computational cost of a classical iteration. Through simulations of both single-conjugate and multi-conjugate AO for the E-ELT, with FRiM-3D on ESO's Octopus simulator, we demonstrate the benefit of this approach. We finally demonstrate the robustness of this new implementation with respect to increasing measurement noise, wind speed and even modeling errors.

  15. Early detection of hospitalized patients with previously diagnosed obstructive sleep apnea using computer decision support alerts.

    PubMed

    Evans, R Scott; Flint, Vrena B; Cloward, Tom V; Beninati, William; Lloyd, James F; Megwalu, Kimberly; Simpson, Kathy J; Alsharit, Ahmed M; Balls, Shayna B; Farney, Robert J

    2013-01-01

    Obstructive sleep apnea (OSA) is a worldwide problem affecting 2-14% of the general population, and most patients remain undiagnosed. OSA patients are at elevated risk for hypoxemia, cardiac arrhythmias, cardiorespiratory arrest, hypoxic encephalopathy, stroke and death during hospitalization. Clinical screening questionnaires are used to identify hospitalized patients with OSA, especially before surgery. However, current screening questionnaires miss a significant number of patients and require more definitive testing before specific therapy can be started. Moreover, many patients are admitted to the hospital with a previous diagnosis of OSA that is not reported. Thus, many patients with OSA do not receive appropriate therapy during hospitalization due to the lack of information from previous inpatient and outpatient encounters. Large enterprise data warehouses provide the ability to monitor patient encounters over wide geographical areas. This study found that previously diagnosed OSA is highly prevalent and undertreated in hospitalized patients, and that the use of early computer alerts by respiratory therapists resulted in significantly more OSA patients receiving appropriate medical care (P < 0.002) and significantly fewer experiencing hypoxemia (P < 0.006). The impact was greater for non-surgery patients than for surgery patients.

  16. Improving Learners' Oral Fluency through Computer-Mediated Emotional Intelligence Activities

    ERIC Educational Resources Information Center

    Abdolrezapour, Parisa

    2017-01-01

    Previous studies have shown that emotional intelligence (henceforth, EI) has a significant impact on important life outcomes (e.g., mental and physical health, academic achievement, work performance, and social relationships). This study aimed to see whether there is any relationship between EI and English as a foreign language (EFL) learners'…

  17. Identification of Number Sense Strategies Used by Pre-Service Elementary Teachers

    ERIC Educational Resources Information Center

    Sengul, Sare

    2013-01-01

    The purpose of this study was to identify the use of number sense strategies by pre-service teachers studying at the department of elementary education. Compared to the previous one; new mathematics curriculum places more emphasis on various strategies such as estimation strategies, computational estimation strategies, rounding and mental…

  18. What Influences College Students to Continue Using Business Simulation Games? The Taiwan Experience

    ERIC Educational Resources Information Center

    Tao, Yu-Hui; Cheng, Chieh-Jen; Sun, Szu-Yuan

    2009-01-01

    Previous studies have pointed out that computer games could improve students' motivation to learn, but these studies have mostly targeted teachers or students in elementary and secondary education and are without user adoption models. Because business and management institutions in higher education have been increasingly using educational…

  19. Using Interval-Based Systems to Measure Behavior in Early Childhood Special Education and Early Intervention

    ERIC Educational Resources Information Center

    Lane, Justin D.; Ledford, Jennifer R.

    2014-01-01

    The purpose of this article is to summarize the current literature on the accuracy and reliability of interval systems using data from previously published experimental studies that used either human observations of behavior or computer simulations. Although multiple comparison studies provided mathematical adjustments or modifications to interval…

  20. An introduction to real-time graphical techniques for analyzing multivariate data

    NASA Astrophysics Data System (ADS)

    Friedman, Jerome H.; McDonald, John Alan; Stuetzle, Werner

    1987-08-01

    Orion I is a graphics system used to study applications of computer graphics - especially interactive motion graphics - in statistics. Orion I is the newest of a family of "Prim" systems, whose most striking common feature is the use of real-time motion graphics to display three dimensional scatterplots. Orion I differs from earlier Prim systems through the use of modern and relatively inexpensive raster graphics and microprocessor technology. It also delivers more computing power to its user; Orion I can perform more sophisticated real-time computations than were possible on previous such systems. We demonstrate some of Orion I's capabilities in our film: "Exploring data with Orion I".

  1. An agent-based computational model for tuberculosis spreading on age-structured populations

    NASA Astrophysics Data System (ADS)

    Graciani Rodrigues, C. C.; Espíndola, Aquino L.; Penna, T. J. P.

    2015-06-01

    In this work we present an agent-based computational model to study the spreading of the tuberculosis (TB) disease in age-structured populations. The proposed model is a merge of two previous models: an agent-based computational model for the spreading of tuberculosis and a bit-string model for biological aging. The combination of TB with population aging reproduces the coexistence of health states seen in real populations. In addition, the universal exponential behavior of mortality curves is preserved. Finally, the population distribution as a function of age shows that TB is most prevalent among elders for high-efficacy treatments.

  2. Affect and the computer game player: the effect of gender, personality, and game reinforcement structure on affective responses to computer game-play.

    PubMed

    Chumbley, Justin; Griffiths, Mark

    2006-06-01

    Previous research on computer games has tended to concentrate on their more negative effects (e.g., addiction, increased aggression). This study departs from the traditional clinical and social learning explanations for these behavioral phenomena and examines the effect of personality, in-game reinforcement characteristics, gender, and skill on the emotional state of the game-player. Results demonstrated that in-game reinforcement characteristics and skill significantly affect a number of affective measures (most notably excitement and frustration). The implications of the impact of game-play on affect are discussed with reference to the concepts of "addiction" and "aggression."

  3. Embracing the Cloud: Six Ways to Look at the Shift to Cloud Computing

    ERIC Educational Resources Information Center

    Ullman, David F.; Haggerty, Blake

    2010-01-01

    Cloud computing is the latest paradigm shift for the delivery of IT services. Where previous paradigms (centralized, decentralized, distributed) were based on fairly straightforward approaches to technology and its management, cloud computing is radical in comparison. The literature on cloud computing, however, suffers from many divergent…

  4. Comparisons of Physicians' and Nurses' Attitudes towards Computers.

    PubMed

    Brumini, Gordana; Ković, Ivor; Zombori, Dejvid; Lulić, Ileana; Bilic-Zulle, Lidija; Petrovecki, Mladen

    2005-01-01

    Before starting the implementation of integrated hospital information systems, the physicians' and nurses' attitudes towards computers were measured by means of a questionnaire. The study was conducted in Dubrava University Hospital, Zagreb, Croatia. Of 194 randomly selected respondents, 141 were nurses and 53 were physicians. They were surveyed with an anonymous questionnaire consisting of 8 closed questions on demographic data, computer-science education and computer usage, and 30 statements on attitudes towards computers. The statements used a Likert-type scale. Differences in attitudes towards computers between groups were compared using the Kruskal-Wallis test, with the Mann-Whitney test for post-hoc analysis. The total score represented attitudes towards computers. The physicians' total score was 130 (97-144), while the nurses' total score was 123 (88-141). This indicates that the average answer to all statements was between "agree" and "strongly agree"; these high total scores indicated positive attitudes. Age, computer-science education and computer usage were important factors that increased the total score. Younger physicians and nurses with computer-science education and previous computer experience had more positive attitudes towards computers than others. Our results are important for the planning and implementation of integrated hospital information systems in Croatia.

  5. Identifying messaging completion in a parallel computer by checking for change in message received and transmitted count at each node

    DOEpatents

    Archer, Charles J [Rochester, MN; Hardwick, Camesha R [Fayetteville, NC; McCarthy, Patrick J [Rochester, MN; Wallenfelt, Brian P [Eden Prairie, MN

    2009-06-23

    Methods, parallel computers, and products are provided for identifying messaging completion on a parallel computer. The parallel computer includes a plurality of compute nodes coupled for data communications by at least two independent data communications networks: a binary tree data communications network, optimal for collective operations, that organizes the nodes as a tree, and a torus data communications network, optimal for point-to-point operations, that organizes the nodes as a torus. Embodiments include reading all counters at each node of the torus data communications network; calculating at each node a current node value in dependence upon the values read from that node's counters; and determining whether the current node value for each node is the same as its previously calculated node value. If the current node value is the same as the previously calculated node value for all nodes of the torus data communications network, messaging is determined to be complete; otherwise, messaging is determined to be currently incomplete.
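The determination step above amounts to reducing each node's counters to a single value and comparing it against the value from the previous sweep. A minimal sketch, with function and counter names that are illustrative rather than taken from the patent:

```python
def messaging_complete(nodes, previous_counts):
    """Decide whether message traffic on a torus network has quiesced.

    Sketch of the idea in the abstract above: read the counters at every
    node, reduce them to a single value per node, and declare messaging
    complete only if no node's value changed since the previous sweep.
    """
    current_counts = {}
    for node_id, counters in nodes.items():
        # A node's value depends on all of its counters; here we simply
        # sum the messages sent and received.
        current_counts[node_id] = sum(counters)
    complete = all(
        current_counts[n] == previous_counts.get(n) for n in nodes
    )
    return complete, current_counts
```

Two consecutive sweeps that produce identical per-node values indicate completion; any change means messages are still in flight.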

  6. A New Soft Computing Method for K-Harmonic Means Clustering.

    PubMed

    Yeh, Wei-Chang; Jiang, Yunzhi; Chen, Yee-Fen; Chen, Zhe

    2016-01-01

    The K-harmonic means clustering algorithm (KHM) is a clustering method that groups data so that the sum, over all entities, of the harmonic average of the distances between each entity and all cluster centroids is minimized. Because it is less sensitive to initialization than K-means (KM), many researchers have recently been attracted to studying KHM. In this study, the proposed iSSO-KHM is based on an improved simplified swarm optimization (iSSO) and integrates a variable neighborhood search (VNS) for KHM clustering. As evidence of the utility of the proposed iSSO-KHM, we present extensive computational results on eight benchmark problems. The computational results support the superiority of the proposed iSSO-KHM over previously developed algorithms across all experiments.
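The KHM objective described above can be written down directly: for each data point, take the harmonic average of its distances to all K centroids, then sum over the data set. A minimal sketch; the small epsilon guarding zero distances is an implementation choice, not from the paper:

```python
import math

def khm_objective(points, centroids, p=2):
    """Quantity minimized by K-harmonic means: the sum, over all data
    points, of the harmonic average of the point's distances (raised to
    the power p) to the K cluster centroids."""
    k = len(centroids)
    total = 0.0
    for x in points:
        inv_sum = 0.0
        for c in centroids:
            d = math.dist(x, c) ** p
            inv_sum += 1.0 / max(d, 1e-12)  # guard against zero distance
        total += k / inv_sum  # harmonic average of the K distances
    return total
```

Any KHM-style optimizer (including a swarm-based one such as iSSO) searches centroid positions that drive this quantity down.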

  7. [Effect of Reading a Book on a Tablet Computer on Cerebral Blood Flow in the Prefrontal Cortex].

    PubMed

    Sugiura, Akihiro; Eto, Takuya; Kinoshita, Fumiya; Takada, Hiroki

    2018-01-01

    By measuring cerebral blood flow in the prefrontal cortex, we aimed to determine how reading a book on a tablet computer affects sleep. Seven students (7 men; age range, 21-32 years) participated in this study. In a controlled illuminance environment, the subjects read a novel in printed form or on a tablet computer at a distance of their choosing. As the subjects were reading, cerebral blood flow in their prefrontal cortex was measured by near-infrared spectroscopy. The study protocol was as follows: 1) subjects mentally counted a sequence of numbers for 30 s as a pretest to standardize thinking, and then 2) read the novel for 10 min using the printed book or tablet computer. In step 2), the book and tablet computer were used in random order, and subjects rested between the two tasks. Significantly increased brain activity (an increase in regional cerebral blood flow) was observed after reading a novel on a tablet computer compared with after reading a printed book. Furthermore, the region around Broca's area was more active when reading on a tablet computer than when reading a printed book. Considering the results of this study and previous studies on physiological characteristics during non-rapid eye movement sleep, we concluded that reading a book on a tablet computer before the onset of sleep may inhibit sound sleep through mechanisms other than the suppression of melatonin secretion.

  8. Connecting Ellipses to Rectangles in Passive Scalar Transport

    NASA Astrophysics Data System (ADS)

    Aminian, Manuchehr; Bernardi, Francesca; Camassa, Roberto; Harris, Daniel; McLaughlin, Richard

    2017-11-01

    We study how passive scalar transport in Poiseuille flow is affected by the shape of the pipe cross section. Our previous results have established nontrivial dependence of the skewness of the tracer distribution upon the pipe shape. Previously, we have studied the families of rectangles and ellipses, with the behavior past diffusive timescales primarily depending on aspect ratio, and the type of geometry being secondary. However, at timescales well before the diffusion timescale, the family of ellipses is distinct compared to rectangles. We investigate this phenomenon by studying a collection of exotic cross sections connecting the ellipses and rectangles, using a combination of theoretical and computational tools.

  9. A new computer-based counselling system for the promotion of physical activity in patients with chronic diseases--results from a pilot study.

    PubMed

    Becker, Annette; Herzberg, Dominikus; Marsden, Nicola; Thomanek, Sabine; Jung, Hartmut; Leonhardt, Corinna

    2011-05-01

    To develop a computer-based counselling system (CBCS) for the improvement of attitudes towards physical activity in chronically ill patients and to pilot its efficacy and acceptance in primary care. The system is tailored to patients' disease and motivational stage. During a pilot study in five German general practices, patients answered questions before, directly after, and 6 weeks after using the CBCS. Outcome criteria were attitudes and self-efficacy. Qualitative interviews were performed to identify acceptance indicators. Seventy-nine patients participated (mean age: 64.5 years; 53% male; 38% without previous computer experience). Patients' affective and cognitive attitudes changed significantly, whereas self-efficacy showed only minor changes. Patients mentioned no difficulties in interacting with the CBCS; however, perception of the system's usefulness was inconsistent. Computer-based counselling for physical activity-related attitudes in patients with chronic diseases is feasible, but the circumstances of use with respect to the target group and its integration into the management process have to be clarified in future studies. This study adds to the understanding of computer-based counselling in primary health care. Acceptance indicators identified in this study will be validated as part of a questionnaire on technology acceptability in a subsequent study. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.

  10. Gender Differences in Computer- and Instrumental-Based Musical Composition

    ERIC Educational Resources Information Center

    Shibazaki, Kagari; Marshall, Nigel A.

    2013-01-01

    Background: Previous studies have argued that technology can be a major support to the music teacher enabling, amongst other things, increased student motivation, higher levels of confidence and more individualised learning to take place [Bolton, J. 2008. "Technologically mediated composition learning: Josh's story." "British…

  11. Summary of synfuel characterization and combustion studies

    NASA Technical Reports Server (NTRS)

    Schultz, D. F.

    1983-01-01

    Combustion component research studies aimed at evolving environmentally acceptable approaches for burning coal-derived fuels for ground power applications were performed at the NASA Lewis Research Center under a program titled the "Critical Research and Support Technology Program" (CRT). The work was funded by the Department of Energy and was performed in four tasks. This report summarizes these tasks, all of which have been previously reported; some previously unreported data from Task 4 are also presented. Task 1 consisted of a literature survey aimed at determining the properties of synthetic fuels. This was followed by a computer modeling effort, Task 2, to predict the exhaust emissions resulting from burning coal liquids by various combustion techniques such as lean and rich-lean combustion. The computer predictions were then compared to the results of a flame tube rig, Task 3, in which the fuel properties were varied to simulate coal liquids. Two actual SRC 2 coal liquids were tested in this flame tube task.

  12. Lensing of the CMB: non-Gaussian aspects.

    PubMed

    Zaldarriaga, M

    2001-06-01

    We compute the small angle limit of the three- and four-point function of the cosmic microwave background (CMB) temperature induced by the gravitational lensing effect by the large-scale structure of the universe. We relate the non-Gaussian aspects presented in this paper with those in our previous studies of the lensing effects. We interpret the statistics proposed in previous work in terms of different configurations of the four-point function and show how they relate to the statistic that maximizes the S/N.

  13. Tensor scale-based fuzzy connectedness image segmentation

    NASA Astrophysics Data System (ADS)

    Saha, Punam K.; Udupa, Jayaram K.

    2003-05-01

    Tangible solutions to image segmentation are vital in many medical imaging applications. Toward this goal, a framework based on fuzzy connectedness was developed in our laboratory. A fundamental notion called "affinity" - a local fuzzy hanging-togetherness relation on voxels - determines the effectiveness of this segmentation framework in real applications. In this paper, we introduce the notion of "tensor scale" - a recently developed local morphometric parameter - into the affinity definition and study its effectiveness. Although our previous notion of "local scale" using the spherical model successfully incorporated local structure size into affinity and resulted in measurable improvements in segmentation results, a major limitation of that approach was that it ignored local structural orientation and anisotropy. The current approach of using tensor scale in affinity computation allows an effective utilization of local size, orientation, and anisotropy in a unified manner. Tensor scale is used for computing both the homogeneity- and object-feature-based components of affinity. Preliminary results of the proposed method on several medical images and computer-generated phantoms of realistic shapes are presented. Further extensions of this work are discussed.

  14. Ensemble representations: effects of set size and item heterogeneity on average size perception.

    PubMed

    Marchant, Alexander P; Simons, Daniel J; de Fockert, Jan W

    2013-02-01

    Observers can accurately perceive and evaluate the statistical properties of a set of objects, forming what is now known as an ensemble representation. The accuracy and speed with which people can judge the mean size of a set of objects have led to the proposal that ensemble representations of average size can be computed in parallel when attention is distributed across the display. Consistent with this idea, judgments of mean size show little or no decrement in accuracy when the number of objects in the set increases. However, the lack of a set size effect might result from the regularity of the item sizes used in previous studies. Here, we replicate these previous findings, but show that judgments of mean set size become less accurate when set size increases and the heterogeneity of the item sizes increases. This pattern can be explained by assuming that average size judgments are computed using a limited capacity sampling strategy, and it does not necessitate an ensemble representation computed in parallel across all items in a display. Copyright © 2012 Elsevier B.V. All rights reserved.
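The limited-capacity sampling account suggested above can be illustrated with a toy simulation: an observer who averages only a few randomly sampled items produces estimates whose error grows with set size and heterogeneity, with no parallel ensemble computation required. This is a sketch of the interpretation, not the authors' fitted model:

```python
import random
import statistics

def sampled_mean_size(sizes, capacity=4, rng=None):
    """Limited-capacity estimate of mean item size: average only a small
    random subset of the display rather than all items in parallel.

    With homogeneous items any subset recovers the true mean exactly;
    with heterogeneous items the subsample estimate becomes noisy, which
    mirrors the accuracy pattern described in the abstract above."""
    rng = rng or random.Random(0)
    sample = rng.sample(list(sizes), min(capacity, len(sizes)))
    return statistics.fmean(sample)
```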

  15. Measurement of information and communication technology experience and attitudes to e-learning of students in the healthcare professions: integrative review.

    PubMed

    Wilkinson, Ann; While, Alison E; Roberts, Julia

    2009-04-01

    This paper is a report of a review to describe and discuss the psychometric properties of instruments used in healthcare education settings measuring experience and attitudes of healthcare students regarding their information and communication technology skills and their use of computers and the Internet for education. Healthcare professionals are expected to be computer and information literate at registration. A previous review of evaluative studies of computer-based learning suggests that methods of measuring learners' attitudes to computers and computer aided learning are problematic. A search of eight health and social science databases located 49 papers, the majority published between 1995 and January 2007, focusing on the experience and attitudes of students in the healthcare professions towards computers and e-learning. An integrative approach was adopted, with narrative description of findings. Criteria for inclusion were quantitative studies using survey tools with samples of healthcare students and concerning computer and information literacy skills, access to computers, experience with computers and use of computers and the Internet for education purposes. Since the 1980s a number of instruments have been developed, mostly in the United States of America, to measure attitudes to computers, anxiety about computer use, information and communication technology skills, satisfaction and more recently attitudes to the Internet and computers for education. The psychometric properties are poorly described. Advances in computers and technology mean that many earlier tools are no longer valid. Measures of the experience and attitudes of healthcare students to the increased use of e-learning require development in line with computer and technology advances.

  16. ORCA Project: Research on high-performance parallel computer programming environments. Final report, 1 Apr-31 Mar 90

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Snyder, L.; Notkin, D.; Adams, L.

    1990-03-31

    This task relates to research on programming massively parallel computers. Previous work on the Ensamble concept of programming was extended, and an investigation into nonshared-memory models of parallel computation was undertaken. The Ensamble concept defines a set of programming abstractions and organizes the programming task into three distinct levels: composition of machine instructions, composition of processes, and composition of phases. It had previously been applied to shared-memory models of computation; during the present research period, these concepts were extended to nonshared-memory models. During this period, one Ph.D. thesis was completed, and one book chapter and six conference proceedings were published.

  17. A Preliminary Validation of Attention, Relevance, Confidence and Satisfaction Model-Based Instructional Material Motivational Survey in a Computer-Based Tutorial Setting

    ERIC Educational Resources Information Center

    Huang, Wenhao; Huang, Wenyeh; Diefes-Dux, Heidi; Imbrie, Peter K.

    2006-01-01

    This paper describes a preliminary validation study of the Instructional Material Motivational Survey (IMMS) derived from the Attention, Relevance, Confidence and Satisfaction motivational design model. Previous studies related to the IMMS, however, suggest its practical application for motivational evaluation in various instructional settings…

  18. Incorporating Prototyping and Iteration into Intervention Development: A Case Study of a Dining Hall-Based Intervention

    ERIC Educational Resources Information Center

    McClain, Arianna D.; Hekler, Eric B.; Gardner, Christopher D.

    2013-01-01

    Background: Previous research from the fields of computer science and engineering highlight the importance of an iterative design process (IDP) to create more creative and effective solutions. Objective: This study describes IDP as a new method for developing health behavior interventions and evaluates the effectiveness of a dining hall--based…

  19. The Impact of Recasts on the Development of Primary Stress in a Synchronous Computer-Mediated Environment

    ERIC Educational Resources Information Center

    Parlak, Özgür; Ziegler, Nicole

    2017-01-01

    Although previous research has demonstrated the efficacy of recasts on second language (L2) morphology and lexis (e.g., Li, 2010; Mackey & Goo, 2007), few studies have examined their effect on learners' phonological development (although see Saito, 2015; Saito & Lyster, 2012). The current study investigates the impact of recasts on the…

  20. Digital Modeling in Design Foundation Coursework: An Exploratory Study of the Effectiveness of Conceptual Design Software

    ERIC Educational Resources Information Center

    Guidera, Stan; MacPherson, D. Scot

    2008-01-01

    This paper presents the results of a study that was conducted to identify and document student perceptions of the effectiveness of computer modeling software introduced in a design foundations course that had previously utilized only conventional manually-produced representation techniques. Rather than attempt to utilize a production-oriented CAD…

  1. Teachers' Perceptions about their Own and their Schools' Readiness for Computer Implementation: A South African Case Study

    ERIC Educational Resources Information Center

    du Plessis, Andre; Webb, Paul

    2012-01-01

    This case study, involving 30 participating teachers from six previously disadvantaged South African schools, provides data on teacher perceptions of the challenges related to implementing Information and Communication Technology (ICT). The schools had minimal resources as a residual result of the South African apartheid policy prior to 1994 and…

  2. Virtual Worlds to Support Patient Group Communication? A Questionnaire Study Investigating Potential for Virtual World Focus Group Use by Respiratory Patients

    ERIC Educational Resources Information Center

    Taylor, Michael J.; Taylor, Dave; Vlaev, Ivo; Elkin, Sarah

    2017-01-01

    Recent advances in communication technologies enable potential provision of remote education for patients using computer-generated environments known as virtual worlds. Previous research has revealed highly variable levels of patient receptiveness to using information technologies for healthcare-related purposes. This preliminary study involved…

  3. Language Networks Associated with Computerized Semantic Indices

    PubMed Central

    Pakhomov, Serguei V. S.; Jones, David T.; Knopman, David S.

    2014-01-01

    Tests of generative semantic verbal fluency are widely used to study the organization and representation of concepts in the human brain. Previous studies demonstrated that clustering and switching behavior during verbal fluency tasks is supported by multiple brain mechanisms associated with semantic memory and executive control. Previous work relied on manual assessments of semantic relatedness between words and manual grouping of words into semantic clusters. We investigated a computational linguistic approach to measuring the strength of semantic relatedness between words based on latent semantic analysis of word co-occurrences in a subset of a large online encyclopedia. We computed semantic clustering indices and compared them to brain network connectivity measures obtained with task-free fMRI in a sample consisting of healthy participants and those differentially affected by cognitive impairment. We found that semantic clustering indices were associated with brain network connectivity in distinct areas including fronto-temporal, fronto-parietal, and fusiform gyrus regions. This study shows that computerized semantic indices complement traditional assessments of verbal fluency to provide a more complete account of the relationship between the brain and the verbal behavior involved in the organization and retrieval of lexical information from memory. PMID:25315785
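The clustering-and-switching computation described above, scoring consecutive words by vector similarity and opening a new cluster at each "switch", can be sketched as follows. Cosine similarity over raw co-occurrence vectors is a simplification of the LSA-derived measure used in the study, and the threshold is illustrative:

```python
import math

def cosine_relatedness(vec_a, vec_b):
    """Cosine of two word vectors as the strength of semantic relatedness
    (a stand-in for the LSA-based measure described above)."""
    dot = sum(a * b for a, b in zip(vec_a, vec_b))
    norm = (math.sqrt(sum(a * a for a in vec_a))
            * math.sqrt(sum(b * b for b in vec_b)))
    return dot / norm if norm else 0.0

def semantic_clusters(words, vectors, threshold=0.5):
    """Group a verbal-fluency sequence: consecutive words stay in the same
    cluster while their relatedness exceeds the threshold; a drop below it
    is a "switch" that opens a new cluster."""
    clusters = [[words[0]]]
    for prev_word, word in zip(words, words[1:]):
        if cosine_relatedness(vectors[prev_word], vectors[word]) >= threshold:
            clusters[-1].append(word)
        else:
            clusters.append([word])
    return clusters
```

Cluster sizes and switch counts derived from such groupings are the kind of semantic clustering indices the study compared to brain connectivity measures.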

  4. Slat Cove Unsteadiness: Effect of 3D Flow Structures

    NASA Technical Reports Server (NTRS)

    Choudhari, Meelan M.; Khorrami, Mehdi R.

    2006-01-01

    Previous studies have indicated that 2D, time accurate computations based on a pseudo-laminar zonal model of the slat cove region (within the framework of the Reynolds-Averaged Navier-Stokes equations) are inadequate for predicting the full unsteady dynamics of the slat cove flow field. Even though such computations could capture the large-scale, unsteady vorticity structures in the slat cove region without requiring any external forcing, the simulated vortices were excessively strong and the recirculation zone was unduly energetic in comparison with the PIV measurements for a generic high-lift configuration. To resolve this discrepancy and to help enable physics based predictions of slat aeroacoustics, the present paper is focused on 3D simulations of the slat cove flow over a computational domain of limited spanwise extent. Maintaining the pseudo-laminar approach, current results indicate that accounting for the three-dimensionality of flow fluctuations leads to considerable improvement in the accuracy of the unsteady, nearfield solution. Analysis of simulation data points to the likely significance of turbulent fluctuations near the reattachment region toward the generation of broadband slat noise. The computed acoustic characteristics (in terms of the frequency spectrum and spatial distribution) within short distances from the slat resemble the previously reported, subscale measurements of slat noise.

  5. Survival of a pedicled latissimus dorsi flap in breast reconstruction without a thoracodorsal pedicle.

    PubMed

    Hartmann, C E A; Branford, O A; Malhotra, A; Chana, J S

    2013-07-01

    The latissimus dorsi flap, first performed by Tansini in 1892, was popularised for use by Olivari in 1976. The successful transfer of a latissimus dorsi flap during breast reconstruction has previously been thought to depend on an intact thoracodorsal pedicle to ensure flap survival. It is well documented that the flap may also survive on the serratus branch after thoracodorsal pedicle division. We report a case of a 52-year-old female patient who underwent successful delayed breast reconstruction with a latissimus dorsi flap following previous mastectomy and axillary node clearance. Intraoperatively, the thoracodorsal pedicle and serratus branch were found to have been previously divided. On postoperative computed tomographic angiography the thoracodorsal pedicle was shown to be divided together with the serratus branch, and the flap was seen to be supplied by the lateral thoracic artery. To our knowledge, survival of a pedicled latissimus dorsi flap in breast reconstruction with a vascular supply from this vessel following thoracodorsal pedicle division has not previously been described. Previous thoracodorsal pedicle and serratus branch division may not be an absolute contraindication to the use of the latissimus dorsi flap in breast reconstruction, depending on the results of preoperative Doppler or computed tomographic angiography studies. Copyright © 2012 British Association of Plastic, Reconstructive and Aesthetic Surgeons. Published by Elsevier Ltd. All rights reserved.

  6. Temporal trends in compliance with appropriateness criteria for stress single-photon emission computed tomography sestamibi studies in an academic medical center.

    PubMed

    Gibbons, Raymond J; Askew, J Wells; Hodge, David; Miller, Todd D

    2010-03-01

    The purpose of this study was to apply published appropriateness criteria for single-photon emission computed tomography (SPECT) myocardial perfusion imaging (MPI) in a single academic medical center to determine whether the percentage of inappropriate studies was changing over time. In a previous study, we applied the American College of Cardiology Foundation/American Society of Nuclear Cardiology (ASNC) appropriateness criteria for stress SPECT MPI and reported that 14% of stress SPECT studies were performed for inappropriate reasons. Using similar methodology, we retrospectively examined 284 patients who underwent stress SPECT MPI in October 2006 and compared the findings to the previous cohort of 284 patients who underwent stress SPECT MPI in May 2005. The indications for testing in the 2 cohorts were very similar. The overall level of agreement in characterizing categories of appropriateness between 2 experienced cardiovascular nurse abstractors was good (kappa = 0.68), an improvement from our previous study (kappa = 0.56). There was a significant change between May 2005 and October 2006 in the overall classification of categories of appropriateness (P = .024 by chi-square statistic). There were modest but insignificant increases in the number of patients who were unclassified (15% in the current study vs 11% previously), appropriate (66% vs 64%), and uncertain (12% vs 11%). Only 7% of the studies in the current cohort were inappropriate, a significant (P = .004) decrease from the 14% reported in the 2005 cohort. In the absence of any specific intervention, there was a significant change in the overall classification of SPECT appropriateness in an academic medical center over 17 months. The only significant difference in individual categories was a decrease in inappropriate studies. Additional measurements over time will be required to determine whether this trend is sustainable or generalizable.

  7. HGIMDA: Heterogeneous graph inference for miRNA-disease association prediction

    PubMed Central

    Zhang, Xu; You, Zhu-Hong; Huang, Yu-An; Yan, Gui-Ying

    2016-01-01

    Recently, microRNAs (miRNAs) have drawn more and more attention because accumulating experimental studies have indicated that miRNAs could play critical roles in multiple biological processes as well as in the development and progression of human complex diseases. Using the huge number of known heterogeneous biological datasets to predict potential associations between miRNAs and diseases is an important topic in the fields of biology, medicine, and bioinformatics. In this study, considering the limitations of previous computational methods, we developed the computational model of Heterogeneous Graph Inference for MiRNA-Disease Association prediction (HGIMDA) to uncover potential miRNA-disease associations by integrating miRNA functional similarity, disease semantic similarity, Gaussian interaction profile kernel similarity, and experimentally verified miRNA-disease associations into a heterogeneous graph. HGIMDA obtained AUCs of 0.8781 and 0.8077 based on global and local leave-one-out cross validation, respectively. Furthermore, HGIMDA was applied to three important human cancers for performance evaluation. As a result, 90% (Colon Neoplasms), 88% (Esophageal Neoplasms), and 88% (Kidney Neoplasms) of the top 50 predicted miRNAs were confirmed by recent experimental reports. Furthermore, HGIMDA can be effectively applied to new diseases and new miRNAs without any known associations, which overcomes an important limitation of many previous computational models. PMID:27533456

  8. HGIMDA: Heterogeneous graph inference for miRNA-disease association prediction.

    PubMed

    Chen, Xing; Yan, Chenggang Clarence; Zhang, Xu; You, Zhu-Hong; Huang, Yu-An; Yan, Gui-Ying

    2016-10-04

    Recently, microRNAs (miRNAs) have drawn more and more attention because accumulating experimental studies have indicated that miRNAs could play critical roles in multiple biological processes as well as in the development and progression of human complex diseases. Using the huge number of known heterogeneous biological datasets to predict potential associations between miRNAs and diseases is an important topic in the fields of biology, medicine, and bioinformatics. In this study, considering the limitations of previous computational methods, we developed the computational model of Heterogeneous Graph Inference for MiRNA-Disease Association prediction (HGIMDA) to uncover potential miRNA-disease associations by integrating miRNA functional similarity, disease semantic similarity, Gaussian interaction profile kernel similarity, and experimentally verified miRNA-disease associations into a heterogeneous graph. HGIMDA obtained AUCs of 0.8781 and 0.8077 based on global and local leave-one-out cross validation, respectively. Furthermore, HGIMDA was applied to three important human cancers for performance evaluation. As a result, 90% (Colon Neoplasms), 88% (Esophageal Neoplasms), and 88% (Kidney Neoplasms) of the top 50 predicted miRNAs were confirmed by recent experimental reports. Furthermore, HGIMDA can be effectively applied to new diseases and new miRNAs without any known associations, which overcomes an important limitation of many previous computational models.
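The style of computation behind HGIMDA, propagating known association scores through miRNA and disease similarity matrices on the heterogeneous graph, can be illustrated with a generic iterative scheme. The update rule, normalization, and parameter values below are illustrative; the paper's exact formulation may differ:

```python
import numpy as np

def propagate_scores(sim_mirna, sim_disease, known, alpha=0.4, iters=50):
    """Generic iterative inference on a miRNA-disease heterogeneous graph:
    repeatedly blend similarity-propagated scores with the known
    association matrix (rows: miRNAs, columns: diseases)."""
    # Row-normalize so each propagation step is a weighted average.
    sm = sim_mirna / sim_mirna.sum(axis=1, keepdims=True)
    sd = sim_disease / sim_disease.sum(axis=1, keepdims=True)
    scores = known.astype(float)
    for _ in range(iters):
        scores = alpha * (sm @ scores @ sd.T) + (1 - alpha) * known
    return scores
```

Because scores flow through the similarity matrices, a miRNA or disease with no known associations still receives nonzero predictions from its similar neighbors, which is how this style of model handles new diseases and new miRNAs.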

  9. Hybrid method to estimate two-layered superficial tissue optical properties from simulated data of diffuse reflectance spectroscopy.

    PubMed

    Hsieh, Hong-Po; Ko, Fan-Hua; Sung, Kung-Bin

    2018-04-20

    An iterative curve fitting method has been applied in both simulation [J. Biomed. Opt. 17, 107003 (2012)] and phantom [J. Biomed. Opt. 19, 077002 (2014)] studies to accurately extract the optical properties and top layer thickness of a two-layered superficial tissue model from diffuse reflectance spectroscopy (DRS) data. This paper describes a hybrid two-step parameter estimation procedure to address two main issues of the previous method: (1) high computational intensity and (2) convergence to local minima. The parameter estimation procedure contained a novel initial estimation step to obtain an initial guess, which was used by a subsequent iterative fitting step to optimize the parameter estimation. A lookup table was used in both steps to quickly obtain reflectance spectra and reduce computational intensity. On simulated DRS data, the proposed parameter estimation procedure achieved high estimation accuracy and a 95% reduction in computational time compared to previous studies. Furthermore, the proposed initial estimation step led to better convergence of the subsequent fitting step. Strategies used in the proposed procedure could benefit both the modeling and the experimental data processing not only of DRS but also of related approaches such as near-infrared spectroscopy.
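The two-step structure described above, a coarse lookup-table pass that supplies the initial guess followed by an iterative local fit, can be sketched generically. The forward model, parameter grid, and coordinate-descent refinement here are all illustrative stand-ins for the paper's Monte-Carlo-based lookup table and fitting routine:

```python
import numpy as np

def estimate_parameters(measured, forward_model, grid):
    """Two-step estimation: (1) initial estimation -- pick the grid entry
    whose predicted spectrum is nearest the measurement; (2) iterative
    fitting -- refine that guess with a simple local search.

    `forward_model` maps a parameter vector to a predicted spectrum;
    `grid` lists the parameter vectors tabulated in the lookup table."""
    # Step 1: coarse lookup-table pass supplies the initial guess.
    errors = [np.sum((forward_model(p) - measured) ** 2) for p in grid]
    guess = np.array(grid[int(np.argmin(errors))], dtype=float)

    # Step 2: coordinate descent with a shrinking step size.
    step = np.full(len(guess), 0.1)
    for _ in range(60):
        for i in range(len(guess)):
            for delta in (step[i], -step[i]):
                trial = guess.copy()
                trial[i] += delta
                if (np.sum((forward_model(trial) - measured) ** 2)
                        < np.sum((forward_model(guess) - measured) ** 2)):
                    guess = trial
        step *= 0.7  # shrink the search radius each sweep
    return guess
```

Starting the local fit from the nearest table entry is what reduces both the iteration count and the risk of converging to a distant local minimum.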

  10. Computer program to minimize prediction error in models from experiments with 16 hypercube points and 0 to 6 center points

    NASA Technical Reports Server (NTRS)

    Holms, A. G.

    1982-01-01

    A previous report described a backward deletion procedure of model selection that was optimized for minimum prediction error and that used a multiparameter combination of the F-distribution and an order-statistics distribution of Cochran's. A computer program is described that applies the previously optimized procedure to real data. The use of the program is illustrated by examples.

  11. Computational mechanisms underlying cortical responses to the affordance properties of visual scenes

    PubMed Central

    Epstein, Russell A.

    2018-01-01

    Biologically inspired deep convolutional neural networks (CNNs), trained for computer vision tasks, have been found to predict cortical responses with remarkable accuracy. However, the internal operations of these models remain poorly understood, and the factors that account for their success are unknown. Here we develop a set of techniques for using CNNs to gain insights into the computational mechanisms underlying cortical responses. We focused on responses in the occipital place area (OPA), a scene-selective region of dorsal occipitoparietal cortex. In a previous study, we showed that fMRI activation patterns in the OPA contain information about the navigational affordances of scenes; that is, information about where one can and cannot move within the immediate environment. We hypothesized that this affordance information could be extracted using a set of purely feedforward computations. To test this idea, we examined a deep CNN with a feedforward architecture that had been previously trained for scene classification. We found that responses in the CNN to scene images were highly predictive of fMRI responses in the OPA. Moreover the CNN accounted for the portion of OPA variance relating to the navigational affordances of scenes. The CNN could thus serve as an image-computable candidate model of affordance-related responses in the OPA. We then ran a series of in silico experiments on this model to gain insights into its internal operations. These analyses showed that the computation of affordance-related features relied heavily on visual information at high-spatial frequencies and cardinal orientations, both of which have previously been identified as low-level stimulus preferences of scene-selective visual cortex. These computations also exhibited a strong preference for information in the lower visual field, which is consistent with known retinotopic biases in the OPA. 
Visualizations of feature selectivity within the CNN suggested that affordance-based responses encoded features that define the layout of the spatial environment, such as boundary-defining junctions and large extended surfaces. Together, these results map the sensory functions of the OPA onto a fully quantitative model that provides insights into its visual computations. More broadly, they advance integrative techniques for understanding visual cortex across multiple levels of analysis: from the identification of cortical sensory functions to the modeling of their underlying algorithms. PMID:29684011
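A common way to realize the CNN-to-cortex prediction step described above (though not necessarily the authors' exact pipeline) is a ridge-regression encoding model that maps CNN-layer activations to voxel responses and scores held-out predictions by per-voxel correlation. A minimal sketch; all array shapes and names are illustrative:

```python
import numpy as np

def ridge_fit(X, Y, lam=1.0):
    """Closed-form ridge regression from CNN features X (n_images x n_features)
    to voxel responses Y (n_images x n_voxels)."""
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ Y)

def encoding_accuracy(X_train, Y_train, X_test, Y_test, lam=1.0):
    """Per-voxel Pearson correlation between predicted and held-out responses."""
    W = ridge_fit(X_train, Y_train, lam)
    pred = X_test @ W
    pz = (pred - pred.mean(0)) / pred.std(0)
    yz = (Y_test - Y_test.mean(0)) / Y_test.std(0)
    return (pz * yz).mean(0)
```

In an analysis like the one summarized, X would hold activations from one CNN layer and Y the OPA activation patterns for the same images.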

  12. Computer literacy and attitudes towards e-learning among first year medical students

    PubMed Central

    Link, Thomas Michael; Marz, Richard

    2006-01-01

    Background At the Medical University of Vienna, most information for students is available only online. In 2005, an e-learning project was initiated and there are plans to introduce a learning management system. In this study, we estimate the level of students' computer skills, the number of students having difficulty with e-learning, and the number of students opposed to e-learning. Methods The study was conducted in an introductory course on computer-based and web-based training (CBT/WBT). Students were asked to fill out a questionnaire online that covered a wide range of relevant attitudes and experiences. Results While the great majority of students possess sufficient computer skills and acknowledge the advantages of interactive and multimedia-enhanced learning material, a small percentage lacks basic computer skills and/or is very skeptical about e-learning. There is also a consistently significant albeit weak gender difference in available computer infrastructure and Internet access. As for student attitudes toward e-learning, we found that age, computer use, and previous exposure to computers are more important than gender. A sizable number of students, 12% of the total, make little or no use of existing e-learning offerings. Conclusion Many students would benefit from a basic introduction to computers and to the relevant computer-based resources of the university. Given the wide range of computer skills among students, a single computer course for all students would be neither useful nor accepted. Special measures should be taken to prevent students who lack computer skills from being disadvantaged or from developing computer-hostile attitudes. PMID:16784524

  13. Computer literacy and attitudes towards e-learning among first year medical students.

    PubMed

    Link, Thomas Michael; Marz, Richard

    2006-06-19

    At the Medical University of Vienna, most information for students is available only online. In 2005, an e-learning project was initiated and there are plans to introduce a learning management system. In this study, we estimate the level of students' computer skills, the number of students having difficulty with e-learning, and the number of students opposed to e-learning. The study was conducted in an introductory course on computer-based and web-based training (CBT/WBT). Students were asked to fill out a questionnaire online that covered a wide range of relevant attitudes and experiences. While the great majority of students possess sufficient computer skills and acknowledge the advantages of interactive and multimedia-enhanced learning material, a small percentage lacks basic computer skills and/or is very skeptical about e-learning. There is also a consistently significant albeit weak gender difference in available computer infrastructure and Internet access. As for student attitudes toward e-learning, we found that age, computer use, and previous exposure to computers are more important than gender. A sizable number of students, 12% of the total, make little or no use of existing e-learning offerings. Many students would benefit from a basic introduction to computers and to the relevant computer-based resources of the university. Given the wide range of computer skills among students, a single computer course for all students would be neither useful nor accepted. Special measures should be taken to prevent students who lack computer skills from being disadvantaged or from developing computer-hostile attitudes.

  14. Molecular simulation investigation into the performance of Cu-BTC metal-organic frameworks for carbon dioxide-methane separations.

    PubMed

    Gutiérrez-Sevillano, Juan José; Caro-Pérez, Alejandro; Dubbeldam, David; Calero, Sofía

    2011-12-07

    We report a molecular simulation study for Cu-BTC metal-organic frameworks as carbon dioxide-methane separation devices. For this study we have computed adsorption and diffusion of methane and carbon dioxide in the structure, both as pure components and mixtures over the full range of bulk gas compositions. From the single component isotherms, mixture adsorption is predicted using the ideal adsorbed solution theory. These predictions are in very good agreement with our computed mixture isotherms and with previously reported data. Adsorption and diffusion selectivities and preferential sitings are also discussed with the aim to provide new molecular level information for all studied systems.
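The ideal adsorbed solution theory (IAST) step mentioned above predicts mixture adsorption from pure-component isotherms by equating spreading pressures. A sketch for a binary mixture with single-site Langmuir pure isotherms; the parameters in the example below are illustrative, not fitted Cu-BTC values:

```python
import math

def iast_binary(p, y1, qsat, b):
    """Binary IAST with single-site Langmuir pure-component isotherms.
    p: total bulk pressure; y1: gas-phase mole fraction of component 1;
    qsat, b: Langmuir saturation loadings and affinities for components (1, 2).
    Returns the adsorbed-phase mole fraction x1 and the total loading."""
    def grand_pot(i, p0):  # reduced grand potential of a Langmuir isotherm
        return qsat[i] * math.log(1.0 + b[i] * p0)
    y2 = 1.0 - y1
    lo, hi = 1e-12, 1.0 - 1e-12
    for _ in range(200):  # bisection on x1: equate spreading pressures
        x1 = 0.5 * (lo + hi)
        f = grand_pot(0, p * y1 / x1) - grand_pot(1, p * y2 / (1.0 - x1))
        if f > 0:
            lo = x1
        else:
            hi = x1
    x1 = 0.5 * (lo + hi)
    p01, p02 = p * y1 / x1, p * y2 / (1.0 - x1)
    q1 = qsat[0] * b[0] * p01 / (1.0 + b[0] * p01)
    q2 = qsat[1] * b[1] * p02 / (1.0 + b[1] * p02)
    q_total = 1.0 / (x1 / q1 + (1.0 - x1) / q2)  # IAST mixing rule
    return x1, q_total
```

When the two saturation loadings are equal, IAST reduces to the extended Langmuir model, which provides a convenient sanity check on the solver.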

  15. Computational Study of the Structure of a Sepiolite/Thioindigo Mayan Pigment

    PubMed Central

    Alvarado, Manuel; Chianelli, Russell C.; Arrowood, Roy M.

    2012-01-01

    The interaction of thioindigo and the phyllosilicate clay sepiolite is investigated using density functional theory (DFT) and molecular orbital theory (MO). The best fit to experimental UV/Vis spectra occurs when a single thioindigo molecule attaches via van der Waals forces to a tetrahedrally coordinated Al3+ cation with an additional nearby tetrahedrally coordinated Al3+ also present. The thioindigo molecule distorts from its planar structure, a behavior consistent with a color change. Due to the weak interaction between thioindigo and sepiolite, we conclude that the thioindigo molecule must be trapped in a channel, an observation consistent with previous experimental studies. Future computational studies will look at the interaction of indigo with sepiolite. PMID:23193386

  16. Exploring the challenges faced by polytechnic students

    NASA Astrophysics Data System (ADS)

    Matore, Mohd Effendi @ Ewan Mohd; Khairani, Ahmad Zamri

    2015-02-01

    This study aims to identify challenges faced by students in seven polytechnics in Malaysia beyond those already documented, as a continuation of previous research that had identified 52 main challenges using the Rasch Model. The explorative study focuses on challenges that are not included in the Mooney Problem Checklist (MPCL). A total of 121 polytechnic students submitted 183 written responses through the open questions provided. Two hundred fifty-two students responded to dichotomous questions about the challenges they faced. The data was analysed qualitatively using NVivo 8.0. The findings showed that students from Politeknik Seberang Perai (PSP) gave the highest response, 56 (30.6%), and Politeknik Metro Kuala Lumpur (PMKL) the lowest, 2 (1.09%). Five dominant challenges were identified: the English language (32, 17.5%), learning (14, 7.7%), vehicles (13, 7.1%), information and communication technology (ICT) (13, 7.1%), and peers (11, 6.0%). This article, however, focuses on three of these challenges, namely English language, vehicles, and computers and ICT, as the challenges of learning and peers had been analysed in the previous MPCL. The English language challenge concerned weaknesses in speech and fluency. The computer and ICT challenge covered weak mastery of ICT and computers, as well as computer breakdowns and low-performance computers. The vehicle challenge emphasized the unavailability of vehicles to attend lectures and go elsewhere, the lack of transportation services at the polytechnic, and not having a valid driving license. These challenges are highly relevant and need to be addressed in preparing polytechnics for their ongoing transformation.

  17. Compact Method for Modeling and Simulation of Memristor Devices

    DTIC Science & Technology

    2011-08-01

    single-valued equations. 15. SUBJECT TERMS Memristor, Neuromorphic , Cognitive, Computing, Memory, Emerging Technology, Computational Intelligence 16...resistance state depends on its previous state and present electrical biasing conditions, and when combined with transistors in a hybrid chip ...computers, reconfigurable electronics and neuromorphic computing [3,4]. According to Chua [4], the memristor behaves like a linear resistor with

  18. 40 CFR 86.099-17 - Emission control diagnostic system for 1999 and later light-duty vehicles and light-duty trucks.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... of computer codes. The emission control diagnostic system shall record and store in computer memory..., shall be stored in computer memory to identify correctly functioning emission control systems and those... in computer memory. Should a subsequent fuel system or misfire malfunction occur, any previously...

  19. 40 CFR 86.099-17 - Emission control diagnostic system for 1999 and later light-duty vehicles and light-duty trucks.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... of computer codes. The emission control diagnostic system shall record and store in computer memory..., shall be stored in computer memory to identify correctly functioning emission control systems and those... in computer memory. Should a subsequent fuel system or misfire malfunction occur, any previously...

  20. 37 CFR 1.825 - Amendments to or replacement of sequence listing and computer readable copy thereof.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... of sequence listing and computer readable copy thereof. 1.825 Section 1.825 Patents, Trademarks, and... Amino Acid Sequences § 1.825 Amendments to or replacement of sequence listing and computer readable copy... copy of the computer readable form (§ 1.821(e)) including all previously submitted data with the...

  1. 37 CFR 1.825 - Amendments to or replacement of sequence listing and computer readable copy thereof.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... of sequence listing and computer readable copy thereof. 1.825 Section 1.825 Patents, Trademarks, and... Amino Acid Sequences § 1.825 Amendments to or replacement of sequence listing and computer readable copy... copy of the computer readable form (§ 1.821(e)) including all previously submitted data with the...

  2. 37 CFR 1.825 - Amendments to or replacement of sequence listing and computer readable copy thereof.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... of sequence listing and computer readable copy thereof. 1.825 Section 1.825 Patents, Trademarks, and... Amino Acid Sequences § 1.825 Amendments to or replacement of sequence listing and computer readable copy... copy of the computer readable form (§ 1.821(e)) including all previously submitted data with the...

  3. 37 CFR 1.825 - Amendments to or replacement of sequence listing and computer readable copy thereof.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... of sequence listing and computer readable copy thereof. 1.825 Section 1.825 Patents, Trademarks, and... Amino Acid Sequences § 1.825 Amendments to or replacement of sequence listing and computer readable copy... copy of the computer readable form (§ 1.821(e)) including all previously submitted data with the...

  4. 37 CFR 1.825 - Amendments to or replacement of sequence listing and computer readable copy thereof.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... of sequence listing and computer readable copy thereof. 1.825 Section 1.825 Patents, Trademarks, and... Amino Acid Sequences § 1.825 Amendments to or replacement of sequence listing and computer readable copy... copy of the computer readable form (§ 1.821(e)) including all previously submitted data with the...

  5. Computer-assisted photogrammetric mapping systems for geologic studies-A progress report

    USGS Publications Warehouse

    Pillmore, C.L.; Dueholm, K.S.; Jepsen, H.S.; Schuch, C.H.

    1981-01-01

    Photogrammetry has played an important role in geologic mapping for many years; however, only recently have attempts been made to automate mapping functions for geology. Computer-assisted photogrammetric mapping systems for geologic studies have been developed and are currently in use in offices of the Geological Survey of Greenland at Copenhagen, Denmark, and the U.S. Geological Survey at Denver, Colorado. Though differing somewhat, the systems are similar in that they integrate Kern PG-2 photogrammetric plotting instruments and small desk-top computers that are programmed to perform special geologic functions and operate flat-bed plotters by means of specially designed hardware and software. A z-drive capability, in which stepping motors control the z-motions of the PG-2 plotters, is an integral part of both systems. This feature enables the computer to automatically position the floating mark on computer-calculated, previously defined geologic planes, such as contacts or the base of coal beds, throughout the stereoscopic model in order to improve the mapping capabilities of the instrument and to aid in correlation and tracing of geologic units. The common goal is to enhance the capabilities of the PG-2 plotter and provide a means by which geologists can make conventional geologic maps more efficiently and explore ways to apply computer technology to geologic studies. © 1981.

  6. Symbolic Computation of Strongly Connected Components Using Saturation

    NASA Technical Reports Server (NTRS)

    Zhao, Yang; Ciardo, Gianfranco

    2010-01-01

    Finding strongly connected components (SCCs) in the state-space of discrete-state models is a critical task in formal verification of LTL and fair CTL properties, but the potentially huge number of reachable states and SCCs constitutes a formidable challenge. This paper is concerned with computing the sets of states in SCCs or terminal SCCs of asynchronous systems. Because of its advantages in many applications, we employ saturation on two previously proposed approaches: the Xie-Beerel algorithm and transitive closure. First, saturation speeds up state-space exploration when computing each SCC in the Xie-Beerel algorithm. Then, our main contribution is a novel algorithm to compute the transitive closure using saturation. Experimental results indicate that our improved algorithms achieve a clear speedup over previous algorithms in some cases. With the help of the new transitive closure computation algorithm, up to 10^150 SCCs can be explored within a few seconds.
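The core idea of the Xie-Beerel algorithm referenced above is that the SCC containing a pivot v is Forward(v) ∩ Backward(v). An explicit-state sketch of that idea (without the symbolic encoding or saturation, and without the recursive partitioning of the full algorithm):

```python
from collections import deque

def reach(adj, start, allowed):
    """BFS over edges in adj, restricted to the 'allowed' node set."""
    seen = {start}
    q = deque([start])
    while q:
        u = q.popleft()
        for w in adj.get(u, ()):
            if w in allowed and w not in seen:
                seen.add(w)
                q.append(w)
    return seen

def xie_beerel_sccs(adj):
    """Enumerate SCCs as Forward(v) ∩ Backward(v) over shrinking node sets."""
    radj = {}
    for u, ws in adj.items():
        for w in ws:
            radj.setdefault(w, []).append(u)
    remaining = set(adj) | set(radj)
    sccs = []
    while remaining:
        v = next(iter(remaining))       # pick a pivot
        fwd = reach(adj, v, remaining)  # states reachable from v
        bwd = reach(radj, v, remaining) # states that can reach v
        scc = fwd & bwd
        sccs.append(scc)
        remaining -= scc
    return sccs
```

The symbolic version replaces the explicit node sets with decision diagrams and the BFS with saturation-based image computations.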

  7. Computational modeling and experimental studies on NO{sub x} reduction under pulverized coal combustion conditions. Seventh quarterly technical progress report, July 1, 1996--September 30, 1996

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kumpaty, S.K.; Subramanian, K.; Nokku, V.P.

    1996-12-31

    During this quarter (July-August 1996), the experiments for nitric oxide reburning with a combination of methane and ammonia were conducted successfully. This marked the completion of gaseous phase experiments. Preparations are underway for the reburning studies with coal. A coal feeder was designed to suit our reactor facility which is being built by MK Fabrication. The coal feeder should be operational in the coming quarter. Presented here are the experimental results of NO reburning with methane/ammonia. The results are consistent with the computational work submitted in previous reports.

  8. Computational complexity of the landscape II-Cosmological considerations

    NASA Astrophysics Data System (ADS)

    Denef, Frederik; Douglas, Michael R.; Greene, Brian; Zukowski, Claire

    2018-05-01

    We propose a new approach for multiverse analysis based on computational complexity, which leads to a new family of "computational" measure factors. By defining a cosmology as a space-time containing a vacuum with specified properties (for example small cosmological constant) together with rules for how time evolution will produce the vacuum, we can associate global time in a multiverse with clock time on a supercomputer which simulates it. We argue for a principle of "limited computational complexity" governing early universe dynamics as simulated by this supercomputer, which translates to a global measure for regulating the infinities of eternal inflation. The rules for time evolution can be thought of as a search algorithm, whose details should be constrained by a stronger principle of "minimal computational complexity". Unlike previously studied global measures, ours avoids standard equilibrium considerations and the well-known problems of Boltzmann Brains and the youngness paradox. We also give various definitions of the computational complexity of a cosmology, and argue that there are only a few natural complexity classes.

  9. Coupling fast fluid dynamics and multizone airflow models in Modelica Buildings library to simulate the dynamics of HVAC systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tian, Wei; Sevilla, Thomas Alonso; Zuo, Wangda

    Historically, multizone models are widely used in building airflow and energy performance simulations due to their fast computing speed. However, multizone models assume that the air in a room is well mixed, consequently limiting their application. In specific rooms where this assumption fails, the use of computational fluid dynamics (CFD) models may be an alternative option. Previous research has mainly focused on coupling CFD models and multizone models to study airflow in large spaces. While significant, most of these analyses did not consider the coupled simulation of the building airflow with the building's Heating, Ventilation, and Air-Conditioning (HVAC) systems. This paper tries to fill the gap by integrating the models for HVAC systems with coupled multizone and CFD simulations for airflows, using the Modelica simulation platform. To improve the computational efficiency, we incorporated a simplified CFD model named fast fluid dynamics (FFD). We first introduce the data synchronization strategy and implementation in Modelica. Then, we verify the implementation using two case studies involving an isothermal and a non-isothermal flow by comparing model simulations to experiment data. Afterward, we study another three cases that are deemed more realistic. This is done by attaching a variable air volume (VAV) terminal box and a VAV system to previous flows to assess the capability of the models in studying the dynamic control of HVAC systems. Finally, we discuss further research needs on the coupled simulation using the models.
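The data-synchronization strategy described (each solver advancing a step and then exchanging interface quantities such as airflow rate and temperature) can be illustrated with a toy loose-coupling loop; the two "solvers" below are simple stand-ins, not the Modelica or FFD implementations:

```python
def loose_couple(f_a, f_b, xa, xb, dt, n_steps):
    """Ping-pong coupling: each solver advances one explicit Euler step using
    the other solver's value from the previous synchronization point."""
    for _ in range(n_steps):
        xa_next = xa + dt * f_a(xa, xb)  # solver A sees B's lagged state
        xb_next = xb + dt * f_b(xb, xa)  # solver B sees A's lagged state
        xa, xb = xa_next, xb_next
    return xa, xb
```

In the real coupling, f_a and f_b would be the multizone and FFD updates, and the exchanged scalars would be the interface boundary conditions; the lagged exchange is what introduces the synchronization error that the case studies quantify.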

  10. Extracting quasi-steady Lagrangian transport patterns from the ocean circulation: An application to the Gulf of Mexico.

    PubMed

    Duran, R; Beron-Vera, F J; Olascoaga, M J

    2018-03-26

    We construct a climatology of Lagrangian coherent structures (LCSs)-the concealed skeleton that shapes transport-with a twelve-year-long data-assimilative simulation of the sea-surface circulation in the Gulf of Mexico (GoM). Computed as time-mean Cauchy-Green strain tensorlines of the climatological velocity, the climatological LCSs (cLCSs) unveil recurrent Lagrangian circulation patterns. The cLCSs strongly constrain the ensemble-mean Lagrangian circulation of the instantaneous model velocity, showing that a climatological velocity can preserve meaningful transport information. The quasi-steady transport patterns revealed by the cLCSs agree well with aspects of the GoM circulation described in several previous observational and numerical studies. For example, the cLCSs identify regions of persistent isolation, and suggest that coastal regions previously identified as high-risk for pollution impact are regions of maximal attraction. We also show that cLCSs are remarkably accurate at identifying transport patterns observed during the Deepwater Horizon and Ixtoc oil spills, and during the Grand LAgrangian Deployment (GLAD) experiment. Thus it is shown that computing cLCSs is an efficient and meaningful way of synthesizing vast amounts of Lagrangian information. The cLCS method confirms previous GoM studies, and contributes to our understanding by revealing the persistent nature of the dynamics and kinematics treated therein.
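The climatological LCSs above are derived from the Cauchy-Green strain tensor C = (∇F)ᵀ∇F of the flow map F, whose dominant eigenvalue measures local material stretching. A finite-difference sketch for a 2-D flow map (the map itself stands in for a trajectory integrator):

```python
import numpy as np

def cauchy_green(flow_map, x, h=1e-4):
    """Central-difference gradient F of a 2-D flow map, then C = F^T F."""
    F = np.zeros((2, 2))
    for j in range(2):
        dx = np.zeros(2)
        dx[j] = h
        F[:, j] = (flow_map(x + dx) - flow_map(x - dx)) / (2 * h)
    return F.T @ F

def max_stretch(flow_map, x):
    """Largest singular value of the flow-map gradient at x."""
    C = cauchy_green(flow_map, x)
    return np.sqrt(np.linalg.eigvalsh(C)[-1])  # eigvalsh sorts ascending
```

In the cLCS construction, C is computed from the climatological velocity and the tensorlines of its time-mean field are extracted; here the flow map is any callable mapping initial to final positions.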

  11. Tablet computers in assessing performance in a high stakes exam: opinion matters.

    PubMed

    Currie, G P; Sinha, S; Thomson, F; Cleland, J; Denison, A R

    2017-06-01

    Background Tablet computers have emerged as a tool to capture, process and store data in examinations, yet evidence relating to their acceptability and usefulness in assessment is limited. Methods We performed an observational study to explore opinions and attitudes relating to tablet computer use in recording performance in a final year objective structured clinical examination at a single UK medical school. Examiners completed a short questionnaire encompassing background, forced-choice and open questions. Forced choice questions were analysed using descriptive statistics and open questions by framework analysis. Results Ninety-two (97% response rate) examiners completed the questionnaire of whom 85% had previous use of tablet computers. Ninety per cent felt checklist mark allocation was 'very/quite easy', while approximately half considered recording 'free-type' comments was 'easy/very easy'. Greater overall efficiency of marking and resource savings were considered the main advantages of tablet computers, while concerns relating to technological failure and ability to record free type comments were raised. Discussion In a context where examiners were familiar with tablet computers, they were preferred to paper checklists, although concerns were raised. This study adds to the limited literature underpinning the use of electronic devices as acceptable tools in objective structured clinical examinations.

  12. Design, processing and testing of LSI arrays, hybrid microelectronics task

    NASA Technical Reports Server (NTRS)

    Himmel, R. P.; Stuhlbarg, S. M.; Ravetti, R. G.; Zulueta, P. J.; Rothrock, C. W.

    1979-01-01

    Mathematical cost models previously developed for hybrid microelectronic subsystems were refined and expanded. Rework terms related to substrate fabrication, nonrecurring developmental and manufacturing operations, and prototype production are included. Sample computer programs were written to demonstrate hybrid microelectronic applications of these cost models. Computer programs were generated to calculate and analyze values for the total microelectronics costs. Large-scale integrated (LSI) chips utilizing tape chip carrier technology were studied. The feasibility of interconnecting arrays of LSI chips utilizing tape chip carrier and semiautomatic wire bonding technology was demonstrated.

  13. Computational Study of the Adsorption of Dimethyl Methylphosphonate (DMMP) on the (010) Surface of Anatase TiO2 With and Without Faceting

    DTIC Science & Technology

    2009-12-05

    surface area of anatase nanocrystals [6] and to be es- pecially active in photocatalysis [7]. Recent work by Dzwigaj et al. [8] has clearly shown that the...two-fold-coordinated (O2c) sites can also be involved in hydrogen bond (H-bond) formation. The effects, on the structure of the (100) and other...To reduce the computational cost , geometry optimization was done at the restricted Hartree Fock (RHF) level. This has previously been shown [36,37

  14. Enhancing Online Collaborative Argumentation through Question Elaboration and Goal Instructions

    ERIC Educational Resources Information Center

    Golanics, J. D.; Nussbaum, E. M.

    2008-01-01

    Computer-supported collaborative argumentation can improve understanding and problem-solving skills. This study uses WebCT to explore the improvement of argumentation in asynchronous, web-based discussions through goal instructions, which are statements at the end of a discussion prompt indicating what students should achieve. In a previous study…

  15. Relationship of Selected Abilities to Problem Solving Performance.

    ERIC Educational Resources Information Center

    Harmel, Sarah Jane

    This study investigated five ability tests related to the water-jug problem. Previous analyses identified two processes used during solution: means-ends analysis and memory of visited states. Subjects were 240 undergraduate psychology students. A real-time computer system presented the problem and recorded responses. Ability tests were paper and…

  16. Revisiting Cognitive Tools: Shifting the Focus to Tools-in-Use

    ERIC Educational Resources Information Center

    Kim, Minchi C.

    2012-01-01

    Many studies have been conducted on the topics of tools, computers, and technology designed to promote student learning. However, researchers have rarely raised the critical issue of the lack of consensus on conceptualizing "cognitive tools" or the possible challenges associated with previous definitions. By examining the limitations on research…

  17. Auto Mechanics; Methodology. Technical Instruction Manual.

    ERIC Educational Resources Information Center

    Systems Operation Support, Inc., King of Prussia, PA.

    This student instruction manual was written in conformance with selected criteria for programed instruction books as developed previously for various military training courses. The manual was developed as a part of "A Study of the Effectiveness of a Military-Type Computer-Based Instructional System When Used in Civilian High School Courses in…

  18. Politeness Strategies in Collaborative E-Mail Exchanges

    ERIC Educational Resources Information Center

    Vinagre, Margarita

    2008-01-01

    Computer-supported collaborative learning (CSCL) has been the subject of a wide range of studies over the last twenty years. Previous research suggests that CSCL exchanges can facilitate group-based learning and knowledge construction among learners who are in different geographical locations [Littleton, K. & Whitelock, D. (2004). "Guiding the…

  19. Disaggregated Effects of Device on Score Comparability

    ERIC Educational Resources Information Center

    Davis, Laurie; Morrison, Kristin; Kong, Xiaojing; McBride, Yuanyuan

    2017-01-01

    The use of tablets for large-scale testing programs has transitioned from concept to reality for many state testing programs. This study extended previous research on score comparability between tablets and computers with high school students to compare score distributions across devices for reading, math, and science and to evaluate device…

  20. Identifying Secondary-School Students' Difficulties When Reading Visual Representations Displayed in Physics Simulations

    ERIC Educational Resources Information Center

    López, Víctor; Pintó, Roser

    2017-01-01

    Computer simulations are often considered effective educational tools, since their visual and communicative power enable students to better understand physical systems and phenomena. However, previous studies have found that when students read visual representations some reading difficulties can arise, especially when these are complex or dynamic…

  1. Using Virtual Reality with and without Gaming Attributes for Academic Achievement

    ERIC Educational Resources Information Center

    Vogel, Jennifer J.; Greenwood-Ericksen, Adams; Cannon-Bowers, Jan; Bowers, Clint A.

    2006-01-01

    A subcategory of computer-assisted instruction (CAI), games have additional attributes such as motivation, reward, interactivity, score, and challenge. This study used a quasi-experimental design to determine if previous findings generalize to non-simulation-based game designs. Researchers observed significant improvement in the overall population…

  2. Retrieval of wheat growth parameters with radar vegetation indices

    USDA-ARS?s Scientific Manuscript database

    The Radar Vegetation Index (RVI) has a low sensitivity to changes in environmental conditions and has the potential as a tool to monitor the vegetation growth. In this study, we expand on previous research by investigating the radar response over a wheat canopy. RVI was computed using observations m...
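The record does not spell out the RVI definition used, but the index is commonly computed (after Kim and van Zyl) as RVI = 8σ⁰HV / (σ⁰HH + σ⁰VV + 2σ⁰HV) on linear-scale backscatter. A sketch assuming the inputs arrive in dB:

```python
def rvi(sigma_hh_db, sigma_vv_db, sigma_hv_db):
    """Radar Vegetation Index from quad-pol backscatter coefficients in dB."""
    # convert from dB to linear power units before forming the ratio
    hh = 10.0 ** (sigma_hh_db / 10.0)
    vv = 10.0 ** (sigma_vv_db / 10.0)
    hv = 10.0 ** (sigma_hv_db / 10.0)
    return 8.0 * hv / (hh + vv + 2.0 * hv)
```

Dense vegetation (strong cross-pol scattering) pushes the index toward 1, while bare or smooth surfaces push it toward 0, which is what makes it usable for monitoring canopy growth.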

  3. Reexamination of color vision standards, part II. a computational method to assess the effect of color deficiencies in using ATC displays : final report.

    DOT National Transportation Integrated Search

    2006-03-01

    The previous study showed that many colors were used in air traffic control displays. We also found that colors were used mainly for three purposes: capturing controllers' immediate attention, identifying targets, and segmenting information. This r...

  4. Drug Target Optimization in Chronic Myeloid Leukemia Using Innovative Computational Platform

    PubMed Central

    Chuang, Ryan; Hall, Benjamin A.; Benque, David; Cook, Byron; Ishtiaq, Samin; Piterman, Nir; Taylor, Alex; Vardi, Moshe; Koschmieder, Steffen; Gottgens, Berthold; Fisher, Jasmin

    2015-01-01

    Chronic Myeloid Leukemia (CML) represents a paradigm for the wider cancer field. Despite the fact that tyrosine kinase inhibitors have established targeted molecular therapy in CML, patients often face the risk of developing drug resistance, caused by mutations and/or activation of alternative cellular pathways. To optimize drug development, one needs to systematically test all possible combinations of drug targets within the genetic network that regulates the disease. The BioModelAnalyzer (BMA) is a user-friendly computational tool that allows us to do exactly that. We used BMA to build a CML network-model composed of 54 nodes linked by 104 interactions that encapsulates experimental data collected from 160 publications. While previous studies were limited by their focus on a single pathway or cellular process, our executable model allowed us to probe dynamic interactions between multiple pathways and cellular outcomes, suggest new combinatorial therapeutic targets, and highlight previously unexplored sensitivities to Interleukin-3. PMID:25644994

  5. Drug Target Optimization in Chronic Myeloid Leukemia Using Innovative Computational Platform

    NASA Astrophysics Data System (ADS)

    Chuang, Ryan; Hall, Benjamin A.; Benque, David; Cook, Byron; Ishtiaq, Samin; Piterman, Nir; Taylor, Alex; Vardi, Moshe; Koschmieder, Steffen; Gottgens, Berthold; Fisher, Jasmin

    2015-02-01

    Chronic Myeloid Leukemia (CML) represents a paradigm for the wider cancer field. Despite the fact that tyrosine kinase inhibitors have established targeted molecular therapy in CML, patients often face the risk of developing drug resistance, caused by mutations and/or activation of alternative cellular pathways. To optimize drug development, one needs to systematically test all possible combinations of drug targets within the genetic network that regulates the disease. The BioModelAnalyzer (BMA) is a user-friendly computational tool that allows us to do exactly that. We used BMA to build a CML network-model composed of 54 nodes linked by 104 interactions that encapsulates experimental data collected from 160 publications. While previous studies were limited by their focus on a single pathway or cellular process, our executable model allowed us to probe dynamic interactions between multiple pathways and cellular outcomes, suggest new combinatorial therapeutic targets, and highlight previously unexplored sensitivities to Interleukin-3.

  6. A Computer Interview for Multivariate Monitoring of Psychiatric Outcome.

    ERIC Educational Resources Information Center

    Stevenson, John F.; And Others

    Application of computer technology to psychiatric outcome measurement offers the promise of coping with increasing demands for extensive patient interviews repeated longitudinally. Described is the development of a cost-effective multi-dimensional tracking device to monitor psychiatric functioning, building on a previous local computer interview…

  7. Computational-Model-Based Analysis of Context Effects on Harmonic Expectancy.

    PubMed

    Morimoto, Satoshi; Remijn, Gerard B; Nakajima, Yoshitaka

    2016-01-01

    Expectancy for an upcoming musical chord, harmonic expectancy, is supposedly based on automatic activation of tonal knowledge. Since previous studies implicitly relied on interpretations based on Western music theory, the underlying computational processes involved in harmonic expectancy and how it relates to tonality need further clarification. In particular, short chord sequences which cannot lead to unique keys are difficult to interpret in music theory. In this study, we examined effects of preceding chords on harmonic expectancy from a computational perspective, using stochastic modeling. We conducted a behavioral experiment, in which participants listened to short chord sequences and evaluated the subjective relatedness of the last chord to the preceding ones. Based on these judgments, we built stochastic models of the computational process underlying harmonic expectancy. Following this, we compared the explanatory power of the models. Our results imply that, even when listening to short chord sequences, internally constructed and updated tonal assumptions determine the expectancy of the upcoming chord.

  8. Computational-Model-Based Analysis of Context Effects on Harmonic Expectancy

    PubMed Central

    Morimoto, Satoshi; Remijn, Gerard B.; Nakajima, Yoshitaka

    2016-01-01

    Expectancy for an upcoming musical chord, harmonic expectancy, is supposedly based on automatic activation of tonal knowledge. Since previous studies implicitly relied on interpretations based on Western music theory, the underlying computational processes involved in harmonic expectancy and how it relates to tonality need further clarification. In particular, short chord sequences which cannot lead to unique keys are difficult to interpret in music theory. In this study, we examined effects of preceding chords on harmonic expectancy from a computational perspective, using stochastic modeling. We conducted a behavioral experiment, in which participants listened to short chord sequences and evaluated the subjective relatedness of the last chord to the preceding ones. Based on these judgments, we built stochastic models of the computational process underlying harmonic expectancy. Following this, we compared the explanatory power of the models. Our results imply that, even when listening to short chord sequences, internally constructed and updated tonal assumptions determine the expectancy of the upcoming chord. PMID:27003807

  9. Production version of the extended NASA-Langley Vortex Lattice FORTRAN computer program. Volume 1: User's guide

    NASA Technical Reports Server (NTRS)

    Lamar, J. E.; Herbert, H. E.

    1982-01-01

    The latest production version, MARK IV, of the NASA-Langley vortex lattice computer program is summarized. All viable subcritical aerodynamic features of previous versions were retained. This version extends the previously documented program capabilities to four planforms, 400 panels, and enables the user to obtain vortex-flow aerodynamics on cambered planforms, flowfield properties off the configuration in attached flow, and planform longitudinal load distributions.

  10. More IMPATIENT: A Gridding-Accelerated Toeplitz-based Strategy for Non-Cartesian High-Resolution 3D MRI on GPUs

    PubMed Central

    Gai, Jiading; Obeid, Nady; Holtrop, Joseph L.; Wu, Xiao-Long; Lam, Fan; Fu, Maojing; Haldar, Justin P.; Hwu, Wen-mei W.; Liang, Zhi-Pei; Sutton, Bradley P.

    2013-01-01

    Several recent methods have been proposed to obtain significant speed-ups in MRI image reconstruction by leveraging the computational power of GPUs. Previously, we implemented a GPU-based image reconstruction technique called the Illinois Massively Parallel Acquisition Toolkit for Image reconstruction with ENhanced Throughput in MRI (IMPATIENT MRI) for reconstructing data collected along arbitrary 3D trajectories. In this paper, we improve IMPATIENT by removing computational bottlenecks, using a gridding approach to accelerate the computation of various data structures needed by the previous routine. Further, we enhance the routine with capabilities for off-resonance correction and multi-sensor parallel imaging reconstruction. Through implementation of optimized gridding into our iterative reconstruction scheme, speed-ups of more than a factor of 200 are achieved in the improved GPU implementation compared to the previous accelerated GPU code. PMID:23682203

  11. Computer-Assisted Second Language Vocabulary Learning in a Paired-Associate Paradigm: A Critical Investigation of Flashcard Software

    ERIC Educational Resources Information Center

    Nakata, Tatsuya

    2011-01-01

    The present study aims to conduct a comprehensive investigation of flashcard software for learning vocabulary in a second language. Nine flashcard programs were analysed using 17 criteria derived from previous studies on flashcard learning as well as paired-associate learning. Results suggest that in general, most programs have been developed in a…

  12. Implications of Information Technology for Employment, Skills, and Wages: Findings from Sectoral and Case Study Research. Final Report

    ERIC Educational Resources Information Center

    Handel, Michael J.

    2004-01-01

    This paper reviews evidence from industry-specific and case studies that shed light on the extent to which computers and automation eliminate jobs, raise job skill requirements, and, consequently, contribute to increased wage inequality between less- and more skilled workers. This paper complements a previous review of large-scale econometric…

  13. Using Social Network Graphs as Visualization Tools to Influence Peer Selection Decision-Making Strategies to Access Information about Complex Socioscientific Issues

    ERIC Educational Resources Information Center

    Yoon, Susan A.

    2011-01-01

    This study extends previous research that explores how visualization affordances that computational tools provide and social network analyses that account for individual- and group-level dynamic processes can work in conjunction to improve learning outcomes. The study's main hypothesis is that when social network graphs are used in instruction,…

  14. The Interaction of Child-Parent Shared Reading with an Augmented Reality (AR) Picture Book and Parents' Conceptions of AR Learning

    ERIC Educational Resources Information Center

    Cheng, Kun-Hung; Tsai, Chin-Chung

    2016-01-01

    Following a previous study (Cheng & Tsai, 2014. "Computers & Education"), this study aimed to probe the interaction of child-parent shared reading with the augmented reality (AR) picture book in more depth. A series of sequential analyses were thus conducted to infer the behavioral transition diagrams and visualize the continuity…

  15. Gender Rationales in Selecting a Major in Information Technology at the Undergraduate Level of a University Program: A Focus Group Approach

    ERIC Educational Resources Information Center

    Mishra, Sushma; Draus, Peter; Caputo, Donald; Leone, Gregory; Kohun, Frederick; Repack, Diana

    2014-01-01

    Previous research studies of women applying to, enrolling and completing computing degrees at the undergraduate collegiate level suggest a significant underrepresentation of females in the Information Technology domain in the past decade. This study employs a focus group approach to the gender gap that encompasses forays into the qualitative…

  16. Using Words Instead of Jumbled Characters as Stimuli in Keyboard Training Facilitates Fluent Performance

    ERIC Educational Resources Information Center

    DeFulio, Anthony; Crone-Todd, Darlene E.; Long, Lauren V.; Nuzzo, Paul A.; Silverman, Kenneth

    2011-01-01

    Keyboarding skill is an important target for adult education programs due to the ubiquity of computers in modern work environments. A previous study showed that novice typists learned key locations quickly but that fluency took a relatively long time to develop. In the present study, novice typists achieved fluent performance in nearly half the…

  17. Using Incremental Rehearsal to Increase Fluency of Single-Digit Multiplication Facts with Children Identified as Learning Disabled in Mathematics Computation

    ERIC Educational Resources Information Center

    Burns, Matthew K.

    2005-01-01

    Previous research suggested that Incremental Rehearsal (IR; Tucker, 1989) led to better retention than other drill practices models. However, little research exists in the literature regarding drill models for mathematics and no studies were found that used IR to practice multiplication facts. Therefore, the current study used IR as an…

  18. Perceptions Displayed by Novice Programmers When Exploring the Relationship Between Modularization Ability and Performance in the C++ Programming Language

    ERIC Educational Resources Information Center

    Vodounon, Maurice A.

    2004-01-01

    The primary purpose of this study was to analyze different perceptions displayed by novice programmers in the C++ programming language, and determine if modularization ability could be improved by an instructional treatment that concentrated on solving computer programs from previously existing modules. This study attempted to answer the following…

  19. Collaboration, Reflection and Selective Neglect: Campus-Based Marketing Students' Experiences of Using a Virtual Learning Environment

    ERIC Educational Resources Information Center

    Molesworth, Mike

    2004-01-01

    Previous studies have suggested significant benefits to using computer-mediated communication in higher education and the development of the relevant skills may also be important for preparing students for their working careers. This study is a review of the introduction of a virtual learning environment to support a group of 60 campus-based,…

  20. A Systematic Replication and Extension of Using Incremental Rehearsal to Improve Multiplication Skills: An Investigation of Generalization

    ERIC Educational Resources Information Center

    Codding, Robin S.; Archer, Jillian; Connell, James

    2010-01-01

    The purpose of this study was to replicate and extend a previous study by Burns ("Education and Treatment of Children" 28: 237-249, 2005) examining the effectiveness of incremental rehearsal on computation performance. A multiple-probe design across multiplication problem sets was employed for one participant to examine digits correct per minute…

  1. Enhanced diagnostic of skin conditions by polarized laser speckles: phantom studies and computer modeling

    NASA Astrophysics Data System (ADS)

    Tchvialeva, Lioudmila; Lee, Tim K.; Markhvida, Igor; Zeng, Haishan; Doronin, Alexander; Meglinski, Igor

    2014-03-01

    The incidence of skin melanoma, the most commonly fatal form of skin cancer, is increasing faster than that of any other potentially preventable cancer. Clinical practice is currently hampered by the lack of tools to rapidly screen the functional and morphological properties of tissues. In our previous study we showed that quantifying the polarization of scattered laser light provides a useful metric for diagnosing malignant melanoma. In this study we explore whether image speckle could improve skin cancer diagnosis compared with the previously used free-space speckle. The study includes skin phantom measurements and computer modeling. To characterize the depolarization of light, we measure the spatial distribution of speckle patterns and analyse their depolarization ratio, taking radial symmetry into account. We examine the dependence of the depolarization ratio on roughness for phantoms whose optical properties are of the order of those of skin lesions. We demonstrate that variation in bulk optical properties induces assessable changes in the depolarization ratio. We show that image speckle differentiates phantoms significantly better than free-space speckle. The results of the experimental measurements are compared with the results of Monte Carlo simulation.
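
    The depolarization ratio that this record analyses is conventionally computed pixel-wise from co- and cross-polarized intensity images. A minimal sketch of that standard formula, DR = (I_par - I_perp)/(I_par + I_perp); the array names and synthetic data below are illustrative assumptions, not the authors' pipeline:

```python
import numpy as np

def depolarization_ratio(i_par, i_perp, eps=1e-12):
    """Pixel-wise degree-of-polarization map from co-polarized (i_par)
    and cross-polarized (i_perp) speckle intensity images."""
    i_par = np.asarray(i_par, dtype=float)
    i_perp = np.asarray(i_perp, dtype=float)
    return (i_par - i_perp) / (i_par + i_perp + eps)

# Synthetic sanity check: fully polarized light (no cross-polarized
# signal) gives DR near 1; fully depolarized light (equal channels)
# gives DR of 0.
polarized = depolarization_ratio(np.full((4, 4), 2.0), np.zeros((4, 4)))
depolarized = depolarization_ratio(np.full((4, 4), 1.0), np.full((4, 4), 1.0))
print(polarized.mean(), depolarized.mean())
```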

  2. Atmospheric energetics as related to cyclogenesis over the eastern United States. Ph.D. Thesis

    NASA Technical Reports Server (NTRS)

    West, P. W.

    1973-01-01

    A method is presented to investigate the atmospheric energy budget as related to cyclogenesis. Energy budget equations are developed that are shown to be advantageous because the individual terms represent basic physical processes which produce changes in atmospheric energy, and the equations provide a means to study the interaction of the cyclone with the larger scales of motion. The work presented represents an extension of previous studies because all of the terms of the energy budget equations were evaluated throughout the development period of the cyclone. Computations are carried out over a limited atmospheric volume which encompasses the cyclone, and boundary fluxes of energy that were ignored in most previous studies are evaluated. Two examples of cyclogenesis over the eastern United States were chosen for study. One of the cases (1-4 November, 1966) represented an example of vigorous development, while the development in the other case (5-8 December, 1969) was more modest. Objectively analyzed data were used in the evaluation of the energy budget terms in order to minimize computational errors, and an objective analysis scheme is described that ensures that all of the resolution contained in the rawinsonde observations is incorporated in the analyses.

  3. In Silico Knockout Screening of Plasmodium falciparum Reactions and Prediction of Novel Essential Reactions by Analysing the Metabolic Network

    PubMed Central

    Isewon, Itunuoluwa; Aromolaran, Olufemi; Oladipupo, Olufunke

    2018-01-01

    Malaria is an infectious disease that affects close to half a million individuals every year, and Plasmodium falciparum is a major cause of malaria. The disease could be treated effectively if the essential enzymes of this parasite were specifically targeted. Nevertheless, the parasite's growing resistance to existing drugs now makes discovering new drugs a core priority. In this study, a novel computational model has been developed that makes the prediction of new and validated antimalarial drug targets cheaper, easier, and faster. We have identified new essential reactions as potential drug targets in the metabolic network of the parasite. Among the top seven (7) predicted essential reactions, four (4) have been identified in earlier studies with biological evidence and one (1) with computational evidence. The results from our study were compared with an extensive list of seventy-seven (77) essential reactions with biological evidence from a previous study. We present a list of thirty-one (31) potential drug targets in Plasmodium falciparum, which includes twenty-four (24) new potential candidates. PMID:29789805

  4. On finite element methods for the Helmholtz equation

    NASA Technical Reports Server (NTRS)

    Aziz, A. K.; Werschulz, A. G.

    1979-01-01

    The numerical solution of the Helmholtz equation is considered via finite element methods. A two-stage method which gives the same accuracy in the computed gradient as in the computed solution is discussed. Error estimates for the method using a newly developed proof are given, and the computational considerations which show this method to be computationally superior to previous methods are presented.

  5. The Effects of Gender on the Attitudes towards the Computer Assisted Instruction: A Meta-Analysis

    ERIC Educational Resources Information Center

    Cam, Sefika Sumeyye; Yarar, Gokhan; Toraman, Cetin; Erdamar, Gurcu Koc

    2016-01-01

    The idea that the gender factor creates a difference in computer usage and computer-assisted instruction dates back many years. At that time, it was thought that areas like engineering, science, and mathematics were for males, which created a difference in computer usage. Nevertheless, developing technology and females becoming more…

  6. Proton affinity and enthalpy of formation of formaldehyde

    NASA Astrophysics Data System (ADS)

    Czakó, Gábor; Nagy, Balázs; Tasi, Gyula; Somogyi, Árpád; Šimunek, Ján; Noga, Jozef; Braams, Bastiaan J.; Bowman, Joel M.; Császár, Attila G.

    The proton affinity and the enthalpy of formation of the prototypical carbonyl, formaldehyde, have been determined by the first-principles composite focal-point analysis (FPA) approach. The electronic structure computations employed the all-electron coupled-cluster method with up to single, double, triple, quadruple, and even pentuple excitations. In these computations the aug-cc-p(C)VXZ [X = 2(D), 3(T), 4(Q), 5, and 6] correlation-consistent Gaussian basis sets for C and O were used in conjunction with the corresponding aug-cc-pVXZ (X = 2-6) sets for H. The basis set limit values have been confirmed via explicitly correlated computations. Our FPA study supersedes previous computational work for the proton affinity and, to some extent, the enthalpy of formation of formaldehyde by accounting for (a) electron correlation beyond the "gold standard" CCSD(T) level; (b) the non-additivity of core electron correlation effects; (c) scalar relativity; (d) diagonal Born-Oppenheimer corrections computed at a correlated level; (e) anharmonicity of zero-point vibrational energies, based on global potential energy surfaces and variational vibrational computations; and (f) thermal corrections to enthalpies by direct summation over rovibrational energy levels. Our final proton affinities at 298.15 (0.0) K are ΔpaH°(H2CO) = 711.02 (704.98) ± 0.39 kJ mol⁻¹. Our final enthalpies of formation at 298.15 (0.0) K are ΔfH°(H2CO) = −109.23 (−105.42) ± 0.33 kJ mol⁻¹. The latter values are based on the enthalpy of the H2 + CO → H2CO reaction but supported by two further reaction schemes, H2O + C → H2CO and 2H + C + O → H2CO. These values, especially ΔpaH°(H2CO), have better accuracy and considerably lower uncertainty than the best previous recommendations and thus should be employed in future studies.
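
    The basis-set-limit values mentioned in this record are typically obtained by extrapolating correlation energies with the X⁻³ formula E_X = E_CBS + A·X⁻³, where X is the cardinal number of the aug-cc-pVXZ set. A minimal two-point extrapolation sketch; the numerical energies below are fabricated for illustration and are not from this study:

```python
def cbs_extrapolate(e_x, e_y, x, y):
    """Two-point complete-basis-set (CBS) extrapolation of correlation
    energies, assuming the Helgaker-style model E_X = E_CBS + A / X**3."""
    return (x**3 * e_x - y**3 * e_y) / (x**3 - y**3)

# Illustrative (made-up) correlation energies in hartree for X = 4
# (quadruple-zeta) and X = 5; the smaller basis recovers less
# correlation, so the extrapolated limit lies below both.
e_q, e_5 = -0.3550, -0.3610
e_cbs = cbs_extrapolate(e_5, e_q, 5, 4)
print(round(e_cbs, 4))
```

Note that the extrapolation is exact for data generated by the model itself: feeding in E + A/125 and E + A/64 returns E.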

  7. Hypergraph partitioning implementation for parallelizing matrix-vector multiplication using CUDA GPU-based parallel computing

    NASA Astrophysics Data System (ADS)

    Murni; Bustamam, A.; Ernastuti; Handhika, T.; Kerami, D.

    2017-07-01

    Calculation of matrix-vector multiplication in real-world problems often involves large matrices of arbitrary size, so parallelization is needed to speed up a computation that usually takes a long time. The graph partitioning techniques discussed in previous studies cannot be used to parallelize matrix-vector multiplication for matrices of arbitrary size, because graph partitioning assumes a square, symmetric matrix. Hypergraph partitioning techniques overcome this shortcoming of graph partitioning. This paper addresses the efficient parallelization of matrix-vector multiplication through hypergraph partitioning techniques using CUDA GPU-based parallel computing. CUDA (Compute Unified Device Architecture) is a parallel computing platform and programming model created by NVIDIA and implemented on the GPU (graphics processing unit).
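
    The parallel structure this record exploits can be seen in a row-partitioned sparse matrix-vector product: each row block is independent, so each can be assigned to a GPU thread block. The sketch below is a serial Python emulation for illustration only; a real hypergraph partitioner (e.g. PaToH or hMETIS) would choose the blocks to minimise communication volume, and the actual kernel would be CUDA, not Python:

```python
def csr_spmv_partitioned(indptr, indices, data, x, n_parts):
    """Sparse matrix-vector product y = A @ x with A in CSR form,
    computed one row block ("partition") at a time. The blocks are
    independent, which is where the parallel work lives; here rows
    are simply split evenly rather than by a hypergraph partitioner."""
    n_rows = len(indptr) - 1
    y = [0.0] * n_rows
    block = (n_rows + n_parts - 1) // n_parts
    for p in range(n_parts):                  # each p could be a GPU block
        for i in range(p * block, min((p + 1) * block, n_rows)):
            s = 0.0
            for k in range(indptr[i], indptr[i + 1]):
                s += data[k] * x[indices[k]]
            y[i] = s
    return y

# 3x3 example: A = [[2, 0, 1], [0, 3, 0], [4, 0, 5]], x = [1, 1, 1]
indptr, indices = [0, 2, 3, 5], [0, 2, 1, 0, 2]
data = [2.0, 1.0, 3.0, 4.0, 5.0]
print(csr_spmv_partitioned(indptr, indices, data, [1.0, 1.0, 1.0], 2))
```

The result is independent of the number of partitions, which is the property that makes the block assignment a pure performance decision.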

  8. Molecular geometry of vanadium dichloride and vanadium trichloride: a gas-phase electron diffraction and computational study.

    PubMed

    Varga, Zoltán; Vest, Brian; Schwerdtfeger, Peter; Hargittai, Magdolna

    2010-03-15

    The molecular geometries of VCl2 and VCl3 have been determined by computations and gas-phase electron diffraction (ED). The ED study is a reinvestigation of the previously published analysis for VCl2. The structure of the vanadium dichloride dimer has also been calculated. According to our joint ED and computational study, the evaporation of a solid sample of VCl2 resulted in about 66% vanadium trichloride and 34% vanadium dichloride in the vapor. Vanadium dichloride is unambiguously linear in its ⁴Σg⁺ ground electronic state. For VCl3, all computations yielded a Jahn-Teller-distorted ground-state structure of C2v symmetry. However, it lies less than 3 kJ/mol below the ³E″ state (D3h symmetry). Due to the dynamic nature of the Jahn-Teller effect in this case, a rigorous distinction cannot be made between planar models of D3h or C2v symmetry for the equilibrium structure of VCl3. Furthermore, the presence of several low-lying excited electronic states of VCl3 is expected in the high-temperature vapor. To our knowledge, this is the first experimental and computational study of the VCl3 molecule.

  9. Temporal and spatial variability of groundwater recharge on Jeju Island, Korea

    USGS Publications Warehouse

    Mair, Alan; Hagedorn, Benjamin; Tillery, Suzanne; El-Kadi, Aly I.; Westenbroek, Stephen M.; Ha, Kyoochul; Koh, Gi-Won

    2013-01-01

    Estimates of the spatial and temporal variability of groundwater recharge are essential inputs to groundwater flow models that are used to test groundwater availability under different management and climate conditions. In this study, a soil water balance analysis was conducted to estimate groundwater recharge on the island of Jeju, Korea, for baseline, drought, and climate-land use change scenarios. The Soil Water Balance (SWB) computer code was used to compute groundwater recharge and other water balance components at a daily time step using a 100 m grid cell size for an 18-year baseline scenario (1992–2009). A 10-year drought scenario was selected from historical precipitation trends (1961–2009), while the climate-land use change scenario was developed using late 21st century climate projections and a change in urban land use. Mean annual recharge under the baseline, drought, and climate-land use scenarios was estimated at 884, 591, and 788 mm, respectively. Under the baseline scenario, mean annual recharge was within the range of previous estimates (825–959 mm) and only slightly lower than the mean of 902 mm. As a fraction of mean annual rainfall, mean annual recharge was computed as only 42%, less than previous estimates of 44–48%. The maximum historical reported annual pumping rate of 241 × 10⁶ m³ equates to 15% of baseline recharge, which is within the range of 14–16% computed from earlier studies. The model does not include a mechanism to account for additional sources of groundwater recharge, such as fog drip, irrigation, and artificial recharge, and may also overestimate evapotranspiration losses. Consequently, the results presented in this study represent a conservative estimate of total recharge.
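
    Soil-water-balance codes of the kind this record describes track a daily soil-moisture "bucket" per grid cell: water accumulating above field capacity drains to recharge. A heavily simplified single-cell sketch; the parameter values and the omission of runoff, snow, and land-cover terms are simplifying assumptions, not SWB's actual method:

```python
def daily_recharge(precip, et, capacity, storage0=0.0):
    """Toy daily soil-water balance for one grid cell (all values in mm):
    storage fills with (precipitation - evapotranspiration); any water
    above field capacity drains to recharge.
    Returns (total_recharge, final_storage)."""
    storage, recharge = storage0, 0.0
    for p, e in zip(precip, et):
        storage = max(storage + p - e, 0.0)   # storage cannot go negative
        if storage > capacity:                # surplus percolates downward
            recharge += storage - capacity
            storage = capacity
    return recharge, storage

# Five wet days against a 50 mm field capacity, starting from a full bucket.
r, s = daily_recharge([30, 0, 40, 10, 0], [5, 5, 5, 5, 5], 50.0, storage0=50.0)
print(r, s)
```

A quick mass-balance check confirms the bookkeeping: inflow 80 mm minus ET 25 mm plus the 5 mm drawn from storage equals the 60 mm of recharge.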

  10. Computer-automated tinnitus assessment: noise-band matching, maskability, and residual inhibition.

    PubMed

    Henry, James A; Roberts, Larry E; Ellingson, Roger M; Thielman, Emily J

    2013-06-01

    Psychoacoustic measures of tinnitus typically include loudness and pitch match, minimum masking level (MML), and residual inhibition (RI). We previously developed and documented a computer-automated tinnitus evaluation system (TES) capable of subject-guided loudness and pitch matching. The TES was further developed to conduct computer-aided, subject-guided testing for noise-band matching (NBM), MML, and RI. The purpose of the present study was to document the capability of the upgraded TES to obtain measures of NBM, MML, and RI, and to determine the test-retest reliability of the responses obtained. Three subject-guided, computer-automated testing protocols were developed to conduct NBM. For MML and RI testing, a 2-12 kHz band of noise was used. All testing was repeated during a second session. Subjects meeting study criteria were selected from those who had previously been tested for loudness and pitch matching in our laboratory. A total of 21 subjects completed testing, including seven females and 14 males. The upgraded TES was found to be fairly time efficient. Subjects were generally reliable, both within and between sessions, with respect to the type of stimulus they chose as the best match to their tinnitus. Matching to bandwidth was more variable between measurements, with greater consistency seen for subjects reporting tonal tinnitus or wide-band noisy tinnitus than intermediate types. Between-session repeated MMLs were within 10 dB of each other for all but three of the subjects. Subjects who experienced RI during Session 1 tended to be those who experienced it during Session 2. This study may represent the first time that NBM, MML, and RI audiometric testing results have been obtained entirely through a self-contained, computer-automated system designed specifically for use in the clinic. Future plans include refinements to achieve greater testing efficiency. American Academy of Audiology.

  11. From Three-Photon Greenberger-Horne-Zeilinger States to Ballistic Universal Quantum Computation.

    PubMed

    Gimeno-Segovia, Mercedes; Shadbolt, Pete; Browne, Dan E; Rudolph, Terry

    2015-07-10

    Single photons, manipulated using integrated linear optics, constitute a promising platform for universal quantum computation. A series of increasingly efficient proposals have shown linear-optical quantum computing to be formally scalable. However, existing schemes typically require extensive adaptive switching, which is experimentally challenging and noisy, thousands of photon sources per renormalized qubit, and/or large quantum memories for repeat-until-success strategies. Our work overcomes all these problems. We present a scheme to construct a cluster state universal for quantum computation, which uses no adaptive switching, no large memories, and which is at least an order of magnitude more resource efficient than previous passive schemes. Unlike previous proposals, it is constructed entirely from loss-detecting gates and offers robustness to photon loss. Even without the use of an active loss-tolerant encoding, our scheme naturally tolerates a total loss rate of ∼1.6% in the photons detected in the gates. This scheme uses only three-photon Greenberger-Horne-Zeilinger states as a resource, together with a passive linear-optical network. We fully describe and model the iterative process of cluster generation, including photon loss and gate failure. This demonstrates that building a linear-optical quantum computer need be less challenging than previously thought.

  12. Acausal measurement-based quantum computing

    NASA Astrophysics Data System (ADS)

    Morimae, Tomoyuki

    2014-07-01

    In measurement-based quantum computing, there is a natural "causal cone" among qubits of the resource state, since the measurement angle on a qubit has to depend on previous measurement results in order to correct the effect of by-product operators. If we respect the no-signaling principle, by-product operators cannot be avoided. Here we study the possibility of acausal measurement-based quantum computing by using the process matrix framework [Oreshkov, Costa, and Brukner, Nat. Commun. 3, 1092 (2012), 10.1038/ncomms2076]. We construct a resource process matrix for acausal measurement-based quantum computing restricting local operations to projective measurements. The resource process matrix is an analog of the resource state of the standard causal measurement-based quantum computing. We find that if we restrict local operations to projective measurements the resource process matrix is (up to a normalization factor and trivial ancilla qubits) equivalent to the decorated graph state created from the graph state of the corresponding causal measurement-based quantum computing. We also show that it is possible to consider a causal game whose causal inequality is violated by acausal measurement-based quantum computing.

  13. Advances in computational design and analysis of airbreathing propulsion systems

    NASA Technical Reports Server (NTRS)

    Klineberg, John M.

    1989-01-01

    The development of commercial and military aircraft depends, to a large extent, on engine manufacturers being able to achieve significant increases in propulsion capability through improved component aerodynamics, materials, and structures. The recent history of propulsion has been marked by efforts to develop computational techniques that can speed up the propulsion design process and produce superior designs. The availability of powerful supercomputers, such as the NASA Numerical Aerodynamic Simulator, and the potential for even higher performance offered by parallel computer architectures, have opened the door to the use of multi-dimensional simulations to study complex physical phenomena in propulsion systems that have previously defied analysis or experimental observation. An overview is provided of several NASA Lewis research efforts that are contributing toward the long-range goal of a numerical test-cell for the integrated, multidisciplinary design, analysis, and optimization of propulsion systems. Specific examples in Internal Computational Fluid Mechanics, Computational Structural Mechanics, Computational Materials Science, and High Performance Computing are cited and described in terms of current capabilities, technical challenges, and future research directions.

  14. University Students' Subjective Knowledge of Green Computing and Pro-Environmental Behavior

    ERIC Educational Resources Information Center

    Ahmad, Tunku Badariah Tunku; Nordin, Mohamad Sahari

    2014-01-01

    This cross-sectional survey examined the structure of university students' subjective knowledge of green computing--hypothesized to be a multidimensional construct with three important dimensions--and its association with pro-environmental behavior (PEB). Using a previously validated green computing questionnaire, data were collected from 842…

  15. Connecting Kids and Computers

    ERIC Educational Resources Information Center

    Giles, Rebecca McMahon

    2006-01-01

    Exposure to cell phones, DVD players, video games, computers, digital cameras, and iPods has made today's young people more technologically advanced than those of any previous generation. As a result, parents are now concerned that their children are spending too much time in front of the computer. In this article, the author focuses her…

  16. Computer Use within a Play-Based Early Years Curriculum

    ERIC Educational Resources Information Center

    Howard, Justine; Miles, Gareth E.; Rees-Davies, Laura

    2012-01-01

    Early years curricula promote learning through play and in addition emphasise the development of computer literacy. Previous research, however, has described that teachers feel unprepared to integrate Information and Communication Technology (ICT) and play. Also, whereas research has suggested that effective computer use in the early years is…

  17. Role of strategies and prior exposure in mental rotation.

    PubMed

    Cherney, Isabelle D; Neff, Nicole L

    2004-06-01

    The purpose of these two studies was to examine sex differences in strategy use and the effect of prior exposure on the performance on Vandenberg and Kuse's 1978 Mental Rotation Test. A total of 152 participants completed the spatial task and self-reported their strategy use. Consistent with previous studies, men outperformed women. Strategy usage did not account for these differences, although guessing did. Previous exposure to the Mental Rotation Test, American College Test scores and frequent computer or video game play predicted performance on the test. These results suggest that prior exposure to spatial tasks may provide cues to improve participants' performance.

  18. Efficient 3D geometric and Zernike moments computation from unstructured surface meshes.

    PubMed

    Pozo, José María; Villa-Uriol, Maria-Cruz; Frangi, Alejandro F

    2011-03-01

    This paper introduces and evaluates a fast exact algorithm and a series of faster approximate algorithms for the computation of 3D geometric moments from an unstructured surface mesh of triangles. Being based on the object surface reduces the computational complexity of these algorithms with respect to volumetric grid-based algorithms. In contrast, it can only be applied to the computation of geometric moments of homogeneous objects. This advantage and restriction is shared with other proposed algorithms based on the object boundary. The proposed exact algorithm reduces the computational complexity for computing geometric moments up to order N with respect to previously proposed exact algorithms, from N^9 to N^6. The approximate series algorithm appears as a power series in the ratio between triangle size and object size, which can be truncated at any desired degree. The higher the number and quality of the triangles, the better the approximation. This approximate algorithm reduces the computational complexity to N^3. In addition, the paper introduces a fast algorithm for the computation of 3D Zernike moments from the computed geometric moments, with a computational complexity of N^4, whereas the previously proposed algorithm is of order N^6. The error introduced by the proposed approximate algorithms is evaluated on different shapes, and the cost-benefit ratio in terms of error and computational time is analyzed for different moment orders.
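    The divergence-theorem idea behind surface-based moment computation can be illustrated with the lowest-order geometric moment, the enclosed volume. The sketch below is not the paper's N^6 algorithm, only the order-zero special case for a homogeneous object; the function names and the test mesh are ours.

```python
# Zeroth-order geometric moment (volume) of a closed, outward-oriented
# triangle mesh via the divergence theorem: each triangle (v0, v1, v2)
# contributes the signed volume of the tetrahedron it spans with the origin.

def cross(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def dot(a, b):
    return a[0] * b[0] + a[1] * b[1] + a[2] * b[2]

def mesh_volume(vertices, triangles):
    """Volume enclosed by an outward-oriented triangle mesh."""
    vol = 0.0
    for i, j, k in triangles:
        v0, v1, v2 = vertices[i], vertices[j], vertices[k]
        vol += dot(v0, cross(v1, v2))  # 6 x signed tetrahedron volume
    return vol / 6.0

# Unit right tetrahedron: analytic volume is 1/6.
verts = [(0, 0, 0), (1, 0, 0), (0, 1, 0), (0, 0, 1)]
tris = [(0, 2, 1), (0, 1, 3), (0, 3, 2), (1, 2, 3)]  # outward orientation
print(mesh_volume(verts, tris))  # → 0.16666... (= 1/6)
```

    Higher-order moments follow the same pattern, integrating polynomials of the coordinates over each boundary triangle instead of a constant.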

  19. Irrigant flow within a prepared root canal using various flow rates: a Computational Fluid Dynamics study.

    PubMed

    Boutsioukis, C; Lambrianidis, T; Kastrinakis, E

    2009-02-01

    To study, using computer simulation, the effect of irrigant flow rate on the flow pattern within a prepared root canal during final irrigation with a syringe and needle. Geometrical characteristics of a side-vented endodontic needle and clinically realistic flow rate values were obtained from previous and preliminary studies. A Computational Fluid Dynamics (CFD) model was created using FLUENT 6.2 software. Calculations were carried out for five selected flow rates (0.02-0.79 mL/s), and velocity and turbulence quantities along the domain were evaluated. Irrigant replacement was limited to 1-1.5 mm apical to the needle tip for all flow rates tested. Low-Reynolds-number turbulent flow was detected near the needle outlet. Irrigant flow rate significantly affected the flow pattern within the root canal. Irrigation needles should be placed within 1 mm of working length to ensure fluid exchange. Turbulent flow of irrigant leads to more efficient irrigant replacement. CFD represents a powerful tool for the study of irrigation.
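    A back-of-envelope Reynolds number shows why the upper end of the quoted flow-rate range approaches the turbulent regime reported near the needle outlet. The needle diameter below is an illustrative assumption (the paper took its geometry from preliminary measurements), as are the water properties.

```python
# Rough Reynolds numbers for the abstract's flow-rate range (0.02-0.79 mL/s).
# The internal diameter D is hypothetical, not a value from the paper.
import math

RHO = 1000.0   # water density, kg/m^3
MU = 1.0e-3    # dynamic viscosity of water, Pa*s
D = 3.0e-4     # assumed internal needle diameter, m (illustrative)

def reynolds(flow_rate_ml_per_s):
    q = flow_rate_ml_per_s * 1e-6        # mL/s -> m^3/s
    v = q / (math.pi * D ** 2 / 4.0)     # mean axial velocity, m/s
    return RHO * v * D / MU

for q in (0.02, 0.79):
    print(f"Q = {q} mL/s -> Re ~ {reynolds(q):.0f}")
```

    Under these assumptions the low and high flow rates give Re of roughly 85 and 3400, spanning laminar to transitional/turbulent flow.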

  20. The Effects of Study Tasks in a Computer-Based Chemistry Learning Environment

    NASA Astrophysics Data System (ADS)

    Urhahne, Detlef; Nick, Sabine; Poepping, Anna Christin; Schulz, Sarah Jayne

    2013-12-01

    The present study examines the effects of different study tasks on the acquisition of knowledge about acids and bases in a computer-based learning environment. Three different task formats were selected to create three treatment conditions: learning with gap-fill and matching tasks, learning with multiple-choice tasks, and learning only from text and figures without any additional tasks. Participants were 196 ninth-grade students who learned with a self-developed multimedia program in a pretest-posttest control group design. Research results reveal that gap-fill and matching tasks were most effective in promoting knowledge acquisition, followed by multiple-choice tasks, and no tasks at all. The findings are in line with previous research on this topic. The effects can possibly be explained by the generation-recognition model, which predicts that gap-fill and matching tasks trigger more encompassing learning processes than multiple-choice tasks. It is concluded that instructional designers should incorporate more challenging study tasks for enhancing the effectiveness of computer-based learning environments.

  1. Natural History of Ground-Glass Lesions Among Patients With Previous Lung Cancer.

    PubMed

    Shewale, Jitesh B; Nelson, David B; Rice, David C; Sepesi, Boris; Hofstetter, Wayne L; Mehran, Reza J; Vaporciyan, Ara A; Walsh, Garrett L; Swisher, Stephen G; Roth, Jack A; Antonoff, Mara B

    2018-06-01

    Among patients with previous lung cancer, the malignant potential of subsequent ground-glass opacities (GGOs) on computed tomography remains unknown, with a lack of consensus regarding surveillance and intervention. This study sought to describe the natural history of GGO in patients with a history of lung cancer. A retrospective review was performed of 210 patients with a history of lung cancer and ensuing computed tomography evidence of pure or mixed GGOs between 2007 and 2013. Computed tomography reports were reviewed to determine the fate of the GGOs, by classifying all lesions as stable, resolved, or progressive over the course of the study. Multivariable analysis was performed to identify predictors of GGO progression and resolution. The mean follow-up time was 13 months. During this period, 55 (26%) patients' GGOs were stable, 131 (62%) resolved, and 24 (11%) progressed. Of the 24 GGOs that progressed, three were subsequently diagnosed as adenocarcinoma. Patients of black race (odds ratio [OR], 0.26) and other races besides white (OR, 0.89) had smaller odds of GGO resolution (p = 0.033), whereas patients with previous lung squamous cell carcinoma (OR, 5.16) or small cell carcinoma (OR, 5.36) were more likely to experience GGO resolution (p < 0.001). On multivariable analysis, only a history of adenocarcinoma was an independent predictor of GGO progression (OR, 6.9; p = 0.011). Among patients with a history of lung cancer, prior adenocarcinoma emerged as a predictor of GGO progression, whereas a history of squamous cell carcinoma or small cell carcinoma and white race were identified as predictors of GGO resolution. Copyright © 2018 The Society of Thoracic Surgeons. Published by Elsevier Inc. All rights reserved.
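    The odds ratios quoted above come from multivariable models. As a minimal illustration of what an odds ratio expresses (not the study's analysis), here is the basic two-group definition; all counts below are made up.

```python
# Odds ratio from a 2x2 table: odds of an event in group A divided by
# the odds of the same event in group B. Counts are hypothetical.

def odds_ratio(event_a, no_event_a, event_b, no_event_b):
    """OR comparing the odds of an event in group A versus group B."""
    return (event_a / no_event_a) / (event_b / no_event_b)

# Hypothetical: 10/100 progressions in one group vs 5/100 in another.
print(odds_ratio(10, 90, 5, 95))  # → 2.111... (odds roughly doubled)
```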

  2. Development of Compositionally Graded Metallic Glass Alloys with Desirable Properties

    DTIC Science & Technology

    2016-06-01

    individual study. Include participation in conferences, workshops, and seminars not listed under major activities. At OSU, one graduate student ...best glass former in a fraction of the experimental time of previous study, which identified similar compositions of Cu57.6Zr32.4Ti10,13 and...years, several experimental and computational studies have suggested a direct link between the population of shear transformation zones (STZs

  3. Evolution of design considerations in complex craniofacial reconstruction using patient-specific implants.

    PubMed

    Peel, Sean; Bhatia, Satyajeet; Eggbeer, Dominic; Morris, Daniel S; Hayhurst, Caroline

    2017-06-01

    Previously published evidence has established major clinical benefits from using computer-aided design, computer-aided manufacturing, and additive manufacturing to produce patient-specific devices. These include cutting guides, drilling guides, positioning guides, and implants. However, custom devices produced using these methods are still not in routine use, particularly by the UK National Health Service. Oft-cited reasons for this slow uptake include the following: a higher up-front cost than conventionally fabricated devices, material-choice uncertainty, and a lack of long-term follow-up due to their relatively recent introduction. This article identifies a further gap in current knowledge - that of design rules, or key specification considerations for complex computer-aided design/computer-aided manufacturing/additive manufacturing devices. This research begins to address the gap by combining a detailed review of the literature with first-hand experience of interdisciplinary collaboration on five craniofacial patient case studies. In each patient case, bony lesions in the orbito-temporal region were segmented, excised, and reconstructed in the virtual environment. Three cases translated these digital plans into theatre via polymer surgical guides. Four cases utilised additive manufacturing to fabricate titanium implants. One implant was machined from polyether ether ketone. From the literature, articles with relevant abstracts were analysed to extract design considerations. In all, 19 frequently recurring design considerations were extracted from previous publications. Nine new design considerations were extracted from the case studies - on the basis of subjective clinical evaluation. These were synthesised to produce a design considerations framework to assist clinicians with prescribing and design engineers with modelling. Promising avenues for further research are proposed.

  4. Towards a computational- and algorithmic-level account of concept blending using analogies and amalgams

    NASA Astrophysics Data System (ADS)

    Besold, Tarek R.; Kühnberger, Kai-Uwe; Plaza, Enric

    2017-10-01

    Concept blending - a cognitive process which allows for the combination of certain elements (and their relations) from originally distinct conceptual spaces into a new unified space combining these previously separate elements, and enables reasoning and inference over the combination - is taken as a key element of creative thought and combinatorial creativity. In this article, we summarise our work towards the development of a computational-level and algorithmic-level account of concept blending, combining approaches from computational analogy-making and case-based reasoning (CBR). We present the theoretical background, as well as an algorithmic proposal integrating higher-order anti-unification matching and generalisation from analogy with amalgams from CBR. The feasibility of the approach is then exemplified in two case studies.
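    The anti-unification operation at the core of the approach can be illustrated at first order: the least general generalisation (lgg) of two terms keeps their shared structure and abstracts every mismatch into a variable, reusing the same variable for repeated mismatch pairs. The representation and names below are a sketch of the idea, not the authors' higher-order system.

```python
# First-order anti-unification (least general generalisation).
# Terms are atoms (strings) or tuples ('functor', arg1, arg2, ...).

def lgg(s, t, store=None):
    if store is None:
        store = {}
    # identical subterms generalise to themselves
    if s == t:
        return s
    # same functor and arity: recurse into the arguments
    if (isinstance(s, tuple) and isinstance(t, tuple)
            and s[0] == t[0] and len(s) == len(t)):
        return (s[0],) + tuple(lgg(a, b, store) for a, b in zip(s[1:], t[1:]))
    # mismatch: introduce (or reuse) a variable for this pair
    if (s, t) not in store:
        store[(s, t)] = f"X{len(store)}"
    return store[(s, t)]

# f(a, g(a)) vs f(b, g(b))  ->  f(X0, g(X0))
print(lgg(('f', 'a', ('g', 'a')), ('f', 'b', ('g', 'b'))))
```

    Note how the repeated a/b mismatch maps to the same variable X0, preserving the shared relational structure that blending then exploits.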

  5. A study of application of remote sensing to river forecasting. Volume 1: Executive summary

    NASA Technical Reports Server (NTRS)

    1975-01-01

    A project is described whose goal was to define, implement and evaluate a pilot demonstration test to show the practicability of applying remotely sensed data to operational river forecasting in gaged or previously ungaged watersheds. A secondary objective was to provide NASA with documentation describing the computer programs that comprise the streamflow forecasting simulation model used. A computer-based simulation model was adapted to a streamflow forecasting application and implemented in an IBM System/360 Model 44 computer, operating in a dedicated mode, with operator interactive control through a Model 2250 keyboard/graphic CRT terminal. The test site whose hydrologic behavior was simulated is a small basin (365 square kilometers) designated Town Creek near Geraldine, Alabama.

  6. GPU accelerated FDTD solver and its application in MRI.

    PubMed

    Chi, J; Liu, F; Jin, J; Mason, D G; Crozier, S

    2010-01-01

    The finite difference time domain (FDTD) method is a popular technique for computational electromagnetics (CEM). The large computational power often required, however, has been a limiting factor for its applications. In this paper, we will present a graphics processing unit (GPU)-based parallel FDTD solver and its successful application to the investigation of a novel B1 shimming scheme for high-field magnetic resonance imaging (MRI). The optimized shimming scheme exhibits considerably improved transmit B1 profiles. The GPU implementation dramatically shortened the runtime of FDTD simulation of the electromagnetic field compared with its CPU counterpart. The acceleration in runtime has made such investigation possible, and will pave the way for other studies of large-scale computational electromagnetic problems in modern MRI which were previously impractical.
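    The stencil structure that makes FDTD so amenable to GPU parallelisation shows up already in a one-dimensional toy version of the method. The sketch below is plain Python, not the paper's 3D solver; every cell applies the same local update each step, which is exactly the pattern that maps onto one GPU thread per cell.

```python
# Bare-bones 1D FDTD (Yee) update with a Courant number of 0.5,
# a soft Gaussian source, and simple reflecting boundaries.
import math

def fdtd_1d(nz=200, nt=150, src=50):
    ez = [0.0] * nz   # electric field
    hy = [0.0] * nz   # magnetic field
    for n in range(nt):
        # H update from the spatial difference of E
        for k in range(nz - 1):
            hy[k] += 0.5 * (ez[k + 1] - ez[k])
        # E update from the spatial difference of H
        for k in range(1, nz):
            ez[k] += 0.5 * (hy[k] - hy[k - 1])
        # soft Gaussian source injected at one cell
        ez[src] += math.exp(-0.5 * ((n - 30) / 8.0) ** 2)
    return ez

field = fdtd_1d()  # pulse has propagated away from the source cell
```

    Because both inner loops are independent across k, a GPU version simply replaces each loop with a kernel launch over all cells.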

  7. Five- and six-electron harmonium atoms: Highly accurate electronic properties and their application to benchmarking of approximate 1-matrix functionals

    NASA Astrophysics Data System (ADS)

    Cioslowski, Jerzy; Strasburger, Krzysztof

    2018-04-01

    Electronic properties of several states of the five- and six-electron harmonium atoms are obtained from large-scale calculations employing explicitly correlated basis functions. The high accuracy of the computed energies (including their components), natural spinorbitals, and their occupation numbers makes them suitable for testing, calibration, and benchmarking of approximate formalisms of quantum chemistry and solid state physics. In the case of the five-electron species, the availability of the new data for a wide range of the confinement strengths ω allows for confirmation and generalization of the previously reached conclusions concerning the performance of the presently known approximations for the electron-electron repulsion energy in terms of the 1-matrix that are at the heart of density matrix functional theory (DMFT). On the other hand, the properties of the three low-lying states of the six-electron harmonium atom, computed at ω = 500 and ω = 1000, uncover deficiencies of the 1-matrix functionals not revealed by previous studies. In general, the previously published assessment that the present implementations of DMFT are of poor accuracy is found to hold. Extending the present work to harmonically confined systems with even more electrons is most likely counterproductive, as the steep increase in computational cost required to maintain sufficient accuracy of the calculated properties is not expected to be matched by the benefits of additional information gathered from the resulting benchmarks.

  8. Computational fluid mechanics

    NASA Technical Reports Server (NTRS)

    Hassan, H. A.

    1993-01-01

    Two papers are included in this progress report. In the first, the compressible Navier-Stokes equations have been used to compute leading edge receptivity of boundary layers over parabolic cylinders. Natural receptivity at the leading edge was simulated and Tollmien-Schlichting waves were observed to develop in response to an acoustic disturbance, applied through the farfield boundary conditions. To facilitate comparison with previous work, all computations were carried out at a free stream Mach number of 0.3. The spatial and temporal behavior of the flowfields are calculated through the use of finite volume algorithms and Runge-Kutta integration. The results are dominated by strong decay of the Tollmien-Schlichting wave due to the presence of the mean flow favorable pressure gradient. The effects of numerical dissipation, forcing frequency, and nose radius are studied. The Strouhal number is shown to have the greatest effect on the unsteady results. In the second paper, a transition model for low-speed flows, previously developed by Young et al., which incorporates first-mode (Tollmien-Schlichting) disturbance information from linear stability theory has been extended to high-speed flow by incorporating the effects of second mode disturbances. The transition model is incorporated into a Reynolds-averaged Navier-Stokes solver with a one-equation turbulence model. Results using a variable turbulent Prandtl number approach demonstrate that the current model accurately reproduces available experimental data for first and second-mode dominated transitional flows. The performance of the present model shows significant improvement over previous transition modeling attempts.
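    The first paper advances the flowfield in time with Runge-Kutta integration. As a generic illustration of that building block (not the paper's finite-volume solver), here is a classical fourth-order Runge-Kutta step applied to a scalar ODE dy/dt = f(t, y).

```python
# Classical RK4 step; global error scales as h^4.
import math

def rk4_step(f, t, y, h):
    k1 = f(t, y)
    k2 = f(t + h / 2, y + h / 2 * k1)
    k3 = f(t + h / 2, y + h / 2 * k2)
    k4 = f(t + h, y + h * k3)
    return y + h / 6 * (k1 + 2 * k2 + 2 * k3 + k4)

# Integrate dy/dt = y from y(0) = 1 to t = 1; exact answer is e.
y, t, h = 1.0, 0.0, 0.1
for _ in range(10):
    y = rk4_step(lambda t, y: y, t, y, h)
    t += h
print(y)  # ≈ 2.718280 (e, to about 2e-6)
```

    In a CFD code the scalar y becomes the vector of cell-averaged conserved quantities and f the finite-volume flux residual, but the time-stepping structure is the same.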

  9. Approaching gender parity: Women in computer science at Afghanistan's Kabul University

    NASA Astrophysics Data System (ADS)

    Plane, Jandelyn

    This study explores the representation of women in computer science at the tertiary level through data collected about undergraduate computer science education at Kabul University in Afghanistan. Previous studies have theorized reasons for underrepresentation of women in computer science, and while many of these reasons are indeed present in Afghanistan, they appear to hinder advancement to degree to a lesser extent. Women comprise at least 36% of each graduating class from KU's Computer Science Department; however, in 2007 women were 25% of the university population. In the US, women comprise over 50% of university populations while only graduating on average 25% women in undergraduate computer science programs. Representation of women in computer science in the US is 50% below the university rate, but at KU, it is 50% above the university rate. This mixed methods study of KU was conducted in the following three stages: setting up focus groups with women computer science students, distributing surveys to all students in the CS department, and conducting a series of 22 individual interviews with fourth year CS students. The analysis of the data collected and its comparison to literature on university/department retention in Science, Technology, Engineering and Mathematics gender representation and on women's education in underdeveloped Islamic countries illuminates KU's uncharacteristic representation of women in its Computer Science Department. The retention of women in STEM through the education pipeline has several characteristics in Afghanistan that differ from countries often studied in available literature. Few Afghan students have computers in their home and few have training beyond secretarial applications before considering studying CS at university. 
University students in Afghanistan are selected based on placement exams and are then assigned to an area of study, and financially supported throughout their academic career, resulting in a low attrition rate from the program. Gender and STEM literature identifies parental encouragement, stereotypes and employment perceptions as influential characteristics. Afghan women in computer science received significant parental encouragement even from parents with no computer background. They do not seem to be influenced by any negative "geek" stereotypes, but they do perceive limitations when considering employment after graduation.

  10. Consolidation of cloud computing in ATLAS

    NASA Astrophysics Data System (ADS)

    Taylor, Ryan P.; Domingues Cordeiro, Cristovao Jose; Giordano, Domenico; Hover, John; Kouba, Tomas; Love, Peter; McNab, Andrew; Schovancova, Jaroslava; Sobie, Randall; ATLAS Collaboration

    2017-10-01

    Throughout the first half of LHC Run 2, ATLAS cloud computing has undergone a period of consolidation, characterized by building upon previously established systems, with the aim of reducing operational effort, improving robustness, and reaching higher scale. This paper describes the current state of ATLAS cloud computing. Cloud activities are converging on a common contextualization approach for virtual machines, and cloud resources are sharing monitoring and service discovery components. We describe the integration of Vacuum resources, streamlined usage of the Simulation at Point 1 cloud for offline processing, extreme scaling on Amazon compute resources, and procurement of commercial cloud capacity in Europe. Finally, building on the previously established monitoring infrastructure, we have deployed a real-time monitoring and alerting platform which coalesces data from multiple sources, provides flexible visualization via customizable dashboards, and issues alerts and carries out corrective actions in response to problems.

  11. Development of posture-specific computational phantoms using motion capture technology and application to radiation dose-reconstruction for the 1999 Tokai-Mura nuclear criticality accident

    NASA Astrophysics Data System (ADS)

    Vazquez, Justin A.; Caracappa, Peter F.; Xu, X. George

    2014-09-01

    The majority of existing computational phantoms are designed to represent workers in typical standing anatomical postures with fixed arm and leg positions. However, workers found in accident-related scenarios often assume varied postures. This paper describes the development and application of two phantoms with adjusted postures specified by data acquired from a motion capture system to simulate unique human postures found in a 1999 criticality accident that took place at a JCO facility in Tokai-Mura, Japan. In the course of this accident, two workers were fatally exposed to extremely high levels of radiation. Implementation of the emergent techniques discussed produced more accurate and more detailed dose estimates for the two workers than were reported in previous studies. A total-body dose of 6.43 and 26.38 Gy was estimated for the two workers, who assumed a crouching and a standing posture, respectively. Additionally, organ-specific dose estimates were determined, including a 7.93 Gy dose to the thyroid and 6.11 Gy dose to the stomach for the crouching worker and a 41.71 Gy dose to the liver and a 37.26 Gy dose to the stomach for the standing worker. Implications for the medical prognosis of the workers are discussed, and the results of this study were found to correlate better with the patient outcome than previous estimates, suggesting potential future applications of such methods for improved epidemiological studies involving next-generation computational phantom tools.

  12. Criterion for Identifying Vortices in High-Pressure Flows

    NASA Technical Reports Server (NTRS)

    Bellan, Josette; Okong'o, Nora

    2007-01-01

    A study of four previously published computational criteria for identifying vortices in high-pressure flows has led to the selection of one of them as the best. This development can be expected to contribute to understanding of high-pressure flows, which occur in diverse settings, including diesel, gas turbine, and rocket engines and the atmospheres of Jupiter and other large gaseous planets. Information on the atmospheres of gaseous planets consists mainly of visual and thermal images of the flows over the planets. Also, validation of recently proposed computational models of high-pressure flows entails comparison with measurements, which are mainly of visual nature. Heretofore, the interpretation of images of high-pressure flows to identify vortices has been based on experience with low-pressure flows. However, high-pressure flows have features distinct from those of low-pressure flows, particularly in regions of high pressure gradient magnitude caused by dynamic turbulent effects and by thermodynamic mixing of chemical species. Therefore, interpretations based on low-pressure behavior may lead to misidentification of vortices and other flow structures in high-pressure flows. The study reported here was performed in recognition of the need for one or more quantitative criteria for identifying coherent flow structures - especially vortices - from previously generated flow-field data, to complement or supersede the determination of flow structures by visual inspection of instantaneous fields or flow animations. The focus in the study was on correlating visible images of flow features with various quantities computed from flow-field data.
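    The abstract does not name the four criteria, but a standard member of this family is the Q-criterion, which flags cells where rotation dominates strain in the velocity-gradient tensor. The sketch below is purely illustrative of how such a criterion is computed from flow-field data, not a statement of which criterion the study selected.

```python
# Q-criterion: Q = 0.5 * (||Omega||^2 - ||S||^2), where S and Omega are the
# symmetric (strain-rate) and antisymmetric (rotation-rate) parts of grad(u).

def q_criterion(grad_u):
    """Q for a 3x3 velocity-gradient tensor grad_u[i][j] = du_i/dx_j.
    Q > 0 marks rotation-dominated (vortical) regions."""
    q = 0.0
    for i in range(3):
        for j in range(3):
            s = 0.5 * (grad_u[i][j] + grad_u[j][i])  # strain-rate part
            w = 0.5 * (grad_u[i][j] - grad_u[j][i])  # rotation-rate part
            q += 0.5 * (w * w - s * s)
    return q

# Solid-body rotation about z: rotation-dominated, Q > 0.
rotation = [[0.0, -1.0, 0.0], [1.0, 0.0, 0.0], [0.0, 0.0, 0.0]]
# Pure plane strain: strain-dominated, Q < 0.
strain = [[1.0, 0.0, 0.0], [0.0, -1.0, 0.0], [0.0, 0.0, 0.0]]
print(q_criterion(rotation), q_criterion(strain))  # → 1.0 -1.0
```

    Applied cell by cell to a computed flow field, the sign of Q gives exactly the kind of quantitative vortex map the study sought to compare against visual inspection.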

  13. Development of posture-specific computational phantoms using motion capture technology and application to radiation dose-reconstruction for the 1999 Tokai-Mura nuclear criticality accident.

    PubMed

    Vazquez, Justin A; Caracappa, Peter F; Xu, X George

    2014-09-21

    The majority of existing computational phantoms are designed to represent workers in typical standing anatomical postures with fixed arm and leg positions. However, workers found in accident-related scenarios often assume varied postures. This paper describes the development and application of two phantoms with adjusted postures specified by data acquired from a motion capture system to simulate unique human postures found in a 1999 criticality accident that took place at a JCO facility in Tokai-Mura, Japan. In the course of this accident, two workers were fatally exposed to extremely high levels of radiation. Implementation of the emergent techniques discussed produced more accurate and more detailed dose estimates for the two workers than were reported in previous studies. A total-body dose of 6.43 and 26.38 Gy was estimated for the two workers, who assumed a crouching and a standing posture, respectively. Additionally, organ-specific dose estimates were determined, including a 7.93 Gy dose to the thyroid and 6.11 Gy dose to the stomach for the crouching worker and a 41.71 Gy dose to the liver and a 37.26 Gy dose to the stomach for the standing worker. Implications for the medical prognosis of the workers are discussed, and the results of this study were found to correlate better with the patient outcome than previous estimates, suggesting potential future applications of such methods for improved epidemiological studies involving next-generation computational phantom tools.

  14. The relationship between emotional intelligence, previous caring experience and mindfulness in student nurses and midwives: a cross sectional analysis.

    PubMed

    Snowden, Austyn; Stenhouse, Rosie; Young, Jenny; Carver, Hannah; Carver, Fiona; Brown, Norrie

    2015-01-01

    Emotional Intelligence (EI), previous caring experience and mindfulness training may have a positive impact on nurse education. More evidence is needed to support the use of these variables in nurse recruitment and retention. To explore the relationship between EI, gender, age, programme of study, previous caring experience and mindfulness training. Cross sectional element of a longitudinal study. 938 year-one nursing, midwifery and computing students at two Scottish Higher Education Institutes (HEIs) who entered their programme in September 2013. Participants completed a measure of 'trait' EI: the Trait Emotional Intelligence Questionnaire Short Form (TEIQue-SF); and of 'ability' EI: Schutte et al.'s (1998) Emotional Intelligence Scale (SEIS). Demographics, previous caring experience and previous training in mindfulness were recorded. Relationships between variables were tested using non-parametric tests. Emotional intelligence increased with age on both measures of EI [TEIQ-SF H(5)=15.157, p=0.001; SEIS H(5)=11.388, p=0.044]. Females (n=786) scored higher than males (n=149) on both measures [TEIQ-SF, U=44,931, z=-4.509, p<0.001; SEIS, U=44,744, z=-5.563, p<0.001]. Nursing students scored higher than computing students [TEIQ-SF H(5)=46,496, p<0.001; SEIS H(5)=33.309, p<0.001]. There were no statistically significant differences in TEIQ-SF scores between those who had previous mindfulness training (n=50) and those who had not (n=857) [U=22,980, z=0.864, p=0.388]. However, median SEIS was statistically significantly different according to mindfulness training [U=25,115.5, z=2.05, p=0.039]. Neither measure demonstrated statistically significant differences between those with (n=492) and without (n=479) previous caring experience [TEIQ-SF, U=112,102, z=0.938, p=0.348; SEIS, U=115,194.5, z=1.863, p=0.063]. Previous caring experience was not associated with higher emotional intelligence. Mindfulness training was associated with higher 'ability' emotional intelligence.
Implications for recruitment, retention and further research are explored. Copyright © 2014. Published by Elsevier Ltd.
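    The group comparisons above rest on the Mann-Whitney U statistic. As an illustration of what that statistic is (the study itself would have used standard statistical software), here is a minimal pure-Python computation with average ranks for ties.

```python
def mann_whitney_u(a, b):
    """Smaller-tail Mann-Whitney U for two independent samples
    (average ranks assigned to tied values)."""
    values = sorted(a + b)
    # average 1-based rank for each distinct value, handling ties
    ranks = {}
    i = 0
    while i < len(values):
        j = i
        while j < len(values) and values[j] == values[i]:
            j += 1
        ranks[values[i]] = (i + 1 + j) / 2.0  # mean of ranks i+1 .. j
        i = j
    r_a = sum(ranks[v] for v in a)            # rank sum of group a
    n_a, n_b = len(a), len(b)
    u_a = r_a - n_a * (n_a + 1) / 2.0
    return min(u_a, n_a * n_b - u_a)

# Completely separated groups give the extreme value U = 0.
print(mann_whitney_u([1, 2, 3], [4, 5, 6]))  # → 0.0
```

    The reported z and p values then follow from the normal approximation to U's null distribution.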

  15. Time-dependent Gutzwiller theory of magnetic excitations in the Hubbard model

    NASA Astrophysics Data System (ADS)

    Seibold, G.; Becca, F.; Rubin, P.; Lorenzana, J.

    2004-04-01

    We use a spin-rotational invariant Gutzwiller energy functional to compute random-phase-approximation-like (RPA) fluctuations on top of the Gutzwiller approximation (GA). The method can be viewed as an extension of the previously developed GA+RPA approach for the charge sector [G. Seibold and J. Lorenzana, Phys. Rev. Lett. 86, 2605 (2001)] with respect to the inclusion of the magnetic excitations. Unlike the charge case, no assumptions about the time evolution of the double occupancy are needed in this case. Interestingly, in a spin-rotational invariant system, we find the correct degeneracy between triplet excitations, showing the consistency of both computations. Since no restrictions are imposed on the symmetry of the underlying saddle-point solution, our approach is suitable for the evaluation of the magnetic susceptibility and dynamical structure factor in strongly correlated inhomogeneous systems. We present a detailed study of the quality of our approach by comparing with exact diagonalization results and show its much higher accuracy compared to the conventional Hartree-Fock+RPA theory. In infinite dimensions, where the GA becomes exact for the Gutzwiller variational energy, we evaluate ferromagnetic and antiferromagnetic instabilities from the transverse magnetic susceptibility. The resulting phase diagram is in complete agreement with previous variational computations.

  16. MMA-EoS: A Computational Framework for Mineralogical Thermodynamics

    NASA Astrophysics Data System (ADS)

    Chust, T. C.; Steinle-Neumann, G.; Dolejš, D.; Schuberth, B. S. A.; Bunge, H.-P.

    2017-12-01

    We present a newly developed software framework, MMA-EoS, that evaluates phase equilibria and thermodynamic properties of multicomponent systems by Gibbs energy minimization, with application to mantle petrology. The code is versatile in terms of the equation-of-state and mixing properties and allows for the computation of properties of single phases, solution phases, and multiphase aggregates. Currently, the open program distribution contains equation-of-state formulations widely used, that is, Caloric-Murnaghan, Caloric-Modified-Tait, and Birch-Murnaghan-Mie-Grüneisen-Debye models, with published databases included. Through its modular design and easily scripted database, MMA-EoS can readily be extended with new formulations of equations-of-state and changes or extensions to thermodynamic data sets. We demonstrate the application of the program by reproducing and comparing physical properties of mantle phases and assemblages with previously published work and experimental data, successively increasing complexity, up to computing phase equilibria of six-component compositions. Chemically complex systems allow us to trace the budget of minor chemical components in order to explore whether they lead to the formation of new phases or extend stability fields of existing ones. Self-consistently computed thermophysical properties for a homogeneous mantle and a mechanical mixture of slab lithologies show no discernible differences that require a heterogeneous mantle structure as has been suggested previously. Such examples illustrate how thermodynamics of mantle mineralogy can advance the study of Earth's interior.
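    One of the equation-of-state families named above, Birch-Murnaghan, is compact enough to sketch directly. The parameters below are approximate literature values for MgO periclase used only for illustration; they are not taken from the MMA-EoS database, and the function is our sketch, not the program's code.

```python
# Third-order Birch-Murnaghan equation of state: pressure as a function
# of volume, given V0 (zero-pressure volume), K0 (bulk modulus), and
# K0' (its pressure derivative). k0 sets the pressure units.

def birch_murnaghan_3(v, v0, k0, k0p):
    x = (v0 / v) ** (1.0 / 3.0)
    return 1.5 * k0 * (x**7 - x**5) * (1.0 + 0.75 * (k0p - 4.0) * (x**2 - 1.0))

# Illustrative MgO-like parameters: V0 ~ 11.25 cm^3/mol, K0 ~ 160 GPa, K0' ~ 4.
print(round(birch_murnaghan_3(10.0, 11.25, 160.0, 4.0), 1))  # → 23.9 (GPa)
```

    A framework like the one described evaluates such expressions (plus thermal terms) for every candidate phase while minimizing the total Gibbs energy of the assemblage.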

  17. A Double-Dissociation in Infants' Representations of Object Arrays

    ERIC Educational Resources Information Center

    Feigenson, L.

    2005-01-01

    Previous studies show that infants can compute either the total continuous extent (e.g. Clearfield, M.W., & Mix, K.S. (1999). Number versus contour length in infants' discrimination of small visual sets. Psychological Science, 10(5), 408-411; Feigenson, L., & Carey, S. (2003). Tracking individuals via object-files: evidence from infants' manual…

  18. Bees Algorithm for Construction of Multiple Test Forms in E-Testing

    ERIC Educational Resources Information Center

    Songmuang, Pokpong; Ueno, Maomi

    2011-01-01

    The purpose of this research is to automatically construct multiple equivalent test forms that have equivalent qualities indicated by test information functions based on item response theory. There has been a trade-off in previous studies between the computational costs and the equivalent qualities of test forms. To alleviate this problem, we…

  19. How Readability and Topic Incidence Relate to Performance on Mathematics Story Problems in Computer-Based Curricula

    ERIC Educational Resources Information Center

    Walkington, Candace; Clinton, Virginia; Ritter, Steven N.; Nathan, Mitchell J.

    2015-01-01

    Solving mathematics story problems requires text comprehension skills. However, previous studies have found few connections between traditional measures of text readability and performance on story problems. We hypothesized that recently developed measures of readability and topic incidence measured by text-mining tools may illuminate associations…

  20. State of Washington Computer Use Survey.

    ERIC Educational Resources Information Center

    Beal, Jack L.; And Others

    This report presents the results of a spring 1982 survey of a random sample of Washington public schools which separated findings according to school level (elementary, middle, junior high, or high school) and district size (either less than or greater than 2,000 enrollment). A brief review of previous studies and a description of the survey…

  1. Salary Compression: A Time-Series Ratio Analysis of ARL Position Classifications

    ERIC Educational Resources Information Center

    Seaman, Scott

    2007-01-01

    Although salary compression has previously been identified in such professional schools as engineering, business, and computer science, there is now evidence of salary compression among Association of Research Libraries members. Using salary data from the "ARL Annual Salary Survey", this study analyzes average annual salaries from 1994-1995…

  2. Assessing Faculty Beliefs about the Importance of Various Marketing Job Skills

    ERIC Educational Resources Information Center

    Hyman, Michael R.; Hu, Jing

    2005-01-01

    The need to improve the professional skills of those with marketing degrees has spurred surveys of current students, alumni, practitioners, and faculty about the importance of various professional skills; however, previous surveys of marketing faculty have focused only on computer skills. To address this limitation, the goals of this study were…

  3. The Intellectual Structure of Metacognitive Scaffolding in Science Education: A Co-Citation Network Analysis

    ERIC Educational Resources Information Center

    Tang, Kai-Yu; Wang, Chia-Yu; Chang, Hsin-Yi; Chen, Sufen; Lo, Hao-Chang; Tsai, Chin-Chung

    2016-01-01

    The issues of metacognitive scaffolding in science education (MSiSE) have become increasingly popular and important. Differing from previous content reviews, this study proposes a series of quantitative computer-based analyses by integrating document co-citation analysis, social network analysis, and exploratory factor analysis to explore the…

  4. Inhibitory Control in Childhood Stuttering

    ERIC Educational Resources Information Center

    Eggers, Kurt; De Nil, Luc F.; Van den Bergh, Bea R. H.

    2013-01-01

    Purpose: The purpose of this study was to investigate whether previously reported parental questionnaire-based differences in inhibitory control (IC; Eggers, De Nil, & Van den Bergh, 2010) would be supported by direct measurement of IC using a computer task. Method: Participants were 30 children who stutter (CWS; mean age = 7;05 years) and 30…

  5. Auditory Attentional Set-Shifting and Inhibition in Children Who Stutter

    ERIC Educational Resources Information Center

    Eggers, Kurt; Jansson-Verkasalo, Eira

    2017-01-01

    Purpose: The purpose of this study was to investigate whether previously reported parental questionnaire-based differences in attentional shifting and inhibitory control (AS and IC; Eggers, De Nil, & Van den Bergh, 2010) would be supported by direct measurement of AS and IC using a computer task. Method: Participants were 16 Finnish children…

  6. Moving an In-Class Module Online: A Case Study for Chemistry

    ERIC Educational Resources Information Center

    Seery, Michael K.

    2012-01-01

    This article summarises the author's experiences in running a module "Computers for Chemistry" entirely online for the past four years. The module, previously taught in a face-to-face environment, was reconfigured for teaching in an online environment. The rationale for moving online along with the design, implementation and evaluation of the…

  7. Mortality analysis by neighbourhood in a city with high levels of industrial air pollution.

    PubMed

    Vigotti, Maria Angela; Mataloni, Francesca; Bruni, Antonella; Minniti, Caterina; Gianicolo, Emilio A L

    2014-08-01

    Taranto, a city in south-eastern Italy, suffers serious environmental pollution from industrial sources. A previous cohort analysis found mortality excesses in the neighbourhoods closest to industrial areas. The aim of this study was to investigate whether mortality was also increased in other neighbourhoods compared with the Apulia region. Standardized mortality ratios (SMRs) were computed. Numbers of deaths and of person-years at risk by neighbourhood came from the previous cohort study for the 1998-2008 period. The reference population was the Apulia region excluding Taranto province. A meta-analysis was conducted across the neighbourhoods farther from the industrial areas, computing summary SMR estimates and evaluating heterogeneity. For the entire city, higher mortality values are confirmed for all causes, all malignant neoplasms and several specific sites, and for neurological, cardiac, respiratory and digestive diseases. High mortality values are not confined to the neighbourhoods closest to industrial areas for lung cancer and for cardiac, respiratory and digestive diseases, in both sexes, and among women for all malignant neoplasms and pancreatic cancer. Increased mortality risks can also be observed in Taranto neighbourhoods not directly adjacent to industrial areas. The spatial trend, the impact of socio-economic factors and the duration of residence should be explored further.
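
    The SMR computation this abstract relies on is simple arithmetic: SMR = observed deaths / expected deaths, where the expected count applies the reference population's rate to the neighbourhood's person-years. A minimal sketch with made-up numbers (not the Taranto data):

```python
# Illustrative SMR computation -- the rate and the counts below are invented.
reference_rate = 0.008                        # deaths per person-year, reference pop.
neighbourhoods = {"A": (120, 12000),          # (observed deaths, person-years)
                  "B": (95, 13000)}

smr = {}
for name, (observed, person_years) in neighbourhoods.items():
    expected = reference_rate * person_years  # deaths expected at reference rates
    smr[name] = observed / expected

# Summary SMR across neighbourhoods: total observed over total expected.
total_obs = sum(obs for obs, _ in neighbourhoods.values())
total_exp = sum(reference_rate * py for _, py in neighbourhoods.values())
pooled_smr = total_obs / total_exp
print(smr, pooled_smr)
```

    An SMR above 1 indicates excess mortality relative to the reference population; the study's meta-analysis additionally weights neighbourhood estimates and tests their heterogeneity before pooling.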

  8. A neural network based reputation bootstrapping approach for service selection

    NASA Astrophysics Data System (ADS)

    Wu, Quanwang; Zhu, Qingsheng; Li, Peng

    2015-10-01

    With the concept of service-oriented computing becoming widely accepted in enterprise application integration, more and more computing resources are encapsulated as services and published online. Reputation mechanisms have been studied to establish trust in previously unknown services. One limitation of current reputation mechanisms is that they cannot assess the reputation of newly deployed services, since no record of their previous behaviour exists. Most current bootstrapping approaches merely assign default reputation values to newcomers; however, such methods favour either the newcomers or the existing services. In this paper, we present a novel reputation bootstrapping approach in which correlations between the features and the performance of existing services are learned by an artificial neural network (ANN) and then generalised to establish a tentative reputation when evaluating new and unknown services. Reputations of services previously published by the same provider are also incorporated into reputation bootstrapping when available. The proposed approach is seamlessly embedded into an existing reputation model and implemented in the extended service-oriented architecture. Empirical studies of the proposed approach conclude the paper.
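
    The bootstrapping idea, learning a mapping from service features to reputation on services that have a history and then applying it to newcomers, can be sketched with a tiny feed-forward network. This is an illustrative stand-in, not the authors' ANN: the data, the network size, and the hidden linear relation are all invented.

```python
import numpy as np

rng = np.random.default_rng(0)

# Made-up training data: 3 service features -> observed reputation, generated
# from a hidden linear relation the network has to recover.
X = rng.uniform(0.0, 1.0, (50, 3))
y = (0.6 * X[:, 0] + 0.3 * X[:, 1] + 0.1 * X[:, 2]).reshape(-1, 1)

# A minimal one-hidden-layer network trained by full-batch gradient descent.
W1 = rng.normal(0.0, 0.5, (3, 8)); b1 = np.zeros((1, 8))
W2 = rng.normal(0.0, 0.5, (8, 1)); b2 = np.zeros((1, 1))

def forward(inp):
    hidden = np.tanh(inp @ W1 + b1)
    return hidden, hidden @ W2 + b2

_, pred = forward(X)
initial_loss = float(np.mean((pred - y) ** 2))

lr = 0.1
for _ in range(2000):
    h, pred = forward(X)
    d_out = 2.0 * (pred - y) / len(X)       # dLoss/dOutput (MSE)
    d_h = d_out @ W2.T * (1.0 - h ** 2)     # back-prop through tanh
    W2 -= lr * (h.T @ d_out); b2 -= lr * d_out.sum(0, keepdims=True)
    W1 -= lr * (X.T @ d_h);   b1 -= lr * d_h.sum(0, keepdims=True)

_, pred = forward(X)
final_loss = float(np.mean((pred - y) ** 2))

# Bootstrap a tentative reputation for a newcomer from its features alone.
_, newcomer_reputation = forward(np.array([[0.9, 0.8, 0.5]]))
```

    The point of the design is that the newcomer never needs transaction history: its tentative reputation comes entirely from the feature-performance correlations learned on existing services.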

  9. FIT: statistical modeling tool for transcriptome dynamics under fluctuating field conditions

    PubMed Central

    Iwayama, Koji; Aisaka, Yuri; Kutsuna, Natsumaro

    2017-01-01

    Motivation: Considerable attention has been given to the quantification of environmental effects on organisms. In natural conditions, environmental factors are continuously changing in a complex manner. To reveal the effects of such environmental variations on organisms, transcriptome data in field environments have been collected and analyzed. Nagano et al. proposed a model that describes the relationship between transcriptomic variation and environmental conditions and demonstrated the capability to predict transcriptome variation in rice plants. However, the computational cost of parameter optimization has prevented its wide application. Results: We propose a new statistical model and efficient parameter optimization based on the previous study. We developed and released FIT, an R package that offers functions for parameter optimization and transcriptome prediction. The proposed method achieves comparable or better prediction performance within a shorter computational time than the previous method. The package will facilitate the study of the environmental effects on transcriptomic variation in field conditions. Availability and Implementation: Freely available from CRAN (https://cran.r-project.org/web/packages/FIT/). Contact: anagano@agr.ryukoku.ac.jp Supplementary information: Supplementary data are available at Bioinformatics online. PMID:28158396

  10. An efficient technique for the numerical solution of the bidomain equations.

    PubMed

    Whiteley, Jonathan P

    2008-08-01

    Computing the numerical solution of the bidomain equations is widely accepted to be a significant computational challenge. In this study we extend a previously published semi-implicit numerical scheme with good stability properties that has been used to solve the bidomain equations (Whiteley, J.P. IEEE Trans. Biomed. Eng. 53:2139-2147, 2006). A new, efficient numerical scheme is developed which utilizes the observation that the only component of the ionic current that must be calculated on a fine spatial mesh and updated frequently is the fast sodium current. Other components of the ionic current may be calculated on a coarser mesh and updated less frequently, and then interpolated onto the finer mesh. Use of this technique to calculate the transmembrane potential and extracellular potential induces very little error in the solution. For the simulations presented in this study an increase in computational efficiency of over two orders of magnitude over standard numerical techniques is obtained.
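
    The efficiency trick described here, evaluating the slow ionic-current components on a coarse mesh and interpolating them onto the fine mesh where only the fast sodium current needs full resolution, can be sketched in one dimension. This is an illustrative stand-in: `slow_current` is a made-up smooth function, not a real ionic-current model.

```python
import numpy as np

fine_x = np.linspace(0.0, 1.0, 101)   # fine mesh: fast sodium current lives here
coarse_x = fine_x[::10]               # coarse mesh: 11 nodes for the slow currents

def slow_current(x):
    """Made-up smooth stand-in for the slowly varying ionic-current components."""
    return np.cos(2.0 * np.pi * x)

# Evaluate the slow components on the coarse mesh only, then interpolate
# onto the fine mesh instead of recomputing them at every fine node.
coarse_vals = slow_current(coarse_x)
interp_vals = np.interp(fine_x, coarse_x, coarse_vals)

max_error = float(np.max(np.abs(interp_vals - slow_current(fine_x))))
```

    Because the interpolated components vary slowly in space, the interpolation error stays small while the expensive evaluations drop by an order of magnitude per dimension, which is the mechanism behind the reported speed-up.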

  11. Digital Storytelling in Bhutan: A Qualitative Examination of New Media Tools Used to Bridge the Digital Divide in a Rural Community School

    ERIC Educational Resources Information Center

    Gyabak, Khendum; Godina, Heriberto

    2011-01-01

    This qualitative study examines the use of digital storytelling as an instructional intervention for bridging the digital divide among public school students in rural Bhutan. Primary participants for the study included elementary school children who had never been previously exposed to computer technology and were recipients of a donated classroom…

  12. Digital multishaker modal testing

    NASA Technical Reports Server (NTRS)

    Blair, M.; Craig, R. R., Jr.

    1983-01-01

    A review of several modal testing techniques is made, along with brief discussions of their advantages and limitations. A new technique is presented which overcomes many of the previous limitations. Several simulated experiments are included to verify the validity and accuracy of the new method. Conclusions are drawn from the simulation studies and recommendations for further work are presented. The complete computer code configured for the simulation study is presented.

  13. An Interactive Learning Environment Designed to Increase the Possibilities for Learning and Communicating about Radioactivity

    ERIC Educational Resources Information Center

    Mork, Sonja M.

    2011-01-01

    Information and communication technology (ICT) is a natural part of most people's everyday life, and has also been introduced in schools. Previous studies have tended to focus on issues related to competency of teachers and lack of computer technology in schools. Focus now seems to be moving towards studies that help us understand how ICT may be…

  14. Computer Aided Phenomenography: The Role of Leximancer Computer Software in Phenomenographic Investigation

    ERIC Educational Resources Information Center

    Penn-Edwards, Sorrel

    2010-01-01

    The qualitative research methodology of phenomenography has traditionally required a manual sorting and analysis of interview data. In this paper I explore a potential means of streamlining this procedure by considering a computer aided process not previously reported upon. Two methods of lexicological analysis, manual and automatic, were examined…

  15. Automated computer grading of hardwood lumber

    Treesearch

    P. Klinkhachorn; J.P. Franklin; Charles W. McMillin; R.W. Conners; H.A. Huber

    1988-01-01

    This paper describes an improved computer program to grade hardwood lumber. The program was created as part of a system to automate various aspects of the hardwood manufacturing industry. It enhances previous efforts by considering both faces of the board and provides easy application of species dependent rules. The program can be readily interfaced with a computer...

  16. The Validity of Computer Audits of Simulated Cases Records.

    ERIC Educational Resources Information Center

    Rippey, Robert M.; And Others

    This paper describes the implementation of a computer-based approach to scoring open-ended problem lists constructed to evaluate student and practitioner clinical judgment from real or simulated records. Based on 62 previously administered and scored problem lists, the program was written in BASIC for a Heathkit H11A computer (equivalent to DEC…

  17. Factors Influencing Skilled Use of the Computer Mouse by School-Aged Children

    ERIC Educational Resources Information Center

    Lane, Alison E.; Ziviani, Jenny M.

    2010-01-01

    Effective use of computers in education for children requires consideration of individual and developmental characteristics of users. There is limited empirical evidence, however, to guide educational programming when it comes to children and their acquisition of computing skills. This paper reports on the influence of previous experience and…

  18. Outline of the Course in Automated Language Processing.

    ERIC Educational Resources Information Center

    Pacak, M.; Roberts, A. Hood

    The course in computational linguistics described in this paper was given at The American University during the spring semester of 1969. The purpose of the course was "to convey to students with no previous experience an appreciation of the growing art of computational linguistics which encompasses every use to which computers can be put in…

  19. The joint effect of mesoscale and microscale roughness on perceived gloss.

    PubMed

    Qi, Lin; Chantler, Mike J; Siebert, J Paul; Dong, Junyu

    2015-10-01

    Computer simulated stimuli can provide a flexible method for creating artificial scenes in the study of visual perception of material surface properties. Previous work based on this approach reported that the properties of surface roughness and glossiness are mutually interdependent and therefore, perception of one affects the perception of the other. In this case roughness was limited to a surface property termed bumpiness. This paper reports a study into how perceived gloss varies with two model parameters related to surface roughness in computer simulations: the mesoscale roughness parameter in a surface geometry model and the microscale roughness parameter in a surface reflectance model. We used a real-world environment map to provide complex illumination and a physically-based path tracer for rendering the stimuli. Eight observers took part in a 2AFC experiment, and the results were tested against conjoint measurement models. We found that although both of the above roughness parameters significantly affect perceived gloss, the additive model does not adequately describe their mutually interactive and nonlinear influence, which is at variance with previous findings. We investigated five image properties used to quantify specular highlights, and found that perceived gloss is well predicted using a linear model. Our findings provide computational support to the 'statistical appearance models' proposed recently for material perception. Copyright © 2015 Elsevier Ltd. All rights reserved.

  20. Simulation of Mach Probes in Non-Uniform Magnetized Plasmas: the Influence of a Background Density Gradient

    NASA Astrophysics Data System (ADS)

    Haakonsen, Christian Bernt; Hutchinson, Ian H.

    2013-10-01

    Mach probes can be used to measure transverse flow in magnetized plasmas, but what they actually measure in strongly non-uniform plasmas has not been definitively established. A fluid treatment in previous work has suggested that the diamagnetic drifts associated with background density and temperature gradients affect transverse flow measurements, but detailed computational study is required to validate and elaborate on those results; it is really a kinetic problem, since the probe deforms and introduces voids in the ion and electron distribution functions. A new code, the Plasma-Object Simulator with Iterated Trajectories (POSIT) has been developed to self-consistently compute the steady-state six-dimensional ion and electron distribution functions in the perturbed plasma. Particle trajectories are integrated backwards in time to the domain boundary, where arbitrary background distribution functions can be specified. This allows POSIT to compute the ion and electron density at each node of its unstructured mesh, update the potential based on those densities, and then iterate until convergence. POSIT is used to study the impact of a background density gradient on transverse Mach probe measurements, and the results compared to the previous fluid theory. C.B. Haakonsen was supported in part by NSF/DOE Grant No. DE-FG02-06ER54512, and in part by an SCGF award administered by ORISE under DOE Contract No. DE-AC05-06OR23100.

  1. The relative effectiveness of computer-based and traditional resources for education in anatomy.

    PubMed

    Khot, Zaid; Quinlan, Kaitlyn; Norman, Geoffrey R; Wainman, Bruce

    2013-01-01

    There is increasing use of computer-based resources to teach anatomy, although no study has compared computer-based learning with traditional methods. In this study, we examine the effectiveness of three formats of anatomy learning: (1) a virtual reality (VR) computer-based module, (2) a static computer-based module providing Key Views (KV), and (3) a plastic model. We conducted a controlled trial in which 60 undergraduate students had ten minutes to study the names of 20 different pelvic structures. The outcome measure was a 25-item short-answer test consisting of 15 nominal and 10 functional questions, based on a cadaveric pelvis. All subjects also took a brief mental rotations test (MRT) as a measure of spatial ability, which was used as a covariate in the analysis. Data were analyzed with repeated-measures ANOVA. The group learning from the model performed significantly better than the other two groups on the nominal questions (Model 67%; KV 40%; VR 41%; effect sizes 1.19 and 1.29, respectively). There was no difference between the KV and VR groups, and no difference between the groups on the functional questions (Model 28%; KV 23%; VR 25%). Computer-based learning resources appear to have significant disadvantages compared to traditional specimens in learning nominal anatomy. Consistent with previous research, virtual reality shows no advantage over static presentation of key views. © 2013 American Association of Anatomists.

  2. Simulation of Nonlinear Instabilities in an Attachment-Line Boundary Layer

    NASA Technical Reports Server (NTRS)

    Joslin, Ronald D.

    1996-01-01

    The linear and the nonlinear stability of disturbances that propagate along the attachment line of a three-dimensional boundary layer is considered. The spatially evolving disturbances in the boundary layer are computed by direct numerical simulation (DNS) of the unsteady, incompressible Navier-Stokes equations. Disturbances are introduced either by forcing at the inflow or by applying suction and blowing at the wall. Quasi-parallel linear stability theory and a nonparallel theory yield notably different stability characteristics for disturbances near the critical Reynolds number; the DNS results confirm the latter theory. Previously, a weakly nonlinear theory and computations revealed a high wave-number region of subcritical disturbance growth. More recent computations have failed to achieve this subcritical growth. The present computational results indicate the presence of subcritically growing disturbances; the results support the weakly nonlinear theory. Furthermore, an explanation is provided for the previous theoretical and computational discrepancy. In addition, the present results demonstrate that steady suction can be used to stabilize disturbances that otherwise grow subcritically along the attachment line.

  3. An Expressive, Lightweight and Secure Construction of Key Policy Attribute-Based Cloud Data Sharing Access Control

    NASA Astrophysics Data System (ADS)

    Lin, Guofen; Hong, Hanshu; Xia, Yunhao; Sun, Zhixin

    2017-10-01

    Attribute-based encryption (ABE) is an interesting cryptographic technique for flexible cloud data-sharing access control. However, some open challenges hinder its practical application. In previous schemes, all attributes are treated as having the same status, whereas in most practical scenarios they do not. Meanwhile, the size of the access policy increases dramatically as its expressiveness grows. In addition, current research hardly notices that mobile front-end devices, such as smartphones, have poor computational performance, while ABE requires a great deal of bilinear pairing computation. In this paper, we propose a key-policy weighted attribute-based encryption scheme without bilinear pairing computation (KP-WABE-WB) for secure cloud data-sharing access control. A simple weighting mechanism describes the different importance of each attribute. We introduce a novel construction of ABE that executes no bilinear pairing computation. Compared to previous schemes, our scheme performs better in the expressiveness of the access policy and in computational efficiency.

  4. The reliability of continuous brain responses during naturalistic listening to music.

    PubMed

    Burunat, Iballa; Toiviainen, Petri; Alluri, Vinoo; Bogert, Brigitte; Ristaniemi, Tapani; Sams, Mikko; Brattico, Elvira

    2016-01-01

    Low-level (timbral) and high-level (tonal and rhythmical) musical features during continuous listening to music, studied by functional magnetic resonance imaging (fMRI), have been shown to elicit large-scale responses in cognitive, motor, and limbic brain networks. Using a similar methodological approach and a similar group of participants, we aimed to study the replicability of previous findings. Participants' fMRI responses during continuous listening of a tango Nuevo piece were correlated voxelwise against the time series of a set of perceptually validated musical features computationally extracted from the music. The replicability of previous results and the present study was assessed by two approaches: (a) correlating the respective activation maps, and (b) computing the overlap of active voxels between datasets at variable levels of ranked significance. Activity elicited by timbral features was better replicable than activity elicited by tonal and rhythmical ones. These results indicate more reliable processing mechanisms for low-level musical features as compared to more high-level features. The processing of such high-level features is probably more sensitive to the state and traits of the listeners, as well as of their background in music. Copyright © 2015 Elsevier Inc. All rights reserved.
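
    The voxelwise analysis described here amounts to correlating each voxel's time series against a musical-feature time series. A minimal vectorized sketch on synthetic data (sizes, the feature, and the one voxel constructed to track it are all invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)
T, V = 200, 1000                        # time points, voxels (toy sizes)

feature = rng.normal(size=T)            # e.g. one extracted timbral feature
voxels = rng.normal(size=(T, V))        # synthetic "fMRI" voxel time series
voxels[:, 0] += 2.0 * feature           # voxel 0 tracks the feature by construction

# Vectorized Pearson correlation of every voxel against the feature:
# z-score both sides, then one matrix product gives all V correlations.
fz = (feature - feature.mean()) / feature.std()
vz = (voxels - voxels.mean(axis=0)) / voxels.std(axis=0)
r = fz @ vz / T                         # shape (V,): one r per voxel
```

    Thresholding `r` yields an activation map; the paper's replicability check then correlates two such maps and measures the overlap of their top-ranked voxels.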

  5. Language networks associated with computerized semantic indices.

    PubMed

    Pakhomov, Serguei V S; Jones, David T; Knopman, David S

    2015-01-01

    Tests of generative semantic verbal fluency are widely used to study organization and representation of concepts in the human brain. Previous studies demonstrated that clustering and switching behavior during verbal fluency tasks is supported by multiple brain mechanisms associated with semantic memory and executive control. Previous work relied on manual assessments of semantic relatedness between words and grouping of words into semantic clusters. We investigated a computational linguistic approach to measuring the strength of semantic relatedness between words based on latent semantic analysis of word co-occurrences in a subset of a large online encyclopedia. We computed semantic clustering indices and compared them to brain network connectivity measures obtained with task-free fMRI in a sample consisting of healthy participants and those differentially affected by cognitive impairment. We found that semantic clustering indices were associated with brain network connectivity in distinct areas including fronto-temporal, fronto-parietal and fusiform gyrus regions. This study shows that computerized semantic indices complement traditional assessments of verbal fluency to provide a more complete account of the relationship between brain and verbal behavior involved in the organization and retrieval of lexical information from memory. Copyright © 2014 Elsevier Inc. All rights reserved.
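
    The automated clustering/switching computation reduces to a relatedness score between consecutive words in the fluency sequence. A toy sketch of the idea follows; the three-dimensional vectors standing in for LSA word representations and the switch threshold are invented for illustration.

```python
import numpy as np

# Invented 3-d "semantic space" vectors standing in for LSA word
# representations derived from word co-occurrence counts.
vectors = {
    "dog":    np.array([0.9, 0.1, 0.0]),
    "cat":    np.array([0.8, 0.2, 0.1]),
    "hammer": np.array([0.0, 0.1, 0.9]),
}

def relatedness(a, b):
    """Cosine similarity between two word vectors."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# A cluster "switch" in a fluency sequence is flagged whenever relatedness
# to the previous word falls below an (illustrative) threshold.
fluency_sequence = ["dog", "cat", "hammer"]
threshold = 0.5
switches = [relatedness(vectors[w1], vectors[w2]) < threshold
            for w1, w2 in zip(fluency_sequence, fluency_sequence[1:])]
```

    In real LSA the vectors are low-rank factors of a large word-by-context co-occurrence matrix, but the clustering indices are built from exactly this kind of pairwise similarity over the produced word sequence.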

  6. ELEMENT MASSES IN THE CRAB NEBULA

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sibley, Adam R.; Katz, Andrea M.; Satterfield, Timothy J.

    Using our previously published element abundance or mass-fraction distributions in the Crab Nebula, we derived actual mass distributions and estimates for overall nebular masses of hydrogen, helium, carbon, nitrogen, oxygen and sulfur. As with the previous work, computations were carried out for photoionization models involving constant hydrogen density and also constant nuclear density. In addition, employing new flux measurements for [Ni ii] λ7378, along with combined photoionization models and analytic computations, a nickel abundance distribution was mapped and a nebular stable nickel mass estimate was derived.

  7. Micro-Ramp Flow Control for Oblique Shock Interactions: Comparisons of Computational and Experimental Data

    NASA Technical Reports Server (NTRS)

    Hirt, Stefanie M.; Reich, David B.; O'Connor, Michael B.

    2010-01-01

    Computational fluid dynamics was used to study the effectiveness of micro-ramp vortex generators to control oblique shock boundary layer interactions. Simulations were based on experiments previously conducted in the 15 x 15 cm supersonic wind tunnel at NASA Glenn Research Center. Four micro-ramp geometries were tested at Mach 2.0 varying the height, chord length, and spanwise spacing between micro-ramps. The overall flow field was examined. Additionally, key parameters such as boundary-layer displacement thickness, momentum thickness and incompressible shape factor were also examined. The computational results predicted the effects of the micro-ramps well, including the trends for the impact that the devices had on the shock boundary layer interaction. However, computing the shock boundary layer interaction itself proved to be problematic since the calculations predicted more pronounced adverse effects on the boundary layer due to the shock than were seen in the experiment.

  8. Micro-Ramp Flow Control for Oblique Shock Interactions: Comparisons of Computational and Experimental Data

    NASA Technical Reports Server (NTRS)

    Hirt, Stephanie M.; Reich, David B.; O'Connor, Michael B.

    2012-01-01

    Computational fluid dynamics was used to study the effectiveness of micro-ramp vortex generators to control oblique shock boundary layer interactions. Simulations were based on experiments previously conducted in the 15- by 15-cm supersonic wind tunnel at the NASA Glenn Research Center. Four micro-ramp geometries were tested at Mach 2.0 varying the height, chord length, and spanwise spacing between micro-ramps. The overall flow field was examined. Additionally, key parameters such as boundary-layer displacement thickness, momentum thickness and incompressible shape factor were also examined. The computational results predicted the effects of the micro-ramps well, including the trends for the impact that the devices had on the shock boundary layer interaction. However, computing the shock boundary layer interaction itself proved to be problematic since the calculations predicted more pronounced adverse effects on the boundary layer due to the shock than were seen in the experiment.

  9. On the Fast Evaluation Method of Temperature and Gas Mixing Ratio Weighting Functions for Remote Sensing of Planetary Atmospheres in Thermal IR and Microwave

    NASA Technical Reports Server (NTRS)

    Ustinov, E. A.

    1999-01-01

    Evaluation of weighting functions in atmospheric remote sensing is usually the most computer-intensive part of the inversion algorithms. We present an analytic approach to computing temperature and mixing-ratio weighting functions that builds on our previous results; the resulting expressions, however, use intermediate variables that are generated in the computation of the observable radiances themselves. Upwelling radiances at a given level in the atmosphere and atmospheric transmittances from space to that level are combined with local values of the total absorption coefficient and of its components due to absorption by the atmospheric constituents under study. This makes it possible to evaluate the temperature and mixing-ratio weighting functions in parallel with the evaluation of radiances, which substantially decreases the computer time required. Implications for the nadir and limb viewing geometries are discussed.

  10. Search for an Appropriate Behavior within the Emotional Regulation in Virtual Creatures Using a Learning Classifier System

    PubMed Central

    Rosales, Jonathan-Hernando; Cervantes, José-Antonio

    2017-01-01

    Emotion regulation is a process by which human beings control emotional behaviors. Neuroscientific evidence indicates that this mechanism is the product of conscious or unconscious processes. In particular, the mechanism generated by a conscious process needs a priori components in order to be computed. Behaviors generated by previous experiences are among these components, and they need to be adapted to fulfill the objectives of a specific situation. The problem we address is how to endow virtual creatures with emotion regulation in order to compute an appropriate behavior in a specific emotional situation. This problem is clearly important, and we have not identified solutions to it in the current literature. In our proposal, we show a way to generate the appropriate behavior in an emotional situation using a learning classifier system (LCS). We illustrate the function of our proposal in unknown and known situations by means of two case studies. Our results demonstrate that it is possible to converge to the appropriate behavior even in the first case, that is, when the system has no previous experience; in situations where some previous information is available, our proposal proves to be a very powerful tool. PMID:29209362

  11. Conflicts of interest improve collective computation of adaptive social structures

    PubMed Central

    Brush, Eleanor R.; Krakauer, David C.; Flack, Jessica C.

    2018-01-01

    In many biological systems, the functional behavior of a group is collectively computed by the system’s individual components. An example is the brain’s ability to make decisions via the activity of billions of neurons. A long-standing puzzle is how the components’ decisions combine to produce beneficial group-level outputs, despite conflicts of interest and imperfect information. We derive a theoretical model of collective computation from mechanistic first principles, using results from previous work on the computation of power structure in a primate model system. Collective computation has two phases: an information accumulation phase, in which (in this study) pairs of individuals gather information about their fighting abilities and make decisions about their dominance relationships, and an information aggregation phase, in which these decisions are combined to produce a collective computation. To model information accumulation, we extend a stochastic decision-making model—the leaky integrator model used to study neural decision-making—to a multiagent game-theoretic framework. We then test alternative algorithms for aggregating information—in this study, decisions about dominance resulting from the stochastic model—and measure the mutual information between the resultant power structure and the “true” fighting abilities. We find that conflicts of interest can improve accuracy to the benefit of all agents. We also find that the computation can be tuned to produce different power structures by changing the cost of waiting for a decision. The successful application of a similar stochastic decision-making model in neural and social contexts suggests general principles of collective computation across substrates and scales. PMID:29376116
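
    The information-accumulation phase extends the leaky-integrator model of neural decision-making. A minimal two-accumulator race can be sketched as follows; all parameter values are illustrative, not taken from the paper.

```python
import numpy as np

def leaky_race(drift_a, drift_b, tau=10.0, noise=0.2, threshold=1.0,
               dt=0.1, max_steps=10000, seed=0):
    """Two leaky accumulators race to a threshold; the first to cross wins.
    Illustrative parameter values, not the paper's."""
    rng = np.random.default_rng(seed)
    x = np.zeros(2)
    drifts = np.array([drift_a, drift_b])
    for step in range(max_steps):
        # Euler-Maruyama step of dx = (-x/tau + drift) dt + noise dW
        x += dt * (-x / tau + drifts) + np.sqrt(dt) * noise * rng.normal(size=2)
        if (x >= threshold).any():
            return int(np.argmax(x)), (step + 1) * dt
    return -1, max_steps * dt  # no decision within the time limit

winner, decision_time = leaky_race(0.25, 0.05)
```

    Raising `threshold` makes decisions slower but more reliably favor the stronger drift, mirroring the paper's observation that the collective computation can be tuned by changing the cost of waiting for a decision.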

  12. Use of the Internet by burns patients, their families and friends.

    PubMed

    Rea, S; Lim, J; Falder, S; Wood, F

    2008-05-01

    The Internet has become an increasingly important source of health-related information. However, with this exponential growth comes a problem: although the volume of information is huge, its quality, accuracy and completeness are questionable, and the field of medicine is no exception. Previous studies of single medical conditions have suggested that web-based health information has limitations. The aim of this study was to evaluate Internet usage among burned patients and the people accompanying them to the outpatient clinic. A customised questionnaire was created and distributed to all patients and accompanying persons in the adult and paediatric burns clinics. This investigated computer usage, Internet access, usefulness of Internet searches and topics searched. Two hundred and ten people completed the questionnaire, a response rate of 83%. Sixty-three percent of responders were patients, 21.9% parents, 3.3% spouses, and the remaining 10.8% siblings, children and friends. Seventy-seven percent of attendees had been injured within the last year, 11% between 1 and 5 years previously, and 12% more than 5 years previously. Seventy-four percent had computer and Internet access. Twelve percent had performed a search. Topics searched included skin grafts, scarring and scar management treatments such as pressure garments, silicone gel and massage. This study has shown that computer and Internet access is high; however, very few respondents actually used the Internet to access further medical information. Patients with longer-standing injuries were more likely to access the Internet. Parents of burned children were more frequent Internet users. As more burn units develop their own web sites with information for patients and healthcare providers, it is important to inform patients, family members and friends that such a resource exists. By offering such a service, patients are provided with accurate, reliable and easily accessible information appropriate to their needs.

  13. PEPIS: A Pipeline for Estimating Epistatic Effects in Quantitative Trait Locus Mapping and Genome-Wide Association Studies.

    PubMed

    Zhang, Wenchao; Dai, Xinbin; Wang, Qishan; Xu, Shizhong; Zhao, Patrick X

    2016-05-01

    The term epistasis refers to interactions between multiple genetic loci. Genetic epistasis is important in regulating biological function and is considered to explain part of the 'missing heritability,' which involves marginal genetic effects that cannot be accounted for in genome-wide association studies. Thus, the study of epistasis is of great interest to geneticists. However, estimating epistatic effects for quantitative traits is challenging due to the large number of interaction effects that must be estimated, which significantly increases computing demands. Here, we present a new web server-based tool, the Pipeline for estimating EPIStatic genetic effects (PEPIS), for analyzing polygenic epistatic effects. The PEPIS software package is based on a new linear mixed model that has been used to predict the performance of hybrid rice. The PEPIS includes two main sub-pipelines: the first for kinship matrix calculation, and the second for polygenic component analyses and genome scanning for main and epistatic effects. To accommodate the demand for high-performance computation, the PEPIS utilizes C/C++ for mathematical matrix computing. In addition, the modules for kinship matrix calculation and for main and epistatic-effect genome scanning employ parallel computing technology that effectively utilizes multiple computer nodes across our networked cluster, significantly improving the computational speed. For example, when analyzing the same immortalized F2 rice population genotypic data examined in a previous study, the PEPIS returned results identical to those of the original prototype R code at each analysis step, but the computational time was reduced from more than one month to about five minutes. These advances will help overcome the bottleneck frequently encountered in genome-wide epistatic genetic effect analysis and accommodate the high computational demand. The PEPIS is publicly available at http://bioinfo.noble.org/PolyGenic_QTL/.
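
    The computational pressure the pipeline addresses comes in part from the sheer number of interaction effects: with m markers there are m(m - 1)/2 marker pairs to scan, a quadratically growing count (and the full model estimates several epistatic components per pair). A quick sketch of the count alone:

```python
def n_pairwise(m):
    """Number of distinct marker pairs: m choose 2."""
    return m * (m - 1) // 2

# The pair count grows quadratically with the number of markers.
print(n_pairwise(100), n_pairwise(1000))   # 4950 499500
```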

  14. In Search of Gender Free Paradigms for Computer Science Education. [Proceedings of a Preconference Research Workshop at the National Educational Computing Conference (Nashville, Tennessee, June 24, 1990).]

    ERIC Educational Resources Information Center

    Martin, C. Dianne, Ed.; Murchie-Beyma, Eric, Ed.

    This monograph includes nine papers delivered at a National Educational Computing Conference (NECC) preconference workshop, and a previously unpublished paper on gender and attitudes. The papers, which are presented in four categories, are: (1) "Report on the Workshop: In Search of Gender Free Paradigms for Computer Science Education"…

  15. Non-unitary probabilistic quantum computing circuit and method

    NASA Technical Reports Server (NTRS)

    Williams, Colin P. (Inventor); Gingrich, Robert M. (Inventor)

    2009-01-01

    A quantum circuit performing quantum computation in a quantum computer. A chosen transformation of an initial n-qubit state is probabilistically obtained. The circuit comprises a unitary quantum operator obtained from a non-unitary quantum operator, operating on an n-qubit state and an ancilla state. When operation on the ancilla state provides a success condition, computation is stopped. When operation on the ancilla state provides a failure condition, computation is performed again on the ancilla state and the n-qubit state obtained in the previous computation, until a success condition is obtained.
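
    The probabilistic repeat-until-success control flow can be modelled abstractly: each round applies the dilated unitary and measures the ancilla, succeeding with some probability p; on failure the state is fed back and the round repeats. The sketch below models only the retry statistics, with an assumed p that is not taken from the patent; for a geometric distribution the expected number of rounds is 1/p.

```python
import random

rng = random.Random(1)

def rounds_until_success(p):
    """Count rounds until the ancilla measurement signals success."""
    n = 1
    while rng.random() >= p:   # failure condition: feed back and repeat
        n += 1
    return n

# Empirical mean round count should approach 1/p (about 3.33 for p = 0.3).
mean = sum(rounds_until_success(0.3) for _ in range(20000)) / 20000
```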

  16. Design analysis and computer-aided performance evaluation of shuttle orbiter electrical power system. Volume 1: Summary

    NASA Technical Reports Server (NTRS)

    1974-01-01

    Studies were conducted to develop appropriate space shuttle electrical power distribution and control (EPDC) subsystem simulation models and to apply the computer simulations to systems analysis of the EPDC. A previously developed software program (SYSTID) was adapted for this purpose. The following objectives were attained: (1) significant enhancement of the SYSTID time domain simulation software, (2) generation of functionally useful shuttle EPDC element models, and (3) illustrative simulation results in the analysis of EPDC performance, under the conditions of fault, current pulse injection due to lightning, and circuit protection sizing and reaction times.

  17. Blind source computer device identification from recorded VoIP calls for forensic investigation.

    PubMed

    Jahanirad, Mehdi; Anuar, Nor Badrul; Wahab, Ainuddin Wahid Abdul

    2017-03-01

    VoIP services provide fertile ground for criminal activity, so identifying the transmitting computer device from a recorded VoIP call may help the forensic investigator reveal useful information. It also proves the authenticity of the call recording submitted to the court as evidence. This paper extends a previous study on the use of recorded VoIP calls for blind source computer device identification, whose initial results were promising although a theoretical explanation for them is yet to be found. The study suggested computing the entropy of mel-frequency cepstrum coefficients (entropy-MFCC) from near-silent segments as an intrinsic feature set that captures the device response function due to the tolerances in the electronic components of individual computer devices. By applying the supervised learning techniques of naïve Bayesian, linear logistic regression, neural networks and support vector machines to the entropy-MFCC features, state-of-the-art identification accuracy of near 99.9% has been achieved on different sets of computer devices for both call recording and microphone recording scenarios. Furthermore, unsupervised learning techniques, including simple k-means, expectation-maximization and density-based spatial clustering of applications with noise (DBSCAN), provided promising results for the call recording dataset by assigning the majority of instances to their correct clusters. Copyright © 2017 Elsevier Ireland Ltd. All rights reserved.
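
    The entropy step of the proposed feature can be sketched in isolation (the MFCC extraction itself, i.e. framing, mel filterbank and DCT, is omitted here, and the histogram binning below is an assumed choice): histogram a coefficient's values over the near-silent frames and take the Shannon entropy.

```python
import math

def shannon_entropy(values, bins=8):
    """Histogram the values and return the Shannon entropy in bits."""
    lo, hi = min(values), max(values)
    width = (hi - lo) / bins or 1.0          # guard against constant input
    counts = [0] * bins
    for v in values:
        counts[min(int((v - lo) / width), bins - 1)] += 1
    n = len(values)
    return -sum(c / n * math.log2(c / n) for c in counts if c)
```

    Evenly spread coefficient values maximize the entropy (3 bits for 8 equally filled bins), while a constant coefficient gives zero; the premise is that component tolerances leave a device-specific imprint on this distribution.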

  18. Computational time reduction for sequential batch solutions in GNSS precise point positioning technique

    NASA Astrophysics Data System (ADS)

    Martín Furones, Angel; Anquela Julián, Ana Belén; Dimas-Pages, Alejandro; Cos-Gayón, Fernando

    2017-08-01

    Precise point positioning (PPP) is a well established Global Navigation Satellite System (GNSS) technique that only requires information from the receiver (or rover) to obtain high-precision position coordinates. This is a very interesting and promising technique because it eliminates the need for a reference station near the rover receiver or a network of reference stations, thus reducing the cost of a GNSS survey. From a computational perspective, there are two ways to solve the system of observation equations produced by static PPP: either in a single step (so-called batch adjustment) or with a sequential adjustment/filter. The results of each should be the same if both are well implemented. However, if a sequential solution (that is, not only the final coordinates, but also those of previous GNSS epochs) is needed, as for convergence studies, finding a batch solution becomes a very time-consuming task owing to the matrix inversions that accumulate with each consecutive epoch. This is not a problem for the filter solution, which uses information computed in the previous epoch to obtain the solution for the current epoch. Filter implementations, however, need extra consideration of user dynamics and of parameter state variation between observation epochs, with appropriate stochastic updates of parameter variances from epoch to epoch. These filtering considerations are not needed in batch adjustment, which makes it attractive. The main objective of this research is to significantly reduce the computation time required to obtain sequential results using batch adjustment. The new method we implemented in the adjustment process led to a mean reduction in computational time of 45%.
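
    The batch-versus-filter contrast can be seen in miniature by estimating a single constant from accumulating observations: the batch estimate at epoch k re-solves using all k observations, while the sequential estimate updates the previous epoch's solution in O(1) per epoch. This toy sketch (not the authors' method, which involves full matrix adjustments) shows that both yield identical epoch-wise solutions:

```python
obs = [10.2, 9.8, 10.1, 10.4, 9.9]   # one scalar observation per epoch

# Batch: at each epoch, re-solve using all observations so far (the
# least-squares estimate of a constant is simply the mean).
batch = [sum(obs[:k]) / k for k in range(1, len(obs) + 1)]

# Sequential: update the previous epoch's estimate in O(1) per epoch.
seq, x = [], 0.0
for k, z in enumerate(obs, start=1):
    x += (z - x) / k
    seq.append(x)

assert all(abs(a - b) < 1e-12 for a, b in zip(batch, seq))
```

    The naive batch cost grows with each epoch because the whole system is re-solved; the paper's contribution is to cut that accumulated cost while keeping the batch formulation.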

  19. Study of viscous flow about airfoils by the integro-differential method

    NASA Technical Reports Server (NTRS)

    Wu, J. C.; Sampath, S.

    1975-01-01

    An integro-differential method was used for numerically solving unsteady incompressible viscous flow problems. A computer program was prepared to solve the problem of an impulsively started 9% thick symmetric Joukowski airfoil at an angle of attack of 15 deg and a Reynolds number of 1000. Some of the results obtained for this problem were discussed and compared with related work completed previously. Two numerical procedures were used, an Alternating Direction Implicit (ADI) method and a Successive Line Relaxation (SLR) method. Generally, the ADI solution agrees well with the SLR solution and with previous results at stations away from the trailing edge. At the trailing edge station, the ADI solution differs substantially from previous results, while the vorticity profiles obtained there from the SLR method are in good qualitative agreement with previous results.

  20. Antitumor Activity of Lankacidin Group Antibiotics Is Due to Microtubule Stabilization via a Paclitaxel-like Mechanism.

    PubMed

    Ayoub, Ahmed Taha; Abou El-Magd, Rabab M; Xiao, Jack; Lewis, Cody Wayne; Tilli, Tatiana Martins; Arakawa, Kenji; Nindita, Yosi; Chan, Gordon; Sun, Luxin; Glover, Mark; Klobukowski, Mariusz; Tuszynski, Jack

    2016-10-27

    Lankacidin group antibiotics show strong antimicrobial activity against various Gram-positive bacteria. In addition, they were shown to have considerable antitumor activity against certain cell line models. For decades, the antitumor activity of lankacidin was associated with the mechanism of its antimicrobial action, which is interference with peptide bond formation during protein synthesis. This, however, was never confirmed experimentally. Due to significant similarity to paclitaxel-like hits in a previous computational virtual screening study, we suggested that the cytotoxic effect of lankacidin is due to a paclitaxel-like action. In this study, we tested this hypothesis computationally and experimentally and confirmed that lankacidin is a microtubule stabilizer that enhances tubulin assembly and displaces taxoids from their binding site. This study serves as a starting point for optimization of lankacidin derivatives for better antitumor activities. It also highlights the power of computational predictions and their aid in guiding experiments and formulating rigorous hypotheses.

  1. GPU computing with Kaczmarz’s and other iterative algorithms for linear systems

    PubMed Central

    Elble, Joseph M.; Sahinidis, Nikolaos V.; Vouzis, Panagiotis

    2009-01-01

    The graphics processing unit (GPU) is used to solve large linear systems derived from partial differential equations. The differential equations studied are strongly convection-dominated, of various sizes, and common to many fields, including computational fluid dynamics, heat transfer, and structural mechanics. The paper presents comparisons between GPU and CPU implementations of several well-known iterative methods, including Kaczmarz’s, Cimmino’s, component averaging, conjugate gradient normal residual (CGNR), symmetric successive overrelaxation-preconditioned conjugate gradient, and conjugate-gradient-accelerated component-averaged row projections (CARP-CG). Computations are performed with dense as well as general banded systems. The results demonstrate that our GPU implementation outperforms CPU implementations of these algorithms, as well as previously studied parallel implementations on Linux clusters and shared memory systems. While the CGNR method had begun to fall out of favor for solving such problems, for the problems studied in this paper, the CGNR method implemented on the GPU performed better than the other methods, including a cluster implementation of the CARP-CG method. PMID:20526446
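
    Kaczmarz's method, the first of the iterative schemes compared, cyclically projects the current iterate onto the hyperplane of each equation a_i . x = b_i. A minimal pure-Python sketch on a small consistent system (the paper's GPU implementations target far larger dense and banded systems):

```python
def kaczmarz(A, b, sweeps=100):
    """Cyclically project the iterate onto each equation's hyperplane."""
    x = [0.0] * len(A[0])
    for _ in range(sweeps):
        for a_i, b_i in zip(A, b):
            dot = sum(a * xj for a, xj in zip(a_i, x))
            coef = (b_i - dot) / sum(a * a for a in a_i)
            x = [xj + coef * a for xj, a in zip(x, a_i)]
    return x

A = [[3.0, 1.0], [1.0, 2.0]]
b = [9.0, 8.0]
x = kaczmarz(A, b)   # converges to the solution [2.0, 3.0]
```

    Each row projection touches only one row of A, which is one reason row-projection methods map naturally onto massively parallel hardware.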

  2. Computations of Axisymmetric Flows in Hypersonic Shock Tubes

    NASA Technical Reports Server (NTRS)

    Sharma, Surendra P.; Wilson, Gregory J.

    1995-01-01

    A time-accurate two-dimensional fluid code is used to compute test times in shock tubes operated at supersonic speeds. Unlike previous studies, this investigation resolves the finer temporal details of the shock-tube flow by making use of modern supercomputers and state-of-the-art computational fluid dynamic solution techniques. The code, besides solving the time-dependent fluid equations, also accounts for the finite rate chemistry in the hypersonic environment. The flowfield solutions are used to estimate relevant shock-tube parameters for laminar flow, such as test times, and to predict density and velocity profiles. Boundary-layer parameters such as δ̄_u, δ̄*, and τ̄_w, and test time parameters such as τ̄ and the particle time of flight t_f, are computed and compared with those evaluated by using Mirels' correlations. This article then discusses in detail the effects of flow nonuniformities on particle time of flight behind the normal shock and, consequently, on the interpretation of shock-tube data. This article concludes that for accurate interpretation of shock-tube data, a detailed analysis of flowfield parameters, using a computer code such as the one used in this study, must be performed.

  3. Patient's perceptions of an anesthesia preoperative computerized patient interview.

    PubMed

    Vitkun, S A; Halpern-Lewis, J G; Williams, S A; Gage, J S; Poppers, P J

    1999-12-01

    Our desire to elicit a more complete medical history from our patients led to the implementation of a preoperative computerized interview. We previously demonstrated the effectiveness of the interview by computing its mean completion time for the overall patient population (n = 120), and further examined the effects of age, gender, and educational level. In this study, we investigated patient perception of the interview itself. Before and after taking the computer interview, we asked the patients to complete a paper-and-pencil questionnaire of sixteen questions expressing their feelings toward the computer interview. Responses elicited prior to taking the computer interview were compared with those obtained afterward. The Stuart-Maxwell test was used to determine statistically significant differences in answers before and after the interview. Initial questionnaire responses reflected a positive attitude toward computer usage, which became even stronger after the interview. The only negative responses elicited were really more "doctor positive" than "computer negative." We conclude that patients looked favorably upon participating in a computerized medical interview, provided that physician-patient contact is maintained.

  4. An Experimental Comparison Between Flexible and Rigid Airfoils at Low Reynolds Numbers

    NASA Astrophysics Data System (ADS)

    Uzodinma, Jaylon; Macphee, David

    2017-11-01

    This study uses experimental and computational research methods to compare the aerodynamic performance of rigid and flexible airfoils at a low Reynolds number throughout varying angles of attack. This research can be used to improve the design of small wind turbines, micro-aerial vehicles, and any other devices that operate at low Reynolds numbers. Experimental testing was conducted in the University of Alabama's low-speed wind tunnel, and computational testing was conducted using the open-source CFD code OpenFOAM. For experimental testing, polyurethane-based (rigid) airfoils and silicone-based (flexible) airfoils were constructed using acrylic molds for NACA 0012 and NACA 2412 airfoil profiles. Computer models of the previously-specified airfoils were also created for a computational analysis. Both experimental and computational data were analyzed to examine the critical angles of attack, the lift and drag coefficients, and the occurrence of laminar boundary separation for each airfoil. Moreover, the computational simulations were used to examine the resulting flow fields, in order to provide possible explanations for the aerodynamic performances of each airfoil type. EEC 1659710.

  5. Computing induced velocity perturbations due to a helicopter fuselage in a free stream

    NASA Technical Reports Server (NTRS)

    Berry, John D.; Althoff, Susan L.

    1989-01-01

    The velocity field of a representative helicopter fuselage in a free stream is computed. Perturbation velocities due to the fuselage are computed in a plane above the location of the helicopter rotor (rotor removed). The velocity perturbations computed by a source-panel model of the fuselage are compared with experimental measurements taken with a laser velocimeter. Three paneled fuselage models are studied: fuselage shape, fuselage shape with hub shape, and a body of revolution. The velocity perturbations computed for both fuselage shape models agree well with the measured velocity field except in the close vicinity of the rotor hub. In the hub region, without knowing the extent of separation, modeling of the effective source shape is difficult. The effects of the fuselage perturbations are not well-predicted with a simplified ellipsoid fuselage. The velocity perturbations due to the fuselage at the plane of the measurements have magnitudes of less than 8 percent of free-stream velocity. The velocity perturbations computed by the panel method are tabulated for the same locations at which previously reported rotor-inflow velocity measurements were made.

  6. Brain-Inspired Photonic Signal Processor for Generating Periodic Patterns and Emulating Chaotic Systems

    NASA Astrophysics Data System (ADS)

    Antonik, Piotr; Haelterman, Marc; Massar, Serge

    2017-05-01

    Reservoir computing is a bioinspired computing paradigm for processing time-dependent signals. Its hardware implementations have received much attention because of their simplicity and remarkable performance on a series of benchmark tasks. In previous experiments, the output was uncoupled from the system and, in most cases, simply computed off-line on a postprocessing computer. However, numerical investigations have shown that feeding the output back into the reservoir opens the possibility of long-horizon time-series forecasting. Here, we present a photonic reservoir computer with output feedback, and we demonstrate its capacity to generate periodic time series and to emulate chaotic systems. We study in detail the effect of experimental noise on system performance. In the case of chaotic systems, we introduce several metrics, based on standard signal-processing techniques, to evaluate the quality of the emulation. Our work significantly enlarges the range of tasks that can be solved by hardware reservoir computers and, therefore, the range of applications they could potentially tackle. It also raises interesting questions in nonlinear dynamics and chaos theory.

  7. GPU-accelerated FDTD modeling of radio-frequency field-tissue interactions in high-field MRI.

    PubMed

    Chi, Jieru; Liu, Feng; Weber, Ewald; Li, Yu; Crozier, Stuart

    2011-06-01

    The analysis of high-field RF field-tissue interactions requires high-performance finite-difference time-domain (FDTD) computing. Conventional CPU-based FDTD calculations offer limited computing performance in a PC environment. This study presents a graphics processing unit (GPU)-based parallel-computing framework, producing substantially boosted computing efficiency (with a two-order-of-magnitude speedup factor) at a PC-level cost. Specific details of implementing the FDTD method on a GPU architecture have been presented and the new computational strategy has been successfully applied to the design of a novel 8-element transceive RF coil system at 9.4 T. Facilitated by the powerful GPU-FDTD computing, the new RF coil array offers optimized fields (averaging 25% improvement in sensitivity, and 20% reduction in loop coupling compared with conventional array structures of the same size) for small animal imaging with a robust RF configuration. The GPU-enabled acceleration paves the way for FDTD to be applied for both detailed forward modeling and inverse design of MRI coils, which were previously impractical.
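
    The FDTD kernel that benefits from GPU offloading has a characteristically parallel structure: the E and H fields leapfrog in time, and each grid point is updated only from its immediate neighbours. A minimal 1-D free-space sketch in normalized units (illustrative only; the study's solver is three-dimensional and models tissue properties):

```python
import math

def fdtd_1d(steps=100, size=200, src=100):
    """March E and H fields in leapfrog steps (free space, Courant 0.5)."""
    ez = [0.0] * size
    hy = [0.0] * size
    for t in range(steps):
        for i in range(size - 1):            # H update from neighbouring E
            hy[i] += 0.5 * (ez[i + 1] - ez[i])
        for i in range(1, size):             # E update from neighbouring H
            ez[i] += 0.5 * (hy[i] - hy[i - 1])
        ez[src] += math.exp(-((t - 30.0) ** 2) / 100.0)  # soft Gaussian source
    return ez
```

    Every per-cell update within a half-step is independent of the others, which is exactly the data parallelism a GPU exploits.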

  8. Numerical Evaluation of an Ejector-Enhanced Resonant Pulse Combustor with a Poppet Inlet Valve and a Converging Exhaust Nozzle

    NASA Technical Reports Server (NTRS)

    Yungster, Shaye; Paxson, Daniel E.; Perkins, Hugh D.

    2016-01-01

    A computational investigation of a pressure-gain combustor system for gas turbine applications is presented. The system consists of a valved pulse combustor and an ejector, housed within a shroud. The study focuses on two enhancements to previous models, related to the valve and ejector components. First, a new poppet inlet valve system is investigated, replacing the previously used reed valve configuration. Secondly, a new computational approach to approximating the effects of choked turbine inlet guide vanes present immediately downstream of the Ejector-Enhanced Resonant Pulse Combustor (EERPC) is investigated. Instead of specifying a back pressure at the EERPC exit boundary (as was done in previous studies) the new model adds a converging-diverging (CD) nozzle at the exit of the EERPC. The throat area of the CD nozzle can be adjusted to obtain the desired back pressure level and total mass flow rate. The results presented indicate that the new poppet valve configuration performs nearly as well as the original reed valve system, and that the addition of the CD nozzle is an effective method to approximate the exit boundary effects of a turbine present downstream of the EERPC. Furthermore, it is shown that the more acoustically reflective boundary imposed by a nozzle as compared to a constant pressure surface does not significantly affect operation or performance.

  9. Comparing DNS and Experiments of Subcritical Flow Past an Isolated Surface Roughness Element

    NASA Astrophysics Data System (ADS)

    Doolittle, Charles; Goldstein, David

    2009-11-01

    Results are presented from computational and experimental studies of subcritical roughness within a Blasius boundary layer. This work stems from discrepancies presented by Stephani and Goldstein (AIAA Paper 2009-585) where DNS results did not agree with hot-wire measurements. The near wake regions of cylindrical surface roughness elements corresponding to roughness-based Reynolds numbers Rek of about 202 are of specific concern. Laser-Doppler anemometry and flow visualization in water, as well as the same spectral DNS code used by Stephani and Goldstein are used to obtain both quantitative and qualitative comparisons with previous results. Conclusions regarding previous studies will be presented alongside discussion of current work including grid resolution studies and an examination of vorticity dynamics.

  10. Low-flow frequency and flow duration of selected South Carolina streams in the Broad River basin through March 2008

    USGS Publications Warehouse

    Guimaraes, Wladmir B.; Feaster, Toby D.

    2010-01-01

    Of the 23 streamgaging stations for which recurrence interval computations were made, 14 had low-flow statistics that were published in previous U.S. Geological Survey reports. A comparison of the low-flow statistics for the minimum mean flow for a 7-consecutive-day period with a 10-year recurrence interval (7Q10) from this study with the most recently published values indicated that 8 of the 14 streamgaging stations had values that were within plus or minus 25 percent of the previous value. Ten of the 14 streamgaging stations had negative percent differences indicating the low-flow statistic had decreased since the previous study, and 4 streamgaging stations had positive percent differences indicating that the low-flow statistic had increased since the previous study. The low-flow statistics are influenced by length of record, hydrologic regime under which the record was collected, techniques used to do the analysis, and other changes, such as urbanization, diversions, and so on, that may have occurred in the basin.

  11. Long Penetration Mode Counterflowing Jets for Supersonic Slender Configurations - A Numerical Study

    NASA Technical Reports Server (NTRS)

    Venkatachari, Balaji Shankar; Cheng, Gary; Chang, Chau-Layn; Zichettello, Benjamin; Bilyeu, David L.

    2013-01-01

    A novel approach of using counterflowing jets positioned strategically on the aircraft and exploiting its long penetration mode (LPM) of interaction towards sonic-boom mitigation forms the motivation for this study. Given that most previous studies on the counterflowing LPM jet have all been on blunt bodies and at high supersonic or hypersonic flow conditions, exploring the feasibility to obtain a LPM jet issuing from a slender body against low supersonic freestream conditions is the main focus of this study. Computational fluid dynamics computations of axisymmetric models (cone-cylinder and quartic geometry), of relevance to NASA's High Speed project, are carried out using the space-time conservation element solution element viscous flow solver with unstructured meshes. A systematic parametric study is conducted to determine the optimum combination of counterflowing jet size, mass flow rate, and nozzle geometry for obtaining LPM jets. Details from these computations will be used to assess the potential of the LPM counterflowing supersonic jet as a means of active flow control for enabling supersonic flight over land and to establish the knowledge base for possible future implementation of such technologies.

  12. Formal specification of human-computer interfaces

    NASA Technical Reports Server (NTRS)

    Auernheimer, Brent

    1990-01-01

    A high-level formal specification of a human computer interface is described. Previous work is reviewed and the ASLAN specification language is described. Top-level specifications written in ASLAN for a library and a multiwindow interface are discussed.

  13. Modified-Signed-Digit Optical Computing Using Fan-Out

    NASA Technical Reports Server (NTRS)

    Liu, Hua-Kuang; Zhou, Shaomin; Yeh, Pochi

    1996-01-01

    An experimental optical computing system containing optical fan-out elements implements modified-signed-digit (MSD) arithmetic and logic. In comparison with previous optical implementations of MSD arithmetic, this one is characterized by larger throughput, greater flexibility, and simpler optics.

  14. Performing an allreduce operation on a plurality of compute nodes of a parallel computer

    DOEpatents

    Faraj, Ahmad

    2013-07-09

    Methods, apparatus, and products are disclosed for performing an allreduce operation on a plurality of compute nodes of a parallel computer, each node including at least two processing cores, that include: establishing, for each node, a plurality of logical rings, each ring including a different set of at least one core on that node, each ring including the cores on at least two of the nodes; iteratively for each node: assigning each core of that node to one of the rings established for that node to which the core has not previously been assigned, and performing, for each ring for that node, a global allreduce operation using contribution data for the cores assigned to that ring or any global allreduce results from previous global allreduce operations, yielding current global allreduce results for each core; and performing, for each node, a local allreduce operation using the global allreduce results.
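
    The reduce/allgather pattern underlying ring-style allreduce algorithms can be simulated in plain code. This is an illustrative sketch of a generic ring allreduce (reduce-scatter followed by allgather), not the patented multi-ring, multi-core method: each "node" holds a vector, and afterwards every node holds the element-wise sum.

```python
def ring_allreduce(data):
    p = len(data)                      # nodes in the logical ring
    n = len(data[0])                   # vector length (assumed divisible by p)
    c = n // p                         # chunk size
    buf = [list(v) for v in data]
    # Reduce-scatter: in p-1 steps, partial sums circulate around the ring;
    # afterwards node r holds the complete sum of chunk (r + 1) mod p.
    for step in range(p - 1):
        sends = [(r, (r - step) % p) for r in range(p)]
        chunks = [buf[r][k * c:(k + 1) * c] for r, k in sends]
        for (r, k), chunk in zip(sends, chunks):
            dst = (r + 1) % p
            for i, v in enumerate(chunk):
                buf[dst][k * c + i] += v
    # Allgather: circulate the completed chunks so every node gets them all.
    for step in range(p - 1):
        sends = [(r, (r + 1 - step) % p) for r in range(p)]
        chunks = [buf[r][k * c:(k + 1) * c] for r, k in sends]
        for (r, k), chunk in zip(sends, chunks):
            buf[(r + 1) % p][k * c:(k + 1) * c] = chunk
    return buf

vectors = [[1, 2, 3, 4, 5, 6],
           [10, 20, 30, 40, 50, 60],
           [100, 200, 300, 400, 500, 600]]
out = ring_allreduce(vectors)   # every node: [111, 222, 333, 444, 555, 666]
```

    Each node sends and receives only one chunk per step, so the per-node communication volume stays nearly constant as the ring grows; the patent layers multiple such logical rings over the cores of each node.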

  15. Using a Nondirect Product Basis to Compute J > 0 Rovibrational States of H3+

    NASA Astrophysics Data System (ADS)

    Jaquet, Ralph; Carrington, Tucker

    2013-10-01

    We have used a Lanczos algorithm with a nondirect product basis to compute energy levels of H3+ with J values as large as 46. Energy levels computed on the potential surface of M. Pavanello, et al. (J. Chem. Phys. 2012, 136, 184303) agree well with previous calculations for low J values.

  16. Improving Students' Understanding of Molecular Structure through Broad-Based Use of Computer Models in the Undergraduate Organic Chemistry Lecture

    ERIC Educational Resources Information Center

    Springer, Michael T.

    2014-01-01

    Several articles suggest how to incorporate computer models into the organic chemistry laboratory, but relatively few papers discuss how to incorporate these models broadly into the organic chemistry lecture. Previous research has suggested that "manipulating" physical or computer models enhances student understanding; this study…

  17. Projection multiplex recording of computer-synthesised one-dimensional Fourier holograms for holographic memory systems: mathematical and experimental modelling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Betin, A Yu; Bobrinev, V I; Verenikina, N M

    A multiplex method of recording computer-synthesised one-dimensional Fourier holograms intended for holographic memory devices is proposed. The method potentially allows increasing the recording density in the previously proposed holographic memory system based on the computer synthesis and projection recording of data page holograms.

  18. Re-examination of Magnitude of the AD 869 Jogan Earthquake, a Possible Predecessor of the 2011 Tohoku Earthquake, from Tsunami Deposit Distribution and Computed Inundation Distances

    NASA Astrophysics Data System (ADS)

    Namegaya, Y.; Satake, K.

    2012-12-01

    We re-examined the magnitude of the AD 869 Jogan earthquake by comparing the inland limit of tsunami deposit and computed inundation distance for various fault models. The 869 tsunami deposit is distributed 3-4 km inland from the estimated past shorelines in Ishinomaki and Sendai plains (Shishikura et al., 2007, Annual Report on Active Fault and Paleoearthquake Researches; Sawai et al., 2007 ibid). In the previous studies (Satake et al., 2008 and Namegaya et al. 2010, ibid), we assumed 14 fault models of the Jogan earthquake including outer-rise normal fault, tsunami earthquake, interplate earthquakes, and an active fault in Sendai bay. The computed inundation area from an interplate earthquake with Mw of 8.4 (length: 200 km, width: 100 km, slip 7 m) covers the distribution of tsunami deposits in Ishinomaki and Sendai plains. However, the previous studies yielded the minimum magnitude, because we assumed that the inland limit of tsunami deposits and the computed inundation limit were the same. A post-2011 field survey indicate that the 2011 tsunami inundation distance was about 1.6 times the inland limit of tsunami deposits (e.g. Goto et al., 2011, Marine Geology). In this study, we computed tsunami inundation areas from interplate earthquake with different magnitude, fault length, and slip amount. The moment magnitude ranges from 8.0 to 8.7, the fault length ranges from 100 to 400 km, and the slip ranged from 3 to 9 m. The fault width is fixed at 100 km. The distance ratios of computed inundation to the inland limit of tsunami deposit (Inundation to Deposit Ratio or IDR) were calculated along 8 transects on Sendai and Ishinomaki plains. The results show that IDR increases with magnitude, up to Mw=8.4, when IDR becomes one, or the computed inundation is almost the same as the inland limit of tsunami deposit. IDR increases for a larger magnitude, but at a much smaller rate. 
This confirms that the magnitude of the 869 Jogan earthquake was at least 8.4, but it could have been larger. When we compute the tsunami inundation from the 2011 Tohoku earthquake model (Satake et al., submitted to BSSA) using the 869 topography, IDR becomes 1.5. Considering that the observed ratio of 2011 inundation to deposit limit was 1.6, the magnitude of the 869 earthquake could have been similar to that of the 2011 earthquake.
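As a rough check on the interplate fault model quoted in this record (length 200 km, width 100 km, slip 7 m, Mw 8.4), the implied moment magnitude can be sketched from the standard seismic-moment definition. The rigidity value below is an assumption (a typical subduction-zone figure), not taken from the record.

```python
# Sketch: moment magnitude Mw from fault length, width, and slip.
# mu = 4e10 Pa is an assumed typical rigidity, not stated in the record.
import math

def moment_magnitude(length_m, width_m, slip_m, mu=4.0e10):
    """Mw = (2/3)(log10 M0 - 9.1), with seismic moment M0 = mu*L*W*D in N*m."""
    m0 = mu * length_m * width_m * slip_m
    return (2.0 / 3.0) * (math.log10(m0) - 9.1)

mw = moment_magnitude(200e3, 100e3, 7.0)
print(round(mw, 1))  # -> 8.4, consistent with the fault model in the abstract
```

With the assumed rigidity, the quoted fault dimensions indeed reproduce Mw 8.4, which is why the abstract treats that model as the minimum-magnitude case.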

  19. Atrial Fibrillation Screening in Nonmetropolitan Areas Using a Telehealth Surveillance System With an Embedded Cloud-Computing Algorithm: Prospective Pilot Study

    PubMed Central

    Chen, Ying-Hsien; Hung, Chi-Sheng; Huang, Ching-Chang; Hung, Yu-Chien

    2017-01-01

    Background Atrial fibrillation (AF) is a common form of arrhythmia that is associated with increased risk of stroke and mortality. Detecting AF before the first complication occurs is a recognized priority. No previous studies have examined the feasibility of undertaking AF screening using a telehealth surveillance system with an embedded cloud-computing algorithm; we address this issue in this study. Objective The objective of this study was to evaluate the feasibility of AF screening in nonmetropolitan areas using a telehealth surveillance system with an embedded cloud-computing algorithm. Methods We conducted a prospective AF screening study in a nonmetropolitan area using a single-lead electrocardiogram (ECG) recorder. All ECG measurements were reviewed on the telehealth surveillance system and interpreted by the cloud-computing algorithm and a cardiologist. The process of AF screening was evaluated with a satisfaction questionnaire. Results Between March 11, 2016 and August 31, 2016, 967 ECGs were recorded from 922 residents in nonmetropolitan areas. A total of 22 (2.4%, 22/922) residents with AF were identified by the physician’s ECG interpretation, and only 0.2% (2/967) of ECGs contained significant artifacts. The novel cloud-computing algorithm for AF detection had a sensitivity of 95.5% (95% CI 77.2%-99.9%) and specificity of 97.7% (95% CI 96.5%-98.5%). The overall satisfaction score for the process of AF screening was 92.1%. Conclusions AF screening in nonmetropolitan areas using a telehealth surveillance system with an embedded cloud-computing algorithm is feasible. PMID:28951384
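The reported accuracy figures follow from a standard confusion-matrix computation. The counts below are reconstructed assumptions chosen to reproduce the reported numbers (22 AF residents, sensitivity 95.5%, specificity 97.7%); the paper's exact tabulation may differ.

```python
# Sketch: sensitivity and specificity of the cloud-computing AF detector.
# The confusion-matrix counts are assumed, back-calculated from the
# reported 95.5% sensitivity and 97.7% specificity, not taken verbatim.
tp, fn = 21, 1      # detector-positive / detector-negative among 22 AF cases
tn, fp = 879, 21    # among the remaining 900 non-AF residents (assumed split)

sensitivity = tp / (tp + fn)
specificity = tn / (tn + fp)
print(f"{sensitivity:.1%} {specificity:.1%}")  # -> 95.5% 97.7%
```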

  20. Computational chemistry

    NASA Technical Reports Server (NTRS)

    Arnold, J. O.

    1987-01-01

With the advent of supercomputers and modern computational chemistry algorithms and codes, a powerful tool was created to help fill NASA's continuing need for information on the properties of matter in hostile or unusual environments. Computational resources provided under the Numerical Aerodynamic Simulation (NAS) program were a cornerstone for recent advancements in this field. Properties of gases, materials, and their interactions can be determined from solutions of the governing equations. In the case of gases, for example, radiative transition probabilities per particle, bond-dissociation energies, and rates of simple chemical reactions can be determined computationally as reliably as from experiment. The data are proving to be quite valuable in providing inputs to real-gas flow simulation codes used to compute aerothermodynamic loads on NASA's aeroassist orbital transfer vehicles and a host of problems related to the National Aerospace Plane Program. Although more approximate, similar solutions can be obtained for ensembles of atoms simulating small particles of materials with and without the presence of gases. Computational chemistry also has applications in studying catalysis and the properties of polymers, all of interest to various NASA missions, including those previously mentioned. In addition to discussing these applications of computational chemistry within NASA, the governing equations and the need for supercomputers for their solution are outlined.

  1. Computational techniques in gamma-ray skyshine analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    George, D.L.

    1988-12-01

Two computer codes were developed to analyze gamma-ray skyshine, the scattering of gamma photons by air molecules. A review of previous gamma-ray skyshine studies discusses several Monte Carlo codes, programs using a single-scatter model, and the MicroSkyshine program for microcomputers. A benchmark gamma-ray skyshine experiment performed at Kansas State University is also described. A single-scatter numerical model is presented which traces photons from the source to their first scatter, then applies a buildup factor along a direct path from the scattering point to a detector. The FORTRAN code SKY, developed with this model before the present study, was modified to use Gauss quadrature, recent photon attenuation data, and a more accurate buildup approximation. The resulting code, SILOGP, computes response from a point photon source on the axis of a silo, with and without concrete shielding over the opening. Another program, WALLGP, was developed using the same model to compute response from a point gamma source behind a perfectly absorbing wall, with and without shielding overhead. 29 refs., 48 figs., 13 tabs.
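The single-scatter model described above can be sketched as a point kernel: an uncollided flux from source to scatter point, times a buildup-corrected attenuation kernel from scatter point to detector. The attenuation coefficient, scatter probability, and Taylor-form buildup coefficients below are illustrative placeholders, not the values used in SKY, SILOGP, or WALLGP.

```python
# Sketch of the single-scatter skyshine kernel: photon travels r1 from the
# source to a scatter point in air, then a buildup-corrected direct path of
# length r2 to the detector. All numerical constants are illustrative.
import math

MU_AIR = 8.7e-3  # assumed total attenuation coefficient of air, 1/m (~1 MeV)

def taylor_buildup(mu_r, a=24.0, alpha1=-0.092, alpha2=0.0108):
    """Taylor-form buildup factor B(mu*r); coefficients are placeholders."""
    return a * math.exp(-alpha1 * mu_r) + (1.0 - a) * math.exp(-alpha2 * mu_r)

def single_scatter_contribution(source_strength, r1, r2, scatter_prob):
    """Uncollided flux at the scatter point times the buildup-corrected
    point kernel from scatter point to detector (per unit scatter volume)."""
    uncollided = source_strength * math.exp(-MU_AIR * r1) / (4.0 * math.pi * r1**2)
    kernel = (scatter_prob * taylor_buildup(MU_AIR * r2)
              * math.exp(-MU_AIR * r2) / (4.0 * math.pi * r2**2))
    return uncollided * kernel

print(single_scatter_contribution(1e6, 100.0, 200.0, 1e-4))
```

In the actual codes this contribution would be integrated over the scattering volume (the Gauss quadrature mentioned above) with an angle-dependent Klein-Nishina scatter probability.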

  2. Quasi-static earthquake cycle simulation based on nonlinear viscoelastic finite element analyses

    NASA Astrophysics Data System (ADS)

    Agata, R.; Ichimura, T.; Hyodo, M.; Barbot, S.; Hori, T.

    2017-12-01

Simulation methods for earthquake cycles have been studied to explain earthquake generation processes. For such simulations, the combination of the rate- and state-dependent friction law on the fault plane and the boundary integral method based on Green's functions in an elastic half space is widely used (e.g., Hori 2009; Barbot et al. 2012). In this approach, the stress change around the fault plane due to crustal deformation can be computed analytically, while the effects of complex physics such as mantle rheology and gravity are generally not taken into account. To consider such effects, we seek to develop an earthquake cycle simulation combining crustal deformation computation based on the finite element (FE) method with the rate- and state-dependent friction law. Since the drawback of this approach is the computational cost of obtaining numerical solutions, we adopt a recently developed fast and scalable FE solver (Ichimura et al. 2016), which assumes the use of supercomputers, to solve the problem in a realistic time. As in the previous approach, we solve the governing equations, which include the rate- and state-dependent friction law. In solving the equations, we compute stress changes along the fault plane due to crustal deformation using FE simulation, instead of computing them by superimposing slip response functions as in the previous approach. In the stress change computation, we take into account nonlinear viscoelastic deformation in the asthenosphere. In the presentation, we will show simulation results for a normative three-dimensional problem, where a circular velocity-weakening area is set in a square fault plane. The results with and without nonlinear viscosity in the asthenosphere will be compared. We also plan to apply the developed code to simulate the post-earthquake deformation of a megathrust earthquake, such as the 2011 Tohoku earthquake.
Acknowledgment: The results were obtained using the K computer at the RIKEN (Proposal number hp160221).
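The rate- and state-dependent friction law at the heart of this kind of simulation can be illustrated with a zero-dimensional spring-slider, where a spring stiffness stands in for the stress-change computation (boundary integral or FE). This is a minimal sketch with the aging law and illustrative, velocity-strengthening parameters, not the authors' solver.

```python
# Sketch: quasi-static spring-slider with rate- and state-dependent friction
# (aging law). All parameter values are illustrative assumptions.
import math

a, b = 0.015, 0.010          # rate/state coefficients (velocity-strengthening)
mu0, v0 = 0.6, 1e-6          # reference friction and reference slip rate (m/s)
d_c = 1e-4                   # state-evolution distance (m)
sigma = 50e6                 # effective normal stress (Pa)
k = 1e7                      # loading stiffness standing in for stress change (Pa/m)
v_pl = 1e-9                  # plate (loading) rate (m/s)

theta = d_c / v0             # state variable, start at steady state for v0
tau = mu0 * sigma            # shear stress consistent with v = v0
v = v0
for _ in range(100_000):
    dt = 0.05 * d_c / v                      # adaptive explicit time step
    theta += dt * (1.0 - v * theta / d_c)    # aging law: d(theta)/dt = 1 - v*theta/Dc
    tau += dt * k * (v_pl - v)               # elastic loading/unloading
    # invert mu = mu0 + a*ln(v/v0) + b*ln(v0*theta/Dc) for the slip rate
    v = v0 * math.exp((tau / sigma - mu0 - b * math.log(v0 * theta / d_c)) / a)

print(f"slip rate relaxed from {v0:.0e} m/s toward v_pl: {v:.1e} m/s")
```

Because a > b the slider is velocity-strengthening and relaxes smoothly to the plate rate; a velocity-weakening patch (b > a), as in the circular area described above, instead produces stick-slip cycles.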

  3. The cardiac muscle duplex as a method to study myocardial heterogeneity

    PubMed Central

    Solovyova, O.; Katsnelson, L.B.; Konovalov, P.V.; Kursanov, A.G.; Vikulova, N.A.; Kohl, P.; Markhasin, V.S.

    2014-01-01

    This paper reviews the development and application of paired muscle preparations, called duplex, for the investigation of mechanisms and consequences of intra-myocardial electro-mechanical heterogeneity. We illustrate the utility of the underlying combined experimental and computational approach for conceptual development and integration of basic science insight with clinically relevant settings, using previously published and new data. Directions for further study are identified. PMID:25106702

  4. Computational Models for Calcium-Mediated Astrocyte Functions.

    PubMed

    Manninen, Tiina; Havela, Riikka; Linne, Marja-Leena

    2018-01-01

The computational neuroscience field has heavily concentrated on the modeling of neuronal functions, largely ignoring other brain cells, including one type of glial cell, the astrocytes. Despite the short history of modeling astrocytic functions, we were delighted about the hundreds of models developed so far to study the role of astrocytes, most often in calcium dynamics, synchronization, information transfer, and plasticity in vitro, but also in vascular events, hyperexcitability, and homeostasis. Our goal here is to present the state-of-the-art in computational modeling of astrocytes in order to facilitate better understanding of the functions and dynamics of astrocytes in the brain. Due to the large number of models, we concentrated on a hundred models that include biophysical descriptions for calcium signaling and dynamics in astrocytes. We categorized the models into four groups: single astrocyte models, astrocyte network models, neuron-astrocyte synapse models, and neuron-astrocyte network models to ease their use in future modeling projects. We characterized the models based on which earlier models were used for building the models and which type of biological entities were described in the astrocyte models. Features of the models were compared and contrasted so that similarities and differences were more readily apparent. We discovered that most of the models were basically generated from a small set of previously published models with small variations. However, neither citations to all the previous models with similar core structure nor explanations of what was built on top of the previous models were provided, which made it possible, in some cases, to have the same models published several times without an explicit intention to make new predictions about the roles of astrocytes in brain functions. Furthermore, only a few of the models are available online which makes it difficult to reproduce the simulation results and further develop the models. 
Thus, we would like to emphasize that only via reproducible research are we able to build better computational models for astrocytes, which truly advance science. Our study is the first to characterize in detail the biophysical and biochemical mechanisms that have been modeled for astrocytes.

  5. Computational Models for Calcium-Mediated Astrocyte Functions

    PubMed Central

    Manninen, Tiina; Havela, Riikka; Linne, Marja-Leena

    2018-01-01

    The computational neuroscience field has heavily concentrated on the modeling of neuronal functions, largely ignoring other brain cells, including one type of glial cell, the astrocytes. Despite the short history of modeling astrocytic functions, we were delighted about the hundreds of models developed so far to study the role of astrocytes, most often in calcium dynamics, synchronization, information transfer, and plasticity in vitro, but also in vascular events, hyperexcitability, and homeostasis. Our goal here is to present the state-of-the-art in computational modeling of astrocytes in order to facilitate better understanding of the functions and dynamics of astrocytes in the brain. Due to the large number of models, we concentrated on a hundred models that include biophysical descriptions for calcium signaling and dynamics in astrocytes. We categorized the models into four groups: single astrocyte models, astrocyte network models, neuron-astrocyte synapse models, and neuron-astrocyte network models to ease their use in future modeling projects. We characterized the models based on which earlier models were used for building the models and which type of biological entities were described in the astrocyte models. Features of the models were compared and contrasted so that similarities and differences were more readily apparent. We discovered that most of the models were basically generated from a small set of previously published models with small variations. However, neither citations to all the previous models with similar core structure nor explanations of what was built on top of the previous models were provided, which made it possible, in some cases, to have the same models published several times without an explicit intention to make new predictions about the roles of astrocytes in brain functions. Furthermore, only a few of the models are available online which makes it difficult to reproduce the simulation results and further develop the models. 
Thus, we would like to emphasize that only via reproducible research are we able to build better computational models for astrocytes, which truly advance science. Our study is the first to characterize in detail the biophysical and biochemical mechanisms that have been modeled for astrocytes. PMID:29670517

  6. Multiscale QM/MM molecular dynamics study on the first steps of guanine damage by free hydroxyl radicals in solution.

    PubMed

    Abolfath, Ramin M; Biswas, P K; Rajnarayanam, R; Brabec, Thomas; Kodym, Reinhard; Papiez, Lech

    2012-04-19

Understanding the damage to DNA bases from hydrogen abstraction by free OH radicals is of particular importance to understanding the indirect effect of ionizing radiation. Previous studies addressed the problem with truncated DNA bases, because the ab initio quantum simulations required to study such electronic-spin-dependent processes are computationally expensive. Here, for the first time, we employ a multiscale, hybrid quantum mechanical-molecular mechanical simulation to study the interaction of OH radicals with a guanine-deoxyribose-phosphate DNA molecular unit in the presence of water, where all of the water molecules and the deoxyribose-phosphate fragment are treated with the simpler classical molecular mechanical scheme. Our result illustrates that the presence of water strongly alters the hydrogen-abstraction reaction, as the hydrogen bonding of OH radicals with water restricts the relative orientation of the OH radicals with respect to the DNA base (here, guanine). This results in an angular anisotropy in the chemical pathway and a lower efficiency in the hydrogen-abstraction mechanisms than previously anticipated for identical systems in vacuum. The method can easily be extended to single- and double-stranded DNA without any appreciable computational cost, as these molecular units can be treated in the classical subsystem, as has been demonstrated here. © 2012 American Chemical Society

  7. Work-related health disorders among Saudi computer users.

    PubMed

    Jomoah, Ibrahim M

    2014-01-01

The present study was conducted to investigate the prevalence of musculoskeletal disorders and eye and vision complaints among the computer users of King Abdulaziz University (KAU), Saudi Arabian Airlines (SAUDIA), and Saudi Telecom Company (STC). Stratified random samples of the work stations and operators at each of the studied institutions were selected; the ergonomics of the work stations were assessed and the operators' health complaints were investigated. The average ergonomic scores of the studied work stations at STC, KAU, and SAUDIA were 81.5%, 73.3%, and 70.3%, respectively. Most of the examined operators use computers daily for ≤ 7 hours, yet they had moderate incidences of general complaints (e.g., headache, body fatigue, and lack of concentration) and relatively high incidences of eye and vision complaints and musculoskeletal complaints. The incidences of the complaints were found to increase with (a) decrease in work station ergonomic score, (b) progress of age and duration of employment, (c) smoking, (d) use of computers, (e) lack of work satisfaction, and (f) history of operators' previous ailments. It is recommended to improve the ergonomics of the work stations, set up training programs, and conduct preplacement and periodical examinations for operators.

  8. Work-Related Health Disorders among Saudi Computer Users

    PubMed Central

    Jomoah, Ibrahim M.

    2014-01-01

The present study was conducted to investigate the prevalence of musculoskeletal disorders and eye and vision complaints among the computer users of King Abdulaziz University (KAU), Saudi Arabian Airlines (SAUDIA), and Saudi Telecom Company (STC). Stratified random samples of the work stations and operators at each of the studied institutions were selected; the ergonomics of the work stations were assessed and the operators' health complaints were investigated. The average ergonomic scores of the studied work stations at STC, KAU, and SAUDIA were 81.5%, 73.3%, and 70.3%, respectively. Most of the examined operators use computers daily for ≤ 7 hours, yet they had moderate incidences of general complaints (e.g., headache, body fatigue, and lack of concentration) and relatively high incidences of eye and vision complaints and musculoskeletal complaints. The incidences of the complaints were found to increase with (a) decrease in work station ergonomic score, (b) progress of age and duration of employment, (c) smoking, (d) use of computers, (e) lack of work satisfaction, and (f) history of operators' previous ailments. It is recommended to improve the ergonomics of the work stations, set up training programs, and conduct preplacement and periodical examinations for operators. PMID:25383379

  9. Large Eddy Simulation of "turbulent-like" flow in intracranial aneurysms

    NASA Astrophysics Data System (ADS)

    Khan, Muhammad Owais; Chnafa, Christophe; Steinman, David A.; Mendez, Simon; Nicoud, Franck

    2016-11-01

Hemodynamic forces are thought to contribute to the pathogenesis and rupture of intracranial aneurysms (IA). Recent high-resolution patient-specific computational fluid dynamics (CFD) simulations have highlighted the presence of "turbulent-like" flow features, characterized by transient high-frequency flow instabilities. In-vitro studies have shown that such "turbulent-like" flows can lead to lack of endothelial cell orientation and cell depletion, and thus may also have relevance to IA rupture risk assessment. From a modelling perspective, previous studies have relied on DNS to resolve the small-scale structures in these flows. While accurate, DNS is clinically infeasible due to its high computational cost and long simulation times. In this study, we present the applicability of LES for IAs using an LES/blood-flow dedicated solver (YALES2BIO) and compare against the respective DNS. As a qualitative analysis, we compute time-averaged WSS and OSI maps, as well as novel frequency-based WSS indices. As a quantitative analysis, we show the differences in POD eigenspectra for LES vs. DNS and a wavelet analysis of intra-saccular velocity traces. Differences between two SGS models (Dynamic Smagorinsky vs. Sigma) are also assessed against DNS, and the computational gains of LES are discussed.
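The oscillatory shear index (OSI) named above is a standard post-processing quantity: it measures how much the wall-shear-stress (WSS) vector reverses direction over a cardiac cycle. A minimal sketch, using synthetic sinusoidal WSS histories rather than solver output:

```python
# Sketch: OSI = 0.5 * (1 - |<tau>| / <|tau|>) at one wall point,
# computed from an (n_t, 3) time series of WSS vectors. The two
# synthetic signals below are illustrative limiting cases.
import numpy as np

def osi(wss):
    """Oscillatory shear index of a WSS vector history (rows = time samples)."""
    mean_vec = np.linalg.norm(wss.mean(axis=0))       # magnitude of time-mean vector
    mean_mag = np.linalg.norm(wss, axis=1).mean()     # time-mean of magnitudes
    return 0.5 * (1.0 - mean_vec / mean_mag)

t = np.linspace(0.0, 1.0, 400, endpoint=False)
zeros = np.zeros_like(t)
steady = np.stack([np.ones_like(t), zeros, zeros], axis=1)       # unidirectional WSS
reversing = np.stack([np.sin(2 * np.pi * t), zeros, zeros], axis=1)  # fully reversing

print(osi(steady))               # -> 0.0
print(round(osi(reversing), 3))  # -> 0.5
```

OSI ranges from 0 (purely unidirectional shear) to 0.5 (purely oscillatory shear), which is why it pairs naturally with time-averaged WSS maps in rupture-risk studies.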

  10. Computerized content analysis of some adolescent writings of Napoleon Bonaparte: a test of the validity of the method.

    PubMed

    Gottschalk, Louis A; DeFrancisco, Don; Bechtel, Robert J

    2002-08-01

    The aim of this study was to test the validity of a computer software program previously demonstrated to be capable of making DSM-IV neuropsychiatric diagnoses from the content analysis of speech or verbal texts. In this report, the computer program was applied to three personal writings of Napoleon Bonaparte when he was 12 to 16 years of age. The accuracy of the neuropsychiatric evaluations derived from the computerized content analysis of these writings of Napoleon was independently corroborated by two biographers who have described pertinent details concerning his life situations, moods, and other emotional reactions during this adolescent period of his life. The relevance of this type of computer technology to psychohistorical research and clinical psychiatry is suggested.

  11. Domain decomposition for aerodynamic and aeroacoustic analyses, and optimization

    NASA Technical Reports Server (NTRS)

    Baysal, Oktay

    1995-01-01

The overarching theme was domain decomposition, which was intended to improve the numerical solution technique for the partial differential equations at hand; in the present study, those that governed either the fluid flow, the aeroacoustic wave propagation, or the sensitivity analysis for a gradient-based optimization. The role of the domain decomposition extended beyond the original impetus of discretizing geometrically complex regions or writing modular software for distributed-hardware computers. It induced function-space decompositions and operator decompositions that offered the valuable property of near independence of operator evaluation tasks. The objectives gravitated toward the extensions and implementations of methodologies either previously developed or concurrently under development: (1) aerodynamic sensitivity analysis with domain decomposition (SADD); (2) computational aeroacoustics of cavities; and (3) dynamic, multibody computational fluid dynamics using unstructured meshes.

  12. Direct coal liquefaction baseline design and system analysis. Quarterly report, January--March 1991

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1991-04-01

The primary objective of the study is to develop a computer model for a base line direct coal liquefaction design based on two stage direct coupled catalytic reactors. This primary objective is to be accomplished by completing the following: a base line design based on previous DOE/PETC results from Wilsonville pilot plant and other engineering evaluations; a cost estimate and economic analysis; a computer model incorporating the above two steps over a wide range of capacities and selected process alternatives; a comprehensive training program for DOE/PETC staff to understand and use the computer model; a thorough documentation of all underlying assumptions for baseline economics; and a user manual and training material which will facilitate updating of the model in the future.

  13. Direct coal liquefaction baseline design and system analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1991-04-01

The primary objective of the study is to develop a computer model for a base line direct coal liquefaction design based on two stage direct coupled catalytic reactors. This primary objective is to be accomplished by completing the following: a base line design based on previous DOE/PETC results from Wilsonville pilot plant and other engineering evaluations; a cost estimate and economic analysis; a computer model incorporating the above two steps over a wide range of capacities and selected process alternatives; a comprehensive training program for DOE/PETC staff to understand and use the computer model; a thorough documentation of all underlying assumptions for baseline economics; and a user manual and training material which will facilitate updating of the model in the future.

  14. General rigid motion correction for computed tomography imaging based on locally linear embedding

    NASA Astrophysics Data System (ADS)

    Chen, Mianyi; He, Peng; Feng, Peng; Liu, Baodong; Yang, Qingsong; Wei, Biao; Wang, Ge

    2018-02-01

Patient motion can degrade the quality of computed tomography images, which are typically acquired in cone-beam geometry. In cone-beam geometry, rigid patient motion is characterized by six geometric parameters and is more challenging to correct than in fan-beam geometry. We extend our previous rigid patient motion correction method based on the principle of locally linear embedding (LLE) from fan-beam to cone-beam geometry and accelerate the computational procedure with the graphics processing unit (GPU)-based All Scale Tomographic Reconstruction Antwerp (ASTRA) toolbox. The major merit of our method is that we need neither fiducial markers nor motion-tracking devices. The numerical and experimental studies show that the LLE-based patient motion correction is capable of calibrating the six parameters of the patient motion simultaneously, reducing patient motion artifacts significantly.
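The core step of LLE that this correction method builds on is computing, for each sample, the affine weights that best reconstruct it from its neighbours. A minimal sketch of that weight solve, with toy 2-D points standing in for the projection-domain samples:

```python
# Sketch: LLE reconstruction weights. For a sample x and its k neighbours,
# solve  min || x - sum_j w_j * n_j ||^2  subject to  sum_j w_j = 1,
# via the local Gram matrix. The 2-D toy data are illustrative only.
import numpy as np

def lle_weights(x, neighbors, reg=1e-8):
    """Affine weights reconstructing x from its neighbours (rows of `neighbors`)."""
    z = neighbors - x                                 # shift neighbours to the origin
    c = z @ z.T                                       # local Gram matrix
    c += reg * np.trace(c) * np.eye(len(neighbors))   # regularize (near-singular C)
    w = np.linalg.solve(c, np.ones(len(neighbors)))
    return w / w.sum()                                # enforce sum-to-one

x = np.array([0.5, 0.5])
neighbors = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
w = lle_weights(x, neighbors)
print(w, w.sum())  # symmetric neighbours -> equal weights summing to 1
```

In the motion-correction setting, the key property is that these weights are invariant under local linear maps, which is what lets the method relate measured projections to reprojections of candidate motion states without external tracking.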

  15. Simulation using computer-piloted point excitations of vibrations induced on a structure by an acoustic environment

    NASA Astrophysics Data System (ADS)

    Monteil, P.

    1981-11-01

Computation of the overall levels and spectral densities of the responses measured on a launcher skin, the fairing for instance, immersed in a random acoustic environment during takeoff, was studied. The analysis of the transmission of these vibrations to the payload required the simulation of these responses by a shaker control system using a small number of distributed shakers. Results show that this closed-loop computerized digital system allows the acquisition of auto- and cross-spectral densities equal to those of the responses previously computed. However, wider application is sought, e.g., for road and runway profiles. The problems of multiple input-output system identification, multiple true random signal generation, and real-time programming are discussed. The system should allow for the control of four shakers.

  16. Learning Science in a Virtual Reality Application: The Impacts of Animated-Virtual Actors' Visual Complexity

    ERIC Educational Resources Information Center

    Kartiko, Iwan; Kavakli, Manolya; Cheng, Ken

    2010-01-01

    As the technology in computer graphics advances, Animated-Virtual Actors (AVAs) in Virtual Reality (VR) applications become increasingly rich and complex. Cognitive Theory of Multimedia Learning (CTML) suggests that complex visual materials could hinder novice learners from attending to the lesson properly. On the other hand, previous studies have…

  17. The Association between Students' Use of an Electronic Voting System and their Learning Outcomes

    ERIC Educational Resources Information Center

    Kennedy, G. E.; Cutts, Q. I.

    2005-01-01

    This paper reports on the use of an electronic voting system (EVS) in a first-year computing science subject. Previous investigations suggest that students' use of an EVS would be positively associated with their learning outcomes. However, no research has established this relationship empirically. This study sought to establish whether there was…

  18. A computational study of pyrolysis reactions of lignin model compounds

    Treesearch

    Thomas Elder

    2010-01-01

Enthalpies of reaction for the initial steps in the pyrolysis of lignin have been evaluated at the CBS-4m level of theory using fully substituted β-O-4 dilignols. Values for competing unimolecular decomposition reactions are consistent with results previously published for phenethyl phenyl ether models, but with lowered selectivity. Chain propagating reactions of free...

  19. Factors Influencing the Effectiveness of Note Taking on Computer-Based Graphic Organizers

    ERIC Educational Resources Information Center

    Crooks, Steven M.; White, David R.; Barnard, Lucy

    2007-01-01

    Previous research on graphic organizer (GO) note taking has shown that this method is most effective when the GO is presented to the student partially complete with provided notes. This study extended prior research by investigating the effects of provided note type (summary vs. verbatim) and GO bite size (large vs. small) on the transfer…

  20. Exploring Factors That Influence Technology-Based Distractions in Bring Your Own Device Classrooms

    ERIC Educational Resources Information Center

    Kay, Robin; Benzimra, Daniel; Li, Jia

    2017-01-01

    Previous research on distractions and the use of mobile devices (personal digital assistants, tablet personal computers, or laptops) have been conducted almost exclusively in higher education. The purpose of the current study was to examine the frequency and influence of distracting behaviors in Bring Your Own Device secondary school classrooms.…

  1. Providing Graduated Corrective Feedback in an Intelligent Computer-Assisted Language Learning Environment

    ERIC Educational Resources Information Center

    Ai, Haiyang

    2017-01-01

    Corrective feedback (CF), a response to linguistic errors made by second language (L2) learners, has received extensive scholarly attention in second language acquisition. While much of the previous research in the field has focused on whether CF facilitates or impedes L2 development, few studies have examined the efficacy of gradually modifying…

  2. The Impact of Receiving the Same Items on Consecutive Computer Adaptive Test Administrations.

    ERIC Educational Resources Information Center

    O'Neill, Thomas; Lunz, Mary E.; Thiede, Keith

    2000-01-01

    Studied item exposure in a computerized adaptive test when the item selection algorithm presents examinees with questions they were asked in a previous test administration. Results with 178 repeat examinees on a medical technologists' test indicate that the combined use of an adaptive algorithm to select items and latent trait theory to estimate…

  3. Satisfaction Clustering Analysis of Distance Education Computer Programming Students: A Sample of Karadeniz Technical University

    ERIC Educational Resources Information Center

    Ozyurt, Hacer

    2014-01-01

In line with recently developing technology, distance education systems based on information technologies have come to be commonly used in higher education. Student satisfaction is one of the vital aspects of maintaining distance education efficiently and achieving its goals. As a matter of fact, previous studies proved that student…

  4. Evaluating the Impact of Instructional Support Using Data Mining and Process Mining: A Micro-Level Analysis of the Effectiveness of Metacognitive Prompts

    ERIC Educational Resources Information Center

    Sonnenberg, Christoph; Bannert, Maria

    2016-01-01

    In computer-supported learning environments, the deployment of self-regulatory skills represents an essential prerequisite for successful learning. Metacognitive prompts are a promising type of instructional support to activate students' strategic learning activities. However, despite positive effects in previous studies, there are still a large…

  5. Multivariate Epi-splines and Evolving Function Identification Problems

    DTIC Science & Technology

    2015-04-15

such extrinsic information as well as observed function and subgradient values often evolve in applications, we establish conditions under which the...previous study [30] dealt with compact intervals of IR. Splines are intimately tied to optimization problems through their variational theory pioneered...approximation. Motivated by applications in curve fitting, regression, probability density estimation, variogram computation, financial curve construction

  6. Photochromic molecular implementations of universal computation.

    PubMed

    Chaplin, Jack C; Krasnogor, Natalio; Russell, Noah A

    2014-12-01

Unconventional computing is an area of research in which novel materials and paradigms are utilised to implement computation. Previously we have demonstrated how registers, logic gates and logic circuits can be implemented, unconventionally, with a biocompatible molecular switch, NitroBIPS, embedded in a polymer matrix. NitroBIPS and related molecules have been shown elsewhere to be capable of modifying many biological processes in a manner that is dependent on its molecular form. Thus, one possible application of this type of unconventional computing is to embed computational processes into biological systems. Here we expand on our earlier proof-of-principle work and demonstrate that universal computation can be implemented using NitroBIPS. We have previously shown that spatially localised computational elements, including registers and logic gates, can be produced. We explain how parallel registers can be implemented, then demonstrate an application of parallel registers in the form of Turing machine tapes, and demonstrate both parallel registers and logic circuits in the form of elementary cellular automata. The Turing machines and elementary cellular automata utilise the same samples and same hardware to implement their registers, logic gates and logic circuits; and both represent examples of universal computing paradigms. This shows that homogeneous photochromic computational devices can be dynamically repurposed without invasive reconfiguration. The result represents an important, necessary step towards demonstrating the general feasibility of interfacial computation embedded in biological systems or other unconventional materials and environments. Copyright © 2014 The Authors. Published by Elsevier Ireland Ltd. All rights reserved.
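The elementary cellular automata mentioned above have a simple software analogue: each cell's next state is read out of an 8-bit rule table indexed by its three-cell neighbourhood. The sketch below uses rule 110, the classic universal ECA; the bit-list encoding is illustrative and unrelated to the NitroBIPS hardware itself.

```python
# Sketch: one synchronous update of a circular elementary cellular automaton.
# rule is the 8-bit Wolfram rule number; rule 110 is known to be universal.
def eca_step(cells, rule=110):
    """Apply the rule table to every 3-cell neighbourhood (periodic boundary)."""
    n = len(cells)
    return [
        (rule >> (cells[(i - 1) % n] << 2 | cells[i] << 1 | cells[(i + 1) % n])) & 1
        for i in range(n)
    ]

row = [0] * 15 + [1] + [0] * 15      # single seed cell
for _ in range(5):
    print("".join(".#"[c] for c in row))
    row = eca_step(row)
```

In the photochromic implementation, the register holding each row and the logic evaluating the rule are realised optically in the same polymer sample, which is what allows the repurposing between Turing-machine tapes and automata described above.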

  7. Low-flow frequency and flow duration of selected South Carolina streams in the Savannah and Salkehatchie River Basins through March 2014

    USGS Publications Warehouse

    Feaster, Toby D.; Guimaraes, Wladmir B.

    2016-07-14

    An ongoing understanding of streamflow characteristics of the rivers and streams in South Carolina is important for the protection and preservation of the State’s water resources. Information concerning the low-flow characteristics of streams is especially important during critical flow periods, such as during the historic droughts that South Carolina has experienced in the past few decades. In 2008, the U.S. Geological Survey, in cooperation with the South Carolina Department of Health and Environmental Control, initiated a study to update low-flow statistics at continuous-record streamgaging stations operated by the U.S. Geological Survey in South Carolina. This report presents the low-flow statistics for 28 selected streamgaging stations in the Savannah and Salkehatchie River Basins in South Carolina. The low-flow statistics include daily mean flow durations for the 5-, 10-, 25-, 50-, 75-, 90-, and 95-percent probability of exceedance and the annual minimum 1-, 3-, 7-, 14-, 30-, 60-, and 90-day mean flows with recurrence intervals of 2, 5, 10, 20, 30, and 50 years, depending on the length of record available at the streamgaging station. The low-flow statistics were computed from records available through March 31, 2014. Low-flow statistics are influenced by length of record, hydrologic regime under which the data were collected, analytical techniques used, and other factors, such as urbanization, diversions, and droughts that may have occurred in the basin. To assess changes in the low-flow statistics from the previously published values, a comparison of the low-flow statistics for the annual minimum 7-day average streamflow with a 10-year recurrence interval (7Q10) from this study was made with the most recently published values. Of the 28 streamgaging stations for which recurrence interval computations were made, 14 streamgaging stations were suitable for comparing to low-flow statistics that were previously published in U.S. Geological Survey reports.
These comparisons indicated that seven of the streamgaging stations had values lower than the previous values, two streamgaging stations had values higher than the previous values, and two streamgaging stations had values that were unchanged from previous values. The remaining three stations for which previous 7Q10 values were computed, which are located on the main stem of the Savannah River, were not compared with current estimates because of differences in the way the pre-regulation and regulated flow data were analyzed.
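    As a rough illustration of the first step behind a statistic like the 7Q10, the sketch below computes the annual minimum 7-day mean flow from a daily record. This is a hypothetical example with made-up data, not the report's method; an actual 7Q10 additionally requires fitting a frequency distribution (commonly log-Pearson Type III) to the annual minima to obtain the 10-year recurrence value.

```python
# Hypothetical sketch: annual minimum 7-day mean flow (the "7Q" series),
# the first step toward a 7Q10 low-flow statistic. The data below are
# synthetic and illustrative, not from the USGS report.

def annual_min_7day(daily_flows_by_year):
    """For each year, return the minimum of all 7-day moving averages."""
    result = {}
    for year, flows in daily_flows_by_year.items():
        if len(flows) < 7:
            continue  # not enough record to form a single 7-day window
        means = [sum(flows[i:i + 7]) / 7 for i in range(len(flows) - 6)]
        result[year] = min(means)
    return result

# Tiny synthetic record: steady 10 cfs except a 7-day drought at 2 cfs.
flows = [10.0] * 100 + [2.0] * 7 + [10.0] * 100
series = annual_min_7day({2013: flows})
print(series[2013])  # the drought week dominates: 2.0
```

A production calculation would, of course, handle the climatic year boundaries, missing days, and the recurrence-interval fit that this sketch omits.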

  8. A computational neural approach to support the discovery of gene function and classes of cancer.

    PubMed

    Azuaje, F

    2001-03-01

    Advances in molecular classification of tumours may play a central role in cancer treatment. Here, a novel approach to genome expression pattern interpretation is described and applied to the recognition of B-cell malignancies as a test set. Using cDNA microarrays data generated by a previous study, a neural network model known as simplified fuzzy ARTMAP is able to identify normal and diffuse large B-cell lymphoma (DLBCL) patients. Furthermore, it discovers the distinction between patients with molecularly distinct forms of DLBCL without previous knowledge of those subtypes.

  9. Transient Three-Dimensional Side Load Analysis of Out-of-Round Film Cooled Nozzles

    NASA Technical Reports Server (NTRS)

    Wang, Ten-See; Lin, Jeff; Ruf, Joe; Guidos, Mike

    2010-01-01

    The objective of this study is to investigate the effect of nozzle out-of-roundness on the transient startup side loads at a high altitude, with an anchored computational methodology. The out-of-roundness could be the result of asymmetric loads induced by hardware attached to the nozzle, asymmetric internal stresses induced by previous tests, and deformation, such as creep, from previous tests. The rocket engine studied encompasses a regeneratively cooled thrust chamber and a film cooled nozzle extension with film coolant distributed from a turbine exhaust manifold. The computational methodology is based on an unstructured-grid, pressure-based computational fluid dynamics formulation, and a transient inlet history based on an engine system simulation. Transient startup computations were performed with the out-of-roundness achieved by four different degrees of ovalization: one perfectly round, one slightly out-of-round, one more out-of-round, and one significantly out-of-round. The results show that the separation-line jump is the peak side load physics for the round, slightly out-of-round, and more out-of-round cases, and the peak side load increases as the degree of out-of-roundness increases. For the significantly out-of-round nozzle, however, the peak side load drops to a level comparable to that of the round nozzle, and the separation-line jump is not the peak side load physics. The counter-intuitive result of the significantly out-of-round case is found to be related to a side force reduction mechanism that splits the effect of the separation-line jump into two parts, not only in the circumferential direction but also, most importantly, in time.

  10. Accurate and efficient calculation of response times for groundwater flow

    NASA Astrophysics Data System (ADS)

    Carr, Elliot J.; Simpson, Matthew J.

    2018-03-01

    We study measures of the amount of time required for transient flow in heterogeneous porous media to effectively reach steady state, also known as the response time. Here, we develop a new approach that extends the concept of mean action time. Previous applications of the theory of mean action time to estimate the response time use the first two central moments of the probability density function associated with the transition from the initial condition, at t = 0, to the steady state condition that arises in the long time limit, as t → ∞ . This previous approach leads to a computationally convenient estimation of the response time, but the accuracy can be poor. Here, we outline a powerful extension using the first k raw moments, showing how to produce an extremely accurate estimate by making use of asymptotic properties of the cumulative distribution function. Results are validated using an existing laboratory-scale data set describing flow in a homogeneous porous medium. In addition, we demonstrate how the results also apply to flow in heterogeneous porous media. Overall, the new method is: (i) extremely accurate; and (ii) computationally inexpensive. In fact, the computational cost of the new method is orders of magnitude less than the computational effort required to study the response time by solving the transient flow equation. Furthermore, the approach provides a rigorous mathematical connection with the heuristic argument that the response time for flow in a homogeneous porous medium is proportional to L2 / D , where L is a relevant length scale, and D is the aquifer diffusivity. Here, we extend such heuristic arguments by providing a clear mathematical definition of the proportionality constant.
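    The L2/D scaling mentioned above can be made concrete in the simplest setting. Assuming 1-D diffusion on [0, L] with fixed boundary values (a textbook special case, not the paper's heterogeneous problem), the slowest transient mode decays like exp(-π²Dt/L²), which yields an explicit response-time estimate:

```python
import math

# Minimal sketch (not the paper's moment-based method): for 1-D diffusion
# on [0, L] with fixed-value boundaries, the transient part of the solution
# decays like exp(-pi^2 * D * t / L^2), so the time to come within a
# tolerance eps of steady state scales as L^2 / D with an explicit constant.

def response_time(L, D, eps=0.01):
    """Time for the slowest mode to decay to a fraction eps of its start."""
    return -math.log(eps) * L**2 / (math.pi**2 * D)

t = response_time(L=100.0, D=50.0)  # illustrative units
print(round(t, 2))
```

Doubling L quadruples the estimate, which is exactly the heuristic proportionality the abstract refers to; the paper's contribution is a rigorous, accurate proportionality constant that extends to heterogeneous media.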

  11. Robot services for elderly with cognitive impairment: testing usability of graphical user interfaces.

    PubMed

    Granata, C; Pino, M; Legouverneur, G; Vidal, J-S; Bidaud, P; Rigaud, A-S

    2013-01-01

    Socially assistive robotics for elderly care is a growing field. However, although robotics has the potential to support the elderly in daily tasks by offering specific services, the development of usable interfaces is still a challenge. Several factors, such as age- or disease-related changes in perceptual or cognitive abilities and familiarity with computer technologies, influence technology use and must be considered when designing interfaces for these users. This paper presents findings from usability testing of two different services provided by a socially assistive robot intended for the elderly with cognitive impairment: a grocery shopping list and an agenda application. The main goal of this study is to identify the usability problems of the robot interface for target end-users as well as to isolate the human factors that affect the use of the technology by the elderly. Socio-demographic characteristics and computer experience were examined as factors that could influence task performance. A group of 11 elderly persons with Mild Cognitive Impairment and a group of 11 cognitively healthy elderly individuals took part in this study. Performance measures (task completion time and number of errors) were collected. Cognitive profile, age, and computer experience were found to impact task performance. Participants with cognitive impairment committed more errors while completing the tasks than the cognitively healthy elderly. Younger participants and those with previous computer experience were faster at completing the tasks, confirming previous findings in the literature. The overall results suggested that the interfaces and contents of the services assessed were usable by older adults with cognitive impairment. However, some usability problems were identified and should be addressed to better meet the needs and capacities of target end-users.

  12. Importance of elastic finite-size effects: Neutral defects in ionic compounds

    DOE PAGES

    Burr, P. A.; Cooper, M. W. D.

    2017-09-15

    Small system sizes are a well known source of error in DFT calculations, yet computational constraints frequently dictate the use of small supercells, often as small as 96 atoms in oxides and compound semiconductors. In ionic compounds, electrostatic finite size effects have been well characterised, but self-interaction of charge neutral defects is often discounted or assumed to follow an asymptotic behaviour and thus easily corrected with linear elastic theory. Here we show that elastic effects are also important in the description of defects in ionic compounds and can lead to qualitatively incorrect conclusions if inadequately small supercells are used; moreover, the spurious self-interaction does not follow the behaviour predicted by linear elastic theory. Considering the exemplar cases of metal oxides with fluorite structure, we show that numerous previous studies, employing 96-atom supercells, misidentify the ground state structure of (charge neutral) Schottky defects. We show that the error is eliminated by employing larger cells (324, 768 and 1500 atoms), and careful analysis determines that elastic effects, not electrostatic, are responsible. The spurious self-interaction was also observed in non-oxide ionic compounds irrespective of the computational method used, thereby resolving long standing discrepancies between DFT and force-field methods, previously attributed to the level of theory. The surprising magnitude of the elastic effects is a cautionary tale for defect calculations in ionic materials, particularly when employing computationally expensive methods (e.g. hybrid functionals) or when modelling large defect clusters. We propose two computationally practicable methods to test the magnitude of the elastic self-interaction in any ionic system. In commonly studied oxides, where electrostatic effects would be expected to be dominant, it is the elastic effects that dictate the need for larger supercells: greater than 96 atoms.

  13. Objective Assessment of the Interfrontal Angle for Severity Grading and Operative Decision-Making in Metopic Synostosis.

    PubMed

    Anolik, Rachel A; Allori, Alexander C; Pourtaheri, Navid; Rogers, Gary F; Marcus, Jeffrey R

    2016-05-01

    The purpose of this study was to evaluate the utility of a previously validated interfrontal angle for classification of severity of metopic synostosis and as an aid to operative decision-making. An expert panel was asked to study 30 cases ranging from minor to severe metopic synostosis. Based on computed tomographic images of the skull and clinical photographs, they classified the severity of trigonocephaly (1 = normal, 2 = mild, 3 = moderate, and 4 = severe) and management (0 = nonoperative and 1 = operative). The severity scores and management reported by experts were then pooled and matched with the interfrontal angle computed from each respective computed tomographic scan. A threshold was identified at which most experts agree on operative management. Expert severity scores were higher for more acute interfrontal angles. There was a high concordance at the extremes of classifications, severe (4) and normal (1) (p < 0.0001); however, between interfrontal angles of 114.3 and 136.1 degrees, there exists a "gray zone," with severe discordance in expert rankings. An operative threshold of 118.2 degrees was identified, with the interfrontal angle able to predict the expert panel's decision to proceed with surgery 87.6 percent of the time. The interfrontal angle has been previously validated as a simple, accurate, and reproducible means for diagnosing trigonocephaly, but must be obtained from computed tomographic data. In this article, the authors demonstrate that the interfrontal angle can be used to further characterize the severity of trigonocephaly. It also correlated with expert decision-making for operative versus nonoperative management. This tool may be used as an adjunct to clinical decision-making when the decision to proceed with surgery may not be straightforward. Diagnostic, V.
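    To make the reported thresholds concrete, here is a toy classifier built only from the numbers in the abstract (118.2-degree operative threshold, 114.3 to 136.1-degree gray zone); the function name and the zone labels are ours, not the authors' tool, and the zone mapping assumes that more acute angles indicate greater severity, as the abstract states.

```python
# Toy illustration of the decision thresholds reported above. The angle
# values come from the abstract; the function and labels are hypothetical.

def metopic_assessment(interfrontal_angle_deg):
    """Classify an interfrontal angle against the reported thresholds."""
    # Below 118.2 degrees, the angle predicted the expert panel's decision
    # to operate 87.6 percent of the time.
    operative = interfrontal_angle_deg < 118.2
    if interfrontal_angle_deg < 114.3:
        zone = "severe"
    elif interfrontal_angle_deg <= 136.1:
        zone = "gray zone"  # severe discordance in expert rankings here
    else:
        zone = "normal range"
    return operative, zone

print(metopic_assessment(110.0))  # (True, 'severe')
print(metopic_assessment(125.0))  # (False, 'gray zone')
```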

  14. Importance of elastic finite-size effects: Neutral defects in ionic compounds

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Burr, P. A.; Cooper, M. W. D.

    Small system sizes are a well known source of error in DFT calculations, yet computational constraints frequently dictate the use of small supercells, often as small as 96 atoms in oxides and compound semiconductors. In ionic compounds, electrostatic finite size effects have been well characterised, but self-interaction of charge neutral defects is often discounted or assumed to follow an asymptotic behaviour and thus easily corrected with linear elastic theory. Here we show that elastic effects are also important in the description of defects in ionic compounds and can lead to qualitatively incorrect conclusions if inadequately small supercells are used; moreover, the spurious self-interaction does not follow the behaviour predicted by linear elastic theory. Considering the exemplar cases of metal oxides with fluorite structure, we show that numerous previous studies, employing 96-atom supercells, misidentify the ground state structure of (charge neutral) Schottky defects. We show that the error is eliminated by employing larger cells (324, 768 and 1500 atoms), and careful analysis determines that elastic effects, not electrostatic, are responsible. The spurious self-interaction was also observed in non-oxide ionic compounds irrespective of the computational method used, thereby resolving long standing discrepancies between DFT and force-field methods, previously attributed to the level of theory. The surprising magnitude of the elastic effects is a cautionary tale for defect calculations in ionic materials, particularly when employing computationally expensive methods (e.g. hybrid functionals) or when modelling large defect clusters. We propose two computationally practicable methods to test the magnitude of the elastic self-interaction in any ionic system. In commonly studied oxides, where electrostatic effects would be expected to be dominant, it is the elastic effects that dictate the need for larger supercells: greater than 96 atoms.

  15. Importance of elastic finite-size effects: Neutral defects in ionic compounds

    NASA Astrophysics Data System (ADS)

    Burr, P. A.; Cooper, M. W. D.

    2017-09-01

    Small system sizes are a well-known source of error in density functional theory (DFT) calculations, yet computational constraints frequently dictate the use of small supercells, often as small as 96 atoms in oxides and compound semiconductors. In ionic compounds, electrostatic finite-size effects have been well characterized, but self-interaction of charge-neutral defects is often discounted or assumed to follow an asymptotic behavior and thus easily corrected with linear elastic theory. Here we show that elastic effects are also important in the description of defects in ionic compounds and can lead to qualitatively incorrect conclusions if inadequately small supercells are used; moreover, the spurious self-interaction does not follow the behavior predicted by linear elastic theory. Considering the exemplar cases of metal oxides with fluorite structure, we show that numerous previous studies, employing 96-atom supercells, misidentify the ground-state structure of (charge-neutral) Schottky defects. We show that the error is eliminated by employing larger cells (324, 768, and 1500 atoms), and careful analysis determines that elastic, not electrostatic, effects are responsible. The spurious self-interaction was also observed in nonoxide ionic compounds irrespective of the computational method used, thereby resolving long-standing discrepancies between DFT and force-field methods, previously attributed to the level of theory. The surprising magnitude of the elastic effects is a cautionary tale for defect calculations in ionic materials, particularly when employing computationally expensive methods (e.g., hybrid functionals) or when modeling large defect clusters. We propose two computationally practicable methods to test the magnitude of the elastic self-interaction in any ionic system. 
In commonly studied oxides, where electrostatic effects would be expected to be dominant, it is the elastic effects that dictate the need for larger supercells: greater than 96 atoms.

  16. Computer use and stress, sleep disturbances, and symptoms of depression among young adults – a prospective cohort study

    PubMed Central

    2012-01-01

    Background We have previously studied prospective associations between computer use and mental health symptoms in a selected young adult population. The purpose of this study was to investigate if high computer use is a prospective risk factor for developing mental health symptoms in a population-based sample of young adults. Methods The study group was a cohort of young adults (n = 4163), 20–24 years old, who responded to a questionnaire at baseline and 1-year follow-up. Exposure variables included time spent on computer use (CU) in general, email/chat use, computer gaming, CU without breaks, and CU at night causing lost sleep. Mental health outcomes included perceived stress, sleep disturbances, symptoms of depression, and reduced performance due to stress, depressed mood, or tiredness. Prevalence ratios (PRs) were calculated for prospective associations between exposure variables at baseline and mental health outcomes (new cases) at 1-year follow-up for the men and women separately. Results Both high and medium computer use compared to low computer use at baseline were associated with sleep disturbances in the men at follow-up. High email/chat use was negatively associated with perceived stress, but positively associated with reported sleep disturbances for the men. For the women, high email/chat use was (positively) associated with several mental health outcomes, while medium computer gaming was associated with symptoms of depression, and CU without breaks with most mental health outcomes. CU causing lost sleep was associated with mental health outcomes for both men and women. Conclusions Time spent on general computer use was prospectively associated with sleep disturbances and reduced performance for the men. For the women, using the computer without breaks was a risk factor for several mental health outcomes. Some associations were enhanced in interaction with mobile phone use. 
Using the computer at night and consequently losing sleep was associated with most mental health outcomes for both men and women. Further studies should focus on mechanisms relating information and communication technology (ICT) use to sleep disturbances. PMID:23088719

  17. Computer use and stress, sleep disturbances, and symptoms of depression among young adults--a prospective cohort study.

    PubMed

    Thomée, Sara; Härenstam, Annika; Hagberg, Mats

    2012-10-22

    We have previously studied prospective associations between computer use and mental health symptoms in a selected young adult population. The purpose of this study was to investigate if high computer use is a prospective risk factor for developing mental health symptoms in a population-based sample of young adults. The study group was a cohort of young adults (n = 4163), 20-24 years old, who responded to a questionnaire at baseline and 1-year follow-up. Exposure variables included time spent on computer use (CU) in general, email/chat use, computer gaming, CU without breaks, and CU at night causing lost sleep. Mental health outcomes included perceived stress, sleep disturbances, symptoms of depression, and reduced performance due to stress, depressed mood, or tiredness. Prevalence ratios (PRs) were calculated for prospective associations between exposure variables at baseline and mental health outcomes (new cases) at 1-year follow-up for the men and women separately. Both high and medium computer use compared to low computer use at baseline were associated with sleep disturbances in the men at follow-up. High email/chat use was negatively associated with perceived stress, but positively associated with reported sleep disturbances for the men. For the women, high email/chat use was (positively) associated with several mental health outcomes, while medium computer gaming was associated with symptoms of depression, and CU without breaks with most mental health outcomes. CU causing lost sleep was associated with mental health outcomes for both men and women. Time spent on general computer use was prospectively associated with sleep disturbances and reduced performance for the men. For the women, using the computer without breaks was a risk factor for several mental health outcomes. Some associations were enhanced in interaction with mobile phone use. 
Using the computer at night and consequently losing sleep was associated with most mental health outcomes for both men and women. Further studies should focus on mechanisms relating information and communication technology (ICT) use to sleep disturbances.

  18. Testing the Use of Implicit Solvent in the Molecular Dynamics Modelling of DNA Flexibility

    NASA Astrophysics Data System (ADS)

    Mitchell, J.; Harris, S.

    DNA flexibility controls packaging, looping and in some cases sequence specific protein binding. Molecular dynamics simulations carried out with a computationally efficient implicit solvent model are potentially a powerful tool for studying larger DNA molecules than can be currently simulated when water and counterions are represented explicitly. In this work we compare DNA flexibility at the base pair step level modelled using an implicit solvent model to that previously determined from explicit solvent simulations and database analysis. Although much of the sequence dependent behaviour is preserved in implicit solvent, the DNA is considerably more flexible when the approximate model is used. In addition we test the ability of the implicit solvent to model stress induced DNA disruptions by simulating a series of DNA minicircle topoisomers which vary in size and superhelical density. When compared with previously run explicit solvent simulations, we find that while the levels of DNA denaturation are similar using both computational methodologies, the specific structural form of the disruptions is different.

  19. Design of a serotonin 4 receptor radiotracer with decreased lipophilicity for single photon emission computed tomography.

    PubMed

    Fresneau, Nathalie; Dumas, Noé; Tournier, Benjamin B; Fossey, Christine; Ballandonne, Céline; Lesnard, Aurélien; Millet, Philippe; Charnay, Yves; Cailly, Thomas; Bouillon, Jean-Philippe; Fabis, Frédéric

    2015-04-13

    With the aim to develop a suitable radiotracer for the brain imaging of the serotonin 4 receptor subtype (5-HT4R) using single photon emission computed tomography (SPECT), we synthesized and evaluated a library of di- and triazaphenanthridines with lipophilicity values which were in the range expected to favour brain penetration, and which demonstrated specific binding to the target of interest. Adding additional nitrogen atoms to previously described phenanthridine ligands exhibiting a high unspecific binding, we were able to design a radioiodinated compound [(125)I]14. This compound exhibited a binding affinity value of 0.094 nM toward human 5-HT4R and a high selectivity over other serotonin receptor subtypes (5-HTR). In vivo SPECT imaging studies and competition experiments demonstrated that the decreased lipophilicity (in comparison with our previously reported compounds 4 and 5) allowed a more specific labelling of the 5-HT4R brain-containing regions. Copyright © 2015 Elsevier Masson SAS. All rights reserved.

  20. Development of a multiple-parameter nonlinear perturbation procedure for transonic turbomachinery flows: Preliminary application to design/optimization problems

    NASA Technical Reports Server (NTRS)

    Stahara, S. S.; Elliott, J. P.; Spreiter, J. R.

    1983-01-01

    An investigation was conducted to continue the development of perturbation procedures and associated computational codes for rapidly determining approximations to nonlinear flow solutions, with the purpose of establishing a method for minimizing the computational requirements associated with parametric design studies of transonic flows in turbomachines. The results reported here concern the extension of the previously developed, successful method for single-parameter perturbations to simultaneous multiple-parameter perturbations, and the preliminary application of the multiple-parameter procedure, in combination with an optimization method, to a blade design/optimization problem. In order to provide as severe a test as possible of the method, attention is focused in particular on transonic flows which are highly supercritical. Flows past both isolated blades and compressor cascades, involving simultaneous changes in both flow and geometric parameters, are considered. Comparisons with the corresponding exact nonlinear solutions display remarkable accuracy and range of validity, in direct correspondence with previous results for single-parameter perturbations.

  1. Physiology driven adaptivity for the numerical solution of the bidomain equations.

    PubMed

    Whiteley, Jonathan P

    2007-09-01

    Previous work [Whiteley, J. P. IEEE Trans. Biomed. Eng. 53:2139-2147, 2006] derived a stable, semi-implicit numerical scheme for solving the bidomain equations. This scheme allows the timestep used when solving the bidomain equations numerically to be chosen by accuracy considerations rather than stability considerations. In this study we modify this scheme to allow an adaptive numerical solution in both time and space. The spatial mesh size is determined by the gradient of the transmembrane and extracellular potentials while the timestep is determined by the values of: (i) the fast sodium current; and (ii) the calcium release from junctional sarcoplasmic reticulum to myoplasm current. For two-dimensional simulations presented here, combining the numerical algorithm in the paper cited above with the adaptive algorithm presented here leads to an increase in computational efficiency by a factor of around 250 over previous work, together with significantly less computational memory being required. The speedup for three-dimensional simulations is likely to be more impressive.
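    A generic sketch of the idea of letting a fast ionic current control the timestep (illustrative only; the names, constants, and clamping rule are ours, not the paper's actual algorithm):

```python
# Generic sketch of current-controlled timestep selection, in the spirit
# of the adaptive scheme described above but not the paper's algorithm:
# the step shrinks when the fast sodium current I_Na is large (upstroke)
# and grows toward dt_max during slow phases. All values are illustrative.

def choose_timestep(I_Na, dt_min=0.01, dt_max=1.0, tol=5.0):
    """Return a timestep in [dt_min, dt_max], smaller for larger |I_Na|."""
    if I_Na == 0:
        return dt_max
    dt = tol / abs(I_Na)          # error-proportional step size
    return max(dt_min, min(dt_max, dt))

print(choose_timestep(0.1))    # quiescent tissue -> large step
print(choose_timestep(500.0))  # upstroke -> clamped to dt_min
```

In the paper's scheme a second fast current (calcium release from the junctional sarcoplasmic reticulum) also constrains the step, and the spatial mesh is adapted separately from the potential gradients.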

  2. An analysis of electronic health record-related patient safety incidents.

    PubMed

    Palojoki, Sari; Mäkelä, Matti; Lehtonen, Lasse; Saranto, Kaija

    2017-06-01

    The aim of this study was to analyse electronic health record-related patient safety incidents in the patient safety incident reporting database in fully digital hospitals in Finland. We compare Finnish data to similar international data and discuss their content with regard to the literature. We analysed the types of electronic health record-related patient safety incidents that occurred at 23 hospitals during a 2-year period. A procedure of taxonomy mapping served to allow comparisons. This study represents a rare examination of patient safety risks in a fully digital environment. The proportion of electronic health record-related incidents was markedly higher in our study than in previous studies with similar data. Human-computer interaction problems were the most frequently reported. The results show the possibility of error arising from the complex interaction between clinicians and computers.

  3. An Experimental Study in the Use of Computer-Based Instruction to Teach Automated Spreadsheet Functions

    DTIC Science & Technology

    1991-09-01

    review of past CBI studies was conducted to provide the researcher a theoretical knowledge base on the effectiveness and efficiency of CBI. A summary...Literature Review Findings on Ways to Measure CBI Effectiveness and Efficiency. The literature included previously conducted CBI experiments, studies, and...nine choices on each main and submenu (14:16). 3) Allow the student to make a menu selection with upper or lower case entries (28:291). 4) Prevent

  4. Data Mining of Extremely Large Ad-Hoc Data Sets to Produce Reverse Web-Link Graphs

    DTIC Science & Technology

    2017-03-01

    in most of the MR cases. From these studies, we also learned that computing-optimized instances should be chosen for serialized/compressed input data...maximum 200 words) Data mining can be a valuable tool, particularly in the acquisition of military intelligence. As the second study within a larger Naval...open web crawler data set Common Crawl. Similar to previous studies, this research employs MapReduce (MR) for sorting and categorizing output value

  5. Singlet Oxygen and Free Radical Reactions of Retinoids and Carotenoids—A Review

    PubMed Central

    Truscott, T. George

    2018-01-01

    We report on studies of reactions of singlet oxygen with carotenoids and retinoids and a range of free radical studies on carotenoids and retinoids with emphasis on recent work, dietary carotenoids and the role of oxygen in biological processes. Many previous reviews are cited and updated together with new data not previously reviewed. The review does not deal with computational studies but the emphasis is on laboratory-based results. We contrast the ease of study of both singlet oxygen and polyene radical cations compared to neutral radicals. Of particular interest is the switch from anti- to pro-oxidant behavior of a carotenoid with change of oxygen concentration: results for lycopene in a cellular model system show total protection of the human cells studied at zero oxygen concentration, but zero protection at 100% oxygen concentration. PMID:29301252

  6. Analysis of components of variance in multiple-reader studies of computer-aided diagnosis with different tasks

    NASA Astrophysics Data System (ADS)

    Beiden, Sergey V.; Wagner, Robert F.; Campbell, Gregory; Metz, Charles E.; Chan, Heang-Ping; Nishikawa, Robert M.; Schnall, Mitchell D.; Jiang, Yulei

    2001-06-01

    In recent years, the multiple-reader, multiple-case (MRMC) study paradigm has become widespread for receiver operating characteristic (ROC) assessment of systems for diagnostic imaging and computer-aided diagnosis. We review how MRMC data can be analyzed in terms of the multiple components of the variance (case, reader, interactions) observed in those studies. Such information is useful for the design of pivotal studies from results of a pilot study and also for studying the effects of reader training. Recently, several of the present authors have demonstrated methods to generalize the analysis of multiple variance components to the case where unaided readers of diagnostic images are compared with readers who receive the benefit of a computer assist (CAD). For this case it is necessary to model the possibility that several of the components of variance might be reduced when readers incorporate the computer assist, compared to the unaided reading condition. We review results of this kind of analysis on three previously published MRMC studies, two of which were applications of CAD to diagnostic mammography and one was an application of CAD to screening mammography. The results for the three cases are seen to differ, depending on the reader population sampled and the task of interest. Thus, it is not possible to generalize a particular analysis of variance components beyond the tasks and populations actually investigated.

  7. Interfacing External Quantum Devices to a Universal Quantum Computer

    PubMed Central

    Lagana, Antonio A.; Lohe, Max A.; von Smekal, Lorenz

    2011-01-01

    We present a scheme for using external quantum devices with the universal quantum computer previously constructed. We thereby show how the universal quantum computer can utilize networked quantum information resources to carry out local computations. Such information may come from specialized quantum devices or even from remote universal quantum computers. We show how to accomplish this by devising universal quantum computer programs that implement well-known oracle-based quantum algorithms, namely the Deutsch, Deutsch-Jozsa, and Grover algorithms, using external black-box quantum oracle devices. In the process, we demonstrate a method to map existing quantum algorithms onto the universal quantum computer. PMID:22216276

  8. Interfacing external quantum devices to a universal quantum computer.

    PubMed

    Lagana, Antonio A; Lohe, Max A; von Smekal, Lorenz

    2011-01-01

    We present a scheme for using external quantum devices with the universal quantum computer previously constructed. We thereby show how the universal quantum computer can utilize networked quantum information resources to carry out local computations. Such information may come from specialized quantum devices or even from remote universal quantum computers. We show how to accomplish this by devising universal quantum computer programs that implement well-known oracle-based quantum algorithms, namely the Deutsch, Deutsch-Jozsa, and Grover algorithms, using external black-box quantum oracle devices. In the process, we demonstrate a method to map existing quantum algorithms onto the universal quantum computer. © 2011 Lagana et al.
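
    The oracle-query idea behind the Deutsch algorithm named above can be illustrated with a plain two-qubit state-vector simulation. This is a generic sketch of the algorithm, not the authors' universal-quantum-computer program encoding; the oracle is supplied as an ordinary black-box function.

```python
# Minimal state-vector simulation of the Deutsch algorithm: one query to a
# black-box oracle U_f decides whether f:{0,1}->{0,1} is constant or balanced.
import math

def h(state, pos):
    """Hadamard gate on the qubit at bit position `pos` of a 2-qubit state."""
    s = 1 / math.sqrt(2)
    out = [0.0] * 4
    for i, amp in enumerate(state):
        i0 = i & ~(1 << pos)          # basis index with that bit cleared
        i1 = i | (1 << pos)           # basis index with that bit set
        out[i0] += s * amp
        out[i1] += s * amp if (i >> pos) & 1 == 0 else -s * amp
    return out

def oracle(state, f):
    """Black box U_f |x, y> = |x, y XOR f(x)>; basis index = 2*x + y."""
    out = [0.0] * 4
    for i, amp in enumerate(state):
        x, y = i >> 1, i & 1
        out[(x << 1) | (y ^ f(x))] += amp
    return out

def deutsch(f):
    """Return 0 if f is constant, 1 if f is balanced, using one oracle call."""
    state = [0.0, 1.0, 0.0, 0.0]      # start in |x=0, y=1>
    state = h(state, 1)               # H on x (bit position 1)
    state = h(state, 0)               # H on y (bit position 0)
    state = oracle(state, f)
    state = h(state, 1)               # H on x again
    p_x1 = sum(a * a for i, a in enumerate(state) if (i >> 1) & 1)
    return 1 if p_x1 > 0.5 else 0
```

    Measuring the first qubit yields 0 with certainty for a constant oracle and 1 for a balanced one, which is the single-query advantage the abstract refers to.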

  9. Understanding titanium-catalysed radical-radical reactions: a DFT study unravels the complex kinetics of ketone-nitrile couplings.

    PubMed

    Streuff, Jan; Himmel, Daniel; Younas, Sara L

    2018-04-03

    The computational investigation of a titanium-catalysed reductive radical-radical coupling is reported. The results match the conclusions from an earlier experimental study and enable a further interpretation of the previously observed complex reaction kinetics. Furthermore, the interplay between neutral and cationic reaction pathways in titanium(iii)-catalysed reactions is investigated for the first time. The results show that hydrochloride additives and reaction byproducts play an important role in the respective equilibria. A full reaction profile is assembled and the computed activation barrier is found to be in reasonable agreement with the experiment. The conclusions are of fundamental importance to the field of low-valent titanium catalysis and the understanding of related catalytic radical-radical coupling reactions.

  10. Mixed convection flow of viscoelastic fluid by a stretching cylinder with heat transfer.

    PubMed

    Hayat, Tasawar; Anwar, Muhammad Shoaib; Farooq, Muhammad; Alsaedi, Ahmad

    2015-01-01

    Flow of viscoelastic fluid due to an impermeable stretching cylinder is discussed. Effects of mixed convection and variable thermal conductivity are present. Thermal conductivity is taken to be temperature dependent. The nonlinear partial differential system is reduced to a nonlinear ordinary differential system. The resulting nonlinear system is solved for convergent series solutions. Numerical values of the skin friction coefficient and Nusselt number are computed and discussed. The results obtained with the current method agree with previous studies using other methods as well as with theoretical expectations. A physical interpretation reflecting the contribution of the influential parameters to the present flow is presented. It is hoped that the present study serves as a stimulus for modeling further stretching flows, especially in polymeric and paper production processes.

  11. Mixed Convection Flow of Viscoelastic Fluid by a Stretching Cylinder with Heat Transfer

    PubMed Central

    Hayat, Tasawar; Anwar, Muhammad Shoaib; Farooq, Muhammad; Alsaedi, Ahmad

    2015-01-01

    Flow of viscoelastic fluid due to an impermeable stretching cylinder is discussed. Effects of mixed convection and variable thermal conductivity are present. Thermal conductivity is taken to be temperature dependent. The nonlinear partial differential system is reduced to a nonlinear ordinary differential system. The resulting nonlinear system is solved for convergent series solutions. Numerical values of the skin friction coefficient and Nusselt number are computed and discussed. The results obtained with the current method agree with previous studies using other methods as well as with theoretical expectations. A physical interpretation reflecting the contribution of the influential parameters to the present flow is presented. It is hoped that the present study serves as a stimulus for modeling further stretching flows, especially in polymeric and paper production processes. PMID:25775032

  12. Bubble nucleation in simple and molecular liquids via the largest spherical cavity method

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gonzalez, Miguel A., E-mail: m.gonzalez12@imperial.ac.uk; Department of Chemistry, Imperial College London, London SW7 2AZ; Abascal, José L. F.

    2015-04-21

    In this work, we propose a methodology to compute bubble nucleation free energy barriers using trajectories generated via molecular dynamics simulations. We follow the bubble nucleation process by means of a local order parameter, defined by the volume of the largest spherical cavity (LSC) formed in the nucleating trajectories. This order parameter simplifies considerably the monitoring of the nucleation events, as compared with previous approaches, which require ad hoc criteria to classify the atoms and molecules as liquid or vapor. The combination of the LSC and the mean first passage time technique can then be used to obtain the free energy curves. Upon computation of the cavity distribution function, the nucleation rate and free-energy barrier can then be computed. We test our method against recent computations of bubble nucleation in simple liquids and water at negative pressures. We obtain free-energy barriers in good agreement with the previous works. The LSC method provides a versatile and computationally efficient route to estimate the volume of critical bubbles and the nucleation rate, and to compute bubble nucleation free energies in both simple and molecular liquids.
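
    The LSC order parameter itself is simple to state: the radius of the largest sphere that can be inscribed in the particle configuration without touching any particle. The brute-force grid scan below is an illustrative sketch only (the paper's production implementation is certainly more efficient), treating particles as points in a periodic cubic box.

```python
# Sketch of the largest-spherical-cavity (LSC) order parameter: over a grid of
# candidate centres in a periodic box, the LSC radius is the largest
# centre-to-nearest-particle distance. Brute force, O(grid^3 * N).
import itertools
import math

def lsc_radius(positions, box, grid=10):
    """Largest empty-sphere radius for point particles in a cubic box."""
    best = 0.0
    step = box / grid
    for ix, iy, iz in itertools.product(range(grid), repeat=3):
        centre = ((ix + 0.5) * step, (iy + 0.5) * step, (iz + 0.5) * step)
        nearest = min(
            math.sqrt(sum(min(abs(c - p), box - abs(c - p)) ** 2  # min image
                          for c, p in zip(centre, pos)))
            for pos in positions
        )
        best = max(best, nearest)
    return best

# toy configuration: a dense cubic lattice, then the same lattice with a
# block of particles removed to mimic a nucleating bubble
lattice = [(2.0 * i, 2.0 * j, 2.0 * k)
           for i in range(5) for j in range(5) for k in range(5)]
cavity = [p for p in lattice if not all(4.0 <= c <= 6.0 for c in p)]
r_full = lsc_radius(lattice, box=10.0)
r_cavity = lsc_radius(cavity, box=10.0)
```

    Tracking this radius (or the corresponding cavity volume) along a trajectory shows a sharp jump when a bubble nucleates, without any liquid/vapor classification of individual molecules.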

  13. Computational Investigation of Cerebrospinal Fluid Dynamics in the Posterior Cranial Fossa and Cervical Subarachnoid Space in Patients with Chiari I Malformation.

    PubMed

    Støverud, Karen-Helene; Langtangen, Hans Petter; Ringstad, Geir Andre; Eide, Per Kristian; Mardal, Kent-Andre

    2016-01-01

    Previous computational fluid dynamics (CFD) studies have demonstrated that the Chiari malformation is associated with abnormal cerebrospinal fluid (CSF) flow in the cervical part of the subarachnoid space (SAS), but the flow in the SAS of the posterior cranial fossa has received little attention. This study extends previous modelling efforts by including the cerebellomedullary cistern, pontine cistern, and 4th ventricle in addition to the cervical subarachnoid space. The study included one healthy control, Con1, and two patients with Chiari I malformation, P1 and P2. Meshes were constructed by segmenting images obtained from T2-weighted turbo spin-echo sequences. CFD simulations were performed with a previously verified and validated code, using patient-specific flow conditions in the aqueduct and the cervical SAS. The results demonstrated increased maximal flow velocities in the Chiari patients, ranging from a factor of 5 in P1 to 14.8 in P2, when compared to Con1 at the level of the Foramen Magnum (FM). Maximal velocities in the cervical SAS varied by a factor of 2.3, while the maximal flow in the aqueduct varied by a factor of 3.5. The pressure drop from the pontine cistern to the cervical SAS was similar in Con1 and P1, but a factor of two higher in P2. The pressure drop between the aqueduct and the cervical SAS varied by a factor of 9.4, with P1 showing the lowest pressure drop; P2 and Con1 differed only by a factor of 1.6. This pilot study demonstrates that including the posterior cranial fossa is feasible and suggests that previously found flow differences between Chiari I patients and healthy individuals in the cervical SAS may also be present in the SAS of the posterior cranial fossa.

  14. Controlling flexible robot arms using a high speed dynamics process

    NASA Technical Reports Server (NTRS)

    Jain, Abhinandan (Inventor); Rodriguez, Guillermo (Inventor)

    1992-01-01

    Described here is a robot controller for a flexible manipulator arm having plural bodies connected at respective movable hinges, and flexible in plural deformation modes. It is operated by computing articulated body quantities for each of the bodies from the respective modal spatial influence vectors, obtaining specified body forces for each of the bodies, and computing modal deformation accelerations of the nodes and hinge accelerations of the hinges from the specified body forces, from the articulated body quantities and from the modal spatial influence vectors. In one embodiment of the invention, the controller further operates by comparing the accelerations thus computed to desired manipulator motion to determine a motion discrepancy, and correcting the specified body forces so as to reduce the motion discrepancy. The manipulator bodies and hinges are characterized by respective vectors of deformation and hinge configuration variables. Computing modal deformation accelerations and hinge accelerations is carried out for each of the bodies, beginning with the outermost body, by computing a residual body force from a residual body force of a previous body and computing a resultant hinge acceleration from the body force; then, for each one of the bodies beginning with the innermost body, by computing a modal body acceleration from a modal body acceleration of a previous body and computing a modal deformation acceleration and hinge acceleration from the resultant hinge acceleration and from the modal body acceleration.

  15. [Usage patterns of internet and computer games : Results of an observational study of Tyrolean adolescents].

    PubMed

    Riedl, David; Stöckl, Andrea; Nussbaumer, Charlotte; Rumpold, Gerhard; Sevecke, Kathrin; Fuchs, Martin

    2016-12-01

    The use of digital media such as the internet and computer games has greatly increased. In the western world, almost all young people regularly use these technologies. Against this background, forms of use with possible negative consequences for young people have been recognized and scientifically examined. The aim of our study was therefore to investigate the prevalence of pathological use of these technologies in a sample of young Tyrolean people. 398 students (average age 15.2 years, SD ± 2.3 years, 34.2% female) were surveyed by means of the structured questionnaires CIUS (internet), CSV-S (computer games) and SWE (self-efficacy); sociodemographic data were also collected. In line with previous studies, 7.7% of the adolescents in our sample met criteria for problematic internet use and 3.3% for pathological internet use; 5.4% of the sample reported pathological computer game use. The strongest influence on our results was participant gender: intensive users of the internet and computer games were more often young men, whereas young women showed significantly fewer signs of pathological computer game use. A significant percentage of Tyrolean adolescents showed difficulties in developing competent media use, underlining the growing importance of prevention measures such as media education. In a follow-up project, a sample of adolescents with mental disorders will be examined concerning their media use and compared with our school sample.

  16. Research data collection methods: from paper to tablet computers.

    PubMed

    Wilcox, Adam B; Gallagher, Kathleen D; Boden-Albala, Bernadette; Bakken, Suzanne R

    2012-07-01

    Primary data collection is a critical activity in clinical research. Even with significant advances in technical capabilities, clear benefits of use, and even user preferences for using electronic systems for collecting primary data, paper-based data collection is still common in clinical research settings. However, with recent developments in both clinical research and tablet computer technology, the comparative advantages and disadvantages of data collection methods should be determined. To describe case studies using multiple methods of data collection, including next-generation tablets, and consider their various advantages and disadvantages. We reviewed 5 modern case studies using primary data collection, using methods ranging from paper to next-generation tablet computers. We performed semistructured telephone interviews with each project, which considered factors relevant to data collection. We address specific issues with workflow, implementation and security for these different methods, and identify differences in implementation that led to different technology considerations for each case study. There remain multiple methods for primary data collection, each with its own strengths and weaknesses. Two recent methods are electronic health record templates and next-generation tablet computers. Electronic health record templates can link data directly to medical records, but are notably difficult to use. Current tablet computers are substantially different from previous technologies with regard to user familiarity and software cost. The use of cloud-based storage for tablet computers, however, creates a specific challenge for clinical research that must be considered but can be overcome.

  17. Using a cloud to replenish parched groundwater modeling efforts.

    PubMed

    Hunt, Randall J; Luchette, Joseph; Schreuder, Willem A; Rumbaugh, James O; Doherty, John; Tonkin, Matthew J; Rumbaugh, Douglas B

    2010-01-01

    Groundwater models can be improved by introduction of additional parameter flexibility and simultaneous use of soft-knowledge. However, these sophisticated approaches have high computational requirements. Cloud computing provides unprecedented access to computing power via the Internet to facilitate the use of these techniques. A modeler can create, launch, and terminate "virtual" computers as needed, paying by the hour, and save machine images for future use. Such cost-effective and flexible computing power empowers groundwater modelers to routinely perform model calibration and uncertainty analysis in ways not previously possible.

  18. Using a cloud to replenish parched groundwater modeling efforts

    USGS Publications Warehouse

    Hunt, Randall J.; Luchette, Joseph; Schreuder, Willem A.; Rumbaugh, James O.; Doherty, John; Tonkin, Matthew J.; Rumbaugh, Douglas B.

    2010-01-01

    Groundwater models can be improved by introduction of additional parameter flexibility and simultaneous use of soft-knowledge. However, these sophisticated approaches have high computational requirements. Cloud computing provides unprecedented access to computing power via the Internet to facilitate the use of these techniques. A modeler can create, launch, and terminate “virtual” computers as needed, paying by the hour, and save machine images for future use. Such cost-effective and flexible computing power empowers groundwater modelers to routinely perform model calibration and uncertainty analysis in ways not previously possible.

  19. An Evaluation of Personal Health Information Remnants in Second-Hand Personal Computer Disk Drives

    PubMed Central

    Neri, Emilio; Jonker, Elizabeth

    2007-01-01

    Background The public is concerned about the privacy of their health information, especially as more of it is collected, stored, and exchanged electronically. But we do not know the extent of leakage of personal health information (PHI) from data custodians. One form of data leakage is through computer equipment that is sold, donated, lost, or stolen from health care facilities or individuals who work at these facilities. Previous studies have shown that it is possible to get sensitive personal information (PI) from second-hand disk drives. However, there have been no studies investigating the leakage of PHI in this way. Objectives The aim of the study was to determine the extent to which PHI can be obtained from second-hand computer disk drives. Methods A list of Canadian vendors selling second-hand computer equipment was constructed, and we systematically went through the shuffled list and attempted to purchase used disk drives from the vendors. Sixty functional disk drives were purchased and analyzed for data remnants containing PHI using computer forensic tools. Results It was possible to recover PI from 65% (95% CI: 52%-76%) of the drives. In total, 10% (95% CI: 5%-20%) had PHI on people other than the owner(s) of the drive, and 8% (95% CI: 7%-24%) had PHI on the owner(s) of the drive. Some of the PHI included very sensitive mental health information on a large number of people. Conclusions There is a strong need for health care data custodians to either encrypt all computers that can hold PHI on their clients or patients, including those used by employees and subcontractors in their homes, or to ensure that their computers are destroyed rather than finding a second life in the used computer market. PMID:17942386

  20. 40 CFR 35.910-5 - Additional allotments of previously withheld sums.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... ($11 billion) and subtracting the previously allotted sums, formerly set forth in § 35.910-3(c). (c... Pub. L. 93-243; and, finally, by subtracting the previously allotted sums set forth in § 35.910-4(c). (d) Based upon the computations set forth in paragraphs (b) and (c) of this section, the total...

  1. 3D nonrigid registration via optimal mass transport on the GPU.

    PubMed

    Ur Rehman, Tauseef; Haber, Eldad; Pryor, Gallagher; Melonakos, John; Tannenbaum, Allen

    2009-12-01

    In this paper, we present a new computationally efficient numerical scheme for the minimizing flow approach for optimal mass transport (OMT) with applications to non-rigid 3D image registration. The approach utilizes all of the gray-scale data in both images, and the optimal mapping from image A to image B is the inverse of the optimal mapping from B to A. Further, no landmarks need to be specified, and the minimizer of the distance functional involved is unique. Our implementation also employs multigrid and parallel methodologies on a consumer graphics processing unit (GPU) for fast computation. Although computing the optimal map has been shown to be computationally expensive in the past, we show that our approach is orders of magnitude faster than previous work and is capable of finding transport maps with optimality measures (mean curl) previously unattainable by other works (which directly influences the accuracy of registration). We give results where the algorithm was used to compute non-rigid registrations of 3D synthetic data as well as intra-patient pre-operative and post-operative 3D brain MRI datasets.
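
    The notion of an "optimal map" can be made concrete in one dimension, where optimal mass transport between two sets of equally weighted samples reduces to the monotone (sorted) pairing. This is only an illustration of the concept; the paper's 3D minimizing-flow GPU solver handles the far harder continuous 3D case.

```python
# 1D optimal mass transport between equally weighted point masses: the
# squared-displacement cost is minimised by pairing sorted values monotonely.

def optimal_map_1d(a, b):
    """Return the OMT pairing and its mean squared-displacement cost."""
    pairs = list(zip(sorted(a), sorted(b)))
    cost = sum((x - y) ** 2 for x, y in pairs) / len(pairs)
    return pairs, cost

pairs, cost = optimal_map_1d([3.0, 1.0, 2.0], [10.0, 30.0, 20.0])
# monotone pairing: 1 -> 10, 2 -> 20, 3 -> 30
```

    The registration analogue replaces point masses with image intensity, and the monotone map with a 3D deformation field minimising the same kind of transport cost.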

  2. Towards an accurate real-time locator of infrasonic sources

    NASA Astrophysics Data System (ADS)

    Pinsky, V.; Blom, P.; Polozov, A.; Marcillo, O.; Arrowsmith, S.; Hofstetter, A.

    2017-11-01

    Infrasonic signals propagate from an atmospheric source via media with stochastic and fast space-varying conditions. Hence, their travel time, the amplitude at sensor recordings and even manifestation in the so-called "shadow zones" are random. Therefore, the traditional least-squares technique for locating infrasonic sources is often not effective, and the problem for the best solution must be formulated in probabilistic terms. Recently, a series of papers has been published about the Bayesian Infrasonic Source Localization (BISL) method based on the computation of the posterior probability density function (PPDF) of the source location, as a convolution of the a priori probability distribution function (APDF) of the propagation model parameters with the likelihood function (LF) of observations. The present study is devoted to the further development of BISL to improve the accuracy and stability of the source location results and to reduce the computational load. We critically analyse previous algorithms and propose several new ones. First of all, we describe the general PPDF formulation and demonstrate that this relatively slow algorithm might be among the most accurate, provided that an adequate APDF and LF are used. Then, we suggest using summation instead of integration in the general PPDF calculation for increased robustness, which leads us to a 3D space-time optimization problem. Two different forms of APDF approximation are considered and applied to the PPDF calculation in our study. One of them, previously suggested but not yet properly exploited, is the so-called "celerity-range histogram" (CRH). The other is the outcome of previous findings of linear mean travel times for the first four infrasonic phases in overlapping consecutive distance ranges.
This stochastic model is extended here to the regional distance of 1000 km, and the APDF introduced is the probabilistic form of the junction between this travel time model and range-dependent probability distributions of the phase arrival time picks. To illustrate the improvements in both computation time and location accuracy achieved, we compare location results for the new algorithms, previously published BISL-type algorithms and the least-squares location technique. This comparison is provided via a case study of different typical spatial data distributions and statistical experiment using the database of 36 ground-truth explosions from the Utah Test and Training Range (UTTR) recorded during the US summer season at USArray transportable seismic stations when they were near the site between 2006 and 2008.
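
    The core Bayesian idea, a posterior over candidate source locations built from a stochastic travel-time model, can be sketched with a simple grid search. Everything below is illustrative (station geometry, celerity mean and spread, the Gaussian likelihood, the 1 s noise floor), not the paper's algorithms or data; origin time is maximised out analytically as a weighted mean.

```python
# Grid-search sketch of Bayesian infrasonic source localization: the posterior
# over source positions is proportional to the likelihood of observed arrival
# times under a celerity model with range-dependent uncertainty.
import math

STATIONS = [(0.0, 0.0), (300.0, 0.0), (0.0, 400.0)]   # station x, y in km
CELERITY_MEAN, CELERITY_SD = 0.30, 0.02               # km/s

def posterior_map(arrivals, grid_km=500.0, step=10.0):
    """Return the grid point maximising the posterior (flat spatial prior)."""
    best, best_xy = -1.0, None
    n = int(grid_km / step) + 1
    for ix in range(n):
        for iy in range(n):
            x, y = ix * step, iy * step
            ranges = [math.hypot(x - sx, y - sy) for sx, sy in STATIONS]
            mu = [r / CELERITY_MEAN for r in ranges]       # mean travel times
            # delta-method travel-time sigma, with a 1 s pick-noise floor
            sd = [max(1.0, r * CELERITY_SD / CELERITY_MEAN ** 2) for r in ranges]
            w = [1 / s ** 2 for s in sd]
            # origin time maximising the Gaussian likelihood: weighted mean
            t0 = sum(wi * (t - m) for wi, t, m in zip(w, arrivals, mu)) / sum(w)
            like = math.exp(-0.5 * sum(((t - t0 - m) / s) ** 2
                                       for t, m, s in zip(arrivals, mu, sd)))
            if like > best:
                best, best_xy = like, (x, y)
    return best_xy

# synthetic noise-free event at (200, 100) km with origin time 50 s
TRUE_XY, TRUE_T0 = (200.0, 100.0), 50.0
arrivals = [TRUE_T0 + math.hypot(TRUE_XY[0] - sx, TRUE_XY[1] - sy) / CELERITY_MEAN
            for sx, sy in STATIONS]
est = posterior_map(arrivals)
```

    Replacing the single Gaussian per station with a celerity-range histogram, or with the phase-dependent linear travel-time model described above, changes only the likelihood term in this loop.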

  3. DIRAC in Large Particle Physics Experiments

    NASA Astrophysics Data System (ADS)

    Stagni, F.; Tsaregorodtsev, A.; Arrabito, L.; Sailer, A.; Hara, T.; Zhang, X.; Consortium, DIRAC

    2017-10-01

    The DIRAC project is developing interware to build and operate distributed computing systems. It provides a development framework and a rich set of services for both Workload and Data Management tasks of large scientific communities. A number of High Energy Physics and Astrophysics collaborations have adopted DIRAC as the base for their computing models. DIRAC was initially developed for the LHCb experiment at LHC, CERN. Later, the Belle II, BES III and CTA experiments as well as the linear collider detector collaborations started using DIRAC for their computing systems. Some of the experiments built their DIRAC-based systems from scratch, others migrated from previous solutions, ad-hoc or based on different middlewares. Adaptation of DIRAC for a particular experiment was enabled through the creation of extensions to meet their specific requirements. Each experiment has a heterogeneous set of computing and storage resources at their disposal that were aggregated through DIRAC into a coherent pool. Users from different experiments can interact with the system in different ways depending on their specific tasks, expertise level and previous experience using command line tools, python APIs or Web Portals. In this contribution we will summarize the experience of using DIRAC in particle physics collaborations. The problems of migration to DIRAC from previous systems and their solutions will be presented. An overview of specific DIRAC extensions will be given. We hope that this review will be useful for experiments considering an update, or for those designing their computing models.

  4. Determination of Scaled Wind Turbine Rotor Characteristics from Three Dimensional RANS Calculations

    NASA Astrophysics Data System (ADS)

    Burmester, S.; Gueydon, S.; Make, M.

    2016-09-01

    Previous studies have shown the importance of 3D effects when calculating the performance characteristics of a scaled-down turbine rotor [1-4]. In this paper the results of 3D RANS (Reynolds-Averaged Navier-Stokes) computations by Make and Vaz [1] are taken to calculate 2D lift and drag coefficients. These coefficients are assigned to FAST (the Blade Element Momentum Theory (BEMT) tool from NREL) as input parameters. Then, the rotor characteristics (power and thrust coefficients) are calculated using BEMT. This coupling of RANS and BEMT was previously applied by other parties and is termed here the RANS-BEMT coupled approach. Here the approach is compared to measurements carried out in a wave basin at MARIN applying Froude-scaled wind, and to the direct 3D RANS computation. Data for both a model-scale and a full-scale wind turbine are used for the validation and verification. The flow around a turbine blade at full scale has a more 2D character than the flow around a turbine blade at model scale (Make and Vaz [1]). Since BEMT assumes 2D flow behaviour, the results of the RANS-BEMT coupled approach agree better with the results of the CFD (Computational Fluid Dynamics) simulation at full- than at model-scale.
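
    The BEMT step at the heart of the coupled approach solves, at each blade section, for the axial (a) and tangential (a') induction factors given 2D lift and drag coefficients. The fixed-point sketch below uses a thin-airfoil lift slope and illustrative numbers for solidity, twist and local speed ratio; it is a generic textbook BEMT iteration, not the FAST implementation or the paper's blade data.

```python
# One-section BEMT fixed-point iteration: inflow angle -> 2D aerodynamic
# coefficients -> momentum-theory update of the induction factors.
import math

def bemt_section(lam_r=6.0, sigma=0.05, theta=0.05, cd=0.01, relax=0.5):
    """Return (a, a') for one annulus; all parameters are illustrative."""
    a, ap = 0.0, 0.0
    for _ in range(200):
        phi = math.atan2(1 - a, (1 + ap) * lam_r)   # inflow angle
        alpha = phi - theta                          # angle of attack
        cl = 2 * math.pi * alpha                     # thin-airfoil lift slope
        cn = cl * math.cos(phi) + cd * math.sin(phi) # normal coefficient
        ct = cl * math.sin(phi) - cd * math.cos(phi) # tangential coefficient
        a_new = 1.0 / (4 * math.sin(phi) ** 2 / (sigma * cn) + 1)
        ap_new = 1.0 / (4 * math.sin(phi) * math.cos(phi) / (sigma * ct) - 1)
        a = (1 - relax) * a + relax * a_new          # under-relaxed update
        ap = (1 - relax) * ap + relax * ap_new
    return a, ap

a, ap = bemt_section()
```

    In the RANS-BEMT coupled approach, the cl and cd used here would come from the sectional coefficients extracted from the 3D RANS solution rather than from an analytic lift slope.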

  5. The influence of multiple trials and computer-mediated communication on collaborative and individual semantic recall.

    PubMed

    Hinds, Joanne M; Payne, Stephen J

    2018-04-01

    Collaborative inhibition is a phenomenon where collaborating groups experience a decrement in recall when interacting with others. Despite this, collaboration has been found to improve subsequent individual recall. We explore these effects in semantic recall, which is seldom studied in collaborative retrieval. We also examine "parallel CMC", a synchronous form of computer-mediated communication that has previously been found to improve collaborative recall [Hinds, J. M., & Payne, S. J. (2016). Collaborative inhibition and semantic recall: Improving collaboration through computer-mediated communication. Applied Cognitive Psychology, 30(4), 554-565]. Sixty-three triads completed a semantic recall task, which involved generating words beginning with "PO" or "HE" across three recall trials, in one of three retrieval conditions: Individual-Individual-Individual (III), Face-to-Face-Face-to-Face-Individual (FFI) and Parallel-Parallel-Individual (PPI). Collaborative inhibition was present across both collaborative conditions. Individual recall in Recall 3 was higher when participants had previously collaborated than when they had recalled three times individually. There was no difference between face-to-face and parallel CMC recall; however, subsidiary analyses of instance repetitions and subjective organisation highlighted differences in group members' approaches to recall in terms of organisation and attention to others' contributions. We discuss the implications of these findings in relation to retrieval strategy disruption.

  6. First-principles Raman Spectra of Lead Titanate with Pressure

    NASA Astrophysics Data System (ADS)

    Schad, A.; Ganesh, P.; Cohen, R. E.; Ahart, M.

    2010-03-01

    PbTiO3 displays a morphotropic phase boundary (MPB) under pressure at which electromechanical properties are maximal [1,2]. Previously only complex solid solutions were thought to exhibit such a boundary. To aid in the experimental study of the MPB region, we compute Raman scattering spectra of different phases of PbTiO3 with pressure using a DFT-based first-principles approach and Density Functional Perturbation Theory (DFPT) [3]. The computed intensities and shifts with pressure agree very well with the experimental data measured on powder samples. Computations further allow comparison of Raman spectra and shifts in energetically competing phases, raising the possibility of using calculations for experimental calibration of Raman spectra at any pressure. The results substantiate previous claims of a low-temperature monoclinic phase at the MPB at approximately 10 GPa in PbTiO3 and refute the possibility of an I4cm phase at higher pressures suggested by other groups [4]. [1] Z. Wu and R. E. Cohen, Phys. Rev. Lett. 95, 037601 (2005); [2] M. Ahart et al., Nature 451, 545 (2008); [3] P. Hermet et al., J. Phys.: Condens. Matter 21, 215901 (2009); [4] P. E. Janolin et al., Phys. Rev. Lett. 101, 237601 (2008).

  7. Computer-aided design of liposomal drugs: In silico prediction and experimental validation of drug candidates for liposomal remote loading.

    PubMed

    Cern, Ahuva; Barenholz, Yechezkel; Tropsha, Alexander; Goldblum, Amiram

    2014-01-10

    Previously we have developed and statistically validated Quantitative Structure Property Relationship (QSPR) models that correlate drugs' structural, physical and chemical properties as well as experimental conditions with the relative efficiency of remote loading of drugs into liposomes (Cern et al., J. Control. Release 160 (2012) 147-157). Herein, these models have been used to virtually screen a large drug database to identify novel candidate molecules for liposomal drug delivery. Computational hits were considered for experimental validation based on their predicted remote loading efficiency as well as additional considerations such as availability, recommended dose and relevance to the disease. Three compounds were selected for experimental testing which were confirmed to be correctly classified by our previously reported QSPR models developed with Iterative Stochastic Elimination (ISE) and k-Nearest Neighbors (kNN) approaches. In addition, 10 new molecules with known liposome remote loading efficiency that were not used by us in QSPR model development were identified in the published literature and employed as an additional model validation set. The external accuracy of the models was found to be as high as 82% or 92%, depending on the model. This study presents the first successful application of QSPR models for the computer-model-driven design of liposomal drugs. © 2013.

  8. Computer-aided design of liposomal drugs: in silico prediction and experimental validation of drug candidates for liposomal remote loading

    PubMed Central

    Cern, Ahuva; Barenholz, Yechezkel; Tropsha, Alexander; Goldblum, Amiram

    2014-01-01

    Previously we have developed and statistically validated Quantitative Structure Property Relationship (QSPR) models that correlate drugs’ structural, physical and chemical properties as well as experimental conditions with the relative efficiency of remote loading of drugs into liposomes (Cern et al., J. Control. Release 160 (2012) 147-157). Herein, these models have been used to virtually screen a large drug database to identify novel candidate molecules for liposomal drug delivery. Computational hits were considered for experimental validation based on their predicted remote loading efficiency as well as additional considerations such as availability, recommended dose and relevance to the disease. Three compounds were selected for experimental testing which were confirmed to be correctly classified by our previously reported QSPR models developed with Iterative Stochastic Elimination (ISE) and k-nearest neighbors (kNN) approaches. In addition, 10 new molecules with known liposome remote loading efficiency that were not used in QSPR model development were identified in the published literature and employed as an additional model validation set. The external accuracy of the models was found to be as high as 82% or 92%, depending on the model. This study presents the first successful application of QSPR models for the computer-model-driven design of liposomal drugs. PMID:24184343
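
    The kNN classification step mentioned in both records above can be sketched generically: a candidate drug is labelled by majority vote of its k nearest neighbours in descriptor space. The descriptors and labels below are invented toy values, not the paper's dataset or descriptor set.

```python
# Minimal k-nearest-neighbours classifier: Euclidean distance in descriptor
# space, majority vote over the k closest training examples.
import math

def knn_classify(query, training, k=3):
    """training: list of (descriptor_vector, label); returns majority label."""
    neighbours = sorted(training, key=lambda item: math.dist(query, item[0]))[:k]
    votes = [label for _, label in neighbours]
    return max(set(votes), key=votes.count)

# toy descriptors (logD, ring count) -- purely illustrative
train = [
    ((1.0, 2.0), "high"), ((1.2, 2.5), "high"), ((0.8, 1.8), "high"),
    ((4.0, 0.5), "low"), ((4.5, 1.0), "low"), ((3.8, 0.2), "low"),
]
label = knn_classify((1.1, 2.1), train)
```

    In the virtual-screening setting described above, each database compound would be such a query, and the predicted "high" remote-loading class would flag it for experimental follow-up.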

  9. Older Adults Perceptions of Technology and Barriers to Interacting with Tablet Computers: A Focus Group Study.

    PubMed

    Vaportzis, Eleftheria; Clausen, Maria Giatsi; Gow, Alan J

    2017-10-04

    New technologies provide opportunities for the delivery of broad, flexible interventions with older adults. Focus groups were conducted to: (1) understand older adults' familiarity with, and barriers to, interacting with new technologies and tablets; and (2) utilize user-engagement in refining an intervention protocol. Eighteen older adults (65-76 years old; 83.3% female) who were novice tablet users participated in discussions about their perceptions of and barriers to interacting with tablets. We conducted three separate focus groups and used a generic qualitative design applying thematic analysis to analyse the data. The focus groups explored attitudes toward tablets and technology in general. We also explored the perceived advantages and disadvantages of using tablets, familiarity with, and barriers to interacting with tablets. In two of the focus groups, participants had previous computing experience (e.g., desktop), while in the other, participants had no previous computing experience. None of the participants had any previous experience with tablet computers. The themes that emerged were related to barriers (i.e., lack of instructions and guidance, lack of knowledge and confidence, health-related barriers, cost); disadvantages and concerns (i.e., too much and too complex technology, feelings of inadequacy, and comparison with younger generations, lack of social interaction and communication, negative features of tablets); advantages (i.e., positive features of tablets, accessing information, willingness to adopt technology); and skepticism about using tablets and technology in general. After brief exposure to tablets, participants emphasized the likelihood of using a tablet in the future. Our findings suggest that most of our participants were eager to adopt new technology and willing to learn to use a tablet. However, they voiced apprehension about lack of, or lack of clarity in, instructions and support. Understanding older adults' perceptions of technology is important to assist with introducing it to this population and maximize the potential of technology to facilitate independent living.

  10. Older Adults Perceptions of Technology and Barriers to Interacting with Tablet Computers: A Focus Group Study

    PubMed Central

    Vaportzis, Eleftheria; Giatsi Clausen, Maria; Gow, Alan J.

    2017-01-01

    Background: New technologies provide opportunities for the delivery of broad, flexible interventions with older adults. Focus groups were conducted to: (1) understand older adults' familiarity with, and barriers to, interacting with new technologies and tablets; and (2) utilize user-engagement in refining an intervention protocol. Methods: Eighteen older adults (65–76 years old; 83.3% female) who were novice tablet users participated in discussions about their perceptions of and barriers to interacting with tablets. We conducted three separate focus groups and used a generic qualitative design applying thematic analysis to analyse the data. The focus groups explored attitudes toward tablets and technology in general. We also explored the perceived advantages and disadvantages of using tablets, familiarity with, and barriers to interacting with tablets. In two of the focus groups, participants had previous computing experience (e.g., desktop), while in the other, participants had no previous computing experience. None of the participants had any previous experience with tablet computers. Results: The themes that emerged were related to barriers (i.e., lack of instructions and guidance, lack of knowledge and confidence, health-related barriers, cost); disadvantages and concerns (i.e., too much and too complex technology, feelings of inadequacy, and comparison with younger generations, lack of social interaction and communication, negative features of tablets); advantages (i.e., positive features of tablets, accessing information, willingness to adopt technology); and skepticism about using tablets and technology in general. After brief exposure to tablets, participants emphasized the likelihood of using a tablet in the future. Conclusions: Our findings suggest that most of our participants were eager to adopt new technology and willing to learn to use a tablet. However, they voiced apprehension about lack of, or lack of clarity in, instructions and support. Understanding older adults' perceptions of technology is important to assist with introducing it to this population and maximize the potential of technology to facilitate independent living. PMID:29071004

  11. The use of SymNose for quantitative assessment of lip symmetry following repair of complete bilateral cleft lip and palate.

    PubMed

    Russell, James H B; Kiddy, Harriet C; Mercer, Nigel S

    2014-07-01

    The SymNose computer program has been proposed as an objective method for the quantitative assessment of lip symmetry following unilateral cleft lip repair. This study aims to demonstrate the use of SymNose in patients with complete bilateral cleft lip and palate (BCLP), a group previously excluded from computer-based analysis. A retrospective cohort study compared several parameters of lip symmetry between BCLP cases and non-cleft controls. 15 BCLP cases aged 10 (±1 year) who had undergone primary repair were recruited from the patient database at the South West Cleft Unit, Frenchay Hospital. Frontal facial photographs were selected for measurement. 15 age-matched controls were recruited from a local school. Lip symmetry was expressed as: percentage mismatch of left vermillion border and upper lip area over the right, horizontal lip tilt and lateral deviation of the lip. A significant increase in lip asymmetry was found in the BCLP group expressed as upper vermillion border mismatch across computer-defined and user-defined midlines (mean difference was 16.4% (p < 0.01) and 17.5% (p < 0.01) respectively). The results suggest that a significant degree of lip asymmetry remains in BCLP patients even after primary repair. This challenges previous assumptions that those with bilateral defects would be relatively symmetrical. Copyright © 2013 European Association for Cranio-Maxillo-Facial Surgery. Published by Elsevier Ltd. All rights reserved.
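    The percentage-mismatch idea can be sketched as follows; this is not the SymNose algorithm itself, only an illustration of a left-right mismatch measure on a hypothetical binary mask of a lip region.

```python
# Sketch of a left-right area mismatch measure, similar in spirit to the
# percentage mismatch reported above (the program's actual algorithm is
# not reproduced here). Works on a binary mask; for odd widths the
# middle column is ignored.
import numpy as np

def percent_mismatch(mask):
    """Mirror the right half onto the left and report the symmetric
    difference as a percentage of the left-half area."""
    h, w = mask.shape
    half = w // 2
    left = mask[:, :half]
    right_mirrored = mask[:, w - half:][:, ::-1]
    mismatch = np.logical_xor(left, right_mirrored).sum()
    return 100.0 * mismatch / max(left.sum(), 1)

# A perfectly symmetric mask has 0% mismatch.
sym = np.zeros((4, 6), dtype=bool)
sym[1:3, 1:5] = True
print(percent_mismatch(sym))
```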

  12. Patients' acceptance of Internet-based home asthma telemonitoring.

    PubMed

    Finkelstein, J; Hripcsak, G; Cabrera, M R

    1998-01-01

    We studied asthma patients from a low-income inner-city community without previous computer experience. The patients were given portable spirometers to perform spirometry tests and palmtop computers to enter symptoms in a diary, exchange messages with a physician and review test results. Self-testing was performed at home on a daily basis. The results were transmitted to the hospital information system immediately after completion of each test. The physician could review results using an Internet Web browser from any location. A constantly active decision support server monitored all data traffic and dispatched alerts when certain clinical conditions were met. Seventeen of the 19 patients invited agreed to participate in the study and were monitored for three weeks. They were then surveyed using a standardized questionnaire. Most of the patients (82.4%) characterized self-testing procedures as "not complicated at all." In 70.6% of cases self-testing did not interfere with usual activities, and 82.4% of patients felt the self-testing required a "very little" amount of their time. All patients stated that it was important for them to know that the results could be reviewed by professional staff in a timely manner. However, only 29.5% of patients reviewed their results at least once a week at home independently. The majority of the patients (94.1%) were strongly interested in using home asthma telemonitoring in the future. We concluded that Internet-based home asthma telemonitoring can be successfully implemented in a group of patients without a previous computer background.

  13. A simulation study of homogeneous ice nucleation in supercooled salty water

    NASA Astrophysics Data System (ADS)

    Soria, Guiomar D.; Espinosa, Jorge R.; Ramirez, Jorge; Valeriani, Chantal; Vega, Carlos; Sanz, Eduardo

    2018-06-01

    We use computer simulations to investigate the effect of salt on homogeneous ice nucleation. The melting point of the employed solution model was obtained both by direct coexistence simulations and by thermodynamic integration from previous calculations of the water chemical potential. Using a seeding approach, in which we simulate ice seeds embedded in a supercooled aqueous solution, we compute the nucleation rate as a function of temperature for a 1.85 mol NaCl per kilogram of water solution at 1 bar. To improve the accuracy and reliability of our calculations, we combine seeding with the direct computation of the ice-solution interfacial free energy at coexistence using the Mold Integration method. We compare the results with previous simulation work on pure water to understand the effect caused by the solute. The model captures the experimental trend that the nucleation rate at a given supercooling decreases when adding salt. Despite the fact that the thermodynamic driving force for ice nucleation is higher for salty water at a given supercooling, the nucleation rate slows down with salt due to a significant increase of the ice-fluid interfacial free energy. The salty water model predicts an ice nucleation rate that is in good agreement with experimental measurements, bringing confidence in the predictive ability of the model. We expect that the combination of state-of-the-art simulation methods employed here to study ice nucleation from solution will be of much use in forthcoming numerical investigations of crystallization in mixtures.

  14. Adaptive-weighted Total Variation Minimization for Sparse Data toward Low-dose X-ray Computed Tomography Image Reconstruction

    PubMed Central

    Liu, Yan; Ma, Jianhua; Fan, Yi; Liang, Zhengrong

    2012-01-01

    Previous studies have shown that by minimizing the total variation (TV) of the to-be-estimated image with some data and other constraints, a piecewise-smooth X-ray computed tomography (CT) image can be reconstructed from sparse-view projection data without introducing noticeable artifacts. However, due to the piecewise constant assumption for the image, a conventional TV minimization algorithm often suffers from over-smoothness on the edges of the resulting image. To mitigate this drawback, we present an adaptive-weighted TV (AwTV) minimization algorithm in this paper. The presented AwTV model is derived by considering the anisotropic edge property among neighboring image voxels, where the associated weights are expressed as an exponential function and can be adaptively adjusted by the local image-intensity gradient for the purpose of preserving the edge details. Inspired by the previously-reported TV-POCS (projection onto convex sets) implementation, a similar AwTV-POCS implementation was developed to minimize the AwTV subject to data and other constraints for the purpose of sparse-view low-dose CT image reconstruction. To evaluate the presented AwTV-POCS algorithm, both qualitative and quantitative studies were performed by computer simulations and phantom experiments. The results show that the presented AwTV-POCS algorithm can yield images with several noticeable gains, in terms of noise-resolution tradeoff plots and full width at half maximum values, as compared to the corresponding conventional TV-POCS algorithm. PMID:23154621

  15. Adaptive-weighted total variation minimization for sparse data toward low-dose x-ray computed tomography image reconstruction.

    PubMed

    Liu, Yan; Ma, Jianhua; Fan, Yi; Liang, Zhengrong

    2012-12-07

    Previous studies have shown that by minimizing the total variation (TV) of the to-be-estimated image with some data and other constraints, a piecewise-smooth x-ray computed tomography (CT) image can be reconstructed from sparse-view projection data without introducing notable artifacts. However, due to the piecewise constant assumption for the image, a conventional TV minimization algorithm often suffers from over-smoothness on the edges of the resulting image. To mitigate this drawback, we present an adaptive-weighted TV (AwTV) minimization algorithm in this paper. The presented AwTV model is derived by considering the anisotropic edge property among neighboring image voxels, where the associated weights are expressed as an exponential function and can be adaptively adjusted by the local image-intensity gradient for the purpose of preserving the edge details. Inspired by the previously reported TV-POCS (projection onto convex sets) implementation, a similar AwTV-POCS implementation was developed to minimize the AwTV subject to data and other constraints for the purpose of sparse-view low-dose CT image reconstruction. To evaluate the presented AwTV-POCS algorithm, both qualitative and quantitative studies were performed by computer simulations and phantom experiments. The results show that the presented AwTV-POCS algorithm can yield images with several notable gains, in terms of noise-resolution tradeoff plots and full-width at half-maximum values, as compared to the corresponding conventional TV-POCS algorithm.
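    The adaptive weighting described above can be sketched as follows; the exponential weight form follows the abstract's description, but the discretization and the scale parameter `delta` are simplified assumptions, not the paper's exact formulation.

```python
# Sketch of the adaptive-weighted TV idea: each finite-difference term is
# down-weighted by an exponential function of the local intensity gradient,
# so strong edges contribute less to the penalty than flat regions do.
import numpy as np

def awtv(img, delta=0.1):
    dx = np.diff(img, axis=1)  # horizontal intensity differences
    dy = np.diff(img, axis=0)  # vertical intensity differences
    wx = np.exp(-(dx / delta) ** 2)  # weights shrink across strong edges
    wy = np.exp(-(dy / delta) ** 2)
    return (wx * np.abs(dx)).sum() + (wy * np.abs(dy)).sum()

flat = np.zeros((8, 8))
edge = np.zeros((8, 8))
edge[:, 4:] = 1.0
# A sharp edge is penalized far less by AwTV than by plain (unweighted) TV.
print(awtv(flat), awtv(edge), np.abs(np.diff(edge, axis=1)).sum())
```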

  16. A simulation study of homogeneous ice nucleation in supercooled salty water.

    PubMed

    Soria, Guiomar D; Espinosa, Jorge R; Ramirez, Jorge; Valeriani, Chantal; Vega, Carlos; Sanz, Eduardo

    2018-06-14

    We use computer simulations to investigate the effect of salt on homogeneous ice nucleation. The melting point of the employed solution model was obtained both by direct coexistence simulations and by thermodynamic integration from previous calculations of the water chemical potential. Using a seeding approach, in which we simulate ice seeds embedded in a supercooled aqueous solution, we compute the nucleation rate as a function of temperature for a 1.85 mol NaCl per kilogram of water solution at 1 bar. To improve the accuracy and reliability of our calculations, we combine seeding with the direct computation of the ice-solution interfacial free energy at coexistence using the Mold Integration method. We compare the results with previous simulation work on pure water to understand the effect caused by the solute. The model captures the experimental trend that the nucleation rate at a given supercooling decreases when adding salt. Despite the fact that the thermodynamic driving force for ice nucleation is higher for salty water at a given supercooling, the nucleation rate slows down with salt due to a significant increase of the ice-fluid interfacial free energy. The salty water model predicts an ice nucleation rate that is in good agreement with experimental measurements, bringing confidence in the predictive ability of the model. We expect that the combination of state-of-the-art simulation methods employed here to study ice nucleation from solution will be of much use in forthcoming numerical investigations of crystallization in mixtures.
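    For context, seeding estimates typically plug the measured critical cluster into classical nucleation theory; the textbook CNT expressions (generic notation, not necessarily the exact formulation used in this paper) are:

```latex
% Steady-state nucleation rate and the CNT free-energy barrier
J = \rho_f \, Z \, f^{+} \exp\!\left(-\frac{\Delta G_c}{k_B T}\right),
\qquad
\Delta G_c = \frac{16 \pi \gamma^3}{3 \, \rho_s^2 \, |\Delta\mu|^2},
```

    where $\rho_f$ is the fluid number density, $Z$ the Zeldovich factor, $f^{+}$ the attachment rate, $\gamma$ the ice-solution interfacial free energy, $\rho_s$ the solid number density, and $\Delta\mu$ the chemical potential difference between the phases. In seeding, the barrier can equivalently be obtained from the measured critical cluster size $N_c$ via $\Delta G_c = N_c |\Delta\mu| / 2$.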

  17. LRSSLMDA: Laplacian Regularized Sparse Subspace Learning for MiRNA-Disease Association prediction

    PubMed Central

    Huang, Li

    2017-01-01

    Predicting novel microRNA (miRNA)-disease associations is clinically significant due to miRNAs’ potential roles as diagnostic biomarkers and therapeutic targets for various human diseases. Previous studies have demonstrated the viability of utilizing different types of biological data to computationally infer new disease-related miRNAs. Yet researchers face the challenge of how to effectively integrate diverse datasets and make reliable predictions. In this study, we presented a computational model named Laplacian Regularized Sparse Subspace Learning for MiRNA-Disease Association prediction (LRSSLMDA), which projected miRNAs/diseases’ statistical feature profile and graph theoretical feature profile to a common subspace. It used Laplacian regularization to preserve the local structures of the training data and an L1-norm constraint to select important miRNA/disease features for prediction. The strength of dimensionality reduction enabled the model to be easily extended to much higher dimensional datasets than those exploited in this study. Experimental results showed that LRSSLMDA outperformed ten previous models: the AUC of 0.9178 in global leave-one-out cross validation (LOOCV) and the AUC of 0.8418 in local LOOCV indicated the model’s superior prediction accuracy; and the average AUC of 0.9181 ± 0.0004 in 5-fold cross validation justified its accuracy and stability. In addition, three types of case studies further demonstrated its predictive power. Potential miRNAs related to Colon Neoplasms, Lymphoma, Kidney Neoplasms, Esophageal Neoplasms and Breast Neoplasms were predicted by LRSSLMDA. Respectively, 98%, 88%, 96%, 98% and 98% of the top 50 predictions were validated by experimental evidence. Therefore, we conclude that LRSSLMDA would be a valuable computational tool for miRNA-disease association prediction. PMID:29253885
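    The Laplacian regularizer mentioned above penalizes feature profiles that vary across strongly connected nodes; a minimal sketch with an invented 3-node similarity graph (not the paper's miRNA/disease networks) shows the idea:

```python
# Graph Laplacian L = D - A for a small similarity graph, and the smoothness
# penalty tr(F^T L F) that Laplacian regularization minimizes: it is small
# when connected nodes are assigned similar feature rows F.
import numpy as np

A = np.array([[0, 1, 0],
              [1, 0, 1],
              [0, 1, 0]], dtype=float)  # hypothetical adjacency/similarity
L = np.diag(A.sum(axis=1)) - A          # graph Laplacian

smooth = np.array([[1.0], [1.0], [1.0]])   # identical rows everywhere
rough = np.array([[1.0], [-1.0], [1.0]])   # neighbors disagree
print(np.trace(smooth.T @ L @ smooth), np.trace(rough.T @ L @ rough))
```

    The trace equals the sum of squared feature differences over graph edges, so the constant profile scores 0 while the alternating one is penalized.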

  18. Computation of a high-resolution MRI 3D stereotaxic atlas of the sheep brain.

    PubMed

    Ella, Arsène; Delgadillo, José A; Chemineau, Philippe; Keller, Matthieu

    2017-02-15

    The sheep model was first used in the fields of animal reproduction and veterinary sciences and then was utilized in fundamental and preclinical studies. For more than a decade, magnetic resonance (MR) studies performed on this model have been increasingly reported, especially in the field of neuroscience. To contribute to MR translational neuroscience research, a brain template and an atlas are necessary. We have recently generated the first complete T1-weighted (T1W) and T2W MR population average images (or templates) of in vivo sheep brains. In this study, we 1) defined a 3D stereotaxic coordinate system for previously established in vivo population average templates; 2) used deformation fields obtained during optimized nonlinear registrations to compute nonlinear tissue prior probability maps (nlTPMs) of cerebrospinal fluid (CSF), gray matter (GM), and white matter (WM) tissues; 3) delineated 25 external and 28 internal sheep brain structures by segmenting both templates and nlTPMs; and 4) annotated and labeled these structures using an existing histological atlas. We built a high-quality, high-resolution 3D atlas of average in vivo sheep brains linked to a reference stereotaxic space. The atlas and nlTPMs, associated with the previously computed T1W and T2W in vivo sheep brain templates, provide a complete set of imaging resources that can be imported into other imaging software programs and used as standardized tools for neuroimaging studies or other neuroscience methods, such as image registration, image segmentation, identification of brain structures, implementation of recording devices, or neuronavigation. J. Comp. Neurol. 525:676-692, 2017. © 2016 Wiley Periodicals, Inc.

  19. A Rural Community's Involvement in the Design and Usability Testing of a Computer-Based Informed Consent Process for the Personalized Medicine Research Project

    PubMed Central

    Mahnke, Andrea N; Plasek, Joseph M; Hoffman, David G; Partridge, Nathan S; Foth, Wendy S; Waudby, Carol J; Rasmussen, Luke V; McManus, Valerie D; McCarty, Catherine A

    2014-01-01

    Many informed consent studies demonstrate that research subjects poorly retain and understand information in written consent documents. Previous research in multimedia consent is mixed in terms of success for improving participants’ understanding, satisfaction, and retention. This failure may be due to a lack of a community-centered design approach to building the interventions. The goal of this study was to gather information from the community to determine the best way to undertake the consent process. Community perceptions regarding different computer-based consenting approaches were evaluated, and a computer-based consent was developed and tested. A second goal was to evaluate whether participants make truly informed decisions to participate in research. Simulations of an informed consent process were videotaped to document the process. Focus groups were conducted to determine community attitudes towards a computer-based informed consent process. Hybrid focus groups were conducted to determine the most acceptable hardware device. Usability testing was conducted on a computer-based consent prototype using a touch-screen kiosk. Based on feedback, a computer-based consent was developed. Representative study participants were able to easily complete the consent, and all were able to correctly answer the comprehension check questions. Community involvement in developing a computer-based consent proved valuable for a population-based genetic study. These findings may translate to other types of informed consents, such as consents for genetic clinical trials. A computer-based consent may serve to better communicate consistent, clear, accurate, and complete information regarding the risks and benefits of study participation. Additional analysis is necessary to measure the level of comprehension of the check-question answers by larger numbers of participants. The next step will involve contacting participants to measure whether understanding of what they consented to is retained over time. PMID:24273095

  20. A rural community's involvement in the design and usability testing of a computer-based informed consent process for the Personalized Medicine Research Project.

    PubMed

    Mahnke, Andrea N; Plasek, Joseph M; Hoffman, David G; Partridge, Nathan S; Foth, Wendy S; Waudby, Carol J; Rasmussen, Luke V; McManus, Valerie D; McCarty, Catherine A

    2014-01-01

    Many informed consent studies demonstrate that research subjects poorly retain and understand information in written consent documents. Previous research in multimedia consent is mixed in terms of success for improving participants' understanding, satisfaction, and retention. This failure may be due to a lack of a community-centered design approach to building the interventions. The goal of this study was to gather information from the community to determine the best way to undertake the consent process. Community perceptions regarding different computer-based consenting approaches were evaluated, and a computer-based consent was developed and tested. A second goal was to evaluate whether participants make truly informed decisions to participate in research. Simulations of an informed consent process were videotaped to document the process. Focus groups were conducted to determine community attitudes towards a computer-based informed consent process. Hybrid focus groups were conducted to determine the most acceptable hardware device. Usability testing was conducted on a computer-based consent prototype using a touch-screen kiosk. Based on feedback, a computer-based consent was developed. Representative study participants were able to easily complete the consent, and all were able to correctly answer the comprehension check questions. Community involvement in developing a computer-based consent proved valuable for a population-based genetic study. These findings may translate to other types of informed consents, including those for trials involving treatment of genetic disorders. A computer-based consent may serve to better communicate consistent, clear, accurate, and complete information regarding the risks and benefits of study participation. Additional analysis is necessary to measure the level of comprehension of the check-question answers by larger numbers of participants. The next step will involve contacting participants to measure whether understanding of what they consented to is retained over time. © 2013 Wiley Periodicals, Inc.

  1. Computational Investigation of Amine–Oxygen Exciplex Formation

    PubMed Central

    Haupert, Levi M.; Simpson, Garth J.; Slipchenko, Lyudmila V.

    2012-01-01

    It has been suggested that fluorescence from amine-containing dendrimer compounds could be the result of a charge transfer between amine groups and molecular oxygen [Chu, C.-C.; Imae, T. Macromol. Rapid Commun. 2009, 30, 89.]. In this paper we employ equation-of-motion coupled cluster computational methods to study the electronic structure of an ammonia–oxygen model complex to examine this possibility. The results reveal several bound electronic states with charge transfer character with emission energies generally consistent with previous observations. However, further work involving confinement, solvent, and amine structure effects will be necessary for more rigorous examination of the charge transfer fluorescence hypothesis. PMID:21812447

  2. Secure multiparty computation of a comparison problem.

    PubMed

    Liu, Xin; Li, Shundong; Liu, Jian; Chen, Xiubo; Xu, Gang

    2016-01-01

    Private comparison is fundamental to secure multiparty computation. In this study, we propose novel protocols to privately determine [Formula: see text], or [Formula: see text] in one execution. First, a 0-1-vector encoding method is introduced to encode a number into a vector, and the Goldwasser-Micali encryption scheme is used to compare integers privately. Then, we propose a protocol that uses a geometric method to compare rational numbers privately, and the protocol is information-theoretically secure. Using the simulation paradigm, we prove the privacy-preserving property of our protocols in the semi-honest model. The complexity analysis shows that our protocols are more efficient than previous solutions.
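    The plaintext intuition behind a 0-1 vector encoding for comparison can be sketched as follows; in the actual protocol the vector entries are hidden under Goldwasser-Micali encryption and the lookup is done obliviously, and the paper's exact encoding may differ from this illustration.

```python
# Plaintext intuition for a 0-1 vector encoding of integers for comparison.
# No cryptography here: this only shows why a step-shaped vector reduces
# comparison to a single position lookup.
DOMAIN = 16  # both parties agree the inputs lie in [0, DOMAIN)

def encode(x):
    """Step vector: 1 at every position <= x, 0 elsewhere."""
    return [1 if i <= x else 0 for i in range(DOMAIN)]

def ge(x, y):
    """x >= y iff position y of x's step vector is 1."""
    return encode(x)[y] == 1

print(ge(7, 5), ge(3, 9))
```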

  3. Extending the limits of complex learning in organic amnesia: computer training in a vocational domain.

    PubMed

    Glisky, E L; Schacter, D L

    1989-01-01

    This study explored the limits of learning that could be achieved by an amnesic patient in a complex real-world domain. Using a cuing procedure known as the method of vanishing cues, a severely amnesic encephalitic patient was taught over 250 discrete pieces of new information concerning the rules and procedures for performing a task involving data entry into a computer. Subsequently, she was able to use this acquired knowledge to perform the task accurately and efficiently in the workplace. These results suggest that amnesic patients' preserved learning abilities can be extended well beyond what has been reported previously.

  4. Wave packet dynamics, time scales and phase diagram in the IBM-Lipkin-Meshkov-Glick model

    NASA Astrophysics Data System (ADS)

    Castaños, Octavio; de los Santos, Francisco; Yáñez, Rafael; Romera, Elvira

    2018-02-01

    We derive the phase diagram of a scalar two-level boson model by studying the equilibrium and stability properties of its energy surface. The plane of control parameters is enlarged with respect to previous studies. We then analyze the time evolution of wave packets centered around the ground state at various quantum phase transition boundary lines. In particular, classical and revival times are computed numerically.

  5. Development and assessment of a chemistry-based computer video game as a learning tool

    NASA Astrophysics Data System (ADS)

    Martinez-Hernandez, Kermin Joel

    The chemistry-based computer video game is a multidisciplinary collaboration between chemistry and computer graphics and technology fields developed to explore the use of video games as a possible learning tool. This innovative approach aims to integrate elements of commercial video game and authentic chemistry context environments into a learning experience through gameplay. The project consists of three areas: development, assessment, and implementation. However, the foci of this study were the development and assessment of the computer video game including possible learning outcomes and game design elements. A chemistry-based game using a mixed genre of a single player first-person game embedded with action-adventure and puzzle components was developed to determine if students' level of understanding of chemistry concepts changes after gameplay intervention. Three phases have been completed to assess students' understanding of chemistry concepts prior to and after gameplay intervention. Two main assessment instruments (pre/post open-ended content survey and individual semi-structured interviews) were used to assess student understanding of concepts. In addition, game design elements were evaluated for future development phases. Preliminary analyses of the interview data suggest that students were able to understand most of the chemistry challenges presented in the game and the game served as a review for previously learned concepts as well as a way to apply such previous knowledge. To ensure a better understanding of the chemistry concepts, additions such as debriefing and feedback about the content presented in the game seem to be needed. The use of visuals in the game to represent chemical processes, game genre, and game idea appear to be the game design elements that students like the most about the current computer video game.

  6. Large-scale dynamo action precedes turbulence in shearing box simulations of the magnetorotational instability

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bhat, Pallavi; Ebrahimi, Fatima; Blackman, Eric G.

    Here, we study the dynamo generation (exponential growth) of large-scale (planar averaged) fields in unstratified shearing box simulations of the magnetorotational instability (MRI). In contrast to previous studies restricted to horizontal (x–y) averaging, we also demonstrate the presence of large-scale fields when vertical (y–z) averaging is employed instead. By computing space–time planar averaged fields and power spectra, we find large-scale dynamo action in the early MRI growth phase – a previously unidentified feature. Non-axisymmetric linear MRI modes with low horizontal wavenumbers and vertical wavenumbers near that of expected maximal growth, amplify the large-scale fields exponentially before turbulence and high wavenumber fluctuations arise. Thus the large-scale dynamo requires only linear fluctuations but not non-linear turbulence (as defined by mode–mode coupling). Vertical averaging also allows for monitoring the evolution of the large-scale vertical field and we find that a feedback from horizontal low wavenumber MRI modes provides a clue as to why the large-scale vertical field sustains against turbulent diffusion in the non-linear saturation regime. We compute the terms in the mean field equations to identify the individual contributions to large-scale field growth for both types of averaging. The large-scale fields obtained from vertical averaging are found to compare well with global simulations and quasi-linear analytical analysis from a previous study by Ebrahimi & Blackman. We discuss the potential implications of these new results for understanding the large-scale MRI dynamo saturation and turbulence.

  7. Collisional disruptions of rotating targets

    NASA Astrophysics Data System (ADS)

    Ševeček, Pavel; Broz, Miroslav

    2017-10-01

    Collisions are key processes in the evolution of the Main Asteroid Belt and impact events - i.e. target fragmentation and gravitational reaccumulation - are commonly studied by numerical simulations, namely by SPH and N-body methods. In our work, we extend the previous studies by assuming rotating targets and we study the dependence of resulting size-distributions on the pre-impact rotation of the target. To obtain stable initial conditions, it is also necessary to include the self-gravity already in the fragmentation phase, which was previously neglected. To tackle this problem, we developed an SPH code, accelerated by SSE/AVX instruction sets and parallelized. The code solves the standard set of hydrodynamic equations, using the Tillotson equation of state, von Mises criterion for plastic yielding and scalar Grady-Kipp model for fragmentation. We further modified the velocity gradient by a correction tensor (Schäfer et al. 2007) to ensure a first-order conservation of the total angular momentum. As the intact target is a spherical body, its gravity can be approximated by a potential of a homogeneous sphere, making it easy to set up initial conditions. This is, however, infeasible for later stages of the disruption; to this end, we included the Barnes-Hut algorithm to compute the gravitational accelerations, using a multipole expansion of distant particles up to hexadecapole order. We tested the code carefully, comparing the results to our previous computations obtained with the SPH5 code (Benz and Asphaug 1994). Finally, we ran a set of simulations and we discuss the difference between the synthetic families created by rotating and static targets.

  8. Large-scale dynamo action precedes turbulence in shearing box simulations of the magnetorotational instability

    DOE PAGES

    Bhat, Pallavi; Ebrahimi, Fatima; Blackman, Eric G.

    2016-07-06

    Here, we study the dynamo generation (exponential growth) of large-scale (planar averaged) fields in unstratified shearing box simulations of the magnetorotational instability (MRI). In contrast to previous studies restricted to horizontal (x–y) averaging, we also demonstrate the presence of large-scale fields when vertical (y–z) averaging is employed instead. By computing space–time planar averaged fields and power spectra, we find large-scale dynamo action in the early MRI growth phase – a previously unidentified feature. Non-axisymmetric linear MRI modes with low horizontal wavenumbers and vertical wavenumbers near that of expected maximal growth amplify the large-scale fields exponentially before turbulence and high wavenumber fluctuations arise. Thus the large-scale dynamo requires only linear fluctuations but not non-linear turbulence (as defined by mode–mode coupling). Vertical averaging also allows for monitoring the evolution of the large-scale vertical field, and we find that feedback from horizontal low wavenumber MRI modes provides a clue as to why the large-scale vertical field is sustained against turbulent diffusion in the non-linear saturation regime. We compute the terms in the mean field equations to identify the individual contributions to large-scale field growth for both types of averaging. The large-scale fields obtained from vertical averaging are found to compare well with global simulations and quasi-linear analytical analysis from a previous study by Ebrahimi & Blackman. We discuss the potential implications of these new results for understanding the large-scale MRI dynamo saturation and turbulence.
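    The distinction between the two averaging choices is easy to make concrete: for a field sampled on an (x, y, z) grid, horizontal (x-y) averaging leaves a mean-field profile in z, while vertical (y-z) averaging leaves one in x. A minimal pure-Python sketch (the simulations themselves of course operate on full 3-D MHD fields):

```python
def xy_average(field):
    """Horizontal (x-y) planar average of field[ix][iy][iz]: profile over z."""
    nx, ny, nz = len(field), len(field[0]), len(field[0][0])
    return [sum(field[ix][iy][iz] for ix in range(nx) for iy in range(ny)) / (nx * ny)
            for iz in range(nz)]

def yz_average(field):
    """Vertical (y-z) planar average of field[ix][iy][iz]: profile over x."""
    nx, ny, nz = len(field), len(field[0]), len(field[0][0])
    return [sum(field[ix][iy][iz] for iy in range(ny) for iz in range(nz)) / (ny * nz)
            for ix in range(nx)]
```

    Tracking these two profiles in time gives exactly the space-time planar averaged fields whose growth the abstract describes.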

  9. An energy efficient and high speed architecture for convolution computing based on binary resistive random access memory

    NASA Astrophysics Data System (ADS)

    Liu, Chen; Han, Runze; Zhou, Zheng; Huang, Peng; Liu, Lifeng; Liu, Xiaoyan; Kang, Jinfeng

    2018-04-01

    In this work we present a novel convolution computing architecture based on metal oxide resistive random access memory (RRAM) to process the image data stored in the RRAM arrays. The proposed image storage architecture offers better speed and device-consumption efficiency than the previous kernel storage architecture. We further improve the architecture for high-accuracy, low-power computing by utilizing binary storage and a series resistor. For a 28 × 28 image and 10 kernels with a size of 3 × 3, compared with the previous kernel storage approach, the newly proposed architecture shows excellent performance, including: 1) almost 100% accuracy within 20% LRS variation and 90% HRS variation; 2) a more than 67-fold speed boost; 3) 71.4% energy saving.
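    As a rough sketch of how a binary RRAM crossbar evaluates the multiply-accumulate at the heart of convolution: each column sums currents I = sum(V_i * G_i) by Kirchhoff's law, with each binary weight stored as a low-resistance (1) or high-resistance (0) cell. The conductance values below are assumptions for illustration, not device data from the article.

```python
# Hypothetical conductance values in siemens (assumed for illustration);
# real devices spread around these, hence the LRS/HRS variation tolerances
# quoted in the abstract.
G_LRS = 1.0e-4   # low-resistance state, stores binary weight 1
G_HRS = 1.0e-6   # high-resistance state, stores binary weight 0

def column_current(input_volts, binary_weights):
    """Kirchhoff summation along one crossbar column: I = sum(V_i * G_i),
    i.e. an analog dot product between the input voltages and the
    binary weights stored as cell conductances."""
    return sum(v * (G_LRS if w else G_HRS)
               for v, w in zip(input_volts, binary_weights))
```

    Unrolling each 3 × 3 kernel window of the stored image onto nine rows of such a column yields one convolution output per read operation.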

  10. A Cervico-Thoraco-Lumbar Multibody Dynamic Model for the Estimation of Joint Loads and Muscle Forces.

    PubMed

    Khurelbaatar, Tsolmonbaatar; Kim, Kyungsoo; Hyuk Kim, Yoon

    2015-11-01

    Because direct measurement of joint loads is difficult, computational musculoskeletal models have been developed to predict mechanical joint loads on the human spine, such as the forces and moments applied to vertebral and facet joints and the forces that act on ligaments and muscles. However, many whole-spine models lack certain elements. For example, the detailed facet joints in the cervical region or the whole spine region may not be implemented. In this study, a detailed cervico-thoraco-lumbar multibody musculoskeletal model with all major ligaments, separated structures of facet contact and intervertebral disk joints, and the rib cage was developed. The model was validated by comparing the intersegmental rotations, ligament tensile forces, facet joint contact forces, compressive and shear forces on disks, and muscle forces to those reported in previous experimental and computational studies, both by region (cervical, thoracic, or lumbar) and for the whole model. The comparisons demonstrated that our whole spine model is consistent with in vitro and in vivo experimental studies and with computational studies. The model developed in this study can be used in further studies to better understand spine structures and injury mechanisms of spinal disorders.

  11. Toward Bridging the Mechanistic Gap between Genes and Traits by Emphasizing the Role of Proteins in a Computational Environment

    ERIC Educational Resources Information Center

    Haskel-Ittah, Michal; Yarden, Anat

    2017-01-01

    Previous studies have shown that students often ignore molecular mechanisms when describing genetic phenomena. Specifically, students tend to directly link genes to their encoded traits, ignoring the role of proteins as mediators in this process. We tested the ability of 10th grade students to connect genes to traits through proteins, using…

  12. Investigation of models for large-scale meteorological prediction experiments

    NASA Technical Reports Server (NTRS)

    Spar, J.

    1973-01-01

    Studies are reported of the long term responses of the model atmosphere to anomalies in snow cover and sea surface temperature. An abstract of a previously issued report on the computed response to surface anomalies in a global atmospheric model is presented, and the experiments on the effects of transient sea surface temperature on the Mintz-Arakawa atmospheric model are reported.

  13. On the Importance of Spatial Resolution for Flap Side Edge Noise Prediction

    NASA Technical Reports Server (NTRS)

    Mineck, Raymond E.; Khorrami, Mehdi R.

    2017-01-01

    A spatial resolution study of flap tip flow and the effects on the farfield noise signature for an 18%-scale, semispan Gulfstream aircraft model are presented. The NASA FUN3D unstructured, compressible Navier-Stokes solver was used to perform the highly resolved, time-dependent, detached eddy simulations of the flow field associated with the flap for this high-fidelity aircraft model. Following our previous work on the same model, the latest computations were undertaken to determine the causes of deficiencies observed in our earlier predictions of the steady and unsteady surface pressures and off-surface flow field at the flap tip regions, in particular the outboard tip area, where the presence of a cavity at the side-edge produces very complex flow features and interactions. The present results show gradual improvement in steady loading at the outboard flap edge region with increasing spatial resolution, yielding more accurate fluctuating surface pressures, off-surface flow field, and farfield noise with improved high-frequency content when compared with wind tunnel measurements. The spatial resolution trends observed in the present study demonstrate that the deficiencies reported in our previous computations are mostly caused by inadequate spatial resolution and are not related to the turbulence model.

  14. Control of Flow Structure in Square Cross-Sectioned U Bend using Numerical Modeling

    NASA Astrophysics Data System (ADS)

    Yavuz, Mehmet Metin; Guden, Yigitcan

    2014-11-01

    Due to the curvature in U-bends, the flow development involves complex flow structures, including Dean vortices and high levels of turbulence, that are quite critical in considering noise problems and structural failure of the ducts. Computational fluid dynamics (CFD) models are developed using ANSYS Fluent to analyze and to control the flow structure in a square cross-sectioned U-bend with a radius of curvature Rc/D = 0.65. The predictions of velocity profiles at different angular positions of the U-bend are compared against the experimental results available in the literature and the previous numerical studies. The performances of different turbulence models are evaluated to propose the best numerical approach that has high accuracy with reduced computation time. The numerical results of the present study indicate improvements with respect to the previous numerical predictions and very good agreement with the available experimental results. In addition, a flow control technique is utilized to regulate the flow inside the bend. The elimination of Dean vortices, along with a significant reduction in turbulence levels in different cross-flow planes, is successfully achieved when the flow control technique is applied. The project is supported by Meteksan Defense Industries, Inc.

  15. High Speed Civil Transport (HSCT) Isolated Nacelle Transonic Boattail Drag Study and Results Using Computational Fluid Dynamics (CFD)

    NASA Technical Reports Server (NTRS)

    Midea, Anthony C.; Austin, Thomas; Pao, S. Paul; DeBonis, James R.; Mani, Mori

    2005-01-01

    Nozzle boattail drag is significant for the High Speed Civil Transport (HSCT) and can be as high as 25 percent of the overall propulsion system thrust at transonic conditions. Thus, nozzle boattail drag has the potential to create a thrust drag pinch and can reduce HSCT aircraft aerodynamic efficiencies at transonic operating conditions. In order to accurately predict HSCT performance, it is imperative that nozzle boattail drag be accurately predicted. Previous methods to predict HSCT nozzle boattail drag were suspect in the transonic regime. In addition, previous prediction methods were unable to account for complex nozzle geometry and were not flexible enough for engine cycle trade studies. A computational fluid dynamics (CFD) effort was conducted by NASA and McDonnell Douglas to evaluate the magnitude and characteristics of HSCT nozzle boattail drag at transonic conditions. A team of engineers used various CFD codes and provided consistent, accurate boattail drag coefficient predictions for a family of HSCT nozzle configurations. The CFD results were incorporated into a nozzle drag database that encompassed the entire HSCT flight regime and provided the basis for an accurate and flexible prediction methodology.

  16. High Speed Civil Transport (HSCT) Isolated Nacelle Transonic Boattail Drag Study and Results Using Computational Fluid Dynamics (CFD)

    NASA Technical Reports Server (NTRS)

    Midea, Anthony C.; Austin, Thomas; Pao, S. Paul; DeBonis, James R.; Mani, Mori

    1999-01-01

    Nozzle boattail drag is significant for the High Speed Civil Transport (HSCT) and can be as high as 25% of the overall propulsion system thrust at transonic conditions. Thus, nozzle boattail drag has the potential to create a thrust-drag pinch and can reduce HSCT aircraft aerodynamic efficiencies at transonic operating conditions. In order to accurately predict HSCT performance, it is imperative that nozzle boattail drag be accurately predicted. Previous methods to predict HSCT nozzle boattail drag were suspect in the transonic regime. In addition, previous prediction methods were unable to account for complex nozzle geometry and were not flexible enough for engine cycle trade studies. A computational fluid dynamics (CFD) effort was conducted by NASA and McDonnell Douglas to evaluate the magnitude and characteristics of HSCT nozzle boattail drag at transonic conditions. A team of engineers used various CFD codes and provided consistent, accurate boattail drag coefficient predictions for a family of HSCT nozzle configurations. The CFD results were incorporated into a nozzle drag database that encompassed the entire HSCT flight regime and provided the basis for an accurate and flexible prediction methodology.

  17. Crime or War: Cyberspace Law and Its Implications for Intelligence

    DTIC Science & Technology

    2011-02-11

    ...protected computer or gaining and using information in a manner exceeding authorized access. Robert Morris, a Cornell University computer science...hacker and not the Iranian government. There are hundreds of hackers conducting computer intrusions each day. The previously cited example of Robert

  18. Class and Homework Problems: The Break-Even Radius of Insulation Computed Using Excel Solver and WolframAlpha

    ERIC Educational Resources Information Center

    Foley, Greg

    2014-01-01

    A problem that illustrates two ways of computing the break-even radius of insulation is outlined. The problem is suitable for students who are taking an introductory module in heat transfer or transport phenomena and who have some previous knowledge of the numerical solution of non-linear algebraic equations. The potential for computer algebra,…
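    Both solution routes mentioned (Excel Solver and WolframAlpha) reduce to a one-dimensional root-finding problem: find the insulation radius at which the heat loss returns to the bare-pipe value. A sketch in Python using simple bisection, with property values that are illustrative assumptions rather than values from the article:

```python
import math

# Illustrative properties (assumed): insulation conductivity k [W/m.K],
# outer convection coefficient h [W/m^2.K], bare pipe outer radius r1 [m].
k, h, r1 = 0.2, 5.0, 0.01

def resistance(r):
    """Thermal resistance per unit length (2*pi factored out) of an
    insulated cylinder: conduction through insulation plus outer convection."""
    return math.log(r / r1) / k + 1.0 / (h * r)

def break_even_radius(lo=None, hi=1.0):
    """Bisection for the break-even radius, where heat loss equals the
    bare-pipe value, i.e. resistance(r) == resistance(r1). The search
    starts past the critical radius r_c = k/h, where resistance is minimal."""
    target = 1.0 / (h * r1)           # bare-pipe resistance
    lo = k / h if lo is None else lo
    f = lambda r: resistance(r) - target
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if f(lo) * f(mid) <= 0.0:
            hi = mid
        else:
            lo = mid
    return 0.5 * (lo + hi)
```

    Excel Solver and WolframAlpha perform essentially this search; bisection simply makes the iteration explicit for students.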

  19. A Computer Program for Solving a Set of Conditional Maximum Likelihood Equations Arising in the Rasch Model for Questionnaires.

    ERIC Educational Resources Information Center

    Andersen, Erling B.

    A computer program for solving the conditional likelihood equations arising in the Rasch model for questionnaires is described. The estimation method and the computational problems involved are described in a previous research report by Andersen, but a summary of those results is given in two sections of this paper. A working example is also…
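    Although the estimation details live in Andersen's earlier report, the core of any conditional maximum likelihood computation for the Rasch model is the set of elementary symmetric functions of the item parameters, which a simple recurrence yields in O(n^2) time. A generic textbook sketch, not Andersen's actual program:

```python
def elementary_symmetric(eps):
    """Elementary symmetric functions gamma_0..gamma_n of the item easiness
    parameters eps_i = exp(-beta_i), computed by the summation recurrence
    gamma[r] += e * gamma[r-1]. These normalize the conditional likelihood
    given the raw score."""
    gamma = [1.0] + [0.0] * len(eps)
    n = 0
    for e in eps:
        n += 1
        for r in range(n, 0, -1):
            gamma[r] += e * gamma[r - 1]
    return gamma

def cond_pattern_prob(eps, pattern):
    """Conditional probability of a 0/1 response pattern given its raw
    score r, which is free of the person (ability) parameter."""
    r = sum(pattern)
    num = 1.0
    for e, x in zip(eps, pattern):
        if x:
            num *= e
    return num / elementary_symmetric(eps)[r]
```

    The conditional likelihood equations are then obtained by differentiating the log of these pattern probabilities with respect to the item parameters.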

  20. Measurements of Flat-Flame Velocities of Diethyl Ether in Air

    PubMed Central

    Gillespie, Fiona; Metcalfe, Wayne K.; Dirrenberger, Patricia; Herbinet, Olivier; Glaude, Pierre-Alexandre; Battin-Leclerc, Frédérique; Curran, Henry J.

    2013-01-01

    This study presents new adiabatic laminar burning velocities of diethyl ether in air, measured on a flat-flame burner using the heat flux method. The experimental pressure was 1 atm and temperatures of the fresh gas mixture ranged from 298 to 398 K. Flame velocities were recorded at equivalence ratios from 0.55 to 1.60, for which stabilization of the flame was possible. The maximum laminar burning velocity was found at an equivalence ratio of 1.10 or 1.15 at different temperatures. These results are compared with experimental and computational data reported in the literature. The data reported in this study deviate significantly from previous experimental results and are well-predicted by a previously reported chemical kinetic mechanism. PMID:23710107

  1. The Cyborg Astrobiologist: testing a novelty detection algorithm on two mobile exploration systems at Rivas Vaciamadrid in Spain and at the Mars Desert Research Station in Utah

    NASA Astrophysics Data System (ADS)

    McGuire, P. C.; Gross, C.; Wendt, L.; Bonnici, A.; Souza-Egipsy, V.; Ormö, J.; Díaz-Martínez, E.; Foing, B. H.; Bose, R.; Walter, S.; Oesker, M.; Ontrup, J.; Haschke, R.; Ritter, H.

    2010-01-01

    In previous work, a platform was developed for testing computer-vision algorithms for robotic planetary exploration. This platform consisted of a digital video camera connected to a wearable computer for real-time processing of images at geological and astrobiological field sites. The real-time processing included image segmentation and the generation of interest points based upon uncommonness in the segmentation maps. Also in previous work, this platform for testing computer-vision algorithms has been ported to a more ergonomic alternative platform, consisting of a phone camera connected via the Global System for Mobile Communications (GSM) network to a remote-server computer. The wearable-computer platform has been tested at geological and astrobiological field sites in Spain (Rivas Vaciamadrid and Riba de Santiuste), and the phone camera has been tested at a geological field site in Malta. In this work, we (i) apply a Hopfield neural-network algorithm for novelty detection based upon colour, (ii) integrate a field-capable digital microscope on the wearable computer platform, (iii) test this novelty detection with the digital microscope at Rivas Vaciamadrid, (iv) develop a Bluetooth communication mode for the phone-camera platform, in order to allow access to a mobile processing computer at the field sites, and (v) test the novelty detection on the Bluetooth-enabled phone camera connected to a netbook computer at the Mars Desert Research Station in Utah. This systems engineering and field testing have together allowed us to develop a real-time computer-vision system that is capable, for example, of identifying lichens as novel within a series of images acquired in semi-arid desert environments. We acquired sequences of images of geologic outcrops in Utah and Spain consisting of various rock types and colours to test this algorithm. 
The algorithm robustly recognized previously observed units by their colour, while requiring only a single image or a few images to learn colours as familiar, demonstrating its fast learning capability.
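    As an illustration of the learn-then-flag workflow described above, here is a toy colour-novelty detector based on a lookup table of quantized colours. This is a hedged stand-in: the actual system uses a Hopfield neural network, and the bin size below is an arbitrary assumption.

```python
def quantize(rgb, step=32):
    """Coarse colour bin for an (r, g, b) pixel; step is an assumed bin width."""
    return tuple(c // step for c in rgb)

class ColorNoveltyDetector:
    """Toy stand-in for colour-based novelty detection: colours seen in
    training images become 'familiar'; pixels in unseen bins are novel."""
    def __init__(self):
        self.familiar = set()

    def learn(self, image):
        """Mark every colour bin occurring in the image as familiar."""
        for px in image:
            self.familiar.add(quantize(px))

    def novelty(self, image):
        """Fraction of pixels whose colour bin was never seen before."""
        novel = [px for px in image if quantize(px) not in self.familiar]
        return len(novel) / len(image)
```

    Like the system described above, such a detector needs only one or a few images to mark a colour as familiar, since learning is just set insertion.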

  2. Breast tumor malignancy modelling using evolutionary neural logic networks.

    PubMed

    Tsakonas, Athanasios; Dounias, Georgios; Panagi, Georgia; Panourgias, Evangelia

    2006-01-01

    The present work proposes a computer-assisted methodology for the effective modelling of the diagnostic decision for breast tumor malignancy. The suggested approach is based on innovative hybrid computational intelligence algorithms applied to related cytological data contained in past medical records. The experimental data used in this study were gathered in the early 1990s at the University of Wisconsin, based on post-diagnostic cytological observations performed by expert medical staff. Data were properly encoded in a computer database and, accordingly, various alternative modelling techniques were applied to them in an attempt to form diagnostic models. Previous methods included standard optimisation techniques as well as artificial intelligence approaches, such that a variety of related publications exists in the modern literature on the subject. In this report, a hybrid computational intelligence approach is suggested, which effectively combines modern mathematical logic principles, neural computation and genetic programming. The approach proves promising both in terms of diagnostic accuracy and generalization capabilities, and in terms of comprehensibility and practical importance for the related medical staff.

  3. On finding bicliques in bipartite graphs: a novel algorithm and its application to the integration of diverse biological data types

    PubMed Central

    2014-01-01

    Background Integrating and analyzing heterogeneous genome-scale data is a huge algorithmic challenge for modern systems biology. Bipartite graphs can be useful for representing relationships across pairs of disparate data types, with the interpretation of these relationships accomplished through an enumeration of maximal bicliques. Most previously-known techniques are generally ill-suited to this foundational task, because they are relatively inefficient and without effective scaling. In this paper, a powerful new algorithm is described that produces all maximal bicliques in a bipartite graph. Unlike most previous approaches, the new method neither places undue restrictions on its input nor inflates the problem size. Efficiency is achieved through an innovative exploitation of bipartite graph structure, and through computational reductions that rapidly eliminate non-maximal candidates from the search space. An iterative selection of vertices for consideration based on non-decreasing common neighborhood sizes boosts efficiency and leads to more balanced recursion trees. Results The new technique is implemented and compared to previously published approaches from graph theory and data mining. Formal time and space bounds are derived. Experiments are performed on both random graphs and graphs constructed from functional genomics data. It is shown that the new method substantially outperforms the best previous alternatives. Conclusions The new method is streamlined, efficient, and particularly well-suited to the study of huge and diverse biological data. A robust implementation has been incorporated into GeneWeaver, an online tool for integrating and analyzing functional genomics experiments, available at http://geneweaver.org. The enormous increase in scalability it provides empowers users to study complex and previously unassailable gene-set associations between genes and their biological functions in a hierarchical fashion and on a genome-wide scale. 
This practical computational resource is adaptable to almost any applications environment in which bipartite graphs can be used to model relationships between pairs of heterogeneous entities. PMID:24731198
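    For intuition about the underlying task (not the scalable algorithm of the paper), a brute-force enumeration of maximal bicliques in a small bipartite graph can be written in a few lines; it is exponential in the left vertex class and only suitable for toy inputs.

```python
from itertools import combinations

def maximal_bicliques(adj, left):
    """Enumerate all maximal bicliques of a bipartite graph given as
    adj: left vertex -> set of right neighbours. For each subset of the
    left class, take its common right neighbourhood, then close the left
    side; both closures together guarantee maximality."""
    found = set()
    for k in range(1, len(left) + 1):
        for subset in combinations(left, k):
            # common neighbourhood on the right
            right = set(adj[subset[0]])
            for u in subset[1:]:
                right &= adj[u]
            if not right:
                continue
            # expand the left side to every vertex adjacent to all of `right`
            full_left = frozenset(u for u in left if right <= adj[u])
            found.add((full_left, frozenset(right)))
    return found
```

    The algorithm described above reaches the same output set while avoiding this exponential sweep through structural pruning of non-maximal candidates.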

  4. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    Some of the major technical questions associated with the burial of radioactive high-level wastes in geologic formations are related to the thermal environments generated by the waste and the impact of this dissipated heat on the surrounding environment. The design of a high level waste storage facility must be such that the temperature variations that occur do not adversely affect operating personnel and equipment. The objective of this investigation was to assist OWI by determining the thermal environment that would be experienced by personnel and equipment in a waste storage facility in salt. Particular emphasis was placed on determining the maximum floor and air temperatures with and without ventilation in the first 30 years after waste emplacement. The assumed facility design differs somewhat from those previously analyzed and reported, but many of the previous parametric surveys are useful for comparison. In this investigation a number of 2-dimensional and 3-dimensional simulations of the heat flow in a repository have been performed on the HEATING5 and TRUMP heat transfer codes. The representative repository constructs used in the simulations are described, as well as the computational models and computer codes. Results of the simulations are presented and discussed. Comparisons are made between the recent results and those from previous analyses. Finally, a summary of study limitations, comparisons, and conclusions is given.

  5. Computational analysis of high resolution unsteady airloads for rotor aeroacoustics

    NASA Technical Reports Server (NTRS)

    Quackenbush, Todd R.; Lam, C.-M. Gordon; Wachspress, Daniel A.; Bliss, Donald B.

    1994-01-01

    The study of helicopter aerodynamic loading for acoustics applications requires the application of efficient yet accurate simulations of the velocity field induced by the rotor's vortex wake. This report summarizes work to date on the development of such an analysis, which builds on the Constant Vorticity Contour (CVC) free wake model, previously implemented for the study of vibratory loading in the RotorCRAFT computer code. The present effort has focused on implementation of an airload reconstruction approach that computes high resolution airload solutions of rotor/rotor-wake interactions required for acoustics computations. Supplementary efforts on the development of improved vortex core modeling, unsteady aerodynamic effects, higher spatial resolution of rotor loading, and fast vortex wake implementations have substantially enhanced the capabilities of the resulting software, denoted RotorCRAFT/AA (AeroAcoustics). Results of validation calculations using recently acquired model rotor data show that by employing airload reconstruction it is possible to apply the CVC wake analysis with temporal and spatial resolution suitable for acoustics applications while reducing the computation time required by one to two orders of magnitude relative to that required by direct calculations. Promising correlation with this body of airload and noise data has been obtained for a variety of rotor configurations and operating conditions.

  6. Simulating coupled dynamics of a rigid-flexible multibody system and compressible fluid

    NASA Astrophysics Data System (ADS)

    Hu, Wei; Tian, Qiang; Hu, HaiYan

    2018-04-01

    As a sequel to the authors' previous studies, a new parallel computation approach is proposed to simulate the coupled dynamics of a rigid-flexible multibody system and a compressible fluid. In this approach, the smoothed particle hydrodynamics (SPH) method is used to model the compressible fluid, while the natural coordinate formulation (NCF) and absolute nodal coordinate formulation (ANCF) are used to model the rigid and flexible bodies, respectively. In order to model the compressible fluid properly and efficiently via the SPH method, three measures are taken. The first is to use a Riemann solver to cope with the fluid compressibility, the second is to define virtual SPH particles to model the dynamic interaction between the fluid and the multibody system, and the third is to impose boundary conditions of periodic inflow and outflow to reduce the number of SPH particles involved in the computation. A parallel computation strategy is then proposed based on the graphics processing unit (GPU) to detect neighboring SPH particles and to solve the dynamic equations of the SPH particles, improving computational efficiency. Meanwhile, the generalized-alpha algorithm is used to solve the dynamic equations of the multibody system. Finally, four case studies are given to validate the proposed parallel computation approach.

  7. Atrial Fibrillation Screening in Nonmetropolitan Areas Using a Telehealth Surveillance System With an Embedded Cloud-Computing Algorithm: Prospective Pilot Study.

    PubMed

    Chen, Ying-Hsien; Hung, Chi-Sheng; Huang, Ching-Chang; Hung, Yu-Chien; Hwang, Juey-Jen; Ho, Yi-Lwun

    2017-09-26

    Atrial fibrillation (AF) is a common form of arrhythmia that is associated with increased risk of stroke and mortality. Detecting AF before the first complication occurs is a recognized priority. No previous studies have examined the feasibility of undertaking AF screening using a telehealth surveillance system with an embedded cloud-computing algorithm; we address this issue in this study. The objective of this study was to evaluate the feasibility of AF screening in nonmetropolitan areas using a telehealth surveillance system with an embedded cloud-computing algorithm. We conducted a prospective AF screening study in a nonmetropolitan area using a single-lead electrocardiogram (ECG) recorder. All ECG measurements were reviewed on the telehealth surveillance system and interpreted by the cloud-computing algorithm and a cardiologist. The process of AF screening was evaluated with a satisfaction questionnaire. Between March 11, 2016 and August 31, 2016, 967 ECGs were recorded from 922 residents in nonmetropolitan areas. A total of 22 (2.4%, 22/922) residents with AF were identified by the physician's ECG interpretation, and only 0.2% (2/967) of ECGs contained significant artifacts. The novel cloud-computing algorithm for AF detection had a sensitivity of 95.5% (95% CI 77.2%-99.9%) and specificity of 97.7% (95% CI 96.5%-98.5%). The overall satisfaction score for the process of AF screening was 92.1%. AF screening in nonmetropolitan areas using a telehealth surveillance system with an embedded cloud-computing algorithm is feasible. ©Ying-Hsien Chen, Chi-Sheng Hung, Ching-Chang Huang, Yu-Chien Hung, Juey-Jen Hwang, Yi-Lwun Ho. Originally published in JMIR Mhealth and Uhealth (http://mhealth.jmir.org), 26.09.2017.
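    The reported sensitivity and specificity follow from a standard confusion-matrix computation; as a sketch, with counts chosen for illustration rather than taken from the study:

```python
def screening_metrics(tp, fn, tn, fp):
    """Sensitivity and specificity of a binary screening test.

    tp/fn: AF cases correctly flagged / missed by the algorithm.
    tn/fp: non-AF cases correctly cleared / falsely flagged.
    """
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    return sensitivity, specificity
```

    The confidence intervals quoted in the abstract would then come from interval estimates (e.g. exact binomial) on these two proportions.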

  8. Resource-Efficient, Hierarchical Auto-Tuning of a Hybrid Lattice Boltzmann Computation on the Cray XT4

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Computational Research Division, Lawrence Berkeley National Laboratory; NERSC, Lawrence Berkeley National Laboratory; Computer Science Department, University of California, Berkeley

    2009-05-04

    We apply auto-tuning to a hybrid MPI-pthreads lattice Boltzmann computation running on the Cray XT4 at the National Energy Research Scientific Computing Center (NERSC). Previous work showed that multicore-specific auto-tuning can improve the performance of lattice Boltzmann magnetohydrodynamics (LBMHD) by a factor of 4x when running on dual- and quad-core Opteron dual-socket SMPs. We extend these studies to the distributed memory arena via a hybrid MPI/pthreads implementation. In addition to conventional auto-tuning at the local SMP node, we tune at the message-passing level to determine the optimal aspect ratio as well as the correct balance between MPI tasks and threads per MPI task. Our study presents a detailed performance analysis when moving along an isocurve of constant hardware usage: fixed total memory, total cores, and total nodes. Overall, our work points to approaches for improving intra- and inter-node efficiency on large-scale multicore systems for demanding scientific applications.
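    The balance between MPI tasks and threads per task explored above can be illustrated with a toy enumeration of the candidate hybrid decompositions at fixed hardware usage (threads confined to a node, every core used). This sketches the search space only, not the auto-tuner's code:

```python
def hybrid_decompositions(cores_per_node, nodes):
    """All (MPI tasks, threads per task) splits that use every core exactly
    once, with threads restricted to divisors of the per-node core count so
    that a task's threads never span nodes."""
    total = cores_per_node * nodes
    return [(total // t, t)
            for t in range(1, cores_per_node + 1)
            if cores_per_node % t == 0]
```

    An auto-tuner benchmarks each such split (together with per-node code variants) and keeps the fastest, which is what makes the isocurve analysis above possible.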

  9. Dynamic Mesh CFD Simulations of Orion Parachute Pendulum Motion During Atmospheric Entry

    NASA Technical Reports Server (NTRS)

    Halstrom, Logan D.; Schwing, Alan M.; Robinson, Stephen K.

    2016-01-01

    This paper demonstrates the use of computational fluid dynamics to study the effects of the pendulum motion dynamics of NASA's Orion Multi-Purpose Crew Vehicle parachute system on the stability of the vehicle's atmospheric entry and descent. Significant computational fluid dynamics testing has already been performed at NASA's Johnson Space Center, but this study sought to investigate the effect of bulk motion of the parachute, such as pitching, on the induced aerodynamic forces. Simulations were performed with a moving grid geometry oscillating according to the parameters observed in flight tests. As with the previous simulations, the OVERFLOW computational fluid dynamics tool is used with the assumption of rigid, non-permeable geometry. Comparison to parachute wind tunnel tests is included for a preliminary validation of the dynamic mesh model. Results show qualitative differences in the flow fields of the static and dynamic simulations and quantitative differences in the induced aerodynamic forces, suggesting that dynamic mesh modeling of the parachute pendulum motion may uncover additional dynamic effects.

  10. Scatter correction for cone-beam computed tomography using self-adaptive scatter kernel superposition

    NASA Astrophysics Data System (ADS)

    Xie, Shi-Peng; Luo, Li-Min

    2012-06-01

    The authors propose a combined scatter reduction and correction method to improve image quality in cone beam computed tomography (CBCT). The scatter kernel superposition (SKS) method has been used occasionally in previous studies. The present method differs in that a scatter detecting blocker (SDB) is placed between the X-ray source and the tested object to model a self-adaptive scatter kernel. This study first evaluates the scatter kernel parameters using the SDB, and then isolates the scatter distribution based on the SKS. Image quality can be improved by removing the scatter distribution. The results show that the method can effectively reduce scatter artifacts and increase image quality. Our approach increases the image contrast and reduces the magnitude of cupping. The accuracy of the SKS technique is significantly improved in our method by using a self-adaptive scatter kernel. This method is computationally efficient, easy to implement, and provides scatter correction using a single scan acquisition.
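    The superposition step itself can be sketched generically in 1-D: estimate scatter by convolving the current primary estimate with a scatter kernel, subtract it from the measurement, and iterate. This is a hedged illustration of the SKS idea only; the paper's self-adaptive kernel estimation from the SDB is not modeled here.

```python
def convolve(signal, kernel):
    """Same-length 1-D convolution with a centred kernel and zero padding."""
    half = len(kernel) // 2
    out = []
    for i in range(len(signal)):
        acc = 0.0
        for j, kv in enumerate(kernel):
            idx = i + j - half
            if 0 <= idx < len(signal):
                acc += signal[idx] * kv
        out.append(acc)
    return out

def sks_correct(measured, kernel, iterations=5):
    """Iterative scatter-kernel-superposition correction: the scatter
    estimate is the superposition (convolution) of the kernel over the
    current primary estimate, which is refined by subtraction."""
    primary = list(measured)
    for _ in range(iterations):
        scatter = convolve(primary, kernel)
        primary = [m - s for m, s in zip(measured, scatter)]
    return primary
```

    In CBCT practice the same scheme runs in 2-D on each projection, with kernel parameters fitted from the SDB measurement rather than assumed.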

  11. Increasing the sampling efficiency of protein conformational transition using velocity-scaling optimized hybrid explicit/implicit solvent REMD simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yu, Yuqi; Wang, Jinan; Shao, Qiang, E-mail: qshao@mail.shcnc.ac.cn, E-mail: Jiye.Shi@ucb.com, E-mail: wlzhu@mail.shcnc.ac.cn

    2015-03-28

    The application of temperature replica exchange molecular dynamics (REMD) simulation to protein motion is limited by its huge requirement of computational resources, particularly when an explicit solvent model is implemented. In a previous study, we developed a velocity-scaling optimized hybrid explicit/implicit solvent REMD method with the hope of reducing the number of temperatures (replicas) while maintaining high sampling efficiency. In this study, we utilized this method to characterize and energetically identify the conformational transition pathway of a protein model, the N-terminal domain of calmodulin. In comparison to the standard explicit solvent REMD simulation, the hybrid REMD is much less computationally expensive but, meanwhile, gives accurate evaluation of the structural and thermodynamic properties of the conformational transition, which are in good agreement with the standard REMD simulation. Therefore, the hybrid REMD can greatly increase computational efficiency and thus expand the application of REMD simulation to larger protein systems.
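    For context, the exchange step at the heart of any temperature REMD scheme accepts a swap between replicas i and j with the Metropolis probability min(1, exp[(1/kT_i - 1/kT_j)(E_i - E_j)]). A generic sketch in reduced units, not the hybrid-solvent implementation itself:

```python
import math

def exchange_probability(e_i, e_j, t_i, t_j, k_b=1.0):
    """Metropolis acceptance probability for swapping configurations
    between REMD replicas i and j at temperatures t_i and t_j with
    potential energies e_i and e_j (reduced units by default)."""
    delta = (1.0 / (k_b * t_i) - 1.0 / (k_b * t_j)) * (e_i - e_j)
    return min(1.0, math.exp(delta))
```

    The replica count the method above tries to reduce enters here: acceptance stays usable only when adjacent temperatures have overlapping energy distributions, which is what forces many replicas in fully explicit solvent.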

  12. Numerical formulation for the prediction of solid/liquid change of a binary alloy

    NASA Technical Reports Server (NTRS)

    Schneider, G. E.; Tiwari, S. N.

    1990-01-01

    A computational model is presented for the prediction of solid/liquid phase change energy transport including the influence of free convection fluid flow in the liquid phase region. The computational model considers the velocity components of all non-liquid phase change material control volumes to be zero but fully solves the coupled mass-momentum problem within the liquid region. The thermal energy model includes the entire domain and uses an enthalpy-like model and a recently developed method for handling the phase change interface nonlinearity. Convergence studies are performed and comparisons made with experimental data for two different problem specifications. The convergence studies indicate that grid independence was achieved and the comparison with experimental data indicates excellent quantitative prediction of the melt fraction evolution. Qualitative data are also provided in the form of velocity vector diagrams and isotherm plots for selected times in the evolution of both problems. The computational costs incurred are quite low by comparison with previous efforts on solving these problems.
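
    Enthalpy-type formulations recover temperature from enthalpy with a latent-heat plateau at the melting point, which absorbs the interface nonlinearity into an algebraic relation. The sketch below is the standard single-component form, not the paper's binary-alloy formulation; all symbols are illustrative.

```python
def temperature_from_enthalpy(H, c_s, c_l, L, T_m):
    """Recover temperature from volumetric enthalpy H in a fixed-grid
    enthalpy method (single melting point T_m, illustrative only).

    H < c_s*T_m             : fully solid
    c_s*T_m <= H <= +L      : interface region, T held at T_m
    H > c_s*T_m + L         : fully liquid
    """
    H_s = c_s * T_m              # enthalpy at the onset of melting
    if H < H_s:
        return H / c_s           # solid branch
    if H <= H_s + L:
        return T_m               # latent-heat plateau
    return T_m + (H - H_s - L) / c_l  # liquid branch
```

The local liquid fraction follows as (H - H_s) / L on the plateau, which is how the melt-fraction evolution reported above would be extracted.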

  13. Scour around vertical wall abutment in cohesionless sediment bed

    NASA Astrophysics Data System (ADS)

    Pandey, M.; Sharma, P. K.; Ahmad, Z.

    2017-12-01

    At the time of floods, failure of bridges is a major disaster, and the sub-structures (bridge abutments and piers) are mainly responsible for such failures. It is very risky if these sub-structures are not properly designed and analyzed. Scour is a natural phenomenon in rivers or streams caused by the erosive action of the flowing water on the bed and banks. The abutment undermines due to river-bed erosion and scouring, which is generally recognized as the main cause of abutment failure. Most previous studies on scour around abutments have concerned the prediction of the maximum scour depth (Lim, 1994; Melvill, 1992, 1997; Dey and Barbhuiya, 2005). Dey and Barbhuiya (2005) proposed a relationship, based on laboratory experiments, for computing the maximum scour depth around a vertical wall abutment, but it was confined to their experimental data only. This relationship needs to be verified against other researchers' data in order to support its reliability and wider applicability. In this study, controlled experiments have been carried out on scour near a vertical wall abutment. The data collected in this study, along with data of previous investigators, have been used to check the validity of the existing equations (Lim, 1994; Melvill, 1992, 1997; Dey and Barbhuiya, 2005) for maximum scour depth around a vertical wall abutment. A new relationship is proposed to estimate the maximum scour depth around a vertical wall abutment; it gives better results than the existing relationships.

  14. TRIM—3D: a three-dimensional model for accurate simulation of shallow water flow

    USGS Publications Warehouse

    Casulli, Vincenzo; Bertolazzi, Enrico; Cheng, Ralph T.

    1993-01-01

    A semi-implicit finite difference formulation for the numerical solution of three-dimensional tidal circulation is discussed. The governing equations are the three-dimensional Reynolds equations in which the pressure is assumed to be hydrostatic. A minimal degree of implicitness has been introduced in the finite difference formula so that the resulting algorithm permits the use of large time steps at a minimal computational cost. This formulation includes the simulation of flooding and drying of tidal flats, and is fully vectorizable for an efficient implementation on modern vector computers. The high computational efficiency of this method has made it possible to provide the fine details of circulation structure in complex regions that previous studies were unable to obtain. For proper interpretation of the model results suitable interactive graphics is also an essential tool.

  15. Hybrid transport and diffusion modeling using electron thermal transport Monte Carlo SNB in DRACO

    NASA Astrophysics Data System (ADS)

    Chenhall, Jeffrey; Moses, Gregory

    2017-10-01

    The iSNB (implicit Schurtz Nicolai Busquet) multigroup diffusion electron thermal transport method is adapted into an Electron Thermal Transport Monte Carlo (ETTMC) transport method to better model angular and long mean free path non-local effects. Previously, the ETTMC model had been implemented in the 2D DRACO multiphysics code and found to produce consistent results with the iSNB method. Current work is focused on a hybridization of the computationally slower but higher fidelity ETTMC transport method with the computationally faster iSNB diffusion method in order to maximize computational efficiency. Furthermore, effects on the energy distribution of the heat flux divergence are studied. Work to date on the hybrid method will be presented. This work was supported by Sandia National Laboratories and the Univ. of Rochester Laboratory for Laser Energetics.

  16. Surrogates for numerical simulations; optimization of eddy-promoter heat exchangers

    NASA Technical Reports Server (NTRS)

    Patera, Anthony T.

    1993-01-01

    Although the advent of fast and inexpensive parallel computers has rendered numerous previously intractable calculations feasible, many numerical simulations remain too resource-intensive to be directly inserted in engineering optimization efforts. An attractive alternative to direct insertion considers models for computational systems: the expensive simulation is evoked only to construct and validate a simplified input-output model; this simplified input-output model then serves as a simulation surrogate in subsequent engineering optimization studies. A simple 'Bayesian-validated' statistical framework for the construction, validation, and purposive application of static computer simulation surrogates is presented. As an example, dissipation-transport optimization of laminar-flow eddy-promoter heat exchangers is considered: parallel spectral element Navier-Stokes calculations serve to construct and validate surrogates for the flow rate and Nusselt number; these surrogates then represent the originating Navier-Stokes equations in the ensuing design process.

  17. Hollow cathodes as electron emitting plasma contactors: Theory and computer modeling

    NASA Technical Reports Server (NTRS)

    Davis, V. A.; Katz, I.; Mandell, M. J.; Parks, D. E.

    1987-01-01

    Several researchers have suggested using hollow cathodes as plasma contactors for electrodynamic tethers, particularly to prevent the Shuttle Orbiter from charging to large negative potentials. Previous studies have shown that fluid models with anomalous scattering can describe the electron transport in hollow cathode generated plasmas. An improved theory of the hollow cathode plasmas is developed and computational results using the theory are compared with laboratory experiments. Numerical predictions for a hollow cathode plasma source of the type considered for use on the Shuttle are presented, as are three-dimensional NASCAP/LEO calculations of the emitted ion trajectories and the resulting potentials in the vicinity of the Orbiter. The computer calculations show that the hollow cathode plasma source makes vastly superior contact with the ionospheric plasma compared with either an electron gun or passive ion collection by the Orbiter.

  18. Unmet needs for analyzing biological big data: A survey of 704 NSF principal investigators

    PubMed Central

    2017-01-01

    In a 2016 survey of 704 National Science Foundation (NSF) Biological Sciences Directorate principal investigators (BIO PIs), nearly 90% indicated they are currently or will soon be analyzing large data sets. BIO PIs considered a range of computational needs important to their work, including high performance computing (HPC), bioinformatics support, multistep workflows, updated analysis software, and the ability to store, share, and publish data. Previous studies in the United States and Canada emphasized infrastructure needs. However, BIO PIs said the most pressing unmet needs are training in data integration, data management, and scaling analyses for HPC—acknowledging that data science skills will be required to build a deeper understanding of life. This portends a growing data knowledge gap in biology and challenges institutions and funding agencies to redouble their support for computational training in biology. PMID:29049281

  19. Fire safety distances for open pool fires

    NASA Astrophysics Data System (ADS)

    Sudheer, S.; Kumar, Lokendra; Manjunath, B. S.; Pasi, Amit; Meenakshi, G.; Prabhu, S. V.

    2013-11-01

    Fire accidents, which carry huge losses with them, have increased more in the previous two decades than at any time in history. Hence, there is a need to understand the safety distances from different fires with different fuels. Fire safety distances are computed for different open pool fires. Diesel, gasoline and hexane are used as fuels for circular pool diameters of 0.5 m, 0.7 m and 1.0 m. A large square pool fire of 4 m × 4 m is also conducted with diesel as the fuel. All the prescribed distances in this study are purely based on thermal analysis. An IR camera is used to obtain thermal images of the pool fires, and thereby the irradiance at different locations is computed. The computed irradiance is presented together with the threshold heat flux limits for human beings.

  20. Unmet needs for analyzing biological big data: A survey of 704 NSF principal investigators.

    PubMed

    Barone, Lindsay; Williams, Jason; Micklos, David

    2017-10-01

    In a 2016 survey of 704 National Science Foundation (NSF) Biological Sciences Directorate principal investigators (BIO PIs), nearly 90% indicated they are currently or will soon be analyzing large data sets. BIO PIs considered a range of computational needs important to their work, including high performance computing (HPC), bioinformatics support, multistep workflows, updated analysis software, and the ability to store, share, and publish data. Previous studies in the United States and Canada emphasized infrastructure needs. However, BIO PIs said the most pressing unmet needs are training in data integration, data management, and scaling analyses for HPC, acknowledging that data science skills will be required to build a deeper understanding of life. This portends a growing data knowledge gap in biology and challenges institutions and funding agencies to redouble their support for computational training in biology.

  1. Cone beam computed tomography and intraoral radiography for diagnosis of dental abnormalities in dogs and cats

    PubMed Central

    Silva, Luiz Antonio F.; Barriviera, Mauricio; Januário, Alessandro L.; Bezerra, Ana Cristina B.; Fioravanti, Maria Clorinda S.

    2011-01-01

    The development of veterinary dentistry has substantially improved the ability to diagnose canine and feline dental abnormalities. Consequently, examinations previously performed only on humans are now available for small animals, thus improving the diagnostic quality. This has increased the need for technical qualification of veterinary professionals and increased technological investments. This study evaluated the use of cone beam computed tomography and intraoral radiography as complementary exams for diagnosing dental abnormalities in dogs and cats. Cone beam computed tomography provided faster image acquisition with high image quality, was associated with low ionizing radiation levels, enabled image editing, and reduced the exam duration. Our results showed that intraoral radiography was an effective method for dental examination with low cost and fast execution times, and can be performed during surgical procedures. PMID:22122905

  2. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, Xinxing; Bowen, Kit, E-mail: kbowen@jhu.edu

    We report a combined photoelectron spectroscopic and computational study of the o-dicarbadodecaborane (o-carborane) parent anion, (C₂B₁₀H₁₂)⁻. Previous studies that focused on the electrophilic nature of o-carborane led to tantalizing yet mixed results. In our study, we confirmed that o-carborane does in fact form a parent anion and that it has considerable stability. This anion is an isomer (“Anion iso 2”) in which, unlike in neutral o-carborane, the two carbon atoms are not bound.

  3. Computer-aided classification of lung nodules on computed tomography images via deep learning technique

    PubMed Central

    Hua, Kai-Lung; Hsu, Che-Hao; Hidayati, Shintami Chusnul; Cheng, Wen-Huang; Chen, Yu-Jen

    2015-01-01

    Lung cancer has a poor prognosis when not diagnosed early and unresectable lesions are present. The management of small lung nodules noted on computed tomography scan is controversial due to uncertain tumor characteristics. A conventional computer-aided diagnosis (CAD) scheme requires several image processing and pattern recognition steps to accomplish a quantitative tumor differentiation result. In such an ad hoc image analysis pipeline, every step depends heavily on the performance of the previous step. Accordingly, tuning of classification performance in a conventional CAD scheme is very complicated and arduous. Deep learning techniques, on the other hand, have the intrinsic advantage of automatic feature exploitation and performance tuning in a seamless fashion. In this study, we attempted to simplify the image analysis pipeline of conventional CAD with deep learning techniques. Specifically, we introduced models of a deep belief network and a convolutional neural network in the context of nodule classification in computed tomography images. Two baseline methods with feature computing steps were implemented for comparison. The experimental results suggest that deep learning methods could achieve better discriminative results and hold promise in the CAD application domain. PMID:26346558

  4. Characterizing quantum supremacy in near-term devices

    NASA Astrophysics Data System (ADS)

    Boixo, Sergio; Isakov, Sergei V.; Smelyanskiy, Vadim N.; Babbush, Ryan; Ding, Nan; Jiang, Zhang; Bremner, Michael J.; Martinis, John M.; Neven, Hartmut

    2018-06-01

    A critical question for quantum computing in the near future is whether quantum devices without error correction can perform a well-defined computational task beyond the capabilities of supercomputers. Such a demonstration of what is referred to as quantum supremacy requires a reliable evaluation of the resources required to solve tasks with classical approaches. Here, we propose the task of sampling from the output distribution of random quantum circuits as a demonstration of quantum supremacy. We extend previous results in computational complexity to argue that this sampling task must take exponential time in a classical computer. We introduce cross-entropy benchmarking to obtain the experimental fidelity of complex multiqubit dynamics. This can be estimated and extrapolated to give a success metric for a quantum supremacy demonstration. We study the computational cost of relevant classical algorithms and conclude that quantum supremacy can be achieved with circuits in a two-dimensional lattice of 7 × 7 qubits and around 40 clock cycles. This requires an error rate of around 0.5% for two-qubit gates (0.05% for one-qubit gates), and it would demonstrate the basic building blocks for a fault-tolerant quantum computer.
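
    One widely used estimator in the cross-entropy benchmarking family is the linear XEB fidelity. The sketch below uses this linear form as an illustration; the paper defines its benchmark via the cross entropy of sampled bitstrings, so the exact estimator here is an assumption, and all names are illustrative.

```python
import numpy as np

def linear_xeb_fidelity(ideal_probs, sampled_bitstrings, n_qubits):
    """Linear cross-entropy benchmarking (XEB) fidelity estimate.

    ideal_probs: sequence mapping bitstring index -> ideal output
    probability from a classical simulation of the circuit.
    sampled_bitstrings: bitstring indices observed on the device.

    F = 2^n * <p_ideal(sample)> - 1 is ~1 for a perfect device
    sampling the ideal distribution and ~0 for uniform noise.
    """
    d = 2 ** n_qubits
    mean_p = np.mean([ideal_probs[b] for b in sampled_bitstrings])
    return d * mean_p - 1.0
```

The estimate degrades toward zero as gate errors wash out the speckle pattern of the ideal output distribution, which is what makes it usable as a success metric.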

  5. An experimental and computational investigation of flow in a radial inlet of an industrial pipeline centrifugal compressor

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Flathers, M.B.; Bache, G.E.; Rainsberger, R.

    1996-04-01

    The flow field of a complex three-dimensional radial inlet for an industrial pipeline centrifugal compressor has been experimentally determined on a half-scale model. Based on the experimental results, inlet guide vanes have been designed to correct pressure and swirl angle distribution deficiencies. The unvaned and vaned inlets are analyzed with a commercially available fully three-dimensional viscous Navier-Stokes code. Since experimental results were available prior to the numerical study, the unvaned analysis is considered a postdiction while the vaned analysis is considered a prediction. The computational results of the unvaned inlet have been compared to the previously obtained experimental results. The experimental method utilized for the unvaned inlet was repeated for the vaned inlet, and the data have been used to verify the computational results. The paper discusses experimental, design, and computational procedures, grid generation, boundary conditions, and experimental versus computational methods. Agreement between experimental and computational results is very good, both in prediction and postdiction modes. The results of this investigation indicate that CFD offers a measurable advantage in design, schedule, and cost and can be applied to complex, three-dimensional radial inlets.

  6. A finite element method to compute three-dimensional equilibrium configurations of fluid membranes: Optimal parameterization, variational formulation and applications

    NASA Astrophysics Data System (ADS)

    Rangarajan, Ramsharan; Gao, Huajian

    2015-09-01

    We introduce a finite element method to compute equilibrium configurations of fluid membranes, identified as stationary points of a curvature-dependent bending energy functional under certain geometric constraints. The reparameterization symmetries in the problem pose a challenge in designing parametric finite element methods, and existing methods commonly resort to Lagrange multipliers or penalty parameters. In contrast, we exploit these symmetries by representing solution surfaces as normal offsets of given reference surfaces and entirely bypass the need for artificial constraints. We then resort to a Galerkin finite element method to compute discrete C1 approximations of the normal offset coordinate. The variational framework presented is suitable for computing deformations of three-dimensional membranes subject to a broad range of external interactions. We provide a systematic algorithm for computing large deformations, wherein solutions at subsequent load steps are identified as perturbations of previously computed ones. We discuss the numerical implementation of the method in detail and demonstrate its optimal convergence properties using examples. We discuss applications of the method to studying adhesive interactions of fluid membranes with rigid substrates and to investigate the influence of membrane tension in tether formation.

  7. Computer-aided classification of lung nodules on computed tomography images via deep learning technique.

    PubMed

    Hua, Kai-Lung; Hsu, Che-Hao; Hidayati, Shintami Chusnul; Cheng, Wen-Huang; Chen, Yu-Jen

    2015-01-01

    Lung cancer has a poor prognosis when not diagnosed early and unresectable lesions are present. The management of small lung nodules noted on computed tomography scan is controversial due to uncertain tumor characteristics. A conventional computer-aided diagnosis (CAD) scheme requires several image processing and pattern recognition steps to accomplish a quantitative tumor differentiation result. In such an ad hoc image analysis pipeline, every step depends heavily on the performance of the previous step. Accordingly, tuning of classification performance in a conventional CAD scheme is very complicated and arduous. Deep learning techniques, on the other hand, have the intrinsic advantage of automatic feature exploitation and performance tuning in a seamless fashion. In this study, we attempted to simplify the image analysis pipeline of conventional CAD with deep learning techniques. Specifically, we introduced models of a deep belief network and a convolutional neural network in the context of nodule classification in computed tomography images. Two baseline methods with feature computing steps were implemented for comparison. The experimental results suggest that deep learning methods could achieve better discriminative results and hold promise in the CAD application domain.

  8. Enhancing an appointment diary on a pocket computer for use by people after brain injury.

    PubMed

    Wright, P; Rogers, N; Hall, C; Wilson, B; Evans, J; Emslie, H

    2001-12-01

    People with memory loss resulting from brain injury benefit from purpose-designed memory aids such as appointment diaries on pocket computers. The present study explores the effects of extending the range of memory aids and including games. For 2 months, 12 people who had sustained brain injury were loaned a pocket computer containing three purpose-designed memory aids: diary, notebook and to-do list. A month later they were given another computer with the same memory aids but a different method of text entry (physical keyboard or touch-screen keyboard). Machine order was counterbalanced across participants. Assessment was by interviews during the loan periods, rating scales, performance tests and computer log files. All participants could use the memory aids and ten people (83%) found them very useful. Correlations among the three memory aids were not significant, suggesting individual variation in how they were used. Games did not increase use of the memory aids, nor did loan of the preferred pocket computer (with physical keyboard). Significantly more diary entries were made by people who had previously used other memory aids, suggesting that a better understanding of how to use a range of memory aids could benefit some people with brain injury.

  9. A Parametric Geometry Computational Fluid Dynamics (CFD) Study Utilizing Design of Experiments (DOE)

    NASA Technical Reports Server (NTRS)

    Rhew, Ray D.; Parker, Peter A.

    2007-01-01

    Design of Experiments (DOE) was applied to the LAS geometric parameter study to efficiently identify and rank primary contributors to integrated drag over the vehicle's ascent trajectory, in an order of magnitude fewer CFD configurations, thereby reducing computational resources and solution time. SMEs were able to gain a better understanding of the underlying flow physics of different geometric parameter configurations through the identification of interaction effects. An interaction effect, which describes how the effect of one factor changes with respect to the levels of other factors, is often the key to product optimization. A DOE approach emphasizes a sequential approach to learning through successive experimentation to continuously build on previous knowledge. These studies represent a starting point for expanded experimental activities that will eventually cover the entire design space of the vehicle and flight trajectory.

  10. Demographic and psychological variables affecting test subject evaluations of ride quality

    NASA Technical Reports Server (NTRS)

    Duncan, N. C.; Conley, H. W.

    1975-01-01

    Ride-quality experiments similar in objectives, design, and procedure were conducted, one using the U.S. Air Force Total In-Flight Simulator and the other using the Langley Passenger Ride Quality Apparatus to provide the motion environments. Large samples (80 or more per experiment) of test subjects were recruited from the Tidewater Virginia area and asked to rate the comfort (on a 7-point scale) of random aircraft motion typical of that encountered during STOL flights. Test subject characteristics of age, sex, and previous flying history (number of previous airplane flights) were studied in a two by three by three factorial design. Correlations were computed between one dependent measure, the subject's mean comfort rating, and various demographic characteristics, attitudinal variables, and the scores on Spielberger's State-Trait Anxiety Inventory. An effect of sex was found in one of the studies. Males made higher (more uncomfortable) ratings of the ride than females. Age and number of previous flights were not significantly related to comfort ratings. No significant interactions between the variables of age, sex, or previous number of flights were observed.

  11. Railroads and the Environment : Estimation of Fuel Consumption in Rail Transportation : Volume 3. Comparison of Computer Simulations with Field Measurements

    DOT National Transportation Integrated Search

    1978-09-01

    This report documents comparisons between extensive rail freight service measurements (previously presented in Volume II) and simulations of the same operations using a sophisticated train performance calculator computer program. The comparisons cove...

  12. The Temporal Dimension of Linguistic Prediction

    ERIC Educational Resources Information Center

    Chow, Wing Yee

    2013-01-01

    This thesis explores how predictions about upcoming language inputs are computed during real-time language comprehension. Previous research has demonstrated humans' ability to use rich contextual information to compute linguistic prediction during real-time language comprehension, and it has been widely assumed that contextual information can…

  13. Fusing Symbolic and Numerical Diagnostic Computations

    NASA Technical Reports Server (NTRS)

    James, Mark

    2007-01-01

    X-2000 Anomaly Detection Language denotes a developmental computing language, and the software that establishes and utilizes the language, for fusing two diagnostic computer programs, one implementing a numerical analysis method, the other implementing a symbolic analysis method, into a unified event-based decision analysis software system for real-time detection of events (e.g., failures) in a spacecraft, aircraft, or other complex engineering system. The numerical analysis method is performed by beacon-based exception analysis for multi-missions (BEAM), which has been discussed in several previous NASA Tech Briefs articles. The symbolic analysis method is, more specifically, an artificial-intelligence method of the knowledge-based, inference engine type, and its implementation is exemplified by the Spacecraft Health Inference Engine (SHINE) software. The goal in developing the capability to fuse numerical and symbolic diagnostic components is to increase the depth of analysis beyond that previously attainable, thereby increasing the degree of confidence in the computed results. In practical terms, the sought improvement is to enable detection of all or most events, with no or few false alarms.

  14. Noise Radiation From a Leading-Edge Slat

    NASA Technical Reports Server (NTRS)

    Lockard, David P.; Choudhari, Meelan M.

    2009-01-01

    This paper extends our previous computations of unsteady flow within the slat cove region of a multi-element high-lift airfoil configuration, which showed that both statistical and structural aspects of the experimentally observed unsteady flow behavior can be captured via 3D simulations over a computational domain of narrow spanwise extent. Although such narrow domain simulation can account for the spanwise decorrelation of the slat cove fluctuations, the resulting database cannot be applied towards acoustic predictions of the slat without invoking additional approximations to synthesize the fluctuation field over the rest of the span. This deficiency is partially alleviated in the present work by increasing the spanwise extent of the computational domain from 37.3% of the slat chord to nearly 226% (i.e., 15% of the model span). The simulation database is used to verify consistency with previous computational results and, then, to develop predictions of the far-field noise radiation in conjunction with a frequency-domain Ffowcs-Williams Hawkings solver.

  15. Prospective estimation of organ dose in CT under tube current modulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tian, Xiaoyu, E-mail: xt3@duke.edu; Li, Xiang; Segars, W. Paul

    Purpose: Computed tomography (CT) has been widely used worldwide as a tool for medical diagnosis and imaging. However, despite its significant clinical benefits, CT radiation dose at the population level has become a subject of public attention and concern. In this light, optimizing radiation dose has become a core responsibility for the CT community. As a fundamental step to manage and optimize dose, it may be beneficial to have accurate and prospective knowledge of the radiation dose for an individual patient. In this study, the authors developed a framework to prospectively estimate organ dose for chest and abdominopelvic CT exams under tube current modulation (TCM). Methods: The organ dose is mainly dependent on two key factors: patient anatomy and irradiation field. A prediction process was developed to accurately model both factors. To model the anatomical diversity and complexity of the patient population, the authors used a previously developed library of computational phantoms with broad distributions of sizes, ages, and genders. A selected clinical patient, represented by a computational phantom in the study, was optimally matched with another computational phantom in the library to obtain a representation of the patient's anatomy. To model the irradiation field, a previously validated Monte Carlo program was used to model CT scanner systems. The tube current profiles were modeled using a previously reported ray-tracing program that theoretically emulated the variability of modulation profiles from major CT machine manufacturers [Li et al., Phys. Med. Biol. 59, 4525–4548 (2014)].
    The prediction of organ dose was achieved using the following process: (1) CTDIvol-normalized organ dose coefficients (h_organ) for fixed tube current were first estimated as the prediction basis for the computational phantoms; (2) each computational phantom, regarded as a clinical patient, was optimally matched with one computational phantom in the library; (3) to account for the effect of the TCM scheme, a weighted organ-specific CTDIvol [denoted (CTDIvol)_organ,weighted] was computed for each organ based on the TCM profile and the anatomy of the "matched" phantom; (4) the organ dose was predicted by multiplying the weighted organ-specific CTDIvol by the organ dose coefficient (h_organ). To quantify the prediction accuracy, each predicted organ dose was compared with the corresponding organ dose simulated from the Monte Carlo program with the TCM profile explicitly modeled. Results: The predicted organ dose showed good agreement with the simulated organ dose across all organs and modulation profiles. The average percentage error in organ dose estimation was generally within 20% across all organs and modulation profiles, except for organs located in the pelvic and shoulder regions. For an average CTDIvol of 10 mGy per CT exam, the average error at full modulation strength (α = 1) across all organs was 0.91 mGy for chest exams and 0.82 mGy for abdominopelvic exams. Conclusions: This study developed a quantitative model to predict organ dose for clinical chest and abdominopelvic scans. Such information may aid in the design of optimized CT protocols in relation to a targeted level of image quality.
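
    Steps (3) and (4) of the prediction process reduce to a weighted average of the modulated CTDIvol over the organ's coverage, followed by a multiplication by the dose coefficient. A minimal sketch, with all names, weights, and the weighting scheme assumed for illustration rather than taken from the paper:

```python
def predict_organ_dose(h_organ, tcm_weights, ctdi_vol_profile):
    """Predict organ dose from a TCM exam (illustrative sketch).

    h_organ          : CTDIvol-normalized organ dose coefficient
                       (organ dose per unit CTDIvol, fixed current)
    tcm_weights      : per-slice weights over the organ's extent,
                       derived from the TCM profile and anatomy
    ctdi_vol_profile : per-slice CTDIvol values under modulation
    """
    total_w = sum(tcm_weights)
    # weighted organ-specific CTDIvol, step (3)
    ctdi_weighted = sum(w * c for w, c in
                        zip(tcm_weights, ctdi_vol_profile)) / total_w
    # organ dose prediction, step (4)
    return h_organ * ctdi_weighted
```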

  16. 2-D Model for Normal and Sickle Cell Blood Microcirculation

    NASA Astrophysics Data System (ADS)

    Tekleab, Yonatan; Harris, Wesley

    2011-11-01

    Sickle cell disease (SCD) is a genetic disorder that alters the red blood cell (RBC) structure and function such that hemoglobin (Hb) cannot effectively bind and release oxygen. Previous computational models have been designed to study the microcirculation for insight into blood disorders such as SCD. Our novel 2-D computational model represents a fast, time efficient method developed to analyze flow dynamics, O2 diffusion, and cell deformation in the microcirculation. The model uses a finite difference, Crank-Nicolson scheme to compute the flow and O2 concentration, and the level set computational method to advect the RBC membrane on a staggered grid. Several sets of initial and boundary conditions were tested. Simulation data indicate a few parameters to be significant in the perturbation of the blood flow and O2 concentration profiles. Specifically, the Hill coefficient, arterial O2 partial pressure, O2 partial pressure at 50% Hb saturation, and cell membrane stiffness are significant factors. Results were found to be consistent with those of Le Floch [2010] and Secomb [2006].
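
    The Hill coefficient and the O2 partial pressure at 50% saturation enter such models through the Hill equation for hemoglobin-O2 binding. A sketch with typical literature constants, not necessarily the values used in the cited model:

```python
def hb_saturation(pO2, p50=26.0, n=2.7):
    """Hill equation for hemoglobin-O2 saturation:
    S = pO2^n / (p50^n + pO2^n).

    pO2 : O2 partial pressure (mmHg)
    p50 : partial pressure at 50% saturation (typical adult ~26 mmHg)
    n   : Hill coefficient (typical ~2.7); illustrative defaults
    """
    return pO2 ** n / (p50 ** n + pO2 ** n)
```

In SCD modeling, shifts in p50 and n reshape this sigmoid curve, altering how much O2 the cells release along the microvessel.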

  17. Hodge numbers for CICYs with symmetries of order divisible by 4

    NASA Astrophysics Data System (ADS)

    Candelas, Philip; Constantin, Andrei; Mishra, Challenger

    2016-06-01

We compute the Hodge numbers for the quotients of complete intersection Calabi-Yau three-folds by groups of orders divisible by 4. We make use of the polynomial deformation method and the counting of invariant Kähler classes. The quotients studied here have been obtained in the automated classification of V. Braun. Although the computer search found the freely acting groups, the Hodge numbers of the quotients were not calculated. The freely acting groups, $G$, that arise in the classification are either $Z_2$ or contain $Z_4$, $Z_2 \times Z_2$, $Z_3$ or $Z_5$ as a subgroup. The Hodge numbers for the quotients for which the group $G$ contains $Z_3$ or $Z_5$ have been computed previously. This paper deals with the remaining cases, for which $G \supseteq Z_4$ or $G \supseteq Z_2 \times Z_2$. We also compute the Hodge numbers for 99 of the 166 CICYs which have $Z_2$ quotients.

  18. Online mentalising investigated with functional MRI.

    PubMed

    Kircher, Tilo; Blümel, Isabelle; Marjoram, Dominic; Lataster, Tineke; Krabbendam, Lydia; Weber, Jochen; van Os, Jim; Krach, Sören

    2009-05-01

For successful interpersonal communication, inferring the intentions, goals or desires of others is highly advantageous. Increasingly, humans also interact with computers or robots. In this study, we sought to determine to what degree an interactive task, which involves receiving feedback from social partners that can be used to infer intent, engaged the medial prefrontal cortex, a region previously associated with Theory of Mind processes among others. Participants were scanned using fMRI as they played an adapted version of the Prisoner's Dilemma Game with alleged human and computer partners who were outside the scanner. The medial frontal cortex was activated when playing both the human and the computer partner, while the direct contrast revealed significantly stronger signal change during the human-human interaction. The results suggest a link between activity in the medial prefrontal cortex and the partner played in a mentalising task. This signal change was also present for the computer partner. Attributing agency or a will to non-human actors might be an innate human capacity that could confer an evolutionary advantage.

  19. Clathrate Structure Determination by Combining Crystal Structure Prediction with Computational and Experimental 129Xe NMR Spectroscopy

    PubMed Central

    Selent, Marcin; Nyman, Jonas; Roukala, Juho; Ilczyszyn, Marek; Oilunkaniemi, Raija; Bygrave, Peter J.; Laitinen, Risto; Jokisaari, Jukka

    2017-01-01

An approach is presented for the structure determination of clathrates using NMR spectroscopy of enclathrated xenon to select from a set of predicted crystal structures. Crystal structure prediction methods have been used to generate an ensemble of putative structures of o‐ and m‐fluorophenol, whose previously unknown clathrate structures have been studied by 129Xe NMR spectroscopy. The high sensitivity of the 129Xe chemical shift tensor to the chemical environment and shape of the crystalline cavity makes it ideal as a probe for porous materials. The experimental powder NMR spectra can be used to directly confirm or reject hypothetical crystal structures generated by computational prediction, whose chemical shift tensors have been simulated using density functional theory. For each fluorophenol isomer one predicted crystal structure was found whose measured and computed chemical shift tensors agree within experimental and computational error margins; these are thus proposed as the true fluorophenol xenon clathrate structures. PMID:28111848
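The selection step described above, comparing a DFT-computed shift tensor against the measured one for each candidate structure, amounts to ranking candidates by a distance between principal components. A toy sketch with invented structure names and shift values (the study's actual comparison also weighs experimental and computational error margins):

```python
# Toy illustration of candidate-structure selection by 129Xe chemical shift
# tensor agreement. All structure names and ppm values are invented.

def tensor_rmsd(computed, measured):
    """Root-mean-square deviation between principal components (ppm)."""
    n = len(measured)
    return (sum((c - m) ** 2 for c, m in zip(computed, measured)) / n) ** 0.5

candidates = {
    "structure_A": (150.0, 175.0, 210.0),   # DFT-computed principal values
    "structure_B": (162.0, 180.0, 201.0),
}
measured = (161.0, 181.0, 200.0)            # from the powder NMR spectrum

best = min(candidates, key=lambda s: tensor_rmsd(candidates[s], measured))
print(best)  # structure_B
```

A candidate is accepted only when this deviation falls within the combined experimental and computational uncertainty, which is what makes the 129Xe probe a practical filter on crystal-structure-prediction output.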

  20. Effects of Geometric Details on Slat Noise Generation and Propagation

    NASA Technical Reports Server (NTRS)

    Khorrami, Mehdi R.; Lockard, David P.

    2009-01-01

The relevance of geometric details to the generation and propagation of noise from leading-edge slats is considered. Typically, such details are omitted in computational simulations and model-scale experiments, thereby creating ambiguities in comparisons with acoustic results from flight tests. The current study uses two-dimensional, computational simulations in conjunction with a Ffowcs Williams-Hawkings (FW-H) solver to investigate the effects of previously neglected slat "bulb" and "blade" seals on the local flow field and the associated acoustic radiation. The computations show that the presence of the "blade" seal at the cusp in the simulated geometry significantly changes the slat cove flow dynamics, reduces the amplitudes of the radiated sound, and to a lesser extent, alters the directivity beneath the airfoil. Furthermore, the computations suggest that a modest extension of the baseline "blade" seal further enhances the suppression of slat noise. As a side issue, the utility and equivalence of the FW-H methodology for calculating far-field noise, as opposed to a more direct approach, are examined and demonstrated.
