Sample records for comprehensive computational study

  1. Technical Advances and Fifth Grade Reading Comprehension: Do Students Benefit?

    ERIC Educational Resources Information Center

    Fountaine, Drew

    This paper reviews recent studies on the use of technological tools, primarily personal computers and software, to improve fifth-grade students' reading comprehension. Specifically, the paper asks what benefits an educator can expect students to derive from closed-captioning and computer-assisted reading comprehension products. It…

  2. Computer-Based and Paper-Based Reading Comprehension in Adolescents with Typical Language Development and Language-Learning Disabilities

    ERIC Educational Resources Information Center

    Srivastava, Pradyumn; Gray, Shelley

    2012-01-01

    Purpose: With the global expansion of technology, our reading platform has shifted from traditional text to hypertext, yet little consideration has been given to how this shift might help or hinder students' reading comprehension. The purpose of this study was to compare reading comprehension of computer-based and paper-based texts in adolescents…

  3. The Effect of Computer-Assisted Language Learning on Reading Comprehension in an Iranian EFL Context

    ERIC Educational Resources Information Center

    Saeidi, Mahnaz; Yusefi, Mahsa

    2012-01-01

    This study is an attempt to examine the effect of computer-assisted language learning (CALL) on reading comprehension in an Iranian English as a foreign language (EFL) context. It was hypothesized that CALL has an effect on reading comprehension. Forty intermediate-level female learners of English, selected after a proficiency test was administered, were…

  4. Comprehensive Materials and Morphologies Study of Ion Traps (COMMIT) for Scalable Quantum Computation

    DTIC Science & Technology

    2012-04-21

    the photoelectric effect. The typical shortest wavelengths needed for ion traps range from 194 nm for Hg+ to 493 nm for Ba+, corresponding to 6.4-2.5… Trapped ion systems are extremely promising for large-scale quantum computation, but face a vexing problem with motional quantum…

  5. The Impact of Item Dependency on the Efficiency of Testing and Reliability of Student Scores from a Computer Adaptive Assessment of Reading Comprehension

    ERIC Educational Resources Information Center

    Petscher, Yaacov; Foorman, Barbara R.; Truckenmiller, Adrea J.

    2017-01-01

    The objective of the present study was to evaluate the extent to which students who took a computer adaptive test of reading comprehension accounting for testlet effects were administered fewer passages and had a more precise estimate of their reading comprehension ability compared to students in the control condition. A randomized controlled…

  6. The Effect of Gloss Type and Mode on Iranian EFL Learners' Reading Comprehension

    ERIC Educational Resources Information Center

    Sadeghi, Karim; Ahmadi, Negar

    2012-01-01

    This study investigated the effects of three gloss conditions, namely traditional non-CALL marginal gloss, computer-based audio gloss, and computer-based extended audio gloss, on the reading comprehension of Iranian EFL learners. To this end, three experimental groups and one control group, each comprising 15 participants, took part in this study.…

  7. The Effects of Targeted English Language Arts Instruction Using Multimedia Applications on Grade Three Students' Reading Comprehension, Attitude toward Computers, and Attitude toward School

    ERIC Educational Resources Information Center

    Swerdloff, Matthew

    2013-01-01

    The purpose of this study was to investigate the specific effects of targeted English Language Arts (ELA) instruction using multimedia applications. Student reading comprehension, student attitude toward computers, and student attitude toward school were measured in this study. The study also examined selected students' perceptions of the…

  8. The role of sustained attention and display medium in reading comprehension among adolescents with ADHD and without it.

    PubMed

    Stern, Pnina; Shalev, Lilach

    2013-01-01

    Difficulties in reading comprehension are common in children and adolescents with Attention Deficit/Hyperactivity Disorder (ADHD). The current study aimed at investigating the relation between sustained attention and reading comprehension among adolescents with and without ADHD. Another goal was to examine the impact of two manipulations of the text on the efficiency of reading comprehension: Spacing (standard- vs. double-spacing) and Type of presentation (computer screen vs. hard copy). Reading comprehension of two groups of adolescents (participants with ADHD and normal controls) was assessed and compared in four different conditions (standard printed, spaced printed, standard on computer screen, spaced on computer screen). In addition, participants completed a visual sustained attention task. Significant differences in reading comprehension and in sustained attention were obtained between the two groups. Also, a significant correlation was obtained between sustained attention and reading comprehension. Moreover, a significant interaction was revealed between presentation-type, spacing and level of sustained attention on reading comprehension. Implications for reading intervention and the importance of early assessment of attention functioning are discussed.

  9. Effects of computer-based immediate feedback on foreign language listening comprehension and test-associated anxiety.

    PubMed

    Lee, Shu-Ping; Su, Hui-Kai; Lee, Shin-Da

    2012-06-01

    This study investigated the effects of immediate feedback on computer-based foreign language listening comprehension tests and on intrapersonal test-associated anxiety in 72 English-major college students at a Taiwanese university. Computer-based listening comprehension tests designed in MOODLE, a dynamic e-learning environment, with or without immediate feedback, were administered together with the State-Trait Anxiety Inventory (STAI) and repeated after one week. The analysis indicated that immediate feedback during testing caused significantly higher anxiety and resulted in significantly higher listening scores than in the control group, which had no feedback. However, repeated feedback did not affect test anxiety or listening scores. Computer-based immediate feedback did not lower the debilitating effects of anxiety but enhanced students' intrapersonal eustress-like anxiety and probably improved their attention during listening tests. Computer-based tests with immediate feedback might help foreign language learners increase attention in foreign language listening comprehension.

  10. Using a Computer-Adapted, Conceptually Based History Text to Increase Comprehension and Problem-Solving Skills of Students with Disabilities

    ERIC Educational Resources Information Center

    Twyman, Todd; Tindal, Gerald

    2006-01-01

    The purpose of this study was to improve the comprehension and problem-solving skills of students with disabilities in social studies using a conceptually framed, computer-adapted history text. Participants were 11th and 12th grade students identified with learning disabilities in reading and writing from two intact, self-contained social studies…

  11. Development of a Computer-Based Measure of Listening Comprehension of Science Talk

    ERIC Educational Resources Information Center

    Lin, Sheau-Wen; Liu, Yu; Chen, Shin-Feng; Wang, Jing-Ru; Kao, Huey-Lien

    2015-01-01

    The purpose of this study was to develop a computer-based assessment for elementary school students' listening comprehension of science talk within an inquiry-oriented environment. The development procedure had 3 steps: a literature review to define the framework of the test, collecting and identifying key constructs of science talk, and…

  12. Using Primary Language Support via Computer to Improve Reading Comprehension Skills of First-Grade English Language Learners

    ERIC Educational Resources Information Center

    Rodriguez, Cathi Draper; Filler, John; Higgins, Kyle

    2012-01-01

    Through this exploratory study the authors investigated the effects of primary language support delivered via computer on the English reading comprehension skills of English language learners. Participants were 28 first-grade students identified as Limited English Proficient. The primary language of all participants was Spanish. Students were…

  13. A randomized, controlled trial of interactive, multimedia software for patient colonoscopy education.

    PubMed

    Shaw, M J; Beebe, T J; Tomshine, P A; Adlis, S A; Cass, O W

    2001-02-01

    The purpose of our study was to assess the effectiveness of computer-assisted instruction (CAI) in patients having colonoscopies. We conducted a randomized, controlled trial in a large, multispecialty clinic. Eighty-six patients were referred for colonoscopies. The interventions were standard education versus standard education plus CAI, and the outcome measures were anxiety, comprehension, and satisfaction. Computer-assisted instruction had no effect on patients' anxiety. The group receiving CAI demonstrated better overall comprehension (p < 0.001). However, comprehension of certain aspects of serious complications and appropriate postsedation behavior was unaffected by educational method. Patients in the CAI group were more likely to indicate satisfaction with the amount of information provided when compared with their standard education counterparts (p = 0.001). Overall satisfaction was unaffected by educational method. Computer-assisted instruction for colonoscopy provided better comprehension and greater satisfaction with the adequacy of education than standard education. Computer-assisted instruction helps physicians meet their educational responsibilities with no decrement to the interpersonal aspects of the patient-physician relationship.

  14. The Effect of a Computer Program Based on Analysis, Design, Development, Implementation and Evaluation (ADDIE) in Improving Ninth Graders' Listening and Reading Comprehension Skills in English in Jordan

    ERIC Educational Resources Information Center

    Alodwan, Talal; Almosa, Mosaab

    2018-01-01

    The study aimed to assess the effectiveness of a computer program based on Analysis, Design, Development, Implementation and Evaluation (ADDIE) Model on the achievement of Ninth Graders' listening and Reading Comprehension Skills in English. The study sample comprised 70 ninth graders during the second semester of the academic year 2016/2017. The…

  15. Computerized Presentation of Text: Effects on Children's Reading of Informational Material. Special Issue on Reading Comprehension - Part II

    ERIC Educational Resources Information Center

    Kerr, Matthew A.; Symons, Sonya E.

    2006-01-01

    This study examined whether children's reading rate, comprehension, and recall are affected by computer presentation of text. Participants were 60 grade five students, who each read two expository texts, one in a traditional print format and the other from a computer monitor, which used a common scrolling text interface. After reading each text,…

  16. A computational modeling of semantic knowledge in reading comprehension: Integrating the landscape model with latent semantic analysis.

    PubMed

    Yeari, Menahem; van den Broek, Paul

    2016-09-01

    It is a well-accepted view that the prior semantic (general) knowledge that readers possess plays a central role in reading comprehension. Nevertheless, computational models of reading comprehension have not integrated the simulation of semantic knowledge and online comprehension processes under a unified mathematical algorithm. The present article introduces a computational model that integrates the landscape model of comprehension processes with latent semantic analysis representation of semantic knowledge. In three sets of simulations of previous behavioral findings, the integrated model successfully simulated the activation and attenuation of predictive and bridging inferences during reading, as well as centrality estimations and recall of textual information after reading. Analyses of the computational results revealed new theoretical insights regarding the underlying mechanisms of the various comprehension phenomena.
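
The full landscape-LSA integration cannot be reproduced from this abstract, but the LSA component, which represents word meaning via a truncated SVD of a term-by-document count matrix, can be sketched as follows (the miniature corpus and vocabulary below are invented for illustration):

```python
import numpy as np

# Tiny term-document count matrix (stopwords removed). Rows are terms,
# columns are four short "texts"; the corpus is invented for illustration.
terms = ["doctor", "patient", "nurse", "mechanic", "engine", "oil"]
A = np.array([
    [1, 1, 0, 0],   # "doctor" appears in texts 1 and 2
    [1, 0, 0, 0],   # "patient"
    [0, 1, 0, 0],   # "nurse"
    [0, 0, 0, 1],   # "mechanic"
    [0, 0, 1, 1],   # "engine"
    [0, 0, 1, 0],   # "oil"
], dtype=float)

# LSA: a truncated SVD keeps only the k strongest latent dimensions.
U, s, Vt = np.linalg.svd(A, full_matrices=False)
k = 2
term_vectors = U[:, :k] * s[:k]   # each row: one term in latent space

def similarity(t1, t2):
    """Cosine similarity between two terms in the latent space."""
    v1 = term_vectors[terms.index(t1)]
    v2 = term_vectors[terms.index(t2)]
    return float(v1 @ v2 / (np.linalg.norm(v1) * np.linalg.norm(v2)))
```

Terms that co-occur in similar texts (doctor, nurse) land close together in the latent space, while terms from unrelated texts (doctor, engine) stay nearly orthogonal; a landscape-style model would use such similarities to spread activation across text concepts during simulated reading.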

  17. The Effects of Beacons, Comments, and Tasks on Program Comprehension Process in Software Maintenance

    ERIC Educational Resources Information Center

    Fan, Quyin

    2010-01-01

    Program comprehension is the most important and frequent process in software maintenance. Extensive research has found that individual characteristics of programmers, differences of computer programs, and differences of task-driven motivations are the major factors that affect the program comprehension results. There is no study specifically…

  18. Effects of a Computer-Assisted Concept Mapping Learning Strategy on EFL College Students' English Reading Comprehension

    ERIC Educational Resources Information Center

    Liu, Pei-Lin; Chen, Chiu-Jung; Chang, Yu-Ju

    2010-01-01

    The purpose of this research was to investigate the effects of a computer-assisted concept mapping learning strategy on EFL college learners' English reading comprehension. The research questions were: (1) what was the influence of the computer-assisted concept mapping learning strategy on different learners' English reading comprehension? (2) did…

  19. Computer Games Application within Alternative Classroom Goal Structures: Cognitive, Metacognitive, and Affective Evaluation

    ERIC Educational Resources Information Center

    Ke, Fengfeng

    2008-01-01

    This article reports findings on a study of educational computer games used within various classroom situations. Employing an across-stage, mixed method model, the study examined whether educational computer games, in comparison to traditional paper-and-pencil drills, would be more effective in facilitating comprehensive math learning outcomes,…

  20. Benefits and Drawbacks of Computer-Based Assessment and Feedback Systems: Student and Educator Perspectives

    ERIC Educational Resources Information Center

    Debuse, Justin C. W.; Lawley, Meredith

    2016-01-01

    Providing students with high quality feedback is important and can be achieved using computer-based systems. While student and educator perspectives of such systems have been investigated, a comprehensive multidisciplinary study has not yet been undertaken. This study examines student and educator perspectives of a computer-based assessment and…

  1. The Effectiveness of Electronic Text and Pictorial Graphic Organizers to Improve Comprehension Related to Functional Skills

    ERIC Educational Resources Information Center

    Douglas, Karen H.; Ayres, Kevin M.; Langone, John; Bramlett, Virginia Bell

    2011-01-01

    This study evaluated the effects of a computer-based instructional program to assist three students with mild to moderate intellectual disabilities in using pictorial graphic organizers as aids for increasing comprehension of electronic text-based recipes. Student comprehension of recipes was measured by their ability to verbally retell recipe…

  2. Effect of Hypertextual Reading on Academic Success and Comprehension Skills

    ERIC Educational Resources Information Center

    Durukan, Erhan

    2014-01-01

    As computer technology has developed, hypertexts have emerged as an influential environment for developing language skills. This study aims to evaluate a text prepared in a hypertextual environment and its effects on academic success and comprehension skills. In this study, a "pretest-posttest control group" experimental design was used…

  3. Computer Assisted Language Learning. Routledge Studies in Computer Assisted Language Learning

    ERIC Educational Resources Information Center

    Pennington, Martha

    2011-01-01

    Computer-assisted language learning (CALL) is an approach to language teaching and learning in which computer technology is used as an aid to the presentation, reinforcement and assessment of material to be learned, usually including a substantial interactive element. This book provides an up-to-date and comprehensive overview of…

  4. Preliminary Findings on the Computer-Administered Multiple-Choice Online Causal Comprehension Assessment, a Diagnostic Reading Comprehension Test

    ERIC Educational Resources Information Center

    Davison, Mark L.; Biancarosa, Gina; Carlson, Sarah E.; Seipel, Ben; Liu, Bowen

    2018-01-01

    The computer-administered Multiple-Choice Online Causal Comprehension Assessment (MOCCA) for Grades 3 to 5 has an innovative, 40-item multiple-choice structure in which each distractor corresponds to a comprehension process upon which poor comprehenders have been shown to rely. This structure requires revised thinking about measurement issues…

  5. Effects of Strength of Accent on an L2 Interactive Lecture Listening Comprehension Test

    ERIC Educational Resources Information Center

    Ockey, Gary J.; Papageorgiou, Spiros; French, Robert

    2016-01-01

    This article reports on a study which aimed to determine the effect of strength of accent on listening comprehension of interactive lectures. Test takers (N = 21,726) listened to an interactive lecture given by one of nine speakers and responded to six comprehension items. The test taker responses were analyzed with the Rasch computer program…

  6. On the Computation of Comprehensive Boolean Gröbner Bases

    NASA Astrophysics Data System (ADS)

    Inoue, Shutaro

    We show that a comprehensive Boolean Gröbner basis of an ideal I in a Boolean polynomial ring B(A, X) with main variables X and parameters A can be obtained by simply computing a usual Boolean Gröbner basis of I, regarding both X and A as variables, with a certain block term order such that X ≫ A. This result, together with the fact that a finite Boolean ring is isomorphic to a direct product of the Galois field GF(2), enables us to compute a comprehensive Boolean Gröbner basis by computing only the corresponding Gröbner bases in a polynomial ring over GF(2). Our implementation in the computer algebra system Risa/Asir shows that our method is extremely efficient compared with existing algorithms for computing comprehensive Boolean Gröbner bases.
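
The reduction described here can be illustrated with SymPy rather than Risa/Asir (an assumption made only for this sketch; the tiny ideal below is also invented, as the paper's examples are not given in the abstract): adjoin the Boolean field equation t^2 + t for every variable, work over GF(2), and use a lex order placing the main variable x above the parameter a, which serves as a block-type elimination order x >> a.

```python
from sympy import symbols, groebner

# Main variable x, parameter a (hypothetical example ideal).
x, a = symbols("x a")

# Boolean ring: adjoin the field equations x**2 + x and a**2 + a and
# compute over GF(2) via modulus=2. Lex with x > a keeps the main
# variable block above the parameter block.
gb = groebner([x + a, x**2 + x, a**2 + a],
              x, a, order="lex", modulus=2)

# The reduced basis {x + a, a**2 + a} is "comprehensive": specializing
# the parameter (a = 0 or a = 1) yields the Gröbner basis of each
# specialized ideal, {x} and {x + 1} respectively.
basis = list(gb.exprs)
```

Reading the basis modulo a choice of parameter value reproduces the per-specialization bases, which is the content of the comprehensiveness claim.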

  7. Desktop Publishing: The Effects of Computerized Formats on Reading Speed and Comprehension.

    ERIC Educational Resources Information Center

    Knupfer, Nancy Nelson; McIsaac, Marina Stock

    1989-01-01

    Describes study that was conducted to determine the effects of two electronic text variables used in desktop publishing on undergraduate students' reading speed and comprehension. Research on text variables, graphic design, instructional text design, and computer screen design is discussed, and further studies are suggested. (22 references) (LRW)

  8. Health information technology and physician-patient interactions: impact of computers on communication during outpatient primary care visits.

    PubMed

    Hsu, John; Huang, Jie; Fung, Vicki; Robertson, Nan; Jimison, Holly; Frankel, Richard

    2005-01-01

    The aim of this study was to evaluate the impact of introducing health information technology (HIT) on physician-patient interactions during outpatient visits. This was a longitudinal pre-post study: two months before and one and seven months after introduction of examination room computers. Patient questionnaires (n = 313) after primary care visits with physicians (n = 8) within an integrated delivery system. There were three patient satisfaction domains: (1) satisfaction with visit components, (2) comprehension of the visit, and (3) perceptions of the physician's use of the computer. Patients reported that physicians used computers in 82.3% of visits. Compared with baseline, overall patient satisfaction with visits increased seven months after the introduction of computers (odds ratio [OR] = 1.50; 95% confidence interval [CI]: 1.01-2.22), as did satisfaction with physicians' familiarity with patients (OR = 1.60, 95% CI: 1.01-2.52), communication about medical issues (OR = 1.61; 95% CI: 1.05-2.47), and comprehension of decisions made during the visit (OR = 1.63; 95% CI: 1.06-2.50). In contrast, there were no significant changes in patient satisfaction with comprehension of self-care responsibilities, communication about psychosocial issues, or available visit time. Seven months post-introduction, patients were more likely to report that the computer helped the visit run in a more timely manner (OR = 1.76; 95% CI: 1.28-2.42) compared with the first month after introduction. There were no other significant changes in patient perceptions of the computer use over time. The examination room computers appeared to have positive effects on physician-patient interactions related to medical communication without significant negative effects on other areas such as time available for patient concerns. Further study is needed to better understand HIT use during outpatient visits.

  9. Computer Ratio and Student Achievement in Reading and Math in a North Carolina School District

    ERIC Educational Resources Information Center

    Preswood, Erica

    2017-01-01

    This longitudinal research project explored the relationship between a 1:1 computing initiative and student achievement on the North Carolina End of Grade Reading Comprehension and Math tests in the study school district. The purpose of this research study was to determine if the implementation of a 1:1 computing initiative impacted student…

  10. Comprehensive Computer-Based Instructional Programs: What Works for Educationally Disadvantaged Students?

    ERIC Educational Resources Information Center

    Swan, Karen; And Others

    The Computer Pilot Program of the Division of Computer Information Services of the New York City Board of Education was designed to investigate the claim that comprehensive computer-based instruction (CBI) might best be used to improve the basic skills of educationally disadvantaged students. This ongoing project is designed to identify…

  11. The Effects of Hypertext Gloss on Comprehension and Vocabulary Retention under Incidental and Intentional Learning Conditions

    ERIC Educational Resources Information Center

    Zandieh, Zeinab; Jafarigohar, Manoochehr

    2012-01-01

    The present study investigated comprehension, immediate and delayed vocabulary retention under incidental and intentional learning conditions via computer mediated hypertext gloss. One hundred and eighty four (N = 184) intermediate students of English as a foreign language at an English school participated in the study. They were randomly assigned…

  12. Examining the Effect of Computer-Based Passage Presentation on Reading Test Performance

    ERIC Educational Resources Information Center

    Higgins, Jennifer; Russell, Michael; Hoffmann, Thomas

    2005-01-01

    To examine the impact of transitioning 4th grade reading comprehension assessments to the computer, 219 fourth graders were randomly assigned to take a one-hour reading comprehension assessment on paper, on a computer using scrolling text to navigate through passages, or on a computer using paging text to navigate through passages. This study…

  13. Computing a Comprehensible Model for Spam Filtering

    NASA Astrophysics Data System (ADS)

    Ruiz-Sepúlveda, Amparo; Triviño-Rodriguez, José L.; Morales-Bueno, Rafael

    In this paper, we describe the application of the Decision Tree Boosting (DTB) learning model to spam email filtering. This classification task implies learning in a high-dimensional feature space, so it is an example of how the DTB algorithm performs on such problems. In [1], it has been shown that hypotheses computed by the DTB model are more comprehensible than those computed by other ensemble methods. Hence, this paper tries to show that the DTB algorithm maintains the same comprehensibility of hypotheses in high-dimensional feature space problems while achieving the performance of other ensemble methods. Four traditional evaluation measures (precision, recall, F1 and accuracy) have been considered for performance comparison between DTB and other models usually applied to spam email filtering. The size of the hypothesis computed by DTB is smaller and more comprehensible than the hypotheses computed by AdaBoost and Naïve Bayes.
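
The DTB algorithm itself is not specified in this abstract; as a rough, hypothetical analogue, boosting one-feature decision stumps over a bag-of-words encoding shows why a small boosted-tree hypothesis stays comprehensible (the vocabulary and six-message corpus below are invented):

```python
import math

# Toy bag-of-words data: 1 = word present (invented for illustration).
VOCAB = ["free", "winner", "meeting", "report"]
X = [
    [1, 1, 0, 0],  # spam
    [1, 0, 0, 0],  # spam
    [1, 0, 1, 0],  # spam
    [0, 0, 1, 0],  # ham
    [0, 0, 1, 1],  # ham
    [0, 1, 0, 1],  # ham ("winner" appears as noise)
]
y = [1, 1, 1, -1, -1, -1]  # +1 = spam, -1 = ham

def stump_predict(feature, sign, x):
    """Predict `sign` when the word is present, `-sign` otherwise."""
    return sign if x[feature] == 1 else -sign

def best_stump(weights):
    """Exhaustively pick the stump with the lowest weighted error."""
    best = None
    for f in range(len(VOCAB)):
        for s in (1, -1):
            err = sum(w for w, xi, yi in zip(weights, X, y)
                      if stump_predict(f, s, xi) != yi)
            if best is None or err < best[0]:
                best = (err, f, s)
    return best

def adaboost(rounds=3):
    """Standard AdaBoost over one-feature stumps."""
    n = len(X)
    weights = [1.0 / n] * n
    ensemble = []  # list of (alpha, feature, sign)
    for _ in range(rounds):
        err, f, s = best_stump(weights)
        err = max(err, 1e-9)             # avoid division by zero
        alpha = 0.5 * math.log((1 - err) / err)
        ensemble.append((alpha, f, s))
        # Reweight: misclassified examples gain weight.
        weights = [w * math.exp(-alpha * yi * stump_predict(f, s, xi))
                   for w, xi, yi in zip(weights, X, y)]
        total = sum(weights)
        weights = [w / total for w in weights]
    return ensemble

def predict(ensemble, x):
    score = sum(a * stump_predict(f, s, x) for a, f, s in ensemble)
    return 1 if score >= 0 else -1

model = adaboost()
predictions = [predict(model, xi) for xi in X]
```

Each stump is a one-word rule ("message contains free, so spam"), so the whole hypothesis can be read off directly; that readability is the comprehensibility property the paper attributes to DTB.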

  14. Computer-Mediated Glosses in Second Language Reading Comprehension and Vocabulary Learning: A Meta-Analysis

    ERIC Educational Resources Information Center

    Abraham, Lee B.

    2008-01-01

    Language learners have unprecedented opportunities for developing second language literacy skills and intercultural understanding by reading authentic texts on the Internet and in multimedia computer-assisted language learning environments. This article presents findings from a meta-analysis of 11 studies of computer-mediated glosses in second…

  15. Heterogeneity in Health Care Computing Environments

    PubMed Central

    Sengupta, Soumitra

    1989-01-01

    This paper discusses issues of heterogeneity in computer systems, networks, databases, and presentation techniques, and the problems it creates in developing integrated medical information systems. The need for institutional, comprehensive goals is emphasized. Using the Columbia-Presbyterian Medical Center's computing environment as the case study, various steps to solve the heterogeneity problem are presented.

  16. Computer Game Play as an Imaginary Stage for Reading: Implicit Spatial Effects of Computer Games Embedded in Hard Copy Books

    ERIC Educational Resources Information Center

    Smith, Glenn Gordon

    2012-01-01

    This study compared books with embedded computer games (via pentop computers with microdot paper and audio feedback) with regular books with maps, in terms of fifth graders' comprehension and retention of spatial details from stories. One group read a story in hard copy with embedded computer games, the other group read it in regular book format…

  17. An Assessment of Security Vulnerabilities Comprehension of Cloud Computing Environments: A Quantitative Study Using the Unified Theory of Acceptance and Use

    ERIC Educational Resources Information Center

    Venkatesh, Vijay P.

    2013-01-01

    The current computing landscape owes its roots to the birth of hardware and software technologies from the 1940s and 1950s. Since then, the advent of mainframes, miniaturized computing, and internetworking has given rise to the now prevalent cloud computing era. In the past few months just after 2010, cloud computing adoption has picked up pace…

  18. Computer Assisted Instruction to Promote Comprehension in Students with Learning Disabilities

    ERIC Educational Resources Information Center

    Stetter, Maria Earman; Hughes, Marie Tejero

    2011-01-01

    Reading comprehension is a crucial skill for the academic success of all students. Very often, students with learning disabilities struggle with reading skills, and since students learn new information in school by reading, these difficulties often increase the academic struggles that students with learning disabilities face. The current study examined…

  19. Deriving Empirically-Based Design Guidelines for Advanced Learning Technologies that Foster Disciplinary Comprehension

    ERIC Educational Resources Information Center

    Poitras, Eric; Trevors, Gregory

    2012-01-01

    Planning, conducting, and reporting leading-edge research requires professionals who are capable of highly skilled reading. This study reports the development of an empirically informed computer-based learning environment designed to foster the acquisition of reading comprehension strategies that mediate expertise in the social sciences. Empirical…

  20. Computational 3-D Model of the Human Respiratory System

    EPA Science Inventory

    We are developing a comprehensive, morphologically-realistic computational model of the human respiratory system that can be used to study the inhalation, deposition, and clearance of contaminants, while being adaptable for age, race, gender, and health/disease status. The model ...

  1. A Randomized Controlled Trial Study of the ABRACADABRA Reading Intervention Program in Grade 1

    ERIC Educational Resources Information Center

    Savage, Robert S.; Abrami, Philip; Hipps, Geoffrey; Deault, Louise

    2009-01-01

    This study reports a randomized controlled trial evaluation of a computer-based balanced literacy intervention, ABRACADABRA (http://grover.concordia.ca/abra/version1/abracadabra.html). Children (N = 144) in Grade 1 were exposed either to computer activities for word analysis, text comprehension, and fluency, alongside shared stories (experimental…

  2. Computer versus Paper-Based Reading: A Case Study in English Language Teaching Context

    ERIC Educational Resources Information Center

    Solak, Ekrem

    2014-01-01

    This research aims to determine the preference of prospective English teachers in performing computer- and paper-based reading tasks and to what extent computer- and paper-based reading influence their reading speed, accuracy and comprehension. The research was conducted in the English Language Teaching Department of a state-run university in Turkey. The…

  3. Identifying Discriminating Variables between Teachers Who Fully Integrate Computers and Teachers with Limited Integration

    ERIC Educational Resources Information Center

    Mueller, Julie; Wood, Eileen; Willoughby, Teena; Ross, Craig; Specht, Jacqueline

    2008-01-01

    Given the prevalence of computers in education today, it is critical to understand teachers' perspectives regarding computer integration in their classrooms. The current study surveyed a random sample of a heterogeneous group of 185 elementary and 204 secondary teachers in order to provide a comprehensive summary of teacher characteristics and…

  4. An Investigation of the Effectiveness of Computer Simulation Programs as Tutorial Tools for Teaching Population Ecology at University.

    ERIC Educational Resources Information Center

    Korfiatis, K.; Papatheodorou, E.; Paraskevopoulous, S.; Stamou, G. P.

    1999-01-01

    Describes a study of the effectiveness of computer-simulation programs in enhancing biology students' familiarity with ecological modeling and concepts. Finds that computer simulations improved student comprehension of ecological processes expressed in mathematical form, but did not allow a full understanding of ecological concepts. Contains 28…

  5. Errors and Intelligence in Computer-Assisted Language Learning: Parsers and Pedagogues. Routledge Studies in Computer Assisted Language Learning

    ERIC Educational Resources Information Center

    Heift, Trude; Schulze, Mathias

    2012-01-01

    This book provides the first comprehensive overview of theoretical issues, historical developments and current trends in ICALL (Intelligent Computer-Assisted Language Learning). It assumes a basic familiarity with Second Language Acquisition (SLA) theory and teaching, CALL and linguistics. It is of interest to upper undergraduate and/or graduate…

  6. Cognitive Load for Configuration Comprehension in Computer-Supported Geometry Problem Solving: An Eye Movement Perspective

    ERIC Educational Resources Information Center

    Lin, John Jr-Hung; Lin, Sunny S. J.

    2014-01-01

    The present study investigated (a) whether the perceived cognitive load was different when geometry problems with various levels of configuration comprehension were solved and (b) whether eye movements in comprehending geometry problems showed sources of cognitive loads. In the first investigation, three characteristics of geometry configurations…

  7. Health Information Technology and Physician-Patient Interactions: Impact of Computers on Communication during Outpatient Primary Care Visits

    PubMed Central

    Hsu, John; Huang, Jie; Fung, Vicki; Robertson, Nan; Jimison, Holly; Frankel, Richard

    2005-01-01

    Objective: The aim of this study was to evaluate the impact of introducing health information technology (HIT) on physician-patient interactions during outpatient visits. Design: This was a longitudinal pre-post study: two months before and one and seven months after introduction of examination room computers. Patient questionnaires (n = 313) after primary care visits with physicians (n = 8) within an integrated delivery system. There were three patient satisfaction domains: (1) satisfaction with visit components, (2) comprehension of the visit, and (3) perceptions of the physician's use of the computer. Results: Patients reported that physicians used computers in 82.3% of visits. Compared with baseline, overall patient satisfaction with visits increased seven months after the introduction of computers (odds ratio [OR] = 1.50; 95% confidence interval [CI]: 1.01–2.22), as did satisfaction with physicians' familiarity with patients (OR = 1.60, 95% CI: 1.01–2.52), communication about medical issues (OR = 1.61; 95% CI: 1.05–2.47), and comprehension of decisions made during the visit (OR = 1.63; 95% CI: 1.06–2.50). In contrast, there were no significant changes in patient satisfaction with comprehension of self-care responsibilities, communication about psychosocial issues, or available visit time. Seven months post-introduction, patients were more likely to report that the computer helped the visit run in a more timely manner (OR = 1.76; 95% CI: 1.28–2.42) compared with the first month after introduction. There were no other significant changes in patient perceptions of the computer use over time. Conclusion: The examination room computers appeared to have positive effects on physician-patient interactions related to medical communication without significant negative effects on other areas such as time available for patient concerns. Further study is needed to better understand HIT use during outpatient visits. PMID:15802484

  8. Computer program modifications of Open-file report 82-1065; a comprehensive system for interpreting seismic-refraction and arrival-time data using interactive computer methods

    USGS Publications Warehouse

    Ackermann, Hans D.; Pankratz, Leroy W.; Dansereau, Danny A.

    1983-01-01

    The computer programs published in Open-File Report 82-1065, A comprehensive system for interpreting seismic-refraction arrival-time data using interactive computer methods (Ackermann, Pankratz, and Dansereau, 1982), have been modified to run on a mini-computer. The new version uses approximately 1/10 of the memory of the initial version, is more efficient and gives the same results.

  9. Using computer agents to explain medical documents to patients with low health literacy.

    PubMed

    Bickmore, Timothy W; Pfeifer, Laura M; Paasche-Orlow, Michael K

    2009-06-01

    Patients are commonly presented with complex documents that they have difficulty understanding. The objective of this study was to design and evaluate an animated computer agent to explain research consent forms to potential research participants. Subjects were invited to participate in a simulated consent process for a study involving a genetic repository. Explanation of the research consent form by the computer agent was compared to explanation by a human and a self-study condition in a randomized trial. Responses were compared according to level of health literacy. Participants were most satisfied with the consent process and most likely to sign the consent form when it was explained by the computer agent, regardless of health literacy level. Participants with adequate health literacy demonstrated the highest level of comprehension with the computer agent-based explanation compared to the other two conditions. However, participants with limited health literacy showed poor comprehension levels in all three conditions. Participants with limited health literacy reported several reasons, such as lack of time constraints, ability to re-ask questions, and lack of bias, for preferring the computer agent-based explanation over a human-based one. Animated computer agents can perform as well as or better than humans in the administration of informed consent. Animated computer agents represent a viable method for explaining health documents to patients.

  10. Toward using alpha and theta brain waves to quantify programmer expertise.

    PubMed

    Crk, Igor; Kluthe, Timothy

    2014-01-01

    Empirical studies of programming language learnability and usability have thus far depended on indirect measures of human cognitive performance, attempting to capture what is at its essence a purely cognitive exercise through various indicators of comprehension, such as the correctness of coding tasks or the time spent working out the meaning of code and producing acceptable solutions. Understanding program comprehension is essential to understanding the inherent complexity of programming languages, and ultimately, having a measure of mental effort based on direct observation of the brain at work will illuminate the nature of the work of programming. We provide evidence of direct observation of the cognitive effort associated with programming tasks, through a carefully constructed empirical study using a cross-section of undergraduate computer science students and an inexpensive, off-the-shelf brain-computer interface device. This study presents a link between expertise and programming language comprehension, draws conclusions about the observed indicators of cognitive effort using recent cognitive theories, and proposes directions for future work that is now possible.

  11. MetaboTools: A comprehensive toolbox for analysis of genome-scale metabolic models

    DOE PAGES

    Aurich, Maike K.; Fleming, Ronan M. T.; Thiele, Ines

    2016-08-03

    Metabolomic data sets provide a direct read-out of cellular phenotypes and are increasingly generated to study biological questions. Previous work, by us and others, revealed the potential of analyzing extracellular metabolomic data in the context of the metabolic model using constraint-based modeling. With the MetaboTools, we make our methods available to the broader scientific community. The MetaboTools consist of a protocol, a toolbox, and tutorials of two use cases. The protocol describes, in a step-wise manner, the workflow of data integration and computational analysis. The MetaboTools comprise the Matlab code required to complete the workflow described in the protocol. Tutorials explain the computational steps for integration of two different data sets and demonstrate a comprehensive set of methods for the computational analysis of metabolic models and stratification thereof into different phenotypes. The presented workflow supports integrative analysis of multiple omics data sets. Importantly, all analysis tools can be applied to metabolic models without performing the entire workflow. Taken together, the MetaboTools constitute a comprehensive guide to the intra-model analysis of extracellular metabolomic data from microbial, plant, or human cells. In conclusion, this computational modeling resource offers a broad set of computational analysis tools for a wide biomedical and non-biomedical research community.
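
    The constraint-based modeling approach this record describes can be illustrated with a toy flux-balance computation. The three-reaction network, the bounds, and the use of SciPy's linear-programming solver below are illustrative assumptions for the sketch only; MetaboTools itself is implemented in Matlab.

    ```python
    import numpy as np
    from scipy.optimize import linprog

    # Toy flux balance analysis (FBA), the core step of constraint-based
    # modeling: maximize an objective flux subject to steady state (S v = 0)
    # and capacity bounds. Hypothetical 3-reaction network:
    #   v1: uptake  -> A,   v2: A -> B,   v3: B -> biomass
    S = np.array([[1, -1,  0],    # mass balance for metabolite A
                  [0,  1, -1]])   # mass balance for metabolite B
    bounds = [(0, 10), (0, 1000), (0, 1000)]  # uptake capped at 10 units

    # linprog minimizes, so negate the biomass flux v3 to maximize it.
    res = linprog(c=[0, 0, -1], A_eq=S, b_eq=[0, 0], bounds=bounds)
    max_biomass = -res.fun  # limited by the uptake bound
    ```

    At steady state every unit taken up must flow through to biomass, so the optimum is set entirely by the uptake cap.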

  12. Effects of Static Visuals and Computer-Generated Animations in Facilitating Immediate and Delayed Achievement in the EFL Classroom

    ERIC Educational Resources Information Center

    Lin, Huifen; Chen, Tsuiping; Dwyer, Francis M.

    2006-01-01

    The purpose of this experimental study was to compare the effects of using static visuals versus computer-generated animation to enhance learners' comprehension and retention of a content-based lesson in a computer-based learning environment for learning English as a foreign language (EFL). Fifty-eight students from two EFL reading sections were…

  13. Towards a Versatile Tele-Education Platform for Computer Science Educators Based on the Greek School Network

    ERIC Educational Resources Information Center

    Paraskevas, Michael; Zarouchas, Thomas; Angelopoulos, Panagiotis; Perikos, Isidoros

    2013-01-01

    Nowadays the need for highly qualified computer science educators in modern educational environments is commonplace and growing. This study examines the potential use of the Greek School Network (GSN) to provide a robust and comprehensive e-training course for computer science educators in order to efficiently exploit advanced IT services and establish a…

  14. Feasibility Study on the Use of Computer Managed Learning in Secondary Schools in the U.S.A.

    ERIC Educational Resources Information Center

    Charp, Sylvia

    A brief description of computer managed instruction (CMI), including its applications and capabilities, introduces case studies of schools in the United States that are using three different CMI systems. The first system discussed is the Comprehensive Achievement Monitoring (CAM) Program, which was developed by a small school district (Hopkins,…

  15. Three dimensional ray tracing of the Jovian magnetosphere in the low frequency range

    NASA Technical Reports Server (NTRS)

    Menietti, J. D.

    1984-01-01

    Ray tracing studies of Jovian low frequency emissions were conducted. A comprehensive three-dimensional ray tracing computer code for examination of model Jovian decametric (DAM) emission was developed. The improvements to the computer code are outlined and described. The results of the ray tracings of Jovian emissions will be presented in summary form.

  16. A CS1 pedagogical approach to parallel thinking

    NASA Astrophysics Data System (ADS)

    Rague, Brian William

    Almost all collegiate programs in Computer Science offer an introductory course in programming primarily devoted to communicating the foundational principles of software design and development. The ACM designates this introduction to computer programming course for first-year students as CS1, during which methodologies for solving problems within a discrete computational context are presented. Logical thinking is highlighted, guided primarily by a sequential approach to algorithm development and made manifest by typically using the latest, commercially successful programming language. In response to the most recent developments in accessible multicore computers, instructors of these introductory classes may wish to include training on how to design workable parallel code. Novel issues arise when programming concurrent applications which can make teaching these concepts to beginning programmers a seemingly formidable task. Student comprehension of design strategies related to parallel systems should be monitored to ensure an effective classroom experience. This research investigated the feasibility of integrating parallel computing concepts into the first-year CS classroom. To quantitatively assess student comprehension of parallel computing, an experimental educational study using a two-factor mixed group design was conducted to evaluate two instructional interventions in addition to a control group: (1) topic lecture only, and (2) topic lecture with laboratory work using a software visualization Parallel Analysis Tool (PAT) specifically designed for this project. A new evaluation instrument developed for this study, the Perceptions of Parallelism Survey (PoPS), was used to measure student learning regarding parallel systems. 
The results from this educational study show a statistically significant main effect among the repeated measures, implying that student comprehension levels of parallel concepts as measured by the PoPS improve immediately after the delivery of any initial three-week CS1 level module when compared with student comprehension levels just prior to starting the course. Survey results measured during the ninth week of the course reveal that performance levels remained high compared to pre-course performance scores. A second result produced by this study reveals no statistically significant interaction effect between the intervention method and student performance as measured by the evaluation instrument over three separate testing periods. However, visual inspection of survey score trends and the low p-value generated by the interaction analysis (0.062) indicate that further studies may verify improved concept retention levels for the lecture w/PAT group.

  17. The Accuracy of Cognitive Monitoring during Computer-Based Instruction.

    ERIC Educational Resources Information Center

    Garhart, Casey; Hannafin, Michael J.

    This study was conducted to determine the accuracy of learners' comprehension monitoring during computer-based instruction and to assess the relationship between enroute monitoring and different levels of learning. Participants were 50 university undergraduate students enrolled in an introductory educational psychology class. All students received…

  18. Eye vs. Text Movement: Which Technique Leads to Faster Reading Comprehension?

    ERIC Educational Resources Information Center

    Abdellah, Antar Solhy

    2009-01-01

    Eye fixation is a frequent problem that faces foreign language learners and hinders the flow of their reading comprehension. Although students are usually advised to read fast/skim to overcome this problem, eye fixation persists. The present study investigates the effect of using a paper-based program as compared to a computer-based software in…

  19. Experimental Evaluation of Computer Assisted Self-Assessment of Reading Comprehension: Effects on Reading Achievement and Attitude.

    ERIC Educational Resources Information Center

    Vollands, Stacy R.; And Others

    A study evaluated the effect software for self-assessment and management of reading practice had on reading achievement and motivation in two primary schools in Aberdeen, Scotland. The program utilized was The Accelerated Reader (AR) which was designed to enable curriculum based assessment of reading comprehension within the classroom. Students…

  20. Driver comprehension of managed lane signing.

    DOT National Transportation Integrated Search

    2009-09-01

    A statewide survey of driver comprehension of managed lane signing is reported. Computer-based surveys were conducted using video clips of computer animations as well as still images of signs. The surveys were conducted in four Texas cities with a to...

  1. Vectorial Representations of Meaning for a Computational Model of Language Comprehension

    ERIC Educational Resources Information Center

    Wu, Stephen Tze-Inn

    2010-01-01

    This thesis aims to define and extend a line of computational models for text comprehension that are humanly plausible. Since natural language is human by nature, computational models of human language will always be just that--models. To the degree that they miss out on information that humans would tap into, they may be improved by considering…

  2. Computer Graphics and Metaphorical Elaboration for Learning Science Concepts.

    ERIC Educational Resources Information Center

    ChanLin, Lih-Juan; Chan, Kung-Chi

    This study explores the instructional impact of using computer multimedia to integrate metaphorical verbal information into graphical representations of biotechnology concepts. The combination of text and graphics into a single metaphor makes concepts dual-coded, and therefore more comprehensible and memorable for the student. Visual stimuli help…

  3. What's so Simple about Simplified Texts? A Computational and Psycholinguistic Investigation of Text Comprehension and Text Processing

    ERIC Educational Resources Information Center

    Crossley, Scott A.; Yang, Hae Sung; McNamara, Danielle S.

    2014-01-01

    This study uses a moving windows self-paced reading task to assess both text comprehension and processing time of authentic texts and these same texts simplified to beginning and intermediate levels. Forty-eight second language learners each read 9 texts (3 different authentic, beginning, and intermediate level texts). Repeated measures ANOVAs…

  4. Comprehensive silicon solar-cell computer modeling

    NASA Technical Reports Server (NTRS)

    Lamorte, M. F.

    1984-01-01

    A comprehensive silicon solar cell computer modeling scheme was developed to perform the following tasks: (1) model and analysis of the net charge distribution in quasineutral regions; (2) experimentally determined temperature behavior of Spire Corp. n+pp+ solar cells where n+-emitter is formed by ion implantation of 75As or 31P; and (3) initial validation results of computer simulation program using Spire Corp. n+pp+ cells.

  5. The Temporal Dimension of Linguistic Prediction

    ERIC Educational Resources Information Center

    Chow, Wing Yee

    2013-01-01

    This thesis explores how predictions about upcoming language inputs are computed during real-time language comprehension. Previous research has demonstrated humans' ability to use rich contextual information to compute linguistic prediction during real-time language comprehension, and it has been widely assumed that contextual information can…

  6. Requirements for Next Generation Comprehensive Analysis of Rotorcraft

    NASA Technical Reports Server (NTRS)

    Johnson, Wayne; Datta, Anubhav

    2008-01-01

    The unique demands of rotorcraft aeromechanics analysis have led to the development of software tools that are described as comprehensive analyses. The next generation of rotorcraft comprehensive analyses will be driven and enabled by the tremendous capabilities of high performance computing, particularly modular and scalable software executed on multiple cores. Development of a comprehensive analysis based on high performance computing both demands and permits a new analysis architecture. This paper describes a vision of the requirements for this next generation of comprehensive analyses of rotorcraft. The requirements for what must be included are described and substantiated, and justification is provided for what should be excluded. With this guide, a path to the next generation code can be found.

  7. Neural bases of event knowledge and syntax integration in comprehension of complex sentences.

    PubMed

    Malaia, Evie; Newman, Sharlene

    2015-01-01

    Comprehension of complex sentences is necessarily supported by both syntactic and semantic knowledge, but what linguistic factors trigger a reader's reliance on a specific system? This functional neuroimaging study orthogonally manipulated argument plausibility and verb event type to investigate cortical bases of the semantic effect on argument comprehension during reading. The data suggest that telic verbs facilitate online processing by means of consolidating the event schemas in episodic memory and by easing the computation of syntactico-thematic hierarchies in the left inferior frontal gyrus. The results demonstrate that syntax-semantics integration relies on trade-offs among a distributed network of regions for maximum comprehension efficiency.

  8. EFL Learners' Perceived Use of Conversation Maintenance Strategies during Synchronous Computer-Mediated Communication with Native English Speakers

    ERIC Educational Resources Information Center

    Ino, Atsushi

    2014-01-01

    This study investigated the perceived use of conversation maintenance strategies during synchronous computer-mediated communication with native English speakers. I also correlated the relationships of the strategies used with students' speaking ability and comprehensive proficiency level. The research questions were: (1) how were the learners'…

  9. Incorporating IStation into Early Childhood Classrooms to Improve Reading Comprehension

    ERIC Educational Resources Information Center

    Luo, Tian; Lee, Guang-Lea; Molina, Cynthia

    2017-01-01

    Aim/Purpose: IStation is an adaptive computer-based reading program that adjusts to the learner's academic needs. This study investigates if the IStation computer-based reading program promotes reading improvement scores as shown on the STAR Reading test and the IStation test scaled scores for elementary school third-grade learners on different…

  10. Bimodal Reading: Benefits of a Talking Computer for Average and Less Skilled Readers.

    ERIC Educational Resources Information Center

    Montali, Julie; Lewandowski, Lawrence

    1996-01-01

    Eighteen average readers and 18 less-skilled readers (grades 8 and 9) were presented with social studies and science passages via a computer either visually (on screen), auditorily (read by digitized voice), or bimodally (on screen, highlighted while being voiced). Less-skilled readers demonstrated comprehension in the bimodal condition equivalent…

  11. Computer-generated vs. physician-documented history of present illness (HPI): results of a blinded comparison.

    PubMed

    Almario, Christopher V; Chey, William; Kaung, Aung; Whitman, Cynthia; Fuller, Garth; Reid, Mark; Nguyen, Ken; Bolus, Roger; Dennis, Buddy; Encarnacion, Rey; Martinez, Bibiana; Talley, Jennifer; Modi, Rushaba; Agarwal, Nikhil; Lee, Aaron; Kubomoto, Scott; Sharma, Gobind; Bolus, Sally; Chang, Lin; Spiegel, Brennan M R

    2015-01-01

    Healthcare delivery now mandates shorter visits with higher documentation requirements, undermining the patient-provider interaction. To improve clinic visit efficiency, we developed a patient-provider portal that systematically collects patient symptoms using a computer algorithm called Automated Evaluation of Gastrointestinal Symptoms (AEGIS). AEGIS also automatically "translates" the patient report into a full narrative history of present illness (HPI). We aimed to compare the quality of computer-generated vs. physician-documented HPIs. We performed a cross-sectional study with a paired sample design among individuals visiting outpatient adult gastrointestinal (GI) clinics for evaluation of active GI symptoms. Participants first underwent usual care and then subsequently completed AEGIS. Each individual thereby had both a physician-documented and a computer-generated HPI. Forty-eight blinded physicians assessed HPI quality across six domains using 5-point scales: (i) overall impression, (ii) thoroughness, (iii) usefulness, (iv) organization, (v) succinctness, and (vi) comprehensibility. We compared HPI scores within patient using a repeated measures model. Seventy-five patients had both computer-generated and physician-documented HPIs. The mean overall impression score for computer-generated HPIs was higher than physician HPIs (3.68 vs. 2.80; P<0.001), even after adjusting for physician and visit type, location, mode of transcription, and demographics. Computer-generated HPIs were also judged more complete (3.70 vs. 2.73; P<0.001), more useful (3.82 vs. 3.04; P<0.001), better organized (3.66 vs. 2.80; P<0.001), more succinct (3.55 vs. 3.17; P<0.001), and more comprehensible (3.66 vs. 2.97; P<0.001). Computer-generated HPIs were of higher overall quality, better organized, and more succinct, comprehensible, complete, and useful compared with HPIs written by physicians during usual care in GI clinics.

  12. The Future of the Book. Part III. New Technologies in Book Distribution: The United States Experience. Studies on Books and Reading No. 18.

    ERIC Educational Resources Information Center

    Paul, Sandra K.; Kranberg, Susan

    The third report from a comprehensive Unesco study, this document traces the history of the application of computer-based technology to the book distribution process in the United States and indicates functional areas currently showing the effects of using this technology. Ways in which computer use is altering book distribution management…

  13. Robust Nucleus/Cell Detection and Segmentation in Digital Pathology and Microscopy Images: A Comprehensive Review.

    PubMed

    Xing, Fuyong; Yang, Lin

    2016-01-01

    Digital pathology and microscopy image analysis is widely used for comprehensive studies of cell morphology or tissue structure. Manual assessment is labor intensive and prone to interobserver variations. Computer-aided methods, which can significantly improve the objectivity and reproducibility, have attracted a great deal of interest in recent literature. Among the pipeline of building a computer-aided diagnosis system, nucleus or cell detection and segmentation play a very important role to describe the molecular morphological information. In the past few decades, many efforts have been devoted to automated nucleus/cell detection and segmentation. In this review, we provide a comprehensive summary of the recent state-of-the-art nucleus/cell segmentation approaches on different types of microscopy images including bright-field, phase-contrast, differential interference contrast, fluorescence, and electron microscopies. In addition, we discuss the challenges for the current methods and the potential future work of nucleus/cell detection and segmentation.

  14. Lévy-like diffusion in eye movements during spoken-language comprehension.

    PubMed

    Stephen, Damian G; Mirman, Daniel; Magnuson, James S; Dixon, James A

    2009-05-01

    This study explores the diffusive properties of human eye movements during a language comprehension task. In this task, adults are given auditory instructions to locate named objects on a computer screen. Although it has been convention to model visual search as standard Brownian diffusion, we find evidence that eye movements are hyperdiffusive. Specifically, we use comparisons of maximum-likelihood fit as well as standard deviation analysis and diffusion entropy analysis to show that visual search during language comprehension exhibits Lévy-like rather than Gaussian diffusion.
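
    The standard deviation analysis this abstract mentions can be sketched on synthetic trajectories. The sketch below is a minimal illustration of how Brownian and hyperdiffusive scaling differ, assuming a simple Lévy-walk-style generator (unit velocity held for heavy-tailed durations); it is not the authors' analysis code, and the study itself pairs this with diffusion entropy analysis, which is not shown here.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def scaling_exponent(x, max_lag):
        """Estimate H in std(x[t+lag] - x[t]) ~ lag**H (standard deviation
        analysis). H near 0.5 indicates Brownian diffusion; H above 0.5
        indicates hyperdiffusive spreading."""
        lags = np.unique(np.logspace(0, np.log10(max_lag), 20).astype(int))
        sds = [np.std(x[lag:] - x[:-lag]) for lag in lags]
        slope, _intercept = np.polyfit(np.log(lags), np.log(sds), 1)
        return slope

    n = 200_000

    # Brownian reference: i.i.d. Gaussian steps.
    brownian = np.cumsum(rng.normal(size=n))

    # Lévy-walk-style trajectory: unit velocity held for Pareto-tailed
    # durations, so the walk spreads faster than a Brownian one.
    runs, total = [], 0
    while total < n:
        duration = min(int(np.ceil(1.0 / (1.0 - rng.random()))), n)
        runs.append(np.full(duration, rng.choice([-1.0, 1.0])))
        total += duration
    levy_walk = np.cumsum(np.concatenate(runs)[:n])

    h_brown = scaling_exponent(brownian, n // 4)
    h_levy = scaling_exponent(levy_walk, n // 4)
    ```

    The Gaussian walk yields an exponent near 0.5, while the long constant-velocity runs push the Lévy-walk exponent well above it.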

  15. Comprehensive evaluation of garment assembly line with simulation

    NASA Astrophysics Data System (ADS)

    Xu, Y.; Thomassey, S.; Chen, Y.; Zeng, X.

    2017-10-01

    In this paper, a comprehensive evaluation system is established to assess garment production performance. It is based on performance indicators and supported with the corresponding results obtained by manual calculation or computer simulation. The assembly lines of a typical men’s shirt are taken as the study objects. With the comprehensive evaluation results, garment production arrangement scenarios can be better analysed and the most appropriate one selected for actual production. This offers companies guidance on quick decision-making and multi-objective optimization of garment production.

  16. Lévy-like diffusion in eye movements during spoken-language comprehension

    NASA Astrophysics Data System (ADS)

    Stephen, Damian G.; Mirman, Daniel; Magnuson, James S.; Dixon, James A.

    2009-05-01

    This study explores the diffusive properties of human eye movements during a language comprehension task. In this task, adults are given auditory instructions to locate named objects on a computer screen. Although it has been convention to model visual search as standard Brownian diffusion, we find evidence that eye movements are hyperdiffusive. Specifically, we use comparisons of maximum-likelihood fit as well as standard deviation analysis and diffusion entropy analysis to show that visual search during language comprehension exhibits Lévy-like rather than Gaussian diffusion.

  17. The Effect of Electronic Storybooks on Struggling Fourth-Graders' Reading Comprehension

    ERIC Educational Resources Information Center

    Ertem, Ihsan Seyit

    2010-01-01

    This quantitative research examined the differences in struggling readers' comprehension of storybooks according to the medium of presentation. Each student was randomly assigned with one of three conditions: (1) computer presentation of storybooks with animation; (2) computer presentation of storybooks without animation; and (3) traditional print…

  18. Cognitive correlates of pragmatic language comprehension in adult traumatic brain injury: A systematic review and meta-analyses.

    PubMed

    Rowley, Dane A; Rogish, Miles; Alexander, Timothy; Riggs, Kevin J

    2017-01-01

    Effective pragmatic comprehension of language is critical for successful communication and interaction, but this ability is routinely impaired following Traumatic Brain Injury (TBI) (1,2). Individual studies have investigated the cognitive domains associated with impaired pragmatic comprehension, but there remains little understanding of the relative importance of these domains in contributing to pragmatic comprehension impairment following TBI. This paper presents a systematic meta-analytic review of the observed correlations between pragmatic comprehension and cognitive processes following TBI. Five meta-analyses were computed, which quantified the relationship between pragmatic comprehension and five key cognitive constructs (declarative memory; working memory; attention; executive functions; social cognition). Significant moderate-to-strong correlations were found between all cognitive measures and pragmatic comprehension, where declarative memory was the strongest correlate. Thus, our findings indicate that pragmatic comprehension in TBI is associated with an array of domain general cognitive processes, and as such deficits in these cognitive domains may underlie pragmatic comprehension difficulties following TBI. The clinical implications of these findings are discussed.
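
    The correlation pooling underlying meta-analyses like those in this record is conventionally done with Fisher's z transform. The coefficients and sample sizes below are hypothetical; the sketch only illustrates the standard fixed-effect pooling step, not the authors' computations.

    ```python
    import math

    def pooled_correlation(rs, ns):
        """Fixed-effect pooled correlation via Fisher's z transform.

        rs: per-study correlation coefficients; ns: per-study sample sizes.
        Each r is mapped to z = atanh(r), averaged with inverse-variance
        weights (Var(z) = 1/(n - 3)), and mapped back with tanh.
        """
        zs = [math.atanh(r) for r in rs]
        ws = [n - 3 for n in ns]
        z_bar = sum(w * z for w, z in zip(ws, zs)) / sum(ws)
        return math.tanh(z_bar)

    # Hypothetical study-level correlations and sample sizes.
    r_pooled = pooled_correlation([0.45, 0.60, 0.52], [40, 25, 60])
    ```

    The pooled estimate necessarily lies between the smallest and largest study-level correlations, weighted toward the larger samples.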

  19. The Effect of Computer-Assisted Learning Integrated with Metacognitive Prompts on Students' Affective Skills

    ERIC Educational Resources Information Center

    Tatar, Nilgün; Akpinar, Ercan; Feyzioglu, Eylem Yildiz

    2013-01-01

    The purpose of this study is to investigate the effect of computer-assisted learning integrated with metacognitive prompts on elementary students' affective skills on the subject of electricity. The researchers developed educational software to enable students to easily and comprehensively learn the concepts in the subject of electricity. A…

  20. Development and Initial Psychometric Properties of the Computer Assisted Maltreatment Inventory (CAMI): A Comprehensive Self-Report Measure of Child Maltreatment History

    ERIC Educational Resources Information Center

    DiLillo, David; Hayes-Skelton, Sarah A.; Fortier, Michelle A.; Perry, Andrea R.; Evans, Sarah E.; Messman Moore, Terri L.; Walsh, Kate; Nash, Cindy; Fauchier, Angele

    2010-01-01

    Objectives: The present study reports on the development and initial psychometric properties of the Computer Assisted Maltreatment Inventory (CAMI), a web-based self-report measure of child maltreatment history, including sexual and physical abuse, exposure to interparental violence, psychological abuse, and neglect. Methods: The CAMI was…

  1. Effects of Computer Assisted Learning Instructions on Reading Achievement among Middle School English Language Learners

    ERIC Educational Resources Information Center

    Bayley-Hamlet, Simone O.

    2017-01-01

    The purpose of this study was to examine the effect of Imagine Learning, a computer assisted language learning (CALL) program, on addressing reading achievement for English language learners (ELLs). This is a measurement used in the Accessing Comprehension and Communication in English State-to-State (ACCESS for ELLs or ACCESS) reading scale…

  2. The Impact of Animation in CD-ROM Books on Students' Reading Behaviors and Comprehension.

    ERIC Educational Resources Information Center

    Okolo, Cindy; Hayes, Renee

    This study evaluated the use of children's literature presented via one of three conditions: an adult reading a book to the child; the child reading a CD-ROM version of a book on the computer but without animation; and the child reading the book on the computer with high levels of animation. The study, in one primary grade classroom, involved 10…

  3. Assessing the Impact of Computer-Based Formative Evaluations in a Course of English as a Foreign Language for Undergraduate Kinesiology Students in Chile

    ERIC Educational Resources Information Center

    Lazzeri, Santos; Cabezas, Ximena; Ojeda, Luis; Leiva, Francisca

    2015-01-01

    This study assesses the impact of computer-based formative evaluations in an undergraduate English course for second semester kinesiology students at the Universidad Austral de Chile-Valdivia (UACh). The target of the course is to improve the students' online reading comprehension skills in their field. A preliminary study was carried out in order…

  4. Reading Research in 1984: Comprehension, Computers, Communication. Fifth Yearbook of the American Reading Forum.

    ERIC Educational Resources Information Center

    McNinch, George H., Ed.; And Others

    Conference presentations of research on reading comprehension, reading instruction, computer applications in reading instruction, and reading theory are compiled in this yearbook. Titles and authors of some of the articles are as follows: "A Rationale for Teaching Children with Limited English Proficiency" (M. Zintz); "Preliminary Development of a…

  5. Computer program to perform cost and weight analysis of transport aircraft. Volume 2: Technical volume

    NASA Technical Reports Server (NTRS)

    1973-01-01

    An improved method for estimating aircraft weight and cost using a unique and fundamental approach was developed. The results of this study were integrated into a comprehensive digital computer program, which is intended for use at the preliminary design stage of aircraft development. The program provides a means of computing absolute values for weight and cost, and enables the user to perform trade studies with a sensitivity to detail design and overall structural arrangement. Both batch and interactive graphics modes of program operation are available.

  6. Psychology of computer use: XXXII. Computer screen-savers as distractors.

    PubMed

    Volk, F A; Halcomb, C G

    1994-12-01

    The differences in performance of 16 male and 16 female undergraduates on three cognitive tasks were investigated in the presence of visual distractors (computer-generated dynamic graphic images). These tasks included skilled and unskilled proofreading and listening comprehension. The visually demanding task of proofreading (skilled and unskilled) showed no significant decreases in performance in the distractor conditions. Results showed significant decrements, however, in performance on listening comprehension in at least one of the distractor conditions.

  7. National electronic medical records integration on cloud computing system.

    PubMed

    Mirza, Hebah; El-Masri, Samir

    2013-01-01

    Few healthcare providers have an advanced level of Electronic Medical Record (EMR) adoption; others have a low level, and most have no EMR at all. Cloud computing is a newly emerging technology that has been used in other industries with great success. Despite its great features, cloud computing has not yet been widely utilized in the healthcare industry. This study presents an innovative healthcare cloud computing system for integrating Electronic Health Records (EHR). The proposed cloud system applies cloud computing technology to the EHR system, to present a comprehensive integrated EHR environment.

  8. Effective Instruction for Persisting Dyslexia in Upper Grades: Adding Hope Stories and Computer Coding to Explicit Literacy Instruction.

    PubMed

    Thompson, Robert; Tanimoto, Steve; Lyman, Ruby Dawn; Geselowitz, Kira; Begay, Kristin Kawena; Nielsen, Kathleen; Nagy, William; Abbott, Robert; Raskind, Marshall; Berninger, Virginia

    2018-05-01

    Children in grades 4 to 6 (N = 14) who, despite early intervention, had persisting dyslexia (impaired word reading and spelling) were assessed before and after computerized reading and writing instruction aimed at subword, word, and syntax skills shown in four prior studies to be effective for treating dyslexia. During the 12 weekly two-hour after-school sessions, they first completed HAWK Letters in Motion© for manuscript and cursive handwriting, HAWK Words in Motion© for phonological, orthographic, and morphological coding for word reading and spelling, and HAWK Minds in Motion© for sentence reading comprehension and written sentence composing. A reading comprehension activity in which sentences were presented one word at a time or one added word at a time was introduced. Next, to instill hope that they could overcome their struggles with reading and spelling, they read and discussed stories about the struggles of Buckminster Fuller, who overcame early disabilities to make important contributions to society. Finally, they engaged in the new Kokopelli's World (KW)©, blocks-based online lessons, to learn computer coding in introductory programming by creating stories in sentence blocks (Tanimoto and Thompson, 2016). Participants improved significantly in the hallmark word decoding and spelling deficits of dyslexia, three syntax skills (oral construction, listening comprehension, and written composing), reading comprehension (with decoding as a covariate), handwriting, orthographic and morphological coding, orthographic loop, and inhibition (focused attention). They answered more reading comprehension questions correctly when they had read sentences presented one word at a time (eliminating both regressions out and regressions in during saccades) than when presented one added word at a time (eliminating only regressions out during saccades). Indicators of improved self-efficacy that they could learn to read and write were observed. Reminders to pay attention and stay on task, which were needed before computer coding was added, were not needed afterward.

  9. Assessing Cognitive Learning of Analytical Problem Solving

    NASA Astrophysics Data System (ADS)

    Billionniere, Elodie V.

    Introductory programming courses, also known as CS1, have a specific set of expected outcomes related to learning the most basic and essential computational concepts in computer science (CS). However, two of the most often heard complaints about such courses are that (1) they are divorced from the reality of application and (2) they make learning the basic concepts tedious. The concepts introduced in CS1 courses are highly abstract and not easily comprehensible. In general, the difficulty is intrinsic to the field of computing, often described as "too mathematical or too abstract." This dissertation presents a small-scale mixed-methods study conducted during the fall 2009 semester of CS1 courses at Arizona State University. The study explored and assessed students' comprehension of three core computational concepts---abstraction, arrays of objects, and inheritance---in both algorithm design and problem solving. Through this investigation, students' profiles were categorized based on their scores, and their mistakes were classified into instances of five computational thinking concepts: abstraction, algorithm, scalability, linguistics, and reasoning. It was shown that even though the notion of computational thinking is not explicit in the curriculum, participants possessed and/or developed this skill through the learning and application of the CS1 core concepts. Furthermore, problem-solving experiences had a direct impact on participants' knowledge skills, explanation skills, and confidence. Implications for teaching CS1 and for future research are also considered.

  10. Computer-Assisted Training in the Comprehension of Authentic French Speech: A Closer View

    ERIC Educational Resources Information Center

    Hoeflaak, Arie

    2004-01-01

    In this article, the development of a computer-assisted listening comprehension project is described. First, we comment briefly on the points of departure, the need for autonomous learning against the background of recent changes in Dutch education, and the role of learning strategies. Then, an error analysis, the programs used for this project,…

  11. Instructional Effectiveness of a Computer-Supported Program for Teaching Reading Comprehension Strategies

    ERIC Educational Resources Information Center

    Ponce, Hector R.; Lopez, Mario J.; Mayer, Richard E.

    2012-01-01

    This article examines the effectiveness of a computer-based instructional program (e-PELS) aimed at direct instruction in a collection of reading comprehension strategies. In e-PELS, students learn to highlight and outline expository passages based on various types of text structures (such as comparison or cause-and-effect) as well as to…

  12. The Effects of CBI Lesson Sequence Type and Field Dependence on Learning from Computer-Based Cooperative Instruction in Web

    ERIC Educational Resources Information Center

    Ipek, Ismail

    2010-01-01

    The purpose of this study was to investigate the effects of CBI lesson sequence type and cognitive style of field dependence on learning from Computer-Based Cooperative Instruction (CBCI) in WEB on the dependent measures, achievement, reading comprehension and reading rate. Eighty-seven college undergraduate students were randomly assigned to…

  13. Curriculum-Based Measurement: Developing a Computer-Based Assessment Instrument for Monitoring Student Reading Progress on Multiple Indicators

    ERIC Educational Resources Information Center

    Forster, Natalie; Souvignier, Elmar

    2011-01-01

    The purpose of this study was to examine the technical adequacy of a computer-based assessment instrument which is based on hierarchical models of text comprehension for monitoring student reading progress following the Curriculum-Based Measurement (CBM) approach. At intervals of two weeks, 120 third-grade students finished eight CBM tests. To…

  14. Effect of Computer Animation Technique on Students' Comprehension of the "Solar System and Beyond" Unit in the Science and Technology Course

    ERIC Educational Resources Information Center

    Aksoy, Gokhan

    2013-01-01

    The purpose of this study is to determine the effect of the computer animation technique on the academic achievement of students in the "Solar System and Beyond" unit, taught as part of the seventh-grade Science and Technology course in primary education. The sample of the study consists of 60 students attending the 7th grade of primary school…

  15. A Practical Engineering Approach to Predicting Fatigue Crack Growth in Riveted Lap Joints

    NASA Technical Reports Server (NTRS)

    Harris, Charles E.; Piascik, Robert S.; Newman, James C., Jr.

    1999-01-01

    An extensive experimental database has been assembled from very detailed teardown examinations of fatigue cracks found in rivet holes of fuselage structural components. Based on this experimental database, a comprehensive analysis methodology was developed to predict the onset of widespread fatigue damage in lap joints of fuselage structure. Several computer codes were developed with specialized capabilities to conduct the various analyses that make up the comprehensive methodology. Over the past several years, the authors have interrogated various aspects of the analysis methods to determine the degree of computational rigor required to produce numerical predictions with acceptable engineering accuracy. This study led to the formulation of a practical engineering approach to predicting fatigue crack growth in riveted lap joints. This paper describes the practical engineering approach and compares predictions with the results from several experimental studies.

  16. A Practical Engineering Approach to Predicting Fatigue Crack Growth in Riveted Lap Joints

    NASA Technical Reports Server (NTRS)

    Harris, C. E.; Piascik, R. S.; Newman, J. C., Jr.

    2000-01-01

    An extensive experimental database has been assembled from very detailed teardown examinations of fatigue cracks found in rivet holes of fuselage structural components. Based on this experimental database, a comprehensive analysis methodology was developed to predict the onset of widespread fatigue damage in lap joints of fuselage structure. Several computer codes were developed with specialized capabilities to conduct the various analyses that make up the comprehensive methodology. Over the past several years, the authors have interrogated various aspects of the analysis methods to determine the degree of computational rigor required to produce numerical predictions with acceptable engineering accuracy. This study led to the formulation of a practical engineering approach to predicting fatigue crack growth in riveted lap joints. This paper describes the practical engineering approach and compares predictions with the results from several experimental studies.

  17. Modern Speed-Reading Apps Do Not Foster Reading Comprehension.

    PubMed

    Acklin, Dina; Papesh, Megan H

    2017-01-01

    New computer apps are gaining popularity by suggesting that reading speeds can be drastically increased when the eye movements that normally occur during reading are eliminated. This is done using rapid serial visual presentation (RSVP), where words are presented 1 at a time, thus preventing natural eye movements such as saccades, fixations, and regressions from occurring. Although the companies producing these apps suggest that RSVP reading does not yield comprehension deficits, research investigating the role of eye movements in reading documents the necessity of natural eye movements for accurate comprehension. The current study explored variables that may affect reading comprehension during RSVP reading, including text difficulty (6th grade and 12th grade), text presentation speed (static, 700 wpm, and 1,000 wpm), and working memory capacity (WMC). Consistent with recent work showing a tenuous relationship between comprehension and WMC, participants' WMC did not predict comprehension scores. Instead, comprehension was most affected by reading speed: static text was associated with superior performance relative to either RSVP reading condition. Furthermore, slower RSVP speeds yielded better verbatim comprehension, and faster speeds benefited inferential comprehension.
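As a concrete illustration of the presentation speeds in this study, an RSVP schedule simply divides one minute by the words-per-minute rate. The sketch below is a minimal, hypothetical version; real apps additionally adjust timing for word length and punctuation, which is omitted here.

```python
def rsvp_schedule(words, wpm):
    """Return (word, display_ms) pairs for a rapid serial visual
    presentation (RSVP) stream. Minimal sketch: every word gets the
    same duration; commercial apps vary timing per word."""
    ms_per_word = 60_000 / wpm  # one minute, in ms, divided by the rate
    return [(w, ms_per_word) for w in words]

# At 700 wpm each word is shown for about 85.7 ms; at 1,000 wpm, 60 ms.
stream = rsvp_schedule("Static text was associated with superior performance".split(), 700)
print(f"{stream[0][0]}: {stream[0][1]:.1f} ms per word")
```

At such durations there is no time for regressions back to earlier words, which is consistent with the static-text condition retaining a comprehension advantage.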

  18. A comprehensive combined experimental and computational framework for pre-clinical wear simulation of total knee replacements.

    PubMed

    Abdelgaied, A; Fisher, J; Jennings, L M

    2018-02-01

    A more robust pre-clinical wear simulation framework is required in order to simulate the wider and higher ranges of activities observed in different patient populations, such as younger, more active patients. Such a framework will help to understand and address the reported higher failure rates for younger and more active patients (National_Joint_Registry, 2016). The current study developed and validated a comprehensive combined experimental and computational framework for pre-clinical wear simulation of total knee replacements (TKR). The input mechanical parameters (elastic modulus and Poisson's ratio) and wear parameters of the moderately cross-linked ultra-high molecular weight polyethylene (UHMWPE) bearing material were independently measured in experimental studies under realistic test conditions, similar to the loading conditions found in total knee replacements. The wear predictions from the computational wear simulation were validated against direct experimental wear measurements for size 3 Sigma curved total knee replacements (DePuy, UK) in an independent experimental wear simulation study under three different daily activities: walking, deep squat, and stair-ascending kinematic conditions. The measured compressive mechanical properties of the moderately cross-linked UHMWPE material were more than 20% lower than those reported in the literature under tensile test conditions. The pin-on-plate wear coefficient of moderately cross-linked UHMWPE was significantly dependent on the contact stress and the degree of cross-shear at the articulating surfaces. The computational wear predictions for the TKR from the current framework were consistent and in good agreement with the independent full-TKR experimental wear simulation measurements, with a coefficient of determination of 0.94 for the framework. In addition, the comprehensive combined experimental and computational framework was able to explain the complex experimental wear trends from the three different daily activities investigated. Therefore, such a framework can be adopted as a pre-clinical simulation approach to optimize different designs and materials, as well as patient-specific total knee replacements, for a range of activities. Copyright © 2017. Published by Elsevier Ltd.

  19. Assessing Comprehension During Reading with the Reading Strategy Assessment Tool (RSAT)

    PubMed Central

    Magliano, Joseph P.; Millis, Keith K.; Levinstein, Irwin

    2011-01-01

    Comprehension emerges as the result of inference and strategic processes that support the construction of a coherent mental model of a text. However, the vast majority of comprehension skills tests adopt a format that does not afford an assessment of these processes as they operate during reading. This study assessed the viability of the Reading Strategy Assessment Tool (RSAT), an automated computer-based reading assessment designed to measure readers’ comprehension and spontaneous use of reading strategies while reading texts. In the tool, readers comprehend passages one sentence at a time and are asked either an indirect (“What are your thoughts regarding your understanding of the sentence in the context of the passage?”) or a direct (e.g., why X?) question after reading each pre-selected target sentence. The answers to the indirect questions are analyzed for the extent to which they contain words associated with comprehension processes. The answers to direct questions are coded for the number of content words they share with an ideal answer, which is intended to be an assessment of emerging comprehension. In the study, the RSAT approach was shown to predict measures of comprehension comparably to standardized tests. The RSAT variables were also shown to correlate with human ratings. The results of this study constitute a “proof of concept” and demonstrate that it is possible to develop a comprehension skills assessment tool that assesses both comprehension and comprehension strategies. PMID:23901332
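The direct-question coding described in this abstract, counting content words shared with an ideal answer, can be sketched as follows. The stopword list and tokenization below are illustrative assumptions, not RSAT's actual implementation.

```python
# Illustrative stopword list; RSAT's actual content-word filtering is more elaborate.
STOPWORDS = {"the", "a", "an", "of", "to", "in", "and", "is", "was", "it", "so", "from"}

def direct_question_score(answer: str, ideal: str) -> int:
    """Count content words a reader's answer shares with an ideal
    answer -- a sketch of RSAT-style direct-question coding."""
    def content_words(text):
        return {w.strip(".,;?!").lower() for w in text.split()} - STOPWORDS
    return len(content_words(answer) & content_words(ideal))

score = direct_question_score(
    "The engine overheated because the coolant leaked",
    "Coolant leaked from the engine, so it overheated")
print(score)  # 4 shared content words: engine, overheated, coolant, leaked
```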

  20. Robust Nucleus/Cell Detection and Segmentation in Digital Pathology and Microscopy Images: A Comprehensive Review

    PubMed Central

    Xing, Fuyong; Yang, Lin

    2016-01-01

    Digital pathology and microscopy image analysis is widely used for comprehensive studies of cell morphology or tissue structure. Manual assessment is labor intensive and prone to inter-observer variation. Computer-aided methods, which can significantly improve objectivity and reproducibility, have attracted a great deal of interest in the recent literature. Within the pipeline of building a computer-aided diagnosis system, nucleus or cell detection and segmentation play a very important role in describing the molecular morphological information. In the past few decades, many efforts have been devoted to automated nucleus/cell detection and segmentation. In this review, we provide a comprehensive summary of recent state-of-the-art nucleus/cell segmentation approaches for different types of microscopy images, including bright-field, phase-contrast, differential interference contrast (DIC), fluorescence, and electron microscopy. In addition, we discuss the challenges facing current methods and potential future work in nucleus/cell detection and segmentation. PMID:26742143

  1. Creating an Electronic Reference and Information Database for Computer-aided ECM Design

    NASA Astrophysics Data System (ADS)

    Nekhoroshev, M. V.; Pronichev, N. D.; Smirnov, G. V.

    2018-01-01

    The paper presents a review of electrochemical shaping. An algorithm has been developed to implement a computer shaping model applicable to pulse electrochemical machining. For that purpose, the characteristics of the pulse current occurring in electrochemical machining of aviation materials have been studied. By integrating the experimental results with comprehensive modeling of the electrochemical machining process data, a subsystem for computer-aided design of electrochemical machining of gas turbine engine blades has been developed; the subsystem was implemented in the Teamcenter PLM system.

  2. From experiment to design -- Fault characterization and detection in parallel computer systems using computational accelerators

    NASA Astrophysics Data System (ADS)

    Yim, Keun Soo

    This dissertation summarizes experimental validation and co-design studies conducted to optimize the fault detection capabilities and overheads of hybrid computer systems (e.g., those using CPUs and Graphics Processing Units, or GPUs), and consequently to improve the scalability of parallel computer systems using computational accelerators. The experimental validation studies were conducted to help us understand the failure characteristics of CPU-GPU hybrid computer systems under various types of hardware faults. The main characterization targets were faults that are difficult to detect and/or recover from, e.g., faults that cause long-latency failures (Ch. 3), faults in dynamically allocated resources (Ch. 4), faults in GPUs (Ch. 5), faults in MPI programs (Ch. 6), and microarchitecture-level faults with specific timing features (Ch. 7). The co-design studies were based on the characterization results. One of the co-designed systems has a set of source-to-source translators that customize and strategically place error detectors in the source code of target GPU programs (Ch. 5). Another co-designed system uses an extension card to learn the normal behavioral and semantic execution patterns of message-passing processes executing on CPUs, and to detect abnormal behaviors of those parallel processes (Ch. 6). The third co-designed system is a co-processor with a set of new instructions to support software-implemented fault detection techniques (Ch. 7). The work described in this dissertation is all the more important because heterogeneous processors have become an essential component of state-of-the-art supercomputers: GPUs were used in three of the five fastest supercomputers operating in 2011. Our work included comprehensive fault characterization studies in CPU-GPU hybrid computers. In CPUs, we monitored the target systems for a long period of time after injecting faults (a temporally comprehensive experiment), and injected faults into various types of program states, including dynamically allocated memory (to be spatially comprehensive). In GPUs, we used fault injection studies to demonstrate the importance of detecting silent data corruption (SDC) errors, which are mainly due to the lack of fine-grained protections and the massive use of fault-insensitive data. This dissertation also presents transparent fault tolerance frameworks and techniques that are directly applicable to hybrid computers built using only commercial off-the-shelf hardware components. It shows that, by developing an understanding of the failure characteristics and error propagation paths of target programs, we were able to create fault tolerance frameworks and techniques that quickly detect and recover from hardware faults with low performance and hardware overheads.
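The memory fault-injection experiments summarized above amount to flipping bits in a running program's state and observing the outcome. A toy injector might look like the sketch below; the names and granularity are illustrative, not the dissertation's actual tooling.

```python
import random

def inject_bit_flip(state: bytearray, rng: random.Random):
    """Flip one randomly chosen bit in a buffer of program state,
    emulating a single-bit hardware fault. Returns (byte, bit) so an
    experiment can log where the fault was injected."""
    byte = rng.randrange(len(state))
    bit = rng.randrange(8)
    state[byte] ^= 1 << bit
    return byte, bit

buf = bytearray(8)                       # pristine state: all bits zero
site = inject_bit_flip(buf, random.Random(42))
flipped = sum(bin(b).count("1") for b in buf)
print(site, flipped)                     # exactly 1 bit now differs
```

An experiment would then run the workload on the corrupted state and classify the result (crash, hang, silent data corruption, or benign), which is how SDC-prone regions were identified.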

  3. A Computational and Experimental Investigation of a Three-Dimensional Hypersonic Scramjet Inlet Flow Field. Ph.D. Thesis

    NASA Technical Reports Server (NTRS)

    Holland, Scott Douglas

    1991-01-01

    A combined computational and experimental parametric study of the internal aerodynamics of a generic three dimensional sidewall compression scramjet inlet configuration was performed. The study was designed to demonstrate the utility of computational fluid dynamics as a design tool in hypersonic inlet flow fields, to provide a detailed account of the nature and structure of the internal flow interactions, and to provide a comprehensive surface property and flow field database to determine the effects of contraction ratio, cowl position, and Reynolds number on the performance of a hypersonic scramjet inlet configuration.

  4. Computational principles of working memory in sentence comprehension.

    PubMed

    Lewis, Richard L; Vasishth, Shravan; Van Dyke, Julie A

    2006-10-01

    Understanding a sentence requires a working memory of the partial products of comprehension, so that linguistic relations between temporally distal parts of the sentence can be rapidly computed. We describe an emerging theoretical framework for this working memory system that incorporates several independently motivated principles of memory: a sharply limited attentional focus, rapid retrieval of item (but not order) information subject to interference from similar items, and activation decay (forgetting over time). A computational model embodying these principles provides an explanation of the functional capacities and severe limitations of human processing, as well as accounts of reading times. The broad implication is that the detailed nature of cross-linguistic sentence processing emerges from the interaction of general principles of human memory with the specialized task of language comprehension.
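The activation-decay principle in this framework can be illustrated with an ACT-R-style base-level activation term, B = ln(Σ t_j^(-d)): items retrieved long ago contribute less activation and are thus harder to retrieve. The parameter values below are illustrative, not the model's fitted ones.

```python
import math

def base_level_activation(retrieval_lags, decay=0.5):
    """ACT-R-style base-level activation B = ln(sum(t_j ** -d)),
    where t_j is the time since each past retrieval and d is the
    decay rate. A minimal sketch of the decay principle."""
    return math.log(sum(t ** -decay for t in retrieval_lags))

# A constituent touched 1 and 2 time units ago is more active than one
# last touched 10 and 20 units ago -- hence the difficulty of computing
# linguistic relations between temporally distal parts of a sentence.
recent = base_level_activation([1.0, 2.0])
distal = base_level_activation([10.0, 20.0])
print(recent > distal)  # True
```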

  5. A Rural Community's Involvement in the Design and Usability Testing of a Computer-Based Informed Consent Process for the Personalized Medicine Research Project

    PubMed Central

    Mahnke, Andrea N; Plasek, Joseph M; Hoffman, David G; Partridge, Nathan S; Foth, Wendy S; Waudby, Carol J; Rasmussen, Luke V; McManus, Valerie D; McCarty, Catherine A

    2014-01-01

    Many informed consent studies demonstrate that research subjects poorly retain and understand information in written consent documents. Previous research in multimedia consent is mixed in terms of success at improving participants’ understanding, satisfaction, and retention. This failure may be due to the lack of a community-centered design approach to building the interventions. The goal of this study was to gather information from the community to determine the best way to undertake the consent process. Community perceptions regarding different computer-based consenting approaches were evaluated, and a computer-based consent was developed and tested. A second goal was to evaluate whether participants make truly informed decisions to participate in research. Simulations of an informed consent process were videotaped to document the process. Focus groups were conducted to determine community attitudes towards a computer-based informed consent process. Hybrid focus groups were conducted to determine the most acceptable hardware device. Usability testing was conducted on a computer-based consent prototype using a touch-screen kiosk. Based on feedback, a computer-based consent was developed. Representative study participants were able to complete the consent easily, and all were able to correctly answer the comprehension check questions. Community involvement in developing a computer-based consent proved valuable for a population-based genetic study. These findings may translate to other types of informed consents, such as consents for genetic clinical trials. A computer-based consent may serve to better communicate consistent, clear, accurate, and complete information regarding the risks and benefits of study participation. Additional analysis is necessary to measure the level of comprehension of the check-question answers by larger numbers of participants. The next step will involve contacting participants to measure whether their understanding of what they consented to is retained over time. PMID:24273095

  6. A rural community's involvement in the design and usability testing of a computer-based informed consent process for the Personalized Medicine Research Project.

    PubMed

    Mahnke, Andrea N; Plasek, Joseph M; Hoffman, David G; Partridge, Nathan S; Foth, Wendy S; Waudby, Carol J; Rasmussen, Luke V; McManus, Valerie D; McCarty, Catherine A

    2014-01-01

    Many informed consent studies demonstrate that research subjects poorly retain and understand information in written consent documents. Previous research in multimedia consent is mixed in terms of success at improving participants' understanding, satisfaction, and retention. This failure may be due to the lack of a community-centered design approach to building the interventions. The goal of this study was to gather information from the community to determine the best way to undertake the consent process. Community perceptions regarding different computer-based consenting approaches were evaluated, and a computer-based consent was developed and tested. A second goal was to evaluate whether participants make truly informed decisions to participate in research. Simulations of an informed consent process were videotaped to document the process. Focus groups were conducted to determine community attitudes towards a computer-based informed consent process. Hybrid focus groups were conducted to determine the most acceptable hardware device. Usability testing was conducted on a computer-based consent prototype using a touch-screen kiosk. Based on feedback, a computer-based consent was developed. Representative study participants were able to complete the consent easily, and all were able to correctly answer the comprehension check questions. Community involvement in developing a computer-based consent proved valuable for a population-based genetic study. These findings may translate to other types of informed consents, including those for trials involving treatment of genetic disorders. A computer-based consent may serve to better communicate consistent, clear, accurate, and complete information regarding the risks and benefits of study participation. Additional analysis is necessary to measure the level of comprehension of the check-question answers by larger numbers of participants. The next step will involve contacting participants to measure whether their understanding of what they consented to is retained over time. © 2013 Wiley Periodicals, Inc.

  7. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Aurich, Maike K.; Fleming, Ronan M. T.; Thiele, Ines

    Metabolomic data sets provide a direct read-out of cellular phenotypes and are increasingly generated to study biological questions. Previous work, by us and others, revealed the potential of analyzing extracellular metabolomic data in the context of the metabolic model using constraint-based modeling. With the MetaboTools, we make our methods available to the broader scientific community. The MetaboTools consist of a protocol, a toolbox, and tutorials for two use cases. The protocol describes, in a step-wise manner, the workflow of data integration and computational analysis. The MetaboTools comprise the Matlab code required to complete the workflow described in the protocol. Tutorials explain the computational steps for integration of two different data sets and demonstrate a comprehensive set of methods for the computational analysis of metabolic models and their stratification into different phenotypes. The presented workflow supports integrative analysis of multiple omics data sets. Importantly, all analysis tools can be applied to metabolic models without performing the entire workflow. Taken together, the MetaboTools constitute a comprehensive guide to the intra-model analysis of extracellular metabolomic data from microbial, plant, or human cells. In conclusion, this computational modeling resource offers a broad set of computational analysis tools for a wide biomedical and non-biomedical research community.
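The data-integration step of such a workflow, mapping measured extracellular exchange rates onto a model's flux bounds, can be sketched in plain Python as below. The actual MetaboTools are MATLAB/COBRA Toolbox code; the reaction names, units, and tolerance here are hypothetical.

```python
def apply_exchange_bounds(model_bounds, measured_rates, tolerance=0.1):
    """Constrain a model's exchange reactions to measured uptake or
    secretion rates (negative = uptake), within a relative tolerance.
    Schematic of constraint-based data integration; not MetaboTools' API."""
    updated = dict(model_bounds)
    for rxn, rate in measured_rates.items():
        slack = abs(rate) * tolerance
        updated[rxn] = (rate - slack, rate + slack)  # (lower, upper) bound
    return updated

bounds = {"EX_glc": (-1000.0, 1000.0), "EX_lac": (-1000.0, 1000.0)}
measured = {"EX_glc": -2.5}  # hypothetical glucose uptake, mmol/gDW/h
print(apply_exchange_bounds(bounds, measured))
```

Unmeasured exchanges keep their default bounds, so the rest of the model stays unconstrained, mirroring how partial metabolomic data can still refine phenotype predictions.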

  8. Supporting students' learning in the domain of computer science

    NASA Astrophysics Data System (ADS)

    Gasparinatou, Alexandra; Grigoriadou, Maria

    2011-03-01

    Previous studies have shown that students with low knowledge understand and learn better from more cohesive texts, whereas high-knowledge students have been shown to learn better from texts of lower cohesion. This study examines whether high-knowledge readers in computer science benefit from a text of low cohesion. Undergraduate students (n = 65) read one of four versions of a text concerning Local Network Topologies, orthogonally varying local and global cohesion. Participants' comprehension was examined through a free-recall measure; text-based, bridging-inference, elaborative-inference, and problem-solving questions; and a sorting task. The results indicated that high-knowledge readers benefited from the low-cohesion text. The interaction of text cohesion and knowledge was reliable for the sorting activity, for elaborative-inference questions, and for problem-solving questions. Although high-knowledge readers performed better on text-based and bridging-inference questions with the low-cohesion text, the interaction of text cohesion and knowledge was not reliable. The results suggest a more complex view of when and for whom textual cohesion affects comprehension, and consequently learning, in computer science.

  9. A Position on a Computer Literacy Course.

    ERIC Educational Resources Information Center

    Self, Charles C.

    A position is put forth on the appropriate content of a computer literacy course and the role of computer literacy in the community college. First, various definitions of computer literacy are examined, including the programming, computer awareness, and comprehensive approaches. Next, five essential components of a computer literacy course are…

  10. A practice course to cultivate students' comprehensive ability of photoelectricity

    NASA Astrophysics Data System (ADS)

    Lv, Yong; Liu, Yang; Niu, Chunhui; Liu, Lishuang

    2017-08-01

    After completing many theoretical courses, it is important and urgent for students majoring in optoelectronic information science and engineering to cultivate comprehensive ability in photoelectricity. We set up a comprehensive practice course named "Integrated Design of Optoelectronic Information System" (IDOIS) so that students can integrate their knowledge of optics, electronics, and computer programming to design, install, and debug an optoelectronic system with independent functions. Eight years of practice show that this course trains students' abilities in the analysis, design/development, and debugging of photoelectric systems, and improves their skills in document retrieval, proposal and summary-report writing, teamwork, and innovation.

  11. Scientific Visualization, Seeing the Unseeable

    ScienceCinema

    LBNL

    2017-12-09

    June 24, 2008 Berkeley Lab lecture: Scientific visualization transforms abstract data into readily comprehensible images, provides a vehicle for "seeing the unseeable," and plays a central role in both experimental and computational sciences. Wes Bethel, who heads the Scientific Visualization Group in the Computational Research Division, presents an overview of visualization and computer graphics, current research challenges, and future directions for the field.

  12. Evaluative methodology for comprehensive water quality management planning

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dyer, H. L.

    Computer-based evaluative methodologies have been developed to provide for the analysis of coupled phenomena associated with natural resource comprehensive planning requirements. Provisions for planner/computer interaction have been included. Each of the simulation models developed is described in terms of its coded procedures. An application of the models for water quality management planning is presented; and the data requirements for each of the models are noted.

  13. Status Report: Mathematics Curriculum-Development Projects Today

    ERIC Educational Resources Information Center

    Arithmetic Teacher, 1972

    1972-01-01

    Brief reports on the Cambridge Conference on School Mathematics, Comprehensive School Mathematics Program, Computer-Assisted Instruction Projects at Stanford, Individually Prescribed Instruction Project, The Madison Project, Mathematics/Science Learning System, MINNEMAST, and School Mathematics Study Group. (MM)

  14. Development of self-compressing BLSOM for comprehensive analysis of big sequence data.

    PubMed

    Kikuchi, Akihito; Ikemura, Toshimichi; Abe, Takashi

    2015-01-01

    With the remarkable increase in genomic sequence data from various organisms, novel tools are needed for comprehensive analyses of available big sequence data. We previously developed a Batch-Learning Self-Organizing Map (BLSOM), which can cluster genomic fragment sequences according to phylotype solely on the basis of oligonucleotide composition, and have applied it to genome and metagenomic studies. BLSOM is suitable for high-performance parallel computing and can analyze big data simultaneously, but a large-scale BLSOM requires large computational resources. We have developed Self-Compressing BLSOM (SC-BLSOM) to reduce computation time, which allows us to carry out comprehensive analysis of big sequence data without the use of high-performance supercomputers. The strategy of SC-BLSOM is to hierarchically construct BLSOMs according to data class, such as phylotype. The first-layer BLSOMs were constructed with each of the divided input data pieces representing a data subclass, such as a phylotype division, resulting in compression of the number of data pieces. The second BLSOM was constructed with the weight vectors obtained from the first-layer BLSOMs. We compared SC-BLSOM with the conventional BLSOM by analyzing bacterial genome sequences. SC-BLSOM could be constructed faster than BLSOM and clustered the sequences according to phylotype with high accuracy, showing the method's suitability for efficient knowledge discovery from big sequence data.
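The clustering idea behind BLSOM, grouping sequences by oligonucleotide composition alone, can be sketched with a toy one-dimensional batch-learning SOM. The sequences, node count, and neighborhood width below are illustrative assumptions, not the published algorithm:

```python
import numpy as np

def dinucleotide_freqs(seq):
    """Frequency vector over the 16 dinucleotides of a DNA sequence."""
    pairs = [a + b for a in "ACGT" for b in "ACGT"]
    counts = {p: 0 for p in pairs}
    for i in range(len(seq) - 1):
        pair = seq[i:i + 2]
        if pair in counts:
            counts[pair] += 1
    total = max(sum(counts.values()), 1)
    return np.array([counts[p] / total for p in pairs])

def batch_som(data, n_nodes=2, epochs=10):
    """Minimal 1-D batch-learning SOM: each epoch assigns every vector to
    its best-matching node, then recomputes each node as a neighborhood-
    weighted average of all vectors (the batch update)."""
    w = data[:n_nodes].astype(float).copy()  # deterministic init: first rows
    for _ in range(epochs):
        bmu = np.argmin(((data[:, None] - w[None]) ** 2).sum(-1), axis=1)
        for j in range(n_nodes):
            h = np.exp(-4.0 * (bmu - j) ** 2)  # narrow neighborhood kernel
            w[j] = (h[:, None] * data).sum(0) / h.sum()
    return w

seqs = ["ATATATAT", "GCGCGCGC", "TATATATA", "CGCGCGCG"]
data = np.array([dinucleotide_freqs(s) for s in seqs])
w = batch_som(data)
bmu = np.argmin(((data[:, None] - w[None]) ** 2).sum(-1), axis=1)
# AT-rich and GC-rich sequences land on different nodes
```

SC-BLSOM's compression step corresponds to training such maps per data subclass first and then feeding only their weight vectors to a second-layer map.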

  15. Prediction During Natural Language Comprehension.

    PubMed

    Willems, Roel M; Frank, Stefan L; Nijhof, Annabel D; Hagoort, Peter; van den Bosch, Antal

    2016-06-01

    The notion of prediction is studied in cognitive neuroscience with increasing intensity. We investigated the neural basis of 2 distinct aspects of word prediction, derived from information theory, during story comprehension. We assessed the effect of entropy of next-word probability distributions as well as surprisal. A computational model determined entropy and surprisal for each word in 3 literary stories. Twenty-four healthy participants listened to the same 3 stories while their brain activation was measured using fMRI. Reversed speech fragments were presented as a control condition. Brain areas sensitive to entropy were left ventral premotor cortex, left middle frontal gyrus, right inferior frontal gyrus, left inferior parietal lobule, and left supplementary motor area. Areas sensitive to surprisal were left inferior temporal sulcus ("visual word form area"), bilateral superior temporal gyrus, right amygdala, bilateral anterior temporal poles, and right inferior frontal sulcus. We conclude that prediction during language comprehension can occur at several levels of processing, including at the level of word form. Our study exemplifies the power of combining computational linguistics with cognitive neuroscience, and additionally underlines the feasibility of studying continuous spoken language materials with fMRI. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
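The two information-theoretic quantities the study contrasts can be computed directly from a next-word probability distribution; the toy distribution below is an illustrative assumption, not data from the study:

```python
import math

def entropy(dist):
    """Shannon entropy (bits) of a next-word probability distribution:
    the uncertainty before the next word is heard."""
    return -sum(p * math.log2(p) for p in dist.values() if p > 0)

def surprisal(dist, word):
    """Surprisal (bits) of the word that actually occurred:
    how unexpected that word was under the distribution."""
    return -math.log2(dist[word])

# Toy distribution over possible continuations of "The dog chased the ..."
next_word = {"cat": 0.5, "ball": 0.25, "mailman": 0.25}

H = entropy(next_word)           # 1.5 bits of uncertainty
S = surprisal(next_word, "cat")  # 1.0 bit: the likeliest word occurred
```

Entropy is a property of the distribution (pre-word uncertainty), while surprisal depends on which word actually appears, which is why the two can dissociate in the brain.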

  16. Cortico-striatal language pathways dynamically adjust for syntactic complexity: A computational study.

    PubMed

    Szalisznyó, Krisztina; Silverstein, David; Teichmann, Marc; Duffau, Hugues; Smits, Anja

    2017-01-01

    A growing body of literature supports a key role of fronto-striatal circuits in language perception. It is now known that the striatum plays a role in engaging attentional resources and linguistic rule computation while also serving phonological short-term memory capabilities. The ventral semantic and the dorsal phonological stream dichotomy assumed for spoken language processing also seems to play a role in cortico-striatal perception. Based on recent studies that correlate deep Broca-striatal pathways with complex syntax performance, we used a previously developed computational model of frontal-striatal syntax circuits and hypothesized that different parallel language pathways may contribute separately to canonical and non-canonical sentence comprehension. We modified and further analyzed a thematic role assignment task and the corresponding reservoir computing model of language circuits previously developed by Dominey and coworkers. We examined the model's performance under various parameter regimes by varying how fast the presented language input decays and altering the temporal dynamics of activated word representations. This enabled us to quantify canonical and non-canonical sentence comprehension abilities. The modeling results suggest that separate cortico-cortical and cortico-striatal circuits may be recruited differently for processing syntactically more and less complex sentences. Alternatively, a single circuit would need to dynamically and adaptively adjust to syntactic complexity. Copyright © 2016. Published by Elsevier Inc.

  17. Research on a Frame-Based Model of Reading Comprehension. Final Report.

    ERIC Educational Resources Information Center

    Goldstein, Ira

    This report summarizes computational investigations of language comprehension based on Marvin Minsky's theory of frames, a recent advance in artificial intelligence theories about the representation of knowledge. The investigations discussed explored frame theory as a basis for text comprehension by implementing models of the theory and developing…

  18. Optimization Issues with Complex Rotorcraft Comprehensive Analysis

    NASA Technical Reports Server (NTRS)

    Walsh, Joanne L.; Young, Katherine C.; Tarzanin, Frank J.; Hirsh, Joel E.; Young, Darrell K.

    1998-01-01

    This paper investigates the use of the general purpose automatic differentiation (AD) tool called Automatic Differentiation of FORTRAN (ADIFOR) as a means of generating sensitivity derivatives for use in Boeing Helicopter's proprietary comprehensive rotor analysis code (VII). ADIFOR transforms an existing computer program into a new program that performs a sensitivity analysis in addition to the original analysis. In this study both the pros (exact derivatives, no step-size problems) and cons (more CPU, more memory) of ADIFOR are discussed. The size (based on the number of lines) of the VII code after ADIFOR processing increased by 70 percent and resulted in substantial computer memory requirements at execution. The ADIFOR derivatives took about 75 percent longer to compute than the finite-difference derivatives. However, the ADIFOR derivatives are exact and are not functions of step-size. The VII sensitivity derivatives generated by ADIFOR are compared with finite-difference derivatives. The ADIFOR and finite-difference derivatives are used in three optimization schemes to solve a low vibration rotor design problem.
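The exact-versus-finite-difference trade-off described above is easy to demonstrate. The function below is a stand-in for an analysis output, not the VII code; an AD tool such as ADIFOR would emit the exact-derivative code mechanically:

```python
def f(x):
    return x ** 3            # stand-in for an analysis response

def exact_derivative(x):
    return 3 * x ** 2        # what automatic differentiation produces

def forward_difference(func, x, h):
    """One-sided finite difference; accuracy depends on the step size h."""
    return (func(x + h) - func(x)) / h

x = 2.0
d_exact = exact_derivative(x)            # 12.0, no step size involved
d_good = forward_difference(f, x, 1e-6)  # close to 12.0
d_bad = forward_difference(f, x, 1e-13)  # degraded by floating-point round-off
```

A too-large h incurs truncation error and a too-small h incurs round-off error; the AD result sidesteps the choice entirely, at the cost of the extra memory and CPU noted in the abstract.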

  19. [Text Comprehensibility of Hospital Report Cards].

    PubMed

    Sander, U; Kolb, B; Christoph, C; Emmert, M

    2016-12-01

    Objectives: Recently, the number of hospital report cards that compare the quality of hospitals and present information from German quality reports has greatly increased. The objectives of this study were a) to identify suitable methods for measuring the readability and comprehensibility of hospital report cards, b) to obtain reliable information on the comprehensibility of the texts for laymen, c) to give recommendations for improvements, and d) to recommend public health actions. Methods: The readability and comprehensibility of the texts were tested with a) a computer-aided evaluation of formal text characteristics (the Flesch readability index in its German version and the first Wiener Sachtextformel), b) an expert-based heuristic analysis of the readability and comprehensibility of the texts (counting technical terms and analyzing text simplicity as well as brevity and conciseness using the Hamburg intelligibility model), and c) a survey of subjects about the comprehensibility of individual technical terms, their assessment of the comprehensibility of the presentations, and their decisions in favour of one of the 5 presented clinics on the basis of the quality data. In addition, the correlation between the results of the text analysis and the results from the survey of subjects was tested. Results: The computer-aided evaluations showed poor comprehensibility values, as did the assessment of text simplicity using the Hamburg intelligibility model (-0.3). On average, 6.8% of the words used were technical terms. A review of 10 technical terms revealed that, in all cases, only a minority of respondents (from 4.4% to 39.1%) knew exactly what was meant by each of them. Most subjects (62.4%) also believed that unclear terms worsened their understanding of the information offered. 
The correlation analysis showed that presentations with a lower frequency of technical terms and better values for the text simplicity were better understood. Conclusion: The determination of the frequency of technical terms and the assessment of text simplicity using the Hamburg intelligibility model were suitable methods to determine the readability and comprehensibility of presentations of quality indicators. The analysis showed predominantly poor comprehensibility values and indicated the need to improve the texts of report cards. © Georg Thieme Verlag KG Stuttgart · New York.
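The first of the computer-aided measures above can be sketched with Amstad's German adaptation of the Flesch Reading Ease formula; the word, sentence, and syllable counts below are illustrative assumptions, and real use would need a proper German syllable counter:

```python
def flesch_german(words, sentences, syllables):
    """Amstad's German adaptation of Flesch Reading Ease:
    180 - ASL - 58.5 * ASW, where ASL is the average sentence length in
    words and ASW is the average syllables per word.
    Higher scores (roughly 0-100) mean easier text."""
    asl = words / sentences    # average sentence length
    asw = syllables / words    # average syllables per word
    return 180 - asl - 58.5 * asw

# Illustrative counts for a short, simple text
score = flesch_german(words=10, sentences=2, syllables=14)
# roughly 93: short sentences and short words score as very easy
```

Long sentences packed with polysyllabic technical terms, the pattern the study found in report cards, drive both ASL and ASW up and the score down.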

  20. Evaluation of Electronic and Paper Textual Glosses on Second Language Vocabulary Learning and Reading Comprehension

    ERIC Educational Resources Information Center

    Lee, Ho; Lee, Hansol; Lee, Jang Ho

    2016-01-01

    Second language studies have supported the use of glossing for enhancing vocabulary learning. Along with technological developments, an increasing number of studies have examined electronic textual glossing in computer-assisted environments, yet only a small number of studies have compared electronic glossing to its traditional paper counterpart.…

  1. Comprehension and Recall of Television's Computerized Image: An Exploratory Study.

    ERIC Educational Resources Information Center

    Metallinos, Nikos; Chartrand, Sylvie

    This exploratory study of the effects of the new visual communications media imagery (e.g., video games, digital television, and computer graphics) on the visual perception process is designed to provide a theoretical framework for research, introduce appropriate research instruments for such study, and experiment with the application of biometric…

  2. A Comprehensive Review of Learner-Control: The Role of Learner Characteristics.

    ERIC Educational Resources Information Center

    Williams, Michael D.

    This paper reviews findings from over 70 published studies investigating various facets of learner-control in computer-based instruction (CBI). General conclusions about the relative effectiveness of learner-control versus program-control are equivocal. Across these studies, however, are strong suggestions that individual learner differences can…

  3. Computational studies on the excited state properties of citrinin and application in fluorescence analysis

    USDA-ARS?s Scientific Manuscript database

    Citrinin is a mycotoxin of increasing concern that is produced by fungi associated with maize, red yeast rice, and other agricultural commodities. A comprehensive time-dependent density functional study on the excited state properties of citrinin was conducted to identify parameters for reliable det...

  4. Study rates U.S. hospitals vs. other nations, industries.

    PubMed

    Burda, D

    1991-10-07

    American hospitals generally are further along with their total quality management programs than their Canadian counterparts but lag behind companies in other U.S. industries, according to a comprehensive international study that examined four industries--healthcare, automotive, banking and computer--in four countries--the United States, Canada, Germany and Japan.

  5. DOE Office of Scientific and Technical Information (OSTI.GOV)

    John Wooley; Herbert S. Lin

    This study is the first comprehensive NRC study that suggests a high-level intellectual structure for Federal agencies for supporting work at the biology/computing interface. The report seeks to establish the intellectual legitimacy of a fundamentally cross-disciplinary collaboration between biologists and computer scientists. That is, while some universities are increasingly favorable to research at the intersection, life science researchers at other universities are strongly impeded in their efforts to collaborate. This report addresses these impediments and describes proven strategies for overcoming them. An important feature of the report is the use of well-documented examples that describe clearly, to individuals not trained in computer science, the value and usage of computing across the biological sciences, from genes and proteins to networks and pathways, from organelles to cells, and from individual organisms to populations and ecosystems. It is hoped that these examples will be useful to students in the life sciences to motivate (continued) study in computer science that will enable them to be more facile users of computing in their future biological studies.

  6. Study of the TRAC Airfoil Table Computational System

    NASA Technical Reports Server (NTRS)

    Hu, Hong

    1999-01-01

    The report documents the study of the application of the TRAC airfoil table computational package (TRACFOIL) to the prediction of 2D airfoil force and moment data over a wide range of angles of attack and Mach numbers. TRACFOIL generates the standard C-81 airfoil table for input into rotorcraft comprehensive codes such as CAMRAD. The existing TRACFOIL computer package was successfully modified to run on Digital Alpha workstations and on Cray C90 supercomputers. Step-by-step instructions for using the package on both computer platforms are provided. The newer version of TRACFOIL is applied to two airfoil sections. The C-81 data obtained using the TRACFOIL method are compared with wind-tunnel data, and results are presented.

  7. Event-based Plausibility Immediately Influences On-line Language Comprehension

    PubMed Central

    Matsuki, Kazunaga; Chow, Tracy; Hare, Mary; Elman, Jeffrey L.; Scheepers, Christoph; McRae, Ken

    2011-01-01

    In some theories of sentence comprehension, linguistically-relevant lexical knowledge such as selectional restrictions is privileged in terms of the time-course of its access and influence. We examined whether event knowledge computed by combining multiple concepts can rapidly influence language understanding even in the absence of selectional restriction violations. Specifically, we investigated whether instruments can combine with actions to influence comprehension of ensuing patients. Instrument-verb-patient triplets were created in a norming study designed to tap directly into event knowledge. In self-paced reading (Experiment 1), participants were faster to read patient nouns such as hair when they were typical of the instrument-action pair (Donna used the shampoo to wash vs. the hose to wash). Experiment 2 showed that these results were not due to direct instrument-patient relations. Experiment 3 replicated Experiment 1 using eyetracking, with effects of event typicality observed in first fixation and gaze durations on the patient noun. This research demonstrates that conceptual event-based expectations are computed and used rapidly and dynamically during on-line language comprehension. We discuss relationships among plausibility and predictability, as well as their implications. We conclude that selectional restrictions may be best considered as event-based conceptual knowledge, rather than lexical-grammatical knowledge. PMID:21517222

  8. A computational model for simulating text comprehension.

    PubMed

    Lemaire, Benoît; Denhière, Guy; Bellissens, Cédrick; Jhean-Larose, Sandra

    2006-11-01

    In the present article, we outline the architecture of a computer program for simulating the process by which humans comprehend texts. The program is based on psycholinguistic theories about human memory and text comprehension processes, such as the construction-integration model (Kintsch, 1998), the latent semantic analysis theory of knowledge representation (Landauer & Dumais, 1997), and the predication algorithms (Kintsch, 2001; Lemaire & Bianco, 2003), and it is intended to help psycholinguists investigate the way humans comprehend texts.

  9. Template construction grammar: from visual scene description to language comprehension and agrammatism.

    PubMed

    Barrès, Victor; Lee, Jinyong

    2014-01-01

    How does the language system coordinate with our visual system to yield flexible integration of linguistic, perceptual, and world-knowledge information when we communicate about the world we perceive? Schema theory is a computational framework that allows the simulation of perceptuo-motor coordination programs on the basis of known brain operating principles such as cooperative computation and distributed processing. We present first its application to a model of language production, SemRep/TCG, which combines a semantic representation of visual scenes (SemRep) with Template Construction Grammar (TCG) as a means to generate verbal descriptions of a scene from its associated SemRep graph. SemRep/TCG combines the neurocomputational framework of schema theory with the representational format of construction grammar in a model linking eye-tracking data to visual scene descriptions. We then offer a conceptual extension of TCG to include language comprehension and address data on the role of both world knowledge and grammatical semantics in the comprehension performances of agrammatic aphasic patients. This extension introduces a distinction between heavy and light semantics. The TCG model of language comprehension offers a computational framework to quantitatively analyze the distributed dynamics of language processes, focusing on the interactions between grammatical, world knowledge, and visual information. In particular, it reveals interesting implications for the understanding of the various patterns of comprehension performances of agrammatic aphasics measured using sentence-picture matching tasks. This new step in the life cycle of the model serves as a basis for exploring the specific challenges that neurolinguistic computational modeling poses to the neuroinformatics community.

  10. Improving DHH students' grammar through an individualized software program.

    PubMed

    Cannon, Joanna E; Easterbrooks, Susan R; Gagné, Phill; Beal-Alvarez, Jennifer

    2011-01-01

    The purpose of this study was to determine if the frequent use of a targeted, computer software grammar instruction program, used as an individualized classroom activity, would influence the comprehension of morphosyntax structures (determiners, tense, and complementizers) in deaf/hard-of-hearing (DHH) participants who use American Sign Language (ASL). Twenty-six students from an urban day school for the deaf participated in this study. Two hierarchical linear modeling growth curve analyses showed that the influence of LanguageLinks: Syntax Assessment and Intervention (LL) resulted in statistically significant gains in participants' comprehension of morphosyntax structures. Two dependent t tests revealed statistically significant results between the pre- and postintervention assessments on the Diagnostic Evaluation of Language Variation-Norm Referenced. The daily use of LL increased the morphosyntax comprehension of the participants in this study and may be a promising practice for DHH students who use ASL.

  11. A new framework for comprehensive, robust, and efficient global sensitivity analysis: 1. Theory

    NASA Astrophysics Data System (ADS)

    Razavi, Saman; Gupta, Hoshin V.

    2016-01-01

    Computer simulation models are continually growing in complexity with increasingly more factors to be identified. Sensitivity Analysis (SA) provides an essential means for understanding the role and importance of these factors in producing model responses. However, conventional approaches to SA suffer from (1) an ambiguous characterization of sensitivity, and (2) poor computational efficiency, particularly as the problem dimension grows. Here, we present a new and general sensitivity analysis framework (called VARS), based on an analogy to "variogram analysis," that provides an intuitive and comprehensive characterization of sensitivity across the full spectrum of scales in the factor space. We prove, theoretically, that Morris (derivative-based) and Sobol (variance-based) methods and their extensions are special cases of VARS, and that their SA indices can be computed as by-products of the VARS framework. Synthetic functions that resemble actual model response surfaces are used to illustrate the concepts, and show VARS to be as much as two orders of magnitude more computationally efficient than the state-of-the-art Sobol approach. In a companion paper, we propose a practical implementation strategy, and demonstrate the effectiveness, efficiency, and reliability (robustness) of the VARS framework on real-data case studies.
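The variogram analogy at the heart of VARS can be illustrated on a toy model: the directional variogram γ(h) below measures how much the response changes when one factor is perturbed by h, and steeper growth of γ with h signals higher sensitivity to that factor. This is an illustrative sketch of the variogram idea under assumed names and a toy model, not the published VARS algorithm:

```python
import numpy as np

def directional_variogram(model, n_factors, factor, hs, n_base=200, seed=0):
    """Empirical gamma(h) = 0.5 * mean[(f(x + h*e_i) - f(x))^2] over random
    base points x in the unit hypercube, perturbing only one factor i."""
    rng = np.random.default_rng(seed)
    X = rng.uniform(0.0, 1.0, size=(n_base, n_factors))
    gammas = []
    for h in hs:
        Xh = X.copy()
        Xh[:, factor] = np.clip(X[:, factor] + h, 0.0, 1.0)
        diffs = np.array([model(b) - model(a) for a, b in zip(X, Xh)])
        gammas.append(0.5 * np.mean(diffs ** 2))
    return np.array(gammas)

# Toy model: the response is 10x more sensitive to factor 0 than factor 1
model = lambda x: 10.0 * x[0] + x[1]
g0 = directional_variogram(model, 2, 0, hs=[0.1, 0.2])
g1 = directional_variogram(model, 2, 1, hs=[0.1, 0.2])
# g0 grows far faster than g1, flagging factor 0 as the dominant factor
```

Examining γ across a range of h is what gives the framework its multi-scale character: derivative-based (small-h) and variance-based (large-h) views of sensitivity fall out of the same curve.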

  12. IMPROVING EMISSIONS ESTIMATES WITH COMPUTATIONAL INTELLIGENCE, DATABASE EXPANSION, AND COMPREHENSIVE VALIDATION

    EPA Science Inventory

    The report discusses an EPA investigation of techniques to improve methods for estimating volatile organic compound (VOC) emissions from area sources. Using the automobile refinishing industry for a detailed area source case study, an emission estimation method is being developed...

  13. Toward a unified account of comprehension and production in language development.

    PubMed

    McCauley, Stewart M; Christiansen, Morten H

    2013-08-01

    Although Pickering & Garrod (P&G) argue convincingly for a unified system for language comprehension and production, they fail to explain how such a system might develop. Using a recent computational model of language acquisition as an example, we sketch a developmental perspective on the integration of comprehension and production. We conclude that only through development can we fully understand the intertwined nature of comprehension and production in adult processing.

  14. The Nature of Computer Assisted Learning.

    ERIC Educational Resources Information Center

    Whiting, John

    Computer assisted learning (CAL) is an old technology which has generated much new interest. Computers can: reduce data to a directly comprehensible form; reduce administration; communicate worldwide and exchange, store, and retrieve data; and teach. The computer's limitation is in its dependence on the user's ability and perceptive nature.…

  15. Reading Authentic EFL Text Using Visualization and Advance Organizers in a Multimedia Learning Environment

    ERIC Educational Resources Information Center

    Lin, Huifen; Chen, Tsuiping

    2007-01-01

    The purpose of this experimental study was to compare the effects of different types of computer-generated visuals (static versus animated) and advance organizers (descriptive versus question) in enhancing comprehension and retention of a content-based lesson for learning English as a Foreign Language (EFL). Additionally, the study investigated…

  16. CHAT: development and validation of a computer-delivered, self-report, substance use assessment for adolescents.

    PubMed

    Lord, Sarah E; Trudeau, Kimberlee J; Black, Ryan A; Lorin, Lucy; Cooney, Elizabeth; Villapiano, Albert; Butler, Stephen F

    2011-01-01

    The current study was conducted to construct and validate a computer-delivered, multimedia, substance use self-assessment for adolescents. Reliability and validity of six problem dimensions were evaluated in two studies, conducted from 2003 to 2008. Study 1 included 192 adolescents from five treatment settings throughout the United States (N = 142) and two high schools from Greater Boston, Massachusetts (N = 50). Study 2 included 356 adolescents (treatment: N = 260; school: N = 94). The final version of Comprehensive Health Assessment for Teens (CHAT) demonstrated relatively strong psychometric properties. The limitations and implications of this study are noted. This study was supported by an SBIR grant.

  17. SuperPILOT: A Comprehensive Computer-Assisted Instruction Programming Language for the Apple II Computer.

    ERIC Educational Resources Information Center

    Falleur, David M.

    This presentation describes SuperPILOT, an extended version of Apple PILOT, a programming language for developing computer-assisted instruction (CAI) with the Apple II computer that includes the features of its early PILOT (Programmed Inquiry, Learning or Teaching) ancestors together with new features that make use of the Apple computer's advanced…

  18. Student Use of Physics to Make Sense of Incomplete but Functional VPython Programs in a Lab Setting

    NASA Astrophysics Data System (ADS)

    Weatherford, Shawn A.

    2011-12-01

    Computational activities in Matter & Interactions, an introductory calculus-based physics course, have the instructional goal of providing students with the experience of applying a small set of fundamental principles to model a wide range of physical systems. However, there are significant instructional challenges for students to build computer programs under limited time constraints, especially for students who are unfamiliar with programming languages and concepts. Prior attempts at designing effective computational activities were successful at having students ultimately build working VPython programs under the tutelage of experienced teaching assistants in a studio lab setting. A pilot study revealed that students who completed these computational activities had significant difficulty repeating the exact same tasks and, further, had difficulty predicting the animation that would be produced by the example program after interpreting the program code. This study explores the interpretation and prediction tasks as part of an instructional sequence where students are asked to read and comprehend a functional but incomplete program. Rather than asking students to begin their computational tasks by modifying program code, we explicitly ask students to interpret an existing program that is missing key lines of code. The missing lines of code correspond to the algebraic form of fundamental physics principles or the calculation of forces which would exist between analogous physical objects in the natural world. Students are then asked to draw a prediction of what they would see in the simulation produced by the VPython program and ultimately run the program to evaluate their prediction. This study specifically looks at how the participants use physics while interpreting the program code and creating a whiteboard prediction. 
This study also examines how students evaluate their understanding of the program and of the modification goals at the beginning of the modification task. While working in groups over the course of a semester, study participants were recorded as they completed three activities using these incomplete programs. Analysis of the video data showed that participants had little difficulty interpreting physics quantities, generating a prediction, or determining how to modify the incomplete program. Participants did not base their predictions solely on the information in the incomplete program. When participants tried to predict the motion of the objects in the simulation, many turned to their knowledge of how the system would evolve if it represented an analogous real-world physical system. For example, participants attributed the real-world behavior of springs to helix objects even though the program did not include calculations for the spring to exert a force when stretched. Participants rarely interpreted lines of code in the computational loop during the first computational activity, but this changed during later computational activities, with most participants using their physics knowledge to interpret the computational loop. Computational activities in the Matter & Interactions curriculum were revised in light of these findings to include an instructional sequence of tasks for building comprehension of the example program. The modified activities also ask students to create an additional whiteboard prediction for the time evolution of the real-world phenomenon that the example program will eventually model.
This thesis shows how comprehension tasks identified by Palincsar and Brown (1984) as effective in improving reading comprehension are also effective in helping students apply their physics knowledge to interpret a computer program that attempts to model a real-world phenomenon, and to identify errors in their understanding of the use, or omission, of fundamental physics principles in a computational model.
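For readers unfamiliar with the style of program the study describes, a minimal plain-Python sketch (hypothetical, not taken from the study's actual materials, and omitting VPython's 3-D objects) illustrates the kind of iterative loop students interpret. The force-law line is the kind typically left out of the "incomplete" programs; without it the helix exerts no force and the ball simply drifts.

```python
# Hypothetical 1-D spring-mass loop in the Matter & Interactions style.
# All parameter values are invented for illustration.

k = 10.0    # spring stiffness (N/m), assumed value
L0 = 0.5    # relaxed length (m)
m = 0.2     # mass (kg)
pos = 0.8   # position of ball relative to the fixed end (m)
p = 0.0     # momentum (kg m/s)
dt = 0.001  # time step (s)

for _ in range(5000):
    s = pos - L0                # stretch of the spring
    F = -k * s                  # <-- the kind of "missing" principle line
    p = p + F * dt              # momentum principle: dp = F dt
    pos = pos + (p / m) * dt    # position update from the new momentum
```

Because the momentum update uses the current force and the position update uses the new momentum (the Euler-Cromer scheme), the ball oscillates stably about the relaxed length rather than gaining energy.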

  19. The neuroscience of vision-based grasping: a functional review for computational modeling and bio-inspired robotics.

    PubMed

    Chinellato, Eris; Del Pobil, Angel P

    2009-06-01

    The topic of vision-based grasping is being widely studied in humans and in other primates using various techniques and with different goals. The fundamental related findings are reviewed in this paper, with the aim of providing researchers from different fields, including intelligent robotics and neural computation, a comprehensive but accessible view on the subject. A detailed description of the principal sensorimotor processes and the brain areas involved is provided following a functional perspective, in order to make this survey especially useful for computational modeling and bio-inspired robotic applications.

  20. InMAP: A model for air pollution interventions

    DOE PAGES

    Tessum, Christopher W.; Hill, Jason D.; Marshall, Julian D.; ...

    2017-04-19

    Mechanistic air pollution modeling is essential in air quality management, yet the extensive expertise and computational resources required to run most models prevent their use in many situations where their results would be useful. We present InMAP (Intervention Model for Air Pollution), which offers an alternative to comprehensive air quality models for estimating the air pollution health impacts of emission reductions and other potential interventions. InMAP estimates annual-average changes in primary and secondary fine particle (PM2.5) concentrations (the air pollution outcome generally causing the largest monetized health damages) attributable to annual changes in precursor emissions. InMAP leverages pre-processed physical and chemical information from the output of a state-of-the-science chemical transport model and a variable spatial resolution computational grid to perform simulations that are several orders of magnitude less computationally intensive than comprehensive model simulations. In comparisons we run, InMAP recreates comprehensive model predictions of changes in total PM2.5 concentrations with a population-weighted mean fractional bias (MFB) of -17% and a population-weighted R2 of 0.90. Although InMAP is not specifically designed to reproduce total observed concentrations, it is able to do so within published air quality model performance criteria for total PM2.5. Potential uses of InMAP include studying exposure, health, and environmental justice impacts of potential shifts in emissions for annual-average PM2.5. InMAP can be trained to run for any spatial and temporal domain given the availability of appropriate simulation output from a comprehensive model. The InMAP model source code and input data are freely available online under an open-source license.
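The population-weighted mean fractional bias the authors report can be sketched as follows. The function name and toy data are hypothetical; the actual InMAP evaluation code differs, but the statistic itself (per-cell fractional bias, averaged with population weights) is standard.

```python
# Sketch of population-weighted mean fractional bias (MFB).
# FB_i = 2*(M_i - R_i)/(M_i + R_i); weights are population shares.

def pop_weighted_mfb(model, reference, population):
    """MFB of model vs. reference concentrations, population-weighted."""
    total_pop = sum(population)
    return sum(
        (pop / total_pop) * 2.0 * (m - r) / (m + r)
        for m, r, pop in zip(model, reference, population)
    )

# Toy grid cells: InMAP-like estimates vs. comprehensive-model PM2.5 (ug/m3)
model = [8.0, 12.0, 5.0]
reference = [10.0, 12.0, 6.0]
population = [1000, 4000, 500]

print(round(pop_weighted_mfb(model, reference, population), 3))  # -0.057
```

A negative value, as in the paper's -17%, indicates the model underpredicts the reference on a population-weighted basis.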

  1. InMAP: A model for air pollution interventions

    PubMed Central

    Hill, Jason D.; Marshall, Julian D.

    2017-01-01

    Mechanistic air pollution modeling is essential in air quality management, yet the extensive expertise and computational resources required to run most models prevent their use in many situations where their results would be useful. Here, we present InMAP (Intervention Model for Air Pollution), which offers an alternative to comprehensive air quality models for estimating the air pollution health impacts of emission reductions and other potential interventions. InMAP estimates annual-average changes in primary and secondary fine particle (PM2.5) concentrations (the air pollution outcome generally causing the largest monetized health damages) attributable to annual changes in precursor emissions. InMAP leverages pre-processed physical and chemical information from the output of a state-of-the-science chemical transport model and a variable spatial resolution computational grid to perform simulations that are several orders of magnitude less computationally intensive than comprehensive model simulations. In comparisons run here, InMAP recreates comprehensive model predictions of changes in total PM2.5 concentrations with a population-weighted mean fractional bias (MFB) of -17% and a population-weighted R2 of 0.90. Although InMAP is not specifically designed to reproduce total observed concentrations, it is able to do so within published air quality model performance criteria for total PM2.5. Potential uses of InMAP include studying exposure, health, and environmental justice impacts of potential shifts in emissions for annual-average PM2.5. InMAP can be trained to run for any spatial and temporal domain given the availability of appropriate simulation output from a comprehensive model. The InMAP model source code and input data are freely available online under an open-source license. PMID:28423049

  2. InMAP: A model for air pollution interventions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tessum, Christopher W.; Hill, Jason D.; Marshall, Julian D.

    Mechanistic air pollution modeling is essential in air quality management, yet the extensive expertise and computational resources required to run most models prevent their use in many situations where their results would be useful. We present InMAP (Intervention Model for Air Pollution), which offers an alternative to comprehensive air quality models for estimating the air pollution health impacts of emission reductions and other potential interventions. InMAP estimates annual-average changes in primary and secondary fine particle (PM2.5) concentrations (the air pollution outcome generally causing the largest monetized health damages) attributable to annual changes in precursor emissions. InMAP leverages pre-processed physical and chemical information from the output of a state-of-the-science chemical transport model and a variable spatial resolution computational grid to perform simulations that are several orders of magnitude less computationally intensive than comprehensive model simulations. In comparisons we run, InMAP recreates comprehensive model predictions of changes in total PM2.5 concentrations with a population-weighted mean fractional bias (MFB) of -17% and a population-weighted R2 of 0.90. Although InMAP is not specifically designed to reproduce total observed concentrations, it is able to do so within published air quality model performance criteria for total PM2.5. Potential uses of InMAP include studying exposure, health, and environmental justice impacts of potential shifts in emissions for annual-average PM2.5. InMAP can be trained to run for any spatial and temporal domain given the availability of appropriate simulation output from a comprehensive model. The InMAP model source code and input data are freely available online under an open-source license.

  3. A Comprehensive Toolset for General-Purpose Private Computing and Outsourcing

    DTIC Science & Technology

    2016-12-08

    project and scientific advances made towards each of the research thrusts throughout the project duration. 1 Project Objectives Cloud computing enables...possibilities that the cloud enables is computation outsourcing, when the client can utilize any necessary computing resources for its computational task...Security considerations, however, stand in the way of harnessing the full benefits of cloud computing and prevent clients from

  4. Effect of attention therapy on reading comprehension.

    PubMed

    Solan, Harold A; Shelley-Tremblay, John; Ficarra, Anthony; Silverman, Michael; Larson, Steven

    2003-01-01

    This study quantified the influence of visual attention therapy on the reading comprehension of Grade 6 children with moderate reading disabilities (RD) in the absence of specific reading remediation. Thirty students with below-average reading scores were identified using standardized reading comprehension tests. Fifteen children were placed randomly in the experimental group and 15 in the control group. The Attention Battery of the Cognitive Assessment System was administered to all participants. The experimental group received 12 one-hour sessions of individually monitored, computer-based attention therapy programs; the control group received no therapy during their 12-week period. Each group was retested on attention and reading comprehension measures. In order to stimulate selective and sustained visual attention, the vision therapy stressed various aspects of arousal, activation, and vigilance. At the completion of attention therapy, the mean standard attention and reading comprehension scores of the experimental group had improved significantly. The control group, however, showed no significant improvement in reading comprehension scores after 12 weeks. Although uncertainties still exist, this investigation supports the notion that visual attention is malleable and that attention therapy has a significant effect on reading comprehension in this often neglected population.

  5. Automatic design of optical systems by digital computer

    NASA Technical Reports Server (NTRS)

    Casad, T. A.; Schmidt, L. F.

    1967-01-01

    Computer program uses geometrical optical techniques and a least squares optimization method employing computing equipment for the automatic design of optical systems. It evaluates changes in various optical parameters, provides comprehensive ray-tracing, and generally determines the acceptability of the optical system characteristics.

  6. Distractions, distractions: does instant messaging affect college students' performance on a concurrent reading comprehension task?

    PubMed

    Fox, Annie Beth; Rosen, Jonathan; Crawford, Mary

    2009-02-01

    Instant messaging (IM) has become one of the most popular forms of computer-mediated communication (CMC) and is especially prevalent on college campuses. Previous research suggests that IM users often multitask while conversing online. To date, no one has yet examined the cognitive effect of concurrent IM use. Participants in the present study (N = 69) completed a reading comprehension task uninterrupted or while concurrently holding an IM conversation. Participants who IMed while performing the reading task took significantly longer to complete the task, indicating that concurrent IM use negatively affects efficiency. Concurrent IM use did not affect reading comprehension scores. Additional analyses revealed that the more time participants reported spending on IM, the lower their reading comprehension scores. Finally, we found that the more time participants reported spending on IM, the lower their self-reported GPA. Implications and future directions are discussed.

  7. Computers and the Primary Curriculum 3-13.

    ERIC Educational Resources Information Center

    Crompton, Rob, Ed.

    This book is a comprehensive and practical guide to the use of computers across a wide age range. Extensive use is made of photographs, illustrations, cartoons, and samples of children's work to demonstrate the versatility of computer use in schools. An introduction by Rob Crompton placing computer use within the educational context of the United…

  8. Computer Literacy.

    ERIC Educational Resources Information Center

    San Marcos Unified School District, CA.

    THE FOLLOWING IS THE FULL TEXT OF THIS DOCUMENT: After viewing many computer-literacy programs, we believe San Marcos Junior High School has developed a unique program which will truly develop computer literacy. Our hope is to give all students a comprehensive look at computers as they go through their two years here. They will not only learn the…

  9. The possibility of coexistence and co-development in language competition: ecology-society computational model and simulation.

    PubMed

    Yun, Jian; Shang, Song-Chao; Wei, Xiao-Dan; Liu, Shuang; Li, Zhi-Jie

    2016-01-01

    Language is characterized by both ecological and social properties, and competition is the basic form of language evolution. The rise and decline of a language is a result of competition between languages, and this rise and decline directly influences the diversity of human culture. Mathematical and computational modeling of language competition has been a popular topic in linguistics, mathematics, computer science, ecology, and other disciplines. Currently, there are several problems in research on language competition modeling. First, comprehensive mathematical analysis is absent from most studies of language competition models. Second, most language competition models assume that one language in the model is stronger than the other; these studies tend to ignore cases where the competition is balanced. The competition between two well-matched languages is more relevant in practice, because it can facilitate the co-development of the two languages. Third, many studies produce an evolution result in which the weaker language inevitably goes extinct. From the integrated point of view of ecology and sociology, this paper improves the Lotka-Volterra model and the basic reaction-diffusion model to propose an "ecology-society" computational model for describing language competition. Furthermore, a strict and comprehensive mathematical analysis was made of the stability of the equilibria. Two languages in competition may be either well-matched or greatly different in strength, which was reflected in the experimental design. The results revealed that language coexistence, and even co-development, are likely to occur during language competition.
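The Lotka-Volterra competition dynamics underlying such models can be sketched numerically. The parameter values below are illustrative assumptions, not the paper's actual "ecology-society" model; they are chosen so the two languages are well-matched and interspecific competition is weaker than intraspecific, the regime where coexistence is possible.

```python
# Illustrative Lotka-Volterra competition between two languages,
# integrated with forward Euler. Parameters are invented; coexistence
# requires a12 * a21 < 1 (competition weaker between than within groups).

def simulate(x, y, r1=0.5, r2=0.5, K1=1.0, K2=1.0, a12=0.4, a21=0.4,
             dt=0.01, steps=20000):
    for _ in range(steps):
        dx = r1 * x * (1 - (x + a12 * y) / K1)  # speakers of language 1
        dy = r2 * y * (1 - (y + a21 * x) / K2)  # speakers of language 2
        x += dx * dt
        y += dy * dt
    return x, y

# Start far apart: language 1 initially weak, language 2 initially strong
x, y = simulate(0.1, 0.9)
# For these symmetric parameters both fractions approach the coexistence
# equilibrium x* = y* = (1 - a12) / (1 - a12 * a21)
```

With asymmetric parameters (e.g., a21 > 1) the same code instead shows the familiar extinction outcome, which is the behavior the paper's improved model is designed to move beyond.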

  10. A computer-aided approach to nonlinear control synthesis

    NASA Technical Reports Server (NTRS)

    Wie, Bong; Anthony, Tobin

    1988-01-01

    The major objective of this project is to develop a computer-aided approach to nonlinear stability analysis and nonlinear control system design. This goal is to be achieved by refining the describing function method as a synthesis tool for nonlinear control design. The interim report outlines the approach taken in this study to meet these goals, including an introduction to the INteractive Controls Analysis (INCA) program, which was instrumental in meeting the study objectives. A single-input describing function (SIDF) design methodology was developed in this study; coupled with the software constructed in this study, the results of this project provide a comprehensive tool for design and integration of nonlinear control systems.

  11. The development of a multimedia online language assessment tool for young children with autism.

    PubMed

    Lin, Chu-Sui; Chang, Shu-Hui; Liou, Wen-Ying; Tsai, Yu-Show

    2013-10-01

    This study aimed to provide early childhood special education professionals with a standardized and comprehensive language assessment tool for the early identification of language learning characteristics (e.g., hyperlexia) of young children with autism. In this study, we used computer technology to develop a multimedia online language assessment tool that presents auditory or visual stimuli. This online comprehensive language assessment consists of six subtests: decoding, homographs, auditory vocabulary comprehension, visual vocabulary comprehension, auditory sentence comprehension, and visual sentence comprehension. Three hundred typically developing children and 35 children with autism from Tao-Yuan County in Taiwan, aged 4-6, participated in this study. The Cronbach α values of the six subtests ranged from .64 to .97. The variance explained by the six subtests ranged from 14% to 56%, the concurrent validity of each subtest with the Peabody Picture Vocabulary Test-Revised ranged from .21 to .45, and the predictive validity of each subtest with the WISC-III ranged from .47 to .75. The assessment tool was also able to differentiate children with autism with up to 92% accuracy. These results indicate that the tool has both adequate reliability and validity. Additionally, the 35 children with autism completed the entire assessment without exhibiting any extremely troubling behaviors. However, future research is needed to increase the sample size of both typically developing children and young children with autism and to overcome the technical challenges associated with internet issues. Copyright © 2013 Elsevier Ltd. All rights reserved.
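The Cronbach α reliability statistic reported for the six subtests can be computed from per-item score matrices. The sketch below uses invented toy data, not the study's scores, and only the standard textbook formula α = k/(k-1) · (1 - Σ var(item) / var(total)).

```python
# Hedged sketch of Cronbach's alpha (internal consistency).
# Data are invented; the study's actual subtest scores are not available.

def cronbach_alpha(items):
    """items: list of per-item score lists, one inner list per item,
    all of equal length (one score per examinee)."""
    k = len(items)
    n = len(items[0])

    def var(xs):  # population variance
        mu = sum(xs) / len(xs)
        return sum((x - mu) ** 2 for x in xs) / len(xs)

    # total score per examinee across all items
    totals = [sum(item[i] for item in items) for i in range(n)]
    return (k / (k - 1)) * (1 - sum(var(it) for it in items) / var(totals))

# Two toy "subtests", three examinees, perfectly consistent scores
items = [[1, 2, 3], [2, 4, 6]]
print(round(cronbach_alpha(items), 3))  # 0.889
```

Values near 1 indicate the items rank examinees consistently; the study's range of .64 to .97 spans marginal to excellent internal consistency.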

  12. Process and representation in graphical displays

    NASA Technical Reports Server (NTRS)

    Gillan, Douglas J.; Lewis, Robert; Rudisill, Marianne

    1993-01-01

    Our initial model of graphic comprehension has focused on statistical graphs. Like other models of human-computer interaction, models of graphical comprehension can be used by human-computer interface designers and developers to create interfaces that present information in an efficient and usable manner. Our investigation of graph comprehension addresses two primary questions: how do people represent the information contained in a data graph?; and how do they process information from the graph? The topics of focus for graphic representation concern the features into which people decompose a graph and the representations of the graph in memory. The issue of processing can be further analyzed as two questions: what overall processing strategies do people use?; and what are the specific processing skills required?

  13. A Plan for Community College Instructional Computing.

    ERIC Educational Resources Information Center

    Howard, Alan; And Others

    This document presents a comprehensive plan for future growth in instructional computing in the Washington community colleges. Two chapters define the curriculum objectives and content recommended for instructional courses in the community colleges which require access to computing facilities. The courses described include data processing…

  14. Human Expertise Helps Computer Classify Images

    NASA Technical Reports Server (NTRS)

    Rorvig, Mark E.

    1991-01-01

    Two-domain method of computational classification of images requires less computation than other methods for computational recognition, matching, or classification of images or patterns. Does not require explicit computational matching of features, and incorporates human expertise without requiring translation of mental processes of classification into language comprehensible to a computer. Conceived to "train" a computer to analyze photomicrographs of microscope-slide specimens of leucocytes from human peripheral blood to distinguish between specimens from healthy and specimens from traumatized patients.

  15. Description of CASCOMP Comprehensive Airship Sizing and Performance Computer Program, Volume 2

    NASA Technical Reports Server (NTRS)

    Davis, J.

    1975-01-01

    The computer program CASCOMP, which may be used in comparative design studies of lighter than air vehicles by rapidly providing airship size and mission performance data, was prepared and documented. The program can be used to define design requirements such as weight breakdown, required propulsive power, and physical dimensions of airships which are designed to meet specified mission requirements. The program is also useful in sensitivity studies involving both design trade-offs and performance trade-offs. The input to the program primarily consists of a series of single point values such as hull overall fineness ratio, number of engines, airship hull and empennage drag coefficients, description of the mission profile, and weights of fixed equipment, fixed useful load and payload. In order to minimize computation time, the program makes ample use of optional computation paths.

  16. Administrative Uses of Computers in the Schools.

    ERIC Educational Resources Information Center

    Bluhm, Harry P.

    This book, intended for school administrators, provides a comprehensive account of how computer information systems can enable administrators at both middle and top management levels to manage the educational enterprise. It can be used as a textbook in an educational administration course emphasizing computer technology in education, an…

  17. Demographic Computer Library.

    ERIC Educational Resources Information Center

    Shaw, David C.; Johnson, Dorothy M.

    The complete comprehension of this paper requires a firm grasp of both mathematical demography and FORTRAN programming. The paper aims at the establishment of a language with which complex demographic manipulations can be briefly expressed in a form intelligible both to demographic analysts and to computers. The Demographic Computer Library (DCL)…

  18. Care and Handling of Computer Magnetic Storage Media.

    ERIC Educational Resources Information Center

    Geller, Sidney B.

    Intended for use by data processing installation managers, operating personnel, and technical staff, this publication provides a comprehensive set of care and handling guidelines for the physical/chemical preservation of computer magnetic storage media--principally computer magnetic tapes--and their stored data. Emphasis is placed on media…

  19. The Impact of Hypermedia Instructional Materials on Study Self-Regulation in College Students.

    ERIC Educational Resources Information Center

    Nelms, Keith R.

    The metacognition "calibration of comprehension" research paradigm is used to investigate the question of whether the introduction of hypertext and hypermedia into college instruction impacts students' ability to regulate their own learning processes. Presentation technology (paper or computer) and content structure (linear or nonlinear) were…

  20. Use of Computer-Assisted Instruction to Review Microbiology and Antimicrobial Agents.

    ERIC Educational Resources Information Center

    Carver, Peggy L.; And Others

    1991-01-01

    A study assessed the effectiveness of a microcomputer-assisted instructional program using graphics, color, and text in simulations to enhance pharmacy students' knowledge of microbiology and antimicrobial agents. Results indicated high short- and long-term retention of information presented and higher levels of knowledge and comprehension among…

  1. Investigating Student Use of Electronic Support Tools and Mathematical Reasoning

    ERIC Educational Resources Information Center

    Higgins, Kristina N.; Crawford, Lindy; Huscroft-D'Angelo, Jacqueline; Horney, Mark

    2016-01-01

    Mathematical reasoning involves comprehending mathematical information and concepts in a logical way and forming conclusions and generalizations based on this comprehension. Computer-based learning has been incorporated into classrooms across the country, and specific aspects of technology need to be studied to determine how programs are…

  2. Guided Practice: Use of Low-Cost Networking.

    ERIC Educational Resources Information Center

    Gersten, Russell; And Others

    This study investigated the effectiveness of the use of computer networking in providing guided practice in teaching reading comprehension to middle school students (grades 6-8) in remedial reading class. (Guided practice is defined as the phase of instruction immediately following the presentation of a new skill, concept, or strategy, in which…

  3. Improving DHH Students' Grammar through an Individualized Software Program

    ERIC Educational Resources Information Center

    Cannon, Joanna E.; Easterbrooks, Susan R.; Gagne, Phill; Beal-Alvarez, Jennifer

    2011-01-01

    The purpose of this study was to determine if the frequent use of a targeted, computer software grammar instruction program, used as an individualized classroom activity, would influence the comprehension of morphosyntax structures (determiners, tense, and complementizers) in deaf/hard-of-hearing (DHH) participants who use American Sign Language…

  4. The Cost Effectiveness of 22 Approaches for Raising Student Achievement

    ERIC Educational Resources Information Center

    Yeh, Stuart S.

    2010-01-01

    Review of cost-effectiveness studies suggests that rapid assessment is more cost effective with regard to student achievement than comprehensive school reform (CSR), cross-age tutoring, computer-assisted instruction, a longer school day, increases in teacher education, teacher experience or teacher salaries, summer school, more rigorous math…

  5. Toward a Neurophysiological Theory of Auditory Stream Segregation

    ERIC Educational Resources Information Center

    Snyder, Joel S.; Alain, Claude

    2007-01-01

    Auditory stream segregation (or streaming) is a phenomenon in which 2 or more repeating sounds differing in at least 1 acoustic attribute are perceived as 2 or more separate sound sources (i.e., streams). This article selectively reviews psychophysical and computational studies of streaming and comprehensively reviews more recent…

  6. Studies for development of novel quinazolinones: New biomarker for EGFR

    NASA Astrophysics Data System (ADS)

    Aggarwal, Swati; Sinha, Deepa; Tiwari, Anjani Kumar; Pooja, Pooja; Kaul, Ankur; Singh, Gurmeet; Mishra, Anil Kumar

    2015-05-01

    The binding capabilities of a series of novel quinazolinone molecules were established through a comprehensive computational methodology as well as by in vitro analysis. The main focus of this work was to gain more insight into the interactions with the crystal structure of PDB ID:

  7. Performance, Cognitive Load, and Behaviour of Technology-Assisted English Listening Learning: From CALL to MALL

    ERIC Educational Resources Information Center

    Chang, Chi-Cheng; Warden, Clyde A.; Liang, Chaoyun; Chou, Pao-Nan

    2018-01-01

    This study examines differences in English listening comprehension, cognitive load, and learning behaviour between outdoor ubiquitous learning and indoor computer-assisted learning. An experimental design, employing a pretest-posttest control group is employed. Randomly assigned foreign language university majors joined either the experimental…

  8. Computer Simulation of Classic Studies in Psychology.

    ERIC Educational Resources Information Center

    Bradley, Drake R.

    This paper describes DATASIM, a comprehensive software package which generates simulated data for actual or hypothetical research designs. DATASIM is primarily intended for use in statistics and research methods courses, where it is used to generate "individualized" datasets for students to analyze, and later to correct their answers.…

  9. An Approach to Poiseuille's Law in an Undergraduate Laboratory Experiment

    ERIC Educational Resources Information Center

    Sianoudis, I. A.; Drakaki, E.

    2008-01-01

    The continuous growth of computer and sensor technology allows many researchers to develop simple modifications and/or refinements to standard educational experiments, making them more attractive and comprehensible to students and thus increasing their educational impact. In the framework of this approach, the present study proposes an alternative…

  10. The Reality of National Computer Networking for Higher Education. Proceedings of the 1978 EDUCOM Fall Conference. EDUCOM Series in Computing and Telecommunications in Higher Education 3.

    ERIC Educational Resources Information Center

    Emery, James C., Ed.

    A comprehensive review of the current status, prospects, and problems of computer networking in higher education is presented from the perspectives of both computer users and network suppliers. Several areas of computer use are considered including applications for instruction, research, and administration in colleges and universities. In the…

  11. Simulation Framework for Intelligent Transportation Systems

    DOT National Transportation Integrated Search

    1996-10-01

    A simulation framework has been developed for a large-scale, comprehensive, scalable simulation of an Intelligent Transportation System. The simulator is designed for running on parallel computers and distributed (networked) computer systems, but ca...

  12. Identifying 5-methylcytosine sites in RNA sequence using composite encoding feature into Chou's PseKNC.

    PubMed

    Sabooh, M Fazli; Iqbal, Nadeem; Khan, Mukhtaj; Khan, Muslim; Maqbool, H F

    2018-05-01

    This study examines an accurate and efficient computational method for identifying 5-methylcytosine sites in RNA modification. The occurrence of 5-methylcytosine (m5C) plays a vital role in a number of biological processes, and better comprehension of its biological functions and mechanisms requires recognizing m5C sites in RNA precisely. Laboratory techniques and procedures are available to identify m5C sites in RNA, but these procedures require considerable time and resources. This study develops a new computational method for extracting the features of an RNA sequence. In this method, the RNA sequence is first encoded via a composite feature vector; then, for the selection of discriminative features, the minimum-redundancy-maximum-relevance algorithm is used. Second, the classification method is based on a support vector machine, evaluated using the jackknife cross-validation test. The suggested method efficiently distinguishes m5C sites from non-m5C sites, achieving an accuracy of 93.33%, with a sensitivity of 90.0% and a specificity of 96.66% on benchmark datasets. These results show that the proposed algorithm has significantly better identification performance than existing computational techniques. This study extends knowledge about the occurrence sites of RNA modification, which paves the way for better comprehension of its biological functions and mechanisms. Copyright © 2018 Elsevier Ltd. All rights reserved.
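The jackknife (leave-one-out) evaluation protocol the paper describes can be sketched as follows. A minimal nearest-centroid classifier stands in for the paper's support vector machine so the example needs no external libraries; the encoded features and labels are invented, not real m5C data.

```python
# Sketch of jackknife (leave-one-out) accuracy estimation, with a
# nearest-centroid classifier as a stand-in for the paper's SVM.
from collections import defaultdict

def nearest_centroid_predict(train_X, train_y, x):
    # Compute one mean feature vector per class; classify x by the
    # closest centroid (squared Euclidean distance).
    groups = defaultdict(list)
    for xi, yi in zip(train_X, train_y):
        groups[yi].append(xi)
    best, best_d = None, float("inf")
    for label, rows in groups.items():
        centroid = [sum(col) / len(rows) for col in zip(*rows)]
        d = sum((a - b) ** 2 for a, b in zip(x, centroid))
        if d < best_d:
            best, best_d = label, d
    return best

def jackknife_accuracy(X, y):
    # Leave each sample out in turn, train on the rest, test on it.
    correct = 0
    for i in range(len(X)):
        train_X = X[:i] + X[i + 1:]
        train_y = y[:i] + y[i + 1:]
        correct += nearest_centroid_predict(train_X, train_y, X[i]) == y[i]
    return correct / len(X)

# Toy encoded "sequences": class 1 clusters high, class 0 clusters low
X = [[0.1, 0.2], [0.2, 0.1], [0.15, 0.25],
     [0.9, 0.8], [0.8, 0.9], [0.85, 0.75]]
y = [0, 0, 0, 1, 1, 1]
```

Because every sample serves exactly once as the test case, the jackknife gives a nearly unbiased accuracy estimate on small datasets, which is why it is common in this literature despite its cost of retraining N times.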

  13. I Use the Computer to ADVANCE Advances in Comprehension-Strategy Research.

    ERIC Educational Resources Information Center

    Blohm, Paul J.

    Merging the instructional implications drawn from theory and research in the interactive reading model, schemata, and metacognition with computer based instruction seems a natural approach for actively involving students' participation in reading and learning from text. Computer based graphic organizers guide students' preview or review of lengthy…

  14. A Framework for Understanding Physics Students' Computational Modeling Practices

    ERIC Educational Resources Information Center

    Lunk, Brandon Robert

    2012-01-01

    With the growing push to include computational modeling in the physics classroom, we are faced with the need to better understand students' computational modeling practices. While existing research on programming comprehension explores how novices and experts generate programming algorithms, little of this discusses how domain content…

  15. Learner Assessment Methods Using a Computer Based Interactive Videodisc System.

    ERIC Educational Resources Information Center

    Ehrlich, Lisa R.

    This paper focuses on item design considerations faced by instructional designers and evaluators when using computer videodisc delivery systems as a means of assessing learner comprehension and competencies. Media characteristics of various interactive computer/videodisc training systems are briefly discussed as well as reasons for using such…

  16. Chapter 11. Quality evaluation of apple by computer vision

    USDA-ARS?s Scientific Manuscript database

Apple is one of the most consumed fruits in the world, and there is a critical need for enhanced computer vision technology for quality assessment of apples. This chapter gives a comprehensive review of recent advances in various computer vision techniques for detecting surface and internal defects ...

  17. Understanding Counterfactuality: A Review of Experimental Evidence for the Dual Meaning of Counterfactuals

    PubMed Central

    Nieuwland, Mante S.

    2016-01-01

    Abstract Cognitive and linguistic theories of counterfactual language comprehension assume that counterfactuals convey a dual meaning. Subjunctive‐counterfactual conditionals (e.g., ‘If Tom had studied hard, he would have passed the test’) express a supposition while implying the factual state of affairs (Tom has not studied hard and failed). The question of how counterfactual dual meaning plays out during language processing is currently gaining interest in psycholinguistics. Whereas numerous studies using offline measures of language processing consistently support counterfactual dual meaning, evidence coming from online studies is less conclusive. Here, we review the available studies that examine online counterfactual language comprehension through behavioural measurement (self‐paced reading times, eye‐tracking) and neuroimaging (electroencephalography, functional magnetic resonance imaging). While we argue that these studies do not offer direct evidence for the online computation of counterfactual dual meaning, they provide valuable information about the way counterfactual meaning unfolds in time and influences successive information processing. Further advances in research on counterfactual comprehension require more specific predictions about how counterfactual dual meaning impacts incremental sentence processing. PMID:27512408

  18. Computer-Assisted Second Language Vocabulary Learning in a Paired-Associate Paradigm: A Critical Investigation of Flashcard Software

    ERIC Educational Resources Information Center

    Nakata, Tatsuya

    2011-01-01

    The present study aims to conduct a comprehensive investigation of flashcard software for learning vocabulary in a second language. Nine flashcard programs were analysed using 17 criteria derived from previous studies on flashcard learning as well as paired-associate learning. Results suggest that in general, most programs have been developed in a…

  19. Evaluation of the Relationship between Literacy and Mathematics Skills as Assessed by Curriculum-Based Measures

    ERIC Educational Resources Information Center

    Rutherford-Becker, Kristy J.; Vanderwood, Michael L.

    2009-01-01

    The purpose of this study was to evaluate the extent that reading performance (as measured by curriculum-based measures [CBM] of oral reading fluency [ORF] and Maze reading comprehension), is related to math performance (as measured by CBM math computation and applied math). Additionally, this study examined which of the two reading measures was a…

  20. The Impact of the Pre-Instructional Cognitive Profile on Learning Gain and Final Exam of Physics Courses: A Case Study

    ERIC Educational Resources Information Center

    Capizzo, Maria Concetta; Nuzzo, Silvana; Zarcone, Michelangelo

    2006-01-01

The case study described in this paper investigates the relationship among pre-instructional knowledge, learning gain, and final physics performance of computing engineering students in the introductory physics course. The results of the entrance engineering test (EET) have been used as a measurement of reading comprehension, logic and…

  1. Cost-effective use of minicomputers to solve structural problems

    NASA Technical Reports Server (NTRS)

    Storaasli, O. O.; Foster, E. P.

    1978-01-01

    Minicomputers are receiving increased use throughout the aerospace industry. Until recently, their use focused primarily on process control and numerically controlled tooling applications, while their exposure to and the opportunity for structural calculations has been limited. With the increased availability of this computer hardware, the question arises as to the feasibility and practicality of carrying out comprehensive structural analysis on a minicomputer. This paper presents results on the potential for using minicomputers for structural analysis by (1) selecting a comprehensive, finite-element structural analysis system in use on large mainframe computers; (2) implementing the system on a minicomputer; and (3) comparing the performance of the minicomputers with that of a large mainframe computer for the solution to a wide range of finite element structural analysis problems.

  2. Internal aerodynamics of a generic three-dimensional scramjet inlet at Mach 10

    NASA Technical Reports Server (NTRS)

    Holland, Scott D.

    1995-01-01

    A combined computational and experimental parametric study of the internal aerodynamics of a generic three-dimensional sidewall compression scramjet inlet configuration at Mach 10 has been performed. The study was designed to demonstrate the utility of computational fluid dynamics as a design tool in hypersonic inlet flow fields, to provide a detailed account of the nature and structure of the internal flow interactions, and to provide a comprehensive surface property and flow field database to determine the effects of contraction ratio, cowl position, and Reynolds number on the performance of a hypersonic scramjet inlet configuration. The work proceeded in several phases: the initial inviscid assessment of the internal shock structure, the preliminary computational parametric study, the coupling of the optimized configuration with the physical limitations of the facility, the wind tunnel blockage assessment, and the computational and experimental parametric study of the final configuration. Good agreement between computation and experimentation was observed in the magnitude and location of the interactions, particularly for weakly interacting flow fields. Large-scale forward separations resulted when the interaction strength was increased by increasing the contraction ratio or decreasing the Reynolds number.

  3. The Comprehensive Competencies Program Reference Manual. Volume I. Introduction.

    ERIC Educational Resources Information Center

    Taggart, Robert

    Chapter 1 of this reference manual is a summary of the comprehensive competencies program (CCP). It describes this system for organizing, implementing, managing, and efficiently delivering individualized self-paced instruction, combined with group and experience-based learning activities, using computer-assisted instruction. (The CCP covers not…

  4. Program For Analysis Of Metal-Matrix Composites

    NASA Technical Reports Server (NTRS)

    Murthy, P. L. N.; Mital, S. K.

    1994-01-01

METCAN (METal matrix Composite ANalyzer) is a computer program used to computationally simulate the nonlinear behavior of high-temperature metal-matrix composite structural components in specific applications, providing comprehensive analyses of thermal and mechanical performance. Written in FORTRAN 77.

  5. Index to College Television Courseware. A Comprehensive Directory of Credit Courses and Concept Modules Distributed on Video Tape and Film.

    ERIC Educational Resources Information Center

    Prange, W. Werner; Bellinghausen, Carol R.

    A directory of college television courseware lists offerings in curriculum areas such as social sciences, biology, black studies, business, mathematics, sciences, computer science, consumer protection, creative arts, drug education, ecology, engineering, humanities, physics, nursing, nutrition, religion, and vocational education. Each course…

  6. How Readability and Topic Incidence Relate to Performance on Mathematics Story Problems in Computer-Based Curricula

    ERIC Educational Resources Information Center

    Walkington, Candace; Clinton, Virginia; Ritter, Steven N.; Nathan, Mitchell J.

    2015-01-01

    Solving mathematics story problems requires text comprehension skills. However, previous studies have found few connections between traditional measures of text readability and performance on story problems. We hypothesized that recently developed measures of readability and topic incidence measured by text-mining tools may illuminate associations…

  7. A Formative Analysis of Resources Used to Learn Software

    ERIC Educational Resources Information Center

    Kay, Robin

    2007-01-01

    A comprehensive, formal comparison of resources used to learn computer software has yet to be researched. Understanding the relative strengths and weakness of resources would provide useful guidance to teachers and students. The purpose of the current study was to explore the effectiveness of seven key resources: human assistance, the manual, the…

  8. Effect of the Affordances of a Virtual Environment on Second Language Oral Proficiency

    ERIC Educational Resources Information Center

    Carruthers, Heidy P. Cuervo

    2013-01-01

    The traditional language laboratory consists of computer-based exercises in which students practice the language individually, working on language form drills and listening comprehension activities. In addition to the traditional approach to the laboratory requirement, students in the study participated in a weekly conversation hour focusing on…

  9. Stability of Linear Equations--Algebraic Approach

    ERIC Educational Resources Information Center

    Cherif, Chokri; Goldstein, Avraham; Prado, Lucio M. G.

    2012-01-01

    This article could be of interest to teachers of applied mathematics as well as to people who are interested in applications of linear algebra. We give a comprehensive study of linear systems from an application point of view. Specifically, we give an overview of linear systems and problems that can occur with the computed solution when the…

  10. Using Robot Animation to Promote Gestural Skills in Children with Autism Spectrum Disorders

    ERIC Educational Resources Information Center

    So, W.-C.; Wong, M. K.-Y.; Cabibihan, J.-J.; Lam, C. K.-Y.; Chan, R. Y.-Y.; Qian, H.-H.

    2016-01-01

    School-aged children with autism spectrum disorders (ASDs) have delayed gestural development, in comparison with age-matched typically developing children. In this study, an intervention program taught children with low-functioning ASD gestural comprehension and production using video modelling (VM) by a computer-generated robot animation. Six to…

  11. A comprehensive overview of the applications of artificial life.

    PubMed

    Kim, Kyung-Joong; Cho, Sung-Bae

    2006-01-01

    We review the applications of artificial life (ALife), the creation of synthetic life on computers to study, simulate, and understand living systems. The definition and features of ALife are illustrated through application studies. ALife application fields treated include robot control, robot manufacturing, practical robots, computer graphics, natural phenomenon modeling, entertainment, games, music, economics, Internet, information processing, industrial design, simulation software, electronics, security, data mining, and telecommunications. In order to show the status of ALife application research, this review primarily features a survey of about 180 ALife application articles rather than a selected representation of a few articles. Evolutionary computation is the most popular method for designing such applications, but recently swarm intelligence, artificial immune networks, and agent-based modeling have also produced results. Applications were initially restricted to robotics and computer graphics, but presently many different applications in engineering areas are of interest.

  12. Comprehensive Outlook for Managed Pines Using Simulated Treatment Experiments-Planted Loblolly Pine (COMPUTE_P-LOB): A User's Guide

    Treesearch

    R.B. Ferguson; V. Clark Baldwin

    1987-01-01

    Complete instructions for user operation of COMPUTE_P-LOB, a growth and yield prediction system providing volume and weight yields in stand and stock table format, including detailed examples of computer input and output. A complete program listing is provided.

  13. Cognitive Consequences of Participation in a "Fifth Dimension" After-School Computer Club.

    ERIC Educational Resources Information Center

    Mayer, Richard E.; Quilici, Jill; Moreno, Roxana; Duran, Richard; Woodbridge, Scott; Simon, Rebecca; Sanchez, David; Lavezzo, Amy

    1997-01-01

    Children who attended the Fifth Dimension after-school computer club at least 10 times during the 1994-95 school year performed better on word problem comprehension tests than did non-participating children. Results support the hypothesis that experience in using computer software in the Fifth Dimension club produces measurable, resilient, and…

  14. Models and techniques for evaluating the effectiveness of aircraft computing systems

    NASA Technical Reports Server (NTRS)

    Meyer, J. F.

    1978-01-01

    Progress in the development of system models and techniques for the formulation and evaluation of aircraft computer system effectiveness is reported. Topics covered include: analysis of functional dependence; a prototype software package, METAPHOR, developed to aid the evaluation of performability; and a comprehensive performability modeling and evaluation exercise involving the SIFT computer.

  15. Proceedings: Conference on Computers in Chemical Education and Research, Dekalb, Illinois, 19-23 July 1971.

    ERIC Educational Resources Information Center

    1971

    Computers have effected a comprehensive transformation of chemistry. Computers have greatly enhanced the chemist's ability to do model building, simulations, data refinement and reduction, analysis of data in terms of models, on-line data logging, automated control of experiments, quantum chemistry and statistical and mechanical calculations, and…

  16. Training and Generalization Effects of a Reading Comprehension Learning Strategy on Computer and Paper-Pencil Assessments

    ERIC Educational Resources Information Center

    Worrell, Jamie; Duffy, Mary Lou; Brady, Michael P.; Dukes, Charles; Gonzalez-DeHass, Alyssa

    2016-01-01

    Many schools use computer-based testing to measure students' progress for end-of-the-year and statewide assessments. There is little research to support whether computer-based testing accurately reflects student progress, particularly among students with learning, performance, and generalization difficulties. This article summarizes an…

  17. Revalidation of the Selection Instrument for Flight Training

    DTIC Science & Technology

    2017-07-01

    AACog) composite, as measured by the following SIFT subscales: the Mechanical Comprehension Test (MCT), which covers applied mechanical science; the Math Skills Test (MST), which assesses the examinee's computational skill and mathematical aptitude; and the Reading Comprehension Test.

  18. A History of Rotorcraft Comprehensive Analyses

    NASA Technical Reports Server (NTRS)

    Johnson, Wayne

    2013-01-01

    A history of the development of rotorcraft comprehensive analyses is presented. Comprehensive analyses are digital computer programs that calculate the aeromechanical behavior of the rotor and aircraft, bringing together the most advanced models of the geometry, structure, dynamics, and aerodynamics available in rotary wing technology. The development of the major codes of the last five decades from industry, government, and universities is described. A number of common themes observed in this history are discussed.

  19. Reliability and validity of the C-BiLLT: a new instrument to assess comprehension of spoken language in young children with cerebral palsy and complex communication needs.

    PubMed

    Geytenbeek, Joke J; Mokkink, Lidwine B; Knol, Dirk L; Vermeulen, R Jeroen; Oostrom, Kim J

    2014-09-01

    In clinical practice, a variety of diagnostic tests are available to assess a child's comprehension of spoken language. However, none of these tests have been designed specifically for use with children who have severe motor impairments and who experience severe difficulty when using speech to communicate. This article describes the process of investigating the reliability and validity of the Computer-Based Instrument for Low Motor Language Testing (C-BiLLT), which was specifically developed to assess spoken Dutch language comprehension in children with cerebral palsy and complex communication needs. The study included 806 children with typical development, and 87 nonspeaking children with cerebral palsy and complex communication needs, and was designed to provide information on the psychometric qualities of the C-BiLLT. The potential utility of the C-BiLLT as a measure of spoken Dutch language comprehension abilities for children with cerebral palsy and complex communication needs is discussed.

  20. Feasibility Study for an Air Force Environmental Model and Data Exchange. Volume 4. Appendix G. Model Review and Index-Air Multimedia and Other Models, Plus Data Bases.

    DTIC Science & Technology

    1983-07-01

    Analysis of trace contaminants project at ORNL; medium applied to movement of heavy metals through a forested watershed. OAQPS has not reviewed... computer cartography and site design aids; management information systems for facility planning, construction, and operation; and a computer... The model index categories include comprehensive, spills/heavy gas, regional, reactive pollutant, special purpose, and rocket firing models.

  1. THE COMPREHENSION OF RAPID SPEECH BY THE BLIND, PART III.

    ERIC Educational Resources Information Center

    FOULKE, EMERSON

    A REVIEW OF THE RESEARCH ON THE COMPREHENSION OF RAPID SPEECH BY THE BLIND IDENTIFIES FIVE METHODS OF SPEECH COMPRESSION--SPEECH CHANGING, ELECTROMECHANICAL SAMPLING, COMPUTER SAMPLING, SPEECH SYNTHESIS, AND FREQUENCY DIVIDING WITH THE HARMONIC COMPRESSOR. THE SPEECH CHANGING AND ELECTROMECHANICAL SAMPLING METHODS AND THE NECESSARY APPARATUS HAVE…

  2. The Bilingual Language Interaction Network for Comprehension of Speech

    ERIC Educational Resources Information Center

    Shook, Anthony; Marian, Viorica

    2013-01-01

    During speech comprehension, bilinguals co-activate both of their languages, resulting in cross-linguistic interaction at various levels of processing. This interaction has important consequences for both the structure of the language system and the mechanisms by which the system processes spoken language. Using computational modeling, we can…

  3. Prediction, Error, and Adaptation during Online Sentence Comprehension

    ERIC Educational Resources Information Center

    Fine, Alex Brabham

    2013-01-01

    A fundamental challenge for human cognition is perceiving and acting in a world in which the statistics that characterize available sensory data are non-stationary. This thesis focuses on this problem specifically in the domain of sentence comprehension, where linguistic variability poses computational challenges to the processes underlying…

  4. Keeping It Simple: The Case for E-Mail.

    ERIC Educational Resources Information Center

    Haimovic, Gila

    The Open University of Israel (OUI) is a distance education institution that offers over 250 computer-mediated courses through the Internet. All OUI students must pass an English reading comprehension exemption exam or take the University's English reading comprehension courses. Because reading instruction differs from content instruction,…

  5. The Role of Working Memory in Metaphor Production and Comprehension

    ERIC Educational Resources Information Center

    Chiappe, Dan L.; Chiappe, Penny

    2007-01-01

    The following tested Kintsch's [Kintsch, W. (2000). "Metaphor comprehension: a computational theory." "Psychonomic Bulletin & Review," 7, 257-266 and Kintsch, W. (2001). "Predication." "Cognitive Science," 25, 173-202] Predication Model, which predicts that working memory capacity is an important factor in metaphor processing. In support of his…

  6. Interactive Video Listening Comprehension in Foreign Language Instruction: Development and Evaluation.

    ERIC Educational Resources Information Center

    Fischer, Robert

    The report details development, at Southwest Texas State University and later at Pennsylvania State University, of a computer authoring system ("Libra") enabling foreign language faculty to develop multimedia lessons focusing on listening comprehension. Staff at Southwest Texas State University first developed a Macintosh version of the…

  7. Event-Based Plausibility Immediately Influences On-Line Language Comprehension

    ERIC Educational Resources Information Center

    Matsuki, Kazunaga; Chow, Tracy; Hare, Mary; Elman, Jeffrey L.; Scheepers, Christoph; McRae, Ken

    2011-01-01

    In some theories of sentence comprehension, linguistically relevant lexical knowledge, such as selectional restrictions, is privileged in terms of the time-course of its access and influence. We examined whether event knowledge computed by combining multiple concepts can rapidly influence language understanding even in the absence of selectional…

  8. Twelfth NASTRAN (R) Users' Colloquium

    NASA Technical Reports Server (NTRS)

    1984-01-01

    NASTRAN is a large, comprehensive, nonproprietary, general purpose finite element computer code for structural analysis. The Twelfth Users' Colloquium provides some comprehensive papers on the application of finite element methods in engineering, comparisons with other approaches, unique applications, pre- and post-processing or auxiliary programs, and new methods of analysis with NASTRAN.

  9. Computational intelligence in bioinformatics: SNP/haplotype data in genetic association study for common diseases.

    PubMed

    Kelemen, Arpad; Vasilakos, Athanasios V; Liang, Yulan

    2009-09-01

    Comprehensive evaluation of common genetic variations through association of single-nucleotide polymorphism (SNP) structure with common complex disease in the genome-wide scale is currently a hot area in human genome research due to the recent development of the Human Genome Project and HapMap Project. Computational science, which includes computational intelligence (CI), has recently become the third method of scientific enquiry besides theory and experimentation. There have been fast growing interests in developing and applying CI in disease mapping using SNP and haplotype data. Some of the recent studies have demonstrated the promise and importance of CI for common complex diseases in genomic association study using SNP/haplotype data, especially for tackling challenges, such as gene-gene and gene-environment interactions, and the notorious "curse of dimensionality" problem. This review provides coverage of recent developments of CI approaches for complex diseases in genetic association study with SNP/haplotype data.
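As context for the CI methods surveyed above, the conventional baseline they aim to improve on is the single-SNP case-control association test. The following minimal illustration (not taken from the review; the genotype counts are synthetic) applies a chi-square test to a 2x3 genotype contingency table:

```python
# Baseline single-SNP association test: chi-square on a case/control
# by genotype (AA, Aa, aa) contingency table. Counts are synthetic.
import numpy as np
from scipy.stats import chi2_contingency

table = np.array([[60, 30, 10],   # cases
                  [40, 40, 20]])  # controls

chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2={chi2:.2f}, dof={dof}, p={p:.4f}")
```

CI approaches (evolutionary search, neural networks, agent-based models) extend this to joint gene-gene and gene-environment effects, where testing every SNP combination exhaustively runs into the "curse of dimensionality" the review discusses.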

  10. Computers, the Human Mind, and My In-Laws' House.

    ERIC Educational Resources Information Center

    Esque, Timm J.

    1996-01-01

    Discussion of human memory, computer memory, and the storage of information focuses on a metaphor that can account for memory without storage and can set the stage for systemic research around a more comprehensive, understandable theory. (Author/LRW)

  11. Who was the agent? The neural correlates of reanalysis processes during sentence comprehension.

    PubMed

    Hirotani, Masako; Makuuchi, Michiru; Rüschemeyer, Shirley-Ann; Friederici, Angela D

    2011-11-01

    Sentence comprehension is a complex process. Besides identifying the meaning of each word and processing the syntactic structure of a sentence, it requires the computation of thematic information, that is, information about who did what to whom. The present fMRI study investigated the neural basis for thematic reanalysis (reanalysis of the thematic roles initially assigned to noun phrases in a sentence) and its interplay with syntactic reanalysis (reanalysis of the underlying syntactic structure originally constructed for a sentence). Thematic reanalysis recruited a network consisting of Broca's area, that is, the left pars triangularis (LPT), and the left posterior superior temporal gyrus, whereas only LPT showed greater sensitivity to syntactic reanalysis. These data provide direct evidence for a functional neuroanatomical basis for two linguistically motivated reanalysis processes during sentence comprehension. Copyright © 2010 Wiley-Liss, Inc.

  12. Intertwining Digital Content and a One-to-One Laptop Environment in Teaching and Learning: Lessons from the Time to Know Program

    ERIC Educational Resources Information Center

    Rosen, Yigal; Beck-Hill, Dawne

    2012-01-01

    This study provides a comprehensive look at a constructivist one-to-one computing program's effects on teaching and learning practices as well as student learning achievements. The study participants were 476 fourth and fifth grade students and their teachers from four elementary schools from a school district in the Dallas, Texas, area. Findings…

  13. Computational Study of Thrombus Formation and Clotting Factor Effects under Venous Flow Conditions

    DTIC Science & Technology

    2016-04-26

    Telemedicine and Advanced Technology Research Center, U.S. Army Medical Research and Materiel Command, Fort Detrick, Maryland. ABSTRACT: A comprehensive... experimental study. The model allowed us to identify the distinct patterns characterizing the spatial distributions of thrombin, platelets, and fibrin... time, elevated fibrinogen levels may contribute to the development of thrombosis (4,6,12). Quantitative knowledge about the interactions between fibrin...

  14. Comprehensive Flood Plain Studies Using Spatial Data Management Techniques.

    DTIC Science & Technology

    1978-06-01

    Hydrologic Engineering Center computer programs that forecast urban storm water quality and dynamic in-stream water quality response to waste... determination. Water Quality: The water quality analysis planned for the pilot study includes urban storm water quality forecasting and in-stream... analysis is performed under the direction of Tony Thomas, Chief, Research Branch, by Jess Abbott for storm water quality analysis, R. G. Willey for

  15. Comprehensive numerical methodology for direct numerical simulations of compressible Rayleigh-Taylor instability

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Reckinger, Scott James; Livescu, Daniel; Vasilyev, Oleg V.

    A comprehensive numerical methodology has been developed that handles the challenges introduced by considering the compressible nature of Rayleigh-Taylor instability (RTI) systems, which include sharp interfacial density gradients on strongly stratified background states, acoustic wave generation and removal at computational boundaries, and stratification-dependent vorticity production. The computational framework is used to simulate two-dimensional single-mode RTI to extreme late times for a wide range of flow compressibility and variable-density effects. The results show that flow compressibility acts to reduce the growth of RTI for low Atwood numbers, as predicted from linear stability analysis.
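For reference, the classical incompressible growth rate against which such compressibility corrections are measured is the standard textbook result of linear stability analysis (not reproduced from the paper itself):

```latex
% Incompressible RT linear growth rate for a perturbation of wavenumber k
% on an interface between a heavy fluid (density \rho_2) over a light one
% (\rho_1), under gravity g; the perturbation amplitude grows as e^{\sigma t}.
\sigma = \sqrt{A\,g\,k},
\qquad
A = \frac{\rho_2 - \rho_1}{\rho_2 + \rho_1}
```

Here $A$ is the Atwood number; compressibility and background stratification modify this rate, which is the effect the study quantifies.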

  16. Clinical applications of cone beam computed tomography in endodontics: A comprehensive review.

    PubMed

    Cohenca, Nestor; Shemesh, Hagay

    2015-06-01

    Cone beam computed tomography (CBCT) is a new technology that produces three-dimensional (3D) digital imaging at reduced cost and with less radiation for the patient than traditional CT scans. It also delivers faster and easier image acquisition. By providing a 3D representation of the maxillofacial tissues in a cost- and dose-efficient manner, a better preoperative assessment can be obtained for diagnosis and treatment. This comprehensive review presents current applications of CBCT in endodontics. Specific case examples illustrate the difference in treatment planning with traditional periapical radiography versus CBCT technology.

  17. Comprehensive Thematic T-Matrix Reference Database: A 2014-2015 Update

    NASA Technical Reports Server (NTRS)

    Mishchenko, Michael I.; Zakharova, Nadezhda; Khlebtsov, Nikolai G.; Videen, Gorden; Wriedt, Thomas

    2015-01-01

    The T-matrix method is one of the most versatile and efficient direct computer solvers of the macroscopic Maxwell equations and is widely used for the computation of electromagnetic scattering by single and composite particles, discrete random media, and particles in the vicinity of an interface separating two half-spaces with different refractive indices. This paper is the seventh update to the comprehensive thematic database of peer-reviewed T-matrix publications initiated by us in 2004 and includes relevant publications that have appeared since 2013. It also lists a number of earlier publications overlooked previously.

  18. Development of a COTS-Based Computing Environment Blueprint Application at KSC

    NASA Technical Reports Server (NTRS)

    Ghansah, Isaac; Boatright, Bryan

    1996-01-01

    This paper describes a blueprint that can be used for developing a distributed computing environment (DCE) for NASA in general, and the Kennedy Space Center (KSC) in particular. A comprehensive, open, secure, integrated, and multi-vendor DCE such as OSF DCE has been suggested. Design issues, as well as recommendations for each component have been given. Where necessary, modifications were suggested to fit the needs of KSC. This was done in the areas of security and directory services. Readers requiring a more comprehensive coverage are encouraged to refer to the eight-chapter document prepared for this work.

  19. Modeling Criminal Activity in Urban Landscapes

    NASA Astrophysics Data System (ADS)

    Brantingham, Patricia; Glässer, Uwe; Jackson, Piper; Vajihollahi, Mona

    Computational and mathematical methods arguably have an enormous potential for serving practical needs in crime analysis and prevention by offering novel tools for crime investigations and experimental platforms for evidence-based policy making. We present a comprehensive formal framework and tool support for mathematical and computational modeling of criminal behavior to facilitate systematic experimental studies of a wide range of criminal activities in urban environments. The focus is on spatial and temporal aspects of different forms of crime, including opportunistic and serial violent crimes. However, the proposed framework provides a basis to push beyond conventional empirical research and engage the use of computational thinking and social simulations in the analysis of terrorism and counter-terrorism.

  20. AdapChem

    NASA Technical Reports Server (NTRS)

    Oluwole, Oluwayemisi O.; Wong, Hsi-Wu; Green, William

    2012-01-01

    AdapChem software enables high efficiency, low computational cost, and enhanced accuracy in computational fluid dynamics (CFD) numerical simulations used for combustion studies. The software dynamically allocates smaller, reduced chemical models in place of the larger, full chemistry model as the calculation evolves, while preserving the accuracy of steady-state CFD reacting-flow simulations. The software enables detailed chemical kinetic modeling in combustion CFD simulations. AdapChem adapts the reaction mechanism used in the CFD to the local reaction conditions: instead of a single, comprehensive reaction mechanism throughout the computation, a dynamic distribution of smaller, reduced models is used to capture the chemical kinetics accurately at a fraction of the cost of the traditional single-mechanism approach.
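    The per-cell adaptation described above can be sketched as a mechanism selector that picks the smallest reduced model whose validity range covers the local reaction conditions. The mechanism names, sizes, and validity predicates below are hypothetical illustrations, not AdapChem's actual models or API:

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class Mechanism:
    name: str
    n_species: int
    # Predicate over local cell state (temperature T in K,
    # equivalence ratio phi) for which this model is valid.
    valid: Callable[[float, float], bool]

# Hypothetical reduced models, ordered smallest first; the full
# mechanism is the fallback that is always valid.
MECHANISMS = [
    Mechanism("cold_flow", 5, lambda T, phi: T < 800.0),
    Mechanism("lean_reduced", 20, lambda T, phi: phi < 0.7),
    Mechanism("full", 53, lambda T, phi: True),
]

def assign_mechanism(T: float, phi: float) -> Mechanism:
    """Return the smallest mechanism valid for the local conditions,
    mimicking a dynamic distribution of reduced models over cells."""
    for mech in MECHANISMS:
        if mech.valid(T, phi):
            return mech
    return MECHANISMS[-1]  # unreachable: "full" is always valid
```

    In an actual adaptive-chemistry CFD loop, a selector like this would run per cell per time step, so most cells integrate only a handful of species while near-flame cells pay for the full mechanism.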

  1. Scalable total synthesis and comprehensive structure-activity relationship studies of the phytotoxin coronatine.

    PubMed

    Littleson, Mairi M; Baker, Christopher M; Dalençon, Anne J; Frye, Elizabeth C; Jamieson, Craig; Kennedy, Alan R; Ling, Kenneth B; McLachlan, Matthew M; Montgomery, Mark G; Russell, Claire J; Watson, Allan J B

    2018-03-16

    Natural phytotoxins are valuable starting points for agrochemical design. Acting as a jasmonate agonist, coronatine represents an attractive herbicidal lead with a novel mode of action and has been an important synthetic target for agrochemical development. However, both restricted access to quantities of coronatine and the lack of a suitably scalable and flexible synthetic approach to its constituent natural product components, coronafacic and coronamic acids, have frustrated development of this target. Here, we report gram-scale production of coronafacic acid that allows a comprehensive structure-activity relationship study of this target. Biological assessment of a >120-member library combined with computational studies has revealed the key determinants of potency, rationalising hypotheses held for decades and allowing future rational design of new herbicidal leads based on this template.

  2. Modelling and simulation of complex sociotechnical systems: envisioning and analysing work environments

    PubMed Central

    Hettinger, Lawrence J.; Kirlik, Alex; Goh, Yang Miang; Buckle, Peter

    2015-01-01

    Accurate comprehension and analysis of complex sociotechnical systems is a daunting task. Empirically examining, or simply envisioning, the structure and behaviour of such systems challenges traditional analytic and experimental approaches as well as our everyday cognitive capabilities. Computer-based models and simulations afford potentially useful means of accomplishing sociotechnical system design and analysis objectives. From a design perspective, they can provide a basis for a common mental model among stakeholders, thereby facilitating accurate comprehension of factors impacting system performance and potential effects of system modifications. From a research perspective, models and simulations afford the means to study aspects of sociotechnical system design and operation, including the potential impact of modifications to structural and dynamic system properties, in ways not feasible with traditional experimental approaches. This paper describes issues involved in the design and use of such models and simulations and describes a proposed path forward to their development and implementation. Practitioner Summary: The size and complexity of real-world sociotechnical systems can present significant barriers to their design, comprehension and empirical analysis. This article describes the potential advantages of computer-based models and simulations for understanding factors that impact sociotechnical system design and operation, particularly with respect to process and occupational safety. PMID:25761227

  3. A new computer code for discrete fracture network modelling

    NASA Astrophysics Data System (ADS)

    Xu, Chaoshui; Dowd, Peter

    2010-03-01

    The authors describe a comprehensive software package for two- and three-dimensional stochastic rock fracture simulation using marked point processes. Fracture locations can be modelled by a Poisson, a non-homogeneous, a cluster or a Cox point process; fracture geometries and properties are modelled by their respective probability distributions. Virtual sampling tools such as plane, window and scanline sampling are included in the software together with a comprehensive set of statistical tools including histogram analysis, probability plots, rose diagrams and hemispherical projections. The paper describes in detail the theoretical basis of the implementation and provides a case study in rock fracture modelling to demonstrate the application of the software.
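    The simplest of the point-process options above, a homogeneous Poisson process of fracture centres marked with random orientation and trace length, can be sketched in a few lines. This is an illustrative sketch of the underlying stochastic model, not the package's API; parameter names are assumptions:

```python
import math
import random

def poisson_sample(rng, lam):
    """Knuth's algorithm for a Poisson(lam) random variate."""
    limit, k, p = math.exp(-lam), 0, 1.0
    while p > limit:
        k += 1
        p *= rng.random()
    return k - 1

def simulate_fractures(intensity, region=(10.0, 10.0), mean_length=1.0, seed=0):
    """Sample a 2-D marked Poisson point process: fracture centres are
    uniformly scattered with Poisson-distributed count, each marked with
    a uniform orientation and an exponentially distributed trace length.
    Returns a list of fracture segments ((x1, y1), (x2, y2))."""
    rng = random.Random(seed)
    w, h = region
    n = poisson_sample(rng, intensity * w * h)  # expected count = intensity * area
    fractures = []
    for _ in range(n):
        cx, cy = rng.uniform(0, w), rng.uniform(0, h)   # centre location
        theta = rng.uniform(0, math.pi)                  # orientation mark
        length = rng.expovariate(1.0 / mean_length)      # length mark
        dx = 0.5 * length * math.cos(theta)
        dy = 0.5 * length * math.sin(theta)
        fractures.append(((cx - dx, cy - dy), (cx + dx, cy + dy)))
    return fractures
```

    The non-homogeneous, cluster, and Cox variants mentioned in the abstract differ only in how the centre locations are drawn; the marking of each fracture with geometry and property distributions follows the same pattern.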

  4. Using a voice to put a name to a face: the psycholinguistics of proper name comprehension.

    PubMed

    Barr, Dale J; Jackson, Laura; Phillips, Isobel

    2014-02-01

    We propose that hearing a proper name (e.g., Kevin) in a particular voice serves as a compound memory cue that directly activates representations of a mutually known target person, often permitting reference resolution without any complex computation of shared knowledge. In a referential communication study, pairs of friends played a communication game, in which we monitored the eyes of one friend (the addressee) while he or she sought to identify the target person, in a set of four photos, on the basis of a name spoken aloud. When the name was spoken by a friend, addressees rapidly identified the target person, and this facilitation was independent of whether the friend was articulating a message he or she had designed versus one from a third party with whom the target person was not shared. Our findings suggest that the comprehension system takes advantage of regularities in the environment to minimize effortful computation about who knows what.

  5. Histological Image Feature Mining Reveals Emergent Diagnostic Properties for Renal Cancer

    PubMed Central

    Kothari, Sonal; Phan, John H.; Young, Andrew N.; Wang, May D.

    2016-01-01

    Computer-aided histological image classification systems are important for making objective and timely cancer diagnostic decisions. These systems use combinations of image features that quantify a variety of image properties. Because researchers tend to validate their diagnostic systems on specific cancer endpoints, it is difficult to predict which image features will perform well given a new cancer endpoint. In this paper, we define a comprehensive set of common image features (consisting of 12 distinct feature subsets) that quantify a variety of image properties. We use a data-mining approach to determine which feature subsets and image properties emerge as part of an “optimal” diagnostic model when applied to specific cancer endpoints. Our goal is to assess the performance of such comprehensive image feature sets for application to a wide variety of diagnostic problems. We perform this study on 12 endpoints including 6 renal tumor subtype endpoints and 6 renal cancer grade endpoints. Keywords: histology, image mining, computer-aided diagnosis. PMID:28163980

  6. Assessing Working Memory in Children: The Comprehensive Assessment Battery for Children - Working Memory (CABC-WM).

    PubMed

    Cabbage, Kathryn; Brinkley, Shara; Gray, Shelley; Alt, Mary; Cowan, Nelson; Green, Samuel; Kuo, Trudy; Hogan, Tiffany P

    2017-06-12

    The Comprehensive Assessment Battery for Children - Working Memory (CABC-WM) is a computer-based battery designed to assess different components of working memory in young school-age children. Working memory deficits have been identified in children with language-based learning disabilities, including dyslexia [1,2] and language impairment [3,4], but it is not clear whether these children exhibit deficits in subcomponents of working memory, such as visuospatial or phonological working memory. The CABC-WM is administered on a desktop computer with a touchscreen interface and was specifically developed to be engaging and motivating for children. Although the long-term goal of the CABC-WM is to provide individualized working memory profiles in children, the present study focuses on the initial success and utility of the CABC-WM for measuring central executive, visuospatial, phonological loop, and binding constructs in children with typical development. Immediate next steps are to administer the CABC-WM to children with specific language impairment, dyslexia, and comorbid specific language impairment and dyslexia.

  7. Intersections between the Autism Spectrum and the Internet: Perceived Benefits and Preferred Functions of Computer-Mediated Communication

    ERIC Educational Resources Information Center

    Gillespie-Lynch, Kristen; Kapp, Steven K.; Shane-Simpson, Christina; Smith, David Shane; Hutman, Ted

    2014-01-01

    An online survey compared the perceived benefits and preferred functions of computer-mediated communication of participants with (N = 291) and without ASD (N = 311). Participants with autism spectrum disorder (ASD) perceived benefits of computer-mediated communication in terms of increased comprehension and control over communication, access to…

  8. Hiding in Plain Sight: Identifying Computational Thinking in the Ontario Elementary School Curriculum

    ERIC Educational Resources Information Center

    Hennessey, Eden J. V.; Mueller, Julie; Beckett, Danielle; Fisher, Peter A.

    2017-01-01

    Given a growing digital economy with complex problems, demands are being made for education to address computational thinking (CT)--an approach to problem solving that draws on the tenets of computer science. We conducted a comprehensive content analysis of the Ontario elementary school curriculum documents for 44 CT-related terms to examine the…

  9. Computer Simulations to Support Science Instruction and Learning: A Critical Review of the Literature

    ERIC Educational Resources Information Center

    Smetana, Lara Kathleen; Bell, Randy L.

    2012-01-01

    Researchers have explored the effectiveness of computer simulations for supporting science teaching and learning during the past four decades. The purpose of this paper is to provide a comprehensive, critical review of the literature on the impact of computer simulations on science teaching and learning, with the goal of summarizing what is…

  10. Investigating an Innovative Computer Application to Improve L2 Word Recognition from Speech

    ERIC Educational Resources Information Center

    Matthews, Joshua; O'Toole, John Mitchell

    2015-01-01

    The ability to recognise words from the aural modality is a critical aspect of successful second language (L2) listening comprehension. However, little research has been reported on computer-mediated development of L2 word recognition from speech in L2 learning contexts. This report describes the development of an innovative computer application…

  11. Hot spot computational identification: Application to the complex formed between the hen egg white lysozyme (HEL) and the antibody HyHEL-10

    NASA Astrophysics Data System (ADS)

    Moreira, I. S.; Fernandes, P. A.; Ramos, M. J.

    The definition and comprehension of the hot spots in an interface is a subject of primary interest for a variety of fields, including structure-based drug design. Developing a computational alanine mutagenesis approach that is at once accurate and predictive, capable of reproducing experimental mutagenesis values, is therefore a major challenge in computational biochemistry. Antibody/protein antigen complexes provide one of the best models for studying protein-protein recognition processes because they have three fundamental features: specificity, highly complementary association, and a small epitope restricted to the diminutive complementarity-determining regions (CDRs), while the remainder of the antibody is largely invariant. Thus, we apply a computational mutational methodology to the study of the antigen-antibody complex formed between the hen egg white lysozyme (HEL) and the antibody HyHEL-10. A critical evaluation is presented, focusing on the limitations and advantages of different computational methods for hot spot determination, as well as on the differences between experimental and computational approaches.

  12. Development of a computer-assisted learning software package on dental traumatology.

    PubMed

    Tolidis, K; Crawford, P; Stephens, C; Papadogiannis, Y; Plakias, C

    1998-10-01

    The development of computer-assisted learning software packages is a relatively new field of computer application. The progress made in personal computer technology toward more user-friendly operating systems has stimulated the academic community to develop computer-assisted learning for pre- and postgraduate students. The ability of computers to combine audio and visual data in an interactive form provides a powerful educational tool. The purpose of this study was to develop and evaluate a computer-assisted learning package on dental traumatology. This program contains background information on the diagnosis, classification, and management of dental injuries in both the permanent and the deciduous dentitions. It is structured into chapters according to the nature of the injury and whether injury has occurred in the primary or permanent dentition. At the end of each chapter there is a self-assessment questionnaire as well as references to relevant literature. Extensive use of pictures and video provides a comprehensive overview of the subject.

  13. NEW GIS WATERSHED ANALYSIS TOOLS FOR SOIL CHARACTERIZATION AND EROSION AND SEDIMENTATION MODELING

    EPA Science Inventory

    A comprehensive procedure for computing soil erosion and sediment delivery metrics has been developed which utilizes a suite of automated scripts and a pair of processing-intensive executable programs operating on a personal computer platform.

  14. Perceptions among Occupational and Physical Therapy Students of a Nontraditional Methodology for Teaching Laboratory Gross Anatomy

    ERIC Educational Resources Information Center

    Thomas, K. Jackson; Denham, Bryan E.; Dinolfo, John D.

    2011-01-01

    This pilot study was designed to assess the perceptions of physical therapy (PT) and occupational therapy (OT) students regarding the use of computer-assisted pedagogy and prosection-oriented communications in the laboratory component of a human anatomy course at a comprehensive health sciences university in the southeastern United States. The…

  15. The Effects of Computer-Assisted Pronunciation Readings on ESL Learners' Use of Pausing, Stress, Intonation, and Overall Comprehensibility

    ERIC Educational Resources Information Center

    Tanner, Mark W.; Landon, Melissa M.

    2009-01-01

    With research showing the benefits of pronunciation instruction aimed at suprasegmentals (Derwing, Munro, & Wiebe, 1997, 1998; Derwing & Rossiter, 2003; Hahn, 2004; McNerney and Mendelsohn, 1992), more materials are needed to provide learners opportunities for self-directed practice. A 13-week experimental study was performed with 75 ESL learners…

  16. An Investigation of Scaffolded Reading on EFL Hypertext Comprehension

    ERIC Educational Resources Information Center

    Shang, Hui-Fang

    2015-01-01

    With the rapid growth of computer technology, some printed texts are designed as hypertexts to help EFL (English as a foreign language) learners search for and process multiple resources in a timely manner for autonomous learning. The purpose of this study was to design a hypertext system and examine if a 14-week teacher-guided print-based and…

  17. Online Reading Practices of Students Who Are Deaf/Hard of Hearing

    ERIC Educational Resources Information Center

    Donne, Vicki; Rugg, Natalie

    2015-01-01

    This study sought to investigate reading perceptions, computer use perceptions, and online reading comprehension strategy use of 26 students who are deaf/hard of hearing in grades 4 through 8 attending public school districts in a tri-state area of the U.S. Students completed an online questionnaire and descriptive analysis indicated that students…

  18. Deepen the Teaching Reform of Operating System, Cultivate the Comprehensive Quality of Students

    ERIC Educational Resources Information Center

    Liu, Jianjun

    2010-01-01

    Operating system is the core course of the specialty of computer science and technology. To understand and master the operating system will directly affect students' further study on other courses. The course of operating system focuses more on theories. Its contents are more abstract and the knowledge system is more complicated. Therefore,…

  19. A Computer System to Rate the Variety of Color in Drawings

    ERIC Educational Resources Information Center

    Kim, Seong-in; Hameed, Ibrahim A.

    2009-01-01

    For mental health professionals, art assessment is a useful tool for patient evaluation and diagnosis. Consideration of various color-related elements is important in art assessment. This correlational study introduces the concept of variety of color as a new color-related element of an artwork. This term represents a comprehensive use of color,…

  20. Effectiveness of a Computer-Based Afterschool Intervention to Increase Reading Comprehension

    ERIC Educational Resources Information Center

    Yancsurak, Lonnie S.

    2013-01-01

    The No Child Left Behind legislation of 2001 requires all public school students to be proficient in math and English by 2014. The research problem that this study addressed is that the majority of schools are not on track to demonstrate this proficiency, potentially creating a nation of schools designated as "failing." This problem has…

  1. The Relation between Thematic Role Computing and Semantic Relatedness Processing during On-Line Sentence Comprehension

    PubMed Central

    Li, Xiaoqing; Zhao, Haiyan; Lu, Yong

    2014-01-01

    Sentence comprehension involves timely computing of different types of relations between a sentence's verbs and noun arguments, such as morphosyntactic, semantic, and thematic relations. Here, we used the EEG technique to investigate potential differences between thematic role computing and lexical-semantic relatedness processing during on-line sentence comprehension, and the interaction between these two types of processes. Mandarin Chinese sentences were used as materials. The basic structure of those sentences is “Noun+Verb+‘le’+a two-character word”, with the Noun being the initial argument. The verb disambiguates the initial argument as an agent or a patient. Meanwhile, the initial argument and the verb are highly or lowly semantically related. The ERPs at the verbs revealed that: relative to the agent condition, the patient condition evoked a larger N400 only when the argument and verb were lowly semantically related; however, relative to the high-relatedness condition, the low-relatedness condition elicited a larger N400 regardless of the thematic relation; although both thematic role variation and semantic relatedness variation elicited N400 effects, the effect elicited by the former was broadly distributed and reached its maximum over the frontal electrodes, whereas the effect elicited by the latter had a posterior distribution. In addition, the brain oscillation results showed that, although thematic role variation (patient vs. agent) induced power decreases around the beta frequency band (15–30 Hz), semantic relatedness variation (low-relatedness vs. high-relatedness) induced power increases in the theta frequency band (4–7 Hz). These results suggested that, in the sentence context, thematic role computing is modulated by the semantic relatedness between the verb and its argument; semantic relatedness processing, however, is to some degree independent of the thematic relations. Moreover, our results indicated that, during on-line sentence comprehension, thematic role computing and semantic relatedness processing are mediated by distinct neural systems. PMID:24755643

  2. PROFEAT Update: A Protein Features Web Server with Added Facility to Compute Network Descriptors for Studying Omics-Derived Networks.

    PubMed

    Zhang, P; Tao, L; Zeng, X; Qin, C; Chen, S Y; Zhu, F; Yang, S Y; Li, Z R; Chen, W P; Chen, Y Z

    2017-02-03

    The studies of biological, disease, and pharmacological networks are facilitated by systems-level investigations using computational tools. In particular, network descriptors developed in other disciplines have found increasing application in the study of protein, gene regulatory, metabolic, disease, and drug-targeted networks. Public web servers provide facilities for computing network descriptors, but many descriptors, including those used or useful for biological studies, are not covered. We upgraded the PROFEAT web server http://bidd2.nus.edu.sg/cgi-bin/profeat2016/main.cgi to compute up to 329 network descriptors and protein-protein interaction descriptors. PROFEAT network descriptors comprehensively describe the topological and connectivity characteristics of unweighted (uniform binding constants and molecular levels), edge-weighted (varying binding constants), node-weighted (varying molecular levels), edge-node-weighted (varying binding constants and molecular levels), and directed (oriented processes) networks. The usefulness of the network descriptors is illustrated by literature-reported studies of biological networks derived from genome, interactome, transcriptome, metabolome, and diseasome profiles. Copyright © 2016 Elsevier Ltd. All rights reserved.
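    As an illustration of what the simplest unweighted descriptors measure, the sketch below computes node and edge counts, density, mean degree, and the global clustering coefficient (transitivity) from an edge list. These are standard graph-theory formulas, not PROFEAT code; the function name is invented for this example:

```python
from itertools import combinations

def network_descriptors(edges):
    """Basic topological descriptors of an unweighted, undirected network
    given as an iterable of (u, v) edge pairs."""
    adj = {}
    for u, v in edges:
        adj.setdefault(u, set()).add(v)
        adj.setdefault(v, set()).add(u)
    n = len(adj)                                   # number of nodes
    m = sum(len(nb) for nb in adj.values()) // 2   # number of edges
    density = 2 * m / (n * (n - 1)) if n > 1 else 0.0
    # Global clustering (transitivity) = closed triples / connected triples;
    # summing closed neighbour pairs over nodes counts each triangle 3 times.
    closed, triples = 0, 0
    for node, nbrs in adj.items():
        triples += len(nbrs) * (len(nbrs) - 1) // 2
        closed += sum(1 for a, b in combinations(nbrs, 2) if b in adj[a])
    return {"nodes": n, "edges": m, "density": density,
            "mean_degree": 2 * m / n if n else 0.0,
            "clustering": closed / triples if triples else 0.0}
```

    The weighted and directed descriptor families in PROFEAT generalize these quantities by replacing adjacency membership with edge weights, node weights, or directed reachability.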

  3. Computer ethics and tertiary level education in Hong Kong

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wong, E.Y.W.; Davison, R.M.; Wade, P.W.

    1994-12-31

    This paper seeks to highlight some ethical issues relating to the increasing proliferation of Information Technology into our everyday lives. The authors explain their understanding of computer ethics, and give some reasons why the study of computer ethics is becoming increasingly pertinent. The paper looks at some of the problems that arise in attempting to develop appropriate ethical concepts in a constantly changing environment, and explores some of the ethical dilemmas arising from the increasing use of computers. Some initial research undertaken to explore the ideas and understanding of tertiary level students in Hong Kong on a number of ethical issues of interest is described, and our findings discussed. We hope that presenting this paper and eliciting subsequent discussion will enable us to draw up more comprehensive guidelines for the teaching of computer related ethics to tertiary level students, as well as reveal some directions for future research.

  4. Evolution, Nature, Uses and Issues in the Creation of Local School District Comprehensive Information Systems.

    ERIC Educational Resources Information Center

    Hathaway, Walter E.

    Efficient and convenient comprehensive information systems, long kept from coming into being by a variety of obstacles, are now made possible by the concept of distributive processing and the technology of micro- and mini-computer networks. Such systems can individualize instruction, group students efficiently, cut administrative costs, streamline…

  5. "To Gloss or Not To Gloss": An Investigation of Reading Comprehension Online.

    ERIC Educational Resources Information Center

    Lomika, Lara L.

    1998-01-01

    Investigated effects of multimedia reading software on reading comprehension. Twelve college students enrolled in a second semester French course were instructed to think aloud during reading of text on the computer screen. They read text under one of three conditions: full glossing, limited glossing, no glossing. Suggests computerized reading…

  6. Text Simplification and Comprehensible Input: A Case for an Intuitive Approach

    ERIC Educational Resources Information Center

    Crossley, Scott A.; Allen, David; McNamara, Danielle S.

    2012-01-01

    Texts are routinely simplified to make them more comprehensible for second language learners. However, the effects of simplification upon the linguistic features of texts remain largely unexplored. Here we examine the effects of one type of text simplification: intuitive text simplification. We use the computational tool, Coh-Metrix, to examine…

  7. Who Benefits from a Low versus High Guidance CSCL Script and Why?

    ERIC Educational Resources Information Center

    Mende, Stephan; Proske, Antje; Körndle, Hermann; Narciss, Susanne

    2017-01-01

    Computer-supported collaborative learning (CSCL) scripts can foster learners' deep text comprehension. However, this depends on (a) the extent to which the learning activities targeted by a script promote deep text comprehension and (b) whether the guidance level provided by the script is adequate to induce the targeted learning activities…

  8. The Research and Evaluation of Serious Games: Toward a Comprehensive Methodology

    ERIC Educational Resources Information Center

    Mayer, Igor; Bekebrede, Geertje; Harteveld, Casper; Warmelink, Harald; Zhou, Qiqi; van Ruijven, Theo; Lo, Julia; Kortmann, Rens; Wenzler, Ivo

    2014-01-01

    The authors present the methodological background to and underlying research design of an ongoing research project on the scientific evaluation of serious games and/or computer-based simulation games (SGs) for advanced learning. The main research questions are: (1) what are the requirements and design principles for a comprehensive social…

  9. The Development of Reading for Comprehension: An Information Processing Analysis. Final Report.

    ERIC Educational Resources Information Center

    Schadler, Margaret; Juola, James F.

    This report summarizes research performed at the Universtiy of Kansas that involved several topics related to reading and learning to read, including the development of automatic word recognition processes, reading for comprehension, and the development of new computer technologies designed to facilitate the reading process. The first section…

  10. Fitting Computers into the Curriculum.

    ERIC Educational Resources Information Center

    Rodgers, Robert J.; And Others

    This paper provides strategies and insights that should be weighed and perhaps included in any proposal for integrating computers into a comprehensive school curriculum. The strategies include six basic stages: Initiation, Needs Assessment, Master Plan, Logistic-Specifics, Implementation, and Evaluation. The New Brunswick (New Jersey) Public…

  11. 75 FR 70899 - Submission for OMB Review; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-11-19

    ... submit to the Office of Management and Budget (OMB) for clearance the following proposal for collection... Annual Burden Hours: 2,952. Public Computer Center Reports (Quarterly and Annually) Number of Respondents... specific to Infrastructure and Comprehensive Community Infrastructure, Public Computer Center, and...

  12. The iPlant collaborative: cyberinfrastructure for enabling data to discovery for the life sciences

    USDA-ARS?s Scientific Manuscript database

    The iPlant Collaborative provides life science research communities access to comprehensive, scalable, and cohesive computational infrastructure for data management; identity management; collaboration tools; and cloud, high-performance, high-throughput computing. iPlant provides training, learning m...

  13. Computer Analysis in HSC German

    ERIC Educational Resources Information Center

    Clutterbuck, Michael; Mowchanuk, Timothy

    1977-01-01

    In October, 1976, a new type of question was introduced into the Victorian HSC German test: A listening comprehension question with multiple-choice answers has replaced the written reproduction question. Results of a computer analysis of the answers given in the exam are reported. (SW)

  14. NASA aerodynamics program

    NASA Technical Reports Server (NTRS)

    Holmes, Bruce J.; Schairer, Edward; Hicks, Gary; Wander, Stephen; Blankson, Isiaiah; Rose, Raymond; Olson, Lawrence; Unger, George

    1990-01-01

    Presented here is a comprehensive review of the following aerodynamics elements: computational methods and applications, computational fluid dynamics (CFD) validation, transition and turbulence physics, numerical aerodynamic simulation, drag reduction, test techniques and instrumentation, configuration aerodynamics, aeroacoustics, aerothermodynamics, hypersonics, subsonic transport/commuter aviation, fighter/attack aircraft and rotorcraft.

  15. [Clinical and communication simulation workshop for fellows in gastroenterology: the trainees' perspective].

    PubMed

    Lang, Alon; Melzer, Ehud; Bar-Meir, Simon; Eliakim, Rami; Ziv, Amitai

    2006-11-01

    The continuing development of computer-based medical simulators provides an ideal platform for simulator-assisted training programs for medical trainees. Computer-based endoscopic simulators provide a virtual reality environment for training in endoscopic procedures. This study illustrates the use of a comprehensive training model combining endoscopic simulators with simulated (actor) patients (SPs). The aim was to evaluate the effectiveness of a comprehensive simulation workshop from the trainee perspective. Four case studies were developed with emphasis on communication skills. Three workshops with 10 fellows in each were conducted. During each workshop the trainees spent half of the time in SP case studies and the remaining half working with computerized endoscopic simulators with continuous guidance by an expert endoscopist. Questionnaires were completed by the fellows at the end of the workshop. Seventy percent of the fellows felt that the endoscopic simulator was close or very close to reality for gastroscopy and 63% for colonoscopy. Eighty-eight percent thought the close guidance was important for the learning process with the simulator. Eighty percent felt that the case studies were an important learning experience for risk management. Further evaluation of multi-modality simulation workshops in gastroenterologist training is needed to identify how best to incorporate this form of instruction into training for gastroenterologists.

  16. Visualizing risks in cancer communication: A systematic review of computer-supported visual aids.

    PubMed

    Stellamanns, Jan; Ruetters, Dana; Dahal, Keshav; Schillmoeller, Zita; Huebner, Jutta

    2017-08-01

    Health websites are becoming important sources for cancer information. Lay users, patients and carers seek support for critical decisions, but they are prone to common biases when quantitative information is presented. Graphical representations of risk data can facilitate comprehension, and interactive visualizations are popular. This review summarizes the evidence on computer-supported graphs that present risk data and their effects on various measures. The systematic literature search was conducted in several databases, including MEDLINE, EMBASE and CINAHL. Only studies with a controlled design were included. Relevant publications were carefully selected and critically appraised by two reviewers. Thirteen studies were included. Ten studies evaluated static graphs and three dynamic formats. Most decision scenarios were hypothetical. Static graphs could improve accuracy, comprehension, and behavioural intention. But the results were heterogeneous and inconsistent among the studies. Dynamic formats were not superior or even impaired performance compared to static formats. Static graphs show promising but inconsistent results, while research on dynamic visualizations is scarce and must be interpreted cautiously due to methodical limitations. Well-designed and context-specific static graphs can support web-based cancer risk communication in particular populations. The application of dynamic formats cannot be recommended and needs further research. Copyright © 2017 Elsevier B.V. All rights reserved.

  17. Predicting reading comprehension academic achievement in late adolescents with velo-cardio-facial (22q11.2 deletion) syndrome (VCFS): A longitudinal study

    PubMed Central

    Antshel, Kevin M.; Hier, Bridget O.; Fremont, Wanda; Faraone, Stephen V.; Kates, Wendy R.

    2015-01-01

    Background The primary objective of the current study was to examine the childhood predictors of adolescent reading comprehension in velo-cardio-facial syndrome (VCFS). Although much research has focused on mathematics skills among individuals with VCFS, no studies have examined predictors of reading comprehension. Methods 69 late adolescents with VCFS, 23 siblings of youth with VCFS and 30 community controls participated in a longitudinal research project and had repeat neuropsychological test batteries and psychiatric evaluations every 3 years. The Wechsler Individual Achievement Test – 2nd edition (WIAT-II) Reading Comprehension subtest served as our primary outcome variable. Results Consistent with previous research, children and adolescents with VCFS had mean reading comprehension scores on the WIAT-II that were approximately two standard deviations below the mean and word reading scores approximately one standard deviation below the mean. A more novel finding is that, relative to both control groups, individuals with VCFS demonstrated a longitudinal decline in reading comprehension abilities yet a slight increase in word reading abilities. In the combined control sample, WISC-III FSIQ, WIAT-II Word Reading, WISC-III Vocabulary and CVLT-C List A Trial 1 accounted for 75% of the variance in Time 3 WIAT-II Reading Comprehension scores. In the VCFS sample, WISC-III FSIQ, BASC-Teacher Aggression, CVLT-C Intrusions, Tower of London, Visual Span Backwards, WCST non-perseverative errors, WIAT-II Word Reading and WISC-III Freedom from Distractibility index accounted for 85% of the variance in Time 3 WIAT-II Reading Comprehension scores. A principal component analysis with promax rotation computed on the statistically significant Time 1 predictor variables in the VCFS sample resulted in three factors: Word Reading Decoding/Interference Control, Self-Control/Self-Monitoring and Working Memory.
Conclusions Childhood predictors of late adolescent reading comprehension in VCFS differ in some meaningful ways from predictors in the non-VCFS population. These results offer some guidance for how best to consider intervention efforts to improve reading comprehension in the VCFS population. PMID:24861691

  18. Assessment of comprehensive HIV/AIDS knowledge level among in-school adolescents in eastern Ethiopia.

    PubMed

    Oljira, Lemessa; Berhane, Yemane; Worku, Alemayehu

    2013-03-20

    In Ethiopia, more adolescents are in school today than ever before; however, there are no studies that have assessed their comprehensive knowledge of HIV/AIDS. Thus, this study tried to assess the level of this knowledge and the factors associated with it among in-school adolescents in eastern Ethiopia. A cross-sectional school-based study was conducted using a facilitator-guided self-administered questionnaire. The respondents were students attending regular school in 14 high schools located in 14 different districts in eastern Ethiopia. The proportion of in-school adolescents with comprehensive HIV/AIDS knowledge was computed and compared by sex. The factors that were associated with comprehensive HIV/AIDS knowledge were assessed using bivariate and multivariable logistic regression. Only about one in four, 677 (24.5%), in-school adolescents had comprehensive HIV/AIDS knowledge. The knowledge was better among in-school adolescents from families with a relatively middle or high wealth index (adjusted OR [95% CI]=1.39 [1.03-1.87] and 1.75 [1.24-2.48], respectively), who got HIV/AIDS information mainly from friends or mass media (adjusted OR [95% CI]=1.63 [1.17-2.27] and 1.55 [1.14-2.11], respectively) and who received education on HIV/AIDS and sexual matters at school (adjusted OR [95% CI]=1.59 [1.22-2.08]). The females were less likely to have comprehensive HIV/AIDS knowledge compared to males (adjusted OR [95% CI]=0.60 [0.49-0.75]). In general, only about a quarter of in-school adolescents had comprehensive HIV/AIDS knowledge. Although the female adolescents are highly vulnerable to HIV infection and its effects, they were by far less likely to have comprehensive HIV/AIDS knowledge. HIV/AIDS information, education and communication activities need to be intensified in high schools.
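
    The adjusted odds ratios in the record above come from multivariable logistic regression; the simpler unadjusted case can be sketched directly from a 2×2 table. The counts below are hypothetical (chosen only so that the female-vs-male OR lands near the reported 0.60), and the 95% CI uses the standard log-odds-ratio approximation:

```python
import math

# Hypothetical 2x2 table (illustrative counts only):
# rows = sex, columns = comprehensive HIV/AIDS knowledge (yes / no).
male_yes, male_no = 400, 900
female_yes, female_no = 277, 1186

# Unadjusted odds ratio for females relative to males.
odds_female = female_yes / female_no
odds_male = male_yes / male_no
odds_ratio = odds_female / odds_male

# 95% CI on the log scale: log(OR) +/- 1.96 * SE,
# with SE = sqrt(1/a + 1/b + 1/c + 1/d).
se = math.sqrt(1 / male_yes + 1 / male_no + 1 / female_yes + 1 / female_no)
ci_low = math.exp(math.log(odds_ratio) - 1.96 * se)
ci_high = math.exp(math.log(odds_ratio) + 1.96 * se)
print(f"OR = {odds_ratio:.2f}, 95% CI [{ci_low:.2f}, {ci_high:.2f}]")
```

    An adjusted OR additionally conditions on covariates (wealth index, information source, school-based education) by exponentiating the corresponding coefficient of a fitted multivariable logistic model, which the unadjusted calculation above does not do.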

  19. Computing Support for Basic Research in Perception and Cognition

    DTIC Science & Technology

    1988-12-07

    hearing aids and cochlear implants, this suggests that certain types of proposed coding schemes, specifically those employing periodicity tuning in...developing a computer model of the interaction of declarative and procedural knowledge in skill acquisition. In the Visual Psychophysics Laboratory... Psycholinguistics Laboratory a computer model of text comprehension and recall has been constructed and several experiments have been completed that verify basic

  20. Objective and Item Banking Computer Software and Its Use in Comprehensive Achievement Monitoring.

    ERIC Educational Resources Information Center

    Schriber, Peter E.; Gorth, William P.

    The current emphasis on objectives and test item banks for constructing more effective tests is being augmented by increasingly sophisticated computer software. Items can be catalogued in numerous ways for retrieval. The items as well as instructional objectives can be stored and test forms can be selected and printed by the computer. It is also…

  1. Simulation Modeling of Lakes in Undergraduate and Graduate Classrooms Increases Comprehension of Climate Change Concepts and Experience with Computational Tools

    ERIC Educational Resources Information Center

    Carey, Cayelan C.; Gougis, Rebekka Darner

    2017-01-01

    Ecosystem modeling is a critically important tool for environmental scientists, yet is rarely taught in undergraduate and graduate classrooms. To address this gap, we developed a teaching module that exposes students to a suite of modeling skills and tools (including computer programming, numerical simulation modeling, and distributed computing)…

  2. A review of randomized controlled trials comparing the effectiveness of hand held computers with paper methods for data collection

    PubMed Central

    Lane, Shannon J; Heddle, Nancy M; Arnold, Emmy; Walker, Irwin

    2006-01-01

    Background Handheld computers are increasingly favoured over paper and pencil methods to capture data in clinical research. Methods This study systematically identified and reviewed randomized controlled trials (RCTs) that compared the two methods for self-recording and reporting data, and where at least one of the following outcomes was assessed: data accuracy; timeliness of data capture; and adherence to protocols for data collection. Results A comprehensive key word search of NLM Gateway's database yielded 9 studies fitting the criteria for inclusion. Data extraction was performed and checked by two of the authors. None of the studies included all outcomes. Overall, the results favor handheld computers over paper and pencil for data collection among study participants, but the data are not uniform for the different outcomes. Handheld computers appear superior in timeliness of receipt and data handling (four of four studies) and are preferred by most subjects (three of four studies). On the other hand, only one of the trials adequately compared adherence to instructions for recording and submission of data (handheld computers were superior), and comparisons of accuracy were inconsistent between five studies. Conclusion Handhelds are an effective alternative to paper and pencil modes of data collection; they are faster and were preferred by most users. PMID:16737535

  3. On Mediation in Virtual Learning Environments.

    ERIC Educational Resources Information Center

    Davies, Larry; Hassan, W. Shukry

    2001-01-01

    Discusses concepts of mediation and focuses on the importance of implementing comprehensive virtual learning environments. Topics include education and technology as they relate to cultural change, social institutions, the Internet and computer-mediated communication, software design and human-computer interaction, the use of MOOs, and language.…

  4. Comprehensive Thematic T-Matrix Reference Database: A 2015-2017 Update

    NASA Technical Reports Server (NTRS)

    Mishchenko, Michael I.; Zakharova, Nadezhda; Khlebtsov, Nikolai G.; Videen, Gorden; Wriedt, Thomas

    2017-01-01

    The T-matrix method pioneered by Peter C. Waterman is one of the most versatile and efficient numerically exact computer solvers of the time-harmonic macroscopic Maxwell equations. It is widely used for the computation of electromagnetic scattering by single and composite particles, discrete random media, periodic structures (including metamaterials), and particles in the vicinity of plane or rough interfaces separating media with different refractive indices. This paper is the eighth update to the comprehensive thematic database of peer-reviewed T-matrix publications initiated in 2004 and lists relevant publications that have appeared since 2015. It also references a small number of earlier publications overlooked previously.

  5. Comprehensive thematic T-matrix reference database: A 2015-2017 update

    NASA Astrophysics Data System (ADS)

    Mishchenko, Michael I.; Zakharova, Nadezhda T.; Khlebtsov, Nikolai G.; Videen, Gorden; Wriedt, Thomas

    2017-11-01

    The T-matrix method pioneered by Peter C. Waterman is one of the most versatile and efficient numerically exact computer solvers of the time-harmonic macroscopic Maxwell equations. It is widely used for the computation of electromagnetic scattering by single and composite particles, discrete random media, periodic structures (including metamaterials), and particles in the vicinity of plane or rough interfaces separating media with different refractive indices. This paper is the eighth update to the comprehensive thematic database of peer-reviewed T-matrix publications initiated in 2004 and lists relevant publications that have appeared since 2015. It also references a small number of earlier publications overlooked previously.

  6. Developments in REDES: The rocket engine design expert system

    NASA Technical Reports Server (NTRS)

    Davidian, Kenneth O.

    1990-01-01

    The Rocket Engine Design Expert System (REDES) is being developed at NASA-Lewis to collect, automate, and perpetuate the existing expertise of performing a comprehensive rocket engine analysis and design. Currently, REDES uses the rigorous JANNAF methodology to analyze the performance of the thrust chamber and perform computational studies of liquid rocket engine problems. The following computer codes were included in REDES: a gas properties program named GASP, a nozzle design program named RAO, a regenerative cooling channel performance evaluation code named RTE, and the JANNAF standard liquid rocket engine performance prediction code TDK (including performance evaluation modules ODE, ODK, TDE, TDK, and BLM). Computational analyses are being conducted by REDES to provide solutions to liquid rocket engine thrust chamber problems. REDES is built in the Knowledge Engineering Environment (KEE) expert system shell and runs on a Sun 4/110 computer.

  7. Developments in REDES: The Rocket Engine Design Expert System

    NASA Technical Reports Server (NTRS)

    Davidian, Kenneth O.

    1990-01-01

    The Rocket Engine Design Expert System (REDES) was developed at NASA-Lewis to collect, automate, and perpetuate the existing expertise of performing a comprehensive rocket engine analysis and design. Currently, REDES uses the rigorous JANNAF methodology to analyze the performance of the thrust chamber and perform computational studies of liquid rocket engine problems. The following computer codes were included in REDES: a gas properties program named GASP; a nozzle design program named RAO; a regenerative cooling channel performance evaluation code named RTE; and the JANNAF standard liquid rocket engine performance prediction code TDK (including performance evaluation modules ODE, ODK, TDE, TDK, and BLM). Computational analyses are being conducted by REDES to provide solutions to liquid rocket engine thrust chamber problems. REDES was built in the Knowledge Engineering Environment (KEE) expert system shell and runs on a Sun 4/110 computer.

  8. Ca + HF: The anatomy of a chemical insertion reaction

    NASA Technical Reports Server (NTRS)

    Jaffe, R. L.; Pattengill, M. D.; Mascarello, F. G.; Zare, R. N.

    1987-01-01

    A comprehensive first-principles theoretical investigation of the gas phase reaction Ca + HF → CaF + H is reported. Ab initio potential energy calculations are first discussed, along with characteristics of the computed potential energy surface. Next, the fitting of the computed potential energy points to a suitable analytical functional form is described, and maps of the fitted potential surface are displayed. The methodology and results of a classical trajectory calculation utilizing the fitted potential surface are presented. Finally, the significance of the trajectory study results is discussed, and generalizations concerning dynamical aspects of Ca + HF scattering are drawn.

  9. Towards Co-Engineering Communicating Autonomous Cyber-Physical Systems

    NASA Technical Reports Server (NTRS)

    Bujorianu, Marius C.; Bujorianu, Manuela L.

    2009-01-01

    In this paper, we sketch a framework for interdisciplinary modeling of space systems by proposing a holistic view. We consider different system dimensions and their interaction. Specifically, we study the interactions between computation, physics, communication, uncertainty and autonomy. The most comprehensive computational paradigm that supports a holistic perspective on autonomous space systems is given by cyber-physical systems. For these, the state of the art consists of collaborating multi-engineering efforts that call for an adequate formal foundation. To achieve this, we propose leveraging the traditional content of formal modeling through a co-engineering process.

  10. Comprehensive Modeling and Visualization of Cardiac Anatomy and Physiology from CT Imaging and Computer Simulations

    PubMed Central

    Sun, Peng; Zhou, Haoyin; Ha, Seongmin; Hartaigh, Bríain ó; Truong, Quynh A.; Min, James K.

    2016-01-01

    In clinical cardiology, both anatomy and physiology are needed to diagnose cardiac pathologies. CT imaging and computer simulations provide valuable and complementary data for this purpose. However, it remains challenging to gain useful information from the large amount of high-dimensional diverse data. The current tools are not adequately integrated to visualize anatomic and physiologic data from a complete yet focused perspective. We introduce a new computer-aided diagnosis framework, which allows for comprehensive modeling and visualization of cardiac anatomy and physiology from CT imaging data and computer simulations, with a primary focus on ischemic heart disease. The following visual information is presented: (1) Anatomy from CT imaging: geometric modeling and visualization of cardiac anatomy, including four heart chambers, left and right ventricular outflow tracts, and coronary arteries; (2) Function from CT imaging: motion modeling, strain calculation, and visualization of four heart chambers; (3) Physiology from CT imaging: quantification and visualization of myocardial perfusion and contextual integration with coronary artery anatomy; (4) Physiology from computer simulation: computation and visualization of hemodynamics (e.g., coronary blood velocity, pressure, shear stress, and fluid forces on the vessel wall). Substantial feedback from cardiologists has confirmed the practical utility of integrating these features for the purpose of computer-aided diagnosis of ischemic heart disease. PMID:26863663

  11. Identification of metabolic pathways using pathfinding approaches: a systematic review.

    PubMed

    Abd Algfoor, Zeyad; Shahrizal Sunar, Mohd; Abdullah, Afnizanfaizal; Kolivand, Hoshang

    2017-03-01

    Metabolic pathways have become increasingly available for various microorganisms. Such pathways have spurred the development of a wide array of computational tools, in particular, mathematical pathfinding approaches. This article can facilitate the understanding of computational analysis of metabolic pathways in genomics. Moreover, stoichiometric and pathfinding approaches in metabolic pathway analysis are discussed. Three major types of studies are elaborated: stoichiometric identification models, pathway-based graph analysis and pathfinding approaches in cellular metabolism. Furthermore, evaluation of the outcomes of the pathways with mathematical benchmarking metrics is provided. This review should lead to a better comprehension of metabolic behavior in living cells in terms of computational pathfinding approaches. © The Author 2016. Published by Oxford University Press. All rights reserved. For permissions, please email: journals.permissions@oup.com.

  12. A Fine-Tuned Look at White Space Variation in Desktop Publishing.

    ERIC Educational Resources Information Center

    Knupfer, Nancy Nelson; McIsaac, Marina Stock

    This investigation of the use of white space in print-based, computer-generated text focused on the point at which the white space interferes with reading speed and comprehension. It was hypothesized that reading speed and comprehension would be significantly greater when text was wrapped tightly around the graphic than when it had one-half inch…

  13. Eleventh NASTRAN User's Colloquium

    NASA Technical Reports Server (NTRS)

    1983-01-01

    NASTRAN (NASA STRUCTURAL ANALYSIS) is a large, comprehensive, nonproprietary, general purpose finite element computer code for structural analysis which was developed under NASA sponsorship. The Eleventh Colloquium provides some comprehensive general papers on the application of finite element methods in engineering, comparisons with other approaches, unique applications, pre- and post-processing or auxiliary programs, and new methods of analysis with NASTRAN.

  14. Reading Linear Texts on Paper versus Computer Screen: Effects on Reading Comprehension

    ERIC Educational Resources Information Center

    Mangen, Anne; Walgermo, Bente R.; Bronnick, Kolbjorn

    2013-01-01

    Objective: To explore effects of the technological interface on reading comprehension in a Norwegian school context. Participants: 72 tenth graders from two different primary schools in Norway. Method: The students were randomized into two groups, where the first group read two texts (1400-2000 words) in print, and the other group read the same…

  15. CALL--Enhanced L2 Listening Skills--Aiming for Automatization in a Multimedia Environment

    ERIC Educational Resources Information Center

    Mayor, Maria Jesus Blasco

    2009-01-01

    Computer Assisted Language Learning (CALL) and L2 listening comprehension skill training are bound together for good. A neglected macroskill for decades, developing listening comprehension skill is now considered crucial for L2 acquisition. Thus this paper makes an attempt to offer latest information on processing theories and L2 listening…

  16. Children's Media Comprehension: The Relationship between Media Platform, Executive Functioning Abilities, and Age

    ERIC Educational Resources Information Center

    Menkes, Susan M.

    2012-01-01

    Children's media comprehension was compared for material presented on television, computer, or touchscreen tablet. One hundred and thirty-two children were equally distributed across 12 groups defined by age (4- or 6-years-olds), gender, and the three media platforms. Executive functioning as measured by attentional control, cognitive…

  17. A CALL-Based Lesson Plan for Teaching Reading Comprehension to Iranian Intermediate EFL Learners

    ERIC Educational Resources Information Center

    Khoshsima, Hooshang; Khosravani, Mahboobeh

    2014-01-01

    The main purpose of this descriptive research is to provide a CALL (Computer-Assisted Language Learning)-based lesson plan for teaching reading comprehension to Iranian intermediate EFL learners. CALL is a new way of learning and teaching language. It is proved that CALL mainly has positive effects on educational contexts. Although teachers…

  18. Adaptive and Maladaptive Strategy Use in Computer-Assisted Language Learning Activities for Listening Comprehension

    ERIC Educational Resources Information Center

    McBride, Cara

    2008-01-01

    College students of English as a foreign language (EFL) in Chile participated in an online mini course designed to improve their listening comprehension. There were four experimental conditions: A) one in which participants listened to fast dialogues; B) one in which participants listened to slow dialogues; C) one in which participants were given…

  19. Computational studies of solid-state alkali conduction in rechargeable alkali-ion batteries

    DOE PAGES

    Deng, Zhi; Mo, Yifei; Ong, Shyue Ping

    2016-03-25

    The facile conduction of alkali ions in a crystal host is of crucial importance in rechargeable alkali-ion batteries, the dominant form of energy storage today. In this review, we provide a comprehensive survey of computational approaches to study solid-state alkali diffusion. We demonstrate how these methods have provided useful insights into the design of materials that form the main components of a rechargeable alkali-ion battery, namely the electrodes, superionic conductor solid electrolytes and interfaces. We will also provide a perspective on future challenges and directions. Here, the scope of this review includes the monovalent lithium- and sodium-ion chemistries that are currently of the most commercial interest.

  20. Additional extensions to the NASCAP computer code, volume 1

    NASA Technical Reports Server (NTRS)

    Mandell, M. J.; Katz, I.; Stannard, P. R.

    1981-01-01

    Extensions and revisions to a computer code that comprehensively analyzes problems of spacecraft charging (NASCAP) are documented. Using a fully three dimensional approach, it can accurately predict spacecraft potentials under a variety of conditions. Among the extensions are a multiple electron/ion gun test tank capability, and the ability to model anisotropic and time dependent space environments. Also documented are a greatly extended MATCHG program and the preliminary version of NASCAP/LEO. The interactive MATCHG code was developed into an extremely powerful tool for the study of material-environment interactions. The NASCAP/LEO, a three dimensional code to study current collection under conditions of high voltages and short Debye lengths, was distributed for preliminary testing.

  1. The Emerging Worldwide Electronic University: Information Age Global Higher Education. Contributions to the Study of Education, Number 57.

    ERIC Educational Resources Information Center

    Rossman, Parker

    This book proposes an agenda of questions for people who plan for and seek to give some direction to the expanding use of computer conferencing and television for international courses. The book is the first comprehensive effort to gather together and summarize scattered research reports and to report experiments and demonstrations that suggest…

  2. Error Monitoring in Speech Production: A Computational Test of the Perceptual Loop Theory.

    ERIC Educational Resources Information Center

    Hartsuiker, Robert J.; Kolk, Herman H. J.

    2001-01-01

    Tested whether an elaborated version of the perceptual loop theory (W. Levelt, 1983) and the main interruption rule was consistent with existing time course data (E. Blackmer and E. Mitton, 1991; C. Oomen and A. Postma, in press). The study suggests that including an inner loop through the speech comprehension system generates predictions that fit…

  3. Teachers' Perceptions of Differentiated Learning for At-Risk Second-Grade Students in Reading

    ERIC Educational Resources Information Center

    Sabb-Cordes, Morelisa L.

    2016-01-01

    Students were performing below grade level in reading, fluency, and comprehension in a suburban school in South Carolina. The purpose of this study was to explore the perceptions of teachers about their preferred differentiated instruction approach (face-to-face vs. computer-based) to meet the needs of at-risk students in 2nd grade. The underlying…

  4. L2 Learners' Engagement with High Stakes Listening Tests: Does Technology Have a Beneficial Role to Play?

    ERIC Educational Resources Information Center

    East, Martin; King, Chris

    2012-01-01

    In the listening component of the IELTS examination candidates hear the input once, delivered at "normal" speed. This format for listening can be problematic for test takers who often perceive normal speed input to be too fast for effective comprehension. The study reported here investigated whether using computer software to slow down…

  5. Tracking real-time neural activation of conceptual knowledge using single-trial event-related potentials.

    PubMed

    Amsel, Ben D

    2011-04-01

    Empirically derived semantic feature norms categorized into different types of knowledge (e.g., visual, functional, auditory) can be summed to create number-of-feature counts per knowledge type. Initial evidence suggests several such knowledge types may be recruited during language comprehension. The present study provides a more detailed understanding of the time course and intensity of influence of several such knowledge types on real-time neural activity. A linear mixed-effects model was applied to single-trial event-related potentials for 207 visually presented concrete words measured on total number of features (semantic richness), imageability, and number of visual motion, color, visual form, smell, taste, sound, and function features. Significant influences of multiple feature types occurred before 200 ms, suggesting parallel neural computation of word form and conceptual knowledge during language comprehension. Function and visual motion features most prominently influenced neural activity, underscoring the importance of action-related knowledge in computing word meaning. The dynamic time courses and topographies of these effects are most consistent with a flexible conceptual system wherein temporally dynamic recruitment of representations in modal and supramodal cortex is a crucial element of the constellation of processes constituting word meaning computation in the brain. Copyright © 2011 Elsevier Ltd. All rights reserved.

  6. Design Principles for a Comprehensive Library System.

    ERIC Educational Resources Information Center

    Uluakar, Tamer; And Others

    1981-01-01

    Describes an online design featuring circulation control, catalog access, and serial holdings that uses an incremental approach to system development. Utilizing a dedicated computer, this second of three releases pays particular attention to present and predicted computing capabilities as well as trends in library automation. (Author/RAA)

  7. Computer-Assisted Foreign Language Teaching and Learning: Technological Advances

    ERIC Educational Resources Information Center

    Zou, Bin; Xing, Minjie; Wang, Yuping; Sun, Mingyu; Xiang, Catherine H.

    2013-01-01

    Computer-Assisted Foreign Language Teaching and Learning: Technological Advances highlights new research and an original framework that brings together foreign language teaching, experiments and testing practices that utilize the most recent and widely used e-learning resources. This comprehensive collection of research will offer linguistic…

  8. Microcomputers in the Curriculum: Micros and the First R.

    ERIC Educational Resources Information Center

    Balajthy, Ernest; Reinking, David

    1985-01-01

    Introduces the range of computer software currently available to aid in developing children's basic skills in reading, including programs for reading readiness, word recognition, vocabulary development, reading comprehension, and learning motivation. Additional information on software and computer use is provided in sidebars by Gwen Solomon and…

  9. Continuous Trailing-Edge Flaps for Primary Flight Control of a Helicopter Main Rotor

    NASA Technical Reports Server (NTRS)

    Thornburgh, Robert P.; Kreshock, Andrew R.; Wilbur, Matthew L.; Sekula, Martin K.; Shen, Jinwei

    2014-01-01

    The use of continuous trailing-edge flaps (CTEFs) for primary flight control of a helicopter main rotor is studied. A practical, optimized bimorph design with Macro-Fiber Composite actuators is developed for CTEF control, and a coupled structures and computational fluid dynamics methodology is used to study the fundamental behavior of an airfoil with CTEFs. These results are used within a comprehensive rotorcraft analysis model to study the control authority requirements of the CTEFs when utilized for primary flight control of a utility class helicopter. A study of the effect of blade root pitch index (RPI) on CTEF control authority is conducted, and the impact of structural and aerodynamic model complexity on the comprehensive analysis results is presented. The results show that primary flight control using CTEFs is promising; however, a more viable option may include the control of blade RPI, as well.

  10. A Computational Cluster for Multiscale Simulations of Ionic Liquids

    DTIC Science & Technology

    2008-09-16

    DURIP: A Computational Cluster for Multiscale Simulations of Ionic Liquids. Grant Number FA955007-1-0512. ...The focus of this project was to acquire and use computer cluster nodes... Comprehensive Final Report: Gregory A. Voth, PI. Contract/Grant Title: DURIP: A Computational Cluster for Multiscale Simulations of Ionic Liquids

  11. A Comprehensive Theory of Algorithms for Wireless Networks and Mobile Systems

    DTIC Science & Technology

    2016-06-08

    David Peleg. Nonuniform SINR+Voronoi Diagrams are Effectively Uniform. In Yoram Moses, editor, Distributed Computing: 29th International Symposium...in Computer Science, page 559. Springer, 2014. [16] Erez Kantor, Zvi Lotker, Merav Parter, and David Peleg. Nonuniform SINR+Voronoi diagrams are...Merav Parter, and David Peleg. Nonuniform SINR+Voronoi diagrams are effectively uniform. In Yoram Moses, editor, Distributed Computing - 29th

  12. STATLIB: NSWC Library of Statistical Programs and Subroutines

    DTIC Science & Technology

    1989-08-01

    Uncorrelated Weighted Polynomial Regression 41. WEPORC Correlated Weighted Polynomial Regression 45. MROP Multiple Regression Using Orthogonal Polynomials...could not and should not be converted to the new general purpose computer (the current CDC 995) [NSWC TR 89-97]. Some were designed to compute...personal computers. They are referred to as SPSSPC+, BMDPC, and SASPC and in general are less comprehensive than their mainframe counterparts. The basic

  13. Development of New Instructional Uses of the Computer in an Inner-City Comprehensive High School. Final Report.

    ERIC Educational Resources Information Center

    Lichten, William

    A three-part program investigated the use of computers at an inner-city high school. An attempt was made to introduce a digital computer for instructional purposes at the high school. A single portable teletype terminal and a simple programing language, BASIC, were used. It was found that a wide variety of students could benefit from this…

  14. Scientific Visualization: The Modern Oscilloscope for "Seeing the Unseeable" (LBNL Summer Lecture Series)

    ScienceCinema

    Bethel, E. Wes [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). Computational Research Division and Scientific Visualization Group

    2018-05-07

    Summer Lecture Series 2008: Scientific visualization transforms abstract data into readily comprehensible images, provides a vehicle for "seeing the unseeable," and plays a central role in both experimental and computational sciences. Wes Bethel, who heads the Scientific Visualization Group in the Computational Research Division, presents an overview of visualization and computer graphics, current research challenges, and future directions for the field.

  15. Evaluating the Effectiveness of Game-Based Training: A Controlled Study with Dismounted Infantry Teams

    DTIC Science & Technology

    2013-01-01

    Prior to commencing at DSTO she worked as a lecturer in Psychology and Human Factors at the University of Queensland and Queensland University of Technology... commissioned by Army and conducted by scientists from the DSTO's Land Operations Division working on the Training and Preparedness (ARM 07/163) task... a comprehensive summary of published work in this area. In this review, studies examining the effectiveness of computer-based instruction methods...

  16. 3.0Tesla magnetic resonance angiography (MRA) for comprehensive renal evaluation of living renal donors: pilot study with computerized tomography angiography (CTA) comparison.

    PubMed

    Gulati, Mittul; Dermendjian, Harout; Gómez, Ana M; Tan, Nelly; Margolis, Daniel J; Lu, David S; Gritsch, H Albin; Raman, Steven S

    2016-01-01

    Most living related donor (LRD) kidneys are harvested laparoscopically. Renal vascular anatomy helps determine donor suitability for laparoscopic nephrectomy. Computed tomography angiography (CTA) is the current gold standard for preoperative imaging; magnetic resonance angiography (MRA) offers advantages including lack of ionizing radiation and lower incidence of contrast reactions. We evaluated 3.0T MRA for assessing renal anatomy of LRDs. Thirty consecutive LRDs underwent CTA followed by 3.0T MRA. Data points included number and branching of vessels, incidental findings, and urothelial opacification. Studies were individually evaluated by three readers blinded to patient data. Studies were reevaluated in consensus with discrepancies revealed, and final consensus results were labeled "truth". Compared with consensus "truth", both computed tomography (CT) and magnetic resonance imaging were highly accurate for assessment of arterial and venous anatomy, although CT was superior for detection of late venous confluence as well as detection of renal stones. Both modalities were comparable in opacification of lower ureters and bladder; MRA underperformed CTA for opacification of upper urinary tracts. 3.0T MRA enabled excellent detection of comprehensive renal anatomy compared to CTA in LRDs. Copyright © 2016 Elsevier Inc. All rights reserved.

  17. Predicting a Drug's Membrane Permeability: Evolution of a Computational Model Validated with in Vitro Permeability Assay Data

    DOE PAGES

    Carpenter, Timothy S.; McNerney, M. Windy; Be, Nicholas A.; ...

    2016-02-16

    Membrane permeability is a key property to consider in drug design, especially when the drugs in question need to cross the blood-brain barrier (BBB). A comprehensive in vivo assessment of the BBB permeability of a drug takes considerable time and financial resources. A current, simplified in vitro model to investigate drug permeability is a Parallel Artificial Membrane Permeability Assay (PAMPA) that generally provides higher throughput and initial quantification of a drug's passive permeability. Computational methods can also be used to predict drug permeability. Our methods are highly advantageous as they do not require the synthesis of the desired drug, and can be implemented rapidly using high-performance computing. In this study, we have used umbrella sampling Molecular Dynamics (MD) methods to assess the passive permeability of a range of compounds through a lipid bilayer. Furthermore, the permeability of these compounds was comprehensively quantified using the PAMPA assay to calibrate and validate the MD methodology. After demonstrating a firm correlation between the two approaches, we implemented our MD method to quantitatively predict the most permeable potential drug from a series of potential scaffolds. This permeability was then confirmed by the in vitro PAMPA methodology. Therefore, in this work we have illustrated the potential that these computational methods hold as useful tools to help predict a drug's permeability in a faster and more cost-effective manner. Release number: LLNL-ABS-677757.
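
    Once umbrella sampling yields a potential of mean force G(z) across the bilayer, a passive permeability estimate is commonly obtained from the inhomogeneous solubility-diffusion relation 1/P = ∫ exp(G(z)/kT)/D(z) dz. The following is a minimal sketch of that integration with synthetic profiles; the function name and the barrier/diffusivity values are illustrative assumptions, not data from the study.

```python
import numpy as np

# Inhomogeneous solubility-diffusion estimate of passive permeability from a
# PMF G(z) (in kT) and a local diffusivity D(z). Profiles below are synthetic
# illustrations, not results from the study.
def effective_permeability(z, pmf_kt, diff):
    """1/P = integral of exp(G(z)/kT) / D(z) dz across the membrane."""
    f = np.exp(pmf_kt) / diff
    resistance = np.sum(0.5 * (f[1:] + f[:-1]) * np.diff(z))  # trapezoid rule
    return 1.0 / resistance

z = np.linspace(-2.0, 2.0, 401)            # position across the bilayer, nm
pmf = 5.0 * np.exp(-z**2 / (2 * 0.5**2))   # 5 kT Gaussian barrier at center
D = np.full_like(z, 1.0)                   # uniform diffusivity, nm^2/ns

p = effective_permeability(z, pmf, D)      # nm/ns; a higher barrier lowers P
```

    The exponential weighting is why the barrier region of the PMF dominates the predicted permeability, and why converged sampling there matters most.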

  18. A Computational Clonal Analysis of the Developing Mouse Limb Bud

    PubMed Central

    Marcon, Luciano; Arqués, Carlos G.; Torres, Miguel S.; Sharpe, James

    2011-01-01

    A comprehensive spatio-temporal description of the tissue movements underlying organogenesis would be an extremely useful resource to developmental biology. Clonal analysis and fate mappings are popular experiments to study tissue movement during morphogenesis. Such experiments allow cell populations to be labeled at an early stage of development and to follow their spatial evolution over time. However, disentangling the cumulative effects of the multiple events responsible for the expansion of the labeled cell population is not always straightforward. To overcome this problem, we develop a novel computational method that combines accurate quantification of 2D limb bud morphologies and growth modeling to analyze mouse clonal data of early limb development. Firstly, we explore various tissue movements that match experimental limb bud shape changes. Secondly, by comparing computational clones with newly generated mouse clonal data we are able to choose and characterize the tissue movement map that better matches experimental data. Our computational analysis produces for the first time a two-dimensional model of limb growth based on experimental data that can be used to better characterize limb tissue movement in space and time. The model shows that the distribution and shapes of clones can be described as a combination of anisotropic growth with isotropic cell mixing, without the need for lineage compartmentalization along the AP and PD axes. Lastly, we show that this comprehensive description can be used to reassess spatio-temporal gene regulations taking tissue movement into account and to investigate PD patterning hypotheses. PMID:21347315

  19. Predicting a Drug's Membrane Permeability: Evolution of a Computational Model Validated with in Vitro Permeability Assay Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Carpenter, Timothy S.; McNerney, M. Windy; Be, Nicholas A.

    Membrane permeability is a key property to consider in drug design, especially when the drugs in question need to cross the blood-brain barrier (BBB). A comprehensive in vivo assessment of the BBB permeability of a drug takes considerable time and financial resources. A current, simplified in vitro model to investigate drug permeability is a Parallel Artificial Membrane Permeability Assay (PAMPA) that generally provides higher throughput and initial quantification of a drug's passive permeability. Computational methods can also be used to predict drug permeability. Our methods are highly advantageous as they do not require the synthesis of the desired drug, and can be implemented rapidly using high-performance computing. In this study, we have used umbrella sampling Molecular Dynamics (MD) methods to assess the passive permeability of a range of compounds through a lipid bilayer. Furthermore, the permeability of these compounds was comprehensively quantified using the PAMPA assay to calibrate and validate the MD methodology. After demonstrating a firm correlation between the two approaches, we implemented our MD method to quantitatively predict the most permeable potential drug from a series of potential scaffolds. This permeability was then confirmed by the in vitro PAMPA methodology. Therefore, in this work we have illustrated the potential that these computational methods hold as useful tools to help predict a drug's permeability in a faster and more cost-effective manner. Release number: LLNL-ABS-677757.

  20. Tracking the Continuity of Language Comprehension: Computer Mouse Trajectories Suggest Parallel Syntactic Processing

    ERIC Educational Resources Information Center

    Farmer, Thomas A.; Cargill, Sarah A.; Hindy, Nicholas C.; Dale, Rick; Spivey, Michael J.

    2007-01-01

    Although several theories of online syntactic processing assume the parallel activation of multiple syntactic representations, evidence supporting simultaneous activation has been inconclusive. Here, the continuous and non-ballistic properties of computer mouse movements are exploited, by recording their streaming x, y coordinates to procure…

  1. Solar Heating and Cooling for a Controls Manufacturing Plant Lumberton, New Jersey

    NASA Technical Reports Server (NTRS)

    1982-01-01

    This comprehensive report documents a solar heating and cooling system with separate solar-collector and cooling-tower areas located away from the building; the system is completely computer controlled. A system description, test data, major problems and their resolution, performance, operation and maintenance, manufacturer's literature, and drawings comprise the 257-page report.

  2. Cognitive Architectures and Human-Computer Interaction. Introduction to Special Issue.

    ERIC Educational Resources Information Center

    Gray, Wayne D.; Young, Richard M.; Kirschenbaum, Susan S.

    1997-01-01

    In this introduction to a special issue on cognitive architectures and human-computer interaction (HCI), editors and contributors provide a brief overview of cognitive architectures. The following four architectures represented by articles in this issue are: Soar; LICAI (linked model of comprehension-based action planning and instruction taking);…

  3. First Steps in Computational Systems Biology: A Practical Session in Metabolic Modeling and Simulation

    ERIC Educational Resources Information Center

    Reyes-Palomares, Armando; Sanchez-Jimenez, Francisca; Medina, Miguel Angel

    2009-01-01

    A comprehensive understanding of biological functions requires new systemic perspectives, such as those provided by systems biology. Systems biology approaches are hypothesis-driven and involve iterative rounds of model building, prediction, experimentation, model refinement, and development. Developments in computer science are allowing for ever…

  4. Macromod: Computer Simulation For Introductory Economics

    ERIC Educational Resources Information Center

    Ross, Thomas

    1977-01-01

    The Macroeconomic model (Macromod) is a computer assisted instruction simulation model designed for introductory economics courses. An evaluation of its utilization at a community college indicates that it yielded a 10 percent to 13 percent greater economic comprehension than lecture classes and that it met with high student approval. (DC)

  5. Assessing Computer Literacy: A Validated Instrument and Empirical Results.

    ERIC Educational Resources Information Center

    Gabriel, Roy M.

    1985-01-01

    Describes development of a comprehensive computer literacy assessment battery for K-12 curriculum based on objectives of a curriculum implemented in the Worldwide Department of Defense Dependents Schools system. Test development and field test data are discussed and a correlational analysis which assists in interpretation of test results is…

  6. Problem-Solving in the Pre-Clinical Curriculum: The Uses of Computer Simulations.

    ERIC Educational Resources Information Center

    Michael, Joel A.; Rovick, Allen A.

    1986-01-01

    Promotes the use of computer-based simulations in the pre-clinical medical curriculum as a means of providing students with opportunities for problem solving. Describes simple simulations of skeletal muscle loads, complex simulations of major organ systems and comprehensive simulation models of the entire human body. (TW)

  7. Students' Motivation towards Computer Use in EFL Learning

    ERIC Educational Resources Information Center

    Genc, Gulten; Aydin, Selami

    2010-01-01

    It has been widely recognized that language instruction integrating technology has become popular and has had a tremendous impact on the language learning process: learners are expected to be more motivated in a web-based computer-assisted language learning program and to improve their comprehensive language ability. Thus, the present paper…

  8. The Use of Help Options in Multimedia Listening Environments to Aid Language Learning: A Review

    ERIC Educational Resources Information Center

    Mohsen, Mohammed Ali

    2016-01-01

    This paper provides a comprehensive review on the use of help options (HOs) in the multimedia listening context to aid listening comprehension (LC) and improve incidental vocabulary learning. The paper also aims to synthesize the research findings obtained from the use of HOs in Computer-Assisted Language Learning (CALL) literature and reveals the…

  9. Notetaking Strategies and Their Relationship to Performance on Listening Comprehension and Communicative Assessment Tasks. TOEFL Monograph Series. MS-35. ETS RR-07-01

    ERIC Educational Resources Information Center

    Carrell, Patricia L.

    2007-01-01

    Utilizing a pre- and posttest research design, with an instructional intervention of good practices in notetaking, the notes taken by examinees during a computer-based listening comprehension test prior to and following the instructional intervention were examined for particular notetaking strategies. Questionnaires probed perceptions of the…

  10. Wall Shear Stress Distribution in a Patient-Specific Cerebral Aneurysm Model using Reduced Order Modeling

    NASA Astrophysics Data System (ADS)

    Han, Suyue; Chang, Gary Han; Schirmer, Clemens; Modarres-Sadeghi, Yahya

    2016-11-01

    We construct a reduced-order model (ROM) to study the Wall Shear Stress (WSS) distributions in image-based patient-specific aneurysms models. The magnitude of WSS has been shown to be a critical factor in growth and rupture of human aneurysms. We start the process by running a training case using Computational Fluid Dynamics (CFD) simulation with time-varying flow parameters, such that these parameters cover the range of parameters of interest. The method of snapshot Proper Orthogonal Decomposition (POD) is utilized to construct the reduced-order bases using the training CFD simulation. The resulting ROM enables us to study the flow patterns and the WSS distributions over a range of system parameters computationally very efficiently with a relatively small number of modes. This enables comprehensive analysis of the model system across a range of physiological conditions without the need to re-compute the simulation for small changes in the system parameters.
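
    The snapshot-POD step described above reduces, in essence, to an SVD of mean-subtracted training snapshots followed by truncation to the dominant modes. A minimal sketch on synthetic low-rank data (in the study, the snapshot matrix would come from the training CFD run; the 99% energy threshold here is an illustrative choice):

```python
import numpy as np

# Snapshot POD sketch: build a reduced basis from training snapshots.
# Synthetic low-rank "flow" data stand in for the CFD snapshots here.
z = np.linspace(0.0, 1.0, 2000)            # spatial sample points
t = np.linspace(0.0, 2.0 * np.pi, 50)      # snapshot times
X = (np.outer(np.sin(2 * np.pi * z), np.cos(t))
     + 0.3 * np.outer(np.sin(4 * np.pi * z), np.cos(2 * t))
     + 0.1 * np.outer(np.sin(6 * np.pi * z), np.cos(3 * t)))

X_mean = X.mean(axis=1, keepdims=True)
U, s, _ = np.linalg.svd(X - X_mean, full_matrices=False)

# Keep just enough modes to capture 99% of the snapshot energy
energy = np.cumsum(s**2) / np.sum(s**2)
r = int(np.searchsorted(energy, 0.99) + 1)
basis = U[:, :r]

# A field is then represented by r modal coefficients instead of 2000 values
coeffs = basis.T @ (X[:, [0]] - X_mean)
x_rec = X_mean + basis @ coeffs
```

    With the basis fixed, sweeping system parameters only requires solving for a handful of modal coefficients rather than re-running the full CFD simulation, which is what makes the WSS analysis across physiological conditions cheap.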

  11. Business aspects of cardiovascular computed tomography: tackling the challenges.

    PubMed

    Bateman, Timothy M

    2008-01-01

    The purpose of this article is to provide a comprehensive understanding of the business issues surrounding provision of dedicated cardiovascular computed tomographic imaging. Some of the challenges include high up-front costs, current low utilization relative to scanner capability, and inadequate payments. Cardiovascular computed tomographic imaging is a valuable clinical modality that should be offered by cardiovascular centers-of-excellence. With careful consideration of the business aspects, moderate-to-large size cardiology programs should be able to implement an economically viable cardiovascular computed tomographic service.

  12. Rotation, Reflection, and Frame Changes; Orthogonal tensors in computational engineering mechanics

    NASA Astrophysics Data System (ADS)

    Brannon, R. M.

    2018-04-01

    Whilst a vast literature is available for the most common rotation-related tasks such as coordinate changes, most reference books tend to cover one or two methods, and resources for less-common tasks are scarce. Specialized research applications can be found in disparate journal articles, but a self-contained comprehensive review that covers both elementary and advanced concepts in a manner comprehensible to engineers is rare. Rotation, Reflection, and Frame Changes surveys a refreshingly broad range of rotation-related research that is routinely needed in engineering practice. By illustrating key concepts in computer source code, this book stands out as an unusually accessible guide for engineers and scientists in engineering mechanics.

  13. Dose tracking and dose auditing in a comprehensive computed tomography dose-reduction program.

    PubMed

    Duong, Phuong-Anh; Little, Brent P

    2014-08-01

    Implementation of a comprehensive computed tomography (CT) radiation dose-reduction program is a complex undertaking, requiring an assessment of baseline doses, an understanding of dose-saving techniques, and an ongoing appraisal of results. We describe the role of dose tracking in planning and executing a dose-reduction program and discuss the use of the American College of Radiology CT Dose Index Registry at our institution. We review the basics of dose-related CT scan parameters, the components of the dose report, and the dose-reduction techniques, showing how an understanding of each technique is important in effective auditing of "outlier" doses identified by dose tracking. Copyright © 2014 Elsevier Inc. All rights reserved.

  14. Intersections between the autism spectrum and the internet: perceived benefits and preferred functions of computer-mediated communication.

    PubMed

    Gillespie-Lynch, Kristen; Kapp, Steven K; Shane-Simpson, Christina; Smith, David Shane; Hutman, Ted

    2014-12-01

    An online survey compared the perceived benefits and preferred functions of computer-mediated communication of participants with (N = 291) and without ASD (N = 311). Participants with autism spectrum disorder (ASD) perceived benefits of computer-mediated communication in terms of increased comprehension and control over communication, access to similar others, and the opportunity to express their true selves. They enjoyed using the Internet to meet others more, and to maintain connections with friends and family less, than did participants without ASD. People with ASD enjoyed aspects of computer-mediated communication that may be associated with special interests or advocacy, such as blogging, more than did participants without ASD. This study suggests that people with ASD may use the Internet in qualitatively different ways from those without ASD. Suggestions for interventions are discussed.

  15. The Implementation of Blended Learning Using Android-Based Tutorial Video in Computer Programming Course II

    NASA Astrophysics Data System (ADS)

    Huda, C.; Hudha, M. N.; Ain, N.; Nandiyanto, A. B. D.; Abdullah, A. G.; Widiaty, I.

    2018-01-01

    The computer programming course is theoretical. Sufficient practice is necessary to facilitate conceptual understanding and encourage creativity in designing computer programs/animations. The development of a tutorial video within Android-based blended learning is needed to guide students. Using Android-based instructional material, students can learn independently anywhere and anytime. The tutorial video can facilitate students’ understanding of the concepts, materials, and procedures of programming/animation making in detail. This study employed a Research and Development method adapting Thiagarajan’s 4D model. The developed Android-based instructional material and tutorial video were validated by experts in instructional media and experts in physics education. The expert validation results showed that the Android-based material was comprehensive and very feasible. The tutorial video was deemed feasible, receiving an average score of 92.9%. It was also revealed that students’ conceptual understanding, skills, and creativity in designing computer programs/animations improved significantly.

  16. The neural basis of reversible sentence comprehension: Evidence from voxel-based lesion-symptom mapping in aphasia

    PubMed Central

    Thothathiri, Malathi; Kimberg, Daniel Y.; Schwartz, Myrna F.

    2012-01-01

    We explored the neural basis of reversible sentence comprehension in a large group of aphasic patients (N=79). Voxel-based lesion-symptom mapping revealed a significant association between damage in temporoparietal cortex and impaired sentence comprehension. This association remained after we controlled for phonological working memory. We hypothesize that this region plays an important role in the thematic or what-where processing of sentences. In contrast, we detected weak or no association between reversible sentence comprehension and the ventrolateral prefrontal cortex, which includes Broca’s area, even for syntactically complex sentences. This casts doubt on theories that presuppose a critical role for this region in syntactic computations. PMID:21861679
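
    At its core, voxel-based lesion-symptom mapping is a mass-univariate comparison: at each voxel, the behavioral scores of patients whose lesions include that voxel are compared against those whose lesions spare it. A toy sketch with simulated data (not the study's; the function name, effect size, and lesion probability are illustrative, and a real analysis would also correct for multiple comparisons):

```python
import numpy as np

# Toy voxel-based lesion-symptom mapping: a Welch t statistic per voxel
# comparing scores of lesioned vs. spared patients. Data are simulated,
# not the study's.
rng = np.random.default_rng(1)
n_patients, n_voxels = 79, 500
lesions = rng.random((n_patients, n_voxels)) < 0.3       # binary lesion maps

# Scores drop by 3 SD when the simulated "critical" voxel 0 is damaged
scores = rng.normal(10.0, 1.0, n_patients) - 3.0 * lesions[:, 0]

def vlsm_t(lesions, scores):
    tvals = np.full(lesions.shape[1], np.nan)
    for v in range(lesions.shape[1]):
        a, b = scores[lesions[:, v]], scores[~lesions[:, v]]
        if len(a) < 2 or len(b) < 2:
            continue                                     # voxel untestable
        se = np.sqrt(a.var(ddof=1) / len(a) + b.var(ddof=1) / len(b))
        tvals[v] = (a.mean() - b.mean()) / se
    return tvals

tvals = vlsm_t(lesions, scores)
critical = int(np.nanargmin(tvals))  # most negative t marks the deficit voxel
```

    Voxels lesioned in too few patients are left untested, which is one reason such analyses need large samples like the N=79 group above.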

  17. Methods for Functional Connectivity Analyses

    DTIC Science & Technology

    2012-12-13

    ...motor, or hand motor function (green, red, or blue shading, respectively). Thus, this work produced the first comprehensive analysis of ECoG... Department of Electrical and Computer Engineering, University of Texas at El Paso, TX, USA; Department of Neurology, Albany Medical College, Albany, NY, USA; Department of Health, Albany, NY, USA...

  18. Comprehensive joint feedback control for standing by functional neuromuscular stimulation-a simulation study.

    PubMed

    Nataraj, Raviraj; Audu, Musa L; Kirsch, Robert F; Triolo, Ronald J

    2010-12-01

    Previous investigations of feedback control of standing after spinal cord injury (SCI) using functional neuromuscular stimulation (FNS) have primarily targeted individual joints. This study assesses the potential efficacy of comprehensive (trunk, hips, knees, and ankles) joint feedback control against postural disturbances using a bipedal, 3-D computer model of SCI stance. Proportional-derivative feedback drove an artificial neural network trained to produce muscle excitation patterns consistent with maximal joint stiffness values achievable about neutral stance given typical SCI muscle properties. Feedback gains were optimized to minimize upper extremity (UE) loading required to stabilize against disturbances. Compared to the baseline case of maximum constant muscle excitations used clinically, the controller reduced UE loading by 55% in resisting external force perturbations and by 84% during simulated one-arm functional tasks. Performance was most sensitive to inaccurate measurements of ankle plantar/dorsiflexion position and hip ab/adduction velocity feedback. In conclusion, comprehensive joint feedback demonstrates potential to markedly improve FNS standing function. However, alternative control structures capable of effective performance with fewer sensor-based feedback parameters may better facilitate clinical usage.
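
    The proportional-derivative feedback at the heart of such a controller can be illustrated on a single toy joint. All gains, plant constants, and the clipping of excitation to [0, 1] below are illustrative assumptions, not values from the study (which drives a trained neural network over multiple joints rather than a single plant directly):

```python
import numpy as np

# Single-joint PD feedback sketch: joint-angle error drives a muscle
# excitation command in [0, 1]. All constants are illustrative only.
def pd_excitation(theta, omega, theta_ref, kp=8.0, kd=1.5):
    u = kp * (theta_ref - theta) - kd * omega
    return float(np.clip(u, 0.0, 1.0))

# Toy second-order joint: I*theta'' = tau_max*u - b*omega - k*theta
I, b, k, tau_max = 1.0, 0.8, 2.0, 10.0
theta, omega, theta_ref, dt = 0.3, 0.0, 0.0, 0.001  # perturbed from neutral
for _ in range(10000):                              # 10 s of simulated time
    u = pd_excitation(theta, omega, theta_ref)
    alpha = (tau_max * u - b * omega - k * theta) / I
    omega += alpha * dt
    theta += omega * dt
# theta should have settled back near the neutral setpoint
```

    The one-sided clipping mirrors the physiology: a stimulated muscle can only pull, so restoring torque in the other direction must come from an antagonist channel (omitted here for brevity).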

  19. Comprehensive Joint Feedback Control for Standing by Functional Neuromuscular Stimulation – a Simulation Study

    PubMed Central

    Nataraj, Raviraj; Audu, Musa L.; Kirsch, Robert F.; Triolo, Ronald J.

    2013-01-01

    Previous investigations of feedback control of standing after spinal cord injury (SCI) using functional neuromuscular stimulation (FNS) have primarily targeted individual joints. This study assesses the potential efficacy of comprehensive (trunk, hips, knees, and ankles) joint-feedback control against postural disturbances using a bipedal, three-dimensional computer model of SCI stance. Proportional-derivative feedback drove an artificial neural network trained to produce muscle excitation patterns consistent with maximal joint stiffness values achievable about neutral stance given typical SCI muscle properties. Feedback gains were optimized to minimize upper extremity (UE) loading required to stabilize against disturbances. Compared to the baseline case of maximum constant muscle excitations used clinically, the controller reduced UE loading by 55% in resisting external force perturbations and by 84% during simulated one-arm functional tasks. Performance was most sensitive to inaccurate measurements of ankle plantar/dorsiflexion position and hip ab/adduction velocity feedback. In conclusion, comprehensive joint-feedback demonstrates potential to markedly improve FNS standing function. However, alternative control structures capable of effective performance with fewer sensor-based feedback parameters may better facilitate clinical usage. PMID:20923741

  20. Thumb-loops up for catalysis: a structure/function investigation of a functional loop movement in a GH11 xylanase

    PubMed Central

    Paës, Gabriel; Cortés, Juan; Siméon, Thierry; O'Donohue, Michael J.; Tran, Vinh

    2012-01-01

    Dynamics is a key feature of enzyme catalysis. Unfortunately, current experimental and computational techniques do not yet provide a comprehensive understanding and description of functional macromolecular motions. In this work, we have extended a novel computational technique, which combines molecular modeling methods and robotics algorithms, to investigate functional motions of protein loops. This new approach has been applied to study the functional importance of the so-called thumb-loop in the glycoside hydrolase family 11 xylanase from Thermobacillus xylanilyticus (Tx-xyl). The results obtained provide new insight into the role of the loop in the glycosylation/deglycosylation catalytic cycle, and underline the key importance of the nature of the residue located at the tip of the thumb-loop. The effect of mutations predicted in silico has been validated by in vitro site-directed mutagenesis experiments. Overall, we propose a comprehensive model of Tx-xyl catalysis in terms of substrate and product dynamics by identifying the action of the thumb-loop motion during catalysis. PMID:24688637

  1. Students' inductive reasoning skills and the relevance of prior knowledge: an exploratory study with a computer-based training course on the topic of acne vulgaris.

    PubMed

    Horn-Ritzinger, Sabine; Bernhardt, Johannes; Horn, Michael; Smolle, Josef

    2011-04-01

    The importance of inductive instruction in medical education is increasingly growing. Little is known about the relevance of prior knowledge regarding students' inductive reasoning abilities. The purpose is to evaluate this inductive teaching method as a means of fostering higher levels of learning and to explore how individual differences in prior knowledge (high [HPK] vs. low [LPK]) contribute to students' inductive reasoning skills. Twenty-six LPK and 18 HPK students could train twice with an interactive computer-based training object to discover the underlying concept before doing the final comprehension check. Students had a median of 76.9% of correct answers in the first, 90.9% in the second training, and answered 92% of the final assessment questions correctly. More important, 86% of all students succeeded with inductive learning, among them 83% of the HPK students and 89% of the LPK students. Prior knowledge did not predict performance on overall comprehension. This inductive instructional strategy fostered students' deep approaches to learning in a time-effective way.

  2. Quantum chemical calculations of interatomic potentials for computer simulation of solids

    NASA Technical Reports Server (NTRS)

    1977-01-01

    A comprehensive mathematical model by which the collective behavior of a very large number of atoms within a metal or alloy can accurately be simulated was developed. Work was done in order to predict and modify the strength of materials to suit our technological needs. The method developed is useful in studying atomic interactions related to dislocation motion and crack extension.

  3. Using Video Interaction Guidance to Develop Intrapersonal and Interpersonal Skills in Professional Training for Educational Psychologists

    ERIC Educational Resources Information Center

    Hayes, Ben; Dewey, Jessica; Sancho, Michelle

    2014-01-01

    In this study we assessed the effects of paragraph length on the reading speed and comprehension of students. Students were randomly assigned to one of three groups: short paragraph length (SPL), medium paragraph length (MPL), or long paragraph length (LPL). Students read a 1423 word text on a computer screen formatted to align with their group…

  4. Kids & Media @ the New Millennium: A Kaiser Family Foundation Report. A Comprehensive National Analysis of Children's Media Use. Executive Summary.

    ERIC Educational Resources Information Center

    Roberts, Donald F.

    A study examined media use patterns among a large, nationally representative sample of children ages 2-18 and explored how children choose and interact with the whole array of media available to them, including television, movies, computers, music, video games, radio, magazines, books, and newspapers. The goal was to provide a solid base…

  5. "Listen and Understand What I Am Saying": Church-Listening as a Challenge for Non-Native Listeners of English in the United Kingdom

    ERIC Educational Resources Information Center

    Malmström, Hans

    2015-01-01

    This article uses computer-assisted analysis to study the listening environment provided by Bible readings and preaching during church services. It focuses on the vocabulary size needed to comprehend 95% and 98% of the running words of the input (lexical coverage levels indicating comprehension in connection with listening) and on the place of…

  6. A New Mathematical Framework for Design Under Uncertainty

    DTIC Science & Technology

    2016-05-05

    ...blending multiple information sources via auto-regressive stochastic modeling. A computationally efficient machine learning framework is developed based on...sion and machine learning approaches; see Fig. 1. This will lead to a comprehensive description of system performance with less uncertainty than in the... Bayesian optimization of super-cavitating hydrofoils: The goal of this study is to demonstrate the capabilities of statistical learning and...

  7. Very low intravenous contrast volume protocol for computed tomography angiography providing comprehensive cardiac and vascular assessment prior to transcatheter aortic valve replacement in patients with chronic kidney disease.

    PubMed

    Pulerwitz, Todd C; Khalique, Omar K; Nazif, Tamim N; Rozenshtein, Anna; Pearson, Gregory D N; Hahn, Rebecca T; Vahl, Torsten P; Kodali, Susheel K; George, Isaac; Leon, Martin B; D'Souza, Belinda; Po, Ming Jack; Einstein, Andrew J

    2016-01-01

    Transcatheter aortic valve replacement (TAVR) is a lifesaving procedure for many patients at high risk for surgical aortic valve replacement. The prevalence of chronic kidney disease (CKD) is high in this population, and thus a very low contrast volume (VLCV) computed tomography angiography (CTA) protocol providing comprehensive cardiac and vascular imaging would be valuable. 52 patients with severe, symptomatic aortic valve disease undergoing pre-TAVR CTA assessment from 2013-2014 at Columbia University Medical Center were studied, including all 26 patients with CKD (eGFR<30 mL/min) who underwent a novel VLCV protocol (20 mL of iohexol at 2.5 mL/s), and 26 standard-contrast-volume (SCV) protocol patients. Using a 320-slice volumetric scanner, the protocol included ECG-gated volume scanning of the aortic root followed by medium-pitch helical vascular scanning through the femoral arteries. Two experienced cardiologists performed aortic annulus and root measurements. Vascular image quality was assessed by two radiologists using a 4-point scale. VLCV patients had mean (±SD) age 86 ± 6.5 and BMI 23.9 ± 3.4 kg/m(2), with 54% men; SCV patients had age 83 ± 8.8 and BMI 28.7 ± 5.3 kg/m(2), with 65% men. There was excellent intra- and inter-observer agreement for annular and root measurements, and excellent agreement with 3D-transesophageal echocardiographic measurements. Both radiologists found diagnostic-quality vascular imaging in 96% of VLCV and 100% of SCV cases, with excellent inter-observer agreement. This study is the first of its kind to report the feasibility and reproducibility of measurements for a VLCV protocol for comprehensive pre-TAVR CTA. There was excellent agreement of cardiac measurements, and almost all studies were of diagnostic quality for vascular access assessment. Copyright © 2016 Society of Cardiovascular Computed Tomography. Published by Elsevier Inc. All rights reserved.

  8. Signalling maps in cancer research: construction and data analysis

    PubMed Central

    Kondratova, Maria; Sompairac, Nicolas; Barillot, Emmanuel; Zinovyev, Andrei

    2018-01-01

Generation and usage of high-quality molecular signalling network maps can be augmented by standardizing notations, establishing curation workflows and applying computational biology methods to exploit the knowledge contained in the maps. In this manuscript, we summarize the major aims and challenges of assembling information in the form of comprehensive maps of molecular interactions. Mainly, we share the experience we gained while creating the Atlas of Cancer Signalling Network. In a step-by-step procedure, we describe the map construction process and suggest solutions for map complexity management by introducing a hierarchical modular map structure. In addition, we describe the NaviCell platform, a computational technology that uses the Google Maps API to explore comprehensive molecular maps in much the same way as geographical maps, and explain the advantages of semantic zooming principles for map navigation. We also outline how to prepare signalling network maps for navigation using the NaviCell platform. Finally, several examples of cancer high-throughput data analysis and visualization in the context of comprehensive signalling maps are presented. PMID:29688383

  9. Direct coal liquefaction baseline design and system analysis. Quarterly report, January--March 1991

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1991-04-01

The primary objective of the study is to develop a computer model for a baseline direct coal liquefaction design based on two-stage direct-coupled catalytic reactors. This primary objective is to be accomplished by completing the following: a baseline design based on previous DOE/PETC results from the Wilsonville pilot plant and other engineering evaluations; a cost estimate and economic analysis; a computer model incorporating the above two steps over a wide range of capacities and selected process alternatives; a comprehensive training program for DOE/PETC staff to understand and use the computer model; thorough documentation of all underlying assumptions for baseline economics; and a user manual and training material which will facilitate updating of the model in the future.

  10. Direct coal liquefaction baseline design and system analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1991-04-01

The primary objective of the study is to develop a computer model for a baseline direct coal liquefaction design based on two-stage direct-coupled catalytic reactors. This primary objective is to be accomplished by completing the following: a baseline design based on previous DOE/PETC results from the Wilsonville pilot plant and other engineering evaluations; a cost estimate and economic analysis; a computer model incorporating the above two steps over a wide range of capacities and selected process alternatives; a comprehensive training program for DOE/PETC staff to understand and use the computer model; thorough documentation of all underlying assumptions for baseline economics; and a user manual and training material which will facilitate updating of the model in the future.

  11. Radiation dose management for pediatric cardiac computed tomography: a report from the Image Gently 'Have-A-Heart' campaign.

    PubMed

    Rigsby, Cynthia K; McKenney, Sarah E; Hill, Kevin D; Chelliah, Anjali; Einstein, Andrew J; Han, B Kelly; Robinson, Joshua D; Sammet, Christina L; Slesnick, Timothy C; Frush, Donald P

    2018-01-01

    Children with congenital or acquired heart disease can be exposed to relatively high lifetime cumulative doses of ionizing radiation from necessary medical imaging procedures including radiography, fluoroscopic procedures including diagnostic and interventional cardiac catheterizations, electrophysiology examinations, cardiac computed tomography (CT) studies, and nuclear cardiology examinations. Despite the clinical necessity of these imaging studies, the related ionizing radiation exposure could pose an increased lifetime attributable cancer risk. The Image Gently "Have-A-Heart" campaign is promoting the appropriate use of medical imaging studies in children with congenital or acquired heart disease while minimizing radiation exposure. The focus of this manuscript is to provide a comprehensive review of radiation dose management and CT performance in children with congenital or acquired heart disease.

  12. Prevalence of neck pain and headaches: impact of computer use and other associative factors.

    PubMed

    Smith, L; Louw, Q; Crous, L; Grimmer-Somers, K

    2009-02-01

Headaches and neck pain are reported to be among the most prevalent musculoskeletal complaints in the general population. A significant body of research has reported a high prevalence of headaches and neck pain among adolescents. Sitting for lengthy periods in fixed postures, such as at computer terminals, may result in adolescent neck pain and headaches. The aim of this paper was to report the association between computer use (exposure) and headaches and neck pain (outcome) among adolescent school students in a developing country. A cross-sectional study was conducted, and a comprehensive data collection instrument was used to collect data from 1073 high-school students. Headaches were associated with high psychosocial scores and were more common among girls. We found a concerning association between neck pain and high hours of computing for school students, and have confirmed the need to educate new computer users (school students) about appropriate ergonomics and postural health.

  13. Generation of anatomically realistic numerical phantoms for photoacoustic and ultrasonic breast imaging

    NASA Astrophysics Data System (ADS)

    Lou, Yang; Zhou, Weimin; Matthews, Thomas P.; Appleton, Catherine M.; Anastasio, Mark A.

    2017-04-01

Photoacoustic computed tomography (PACT) and ultrasound computed tomography (USCT) are emerging modalities for breast imaging. As in all emerging imaging technologies, computer-simulation studies play a critically important role in developing and optimizing the designs of hardware and image reconstruction methods for PACT and USCT. Using computer simulations, the parameters of an imaging system can be systematically and comprehensively explored in a way that is generally not possible through experimentation. When conducting such studies, numerical phantoms are employed to represent the physical properties of the patient or object to be imaged that influence the measured image data. It is highly desirable to utilize numerical phantoms that are realistic, especially when task-based measures of image quality are to be utilized to guide system design. However, most reported computer-simulation studies of PACT and USCT breast imaging employ simple numerical phantoms that oversimplify the complex anatomical structures in the human female breast. We develop and implement a methodology for generating anatomically realistic numerical breast phantoms from clinical contrast-enhanced magnetic resonance imaging data. The phantoms depict vascular structures and the volumetric distribution of different tissue types in the breast. By assigning optical and acoustic parameters to different tissue structures, both optical and acoustic breast phantoms are established for use in PACT and USCT studies.
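The key step of assigning physical parameters to segmented tissue structures can be sketched as a label-to-property lookup. This is a minimal illustration, not the authors' pipeline; the tissue labels and the property values (optical absorption, speed of sound) are invented for demonstration.

```python
# Minimal sketch: turn a labeled (segmented) volume into optical and acoustic
# numerical phantoms by mapping each tissue label to a physical property value.
import numpy as np

# Hypothetical segmentation labels: 0 = background, 1 = fat, 2 = gland, 3 = vessel
segmentation = np.zeros((4, 4, 4), dtype=np.uint8)
segmentation[1:3, 1:3, 1:3] = 1     # block of fatty tissue
segmentation[2, 2, 2] = 3           # one vessel voxel

# Hypothetical per-tissue properties: optical absorption mu_a [1/cm] and
# speed of sound c [m/s]; a real study would take these from measurements.
properties = {
    0: {"mu_a": 0.0,  "c": 1480.0},
    1: {"mu_a": 0.05, "c": 1450.0},
    2: {"mu_a": 0.10, "c": 1510.0},
    3: {"mu_a": 2.00, "c": 1580.0},
}

def build_phantom(seg, key):
    """Map every voxel's tissue label to the requested physical property."""
    lut = np.array([properties[label][key] for label in sorted(properties)])
    return lut[seg]  # fancy indexing applies the lookup table voxel-wise

optical_phantom = build_phantom(segmentation, "mu_a")   # for PACT simulations
acoustic_phantom = build_phantom(segmentation, "c")     # for USCT simulations
```

The same segmentation thus yields co-registered optical and acoustic phantoms, which is what allows PACT and USCT simulations to share one anatomical model.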

  14. Archetype-Based Modeling of Persona for Comprehensive Personality Computing from Personal Big Data.

    PubMed

    Guo, Ao; Ma, Jianhua

    2018-02-25

Personality, a model describing the wide variety of human behaviours, is becoming increasingly popular among researchers due to the widespread availability of personal big data generated by prevalent digital devices, e.g., smartphones and wearables. Such an approach can be used to model an individual and even digitally clone a person, e.g., a Cyber-I (cyber individual). This work aims to establish a unique and comprehensive description of an individual that meshes with various personalized services and applications. An extensive research literature exists on or related to psychological modelling, i.e., automatic personality computing. However, the integrity and accuracy of the results from current automatic personality computing are insufficient for the elaborate modeling required by Cyber-I, owing to an insufficient number of data sources. To reach a comprehensive psychological description of a person, it is critical to bring in heterogeneous data sources that can provide ample personal data, e.g., physiological data and Internet data. In addition, instead of calculating personality traits from personal data directly, an approach to a personality model derived from the theories of Carl Gustav Jung is used to measure a human subject's persona. Therefore, this research focuses on designing an archetype-based model of persona covering an individual's facets in different situations to approach a comprehensive personality model. Using personal big data to measure a specific persona in a certain scenario, our research is designed to ensure the accuracy and integrity of the generated personality model.

  15. Archetype-Based Modeling of Persona for Comprehensive Personality Computing from Personal Big Data

    PubMed Central

Guo, Ao; Ma, Jianhua

    2018-01-01

Personality, a model describing the wide variety of human behaviours, is becoming increasingly popular among researchers due to the widespread availability of personal big data generated by prevalent digital devices, e.g., smartphones and wearables. Such an approach can be used to model an individual and even digitally clone a person, e.g., a Cyber-I (cyber individual). This work aims to establish a unique and comprehensive description of an individual that meshes with various personalized services and applications. An extensive research literature exists on or related to psychological modelling, i.e., automatic personality computing. However, the integrity and accuracy of the results from current automatic personality computing are insufficient for the elaborate modeling required by Cyber-I, owing to an insufficient number of data sources. To reach a comprehensive psychological description of a person, it is critical to bring in heterogeneous data sources that can provide ample personal data, e.g., physiological data and Internet data. In addition, instead of calculating personality traits from personal data directly, an approach to a personality model derived from the theories of Carl Gustav Jung is used to measure a human subject's persona. Therefore, this research focuses on designing an archetype-based model of persona covering an individual's facets in different situations to approach a comprehensive personality model. Using personal big data to measure a specific persona in a certain scenario, our research is designed to ensure the accuracy and integrity of the generated personality model. PMID:29495343

  16. Tissue classification for laparoscopic image understanding based on multispectral texture analysis

    NASA Astrophysics Data System (ADS)

    Zhang, Yan; Wirkert, Sebastian J.; Iszatt, Justin; Kenngott, Hannes; Wagner, Martin; Mayer, Benjamin; Stock, Christian; Clancy, Neil T.; Elson, Daniel S.; Maier-Hein, Lena

    2016-03-01

    Intra-operative tissue classification is one of the prerequisites for providing context-aware visualization in computer-assisted minimally invasive surgeries. As many anatomical structures are difficult to differentiate in conventional RGB medical images, we propose a classification method based on multispectral image patches. In a comprehensive ex vivo study we show (1) that multispectral imaging data is superior to RGB data for organ tissue classification when used in conjunction with widely applied feature descriptors and (2) that combining the tissue texture with the reflectance spectrum improves the classification performance. Multispectral tissue analysis could thus evolve as a key enabling technique in computer-assisted laparoscopy.
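The abstract's central finding, that combining tissue texture with the reflectance spectrum improves classification, can be illustrated with a toy feature pipeline. This is a hedged sketch only: the band values, patch sizes and the nearest-centroid classifier are invented stand-ins, not the descriptors or classifier used in the study.

```python
# Sketch: concatenate a per-band reflectance spectrum with a crude per-band
# texture statistic, then classify patches by nearest class centroid.
import numpy as np

def features(patch):
    """Per-band mean reflectance (spectrum) plus per-band std (crude texture)."""
    spectrum = patch.mean(axis=(0, 1))   # one mean per spectral band
    texture = patch.std(axis=(0, 1))     # one dispersion value per band
    return np.concatenate([spectrum, texture])

def nearest_centroid(train, labels, query):
    """Assign the class whose mean feature vector is closest to the query."""
    classes = sorted(set(labels))
    centroids = {c: np.mean([f for f, l in zip(train, labels) if l == c], axis=0)
                 for c in classes}
    return min(classes, key=lambda c: np.linalg.norm(query - centroids[c]))

rng = np.random.default_rng(0)
# Synthetic 8x8 patches with 5 spectral bands: a "liver"-like class that is
# darker and smoother than a "bowel"-like class (purely illustrative values).
liver = [0.2 + 0.01 * rng.standard_normal((8, 8, 5)) for _ in range(10)]
bowel = [0.6 + 0.05 * rng.standard_normal((8, 8, 5)) for _ in range(10)]
train = [features(p) for p in liver + bowel]
labels = ["liver"] * 10 + ["bowel"] * 10

query = features(0.21 + 0.01 * rng.standard_normal((8, 8, 5)))
predicted = nearest_centroid(train, labels, query)
```

With real multispectral data the feature vector would use proper texture descriptors, but the structure, spectrum concatenated with texture, is the same.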

  17. Damage to ventral and dorsal language pathways in acute aphasia

    PubMed Central

    Hartwigsen, Gesa; Kellmeyer, Philipp; Glauche, Volkmar; Mader, Irina; Klöppel, Stefan; Suchan, Julia; Karnath, Hans-Otto; Weiller, Cornelius; Saur, Dorothee

    2013-01-01

    Converging evidence from neuroimaging studies and computational modelling suggests an organization of language in a dual dorsal–ventral brain network: a dorsal stream connects temporoparietal with frontal premotor regions through the superior longitudinal and arcuate fasciculus and integrates sensorimotor processing, e.g. in repetition of speech. A ventral stream connects temporal and prefrontal regions via the extreme capsule and mediates meaning, e.g. in auditory comprehension. The aim of our study was to test, in a large sample of 100 aphasic stroke patients, how well acute impairments of repetition and comprehension correlate with lesions of either the dorsal or ventral stream. We combined voxelwise lesion-behaviour mapping with the dorsal and ventral white matter fibre tracts determined by probabilistic fibre tracking in our previous study in healthy subjects. We found that repetition impairments were mainly associated with lesions located in the posterior temporoparietal region with a statistical lesion maximum in the periventricular white matter in projection of the dorsal superior longitudinal and arcuate fasciculus. In contrast, lesions associated with comprehension deficits were found more ventral-anterior in the temporoprefrontal region with a statistical lesion maximum between the insular cortex and the putamen in projection of the ventral extreme capsule. Individual lesion overlap with the dorsal fibre tract showed a significant negative correlation with repetition performance, whereas lesion overlap with the ventral fibre tract revealed a significant negative correlation with comprehension performance. To summarize, our results from patients with acute stroke lesions support the claim that language is organized along two segregated dorsal–ventral streams. 
Particularly, this is the first lesion study demonstrating that task performance on auditory comprehension measures requires an interaction between temporal and prefrontal brain regions via the ventral extreme capsule pathway. PMID:23378217

  18. The Further Development of CSIEC Project Driven by Application and Evaluation in English Education

    ERIC Educational Resources Information Center

    Jia, Jiyou; Chen, Weichao

    2009-01-01

    In this paper, we present the comprehensive version of CSIEC (Computer Simulation in Educational Communication), an interactive web-based human-computer dialogue system with natural language for English instruction, and its tentative application and evaluation in English education. First, we briefly introduce the motivation for this project,…

  19. Using Robotics to Improve Retention and Increase Comprehension in Introductory Programming Courses

    ERIC Educational Resources Information Center

    Pullan, Marie

    2013-01-01

    Several college majors, outside of computer science, require students to learn computer programming. Many students have difficulty getting through the programming sequence and ultimately change majors or drop out of college. To deal with this problem, active learning techniques were developed and implemented in a freshman programming logic and…

  20. Using a Cloud-Based Computing Environment to Support Teacher Training on Common Core Implementation

    ERIC Educational Resources Information Center

    Robertson, Cory

    2013-01-01

    A cloud-based computing environment, Google Apps for Education (GAFE), has provided the Anaheim City School District (ACSD) a comprehensive and collaborative avenue for creating, sharing, and editing documents, calendars, and social networking communities. With this environment, teachers and district staff at ACSD are able to utilize the deep…

  1. Computer-Supported Aids to Making Sense of Scientific Articles: Cognitive, Motivational, and Attitudinal Effects

    ERIC Educational Resources Information Center

    Gegner, Julie A.; Mackay, Donald H. J.; Mayer, Richard E.

    2009-01-01

    High school students can access original scientific research articles on the Internet, but may have trouble understanding them. To address this problem of online literacy, the authors developed a computer-based prototype for guiding students' comprehension of scientific articles. High school students were asked to read an original scientific…

  2. An Evaluation of Computer-Managed Education Technology at New York City Community College.

    ERIC Educational Resources Information Center

    Chitayat, Linda

    The Computer-Managed Education Technology (COMET) program was designed to improve group instruction through the use of technological aids in the classroom. Specific objectives included: (1) improving feedback on student comprehension during a class period; (2) facilitating the administration and grading of homework and quizzes; (3) providing for…

  3. Everything You Always Wanted to Know About CAI But Were Afraid To Ask.

    ERIC Educational Resources Information Center

    Luskin, Bernard J.; And Others

    A comprehensive summary of significant developments related to the integration of the computer in all levels of instruction, this book identifies, classifies, and examines obstacles to computer-assisted instruction (CAI), their scope and possible resolutions. Some 75 experts were surveyed and their opinions statistically analyzed in regard to 23…

  4. How Learning Logic Programming Affects Recursion Comprehension

    ERIC Educational Resources Information Center

    Haberman, Bruria

    2004-01-01

    Recursion is a central concept in computer science, yet it is difficult for beginners to comprehend. Israeli high-school students learn recursion in the framework of a special modular program in computer science (Gal-Ezer & Harel, 1999). Some of them are introduced to the concept of recursion in two different paradigms: the procedural…

  5. Pair Programming and LSs in Computing Education: Its Impact on Students' Performances

    ERIC Educational Resources Information Center

    Hui, Tie Hui; Umar, Irfan Naufal

    2011-01-01

Learning to program requires complex cognitive skills that computing students find arduous to acquire. Pair programming (PP) is an intensive style of programming cooperation in which two people work together to resolve programming scenarios. It has begun to draw the interest of educators as a teaching approach to facilitate learning and…

  6. Toward a New Model of Usability: Guidelines for Selecting Reading Fluency Apps Suitable for Instruction of Struggling Readers

    ERIC Educational Resources Information Center

    Rinehart, Steven D.; Ahern, Terence C.

    2016-01-01

    Computer applications related to reading instruction have become commonplace in schools and link with established components of the reading process, emergent skills, decoding, comprehension, vocabulary, and fluency. This article focuses on computer technology in conjunction with durable methods for building oral reading fluency when readers…

  7. ODU-CAUSE: Computer Based Learning Lab.

    ERIC Educational Resources Information Center

    Sachon, Michael W.; Copeland, Gary E.

    This paper describes the Computer Based Learning Lab (CBLL) at Old Dominion University (ODU) as a component of the ODU-Comprehensive Assistance to Undergraduate Science Education (CAUSE) Project. Emphasis is directed to the structure and management of the facility and to the software under development by the staff. Serving the ODU-CAUSE User Group…

  8. Synthesizing Results from Empirical Research on Computer-Based Scaffolding in STEM Education: A Meta-Analysis

    ERIC Educational Resources Information Center

    Belland, Brian R.; Walker, Andrew E.; Kim, Nam Ju; Lefler, Mason

    2017-01-01

    Computer-based scaffolding assists students as they generate solutions to complex problems, goals, or tasks, helping increase and integrate their higher order skills in the process. However, despite decades of research on scaffolding in STEM (science, technology, engineering, and mathematics) education, no existing comprehensive meta-analysis has…

  9. Evaluation of CFD to Determine Two-Dimensional Airfoil Characteristics for Rotorcraft Applications

    NASA Technical Reports Server (NTRS)

    Smith, Marilyn J.; Wong, Tin-Chee; Potsdam, Mark; Baeder, James; Phanse, Sujeet

    2004-01-01

    The efficient prediction of helicopter rotor performance, vibratory loads, and aeroelastic properties still relies heavily on the use of comprehensive analysis codes by the rotorcraft industry. These comprehensive codes utilize look-up tables to provide two-dimensional aerodynamic characteristics. Typically these tables are comprised of a combination of wind tunnel data, empirical data and numerical analyses. The potential to rely more heavily on numerical computations based on Computational Fluid Dynamics (CFD) simulations has become more of a reality with the advent of faster computers and more sophisticated physical models. The ability of five different CFD codes applied independently to predict the lift, drag and pitching moments of rotor airfoils is examined for the SC1095 airfoil, which is utilized in the UH-60A main rotor. Extensive comparisons with the results of ten wind tunnel tests are performed. These CFD computations are found to be as good as experimental data in predicting many of the aerodynamic performance characteristics. Four turbulence models were examined (Baldwin-Lomax, Spalart-Allmaras, Menter SST, and k-omega).
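The two-dimensional look-up tables mentioned above are typically queried by interpolating in angle of attack and Mach number. The sketch below shows a bilinear table lookup; the table values and grid points are invented for illustration and are not data from the SC1095 study.

```python
# Hypothetical sketch of a 2-D airfoil table lookup: lift coefficient
# tabulated against angle of attack and Mach number, queried by bilinear
# interpolation (the mechanism comprehensive rotorcraft codes rely on).
import bisect

alphas = [0.0, 4.0, 8.0]   # angle of attack grid [deg]
machs = [0.3, 0.5]         # Mach number grid
# cl_table[i][j] = lift coefficient at (alphas[i], machs[j]); invented numbers.
cl_table = [
    [0.00, 0.00],
    [0.44, 0.48],
    [0.85, 0.95],
]

def lookup_cl(alpha, mach):
    """Bilinearly interpolate the lift coefficient at (alpha, mach)."""
    # Locate the cell containing the query point (clamped to the table edges).
    i = max(0, min(bisect.bisect_right(alphas, alpha) - 1, len(alphas) - 2))
    j = max(0, min(bisect.bisect_right(machs, mach) - 1, len(machs) - 2))
    ta = (alpha - alphas[i]) / (alphas[i + 1] - alphas[i])
    tm = (mach - machs[j]) / (machs[j + 1] - machs[j])
    # Interpolate along alpha at the two bracketing Mach numbers, then blend.
    lo = cl_table[i][j] * (1 - ta) + cl_table[i + 1][j] * ta
    hi = cl_table[i][j + 1] * (1 - ta) + cl_table[i + 1][j + 1] * ta
    return lo * (1 - tm) + hi * tm
```

Filling such tables from CFD rather than wind tunnel data is exactly the substitution the paper evaluates; the lookup machinery in the comprehensive code is unchanged either way.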

  10. Technology, the Columbus Effect, and the Third Revolution in Learning

    DTIC Science & Technology

    2001-03-01

comprehension-fostering and comprehension-monitoring activities. Cognition and Instruction, 1, 117–175. Rogoff, B. (1990). Apprenticeship in thinking ...elementary school mathematics (Suppes, Fletcher, and Zanotti, 1975). Instructional approaches used in these early programs required computers that cost $2–3...supported by considerations of pace: the speed with which students learn material and reach instructional objectives. Easily adjusted pacing is a

  11. Tutoring Mathematical Word Problems Using Solution Trees: Text Comprehension, Situation Comprehension, and Mathematization in Solving Story Problems. Research Report No. 8.

    ERIC Educational Resources Information Center

    Reusser, Kurt; And Others

    The main concern of this paper is on the psychological processes of how students understand and solve mathematical word problems, and on how this knowledge can be applied to computer-based tutoring. It is argued that only a better understanding of the psychological requirements for understanding and solving those problems will lead to…

  12. A Pilot Study of Biomedical Text Comprehension using an Attention-Based Deep Neural Reader: Design and Experimental Analysis

    PubMed Central

    Lee, Kyubum; Kim, Byounggun; Jeon, Minji; Kim, Jihye; Tan, Aik Choon

    2018-01-01

    Background With the development of artificial intelligence (AI) technology centered on deep-learning, the computer has evolved to a point where it can read a given text and answer a question based on the context of the text. Such a specific task is known as the task of machine comprehension. Existing machine comprehension tasks mostly use datasets of general texts, such as news articles or elementary school-level storybooks. However, no attempt has been made to determine whether an up-to-date deep learning-based machine comprehension model can also process scientific literature containing expert-level knowledge, especially in the biomedical domain. Objective This study aims to investigate whether a machine comprehension model can process biomedical articles as well as general texts. Since there is no dataset for the biomedical literature comprehension task, our work includes generating a large-scale question answering dataset using PubMed and manually evaluating the generated dataset. Methods We present an attention-based deep neural model tailored to the biomedical domain. To further enhance the performance of our model, we used a pretrained word vector and biomedical entity type embedding. We also developed an ensemble method of combining the results of several independent models to reduce the variance of the answers from the models. Results The experimental results showed that our proposed deep neural network model outperformed the baseline model by more than 7% on the new dataset. We also evaluated human performance on the new dataset. The human evaluation result showed that our deep neural model outperformed humans in comprehension by 22% on average. Conclusions In this work, we introduced a new task of machine comprehension in the biomedical domain using a deep neural model. 
Since there was no large-scale dataset for training deep neural models in the biomedical domain, we created the new cloze-style datasets Biomedical Knowledge Comprehension Title (BMKC_T) and Biomedical Knowledge Comprehension Last Sentence (BMKC_LS) (together referred to as BioMedical Knowledge Comprehension) using the PubMed corpus. The experimental results showed that the performance of our model is much higher than that of humans. We observed that our model performed consistently better regardless of the degree of difficulty of a text, whereas humans have difficulty when performing biomedical literature comprehension tasks that require expert level knowledge. PMID:29305341
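The ensemble method the abstract describes, combining several independently trained readers to reduce the variance of their answers, can be sketched as probability averaging. This is a generic illustration with invented numbers, not the paper's actual model outputs.

```python
# Minimal sketch of a variance-reducing ensemble: average each model's
# probability distribution over candidate answers, then take the argmax.
def ensemble_predict(model_probs):
    """model_probs: one probability list per model, all over the same answers."""
    n_models = len(model_probs)
    n_answers = len(model_probs[0])
    avg = [sum(p[i] for p in model_probs) / n_models for i in range(n_answers)]
    return max(range(n_answers), key=lambda i: avg[i]), avg

# Three hypothetical models scoring four candidate answers; model 1 is an
# outlier, and averaging lets the two agreeing models carry the decision.
probs = [
    [0.70, 0.10, 0.10, 0.10],
    [0.20, 0.50, 0.20, 0.10],
    [0.15, 0.55, 0.20, 0.10],
]
best, avg = ensemble_predict(probs)   # best is answer index 1
```

Averaging distributions rather than hard votes preserves each model's confidence, which is why it tends to smooth out single-model errors.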

  13. The iPlant Collaborative: Cyberinfrastructure for Enabling Data to Discovery for the Life Sciences.

    PubMed

    Merchant, Nirav; Lyons, Eric; Goff, Stephen; Vaughn, Matthew; Ware, Doreen; Micklos, David; Antin, Parker

    2016-01-01

    The iPlant Collaborative provides life science research communities access to comprehensive, scalable, and cohesive computational infrastructure for data management; identity management; collaboration tools; and cloud, high-performance, high-throughput computing. iPlant provides training, learning material, and best practice resources to help all researchers make the best use of their data, expand their computational skill set, and effectively manage their data and computation when working as distributed teams. iPlant's platform permits researchers to easily deposit and share their data and deploy new computational tools and analysis workflows, allowing the broader community to easily use and reuse those data and computational analyses.

  14. Windows .NET Network Distributed Basic Local Alignment Search Toolkit (W.ND-BLAST)

    PubMed Central

    Dowd, Scot E; Zaragoza, Joaquin; Rodriguez, Javier R; Oliver, Melvin J; Payton, Paxton R

    2005-01-01

Background BLAST is one of the most common and useful tools for genetic research. This paper describes a software application we have termed Windows .NET Distributed Basic Local Alignment Search Toolkit (W.ND-BLAST), which enhances the BLAST utility by improving usability, fault recovery, and scalability in a Windows desktop environment. Our goal was to develop an easy-to-use, fault-tolerant, high-throughput BLAST solution that incorporates a comprehensive BLAST result viewer with curation and annotation functionality. Results W.ND-BLAST is a comprehensive Windows-based software toolkit that targets researchers, including those with minimal computer skills, and provides the ability to increase the performance of BLAST by distributing BLAST queries to any number of Windows-based machines across local area networks (LANs). W.ND-BLAST provides intuitive graphical user interfaces (GUIs) for BLAST database creation, BLAST execution, BLAST output evaluation, and BLAST result exportation. This software also provides several layers of fault tolerance and fault recovery to prevent loss of data if nodes or master machines fail. This paper lays out the functionality of W.ND-BLAST. W.ND-BLAST displays close to 100% performance efficiency when distributing tasks to 12 remote computers of the same performance class. A high-throughput BLAST job which took 662.68 minutes (11 hours) on one average machine was completed in 44.97 minutes when distributed to 17 nodes, which included lower-performance-class machines. Finally, there are comprehensive high-throughput BLAST Output Viewer (BOV) and Annotation Engine components, which provide comprehensive exportation of BLAST hits to text files, annotated FASTA files, tables, or association files. Conclusion W.ND-BLAST provides an interactive tool that allows scientists to easily utilize their available computing resources for high-throughput and comprehensive sequence analyses.
The install package for W.ND-BLAST is freely downloadable from . The software is free with registration; installation, networking, and usage instructions are provided, as well as a support forum. PMID:15819992
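The scaling figures reported in the abstract can be checked with a quick speedup and parallel-efficiency calculation, using the standard definitions (speedup = serial time / parallel time; efficiency = speedup / node count):

```python
# Checking the abstract's reported scaling: 662.68 minutes on one machine
# versus 44.97 minutes on 17 nodes.
serial_minutes = 662.68
parallel_minutes = 44.97
nodes = 17

speedup = serial_minutes / parallel_minutes   # ~14.7x
efficiency = speedup / nodes                  # ~0.87, i.e. about 87%
```

Roughly 87% efficiency on 17 heterogeneous nodes is consistent with the near-100% efficiency the paper reports for 12 nodes of the same performance class, since the slower machines drag the average down.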

  15. Computational analysis of aircraft pressure relief doors

    NASA Astrophysics Data System (ADS)

    Schott, Tyler

    Modern trends in commercial aircraft design have sought to improve fuel efficiency while reducing emissions by operating at higher pressures and temperatures than ever before. Consequently, greater demands are placed on the auxiliary bleed air systems used for a multitude of aircraft operations. The increased role of bleed air systems poses significant challenges for the pressure relief system to ensure the safe and reliable operation of the aircraft. The core compartment pressure relief door (PRD) is an essential component of the pressure relief system which functions to relieve internal pressure in the core casing of a high-bypass turbofan engine during a burst duct over-pressurization event. The successful modeling and analysis of a burst duct event are imperative to the design and development of PRD's to ensure that they will meet the increased demands placed on the pressure relief system. Leveraging high-performance computing coupled with advances in computational analysis, this thesis focuses on a comprehensive computational fluid dynamics (CFD) study to characterize turbulent flow dynamics and quantify the performance of a core compartment PRD across a range of operating conditions and geometric configurations. The CFD analysis was based on a compressible, steady-state, three-dimensional, Reynolds-averaged Navier-Stokes approach. Simulations were analyzed, and results show that variations in freestream conditions, plenum environment, and geometric configurations have a non-linear impact on the discharge, moment, thrust, and surface temperature characteristics. The CFD study revealed that the underlying physics for this behavior is explained by the interaction of vortices, jets, and shockwaves. This thesis research is innovative and provides a comprehensive and detailed analysis of existing and novel PRD geometries over a range of realistic operating conditions representative of a burst duct over-pressurization event. 
Further, the study provides aircraft manufacturers with valuable insight into the impact that operating conditions and geometric configurations have on PRD performance and how the information can be used to assist future research and development of PRD design.

  16. Low Cost Comprehensive Microcomputer-Based Medical History Database Acquisition

    PubMed Central

    Buchan, Robert R. C.

    1980-01-01

A carefully detailed, comprehensive medical history database is the fundamental essence of patient-physician interaction. Computer-generated medical history acquisition has repeatedly been shown to be highly acceptable to both patient and physician while consistently providing a superior product. Cost justification of machine-derived problem and history databases, however, has in the past been marginal at best. Routine use of the technology has therefore been limited to large clinics, university hospitals and federal installations where feasible volume applications are supported by endowment, research funds or taxes. This paper summarizes the use of a unique low-cost device which marries advanced microprocessor technology with random-access, variable-frame film projection techniques to acquire a detailed, comprehensive medical history database. Preliminary data are presented which compare patient-, physician-, and machine-generated histories for content, discovery, compliance and acceptability. Results compare favorably with the findings in similar studies by a variety of authors.

  17. Computed microtomography and X-ray fluorescence analysis for comprehensive analysis of structural changes in bone.

    PubMed

    Buzmakov, Alexey; Chukalina, Marina; Nikolaev, Dmitry; Schaefer, Gerald; Gulimova, Victoria; Saveliev, Sergey; Tereschenko, Elena; Seregin, Alexey; Senin, Roman; Prun, Victor; Zolotov, Denis; Asadchikov, Victor

    2013-01-01

    This paper presents the results of a comprehensive analysis of structural changes in the caudal vertebrae of Turner's thick-toed geckos by computed microtomography and X-ray fluorescence analysis. We present algorithms used for the reconstruction of tomographic images that can cope with the high-noise projections typical for samples of this nature. Reptiles, owing to their hardiness, small size, amniote status, and a number of other valuable features, are an attractive model organism for long-duration orbital experiments on unmanned spacecraft. Possible changes in their bone tissue under the influence of spaceflight are the subject of discussion among biologists from different laboratories around the world.

  18. Updated Lagrangian finite element formulations of various biological soft tissue non-linear material models: a comprehensive procedure and review.

    PubMed

    Townsend, Molly T; Sarigul-Klijn, Nesrin

    2016-01-01

    Simplified material models are commonly used in computational simulation of biological soft tissue as an approximation of the complicated material response and to minimize computational resources. However, the simulation of complex loadings, such as long-duration tissue swelling, necessitates complex models that are not easy to formulate. This paper strives to offer a comprehensive procedure for the updated Lagrangian formulation of various non-linear material models for finite element analysis of biological soft tissues, including definitions of the Cauchy stress and the spatial tangential stiffness. The relationships between water content, osmotic pressure, ionic concentration and the pore pressure stress of the tissue are discussed, along with the merits of these models and their applications.

  19. Comprehensive silicon solar cell computer modeling

    NASA Technical Reports Server (NTRS)

    Lamorte, M. F.

    1984-01-01

    The development of an efficient, comprehensive Si solar cell modeling program capable of achieving a simulation accuracy of 5 percent or less is examined. A general investigation of computerized simulation is provided. Computer simulation programs are subdivided into a number of major tasks: (1) the analytical method used to represent the physical system; (2) the phenomena submodels that comprise the simulation of the system; (3) the coding of the analysis and the phenomena submodels; (4) a coding scheme that results in efficient use of the CPU so that CPU costs are low; and (5) a simulation program modularized with respect to the structures that may be analyzed, the addition and/or modification of phenomena submodels as new experimental data become available, and the addition of other photovoltaic materials.

  20. Nonuniform Deployment of Autonomous Agents in Harbor-Like Environments

    DTIC Science & Technology

    2014-11-12

    ith agent than to all other agents. Interested readers are referred to [55] for the comprehensive study on Voronoi partitioning and its applications...robots: An rfid approach, PhD dissertation, School of Electrical Engi- neering and Computer Science, University of Ottawa (October 2012). [55] A. Okabe, B...Gueaieb, A stochastic approach of mobile robot navigation using customized rfid sys- tems, International Conference on Signals, Circuits and Systems
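    The nearest-agent rule behind a Voronoi partition (as surveyed in the cited reference [55]) can be sketched in a few lines. The agent positions and sample grid below are illustrative assumptions, not taken from the report:

```python
# Minimal sketch (illustrative): assign each grid point in a region to its
# nearest agent; the resulting point sets are the agents' Voronoi cells,
# the partition used in coverage/deployment analyses.
import math

def voronoi_partition(points, agents):
    """Map each agent index to the points for which it is the nearest agent."""
    cells = {i: [] for i in range(len(agents))}
    for p in points:
        nearest = min(range(len(agents)),
                      key=lambda i: math.dist(p, agents[i]))
        cells[nearest].append(p)
    return cells

# Three agents in a unit square, sampled on a coarse 11x11 grid.
agents = [(0.2, 0.2), (0.8, 0.2), (0.5, 0.8)]
grid = [(x / 10, y / 10) for x in range(11) for y in range(11)]
cells = voronoi_partition(grid, agents)
print([len(cells[i]) for i in range(3)])  # cell sizes; they sum to 121
```

    Ties on cell boundaries are broken here by lowest agent index, which is one common convention for discretized Voronoi assignments.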

  1. Have Basic Mathematical Skills Grown Obsolete in the Computer Age: Assessing Basic Mathematical Skills and Forecasting Performance in a Business Statistics Course

    ERIC Educational Resources Information Center

    Noser, Thomas C.; Tanner, John R.; Shah, Situl

    2008-01-01

    The purpose of this study was to measure the comprehension of basic mathematical skills of students enrolled in statistics classes at a large regional university, to determine if the scores earned on a basic math skills test are useful in forecasting student performance in these statistics classes, and to determine if students' basic math…

  2. Response Latency Measures for Biographical Inventories

    DTIC Science & Technology

    1991-03-01

    research (Trent et al., 1989). Procedures: The ASAP, followed by one or more experimental cognitive tests, was computer-administered to groups of...comprehension, and a binary "true/false" decision about the item. This last stage, in turn, is divided into two substages: self-referent decision...apply stage) As a first step in partitioning latencies, it would be prudent to control experimentally for item length, as had been done in a few studies

  3. COED Transactions, Vol. IX, No. 1, January 1977. Rapid Production of System Phase-Plane Portraits on the EAI 380 Hybrid/Analog Computer.

    ERIC Educational Resources Information Center

    Marcovitz, Alan B., Ed.

    The method of phase-plane presentation as an educational tool in the study of the dynamic behavior of systems is discussed. In the treatment of nonlinear or piecewise-linear systems, the phase-plane portrait is used to exhibit the nature of singular points, regions of stability, and switching lines to aid comprehension. A technique is described by…

  4. Eye movements during listening reveal spontaneous grammatical processing.

    PubMed

    Huette, Stephanie; Winter, Bodo; Matlock, Teenie; Ardell, David H; Spivey, Michael

    2014-01-01

    Recent research using eye-tracking typically relies on constrained visual contexts, particularly goal-oriented tasks in which participants view a small array of objects on a computer screen and perform some overt decision or identification. Eye-tracking paradigms that use pictures as a measure of word or sentence comprehension are sometimes criticized as ecologically invalid because pictures and explicit tasks are not always present during language comprehension. This study compared the comprehension of sentences with two different grammatical forms: the past progressive (e.g., was walking), which emphasizes the ongoing nature of actions, and the simple past (e.g., walked), which emphasizes the end-state of an action. The results showed that the distribution and timing of eye movements mirror the underlying conceptual structure of this linguistic difference in the absence of any visual stimuli or task constraint: fixations were shorter and saccades were more dispersed across the screen, as if participants were thinking about more dynamic events, when listening to the past progressive stories. Thus, the eye movement data suggest that visual inputs and an explicit task are unnecessary to elicit analog representations of features such as movement, which could be a key perceptual component of grammatical comprehension.

  5. Analyzing comprehensive QoS with security constraints for services composition applications in wireless sensor networks.

    PubMed

    Xiong, Naixue; Wu, Zhao; Huang, Yannong; Xu, Degang

    2014-12-01

    Services composition is fundamental to software development in multi-service wireless sensor networks (WSNs). The quality of service (QoS) of services composition applications (SCAs) is confronted with severe challenges due to the open, dynamic, and complex nature of WSNs. Most previous research separated various QoS indices into different fields and studied them individually due to the computational complexity. This approach ignores the mutual influence between these QoS indices and leads to a non-comprehensive and inaccurate analysis result. The universal generating function (UGF) offers both speed and precision in QoS analysis. However, only one QoS index at a time can be analyzed by the classic UGF. In order to efficiently analyze the comprehensive QoS of SCAs, this paper proposes an improved UGF technique, the vector universal generating function (VUGF), which considers the relationship between multiple QoS indices, including security, and can simultaneously analyze multiple QoS indices. The numerical examples demonstrate that it can be used for the evaluation of the comprehensive QoS of SCAs subjected to the security constraint in WSNs. Therefore, it can be effectively applied to the optimal design of multi-service WSNs.
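    As background, the classic single-index UGF that the VUGF generalizes can be sketched as follows. A component's performance distribution is written as a set of (performance, probability) pairs, and system composition applies a structure operator over all state pairs. The service states and the series (min, i.e. bottleneck) operator below are illustrative assumptions, not the paper's model:

```python
# Illustrative sketch of the classic universal generating function (UGF):
# u(z) = sum_i p_i * z^{g_i}, represented here as {performance: probability}.
# Composing two components applies an operator (min for a series structure)
# over all pairs of states, multiplying the probabilities.
from collections import defaultdict

def compose(u1, u2, op):
    """Combine two UGFs {performance: probability} with structure operator op."""
    out = defaultdict(float)
    for g1, p1 in u1.items():
        for g2, p2 in u2.items():
            out[op(g1, g2)] += p1 * p2
    return dict(out)

# Two services with throughput states (performance -> probability).
s1 = {100: 0.9, 0: 0.1}
s2 = {100: 0.8, 50: 0.15, 0: 0.05}
series = compose(s1, s2, min)  # series composition: bottleneck throughput
print({g: round(p, 3) for g, p in series.items()})  # {100: 0.72, 50: 0.135, 0: 0.145}
```

    The VUGF of the paper extends this idea so that each state carries a vector of QoS indices (including security) rather than a single performance value.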

  6. Analyzing Comprehensive QoS with Security Constraints for Services Composition Applications in Wireless Sensor Networks

    PubMed Central

    Xiong, Naixue; Wu, Zhao; Huang, Yannong; Xu, Degang

    2014-01-01

    Services composition is fundamental to software development in multi-service wireless sensor networks (WSNs). The quality of service (QoS) of services composition applications (SCAs) is confronted with severe challenges due to the open, dynamic, and complex nature of WSNs. Most previous research separated various QoS indices into different fields and studied them individually due to the computational complexity. This approach ignores the mutual influence between these QoS indices and leads to a non-comprehensive and inaccurate analysis result. The universal generating function (UGF) offers both speed and precision in QoS analysis. However, only one QoS index at a time can be analyzed by the classic UGF. In order to efficiently analyze the comprehensive QoS of SCAs, this paper proposes an improved UGF technique, the vector universal generating function (VUGF), which considers the relationship between multiple QoS indices, including security, and can simultaneously analyze multiple QoS indices. The numerical examples demonstrate that it can be used for the evaluation of the comprehensive QoS of SCAs subjected to the security constraint in WSNs. Therefore, it can be effectively applied to the optimal design of multi-service WSNs. PMID:25470488

  7. COMAN: a web server for comprehensive metatranscriptomics analysis.

    PubMed

    Ni, Yueqiong; Li, Jun; Panagiotou, Gianni

    2016-08-11

    Microbiota-oriented studies based on metagenomic or metatranscriptomic sequencing have revolutionised our understanding of microbial ecology and the roles of both clinical and environmental microbes. The analysis of massive metatranscriptomic data requires extensive computational resources, a collection of bioinformatics tools and expertise in programming. We developed COMAN (Comprehensive Metatranscriptomics Analysis), a web-based tool dedicated to automatically and comprehensively analysing metatranscriptomic data. The COMAN pipeline includes quality control of raw reads and removal of reads derived from non-coding RNA, followed by functional annotation, comparative statistical analysis, pathway enrichment analysis, co-expression network analysis and high-quality visualisation. The essential data generated by COMAN are also provided in tabular format for additional analysis and integration with other software. The web server has an easy-to-use interface and detailed instructions, and is freely available at http://sbb.hku.hk/COMAN/. COMAN is an integrated web server dedicated to comprehensive functional analysis of metatranscriptomic data, translating massive amounts of reads into data tables and high-standard figures. It is expected to help researchers with limited bioinformatics expertise answer microbiota-related biological questions and to increase the accessibility and interpretation of microbiota RNA-Seq data.

  8. Experimental and computational flow-field results for an all-body hypersonic aircraft

    NASA Technical Reports Server (NTRS)

    Cleary, Joseph W.

    1989-01-01

    A comprehensive test program is defined which is being implemented in the NASA/Ames 3.5 foot Hypersonic Wind Tunnel for obtaining data on a generic all-body hypersonic vehicle for computational fluid dynamics (CFD) code validation. Computational methods (approximate inviscid methods and an upwind parabolized Navier-Stokes code) currently being applied to the all-body model are outlined. Experimental and computational results on surface pressure distributions and Pitot-pressure surveys for the basic sharp-nose model (without control surfaces) at a free-stream Mach number of 7 are presented.

  9. A conceptual and computational model of moral decision making in human and artificial agents.

    PubMed

    Wallach, Wendell; Franklin, Stan; Allen, Colin

    2010-07-01

    Recently, there has been a resurgence of interest in general, comprehensive models of human cognition. Such models aim to explain higher-order cognitive faculties, such as deliberation and planning. Given a computational representation, the validity of these models can be tested in computer simulations such as software agents or embodied robots. The push to implement computational models of this kind has created the field of artificial general intelligence (AGI). Moral decision making is arguably one of the most challenging tasks for computational approaches to higher-order cognition. The need for increasingly autonomous artificial agents to factor moral considerations into their choices and actions has given rise to another new field of inquiry variously known as Machine Morality, Machine Ethics, Roboethics, or Friendly AI. In this study, we discuss how LIDA, an AGI model of human cognition, can be adapted to model both affective and rational features of moral decision making. Using the LIDA model, we will demonstrate how moral decisions can be made in many domains using the same mechanisms that enable general decision making. Comprehensive models of human cognition typically aim for compatibility with recent research in the cognitive and neural sciences. Global workspace theory, proposed by the neuropsychologist Bernard Baars (1988), is a highly regarded model of human cognition that is currently being computationally instantiated in several software implementations. LIDA (Franklin, Baars, Ramamurthy, & Ventura, 2005) is one such computational implementation. LIDA is both a set of computational tools and an underlying model of human cognition, which provides mechanisms that are capable of explaining how an agent's selection of its next action arises from bottom-up collection of sensory data and top-down processes for making sense of its current situation. 
We will describe how the LIDA model helps integrate emotions into the human decision-making process, and we will elucidate a process whereby an agent can work through an ethical problem to reach a solution that takes account of ethically relevant factors.

  10. Development and validation of rear impact computer simulation model of an adult manual transit wheelchair with a seated occupant.

    PubMed

    Salipur, Zdravko; Bertocci, Gina

    2010-01-01

    It has been shown that ANSI WC19 transit wheelchairs that are crashworthy in frontal impact exhibit catastrophic failures in rear impact and may not be able to provide stable seating support and thus occupant protection for the wheelchair occupant. Thus far only limited sled test and computer simulation data have been available to study rear impact wheelchair safety. Computer modeling can be used as an economic and comprehensive tool to gain critical knowledge regarding wheelchair integrity and occupant safety. This study describes the development and validation of a computer model simulating an adult wheelchair-seated occupant subjected to a rear impact event. The model was developed in MADYMO and validated rigorously using the results of three similar sled tests conducted to specifications provided in the draft ISO/TC 173 standard. Outcomes from the model can provide critical wheelchair loading information to wheelchair and tiedown manufacturers, resulting in safer wheelchair designs for rear impact conditions.

  11. A comprehensive analytical model of rotorcraft aerodynamics and dynamics. Part 2: User's manual

    NASA Technical Reports Server (NTRS)

    Johnson, W.

    1980-01-01

    The use of a computer program for a comprehensive analytical model of rotorcraft aerodynamics and dynamics is described. The program calculates the loads and motion of helicopter rotors and airframe. First the trim solution is obtained, then the flutter, flight dynamics, and/or transient behavior can be calculated. Either a new job can be initiated or further calculations can be performed for an old job.

  12. Wide Band Spurious Suppression of Multi-Strip Resonator BPF —Comprehensive Way to Suppress Spurious Responses in BPFs—

    NASA Astrophysics Data System (ADS)

    Awai, Ikuo

    A new comprehensive method to suppress the spurious modes in a BPF is proposed, taking the multi-strip resonator BPF as an example. It consists of simultaneously detuning the resonant frequency, coupling coefficient and external Q of the higher-order modes. The design example shows an extraordinarily good out-of-band response in computer simulation.

  13. Florida Assessments for Instruction in Reading, Aligned to the Language Arts Florida Standards, FAIR-FS, Grades 3 through 12. Technical Manual

    ERIC Educational Resources Information Center

    Foorman, Barbara R.; Petscher, Yaacov; Schatschneider, Chris

    2015-01-01

    The FAIR-FS consists of computer-adaptive reading comprehension and oral language screening tasks that provide measures to track growth over time, as well as a Probability of Literacy Success (PLS) linked to grade-level performance (i.e., the 40th percentile) on the reading comprehension subtest of the Stanford Achievement Test (SAT-10) in the…

  14. A comparison of graph- and kernel-based -omics data integration algorithms for classifying complex traits.

    PubMed

    Yan, Kang K; Zhao, Hongyu; Pang, Herbert

    2017-12-06

    High-throughput sequencing data are widely collected and analyzed in the study of complex diseases in the quest to improve human health. Well-studied algorithms mostly deal with a single data source and cannot fully utilize the potential of multi-omics data. In order to provide a holistic understanding of human health and diseases, it is necessary to integrate multiple data sources. Several algorithms have been proposed so far; however, a comprehensive comparison of data integration algorithms for classification of binary traits is currently lacking. In this paper, we focus on two common classes of integration algorithms: graph-based algorithms, which depict relationships with subjects denoted by nodes and relationships denoted by edges, and kernel-based algorithms, which generate a classifier in feature space. Our paper provides a comprehensive comparison of their performance in terms of various measurements of classification accuracy and computation time. Seven different integration algorithms, including graph-based semi-supervised learning, graph sharpening integration, composite association network, Bayesian network, semi-definite programming-support vector machine (SDP-SVM), relevance vector machine (RVM) and Ada-boost relevance vector machine, are compared and evaluated on hypertension and two cancer data sets in our study. In general, kernel-based algorithms create more complex models and require longer computation time, but they tend to perform better than graph-based algorithms; graph-based algorithms have the advantage of faster computation. The empirical results demonstrate that composite association network, relevance vector machine, and Ada-boost RVM are the better performers. We provide recommendations on how to choose an appropriate algorithm for integrating data from multiple sources.
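    A minimal, hypothetical sketch of the kernel-based style of integration the paper compares: each data source gets its own RBF kernel, the combined kernel is their unweighted average, and a nearest-neighbour rule in the combined similarity classifies a new subject. The toy data and the simple classifier below are illustrative stand-ins for the far more sophisticated SDP-SVM and RVM methods evaluated in the paper:

```python
# Hedged sketch (pure Python, toy data): kernel-based integration of two
# -omics sources by averaging per-source RBF kernels, then classifying a
# new subject by its most similar training subject in the combined kernel.
import math

def rbf(x, y, gamma=1.0):
    """RBF kernel between two feature vectors."""
    return math.exp(-gamma * sum((a - b) ** 2 for a, b in zip(x, y)))

def combined_similarity(i_feats, j_feats):
    """Average the per-source kernels (here: two sources per subject)."""
    return sum(rbf(a, b) for a, b in zip(i_feats, j_feats)) / len(i_feats)

# Each subject: (source-1 features, source-2 features) and a binary trait.
train = [(((0.0, 0.1), (0.2,)), 0),
         (((0.1, 0.0), (0.1,)), 0),
         (((1.0, 0.9), (0.8,)), 1),
         (((0.9, 1.1), (1.0,)), 1)]

def predict(x_feats):
    sims = [(combined_similarity(x_feats, f), label) for f, label in train]
    return max(sims)[1]  # label of the most similar training subject

print(predict(((0.05, 0.05), (0.15,))))  # -> 0
print(predict(((0.95, 1.0), (0.9,))))    # -> 1
```

    A weighted (rather than unweighted) kernel combination, with weights learned from data, is essentially what distinguishes methods such as SDP-SVM from this naive average.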

  15. ClassyFire: automated chemical classification with a comprehensive, computable taxonomy.

    PubMed

    Djoumbou Feunang, Yannick; Eisner, Roman; Knox, Craig; Chepelev, Leonid; Hastings, Janna; Owen, Gareth; Fahy, Eoin; Steinbeck, Christoph; Subramanian, Shankar; Bolton, Evan; Greiner, Russell; Wishart, David S

    2016-01-01

    Scientists have long been driven by the desire to describe, organize, classify, and compare objects using taxonomies and/or ontologies. In contrast to biology, geology, and many other scientific disciplines, the world of chemistry still lacks a standardized chemical ontology or taxonomy. Several attempts at chemical classification have been made, but they have mostly been limited to either manual or semi-automated proof-of-principle applications. This is regrettable, as comprehensive chemical classification and description tools could not only improve our understanding of chemistry but also improve the linkage between chemistry and many other fields. For instance, the chemical classification of a compound could help predict its metabolic fate in humans, its druggability, or potential hazards associated with it, among others. However, the sheer number (tens of millions of compounds) and complexity of chemical structures are such that any manual classification effort would prove to be near impossible. We have developed a comprehensive, flexible, and computable, purely structure-based chemical taxonomy (ChemOnt), along with a computer program (ClassyFire) that uses only chemical structures and structural features to automatically assign all known chemical compounds to a taxonomy consisting of >4800 different categories. This new chemical taxonomy consists of up to 11 different levels (Kingdom, SuperClass, Class, SubClass, etc.), with each of the categories defined by unambiguous, computable structural rules. Furthermore, each category is named using a consensus-based nomenclature and described (in English) based on the characteristic common structural properties of the compounds it contains. The ClassyFire webserver is freely accessible at http://classyfire.wishartlab.com/. Moreover, a Ruby API version is available at https://bitbucket.org/wishartlab/classyfire_api, which provides programmatic access to the ClassyFire server and database.
ClassyFire has been used to annotate over 77 million compounds and has already been integrated into other software packages to automatically generate textual descriptions for, and/or infer biological properties of over 100,000 compounds. Additional examples and applications are provided in this paper. ClassyFire, in combination with ChemOnt (ClassyFire's comprehensive chemical taxonomy), now allows chemists and cheminformaticians to perform large-scale, rapid and automated chemical classification. Moreover, a freely accessible API allows easy access to more than 77 million "ClassyFire" classified compounds. The results can be used to help annotate well studied, as well as lesser-known compounds. In addition, these chemical classifications can be used as input for data integration, and many other cheminformatics-related tasks.
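    As a toy illustration of what "unambiguous, computable structural rules" can look like, the sketch below classifies SMILES strings with naive substring tests. These rules are invented for illustration and are not ChemOnt's actual rules, which operate on parsed chemical structures rather than raw strings:

```python
# Toy rule-based classifier in the spirit of ChemOnt's computable rules.
# Rules are checked in order; the first match wins. Naive substring tests
# on SMILES are fragile (e.g. "Cl" contains "C"), so the examples below
# deliberately avoid such ambiguous inputs.
RULES = [
    ("Organic compounds / Carboxylic acids", lambda s: "C(=O)O" in s),
    ("Organic compounds / Alcohols",         lambda s: "O" in s and "C" in s),
    ("Inorganic compounds",                  lambda s: "C" not in s),
]

def classify(smiles):
    """Return the first category whose structural rule matches."""
    for category, rule in RULES:
        if rule(smiles):
            return category
    return "Unclassified"

print(classify("CC(=O)O"))  # acetic acid -> Organic compounds / Carboxylic acids
print(classify("CCO"))      # ethanol     -> Organic compounds / Alcohols
print(classify("O"))        # water       -> Inorganic compounds
```

    The ordering of rules matters: the more specific carboxylic-acid rule must precede the generic alcohol rule, mirroring how a taxonomy assigns the deepest matching category.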

  16. Older Children and Adolescents with High-Functioning Autism Spectrum Disorders Can Comprehend Verbal Irony in Computer-Mediated Communication

    ERIC Educational Resources Information Center

    Glenwright, Melanie; Agbayewa, Abiola S.

    2012-01-01

    We compared the comprehension of verbal irony presented in computer-mediated conversations for older children and adolescents with high-functioning autism spectrum disorders (HFASD) and typically developing (TD) controls. We also determined whether participants' interpretations of irony were affected by the relationship between characters in the…

  17. Using Technology To Support Comprehensive Guidance Program Operations: A Variety of Strategies.

    ERIC Educational Resources Information Center

    Bowers, Judy

    The Tucson Unified School District made a goal for 2000-2001 for all counselors to have their own computer at school. This article looks at how these computers are used to enhance the counselors' jobs. At the district level, the staff communicates with counselors through e-mail. Meeting reminders, general information, and upcoming events are…

  18. Computers and Writing. Learning Package No. 33.

    ERIC Educational Resources Information Center

    Simic, Marge, Comp.; Smith, Carl, Ed.

    Originally developed as part of a project for the Department of Defense Schools (DoDDS) system, this learning package on computers and writing is designed for teachers who wish to upgrade or expand their teaching skills on their own. The package includes an overview of the project; a comprehensive search of the ERIC database; a lecture giving an…

  19. A Computer Program To Increase Comprehension of the Cartesian Rectangular Coordinate System in High School Pre-Algebra Students.

    ERIC Educational Resources Information Center

    Exley, I. Sheck

    The high percentage of high school pre-algebra students having difficulty learning the abstract concept of graphing ordered pairs on the Cartesian rectangular coordinate system was addressed by the creation and implementation of a computer-managed instructional program. Modules consisted of a pretest, instruction, two practice sessions, and a…

  20. Course Modularization Applied: The Interface System and Its Implications For Sequence Control and Data Analysis.

    ERIC Educational Resources Information Center

    Schneider, E. W.

    The Interface System is a comprehensive method for developing and managing computer-assisted instructional courses or computer-managed instructional courses composed of sets of instructional modules. Each module is defined by one or more behavioral objectives and by a list of prerequisite modules that must be completed successfully before the…

  1. A Computer-Based, Interactive Videodisc Job Aid and Expert System for Electron Beam Lithography Integration and Diagnostic Procedures.

    ERIC Educational Resources Information Center

    Stevenson, Kimberly

    This master's thesis describes the development of an expert system and interactive videodisc computer-based instructional job aid used for assisting in the integration of electron beam lithography devices. Comparable to all comprehensive training, expert system and job aid development require a criterion-referenced systems approach treatment to…

  2. Computerized Audio-Visual Instructional Sequences (CAVIS): A Versatile System for Listening Comprehension in Foreign Language Teaching.

    ERIC Educational Resources Information Center

    Aleman-Centeno, Josefina R.

    1983-01-01

    Discusses the development and evaluation of CAVIS, which consists of an Apple microcomputer used with audiovisual dialogs. Includes research on the effects of three conditions: (1) computer with audio and visual, (2) computer with audio alone and (3) audio alone in short-term and long-term recall. (EKN)

  3. Data sonification and sound visualization.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kaper, H. G.; Tipei, S.; Wiebel, E.

    1999-07-01

    Sound can help us explore and analyze complex data sets in scientific computing. The authors describe a digital instrument for additive sound synthesis (Diass) and a program to visualize sounds in a virtual reality environment (M4Cave). Both are part of a comprehensive music composition environment that includes additional software for computer-assisted composition and automatic music notation.

  4. CFL3D: Its History and Some Recent Applications

    NASA Technical Reports Server (NTRS)

    Rumsey, C. L.; Biedron, R. T.; Thomas, J. L.

    1997-01-01

    The history of the Computational Fluids Laboratory 3-D (CFL3D) Navier-Stokes computer code is discussed and a comprehensive reference list is given. Three recent advanced applications are presented: (1) a wing with a partial-span flap, (2) an F/A-18 with a forebody control strake, and (3) noise predictions for an advanced ducted propeller turbomachinery flow.

  5. Sigma 2 Graphic Display Software Program Description

    NASA Technical Reports Server (NTRS)

    Johnson, B. T.

    1973-01-01

    A general purpose, user oriented graphic support package was implemented. A comprehensive description of the two software components comprising this package is given: Display Librarian and Display Controller. These programs have been implemented in FORTRAN on the XDS Sigma 2 Computer Facility. This facility consists of an XDS Sigma 2 general purpose computer coupled to a Computek Display Terminal.

  6. Cognitive architectures and language acquisition: a case study in pronoun comprehension.

    PubMed

    van Rij, Jacolien; van Rijn, Hedderik; Hendriks, Petra

    2010-06-01

    In this paper we discuss a computational cognitive model of children's poor performance on pronoun interpretation (the so-called Delay of Principle B Effect, or DPBE). This cognitive model is based on a theoretical account that attributes the DPBE to children's inability as hearers to also take into account the speaker's perspective. The cognitive model predicts that child hearers are unable to do so because their speed of linguistic processing is too limited to perform this second step in interpretation. We tested this hypothesis empirically in a psycholinguistic study, in which we slowed down the speech rate to give children more time for interpretation, and in a computational simulation study. The results of the two studies confirm the predictions of our model. Moreover, these studies show that embedding a theory of linguistic competence in a cognitive architecture allows for the generation of detailed and testable predictions with respect to linguistic performance.

  7. An Integrated Review of Emoticons in Computer-Mediated Communication.

    PubMed

    Aldunate, Nerea; González-Ibáñez, Roberto

    2016-01-01

    Facial expressions constitute a rich source of non-verbal cues in face-to-face communication. They provide interlocutors with resources to express and interpret verbal messages, which may affect their cognitive and emotional processing. In contrast, computer-mediated communication (CMC), particularly text-based communication, is limited to the use of symbols to convey a message, where facial expressions cannot be transmitted naturally. In this scenario, people use emoticons as paralinguistic cues to convey emotional meaning. Research has shown that emoticons contribute to a greater social presence as a result of the enrichment of text-based communication channels. Additionally, emoticons constitute a valuable resource for language comprehension by providing expressivity to text messages. The latter findings have been supported by studies in neuroscience showing that particular brain regions involved in emotional processing are also activated when people are exposed to emoticons. To reach an integrated understanding of the influence of emoticons in human communication on both socio-cognitive and neural levels, we review the literature on emoticons in three different areas. First, we present relevant literature on emoticons in CMC. Second, we study the influence of emoticons in language comprehension. Finally, we show the incipient research in neuroscience on this topic. This mini review reveals that, while there are plenty of studies on the influence of emoticons in communication from a social psychology perspective, little is known about the neurocognitive basis of the effects of emoticons on communication dynamics.

  8. A Web GIS Enabled Comprehensive Hydrologic Information System for Indian Water Resources Systems

    NASA Astrophysics Data System (ADS)

    Goyal, A.; Tyagi, H.; Gosain, A. K.; Khosa, R.

    2017-12-01

    Hydrological systems across the globe are getting increasingly water stressed with each passing season due to climate variability & snowballing water demand. Hence, to safeguard food, livelihood & economic security, it becomes imperative to employ scientific studies for holistic management of indispensable resource like water. However, hydrological study of any scale & purpose is heavily reliant on various spatio-temporal datasets which are not only difficult to discover/access but are also tough to use & manage. Besides, owing to diversity of water sector agencies & dearth of standard operating procedures, seamless information exchange is challenging for collaborators. Extensive research is being done worldwide to address these issues but regrettably not much has been done in developing countries like India. Therefore, the current study endeavours to develop a Hydrological Information System framework in a Web-GIS environment for empowering Indian water resources systems. The study attempts to harmonize the standards for metadata, terminology, symbology, versioning & archiving for effective generation, processing, dissemination & mining of data required for hydrological studies. Furthermore, modelers with humble computing resources at their disposal, can consume this standardized data in high performance simulation modelling using cloud computing within the developed Web-GIS framework. They can also integrate the inputs-outputs of different numerical models available on the platform and integrate their results for comprehensive analysis of the chosen hydrological system. Thus, the developed portal is an all-in-one framework that can facilitate decision makers, industry professionals & researchers in efficient water management.

  9. Improved Foundry Castings Utilizing CAD/CAM (Computer Aided Design/ Computer Aided Manufacture). Volume 1. Overview

    DTIC Science & Technology

    1988-06-30

    casting. 68 Figure 1-9: Line printer representation of roll solidification. 69 Figure I1-1: Test casting model. 76 Figure 11-2: Division of test casting...writing new casting analysis and design routines. The new routines would take advantage of advanced criteria for predicting casting soundness and cast...properties and technical advances in computer hardware and software. 11 2. CONCLUSIONS UPCAST, a comprehensive software package, has been developed for

  10. The Bilingual Language Interaction Network for Comprehension of Speech*

    PubMed Central

    Marian, Viorica

    2013-01-01

    During speech comprehension, bilinguals co-activate both of their languages, resulting in cross-linguistic interaction at various levels of processing. This interaction has important consequences for both the structure of the language system and the mechanisms by which the system processes spoken language. Using computational modeling, we can examine how cross-linguistic interaction affects language processing in a controlled, simulated environment. Here we present a connectionist model of bilingual language processing, the Bilingual Language Interaction Network for Comprehension of Speech (BLINCS), wherein interconnected levels of processing are created using dynamic, self-organizing maps. BLINCS can account for a variety of psycholinguistic phenomena, including cross-linguistic interaction at and across multiple levels of processing, cognate facilitation effects, and audio-visual integration during speech comprehension. The model also provides a way to separate two languages without requiring a global language-identification system. We conclude that BLINCS serves as a promising new model of bilingual spoken language comprehension. PMID:24363602
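
    The dynamic self-organizing maps from which BLINCS builds its processing levels can be illustrated with a minimal trainer. The sketch below is a generic SOM with a decaying learning rate and Gaussian neighborhood; the grid size, schedules, and the two-cluster demo in the test are illustrative assumptions, not the model's actual configuration.

```python
import numpy as np

def train_som(data, grid=(8, 8), epochs=20, lr0=0.5, sigma0=3.0, seed=0):
    """Minimal self-organizing map: pull the best-matching unit and its
    grid neighbors toward each input, with decaying rate and radius."""
    rng = np.random.default_rng(seed)
    h, w = grid
    weights = rng.normal(size=(h * w, data.shape[1]))
    coords = np.array([(i, j) for i in range(h) for j in range(w)], float)
    n_steps, step = epochs * len(data), 0
    for _ in range(epochs):
        for x in rng.permutation(data):
            frac = 1.0 - step / n_steps
            lr, sigma = lr0 * frac, sigma0 * frac + 0.5
            bmu = np.argmin(((weights - x) ** 2).sum(axis=1))   # best-matching unit
            d2 = ((coords - coords[bmu]) ** 2).sum(axis=1)      # grid distance to BMU
            neigh = np.exp(-d2 / (2.0 * sigma ** 2))            # neighborhood kernel
            weights += lr * neigh[:, None] * (x - weights)
            step += 1
    return weights
```

    Trained on bilingual-like input (e.g., phonetic feature vectors drawn from two languages), nearby map units come to respond to similar inputs, which suggests how such maps can keep two languages separable without an explicit global language tag.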

  11. Comprehensive rotorcraft analysis methods

    NASA Technical Reports Server (NTRS)

    Stephens, Wendell B.; Austin, Edward E.

    1988-01-01

    The development and application of comprehensive rotorcraft analysis methods in the field of rotorcraft technology are described. These large scale analyses and the resulting computer programs are intended to treat the complex aeromechanical phenomena that describe the behavior of rotorcraft. They may be used to predict rotor aerodynamics, acoustic, performance, stability and control, handling qualities, loads and vibrations, structures, dynamics, and aeroelastic stability characteristics for a variety of applications including research, preliminary and detail design, and evaluation and treatment of field problems. The principal comprehensive methods developed or under development in recent years and generally available to the rotorcraft community because of US Army Aviation Research and Technology Activity (ARTA) sponsorship of all or part of the software systems are the Rotorcraft Flight Simulation (C81), Dynamic System Coupler (DYSCO), Coupled Rotor/Airframe Vibration Analysis Program (SIMVIB), Comprehensive Analytical Model of Rotorcraft Aerodynamics and Dynamics (CAMRAD), General Rotorcraft Aeromechanical Stability Program (GRASP), and Second Generation Comprehensive Helicopter Analysis System (2GCHAS).

  12. The Prevalence of Phenylketonuria in Arab Countries, Turkey, and Iran: A Systematic Review.

    PubMed

    El-Metwally, Ashraf; Yousef Al-Ahaidib, Lujane; Ayman Sunqurah, Alaa; Al-Surimi, Khaled; Househ, Mowafa; Alshehri, Ali; Da'ar, Omar B; Abdul Razzak, Hira; AlOdaib, Ali Nasser

    2018-01-01

    This paper seeks to identify the prevalence of Phenylketonuria (PKU) in Arab countries, Turkey, and Iran. The study reviewed the existence of comprehensive national newborn screening programs and reported consanguinity rates. A computer-based literature search was conducted using relevant keywords to retrieve studies conducted on PKU. A total of 34 articles were included. Prevalence was categorized based on the type of screening method used for PKU diagnosis. The prevalence of classical PKU diagnosed through a comprehensive national newborn screening program ranged from 0.005% to 0.0167%. The highest prevalence was reported in Turkey, at 0.0167%, whereas the lowest was reported in the UAE, at 0.005%. The findings of this review emphasize the need for the establishment of more efficient reporting systems in these countries that would help measure Disability-Adjusted Life Years (DALYs) in order to estimate the overall societal burden of PKU.

  13. Comprehensive computational model for combining fluid hydrodynamics, light transport and biomass growth in a Taylor vortex algal photobioreactor: Lagrangian approach.

    PubMed

    Gao, Xi; Kong, Bo; Vigil, R Dennis

    2017-01-01

    A comprehensive quantitative model incorporating the effects of fluid flow patterns, light distribution, and algal growth kinetics on biomass growth rate is developed in order to predict the performance of a Taylor vortex algal photobioreactor for culturing Chlorella vulgaris. A commonly used Lagrangian strategy for coupling the various factors influencing algal growth was employed whereby results from computational fluid dynamics and radiation transport simulations were used to compute numerous microorganism light exposure histories, and this information in turn was used to estimate the global biomass specific growth rate. The simulations provide good quantitative agreement with experimental data and correctly predict the trend in reactor performance as a key reactor operating parameter (inner cylinder rotation speed) is varied. However, biomass growth curves are consistently over-predicted, and potential causes for these over-predictions and drawbacks of the Lagrangian approach are addressed. Copyright © 2016 Elsevier Ltd. All rights reserved.
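
    The Lagrangian coupling strategy described in the abstract — track cells through the flow, convert each trajectory into a light-exposure history, and average the resulting growth kinetics — can be sketched in a few lines. All constants and function names here are illustrative placeholders (simple Beer-Lambert attenuation and Monod-type light kinetics), not the paper's calibrated model.

```python
import numpy as np

# Illustrative constants (not from the paper).
MU_MAX = 1.2e-5   # maximum specific growth rate, 1/s
K_I = 50.0        # light half-saturation constant, umol/m^2/s
I0 = 300.0        # incident light at the reactor wall
ALPHA = 200.0     # effective attenuation coefficient, 1/m

def local_light(depth):
    """Beer-Lambert attenuation of light with optical depth into the culture."""
    return I0 * np.exp(-ALPHA * depth)

def growth_rate(intensity):
    """Monod-type light-limited growth kinetics."""
    return MU_MAX * intensity / (K_I + intensity)

def mean_growth_from_trajectories(depth_histories):
    """Given simulated particle depth histories (n_particles x n_steps),
    convert each to a light-exposure history and average the resulting
    instantaneous growth rates over particles and time."""
    return growth_rate(local_light(np.asarray(depth_histories))).mean()

# Example: particles cycling between the lit wall and the darker core,
# qualitatively like Taylor-vortex trajectories.
t = np.linspace(0.0, 60.0, 601)
depths = np.tile(0.005 * (1.0 + np.sin(t)), (20, 1))  # 20 particles, depth in m
mu = mean_growth_from_trajectories(depths)
```

    Because the kinetics are nonlinear in light intensity, averaging growth over exposure histories (as here) gives a different answer than evaluating the kinetics at the average intensity, which is the motivation for the Lagrangian approach.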

  14. The assembly, collapse and restoration of food webs

    USGS Publications Warehouse

    Dobson, Andy; Allesina, Stefano; Lafferty, Kevin; Pascual, Mercedes

    2009-01-01

    Darwin chose the metaphor of a 'tangled bank' to conclude the 'Origin of species'. Two centuries after Darwin's birth, we are still untangling the complex ecological networks he has pondered. In particular, studies of food webs provide important insights into how natural ecosystems function (Pascual & Dunne 2005). Although the nonlinear interactions between many species creates challenges of scale, resolution of data and significant computational constraints, the last 10 years have seen significant advances built on the earlier classic studies of Cohen, May, Pimm, Polis, Lawton and Yodzis (May 1974; Cohen 1978; Pimm 1982; Briand & Cohen 1984, 1987; Yodzis 1989; Cohen et al. 1990; Pimm et al. 1991; Yodzis & Innes 1992; Yodzis 1998). These gains stem from advances in computing power and the collation of more comprehensive data from a broader array of empirical food webs.

  15. Advanced imaging in COPD: insights into pulmonary pathophysiology

    PubMed Central

    Milne, Stephen

    2014-01-01

    Chronic obstructive pulmonary disease (COPD) involves a complex interaction of structural and functional abnormalities. The two have long been studied in isolation. However, advanced imaging techniques allow us to simultaneously assess pathological processes and their physiological consequences. This review gives a comprehensive account of the various advanced imaging modalities used to study COPD, including computed tomography (CT), magnetic resonance imaging (MRI), and the nuclear medicine techniques positron emission tomography (PET) and single-photon emission computed tomography (SPECT). Some more recent developments in imaging technology, including micro-CT, synchrotron imaging, optical coherence tomography (OCT) and electrical impedance tomography (EIT), are also described. The authors identify the pathophysiological insights gained from these techniques, and speculate on the future role of advanced imaging in both clinical and research settings. PMID:25478198

  16. Developing computer training programs for blood bankers.

    PubMed

    Eisenbrey, L

    1992-01-01

    Two surveys were conducted in July 1991 to gather information about computer training currently performed within American Red Cross Blood Services Regions. One survey was completed by computer trainers from software developer-vendors and regional centers. The second survey was directed to the trainees, to determine their perception of the computer training. The surveys identified the major concepts, length of training, evaluations, and methods of instruction used. Strengths and weaknesses of training programs were highlighted by trainee respondents. Using the survey information and other sources, recommendations (including those concerning which computer skills and tasks should be covered) are made that can be used as guidelines for developing comprehensive computer training programs at any blood bank or blood center.

  17. The iPlant Collaborative: Cyberinfrastructure for Enabling Data to Discovery for the Life Sciences

    PubMed Central

    Merchant, Nirav; Lyons, Eric; Goff, Stephen; Vaughn, Matthew; Ware, Doreen; Micklos, David; Antin, Parker

    2016-01-01

    The iPlant Collaborative provides life science research communities access to comprehensive, scalable, and cohesive computational infrastructure for data management; identity management; collaboration tools; and cloud, high-performance, high-throughput computing. iPlant provides training, learning material, and best practice resources to help all researchers make the best use of their data, expand their computational skill set, and effectively manage their data and computation when working as distributed teams. iPlant’s platform permits researchers to easily deposit and share their data and deploy new computational tools and analysis workflows, allowing the broader community to easily use and reuse those data and computational analyses. PMID:26752627

  18. Computer simulations of optimum boost and buck-boost converters

    NASA Technical Reports Server (NTRS)

    Rahman, S.

    1982-01-01

    The development of mathematical models suitable for minimum-weight boost and buck-boost converter designs is presented. The facility of an augmented Lagrangian (ALAG) multiplier-based nonlinear programming technique is demonstrated for minimum-weight design optimizations of boost and buck-boost power converters. ALAG-based computer simulation results for those two minimum-weight designs are discussed. Certain important features of ALAG are presented in the framework of a comprehensive design example for boost and buck-boost power converter design optimization. The study provides design insight into power converters and presents such information as the weight and loss profiles of various semiconductor components and magnetics as a function of the switching frequency.
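
    The core of an augmented-Lagrangian (ALAG) design optimization can be sketched with a toy trade-off: component "weight" falls with switching frequency while switching losses rise, and the loss is held to a budget. The model functions, constants, and variable names below are invented for illustration; only the multiplier loop itself is the textbook ALAG technique.

```python
import numpy as np
from scipy.optimize import minimize

def augmented_lagrangian(f, g, u0, rho=10.0, iters=30):
    """Textbook augmented-Lagrangian loop for one equality constraint
    g(u) = 0: minimize f + lam*g + (rho/2)*g**2, then update lam."""
    u, lam = np.asarray(u0, float), 0.0
    for _ in range(iters):
        res = minimize(lambda v: f(v) + lam * g(v) + 0.5 * rho * g(v) ** 2,
                       u, method="BFGS")
        u = res.x
        lam += rho * g(u)        # multiplier update
    return u

# Toy converter model, in log-space so both variables stay positive.
def unpack(u):
    return np.exp(u[0]), np.exp(u[1])   # switching frequency, capacitor size

def weight(u):                           # magnetics shrink as frequency rises
    fsw, cap = unpack(u)
    return 1.0 / fsw + 0.1 * fsw + cap

def loss_excess(u):                      # switching loss grows; hold to a budget
    fsw, cap = unpack(u)
    return 0.05 * fsw + 1.0 / cap - 1.0

u_opt = augmented_lagrangian(weight, loss_excess, u0=[1.0, 1.0])
```

    At convergence the loss constraint is met to numerical tolerance, and the final multiplier approximates the marginal weight cost of tightening the loss budget — the kind of design insight the abstract refers to.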

  19. Kentucky geotechnical database.

    DOT National Transportation Integrated Search

    2005-03-01

    Development of a comprehensive dynamic, geotechnical database is described. Computer software selected to program the client/server application in windows environment, components and structure of the geotechnical database, and primary factors cons...

  20. Experimental and computational surface and flow-field results for an all-body hypersonic aircraft

    NASA Technical Reports Server (NTRS)

    Lockman, William K.; Lawrence, Scott L.; Cleary, Joseph W.

    1990-01-01

    The objective of the present investigation is to establish a benchmark experimental data base for a generic hypersonic vehicle shape for validation and/or calibration of advanced computational fluid dynamics computer codes. This paper includes results from the comprehensive test program conducted in the NASA/Ames 3.5-foot Hypersonic Wind Tunnel for a generic all-body hypersonic aircraft model. Experimental and computational results on flow visualization, surface pressures, surface convective heat transfer, and pitot-pressure flow-field surveys are presented. Comparisons of the experimental results with computational results from an upwind parabolized Navier-Stokes code developed at Ames demonstrate the capabilities of this code.

  1. A demonstrative model of a lunar base simulation on a personal computer

    NASA Technical Reports Server (NTRS)

    1985-01-01

    The initial demonstration model of a lunar base simulation is described. This initial model was developed on the personal computer level to demonstrate feasibility and technique before proceeding to a larger computer-based model. Lotus Symphony Version 1.1 software was used to base the demonstration model on a personal computer with an MS-DOS operating system. The personal computer-based model determined the applicability of lunar base modeling techniques developed at an LSPI/NASA workshop. In addition, the personal computer-based demonstration model defined a modeling structure that could be employed on a larger, more comprehensive VAX-based lunar base simulation. Refinement of this personal computer model and the development of a VAX-based model are planned in the near future.

  2. Cognitive Modeling of Individual Variation in Reference Production and Comprehension

    PubMed Central

    Hendriks, Petra

    2016-01-01

    A challenge for most theoretical and computational accounts of linguistic reference is the observation that language users vary considerably in their referential choices. Part of the variation observed among and within language users and across tasks may be explained from variation in the cognitive resources available to speakers and listeners. This paper presents a computational model of reference production and comprehension developed within the cognitive architecture ACT-R. Through simulations with this ACT-R model, it is investigated how cognitive constraints interact with linguistic constraints and features of the linguistic discourse in speakers’ production and listeners’ comprehension of referring expressions in specific tasks, and how this interaction may give rise to variation in referential choice. The ACT-R model of reference explains and predicts variation among language users in their referential choices as a result of individual and task-related differences in processing speed and working memory capacity. Because of limitations in their cognitive capacities, speakers sometimes underspecify or overspecify their referring expressions, and listeners sometimes choose incorrect referents or are overly liberal in their interpretation of referring expressions. PMID:27092101

  3. Environmental Testing and Thermal Analysis of the NPS Solar Cell Array Tester (NPS-SCAT) CubeSat

    DTIC Science & Technology

    2011-06-01

    BCR Battery Charge Regulator C&DH Command and Data Handling CAD Computer Aided Design CDR Critical Design Review CFT Comprehensive Functional Test ...CPT Comprehensive Performance Test CoM Center of Mass COTS Commercial Off-the-Shelf CTB Cargo Transfer Bag EDU Engineering Design Unit EPS...and inexpensive solution. 2 C. ENVIRONMENTAL TESTING Environmental testing is an important element of the design and testing of a satellite. By

  4. 3D Human Motion Editing and Synthesis: A Survey

    PubMed Central

    Wang, Xin; Chen, Qiudi; Wang, Wanliang

    2014-01-01

    The ways to compute the kinematics and dynamic quantities of human bodies in motion have been studied in many biomedical papers. This paper presents a comprehensive survey of 3D human motion editing and synthesis techniques. Firstly, four types of methods for 3D human motion synthesis are introduced and compared. Secondly, motion capture data representation, motion editing, and motion synthesis are reviewed successively. Finally, future research directions are suggested. PMID:25045395

  5. NASA Headquarters training catalog

    NASA Technical Reports Server (NTRS)

    1990-01-01

    The NASA Headquarters training catalog is a comprehensive listing of all educational and employee development programs. This course catalog contains descriptions of course content, objectives, target audience, prerequisites, length of course, approximate number of times the course is offered per year, and cost of the course. Curriculum areas include graduate and undergraduate academic study; professional development program; and executive management, senior management, and supervisory development programs. Secretarial/clerical and general computer skills programs are also included.

  6. Computer-Assisted Literacy Instruction in Phonics,

    DTIC Science & Technology

    1980-04-01

    below 4.5, as measured on the Gates-MacGinitie reading test and poor word attack skills, as measured by the Wide Range Achievement Test (WRAT), Level...see and hear the words they were to pronounce, (2) to request that the synthesizer repronounce words, and (3) to sound out words in isolation and in...continued with the remaining 3 weeks of the ART Program, which covered vocabulary development, reading comprehension, and study skills. The RGLs of both

  7. DGCA: A comprehensive R package for Differential Gene Correlation Analysis.

    PubMed

    McKenzie, Andrew T; Katsyv, Igor; Song, Won-Min; Wang, Minghui; Zhang, Bin

    2016-11-15

    Dissecting the regulatory relationships between genes is a critical step towards building accurate predictive models of biological systems. A powerful approach towards this end is to systematically study the differences in correlation between gene pairs in more than one distinct condition. In this study we develop an R package, DGCA (for Differential Gene Correlation Analysis), which offers a suite of tools for computing and analyzing differential correlations between gene pairs across multiple conditions. To minimize parametric assumptions, DGCA computes empirical p-values via permutation testing. To understand differential correlations at a systems level, DGCA performs higher-order analyses such as measuring the average difference in correlation and multiscale clustering analysis of differential correlation networks. Through a simulation study, we show that the straightforward z-score based method that DGCA employs significantly outperforms the existing alternative methods for calculating differential correlation. Application of DGCA to the TCGA RNA-seq data in breast cancer not only identifies key changes in the regulatory relationships between TP53 and PTEN and their target genes in the presence of inactivating mutations, but also reveals an immune-related differential correlation module that is specific to triple negative breast cancer (TNBC). DGCA is an R package for systematically assessing the difference in gene-gene regulatory relationships under different conditions. This user-friendly, effective, and comprehensive software tool will greatly facilitate the application of differential correlation analysis in many biological studies and thus will help identification of novel signaling pathways, biomarkers, and targets in complex biological systems and diseases.
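
    The z-score method the abstract credits DGCA with can be sketched directly: Fisher-transform each condition's gene-pair correlation, scale the difference by its standard error, and calibrate it against an empirical permutation null. The function below is a generic illustration of that statistic under invented names, not DGCA's actual code or API.

```python
import numpy as np

def diff_corr_z(x, y, cond, n_perm=1000, seed=0):
    """Differential correlation of genes x and y between two conditions
    (cond is a boolean mask), as a Fisher z difference plus an empirical
    p-value from label permutation."""
    rng = np.random.default_rng(seed)
    x, y, cond = np.asarray(x), np.asarray(y), np.asarray(cond, dtype=bool)

    def fisher_z(a, b):
        return np.arctanh(np.corrcoef(a, b)[0, 1])

    def z_diff(labels):
        za = fisher_z(x[labels], y[labels])
        zb = fisher_z(x[~labels], y[~labels])
        # standard error of a difference of Fisher z-transformed correlations
        se = np.sqrt(1.0 / (labels.sum() - 3) + 1.0 / ((~labels).sum() - 3))
        return (za - zb) / se

    observed = z_diff(cond)
    # permutation null: shuffle condition labels, recompute the statistic
    null = np.array([z_diff(rng.permutation(cond)) for _ in range(n_perm)])
    p = (np.sum(np.abs(null) >= np.abs(observed)) + 1) / (n_perm + 1)
    return observed, p
```

    Permuting condition labels rather than relying on the asymptotic normal distribution is what keeps the test's parametric assumptions minimal, as the abstract emphasizes.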

  8. Computation of UH-60A Airloads Using CFD/CSD Coupling on Unstructured Meshes

    NASA Technical Reports Server (NTRS)

    Biedron, Robert T.; Lee-Rausch, Elizabeth M.

    2011-01-01

    An unsteady Reynolds-averaged Navier-Stokes solver for unstructured grids is used to compute the rotor airloads on the UH-60A helicopter at high-speed and high thrust conditions. The flow solver is coupled to a rotorcraft comprehensive code in order to account for trim and aeroelastic deflections. Simulations are performed both with and without the fuselage, and the effects of grid resolution, temporal resolution and turbulence model are examined. Computed airloads are compared to flight data.

  9. CCSI and the role of advanced computing in accelerating the commercial deployment of carbon capture systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Miller, David; Agarwal, Deborah A.; Sun, Xin

    2011-09-01

    The Carbon Capture Simulation Initiative is developing state-of-the-art computational modeling and simulation tools to accelerate the commercialization of carbon capture technology. The CCSI Toolset consists of an integrated multi-scale modeling and simulation framework, which includes extensive use of reduced order models (ROMs) and a comprehensive uncertainty quantification (UQ) methodology. This paper focuses on the interrelation among high performance computing, detailed device simulations, ROMs for scale-bridging, UQ and the integration framework.

  10. CCSI and the role of advanced computing in accelerating the commercial deployment of carbon capture systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Miller, D.; Agarwal, D.; Sun, X.

    2011-01-01

    The Carbon Capture Simulation Initiative is developing state-of-the-art computational modeling and simulation tools to accelerate the commercialization of carbon capture technology. The CCSI Toolset consists of an integrated multi-scale modeling and simulation framework, which includes extensive use of reduced order models (ROMs) and a comprehensive uncertainty quantification (UQ) methodology. This paper focuses on the interrelation among high performance computing, detailed device simulations, ROMs for scale-bridging, UQ and the integration framework.

  11. Distributed Computer Networks in Support of Complex Group Practices

    PubMed Central

    Wess, Bernard P.

    1978-01-01

    The economics of medical computer networks are presented in context with the patient care and administrative goals of medical networks. Design alternatives and network topologies are discussed with an emphasis on medical network design requirements in distributed data base design, telecommunications, satellite systems, and software engineering. The success of the medical computer networking technology is predicated on the ability of medical and data processing professionals to design comprehensive, efficient, and virtually impenetrable security systems to protect data bases, network access and services, and patient confidentiality.

  12. Text Exchange System

    NASA Technical Reports Server (NTRS)

    Snyder, W. V.; Hanson, R. J.

    1986-01-01

    Text Exchange System (TES) exchanges and maintains organized textual information including source code, documentation, data, and listings. System consists of two computer programs and definition of format for information storage. Comprehensive program used to create, read, and maintain TES files. TES developed to meet three goals: First, easy and efficient exchange of programs and other textual data between similar and dissimilar computer systems via magnetic tape. Second, provide transportable management system for textual information. Third, provide common user interface, over wide variety of computing systems, for all activities associated with text exchange.

  13. Finite-difference computations of rotor loads

    NASA Technical Reports Server (NTRS)

    Caradonna, F. X.; Tung, C.

    1985-01-01

    This paper demonstrates the current and future potential of finite-difference methods for solving real rotor problems which now rely largely on empiricism. The demonstration consists of a simple means of combining existing finite-difference, integral, and comprehensive loads codes to predict real transonic rotor flows. These computations are performed for hover and high-advance-ratio flight. Comparisons are made with experimental pressure data.

  14. Finite-difference computations of rotor loads

    NASA Technical Reports Server (NTRS)

    Caradonna, F. X.; Tung, C.

    1985-01-01

    The current and future potential of finite-difference methods for solving real rotor problems, which now rely largely on empiricism, is demonstrated. The demonstration consists of a simple means of combining existing finite-difference, integral, and comprehensive loads codes to predict real transonic rotor flows. These computations are performed for hover and high-advance-ratio flight. Comparisons are made with experimental pressure data.

  15. The Effect of Dynamic Assessment in Synchronous Computer-Mediated Communication on Iranian EFL Learners' Listening Comprehension Ability at Upper-Intermediate Level

    ERIC Educational Resources Information Center

    Heidar, Davood Mashhadi; Afghari, Akbar

    2015-01-01

    The present paper concentrates on a web-based inquiry in the synchronous computer-mediated communication (SCMC) via Web 2.0 technologies of Talk and Write and Skype. It investigates EFL learners' socio-cognitive progress through dynamic assessment (DA), which follows Vygotsky's inclination for supportive interchange in the zone of proximal…

  16. A Comparison of Success and Failure Rates between Computer-Assisted and Traditional College Algebra Sections

    ERIC Educational Resources Information Center

    Herron, Sherry; Gandy, Rex; Ye, Ningjun; Syed, Nasser

    2012-01-01

    A unique aspect of the implementation of a computer algebra system (CAS) at a comprehensive university in the U.S. allowed us to compare the student success and failure rates to the traditional method of teaching college algebra. Due to space limitations, the university offered sections of both CAS and traditional simultaneously and, upon…

  17. The Computer as an Aid to Reading Instruction. Learning Package No. 27.

    ERIC Educational Resources Information Center

    Simic, Marge, Comp.; Smith, Carl, Ed.

    Originally developed for the Department of Defense Schools (DoDDS) system, this learning package on computer use in reading is designed for teachers who wish to upgrade or expand their teaching skills on their own. The package includes an overview of the project; a comprehensive search of the ERIC database; a lecture giving an overview on the…

  18. A Comprehensive Review of Computer Science and Data Processing Education in Community Colleges and Area Vocational-Technical Centers.

    ERIC Educational Resources Information Center

    Florida State Community Coll. Coordinating Board, Tallahassee.

    In 1987-88, the Florida State Board of Community Colleges and the Division of Vocational, Adult, and Community Education jointly conducted a review of instructional programs in computer science and data processing in order to determine needs for state policy changes and funding priorities. The process involved a review of printed resources on…

  19. USSR Report, Kommunist, No. 13, September 1986.

    DTIC Science & Technology

    1987-01-07

    all-union) program for specialization of NPO and industrial enterprises and their scientific research institutes and design bureaus could play a major...machine tools with numerical programming (ChPU), processing centers, automatic machines and groups of automatic machines controlled by computers, and...automatic lines, computer-controlled groups of equipment, comprehensively automated shops and sections) is the most important feature of high technical

  20. Computer Science Education in Secondary Schools--The Introduction of a New Compulsory Subject

    ERIC Educational Resources Information Center

    Hubwieser, Peter

    2012-01-01

    In 2004 the German state of Bavaria introduced a new compulsory subject of computer science (CS) in its grammar schools ("Gymnasium"). The subject is based on a comprehensive teaching concept that was developed by the author and his colleagues during the years 1995-2000. It comprises mandatory courses in grades 6/7 for all students of…

  1. The Use of Computer-Based Simulation to Aid Comprehension and Incidental Vocabulary Learning

    ERIC Educational Resources Information Center

    Mohsen, Mohammed Ali

    2016-01-01

    One of the main issues in language learning is to find ways to enable learners to interact with the language input in an involved task. Given that computer-based simulation allows learners to interact with visual modes, this article examines how the interaction of students with an online video simulation affects their second language video…

  2. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hussain, Hameed; Malik, Saif Ur Rehman; Hameed, Abdul

    Efficient resource allocation is a fundamental requirement in high-performance computing (HPC) systems. Many projects dedicated to large-scale distributed computing systems have designed and developed resource allocation mechanisms with a variety of architectures and services. In our study, a comprehensive survey describing resource allocation in various HPC systems is reported. The aim of the work is to aggregate, under a joint framework, the existing solutions for HPC and to provide a thorough analysis of the characteristics of the resource management and allocation strategies. Resource allocation mechanisms and strategies play a vital role in the performance improvement of all the HPC classifications. Therefore, a comprehensive discussion of widely used resource allocation strategies deployed in HPC environments is required, which is one of the motivations of this survey. Moreover, we have classified the HPC systems into three broad categories, namely (a) cluster, (b) grid, and (c) cloud systems, and define the characteristics of each class by extracting sets of common attributes. All of the aforementioned systems are cataloged into pure software and hybrid/hardware solutions. The system classification is used to identify approaches followed by the implementation of existing resource allocation strategies that are widely presented in the literature.

  3. Optimization of affinity, specificity and function of designed influenza inhibitors using deep sequencing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Whitehead, Timothy A.; Chevalier, Aaron; Song, Yifan

    2012-06-19

    We show that comprehensive sequence-function maps obtained by deep sequencing can be used to reprogram interaction specificity and to leapfrog over bottlenecks in affinity maturation by combining many individually small contributions not detectable in conventional approaches. We use this approach to optimize two computationally designed inhibitors against H1N1 influenza hemagglutinin and, in both cases, obtain variants with subnanomolar binding affinity. The most potent of these, a 51-residue protein, is broadly cross-reactive against all influenza group 1 hemagglutinins, including human H2, and neutralizes H1N1 viruses with a potency that rivals that of several human monoclonal antibodies, demonstrating that computational design followed by comprehensive energy landscape mapping can generate proteins with potential therapeutic utility.
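
    The basic quantity behind such a sequence-function map is the per-variant enrichment between deep-sequencing counts before and after selection. A minimal, hypothetical sketch follows; the pseudocount and normalization are illustrative choices, not the paper's actual pipeline.

```python
import numpy as np

def enrichment_map(counts_before, counts_after, pseudo=0.5):
    """Per-variant log2 enrichment from deep-sequencing read counts
    before and after selection. A pseudocount keeps variants that drop
    out of the selected pool finite."""
    b = np.asarray(counts_before, float) + pseudo
    a = np.asarray(counts_after, float) + pseudo
    # normalize to frequencies, then compare pools
    return np.log2((a / a.sum()) / (b / b.sum()))
```

    Aggregating these scores per position and substitution yields the map of individually small contributions that, combined, reprogram specificity or boost affinity.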

  4. Distinct Neurocognitive Strategies for Comprehensions of Human and Artificial Intelligence

    PubMed Central

    Ge, Jianqiao; Han, Shihui

    2008-01-01

    Although humans have inevitably interacted with both human and artificial intelligence in real life situations, it is unknown whether the human brain engages homologous neurocognitive strategies to cope with both forms of intelligence. To investigate this, we scanned subjects, using functional MRI, while they inferred the reasoning processes conducted by human agents or by computers. We found that the inference of reasoning processes conducted by human agents but not by computers induced increased activity in the precuneus but decreased activity in the ventral medial prefrontal cortex and enhanced functional connectivity between the two brain areas. The findings provide evidence for distinct neurocognitive strategies of taking others' perspective and inhibiting the process referenced to the self that are specific to the comprehension of human intelligence. PMID:18665211

  5. Non-intrusive uncertainty quantification of computational fluid dynamics simulations: notes on the accuracy and efficiency

    NASA Astrophysics Data System (ADS)

    Zimoń, Małgorzata; Sawko, Robert; Emerson, David; Thompson, Christopher

    2017-11-01

    Uncertainty quantification (UQ) is increasingly becoming an indispensable tool for assessing the reliability of computational modelling. Efficient handling of stochastic inputs, such as boundary conditions, physical properties or geometry, increases the utility of model results significantly. We discuss the application of non-intrusive generalised polynomial chaos techniques in the context of fluid engineering simulations. Deterministic and Monte Carlo integration rules are applied to a set of problems, including ordinary differential equations and the computation of aerodynamic parameters subject to random perturbations. In particular, we analyse acoustic wave propagation in a heterogeneous medium to study the effects of mesh resolution, transients, number and variability of stochastic inputs. We consider variants of multi-level Monte Carlo and perform a novel comparison of the methods with respect to numerical and parametric errors, as well as computational cost. The results provide a comprehensive view of the necessary steps in UQ analysis and demonstrate some key features of stochastic fluid flow systems.
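In the non-intrusive setting described above, the solver is treated as a black box and the integration over stochastic inputs happens externally. A minimal sketch (illustrative only, not the authors' code) comparing Monte Carlo sampling with a deterministic Gauss-Legendre rule for a toy model with one uncertain input:

```python
import numpy as np

# Illustrative sketch (not the authors' code): non-intrusive UQ of a toy
# black-box model y(k) = exp(-k) with uncertain input k ~ Uniform(0.5, 1.5).
def model(k):
    return np.exp(-k)

rng = np.random.default_rng(0)

# Monte Carlo integration: sample the input, average the outputs.
k_samples = rng.uniform(0.5, 1.5, size=100_000)
mc_mean = model(k_samples).mean()

# Deterministic Gauss-Legendre rule over the same input range.
nodes, weights = np.polynomial.legendre.leggauss(5)
k_nodes = 1.0 + 0.5 * nodes              # map [-1, 1] onto [0.5, 1.5]
quad_mean = np.sum(0.5 * weights * model(k_nodes))

# Analytic reference: E[exp(-k)] = exp(-0.5) - exp(-1.5) for this density.
exact = np.exp(-0.5) - np.exp(-1.5)
assert abs(quad_mean - exact) < 1e-6
assert abs(mc_mean - exact) < 1e-2
```

For this smooth one-dimensional input, five quadrature nodes already beat 100,000 random samples, which is the efficiency trade-off the abstract's comparison of deterministic and Monte Carlo rules explores.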

  6. Breast tumor malignancy modelling using evolutionary neural logic networks.

    PubMed

    Tsakonas, Athanasios; Dounias, Georgios; Panagi, Georgia; Panourgias, Evangelia

    2006-01-01

    The present work proposes a computer-assisted methodology for the effective modelling of the diagnostic decision for breast tumor malignancy. The suggested approach is based on innovative hybrid computational intelligence algorithms applied to related cytological data contained in past medical records. The experimental data used in this study were gathered in the early 1990s at the University of Wisconsin, based on post-diagnostic cytological observations performed by expert medical staff. The data were encoded in a computer database, and various alternative modelling techniques were then applied to them in an attempt to form diagnostic models. Previous methods included standard optimisation techniques as well as artificial intelligence approaches, and a variety of related publications exists in the modern literature on the subject. In this report, a hybrid computational intelligence approach is suggested, which combines modern mathematical logic principles, neural computation and genetic programming in an effective manner. The approach proves promising both in terms of diagnostic accuracy and generalization capabilities and in terms of comprehensibility and practical importance for the related medical staff.

  7. A Pilot Study of Biomedical Text Comprehension using an Attention-Based Deep Neural Reader: Design and Experimental Analysis.

    PubMed

    Kim, Seongsoon; Park, Donghyeon; Choi, Yonghwa; Lee, Kyubum; Kim, Byounggun; Jeon, Minji; Kim, Jihye; Tan, Aik Choon; Kang, Jaewoo

    2018-01-05

    With the development of artificial intelligence (AI) technology centered on deep-learning, the computer has evolved to a point where it can read a given text and answer a question based on the context of the text. Such a specific task is known as the task of machine comprehension. Existing machine comprehension tasks mostly use datasets of general texts, such as news articles or elementary school-level storybooks. However, no attempt has been made to determine whether an up-to-date deep learning-based machine comprehension model can also process scientific literature containing expert-level knowledge, especially in the biomedical domain. This study aims to investigate whether a machine comprehension model can process biomedical articles as well as general texts. Since there is no dataset for the biomedical literature comprehension task, our work includes generating a large-scale question answering dataset using PubMed and manually evaluating the generated dataset. We present an attention-based deep neural model tailored to the biomedical domain. To further enhance the performance of our model, we used a pretrained word vector and biomedical entity type embedding. We also developed an ensemble method of combining the results of several independent models to reduce the variance of the answers from the models. The experimental results showed that our proposed deep neural network model outperformed the baseline model by more than 7% on the new dataset. We also evaluated human performance on the new dataset. The human evaluation result showed that our deep neural model outperformed humans in comprehension by 22% on average. In this work, we introduced a new task of machine comprehension in the biomedical domain using a deep neural model. 
Since there was no large-scale dataset for training deep neural models in the biomedical domain, we created the new cloze-style datasets Biomedical Knowledge Comprehension Title (BMKC_T) and Biomedical Knowledge Comprehension Last Sentence (BMKC_LS) (together referred to as BioMedical Knowledge Comprehension) using the PubMed corpus. The experimental results showed that the performance of our model is much higher than that of humans. We observed that our model performed consistently better regardless of the degree of difficulty of a text, whereas humans have difficulty when performing biomedical literature comprehension tasks that require expert level knowledge. ©Seongsoon Kim, Donghyeon Park, Yonghwa Choi, Kyubum Lee, Byounggun Kim, Minji Jeon, Jihye Kim, Aik Choon Tan, Jaewoo Kang. Originally published in JMIR Medical Informatics (http://medinform.jmir.org), 05.01.2018.
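A cloze-style item of the kind described for BMKC_T blanks out an entity in a paper's title and asks the model to recover it from the abstract. A minimal sketch (hypothetical helper and example text, not the actual BMKC generation pipeline):

```python
# Illustrative sketch (hypothetical helper, not the BMKC pipeline): build a
# cloze-style question by masking the answer entity in a title, with the
# abstract serving as the reading-comprehension context.
def make_cloze(title, answer_entity, placeholder="_____"):
    """Turn a title into a cloze question by blanking the answer entity."""
    assert answer_entity in title, "entity must occur in the title"
    return title.replace(answer_entity, placeholder)

# Toy example: the model must read the context and fill in the blank.
context = ("Metformin reduces hepatic glucose production and improves "
           "insulin sensitivity in patients with type 2 diabetes.")
question = make_cloze("Effect of metformin on hepatic glucose production",
                      "metformin")
assert question == "Effect of _____ on hepatic glucose production"
```

Generating items this way from the PubMed corpus requires no manual annotation, which is what makes a large-scale dataset feasible; the manual evaluation mentioned above then checks that the generated items are answerable.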

  8. On aerodynamic wake analysis and its relation to total aerodynamic drag in a wind tunnel environment

    NASA Astrophysics Data System (ADS)

    Guterres, Rui M.

    The present work was developed with the goal of advancing the state of the art in applying three-dimensional wake data analysis to the quantification of aerodynamic drag on a body in a low-speed wind tunnel environment. An analysis of the existing tools, their strengths, and their limitations is presented. Improvements to the existing analysis approaches were made, and software tools were developed to integrate the analysis into a practical tool. A comprehensive derivation of the equations needed for drag computations based on three-dimensional separated wake data is developed, presenting a complete set of steps ranging from the basic mathematical concept to the applicable engineering equations. An extensive experimental study was conducted: three representative body types were studied in varying ground effect conditions, and a detailed qualitative wake analysis using wake imaging and two- and three-dimensional flow visualization was performed. Several significant features of the flow were identified and their relation to the total aerodynamic drag established. A comprehensive wake study of this type is shown to be in itself a powerful tool for analyzing the wake aerodynamics and its relation to body drag. Quantitative wake analysis techniques were developed, along with significant post-processing, data-conditioning, and precision-analysis tools. The quality of the data is shown to correlate directly with the accuracy of the computed aerodynamic drag. Steps are taken to identify the sources of uncertainty; these are quantified where possible, and the accuracy of the computed results is seen to improve significantly. When post-processing alone does not resolve issues of precision and accuracy, solutions are proposed. The improved quantitative wake analysis is applied to the wake data obtained, and guidelines are established that will lead to more successful implementation of these tools in future research programs. Close attention is paid to implementation issues that are of crucial importance for the accuracy of the results and that are not detailed in the literature. The impact of ground effect on the flows at hand is studied qualitatively and quantitatively; its effect on the accuracy of the computations, as well as the incompatibility of wall drag with the theoretical model followed, is discussed. The newly developed quantitative analysis provides significantly increased accuracy: for the best cases, the aerodynamic drag coefficient is computed to within one percent of the balance-measured value.
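The core of a wake-survey drag computation is the textbook momentum-deficit relation, D = ρ ∬ u (U∞ − u) dA, evaluated over the measured survey plane. A minimal sketch (illustrative only, using a synthetic Gaussian wake rather than the thesis data):

```python
import numpy as np

# Illustrative sketch (textbook wake-survey relation, not the thesis software):
# profile drag from the momentum deficit in a downstream survey plane,
#   D = rho * integral of u * (U_inf - u) dA,
# discretised here as a simple sum over grid cells.
rho, U_inf = 1.225, 30.0                 # air density (kg/m^3), freestream (m/s)
y = np.linspace(-0.5, 0.5, 101)          # survey-plane coordinates (m)
z = np.linspace(-0.5, 0.5, 101)
Y, Z = np.meshgrid(y, z)

# Synthetic Gaussian velocity deficit standing in for measured wake data.
u = U_inf * (1.0 - 0.3 * np.exp(-(Y**2 + Z**2) / 0.02))

dA = (y[1] - y[0]) * (z[1] - z[0])       # grid-cell area
drag = rho * np.sum(u * (U_inf - u)) * dA
assert drag > 0.0
```

In practice, as the abstract emphasizes, the accuracy of the integrated drag hinges on data conditioning: survey-grid resolution, truncation of the integration domain, and measurement noise all enter the sum directly.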

  9. PCTFPeval: a web tool for benchmarking newly developed algorithms for predicting cooperative transcription factor pairs in yeast.

    PubMed

    Lai, Fu-Jou; Chang, Hong-Tsun; Wu, Wei-Sheng

    2015-01-01

    Computational identification of cooperative transcription factor (TF) pairs helps understand the combinatorial regulation of gene expression in eukaryotic cells. Many advanced algorithms have been proposed to predict cooperative TF pairs in yeast. However, it is still difficult to conduct a comprehensive and objective performance comparison of different algorithms because of the lack of sufficient performance indices and adequate overall performance scores. To solve this problem, in our previous study (published in BMC Systems Biology 2014), we adopted/proposed eight performance indices and designed two overall performance scores to compare the performance of 14 existing algorithms for predicting cooperative TF pairs in yeast. Most importantly, our performance comparison framework can be applied to comprehensively and objectively evaluate the performance of a newly developed algorithm. However, to use our framework, researchers have to put in considerable effort to construct it first. To save researchers time and effort, we here develop a web tool that implements our performance comparison framework, featuring fast data processing, a comprehensive performance comparison and an easy-to-use web interface. The developed tool is called PCTFPeval (Predicted Cooperative TF Pair evaluator), written in the PHP and Python programming languages. The friendly web interface allows users to input a list of predicted cooperative TF pairs from their algorithm and select (i) the compared algorithms among the 15 existing algorithms, (ii) the performance indices among the eight existing indices, and (iii) the overall performance scores from two possible choices. The comprehensive performance comparison results are then generated in tens of seconds and shown as both bar charts and tables. The original comparison results for each compared algorithm and each selected performance index can be downloaded as text files for further analyses.
Allowing users to select eight existing performance indices and 15 existing algorithms for comparison, our web tool benefits researchers who are eager to comprehensively and objectively evaluate the performance of their newly developed algorithm. Thus, our tool greatly expedites the progress in the research of computational identification of cooperative TF pairs.
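At bottom, any such performance index compares a predicted pair list against a benchmark set of known cooperative pairs, treating TF pairs as unordered. A minimal sketch of one simple index (illustrative; the TF names and the precision index shown here are examples, not PCTFPeval's actual benchmarks):

```python
# Illustrative sketch (not PCTFPeval itself): one simple performance index,
# precision of predicted cooperative TF pairs against a benchmark set.
def pair_key(tf1, tf2):
    return tuple(sorted((tf1, tf2)))  # TF pairs are unordered

def precision(predicted, benchmark):
    pred = {pair_key(*p) for p in predicted}
    bench = {pair_key(*p) for p in benchmark}
    return len(pred & bench) / len(pred)

# Toy example with hypothetical yeast TF pairs.
predicted = [("MBP1", "SWI6"), ("SWI4", "SWI6"), ("GAL4", "GAL80")]
benchmark = [("SWI6", "MBP1"), ("SWI4", "SWI6"), ("FKH2", "NDD1")]
assert precision(predicted, benchmark) == 2 / 3
```

Normalizing pair order before set comparison is the small but essential detail: (MBP1, SWI6) and (SWI6, MBP1) must count as the same prediction.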

  11. Computer-Aided Design/Computer-Assisted Manufacture Monolithic Restorations for Severely Worn Dentition: A Case History Report.

    PubMed

    Abou-Ayash, Samir; Boldt, Johannes; Vuck, Alexander

    Full-arch rehabilitation of patients with severe tooth wear due to parafunctional behavior is a challenge for dentists and dental technicians, especially when a highly esthetic outcome is desired. A variety of treatment options and prosthetic materials are available for such a clinical undertaking. The ongoing progress of computer-aided design/computer-assisted manufacture technologies, in combination with all-ceramic materials, provides a predictable workflow for these complex cases. This case history report describes a comprehensive, step-by-step treatment protocol leading to an optimal, predictable treatment outcome for an esthetically compromised patient.

  12. Comprehensive Digital Imaging Network Project At Georgetown University Hospital

    NASA Astrophysics Data System (ADS)

    Mun, Seong K.; Stauffer, Douglas; Zeman, Robert; Benson, Harold; Wang, Paul; Allman, Robert

    1987-10-01

    The practice of radiology is going through rapid changes due to the introduction of state-of-the-art computer-based technologies. Over the last twenty years we have witnessed the introduction of many new medical diagnostic imaging systems, such as x-ray computed tomography, digital subtraction angiography (DSA), computerized nuclear medicine, single photon emission computed tomography (SPECT), positron emission tomography (PET) and, more recently, computerized digital radiography and nuclear magnetic resonance imaging (MRI). Beyond the imaging systems themselves, there has been a steady introduction of computer-based information systems for radiology departments and hospitals.

  13. A cost-utility analysis of the use of preoperative computed tomographic angiography in abdomen-based perforator flap breast reconstruction.

    PubMed

    Offodile, Anaeze C; Chatterjee, Abhishek; Vallejo, Sergio; Fisher, Carla S; Tchou, Julia C; Guo, Lifei

    2015-04-01

    Computed tomographic angiography is a diagnostic tool increasingly used for preoperative vascular mapping in abdomen-based perforator flap breast reconstruction. This study compared the use of computed tomographic angiography with the conventional practice of Doppler ultrasonography only in postmastectomy reconstruction, using a cost-utility model. Following a comprehensive literature review, a decision analytic model was created using the three most clinically relevant health outcomes in free autologous breast reconstruction with computed tomographic angiography versus Doppler ultrasonography only. Cost and utility estimates for each health outcome were used to derive the quality-adjusted life-years and the incremental cost-utility ratio. One-way sensitivity analysis was performed to scrutinize the robustness of the authors' results. Six studies and 782 patients were identified. Cost-utility analysis revealed a baseline cost savings of $3179 and a gain in quality-adjusted life-years of 0.25. This yielded an incremental cost-utility ratio of -$12,716, implying a dominant choice favoring preoperative computed tomographic angiography. Sensitivity analysis revealed that computed tomographic angiography was costlier when the operative time difference between the two techniques was less than 21.3 minutes. However, the clinical advantage of computed tomographic angiography over Doppler ultrasonography only meant that computed tomographic angiography would remain the cost-effective option even if it offered no operating-time advantage. The authors' results show that computed tomographic angiography is a cost-effective technology for identifying lower abdominal perforators for autologous breast reconstruction. Although the ideal study would be a randomized controlled trial of the two approaches with true cost accrual, the authors' results represent the best available evidence.
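The reported ratio follows directly from the definition of the incremental cost-utility ratio, incremental cost divided by incremental QALYs; a negative ratio arising from lower cost and higher QALYs marks a dominant strategy:

```python
# Worked check of the reported incremental cost-utility ratio (ICUR),
# using the figures from the abstract.
incremental_cost = -3179.0    # CTA saves $3179 versus Doppler ultrasonography
incremental_qalys = 0.25      # QALY gain with CTA
icur = incremental_cost / incremental_qalys
assert icur == -12716.0       # matches the reported dominant ICUR of -$12,716
```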

  14. Social, Organizational, and Contextual Characteristics of Clinical Decision Support Systems for Intensive Insulin Therapy: A Literature Review and Case Study

    PubMed Central

    Campion, Thomas R.; Waitman, Lemuel R.; May, Addison K.; Ozdas, Asli; Lorenzi, Nancy M.; Gadd, Cynthia S.

    2009-01-01

    Introduction: Evaluations of computerized clinical decision support systems (CDSS) typically focus on clinical performance changes and do not include social, organizational, and contextual characteristics explaining use and effectiveness. Studies of CDSS for intensive insulin therapy (IIT) are no exception, and the literature lacks an understanding of effective computer-based IIT implementation and operation. Results: This paper presents (1) a literature review of computer-based IIT evaluations through the lens of institutional theory, a discipline from sociology and organization studies, to demonstrate the inconsistent reporting of workflow and care process execution and (2) a single-site case study to illustrate how computer-based IIT requires substantial organizational change and creates additional complexity with unintended consequences including error. Discussion: Computer-based IIT requires organizational commitment and attention to site-specific technology, workflow, and care processes to achieve intensive insulin therapy goals. The complex interaction between clinicians, blood glucose testing devices, and CDSS may contribute to workflow inefficiency and error. Evaluations rarely focus on the perspective of nurses, the primary users of computer-based IIT whose knowledge can potentially lead to process and care improvements. Conclusion: This paper addresses a gap in the literature concerning the social, organizational, and contextual characteristics of CDSS in general and for intensive insulin therapy specifically. Additionally, this paper identifies areas for future research to define optimal computer-based IIT process execution: the frequency and effect of manual data entry error of blood glucose values, the frequency and effect of nurse overrides of CDSS insulin dosing recommendations, and comprehensive ethnographic study of CDSS for IIT. PMID:19815452

  15. Knowledge-driven computational modeling in Alzheimer's disease research: Current state and future trends.

    PubMed

    Geerts, Hugo; Hofmann-Apitius, Martin; Anastasio, Thomas J

    2017-11-01

    Neurodegenerative diseases such as Alzheimer's disease (AD) follow a slowly progressing dysfunctional trajectory, with a large presymptomatic component and many comorbidities. Using preclinical models and large-scale omics studies ranging from genetics to imaging, a large number of processes that might be involved in AD pathology at different stages and levels have been identified. The sheer number of putative hypotheses makes it almost impossible to estimate their contribution to the clinical outcome and to develop a comprehensive view on the pathological processes driving the clinical phenotype. Traditionally, bioinformatics approaches have provided correlations and associations between processes and phenotypes. Focusing on causality, a new breed of advanced and more quantitative modeling approaches that use formalized domain expertise offer new opportunities to integrate these different modalities and outline possible paths toward new therapeutic interventions. This article reviews three different computational approaches and their possible complementarities. Process algebras, implemented using declarative programming languages such as Maude, facilitate simulation and analysis of complicated biological processes on a comprehensive but coarse-grained level. A model-driven Integration of Data and Knowledge, based on the OpenBEL platform and using reverse causative reasoning and network jump analysis, can generate mechanistic knowledge and a new, mechanism-based taxonomy of disease. Finally, Quantitative Systems Pharmacology is based on formalized implementation of domain expertise in a more fine-grained, mechanism-driven, quantitative, and predictive humanized computer model. We propose a strategy to combine the strengths of these individual approaches for developing powerful modeling methodologies that can provide actionable knowledge for rational development of preventive and therapeutic interventions. 
Development of these computational approaches is likely to be required for further progress in understanding and treating AD. Copyright © 2017 the Alzheimer's Association. Published by Elsevier Inc. All rights reserved.

  16. EPA-Health Canada CompTox Collaboration

    EPA Science Inventory

    Research program of EPA’s National Center for Computational Toxicology addresses chemical screening and prioritization needs for pesticidal inerts, anti-microbials, CCLs, HPVs and MPVs, comprehensive use of HTS technologies to generate.

  17. Scalable Parameter Estimation for Genome-Scale Biochemical Reaction Networks

    PubMed Central

    Kaltenbacher, Barbara; Hasenauer, Jan

    2017-01-01

    Mechanistic mathematical modeling of biochemical reaction networks using ordinary differential equation (ODE) models has improved our understanding of small- and medium-scale biological processes. While the same should in principle hold for large- and genome-scale processes, the computational methods needed to analyze ODE models that describe hundreds or thousands of biochemical species and reactions are still missing. While individual simulations are feasible, the inference of the model parameters from experimental data is computationally too intensive. In this manuscript, we evaluate adjoint sensitivity analysis for parameter estimation in large-scale biochemical reaction networks. We present the approach for time-discrete measurements and compare it to state-of-the-art methods used in systems and computational biology. Our comparison reveals a significantly improved computational efficiency and a superior scalability of adjoint sensitivity analysis. The computational complexity is effectively independent of the number of parameters, enabling the analysis of large- and genome-scale models. Our study of a comprehensive kinetic model of ErbB signaling shows that parameter estimation using adjoint sensitivity analysis requires a fraction of the computation time of established methods. The proposed method will facilitate mechanistic modeling of genome-scale cellular processes, as required in the age of omics. PMID:28114351
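The scaling argument can be made concrete on a toy model: forward (tangent) sensitivity analysis needs one sweep per parameter, whereas a single backward adjoint sweep yields the full gradient. A minimal sketch (illustrative, not the authors' implementation) for an Euler-discretised decay model with objective J = x(T):

```python
import numpy as np

# Illustrative sketch (not the authors' code): adjoint vs forward sensitivity
# for dJ/dk of the Euler-discretised decay model x' = -k x, with J = x_N.
# The adjoint needs one backward sweep regardless of the parameter count.
h, N, k, x0 = 0.01, 100, 2.0, 1.0

# Forward simulation, storing the trajectory for the adjoint sweep.
x = np.empty(N + 1)
x[0] = x0
for t in range(N):
    x[t + 1] = (1.0 - h * k) * x[t]

# Forward (tangent) sensitivity: s_t = dx_t/dk, one sweep per parameter.
s = 0.0
for t in range(N):
    s = (1.0 - h * k) * s - h * x[t]
grad_forward = s

# Adjoint sweep: lam_t = dJ/dx_t accumulated backwards, gradient for free.
lam, grad_adjoint = 1.0, 0.0
for t in reversed(range(N)):
    grad_adjoint += lam * (-h * x[t])   # explicit k-dependence of step t
    lam *= (1.0 - h * k)                # chain rule through the dynamics

assert abs(grad_forward - grad_adjoint) < 1e-12
```

With one parameter the two agree at machine precision; with thousands of parameters, only the adjoint variant avoids repeating the forward sweep per parameter, which is the scalability property the abstract reports.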

  18. Applied Graph-Mining Algorithms to Study Biomolecular Interaction Networks

    PubMed Central

    2014-01-01

    Protein-protein interaction (PPI) networks carry vital information on the organization of molecular interactions in cellular systems. The identification of functionally relevant modules in PPI networks is one of the most important applications of biological network analysis. Computational analysis is becoming an indispensable tool to understand large-scale biomolecular interaction networks. Several types of computational methods have been developed and employed for the analysis of PPI networks. Of these computational methods, graph comparison and module detection are the two most commonly used strategies. This review summarizes current literature on graph kernel and graph alignment methods for graph comparison strategies, as well as module detection approaches including seed-and-extend, hierarchical clustering, optimization-based, probabilistic, and frequent subgraph methods. Herein, we provide a comprehensive review of the major algorithms employed under each theme, including our recently published frequent subgraph method, for detecting functional modules commonly shared across multiple cancer PPI networks. PMID:24800226
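Of the module-detection themes listed above, seed-and-extend is the simplest to state: grow a module from a seed protein by greedily adding neighbours while within-module edge density stays high. A minimal generic sketch (illustrative; the density threshold and toy network are assumptions, not a specific published algorithm):

```python
# Illustrative sketch (generic, not a specific published method): a minimal
# seed-and-extend module detector on a PPI graph given as an adjacency dict.
def module_density(nodes, adj):
    nodes = list(nodes)
    edges = sum(1 for i, a in enumerate(nodes)
                for b in nodes[i + 1:] if b in adj[a])
    possible = len(nodes) * (len(nodes) - 1) / 2
    return edges / possible if possible else 0.0

def seed_and_extend(seed, adj, min_density=0.7):
    module = {seed}
    while True:
        frontier = {n for m in module for n in adj[m]} - module
        best = max(frontier,
                   key=lambda n: module_density(module | {n}, adj),
                   default=None)
        if best is None or module_density(module | {best}, adj) < min_density:
            return module
        module.add(best)

# Toy PPI network: a dense triangle {A, B, C} with a pendant node D.
adj = {"A": {"B", "C"}, "B": {"A", "C"}, "C": {"A", "B", "D"}, "D": {"C"}}
assert seed_and_extend("A", adj) == {"A", "B", "C"}
```

The pendant node D is rejected because adding it would drop the density to 4/6, below the threshold; real algorithms refine this greedy core with better objectives and stopping rules.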

  19. Very Low Intravenous Contrast Volume Protocol for Computed Tomography Angiography Providing Comprehensive Cardiac and Vascular Assessment Prior to Transcatheter Aortic Valve Replacement in Patients with Chronic Kidney Disease

    PubMed Central

    Pulerwitz, Todd C.; Khalique, Omar K.; Nazif, Tamim N.; Rozenshtein, Anna; Pearson, Gregory D.N.; Hahn, Rebecca T.; Vahl, Torsten P.; Kodali, Susheel K.; George, Isaac; Leon, Martin B.; D'Souza, Belinda; Po, Ming Jack; Einstein, Andrew J.

    2016-01-01

    Background Transcatheter aortic valve replacement (TAVR) is a lifesaving procedure for many patients high risk for surgical aortic valve replacement. The prevalence of chronic kidney disease (CKD) is high in this population, and thus a very low contrast volume (VLCV) computed tomography angiography (CTA) protocol providing comprehensive cardiac and vascular imaging would be valuable. Methods 52 patients with severe, symptomatic aortic valve disease, undergoing pre-TAVR CTA assessment from 2013-4 at Columbia University Medical Center were studied, including all 26 patients with CKD (eGFR<30mL/min) who underwent a novel VLCV protocol (20mL of iohexol at 2.5mL/s), and 26 standard-contrast-volume (SCV) protocol patients. Using a 320-slice volumetric scanner, the protocol included ECG-gated volume scanning of the aortic root followed by medium-pitch helical vascular scanning through the femoral arteries. Two experienced cardiologists performed aortic annulus and root measurements. Vascular image quality was assessed by two radiologists using a 4-point scale. Results VLCV patients had mean(±SD) age 86±6.5, BMI 23.9±3.4 kg/m2 with 54% men; SCV patients age 83±8.8, BMI 28.7±5.3 kg/m2, 65% men. There was excellent intra- and inter-observer agreement for annular and root measurements, and excellent agreement with 3D-transesophageal echocardiographic measurements. Both radiologists found diagnostic-quality vascular imaging in 96% of VLCV and 100% of SCV cases, with excellent inter-observer agreement. Conclusions This study is the first of its kind to report the feasibility and reproducibility of measurements for a VLCV protocol for comprehensive pre-TAVR CTA. There was excellent agreement of cardiac measurements and almost all studies were diagnostic quality for vascular access assessment. PMID:27061253

  20. Relating dynamic brain states to dynamic machine states: Human and machine solutions to the speech recognition problem

    PubMed Central

    Liu, Xunying; Zhang, Chao; Woodland, Phil; Fonteneau, Elisabeth

    2017-01-01

    There is widespread interest in the relationship between the neurobiological systems supporting human cognition and emerging computational systems capable of emulating these capacities. Human speech comprehension, poorly understood as a neurobiological process, is an important case in point. Automatic Speech Recognition (ASR) systems with near-human levels of performance are now available, which provide a computationally explicit solution for the recognition of words in continuous speech. This research aims to bridge the gap between speech recognition processes in humans and machines, using novel multivariate techniques to compare incremental ‘machine states’, generated as the ASR analysis progresses over time, to the incremental ‘brain states’, measured using combined electro- and magneto-encephalography (EMEG), generated as the same inputs are heard by human listeners. This direct comparison of dynamic human and machine internal states, as they respond to the same incrementally delivered sensory input, revealed a significant correspondence between neural response patterns in human superior temporal cortex and the structural properties of ASR-derived phonetic models. Spatially coherent patches in human temporal cortex responded selectively to individual phonetic features defined on the basis of machine-extracted regularities in the speech to lexicon mapping process. These results demonstrate the feasibility of relating human and ASR solutions to the problem of speech recognition, and suggest the potential for further studies relating complex neural computations in human speech comprehension to the rapidly evolving ASR systems that address the same problem domain. PMID:28945744
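Comparisons of this kind are typically run at the level of similarity structure: build a pairwise-dissimilarity matrix over the same stimuli for each system, then correlate the matrices. A minimal sketch (a generic representational-similarity comparison on synthetic data, not the paper's EMEG pipeline):

```python
import numpy as np

# Illustrative sketch (generic representational-similarity comparison, not the
# paper's method): correlate the pairwise-dissimilarity structure of 'machine
# states' with that of 'brain states' recorded for the same stimuli.
def rdm(states):
    """Representational dissimilarity matrix: 1 - correlation between rows."""
    return 1.0 - np.corrcoef(states)

def rdm_similarity(a, b):
    iu = np.triu_indices_from(a, k=1)   # compare upper triangles only
    return np.corrcoef(a[iu], b[iu])[0, 1]

rng = np.random.default_rng(1)
machine = rng.normal(size=(6, 20))                 # 6 stimuli x 20 features
brain = machine + 0.1 * rng.normal(size=(6, 20))   # noisy 'neural' copy
assert rdm_similarity(rdm(machine), rdm(brain)) > 0.9
```

Because only the similarity structure is compared, the two systems need not share units, dimensionality, or coordinate frames, which is what makes direct brain-to-machine comparison feasible.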

  1. Enhancing Next-Generation Sequencing-Guided Cancer Care Through Cognitive Computing.

    PubMed

    Patel, Nirali M; Michelini, Vanessa V; Snell, Jeff M; Balu, Saianand; Hoyle, Alan P; Parker, Joel S; Hayward, Michele C; Eberhard, David A; Salazar, Ashley H; McNeillie, Patrick; Xu, Jia; Huettner, Claudia S; Koyama, Takahiko; Utro, Filippo; Rhrissorrakrai, Kahn; Norel, Raquel; Bilal, Erhan; Royyuru, Ajay; Parida, Laxmi; Earp, H Shelton; Grilley-Olson, Juneko E; Hayes, D Neil; Harvey, Stephen J; Sharpless, Norman E; Kim, William Y

    2018-02-01

    Using next-generation sequencing (NGS) to guide cancer therapy has created challenges in analyzing and reporting large volumes of genomic data to patients and caregivers. Specifically, providing current, accurate information on newly approved therapies and open clinical trials requires considerable manual curation performed mainly by human "molecular tumor boards" (MTBs). The purpose of this study was to determine the utility of cognitive computing as performed by Watson for Genomics (WfG) compared with a human MTB. One thousand eighteen patient cases that previously underwent targeted exon sequencing at the University of North Carolina (UNC) and subsequent analysis by the UNCseq informatics pipeline and the UNC MTB between November 7, 2011, and May 12, 2015, were analyzed with WfG, a cognitive computing technology for genomic analysis. Using a WfG-curated actionable gene list, we identified additional genomic events of potential significance (not discovered by traditional MTB curation) in 323 (32%) patients. The majority of these additional genomic events were considered actionable based upon their ability to qualify patients for biomarker-selected clinical trials. Indeed, the opening of a relevant clinical trial within 1 month prior to WfG analysis provided the rationale for identification of a new actionable event in nearly a quarter of the 323 patients. This automated analysis took <3 minutes per case. These results demonstrate that the interpretation and actionability of somatic NGS results are evolving too rapidly to rely solely on human curation. Molecular tumor boards empowered by cognitive computing could potentially improve patient care by providing a rapid, cost-effective, and comprehensive approach for data analysis in the delivery of precision medicine. Patients and physicians who are considering enrollment in clinical trials may benefit from the support of such tools applied to genomic data. © AlphaMed Press 2017.
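The gene-list screening step this record describes can be illustrated with a minimal sketch. Everything here is a hypothetical placeholder for illustration only: the gene names, the annotations, and the variant records are invented, and this is not the WfG pipeline.

```python
# Illustrative sketch (not WfG): screen sequenced variants against a curated
# actionable gene list. All gene names and annotations are invented.

ACTIONABLE_GENES = {
    "GENE_A": "biomarker-selected trial",   # hypothetical entry
    "GENE_B": "approved targeted therapy",  # hypothetical entry
}

def screen_variants(variants):
    """Return the subset of variants that fall in curated actionable genes,
    annotated with the curated action."""
    hits = []
    for v in variants:
        action = ACTIONABLE_GENES.get(v["gene"])
        if action is not None:
            hits.append({**v, "action": action})
    return hits

patient_variants = [
    {"gene": "GENE_A", "change": "p.V1M"},
    {"gene": "GENE_C", "change": "p.R2Q"},
]
print(screen_variants(patient_variants))
```

The point of automating this step is the one the abstract makes: the curated list changes as trials open and close, so re-screening must be cheap enough to repeat per case.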

  2. A mathematical basis for plant patterning derived from physico-chemical phenomena.

    PubMed

    Beleyur, Thejasvi; Abdul Kareem, Valiya Kadavu; Shaji, Anil; Prasad, Kalika

    2013-04-01

    The position of leaves and flowers along the stem axis generates a specific pattern, known as phyllotaxis. A growing body of evidence emerging from recent computational modeling and experimental studies suggests that regulators controlling phyllotaxis are chemical, e.g. the plant growth hormone auxin and its dynamic accumulation pattern by polar auxin transport, and physical, e.g. mechanical properties of the cell. Here we present comprehensive views on how chemical and physical properties of cells regulate the pattern of leaf initiation. We further compare different computational modeling studies to understand their scope in reproducing the observed patterns. Despite a plethora of experimental studies on phyllotaxis, understanding of molecular mechanisms of pattern initiation in plants remains fragmentary. Live imaging of growth dynamics and physicochemical properties at the shoot apex of mutants displaying stable changes from one pattern to another should provide mechanistic insights into organ initiation patterns. Copyright © 2013 WILEY Periodicals, Inc.

  3. Task-dependency and structure-dependency in number interference effects in sentence comprehension

    PubMed Central

    Franck, Julie; Colonna, Saveria; Rizzi, Luigi

    2015-01-01

    We report three experiments on French that explore number mismatch effects in intervention configurations in the comprehension of object A’-dependencies, relative clauses and questions. The study capitalizes on the finding of object attraction in sentence production, in which speakers sometimes erroneously produce a verb that agrees in number with a plural object in object relative clauses. Evidence points to the role of three critical constructs from formal syntax: intervention, intermediate traces and c-command (Franck et al., 2010). Experiment 1, using a self-paced reading procedure on these grammatical structures with an agreement error on the verb, shows an enhancing effect of number mismatch in intervention configurations, with faster reading times with plural (mismatching) objects. Experiment 2, using an on-line grammaticality judgment task on the ungrammatical versions of these structures, shows an interference effect in the form of attraction, with slower response times with plural objects. Experiment 3 with a similar grammaticality judgment task shows stronger attraction from c-commanding than from preceding interveners. Overall, the data suggest that syntactic computations in performance refer to the same syntactic representations in production and comprehension, but that different tasks tap into different processes involved in parsing: whereas performance in self-paced reading reflects the intervention of the subject in the process of building an object A’-dependency, performance in grammaticality judgment reflects intervention of the object on the computation of the subject-verb agreement dependency. The latter shows the hallmarks of structure-dependent attraction effects in sentence production, in particular, a sensitivity to specific characteristics of hierarchical representations. PMID:25914652

  4. Computation, Mathematics and Logistics Department Report for Fiscal Year 1978.

    DTIC Science & Technology

    1980-03-01

    storage technology. A reference library on these and related areas is now composed of two thousand documents. The most comprehensive tool available...at DTNSRDC on the CDC 6000 Computer System for a variety of applications including Navy Logistics, Library Science, Ocean Science, Contract Manage... Library Science) Track technical documents on advanced ship design Univ. of Virginia at Charlottesville - (Ocean Science) Monitor research projects for

  5. Analysis of the Effects of the Computer Enhanced Classroom on the Achievement of Remedial High School Math Students.

    ERIC Educational Resources Information Center

    Lang, William Steve; And Others

    The effects of the use of computer-enhanced instruction with remedial students were assessed, using 4,293 ninth through twelfth graders--3,308 Black, 957 White, and 28 Other--involved in the Governor's Remediation Initiative (GRI) in Georgia. Data sources included the Comprehensive Tests of Basic Skills (CTBS), a data collection form developed for…

  6. Effects of Test Media on Different EFL Test-Takers in Writing Scores and in the Cognitive Writing Process

    ERIC Educational Resources Information Center

    Zou, Xiao-Ling; Chen, Yan-Min

    2016-01-01

    The effects of computer and paper test media on EFL test-takers with different computer familiarity in writing scores and in the cognitive writing process have been comprehensively explored from the learners' aspect as well as on the basis of related theories and practice. The results indicate significant differences in test scores among the…

  7. Neurolinguistically constrained simulation of sentence comprehension: integrating artificial intelligence and brain theory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gigley, H.M.

    1982-01-01

    An artificial intelligence approach to the simulation of neurolinguistically constrained processes in sentence comprehension is developed using control strategies for simulation of cooperative computation in associative networks. The desirability of this control strategy in contrast to ATN and production system strategies is explained. A first pass implementation of HOPE, an artificial intelligence simulation model of sentence comprehension, constrained by studies of aphasic performance, psycholinguistics, neurolinguistics, and linguistic theory is described. Claims that the model could serve as a basis for sentence production simulation and for a model of language acquisition as associative learning are discussed. HOPE is a model that performs in a normal state and includes a lesion simulation facility. HOPE is also a research tool. Its modifiability and use as a tool to investigate hypothesized causes of degradation in comprehension performance by aphasic patients are described. Issues of using behavioral constraints in modelling and obtaining appropriate data for simulated process modelling are discussed. Finally, problems of validation of the simulation results are raised; and issues of how to interpret clinical results to define the evolution of the model are discussed. Conclusions with respect to the feasibility of artificial intelligence simulation process modelling are discussed based on the current state of research.

  8. Effects of supported electronic text and explicit instruction on science comprehension by students with autism spectrum disorder

    NASA Astrophysics Data System (ADS)

    Knight, Victoria Floyd

    Supported electronic text (eText), or text that has been altered to increase access and provide support to learners, may promote comprehension of science content for students with disabilities. According to CAST, Book Builder(TM) uses supported eText to promote reading for meaning for all students. Although little research has been conducted in the area of supported eText for students with autism spectrum disorders (ASD), technology (e.g., computer assisted instruction) has been used for over 35 years to instruct students with ASD in academic areas. The purpose of this study was to evaluate the effects of supported eText and explicit instruction on the science vocabulary and comprehension of four middle school students with ASD. Researchers used a multiple probe across participants design to evaluate the Book Builder(TM) program on measures of vocabulary, literal comprehension, and application questions. Results indicated a functional relation between the Book Builder(TM) and explicit instruction (i.e., model-lead-test, examples and non-examples, and referral to the definition) and the number of correct responses on the probe. In addition, students were able to generalize concepts to untrained exemplars. Finally, teachers and students validated the program as practical and useful.

  9. Challenges in reducing the computational time of QSTS simulations for distribution system analysis.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Deboever, Jeremiah; Zhang, Xiaochen; Reno, Matthew J.

    The rapid increase in penetration of distributed energy resources on the electric power distribution system has created a need for more comprehensive interconnection modelling and impact analysis. Unlike conventional scenario-based studies, quasi-static time-series (QSTS) simulations can realistically model time-dependent voltage controllers and the diversity of potential impacts that can occur at different times of year. However, to accurately model a distribution system with all its controllable devices, a yearlong simulation at 1-second resolution is often required, which could take conventional computers a computational time of 10 to 120 hours when an actual unbalanced distribution feeder is modeled. This computational burden is a clear limitation to the adoption of QSTS simulations in interconnection studies and for determining optimal control solutions for utility operations. Our ongoing research to improve the speed of QSTS simulation has revealed many unique aspects of distribution system modelling and sequential power flow analysis that make fast QSTS a very difficult problem to solve. In this report, the most relevant challenges in reducing the computational time of QSTS simulations are presented: number of power flows to solve, circuit complexity, time dependence between time steps, multiple valid power flow solutions, controllable element interactions, and extensive accurate simulation analysis.
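The computational burden the abstract quotes can be checked with a back-of-envelope count of power-flow solves: a year at 1-second resolution is about 31.5 million sequential power flows. The per-solve time below is an assumed figure for illustration, not a measurement from the report.

```python
# Back-of-envelope estimate of the QSTS burden: one yearlong simulation at
# 1-second resolution. The per-power-flow solve time is an assumption.

SECONDS_PER_YEAR = 365 * 24 * 3600   # 31,536,000 time steps at 1 s resolution
per_flow_seconds = 0.005             # assumed 5 ms per unbalanced power flow

total_hours = SECONDS_PER_YEAR * per_flow_seconds / 3600
print(SECONDS_PER_YEAR, round(total_hours, 1))  # step count, estimated hours
```

With an assumed 5 ms per solve this lands near 44 hours, inside the 10 to 120 hour range the report cites; the range reflects how strongly per-solve cost varies with feeder size and complexity.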

  10. Highway rock slope management program.

    DOT National Transportation Integrated Search

    2001-06-30

    Development of a comprehensive geotechnical database for risk management of highway rock slope problems is described. Computer software selected to program the client/server application in windows environment, components and structure of the geote...

  11. Semiconductor Nanowire and Nanoribbon Thermoelectrics: A Comprehensive Computational Study

    DTIC Science & Technology

    2013-05-01

    August 17-20, 2010). Available online through IEEE Xplore . http://dx.doi.org/10.1109/NANO.2010.5698047 4. Z. Aksamija and I. Knezevic...Korea (August 17-20, 2010). Available online through IEEE Xplore . http://dx.doi.org/10.1109/NANO.2010.5697827 5. D. Vasileska, K. Raleva, S. M...IWCE 2010) Available online through IEEE Xplore , http://dx.doi.org/10.1109/IWCE.2010.5677916 6. E. B. Ramayya and I. Knezevic, “Ultrascaled

  12. The Design and Implementation of an Operating System for the IBM Personal Computer.

    DTIC Science & Technology

    1984-12-01

    comprehensive study of an actual operating system in an effort to show students how theory has been put into action (Lions, 1978; McCharen, 1980). Another...Freedman, 1977). However, since it is easier to develop and maintain a program written in a high-order language (HOL), Pascal was chosen to be the primary...monolithic monitor approach and the kernel approach are strategies which can be used to structure operating systems ( Deitel , 1983; Holt, 1983

  13. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Geng, Guangchao; Abhyankar, Shrirang; Wang, Xiaoyu

    Transient stability-constrained optimal power flow is an important emerging problem, with power systems pushed to the limits for economic benefits, dense and larger interconnected systems, and reduced inertia due to expected proliferation of renewable energy resources. In this study, two further approaches are presented: single machine equivalent and computational intelligence. Various application areas and future directions in this research area are also discussed. In conclusion, a comprehensive resource for the available literature, publicly available test systems, and relevant numerical libraries is provided.

  14. Effects of anisotropic interaction-induced properties of hydrogen-rare gas compounds on rototranslational Raman scattering spectra: Comprehensive theoretical and numerical analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Głaz, Waldemar, E-mail: glaz@kielich.amu.edu.pl; Bancewicz, Tadeusz; Godet, Jean-Luc

    2016-07-21

    A comprehensive study is presented of many aspects of the depolarized anisotropic collision induced (CI) component of light scattered by weakly bound compounds composed of a dihydrogen molecule and a rare gas (Rg) atom, H₂–Rg. The work continues a series of earlier projects marking the revival of interest in linear light scattering following the development of new highly advanced tools of quantum chemistry and other theoretical, computational, and experimental means of spectral analyses. Sophisticated ab initio computing procedures are applied in order to obtain the anisotropic polarizability component’s dependence on the H₂–Rg geometry. These data are then used to evaluate the CI spectral lines for all types of Rg atoms ranging from He to Xe (Rn excluded). Evolution of the properties of CI spectra with growing polarizability/masses of the complexes studied is observed. Special attention is given to the heaviest, Kr and Xe based, scatterers. The influence of specific factors shaping the spectral lines (e.g., bound and metastable contribution, potential anisotropy) is discussed. Also the share of pressure broadened allowed rotational transitions in the overall spectral profile is taken into account and the extent to which it is separable from the pure CI contribution is discussed. We finish with a brief comparison between the obtained results and available experimental data.

  15. A comparative study of the tail ion distribution with reduced Fokker-Planck models

    NASA Astrophysics Data System (ADS)

    McDevitt, C. J.; Tang, Xian-Zhu; Guo, Zehua; Berk, H. L.

    2014-03-01

    A series of reduced models are used to study the fast ion tail in the vicinity of a transition layer between plasmas at disparate temperatures and densities, which is typical of the gas and pusher interface in inertial confinement fusion targets. Emphasis is placed on utilizing progressively more comprehensive models in order to identify the essential physics for computing the fast ion tail at energies comparable to the Gamow peak. The resulting fast ion tail distribution is subsequently used to compute the fusion reactivity as a function of collisionality and temperature. While a significant reduction of the fusion reactivity in the hot spot compared to the nominal Maxwellian case is present, this reduction is found to be partially recovered by an increase of the fusion reactivity in the neighboring cold region.

  16. Numerical modelling in friction lap joining of aluminium alloy and carbon-fiber-reinforced-plastic sheets

    NASA Astrophysics Data System (ADS)

    Das, A.; Bang, H. S.; Bang, H. S.

    2018-05-01

    Multi-material combinations of aluminium alloy and carbon-fiber-reinforced-plastics (CFRP) have gained attention in automotive and aerospace industries to enhance fuel efficiency and strength-to-weight ratio of components. Various limitations of laser beam welding, adhesive bonding and mechanical fasteners make these processes inefficient for joining metal and CFRP sheets. Friction lap joining is an alternative choice for the same. Comprehensive studies in friction lap joining of aluminium to CFRP sheets are essential and scarce in the literature. The present work reports a combined theoretical and experimental study on the joining of AA5052 and CFRP sheets using the friction lap joining process. A three-dimensional finite element based heat transfer model is developed to compute the temperature fields and thermal cycles. The computed results are validated extensively with the corresponding experimentally measured results.

  17. A comprehensive map of the mTOR signaling network

    PubMed Central

    Caron, Etienne; Ghosh, Samik; Matsuoka, Yukiko; Ashton-Beaucage, Dariel; Therrien, Marc; Lemieux, Sébastien; Perreault, Claude; Roux, Philippe P; Kitano, Hiroaki

    2010-01-01

    The mammalian target of rapamycin (mTOR) is a central regulator of cell growth and proliferation. mTOR signaling is frequently dysregulated in oncogenic cells, and thus an attractive target for anticancer therapy. Using CellDesigner, a modeling support software for graphical notation, we present herein a comprehensive map of the mTOR signaling network, which includes 964 species connected by 777 reactions. The map complies with both the systems biology markup language (SBML) and graphical notation (SBGN) for computational analysis and graphical representation, respectively. As captured in the mTOR map, we review and discuss our current understanding of the mTOR signaling network and highlight the impact of mTOR feedback and crosstalk regulations on drug-based cancer therapy. This map is available on the Payao platform, a Web 2.0 based community-wide interactive process for creating more accurate and information-rich databases. Thus, this comprehensive map of the mTOR network will serve as a tool to facilitate systems-level study of up-to-date mTOR network components and signaling events toward the discovery of novel regulatory processes and therapeutic strategies for cancer. PMID:21179025

  18. The Relationship Between a Silent Reading Fluency Instructional Protocol on Students’ Reading Comprehension and Achievement in an Urban School Setting

    PubMed Central

    Rasinski, Timothy; Samuels, S. Jay; Hiebert, Elfrieda; Petscher, Yaacov; Feller, Karen

    2015-01-01

    Reading fluency has been identified as a key component in effective literacy instruction (National Reading Panel, 2000). Instruction in reading fluency has been shown to lead to improvements in reading achievement. Reading fluency instruction is most commonly associated with guided repeated oral reading instruction. In the present retrospective study we examine the effects of a computer-based silent reading fluency instructional system called Reading Plus (Taylor Associates, Winooski, Vermont, USA) on the reading comprehension and overall reading achievement of a large corpus of students in an urban school setting. Findings indicate that the program resulted in positive, substantial, and significant improvements in reading comprehension and overall reading achievement on a criterion referenced reading test for Grades 5, 6, 7, 8, and 9 and on a norm-referenced test of reading achievement for Grades 4, 5, 6, 7, 8 and 10. Moreover, mean gains made by students in the Reading Plus intervention were greater than mean gains for all students at the state and district level. The findings were generally positive for all subpopulations studied, including special education and regular education students. Qualitative reports from teachers who participated in the study were also supportive of the program. Implications for the study are explored for particular subgroups of students and for the role of fluency instruction with struggling adolescent readers. PMID:26347186

  19. A Comprehensive Review of Existing Risk Assessment Models in Cloud Computing

    NASA Astrophysics Data System (ADS)

    Amini, Ahmad; Jamil, Norziana

    2018-05-01

    Cloud computing is a popular paradigm in information technology and computing as it offers numerous advantages in terms of economical saving and minimal management effort. Although elasticity and flexibility bring tremendous benefits, cloud computing still raises many information security issues due to its unique characteristic of allowing ubiquitous computing. Therefore, the vulnerabilities and threats in cloud computing have to be identified, and a proper risk assessment mechanism has to be in place for better cloud computing management. Various quantitative and qualitative risk assessment models have been proposed but, to our knowledge, none of them is suitable for the cloud computing environment. In this paper, we compare and analyse the strengths and weaknesses of existing risk assessment models. We then propose a new risk assessment model that sufficiently addresses all the characteristics of cloud computing, which the existing models fail to do.
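The quantitative models such a review compares commonly reduce to some form of likelihood-times-impact scoring over a threat inventory. A generic sketch follows; the threat entries, likelihoods, and impact scale are invented for illustration and are not taken from any model in the review.

```python
# Generic quantitative risk scoring: risk = likelihood x impact.
# Threat entries and scales are invented placeholders.

threats = [
    {"name": "data breach",    "likelihood": 0.3, "impact": 9},
    {"name": "service outage", "likelihood": 0.6, "impact": 5},
    {"name": "insider misuse", "likelihood": 0.1, "impact": 8},
]

def risk(t):
    """Risk score for a single threat entry."""
    return t["likelihood"] * t["impact"]

# Rank threats from highest to lowest risk for prioritization.
for t in sorted(threats, key=risk, reverse=True):
    print(t["name"], round(risk(t), 2))
```

Qualitative models replace the numeric likelihood and impact with ordinal categories (e.g., low/medium/high) and a lookup matrix, trading precision for easier elicitation.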

  20. Handheld computers in critical care.

    PubMed

    Lapinsky, S E; Weshler, J; Mehta, S; Varkul, M; Hallett, D; Stewart, T E

    2001-08-01

    Computing technology has the potential to improve health care management but is often underutilized. Handheld computers are versatile and relatively inexpensive, bringing the benefits of computers to the bedside. We evaluated the role of this technology for managing patient data and accessing medical reference information, in an academic intensive-care unit (ICU). Palm III series handheld devices were given to the ICU team, each installed with medical reference information, schedules, and contact numbers. Users underwent a 1-hour training session introducing the hardware and software. Various patient data management applications were assessed during the study period. Qualitative assessment of the benefits, drawbacks, and suggestions was performed by an independent company, using focus groups. An objective comparison between a paper and electronic handheld textbook was achieved using clinical scenario tests. During the 6-month study period, the 20 physicians and 6 paramedical staff who used the handheld devices found them convenient and functional but suggested more comprehensive training and improved search facilities. Comparison of the handheld computer with the conventional paper text revealed equivalence. Access to computerized patient information improved communication, particularly with regard to long-stay patients, but changes to the software and the process were suggested. The introduction of this technology was well received despite differences in users' familiarity with the devices. Handheld computers have potential in the ICU, but systems need to be developed specifically for the critical-care environment.

  1. Probabilistic analysis of tsunami hazards

    USGS Publications Warehouse

    Geist, E.L.; Parsons, T.

    2006-01-01

    Determining the likelihood of a disaster is a key component of any comprehensive hazard assessment. This is particularly true for tsunamis, even though most tsunami hazard assessments have in the past relied on scenario or deterministic type models. We discuss probabilistic tsunami hazard analysis (PTHA) from the standpoint of integrating computational methods with empirical analysis of past tsunami runup. PTHA is derived from probabilistic seismic hazard analysis (PSHA), with the main difference being that PTHA must account for far-field sources. The computational methods rely on numerical tsunami propagation models rather than empirical attenuation relationships as in PSHA in determining ground motions. Because a number of source parameters affect local tsunami runup height, PTHA can become complex and computationally intensive. Empirical analysis can function in one of two ways, depending on the length and completeness of the tsunami catalog. For site-specific studies where there is sufficient tsunami runup data available, hazard curves can primarily be derived from empirical analysis, with computational methods used to highlight deficiencies in the tsunami catalog. For region-wide analyses and sites where there are little to no tsunami data, a computationally based method such as Monte Carlo simulation is the primary method to establish tsunami hazards. Two case studies that describe how computational and empirical methods can be integrated are presented for Acapulco, Mexico (site-specific) and the U.S. Pacific Northwest coastline (region-wide analysis).
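For the region-wide, data-poor case the abstract mentions, the Monte Carlo route to a hazard curve can be sketched as follows. The magnitude distribution and the runup scaling below are invented placeholders for illustration, not the source models or propagation physics used in the study.

```python
# Minimal Monte Carlo sketch in the spirit of PTHA: sample events from an
# assumed source model, map each to a local runup height, and tabulate
# exceedance probabilities. All distributions and scalings are invented.

import random

random.seed(0)

def sample_runup():
    """One event: magnitude from an assumed exponential tail above M7,
    converted to runup (m) with an assumed log-linear scaling."""
    m = 7.0 + random.expovariate(2.0)     # assumed magnitude distribution
    return 0.5 * 10 ** (0.5 * (m - 7.0))  # assumed runup scaling

n = 100_000
runups = [sample_runup() for _ in range(n)]

def exceedance(h):
    """Empirical probability that runup exceeds height h, per event."""
    return sum(r > h for r in runups) / n

for h in (0.5, 1.0, 2.0):
    print(h, exceedance(h))
```

A real PTHA replaces `sample_runup` with numerical propagation from each sampled source and convolves the per-event curve with source recurrence rates to get annual exceedance probabilities.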

  2. Handheld computers in critical care

    PubMed Central

    Lapinsky, Stephen E; Weshler, Jason; Mehta, Sangeeta; Varkul, Mark; Hallett, Dave; Stewart, Thomas E

    2001-01-01

    Background Computing technology has the potential to improve health care management but is often underutilized. Handheld computers are versatile and relatively inexpensive, bringing the benefits of computers to the bedside. We evaluated the role of this technology for managing patient data and accessing medical reference information, in an academic intensive-care unit (ICU). Methods Palm III series handheld devices were given to the ICU team, each installed with medical reference information, schedules, and contact numbers. Users underwent a 1-hour training session introducing the hardware and software. Various patient data management applications were assessed during the study period. Qualitative assessment of the benefits, drawbacks, and suggestions was performed by an independent company, using focus groups. An objective comparison between a paper and electronic handheld textbook was achieved using clinical scenario tests. Results During the 6-month study period, the 20 physicians and 6 paramedical staff who used the handheld devices found them convenient and functional but suggested more comprehensive training and improved search facilities. Comparison of the handheld computer with the conventional paper text revealed equivalence. Access to computerized patient information improved communication, particularly with regard to long-stay patients, but changes to the software and the process were suggested. Conclusions The introduction of this technology was well received despite differences in users' familiarity with the devices. Handheld computers have potential in the ICU, but systems need to be developed specifically for the critical-care environment. PMID:11511337

  3. Reading Speed, Comprehension and Eye Movements While Reading Japanese Novels: Evidence from Untrained Readers and Cases of Speed-Reading Trainees

    PubMed Central

    Miyata, Hiromitsu; Minagawa-Kawai, Yasuyo; Watanabe, Shigeru; Sasaki, Toyofumi; Ueda, Kazuhiro

    2012-01-01

    Background A growing body of evidence suggests that meditative training enhances perception and cognition. In Japan, the Park-Sasaki method of speed-reading involves organized visual training while forming both a relaxed and concentrated state of mind, as in meditation. The present study examined relationships between reading speed, sentence comprehension, and eye movements while reading short Japanese novels. In addition to normal untrained readers, three middle-level trainees and one high-level expert on this method were included for the two case studies. Methodology/Principal Findings In Study 1, three of 17 participants were middle-level trainees on the speed-reading method. Immediately after reading each story once on a computer monitor, participants answered true or false questions regarding the content of the novel. Eye movements while reading were recorded using an eye-tracking system. Results revealed higher reading speed and lower comprehension scores in the trainees than in the untrained participants. Furthermore, eye-tracking data by untrained participants revealed multiple correlations between reading speed, accuracy and eye-movement measures, with faster readers showing shorter fixation durations and larger saccades in X than slower readers. In Study 2, participants included a high-level expert and 14 untrained students. The expert showed higher reading speed and statistically comparable, although numerically lower, comprehension scores compared with the untrained participants. During test sessions this expert moved her eyes along a nearly straight horizontal line as a first pass, without moving her eyes over the whole sentence display as did the untrained students. Conclusions/Significance In addition to revealing correlations between speed, comprehension and eye movements in reading Japanese contemporary novels by untrained readers, we describe cases of speed-reading trainees regarding relationships between these variables. 
The trainees overall tended to show poor performance influenced by the speed-accuracy trade-off, although this trade-off may be reduced in the case of at least one high-level expert. PMID:22590519

  4. Transient thermal, hydraulic, and mechanical analysis of a counter flow offset strip fin intermediate heat exchanger using an effective porous media approach

    NASA Astrophysics Data System (ADS)

    Urquiza, Eugenio

    This work presents a comprehensive thermal hydraulic analysis of a compact heat exchanger using offset strip fins. The thermal hydraulics analysis in this work is followed by a finite element analysis (FEA) to predict the mechanical stresses experienced by an intermediate heat exchanger (IHX) during steady-state operation and selected flow transients. In particular, the scenario analyzed involves a gas-to-liquid IHX operating between high pressure helium and liquid or molten salt. In order to estimate the stresses in compact heat exchangers a comprehensive thermal and hydraulic analysis is needed. Compact heat exchangers require very small flow channels and fins to achieve high heat transfer rates and thermal effectiveness. However, studying such small features computationally contributes little to the understanding of component level phenomena and requires prohibitive computational effort using computational fluid dynamics (CFD). To address this issue, the analysis developed here uses an effective porous media (EPM) approach; this greatly reduces the computation time and produces results with the appropriate resolution [1]. This EPM fluid dynamics and heat transfer computational code has been named the Compact Heat Exchanger Explicit Thermal and Hydraulics (CHEETAH) code. CHEETAH solves for the two-dimensional steady-state and transient temperature and flow distributions in the IHX including the complicating effects of temperature-dependent fluid thermo-physical properties. Temperature- and pressure-dependent fluid properties are evaluated by CHEETAH and the thermal effectiveness of the IHX is also calculated. Furthermore, the temperature distribution can then be imported into a finite element analysis (FEA) code for mechanical stress analysis using the EPM methods developed earlier by the University of California, Berkeley, for global and local stress analysis [2]. 
These simulation tools will also allow the heat exchanger design to be improved through an iterative design process which will lead to a design with a reduced pressure drop, increased thermal effectiveness, and improved mechanical performance as it relates to creep deformation and transient thermal stresses.
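The appeal of an EPM-style reduced model over full CFD, as described above, is that a coarse effective medium can be marched in time cheaply. A toy one-dimensional explicit finite-difference conduction solve illustrates the idea; the geometry, diffusivity, and step sizes are arbitrary assumptions, and this is not CHEETAH's formulation.

```python
# Toy 1-D explicit finite-difference transient conduction march, illustrating
# the kind of reduced (non-CFD) thermal model an effective-porous-media code
# uses. All property values and step sizes are arbitrary assumptions.

alpha = 1e-5                 # assumed effective thermal diffusivity, m^2/s
dx, dt = 0.01, 1.0           # grid spacing (m) and time step (s)
r = alpha * dt / dx**2       # explicit scheme is stable for r <= 0.5
assert r <= 0.5

T = [300.0] * 21             # initial temperature field, K
T[0], T[-1] = 600.0, 300.0   # fixed hot / cold boundary temperatures

for _ in range(500):         # march 500 s in time
    T = [T[0]] + [
        T[i] + r * (T[i + 1] - 2 * T[i] + T[i - 1]) for i in range(1, 20)
    ] + [T[-1]]

print(round(T[10], 1))       # midpoint temperature after the transient
```

The stability bound on `r` is what drives run time in explicit schemes: halving `dx` for finer resolution forces `dt` down by a factor of four, which is one reason resolving fin-scale features directly is prohibitive and effective properties on a coarse grid are used instead.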

  5. An Integrated Review of Emoticons in Computer-Mediated Communication

    PubMed Central

    Aldunate, Nerea; González-Ibáñez, Roberto

    2017-01-01

    Facial expressions constitute a rich source of non-verbal cues in face-to-face communication. They provide interlocutors with resources to express and interpret verbal messages, which may affect their cognitive and emotional processing. Contrarily, computer-mediated communication (CMC), particularly text-based communication, is limited to the use of symbols to convey a message, where facial expressions cannot be transmitted naturally. In this scenario, people use emoticons as paralinguistic cues to convey emotional meaning. Research has shown that emoticons contribute to a greater social presence as a result of the enrichment of text-based communication channels. Additionally, emoticons constitute a valuable resource for language comprehension by providing expressivity to text messages. The latter findings have been supported by studies in neuroscience showing that particular brain regions involved in emotional processing are also activated when people are exposed to emoticons. To reach an integrated understanding of the influence of emoticons in human communication on both socio-cognitive and neural levels, we review the literature on emoticons in three different areas. First, we present relevant literature on emoticons in CMC. Second, we study the influence of emoticons in language comprehension. Finally, we show the incipient research in neuroscience on this topic. This mini review reveals that, while there are plenty of studies on the influence of emoticons in communication from a social psychology perspective, little is known about the neurocognitive basis of the effects of emoticons on communication dynamics. PMID:28111564

  6. A Comprehensive Specimen-Specific Multiscale Data Set for Anatomical and Mechanical Characterization of the Tibiofemoral Joint

    PubMed Central

    Chokhandre, Snehal; Colbrunn, Robb; Bennetts, Craig; Erdemir, Ahmet

    2015-01-01

    Understanding of tibiofemoral joint mechanics at multiple spatial scales is essential for developing effective preventive measures and treatments for both pathology and injury management. Currently, there is a distinct lack of specimen-specific biomechanical data at multiple spatial scales, e.g., joint, tissue, and cell scales. Comprehensive multiscale data may improve the understanding of the relationship between biomechanical and anatomical markers across various scales. Furthermore, specimen-specific multiscale data for the tibiofemoral joint may assist development and validation of specimen-specific computational models that may be useful for more thorough analyses of the biomechanical behavior of the joint. This study describes an aggregation of procedures for acquisition of multiscale anatomical and biomechanical data for the tibiofemoral joint. Magnetic resonance imaging was used to acquire anatomical morphology at the joint scale. A robotic testing system was used to quantify joint level biomechanical response under various loading scenarios. Tissue level material properties were obtained from the same specimen for the femoral and tibial articular cartilage, medial and lateral menisci, anterior and posterior cruciate ligaments, and medial and lateral collateral ligaments. Histology data were also obtained for all tissue types to measure specimen-specific cell scale information, e.g., cellular distribution. This study is the first of its kind to establish a comprehensive multiscale data set for a musculoskeletal joint and the presented data collection approach can be used as a general template to guide acquisition of specimen-specific comprehensive multiscale data for musculoskeletal joints. PMID:26381404

  7. Real-World Application of Robust Design Optimization Assisted by Response Surface Approximation and Visual Data-Mining

    NASA Astrophysics Data System (ADS)

    Shimoyama, Koji; Jeong, Shinkyu; Obayashi, Shigeru

    A new approach for multi-objective robust design optimization was proposed and applied to a real-world design problem with a large number of objective functions. The present approach is assisted by response surface approximation and visual data-mining, and resulted in two major gains regarding computational time and data interpretation. The Kriging model for response surface approximation can markedly reduce the computational time for predictions of robustness. In addition, the use of self-organizing maps as a data-mining technique allows visualization of complicated design information between optimality and robustness in a comprehensible two-dimensional form. Therefore, the extraction and interpretation of trade-off relations between optimality and robustness of design, and also the location of sweet spots in the design space, can be performed in a comprehensive manner.
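The Kriging response-surface step described above can be sketched as a plain Gaussian-process regressor with an RBF kernel, fitted to a handful of expensive design evaluations and then queried cheaply. This is a minimal illustration under invented hyperparameters and a toy objective, not the authors' implementation:

```python
import numpy as np

def rbf_kernel(A, B, length_scale):
    """Squared-exponential covariance between two point sets."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / length_scale**2)

def kriging_predict(X_train, y_train, X_new, length_scale=0.2, noise=1e-8):
    """Mean Kriging prediction at X_new from sampled responses y_train."""
    K = rbf_kernel(X_train, X_train, length_scale)
    K += noise * np.eye(len(X_train))  # jitter for numerical stability
    alpha = np.linalg.solve(K, y_train)
    return rbf_kernel(X_new, X_train, length_scale) @ alpha

# Toy "expensive" objective sampled at 8 design points on [0, 1]
X = np.linspace(0.0, 1.0, 8)[:, None]
y = np.sin(2 * np.pi * X[:, 0])

# The surrogate can now be evaluated thousands of times at negligible cost,
# e.g. inside Monte Carlo estimates of robustness.
print(kriging_predict(X, y, np.array([[0.5]])))
```

A self-organizing map would then be trained on the resulting (optimality, robustness) samples to visualize trade-offs; that step is omitted here.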

  8. A survey of signal processing algorithms in brain-computer interfaces based on electrical brain signals.

    PubMed

    Bashashati, Ali; Fatourechi, Mehrdad; Ward, Rabab K; Birch, Gary E

    2007-06-01

    Brain-computer interfaces (BCIs) aim at providing a non-muscular channel for sending commands to the external world using the electroencephalographic activity or other electrophysiological measures of the brain function. An essential factor in the successful operation of BCI systems is the methods used to process the brain signals. In the BCI literature, however, there is no comprehensive review of the signal processing techniques used. This work presents the first such comprehensive survey of all BCI designs using electrical signal recordings published prior to January 2006. Detailed results from this survey are presented and discussed. The following key research questions are addressed: (1) what are the key signal processing components of a BCI, (2) what signal processing algorithms have been used in BCIs and (3) which signal processing techniques have received more attention?
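As a concrete instance of the signal-processing components such surveys catalogue, band-power feature extraction (a staple of motor-imagery BCIs) can be sketched with SciPy. The sampling rate, frequency bands, and synthetic trace below are illustrative assumptions, not taken from any surveyed design:

```python
import numpy as np
from scipy.signal import butter, filtfilt

def band_power(eeg, fs, low, high, order=4):
    """Band-pass filter a 1-D EEG trace and return its mean power in the band."""
    b, a = butter(order, [low / (fs / 2), high / (fs / 2)], btype="band")
    filtered = filtfilt(b, a, eeg)
    return float(np.mean(filtered ** 2))

fs = 250  # Hz, a common EEG sampling rate
t = np.arange(0, 2.0, 1 / fs)
# Synthetic trace: 10 Hz mu-rhythm component plus broadband noise
rng = np.random.default_rng(0)
eeg = np.sin(2 * np.pi * 10 * t) + 0.3 * rng.standard_normal(t.size)

mu = band_power(eeg, fs, 8, 12)     # mu band, where the signal lives
beta = band_power(eeg, fs, 18, 26)  # beta band, mostly noise here
print(mu > beta)
```

In a real BCI, such band-power features would feed a classifier that maps them to output commands; the survey's taxonomy covers many alternatives to this simple pipeline.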

  9. TOPICAL REVIEW: A survey of signal processing algorithms in brain computer interfaces based on electrical brain signals

    NASA Astrophysics Data System (ADS)

    Bashashati, Ali; Fatourechi, Mehrdad; Ward, Rabab K.; Birch, Gary E.

    2007-06-01

    Brain computer interfaces (BCIs) aim at providing a non-muscular channel for sending commands to the external world using the electroencephalographic activity or other electrophysiological measures of the brain function. An essential factor in the successful operation of BCI systems is the methods used to process the brain signals. In the BCI literature, however, there is no comprehensive review of the signal processing techniques used. This work presents the first such comprehensive survey of all BCI designs using electrical signal recordings published prior to January 2006. Detailed results from this survey are presented and discussed. The following key research questions are addressed: (1) what are the key signal processing components of a BCI, (2) what signal processing algorithms have been used in BCIs and (3) which signal processing techniques have received more attention?

  10. A comprehensive Two-Fluid Model for Cavitation and Primary Atomization Modelling of liquid jets - Application to a large marine Diesel injector

    NASA Astrophysics Data System (ADS)

    Habchi, Chawki; Bohbot, Julien; Schmid, Andreas; Herrmann, Kai

    2015-12-01

    In this paper, a comprehensive two-fluid model is suggested in order to compute the in-nozzle cavitating flow and the primary atomization of liquid jets, simultaneously. This model has been applied to the computation of a typical large marine Diesel injector. The numerical results have shown a strong correlation between the in-nozzle cavitating flow and the ensuing spray orientation and atomization. Indeed, the results have confirmed the existence of an off-axis liquid core. This asymmetry is likely to be at the origin of the spray deviation observed experimentally. In addition, the primary atomization begins very close to the orifice exit as in the experiments, and the smallest droplets are generated due to cavitation pocket shape oscillations located at the same side, inside the orifice.

  11. Comprehensive mutagenesis of HIV-1 protease: a computational geometry approach.

    PubMed

    Masso, Majid; Vaisman, Iosif I

    2003-05-30

    A computational geometry technique based on Delaunay tessellation of protein structure, represented by C(alpha) atoms, is used to study effects of single residue mutations on sequence-structure compatibility in HIV-1 protease. Profiles of residue scores derived from the four-body statistical potential are constructed for all 1881 mutants of the HIV-1 protease monomer and compared with the profile of the wild-type protein. The profiles for an isolated monomer of HIV-1 protease and the identical monomer in a dimeric state with an inhibitor are analyzed to elucidate changes to structural stability. Protease residues shown to undergo the greatest impact are those forming the dimer interface and flap region, as well as those known to be involved in inhibitor binding.
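The tessellation step can be illustrated with SciPy: a Delaunay tessellation of C-alpha coordinates yields the tetrahedra whose four-residue quadruplets a four-body statistical potential would score. The coordinates and fragment size below are random stand-ins, not HIV-1 protease data:

```python
import numpy as np
from scipy.spatial import Delaunay

# Hypothetical C-alpha coordinates for a toy 20-residue fragment
rng = np.random.default_rng(42)
ca = rng.uniform(0.0, 30.0, size=(20, 3))  # Angstrom-scale positions

tess = Delaunay(ca)

# Each simplex is a tetrahedron whose four vertices are residue indices;
# a four-body potential assigns a score to each such quadruplet, and the
# per-residue sums form the profile compared between mutant and wild type.
quadruplets = [tuple(sorted(s)) for s in tess.simplices]
print(len(quadruplets), quadruplets[0])
```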

  12. Statistical models of lunar rocks and regolith

    NASA Technical Reports Server (NTRS)

    Marcus, A. H.

    1973-01-01

    The mathematical, statistical, and computational approaches used in the investigation of the interrelationship of lunar fragmental material, regolith, lunar rocks, and lunar craters are described. The first two phases of the work explored the sensitivity of the production model of fragmental material to mathematical assumptions, and then completed earlier studies on the survival of lunar surface rocks with respect to competing processes. The third phase combined earlier work into a detailed statistical analysis and probabilistic model of regolith formation by lithologically distinct layers, interpreted as modified crater ejecta blankets. The fourth phase of the work dealt with problems encountered in combining the results of the entire project into a comprehensive, multipurpose computer simulation model for the craters and regolith. Highlights of each phase of research are given.

  13. Color reproducibility and dyestuff concentration

    NASA Astrophysics Data System (ADS)

    Csanyi, Sandor

    2002-06-01

The purpose of this study was to develop a new sensitivity index connected with color matching, which makes it possible to investigate the effects of dyestuff concentration deviations in a larger part of the color space in a comprehensive manner. With the help of computer simulation and experimental design, we examined the color differences resulting from minor concentration changes in approximately 500 formulas of different compositions, altering their total concentration and the proportion of the individual dyes in them. The new sensitivity index makes it possible for the colorist to select the recipe that is the least sensitive to concentration deviations from among the computer color formulas, as well as to add a new aspect to the ranking applied in color matching so far.
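The kind of sensitivity computation described can be sketched as follows, under a deliberately toy linear dye-mixing model: the mixing matrix and recipe values are invented, and the CIE76 formula stands in for whatever color-difference metric the study used.

```python
import numpy as np

def delta_e76(lab1, lab2):
    """CIE76 color difference: Euclidean distance in CIELAB."""
    return float(np.linalg.norm(np.asarray(lab1) - np.asarray(lab2)))

# Toy linear dye model: Lab shift per unit concentration of three dyes
# (purely illustrative numbers, not measured colorant data).
M = np.array([[-20.0, -5.0, -8.0],
              [  2.0, 30.0, -4.0],
              [ -3.0,  6.0, 25.0]])
base = np.array([70.0, 10.0, 20.0])  # substrate color in Lab

def recipe_color(c):
    return base + M @ c

def sensitivity_index(c, rel_dev=0.02):
    """Worst-case deltaE when each dye concentration deviates by +/- rel_dev."""
    nominal = recipe_color(c)
    worst = 0.0
    for signs in np.ndindex(*(2,) * len(c)):  # all +/- combinations
        perturbed = c * (1 + rel_dev * (2 * np.array(signs) - 1))
        worst = max(worst, delta_e76(nominal, recipe_color(perturbed)))
    return worst

print(sensitivity_index(np.array([0.5, 0.2, 0.1])))
```

Ranking candidate recipes by such an index, alongside the usual match quality, is the selection aid the abstract describes.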

  14. Computer Simulations to Support Science Instruction and Learning: A critical review of the literature

    NASA Astrophysics Data System (ADS)

    Smetana, Lara Kathleen; Bell, Randy L.

    2012-06-01

    Researchers have explored the effectiveness of computer simulations for supporting science teaching and learning during the past four decades. The purpose of this paper is to provide a comprehensive, critical review of the literature on the impact of computer simulations on science teaching and learning, with the goal of summarizing what is currently known and providing guidance for future research. We report on the outcomes of 61 empirical studies dealing with the efficacy of, and implications for, computer simulations in science instruction. The overall findings suggest that simulations can be as effective, and in many ways more effective, than traditional (i.e. lecture-based, textbook-based and/or physical hands-on) instructional practices in promoting science content knowledge, developing process skills, and facilitating conceptual change. As with any other educational tool, the effectiveness of computer simulations is dependent upon the ways in which they are used. Thus, we outline specific research-based guidelines for best practice. Computer simulations are most effective when they (a) are used as supplements; (b) incorporate high-quality support structures; (c) encourage student reflection; and (d) promote cognitive dissonance. Used appropriately, computer simulations involve students in inquiry-based, authentic science explorations. Additionally, as educational technologies continue to evolve, advantages such as flexibility, safety, and efficiency deserve attention.

  15. Some unsolved problems in discrete mathematics and mathematical cybernetics

    NASA Astrophysics Data System (ADS)

    Korshunov, Aleksei D.

    2009-10-01

    There are many unsolved problems in discrete mathematics and mathematical cybernetics. Writing a comprehensive survey of such problems involves great difficulties. First, such problems are rather numerous and varied. Second, they greatly differ from each other in degree of completeness of their solution. Therefore, even a comprehensive survey should not attempt to cover the whole variety of such problems; only the most important and significant problems should be reviewed. An impersonal choice of problems to include is quite hard. This paper includes 13 unsolved problems related to combinatorial mathematics and computational complexity theory. The problems selected give an indication of the author's studies for 50 years; for this reason, the choice of the problems reviewed here is, to some extent, subjective. At the same time, these problems are very difficult and quite important for discrete mathematics and mathematical cybernetics. Bibliography: 74 items.

  16. SoftLab: A Soft-Computing Software for Experimental Research with Commercialization Aspects

    NASA Technical Reports Server (NTRS)

    Akbarzadeh-T, M.-R.; Shaikh, T. S.; Ren, J.; Hubbell, Rob; Kumbla, K. K.; Jamshidi, M

    1998-01-01

SoftLab is a software environment for research and development in intelligent modeling/control using soft-computing paradigms such as fuzzy logic, neural networks, genetic algorithms, and genetic programming. SoftLab addresses the inadequacies of existing soft-computing software by supporting comprehensive multidisciplinary functionalities, from management tools to engineering systems. Furthermore, the built-in features help the user process and analyze information more efficiently through a friendly yet powerful interface, and allow the user to specify user-specific processing modules, adding to the standard configuration of the software environment.

  17. The development of an engineering computer graphics laboratory

    NASA Technical Reports Server (NTRS)

    Anderson, D. C.; Garrett, R. E.

    1975-01-01

Hardware and software systems developed to further research and education in interactive computer graphics are described, as well as several ongoing application-oriented projects, educational graphics programs, and graduate research projects. The software system consists of a FORTRAN IV subroutine package, in conjunction with a PDP 11/40 minicomputer as the primary computation processor and the Imlac PDS-1 as an intelligent display processor. The package comprises a comprehensive set of graphics routines for dynamic, structured two-dimensional display manipulation, and numerous routines to handle a variety of input devices at the Imlac.

  18. Tools for building a comprehensive modeling system for virtual screening under real biological conditions: The Computational Titration algorithm.

    PubMed

    Kellogg, Glen E; Fornabaio, Micaela; Chen, Deliang L; Abraham, Donald J; Spyrakis, Francesca; Cozzini, Pietro; Mozzarelli, Andrea

    2006-05-01

    Computational tools utilizing a unique empirical modeling system based on the hydrophobic effect and the measurement of logP(o/w) (the partition coefficient for solvent transfer between 1-octanol and water) are described. The associated force field, Hydropathic INTeractions (HINT), contains much rich information about non-covalent interactions in the biological environment because of its basis in an experiment that measures interactions in solution. HINT is shown to be the core of an evolving virtual screening system that is capable of taking into account a number of factors often ignored such as entropy, effects of solvent molecules at the active site, and the ionization states of acidic and basic residues and ligand functional groups. The outline of a comprehensive modeling system for virtual screening that incorporates these features is described. In addition, a detailed description of the Computational Titration algorithm is provided. As an example, three complexes of dihydrofolate reductase (DHFR) are analyzed with our system and these results are compared with the experimental free energies of binding.

  19. Comprehensive T-matrix Reference Database: A 2009-2011 Update

    NASA Technical Reports Server (NTRS)

    Zakharova, Nadezhda T.; Videen, G.; Khlebtsov, Nikolai G.

    2012-01-01

    The T-matrix method is one of the most versatile and efficient theoretical techniques widely used for the computation of electromagnetic scattering by single and composite particles, discrete random media, and particles in the vicinity of an interface separating two half-spaces with different refractive indices. This paper presents an update to the comprehensive database of peer-reviewed T-matrix publications compiled by us previously and includes the publications that appeared since 2009. It also lists several earlier publications not included in the original database.

  20. ChiLin: a comprehensive ChIP-seq and DNase-seq quality control and analysis pipeline.

    PubMed

    Qin, Qian; Mei, Shenglin; Wu, Qiu; Sun, Hanfei; Li, Lewyn; Taing, Len; Chen, Sujun; Li, Fugen; Liu, Tao; Zang, Chongzhi; Xu, Han; Chen, Yiwen; Meyer, Clifford A; Zhang, Yong; Brown, Myles; Long, Henry W; Liu, X Shirley

    2016-10-03

Transcription factor binding, histone modification, and chromatin accessibility studies are important approaches to understanding the biology of gene regulation. ChIP-seq and DNase-seq have become the standard techniques for studying protein-DNA interactions and chromatin accessibility respectively, and comprehensive quality control (QC) and analysis tools are critical to extracting the most value from these assay types. Although many analysis and QC tools have been reported, few combine ChIP-seq and DNase-seq data analysis and quality control in a unified framework with a comprehensive and unbiased reference of data quality metrics. ChiLin is a computational pipeline that automates the quality control and data analyses of ChIP-seq and DNase-seq data. It is developed using a flexible and modular software framework that can be easily extended and modified. ChiLin is ideal for batch processing of many datasets and is well suited for large collaborative projects involving ChIP-seq and DNase-seq from different designs. ChiLin generates comprehensive quality control reports that include comparisons with historical data derived from over 23,677 public ChIP-seq and DNase-seq samples (11,265 datasets) from eight literature-based classified categories. To the best of our knowledge, this atlas represents the most comprehensive ChIP-seq and DNase-seq related quality metric resource currently available. These historical metrics provide useful heuristic quality references for experiments across all commonly used assay types. Using representative datasets, we demonstrate the versatility of the pipeline by applying it to different assay types of ChIP-seq data. The pipeline software is available open source at https://github.com/cfce/chilin. ChiLin is a scalable and powerful tool to process large batches of ChIP-seq and DNase-seq datasets. The analysis output and quality metrics have been structured into user-friendly directories and reports.
We have successfully compiled 23,677 profiles into a comprehensive quality atlas with fine classification for users.
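The idea of ranking a new sample's quality metric against a historical atlas can be sketched generically. This is not ChiLin's actual code; the metric distribution and pass threshold below are invented stand-ins:

```python
import numpy as np

def qc_percentile(sample_metric, historical):
    """Rank a sample's metric against a historical reference distribution (0-100)."""
    historical = np.sort(np.asarray(historical))
    return 100.0 * np.searchsorted(historical, sample_metric) / len(historical)

# Hypothetical FRiP-like scores standing in for a curated metric atlas
rng = np.random.default_rng(1)
history = rng.beta(2, 8, size=5000)

score = 0.35
pct = qc_percentile(score, history)
verdict = "pass" if pct >= 25 else "flag for review"
print(round(pct, 1), verdict)
```

A real pipeline would compute many such metrics per sample and compare each against category-matched historical distributions.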

  1. IDIOS: An innovative index for evaluating dental imaging-based osteoporosis screening indices.

    PubMed

    Barngkgei, Imad; Halboub, Esam; Almashraqi, Abeer Abdulkareem; Khattab, Razan; Al Haffar, Iyad

    2016-09-01

The goal of this study was to develop a new index as an objective reference for evaluating current and newly developed indices used for osteoporosis screening based on dental images. Its name, IDIOS, stands for Index of Dental-imaging Indices of Osteoporosis Screening. A comprehensive PubMed search was conducted to retrieve studies on dental imaging-based indices for osteoporosis screening. The results of the eligible studies, along with other relevant criteria, were used to develop IDIOS, which has scores ranging from 0 (0%) to 15 (100%). The indices presented in the studies we included were then evaluated using IDIOS. The 104 studies that were included utilized 24, 4, and 9 indices derived from panoramic, periapical, and computed tomographic/cone-beam computed tomographic techniques, respectively. The IDIOS scores for these indices ranged from 0 (0%) to 11.75 (78.32%). IDIOS is a valuable reference index that facilitates the evaluation of other dental imaging-based osteoporosis screening indices. Furthermore, IDIOS can be utilized to evaluate the accuracy of newly developed indices.

2. Integrated Engineering Information Technology, FY93 accomplishments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harris, R.N.; Miller, D.K.; Neugebauer, G.L.

    1994-03-01

The Integrated Engineering Information Technology (IEIT) project is providing a comprehensive, easy-to-use computer network solution for communicating with coworkers both inside and outside Sandia National Laboratories. IEIT capabilities include computer networking, electronic mail, mechanical design, and data management. These network-based tools have one fundamental purpose: to help create a concurrent engineering environment that will enable Sandia organizations to excel in today's increasingly competitive business environment.

  3. Advanced Technology Airfoil Research, volume 1, part 1. [conference on development of computational codes and test facilities

    NASA Technical Reports Server (NTRS)

    1979-01-01

    A comprehensive review of all NASA airfoil research, conducted both in-house and under grant and contract, as well as a broad spectrum of airfoil research outside of NASA is presented. Emphasis is placed on the development of computational aerodynamic codes for airfoil analysis and design, the development of experimental facilities and test techniques, and all types of airfoil applications.

  4. 75 FR 35451 - Access by EPA Contractors To Information Claimed as Confidential Business Information (CBI...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-06-22

    ... further extensions without further notice. Under Contract Number EP10H000097, Computer Science Corporation... EPA procedures, including comprehensive system security plans (SSPs) that are consistent with those...

  5. Argonne Simulation Framework for Intelligent Transportation Systems

    DOT National Transportation Integrated Search

    1996-01-01

    A simulation framework has been developed which defines a high-level architecture for a large-scale, comprehensive, scalable simulation of an Intelligent Transportation System (ITS). The simulator is designed to run on parallel computers and distribu...

  6. Office Computer Software: A Comprehensive Review of Software Programs.

    ERIC Educational Resources Information Center

    Secretary, 1992

    1992-01-01

    Describes types of software including system software, application software, spreadsheets, accounting software, graphics packages, desktop publishing software, database, desktop and personal information management software, project and records management software, groupware, and shareware. (JOW)

  7. Development of a statewide landslide inventory program.

    DOT National Transportation Integrated Search

    2003-02-01

    Development of a comprehensive geotechnical database for risk management of highway landslide problems is described. Computer software selected to program the client/server application in a data window, components and structure of the geotechnical da...

  8. Lung Ultrasonography in Patients With Idiopathic Pulmonary Fibrosis: Evaluation of a Simplified Protocol With High-Resolution Computed Tomographic Correlation.

    PubMed

    Vassalou, Evangelia E; Raissaki, Maria; Magkanas, Eleftherios; Antoniou, Katerina M; Karantanas, Apostolos H

    2018-03-01

    To compare a simplified ultrasonographic (US) protocol in 2 patient positions with the same-positioned comprehensive US assessments and high-resolution computed tomographic (CT) findings in patients with idiopathic pulmonary fibrosis. Twenty-five consecutive patients with idiopathic pulmonary fibrosis were prospectively enrolled and examined in 2 sessions. During session 1, patients were examined with a US protocol including 56 lung intercostal spaces in supine/sitting (supine/sitting comprehensive protocol) and lateral decubitus (decubitus comprehensive protocol) positions. During session 2, patients were evaluated with a 16-intercostal space US protocol in sitting (sitting simplified protocol) and left/right decubitus (decubitus simplified protocol) positions. The 16 intercostal spaces were chosen according to the prevalence of idiopathic pulmonary fibrosis-related changes on high-resolution CT. The sum of B-lines counted in each intercostal space formed the US scores for all 4 US protocols: supine/sitting and decubitus comprehensive US scores and sitting and decubitus simplified US scores. High-resolution CT-related Warrick scores (J Rheumatol 1991; 18:1520-1528) were compared to US scores. The duration of each protocol was recorded. A significant correlation was found between all US scores and Warrick scores and between simplified and corresponding comprehensive scores (P < .0001). Decubitus simplified US scores showed a slightly higher correlation with Warrick scores compared to sitting simplified US scores. Mean durations of decubitus and sitting simplified protocols were 4.76 and 6.20 minutes, respectively (P < .005). Simplified 16-intercostal space protocols correlated with comprehensive protocols and high-resolution CT findings in patients with idiopathic pulmonary fibrosis. 
The 16-intercostal space simplified protocol in the lateral decubitus position correlated better with high-resolution CT findings and was less time-consuming compared to the sitting position. © 2017 by the American Institute of Ultrasound in Medicine.
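The scoring and correlation analysis described (summing B-line counts per protocol and correlating the totals with Warrick scores) can be sketched with synthetic patient data. All numbers below are simulated stand-ins, not study data:

```python
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(7)
n_patients = 25
severity = rng.uniform(0, 1, n_patients)  # latent disease severity

# Simulated protocol totals: sums of B-lines over 56 vs 16 intercostal
# spaces, plus a CT-based Warrick score, each with measurement noise.
comprehensive_score = (severity * 56 * 2 + rng.normal(0, 3, n_patients)).round()
simplified_score = (severity * 16 * 2 + rng.normal(0, 2, n_patients)).round()
warrick = (severity * 30 + rng.normal(0, 2, n_patients)).round()

rho_simp, p_simp = spearmanr(simplified_score, warrick)
print(f"simplified vs Warrick: rho={rho_simp:.2f}, p={p_simp:.1e}")
```

With scores that are monotone in a common latent severity, a strong rank correlation emerges, which is the pattern the study reports between the simplified protocols and high-resolution CT.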

  9. Interfacing comprehensive rotorcraft analysis with advanced aeromechanics and vortex wake models

    NASA Astrophysics Data System (ADS)

    Liu, Haiying

This dissertation describes three aspects of comprehensive rotorcraft analysis. First, a physics-based methodology for the modeling of hydraulic devices within multibody-based comprehensive models of rotorcraft systems is developed. This newly proposed approach can predict the fully nonlinear behavior of hydraulic devices, and pressure levels in the hydraulic chambers are coupled with the dynamic response of the system. The proposed hydraulic device models are implemented in a multibody code and calibrated by comparing their predictions with test bench measurements for the UH-60 helicopter lead-lag damper. Predicted peak damping forces were found to be in good agreement with measurements, while the model did not predict the entire time history of damper force to the same level of accuracy. The proposed model evaluates relevant hydraulic quantities such as chamber pressures, orifice flow rates, and pressure relief valve displacements. This model could be used to design lead-lag dampers with desirable force and damping characteristics. The second part of this research is in the area of computational aeroelasticity, in which an interface between computational fluid dynamics (CFD) and computational structural dynamics (CSD) is established. This interface enables data exchange between CFD and CSD with the goal of achieving accurate airloads predictions. In this work, a loose coupling approach based on the delta-airloads method is developed in a finite-element method based multibody dynamics formulation, DYMORE. To validate this aerodynamic interface, a CFD code, OVERFLOW-2, is loosely coupled with a CSD program, DYMORE, to compute the airloads for different flight conditions of the Sikorsky UH-60 aircraft. This loose coupling approach has good convergence characteristics. The predicted airloads are found to be in good agreement with the experimental data, although not for all flight conditions.
In addition, the tight coupling interface between the CFD program, OVERFLOW-2, and the CSD program, DYMORE, is also established. The ability to accurately capture the wake structure around a helicopter rotor is crucial for rotorcraft performance analysis. In the third part of this thesis, a new representation of the wake vortex structure based on Non-Uniform Rational B-Spline (NURBS) curves and surfaces is proposed to develop an efficient model for prescribed and free wakes. NURBS curves and surfaces are able to represent complex shapes with remarkably little data. The proposed formulation has the potential to reduce the computational cost associated with the use of Helmholtz's law and the Biot-Savart law when calculating the induced flow field around the rotor. An efficient free-wake analysis will considerably decrease the computational cost of comprehensive rotorcraft analysis, making the approach more attractive to routine use in industrial settings.
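The compactness argument for spline-based wake representation can be illustrated with SciPy's (non-rational) B-splines, a special case of NURBS: a dense sampling of a toy helical vortex filament collapses into a short knot/coefficient description that can be evaluated anywhere along the filament. The geometry and smoothing tolerance are invented for illustration:

```python
import numpy as np
from scipy.interpolate import splev, splprep

# Dense sample of a toy helical tip-vortex filament
theta = np.linspace(0, 6 * np.pi, 300)
x, y, z = np.cos(theta), np.sin(theta), 0.05 * theta

# Fit a smoothing cubic B-spline; the (knots, coefficients, degree)
# triple is far more compact than the 300 raw points.
tck, u = splprep([x, y, z], s=1e-3, k=3)

# Evaluate anywhere along the filament from the compact representation,
# e.g. as input to a Biot-Savart induced-velocity integral.
xe, ye, ze = splev(np.linspace(0, 1, 50), tck)
print(len(tck[1][0]), "spline coefficients per axis vs", theta.size, "raw points")
```

Fewer geometric degrees of freedom directly reduce the cost of the induced-flow evaluations, which is the efficiency gain the abstract claims.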

  10. The Impact of Report of Investigation Writing Style on the Assessment Times, Impressions, Perceptions and Preferences of Adjudicators

    DTIC Science & Technology

    1993-01-01

sequences should increase reading speed and improve comprehension. Active-Passive and Word Choice Research. Three studies (Olson & Filby, 1972; Danks ...employment * minor financial matters. Rokitka: Rokitka is a 49-year-old female applying for a Top Secret clearance necessary for her to work as a computer...Semantic distinctions and memory for complex sentences. Quarterly Journal of Psychology, 20, 129-130. Danks, J., & Sorce, P. (1973). Imagery and

  11. Bolsa Bay, California, Proposed Ocean Entrance System Study. Report 2. Comprehensive Shoreline Response Computer Simulation, Bolsa Bay, California

    DTIC Science & Technology

    1990-04-01

across the coastal plain to the surrounding mountains. Historically, the lowlands were frequently inundated by tidal flows through a direct natural...approximately in the center of the Los Angeles coastal plain. This low plain is bordered on the north by the eastern Santa Monica Mountains and the Repetto...Hills, on the east by the Puente Hills and the Santa Ana Mountains, on the southeast by the San Joaquin Hills, and on the south and west by the

  12. Computer-Aided Structural Engineering (CASE) Project: State of the Art on Expert Systems Applications in Design, Construction and Maintenance of Structures

    DTIC Science & Technology

    1989-09-01

    OGT, F1EPQRTJTL4, W" - 3^ n"r-- n *ON EXPERT SYSTEMS IN DESIGN, CONSTRUCTION AND’, IWAJNTENANCE-OF STRUCTURES Arockiasamy, Sunghoon Lee Clepartrhent...based expert system applications in the areas of structural design, design standards, and construction planning. This study will aid in the development...of a comprehensive expert system for tvical hydraulic structures. Funding for this report was provided by the US Army Engineer Waterways Experiment

  13. Earthquake Shaking - Finding the "Hot Spots"

    USGS Publications Warehouse

    Field, Edward; Jones, Lucile; Jordan, Tom; Benthien, Mark; Wald, Lisa

    2001-01-01

    A new Southern California Earthquake Center study has quantified how local geologic conditions affect the shaking experienced in an earthquake. The important geologic factors at a site are softness of the rock or soil near the surface and thickness of the sediments above hard bedrock. Even when these 'site effects' are taken into account, however, each earthquake exhibits unique 'hotspots' of anomalously strong shaking. Better predictions of strong ground shaking will therefore require additional geologic data and more comprehensive computer simulations of individual earthquakes.

  14. Modelling of Surfaces. Part 1: Monatomic Metallic Surfaces Using Equivalent Crystal Theory

    NASA Technical Reports Server (NTRS)

    Bozzolo, Guillermo; Ferrante, John; Rodriguez, Agustin M.

    1994-01-01

We present a detailed description of equivalent crystal theory focusing on its application to the study of surface structure. While the emphasis is on the structure of the algorithm and its computational aspects, we also present a comprehensive discussion of the calculation of surface energies of metallic systems with equivalent crystal theory and other approaches. Our results are compared to experiment and to other semiempirical as well as first-principles calculations for a variety of fcc and bcc metals.

  15. Additional application of the NASCAP code. Volume 1: NASCAP extension

    NASA Technical Reports Server (NTRS)

    Katz, I.; Cassidy, J. J.; Mandell, M. J.; Parks, D. E.; Schnuelle, G. W.; Stannard, P. R.; Steen, P. G.

    1981-01-01

    The NASCAP computer program comprehensively analyzes problems of spacecraft charging. Using a fully three dimensional approach, it can accurately predict spacecraft potentials under a variety of conditions. Several changes were made to NASCAP, and a new code, NASCAP/LEO, was developed. In addition, detailed studies of several spacecraft-environmental interactions and of the SCATHA spacecraft were performed. The NASCAP/LEO program handles situations of relatively short Debye length encountered by large space structures or by any satellite in low earth orbit (LEO).
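
    The "relatively short Debye length" regime that NASCAP/LEO targets can be made concrete with the standard plasma screening formula. The representative LEO plasma values below are assumptions for illustration, not parameters taken from the NASCAP report:

```python
import math

def debye_length(electron_temp_eV, electron_density_m3):
    """Plasma Debye length, lambda_D = sqrt(eps0 * kB * Te / (ne * e^2)).
    Electric potentials are screened over roughly this distance, which is
    why sheath physics dominates charging of large structures in LEO."""
    EPS0 = 8.854e-12      # vacuum permittivity, F/m
    E_CHARGE = 1.602e-19  # elementary charge, C
    kT_joules = electron_temp_eV * E_CHARGE
    return math.sqrt(EPS0 * kT_joules / (electron_density_m3 * E_CHARGE**2))

# Representative ionospheric LEO plasma: Te ~ 0.1 eV, ne ~ 1e11 m^-3.
# The resulting Debye length is a few millimetres -- far smaller than a
# large space structure, hence the "short Debye length" regime.
lam = debye_length(0.1, 1e11)
```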

  16. Development of computational small animal models and their applications in preclinical imaging and therapy research.

    PubMed

    Xie, Tianwu; Zaidi, Habib

    2016-01-01

    The development of multimodality preclinical imaging techniques and the rapid growth of realistic computer simulation tools have promoted the construction and application of computational laboratory animal models in preclinical research. Since the early 1990s, over 120 realistic computational animal models have been reported in the literature and used as surrogates to characterize the anatomy of actual animals for the simulation of preclinical studies involving the use of bioluminescence tomography, fluorescence molecular tomography, positron emission tomography, single-photon emission computed tomography, microcomputed tomography, magnetic resonance imaging, and optical imaging. Other applications include electromagnetic field simulation, ionizing and nonionizing radiation dosimetry, and the development and evaluation of new methodologies for multimodality image coregistration, segmentation, and reconstruction of small animal images. This paper provides a comprehensive review of the history and fundamental technologies used for the development of computational small animal models, with a particular focus on their application in preclinical imaging as well as nonionizing and ionizing radiation dosimetry calculations. It then gives an overview of the overall process involved in the design of these models, including the fundamental elements used for the construction of different types of computational models, the identification of original anatomical data, the simulation tools used for solving various computational problems, and the applications of computational animal models in preclinical research. The authors also analyze the characteristics of the categories of computational models (stylized, voxel-based, and boundary representation) and discuss the technical challenges faced at the present time as well as research needs in the future.
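
    The voxel-based category of models mentioned above is, at its core, a labeled 3-D array. A minimal sketch follows; the labels, dimensions, voxel size, and densities are invented for illustration and do not come from any of the reviewed models:

```python
import numpy as np

# Toy voxel phantom: a 3-D array of integer organ labels, the same
# representation voxel-based small-animal models use.
AIR, BODY, LUNG, BONE = 0, 1, 2, 3
phantom = np.full((20, 20, 20), AIR, dtype=np.uint8)
phantom[5:15, 5:15, 5:15] = BODY   # 10x10x10 body block
phantom[8:12, 8:12, 8:12] = LUNG   # lung region inside the body
phantom[6:7, 5:15, 5:15] = BONE    # thin bone slab inside the body

voxel_volume_cm3 = 0.1 ** 3                   # assume 1 mm isotropic voxels
densities = {BODY: 1.0, LUNG: 0.3, BONE: 1.9}  # g/cm^3, nominal values

# Per-organ volume and mass follow directly from counting labels.
for label, rho in densities.items():
    n = int(np.count_nonzero(phantom == label))
    volume = n * voxel_volume_cm3
    mass = volume * rho
    print(f"label {label}: {n} voxels, {volume:.3f} cm^3, {mass:.3f} g")
```

    Dosimetry codes then walk this array, assigning material properties per label when transporting radiation or computing absorbed dose.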

  17. Social, organizational, and contextual characteristics of clinical decision support systems for intensive insulin therapy: a literature review and case study.

    PubMed

    Campion, Thomas R; Waitman, Lemuel R; May, Addison K; Ozdas, Asli; Lorenzi, Nancy M; Gadd, Cynthia S

    2010-01-01

    Evaluations of computerized clinical decision support systems (CDSS) typically focus on clinical performance changes and do not include social, organizational, and contextual characteristics explaining use and effectiveness. Studies of CDSS for intensive insulin therapy (IIT) are no exception, and the literature lacks an understanding of effective computer-based IIT implementation and operation. This paper presents (1) a literature review of computer-based IIT evaluations through the lens of institutional theory, a discipline from sociology and organization studies, to demonstrate the inconsistent reporting of workflow and care process execution and (2) a single-site case study to illustrate how computer-based IIT requires substantial organizational change and creates additional complexity with unintended consequences including error. Computer-based IIT requires organizational commitment and attention to site-specific technology, workflow, and care processes to achieve intensive insulin therapy goals. The complex interaction between clinicians, blood glucose testing devices, and CDSS may contribute to workflow inefficiency and error. Evaluations rarely focus on the perspective of nurses, the primary users of computer-based IIT whose knowledge can potentially lead to process and care improvements. This paper addresses a gap in the literature concerning the social, organizational, and contextual characteristics of CDSS in general and for intensive insulin therapy specifically. Additionally, this paper identifies areas for future research to define optimal computer-based IIT process execution: the frequency and effect of manual data entry error of blood glucose values, the frequency and effect of nurse overrides of CDSS insulin dosing recommendations, and comprehensive ethnographic study of CDSS for IIT. Copyright (c) 2009. Published by Elsevier Ireland Ltd.
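
    One of the research needs the paper names, manual data entry error of blood glucose values, can be illustrated with a simple plausibility filter. The thresholds below are invented for illustration only; they are not clinical rules and are not described in the paper:

```python
def flag_glucose_entry(value_mg_dl, previous_mg_dl=None):
    """Flag manually entered blood-glucose values that may be typos.
    Thresholds are illustrative assumptions, not clinical guidance:
    values outside a broad physiological range are rejected, and
    implausibly large jumps from the previous reading ask the nurse
    to confirm before the CDSS computes an insulin recommendation."""
    if not 10 <= value_mg_dl <= 1000:
        return "reject: outside plausible range"
    if previous_mg_dl is not None and abs(value_mg_dl - previous_mg_dl) > 300:
        return "confirm: large jump from previous reading"
    return "accept"
```

    Such checks address only one link in the clinician/device/CDSS chain; the paper's point is that workflow context, not just the algorithm, determines whether errors propagate.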

  18. The Influence of Roof Material on Diurnal Urban Canyon Breathing

    NASA Astrophysics Data System (ADS)

    Abuhegazy, Mohamed; Yaghoobian, Neda

    2017-11-01

    Improving building energy use, air quality in urban canyons, and urban microclimates in general requires understanding the complex interactions among urban morphology, materials, climate, and inflow conditions. A review of the literature indicates that despite a long history of valuable urban microclimate studies, more comprehensive approaches are needed to address energy, heat, and flow transport in urban areas. In this study, the diurnally varying street-canyon flow and the associated heat transport are investigated numerically using large-eddy simulation (LES). We use computational modeling to examine the impact of the diurnal variation of heat fluxes from urban surfaces on the air flow and temperature distribution in street canyons, with a focus on the role of roof materials and their temperature footprints. A detailed building energy model with a three-dimensional raster-type geometry provides urban surface heat fluxes as thermal boundary conditions for the LES to determine the key aero-thermodynamic factors that affect urban street ventilation.
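
    The diurnally varying surface forcing described above can be caricatured by an idealized flux profile. This half-sine sketch stands in for the paper's detailed building energy model; `q_peak` and the nocturnal cooling value are placeholder assumptions, not values from the study:

```python
import math

def roof_heat_flux(t_hours, q_peak=400.0, sunrise=6.0, sunset=18.0):
    """Idealized diurnal surface heat flux (W/m^2): a half-sine between
    sunrise and sunset, and a small negative value at night to mimic
    nocturnal radiative cooling.  A real building-energy model resolves
    material properties; q_peak is just a placeholder magnitude."""
    if sunrise <= t_hours <= sunset:
        phase = math.pi * (t_hours - sunrise) / (sunset - sunrise)
        return q_peak * math.sin(phase)
    return -20.0  # nominal nocturnal cooling, W/m^2
```

    A high-albedo "cool" roof could be mimicked by lowering `q_peak`, which is one simple way a roof material's temperature footprint enters the canyon flow as an LES thermal boundary condition.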

  19. Metabolic Profile of the Cellulolytic Industrial Actinomycete Thermobifida fusca

    PubMed Central

    Vanee, Niti

    2017-01-01

    Actinomycetes have a long history of being the source of numerous valuable natural products and medicinals. To expedite product discovery and optimization of biochemical production, high-throughput technologies can now be used to screen the library of compounds present (or produced) at a given time in an organism. This not only facilitates chemical product screening, but also provides a comprehensive methodology for studying cellular metabolic networks to inform cellular engineering. Here, we present some of the first metabolomic data of the industrial cellulolytic actinomycete Thermobifida fusca generated using LC-MS/MS. The underlying objective of conducting global metabolite profiling was to gain better insight into the innate capabilities of T. fusca, with a long-term goal of facilitating T. fusca-based bioprocesses. The T. fusca metabolome was characterized for growth on two cellulose-relevant carbon sources, cellobiose and Avicel. Furthermore, the comprehensive list of measured metabolites was computationally integrated into a metabolic model of T. fusca, to study metabolic shifts in the network flux associated with carbohydrate and amino acid metabolism. PMID:29137138
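
    Computational integration of measured metabolites into a metabolic model is typically done with constraint-based methods such as flux balance analysis (FBA); whether the cited work used exactly this formulation is not stated, so the two-reaction toy network below is purely illustrative (a real genome-scale model has hundreds of reactions):

```python
from scipy.optimize import linprog

# Toy flux-balance analysis: one metabolite A, two reactions
# (uptake: -> A, biomass: A ->), steady state S * v = 0,
# maximize the biomass flux subject to a measured uptake cap.
S = [[1.0, -1.0]]        # rows: metabolites, cols: reactions
c = [0.0, -1.0]          # linprog minimizes, so negate biomass flux
bounds = [(0.0, 10.0),   # uptake capped at 10 (stand-in "measurement")
          (0.0, None)]   # biomass flux unbounded above

res = linprog(c, A_eq=S, b_eq=[0.0], bounds=bounds)
print(res.x)  # optimal fluxes: uptake and biomass both reach the cap
```

    Tightening or relaxing the uptake bound according to measured metabolite levels is one standard way metabolomic data shift the predicted flux distribution, which is the kind of network-level analysis the abstract describes.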

  20. Cognitive training plus a comprehensive psychosocial programme (OPUS) versus the comprehensive psychosocial programme alone for patients with first-episode schizophrenia (the NEUROCOM trial): a study protocol for a centrally randomised, observer-blinded multi-centre clinical trial.

    PubMed

    Vesterager, Lone; Christensen, Torben Ø; Olsen, Birthe B; Krarup, Gertrud; Forchhammer, Hysse B; Melau, Marianne; Gluud, Christian; Nordentoft, Merete

    2011-02-09

    Up to 85% of patients with schizophrenia demonstrate cognitive dysfunction in at least one domain. Cognitive dysfunction plays a major role in functional outcome. It is hypothesized that addition of cognitive training to a comprehensive psychosocial programme (OPUS) enhances both cognitive and everyday functional capacity of patients more than the comprehensive psychosocial programme alone. The NEUROCOM trial examines the effect on cognitive functioning and everyday functional capacity of patients with schizophrenia of a 16-week manualised programme of individual cognitive training integrated in a comprehensive psychosocial programme versus the comprehensive psychosocial programme alone. The cognitive training consists of four modules focusing on attention, executive functioning, learning, and memory. Cognitive training involves computer-assisted training tasks as well as practical everyday tasks and calendar training. It takes place twice a week, and every other week the patient and trainer engage in a dialogue on the patient's cognitive difficulties, motivational goals, and progress in competence level. Cognitive training relies on errorless learning principles, scaffolding, and verbalisation in its effort to improve cognitive abilities and teach patients how to apply compensation strategies as well as structured problem solving techniques. At 16-week post-training and at ten-month follow-up, assessments are conducted to investigate immediate outcome and possible long-term effects of cognitive training. We conduct blinded assessments of cognition, everyday functional capacity and associations with the labour market, symptom severity, and self-esteem. Results from the four-month and ten-month follow-ups have the potential to reliably document the long-term effect of cognitive training for patients with schizophrenia. Clinicaltrials.gov NCT00472862.

Top