Science.gov

Sample records for computer networking courses

  1. Student Motivation in Computer Networking Courses

    ERIC Educational Resources Information Center

    Hsin, Wen-Jung

    2007-01-01

    This paper introduces several hands-on projects that have been used to motivate students in learning various computer networking concepts. These projects are shown to be very useful and applicable to the learners' daily tasks and activities such as emailing, Web browsing, and online shopping and banking, and lead to an unexpected byproduct,…

  2. Improving a Computer Networks Course Using the Partov Simulation Engine

    ERIC Educational Resources Information Center

    Momeni, B.; Kharrazi, M.

    2012-01-01

    Computer networks courses are hard to teach as there are many details in the protocols and techniques involved that are difficult to grasp. Employing programming assignments as part of the course helps students to obtain a better understanding and gain further insight into the theoretical lectures. In this paper, the Partov simulation engine and…

  3. Networking: A Necessary Component in a Computer-Literacy Course.

    ERIC Educational Resources Information Center

    Norales, Francisca O.

    1993-01-01

    Discusses factors to be considered when planning and implementing a local area network (LAN) for personal computers within a school or an organization. Topics addressed include reasons for networking, characteristics of LANs, organizing a LAN, workstations, disk servers, and file servers. (Contains six references.) (LRW)

  4. Innovation of laboratory exercises in course Distributed systems and computer networks

    NASA Astrophysics Data System (ADS)

    Souček, Pavel; Slavata, Oldřich; Holub, Jan

    2013-09-01

    This paper focuses on the innovation of laboratory exercises in the course Distributed Systems and Computer Networks. These exercises were introduced in November 2012 and replaced older exercises in order to better reflect real-life applications.

  5. Software Assisted Syllabus Preparation for Computer Networks Courses

    ERIC Educational Resources Information Center

    Ercan, Tuncay; Sahin, Yasar Guneri

    2007-01-01

    Course descriptions prepared by lecturers at the beginning of the academic year receive no feedback from the students enrolled in them. These syllabi are not only reused in future semesters, but are also used by other lecturers without any changes. This has a negative effect on student education since many of the…

  6. Designing a Versatile Dedicated Computing Lab to Support Computer Network Courses: Insights from a Case Study

    ERIC Educational Resources Information Center

    Gercek, Gokhan; Saleem, Naveed

    2006-01-01

    Providing adequate computing lab support for Management Information Systems (MIS) and Computer Science (CS) programs is a perennial challenge for most academic institutions in the US and abroad. Factors, such as lack of physical space, budgetary constraints, conflicting needs of different courses, and rapid obsolescence of computing technology,…

  7. Motivating students' participation in a computer networks course by means of magic, drama and games.

    PubMed

    Hilas, Constantinos S; Politis, Anastasios

    2014-01-01

    The recent economic crisis has forced many universities to cut expenses by packing students into large lecture groups. The problem with large auditoria is that they discourage dialogue between students and faculty and hinder participation. Adding to this, students in computer science courses usually find the field to be full of theoretical and technical concepts. Lack of understanding leads them to lose interest and/or motivation. Classroom experience shows that the lecturer can employ alternative teaching methods, especially for early-year undergraduate students, in order to capture their interest and introduce basic concepts. This paper describes some of the approaches that may be used to keep students interested and make them feel comfortable as they comprehend basic concepts in computer networks. The lecturing procedure was enriched with games, magic tricks and dramatic representations. This approach was used experimentally for two semesters and the results were more than encouraging. PMID:25105085

  9. Networking computers.

    PubMed

    McBride, D C

    1997-03-01

    This decade the role of the personal computer has shifted dramatically from a desktop device designed to increase individual productivity and efficiency to an instrument of communication linking people and machines in different places with one another. A computer in one city can communicate with another that may be thousands of miles away. Networking is how this is accomplished. Just like the voice network used by the telephone, computer networks transmit data and other information via modems over these same telephone lines. A network can be created over both short and long distances. Networks can be established within a hospital or medical building or over many hospitals or buildings covering many geographic areas. Those confined to one location are called LANs, local area networks. Those that link computers in one building to those at other locations are known as WANs, or wide area networks. The ultimate wide area network is the one we've all been hearing so much about these days--the Internet, and its World Wide Web. Setting up a network is a process that requires careful planning and commitment. To avoid potential pitfalls and to make certain the network you establish meets your needs today and several years down the road, several steps need to be followed. This article reviews the initial steps involved in getting ready to network.

  10. A Case Study of an FSL Senior Secondary Course Integrating Computer Networking.

    ERIC Educational Resources Information Center

    Sanaoui, Razika; Lapkin, Sharon

    1992-01-01

    Acknowledging that students learning a second language need opportunities for extended spoken and written interactions with native speakers, this paper describes instructional computer networking that links grade 12 Anglophone students of French in Toronto with native French-speaking peers in Montreal. Focus is on writing skills. (25 references)…

  11. Computer ethics: A capstone course

    SciTech Connect

    Fisher, T.G.; Abunawass, A.M.

    1994-12-31

    This paper presents a capstone course on computer ethics required for all computer science majors in our program. The course was designed to encourage students to evaluate their own personal value systems in terms of the established values in computer science as represented by the ACM Code of Ethics. The structure, activities, and topics of the course as well as assessment of the students are presented. Observations on various course components and student evaluations of the course are also presented.

  12. Introduction to Computing Course Guide.

    ERIC Educational Resources Information Center

    Hawaii State Dept. of Education, Honolulu. Office of Instructional Services.

    Developed to aid intermediate-level teachers and principals in initiating and developing computer literacy programs for their students, this document is a guide for the development of a one-semester course--Introduction to Computing--for the seventh and eighth grades. The course is designed to provide opportunities for students to develop computer…

  13. A Model Computer Literacy Course.

    ERIC Educational Resources Information Center

    Orndorff, Joseph

    Designed to address the varied computer skill levels of college students, this proposed computer literacy course would be modular in format, with modules tailored to address various levels of expertise and permit individualized instruction. An introductory module would present both the history and future of computers and computing, followed by an…

  14. Resources for Computational Geophysics Courses

    NASA Astrophysics Data System (ADS)

    Keers, Henk; Rondenay, Stéphane; Harlap, Yaël.; Nordmo, Ivar

    2014-09-01

    An important skill that students in solid Earth physics need to acquire is the ability to write computer programs that can be used for the processing, analysis, and modeling of geophysical data and phenomena. Therefore, this skill (which we call "computational geophysics") is a core part of any undergraduate geophysics curriculum. In this Forum, we share our personal experience in teaching such a course.

  15. Conception of a course for professional training and education in the field of computer and mobile forensics, part III: network forensics and penetration testing

    NASA Astrophysics Data System (ADS)

    Kröger, Knut; Creutzburg, Reiner

    2014-02-01

    IT security and computer forensics are important components of information technology. Year after year, incidents and crimes that target IT systems, or are committed with their help, increase. More and more companies and authorities have security problems in their own IT infrastructure. To respond to these incidents professionally, it is important to have well-trained staff. The fact that many agencies and companies work with very sensitive data makes it necessary to further train their own employees in the field of network forensics and penetration testing. Motivated by these facts, this paper - a continuation of a January 2012 paper [1], which presented the conception of a course for professional training and education in the field of computer and mobile forensics - addresses the practical implementation of important aspects of network forensics and penetration testing.

  16. Computer Networks As Social Networks

    NASA Astrophysics Data System (ADS)

    Wellman, Barry

    2001-09-01

    Computer networks are inherently social networks, linking people, organizations, and knowledge. They are social institutions that should not be studied in isolation but as integrated into everyday lives. The proliferation of computer networks has facilitated a deemphasis on group solidarities at work and in the community and afforded a turn to networked societies that are loosely bounded and sparsely knit. The Internet increases people's social capital, increasing contact with friends and relatives who live nearby and far away. New tools must be developed to help people navigate and find knowledge in complex, fragmented, networked societies.

  17. K-12 Computer Networking.

    ERIC Educational Resources Information Center

    ERIC Review, 1993

    1993-01-01

    The "ERIC Review" is published three times a year and announces research results, publications, and new programs relevant to each issue's theme topic. This issue explores computer networking in elementary and secondary schools via two principal articles: "Plugging into the 'Net'" (Michael B. Eisenberg and Donald P. Ely); and "Computer Networks for…

  18. Hyperswitch Communication Network Computer

    NASA Technical Reports Server (NTRS)

    Peterson, John C.; Chow, Edward T.; Priel, Moshe; Upchurch, Edwin T.

    1993-01-01

    Hyperswitch Communications Network (HCN) computer is prototype multiple-processor computer being developed. Incorporates improved version of hyperswitch communication network described in "Hyperswitch Network For Hypercube Computer" (NPO-16905). Designed to support high-level software and expansion of itself. HCN computer is message-passing, multiple-instruction/multiple-data computer offering significant advantages over older single-processor and bus-based multiple-processor computers, with respect to price/performance ratio, reliability, availability, and manufacturing. Design of HCN operating-system software provides flexible computing environment accommodating both parallel and distributed processing. Also achieves balance among following competing factors: performance in processing and communications, ease of use, and tolerance of (and recovery from) faults.

  19. Computer networking at FERMILAB

    SciTech Connect

    Chartrand, G.

    1986-05-01

    Management aspects of data communications facilities at Fermilab are described. Local area networks include Ferminet, a broadband CATV system which serves as a backbone-type carrier for high-speed data traffic between major network nodes; the Micom network, consisting of four Micom Micro-600/2A port selectors reached via private twisted-pair cables, dedicated telephone circuits, or Micom 800/2 statistical multiplexors; and DECnet/Ethernet, several small local area networks which provide host-to-host communications for about 35 VAX computer systems. Wide-area (off-site) computer networking includes an off-site Micom network which provides access to all of Fermilab's computer systems for 10 universities via leased lines or modem; Tymnet, used by many European and Japanese collaborations; Physnet, used for shared data-processing task communications by large collaborations of universities; Bitnet, used for file transfer, electronic mail, and communications with CERN; and Mfenet, for access to supercomputers. Plans to participate in Hepnet are also addressed. 3 figs. (DWL)

  20. Computer Programming Projects in Technology Courses.

    ERIC Educational Resources Information Center

    Thomas, Charles R.

    1985-01-01

    Discusses programming projects in applied technology courses, examining documentation, formal reports, and implementation. Includes recommendations based on experience with a sophomore machine elements course which provided computers for problem solving exercises. (DH)

  1. Computers, Networks and Work.

    ERIC Educational Resources Information Center

    Sproull, Lee; Kiesler, Sara

    1991-01-01

    Discussed are how computer networks can affect the nature of work and the relationships between managers and employees. The differences between face-to-face exchanges and electronic interactions are described. (KR)

  2. Computers, Networks and Education.

    ERIC Educational Resources Information Center

    Kay, Alan C.

    1991-01-01

    Discussed is how globally networked, easy-to-use computers can enhance learning only within an educational environment that encourages students to question "facts" and seek challenges. The strengths and weaknesses of computers used as amplifiers for learning are described. (KR)

  3. Teaching Computer Science Courses in Distance Learning

    ERIC Educational Resources Information Center

    Huan, Xiaoli; Shehane, Ronald; Ali, Adel

    2011-01-01

    As the success of distance learning (DL) has driven universities to increase the courses offered online, certain challenges arise when teaching computer science (CS) courses to students who are not physically co-located and have individual learning schedules. Teaching CS courses involves high level demonstrations and interactivity between the…

  4. Gateways among Academic Computer Networks.

    ERIC Educational Resources Information Center

    McCredie, John W.

    National intercampus computer networks are discussed, along with six illustrative networks. Attention is focused on computer networks with significant academic usage through which special software is available to manage resources in the network. It is noted that computer networks have widespread use among academics for communication in the form of…

  5. Computer Network Resources for Physical Geography Instruction.

    ERIC Educational Resources Information Center

    Bishop, Michael P.; And Others

    1993-01-01

    Asserts that the use of computer networks provides an important and effective resource for geography instruction. Describes the use of the Internet network in physical geography instruction. Provides an example of the use of Internet resources in a climatology/meteorology course. (CFR)

  6. Computer and information networks.

    PubMed

    Greenberger, M; Aronofsky, J; McKenney, J L; Massy, W F

    1973-10-01

    The most basic conclusion coming out of the EDUCOM seminars is that computer networking must be acknowledged as an important new mode for obtaining information and computation (15). It is a real alternative that needs to be given serious attention in current planning and decision-making. Yet the fact is that many institutions are not taking account of networks when they confer on whether or how to replace their main computer. Articulation of the possibilities of computer networks goes back to the early 1960's and before, and working networks have been in evidence for several years now, both commercially and in universities. What is new, however, is the unmistakable recognition-bordering on a sense of the inevitable-that networks are finally practical and here to stay. The visionary and promotional phases of computer networks are over. It is time for hard-nosed comparative analysis (16). Another conclusion of the seminars has to do with the factors that hinder the fuller development of networking. The major problems to be overcome in applying networks to research and education are political, organizational, and economic in nature rather than technological. This is not to say that the hardware and software problems of linking computers and information systems are completely solved, but they are not the big bottlenecks at present. Research and educational institutions must find ways to organize themselves as well as their computers to work together for greater resource sharing. The coming of age of networks takes on special significance as a result of widespread dissatisfactions expressed with the present computing situation. There is a feeling that the current mode of autonomous, self-sufficient operation in the provision of computing and information services is frequently wasteful, deficient, and unresponsive to users' needs because of duplication of effort from one installation to another, incompatibilities, and inadequate documentation, program support, and user

  8. A course in Computational Physics

    NASA Astrophysics Data System (ADS)

    Rawitscher, George

    2011-03-01

    This course, taught at UConn, has several objectives: 1) to make the students comfortable in using MATLAB; 2) to reveal the existence of unavoidable inaccuracies due to numerical roundoff errors and algorithm inaccuracies; and 3) to introduce modern spectral expansion methods and compare them with conventional finite difference methods. Some of the projects assigned in the course will be described, such as the motion of a falling parachute and the vibrations of an inhomogeneous string.

  9. A Computer Applications Course for Biology Students.

    ERIC Educational Resources Information Center

    Ralph, Charles L.

    1989-01-01

    Describes a computer applications course developed for undergraduate biology students (primarily sophomores) to teach word processing, database and spreadsheet programs, graphing programs, telecommunications, and file transfer procedures. AppleWorks software is discussed, course content is explained, and the microcomputer laboratory is described.…

  10. The Educational Computing Course. [SITE 2002 Section].

    ERIC Educational Resources Information Center

    Bump, Wren, Ed.

    This document contains the following papers on the educational computing course from the SITE (Society for Information Technology & Teacher Education) 2002 conference: (1) "Integrating Media Literacy into a Technology Course for Preservice Secondary Teachers" (Gregg Brownell and Nancy Brownell); (2) "From Video Tutors to Electronic Portfolios:…

  11. Rethink Required Courses in Computer Programming.

    ERIC Educational Resources Information Center

    Texley, Juliana

    1988-01-01

    The educational value of courses in computer programing must be judged by sound curriculum criteria: they should fit a logical sequence of K-12 learning objectives, expose students to future career opportunities, and teach students reasoning skills. (TE)

  12. Computation in gene networks

    NASA Astrophysics Data System (ADS)

    Ben-Hur, Asa; Siegelmann, Hava T.

    2004-03-01

    Genetic regulatory networks have the complex task of controlling all aspects of life. Using a model of gene expression based on piecewise linear differential equations, we show that this process can be considered a process of computation. This is demonstrated by showing that the model can simulate memory-bounded Turing machines. The simulation is robust with respect to perturbations of the system, an important property for both analog computers and biological systems. Robustness is achieved using a condition that ensures that the model equations, which are generally chaotic, follow a predictable dynamics.
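
    The abstract does not give the model equations; the sketch below is a minimal, assumed illustration of piecewise-linear gene dynamics (a two-gene mutual-repression switch), showing how threshold regulation plus decay can store a bit, which is the flavor of computation the paper formalizes.

```python
def step(x, theta):
    """Heaviside-like regulation: production is on (1) only while the
    repressor is below its threshold."""
    return 1.0 if x < theta else 0.0

def simulate_toggle(x0, y0, k=2.0, gamma=1.0, theta=1.0, dt=0.01, steps=2000):
    """Euler-integrate a two-gene piecewise-linear mutual-repression loop."""
    x, y = x0, y0
    for _ in range(steps):
        dx = k * step(y, theta) - gamma * x   # gene X is repressed by Y
        dy = k * step(x, theta) - gamma * y   # gene Y is repressed by X
        x += dt * dx
        y += dt * dy
    return x, y

print(simulate_toggle(0.2, 1.5))  # settles with Y high, X low
print(simulate_toggle(1.5, 0.2))  # settles with X high, Y low: one stored bit
```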

  13. Closeness Possible through Computer Networking.

    ERIC Educational Resources Information Center

    Dodd, Julie E.

    1989-01-01

    Points out the benefits of computer networking for scholastic journalism. Discusses three systems currently offering networking possibilities for publications: the Student Press Information Network; the Youth Communication Service; and the Dow Jones Newspaper Fund's electronic mail system. (MS)

  14. Network Coding for Function Computation

    ERIC Educational Resources Information Center

    Appuswamy, Rathinakumar

    2011-01-01

    In this dissertation, the following "network computing problem" is considered. Source nodes in a directed acyclic network generate independent messages and a single receiver node computes a target function f of the messages. The objective is to maximize the average number of times f can be computed per network usage, i.e., the "computing…

  15. Using Computers in College Reading Courses?

    ERIC Educational Resources Information Center

    Wepner, Shelley B.; And Others

    1989-01-01

    Compares gains in reading rate and comprehension of students using traditional texts and those using a computer-based, commercially prepared, speed reading package. Offers recommendations for the use of computers in college reading courses, focusing on hardware/software needs and student/teacher responsibilities. (DMM)

  16. Effectiveness of Simulation in a Hybrid and Online Networking Course.

    ERIC Educational Resources Information Center

    Cameron, Brian H.

    2003-01-01

    Reports on a study that compares the performance of students enrolled in two sections of a Web-based computer networking course: one utilizing a simulation package and the second utilizing a static, graphical software package. Analysis shows statistically significant improvements in performance in the simulation group compared to the…

  17. Plagiarism in computer science courses

    SciTech Connect

    Harris, J.K.

    1994-12-31

    Plagiarism of computer programs has long been a problem in higher education. Ease of electronic copying, vague understanding by students as to what constitutes plagiarism, increasing acceptance of plagiarism by students, lack of enforcement by instructors and school administrators, and a whole host of other factors contribute to plagiarism. The first step in curbing plagiarism is prevention; the second (and much less preferable) is detection. History files and software metrics can be used as tools to aid in detecting possible plagiarism. This paper gives advice on how to deal with plagiarism and on using software monitors to detect it.
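
    The paper's own detection aids are history files and software metrics; as a generic, assumed illustration of the metric idea (not the author's tool), the sketch below scores how similar two submissions remain after identifiers are collapsed, so renaming variables alone does not hide copying.

```python
import difflib
import io
import keyword
import tokenize

def normalized_tokens(source: str):
    """Reduce Python source to a coarse token stream: identifiers collapse
    to 'ID' so renamed variables still match; keywords, operators, numbers,
    and strings are kept as-is."""
    out = []
    for tok in tokenize.generate_tokens(io.StringIO(source).readline):
        if tok.type == tokenize.NAME and tok.string not in keyword.kwlist:
            out.append("ID")
        elif tok.type in (tokenize.NAME, tokenize.OP, tokenize.NUMBER, tokenize.STRING):
            out.append(tok.string)
    return out

def similarity(src_a: str, src_b: str) -> float:
    """Ratio in [0, 1]; values near 1 suggest the pair deserves a closer look."""
    return difflib.SequenceMatcher(None,
                                   normalized_tokens(src_a),
                                   normalized_tokens(src_b)).ratio()

a = "def total(xs):\n    s = 0\n    for x in xs:\n        s += x\n    return s\n"
b = "def add_all(nums):\n    acc = 0\n    for n in nums:\n        acc += n\n    return acc\n"
print(similarity(a, b))  # close to 1.0 despite the renamed identifiers
```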

  18. A social implications of computing course which teaches computer ethics

    SciTech Connect

    Pulliam, S.C.

    1994-12-31

    Computers are integral to today's world, forming our society as well as responding to it. In recognition of this interaction, as well as in response to requirements by the Computer Science Accrediting Board (CSAB), many schools are incorporating computer ethics and values and addressing the social implications of computing within their curriculum. The approach discussed here is through a separate course, rather than relying on the integration of specific topics throughout the curriculum.

  19. Computing in the Introductory Physics Course

    NASA Astrophysics Data System (ADS)

    Chabay, Ruth; Sherwood, Bruce

    2004-03-01

    In the Matter & Interactions version of the calculus-based introductory physics course (http://www4.ncsu.edu/~rwchabay/mi), students write programs in VPython (http://vpython.org) to model physical systems and to calculate and visualize electric and magnetic fields. VPython is unusually easy to learn, produces navigable 3D animations as a side effect of physics computations, and supports full vector calculations. The high speed of current computers makes sophisticated numerical analysis techniques unnecessary. Students can use simple first-order Euler integration, cutting the step size until the behavior of the system no longer changes. In mechanics, iterative application of the momentum principle gives students a sense of the time-evolution character of Newton's second law which is usually missing from the standard course. In E&M, students calculate electric and magnetic fields numerically and display them in 3D. We are currently studying the impact of introducing computational physics into the introductory course.
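
    As a minimal sketch of the workflow the abstract describes (plain Python rather than VPython, and a mass on a spring assumed here as the example system), the loop below applies the momentum principle with a first-order Euler update and is rerun with smaller time steps until the answer stops changing:

```python
def oscillator_position(t_final=10.0, dt=0.01, m=1.0, k=1.0, x0=1.0):
    """First-order Euler iteration of the momentum principle for a mass
    on a spring:  p <- p + F*dt,  x <- x + (p/m)*dt."""
    x, p, t = x0, 0.0, 0.0
    while t < t_final:
        F = -k * x          # net force from the spring
        p += F * dt         # momentum principle
        x += (p / m) * dt   # position update from the new momentum
        t += dt
    return x

# "Cut the step size until the behavior of the system no longer changes."
for dt in (0.1, 0.01, 0.001):
    print(dt, oscillator_position(dt=dt))
```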

  20. Computer Networks and Networking: A Primer.

    ERIC Educational Resources Information Center

    Collins, Mauri P.

    1993-01-01

    Provides a basic introduction to computer networks and networking terminology. Topics addressed include modems; the Internet; TCP/IP (Transmission Control Protocol/Internet Protocol); transmission lines; Internet Protocol numbers; network traffic; Fidonet; file transfer protocol (FTP); TELNET; electronic mail; discussion groups; LISTSERV; USENET;…

  1. A Survey of Computer Science Capstone Course Literature

    ERIC Educational Resources Information Center

    Dugan, Robert F., Jr.

    2011-01-01

    In this article, we surveyed literature related to undergraduate computer science capstone courses. The survey was organized around course and project issues. Course issues included: course models, learning theories, course goals, course topics, student evaluation, and course evaluation. Project issues included: software process models, software…

  2. Augmenting computer networks

    NASA Technical Reports Server (NTRS)

    Bokhari, S. H.; Raza, A. D.

    1984-01-01

    Three methods of augmenting computer networks by adding at most one link per processor are discussed: (1) A tree of N nodes may be augmented such that the resulting graph has diameter no greater than 4 log2((N+2)/3) - 2. This O(N^3) algorithm can be applied to any spanning tree of a connected graph to reduce the diameter of that graph to O(log N); (2) Given a binary tree T and a chain C of N nodes each, C may be augmented to produce C' so that T is a subgraph of C'. This algorithm is O(N) and may be used to produce augmented chains or rings that have diameter no greater than 2 log2((N+2)/3) and are planar; (3) Any rectangular two-dimensional 4 (8) nearest-neighbor array of size N = 2^k may be augmented so that it can emulate a single-step shuffle-exchange network of size N/2 in 3^t time steps.
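
    The bounds above refer to graph diameter, the longest shortest-path distance. The augmentation algorithms themselves are not given in the abstract; the assumed sketch below only computes a graph's diameter by repeated BFS, which is enough to check the effect of adding a single extra link to a small example.

```python
from collections import deque

def diameter(adj):
    """Longest shortest-path distance in an unweighted, connected graph
    given as {node: set(neighbours)}, computed by BFS from every node."""
    best = 0
    for src in adj:
        dist = {src: 0}
        queue = deque([src])
        while queue:
            u = queue.popleft()
            for v in adj[u]:
                if v not in dist:
                    dist[v] = dist[u] + 1
                    queue.append(v)
        best = max(best, max(dist.values()))
    return best

# A path of 8 nodes has diameter 7; adding one extra link shrinks it.
path = {i: set() for i in range(8)}
for i in range(7):
    path[i].add(i + 1); path[i + 1].add(i)
print(diameter(path))           # 7
path[0].add(4); path[4].add(0)  # augment with one extra link
print(diameter(path))           # 5
```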

  3. A survey of computer science capstone course literature

    NASA Astrophysics Data System (ADS)

    Dugan, Robert F., Jr.

    2011-09-01

    In this article, we surveyed literature related to undergraduate computer science capstone courses. The survey was organized around course and project issues. Course issues included: course models, learning theories, course goals, course topics, student evaluation, and course evaluation. Project issues included: software process models, software process phases, project type, documentation, tools, groups, and instructor administration. We reflected on these issues and the computer science capstone course we have taught for seven years. The survey summarized, organized, and synthesized the literature to provide a referenced resource for computer science instructors and researchers interested in computer science capstone courses.

  4. Sharing Writing through Computer Networking.

    ERIC Educational Resources Information Center

    Fey, Marion H.

    1997-01-01

    Suggests computer networking can support the essential purposes of the collaborative-writing movement, offering opportunities for sharing writing. Notes that literacy teachers are exploring the connectivity of computer networking through numerous designs that use either real-time or asynchronous communication. Discusses new roles for students and…

  5. Course 10: Basic Concepts in Quantum Computation

    NASA Astrophysics Data System (ADS)

    Ekert, A.; Hayden, P. M.; Inamori, H.

    Contents: 1. Qubits, gates and networks; 2. Quantum arithmetic and function evaluations; 3. Algorithms and their complexity; 4. From interferometers to computers; 5. The first quantum algorithms; 6. Quantum search; 7. Optimal phase estimation; 8. Periodicity and quantum factoring; 9. Cryptography; 10. Conditional quantum dynamics; 11. Decoherence and recoherence; 12. Concluding remarks.

  6. A Computer-based Course in Classical Mechanics.

    ERIC Educational Resources Information Center

    Kane, D.; Sherwood, B.

    1980-01-01

    Describes and illustrates the tutorial and homework exercise lessons, student routing, course organization, administration, and evaluation of a PLATO computer-based course in classical mechanics. An appendix lists 41 lessons developed for the course. (CMV)

  7. Queueing in networks of computers

    NASA Technical Reports Server (NTRS)

    Denning, Peter J.

    1991-01-01

    The designers of networks of computers must assess the capacity of the network to complete work within reasonable times. The utilization law, Little's law, the forced-flow law, and the response time formula are simple tools that can be used to calculate throughput and response times of networks. Bottleneck analysis can be used to calculate simple lower bounds on response time in terms of individual server parameters and the load on the network as a whole. These simple results are important tools for all users of scientific networks - back-of-the-envelope calculations can quickly reveal the effects of distant servers on local throughput and response time.
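
    A minimal worked example of those back-of-the-envelope bounds, using assumed per-job service demands for a two-server (CPU plus disk) network; the formulas are the standard operational-analysis bottleneck bounds, with Little's law applied to the whole system, N = X(R + Z), tying throughput and response time together.

```python
def bottleneck_bounds(demands, n_jobs, think_time=0.0):
    """Asymptotic bounds from operational analysis.
    demands: per-job total service demand D_k = V_k * S_k at each server (s).
    Returns (upper bound on throughput X, lower bound on response time R)."""
    d_total = sum(demands)   # one job's total service demand
    d_max = max(demands)     # demand at the bottleneck server
    x_upper = min(1.0 / d_max, n_jobs / (d_total + think_time))
    r_lower = max(d_total, n_jobs * d_max - think_time)
    return x_upper, r_lower

# Toy network: CPU demand 0.040 s/job, disk demand 0.100 s/job, 20 jobs, no think time.
x_ub, r_lb = bottleneck_bounds([0.040, 0.100], n_jobs=20)
print(f"throughput <= {x_ub:.1f} jobs/s, response time >= {r_lb:.2f} s")
```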

  8. Computers, Networks and the Corporation.

    ERIC Educational Resources Information Center

    Malone, Thomas W.; Rockart, John F.

    1991-01-01

    The ways in which computer networks are forging new kinds of markets and new ways to manage organizations are described. Discussed are the results of these innovations, which include changes in corporate structure and management style. (KR)

  9. Demonstrations of neural network computations involving students.

    PubMed

    May, Christopher J

    2010-01-01

    David Marr famously proposed three levels of analysis (implementational, algorithmic, and computational) for understanding information processing systems such as the brain. While two of these levels are commonly taught in neuroscience courses (the implementational level through neurophysiology and the computational level through systems/cognitive neuroscience), the algorithmic level is typically neglected. This leaves an explanatory gap in students' understanding of how, for example, the flow of sodium ions enables cognition. Neural networks bridge these two levels by demonstrating how collections of interacting neuron-like units can give rise to more overtly cognitive phenomena. The demonstrations in this paper are intended to facilitate instructors' introduction and exploration of how neurons "process information."
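
    As one assumed classroom-style demonstration in the spirit of the paper (not one of its published exercises), the fixed-weight network below shows threshold units jointly computing XOR, a function no single threshold unit can compute; this is the kind of algorithmic-level bridge between unit firing and a small "cognitive" capability that the demonstrations aim at.

```python
def unit(inputs, weights, threshold):
    """A McCulloch-Pitts style unit: fire (1) if the weighted sum of its
    inputs reaches the threshold, otherwise stay silent (0)."""
    total = sum(i * w for i, w in zip(inputs, weights))
    return 1 if total >= threshold else 0

def xor_net(a, b):
    """Two hidden units detect 'a AND NOT b' and 'b AND NOT a';
    an output unit fires if either hidden unit fires."""
    h1 = unit([a, b], [1, -1], 1)
    h2 = unit([a, b], [-1, 1], 1)
    return unit([h1, h2], [1, 1], 1)

for a in (0, 1):
    for b in (0, 1):
        print(a, b, xor_net(a, b))   # reproduces the XOR truth table
```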

  10. Computer (PC/Network) Coordinator.

    ERIC Educational Resources Information Center

    Ohio State Univ., Columbus. Center on Education and Training for Employment.

    This publication contains 22 subjects appropriate for use in a competency list for the occupation of computer (PC/network) coordinator, 1 of 12 occupations within the business/computer technologies cluster. Each unit consists of a number of competencies; a list of competency builders is provided for each competency. Titles of the 22 units are as…

  11. Communications, Computers and Networks.

    ERIC Educational Resources Information Center

    Dertouzos, Michael L.

    1991-01-01

    The infrastructure created by fusing computing and communications technologies is described. The effect of this infrastructure on the economy and society of the United States is discussed. The importance of knowing the value and role of information is emphasized. (KR)

  12. A CAD (Classroom Assessment Design) of a Computer Programming Course

    ERIC Educational Resources Information Center

    Hawi, Nazir S.

    2012-01-01

    This paper presents a CAD (classroom assessment design) of an entry-level undergraduate computer programming course "Computer Programming I". CAD has been the product of a long experience in teaching computer programming courses including teaching "Computer Programming I" 22 times. Each semester, CAD is evaluated and modified for the subsequent…

  13. LINCS: Livermore's network architecture. [Octopus computing network

    SciTech Connect

    Fletcher, J.G.

    1982-01-01

    Octopus, a local computing network that has been evolving at the Lawrence Livermore National Laboratory for over fifteen years, is currently undergoing a major revision. The primary purpose of the revision is to consolidate and redefine the variety of conventions and formats, which have grown up over the years, into a single standard family of protocols, the Livermore Interactive Network Communication Standard (LINCS). This standard treats the entire network as a single distributed operating system such that access to a computing resource is obtained in a single way, whether that resource is local (on the same computer as the accessing process) or remote (on another computer). LINCS encompasses not only communication but also such issues as the relationship of customer to server processes and the structure, naming, and protection of resources. The discussion includes: an overview of the Livermore user community and computing hardware, the functions and structure of each of the seven layers of the LINCS protocol, the reasons why we have designed our own protocols, and why we are dissatisfied with the directions that current protocol standards are taking.
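
    The abstract's central design idea is location transparency: a resource is opened the same way whether it lives on the local machine or on another node. The toy sketch below (entirely assumed, not LINCS code; all names are hypothetical) shows that single access path, with a catalog deciding whether a request stays local or goes over the network.

```python
local_store = {"density.dat": b"local bytes"}
catalog = {"density.dat": "local", "mesh.dat": "node-7"}  # resource -> location

def fetch_remote(node: str, name: str) -> bytes:
    """Placeholder for a network request to another node."""
    return f"<bytes of {name} fetched from {node}>".encode()

def open_resource(name: str) -> bytes:
    """Same call for every resource; the catalog hides local vs. remote."""
    location = catalog[name]
    if location == "local":
        return local_store[name]
    return fetch_remote(location, name)

print(open_resource("density.dat"))  # served locally
print(open_resource("mesh.dat"))     # served by another node, same interface
```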

  14. Goethe Gossips with Grass: Using Computer Chatting Software in an Introductory Literature Course.

    ERIC Educational Resources Information Center

    Fraser, Catherine C.

    1999-01-01

    Students in a third-year introduction to German literature course chatted over networked computers, using "FirstClass" software. A brief description of the course design is provided with detailed information on how the three chat sessions were organized. (Author/VWL)

  15. Computer networking for scientists.

    PubMed

    Jennings, D M; Landweber, L H; Fuchs, I H; Farber, D J; Adrion, W R

    1986-02-28

    Scientific research has always relied on communication for gathering and providing access to data; for exchanging information; for holding discussions, meetings, and seminars; for collaborating with widely dispersed researchers; and for disseminating results. The pace and complexity of modern research, especially collaborations of researchers in different institutions, has dramatically increased scientists' communications needs. Scientists now need immediate access to data and information, to colleagues and collaborators, and to advanced computing and information services. Furthermore, to be really useful, communication facilities must be integrated with the scientist's normal day-to-day working environment. Scientists depend on computing and communications tools and are handicapped without them. PMID:17740290

  16. Computer Business Applications II. Course Two. Information Systems Curriculum.

    ERIC Educational Resources Information Center

    O'Neil, Sharon Lund; Everett, Donna R.

    This course is the second of seven in the Information Systems curriculum. The purpose of the course is to build on the skills acquired in the prerequisite course, Computer Business Applications I, through the manipulation of word processing, spreadsheet, database management, and graphics software. An overview of the course sets forth the condition…

  17. Computer Network Security- The Challenges of Securing a Computer Network

    NASA Technical Reports Server (NTRS)

    Scotti, Vincent, Jr.

    2011-01-01

    This article is intended to give the reader an overall perspective on what it takes to design, implement, enforce and secure a computer network in the federal and corporate world to ensure the confidentiality, integrity and availability of information. While we will be giving you an overview of network design and security, this article will concentrate on the technology and human factors of securing a network and the challenges faced by those doing so. It will cover the large number of policies and the limits of technology and physical efforts to enforce such policies.

  18. Optical computer switching network

    NASA Technical Reports Server (NTRS)

    Clymer, B.; Collins, S. A., Jr.

    1985-01-01

    The design for an optical switching system for minicomputers that uses an optical spatial light modulator such as a Hughes liquid crystal light valve is presented. The switching system is designed to connect 80 minicomputers coupled to the switching system by optical fibers. The system has two major parts: the connection system that connects the data lines by which the computers communicate via a two-dimensional optical matrix array and the control system that controls which computers are connected. The basic system, the matrix-based connecting system, and some of the optical components to be used are described. Finally, the details of the control system are given and illustrated with a discussion of timing.

  19. Information visualization courses for students with a computer science background.

    PubMed

    Kerren, Andreas

    2013-01-01

    Linnaeus University offers two master's courses in information visualization for computer science students with programming experience. This article briefly describes the syllabi, exercises, and practices developed for these courses.

  20. Hyperswitch Network For Hypercube Computer

    NASA Technical Reports Server (NTRS)

    Chow, Edward; Madan, Herbert; Peterson, John

    1989-01-01

    Data-driven dynamic switching enables high speed data transfer. Proposed hyperswitch network based on mixed static and dynamic topologies. Routing header modified in response to congestion or faults encountered as path established. Static topology meets requirement if nodes have switching elements that perform necessary routing header revisions dynamically. Hypercube topology now being implemented with switching element in each computer node aimed at designing very-richly-interconnected multicomputer system. Interconnection network connects great number of small computer nodes, using fixed hypercube topology, characterized by point-to-point links between nodes.
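
    For orientation, here is the standard dimension-order (e-cube) route on a hypercube, the fixed point-to-point structure the abstract describes; the hyperswitch's header rewriting on congestion or faults is an adaptive layer on top of this and is not modeled in this assumed sketch.

```python
def ecube_route(src: int, dst: int, dims: int):
    """Dimension-order route on a 2**dims-node hypercube: at each hop flip
    the lowest-order bit in which the current node and the destination
    differ, i.e. cross one link per differing dimension."""
    assert 0 <= src < 2 ** dims and 0 <= dst < 2 ** dims
    path = [src]
    node = src
    while node != dst:
        diff = node ^ dst
        bit = diff & -diff   # lowest set bit of the address difference
        node ^= bit          # cross the link in that dimension
        path.append(node)
    return path

print(ecube_route(0b000, 0b101, dims=3))  # [0, 1, 5]: two hops for two differing bits
```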

  1. Networking DEC and IBM computers

    NASA Technical Reports Server (NTRS)

    Mish, W. H.

    1983-01-01

    Local area networking of DEC and IBM computers within the structure of the ISO-OSI Seven Layer Reference Model, at a raw signaling speed of 1 Mbps or greater, is discussed. After an introduction to the ISO-OSI Reference Model and the IEEE-802 Draft Standard for Local Area Networks (LANs), there follows a detailed discussion and comparison of the products available from a variety of manufacturers to perform this networking task. A summary of these products is presented in a table.

  2. Collective network for computer structures

    DOEpatents

    Blumrich, Matthias A.; Coteus, Paul W.; Chen, Dong; Gara, Alan; Giampapa, Mark E.; Heidelberger, Philip; Hoenicke, Dirk; Takken, Todd E.; Steinmacher-Burow, Burkhard D.; Vranas, Pavlos M.

    2011-08-16

    A system and method for enabling high-speed, low-latency global collective communications among interconnected processing nodes. The global collective network optimally enables collective reduction operations to be performed during parallel algorithm operations executing in a computer structure having a plurality of the interconnected processing nodes. Router devices are included that interconnect the nodes of the network via links to facilitate performance of low-latency global processing operations at nodes of the virtual network and class structures. The global collective network may be configured to provide global barrier and interrupt functionality in asynchronous or synchronized manner. When implemented in a massively-parallel supercomputing structure, the global collective network is physically and logically partitionable according to needs of a processing algorithm.
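
    The collective reduction the patent describes folds one contribution per node into a single result along a tree in logarithmically many combining steps. The sketch below is a plain software illustration of that folding pattern, not the patented hardware or its interface.

```python
from operator import add

def tree_reduce(values, op=add):
    """Combine one value per node pairwise, level by level, the way a
    collective network folds partial results toward a root: about
    log2(P) combining rounds instead of P - 1 sequential additions."""
    level = list(values)
    while len(level) > 1:
        nxt = []
        for i in range(0, len(level) - 1, 2):
            nxt.append(op(level[i], level[i + 1]))  # each pair merges in one round
        if len(level) % 2:                          # an odd node forwards its value
            nxt.append(level[-1])
        level = nxt
    return level[0]

print(tree_reduce([3, 1, 4, 1, 5, 9, 2, 6]))  # 31, same as sum()
```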

  3. Collective network for computer structures

    DOEpatents

    Blumrich, Matthias A; Coteus, Paul W; Chen, Dong; Gara, Alan; Giampapa, Mark E; Heidelberger, Philip; Hoenicke, Dirk; Takken, Todd E; Steinmacher-Burow, Burkhard D; Vranas, Pavlos M

    2014-01-07

    A system and method for enabling high-speed, low-latency global collective communications among interconnected processing nodes. The global collective network optimally enables collective reduction operations to be performed during parallel algorithm operations executing in a computer structure having a plurality of the interconnected processing nodes. Router devices are included that interconnect the nodes of the network via links to facilitate performance of low-latency global processing operations at nodes of the virtual network. The global collective network may be configured to provide global barrier and interrupt functionality in asynchronous or synchronized manner. When implemented in a massively-parallel supercomputing structure, the global collective network is physically and logically partitionable according to the needs of a processing algorithm.

  4. Using Virtualization and Automatic Evaluation: Adapting Network Services Management Courses to the EHEA

    ERIC Educational Resources Information Center

    Ros, S.; Robles-Gomez, A.; Hernandez, R.; Caminero, A. C.; Pastor, R.

    2012-01-01

    This paper outlines the adaptation of a course on the management of network services in operating systems, called NetServicesOS, to the context of the new European Higher Education Area (EHEA). NetServicesOS is a mandatory course in one of the official graduate programs in the Faculty of Computer Science at the Universidad Nacional de Educacion a…

  5. Combining Cloud Networks and Course Management Systems for Enhanced Analysis in Teaching Laboratories

    ERIC Educational Resources Information Center

    Abrams, Neal M.

    2012-01-01

    A cloud network system is combined with standard computing applications and a course management system to provide a robust method for sharing data among students. This system provides a unique method to improve data analysis by easily increasing the amount of sampled data available for analysis. The data can be shared within one course as well as…

  6. Using Computer Networking for Feedback.

    ERIC Educational Resources Information Center

    Woodward, John; And Others

    1987-01-01

    Two studies involving 27 learning-disabled middle-school students and 30 mildly handicapped junior high students investigated use of Teacher Net, a computer networking system that facilitates immediate feedback. Teacher Net reduced the teachers' administrative workload, effectively monitored student understanding, provided feedback to teachers,…

  7. Simulation of reliability in multiserver computer networks

    NASA Astrophysics Data System (ADS)

    Minkevičius, Saulius

    2012-11-01

    The performance of computer multiserver networks, in terms of reliability, motivates this paper. A probability limit theorem on the extreme queue length in open multiserver queueing networks in heavy traffic is derived and applied to a reliability model for multiserver computer networks, in which the time of failure of a multiserver computer network is related to the system parameters.
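
    As an assumed, purely illustrative companion to the limit theorem, the event-driven simulation below records the largest number of waiting jobs seen in an M/M/c system over a finite horizon; the paper's result concerns how such extremes behave in heavy traffic.

```python
import heapq
import random

def max_queue_mmc(lam=4.5, mu=1.0, servers=5, horizon=10_000.0, seed=1):
    """Event-driven M/M/c simulation that tracks the peak number of
    waiting jobs before `horizon` (a stand-in for the extreme queue length)."""
    random.seed(seed)
    busy, waiting, peak = 0, 0, 0
    events = [(random.expovariate(lam), "arrival")]
    while events:
        t, kind = heapq.heappop(events)
        if t > horizon:
            break
        if kind == "arrival":
            heapq.heappush(events, (t + random.expovariate(lam), "arrival"))
            if busy < servers:
                busy += 1
                heapq.heappush(events, (t + random.expovariate(mu), "departure"))
            else:
                waiting += 1
                peak = max(peak, waiting)
        else:  # departure: next waiting job starts service, if any
            if waiting:
                waiting -= 1
                heapq.heappush(events, (t + random.expovariate(mu), "departure"))
            else:
                busy -= 1
    return peak

print(max_queue_mmc())
```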

  8. Optimal monitoring of computer networks

    SciTech Connect

    Fedorov, V.V.; Flanagan, D.

    1997-08-01

    The authors apply the ideas from optimal design theory to the very specific area of monitoring large computer networks. The behavior of these networks is so complex and uncertain that it is quite natural to use the statistical methods of experimental design which were originated in such areas as biology, behavioral sciences and agriculture, where the random character of phenomena is a crucial component and systems are too complicated to be described by some sophisticated deterministic models. They want to emphasize that only the first steps have been completed, and relatively simple underlying concepts about network functions have been used. Their immediate goal is to initiate studies focused on developing efficient experimental design techniques which can be used by practitioners working with large networks operating and evolving in a random environment.

  9. Courses About Computers--For Secondary School Students

    ERIC Educational Resources Information Center

    Mattei, K. C.

    1974-01-01

    Goals and guidelines for teaching courses about computers to secondary school students are discussed. A method of teaching introductory ideas of computer operations through the use of a programmable calculator is suggested. (DT)

  10. Integrating Computational Chemistry into a Course in Classical Thermodynamics

    ERIC Educational Resources Information Center

    Martini, Sheridan R.; Hartzell, Cynthia J.

    2015-01-01

    Computational chemistry is commonly addressed in the quantum mechanics course of undergraduate physical chemistry curricula. Since quantum mechanics traditionally follows the thermodynamics course, there is a lack of curricula relating computational chemistry to thermodynamics. A method integrating molecular modeling software into a semester long…

  11. Improving Student Engagement Using Course-Based Social Networks

    ERIC Educational Resources Information Center

    Imlawi, Jehad Mohammad

    2013-01-01

    This study proposes an engagement model that supports use of course-based online social networks for engaging student, and hence, improving their educational outcomes. This research demonstrates that instructors who create course-based online social networks to communicate with students can increase the student engagement in these online social…

  12. Demonstrations of Neural Network Computations Involving Students

    PubMed Central

    May, Christopher J.

    2010-01-01

    David Marr famously proposed three levels of analysis (implementational, algorithmic, and computational) for understanding information processing systems such as the brain. While two of these levels are commonly taught in neuroscience courses (the implementational level through neurophysiology and the computational level through systems/cognitive neuroscience), the algorithmic level is typically neglected. This leaves an explanatory gap in students’ understanding of how, for example, the flow of sodium ions enables cognition. Neural networks bridge these two levels by demonstrating how collections of interacting neuron-like units can give rise to more overtly cognitive phenomena. The demonstrations in this paper are intended to facilitate instructors’ introduction and exploration of how neurons “process information.” PMID:23493501

  13. Computer networking at SLR stations

    NASA Technical Reports Server (NTRS)

    Novotny, Antonin

    1993-01-01

    There are several existing communication methods to deliver data from a satellite laser ranging (SLR) station to the SLR data center and back: telephone modem, telex, and computer networks. The SLR scientific community has been exploiting mainly INTERNET, BITNET/EARN, and SPAN. A total of 56 countries are connected to INTERNET and the number of nodes is growing exponentially. The computer networks mentioned above and others are connected through the e-mail protocol. The scientific progress of SLR requires an increase in communication speed and in the amount of transmitted data. The TOPEX/POSEIDON test campaign required Quick Look data (1.7 kB/pass) to be delivered from an SLR site to the SLR data center within 8 hours and full rate data (up to 500 kB/pass) within 24 hours. We developed networking for the remote SLR station in Helwan, Egypt. The reliable scheme for data delivery consists of: compression of the MERIT2 format (up to 89 percent), encoding to ASCII files, and e-mail sending from the SLR station, followed by e-mail receiving, decoding, and decompression at the center. We propose to use the ZIP method for compression/decompression and the UUCODE method for ASCII encoding/decoding. This method will be useful for stations connected via telephone modems or commercial networks. Electronic delivery could solve the problem of full rate (FR) data arriving too late at the SLR data center.
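
    The delivery pipeline is compress, then wrap in 7-bit ASCII, then e-mail, with the inverse steps at the data center. The sketch below assumes zlib and base64 as stand-ins for the ZIP and UUCODE tools named in the abstract; the round trip is the same idea.

```python
import base64
import zlib

def pack_for_email(raw: bytes) -> str:
    """Compress a data file and wrap it in 7-bit ASCII so it can travel
    in an e-mail body (zlib + base64 here, ZIP + UUCODE at the station)."""
    return base64.b64encode(zlib.compress(raw, 9)).decode("ascii")

def unpack_at_center(text: str) -> bytes:
    """Reverse the steps at the SLR data center."""
    return zlib.decompress(base64.b64decode(text.encode("ascii")))

record = b"sample full-rate ranging record\n" * 200   # placeholder pass data
wire = pack_for_email(record)
assert unpack_at_center(wire) == record
print(f"{len(record)} bytes -> {len(wire)} ASCII characters on the wire")
```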

  14. Computer networking at SLR stations

    NASA Astrophysics Data System (ADS)

    Novotny, Antonin

    1993-06-01

    There are several existing communication methods to deliver data from a satellite laser ranging (SLR) station to the SLR data center and back: telephone modem, telex, and computer networks. The SLR scientific community has been exploiting mainly INTERNET, BITNET/EARN, and SPAN. A total of 56 countries are connected to INTERNET and the number of nodes is growing exponentially. The computer networks mentioned above and others are connected through the e-mail protocol. The scientific progress of SLR requires an increase in communication speed and in the amount of transmitted data. The TOPEX/POSEIDON test campaign required Quick Look data (1.7 kB/pass) to be delivered from an SLR site to the SLR data center within 8 hours and full rate data (up to 500 kB/pass) within 24 hours. We developed networking for the remote SLR station in Helwan, Egypt. The reliable scheme for data delivery consists of: compression of the MERIT2 format (up to 89 percent), encoding to ASCII files, and e-mail sending from the SLR station, followed by e-mail receiving, decoding, and decompression at the center. We propose to use the ZIP method for compression/decompression and the UUCODE method for ASCII encoding/decoding. This method will be useful for stations connected via telephone modems or commercial networks. Electronic delivery could solve the problem of full rate (FR) data arriving too late at the SLR data center.

  15. Computer Networks Improve Student Achievement, School Management.

    ERIC Educational Resources Information Center

    Cherry, Steve

    1991-01-01

    Using computer networking programs at two high schools as examples, this article describes what principals should know about networking. The many advantages of computer networking in schools will remain beneficial so long as the principal's objectives are met. Tips are provided for assessing the network. (eight references) (MLH)

  16. Integrating Emerging Topics through Online Team Design in a Hybrid Communication Networks Course: Interaction Patterns and Impact of Prior Knowledge

    ERIC Educational Resources Information Center

    Reisslein, Jana; Seeling, Patrick; Reisslein, Martin

    2005-01-01

    An important challenge in the introductory communication networks course in electrical and computer engineering curricula is to integrate emerging topics, such as wireless Internet access and network security, into the already content-intensive course. At the same time it is essential to provide students with experiences in online collaboration,…

  17. Inequities in the Computer Classroom: An Analysis of Two Computer Courses.

    ERIC Educational Resources Information Center

    Alspach, Phyllis A.

    This study analyzed enrollment in two computer classes at a public high school in northern Indiana to see if there was any computer inequity. The two classes examined--an introduction to computers course and a computer programming course--were studied over a period of four years. The sample consisted of 388 students in four years of the…

  18. Probabilistic computation by neuromine networks.

    PubMed

    Hangartner, R D; Cull, P

    2000-01-01

    In this paper, we address the question, can biologically feasible neural nets compute more than can be computed by deterministic polynomial-time algorithms? Since we want to maintain a claim of plausibility and reasonableness, we restrict ourselves to nets that are algorithmically easy to construct, and we rule out infinite precision in parameters and in any analog parts of the computation. Our approach is to consider the recent advances in randomized algorithms and see if such randomized computations can be described by neural nets. We start with a pair of neurons and show that by connecting them with reciprocal inhibition and some tonic input, the steady state will be one neuron ON and one neuron OFF, but which neuron will be ON and which will be OFF is chosen at random (perhaps it would be better to say that microscopic noise in the analog computation is turned into a megascale random bit). We then show that we can build a small network that uses this random-bit process to repeatedly generate random bits. This random-bit generator can then be connected with a neural net representing the deterministic part of a randomized algorithm. We therefore demonstrate that these neural nets can carry out probabilistic computation and thus are less limited than classical neural nets.
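
    A minimal numerical sketch of that first construction, under assumed rate-unit dynamics rather than the paper's exact model: two units with tonic drive and reciprocal inhibition settle into a winner-take-all state, and injected noise decides the winner, yielding one random bit per settling.

```python
import random

def flip_from_noise(steps=2000, dt=0.01, seed=None):
    """Two rate units with tonic drive (1.0) and reciprocal inhibition
    (weight 3.0) plus a little noise settle with one unit ON and the other
    OFF; the noise decides which, so each run produces one random bit."""
    rng = random.Random(seed)
    x = y = 0.0
    for _ in range(steps):
        dx = -x + max(0.0, 1.0 - 3.0 * y + rng.gauss(0, 0.05))
        dy = -y + max(0.0, 1.0 - 3.0 * x + rng.gauss(0, 0.05))
        x += dt * dx
        y += dt * dy
    return int(x > y)

print([flip_from_noise() for _ in range(10)])  # a run of random-looking bits
```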

  19. What Computer Courses Do Students Really Need?

    ERIC Educational Resources Information Center

    Friedstein, Harriet G.

    1986-01-01

    Ways that selected adult students perceived the value of computers in their private life and at work were studied. It was discovered that the adult students have mixed feelings about computers, their value in general and the advantages of computer literacy for career advancement. (MLW)

  20. Addressing Small Computers in the First OS Course

    ERIC Educational Resources Information Center

    Nutt, Gary

    2006-01-01

    Small computers are emerging as important components of the contemporary computing scene. Their operating systems vary from specialized software for an embedded system to the same style of OS used on a generic desktop or server computer. This article describes a course in which systems are classified by their hardware capability and the…

  1. Laboratories for a Liberal Education Computer Science Course.

    ERIC Educational Resources Information Center

    Kiper, James D.; Bishop-Clark, Cathy

    Computer science and other computer-related fields are faced with the high velocity of change in technology. Far more important than the knowledge of a particular software package are the liberal education skills that are learned in the process. This paper reviews the laboratory component of a new computer science course offered at Miami University…

  2. Traffic Dynamics of Computer Networks

    NASA Astrophysics Data System (ADS)

    Fekete, Attila

    2008-10-01

    Two important aspects of the Internet, namely the properties of its topology and the characteristics of its data traffic, have attracted growing attention of the physics community. My thesis has considered problems of both aspects. First I studied the stochastic behavior of TCP, the primary algorithm governing traffic in the current Internet, in an elementary network scenario consisting of a standalone infinite-sized buffer and an access link. The effect of the fast recovery and fast retransmission (FR/FR) algorithms is also considered. I showed that my model can be extended further to involve the effect of link propagation delay, characteristic of WAN. I continued my thesis with the investigation of finite-sized semi-bottleneck buffers, where packets can be dropped not only at the link, but also at the buffer. I demonstrated that the behavior of the system depends only on a certain combination of the parameters. Moreover, an analytic formula was derived that gives the ratio of packet loss rate at the buffer to the total packet loss rate. This formula makes it possible to treat buffer-losses as if they were link-losses. Finally, I studied computer networks from a structural perspective. I demonstrated through fluid simulations that the distribution of resources, specifically the link bandwidth, has a serious impact on the global performance of the network. Then I analyzed the distribution of edge betweenness in a growing scale-free tree under the condition that a local property, the in-degree of the "younger" node of an arbitrary edge, is known in order to find an optimum distribution of link capacity. The derived formula is exact even for finite-sized networks. I also calculated the conditional expectation of edge betweenness, rescaled for infinite networks.
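
    A crude Monte Carlo illustration of the buffer-loss versus link-loss split discussed above (a simplified slotted model with arbitrary parameters, not the thesis's analytic treatment):

```python
# When packets can be lost both at a finite buffer and on the link, what share of
# all losses happens at the buffer? Bursty arrivals come from several Bernoulli
# sources per slot; the link serves one packet per slot and drops each
# transmitted packet independently with probability p_link.
import random


def buffer_loss_share(buffer_size=10, sources=4, p_per_source=0.23, p_link=0.02,
                      slots=200_000, seed=1):
    rng = random.Random(seed)
    queue = 0
    buffer_drops = link_drops = 0
    for _ in range(slots):
        for _ in range(sources):                  # aggregate, bursty arrivals
            if rng.random() < p_per_source:
                if queue < buffer_size:
                    queue += 1
                else:
                    buffer_drops += 1             # buffer overflow
        if queue > 0:                             # link serves one packet per slot
            queue -= 1
            if rng.random() < p_link:
                link_drops += 1                   # loss on the link
    total = buffer_drops + link_drops
    return buffer_drops / total if total else 0.0


if __name__ == "__main__":
    print("buffer-loss share of total loss:", round(buffer_loss_share(), 3))
```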

  3. Of Bugs, Bytes, LANS and Viruses: The Changing Countenance of Computer Courses in Administrator Preparation Programs.

    ERIC Educational Resources Information Center

    Holloway, William H.

    Instruction in computing over the past three decades has experienced dramatic changes in both method and substance. Beginning with on-the-job training, courses evolved to become highly sophisticated and widespread in education. The second decade, the 70's, focused on mainframe use, and the third decade on micros and widely networked systems.…

  4. Computer Technology and Student Preferences in a Nutrition Course

    ERIC Educational Resources Information Center

    Temple, Norman J.; Kemp, Wendy C.; Benson, Wendy A.

    2006-01-01

    This study assessed learner preferences for using computer-based technology in a distance education course. A questionnaire was posted to students who had taken an undergraduate nutrition course at Athabasca University, Canada. The response rate was 57.1% (176 returned out of 308). Subjects were predominately female (93.7%) and nursing students…

  5. Automating a Massive Online Course with Cluster Computing

    ERIC Educational Resources Information Center

    Haas, Timothy C.

    2016-01-01

    Before massive numbers of students can take online courses for college credit, the challenges of providing tutoring support, answers to student-posed questions, and the control of cheating will need to be addressed. These challenges are taken up here by developing an online course delivery system that runs in a cluster computing environment and is…

  6. Experiences of Using Automated Assessment in Computer Science Courses

    ERIC Educational Resources Information Center

    English, John; English, Tammy

    2015-01-01

    In this paper we discuss the use of automated assessment in a variety of computer science courses that have been taught at Israel Academic College by the authors. The course assignments were assessed entirely automatically using Checkpoint, a web-based automated assessment framework. The assignments all used free-text questions (where the students…

  7. Computer Reading Rate Course for College Students?

    ERIC Educational Resources Information Center

    Minery, Bonnie

    A study examined the influence of the computer management feature of a commercially prepared speed reading software package on the reading rate and attitudes of college students towards computers as instructional tools. Subjects, 66 college freshman from lower-middle to middle socio-economic brackets (and divided into control and experimental…

  8. Using Computers in Undergraduate Economics Courses.

    ERIC Educational Resources Information Center

    Barr, Saul Z.; Harmon, Oscar

    Seven computer assignments for undergraduate economics students that concentrate on building a foundation for programming higher level mathematical calculations are described. The purpose of each assignment, the computer program for it, and the correct answers are provided. "Introduction to Text Editing" acquaints the student with some basic…

  9. Software For Monitoring A Computer Network

    NASA Technical Reports Server (NTRS)

    Lee, Young H.

    1992-01-01

    SNMAT is rule-based expert-system computer program designed to assist personnel in monitoring status of computer network and identifying defective computers, workstations, and other components of network. Also assists in training network operators. Network for SNMAT located at Space Flight Operations Center (SFOC) at NASA's Jet Propulsion Laboratory. Intended to serve as data-reduction system providing windows, menus, and graphs, enabling users to focus on relevant information. SNMAT expected to be adaptable to other computer networks; for example in management of repair, maintenance, and security, or in administration of planning systems, billing systems, or archives.

  10. Computational physics in the introductory calculus-based course

    NASA Astrophysics Data System (ADS)

    Chabay, Ruth; Sherwood, Bruce

    2008-04-01

    The integration of computation into the introductory calculus-based physics course can potentially provide significant support for the development of conceptual understanding. Computation can support three-dimensional visualizations of abstract quantities, offer opportunities to construct symbolic rather than numeric solutions to problems, and provide experience with the use of vectors as coordinate-free entities. Computation can also allow students to explore models in a way not possible using the analytical tools available to first-year students. We describe how we have incorporated computer programming into an introductory calculus-based course taken by science and engineering students.
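
    As a hedged example of the kind of program such a course might assign (an assumption about typical exercises, not the authors' actual course material), the following sketch integrates projectile motion with air drag using vectors and small time steps:

```python
# First-year style computational physics: Euler integration of a projectile with
# quadratic drag, treating position and velocity as 3-D vectors.
import numpy as np


def trajectory(v0=np.array([20.0, 0.0, 15.0]), mass=0.1, drag=0.002,
               g=np.array([0.0, 0.0, -9.8]), dt=0.001):
    r = np.zeros(3)
    v = v0.copy()
    points = [r.copy()]
    while r[2] >= 0.0:                                     # until it hits the ground
        force = mass * g - drag * np.linalg.norm(v) * v    # gravity + quadratic drag
        v = v + (force / mass) * dt                        # update velocity
        r = r + v * dt                                     # update position
        points.append(r.copy())
    return np.array(points)


if __name__ == "__main__":
    path = trajectory()
    print(f"range ~ {path[-1, 0]:.2f} m after {len(path)} steps")
```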

  11. Terminal-oriented computer-communication networks.

    NASA Technical Reports Server (NTRS)

    Schwartz, M.; Boorstyn, R. R.; Pickholtz, R. L.

    1972-01-01

    Four examples of currently operating computer-communication networks are described in this tutorial paper. They include the TYMNET network, the GE Information Services network, the NASDAQ over-the-counter stock-quotation system, and the Computer Sciences Infonet. These networks all use programmable concentrators for combining a multiplicity of terminals. Included in the discussion for each network is a description of the overall network structure, the handling and transmission of messages, communication requirements, routing and reliability considerations where applicable, operating data and design specifications where available, and unique design features in the area of computer communications.

  12. Computer Learning Networks and Student Empowerment.

    ERIC Educational Resources Information Center

    Warschauer, Mark; And Others

    1996-01-01

    Examines whether computer networks can empower second-language learners, focusing on three aspects: autonomy, equality, and learning skills. The article concludes that these networks, appropriately used, can empower students, and provides pedagogical suggestions for effective computer networking in the second- and foreign-language classroom. (66…

  13. Proceedings of the computer networking symposium

    SciTech Connect

    Not Available

    1986-01-01

    This book presents the papers given at a conference on array processors and computer networks. Topics considered at the conference included heterogeneous local networks supporting scientific and knowledge processing, dynamic flow control in a computer communication network, broadcast protocols preserving concurrency, distributed systems, algorithms, routing, flow control, and internetworking.

  14. "Horses for Courses": Categories of Computer-Based Learning Program and Their Uses in Pharmacology Courses.

    ERIC Educational Resources Information Center

    Hughes, Ian E.

    1998-01-01

    Describes the pharma-CAL-ogy project, funded by Teaching and Learning Technology Programme (TLTP), which has developed various types of software for use in pharmacology courses. Topics include course organization and delivery software, drill and practice software, tutorial-type programs, simulations, and the need to integrate computer-assisted…

  15. Student Learning Networks on Residential Field Courses: Does Size Matter?

    ERIC Educational Resources Information Center

    Langan, A. Mark; Cullen, W. Rod; Shuker, David M.

    2008-01-01

    This article describes learner and tutor reports of a learning network that formed during the completion of investigative projects on a residential field course. Staff and students recorded project-related interactions, who they were with and how long they lasted over four phases during the field course. An enquiry based learning format challenged…

  16. Code 672 observational science branch computer networks

    NASA Technical Reports Server (NTRS)

    Hancock, D. W.; Shirk, H. G.

    1988-01-01

    In general, networking increases productivity due to the speed of transmission, easy access to remote computers, ability to share files, and increased availability of peripherals. Two different networks within the Observational Science Branch are described in detail.

  17. Time-course of cortical networks involved in working memory

    PubMed Central

    Luu, Phan; Caggiano, Daniel M.; Geyer, Alexandra; Lewis, Jenn; Cohn, Joseph; Tucker, Don M.

    2014-01-01

    Working memory (WM) is one of the most studied cognitive constructs. Although many neuroimaging studies have identified brain networks involved in WM, the time course of these networks remains unclear. In this paper we use dense-array electroencephalography (dEEG) to capture neural signals during performance of a standard WM task, the n-back task, and a blend of principal components analysis and independent components analysis (PCA/ICA) to statistically identify networks of WM and their time courses. Results reveal a visual cortex-centric network, which also includes the posterior cingulate cortex, that is active prior to stimulus onset and that appears to reflect anticipatory, attention-related processes. After stimulus onset, the ventromedial prefrontal cortex, lateral prefrontal cortex, and temporal poles become associated with the prestimulus network. This second network appears to reflect executive control processes. Following activation of the second network, the cortices of the temporo-parietal junction with the temporal lobe structures seen in the first and second networks re-engage. This third network appears to reflect activity of the ventral attention network involved in control of attentional reorientation. The results point to important temporal features of network dynamics that integrate multiple subsystems of the ventral attention network with the default mode network in the performance of working memory tasks. PMID:24523686
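
    A schematic sketch of the PCA/ICA blend described above, applied to synthetic multichannel data rather than the study's dEEG recordings (channel count, sources, and library choices are illustrative assumptions):

```python
# Reduce multichannel signals with PCA, then unmix the retained components with
# ICA to obtain candidate "network" time courses. Synthetic data stands in for EEG.
import numpy as np
from sklearn.decomposition import PCA, FastICA

rng = np.random.default_rng(0)
t = np.linspace(0, 1, 1000)
sources = np.c_[np.sin(2 * np.pi * 7 * t),           # pretend network 1
                np.sign(np.sin(2 * np.pi * 3 * t)),   # pretend network 2
                rng.standard_normal(t.size) * 0.3]    # noise source
mixing = rng.standard_normal((64, 3))                 # 64 "electrodes"
eeg = sources @ mixing.T                              # samples x channels

reduced = PCA(n_components=3, whiten=True).fit_transform(eeg)
time_courses = FastICA(n_components=3, random_state=0).fit_transform(reduced)
print("recovered component time courses:", time_courses.shape)   # (1000, 3)
```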

  18. Integration of Computers into a Course on Biostatistics

    ERIC Educational Resources Information Center

    Gjerde, Craig L.

    1977-01-01

    The biostatistics course for undergraduate medical and dental students at the University of Connecticut Health Center is taught by the Keller Plan, and students can use computers to analyze data sets and to score their unit tests. The computer is an essential tool for data analysis and an attractive option for test scoring. (LBH)

  19. A Computer Course for Business Students: Teacher's Guide.

    ERIC Educational Resources Information Center

    Waterhouse, Ann

    This teacher's guide is for a course designed to teach business students the fundamentals of the BASIC language and computer programming using a series of business-oriented programs. Each lesson contains an introduction, flow charts, and computer programs. The six lesson topics are print-out and format control, count-average, withholding tax…

  20. Filling Structural Holes: Social Networks in the Introductory Course

    ERIC Educational Resources Information Center

    Cook, James M.

    2005-01-01

    Although the literature on social networks has made a considerable contribution to the sociological imagination in recent years, it has been largely ignored in conventional course materials. Such an omission is curious, considering social networks' intuitive imagery, broad theoretical relevance and extensive empirical application. This article…

  1. Computer Literacy in Pennsylvania Community Colleges. Competencies in a Beginning Level College Computer Literacy Course.

    ERIC Educational Resources Information Center

    Tortorelli, Ann Eichorn

    A study was conducted at the 14 community colleges (17 campuses) in Pennsylvania to assess the perceptions of faculty about the relative importance of course content items in a beginning credit course in computer literacy, and to survey courses currently being offered. A detailed questionnaire consisting of 96 questions based on MECC (Minnesota…

  2. Products and Services for Computer Networks.

    ERIC Educational Resources Information Center

    Negroponte, Nicholas P.

    1991-01-01

    Creative applications of computer networks are discussed. Products and services of the future that come from imaginative applications of both channel and computing capacity are described. The topics of entertainment, transactions, and electronic personal surrogates are included. (KR)

  3. Micro Computer Technician Course. Course Design, Course Curricula, Learning Units, Resource Requirements. InfoTVE 14.

    ERIC Educational Resources Information Center

    Royal Melbourne Inst. of Tech. (Australia).

    This guide to the core curricula for the training of microcomputer technicians is designed for school leavers after 10 or more years of general/vocational education with a science and mathematics background. The 2-year course is to be administered in four semesters. An introductory outline of course design and curricula provides the rationale,…

  4. A Computer Security Course in the Undergraduate Computer Science Curriculum.

    ERIC Educational Resources Information Center

    Spillman, Richard

    1992-01-01

    Discusses the importance of computer security and considers criminal, national security, and personal privacy threats posed by security breakdown. Several examples are given, including incidents involving computer viruses. Objectives, content, instructional strategies, resources, and a sample examination for an experimental undergraduate computer…

  5. Integration of Major Computer Program Packages into Experimental Courses: Organic Synthesis Design and the Computer.

    ERIC Educational Resources Information Center

    Sandel, Bonnie Burns; Solomon, Robert W.

    1981-01-01

    Presents discussion on: (1) computer assisted synthesis in industry and academia; (2) computer applications to teaching organic synthesis; (3) a computer program (ORGSYN) incorporating reactions to synthesize aliphatic compounds; and (4) the design of a computer program as a heuristic device in an introductory organic course. (SK)

  6. Integrating ethical topics in a traditional computer science course

    SciTech Connect

    Winrich, L.B.

    1994-12-31

    It is never hard to find additional, often unconventional, topics which seem to beg inclusion in standard courses. A dynamic discipline like computer science usually provides a steady stream of new technical ideas to vie for time and attention with more traditional material. As difficult as it may be to keep standard CS courses up-to-date with technical innovations, it often seems even more difficult to include non-technical topics even when there is universal agreement on their importance. Inevitably, the question arises of whether or not such inclusion will compromise the technical content of the course. This paper describes an attempt to include two such topics in a traditional course in data structures. The two topics are writing and ethics and, although the effort concentrates on the inclusion of ethical questions in a standard CS course, writing is the vehicle for accomplishing this goal. Furthermore, the inclusion of writing in the CS curriculum is certainly recognized as a desirable outcome.

  7. Computing preimages of Boolean networks.

    PubMed

    Klotz, Johannes; Bossert, Martin; Schober, Steffen

    2013-01-01

    In this paper we present an algorithm based on the sum-product algorithm that finds elements in the preimage of a feed-forward Boolean network, given an output of the network. Our probabilistic method runs in linear time with respect to the number of nodes in the network. We evaluate our algorithm on randomly constructed Boolean networks and a regulatory network of Escherichia coli and find that it gives a valid solution in most cases. PMID:24267277
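
    For concreteness, here is a brute-force baseline (exponential in the number of inputs, unlike the linear-time sum-product method the abstract describes) that makes the preimage problem concrete; the tiny example network is hypothetical:

```python
# Enumerate all input vectors of a small feed-forward Boolean network and keep
# those that produce the observed output -- the network's preimage.
from itertools import product


def preimage(network, observed, n_inputs):
    """network: callable mapping a tuple of input bits to a tuple of output bits."""
    return [bits for bits in product((0, 1), repeat=n_inputs)
            if network(bits) == observed]


def tiny_network(bits):
    a, b, c = bits
    return (int(a and not b), b ^ c)     # two Boolean output nodes


if __name__ == "__main__":
    print(preimage(tiny_network, (0, 1), 3))   # inputs mapping to output (0, 1)
```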

  8. Super-speed computer interfaces and networks

    SciTech Connect

    Tolmie, D.E.; St. John, W.; DuBois, D.H.

    1997-10-01

    This is the final report of a one-year, Laboratory Directed Research and Development (LDRD) project at Los Alamos National Laboratory (LANL). Research into super-speed computer interfaces has been directed towards identifying networking requirements from compute-intensive applications that are crucial to DOE programs. In particular, both the DOE Energy Research High Performance Computing Research Centers (HPCRC) and the DOE Defense Programs Accelerated Strategic Computing Initiative (ASCI) have planned applications that will require large increases in network bandwidth. This project was set up to help network researchers identify those networking requirements and to plan the development of such networks. Based on studies, research, and LANL-sponsored workshops, this project helped forge the beginnings of multi-gigabit/sec network research and development that today is being led by Los Alamos in the American National Standards Institute (ANSI) 6.4 gigabit/sec specification called HIPPI-6400.

  9. Neural-Network Computer Transforms Coordinates

    NASA Technical Reports Server (NTRS)

    Josin, Gary M.

    1990-01-01

    Numerical simulation demonstrated ability of conceptual neural-network computer to generalize what it has "learned" from few examples. Ability to generalize achieved with even simple neural network (relatively few neurons) and after exposure of network to only few "training" examples. Ability to obtain fairly accurate mappings after only few training examples used to provide solutions to otherwise intractable mapping problems.
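
    A small sketch of the generalization idea (not the original simulation): train a modest network on sampled coordinate pairs and check how it maps a polar point it has never seen; the network size, sample count, and library are illustrative assumptions.

```python
# Learn the polar-to-Cartesian mapping from examples and test generalization.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
r = rng.uniform(0.1, 1.0, 200)
theta = rng.uniform(0.0, np.pi / 2, 200)
X_train = np.c_[r, theta]                                  # (r, theta) inputs
y_train = np.c_[r * np.cos(theta), r * np.sin(theta)]      # (x, y) targets

net = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=5000, random_state=0)
net.fit(X_train, y_train)

test = np.array([[0.5, np.pi / 4]])                        # an unseen point
print("network:", net.predict(test)[0], " exact:", [0.5 * np.cos(np.pi / 4)] * 2)
```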

  10. Evolving Computer Networks in American Higher Education.

    ERIC Educational Resources Information Center

    McCredie, John W.; Timlake, William P.

    1983-01-01

    Traditions and pressures in the academic environment accounting for early and continuing involvement of higher education with computer networking are described. Several established networks illustrating the wide range of academic applications currently available as well as policy issues of particular significance in academic networks are…

  11. Using E-Mail across Computer Networks.

    ERIC Educational Resources Information Center

    Hazari, Sunil

    1990-01-01

    Discusses the use of telecommunications technology to exchange electronic mail, files, and messages across different computer networks. Networks highlighted include ARPA Internet; BITNET; USENET; FidoNet; MCI Mail; and CompuServe. Examples of the successful use of networks in higher education are given. (Six references) (LRW)

  12. Advanced course for doctors as Departmental IT Network Administrators in anesthesia and intensive care units.

    PubMed

    Lanza, Vincenzo; Huang, Chun-Hsi

    2006-10-01

    The design and administration of a departmental computer network (Local Area Network) in anesthesiology and intensive care offer the opportunity to manage clinical information and control the work-flow. Beyond the basic design, intelligence is necessary to maintain the efficiency of the local network. For this reason the role of a medical administrator of the network is fundamental: he or she is a qualified figure who recognizes the most important characteristics a network must have, knows the users of the system, serves as a valid consultant to the technician who has to build the network, and is able to deal with possible breakdowns. This paper illustrates the structure of a course to train a medical network administrator in anesthesiology and critical care.

  13. Gateways among Academic Computer Networks.

    ERIC Educational Resources Information Center

    McCredie, John W.

    1984-01-01

    Local area networks for intracampus facilities and national inter-campus networks are discussed. Descriptions of some of these networks (ARPAnet, BITNET, CSNET, EDUNET, MAILNET, RLIN, and USENET) are provided that illustrate the wide range of academic applications currently available. (Author/MLW)

  14. Computational capabilities of recurrent NARX neural networks.

    PubMed

    Siegelmann, H T; Horne, B G; Giles, C L

    1997-01-01

    Recently, fully connected recurrent neural networks have been proven to be computationally rich-at least as powerful as Turing machines. This work focuses on another network which is popular in control applications and has been found to be very effective at learning a variety of problems. These networks are based upon Nonlinear AutoRegressive models with eXogenous Inputs (NARX models), and are therefore called NARX networks. As opposed to other recurrent networks, NARX networks have a limited feedback which comes only from the output neuron rather than from hidden states. They are formalized by y(t)=Psi(u(t-n(u)), ..., u(t-1), u(t), y(t-n(y)), ..., y(t-1)) where u(t) and y(t) represent input and output of the network at time t, n(u) and n(y) are the input and output order, and the function Psi is the mapping performed by a Multilayer Perceptron. We constructively prove that the NARX networks with a finite number of parameters are computationally as strong as fully connected recurrent networks and thus Turing machines. We conclude that in theory one can use the NARX models, rather than conventional recurrent networks without any computational loss even though their feedback is limited. Furthermore, these results raise the issue of what amount of feedback or recurrence is necessary for any network to be Turing equivalent and what restrictions on feedback limit computational power. PMID:18255858
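
    The recurrence can be written down directly; the sketch below shows its structure with an untrained two-layer perceptron standing in for Psi (the weights, orders, and sizes are illustrative assumptions, not the paper's construction):

```python
# NARX recurrence: the next output is a feedforward function of the last n_u+1
# inputs and the last n_y outputs -- feedback comes only from the output, not
# from hidden states.
import numpy as np

rng = np.random.default_rng(0)
n_u, n_y = 3, 2                                   # input and output orders
W1 = rng.standard_normal((8, n_u + 1 + n_y))      # hidden-layer weights (random, untrained)
W2 = rng.standard_normal(8)                       # output weights


def psi(window):
    """The MLP Psi mapping the tapped delay lines to the next output."""
    return float(np.tanh(W2 @ np.tanh(W1 @ window)))


def run_narx(u):
    y = [0.0] * n_y                               # initial output history
    for t in range(n_u, len(u)):
        window = np.concatenate([u[t - n_u:t + 1], y[-n_y:]])
        y.append(psi(window))
    return y[n_y:]


if __name__ == "__main__":
    u = rng.standard_normal(20)
    print(np.round(run_narx(u), 3))
```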

  15. Course 10: Three Lectures on Biological Networks

    NASA Astrophysics Data System (ADS)

    Magnasco, M. O.

    1 Enzymatic networks. Proofreading knots: How DNA topoisomerases disentangle DNA
    1.1 Length scales and energy scales
    1.2 DNA topology
    1.3 Topoisomerases
    1.4 Knots and supercoils
    1.5 Topological equilibrium
    1.6 Can topoisomerases recognize topology?
    1.7 Proposal: Kinetic proofreading
    1.8 How to do it twice
    1.9 The care and proofreading of knots
    1.10 Suppression of supercoils
    1.11 Problems and outlook
    1.12 Disquisition
    2 Gene expression networks. Methods for analysis of DNA chip experiments
    2.1 The regulation of gene expression
    2.2 Gene expression arrays
    2.3 Analysis of array data
    2.4 Some simplifying assumptions
    2.5 Probeset analysis
    2.6 Discussion
    3 Neural and gene expression networks: Song-induced gene expression in the canary brain
    3.1 The study of songbirds
    3.2 Canary song
    3.3 ZENK
    3.4 The blush
    3.5 Histological analysis
    3.6 Natural vs. artificial
    3.7 The Blush II: gAP
    3.8 Meditation

  16. Multidimensional neural growing networks and computer intelligence

    SciTech Connect

    Yashchenko, V.A.

    1995-03-01

    This paper examines information-computation processes in time and in space and some aspects of computer intelligence using multidimensional matrix neural growing networks. In particular, issues of object-oriented "thinking" of computers are considered.

  17. Computer-Oriented Calculus Courses Using Finite Differences.

    ERIC Educational Resources Information Center

    Gordon, Sheldon P.

    The so-called discrete approach in calculus instruction involves introducing topics from the calculus of finite differences and finite sums, both for motivation and as useful tools for applications of the calculus. In particular, it provides an ideal setting in which to incorporate computers into calculus courses. This approach has been…

  18. Interpersonal Presence in Computer-Mediated Conferencing Courses.

    ERIC Educational Resources Information Center

    Herod, L.

    Interpersonal presence refers to the cues individuals use to form impressions of one another and form/maintain relationships. The physical cues used to convey interpersonal presence in face-to-face learning environments are absent in text-based computer-mediated conferencing (CMC) courses. Learners' perceptions of interpersonal presence in CMC…

  19. Multimedia Instructional Tools and Student Learning in Computer Applications Courses

    ERIC Educational Resources Information Center

    Chapman, Debra Laier

    2013-01-01

    Advances in technology and changes in educational strategies have resulted in the integration of technology into the classroom. Multimedia instructional tools (MMIT) have been identified as a way to provide student-centered active-learning instructional material to students. MMITs are common in introductory computer applications courses based on…

  20. Computed Tomography-Enhanced Anatomy Course Using Enterprise Visualization

    ERIC Educational Resources Information Center

    May, Hila; Cohen, Haim; Medlej, Bahaa; Kornreich, Liora; Peled, Nathan; Hershkovitz, Israel

    2013-01-01

    Rapid changes in medical knowledge are forcing continuous adaptation of the basic science courses in medical schools. This article discusses a three-year experience developing a new Computed Tomography (CT)-based anatomy curriculum at the Sackler School of Medicine, Tel Aviv University, including describing the motivations and reasoning for the…

  1. A Term Project for a Course on Computer Forensics

    ERIC Educational Resources Information Center

    Harrison, Warren

    2006-01-01

    The typical approach to creating an examination disk for exercises and projects in a course on computer forensics is for the instructor to populate a piece of media with evidence to be retrieved. While such an approach supports the simple use of forensic tools, in many cases the use of an instructor-developed examination disk avoids utilizing some…

  2. The Effect of Computer Literacy Course on Students' Attitudes toward Computer Applications

    ERIC Educational Resources Information Center

    Erlich, Zippy; Gadot, Rivka; Shahak, Daphna

    2009-01-01

    Studies indicate that the use of technologies as teaching aids and tools for self-study is influenced by students' attitudes toward computers and their applications. The purpose of this study is to determine whether taking a Computer Literacy and Applications (CLA) course has an impact on students' attitudes toward computer applications, across…

  3. Queuing theory models for computer networks

    NASA Technical Reports Server (NTRS)

    Galant, David C.

    1989-01-01

    A set of simple queuing theory models which can model the average response of a network of computers to a given traffic load has been implemented using a spreadsheet. The impact of variations in traffic patterns and intensities, channel capacities, and message protocols can be assessed using them because of the lack of fine detail in the network traffic rates, traffic patterns, and the hardware used to implement the networks. A sample use of the models applied to a realistic problem is included in appendix A. Appendix B provides a glossary of terms used in this paper. This Ames Research Center computer communication network is an evolving network of local area networks (LANs) connected via gateways and high-speed backbone communication channels. Intelligent planning of expansion and improvement requires understanding the behavior of the individual LANs as well as the collection of networks as a whole.
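
    As an illustration of this style of spreadsheet model (a standard M/M/1 formula chosen for the example; the memo's actual models are not reproduced here), the mean response time of a channel can be estimated as W = 1 / (mu - lambda):

```python
# Simple queueing estimate of channel response time as offered load approaches capacity.
def mm1_response_time(arrival_rate_msgs_per_s: float,
                      mean_msg_bits: float,
                      channel_bps: float) -> float:
    service_rate = channel_bps / mean_msg_bits              # messages per second (mu)
    if arrival_rate_msgs_per_s >= service_rate:
        raise ValueError("offered load exceeds channel capacity")
    return 1.0 / (service_rate - arrival_rate_msgs_per_s)   # W = 1 / (mu - lambda)


if __name__ == "__main__":
    for load in (10, 50, 90, 99):                            # messages per second
        print(load, "msg/s ->", round(mm1_response_time(load, 8000, 800_000), 4), "s")
```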

  4. Pedagogy in the Computer-Networked Classroom (Computers and Pedagogy).

    ERIC Educational Resources Information Center

    Eldred, Janet M.

    1991-01-01

    Argues that, with careful planning, computer networking can work in the writing classroom by stressing composition as a social collaborative act. Suggests that classroom networking planners must attend to (1) choice of technology; (2) ease of use; (3) participation; and (4) audience awareness. (SG)

  5. Collaborative Teaching and Learning in a Networked Course Setting

    ERIC Educational Resources Information Center

    Kontopoulos, Ourania; Ford, Vivian; Roth, Stacy

    2007-01-01

    We report on a partnership between a librarian and two other community college teachers (a humanist and a social scientist) working to establish "networked courses" that use the model and techniques of collaborative teaching and learning in an interdisciplinary setting. In this partnership--and, in fact, in any interdisciplinary context--the role…

  6. Computer network environment planning and analysis

    NASA Technical Reports Server (NTRS)

    Dalphin, John F.

    1989-01-01

    The GSFC Computer Network Environment provides a broadband RF cable between campus buildings and ethernet spines in buildings for the interlinking of Local Area Networks (LANs). This system provides terminal and computer linkage among host and user systems thereby providing E-mail services, file exchange capability, and certain distributed computing opportunities. The Environment is designed to be transparent and supports multiple protocols. Networking at Goddard has a short history and has been under coordinated control of a Network Steering Committee for slightly more than two years; network growth has been rapid with more than 1500 nodes currently addressed and greater expansion expected. A new RF cable system with a different topology is being installed during summer 1989; consideration of a fiber optics system for the future will begin soon. Summer study was directed toward Network Steering Committee operation and planning plus consideration of Center Network Environment analysis and modeling. Biweekly Steering Committee meetings were attended to learn the background of the network and the concerns of those managing it. Suggestions for historical data gathering have been made to support future planning and modeling. Data Systems Dynamic Simulator, a simulation package developed at NASA and maintained at GSFC, was studied as a possible modeling tool for the network environment. A modeling concept based on a hierarchical model was hypothesized for further development. Such a model would allow input of newly updated parameters and would provide an estimation of the behavior of the network.

  7. A Short Introduction to Computer Networks

    NASA Astrophysics Data System (ADS)

    Fuchs, Ulrich

    Computer Networks have become an essential tool in many aspects: human communication, gathering, exchange and sharing of information, distributed work environments, access to remote resources (data and computing power) and many more. Starting from an historical overview, this paper will give an introduction to the underlying ideas and technologies. The second half will concentrate on the most commonly used network technology today (Ethernet and TCP/IP) and give an introduction to the communication mechanisms used.
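
    A minimal, self-contained illustration of the TCP/IP communication mechanism the paper introduces: a server and a client exchanging one message over a local socket (the loopback address and port are arbitrary choices for the example):

```python
# One TCP round trip over the loopback interface: the server echoes what it receives.
import socket
import threading
import time

HOST, PORT = "127.0.0.1", 50007


def server():
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
        srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        srv.bind((HOST, PORT))
        srv.listen(1)
        conn, _ = srv.accept()
        with conn:
            data = conn.recv(1024)               # TCP delivers a reliable byte stream
            conn.sendall(b"ACK: " + data)


if __name__ == "__main__":
    threading.Thread(target=server, daemon=True).start()
    time.sleep(0.2)                              # give the server time to start listening
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as cli:
        cli.connect((HOST, PORT))
        cli.sendall(b"hello over TCP/IP")
        print(cli.recv(1024).decode())
```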

  8. Nuclear Physics computer networking: Report of the Nuclear Physics Panel on Computer Networking

    SciTech Connect

    Bemis, C.; Erskine, J.; Franey, M.; Greiner, D.; Hoehn, M.; Kaletka, M.; LeVine, M.; Roberson, R. (Duke Univ., Durham, NC)

    1990-05-01

    This paper discusses: the state of computer networking within the nuclear physics program; network requirements for nuclear physics; management structure; and issues of special interest to the nuclear physics program office.

  9. Educator's Guide to Networking: Using Computers.

    ERIC Educational Resources Information Center

    Spurgin, Judy Barrett

    Since electronic networking via microcomputers is fast becoming one of the most popular communications mediums of this decade, this manual is designed to help educators use personal computers as communication devices. The purpose of the document is to acquaint the reader with: (1) the many available communications options on electronic networks;…

  10. Neural Network Computing and Natural Language Processing.

    ERIC Educational Resources Information Center

    Borchardt, Frank

    1988-01-01

    Considers the application of neural network concepts to traditional natural language processing and demonstrates that neural network computing architecture can: (1) learn from actual spoken language; (2) observe rules of pronunciation; and (3) reproduce sounds from the patterns derived by its own processes. (Author/CB)

  11. Computer Networks and African Studies Centers.

    ERIC Educational Resources Information Center

    Kuntz, Patricia S.

    The use of electronic communication in the 12 Title VI African Studies Centers is discussed, and the networks available for their use are reviewed. It is argued that the African Studies Centers should be on the cutting edge of contemporary electronic communication and that computer networks should be a fundamental aspect of their programs. An…

  12. Managing secure computer systems and networks.

    PubMed

    Von Solms, B

    1996-10-01

    No computer system or computer network can today be operated without the necessary security measures to secure and protect the electronic assets stored, processed and transmitted using such systems and networks. Very often the effort involved in managing such security and protection measures is totally underestimated. This paper provides an overview of the security management needed to secure and protect a typical IT system and network. Special reference is made to this management effort in healthcare systems, and the role of the information security officer is also highlighted.

  13. Optimization of an interactive distributive computer network

    NASA Technical Reports Server (NTRS)

    Frederick, V.

    1985-01-01

    The activities under a cooperative agreement for the development of a computer network are briefly summarized. Research activities covered are: computer operating systems optimization and integration; software development and implementation of the IRIS (Infrared Imaging of Shuttle) Experiment; and software design, development, and implementation of the APS (Aerosol Particle System) Experiment.

  14. Computer Networking with the Victorian Correspondence School.

    ERIC Educational Resources Information Center

    Conboy, Ian

    During 1985 the Education Department installed two-way radios in 44 remote secondary schools in Victoria, Australia, to improve turn-around time for correspondence assignments. Subsequently, teacher supervisors at Melbourne's Correspondence School sought ways to further augment audio interactivity with computer networking. Computer equipment was…

  15. Spontaneous ad hoc mobile cloud computing network.

    PubMed

    Lacuesta, Raquel; Lloret, Jaime; Sendra, Sandra; Peñalver, Lourdes

    2014-01-01

    Cloud computing helps users and companies to share computing resources instead of having local servers or personal devices to handle the applications. Smart devices are becoming one of the main information processing devices. Their computing features are reaching levels that let them create a mobile cloud computing network. But sometimes they are not able to create such a network and collaborate actively in the cloud, because it is difficult for them to easily build a spontaneous network and configure its parameters. For this reason, in this paper, we present the design and deployment of a spontaneous ad hoc mobile cloud computing network. To achieve this, we have developed a trusted algorithm that is able to manage the activity of the nodes when they join and leave the network. The paper shows the network procedures and classes that have been designed. Our simulation results using Castalia show that our proposal presents good efficiency and network performance even with a high number of nodes.

  16. Techniques of networking in the computer world.

    PubMed

    Armstrong, M L

    1985-09-01

    Networks can play an important role for nurses in user-to-user communication because they can be used both within and outside the health care delivery system. The choices include an information exchange, which can be an effective strategy for sharing personal concerns, problems, and achievements about the computer; commercial data bases with their vast sources of information and research data; or local area networks, effective in an office or campus setting. All of these networks can put worlds of information and services just a few words or keyboard strokes away, because they offer, outside of your own computer, a whole new dimension of retrieval, storage, reference, and communication capabilities. These networks can significantly enhance computing potential by providing an overall expansion of information.

  17. Role of Computer Assisted Instruction (CAI) in an Introductory Computer Concepts Course.

    ERIC Educational Resources Information Center

    Skudrna, Vincent J.

    1997-01-01

    Discusses the role of computer assisted instruction (CAI) in undergraduate education via a survey of related literature and specific applications. Describes an undergraduate computer concepts course and includes appendices of instructions, flowcharts, programs, sample student work in accounting, COBOL instructional model, decision logic in a…

  18. Engineering Technology Programs Courses Guide for Computer Aided Design and Computer Aided Manufacturing.

    ERIC Educational Resources Information Center

    Georgia Univ., Athens. Div. of Vocational Education.

    This guide describes the requirements for courses in computer-aided design and computer-aided manufacturing (CAD/CAM) that are part of engineering technology programs conducted in vocational-technical schools in Georgia. The guide is organized in five sections. The first section provides a rationale for occupations in design and in production,…

  19. MTX data acquisition and analysis computer network

    SciTech Connect

    Butner, D.N.; Casper, T.A.; Brown, M.D.; Drlik, M.; Meyer, W.H.; Moller, J.M.

    1990-10-01

    For the MTX experiment, we use a network of computers for plasma diagnostic data acquisition and analysis. This multivendor network employs VMS, UNIX, and BASIC based computers connected in a local area Ethernet network. Some of the data is acquired directly into a VAX/VMS computer cluster over a fiber-optic serial CAMAC highway. Several HP-Unix workstations and HP-BASIC instrument control computers acquire and analyze data for the more data intensive or specialized diagnostics. The VAX/VMS system is used for global analysis of the data and serves as the central data archiving and retrieval manager. Shot synchronization and control of data flow are implemented by task-to-task message passing using our interprocess communication system. The system has been in operation during our initial MTX tokamak and FEL experiments; it has operated reliably with data rates typically in the range of 5 Mbytes/shot without limiting the experimental shot rate.

  20. Networking Course Syllabus in Accredited Library and Information Science Programs: A Comparative Analysis Study

    ERIC Educational Resources Information Center

    Abouserie, Hossam Eldin Mohamed Refaat

    2009-01-01

    The study investigated networking courses offered in accredited Library and Information Science schools in the United States in 2009. The study analyzed and compared network syllabi according to Course Syllabus Evaluation Rubric to obtain in-depth understanding of basic features and characteristics of networking courses taught. The study embraced…

  1. Computing with structured connections networks. Technical report

    SciTech Connect

    Feldman, J.A.; Fanty, M.A.; Goddard, N.; Lynne, K.

    1987-04-01

    Rapid advances both in the neurosciences and in computer science are beginning to lead to a new interest in computational models linking animal brains and behavior. In computer science, there is a large and growing body of knowledge about parallel computation and another, largely separate, science of artificial intelligence. The idea of looking directly at massively parallel realizations of intelligent activity promises to be fruitful for the study of both natural and artificial computations. Much attention has been directed towards the biological implications of this interdisciplinary effort, but there are equally important relations with computational theory, hardware and software. This article focuses on the design and use of massively parallel computational models, particularly in artificial intelligence. Much of the recent work on massively parallel computation has been carried out by physicists and examines the emergent behavior of large, unstructured collections of computing units. We are more concerned with how one can design, realize and analyze networks that embody the specific computational structures needed to solve hard problems. Adaptation and learning are treated as ways to improve structured networks, not as a replacement for analysis and design.

  2. Modern control centers and computer networking

    SciTech Connect

    Dy-Liacco, T.E.

    1994-10-01

    The automation of power system operation is generally achieved with the implementation of two control centers, one for the operation of the generation-transmission system and the other for the operation of the distribution system. These control centers are referred to, respectively, as the energy management system (EMS) and the distribution management system (DMS). The EMS may consist of several control centers in a hierarchy. The DMS may be made up of several independent distribution control centers. This article features the fundamental design aspects of modern EMS and DMS control centers (computer networks, distributed processing, and distributed databases), the linking of computer networks, and the communications that support such internetworking. The extension of such networking beyond the confines of system operation to other corporate networks is now made practical by the maturing concepts of client-server architectures and by the availability of modern communication technologies.

  3. Profiles of Motivated Self-Regulation in College Computer Science Courses: Differences in Major versus Required Non-Major Courses

    ERIC Educational Resources Information Center

    Shell, Duane F.; Soh, Leen-Kiat

    2013-01-01

    The goal of the present study was to utilize a profiling approach to understand differences in motivation and strategic self-regulation among post-secondary STEM students in major versus required non-major computer science courses. Participants were 233 students from required introductory computer science courses (194 men; 35 women; 4 unknown) at…

  4. Meteorological Monitoring And Warning Computer Network

    NASA Technical Reports Server (NTRS)

    Evans, Randolph J.; Dianic, Allan V.; Moore, Lien N.

    1996-01-01

    Meteorological monitoring system (MMS) computer network tracks weather conditions and issues warnings when weather hazards are about to occur. Receives data from such meteorological instruments as wind sensors on towers and lightning detectors, and compares data with weather restrictions specified for outdoor activities. If weather violates restriction, network generates audible and visible alarms to alert people involved in activity. Also displays weather and toxic diffusion data and disseminates weather forecasts, advisories, and warnings to workstations.
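
    A schematic version of the restriction check described above (hypothetical thresholds and variable names, not the MMS configuration):

```python
# Compare current readings against an activity's weather restrictions and raise
# an alarm when any limit is exceeded (greater-is-worse quantities only).
def check_restrictions(readings: dict, limits: dict) -> list:
    """Return the list of violated restrictions."""
    return [name for name, limit in limits.items()
            if readings.get(name, 0.0) > limit]


if __name__ == "__main__":
    readings = {"wind_knots": 27.0, "field_mill_kV_per_m": 3.0}   # hypothetical sensors
    limits = {"wind_knots": 25.0, "field_mill_kV_per_m": 5.0}     # hypothetical limits
    violations = check_restrictions(readings, limits)
    if violations:
        print("ALARM:", ", ".join(violations))
```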

  5. Position paper on active countermeasures for computer networks.

    SciTech Connect

    Van Randwyk, Jamie A.

    2003-07-01

    Computer security professionals have used passive network countermeasures for several years in order to secure computer networks. Passive countermeasures such as firewalls and intrusion detection systems are effective but their use alone is not enough to protect a network. Active countermeasures offer new ways of protecting a computer network. Corporations and government entities should adopt active network countermeasures as a means of protecting their computer networks.

  6. Student Perceived Importance and Correlations of Selected Computer Literacy Course Topics

    ERIC Educational Resources Information Center

    Ciampa, Mark

    2013-01-01

    Traditional college-level courses designed to teach computer literacy are in a state of flux. Today's students have high rates of access to computing technology and computer ownership, leading many policy decision makers to conclude that students already are computer literate and thus computer literacy courses are dinosaurs in a modern digital…

  7. Anonymous Transactions in Computer Networks

    NASA Astrophysics Data System (ADS)

    Dolev, Shlomi; Kopeetsky, Marina

    We present schemes for anonymous transactions that preserve privacy and anonymity while providing anonymous user authentication in distributed networks such as the Internet. We first present a practical scheme for anonymous transactions in which transaction resolution is assisted by a Trusted Authority. This practical scheme is extended to a theoretical scheme in which a Trusted Authority is not involved in the transaction resolution. Given an authority that generates for each player hard-to-produce evidence EVID (e.g., a problem instance with or without a solution), the identity of a user U is defined by the ability to prove possession of said evidence. We use Zero-Knowledge proof techniques to repeatedly identify U by providing a proof that U has evidence EVID, without revealing EVID, thereby avoiding identity theft.
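
    To illustrate the zero-knowledge idea of proving possession without revealing the evidence, here is a toy Schnorr-style identification round; it is a standard textbook construction shown only for intuition, not the authors' scheme, and the group parameters are far too small for real use:

```python
# One commit-challenge-response round: the prover convinces the verifier that it
# knows `secret` (the "evidence") without revealing it.
import secrets

P = 2 ** 127 - 1          # prime modulus (illustrative only, not cryptographically sized)
G = 3                     # generator choice assumed for the illustration

secret = secrets.randbelow(P - 1)          # the user's hard-to-produce evidence
public = pow(G, secret, P)                 # published identity

r = secrets.randbelow(P - 1)
commitment = pow(G, r, P)                  # prover commits
challenge = secrets.randbelow(P - 1)       # would come from the verifier
response = (r + challenge * secret) % (P - 1)

# Verifier checks g^response == commitment * public^challenge (mod P).
assert pow(G, response, P) == (commitment * pow(public, challenge, P)) % P
print("verifier accepts: prover knows the secret without revealing it")
```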

  8. Ideas in Practice (1): Two Undergraduate Courses in Computer-Aided Engineering.

    ERIC Educational Resources Information Center

    Fenves, S. J.; And Others

    1988-01-01

    Introduces two courses for developing relevant computer use in a civil engineering department. The sophomore course introduces students to a set of problem-formulation methods and computer-based problem-solving tools. The senior course introduces students to representative, professional quality computer-aided engineering tools. Discusses impact of…

  9. On computer vision in wireless sensor networks.

    SciTech Connect

    Berry, Nina M.; Ko, Teresa H.

    2004-09-01

    Wireless sensor networks allow detailed sensing of otherwise unknown and inaccessible environments. While it would be beneficial to include cameras in a wireless sensor network because images are so rich in information, the power cost of transmitting an image across the wireless network can dramatically shorten the lifespan of the sensor nodes. This paper describes a new paradigm for the incorporation of imaging into wireless networks. Rather than focusing on transmitting images across the network, we show how an image can be processed locally for key features using simple detectors. Contrasted with traditional event detection systems that trigger an image capture, this enables a new class of sensors that uses a low-power imaging sensor to detect a variety of visual cues. Sharing these features among relevant nodes cues specific actions to better provide information about the environment. We report on various existing techniques developed for traditional computer vision research that can aid in this work.
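
    A simple sketch of the paradigm (an illustrative frame-differencing detector in numpy, not the authors' system): the node processes the image locally and shares only a few bytes of features instead of the whole frame.

```python
# Cheap local change detection: return a compact feature record (centroid and
# size of the changed region) or None, instead of transmitting the image itself.
import numpy as np


def detect_change(prev_frame: np.ndarray, frame: np.ndarray, threshold=25, min_pixels=50):
    changed = np.abs(frame.astype(int) - prev_frame.astype(int)) > threshold
    if changed.sum() < min_pixels:
        return None
    ys, xs = np.nonzero(changed)
    return {"centroid": (float(xs.mean()), float(ys.mean())), "pixels": int(changed.sum())}


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    prev = rng.integers(0, 60, (120, 160), dtype=np.uint8)
    cur = prev.copy()
    cur[40:60, 70:100] = 255                      # a bright object appears
    print(detect_change(prev, cur))               # a few bytes instead of a ~19.2 kB image
```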

  10. Computational Statistical Methods for Social Network Models

    PubMed Central

    Hunter, David R.; Krivitsky, Pavel N.; Schweinberger, Michael

    2013-01-01

    We review the broad range of recent statistical work in social network models, with emphasis on computational aspects of these methods. Particular focus is applied to exponential-family random graph models (ERGM) and latent variable models for data on complete networks observed at a single time point, though we also briefly review many methods for incompletely observed networks and networks observed at multiple time points. Although we mention far more modeling techniques than we can possibly cover in depth, we provide numerous citations to current literature. We illustrate several of the methods on a small, well-known network dataset, Sampson’s monks, providing code where possible so that these analyses may be duplicated. PMID:23828720
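
    As a small illustration of the kind of sufficient statistics an exponential-family random graph model (ERGM) weighs (using a stock networkx graph rather than the article's Sampson's monks data):

```python
# Count configurations such as edges and triangles in an observed network; an
# ERGM places probability proportional to exp(theta_1*edges + theta_2*triangles + ...)
# and fitting estimates the theta coefficients from statistics like these.
import networkx as nx

G = nx.karate_club_graph()                       # a stand-in observed network
edges = G.number_of_edges()
triangles = sum(nx.triangles(G).values()) // 3   # each triangle is counted at 3 nodes
print(f"edges = {edges}, triangles = {triangles}")
```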

  11. Computational capabilities of random automata networks for reservoir computing.

    PubMed

    Snyder, David; Goudarzi, Alireza; Teuscher, Christof

    2013-04-01

    This paper underscores the conjecture that intrinsic computation is maximal in systems at the "edge of chaos". We study the relationship between dynamics and computational capability in random Boolean networks (RBN) for reservoir computing (RC). RC is a computational paradigm in which a trained readout layer interprets the dynamics of an excitable component (called the reservoir) that is perturbed by external input. The reservoir is often implemented as a homogeneous recurrent neural network, but there has been little investigation into the properties of reservoirs that are discrete and heterogeneous. Random Boolean networks are generic and heterogeneous dynamical systems and here we use them as the reservoir. A RBN is typically a closed system; to use it as a reservoir we extend it with an input layer. As a consequence of perturbation, the RBN does not necessarily fall into an attractor. Computational capability in RC arises from a tradeoff between separability and fading memory of inputs. We find the balance of these properties predictive of classification power and optimal at critical connectivity. These results are relevant to the construction of devices which exploit the intrinsic dynamics of complex heterogeneous systems, such as biomolecular substrates.
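
    A compact sketch of a random Boolean network used as a reservoir, as described above (the sizes, wiring, and input stream are illustrative assumptions; the trained readout layer is omitted):

```python
# N nodes with K random inputs each and random truth tables, perturbed every
# step by an external input bit; a readout layer would interpret the trajectory.
import random

random.seed(0)
N, K = 20, 2
inputs = [[random.randrange(N) for _ in range(K)] for _ in range(N)]          # wiring
tables = [[random.randrange(2) for _ in range(2 ** (K + 1))] for _ in range(N)]
state = [random.randrange(2) for _ in range(N)]


def step(state, u):
    """Synchronous update; every node also sees the external input bit u."""
    new = []
    for node in range(N):
        bits = [state[j] for j in inputs[node]] + [u]
        index = int("".join(map(str, bits)), 2)
        new.append(tables[node][index])
    return new


trajectory = []
for u in [1, 0, 0, 1, 1, 0, 1, 0]:          # example input stream
    state = step(state, u)
    trajectory.append(state)
print(*["".join(map(str, s)) for s in trajectory], sep="\n")
```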

  12. MED4/345: Computer-Administered Formative Quizzes in a Basic Science Course

    PubMed Central

    Ogilvie, R

    1999-01-01

    Introduction: Computer-administered quizzes were introduced into a Cell Biology and Histology course to provide students a means to assess their progress in the course and faculty the opportunity to monitor students' mastery of the course content.
    Methods: The computer quizzes, including graphics, were presented on-line using LXR software (www.lxrtest.com) for specific time periods (7 - 14 days) during the course. The aim of this effort was to provide students formative assessment and assistance with pacing their study of course materials. Each computer quiz consisted of 20 - 30 questions with images. Extra credit was earned for each quiz if 70% of the items were answered correctly. The quizzes were served over the campus network to as many as 70 computer workstations, distributed to various locations in our department and the library. Each quiz was accessible once, by a unique user name and password for each student and a time limit set, allowing up to 90 seconds for each question. Feedback was given to the student for each question; the correct answer and a formative instructional statement intended to reinforce the fact or concept being evaluated. Global feedback on each quiz was provided for the entire class. This feedback was delivered on-line, on the course's web site.
    Results: We have found that computer-administered course examinations are an efficient and acceptable means of assessing students' learning formatively. They permit a quick examination of the students' mastery of the course content. Such an approach allows for appropriate feedback to be provided in a timely manner and, if needed, instruction could be modified. No serious problems were encountered during the three years we have administered over 4,000 individual quizzes on-line. Greater than 90% of the students elected to participate in this optional activity with more than 85% receiving extra credit for their overall course grade. The computer quizzes were accepted by the students as a useful

  13. Effects of Computer Course on Computer Self-Efficacy, Computer Attitudes and Achievements of Young Individuals in Siirt, Turkey

    ERIC Educational Resources Information Center

    Çelik, Halil Coskun

    2015-01-01

    The purpose of this study is to investigate the effects of computer courses on young individuals' computer self-efficacy, attitudes and achievement. The study group of this research included 60 unemployed young individuals (18-25 ages) in total; 30 in the experimental group and 30 in the control group. An experimental research model with…

  14. Advanced networks and computing in healthcare

    PubMed Central

    Ackerman, Michael

    2011-01-01

    As computing and network capabilities continue to rise, it becomes increasingly important to understand the varied applications for using them to provide healthcare. The objective of this review is to identify key characteristics and attributes of healthcare applications involving the use of advanced computing and communication technologies, drawing upon 45 research and development projects in telemedicine and other aspects of healthcare funded by the National Library of Medicine over the past 12 years. Only projects publishing in the professional literature were included in the review. Four projects did not publish beyond their final reports. In addition, the authors drew on their first-hand experience as project officers, reviewers and monitors of the work. Major themes in the corpus of work were identified, characterizing key attributes of advanced computing and network applications in healthcare. Advanced computing and network applications are relevant to a range of healthcare settings and specialties, but they are most appropriate for solving a narrower range of problems in each. Healthcare projects undertaken primarily to explore potential have also demonstrated effectiveness and depend on the quality of network service as much as bandwidth. Many applications are enabling, making it possible to provide service or conduct research that previously was not possible or to achieve outcomes in addition to those for which projects were undertaken. Most notable are advances in imaging and visualization, collaboration and sense of presence, and mobility in communication and information-resource use. PMID:21486877

  15. Reducing the diameters of computer networks

    NASA Technical Reports Server (NTRS)

    Bokhari, S. H.; Raza, A. D.

    1986-01-01

    Three methods of reducing the diameters of computer networks by adding processor-to-processor links, under the constraint that no more than one I/O port be added to each processor, are discussed. This is equivalent to adding edges to a given graph under the constraint that the degree of any node be increased by at most one.
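
    As a rough illustration of the constraint described above (each processor gains at most one extra I/O port), the Python sketch below greedily adds chord links between peripheral nodes of a graph so that no node's degree grows by more than one, then reports the change in diameter. It uses networkx and an invented ring topology; it is a minimal sketch, not one of the report's three methods.

      # Sketch: reduce a graph's diameter by adding a matching of extra edges,
      # so that no node's degree increases by more than one (one extra I/O port each).
      import networkx as nx

      def add_one_port_links(G, num_links):
          H = G.copy()
          used = set()                      # nodes that have already spent their one extra port
          # consider the most peripheral nodes first (largest eccentricity)
          periphery = sorted(H.nodes, key=lambda v: nx.eccentricity(H, v), reverse=True)
          added = 0
          for i, u in enumerate(periphery):
              if added >= num_links or u in used:
                  continue
              for v in periphery[i + 1:]:
                  if v not in used and not H.has_edge(u, v):
                      H.add_edge(u, v)
                      used.update({u, v})   # each endpoint's degree grows by exactly one
                      added += 1
                      break
          return H

      G = nx.cycle_graph(12)                # ring of 12 processors, diameter 6
      H = add_one_port_links(G, 3)
      print(nx.diameter(G), "->", nx.diameter(H))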

  16. Manual for Museum Computer Network Data Preparation.

    ERIC Educational Resources Information Center

    Vance, David

    This manual describes information processing procedures for Museum Computer Network (MCN) systems. The first section contains general rules for preparation of input: conventions for all data; conventions for controlling the appearance of output; conventions for automatic sorting of data; conventions for user specified sorting of data; and…

  17. Multiple network alignment on quantum computers

    NASA Astrophysics Data System (ADS)

    Daskin, Anmer; Grama, Ananth; Kais, Sabre

    2014-12-01

    Comparative analyses of graph-structured datasets underlie diverse problems. Examples of these problems include identification of conserved functional components (biochemical interactions) across species, structural similarity of large biomolecules, and recurring patterns of interactions in social networks. A large class of such analysis methods quantifies the topological similarity of nodes across networks. The resulting correspondence of nodes across networks, also called node alignment, can be used to identify invariant subgraphs across the input graphs. Given k graphs as input, alignment algorithms use topological information to assign a similarity score to each k-tuple of nodes, with one element (node) drawn from each of the input graphs. Nodes are considered similar if their neighbors are also similar. An alternate, equivalent view of these network alignment algorithms is to consider the Kronecker product of the input graphs and to identify high-ranked nodes in the Kronecker product graph. Conventional methods such as PageRank and HITS (Hypertext-Induced Topic Selection) can be used for this purpose. These methods typically require computation of the principal eigenvector of a suitably modified Kronecker product matrix of the input graphs. We adopt this alternate view of the problem to address the problem of multiple network alignment. Using the phase estimation algorithm, we show that the multiple network alignment problem can be efficiently solved on quantum computers. We characterize the accuracy and performance of our method and show that it can deliver exponential speedups over conventional (non-quantum) methods.
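
    The conventional (non-quantum) formulation mentioned in the abstract can be sketched directly: run a PageRank-style iteration on the Kronecker product of two adjacency matrices and read the stationary probabilities as pairwise node-alignment scores. The small graphs, damping factor and column normalization below are illustrative assumptions, not the paper's exact construction.

      # Sketch: pairwise alignment scores via PageRank on the Kronecker product graph.
      import numpy as np

      def kron_pagerank(A, B, d=0.85, iters=100):
          """PageRank on kron(A, B); entry (i*nB + j) scores the node pair (i, j)."""
          K = np.kron(A, B)
          n = K.shape[0]
          col_sums = K.sum(axis=0)
          col_sums[col_sums == 0] = 1.0          # avoid division by zero for isolated pairs
          P = K / col_sums                        # column-stochastic transition matrix
          x = np.full(n, 1.0 / n)
          for _ in range(iters):
              x = d * (P @ x) + (1 - d) / n       # damped power iteration
          return x.reshape(A.shape[0], B.shape[0])

      # Two small graphs as adjacency matrices: a triangle and a 3-node path.
      A = np.array([[0, 1, 1], [1, 0, 1], [1, 1, 0]], float)
      B = np.array([[0, 1, 0], [1, 0, 1], [0, 1, 0]], float)
      scores = kron_pagerank(A, B)
      print(scores.round(3))   # scores[i, j]: alignment strength of node i in A with node j in B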

  18. Pedagogical Utilization and Assessment of the Statistic Online Computational Resource in Introductory Probability and Statistics Courses.

    PubMed

    Dinov, Ivo D; Sanchez, Juana; Christou, Nicolas

    2008-01-01

    Technology-based instruction represents a recent pedagogical paradigm that is rooted in the realization that new generations are much more comfortable with, and excited about, new technologies. The rapid technological advancement over the past decade has fueled an enormous demand for the integration of modern networking, informational and computational tools with classical pedagogical instruments. Consequently, teaching with technology typically involves utilizing a variety of IT and multimedia resources for online learning, course management, electronic course materials, and novel tools of communication, engagement, experimentation, critical thinking and assessment. The NSF-funded Statistics Online Computational Resource (SOCR) provides a number of interactive tools for enhancing instruction in various undergraduate and graduate courses in probability and statistics. These resources include online instructional materials, statistical calculators, interactive graphical user interfaces, computational and simulation applets, and tools for data analysis and visualization. The tools provided as part of SOCR include conceptual simulations and statistical computing interfaces, which are designed to bridge between the introductory and the more advanced computational and applied probability and statistics courses. In this manuscript, we describe our designs for utilizing SOCR technology in instruction in a recent study. In addition, we present results on the effectiveness of using SOCR tools at two different course intensity levels on three outcome measures: exam scores, student satisfaction and choice of technology to complete assignments. Learning styles assessment was completed at baseline. We have used three very different designs for three different undergraduate classes. Each course included a treatment group, using the SOCR resources, and a control group, using classical instruction techniques. Our findings include marginal effects of the SOCR treatment per individual…

  19. Adaptive computation algorithm for RBF neural network.

    PubMed

    Han, Hong-Gui; Qiao, Jun-Fei

    2012-02-01

    A novel learning algorithm is proposed for nonlinear modelling and identification using radial basis function neural networks. The proposed method simplifies neural network training through the use of an adaptive computation algorithm (ACA). In addition, the convergence of the ACA is analyzed by the Lyapunov criterion. The proposed algorithm offers two important advantages. First, the model performance can be significantly improved through the ACA, and the modelling error is uniformly ultimately bounded. Second, the proposed ACA can reduce computational cost and accelerate the training speed. The proposed method is then employed to model a classical nonlinear system with a limit cycle and to identify a nonlinear dynamic system, exhibiting the effectiveness of the proposed algorithm. Computational complexity analysis and simulation results demonstrate its effectiveness.
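
    For readers unfamiliar with the model class, the sketch below fits a minimal Gaussian radial basis function network to a noisy nonlinear target, using ordinary least squares for the output weights. It only illustrates what an RBF network is; the centres, width and data are invented, and this is not the paper's adaptive computation algorithm.

      # Minimal Gaussian RBF network for nonlinear modelling (illustrative only).
      import numpy as np

      def rbf_design(X, centers, width):
          """Each column is one Gaussian basis function evaluated at the inputs."""
          d2 = (X[:, None] - centers[None, :]) ** 2
          return np.exp(-d2 / (2.0 * width ** 2))

      rng = np.random.default_rng(0)
      X = np.linspace(-3, 3, 200)
      y = np.sin(X) + 0.1 * rng.standard_normal(X.size)   # noisy nonlinear target

      centers = np.linspace(-3, 3, 12)                     # fixed centres for the sketch
      Phi = rbf_design(X, centers, width=0.6)
      w, *_ = np.linalg.lstsq(Phi, y, rcond=None)          # output-layer weights

      y_hat = Phi @ w
      print("training RMSE:", np.sqrt(np.mean((y - y_hat) ** 2)))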

  20. Piping network model program for small computers

    SciTech Connect

    Kruckenberg, N.E.

    1986-07-01

    A model of fluid piping networks was developed to aid in solving problems in the recirculating water coolant system at the Portsmouth Gaseous Diffusion Plant. The piping network model can be used to solve steady state problems in which water flow rates and temperatures are to be determined, or in which temperature is an important factor in determining pressure losses. The model can be implemented on desktop computers to perform these calculations as needed to track changing process conditions. The report includes a description of the coolant system, the mathematical development of the computer model, a case study utilizing the model, and a listing and sample run of the computer codes. 2 figs., 1 tab.

  1. Non-harmful insertion of data mimicking computer network attacks

    DOEpatents

    Neil, Joshua Charles; Kent, Alexander; Hash, Jr, Curtis Lee

    2016-06-21

    Non-harmful data mimicking computer network attacks may be inserted in a computer network. Anomalous real network connections may be generated between a plurality of computing systems in the network. Data mimicking an attack may also be generated. The generated data may be transmitted between the plurality of computing systems using the real network connections and measured to determine whether an attack is detected.

  2. Accelerating commutation circuits in quantum computer networks

    NASA Astrophysics Data System (ADS)

    Jiang, Min; Huang, Xu; Chen, Xiaoping; Zhang, Zeng-ke

    2012-12-01

    In a high speed and packet-switched quantum computer network, a packet routing delay often leads to traffic jams, becoming a severe bottleneck for speeding up the transmission rate. Based on the delayed commutation circuit proposed in Phys. Rev. Lett. 97, 110502 (2006), we present an improved scheme for accelerating network transmission. For two more realistic scenarios, we utilize the characteristic of a quantum state to simultaneously implement a data switch and transmission that makes it possible to reduce the packet delay and route a qubit packet even before its address is determined. This circuit is further extended to the quantum network for the transmission of the unknown quantum information. The analysis demonstrates that quantum communication technology can considerably reduce the processing delay time and build faster and more efficient packet-switched networks.

  3. Advanced Scientific Computing Research Network Requirements

    SciTech Connect

    Bacon, Charles; Bell, Greg; Canon, Shane; Dart, Eli; Dattoria, Vince; Goodwin, Dave; Lee, Jason; Hicks, Susan; Holohan, Ed; Klasky, Scott; Lauzon, Carolyn; Rogers, Jim; Shipman, Galen; Skinner, David; Tierney, Brian

    2013-03-08

    The Energy Sciences Network (ESnet) is the primary provider of network connectivity for the U.S. Department of Energy (DOE) Office of Science (SC), the single largest supporter of basic research in the physical sciences in the United States. In support of SC programs, ESnet regularly updates and refreshes its understanding of the networking requirements of the instruments, facilities, scientists, and science programs that it serves. This focus has helped ESnet to be a highly successful enabler of scientific discovery for over 25 years. In October 2012, ESnet and the Office of Advanced Scientific Computing Research (ASCR) of the DOE SC organized a review to characterize the networking requirements of the programs funded by the ASCR program office. The requirements identified at the review are summarized in the Findings section, and are described in more detail in the body of the report.

  4. Counseling Student Computer Competency Skills: Effects of Technology Course in Training.

    ERIC Educational Resources Information Center

    Edwards, Yolanda V.; Portman, Tarrell Awe Agahe; Bethea, James

    2002-01-01

    The focus of this article is to assess counseling student computer competency level as an effect of a one-credit hour introductory course in computer technology. Results indicate student computer competencies increased after completing the computer technology course in the following areas: ethics, assisting clients with internet searches,…

  5. DTW-MIC Coexpression Networks from Time-Course Data

    PubMed Central

    Riccadonna, Samantha; Jurman, Giuseppe; Visintainer, Roberto; Filosi, Michele; Furlanello, Cesare

    2016-01-01

    When modeling coexpression networks from high-throughput time course data, Pearson Correlation Coefficient (PCC) is one of the most effective and popular similarity functions. However, its reliability is limited since it cannot capture non-linear interactions and time shifts. Here we propose to overcome these two issues by employing a novel similarity function, Dynamic Time Warping Maximal Information Coefficient (DTW-MIC), combining a measure taking care of functional interactions of signals (MIC) and a measure identifying time lag (DTW). By using the Hamming-Ipsen-Mikhailov (HIM) metric to quantify network differences, the effectiveness of the DTW-MIC approach is demonstrated on a set of four synthetic and one transcriptomic datasets, also in comparison to TimeDelay ARACNE and Transfer Entropy. PMID:27031641
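
    The dynamic-time-warping half of the DTW-MIC idea can be written as a short dynamic program. In the sketch below, absolute Pearson correlation stands in for MIC (which would need a separate estimator such as the minepy package), and the way the two terms are combined is a toy choice, not the paper's definition.

      # DTW distance between two expression time courses, plus a toy similarity
      # that mixes time-shift tolerance (DTW) with dependence (|Pearson| as a MIC stand-in).
      import numpy as np

      def dtw_distance(a, b):
          n, m = len(a), len(b)
          D = np.full((n + 1, m + 1), np.inf)
          D[0, 0] = 0.0
          for i in range(1, n + 1):
              for j in range(1, m + 1):
                  cost = abs(a[i - 1] - b[j - 1])
                  D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
          return D[n, m]

      def similarity(a, b):
          dtw_term = 1.0 / (1.0 + dtw_distance(a, b))        # high when warped profiles match
          corr_term = abs(np.corrcoef(a, b)[0, 1])           # crude dependence measure
          return 0.5 * (dtw_term + corr_term)

      t = np.linspace(0, 2 * np.pi, 40)
      gene1 = np.sin(t)
      gene2 = np.sin(t - 0.8)                                # same signal, shifted in time
      print(round(similarity(gene1, gene2), 3))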

  6. Neural network computer simulation of medical aerosols.

    PubMed

    Richardson, C J; Barlow, D J

    1996-06-01

    Preliminary investigations have been conducted to assess the potential for using artificial neural networks to simulate aerosol behaviour, with a view to employing this type of methodology in the evaluation and design of pulmonary drug-delivery systems. Details are presented of the general purpose software developed for these tasks; it implements a feed-forward back-propagation algorithm with weight decay and connection pruning, the user having complete run-time control of the network architecture and mode of training. A series of exploratory investigations is then reported in which different network structures and training strategies are assessed in terms of their ability to simulate known patterns of fluid flow in simple model systems. The first of these involves simulations of cellular automata-generated data for fluid flow through a partially obstructed two-dimensional pipe. The artificial neural networks are shown to be highly successful in simulating the behaviour of this simple linear system, but with important provisos relating to the information content of the training data and the criteria used to judge when the network is properly trained. A second set of investigations is then reported in which similar networks are used to simulate patterns of fluid flow through aerosol generation devices, using training data furnished through rigorous computational fluid dynamics modelling. These more complex three-dimensional systems are modelled with equal success. It is concluded that carefully tailored, well trained networks could provide valuable tools not just for predicting but also for analysing the spatial dynamics of pharmaceutical aerosols.
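
    The core machinery described above, a feed-forward network trained by back-propagation with weight decay, can be sketched in plain NumPy. The one-dimensional target, layer sizes, learning rate and decay factor below are invented for illustration and bear no relation to the study's aerosol data.

      # Tiny feed-forward regressor trained by back-propagation with L2 weight decay.
      import numpy as np

      rng = np.random.default_rng(1)
      X = np.linspace(0, 1, 100)[:, None]
      y = X * (1 - X) * 4                        # parabolic "velocity profile" toy target

      n_hidden, lr, decay = 8, 0.3, 1e-4
      W1 = rng.normal(0, 1, (1, n_hidden)); b1 = np.zeros(n_hidden)
      W2 = rng.normal(0, 1, (n_hidden, 1)); b2 = np.zeros(1)

      for epoch in range(5000):
          h = np.tanh(X @ W1 + b1)               # forward pass
          out = h @ W2 + b2
          err = out - y
          # backward pass with weight-decay terms added to the gradients
          dW2 = h.T @ err / len(X) + decay * W2
          db2 = err.mean(axis=0)
          dh = (err @ W2.T) * (1 - h ** 2)
          dW1 = X.T @ dh / len(X) + decay * W1
          db1 = dh.mean(axis=0)
          W1 -= lr * dW1; b1 -= lr * db1
          W2 -= lr * dW2; b2 -= lr * db2

      print("final MSE:", float(np.mean((out - y) ** 2)))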

  7. A Textbook for a First Course in Computational Fluid Dynamics

    NASA Technical Reports Server (NTRS)

    Zingg, D. W.; Pulliam, T. H.; Nixon, David (Technical Monitor)

    1999-01-01

    This paper describes and discusses the textbook, Fundamentals of Computational Fluid Dynamics by Lomax, Pulliam, and Zingg, which is intended for a graduate level first course in computational fluid dynamics. This textbook emphasizes fundamental concepts in developing, analyzing, and understanding numerical methods for the partial differential equations governing the physics of fluid flow. Its underlying philosophy is that the theory of linear algebra and the attendant eigenanalysis of linear systems provides a mathematical framework to describe and unify most numerical methods in common use in the field of fluid dynamics. Two linear model equations, the linear convection and diffusion equations, are used to illustrate concepts throughout. Emphasis is on the semi-discrete approach, in which the governing partial differential equations (PDE's) are reduced to systems of ordinary differential equations (ODE's) through a discretization of the spatial derivatives. The ordinary differential equations are then reduced to ordinary difference equations (O(Delta)E's) using a time-marching method. This methodology, using the progression from PDE through ODE's to O(Delta)E's, together with the use of the eigensystems of tridiagonal matrices and the theory of O(Delta)E's, gives the book its distinctiveness and provides a sound basis for a deep understanding of fundamental concepts in computational fluid dynamics.
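
    The semi-discrete approach the book emphasizes can be illustrated on its own model problem, the linear convection equation u_t + a u_x = 0: central differences in space reduce the PDE to a system of ODEs, which a classical fourth-order Runge-Kutta time march then advances. The grid size, CFL number and initial pulse below are arbitrary choices for the sketch.

      # Semi-discrete solution of u_t + a u_x = 0 on a periodic domain.
      import numpy as np

      a, nx, L = 1.0, 100, 1.0
      dx = L / nx
      x = np.arange(nx) * dx
      u = np.exp(-100 * (x - 0.5) ** 2)          # initial Gaussian pulse

      def rhs(u):
          # periodic second-order central difference: du/dx ~ (u[i+1] - u[i-1]) / (2 dx)
          return -a * (np.roll(u, -1) - np.roll(u, 1)) / (2 * dx)

      dt, nsteps = 0.4 * dx / a, 250             # CFL-limited step; 250 steps = one period
      for _ in range(nsteps):
          k1 = rhs(u)
          k2 = rhs(u + 0.5 * dt * k1)
          k3 = rhs(u + 0.5 * dt * k2)
          k4 = rhs(u + dt * k3)
          u = u + dt / 6 * (k1 + 2 * k2 + 2 * k3 + k4)

      print("pulse peak after one period:", round(float(u.max()), 3))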

  8. The Introductory Computer Course in the Accounting Curriculum: Objectives and Performance.

    ERIC Educational Resources Information Center

    Steedle, Lamont F.; Sinclair, Kenneth P.

    1984-01-01

    This study identified computer objectives based on recommendations of the authoritative accounting bodies, determined whether the typical introductory computer course has these same objectives, and examined the influence of the academic department responsible for teaching the course. Relationships between department and course objectives,…

  9. Fuzzy logic, neural networks, and soft computing

    NASA Technical Reports Server (NTRS)

    Zadeh, Lofti A.

    1994-01-01

    The past few years have witnessed a rapid growth of interest in a cluster of modes of modeling and computation which may be described collectively as soft computing. The distinguishing characteristic of soft computing is that its primary aims are to achieve tractability, robustness, low cost, and high MIQ (machine intelligence quotient) through an exploitation of the tolerance for imprecision and uncertainty. Thus, in soft computing what is usually sought is an approximate solution to a precisely formulated problem or, more typically, an approximate solution to an imprecisely formulated problem. A simple case in point is the problem of parking a car. Generally, humans can park a car rather easily because the final position of the car is not specified exactly. If it were specified to within, say, a few millimeters and a fraction of a degree, it would take hours or days of maneuvering and precise measurements of distance and angular position to solve the problem. What this simple example points to is the fact that, in general, high precision carries a high cost. The challenge, then, is to exploit the tolerance for imprecision by devising methods of computation which lead to an acceptable solution at low cost. By its nature, soft computing is much closer to human reasoning than the traditional modes of computation. At this juncture, the major components of soft computing are fuzzy logic (FL), neural network theory (NN), and probabilistic reasoning techniques (PR), including genetic algorithms, chaos theory, and part of learning theory. Increasingly, these techniques are used in combination to achieve significant improvement in performance and adaptability. Among the important application areas for soft computing are control systems, expert systems, data compression techniques, image processing, and decision support systems. It may be argued that it is soft computing, rather than the traditional hard computing, that should be viewed as the foundation for artificial

  10. Spiking network simulation code for petascale computers

    PubMed Central

    Kunkel, Susanne; Schmidt, Maximilian; Eppler, Jochen M.; Plesser, Hans E.; Masumoto, Gen; Igarashi, Jun; Ishii, Shin; Fukai, Tomoki; Morrison, Abigail; Diesmann, Markus; Helias, Moritz

    2014-01-01

    Brain-scale networks exhibit a breathtaking heterogeneity in the dynamical properties and parameters of their constituents. At cellular resolution, the entities of theory are neurons and synapses and over the past decade researchers have learned to manage the heterogeneity of neurons and synapses with efficient data structures. Already early parallel simulation codes stored synapses in a distributed fashion such that a synapse solely consumes memory on the compute node harboring the target neuron. As petaflop computers with some 100,000 nodes become increasingly available for neuroscience, new challenges arise for neuronal network simulation software: Each neuron contacts on the order of 10,000 other neurons and thus has targets only on a fraction of all compute nodes; furthermore, for any given source neuron, at most a single synapse is typically created on any compute node. From the viewpoint of an individual compute node, the heterogeneity in the synaptic target lists thus collapses along two dimensions: the dimension of the types of synapses and the dimension of the number of synapses of a given type. Here we present a data structure taking advantage of this double collapse using metaprogramming techniques. After introducing the relevant scaling scenario for brain-scale simulations, we quantitatively discuss the performance on two supercomputers. We show that the novel architecture scales to the largest petascale supercomputers available today. PMID:25346682
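
    The storage idea described above (a synapse consumes memory only on the compute node that hosts its target neuron) can be mimicked in a few lines of plain Python. The neuron counts, fan-out and round-robin ownership rule below are invented, and this is of course not the metaprogrammed C++ data structure of the actual simulator.

      # Toy distributed synapse storage: each "node" keeps only synapses whose targets it owns.
      import random
      from collections import defaultdict

      N_NEURONS, N_NODES, FAN_OUT = 1000, 4, 100
      owner = lambda neuron: neuron % N_NODES            # round-robin neuron-to-node map

      # per-node storage: node -> target neuron -> list of (source, weight) synapses
      local_synapses = [defaultdict(list) for _ in range(N_NODES)]

      random.seed(0)
      for src in range(N_NEURONS):
          for tgt in random.sample(range(N_NEURONS), FAN_OUT):
              node = owner(tgt)                          # only the target's node stores the synapse
              local_synapses[node][tgt].append((src, 0.1))

      for node in range(N_NODES):
          n_syn = sum(len(v) for v in local_synapses[node].values())
          print(f"node {node}: {n_syn} synapses for {len(local_synapses[node])} local targets")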

  11. Spiking network simulation code for petascale computers.

    PubMed

    Kunkel, Susanne; Schmidt, Maximilian; Eppler, Jochen M; Plesser, Hans E; Masumoto, Gen; Igarashi, Jun; Ishii, Shin; Fukai, Tomoki; Morrison, Abigail; Diesmann, Markus; Helias, Moritz

    2014-01-01

    Brain-scale networks exhibit a breathtaking heterogeneity in the dynamical properties and parameters of their constituents. At cellular resolution, the entities of theory are neurons and synapses and over the past decade researchers have learned to manage the heterogeneity of neurons and synapses with efficient data structures. Already early parallel simulation codes stored synapses in a distributed fashion such that a synapse solely consumes memory on the compute node harboring the target neuron. As petaflop computers with some 100,000 nodes become increasingly available for neuroscience, new challenges arise for neuronal network simulation software: Each neuron contacts on the order of 10,000 other neurons and thus has targets only on a fraction of all compute nodes; furthermore, for any given source neuron, at most a single synapse is typically created on any compute node. From the viewpoint of an individual compute node, the heterogeneity in the synaptic target lists thus collapses along two dimensions: the dimension of the types of synapses and the dimension of the number of synapses of a given type. Here we present a data structure taking advantage of this double collapse using metaprogramming techniques. After introducing the relevant scaling scenario for brain-scale simulations, we quantitatively discuss the performance on two supercomputers. We show that the novel architecture scales to the largest petascale supercomputers available today. PMID:25346682

  12. Spiking network simulation code for petascale computers.

    PubMed

    Kunkel, Susanne; Schmidt, Maximilian; Eppler, Jochen M; Plesser, Hans E; Masumoto, Gen; Igarashi, Jun; Ishii, Shin; Fukai, Tomoki; Morrison, Abigail; Diesmann, Markus; Helias, Moritz

    2014-01-01

    Brain-scale networks exhibit a breathtaking heterogeneity in the dynamical properties and parameters of their constituents. At cellular resolution, the entities of theory are neurons and synapses and over the past decade researchers have learned to manage the heterogeneity of neurons and synapses with efficient data structures. Already early parallel simulation codes stored synapses in a distributed fashion such that a synapse solely consumes memory on the compute node harboring the target neuron. As petaflop computers with some 100,000 nodes become increasingly available for neuroscience, new challenges arise for neuronal network simulation software: Each neuron contacts on the order of 10,000 other neurons and thus has targets only on a fraction of all compute nodes; furthermore, for any given source neuron, at most a single synapse is typically created on any compute node. From the viewpoint of an individual compute node, the heterogeneity in the synaptic target lists thus collapses along two dimensions: the dimension of the types of synapses and the dimension of the number of synapses of a given type. Here we present a data structure taking advantage of this double collapse using metaprogramming techniques. After introducing the relevant scaling scenario for brain-scale simulations, we quantitatively discuss the performance on two supercomputers. We show that the novel architecture scales to the largest petascale supercomputers available today.

  13. Parallel Computation of Unsteady Flows on a Network of Workstations

    NASA Technical Reports Server (NTRS)

    1997-01-01

    Parallel computation of unsteady flows requires significant computational resources. A network of workstations offers an efficient solution, allowing large problems to be treated at a reasonable cost. This approach requires the solution of several problems: 1) the partitioning and distribution of the problem over a network of workstations, 2) efficient communication tools, 3) managing the system efficiently for a given problem. There is, of course, also the question of the efficiency of any given numerical algorithm on such a computing system. The NPARC code was chosen as a sample application. For the explicit version of the NPARC code both two- and three-dimensional problems were studied. Again both steady and unsteady problems were investigated. The issues studied as part of the research program were: 1) how to distribute the data between the workstations, 2) how to compute and how to communicate at each node efficiently, 3) how to balance the load distribution. In the following, a summary of these activities is presented. Details of the work have been presented and published as referenced.
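
    The first of the issues listed above, distributing the data over the workstations, amounts to partitioning the grid and keeping track of the halo rows each partition must exchange. The sketch below does only that bookkeeping for a one-dimensional slab decomposition; the grid size and worker count are invented and no real message passing is performed.

      # Slab partitioning of a structured grid across workstations, with halo bookkeeping.
      def partition_rows(n_rows, n_workers):
          base, extra = divmod(n_rows, n_workers)
          slabs, start = [], 0
          for w in range(n_workers):
              size = base + (1 if w < extra else 0)      # spread the remainder for load balance
              slabs.append((start, start + size))        # half-open row range [start, end)
              start += size
          return slabs

      def halo_rows(slabs, w):
          """Rows this worker must receive from its neighbours each iteration."""
          lo, hi = slabs[w]
          halos = []
          if w > 0:
              halos.append(("from", w - 1, lo - 1))
          if w < len(slabs) - 1:
              halos.append(("from", w + 1, hi))
          return halos

      slabs = partition_rows(n_rows=103, n_workers=4)
      for w, s in enumerate(slabs):
          print(f"worker {w}: rows {s}, halo {halo_rows(slabs, w)}")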

  14. Computer Networks teaching by microlearning principles

    NASA Astrophysics Data System (ADS)

    A, Zhamanov; M, Zhamapor

    2013-04-01

    Higher education today faces many kinds of problems. One of them is that students receive a very large amount of information every day; understanding each lesson takes a great deal of time, and it is hard for students to complete everything on schedule. This paper presents a study of a microlearning application for teaching computer networks, consisting of a general introduction and a statement of its purpose.

  15. The dangers of heterogeneous network computing: heterogeneous networks considered harmful

    SciTech Connect

    Demmel, J.; Stanley, K.; Dongarra, J.; Hammarling, S.; Osstrouchov, S.

    1996-12-31

    This report addresses the issue of writing reliable numerical software for networks of heterogeneous computers. Much software has been written for distributed memory parallel computers, and in principle such software could readily be ported to networks of machines, such as a collection of workstations connected by Ethernet; but if such a network is not homogeneous there are special challenges that need to be addressed. The symptoms can range from erroneous results returned without warning to deadlock. Some of the problems are straightforward to solve, but for others the solutions are not so obvious, and indeed in some cases, such as the method of bisection which we shall discuss in the report, we have not yet decided upon a satisfactory solution that does not incur an unacceptable overhead. Making software robust on heterogeneous systems often requires additional communication. In this report we describe and illustrate the problems and, where possible, suggest solutions so that others may be aware of the potential pitfalls and either avoid them or, if that is not possible, ensure that their software is not used on heterogeneous networks.

  16. Criteria development for upgrading computer networks

    NASA Technical Reports Server (NTRS)

    Efe, Kemal

    1995-01-01

    Being an infrastructure system, the computer network has a fundamental role in the day-to-day activities of personnel working at KSC. It is easily appreciated that the lack of 'satisfactory' network performance can have a high 'cost' for KSC. Yet, this seemingly obvious concept is quite difficult to demonstrate. At what point do we say that performance is below the lowest tolerable level? How do we know when the 'cost' of using the system at the current level of degraded performance exceeds the cost of upgrading it? In this research, we consider the cost and performance factors that may affect decision making regarding upgrades to computer networks. Cost factors are detailed in terms of 'direct costs' and 'subjective costs'. Performance factors are examined in terms of 'required performance' and 'offered performance'. Required performance is further examined by presenting a methodology for trend analysis based on applying interpolation methods to observed traffic levels. Offered performance levels are analyzed by deriving simple equations to evaluate network performance. The results are evaluated in the light of recommended upgrade policies currently in use for telephone exchange systems; similarities and differences between the two types of service are discussed.
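
    The trend-analysis idea for 'required performance' can be illustrated with a simple linear fit extrapolated to a utilization threshold. The sketch below uses a straight-line trend as a stand-in for the report's interpolation methodology, and all traffic numbers are invented.

      # Fit a linear trend to observed utilization and estimate when it crosses a threshold.
      import numpy as np

      months = np.arange(24)                          # two years of monthly samples
      util = 0.35 + 0.015 * months + 0.02 * np.random.default_rng(2).standard_normal(24)

      slope, intercept = np.polyfit(months, util, 1)  # linear trend of offered load
      threshold = 0.80                                # chosen "lowest tolerable" utilization
      months_to_threshold = (threshold - intercept) / slope

      print(f"trend: {slope:.3f} per month; threshold reached near month {months_to_threshold:.0f}")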

  17. Computer networks. Citations from the NTIS data base

    NASA Astrophysics Data System (ADS)

    Jones, J. E.

    1980-08-01

    Research reports on aspects of computer networks, including hardware, software, data transmission, time sharing, and applicable theory to network design are cited. Specific studies on the ARPA networks, and other such systems are listed.

  18. Computer network defense through radial wave functions

    NASA Astrophysics Data System (ADS)

    Malloy, Ian J.

    The purpose of this research is to synthesize basic and fundamental findings in quantum computing, as applied to the attack and defense of conventional computer networks. The concept focuses on uses of radio waves as a shield for, and an attack against, traditional computers. A logic bomb is analogous to a landmine in a computer network, and if one were to implement it as non-trivial mitigation, it would aid computer network defense. As has been seen in kinetic warfare, the use of landmines has been devastating to geopolitical regions in that they are severely difficult for a civilian to avoid triggering, given the unknown position of a landmine. Thus, understanding a logic bomb is relevant and has corollaries to quantum mechanics as well. The research synthesizes quantum logic phase shifts in certain respects using the Dynamic Data Exchange protocol in software written for this work, as well as a C-NOT gate applied to a virtual quantum circuit environment by implementing a Quantum Fourier Transform. The research focus applies the principles of coherence and entanglement from quantum physics, the concept of expert systems in artificial intelligence, principles of prime-number-based cryptography with trapdoor functions, and modeling radio wave propagation against an event from unknown parameters. This comes as a program relying on the artificial intelligence concept of an expert system in conjunction with trigger events for a trapdoor function relying on infinite recursion, as well as system mechanics for elliptic curve cryptography along orbital angular momenta. Here trapdoor denotes both the form of cipher and the implied relationship to logic bombs.

  19. Computer Based Collaborative Problem Solving for Introductory Courses in Physics

    NASA Astrophysics Data System (ADS)

    Ilie, Carolina; Lee, Kevin

    2010-03-01

    We discuss computer-based collaborative problem solving in a recitation-style setting. The course is designed by Lee [1], and the idea was proposed earlier by Christian, Belloni and Titus [2,3]. The students find the problems on a web page containing simulations (physlets), and they write the solutions on an accompanying worksheet after discussing them with a classmate. Physlets have the advantage of being much more like real-world problems than textbook problems. We also compare two protocols for web-based instruction using simulations in an introductory physics class [1]. The inquiry protocol allowed students to control input parameters while the worked example protocol did not. We will discuss which of the two methods is more efficient in relation to Scientific Discovery Learning and Cognitive Load Theory. 1. Lee, Kevin M., Nicoll, Gayle and Brooks, Dave W. (2004). ``A Comparison of Inquiry and Worked Example Web-Based Instruction Using Physlets'', Journal of Science Education and Technology 13, No. 1: 81-88. 2. Christian, W., and Belloni, M. (2001). Physlets: Teaching Physics With Interactive Curricular Material, Prentice Hall, Englewood Cliffs, NJ. 3. Christian, W., and Titus, A. (1998). ``Developing web-based curricula using Java Physlets.'' Computers in Physics 12: 227--232.

  20. Using satellite communications for a mobile computer network

    NASA Technical Reports Server (NTRS)

    Wyman, Douglas J.

    1993-01-01

    The topics discussed include the following: patrol car automation, mobile computer network, network requirements, network design overview, MCN mobile network software, MCN hub operation, mobile satellite software, hub satellite software, the benefits of patrol car automation, the benefits of satellite mobile computing, and national law enforcement satellite.

  1. Nanoarchitectonic atomic switch networks for unconventional computing

    NASA Astrophysics Data System (ADS)

    Demis, Eleanor C.; Aguilera, Renato; Scharnhorst, Kelsey; Aono, Masakazu; Stieg, Adam Z.; Gimzewski, James K.

    2016-11-01

    Developments in computing hardware are constrained by the operating principles of complementary metal oxide semiconductor (CMOS) technology, fabrication limits of nanometer scaled features, and difficulties in effective utilization of high density interconnects. This set of obstacles has promulgated a search for alternative, energy efficient approaches to computing inspired by natural systems including the mammalian brain. Atomic switch network (ASN) devices are a unique platform specifically developed to overcome these current barriers to realize adaptive neuromorphic technology. ASNs are composed of a massively interconnected network of atomic switches with a density of ∼109 units/cm2 and are structurally reminiscent of the neocortex of the brain. ASNs possess both the intrinsic capabilities of individual memristive switches, such as memory capacity and multi-state switching, and the characteristics of large-scale complex systems, such as power-law dynamics and non-linear transformations of input signals. Here we describe the successful nanoarchitectonic fabrication of next-generation ASN devices using combined top-down and bottom-up processing and experimentally demonstrate their utility as reservoir computing hardware. Leveraging their intrinsic dynamics and transformative input/output (I/O) behavior enabled waveform regression of periodic signals in the absence of embedded algorithms, further supporting the potential utility of ASN technology as a platform for unconventional approaches to computing.
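
    A purely software analogue of the reservoir computing task mentioned above is an echo state network: a fixed random recurrent reservoir plus a trained linear readout, here asked to map a sine input onto a triangle wave. The reservoir size, spectral radius and signals below are arbitrary; this does not model the atomic switch network hardware itself.

      # Minimal echo state network (reservoir computing) for waveform regression.
      import numpy as np

      rng = np.random.default_rng(3)
      T, n_res = 2000, 200
      t = np.arange(T)
      u = np.sin(2 * np.pi * t / 50)                       # periodic input
      target = 4 * np.abs((t / 50) % 1 - 0.5) - 1          # triangle wave, same period

      W_in = rng.uniform(-0.5, 0.5, n_res)
      W = rng.normal(0, 1, (n_res, n_res))
      W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))      # scale spectral radius below 1

      x = np.zeros(n_res)
      states = np.zeros((T, n_res))
      for k in range(T):
          x = np.tanh(W @ x + W_in * u[k])                 # fixed, untrained reservoir dynamics
          states[k] = x

      washout = 200                                        # discard the initial transient
      W_out, *_ = np.linalg.lstsq(states[washout:], target[washout:], rcond=None)
      pred = states[washout:] @ W_out                      # trained linear readout
      print("readout RMSE:", round(float(np.sqrt(np.mean((pred - target[washout:]) ** 2))), 4))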

  2. Using Microcomputer Simulations in the Classroom: Examples from Undergraduate and Faculty Computer Literacy Courses.

    ERIC Educational Resources Information Center

    Hart, Jeffrey A.

    Examples of the use of computer simulations in two undergraduate courses, (American Foreign Policy and Introduction to International Politics), and a faculty computer literacy course on simulations and artificial intelligence, are provided in this compilation of various instructional items. A list of computer simulations available for various…

  3. Path Not Found: Disparities in Access to Computer Science Courses in California High Schools

    ERIC Educational Resources Information Center

    Martin, Alexis; McAlear, Frieda; Scott, Allison

    2015-01-01

    "Path Not Found: Disparities in Access to Computer Science Courses in California High Schools" exposes one of the foundational causes of underrepresentation in computing: disparities in access to computer science courses in California's public high schools. This report provides new, detailed data on these disparities by student body…

  4. Computational Fact Checking from Knowledge Networks.

    PubMed

    Ciampaglia, Giovanni Luca; Shiralkar, Prashant; Rocha, Luis M; Bollen, Johan; Menczer, Filippo; Flammini, Alessandro

    2015-01-01

    Traditional fact checking by expert journalists cannot keep up with the enormous volume of information that is now generated online. Computational fact checking may significantly enhance our ability to evaluate the veracity of dubious information. Here we show that the complexities of human fact checking can be approximated quite well by finding the shortest path between concept nodes under properly defined semantic proximity metrics on knowledge graphs. Framed as a network problem this approach is feasible with efficient computational techniques. We evaluate this approach by examining tens of thousands of claims related to history, entertainment, geography, and biographical information using a public knowledge graph extracted from Wikipedia. Statements independently known to be true consistently receive higher support via our method than do false ones. These findings represent a significant step toward scalable computational fact-checking methods that may one day mitigate the spread of harmful misinformation. PMID:26083336
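
    The path-based scoring idea can be illustrated with a toy knowledge graph: a statement is scored by the cheapest path between its subject and object, with traversal of very generic (high-degree) nodes penalized. The graph, edge-cost rule and scoring function below are illustrative stand-ins, not the paper's exact semantic proximity metric.

      # Toy fact-checking score: degree-penalized shortest path on a small knowledge graph.
      import math
      import networkx as nx

      G = nx.Graph()
      G.add_edges_from([
          ("Barack Obama", "Honolulu"), ("Honolulu", "Hawaii"), ("Hawaii", "United States"),
          ("Barack Obama", "United States"), ("United States", "Canada"),
          ("Canada", "Ottawa"), ("United States", "politician"),
      ])

      def proximity(G, u, v):
          # edge cost grows with the generality (degree) of the node being entered
          weight = lambda a, b, d: 1.0 + math.log(G.degree(b))
          length = nx.shortest_path_length(G, u, v, weight=weight)
          return 1.0 / (1.0 + length)

      print("Obama -- Hawaii (true) :", round(proximity(G, "Barack Obama", "Hawaii"), 3))
      print("Obama -- Ottawa (false):", round(proximity(G, "Barack Obama", "Ottawa"), 3))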

  5. Computational Fact Checking from Knowledge Networks

    PubMed Central

    Ciampaglia, Giovanni Luca; Shiralkar, Prashant; Rocha, Luis M.; Bollen, Johan; Menczer, Filippo; Flammini, Alessandro

    2015-01-01

    Traditional fact checking by expert journalists cannot keep up with the enormous volume of information that is now generated online. Computational fact checking may significantly enhance our ability to evaluate the veracity of dubious information. Here we show that the complexities of human fact checking can be approximated quite well by finding the shortest path between concept nodes under properly defined semantic proximity metrics on knowledge graphs. Framed as a network problem this approach is feasible with efficient computational techniques. We evaluate this approach by examining tens of thousands of claims related to history, entertainment, geography, and biographical information using a public knowledge graph extracted from Wikipedia. Statements independently known to be true consistently receive higher support via our method than do false ones. These findings represent a significant step toward scalable computational fact-checking methods that may one day mitigate the spread of harmful misinformation. PMID:26083336

  6. Computational Fact Checking from Knowledge Networks.

    PubMed

    Ciampaglia, Giovanni Luca; Shiralkar, Prashant; Rocha, Luis M; Bollen, Johan; Menczer, Filippo; Flammini, Alessandro

    2015-01-01

    Traditional fact checking by expert journalists cannot keep up with the enormous volume of information that is now generated online. Computational fact checking may significantly enhance our ability to evaluate the veracity of dubious information. Here we show that the complexities of human fact checking can be approximated quite well by finding the shortest path between concept nodes under properly defined semantic proximity metrics on knowledge graphs. Framed as a network problem this approach is feasible with efficient computational techniques. We evaluate this approach by examining tens of thousands of claims related to history, entertainment, geography, and biographical information using a public knowledge graph extracted from Wikipedia. Statements independently known to be true consistently receive higher support via our method than do false ones. These findings represent a significant step toward scalable computational fact-checking methods that may one day mitigate the spread of harmful misinformation.

  7. Computational drug repositioning through heterogeneous network clustering

    PubMed Central

    2013-01-01

    Background Given the costly and time-consuming process and high attrition rates in drug discovery and development, drug repositioning or drug repurposing is considered a viable strategy both to replenish the drying out drug pipelines and to surmount the innovation gap. Although there is a growing recognition that mechanistic relationships from molecular to systems level should be integrated into drug discovery paradigms, relatively few studies have integrated information about heterogeneous networks into computational drug-repositioning candidate discovery platforms. Results Using known disease-gene and drug-target relationships from the KEGG database, we built a weighted disease and drug heterogeneous network. The nodes represent drugs or diseases while the edges represent shared gene, biological process, pathway, phenotype or a combination of these features. We clustered this weighted network to identify modules and then assembled all possible drug-disease pairs (putative drug repositioning candidates) from these modules. We validated our predictions by testing their robustness and evaluated them by their overlap with drug indications that were either reported in published literature or investigated in clinical trials. Conclusions Previous computational approaches for drug repositioning focused on drug-drug and disease-disease similarity approaches, whereas we have taken a more holistic approach by also considering drug-disease relationships. Further, we considered not only gene-based but also other features to build the disease-drug networks. Despite the relative simplicity of our approach, based on the robustness analyses and the overlap of some of our predictions with drug indications that are under investigation, we believe our approach could complement the current computational approaches for drug repositioning candidate discovery. PMID:24564976
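
    The workflow described above can be caricatured in a few lines: build a small weighted drug/disease network, cluster it with a generic modularity-based community detector, and list within-module drug-disease pairs that are not yet linked as repositioning candidates. All nodes, edges and weights below are invented, and the clustering routine is a stand-in for the authors' method.

      # Toy heterogeneous drug/disease network, clustered to propose repositioning candidates.
      import networkx as nx
      from networkx.algorithms import community
      from itertools import product

      G = nx.Graph()
      # edge weight ~ number of shared genes / pathways / phenotypes (made up)
      G.add_weighted_edges_from([
          ("drug:metformin", "disease:type2_diabetes", 3.0),
          ("drug:metformin", "disease:pcos", 1.0),
          ("disease:type2_diabetes", "disease:pcos", 2.0),
          ("drug:aspirin", "disease:cad", 3.0),
          ("drug:statinX", "disease:cad", 2.0),
          ("disease:cad", "disease:stroke", 2.0),
      ])

      modules = community.greedy_modularity_communities(G, weight="weight")
      for m in modules:
          drugs = [n for n in m if n.startswith("drug:")]
          diseases = [n for n in m if n.startswith("disease:")]
          for d, dis in product(drugs, diseases):
              if not G.has_edge(d, dis):                 # same module but not yet linked: candidate
                  print("candidate:", d, "->", dis)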

  8. Computed tomography-enhanced anatomy course using enterprise visualization.

    PubMed

    May, Hila; Cohen, Haim; Medlej, Bahaa; Kornreich, Liora; Peled, Nathan; Hershkovitz, Israel

    2013-01-01

    Rapid changes in medical knowledge are forcing continuous adaptation of the basic science courses in medical schools. This article discusses a three-year experience developing a new Computed Tomography (CT)-based anatomy curriculum at the Sackler School of Medicine, Tel Aviv University, including describing the motivations and reasoning for the new curriculum, the CT-based learning system itself, practical examples of visual dissections, and student assessments of the new curriculum. At the heart of this new curriculum is the emphasis on studying anatomy by navigating inside the bodies of various living individuals utilizing a CT viewer. To assess the students' experience with the new CT-based learning method, an anonymous questionnaire was administered at the end of the course for three consecutive academic years: 2008/2009, 2009/2010, 2010/2011. Based upon the results, modifications were made to the curriculum in the summers of 2009 and 2010. Results showed that: (1) during these three years the number of students extensively using the CT system quadrupled (from 11% to 46%); (2) students' satisfaction from radiologists involvement increased by 150%; and (3) student appreciation of the CT-based learning method significantly increased (from 13% to 68%). It was concluded that discouraging results (mainly negative feedback from students) during the first years and a priori opposition from the teaching staff should not weaken efforts to develop new teaching methods in the field of anatomy. Incorporating a new curriculum requires time and patience. Student and staff satisfaction, along with utilization of the new system, will increase with the improvement of impeding factors.

  9. Computer Networks for Science Teachers. ERIC CSMEE Digest.

    ERIC Educational Resources Information Center

    Roempler, Kimberly S.; Warren, Charles R.

    Formerly reserved for use by scientists, researchers, and computer buffs, computer networks now have capabilities that make them extremely useful to science teachers and their classes. This digest is designed to provide educators with some basic background on computer communications and to provide a few examples of computer networks that are…

  10. Profiles of Motivated Self-Regulation in College Computer Science Courses: Differences in Major versus Required Non-Major Courses

    NASA Astrophysics Data System (ADS)

    Shell, Duane F.; Soh, Leen-Kiat

    2013-12-01

    The goal of the present study was to utilize a profiling approach to understand differences in motivation and strategic self-regulation among post-secondary STEM students in major versus required non-major computer science courses. Participants were 233 students from required introductory computer science courses (194 men; 35 women; 4 unknown) at a large Midwestern state university. Cluster analysis identified five profiles: (1) a strategic profile of a highly motivated by-any-means good strategy user; (2) a knowledge-building profile of an intrinsically motivated autonomous, mastery-oriented student; (3) a surface learning profile of a utility motivated minimally engaged student; (4) an apathetic profile of an amotivational disengaged student; and (5) a learned helpless profile of a motivated but unable to effectively self-regulate student. Among CS majors and students in courses in their major field, the strategic and knowledge-building profiles were the most prevalent. Among non-CS majors and students in required non-major courses, the learned helpless, surface learning, and apathetic profiles were the most prevalent. Students in the strategic and knowledge-building profiles had significantly higher retention of computational thinking knowledge than students in other profiles. Students in the apathetic and surface learning profiles saw little instrumentality of the course for their future academic and career objectives. Findings show that students in STEM fields taking required computer science courses exhibit the same constellation of motivated strategic self-regulation profiles found in other post-secondary and K-12 settings.

  11. Program Predicts Time Courses of Human/Computer Interactions

    NASA Technical Reports Server (NTRS)

    Vera, Alonso; Howes, Andrew

    2005-01-01

    CPM X is a computer program that predicts sequences of, and amounts of time taken by, routine actions performed by a skilled person performing a task. Unlike programs that simulate the interaction of the person with the task environment, CPM X predicts the time course of events as consequences of encoded constraints on human behavior. The constraints determine which cognitive and environmental processes can occur simultaneously and which have sequential dependencies. The input to CPM X comprises (1) a description of a task and strategy in a hierarchical description language and (2) a description of architectural constraints in the form of rules governing interactions of fundamental cognitive, perceptual, and motor operations. The output of CPM X is a Program Evaluation Review Technique (PERT) chart that presents a schedule of predicted cognitive, motor, and perceptual operators interacting with a task environment. The CPM X program allows direct, a priori prediction of skilled user performance on complex human-machine systems, providing a way to assess critical interfaces before they are deployed in mission contexts.
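
    The kind of schedule a PERT chart encodes can be computed with a standard earliest-start pass over a directed acyclic graph of operators. The operator names, durations and dependencies below are invented for illustration; they are not CPM X's actual cognitive, perceptual and motor operators or its architectural constraints.

      # Earliest-start schedule and total predicted time for a DAG of operators.
      from graphlib import TopologicalSorter

      duration = {            # milliseconds, illustrative
          "perceive-target": 100, "decide-response": 50,
          "move-hand": 200, "press-key": 100, "verify": 150,
      }
      depends_on = {
          "decide-response": {"perceive-target"},
          "move-hand": {"decide-response"},
          "press-key": {"move-hand"},
          "verify": {"press-key", "perceive-target"},
      }

      earliest = {}
      for op in TopologicalSorter(depends_on).static_order():
          preds = depends_on.get(op, set())
          earliest[op] = max((earliest[p] + duration[p] for p in preds), default=0)

      finish = max(earliest[o] + duration[o] for o in duration)
      for op in sorted(earliest, key=earliest.get):
          print(f"{op:18s} starts at {earliest[op]:4d} ms")
      print("predicted total time:", finish, "ms")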

  12. Restructuring Classroom Interaction with Networked Computers: Effects on Quantity and Characteristics of Language Production.

    ERIC Educational Resources Information Center

    Kern, Richard G.

    1995-01-01

    This study examined the use of Daedalus InterChange, a local area computer network application, to facilitate communicative language use among college students in two elementary French courses. It found that students had more turns and used a greater variety of discourse functions when working in InterChange than they did in their oral…

  13. Neural network models for optical computing

    SciTech Connect

    Athale, R.A.; Davis, J.

    1988-01-01

    This volume comprises the record of the conference on neural network models for optical computing. In keeping with the interdisciplinary nature of the field, the invited papers are from diverse research areas, such as neuroscience, parallel architectures, neural modeling, and perception. The papers consist of three major classes: applications of optical neural nets for pattern classification, analysis, and image formation; development and analysis of neural net models that are particularly suited for optical implementation; experimental demonstrations of optical neural nets, particularly with adaptive interconnects.

  14. Some queuing network models of computer systems

    NASA Technical Reports Server (NTRS)

    Herndon, E. S.

    1980-01-01

    Queuing network models of a computer system operating with a single workload type are presented. Program algorithms are adapted for use on the Texas Instruments SR-52 programmable calculator. By slightly altering the algorithm to process the G and H matrices row by row instead of column by column, six devices and an unlimited job/terminal population could be handled on the SR-52. Techniques are also introduced for handling a simple load dependent server and for studying interactive systems with fixed multiprogramming limits.
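
    A standard, calculator-friendly algorithm for closed queuing network models of computer systems is exact Mean Value Analysis. The sketch below implements it for a single workload class with think time; it is a generic illustration, not the report's G- and H-matrix algorithm, and the service demands are invented.

      # Exact Mean Value Analysis for a closed, single-class queuing network.
      def mva(demands, n_jobs, think_time=0.0):
          """demands[k] = visit count * service time at device k (seconds)."""
          queue = [0.0] * len(demands)                  # mean queue length at each device
          for n in range(1, n_jobs + 1):
              resp = [d * (1.0 + q) for d, q in zip(demands, queue)]   # per-device response
              total_resp = sum(resp)
              throughput = n / (think_time + total_resp)
              queue = [throughput * r for r in resp]    # Little's law applied per device
          return throughput, total_resp, queue

      X, R, Q = mva(demands=[0.05, 0.03, 0.02], n_jobs=20, think_time=5.0)
      print(f"throughput {X:.2f} jobs/s, response time {R:.3f} s, queues {[round(q, 2) for q in Q]}")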

  15. Design Principles for "Thriving in Our Digital World": A High School Computer Science Course

    ERIC Educational Resources Information Center

    Veletsianos, George; Beth, Bradley; Lin, Calvin; Russell, Gregory

    2016-01-01

    "Thriving in Our Digital World" is a technology-enhanced dual enrollment course introducing high school students to computer science through project- and problem-based learning. This article describes the evolution of the course and five lessons learned during the design, development, implementation, and iteration of the course from its…

  16. Microcomputers: A Course for Parents and Children. For Use with Personal Computers. No. 657.

    ERIC Educational Resources Information Center

    Boe, Thomas; And Others

    This booklet is designed to supplement a short course on microcomputers for parents and their middle school children. The course activities are designed to increase the computer literacy of the participants and also to provide an opportunity for parents and children to spend time with each other. The course is intended to take place during a…

  17. Techniques for Developing a Syllabus/Website for a Computer Mediated Learning (CML) Course.

    ERIC Educational Resources Information Center

    Bull, Kay Sather; Kimball, Sarah; Stansberry, Susan

    Computer mediated learning (CML) courses can overcome the temporal and spatial obstacles of isolated commuter students with busy schedules. Whether presented online or as an add-on to an on-campus course, the CML course needs a good syllabus. This paper discusses components of a CML syllabus and online activities for students. Typical components…

  18. "The 'Mouse' That Roared": Using Computer Labs for Basic Course Group Projects.

    ERIC Educational Resources Information Center

    O'Connor, Penny; Chatham-Carpenter, April

    One of the challenges in teaching a hybrid Basic Course in Communication is the wide variety of topics that can be covered in one semester. Two basic course instructors have found that their recently opened Basic Course computer lab gave them the opportunity to develop interdisciplinary assignments to help more efficiently address various…

  19. Some Specifications for a Computer-Oriented First Course in Electrical Engineering.

    ERIC Educational Resources Information Center

    Commission on Engineering Education, Washington, DC.

    Reported are specifications for a computer-oriented first course in electrical engineering giving new direction to the development of texts and alternative courses of study. Guidelines for choice of topics, a statement of fundamental concepts, pitfalls to avoid, and some sample course outlines are given. The study of circuits through computer…

  20. Philosophy of Language. Course Notes for a Tutorial on Computational Semantics.

    ERIC Educational Resources Information Center

    Wilks, Yorick

    This course was part of a tutorial focusing on the state of computational semantics, i.e., the state of work on natural language within the artificial intelligence (AI) paradigm. The discussion in the course centered on the philosophers Richard Montague and Ludwig Wittgenstein. The course was divided into three sections: (1)…

  1. Using Type II Computer Network Technology To Reach Distance Students.

    ERIC Educational Resources Information Center

    Eastmond, Dan; Granger, Dan

    1998-01-01

    This article, in a series on computer technology and distance education, focuses on "Type II Technology," courses using textbooks and course guides for primary delivery, but enhancing them with computer conferencing as the main vehicle of instructional communication. Discusses technology proficiency, maximizing learning in conferencing…

  2. Massively Open Online Course for Educators (MOOC-Ed) Network Dataset

    ERIC Educational Resources Information Center

    Kellogg, Shaun; Edelmann, Achim

    2015-01-01

    This paper presents the Massively Open Online Course for Educators (MOOC-Ed) network dataset. It entails information on two online communication networks resulting from two consecutive offerings of the MOOC called "The Digital Learning Transition in K-12 Schools" in spring and fall 2013. The courses were offered to educators from the USA…

  3. Quantum computation over the butterfly network

    SciTech Connect

    Soeda, Akihito; Kinjo, Yoshiyuki; Turner, Peter S.; Murao, Mio

    2011-07-15

    In order to investigate distributed quantum computation under restricted network resources, we introduce a quantum computation task over the butterfly network where both quantum and classical communications are limited. We consider deterministically performing a two-qubit global unitary operation on two unknown inputs given at different nodes, with outputs at two distinct nodes. By using a particular resource setting introduced by M. Hayashi [Phys. Rev. A 76, 040301(R) (2007)], which is capable of performing a swap operation by adding two maximally entangled qubits (ebits) between the two input nodes, we show that unitary operations can be performed without adding any entanglement resource, if and only if the unitary operations are locally unitary equivalent to controlled unitary operations. Our protocol is optimal in the sense that the unitary operations cannot be implemented if we relax the specifications of any of the channels. We also construct protocols for performing controlled traceless unitary operations with a 1-ebit resource and for performing global Clifford operations with a 2-ebit resource.

  4. Visualization techniques for computer network defense

    NASA Astrophysics Data System (ADS)

    Beaver, Justin M.; Steed, Chad A.; Patton, Robert M.; Cui, Xiaohui; Schultz, Matthew

    2011-06-01

    Effective visual analysis of computer network defense (CND) information is challenging due to the volume and complexity of both the raw and analyzed network data. A typical CND is comprised of multiple niche intrusion detection tools, each of which performs network data analysis and produces a unique alerting output. The state-of-the-practice in the situational awareness of CND data is the prevalent use of custom-developed scripts by Information Technology (IT) professionals to retrieve, organize, and understand potential threat events. We propose a new visual analytics framework, called the Oak Ridge Cyber Analytics (ORCA) system, for CND data that allows an operator to interact with all detection tool outputs simultaneously. Aggregated alert events are presented in multiple coordinated views with timeline, cluster, and swarm model analysis displays. These displays are complemented with both supervised and semi-supervised machine learning classifiers. The intent of the visual analytics framework is to improve CND situational awareness, to enable an analyst to quickly navigate and analyze thousands of detected events, and to combine sophisticated data analysis techniques with interactive visualization such that patterns of anomalous activities may be more easily identified and investigated.

  5. Visualization Techniques for Computer Network Defense

    SciTech Connect

    Beaver, Justin M; Steed, Chad A; Patton, Robert M; Cui, Xiaohui; Schultz, Matthew A

    2011-01-01

    Effective visual analysis of computer network defense (CND) information is challenging due to the volume and complexity of both the raw and analyzed network data. A typical CND is comprised of multiple niche intrusion detection tools, each of which performs network data analysis and produces a unique alerting output. The state-of-the-practice in the situational awareness of CND data is the prevalent use of custom-developed scripts by Information Technology (IT) professionals to retrieve, organize, and understand potential threat events. We propose a new visual analytics framework, called the Oak Ridge Cyber Analytics (ORCA) system, for CND data that allows an operator to interact with all detection tool outputs simultaneously. Aggregated alert events are presented in multiple coordinated views with timeline, cluster, and swarm model analysis displays. These displays are complemented with both supervised and semi-supervised machine learning classifiers. The intent of the visual analytics framework is to improve CND situational awareness, to enable an analyst to quickly navigate and analyze thousands of detected events, and to combine sophisticated data analysis techniques with interactive visualization such that patterns of anomalous activities may be more easily identified and investigated.

  6. A Program of Computational Chemistry Exercises for the First-Semester General Chemistry Course

    ERIC Educational Resources Information Center

    Feller, Scott E.; Dallinger, Richard F.; McKinney, Paul Caylor

    2004-01-01

    The computer systems available for molecular modeling are described, along with a discussion of a program of computational chemistry exercises created to support the first-semester general chemistry course. Various exercises are listed, which guide the learner from beginning software practice to more complex…

  7. Report on WRITE; A Computer Assisted Instruction Course in Written English Usage.

    ERIC Educational Resources Information Center

    Dunwell, Stephen; And Others

    A computer-assisted instructional (CAI) course, WRITE, was used at the Poughkeepsie, New York, Middle School to help 5th through 8th graders with spelling and word usage problems. The course used the Coursewriter III language and an IBM System/360 computer; students received self-paced instructional programs at typewriter terminals. All teaching…

  8. Information and Computing Recommendations for a Course at the Secondary School Level.

    ERIC Educational Resources Information Center

    Education and Computing, 1986

    1986-01-01

    Presents recommendations for an interdisciplinary course called "Computers, Information, and Related Technologies" developed for 9th and 10th grade students by the American Federation of Information Processing Societies (AFIPS). Main topics presented in the course include how information is processed, using information, and how computing and…

  9. A Survey and Evaluation of Simulators Suitable for Teaching Courses in Computer Architecture and Organization

    ERIC Educational Resources Information Center

    Nikolic, B.; Radivojevic, Z.; Djordjevic, J.; Milutinovic, V.

    2009-01-01

    Courses in Computer Architecture and Organization are regularly included in Computer Engineering curricula. These courses are usually organized in such a way that students obtain not only a purely theoretical experience, but also a practical understanding of the topics lectured. This practical work is usually done in a laboratory using simulators…

  10. Using Microcomputers Simulations in the Classroom: Examples from Undergraduate and Faculty Computer Literacy Courses.

    ERIC Educational Resources Information Center

    Hart, Jeffrey A.

    1985-01-01

    Presents a discussion of how computer simulations are used in two undergraduate social science courses and a faculty computer literacy course on simulations and artificial intelligence. Includes a list of 60 simulations for use on mainframes and microcomputers. Entries include type of hardware required, publisher's address, and cost. Sample…

  11. Computer Controlled Test Systems. Introduction. A Course Based on the IEEE 488 Bus.

    ERIC Educational Resources Information Center

    Herrmann, Eric J.

    An introductory course in computer automated tests and measurement systems based on the International Test Instrument-Computer Interface Standard, the IEEE (Institute of Electrical and Electronics Engineers)-488, is presented in this study guide. This course is designed to: (1) introduce the electronics engineering technician to the functional…

  12. Designing for Deeper Learning in a Blended Computer Science Course for Middle School Students

    ERIC Educational Resources Information Center

    Grover, Shuchi; Pea, Roy; Cooper, Stephen

    2015-01-01

    The focus of this research was to create and test an introductory computer science course for middle school. Titled "Foundations for Advancing Computational Thinking" (FACT), the course aims to prepare and motivate middle school learners for future engagement with algorithmic problem solving. FACT was also piloted as a seven-week course…

  13. Computer Courses in Higher-Education: Improving Learning by Screencast Technology

    ERIC Educational Resources Information Center

    Ghilay, Yaron; Ghilay, Ruth

    2015-01-01

    The aim of the study was to develop a method for improving learning in computer courses by adding Screencast technology. The intention was to measure the influence of high-quality clips produced with Screencast technology on the learning process in computer courses. It was also necessary to identify the characteristics (pedagogical and…

  14. Computer Modeling of Planetary Surface Temperatures in Introductory Astronomy Courses

    NASA Astrophysics Data System (ADS)

    Barker, Timothy; Goodman, J.

    2013-01-01

    Computer modeling is an essential part of astronomical research, and so it is important that students be exposed to its powers and limitations in the first (and, perhaps, only) astronomy course they take in college. Building on the ideas of Walter Robinson (“Modeling Dynamic Systems,” Springer, 2002) we have found that STELLA software (ISEE Systems) allows introductory astronomy students to do sophisticated modeling by the end of two classes of instruction, with no previous experience in computer programming or calculus. STELLA’s graphical interface allows students to visualize systems in terms of “flows” in and out of “stocks,” avoiding the need to invoke differential equations. Linking flows and stocks allows feedback systems to be constructed. Students begin by building an easily understood system: a leaky bucket. This is a simple negative feedback system in which the volume in the bucket (a “stock”) depends on a fixed inflow rate and an outflow that increases in proportion to the volume in the bucket. Students explore how changing inflow rate and feedback parameters affect the steady-state volume and equilibration time of the system. This model is completed within a 50-minute class meeting. In the next class, students are given an analogous but more sophisticated problem: modeling a planetary surface temperature (“stock”) that depends on the “flow” of energy from the Sun, the planetary albedo, the outgoing flow of infrared radiation from the planet’s surface, and the infrared return from the atmosphere. Students then compare their STELLA model equilibrium temperatures to observed planetary temperatures, which agree with model ones for worlds without atmospheres, but give underestimates for planets with atmospheres, thus introducing students to the concept of greenhouse warming. We find that if we give the students part of this model at the start of a 50-minute class they are…
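
    The leaky-bucket exercise described above is a stock-and-flow model with a single negative feedback loop. As a minimal sketch of the same idea outside STELLA, the Python snippet below integrates dV/dt = inflow - k*V with Euler steps; the inflow rate, feedback constant, and time step are arbitrary illustrative values.

```python
# Minimal stock-and-flow sketch of the "leaky bucket" model described above:
# one stock (the volume), a fixed inflow, and an outflow proportional to the
# stock, integrated with simple Euler steps. All parameter values are made up.

def simulate_bucket(inflow=2.0, k=0.5, v0=0.0, dt=0.1, t_end=20.0):
    """Return the final volume for dV/dt = inflow - k * V."""
    v, t = v0, 0.0
    while t < t_end:
        v += dt * (inflow - k * v)   # negative feedback: outflow grows with V
        t += dt
    return v

if __name__ == "__main__":
    print(f"simulated steady state: {simulate_bucket():.3f}")
    print(f"analytic steady state : {2.0 / 0.5:.3f}")   # inflow / k
```

    By t = 20 the simulated volume is already very close to the analytic steady state inflow/k, which is the equilibration behaviour students are asked to explore.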

  15. Computational functions in biochemical reaction networks.

    PubMed Central

    Arkin, A; Ross, J

    1994-01-01

    In prior work we demonstrated the implementation of logic gates, sequential computers (universal Turing machines), and parallel computers by means of the kinetics of chemical reaction mechanisms. In the present article we develop this subject further by first investigating the computational properties of several enzymatic (single and multiple) reaction mechanisms: we show their steady states are analogous to either Boolean or fuzzy logic gates. Nearly perfect digital function is obtained only in the regime in which the enzymes are saturated with their substrates. With these enzymatic gates, we construct combinational chemical networks that execute a given truth-table. The dynamic range of a network's output is strongly affected by "input/output matching" conditions among the internal gate elements. We find a simple mechanism, similar to the interconversion of fructose-6-phosphate between its two bisphosphate forms (fructose-1,6-bisphosphate and fructose-2,6-bisphosphate), that functions analogously to an AND gate. When the simple model is replaced by one in which the enzyme rate laws are derived from experimental data, the steady state of the mechanism functions as an asymmetric fuzzy aggregation operator with properties akin to a fuzzy AND gate. The qualitative behavior of the mechanism does not change when situated within a large model of glycolysis/gluconeogenesis and the TCA cycle. The mechanism, in this case, switches the pathway's mode from glycolysis to gluconeogenesis in response to chemical signals of low blood glucose (cAMP) and abundant fuel for the TCA cycle (acetyl coenzyme A). PMID:7948674
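
    To make the "saturated enzymes act like Boolean gates" idea above concrete, here is a toy Python sketch in which the output is the product of two Hill-type saturation curves: it is close to 1 only when both chemical inputs are high. This is a schematic stand-in, not the fructose-6-phosphate mechanism or the experimentally derived rate laws analyzed in the paper.

```python
# Toy steady-state AND gate built from saturating (Hill) enzyme responses.
# Output is near 1 only when both chemical "inputs" are at saturating levels.

def hill(x, k=1.0, n=4):
    """Saturating response of an enzyme to substrate concentration x."""
    return x**n / (k**n + x**n)

def and_gate(a, b):
    """Product of two saturating responses approximates a logical AND."""
    return hill(a) * hill(b)

for a in (0.1, 10.0):          # 0.1 acts as "logic 0", 10.0 as "logic 1"
    for b in (0.1, 10.0):
        print(f"a={a:>4}  b={b:>4}  output={and_gate(a, b):.3f}")
```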

  16. Interdisciplinary Team-Teaching Experience for a Computer and Nuclear Energy Course for Electrical and Computer Engineering Students

    ERIC Educational Resources Information Center

    Kim, Charles; Jackson, Deborah; Keiller, Peter

    2016-01-01

    A new, interdisciplinary, team-taught course has been designed to educate students in Electrical and Computer Engineering (ECE) so that they can respond to global and urgent issues concerning computer control systems in nuclear power plants. This paper discusses our experience and assessment of the interdisciplinary computer and nuclear energy…

  17. Computational classifiers for predicting the short-term course of Multiple sclerosis

    PubMed Central

    2011-01-01

    Background The aim of this study was to assess the diagnostic accuracy (sensitivity and specificity) of clinical, imaging and motor evoked potentials (MEP) for predicting the short-term prognosis of multiple sclerosis (MS). Methods We obtained clinical data, MRI and MEP from a prospective cohort of 51 patients and 20 matched controls followed for two years. Clinical end-points recorded were: 1) expanded disability status scale (EDSS), 2) disability progression, and 3) new relapses. We constructed computational classifiers (Bayesian, random decision trees, simple logistic (linear regression) and neural networks) and calculated their accuracy by means of a 10-fold cross-validation method. We also validated our findings with a second cohort of 96 MS patients from a second center. Results We found that disability at baseline, grey matter volume and MEP were the variables that better correlated with clinical end-points, although their diagnostic accuracy was low. However, classifiers combining the most informative variables, namely baseline disability (EDSS), MRI lesion load and central motor conduction time (CMCT), were much more accurate in predicting future disability. Using the most informative variables (especially EDSS and CMCT) we developed a neural network (NNet) that attained a good performance for predicting the EDSS change. The predictive ability of the neural network was validated in an independent cohort obtaining similar accuracy (80%) for predicting the change in the EDSS two years later. Conclusions The usefulness of clinical variables for predicting the course of MS on an individual basis is limited, despite being associated with the disease course. By training a NNet with the most informative variables we achieved a good accuracy for predicting short-term disability. PMID:21649880
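
    The evaluation protocol above (several classifier families scored by 10-fold cross-validation) can be sketched in a few lines of Python with scikit-learn. The synthetic features below merely stand in for the clinical variables (EDSS, lesion load, CMCT); no real cohort data are used and the specific estimators are illustrative choices.

```python
# Several classifier families compared with 10-fold cross-validation,
# mirroring the study design above on synthetic placeholder data.

from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.naive_bayes import GaussianNB
from sklearn.tree import DecisionTreeClassifier
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import cross_val_score

X, y = make_classification(n_samples=150, n_features=3, n_informative=3,
                           n_redundant=0, random_state=0)

classifiers = {
    "Bayesian (naive Bayes)": GaussianNB(),
    "decision tree": DecisionTreeClassifier(random_state=0),
    "logistic regression": LogisticRegression(max_iter=1000),
    "neural network": MLPClassifier(hidden_layer_sizes=(10,), max_iter=2000,
                                    random_state=0),
}

for name, clf in classifiers.items():
    scores = cross_val_score(clf, X, y, cv=10)      # 10-fold cross-validation
    print(f"{name:24s} accuracy = {scores.mean():.2f} +/- {scores.std():.2f}")
```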

  18. Telecommunications/Networking. Course Four. Information Systems Curriculum.

    ERIC Educational Resources Information Center

    O'Neil, Sharon Lund; Everett, Donna R.

    This course is the fourth of seven in the Information Systems curriculum. The purpose of the course is to review data, text, graphics, and voice communications technology. It includes an overview of telecommunications technology. An overview of the course sets forth the condition and performance standard for each of the five task areas in the…

  19. A Computer Appreciation Course for First Year Mechanical Engineering Students.

    ERIC Educational Resources Information Center

    Haggett, A. J.; Le Masurier, D. W.

    1985-01-01

    Discusses the approach taken to introduce computers/computing into the curriculum at Brighton Polytechnic's Department of Mechanical and Production Engineering. Also lists aims of computing and microprocessor work, shows a typical computer exercise, and discusses polynomial approximation for a cam. (JN)

  20. Network and computing infrastructure for scientific applications in Georgia

    NASA Astrophysics Data System (ADS)

    Kvatadze, R.; Modebadze, Z.

    2016-09-01

    The status of the network and computing infrastructure and the services available to the research and education community of Georgia are presented. The Research and Educational Networking Association (GRENA) provides the following network services: Internet connectivity, cyber security, technical support, etc. Computing resources used by the research teams are located at GRENA and at major state universities. The GE-01-GRENA site is included in the European Grid Infrastructure. The paper also contains information about the programs of the Learning Center and the research and development projects in which GRENA is participating.

  1. Network Computer Technology. Phase I: Viability and Promise within NASA's Desktop Computing Environment

    NASA Technical Reports Server (NTRS)

    Paluzzi, Peter; Miller, Rosalind; Kurihara, West; Eskey, Megan

    1998-01-01

    Over the past several months, major industry vendors have made a business case for the network computer as a win-win solution toward lowering total cost of ownership. This report provides results from Phase I of the Ames Research Center network computer evaluation project. It identifies factors to be considered for determining cost of ownership; further, it examines where, when, and how network computer technology might fit in NASA's desktop computing architecture.

  2. Theory VI. Computational Materials Sciences Network (CMSN)

    SciTech Connect

    Zhang, Z Y

    2008-06-25

    The Computational Materials Sciences Network (CMSN) is a virtual center consisting of scientists interested in working together, across organizational and disciplinary boundaries, to formulate and pursue projects that reflect challenging and relevant computational research in the materials sciences. The projects appropriate for this center involve those problems best pursued through broad cooperative efforts, rather than those key problems best tackled by single investigator groups. CMSN operates similarly to the DOE Center of Excellence for the Synthesis and Processing of Advanced Materials, coordinated by George Samara at Sandia. As in the Synthesis and Processing Center, the intent of the modest funding for CMSN is to foster partnering and collective activities. All CMSN proposals undergo external peer review and are judged foremost on the quality and timeliness of the science and also on criteria relevant to the objective of the center, especially concerning a strategy for partnering. More details about CMSN can be found on the CMSN webpages at: http://cmpweb.ameslab.gov/ccms/CMSN-homepage.html.

  3. Towards a Versatile Tele-Education Platform for Computer Science Educators Based on the Greek School Network

    ERIC Educational Resources Information Center

    Paraskevas, Michael; Zarouchas, Thomas; Angelopoulos, Panagiotis; Perikos, Isidoros

    2013-01-01

    Nowadays, the need for highly qualified computer science educators in modern educational environments is commonplace. This study examines the potential use of the Greek School Network (GSN) to provide a robust and comprehensive e-training course for computer science educators in order to efficiently exploit advanced IT services and establish a…

  4. Computer networking in an ambulatory health care setting.

    PubMed

    Alger, R; Berkowitz, L L; Bergeron, B; Buskett, D

    1999-01-01

    Computers are a ubiquitous part of the ambulatory health care environment. Although stand-alone computers may be adequate for a small practice, networked computers can create much more powerful and cost-effective computerized systems. Local area networks allow groups of computers to share peripheral devices and computerized information within an office or cluster of offices. Wide area networks allow computers to securely share devices and information across a large geographical area. Either singly or in combination, these networks can be used to create robust systems to help physicians automate their practices and improve their access to important clinical information. In this article, we will examine common network configurations, explain how they function, and provide examples of real-world implementations of networking technology in health care. PMID:10662271

  6. An Interdisciplinary Bibliography for Computers and the Humanities Courses.

    ERIC Educational Resources Information Center

    Ehrlich, Heyward

    1991-01-01

    Presents an annotated bibliography of works related to the subject of computers and the humanities. Groups items into textbooks and overviews; introductions; human and computer languages; literary and linguistic analysis; artificial intelligence and robotics; social issue debates; computers' image in fiction; anthologies; writing and the…

  7. A topology for computer networks with good survivability characteristics and low transmission delays between node computers

    NASA Technical Reports Server (NTRS)

    Kelly, G. L.; Jiang, D. P.

    1984-01-01

    Various network topologies that have not previously appeared in the literature are developed; they yield minimum-diameter graphs for computer networks with connectivity four. The topologies presented have good survivability characteristics and give computer network designers more topologies that achieve the minimum diameter, resulting in small transmission delays.
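
    The two properties emphasized above, small diameter and connectivity four, are easy to check computationally for any candidate topology. A hypothetical example using networkx is shown below; the circulant graph is only an illustration, not one of the topologies developed in the report.

```python
# Check the two properties discussed above, diameter and node connectivity,
# for a candidate degree-4 topology. The circulant graph is illustrative only.

import networkx as nx

# 16 nodes, each connected to neighbours at offsets 1 and 4 -> degree 4.
G = nx.circulant_graph(16, [1, 4])

print("nodes            :", G.number_of_nodes())
print("degree of node 0 :", dict(G.degree())[0])
print("node connectivity:", nx.node_connectivity(G))   # survivability measure
print("diameter         :", nx.diameter(G))            # worst-case hop count
```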

  8. Efficiently modeling neural networks on massively parallel computers

    SciTech Connect

    Farber, R.M.

    1992-01-01

    Neural networks are a very useful tool for analyzing and modeling complex real world systems. Applying neural network simulations to real world problems generally involves large amounts of data and massive amounts of computation. To efficiently handle the computational requirements of large problems, we have implemented at Los Alamos a highly efficient neural network compiler for serial computers, vector computers, vector parallel computers, and fine grain SIMD computers such as the CM-2 connection machine. This paper will describe the mapping used by the compiler to implement feed-forward backpropagation neural networks for a SIMD architecture parallel computer. Thinking Machines Corporation has benchmarked our code at 1.3 billion interconnects per second (approximately 3 gigaflops) on a 64,000 processor CM-2 connection machine (Singer 1990). This mapping is applicable to other SIMD computers and can be implemented on computers such as the CM-5 connection machine. Our mapping has virtually no communications overhead with the exception of the communications required for a global summation across the processors. We can efficiently model very large neural networks which have many neurons and interconnects and our mapping can be extended to arbitrarily large networks by merging the memory space of separate processors with fast adjacent processor inter-processor communications. This paper will consider the simulation of only feed-forward neural networks, although this method is extendible to recurrent networks.
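
    The key point above is that the mapping needs cross-processor communication only for a global summation. The Python sketch below imitates that pattern with simulated "processors": each computes a partial gradient on its own data shard, and a single global sum combines them. The tiny single-layer network is an illustrative assumption, not the Los Alamos compiler's mapping.

```python
# Data-parallel pattern in which the only cross-"processor" step is a global
# summation of partial results. Processors are simulated as list entries.

import numpy as np

rng = np.random.default_rng(0)
n_proc, shard, n_in, n_out = 4, 8, 5, 3

W = rng.normal(size=(n_in, n_out))                    # shared weights
shards_x = [rng.normal(size=(shard, n_in)) for _ in range(n_proc)]
shards_y = [rng.normal(size=(shard, n_out)) for _ in range(n_proc)]

# Local phase: every processor computes a partial gradient on its shard only.
partial_grads = []
for x, y in zip(shards_x, shards_y):
    err = x @ W - y                                   # forward pass + error
    partial_grads.append(x.T @ err)                   # local gradient

# Global summation: the single communication step across processors.
grad = np.sum(partial_grads, axis=0)
W -= 0.01 * grad / (n_proc * shard)                   # one gradient-descent step
print("gradient norm after global sum:", np.linalg.norm(grad).round(3))
```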

  10. Efficiently modeling neural networks on massively parallel computers

    NASA Technical Reports Server (NTRS)

    Farber, Robert M.

    1993-01-01

    Neural networks are a very useful tool for analyzing and modeling complex real world systems. Applying neural network simulations to real world problems generally involves large amounts of data and massive amounts of computation. To efficiently handle the computational requirements of large problems, we have implemented at Los Alamos a highly efficient neural network compiler for serial computers, vector computers, vector parallel computers, and fine grain SIMD computers such as the CM-2 connection machine. This paper describes the mapping used by the compiler to implement feed-forward backpropagation neural networks for a SIMD (Single Instruction Multiple Data) architecture parallel computer. Thinking Machines Corporation has benchmarked our code at 1.3 billion interconnects per second (approximately 3 gigaflops) on a 64,000 processor CM-2 connection machine (Singer 1990). This mapping is applicable to other SIMD computers and can be implemented on MIMD computers such as the CM-5 connection machine. Our mapping has virtually no communications overhead with the exception of the communications required for a global summation across the processors (which has a sub-linear runtime growth on the order of O(log(number of processors)). We can efficiently model very large neural networks which have many neurons and interconnects and our mapping can extend to arbitrarily large networks (within memory limitations) by merging the memory space of separate processors with fast adjacent processor interprocessor communications. This paper will consider the simulation of only feed forward neural network although this method is extendable to recurrent networks.

  11. Evaluating the Effectiveness of Collaborative Computer-Intensive Projects in an Undergraduate Psychometrics Course

    ERIC Educational Resources Information Center

    Barchard, Kimberly A.; Pace, Larry A.

    2010-01-01

    Undergraduate psychometrics classes often use computer-intensive active learning projects. However, little research has examined active learning or computer-intensive projects in psychometrics courses. We describe two computer-intensive collaborative learning projects used to teach the design and evaluation of psychological tests. Course…

  12. Administrators' Perceptions of Community College Students' Computer Literacy Skills in Beginner Courses

    ERIC Educational Resources Information Center

    Ragin, Tracey B.

    2013-01-01

    Fundamental computer skills are vital in the current technology-driven society. The purpose of this study was to investigate the development needs of students at a rural community college in the Southeast who lacked the computer literacy skills required in a basic computer course. Guided by Greenwood's pragmatic approach as a reformative force in…

  13. Talking about Code: Integrating Pedagogical Code Reviews into Early Computing Courses

    ERIC Educational Resources Information Center

    Hundhausen, Christopher D.; Agrawal, Anukrati; Agarwal, Pawan

    2013-01-01

    Given the increasing importance of soft skills in the computing profession, there is good reason to provide students with more opportunities to learn and practice those skills in undergraduate computing courses. Toward that end, we have developed an active learning approach for computing education called the "Pedagogical Code Review"…

  14. An Experimental Analysis of Computer-Mediated Instruction and Student Attitudes in a Principles of Financial Accounting Course.

    ERIC Educational Resources Information Center

    Basile, Anthony; D'Aquila, Jill M.

    2002-01-01

    Accounting students received either traditional instruction (n=46) or used computer-mediated communication and WebCT course management software. There were no significant differences in attitudes about the course. However, computer users were more positive about course delivery and course management tools. (Contains 17 references.) (SK)

  15. Reaching Distance Students with Computer Network Technology (Part II).

    ERIC Educational Resources Information Center

    Distance Education Report, 1997

    1997-01-01

    This article is the second in a series on the typology of technology used in distance education courses at the Center for Distance Learning at the State University of New York's Empire State College. Discusses computer technology as the main delivery vehicle of the course guide, discussion, information resources, and assignments. Outlines learning…

  16. Do Computers Have a Place in College Reading Courses?

    ERIC Educational Resources Information Center

    Wepner, Shelley B.; And Others

    1990-01-01

    Evaluates the effectiveness of the Sack-Yourman Developmental Speed Reading Course software program designed to increase reading rate and comprehension. Finds the software program effective in allowing students to quickly move from "chore" operations and didactic sections to "real" reading. Offers specific recommendations for effective use of…

  17. An Investigation of Student Practices in Asynchronous Computer Conferencing Courses

    ERIC Educational Resources Information Center

    Peters, Vanessa L.; Hewitt, Jim

    2010-01-01

    This study investigated the online practices of students enrolled in graduate-level distance education courses. Using interviews and a questionnaire as data sources, the study sought to: (a) identify common practices that students adopt in asynchronous discussions, and (b) gain an understanding of why students adopt them. An analysis of the data…

  18. Time-course gene profiling and networks in demethylated retinoblastoma cell line

    PubMed Central

    Malusa, Federico; Taranta, Monia; Zaki, Nazar; Cinti, Caterina; Capobianco, Enrico

    2015-01-01

    Retinoblastoma, a very aggressive cancer of the developing retina, is initiated by the biallelic loss of the RB1 gene and progresses very quickly following RB1 inactivation. While its genome is stable, multiple pathways are deregulated, also epigenetically. After reviewing the main findings in relation to recently validated markers, we propose an integrative bioinformatics approach to add to the previous group new markers obtained from the analysis of a single cell line subjected to epigenetic treatment. In particular, differentially expressed genes are identified from time course microarray experiments on the WERI-RB1 cell line treated with 5-Aza-2′-deoxycytidine (decitabine; DAC). By inducing demethylation of CpG islands in promoter genes that are involved in biological processes, for instance apoptosis, we performed the following main integrative analysis steps: i) gene expression profiling at 48h, 72h and 96h after DAC treatment; ii) time differential gene co-expression networks; and iii) context-driven marker association (transcription factor regulated protein networks, master regulatory paths). The observed DAC-driven temporal profiles and regulatory connectivity patterns are obtained by the application of computational tools, with support from curated literature. It is worth emphasizing the capacity of networks to reconcile multi-type evidence, thus generating testable hypotheses made available by systems-scale predictive inference power. Despite our small experimental setting, such integrations point to valuable impacts of the epigenetic treatment on gene expression measurements, and we then validate the evidenced apoptotic effects. PMID:26143641
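
    Step ii) above, building time differential gene co-expression networks, can be illustrated with a minimal Python sketch: correlate each pair of expression profiles across the time points and keep pairs above a threshold as edges. The gene names, values, and threshold below are placeholders, not the WERI-RB1 measurements.

```python
# Build a toy gene co-expression network from time-course expression data:
# genes become nodes, and strongly correlated pairs become edges.

import numpy as np

rng = np.random.default_rng(1)
genes = [f"gene{i}" for i in range(6)]
expr = rng.normal(size=(len(genes), 3))     # rows: genes, cols: 48h, 72h, 96h

corr = np.corrcoef(expr)                    # pairwise correlation across time
threshold = 0.9

edges = [(genes[i], genes[j], round(float(corr[i, j]), 2))
         for i in range(len(genes)) for j in range(i + 1, len(genes))
         if abs(corr[i, j]) >= threshold]

print("co-expression edges (|r| >= 0.9):")
for g1, g2, r in edges:
    print(f"  {g1} -- {g2}  r={r}")
```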

  19. The Network Computer: Is It Right for Education?

    ERIC Educational Resources Information Center

    Galbreath, Jeremy

    1999-01-01

    Examines the network computer, originally conceived as an alternative device to personal computers to access the Internet and World Wide Web, from a technology perspective and looks at potential uses in education. Describes network architecture; discusses uses for library research, educational technology laboratories, and distance education.…

  20. Computer-Based Semantic Network in Molecular Biology: A Demonstration.

    ERIC Educational Resources Information Center

    Callman, Joshua L.; And Others

    This paper analyzes the hardware and software features that would be desirable in a computer-based semantic network system for representing biology knowledge. It then describes in detail a prototype network of molecular biology knowledge that has been developed using Filevision software and a Macintosh computer. The prototype contains about 100…

  1. Social Studies: Application Units. Course II, Teachers. Computer-Oriented Curriculum. REACT (Relevant Educational Applications of Computer Technology).

    ERIC Educational Resources Information Center

    Tecnica Education Corp., San Carlos, CA.

    This book is one of a series in Course II of the Relevant Educational Applications of Computer Technology (REACT) Project. It is designed to point out to teachers two of the major applications of computers in the social sciences: simulation and data analysis. The first section contains a variety of simulation units organized under the following…

  2. Network Patch Cables Demystified: A Super Activity for Computer Networking Technology

    ERIC Educational Resources Information Center

    Brown, Douglas L.

    2004-01-01

    This article de-mystifies network patch cable secrets so that people can connect their computers and transfer those pesky files--without screaming at the cables. It describes a network cabling activity that can offer students a great hands-on opportunity for working with the tools, techniques, and media used in computer networking. Since the…

  3. SNAP: A computer program for generating symbolic network functions

    NASA Technical Reports Server (NTRS)

    Lin, P. M.; Alderson, G. E.

    1970-01-01

    The computer program SNAP (symbolic network analysis program) generates symbolic network functions for networks containing R, L, and C type elements and all four types of controlled sources. The program is efficient with respect to program storage and execution time. A discussion of the basic algorithms is presented, together with user's and programmer's guides.
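
    For readers unfamiliar with symbolic network functions, the following SymPy sketch derives the transfer function of a one-node RC network by nodal analysis. SNAP handled general R, L, C networks and all four types of controlled sources; this is only a minimal example of the kind of symbolic output such a program produces.

```python
# Symbolic network function for a series-R, shunt-C network, obtained by
# writing KCL at the output node and solving symbolically with SymPy.

import sympy as sp

R, C, s, Vin, Vout = sp.symbols("R C s V_in V_out")

# KCL at the output node: current through R equals current into C.
node_eq = sp.Eq((Vin - Vout) / R, Vout * s * C)

H = sp.simplify(sp.solve(node_eq, Vout)[0] / Vin)
print("V_out / V_in =", H)        # -> 1/(C*R*s + 1)
```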

  4. A Computational-Modeling Course for Undergraduate Students in Chemical Technology

    ERIC Educational Resources Information Center

    Hessley, Rita K.

    2004-01-01

    A PC-based computational-modeling course for undergraduate chemistry students helps them better understand molecular modeling. Such a course would be able to accommodate a wider array of topics and a greater depth of theory as modeling is increasingly incorporated into the chemistry curriculum.

  5. Evaluation of the Effectiveness of a Web-Based Learning Design for Adult Computer Science Courses

    ERIC Educational Resources Information Center

    Antonis, Konstantinos; Daradoumis, Thanasis; Papadakis, Spyros; Simos, Christos

    2011-01-01

    This paper reports on work undertaken within a pilot study concerned with the design, development, and evaluation of online computer science training courses. Drawing on recent developments in e-learning technology, these courses were structured around the principles of a learner-oriented approach for use with adult learners. The paper describes a…

  6. A Matched-Pairs Study of Interactive Computer Laboratory Activities in a Liberal Arts Math Course

    ERIC Educational Resources Information Center

    Butler, Frederick; Butler, Melanie

    2011-01-01

    This paper details the culmination of a large, multi-year study on the effects of an interactive computer laboratory component in a large liberal arts math course at a state university. After several semesters of piloting these laboratory activities in the course, one of two sections, taught by the same senior instructor, was randomly selected to…

  7. Affective Learning in Online Multimedia and Lecture Versions of an Introductory Computing Course

    ERIC Educational Resources Information Center

    Moneta, Giovanni B.; Kekkonen-Moneta, Synnove S.

    2007-01-01

    This study evaluated students' affective learning in an introductory computing course that was taught in Hong Kong once in a lecture format and twice in a rich interactive multimedia online format to 414 college students in all. A simplified experience sampling method was used to assess affective learning at the midterm and end of each course in…

  8. Fostering Critical Reflection in a Computer-Based, Asynchronously Delivered Diversity Training Course

    ERIC Educational Resources Information Center

    Givhan, Shawn T.

    2013-01-01

    This dissertation study chronicles the creation of a computer-based, asynchronously delivered diversity training course for a state agency. The course format enabled efficient delivery of a mandatory curriculum to the Massachusetts Department of State Police workforce. However, the asynchronous format posed a challenge to achieving the learning…

  9. Learning Computing Topics in Undergraduate Information Systems Courses: Managing Perceived Difficulty

    ERIC Educational Resources Information Center

    Wall, Jeffrey D.; Knapp, Janice

    2014-01-01

    Learning technical computing skills is increasingly important in our technology-driven society. However, learning technical skills in information systems (IS) courses can be difficult. More than 20 percent of students in some technical courses may drop out or fail. Unfortunately, little is known about students' perceptions of the difficulty of…

  10. Educational Impact of Digital Visualization Tools on Digital Character Production Computer Science Courses

    ERIC Educational Resources Information Center

    van Langeveld, Mark Christensen

    2009-01-01

    Digital character production courses have traditionally been taught in art departments. The digital character production course at the University of Utah is centered between art and engineering, drawing uniformly from both disciplines. Its design has evolved to include a synergy of computer science, functional art and human anatomy. It gives students an…

  11. Formal Methods, Design, and Collaborative Learning in the First Computer Science Course.

    ERIC Educational Resources Information Center

    Troeger, Douglas R.

    1995-01-01

    A new introductory computer science course at City College of New York builds on a foundation of logic to teach programming based on a "design idea," a strong departure from conventional programming courses. Reduced attrition and increased student and teacher enthusiasm have resulted. (MSE)

  12. Toward a Singleton Undergraduate Computer Graphics Course in Small and Medium-Sized Colleges

    ERIC Educational Resources Information Center

    Shesh, Amit

    2013-01-01

    This article discusses the evolution of a single undergraduate computer graphics course over five semesters, driven by a primary question: if one could offer only one undergraduate course in graphics, what would it include? This constraint is relevant to many small and medium-sized colleges that lack resources, adequate expertise, and enrollment…

  13. Instructional Strategies and Tactics for the Design of Introductory Computer Programming Courses in High School.

    ERIC Educational Resources Information Center

    Van Merrienboer, Jeroen J. G.; Krammer, Hein P. M.

    1987-01-01

    Compares the expert, spiral, and reading approaches to instructional strategies for introductory computer programming courses. Based on ACT (Adaptive Control of Thought) theory and relevant research, the differences between declarative and procedural instruction are identified, and six tactics used to design courses and evaluate strategies are…

  14. Team Spirit: A Computer Math Course for Parents and Gifted Children.

    ERIC Educational Resources Information Center

    Kulm, Gerald

    1984-01-01

    A college credit course in computer math for 12 gifted secondary students and their parents focused on problem-solving strategies and encouraged teamwork and communication among parents and children. (CL)

  15. Happenstance and compromise: a gendered analysis of students' computing degree course selection

    NASA Astrophysics Data System (ADS)

    Lang, Catherine

    2010-12-01

    The number of students choosing to study computing at university continues to decline this century, with an even sharper decline among female students. This article presents the results of a series of interviews with university students studying computing courses in Australia that uncovered the influence of happenstance and compromise on course choice. This investigation provides insight into the factors contributing to the continued downturn in student diversity in computing bachelor degree courses. Many of the females interviewed made decisions based on happenstance, many of the males interviewed had chosen computing as a compromise course, and for both genders family played a large part in the decision-making. The major implication of this investigation is the finding that students of both genders appear to be socialised away from this discipline, which is perceived as a support or insurance skill, not a career in itself, by all but the most technically oriented (usually male) students.

  16. Investigating a New Way To Teach Law: A Computer-based Commercial Law Course.

    ERIC Educational Resources Information Center

    Lloyd, Robert M.

    2000-01-01

    Describes the successful use of an interactive, computer-based format supplemented by online chats to provide a two-credit-hour commercial law course at the University of Tennessee College of Law. (EV)

  17. Computers and the Humanities Courses: Philosophical Bases and Approach.

    ERIC Educational Resources Information Center

    Ide, Nancy M.

    1987-01-01

    Discusses a Vassar College workshop and the debate it generated over the depth and breadth of computer knowledge needed by humanities students. Describes two positions: the "Holistic View," which emphasizes the understanding of the formal methods of computer implementation; and the "Expert Users View," which sees the humanist as a "user" of…

  18. Apple IIe Computers and Appleworks Training Mini Course Materials.

    ERIC Educational Resources Information Center

    Schlenker, Richard M.

    The instructional materials included in this document are designed to introduce students to the Apple IIe computer and to the word processing and database portions of the AppleWorks program. The materials are intended for small groups of students, each of whom has use of a computer during class and for short periods between classes. The course…

  19. Learning Motivation in E-Learning Facilitated Computer Programming Courses

    ERIC Educational Resources Information Center

    Law, Kris M. Y.; Lee, Victor C. S.; Yu, Y. T.

    2010-01-01

    Computer programming skills constitute one of the core competencies that graduates from many disciplines, such as engineering and computer science, are expected to possess. Developing good programming skills typically requires students to do a lot of practice, which cannot be sustained unless they are adequately motivated. This paper reports a…

  20. Predictors of Enrollment in High School Computer Courses.

    ERIC Educational Resources Information Center

    Campbell, N. Jo; Perry, Katye M.

    Factors affecting the motivation of high school students to learn to use computers were examined in this study. The subjects were 160 students enrolled in a large city high school, 89 females and 71 males who represented five ethnic groups--White, Black, Hispanic, Asian, and American Indian. The majority of subjects had prior computer coursework…

  1. Course Control of Underactuated Ship Based on Nonlinear Robust Neural Network Backstepping Method

    PubMed Central

    Yuan, Junjia; Meng, Hao; Zhu, Qidan; Zhou, Jiajia

    2016-01-01

    The problem of course control for underactuated surface ships is addressed in this paper. First, neural networks are adopted to determine the parameters of the unknown part of the ideal virtual backstepping control, and the weight values of the neural network are updated by an adaptive technique. Uniform stability of the convergence of the course tracking errors is then proven through Lyapunov stability theory. Finally, simulation experiments are carried out to illustrate the effectiveness of the proposed control method. PMID:27293422
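
    As a rough illustration of course tracking only (without the neural-network approximation and adaptive weight updates that are the paper's contribution), the Python sketch below applies a plain backstepping law to a first-order Nomoto ship model. The gains and model constants are arbitrary assumptions.

```python
# Toy course-keeping loop: first-order Nomoto model with a plain backstepping
# control law. Neural-network and adaptive terms from the paper are omitted.

import math

K, T = 0.5, 10.0          # Nomoto gain and time constant (illustrative)
k1, k2 = 0.5, 1.0         # backstepping gains
psi_d = math.radians(30)  # desired heading: 30 degrees

psi, r, dt = 0.0, 0.0, 0.05
for step in range(4000):
    e1 = psi - psi_d                  # heading error
    alpha = -k1 * e1                  # virtual control for yaw rate r
    e2 = r - alpha
    alpha_dot = -k1 * r
    # Rudder command that makes the (e1, e2) dynamics asymptotically stable.
    delta = (T * (alpha_dot - e1 - k2 * e2) + r) / K

    # Nomoto model:  psi_dot = r,  r_dot = (K*delta - r)/T
    psi += dt * r
    r += dt * (K * delta - r) / T

print(f"final heading: {math.degrees(psi):.2f} deg (target 30.00 deg)")
```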

  3. Interaction, Critical Thinking, and Social Network Analysis (SNA) in Online Courses

    ERIC Educational Resources Information Center

    Thormann, Joan; Gable, Samuel; Fidalgo, Patricia Seferlis; Blakeslee, George

    2013-01-01

    This study tried to ascertain a possible relationship between the number of student moderators (1, 2, and 3), online interactions, and critical thinking of K-12 educators enrolled in an online course that was taught from a constructivist approach. The course topic was use of technology in special education. Social network analysis (SNA) and…
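
    The social network analysis referred to above typically starts from a "who replied to whom" graph of the course discussion. A hypothetical networkx example is sketched below; the participants and edges are invented, and the centrality measures shown are only two of many that SNA studies use.

```python
# Build a directed reply graph for a discussion forum and compute simple
# centrality measures. Participants and edges are made up for illustration.

import networkx as nx

replies = [("ana", "ben"), ("ben", "ana"), ("cam", "ana"),
           ("dee", "ana"), ("dee", "ben"), ("ben", "cam")]

G = nx.DiGraph(replies)                      # directed "who replied to whom" graph

centrality = nx.degree_centrality(G)         # normalized in+out degree
print("replies received :", dict(G.in_degree()))
print("degree centrality:", {n: round(c, 2) for n, c in centrality.items()})
print("most central participant:", max(centrality, key=centrality.get))
```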

  4. A Computer Network for Social Scientists.

    ERIC Educational Resources Information Center

    Gerber, Barry

    1989-01-01

    Describes a microcomputer-based network developed at the University of California Los Angeles to support education in the social sciences. Topics discussed include technological, managerial, and academic considerations of university networking; the use of the network in teaching macroeconomics, social demographics, and symbolic logic; and possible…

  5. LaRC local area networks to support distributed computing

    NASA Technical Reports Server (NTRS)

    Riddle, E. P.

    1984-01-01

    The Langley Research Center's (LaRC) Local Area Network (LAN) effort is discussed. LaRC initiated the development of a LAN to support a growing distributed computing environment at the Center. The purpose of the network is to provide an improved capability (over interactive and RJE terminal access) for sharing multivendor computer resources. Specifically, the network will provide a data highway for the transfer of files between mainframe computers, minicomputers, work stations, and personal computers. An important influence on the overall network design was the vital need of LaRC researchers to efficiently utilize the large CDC mainframe computers in the central scientific computing facility. Although there was a steady migration from a centralized to a distributed computing environment at LaRC in recent years, the workload on the central resources increased. Major emphasis in the network design was on communication with the central resources within the distributed environment. The network to be implemented will allow researchers to utilize the central resources, distributed minicomputers, work stations, and personal computers to obtain the proper level of computing power to efficiently perform their jobs.

  6. Computer analysis of general linear networks using digraphs.

    NASA Technical Reports Server (NTRS)

    Mcclenahan, J. O.; Chan, S.-P.

    1972-01-01

    Investigation of the application of digraphs in analyzing general electronic networks, and development of a computer program based on a particular digraph method developed by Chen. The Chen digraph method is a topological method for solution of networks and serves as a shortcut when hand calculations are required. The advantage offered by this method of analysis is that the results are in symbolic form. It is limited, however, by the size of network that may be handled. Usually hand calculations become too tedious for networks larger than about five nodes, depending on how many elements the network contains. Direct determinant expansion for a five-node network is a very tedious process also.

  7. A Novel College Network Resource Management Method using Cloud Computing

    NASA Astrophysics Data System (ADS)

    Lin, Chen

    At present, information construction at colleges consists mainly of campus networks and management information systems, and many problems arise during the informatization process. Cloud computing is a development of distributed processing, parallel processing and grid computing: data is stored in the cloud, software and services are placed in the cloud and built on top of various standards and protocols, and they can be accessed through all kinds of devices. This article introduces cloud computing and its functions, analyzes the existing problems of college network resource management, and applies cloud computing technology and methods to the construction of a college information sharing platform.

  8. Network selection, Information filtering and Scalable computation

    NASA Astrophysics Data System (ADS)

    Ye, Changqing

    This dissertation explores two application scenarios of the sparsity pursuit method on large scale data sets. The first scenario is classification and regression in analyzing high dimensional structured data, where predictors correspond to nodes of a given directed graph. This arises in, for instance, identification of disease genes for Parkinson's disease from a network of candidate genes. In such a situation, the directed graph describes dependencies among the genes, where directions of edges represent certain causal effects. Key to high-dimensional structured classification and regression is how to utilize dependencies among predictors as specified by directions of the graph. In this dissertation, we develop a novel method that fully takes into account such dependencies formulated through certain nonlinear constraints. We apply the proposed method to two applications, feature selection in large margin binary classification and in linear regression. We implement the proposed method through difference convex programming for the cost function and constraints. Finally, theoretical and numerical analyses suggest that the proposed method achieves the desired objectives. An application to disease gene identification is presented. The second application scenario is personalized information filtering, which extracts the information specifically relevant to a user, predicting his/her preference over a large number of items, based on the opinions of users who think alike or its content. This problem is cast into the framework of regression and classification, where we introduce novel partial latent models to integrate additional user-specific and content-specific predictors, for higher predictive accuracy. In particular, we factorize a user-over-item preference matrix into a product of two matrices, each representing a user's preference and an item preference by users. Then we propose a likelihood method to seek a sparsest latent factorization, from a class of over…
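
    The last idea above, a sparse latent factorization of a user-over-item preference matrix, can be imitated generically with an L1-penalized low-rank factorization. The Python sketch below uses plain proximal gradient (soft-thresholding) steps on a toy matrix; it is a simplified stand-in under assumed parameters, not the dissertation's partial latent models or likelihood method.

```python
# Generic sparse low-rank factorization M ~ U @ V.T with an L1 penalty on the
# factors, fitted with plain proximal gradient (soft-thresholding) steps.

import numpy as np

def soft(x, t):
    """Soft-thresholding operator, the proximal map of the L1 penalty."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

rng = np.random.default_rng(0)
M = rng.integers(1, 6, size=(8, 10)).astype(float)    # toy preference matrix

rank, lam, step = 3, 0.1, 0.01
U = rng.normal(scale=0.1, size=(M.shape[0], rank))
V = rng.normal(scale=0.1, size=(M.shape[1], rank))

for it in range(2000):
    resid = M - U @ V.T
    U_new = soft(U + step * resid @ V, step * lam)     # prox-gradient step on U
    V_new = soft(V + step * resid.T @ U, step * lam)   # prox-gradient step on V
    U, V = U_new, V_new

print("reconstruction error:", round(float(np.linalg.norm(M - U @ V.T)), 3))
print("fraction of near-zero factor entries:",
      round(float(np.mean(np.abs(np.concatenate([U.ravel(), V.ravel()])) < 1e-3)), 2))
```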

  9. A programming course in bioinformatics for computer and information science students.

    PubMed

    Altman, R B; Koza, J

    1996-01-01

    We have created a course entitled "Representations and Algorithms for Computational Molecular Biology" with three specific goals in mind. First, we want to provide a technical introduction for computer science and medical information science students to the challenges of computing with molecular biology data, particularly the advantages of having easy access to real-world data sets. Second, we want to equip the students with the skills required of productive research assistants in molecular biology computing research projects. Finally, we want to provide a showcase for local investigators to describe their work in the context of a course that provides adequate background information. In order to achieve these goals, we have created a programming course, in which three major projects and six smaller assignments are assigned during the quarter. We stress fundamental representations and algorithms during the first part of the course in lectures given by the core faculty, and then have more focused lectures in which faculty research interests are highlighted. The course stressed issues of structural molecular biology, in order to better motivate the critical issues in sequence analysis. The culmination of the course was a challenge to the students to use a version of protein threading to predict which members of a set of unknown sequences were globins. The course was well received, and has been made a core requirement in the Medical Information Sciences program.

  10. Evaluation of the Manitoba High School Computer Network.

    ERIC Educational Resources Information Center

    Haughey, D. G.; And Others

    Designed to provide information for decision making when the current 5-year contract with the supplier of computing services to the Manitoba High School Computer Network expires in 1982, this study included both an assessment of the existing situation and the identification of alternative options for the provision of computer services to the…

  11. LAN Configuration and Analysis: Projects for the Data Communications and Networking Course

    ERIC Educational Resources Information Center

    Chen, Fang; Brabston, Mary

    2011-01-01

    We implemented two local area network (LAN) projects in our introductory data communications and networking course. The first project required students to develop a LAN from scratch for a small imaginary organization. The second project required student groups to analyze a LAN for a real world small organization. By allowing students to apply what…

  12. Who Do You Know? Demonstrating Networking in a Careers in Psychology Course

    ERIC Educational Resources Information Center

    Segrist, Dan J.; Pawlow, Laura A.

    2009-01-01

    This study examined the effectiveness of a classroom activity designed to visually depict the ability of networking to increase potential job contacts. We implemented the activity in two sections of a Careers in Psychology course. Use of the activity resulted in significant increases in the number of potential networking contacts generated by…

  13. HeNCE: A Heterogeneous Network Computing Environment

    DOE PAGES Beta

    Beguelin, Adam; Dongarra, Jack J.; Geist, George Al; Manchek, Robert; Moore, Keith

    1994-01-01

    Network computing seeks to utilize the aggregate resources of many networked computers to solve a single problem. In so doing it is often possible to obtain supercomputer performance from an inexpensive local area network. The drawback is that network computing is complicated and error prone when done by hand, especially if the computers have different operating systems and data formats and are thus heterogeneous. The heterogeneous network computing environment (HeNCE) is an integrated graphical environment for creating and running parallel programs over a heterogeneous collection of computers. It is built on a lower level package called parallel virtual machine (PVM). The HeNCE philosophy of parallel programming is to have the programmer graphically specify the parallelism of a computation and to automate, as much as possible, the tasks of writing, compiling, executing, debugging, and tracing the network computation. Key to HeNCE is a graphical language based on directed graphs that describe the parallelism and data dependencies of an application. Nodes in the graphs represent conventional Fortran or C subroutines and the arcs represent data and control flow. This article describes the present state of HeNCE, its capabilities, limitations, and areas of future research.
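
    The core HeNCE abstraction described above, a directed graph whose nodes are subroutines and whose arcs carry data and control flow, can be mimicked in miniature with Python's standard-library topological sorter. Everything below runs locally in one process; HeNCE itself distributed the nodes over PVM.

```python
# Tiny dependency-graph executor: nodes are ordinary functions, arcs are data
# dependencies, and each node runs once all of its inputs are available.

from graphlib import TopologicalSorter

def load():        return list(range(10))
def square(xs):    return [x * x for x in xs]
def double(xs):    return [2 * x for x in xs]
def merge(a, b):   return sum(a) + sum(b)

# node -> set of nodes it depends on (the arcs of the graph)
graph = {"square": {"load"}, "double": {"load"}, "merge": {"double", "square"}}
funcs = {"load": load, "square": square, "double": double, "merge": merge}

results = {}
for node in TopologicalSorter(graph).static_order():
    deps = sorted(graph.get(node, set()))              # inputs in a fixed order
    results[node] = funcs[node](*(results[d] for d in deps))

print("merge result:", results["merge"])
```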

  14. Exploiting stoichiometric redundancies for computational efficiency and network reduction.

    PubMed

    Ingalls, Brian P; Bembenek, Eric

    2015-01-01

    Analysis of metabolic networks typically begins with construction of the stoichiometry matrix, which characterizes the network topology. This matrix provides, via the balance equation, a description of the potential steady-state flow distribution. This paper begins with the observation that the balance equation depends only on the structure of linear redundancies in the network, and so can be stated in a succinct manner, leading to computational efficiencies in steady-state analysis. This alternative description of steady-state behaviour is then used to provide a novel method for network reduction, which complements existing algorithms for describing intracellular networks in terms of input-output macro-reactions (to facilitate bioprocess optimization and control). Finally, it is demonstrated that this novel reduction method can be used to address elementary mode analysis of large networks: the modes supported by a reduced network can capture the input-output modes of a metabolic module with significantly reduced computational effort.
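
    The starting point described above is the balance equation S v = 0 over the stoichiometry matrix S. The short Python sketch below builds S for a toy linear pathway and extracts the admissible steady-state flux modes from its null space; the network is illustrative and unrelated to the reduction algorithm proposed in the paper.

```python
# Steady-state flux analysis of a toy pathway (-> A -> B -> C ->): the null
# space of the stoichiometry matrix S gives all fluxes v with S @ v = 0.

import numpy as np
from scipy.linalg import null_space

#             v0   v1   v2   v3        (reactions: ->A, A->B, B->C, C->)
S = np.array([[ 1, -1,  0,  0],        # metabolite A
              [ 0,  1, -1,  0],        # metabolite B
              [ 0,  0,  1, -1]])       # metabolite C

basis = null_space(S)                   # columns span {v : S @ v = 0}
print("null-space dimension:", basis.shape[1])
print("steady-state flux mode (normalized):")
print(np.round(basis[:, 0] / basis[0, 0], 3))
```

    For this pathway the null space is one-dimensional and the single mode carries equal flux through every reaction, which is the expected steady state of a linear chain.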

  15. Using high-performance networks to enable computational aerosciences applications

    NASA Technical Reports Server (NTRS)

    Johnson, Marjory J.

    1992-01-01

    One component of the U.S. Federal High Performance Computing and Communications Program (HPCCP) is the establishment of a gigabit network to provide a communications infrastructure for researchers across the nation. This gigabit network will provide new services and capabilities, in addition to increased bandwidth, to enable future applications. An understanding of these applications is necessary to guide the development of the gigabit network and other high-performance networks of the future. In this paper we focus on computational aerosciences applications run remotely using the Numerical Aerodynamic Simulation (NAS) facility located at NASA Ames Research Center. We characterize these applications in terms of network-related parameters and relate user experiences that reveal limitations imposed by the current wide-area networking infrastructure. Then we investigate how the development of a nationwide gigabit network would enable users of the NAS facility to work in new, more productive ways.

  17. Exploiting stoichiometric redundancies for computational efficiency and network reduction

    PubMed Central

    Ingalls, Brian P.; Bembenek, Eric

    2015-01-01

    Analysis of metabolic networks typically begins with construction of the stoichiometry matrix, which characterizes the network topology. This matrix provides, via the balance equation, a description of the potential steady-state flow distribution. This paper begins with the observation that the balance equation depends only on the structure of linear redundancies in the network, and so can be stated in a succinct manner, leading to computational efficiencies in steady-state analysis. This alternative description of steady-state behaviour is then used to provide a novel method for network reduction, which complements existing algorithms for describing intracellular networks in terms of input-output macro-reactions (to facilitate bioprocess optimization and control). Finally, it is demonstrated that this novel reduction method can be used to address elementary mode analysis of large networks: the modes supported by a reduced network can capture the input-output modes of a metabolic module with significantly reduced computational effort. PMID:25547516

  18. High-performance neural networks. [Neural computers

    SciTech Connect

    Dress, W.B.

    1987-06-01

    The new Forth hardware architectures offer an intermediate solution to high-performance neural networks while the theory and programming details of neural networks for synthetic intelligence are developed. This approach has been used successfully to determine the parameters and run the resulting network for a synthetic insect consisting of a 200-node "brain" with 1760 interconnections. Both the insect's environment and its sensor input have thus far been simulated. However, the frequency-coded nature of the Browning network allows easy replacement of the simulated sensors by real-world counterparts.

  19. A Successful Course of Study in Computer Programming

    ERIC Educational Resources Information Center

    Seeger, David H.

    1977-01-01

    Three keys to the successful development of the program of the computer programming department of the Technical Institute of Oklahoma State University are discussed: Community involvement, faculty/administration commitment to the basic principles of technical career education, and availability of appropriate equipment for student use. (HD)

  20. Landuse: A Computer Program for Laboratory Use in Economic Geography Courses, Technical Paper No. 8.

    ERIC Educational Resources Information Center

    Marble, Duane F.; Anderson, Bruce M.

    This technical report describes a digital computer program on the spatial structure of agricultural production and how it can be used in economic geography courses. Chapters one through four, respectively, (1) examine the use of digital computers in the teaching of college geography, (2) analyze the von Thunen theory which postulates laws that…

  1. Pedagogical Benefits of the Computer in the Foreign Language Business Course.

    ERIC Educational Resources Information Center

    Cohen, Howard

    1987-01-01

    It is beneficial to integrate the teaching of computer use into the foreign language business course through such activities as simulating business and commercial situations. Students will not only learn computer terminology in the foreign language but also expand their practical working vocabulary. (CB)

  2. Evaluating the Effectiveness of Interactive Computer Tutorials for an Undergraduate Mathematical Literacy Course

    ERIC Educational Resources Information Center

    Frith, Vera; Jaftha, Jacob; Prince, Robert

    2004-01-01

    This article describes a study of learning when students used interactive spreadsheet-based computer tutorials in a mathematical literacy course. It foregrounds theories relating to the role of computer technology (and specifically spreadsheets) as a mediator for learning of mathematics. It outlines the application of quantitative methods…

  3. Design and Delivery of Multiple Server-Side Computer Languages Course

    ERIC Educational Resources Information Center

    Wang, Shouhong; Wang, Hai

    2011-01-01

    Given the emergence of service-oriented architecture, IS students need to be knowledgeable of multiple server-side computer programming languages to be able to meet the needs of the job market. This paper outlines the pedagogy of an innovative course of multiple server-side computer languages for the undergraduate IS majors. The paper discusses…

  4. Happenstance and Compromise: A Gendered Analysis of Students' Computing Degree Course Selection

    ERIC Educational Resources Information Center

    Lang, Catherine

    2010-01-01

    The number of students choosing to study computing at university continues to decline this century, with an even sharper decline in female students. This article presents the results of a series of interviews with university students studying computing courses in Australia that uncovered the influence of happenstance and compromise on course…

  5. Relationships among Learning Styles and Motivation with Computer-Aided Instruction in an Agronomy Course

    ERIC Educational Resources Information Center

    McAndrews, Gina M.; Mullen, Russell E.; Chadwick, Scott A.

    2005-01-01

    Multi-media learning tools were developed to enhance student learning for an introductory agronomy course at Iowa State University. During fall 2002, the new interactive computer program, called Computer Interactive Multimedia Program for Learning Enhancement (CIMPLE) was incorporated into the teaching, learning, and assessment processes of the…

  6. Formative Questioning in Computer Learning Environments: A Course for Pre-Service Mathematics Teachers

    ERIC Educational Resources Information Center

    Akkoç, Hatice

    2015-01-01

    This paper focuses on a specific aspect of formative assessment, namely questioning. Given that computers have gained widespread use in learning and teaching, specific attention should be made when organizing formative assessment in computer learning environments (CLEs). A course including various workshops was designed to develop knowledge and…

  7. A Novel Use of Computer Simulation in an Applied Pharmacokinetics Course.

    ERIC Educational Resources Information Center

    Sullivan, Timothy J.

    1982-01-01

    The use of a package of interactive computer programs designed to simulate pharmacokinetic monitoring of drug therapy in a required undergraduate applied pharmacokinetics course is described. Students were assigned the problem of maintaining therapeutic drug concentrations in a computer generated "patient" as an adjunct to classroom instruction.…

  8. The Use of a Computer Algebra System in Capstone Mathematics Courses for Undergraduate Mathematics Majors.

    ERIC Educational Resources Information Center

    Harris, Gary A.

    2000-01-01

    Discusses the use of a computer algebra system in a capstone mathematics course for undergraduate mathematics majors preparing to teach secondary school mathematics. Provides sample exercises intended to demonstrate how the power of a computer algebra system such as MAPLE can contribute to desired outcomes including reinforcing and strengthening…

  9. Computer, Video, and Rapid-Cycling Plant Projects in an Undergraduate Plant Breeding Course.

    ERIC Educational Resources Information Center

    Michaels, T. E.

    1993-01-01

    Studies the perceived effectiveness of four student projects involving videotape production, computer conferencing, microcomputer simulation, and rapid-cycling Brassica breeding for undergraduate plant breeding students in two course offerings in consecutive years. Linking of the computer conferencing and video projects improved the rating of the…

  10. Causal Attributions of Success and Failure Made by Undergraduate Students in an Introductory-Level Computer Programming Course

    ERIC Educational Resources Information Center

    Hawi, N.

    2010-01-01

    The purpose of this research is to identify the causal attributions of business computing students in an introductory computer programming course, in the computer science department at Notre Dame University, Louaize. Forty-five male and female undergraduates who completed the computer programming course that extended for a 13-week semester…

  11. Wireless Networks: New Meaning to Ubiquitous Computing.

    ERIC Educational Resources Information Center

    Drew, Wilfred, Jr.

    2003-01-01

    Discusses the use of wireless technology in academic libraries. Topics include wireless networks; standards (IEEE 802.11); wired versus wireless; why libraries implement wireless technology; wireless local area networks (WLANs); WLAN security; examples of wireless use at Indiana State University and Morrisville College (New York); and useful…

  12. Syntactic Computations in the Language Network: Characterizing Dynamic Network Properties Using Representational Similarity Analysis

    PubMed Central

    Tyler, Lorraine K.; Cheung, Teresa P. L.; Devereux, Barry J.; Clarke, Alex

    2013-01-01

    The core human capacity of syntactic analysis involves a left hemisphere network involving left inferior frontal gyrus (LIFG) and posterior middle temporal gyrus (LMTG) and the anatomical connections between them. Here we use magnetoencephalography (MEG) to determine the spatio-temporal properties of syntactic computations in this network. Listeners heard spoken sentences containing a local syntactic ambiguity (e.g., “… landing planes …”), at the offset of which they heard a disambiguating verb and decided whether it was an acceptable/unacceptable continuation of the sentence. We charted the time-course of processing and resolving syntactic ambiguity by measuring MEG responses from the onset of each word in the ambiguous phrase and the disambiguating word. We used representational similarity analysis (RSA) to characterize syntactic information represented in the LIFG and left posterior middle temporal gyrus (LpMTG) over time and to investigate their relationship to each other. Testing a variety of lexico-syntactic and ambiguity models against the MEG data, our results suggest early lexico-syntactic responses in the LpMTG and later effects of ambiguity in the LIFG, pointing to a clear differentiation in the functional roles of these two regions. Our results suggest the LpMTG represents and transmits lexical information to the LIFG, which responds to and resolves the ambiguity. PMID:23730293
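
    Illustrative sketch (a generic RSA step, not the study's models or data): representational similarity analysis compares a dissimilarity matrix computed from neural patterns with one computed from model features; the arrays below are random placeholders.

      import numpy as np
      from scipy.spatial.distance import pdist
      from scipy.stats import spearmanr

      rng = np.random.default_rng(3)
      n_items = 20
      neural_patterns = rng.normal(size=(n_items, 50))   # items x sensor/time features
      model_features = rng.normal(size=(n_items, 10))    # items x model dimensions

      # Condensed representational dissimilarity matrices (RDMs) for data and model.
      data_rdm = pdist(neural_patterns, metric="correlation")
      model_rdm = pdist(model_features, metric="correlation")

      # Rank-correlate the two RDMs to score how well the model explains the data.
      rho, p = spearmanr(data_rdm, model_rdm)
      print(f"model-data RDM correlation: rho={rho:.3f}, p={p:.3f}")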

  13. CFD Optimization on Network-Based Parallel Computer System

    NASA Technical Reports Server (NTRS)

    Cheung, Samson H.; VanDalsem, William (Technical Monitor)

    1994-01-01

    Combining multiple engineering workstations into a network-based heterogeneous parallel computer allows application of aerodynamic optimization with advanced computational fluid dynamics codes, which is computationally expensive on a mainframe supercomputer. This paper introduces a nonlinear quasi-Newton optimizer designed for this network-based heterogeneous parallel computer using software called Parallel Virtual Machine. This paper will introduce the methodology behind coupling a Parabolized Navier-Stokes flow solver to the nonlinear optimizer. This parallel optimization package has been applied to reduce the wave drag of a body of revolution and a wing/body configuration with results of 5% to 6% drag reduction.
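
    Illustrative sketch (not the PVM-distributed solver coupling described above): a quasi-Newton driver wrapped around a stand-in objective, with a cheap analytic placeholder taking the place of the expensive flow-solver evaluation.

      import numpy as np
      from scipy.optimize import minimize

      def wave_drag_surrogate(shape_params):
          """Placeholder for an expensive CFD evaluation of wave drag."""
          target = np.array([0.3, 0.7, 0.5])      # arbitrary 'optimal' design
          return float(np.sum((shape_params - target) ** 2) + 0.1)

      x0 = np.zeros(3)                            # initial design variables
      result = minimize(wave_drag_surrogate, x0, method="BFGS")
      print("optimized design:", result.x, "surrogate drag:", round(result.fun, 4))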

  14. Parallel CFD design on network-based computer

    NASA Technical Reports Server (NTRS)

    Cheung, Samson

    1995-01-01

    Combining multiple engineering workstations into a network-based heterogeneous parallel computer allows application of aerodynamic optimization with advanced computational fluid dynamics codes, which can be computationally expensive on mainframe supercomputers. This paper introduces a nonlinear quasi-Newton optimizer designed for this network-based heterogeneous parallel computing environment utilizing a software called Parallel Virtual Machine. This paper will introduce the methodology behind coupling a Parabolized Navier-Stokes flow solver to the nonlinear optimizer. This parallel optimization package is applied to reduce the wave drag of a body of revolution and a wing/body configuration with results of 5% to 6% drag reduction.

  15. Neural networks and their applications to computer data security

    NASA Astrophysics Data System (ADS)

    Barua, Susamma

    1992-12-01

    The paper presented here explores the possibility of applying neural networks to identify authorized users of a computer system. Computer security can be ensured only by restricting access to a computer system. This in turn requires a sure means of identifying authorized users. The related research is based on the fact that every human being is distinguished by many unique physical characteristics. It has been known even before the age of computers that no two individuals sign their names identically. Signature samples collected from a group of individuals are analyzed and a neural network-based system that can recognize these signatures is designed.

  16. Phoebus: Network Middleware for Next-Generation Network Computing

    SciTech Connect

    Martin Swany

    2012-06-16

    The Phoebus project investigated algorithms, protocols, and middleware infrastructure to improve end-to-end performance in high speed, dynamic networks. The Phoebus system essentially serves as an adaptation point for networks with disparate capabilities or provisioning. This adaptation can take a variety of forms including acting as a provisioning agent across multiple signaling domains, providing transport protocol adaptation points, and mapping between distributed resource reservation paradigms and the optical network control plane. We have successfully developed the system and demonstrated benefits. The Phoebus system was deployed in Internet2 and in ESnet, as well as in GEANT2, RNP in Brazil and over international links to Korea and Japan. Phoebus is a system that implements a new protocol and associated forwarding infrastructure for improving throughput in high-speed dynamic networks. It was developed to serve the needs of large DOE applications on high-performance networks. The idea underlying the Phoebus model is to embed Phoebus Gateways (PGs) in the network as on-ramps to dynamic circuit networks. The gateways act as protocol translators that allow legacy applications to use dedicated paths with high performance.

  17. Traditional computing center as a modern network node

    SciTech Connect

    Heller, S.; Peskin, A.M.

    1981-11-01

    There is an obvious trend toward decentralization of computing power from the traditional, large computing center. Even so there remains a generous, but changing role for such centers to play. Their capabilities would then be complementary to smaller, individualized facilities, so the user would benefit greatly from a general purpose, local network on which the large center represented a node. There is no network currently available that exhibits all the attributes of the ideal local network for this environment. It can be approached, however, by combining several diverse products as network segments, which are interconnected via processor gateways. This is in fact the strategy being followed at Brookhaven National Laboratory, which has a computing environment typical of a large class of institutions. The attributes of the ideal network are presented. A brief discussion of the current state-of-the-art in networking is then given. Finally, the particulars of the Brookhaven implementation are offered as a case history.

  18. Improving Computing Courses from the Points of View of Students and Teachers: A Review and an Empirical Study

    ERIC Educational Resources Information Center

    Sampaio, Alberto; Sampaio, Isabel

    2012-01-01

    The improvement of computing courses is a permanent need and is a goal established by any teacher. Suggestions of possible course improvements should be made by teachers and students. Computer project-based courses involving a significant number of people make it difficult to listen to all of their opinions. The purpose of our research is twofold:…

  19. Implications of Computer Networking and the Internet for Nurse Education.

    ERIC Educational Resources Information Center

    Ward, Rod

    1997-01-01

    Looks at the ways in which computer networks are changing health care and nursing education. Discusses the increasing importance of Internet-based education, widely distributed interactive multimedia learning, and information management skills. (SK)

  20. Spreadsheet Analysis Of Queuing In A Computer Network

    NASA Technical Reports Server (NTRS)

    Galant, David C.

    1992-01-01

    Method of analyzing responses of computer network based on simple queuing-theory mathematical models via spreadsheet program. Effects of variations in traffic, capacities of channels, and message protocols assessed.
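
    Illustrative sketch (the record does not name the queue model; an M/M/1 queue is assumed here only to show the kind of closed-form formulas that map directly onto spreadsheet cells):

      def mm1_metrics(arrival_rate: float, service_rate: float) -> dict:
          """Basic M/M/1 steady-state metrics (requires utilization < 1)."""
          rho = arrival_rate / service_rate              # channel utilization
          if rho >= 1:
              raise ValueError("queue is unstable: utilization must be < 1")
          mean_in_system = rho / (1 - rho)               # mean number of messages in system
          mean_response = mean_in_system / arrival_rate  # Little's law
          return {"utilization": rho,
                  "mean_in_system": mean_in_system,
                  "mean_response_time": mean_response}

      # Example: 80 messages/s offered to a channel that serves 100 messages/s.
      print(mm1_metrics(arrival_rate=80.0, service_rate=100.0))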

  1. Computers, Electronic Networking and Education: Some American Experiences.

    ERIC Educational Resources Information Center

    McConnell, David

    1991-01-01

    Describes new developments in distributed educational computing at Massachusetts Institute of Technology (MIT, "Athena"), Carnegie Mellon University ("Andrew"), Brown University ("Intermedia"), Electronic University Network (California), Western Behavioral Sciences Institute (California), and University of California, Irvine. Topics discussed…

  2. Computationally Efficient Neural Network Intrusion Security Awareness

    SciTech Connect

    Todd Vollmer; Milos Manic

    2009-08-01

    An enhanced version of an algorithm to provide anomaly based intrusion detection alerts for cyber security state awareness is detailed. A unique aspect is the training of an error back-propagation neural network with intrusion detection rule features to provide a recognition basis. Network packet details are subsequently provided to the trained network to produce a classification. This leverages rule knowledge sets to produce classifications for anomaly based systems. Several test cases executed on ICMP protocol revealed a 60% identification rate of true positives. This rate matched the previous work, but 70% less memory was used and the run time was reduced to less than 1 second from 37 seconds.
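
    Illustrative sketch (synthetic placeholders, not the ICMP features or rule sets used in the work above): a back-propagation classifier trained on rule-derived packet features and then used to classify new packets.

      import numpy as np
      from sklearn.neural_network import MLPClassifier

      rng = np.random.default_rng(42)
      # Hypothetical per-packet features: [payload_size, packet_rate, flag_a, flag_b]
      normal = rng.normal([200, 5, 0, 0], [50, 2, 0.1, 0.1], size=(500, 4))
      attack = rng.normal([600, 50, 1, 1], [80, 10, 0.1, 0.1], size=(500, 4))
      X = np.vstack([normal, attack])
      y = np.array([0] * 500 + [1] * 500)          # 1 = matches an intrusion rule

      # Error back-propagation network with one small hidden layer.
      clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=500, random_state=0)
      clf.fit(X, y)
      print("training accuracy:", clf.score(X, y))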

  3. Computing the Local Field Potential (LFP) from Integrate-and-Fire Network Models.

    PubMed

    Mazzoni, Alberto; Lindén, Henrik; Cuntz, Hermann; Lansner, Anders; Panzeri, Stefano; Einevoll, Gaute T

    2015-12-01

    Leaky integrate-and-fire (LIF) network models are commonly used to study how the spiking dynamics of neural networks changes with stimuli, tasks or dynamic network states. However, neurophysiological studies in vivo often rather measure the mass activity of neuronal microcircuits with the local field potential (LFP). Given that LFPs are generated by spatially separated currents across the neuronal membrane, they cannot be computed directly from quantities defined in models of point-like LIF neurons. Here, we explore the best approximation for predicting the LFP based on standard output from point-neuron LIF networks. To search for this best "LFP proxy", we compared LFP predictions from candidate proxies based on LIF network output (e.g, firing rates, membrane potentials, synaptic currents) with "ground-truth" LFP obtained when the LIF network synaptic input currents were injected into an analogous three-dimensional (3D) network model of multi-compartmental neurons with realistic morphology, spatial distributions of somata and synapses. We found that a specific fixed linear combination of the LIF synaptic currents provided an accurate LFP proxy, accounting for most of the variance of the LFP time course observed in the 3D network for all recording locations. This proxy performed well over a broad set of conditions, including substantial variations of the neuronal morphologies. Our results provide a simple formula for estimating the time course of the LFP from LIF network simulations in cases where a single pyramidal population dominates the LFP generation, and thereby facilitate quantitative comparison between computational models and experimental LFP recordings in vivo. PMID:26657024
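
    Illustrative sketch (placeholder weights and delay, not the coefficients fitted in the study): the proxy is a fixed linear combination of the population synaptic currents produced by the LIF network.

      import numpy as np

      def lfp_proxy(i_ampa, i_gaba, w_ampa=1.0, w_gaba=1.65, delay_steps=6):
          """Fixed linear combination of population synaptic currents.

          The weights and delay here are illustrative placeholders only.
          """
          gaba_shifted = np.roll(i_gaba, delay_steps)
          gaba_shifted[:delay_steps] = i_gaba[0]   # pad the samples rolled in from the end
          return w_ampa * np.abs(i_ampa) + w_gaba * np.abs(gaba_shifted)

      # Toy population currents sampled at 1 kHz.
      t = np.arange(0.0, 1.0, 0.001)
      i_ampa = np.sin(2 * np.pi * 4 * t)
      i_gaba = 0.5 * np.sin(2 * np.pi * 4 * t + 0.5)
      proxy = lfp_proxy(i_ampa, i_gaba)
      print(proxy[:5])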

  4. Computing the Local Field Potential (LFP) from Integrate-and-Fire Network Models

    PubMed Central

    Mazzoni, Alberto; Lindén, Henrik; Cuntz, Hermann; Lansner, Anders; Panzeri, Stefano; Einevoll, Gaute T.

    2015-01-01

    Leaky integrate-and-fire (LIF) network models are commonly used to study how the spiking dynamics of neural networks changes with stimuli, tasks or dynamic network states. However, neurophysiological studies in vivo often rather measure the mass activity of neuronal microcircuits with the local field potential (LFP). Given that LFPs are generated by spatially separated currents across the neuronal membrane, they cannot be computed directly from quantities defined in models of point-like LIF neurons. Here, we explore the best approximation for predicting the LFP based on standard output from point-neuron LIF networks. To search for this best “LFP proxy”, we compared LFP predictions from candidate proxies based on LIF network output (e.g, firing rates, membrane potentials, synaptic currents) with “ground-truth” LFP obtained when the LIF network synaptic input currents were injected into an analogous three-dimensional (3D) network model of multi-compartmental neurons with realistic morphology, spatial distributions of somata and synapses. We found that a specific fixed linear combination of the LIF synaptic currents provided an accurate LFP proxy, accounting for most of the variance of the LFP time course observed in the 3D network for all recording locations. This proxy performed well over a broad set of conditions, including substantial variations of the neuronal morphologies. Our results provide a simple formula for estimating the time course of the LFP from LIF network simulations in cases where a single pyramidal population dominates the LFP generation, and thereby facilitate quantitative comparison between computational models and experimental LFP recordings in vivo. PMID:26657024

  5. Computing the Local Field Potential (LFP) from Integrate-and-Fire Network Models.

    PubMed

    Mazzoni, Alberto; Lindén, Henrik; Cuntz, Hermann; Lansner, Anders; Panzeri, Stefano; Einevoll, Gaute T

    2015-12-01

    Leaky integrate-and-fire (LIF) network models are commonly used to study how the spiking dynamics of neural networks changes with stimuli, tasks or dynamic network states. However, neurophysiological studies in vivo often rather measure the mass activity of neuronal microcircuits with the local field potential (LFP). Given that LFPs are generated by spatially separated currents across the neuronal membrane, they cannot be computed directly from quantities defined in models of point-like LIF neurons. Here, we explore the best approximation for predicting the LFP based on standard output from point-neuron LIF networks. To search for this best "LFP proxy", we compared LFP predictions from candidate proxies based on LIF network output (e.g, firing rates, membrane potentials, synaptic currents) with "ground-truth" LFP obtained when the LIF network synaptic input currents were injected into an analogous three-dimensional (3D) network model of multi-compartmental neurons with realistic morphology, spatial distributions of somata and synapses. We found that a specific fixed linear combination of the LIF synaptic currents provided an accurate LFP proxy, accounting for most of the variance of the LFP time course observed in the 3D network for all recording locations. This proxy performed well over a broad set of conditions, including substantial variations of the neuronal morphologies. Our results provide a simple formula for estimating the time course of the LFP from LIF network simulations in cases where a single pyramidal population dominates the LFP generation, and thereby facilitate quantitative comparison between computational models and experimental LFP recordings in vivo.

  6. Optical processing for future computer networks

    NASA Technical Reports Server (NTRS)

    Husain, A.; Haugen, P. R.; Hutcheson, L. D.; Warrior, J.; Murray, N.; Beatty, M.

    1986-01-01

    In the development of future data management systems, such as the NASA Space Station, a major problem is the design and implementation of a high performance communication network which is self-correcting and repairing, flexible, and evolvable. To achieve the goal of designing such a network, it will be essential to incorporate distributed adaptive network control techniques. The present paper provides an outline of the functional and communication network requirements for the Space Station data management system. Attention is given to the mathematical representation of the operations being carried out to provide the required functionality at each layer of communication protocol on the model. The possible implementation of specific communication functions in optics is also considered.

  7. Computer Network Security: Best Practices for Alberta School Jurisdictions.

    ERIC Educational Resources Information Center

    Alberta Dept. of Education, Edmonton.

    This paper provides a snapshot of the computer network security industry and addresses specific issues related to network security in public education. The following topics are covered: (1) security policy, including reasons for establishing a policy, risk assessment, areas to consider, audit tools; (2) workstations, including physical security,…

  8. Economics of Computing: The Case of Centralized Network File Servers.

    ERIC Educational Resources Information Center

    Solomon, Martin B.

    1994-01-01

    Discusses computer networking and the cost effectiveness of decentralization, including local area networks. A planned experiment with a centralized approach to the operation and management of file servers at the University of South Carolina is described that hopes to realize cost savings and the avoidance of staffing problems. (Contains four…

  9. Computer Networking Strategies for Building Collaboration among Science Educators.

    ERIC Educational Resources Information Center

    Aust, Ronald

    The development and dissemination of science materials can be associated with technical delivery systems such as the Unified Network for Informatics in Teacher Education (UNITE). The UNITE project was designed to investigate ways for using computer networking to improve communications and collaboration among university schools of education and…

  10. Simulation of a National Computer Network in a Gaming Environment

    ERIC Educational Resources Information Center

    Segal, Ronald; O'Neal, Beverly

    1978-01-01

    A national computer services network simulation model was used in a 3-day gaming exercise involving 16 institutional teams who made decisions about their likely long-term network participation. Participants were able to react to others' decisions and actions, and to critical overriding political, economical, and organizational issues. (CMV)

  11. Electrooptical adaptive switching network for the hypercube computer

    NASA Technical Reports Server (NTRS)

    Chow, E.; Peterson, J.

    1988-01-01

    An all-optical network design for the hyperswitch network using regular free-space interconnects between electronic processor nodes is presented. The adaptive routing model used is described, and an adaptive routing control example is presented. The design demonstrates that existing electrooptical techniques are sufficient for implementing efficient parallel architectures without the need for more complex means of implementing arbitrary interconnection schemes. The electrooptical hyperswitch network significantly improves the communication performance of the hypercube computer.

  12. Computer systems for laboratory networks and high-performance NMR.

    PubMed

    Levy, G C; Begemann, J H

    1985-08-01

    Modern computer technology is significantly enhancing the associated tasks of spectroscopic data acquisition and data reduction and analysis. Distributed data processing techniques, particularly laboratory computer networking, are rapidly changing the scientist's ability to optimize results from complex experiments. Optimization of nuclear magnetic resonance spectroscopy (NMR) and magnetic resonance imaging (MRI) experimental results requires use of powerful, large-memory (virtual memory preferred) computers with integrated (and supported) high-speed links to magnetic resonance instrumentation. Laboratory architectures with larger computers, in order to extend data reduction capabilities, have facilitated the transition to NMR laboratory computer networking. Examples of a polymer microstructure analysis and in vivo 31P metabolic analysis are given. This paper also discusses laboratory data processing trends anticipated over the next 5-10 years. Full networking of NMR laboratories is just now becoming a reality. PMID:3840171

  13. Networked Computing in the 1990s.

    ERIC Educational Resources Information Center

    Tesler, Lawrence G.

    1991-01-01

    The changes in the relationship between the computer and user from that of an isolated productivity tool to that of an active collaborator in the acquisition, use, and creation of information, as well as a facilitator of human interaction are discussed. The four paradigms of computing are compared. (KR)

  14. Mobile game development: improving student engagement and motivation in introductory computing courses

    NASA Astrophysics Data System (ADS)

    Kurkovsky, Stan

    2013-06-01

    Computer games have been accepted as an engaging and motivating tool in the computer science (CS) curriculum. However, designing and implementing a playable game is challenging, and is best done in advanced courses. Games for mobile devices, on the other hand, offer the advantage of being simpler and, thus, easier to program for lower level students. Learning context of mobile game development can be used to reinforce many core programming topics, such as loops, classes, and arrays. Furthermore, it can also be used to expose students in introductory computing courses to a wide range of advanced topics in order to illustrate that CS can be much more than coding. This paper describes the author's experience with using mobile game development projects in CS I and II, how these projects were integrated into existing courses at several universities, and the lessons learned from this experience.

  15. Normalizing Social Networking in a Beginners' Japanese Course

    ERIC Educational Resources Information Center

    Morofushi, Mari; Pasfield-Neofitou, Sarah Ellen

    2014-01-01

    With the spread of the Internet, students now have greater opportunities to use Japanese outside of the classroom. For example, they can interact with other Japanese speakers through instant messaging or social networking, or utilize online dictionaries and translation tools to decipher websites in ways that would be impossible with traditional…

  16. Challenges for high-performance networking for exascale computing.

    SciTech Connect

    Barrett, Brian W.; Hemmert, K. Scott; Underwood, Keith Douglas; Brightwell, Ronald Brian

    2010-05-01

    Achieving the next three orders of magnitude performance increase to move from petascale to exascale computing will require significant advancements in several fundamental areas. Recent studies have outlined many of the hardware and software challenges that will need to be addressed. In this paper, we examine these challenges with respect to high-performance networking. We describe the repercussions of anticipated changes to computing and networking hardware and discuss the impact that alternative parallel programming models will have on the network software stack. We also present some ideas on possible approaches that address some of these challenges.

  17. Topological properties of robust biological and computational networks

    PubMed Central

    Navlakha, Saket; He, Xin; Faloutsos, Christos; Bar-Joseph, Ziv

    2014-01-01

    Network robustness is an important principle in biology and engineering. Previous studies of global networks have identified both redundancy and sparseness as topological properties used by robust networks. By focusing on molecular subnetworks, or modules, we show that module topology is tightly linked to the level of environmental variability (noise) the module expects to encounter. Modules internal to the cell that are less exposed to environmental noise are more connected and less robust than external modules. A similar design principle is used by several other biological networks. We propose a simple change to the evolutionary gene duplication model which gives rise to the rich range of module topologies observed within real networks. We apply these observations to evaluate and design communication networks that are specifically optimized for noisy or malicious environments. Combined, joint analysis of biological and computational networks leads to novel algorithms and insights benefiting both fields. PMID:24789562

  18. Recurrent Neural Network for Computing Outer Inverse.

    PubMed

    Živković, Ivan S; Stanimirović, Predrag S; Wei, Yimin

    2016-05-01

    Two linear recurrent neural networks for generating outer inverses with prescribed range and null space are defined. Each of the proposed recurrent neural networks is based on the matrix-valued differential equation, a generalization of dynamic equations proposed earlier for the nonsingular matrix inversion, the Moore-Penrose inversion, as well as the Drazin inversion, under the condition of zero initial state. The application of the first approach is conditioned by the properties of the spectrum of a certain matrix; the second approach eliminates this drawback, though at the cost of increasing the number of matrix operations. The cases corresponding to the most common generalized inverses are defined. The conditions that ensure stability of the proposed neural network are presented. Illustrative examples present the results of numerical simulations.
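
    Illustrative sketch (a generic gradient-flow network for the Moore-Penrose inverse, one special case of an outer inverse; this is not the matrix-valued dynamics proposed in the paper): Euler-integrating dX/dt = -gamma * A^T(AX - I) from X(0) = 0 converges to the pseudoinverse.

      import numpy as np

      def neural_pinv(A, gamma=5.0, dt=1e-3, steps=20000):
          """Euler-integrate dX/dt = -gamma * A.T @ (A @ X - I) from X(0) = 0."""
          m, n = A.shape
          X = np.zeros((n, m))
          I = np.eye(m)
          for _ in range(steps):
              X -= dt * gamma * (A.T @ (A @ X - I))
          return X

      A = np.array([[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]])
      print(np.allclose(neural_pinv(A), np.linalg.pinv(A), atol=1e-4))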

  19. The CREATE Network (Computer Resource Educational Access in Tennessee Education).

    ERIC Educational Resources Information Center

    Moon, Fletcher F.

    The CREATE Network (Computer Resource Educational Access in Tennessee Education) brought together library professionals from Tennessee's seven historically black colleges and universities (HBCUs) for purposes of training and implementation of library applications of computer-based information technology. Annual training seminars were held at…

  20. An Experiment in Computer Conferencing Using a Local Area Network.

    ERIC Educational Resources Information Center

    Baird, Patricia M.; Borer, Beatrice

    1987-01-01

    Describes various computer conferencing systems and discusses their effectiveness in terms of user acceptance and reactions to the technology. The methodology and findings of an experiment in which graduate students conducted a computer conference using a local area network and produced an electronic journal of the conference proceedings are…

  1. Active system area networks for data intensive computations. Final report

    SciTech Connect

    2002-04-01

    The goal of the Active System Area Networks (ASAN) project is to develop hardware and software technologies for the implementation of active system area networks (ASANs). The use of the term "active" refers to the ability of the network interfaces to perform application-specific as well as system-level computations in addition to their traditional role of data transfer. This project adopts the view that the network infrastructure should be an active computational entity capable of supporting certain classes of computations that would otherwise be performed on the host CPUs. The result is a unique network-wide programming model where computations are dynamically placed within the host CPUs or the NIs depending upon the quality of service demands and network/CPU resource availability. The project seeks to demonstrate that such an approach is a better match for data-intensive network-based applications and that the advent of low-cost powerful embedded processors and configurable hardware makes such an approach economically viable and desirable.

  2. Optical interconnection networks for high-performance computing systems.

    PubMed

    Biberman, Aleksandr; Bergman, Keren

    2012-04-01

    Enabled by silicon photonic technology, optical interconnection networks have the potential to be a key disruptive technology in the computing and communication industries. The enduring pursuit of performance gains in computing, combined with stringent power constraints, has fostered the ever-growing computational parallelism associated with chip multiprocessors, memory systems, high-performance computing systems and data centers. Sustaining this growth in parallelism introduces unique challenges for on- and off-chip communications, shifting the focus toward novel and fundamentally different communication approaches. Chip-scale photonic interconnection networks, enabled by high-performance silicon photonic devices, offer unprecedented bandwidth scalability with reduced power consumption. We demonstrate that silicon photonic platforms have already produced all the high-performance photonic devices required to realize these types of networks. Through extensive empirical characterization in much of our work, we demonstrate the feasibility of waveguides, modulators, switches and photodetectors. We also demonstrate systems that simultaneously combine many functionalities to achieve more complex building blocks. We propose novel silicon photonic devices, subsystems, network topologies and architectures to enable unprecedented performance of these photonic interconnection networks. Furthermore, the advantages of photonic interconnection networks extend far beyond the chip, offering advanced communication environments for memory systems, high-performance computing systems, and data centers.

  3. Synchronizing computer clocks using a local area network

    NASA Technical Reports Server (NTRS)

    Levine, Judah

    1990-01-01

    Researchers completed the first tests of a method to synchronize the clocks of networked computers to the National Institute of Standards and Technology (NIST) time scale. The method uses a server computer to disseminate the time to other clients on the same local-area network. The server is synchronized to NIST using the ACTS protocol over a dial-up telephone line. The software in both the server and the clients maintains a statistical model of the local clock, and the parameters of this model are used to adjust the time of the local clock and the interval between calibration requests in a statistically optimum way. The algorithm maximizes the time between calibrations while at the same time keeping the time of the local clock correct within a specific tolerance. The method can be extended to synchronize computers linked over wide-area networks, and an experiment to test the performance of the algorithms over such networks is being planned.
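
    Illustrative sketch (not the NIST algorithm itself): the core idea of stretching the calibration interval as far as a clock's estimated drift allows while staying within a specified tolerance.

      def next_calibration_interval(drift_rate, tolerance_s,
                                    min_s=60.0, max_s=86400.0):
          """Longest interval over which accumulated drift stays within tolerance.

          drift_rate is the estimated frequency error in seconds per second.
          """
          if drift_rate == 0:
              return max_s
          limit = tolerance_s / abs(drift_rate)     # seconds until the tolerance is exceeded
          return max(min_s, min(max_s, limit))

      # Example: a 1e-6 s/s frequency error with a 5 ms tolerance -> recalibrate within ~5000 s.
      print(next_calibration_interval(drift_rate=1e-6, tolerance_s=0.005))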

  4. Recurrent kernel machines: computing with infinite echo state networks.

    PubMed

    Hermans, Michiel; Schrauwen, Benjamin

    2012-01-01

    Echo state networks (ESNs) are large, random recurrent neural networks with a single trained linear readout layer. Despite the untrained nature of the recurrent weights, they are capable of performing universal computations on temporal input data, which makes them interesting for both theoretical research and practical applications. The key to their success lies in the fact that the network computes a broad set of nonlinear, spatiotemporal mappings of the input data, on which linear regression or classification can easily be performed. One could consider the reservoir as a spatiotemporal kernel, in which the mapping to a high-dimensional space is computed explicitly. In this letter, we build on this idea and extend the concept of ESNs to infinite-sized recurrent neural networks, which can be considered recursive kernels that subsequently can be used to create recursive support vector machines. We present the theoretical framework, provide several practical examples of recursive kernels, and apply them to typical temporal tasks.
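
    Illustrative sketch (a minimal finite-size echo state network with a single trained linear readout, the construction the abstract generalizes to the infinite limit; the sizes, scalings, and toy task are arbitrary choices):

      import numpy as np

      rng = np.random.default_rng(0)
      n_res, n_in = 200, 1
      W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
      W = rng.uniform(-0.5, 0.5, (n_res, n_res))
      W *= 0.9 / max(abs(np.linalg.eigvals(W)))    # keep the spectral radius below 1

      def run_reservoir(u):
          """Drive the random recurrent reservoir with the input sequence u."""
          x = np.zeros(n_res)
          states = []
          for u_t in u:
              x = np.tanh(W_in @ np.atleast_1d(u_t) + W @ x)
              states.append(x.copy())
          return np.array(states)

      # Toy task: reproduce a 5-step-delayed copy of the input with a ridge readout.
      u = rng.uniform(-1, 1, 1000)
      target = np.roll(u, 5)
      X = run_reservoir(u)[100:]                   # discard the initial transient
      y = target[100:]
      W_out = np.linalg.solve(X.T @ X + 1e-6 * np.eye(n_res), X.T @ y)
      print("training MSE:", np.mean((X @ W_out - y) ** 2))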

  5. [Research toward a heterogeneous networked computing cluster

    SciTech Connect

    Duke, D.W.; Green, T.P.

    1998-08-11

    Over the last year the Systems Development Group, SDG, has been involved in a number of projects. The primary projects include extending the UNIX version of DQS, a DCE version of DQS, a Java based queuing system, a Computer Aided Learning and Instruction model and working with the Florida Department of Law Enforcement in the formation of the Florida Computer Crime Center. Additionally SDG has assisted a number of state and local agencies. A synopsis of these projects is contained in this report.

  6. Home Care Nursing via Computer Networks: Justification and Design Specifications

    PubMed Central

    Brennan, Patricia Flatley

    1988-01-01

    High-tech home care includes the use of information technologies, such as computer networks, to provide direct care to patients in the home. This paper presents the justification and design of a project using a free, public access computer network to deliver home care nursing. The intervention attempts to reduce isolation and improve problem solving among home care patients and their informal caregivers. Three modules comprise the intervention: a decision module, a communications module, and an information data base. This paper describes the experimental evaluation of the project, and discusses issues in the delivery of nursing care via computers.

  7. Coordinating Computing, Network and Archiving activities within INAF

    NASA Astrophysics Data System (ADS)

    Pasian, F.; Bodo, G.; Fini, L.; Garilli, B.; Longo, G.; Massimino, P.; Nanni, M.; Smareglia, R.

    When INAF was reformed, it was decided to create a 'Computing, Network and Archives Service' within the Projects Department, in order to coordinate all computer-related activities and to properly harmonize management and development policies in the field. A 'Computing, Network and Archives Committee' was immediately nominated for the duration of one year to cope with the immediate needs. The Committee has the task of identifying and making operational strategies to coordinate activities in the areas of interest, improving service to all users, implementing synergies and economies, while guaranteeing a single INAF contact point for all external institutions working in the field.

  8. Realistic computer network simulation for network intrusion detection dataset generation

    NASA Astrophysics Data System (ADS)

    Payer, Garrett

    2015-05-01

    The KDD-99 Cup dataset is dead. While it can continue to be used as a toy example, the age of this dataset makes it all but useless for intrusion detection research and data mining. Many of the attacks used within the dataset are obsolete and do not reflect the features important for intrusion detection in today's networks. Creating a new dataset encompassing a large cross section of the attacks found on the Internet today could be useful, but would eventually fall to the same problem as the KDD-99 Cup; its usefulness would diminish after a period of time. To continue research into intrusion detection, the generation of new datasets needs to be as dynamic and as quick as the attacker. Simply examining existing network traffic and using domain experts such as intrusion analysts to label traffic is inefficient, expensive, and not scalable. The only viable methodology is simulation using technologies including virtualization, attack-toolsets such as Metasploit and Armitage, and sophisticated emulation of threat and user behavior. Simulating actual user behavior and network intrusion events dynamically not only allows researchers to vary scenarios quickly, but enables online testing of intrusion detection mechanisms by interacting with data as it is generated. As new threat behaviors are identified, they can be added to the simulation to make quicker determinations as to the effectiveness of existing and ongoing network intrusion technology, methodology and models.

  9. Computer program for compressible flow network analysis

    NASA Technical Reports Server (NTRS)

    Wilton, M. E.; Murtaugh, J. P.

    1973-01-01

    Program solves problem of an arbitrarily connected one dimensional compressible flow network with pumping in the channels and momentum balancing at flow junctions. Program includes pressure drop calculations for impingement flow and flow through pin fin arrangements, as currently found in many air cooled turbine bucket and vane cooling configurations.

  10. Coherent Computing with Injection-Locked Laser Network

    NASA Astrophysics Data System (ADS)

    Utsunomiya, S.; Wen, K.; Takata, K.; Tamate, S.; Yamamoto, Yoshihisa

    Combinatorial optimization problems are ubiquitous in our modern life. Classic examples include protein folding in biology and medicine, frequency assignment in wireless communications, traffic control and routing in the air and on the ground, microprocessor circuit design, computer vision and graph cuts in machine learning, and social network control. They often belong to the NP, NP-complete and NP-hard classes, for which modern digital computers and future quantum computers cannot find solutions efficiently, i.e. in polynomial time [1].

  11. Integrated evolutionary computation neural network quality controller for automated systems

    SciTech Connect

    Patro, S.; Kolarik, W.J.

    1999-06-01

    With increasing competition in the global market, more and more stringent quality standards and specifications are being demanded at lower costs. Manufacturing applications of computing power are becoming more common. The application of neural networks to identification and control of dynamic processes has been discussed. The limitations of using neural networks for control purposes have been pointed out and a different technique, evolutionary computation, has been discussed. The results of identifying and controlling an unstable, dynamic process using evolutionary computation methods have been presented. A framework for an integrated system, using both neural networks and evolutionary computation, has been proposed to identify the process and then control the product quality, in a dynamic, multivariable system, in real-time.
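
    Illustrative sketch (a toy stand-in, not the integrated framework proposed above): a minimal (1+1) evolution strategy tuning a single controller gain on a simple first-order process model.

      import random

      def run_process(gain, setpoint=1.0, steps=200):
          """Simulate a toy first-order process under proportional control."""
          y, cost = 0.0, 0.0
          for _ in range(steps):
              u = gain * (setpoint - y)            # proportional control action
              y = 0.9 * y + 0.1 * u                # simple process dynamics
              cost += (setpoint - y) ** 2          # squared quality deviation
          return cost

      random.seed(0)
      gain, best = 0.1, run_process(0.1)
      for _ in range(200):                         # (1+1) evolution strategy loop
          candidate = max(0.0, gain + random.gauss(0, 0.2))
          c = run_process(candidate)
          if c < best:
              gain, best = candidate, c
      print(f"evolved gain: {gain:.2f}, quality cost: {best:.2f}")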

  12. The Role of Computer Conferencing in Delivery of a Short Course on Assessment of Learning Difficulties.

    ERIC Educational Resources Information Center

    Dwyer, Eamonn

    1991-01-01

    A pilot project at the University of Ulster (Northern Ireland) used the CAUCUS computer conferencing system on the CAMPUS 2000 education network to train teachers to assess young adults with severe learning difficulties. Despite problems with student attrition and system failure, computer conferencing was felt to be a useful medium for providing…

  13. Creation of a Course in Computer Methods and Modeling for Undergraduate Earth Science Programs

    NASA Astrophysics Data System (ADS)

    Menking, K. M.; Dashnaw, J. M.

    2003-12-01

    In recent years computer modeling has gained importance in geological research as a means to generate and test hypotheses and to allow simulation of processes in places inaccessible to humans (e.g., outer core fluid dynamics), too slow to permit observation (e.g., erosionally-induced uplift of topography), or too large to facilitate construction of physical models (e.g., faulting on the San Andreas). Entire fields within the Earth sciences now exist in which computer modeling has become the core work of the discipline. Undergraduate geology/Earth science programs have been slow to adapt to this change, and computer science curricular offerings often do not meet geology students' needs. To address these problems, a course in Computer Methods and Modeling in the Earth Sciences is being developed at Vassar College. The course uses the STELLA iconographical box modeling software developed by High Performance Systems, Inc. to teach students the fundamentals of dynamical systems modeling and then builds on the knowledge students have constructed with STELLA to teach introductory computer programming in Fortran. Fully documented and debugged STELLA and Fortran models along with reading lists, answer keys, and course notes are being developed for distribution to anyone interested in teaching a course such as this. Modeling topics include U-Pb concordia/discordia dating techniques, the global phosphorus cycle, Earth's energy balance and temperature, the impact of climate change on a chain of lakes in eastern California, heat flow in permafrost, and flow of ice in glaciers by plastic deformation. The course has been taught twice at Vassar and has been enthusiastically received by students who reported not only that they enjoyed learning the process of modeling, but also that they had a newfound appreciation for the role of mathematics in geology and intended to enroll in more math courses in the future.
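
    Illustrative sketch (one of the modeling topics listed above, written as a simple time-stepping box model; parameter values are standard textbook choices, not the course materials):

      # Zero-dimensional Earth energy-balance model: C dT/dt = S0(1 - albedo)/4 - eps*sigma*T^4
      SIGMA = 5.67e-8        # Stefan-Boltzmann constant, W m^-2 K^-4
      S0 = 1361.0            # solar constant, W m^-2
      ALBEDO = 0.3
      EMISSIVITY = 0.61      # crude stand-in for the greenhouse effect
      HEAT_CAP = 4.0e8       # effective heat capacity, J m^-2 K^-1

      T = 255.0              # initial temperature, K
      dt = 86400.0           # one-day time step, s
      for _ in range(365 * 50):
          absorbed = S0 * (1 - ALBEDO) / 4.0
          emitted = EMISSIVITY * SIGMA * T ** 4
          T += dt * (absorbed - emitted) / HEAT_CAP
      print(f"equilibrium temperature: {T:.1f} K")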

  14. Communications Training Courses Across the Leopold Leadership Network

    NASA Astrophysics Data System (ADS)

    Hayden, T.; Gerber, L. R.; Silver, W. L.

    2012-12-01

    For nearly fifteen years, the Leopold Leadership Program has provided science communication training and support to mid-career academic environmental researchers from across North America. There has been an emphasis throughout on effective communication to non-scientific audiences. Increasingly, Leopold fellows have been developing communications courses for their own students, responding to the need for future scientists to be able to communicate well with the public, the media, policy makers and other audiences. At a June 2012 reunion meeting, a group of past fellows and communications trainers conducted a curriculum exchange, sharing experiences and ideas for successful inclusion of communications training in environmental science curricula. This presentation will present case studies from several institutions, including the use of podcasting, web columns, social media, in-person presentation and other presentation styles for connecting general audiences. We will share best practices, challenges and recommendations for curriculum development and institutional acceptance.

  15. [Undergraduate nursing students experience of a computer-based learning course].

    PubMed

    Alves, Rosa Helena Kreutz; Cogo, Ana Luísa Petersen

    2008-12-01

    The purpose of this qualitative study was to understand how undergraduate nursing students at the Federal University of Rio Grande do Sul Nursing School experienced the computer-based learning (CBL) course: "Socio-historical process in nursing education". Five female students, who had attended the course the previous semester, were interviewed. Data were analyzed using thematic analysis. The final categories were: "the students' experience in the use of computer technologies" and "the students in relation to the computer-based learning experience". The flexibility of study time and venue was pointed out as a positive factor. The students realized that CBL requires more effort and dedication when compared to conventional learning activities. We concluded that computer-based learning is an inclusive modality that allows access for students who are already in the labor market. PMID:19320351

  16. Methodical Approaches to Teaching of Computer Modeling in Computer Science Course

    ERIC Educational Resources Information Center

    Rakhimzhanova, B. Lyazzat; Issabayeva, N. Darazha; Khakimova, Tiyshtik; Bolyskhanova, J. Madina

    2015-01-01

    The purpose of this study was to justify a technique for forming a representation of modeling methodology in computer science lessons. Studying computer modeling is necessary because current trends toward strengthening the general educational and worldview functions of computer science call for additional research of the…

  17. Computer network access to scientific information systems for minority universities

    NASA Astrophysics Data System (ADS)

    Thomas, Valerie L.; Wakim, Nagi T.

    1993-08-01

    The evolution of computer networking technology has led to the establishment of a massive networking infrastructure which interconnects various types of computing resources at many government, academic, and corporate institutions. A large segment of this infrastructure has been developed to facilitate information exchange and resource sharing within the scientific community. The National Aeronautics and Space Administration (NASA) supports both the development and the application of computer networks which provide its community with access to many valuable multi-disciplinary scientific information systems and on-line databases. Recognizing the need to extend the benefits of this advanced networking technology to the under-represented community, the National Space Science Data Center (NSSDC) in the Space Data and Computing Division at the Goddard Space Flight Center has developed the Minority University-Space Interdisciplinary Network (MU-SPIN) Program: a major networking and education initiative for Historically Black Colleges and Universities (HBCUs) and Minority Universities (MUs). In this paper, we will briefly explain the various components of the MU-SPIN Program while highlighting how, by providing access to scientific information systems and on-line data, it promotes a higher level of collaboration among faculty and students and NASA scientists.

  18. Fault-tolerant self-routing computer network topology

    SciTech Connect

    Mitchell, T.L.

    1987-01-01

    This document reports on the development and analysis of a new, easily expandable, highly fault tolerant self-routing computer network topology. The topology applies equally to any general-purpose computer-networking environment, whether local, metropolitan, or wide area. This new connectivity scheme is named the spiral topology because the architecture is built around modules of four computer nodes each, connected by top and bottom spirals. The spiral topology features a simple internal self-routing algorithm that adapts quickly, and automatically, to failed nodes and links. The six most important direct consequences of the spiral computer-network architecture are the topology's (1) ease of expansion; (2) fast, on-the-fly self-routing; (3) extremely high tolerance to network faults; (4) increased network security; (5) potential for the total elimination of store and forward transmissions due to routing decision delays; and (6) rendering the maximum path length issue moot. The fast on-the-fly routing capability of the spiral topology makes it highly amenable to fiber optic communications in any networking environment.

  19. Computer-Communications Networks and Teletraffic.

    ERIC Educational Resources Information Center

    Switzer, I.

    Bi-directional cable TV (CATV) systems that are being installed today may not be well suited for computer communications. Older CATV systems are being modified to bi-directional transmission and most new systems are being built with bi-directional capability included. The extreme bandwidth requirement for carrying 20 or more TV channels on a…

  20. A computational model for cancer growth by using complex networks

    NASA Astrophysics Data System (ADS)

    Galvão, Viviane; Miranda, José G. V.

    2008-09-01

    In this work we propose a computational model to investigate the proliferation of cancerous cells by using complex networks. In our model the network represents the structure of available space for cancer propagation. The computational scheme considers a cancerous cell randomly included in the complex network. As the system evolves, the cells can assume three states: proliferative, non-proliferative, and necrotic. Our results were compared with experimental data obtained from three human lung carcinoma cell lines. The computational simulations show that the cancerous cells have a Gompertzian growth. Also, our model simulates the formation of necrosis, increase of density, and resource diffusion to regions of lower nutrient concentration. We obtain that cancer growth is very similar in random and small-world networks. On the other hand, the topological structure of the small-world network is more affected. The scale-free network has the largest rates of cancer growth due to hub formation. Finally, our results indicate that for different average degrees the rate of cancer growth is related to the available space in the network.
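
    Illustrative sketch (a greatly simplified stand-in for the model described above; the necrotic state and nutrient dynamics are omitted): occupied nodes proliferate into empty neighbouring nodes of a small-world network.

      import random
      import networkx as nx

      random.seed(1)
      G = nx.watts_strogatz_graph(n=500, k=4, p=0.1)   # network of available space
      state = {v: "empty" for v in G}
      state[0] = "proliferative"                       # seed cancerous cell

      history = []
      for step in range(50):
          for v in [u for u in G if state[u] == "proliferative"]:
              empty = [w for w in G.neighbors(v) if state[w] == "empty"]
              if empty:
                  state[random.choice(empty)] = "proliferative"
              else:
                  state[v] = "non-proliferative"       # no space left to divide into
          history.append(sum(s != "empty" for s in state.values()))

      print("occupied nodes after 50 steps:", history[-1])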

  1. [The use of computers and networking in the neurosurgical field].

    PubMed

    Oizumi, T; Ohira, T; Kawase, T

    1999-02-01

    Due to improvements in computer and network technology, we are able to use medical information easily and safely on networks within medical institutions. In our department, we constructed and use an original Intranet over optical fibers. The network links the outpatient room, ward, operation room, staff room, and examination room, and many computers and medical instruments are connected to it. Since our Intranet has no connection with outside networks, we are able to access patients' medical information safely. With identity- and password-based access management on the server, clients can present medical information, with sound and video, at the request of patients and their families, medical students, nurses, and doctors. Doctors can also search and enter a patient's most recent medical information in a network database from every client. By linking the examination machines and operation-aided instruments to the Intranet, we were able to forward the patient's medical information to the operation-aided instruments easily and quickly. Furthermore, we will be able to perform tele-medicine and tele-operation in the near future: that is, medical staff will be able to guide a neurosurgical operation from outside the operation room, with a microscope and computer view, using two-way picture communication devices. Through strict access management of our Intranet, we are able to use medical information effectively for patient treatment, operation, education, and study on a network with no connection to the outside.

  2. A Treatment of Computational Precision, Number Representation, and Large Integers in an Introductory Fortran Course

    ERIC Educational Resources Information Center

    Richardson, William H., Jr.

    2006-01-01

    Computational precision is sometimes given short shrift in a first programming course. Treating this topic requires discussing integer and floating-point number representations and inaccuracies that may result from their use. An example of a moderately simple programming problem from elementary statistics was examined. It forced students to…

  3. Teaching Web Application Development: A Case Study in a Computer Science Course

    ERIC Educational Resources Information Center

    Del Fabro, Marcos Didonet; de Almeida, Eduardo Cunha; Sluzarski, Fabiano

    2012-01-01

    Teaching web development in Computer Science undergraduate courses is a difficult task. Often, there is a gap between the students' experiences and the reality in the industry. As a consequence, the students are not always well-prepared once they get the degree. This gap is due to several reasons, such as the complexity of the assignments, the…

  4. Applying Computer-Assisted Musical Instruction to Music Appreciation Course: An Example with Chinese Musical Instruments

    ERIC Educational Resources Information Center

    Lou, Shi-Jer; Guo, Yuan-Chang; Zhu, Yi-Zhen; Shih, Ru-Chu; Dzan, Wei-Yuan

    2011-01-01

    This study aims to explore the effectiveness of computer-assisted musical instruction (CAMI) in the Learning Chinese Musical Instruments (LCMI) course. The CAMI software for Chinese musical instruments was developed and administered to 228 students in a vocational high school. A pretest-posttest non-equivalent control group design with three…

  5. WebStars: Holistic, Arts-Based College Curriculum in a Computer Applications Course

    ERIC Educational Resources Information Center

    Karsten, Selia

    2004-01-01

    The purpose of my qualitative, action study was to gain a better understanding of the effects of an experimental college course in computer applications. This inquiry was made concerning both the teacher's and learners' points of view. A holistic, arts-based approach was used by the researcher/teacher in order to design, develop and facilitate a…

  6. Computer Simulation of the Alonso Household Location Model in the Microeconomics Course

    ERIC Educational Resources Information Center

    Bolton, Roger E.

    2005-01-01

    Computer simulation of the Alonso household location model can enrich the intermediate microeconomics course. The model includes decisions on location, land space, and other goods and is a valuable complement to the usual textbook model of household consumption. It has three decision variables, one of which is a "bad," and one good's price is a…

  7. An Assessment of the "Diploma in Computer Engineering" Course in the Technical Education System in Nepal

    ERIC Educational Resources Information Center

    Basnet, Kul Bahadur; Kim, Jinsoo

    2010-01-01

    The purpose of this study was to assess the Diploma in Computer Engineering (DCE) courses offered at affiliated schools of the Council for Technical Education and Vocational Training (CTEVT) with a focus on the goals of the curriculum and employment opportunities. Document analysis, questionnaires, focus group discussions and semi-structured…

  8. The MORPG-Based Learning System for Multiple Courses: A Case Study on Computer Science Curriculum

    ERIC Educational Resources Information Center

    Liu, Kuo-Yu

    2015-01-01

    This study aimed at developing a Multiplayer Online Role Playing Game-based (MORPG) Learning system which enabled instructors to construct a game scenario and manage sharable and reusable learning content for multiple courses. It used the curriculum of "Introduction to Computer Science" as a study case to assess students' learning…

  9. Success in Institutionalizing Basic Computer Skills Courses at a Community College Level.

    ERIC Educational Resources Information Center

    Dodge, Lucy

    This article outlines the development of basic computer literacy skills courses under the auspices of the Title III Grant awarded to San Jose City College (SJCC) of San Jose, California by the United States Department of Education (Grant no. PO31A980093, Strengthening Institutions, 1998-2003). The grant has been in effect for 3 years, and grant…

  10. Exploring Interactive and Dynamic Simulations Using a Computer Algebra System in an Advanced Placement Chemistry Course

    ERIC Educational Resources Information Center

    Matsumoto, Paul S.

    2014-01-01

    The article describes the use of Mathematica, a computer algebra system (CAS), in a high school chemistry course. Mathematica was used to generate a graph, where a slider controls the value of parameter(s) in the equation; thus, students can visualize the effect of the parameter(s) on the behavior of the system. Also, Mathematica can show the…

  11. Distributed Training for the Reserve Component: Course Conversion and Implementation Guidelines for Computer Conferencing.

    ERIC Educational Resources Information Center

    Hahn, H. A.; And Others

    The purpose of this handbook is to provide background and guidelines for course designers and instructional developers who will be developing Reserve Component training for the United States military using asynchronous computer conferencing techniques. The recommendations in this report are based on an international review of the literature in…

  12. Introducing Creativity in a Design Laboratory for a Freshman Level Electrical and Computer Engineering Course

    ERIC Educational Resources Information Center

    Burkett, Susan L.; Kotru, Sushma; Lusth, John C.; McCallum, Debra; Dunlap, Sarah

    2014-01-01

    In the electrical and computer engineering (ECE) curriculum at The University of Alabama, freshmen are introduced to fundamental electrical concepts and units, DC circuit analysis techniques, operational amplifiers, circuit simulation, design, and professional ethics. The two-credit course has both…

  13. Enhancing Learning in Introductory Computer Science Courses through SCALE: An Empirical Study

    ERIC Educational Resources Information Center

    Verginis, I.; Gogoulou, A.; Gouli, E.; Boubouka, M.; Grigoriadou, M.

    2011-01-01

    The work presented in this paper aims to support and promote the learning process in introductory computer science courses through the Web-based, adaptive, activity-oriented learning environment known as Supporting Collaboration and Adaptation in a Learning Environment (SCALE). The environment engages students actively in the learning process and…

  14. Maintaining Pedagogical Integrity of a Computer Mediated Course Delivery in Social Foundations

    ERIC Educational Resources Information Center

    Stewart, Shelley; Cobb-Roberts, Deirdre; Shircliffe, Barbara J.

    2013-01-01

    Transforming a face to face course to a computer mediated format in social foundations (interdisciplinary field in education), while maintaining pedagogical integrity, involves strategic collaboration between instructional technologists and content area experts. This type of planned partnership requires open dialogue and a mutual respect for prior…

  15. Cognitive Learning Styles and Academic Performance in Two Postsecondary Computer Application Courses.

    ERIC Educational Resources Information Center

    Ross, Jonathan L.; Drysdale, Maureen T. B.; Schultz, Robert A.

    2001-01-01

    Investigated effects of cognitive learning style on academic performance in two university computer applications courses. Discusses use of the Gregorc Style Delineator to collect learning style information over a four-year period. Results indicated a significant effect of learning style on academic performance, and that sequential learners…

  16. Multimedia Instructional Tools' Impact on Student Motivation and Learning Strategies in Computer Applications Courses

    ERIC Educational Resources Information Center

    Chapman, Debra; Wang, Shuyan

    2015-01-01

    Multimedia instructional tools (MMIT) have been identified as a way to effectively and economically present instructional material. MMITs are commonly used in introductory computer applications courses, as MMITs should be effective in increasing student knowledge and positively impacting motivation and learning strategies, without increasing costs. This…

  17. A Study on the Methods of Assessment and Strategy of Knowledge Sharing in Computer Course

    ERIC Educational Resources Information Center

    Chan, Pat P. W.

    2014-01-01

    With the advancement of information and communication technology, collaboration and knowledge sharing through technology is facilitated which enhances the learning process and improves the learning efficiency. The purpose of this paper is to review the methods of assessment and strategy of collaboration and knowledge sharing in a computer course,…

  18. A Survey of Knowledge Management Skills Acquisition in an Online Team-Based Distributed Computing Course

    ERIC Educational Resources Information Center

    Thomas, Jennifer D. E.

    2007-01-01

    This paper investigates students' perceptions of their acquisition of knowledge management skills, namely thinking and team-building skills, resulting from the integration of various resources and technologies into an entirely team-based, online upper level distributed computing (DC) information systems (IS) course. Results seem to indicate that…

  19. A Comparison of the Educational Effectiveness of Online versus In-Class Computer Literacy Courses

    ERIC Educational Resources Information Center

    Heithecker, Julia Ann

    2013-01-01

    The purpose of this quantitative study was to compare the educational effectiveness of online versus in-class computer literacy courses, and examine the impact, if any, of student demographics (delimited to gender, age, work status, father and mother education, and enrollment status). Institutions are seeking ways to produce technologically…

  20. Development of Online Cognitive and Algorithm Tests as Assessment Tools in Introductory Computer Science Courses

    ERIC Educational Resources Information Center

    Avancena, Aimee Theresa; Nishihara, Akinori; Vergara, John Paul

    2012-01-01

    This paper presents the online cognitive and algorithm tests, which were developed in order to determine if certain cognitive factors and fundamental algorithms correlate with the performance of students in their introductory computer science course. The tests were implemented among Management Information Systems majors from the Philippines and…

  1. Developing and Validating Test Items for First-Year Computer Science Courses

    ERIC Educational Resources Information Center

    Vahrenhold, Jan; Paul, Wolfgang

    2014-01-01

    We report on the development, validation, and implementation of a collection of test items designed to detect misconceptions related to first-year computer science courses. To this end, we reworked the development scheme proposed by Almstrum et al. ("SIGCSE Bulletin" 38(4):132-145, 2006) to include students' artifacts and to…

  2. Effects of Multidimensional Concept Maps on Fourth Graders' Learning in Web-Based Computer Course

    ERIC Educational Resources Information Center

    Huang, Hwa-Shan; Chiou, Chei-Chang; Chiang, Heien-Kun; Lai, Sung-Hsi; Huang, Chiun-Yen; Chou, Yin-Yu

    2012-01-01

    This study explores the effect of multidimensional concept mapping instruction on students' learning performance in a web-based computer course. The subjects consisted of 103 fourth graders from an elementary school in central Taiwan. They were divided into three groups: multidimensional concept map (MCM) instruction group, Novak concept map (NCM)…

  3. Integration of Major Computer Program Packages into Experimental Courses: A Freshman Experience.

    ERIC Educational Resources Information Center

    Lipschitz, Irving

    1981-01-01

    Describes the use of the Gaussian 70 computer programs to carry out quantum chemical calculations, including single calculations, geometry optimization, and potential surface scans. Includes a summary of student activities and benefits for students in an honors freshman chemistry course. (SK)

  4. A Computer-Assisted-Instruction Course in Vocabulary Building through Latin and Greek Roots

    ERIC Educational Resources Information Center

    Scanlan, Richard T.

    1976-01-01

    A course in the enlargement of students' English vocabulary through the study of Latin and Greek roots and their derivatives was developed by the Department of Classics at the University of Illinois. The class makes use of computer assisted instruction on the PLATO IV system. (Author/RM)

  5. Spiral and Project-Based Learning with Peer Assessment in a Computer Science Project Management Course

    ERIC Educational Resources Information Center

    Jaime, Arturo; Blanco, José Miguel; Domínguez, César; Sánchez, Ana; Heras, Jónathan; Usandizaga, Imanol

    2016-01-01

    Different learning methods such as project-based learning, spiral learning and peer assessment have been implemented in science disciplines with different outcomes. This paper presents a proposal for a project management course in the context of a computer science degree. Our proposal combines three well-known methods: project-based learning,…

  6. The Use of Engineering Design Concept for Computer Programming Course: A Model of Blended Learning Environment

    ERIC Educational Resources Information Center

    Tritrakan, Kasame; Kidrakarn, Pachoen; Asanok, Manit

    2016-01-01

    The aim of this research is to develop a learning model which blends factors from learning environment and engineering design concept for learning in computer programming course. The usage of the model was also analyzed. This study presents the design, implementation, and evaluation of the model. The research methodology is divided into three…

  7. Computer Aided Instruction for a Course in Boolean Algebra and Logic Design. Final Report (Revised).

    ERIC Educational Resources Information Center

    Roy, Rob

    The use of computers to prepare deficient college and graduate students for courses that build upon previously acquired information would solve the growing problem of professors who must spend up to one third of their class time in review of material. But examination of students who were taught Boolean Algebra and Logic Design by means of Computer…

  8. Community College Uses a Video-Game Lab to Lure Students to Computer Courses

    ERIC Educational Resources Information Center

    Young, Jeffrey R.

    2007-01-01

    A computer lab has become one of the most popular hangouts at Northern Virginia Community College after officials decided to load its PCs with popular video games, install a PlayStation and an Xbox, and declare it "for gamers only." The goal of this lab is to entice students to take game-design and other IT courses. John Min, dean of business…

  9. Performance Measures in Courses Using Computer-Aided Personalized System of Instruction

    ERIC Educational Resources Information Center

    Springer, C. R.; Pear, J. J.

    2008-01-01

    Archived data from four courses taught with computer-aided personalized system of instruction (CAPSI)--an online, self-paced, instructional program--were used to explore the relationship between objectively rescored final exam grades, peer reviewing, and progress rate--i.e., the rate at which students completed unit tests. There was a strong…

  10. Use of Standardized Test Scores to Predict Success in a Computer Applications Course

    ERIC Educational Resources Information Center

    Harris, Robert V.; King, Stephanie B.

    2016-01-01

    The purpose of this study was to see if a relationship existed between American College Testing (ACT) scores (i.e., English, reading, mathematics, science reasoning, and composite) and student success in a computer applications course at a Mississippi community college. The study showed that while the ACT scores were excellent predictors of…

  11. Active and Collaborative Learning in an Introductory Electrical and Computer Engineering Course

    ERIC Educational Resources Information Center

    Kotru, Sushma; Burkett, Susan L.; Jackson, David Jeff

    2010-01-01

    Active and collaborative learning instruments were introduced into an introductory electrical and computer engineering course. These instruments were designed to assess specific learning objectives and program outcomes. Results show that students developed an understanding comparable to that of more advanced students assessed later in the…

  12. Measuring Computer Science Knowledge Level of Hungarian Students Specialized in Informatics with Romanian Students Attending a Science Course or a Mathematics-Informatics Course

    ERIC Educational Resources Information Center

    Kiss, Gabor

    2012-01-01

    An analysis of the Information Technology knowledge of Hungarian and Romanian students was made with the help of a self-developed, web-based Informatics Test. The goal of this research is an analysis of the Computer Science knowledge level of Hungarian and Romanian students attending a Science course or a Mathematics-Informatics course. Analysed was…

  13. Get the Whole Story before You Plug into a Computer Network.

    ERIC Educational Resources Information Center

    Vernot, David

    1989-01-01

    Explains the myths and marvels of computer networks; cites how several schools are utilizing networking; and summarizes where the major computer companies stand today when it comes to networking. (MLF)

  14. Culture, Role and Group Work: A Social Network Analysis Perspective on an Online Collaborative Course

    ERIC Educational Resources Information Center

    Stepanyan, Karen; Mather, Richard; Dalrymple, Roger

    2014-01-01

    This paper discusses the patterns of network dynamics within a multicultural online collaborative learning environment. It analyses the interaction of participants (both students and facilitators) within a discussion board that was established as part of a 3-month online collaborative course. The study employs longitudinal probabilistic social…

  15. Life-Course Events, Social Networks, and the Emergence of Violence among Female Gang Members

    ERIC Educational Resources Information Center

    Fleisher, Mark S.; Krienert, Jessie L.

    2004-01-01

    Using data gathered from a multi-year field study, this article identifies specific life-course events shared by gang-affiliated women. Gangs emerge as a cultural adaptation or pro-social community response to poverty and racial isolation. Through the use of a social-network approach, data show that violence dramatically increases in the period…

  16. Design of a Competitive and Collaborative Learning Strategy in a Communication Networks Course

    ERIC Educational Resources Information Center

    Regueras, L. M.; Verdu, E.; Verdu, M. J.; de Castro, J. P.

    2011-01-01

    In this paper, an educational methodology based on collaborative and competitive learning is proposed. The suggested approach has been successfully applied to an undergraduate communication networks course, which is part of the core curriculum of the three-year degree in telecommunications engineering at the University of Valladolid in Spain. This…

  17. Computing Path Tables for Quickest Multipaths In Computer Networks

    SciTech Connect

    Grimmell, W.C.

    2004-12-21

    We consider the transmission of a message from a source node to a terminal node in a network with n nodes and m links, where the message is divided into parts and each part is transmitted over a different path in a set of paths from the source node to the terminal node. Each link is characterized by a bandwidth and a delay. The set of paths, together with the transmission rates used for the message, is referred to as a multipath. We present two algorithms that produce a minimum end-to-end message delay multipath path table that, for every message length, specifies a multipath that will achieve the minimum end-to-end delay. The algorithms also generate a function that maps the minimum end-to-end message delay to the message length. The time complexities of the algorithms are O(n^2((n^2/log n) + m) min(D_max, C_max)) and O(nm(C_max + n min(D_max, C_max))) when the link delays and bandwidths are non-negative integers. Here D_max and C_max are, respectively, the maximum link delay and the maximum link bandwidth, both greater than zero.
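
    A short worked sketch of the underlying delay model may help: assuming the usual quickest-path formulation in which a path with end-to-end delay d and bottleneck bandwidth b delivers x units of data in d + x/b time, splitting a message of length sigma across a set of paths and equalising completion times gives the minimum delay computed below. The paper's algorithms go further and build a table of optimal multipaths for every message length; the candidate paths and numbers here are placeholders.

```python
# Sketch of the quickest-multipath delay model under the assumption that a
# path with delay d and bottleneck bandwidth b delivers x units in d + x/b.
# Given a fixed candidate path set, the split below equalises completion
# times; this is not the paper's path-table algorithm.

def quickest_multipath_delay(paths, sigma):
    """paths: list of (delay, bandwidth) pairs; sigma: message length.
    Returns the minimum end-to-end delay using a prefix of the paths
    ordered by increasing delay."""
    paths = sorted(paths)                    # try low-delay paths first
    best = float("inf")
    sum_b = sum_bd = 0.0
    for d, b in paths:
        sum_b += b
        sum_bd += b * d
        t = (sigma + sum_bd) / sum_b         # equalised completion time
        if t >= d:                           # no path gets a negative share
            best = min(best, t)
    return best

paths = [(1.0, 1.0), (10.0, 100.0)]          # (delay, bandwidth) pairs
for sigma in (5.0, 2000.0):
    print(sigma, quickest_multipath_delay(paths, sigma))
```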

  18. Computer aided nonlinear electrical networks analysis

    NASA Technical Reports Server (NTRS)

    Slapnicar, P.

    1977-01-01

    Techniques used in simulating an electrical circuit with nonlinear elements for use in computer-aided circuit analysis programs are described. Elements of the circuit include capacitors, resistors, inductors, transistors, diodes, and voltage and current sources (constant or time varying). Simulation features are discussed for dc, ac, and/or transient circuit analysis. Calculations are based on the model approach of formulating the circuit equations. A particular solution of transient analysis for nonlinear storage elements is described.

  19. Six Networks on a Universal Neuromorphic Computing Substrate

    PubMed Central

    Pfeil, Thomas; Grübl, Andreas; Jeltsch, Sebastian; Müller, Eric; Müller, Paul; Petrovici, Mihai A.; Schmuker, Michael; Brüderle, Daniel; Schemmel, Johannes; Meier, Karlheinz

    2013-01-01

    In this study, we present a highly configurable neuromorphic computing substrate and use it for emulating several types of neural networks. At the heart of this system lies a mixed-signal chip, with analog implementations of neurons and synapses and digital transmission of action potentials. Major advantages of this emulation device, which has been explicitly designed as a universal neural network emulator, are its inherent parallelism and high acceleration factor compared to conventional computers. Its configurability allows the realization of almost arbitrary network topologies and the use of widely varied neuronal and synaptic parameters. Fixed-pattern noise inherent to analog circuitry is reduced by calibration routines. An integrated development environment allows neuroscientists to operate the device without any prior knowledge of neuromorphic circuit design. As a showcase for the capabilities of the system, we describe the successful emulation of six different neural networks which cover a broad spectrum of both structure and functionality. PMID:23423583

  20. Interactive knowledge networks for interdisciplinary course navigation within Moodle.

    PubMed

    Scherl, Andre; Dethleffsen, Kathrin; Meyer, Michael

    2012-12-01

    Web-based hypermedia learning environments are widely used in modern education and seem particularly well suited for interdisciplinary learning. Previous work has identified guidance through these complex environments as a crucial problem of their acceptance and efficiency. We reasoned that map-based navigation might provide straightforward and effortless orientation. To achieve this, we developed a clickable and user-oriented concept map-based navigation plugin. This tool is implemented as an extension of Moodle, a widely used learning management system. It visualizes inner and interdisciplinary relations between learning objects and is generated dynamically depending on user set parameters and interactions. This plugin leaves the choice of navigation type to the user and supports direct guidance. Previously developed and evaluated face-to-face interdisciplinary learning materials bridging physiology and physics courses of a medical curriculum were integrated as learning objects, the relations of which were defined by metadata. Learning objects included text pages, self-assessments, videos, animations, and simulations. In a field study, we analyzed the effects of this learning environment on physiology and physics knowledge as well as the transfer ability of third-term medical students. Data were generated from pre- and posttest questionnaires and from tracking student navigation. Use of the hypermedia environment resulted in a significant increase of knowledge and transfer capability. Furthermore, the efficiency of learning was enhanced. We conclude that hypermedia environments based on Moodle and enriched by concept map-based navigation tools can significantly support interdisciplinary learning. Implementation of adaptivity may further strengthen this approach. PMID:23209009

  1. Synchronization-based computation through networks of coupled oscillators

    PubMed Central

    Malagarriga, Daniel; García-Vellisca, Mariano A.; Villa, Alessandro E. P.; Buldú, Javier M.; García-Ojalvo, Jordi; Pons, Antonio J.

    2015-01-01

    The mesoscopic activity of the brain is strongly dynamical, while at the same time exhibits remarkable computational capabilities. In order to examine how these two features coexist, here we show that the patterns of synchronized oscillations displayed by networks of neural mass models, representing cortical columns, can be used as substrates for Boolean-like computations. Our results reveal that the same neural mass network may process different combinations of dynamical inputs as different logical operations or combinations of them. This dynamical feature of the network allows it to process complex inputs in a very sophisticated manner. The results are reproduced experimentally with electronic circuits of coupled Chua oscillators, showing the robustness of this kind of computation to the intrinsic noise and parameter mismatch of the coupled oscillators. We also show that the information-processing capabilities of coupled oscillations go beyond the simple juxtaposition of logic gates. PMID:26300765
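
    As a toy illustration of computation through synchronization, the sketch below couples two Kuramoto phase oscillators rather than the neural mass models or Chua circuits of the study: input bits are encoded as frequency detunings and the output is read out as "phase-locked or not". With these arbitrary parameters the pair locks only when the two inputs are equal, so the readout behaves like an XNOR gate.

```python
# Toy Boolean-like computation via synchronisation of two Kuramoto phase
# oscillators.  All parameters are arbitrary; this is not the neural mass /
# Chua-circuit setup of the cited study.
import math

def xnor_by_sync(a, b, coupling=0.5, detuning=2.0, dt=0.01, steps=20000):
    w1 = 1.0 + a * detuning          # input bits shift the natural frequencies
    w2 = 1.0 + b * detuning
    th1, th2 = 0.0, 1.0
    history = []
    for step in range(steps):        # forward-Euler integration
        d = th2 - th1
        th1 += dt * (w1 + coupling * math.sin(d))
        th2 += dt * (w2 - coupling * math.sin(d))
        if step >= steps // 2:
            history.append(th2 - th1)
    # locked if the phase difference stops drifting in the second half
    locked = abs(history[-1] - history[0]) < 0.1
    return int(locked)

for a in (0, 1):
    for b in (0, 1):
        print(a, b, "->", xnor_by_sync(a, b))
```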

  3. Connectomic constraints on computation in feedforward networks of spiking neurons.

    PubMed

    Ramaswamy, Venkatakrishnan; Banerjee, Arunava

    2014-10-01

    Several efforts are currently underway to decipher the connectome, or parts thereof, in a variety of organisms. Ascertaining the detailed physiological properties of all the neurons in these connectomes, however, is out of the scope of such projects. It is therefore unclear to what extent knowledge of the connectome alone will advance a mechanistic understanding of computation occurring in these neural circuits, especially when the high-level function of the said circuit is unknown. We consider, here, the question of how the wiring diagram of neurons imposes constraints on what neural circuits can compute, when we cannot assume detailed information on the physiological response properties of the neurons. We call such constraints, which arise by virtue of the connectome, connectomic constraints on computation. For feedforward networks equipped with neurons that obey a deterministic spiking neuron model which satisfies a small number of properties, we ask if, just by knowing the architecture of a network, we can rule out computations that it could be doing, no matter what response properties each of its neurons may have. We show results of this form for certain classes of network architectures. On the other hand, we also prove that with the limited set of properties assumed for our model neurons, there are fundamental limits to the constraints imposed by network structure. Thus, our theory suggests that while connectomic constraints might restrict the computational ability of certain classes of network architectures, we may require more elaborate information on the properties of neurons in the network before we can discern such results for other classes of networks.

  4. Distributed Computing and MEMS Accelerometers: The Quake Catcher Network

    NASA Astrophysics Data System (ADS)

    Lawrence, J. F.; Cochran, E. S.; Christensen, C.; Jakka, R. S.

    2008-12-01

    Recent advances in distributed computing provide exciting opportunities for seismic data collection. We are in the early stages of implementing a high density, low cost strong-motion network for rapid response and early warning by placing accelerometers in schools, homes, offices, government buildings, fire houses and more. The Quake Catcher Network (QCN) employs existing networked laptops and desktops to form a dense, distributed computing seismic network. Costs for this network are minimal because the QCN uses 1) strong motion sensors (accelerometers) already internal to many laptops and 2) low-cost universal serial bus (USB) accelerometers for use with desktops. The Berkeley Open Infrastructure for Network Computing (BOINC!) provides a free, proven paradigm for involving the public in large-scale computational research projects. The QCN leverages public participation to fully implement the seismic network. As such engaging the public to participate in seismic data collection is not only an integral part of the project, but an added value to the QCN. The software provides the client-user with a screen-saver displaying seismic data recorded on their laptop or recently detected earthquakes. Furthermore, this project installs sensors in K-12 classrooms as an educational tool for teaching science. Through a variety of interactive experiments students can learn about earthquakes and the hazards earthquakes pose. In the first six months of limited release of the QCN software, we successfully received triggers and waveforms from laptops near the M 4.7 April 25, 2008 earthquake in Reno, Nevada and the M 5.4 July 29, 2008 earthquake in Chino, California (as well as a few 3.6 and higher events). This fall we continued to expand the network further by installing seismometers in K-12 schools, museums, and government buildings in the greater Los Angeles basin and the San Francisco Bay Area. By summer 2009 we expect to have 1000 USB sensors deployed in California, in addition

  5. Creating a two-layered augmented artificial immune system for application to computer network intrusion detection

    NASA Astrophysics Data System (ADS)

    Judge, Matthew G.; Lamont, Gary B.

    2009-05-01

    Computer network security has become a very serious concern of commercial, industrial, and military organizations due to the increasing number of network threats such as outsider intrusions and insider covert activities. An important security element, of course, is network intrusion detection, which is a difficult real-world problem that has been addressed through many different solution attempts. Using an artificial immune system has been shown to be one of the most promising approaches. By enhancing jREMISA, a multi-objective evolutionary-algorithm-inspired artificial immune system, with a secondary defense layer, we produce improved accuracy of intrusion classification and flexibility in responsiveness. This responsiveness can be leveraged to provide a much more powerful and accurate system through the use of increased processing time and dedicated hardware, which has the flexibility of being located out of band.

  6. Simulator Building as a Problem-Based Learning Approach for Teaching Students in a Computer Architecture Course

    ERIC Educational Resources Information Center

    Ang, Li-minn; Seng, Kah Phooi

    2008-01-01

    This paper presents a Problem-Based Learning (PBL) approach to support and promote deeper student learning in a computer architecture course. A PBL approach using a simulator building activity was used to teach part of the course. The remainder of the course was taught using a traditional lecture-tutorial approach. Qualitative data was collected…

  7. Computational approach in estimating the need of ditch network maintenance

    NASA Astrophysics Data System (ADS)

    Lauren, Ari; Hökkä, Hannu; Launiainen, Samuli; Palviainen, Marjo; Repo, Tapani; Finer, Leena; Piirainen, Sirpa

    2015-04-01

    Ditch network maintenance (DNM), implemented annually on a 70 000 ha area in Finland, is the most controversial of all forest management practices. Nationwide, it is estimated to increase forest growth by 1…3 million m3 per year, but simultaneously to cause the export of 65 000 tons of suspended solids and 71 tons of phosphorus (P) to watercourses. A systematic approach that allows simultaneous quantification of the positive and negative effects of DNM is required. Excess water in the rooting zone slows gas exchange and decreases biological activity, interfering with forest growth in boreal forested peatlands. DNM is needed when: 1) excess water in the rooting zone restricts forest growth before the DNM, 2) after the DNM the growth restriction ceases or decreases, and 3) the benefits of DNM are greater than the adverse effects caused. Aeration in the rooting zone can be used as a drainage criterion. Aeration is affected by several factors such as meteorological conditions, tree stand properties, hydraulic properties of peat, ditch depth, and ditch spacing. We developed a 2-dimensional DNM simulator that allows the user to adjust these factors and to evaluate their effect on soil aeration at different distances from the drainage ditch. The DNM simulator computes hydrological processes and soil aeration along a water flowpath between two ditches. Applying a daily time step, it calculates evapotranspiration, snow accumulation and melt, infiltration, soil water storage, ground water level, soil water content, air-filled porosity, and runoff. The model's hydrological performance has been tested against independent high-frequency field monitoring data. Soil aeration at different distances from the ditch is computed under a steady-state assumption using an empirical oxygen consumption model, simulated air-filled porosity, and the diffusion coefficient at different depths in the soil. Aeration is adequate and forest growth rate is not limited by poor aeration if the
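
    As a rough illustration of the daily bookkeeping such a simulator performs, the sketch below reduces the problem to a single soil "bucket" between two ditches with made-up process equations and parameters; it is only meant to show the structure of a daily water-balance loop, not the DNM simulator's 2-dimensional flowpath model.

```python
# Minimal daily water-balance bookkeeping for one soil "bucket"; the process
# equations and parameters are placeholders, not those of the DNM simulator.

def daily_water_balance(days, rain, pet, storage_max=150.0, storage0=100.0,
                        drain_coef=0.05):
    """rain, pet: daily precipitation and potential evapotranspiration (mm)."""
    storage = storage0
    results = []
    for day in range(days):
        et = min(pet[day], storage)                # actual evapotranspiration
        storage += rain[day] - et
        runoff = max(0.0, storage - storage_max)   # saturation excess
        storage = min(storage, storage_max)
        drainage = drain_coef * storage            # lateral flow towards the ditch
        storage -= drainage
        air_filled = 1.0 - storage / storage_max   # crude aeration proxy
        results.append((day, storage, runoff + drainage, air_filled))
    return results

rain = [5.0 if d % 3 == 0 else 0.0 for d in range(30)]
pet = [2.0] * 30
for day, storage, outflow, air in daily_water_balance(30, rain, pet)[:5]:
    print(f"day {day}: storage={storage:6.1f} mm  outflow={outflow:5.2f} mm  "
          f"air-filled fraction={air:4.2f}")
```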

  8. Designing for deeper learning in a blended computer science course for middle school students

    NASA Astrophysics Data System (ADS)

    Grover, Shuchi; Pea, Roy; Cooper, Stephen

    2015-04-01

    The focus of this research was to create and test an introductory computer science course for middle school. Titled "Foundations for Advancing Computational Thinking" (FACT), the course aims to prepare and motivate middle school learners for future engagement with algorithmic problem solving. FACT was also piloted as a seven-week course on Stanford's OpenEdX MOOC platform for blended in-class learning. Unique aspects of FACT include balanced pedagogical designs that address the cognitive, interpersonal, and intrapersonal aspects of "deeper learning"; a focus on pedagogical strategies for mediating and assessing for transfer from block-based to text-based programming; curricular materials for remedying misperceptions of computing; and "systems of assessments" (including formative and summative quizzes and tests, directed as well as open-ended programming assignments, and a transfer test) to get a comprehensive picture of students' deeper computational learning. Empirical investigations, accomplished over two iterations of a design-based research effort with students (aged 11-14 years) in a public school, sought to examine student understanding of algorithmic constructs, and how well students transferred this learning from Scratch to text-based languages. Changes in student perceptions of computing as a discipline were measured. Results and mixed-method analyses revealed that students in both studies (1) achieved substantial learning gains in algorithmic thinking skills, (2) were able to transfer their learning from Scratch to a text-based programming context, and (3) achieved significant growth toward a more mature understanding of computing as a discipline. Factor analyses of prior computing experience, multivariate regression analyses, and qualitative analyses of student projects and artifact-based interviews were conducted to better understand the factors affecting learning outcomes. Prior computing experiences (as measured by a pretest) and math ability were

  9. A local area computer network expert system framework

    NASA Technical Reports Server (NTRS)

    Dominy, Robert

    1987-01-01

    Over the past years, an expert system called LANES was developed to detect and isolate faults in the Goddard-wide Hybrid Local Area Computer Network (LACN). As a result, the need for developing a more generic LACN fault isolation expert system has become apparent. An object-oriented approach was explored to create a set of generic classes, objects, rules, and methods that would be necessary to meet this need. The object classes provide a convenient mechanism for separating high-level information from low-level, network-specific information. This approach yields a framework which can be applied to different network configurations and be easily expanded to meet new needs.

  10. Building Social Networks with Computer Networks: A New Deal for Teaching and Learning.

    ERIC Educational Resources Information Center

    Thurston, Thomas

    2001-01-01

    Discusses the role of computer technology and Web sites in expanding social networks. Focuses on the New Deal Network using two examples: (1) uniting a Julia C. Lathrop Housing (Chicago, Illinois) resident with a university professor; and (2) saving the Hugo Gellert art murals at the Seward Park Coop Apartments (New York). (CMK)

  11. Test experience on an ultrareliable computer communication network

    NASA Technical Reports Server (NTRS)

    Abbott, L. W.

    1984-01-01

    The dispersed sensor processing mesh (DSPM) is an experimental, ultra-reliable, fault-tolerant computer communications network that exhibits an organic-like ability to regenerate itself after suffering damage. The regeneration is accomplished by two routines - grow and repair. This paper discusses the DSPM concept for achieving fault tolerance and provides a brief description of the mechanization of both the experiment and the six-node experimental network. The main topic of this paper is the system performance of the growth algorithm contained in the grow routine. The characteristics imbued to DSPM by the growth algorithm are also discussed. Data from an experimental DSPM network and software simulation of larger DSPM-type networks are used to examine the inherent limitation on growth time by the growth algorithm and the relationship of growth time to network size and topology.

  13. Quantum computer networks with the orbital angular momentum of light

    NASA Astrophysics Data System (ADS)

    Garcia-Escartin, Juan Carlos; Chamorro-Posada, Pedro

    2012-09-01

    Inside computer networks, different information processing tasks are necessary to deliver the user data efficiently. This processing can also be done in the quantum domain. We present simple optical quantum networks where the orbital angular momentum (OAM) of a single photon is used as an ancillary degree of freedom which controls decisions at the network level. Linear optical elements are enough to provide important network primitives such as multiplexing and routing. First we show how to build a simple multiplexer and demultiplexer which combine photonic qubits and separate them again at the receiver. We also give two different self-routing networks where the OAM of an input photon is enough to make it find its desired destination.

  14. Navigating Traditional Chinese Medicine Network Pharmacology and Computational Tools

    PubMed Central

    Chen, Jia-Lei; Xu, Li-Wen

    2013-01-01

    The concept of “network target” has ushered in a new era in the field of traditional Chinese medicine (TCM). As a new research approach, network pharmacology is based on the analysis of network models and systems biology. Taking advantage of advancements in systems biology, a high degree of integration data analysis strategy and interpretable visualization provides deeper insights into the underlying mechanisms of TCM theories, including the principles of herb combination, biological foundations of herb or herbal formulae action, and molecular basis of TCM syndromes. In this study, we review several recent developments in TCM network pharmacology research and discuss their potential for bridging the gap between traditional and modern medicine. We briefly summarize the two main functional applications of TCM network models: understanding/uncovering and predicting/discovering. In particular, we focus on how TCM network pharmacology research is conducted and highlight different computational tools, such as network-based and machine learning algorithms, and sources that have been proposed and applied to the different steps involved in the research process. To make network pharmacology research commonplace, some basic network definitions and analysis methods are presented. PMID:23983798

  15. Analytical Computation of the Epidemic Threshold on Temporal Networks

    NASA Astrophysics Data System (ADS)

    Valdano, Eugenio; Ferreri, Luca; Poletto, Chiara; Colizza, Vittoria

    2015-04-01

    The time variation of contacts in a networked system may fundamentally alter the properties of spreading processes and affect the condition for large-scale propagation, as encoded in the epidemic threshold. Despite the great interest in the problem for the physics, applied mathematics, computer science, and epidemiology communities, a full theoretical understanding is still missing and currently limited to the cases where the time-scale separation holds between spreading and network dynamics or to specific temporal network models. We consider a Markov chain description of the susceptible-infectious-susceptible process on an arbitrary temporal network. By adopting a multilayer perspective, we develop a general analytical derivation of the epidemic threshold in terms of the spectral radius of a matrix that encodes both network structure and disease dynamics. The accuracy of the approach is confirmed on a set of temporal models and empirical networks and against numerical results. In addition, we explore how the threshold changes when varying the overall time of observation of the temporal network, so as to provide insights on the optimal time window for data collection of empirical temporal networked systems. Our framework is of both fundamental and practical interest, as it offers novel understanding of the interplay between temporal networks and spreading dynamics.
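
    The threshold condition described above can be written, for a discrete-time susceptible-infectious-susceptible process over a sequence of adjacency snapshots A_t, in terms of the spectral radius of an "infection propagator" built from network structure and disease parameters. The sketch below follows that general idea; the per-snapshot linearisation, the parameter names, and the random snapshots are simplified assumptions rather than the paper's exact multilayer construction.

```python
# Sketch of a temporal-network epidemic threshold: linearising a discrete-time
# SIS process around the disease-free state gives a propagator
# P = prod_t [(1 - mu) I + lam * A_t]; spreading roughly takes off when its
# spectral radius exceeds 1.  Simplified illustration, not the paper's model.
import numpy as np

def spectral_radius_propagator(snapshots, lam, mu):
    n = snapshots[0].shape[0]
    p = np.eye(n)
    for a in snapshots:
        p = ((1.0 - mu) * np.eye(n) + lam * a) @ p
    return np.max(np.abs(np.linalg.eigvals(p)))

def critical_lambda(snapshots, mu, lo=0.0, hi=1.0, tol=1e-4):
    """Bisect for the transmissibility at which the spectral radius crosses 1."""
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if spectral_radius_propagator(snapshots, mid, mu) < 1.0:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

rng = np.random.default_rng(0)
# three random temporal snapshots of a 50-node contact network
snapshots = [(rng.random((50, 50)) < 0.05).astype(float) for _ in range(3)]
snapshots = [np.triu(a, 1) + np.triu(a, 1).T for a in snapshots]  # symmetrise
print("estimated epidemic threshold:", critical_lambda(snapshots, mu=0.5))
```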

  16. Propagation of computer virus both across the Internet and external computers: A complex-network approach

    NASA Astrophysics Data System (ADS)

    Gan, Chenquan; Yang, Xiaofan; Liu, Wanping; Zhu, Qingyi; Jin, Jian; He, Li

    2014-08-01

    Based on the assumption that external computers (particularly, infected external computers) are connected to the Internet, and by considering the influence of the Internet topology on computer virus spreading, this paper establishes a novel computer virus propagation model with a complex-network approach. This model possesses a unique (viral) equilibrium which is globally attractive. Some numerical simulations are also given to illustrate this result. Further study shows that the computers with higher node degrees are more susceptible to infection than those with lower node degrees. In this regard, some appropriate protective measures are suggested.
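
    To illustrate the degree effect reported above (computers with higher node degrees are more susceptible), the sketch below runs a generic SIS-style spread on a scale-free graph and compares how long hubs versus ordinary nodes stay infected. The dynamics and parameters are a simplification chosen for illustration and omit the external-computer compartment of the cited model.

```python
# Generic SIS-style virus spread on a scale-free graph, used only to show the
# degree effect; not the external-computer model of the cited paper.
import random
import networkx as nx

def sis_run(g, beta=0.05, gamma=0.2, steps=200, seed=0):
    rng = random.Random(seed)
    infected = set(rng.sample(list(g), 10))        # start from 10 infected computers
    time_infected = {n: 0 for n in g}
    for _ in range(steps):
        new_infected = set(infected)
        for n in infected:
            for nb in g[n]:                        # try to infect neighbours
                if nb not in infected and rng.random() < beta:
                    new_infected.add(nb)
            if rng.random() < gamma:               # recovery back to susceptible
                new_infected.discard(n)
        infected = new_infected
        for n in infected:
            time_infected[n] += 1
    return time_infected

g = nx.barabasi_albert_graph(1000, 3, seed=1)
time_infected = sis_run(g)
hubs = set(sorted(g, key=g.degree, reverse=True)[:50])   # 50 highest-degree nodes
others = [n for n in g if n not in hubs]
mean = lambda nodes: sum(time_infected[n] for n in nodes) / len(nodes)
print("mean time infected, top-50 degree nodes:", mean(hubs))
print("mean time infected, remaining nodes:    ", mean(others))
```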

  17. Overview of the human brain as a distributed computing network

    SciTech Connect

    Gevins, A.S.

    1983-01-01

    The hierarchically organized human brain is viewed as a prime example of a massively parallel, adaptive information processing and process control system. A brief overview of the human brain is provided for computer architects, in hopes that the principles of massive parallelism, dense connectivity and self-organization of assemblies of processing elements will prove relevant to the design of fifth generation VLSI computing networks. 6 references.

  18. Multiple-server Flexible Blind Quantum Computation in Networks

    NASA Astrophysics Data System (ADS)

    Kong, Xiaoqin; Li, Qin; Wu, Chunhui; Yu, Fang; He, Jinjun; Sun, Zhiyuan

    2016-06-01

    Blind quantum computation (BQC) allows a client with limited quantum power to delegate his quantum computation to a powerful server while keeping his own data private. In this paper, we present a multiple-server flexible BQC protocol, where a client who only needs the ability to access quantum channels can delegate the computational task to a number of servers. In particular, the client's quantum computation can be achieved even when one or more of the delegated quantum servers break down in the network: when connections to certain quantum servers are lost, clients can adjust flexibly and delegate their quantum computation to other servers. Trivially, the computation will be unsuccessful if all servers are interrupted.

  19. Design and Development of a Sample "Computer Programming" Course Tool via Story-Based E-Learning Approach

    ERIC Educational Resources Information Center

    Kose, Utku; Koc, Durmus; Yucesoy, Suleyman Anil

    2013-01-01

    This study introduces a story-based e-learning oriented course tool that was designed and developed for using within "computer programming" courses. With this tool, students can easily adapt themselves to the subjects in the context of computer programming principles, thanks to the story-based, interactive processes. By using visually…

  20. The Use of a PDP-11/20 Computer in a Non-Calculus General Physics Course.

    ERIC Educational Resources Information Center

    Yu, David U. L.

    Computer-assisted instruction supplements traditional methods in a non-calculus physics course offered at Seattle Pacific College. Thirty-five science majors enrolled in the first quarter and 32 continued in the second term. The hardware for the course consists of a PDP-11/20 computer and eight teletype terminals; additional peripheral equipment…

  1. A One-Credit Hands-On Introductory Course in Electrical and Computer Engineering Using a Variety of Topic Modules

    ERIC Educational Resources Information Center

    Pierre, J. W.; Tuffner, F. K.; Anderson, J. R.; Whitman, D. L.; Ula, A. H. M. S.; Kubichek, R. F.; Wright, C. H. G.; Barrett, S. F.; Cupal, J. J.; Hamann, J. C.

    2009-01-01

    This paper describes a one-credit laboratory course for freshmen majoring in electrical and computer engineering (ECE). The course is motivational in nature and exposes the students to a wide range of areas of electrical and computer engineering. The authors believe it is important to give freshmen a broad perspective of what ECE is all about, and…

  2. US computer research networks: Current and future

    NASA Technical Reports Server (NTRS)

    Kratochvil, D.; Sood, D.; Verostko, A.

    1989-01-01

    During the last decade, NASA LeRC's Communication Program has conducted a series of telecommunications forecasting studies to project trends and requirements and to identify critical telecommunications technologies that must be developed to meet future requirements. The Government Networks Division of Contel Federal Systems has assisted NASA in these studies, and the current study builds upon these earlier efforts. The current major thrust of the NASA Communications Program is aimed at developing the high risk, advanced, communications satellite and terminal technologies required to significantly increase the capacity of future communications systems. Also, major new technological, economic, and social-political events and trends are now shaping the communications industry of the future. Therefore, a re-examination of future telecommunications needs and requirements is necessary to enable NASA to make management decisions in its Communications Program and to ensure the proper technologies and systems are addressed. This study, through a series of Task Orders, is helping NASA define the likely communication service needs and requirements of the future and thereby ensuring that the most appropriate technology developments are pursued.

  3. Small-world networks in neuronal populations: a computational perspective.

    PubMed

    Zippo, Antonio G; Gelsomino, Giuliana; Van Duin, Pieter; Nencini, Sara; Caramenti, Gian Carlo; Valente, Maurizio; Biella, Gabriele E M

    2013-08-01

    The analysis of the brain in terms of integrated neural networks may offer insights on the reciprocal relation between structure and information processing. Even with inherent technical limits, many studies acknowledge neuron spatial arrangements and communication modes as key factors. In this perspective, we investigated the functional organization of neuronal networks by explicitly assuming a specific functional topology, the small-world network. We developed two different computational approaches. Firstly, we asked whether neuronal populations actually express small-world properties during a definite task, such as a learning task. For this purpose we developed the Inductive Conceptual Network (ICN), which is a hierarchical bio-inspired spiking network, capable of learning invariant patterns by using variable-order Markov models implemented in its nodes. As a result, we actually observed small-world topologies during learning in the ICN. Speculating that the expression of small-world networks is not solely related to learning tasks, we then built a de facto network assuming that the information processing in the brain may occur through functional small-world topologies. In this de facto network, synchronous spikes reflected functional small-world network dependencies. In order to verify the consistency of the assumption, we tested the null-hypothesis by replacing the small-world networks with random networks. As a result, only small world networks exhibited functional biomimetic characteristics such as timing and rate codes, conventional coding strategies and neuronal avalanches, which are cascades of bursting activities with a power-law distribution. Our results suggest that small-world functional configurations are liable to underpin brain information processing at neuronal level.
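
    The study checks whether learned networks express small-world properties; a generic way to diagnose small-world character in any graph is to compare its clustering coefficient and characteristic path length with those of a size- and density-matched random graph (high clustering with a comparable path length is the usual signature). The sketch below is that standard diagnostic, not the ICN implementation; the graph and parameters are placeholders.

```python
# Standard small-world diagnostic: compare clustering and average shortest
# path length against a random graph of the same size and density.
import networkx as nx

def avg_path_len(g):
    # use the largest connected component so the metric is always defined
    comp = max(nx.connected_components(g), key=len)
    return nx.average_shortest_path_length(g.subgraph(comp))

def small_world_summary(g, seed=0):
    rand = nx.gnm_random_graph(g.number_of_nodes(), g.number_of_edges(), seed=seed)
    return {
        "clustering": nx.average_clustering(g),
        "clustering_random": nx.average_clustering(rand),
        "path_length": avg_path_len(g),
        "path_length_random": avg_path_len(rand),
    }

g = nx.watts_strogatz_graph(200, 8, 0.1, seed=1)   # placeholder test graph
for key, value in small_world_summary(g).items():
    print(f"{key:20s} {value:.3f}")
```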

  4. Performance Evaluation in Network-Based Parallel Computing

    NASA Technical Reports Server (NTRS)

    Dezhgosha, Kamyar

    1996-01-01

    Network-based parallel computing is emerging as a cost-effective alternative for solving many problems which require the use of supercomputers or massively parallel computers. The primary objective of this project has been to conduct experimental research on performance evaluation for clustered parallel computing. First, a testbed was established by augmenting our existing network of Sun SPARC workstations with PVM (Parallel Virtual Machine), a software system for linking clusters of machines. Second, a set of three basic applications was selected: a parallel search, a parallel sort, and a parallel matrix multiplication. These application programs were implemented in the C programming language under PVM. Third, we conducted performance evaluation under various configurations and problem sizes. Alternative parallel computing models and workload allocations for the application programs were explored. The performance metric was limited to elapsed time, or response time, which in the context of parallel computing can be expressed in terms of speedup. The results reveal that the overhead of communication latency between processes is in many cases the restricting factor to performance; that is, coarse-grain parallelism, which requires less frequent communication between processes, results in higher performance in network-based computing. Finally, we are in the final stages of installing an Asynchronous Transfer Mode (ATM) switch and four ATM interfaces (each 155 Mbps), which will allow us to extend our study to newer applications, performance metrics, and configurations.
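
    Since the abstract's performance metric reduces to speedup (serial elapsed time divided by parallel elapsed time), a small worked sketch may help. The "compute plus per-message latency" model and all numbers below are illustrative assumptions, not measurements from the study, but they show why coarse-grain decomposition with fewer messages tends to scale better on a network of workstations.

```python
# Speedup and efficiency from elapsed times, plus a rough latency model that
# illustrates the coarse- vs fine-grain observation.  Numbers are made up.

def speedup(t_serial, t_parallel, workers):
    s = t_serial / t_parallel
    return s, s / workers                 # speedup and parallel efficiency

def modeled_time(t_serial, workers, messages, latency):
    # equal work split plus a fixed latency cost per message exchanged
    return t_serial / workers + messages * latency

t1 = 120.0                                # hypothetical single-machine time (s)
for p in (2, 4, 8):
    fine = modeled_time(t1, p, messages=10000, latency=0.002)
    coarse = modeled_time(t1, p, messages=100, latency=0.002)
    sf, ef = speedup(t1, fine, p)
    sc, ec = speedup(t1, coarse, p)
    print(f"{p} workers: fine-grained speedup {sf:.2f} (eff {ef:.2f}), "
          f"coarse-grained speedup {sc:.2f} (eff {ec:.2f})")
```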

  5. Computer networks: making the decision to join one.

    PubMed

    Massy, W F

    1974-11-01

    I began this article with the thesis that the director of a university computer center is in a double bind. He is under increasing pressure because of competition with networks and minicomputers at the same time that his funding base is weakening. The breadth of demand for computer services, and the cost of developing new services, are increasing dramatically. The director is pressed by budget officers and internal economics to run more efficiently, but if in so doing he fails to meet new needs or downgrades effectiveness for some existing users he runs the risks of losing demand to the competition and hence worsening his immediate financial problems. The impact of networks on this state of affairs might be, briefly, as follows: 1) The centrally planned computer utility would take these pressures off the individual campus computer center and lodge them in a state, regional, or perhaps even a national network organization. While this might be desirable in some cases (depending on the scale of operations), I believe that economies of scale would tend to be more than offset by diseconomies in planning, management, and control; by a reduction of responsiveness to users' needs; and by a slowing of the rate of innovation in computing. 2) The distributive network substitutes a "market economy" for a centrally planned one. Subject to a certain amount of planning and regulation, which might be undertaken by colleges and universities themselves, individual researchers can tap larger markets for services, and participating institutions can obtain at least part of their computing needs on a variable cost basis at prices determined by competition. 3) Membership in a distributive network with sufficient breadth and depth of resources can emancipate the director of the computer center by widening options and allowing him to serve more effectively the steadily broadening range of legitimate academic and research computing needs without his having to stretch his internal resources

  6. Identifying failure in a tree network of a parallel computer

    DOEpatents

    Archer, Charles J.; Pinnow, Kurt W.; Wallenfelt, Brian P.

    2010-08-24

    Methods, parallel computers, and products are provided for identifying failure in a tree network of a parallel computer. The parallel computer includes one or more processing sets including an I/O node and a plurality of compute nodes. For each processing set embodiments include selecting a set of test compute nodes, the test compute nodes being a subset of the compute nodes of the processing set; measuring the performance of the I/O node of the processing set; measuring the performance of the selected set of test compute nodes; calculating a current test value in dependence upon the measured performance of the I/O node of the processing set, the measured performance of the set of test compute nodes, and a predetermined value for I/O node performance; and comparing the current test value with a predetermined tree performance threshold. If the current test value is below the predetermined tree performance threshold, embodiments include selecting another set of test compute nodes. If the current test value is not below the predetermined tree performance threshold, embodiments include selecting from the test compute nodes one or more potential problem nodes and testing individually potential problem nodes and links to potential problem nodes.
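
    A rough Python sketch of the decision loop described in the abstract is given below. The measurement routine, the way the three quantities are combined into a test value, and the threshold are all illustrative assumptions; the abstract does not commit to a particular formula.

      import random

      EXPECTED_IO_PERF = 100.0   # predetermined value for I/O node performance (assumed units)
      TREE_THRESHOLD = 0.8       # predetermined tree performance threshold (assumed)

      def measure(node):
          """Stand-in for an actual performance measurement of a node."""
          return random.uniform(50.0, 120.0)

      def current_test_value(io_node, test_nodes):
          # Illustrative combination of the measured I/O performance, the measured
          # test-node performance, and the predetermined I/O value.
          io_perf = measure(io_node)
          node_perf = sum(measure(n) for n in test_nodes) / len(test_nodes)
          return min(io_perf, node_perf) / EXPECTED_IO_PERF

      def find_potential_problem_nodes(io_node, compute_nodes, subset_size=4):
          remaining = list(compute_nodes)
          while remaining:
              subset, remaining = remaining[:subset_size], remaining[subset_size:]
              if current_test_value(io_node, subset) >= TREE_THRESHOLD:
                  # Per the abstract: when the test value is not below the threshold,
                  # this subset supplies the candidates tested individually as
                  # potential problem nodes (and links to them).
                  return subset
              # Otherwise select another set of test compute nodes and repeat.
          return []

      print(find_potential_problem_nodes("io0", [f"cn{i}" for i in range(16)]))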

  7. Curriculum Reform Research of Computer Network Technology Based on School-Enterprise Cooperation

    NASA Astrophysics Data System (ADS)

    Liu, Peng

    There is growing concern about falling levels of student engagement with school science, as evidenced by studies of student attitudes and decreasing participation at the post-compulsory level. The college-enterprise cooperation model is a new model for cultivating application-oriented talent in college through cooperation with enterprises. In this paper, we analyze the teaching problems in the course "Computer Network Technology" and propose guidelines grounded in teaching practice. We then explore reform approaches to enhance students' self-learning ability. Finally, conclusions are drawn.

  8. Parallelized reliability estimation of reconfigurable computer networks

    NASA Technical Reports Server (NTRS)

    Nicol, David M.; Das, Subhendu; Palumbo, Dan

    1990-01-01

    A parallelized system, ASSURE, for computing the reliability of embedded avionics flight control systems which are able to reconfigure themselves in the event of failure is described. ASSURE accepts a grammar that describes a reliability semi-Markov state-space. From this it creates a parallel program that simultaneously generates and analyzes the state-space, placing upper and lower bounds on the probability of system failure. ASSURE is implemented on a 32-node Intel iPSC/860, and has achieved high processor efficiencies on real problems. Through a combination of improved algorithms, exploitation of parallelism, and use of an advanced microprocessor architecture, ASSURE has reduced the execution time on substantial problems by a factor of one thousand over previous workstation implementations. Furthermore, ASSURE's parallel execution rate on the iPSC/860 is an order of magnitude faster than its serial execution rate on a Cray-2 supercomputer. While dynamic load balancing is necessary for ASSURE's good performance, it is needed only infrequently; the particular method of load balancing used does not substantially affect performance.

  9. A computational study of routing algorithms for realistic transportation networks

    SciTech Connect

    Jacob, R.; Marathe, M.V.; Nagel, K.

    1998-12-01

    The authors carry out an experimental analysis of a number of shortest path (routing) algorithms investigated in the context of the TRANSIMS (Transportation Analysis and Simulation System) project. The main focus of the paper is to study how various heuristic and exact solutions and their associated data structures affect the computational performance of software developed for realistic transportation networks. For this purpose the authors used the Dallas-Fort Worth road network at a very high degree of resolution. The following general results are obtained: (1) they discuss and experimentally analyze various one-to-one shortest path algorithms, which include classical exact algorithms studied in the literature as well as heuristic solutions designed to take into account the geometric structure of the input instances; (2) they describe a number of extensions to the basic shortest path algorithm. These extensions were primarily motivated by practical problems arising in TRANSIMS and ITS (Intelligent Transportation Systems) related technologies. Extensions discussed include--(i) time-dependent networks, (ii) multi-modal networks, (iii) networks with public transportation and associated schedules. Computational results are provided to empirically compare the efficiency of the various algorithms. The studies indicate that a modified Dijkstra's algorithm is computationally fast and an excellent candidate for use in various transportation planning applications as well as ITS related technologies.
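
    For readers unfamiliar with the baseline algorithm, a minimal Python version of plain Dijkstra on a toy directed network is sketched below; the TRANSIMS extensions (time dependence, multiple modes, schedules) and the paper's modifications are not represented.

      import heapq

      def dijkstra(graph, source):
          """One-to-all shortest path distances; graph maps node -> list of (neighbor, weight)."""
          dist = {source: 0.0}
          heap = [(0.0, source)]
          while heap:
              d, u = heapq.heappop(heap)
              if d > dist.get(u, float("inf")):
                  continue                      # stale heap entry
              for v, w in graph.get(u, []):
                  nd = d + w
                  if nd < dist.get(v, float("inf")):
                      dist[v] = nd
                      heapq.heappush(heap, (nd, v))
          return dist

      # Toy road network with travel times (minutes) on directed links.
      roads = {"A": [("B", 4), ("C", 2)], "C": [("B", 1), ("D", 7)], "B": [("D", 3)]}
      print(dijkstra(roads, "A"))   # {'A': 0.0, 'C': 2.0, 'B': 3.0, 'D': 6.0}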

  10. Recurrent Neural Network for Computing the Drazin Inverse.

    PubMed

    Stanimirović, Predrag S; Zivković, Ivan S; Wei, Yimin

    2015-11-01

    This paper presents a recurrent neural network (RNN) for computing the Drazin inverse of a real matrix in real time. The RNN is composed of n independent parts (subnetworks), where n is the order of the input matrix. These subnetworks can operate concurrently, so parallel and distributed processing can be achieved. In this way, computational advantages over the existing sequential algorithms can be attained in real-time applications. The RNN defined in this paper is convenient for implementation in an electronic circuit. The number of neurons in the neural network is the same as the number of elements in the output matrix, which represents the Drazin inverse. The difference between the proposed RNN and the existing ones for Drazin inverse computation lies in their network architecture and dynamics. The conditions that ensure the stability of the defined RNN, as well as its convergence toward the Drazin inverse, are considered. In addition, illustrative examples and examples of application to practical engineering problems are discussed to show the efficacy of the proposed neural network.

  11. A Three-Dimensional Computational Model of Collagen Network Mechanics

    PubMed Central

    Lee, Byoungkoo; Zhou, Xin; Riching, Kristin; Eliceiri, Kevin W.; Keely, Patricia J.; Guelcher, Scott A.; Weaver, Alissa M.; Jiang, Yi

    2014-01-01

    Extracellular matrix (ECM) strongly influences cellular behaviors, including cell proliferation, adhesion, and particularly migration. In cancer, the rigidity of the stromal collagen environment is thought to control tumor aggressiveness, and collagen alignment has been linked to tumor cell invasion. While the mechanical properties of collagen at both the single fiber scale and the bulk gel scale are quite well studied, how the fiber network responds to local stress or deformation, both structurally and mechanically, is poorly understood. This intermediate scale knowledge is important to understanding cell-ECM interactions and is the focus of this study. We have developed a three-dimensional elastic collagen fiber network model (bead-and-spring model) and studied fiber network behaviors for various biophysical conditions: collagen density, crosslinker strength, crosslinker density, and fiber orientation (random vs. prealigned). We found the best-fit crosslinker parameter values using shear simulation tests in a small strain region. Using this calibrated collagen model, we simulated both shear and tensile tests in a large linear strain region for different network geometry conditions. The results suggest that network geometry is a key determinant of the mechanical properties of the fiber network. We further demonstrated how the fiber network structure and mechanics evolves with a local formation, mimicking the effect of pulling by a pseudopod during cell migration. Our computational fiber network model is a step toward a full biomechanical model of cellular behaviors in various ECM conditions. PMID:25386649

  12. An Analysis of Attitudes toward Computer Networks and Internet Addiction.

    ERIC Educational Resources Information Center

    Tsai, Chin-Chung; Lin, Sunny S. J.

    The purpose of this study was to explore the interplay between young people's attitudes toward computer networks and Internet addiction. After analyzing questionnaire responses of an initial sample of 615 Taiwanese high school students, 78 subjects, viewed as possible Internet addicts, were selected for further explorations. It was found that…

  13. Business Computer Network--A "Gateway" to Multiple Databanks.

    ERIC Educational Resources Information Center

    O'Leary, Mick

    1985-01-01

    Business Computer Network (BCN) employs automatic calling and logon, multiple database access, disk search capture, and search assistance interfaces to provide single access to 15 online services. Telecommunications software (SuperScout) used to reach BCN and participating online services offers storage and message options and is accompanied by…

  14. System/360 Computer Assisted Network Scheduling (CANS) System

    NASA Technical Reports Server (NTRS)

    Brewer, A. C.

    1972-01-01

    Computer-assisted scheduling techniques that produce conflict-free and efficient schedules have been developed and implemented to meet the needs of the Manned Space Flight Network. The CANS system provides effective management of resources in a complex scheduling environment. The system is an automated tool for resource scheduling, control, planning, and information storage and retrieval.

  15. Computer program for network synthesis by frequency response fit

    NASA Technical Reports Server (NTRS)

    Green, S.

    1967-01-01

    The computer program synthesizes a passive network by minimizing the difference between the desired and actual frequency response. The program solves for the critical points of the error function (a weighted least-squares fit between the calculated and desired frequency responses) by the multivariable Newton-Raphson method, with components constrained to an admissible region.
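
    The program itself uses a multivariable Newton-Raphson solver; as a rough modern analogue, the sketch below fits a single component value of an assumed RC low-pass network to a desired magnitude response with SciPy's nonlinear least-squares routine. The circuit, weighting, and target response are illustrative, not taken from the original program.

      import numpy as np
      from scipy.optimize import least_squares

      freqs = np.logspace(1, 5, 50)                        # 10 Hz .. 100 kHz
      target = 1.0 / np.sqrt(1.0 + (freqs / 1.0e3) ** 2)   # desired response: 1 kHz low-pass

      R_FIXED = 1000.0                                     # fix R and fit only C

      def model(c, f):
          fc = 1.0 / (2.0 * np.pi * R_FIXED * c)
          return 1.0 / np.sqrt(1.0 + (f / fc) ** 2)

      def residuals(params):
          weights = 1.0 / freqs                            # example weighting
          return weights * (model(params[0], freqs) - target)

      fit = least_squares(residuals, x0=[1e-6])            # initial guess: 1 uF
      print(f"fitted C = {fit.x[0] * 1e9:.1f} nF")          # expect about 159 nF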

  16. Development of a UNIX network compatible reactivity computer

    SciTech Connect

    Sanchez, R.F.; Edwards, R.M.

    1996-12-31

    A state-of-the-art UNIX network-compatible controller and a UNIX host workstation with MATLAB/SIMULINK software were used to develop, implement, and validate a digital reactivity calculation. An objective of the development was to determine why the reactivity output of a Macintosh-based reactivity computer drifted intolerably.

  17. High Performance Computing and Networking for Science--Background Paper.

    ERIC Educational Resources Information Center

    Congress of the U.S., Washington, DC. Office of Technology Assessment.

    The Office of Technology Assessment is conducting an assessment of the effects of new information technologies--including high performance computing, data networking, and mass data archiving--on research and development. This paper offers a view of the issues and their implications for current discussions about Federal supercomputer initiatives…

  18. Using a Local Area Network to Teach Computer Revision Skills.

    ERIC Educational Resources Information Center

    Thompson, Diane P.

    1989-01-01

    Describes the use of a local area network and video switching equipment in teaching revision skills on computer. Explains that reading stories from texts, rewriting them from differing character viewpoints, and editing them as a group exposed students to a variety of writing problems and stimulated various revision strategies. (SG)

  19. Computation emerges from adaptive synchronization of networking neurons.

    PubMed

    Zanin, Massimiliano; Del Pozo, Francisco; Boccaletti, Stefano

    2011-01-01

    The activity of networking neurons is largely characterized by the alternation of synchronous and asynchronous spiking sequences. One of the most relevant challenges that scientists are facing today is, then, relating that evidence to the fundamental mechanisms through which the brain computes and processes information, as well as to the arousal (or progress) of a number of neurological illnesses. In other words, the problem is how to associate an organized dynamics of interacting neural assemblies with a computational task. Here we show that computation can be seen as a feature emerging from the collective dynamics of an ensemble of networking neurons, which interact by means of adaptive dynamical connections. Namely, by associating logical states with synchronous neurons' dynamics, we show how the usual Boolean logic can be fully recovered, and a universal Turing machine can be constructed. Furthermore, we show that, besides the static binary gates, a wider class of logical operations can be efficiently constructed as the fundamental computational elements interact within an adaptive network, each operation being represented by a specific motif. Our approach qualitatively differs from past attempts to encode information and compute with complex systems, where computation was instead the consequence of applying control loops that enforce a desired state onto the specific system's dynamics. Being the result of an emergent process, the computation mechanism described here is not limited to a binary Boolean logic, but can involve a much larger number of states. As such, our results can shed light on new concepts for understanding the real computing processes taking place in the brain. PMID:22073167

  20. Spin Models for Packet Routing Control in Computer Networks

    NASA Astrophysics Data System (ADS)

    Horiguchi, T.

    We investigate packet flow in computer networks within the framework of statistical physics by using numerical simulations. As mathematical models for packet routing, we present a spin model with lattice-gas spins and one with Ising spins. We then propose dynamic programming for optimal routing control of packet flow using the two spin models. This is a kind of goal-directed learning that takes the time-dependent environment of the packets into account. Next we investigate a congestion problem using the model with lattice-gas spins when packets are not sent to nodes whose buffers are already full. Having found serious congestion in the packet flow, we propose reinforcement learning to avoid the congestion and perform simulations on several networks, including small-world networks, scale-free networks, and others.

  1. Global tree network for computing structures enabling global processing operations

    DOEpatents

    Blumrich; Matthias A.; Chen, Dong; Coteus, Paul W.; Gara, Alan G.; Giampapa, Mark E.; Heidelberger, Philip; Hoenicke, Dirk; Steinmacher-Burow, Burkhard D.; Takken, Todd E.; Vranas, Pavlos M.

    2010-01-19

    A system and method for enabling high-speed, low-latency global tree network communications among processing nodes interconnected according to a tree network structure. The global tree network enables collective reduction operations to be performed during parallel algorithm operations executing in a computer structure having a plurality of the interconnected processing nodes. Router devices are included that interconnect the nodes of the tree via links to facilitate performance of low-latency global processing operations at nodes of the virtual tree and sub-tree structures. The global operations performed include one or more of: broadcast operations downstream from a root node to leaf nodes of a virtual tree, reduction operations upstream from leaf nodes to the root node in the virtual tree, and point-to-point message passing from any node to the root node. The global tree network is configurable to provide global barrier and interrupt functionality in asynchronous or synchronized manner, and, is physically and logically partitionable.
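
    The logical pattern of the two collective operations named above (reduction upstream, broadcast downstream) can be sketched in a few lines of Python; this models only the data flow on a software tree, not the router hardware or the barrier/interrupt functionality.

      def reduce_up(tree, node, values, op):
          """Combine contributions upstream from the leaves toward the root."""
          result = values[node]
          for child in tree.get(node, []):
              result = op(result, reduce_up(tree, child, values, op))
          return result

      def broadcast_down(tree, node, payload, inbox):
          """Deliver a payload downstream from the root to every node."""
          inbox[node] = payload
          for child in tree.get(node, []):
              broadcast_down(tree, child, payload, inbox)

      # Hypothetical 7-node binary tree rooted at node 0.
      tree = {0: [1, 2], 1: [3, 4], 2: [5, 6]}
      values = {n: n + 1 for n in range(7)}
      print(reduce_up(tree, 0, values, lambda a, b: a + b))   # sum reduction -> 28
      inbox = {}
      broadcast_down(tree, 0, "go", inbox)                    # every node receives "go"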

  2. The computational core and fixed point organization in Boolean networks

    NASA Astrophysics Data System (ADS)

    Correale, L.; Leone, M.; Pagnani, A.; Weigt, M.; Zecchina, R.

    2006-03-01

    In this paper, we analyse large random Boolean networks in terms of a constraint satisfaction problem. We first develop an algorithmic scheme which allows us to prune simple logical cascades and underdetermined variables, returning thereby the computational core of the network. Second, we apply the cavity method to analyse the number and organization of fixed points. We find in particular a phase transition between an easy and a complex regulatory phase, the latter being characterized by the existence of an exponential number of macroscopically separated fixed point clusters. The different techniques developed are reinterpreted as algorithms for the analysis of single Boolean networks, and they are applied in the analysis of and in silico experiments on the gene regulatory networks of baker's yeast (Saccharomyces cerevisiae) and the segment-polarity genes of the fruitfly Drosophila melanogaster.

  3. Computer Manipulatives in an Ordinary Differential Equations Course: Development, Implementation, and Assessment

    NASA Astrophysics Data System (ADS)

    Miller, Haynes R.; Upton, Deborah S.

    2008-04-01

    The d'Arbeloff Interactive Mathematics Project or d'AIMP is an initiative that seeks to enhance and ultimately transform the teaching and learning of introductory mathematics at the Massachusetts Institute of Technology. A result of this project is a suite of "mathlets," a carefully developed set of dynamic computer applets for use in the university's ordinary differential equations course. In this paper, we present the rationale for such computer innovations, the philosophy behind their design, as well as a discussion of their careful development and implementation. Survey results are reported which yielded positive student feedback and suggestions for improvement.

  4. Student performance in computer modeling and problem solving in a modern introductory physics course

    NASA Astrophysics Data System (ADS)

    Kohlmyer, Matthew Adam

    Matter & Interactions, an innovative introductory physics curriculum developed by Ruth Chabay and Bruce Sherwood, emphasizes computer modeling and fundamental physical principles. Two think-aloud protocol studies were conducted to investigate the performance of students from this curriculum in solving physics problems that require computer modeling. Experiment 1 examined whether Matter & Interactions students would, given the choice, use computer modeling to solve difficult problems that required predicting motion, and how their solution approaches differed from those of students from a traditional introductory physics course. Though they did not overwhelmingly choose computer modeling, some M&I students did write computer models successfully or apply the iterative algorithm by hand. The solution approaches of M&I students and traditional course students differed qualitatively in their use of the momentum principle and pre-derived special case formulas. In experiment 2, Matter & Interactions students were observed while they wrote programs in the VPython language in order to examine their difficulties with computer modeling. Areas of difficulty included determining initial conditions, distinguishing between simulated time and the time step, and updating momentum and position. Especially troublesome for students was the multistep procedure for calculating a force that changes with time. Students' understanding of the structure of a computer model improved by the end of the semester as shown by their performance on a line sorting task. Students with fewer difficulties proceeded through the computer model in a more linear, straightforward fashion. Instruction was revised based on initial findings from the first phase of the experiment. Students in the second phase of the experiment, who had used the revised instruction, had fewer difficulties on the same tasks, though other factors may have been involved in the improvement.
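
    The iterative update at the heart of such computer models is compact; the plain-Python sketch below (the course uses VPython, and the force law here is an assumed example) illustrates the points the study found troublesome: simulated time versus the time step, recomputing a time-varying force, and updating momentum before position.

      import math

      m, p, x = 0.5, 0.0, 0.0           # mass (kg), momentum (kg m/s), position (m)
      F0, omega = 2.0, 3.0              # assumed force amplitude (N) and angular frequency (rad/s)
      t, dt = 0.0, 0.01                 # simulated time and time step (s) -- distinct quantities

      while t < 2.0:
          F = F0 * math.cos(omega * t)  # the force changes with time, so recompute it each step
          p = p + F * dt                # momentum principle: delta p = F * dt
          x = x + (p / m) * dt          # position update using the updated momentum
          t = t + dt

      print(f"x(2 s) = {x:.3f} m, p(2 s) = {p:.3f} kg m/s")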

  5. Primer on computers and information technology. Part two: an introduction to computer networking.

    PubMed

    Channin, D S; Chang, P J

    1997-01-01

    Computer networks are a way of connecting computers together such that they can exchange information. For this exchange to be successful, system behavior must be planned and specified very clearly at a number of different levels. Although there are many choices to be made at each level, often there are simple decisions that can be made to rapidly reduce the number of options. Planning is most important at the highest (application) and lowest (wiring) levels, whereas the middle levels must be specified to ensure compatibility. Because of the widespread use of the Internet, solutions based on Internet technologies are often cost-effective and should be considered when designing a network. As in all technical fields, consultation with experts (ie, computer networking specialists) may be worthwhile. PMID:9225395

  7. Reduction of dynamical biochemical reactions networks in computational biology

    PubMed Central

    Radulescu, O.; Gorban, A. N.; Zinovyev, A.; Noel, V.

    2012-01-01

    Biochemical networks are used in computational biology to model mechanistic details of systems involved in cell signaling, metabolism, and regulation of gene expression. Parametric and structural uncertainty, as well as combinatorial explosion, are strong obstacles to analyzing the dynamics of large models of this type. Multiscaleness, an important property of these networks, can be used to get past some of these obstacles. Networks with many well-separated time scales can be reduced to simpler models, in a way that depends only on the orders of magnitude and not on the exact values of the kinetic parameters. The main idea used for such robust simplifications of networks is the concept of dominance among model elements, allowing hierarchical organization of these elements according to their effects on the network dynamics. This concept finds a natural formulation in tropical geometry. We revisit, in the light of these new ideas, the main approaches to model reduction of reaction networks, such as the quasi-steady-state (QSS) and quasi-equilibrium (QE) approximations, and provide practical recipes for model reduction of linear and non-linear networks. We also discuss the application of model reduction to the problem of parameter identification, via backward pruning machine learning techniques. PMID:22833754

  8. Computing Tutte polynomials of contact networks in classrooms

    NASA Astrophysics Data System (ADS)

    Hincapié, Doracelly; Ospina, Juan

    2013-05-01

    Objective: The topological complexity of contact networks in classrooms and the potential transmission of an infectious disease were analyzed by sex and age. Methods: The Tutte polynomials, some topological properties, and the number of spanning trees were used to algebraically compute the topological complexity. Computations were made with the Maple package GraphTheory. Published data of mutually reported social contacts within a classroom, taken from a primary school and consisting of children in the age ranges 4-5, 7-8 and 10-11, were used. Results: The algebraic complexity of the Tutte polynomial and the probability of disease transmission increase with age. The contact networks are not bipartite graphs; gender segregation was observed, especially in younger children. Conclusion: Tutte polynomials are tools to understand the topology of contact networks and to derive numerical indexes of such topologies. It is possible to establish relationships between the Tutte polynomial of a given contact network and the potential transmission of an infectious disease within such a network.
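
    The study computes these quantities with Maple's GraphTheory package; as a small illustration of one of them, the sketch below counts spanning trees of a toy contact network in Python via Kirchhoff's matrix-tree theorem. The five-edge network is hypothetical, not one of the published classroom networks.

      import numpy as np

      def spanning_tree_count(n, edges):
          """Matrix-tree theorem: the number of spanning trees of a connected graph
          equals any cofactor of its Laplacian matrix."""
          L = np.zeros((n, n))
          for u, v in edges:
              L[u, u] += 1
              L[v, v] += 1
              L[u, v] -= 1
              L[v, u] -= 1
          minor = np.delete(np.delete(L, 0, axis=0), 0, axis=1)
          return int(round(np.linalg.det(minor)))

      # Hypothetical 4-pupil contact network: a 4-cycle with one chord.
      edges = [(0, 1), (1, 2), (2, 3), (3, 0), (0, 2)]
      print(spanning_tree_count(4, edges))   # 8 spanning trees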

  9. Review On Applications Of Neural Network To Computer Vision

    NASA Astrophysics Data System (ADS)

    Li, Wei; Nasrabadi, Nasser M.

    1989-03-01

    Neural network models have many potential applications to computer vision due to their parallel structures, learnability, implicit representation of domain knowledge, fault tolerance, and ability to handle statistical data. This paper demonstrates the basic principles, typical models, and their applications in this field. A variety of neural models, such as associative memory, the multilayer back-propagation perceptron, the self-stabilized adaptive resonance network, the hierarchically structured neocognitron, high-order correlators, networks with gating control, and other models, can be applied to visual signal recognition, reinforcement, recall, stereo vision, motion, object tracking, and other vision processes. Most of the algorithms have been simulated on computers. Some have been implemented with special hardware. Some systems use features of images, such as edges and profiles, as the data form for input. Other systems use raw data as input signals to the networks. We present some novel ideas contained in these approaches and provide a comparison of the methods. Some unsolved problems are mentioned, such as extracting the intrinsic properties of the input information, integrating low-level functions into a high-level cognitive system, and achieving invariances. Perspectives on applications of some human vision models and neural network models are analyzed.

  10. Smart photonic networks and computer security for image data

    NASA Astrophysics Data System (ADS)

    Campello, Jorge; Gill, John T.; Morf, Martin; Flynn, Michael J.

    1998-02-01

    Work reported here is part of a larger project on 'Smart Photonic Networks and Computer Security for Image Data', studying the interactions of coding and security, switching architecture simulations, and basic technologies. Coding and security: coding methods that are appropriate for data security in data fusion networks were investigated. These networks have several characteristics that distinguish them from other currently employed networks, such as Ethernet LANs or the Internet. The most significant characteristics are very high maximum data rates; predominance of image data; narrowcasting - transmission of data from one source to a designated set of receivers; data fusion - combining related data from several sources; and simple sensor nodes with limited buffering. These characteristics affect both the lower-level network design and the higher-level coding methods. Data security encompasses privacy, integrity, reliability, and availability. Privacy, integrity, and reliability can be provided through encryption and coding for error detection and correction. Availability is primarily a network issue; network nodes must be protected against failure or routed around in the case of failure. One of the more promising techniques is the use of 'secret sharing'. We consider this method as a special case of our new space-time code diversity based algorithms for secure communication. These algorithms enable us to exploit parallelism and scalable multiplexing schemes to build photonic network architectures. A number of very high-speed switching and routing architectures and their relationships with very high performance processor architectures were studied. Indications are that routers for very high-speed photonic networks can be designed using the very robust and distributed TCP/IP protocol, if suitable processor architecture support is available.
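
    As background on the secret-sharing idea mentioned above, the sketch below shows only the simplest n-of-n XOR scheme, in which every share is required to recover the data; the threshold schemes and the space-time code diversity algorithms referred to in the abstract are more elaborate and are not reproduced here.

      import os
      from functools import reduce

      def xor_bytes(a: bytes, b: bytes) -> bytes:
          return bytes(x ^ y for x, y in zip(a, b))

      def split_secret(secret: bytes, n: int):
          """n-of-n XOR secret sharing: any n-1 shares reveal nothing about the secret."""
          shares = [os.urandom(len(secret)) for _ in range(n - 1)]
          shares.append(reduce(xor_bytes, shares, secret))   # last share = secret XOR all random shares
          return shares

      def recover(shares):
          return reduce(xor_bytes, shares)

      shares = split_secret(b"image block", 3)
      print(recover(shares))   # b'image block'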

  11. Distribution feeder loss computation by artificial neural network

    SciTech Connect

    Kau, S.W.; Cho, M.Y.

    1995-12-31

    This paper proposes an artificial neural network (ANN) based feeder loss calculation model for distribution system analysis. The functional-link network model is examined to form the artificial neural network architecture and to derive loss calculation models for feeders with different configurations. The artificial neural network is a feedforward network that uses the standard back-propagation algorithm to adjust the weights on the connection paths between processing elements (PEs). Daily feeder load curves for each season are derived from field test data. A three-phase load flow program is executed to create the training sets with exact loss calculation results. A sensitivity analysis is executed to determine the key factors--power factor, feeder loading, primary conductors, secondary conductors, and transformer capacity--used as the input-layer variables. Exploiting the pattern recognition ability of the artificial neural network, this study develops seasonal and yearly loss calculation models for overhead and underground feeder configurations. Two practical feeders with both overhead and underground configurations in the Taiwan Power Company (TPC or Taipower) distribution system are selected for computer simulation to demonstrate the effectiveness and accuracy of the proposed models. Compared with models derived by the conventional regression technique, the results indicate that the proposed models provide a more efficient tool for district engineers to calculate feeder losses.

  12. A computational algebra approach to the reverse engineering of gene regulatory networks.

    PubMed

    Laubenbacher, Reinhard; Stigler, Brandilyn

    2004-08-21

    This paper proposes a new method to reverse engineer gene regulatory networks from experimental data. The modeling framework used is time-discrete deterministic dynamical systems, with a finite set of states for each of the variables. The simplest examples of such models are Boolean networks, in which variables have only two possible states. The use of a larger number of possible states allows a finer discretization of experimental data and more than one possible mode of action for the variables, depending on threshold values. Furthermore, with a suitable choice of state set, one can employ powerful tools from computational algebra, that underlie the reverse-engineering algorithm, avoiding costly enumeration strategies. To perform well, the algorithm requires wildtype together with perturbation time courses. This makes it suitable for small to meso-scale networks rather than networks on a genome-wide scale. An analysis of the complexity of the algorithm is performed. The algorithm is validated on a recently published Boolean network model of segment polarity development in Drosophila melanogaster.

  13. Topology and computational performance of attractor neural networks.

    PubMed

    McGraw, Patrick N; Menzinger, Michael

    2003-10-01

    To explore the relation between network structure and function, we studied the computational performance of Hopfield-type attractor neural nets with regular lattice, random, small-world, and scale-free topologies. The random configuration is the most efficient for storage and retrieval of patterns by the network as a whole. However, in the scale-free case retrieval errors are not distributed uniformly among the nodes. The portion of a pattern encoded by the subset of highly connected nodes is more robust and efficiently recognized than the rest of the pattern. The scale-free network thus achieves a very strong partial recognition. The implications of these findings for brain function and social dynamics are suggestive.
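
    For readers who want the baseline that the study perturbs, a minimal fully connected Hopfield network (Hebbian storage plus sign-function retrieval) is sketched below in Python; the lattice, small-world, and scale-free variants examined in the paper would replace the dense weight matrix with a sparse one. The sizes are arbitrary.

      import numpy as np

      rng = np.random.default_rng(0)
      N, P = 100, 5                                    # neurons and stored patterns (arbitrary)
      patterns = rng.choice([-1, 1], size=(P, N))

      W = patterns.T @ patterns / N                    # Hebbian weights
      np.fill_diagonal(W, 0.0)

      def retrieve(state, steps=20):
          """Synchronous sign updates until (hopefully) an attractor is reached."""
          for _ in range(steps):
              state = np.where(W @ state >= 0, 1, -1)
          return state

      probe = patterns[0].copy()                       # corrupt 10% of one stored pattern
      probe[rng.choice(N, size=10, replace=False)] *= -1
      overlap = retrieve(probe) @ patterns[0] / N
      print(f"overlap with the stored pattern after retrieval: {overlap:.2f}")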

  14. The role of computer networks in remote sensing data analysis

    NASA Technical Reports Server (NTRS)

    Swain, P. H.; Phillips, T. L.; Lindenlaub, J. C.

    1973-01-01

    It has been hypothesized that computer networks can be used to make data processing facilities available to the remote sensing community both quickly and effectively. An experiment to test this hypothesis is being conducted by the Laboratory for Applications of Remote Sensing at Purdue University, with the participation of potential users at several remote sites. Initial indications have been highly favorable, although final evaluation awaits further experience and the accumulation of usage data.

  15. Guidance for Data Collection and Computational Modelling of Regulatory Networks

    PubMed Central

    Palmer, Adam Christopher; Shearwin, Keith Edward

    2010-01-01

    Many model regulatory networks are approaching the depth of characterisation of bacteriophage λ, wherein the vast majority of individual components and interactions are identified, and research can focus on understanding whole network function and the role of interactions within that broader context. In recent years, the study of the system-wide behaviour of phage λ’s genetic regulatory network has been greatly assisted by the combination of quantitative measurements with theoretical and computational analyses. Such research has demonstrated the value of a number of general principles and guidelines for making use of the interplay between experiments and modelling. In this chapter we discuss these guidelines and provide illustration through reference to case studies from phage λ biology. In our experience, computational modelling is best facilitated with a large and diverse set of quantitative, in vivo data, preferably obtained from standardised measurements and expressed as absolute units rather than relative units. Isolation of subsets of regulatory networks may render a system amenable to ‘bottom-up’ modelling, providing a valuable tool to the experimental molecular biologist. Decoupling key components and rendering their concentration or activity an independent experimental variable provide excellent information for model building, though conclusions drawn from isolated and/or decoupled systems should be checked against studies in the full physiological context; discrepancies are informative. The construction of a model makes possible in silico experiments, which are valuable tools for both the data analysis and the design of wet experiments. PMID:19381541

  16. Convolutional networks for fast, energy-efficient neuromorphic computing

    PubMed Central

    Esser, Steven K.; Merolla, Paul A.; Arthur, John V.; Cassidy, Andrew S.; Appuswamy, Rathinakumar; Andreopoulos, Alexander; Berg, David J.; McKinstry, Jeffrey L.; Melano, Timothy; Barch, Davis R.; di Nolfo, Carmelo; Datta, Pallab; Amir, Arnon; Taba, Brian; Flickner, Myron D.; Modha, Dharmendra S.

    2016-01-01

    Deep networks are now able to achieve human-level performance on a broad spectrum of recognition tasks. Independently, neuromorphic computing has now demonstrated unprecedented energy-efficiency through a new chip architecture based on spiking neurons, low precision synapses, and a scalable communication network. Here, we demonstrate that neuromorphic computing, despite its novel architectural primitives, can implement deep convolution networks that (i) approach state-of-the-art classification accuracy across eight standard datasets encompassing vision and speech, (ii) perform inference while preserving the hardware’s underlying energy-efficiency and high throughput, running on the aforementioned datasets at between 1,200 and 2,600 frames/s and using between 25 and 275 mW (effectively >6,000 frames/s per Watt), and (iii) can be specified and trained using backpropagation with the same ease-of-use as contemporary deep learning. This approach allows the algorithmic power of deep learning to be merged with the efficiency of neuromorphic processors, bringing the promise of embedded, intelligent, brain-inspired computing one step closer. PMID:27651489

  17. Microwave circuit analysis and design by a massively distributed computing network

    NASA Astrophysics Data System (ADS)

    Vai, Mankuan; Prasad, Sheila

    1995-05-01

    The advances in microelectronic engineering have rendered massively distributed computing networks practical and affordable. This paper describes one application of this distributed computing paradigm to the analysis and design of microwave circuits. A distributed computing network, constructed in the form of a neural network, is developed to automate the operations typically performed on a normalized Smith chart. Examples showing the use of this computing network for impedance matching and stabilizing are provided.

  18. Deep Space Network (DSN), Network Operations Control Center (NOCC) computer-human interfaces

    NASA Technical Reports Server (NTRS)

    Ellman, Alvin; Carlton, Magdi

    1993-01-01

    The Network Operations Control Center (NOCC) of the DSN is responsible for scheduling the resources of the DSN and monitoring all multi-mission spacecraft tracking activities in real time. Operations performs this job with computer systems at JPL connected to over 100 computers at Goldstone, Australia, and Spain. The old computer system became obsolete, and the first version of the new system was installed in 1991. Significant improvement of the computer-human interfaces became the dominant theme of the replacement project. Major issues required innovative problem solving. Among these issues were: How to present several thousand data elements on displays without overloading the operator? What is the best graphical representation of DSN end-to-end data flow? How to operate the system without memorizing mnemonics of hundreds of operator directives? Which computing environment will meet the competing performance requirements? This paper presents the technical challenges, engineering solutions, and results of the NOCC computer-human interface design.

  19. Experimental, analytical and computational investigation of bimodal elastomer networks

    NASA Astrophysics Data System (ADS)

    von Lockette, Paris Robert

    Advances in the synthesis of macromolecular materials have led to the creation of special classes of elastomers called bimodal because of their bimodal distributions of linear starting oligomers. Numerous studies on these materials have documented anomalous increases in ultimate strength and toughness at certain mixture combinations of the constituents but have not yet identified a cause for this behavior. In addition, the ability to predict optimal mixtures still eludes polymer chemists. Constitutive models for the behavior of bimodal materials are also unable to predict material behavior, but instead tend to capture results using complicated curve fitting and iterative schemes. This thesis uncovers topological and micromechanical sources of these enhanced properties using periodic, topological simulations of chain-level network formation and develops a constitutive model of the aggregate bimodal network. Using a topological framework, in conjunction with the eight-chain averaging scheme of Arruda and Boyce, this work develops optical and mechanical constitutive models for bimodal elastomers whose results compare favorably with data in the literature. The resulting bimodal network theory is able to predict material response for a range of bimodal compositions using only two sets of data, a direct improvement over previous models. The micromechanics of elastomeric deformation and chain orientation as described by the eight-chain model are further validated by comparing optical and mechanical data generated during large deformation shear tests on unimodal materials with finite element simulations. In addition, a newly developed optical anisotropy model for the Raman tensor of polymeric materials, generated using an eight-chain unit cell model, is shown to compare favorably with tensile data in the literature. Results generated using NETSIM, a computer program developed in this thesis, have revealed naturally occurring, self-reinforcing topological features

  20. Assessing Computer Use and Perceived Course Effectiveness in Post-Secondary Education in an American/Canadian Context

    ERIC Educational Resources Information Center

    Tamim, Rana M.; Lowerison, Gretchen; Schmid, Richard F.; Bernard, Robert M.; Abrami, Philip C.; Dehler, Christina

    2008-01-01

    The purpose of this research is to investigate the relationship between computer technology's role and students' perceptions about course effectiveness. Students from two universities (one Canadian, n = 1465; one American, n = 831) completed a 71-item questionnaire addressing different aspects of their learning experience in a given course. Factor…

  1. Is It Ethical for Patents to Be Issued for the Computer Algorithms that Affect Course Management Systems for Distance Learning?

    ERIC Educational Resources Information Center

    Moreau, Nancy

    2008-01-01

    This article discusses the impact of patents for computer algorithms in course management systems. Referring to historical documents and court cases, the positive and negative aspects of software patents are presented. The key argument is the accessibility to algorithms comprising a course management software program such as Blackboard. The…

  2. A Framework for Measuring Student Learning Gains and Engagement in an Introductory Computing Course: A Preliminary Report of Findings

    ERIC Educational Resources Information Center

    Lim, Billy; Hosack, Bryan; Vogt, Paul

    2012-01-01

    This paper describes a framework for measuring student learning gains and engagement in a Computer Science 1 (CS 1) / Information Systems 1 (IS 1) course. The framework is designed for a CS1/IS1 course as it has been traditionally taught over the years as well as when it is taught using a new pedagogical approach with Web services. It enables the…

  3. Report on Computing and Networking in the Space Science Laboratory by the SSL Computer Committee

    NASA Technical Reports Server (NTRS)

    Gallagher, D. L. (Editor)

    1993-01-01

    The Space Science Laboratory (SSL) at Marshall Space Flight Center is a multiprogram facility. Scientific research is conducted in four discipline areas: earth science and applications, solar-terrestrial physics, astrophysics, and microgravity science and applications. Representatives from each of these discipline areas participate in a Laboratory computer requirements committee, which developed this document. The purpose is to establish and discuss Laboratory objectives for computing and networking in support of science. The purpose is also to lay the foundation for a collective, multiprogram approach to providing these services. Special recognition is given to the importance of the national and international efforts of our research communities toward the development of interoperable, network-based computer applications.

  4. Relationship of the Gonial Angle and Inferior Alveolar Canal Course Using Cone Beam Computed Tomography

    PubMed Central

    Anbiaee, Najmeh; Bagherpour, Ali

    2015-01-01

    Objectives: Accurate localization of the inferior alveolar canal (IAC) is extremely important in some dental treatments. Anatomical variation of the canal means that it can be difficult to locate. The purpose of this study was to assess the relationship of the gonial angle (GA) size and IAC position using cone beam computed tomography (CBCT). Materials and Methods: In this in vitro study, 61 dry adult human hemi-mandibles were used. The CBCT scans were taken of all samples and GA was measured on all CBCT scans. The samples were divided into two groups of low angle (≤125°) and high angle (>125°). The canal dimensions, length and course were evaluated. On the sagittal view, the IAC path was classified as type A, B or C. On the axial view, canal course was defined as A1 or A2 according to the mental foramen angle. Results: The average GA size was 121.8±7.05° at the right side and 123.8±6.32° at the left side. On the sagittal view, there was a significant correlation between the GA size and the canal course (P=0.04). In the high-angle group, type A was dominant; whereas in the low-angle group, type B was more common. On the axial view of IAC course, type A1 was more common (73.43%). Conclusion: The results showed that GA size was associated with IAC course. In cases with a larger GA, the canal runs in a more straightforward path, and at the same level as the mental foramen. PMID:27252759

  5. Directly executable formal models of middleware for MANET and Cloud Networking and Computing

    NASA Astrophysics Data System (ADS)

    Pashchenko, D. V.; Sadeq Jaafar, Mustafa; Zinkin, S. A.; Trokoz, D. A.; Pashchenko, T. U.; Sinev, M. P.

    2016-04-01

    The article considers some “directly executable” formal models that are suitable for the specification of computing and networking in the cloud environment and in other networks similar to wireless mobile ad hoc networks (MANETs). These models can be easily programmed and implemented on computer networks.

  6. New parallel vision environment in heterogeneous networked computing

    NASA Astrophysics Data System (ADS)

    Kim, Kyung N.; Choi, Tae-Sun; Ramakrishna, R. S.

    1998-09-01

    Many vision tasks are very complex and computationally intensive. Real time requirements further aggravate the situation. They usually involve both structured (low-level vision) and unstructured (high-level vision) computations. Parallel approaches offer hope in this context. Parallel approaches to vision tasks and scheduling schemes for their implementation receive special emphasis in this paper. Architectural issues are also addressed. The aim is to design algorithms which can be implemented on low cost heterogeneous networks running PVM. Issues connected with general purpose architectures also receive attention. The proposed ideas have been illustrated through a practical example (of eye location from an image sequence). Next generation multimedia environments are expected to routinely employ such high performance computing platforms.

  7. Reconfigurable modular computer networks for spacecraft on-board processing

    NASA Technical Reports Server (NTRS)

    Rennels, D. A.

    1978-01-01

    The core electronics subsystems on unmanned spacecraft, which have been sent over the last 20 years to investigate the moon, Mars, Venus, and Mercury, have progressed through an evolution from simple fixed controllers and analog computers in the 1960's to general-purpose digital computers in current designs. This evolution is now moving in the direction of distributed computer networks. Current Voyager spacecraft already use three on-board computers. One is used to store commands and provide overall spacecraft management. Another is used for instrument control and telemetry collection, and the third computer is used for attitude control and scientific instrument pointing. An examination of the control logic in the instruments shows that, for many, it is cost-effective to replace the sequencing logic with a microcomputer. The Unified Data System architecture considered consists of a set of standard microcomputers connected by several redundant buses. A typical self-checking computer module will contain 23 RAMs, two microprocessors, one memory interface, three bus interfaces, and one core building block.

  8. Identification of driving network of cellular differentiation from single sample time course gene expression data

    NASA Astrophysics Data System (ADS)

    Chen, Ye; Wolanyk, Nathaniel; Ilker, Tunc; Gao, Shouguo; Wang, Xujing

    Methods developed based on bifurcation theory have demonstrated their potential for driving-network identification in complex human diseases, including the work by Chen et al. Recently, bifurcation theory has been successfully applied to model cellular differentiation. However, one often faces a technical challenge in driving-network prediction: a time-course cellular differentiation study often contains only one sample at each time point, while driving-network prediction typically requires multiple samples at each time point to infer the variation and interaction structures of candidate genes for the driving network. In this study, we investigate several methods to identify both the critical time point and the driving network through examination of how each time point affects the autocorrelation and phase locking. We apply these methods to a high-throughput sequencing (RNA-Seq) dataset of 42 subsets of thymocytes and mature peripheral T cells at multiple time points during their differentiation (GSE48138 from GEO). We compare the predicted driving genes with known transcription regulators of cellular differentiation. We will discuss the advantages and limitations of our proposed methods, as well as potential further improvements of our methods.
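
    One of the per-gene quantities examined above, the lag-1 autocorrelation, is simple to compute; a minimal sketch follows, with a hypothetical expression trajectory standing in for the RNA-Seq data (rising autocorrelation is commonly read as a sign of critical slowing down near a bifurcation).

      import numpy as np

      def lag1_autocorrelation(series):
          """Lag-1 autocorrelation of a 1-D series."""
          x = np.asarray(series, dtype=float)
          x = x - x.mean()
          return float(np.dot(x[:-1], x[1:]) / np.dot(x, x))

      # Hypothetical expression trajectory of one candidate gene across time points.
      expr = [2.1, 2.3, 2.2, 2.6, 3.0, 3.4, 3.9, 4.5]
      print(f"lag-1 autocorrelation: {lag1_autocorrelation(expr):.2f}")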

  9. Social network analysis of a project-based introductory physics course

    NASA Astrophysics Data System (ADS)

    Oakley, Christopher

    2016-03-01

    Research suggests that students benefit from peer interaction and active engagement in the classroom. The quality, nature, and effect of these interactions are currently being explored by physics education researchers. Spelman College offers an introductory physics sequence that addresses content and research skills by engaging students in open-ended research projects, a form of project-based learning. Students have been surveyed at regular intervals during the second semester of the trigonometry-based course to determine the frequency of interactions in and out of class. These interactions can be with current or past students, tutors, and instructors. This line of inquiry focuses on metrics of social network analysis, such as the centrality of participants and the segmentation of groups. Further research will refine and highlight deeper questions regarding student performance in this pedagogy and course sequence.

  10. Open Problems in Network-aware Data Management in Exa-scale Computing and Terabit Networking Era

    SciTech Connect

    Balman, Mehmet; Byna, Surendra

    2011-12-06

    Accessing and managing large amounts of data is a great challenge in collaborative computing environments where resources and users are geographically distributed. Recent advances in network technology led to next-generation high-performance networks, allowing high-bandwidth connectivity. Efficient use of the network infrastructure is necessary in order to address the increasing data and compute requirements of large-scale applications. We discuss several open problems, evaluate emerging trends, and articulate our perspectives in network-aware data management.

  11. Reservoir computing: a photonic neural network for information processing

    NASA Astrophysics Data System (ADS)

    Paquot, Yvan; Dambre, Joni; Schrauwen, Benjamin; Haelterman, Marc; Massar, Serge

    2010-06-01

    At the boundaries between photonics and dynamic systems theory, we combine recent advances in neural networks with opto-electronic nonlinearities to demonstrate a new way to perform optical information processing. The concept of reservoir computing arose recently as a powerful solution to the issue of training recurrent neural networks. Indeed, it is comparable to, or even outperforms, other state of the art solutions for tasks such as speech recognition or time series prediction. As it is based on a static topology, it allows making the most of very simple physical architectures having complex nonlinear dynamics. The method is inherently robust to noise and does not require explicit programming operations. It is therefore particularly well adapted for analog realizations. Among the various implementations of the concept that have been proposed, we focus on the field of optics. Our experimental reservoir computer is based on opto-electronic technology, and can be viewed as an intermediate step towards an all optical device. Our fiber optics system is based on a nonlinear feedback loop operating at the threshold of chaos. In its present preliminary stage it is already capable of complicated tasks like modeling nonlinear systems with memory. Our aim is to demonstrate that such an analog reservoir can have performances comparable to state of the art digital implementations of Neural Networks. Furthermore, our system can in principle be operated at very high frequencies thanks to the high speed of photonic devices. Thus one could envisage targeting applications such as online information processing in broadband telecommunications.
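
    The digital counterpart of such a reservoir is an echo state network: a fixed random recurrent layer whose readout alone is trained. The sketch below trains a readout by ridge regression on a simple delayed-recall task; the sizes, scaling, and task are illustrative and have no connection to the opto-electronic hardware described above.

      import numpy as np

      rng = np.random.default_rng(1)
      n_in, n_res, washout, T = 1, 200, 100, 2000

      # Fixed random input and reservoir weights; only the linear readout is trained.
      W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
      W = rng.normal(0, 1, (n_res, n_res))
      W *= 0.9 / np.abs(np.linalg.eigvals(W)).max()       # scale spectral radius below 1

      u = rng.uniform(-1, 1, (T, n_in))                   # input signal
      target = np.roll(u[:, 0], 3)                        # task: recall the input delayed by 3 steps

      states = np.zeros((T, n_res))
      x = np.zeros(n_res)
      for t in range(T):
          x = np.tanh(W_in @ u[t] + W @ x)
          states[t] = x

      X, y = states[washout:], target[washout:]           # discard the initial transient
      W_out = np.linalg.solve(X.T @ X + 1e-6 * np.eye(n_res), X.T @ y)
      pred = X @ W_out
      print("readout NRMSE:", float(np.sqrt(np.mean((pred - y) ** 2) / np.var(y))))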

  12. Line-plane broadcasting in a data communications network of a parallel computer

    DOEpatents

    Archer, Charles J.; Berg, Jeremy E.; Blocksome, Michael A.; Smith, Brian E.

    2010-11-23

    Methods, apparatus, and products are disclosed for line-plane broadcasting in a data communications network of a parallel computer, the parallel computer comprising a plurality of compute nodes connected together through the network, the network optimized for point to point data communications and characterized by at least a first dimension, a second dimension, and a third dimension, that include: initiating, by a broadcasting compute node, a broadcast operation, including sending a message to all of the compute nodes along an axis of the first dimension for the network; sending, by each compute node along the axis of the first dimension, the message to all of the compute nodes along an axis of the second dimension for the network; and sending, by each compute node along the axis of the second dimension, the message to all of the compute nodes along an axis of the third dimension for the network.
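
    The dimension-by-dimension sweep described in the claim is easy to visualize on a small grid. The toy function below only reports the phase in which each node of a hypothetical 2x2x2 mesh would receive the message (root line along the first dimension, then planes along the second, then the full volume along the third); it says nothing about the actual hardware protocol or timing.

      def arrival_phase(node, root):
          """Phase (1, 2, or 3) in which a node receives the broadcast: the root sends
          along its x-axis line, line members resend along y, plane members resend along z."""
          x, y, z = node
          rx, ry, rz = root
          if (x, y, z) == (rx, ry, rz):
              return 0                     # the broadcasting node itself
          if y == ry and z == rz:
              return 1                     # reached during the first-dimension (line) phase
          if z == rz:
              return 2                     # reached during the second-dimension (plane) phase
          return 3                         # reached during the third-dimension phase

      nodes = [(x, y, z) for x in range(2) for y in range(2) for z in range(2)]
      for n in nodes:
          print(n, "-> phase", arrival_phase(n, (0, 0, 0)))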

  13. Line-plane broadcasting in a data communications network of a parallel computer

    DOEpatents

    Archer, Charles J.; Berg, Jeremy E.; Blocksome, Michael A.; Smith, Brian E.

    2010-06-08

    Methods, apparatus, and products are disclosed for line-plane broadcasting in a data communications network of a parallel computer, the parallel computer comprising a plurality of compute nodes connected together through the network, the network optimized for point to point data communications and characterized by at least a first dimension, a second dimension, and a third dimension, that include: initiating, by a broadcasting compute node, a broadcast operation, including sending a message to all of the compute nodes along an axis of the first dimension for the network; sending, by each compute node along the axis of the first dimension, the message to all of the compute nodes along an axis of the second dimension for the network; and sending, by each compute node along the axis of the second dimension, the message to all of the compute nodes along an axis of the third dimension for the network.

  14. The relationship among self-regulation, internet use, and academic achievement in a computer literacy course

    NASA Astrophysics Data System (ADS)

    YangKim, SungHee

    This research was a correlational study of the relationship among self-regulation, students' nonacademic internet browsing, and academic achievement in an undergraduate computer literacy class. Nonacademic internet browsing during class can be a distraction from student academic studies. There has been little research on the role of self-regulation on nonacademic internet browsing in influencing academic achievement. Undergraduate computer literacy classes were used as samples (n= 39) for measuring these variables. Data were collected during three class periods in two sections of the computer literacy course taught by one instructor. The data consisted of a demographic survey, selected and modified items from the GVU 10th WWW User Survey Questionnaire, selected items of the Motivated Strategies for Learning Questionnaire, and measures of internet use. There were low correlations between self-regulation and academic grades (r= .18, p > .05) and self-regulation and internet use (r= -.14, p > .05). None of the correlations were statistically significant. Also, there was no statistically significant correlation between internet use and academic achievement (r= -.23, p >.05). Self-regulation was highly correlated to self-efficacy (r= .53, p < .05). Total internet access was highly correlated to nonacademic related internet browsing (r= .96, p < .01). Although not statistically significant, the consistent negative correlations between nonacademic internet use with both self-regulation and achievement indicate that the internet may present an attractive distraction to achievement which may be due to lack of self-regulation. The implication of embedded instruction of self-regulation in the computer literacy course was discussed to enhance self-regulated internet use. Further study of interaction of self-regulated internet use and academic achievement is recommended.

  15. Formative questioning in computer learning environments: a course for pre-service mathematics teachers

    NASA Astrophysics Data System (ADS)

    Akkoç, Hatice

    2015-11-01

    This paper focuses on a specific aspect of formative assessment, namely questioning. Given that computers have gained widespread use in learning and teaching, specific attention should be paid when organizing formative assessment in computer learning environments (CLEs). A course including various workshops was designed to develop knowledge and skills of questioning in CLEs. This study investigates how pre-service mathematics teachers used formative questioning with technological tools such as Geogebra and Graphic Calculus software. Participants were 35 pre-service mathematics teachers. To analyse formative questioning, two types of questions are investigated: mathematical questions and technical questions. Data were collected through lesson plans, teaching notes, interviews and observations. Descriptive statistics of the number of questions in the lesson plans before and after the workshops are presented. Examples of two types of questions are discussed using the theoretical framework. One pre-service teacher was selected and a deeper analysis of the way he used questioning during his three lessons was also conducted. The findings indicated an improvement in using technical questions for formative purposes and that the course provided a guideline in planning and using mathematical and technical questions in CLEs.

  16. World: A multipurpose GPS-network computer package

    NASA Astrophysics Data System (ADS)

    Grafarend, Erik W.; Lindlohr, Wolfgang

    WORLD is a multipurpose package to compute geodetic positions in geometry and gravity space. Here undifferenced GPS carrier beat phase observations are processed in the free network mode, namely by the prototype program called PUMA. Within two alternative model formulations, the classical Gauß-Markov Model and the so-called Mixed Model, simultaneously estimated / predicted parameters are those of type (i) Cartesian ground station coordinates (geodetic positioning), (ii) Cartesian satellite coordinates (orbit determination), (iii) receiver- and satellite-specific bias terms, (iv) initial epoch ambiguities and (v) proportional tropospheric corrections. The Mixed Model parameters appear from linearization as a point of stochastic prior information. Namely, the weight matrices of the stochastic prior information, e.g. for orbit parameters, are assumed to be known. Estimators of type BLUE and predictors of type inhom BLIP and hom BLUP are used. Chapter four discusses in detail the real analysis of GPS satellite networks of free type. Most notable are the estimated bias terms α, β, γ, in a twofold classification model. The operability of PUMA is demonstrated by the use of multistation phase observations (Wild-Magnavox WM 101-receivers) in a local Berlin network (six station network). It is documented that in spite of the advanced phase observation modelling an internal relative baseline accuracy (maximum baseline length 30 km) of the order of 3 to 5 ppm is achievable. In addition, the influence of orbital prior information on ground station measures, point position as well as accuracy, is demonstrated.
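
    For readers unfamiliar with the estimators named in this record, the BLUE of the unknown parameters in the classical Gauss-Markov model has the standard textbook form below; the notation is ours and is not taken from the WORLD/PUMA documentation.

```latex
% Classical Gauss-Markov model with weight matrix P and its BLUE estimate
\[
  \mathbf{y} = \mathbf{A}\,\boldsymbol{\xi} + \mathbf{e},
  \qquad
  E\{\mathbf{e}\} = \mathbf{0},
  \qquad
  D\{\mathbf{e}\} = \sigma^{2}\,\mathbf{P}^{-1},
\]
\[
  \hat{\boldsymbol{\xi}}_{\mathrm{BLUE}}
  = \bigl(\mathbf{A}^{\mathsf{T}}\mathbf{P}\mathbf{A}\bigr)^{-1}
    \mathbf{A}^{\mathsf{T}}\mathbf{P}\,\mathbf{y}.
\]
```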

  17. An efficient network for interconnecting remote monitoring instruments and computers

    SciTech Connect

    Halbig, J.K.; Gainer, K.E.; Klosterbuer, S.F.

    1994-08-01

    Remote monitoring instrumentation must be connected with computers and other instruments. The cost and intrusiveness of installing cables in new and existing plants present problems for the facility and the International Atomic Energy Agency (IAEA). The authors have tested a network that could accomplish this interconnection using mass-produced commercial components developed for use in industrial applications. Unlike components in the hardware of most networks, the components--manufactured and distributed in North America, Europe, and Asia--lend themselves to small and low-powered applications. The heart of the network is a chip with three microprocessors and proprietary network software contained in Read Only Memory. In addition to all nonuser levels of protocol, the software also contains message authentication capabilities. This chip can be interfaced to a variety of transmission media, for example, RS-485 lines, fiber optic cables, RF waves, and standard AC power lines. The use of power lines as the transmission medium in a facility could significantly reduce cabling costs.

  18. Universal quantum computation with ordered spin-chain networks

    NASA Astrophysics Data System (ADS)

    Tserkovnyak, Yaroslav; Loss, Daniel

    2011-09-01

    It is shown that anisotropic spin chains with gapped bulk excitations and magnetically ordered ground states offer a promising platform for quantum computation, which bridges the conventional single-spin-based qubit concept with recently developed topological Majorana-based proposals. We show how to realize the single-qubit Hadamard, phase, and π/8 gates as well as the two-qubit controlled-not (cnot) gate, which together form a fault-tolerant universal set of quantum gates. The gates are implemented by judiciously controlling Ising exchange and magnetic fields along a network of spin chains, with each individual qubit furnished by a spin-chain segment. A subset of single-qubit operations is geometric in nature, relying on control of anisotropy of spin interactions rather than their strength. We contrast topological aspects of the anisotropic spin-chain networks to those of p-wave superconducting wires discussed in the literature.
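
    For reference, the fault-tolerant universal gate set named in this record is, in the computational basis, the standard set of matrices below (textbook definitions, not details specific to the spin-chain construction).

```latex
% Hadamard, phase (S), pi/8 (T) and CNOT gates in the computational basis
\[
  H = \tfrac{1}{\sqrt{2}}\begin{pmatrix} 1 & 1 \\ 1 & -1 \end{pmatrix},
  \quad
  S = \begin{pmatrix} 1 & 0 \\ 0 & i \end{pmatrix},
  \quad
  T = \begin{pmatrix} 1 & 0 \\ 0 & e^{i\pi/4} \end{pmatrix},
  \quad
  \mathrm{CNOT} = \begin{pmatrix}
    1 & 0 & 0 & 0 \\
    0 & 1 & 0 & 0 \\
    0 & 0 & 0 & 1 \\
    0 & 0 & 1 & 0
  \end{pmatrix}.
\]
```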

  19. [Forensic evidence-based medicine in computer communication networks].

    PubMed

    Qiu, Yun-Liang; Peng, Ming-Qi

    2013-12-01

    As an important component of judicial expertise, forensic science is broad and highly specialized. With the development of network technology, the growth of information resources, and the improvement of people's legal consciousness, forensic scientists encounter many new problems and have been required to meet higher evidentiary standards in litigation. In view of this, an evidence-based approach should be established in forensic medicine. We should find the most suitable methods in the forensic science field and other related areas to solve specific problems in an evidence-based mode. Evidence-based practice can solve the problems in the legal medical field, and it will play a great role in promoting the progress and development of forensic science. This article reviews the basic theory of evidence-based medicine and its effects, approaches, methods, and evaluation in forensic medicine in order to discuss the application value of forensic evidence-based medicine in computer communication networks.

  20. High-performance computing, high-speed networks, and configurable computing environments: progress toward fully distributed computing.

    PubMed

    Johnston, W E; Jacobson, V L; Loken, S C; Robertson, D W; Tierney, B L

    1992-01-01

    The next several years will see the maturing of a collection of technologies that will enable fully and transparently distributed computing environments. Networks will be used to configure independent computing, storage, and I/O elements into "virtual systems" that are optimal for solving a particular problem. This environment will make the most powerful computing systems those that are logically assembled from network-based components and will also make those systems available to a widespread audience. Anticipating that the necessary technology and communications infrastructure will be available in the next 3 to 5 years, we are developing and demonstrating prototype applications that test and exercise the currently available elements of this configurable environment. The Lawrence Berkeley Laboratory (LBL) Information and Computing Sciences and Research Medicine Divisions have collaborated with the Pittsburgh Supercomputer Center to demonstrate one distributed application that illuminates the issues and potential of using networks to configure virtual systems. This application allows the interactive visualization of large three-dimensional (3D) scalar fields (voxel data sets) by using a network-based configuration of heterogeneous supercomputers and workstations. The specific test case is visualization of 3D magnetic resonance imaging (MRI) data. The virtual system architecture consists of a Connection Machine-2 (CM-2) that performs surface reconstruction from the voxel data, a Cray Y-MP that renders the resulting geometric data into an image, and a workstation that provides the display of the image and the user interface for specifying the parameters for the geometry generation and 3D viewing. These three elements are configured into a virtual system by using several different network technologies. This paper reviews the current status of the software, hardware, and communications technologies that are needed to enable this configurable environment. These

  1. Neural network computation with DNA strand displacement cascades.

    PubMed

    Qian, Lulu; Winfree, Erik; Bruck, Jehoshua

    2011-07-21

    The impressive capabilities of the mammalian brain--ranging from perception, pattern recognition and memory formation to decision making and motor activity control--have inspired their re-creation in a wide range of artificial intelligence systems for applications such as face recognition, anomaly detection, medical diagnosis and robotic vehicle control. Yet before neuron-based brains evolved, complex biomolecular circuits provided individual cells with the 'intelligent' behaviour required for survival. However, the study of how molecules can 'think' has not produced an equal variety of computational models and applications of artificial chemical systems. Although biomolecular systems have been hypothesized to carry out neural-network-like computations in vivo and the synthesis of artificial chemical analogues has been proposed theoretically, experimental work has so far fallen short of fully implementing even a single neuron. Here, building on the richness of DNA computing and strand displacement circuitry, we show how molecular systems can exhibit autonomous brain-like behaviours. Using a simple DNA gate architecture that allows experimental scale-up of multilayer digital circuits, we systematically transform arbitrary linear threshold circuits (an artificial neural network model) into DNA strand displacement cascades that function as small neural networks. Our approach even allows us to implement a Hopfield associative memory with four fully connected artificial neurons that, after training in silico, remembers four single-stranded DNA patterns and recalls the most similar one when presented with an incomplete pattern. Our results suggest that DNA strand displacement cascades could be used to endow autonomous chemical systems with the capability of recognizing patterns of molecular events, making decisions and responding to the environment. PMID:21776082
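
    The in-silico model that the DNA cascades emulate is the classical Hopfield associative memory. A minimal sketch of that conventional model is given below (Hebbian training and synchronous recall); the patterns, sizes, and function names are our own, and nothing here represents the molecular implementation itself.

```python
import numpy as np

# Sketch of the conventional (in silico) Hopfield associative memory that the
# DNA strand-displacement cascades emulate: Hebbian training, then iterative recall.
def train_hopfield(patterns):
    """Hebbian learning; patterns are +/-1 vectors."""
    P = np.array(patterns, dtype=float)
    n = P.shape[1]
    W = P.T @ P / n
    np.fill_diagonal(W, 0.0)          # no self-connections
    return W

def recall(W, probe, steps=10):
    """Synchronously update until the state stops changing."""
    s = np.array(probe, dtype=float)
    for _ in range(steps):
        new = np.sign(W @ s)
        new[new == 0] = 1.0           # break ties deterministically
        if np.array_equal(new, s):
            break
        s = new
    return s

if __name__ == "__main__":
    stored = [[1, -1, 1, -1, 1, -1], [1, 1, 1, -1, -1, -1]]
    W = train_hopfield(stored)
    noisy = [1, -1, 1, -1, 1, 1]      # corrupted version of the first pattern
    print(recall(W, noisy))           # recovers [1, -1, 1, -1, 1, -1]
```

    With the two stored patterns in this toy example, a probe with one flipped bit is pulled back to the nearest stored pattern, which is the "recall the most similar one" behaviour the abstract describes at molecular scale.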

  2. Universal quantum computation with a nonlinear oscillator network

    NASA Astrophysics Data System (ADS)

    Goto, Hayato

    2016-05-01

    We theoretically show that a nonlinear oscillator network with controllable parameters can be used for universal quantum computation. The initialization is achieved by a quantum-mechanical bifurcation based on quantum adiabatic evolution, which yields a Schrödinger cat state. All the elementary quantum gates are also achieved by quantum adiabatic evolution, in which dynamical phases accompanying the adiabatic evolutions are controlled by the system parameters. Numerical simulation results indicate that high gate fidelities can be achieved, where no dissipation is assumed.

  3. A Flexible Software Scheme for a Clinical Laboratory Computer Network

    PubMed Central

    Foulis, Philip R.; Megargle, Robert; Shecket, Gordon; Su, Joseph; Dartt, Arthur

    1983-01-01

    A laboratory information management system (LMS) must disseminate pertinent data to other hospital departments and organize laboratory workflow, while remaining flexible to conform to the organization and practices of the laboratory. While stand-alone LMS's excel at providing versatility through specialized functions like direct instrument interfaces, total hospital information systems (HIS's) are better at combining and distributing diverse data. In general, neither of these approaches has provided the level of performance desired in a modern hospital environment. A formalized scheme of implementing an LMS with a network of computers has been devised to provide the advantages of both approaches, and incorporate advanced levels of customization.

  4. Self-Assessment and Student Improvement in an Introductory Computer Course at the Community College-Level

    ERIC Educational Resources Information Center

    Spicer-Sutton, Jama

    2013-01-01

    The purpose of this study was to determine a student's computer knowledge upon course entry and if there was a difference in college students' improvement scores as measured by the difference in pretest and posttest scores of new or novice users, moderate users, and expert users at the end of a college-level introductory computing class.…

  5. The Use of Modular Computer-Based Lessons in a Modification of the Classical Introductory Course in Organic Chemistry.

    ERIC Educational Resources Information Center

    Stotter, Philip L.; Culp, George H.

    An experimental course in organic chemistry utilized computer-assisted instructional (CAI) techniques. The CAI lessons provided tutorial drill and practice and simulated experiments and reactions. The Conversational Language for Instruction and Computing was used, along with a CDC 6400-6600 system; students scheduled and completed the lessons at…

  6. Pilot Studies of In-Course Assessment for a Revised Medical Curriculum: II. Computer-Based, Individual.

    ERIC Educational Resources Information Center

    Miller, Andrew P.; Haden, Patricia; Schwartz, Peter L.; Loten, Ernest G.

    1997-01-01

    A study investigated a computer-based testing method in an anatomic pathology course within a new, modular, systems-oriented medical curriculum at the University of Otago (New Zealand). Students (n=193) completed five biweekly criterion-referenced, computer-based quizzes incorporating digitized photographs and varied question formats. Results…

  7. Application of a distributed network in computational fluid dynamic simulations

    NASA Technical Reports Server (NTRS)

    Deshpande, Manish; Feng, Jinzhang; Merkle, Charles L.; Deshpande, Ashish

    1994-01-01

    A general-purpose 3-D, incompressible Navier-Stokes algorithm is implemented on a network of concurrently operating workstations using parallel virtual machine (PVM) and compared with its performance on a CRAY Y-MP and on an Intel iPSC/860. The problem is relatively computationally intensive, and has a communication structure based primarily on nearest-neighbor communication, making it ideally suited to message passing. Such problems are frequently encountered in computational fluid dynamics (CFD), and their solution is increasingly in demand. The communication structure is explicitly coded in the implementation to fully exploit the regularity in message passing in order to produce a near-optimal solution. Results are presented for various grid sizes using up to eight processors.
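
    As a present-day illustration of the nearest-neighbour communication structure this record describes (the original work used PVM on workstations), the sketch below is a serial Python stand-in with explicit ghost cells; every name in it is our own and it is not the authors' code.

```python
import numpy as np

# Illustrative sketch: 1-D domain decomposition with nearest-neighbour "halo"
# exchange, the communication pattern the abstract calls ideally suited to
# message passing. Here all "processors" live in one process for clarity.
def split_with_halos(field, nprocs):
    chunks = np.array_split(field, nprocs)
    return [np.concatenate(([0.0], c, [0.0])) for c in chunks]   # add ghost cells

def exchange_halos(chunks):
    for i, c in enumerate(chunks):
        c[0] = chunks[i - 1][-2] if i > 0 else c[1]                    # left neighbour
        c[-1] = chunks[i + 1][1] if i < len(chunks) - 1 else c[-2]     # right neighbour

def jacobi_step(chunks):
    for c in chunks:
        c[1:-1] = 0.5 * (c[:-2] + c[2:])   # simple smoothing stencil on interior cells

if __name__ == "__main__":
    field = np.linspace(0.0, 1.0, 32)
    parts = split_with_halos(field, nprocs=4)
    for _ in range(10):
        exchange_halos(parts)
        jacobi_step(parts)
```

    In a real message-passing run, each chunk would live on its own processor and exchange_halos would become a pair of send/receive calls per neighbour, which is the regularity the implementation exploits.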

  8. A New Stochastic Computing Methodology for Efficient Neural Network Implementation.

    PubMed

    Canals, Vincent; Morro, Antoni; Oliver, Antoni; Alomar, Miquel L; Rosselló, Josep L

    2016-03-01

    This paper presents a new methodology for the hardware implementation of neural networks (NNs) based on probabilistic laws. The proposed encoding scheme circumvents the limitations of classical stochastic computing (based on unipolar or bipolar encoding), extending the representation range to any real number using the ratio of two bipolar-encoded pulsed signals. Furthermore, the novel approach offers practically total noise immunity due to its specific codification. We introduce different designs for building the fundamental blocks needed to implement NNs. The validity of the present approach is demonstrated through a regression and a pattern recognition task. The low cost of the methodology in terms of hardware, along with its capacity to implement complex mathematical functions (such as the hyperbolic tangent), allows its use for building highly reliable systems and parallel computing.
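
    For context, the classical bipolar encoding that the paper extends represents a value x in [-1, 1] as a pulse stream with P(pulse = 1) = (x + 1)/2, and multiplies two such values with a bitwise XNOR. The sketch below illustrates only this classical baseline (stream length and function names are ours), not the two-signal ratio encoding proposed in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Classical bipolar stochastic computing, the baseline the paper extends:
# a value x in [-1, 1] becomes a pulse stream with P(pulse = 1) = (x + 1) / 2,
# and multiplication reduces to a bitwise XNOR of two independent streams.
def encode(x, n_bits=100_000):
    return rng.random(n_bits) < (x + 1.0) / 2.0

def decode(stream):
    return 2.0 * stream.mean() - 1.0

def multiply(a_stream, b_stream):
    return ~(a_stream ^ b_stream)     # XNOR implements bipolar multiplication

if __name__ == "__main__":
    a, b = 0.6, -0.5
    prod = decode(multiply(encode(a), encode(b)))
    print(round(prod, 2))             # approximately a * b = -0.30
```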

  9. NML computation algorithms for tree-structured multinomial Bayesian networks.

    PubMed

    Kontkanen, Petri; Wettig, Hannes; Myllymäki, Petri

    2007-01-01

    Typical problems in bioinformatics involve large discrete datasets. Therefore, in order to apply statistical methods in such domains, it is important to develop efficient algorithms suitable for discrete data. The minimum description length (MDL) principle is a theoretically well-founded, general framework for performing statistical inference. The mathematical formalization of MDL is based on the normalized maximum likelihood (NML) distribution, which has several desirable theoretical properties. In the case of discrete data, straightforward computation of the NML distribution requires exponential time with respect to the sample size, since the definition involves a sum over all the possible data samples of a fixed size. In this paper, we first review some existing algorithms for efficient NML computation in the case of multinomial and naive Bayes model families. Then we proceed by extending these algorithms to more complex, tree-structured Bayesian networks. PMID:18382603
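
    The exponential-time sum mentioned in this record comes directly from the definition of the NML distribution; for a model class M and a data set x^n of size n it reads as follows (standard MDL notation, not taken from this paper).

```latex
% Normalized maximum likelihood (NML) distribution for model class M
\[
  P_{\mathrm{NML}}\bigl(x^{n} \mid \mathcal{M}\bigr)
  = \frac{P\bigl(x^{n} \mid \hat{\theta}(x^{n}), \mathcal{M}\bigr)}
         {\sum_{y^{n}} P\bigl(y^{n} \mid \hat{\theta}(y^{n}), \mathcal{M}\bigr)},
\]
```

    where the denominator, the parametric complexity, sums the maximized likelihood over every possible data set of size n; this is the term the efficient algorithms are designed to compute.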

  10. Complex network problems in physics, computer science and biology

    NASA Astrophysics Data System (ADS)

    Cojocaru, Radu Ionut

    There is a close relation between physics and mathematics, and the exchange of ideas between these two sciences is well established. However, until a few years ago there was no such close relation between physics and computer science. Moreover, only recently have biologists started to use methods and tools from statistical physics to study the behavior of complex systems. In this thesis we concentrate on applying and analyzing several methods borrowed from computer science in biology, and we also use methods from statistical physics to solve hard problems from computer science. In recent years physicists have been interested in studying the behavior of complex networks. Physics is an experimental science in which theoretical predictions are compared to experiments. In this definition, the term prediction plays a very important role: although the system is complex, it is still possible to get predictions for its behavior, but these predictions are of a probabilistic nature. Spin glasses, lattice gases or the Potts model are a few examples of complex systems in physics. Spin glasses and many frustrated antiferromagnets map exactly to computer science problems in the NP-hard class defined in Chapter 1. In Chapter 1 we discuss a common result from artificial intelligence (AI) which shows that there are some problems which are NP-complete, with the implication that these problems are difficult to solve. We introduce a few well known hard problems from computer science (Satisfiability, Coloring, Vertex Cover together with Maximum Independent Set and Number Partitioning) and then discuss their mapping to problems from physics. In Chapter 2 we provide a short review of combinatorial optimization algorithms and their applications to ground state problems in disordered systems. We discuss the cavity method initially developed for studying the Sherrington-Kirkpatrick model of spin glasses. We extend this model to the study of a specific case of spin glass on the Bethe

  11. Efficient shortest-path-tree computation in network routing based on pulse-coupled neural networks.

    PubMed

    Qu, Hong; Yi, Zhang; Yang, Simon X

    2013-06-01

    Shortest path tree (SPT) computation is a critical issue for routers using link-state routing protocols, such as the most commonly used open shortest path first and intermediate system to intermediate system. Each router needs to recompute a new SPT rooted from itself whenever a change happens in the link state. Most commercial routers do this computation by deleting the current SPT and building a new one using static algorithms such as the Dijkstra algorithm at the beginning. Such recomputation of an entire SPT is inefficient, which may consume a considerable amount of CPU time and result in a time delay in the network. Some dynamic updating methods using the information in the updated SPT have been proposed in recent years. However, there are still many limitations in those dynamic algorithms. In this paper, a new modified model of pulse-coupled neural networks (M-PCNNs) is proposed for the SPT computation. It is rigorously proved that the proposed model is capable of solving some optimization problems, such as the SPT. A static algorithm is proposed based on the M-PCNNs to compute the SPT efficiently for large-scale problems. In addition, a dynamic algorithm that makes use of the structure of the previously computed SPT is proposed, which significantly improves the efficiency of the algorithm. Simulation results demonstrate the effective and efficient performance of the proposed approach. PMID:23144039
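
    For readers who want the static baseline this record contrasts against, a compact sketch of Dijkstra's algorithm building an SPT as a parent map is given below; the graph representation and names are our own, and this is the textbook algorithm, not the M-PCNN model.

```python
import heapq

# Reference sketch of the static baseline: Dijkstra's algorithm building a
# shortest-path tree (SPT) rooted at `root`, returned as distances plus a parent map.
def dijkstra_spt(graph, root):
    """graph: {node: [(neighbour, non-negative link cost), ...]}"""
    dist = {root: 0.0}
    parent = {root: None}             # the SPT itself
    heap = [(0.0, root)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float("inf")):
            continue                  # stale heap entry
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v], parent[v] = nd, u
                heapq.heappush(heap, (nd, v))
    return dist, parent

if __name__ == "__main__":
    net = {"A": [("B", 1), ("C", 4)], "B": [("C", 2), ("D", 5)], "C": [("D", 1)], "D": []}
    print(dijkstra_spt(net, "A"))
```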

  12. Efficient shortest-path-tree computation in network routing based on pulse-coupled neural networks.

    PubMed

    Qu, Hong; Yi, Zhang; Yang, Simon X

    2013-06-01

    Shortest path tree (SPT) computation is a critical issue for routers using link-state routing protocols, such as the most commonly used open shortest path first and intermediate system to intermediate system. Each router needs to recompute a new SPT rooted from itself whenever a change happens in the link state. Most commercial routers do this computation by deleting the current SPT and building a new one using static algorithms such as the Dijkstra algorithm at the beginning. Such recomputation of an entire SPT is inefficient, which may consume a considerable amount of CPU time and result in a time delay in the network. Some dynamic updating methods using the information in the updated SPT have been proposed in recent years. However, there are still many limitations in those dynamic algorithms. In this paper, a new modified model of pulse-coupled neural networks (M-PCNNs) is proposed for the SPT computation. It is rigorously proved that the proposed model is capable of solving some optimization problems, such as the SPT. A static algorithm is proposed based on the M-PCNNs to compute the SPT efficiently for large-scale problems. In addition, a dynamic algorithm that makes use of the structure of the previously computed SPT is proposed, which significantly improves the efficiency of the algorithm. Simulation results demonstrate the effective and efficient performance of the proposed approach.

  13. Implications of computer networking and the Internet for nurse education.

    PubMed

    Ward, R

    1997-06-01

    This paper sets out the history of computer networking and its use in nursing and health care education, and places this in its wider historical and social context. The increasing availability and use of computer networks and the internet are producing a changing climate in education as well as in health care. Moves away from traditional face-to-face teaching within a campus institution to widely distributed interactive multimedia learning will affect the roles of students and teachers. The use of electronic mail, mailing lists and the World Wide Web is specifically considered, along with changes to library and information management skills, research methods, journal publication and the like. Issues about the quality, as well as quantity, of information available are considered. As more and more organizations and institutions begin to use electronic communication methods, it becomes an increasingly important part of the curriculum at all levels, and may lead to fundamental changes in geographical and professional boundaries. A glossary of terms is provided for those not familiar with the technology, along with the contact details for mailing lists and World Wide Web pages mentioned. PMID:9277156

  14. Applied and computational harmonic analysis on graphs and networks

    NASA Astrophysics Data System (ADS)

    Irion, Jeff; Saito, Naoki

    2015-09-01

    In recent years, the advent of new sensor technologies and social network infrastructure has provided huge opportunities and challenges for analyzing data recorded on such networks. In the case of data on regular lattices, computational harmonic analysis tools such as the Fourier and wavelet transforms have well-developed theories and proven track records of success. It is therefore quite important to extend such tools from the classical setting of regular lattices to the more general setting of graphs and networks. In this article, we first review basics of graph Laplacian matrices, whose eigenpairs are often interpreted as the frequencies and the Fourier basis vectors on a given graph. We point out, however, that such an interpretation is misleading unless the underlying graph is either an unweighted path or cycle. We then discuss our recent effort of constructing multiscale basis dictionaries on a graph, including the Hierarchical Graph Laplacian Eigenbasis Dictionary and the Generalized Haar-Walsh Wavelet Packet Dictionary, which are viewed as generalizations of the classical hierarchical block DCTs and the Haar-Walsh wavelet packets, respectively, to the graph setting. Finally, we demonstrate the usefulness of our dictionaries by using them to simultaneously segment and denoise 1-D noisy signals sampled on regular lattices, a problem where classical tools have difficulty.
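
    The objects reviewed at the start of that article are standard: for an undirected weighted graph with weight matrix W, the (unnormalized) graph Laplacian and its eigenpairs are defined as below (textbook definitions, not notation taken from the paper).

```latex
% Unnormalized graph Laplacian and its eigen-decomposition
\[
  L = D - W,
  \qquad
  D_{ii} = \sum_{j} W_{ij},
  \qquad
  L\,\varphi_{k} = \lambda_{k}\,\varphi_{k},
  \quad
  0 = \lambda_{0} \le \lambda_{1} \le \dots \le \lambda_{N-1}.
\]
```

    The eigenvectors φ_k are commonly read as a Fourier basis on the graph and the eigenvalues λ_k as the corresponding frequencies; this is precisely the interpretation the authors caution is only literally correct for unweighted paths and cycles.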

  15. Eye tracking using artificial neural networks for human computer interaction.

    PubMed

    Demjén, E; Aboši, V; Tomori, Z

    2011-01-01

    This paper describes an ongoing project that has the aim to develop a low cost application to replace a computer mouse for people with physical impairment. The application is based on an eye tracking algorithm and assumes that the camera and the head position are fixed. Color tracking and template matching methods are used for pupil detection. Calibration is provided by neural networks as well as by parametric interpolation methods. Neural networks use back-propagation for learning and bipolar sigmoid function is chosen as the activation function. The user's eye is scanned with a simple web camera with backlight compensation which is attached to a head fixation device. Neural networks significantly outperform parametric interpolation techniques: 1) the calibration procedure is faster as they require less calibration marks and 2) cursor control is more precise. The system in its current stage of development is able to distinguish regions at least on the level of desktop icons. The main limitation of the proposed method is the lack of head-pose invariance and its relative sensitivity to illumination (especially to incidental pupil reflections).
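
    The bipolar sigmoid activation mentioned in this record is the standard one; for completeness, its definition and the derivative used in back-propagation are given below (textbook formulas, not details taken from the paper).

```latex
% Bipolar sigmoid activation and its derivative (used in back-propagation)
\[
  f(x) = \frac{2}{1 + e^{-x}} - 1 = \tanh\!\left(\frac{x}{2}\right),
  \qquad
  f'(x) = \frac{1}{2}\bigl(1 + f(x)\bigr)\bigl(1 - f(x)\bigr).
\]
```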

  16. Brain without mind: Computer simulation of neural networks with modifiable neuronal interactions

    NASA Astrophysics Data System (ADS)

    Clark, John W.; Rafelski, Johann; Winston, Jeffrey V.

    1985-07-01

    Aspects of brain function are examined in terms of a nonlinear dynamical system of highly interconnected neuron-like binary decision elements. The model neurons operate synchronously in discrete time, according to deterministic or probabilistic equations of motion. Plasticity of the nervous system, which underlies such cognitive collective phenomena as adaptive development, learning, and memory, is represented by temporal modification of interneuronal connection strengths depending on momentary or recent neural activity. A formal basis is presented for the construction of local plasticity algorithms, or connection-modification routines, spanning a large class. To build an intuitive understanding of the behavior of discrete-time network models, extensive computer simulations have been carried out (a) for nets with fixed, quasirandom connectivity and (b) for nets with connections that evolve under one or another choice of plasticity algorithm. From the former experiments, insights are gained concerning the spontaneous emergence of order in the form of cyclic modes of neuronal activity. In the course of the latter experiments, a simple plasticity routine (“brainwashing,” or “anti-learning”) was identified which, applied to nets with initially quasirandom connectivity, creates model networks which provide more felicitous starting points for computer experiments on the engramming of content-addressable memories and on learning more generally. The potential relevance of this algorithm to developmental neurobiology and to sleep states is discussed. The model considered is at the same time a synthesis of earlier synchronous neural-network models and an elaboration upon them; accordingly, the present article offers both a focused review of the dynamical properties of such systems and a selection of new findings derived from computer simulation.
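
    A minimal sketch of the kind of synchronous, discrete-time binary network this record describes is given below: fixed quasirandom weights and deterministic threshold updates, run until the state sequence repeats, which exposes the cyclic modes of activity mentioned in the abstract. The sizes, seed, and function names are our own, and no plasticity or "brainwashing" routine is included.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synchronous discrete-time network of binary neurons with fixed quasirandom
# weights; the trajectory of a finite deterministic system must eventually cycle.
def step(state, W, threshold=0.0):
    return (W @ state > threshold).astype(int)

def find_cycle(W, state, max_steps=70_000):
    seen = {}
    for t in range(max_steps):
        key = tuple(state)
        if key in seen:
            return seen[key], t - seen[key]   # (transient length, cycle length)
        seen[key] = t
        state = step(state, W)
    return None

if __name__ == "__main__":
    n = 16
    W = rng.normal(size=(n, n))               # fixed quasirandom connectivity
    s0 = rng.integers(0, 2, size=n)           # random initial activity pattern
    print(find_cycle(W, s0))
```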

  17. Teaching Advanced Concepts in Computer Networks: VNUML-UM Virtualization Tool

    ERIC Educational Resources Information Center

    Ruiz-Martinez, A.; Pereniguez-Garcia, F.; Marin-Lopez, R.; Ruiz-Martinez, P. M.; Skarmeta-Gomez, A. F.

    2013-01-01

    In the teaching of computer networks the main problem that arises is the high price and limited number of network devices the students can work with in the laboratories. Nowadays, with virtualization we can overcome this limitation. In this paper, we present a methodology that allows students to learn advanced computer network concepts through…

  18. Network Monitoring and Fault Detection on the University of Illinois at Urbana-Champaign Campus Computer Network.

    ERIC Educational Resources Information Center

    Sng, Dennis Cheng-Hong

    The University of Illinois at Urbana-Champaign (UIUC) has a large campus computer network serving a community of about 20,000 users. With such a large network, it is inevitable that there are a wide variety of technologies co-existing in a multi-vendor environment. Effective network monitoring tools can help monitor traffic and link usage, as well…

  19. Spiral and Project-Based Learning with Peer Assessment in a Computer Science Project Management Course

    NASA Astrophysics Data System (ADS)

    Jaime, Arturo; Blanco, José Miguel; Domínguez, César; Sánchez, Ana; Heras, Jónathan; Usandizaga, Imanol

    2016-06-01

    Different learning methods such as project-based learning, spiral learning and peer assessment have been implemented in science disciplines with different outcomes. This paper presents a proposal for a project management course in the context of a computer science degree. Our proposal combines three well-known methods: project-based learning, spiral learning and peer assessment. Namely, the course is articulated during a semester through the structured (progressive and incremental) development of a sequence of four projects, whose duration, scope and difficulty of management increase as the student gains theoretical and instrumental knowledge related to planning, monitoring and controlling projects. Moreover, the proposal is complemented using peer assessment. The proposal has already been implemented and validated for the last 3 years in two different universities. In the first year, project-based learning and spiral learning methods were combined. Such a combination was also employed in the other 2 years; but additionally, students had the opportunity to assess projects developed by university partners and by students of the other university. A total of 154 students have participated in the study. We obtain a gain in the quality of the subsequent projects derived from the spiral project-based learning. Moreover, this gain is significantly larger when peer assessment is introduced. In addition, high-performance students take advantage of peer assessment from the first moment, whereas the improvement in poor-performance students is delayed.

  20. Developing and validating test items for first-year computer science courses

    NASA Astrophysics Data System (ADS)

    Vahrenhold, Jan; Paul, Wolfgang

    2014-10-01

    We report on the development, validation, and implementation of a collection of test items designed to detect misconceptions related to first-year computer science courses. To this end, we reworked the development scheme proposed by Almstrum et al. (SIGCSE Bulletin 38(4):132-145, 2006) to include students' artifacts and to simultaneously incorporate think-aloud interviews and flash tests. We also investigated to what extent the practical efficiency of detecting certain misconceptions could be increased without significantly affecting the sensitivity of the instrument, and present positive and negative results regarding this goal. The results of a first transfer and implementation study suggest that it is indeed possible to use the test items in a large-scale practical setting - both as diagnostic instruments and as interventions.

  1. Hazard Communication Project: computer-based training course (for microcomputers). Software

    SciTech Connect

    Fisher, S.

    1989-03-01

    The software is computer-based training with the following course objectives: to inform employees of their employer's requirements under the OSHA Hazard Communication Standard (29 CFR 1910.1200); to instruct employees on the procedures for obtaining and using information on hazardous materials, including understanding labeling systems and understanding the material safety data sheet (MSDS) information; to provide information on 11 classes of chemicals, including their common uses, potential physical and health hazards, detection methods, and safety measures to follow. There are 14 lessons, ranging in length from 30 minutes to one hour. Software Description: The software is written in the UNISON language for use on an IBM PC or compatible machines using the MS DOS operating system. It requires 378K of memory. Special requirements are an EGA graphics card and monitor. The program will not run on monochrome or CGA systems.

  2. Hierarchical surface code for network quantum computing with modules of arbitrary size

    NASA Astrophysics Data System (ADS)

    Li, Ying; Benjamin, Simon C.

    2016-10-01

    The network paradigm for quantum computing involves interconnecting many modules to form a scalable machine. Typically it is assumed that the links between modules are prone to noise while operations within modules have a significantly higher fidelity. To optimize fault tolerance in such architectures we introduce a hierarchical generalization of the surface code: a small "patch" of the code exists within each module and constitutes a single effective qubit of the logic-level surface code. Errors primarily occur in a two-dimensional subspace, i.e., patch perimeters extruded over time, and the resulting noise threshold for intermodule links can exceed ~10% even in the absence of purification. Increasing the number of qubits within each module decreases the number of qubits necessary for encoding a logical qubit. But this advantage is relatively modest, and broadly speaking, a "fine-grained" network of small modules containing only about eight qubits is competitive in total qubit count versus a "coarse" network with modules containing many hundreds of qubits.

  3. 10 CFR 73.54 - Protection of digital computer and communication systems and networks.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 10 Energy 2 2013-01-01 2013-01-01 false Protection of digital computer and communication systems... computer and communication systems and networks. By November 23, 2009 each licensee currently licensed to... provide high assurance that digital computer and communication systems and networks are...

  4. 10 CFR 73.54 - Protection of digital computer and communication systems and networks.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 10 Energy 2 2010-01-01 2010-01-01 false Protection of digital computer and communication systems... computer and communication systems and networks. By November 23, 2009 each licensee currently licensed to... provide high assurance that digital computer and communication systems and networks are...

  5. 10 CFR 73.54 - Protection of digital computer and communication systems and networks.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 10 Energy 2 2011-01-01 2011-01-01 false Protection of digital computer and communication systems... computer and communication systems and networks. By November 23, 2009 each licensee currently licensed to... provide high assurance that digital computer and communication systems and networks are...

  6. 10 CFR 73.54 - Protection of digital computer and communication systems and networks.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 10 Energy 2 2014-01-01 2014-01-01 false Protection of digital computer and communication systems... computer and communication systems and networks. By November 23, 2009 each licensee currently licensed to... provide high assurance that digital computer and communication systems and networks are...

  7. 10 CFR 73.54 - Protection of digital computer and communication systems and networks.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 10 Energy 2 2012-01-01 2012-01-01 false Protection of digital computer and communication systems... computer and communication systems and networks. By November 23, 2009 each licensee currently licensed to... provide high assurance that digital computer and communication systems and networks are...

  8. Instructing K-12 Teachers in Computer Networking and K-12 Instructional Practices in Computer Networking: Linking Teachers and Students to the Global Village.

    ERIC Educational Resources Information Center

    Amoroso, Henry C., Jr.; And Others

    How best to introduce computer mediated communication (CMC) technology to kindergarten through grade 12 educators was studied in Maine. Faculty from the University of Southern Maine (Portland) offered a 1-week institute in global computing as part of a 3-credit graduate education course. Participants had the entire semester to develop and…

  9. Including Internet insurance as part of a hospital computer network security plan.

    PubMed

    Riccardi, Ken

    2002-01-01

    Cyber attacks on a hospital's computer network are a new crime to be reckoned with. Should your hospital consider internet insurance? The author explains this new phenomenon and presents a risk assessment for determining network vulnerabilities.

  10. Including Internet insurance as part of a hospital computer network security plan.

    PubMed

    Riccardi, Ken

    2002-01-01

    Cyber attacks on a hospital's computer network are a new crime to be reckoned with. Should your hospital consider internet insurance? The author explains this new phenomenon and presents a risk assessment for determining network vulnerabilities. PMID:11951384

  11. High Energy Physics Computer Networking: Report of the HEPNET Review Committee

    SciTech Connect

    Not Available

    1988-06-01

    This paper discusses the computer networks available to high energy physics facilities for transmission of data. Topics covered in this paper are: Existing and planned networks and HEPNET requirements. (LSP)

  12. Measures of effectiveness for BMD mid-course tracking on MIMD massively parallel computers

    SciTech Connect

    VanDyke, J.P.; Tomkins, J.L.; Furnish, M.D.

    1995-05-01

    The TRC code, a mid-course tracking code for ballistic missiles, has previously been implemented on a 1024-processor MIMD (Multiple Instruction -- Multiple Data) massively parallel computer. Measures of Effectiveness (MOE) for this algorithm have been developed for this computing environment. The MOE code is run in parallel with the TRC code. Particularly useful MOEs include the number of missed objects (real objects for which the TRC algorithm did not construct a track); of ghost tracks (tracks not corresponding to a real object); of redundant tracks (multiple tracks corresponding to a single real object); and of unresolved objects (multiple objects corresponding to a single track). All of these are expressed as a function of time, and tend to maximize during the time in which real objects are spawned (multiple reentry vehicles per post-boost vehicle). As well, it is possible to measure the track-truth separation as a function of time. A set of calculations is presented illustrating these MOEs as a function of time for a case with 99 post-boost vehicles, each of which spawns 9 reentry vehicles.
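
    A hedged sketch of how three of the four MOEs named in this record can be tallied at a single time step is shown below. It is our own formulation, not the TRC/MOE code; it assumes each track has already been associated with at most one true object, so the unresolved-object measure is not represented.

```python
from collections import Counter

# Tally missed objects, ghost tracks, and redundant tracks at one time step,
# given an association of each track to a true object (or to None for a ghost).
def moes(track_to_object, all_objects):
    """track_to_object: {track_id: object_id or None (ghost)}."""
    hits = Counter(obj for obj in track_to_object.values() if obj is not None)
    ghost = sum(1 for obj in track_to_object.values() if obj is None)
    missed = sum(1 for obj in all_objects if hits[obj] == 0)
    redundant = sum(count - 1 for count in hits.values() if count > 1)
    return {"missed": missed, "ghost": ghost, "redundant": redundant}

if __name__ == "__main__":
    assoc = {"t1": "rv1", "t2": "rv1", "t3": None, "t4": "rv3"}
    print(moes(assoc, all_objects=["rv1", "rv2", "rv3"]))
    # {'missed': 1, 'ghost': 1, 'redundant': 1}
```

    Evaluated at every time step, these tallies give the time histories the abstract describes, peaking while multiple reentry vehicles are being spawned.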

  13. Using computer-assisted demonstrations of optical phenomena in an undergraduate optics course

    NASA Astrophysics Data System (ADS)

    Tarvin, John T.; Cobb, Stephen H.; Beyer, Louis M.

    1995-10-01

    A set of computer programs has been developed for the visual presentation of introductory optical phenomena. These computer simulations were created to serve a dual purpose: as demonstration aids in an NSF-sponsored Optics Demonstration Laboratory, and as teaching aids in undergraduate geometrical and physical optics courses. In the field of diffractive optics, simulations include the calculation of intensity patterns for unobscured and obscured apertures in both rectangular and circular geometries. These patterns can be compared to those measured in the laboratory with a CCD camera. A program for calculating the diffraction pattern for a two-dimensional aperture of arbitrary shape has also been developed. These programs, when coordinated with homework assignments, allow students to compare their theoretical derivations with a correct numerical solution for the same problem. In the field of geometrical optics, a ray-trace program appropriate for gradient-index fibers with cylindrical symmetry has been developed. This program enables the student to study the focusing properties of such fibers, and to predict how such properties depend on the index profile and on the length of the optical fiber. Examples of these programs will be presented, along with a report on the success of these programs as a vehicle for imparting a conceptual understanding of the physical principles involved.
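
    The diffraction simulations described lend themselves to a very short numerical sketch: in the Fraunhofer (far-field) regime the intensity pattern of an arbitrary 2-D aperture is proportional to the squared magnitude of its Fourier transform. The code below is our own illustration, not the authors' software; the array sizes, aperture radius, and function names are assumptions for the example.

```python
import numpy as np

# Far-field (Fraunhofer) intensity of an arbitrary 2-D aperture via the FFT.
def fraunhofer_intensity(aperture):
    field = np.fft.fftshift(np.fft.fft2(aperture))
    intensity = np.abs(field) ** 2
    return intensity / intensity.max()        # normalize to peak value 1

if __name__ == "__main__":
    N = 512
    y, x = np.mgrid[-N // 2:N // 2, -N // 2:N // 2]
    circular = (x**2 + y**2 < 20**2).astype(float)   # unobscured circular aperture
    pattern = fraunhofer_intensity(circular)          # Airy-like diffraction pattern
    print(pattern.shape, pattern.max())
```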

  14. Students at the University of Abertay Dundee Learn Computer Hacking to Defend Networks

    ERIC Educational Resources Information Center

    Vance, Erik

    2007-01-01

    In this article, the author describes a new cybersecurity course at the University of Abertay Dundee in Scotland. Geoffrey R. Lund, leader of the software-applications program at Abertay, says the course prepares students for a rapidly growing job market by teaching that the best defense is a good offense. Professors set up a network of 20 or so…

  15. Optimization of analytical laboratory work using computer networking and databasing

    SciTech Connect

    Upp, D.L.; Metcalf, R.A.

    1996-06-01

    The Health Physics Analysis Laboratory (HPAL) performs around 600,000 analyses for radioactive nuclides each year at Los Alamos National Laboratory (LANL). Analysis matrices range from nasal swipes, air filters, work area swipes, and liquids to the bottoms of shoes and cat litter. HPAL uses 8 liquid scintillation counters, 8 gas proportional counters, and 9 high purity germanium detectors in 5 laboratories to perform these analyses. HPAL has developed a computer network between the labs and software to produce analysis results. The software and hardware package includes barcode sample tracking, log-in, chain of custody, analysis calculations, analysis result printing, and utility programs. All data are written to a database, mirrored on a central server, and eventually written to CD-ROM to provide for online historical results. This system has greatly reduced the work required to produce analysis results, as well as improving the quality of the work performed.

  16. Computational models of signalling networks for non-linear control.

    PubMed

    Fuente, Luis A; Lones, Michael A; Turner, Alexander P; Stepney, Susan; Caves, Leo S; Tyrrell, Andy M

    2013-05-01

    Artificial signalling networks (ASNs) are a computational approach inspired by the signalling processes inside cells that decode outside environmental information. Using evolutionary algorithms to induce complex behaviours, we show how chaotic dynamics in a conservative dynamical system can be controlled. Such dynamics are of particular interest as they mimic the inherent complexity of non-linear physical systems in the real world. Considering the main biological interpretations of cellular signalling, in which complex behaviours and robust cellular responses emerge from the interaction of multiple pathways, we introduce two ASN representations: a stand-alone ASN and a coupled ASN. In particular we note how sophisticated cellular communication mechanisms can lead to effective controllers, where complicated problems can be divided into smaller and independent tasks.

  17. Security of social network credentials for accessing course portal: Users' experience

    NASA Astrophysics Data System (ADS)

    Katuk, Norliza; Fong, Choo Sok; Chun, Koo Lee

    2015-12-01

    Social login (SL) has recently emerged as a solution for single sign-on (SSO) within the web and mobile environments. It allows users to use their existing social network credentials (SNC) to log in to third-party web applications without the need to create a new identity in the intended applications' database. Although it has been used by many web application providers, its applicability in accessing learning materials has not yet been fully investigated. Hence, this research aims to explore users' (i.e., instructors' and students') perceptions of and experience with the security of SL for accessing learning contents. A course portal was developed for students at a higher learning institution, and it provides two types of user authentication: (i) traditional user authentication, and (ii) an SL facility. Users, comprising instructors and students, evaluated the login facility of the course portal through a controlled lab experimental study following a within-subject design. The participants provided their feedback in terms of the security of SL for accessing learning contents. The study revealed that users preferred SL over traditional authentication; however, they were concerned about the security of SL and their privacy.

  18. Distributed Sensor Network With Collective Computation For Situational Awareness

    NASA Astrophysics Data System (ADS)

    Dreicer, Jared S.; Jorgensen, Anders M.; Dors, Eric E.

    2002-10-01

    Initiated under Laboratory Directed R&D funding we have engaged in empirical studies, theory development, and initial hardware development for a ground-based Distributed Sensor Network with Collective Computation (DSN-CC). A DSN-CC is a network that uses node-to-node communication and on-board processing to achieve gains in response time, power usage, communication bandwidth, detection resolution, and robustness. DSN-CCs are applicable to both military and civilian problems where massive amounts of data gathered over a large area must be processed to yield timely conclusions. We have built prototype hardware DSN-CC nodes. Each node has self-contained power and is 6"×10"×2". Each node contains a battery pack with power feed from a solar panel that forms the lid, a central processing board, a GPS card, and radio card. Further system properties will be discussed, as will scenarios in which the system might be used to counter Nuclear/Biological/Chemical (NBC) threats of unconventional warfare. Mid-year in FY02 this DSN-CC research project received funding from the Office of Nonproliferation Research and Engineering (NA-22), NNSA to support nuclear proliferation technology development.

  19. A Computational Drug-Target Network for Yuanhu Zhitong Prescription

    PubMed Central

    Lu, Peng; Zhang, Fangbo; Yuan, Yuan; Wang, Songsong

    2013-01-01

    Yuanhu Zhitong prescription (YZP) is a typical and relatively simple traditional Chinese medicine (TCM), widely used in the clinical treatment of headache, gastralgia, and dysmenorrhea. However, the underlying molecular mechanism of action of YZP is not clear. In this study, based on the previous chemical and metabolite analysis, a complex approach including the prediction of metabolite structures, high-throughput in silico screening, and network reconstruction and analysis was developed to obtain a computational drug-target network for YZP. This was followed by a functional and pathway analysis by ClueGO to determine some of the pharmacologic activities. Further, two new pharmacologic actions, antidepressant and antianxiety, of YZP were validated by animal experiments using zebrafish and mouse models. The forced swimming test and the tail suspension test demonstrated that YZP at the doses of 4 mg/kg and 8 mg/kg had better antidepressive activity when compared with the control group. The anxiolytic activity experiment showed that YZP at the doses of 100 mg/L, 150 mg/L, and 200 mg/L produced a significant decrease in diving compared to controls. These results not only contribute to a better understanding of the molecular mechanisms by which YZP treats disease, but also provide some evidence for exploring classic TCM formulas for new clinical applications. PMID:23762151

  20. Efficient Computer Network Anomaly Detection by Changepoint Detection Methods

    NASA Astrophysics Data System (ADS)

    Tartakovsky, Alexander G.; Polunchenko, Aleksey S.; Sokolov, Grigory

    2013-02-01

    We consider the problem of efficient on-line anomaly detection in computer network traffic. The problem is approached statistically, as that of sequential (quickest) changepoint detection. A multi-cyclic setting of quickest change detection is a natural fit for this problem. We propose a novel score-based multi-cyclic detection algorithm. The algorithm is based on the so-called Shiryaev-Roberts procedure. This procedure is as easy to employ in practice and as computationally inexpensive as the popular Cumulative Sum chart and the Exponentially Weighted Moving Average scheme. The likelihood ratio based Shiryaev-Roberts procedure has appealing optimality properties, particularly it is exactly optimal in a multi-cyclic setting geared to detect a change occurring at a far time horizon. It is therefore expected that an intrusion detection algorithm based on the Shiryaev-Roberts procedure will perform better than other detection schemes. This is confirmed experimentally for real traces. We also discuss the possibility of complementing our anomaly detection algorithm with a spectral-signature intrusion detection system with false alarm filtering and true attack confirmation capability, so as to obtain a synergistic system.
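
    For concreteness, the Shiryaev-Roberts recursion referred to in this record is R_n = (1 + R_{n-1}) * LR(x_n), with an alarm raised the first time R_n crosses a threshold A. The sketch below applies it to a Gaussian mean-shift model; the pre- and post-change distributions, the threshold, and all names are our assumptions, and real network traffic statistics would require their own likelihood ratio.

```python
import numpy as np

# Shiryaev-Roberts changepoint detection for a Gaussian mean shift mu0 -> mu1:
#   R_n = (1 + R_{n-1}) * LR(x_n),  alarm when R_n >= threshold.
def shiryaev_roberts(xs, mu0=0.0, mu1=1.0, sigma=1.0, threshold=5000.0):
    R = 0.0
    for n, x in enumerate(xs, start=1):
        lr = np.exp((mu1 - mu0) * (x - (mu0 + mu1) / 2.0) / sigma**2)
        R = (1.0 + R) * lr
        if R >= threshold:
            return n                  # first alarm time
    return None

if __name__ == "__main__":
    rng = np.random.default_rng(2)
    pre = rng.normal(0.0, 1.0, 200)        # "normal" traffic statistic
    post = rng.normal(1.0, 1.0, 100)       # anomalous regime starting at n = 201
    print(shiryaev_roberts(np.concatenate([pre, post])))   # typically shortly after 200
```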